Sample records for science verification proposal

  1. Model-Based Building Verification in Aerial Photographs.

    DTIC Science & Technology

    1987-09-01

    Powers ... Gordon E. Schacher, Chairman, Dean of Science and Engineering ... Electrical and Computer Engineering ... In this paper, we have proposed an experimental knowledge-based verification system; the organization for change detection is outlined. Knowledge rules and ...

  2. Computer Modeling and Simulation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Pronskikh, V. S.

    2014-05-09

    Verification and validation of computer codes and models used in simulation are two aspects of scientific practice of high importance that have recently been discussed by philosophers of science. While verification is predominantly associated with the correctness of the way a model is represented by a computer code or algorithm, validation more often refers to the model's relation to the real world and its intended use. It has been argued that, because complex simulations are generally not transparent to a practitioner, the Duhem problem can arise for verification and validation due to their entanglement; such an entanglement makes it impossible to distinguish whether a coding error or the model's general inadequacy to its target should be blamed in the case of model failure. I argue that in order to disentangle verification and validation, a clear distinction needs to be made between computer modeling (construction of mathematical computer models of elementary processes) and simulation (construction of models of composite objects and processes by means of numerical experimenting with them). Holding to that distinction, I propose to relate verification (based on theoretical strategies such as inferences) to modeling, and validation, which shares a common epistemology with experimentation, to simulation. To explain the reasons for their intermittent entanglement, I propose a Weberian ideal-typical model of modeling and simulation as roles in practice. I suggest an approach to alleviating the Duhem problem for verification and validation that is generally applicable in practice and based on differences in epistemic strategies and scopes.

  3. Earth Science Enterprise Scientific Data Purchase Project: Verification and Validation

    NASA Technical Reports Server (NTRS)

    Jenner, Jeff; Policelli, Fritz; Fletcher, Rosea; Holecamp, Kara; Owen, Carolyn; Nicholson, Lamar; Dartez, Deanna

    2000-01-01

    This paper presents viewgraphs on the Earth Science Enterprise Scientific Data Purchase Project's verification and validation process. The topics include: 1) What is Verification and Validation? 2) Why Verification and Validation? 3) Background; 4) ESE Data Purchase Validation Process; 5) Data Validation System and Ingest Queue; 6) Shipment Verification; 7) Tracking and Metrics; 8) Validation of Contract Specifications; 9) Earth Watch Data Validation; 10) Validation of Vertical Accuracy; and 11) Results of Vertical Accuracy Assessment.

  4. Gravity Fields and Interiors of the Saturnian Satellites

    NASA Technical Reports Server (NTRS)

    Rappaport, N. J.; Armstrong, J. W.; Asmar, Sami W.; Iess, L.; Tortora, P.; Somenzi, L.; Zingoni, F.

    2006-01-01

    This viewgraph presentation reviews the Gravity Science Objectives and accomplishments of the Cassini Radio Science Team: (1) Mass and density of icy satellites (2) Quadrupole field of Titan and Rhea (3) Dynamic Love number of Titan (4) Moment of inertia of Titan (in collaboration with the Radar Team) (5) Gravity field of Saturn. The proposed measurements for the extended tour are: (1) Quadrupole field of Enceladus (2) More accurate measurement of Titan k2 (3) Local gravity/topography correlations for Iapetus (4) Verification/disproof of "Pioneer anomaly".

  5. 75 FR 43943 - Defense Science Board; Task Force on Nuclear Treaty Monitoring and Verification

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-07-27

    ... DEPARTMENT OF DEFENSE Office of the Secretary Defense Science Board; Task Force on Nuclear Treaty... meetings. SUMMARY: The Defense Science Board Task Force on Nuclear Treaty Monitoring and Verification will... held September 13-14, and 25-26, 2010. ADDRESSES: The meetings will be held at Science Applications...

  6. 75 FR 34439 - Defense Science Board Task Force on Nuclear Treaty Monitoring and Verification

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-06-17

    ... DEPARTMENT OF DEFENSE Office of the Secretary Defense Science Board Task Force on Nuclear Treaty... meetings. SUMMARY: The Defense Science Board Task Force on Nuclear Treaty Monitoring and Verification will... Applications International Corporation, 4001 North Fairfax Drive, Suite 300, Arlington, VA. FOR FURTHER...

  7. Chaotic maps and biometrics-based anonymous three-party authenticated key exchange protocol without using passwords

    NASA Astrophysics Data System (ADS)

    Xie, Qi; Hu, Bin; Chen, Ke-Fei; Liu, Wen-Hao; Tan, Xiao

    2015-11-01

    In a three-party password-authenticated key exchange (AKE) protocol, two users use their passwords to establish a secure session key over an insecure communication channel with the help of a trusted server; such a protocol may therefore suffer from password guessing attacks, and the server has to maintain a password table. To eliminate these shortcomings of password-based AKE protocols, Lee et al. [2015 Nonlinear Dyn. 79 2485] very recently proposed the first three-party authenticated key exchange scheme without passwords, based on chaotic maps, and claimed its security by providing a well-organized BAN logic analysis. Unfortunately, their protocol cannot resist an impersonation attack, which is demonstrated in the present paper. To overcome this security weakness, we propose a biometrics-based anonymous three-party AKE protocol with the same advantages, also using chaotic maps. Further, we use the pi calculus-based formal verification tool ProVerif to show that our AKE protocol achieves authentication, security, and anonymity, with acceptable efficiency. Project supported by the Natural Science Foundation of Zhejiang Province, China (Grant No. LZ12F02005), the Major State Basic Research Development Program of China (Grant No. 2013CB834205), and the National Natural Science Foundation of China (Grant No. 61070153).
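
    For background on the chaotic-map primitive underlying such protocols (this is not the authors' protocol itself), the semigroup property of Chebyshev polynomials, T_r(T_s(x)) = T_rs(x), supports a Diffie-Hellman-style key agreement. The sketch below is a minimal toy illustration; the public parameter x and the private exponents are made-up values, and real schemes use enhanced Chebyshev maps over a finite field plus authentication, which this sketch omits.

```python
import math

def chebyshev(n: int, x: float) -> float:
    """Evaluate the Chebyshev polynomial T_n(x) for x in [-1, 1]."""
    return math.cos(n * math.acos(x))

# Public parameter shared by both parties (hypothetical value).
x = 0.53

# Each party picks a private exponent and publishes T_n(x).
alice_secret, bob_secret = 17, 29
alice_public = chebyshev(alice_secret, x)
bob_public = chebyshev(bob_secret, x)

# Semigroup property: T_r(T_s(x)) = T_rs(x) = T_s(T_r(x)),
# so both parties derive the same shared value.
key_alice = chebyshev(alice_secret, bob_public)
key_bob = chebyshev(bob_secret, alice_public)
assert abs(key_alice - key_bob) < 1e-9
```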

  8. Goddard high resolution spectrograph science verification and data analysis

    NASA Technical Reports Server (NTRS)

    1992-01-01

    The data analysis performed to support the Orbital Verification (OV) and Science Verification (SV) of the GHRS was in the areas of the Digicon detector's performance and stability, wavelength calibration, and geomagnetically induced image motion. The results of the analyses are briefly described; detailed results are given in the form of attachments. Specialized software was developed for the analyses. Calibration files were formatted according to the specifications in a Space Telescope Science report. IRAS images of the Large Magellanic Cloud were restored using a blocked iterative algorithm. The algorithm works with the raw data scans without regridding or interpolating the data on an equally spaced image grid.

  9. Experimental evaluation of fingerprint verification system based on double random phase encoding

    NASA Astrophysics Data System (ADS)

    Suzuki, Hiroyuki; Yamaguchi, Masahiro; Yachida, Masuyoshi; Ohyama, Nagaaki; Tashima, Hideaki; Obi, Takashi

    2006-03-01

    We proposed a smart card holder authentication system that combines fingerprint verification with PIN verification by applying a double random phase encoding scheme. In this system, the probability of correctly verifying an authorized individual decreases when the fingerprint is shifted significantly. In this paper, a review of the proposed system is presented and preprocessing for improving the false rejection rate is proposed. In the proposed method, the position difference between two fingerprint images is estimated by using an optimized template for core detection. When the estimated difference exceeds the permissible level, the user inputs the fingerprint again. The effectiveness of the proposed method is confirmed by a computational experiment; its results show that the false rejection rate is improved.
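
    For readers unfamiliar with double random phase encoding (the classical optical encryption scheme this system builds on), a 4f DRPE multiplies the input by one random phase mask and its Fourier spectrum by a second mask. The numpy sketch below is a minimal illustration of that general scheme with assumed array sizes and function names, not the authors' smart-card implementation.

```python
import numpy as np

rng = np.random.default_rng(0)

def drpe_encrypt(img, mask_in, mask_fourier):
    """Classical double random phase encoding (4f correlator model)."""
    field = img * np.exp(2j * np.pi * mask_in)                           # input-plane random phase
    spectrum = np.fft.fft2(field) * np.exp(2j * np.pi * mask_fourier)    # Fourier-plane random phase
    return np.fft.ifft2(spectrum)                                        # complex-valued ciphertext

def drpe_decrypt(cipher, mask_in, mask_fourier):
    """Undo the two phase modulations in reverse order."""
    spectrum = np.fft.fft2(cipher) * np.exp(-2j * np.pi * mask_fourier)
    field = np.fft.ifft2(spectrum) * np.exp(-2j * np.pi * mask_in)
    return np.abs(field)

# Toy "fingerprint" image and the two secret phase masks (the keys).
img = rng.random((64, 64))
k1, k2 = rng.random((64, 64)), rng.random((64, 64))

cipher = drpe_encrypt(img, k1, k2)
recovered = drpe_decrypt(cipher, k1, k2)
assert np.allclose(recovered, img)
```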

  10. Draft Plan for Characterizing Commercial Data Products in Support of Earth Science Research

    NASA Technical Reports Server (NTRS)

    Ryan, Robert E.; Terrie, Greg; Berglund, Judith

    2006-01-01

    This presentation introduces a draft plan for characterizing commercial data products for Earth science research. The general approach to commercial product verification and validation includes focused selection of readily available commercial remote sensing products that support Earth science research. Ongoing product verification and characterization will question whether the product meets specifications and will examine its fundamental properties, potential, and limitations. Validation will encourage product evaluation for specific science research and applications. Specific commercial products included in the characterization plan include high-spatial-resolution multispectral (HSMS) imagery and LIDAR data products. Future efforts in this process will include briefing NASA Headquarters and modifying plans based on feedback, increased engagement with the science community and refinement of details, coordination with commercial vendors and the Joint Agency Commercial Imagery Evaluation (JACIE) for HSMS satellite acquisitions, acquiring waveform LIDAR data, and performing verification and validation.

  11. Weak lensing magnification in the Dark Energy Survey Science Verification Data

    DOE PAGES

    Garcia-Fernandez, M.; et al.

    2018-02-02

    In this paper the effect of weak lensing magnification on galaxy number counts is studied by cross-correlating the positions of two galaxy samples, separated by redshift, using data from the Dark Energy Survey Science Verification dataset. The analysis is carried out for two photometrically selected galaxy samples, with mean photometric redshifts in the $0.2 < z < 0.4$ and $0.7 < z < 1.0$ ranges, in the riz bands. A signal is detected with a $3.5\sigma$ significance level in each of the bands tested, and is compatible with the magnification predicted by the $\Lambda$CDM model. After an extensive analysis, it cannot be attributed to any known systematic effect. The detection of the magnification signal is robust to estimated uncertainties in the outlier rate of the photometric redshifts, but this will be an important issue for use of photometric redshifts in magnification measurements from larger samples. In addition to the detection of the magnification signal, a method to select the sample with the maximum signal-to-noise is proposed and validated with data.
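
    As an illustration of the underlying measurement (not the DES pipeline itself), the magnification signal comes from the angular cross-correlation between the positions of the low-redshift and high-redshift samples. A minimal pair-count estimator, w(theta) = D1D2/D1R2 - 1, is sketched below with assumed toy inputs and function names; the actual analysis uses survey masks, weights, and extensive systematics tests.

```python
import numpy as np

def angular_separations(ra1, dec1, ra2, dec2):
    """Pairwise angular separations (radians) between two sets of sky positions given in degrees."""
    ra1, dec1, ra2, dec2 = map(np.radians, (ra1, dec1, ra2, dec2))
    cos_t = (np.sin(dec1)[:, None] * np.sin(dec2)[None, :]
             + np.cos(dec1)[:, None] * np.cos(dec2)[None, :]
             * np.cos(ra1[:, None] - ra2[None, :]))
    return np.arccos(np.clip(cos_t, -1.0, 1.0))

def cross_correlation(lens, source, randoms, bins):
    """Simple w(theta) = D1D2/D1R2 - 1 cross-correlation estimator.

    `lens`, `source`, `randoms` are (ra, dec) tuples; `randoms` trace the source footprint.
    """
    d1d2, _ = np.histogram(angular_separations(*lens, *source), bins=bins)
    d1r2, _ = np.histogram(angular_separations(*lens, *randoms), bins=bins)
    n_src, n_rand = len(source[0]), len(randoms[0])
    return d1d2 / np.maximum(d1r2, 1) * (n_rand / n_src) - 1.0

# Toy usage with random points over a small patch (no real signal expected).
rng = np.random.default_rng(1)
patch = lambda n: (rng.uniform(0, 5, n), rng.uniform(0, 5, n))  # (RA, Dec) in degrees
bins = np.radians(np.logspace(-2, 0, 8))                        # angular bins, 0.01 to 1 degree
w = cross_correlation(patch(500), patch(500), patch(2000), bins)
```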

  12. Weak lensing magnification in the Dark Energy Survey Science Verification Data

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Garcia-Fernandez, M.; et al.

    In this paper the effect of weak lensing magnification on galaxy number counts is studied by cross-correlating the positions of two galaxy samples, separated by redshift, using data from the Dark Energy Survey Science Verification dataset. The analysis is carried out for two photometrically selected galaxy samples, with mean photometric redshifts in the $0.2 < z < 0.4$ and $0.7 < z < 1.0$ ranges, in the riz bands. A signal is detected with a $3.5\sigma$ significance level in each of the bands tested, and is compatible with the magnification predicted by the $\Lambda$CDM model. After an extensive analysis, it cannot be attributed to any known systematic effect. The detection of the magnification signal is robust to estimated uncertainties in the outlier rate of the photometric redshifts, but this will be an important issue for use of photometric redshifts in magnification measurements from larger samples. In addition to the detection of the magnification signal, a method to select the sample with the maximum signal-to-noise is proposed and validated with data.

  13. Weak lensing magnification in the Dark Energy Survey Science Verification Data

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Garcia-Fernandez, M.; et al.

    2016-11-30

    In this paper the effect of weak lensing magnification on galaxy number counts is studied by cross-correlating the positions of two galaxy samples, separated by redshift, using data from the Dark Energy Survey Science Verification dataset. The analysis is carried out for two photometrically selected galaxy samples, with mean photometric redshifts in the $0.2 < z < 0.4$ and $0.7 < z < 1.0$ ranges, in the riz bands. A signal is detected with a $3.5\sigma$ significance level in each of the bands tested, and is compatible with the magnification predicted by the $\Lambda$CDM model. After an extensive analysis, it cannot be attributed to any known systematic effect. The detection of the magnification signal is robust to estimated uncertainties in the outlier rate of the photometric redshifts, but this will be an important issue for use of photometric redshifts in magnification measurements from larger samples. In addition to the detection of the magnification signal, a method to select the sample with the maximum signal-to-noise is proposed and validated with data.

  14. Process Document, Joint Verification Protocol, and Joint Test Plan for Verification of HACH-LANGE GmbH LUMIStox 300 Bench Top Luminometer and ECLOX Handheld Luminometer for Luminescent Bacteria Test for use in Wastewater

    EPA Science Inventory

    The Danish Environmental Technology Verification program (DANETV) Water Test Centre, operated by DHI, is supported by the Danish Ministry for Science, Technology and Innovation. DANETV, the United States Environmental Protection Agency Environmental Technology Verification Progra...

  15. Environmental Technology Verification: Baghouse Filtration Products--Sinoma Science & Technology Co. Ltd FT-806 Filtration Media

    EPA Science Inventory

    EPA created the Environmental Technology Verification (ETV) Program to facilitate the deployment of innovative or improved environmental technologies through performance verification and dissemination of information. It seeks to achieve this goal by providing high-quality, peer r...

  16. Cross-correlation redshift calibration without spectroscopic calibration samples in DES Science Verification Data

    NASA Astrophysics Data System (ADS)

    Davis, C.; Rozo, E.; Roodman, A.; Alarcon, A.; Cawthon, R.; Gatti, M.; Lin, H.; Miquel, R.; Rykoff, E. S.; Troxel, M. A.; Vielzeuf, P.; Abbott, T. M. C.; Abdalla, F. B.; Allam, S.; Annis, J.; Bechtol, K.; Benoit-Lévy, A.; Bertin, E.; Brooks, D.; Buckley-Geer, E.; Burke, D. L.; Carnero Rosell, A.; Carrasco Kind, M.; Carretero, J.; Castander, F. J.; Crocce, M.; Cunha, C. E.; D'Andrea, C. B.; da Costa, L. N.; Desai, S.; Diehl, H. T.; Doel, P.; Drlica-Wagner, A.; Fausti Neto, A.; Flaugher, B.; Fosalba, P.; Frieman, J.; García-Bellido, J.; Gaztanaga, E.; Gerdes, D. W.; Giannantonio, T.; Gruen, D.; Gruendl, R. A.; Gutierrez, G.; Honscheid, K.; Jain, B.; James, D. J.; Jeltema, T.; Krause, E.; Kuehn, K.; Kuhlmann, S.; Kuropatkin, N.; Lahav, O.; Li, T. S.; Lima, M.; March, M.; Marshall, J. L.; Martini, P.; Melchior, P.; Ogando, R. L. C.; Plazas, A. A.; Romer, A. K.; Sanchez, E.; Scarpine, V.; Schindler, R.; Schubnell, M.; Sevilla-Noarbe, I.; Smith, M.; Soares-Santos, M.; Sobreira, F.; Suchyta, E.; Swanson, M. E. C.; Tarle, G.; Thomas, D.; Vikram, V.; Walker, A. R.; Wechsler, R. H.

    2018-06-01

    Galaxy cross-correlations with high-fidelity redshift samples hold the potential to precisely calibrate systematic photometric redshift uncertainties arising from the unavailability of complete and representative training and validation samples of galaxies. However, application of this technique in the Dark Energy Survey (DES) is hampered by the relatively low number density, small area, and modest redshift overlap between photometric and spectroscopic samples. We propose instead using photometric catalogues with reliable photometric redshifts for photo-z calibration via cross-correlations. We verify the viability of our proposal using redMaPPer clusters from the Sloan Digital Sky Survey (SDSS) to successfully recover the redshift distribution of SDSS spectroscopic galaxies. We demonstrate how to combine photo-z with cross-correlation data to calibrate photometric redshift biases while marginalizing over possible clustering bias evolution in either the calibration or unknown photometric samples. We apply our method to DES Science Verification (DES SV) data in order to constrain the photometric redshift distribution of a galaxy sample selected for weak lensing studies, constraining the mean of the tomographic redshift distributions to a statistical uncertainty of $\Delta z \sim \pm 0.01$. We forecast that our proposal can, in principle, control photometric redshift uncertainties in DES weak lensing experiments at a level near the intrinsic statistical noise of the experiment over the range of redshifts where redMaPPer clusters are available. Our results provide strong motivation to launch a programme to fully characterize the systematic errors from bias evolution and photo-z shapes in our calibration procedure.
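
    The core idea of the clustering-redshift (cross-correlation) technique can be stated compactly: bin the reference sample with known redshifts into thin slices, measure the angular cross-correlation amplitude of the unknown sample against each slice, and read that amplitude as proportional to the unknown sample's redshift distribution up to bias and matter-clustering factors. The sketch below is a deliberately simplified version that assumes a constant bias and takes precomputed amplitudes as inputs (hypothetical numbers); the paper instead marginalizes over bias evolution.

```python
import numpy as np

def clustering_dndz(w_ur, w_rr):
    """Schematic clustering-redshift estimate of an unknown sample's n(z).

    w_ur : measured cross-correlation amplitude per reference redshift bin
    w_rr : reference-sample autocorrelation amplitude per redshift bin
    Assumes constant bias for the unknown sample and no evolution of the
    matter clustering across bins (a strong simplification).
    """
    nz = w_ur / np.sqrt(np.clip(w_rr, 1e-12, None))
    nz = np.clip(nz, 0, None)           # negative noise fluctuations set to zero
    return nz / nz.sum() if nz.sum() > 0 else nz

# Hypothetical measured amplitudes in five reference redshift slices.
w_ur = np.array([0.002, 0.010, 0.018, 0.009, 0.001])
w_rr = np.array([0.040, 0.045, 0.050, 0.048, 0.042])
print(clustering_dndz(w_ur, w_rr))
```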

  17. Cross-correlation redshift calibration without spectroscopic calibration samples in DES Science Verification Data

    DOE PAGES

    Davis, C.; Rozo, E.; Roodman, A.; ...

    2018-03-26

    Galaxy cross-correlations with high-fidelity redshift samples hold the potential to precisely calibrate systematic photometric redshift uncertainties arising from the unavailability of complete and representative training and validation samples of galaxies. However, application of this technique in the Dark Energy Survey (DES) is hampered by the relatively low number density, small area, and modest redshift overlap between photometric and spectroscopic samples. We propose instead using photometric catalogs with reliable photometric redshifts for photo-z calibration via cross-correlations. We verify the viability of our proposal using redMaPPer clusters from the Sloan Digital Sky Survey (SDSS) to successfully recover the redshift distribution of SDSS spectroscopic galaxies. We demonstrate how to combine photo-z with cross-correlation data to calibrate photometric redshift biases while marginalizing over possible clustering bias evolution in either the calibration or unknown photometric samples. We apply our method to DES Science Verification (DES SV) data in order to constrain the photometric redshift distribution of a galaxy sample selected for weak lensing studies, constraining the mean of the tomographic redshift distributions to a statistical uncertainty of $\Delta z \sim \pm 0.01$. We forecast that our proposal can in principle control photometric redshift uncertainties in DES weak lensing experiments at a level near the intrinsic statistical noise of the experiment over the range of redshifts where redMaPPer clusters are available. Here, our results provide strong motivation to launch a program to fully characterize the systematic errors from bias evolution and photo-z shapes in our calibration procedure.

  18. Cross-correlation redshift calibration without spectroscopic calibration samples in DES Science Verification Data

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Davis, C.; Rozo, E.; Roodman, A.

    Galaxy cross-correlations with high-fidelity redshift samples hold the potential to precisely calibrate systematic photometric redshift uncertainties arising from the unavailability of complete and representative training and validation samples of galaxies. However, application of this technique in the Dark Energy Survey (DES) is hampered by the relatively low number density, small area, and modest redshift overlap between photometric and spectroscopic samples. We propose instead using photometric catalogs with reliable photometric redshifts for photo-z calibration via cross-correlations. We verify the viability of our proposal using redMaPPer clusters from the Sloan Digital Sky Survey (SDSS) to successfully recover the redshift distribution of SDSS spectroscopic galaxies. We demonstrate how to combine photo-z with cross-correlation data to calibrate photometric redshift biases while marginalizing over possible clustering bias evolution in either the calibration or unknown photometric samples. We apply our method to DES Science Verification (DES SV) data in order to constrain the photometric redshift distribution of a galaxy sample selected for weak lensing studies, constraining the mean of the tomographic redshift distributions to a statistical uncertainty of $\Delta z \sim \pm 0.01$. We forecast that our proposal can in principle control photometric redshift uncertainties in DES weak lensing experiments at a level near the intrinsic statistical noise of the experiment over the range of redshifts where redMaPPer clusters are available. Here, our results provide strong motivation to launch a program to fully characterize the systematic errors from bias evolution and photo-z shapes in our calibration procedure.

  19. 18 CFR 281.213 - Data Verification Committee.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... preparation. (e) The Data Verification Committee shall prepare a report concerning the proposed index of... 18 Conservation of Power and Water Resources 1 2011-04-01 2011-04-01 false Data Verification....213 Data Verification Committee. (a) Each interstate pipeline shall establish a Data Verification...

  20. 18 CFR 281.213 - Data Verification Committee.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... preparation. (e) The Data Verification Committee shall prepare a report concerning the proposed index of... 18 Conservation of Power and Water Resources 1 2010-04-01 2010-04-01 false Data Verification....213 Data Verification Committee. (a) Each interstate pipeline shall establish a Data Verification...

  1. Using Small-Step Refinement for Algorithm Verification in Computer Science Education

    ERIC Educational Resources Information Center

    Simic, Danijela

    2015-01-01

    Stepwise program refinement techniques can be used to simplify program verification. Programs are better understood since their main properties are clearly stated, and verification of rather complex algorithms is reduced to proving simple statements connecting successive program specifications. Additionally, it is easy to analyse similar…

  2. Enhanced Verification Test Suite for Physics Simulation Codes

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kamm, J R; Brock, J S; Brandon, S T

    2008-10-10

    This document discusses problems with which to augment, in quantity and in quality, the existing tri-laboratory suite of verification problems used by Los Alamos National Laboratory (LANL), Lawrence Livermore National Laboratory (LLNL), and Sandia National Laboratories (SNL). The purpose of verification analysis is to demonstrate whether the numerical results of the discretization algorithms in physics and engineering simulation codes provide correct solutions of the corresponding continuum equations. The key points of this document are: (1) Verification deals with mathematical correctness of the numerical algorithms in a code, while validation deals with physical correctness of a simulation in a regime of interest. This document is about verification. (2) The current seven-problem Tri-Laboratory Verification Test Suite, which has been used for approximately five years at the DOE WP laboratories, is limited. (3) Both the methodology for and technology used in verification analysis have evolved and been improved since the original test suite was proposed. (4) The proposed test problems are in three basic areas: (a) Hydrodynamics; (b) Transport processes; and (c) Dynamic strength-of-materials. (5) For several of the proposed problems we provide a 'strong sense verification benchmark', consisting of (i) a clear mathematical statement of the problem with sufficient information to run a computer simulation, (ii) an explanation of how the code result and benchmark solution are to be evaluated, and (iii) a description of the acceptance criterion for simulation code results. (6) It is proposed that the set of verification test problems with which any particular code is evaluated include some of the problems described in this document. Analysis of the proposed verification test problems constitutes part of a necessary--but not sufficient--step that builds confidence in physics and engineering simulation codes. More complicated test cases, including physics models of greater sophistication or other physics regimes (e.g., energetic material response, magneto-hydrodynamics), would represent a scientifically desirable complement to the fundamental test cases discussed in this report. The authors believe that this document can be used to enhance the verification analyses undertaken at the DOE WP Laboratories and, thus, to improve the quality, credibility, and usefulness of the simulation codes that are analyzed with these problems.

  3. A Quantitative Approach to the Formal Verification of Real-Time Systems.

    DTIC Science & Technology

    1996-09-01

    Computer Science. A Quantitative Approach to the Formal Verification of Real-Time Systems. Sergio Vale Aguiar Campos, September 1996, CMU-CS-96-199 ... A Quantitative Approach to the Formal Verification of Real-Time Systems, Sergio Vale Aguiar Campos ... implied, of NSF, the Semiconductor Research Corporation, ARPA or the U.S. government. Keywords: real-time systems, formal verification, symbolic

  4. Property-driven functional verification technique for high-speed vision system-on-chip processor

    NASA Astrophysics Data System (ADS)

    Nshunguyimfura, Victor; Yang, Jie; Liu, Liyuan; Wu, Nanjian

    2017-04-01

    The implementation of functional verification in a fast, reliable, and effective manner is a challenging task in a vision chip verification process. The main reason for this challenge is the stepwise nature of existing functional verification techniques. This vision chip verification complexity is also related to the fact that in most vision chip design cycles, extensive efforts are focused on how to optimize chip metrics such as performance, power, and area. Design functional verification is not explicitly considered at an earlier stage at which the most sound decisions are made. In this paper, we propose a semi-automatic property-driven verification technique. The implementation of all verification components is based on design properties. We introduce a low-dimension property space between the specification space and the implementation space. The aim of this technique is to speed up the verification process for high-performance parallel processing vision chips. Our experimental results show that the proposed technique can improve the verification effort by up to 20% for a complex vision chip design while reducing the simulation and debugging overheads.

  5. Verification of operational solar flare forecast: Case of Regional Warning Center Japan

    NASA Astrophysics Data System (ADS)

    Kubo, Yûki; Den, Mitsue; Ishii, Mamoru

    2017-08-01

    In this article, we discuss a verification study of an operational solar flare forecast in the Regional Warning Center (RWC) Japan. The RWC Japan has been issuing four-categorical deterministic solar flare forecasts for a long time. In this forecast verification study, we used solar flare forecast data accumulated over 16 years (from 2000 to 2015). We compiled the forecast data together with solar flare data obtained with the Geostationary Operational Environmental Satellites (GOES). Using the compiled data sets, we estimated some conventional scalar verification measures with 95% confidence intervals. We also estimated a multi-categorical scalar verification measure. These scalar verification measures were compared with those obtained by the persistence method and the recurrence method. As solar activity varied during the 16 years, we also applied verification analyses to four subsets of forecast-observation pair data with different solar activity levels. We cannot conclude definitely that there are significant performance differences between the forecasts of RWC Japan and the persistence method, although a slightly significant difference is found for some event definitions. We propose to use a scalar verification measure to assess the judgment skill of the operational solar flare forecast. Finally, we propose a verification strategy for deterministic operational solar flare forecasting. For dichotomous forecasts, the proposed set of verification measures comprises frequency bias for bias; proportion correct and critical success index for accuracy; probability of detection for discrimination; false alarm ratio for reliability; Peirce skill score for forecast skill; and the symmetric extremal dependence index for association. For multi-categorical forecasts, we propose as verification measures the marginal distributions of forecast and observation for bias; proportion correct for accuracy; the correlation coefficient and joint probability distribution for association; the likelihood distribution for discrimination; the calibration distribution for reliability and resolution; and the Gandin-Murphy-Gerrity score and judgment skill score for skill.
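
    For the dichotomous case, each of the measures named above is a simple function of the 2x2 contingency table of forecasts versus observed flares. The sketch below uses made-up counts (not the RWC Japan tallies) to illustrate the standard definitions; the symmetric extremal dependence index follows the Ferro and Stephenson (2011) formula.

```python
import math

def dichotomous_scores(hits, false_alarms, misses, correct_negatives):
    """Standard verification measures from a 2x2 forecast/event contingency table."""
    a, b, c, d = hits, false_alarms, misses, correct_negatives
    n = a + b + c + d
    pod = a / (a + c)        # probability of detection (hit rate H)
    far = b / (a + b)        # false alarm ratio
    pofd = b / (b + d)       # probability of false detection (false alarm rate F)
    return {
        "frequency_bias": (a + b) / (a + c),
        "proportion_correct": (a + d) / n,
        "critical_success_index": a / (a + b + c),
        "probability_of_detection": pod,
        "false_alarm_ratio": far,
        "peirce_skill_score": pod - pofd,
        # Symmetric extremal dependence index; requires 0 < H, F < 1.
        "sedi": (math.log(pofd) - math.log(pod) - math.log(1 - pofd) + math.log(1 - pod))
                / (math.log(pofd) + math.log(pod) + math.log(1 - pofd) + math.log(1 - pod)),
    }

# Hypothetical tally of flare/no-flare forecasts (illustrative numbers only).
print(dichotomous_scores(hits=120, false_alarms=80, misses=60, correct_negatives=5600))
```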

  6. 78 FR 28812 - Energy Efficiency Program for Industrial Equipment: Petition of UL Verification Services Inc. for...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-05-16

    ... are engineers. UL today is comprised of five businesses, Product Safety, Verification Services, Life..., Director--Global Technical Research, UL Verification Services. Subscribed and sworn to before me this 20... (431.447(c)(4)) General Personnel Overview UL is a global independent safety science company with more...

  7. 77 FR 64596 - Proposed Information Collection (Income Verification) Activity: Comment Request

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-10-22

    ... DEPARTMENT OF VETERANS AFFAIRS [OMB Control No. 2900-0518] Proposed Information Collection (Income... to income- dependent benefits. DATES: Written comments and recommendations on the proposed collection... techniques or the use of other forms of information technology. Title: Income Verification, VA Form 21-0161a...

  8. Evolution of Galaxy Luminosity and Stellar-Mass Functions since $z=1$ with the Dark Energy Survey Science Verification Data

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Capozzi, D.; et al.

    We present the first study of the evolution of the galaxy luminosity and stellar-mass functions (GLF and GSMF) carried out by the Dark Energy Survey (DES). We describe the COMMODORE galaxy catalogue selected from Science Verification images. This catalogue is made of $\sim 4\times 10^{6}$ galaxies at $0 ...

  9. 78 FR 27882 - VA Veteran-Owned Small Business (VOSB) Verification Guidelines

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-05-13

    ... Verification Self-Assessment Tool that walks the veteran through the regulation and how it applies to the...) Verification Guidelines AGENCY: Department of Veterans Affairs. ACTION: Advanced notice of proposed rulemaking... regulations governing the Department of Veterans Affairs (VA) Veteran-Owned Small Business (VOSB) Verification...

  10. 78 FR 56268 - Pipeline Safety: Public Workshop on Integrity Verification Process, Comment Extension

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-09-12

    .... PHMSA-2013-0119] Pipeline Safety: Public Workshop on Integrity Verification Process, Comment Extension... public workshop on ``Integrity Verification Process'' which took place on August 7, 2013. The notice also sought comments on the proposed ``Integrity Verification Process.'' In response to the comments received...

  11. Verification of Triple Modular Redundancy (TMR) Insertion for Reliable and Trusted Systems

    NASA Technical Reports Server (NTRS)

    Berg, Melanie; LaBel, Kenneth A.

    2016-01-01

    We propose a method for TMR insertion verification that satisfies the process for reliable and trusted systems. If a system is expected to be protected using TMR, improper insertion can jeopardize the reliability and security of the system. Due to the complexity of the verification process, there are currently no available techniques that can provide complete and reliable confirmation of TMR insertion. This manuscript addresses the challenge of confirming that TMR has been inserted without corruption of functionality and with correct application of the expected TMR topology. The proposed verification method combines the usage of existing formal analysis tools with a novel search-detect-and-verify tool. Keywords: field programmable gate array (FPGA), triple modular redundancy (TMR), verification, trust, reliability.
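
    To make the TMR concept concrete (this is not the authors' search-detect-and-verify tool), triplication replaces each protected function with three redundant copies feeding a 2-of-3 majority voter, and verification must confirm that the copies stay independent and the voter masks any single fault. The Python sketch below is a toy fault-masking check under an assumed single-bit-flip fault model; the function names and the 4-bit example logic are made up for illustration.

```python
def majority_vote(a: int, b: int, c: int) -> int:
    """Bitwise 2-of-3 majority voter used by TMR."""
    return (a & b) | (a & c) | (b & c)

def tmr_output(compute, stimulus, faults=(0, 0, 0)):
    """Run three redundant copies of a combinational function and vote.

    `faults` is a per-copy XOR mask modelling a single-event upset on that copy's output.
    """
    copies = [compute(stimulus) ^ f for f in faults]
    return majority_vote(*copies)

# Toy check that a fault in any single copy is masked (exhaustive over 4-bit inputs).
compute = lambda x: (x + 3) & 0xF            # hypothetical protected logic
for x in range(16):
    golden = compute(x)
    for i in range(3):
        faults = [0, 0, 0]
        faults[i] = 0x1                      # flip one output bit of copy i
        assert tmr_output(compute, x, faults) == golden
```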

  12. Information verification and encryption based on phase retrieval with sparsity constraints and optical inference

    NASA Astrophysics Data System (ADS)

    Zhong, Shenlu; Li, Mengjiao; Tang, Xiajie; He, Weiqing; Wang, Xiaogang

    2017-01-01

    A novel optical information verification and encryption method is proposed based on an inference principle and phase retrieval with sparsity constraints. In this method, a target image is encrypted into two phase-only masks (POMs), which comprise sparse phase data used for verification. Both POMs need to be authenticated before being used for decryption. The target image can be optically reconstructed when the two authenticated POMs are Fourier transformed and convolved with the correct decryption key, which is also generated in the encryption process. No holographic scheme is involved in the proposed optical verification and encryption system, and there is no problem of information disclosure in the two authenticable POMs. Numerical simulation results demonstrate the validity and good performance of this new method.

  13. ENVIRONMENTAL TECHNOLOGY VERIFICATION PROGRAM

    EPA Science Inventory

    This presentation will be given at the EPA Science Forum 2005 in Washington, DC. The Environmental Technology Verification Program (ETV) was initiated in 1995 to speed implementation of new and innovative commercial-ready environmental technologies by providing objective, 3rd pa...

  14. The Mars Science Laboratory Organic Check Material

    NASA Astrophysics Data System (ADS)

    Conrad, Pamela G.; Eigenbrode, Jennifer L.; Von der Heydt, Max O.; Mogensen, Claus T.; Canham, John; Harpold, Dan N.; Johnson, Joel; Errigo, Therese; Glavin, Daniel P.; Mahaffy, Paul R.

    2012-09-01

    Mars Science Laboratory's Curiosity rover carries a set of five external verification standards in hermetically sealed containers that can be sampled as would be a Martian rock, by drilling and then portioning into the solid sample inlet of the Sample Analysis at Mars (SAM) suite. Each organic check material (OCM) canister contains a porous ceramic solid, which has been doped with a fluorinated hydrocarbon marker that can be detected by SAM. The purpose of the OCM is to serve as a verification tool for the organic cleanliness of those parts of the sample chain that cannot be cleaned other than by dilution, i.e., repeated sampling of Martian rock. SAM possesses internal calibrants for verification of both its performance and its internal cleanliness, and the OCM is not used for that purpose. Each OCM unit is designed for one use only, and the choice to do so will be made by the project science group (PSG).

  15. The Yes-No Question Answering System and Statement Verification.

    ERIC Educational Resources Information Center

    Akiyama, M. Michael; And Others

    1979-01-01

    Two experiments investigated the relationship of verification to the answering of yes-no questions. Subjects verified simple statements or answered simple questions. Various proposals concerning the relative difficulty of answering questions and verifying statements were considered, and a model was proposed. (SW)

  16. Time-space modal logic for verification of bit-slice circuits

    NASA Astrophysics Data System (ADS)

    Hiraishi, Hiromi

    1996-03-01

    The major goal of this paper is to propose a new modal logic aimed at formal verification of bit-slice circuits. The new logic is called time-space modal logic, and its major feature is that it can handle two transition relations: one for time transitions and the other for space transitions. As a verification algorithm, a symbolic model checking algorithm for the new logic is shown. This could be applicable to verification of bit-slice microprocessors of infinite bit width and 1D systolic arrays of infinite length. A simple benchmark result shows the effectiveness of the proposed approach.

  17. Hubble Space Telescope high speed photometer science verification test report

    NASA Technical Reports Server (NTRS)

    Richards, Evan E.

    1992-01-01

    The purpose of this report is to summarize the results of the HSP Science Verification (SV) tests, the status of the HSP at the end of the SV period, and the work remaining to be done. The HSP OV report (November 1991) covered all activities (OV, SV, and SAO) from launch to the completion of phase three alignment, OV 3233 performed in the 91154 SMS, on June 8, 1991. This report covers subsequent activities through May 1992.

  18. Using formal methods for content validation of medical procedure documents.

    PubMed

    Cota, Érika; Ribeiro, Leila; Bezerra, Jonas Santos; Costa, Andrei; da Silva, Rosiana Estefane; Cota, Gláucia

    2017-08-01

    We propose the use of a formal approach to support content validation of a standard operating procedure (SOP) for a therapeutic intervention. Such an approach provides a useful tool to identify ambiguities, omissions and inconsistencies, and improves the applicability and efficacy of documents in health settings. We apply and evaluate a methodology originally proposed for the verification of software specification documents to a specific SOP. The verification methodology uses the graph formalism to model the document. Semi-automatic analysis identifies possible problems in the model and in the original document. The verification is an iterative process that identifies possible faults in the original text that should be revised by its authors and/or specialists. The proposed method was able to identify 23 possible issues in the original document (ambiguities, omissions, redundant information, and inaccuracies, among others). The formal verification process helped the specialists consider a wider range of usage scenarios and identify which instructions form the kernel of the proposed SOP and which ones represent additional or required knowledge that is mandatory for the correct application of the medical document. By using the proposed verification process, a simpler and yet more complete SOP could be produced. As a consequence, during the validation process the experts received a more mature document and could focus on the technical aspects of the procedure itself. Copyright © 2017 Elsevier B.V. All rights reserved.

  19. An Efficient Location Verification Scheme for Static Wireless Sensor Networks.

    PubMed

    Kim, In-Hwan; Kim, Bo-Sung; Song, JooSeok

    2017-01-24

    In wireless sensor networks (WSNs), the accuracy of location information is vital to support many interesting applications. Unfortunately, sensors have difficulty in estimating their location when malicious sensors attack the location estimation process. Even though secure localization schemes have been proposed to protect the location estimation process from attacks, they are not enough to eliminate wrong location estimations in some situations. Location verification can be the solution to those situations or serve as a second line of defense. The problem with most location verification schemes is the explicit involvement of many sensors in the verification process and requirements such as special hardware, a dedicated verifier, and a trusted third party, which cause more communication and computation overhead. In this paper, we propose an efficient location verification scheme for static WSNs called mutually-shared region-based location verification (MSRLV), which reduces those overheads by utilizing the implicit involvement of sensors and eliminating several requirements. In order to achieve this, we use the mutually-shared region between the location claimant and the verifier for location verification. The analysis shows that MSRLV reduces communication overhead by 77% and computation overhead by 92% on average, when compared with other location verification schemes, in a single sensor verification. In addition, simulation results for the verification of the whole network show that MSRLV can detect over 90% of malicious sensors when sensors in the network have five or more neighbors.

  20. An Efficient Location Verification Scheme for Static Wireless Sensor Networks

    PubMed Central

    Kim, In-hwan; Kim, Bo-sung; Song, JooSeok

    2017-01-01

    In wireless sensor networks (WSNs), the accuracy of location information is vital to support many interesting applications. Unfortunately, sensors have difficulty in estimating their location when malicious sensors attack the location estimation process. Even though secure localization schemes have been proposed to protect the location estimation process from attacks, they are not enough to eliminate wrong location estimations in some situations. Location verification can be the solution to those situations or serve as a second line of defense. The problem with most location verification schemes is the explicit involvement of many sensors in the verification process and requirements such as special hardware, a dedicated verifier, and a trusted third party, which cause more communication and computation overhead. In this paper, we propose an efficient location verification scheme for static WSNs called mutually-shared region-based location verification (MSRLV), which reduces those overheads by utilizing the implicit involvement of sensors and eliminating several requirements. In order to achieve this, we use the mutually-shared region between the location claimant and the verifier for location verification. The analysis shows that MSRLV reduces communication overhead by 77% and computation overhead by 92% on average, when compared with other location verification schemes, in a single sensor verification. In addition, simulation results for the verification of the whole network show that MSRLV can detect over 90% of malicious sensors when sensors in the network have five or more neighbors. PMID:28125007

  1. 78 FR 46359 - 30-Day Notice of Proposed Information Collection: Federal Labor Standards Payee Verification and...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-07-31

    ... Information Collection: Federal Labor Standards Payee Verification and Payment Processing AGENCY: Office of the Chief Information Officer, HUD. ACTION: Notice. SUMMARY: HUD has submitted the proposed information collection requirement described below to the Office of Management and Budget (OMB) for review, in...

  2. Formal verification and testing: An integrated approach to validating Ada programs

    NASA Technical Reports Server (NTRS)

    Cohen, Norman H.

    1986-01-01

    An integrated set of tools called a validation environment is proposed to support the validation of Ada programs by a combination of methods. A Modular Ada Validation Environment (MAVEN) is described which proposes a context in which formal verification can fit into the industrial development of Ada software.

  3. Weak lensing magnification in the Dark Energy Survey Science Verification data

    NASA Astrophysics Data System (ADS)

    Garcia-Fernandez, M.; Sanchez, E.; Sevilla-Noarbe, I.; Suchyta, E.; Huff, E. M.; Gaztanaga, E.; Aleksić, J.; Ponce, R.; Castander, F. J.; Hoyle, B.; Abbott, T. M. C.; Abdalla, F. B.; Allam, S.; Annis, J.; Benoit-Lévy, A.; Bernstein, G. M.; Bertin, E.; Brooks, D.; Buckley-Geer, E.; Burke, D. L.; Carnero Rosell, A.; Carrasco Kind, M.; Carretero, J.; Crocce, M.; Cunha, C. E.; D'Andrea, C. B.; da Costa, L. N.; DePoy, D. L.; Desai, S.; Diehl, H. T.; Eifler, T. F.; Evrard, A. E.; Fernandez, E.; Flaugher, B.; Fosalba, P.; Frieman, J.; García-Bellido, J.; Gerdes, D. W.; Giannantonio, T.; Gruen, D.; Gruendl, R. A.; Gschwend, J.; Gutierrez, G.; James, D. J.; Jarvis, M.; Kirk, D.; Krause, E.; Kuehn, K.; Kuropatkin, N.; Lahav, O.; Lima, M.; MacCrann, N.; Maia, M. A. G.; March, M.; Marshall, J. L.; Melchior, P.; Miquel, R.; Mohr, J. J.; Plazas, A. A.; Romer, A. K.; Roodman, A.; Rykoff, E. S.; Scarpine, V.; Schubnell, M.; Smith, R. C.; Soares-Santos, M.; Sobreira, F.; Tarle, G.; Thomas, D.; Walker, A. R.; Wester, W.; DES Collaboration

    2018-05-01

    In this paper, the effect of weak lensing magnification on galaxy number counts is studied by cross-correlating the positions of two galaxy samples, separated by redshift, using the Dark Energy Survey Science Verification data set. This analysis is carried out for galaxies that are selected only by their photometric redshift. An extensive analysis of the systematic effects, using new methods based on simulations, is performed, including a Monte Carlo sampling of the selection function of the survey.

  4. Hierarchical Representation Learning for Kinship Verification.

    PubMed

    Kohli, Naman; Vatsa, Mayank; Singh, Richa; Noore, Afzel; Majumdar, Angshul

    2017-01-01

    Kinship verification has a number of applications, such as organizing large collections of images and recognizing resemblances among humans. In this paper, first, a human study is conducted to understand the capabilities of the human mind and to identify the discriminatory areas of a face that provide kinship cues. The visual stimuli presented to the participants determine their ability to recognize kin relationships using the whole face as well as specific facial regions. The effect of participant gender and age and of the kin-relation pair of the stimulus is analyzed using quantitative measures such as accuracy, discriminability index d', and perceptual information entropy. Utilizing the information obtained from the human study, a hierarchical kinship verification via representation learning (KVRL) framework is utilized to learn the representation of different face regions in an unsupervised manner. We propose a novel approach for feature representation termed filtered contractive deep belief networks (fcDBN). The proposed feature representation encodes relational information present in images using filters and a contractive regularization penalty. A compact representation of facial images of kin is extracted as an output from the learned model, and a multi-layer neural network is utilized to verify the kin accurately. A new WVU kinship database is created, which consists of multiple images per subject to facilitate kinship verification. The results show that the proposed deep learning framework (KVRL-fcDBN) yields state-of-the-art kinship verification accuracy on the WVU kinship database and on four existing benchmark data sets. Furthermore, kinship information is used as a soft biometric modality to boost the performance of face verification via product of likelihood ratio and support vector machine based approaches. Using the proposed KVRL-fcDBN framework, an improvement of over 20% is observed in the performance of face verification.

  5. Quantum money with classical verification

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gavinsky, Dmitry

    We propose and construct a quantum money scheme that allows verification through classical communication with a bank. This is the first demonstration that a secure quantum money scheme exists that does not require quantum communication for coin verification. Our scheme is secure against adaptive adversaries - this property is not directly related to the possibility of classical verification, nevertheless none of the earlier quantum money constructions is known to possess it.

  6. Quantum money with classical verification

    NASA Astrophysics Data System (ADS)

    Gavinsky, Dmitry

    2014-12-01

    We propose and construct a quantum money scheme that allows verification through classical communication with a bank. This is the first demonstration that a secure quantum money scheme exists that does not require quantum communication for coin verification. Our scheme is secure against adaptive adversaries - this property is not directly related to the possibility of classical verification, nevertheless none of the earlier quantum money constructions is known to possess it.

  7. An ontology based trust verification of software license agreement

    NASA Astrophysics Data System (ADS)

    Lu, Wenhuan; Li, Xiaoqing; Gan, Zengqin; Wei, Jianguo

    2017-08-01

    When we install or download software, a large document stating rights and obligations is often displayed, which many users lack the patience to read or understand. This may make users distrust the software. In this paper, we propose an ontology-based verification of software license agreements. First, this work proposes an ontology model for the domain of software license agreements. The domain ontology is constructed with the proposed methodology, based on copyright laws and 30 software license agreements. The license ontology can act as part of a generalized copyright-law knowledge model and can also serve as a visualization of software licenses. Based on this ontology, a software-license-oriented text summarization approach is proposed, whose performance shows that it can improve the accuracy of software license summarization. Based on the summarization, the underlying purpose of the software license can be explicitly explored for trust verification.

  8. 75 FR 17923 - Agency Information Collection Activities: Proposed Collection: Comment Request

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-04-08

    ... Verification and Community Site Information form, the Loan Information and Verification form, the Authorization to Release Information form, the Applicant Checklist, and the Self-Certification form. The annual... respondent responses response hours NHSC LRP Application 5,175 1 5,175 0.30 1,553 Employment Verification-- 5...

  9. 78 FR 32431 - Notice of Submission of Proposed Information Collection to OMB; Enterprise Income Verification...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-05-30

    ... Proposed Information Collection to OMB; Enterprise Income Verification (EIV) System User Access Authorization Form and Rules of Behavior and User Agreement AGENCY: Office of the Chief Information Officer, HUD... user with information related to the Rules of Behavior for system usage and the user's responsibilities...

  10. 78 FR 27988 - Notice of Submission of Proposed Information Collection to OMB Enterprise Income Verification...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-05-13

    ... Proposed Information Collection to OMB Enterprise Income Verification (EIV) System--Debts Owed to Public... of Management and Budget (OMB) for review, as required by the Paperwork Reduction Act. HUD is... Management and Budget, New Executive Office Building, Washington, DC 20503; fax: 202-395-5806. Email: OIRA...

  11. Quantum Chemical Mass Spectrometry: Verification and Extension of the Mobile Proton Model for Histidine

    NASA Astrophysics Data System (ADS)

    Cautereels, Julie; Blockhuys, Frank

    2017-06-01

    The quantum chemical mass spectrometry for materials science (QCMS2) method is used to verify the proposed mechanism for proton transfer by histidine, the Mobile Proton Model (MPM), for ten XHS tripeptides, based on quantum chemical calculations at the DFT/B3LYP/6-311+G* level of theory. The fragmentations of the different intermediate structures in the MPM mechanism are studied within the QCMS2 framework, and the energetics of the proposed mechanism itself and those of the fragmentations of the intermediate structures are compared, leading to the computational confirmation of the MPM. In addition, the calculations suggest that the mechanism should be extended from considering only the formation of five-membered ring intermediates to include larger-ring intermediates.

  12. Requirement Specifications for a Design and Verification Unit.

    ERIC Educational Resources Information Center

    Pelton, Warren G.; And Others

    A research and development activity to introduce new and improved education and training technology into Bureau of Medicine and Surgery training is recommended. The activity, called a design and verification unit, would be administered by the Education and Training Sciences Department. Initial research and development are centered on the…

  13. High-resolution face verification using pore-scale facial features.

    PubMed

    Li, Dong; Zhou, Huiling; Lam, Kin-Man

    2015-08-01

    Face recognition methods, which usually represent face images using holistic or local facial features, rely heavily on alignment. Their performance also suffers severe degradation under variations in expression or pose, especially when there is only one gallery image per subject. With the easy access to high-resolution (HR) face images nowadays, some HR face databases have recently been developed. However, few studies have tackled the use of HR information for face recognition or verification. In this paper, we propose a pose-invariant face-verification method, robust to alignment errors, that uses HR information based on pore-scale facial features. A new keypoint descriptor, namely pore-Principal Component Analysis (PCA)-Scale Invariant Feature Transform (PPCASIFT), adapted from PCA-SIFT, is devised for the extraction of a compact set of distinctive pore-scale facial features. Having matched the pore-scale features of two face regions, an effective robust-fitting scheme is proposed for the face-verification task. Experiments show that, with only one frontal-view gallery image per subject, our proposed method outperforms a number of standard verification methods and can achieve excellent accuracy even when the faces are under large variations in expression and pose.

  14. 75 FR 4101 - Enterprise Income Verification (EIV) System User Access Authorization Form and Rules of Behavior...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-01-26

    ... Verification (EIV) System User Access Authorization Form and Rules of Behavior and User Agreement AGENCY... lists the following information: Title of Proposal: Enterprise Income Verification (EIV) System User Access, Authorization Form and Rules Of Behavior and User Agreement. OMB Approval Number: 2577-New. Form...

  15. Face verification system for Android mobile devices using histogram based features

    NASA Astrophysics Data System (ADS)

    Sato, Sho; Kobayashi, Kazuhiro; Chen, Qiu

    2016-07-01

    This paper proposes a face verification system that runs on Android mobile devices. In this system, a facial image is first captured by a built-in camera on the Android device, and face detection is then implemented using Haar-like features and the AdaBoost learning algorithm. The proposed system verifies the detected face using histogram-based features, which are generated as a binary Vector Quantization (VQ) histogram of DCT coefficients in the low-frequency domain, as well as an Improved Local Binary Pattern (Improved LBP) histogram in the spatial domain. Verification results with the different types of histogram-based features are first obtained separately and then combined by weighted averaging. We evaluate our proposed algorithm using the publicly available ORL database and facial images captured by an Android tablet.
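
    As a rough illustration of the histogram-based pipeline (assumed details only; the paper's Improved LBP and binary VQ/DCT variants differ), a basic local binary pattern histogram and a weighted-average score fusion can be sketched as follows. The function names, the fixed 0.5 weight, and the histogram-intersection similarity are illustrative choices, not the authors'.

```python
import numpy as np

def lbp_histogram(gray, bins=256):
    """Basic 8-neighbour LBP histogram of a grayscale image (no 'Improved LBP' refinements)."""
    g = gray.astype(np.int32)
    center = g[1:-1, 1:-1]
    offsets = [(-1, -1), (-1, 0), (-1, 1), (0, 1), (1, 1), (1, 0), (1, -1), (0, -1)]
    codes = np.zeros_like(center)
    for bit, (dy, dx) in enumerate(offsets):
        neighbour = g[1 + dy:g.shape[0] - 1 + dy, 1 + dx:g.shape[1] - 1 + dx]
        codes |= (neighbour >= center).astype(np.int32) << bit
    hist, _ = np.histogram(codes, bins=bins, range=(0, bins))
    return hist / hist.sum()

def fused_score(h_probe, h_gallery, dct_score, w_lbp=0.5):
    """Weighted average of an LBP-histogram similarity and a (precomputed) DCT/VQ score."""
    lbp_score = np.minimum(h_probe, h_gallery).sum()   # histogram intersection in [0, 1]
    return w_lbp * lbp_score + (1 - w_lbp) * dct_score

# Toy usage with random images standing in for an enrolment/probe pair.
rng = np.random.default_rng(0)
probe, gallery = rng.integers(0, 256, (64, 64)), rng.integers(0, 256, (64, 64))
score = fused_score(lbp_histogram(probe), lbp_histogram(gallery), dct_score=0.7)
```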

  16. Mechanical verification of a schematic Byzantine clock synchronization algorithm

    NASA Technical Reports Server (NTRS)

    Shankar, Natarajan

    1991-01-01

    Schneider generalizes a number of protocols for Byzantine fault tolerant clock synchronization and presents a uniform proof for their correctness. The authors present a machine checked proof of this schematic protocol that revises some of the details in Schneider's original analysis. The verification was carried out with the EHDM system developed at the SRI Computer Science Laboratory. The mechanically checked proofs include the verification that the egocentric mean function used in Lamport and Melliar-Smith's Interactive Convergence Algorithm satisfies the requirements of Schneider's protocol.

  17. A Secure Framework for Location Verification in Pervasive Computing

    NASA Astrophysics Data System (ADS)

    Liu, Dawei; Lee, Moon-Chuen; Wu, Dan

    The way people use computing devices has been changed by the relatively new pervasive computing paradigm. For example, a person can use a mobile device to obtain its location information at any time and anywhere. There are several security issues concerning whether this information is reliable in a pervasive environment. For example, a malicious user may disable the localization system by broadcasting a forged location, and may impersonate other users by eavesdropping on their locations. In this paper, we address the verification of location information in a secure manner. We first present the design challenges for location verification, and then propose a two-layer framework VerPer for secure location verification in a pervasive computing environment. Real world GPS-based wireless sensor network experiments confirm the effectiveness of the proposed framework.

  18. ESTEST: An Open Science Platform for Electronic Structure Research

    ERIC Educational Resources Information Center

    Yuan, Gary

    2012-01-01

    Open science platforms in support of data generation, analysis, and dissemination are becoming indispensable tools for conducting research. These platforms use informatics and information technologies to address significant problems in open science data interoperability, verification & validation, comparison, analysis, post-processing,…

  19. Complex Systems Models and Their Applications: Towards a New Science of Verification, Validation & Uncertainty Quantification

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Tsao, Jeffrey Y.; Trucano, Timothy G.; Kleban, Stephen D.

    This report contains the written footprint of a Sandia-hosted workshop held in Albuquerque, New Mexico, June 22-23, 2016 on “Complex Systems Models and Their Applications: Towards a New Science of Verification, Validation and Uncertainty Quantification,” as well as of pre-work that fed into the workshop. The workshop’s intent was to explore and begin articulating research opportunities at the intersection between two important Sandia communities: the complex systems (CS) modeling community, and the verification, validation and uncertainty quantification (VVUQ) community. The overarching research opportunity (and challenge) that we ultimately hope to address is: how can we quantify the credibility of knowledge gained from complex systems models, knowledge that is often incomplete and interim, but will nonetheless be used, sometimes in real-time, by decision makers?

  20. Sensor Based Framework for Secure Multimedia Communication in VANET

    PubMed Central

    Rahim, Aneel; Khan, Zeeshan Shafi; Bin Muhaya, Fahad T.; Sher, Muhammad; Kim, Tai-Hoon

    2010-01-01

    Secure multimedia communication enhances the safety of passengers by providing visual pictures of accidents and dangerous situations. In this paper we propose a framework for secure multimedia communication in Vehicular Ad-Hoc Networks (VANETs). Our proposed framework is mainly divided into four components: redundant information, priority assignment, malicious data verification and malicious node verification. The proposed scheme has been validated with the help of the NS-2 network simulator and the Evalvid tool. PMID:22163462

  1. Biometrics based authentication scheme for session initiation protocol.

    PubMed

    Xie, Qi; Tang, Zhixiong

    2016-01-01

    Many two-factor challenge-response based session initiation protocol (SIP) authentication schemes have been proposed, but most of them are vulnerable to stolen smart card attacks and password guessing attacks. In this paper, we propose a novel three-factor SIP authentication scheme using biometrics, password and smart card, and utilize the pi calculus-based formal verification tool ProVerif to prove that the proposed protocol achieves security and authentication. Furthermore, our protocol is highly efficient when compared to other related protocols.

  2. The JWST Science Instrument Payload: Mission Context and Status

    NASA Technical Reports Server (NTRS)

    Greenhouse, Matthew A.

    2014-01-01

    The James Webb Space Telescope (JWST) is the scientific successor to the Hubble Space Telescope. It is a cryogenic infrared space observatory with a 25 sq m aperture (6 m class) telescope that will achieve diffraction-limited angular resolution at a wavelength of 2 microns. The science instrument payload includes four passively cooled near-infrared instruments providing broad- and narrow-band imagery, coronagraphy, as well as multi-object and integral-field spectroscopy over the 0.6 < lambda < 5.0 microns spectrum. An actively cooled mid-infrared instrument provides broad-band imagery, coronagraphy, and integral-field spectroscopy over the 5.0 < lambda < 29 microns spectrum. The JWST is being developed by NASA, in partnership with the European and Canadian Space Agencies, as a general user facility with science observations to be proposed by the international astronomical community in a manner similar to the Hubble Space Telescope. Technology development and mission design are complete. Construction, integration and verification testing are underway in all areas of the program. The JWST is on schedule for launch during 2018.

  3. Cryo Testing of the James Webb Space Telescope's Integrated Science Instrument Module

    NASA Technical Reports Server (NTRS)

    VanCampen, Julie

    2004-01-01

    The Integrated Science Instrument Module (ISIM) of the James Webb Space Telescope will be integrated and tested at the Environmental Test Facilities at Goddard Space Flight Center (GSFC). The cryogenic thermal vacuum testing of the ISIM will be the most difficult and problematic portion of the GSFC Integration and Test flow. The test is to validate the coupled interface of the science instruments and the ISIM structure and to sufficiently stress that interface while validating image quality of the science instruments. The instruments and the structure are not made from the same materials and have different coefficients of thermal expansion (CTEs). Test objectives and verification rationale are currently being evaluated in Phase B of the project plan. The test program will encounter engineering challenges and limitations driven by cost and technology, many of which can be mitigated by facility upgrades, creative GSE, and thorough forethought. The cryogenic testing of the ISIM will involve a number of risks, such as the implementation of unique metrology techniques and of mechanical, electrical and optical simulators housed within the cryogenic vacuum environment. These potential risks are investigated and possible solutions are proposed.

  4. Method for secure electronic voting system: face recognition based approach

    NASA Astrophysics Data System (ADS)

    Alim, M. Affan; Baig, Misbah M.; Mehboob, Shahzain; Naseem, Imran

    2017-06-01

    In this paper, we propose a framework for a low-cost secure electronic voting system based on face recognition. Essentially, Local Binary Pattern (LBP) features are used for face characterization in texture form, followed by the chi-square statistic for image classification. Two parallel systems, based on a smart phone and a web application, are developed for the face learning and verification modules. The proposed system has two-tier security, using a person ID followed by face verification. A class-specific threshold is used to control the security level of face verification. Our system is evaluated on three standard databases and one real home-based database and achieves satisfactory recognition accuracies. Consequently, our proposed system provides a secure, hassle-free voting system that is less intrusive compared with other biometrics.
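    A minimal sketch of the LBP-plus-chi-square comparison described in the abstract, assuming scikit-image is available; the synthetic images, histogram settings, and threshold are placeholders rather than the authors' actual configuration.

```python
import numpy as np
from skimage.feature import local_binary_pattern

def lbp_histogram(gray, P=8, R=1.0):
    """Uniform LBP codes summarized as a normalized histogram."""
    codes = local_binary_pattern(gray, P, R, method="uniform")
    n_bins = P + 2  # uniform patterns plus one non-uniform bin
    hist, _ = np.histogram(codes, bins=n_bins, range=(0, n_bins), density=True)
    return hist

def chi_square(h1, h2, eps=1e-10):
    """Chi-square statistic between two histograms (smaller means more similar)."""
    return 0.5 * np.sum((h1 - h2) ** 2 / (h1 + h2 + eps))

# Hypothetical enrolled and probe face crops (grayscale arrays of equal size).
rng = np.random.default_rng(1)
enrolled = rng.integers(0, 256, (96, 96)).astype(np.uint8)
probe = rng.integers(0, 256, (96, 96)).astype(np.uint8)

distance = chi_square(lbp_histogram(enrolled), lbp_histogram(probe))
print("match" if distance < 0.05 else "no match")  # threshold is illustrative
```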

  5. Collaborative Localization and Location Verification in WSNs

    PubMed Central

    Miao, Chunyu; Dai, Guoyong; Ying, Kezhen; Chen, Qingzhang

    2015-01-01

    Localization is one of the most important technologies in wireless sensor networks. A lightweight distributed node localization scheme is proposed by considering the limited computational capacity of WSNs. The proposed scheme introduces the virtual force model to determine the location by incremental refinement. Aiming at solving the drifting problem and the malicious anchor problem, a location verification algorithm based on the virtual force model is presented. In addition, an anchor promotion algorithm using the localization reliability model is proposed to re-locate the drifted nodes. Extended simulation experiments indicate that the localization algorithm has relatively high precision and the location verification algorithm has relatively high accuracy. The communication overhead of these algorithms is relatively low, and the whole set of reliable localization methods is practical as well as comprehensive. PMID:25954948

  6. Functions of social support and self-verification in association with loneliness, depression, and stress.

    PubMed

    Wright, Kevin B; King, Shawn; Rosenberg, Jenny

    2014-01-01

    This study investigated the influence of social support and self-verification on loneliness, depression, and stress among 477 college students. The authors propose and test a theoretical model using structural equation modeling. The results indicated empirical support for the model, with self-verification mediating the relation between social support and health outcomes. The results have implications for social support and self-verification research, which are discussed along with directions for future research and limitations of the study.

  7. A new method to address verification bias in studies of clinical screening tests: cervical cancer screening assays as an example.

    PubMed

    Xue, Xiaonan; Kim, Mimi Y; Castle, Philip E; Strickler, Howard D

    2014-03-01

    Studies to evaluate clinical screening tests often face the problem that the "gold standard" diagnostic approach is costly and/or invasive. It is therefore common to verify only a subset of negative screening tests using the gold standard method. However, undersampling the screen negatives can lead to substantial overestimation of the sensitivity and underestimation of the specificity of the diagnostic test. Our objective was to develop a simple and accurate statistical method to address this "verification bias." We developed a weighted generalized estimating equation approach to estimate, in a single model, the accuracy (eg, sensitivity/specificity) of multiple assays and simultaneously compare results between assays while addressing verification bias. This approach can be implemented using standard statistical software. Simulations were conducted to assess the proposed method. An example is provided using a cervical cancer screening trial that compared the accuracy of human papillomavirus and Pap tests, with histologic data as the gold standard. The proposed approach performed well in estimating and comparing the accuracy of multiple assays in the presence of verification bias. The proposed approach is an easy to apply and accurate method for addressing verification bias in studies of multiple screening methods. Copyright © 2014 Elsevier Inc. All rights reserved.
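    To make the effect of verification bias concrete, the toy Python sketch below compares naive sensitivity/specificity computed only on verified subjects with inverse-probability-weighted estimates, in which each verified subject is weighted by the reciprocal of its verification probability. The numbers and sampling fractions are invented for illustration and do not reproduce the authors' weighted generalized estimating equation model.

```python
import numpy as np

rng = np.random.default_rng(42)
n = 20000
disease = rng.random(n) < 0.10                       # true (gold standard) status
test_pos = np.where(disease, rng.random(n) < 0.85,   # imperfect screening test
                    rng.random(n) < 0.10)

# Verify all screen-positives but only 10% of screen-negatives (common design).
p_verify = np.where(test_pos, 1.0, 0.10)
verified = rng.random(n) < p_verify

# Naive estimates using verified subjects only (biased).
v = verified
naive_sens = np.mean(test_pos[v & disease])
naive_spec = np.mean(~test_pos[v & ~disease])

# Inverse-probability-weighted estimates (correct under the MAR assumption).
w = 1.0 / p_verify
sens_ipw = np.sum(w[v & disease & test_pos]) / np.sum(w[v & disease])
spec_ipw = np.sum(w[v & ~disease & ~test_pos]) / np.sum(w[v & ~disease])

print(f"naive sens={naive_sens:.3f} spec={naive_spec:.3f}")
print(f"IPW   sens={sens_ipw:.3f} spec={spec_ipw:.3f}  (true: 0.850 / 0.900)")
```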

  8. The Screaming Boredom of Learning Science

    ERIC Educational Resources Information Center

    Krips, H.

    1977-01-01

    Advocates changing the role of secondary school science from one of theory verification and problem solving to the formulation and acceptance of hypotheses for observed phenomena. Provides an example of the procedure using Hooke's Law. (CP)

  9. Developing a Test for Assessing Elementary Students' Comprehension of Science Texts

    ERIC Educational Resources Information Center

    Wang, Jing-Ru; Chen, Shin-Feng; Tsay, Reuy-Fen; Chou, Ching-Ting; Lin, Sheau-Wen; Kao, Huey-Lien

    2012-01-01

    This study reports on the process of developing a test to assess students' reading comprehension of scientific materials and on the statistical results of the verification study. A combination of classic test theory and item response theory approaches was used to analyze the assessment data from a verification study. Data analysis indicates the…

  10. 77 FR 20889 - Proposed Information Collection (Request One-VA Identification Verification Card) Activity...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-04-06

    ... solicits comments on information needed to issue a Personal Identity Verification (PIV) identification card... Personnel Security and Identity Management (07C), Department of Veterans Affairs, 810 Vermont Avenue NW...

  11. The DES Bright Arcs Survey: Hundreds of Candidate Strongly Lensed Galaxy Systems from the Dark Energy Survey Science Verification and Year 1 Observations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Diehl, H. T.; Buckley-Geer, E. J.; Lindgren, K. A.

    We report the results of our searches for strong gravitational lens systems in the Dark Energy Survey (DES) Science Verification and Year 1 observations. The Science Verification data spans approximately 250 sq. deg. with median i

  12. The DES Bright Arcs Survey: Hundreds of Candidate Strongly Lensed Galaxy Systems from the Dark Energy Survey Science Verification and Year 1 Observations

    DOE PAGES

    Diehl, H. T.; Buckley-Geer, E. J.; Lindgren, K. A.; ...

    2017-09-01

    We report the results of our searches for strong gravitational lens systems in the Dark Energy Survey (DES) Science Verification and Year 1 observations. The Science Verification data spans approximately 250 sq. deg. with median i

  13. Earth Science Activities: A Guide to Effective Elementary School Science Teaching.

    ERIC Educational Resources Information Center

    Kanis, Ira B.; Yasso, Warren E.

    The primary emphasis of this book is on new or revised earth science activities that promote concept development rather than mere verification of concepts learned by passive means. Chapter 2 describes philosophies, strategies, methods, and techniques to guide preservice and inservice teachers, school building administrators, and curriculum…

  14. Character Recognition Method by Time-Frequency Analyses Using Writing Pressure

    NASA Astrophysics Data System (ADS)

    Watanabe, Tatsuhito; Katsura, Seiichiro

    With the development of information and communication technology, personal verification becomes more and more important. In the future ubiquitous society, the development of terminals handling personal information requires personal verification technology. The signature is one personal verification method; however, the number of characters in a signature is limited, and therefore a false signature is easily produced. Thus, personal identification from handwriting alone is difficult. This paper proposes a “haptic pen” that extracts the writing pressure, and shows a character recognition method based on time-frequency analyses. Although the shapes of characters written by different writers are similar, the differences appear in the time-frequency domain. As a result, it is possible to use the proposed character recognition for personal identification more accurately. The experimental results showed the viability of the proposed method.
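    The kind of time-frequency analysis described above can be sketched with a short-time Fourier transform of the pen-pressure signal; the synthetic signal, sampling rate, and band feature below are purely illustrative stand-ins for data from the proposed haptic pen.

```python
import numpy as np
from scipy.signal import spectrogram

fs = 1000.0                            # assumed pressure sampling rate (Hz)
t = np.arange(0, 2.0, 1.0 / fs)
# Synthetic writing-pressure trace: slow stroke rhythm plus writer-specific tremor.
pressure = 0.6 * np.sin(2 * np.pi * 3 * t) + 0.1 * np.sin(2 * np.pi * 25 * t)
pressure += 0.02 * np.random.default_rng(0).standard_normal(t.size)

# Time-frequency representation of the pressure signal.
f, tt, Sxx = spectrogram(pressure, fs=fs, nperseg=256, noverlap=192)

# A simple writer feature: mean spectral energy per frequency band over time.
band_energy = Sxx.mean(axis=1)
print(f"dominant band: {f[np.argmax(band_energy)]:.1f} Hz")
```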

  15. The Maximal Oxygen Uptake Verification Phase: a Light at the End of the Tunnel?

    PubMed

    Schaun, Gustavo Z

    2017-12-08

    Commonly performed during an incremental test to exhaustion, maximal oxygen uptake (V̇O2max) assessment has become a recurring practice in clinical and experimental settings. To validate the test, several criteria have been proposed. In this context, the plateau in oxygen uptake (V̇O2) is inconsistent in its frequency, reducing its usefulness as a robust method to determine "true" V̇O2max. Moreover, secondary criteria previously suggested, such as expiratory exchange ratios or percentages of maximal heart rate, are highly dependent on protocol design and often are achieved at V̇O2 percentages well below V̇O2max. Thus, an alternative method termed the verification phase was proposed. Currently, it is clear that the verification phase can be a practical and sensitive method to confirm V̇O2max; however, procedures to conduct it are not standardized across the literature and no previous research has tried to summarize how it has been employed. Therefore, in this review the knowledge on the verification phase was updated, while suggestions on how it can be performed (e.g. intensity, duration, recovery) were provided according to population and protocol design. Future studies should focus on identifying a verification protocol feasible for different populations and on comparing square-wave and multistage verification phases. Additionally, studies assessing verification phases in different patient populations are still warranted.

  16. Distance Metric Learning Using Privileged Information for Face Verification and Person Re-Identification.

    PubMed

    Xu, Xinxing; Li, Wen; Xu, Dong

    2015-12-01

    In this paper, we propose a new approach to improve face verification and person re-identification in the RGB images by leveraging a set of RGB-D data, in which we have additional depth images in the training data captured using depth cameras such as Kinect. In particular, we extract visual features and depth features from the RGB images and depth images, respectively. As the depth features are available only in the training data, we treat the depth features as privileged information, and we formulate this task as a distance metric learning with privileged information problem. Unlike the traditional face verification and person re-identification tasks that only use visual features, we further employ the extra depth features in the training data to improve the learning of distance metric in the training process. Based on the information-theoretic metric learning (ITML) method, we propose a new formulation called ITML with privileged information (ITML+) for this task. We also present an efficient algorithm based on the cyclic projection method for solving the proposed ITML+ formulation. Extensive experiments on the challenging faces data sets EUROCOM and CurtinFaces for face verification as well as the BIWI RGBD-ID data set for person re-identification demonstrate the effectiveness of our proposed approach.

  17. Cognitive neuroscience in forensic science: understanding and utilizing the human element

    PubMed Central

    Dror, Itiel E.

    2015-01-01

    The human element plays a critical role in forensic science. It is not limited only to issues relating to forensic decision-making, such as bias, but also relates to most aspects of forensic work (some of which even take place before a crime is ever committed or long after the verification of the forensic conclusion). In this paper, I explicate many aspects of forensic work that involve the human element and therefore show the relevance (and potential contribution) of cognitive neuroscience to forensic science. The 10 aspects covered in this paper are proactive forensic science, selection during recruitment, training, crime scene investigation, forensic decision-making, verification and conflict resolution, reporting, the role of the forensic examiner, presentation in court and judicial decisions. As the forensic community is taking on the challenges introduced by the realization that the human element is critical for forensic work, new opportunities emerge that allow for considerable improvement and enhancement of the forensic science endeavour. PMID:26101281

  18. Cognitive neuroscience in forensic science: understanding and utilizing the human element.

    PubMed

    Dror, Itiel E

    2015-08-05

    The human element plays a critical role in forensic science. It is not limited only to issues relating to forensic decision-making, such as bias, but also relates to most aspects of forensic work (some of which even take place before a crime is ever committed or long after the verification of the forensic conclusion). In this paper, I explicate many aspects of forensic work that involve the human element and therefore show the relevance (and potential contribution) of cognitive neuroscience to forensic science. The 10 aspects covered in this paper are proactive forensic science, selection during recruitment, training, crime scene investigation, forensic decision-making, verification and conflict resolution, reporting, the role of the forensic examiner, presentation in court and judicial decisions. As the forensic community is taking on the challenges introduced by the realization that the human element is critical for forensic work, new opportunities emerge that allow for considerable improvement and enhancement of the forensic science endeavour. © 2015 The Author(s) Published by the Royal Society. All rights reserved.

  19. Elementary School Students' Science Talk Ability in Inquiry-Oriented Settings in Taiwan: Test Development, Verification, and Performance Benchmarks

    ERIC Educational Resources Information Center

    Lin, Sheau-Wen; Liu, Yu; Chen, Shin-Feng; Wang, Jing-Ru; Kao, Huey-Lien

    2016-01-01

    The purpose of this study was to develop a computer-based measure of elementary students' science talk and to report students' benchmarks. The development procedure had three steps: defining the framework of the test, collecting and identifying key reference sets of science talk, and developing and verifying the science talk instrument. The…

  20. A robust method using propensity score stratification for correcting verification bias for binary tests

    PubMed Central

    He, Hua; McDermott, Michael P.

    2012-01-01

    Sensitivity and specificity are common measures of the accuracy of a diagnostic test. The usual estimators of these quantities are unbiased if data on the diagnostic test result and the true disease status are obtained from all subjects in an appropriately selected sample. In some studies, verification of the true disease status is performed only for a subset of subjects, possibly depending on the result of the diagnostic test and other characteristics of the subjects. Estimators of sensitivity and specificity based on this subset of subjects are typically biased; this is known as verification bias. Methods have been proposed to correct verification bias under the assumption that the missing data on disease status are missing at random (MAR), that is, the probability of missingness depends on the true (missing) disease status only through the test result and observed covariate information. When some of the covariates are continuous, or the number of covariates is relatively large, the existing methods require parametric models for the probability of disease or the probability of verification (given the test result and covariates), and hence are subject to model misspecification. We propose a new method for correcting verification bias based on the propensity score, defined as the predicted probability of verification given the test result and observed covariates. This is estimated separately for those with positive and negative test results. The new method classifies the verified sample into several subsamples that have homogeneous propensity scores and allows correction for verification bias. Simulation studies demonstrate that the new estimators are more robust to model misspecification than existing methods, but still perform well when the models for the probability of disease and probability of verification are correctly specified. PMID:21856650
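    A rough sketch of the propensity-score stratification idea, assuming scikit-learn: the verification probability is modeled from the test result and a covariate, verified subjects are grouped into propensity strata, and stratum-level quantities are pooled into a sensitivity estimate. The simulated data, the five-stratum choice, and the pooling formula are simplified illustrations, not the authors' estimator.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(7)
n = 20000
x = rng.standard_normal(n)                               # observed covariate
disease = rng.random(n) < 1 / (1 + np.exp(-(x - 2)))     # true status (gold standard)
test = np.where(disease, rng.random(n) < 0.85, rng.random(n) < 0.10)

# Verification depends on the test result and covariate only (MAR assumption).
p_verify = 1 / (1 + np.exp(-(2.0 * test + 0.5 * x - 1.0)))
verified = rng.random(n) < p_verify

# Step 1: estimate the propensity of verification from observable data.
feats = np.column_stack([test.astype(float), x])
ps = LogisticRegression(max_iter=1000).fit(feats, verified).predict_proba(feats)[:, 1]

# Step 2: form propensity strata; within a stratum the verified subjects are
# approximately a random subsample, so joint probabilities can be estimated there.
edges = np.quantile(ps, np.linspace(0, 1, 6))
strata = np.clip(np.searchsorted(edges, ps) - 1, 0, 4)
num = den = 0.0
for s in range(5):
    in_s, ver_s = strata == s, (strata == s) & verified
    if ver_s.any():
        num += np.mean(test[ver_s] & disease[ver_s]) * in_s.sum()  # P(T+, D+ | s) * n_s
        den += np.mean(disease[ver_s]) * in_s.sum()                # P(D+ | s) * n_s
print(f"stratified sensitivity estimate: {num / den:.3f} (true value 0.85)")
```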

  1. Design and Realization of Controllable Ultrasonic Fault Detector Automatic Verification System

    NASA Astrophysics Data System (ADS)

    Sun, Jing-Feng; Liu, Hui-Ying; Guo, Hui-Juan; Shu, Rong; Wei, Kai-Li

    Ultrasonic flaw detection equipment with a remote control interface is studied and an automatic verification system is developed. By using Extensible Markup Language to build the protocol instruction set and the data-analysis method database in the system software, the design becomes controllable and the diversity of unpublished device interfaces and protocols is accommodated. By cascading a signal generator with a fixed attenuator, a dynamic error compensation method is proposed that performs the role of the fixed attenuator in traditional verification and improves the accuracy of the verification results. Operating results of the automatic verification system confirm the feasibility of the hardware and software architecture design and the correctness of the analysis method, while eliminating the cumbersome operations of the traditional verification process and reducing the labor required of test personnel.

  2. Aqueous cleaning and verification processes for precision cleaning of small parts

    NASA Technical Reports Server (NTRS)

    Allen, Gale J.; Fishell, Kenneth A.

    1995-01-01

    The NASA Kennedy Space Center (KSC) Materials Science Laboratory (MSL) has developed a totally aqueous process for precision cleaning and verification of small components. In 1990 the Precision Cleaning Facility at KSC used approximately 228,000 kg (500,000 lbs) of chlorofluorocarbon (CFC) 113 in the cleaning operations. It is estimated that current CFC 113 usage has been reduced by 75 percent and it is projected that a 90 percent reduction will be achieved by the end of calendar year 1994. The cleaning process developed utilizes aqueous degreasers, aqueous surfactants, and ultrasonics in the cleaning operation and an aqueous surfactant, ultrasonics, and Total Organic Carbon Analyzer (TOCA) in the nonvolatile residue (NVR) and particulate analysis for verification of cleanliness. The cleaning and verification process is presented in its entirety, with comparison to the CFC 113 cleaning and verification process, including economic and labor costs/savings.

  3. Inverse probability weighting estimation of the volume under the ROC surface in the presence of verification bias.

    PubMed

    Zhang, Ying; Alonzo, Todd A

    2016-11-01

    In diagnostic medicine, the volume under the receiver operating characteristic (ROC) surface (VUS) is a commonly used index to quantify the ability of a continuous diagnostic test to discriminate between three disease states. In practice, verification of the true disease status may be performed only for a subset of subjects under study since the verification procedure is invasive, risky, or expensive. The selection for disease examination might depend on the results of the diagnostic test and other clinical characteristics of the patients, which in turn can cause bias in estimates of the VUS. This bias is referred to as verification bias. Existing verification bias correction in three-way ROC analysis focuses on ordinal tests. We propose verification bias-correction methods to construct ROC surface and estimate the VUS for a continuous diagnostic test, based on inverse probability weighting. By applying U-statistics theory, we develop asymptotic properties for the estimator. A Jackknife estimator of variance is also derived. Extensive simulation studies are performed to evaluate the performance of the new estimators in terms of bias correction and variance. The proposed methods are used to assess the ability of a biomarker to accurately identify stages of Alzheimer's disease. © 2016 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  4. An effective one-dimensional anisotropic fingerprint enhancement algorithm

    NASA Astrophysics Data System (ADS)

    Ye, Zhendong; Xie, Mei

    2012-01-01

    Fingerprint identification is one of the most important biometric technologies. The performance of minutiae extraction and the speed of a fingerprint verification system rely heavily on the quality of the input fingerprint images, so the enhancement of low-quality fingerprints is a critical and difficult step in a fingerprint verification system. In this paper we propose an effective algorithm for fingerprint enhancement. First, we use a normalization algorithm to reduce the variations in gray level values along ridges and valleys. Then we utilize the structure tensor approach to estimate the orientation at each pixel of the fingerprint. Finally, we propose a novel algorithm that combines the advantages of the one-dimensional Gabor filtering method and the anisotropic method to enhance the fingerprint in the recoverable region. The proposed algorithm has been evaluated on the database of the Fingerprint Verification Competition 2004, and the results show that our algorithm performs well in less time.
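    A condensed sketch of the normalization and structure-tensor orientation steps mentioned above, using NumPy/SciPy; the block smoothing scale and the synthetic stand-in image are assumptions, and the Gabor/anisotropic filtering stage is omitted.

```python
import numpy as np
from scipy import ndimage

def normalize(img, target_mean=100.0, target_var=100.0):
    """Adjust gray levels to a fixed mean and variance (classic normalization)."""
    img = img.astype(float)
    out = (img - img.mean()) * np.sqrt(target_var / (img.var() + 1e-12))
    return out + target_mean

def orientation_field(img, sigma=5.0):
    """Per-pixel ridge orientation from the smoothed structure tensor."""
    gx = ndimage.sobel(img, axis=1)
    gy = ndimage.sobel(img, axis=0)
    gxx = ndimage.gaussian_filter(gx * gx, sigma)
    gyy = ndimage.gaussian_filter(gy * gy, sigma)
    gxy = ndimage.gaussian_filter(gx * gy, sigma)
    # Ridge orientation is perpendicular to the dominant gradient direction.
    return 0.5 * np.arctan2(2 * gxy, gxx - gyy) + np.pi / 2

rng = np.random.default_rng(3)
fingerprint = rng.integers(0, 256, (128, 128)).astype(float)  # stand-in image
theta = orientation_field(normalize(fingerprint))
print(theta.shape, float(theta.mean()))
```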

  5. An effective one-dimensional anisotropic fingerprint enhancement algorithm

    NASA Astrophysics Data System (ADS)

    Ye, Zhendong; Xie, Mei

    2011-12-01

    Fingerprint identification is one of the most important biometric technologies. The performance of minutiae extraction and the speed of a fingerprint verification system rely heavily on the quality of the input fingerprint images, so the enhancement of low-quality fingerprints is a critical and difficult step in a fingerprint verification system. In this paper we propose an effective algorithm for fingerprint enhancement. First, we use a normalization algorithm to reduce the variations in gray level values along ridges and valleys. Then we utilize the structure tensor approach to estimate the orientation at each pixel of the fingerprint. Finally, we propose a novel algorithm that combines the advantages of the one-dimensional Gabor filtering method and the anisotropic method to enhance the fingerprint in the recoverable region. The proposed algorithm has been evaluated on the database of the Fingerprint Verification Competition 2004, and the results show that our algorithm performs well in less time.

  6. ED08-0053-07

    NASA Image and Video Library

    2008-02-28

    An ER-2 high-altitude Earth science aircraft banks away during a flight over the southern Sierra Nevada. NASA’s Armstrong Flight Research Center operates two of the Lockheed-built aircraft on a wide variety of environmental science, atmospheric sampling, and satellite data verification missions.

  7. Trajectory Based Behavior Analysis for User Verification

    NASA Astrophysics Data System (ADS)

    Pao, Hsing-Kuo; Lin, Hong-Yi; Chen, Kuan-Ta; Fadlil, Junaidillah

    Many of our activities on computers need a verification step for authorized access. The goal of verification is to tell apart the true account owner from intruders. We propose a general approach for user verification based on user trajectory inputs. The approach is labor-free for users and is likely to resist copying or simulation by non-authorized users or even automatic programs such as bots. Our study focuses on finding the hidden patterns embedded in the trajectories produced by account users. We employ a Markov chain model with Gaussian distributions in its transitions to describe the behavior in the trajectory. To distinguish between two trajectories, we propose a novel dissimilarity measure combined with a manifold-learnt tuning for capturing the pairwise relationship. Based on the pairwise relationship, we can plug in any effective classification or clustering method for the detection of unauthorized access. The method can also be applied to the task of recognition, predicting the trajectory type without a pre-defined identity. Given a trajectory input, the results show that the proposed method can accurately verify the user identity, or suggest who owns the trajectory if the input identity is not provided.
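    As a much-simplified stand-in for the Markov chain with Gaussian transitions described above, the sketch below fits a single Gaussian to a user's trajectory displacement vectors and scores a new trajectory by its average step log-likelihood; the trajectories, threshold, and single-Gaussian simplification are all invented for illustration.

```python
import numpy as np
from scipy.stats import multivariate_normal

def displacement_model(trajectory):
    """Fit a Gaussian to the step (dx, dy) vectors of an enrollment trajectory."""
    steps = np.diff(trajectory, axis=0)
    cov = np.cov(steps.T) + 1e-6 * np.eye(2)   # small ridge for numerical stability
    return multivariate_normal(mean=steps.mean(axis=0), cov=cov)

def verify(model, trajectory, threshold=-6.0):
    """Accept if the average step log-likelihood exceeds the (illustrative) threshold."""
    steps = np.diff(trajectory, axis=0)
    return model.logpdf(steps).mean() > threshold

rng = np.random.default_rng(5)
enrolled = np.cumsum(rng.normal([1.0, 0.2], [0.3, 0.3], (500, 2)), axis=0)
genuine = np.cumsum(rng.normal([1.0, 0.2], [0.3, 0.3], (200, 2)), axis=0)
impostor = np.cumsum(rng.normal([0.0, 1.5], [1.0, 1.0], (200, 2)), axis=0)

model = displacement_model(enrolled)
print("genuine accepted:", verify(model, genuine))
print("impostor accepted:", verify(model, impostor))
```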

  8. Formal methods for dependable real-time systems

    NASA Technical Reports Server (NTRS)

    Rushby, John

    1993-01-01

    The motivation for using formal methods to specify and reason about real-time properties is outlined, and approaches that have been proposed and used are sketched. The formal verifications of clock synchronization algorithms are taken as showing that mechanically supported reasoning about complex real-time behavior is feasible. However, there has been a significant increase in the effectiveness of verification systems since those verifications were performed, and it is to be expected that verifications of comparable difficulty will become fairly routine. The current challenge lies in developing perspicuous and economical approaches to the formalization and specification of real-time properties.

  9. Verification of Triple Modular Redundancy Insertion for Reliable and Trusted Systems

    NASA Technical Reports Server (NTRS)

    Berg, Melanie; LaBel, Kenneth

    2016-01-01

    If a system is required to be protected using triple modular redundancy (TMR), improper insertion can jeopardize the reliability and security of the system. Due to the complexity of the verification process and the complexity of digital designs, there are currently no available techniques that can provide complete and reliable confirmation of TMR insertion. We propose a method for TMR insertion verification that satisfies the process for reliable and trusted systems.

  10. A verification library for multibody simulation software

    NASA Technical Reports Server (NTRS)

    Kim, Sung-Soo; Haug, Edward J.; Frisch, Harold P.

    1989-01-01

    A multibody dynamics verification library that maintains and manages test and validation data is proposed, based on RRC robot arm and CASE backhoe validation and on a comparative study of DADS, DISCOS, and CONTOPS, which are existing public-domain and commercial multibody dynamics simulation programs. Using simple representative problems, simulation results from each program are cross-checked, and the validation results are presented. Functionalities of the verification library are defined in order to automate the validation procedure.

  11. The DES Science Verification Weak Lensing Shear Catalogs

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jarvis, M.

    We present weak lensing shear catalogs for 139 square degrees of data taken during the Science Verification (SV) time for the new Dark Energy Camera (DECam) being used for the Dark Energy Survey (DES). We describe our object selection, point spread function estimation and shear measurement procedures using two independent shear pipelines, IM3SHAPE and NGMIX, which produce catalogs of 2.12 million and 3.44 million galaxies respectively. We also detail a set of null tests for the shear measurements and find that they pass the requirements for systematic errors at the level necessary for weak lensing science applications using the SV data. Furthermore, we discuss some of the planned algorithmic improvements that will be necessary to produce sufficiently accurate shear catalogs for the full 5-year DES, which is expected to cover 5000 square degrees.

  12. The DES Science Verification Weak Lensing Shear Catalogs

    DOE PAGES

    Jarvis, M.

    2016-05-01

    We present weak lensing shear catalogs for 139 square degrees of data taken during the Science Verification (SV) time for the new Dark Energy Camera (DECam) being used for the Dark Energy Survey (DES). We describe our object selection, point spread function estimation and shear measurement procedures using two independent shear pipelines, IM3SHAPE and NGMIX, which produce catalogs of 2.12 million and 3.44 million galaxies respectively. We also detail a set of null tests for the shear measurements and find that they pass the requirements for systematic errors at the level necessary for weak lensing science applications using the SV data. Furthermore, we discuss some of the planned algorithmic improvements that will be necessary to produce sufficiently accurate shear catalogs for the full 5-year DES, which is expected to cover 5000 square degrees.

  13. Exploration of Uncertainty in Glacier Modelling

    NASA Technical Reports Server (NTRS)

    Thompson, David E.

    1999-01-01

    There are procedures and methods for verification of coding algebra and for validations of models and calculations that are in use in the aerospace computational fluid dynamics (CFD) community. These methods would be efficacious if used by the glacier dynamics modelling community. This paper is a presentation of some of those methods, and how they might be applied to uncertainty management supporting code verification and model validation for glacier dynamics. The similarities and differences between their use in CFD analysis and the proposed application of these methods to glacier modelling are discussed. After establishing sources of uncertainty and methods for code verification, the paper looks at a representative sampling of verification and validation efforts that are underway in the glacier modelling community, and establishes a context for these within overall solution quality assessment. Finally, an information architecture and interactive interface is introduced and advocated. This Integrated Cryospheric Exploration (ICE) Environment is proposed for exploring and managing sources of uncertainty in glacier modelling codes and methods, and for supporting scientific numerical exploration and verification. The details and functionality of this Environment are described based on modifications of a system already developed for CFD modelling and analysis.

  14. 75 FR 5853 - Proposed Collection; Comment Request for Form 13803

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-02-04

    ... 13803, Income Verification Express Service Application and Employee Delegation Form. DATES: Written... Application and Employee Delegation Form. OMB Number: 1545-2032. Form Number: Form 13803. Abstract: Form 13803, Income Verification Express Service Application and Employee Delegation Form, is used to submit the...

  15. 29 CFR 1903.19 - Abatement verification.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... INSPECTIONS, CITATIONS AND PROPOSED PENALTIES § 1903.19 Abatement verification. Purpose. OSHA's inspections... 1970 (the OSH Act). This section sets forth the procedures OSHA will use to ensure abatement. These... standard or regulation or to eliminate a recognized hazard identified by OSHA during an inspection. (2...

  16. The concept verification testing of materials science payloads

    NASA Technical Reports Server (NTRS)

    Griner, C. S.; Johnston, M. H.; Whitaker, A.

    1976-01-01

    The Concept Verification Testing (CVT) project at the Marshall Space Flight Center, Alabama, is a developmental activity that supports Shuttle Payload Projects such as Spacelab. It provides an operational 1-g environment for testing NASA and other agency experiment and support systems concepts that may be used in the Shuttle. A dedicated Materials Science Payload was tested in the General Purpose Laboratory to assess the requirements of a space processing payload on a Spacelab-type facility. Physical and functional integration of the experiments into the facility was studied, and the impact of the experiments on the facility (and vice versa) was evaluated. A follow-up test designated CVT Test IVA was also held. The purpose of this test was to repeat Test IV experiments with a crew composed of selected and trained scientists. These personnel were not required to have prior knowledge of the materials science disciplines, but were required to have a basic knowledge of science and the scientific method.

  17. Weak-lensing mass calibration of redMaPPer galaxy clusters in Dark Energy Survey Science Verification data

    DOE PAGES

    Melchior, P.; Gruen, D.; McClintock, T.; ...

    2017-05-16

    Here, we use weak-lensing shear measurements to determine the mean mass of optically selected galaxy clusters in Dark Energy Survey Science Verification data. In a blinded analysis, we split the sample of more than 8000 redMaPPer clusters into 15 subsets, spanning ranges in the richness parameter 5 ≤ λ ≤ 180 and redshift 0.2 ≤ z ≤ 0.8, and fit the averaged mass density contrast profiles with a model that accounts for seven distinct sources of systematic uncertainty: shear measurement and photometric redshift errors; cluster-member contamination; miscentring; deviations from the NFW halo profile; halo triaxiality and line-of-sight projections.

  18. Weak-lensing mass calibration of redMaPPer galaxy clusters in Dark Energy Survey Science Verification data

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Melchior, P.; Gruen, D.; McClintock, T.

    Here, we use weak-lensing shear measurements to determine the mean mass of optically selected galaxy clusters in Dark Energy Survey Science Verification data. In a blinded analysis, we split the sample of more than 8000 redMaPPer clusters into 15 subsets, spanning ranges in the richness parameter 5 ≤ λ ≤ 180 and redshift 0.2 ≤ z ≤ 0.8, and fit the averaged mass density contrast profiles with a model that accounts for seven distinct sources of systematic uncertainty: shear measurement and photometric redshift errors; cluster-member contamination; miscentring; deviations from the NFW halo profile; halo triaxiality and line-of-sight projections.

  19. Reconstruction based finger-knuckle-print verification with score level adaptive binary fusion.

    PubMed

    Gao, Guangwei; Zhang, Lei; Yang, Jian; Zhang, Lin; Zhang, David

    2013-12-01

    Recently, a new biometric identifier, namely the finger knuckle print (FKP), has been proposed for personal authentication with very interesting results. One of the advantages of FKP verification lies in its user friendliness in data collection. However, the user flexibility in positioning fingers also leads to a certain degree of pose variation in the collected query FKP images. The widely used Gabor filtering based competitive coding scheme is sensitive to such variations, resulting in many false rejections. We propose to alleviate this problem by reconstructing the query sample with a dictionary learned from the template samples in the gallery set. The reconstructed FKP image can greatly reduce the enlarged matching distance caused by finger pose variations; however, both the intra-class and inter-class distances will be reduced. We then propose a score level adaptive binary fusion rule to adaptively fuse the matching distances before and after reconstruction, aiming to reduce the false rejections without greatly increasing the false acceptances. Experimental results on the benchmark PolyU FKP database show that the proposed method significantly improves the FKP verification accuracy.

  20. The Effects of Sentence Imitation and Picture Verification on the Recall of Subsequent Digits. Lektos: Interdisciplinary Working Papers in Language Sciences, Vol. 3, No. 1.

    ERIC Educational Resources Information Center

    Scholes, Robert J.; And Others

    The effects of sentence imitation and picture verification on the recall of subsequent digits were studied. Stimuli consisted of 20 sentences, each sentence followed by a string of five digit names, and five structural types of sentences were presented. Subjects were instructed to listen to the sentence and digit string and then either immediately…

  1. Sandia National Laboratories: Directed-energy tech receives funding to

    Science.gov Websites


  2. Two-Level Verification of Data Integrity for Data Storage in Cloud Computing

    NASA Astrophysics Data System (ADS)

    Xu, Guangwei; Chen, Chunlin; Wang, Hongya; Zang, Zhuping; Pang, Mugen; Jiang, Ping

    Data storage in cloud computing can save capital expenditure and relieve the burden of storage management for users. As stored files may be lost or corrupted, many researchers focus on the verification of data integrity. However, massive numbers of users often bring large numbers of verification tasks for the auditor. Moreover, users also need to pay extra fees for these verification tasks beyond the storage fee. Therefore, we propose a two-level verification of data integrity to alleviate these problems. The key idea is for users to routinely verify the data integrity themselves and for the auditor to arbitrate challenges between the user and the cloud provider according to the MACs and ϕ values. Extensive performance simulations show that the proposed scheme markedly decreases the auditor's verification tasks and the ratio of wrong arbitrations.
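    The MAC-based routine verification mentioned in the abstract can be illustrated with a toy sketch: the user keeps per-block HMAC tags at upload time and later challenges the provider to return a block whose tag must match. Key handling, the ϕ values, and the auditor's arbitration step are omitted; the block size and names are illustrative assumptions.

```python
import hmac, hashlib, os

BLOCK_SIZE = 4096

def make_tags(data: bytes, key: bytes):
    """Compute one HMAC-SHA256 tag per fixed-size block at upload time."""
    blocks = [data[i:i + BLOCK_SIZE] for i in range(0, len(data), BLOCK_SIZE)]
    return [hmac.new(key, b, hashlib.sha256).digest() for b in blocks], blocks

def verify_block(key: bytes, index: int, returned_block: bytes, tags) -> bool:
    """User-side check of a challenged block against the stored tag."""
    actual = hmac.new(key, returned_block, hashlib.sha256).digest()
    return hmac.compare_digest(tags[index], actual)

key = os.urandom(32)
data = os.urandom(3 * BLOCK_SIZE + 100)            # stand-in for an uploaded file
tags, blocks = make_tags(data, key)                # tags kept by the user

# Routine verification: challenge block 2; an honest provider returns it intact.
print(verify_block(key, 2, blocks[2], tags))               # True
print(verify_block(key, 2, blocks[2][:-1] + b"?", tags))   # False (corruption detected)
```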

  3. The role of science in treaty verification.

    PubMed

    Gavron, Avigdor

    2005-01-01

    Technologically advanced nations are currently applying more science to treaty verification than ever before. Satellites gather a multitude of information relating to proliferation concerns using thermal imaging analysis, nuclear radiation measurements, and optical and radio frequency signals detection. Ground stations gather complementary signals such as seismic events and radioactive emissions. Export controls in many countries attempt to intercept materials and technical means that could be used for nuclear proliferation. Nevertheless, we have witnessed a plethora of nuclear proliferation episodes that were undetected (or belatedly detected) by these technologies--the Indian nuclear tests in 1998, the Libyan nuclear buildup, the Iranian enrichment program and the North Korean nuclear weapons program are some prime examples. In this talk, we will discuss some of the technologies used for proliferation detection. In particular, we will note some of the issues relating to nuclear materials control agreements that epitomize political difficulties as they impact the implementation of science and technology.

  4. A feasibility study of treatment verification using EPID cine images for hypofractionated lung radiotherapy

    NASA Astrophysics Data System (ADS)

    Tang, Xiaoli; Lin, Tong; Jiang, Steve

    2009-09-01

    We propose a novel approach for potential online treatment verification using cine EPID (electronic portal imaging device) images for hypofractionated lung radiotherapy based on a machine learning algorithm. Hypofractionated radiotherapy requires high precision. It is essential to effectively monitor the target to ensure that the tumor is within the beam aperture. We modeled the treatment verification problem as a two-class classification problem and applied an artificial neural network (ANN) to classify the cine EPID images acquired during the treatment into corresponding classes—with the tumor inside or outside of the beam aperture. Training samples were generated for the ANN using digitally reconstructed radiographs (DRRs) with artificially added shifts in the tumor location—to simulate cine EPID images with different tumor locations. Principal component analysis (PCA) was used to reduce the dimensionality of the training samples and cine EPID images acquired during the treatment. The proposed treatment verification algorithm was tested on five hypofractionated lung patients in a retrospective fashion. On average, our proposed algorithm achieved a 98.0% classification accuracy, a 97.6% recall rate and a 99.7% precision rate. This work was first presented at the Seventh International Conference on Machine Learning and Applications, San Diego, CA, USA, 11-13 December 2008.
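    A minimal sketch of the PCA-plus-ANN classification pipeline described above, using scikit-learn on synthetic stand-ins for the DRR training images; the image size, component count, and network size are assumptions, not the study's actual settings.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.neural_network import MLPClassifier
from sklearn.pipeline import make_pipeline

rng = np.random.default_rng(0)
n_train, n_pix = 400, 64 * 64

# Synthetic DRR-like training images: class 1 = tumor inside aperture, 0 = outside.
# A crude intensity offset in one half of the image stands in for the tumor shift.
X = rng.normal(size=(n_train, n_pix))
y = rng.integers(0, 2, n_train)
X[y == 1, : n_pix // 2] += 0.8

model = make_pipeline(
    PCA(n_components=20),                        # dimensionality reduction
    MLPClassifier(hidden_layer_sizes=(32,), max_iter=500, random_state=0),
)
model.fit(X, y)

# "Cine EPID frame" acquired during treatment (synthetic here).
frame = rng.normal(size=(1, n_pix))
frame[0, : n_pix // 2] += 0.8                    # tumor inside the aperture
print("tumor inside aperture" if model.predict(frame)[0] == 1 else "outside")
```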

  5. 77 FR 291 - Agency Information Collection Activities: Proposed Collection; Comment Request

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-01-04

    ... Verification System (IEVS) Reporting and Supporting Regulations Contained in 42 CFR 431.17, 431.306, 435.910... verifications; Form Number: CMS-R-74 (OCN 0938-0467); Frequency: Monthly; Affected Public: State, Local, or..., issuers offering group health insurance coverage, and self-insured nonfederal governmental plans (through...

  6. 76 FR 45902 - Agency Information Collection Activities: Proposed Request and Comment Request

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-08-01

    ... will allow our users to maintain one User ID, consisting of a self-selected Username and Password, to...) Registration and identity verification; (2) enhancement of the User ID; and (3) authentication. The...- person identification verification process for individuals who cannot or are not willing to register...

  7. 76 FR 2124 - Agency Information Collection Activities; Proposed Collection; Comment Request; Voluntary...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-01-12

    ... (original and update), and verification audit; names of the person(s) who completed the self-assessment... of the self assessment, date of the verification audit report, name of the auditor, signature and... self assessment, (2) conducting a baseline survey of the regulated industry, and (3) obtaining an...

  8. 76 FR 9020 - Proposed Information Collection Activity; Comment Request

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-02-16

    ... -------- Preparation and Submission of Data 54 1 640 34,560 Verification Procedures--Sec. Sec. 261.60-261.63 Caseload... for Needy Families (TANF) program, it imposed a new data requirement that States prepare and submit data verification procedures and replaced other data requirements with new versions including: the TANF...

  9. High stakes in INF verification

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Krepon, M.

    1987-06-01

    The stakes involved in negotiating INF verification arrangements are high. While these proposals deal only with intermediate-range ground-launched cruise and mobile missiles, if properly devised they could help pave the way for comprehensive limits on other cruise missiles and strategic mobile missiles. In contrast, poorly drafted monitoring provisions could compromise national industrial security and generate numerous compliance controversies. Any verification regime will require new openness on both sides, but that means significant risks as well as opportunities. US and Soviet negotiators could spend weeks, months, and even years working out in painstaking detail verification provisions for medium-range missiles. Alternatively, if the two sides wished to conclude an INF agreement quickly, they could defer most of the difficult verification issues to the strategic arms negotiations.

  10. A Comparison of the Effects of Two Instructional Sequences Involving Science Laboratory Activities.

    ERIC Educational Resources Information Center

    Ivins, Jerry Edward

    This study attempted to determine if students learn science concepts better when laboratories are used to verify concepts already introduced through lectures and textbooks (verification laboratories) or whether achievement and retention are improved when laboratories are used to introduce new concepts (directed discovery learning laboratories). The…

  11. Verification of road databases using multiple road models

    NASA Astrophysics Data System (ADS)

    Ziems, Marcel; Rottensteiner, Franz; Heipke, Christian

    2017-08-01

    In this paper a new approach for automatic road database verification based on remote sensing images is presented. In contrast to existing methods, the applicability of the new approach is not restricted to specific road types, context areas or geographic regions. This is achieved by combining several state-of-the-art road detection and road verification approaches that work well under different circumstances. Each one serves as an independent module representing a unique road model and a specific processing strategy. All modules provide independent solutions for the verification problem of each road object stored in the database in the form of two probability distributions, the first one for the state of a database object (correct or incorrect), and a second one for the state of the underlying road model (applicable or not applicable). In accordance with the Dempster-Shafer Theory, both distributions are mapped to a new state space comprising the classes correct, incorrect and unknown. Statistical reasoning is applied to obtain the optimal state of a road object. A comparison with state-of-the-art road detection approaches using benchmark datasets shows that in general the proposed approach provides results with higher completeness. Additional experiments reveal that based on the proposed method a highly reliable semi-automatic approach for road database verification can be designed.
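    The Dempster-Shafer combination over the three states mentioned above can be sketched as follows; the mass assignments for the two hypothetical modules are invented, and the real system derives them from each module's probability distributions.

```python
from itertools import product

# Frame of discernment: subsets are frozensets of {"correct", "incorrect"}.
# The full set plays the role of the "unknown" state.
THETA = frozenset({"correct", "incorrect"})

def combine(m1, m2):
    """Dempster's rule of combination for two mass functions over subsets of THETA."""
    combined, conflict = {}, 0.0
    for (a, wa), (b, wb) in product(m1.items(), m2.items()):
        inter = a & b
        if inter:
            combined[inter] = combined.get(inter, 0.0) + wa * wb
        else:
            conflict += wa * wb
    return {s: w / (1.0 - conflict) for s, w in combined.items()}

# Hypothetical outputs of two road-verification modules for one database object.
m_module1 = {frozenset({"correct"}): 0.6, frozenset({"incorrect"}): 0.1, THETA: 0.3}
m_module2 = {frozenset({"correct"}): 0.5, frozenset({"incorrect"}): 0.2, THETA: 0.3}

for subset, mass in combine(m_module1, m_module2).items():
    print(sorted(subset), round(mass, 3))
```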

  12. A New Integrated Threshold Selection Methodology for Spatial Forecast Verification of Extreme Events

    NASA Astrophysics Data System (ADS)

    Kholodovsky, V.

    2017-12-01

    Extreme weather and climate events such as heavy precipitation, heat waves and strong winds can cause extensive damage to society in terms of human lives and financial losses. As the climate changes, it is important to understand how extreme weather events may change as a result. Climate and statistical models are often independently used to model those phenomena. To better assess the performance of climate models, a variety of spatial forecast verification methods have been developed. However, spatial verification metrics that are widely used in comparing mean states do not, in most cases, have an adequate theoretical justification for benchmarking extreme weather events. We propose a new integrated threshold selection methodology for spatial forecast verification of extreme events that couples existing pattern recognition indices with high threshold choices. This integrated approach has three main steps: 1) dimension reduction; 2) geometric domain mapping; and 3) threshold clustering. We apply this approach to an observed precipitation dataset over CONUS. The results are evaluated by displaying the threshold distribution seasonally, monthly and annually. The method offers the user the flexibility of selecting a high threshold that is linked to desired geometrical properties. The proposed high-threshold methodology could either complement existing spatial verification methods, where threshold selection is arbitrary, or be directly applicable in extreme value theory.

  13. Concept Verification Test - Evaluation of Spacelab/Payload operation concepts

    NASA Technical Reports Server (NTRS)

    Mcbrayer, R. O.; Watters, H. H.

    1977-01-01

    The Concept Verification Test (CVT) procedure is used to study Spacelab operational concepts by conducting mission simulations in a General Purpose Laboratory (GPL) which represents a possible design of Spacelab. In conjunction with the laboratory, a Mission Development Simulator, a Data Management System Simulator, a Spacelab Simulator, and a Shuttle Interface Simulator have been designed. (The Spacelab Simulator is more functionally and physically representative of the Spacelab than the GPL.) Four simulations of Spacelab mission experimentation were performed, two involving several scientific disciplines, one involving life sciences, and the last involving materials science. The purpose of the CVT project is to support the pre-design and development of payload carriers and payloads, and to coordinate hardware, software, and operational concepts of different developers and users.

  14. Cleared for Launch - Lessons Learned from the OSIRIS-REx System Requirements Verification Program

    NASA Technical Reports Server (NTRS)

    Stevens, Craig; Adams, Angela; Williams, Bradley; Goodloe, Colby

    2017-01-01

    Requirements verification of a large flight system is a challenge. It is especially challenging for engineers taking on their first role in space systems engineering. This paper describes our approach to verification of the Origins, Spectral Interpretation, Resource Identification, Security-Regolith Explorer (OSIRIS-REx) system requirements. It also captures lessons learned along the way from developing systems engineers embroiled in this process. We begin with an overview of the mission and science objectives as well as the project requirements verification program strategy. A description of the requirements flow down is presented including our implementation for managing the thousands of program and element level requirements and associated verification data. We discuss both successes and methods to improve the managing of this data across multiple organizational interfaces. Our approach to verifying system requirements at multiple levels of assembly is presented using examples from our work at instrument, spacecraft, and ground segment levels. We include a discussion of system end-to-end testing limitations and their impacts to the verification program. Finally, we describe lessons learned that are applicable to all emerging space systems engineers using our unique perspectives across multiple organizations of a large NASA program.

  15. Implementation and verification of global optimization benchmark problems

    NASA Astrophysics Data System (ADS)

    Posypkin, Mikhail; Usov, Alexander

    2017-12-01

    The paper considers the implementation and verification of a test suite containing 150 benchmarks for global deterministic box-constrained optimization. A C++ library for describing standard mathematical expressions was developed for this purpose. The library automates the process of generating the value of a function and its gradient at a given point, as well as interval estimates of the function and its gradient on a given box, from a single description. Based on this functionality, we have developed a collection of tests for automatic verification of the proposed benchmarks. The verification has shown that literature sources contain mistakes in the benchmark descriptions. The library and the test suite are available for download and can be used freely.
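
    As an illustration of the kind of automatic check such a test suite can perform, the sketch below compares an analytic gradient against central finite differences at random points inside the box. The Rosenbrock function and all tolerances are assumptions chosen for the example; they are not taken from the benchmark collection itself.

```python
# Gradient-consistency check for a hypothetical benchmark entry (f, grad, box).
import numpy as np

def f(x):                      # example objective (Rosenbrock), not from the cited suite
    return (1 - x[0])**2 + 100 * (x[1] - x[0]**2)**2

def grad(x):                   # analytic gradient as it might appear in a benchmark description
    return np.array([-2 * (1 - x[0]) - 400 * x[0] * (x[1] - x[0]**2),
                     200 * (x[1] - x[0]**2)])

def numeric_grad(fun, x, h=1e-6):
    g = np.zeros_like(x)
    for i in range(len(x)):
        e = np.zeros_like(x); e[i] = h
        g[i] = (fun(x + e) - fun(x - e)) / (2 * h)   # central differences
    return g

box = np.array([[-2.0, 2.0], [-1.0, 3.0]])           # box constraints
rng = np.random.default_rng(1)
for _ in range(100):                                  # verify gradient at random points in the box
    x = rng.uniform(box[:, 0], box[:, 1])
    assert np.allclose(grad(x), numeric_grad(f, x), rtol=1e-4, atol=1e-4), x
print("analytic gradient consistent with finite differences on 100 sample points")
```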

  16. Why Verifying Diagnostic Decisions with a Checklist Can Help: Insights from Eye Tracking

    ERIC Educational Resources Information Center

    Sibbald, Matthew; de Bruin, Anique B. H.; Yu, Eric; van Merrienboer, Jeroen J. G.

    2015-01-01

    Making a diagnosis involves ratifying or verifying a proposed answer. Formalizing this verification process with checklists, which highlight key variables involved in the diagnostic decision, is often advocated. However, the mechanisms by which a checklist might allow clinicians to improve their verification process have not been well studied. We…

  17. 75 FR 62911 - Self-Regulatory Organizations; Financial Industry Regulatory Authority, Inc.; Notice of Filing of...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-10-13

    ... SECURITIES AND EXCHANGE COMMISSION [Release No. 34-63044; File No. SR-FINRA-2010-042] Self..., Relating to FINRA Rule 4160 (Verification of Assets) October 5, 2010. I. Introduction On August 4, 2010... verification of assets maintained by the member at such financial institutions. The proposed rule change was...

  18. 78 FR 14106 - Notice of Proposed Information Collection for Public Comment Enterprise Income Verification (EIV...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-03-04

    ... Public Housing Agency (PHA). The information is used by PHAs to determine a family's suitability for... Information Collection for Public Comment Enterprise Income Verification (EIV) Systems--Debts Owed to Public Housing Agencies and Terminations AGENCY: Office of the Assistance Secretary for Public and Indian Housing...

  19. 77 FR 60333 - Verification of Statements of Account Submitted by Cable Operators and Satellite Carriers

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-10-03

    ... regulation that will allow copyright owners to audit the Statements of Account and royalty fees that cable... Proposed Rulemaking concerning the verification of Statements of Account and royalty payments that are... NCTA's members file Statements of Account with and pay royalties to the Copyright Office under the...

  20. 76 FR 60004 - Proposed Information Collection; Comment Request; Data Collection and Verification for the Marine...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-09-28

    ... Collection; Comment Request; Data Collection and Verification for the Marine Protected Areas Inventory AGENCY... developing a national system of marine protected areas (MPAs). These departments are working closely with... Administration (NOAA) and DOI have created the Marine Protected Areas Inventory, an online spatial database that...

  1. Fuzzy Logic Controller Stability Analysis Using a Satisfiability Modulo Theories Approach

    NASA Technical Reports Server (NTRS)

    Arnett, Timothy; Cook, Brandon; Clark, Matthew A.; Rattan, Kuldip

    2017-01-01

    While many widely accepted methods and techniques exist for validation and verification of traditional controllers, at this time no solutions have been accepted for Fuzzy Logic Controllers (FLCs). Due to the highly nonlinear nature of such systems, and the fact that developing a valid FLC does not require a mathematical model of the system, it is quite difficult to use conventional techniques to prove controller stability. Since safety-critical systems must be tested and verified to work as expected for all possible circumstances, the fact that FLCs cannot be tested to such requirements poses limitations on the applications of this technology. Therefore, alternative methods for verification and validation of FLCs need to be explored. In this study, a novel approach using formal verification methods to ensure the stability of an FLC is proposed. The main research challenges include specification of requirements for a complex system, conversion of a traditional FLC to a piecewise polynomial representation, and use of a formal verification tool in a nonlinear solution space. Using the proposed architecture, the Fuzzy Logic Controller was found to always generate negative feedback, but the analysis was inconclusive with respect to Lyapunov stability.
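
    A minimal sketch of the formal-verification idea is given below, assuming the Z3 SMT solver and a piecewise-linear stand-in for the controller output surface; the gains and the operating range are invented for illustration and do not represent the authors' piecewise polynomial conversion.

```python
# Toy illustration: check an "always negative feedback" property with Z3 by
# asking for a counterexample; unsatisfiable means the property holds.
from z3 import Reals, Solver, And, Or, unsat

e, u = Reals('e u')
s = Solver()

# Piecewise-linear controller surrogate: u = -0.8 e for e >= 0, u = -1.2 e for e < 0
s.add(Or(And(e >= 0, u == -0.8 * e),
         And(e < 0,  u == -1.2 * e)))
s.add(e >= -1, e <= 1, e != 0)     # operating range, excluding the equilibrium

# Negative feedback means e*u < 0 away from equilibrium; search for e*u >= 0.
s.add(e * u >= 0)
print("negative feedback verified" if s.check() == unsat else s.model())
```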

  2. Assume-Guarantee Verification of Source Code with Design-Level Assumptions

    NASA Technical Reports Server (NTRS)

    Giannakopoulou, Dimitra; Pasareanu, Corina S.; Cobleigh, Jamieson M.

    2004-01-01

    Model checking is an automated technique that can be used to determine whether a system satisfies certain required properties. To address the 'state explosion' problem associated with this technique, we propose to integrate assume-guarantee verification at different phases of system development. During design, developers build abstract behavioral models of the system components and use them to establish key properties of the system. To increase the scalability of model checking at this level, we have developed techniques that automatically decompose the verification task by generating component assumptions for the properties to hold. The design-level artifacts are subsequently used to guide the implementation of the system, but also to enable more efficient reasoning at the source-code level. In particular, we propose to use design-level assumptions to similarly decompose the verification of the actual system implementation. We demonstrate our approach on a significant NASA application, where design-level models were used to identify and correct a safety property violation, and design-level assumptions allowed us to check successfully that the property was preserved by the implementation.

  3. An analysis of random projection for changeable and privacy-preserving biometric verification.

    PubMed

    Wang, Yongjin; Plataniotis, Konstantinos N

    2010-10-01

    Changeability and privacy protection are important factors for widespread deployment of biometrics-based verification systems. This paper presents a systematic analysis of a random-projection (RP)-based method for addressing these problems. The employed method transforms biometric data using a random matrix with each entry an independent and identically distributed Gaussian random variable. The similarity- and privacy-preserving properties, as well as the changeability of the biometric information in the transformed domain, are analyzed in detail. Specifically, RP on both high-dimensional image vectors and dimensionality-reduced feature vectors is discussed and compared. A vector translation method is proposed to improve the changeability of the generated templates. The feasibility of the introduced solution is well supported by detailed theoretical analyses. Extensive experimentation on a face-based biometric verification problem shows the effectiveness of the proposed method.
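
    A minimal sketch of the random-projection step is shown below, assuming synthetic feature vectors and an arbitrary similarity threshold; it is meant only to illustrate how a Gaussian projection with a user-specific seed yields changeable templates while roughly preserving similarity.

```python
# Random-projection template sketch with i.i.d. Gaussian entries; all data are placeholders.
import numpy as np

rng = np.random.default_rng(42)
d, k = 4096, 256                          # original (e.g. 64x64 face image) and reduced dimensions

def make_projection(seed):
    # Entries drawn i.i.d. Gaussian, scaled so pairwise similarities are roughly preserved
    return np.random.default_rng(seed).normal(0.0, 1.0 / np.sqrt(k), size=(k, d))

def cosine(a, b):
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

x = rng.normal(size=d)                    # enrolled biometric vector (placeholder data)
y = x + 0.1 * rng.normal(size=d)          # genuine probe: small perturbation of x

P_user = make_projection(seed=7)          # user-specific key; re-issuing a new seed
template = P_user @ x                     #   gives a changeable (cancelable) template
probe = P_user @ y

print("similarity preserved after RP:", cosine(x, y), "->", cosine(template, probe))
print("accept" if cosine(template, probe) > 0.9 else "reject")
```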

  4. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Myers, R.K.

    This paper examines the political and technical verification issues associated with proposals to place quantitative and/or qualitative limits on the deployment of nuclear-armed sea-launched cruise missiles (SLCMs). Overviews of the arms control relationship between the United States and the Soviet Union, the development of the SLCM, and Soviet and American concepts of verification are presented. The views of the American arms control and defense communities regarding the SLCM are discussed in depth, accompanied by a detailed examination of the various methods which have been proposed to verify an SLCM limitation agreement. The conclusion is that there are no technological barriers, per se, to SLCM verification, but as the decision on an agreement's verifiability is a political one, the U.S. Navy should concentrate its arguments against SLCM limitations on the weapon's operational utility rather than argue that such an agreement is unverifiable.

  5. Verification of Ceramic Structures

    NASA Astrophysics Data System (ADS)

    Behar-Lafenetre, Stephanie; Cornillon, Laurence; Rancurel, Michael; De Graaf, Dennis; Hartmann, Peter; Coe, Graham; Laine, Benoit

    2012-07-01

    In the framework of the “Mechanical Design and Verification Methodologies for Ceramic Structures” contract [1] awarded by ESA, Thales Alenia Space has investigated literature and practices in affiliated industries to propose a methodological guideline for verification of ceramic spacecraft and instrument structures. It has been written in order to be applicable to most types of ceramic or glass-ceramic materials - typically Cesic®, HBCesic®, Silicon Nitride, Silicon Carbide and ZERODUR®. The proposed guideline describes the activities to be performed at material level in order to cover all the specific aspects of ceramics (Weibull distribution, brittle behaviour, sub-critical crack growth). Elementary tests and their post-processing methods are described, and recommendations for optimization of the test plan are given in order to have a consistent database. The application of this method is shown on an example in a dedicated article [7]. Then the verification activities to be performed at system level are described. This includes classical verification activities based on relevant standard (ECSS Verification [4]), plus specific analytical, testing and inspection features. The analysis methodology takes into account the specific behaviour of ceramic materials, especially the statistical distribution of failures (Weibull) and the method to transfer it from elementary data to a full-scale structure. The demonstration of the efficiency of this method is described in a dedicated article [8]. The verification is completed by classical full-scale testing activities. Indications about proof testing, case of use and implementation are given and specific inspection and protection measures are described. These additional activities are necessary to ensure the required reliability. The aim of the guideline is to describe how to reach the same reliability level as for structures made of more classical materials (metals, composites).

  6. Imperceptible watermarking for security of fundus images in tele-ophthalmology applications and computer-aided diagnosis of retina diseases.

    PubMed

    Singh, Anushikha; Dutta, Malay Kishore

    2017-12-01

    The authentication and integrity verification of medical images is a critical and growing issue for patients in e-health services. Accurate identification of medical images and patient verification is an essential requirement to prevent error in medical diagnosis. The proposed work presents an imperceptible watermarking system to address the security issue of medical fundus images for tele-ophthalmology applications and computer-aided automated diagnosis of retinal diseases. In the proposed work, the patient identity is embedded in the fundus image in the singular value decomposition domain with an adaptive quantization parameter to maintain perceptual transparency for a variety of fundus images, whether healthy or disease-affected. In the proposed method, insertion of the watermark in the fundus image does not affect the automatic image-processing diagnosis of retinal objects and pathologies, which ensures uncompromised computer-based diagnosis associated with the fundus image. The patient ID is correctly recovered from the watermarked fundus image for integrity verification at the diagnosis centre. The proposed watermarking system was tested on a comprehensive database of fundus images, and the results are convincing: they indicate that the proposed watermarking method is imperceptible and does not affect computer-vision-based automated diagnosis of retinal diseases. Correct recovery of the patient ID from the watermarked fundus image makes the proposed watermarking system applicable for authentication of fundus images in computer-aided diagnosis and tele-ophthalmology applications.
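
    The sketch below shows a heavily simplified version of SVD-domain embedding, quantizing the largest singular value of an image block; the block size, quantization step, and the plain (non-adaptive) quantization-index scheme are assumptions and do not reproduce the paper's adaptive quantization parameter.

```python
# Embed one identity bit per image block in the SVD domain (simplified sketch).
import numpy as np

def embed_bit(block, bit, delta=12.0):
    U, S, Vt = np.linalg.svd(block, full_matrices=False)
    q = np.round(S[0] / delta)
    if int(q) % 2 != bit:              # force the quantization index parity to encode the bit
        q += 1
    S[0] = q * delta
    return U @ np.diag(S) @ Vt

def extract_bit(block, delta=12.0):
    S = np.linalg.svd(block, compute_uv=False)
    return int(np.round(S[0] / delta)) % 2

rng = np.random.default_rng(3)
block = rng.uniform(0, 255, size=(8, 8))     # stand-in for an 8x8 fundus-image block
for bit in (0, 1):
    wm = embed_bit(block.copy(), bit)
    print(bit, "->", extract_bit(wm), " max pixel change:", np.abs(wm - block).max().round(2))
```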

  7. Formulating face verification with semidefinite programming.

    PubMed

    Yan, Shuicheng; Liu, Jianzhuang; Tang, Xiaoou; Huang, Thomas S

    2007-11-01

    This paper presents a unified solution to three unsolved problems existing in face verification with subspace learning techniques: selection of verification threshold, automatic determination of subspace dimension, and deducing feature fusing weights. In contrast to previous algorithms which search for the projection matrix directly, our new algorithm investigates a similarity metric matrix (SMM). With a certain verification threshold, this matrix is learned by a semidefinite programming approach, along with the constraints of the kindred pairs with similarity larger than the threshold, and inhomogeneous pairs with similarity smaller than the threshold. Then, the subspace dimension and the feature fusing weights are simultaneously inferred from the singular value decomposition of the derived SMM. In addition, the weighted and tensor extensions are proposed to further improve the algorithmic effectiveness and efficiency, respectively. Essentially, the verification is conducted within an affine subspace in this new algorithm and is, hence, called the affine subspace for verification (ASV). Extensive experiments show that the ASV can achieve encouraging face verification accuracy in comparison to other subspace algorithms, even without the need to explore any parameters.

  8. Restricted access processor - An application of computer security technology

    NASA Technical Reports Server (NTRS)

    Mcmahon, E. M.

    1985-01-01

    This paper describes a security guard device that is currently being developed by Computer Sciences Corporation (CSC). The methods used to provide assurance that the system meets its security requirements include the system architecture, a system security evaluation, and the application of formal and informal verification techniques. The combination of state-of-the-art technology and the incorporation of new verification procedures results in a demonstration of the feasibility of computer security technology for operational applications.

  9. Exploring Middle School Students' Representational Competence in Science: Development and Verification of a Framework for Learning with Visual Representations

    NASA Astrophysics Data System (ADS)

    Tippett, Christine Diane

    Scientific knowledge is constructed and communicated through a range of forms in addition to verbal language. Maps, graphs, charts, diagrams, formulae, models, and drawings are just some of the ways in which science concepts can be represented. Representational competence---an aspect of visual literacy that focuses on the ability to interpret, transform, and produce visual representations---is a key component of science literacy and an essential part of science reading and writing. To date, however, most research has examined learning from representations rather than learning with representations. This dissertation consisted of three distinct projects that were related by a common focus on learning from visual representations as an important aspect of scientific literacy. The first project was the development of an exploratory framework that is proposed for use in investigations of students constructing and interpreting multimedia texts. The exploratory framework, which integrates cognition, metacognition, semiotics, and systemic functional linguistics, could eventually result in a model that might be used to guide classroom practice, leading to improved visual literacy, better comprehension of science concepts, and enhanced science literacy because it emphasizes distinct aspects of learning with representations that can be addressed though explicit instruction. The second project was a metasynthesis of the research that was previously conducted as part of the Explicit Literacy Instruction Embedded in Middle School Science project (Pacific CRYSTAL, http://www.educ.uvic.ca/pacificcrystal). Five overarching themes emerged from this case-to-case synthesis: the engaging and effective nature of multimedia genres, opportunities for differentiated instruction using multimodal strategies, opportunities for assessment, an emphasis on visual representations, and the robustness of some multimodal literacy strategies across content areas. The third project was a mixed-methods verification study that was conducted to refine and validate the theoretical framework. This study examined middle school students' representational competence and focused on students' creation of visual representations such as labelled diagrams, a form of representation commonly found in science information texts and textbooks. An analysis of the 31 Grade 6 participants' representations and semistructured interviews revealed five themes, each of which supports one or more dimensions of the exploratory framework: participants' use of color, participants' choice of representation (form and function), participants' method of planning for representing, participants' knowledge of conventions, and participants' selection of information to represent. Together, the results of these three projects highlight the need for further research on learning with rather than learning from representations.

  10. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Toltz, A; Seuntjens, J; Hoesl, M

    Purpose: With the aim of reducing acute esophageal radiation toxicity in pediatric patients receiving craniospinal irradiation (CSI), we investigated the implementation of an in-vivo, adaptive proton therapy range verification methodology. Simulation experiments and in-phantom measurements were conducted to validate the range verification technique for this clinical application. Methods: A silicon diode array system has been developed and experimentally tested in phantom for passively scattered proton beam range verification for a prostate treatment case by correlating properties of the detector signal to the water equivalent path length (WEPL). We propose to extend the methodology to verify range distal to the vertebral body for pediatric CSI cases by placing this small volume dosimeter in the esophagus of the anesthetized patient immediately prior to treatment. A set of calibration measurements was performed to establish a time signal to WEPL fit for a “scout” beam in a solid water phantom. Measurements are compared against Monte Carlo simulation in GEANT4 using the Tool for Particle Simulation (TOPAS). Results: Measurements with the diode array in a spread out Bragg peak of 14 cm modulation width and 15 cm range (177 MeV passively scattered beam) in solid water were successfully validated against proton fluence rate simulations in TOPAS. The resulting calibration curve allows for a sensitivity analysis of detector system response with dose rate in simulation and with individual diode position through simulation on patient CT data. Conclusion: Feasibility has been shown for the application of this range verification methodology to pediatric CSI. An in-vivo measurement to determine the WEPL to the inner surface of the esophagus will allow for personalized adjustment of the treatment plan to ensure sparing of the esophagus while confirming target coverage. A Toltz acknowledges partial support by the CREATE Medical Physics Research Training Network grant of the Natural Sciences and Engineering Research Council (Grant number: 432290)
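
    A hedged sketch of the calibration-curve step is shown below: a smooth fit mapping a detector time-signal feature to water-equivalent path length (WEPL). The synthetic calibration points and the cubic-polynomial model are assumptions for illustration only.

```python
# Fit a signal -> WEPL calibration curve from slab measurements (synthetic values).
import numpy as np

# Calibration points: known WEPL of solid-water slabs vs. a measured signal feature
wepl_mm   = np.array([ 50.0,  75.0, 100.0, 125.0, 150.0, 175.0, 200.0])
signal_au = np.array([0.92,  0.81,  0.68,  0.55,  0.41,  0.28,  0.15])   # placeholder values

coeffs = np.polyfit(signal_au, wepl_mm, deg=3)     # signal -> WEPL calibration curve
to_wepl = np.poly1d(coeffs)

measured = 0.47                                    # hypothetical in-vivo signal from the diode
print(f"estimated WEPL to detector: {to_wepl(measured):.1f} mm")
```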

  11. Assessing Investigative Skill Development in Inquiry-Based and Traditional College Science Laboratory Courses

    ERIC Educational Resources Information Center

    Suits, Jerry P.

    2004-01-01

    A laboratory practical examination was used to compare the investigative skills developed in two different types of general-chemistry laboratory courses. Science and engineering majors (SEM) in the control group used a traditional verification approach (SEM-Ctrl), whereas those in the treatment group learned from an innovative, inquiry-based…

  12. The Babushka Concept--An Instructional Sequence to Enhance Laboratory Learning in Science Education

    ERIC Educational Resources Information Center

    Gårdebjer, Sofie; Larsson, Anette; Adawi, Tom

    2017-01-01

    This paper deals with a novel method for improving the traditional "verification" laboratory in science education. Drawing on the idea of integrated instructional units, we describe an instructional sequence which we call the Babushka concept. This concept consists of three integrated instructional units: a start-up lecture, a laboratory…

  13. Remote Sensing Product Verification and Validation at the NASA Stennis Space Center

    NASA Technical Reports Server (NTRS)

    Stanley, Thomas M.

    2005-01-01

    Remote sensing data product verification and validation (V&V) is critical to successful science research and applications development. People who use remote sensing products to make policy, economic, or scientific decisions require confidence in and an understanding of the products' characteristics to make informed decisions about the products' use. NASA data products of coarse to moderate spatial resolution are validated by NASA science teams. NASA's Stennis Space Center (SSC) serves as the science validation team lead for validating commercial data products of moderate to high spatial resolution. At SSC, the Applications Research Toolbox simulates sensors and targets, and the Instrument Validation Laboratory validates critical sensors. The SSC V&V Site consists of radiometric tarps, a network of ground control points, a water surface temperature sensor, an atmospheric measurement system, painted concrete radial target and edge targets, and other instrumentation. NASA's Applied Sciences Directorate participates in the Joint Agency Commercial Imagery Evaluation (JACIE) team formed by NASA, the U.S. Geological Survey, and the National Geospatial-Intelligence Agency to characterize commercial systems and imagery.

  14. Isocenter verification for linac‐based stereotactic radiation therapy: review of principles and techniques

    PubMed Central

    Sabet, Mahsheed; O'Connor, Daryl J.; Greer, Peter B.

    2011-01-01

    There have been several manual, semi‐automatic and fully‐automatic methods proposed for verification of the position of mechanical isocenter as part of comprehensive quality assurance programs required for linear accelerator‐based stereotactic radiosurgery/radiotherapy (SRS/SRT) treatments. In this paper, a systematic review has been carried out to discuss the present methods for isocenter verification and compare their characteristics, to help physicists in making a decision on selection of their quality assurance routine. PACS numbers: 87.53.Ly, 87.56.Fc, 87.56.‐v PMID:22089022

  15. Integrating image quality in 2nu-SVM biometric match score fusion.

    PubMed

    Vatsa, Mayank; Singh, Richa; Noore, Afzel

    2007-10-01

    This paper proposes an intelligent 2nu-support vector machine based match score fusion algorithm to improve the performance of face and iris recognition by integrating the quality of images. The proposed algorithm applies redundant discrete wavelet transform to evaluate the underlying linear and non-linear features present in the image. A composite quality score is computed to determine the extent of smoothness, sharpness, noise, and other pertinent features present in each subband of the image. The match score and the corresponding quality score of an image are fused using 2nu-support vector machine to improve the verification performance. The proposed algorithm is experimentally validated using the FERET face database and the CASIA iris database. The verification performance and statistical evaluation show that the proposed algorithm outperforms existing fusion algorithms.

  16. The DES Bright Arcs Survey: Hundreds of Candidate Strongly Lensed Galaxy Systems from the Dark Energy Survey Science Verification and Year 1 Observations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Diehl, H. T.; Buckley-Geer, E. J.; Lindgren, K. A.

    We report the results of searches for strong gravitational lens systems in the Dark Energy Survey (DES) Science Verification and Year 1 observations. The Science Verification data span approximately 250 sq. deg. with a median i-band limiting magnitude for extended objects (10σ) of 23.0. The Year 1 data span approximately 2000 sq. deg. and have an i-band limiting magnitude for extended objects (10σ) of 22.9. As these data sets are both wide and deep, they are particularly useful for identifying strong gravitational lens candidates. Potential strong gravitational lens candidate systems were initially identified based on a color and magnitude selection in the DES object catalogs or because the system is at the location of a previously identified galaxy cluster. Cutout images of potential candidates were then visually scanned using an object viewer and numerically ranked according to whether or not we judged them to be likely strong gravitational lens systems. Having scanned nearly 400,000 cutouts, we present 374 candidate strong lens systems, of which 348 are identified for the first time. We provide the R.A. and decl., the magnitudes and photometric properties of the lens and source objects, and the distance (radius) of the source(s) from the lens center for each system.

  17. The DES Bright Arcs Survey: Hundreds of Candidate Strongly Lensed Galaxy Systems from the Dark Energy Survey Science Verification and Year 1 Observations

    NASA Astrophysics Data System (ADS)

    Diehl, H. T.; Buckley-Geer, E. J.; Lindgren, K. A.; Nord, B.; Gaitsch, H.; Gaitsch, S.; Lin, H.; Allam, S.; Collett, T. E.; Furlanetto, C.; Gill, M. S. S.; More, A.; Nightingale, J.; Odden, C.; Pellico, A.; Tucker, D. L.; da Costa, L. N.; Fausti Neto, A.; Kuropatkin, N.; Soares-Santos, M.; Welch, B.; Zhang, Y.; Frieman, J. A.; Abdalla, F. B.; Annis, J.; Benoit-Lévy, A.; Bertin, E.; Brooks, D.; Burke, D. L.; Carnero Rosell, A.; Carrasco Kind, M.; Carretero, J.; Cunha, C. E.; D'Andrea, C. B.; Desai, S.; Dietrich, J. P.; Drlica-Wagner, A.; Evrard, A. E.; Finley, D. A.; Flaugher, B.; García-Bellido, J.; Gerdes, D. W.; Goldstein, D. A.; Gruen, D.; Gruendl, R. A.; Gschwend, J.; Gutierrez, G.; James, D. J.; Kuehn, K.; Kuhlmann, S.; Lahav, O.; Li, T. S.; Lima, M.; Maia, M. A. G.; Marshall, J. L.; Menanteau, F.; Miquel, R.; Nichol, R. C.; Nugent, P.; Ogando, R. L. C.; Plazas, A. A.; Reil, K.; Romer, A. K.; Sako, M.; Sanchez, E.; Santiago, B.; Scarpine, V.; Schindler, R.; Schubnell, M.; Sevilla-Noarbe, I.; Sheldon, E.; Smith, M.; Sobreira, F.; Suchyta, E.; Swanson, M. E. C.; Tarle, G.; Thomas, D.; Walker, A. R.; DES Collaboration

    2017-09-01

    We report the results of searches for strong gravitational lens systems in the Dark Energy Survey (DES) Science Verification and Year 1 observations. The Science Verification data span approximately 250 sq. deg. with a median i-band limiting magnitude for extended objects (10σ) of 23.0. The Year 1 data span approximately 2000 sq. deg. and have an i-band limiting magnitude for extended objects (10σ) of 22.9. As these data sets are both wide and deep, they are particularly useful for identifying strong gravitational lens candidates. Potential strong gravitational lens candidate systems were initially identified based on a color and magnitude selection in the DES object catalogs or because the system is at the location of a previously identified galaxy cluster. Cutout images of potential candidates were then visually scanned using an object viewer and numerically ranked according to whether or not we judged them to be likely strong gravitational lens systems. Having scanned nearly 400,000 cutouts, we present 374 candidate strong lens systems, of which 348 are identified for the first time. We provide the R.A. and decl., the magnitudes and photometric properties of the lens and source objects, and the distance (radius) of the source(s) from the lens center for each system.

  18. Ultra-wideband reflective polarization converter based on anisotropic metasurface

    NASA Astrophysics Data System (ADS)

    Wu, Jia-Liang; Lin, Bao-Qin; Da, Xin-Yu

    2016-08-01

    In this paper, we propose an ultra-wideband reflective linear cross-polarization converter based on an anisotropic metasurface. Its unit cell is composed of a square-shaped resonator with an intersectant diagonal and a metallic ground sheet separated by a dielectric substrate. Simulated results show that the converter can generate resonances at four frequencies under a normally incident electromagnetic (EM) wave, leading to bandwidth expansion of the cross-polarization reflection. For verification, the designed polarization converter was fabricated and measured. The measured and simulated results agree well with each other, showing that the fabricated converter can convert an x- or y-polarized incident wave into its cross-polarized wave in a frequency range from 7.57 GHz to 20.46 GHz with a relative bandwidth of 91.2%, and the polarization conversion efficiency is greater than 90%. The proposed polarization converter has a simple geometry but an ultra-wide bandwidth compared with published designs, and hence possesses potential applications in novel polarization-control devices. Project supported by the National Natural Science Foundation of China (Grant Nos. 61471387, 61271250, and 61571460).

  19. Integrated Formal Analysis of Timed-Triggered Ethernet

    NASA Technical Reports Server (NTRS)

    Dutertre, Bruno; Shankar, Natarajan; Owre, Sam

    2012-01-01

    We present new results related to the verification of the Timed-Triggered Ethernet (TTE) clock synchronization protocol. This work extends previous verification of TTE based on model checking. We identify a suboptimal design choice in a compression function used in clock synchronization, and propose an improvement. We compare the original design and the improved definition using the SAL model checker.

  20. Radiation effects on science instruments in Grand Tour type missions

    NASA Technical Reports Server (NTRS)

    Parker, R. H.

    1972-01-01

    The extent of the radiation effects problem is delineated, along with the status of protective designs for 15 representative science instruments. Designs for protecting science instruments from radiation damage are discussed for the various instruments to be employed in the Grand Tour type missions. A literature search effort was undertaken to collect damage/interference effects data on sensitive science instrument components such as Si detectors, vidicon tubes, etc. A small experimental effort is underway to provide verification of the radiation effects predictions.

  1. Verification and Optimal Control of Context-Sensitive Probabilistic Boolean Networks Using Model Checking and Polynomial Optimization

    PubMed Central

    Hiraishi, Kunihiko

    2014-01-01

    One of the significant topics in systems biology is to develop control theory of gene regulatory networks (GRNs). In typical control of GRNs, expression of some genes is inhibited (activated) by manipulating external stimuli and expression of other genes. Control theory of GRNs is expected to be applied to gene therapy technologies in the future. In this paper, a control method using a Boolean network (BN) is studied. A BN is widely used as a model of GRNs, and gene expression is expressed by a binary value (ON or OFF). In particular, a context-sensitive probabilistic Boolean network (CS-PBN), which is one of the extended models of BNs, is used. For CS-PBNs, the verification problem and the optimal control problem are considered. For the verification problem, a solution method using the probabilistic model checker PRISM is proposed. For the optimal control problem, a solution method using polynomial optimization is proposed. Finally, a numerical example on the WNT5A network, which is related to melanoma, is presented. The proposed methods provide useful tools for the control theory of GRNs. PMID:24587766

  2. Load Model Verification, Validation and Calibration Framework by Statistical Analysis on Field Data

    NASA Astrophysics Data System (ADS)

    Jiao, Xiangqing; Liao, Yuan; Nguyen, Thai

    2017-11-01

    Accurate load models are critical for power system analysis and operation. A large amount of research work has been done on load modeling. Most of the existing research focuses on developing load models, while little has been done on developing formal load model verification and validation (V&V) methodologies or procedures. Most of the existing load model validation is based on qualitative rather than quantitative analysis. In addition, not all aspects of the model V&V problem have been addressed by the existing approaches. To complement the existing methods, this paper proposes a novel load model verification and validation framework that can systematically and more comprehensively examine a load model's effectiveness and accuracy. Statistical analysis, instead of visual checks, quantifies the load model's accuracy and provides model users with a confidence level for the developed load model. The analysis results can also be used to calibrate load models. The proposed framework can be used by utility engineers and researchers as guidance for systematically examining load models. The proposed method is demonstrated through analysis of field measurements collected from a utility system.

  3. Identifying, Visualizing, and Fusing Social Media Data to Support Nonproliferation and Arms Control Treaty Verification: Preliminary Results

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gastelum, Zoe N.; Cramer, Nicholas O.; Benz, Jacob M.

    While international nonproliferation and arms control verification capabilities have their foundations in physical and chemical sensors, state declarations, and on-site inspections, verification experts are beginning to consider the importance of open source data to complement and support traditional means of verification. One of those new, and increasingly expanding, sources of open source information is social media, which can be ingested and understood through social media analytics (SMA). Pacific Northwest National Laboratory (PNNL) is conducting research to further our ability to identify, visualize, and fuse social media data to support nonproliferation and arms control treaty verification efforts. This paper will describe our preliminary research to examine social media signatures of nonproliferation or arms control proxy events. We will describe the development of our preliminary nonproliferation and arms control proxy events, outline our initial findings, and propose ideas for future work.

  4. Glove-based approach to online signature verification.

    PubMed

    Kamel, Nidal S; Sayeed, Shohel; Ellis, Grant A

    2008-06-01

    Utilizing the multiple degrees of freedom offered by the data glove for each finger and the hand, a novel online signature verification system that uses the Singular Value Decomposition (SVD) as the numerical tool for signature classification and verification is presented. The proposed technique applies the SVD to find the r singular vectors that sense the maximal energy of the glove data matrix A, called the principal subspace, so that the effective dimensionality of A can be reduced. Having modeled the data glove signature through its r-dimensional principal subspace, signature authentication is performed by finding the angles between the different subspaces. A demonstration of the data glove as an effective high-bandwidth data entry device for signature verification is presented. This SVD-based signature verification technique is tested, and its performance is shown to be able to recognize forged signatures with a false acceptance rate of less than 1.2%.
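
    The sketch below illustrates the subspace-angle comparison on made-up glove data: the principal subspace of each data matrix is extracted with the SVD and compared via principal angles. The channel count, the choice r = 3, and the acceptance threshold are assumptions.

```python
# Principal-subspace extraction and subspace-angle comparison (synthetic glove data).
import numpy as np
from scipy.linalg import subspace_angles

def principal_subspace(A, r=3):
    # Columns of U[:, :r] span the r-dimensional principal subspace of the
    # glove data matrix A (channels x time samples).
    U, _, _ = np.linalg.svd(A, full_matrices=False)
    return U[:, :r]

rng = np.random.default_rng(0)
t = np.linspace(0, 1, 200)
latent = np.vstack([np.sin(2 * np.pi * 3 * t), np.cos(2 * np.pi * 5 * t), t])  # 3 latent trajectories
mix = rng.normal(size=(22, 3))                        # mixing into 22 glove channels
base = mix @ latent                                   # enrolled signature
genuine = base + 0.05 * rng.normal(size=base.shape)   # repeat signature, small variation
forgery = rng.normal(size=base.shape)                 # unrelated hand motion

ref = principal_subspace(base)
for name, trial in [("genuine", genuine), ("forgery", forgery)]:
    angles = subspace_angles(ref, principal_subspace(trial))
    score = float(np.max(angles))                     # largest principal angle (radians)
    print(name, "max angle:", round(score, 3), "->", "accept" if score < 0.5 else "reject")
```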

  5. 6 CFR 5.2 - Public reading rooms.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... the non-proliferation and verification research and development program; (v) The life sciences..., Washington, DC 20472 (for regional offices, consult your phone book); (20) For the Federal Computer Incident...

  6. 6 CFR 5.2 - Public reading rooms.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... the non-proliferation and verification research and development program; (v) The life sciences..., Washington, DC 20472 (for regional offices, consult your phone book); (20) For the Federal Computer Incident...

  7. 6 CFR 5.2 - Public reading rooms.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... the non-proliferation and verification research and development program; (v) The life sciences..., Washington, DC 20472 (for regional offices, consult your phone book); (20) For the Federal Computer Incident...

  8. 6 CFR 5.2 - Public reading rooms.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... the non-proliferation and verification research and development program; (v) The life sciences..., Washington, DC 20472 (for regional offices, consult your phone book); (20) For the Federal Computer Incident...

  9. 6 CFR 5.2 - Public reading rooms.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... the non-proliferation and verification research and development program; (v) The life sciences..., Washington, DC 20472 (for regional offices, consult your phone book); (20) For the Federal Computer Incident...

  10. 78 FR 18362 - Proposed Information Collection for Public Comment: Enterprise Income Verification (EIV) Systems...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-03-26

    ... affected agencies concerning the proposed collection of information to: (1) Evaluate whether the proposed... user with information related to the Rules of Behavior for system usage and the user's responsibilities... DEPARTMENT OF HOUSING AND URBAN DEVELOPMENT [Docket No. FR-5690-N-05] Proposed Information...

  11. Methodology for the specification of communication activities within the framework of a multi-layered architecture: Toward the definition of a knowledge base

    NASA Astrophysics Data System (ADS)

    Amyay, Omar

    A method defined in terms of synthesis and verification steps is presented. The specification of the services and protocols of communication within a multilayered architecture of the Open Systems Interconnection (OSI) type is an essential issue for the design of computer networks. The aim is to obtain an operational specification of the protocol service couple of a given layer. Planning synthesis and verification steps constitute a specification trajectory. The latter is based on the progressive integration of the 'initial data' constraints and verification of the specification originating from each synthesis step, through validity constraints that characterize an admissible solution. Two types of trajectories are proposed according to the style of the initial specification of the service protocol couple: operational type and service supplier viewpoint; knowledge property oriented type and service viewpoint. Synthesis and verification activities were developed and formalized in terms of labeled transition systems, temporal logic and epistemic logic. The originality of the second specification trajectory and the use of the epistemic logic are shown. An 'artificial intelligence' approach enables a conceptual model to be defined for a knowledge base system for implementing the method proposed. It is structured in three levels of representation of the knowledge relating to the domain, the reasoning characterizing synthesis and verification activities and the planning of the steps of a specification trajectory.

  12. Spitzer Space Telescope in-orbit checkout and science verification operations

    NASA Technical Reports Server (NTRS)

    Linick, Sue H.; Miles, John W.; Gilbert, John B.; Boyles, Carol A.

    2004-01-01

    Spitzer Space Telescope, the fourth and final of NASA's great observatories, and the first mission in NASA's Origins Program was launched 25 August 2003 into an Earth-trailing solar orbit. The observatory was designed to probe and explore the universe in the infrared. Before science data could be acquired, however, the observatory had to be initialized, characterized, calibrated, and commissioned. A two phased operations approach was defined to complete this work. These phases were identified as In-Orbit Checkout (IOC) and Science Verification (SV). Because the observatory lifetime is cryogen-limited these operations had to be highly efficient. The IOC/SV operations design accommodated a pre-defined distributed organizational structure and a complex, cryogenic flight system. Many checkout activities were inter-dependent, and therefore the operations concept and ground data system had to provide the flexibility required for a 'short turn-around' environment. This paper describes the adaptive operations system design and evolution, implementation, and lessons-learned from the completion of IOC/SV.

  13. Execution of the Spitzer In-orbit Checkout and Science Verification Plan

    NASA Technical Reports Server (NTRS)

    Miles, John W.; Linick, Susan H.; Long, Stacia; Gilbert, John; Garcia, Mark; Boyles, Carole; Werner, Michael; Wilson, Robert K.

    2004-01-01

    The Spitzer Space Telescope is an 85-cm telescope with three cryogenically cooled instruments. Following launch, the observatory was initialized and commissioned for science operations during the in-orbit checkout (IOC) and science verification (SV) phases, carried out over a total of 98.3 days. The execution of the IOC/SV mission plan progressively established Spitzer capabilities taking into consideration thermal, cryogenic, optical, pointing, communications, and operational designs and constraints. The plan was carried out with high efficiency, making effective use of cryogen-limited flight time. One key component to the success of the plan was the pre-launch allocation of schedule reserve in the timeline of IOC/SV activities, and how it was used in flight both to cover activity redesign and growth due to continually improving spacecraft and instrument knowledge, and to recover from anomalies. This paper describes the adaptive system design and evolution, implementation, and lessons learned from IOC/SV operations. It is hoped that this information will provide guidance to future missions with similar engineering challenges

  14. Radio frequency interference mitigation using deep convolutional neural networks

    NASA Astrophysics Data System (ADS)

    Akeret, J.; Chang, C.; Lucchi, A.; Refregier, A.

    2017-01-01

    We propose a novel approach for mitigating radio frequency interference (RFI) signals in radio data using the latest advances in deep learning. We employ a special type of Convolutional Neural Network, the U-Net, that enables the classification of clean signal and RFI signatures in 2D time-ordered data acquired from a radio telescope. We train and assess the performance of this network using the HIDE & SEEK radio data simulation and processing packages, as well as early Science Verification data acquired with the 7m single-dish telescope at the Bleien Observatory. We find that our U-Net implementation achieves accuracy competitive with classical RFI mitigation algorithms such as SEEK's SUMTHRESHOLD implementation. We publish our U-Net software package on GitHub under the GPLv3 license.
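
    For readers unfamiliar with the architecture, a toy U-Net-style classifier is sketched below using tf.keras; the patch size and layer widths are illustrative assumptions and far smaller than the network described in the paper.

```python
# Minimal U-Net-style per-pixel RFI classifier for 256x256 time-frequency patches.
import tensorflow as tf
from tensorflow.keras import layers

def tiny_unet(shape=(256, 256, 1)):
    inp = layers.Input(shape)
    c1 = layers.Conv2D(16, 3, padding="same", activation="relu")(inp)   # encoder level 1
    p1 = layers.MaxPooling2D()(c1)
    c2 = layers.Conv2D(32, 3, padding="same", activation="relu")(p1)    # bottleneck
    u1 = layers.Conv2DTranspose(16, 2, strides=2, padding="same")(c2)   # decoder
    m1 = layers.concatenate([u1, c1])                                    # skip connection
    c3 = layers.Conv2D(16, 3, padding="same", activation="relu")(m1)
    out = layers.Conv2D(1, 1, activation="sigmoid")(c3)                  # per-pixel RFI probability
    return tf.keras.Model(inp, out)

model = tiny_unet()
model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])
model.summary()
# Training would use pairs of dirty patches and boolean RFI masks, e.g.
# model.fit(x_train, mask_train, epochs=10, batch_size=8)
```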

  15. 15 CFR Supplement No. 4 to Part 748 - Authorities Administering Import Certificate/Delivery Verification (IC/DV) and End-User Statement...

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... 11011 Indian IC For the “organized” sector, except for computers and related equipment: Directorate... 110011 Indian IC For computers and related electronic items: Department of Electronics, Lok Nayak Bhawan... and Exports 5, Civic Center Islamabad IC Joint Science Advisor, Ministry of Science and Technology...

  16. 15 CFR Supplement No. 4 to Part 748 - Authorities Administering Import Certificate/Delivery Verification (IC/DV) and End-User Statement...

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... 11011 Indian IC For the “organized” sector, except for computers and related equipment: Directorate... 110011 Indian IC For computers and related electronic items: Department of Electronics, Lok Nayak Bhawan... and Exports 5, Civic Center Islamabad IC Joint Science Advisor, Ministry of Science and Technology...

  17. 15 CFR Supplement No. 4 to Part 748 - Authorities Administering Import Certificate/Delivery Verification (IC/DV) and End-User Statement...

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... 11011 Indian IC For the “organized” sector, except for computers and related equipment: Directorate... 110011 Indian IC For computers and related electronic items: Department of Electronics, Lok Nayak Bhawan... and Exports 5, Civic Center Islamabad IC Joint Science Advisor, Ministry of Science and Technology...

  18. 15 CFR Supplement No. 4 to Part 748 - Authorities Administering Import Certificate/Delivery Verification (IC/DV) and End-User Statement...

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... 11011 Indian IC For the “organized” sector, except for computers and related equipment: Directorate... 110011 Indian IC For computers and related electronic items: Department of Electronics, Lok Nayak Bhawan... Controller of Imports and Exports 5, Civic Center Islamabad IC Joint Science Advisor, Ministry of Science and...

  19. 15 CFR Supplement No. 4 to Part 748 - Authorities Administering Import Certificate/Delivery Verification (IC/DV) and End-User Statement...

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 11011 Indian IC For the “organized” sector, except for computers and related equipment: Directorate... 110011 Indian IC For computers and related electronic items: Department of Electronics, Lok Nayak Bhawan... Controller of Imports and Exports 5, Civic Center Islamabad IC Joint Science Advisor, Ministry of Science and...

  20. RTL validation methodology on high complexity wireless microcontroller using OVM technique for fast time to market

    NASA Astrophysics Data System (ADS)

    Zhafirah Muhammad, Nurul; Harun, A.; Hambali, N. A. M. A.; Murad, S. A. Z.; Mohyar, S. N.; Isa, M. N.; Jambek, AB

    2017-11-01

    Increased demand for Internet of Things (IoT) based applications has inadvertently pushed integrated circuits supporting SoCs towards higher complexity. Such a rapid increase in complexity calls for correspondingly sophisticated validation strategies, and it has led researchers to propose a variety of methodologies to address the problem, in essence giving rise to dynamic verification, formal verification and hybrid techniques. Moreover, it is very important to discover bugs early in the verification process of an SoC in order to reduce the time consumed and achieve fast time to market for the system. In this paper we therefore focus on a verification methodology that can be applied at the Register Transfer Level (RTL) of an SoC based on the AMBA bus design. In addition, the Open Verification Methodology (OVM) offers an easier approach to RTL validation, not as a replacement for the traditional method but as a means of achieving fast time to market for the system. Thus, OVM is proposed in this paper as the verification method for larger designs, helping to avoid bottlenecks in the validation platform.

  1. Power Performance Verification of a Wind Farm Using the Friedman's Test.

    PubMed

    Hernandez, Wilmar; López-Presa, José Luis; Maldonado-Correa, Jorge L

    2016-06-03

    In this paper, a method of verification of the power performance of a wind farm is presented. This method is based on the Friedman's test, which is a nonparametric statistical inference technique, and it uses the information that is collected by the SCADA system from the sensors embedded in the wind turbines in order to carry out the power performance verification of a wind farm. Here, the guaranteed power curve of the wind turbines is used as one more wind turbine of the wind farm under assessment, and a multiple comparison method is used to investigate differences between pairs of wind turbines with respect to their power performance. The proposed method says whether the power performance of the specific wind farm under assessment differs significantly from what would be expected, and it also allows wind farm owners to know whether their wind farm has either a perfect power performance or an acceptable power performance. Finally, the power performance verification of an actual wind farm is carried out. The results of the application of the proposed method showed that the power performance of the specific wind farm under assessment was acceptable.
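
    The statistical core of the method can be illustrated with SciPy's Friedman test, as in the sketch below; the turbine power values, the use of the guaranteed curve as an extra column, and the significance level are made-up assumptions.

```python
# Friedman test across wind-speed bins (blocks) and turbines (treatments); synthetic data.
import numpy as np
from scipy.stats import friedmanchisquare

# Rows: wind-speed bins; columns: guaranteed curve + three turbines, power in kW
power = np.array([
    [ 310,  305,  298,  312],
    [ 620,  610,  590,  615],
    [1050, 1040,  990, 1045],
    [1480, 1470, 1400, 1472],
    [1830, 1825, 1750, 1828],
    [1990, 1985, 1930, 1988],
])

stat, p_value = friedmanchisquare(*power.T)   # one sample per column (turbine)
print(f"Friedman statistic = {stat:.2f}, p = {p_value:.4f}")
if p_value < 0.05:
    print("at least one turbine's power performance differs; follow with multiple comparisons")
else:
    print("no significant power-performance differences detected")
```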

  2. Power Performance Verification of a Wind Farm Using the Friedman’s Test

    PubMed Central

    Hernandez, Wilmar; López-Presa, José Luis; Maldonado-Correa, Jorge L.

    2016-01-01

    In this paper, a method of verification of the power performance of a wind farm is presented. This method is based on the Friedman’s test, which is a nonparametric statistical inference technique, and it uses the information that is collected by the SCADA system from the sensors embedded in the wind turbines in order to carry out the power performance verification of a wind farm. Here, the guaranteed power curve of the wind turbines is used as one more wind turbine of the wind farm under assessment, and a multiple comparison method is used to investigate differences between pairs of wind turbines with respect to their power performance. The proposed method says whether the power performance of the specific wind farm under assessment differs significantly from what would be expected, and it also allows wind farm owners to know whether their wind farm has either a perfect power performance or an acceptable power performance. Finally, the power performance verification of an actual wind farm is carried out. The results of the application of the proposed method showed that the power performance of the specific wind farm under assessment was acceptable. PMID:27271628

  3. Ground vibration tests of a high fidelity truss for verification of on orbit damage location techniques

    NASA Technical Reports Server (NTRS)

    Kashangaki, Thomas A. L.

    1992-01-01

    This paper describes a series of modal tests that were performed on a cantilevered truss structure. The goal of the tests was to assemble a large database of high quality modal test data for use in verification of proposed methods for on orbit model verification and damage detection in flexible truss structures. A description of the hardware is provided along with details of the experimental setup and procedures for 16 damage cases. Results from selected cases are presented and discussed. Differences between ground vibration testing and on orbit modal testing are also described.

  4. Face verification with balanced thresholds.

    PubMed

    Yan, Shuicheng; Xu, Dong; Tang, Xiaoou

    2007-01-01

    The process of face verification is guided by a pre-learned global threshold, which, however, is often inconsistent with class-specific optimal thresholds. It is, hence, beneficial to pursue a balance of the class-specific thresholds in the model-learning stage. In this paper, we present a new dimensionality reduction algorithm tailored to the verification task that ensures threshold balance. This is achieved by the following aspects. First, feasibility is guaranteed by employing an affine transformation matrix, instead of the conventional projection matrix, for dimensionality reduction, and, hence, we call the proposed algorithm threshold balanced transformation (TBT). Then, the affine transformation matrix, constrained as the product of an orthogonal matrix and a diagonal matrix, is optimized to improve the threshold balance and classification capability in an iterative manner. Unlike most algorithms for face verification which are directly transplanted from face identification literature, TBT is specifically designed for face verification and clarifies the intrinsic distinction between these two tasks. Experiments on three benchmark face databases demonstrate that TBT significantly outperforms the state-of-the-art subspace techniques for face verification.

  5. A new approach to hand-based authentication

    NASA Astrophysics Data System (ADS)

    Amayeh, G.; Bebis, G.; Erol, A.; Nicolescu, M.

    2007-04-01

    Hand-based authentication is a key biometric technology with a wide range of potential applications both in industry and government. Traditionally, hand-based authentication is performed by extracting information from the whole hand. To account for hand and finger motion, guidance pegs are employed to fix the position and orientation of the hand. In this paper, we consider a component-based approach to hand-based verification. Our objective is to investigate the discrimination power of different parts of the hand in order to develop a simpler, faster, and possibly more accurate and robust verification system. Specifically, we propose a new approach which decomposes the hand in different regions, corresponding to the fingers and the back of the palm, and performs verification using information from certain parts of the hand only. Our approach operates on 2D images acquired by placing the hand on a flat lighting table. Using a part-based representation of the hand allows the system to compensate for hand and finger motion without using any guidance pegs. To decompose the hand in different regions, we use a robust methodology based on morphological operators which does not require detecting any landmark points on the hand. To capture the geometry of the back of the palm and the fingers in sufficient detail, we employ high-order Zernike moments which are computed using an efficient methodology. The proposed approach has been evaluated on a database of 100 subjects with 10 images per subject, illustrating promising performance. Comparisons with related approaches using the whole hand for verification illustrate the superiority of the proposed approach. Moreover, qualitative comparisons with state-of-the-art approaches indicate that the proposed approach has comparable or better performance.

  6. Integrity Verification for Multiple Data Copies in Cloud Storage Based on Spatiotemporal Chaos

    NASA Astrophysics Data System (ADS)

    Long, Min; Li, You; Peng, Fei

    Aiming to strike a balance between the security, efficiency and availability of data verification in cloud storage, a novel integrity verification scheme based on spatiotemporal chaos is proposed for multiple data copies. Spatiotemporal chaos is implemented for node calculation of the binary tree, and the location of the data in the cloud is verified. Meanwhile, dynamic operations can be applied to the data. Furthermore, blind information is used to prevent a third-party auditor (TPA) from leaking the users’ private data in a public auditing process. Performance analysis and discussion indicate that the scheme is secure and efficient, and that it supports dynamic operations and the integrity verification of multiple copies of data. It has great potential to be implemented in cloud storage services.

  7. A Systematic Method for Verification and Validation of Gyrokinetic Microstability Codes

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bravenec, Ronald

    My original proposal for the period Feb. 15, 2014 through Feb. 14, 2017 called for an integrated validation and verification effort carried out by myself with collaborators. The validation component would require experimental profile and power-balance analysis. In addition, it would require running the gyrokinetic codes varying the input profiles within experimental uncertainties to seek agreement with experiment before discounting a code as invalidated. Therefore, validation would require a major increase of effort over my previous grant periods which covered only code verification (code benchmarking). Consequently, I had requested full-time funding. Instead, I am being funded at somewhat less than half time (5 calendar months per year). As a consequence, I decided to forego the validation component and to only continue the verification efforts.

  8. A proposed standard method for polarimetric calibration and calibration verification

    NASA Astrophysics Data System (ADS)

    Persons, Christopher M.; Jones, Michael W.; Farlow, Craig A.; Morell, L. Denise; Gulley, Michael G.; Spradley, Kevin D.

    2007-09-01

    Accurate calibration of polarimetric sensors is critical to reducing and analyzing phenomenology data, producing uniform polarimetric imagery for deployable sensors, and ensuring predictable performance of polarimetric algorithms. It is desirable to develop a standard calibration method, including verification reporting, in order to increase credibility with customers and foster communication and understanding within the polarimetric community. This paper seeks to facilitate discussions within the community on arriving at such standards. Both the calibration and verification methods presented here are performed easily with common polarimetric equipment, and are applicable to visible and infrared systems with either partial Stokes or full Stokes sensitivity. The calibration procedure has been used on infrared and visible polarimetric imagers over a six year period, and resulting imagery has been presented previously at conferences and workshops. The proposed calibration method involves the familiar calculation of the polarimetric data reduction matrix by measuring the polarimeter's response to a set of input Stokes vectors. With this method, however, linear combinations of Stokes vectors are used to generate highly accurate input states. This allows the direct measurement of all system effects, in contrast with fitting modeled calibration parameters to measured data. This direct measurement of the data reduction matrix allows higher order effects that are difficult to model to be discovered and corrected for in calibration. This paper begins with a detailed tutorial on the proposed calibration and verification reporting methods. Example results are then presented for a LWIR rotating half-wave retarder polarimeter.
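
    The core calculation, measuring the polarimeter's response to known input Stokes vectors and inverting that response, can be sketched as a small least-squares problem. The numbers and the synthetic instrument matrix below are illustrative and are not taken from the paper.

      # Calibration sketch: estimate the measurement matrix W from responses to
      # known Stokes inputs, then form the data reduction matrix D = pinv(W).
      import numpy as np

      rng = np.random.default_rng(0)

      # Known generated input states, one Stokes vector per column:
      # unpolarized, horizontal linear, +45-degree linear, right circular.
      S_in = np.array([[1.0, 1.0, 1.0, 1.0],
                       [0.0, 1.0, 0.0, 0.0],
                       [0.0, 0.0, 1.0, 0.0],
                       [0.0, 0.0, 0.0, 1.0]])

      # Stand-in for the unknown instrument and its (slightly noisy) readings.
      W_true = rng.normal(size=(4, 4))
      M = W_true @ S_in + 1e-3 * rng.normal(size=(4, 4))

      # Solve M ≈ W @ S_in for W in the least-squares sense.
      W_est = np.linalg.lstsq(S_in.T, M.T, rcond=None)[0].T

      # Data reduction matrix: maps raw measurements back to Stokes vectors.
      D = np.linalg.pinv(W_est)
      S_est = D @ M          # recovers S_in up to the measurement noise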

  9. Verification of Gyrokinetic codes: Theoretical background and applications

    NASA Astrophysics Data System (ADS)

    Tronko, Natalia; Bottino, Alberto; Görler, Tobias; Sonnendrücker, Eric; Told, Daniel; Villard, Laurent

    2017-05-01

    In fusion plasmas, the strong magnetic field allows the fast gyro-motion to be systematically removed from the description of the dynamics, resulting in a considerable model simplification and gain of computational time. Nowadays, the gyrokinetic (GK) codes play a major role in the understanding of the development and the saturation of turbulence and in the prediction of the subsequent transport. Naturally, these codes require thorough verification and validation. Here, we present a new and generic theoretical framework and specific numerical applications to test the faithfulness of the implemented models to theory and to verify the domain of applicability of existing GK codes. For a sound verification process, the underlying theoretical GK model and the numerical scheme must be considered at the same time, which has rarely been done and therefore makes this approach pioneering. At the analytical level, the main novelty consists in using advanced mathematical tools such as variational formulation of dynamics for systematization of basic GK code's equations to access the limits of their applicability. The verification of the numerical scheme is proposed via the benchmark effort. In this work, specific examples of code verification are presented for two GK codes: the multi-species electromagnetic ORB5 (PIC) and the radially global version of GENE (Eulerian). The proposed methodology can be applied to any existing GK code. We establish a hierarchy of reduced GK Vlasov-Maxwell equations implemented in the ORB5 and GENE codes using the Lagrangian variational formulation. At the computational level, detailed verifications of global electromagnetic test cases developed from the CYCLONE Base Case are considered, including a parametric β-scan covering the transition from ITG to KBM and the spectral properties at the nominal β value.

  10. Influence of the Redundant Verification and the Non-Redundant Verification on the Hydraulic Tomography

    NASA Astrophysics Data System (ADS)

    Wei, T. B.; Chen, Y. L.; Lin, H. R.; Huang, S. Y.; Yeh, T. C. J.; Wen, J. C.

    2016-12-01

    In groundwater studies, many researchers have used hydraulic tomography (HT) with field-site pumping tests to estimate, by inverse modeling, the heterogeneous spatial distribution of hydraulic properties, confirming that most field-site aquifers have spatially heterogeneous hydrogeological parameters. Huang et al. [2011] used a non-redundant verification analysis, in which the pumping wells change location while the observation wells remain fixed in both the inverse and forward analyses, to demonstrate the feasibility of estimating the heterogeneous spatial distribution of hydraulic properties of a field-site aquifer with a steady-state model. The existing literature, however, covers only steady-state, non-redundant verification with changing pumping-well locations and fixed observation wells. The various combinations of pumping wells fixed or changed in location with observation wells fixed (redundant verification) or changed (non-redundant verification) have not yet been explored for their influence on the hydraulic tomography method. In this study, both the redundant and the non-redundant verification methods were carried out for the forward analysis to examine their influence on hydraulic tomography in the transient case. The methods were applied to an actual case at the NYUST campus site to demonstrate the effectiveness of hydraulic tomography and to confirm the feasibility of the inverse and forward analyses from the results. Keywords: Hydraulic Tomography, Redundant Verification, Heterogeneous, Inverse, Forward

  11. Face Verification across Age Progression using Discriminative Methods

    DTIC Science & Technology

    2008-01-01

    progression. The most related study to our work is [30], where the probabilistic eigenspace framework [22] is adapted for face identification across...solution has the same CAR and CRR, is frequently used to measure verification performance, B. Gradient Orientation and Gradient Orientation Pyramid Now we...proposed GOP representation. The other five approaches are different from our method in both representations and classification frameworks. For

  12. Design Authority in the Test Programme Definition: The Alenia Spazio Experience

    NASA Astrophysics Data System (ADS)

    Messidoro, P.; Sacchi, E.; Beruto, E.; Fleming, P.; Marucchi Chierro, P.-P.

    2004-08-01

    Since the Verification and Test Programme is a significant part of the spacecraft development life cycle in terms of cost and time, the discussion very often has the objective of optimizing the verification campaign through the possible deletion or limitation of some testing activities. The increased market pressure to reduce the project's schedule and cost is driving a dialectic process inside the project teams, involving program management and design authorities, in order to optimize the verification and testing programme. The paper introduces the Alenia Spazio experience in this context, coming from real project life on different products and missions (science, TLC, EO, manned, transportation, military, commercial, recurrent and one-of-a-kind). Usually the applicable verification and testing standards (e.g. ECSS-E-10 part 2 "Verification" and ECSS-E-10 part 3 "Testing" [1]) are tailored to the specific project on the basis of its peculiar mission constraints. The Model Philosophy and the associated verification and test programme are defined following an iterative process which suitably combines several aspects (including, for example, test requirements and facilities) as shown in Fig. 1 (from ECSS-E-10). The considered cases are mainly oriented to thermal and mechanical verification, where the benefits of possible test programme optimizations are more significant. Considering the thermal qualification and acceptance testing (i.e. Thermal Balance and Thermal Vacuum), the lessons learned from the development of several satellites are presented together with the corresponding recommended approaches. In particular, the cases are indicated in which a proper Thermal Balance Test is mandatory and others, in the presence of a more recurrent design, where qualification by analysis could be envisaged. The importance of a proper Thermal Vacuum exposure for workmanship verification is also highlighted. Similar considerations are summarized for the mechanical testing, with particular emphasis on the importance of Modal Survey, Static and Sine Vibration Tests in the qualification stage in combination with the effectiveness of the Vibro-Acoustic Test in acceptance. The apparent relative importance of the Sine Vibration Test for workmanship verification in specific circumstances is also highlighted. Fig. 1. Model philosophy, Verification and Test Programme definition. The verification of the project requirements is planned through a combination of suitable verification methods (in particular Analysis and Test) at the different verification levels (from System down to Equipment), in the proper verification stages (e.g. in Qualification and Acceptance).

  13. Study for verification testing of the helmet-mounted display in the Japanese Experimental Module.

    PubMed

    Nakajima, I; Yamamoto, I; Kato, H; Inokuchi, S; Nemoto, M

    2000-02-01

    Our purpose is to propose a research and development project in the field of telemedicine. The proposed Multimedia Telemedicine Experiment for Extra-Vehicular Activity will entail experiments designed to support astronaut health management during Extra-Vehicular Activity (EVA). Experiments will have relevant applications to the Japanese Experimental Module (JEM) operated by National Space Development Agency of Japan (NASDA) for the International Space Station (ISS). In essence, this is a proposal for verification testing of the Helmet-Mounted Display (HMD), which enables astronauts to verify their own blood pressures and electrocardiograms, and to view a display of instructions from the ground station and listings of work procedures. Specifically, HMD is a device designed to project images and data inside the astronaut's helmet. We consider this R&D proposal to be one of the most suitable projects under consideration in response to NASDA's open invitation calling for medical experiments to be conducted on JEM.

  14. 78 FR 45729 - Foreign Supplier Verification Programs for Importers of Food for Humans and Animals

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-07-29

    ...The Food and Drug Administration (FDA) is proposing to adopt regulations on foreign supplier verification programs (FSVPs) for importers of food for humans and animals. The proposed regulations would require importers to help ensure that food imported into the United States is produced in compliance with processes and procedures, including reasonably appropriate risk-based preventive controls, that provide the same level of public health protection as those required under the hazard analysis and risk-based preventive controls and standards for produce safety sections of the Federal Food, Drug, and Cosmetic Act (the FD&C Act), is not adulterated, and is not misbranded with respect to food allergen labeling. We are proposing these regulations in accordance with the FDA Food Safety Modernization Act (FSMA). The proposed regulations would help ensure that imported food is produced in a manner consistent with U.S. standards.

  15. Competitive region orientation code for palmprint verification and identification

    NASA Astrophysics Data System (ADS)

    Tang, Wenliang

    2015-11-01

    Orientation features of the palmprint have been widely investigated in coding-based palmprint-recognition methods. Conventional orientation-based coding methods usually used discrete filters to extract the orientation feature of palmprint. However, in real operations, the orientations of the filter usually are not consistent with the lines of the palmprint. We thus propose a competitive region orientation-based coding method. Furthermore, an effective weighted balance scheme is proposed to improve the accuracy of the extracted region orientation. Compared with conventional methods, the region orientation of the palmprint extracted using the proposed method can precisely and robustly describe the orientation feature of the palmprint. Extensive experiments on the baseline PolyU and multispectral palmprint databases are performed and the results show that the proposed method achieves a promising performance in comparison to conventional state-of-the-art orientation-based coding methods in both palmprint verification and identification.
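
    The competitive idea, assigning each pixel the orientation whose line-like filter responds most strongly, can be sketched with a small Gabor filter bank. The kernel parameters, the number of orientations, and the use of the most negative response (palm lines being dark) are illustrative choices, not the paper's.

      # Competitive orientation code: per-pixel winning orientation plus an
      # angular (wrap-around) distance between two codes.
      import numpy as np
      from scipy.signal import convolve2d

      def gabor_kernel(theta, sigma=4.0, wavelength=8.0, size=17):
          half = size // 2
          y, x = np.mgrid[-half:half + 1, -half:half + 1]
          xr = x * np.cos(theta) + y * np.sin(theta)
          g = np.exp(-(x ** 2 + y ** 2) / (2 * sigma ** 2)) * np.cos(2 * np.pi * xr / wavelength)
          return g - g.mean()                        # zero mean: flat regions give no response

      def orientation_code(img, n_orientations=6):
          responses = np.stack([
              convolve2d(img, gabor_kernel(k * np.pi / n_orientations), mode="same")
              for k in range(n_orientations)
          ])
          return responses.argmin(axis=0)            # winning orientation index per pixel

      def code_distance(code_a, code_b, n_orientations=6):
          d = np.abs(code_a - code_b)
          d = np.minimum(d, n_orientations - d)      # orientations wrap around
          return d.mean() / (n_orientations // 2)    # 0 = identical, 1 = maximally different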

  16. Application of Architectural Patterns and Lightweight Formal Method for the Validation and Verification of Safety Critical Systems

    DTIC Science & Technology

    2013-09-01

    to a XML file, a code that Bonine in [21] developed for a similar purpose. Using the StateRover XML log file import tool, we are able to generate a...C. Bonine , M. Shing, T.W. Otani, “Computer-aided process and tools for mobile software acquisition,” NPS, Monterey, CA, Tech. Rep. NPS-SE-13...C10P07R05– 075, 2013. [21] C. Bonine , “Specification, validation and verification of mobile application behavior,” M.S. thesis, Dept. Comp. Science, NPS

  17. Hubble Space Telescope Fine Guidance Sensors Instrument Handbook, version 4.0

    NASA Technical Reports Server (NTRS)

    Holfeltz, S. T. (Editor)

    1994-01-01

    This is a revised version of the Hubble Space Telescope Fine Guidance Sensor Instrument Handbook. The main goal of this edition is to help the potential General Observer (GO) learn how to most efficiently use the Fine Guidance Sensors (FGS's). First, the actual performance of the FGS's as scientific instruments is reviewed. Next, each of the available operating modes of the FGS's is reviewed in turn. The status and findings of pertinent calibrations, including Orbital Verification, Science Verification, and Instrument Scientist Calibrations, are included as well as the relevant data reduction software.

  18. WE-DE-BRA-01: SCIENCE COUNCIL JUNIOR INVESTIGATOR COMPETITION WINNER: Acceleration of a Limited-Angle Intrafraction Verification (LIVE) System Using Adaptive Prior Knowledge Based Image Estimation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zhang, Y; Yin, F; Ren, L

    Purpose: To develop an adaptive prior knowledge based image estimation method to reduce the scan angle needed in the LIVE system to reconstruct 4D-CBCT for intrafraction verification. Methods: The LIVE system has been previously proposed to reconstruct 4D volumetric images on-the-fly during arc treatment for intrafraction target verification and dose calculation. This system uses limited-angle beam's eye view (BEV) MV cine images acquired from the treatment beam together with the orthogonally acquired limited-angle kV projections to reconstruct 4D-CBCT images for target verification during treatment. In this study, we developed an adaptive constrained free-form deformation reconstruction technique in LIVE to further reduce the scanning angle needed to reconstruct the CBCT images. This technique uses free-form deformation with energy minimization to deform prior images to estimate 4D-CBCT based on projections acquired in a limited angle (orthogonal 6°) during the treatment. Note that the prior images are adaptively updated using the latest CBCT images reconstructed by LIVE during treatment to utilize the continuity of patient motion. The 4D digital extended-cardiac-torso (XCAT) phantom was used to evaluate the efficacy of this technique with the LIVE system. A lung patient was simulated with different scenarios, including baseline drifts, amplitude change, and phase shift. Limited-angle orthogonal kV and beam's eye view (BEV) MV projections were generated for each scenario. The CBCT images reconstructed from these projections were compared with the ground truth generated in XCAT. Volume-percentage-difference (VPD) and center-of-mass-shift (COMS) were calculated between the reconstructed and the ground-truth tumors to evaluate the reconstruction accuracy. Results: Using orthogonal-view 6° kV and BEV MV projections, the VPD/COMS values were 12.7±4.0%/0.7±0.5 mm, 13.0±5.1%/0.8±0.5 mm, and 11.4±5.4%/0.5±0.3 mm for the three scenarios, respectively. Conclusion: The technique enables LIVE to accurately reconstruct 4D-CBCT images using only an orthogonal 6° scan angle, which greatly improves the efficiency and reduces the imaging dose of LIVE for intrafraction verification.
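
    The two evaluation metrics quoted in the results, volume-percentage-difference (VPD) and center-of-mass shift (COMS), reduce to a few lines of array arithmetic. The definitions below follow a common convention (non-overlapping volume relative to the ground truth, and Euclidean shift of the centroid); the authors' exact definitions may differ.

      # VPD and COMS between a reconstructed and a ground-truth binary tumor mask.
      import numpy as np
      from scipy.ndimage import center_of_mass

      def vpd(reconstructed, ground_truth):
          r, g = reconstructed.astype(bool), ground_truth.astype(bool)
          return 100.0 * np.logical_xor(r, g).sum() / g.sum()

      def coms(reconstructed, ground_truth, voxel_size_mm=(1.0, 1.0, 1.0)):
          shift = np.subtract(center_of_mass(reconstructed), center_of_mass(ground_truth))
          return float(np.linalg.norm(shift * np.asarray(voxel_size_mm)))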

  19. Dispelling myths about verification of sea-launched cruise missiles.

    PubMed

    Lewis, G N; Ride, S K; Townsend, J S

    1989-11-10

    It is widely believed that an arms control limit on nuclear-armed sea-launched cruise missiles would be nearly impossible to verify. Among the reasons usually given are: these weapons are small, built in nondistinctive industrial facilities, deployed on a variety of ships and submarines, and difficult to distinguish from their conventionally armed counterparts. In this article, it is argued that the covert production and deployment of nuclear-armed sea-launched cruise missiles would not be so straightforward. A specific arms control proposal is described, namely a total ban on nuclear-armed sea-launched cruise missiles. This proposal is used to illustrate how an effective verification scheme might be constructed.

  20. Arithmetic Circuit Verification Based on Symbolic Computer Algebra

    NASA Astrophysics Data System (ADS)

    Watanabe, Yuki; Homma, Naofumi; Aoki, Takafumi; Higuchi, Tatsuo

    This paper presents a formal approach to verify arithmetic circuits using symbolic computer algebra. Our method describes arithmetic circuits directly with high-level mathematical objects based on weighted number systems and arithmetic formulae. Such circuit description can be effectively verified by polynomial reduction techniques using Gröbner Bases. In this paper, we describe how the symbolic computer algebra can be used to describe and verify arithmetic circuits. The advantageous effects of the proposed approach are demonstrated through experimental verification of some arithmetic circuits such as multiply-accumulator and FIR filter. The result shows that the proposed approach has a definite possibility of verifying practical arithmetic circuits.
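
    A toy version of the idea, checking that an arithmetic specification lies in the ideal generated by the circuit's polynomial description, fits in a few lines of SymPy. The half adder below is only a stand-in for the multiply-accumulator and FIR filter circuits treated in the paper, and the encoding (Boolean constraints plus gate polynomials) is a standard textbook one rather than the authors' exact formulation.

      # Groebner-basis check that a half adder implements 2*c + s = a + b.
      from sympy import symbols, groebner, reduced

      a, b, s, c = symbols("a b s c")

      circuit = [
          s - (a + b - 2*a*b),    # sum bit: XOR written as a polynomial
          c - a*b,                # carry bit: AND
          a**2 - a, b**2 - b,     # inputs are Boolean (0 or 1)
      ]
      spec = 2*c + s - (a + b)    # specification the circuit should satisfy

      G = groebner(circuit, s, c, a, b, order="lex")
      _, remainder = reduced(spec, list(G), s, c, a, b, order="lex")
      print(remainder)            # 0  ->  the specification is implied by the circuit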

  1. Flow Friction or Spontaneous Ignition?

    NASA Technical Reports Server (NTRS)

    Stoltzfus, Joel M.; Gallus, Timothy D.; Sparks, Kyle

    2012-01-01

    "Flow friction," a proposed ignition mechanism in oxygen systems, has proved elusive in attempts at experimental verification. In this paper, the literature regarding flow friction is reviewed and the experimental verification attempts are briefly discussed. Another ignition mechanism, a form of spontaneous combustion, is proposed as an explanation for at least some of the fire events that have been attributed to flow friction in the literature. In addition, the results of a failure analysis performed at NASA Johnson Space Center White Sands Test Facility are presented, and the observations indicate that spontaneous combustion was the most likely cause of the fire in this 2000 psig (14 MPa) oxygen-enriched system.

  2. Unsupervised learning of structure in spectroscopic cubes

    NASA Astrophysics Data System (ADS)

    Araya, M.; Mendoza, M.; Solar, M.; Mardones, D.; Bayo, A.

    2018-07-01

    We consider the problem of analyzing the structure of spectroscopic cubes using unsupervised machine learning techniques. We propose representing the target's signal as a homogeneous set of volumes through an iterative algorithm that separates the structured emission from the background while not overestimating the flux. Besides verifying some basic theoretical properties, the algorithm is designed to be tuned by domain experts, because its parameters have meaningful values in the astronomical context. Nevertheless, we propose a heuristic to automatically estimate the signal-to-noise ratio parameter of the algorithm directly from data. The resulting light-weighted set of samples (≤ 1% compared to the original data) offer several advantages. For instance, it is statistically correct and computationally inexpensive to apply well-established techniques of the pattern recognition and machine learning domains; such as clustering and dimensionality reduction algorithms. We use ALMA science verification data to validate our method, and present examples of the operations that can be performed by using the proposed representation. Even though this approach is focused on providing faster and better analysis tools for the end-user astronomer, it also opens the possibility of content-aware data discovery by applying our algorithm to big data.

  3. Video-Based Fingerprint Verification

    PubMed Central

    Qin, Wei; Yin, Yilong; Liu, Lili

    2013-01-01

    Conventional fingerprint verification systems use only static information. In this paper, fingerprint videos, which contain dynamic information, are utilized for verification. Fingerprint videos are acquired by the same capture device that acquires conventional fingerprint images, and the user experience of providing a fingerprint video is the same as that of providing a single impression. After preprocessing and aligning processes, “inside similarity” and “outside similarity” are defined and calculated to take advantage of both dynamic and static information contained in fingerprint videos. Match scores between two matching fingerprint videos are then calculated by combining the two kinds of similarity. Experimental results show that the proposed video-based method leads to a relative reduction of 60 percent in the equal error rate (EER) in comparison to the conventional single impression-based method. We also analyze the time complexity of our method when different combinations of strategies are used. Our method still outperforms the conventional method, even if both methods have the same time complexity. Finally, experimental results demonstrate that the proposed video-based method can lead to better accuracy than the multiple impressions fusion method, and the proposed method has a much lower false acceptance rate (FAR) when the false rejection rate (FRR) is quite low. PMID:24008283
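
    The equal error rate (EER) figures quoted above come from sweeping a decision threshold over genuine and impostor match scores until the false acceptance and false rejection rates cross. The sketch below shows that computation on synthetic scores; the score distributions are illustrative, not the paper's data.

      # EER from genuine and impostor match-score samples (higher = better match).
      import numpy as np

      def eer(genuine_scores, impostor_scores):
          thresholds = np.sort(np.concatenate([genuine_scores, impostor_scores]))
          frr = np.array([(genuine_scores < t).mean() for t in thresholds])
          far = np.array([(impostor_scores >= t).mean() for t in thresholds])
          i = np.argmin(np.abs(far - frr))           # threshold where FAR meets FRR
          return (far[i] + frr[i]) / 2.0

      rng = np.random.default_rng(1)
      genuine = rng.normal(0.8, 0.1, 500)            # scores for matching video pairs
      impostor = rng.normal(0.4, 0.1, 5000)          # scores for non-matching pairs
      print(f"EER = {eer(genuine, impostor):.2%}")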

  4. Optical security verification for blurred fingerprints

    NASA Astrophysics Data System (ADS)

    Soon, Boon Y.; Karim, Mohammad A.; Alam, Mohammad S.

    1998-12-01

    Optical fingerprint security verification is gaining popularity, as it has the potential to perform correlation at the speed of light. With advancements in optical security verification techniques, the authentication process can be made almost foolproof and reliable for financial transactions, banking, etc. In law enforcement, when a fingerprint is obtained from a crime scene, it may be blurred and can be an unhealthy candidate for correlation purposes. Therefore, the blurred fingerprint needs to be clarified before it is used for the correlation process. There are several different types of blur, such as linear motion blur and defocus blur, induced by aberrations of the imaging system. In addition, we may or may not know the blur function. In this paper, we propose non-singularity inverse filtering in the frequency/power domain for deblurring known motion-induced blur in fingerprints. This filtering process will be incorporated with the power spectrum subtraction technique, the uniqueness comparison scheme, and the separated target and reference planes method in the joint transform correlator. The proposed hardware implementation is a hybrid electronic-optical correlator system. The performance of the proposed system would be verified with computer simulation for both cases: with and without additive random noise corruption.
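
    The frequency-domain step, dividing by the blur transfer function only where it is safely non-zero, is the essence of the non-singularity inverse filtering mentioned above. The sketch below applies it to a simulated, known horizontal motion blur; the kernel length and the epsilon guard are illustrative.

      # Deblurring by guarded inverse filtering in the frequency domain.
      import numpy as np

      def inverse_filter_deblur(blurred, kernel, eps=1e-2):
          H = np.fft.fft2(kernel, s=blurred.shape)        # blur transfer function
          B = np.fft.fft2(blurred)
          H_safe = np.where(np.abs(H) > eps, H, eps)      # guard the near-singular bins
          return np.fft.ifft2(B / H_safe).real

      # Simulate a known 9-pixel horizontal motion blur, then invert it.
      rng = np.random.default_rng(0)
      image = rng.random((128, 128))
      kernel = np.full((1, 9), 1.0 / 9.0)
      blurred = np.fft.ifft2(np.fft.fft2(image) * np.fft.fft2(kernel, s=image.shape)).real
      restored = inverse_filter_deblur(blurred, kernel)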

  5. A Hybrid On-line Verification Method of Relay Setting

    NASA Astrophysics Data System (ADS)

    Gao, Wangyuan; Chen, Qing; Si, Ji; Huang, Xin

    2017-05-01

    Along with the rapid development of the power industry, grid structure gets more sophisticated. The validity and rationality of protective relaying are vital to the security of power systems. To increase the security of power systems, it is essential to verify the setting values of relays online. Traditional verification methods mainly include the comparison of protection range and the comparison of calculated setting value. To realize on-line verification, the verifying speed is the key. The verifying result of comparing protection range is accurate, but the computation burden is heavy, and the verifying speed is slow. Comparing calculated setting value is much faster, but the verifying result is conservative and inaccurate. Taking the overcurrent protection as example, this paper analyses the advantages and disadvantages of the two traditional methods above, and proposes a hybrid method of on-line verification which synthesizes the advantages of the two traditional methods. This hybrid method can meet the requirements of accurate on-line verification.

  6. [Uniqueness seeking behavior as a self-verification: an alternative approach to the study of uniqueness].

    PubMed

    Yamaoka, S

    1995-06-01

    Uniqueness theory explains that extremely high perceived similarity between self and others evokes negative emotional reactions and causes uniqueness seeking behavior. However, the theory conceptualizes similarity so ambiguously that it appears to suffer from low predictive validity. The purpose of the current article is to propose an alternative explanation of uniqueness seeking behavior. It posits that perceived uniqueness deprivation is a threat to self-concepts, and therefore causes self-verification behavior. Two levels of self verification are conceived: one based on personal categorization and the other on social categorization. The present approach regards uniqueness seeking behavior as the personal-level self verification. To test these propositions, a 2 (very high or moderate similarity information) x 2 (with or without outgroup information) x 2 (high or low need for uniqueness) between-subject factorial-design experiment was conducted with 95 university students. Results supported the self-verification approach, and were discussed in terms of effects of uniqueness deprivation, levels of self-categorization, and individual differences in need for uniqueness.

  7. Formal verification of medical monitoring software using Z language: a representative sample.

    PubMed

    Babamir, Seyed Morteza; Borhani, Mehdi

    2012-08-01

    Medical monitoring systems are useful aids that assist physicians in keeping patients under constant surveillance; however, whether the systems take sound decisions is a physician's concern. As a result, verification of the systems' behavior in monitoring patients is a matter of significance. Patient monitoring is undertaken by software in modern medical systems, so software verification of modern medical systems has received attention. Such verification can be achieved with formal languages having mathematical foundations. Among others, the Z language is a suitable formal language that has been used for formal verification of systems. This study aims to present a constructive method to verify a representative sample of a medical system, in which the system is visually specified and formally verified against patient constraints stated in the Z language. Exploiting our past experience in formally modeling the Continuous Infusion Insulin Pump (CIIP), we take the CIIP system as a representative sample of medical systems for the present study. The system is responsible for monitoring a diabetic's blood sugar.

  8. 78 FR 57320 - Food and Drug Administration Food Safety Modernization Act: Proposed Rules on Foreign Supplier...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-09-18

    ...: Proposed Rules on Foreign Supplier Verification Programs and the Accreditation of Third-Party Auditors... Accreditation of Third-Party Auditors/Certification Bodies would strengthen the quality, objectivity, and... public can review the proposals on FSVP and the Accreditation of Third-Party Auditors/ Certification...

  9. 78 FR 49988 - Food and Drug Administration Food Safety Modernization Act: Proposed Rules on Foreign Supplier...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-08-16

    ...: Proposed Rules on Foreign Supplier Verification Programs and the Accreditation of Third-Party Auditors... Accreditation of Third-Party Auditors/Certification Bodies would strengthen the quality, objectivity, and... that the public can review the proposals on FSVP and the Accreditation of Third-Party Auditors...

  10. 6 CFR Appendix B to Part 5 - Public Reading Rooms of the Department of Homeland Security

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ...-proliferation and verification research and development program. The life sciences activities related to... book) 11. Former components of the General Services Administration: For the Federal Computer Incident...

  11. 6 CFR Appendix B to Part 5 - Public Reading Rooms of the Department of Homeland Security

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ...-proliferation and verification research and development program. The life sciences activities related to... book) 11. Former components of the General Services Administration: For the Federal Computer Incident...

  12. 6 CFR Appendix B to Part 5 - Public Reading Rooms of the Department of Homeland Security

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ...-proliferation and verification research and development program. The life sciences activities related to... book) 11. Former components of the General Services Administration: For the Federal Computer Incident...

  13. 6 CFR Appendix B to Part 5 - Public Reading Rooms of the Department of Homeland Security

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ...-proliferation and verification research and development program. The life sciences activities related to... book) 11. Former components of the General Services Administration: For the Federal Computer Incident...

  14. 6 CFR Appendix B to Part 5 - Public Reading Rooms of the Department of Homeland Security

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ...-proliferation and verification research and development program. The life sciences activities related to... book) 11. Former components of the General Services Administration: For the Federal Computer Incident...

  15. Verification for measurement-only blind quantum computing

    NASA Astrophysics Data System (ADS)

    Morimae, Tomoyuki

    2014-06-01

    Blind quantum computing is a new secure quantum computing protocol where a client who does not have any sophisticated quantum technology can delegate her quantum computing to a server without leaking any privacy. It is known that a client who has only a measurement device can perform blind quantum computing [T. Morimae and K. Fujii, Phys. Rev. A 87, 050301(R) (2013), 10.1103/PhysRevA.87.050301]. It has been an open problem whether the protocol can enjoy the verification, i.e., the ability of the client to check the correctness of the computing. In this paper, we propose a protocol of verification for the measurement-only blind quantum computing.

  16. Behavioral biometrics for verification and recognition of malicious software agents

    NASA Astrophysics Data System (ADS)

    Yampolskiy, Roman V.; Govindaraju, Venu

    2008-04-01

    Homeland security requires technologies capable of positive and reliable identification of humans for law enforcement, government, and commercial applications. As artificially intelligent agents improve in their abilities and become a part of our everyday life, the possibility of using such programs for undermining homeland security increases. Virtual assistants, shopping bots, and game playing programs are used daily by millions of people. We propose applying statistical behavior modeling techniques developed by us for recognition of humans to the identification and verification of intelligent and potentially malicious software agents. Our experimental results demonstrate feasibility of such methods for both artificial agent verification and even for recognition purposes.

  17. A Verification Method for MASOES.

    PubMed

    Perozo, N; Aguilar Perozo, J; Terán, O; Molina, H

    2013-02-01

    MASOES is an agent architecture for designing and modeling self-organizing and emergent systems. This architecture describes the elements, relationships, and mechanisms, both at the individual and the collective levels, that favor the analysis of the self-organizing and emergent phenomenon without mathematically modeling the system. In this paper, a method is proposed for verifying MASOES from the point of view of design in order to study the self-organizing and emergent behaviors of the modeled systems. The verification criteria are set according to what is proposed in MASOES for modeling self-organizing and emergent systems and the principles of the wisdom-of-crowds paradigm and fuzzy cognitive map (FCM) theory. The verification method for MASOES has been implemented in a tool called FCM Designer and has been tested to model a community of free software developers that works in the bazaar style, as well as a Wikipedia community, in order to study their behavior and determine their self-organizing and emergent capacities.

  18. A Verification Method of Inter-Task Cooperation in Embedded Real-time Systems and its Evaluation

    NASA Astrophysics Data System (ADS)

    Yoshida, Toshio

    In the software development process of embedded real-time systems, the design of the task cooperation process is very important. The cooperating process of such tasks is specified by task cooperation patterns. Adoption of unsuitable task cooperation patterns has a fatal influence on system performance, quality, and extendibility. In order to prevent repetitive work caused by a shortfall in task cooperation performance, it is necessary to verify task cooperation patterns in an early software development stage. However, it is very difficult to verify task cooperation patterns in an early software development stage, where task program codes are not yet complete. Therefore, we propose a verification method using task skeleton program codes and a real-time kernel that has a function of recording all events during software execution, such as system calls issued by task program codes, external interrupts, and timer interrupts. In order to evaluate the proposed verification method, we applied it to the software development process of a mechatronics control system.

  19. Numerical verification of composite rods theory on multi-story buildings analysis

    NASA Astrophysics Data System (ADS)

    El-Din Mansour, Alaa; Filatov, Vladimir; Gandzhuntsev, Michael; Ryasny, Nikita

    2018-03-01

    This article proposes a verification of the composite rods theory as applied to the structural analysis of skeletons of high-rise buildings. A test design model is formed in which the horizontal elements are represented by a multilayer cantilever beam working in transverse bending, with slabs connected by moment-non-transferring connections, and the vertical elements are represented by multilayer columns. These connections are sufficient to produce a shearing action that can be approximated by a certain shear-force function, which significantly reduces the overall degree of static indeterminacy of the structural model. A system of differential equations describes the behavior of the multilayer rods and is solved using the numerical approach of the successive approximations method. The proposed methodology is intended for use in preliminary calculations when the rigidity characteristics of the structure need to be determined, and for a qualitative assessment of results obtained by other methods when performing calculations for verification purposes.

  20. Partial Support of Meeting of the Board on Mathematical Sciences and Their Applications

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Weidman, Scott

    2014-08-31

    During the performance period, BMSA released the following major reports: Transforming Combustion Research through Cyberinfrastructure (2011); Assessing the Reliability of Complex Models: Mathematical and Statistical Foundations of Verification, Validation, and Uncertainty Quantification (2012); Fueling Innovation and Discovery: The Mathematical Sciences in the 21st Century (2012); Aging and the Macroeconomy: Long-Term Implications of an Older Population (2012); The Mathematical Sciences in 2025 (2013); Frontiers in Massive Data Analysis (2013); and Developing a 21st Century Global Library for Mathematics Research (2014).

  1. The Mailbox Computer System for the IAEA verification experiment on HEU downblending at the Portsmouth Gaseous Diffusion Plant

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Aronson, A.L.; Gordon, D.M.

    In April 1996, the United States (US) added the Portsmouth Gaseous Diffusion Plant to the list of facilities eligible for the application of International Atomic Energy Agency (IAEA) safeguards. At that time, the US proposed that the IAEA carry out a "verification experiment" at the plant with respect to downblending of about 13 metric tons of highly enriched uranium (HEU) in the form of uranium hexafluoride (UF6). During the period December 1997 through July 1998, the IAEA carried out the requested verification experiment. The verification approach used for this experiment included, among other measures, the entry of process-operational data by the facility operator on a near-real-time basis into a "mailbox" computer located within a tamper-indicating enclosure sealed by the IAEA.

  2. Advanced Surveillance of Environmental Radiation in Automatic Networks.

    PubMed

    Benito, G; Sáez, J C; Blázquez, J B; Quiñones, J

    2018-06-01

    The objective of this study is the verification of the operation of a radiation monitoring network composed of several sensors. The malfunction of a surveillance network has security and economic consequences, which derive from its maintenance and could be avoided with early detection. The proposed method is based on a kind of multivariate distance, and the methodology has been tested on CIEMAT's local radiological early warning network.
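
    A minimal form of such a multivariate-distance check is the Mahalanobis distance of each new set of readings from a baseline of normal network behaviour. The sketch below is only an illustration of that idea; the baseline data, the three-sensor layout, and the alarm threshold are invented, not CIEMAT's.

      # Flag sensor readings that are far (in Mahalanobis distance) from normal.
      import numpy as np

      rng = np.random.default_rng(0)
      normal = rng.multivariate_normal([0.1, 0.1, 0.1], 0.0001 * np.eye(3), size=1000)

      mean = normal.mean(axis=0)
      cov_inv = np.linalg.inv(np.cov(normal, rowvar=False))

      def mahalanobis(reading):
          d = reading - mean
          return float(np.sqrt(d @ cov_inv @ d))

      threshold = 4.0                                   # illustrative alarm level
      ok = np.array([0.11, 0.09, 0.10])                 # consistent with the baseline
      stuck = np.array([0.11, 0.09, 0.60])              # one sensor drifting
      print(mahalanobis(ok) < threshold, mahalanobis(stuck) < threshold)   # True False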

  3. A hybrid Bayesian hierarchical model combining cohort and case-control studies for meta-analysis of diagnostic tests: Accounting for partial verification bias.

    PubMed

    Ma, Xiaoye; Chen, Yong; Cole, Stephen R; Chu, Haitao

    2016-12-01

    To account for between-study heterogeneity in meta-analysis of diagnostic accuracy studies, bivariate random effects models have been recommended to jointly model the sensitivities and specificities. As study design and population vary, the definition of disease status or severity could differ across studies. Consequently, sensitivity and specificity may be correlated with disease prevalence. To account for this dependence, a trivariate random effects model had been proposed. However, the proposed approach can only include cohort studies with information estimating study-specific disease prevalence. In addition, some diagnostic accuracy studies only select a subset of samples to be verified by the reference test. It is known that ignoring unverified subjects may lead to partial verification bias in the estimation of prevalence, sensitivities, and specificities in a single study. However, the impact of this bias on a meta-analysis has not been investigated. In this paper, we propose a novel hybrid Bayesian hierarchical model combining cohort and case-control studies and correcting partial verification bias at the same time. We investigate the performance of the proposed methods through a set of simulation studies. Two case studies on assessing the diagnostic accuracy of gadolinium-enhanced magnetic resonance imaging in detecting lymph node metastases and of adrenal fluorine-18 fluorodeoxyglucose positron emission tomography in characterizing adrenal masses are presented. © The Author(s) 2014.

  4. A Hybrid Bayesian Hierarchical Model Combining Cohort and Case-control Studies for Meta-analysis of Diagnostic Tests: Accounting for Partial Verification Bias

    PubMed Central

    Ma, Xiaoye; Chen, Yong; Cole, Stephen R.; Chu, Haitao

    2014-01-01

    To account for between-study heterogeneity in meta-analysis of diagnostic accuracy studies, bivariate random effects models have been recommended to jointly model the sensitivities and specificities. As study design and population vary, the definition of disease status or severity could differ across studies. Consequently, sensitivity and specificity may be correlated with disease prevalence. To account for this dependence, a trivariate random effects model had been proposed. However, the proposed approach can only include cohort studies with information estimating study-specific disease prevalence. In addition, some diagnostic accuracy studies only select a subset of samples to be verified by the reference test. It is known that ignoring unverified subjects may lead to partial verification bias in the estimation of prevalence, sensitivities and specificities in a single study. However, the impact of this bias on a meta-analysis has not been investigated. In this paper, we propose a novel hybrid Bayesian hierarchical model combining cohort and case-control studies and correcting partial verification bias at the same time. We investigate the performance of the proposed methods through a set of simulation studies. Two case studies on assessing the diagnostic accuracy of gadolinium-enhanced magnetic resonance imaging in detecting lymph node metastases and of adrenal fluorine-18 fluorodeoxyglucose positron emission tomography in characterizing adrenal masses are presented. PMID:24862512

  5. A critique of the hypothesis, and a defense of the question, as a framework for experimentation.

    PubMed

    Glass, David J

    2010-07-01

    Scientists are often steered by common convention, funding agencies, and journal guidelines into a hypothesis-driven experimental framework, despite Isaac Newton's dictum that hypotheses have no place in experimental science. Some may think that Newton's cautionary note, which was in keeping with an experimental approach espoused by Francis Bacon, is inapplicable to current experimental method since, in accord with the philosopher Karl Popper, modern-day hypotheses are framed to serve as instruments of falsification, as opposed to verification. But Popper's "critical rationalist" framework too is problematic. It has been accused of being: inconsistent on philosophical grounds; unworkable for modern "large science," such as systems biology; inconsistent with the actual goals of experimental science, which is verification and not falsification; and harmful to the process of discovery as a practical matter. A criticism of the hypothesis as a framework for experimentation is offered. Presented is an alternative framework-the query/model approach-which many scientists may discover is the framework they are actually using, despite being required to give lip service to the hypothesis.

  6. SETI and the media: Views from inside and out

    NASA Astrophysics Data System (ADS)

    Tarter, Donald E.

    Results are presented from a detailed questionnaire sent to members of the international SETI (Search for Extraterrestrial Intelligence) community and the international science media. Both groups are compared on the following dimensions: perceived importance of SETI, perceived level of information about SETI available to the media and public, perceived credibility of SETI, and attitudes toward information policy options to govern an announcement of a SETI discovery. The results indicate that SETI is perceived to be an extremely important endeavor, but it enjoys only marginal credibility among the public and the SETI community's professional constituencies. Both the SETI community and the media agree that an erroneous announcement of a discovery of extraterrestrial intelligence could be very damaging. In order to minimize the dangers of false announcement and to bring a degree of order to SETI, a scientific protocol agreement and the establishment of a contact verification committee have been recommended. Both received endorsement from the SETI community and the international science media. The science media feels that from its viewpoint, a contact verification committee would be a more effective way of assuring accurate information about SETI programs and discoveries.

  7. Sandia National Laboratories: About Sandia: Environmental Responsibility:

    Science.gov Websites

  8. Velocity-image model for online signature verification.

    PubMed

    Khan, Mohammad A U; Niazi, Muhammad Khalid Khan; Khan, Muhammad Aurangzeb

    2006-11-01

    In general, online signature capturing devices provide outputs in the form of shape and velocity signals. In the past, strokes have been extracted while tracking velocity signal minimas. However, the resulting strokes are larger and complicated in shape and thus make the subsequent job of generating a discriminative template difficult. We propose a new stroke-based algorithm that splits velocity signal into various bands. Based on these bands, strokes are extracted which are smaller and more simpler in nature. Training of our proposed system revealed that low- and high-velocity bands of the signal are unstable, whereas the medium-velocity band can be used for discrimination purposes. Euclidean distances of strokes extracted on the basis of medium velocity band are used for verification purpose. The experiments conducted show improvement in discriminative capability of the proposed stroke-based system.
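
    The band-splitting idea can be sketched as follows: estimate per-sample pen speed, keep only the medium-velocity band, treat contiguous in-band runs as strokes, and compare per-stroke statistics between two signatures. The band limits, features, and the naive stroke alignment below are illustrative choices, not the paper's.

      # Medium-velocity-band strokes and a simple Euclidean signature distance.
      import numpy as np

      def medium_band_strokes(x, y, t, low=0.2, high=0.8):
          v = np.hypot(np.gradient(x, t), np.gradient(y, t))       # pen speed
          lo, hi = np.quantile(v, [low, high])
          in_band = (v >= lo) & (v <= hi)
          edges = np.flatnonzero(np.diff(in_band.astype(int)))     # band entry/exit points
          runs = np.split(np.arange(len(v)), edges + 1)
          return [idx for idx in runs if in_band[idx[0]] and len(idx) >= 3]

      def stroke_features(x, y, t):
          return np.array([(x[s].mean(), y[s].mean(),
                            np.hypot(np.diff(x[s]), np.diff(y[s])).sum())   # stroke length
                           for s in medium_band_strokes(x, y, t)])

      def signature_distance(sig_a, sig_b):
          fa, fb = stroke_features(*sig_a), stroke_features(*sig_b)
          n = min(len(fa), len(fb))            # naive alignment: compare leading strokes
          return float(np.linalg.norm(fa[:n] - fb[:n]))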

  9. Architectures Toward Reusable Science Data Systems

    NASA Technical Reports Server (NTRS)

    Moses, John Firor

    2014-01-01

    Science Data Systems (SDS) comprise an important class of data processing systems that support product generation from remote sensors and in-situ observations. These systems enable research into new science data products, replication of experiments and verification of results. NASA has been building systems for satellite data processing since the first Earth observing satellites launched and is continuing development of systems to support NASA science research and NOAA's Earth observing satellite operations. The basic data processing workflows and scenarios continue to be valid for remote sensor observations research as well as for the complex multi-instrument operational satellite data systems being built today.

  10. Optical Testing and Verification Methods for the James Webb Space Telescope Integrated Science Instrument Module Element

    NASA Technical Reports Server (NTRS)

    Antonille, Scott R.; Miskey, Cherie L.; Ohl, Raymond G.; Rohrbach, Scott O.; Aronstein, David L.; Bartoszyk, Andrew E.; Bowers, Charles W.; Cofie, Emmanuel; Collins, Nicholas R.; Comber, Brian J.

    2016-01-01

    NASA's James Webb Space Telescope (JWST) is a 6.6m diameter, segmented, deployable telescope for cryogenic IR space astronomy (40K). The JWST Observatory includes the Optical Telescope Element (OTE) and the Integrated Science Instrument Module (ISIM) that contains four science instruments (SI) and the fine guider. The SIs are mounted to a composite metering structure. The SI and guider units were integrated to the ISIM structure and optically tested at the NASA Goddard Space Flight Center as a suite using the Optical Telescope Element SIMulator (OSIM). OSIM is a full field, cryogenic JWST telescope simulator. SI performance, including alignment and wave front error, were evaluated using OSIM. We describe test and analysis methods for optical performance verification of the ISIM Element, with an emphasis on the processes used to plan and execute the test. The complexity of ISIM and OSIM drove us to develop a software tool for test planning that allows for configuration control of observations, associated scripts, and management of hardware and software limits and constraints, as well as tools for rapid data evaluation, and flexible re-planning in response to the unexpected. As examples of our test and analysis approach, we discuss how factors such as the ground test thermal environment are compensated in alignment. We describe how these innovative methods for test planning and execution and post-test analysis were instrumental in the verification program for the ISIM element, with enough information to allow the reader to consider these innovations and lessons learned in this successful effort in their future testing for other programs.

  11. Optical testing and verification methods for the James Webb Space Telescope Integrated Science Instrument Module element

    NASA Astrophysics Data System (ADS)

    Antonille, Scott R.; Miskey, Cherie L.; Ohl, Raymond G.; Rohrbach, Scott O.; Aronstein, David L.; Bartoszyk, Andrew E.; Bowers, Charles W.; Cofie, Emmanuel; Collins, Nicholas R.; Comber, Brian J.; Eichhorn, William L.; Glasse, Alistair C.; Gracey, Renee; Hartig, George F.; Howard, Joseph M.; Kelly, Douglas M.; Kimble, Randy A.; Kirk, Jeffrey R.; Kubalak, David A.; Landsman, Wayne B.; Lindler, Don J.; Malumuth, Eliot M.; Maszkiewicz, Michael; Rieke, Marcia J.; Rowlands, Neil; Sabatke, Derek S.; Smith, Corbett T.; Smith, J. Scott; Sullivan, Joseph F.; Telfer, Randal C.; Te Plate, Maurice; Vila, M. Begoña.; Warner, Gerry D.; Wright, David; Wright, Raymond H.; Zhou, Julia; Zielinski, Thomas P.

    2016-09-01

    NASA's James Webb Space Telescope (JWST) is a 6.5m diameter, segmented, deployable telescope for cryogenic IR space astronomy. The JWST Observatory includes the Optical Telescope Element (OTE) and the Integrated Science Instrument Module (ISIM), that contains four science instruments (SI) and the Fine Guidance Sensor (FGS). The SIs are mounted to a composite metering structure. The SIs and FGS were integrated to the ISIM structure and optically tested at NASA's Goddard Space Flight Center using the Optical Telescope Element SIMulator (OSIM). OSIM is a full-field, cryogenic JWST telescope simulator. SI performance, including alignment and wavefront error, was evaluated using OSIM. We describe test and analysis methods for optical performance verification of the ISIM Element, with an emphasis on the processes used to plan and execute the test. The complexity of ISIM and OSIM drove us to develop a software tool for test planning that allows for configuration control of observations, implementation of associated scripts, and management of hardware and software limits and constraints, as well as tools for rapid data evaluation, and flexible re-planning in response to the unexpected. As examples of our test and analysis approach, we discuss how factors such as the ground test thermal environment are compensated in alignment. We describe how these innovative methods for test planning and execution and post-test analysis were instrumental in the verification program for the ISIM element, with enough information to allow the reader to consider these innovations and lessons learned in this successful effort in their future testing for other programs.

  12. Continuous-variable quantum homomorphic signature

    NASA Astrophysics Data System (ADS)

    Li, Ke; Shang, Tao; Liu, Jian-wei

    2017-10-01

    Quantum cryptography is believed to be unconditionally secure because its security is ensured by physical laws rather than computational complexity. According to spectrum characteristic, quantum information can be classified into two categories, namely discrete variables and continuous variables. Continuous-variable quantum protocols have gained much attention for their ability to transmit more information with lower cost. To verify the identities of different data sources in a quantum network, we propose a continuous-variable quantum homomorphic signature scheme. It is based on continuous-variable entanglement swapping and provides additive and subtractive homomorphism. Security analysis shows the proposed scheme is secure against replay, forgery and repudiation. Even under nonideal conditions, it supports effective verification within a certain verification threshold.

  13. Validation of a SysML based design for wireless sensor networks

    NASA Astrophysics Data System (ADS)

    Berrachedi, Amel; Rahim, Messaoud; Ioualalen, Malika; Hammad, Ahmed

    2017-07-01

    When developing complex systems, the requirement for the verification of the systems' design is one of the main challenges. Wireless Sensor Networks (WSNs) are examples of such systems. We address the problem of how WSNs must be designed to fulfil the system requirements. Using the SysML Language, we propose a Model Based System Engineering (MBSE) specification and verification methodology for designing WSNs. This methodology uses SysML to describe the WSNs requirements, structure and behaviour. Then, it translates the SysML elements to an analytic model, specifically, to a Deterministic Stochastic Petri Net. The proposed approach allows to design WSNs and study their behaviors and their energy performances.

  14. eBiometrics: an enhanced multi-biometrics authentication technique for real-time remote applications on mobile devices

    NASA Astrophysics Data System (ADS)

    Kuseler, Torben; Lami, Ihsan; Jassim, Sabah; Sellahewa, Harin

    2010-04-01

    The use of mobile communication devices with advanced sensors is growing rapidly. These sensors are enabling functions such as image capture, location applications, and biometric authentication such as fingerprint verification and face and handwritten signature recognition. Such ubiquitous devices are essential tools in today's global economic activities, enabling anywhere-anytime financial and business transactions. Cryptographic functions and biometric-based authentication can enhance the security and confidentiality of mobile transactions. Biometric template security techniques used in real-time biometric-based authentication are key factors for successful identity verification solutions, but they are vulnerable to determined attacks by both fraudulent software and hardware. The EU-funded SecurePhone project has designed and implemented a multimodal biometric user authentication system on a prototype mobile communication device. However, various implementations of this project have resulted in long verification times or reduced accuracy and/or security. This paper proposes to use built-in self-test techniques to ensure no tampering has taken place in the verification process prior to performing the actual biometric authentication. These techniques utilise the user's personal identification number as a seed to generate a unique signature. This signature is then used to test the integrity of the verification process. Also, this study proposes the use of a combination of biometric modalities to provide application-specific authentication in a secure environment, thus achieving an optimum security level with effective processing time, i.e., ensuring that the necessary authentication steps and algorithms running on the mobile device application processor cannot be undermined or modified by an imposter to get unauthorized access to the secure system.
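
    The built-in self-test idea, a PIN-seeded signature checked before the biometric matcher runs, can be pictured with an ordinary keyed hash. The sketch below uses HMAC-SHA256 purely as a stand-in for the SecurePhone signature generator; the function names and the notion of hashing the verification module's bytes are illustrative assumptions.

      # PIN-seeded integrity self-test, run before the actual biometric matching.
      import hashlib
      import hmac

      def integrity_signature(pin: str, module_bytes: bytes) -> bytes:
          return hmac.new(pin.encode("utf-8"), module_bytes, hashlib.sha256).digest()

      def verification_process_untampered(pin: str, module_bytes: bytes,
                                          reference_signature: bytes) -> bool:
          candidate = integrity_signature(pin, module_bytes)
          return hmac.compare_digest(candidate, reference_signature)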

  15. Verification of Gyrokinetic codes: theoretical background and applications

    NASA Astrophysics Data System (ADS)

    Tronko, Natalia

    2016-10-01

    In fusion plasmas, the strong magnetic field allows the fast gyro motion to be systematically removed from the description of the dynamics, resulting in a considerable model simplification and gain of computational time. Nowadays, the gyrokinetic (GK) codes play a major role in the understanding of the development and the saturation of turbulence and in the prediction of the consequent transport. We present a new and generic theoretical framework and specific numerical applications to test the validity and the domain of applicability of existing GK codes. For a sound verification process, the underlying theoretical GK model and the numerical scheme must be considered at the same time, which makes this approach pioneering. At the analytical level, the main novelty consists in using advanced mathematical tools such as variational formulation of dynamics for systematization of basic GK codes' equations to access the limits of their applicability. The indirect verification of the numerical scheme is proposed via the benchmark process. In this work, specific examples of code verification are presented for two GK codes: the multi-species electromagnetic ORB5 (PIC), and the radially global version of GENE (Eulerian). The proposed methodology can be applied to any existing GK code. We establish a hierarchy of reduced GK Vlasov-Maxwell equations using the generic variational formulation. Then, we derive and include the models implemented in ORB5 and GENE inside this hierarchy. At the computational level, detailed verifications of global electromagnetic test cases based on the CYCLONE Base Case are considered, including a parametric β-scan covering the transition from ITG to KBM and the spectral properties at the nominal β value.

  16. Self-verification and social anxiety: preference for negative social feedback and low social self-esteem.

    PubMed

    Valentiner, David P; Skowronski, John J; McGrath, Patrick B; Smith, Sarah A; Renner, Kerry A

    2011-10-01

    A self-verification model of social anxiety views negative social self-esteem as a core feature of social anxiety. This core feature is proposed to be maintained through self-verification processes, such as by leading individuals with negative social self-esteem to prefer negative social feedback. This model is tested in two studies. In Study 1, questionnaires were administered to a college sample (N = 317). In Study 2, questionnaires were administered to anxiety disordered patients (N = 62) before and after treatment. Study 1 developed measures of preference for negative social feedback and social self-esteem, and provided evidence of their incremental validity in a college sample. Study 2 found that these two variables are not strongly related to fears of evaluation, are relatively unaffected by a treatment that targets such fears, and predict residual social anxiety following treatment. Overall, these studies provide preliminary evidence for a self-verification model of social anxiety.

  17. Exploring system interconnection architectures with VIPACES: from direct connections to NOCs

    NASA Astrophysics Data System (ADS)

    Sánchez-Peña, Armando; Carballo, Pedro P.; Núñez, Antonio

    2007-05-01

    This paper presents a simple environment for the verification of AMBA 3 AXI systems used in Verification IP (VIP) production, called VIPACES (Verification Interface Primitives for the development of AXI Compliant Elements and Systems). These primitives are provided as an uncompiled library written in SystemC in which interfaces are the core of the library. Defining interfaces instead of generic modules lets the user construct custom modules, reducing the resources spent during the verification phase and making it easy to adapt existing modules to the AMBA 3 AXI protocol. This is the main focus of the VIPACES library. The paper then compares and contrasts the main interconnection schemes for AMBA 3 AXI as modeled by VIPACES. To assess these results, we propose a validation scenario with a particular architecture from the domain of MPEG-4 video decoding, which is composed of an AXI bus connecting an IDCT and other processing resources.

  18. Formal Verification of Complex Systems based on SysML Functional Requirements

    DTIC Science & Technology

    2014-12-23

    Formal Verification of Complex Systems based on SysML Functional Requirements. Hoda Mehrpouyan, Irem Y. Tumer, Chris Hoyle, Dimitra Giannakopoulou ... requirements for design of complex engineered systems. The proposed approach combines a SysML modeling approach to document and structure safety requirements ... methods and tools to support the integration of safety into the design solution. 2.1. SysML for Complex Engineered Systems. Traditional methods and tools

  19. A field study of the accuracy and reliability of a biometric iris recognition system.

    PubMed

    Latman, Neal S; Herb, Emily

    2013-06-01

    The iris of the eye appears to satisfy the criteria for a good anatomical characteristic for use in a biometric system. The purpose of this study was to evaluate a biometric iris recognition system: Mobile-Eyes™. The enrollment, verification, and identification applications were evaluated in a field study for accuracy and reliability using both irises of 277 subjects. Independent variables included a wide range of subject demographics, ambient light, and ambient temperature. A sub-set of 35 subjects had alcohol-induced nystagmus. There were 2710 identification and verification attempts, which resulted in 1,501,340 and 5540 iris comparisons, respectively. In this study, the system successfully enrolled all subjects on the first attempt. All 277 subjects were successfully verified and identified on the first day of enrollment. None of the current or prior eye conditions prevented enrollment, verification, or identification. All 35 subjects with alcohol-induced nystagmus were successfully verified and identified. There were no false verifications or false identifications. Two conditions were identified that could potentially circumvent the use of iris recognition systems in general. The Mobile-Eyes™ iris recognition system exhibited accurate and reliable enrollment, verification, and identification applications in this study. It may have special applications in subjects with nystagmus. Copyright © 2012 Forensic Science Society. Published by Elsevier Ireland Ltd. All rights reserved.

  20. Automated System Calibration and Verification of the Position Measurements for the Los Alamos Isotope Production Facility and the Switchyard Kicker Facilities

    NASA Astrophysics Data System (ADS)

    Barr, D.; Gilpatrick, J. D.; Martinez, D.; Shurter, R. B.

    2004-11-01

    The Los Alamos Neutron Science Center (LANSCE) facility at Los Alamos National Laboratory has constructed both an Isotope Production Facility (IPF) and a Switchyard Kicker (XDK) as additions to the H+ and H- accelerator. These additions contain eleven Beam Position Monitors (BPMs) that measure the beam's position throughout the transport. The analog electronics within each processing module determines the beam position using the log-ratio technique. For system reliability, calibrations compensate for various temperature drifts and other imperfections in the processing electronics components. Additionally, verifications are periodically implemented by a PC running a National Instruments LabVIEW virtual instrument (VI) to verify continued system and cable integrity. The VI communicates with the processor cards via a PCI/MXI-3 VXI-crate communication module. Previously, accelerator operators performed BPM system calibrations typically once per day while beam was explicitly turned off. One of this new measurement system's unique achievements is its automated calibration and verification capability. Taking advantage of the pulsed nature of the LANSCE-facility beams, the integrated electronics hardware and VI perform calibration and verification operations between beam pulses without interrupting production beam delivery. The design, construction, and performance results of the automated calibration and verification portion of this position measurement system will be the topic of this paper.
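
    A minimal sketch of the log-ratio technique mentioned above: the beam offset is estimated from the logarithm of the ratio of two opposing electrode signals. The sensitivity constant and the calibration remarks are illustrative assumptions, not the LANSCE values.

```python
# Illustrative log-ratio position estimate from two opposing BPM electrode
# signals; the sensitivity constant k_mm is a made-up placeholder.
import numpy as np

def log_ratio_position(v_top, v_bottom, k_mm=10.0, offset_mm=0.0):
    """Beam offset is roughly k * log10(V_top / V_bottom) for small offsets.
    A calibration run would fit k_mm and offset_mm (and drift terms) between pulses."""
    return k_mm * np.log10(np.asarray(v_top, float) / np.asarray(v_bottom, float)) + offset_mm

# Equal electrode signals imply a centred beam; an imbalance maps to an offset.
print(log_ratio_position([1.0, 1.2], [1.0, 1.0]))   # -> [0.0, ~0.79] mm
```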

  1. Evaluation of AK-225(R), Vertrel(R) MCA and HFE A 7100 as Alternative Solvents for Precision Cleaning and Verification Technology

    NASA Technical Reports Server (NTRS)

    Melendez, Orlando; Trizzino, Mary; Fedderson, Bryan

    1997-01-01

    The National Aeronautics and Space Administration (NASA), Kennedy Space Center (KSC) Materials Science Division conducted a study to evaluate alternative solvents for CFC-113 in precision cleaning and verification on typical samples that are used in the KSC environment. The effects of AK-225(R), Vertrel(R) MCA, and HFE A 7100 on selected metal and polymer materials were studied over 1-, 7-, and 30-day test times. This report addresses a study on the compatibility aspects of replacement solvents for materials in aerospace applications.

  2. Mutation Testing for Effective Verification of Digital Components of Physical Systems

    NASA Astrophysics Data System (ADS)

    Kushik, N. G.; Evtushenko, N. V.; Torgaev, S. N.

    2015-12-01

    Digital components of modern physical systems are often designed using circuitry solutions based on field-programmable gate array (FPGA) technology. Such (embedded) digital components should be carefully tested. In this paper, an approach to the verification of digital physical system components based on mutation testing is proposed. The reference description of a digital component's behavior in a hardware description language (HDL) is mutated by introducing the most probable errors into it and, unlike for mutants in high-level programming languages, the corresponding test case is derived effectively by comparing special scalable representations of the specification and the constructed mutant using various logic synthesis and verification systems.
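
    The following Python sketch illustrates the general idea of mutation testing (operator-replacement mutants and a kill-rate check). It is only a conceptual stand-in; the paper's approach works on HDL descriptions and compares scalable representations with logic synthesis and verification tools.

```python
# Conceptual sketch of mutation testing: seed small, likely faults into a
# reference description and check that the test suite kills every mutant.
import re

REFERENCE = (
    "def majority(a, b, c):\n"
    "    return (a and b) or (a and c) or (b and c)\n"
)

MUTATIONS = [("and", "or"), ("or", "and")]   # typical operator-replacement faults

def generate_mutants(src):
    """Yield one mutant per single operator replacement."""
    for old, new in MUTATIONS:
        for m in re.finditer(rf"\b{old}\b", src):
            yield src[:m.start()] + new + src[m.end():]

def kill_rate(test_vectors):
    """Fraction of mutants whose output differs from the reference on the tests."""
    ref_ns = {}
    exec(REFERENCE, ref_ns)
    killed = total = 0
    for mutant in generate_mutants(REFERENCE):
        total += 1
        mut_ns = {}
        exec(mutant, mut_ns)
        if any(ref_ns["majority"](*v) != mut_ns["majority"](*v) for v in test_vectors):
            killed += 1
    return killed / total

# A strong test set should kill every mutant (kill rate 1.0).
print(kill_rate([(0, 0, 1), (0, 1, 0), (1, 0, 0), (0, 1, 1), (1, 0, 1), (1, 1, 0)]))
```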

  3. Origin of Initial Current Peak in High Power Impulse Magnetron Sputtering and Verification by Non-Sputtering Discharge

    NASA Astrophysics Data System (ADS)

    Zhong-Zhen, Wu; Shu, Xiao; Sui-Han, Cui; Ricky, K. Y. Fu; Xiu-Bo, Tian; Paul, K. Chu; Feng, Pan

    2016-07-01

    Abstract not available. Supported by the National Natural Science Foundation of China under Grant Nos 51301004 and U1330110, the Guangdong Innovative and Entrepreneurial Research Team Program under Grant No 2013N080, the Shenzhen Science and Technology Research Grant under Grant Nos JCYJ20140903102215536 and JCYJ20150828093127698, and the City University of Hong Kong Applied Research Grant under Grant No 9667104.

  4. Final Report for the ZERT Project: Basic Science of Retention Issues, Risk Assessment & Measurement, Monitoring and Verification for Geologic Sequestration

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Spangler, Lee; Cunningham, Alfred; Lageson, David

    2011-03-31

    ZERT has made major contributions to five main areas of sequestration science: improvement of computational tools; measurement and monitoring techniques to verify storage and track migration of CO{sub 2}; development of a comprehensive performance and risk assessment framework; fundamental geophysical, geochemical and hydrological investigations of CO{sub 2} storage; and investigation of innovative, bio-based mitigation strategies.

  5. Bimodal Biometric Verification Using the Fusion of Palmprint and Infrared Palm-Dorsum Vein Images

    PubMed Central

    Lin, Chih-Lung; Wang, Shih-Hung; Cheng, Hsu-Yung; Fan, Kuo-Chin; Hsu, Wei-Lieh; Lai, Chin-Rong

    2015-01-01

    In this paper, we present a reliable and robust biometric verification method based on bimodal physiological characteristics of palms, including the palmprint and palm-dorsum vein patterns. The proposed method consists of five steps: (1) automatically aligning and cropping the same region of interest from different palm or palm-dorsum images; (2) applying the digital wavelet transform and inverse wavelet transform to fuse palmprint and vein pattern images; (3) extracting the line-like features (LLFs) from the fused image; (4) obtaining multiresolution representations of the LLFs by using a multiresolution filter; and (5) using a support vector machine to verify the multiresolution representations of the LLFs. The proposed method possesses four advantages. First, both modal images are captured in peg-free scenarios to improve the user-friendliness of the verification device. Second, palmprint and vein pattern images are captured using a low-resolution digital scanner and an infrared (IR) camera. The use of low-resolution images results in a smaller database. In addition, the vein pattern images are captured through the invisible IR spectrum, which improves antispoofing. Third, since the physiological characteristics of palmprint and vein pattern images are different, a hybrid fusing rule can be introduced to fuse the decomposition coefficients of different bands. The proposed method fuses decomposition coefficients at different decomposition levels, with different image sizes, captured from different sensor devices. Finally, the proposed method operates automatically, so no parameters need to be set manually. Three thousand palmprint images and 3000 vein pattern images were collected from 100 volunteers to verify the validity of the proposed method. The results show a false rejection rate of 1.20% and a false acceptance rate of 1.56%, demonstrating the validity and excellent performance of the proposed method compared with other methods. PMID:26703596
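
    A minimal sketch of wavelet-domain image fusion in the spirit of step (2), assuming aligned, equally sized palmprint and vein images. The fusion rule used here (mean approximation band, max-magnitude detail bands) is a common simplification, not necessarily the paper's hybrid rule.

```python
import numpy as np
import pywt

def fuse_images(palmprint, vein, wavelet="db2"):
    """Single-level 2-D DWT fusion: average the approximation band, keep the
    stronger of the two detail coefficients at every position."""
    cA1, (cH1, cV1, cD1) = pywt.dwt2(np.asarray(palmprint, float), wavelet)
    cA2, (cH2, cV2, cD2) = pywt.dwt2(np.asarray(vein, float), wavelet)
    pick = lambda a, b: np.where(np.abs(a) >= np.abs(b), a, b)
    fused = ((cA1 + cA2) / 2.0, (pick(cH1, cH2), pick(cV1, cV2), pick(cD1, cD2)))
    return pywt.idwt2(fused, wavelet)

# Toy usage with random stand-in images; line-like features would then be
# extracted from the fused image and verified with an SVM.
rng = np.random.default_rng(0)
fused = fuse_images(rng.random((64, 64)), rng.random((64, 64)))
print(fused.shape)
```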

  6. Bimodal Biometric Verification Using the Fusion of Palmprint and Infrared Palm-Dorsum Vein Images.

    PubMed

    Lin, Chih-Lung; Wang, Shih-Hung; Cheng, Hsu-Yung; Fan, Kuo-Chin; Hsu, Wei-Lieh; Lai, Chin-Rong

    2015-12-12

    In this paper, we present a reliable and robust biometric verification method based on bimodal physiological characteristics of palms, including the palmprint and palm-dorsum vein patterns. The proposed method consists of five steps: (1) automatically aligning and cropping the same region of interest from different palm or palm-dorsum images; (2) applying the digital wavelet transform and inverse wavelet transform to fuse palmprint and vein pattern images; (3) extracting the line-like features (LLFs) from the fused image; (4) obtaining multiresolution representations of the LLFs by using a multiresolution filter; and (5) using a support vector machine to verify the multiresolution representations of the LLFs. The proposed method possesses four advantages. First, both modal images are captured in peg-free scenarios to improve the user-friendliness of the verification device. Second, palmprint and vein pattern images are captured using a low-resolution digital scanner and an infrared (IR) camera. The use of low-resolution images results in a smaller database. In addition, the vein pattern images are captured through the invisible IR spectrum, which improves antispoofing. Third, since the physiological characteristics of palmprint and vein pattern images are different, a hybrid fusing rule can be introduced to fuse the decomposition coefficients of different bands. The proposed method fuses decomposition coefficients at different decomposition levels, with different image sizes, captured from different sensor devices. Finally, the proposed method operates automatically, so no parameters need to be set manually. Three thousand palmprint images and 3000 vein pattern images were collected from 100 volunteers to verify the validity of the proposed method. The results show a false rejection rate of 1.20% and a false acceptance rate of 1.56%, demonstrating the validity and excellent performance of the proposed method compared with other methods.

  7. A Verification-Driven Approach to Control Analysis and Tuning

    NASA Technical Reports Server (NTRS)

    Crespo, Luis G.; Kenny, Sean P.; Giesy, Daniel P.

    2008-01-01

    This paper proposes a methodology for the analysis and tuning of controllers using control verification metrics. These metrics, which are introduced in a companion paper, measure the size of the largest uncertainty set of a given class for which the closed-loop specifications are satisfied. This framework integrates deterministic and probabilistic uncertainty models into a setting that enables the deformation of sets in the parameter space, the control design space, and in the union of these two spaces. In regard to control analysis, we propose strategies that enable bounding regions of the design space where the specifications are satisfied by all the closed-loop systems associated with a prescribed uncertainty set. When this is infeasible, we bound regions where the probability of satisfying the requirements exceeds a prescribed value. In regard to control tuning, we propose strategies for the improvement of the robust characteristics of a baseline controller. Some of these strategies use multi-point approximations to the control verification metrics in order to alleviate the numerical burden of solving a min-max problem. Since this methodology targets non-linear systems having an arbitrary, possibly implicit, functional dependency on the uncertain parameters and for which high-fidelity simulations are available, it is applicable to realistic engineering problems.

  8. Design and Verification of Critical Pressurised Windows for Manned Spaceflight

    NASA Astrophysics Data System (ADS)

    Lamoure, Richard; Busto, Lara; Novo, Francisco; Sinnema, Gerben; Leal, Mendes M.

    2014-06-01

    The Window Design for Manned Spaceflight (WDMS) project was tasked with establishing the state of the art and exploring possible improvements to the current structural integrity verification and fracture control methodologies for manned spacecraft windows. A critical review of the state of the art in spacecraft window design, materials and verification practice was conducted. Shortcomings of the methodology in terms of analysis, inspection and testing were identified. Schemes for improving verification practices and reducing conservatism whilst maintaining the required safety levels were then proposed. An experimental materials characterisation programme was defined and carried out with the support of the 'Glass and Façade Technology Research Group' at the University of Cambridge. Results of the sample testing campaign were analysed, post-processed and subsequently applied to the design of a breadboard window demonstrator. Two fused silica glass window panes were procured and subjected to dedicated analyses, inspection and testing comprising both qualification and acceptance programmes specifically tailored to the objectives of the activity. Finally, the main outcomes have been compiled into a Structural Verification Guide for Pressurised Windows in manned spacecraft, incorporating best practices and lessons learned throughout this project.

  9. Finger vein verification system based on sparse representation.

    PubMed

    Xin, Yang; Liu, Zhi; Zhang, Haixia; Zhang, Hong

    2012-09-01

    Finger vein verification is a promising biometric pattern for personal identification in terms of security and convenience. The recognition performance of this technology heavily relies on the quality of finger vein images and on the recognition algorithm. To achieve efficient recognition performance, a special finger vein imaging device is developed, and a finger vein recognition method based on sparse representation is proposed. The motivation for the proposed method is that finger vein images exhibit a sparse property. In the proposed system, the regions of interest (ROIs) in the finger vein images are segmented and enhanced. Sparse representation and sparsity preserving projection on ROIs are performed to obtain the features. Finally, the features are measured for recognition. An equal error rate of 0.017% was achieved based on the finger vein image database, which contains images that were captured by using the near-IR imaging device that was developed in this study. The experimental results demonstrate that the proposed method is faster and more robust than previous methods.

  10. Toward Improving Electrocardiogram (ECG) Biometric Verification using Mobile Sensors: A Two-Stage Classifier Approach

    PubMed Central

    Tan, Robin; Perkowski, Marek

    2017-01-01

    Electrocardiogram (ECG) signals sensed from mobile devices have the potential for biometric identity recognition applicable in remote access control systems where enhanced data security is in demand. In this study, we propose a new algorithm consisting of a two-stage classifier that combines a random forest and a wavelet distance measure through a probabilistic threshold schema, to improve the effectiveness and robustness of a biometric recognition system using ECG data acquired from a biosensor integrated into mobile devices. The proposed algorithm is evaluated using a mixed dataset from 184 subjects under different health conditions. The proposed two-stage classifier achieves a total subject verification accuracy of 99.52%, better than the 98.33% accuracy from the random forest alone and the 96.31% accuracy from the wavelet distance measure algorithm alone. These results demonstrate the superiority of the proposed algorithm for biometric identification, supporting its practicality in areas such as cloud data security, cyber-security, or remote healthcare systems. PMID:28230745
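
    A hedged sketch of the two-stage idea: a random forest decides when its class probability is confident, and a wavelet-distance comparison against the claimed subject's template handles the uncertain middle band. The thresholds, features, and helper names are illustrative assumptions, not the authors' tuned configuration.

```python
import numpy as np
import pywt
from sklearn.ensemble import RandomForestClassifier

def wavelet_features(beat, wavelet="db4", level=3):
    """Concatenate wavelet decomposition coefficients of one heartbeat segment."""
    return np.concatenate(pywt.wavedec(np.asarray(beat, float), wavelet, level=level))

def verify(beat, claimed_id, forest, templates, hi=0.9, lo=0.1, dist_thresh=5.0):
    """Stage 1: confident forest decision; stage 2: wavelet-distance fallback."""
    feats = wavelet_features(beat)
    p = forest.predict_proba(feats.reshape(1, -1))[0]
    p_claim = p[list(forest.classes_).index(claimed_id)]
    if p_claim >= hi:
        return True                      # confident accept
    if p_claim <= lo:
        return False                     # confident reject
    return np.linalg.norm(feats - templates[claimed_id]) < dist_thresh

# Toy enrollment: synthetic "beats" for two subjects (placeholders, not real ECG).
rng = np.random.default_rng(0)
beats = {sid: [rng.normal(sid, 1.0, 256) for _ in range(10)] for sid in (0, 1)}
X = [wavelet_features(b) for sid in beats for b in beats[sid]]
y = [sid for sid in beats for _ in beats[sid]]
forest = RandomForestClassifier(n_estimators=50, random_state=0).fit(X, y)
templates = {sid: np.mean([wavelet_features(b) for b in beats[sid]], axis=0) for sid in (0, 1)}
print(verify(rng.normal(0, 1.0, 256), claimed_id=0, forest=forest, templates=templates))
```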

  11. Toward Improving Electrocardiogram (ECG) Biometric Verification using Mobile Sensors: A Two-Stage Classifier Approach.

    PubMed

    Tan, Robin; Perkowski, Marek

    2017-02-20

    Electrocardiogram (ECG) signals sensed from mobile devices have the potential for biometric identity recognition applicable in remote access control systems where enhanced data security is in demand. In this study, we propose a new algorithm consisting of a two-stage classifier that combines a random forest and a wavelet distance measure through a probabilistic threshold schema, to improve the effectiveness and robustness of a biometric recognition system using ECG data acquired from a biosensor integrated into mobile devices. The proposed algorithm is evaluated using a mixed dataset from 184 subjects under different health conditions. The proposed two-stage classifier achieves a total subject verification accuracy of 99.52%, better than the 98.33% accuracy from the random forest alone and the 96.31% accuracy from the wavelet distance measure algorithm alone. These results demonstrate the superiority of the proposed algorithm for biometric identification, supporting its practicality in areas such as cloud data security, cyber-security, or remote healthcare systems.

  12. Proceedings of the 2013 International Conference on Mathematics and Computational Methods Applied to Nuclear Science and Engineering - M and C 2013

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    NONE

    2013-07-01

    The Mathematics and Computation Division of the American Nuclear Society (ANS) and the Idaho Section of the ANS hosted the 2013 International Conference on Mathematics and Computational Methods Applied to Nuclear Science and Engineering (M and C 2013). These proceedings contain over 250 full papers, with topics including reactor physics; radiation transport; materials science; nuclear fuels; core performance and optimization; reactor systems and safety; fluid dynamics; medical applications; analytical and numerical methods; algorithms for advanced architectures; and validation, verification, and uncertainty quantification.

  13. Life beyond MSE and R2 — improving validation of predictive models with observations

    NASA Astrophysics Data System (ADS)

    Papritz, Andreas; Nussbaum, Madlene

    2017-04-01

    Machine learning and statistical predictive methods are evaluated by the closeness of predictions to observations in a test dataset. Common criteria for rating predictive methods are bias and mean square error (MSE), characterizing systematic and random prediction errors. Many studies also report R2 values, but their meaning is not always clear (correlation between observations and predictions, or MSE skill score; Wilks, 2011). The same criteria are also used for choosing tuning parameters of predictive procedures by cross-validation and bagging (e.g. Hastie et al., 2009). For evident reasons, the atmospheric sciences have developed a rich toolbox for forecast verification. Specific criteria have been proposed for evaluating deterministic and probabilistic predictions of binary, multinomial, ordinal and continuous responses (see reviews by Wilks, 2011, Jolliffe and Stephenson, 2012, and Gneiting et al., 2007). It appears that these techniques are not well known in the geosciences community interested in machine learning. In our presentation we review techniques that offer more insight into the proximity of data and predictions than bias, MSE and R2 alone. We mention only two examples here: (i) Graphing observations vs. predictions is usually more appropriate than the reverse (Piñeiro et al., 2008). (ii) The decomposition of the Brier score (= MSE for probabilistic predictions of binary yes/no data) into reliability and resolution reveals (conditional) bias and the capability of the predictions to discriminate yes/no observations. We illustrate the approaches with applications from digital soil mapping studies. Gneiting, T., Balabdaoui, F., and Raftery, A. E. (2007). Probabilistic forecasts, calibration and sharpness. Journal of the Royal Statistical Society Series B, 69, 243-268. Hastie, T., Tibshirani, R., and Friedman, J. (2009). The Elements of Statistical Learning: Data Mining, Inference and Prediction. Springer, New York, second edition. Jolliffe, I. T. and Stephenson, D. B., editors (2012). Forecast Verification: A Practitioner's Guide in Atmospheric Science. Wiley-Blackwell, second edition. Piñeiro, G., Perelman, S., Guerschman, J., and Paruelo, J. (2008). How to evaluate models: Observed vs. predicted or predicted vs. observed? Ecological Modelling, 216, 316-322. Wilks, D. S. (2011). Statistical Methods in the Atmospheric Sciences. Academic Press, third edition.
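
    As a concrete example of point (ii), the sketch below computes the standard Murphy decomposition of the Brier score into reliability, resolution, and uncertainty over forecast probability bins; the toy forecasts and outcomes are made up for illustration.

```python
# Murphy decomposition of the Brier score for probabilistic yes/no forecasts:
# BS = reliability - resolution + uncertainty, computed over forecast bins.
import numpy as np

def brier_decomposition(p, y, n_bins=10):
    p, y = np.asarray(p, float), np.asarray(y, float)
    base_rate = y.mean()
    bins = np.clip((p * n_bins).astype(int), 0, n_bins - 1)
    rel = res = 0.0
    for k in range(n_bins):
        idx = bins == k
        if idx.any():
            w = idx.mean()                                   # fraction of cases in this bin
            rel += w * (p[idx].mean() - y[idx].mean()) ** 2  # (conditional) bias
            res += w * (y[idx].mean() - base_rate) ** 2      # discrimination ability
    unc = base_rate * (1.0 - base_rate)
    return rel, res, unc, rel - res + unc   # last term approximately equals the Brier score

p = np.array([0.1, 0.3, 0.3, 0.8, 0.9, 0.7])   # toy probabilistic forecasts
y = np.array([0,   0,   1,   1,   1,   0  ])   # toy yes/no observations
print(brier_decomposition(p, y))
```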

  14. Space Weather Models and Their Validation and Verification at the CCMC

    NASA Technical Reports Server (NTRS)

    Hesse, Michael

    2010-01-01

    The Community Coordinated Modeling Center (CCMC) is a US multi-agency activity with a dual mission. With equal emphasis, CCMC strives to provide science support to the international space research community through the execution of advanced space plasma simulations, and it endeavors to support the space weather needs of the US and its partners. Space weather support involves a broad spectrum of activities, from designing robust forecasting systems and transitioning them to forecasters, to providing space weather updates and forecasts to NASA's robotic mission operators. All of these activities have to rely on validation and verification of models and their products, so that users and forecasters have the means to assign confidence levels to the space weather information. In this presentation, we provide an overview of the space weather models resident at CCMC, as well as of the validation and verification activities undertaken at CCMC or through the use of CCMC services.

  15. A tracking and verification system implemented in a clinical environment for partial HIPAA compliance

    NASA Astrophysics Data System (ADS)

    Guo, Bing; Documet, Jorge; Liu, Brent; King, Nelson; Shrestha, Rasu; Wang, Kevin; Huang, H. K.; Grant, Edward G.

    2006-03-01

    The paper describes the methodology for the clinical design and implementation of a Location Tracking and Verification System (LTVS) that has distinct benefits for the Imaging Department at the Healthcare Consultation Center II (HCCII), an outpatient imaging facility located on the USC Health Science Campus. A novel system for tracking and verification of patients and staff in a clinical environment using wireless and facial biometric technology to monitor and automatically identify patients and staff was developed in order to streamline patient workflow, protect against erroneous examinations and create a security zone to prevent and audit unauthorized access to patient healthcare data under the HIPAA mandate. This paper describes the system design and integration methodology based on initial clinical workflow studies within a clinical environment. An outpatient center was chosen as an initial first step for the development and implementation of this system.

  16. A Conceptual Working Paper on Arms Control Verification,

    DTIC Science & Technology

    1981-08-01

    AD-A110 748, Operational Research and Analysis Establishment, Ottawa, F/S 5/4. A Conceptual Working Paper on Arms Control Verification (U), Aug 81, F. R... researched for the paper comes from ORAE Report No. R73, Compendium of Arms Control Verification Proposals, submitted simultaneously to the Committee on... nuclear activities within the territory of the non-nuclear weapon state, or carried out under its control anywhere. Parties also undertake not to

  17. Verification of respiratory-gated radiotherapy with new real-time tumour-tracking radiotherapy system using cine EPID images and a log file

    NASA Astrophysics Data System (ADS)

    Shiinoki, Takehiro; Hanazawa, Hideki; Yuasa, Yuki; Fujimoto, Koya; Uehara, Takuya; Shibuya, Keiko

    2017-02-01

    A combined system comprising the TrueBeam linear accelerator and a new real-time tumour-tracking radiotherapy system, SyncTraX, was installed at our institution. The objectives of this study are to develop a method for the verification of respiratory-gated radiotherapy with SyncTraX using cine electronic portal imaging device (EPID) images and a log file and to verify this treatment in clinical cases. Respiratory-gated radiotherapy was performed using TrueBeam and the SyncTraX system. Cine EPID images and a log file were acquired for a phantom and three patients during the course of the treatment. Digitally reconstructed radiographs (DRRs) were created for each treatment beam using a planning CT set. The cine EPID images, log file, and DRRs were analysed using developed software. For the phantom case, the accuracy of the proposed method was evaluated to verify the respiratory-gated radiotherapy. For the clinical cases, the intra- and inter-fractional variations of the fiducial marker used as an internal surrogate were calculated to evaluate the gating accuracy and set-up uncertainty in the superior-inferior (SI), anterior-posterior (AP), and left-right (LR) directions. The proposed method achieved high accuracy for the phantom verification. For the clinical cases, the intra- and inter-fractional variations of the fiducial marker were ⩽3 mm and ±3 mm in the SI, AP, and LR directions. We proposed a method for the verification of respiratory-gated radiotherapy with SyncTraX using cine EPID images and a log file and showed that this treatment is performed with high accuracy in clinical cases. This work was partly presented at the 58th Annual Meeting of the American Association of Physicists in Medicine.

  18. Verification of respiratory-gated radiotherapy with new real-time tumour-tracking radiotherapy system using cine EPID images and a log file.

    PubMed

    Shiinoki, Takehiro; Hanazawa, Hideki; Yuasa, Yuki; Fujimoto, Koya; Uehara, Takuya; Shibuya, Keiko

    2017-02-21

    A combined system comprising the TrueBeam linear accelerator and a new real-time tumour-tracking radiotherapy system, SyncTraX, was installed at our institution. The objectives of this study are to develop a method for the verification of respiratory-gated radiotherapy with SyncTraX using cine electronic portal imaging device (EPID) images and a log file and to verify this treatment in clinical cases. Respiratory-gated radiotherapy was performed using TrueBeam and the SyncTraX system. Cine EPID images and a log file were acquired for a phantom and three patients during the course of the treatment. Digitally reconstructed radiographs (DRRs) were created for each treatment beam using a planning CT set. The cine EPID images, log file, and DRRs were analysed using developed software. For the phantom case, the accuracy of the proposed method was evaluated to verify the respiratory-gated radiotherapy. For the clinical cases, the intra- and inter-fractional variations of the fiducial marker used as an internal surrogate were calculated to evaluate the gating accuracy and set-up uncertainty in the superior-inferior (SI), anterior-posterior (AP), and left-right (LR) directions. The proposed method achieved high accuracy for the phantom verification. For the clinical cases, the intra- and inter-fractional variations of the fiducial marker were ⩽3 mm and ±3 mm in the SI, AP, and LR directions. We proposed a method for the verification of respiratory-gated radiotherapy with SyncTraX using cine EPID images and a log file and showed that this treatment is performed with high accuracy in clinical cases.

  19. Category V Compliant Container for Mars Sample Return Missions

    NASA Technical Reports Server (NTRS)

    Dolgin, Benjamin; Sanok, Joseph; Sevilla, Donald; Bement, Laurence J.

    2000-01-01

    A novel containerization technique that satisfies Planetary Protection (PP) Category V requirements has been developed and demonstrated on a mock-up of the Mars Sample Return Container. The proposed approach uses explosive welding with a sacrificial layer and cut-through-the-seam techniques. The technology produces a container that is free from Martian contaminants at the atomic level. The containerization technique can be used on any celestial body that may support life. A major advantage of the proposed technology is the possibility of very fast (less than an hour) verification of both containment and cleanliness with typical metallurgical laboratory equipment. No separate biological verification is required. In addition to meeting Category V requirements, the proposed container presents a surface that is free of any organisms, even nonviable ones, and of any molecular fragments of biological origin that are unique to Mars or to any other celestial body other than Earth.

  20. A Computational Framework to Control Verification and Robustness Analysis

    NASA Technical Reports Server (NTRS)

    Crespo, Luis G.; Kenny, Sean P.; Giesy, Daniel P.

    2010-01-01

    This paper presents a methodology for evaluating the robustness of a controller based on its ability to satisfy the design requirements. The framework proposed is generic since it allows for high-fidelity models, arbitrary control structures and arbitrary functional dependencies between the requirements and the uncertain parameters. The cornerstone of this contribution is the ability to bound the region of the uncertain parameter space where the degradation in closed-loop performance remains acceptable. The size of this bounding set, whose geometry can be prescribed according to deterministic or probabilistic uncertainty models, is a measure of robustness. The robustness metrics proposed herein are the parametric safety margin, the reliability index, the failure probability and upper bounds to this probability. The performance observed at the control verification setting, where the assumptions and approximations used for control design may no longer hold, will fully determine the proposed control assessment.

  1. A framework of multitemplate ensemble for fingerprint verification

    NASA Astrophysics Data System (ADS)

    Yin, Yilong; Ning, Yanbin; Ren, Chunxiao; Liu, Li

    2012-12-01

    How to improve the performance of an automatic fingerprint verification system (AFVS) is a long-standing challenge in the biometric verification field. Recently, it has become popular to improve the performance of an AFVS by using ensemble learning to fuse related fingerprint information. In this article, we propose a novel fingerprint verification framework based on a multitemplate ensemble method. The framework consists of three stages. In the first stage, enrollment, we adopt an effective template selection method to select the fingerprints that best represent a finger; a polyhedron is then created from the matching results of the multiple template fingerprints, and a virtual centroid of the polyhedron is computed. In the second stage, verification, we measure the distance between the centroid of the polyhedron and a query image. In the final stage, a fusion rule is used to choose a proper distance from a distance set. The experimental results on the FVC2004 database demonstrate the improved effectiveness of the new framework for fingerprint verification. With a minutiae-based matching method, the average EER over the four FVC2004 databases drops from 10.85 to 0.88, and with a ridge-based matching method, the average EER over these four databases also decreases from 14.58 to 2.51.

  2. INF verification: a guide for the perplexed

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mendelsohn, J.

    1987-09-01

    The administration has dug itself some deep holes on the verification issue. It will have to conclude an arms control treaty without having resolved earlier (but highly questionable) compliance issues on which it has placed great emphasis. It will probably have to abandon its more sweeping (and unnecessary) on-site inspection (OSI) proposals because of adverse security and political implications for the United States and its allies. And, finally, it will probably have to present to the Congress an INF treaty that provides for a considerably less stringent (but nonetheless adequate) verification regime than it had originally demanded. It is difficult to dispel the impression that, when the likelihood of concluding an INF treaty seemed remote, the administration indulged its penchant for intrusive and non-negotiable verification measures. As the possibility of, and eagerness for, a treaty increased, and as the Soviet Union shifted its policy from one of resistance to OSI to one of indicating that on-site verification involved reciprocal obligations, the administration was forced to scale back its OSI rhetoric. This re-evaluation of OSI by the administration does not make the INF treaty any less verifiable; from the outset the Reagan administration was asking for a far more extensive verification package than was necessary, practicable, acceptable, or negotiable.

  3. Structural damage detection based on stochastic subspace identification and statistical pattern recognition: II. Experimental validation under varying temperature

    NASA Astrophysics Data System (ADS)

    Lin, Y. Q.; Ren, W. X.; Fang, S. E.

    2011-11-01

    Although most vibration-based damage detection methods can be verified satisfactorily on analytical or numerical structures, many of them encounter problems when applied to real-world structures under varying environments. Damage detection methods that extract damage features directly from periodically sampled dynamic time-history response measurements are desirable, but relevant research and field application verification are still lacking. In this second part of a two-part paper, the robustness and performance of the statistics-based damage index proposed in the first part, which uses the forward innovation model obtained by stochastic subspace identification of a vibrating structure, are investigated on two prestressed reinforced concrete (RC) beams tested in the laboratory and on a full-scale RC arch bridge tested in the field under varying environments. Experimental verification focuses on temperature effects. It is demonstrated that the proposed statistics-based damage index is insensitive to temperature variations but sensitive to structural deterioration or state alteration. This makes it possible to detect structural damage in full-scale structures experiencing ambient excitations and varying environmental conditions.

  4. Efficiency and Flexibility of Fingerprint Scheme Using Partial Encryption and Discrete Wavelet Transform to Verify User in Cloud Computing.

    PubMed

    Yassin, Ali A

    2014-01-01

    The security of digital images is increasingly essential, and the fingerprint plays a central role among image-based biometrics. Fingerprint recognition is a biometric verification scheme that applies pattern recognition techniques to the image of an individual's fingerprint. In the cloud environment, an adversary can intercept information, so data must be secured against eavesdroppers. Unfortunately, encryption and decryption functions are slow and often computationally demanding. Fingerprint techniques also require extra hardware and software and can be fooled by artificial gummy fingers (spoof attacks). Additionally, when a large number of users are verified at the same time, the mechanism becomes slow. In this paper, we employ partial encryption of the user's fingerprint together with the discrete wavelet transform to obtain a new fingerprint verification scheme. The proposed scheme overcomes these problems: it incurs little additional cost, reduces the computational requirements for large volumes of fingerprint images, and resists well-known attacks. In addition, experimental results illustrate that the proposed scheme performs well for user fingerprint verification.
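
    A minimal sketch of the partial-encryption idea: transform the fingerprint image with a discrete wavelet transform and scramble only the low-frequency approximation band with a key-dependent permutation. The permutation-based scrambling is a stand-in for whatever cipher the scheme actually uses; all names are illustrative.

```python
import hashlib
import numpy as np
import pywt

def key_to_seed(key: bytes) -> int:
    """Derive a deterministic RNG seed from the secret key."""
    return int.from_bytes(hashlib.sha256(key).digest()[:8], "big")

def partial_encrypt(img, key, wavelet="haar"):
    """Scramble only the DWT approximation band with a key-dependent permutation."""
    cA, details = pywt.dwt2(np.asarray(img, float), wavelet)
    perm = np.random.default_rng(key_to_seed(key)).permutation(cA.size)
    return cA.ravel()[perm].reshape(cA.shape), details

def partial_decrypt(cA_enc, details, key, wavelet="haar"):
    """Invert the permutation and reconstruct the image with the inverse DWT."""
    perm = np.random.default_rng(key_to_seed(key)).permutation(cA_enc.size)
    cA = np.empty_like(cA_enc.ravel())
    cA[perm] = cA_enc.ravel()
    return pywt.idwt2((cA.reshape(cA_enc.shape), details), wavelet)

# Round-trip check on a stand-in "fingerprint" image.
img = np.random.default_rng(1).integers(0, 256, (64, 64)).astype(float)
enc_cA, det = partial_encrypt(img, b"secret")
print(np.allclose(partial_decrypt(enc_cA, det, b"secret"), img))
```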

  5. Efficiency and Flexibility of Fingerprint Scheme Using Partial Encryption and Discrete Wavelet Transform to Verify User in Cloud Computing

    PubMed Central

    Yassin, Ali A.

    2014-01-01

    The security of digital images is increasingly essential, and the fingerprint plays a central role among image-based biometrics. Fingerprint recognition is a biometric verification scheme that applies pattern recognition techniques to the image of an individual's fingerprint. In the cloud environment, an adversary can intercept information, so data must be secured against eavesdroppers. Unfortunately, encryption and decryption functions are slow and often computationally demanding. Fingerprint techniques also require extra hardware and software and can be fooled by artificial gummy fingers (spoof attacks). Additionally, when a large number of users are verified at the same time, the mechanism becomes slow. In this paper, we employ partial encryption of the user's fingerprint together with the discrete wavelet transform to obtain a new fingerprint verification scheme. The proposed scheme overcomes these problems: it incurs little additional cost, reduces the computational requirements for large volumes of fingerprint images, and resists well-known attacks. In addition, experimental results illustrate that the proposed scheme performs well for user fingerprint verification. PMID:27355051

  6. Quantum money with nearly optimal error tolerance

    NASA Astrophysics Data System (ADS)

    Amiri, Ryan; Arrazola, Juan Miguel

    2017-06-01

    We present a family of quantum money schemes with classical verification which display a number of benefits over previous proposals. Our schemes are based on hidden matching quantum retrieval games and they tolerate noise up to 23 % , which we conjecture reaches 25 % asymptotically as the dimension of the underlying hidden matching states is increased. Furthermore, we prove that 25 % is the maximum tolerable noise for a wide class of quantum money schemes with classical verification, meaning our schemes are almost optimally noise tolerant. We use methods in semidefinite programming to prove security in a substantially different manner to previous proposals, leading to two main advantages: first, coin verification involves only a constant number of states (with respect to coin size), thereby allowing for smaller coins; second, the reusability of coins within our scheme grows linearly with the size of the coin, which is known to be optimal. Last, we suggest methods by which the coins in our protocol could be implemented using weak coherent states and verified using existing experimental techniques, even in the presence of detector inefficiencies.

  7. A zero-knowledge protocol for nuclear warhead verification

    NASA Astrophysics Data System (ADS)

    Glaser, Alexander; Barak, Boaz; Goldston, Robert J.

    2014-06-01

    The verification of nuclear warheads for arms control involves a paradox: international inspectors will have to gain high confidence in the authenticity of submitted items while learning nothing about them. Proposed inspection systems featuring `information barriers', designed to hide measurements stored in electronic systems, are at risk of tampering and snooping. Here we show the viability of a fundamentally new approach to nuclear warhead verification that incorporates a zero-knowledge protocol, which is designed in such a way that sensitive information is never measured and so does not need to be hidden. We interrogate submitted items with energetic neutrons, making, in effect, differential measurements of both neutron transmission and emission. Calculations for scenarios in which material is diverted from a test object show that a high degree of discrimination can be achieved while revealing zero information. Our ideas for a physical zero-knowledge system could have applications beyond the context of nuclear disarmament. The proposed technique suggests a way to perform comparisons or computations on personal or confidential data without measuring the data in the first place.

  8. Color separation in forensic image processing using interactive differential evolution.

    PubMed

    Mushtaq, Harris; Rahnamayan, Shahryar; Siddiqi, Areeb

    2015-01-01

    Color separation is an image processing technique that has often been used in forensic applications to differentiate among variant colors and to remove unwanted image interference. This process can reveal important information such as covered text or fingerprints in forensic investigation procedures. However, several limitations prevent users from selecting the appropriate parameters pertaining to the desired and undesired colors. This study proposes the hybridization of interactive differential evolution (IDE) with a color separation technique, so that users no longer need to guess the required control parameters. The IDE algorithm optimizes these parameters in an interactive manner by utilizing human visual judgment to uncover desired objects. A comprehensive experimental verification has been conducted on various sample test images, including heavily obscured texts, texts with subtle color variations, and fingerprint smudges. The advantage of IDE is apparent, as it effectively optimizes the color separation parameters at a level indiscernible to the naked eye. © 2014 American Academy of Forensic Sciences.
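
    The sketch below shows differential evolution tuning color-separation parameters (a target color and a tolerance radius). scipy's non-interactive optimizer and the automatic surrogate fitness stand in for the human visual judgment that drives the interactive loop in the paper.

```python
import numpy as np
from scipy.optimize import differential_evolution

rng = np.random.default_rng(1)
rgb = rng.random((64, 64, 3))        # placeholder "evidence" image, values in [0, 1]

def separate(params, img):
    """Keep pixels within a tolerance radius of the target colour."""
    target, tol = np.asarray(params[:3]), params[3]
    return (np.linalg.norm(img - target, axis=-1) < tol).astype(float)

def fitness(params):
    """Surrogate objective standing in for human judgment: isolate ~20% of pixels."""
    return abs(separate(params, rgb).mean() - 0.2)

bounds = [(0, 1), (0, 1), (0, 1), (0.05, 0.8)]   # target R, G, B and tolerance radius
result = differential_evolution(fitness, bounds, maxiter=30, seed=0)
print(result.x, result.fun)
```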

  9. A Science and Risk-Based Pragmatic Methodology for Blend and Content Uniformity Assessment.

    PubMed

    Sayeed-Desta, Naheed; Pazhayattil, Ajay Babu; Collins, Jordan; Doshi, Chetan

    2018-04-01

    This paper describes a pragmatic approach that can be applied in assessing powder blend and unit dosage uniformity of solid dose products at Process Design, Process Performance Qualification, and Continued/Ongoing Process Verification stages of the Process Validation lifecycle. The statistically based sampling, testing, and assessment plan was developed due to the withdrawal of the FDA draft guidance for industry "Powder Blends and Finished Dosage Units-Stratified In-Process Dosage Unit Sampling and Assessment." This paper compares the proposed Grouped Area Variance Estimate (GAVE) method with an alternate approach outlining the practicality and statistical rationalization using traditional sampling and analytical methods. The approach is designed to fit solid dose processes assuring high statistical confidence in both powder blend uniformity and dosage unit uniformity during all three stages of the lifecycle complying with ASTM standards as recommended by the US FDA.

  10. 77 FR 64383 - Proposed Information Collection (Verification of VA Benefits) Activity: Comment Request

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-10-19

    ... DEPARTMENT OF VETERANS AFFAIRS [OMB Control No. 2900-0406] Proposed Information Collection... any VA-guaranteed loans on an automatic basis. DATES: Written comments and recommendations on the... written comments on the collection of information through the Federal Docket Management System (FDMS) at...

  11. 75 FR 60169 - Proposed Information Collection (Vetbiz Vendor Information Pages Verification Program) Activity...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-09-29

    ... Enterprise, Department of Veterans Affairs. ACTION: Notice. SUMMARY: The Center for Veterans Enterprise (CVE... veterans owned businesses. DATES: Written comments and recommendations on the proposed collection of... online through the Federal Docket Management System (FDMS) at http://www.Regulations.gov . FOR FURTHER...

  12. [The model of development of science by the example of limnology].

    PubMed

    Menshutkin, V V; Levchenko, V F

    2010-01-01

    Limnology, the science of lakes, is a young and relatively closed area of study; it owes its existence to several hundred scientists. The International Society of Limnologists has held its meetings since 1922. We used materials from these meetings to identify the main stages in the development of this science; these stages included both rapid and relatively calm periods. Based on an analysis of these data, we constructed a model of the development of the science, using the same data for tuning and verification of the model. We suggest that the main regularities in the development of limnology can be extrapolated to other sciences. The main "actor" in the model is a population of scientists. Each scientist can, with some probability, propose new ideas as well as draw in his work on some particular body of already accumulated knowledge and ideas. The model also takes into account how scientific information spreads, including individual characteristics of the model scientists such as age, experience, and communicability. After the model parameters had been chosen so that the model adequately described the development of limnology, we performed a series of experiments by changing some of the characteristics and obtained rather unexpected results, published preliminarily in a short paper (Levchenko V. F. and Menshutkin V. V., Int. J. Comp. Anticip. Syst., 2008, vol. 22, p. 63-75) and discussed here in greater detail. The model reveals that the development of science proceeds irregularly and is sharply decelerated by low levels of communication among scientists and by the absence of scientific schools, and that a scientist's period of "scientific youth" usually begins only after the age of 40.

  13. First Cryo-Vacuum Test of the JWST Integrated Science Instrument Module

    NASA Astrophysics Data System (ADS)

    Kimble, Randy A.; Antonille, S. R.; Balzano, V.; Comber, B. J.; Davila, P. S.; Drury, M. D.; Glasse, A.; Glazer, S. D.; Lundquist, R.; Mann, S. D.; McGuffey, D. B.; Novo-Gradac, K. J.; Penanen, K.; Ramey, D. D.; Sullivan, J.; Van Campen, J.; Vila, M. B.

    2014-01-01

    The integration and test program for the Integrated Science Instrument Module (ISIM) of the James Webb Space Telescope (JWST) calls for three cryo-vacuum tests of the ISIM hardware. The first is a risk-reduction test aimed at checking out the test hardware and procedures; this will be followed by two formal verification tests that will bracket other key aspects of the environmental test program (e.g. vibration and acoustics, EMI/EMC). The first of these cryo-vacuum tests, the risk-reduction test, was executed at NASA’s Goddard Space Flight Center starting in late August, 2013. Flight hardware under test included two (of the eventual four) flight instruments, the Mid-Infrared Instrument (MIRI) and the Fine Guidance Sensor/Near-Infrared Imager and Slitless Spectrograph (FGS/NIRISS), mounted to the ISIM structure, as well as the ISIM Electronics Compartment (IEC). The instruments were cooled to their flight operating temperatures (40 K for FGS/NIRISS, ~6 K for MIRI) and optically tested against a cryo-certified telescope simulator. Key goals for the risk reduction test included: 1) demonstration of controlled cooldown and warmup, stable control at operating temperature, and measurement of heat loads, 2) operation of the science instruments with ISIM electronics systems at temperature, 3) health trending of the science instruments against instrument-level test results, 4) measurement of the pupil positions and six-degree-of-freedom alignment of the science instruments against the simulated telescope focal surface, 5) detailed optical characterization of the NIRISS instrument, 6) verification of the signal-to-noise performance of the MIRI, and 7) exercise of the Onboard Script System that will be used to operate the instruments in flight. In addition, the execution of the test is expected to yield invaluable logistical experience - development and execution of procedures, communications, analysis of results - that will greatly benefit the subsequent verification tests. At the time of this submission, the hardware had reached operating temperature and was partway through the cryo test program. We report here on the test configuration, the overall process, and the results that were ultimately obtained.

  14. International Legal Framework for Denuclearization and Nuclear Disarmament -- Present Situation and Prospects

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gastelum, Zoe N.

    This thesis is the culminating project for my participation in the OECD NEA International School of Nuclear Law. The paper will begin by providing a historical background to current disarmament and denuclearization treaties. It will discuss the current legal framework based on current and historical activities related to denuclearization and nuclear disarmament. Then, it will propose paths forward for future efforts and describe the necessary legal considerations. Each treaty or agreement will be examined with respect to its requirements for: 1) limitations and implementation; and 2) verification and monitoring. Then, lessons learned in each of the two areas (limitations and verification) will be used to construct a proposed path forward at the end of this paper.

  15. Classification and Verification of Handwritten Signatures with Time Causal Information Theory Quantifiers.

    PubMed

    Rosso, Osvaldo A; Ospina, Raydonal; Frery, Alejandro C

    2016-01-01

    We present a new approach to handwritten signature classification and verification based on descriptors stemming from time-causal information theory. The proposal uses the Shannon entropy, the statistical complexity, and the Fisher information evaluated over the Bandt and Pompe symbolization of the horizontal and vertical coordinates of signatures. These six features are easy and fast to compute, and they are the input to a One-Class Support Vector Machine classifier. The results are better than state-of-the-art online techniques that employ higher-dimensional feature spaces, which often require specialized software and hardware. We assess the consistency of our proposal with respect to the size of the training sample, and we also use it to classify the signatures into meaningful groups.
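
    A hedged sketch of the descriptor pipeline: Bandt-Pompe ordinal patterns of a coordinate series, normalized permutation (Shannon) entropy, and a One-Class SVM trained on genuine signatures only. Statistical complexity and Fisher information are omitted for brevity, and the random-walk "signatures" are placeholders.

```python
from itertools import permutations
import numpy as np
from sklearn.svm import OneClassSVM

def permutation_entropy(x, d=4, tau=1):
    """Normalized Shannon entropy of the Bandt-Pompe ordinal-pattern distribution."""
    patterns = list(permutations(range(d)))
    counts = dict.fromkeys(patterns, 0)
    for i in range(len(x) - (d - 1) * tau):
        window = x[i:i + d * tau:tau]
        counts[tuple(np.argsort(window))] += 1
    p = np.array([c for c in counts.values() if c > 0], float)
    p /= p.sum()
    return float(-(p * np.log(p)).sum() / np.log(len(patterns)))   # normalized to [0, 1]

def features(sig_xy):
    """One entropy value per pen coordinate (x and y trajectories)."""
    return [permutation_entropy(sig_xy[:, 0]), permutation_entropy(sig_xy[:, 1])]

rng = np.random.default_rng(0)
genuine = [rng.random((200, 2)).cumsum(0) for _ in range(20)]       # stand-in signatures
model = OneClassSVM(nu=0.1).fit([features(s) for s in genuine])
print(model.predict([features(genuine[0])]))                        # +1 = accepted as genuine
```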

  16. Sandia National Laboratories: Hydrogen Risk Assessment Models toolkit now

    Science.gov Websites

  17. Sandia National Laboratories: 100 Resilient Cities: Sandia Challenge:

    Science.gov Websites

  18. Sandia National Laboratories: National Security Missions: Defense Systems

    Science.gov Websites

  19. DIESEL ENGINE RETROFIT TECHNOLOGY VERIFICATION

    EPA Science Inventory

    This presentation will be given at the EPA Science Forum 2005 in Washington, DC. According to recent estimates, there are approximately 7.9 million heavy-duty diesel trucks and buses in use in the United States. Emissions from these vehicles account for substantial portions of t...

  20. Verification of Employment (VOE)

    Science.gov Websites

  1. Analyzing Personalized Policies for Online Biometric Verification

    PubMed Central

    Sadhwani, Apaar; Yang, Yan; Wein, Lawrence M.

    2014-01-01

    Motivated by India’s nationwide biometric program for social inclusion, we analyze verification (i.e., one-to-one matching) in the case where we possess similarity scores for 10 fingerprints and two irises between a resident’s biometric images at enrollment and his biometric images during his first verification. At subsequent verifications, we allow individualized strategies based on these 12 scores: we acquire a subset of the 12 images, get new scores for this subset that quantify the similarity to the corresponding enrollment images, and use the likelihood ratio (i.e., the likelihood of observing these scores if the resident is genuine divided by the corresponding likelihood if the resident is an imposter) to decide whether a resident is genuine or an imposter. We also consider two-stage policies, where additional images are acquired in a second stage if the first-stage results are inconclusive. Using performance data from India’s program, we develop a new probabilistic model for the joint distribution of the 12 similarity scores and find near-optimal individualized strategies that minimize the false reject rate (FRR) subject to constraints on the false accept rate (FAR) and mean verification delay for each resident. Our individualized policies achieve the same FRR as a policy that acquires (and optimally fuses) 12 biometrics for each resident, which represents a five (four, respectively) log reduction in FRR relative to fingerprint (iris, respectively) policies previously proposed for India’s biometric program. The mean delay is sec for our proposed policy, compared to 30 sec for a policy that acquires one fingerprint and 107 sec for a policy that acquires all 12 biometrics. This policy acquires iris scans from 32–41% of residents (depending on the FAR) and acquires an average of 1.3 fingerprints per resident. PMID:24787752
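
    A minimal sketch of the likelihood-ratio decision rule described above, using independent Gaussian score models purely for illustration; the distribution parameters and threshold are made-up placeholders, not values fitted to the program's data.

```python
import numpy as np
from scipy.stats import norm

# Assumed per-modality similarity-score distributions (made-up parameters).
GENUINE = dict(mean=0.8, sd=0.10)
IMPOSTER = dict(mean=0.3, sd=0.15)

def log_likelihood_ratio(scores):
    """Sum of per-score log-likelihood ratios under independent Gaussian models."""
    scores = np.asarray(scores, float)
    llr_genuine = norm.logpdf(scores, GENUINE["mean"], GENUINE["sd"]).sum()
    llr_imposter = norm.logpdf(scores, IMPOSTER["mean"], IMPOSTER["sd"]).sum()
    return float(llr_genuine - llr_imposter)

def decide(scores, threshold=0.0):
    """Accept when the evidence for 'genuine' outweighs 'imposter' by the threshold."""
    return "genuine" if log_likelihood_ratio(scores) > threshold else "imposter"

print(decide([0.75, 0.82]))   # e.g. two fingerprint scores acquired at this visit
print(decide([0.35, 0.28]))
```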

  2. Analyzing personalized policies for online biometric verification.

    PubMed

    Sadhwani, Apaar; Yang, Yan; Wein, Lawrence M

    2014-01-01

    Motivated by India's nationwide biometric program for social inclusion, we analyze verification (i.e., one-to-one matching) in the case where we possess similarity scores for 10 fingerprints and two irises between a resident's biometric images at enrollment and his biometric images during his first verification. At subsequent verifications, we allow individualized strategies based on these 12 scores: we acquire a subset of the 12 images, get new scores for this subset that quantify the similarity to the corresponding enrollment images, and use the likelihood ratio (i.e., the likelihood of observing these scores if the resident is genuine divided by the corresponding likelihood if the resident is an imposter) to decide whether a resident is genuine or an imposter. We also consider two-stage policies, where additional images are acquired in a second stage if the first-stage results are inconclusive. Using performance data from India's program, we develop a new probabilistic model for the joint distribution of the 12 similarity scores and find near-optimal individualized strategies that minimize the false reject rate (FRR) subject to constraints on the false accept rate (FAR) and mean verification delay for each resident. Our individualized policies achieve the same FRR as a policy that acquires (and optimally fuses) 12 biometrics for each resident, which represents a five (four, respectively) log reduction in FRR relative to fingerprint (iris, respectively) policies previously proposed for India's biometric program. The mean delay is [Formula: see text] sec for our proposed policy, compared to 30 sec for a policy that acquires one fingerprint and 107 sec for a policy that acquires all 12 biometrics. This policy acquires iris scans from 32-41% of residents (depending on the FAR) and acquires an average of 1.3 fingerprints per resident.
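
    The likelihood-ratio decision rule described in this record can be illustrated with a minimal Python sketch. The independent Gaussian score models, the modality names, and the threshold below are illustrative assumptions; the paper fits a joint probabilistic model over all 12 scores and optimizes per-resident acquisition policies, which is not reproduced here.

      from scipy.stats import norm

      def log_likelihood_ratio(scores, genuine_params, impostor_params):
          """Sum of per-modality log likelihood ratios under independent Gaussian score models.

          scores, genuine_params and impostor_params are dicts keyed by modality name;
          the params are (mean, std) pairs.  Independence and Gaussianity are simplifications.
          """
          llr = 0.0
          for name, s in scores.items():
              g_mu, g_sd = genuine_params[name]
              i_mu, i_sd = impostor_params[name]
              llr += norm.logpdf(s, g_mu, g_sd) - norm.logpdf(s, i_mu, i_sd)
          return llr

      def verify(scores, genuine_params, impostor_params, threshold=0.0):
          """Accept the claim as genuine when the log likelihood ratio exceeds the threshold."""
          return log_likelihood_ratio(scores, genuine_params, impostor_params) > threshold

      # Hypothetical similarity scores from two acquired biometrics.
      scores = {"right_index_finger": 0.82, "left_iris": 0.74}
      genuine = {"right_index_finger": (0.80, 0.10), "left_iris": (0.85, 0.08)}
      impostor = {"right_index_finger": (0.30, 0.15), "left_iris": (0.35, 0.12)}
      print(verify(scores, genuine, impostor))   # True

    In practice the threshold would be tuned to satisfy the false accept rate constraint, and an acquisition policy would first decide which of the 12 biometrics to capture before this test is applied.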

  3. GaAs Spectrometer for Electron Spectroscopy at Europa

    NASA Astrophysics Data System (ADS)

    Lioliou, G.; Barnett, A. M.

    2016-12-01

    We propose a GaAs-based electron spectrometer for a hypothetical future mission orbiting Europa. Previous observations at Europa's South Pole with the Hubble Space Telescope of hydrogen Lyman-α and oxygen OI 130.4 nm emissions were consistent with water vapor plumes [Roth et al., 2014, Science 343, 171]. Future observations and analysis of plumes on Europa could provide information about its subsurface structure and the distribution of liquid water within its icy shells [Rhoden et al., 2015, Icarus 253, 169]. In situ low energy (1 keV - 100 keV) electron spectroscopy along with UV imaging either in situ or with the Hubble Space Telescope Wide Field Camera 3 or similar would allow verification of the auroral observations being due to electron impact excitation of water vapor plumes. The proposed spectrometer includes a novel GaAs p+-i-n+ photodiode and a custom-made charge-sensitive preamplifier. The use of an early prototype GaAs detector for direct electron spectroscopy has already been demonstrated in ground-based applications [Barnett et al., 2012, J. Instrum. 7, P09012]. Based on previous radiation hardness measurements of GaAs, the expected duration of the mission without degradation of the detector performance is estimated to be 4 months. Simulations and laboratory experiments characterising the detection performance of the proposed system are presented.

  4. Verification and Validation of NASA-Supported Enhancements to PECAD's Decision Support Tools

    NASA Technical Reports Server (NTRS)

    McKellipo, Rodney; Ross, Kenton W.

    2006-01-01

    The NASA Applied Sciences Directorate (ASD), part of the Earth-Sun System Division of NASA's Science Mission Directorate, has partnered with the U.S. Department of Agriculture (USDA) to enhance decision support in the area of agricultural efficiency, an application of national importance. The ASD integrated the results of NASA Earth science research into USDA decision support tools employed by the USDA Foreign Agricultural Service (FAS) Production Estimates and Crop Assessment Division (PECAD), which supports national decision making by gathering, analyzing, and disseminating global crop intelligence. Verification and validation of the following enhancements are summarized: 1) Near-real-time Moderate Resolution Imaging Spectroradiometer (MODIS) products through PECAD's MODIS Image Gallery; 2) MODIS Normalized Difference Vegetation Index (NDVI) time series data through the USDA-FAS MODIS NDVI Database; and 3) Jason-1 and TOPEX/Poseidon lake level estimates through PECAD's Global Reservoir and Lake Monitor. Where possible, each enhanced product was characterized for accuracy, timeliness, and coverage, and the characterized performance was compared to PECAD operational requirements. The MODIS Image Gallery and the GRLM are more mature and have achieved a semi-operational status, whereas the USDA-FAS MODIS NDVI Database is still evolving and should be considered

  5. An Overview of Integration and Test of the James Webb Space Telescope Integrated Science Instrument Module

    NASA Technical Reports Server (NTRS)

    Drury, Michael; Becker, Neil; Bos, Brent; Davila, Pamela; Frey, Bradley; Hylan, Jason; Marsh, James; McGuffey, Douglas; Novak, Maria; Ohl, Raymond

    2007-01-01

    The James Webb Space Telescope (JWST) is a 6.6m diameter, segmented, deployable telescope for cryogenic IR space astronomy (approx.40K). The JWST Observatory architecture includes the Optical Telescope Element (OTE) and the Integrated Science Instrument Module (ISIM) element that contains four science instruments (SI) including a Guider. The SIs and Guider are mounted to a composite metering structure with outer dimensions of 2.1x2.2x1.9m. The SI and Guider units are integrated to the ISIM structure and optically tested at NASA/Goddard Space Flight Center as an instrument suite using a high-fidelity, cryogenic JWST telescope simulator that features a 1.5m diameter powered mirror. The SIs are integrated and aligned to the structure under ambient, clean room conditions. SI performance, including focus, pupil shear and wavefront error, is evaluated at the operating temperature. We present an overview of the ISIM integration within the context of Observatory-level construction. We describe the integration and verification plan for the ISIM element, including an overview of our incremental verification approach, ambient mechanical integration and test plans and optical alignment and cryogenic test plans. We describe key ground support equipment and facilities.

  6. CAPTIONALS: A computer aided testing environment for the verification and validation of communication protocols

    NASA Technical Reports Server (NTRS)

    Feng, C.; Sun, X.; Shen, Y. N.; Lombardi, Fabrizio

    1992-01-01

    This paper covers the verification and protocol validation for distributed computer and communication systems using a computer aided testing approach. Validation and verification make up the so-called process of conformance testing. Protocol applications which pass conformance testing are then checked to see whether they can operate together. This is referred to as interoperability testing. A new comprehensive approach to protocol testing is presented which address: (1) modeling for inter-layer representation for compatibility between conformance and interoperability testing; (2) computational improvement to current testing methods by using the proposed model inclusive of formulation of new qualitative and quantitative measures and time-dependent behavior; (3) analysis and evaluation of protocol behavior for interactive testing without extensive simulation.

  7. Holographic particle size extraction by using Wigner-Ville distribution

    NASA Astrophysics Data System (ADS)

    Chuamchaitrakool, Porntip; Widjaja, Joewono; Yoshimura, Hiroyuki

    2014-06-01

    A new method for measuring object size from in-line holograms by using the Wigner-Ville distribution (WVD) is proposed. The proposed method has advantages over conventional numerical reconstruction in that it requires no iterative processing and can extract the object size and position with only a single computation of the WVD. Experimental verification of the proposed method is presented.
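
    As a rough illustration of the underlying transform, the following numpy/scipy sketch evaluates a discrete Wigner-Ville distribution of a one-dimensional signal (for example, an intensity profile sampled across the hologram). The hologram-specific steps that extract particle size and axial position from the distribution are not reproduced; the function name and the use of the analytic signal are implementation choices, not taken from the record.

      import numpy as np
      from scipy.signal import hilbert

      def wigner_ville(x):
          """Discrete (pseudo) Wigner-Ville distribution of a real 1-D signal.

          Returns an (n, n) array: rows index frequency bins, columns index time samples.
          """
          z = hilbert(np.asarray(x, dtype=float))       # analytic signal suppresses mirror-frequency terms
          n = len(z)
          wvd = np.zeros((n, n))
          for t in range(n):
              mmax = min(t, n - 1 - t)                  # largest lag keeping t+m and t-m inside the record
              m = np.arange(-mmax, mmax + 1)
              r = np.zeros(n, dtype=complex)
              r[m % n] = z[t + m] * np.conj(z[t - m])   # instantaneous autocorrelation at time t
              wvd[:, t] = np.real(np.fft.fft(r))        # FFT over the lag variable
          return wvd

    For a chirp-like fringe pattern, the energy of this distribution concentrates along the local fringe frequency, which is the kind of quantity the record relates to particle size and position.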

  8. An Approach to Biometric Verification Based on Human Body Communication in Wearable Devices

    PubMed Central

    Li, Jingzhen; Liu, Yuhang; Nie, Zedong; Qin, Wenjian; Pang, Zengyao; Wang, Lei

    2017-01-01

    In this paper, an approach to biometric verification based on human body communication (HBC) is presented for wearable devices. For this purpose, the transmission gain S21 of a volunteer's forearm is measured with a vector network analyzer (VNA). Specifically, in order to determine the chosen frequency for biometric verification, 1800 groups of data are acquired from 10 volunteers in the frequency range 0.3 MHz to 1500 MHz, and each group includes 1601 sample data. In addition, to achieve rapid verification, 30 groups of data for each volunteer are acquired at the chosen frequency, and each group contains only 21 sample data. Furthermore, a threshold-adaptive template matching (TATM) algorithm based on weighted Euclidean distance is proposed for rapid verification in this work. The results indicate that the chosen frequency for biometric verification is from 650 MHz to 750 MHz. The false acceptance rate (FAR) and false rejection rate (FRR) based on TATM are approximately 5.79% and 6.74%, respectively. In contrast, the FAR and FRR were 4.17% and 37.5%, 3.37% and 33.33%, and 3.80% and 34.17% using K-nearest neighbor (KNN) classification, support vector machines (SVM), and naive Bayesian method (NBM) classification, respectively. In addition, the running time of TATM is 0.019 s, whereas the running times of KNN, SVM and NBM are 0.310 s, 0.0385 s, and 0.168 s, respectively. Therefore, TATM is suggested to be appropriate for rapid verification use in wearable devices. PMID:28075375

  9. An Approach to Biometric Verification Based on Human Body Communication in Wearable Devices.

    PubMed

    Li, Jingzhen; Liu, Yuhang; Nie, Zedong; Qin, Wenjian; Pang, Zengyao; Wang, Lei

    2017-01-10

    In this paper, an approach to biometric verification based on human body communication (HBC) is presented for wearable devices. For this purpose, the transmission gain S21 of a volunteer's forearm is measured with a vector network analyzer (VNA). Specifically, in order to determine the chosen frequency for biometric verification, 1800 groups of data are acquired from 10 volunteers in the frequency range 0.3 MHz to 1500 MHz, and each group includes 1601 sample data. In addition, to achieve rapid verification, 30 groups of data for each volunteer are acquired at the chosen frequency, and each group contains only 21 sample data. Furthermore, a threshold-adaptive template matching (TATM) algorithm based on weighted Euclidean distance is proposed for rapid verification in this work. The results indicate that the chosen frequency for biometric verification is from 650 MHz to 750 MHz. The false acceptance rate (FAR) and false rejection rate (FRR) based on TATM are approximately 5.79% and 6.74%, respectively. In contrast, the FAR and FRR were 4.17% and 37.5%, 3.37% and 33.33%, and 3.80% and 34.17% using K-nearest neighbor (KNN) classification, support vector machines (SVM), and naive Bayesian method (NBM) classification, respectively. In addition, the running time of TATM is 0.019 s, whereas the running times of KNN, SVM and NBM are 0.310 s, 0.0385 s, and 0.168 s, respectively. Therefore, TATM is suggested to be appropriate for rapid verification use in wearable devices.
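
    A minimal sketch of the matching step is given below. The inverse-variance weights and the adaptive threshold rule (mean intra-class distance plus two standard deviations) are illustrative assumptions; the record specifies only that TATM uses a weighted Euclidean distance with a threshold adapted per user.

      import numpy as np

      def weighted_distance(x, template, weights):
          """Weighted Euclidean distance between a query and the enrolled template."""
          return float(np.sqrt(np.sum(weights * (np.asarray(x, dtype=float) - template) ** 2)))

      def enroll(samples):
          """Build a template from enrollment samples (rows = acquisitions, cols = the 21 S21 points)."""
          samples = np.asarray(samples, dtype=float)
          template = samples.mean(axis=0)
          weights = 1.0 / (samples.var(axis=0) + 1e-9)          # emphasize stable frequency points
          # Adaptive threshold from intra-class distances (assumed rule, not from the paper).
          dists = [weighted_distance(s, template, weights) for s in samples]
          threshold = float(np.mean(dists) + 2.0 * np.std(dists))
          return template, weights, threshold

      def verify(query, template, weights, threshold):
          """Accept the wearer when the weighted distance falls below the adaptive threshold."""
          return weighted_distance(query, template, weights) < threshold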

  10. 78 FR 23743 - Proposed Information Collection; Comment Request; Delivery Verification Procedure for Imports

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-04-22

    ..., received. This procedure increases the effectiveness of controls on the international trade of strategic... collection). Affected Public: Business or other for-profit organizations. Estimated Number of Respondents... Annual Cost to Public: $0. IV. Request for Comments Comments are invited on: (a) Whether the proposed...

  11. Modeling the Creep of Rib-Reinforced Composite Media Made from Nonlinear Hereditary Phase Materials 2. Verification of the Model

    NASA Astrophysics Data System (ADS)

    Yankovskii, A. P.

    2015-05-01

    An indirect verification of a structural model describing the creep of a composite medium reinforced by honeycombs and made of nonlinear hereditary phase materials obeying the Rabotnov theory of creep is presented. It is shown that the structural model proposed is trustworthy and can be used in practical calculations. For different kinds of loading, creep curves for a honeycomb core made of a D16T aluminum alloy are calculated.

  12. A Mechanism of Modeling and Verification for SaaS Customization Based on TLA

    NASA Astrophysics Data System (ADS)

    Luan, Shuai; Shi, Yuliang; Wang, Haiyang

    With the gradual maturation of SOA and the rapid development of the Internet, SaaS has become a popular software service mode. The customization actions of SaaS are usually subject to internal and external dependency relationships. This paper first introduces a method for modeling the customization process based on the Temporal Logic of Actions, and then proposes a verification algorithm to ensure that each step in customization will not cause unpredictable influence on the system and follows the related rules defined by the SaaS provider.

  13. Approaches to environmental verification of STS free-flier and pallet payloads

    NASA Technical Reports Server (NTRS)

    Keegan, W. B.

    1982-01-01

    This paper presents an overview of the environmental verification programs followed on an STS-launched free-flier payload, using the Tracking and Data Relay Satellite (TDRS) as an example, and a pallet payload, using the Office of Space Sciences-1 (OSS-1) as an example. Differences are assessed and rationale given as to why the differing programs were used on the two example payloads. It is concluded that the differences between the programs are due to inherent differences in the payload configuration, their respective mission performance objectives and their operational scenarios rather than to any generic distinctions that differentiate between a free-flier and a pallet payload.

  14. Managing Complexity in the MSL/Curiosity Entry, Descent, and Landing Flight Software and Avionics Verification and Validation Campaign

    NASA Technical Reports Server (NTRS)

    Stehura, Aaron; Rozek, Matthew

    2013-01-01

    The complexity of the Mars Science Laboratory (MSL) mission presented the Entry, Descent, and Landing systems engineering team with many challenges in its Verification and Validation (V&V) campaign. This paper describes some of the logistical hurdles related to managing a complex set of requirements, test venues, test objectives, and analysis products in the implementation of a specific portion of the overall V&V program to test the interaction of flight software with the MSL avionics suite. Application-specific solutions to these problems are presented herein, which can be generalized to other space missions and to similar formidable systems engineering problems.

  15. Assessing Requirements Quality through Requirements Coverage

    NASA Technical Reports Server (NTRS)

    Rajan, Ajitha; Heimdahl, Mats; Woodham, Kurt

    2008-01-01

    In model-based development, the development effort is centered around a formal description of the proposed software system, the model. This model is derived from some high-level requirements describing the expected behavior of the software. For validation and verification purposes, this model can then be subjected to various types of analysis, for example, completeness and consistency analysis [6], model checking [3], theorem proving [1], and test-case generation [4, 7]. This development paradigm is making rapid inroads in certain industries, e.g., automotive, avionics, space applications, and medical technology. This shift towards model-based development naturally leads to changes in the verification and validation (V&V) process. The model validation problem, that is, determining that the model accurately captures the customer's high-level requirements, has received little attention, and the sufficiency of the validation activities has been largely determined through ad-hoc methods. Since the model serves as the central artifact, its correctness with respect to the user's needs is absolutely crucial. In our investigation, we attempt to answer the following two questions with respect to validation: (1) Are the requirements sufficiently defined for the system? and (2) How well does the model implement the behaviors specified by the requirements? The second question can be addressed using formal verification. Nevertheless, the size and complexity of many industrial systems make formal verification infeasible even if we have a formal model and formalized requirements. Thus, presently, there is no objective way of answering these two questions. To this end, we propose an approach based on testing that, when given a set of formal requirements, explores the relationship between requirements-based structural test-adequacy coverage and model-based structural test-adequacy coverage. The proposed technique uses requirements coverage metrics defined in [9] on formal high-level software requirements and existing model coverage metrics such as the Modified Condition and Decision Coverage (MC/DC) used when testing highly critical software in the avionics industry [8]. Our work is related to Chockler et al. [2], but we base our work on traditional testing techniques as opposed to verification techniques.
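
    To make the coverage notion concrete, the sketch below checks whether a test suite achieves unique-cause MC/DC for a simple boolean decision: each condition must be shown to independently toggle the decision outcome while all other conditions are held fixed. The decision and the test vectors are hypothetical examples and are not taken from the paper or from references [8] and [9].

      from itertools import combinations

      def mcdc_satisfied(decision, tests, conditions):
          """Return the set of conditions demonstrated to independently affect `decision`.

          decision:   function taking a dict of condition name -> bool
          tests:      list of such dicts (the test suite)
          conditions: names of the conditions appearing in the decision
          """
          shown = set()
          for a, b in combinations(tests, 2):
              differing = [c for c in conditions if a[c] != b[c]]
              # Unique-cause MC/DC pair: exactly one condition differs and the outcome flips.
              if len(differing) == 1 and decision(a) != decision(b):
                  shown.add(differing[0])
          return shown

      # Hypothetical decision with three conditions.
      decision = lambda t: (t["A"] and t["B"]) or t["C"]
      tests = [
          {"A": True,  "B": True,  "C": False},
          {"A": False, "B": True,  "C": False},   # pairs with test 0 to show A
          {"A": True,  "B": False, "C": False},   # pairs with test 0 to show B
          {"A": True,  "B": False, "C": True},    # pairs with test 2 to show C
      ]
      covered = mcdc_satisfied(decision, tests, ["A", "B", "C"])
      print(covered == {"A", "B", "C"})   # True: MC/DC achieved for this decision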

  16. Accurate Biomass Estimation via Bayesian Adaptive Sampling

    NASA Technical Reports Server (NTRS)

    Wheeler, Kevin R.; Knuth, Kevin H.; Castle, Joseph P.; Lvov, Nikolay

    2005-01-01

    The following concepts were introduced: a) Bayesian adaptive sampling for solving biomass estimation; b) characterization of MISR Rahman model parameters conditioned upon MODIS landcover; c) a rigorous non-parametric Bayesian approach to analytic mixture model determination; and d) a unique U.S. asset for science product validation and verification.

  17. Wide-Field Lensing Mass Maps from Dark Energy Survey Science Verification Data

    DOE PAGES

    Chang, C.

    2015-07-29

    We present a mass map reconstructed from weak gravitational lensing shear measurements over 139 deg² from the Dark Energy Survey science verification data. The mass map probes both luminous and dark matter, thus providing a tool for studying cosmology. We also find good agreement between the mass map and the distribution of massive galaxy clusters identified using a red-sequence cluster finder. Potential candidates for superclusters and voids are identified using these maps. We measure the cross-correlation between the mass map and a magnitude-limited foreground galaxy sample and find a detection at the 6.8σ level with 20 arc min smoothing. These measurements are consistent with simulated galaxy catalogs based on N-body simulations from a cold dark matter model with a cosmological constant. This suggests low systematics uncertainties in the map. Finally, we summarize our key findings in this Letter; the detailed methodology and tests for systematics are presented in a companion paper.

  18. SOFIA pointing history

    NASA Astrophysics Data System (ADS)

    Kärcher, Hans J.; Kunz, Nans; Temi, Pasquale; Krabbe, Alfred; Wagner, Jörg; Süß, Martin

    2014-07-01

    The original pointing accuracy requirement of the Stratospheric Observatory for Infrared Astronomy (SOFIA) was defined at the beginning of the program in the late 1980s as a very challenging 0.2 arcsec rms. The early science flights of the observatory started in December 2010, and the observatory has in the meantime reached nearly 0.7 arcsec rms, which is sufficient for most of the SOFIA science instruments. NASA and DLR, the owners of SOFIA, are now planning a four-year program to bring the pointing down to the ultimate 0.2 arcsec rms. This may be the right time to recall the history of the pointing requirement, its verification, and the assessment of its achievability, from early computer models and wind tunnel tests, through later computer-aided end-to-end simulations, up to the first commissioning flights some years ago. The paper recollects the tools used in the different project phases for the verification of the pointing performance, explains the achievements, and may give hints for the planning of the upcoming final pointing improvement phase.

  19. Wide-Field Lensing Mass Maps from Dark Energy Survey Science Verification Data.

    PubMed

    Chang, C; Vikram, V; Jain, B; Bacon, D; Amara, A; Becker, M R; Bernstein, G; Bonnett, C; Bridle, S; Brout, D; Busha, M; Frieman, J; Gaztanaga, E; Hartley, W; Jarvis, M; Kacprzak, T; Kovács, A; Lahav, O; Lin, H; Melchior, P; Peiris, H; Rozo, E; Rykoff, E; Sánchez, C; Sheldon, E; Troxel, M A; Wechsler, R; Zuntz, J; Abbott, T; Abdalla, F B; Allam, S; Annis, J; Bauer, A H; Benoit-Lévy, A; Brooks, D; Buckley-Geer, E; Burke, D L; Capozzi, D; Carnero Rosell, A; Carrasco Kind, M; Castander, F J; Crocce, M; D'Andrea, C B; Desai, S; Diehl, H T; Dietrich, J P; Doel, P; Eifler, T F; Evrard, A E; Fausti Neto, A; Flaugher, B; Fosalba, P; Gruen, D; Gruendl, R A; Gutierrez, G; Honscheid, K; James, D; Kent, S; Kuehn, K; Kuropatkin, N; Maia, M A G; March, M; Martini, P; Merritt, K W; Miller, C J; Miquel, R; Neilsen, E; Nichol, R C; Ogando, R; Plazas, A A; Romer, A K; Roodman, A; Sako, M; Sanchez, E; Sevilla, I; Smith, R C; Soares-Santos, M; Sobreira, F; Suchyta, E; Tarle, G; Thaler, J; Thomas, D; Tucker, D; Walker, A R

    2015-07-31

    We present a mass map reconstructed from weak gravitational lensing shear measurements over 139 deg² from the Dark Energy Survey science verification data. The mass map probes both luminous and dark matter, thus providing a tool for studying cosmology. We find good agreement between the mass map and the distribution of massive galaxy clusters identified using a red-sequence cluster finder. Potential candidates for superclusters and voids are identified using these maps. We measure the cross-correlation between the mass map and a magnitude-limited foreground galaxy sample and find a detection at the 6.8σ level with 20 arc min smoothing. These measurements are consistent with simulated galaxy catalogs based on N-body simulations from a cold dark matter model with a cosmological constant. This suggests low systematics uncertainties in the map. We summarize our key findings in this Letter; the detailed methodology and tests for systematics are presented in a companion paper.

  20. Improved Hip-Based Individual Recognition Using Wearable Motion Recording Sensor

    NASA Astrophysics Data System (ADS)

    Gafurov, Davrondzhon; Bours, Patrick

    In today's society the demand for reliable verification of a user's identity is increasing. Although biometric technologies based on fingerprint or iris can provide accurate and reliable recognition performance, they are inconvenient for periodic or frequent re-verification. In this paper we propose a hip-based user recognition method which is suitable for implicit and periodic re-verification of identity. In our approach we use a wearable accelerometer sensor attached to the hip of the person, and the measured hip motion signal is then analysed for identity verification purposes. The main analysis steps consist of detecting gait cycles in the signal and matching two sets of detected gait cycles. Evaluating the approach on a hip data set consisting of 400 gait sequences (samples) from 100 subjects, we obtained an equal error rate (EER) of 7.5%, and the identification rate at rank 1 was 81.4%. These numbers are improvements of 37.5% and 11.2%, respectively, over the previous study using the same data set.
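
    The two analysis steps named in this record (gait cycle detection followed by cycle matching) can be sketched roughly as follows. Valley-based cycle segmentation, linear resampling to a fixed cycle length, and a plain Euclidean distance between averaged cycles are simplifications chosen for illustration; the paper's actual detection and matching rules may differ.

      import numpy as np
      from scipy.signal import find_peaks

      def detect_cycles(accel, min_samples_per_cycle=40):
          """Split an acceleration-magnitude signal into gait cycles at its local minima."""
          accel = np.asarray(accel, dtype=float)
          valleys, _ = find_peaks(-accel, distance=min_samples_per_cycle)
          return [accel[valleys[i]:valleys[i + 1]] for i in range(len(valleys) - 1)]

      def average_cycle(cycles, length=100):
          """Resample each detected cycle to a common length and average them."""
          resampled = [np.interp(np.linspace(0, 1, length),
                                 np.linspace(0, 1, len(c)), c) for c in cycles]
          return np.mean(resampled, axis=0)

      def match_score(probe_accel, enrolled_cycle):
          """Distance between the probe's averaged cycle and the enrolled one (smaller = more similar)."""
          probe_cycle = average_cycle(detect_cycles(probe_accel))
          return float(np.linalg.norm(probe_cycle - enrolled_cycle) / len(enrolled_cycle))

    Sweeping a decision threshold over such scores for genuine and impostor attempts yields the error-rate trade-off from which an EER like the one reported above is read off.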

  1. Life sciences laboratory breadboard simulations for shuttle

    NASA Technical Reports Server (NTRS)

    Taketa, S. T.; Simmonds, R. C.; Callahan, P. X.

    1975-01-01

    Breadboard simulations of life sciences laboratory concepts for conducting bioresearch in space were undertaken as part of the concept verification testing program. Breadboard simulations were conducted to test concepts of and scope problems associated with bioresearch support equipment and facility requirements and their operational integration for conducting manned research in earth orbital missions. It emphasized requirements, functions, and procedures for candidate research on crew members (simulated) and subhuman primates and on typical radioisotope studies in rats, a rooster, and plants.

  2. Temperature Modeling of Lost Creek Lake Using CE-QUAL-W2:A Report on the Development, Calibration, Verification, and Application of the Model

    DTIC Science & Technology

    2017-04-01

    Calendar year (January 1 through December 31) DO Dissolved oxygen ELWS Water surface elevation ERDC Engineer Research and Development Center ISS...Dorothy H. Tillman, and David L. Smith April 2017 Approved for public release; distribution is unlimited. The U.S. Army Engineer Research ...military engineering, geospatial sciences, water resources, and environmental sciences for the Army, the Department of Defense, civilian agencies

  3. Separating stages of arithmetic verification: An ERP study with a novel paradigm.

    PubMed

    Avancini, Chiara; Soltész, Fruzsina; Szűcs, Dénes

    2015-08-01

    In studies of arithmetic verification, participants typically encounter two operands and they carry out an operation on these (e.g. adding them). Operands are followed by a proposed answer and participants decide whether this answer is correct or incorrect. However, interpretation of results is difficult because multiple parallel, temporally overlapping numerical and non-numerical processes of the human brain may contribute to task execution. In order to overcome this problem here we used a novel paradigm specifically designed to tease apart the overlapping cognitive processes active during arithmetic verification. Specifically, we aimed to separate effects related to detection of arithmetic correctness, detection of the violation of strategic expectations, detection of physical stimulus properties mismatch and numerical magnitude comparison (numerical distance effects). Arithmetic correctness, physical stimulus properties and magnitude information were not task-relevant properties of the stimuli. We distinguished between a series of temporally highly overlapping cognitive processes which in turn elicited overlapping ERP effects with distinct scalp topographies. We suggest that arithmetic verification relies on two major temporal phases which include parallel running processes. Our paradigm offers a new method for investigating specific arithmetic verification processes in detail. Copyright © 2015 Elsevier Ltd. All rights reserved.

  4. A service-oriented architecture for integrating the modeling and formal verification of genetic regulatory networks

    PubMed Central

    2009-01-01

    Background: The study of biological networks has led to the development of increasingly large and detailed models. Computer tools are essential for the simulation of the dynamical behavior of the networks from the model. However, as the size of the models grows, it becomes infeasible to manually verify the predictions against experimental data or identify interesting features in a large number of simulation traces. Formal verification based on temporal logic and model checking provides promising methods to automate and scale the analysis of the models. However, a framework that tightly integrates modeling and simulation tools with model checkers is currently missing, on both the conceptual and the implementational level. Results: We have developed a generic and modular web service, based on a service-oriented architecture, for integrating the modeling and formal verification of genetic regulatory networks. The architecture has been implemented in the context of the qualitative modeling and simulation tool GNA and the model checkers NUSMV and CADP. GNA has been extended with a verification module for the specification and checking of biological properties. The verification module also allows the display and visual inspection of the verification results. Conclusions: The practical use of the proposed web service is illustrated by means of a scenario involving the analysis of a qualitative model of the carbon starvation response in E. coli. The service-oriented architecture allows modelers to define the model and proceed with the specification and formal verification of the biological properties by means of a unified graphical user interface. This guarantees a transparent access to formal verification technology for modelers of genetic regulatory networks. PMID:20042075

  5. THRIVE: threshold homomorphic encryption based secure and privacy preserving biometric verification system

    NASA Astrophysics Data System (ADS)

    Karabat, Cagatay; Kiraz, Mehmet Sabir; Erdogan, Hakan; Savas, Erkay

    2015-12-01

    In this paper, we introduce a new biometric verification and template protection system which we call THRIVE. The system includes novel enrollment and authentication protocols based on threshold homomorphic encryption where a private key is shared between a user and a verifier. In the THRIVE system, only encrypted binary biometric templates are stored in a database and verification is performed via homomorphically randomized templates, thus, original templates are never revealed during authentication. Due to the underlying threshold homomorphic encryption scheme, a malicious database owner cannot perform full decryption on encrypted templates of the users in the database. In addition, security of the THRIVE system is enhanced using a two-factor authentication scheme involving user's private key and biometric data. Using simulation-based techniques, the proposed system is proven secure in the malicious model. The proposed system is suitable for applications where the user does not want to reveal her biometrics to the verifier in plain form, but needs to prove her identity by using biometrics. The system can be used with any biometric modality where a feature extraction method yields a fixed size binary template and a query template is verified when its Hamming distance to the database template is less than a threshold. The overall connection time for the proposed THRIVE system is estimated to be 336 ms on average for 256-bit biometric templates on a desktop PC running with quad core 3.2 GHz CPUs at 10 Mbit/s up/down link connection speed. Consequently, the proposed system can be efficiently used in real-life applications.
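
    The plaintext matching rule that the protocol ultimately evaluates (a Hamming distance between fixed-size binary templates compared against a threshold) is sketched below. The threshold homomorphic encryption, the template randomization, and the enrollment/authentication protocols themselves are not reproduced; the template length and threshold are arbitrary example values.

      import numpy as np

      def hamming_distance(a, b):
          """Number of differing bits between two equal-length binary templates."""
          a, b = np.asarray(a, dtype=np.uint8), np.asarray(b, dtype=np.uint8)
          if a.shape != b.shape:
              raise ValueError("templates must have the same length")
          return int(np.count_nonzero(a != b))

      def verify(query_template, enrolled_template, threshold):
          """Accept when the Hamming distance is below the threshold (done under encryption in THRIVE)."""
          return hamming_distance(query_template, enrolled_template) < threshold

      # Hypothetical 256-bit templates.
      rng = np.random.default_rng(0)
      enrolled = rng.integers(0, 2, 256)
      noisy_copy = enrolled.copy()
      noisy_copy[:20] ^= 1                                 # flip 20 bits to mimic acquisition noise
      print(verify(noisy_copy, enrolled, threshold=64))    # True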

  6. DIESEL ENGINE RETROFIT TECHNOLOGY VERIFICATION (POSTER)

    EPA Science Inventory

    ETV is presenting a poster at the EPA's 2005 Science Forum from May 16-18, 2005 in Washington, DC. This poster will contain a summary of the performance results realized by the six verified diesel retrofit technologies, as well as potential impacts that could be realized if sigi...

  7. Materials integrity in microsystems: a framework for a petascale predictive-science-based multiscale modeling and simulation system

    NASA Astrophysics Data System (ADS)

    To, Albert C.; Liu, Wing Kam; Olson, Gregory B.; Belytschko, Ted; Chen, Wei; Shephard, Mark S.; Chung, Yip-Wah; Ghanem, Roger; Voorhees, Peter W.; Seidman, David N.; Wolverton, Chris; Chen, J. S.; Moran, Brian; Freeman, Arthur J.; Tian, Rong; Luo, Xiaojuan; Lautenschlager, Eric; Challoner, A. Dorian

    2008-09-01

    Microsystems have become an integral part of our lives and can be found in homeland security, medical science, aerospace applications and beyond. Many critical microsystem applications are in harsh environments, in which long-term reliability needs to be guaranteed and repair is not feasible. For example, gyroscope microsystems on satellites need to function for over 20 years under severe radiation, thermal cycling, and shock loading. Hence predictive-science-based, verified and validated computational models and algorithms to predict the performance and materials integrity of microsystems in these situations are needed. Confidence in these predictions is improved by quantifying uncertainties and approximation errors. With no full system testing and limited sub-system testing, petascale computing is certainly necessary to span both time and space scales and to reduce the uncertainty in the prediction of long-term reliability. This paper presents the necessary steps to develop a predictive-science-based multiscale modeling and simulation system. The development of this system will be focused on the prediction of the long-term performance of a gyroscope microsystem. The environmental effects to be considered include radiation, thermo-mechanical cycling and shock. Since there will be many material performance issues, attention is restricted to creep resulting from thermal aging and radiation-enhanced mass diffusion, material instability due to radiation and thermo-mechanical cycling and damage and fracture due to shock. To meet these challenges, we aim to develop an integrated multiscale software analysis system that spans the length scales from the atomistic scale to the scale of the device. The proposed software system will include molecular mechanics, phase field evolution, micromechanics and continuum mechanics software, and the state-of-the-art model identification strategies where atomistic properties are calibrated by quantum calculations. We aim to predict the long-term (in excess of 20 years) integrity of the resonator, electrode base, multilayer metallic bonding pads, and vacuum seals in a prescribed mission. Although multiscale simulations are efficient in the sense that they focus the most computationally intensive models and methods on only the portions of the space-time domain needed, the execution of the multiscale simulations associated with evaluating materials and device integrity for aerospace microsystems will require the application of petascale computing. A component-based software strategy will be used in the development of our massively parallel multiscale simulation system. This approach will allow us to take full advantage of existing single scale modeling components. An extensive, pervasive thrust in the software system development is verification, validation, and uncertainty quantification (UQ). Each component and the integrated software system need to be carefully verified. A UQ methodology that determines the quality of predictive information available from experimental measurements and packages the information in a form suitable for UQ at various scales needs to be developed. Experiments to validate the model at the nanoscale, microscale, and macroscale are proposed.
The development of a petascale predictive-science-based multiscale modeling and simulation system will advance the field of predictive multiscale science so that it can be used to reliably analyze problems of unprecedented complexity, where limited testing resources can be adequately replaced by petascale computational power, advanced verification, validation, and UQ methodologies.

  8. Improvement of a uniqueness-and-anonymity-preserving user authentication scheme for connected health care.

    PubMed

    Xie, Qi; Liu, Wenhao; Wang, Shengbao; Han, Lidong; Hu, Bin; Wu, Ting

    2014-09-01

    Preserving patient privacy, security, and mutual authentication between the patient and the medical server are important mechanisms in connected health care applications, such as telecare medical information systems and personally controlled health records systems. In 2013, Wen showed that Das et al.'s scheme is vulnerable to the replay attack, user impersonation attacks and off-line guessing attacks, and then proposed an improved scheme using biometrics, a password and a smart card to overcome these weaknesses. However, we show that Wen's scheme is still vulnerable to off-line password guessing attacks and does not provide user anonymity or perfect forward secrecy. Further, we propose an improved scheme to fix these weaknesses, and use the applied pi calculus based formal verification tool ProVerif to prove its security and authentication.

  9. Classification and Verification of Handwritten Signatures with Time Causal Information Theory Quantifiers

    PubMed Central

    Ospina, Raydonal; Frery, Alejandro C.

    2016-01-01

    We present a new approach for handwritten signature classification and verification based on descriptors stemming from time causal information theory. The proposal uses the Shannon entropy, the statistical complexity, and the Fisher information evaluated over the Bandt and Pompe symbolization of the horizontal and vertical coordinates of signatures. These six features are easy and fast to compute, and they are the input to a One-Class Support Vector Machine classifier. The results are better than state-of-the-art online techniques that employ higher-dimensional feature spaces which often require specialized software and hardware. We assess the consistency of our proposal with respect to the size of the training sample, and we also use it to classify the signatures into meaningful groups. PMID:27907014
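
    A minimal sketch of the Bandt and Pompe symbolization and the normalized Shannon entropy feature is shown below; the statistical complexity and Fisher information features and the One-Class SVM stage are omitted, and the sample coordinate sequence is hypothetical.

      import math
      from collections import Counter

      def permutation_entropy(series, order=3, delay=1):
          """Normalized Shannon entropy of Bandt-Pompe ordinal patterns of a 1-D series."""
          patterns = Counter()
          for i in range(len(series) - (order - 1) * delay):
              window = series[i:i + order * delay:delay]
              # Ordinal pattern: the ranking of the samples within the window.
              pattern = tuple(sorted(range(order), key=lambda k: window[k]))
              patterns[pattern] += 1
          total = sum(patterns.values())
          probs = [count / total for count in patterns.values()]
          entropy = -sum(p * math.log(p) for p in probs)
          return entropy / math.log(math.factorial(order))   # normalize to [0, 1]

      # Hypothetical pen-trajectory horizontal coordinates.
      x_coords = [0.0, 0.4, 0.9, 0.7, 1.2, 1.5, 1.1, 0.8, 1.3, 1.9]
      print(round(permutation_entropy(x_coords, order=3), 3))

    The same computation applied to the vertical coordinates, together with the complexity and Fisher-information counterparts, would yield the six-dimensional feature vector the record describes.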

  10. DeepMitosis: Mitosis detection via deep detection, verification and segmentation networks.

    PubMed

    Li, Chao; Wang, Xinggang; Liu, Wenyu; Latecki, Longin Jan

    2018-04-01

    Mitotic count is a critical predictor of tumor aggressiveness in the breast cancer diagnosis. Nowadays mitosis counting is mainly performed by pathologists manually, which is extremely arduous and time-consuming. In this paper, we propose an accurate method for detecting the mitotic cells from histopathological slides using a novel multi-stage deep learning framework. Our method consists of a deep segmentation network for generating mitosis region when only a weak label is given (i.e., only the centroid pixel of mitosis is annotated), an elaborately designed deep detection network for localizing mitosis by using contextual region information, and a deep verification network for improving detection accuracy by removing false positives. We validate the proposed deep learning method on two widely used Mitosis Detection in Breast Cancer Histological Images (MITOSIS) datasets. Experimental results show that we can achieve the highest F-score on the MITOSIS dataset from ICPR 2012 grand challenge merely using the deep detection network. For the ICPR 2014 MITOSIS dataset that only provides the centroid location of mitosis, we employ the segmentation model to estimate the bounding box annotation for training the deep detection network. We also apply the verification model to eliminate some false positives produced from the detection model. By fusing scores of the detection and verification models, we achieve the state-of-the-art results. Moreover, our method is very fast with GPU computing, which makes it feasible for clinical practice. Copyright © 2018 Elsevier B.V. All rights reserved.

  11. The Sedov Blast Wave as a Radial Piston Verification Test

    DOE PAGES

    Pederson, Clark; Brown, Bart; Morgan, Nathaniel

    2016-06-22

    The Sedov blast wave is of great utility as a verification problem for hydrodynamic methods. The typical implementation uses an energized cell of finite dimensions to represent the energy point source. We avoid this approximation by directly finding the effects of the energy source as a boundary condition (BC). Furthermore, the proposed method transforms the Sedov problem into an outward moving radial piston problem with a time-varying velocity. A portion of the mesh adjacent to the origin is removed and the boundaries of this hole are forced with the velocities from the Sedov solution. This verification test is implemented on two types of meshes, and convergence is shown. Our results from the typical initial condition (IC) method and the new BC method are compared.

  12. Alternative sample sizes for verification dose experiments and dose audits

    NASA Astrophysics Data System (ADS)

    Taylor, W. A.; Hansen, J. M.

    1999-01-01

    ISO 11137 (1995), "Sterilization of Health Care Products—Requirements for Validation and Routine Control—Radiation Sterilization", provides sampling plans for performing initial verification dose experiments and quarterly dose audits. Alternative sampling plans are presented which provide equivalent protection. These sampling plans can significantly reduce the cost of testing. These alternative sampling plans have been included in a draft ISO Technical Report (type 2). This paper examines the rationale behind the proposed alternative sampling plans. The protection provided by the current verification and audit sampling plans is first examined. Then methods for identifying equivalent plans are highlighted. Finally, methods for comparing the cost associated with the different plans are provided. This paper includes additional guidance for selecting between the original and alternative sampling plans not included in the technical report.

  13. Fingerprint Identification Using SIFT-Based Minutia Descriptors and Improved All Descriptor-Pair Matching

    PubMed Central

    Zhou, Ru; Zhong, Dexing; Han, Jiuqiang

    2013-01-01

    The performance of conventional minutiae-based fingerprint authentication algorithms degrades significantly when dealing with low quality fingerprints with lots of cuts or scratches. A similar degradation of the minutiae-based algorithms is observed when small overlapping areas appear because of the quite narrow width of the sensors. Based on the detection of minutiae, Scale Invariant Feature Transformation (SIFT) descriptors are employed to fulfill verification tasks in the above difficult scenarios. However, the original SIFT algorithm is not suitable for fingerprint because of: (1) the similar patterns of parallel ridges; and (2) high computational resource consumption. To enhance the efficiency and effectiveness of the algorithm for fingerprint verification, we propose a SIFT-based Minutia Descriptor (SMD) to improve the SIFT algorithm through image processing, descriptor extraction and matcher. A two-step fast matcher, named improved All Descriptor-Pair Matching (iADM), is also proposed to implement the 1:N verifications in real-time. Fingerprint Identification using SMD and iADM (FISiA) achieved a significant improvement with respect to accuracy in representative databases compared with the conventional minutiae-based method. The speed of FISiA also can meet real-time requirements. PMID:23467056

  14. Enhancing the Human Factors Engineering Role in an Austere Fiscal Environment

    NASA Technical Reports Server (NTRS)

    Stokes, Jack W.

    2003-01-01

    An austere fiscal environment in the aerospace community creates pressures to reduce program costs, often minimizing or sometimes even deleting the human interface requirements from the design process. With an assumption that the flight crew can recover in real time from a poorly human-factored space vehicle design, the classical crew interface requirements have either not been included in the design or not been properly funded, though carried as requirements. Cost cuts have also affected the quality of retained human factors engineering personnel. In response to this concern, planning is ongoing to correct these issues. Herein are techniques for ensuring that human interface requirements are integrated into a flight design, from proposal through verification and launch activation. This includes human factors requirements refinement and consolidation across flight programs; keyword phrases in the proposals; closer ties with systems engineering and other classical disciplines; early planning for crew-interface verification; and an Agency-integrated human factors verification program, under the One NASA theme. Importance is given to communication within the aerospace human factors discipline, and to utilizing the strengths of all government, industry, and academic human factors organizations in a unified research and engineering approach. A list of recommendations and concerns is provided in closing.

  15. How to Find a Bug in Ten Thousand Lines Transport Solver? Outline of Experiences from an Advection-Diffusion Code Verification

    NASA Astrophysics Data System (ADS)

    Zamani, K.; Bombardelli, F.

    2011-12-01

    Almost all natural phenomena on Earth are highly nonlinear. Even simplifications to the equations describing nature usually end up being nonlinear partial differential equations. The transport (ADR) equation is a pivotal equation in atmospheric sciences and water quality. This nonlinear equation needs to be solved numerically for practical purposes, so academicians and engineers rely heavily on the assistance of numerical codes. Thus, numerical codes require verification before they are utilized for multiple applications in science and engineering. Model verification is a mathematical procedure whereby a numerical code is checked to assure that the governing equation is properly solved as described in the design document. CFD verification is not a straightforward and well-defined course: only a complete test suite can uncover all the limitations and bugs, and results need to be assessed to distinguish between bug-induced defects and the innate limitations of a numerical scheme. As Roache (2009) said, numerical verification is a state-of-the-art procedure; sometimes novel tricks work out. This study conveys a synopsis of the experiences we gained during a comprehensive verification process carried out for a transport solver. A test suite was designed including unit tests and algorithmic tests, layered in complexity in several dimensions from simple to complex. Acceptance criteria were defined for the desirable capabilities of the transport code, such as order of accuracy, mass conservation, handling of stiff source terms, spurious oscillation, and initial shape preservation. At the beginning, a mesh convergence study, which is the main craft of verification, was performed. To that end, analytical solutions of the ADR equation were gathered, and a new solution was derived. In more general cases, the lack of an analytical solution can be overcome through Richardson extrapolation and manufactured solutions. Then, two bugs that had been concealed during the mesh convergence study were uncovered with the method of false injection and by visualization of the results. Symmetry played a dual role: one bug was hidden by the symmetric nature of a test (it was detected afterward using artificial false injection), while self-symmetry was used to design a new test in a case where the analytical solution of the ADR equation was unknown. Assisting subroutines were designed to check and post-process conservation of mass and oscillatory behavior. Finally, the capability of the solver was also checked for stiff reaction source terms. The above test suite was not only a decent tool for error detection but also provided thorough feedback on the ADR solver's limitations. Such information is the crux of any rigorous numerical modeling for a modeler who deals with surface/subsurface pollution transport.
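
    The mesh-convergence part of such a verification exercise reduces to estimating the observed order of accuracy from solutions on successively refined grids. The sketch below applies the standard three-grid Richardson estimate with a constant refinement ratio; the grid values are hypothetical, and the formula assumes the grids are in the asymptotic convergence range.

      import math

      def observed_order(f_coarse, f_medium, f_fine, refinement_ratio):
          """Observed order of accuracy from solutions on three grids refined by a constant ratio r.

          p = ln((f_coarse - f_medium) / (f_medium - f_fine)) / ln(r)
          No exact solution is needed; a code that claims second order should return p close to 2.
          """
          return math.log((f_coarse - f_medium) / (f_medium - f_fine)) / math.log(refinement_ratio)

      def richardson_extrapolate(f_medium, f_fine, refinement_ratio, p):
          """Extrapolated (grid-converged) estimate of the true value."""
          return f_fine + (f_fine - f_medium) / (refinement_ratio ** p - 1.0)

      # Hypothetical peak concentrations from an ADR solver on grids with spacing h, h/2, h/4.
      f_h, f_h2, f_h4 = 1.250, 1.190, 1.175
      p = observed_order(f_h, f_h2, f_h4, refinement_ratio=2.0)
      print(round(p, 2), round(richardson_extrapolate(f_h2, f_h4, 2.0, p), 4))   # 2.0 1.17

    Comparing the observed order against the scheme's formal order is one of the acceptance criteria mentioned above; a shortfall flags either a bug or an innate limitation of the scheme.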

  16. Advanced Risk Reduction Tool (ARRT) Special Case Study Report: Science and Engineering Technical Assessments (SETA) Program

    NASA Technical Reports Server (NTRS)

    Kirsch, Paul J.; Hayes, Jane; Zelinski, Lillian

    2000-01-01

    This special case study report presents the Science and Engineering Technical Assessments (SETA) team's findings from exploring the correlation between the underlying models of the Advanced Risk Reduction Tool (ARRT) and how it identifies, estimates, and integrates Independent Verification & Validation (IV&V) activities. The special case study was conducted under the provisions of SETA Contract Task Order (CTO) 15 and the approved technical approach documented in the CTO-15 Modification #1 Task Project Plan.

  17. Earth Sciences annual report, 1987

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Younker, L.W.; Donohue, M.L.; Peterson, S.J.

    1988-12-01

    The Earth Sciences Department at Lawrence Livermore National Laboratory conducts work in support of the Laboratory's energy, defense, and research programs. The Department is organized into ten groups. Five of these -- Nuclear Waste Management, Fossil Energy, Containment, Verification, and Research -- represent major programmatic activities within the Department. Five others -- Experimental Geophysics, Geomechanics, Geology/Geological Engineering, Geochemistry, and Seismology/Applied Geophysics -- are major disciplinary areas that support these and other laboratory programs. This report summarizes work carried out in 1987 by each group and contains a bibliography of their 1987 publications.

  18. KSC-02pd1947

    NASA Image and Video Library

    2002-12-17

    KENNEDY SPACE CENTER, FLA. -- Attached underneath the Orbital Sciences L-1011 aircraft is the Pegasus XL Expendable Launch Vehicle, which will be transported to the Multi-Payload Processing Facility for testing and verification. The Pegasus will undergo three flight simulations prior to its scheduled launch in late January 2003. The Pegasus XL will carry NASA's Solar Radiation and Climate Experiment (SORCE) into orbit. Built by Orbital Sciences Space Systems Group, SORCE will study and measure solar irradiance as a source of energy in the Earth's atmosphere.

  19. KSC-02pd1952

    NASA Image and Video Library

    2002-12-17

    KENNEDY SPACE CENTER, FLA. -- Attached underneath the Orbital Sciences L-1011 aircraft is the Pegasus XL Expendable Launch Vehicle, which will be transported to the Multi-Payload Processing Facility for testing and verification. The Pegasus will undergo three flight simulations prior to its scheduled launch in late January 2003. The Pegasus XL will carry NASA's Solar Radiation and Climate Experiment (SORCE) into orbit. Built by Orbital Sciences Space Systems Group, SORCE will study and measure solar irradiance as a source of energy in the Earth's atmosphere.

  20. 45 CFR 1705.5 - Disclosure of requested information to individuals.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... COMMISSION ON LIBRARIES AND INFORMATION SCIENCE PRIVACY REGULATIONS § 1705.5 Disclosure of requested information to individuals. Upon verification of identity, the System Manager shall disclose to the individual... 45 Public Welfare 4 2010-10-01 2010-10-01 false Disclosure of requested information to individuals...

  1. 45 CFR 1705.5 - Disclosure of requested information to individuals.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... COMMISSION ON LIBRARIES AND INFORMATION SCIENCE PRIVACY REGULATIONS § 1705.5 Disclosure of requested information to individuals. Upon verification of identity, the System Manager shall disclose to the individual... 45 Public Welfare 4 2011-10-01 2011-10-01 false Disclosure of requested information to individuals...

  2. 45 CFR 1705.5 - Disclosure of requested information to individuals.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... COMMISSION ON LIBRARIES AND INFORMATION SCIENCE PRIVACY REGULATIONS § 1705.5 Disclosure of requested information to individuals. Upon verification of identity, the System Manager shall disclose to the individual... 45 Public Welfare 4 2013-10-01 2013-10-01 false Disclosure of requested information to individuals...

  3. 45 CFR 1705.5 - Disclosure of requested information to individuals.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... COMMISSION ON LIBRARIES AND INFORMATION SCIENCE PRIVACY REGULATIONS § 1705.5 Disclosure of requested information to individuals. Upon verification of identity, the System Manager shall disclose to the individual... 45 Public Welfare 4 2012-10-01 2012-10-01 false Disclosure of requested information to individuals...

  4. 45 CFR 1705.5 - Disclosure of requested information to individuals.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... COMMISSION ON LIBRARIES AND INFORMATION SCIENCE PRIVACY REGULATIONS § 1705.5 Disclosure of requested information to individuals. Upon verification of identity, the System Manager shall disclose to the individual... 45 Public Welfare 4 2014-10-01 2014-10-01 false Disclosure of requested information to individuals...

  5. Threads of Mission Success

    NASA Technical Reports Server (NTRS)

    Gavin, Thomas R.

    2006-01-01

    This viewgraph presentation reviews the many parts of the JPL mission planning process that the project manager has to work with. Some of them are: NASA & JPL's institutional requirements, the mission systems design requirements, the science interactions, the technical interactions, financial requirements, verification and validation, safety and mission assurance, and independent assessment, review and reporting.

  6. Information Society: Agenda for Action in the UK.

    ERIC Educational Resources Information Center

    Phillips of Ellesmere, Lord

    1997-01-01

    Explains the House of Lords Select Committee on Science and Technology in the UK (United Kingdom) and discusses its report that addresses the need for information technology planning on a national basis. Topics include electronic publishing for access to government publications, universal access, regulatory framework, encryption and verification,…

  7. Cognitive Bias in Systems Verification

    NASA Technical Reports Server (NTRS)

    Larson, Steve

    2012-01-01

    Working definition of cognitive bias: patterns by which information is sought and interpreted that can lead to systematic errors in decisions. Cognitive bias is used in diverse fields: Economics, Politics, Intelligence, Marketing, to name a few. Attempts to ground cognitive science in physical characteristics of the cognitive apparatus exceed our knowledge. Studies are based on correlations; strict cause and effect is difficult to pinpoint. Effects cited in the paper and discussed here have been replicated many times over, and appear sound. Many biases have been described, but it is still unclear whether they are all distinct. There may only be a handful of fundamental biases, which manifest in various ways. Bias can affect system verification in many ways: Overconfidence -> Questionable decisions to deploy. Availability -> Inability to conceive critical tests. Representativeness -> Overinterpretation of results. Positive Test Strategies -> Confirmation bias. Debiasing at the individual level is very difficult. The potential effect of bias on the verification process can be managed, but not eliminated. It is worth considering at key points in the process.

  8. 36 CFR 218.8 - Filing an objection.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ...) Signature or other verification of authorship upon request (a scanned signature for electronic mail may be... related to the proposed project; if applicable, how the objector believes the environmental analysis or...

  9. 36 CFR 218.8 - Filing an objection.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ...) Signature or other verification of authorship upon request (a scanned signature for electronic mail may be... related to the proposed project; if applicable, how the objector believes the environmental analysis or...

  10. Efficient and Scalable Graph Similarity Joins in MapReduce

    PubMed Central

    Chen, Yifan; Zhang, Weiming; Tang, Jiuyang

    2014-01-01

    Along with the emergence of massive graph-modeled data, it is of great importance to investigate graph similarity joins due to their wide applications for multiple purposes, including data cleaning, and near duplicate detection. This paper considers graph similarity joins with edit distance constraints, which return pairs of graphs such that their edit distances are no larger than a given threshold. Leveraging the MapReduce programming model, we propose MGSJoin, a scalable algorithm following the filtering-verification framework for efficient graph similarity joins. It relies on counting overlapping graph signatures for filtering out nonpromising candidates. With the potential issue of too many key-value pairs in the filtering phase, spectral Bloom filters are introduced to reduce the number of key-value pairs. Furthermore, we integrate the multiway join strategy to boost the verification, where a MapReduce-based method is proposed for GED calculation. The superior efficiency and scalability of the proposed algorithms are demonstrated by extensive experimental results. PMID:25121135

  11. Efficient and scalable graph similarity joins in MapReduce.

    PubMed

    Chen, Yifan; Zhao, Xiang; Xiao, Chuan; Zhang, Weiming; Tang, Jiuyang

    2014-01-01

    Along with the emergence of massive graph-modeled data, it is of great importance to investigate graph similarity joins due to their wide applications for multiple purposes, including data cleaning, and near duplicate detection. This paper considers graph similarity joins with edit distance constraints, which return pairs of graphs such that their edit distances are no larger than a given threshold. Leveraging the MapReduce programming model, we propose MGSJoin, a scalable algorithm following the filtering-verification framework for efficient graph similarity joins. It relies on counting overlapping graph signatures for filtering out nonpromising candidates. With the potential issue of too many key-value pairs in the filtering phase, spectral Bloom filters are introduced to reduce the number of key-value pairs. Furthermore, we integrate the multiway join strategy to boost the verification, where a MapReduce-based method is proposed for GED calculation. The superior efficiency and scalability of the proposed algorithms are demonstrated by extensive experimental results.

  12. An improved authenticated key agreement protocol for telecare medicine information system.

    PubMed

    Liu, Wenhao; Xie, Qi; Wang, Shengbao; Hu, Bin

    2016-01-01

    In telecare medicine information systems (TMIS), identity authentication of patients plays an important role and has been widely studied in the research field. Generally, it is realized by an authenticated key agreement protocol, and many such protocols have been proposed in the literature. Recently, Zhang et al. pointed out that Islam et al.'s protocol suffers from the following security weaknesses: (1) any legal but malicious patient can reveal another user's identity; (2) an attacker can launch an off-line password guessing attack and an impersonation attack if the patient's identity is compromised. Zhang et al. also proposed an improved authenticated key agreement scheme with privacy protection for TMIS. However, in this paper, we point out that Zhang et al.'s scheme cannot resist the off-line password guessing attack, and it fails to provide revocation of lost/stolen smartcards. In order to overcome these weaknesses, we propose an improved protocol, the security and authentication of which can be proven using the applied pi calculus-based formal verification tool ProVerif.

  13. A lightweight and secure two factor anonymous authentication protocol for Global Mobility Networks.

    PubMed

    Baig, Ahmed Fraz; Hassan, Khwaja Mansoor Ul; Ghani, Anwar; Chaudhry, Shehzad Ashraf; Khan, Imran; Ashraf, Muhammad Usman

    2018-01-01

    Global Mobility Networks (GLOMONETs) in wireless communication permit global roaming services that enable a user to leverage mobile services in any foreign country. Technological growth in wireless communication is also accompanied by new security threats and challenges. A threat-proof authentication protocol in wireless communication may overcome these security flaws by allowing only legitimate users to access a particular service. Recently, Lee et al. found Mun et al.'s scheme vulnerable to different attacks and proposed an advanced secure scheme to overcome the security flaws. However, this article points out that Lee et al.'s scheme lacks user anonymity and local password verification, provides inefficient user authentication, and is vulnerable to replay and DoS attacks. Furthermore, this article presents a more robust anonymous authentication scheme to handle the threats and challenges found in Lee et al.'s protocol. The proposed protocol is formally verified with an automated tool (ProVerif). The proposed protocol has superior efficiency in comparison to the existing protocols.

  14. Experimental verification of long-term evolution radio transmissions over dual-polarization combined fiber and free-space optics optical infrastructures.

    PubMed

    Bohata, J; Zvanovec, S; Pesek, P; Korinek, T; Mansour Abadi, M; Ghassemlooy, Z

    2016-03-10

    This paper describes the experimental verification of the utilization of long-term evolution radio over fiber (RoF) and radio over free-space optics (RoFSO) systems using dual-polarization signals for cloud radio access network applications, and determines the specific utilization limits. A number of free-space optics configurations are proposed and investigated under different atmospheric turbulence regimes in order to recommend the best setup configuration. We show that the performance of the proposed link, based on the combination of RoF and RoFSO for 64 QAM at 2.6 GHz, is more affected by the turbulence, with a measured error vector magnitude difference of 5.5%. It is further demonstrated that the proposed systems can offer higher noise immunity under particular scenarios, with a signal-to-noise ratio reliability limit of 5 dB in the radio frequency domain for RoF and 19.3 dB in the optical domain for a combination of RoF and RoFSO links.
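
    As a small illustration of the error vector magnitude figure quoted above, here is a hedged sketch of the usual RMS-EVM computation on a synthetic 64-QAM constellation; the symbol grid, noise level, and normalisation convention are assumptions for illustration, not the paper's measurement setup.

        import numpy as np

        # RMS error vector magnitude (EVM) of a received constellation against its
        # ideal reference, normalised by the RMS power of the reference symbols.
        def evm_percent(received, reference):
            error_power = np.mean(np.abs(received - reference) ** 2)
            ref_power = np.mean(np.abs(reference) ** 2)
            return 100.0 * np.sqrt(error_power / ref_power)

        rng = np.random.default_rng(1)
        levels = np.arange(-7, 8, 2)                                   # 64-QAM grid per axis
        ref = rng.choice(levels, 10000) + 1j * rng.choice(levels, 10000)
        rx = ref + 0.4 * (rng.standard_normal(10000) + 1j * rng.standard_normal(10000))
        print(f"EVM = {evm_percent(rx, ref):.1f} %")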

  15. A lightweight and secure two factor anonymous authentication protocol for Global Mobility Networks

    PubMed Central

    2018-01-01

    Global Mobility Networks(GLOMONETs) in wireless communication permits the global roaming services that enable a user to leverage the mobile services in any foreign country. Technological growth in wireless communication is also accompanied by new security threats and challenges. A threat-proof authentication protocol in wireless communication may overcome the security flaws by allowing only legitimate users to access a particular service. Recently, Lee et al. found Mun et al. scheme vulnerable to different attacks and proposed an advanced secure scheme to overcome the security flaws. However, this article points out that Lee et al. scheme lacks user anonymity, inefficient user authentication, vulnerable to replay and DoS attacks and Lack of local password verification. Furthermore, this article presents a more robust anonymous authentication scheme to handle the threats and challenges found in Lee et al.’s protocol. The proposed protocol is formally verified with an automated tool(ProVerif). The proposed protocol has superior efficiency in comparison to the existing protocols. PMID:29702675

  16. Perceptual processing affects conceptual processing.

    PubMed

    Van Dantzig, Saskia; Pecher, Diane; Zeelenberg, René; Barsalou, Lawrence W

    2008-04-05

    According to the Perceptual Symbols Theory of cognition (Barsalou, 1999), modality-specific simulations underlie the representation of concepts. A strong prediction of this view is that perceptual processing affects conceptual processing. In this study, participants performed a perceptual detection task and a conceptual property-verification task in alternation. Responses on the property-verification task were slower for those trials that were preceded by a perceptual trial in a different modality than for those that were preceded by a perceptual trial in the same modality. This finding of a modality-switch effect across perceptual processing and conceptual processing supports the hypothesis that perceptual and conceptual representations are partially based on the same systems. 2008 Cognitive Science Society, Inc.

  17. Verification hybrid control of a wheeled mobile robot and manipulator

    NASA Astrophysics Data System (ADS)

    Muszynska, Magdalena; Burghardt, Andrzej; Kurc, Krzysztof; Szybicki, Dariusz

    2016-04-01

    In this article, innovative approaches to the tracking control of wheeled mobile robots and a manipulator are presented. The concepts include the application of neural-fuzzy systems to compensate for the controlled system's nonlinearities in the tracking control task. The proposed control algorithms work online, contain structures that adapt to the changing working conditions of the controlled systems, and do not require preliminary learning. The algorithms were verified on real objects: a Scorbot-ER 4pc robotic manipulator and a Pioneer 2DX mobile robot.

  18. Automated Verification of Specifications with Typestates and Access Permissions

    NASA Technical Reports Server (NTRS)

    Siminiceanu, Radu I.; Catano, Nestor

    2011-01-01

    We propose an approach to formally verify Plural specifications based on access permissions and typestates, by model-checking automatically generated abstract state-machines. Our exhaustive approach captures all the possible behaviors of abstract concurrent programs implementing the specification. We describe the formal methodology employed by our technique and provide an example as proof of concept for the state-machine construction rules. The implementation of a fully automated algorithm to generate and verify models, currently underway, provides model checking support for the Plural tool, which currently supports only program verification via data flow analysis (DFA).

  19. Off-line robot programming and graphical verification of path planning

    NASA Technical Reports Server (NTRS)

    Tonkay, Gregory L.

    1989-01-01

    The objective of this project was to develop or specify an integrated environment for off-line programming, graphical path verification, and debugging for robotic systems. Two alternatives were compared. The first was the integration of the ASEA Off-line Programming package with ROBSIM, a robotic simulation program. The second alternative was the purchase of the commercial product IGRIP. The needs of the RADL (Robotics Applications Development Laboratory) were explored and the alternatives were evaluated based on these needs. As a result, IGRIP was proposed as the best solution to the problem.

  20. Photometric redshift analysis in the Dark Energy Survey Science Verification data

    NASA Astrophysics Data System (ADS)

    Sánchez, C.; Carrasco Kind, M.; Lin, H.; Miquel, R.; Abdalla, F. B.; Amara, A.; Banerji, M.; Bonnett, C.; Brunner, R.; Capozzi, D.; Carnero, A.; Castander, F. J.; da Costa, L. A. N.; Cunha, C.; Fausti, A.; Gerdes, D.; Greisel, N.; Gschwend, J.; Hartley, W.; Jouvel, S.; Lahav, O.; Lima, M.; Maia, M. A. G.; Martí, P.; Ogando, R. L. C.; Ostrovski, F.; Pellegrini, P.; Rau, M. M.; Sadeh, I.; Seitz, S.; Sevilla-Noarbe, I.; Sypniewski, A.; de Vicente, J.; Abbot, T.; Allam, S. S.; Atlee, D.; Bernstein, G.; Bernstein, J. P.; Buckley-Geer, E.; Burke, D.; Childress, M. J.; Davis, T.; DePoy, D. L.; Dey, A.; Desai, S.; Diehl, H. T.; Doel, P.; Estrada, J.; Evrard, A.; Fernández, E.; Finley, D.; Flaugher, B.; Frieman, J.; Gaztanaga, E.; Glazebrook, K.; Honscheid, K.; Kim, A.; Kuehn, K.; Kuropatkin, N.; Lidman, C.; Makler, M.; Marshall, J. L.; Nichol, R. C.; Roodman, A.; Sánchez, E.; Santiago, B. X.; Sako, M.; Scalzo, R.; Smith, R. C.; Swanson, M. E. C.; Tarle, G.; Thomas, D.; Tucker, D. L.; Uddin, S. A.; Valdés, F.; Walker, A.; Yuan, F.; Zuntz, J.

    2014-12-01

    We present results from a study of the photometric redshift performance of the Dark Energy Survey (DES), using the early data from a Science Verification period of observations in late 2012 and early 2013 that provided science-quality images for almost 200 sq. deg. at the nominal depth of the survey. We assess the photometric redshift (photo-z) performance using about 15 000 galaxies with spectroscopic redshifts available from other surveys. These galaxies are used, in different configurations, as a calibration sample, and photo-z's are obtained and studied using most of the existing photo-z codes. A weighting method in a multidimensional colour-magnitude space is applied to the spectroscopic sample in order to evaluate the photo-z performance with sets that mimic the full DES photometric sample, which is on average significantly deeper than the calibration sample due to the limited depth of spectroscopic surveys. Empirical photo-z methods using, for instance, artificial neural networks or random forests, yield the best performance in the tests, achieving core photo-z resolutions σ68 ˜ 0.08. Moreover, the results from most of the codes, including template-fitting methods, comfortably meet the DES requirements on photo-z performance, therefore, providing an excellent precedent for future DES data sets.
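
    For readers unfamiliar with the σ68 figure of merit, the sketch below computes it in the usual way, as half the central 68% spread of the scaled residuals (z_phot − z_spec)/(1 + z_spec); the toy data are generated to have σ68 ≈ 0.08 by construction and are not DES measurements.

        import numpy as np

        # sigma_68: half the width of the central 68% interval of the scaled photo-z residuals.
        def sigma_68(z_phot, z_spec):
            dz = (z_phot - z_spec) / (1.0 + z_spec)
            lo, hi = np.percentile(dz, [15.865, 84.135])
            return 0.5 * (hi - lo)

        rng = np.random.default_rng(42)
        z_spec = rng.uniform(0.2, 1.2, 15000)
        z_phot = z_spec + 0.08 * (1 + z_spec) * rng.standard_normal(15000)
        print(f"sigma_68 = {sigma_68(z_phot, z_spec):.3f}")   # ~0.08 by construction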

  1. Photometric redshift analysis in the Dark Energy Survey Science Verification data

    DOE PAGES

    Sanchez, C.; Carrasco Kind, M.; Lin, H.; ...

    2014-10-09

    In this study, we present results from a study of the photometric redshift performance of the Dark Energy Survey (DES), using the early data from a Science Verification period of observations in late 2012 and early 2013 that provided science-quality images for almost 200 sq. deg. at the nominal depth of the survey. We assess the photometric redshift (photo-z) performance using about 15 000 galaxies with spectroscopic redshifts available from other surveys. These galaxies are used, in different configurations, as a calibration sample, and photo-z's are obtained and studied using most of the existing photo-z codes. A weighting method in a multidimensional colour–magnitude space is applied to the spectroscopic sample in order to evaluate the photo-z performance with sets that mimic the full DES photometric sample, which is on average significantly deeper than the calibration sample due to the limited depth of spectroscopic surveys. In addition, empirical photo-z methods using, for instance, artificial neural networks or random forests, yield the best performance in the tests, achieving core photo-z resolutions σ68 ~ 0.08. Moreover, the results from most of the codes, including template-fitting methods, comfortably meet the DES requirements on photo-z performance, therefore, providing an excellent precedent for future DES data sets.

  2. What can Citizen Science do for Ocean Science and Ocean Scientists?

    NASA Astrophysics Data System (ADS)

    Best, M.; Hoeberechts, M.; Mangin, A.; Oggioni, A.; Orcutt, J. A.; Parrish, J.; Pearlman, J.; Piera, J.; Tagliolato, P.

    2016-12-01

    The ocean represents over 70% of our planet's surface area and over 90% of its living space. Humans are not marine creatures, so we have fundamentally not built up knowledge of the ocean in the same way we have on land. The more we learn about the ocean, the more we understand that it is the regulatory engine of our planet. How do we catch up? Answers to this question will need to come from many quarters; a powerful and strategic option to complement existing observation programs and infrastructure is Citizen Science. There has been significant and relevant discussion of the importance of Citizen Science to citizens and stakeholders; the question too often missing is what the potential of citizen science is for scientists. The answers, for both scientists and society, are: spatial coverage, remote locations, temporal coverage, event response, early detection of harmful processes, sufficient data volume for statistical analysis and identification of outliers, integration of local knowledge, data access in exchange for analysis (e.g. with industry), and cost-effective monitoring systems. Citizens can be involved in instrument manufacture and maintenance, instrument deployment and sample collection, data collection and transmission, data analysis, data validation/verification, and proposals of new research topics. Such opportunities are balanced by concern on the part of scientists about the quality, consistency, and reliability of citizen observations and analyses. Experience working with citizen science groups continues to suggest that, with proper training and mentoring, these issues can be addressed while understanding both benefits and limitations. Implementation and maintenance of citizen science covers: how to recruit, engage, train, and retain Citizen Scientists; data systems for acquisition, assessment, access, analysis, and visualisation of distributed data sources; tools and methods for acquiring observations (simple instruments, smartphone apps, DIY instruments); community online platforms (websites, social networks, discussion forums); crowdsourcing tools (image acquisition, web and smartphone applications, surveys/questionnaires); and information, engagement, and training resources (webinars, public lectures, websites, public/museum displays).

  3. 4D ML reconstruction as a tool for volumetric PET-based treatment verification in ion beam radiotherapy

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    De Bernardi, E., E-mail: elisabetta.debernardi@unimib.it; Ricotti, R.; Riboldi, M.

    2016-02-15

    Purpose: An innovative strategy to improve the sensitivity of positron emission tomography (PET)-based treatment verification in ion beam radiotherapy is proposed. Methods: Low counting statistics PET images acquired during or shortly after the treatment (Measured PET) and a Monte Carlo estimate of the same PET images derived from the treatment plan (Expected PET) are considered as two frames of a 4D dataset. A 4D maximum likelihood reconstruction strategy was adapted to iteratively estimate the annihilation events distribution in a reference frame and the deformation motion fields that map it in the Expected PET and Measured PET frames. The outputs generated by the proposed strategy are as follows: (1) an estimate of the Measured PET with an image quality comparable to the Expected PET and (2) an estimate of the motion field mapping Expected PET to Measured PET. The details of the algorithm are presented and the strategy is preliminarily tested on analytically simulated datasets. Results: The algorithm demonstrates (1) robustness against noise, even in the worst conditions where 1.5 × 10^4 true coincidences and a random fraction of 73% are simulated; (2) a proper sensitivity to different kinds and grades of mismatches ranging between 1 and 10 mm; (3) robustness against bias due to incorrect washout modeling in the Monte Carlo simulation up to 1/3 of the original signal amplitude; and (4) an ability to describe the mismatch even in the presence of complex annihilation distributions such as those induced by two perpendicular superimposed ion fields. Conclusions: The promising results obtained in this work suggest the applicability of the method as a quantification tool for PET-based treatment verification in ion beam radiotherapy. An extensive assessment of the proposed strategy on real treatment verification data is planned.
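
    The abstract's 4D strategy builds on maximum-likelihood emission reconstruction; the sketch below shows only the standard ML-EM update on a toy system matrix, as an assumed building block, and does not attempt the joint estimation of motion fields described above.

        import numpy as np

        # Standard ML-EM update for emission tomography:
        # x_j <- x_j / sum_i A_ij * sum_i A_ij * y_i / (A x)_i
        def mlem(A, y, n_iter=50):
            """A: (n_detectors, n_voxels) system matrix, y: measured counts."""
            x = np.ones(A.shape[1])
            sens = A.sum(axis=0)                      # sensitivity image, sum_i A_ij
            for _ in range(n_iter):
                proj = A @ x
                ratio = y / np.maximum(proj, 1e-12)   # avoid division by zero
                x *= (A.T @ ratio) / np.maximum(sens, 1e-12)
            return x

        rng = np.random.default_rng(3)
        A = rng.uniform(0, 1, (40, 10))
        x_true = rng.uniform(0, 5, 10)
        y = rng.poisson(A @ x_true)                   # low-count measurement
        print(np.round(mlem(A, y), 2), np.round(x_true, 2))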

  4. A New Virtual and Remote Experimental Environment for Teaching and Learning Science

    NASA Astrophysics Data System (ADS)

    Lustigova, Zdena; Lustig, Frantisek

    This paper describes how a scientifically exact and problem-solving-oriented remote and virtual science experimental environment might help to build a new strategy for science education. The main features are: remote observation and control of real-world phenomena; their processing and evaluation; verification of hypotheses combined with the development of critical thinking, supported by sophisticated tools for relevant information search, classification and storage; and a collaborative environment supporting argumentative writing and teamwork, public presentations and defense of achieved results, all either in real presence, in telepresence, or in a combination of both. Only then can a real understanding of generalized science laws and their consequences be developed. This science learning and teaching environment (called ROL, Remote and Open Laboratory) has been developed and used by Charles University in Prague since 1996 and offered to science students in both formal and informal learning, as well as to science teachers within their professional development studies since 2003.

  5. 77 FR 21616 - Agency Information Collection Activities: Proposed Request and Comment Request

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-04-10

    ... disability payments. SSA considers the claimants the primary sources of verification; therefore, if claimants... or private self-insured companies administering WC/PDB benefits to disability claimants. Type of...

  6. Verification of relationship model between Korean new elderly class's recovery resilience and productive aging.

    PubMed

    Cho, Gun-Sang; Kim, Dae-Sung; Yi, Eun-Surk

    2015-12-01

    The purpose of this study is to verify the relationship model between the Korean new elderly class's recovery resilience and productive aging. As of 2013, this study sampled preliminary elderly people in Gyeonggi-do and other provinces nationwide. Data from a total of 484 valid subjects were analyzed. The collected data were processed using IBM SPSS 20.0 and AMOS 20.0, and underwent descriptive statistical analysis, confirmatory factor analysis, and structural model verification. The path coefficient associated with model fitness was examined. The standardized path coefficient between recovery resilience and productive aging is β=0.975 (t=14.790), revealing a statistically significant positive effect. Thus, it was found that the proposed basic model of the direct path from recovery resilience to productive aging showed acceptable fit.

  7. Verification of relationship model between Korean new elderly class’s recovery resilience and productive aging

    PubMed Central

    Cho, Gun-Sang; Kim, Dae-Sung; Yi, Eun-Surk

    2015-01-01

    The purpose of this study is to verify the relationship model between the Korean new elderly class's recovery resilience and productive aging. As of 2013, this study sampled preliminary elderly people in Gyeonggi-do and other provinces nationwide. Data from a total of 484 valid subjects were analyzed. The collected data were processed using IBM SPSS 20.0 and AMOS 20.0, and underwent descriptive statistical analysis, confirmatory factor analysis, and structural model verification. The path coefficient associated with model fitness was examined. The standardized path coefficient between recovery resilience and productive aging is β=0.975 (t=14.790), revealing a statistically significant positive effect. Thus, it was found that the proposed basic model of the direct path from recovery resilience to productive aging showed acceptable fit. PMID:26730383

  8. Abstraction and Assume-Guarantee Reasoning for Automated Software Verification

    NASA Technical Reports Server (NTRS)

    Chaki, S.; Clarke, E.; Giannakopoulou, D.; Pasareanu, C. S.

    2004-01-01

    Compositional verification and abstraction are the key techniques to address the state explosion problem associated with model checking of concurrent software. A promising compositional approach is to prove properties of a system by checking properties of its components in an assume-guarantee style. This article proposes a framework for performing abstraction and assume-guarantee reasoning of concurrent C code in an incremental and fully automated fashion. The framework uses predicate abstraction to extract and refine finite state models of software and it uses an automata learning algorithm to incrementally construct assumptions for the compositional verification of the abstract models. The framework can be instantiated with different assume-guarantee rules. We have implemented our approach in the COMFORT reasoning framework and we show how COMFORT out-performs several previous software model checking approaches when checking safety properties of non-trivial concurrent programs.

  9. Verification of the FtCayuga fault-tolerant microprocessor system. Volume 2: Formal specification and correctness theorems

    NASA Technical Reports Server (NTRS)

    Bickford, Mark; Srivas, Mandayam

    1991-01-01

    Presented here is a formal specification and verification of a property of a quadruplicately redundant fault-tolerant microprocessor system design. A complete listing of the formal specification of the system and the correctness theorems that are proved are given. The system performs the task of obtaining interactive consistency among the processors using a special instruction on the processors. The design is based on an algorithm proposed by Pease, Shostak, and Lamport. The property verified ensures that an execution of the special instruction by the processors correctly accomplishes interactive consistency, provided certain preconditions hold; the verification uses a computer-aided design verification tool, Spectool, and the theorem prover Clio. A major contribution of the work is the demonstration of a significant fault-tolerant hardware design that is mechanically verified by a theorem prover.
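
    The interactive-consistency property verified here goes back to the oral-messages algorithm of Pease, Shostak, and Lamport; the following sketch simulates OM(m) for a single faulty relay under a simple bit-flip fault model, purely to illustrate the agreement property, not the FtCayuga hardware or its Clio proof.

        from collections import Counter

        def om(commander, lieutenants, value, m, faulty):
            """Oral-messages algorithm OM(m).  Returns the value each lieutenant adopts.
            A faulty sender flips the bit it forwards (one simple fault model)."""
            def send(sender, receiver, v):
                return (1 - v) if sender in faulty else v

            if m == 0:
                return {l: send(commander, l, value) for l in lieutenants}

            # Step 1: commander sends its value to every lieutenant.
            received = {l: send(commander, l, value) for l in lieutenants}
            # Step 2: each lieutenant relays what it received via OM(m-1).
            relayed = {l: om(l, [x for x in lieutenants if x != l], received[l], m - 1, faulty)
                       for l in lieutenants}
            # Step 3: each lieutenant takes the majority of its own and the relayed values.
            decision = {}
            for l in lieutenants:
                votes = [received[l]] + [relayed[other][l] for other in lieutenants if other != l]
                decision[l] = Counter(votes).most_common(1)[0][0]
            return decision

        # Four nodes, one faulty relay: the loyal lieutenants still agree on the commander's value.
        print(om(commander=0, lieutenants=[1, 2, 3], value=1, m=1, faulty={2}))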

  10. Cryo-Vacuum Testing of the Integrated Science Instrument Module for the James Webb Space Telescope

    NASA Technical Reports Server (NTRS)

    Kimble, Randy A.; Davila, P. S.; Drury, M. P.; Glazer, S. D.; Krom, J. R.; Lundquist, R. A.; Mann, S. D.; McGuffey, D. B.; Perry, R. L.; Ramey, D. D.

    2011-01-01

    With delivery of the science instruments for the James Webb Space Telescope (JWST) to Goddard Space Flight Center (GSFC) expected in 2012, current plans call for the first cryo-vacuum test of the Integrated Science Instrument Module (ISIM) to be carried out at GSFC in early 2013. Plans are well underway for conducting this ambitious test, which will perform critical verifications of a number of optical, thermal, and operational requirements of the ISIM hardware, at its deep cryogenic operating temperature. We describe here the facilities, goals, methods, and timeline for this important Integration & Test milestone in the JWST program.

  11. Ames Research Center life sciences payload

    NASA Technical Reports Server (NTRS)

    Callahan, P. X.; Tremor, J. W.

    1982-01-01

    In response to a recognized need for an in-flight animal housing facility to support Spacelab life sciences investigators, a rack and system compatible Research Animal Holding Facility (RAHF) has been developed. A series of ground tests is planned to insure its satisfactory performance under certain simulated conditions of flight exposure and use. However, even under the best conditions of simulation, confidence gained in ground testing will not approach that resulting from actual spaceflight operation. The Spacelab Mission 3 provides an opportunity to perform an inflight Verification Test (VT) of the RAHF. Lessons learned from the RAHF-VT and baseline performance data will be invaluable in preparation for subsequent dedicated life sciences missions.

  12. Online pretreatment verification of high-dose rate brachytherapy using an imaging panel

    NASA Astrophysics Data System (ADS)

    Fonseca, Gabriel P.; Podesta, Mark; Bellezzo, Murillo; Van den Bosch, Michiel R.; Lutgens, Ludy; Vanneste, Ben G. L.; Voncken, Robert; Van Limbergen, Evert J.; Reniers, Brigitte; Verhaegen, Frank

    2017-07-01

    Brachytherapy is employed to treat a wide variety of cancers. However, an accurate treatment verification method is currently not available. This study describes a pre-treatment verification system that uses an imaging panel (IP) to verify important aspects of the treatment plan. A detailed modelling of the IP was only possible with an extensive calibration performed using a robotic arm. Irradiations were performed with a high dose rate (HDR) 192Ir source within a water phantom. An empirical fit was applied to measure the distance between the source and the detector so that 3D Cartesian coordinates of the dwell positions can be obtained using a single panel. The IP acquires images at 7.14 fps to verify the dwell times, dwell positions and air kerma strength (Sk). A gynecological applicator was used to create a treatment plan that was registered with a CT image of the water phantom used during the experiments for verification purposes. Errors (shifts, exchanged connections and wrong dwell times) were simulated to verify the proposed verification system. Cartesian source positions (panel measurement plane) have a standard deviation of about 0.02 cm. The measured distance between the source and the panel (z-coordinate) has a standard deviation of up to 0.16 cm and a maximum absolute error of ≈0.6 cm if the signal is close to the sensitivity limit of the panel. The average response of the panel is very linear with Sk. Therefore, Sk measurements can be performed with relatively small errors. The measured dwell times show a maximum error of 0.2 s, which is consistent with the acquisition rate of the panel. All simulated errors were clearly identified by the proposed system. The use of IPs is not common in brachytherapy; however, it provides considerable advantages. It was demonstrated that the IP can accurately measure Sk, dwell times and dwell positions.
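
    To make the dwell-time part of the verification concrete, here is a hedged sketch that thresholds the per-frame panel signal at the stated 7.14 fps and groups consecutive active frames into dwell intervals; the threshold and the synthetic frames are illustrative assumptions, not the authors' calibrated IP model.

        import numpy as np

        FPS = 7.14  # stated panel acquisition rate

        def dwell_times(frames, threshold):
            # A frame counts as "source out" when its summed signal exceeds the threshold;
            # runs of consecutive active frames become dwell intervals in seconds.
            active = np.array([f.sum() > threshold for f in frames])
            times, start = [], None
            for i, a in enumerate(active):
                if a and start is None:
                    start = i
                elif not a and start is not None:
                    times.append((i - start) / FPS)
                    start = None
            if start is not None:
                times.append((len(active) - start) / FPS)
            return times  # quantised to the 1/7.14 s frame period

        # Simulated panel: 20 quiet frames, 36 active frames (~5 s dwell), 20 quiet frames.
        frames = ([np.full((10, 10), 1.0)] * 20 +
                  [np.full((10, 10), 50.0)] * 36 +
                  [np.full((10, 10), 1.0)] * 20)
        print(dwell_times(frames, threshold=500.0))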

  13. Formal verification of software-based medical devices considering medical guidelines.

    PubMed

    Daw, Zamira; Cleaveland, Rance; Vetter, Marcus

    2014-01-01

    Software-based devices have increasingly become an important part of several clinical scenarios. Due to their critical impact on human life, medical devices have very strict safety requirements. It is therefore necessary to apply verification methods to ensure that the safety requirements are met. Verification of software-based devices is commonly limited to the verification of their internal elements without considering the interaction that these elements have with other devices as well as the application environment in which they are used. Medical guidelines define clinical procedures, which contain the necessary information to completely verify medical devices. The objective of this work was to incorporate medical guidelines into the verification process in order to increase the reliability of the software-based medical devices. Medical devices are developed using the model-driven method deterministic models for signal processing of embedded systems (DMOSES). This method uses unified modeling language (UML) models as a basis for the development of medical devices. The UML activity diagram is used to describe medical guidelines as workflows. The functionality of the medical devices is abstracted as a set of actions that is modeled within these workflows. In this paper, the UML models are verified using the UPPAAL model-checker. For this purpose, a formalization approach for the UML models using timed automaton (TA) is presented. A set of requirements is verified by the proposed approach for the navigation-guided biopsy. This shows the capability for identifying errors or optimization points both in the workflow and in the system design of the navigation device. In addition to the above, an open source eclipse plug-in was developed for the automated transformation of UML models into TA models that are automatically verified using UPPAAL. The proposed method enables developers to model medical devices and their clinical environment using clinical workflows as one UML diagram. Additionally, the system design can be formally verified automatically.

  14. Quality assurance of a gimbaled head swing verification using feature point tracking.

    PubMed

    Miura, Hideharu; Ozawa, Shuichi; Enosaki, Tsubasa; Kawakubo, Atsushi; Hosono, Fumika; Yamada, Kiyoshi; Nagata, Yasushi

    2017-01-01

    To perform dynamic tumor tracking (DTT) safely and accurately in clinical applications, verification of the gimbaled head swing is important. We propose a quantitative gimbaled head swing verification method for daily quality assurance (QA) that uses feature point tracking and a web camera. The web camera was placed on the couch at the same position for every gimbaled head swing verification, and the gimbaled head moved according to a predetermined input function (sinusoidal patterns; amplitude: ±20 mm; cycle: 3 s) in the pan and tilt directions at the isocenter plane. Two consecutive images were then analyzed for each feature point using the pyramidal Lucas-Kanade (LK) method, which is an optical flow estimation algorithm. We used a tapped hole as a feature point of the gimbaled head. The period and amplitude were analyzed to acquire a quantitative gimbaled head swing value for daily QA. The mean ± SD of the period was 3.00 ± 0.03 (range: 3.00-3.07) s and 3.00 ± 0.02 (range: 3.00-3.07) s in the pan and tilt directions, respectively. The mean ± SD of the relative displacement was 19.7 ± 0.08 (range: 19.6-19.8) mm and 18.9 ± 0.2 (range: 18.4-19.5) mm in the pan and tilt directions, respectively. The gimbaled head swing was reliable for DTT. We propose a quantitative gimbaled head swing verification method for daily QA using the feature point tracking method and a web camera. Our method can quantitatively assess the gimbaled head swing for daily QA against baseline values measured at the time of acceptance and commissioning. © 2016 The Authors. Journal of Applied Clinical Medical Physics published by Wiley Periodicals, Inc. on behalf of American Association of Physicists in Medicine.
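
    A minimal sketch of the measurement chain described above, assuming OpenCV's pyramidal Lucas-Kanade tracker, a hypothetical video file name, an assumed pixel-to-millimetre calibration, and a simple FFT read-out of the swing period; it is not the authors' QA software.

        import cv2
        import numpy as np

        cap = cv2.VideoCapture("head_swing.avi")        # hypothetical web-camera recording
        fps = cap.get(cv2.CAP_PROP_FPS) or 30.0
        mm_per_pixel = 0.5                               # assumed calibration at the isocenter plane

        ok, frame = cap.read()
        assert ok, "could not open head_swing.avi"
        prev_gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        # Pick one strong corner (e.g. the tapped hole on the gimbaled head) to track.
        pts = cv2.goodFeaturesToTrack(prev_gray, maxCorners=1, qualityLevel=0.3, minDistance=7)

        xs = []
        while True:
            ok, frame = cap.read()
            if not ok:
                break
            gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
            pts, status, _ = cv2.calcOpticalFlowPyrLK(prev_gray, gray, pts, None,
                                                      winSize=(21, 21), maxLevel=3)
            xs.append(float(pts[0, 0, 0]))               # pan-direction pixel coordinate
            prev_gray = gray
        cap.release()

        x = (np.array(xs) - np.mean(xs)) * mm_per_pixel
        amplitude = 0.5 * (x.max() - x.min())            # half the peak-to-peak swing, in mm
        freqs = np.fft.rfftfreq(len(x), d=1.0 / fps)     # period from the dominant FFT bin
        spectrum = np.abs(np.fft.rfft(x))
        period = 1.0 / freqs[1:][np.argmax(spectrum[1:])]
        print(f"amplitude = {amplitude:.1f} mm, period = {period:.2f} s")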

  15. Online pretreatment verification of high-dose rate brachytherapy using an imaging panel.

    PubMed

    Fonseca, Gabriel P; Podesta, Mark; Bellezzo, Murillo; Van den Bosch, Michiel R; Lutgens, Ludy; Vanneste, Ben G L; Voncken, Robert; Van Limbergen, Evert J; Reniers, Brigitte; Verhaegen, Frank

    2017-07-07

    Brachytherapy is employed to treat a wide variety of cancers. However, an accurate treatment verification method is currently not available. This study describes a pre-treatment verification system that uses an imaging panel (IP) to verify important aspects of the treatment plan. A detailed modelling of the IP was only possible with an extensive calibration performed using a robotic arm. Irradiations were performed with a high dose rate (HDR) 192Ir source within a water phantom. An empirical fit was applied to measure the distance between the source and the detector so that 3D Cartesian coordinates of the dwell positions can be obtained using a single panel. The IP acquires images at 7.14 fps to verify the dwell times, dwell positions and air kerma strength (Sk). A gynecological applicator was used to create a treatment plan that was registered with a CT image of the water phantom used during the experiments for verification purposes. Errors (shifts, exchanged connections and wrong dwell times) were simulated to verify the proposed verification system. Cartesian source positions (panel measurement plane) have a standard deviation of about 0.02 cm. The measured distance between the source and the panel (z-coordinate) has a standard deviation of up to 0.16 cm and a maximum absolute error of ≈0.6 cm if the signal is close to the sensitivity limit of the panel. The average response of the panel is very linear with Sk. Therefore, Sk measurements can be performed with relatively small errors. The measured dwell times show a maximum error of 0.2 s, which is consistent with the acquisition rate of the panel. All simulated errors were clearly identified by the proposed system. The use of IPs is not common in brachytherapy; however, it provides considerable advantages. It was demonstrated that the IP can accurately measure Sk, dwell times and dwell positions.

  16. Experimental verification of layout physical verification of silicon photonics

    NASA Astrophysics Data System (ADS)

    El Shamy, Raghi S.; Swillam, Mohamed A.

    2018-02-01

    Silicon photonics has been recognized as one of the best platforms for dense integration of photonic integrated circuits (PICs) due to the high refractive index contrast among its materials. Silicon on insulator (SOI) is a widespread photonics technology which supports a variety of devices for many applications. As the photonics market grows, the number of components in PICs increases, which increases the need for an automated physical verification (PV) process. This PV process assures reliable fabrication of the PICs, as it checks both the manufacturability and the reliability of the circuit. However, the PV process is challenging in the case of PICs, as it otherwise requires running exhaustive electromagnetic (EM) simulations. Our group has recently proposed empirical closed-form models for the directional coupler and the waveguide bends in the SOI technology. The models have shown very good agreement with both finite element method (FEM) and finite difference time domain (FDTD) solvers. These models save the large cost of 3D EM simulations and can be easily included in any electronic design automation (EDA) flow, as the equations' parameters can be easily extracted from the layout. In this paper we present experimental verification of our previously proposed models. SOI directional couplers with different dimensions have been fabricated using electron beam lithography and measured. The results from the measurements of the fabricated devices have been compared to the derived models and show very good agreement. The matching can reach 100% by calibrating a certain parameter in the model.

  17. DFM flow by using combination between design based metrology system and model based verification at sub-50nm memory device

    NASA Astrophysics Data System (ADS)

    Kim, Cheol-kyun; Kim, Jungchan; Choi, Jaeseung; Yang, Hyunjo; Yim, Donggyu; Kim, Jinwoong

    2007-03-01

    As the minimum transistor length shrinks, the variation and uniformity of transistor length seriously affect device performance. Thus, the importance of optical proximity correction (OPC) and resolution enhancement technology (RET) cannot be overemphasized. However, the OPC process is regarded by some as a necessary evil for device performance. In fact, every group involved, including process and design, is interested in the whole-chip CD variation trend and CD uniformity, which represent the real wafer. Recently, design-based metrology systems have become capable of detecting differences between the design database and wafer SEM images. Design-based metrology systems are able to extract information on whole-chip CD variation. According to the results, OPC abnormalities were identified and design feedback items were also disclosed. Other approaches are pursued by EDA companies, such as model-based OPC verification. Model-based verification is performed for the full chip area using a well-calibrated model. The objective of model-based verification is the prediction of potential weak points on the wafer and fast feedback to OPC and design before reticle fabrication. In order to achieve a robust design and sufficient device margin, an appropriate combination of a design-based metrology system and model-based verification tools is very important. Therefore, we evaluated a design-based metrology system and a matched model-based verification system to find the optimum combination of the two systems. In our study, a large amount of data from wafer results is classified and analyzed by statistical methods and sorted into OPC feedback and design feedback items. Additionally, a novel DFM flow is proposed that uses the combination of design-based metrology and model-based verification tools.

  18. Authoring and verification of clinical guidelines: a model driven approach.

    PubMed

    Pérez, Beatriz; Porres, Ivan

    2010-08-01

    The goal of this research is to provide a framework to enable authoring and verification of clinical guidelines. The framework is part of a larger research project aimed at improving the representation, quality and application of clinical guidelines in daily clinical practice. The verification process of a guideline is based on (1) model checking techniques to verify guidelines against semantic errors and inconsistencies in their definition, (2) combined with Model Driven Development (MDD) techniques, which enable us to automatically process manually created guideline specifications and temporal-logic statements to be checked and verified regarding these specifications, making the verification process faster and cost-effective. Particularly, we use UML statecharts to represent the dynamics of guidelines and, based on this manually defined guideline specifications, we use a MDD-based tool chain to automatically process them to generate the input model of a model checker. The model checker takes the resulted model together with the specific guideline requirements, and verifies whether the guideline fulfils such properties. The overall framework has been implemented as an Eclipse plug-in named GBDSSGenerator which, particularly, starting from the UML statechart representing a guideline, allows the verification of the guideline against specific requirements. Additionally, we have established a pattern-based approach for defining commonly occurring types of requirements in guidelines. We have successfully validated our overall approach by verifying properties in different clinical guidelines resulting in the detection of some inconsistencies in their definition. The proposed framework allows (1) the authoring and (2) the verification of clinical guidelines against specific requirements defined based on a set of property specification patterns, enabling non-experts to easily write formal specifications and thus easing the verification process. Copyright 2010 Elsevier Inc. All rights reserved.

  19. From Cookbook to Collaborative: Transforming a University Biology Laboratory Course

    ERIC Educational Resources Information Center

    Herron, Sherry S.

    2009-01-01

    As described in "How People Learn," "Developing Biological Literacy," and by the Commission on Undergraduate Education in the Biological Sciences during the 1960s and early 1970s, laboratories should promote guided-inquiries or investigations, and not simply consist of cookbook or verification activities. However, the only word that could describe…

  20. A Software Hub for High Assurance Model-Driven Development and Analysis

    DTIC Science & Technology

    2007-01-23

    verification of UML models in TLPVS. In Thomas Baar, Alfred Strohmeier, Ana Moreira, and Stephen J. Mellor, editors, UML 2004 - The Unified Modeling...volume 3785 of Lecture Notes in Computer Science, pages 52–65, Manchester, UK, Nov 2005. Springer. [GH04] Günter Graw and Peter Herrmann. Transformation

  1. WOSMIP II- Workshop on Signatures of Medical and Industrial Isotope Production

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Matthews, Murray; Achim, Pascal; Auer, M.

    2011-11-01

    Medical and industrial radioisotopes are fundamental tools used in science, medicine and industry, with ever expanding usage in medical practice where their availability is vital. Very sensitive environmental radionuclide monitoring networks have been developed for nuclear-security-related monitoring [particularly Comprehensive Test-Ban Treaty (CTBT) compliance verification] and are now operational.

  2. Commissioning and Science Verification of JAST/T80

    NASA Astrophysics Data System (ADS)

    Ederoclte, A.; Cenarro, A. J.; Marín-Franch, A.; Cristóbal-Hornillos, D.; Vázquez Ramió, H.; Varela, J.; Hurier, G.; Moles, M.; Lamadrid, J. L.; Díaz-Martín, M. C.; Iglesias Marzoa, R.; Tilve, V.; Rodríguez, S.; Maícas, N.; Abri, J.

    2017-03-01

    Located at the Observatorio Astrofísico de Javalambre, the "Javalambre Auxiliary Survey Telescope" is an 80 cm telescope with an unvignetted field of view of 2 square degrees. The telescope is equipped with T80Cam, a camera with a large-format CCD and two filter wheels which can host 12 filters at any given time. The telescope has been designed to provide optical quality across the whole field of view, which is achieved with a field corrector. In this talk, I will review the commissioning of the telescope. The optical performance at the centre of the field of view has been tested with the lucky imaging technique, providing a telescope PSF of 0.4'', which is close to the one expected from theory. Moreover, the tracking of the telescope does not affect the image quality, as stars appear round even in exposures of 10 minutes obtained without guiding. Most importantly, we present the preliminary results of science verification observations which combine the two main characteristics of this telescope: the large field of view and the special filter set.

  3. Motion Planning of Two Stacker Cranes in a Large-Scale Automated Storage/Retrieval System

    NASA Astrophysics Data System (ADS)

    Kung, Yiheng; Kobayashi, Yoshimasa; Higashi, Toshimitsu; Ota, Jun

    We propose a method for reducing the computational time of motion planning for stacker cranes. Most automated storage/retrieval systems (AS/RSs) are equipped with only one stacker crane. However, greater work efficiency is required in warehouses, for example by using two stacker cranes. In this paper, a warehouse with two stacker cranes working simultaneously is considered. Unlike warehouses with only one crane, trajectory planning with two cranes is very difficult: since the two cranes work together, a proper trajectory must be planned to avoid collisions, and verifying collisions is complicated and requires a considerable amount of computational time. As transport tasks in AS/RSs arrive randomly, motion planning cannot be conducted in advance, so planning an appropriate trajectory within a restricted duration is a difficult task. We therefore address the problem of motion planning requiring extensive calculation time. As a solution, we propose a "free-step" to simplify the procedure of collision verification and reduce the computational time. In addition, we propose a method to reschedule the order of collision verification in order to find an appropriate trajectory in less time. With the proposed method, the calculation time is reduced to less than 1/300 of that achieved in previous research.
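
    As a toy illustration of why collision verification dominates the planning time, the sketch below samples two piecewise-linear crane trajectories on a shared travel rail and reports the first instant their clearance is violated; the clearance value and trajectories are assumptions, and the paper's "free-step" simplification and rescheduling are not reproduced.

        import numpy as np

        def position(traj, t):
            # Piecewise-linear interpolation of (time_s, position_m) waypoints.
            times, xs = zip(*traj)
            return float(np.interp(t, times, xs))

        def collides(traj_a, traj_b, clearance=1.5, dt=0.1):
            # Time-sampled check: collision if the cranes ever get closer than `clearance`.
            t_end = max(traj_a[-1][0], traj_b[-1][0])
            for t in np.arange(0.0, t_end + dt, dt):
                if abs(position(traj_a, t) - position(traj_b, t)) < clearance:
                    return True, round(t, 2)
            return False, None

        crane_a = [(0, 0.0), (10, 20.0), (20, 20.0)]    # travels out to the 20 m bay, then dwells
        crane_b = [(0, 30.0), (12, 15.0), (20, 15.0)]   # travels inward toward the 15 m bay
        print(collides(crane_a, crane_b))               # -> (True, ~8.8): the paths cross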

  4. Fuzzy method of recognition of high molecular substances in evidence-based biology

    NASA Astrophysics Data System (ADS)

    Olevskyi, V. I.; Smetanin, V. T.; Olevska, Yu. B.

    2017-10-01

    Nowadays, requirements for achieving reliable results along with high-quality research put mathematical methods for analyzing results at the forefront. Because of this, evidence-based methods of processing experimental data have become increasingly popular in the biological sciences and medicine. Their basis is meta-analysis, a method of quantitative generalization of a large number of randomized trials addressing the same specific problem, which are often contradictory and performed by different authors. It allows identifying the most important trends and quantitative indicators of the data, verification of advanced hypotheses, and discovery of new effects in the population genotype. The existing methods of recognizing high-molecular substances by gel electrophoresis of proteins under denaturing conditions are based on approximate methods for comparing the contrast of electrophoregrams with a standard solution of known substances. We propose a fuzzy method for modeling experimental data to increase the accuracy and validity of findings in the detection of new proteins.

  5. Verification of causal influences of reasoning skills and epistemology on physics conceptual learning

    NASA Astrophysics Data System (ADS)

    Ding, Lin

    2014-12-01

    This study seeks to test the causal influences of reasoning skills and epistemologies on student conceptual learning in physics. A causal model, integrating multiple variables that were investigated separately in the prior literature, is proposed and tested through path analysis. These variables include student preinstructional reasoning skills measured by the Classroom Test of Scientific Reasoning, pre- and postepistemological views measured by the Colorado Learning Attitudes about Science Survey, and pre- and postperformance on Newtonian concepts measured by the Force Concept Inventory. Students from a traditionally taught calculus-based introductory mechanics course at a research university participated in the study. Results largely support the postulated causal model and reveal strong influences of reasoning skills and preinstructional epistemology on student conceptual learning gains. Interestingly enough, postinstructional epistemology does not appear to have a significant influence on student learning gains. Moreover, pre- and postinstructional epistemology, although barely different from each other on average, have little causal connection between them.

  6. What happens when the ice melts? Belugas, contaminants, ecosystems and human communities in the complexity of global change.

    PubMed

    Lennert, Ann Eileen

    2016-06-15

    In general, it is important to examine the whole spectrum of interrelated fields when trying to comprehend pollution, climate change or the environment, because some of their interrelations are expected and others are not. This study aims at comparatively examining different but interrelated ways of acquiring and communicating information on environmental changes, focusing on pollution in the Arctic, in particular Greenland. In the context of climate change, it discusses how heavily polluted and stressed Arctic marine ecosystems may be affected when the ice melts. Bridging cultures of knowledge, this study argues that traditional knowledge, together with natural science and studies of contaminants in Arctic marine ecosystems, can indicate behavioural factors and elements acting as additional stressors on animals and on the communities relying on them. Furthermore, it explains the role of scientific engagement with local communities, not only in the identification and verification of stressors, enhancing our understanding of them, but also in proposing solutions to related problems. Copyright © 2016 Elsevier Ltd. All rights reserved.

  7. Consortium for Verification Technology Fellowship Report.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sadler, Lorraine E.

    2017-06-01

    As one recipient of the Consortium for Verification Technology (CVT) Fellowship, I spent eight days as a visiting scientist at the University of Michigan, Department of Nuclear Engineering and Radiological Sciences (NERS). During this time, I participated in multiple department and research group meetings and presentations, met with individual faculty and students, toured multiple laboratories, and taught one-half of a one-unit class on Risk Analysis in Nuclear Arms Control (six 1.5 hour lectures). The following report describes some of the interactions that I had during my time as well as a brief discussion of the impact of this fellowship on members of the consortium and on me and my laboratory's technical knowledge and network.

  8. Lessons Learned from Optical Payload for Lasercomm Science (OPALS) Mission Operations

    NASA Technical Reports Server (NTRS)

    Sindiy, Oleg V.; Abrahamson, Matthew J.; Biswas, Abhijit; Wright, Malcolm W.; Padams, Jordan H.; Konyha, Alexander L.

    2015-01-01

    This paper provides an overview of Optical Payload for Lasercomm Science (OPALS) activities and lessons learned during mission operations. Activities described cover the periods of commissioning, prime, and extended mission operations, during which primary and secondary mission objectives were achieved for demonstrating space-to-ground optical communications. Lessons learned cover Mission Operations System topics in areas of: architecture verification and validation, staffing, mission support area, workstations, workstation tools, interfaces with support services, supporting ground stations, team training, procedures, flight software upgrades, post-processing tools, and public outreach.

  9. Structural Safety of a Hubble Space Telescope Science Instrument

    NASA Technical Reports Server (NTRS)

    Lou, M. C.; Brent, D. N.

    1993-01-01

    This paper gives an overview of safety requirements related to structural design and verification of payloads to be launched and/or retrieved by the Space Shuttle. To demonstrate the general approach used to implement these requirements in the development of a typical Shuttle payload, the Wide Field/Planetary Camera II, a second-generation science instrument currently being developed by the Jet Propulsion Laboratory (JPL) for the Hubble Space Telescope, is used as an example. In addition to verification of strength and dynamic characteristics, special emphasis is placed upon the fracture control implementation process, including parts classification and fracture control acceptability.

  10. KSC-02pd1946

    NASA Image and Video Library

    2002-12-17

    KENNEDY SPACE CENTER, FLA. - An Orbital Sciences L-1011 aircraft arrives at the Cape Canaveral Air Force Station Skid Strip. Attached underneath the aircraft is the Pegasus XL Expendable Launch Vehicle, which will be transported to the Multi-Payload Processing Facility for testing and verification. The Pegasus will undergo three flight simulations prior to its scheduled launch in late January 2003. The Pegasus XL will carry NASA's Solar Radiation and Climate Experiment (SORCE) into orbit. Built by Orbital Sciences Space Systems Group, SORCE will study and measure solar irradiance as a source of energy in the Earth's atmosphere.

  11. KSC-02pd1951

    NASA Image and Video Library

    2002-12-17

    KENNEDY SPACE CENTER, FLA. -- Workers at the Cape Canaveral Air Force Station Skid Strip stand next to the Pegasus XL Expendable Launch Vehicle underneath the Orbital Sciences L-1011 aircraft. The Pegasus will be transported to the Multi-Payload Processing Facility for testing and verification. The Pegasus will undergo three flight simulations prior to its scheduled launch in late January 2003. The Pegasus XL will carry NASA's Solar Radiation and Climate Experiment (SORCE) into orbit. Built by Orbital Sciences Space Systems Group, SORCE will study and measure solar irradiance as a source of energy in the Earth's atmosphere.

  12. A Mode-Shape-Based Fault Detection Methodology for Cantilever Beams

    NASA Technical Reports Server (NTRS)

    Tejada, Arturo

    2009-01-01

    An important goal of NASA's Integrated Vehicle Health Management (IVHM) program is to develop and verify methods and technologies for fault detection in critical airframe structures. A particularly promising new technology under development at NASA Langley Research Center is distributed Bragg fiber-optic strain sensors. These sensors can be embedded in, for instance, aircraft wings to continuously monitor surface strain during flight. Strain information can then be used in conjunction with well-known vibrational techniques to detect faults due to changes in the wing's physical parameters or to the presence of incipient cracks. To verify the benefits of this technology, the Formal Methods Group at NASA LaRC has proposed the use of formal verification tools such as PVS. The verification process, however, requires knowledge of the physics and mathematics of the vibrational techniques and a clear understanding of the particular fault detection methodology. This report presents a succinct review of the physical principles behind the modeling of vibrating structures such as cantilever beams (the natural model of a wing). It also reviews two different classes of fault detection techniques and proposes a particular detection method for cracks in wings, which is amenable to formal verification. A prototype implementation of these methods using Matlab scripts is also described and related to the fundamental theoretical concepts.
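
    As a worked illustration of the vibrational idea (not the report's Matlab scripts or the PVS formalisation), the sketch below computes the first three natural frequencies of a uniform Euler-Bernoulli cantilever and flags the frequency shift caused by a crudely modelled stiffness loss; the material properties, the 10% stiffness reduction, and the 2% alarm threshold are assumptions.

        import numpy as np

        # Roots of cos(bL)*cosh(bL) = -1 for the first three cantilever modes.
        BETA_L = np.array([1.8751, 4.6941, 7.8548])

        def natural_frequencies_hz(E, I, rho, A, L):
            # f_n = (beta_n L)^2 / (2*pi*L^2) * sqrt(E*I / (rho*A))
            return (BETA_L ** 2) / (2 * np.pi * L ** 2) * np.sqrt(E * I / (rho * A))

        # Assumed aluminium strip: 0.5 m long, 30 mm x 3 mm cross-section.
        E, rho, L = 70e9, 2700.0, 0.5
        b, h = 0.03, 0.003
        A, I = b * h, b * h ** 3 / 12

        f_healthy = natural_frequencies_hz(E, I, rho, A, L)
        f_cracked = natural_frequencies_hz(E, 0.9 * I, rho, A, L)   # crude 10% stiffness loss

        shift = (f_healthy - f_cracked) / f_healthy * 100
        for n, (fh, s) in enumerate(zip(f_healthy, shift), start=1):
            flag = "FAULT" if s > 2.0 else "ok"                     # assumed 2% alarm threshold
            print(f"mode {n}: {fh:7.1f} Hz healthy, shift {s:4.1f} %  -> {flag}")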

  13. FIR signature verification system characterizing dynamics of handwriting features

    NASA Astrophysics Data System (ADS)

    Thumwarin, Pitak; Pernwong, Jitawat; Matsuura, Takenobu

    2013-12-01

    This paper proposes an online signature verification method based on the finite impulse response (FIR) system characterizing time-frequency characteristics of dynamic handwriting features. First, the barycenter determined from both the center point of signature and two adjacent pen-point positions in the signing process, instead of one pen-point position, is used to reduce the fluctuation of handwriting motion. In this paper, among the available dynamic handwriting features, motion pressure and area pressure are employed to investigate handwriting behavior. Thus, the stable dynamic handwriting features can be described by the relation of the time-frequency characteristics of the dynamic handwriting features. In this study, the aforesaid relation can be represented by the FIR system with the wavelet coefficients of the dynamic handwriting features as both input and output of the system. The impulse response of the FIR system is used as the individual feature for a particular signature. In short, the signature can be verified by evaluating the difference between the impulse responses of the FIR systems for a reference signature and the signature to be verified. The signature verification experiments in this paper were conducted using the SUBCORPUS MCYT-100 signature database consisting of 5,000 signatures from 100 signers. The proposed method yielded equal error rate (EER) of 3.21% on skilled forgeries.
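
    A hedged sketch of the core idea of using an estimated FIR impulse response as the signature feature: solve a least-squares deconvolution of an output sequence against an input sequence and compare the resulting coefficients; random stand-ins replace the wavelet coefficients of the pressure features, and the SUBCORPUS MCYT-100 evaluation is not reproduced.

        import numpy as np

        def estimate_fir(x, y, order):
            # Build the convolution regressor matrix X so that y ~ X @ h, then solve
            # for the impulse response h by least squares.
            N = len(x)
            X = np.zeros((N, order))
            for k in range(order):
                X[k:, k] = x[:N - k]
            h, *_ = np.linalg.lstsq(X, y[:N], rcond=None)
            return h

        rng = np.random.default_rng(0)
        x = rng.standard_normal(256)                  # stand-in for wavelet coefficients of one feature
        h_true = np.array([0.6, 0.3, -0.2, 0.1])
        y = np.convolve(x, h_true)[:256] + 0.01 * rng.standard_normal(256)

        h_est = estimate_fir(x, y, order=4)
        # Verification would compare h_est against a reference signature's response,
        # e.g. with a distance measure and a decision threshold.
        print(np.round(h_est, 3), np.linalg.norm(h_est - h_true))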

  14. Probabilistic Elastic Part Model: A Pose-Invariant Representation for Real-World Face Verification.

    PubMed

    Li, Haoxiang; Hua, Gang

    2018-04-01

    Pose variation remains a major challenge for real-world face recognition. We approach this problem through a probabilistic elastic part model. We extract local descriptors (e.g., LBP or SIFT) from densely sampled multi-scale image patches. By augmenting each descriptor with its location, a Gaussian mixture model (GMM) is trained to capture the spatial-appearance distribution of the face parts of all face images in the training corpus, namely the probabilistic elastic part (PEP) model. Each mixture component of the GMM is confined to be a spherical Gaussian to balance the influence of the appearance and the location terms, which naturally defines a part. Given one or multiple face images of the same subject, the PEP model builds its PEP representation by sequentially concatenating descriptors identified by each Gaussian component in a maximum likelihood sense. We further propose a joint Bayesian adaptation algorithm to adapt the universally trained GMM to better model the pose variations between the target pair of faces/face tracks, which consistently improves face verification accuracy. Our experiments show that we achieve state-of-the-art face verification accuracy with the proposed representations on the Labeled Faces in the Wild (LFW) dataset, the YouTube video face database, and the CMU MultiPIE dataset.
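
    A compact sketch of the PEP-style representation described above, using scikit-learn's spherical GMM on synthetic location-augmented descriptors; the dimensions, part count, and random data are illustrative only, not the paper's settings.

        import numpy as np
        from sklearn.mixture import GaussianMixture

        def weighted_log_prob(gmm, X):
            """log[w_k * N(x; mu_k, sigma_k^2 I)] for a spherical GMM, per descriptor."""
            d = X.shape[1]
            out = np.empty((X.shape[0], gmm.n_components))
            for k in range(gmm.n_components):
                var = gmm.covariances_[k]                 # scalar variance (spherical)
                diff = X - gmm.means_[k]
                out[:, k] = (np.log(gmm.weights_[k])
                             - 0.5 * (d * np.log(2 * np.pi * var) + (diff ** 2).sum(axis=1) / var))
            return out

        def pep_representation(gmm, face_descriptors):
            """Concatenate, for each GMM part, the face descriptor it explains best."""
            scores = weighted_log_prob(gmm, face_descriptors)   # (num_descriptors, num_parts)
            best = np.argmax(scores, axis=0)
            return np.concatenate([face_descriptors[i] for i in best])

        # Toy data: 130-D vectors standing in for 128-D SIFT descriptors plus (x, y) location.
        rng = np.random.default_rng(1)
        train = rng.standard_normal((2000, 130))
        gmm = GaussianMixture(n_components=16, covariance_type="spherical",
                              random_state=0).fit(train)
        face = rng.standard_normal((300, 130))
        print(pep_representation(gmm, face).shape)               # (16 * 130,)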

  15. Structural dynamics verification facility study

    NASA Technical Reports Server (NTRS)

    Kiraly, L. J.; Hirchbein, M. S.; Mcaleese, J. M.; Fleming, D. P.

    1981-01-01

    The need for a structural dynamics verification facility to support structures programs was studied. Most of the industry operated facilities are used for highly focused research, component development, and problem solving, and are not used for the generic understanding of the coupled dynamic response of major engine subsystems. Capabilities for the proposed facility include: the ability to both excite and measure coupled structural dynamic response of elastic blades on elastic shafting, the mechanical simulation of various dynamical loadings representative of those seen in operating engines, and the measurement of engine dynamic deflections and interface forces caused by alternative engine mounting configurations and compliances.

  16. Protecting Privacy and Securing the Gathering of Location Proofs - The Secure Location Verification Proof Gathering Protocol

    NASA Astrophysics Data System (ADS)

    Graham, Michelle; Gray, David

    As wireless networks become increasingly ubiquitous, the demand for a method of locating a device has increased dramatically. Location Based Services are now commonplace but there are few methods of verifying or guaranteeing a location provided by a user without some specialised hardware, especially in larger scale networks. We propose a system for the verification of location claims, using proof gathered from neighbouring devices. In this paper we introduce a protocol to protect this proof gathering process, protecting the privacy of all involved parties and securing it from intruders and malicious claiming devices. We present the protocol in stages, extending the security of this protocol to allow for flexibility within its application. The Secure Location Verification Proof Gathering Protocol (SLVPGP) has been designed to function within the area of Vehicular Networks, although its application could be extended to any device with wireless & cryptographic capabilities.

  17. Automated Analysis of Stateflow Models

    NASA Technical Reports Server (NTRS)

    Bourbouh, Hamza; Garoche, Pierre-Loic; Garion, Christophe; Gurfinkel, Arie; Kahsaia, Temesghen; Thirioux, Xavier

    2017-01-01

    Stateflow is a widely used modeling framework for embedded and cyber-physical systems where control software interacts with physical processes. In this work, we present a fully automated safety verification framework for Stateflow models. Our approach is twofold: (i) we faithfully compile Stateflow models into hierarchical state machines, and (ii) we use an automated logic-based verification engine to decide the validity of safety properties. The starting point of our approach is a denotational semantics of Stateflow. We propose a compilation process using continuation-passing style (CPS) denotational semantics. Our compilation technique preserves the structural and modal behavior of the system. The overall approach is implemented as an open source toolbox that can be integrated into the existing Mathworks Simulink Stateflow modeling framework. We present preliminary experimental evaluations that illustrate the effectiveness of our approach in code generation and safety verification of industrial-scale Stateflow models.
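
    The compiled state machines are ultimately checked against safety properties; as a minimal stand-in for the paper's logic-based engine, the sketch below performs explicit-state reachability over a flat transition relation and reports whether any designated unsafe state is reachable. The chart and state names are hypothetical.

        from collections import deque

        def reachable(initial, transitions):
            """Breadth-first exploration of an explicit-state transition relation."""
            seen, frontier = {initial}, deque([initial])
            while frontier:
                state = frontier.popleft()
                for nxt in transitions.get(state, ()):
                    if nxt not in seen:
                        seen.add(nxt)
                        frontier.append(nxt)
            return seen

        def check_safety(initial, transitions, unsafe):
            """Safety property: no unsafe state is reachable from the initial state."""
            return unsafe.isdisjoint(reachable(initial, transitions))

        # Toy chart: 'error' must never be reachable from 'idle'.
        chart = {"idle": {"run"}, "run": {"idle", "done"}, "done": set()}
        print(check_safety("idle", chart, unsafe={"error"}))   # True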

  18. Highly efficient simulation environment for HDTV video decoder in VLSI design

    NASA Astrophysics Data System (ADS)

    Mao, Xun; Wang, Wei; Gong, Huimin; He, Yan L.; Lou, Jian; Yu, Lu; Yao, Qingdong; Pirsch, Peter

    2002-01-01

    With the increasing complexity of VLSI designs, especially SoC (System on Chip) implementations of MPEG-2 video decoders with HDTV scalability, simulation and verification of the full design, even at the behavioral level in HDL, are often very slow and costly, and full verification is difficult to perform until late in the design process. These tasks therefore become the bottleneck of the HDTV video decoder design procedure and strongly influence its time-to-market. In this paper, the architecture of the hardware/software interface of an HDTV video decoder is studied, and a Hardware-Software Mixed Simulation (HSMS) platform is proposed to detect and correct errors in the early design stages, based on the MPEG-2 video decoding algorithm. The application of HSMS to the target system is achieved through several introduced approaches, which speed up the simulation and verification task without decreasing performance.

  19. Consistent model driven architecture

    NASA Astrophysics Data System (ADS)

    Niepostyn, Stanisław J.

    2015-09-01

    The goal of the MDA is to produce software systems from abstract models in a way that keeps human interaction to a minimum. These abstract models are based on the UML language. However, the semantics of UML models is defined in a natural language. Consequently, verification of the consistency of these diagrams is needed in order to identify errors in requirements at an early stage of the development process. Verification of consistency is difficult due to the semi-formal nature of UML diagrams. We propose automatic consistency verification of a series of UML diagrams originating from abstract models, implemented with our consistency rules. This Consistent Model Driven Architecture approach enables us to automatically generate complete workflow applications from consistent and complete models developed from abstract models (e.g., a Business Context Diagram). Therefore, our method can be used to check the practicability (feasibility) of software architecture models.
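
    A minimal sketch of one plausible consistency rule of the kind described above (not the paper's actual rule set): every operation invoked in a behavioral (sequence/workflow) model must be declared in the structural (class) model. The model dictionaries are hypothetical.

        def check_call_consistency(class_model, sequence_model):
            """Rule sketch: each message must name an operation declared by its receiver."""
            errors = []
            for receiver, operation in sequence_model:
                if operation not in class_model.get(receiver, set()):
                    errors.append(f"{receiver}.{operation} is called but not declared")
            return errors

        class_model = {"Order": {"create", "cancel"}, "Invoice": {"issue"}}
        sequence_model = [("Order", "create"), ("Invoice", "issue"), ("Order", "ship")]
        print(check_call_consistency(class_model, sequence_model))
        # ['Order.ship is called but not declared']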

  20. 78 FR 46424 - Proposed Information Collection (Residency Verification Report-Veterans and Survivors) Activity...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-07-31

    ... to determine Filipino Veterans or beneficiaries receiving benefit at the full-dollar rate continues... approved collection. Abstract: VA Form Letter 21-914 is used to verify whether Filipino Veterans of the...

  1. 75 FR 62186 - Proposed Information Collection (Residency Verification Report-Veterans and Survivors) Activity...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-10-07

    ... to determine Filipino veterans or beneficiaries receiving benefit at the full-dollar rate continues... approved collection. Abstract: VA Form Letter 21-914 is used to verify whether Filipino veterans of the...

  2. Design and Development of the Space Shuttle Tail Service Masts

    NASA Technical Reports Server (NTRS)

    Dandage, S. R.; Herman, N. A.; Godfrey, S. E.; Uda, R. T.

    1977-01-01

    The results of the tail service masts (TSM) concept verification test are presented along with the resulting impact on prototype design. The design criteria are outlined, and the proposed prototype TSM tests are described.

  3. Spectral signature verification using statistical analysis and text mining

    NASA Astrophysics Data System (ADS)

    DeCoster, Mallory E.; Firpi, Alexe H.; Jacobs, Samantha K.; Cone, Shelli R.; Tzeng, Nigel H.; Rodriguez, Benjamin M.

    2016-05-01

    In the spectral science community, numerous spectral signatures are stored in databases representative of many sample materials collected from a variety of spectrometers and spectroscopists. Due to the variety and variability of the spectra that comprise many spectral databases, it is necessary to establish a metric for validating the quality of spectral signatures. This has been an area of great discussion and debate in the spectral science community. This paper discusses a method that independently validates two different aspects of a spectral signature to arrive at a final qualitative assessment: the textual meta-data and the numerical spectral data. Results associated with the spectral data stored in the Signature Database (SigDB) are presented. The numerical data comprising a sample material's spectrum is validated based on statistical properties derived from an ideal population set. The quality of the test spectrum is ranked based on a spectral angle mapper (SAM) comparison to the mean spectrum derived from the population set. Additionally, the contextual data of a test spectrum is qualitatively analyzed using lexical analysis text mining. This technique analyzes the syntax of the meta-data to uncover local learning patterns and trends within the spectral data that are indicative of the test spectrum's quality. Text mining applications have successfully been implemented for security (text encryption/decryption), biomedical, and marketing applications. The text mining lexical analysis algorithm is trained on the meta-data patterns of a subset of high and low quality spectra, in order to have a model to apply to the entire SigDB data set. The statistical and textual methods combine to assess the quality of a test spectrum existing in a database without the need of an expert user. This method has been compared to other validation methods accepted by the spectral science community, and has provided promising results when a baseline spectral signature is present for comparison. The spectral validation method proposed is described from a practical application and analytical perspective.
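
    The numerical half of the validation rests on the spectral angle mapper comparison to a population mean; a brief sketch follows, with a synthetic stand-in for a SigDB population of reflectance spectra.

        import numpy as np

        def spectral_angle(a, b):
            """Spectral angle mapper (SAM): angle in radians between two spectra."""
            cos = np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b))
            return float(np.arccos(np.clip(cos, -1.0, 1.0)))

        def sam_to_population_mean(test_spectrum, population):
            """Score a test spectrum by its SAM distance to the population mean."""
            return spectral_angle(test_spectrum, population.mean(axis=0))

        # Synthetic stand-in for an ideal population of reflectance spectra.
        rng = np.random.default_rng(0)
        wavelengths = np.linspace(400, 2500, 211)
        population = (np.exp(-((wavelengths - 1000) / 600) ** 2)
                      + 0.01 * rng.standard_normal((20, wavelengths.size)))
        good = population.mean(axis=0)
        distorted = good + 0.3 * np.sin(wavelengths / 50)
        print(sam_to_population_mean(good, population),
              sam_to_population_mean(distorted, population))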

  4. Five biomedical experiments flown in an Earth orbiting laboratory: Lessons learned from developing these experiments on the first international microgravity mission from concept to landing

    NASA Technical Reports Server (NTRS)

    Winget, C. M.; Lashbrook, J. J.; Callahan, P. X.; Schaefer, R. L.

    1993-01-01

    There are numerous problems associated with accommodating complex biological systems in microgravity in the flexible laboratory systems installed in the Orbiter cargo bay. This presentation will focus upon some of the lessons learned along the way from the University laboratory to the IML-1 Microgravity Laboratory. The First International Microgravity Laboratory (IML-1) mission contained a large number of specimens, including: 72 million nematodes, US-1; 3 billion yeast cells, US-2; 32 million mouse limb-bud cells, US-3; and 540 oat seeds (96 planted), FOTRAN. All five of the experiments had to undergo significant redevelopment effort in order to allow the investigator's ideas and objectives to be accommodated within the constraints of the IML-1 mission. Each of these experiments was proposed as a unique entity rather than as part of the mission, and many procedures had to be modified from laboratory practice to meet IML-1 constraints. After a proposal is accepted by NASA for definition, an interactive process is begun between the Principal Investigator and the developer to ensure a maximum science return. The success of the five SLSPO-managed experiments was the result of successful completion of all preflight biological testing and hardware verification finalized at the KSC Life Sciences Support Facility housed in Hangar L. The ESTEC Biorack facility housed three U.S. experiments (US-1, US-2, and US-3). The U.S. Gravitational Plant Physiology Facility housed GTHRES and FOTRAN. The IML-1 mission (launched from KSC on 22 Jan. 1992, and landed at Dryden Flight Research Facility on 30 Jan. 1992) was an outstanding success--close to 100 percent of the prelaunch anticipated science return was achieved and, in some cases, greater than 100 percent was achieved (because of an extra mission day).

  5. Figures of Merit for Control Verification

    NASA Technical Reports Server (NTRS)

    Crespo, Luis G.; Kenny, Sean P.; Giesy, Daniel P.

    2008-01-01

    This paper proposes a methodology for evaluating a controller's ability to satisfy a set of closed-loop specifications when the plant has an arbitrary functional dependency on uncertain parameters. Control verification metrics applicable to deterministic and probabilistic uncertainty models are proposed. These metrics, which result from sizing the largest uncertainty set of a given class for which the specifications are satisfied, enable systematic assessment of competing control alternatives regardless of the methods used to derive them. A particularly attractive feature of the tools derived is that their efficiency and accuracy do not depend on the robustness of the controller. This is in sharp contrast to Monte Carlo based methods where the number of simulations required to accurately approximate the failure probability grows exponentially with its closeness to zero. This framework allows for the integration of complex, high-fidelity simulations of the integrated system and only requires standard optimization algorithms for its implementation.
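
    As a simplified illustration of sizing the largest uncertainty set for which specifications are satisfied (not the paper's metrics or algorithms), the sketch below bisects on the scale factor of a hyper-rectangular uncertainty set, using an illustrative monotonic requirement function checked at the box vertices; the requirement, parameters, and bounds are all hypothetical.

        import itertools
        import numpy as np

        def spec_satisfied(p):
            """Illustrative closed-loop requirement; a stand-in, not the paper's metric."""
            k, tau = p
            return 1.0 - 0.4 * k * tau > 0.0

        def worst_case_ok(center, half_widths, scale):
            """Check the spec at every vertex of the scaled box (adequate here because
            the placeholder spec is monotonic in each parameter)."""
            intervals = [(c - scale * w, c + scale * w) for c, w in zip(center, half_widths)]
            return all(spec_satisfied(np.array(v)) for v in itertools.product(*intervals))

        def largest_safe_scale(center, half_widths, hi=10.0, iters=40):
            """Bisect for the largest box scale factor for which the spec still holds."""
            lo = 0.0
            for _ in range(iters):
                mid = 0.5 * (lo + hi)
                lo, hi = (mid, hi) if worst_case_ok(center, half_widths, mid) else (lo, mid)
            return lo

        print(largest_safe_scale(center=[1.0, 1.0], half_widths=[0.5, 0.5]))   # about 1.16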

  6. Signal existence verification (SEV) for GPS low received power signal detection using the time-frequency approach.

    PubMed

    Jan, Shau-Shiun; Sun, Chih-Cheng

    2010-01-01

    The detection of low received power of global positioning system (GPS) signals in the signal acquisition process is an important issue for GPS applications. Improving the miss-detection problem of low received power signal is crucial, especially for urban or indoor environments. This paper proposes a signal existence verification (SEV) process to detect and subsequently verify low received power GPS signals. The SEV process is based on the time-frequency representation of GPS signal, and it can capture the characteristic of GPS signal in the time-frequency plane to enhance the GPS signal acquisition performance. Several simulations and experiments are conducted to show the effectiveness of the proposed method for low received power signal detection. The contribution of this work is that the SEV process is an additional scheme to assist the GPS signal acquisition process in low received power signal detection, without changing the original signal acquisition or tracking algorithms.
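
    A loose illustration of the time-frequency intuition (not the SEV algorithm itself): compute the STFT of a noisy sequence and measure how persistently the per-slice spectral peak stays in one frequency bin, which separates a weak tone from pure noise. The sample rate, tone frequency, and amplitude below are arbitrary.

        import numpy as np
        from scipy.signal import stft

        def persistence_score(x, fs, nperseg=256):
            """Fraction of STFT time slices whose spectral peak falls in the same bin,
            a crude persistence measure for a weak tone buried in noise."""
            _, _, Z = stft(x, fs=fs, nperseg=nperseg)
            peak_bins = np.argmax(np.abs(Z), axis=0)
            return np.bincount(peak_bins).max() / peak_bins.size

        fs = 5000.0
        t = np.arange(0, 1.0, 1.0 / fs)
        rng = np.random.default_rng(0)
        noise_only = rng.standard_normal(t.size)
        weak_tone = noise_only + 0.3 * np.sin(2 * np.pi * 400.0 * t)   # low-SNR component
        print(persistence_score(noise_only, fs), persistence_score(weak_tone, fs))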

  7. An Integrated Environment for Efficient Formal Design and Verification

    NASA Technical Reports Server (NTRS)

    1998-01-01

    The general goal of this project was to improve the practicality of formal methods by combining techniques from model checking and theorem proving. At the time the project was proposed, the model checking and theorem proving communities were applying different tools to similar problems, but there was not much cross-fertilization. This project involved a group from SRI that had substantial experience in the development and application of theorem-proving technology, and a group at Stanford that specialized in model checking techniques. Now, over five years after the proposal was submitted, there are many research groups working on combining theorem-proving and model checking techniques, and much more communication between the model checking and theorem proving research communities. This project contributed significantly to this research trend. The research work under this project covered a variety of topics: new theory and algorithms; prototype tools; verification methodology; and applications to problems in particular domains.

  8. Measurement of a True VO2max during a Ramp Incremental Test Is Not Confirmed by a Verification Phase.

    PubMed

    Murias, Juan M; Pogliaghi, Silvia; Paterson, Donald H

    2018-01-01

    The accuracy of an exhaustive ramp incremental (RI) test to determine maximal oxygen uptake (VO2max) was recently questioned and the utilization of a verification phase proposed as a gold standard. This study compared the oxygen uptake (VO2) during a RI test to that obtained during a verification phase aimed to confirm attainment of VO2max. Sixty-one healthy males [31 older (O) 65 ± 5 yrs; 30 younger (Y) 25 ± 4 yrs] performed a RI test (15-20 W/min for O and 25 W/min for Y). At the end of the RI test, a 5-min recovery period was followed by a verification phase of constant load cycling to fatigue at either 85% (n = 16) or 105% (n = 45) of the peak power output obtained from the RI test. The highest VO2 after the RI test (39.8 ± 11.5 mL·kg⁻¹·min⁻¹) and the verification phase (40.1 ± 11.2 mL·kg⁻¹·min⁻¹) were not different (p = 0.33) and they were highly correlated (r = 0.99; p < 0.01). This response was not affected by age or intensity of the verification phase. The Bland-Altman analysis revealed a very small absolute bias (-0.25 mL·kg⁻¹·min⁻¹, not different from 0) and a precision of ±1.56 mL·kg⁻¹·min⁻¹ between measures. This study indicated that a verification phase does not highlight an under-estimation of VO2max derived from a RI test, in a large and heterogeneous group of healthy younger and older men naïve to laboratory testing procedures. Moreover, only minor within-individual differences were observed between the maximal VO2 elicited during the RI and the verification phase. Thus a verification phase does not add any validation of the determination of a VO2max. Therefore, the recommendation that a verification phase should become a gold standard procedure, although initially appealing, is not supported by the experimental data.
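
    The agreement statistics quoted above (bias and precision) follow from a standard Bland-Altman computation, sketched below with hypothetical paired VO2 values.

        import numpy as np

        def bland_altman(ramp_vo2, verification_vo2):
            """Bias (mean difference) and 95% limits of agreement between two measures."""
            diff = np.asarray(verification_vo2) - np.asarray(ramp_vo2)
            bias = diff.mean()
            sd = diff.std(ddof=1)
            return bias, (bias - 1.96 * sd, bias + 1.96 * sd)

        # Hypothetical paired measurements (mL·kg⁻¹·min⁻¹) from a few participants.
        ramp = [38.2, 41.0, 52.3, 29.8, 45.1]
        verif = [38.5, 40.6, 52.0, 30.1, 45.4]
        print(bland_altman(ramp, verif))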

  9. Mass and galaxy distributions of four massive galaxy clusters from Dark Energy Survey Science Verification data

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Melchior, P.; Suchyta, E.; Huff, E.

    2015-03-31

    We measure the weak-lensing masses and galaxy distributions of four massive galaxy clusters observed during the Science Verification phase of the Dark Energy Survey. This pathfinder study is meant to 1) validate the DECam imager for the task of measuring weak-lensing shapes, and 2) utilize DECam's large field of view to map out the clusters and their environments over 90 arcmin. We conduct a series of rigorous tests on astrometry, photometry, image quality, PSF modeling, and shear measurement accuracy to single out flaws in the data and also to identify the optimal data processing steps and parameters. We find Science Verification data from DECam to be suitable for the lensing analysis described in this paper. The PSF is generally well-behaved, but the modeling is rendered difficult by a flux-dependent PSF width and ellipticity. We employ photometric redshifts to distinguish between foreground and background galaxies, and a red-sequence cluster finder to provide cluster richness estimates and cluster-galaxy distributions. By fitting NFW profiles to the clusters in this study, we determine weak-lensing masses that are in agreement with previous work. For Abell 3261, we provide the first estimates of redshift, weak-lensing mass, and richness. In addition, the cluster-galaxy distributions indicate the presence of filamentary structures attached to 1E 0657-56 and RXC J2248.7-4431, stretching out as far as 1 degree (approximately 20 Mpc), showcasing the potential of DECam and DES for detailed studies of degree-scale features on the sky.

  10. In-Space Engine (ISE-100) Development - Design Verification Test

    NASA Technical Reports Server (NTRS)

    Trinh, Huu P.; Popp, Chris; Bullard, Brad

    2017-01-01

    In the past decade, NASA has formulated science mission concepts with an anticipation of landing spacecraft on the lunar surface, meteoroids, and other planets. Advancing thruster technology for spacecraft propulsion systems has been considered as a way to maximize science payload. Starting in 2010, development of the In-Space Engine (designated ISE-100) has been carried out. The ISE-100 thruster is based on heritage Missile Defense Agency (MDA) technology and aims for a lightweight, efficient system in terms of volume and packaging. It runs on a hypergolic bi-propellant system: MON-25 (nitrogen tetroxide, N2O4, with 25% nitric oxide, NO) and MMH (monomethylhydrazine, CH6N2) for NASA spacecraft applications. This propellant system provides a propulsion system capable of operating over a wide range of temperatures, from 50 C (122 F) down to -30 C (-22 F), to drastically reduce heater power. The thruster is designed to deliver 100 lbf of thrust with the capability of pulse-mode operation for a wide range of mission duty cycles (MDCs). Two thrusters were fabricated. As part of the engine development, this test campaign is dedicated to the design verification of the thruster. This presentation reports the design verification hot-fire test program of the ISE-100 thruster, conducted in collaboration between NASA Marshall Space Flight Center (MSFC) and Aerojet Rocketdyne (AR) test teams. The hot-fire tests were conducted at the Advanced Mobile Propulsion Test (AMPT) facility in Durango, Colorado, from May 13 to June 10, 2016. This presentation also provides a summary of key points from the test results.

  11. Ground based ISS payload microgravity disturbance assessments.

    PubMed

    McNelis, Anne M; Heese, John A; Samorezov, Sergey; Moss, Larry A; Just, Marcus L

    2005-01-01

    In order to verify that the International Space Station (ISS) payload facility racks do not disturb the microgravity environment of neighboring facility racks and that the facility science operations are not compromised, a testing and analytical verification process must be followed. Currently no facility racks have taken this process from start to finish. The authors are participants in implementing this process for the NASA Glenn Research Center (GRC) Fluids and Combustion Facility (FCF). To address the testing part of the verification process, the Microgravity Emissions Laboratory (MEL) was developed at GRC. The MEL is a six-degree-of-freedom inertial measurement system capable of characterizing inertial response forces (emissions) of components, sub-rack payloads, or rack-level payloads down to 10⁻⁷ g. The inertial force output data, generated from the steady state or transient operations of the test articles, are utilized in analytical simulations to predict the on-orbit vibratory environment at specific science or rack interface locations. Once the facility payload rack and disturbers are properly modeled, an assessment can be made as to whether required microgravity levels are achieved. The modeling is utilized to develop microgravity predictions, which lead to the development of microgravity-sensitive ISS experiment operations once on-orbit. The on-orbit measurements will be verified by use of the NASA GRC Space Acceleration Measurement System (SAMS). The major topics to be addressed in this paper are: (1) Microgravity Requirements, (2) Microgravity Disturbers, (3) MEL Testing, (4) Disturbance Control, (5) Microgravity Control Process, and (6) On-Orbit Predictions and Verification.

  12. Mass and galaxy distributions of four massive galaxy clusters from Dark Energy Survey Science Verification data

    DOE PAGES

    Melchior, P.; Suchyta, E.; Huff, E.; ...

    2015-03-31

    We measure the weak-lensing masses and galaxy distributions of four massive galaxy clusters observed during the Science Verification phase of the Dark Energy Survey. This pathfinder study is meant to 1) validate the DECam imager for the task of measuring weak-lensing shapes, and 2) utilize DECam's large field of view to map out the clusters and their environments over 90 arcmin. We conduct a series of rigorous tests on astrometry, photometry, image quality, PSF modelling, and shear measurement accuracy to single out flaws in the data and also to identify the optimal data processing steps and parameters. We find Science Verification data from DECam to be suitable for the lensing analysis described in this paper. The PSF is generally well-behaved, but the modelling is rendered difficult by a flux-dependent PSF width and ellipticity. We employ photometric redshifts to distinguish between foreground and background galaxies, and a red-sequence cluster finder to provide cluster richness estimates and cluster-galaxy distributions. By fitting NFW profiles to the clusters in this study, we determine weak-lensing masses that are in agreement with previous work. For Abell 3261, we provide the first estimates of redshift, weak-lensing mass, and richness. Additionally, the cluster-galaxy distributions indicate the presence of filamentary structures attached to 1E 0657-56 and RXC J2248.7-4431, stretching out as far as 1 degree (approximately 20 Mpc), showcasing the potential of DECam and DES for detailed studies of degree-scale features on the sky.

  13. An Identity-Based Anti-Quantum Privacy-Preserving Blind Authentication in Wireless Sensor Networks.

    PubMed

    Zhu, Hongfei; Tan, Yu-An; Zhu, Liehuang; Wang, Xianmin; Zhang, Quanxin; Li, Yuanzhang

    2018-05-22

    With the development of wireless sensor networks, IoT devices are crucial for the Smart City; these devices change people's lives in areas such as e-payment and e-voting systems. However, in these two systems, state-of-the-art authentication protocols based on traditional number theory cannot defeat a quantum computer attack. In order to protect user privacy and guarantee the trustworthiness of big data, we propose a new identity-based blind signature scheme based on a number theory research unit (NTRU) lattice; this scheme mainly uses a rejection sampling theorem instead of constructing a trapdoor. Meanwhile, the scheme does not depend on a complex public key infrastructure and can resist quantum computer attacks. We then design an e-payment protocol using the proposed scheme. Furthermore, we prove that our scheme is secure in the random oracle model and satisfies confidentiality, integrity, and non-repudiation. Finally, we demonstrate that the proposed scheme outperforms other traditional identity-based blind signature schemes in signing speed and verification speed, and outperforms other lattice-based blind signatures in signing speed, verification speed, and signing secret key size.

  14. An Identity-Based Anti-Quantum Privacy-Preserving Blind Authentication in Wireless Sensor Networks

    PubMed Central

    Zhu, Hongfei; Tan, Yu-an; Zhu, Liehuang; Wang, Xianmin; Zhang, Quanxin; Li, Yuanzhang

    2018-01-01

    With the development of wireless sensor networks, IoT devices are crucial for the Smart City; these devices change people's lives in areas such as e-payment and e-voting systems. However, in these two systems, state-of-the-art authentication protocols based on traditional number theory cannot defeat a quantum computer attack. In order to protect user privacy and guarantee the trustworthiness of big data, we propose a new identity-based blind signature scheme based on a number theory research unit (NTRU) lattice; this scheme mainly uses a rejection sampling theorem instead of constructing a trapdoor. Meanwhile, the scheme does not depend on a complex public key infrastructure and can resist quantum computer attacks. We then design an e-payment protocol using the proposed scheme. Furthermore, we prove that our scheme is secure in the random oracle model and satisfies confidentiality, integrity, and non-repudiation. Finally, we demonstrate that the proposed scheme outperforms other traditional identity-based blind signature schemes in signing speed and verification speed, and outperforms other lattice-based blind signatures in signing speed, verification speed, and signing secret key size. PMID:29789475

  15. Faster Double-Size Bipartite Multiplication out of Montgomery Multipliers

    NASA Astrophysics Data System (ADS)

    Yoshino, Masayuki; Okeya, Katsuyuki; Vuillaume, Camille

    This paper proposes novel algorithms for computing double-size modular multiplications with few modulus-dependent precomputations. Low-end devices such as smartcards are usually equipped with hardware Montgomery multipliers. However, due to progress in mathematical attacks, security institutions such as NIST have steadily demanded longer bit-lengths for public-key cryptography, making the multipliers quickly obsolete. In an attempt to extend the lifespan of such multipliers, double-size techniques compute modular multiplications with twice the bit-length of the multipliers. Techniques are known for extending the bit-length of classical Euclidean multipliers, of Montgomery multipliers and the combination thereof, namely bipartite multipliers. However, unlike classical and bipartite multiplications, Montgomery multiplications involve modulus-dependent precomputations, which amount to a large part of an RSA encryption or signature verification. The proposed double-size technique simulates double-size multiplications based on single-size Montgomery multipliers, and yet precomputations are essentially free: in a 2048-bit RSA encryption or signature verification with public exponent e = 2^16 + 1, the proposal with a 1024-bit Montgomery multiplier is at least 1.5 times faster than previous double-size Montgomery multiplications.
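
    The double-size techniques discussed above build on the single-size Montgomery primitive; the sketch below shows plain Montgomery reduction (REDC) and multiplication with Python integers, including the modulus-dependent precomputations (R^2 mod n and n') that the paper is concerned with. It is not the proposed double-size algorithm itself, and the parameters are tiny for readability.

        def redc(t, n, r_bits, n_prime):
            """Montgomery reduction: returns t * R^-1 mod n for 0 <= t < n * R."""
            R = 1 << r_bits
            m = ((t & (R - 1)) * n_prime) & (R - 1)
            u = (t + m * n) >> r_bits          # exact division by R
            return u - n if u >= n else u

        def mont_mul(a_bar, b_bar, n, r_bits, n_prime):
            """Montgomery product of two residues already in Montgomery form."""
            return redc(a_bar * b_bar, n, r_bits, n_prime)

        # Modulus-dependent precomputations.
        n, r_bits = 101, 8                     # R = 256 > n, gcd(R, n) = 1
        R = 1 << r_bits
        n_prime = (-pow(n, -1, R)) % R         # -n^-1 mod R (Python 3.8+)
        R2 = (R * R) % n                       # R^2 mod n

        a, b = 55, 73
        a_bar = redc(a * R2, n, r_bits, n_prime)            # a * R mod n
        b_bar = redc(b * R2, n, r_bits, n_prime)            # b * R mod n
        result = redc(mont_mul(a_bar, b_bar, n, r_bits, n_prime), n, r_bits, n_prime)
        print(result, (a * b) % n)             # both 76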

  16. Collusion-aware privacy-preserving range query in tiered wireless sensor networks.

    PubMed

    Zhang, Xiaoying; Dong, Lei; Peng, Hui; Chen, Hong; Zhao, Suyun; Li, Cuiping

    2014-12-11

    Wireless sensor networks (WSNs) are indispensable building blocks for the Internet of Things (IoT). With the development of WSNs, privacy issues have drawn more attention. Existing work on the privacy-preserving range query mainly focuses on privacy preservation and integrity verification in two-tiered WSNs in the case of compromised master nodes, but neglects the damage of node collusion. In this paper, we propose a series of collusion-aware privacy-preserving range query protocols in two-tiered WSNs. To the best of our knowledge, this paper is the first to consider collusion attacks for a range query in tiered WSNs while fulfilling the preservation of privacy and integrity. To preserve the privacy of data and queries, we propose a novel encoding scheme to conceal sensitive information. To preserve the integrity of the results, we present a verification scheme using the correlation among data. In addition, two schemes are further presented to improve result accuracy and reduce communication cost. Finally, theoretical analysis and experimental results confirm the efficiency, accuracy and privacy of our proposals.
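
    As a loose illustration of the general flavour of such protocols (not the paper's scheme), the sketch below tags each reading with a coarse bucket so an untrusted storage node can answer range queries without seeing exact values, and binds each item to its bucket with an HMAC so the sink can reject tampered items. Completeness checks against dropped results (e.g., chaining adjacent items) are omitted, and the key and bucket width are hypothetical.

        import hashlib
        import hmac

        KEY = b"sensor-sink shared key"        # hypothetical pre-shared key

        def bucket(value, width=10):
            """Coarse bucket tag; the storage node sees only the bucket, not the value."""
            return value // width

        def sensor_report(readings):
            """Tag each reading with its bucket and a MAC binding value and bucket."""
            return [{"bucket": bucket(v),
                     "payload": v,             # stands in for a ciphertext
                     "mac": hmac.new(KEY, f"{bucket(v)}|{v}".encode(),
                                     hashlib.sha256).hexdigest()}
                    for v in readings]

        def storage_answer(report, lo, hi):
            """Untrusted storage node returns items whose bucket overlaps [lo, hi]."""
            return [it for it in report if bucket(lo) <= it["bucket"] <= bucket(hi)]

        def sink_verify(answer):
            """The sink recomputes each MAC; tampered or forged items are rejected."""
            def ok(it):
                expected = hmac.new(KEY, f"{it['bucket']}|{it['payload']}".encode(),
                                    hashlib.sha256).hexdigest()
                return hmac.compare_digest(it["mac"], expected)
            return all(ok(it) for it in answer)

        report = sensor_report([3, 17, 42, 58, 71])
        answer = storage_answer(report, 15, 60)
        print([it["payload"] for it in answer], sink_verify(answer))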

  17. Collusion-Aware Privacy-Preserving Range Query in Tiered Wireless Sensor Networks†

    PubMed Central

    Zhang, Xiaoying; Dong, Lei; Peng, Hui; Chen, Hong; Zhao, Suyun; Li, Cuiping

    2014-01-01

    Wireless sensor networks (WSNs) are indispensable building blocks for the Internet of Things (IoT). With the development of WSNs, privacy issues have drawn more attention. Existing work on the privacy-preserving range query mainly focuses on privacy preservation and integrity verification in two-tiered WSNs in the case of compromised master nodes, but neglects the damage of node collusion. In this paper, we propose a series of collusion-aware privacy-preserving range query protocols in two-tiered WSNs. To the best of our knowledge, this paper is the first to consider collusion attacks for a range query in tiered WSNs while fulfilling the preservation of privacy and integrity. To preserve the privacy of data and queries, we propose a novel encoding scheme to conceal sensitive information. To preserve the integrity of the results, we present a verification scheme using the correlation among data. In addition, two schemes are further presented to improve result accuracy and reduce communication cost. Finally, theoretical analysis and experimental results confirm the efficiency, accuracy and privacy of our proposals. PMID:25615731

  18. 77 FR 67737 - Proposed Information Collection (Student Verification of Enrollment) Activity: Comment Request

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-11-13

    ...The Veterans Benefits Administration (VBA), Department of Veterans Affairs (VA), is announcing an opportunity for public comment on the proposed collection of certain information by the agency. Under the Paperwork Reduction Act (PRA) of 1995, Federal agencies are required to publish notice in the Federal Register concerning each proposed collection of information, including each proposed extension of a currently approved collection, and allow 60 days for public comment in response to the notice. This notice solicits comments on information needed to determine an individual's continued entitlement to VA benefits.

  19. SAMS-II Requirements and Operations

    NASA Technical Reports Server (NTRS)

    Wald, Lawrence W.

    1998-01-01

    The Space Acceleration Measurements System (SAMS) II is the primary instrument for the measurement, storage, and communication of the microgravity environment aboard the International Space Station (ISS). SAMS-II is being developed by the NASA Lewis Research Center Microgravity Science Division to primarily support the Office of Life and Microgravity Science and Applications (OLMSA) Microgravity Science and Applications Division (MSAD) payloads aboard the ISS. The SAMS-II is currently in the test and verification phase at NASA LeRC, prior to its first hardware delivery scheduled for July 1998. This paper will provide an overview of the SAMS-II instrument, including the system requirements and topology, physical and electrical characteristics, and the Concept of Operations for SAMS-II aboard the ISS.

  20. Scientific Data Purchase Project Overview Presentation

    NASA Technical Reports Server (NTRS)

    Holekamp, Kara; Fletcher, Rose

    2001-01-01

    The Scientific Data Purchase (SDP) project acquires science data from commercial sources. It is a demonstration project to test a new way of doing business, tap new sources of data, support Earth science research, and support the commercial remote sensing industry. Phase I of the project reviews simulated/prototypical data sets from 10 companies. Phase II of the project is a 3 year purchase/distribution of select data from 5 companies. The status of several SDP projects is reviewed in this viewgraph presentation, as is the SDP process of tasking, verification, validation, and data archiving. The presentation also lists SDP results for turnaround time, metrics, customers, data use, science research, applications research, and user feedback.

  1. Structural, Thermal, and Optical Performance (STOP) Modeling and Results for the James Webb Space Telescope Integrated Science Instrument Module

    NASA Technical Reports Server (NTRS)

    Gracey, Renee; Bartoszyk, Andrew; Cofie, Emmanuel; Comber, Brian; Hartig, George; Howard, Joseph; Sabatke, Derek; Wenzel, Greg; Ohl, Raymond

    2016-01-01

    The James Webb Space Telescope includes the Integrated Science Instrument Module (ISIM) element, which contains four science instruments (SI) including a Guider. We performed extensive structural, thermal, and optical performance (STOP) modeling in support of all phases of ISIM development. In this paper, we focus on modeling and results associated with test and verification. ISIM's test program is bounded by ground environments, most notably the 1-g and test chamber thermal environments. This paper describes STOP modeling used to predict ISIM system performance in 0-g and at various on-orbit temperature environments. The predictions are used to project results obtained during testing to on-orbit performance.

  2. The Deep Space Network: A Radio Communications Instrument for Deep Space Exploration

    NASA Technical Reports Server (NTRS)

    Renzetti, N. A.; Stelzried, C. T.; Noreen, G. K.; Slobin, S. D.; Petty, S. M.; Trowbridge, D. L.; Donnelly, H.; Kinman, P. W.; Armstrong, J. W.; Burow, N. A.

    1983-01-01

    The primary purpose of the Deep Space Network (DSN) is to serve as a communications instrument for deep space exploration, providing communications between the spacecraft and the ground facilities. The uplink communications channel provides instructions or commands to the spacecraft. The downlink communications channel provides command verification and spacecraft engineering and science instrument payload data.

  3. Professional Development Aligned with AP Chemistry Curriculum: Promoting Science Practices and Facilitating Enduring Conceptual Understanding

    ERIC Educational Resources Information Center

    Herrington, Deborah G.; Yezierski, Ellen J.

    2014-01-01

    The recent revisions to the advanced placement (AP) chemistry curriculum promote deep conceptual understanding of chemistry content over more rote memorization of facts and algorithmic problem solving. For many teachers, this will mean moving away from traditional worksheets and verification lab activities that they have used to address the vast…

  4. Human-Computer Interaction: A Journal of Theoretical, Empirical and Methodological Issues of User Science and of System Design. Volume 7, Number 1

    DTIC Science & Technology

    1992-01-01

    Editorial board excerpt: Norman, University of California, San Diego, CA; Dan R. Olsen, Jr., Brigham...; Peter G. Polson, University of Colorado, Boulder, CO; James R. Rhyne, IBM T. J. Watson. Content fragment: ...and artificial intelligence, among which are: reasoning about concurrent systems, including program verification (Barringer, 1985), operating...

  5. SMAP Verification and Validation Project - Final Report

    NASA Technical Reports Server (NTRS)

    Murry, Michael

    2012-01-01

    In 2007, the National Research Council (NRC) released the Decadal Survey of Earth science. For the coming decade, the survey identified 15 new space missions of significant scientific and application value for the National Aeronautics and Space Administration (NASA) to undertake. One of these missions was the Soil Moisture Active Passive (SMAP) mission, which NASA assigned to the Jet Propulsion Laboratory (JPL) in 2008. The goal of SMAP is to provide global, high-resolution mapping of soil moisture and its freeze/thaw states. The SMAP project recently passed its Critical Design Review and is proceeding with its fabrication and testing phase. Verification and Validation (V&V) is widely recognized as a critical component in system engineering and is vital to the success of any space mission. V&V is a process that is used to check that a system meets its design requirements and specifications in order to fulfill its intended purpose. Verification often refers to the question "Have we built the system right?" whereas Validation asks "Have we built the right system?" Currently the SMAP V&V team is verifying design requirements through inspection, demonstration, analysis, or testing. An example of the SMAP V&V process is the verification of the antenna pointing accuracy with mathematical models, since it is not possible to provide the appropriate micro-gravity environment for testing the antenna on Earth before launch.

  6. Biomarker Discovery and Verification of Esophageal Squamous Cell Carcinoma Using Integration of SWATH/MRM.

    PubMed

    Hou, Guixue; Lou, Xiaomin; Sun, Yulin; Xu, Shaohang; Zi, Jin; Wang, Quanhui; Zhou, Baojin; Han, Bo; Wu, Lin; Zhao, Xiaohang; Lin, Liang; Liu, Siqi

    2015-09-04

    We propose an efficient integration of SWATH with MRM for biomarker discovery and verification when the corresponding ion library is well established. We strictly controlled the false positive rate associated with SWATH MS signals and carefully selected the target peptides coupled with SWATH and MRM. We collected 10 samples of esophageal squamous cell carcinoma (ESCC) tissues paired with tumors and adjacent regions and quantified 1758 unique proteins with FDR 1% at protein level using SWATH, in which 467 proteins were abundance-dependent with ESCC. After carefully evaluating the SWATH MS signals of the up-regulated proteins, we selected 120 proteins for MRM verification. MRM analysis of the pooled and individual esophageal tissues resulted in 116 proteins that exhibited similar abundance response modes to ESCC that were acquired with SWATH. Because the ESCC-related proteins consisted of a high percentile of secreted proteins, we conducted the MRM assay on patient sera that were collected from pre- and postoperation. Of the 116 target proteins, 42 were identified in the ESCC sera, including 11 with lowered abundances postoperation. Coupling SWATH and MRM is thus feasible and efficient for the discovery and verification of cancer-related protein biomarkers.

  7. Development of CFC-Free Cleaning Processes at the NASA White Sands Test Facility

    NASA Technical Reports Server (NTRS)

    Beeson, Harold; Kirsch, Mike; Hornung, Steven; Biesinger, Paul

    1995-01-01

    The NASA White Sands Test Facility (WSTF) is developing cleaning and verification processes to replace currently used chlorofluorocarbon-113- (CFC-113-) based processes. The processes being evaluated include both aqueous- and solvent-based techniques. The presentation will include the findings of investigations of aqueous cleaning and verification processes that are based on a draft of a proposed NASA Kennedy Space Center (KSC) cleaning procedure. Verification testing with known contaminants, such as hydraulic fluid and commonly used oils, established correlations between nonvolatile residue and CFC-113. Recoveries ranged from 35 to 60 percent of theoretical. WSTF is also investigating enhancements to aqueous sampling for organics and particulates. Although aqueous alternatives have been identified for several processes, a need still exists for nonaqueous solvent cleaning, such as the cleaning and cleanliness verification of gauges used for oxygen service. The cleaning effectiveness of tetrachloroethylene (PCE), trichloroethylene (TCE), ethanol, hydrochlorofluorocarbon-225 (HCFC-225), tert-butylmethylether, and n-Hexane was evaluated using aerospace gauges and precision instruments and then compared to the cleaning effectiveness of CFC-113. Solvents considered for use in oxygen systems were also tested for oxygen compatibility using high-pressure oxygen autoignition and liquid oxygen mechanical impact testing.

  8. Model-based system engineering approach for the Euclid mission to manage scientific and technical complexity

    NASA Astrophysics Data System (ADS)

    Lorenzo Alvarez, Jose; Metselaar, Harold; Amiaux, Jerome; Saavedra Criado, Gonzalo; Gaspar Venancio, Luis M.; Salvignol, Jean-Christophe; Laureijs, René J.; Vavrek, Roland

    2016-08-01

    In recent years, the system engineering field has been coming to terms with a paradigm change in the approach to complexity management. Different strategies have been proposed to cope with highly interrelated systems: system-of-systems and collaborative system engineering approaches have emerged, and a significant effort is being invested in standardization and ontology definition. In particular, Model Based System Engineering (MBSE) intends to introduce methodologies for systematic system definition, development, validation, deployment, operation and decommissioning, based on logical and visual relationship mapping rather than traditional 'document-based' information management. The practical implementation in real large-scale projects is not uniform across fields. In space science missions, the usage has been limited to subsystems or sample projects, with modeling being performed 'a posteriori' in many instances. The main hurdle for the introduction of MBSE practices in new projects is still the difficulty of demonstrating their added value to a project and whether their benefit is commensurate with the level of effort required to put them in place. In this paper we present the Euclid system modeling activities as implemented, and an analysis of the benefits and limitations identified, in particular to support requirement break-down and allocation and verification planning at mission level.

  9. Highway noise measurements for verification of prediction models

    DOT National Transportation Integrated Search

    1978-01-01

    Accurate prediction of highway noise has been a major problem for state highway departments. Many noise models have been proposed to alleviate this problem. Results contained in this report will be used to analyze some of these models, and to determi...

  10. Phase II NCAT test track results.

    DOT National Transportation Integrated Search

    2006-12-01

    There is a need to be able to quickly test materials and mixtures in-place, under real traffic. There have been many developments during the last few years that need verification prior to adoption. One such development is the new proposed mechani...

  11. Data requirements for verification of ram glow chemistry

    NASA Technical Reports Server (NTRS)

    Swenson, G. R.; Mende, S. B.

    1985-01-01

    A set of questions is posed regarding the surface chemistry producing the ram glow on the space shuttle. The questions concern verification of the chemical cycle involved in the physical processes leading to the glow. The questions, and a matrix of measurements required for most answers, are presented. The measurements include knowledge of the flux composition to and from a ram surface as well as spectroscopic signatures from the UV through the visible to the IR. A pallet set of experiments proposed to accomplish the measurements is discussed. An interim experiment involving an available infrared instrument to be operated from the shuttle Orbiter cabin is also discussed.

  12. Biometrics, identification and surveillance.

    PubMed

    Lyon, David

    2008-11-01

    Governing by identity describes the emerging regime of a globalizing, mobile world. Governance depends on identification but identification increasingly depends on biometrics. This 'solution' to difficulties of verification is described and some technical weaknesses are discussed. The role of biometrics in classification systems is also considered and is shown to contain possible prejudice in relation to racialized criteria of identity. Lastly, the culture of biometric identification is shown to be limited to abstract data, artificially separated from the lived experience of the body including the orientation to others. It is proposed that creators of national ID systems in particular address these crucial deficiencies in their attempt to provide new modes of verification.

  13. Compendium of Arms Control Verification Proposals.

    DTIC Science & Technology

    1982-03-01

    proposals. This appears to be the case for virtually all kinds of prospective arms control topics from general and complete disarmament to control of specific... virtually anywhere. Unrestricted access would be particularly necessary in the case of states which previously had nuclear weapons. Practically speaking...been transferred out of the area, it would be harder to check on remaining stocks without some intrusive inspection and it would be virtually impossible

  14. Teichmuller Space Resolution of the EPR Paradox

    NASA Astrophysics Data System (ADS)

    Winterberg, Friedwardt

    2013-04-01

    The mystery of Newton's action-at-a-distance law of gravity was resolved by Einstein with Riemann's non-Euclidean geometry, which permitted the explanation of the departure from Newton's law for the motion of Mercury. It is here proposed that the similarly mysterious non-local EPR-type quantum correlations may be explained by a Teichmuller space geometry below the Planck length, for which an experiment for its verification is proposed.

  15. Comparing the Effectiveness of Verification and Inquiry Laboratories in Supporting Undergraduate Science Students in Constructing Arguments around Socioscientific Issues

    ERIC Educational Resources Information Center

    Grooms, Jonathon; Sampson, Victor; Golden, Barry

    2014-01-01

    This quasi-experimental study uses a pre-/post-intervention approach to investigate the quality of undergraduate students' arguments in the context of socioscientific issues (SSI) based on experiencing a semester of traditional "cookbook" instruction (N = 79) or a semester of argument-based instruction (N = 73) in the context of an…

  16. Demonstration and Verification of a Broad Spectrum Anomalous Dispersion Effects Tool for Index of Refraction and Optical Turbulence Calculations

    DTIC Science & Technology

    2009-03-01


  17. Background Registration-Based Adaptive Noise Filtering of LWIR/MWIR Imaging Sensors for UAV Applications

    PubMed Central

    Kim, Byeong Hak; Kim, Min Young; Chae, You Seong

    2017-01-01

    Unmanned aerial vehicles (UAVs) are equipped with optical systems including an infrared (IR) camera such as electro-optical IR (EO/IR), target acquisition and designation sights (TADS), or forward looking IR (FLIR). However, images obtained from IR cameras are subject to noise such as dead pixels, lines, and fixed pattern noise. Nonuniformity correction (NUC) is a widely employed method to reduce noise in IR images, but it has limitations in removing noise that occurs during operation. Methods have been proposed to overcome the limitations of the NUC method, such as two-point correction (TPC) and scene-based NUC (SBNUC). However, these methods still suffer from unfixed pattern noise. In this paper, a background registration-based adaptive noise filtering (BRANF) method is proposed to overcome the limitations of conventional methods. The proposed BRANF method utilizes background registration processing and robust principal component analysis (RPCA). In addition, image quality verification methods are proposed that can measure the noise filtering performance quantitatively without ground truth images. Experiments were performed for performance verification with middle wave infrared (MWIR) and long wave infrared (LWIR) images obtained from practical military optical systems. As a result, it is found that the image quality improvement rate of BRANF is 30% higher than that of conventional NUC. PMID:29280970
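
    The core of RPCA as used above is the decomposition of a stack of registered frames into a low-rank background plus a sparse component; the sketch below implements the standard principal component pursuit iteration (singular value thresholding plus soft thresholding) on synthetic data, which is a simplified stand-in for the BRANF pipeline rather than the paper's method.

        import numpy as np

        def soft(x, tau):
            """Element-wise soft thresholding."""
            return np.sign(x) * np.maximum(np.abs(x) - tau, 0.0)

        def rpca(D, lam=None, iters=100, tol=1e-7):
            """Principal component pursuit via inexact ALM: D ~ L (low rank) + S (sparse)."""
            m, n = D.shape
            lam = lam or 1.0 / np.sqrt(max(m, n))
            norm2 = np.linalg.norm(D, 2)
            Y = D / max(norm2, np.abs(D).max() / lam)
            S = np.zeros_like(D)
            mu, rho = 1.25 / norm2, 1.5
            for _ in range(iters):
                U, sig, Vt = np.linalg.svd(D - S + Y / mu, full_matrices=False)
                L = (U * soft(sig, 1.0 / mu)) @ Vt      # singular value thresholding
                S = soft(D - L + Y / mu, lam / mu)
                resid = D - L - S
                Y += mu * resid
                mu *= rho
                if np.linalg.norm(resid, "fro") <= tol * np.linalg.norm(D, "fro"):
                    break
            return L, S

        # Frames stacked as columns: a static (rank-1) background plus sparse outliers.
        rng = np.random.default_rng(0)
        background = np.outer(rng.random(100), np.ones(30))
        outliers = (rng.random((100, 30)) < 0.02) * 5.0
        L, S = rpca(background + outliers)
        print(np.linalg.matrix_rank(L, tol=1e-3), int((np.abs(S) > 1.0).sum()))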

  18. Background Registration-Based Adaptive Noise Filtering of LWIR/MWIR Imaging Sensors for UAV Applications.

    PubMed

    Kim, Byeong Hak; Kim, Min Young; Chae, You Seong

    2017-12-27

    Unmanned aerial vehicles (UAVs) are equipped with optical systems including an infrared (IR) camera such as electro-optical IR (EO/IR), target acquisition and designation sights (TADS), or forward looking IR (FLIR). However, images obtained from IR cameras are subject to noise such as dead pixels, lines, and fixed pattern noise. Nonuniformity correction (NUC) is a widely employed method to reduce noise in IR images, but it has limitations in removing noise that occurs during operation. Methods have been proposed to overcome the limitations of the NUC method, such as two-point correction (TPC) and scene-based NUC (SBNUC). However, these methods still suffer from unfixed pattern noise. In this paper, a background registration-based adaptive noise filtering (BRANF) method is proposed to overcome the limitations of conventional methods. The proposed BRANF method utilizes background registration processing and robust principal component analysis (RPCA). In addition, image quality verification methods are proposed that can measure the noise filtering performance quantitatively without ground truth images. Experiments were performed for performance verification with middle wave infrared (MWIR) and long wave infrared (LWIR) images obtained from practical military optical systems. As a result, it is found that the image quality improvement rate of BRANF is 30% higher than that of conventional NUC.

  19. Applications of HCMM satellite data to the study of urban heating patterns

    NASA Technical Reports Server (NTRS)

    Carlson, T. N. (Principal Investigator)

    1980-01-01

    A research summary is presented and is divided into two major areas, one developmental and the other basic science. In the first, three sub-categories are discussed: image processing techniques, especially the method whereby surface temperature images are converted to images of surface energy budget, moisture availability, and thermal inertia; model development; and model verification. Basic science includes the use of a method to further the understanding of the urban heat island and anthropogenic modification of the surface heating, evaporation over vegetated surfaces, and the effect of surface heat flux on plume spread.

  20. KSC-02pd1949

    NASA Image and Video Library

    2002-12-17

    KENNEDY SPACE CENTER, FLA. -- Workers at the Cape Canaveral Air Force Station Skid Strip get ready to remove the Pegasus XL Expendable Launch Vehicle attached underneath the Orbital Sciences L-1011 aircraft. The Pegasus will be transported to the Multi-Payload Processing Facility for testing and verification. The Pegasus will undergo three flight simulations prior to its scheduled launch in late January 2003. The Pegasus XL will carry NASA's Solar Radiation and Climate Experiment (SORCE) into orbit. Built by Orbital Sciences Space Systems Group, SORCE will study and measure solar irradiance as a source of energy in the Earth's atmosphere.

  1. A Study of the Access to the Scholarly Record from a Hospital Health Science Core Collection *

    PubMed Central

    Williams, James F.; Pings, Vern M.

    1973-01-01

    This study is an effort to determine possible service performance levels in hospital libraries based on access to the scholarly record of medicine through selected lists of clinical journals and indexing and abstracting journals. The study was designed to test a methodology as well as to provide data for planning and management decisions for health science libraries. Findings and conclusions cover the value of a core collection of journals, length of journal files, performance of certain bibliographic instruments in citation verification, and the implications of study data for library planning and management. PMID:4744345

  2. Adherence to balance tolerance limits at the Upper Mississippi Science Center, La Crosse, Wisconsin.

    USGS Publications Warehouse

    Myers, C.T.; Kennedy, D.M.

    1998-01-01

    Verification of balance accuracy entails applying a series of standard masses to a balance prior to use and recording the measured values. The recorded values for each standard should have lower and upper weight limits or tolerances that are accepted as verification of balance accuracy under normal operating conditions. Balance logbooks for seven analytical balances at the Upper Mississippi Science Center were checked over a 3.5-year period to determine if the recorded weights were within the established tolerance limits. A total of 9435 measurements were checked. There were 14 instances in which the balance malfunctioned and operators recorded a rationale in the balance logbook. Sixty-three recording errors were found. Twenty-eight operators were responsible for two types of recording errors: Measurements of weights were recorded outside of the tolerance limit but not acknowledged as an error by the operator (n = 40); and measurements were recorded with the wrong number of decimal places (n = 23). The adherence rate for following tolerance limits was 99.3%. To ensure the continued adherence to tolerance limits, the quality-assurance unit revised standard operating procedures to require more frequent review of balance logbooks.
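
    The logbook audit described above amounts to checking each recorded weight against its standard's tolerance band and counting the two observed error types; a small sketch with hypothetical tolerance limits follows.

        def audit_logbook(entries, tolerances, decimals=4):
            """Count measurements outside tolerance and entries recorded with the wrong
            number of decimal places, mirroring the two recording-error types."""
            out_of_tolerance, wrong_decimals = [], []
            for standard, recorded in entries:              # recorded value kept as a string
                lo, hi = tolerances[standard]
                value = float(recorded)
                if not (lo <= value <= hi):
                    out_of_tolerance.append((standard, value))
                if len(recorded.split(".")[-1]) != decimals:
                    wrong_decimals.append((standard, recorded))
            return out_of_tolerance, wrong_decimals

        tolerances = {"1g": (0.9999, 1.0001), "10g": (9.9995, 10.0005)}   # hypothetical limits
        entries = [("1g", "1.0000"), ("10g", "10.0012"), ("1g", "1.000")]
        print(audit_logbook(entries, tolerances))
        # ([('10g', 10.0012)], [('1g', '1.000')])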

  3. CMB lensing tomography with the DES Science Verification galaxies

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Giannantonio, T.

    We measure the cross-correlation between the galaxy density in the Dark Energy Survey (DES) Science Verification data and the lensing of the cosmic microwave background (CMB) as reconstructed with the Planck satellite and the South Pole Telescope (SPT). When using the DES main galaxy sample over the full redshift range 0.2 < z_phot < 1.2, a cross-correlation signal is detected at 6σ and 4σ with SPT and Planck respectively. We then divide the DES galaxies into five photometric redshift bins, finding significant (>2σ) detections in all bins. Comparing to the fiducial Planck cosmology, we find the redshift evolution of the signal matches expectations, although the amplitude is consistently lower than predicted across redshift bins. We test for possible systematics that could affect our result and find no evidence for significant contamination. Finally, we demonstrate how these measurements can be used to constrain the growth of structure across cosmic time. We find the data are fit by a model in which the amplitude of structure in the z < 1.2 universe is 0.73 ± 0.16 times as large as predicted in the LCDM Planck cosmology, a 1.7σ deviation.

  4. CMB lensing tomography with the DES Science Verification galaxies

    DOE PAGES

    Giannantonio, T.

    2016-01-07

    We measure the cross-correlation between the galaxy density in the Dark Energy Survey (DES) Science Verification data and the lensing of the cosmic microwave background (CMB) as reconstructed with the Planck satellite and the South Pole Telescope (SPT). When using the DES main galaxy sample over the full redshift range 0.2 < z_phot < 1.2, a cross-correlation signal is detected at 6σ and 4σ with SPT and Planck respectively. We then divide the DES galaxies into five photometric redshift bins, finding significant (>2σ) detections in all bins. Comparing to the fiducial Planck cosmology, we find the redshift evolution of the signal matches expectations, although the amplitude is consistently lower than predicted across redshift bins. We test for possible systematics that could affect our result and find no evidence for significant contamination. Finally, we demonstrate how these measurements can be used to constrain the growth of structure across cosmic time. We find the data are fit by a model in which the amplitude of structure in the z < 1.2 universe is 0.73 ± 0.16 times as large as predicted in the LCDM Planck cosmology, a 1.7σ deviation.

  5. Towards Test Driven Development for Computational Science with pFUnit

    NASA Technical Reports Server (NTRS)

    Rilee, Michael L.; Clune, Thomas L.

    2014-01-01

    Developers working in Computational Science & Engineering (CSE)/High Performance Computing (HPC) must contend with constant change due to advances in computing technology and science. Test Driven Development (TDD) is a methodology that mitigates software development risks due to change at the cost of adding comprehensive and continuous testing to the development process. Testing frameworks tailored for CSE/HPC, like pFUnit, can lower the barriers to such testing, yet CSE software faces unique constraints foreign to the broader software engineering community. Effective testing of numerical software requires a comprehensive suite of oracles, i.e., use cases with known answers, as well as robust estimates for the unavoidable numerical errors associated with implementation in finite-precision arithmetic. At first glance these concerns often seem exceedingly challenging or even insurmountable for real-world scientific applications. However, we argue that this common perception is incorrect and driven by (1) a conflation between model validation and software verification and (2) the general tendency in the scientific community to develop relatively coarse-grained, large procedures that compound numerous algorithmic steps. We believe TDD can be applied routinely to numerical software if developers pursue fine-grained implementations that permit testing, neatly side-stepping concerns about needing nontrivial oracles as well as the accumulation of errors. We present an example of a successful, complex legacy CSE/HPC code whose development process shares some aspects with TDD, which we contrast with current and potential capabilities. A mix of our proposed methodology and framework support should enable everyday use of TDD by CSE-expert developers.
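
    To make the fine-grained testing style argued for above concrete, the sketch below shows a tolerance-based unit test of a single small numerical kernel against an analytic oracle. It is written with Python's unittest purely for illustration rather than Fortran/pFUnit, and the trapezoid kernel and error bounds are assumptions of this sketch, not part of the cited work.

      import math
      import unittest

      def trapezoid_integrate(f, a, b, n):
          """Composite trapezoid rule on n uniform subintervals (illustrative kernel)."""
          h = (b - a) / n
          total = 0.5 * (f(a) + f(b))
          for i in range(1, n):
              total += f(a + i * h)
          return h * total

      class TestTrapezoidIntegrate(unittest.TestCase):
          def test_against_analytic_oracle(self):
              # Oracle: the integral of sin(x) over [0, pi] is exactly 2.
              approx = trapezoid_integrate(math.sin, 0.0, math.pi, 1000)
              # Trapezoid error is O(h^2); the bound is generous for n = 1000.
              self.assertAlmostEqual(approx, 2.0, delta=1e-5)

          def test_exact_for_linear_integrand(self):
              # The rule is exact (up to round-off) for linear integrands.
              approx = trapezoid_integrate(lambda x: 3.0 * x + 1.0, 0.0, 2.0, 10)
              self.assertAlmostEqual(approx, 8.0, places=12)

      if __name__ == "__main__":
          unittest.main()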

  6. 76 FR 59109 - Agency Information Collection Activities: Proposed Collection; Comment Request-Supplemental...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-09-23

    ... Nutrition Assistance Program Prisoner and Death Match Requirements AGENCY: Food and Nutrition Service (FNS.... SUPPLEMENTARY INFORMATION: Title: Supplemental Nutrition Assistance Program Prisoner and Death Match... verification and death matching procedures as mandated by legislation and previously implemented through agency...

  7. 76 FR 338 - Notice of Proposed Information Collection Requests

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-01-04

    ... information technology. Dated: December 28, 2010. James Hyler, Acting Director, Information Collection... Program (TQE) Scholarship Contract and Teaching Verification Forms on Scholarship Recipients. OMB Control... service obligation to teach in a high-need school in a high-need Local Educational Agency. This...

  8. Computer science: Key to a space program renaissance. The 1981 NASA/ASEE summer study on the use of computer science and technology in NASA. Volume 2: Appendices

    NASA Technical Reports Server (NTRS)

    Freitas, R. A., Jr. (Editor); Carlson, P. A. (Editor)

    1983-01-01

    Adoption of an aggressive computer science research and technology program within NASA will: (1) enable new mission capabilities such as autonomous spacecraft, reliability and self-repair, and low-bandwidth intelligent Earth sensing; (2) lower manpower requirements, especially in the areas of Space Shuttle operations, by making fuller use of control center automation, technical support, and internal utilization of state-of-the-art computer techniques; (3) reduce project costs via improved software verification, software engineering, enhanced scientist/engineer productivity, and increased managerial effectiveness; and (4) significantly improve internal operations within NASA with electronic mail, managerial computer aids, an automated bureaucracy and uniform program operating plans.

  9. Results of the life sciences DSOs conducted aboard the space shuttle 1981-1986

    NASA Technical Reports Server (NTRS)

    Bungo, Michael W.; Bagian, Tandi M.; Bowman, Mark A.; Levitan, Barry M.

    1987-01-01

    Results are presented for a number of life sciences investigations sponsored by the Space Biomedical Research Institute at the NASA Lyndon B. Johnson Space Center and conducted as Detailed Supplementary Objectives (DSOs) on Space Shuttle flights between 1981 and 1986. An introduction and a description of the DSO program are followed by summary reports on the investigations. Reports are grouped into the following disciplines: Biochemistry and Pharmacology, Cardiovascular Effects and Fluid Shifts, Equipment Testing and Experiment Verification, Microbiology, Space Motion Sickness, and Vision. In the appendix, the status of every medical/life science DSO is presented in graphical form, which enables the flight history, the number of subjects tested, and the experiment results to be reviewed at a glance.

  10. SU-E-T-278: Realization of Dose Verification Tool for IMRT Plan Based On DPM

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cai, Jinfeng; Cao, Ruifen; Dai, Yumei

    Purpose: To build a Monte Carlo dose verification tool for IMRT plans by implementing an irradiation source model into the DPM code, and to extend the ability of DPM to calculate arbitrary incident angles and irregular-inhomogeneous fields. Methods: With the virtual source and the energy spectrum unfolded from the accelerator measurement data, combined with optimized intensity maps, the dose distribution of the irradiated irregular-inhomogeneous field was calculated. The irradiation source model of the accelerator was substituted by a grid-based surface source. The contour and the intensity distribution of the surface source were optimized by the ARTS (Accurate/Advanced Radiotherapy System) optimization module based on the tumor configuration. The weight of the emitter was decided by the grid intensity. The direction of the emitter was decided by the combination of the virtual source and the emitter emitting position. The photon energy spectrum unfolded from the accelerator measurement data was adjusted by compensating the contaminated electron source. For verification, measured data and a realistic clinical IMRT plan were compared with the DPM dose calculation. Results: The regular field was verified by comparison with the measured data; the differences were acceptable (<2% inside the field, 2–3 mm in the penumbra). The dose calculation of the irregular field by DPM simulation was also compared with that of FSPB (Finite Size Pencil Beam), and the passing rate of gamma analysis was 95.1% for a peripheral lung cancer case. The regular field and the irregular rotational field were all within the range of permitted error. The computing time for regular fields was less than 2 h, and the peripheral lung cancer test took 160 min. Through parallel processing, the adapted DPM could complete the calculation of an IMRT plan within half an hour. Conclusion: The adapted parallelized DPM code with the irradiation source model is faster than classic Monte Carlo codes. Its computational accuracy and speed satisfy clinical requirements, and it is expected to become a Monte Carlo dose verification tool for IMRT plans. Strategic Priority Research Program of the Chinese Academy of Sciences (XDA03040000); National Natural Science Foundation of China (81101132).
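
    The gamma analysis quoted above combines a dose-difference and a distance-to-agreement criterion. As a generic, hedged illustration (a one-dimensional global 2%/3 mm gamma on synthetic profiles, not the DPM/ARTS implementation), a brute-force sketch in Python might look as follows.

      import numpy as np

      def gamma_pass_rate(x, dose_ref, dose_eval, dd=0.02, dta=3.0, cutoff=0.1):
          """Global 1-D gamma analysis: dd is the relative dose criterion, dta the
          distance criterion (same units as x), cutoff the low-dose threshold."""
          d_max = dose_ref.max()
          gammas = []
          for xi, di in zip(x, dose_ref):
              if di < cutoff * d_max:
                  continue  # ignore low-dose points, as is common practice
              dist2 = ((x - xi) / dta) ** 2
              dose2 = ((dose_eval - di) / (dd * d_max)) ** 2
              gammas.append(np.sqrt(np.min(dist2 + dose2)))
          gammas = np.array(gammas)
          return 100.0 * np.mean(gammas <= 1.0)

      # Illustrative profiles: a reference Gaussian "field" and a slightly shifted copy.
      x = np.linspace(-50.0, 50.0, 501)              # position in mm
      ref = np.exp(-(x / 20.0) ** 2)
      ev = np.exp(-((x - 1.0) / 20.0) ** 2)          # evaluated profile, shifted 1 mm
      print(f"gamma passing rate: {gamma_pass_rate(x, ref, ev):.1f}%")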

  11. WFIRST: Update on the Coronagraph Science Requirements

    NASA Astrophysics Data System (ADS)

    Douglas, Ewan S.; Cahoy, Kerri; Carlton, Ashley; Macintosh, Bruce; Turnbull, Margaret; Kasdin, Jeremy; WFIRST Coronagraph Science Investigation Teams

    2018-01-01

    The WFIRST Coronagraph instrument (CGI) will enable direct imaging and low resolution spectroscopy of exoplanets in reflected light and imaging polarimetry of circumstellar disks. The CGI science investigation teams were tasked with developing a set of science requirements which advance our knowledge of exoplanet occurrence and atmospheric composition, as well as the composition and morphology of exozodiacal debris disks, cold Kuiper Belt analogs, and protoplanetary systems. We present the initial content, rationales, validation, and verification plans for the WFIRST CGI, informed by detailed and still-evolving instrument and observatory performance models. We also discuss our approach to the requirements development and management process, including the collection and organization of science inputs, open source approach to managing the requirements database, and the range of models used for requirements validation. These tools can be applied to requirements development processes for other astrophysical space missions, and may ease their management and maintenance. These WFIRST CGI science requirements allow the community to learn about and provide insights and feedback on the expected instrument performance and science return.

  12. Towards Verification of Operational Procedures Using Auto-Generated Diagnostic Trees

    NASA Technical Reports Server (NTRS)

    Kurtoglu, Tolga; Lutz, Robyn; Patterson-Hine, Ann

    2009-01-01

    The design, development, and operation of complex space, lunar and planetary exploration systems require the development of general procedures that describe a detailed set of instructions capturing how mission tasks are performed. For both crewed and uncrewed NASA systems, mission safety and the accomplishment of the scientific mission objectives are highly dependent on the correctness of procedures. In this paper, we describe how to use the auto-generated diagnostic trees from existing diagnostic models to improve the verification of standard operating procedures. Specifically, we introduce a systematic method, namely the Diagnostic Tree for Verification (DTV), developed with the goal of leveraging the information contained within auto-generated diagnostic trees in order to check the correctness of procedures, to streamline the procedures in terms of reducing the number of steps or use of resources in them, and to propose alternative procedural steps adaptive to changing operational conditions. The application of the DTV method to a spacecraft electrical power system shows the feasibility of the approach and its range of capabilities.

  13. Verification of micro-scale photogrammetry for smooth three-dimensional object measurement

    NASA Astrophysics Data System (ADS)

    Sims-Waterhouse, Danny; Piano, Samanta; Leach, Richard

    2017-05-01

    By using sub-millimetre laser speckle pattern projection we show that photogrammetry systems are able to measure smooth three-dimensional objects with surface height deviations less than 1 μm. The projection of laser speckle patterns allows correspondences on the surface of smooth spheres to be found, and as a result, verification artefacts with low surface height deviations were measured. A combination of VDI/VDE and ISO standards was also utilised to provide a complete verification method and to determine the quality parameters for the system under test. Using the proposed method applied to a photogrammetry system, a 5 mm radius sphere was measured with an expanded uncertainty of 8.5 μm for sizing errors, and 16.6 μm for form errors with a 95 % confidence interval. Sphere spacing lengths between 6 mm and 10 mm were also measured by the photogrammetry system, and were found to have expanded uncertainties of around 20 μm with a 95 % confidence interval.
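
    As a rough, assumption-laden illustration of how a sphere radius and an expanded uncertainty with coverage factor k = 2 (about 95 % confidence) might be derived from photogrammetric point data, the Python sketch below fits a sphere to synthetic noisy points by linear least squares; the data, the fit and the simple uncertainty treatment are stand-ins, not the VDI/VDE and ISO procedure applied in the paper.

      import numpy as np

      def fit_sphere(points):
          """Linear least-squares sphere fit: |p|^2 = 2 p.c + (r^2 - |c|^2)."""
          A = np.column_stack((2.0 * points, np.ones(len(points))))
          b = np.sum(points ** 2, axis=1)
          w, *_ = np.linalg.lstsq(A, b, rcond=None)
          centre = w[:3]
          radius = np.sqrt(w[3] + centre @ centre)
          return centre, radius

      rng = np.random.default_rng(1)
      # Synthetic "measured" points on a 5 mm radius sphere with ~2 um form noise.
      n = 2000
      v = rng.normal(size=(n, 3))
      v /= np.linalg.norm(v, axis=1, keepdims=True)
      pts = 5.0 * v + rng.normal(scale=0.002, size=(n, 3))   # units: mm

      centre, radius = fit_sphere(pts)
      residuals = np.linalg.norm(pts - centre, axis=1) - radius
      # Very simple standard uncertainty of the fitted radius (4 fitted parameters).
      u_radius = residuals.std(ddof=4) / np.sqrt(n)
      print(f"radius = {radius:.4f} mm, U(k=2) = {2.0 * 1000.0 * u_radius:.2f} um")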

  14. Verification and Validation of KBS with Neural Network Components

    NASA Technical Reports Server (NTRS)

    Wen, Wu; Callahan, John

    1996-01-01

    Artificial Neural Networks (ANNs) play an important role in developing robust Knowledge Based Systems (KBS). The ANN based components used in these systems learn to give appropriate predictions through training with correct input-output data patterns. Unlike a traditional KBS that depends on a rule database and a production engine, an ANN based system mimics the decisions of an expert without specifically formulating if-then type rules. In fact, ANNs demonstrate their superiority when such if-then type rules are hard for a human expert to generate. Verification of a traditional knowledge based system is based on proof of consistency and completeness of the rule knowledge base and correctness of the production engine. These techniques, however, cannot be directly applied to ANN based components. In this position paper, we propose a verification and validation procedure for KBS with ANN based components. The essence of the procedure is to obtain an accurate system specification through incremental modification of the specifications using an ANN rule extraction algorithm.

  15. 78 FR 3893 - Enbridge Energy, Limited Partnership; Notice of Technical Conference

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-01-17

    ... DEPARTMENT OF ENERGY Federal Energy Regulatory Commission [Docket No. IS13-17-000] Enbridge Energy... Energy, Limited Partnership's proposed revision to its downstream Nomination Verification Procedure. \\1\\ Enbridge Energy, Limited Partnership, 141 FERC ] 61,246 (2012). Take notice that a technical conference...

  16. Field verification of geogrid properties for base course reinforcement applications : final report.

    DOT National Transportation Integrated Search

    2013-11-01

    The proposed field study is a continuation of a recently concluded, ODOT-funded project titled: Development of ODOT Guidelines for the Use of Geogrids in Aggregate Bases, which is aimed at addressing the need for improved guidelines for base reinforc...

  17. Calibration of region-specific gates pile driving formula for LRFD : final report 561.

    DOT National Transportation Integrated Search

    2016-05-01

    This research project proposes new DOTD pile driving formulas for pile capacity verification using pile driving blow counts obtained at either end-of-initial driving (EOID) or at the beginning-of-restrike (BOR). The pile driving formulas were dev...

  18. A neural-visualization IDS for honeynet data.

    PubMed

    Herrero, Álvaro; Zurutuza, Urko; Corchado, Emilio

    2012-04-01

    Neural intelligent systems can provide a visualization of the network traffic for security staff, in order to reduce the widely known high false-positive rate associated with misuse-based Intrusion Detection Systems (IDSs). Unlike previous work, this study proposes unsupervised neural models that generate an intuitive visualization of the captured traffic, rather than network statistics. These snapshots of network events are immensely useful for security personnel that monitor network behavior. The system is based on the use of different neural projection and unsupervised methods for the visual inspection of honeypot data, and may be seen as a complementary network security tool that sheds light on internal data structures through visual inspection of the traffic itself. Furthermore, it is intended to facilitate verification and assessment of Snort performance (a well-known and widely-used misuse-based IDS), through the visualization of attack patterns. Empirical verification and comparison of the proposed projection methods are performed in a real domain, where two different case studies are defined and analyzed.
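
    To give a flavour of projecting traffic features to two dimensions for visual inspection, the sketch below uses ordinary PCA from scikit-learn as a stand-in for the neural projection models used in the study; the per-packet features and the synthetic "scan" cluster are invented for illustration only.

      import numpy as np
      import matplotlib.pyplot as plt
      from sklearn.decomposition import PCA
      from sklearn.preprocessing import StandardScaler

      rng = np.random.default_rng(3)
      # Invented per-packet features: [duration, bytes, dst_port, packets, flag_count]
      normal = rng.normal(loc=[1.0, 500, 80, 10, 2],
                          scale=[0.5, 200, 5, 3, 1], size=(300, 5))
      scan = rng.normal(loc=[0.05, 60, 3000, 1, 1],
                        scale=[0.02, 10, 900, 0.5, 0.5], size=(40, 5))
      traffic = np.vstack([normal, scan])

      # Unsupervised 2-D projection of the scaled features for visual inspection.
      embedded = PCA(n_components=2).fit_transform(StandardScaler().fit_transform(traffic))
      plt.scatter(embedded[:, 0], embedded[:, 1], s=8)
      plt.title("2-D projection of honeypot traffic features")
      plt.show()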

  19. Experimental Validation of L1 Adaptive Control: Rohrs' Counterexample in Flight

    NASA Technical Reports Server (NTRS)

    Xargay, Enric; Hovakimyan, Naira; Dobrokhodov, Vladimir; Kaminer, Issac; Kitsios, Ioannis; Cao, Chengyu; Gregory, Irene M.; Valavani, Lena

    2010-01-01

    The paper presents new results on the verification and in-flight validation of an L1 adaptive flight control system, and proposes a general methodology for verification and validation of adaptive flight control algorithms. The proposed framework is based on Rohrs counterexample, a benchmark problem presented in the early 80s to show the limitations of adaptive controllers developed at that time. In this paper, the framework is used to evaluate the performance and robustness characteristics of an L1 adaptive control augmentation loop implemented onboard a small unmanned aerial vehicle. Hardware-in-the-loop simulations and flight test results confirm the ability of the L1 adaptive controller to maintain stability and predictable performance of the closed loop adaptive system in the presence of general (artificially injected) unmodeled dynamics. The results demonstrate the advantages of L1 adaptive control as a verifiable robust adaptive control architecture with the potential of reducing flight control design costs and facilitating the transition of adaptive control into advanced flight control systems.

  20. Verification of hypergraph states

    NASA Astrophysics Data System (ADS)

    Morimae, Tomoyuki; Takeuchi, Yuki; Hayashi, Masahito

    2017-12-01

    Hypergraph states are generalizations of graph states where controlled-Z gates on edges are replaced with generalized controlled-Z gates on hyperedges. Hypergraph states have several advantages over graph states. For example, certain hypergraph states, such as the Union Jack states, are universal resource states for measurement-based quantum computing with only Pauli measurements, while graph state measurement-based quantum computing needs non-Clifford basis measurements. Furthermore, it is impossible to classically efficiently sample measurement results on hypergraph states unless the polynomial hierarchy collapses to the third level. Although several protocols have been proposed to verify graph states with only sequential single-qubit Pauli measurements, there was no verification method for hypergraph states. In this paper, we propose a method for verifying a certain class of hypergraph states with only sequential single-qubit Pauli measurements. Importantly, no i.i.d. property of samples is assumed in our protocol: any artificial entanglement among samples cannot fool the verifier. As applications of our protocol, we consider verified blind quantum computing with hypergraph states, and quantum computational supremacy demonstrations with hypergraph states.

  1. A uniqueness-and-anonymity-preserving remote user authentication scheme for connected health care.

    PubMed

    Chang, Ya-Fen; Yu, Shih-Hui; Shiao, Ding-Rui

    2013-04-01

    Connected health care provides new opportunities for improving financial and clinical performance. Many connected health care applications, such as telecare medicine information systems, personally controlled health records systems, and patient monitoring, have been proposed. Correct and quality care is the goal of connected health care, and user authentication can ensure the legality of patients. After reviewing authentication schemes for connected health care applications, we find that many of them cannot protect patient privacy, such that others can trace users/patients through the transmitted data. Moreover, the verification tokens used by these authentication schemes to authenticate users or servers are only passwords, smart cards and RFID tags; these verification tokens are not unique and are easy to copy. On the other hand, biometric characteristics, such as iris, face, voiceprint and fingerprint, are unique, easy to verify, and hard to copy. In this paper, a biometrics-based user authentication scheme is proposed to ensure uniqueness and anonymity at the same time. With the proposed scheme, only the legal user/patient himself/herself can access the remote server, and no one can trace him/her from the transmitted data.

  2. Curve Set Feature-Based Robust and Fast Pose Estimation Algorithm

    PubMed Central

    Hashimoto, Koichi

    2017-01-01

    Bin picking refers to picking the randomly-piled objects from a bin for industrial production purposes, and robotic bin picking is widely used in automated assembly lines. In order to achieve a higher productivity, a fast and robust pose estimation algorithm is necessary to recognize and localize the randomly-piled parts. This paper proposes a pose estimation algorithm for bin picking tasks using point cloud data. A novel descriptor, the Curve Set Feature (CSF), is proposed to describe a point by the surface fluctuation around this point and is also capable of evaluating poses. The Rotation Match Feature (RMF) is proposed to match CSF efficiently. The matching process combines the idea of 2D-space matching from the original Point Pair Feature (PPF) algorithm with nearest neighbor search. A voxel-based pose verification method is introduced to evaluate the poses and proved to be more than 30 times faster than the kd-tree-based verification method. Our algorithm is evaluated against a large number of synthetic and real scenes and proven to be robust to noise, able to detect metal parts, and more accurate and more than 10 times faster than PPF and the Oriented, Unique and Repeatable (OUR)-Clustered Viewpoint Feature Histogram (CVFH). PMID:28771216
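
    The voxel-based pose verification idea can be sketched generically: occupied voxels of the scene cloud are stored in a hash set, and a candidate pose is scored by the fraction of transformed model points that land in occupied voxels. The Python sketch below is an illustration of that idea on invented data, not the authors' implementation.

      import numpy as np

      def voxel_keys(points, voxel_size):
          """Integer voxel indices of each point, returned as a set of tuples."""
          return set(map(tuple, np.floor(points / voxel_size).astype(np.int64)))

      def verify_pose(model_pts, scene_voxels, R, t, voxel_size):
          """Score a candidate pose by the fraction of transformed model points
          landing in occupied scene voxels (higher is better)."""
          transformed = model_pts @ R.T + t
          keys = np.floor(transformed / voxel_size).astype(np.int64)
          hits = sum(tuple(k) in scene_voxels for k in keys)
          return hits / len(model_pts)

      # Illustrative data: a random model cloud observed in the scene under a known pose.
      rng = np.random.default_rng(0)
      model = rng.uniform(-0.05, 0.05, size=(500, 3))        # ~10 cm part, metres
      angle = np.deg2rad(30.0)
      R_true = np.array([[np.cos(angle), -np.sin(angle), 0.0],
                         [np.sin(angle),  np.cos(angle), 0.0],
                         [0.0, 0.0, 1.0]])
      t_true = np.array([0.2, 0.1, 0.0])
      scene = model @ R_true.T + t_true + rng.normal(scale=0.001, size=model.shape)

      voxel = 0.005                                          # 5 mm voxels
      scene_vox = voxel_keys(scene, voxel)
      print("true pose score:    ", verify_pose(model, scene_vox, R_true, t_true, voxel))
      print("identity pose score:", verify_pose(model, scene_vox, np.eye(3), np.zeros(3), voxel))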

  3. Accurate palm vein recognition based on wavelet scattering and spectral regression kernel discriminant analysis

    NASA Astrophysics Data System (ADS)

    Elnasir, Selma; Shamsuddin, Siti Mariyam; Farokhi, Sajad

    2015-01-01

    Palm vein recognition (PVR) is a promising new biometric that has been applied successfully as a method of access control by many organizations, and which has even further potential in the field of forensics. The palm vein pattern has highly discriminative features that are difficult to forge because of its subcutaneous position in the palm. Despite considerable progress and a few practical issues, providing accurate palm vein readings has remained an unsolved issue in biometrics. We propose a robust and more accurate PVR method based on the combination of wavelet scattering (WS) with spectral regression kernel discriminant analysis (SRKDA). As the dimension of the WS-generated features is quite large, SRKDA is required to reduce the extracted features to enhance the discrimination. The results, based on two public databases (the PolyU Hyper Spectral Palmprint database and the PolyU Multi Spectral Palmprint database), show the high performance of the proposed scheme in comparison with state-of-the-art methods. The proposed approach scored a 99.44% identification rate and a 99.90% verification rate [equal error rate (EER)=0.1%] for the hyperspectral database and a 99.97% identification rate and a 99.98% verification rate (EER=0.019%) for the multispectral database.

  4. Prompt-gamma monitoring in hadrontherapy: A review

    NASA Astrophysics Data System (ADS)

    Krimmer, J.; Dauvergne, D.; Létang, J. M.; Testa, É.

    2018-01-01

    Secondary radiation emission induced by nuclear reactions is correlated to the path of ions in matter. Therefore, such penetrating radiation can be used for in vivo control of hadrontherapy treatments, for which the primary beam is absorbed inside the patient. Among secondary radiations, prompt-gamma rays were proposed for real-time verification of ion range. Such a verification is a desired condition to reduce uncertainties in treatment planning. For more than a decade, efforts have been undertaken worldwide to promote prompt-gamma-based devices to be used in clinical conditions. Dedicated cameras are necessary to overcome the challenges of a broad- and high-energy distribution, a large background, high instantaneous count rates, and compatibility constraints with patient irradiation. Several types of prompt-gamma imaging devices have been proposed, which are either physically collimated or electronically collimated (Compton cameras). Clinical tests are now under way. Meanwhile, methods other than direct prompt-gamma imaging have been proposed, which are based on specific counting using either time-of-flight or photon energy measurements. In the present article, we make a review and discuss the state of the art for all techniques using prompt-gamma detection to improve the quality assurance in hadrontherapy.

  5. Discriminative Projection Selection Based Face Image Hashing

    NASA Astrophysics Data System (ADS)

    Karabat, Cagatay; Erdogan, Hakan

    Face image hashing is an emerging method used in biometric verification systems. In this paper, we propose a novel face image hashing method based on a new technique called discriminative projection selection. We apply the Fisher criterion for selecting the rows of a random projection matrix in a user-dependent fashion. Moreover, another contribution of this paper is to employ a bimodal Gaussian mixture model at the quantization step. Our simulation results on three different databases demonstrate that the proposed method has superior performance in comparison to previously proposed random projection based methods.

  6. An Improved Quantum Proxy Blind Signature Scheme Based on Genuine Seven-Qubit Entangled State

    NASA Astrophysics Data System (ADS)

    Yang, Yuan-Yuan; Xie, Shu-Cui; Zhang, Jian-Zhong

    2017-07-01

    An improved quantum proxy blind signature scheme based on controlled teleportation is proposed in this paper. A genuine seven-qubit entangled state functions as the quantum channel. We use the physical characteristics of quantum mechanics to implement delegation, signature and verification. Security analysis shows that our scheme provides unforgeability, undeniability and blindness, and is unconditionally secure. Meanwhile, we propose a trusted party to provide higher security; the trusted party is costless.

  7. Rule Systems for Runtime Verification: A Short Tutorial

    NASA Astrophysics Data System (ADS)

    Barringer, Howard; Havelund, Klaus; Rydeheard, David; Groce, Alex

    In this tutorial, we introduce two rule-based systems for on- and off-line trace analysis, RuleR and LogScope. RuleR is a conditional rule-based system, which has a simple and easily implemented algorithm for effective runtime verification, and into which one can compile a wide range of temporal logics and other specification formalisms used for runtime verification. Specifications can be parameterized with data, or even with specifications, allowing for temporal logic combinators to be defined. We outline a number of simple syntactic extensions of core RuleR that can lead to further conciseness of specification while still enabling easy and efficient implementation. RuleR is implemented in Java and we will demonstrate its ease of use in monitoring Java programs. LogScope is a derivation of RuleR adding a simple, very user-friendly temporal logic. It was developed in Python, specifically for supporting testing of spacecraft flight software for NASA's next 2011 Mars mission MSL (Mars Science Laboratory). The system has been applied by test engineers to the analysis of log files generated by running the flight software. Detailed logging is already part of the system design approach, and hence there is no added instrumentation overhead caused by this approach. While post-mortem log analysis prevents the autonomous reaction to problems possible with traditional runtime verification, it provides a powerful tool for test automation. A new system is being developed that integrates features from both RuleR and LogScope.
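
    To give a flavour of rule-based, post-mortem log checking of the kind described above (this is not RuleR or LogScope syntax, and the event names are invented), the Python sketch below scans a list of log records and flags any command that is re-dispatched, or never completed, before a matching completion event.

      # Minimal post-mortem log monitor sketch: every dispatched command must be
      # reported complete before it is dispatched again.  Event names are invented.
      def check_dispatch_complete(log):
          pending = {}           # command name -> line number of the open dispatch
          violations = []
          for lineno, (event, name) in enumerate(log, start=1):
              if event == "DISPATCH":
                  if name in pending:
                      violations.append(
                          f"line {lineno}: {name} re-dispatched before completion "
                          f"(open since line {pending[name]})")
                  pending[name] = lineno
              elif event == "COMPLETE":
                  pending.pop(name, None)
          violations.extend(
              f"end of log: {name} dispatched at line {ln} but never completed"
              for name, ln in pending.items())
          return violations

      log = [("DISPATCH", "TAKE_IMAGE"), ("COMPLETE", "TAKE_IMAGE"),
             ("DISPATCH", "DOWNLINK"), ("DISPATCH", "DOWNLINK")]
      for v in check_dispatch_complete(log):
          print(v)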

  8. Acoustic emissions verification testing of International Space Station experiment racks at the NASA Glenn Research Center Acoustical Testing Laboratory

    NASA Astrophysics Data System (ADS)

    Akers, James C.; Passe, Paul J.; Cooper, Beth A.

    2005-09-01

    The Acoustical Testing Laboratory (ATL) at the NASA John H. Glenn Research Center (GRC) in Cleveland, OH, provides acoustic emission testing and noise control engineering services for a variety of specialized customers, particularly developers of equipment and science experiments manifested for NASA's manned space missions. The ATL's primary customer has been the Fluids and Combustion Facility (FCF), a multirack microgravity research facility being developed at GRC for the USA Laboratory Module of the International Space Station (ISS). Since opening in September 2000, ATL has conducted acoustic emission testing of components, subassemblies, and partially populated FCF engineering model racks. The culmination of this effort has been the acoustic emission verification tests on the FCF Combustion Integrated Rack (CIR) and Fluids Integrated Rack (FIR), employing a procedure that incorporates ISO 11201 ("Acoustics-Noise emitted by machinery and equipment-Measurement of emission sound pressure levels at a work station and at other specified positions-Engineering method in an essentially free field over a reflecting plane"). This paper will provide an overview of the test methodology, software, and hardware developed to perform the acoustic emission verification tests on the CIR and FIR flight racks and lessons learned from these tests.

  9. Towards Trustable Digital Evidence with PKIDEV: PKI Based Digital Evidence Verification Model

    NASA Astrophysics Data System (ADS)

    Uzunay, Yusuf; Incebacak, Davut; Bicakci, Kemal

    How to Capture and Preserve Digital Evidence Securely? For the investigation and prosecution of criminal activities that involve computers, digital evidence collected in the crime scene has a vital importance. On one side, it is a very challenging task for forensics professionals to collect them without any loss or damage. On the other, there is the second problem of providing the integrity and authenticity in order to achieve legal acceptance in a court of law. By conceiving digital evidence simply as one instance of digital data, it is evident that modern cryptography offers elegant solutions for this second problem. However, to our knowledge, there is not any previous work proposing a systematic model having a holistic view to address all the related security problems in this particular case of digital evidence verification. In this paper, we present PKIDEV (Public Key Infrastructure based Digital Evidence Verification model) as an integrated solution to provide security for the process of capturing and preserving digital evidence. PKIDEV employs, inter alia, cryptographic techniques like digital signatures and secure time-stamping as well as latest technologies such as GPS and EDGE. In our study, we also identify the problems public-key cryptography brings when it is applied to the verification of digital evidence.
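
    The cryptographic core of such a model (hash the evidence, sign the digest at capture time, verify later) can be sketched as below using the third-party Python cryptography package's Ed25519 primitives; this is a generic stand-in rather than PKIDEV itself, and it omits the PKI, secure time-stamping, GPS and EDGE aspects discussed in the paper.

      import hashlib
      from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey
      from cryptography.exceptions import InvalidSignature

      # Key pair held by the evidence-collection device (illustrative only).
      private_key = Ed25519PrivateKey.generate()
      public_key = private_key.public_key()

      evidence = b"raw disk image bytes ..."          # stand-in for captured evidence
      digest = hashlib.sha256(evidence).digest()      # fingerprint of the evidence
      signature = private_key.sign(digest)            # signed at capture time

      # Later, in the lab or in court: recompute the digest and verify the signature.
      try:
          public_key.verify(signature, hashlib.sha256(evidence).digest())
          print("evidence integrity and origin verified")
      except InvalidSignature:
          print("evidence altered or signature not authentic")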

  10. 77 FR 40090 - Proposed Collection of Information; Comment Request: Biological Sciences Proposal Classification...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-07-06

    ... NATIONAL SCIENCE FOUNDATION Proposed Collection of Information; Comment Request: Biological Sciences Proposal Classification Form AGENCY: National Science Foundation. ACTION: Notice. SUMMARY: The... Biological Sciences has a continuing commitment to monitor its information collection in order to preserve...

  11. Evaluation of ensemble forecast uncertainty using a new proper score: application to medium-range and seasonal forecasts

    NASA Astrophysics Data System (ADS)

    Christensen, Hannah; Moroz, Irene; Palmer, Tim

    2015-04-01

    Forecast verification is important across scientific disciplines as it provides a framework for evaluating the performance of a forecasting system. In the atmospheric sciences, probabilistic skill scores are often used for verification as they provide a way of unambiguously ranking the performance of different probabilistic forecasts. In order to be useful, a skill score must be proper -- it must encourage honesty in the forecaster, and reward forecasts which are reliable and which have good resolution. A new score, the Error-spread Score (ES), is proposed which is particularly suitable for evaluation of ensemble forecasts. It is formulated with respect to the moments of the forecast. The ES is confirmed to be a proper score, and is therefore sensitive to both resolution and reliability. The ES is tested on forecasts made using the Lorenz '96 system, and found to be useful for summarising the skill of the forecasts. The European Centre for Medium-Range Weather Forecasts (ECMWF) ensemble prediction system (EPS) is evaluated using the ES. Its performance is compared to a perfect statistical probabilistic forecast -- the ECMWF high resolution deterministic forecast dressed with the observed error distribution. This generates a forecast that is perfectly reliable if considered over all time, but which does not vary from day to day with the predictability of the atmospheric flow. The ES distinguishes between the dynamically reliable EPS forecasts and the statically reliable dressed deterministic forecasts. Other skill scores are tested and found to be comparatively insensitive to this desirable forecast quality. The ES is used to evaluate seasonal range ensemble forecasts made with the ECMWF System 4. The ensemble forecasts are found to be skilful when compared with climatological or persistence forecasts, though this skill is dependent on region and time of year.
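
    For reference, one published form of the Error-spread Score combines, per forecast case, the ensemble spread s, the error of the ensemble mean e, and the ensemble skewness g as ES = (s^2 - e^2 - e*s*g)^2, averaged over cases. The Python sketch below implements that form on synthetic data, under the assumption that this is the intended definition.

      import numpy as np

      def error_spread_score(ensembles, observations):
          """Error-spread Score averaged over cases.  ensembles: (cases, members),
          observations: (cases,).  Assumes ES = (s^2 - e^2 - e*s*g)^2 per case."""
          m = ensembles.mean(axis=1)                                   # ensemble mean
          s = ensembles.std(axis=1, ddof=1)                            # ensemble spread
          g = ((ensembles - m[:, None]) ** 3).mean(axis=1) / s ** 3    # skewness
          e = m - observations                                         # error of the mean
          return np.mean((s ** 2 - e ** 2 - e * s * g) ** 2)

      rng = np.random.default_rng(42)
      n_cases, n_members = 2000, 20
      truth = rng.normal(size=n_cases)
      centre = truth + rng.normal(scale=1.0, size=n_cases)             # forecast centre
      # A statistically consistent ensemble is expected to score lower (better),
      # on average, than an overconfident one with too little spread.
      consistent = centre[:, None] + rng.normal(scale=1.0, size=(n_cases, n_members))
      overconfident = centre[:, None] + rng.normal(scale=0.2, size=(n_cases, n_members))
      print("consistent    ES:", error_spread_score(consistent, truth))
      print("overconfident ES:", error_spread_score(overconfident, truth))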

  12. Modeling the Water Balloon Slingshot

    NASA Astrophysics Data System (ADS)

    Bousquet, Benjamin D.; Figura, Charles C.

    2013-01-01

    In the introductory physics courses at Wartburg College, we have been working to create a lab experience focused on the scientific process itself rather than verification of physical laws presented in the classroom or textbook. To this end, we have developed a number of open-ended modeling exercises suitable for a variety of learning environments, from non-science major classes to algebra-based and calculus-based introductory physics classes.

  13. Verification technology of remote sensing camera satellite imaging simulation based on ray tracing

    NASA Astrophysics Data System (ADS)

    Gu, Qiongqiong; Chen, Xiaomei; Yang, Deyun

    2017-08-01

    Remote sensing satellite camera imaging simulation technology is broadly used to evaluate satellite imaging quality and to test the data application system, but the simulation precision is hard to examine. In this paper, we propose an experimental simulation verification method based on comparison of test parameter variations. According to the simulation model based on ray tracing, the experiment verifies the model precision by changing the types of devices that correspond to the parameters of the model. The experimental results show that the similarity between the imaging model based on ray tracing and the experimental image is 91.4%, indicating that the model can simulate the remote sensing satellite imaging system very well.

  14. Biomedical support systems. [use and verification of biomedical hardware in altitude test

    NASA Technical Reports Server (NTRS)

    Brockett, R. M.; Ferguson, J. M.; Luczkowski, S. M.

    1973-01-01

    Biomedical support hardware for SMEAT consisted basically of two systems, the inflight medical support system and the operational bioinstrumentation system. The former is essentially a diagnostic and therapeutic kit; the latter is a belt equipped with sensors worn by the crewman to permit monitoring of his vital signs. Special attention was given during the test to the use and verification of the items in the systems so that changes required in the equipment could be pinpointed and effected prior to the Skylab mission. During the in-chamber testing, evaluations were made of the effectiveness of the proposed microbiology procedures, techniques, and equipment, and of the stability of media and reagents over the extended period of storage.

  15. Verification of three-microphone impedance tube method for measurement of transmission loss in aerogels

    NASA Astrophysics Data System (ADS)

    Connick, Robert J.

    Accurate measurement of normal incidence transmission loss is essential for the acoustic characterization of building materials. In this research, a method of measuring normal incidence sound transmission loss proposed by Salissou et al. as a complement to standard E2611-09 of the American Society for Testing and Materials [Standard Test Method for Measurement of Normal Incidence Sound Transmission of Acoustical Materials Based on the Transfer Matrix Method (American Society for Testing and Materials, New York, 2009)] is verified. Two samples from the original literature are used to verify the method, as well as a Filtros RTM sample. Following the verification, several nano-material aerogel samples are measured.

  16. RELAP-7 Software Verification and Validation Plan - Requirements Traceability Matrix (RTM) Part 2: Code Assessment Strategy, Procedure, and RTM Update

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Yoo, Jun Soo; Choi, Yong Joon; Smith, Curtis Lee

    2016-09-01

    This document addresses two subjects involved with the RELAP-7 Software Verification and Validation Plan (SVVP): (i) the principles and plan to assure the independence of RELAP-7 assessment through the code development process, and (ii) the work performed to establish the RELAP-7 assessment plan, i.e., the assessment strategy, literature review, and identification of RELAP-7 requirements. Then, the Requirements Traceability Matrices (RTMs) proposed in the previous document (INL-EXT-15-36684) are updated. These RTMs provide an efficient way to evaluate the RELAP-7 development status as well as the maturity of RELAP-7 assessment through the development process.

  17. Tritium as an indicator of venues for nuclear tests.

    PubMed

    Lyakhova, O N; Lukashenko, S N; Mulgin, S I; Zhdanov, S V

    2013-10-01

    Currently, due to the Treaty on the Non-Proliferation of Nuclear Weapons, accurate verification of nuclear explosion venues is a highly topical issue. This paper proposes a new verification method using tritium as an indicator. Detailed studies of the tritium content in the air were carried out at the locations of underground nuclear tests - the "Balapan" and "Degelen" testing sites of the Semipalatinsk Test Site. The paper presents data on the levels and distribution of tritium in the air where tunnels and boreholes are located - at explosion epicentres, wellheads and tunnel portals, as well as in estuarine areas of the venues of the underground nuclear explosions (UNE). Copyright © 2013 Elsevier Ltd. All rights reserved.

  18. A fingerprint key binding algorithm based on vector quantization and error correction

    NASA Astrophysics Data System (ADS)

    Li, Liang; Wang, Qian; Lv, Ke; He, Ning

    2012-04-01

    In recent years, research on seamlessly combining cryptosystems with biometric technologies, e.g. fingerprint recognition, has been conducted by many researchers. In this paper, we propose an algorithm for binding a fingerprint template with a cryptographic key, so that the key is protected and accessed through fingerprint verification. In order to avoid the intrinsic fuzziness of variant fingerprints, vector quantization and error correction techniques are introduced to transform the fingerprint template, which is then bound with the key, after a process of fingerprint registration and extraction of the global ridge pattern of the fingerprint. The key itself is secure because only its hash value is stored, and it is released only when fingerprint verification succeeds. Experimental results demonstrate the effectiveness of our ideas.

  19. Prompt gamma timing range verification for scattered proton beams

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kormoll, T.; Golnik, C.; Hueso Gonzalez, F.

    2015-07-01

    Range verification is a very important point in order to fully exploit the physical advantages of protons compared to photons in cancer irradiation. Recently, a simple method has been proposed which makes use of the time of flight of protons in tissue and the promptly emitted secondary photons along the proton path (Prompt Gamma Timing, PGT). This has been considered so far for monoenergetic pencil beams only. In this work, it has been studied whether this technique can also be applied in passively formed irradiation fields with a so-called spread-out Bragg peak. Time-correlated profiles could be recorded, which show a trend that is consistent with theoretical predictions. (authors)

  20. 78 FR 67204 - Agency Information Collection Activities: Proposed Collection; Comment Request

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-11-08

    ... action to submit an information collection request to the Office of Management and Budget (OMB) and... Verification System (LVS) has been developed, providing an electronic method for fulfilling this requirement... publicly available documents, including the draft supporting statement, at the NRC's Public Document Room...

  1. 78 FR 66365 - Proposed Information Collection Activity; Comment Request

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-11-05

    ... for Needy Families (TANF) program, it imposed a new data requirement that States prepare and submit data verification procedures and replaced other data requirements with new versions including: the TANF Data Report, the SSP-MOE Data Report, the Caseload Reduction Documentation Process, and the Reasonable...

  2. Learning Deep Representations for Ground to Aerial Geolocalization (Open Access)

    DTIC Science & Technology

    2015-10-15

    proposed approach, Where-CNN, is inspired by deep learning success in face verification and achieves significant improvements over traditional hand-crafted features and existing deep features learned from other large-scale databases. We show the effectiveness of Where-CNN in finding matches

  3. 76 FR 2393 - Agency Forms Undergoing Paperwork Reduction Act Review

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-01-13

    ... notice. Proposed Project Annual Submission of the Ingredients Added to, and the Quantity of Nicotine... information about the ingredients used in smokeless tobacco products and their nicotine content. Respondents... mail submissions are not accepted. Upon receipt and verification of the annual ingredient and nicotine...

  4. Letting the daylight in: Reviewing the reviewers and other ways to maximize transparency in science

    PubMed Central

    Wicherts, Jelte M.; Kievit, Rogier A.; Bakker, Marjan; Borsboom, Denny

    2012-01-01

    With the emergence of online publishing, opportunities to maximize transparency of scientific research have grown considerably. However, these possibilities are still only marginally used. We argue for the implementation of (1) peer-reviewed peer review, (2) transparent editorial hierarchies, and (3) online data publication. First, peer-reviewed peer review entails a community-wide review system in which reviews are published online and rated by peers. This ensures accountability of reviewers, thereby increasing academic quality of reviews. Second, reviewers who write many highly regarded reviews may move to higher editorial positions. Third, online publication of data ensures the possibility of independent verification of inferential claims in published papers. This counters statistical errors and overly positive reporting of statistical results. We illustrate the benefits of these strategies by discussing an example in which the classical publication system has gone awry, namely controversial IQ research. We argue that this case would have likely been avoided using more transparent publication practices. We argue that the proposed system leads to better reviews, meritocratic editorial hierarchies, and a higher degree of replicability of statistical analyses. PMID:22536180

  5. The meshless local Petrov-Galerkin method based on moving Kriging interpolation for solving the time fractional Navier-Stokes equations.

    PubMed

    Thamareerat, N; Luadsong, A; Aschariyaphotha, N

    2016-01-01

    In this paper, we present a numerical scheme used to solve the nonlinear time fractional Navier-Stokes equations in two dimensions. We first employ the meshless local Petrov-Galerkin (MLPG) method based on a local weak formulation to form the system of discretized equations and then we will approximate the time fractional derivative interpreted in the sense of Caputo by a simple quadrature formula. The moving Kriging interpolation which possesses the Kronecker delta property is applied to construct shape functions. This research aims to extend and develop further the applicability of the truly MLPG method to the generalized incompressible Navier-Stokes equations. Two numerical examples are provided to illustrate the accuracy and efficiency of the proposed algorithm. Very good agreement between the numerically and analytically computed solutions can be observed in the verification. The present MLPG method has proved its efficiency and reliability for solving the two-dimensional time fractional Navier-Stokes equations arising in fluid dynamics as well as several other problems in science and engineering.
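
    The "simple quadrature formula" for the Caputo derivative of order 0 < a < 1 is commonly the L1 scheme, D^a u(t_n) ~ dt^(-a)/Gamma(2-a) * sum_j [(j+1)^(1-a) - j^(1-a)] * (u_{n-j} - u_{n-j-1}). Assuming that scheme (the paper may use a different discretisation), the Python sketch below checks it against the closed-form Caputo derivative of t^2.

      import numpy as np
      from math import gamma

      def caputo_l1(u, dt, alpha):
          """L1 quadrature for the Caputo derivative of order 0 < alpha < 1,
          evaluated at the last grid point of the samples u (u[0] is at t = 0)."""
          n = len(u) - 1
          j = np.arange(n)
          b = (j + 1) ** (1 - alpha) - j ** (1 - alpha)      # L1 weights
          increments = u[1:][::-1] - u[:-1][::-1]            # u_{n-j} - u_{n-j-1}
          return (dt ** (-alpha) / gamma(2 - alpha)) * np.sum(b * increments)

      alpha, t_end, dt = 0.5, 1.0, 1e-3
      t = np.arange(0.0, t_end + dt / 2, dt)
      u = t ** 2
      # Closed-form Caputo derivative of t^2: 2 t^(2-alpha) / Gamma(3-alpha).
      exact = 2.0 * t_end ** (2 - alpha) / gamma(3 - alpha)
      print("L1 approximation:", caputo_l1(u, dt, alpha), " exact:", exact)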

  6. A detailed experimental study of a DNA computer with two endonucleases.

    PubMed

    Sakowski, Sebastian; Krasiński, Tadeusz; Sarnik, Joanna; Blasiak, Janusz; Waldmajer, Jacek; Poplawski, Tomasz

    2017-07-14

    Great advances in biotechnology have allowed the construction of a computer from DNA. One of the proposed solutions is a biomolecular finite automaton, a simple two-state DNA computer without memory, which was presented by Ehud Shapiro's group at the Weizmann Institute of Science. The main problem with this computer, in which biomolecules carry out logical operations, is its complexity - increasing the number of states of biomolecular automata. In this study, we constructed (in laboratory conditions) a six-state DNA computer that uses two endonucleases (e.g. AcuI and BbvI) and a ligase. We have presented a detailed experimental verification of its feasibility. We described the effect of the number of states, the length of input data, and the nondeterminism on the computing process. We also tested different automata (with three, four, and six states) running on various accepted input words of different lengths such as ab, aab, aaab, ababa, and of an unaccepted word ba. Moreover, this article presents the reaction optimization and the methods of eliminating certain biochemical problems occurring in the implementation of a biomolecular DNA automaton based on two endonucleases.

  7. A retrospective of the GREGOR solar telescope in scientific literature

    NASA Astrophysics Data System (ADS)

    Denker, C.; von der Lühe, O.; Feller, A.; Arlt, K.; Balthasar, H.; Bauer, S.-M.; Bello González, N.; Berkefeld, Th.; Caligari, P.; Collados, M.; Fischer, A.; Granzer, T.; Hahn, T.; Halbgewachs, C.; Heidecke, F.; Hofmann, A.; Kentischer, T.; Klvaňa, M.; Kneer, F.; Lagg, A.; Nicklas, H.; Popow, E.; Puschmann, K. G.; Rendtel, J.; Schmidt, D.; Schmidt, W.; Sobotka, M.; Solanki, S. K.; Soltau, D.; Staude, J.; Strassmeier, K. G.; Volkmer, R.; Waldmann, T.; Wiehr, E.; Wittmann, A. D.; Woche, M.

    2012-11-01

    In this review, we look back upon the literature that had the GREGOR solar telescope project as its subject, including science cases, telescope subsystems, and post-focus instruments. The articles date back to the year 2000, when the initial concepts for a new solar telescope on Tenerife were first presented at scientific meetings. This comprehensive bibliography contains literature until the year 2012, i.e., the final stages of commissioning and science verification. Taking stock of the various publications in peer-reviewed journals and conference proceedings also provides the "historical" context for the reference articles in this special issue of Astronomische Nachrichten/Astronomical Notes.

  8. U.S. Geological Survey Global Seismographic Network - Five-Year Plan 2006-2010

    USGS Publications Warehouse

    Leith, William S.; Gee, Lind S.; Hutt, Charles R.

    2009-01-01

    The Global Seismographic Network provides data for earthquake alerting, tsunami warning, nuclear treaty verification, and Earth science research. The system consists of nearly 150 permanent digital stations, distributed across the globe, connected by a modern telecommunications network. It serves as a multi-use scientific facility and societal resource for monitoring, research, and education, by providing nearly uniform, worldwide monitoring of the Earth. The network was developed and is operated through a partnership among the National Science Foundation (http://www.nsf.gov), the Incorporated Research Institutions for Seismology (http://www.iris.edu/hq/programs/gsn), and the U.S. Geological Survey (http://earthquake.usgs.gov/gsn).

  9. Forecast Verification: Identification of small changes in weather forecasting skill

    NASA Astrophysics Data System (ADS)

    Weatherhead, E. C.; Jensen, T. L.

    2017-12-01

    Global and regional weather forecasts have improved over the past seven decades, most often because of small, incremental improvements. The identification and verification of forecast improvement due to proposed small changes in forecasting can be expensive and, if not carried out efficiently, can slow progress in forecasting development. This presentation will look at the skill of commonly used verification techniques and show how the ability to detect improvements can depend on the magnitude of the improvement, the number of runs used to test the improvement, the location on the Earth, and the statistical techniques used. For continuous variables, such as temperature, wind and humidity, the skill of a forecast can be directly compared using a pair-wise statistical test that accommodates the natural autocorrelation and magnitude of variability. For discrete variables, such as tornado outbreaks or icing events, the challenge is to reduce the false alarm rate while improving the rate of correctly identifying the discrete event. For both continuous and discrete verification results, proper statistical approaches can reduce the number of runs needed to identify a small improvement in forecasting skill. Verification within the Next Generation Global Prediction System is an important component of the many small decisions needed to make state-of-the-art improvements to weather forecasting capabilities. The comparison of multiple skill scores with often conflicting results requires not only appropriate testing but also scientific judgment, to ensure that the choices are appropriate for improvements in today's forecasting capabilities while allowing improvements that will come in the future.
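
    One common way to formalise such a pair-wise test is to form the series of error differences between two forecast systems, estimate its lag-1 autocorrelation, shrink the sample size to an effective value, and apply a t-test to the mean difference. The Python sketch below illustrates that generic recipe on synthetic errors; it is an assumption of this write-up, not the presenters' exact procedure.

      import numpy as np
      from scipy import stats

      def paired_skill_test(err_a, err_b):
          """Test whether system B's absolute errors are smaller than system A's,
          adjusting the sample size for lag-1 autocorrelation of the differences."""
          d = np.abs(err_a) - np.abs(err_b)              # positive means B is better
          n = len(d)
          r1 = np.corrcoef(d[:-1], d[1:])[0, 1]          # lag-1 autocorrelation
          n_eff = n * (1.0 - r1) / (1.0 + r1)            # effective sample size
          t = d.mean() / (d.std(ddof=1) / np.sqrt(n_eff))
          p = 2.0 * stats.t.sf(abs(t), df=max(n_eff - 1.0, 1.0))
          return d.mean(), t, p

      rng = np.random.default_rng(7)
      n = 730                                            # two years of daily verification
      base = rng.normal(size=n)                          # shared (correlated) error component
      err_a = base + rng.normal(scale=0.5, size=n)       # errors of system A
      err_b = 0.95 * base + rng.normal(scale=0.5, size=n)  # system B: slightly smaller errors
      print(paired_skill_test(err_a, err_b))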

  10. Experimental verification of self-calibration radiometer based on spontaneous parametric downconversion

    NASA Astrophysics Data System (ADS)

    Gao, Dongyang; Zheng, Xiaobing; Li, Jianjun; Hu, Youbo; Xia, Maopeng; Salam, Abdul; Zhang, Peng

    2018-03-01

    Based on the spontaneous parametric downconversion process, we propose a novel self-calibration radiometer scheme which can self-calibrate the degradation of its own response and ultimately monitor the fluctuation of a target radiation. Monitoring results are independent of the radiometer's degradation and are not linked to the primary standard detector scale. The principle and feasibility of the proposed scheme were verified by observing a bromine-tungsten lamp. A relative standard deviation of 0.39 % was obtained for a stable bromine-tungsten lamp, showing that the proposed scheme works as intended in principle. The proposed scheme could make a significant breakthrough in the self-calibration issue on the space platform.

  11. An analytical model of SAGD process considering the effect of threshold pressure gradient

    NASA Astrophysics Data System (ADS)

    Morozov, P.; Abdullin, A.; Khairullin, M.

    2018-05-01

    An analytical model is proposed for the development of super-viscous oil deposits by the method of steam-assisted gravity drainage, taking into account the nonlinear filtration law with the limiting gradient. The influence of non-Newtonian properties of oil on the productivity of a horizontal well and the cumulative steam-oil ratio are studied. Verification of the proposed model based on the results of physical modeling of the SAGD process was carried out.

  12. Universal Verification Methodology Based Register Test Automation Flow.

    PubMed

    Woo, Jae Hun; Cho, Yong Kwan; Park, Sun Kyu

    2016-05-01

    In today's SoC design, the number of registers has increased along with the complexity of hardware blocks. Register validation is a time-consuming and error-prone task. Therefore, we need an efficient way to perform verification with less effort in shorter time. In this work, we suggest a register test automation flow based on UVM (Universal Verification Methodology). UVM provides a standard methodology, called a register model, to facilitate stimulus generation and functional checking of registers. However, it is not easy for designers to create register models for their functional blocks or to integrate the models in a test-bench environment, because doing so requires knowledge of SystemVerilog and the UVM libraries. For the creation of register models, many commercial tools support register model generation from a register specification described in IP-XACT, but it is time-consuming to describe the register specification in IP-XACT format. For easy creation of register models, we propose a spreadsheet-based register template which is translated to an IP-XACT description, from which register models can be easily generated using commercial tools. On the other hand, we also automate all the steps involved in integrating the test-bench and generating test-cases, so that designers may use the register model without detailed knowledge of UVM or SystemVerilog. This automation flow involves generating and connecting test-bench components (e.g., driver, checker, bus adaptor, etc.) and writing a test sequence for each type of register test-case. With the proposed flow, designers can save a considerable amount of time in verifying the functionality of registers.
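
    The spreadsheet-to-IP-XACT step can be illustrated, very loosely, with the Python sketch below; the column names, the CSV layout and the stripped-down XML elements are assumptions, since a real IP-XACT description must follow the full Accellera/SPIRIT schema before a commercial tool can generate register models from it.

      import csv
      import io
      import xml.etree.ElementTree as ET

      # Illustrative spreadsheet export: one row per register (column names are assumed).
      SPREADSHEET = """name,offset,width,access,reset
      CTRL,0x00,32,read-write,0x00000000
      STATUS,0x04,32,read-only,0x00000001
      """

      def rows_to_registers_xml(csv_text):
          """Translate spreadsheet rows into a stripped-down IP-XACT-like register list."""
          rows = csv.DictReader(io.StringIO(csv_text.replace("      ", "")))
          root = ET.Element("registers")
          for row in rows:
              reg = ET.SubElement(root, "register")
              for tag in ("name", "width", "access"):
                  ET.SubElement(reg, tag).text = row[tag]
              ET.SubElement(reg, "addressOffset").text = row["offset"]
              ET.SubElement(reg, "reset").text = row["reset"]
          return ET.tostring(root, encoding="unicode")

      print(rows_to_registers_xml(SPREADSHEET))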

  13. AdaBoost-based on-line signature verifier

    NASA Astrophysics Data System (ADS)

    Hongo, Yasunori; Muramatsu, Daigo; Matsumoto, Takashi

    2005-03-01

    Authentication of individuals is rapidly becoming an important issue. The authors previously proposed a pen-input online signature verification algorithm. The algorithm considers a writer's signature as a trajectory of pen position, pen pressure, pen azimuth, and pen altitude that evolves over time, so that it is dynamic and biometric. Many algorithms have been proposed for accurate on-line signature verification, but setting the threshold value for these algorithms is a problem. In this paper, we introduce a user-generic model generated by AdaBoost, which resolves this problem. When user-specific models (one model for each user) are used for signature verification, the models must be trained using only genuine signatures, since impostors do not provide forged signatures for training in advance. By introducing a user-generic model, however, we can make use of other users' forged signatures in addition to the genuine signatures for learning. AdaBoost is a well-known classification algorithm that makes its final decision based on the sign of its output value, so no separate threshold needs to be set. A preliminary experiment was performed on a database consisting of data from 50 individuals. This set consists of Western-alphabet-based signatures provided by a European research group. In this experiment, our algorithm gives an FRR of 1.88% and an FAR of 1.60%. Since no fine-tuning was done, this preliminary result looks very promising.
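    The user-generic idea can be illustrated with a minimal sketch: classify difference features between a questioned signature and a reference as genuine or forged, letting the sign of the AdaBoost output replace a hand-tuned threshold. The synthetic features and the scikit-learn classifier below are assumptions for illustration; the paper's actual features and booster are not reproduced here.

```python
# Minimal sketch (assumed features, synthetic data): a user-generic
# AdaBoost verifier that classifies questioned-vs-reference difference
# features as genuine (+1) or forged (-1). The sign of the ensemble output
# replaces a manually tuned acceptance threshold.
import numpy as np
from sklearn.ensemble import AdaBoostClassifier

rng = np.random.default_rng(0)

# Hypothetical difference features (e.g., distances of pen position,
# pressure, azimuth, altitude trajectories): genuine attempts cluster
# near zero, forgeries are larger on average.
genuine = rng.normal(loc=0.2, scale=0.1, size=(200, 4))
forged = rng.normal(loc=0.8, scale=0.3, size=(200, 4))
X = np.vstack([genuine, forged])
y = np.array([1] * 200 + [-1] * 200)

clf = AdaBoostClassifier(n_estimators=100, random_state=0).fit(X, y)

questioned = rng.normal(loc=0.25, scale=0.1, size=(1, 4))
# decision_function > 0 -> accept as genuine; no explicit threshold needed.
print("accept" if clf.decision_function(questioned)[0] > 0 else "reject")
```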

  14. Thinking forensics: Cognitive science for forensic practitioners.

    PubMed

    Edmond, Gary; Towler, Alice; Growns, Bethany; Ribeiro, Gianni; Found, Bryan; White, David; Ballantyne, Kaye; Searston, Rachel A; Thompson, Matthew B; Tangen, Jason M; Kemp, Richard I; Martire, Kristy

    2017-03-01

    Human factors and their implications for forensic science have attracted increasing levels of interest across criminal justice communities in recent years. Initial interest centred on cognitive biases, but has since expanded such that knowledge from psychology and cognitive science is slowly infiltrating forensic practices more broadly. This article highlights a series of important findings and insights of relevance to forensic practitioners. These include research on human perception, memory, context information, expertise, decision-making, communication, experience, verification, confidence, and feedback. The aim of this article is to sensitise forensic practitioners (and lawyers and judges) to a range of potentially significant issues, and encourage them to engage with research in these domains so that they may adapt procedures to improve performance, mitigate risks and reduce errors. Doing so will reduce the divide between forensic practitioners and research scientists as well as improve the value and utility of forensic science evidence. Copyright © 2016 The Chartered Society of Forensic Sciences. Published by Elsevier B.V. All rights reserved.

  15. 33 CFR 385.27 - Project Cooperation Agreements.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... to water reservations. Reservations or allocations of water are a State responsibility. Any change to... public to review and comment on any proposed changes in the water reservation made by the State. (2) The... with the non-Federal sponsor in accordance with applicable law. (b) Verification of water reservations...

  16. 33 CFR 385.27 - Project Cooperation Agreements.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... to water reservations. Reservations or allocations of water are a State responsibility. Any change to... public to review and comment on any proposed changes in the water reservation made by the State. (2) The... with the non-Federal sponsor in accordance with applicable law. (b) Verification of water reservations...

  17. 33 CFR 385.27 - Project Cooperation Agreements.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... to water reservations. Reservations or allocations of water are a State responsibility. Any change to... public to review and comment on any proposed changes in the water reservation made by the State. (2) The... with the non-Federal sponsor in accordance with applicable law. (b) Verification of water reservations...

  18. 33 CFR 385.27 - Project Cooperation Agreements.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... to water reservations. Reservations or allocations of water are a State responsibility. Any change to... public to review and comment on any proposed changes in the water reservation made by the State. (2) The... with the non-Federal sponsor in accordance with applicable law. (b) Verification of water reservations...

  19. 33 CFR 385.27 - Project Cooperation Agreements.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... to water reservations. Reservations or allocations of water are a State responsibility. Any change to... public to review and comment on any proposed changes in the water reservation made by the State. (2) The... with the non-Federal sponsor in accordance with applicable law. (b) Verification of water reservations...

  20. Sewer Lateral Electro Scan Field Verification Pilot (WERF Report INFR4R12)

    EPA Science Inventory

    Abstract: WERF selected a proposed research project to field test an emerging technology for inspecting sanitary sewer lateral pipes. The technology is called Electro Scan and is used to find defects in laterals that allow the infiltration of groundwater into the lateral. Electro ...

  1. 76 FR 72225 - Proposed Collection; Comment Request

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-11-22

    .... Independent public accountants must verify the fund's assets at least three times a year and two of the... independent public accountants when they perform verifications of fund assets.\\4\\ Approximately 243 funds rely... hours per fund) x $165 (fund senior accountant's hourly rate) = $82.50. \\3\\ Respondents estimated that...

  2. 75 FR 28550 - Proposed Information Collection; Comment Request; Delivery Verification Procedure

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-05-21

    ... the commodities shipped to the U.S. were in fact received. This procedure increases the effectiveness... Review: Regular submission. Affected Public: Business or other for-profit organizations. Estimated Number.... Estimated Total Annual Cost to Public: $0. IV. Request for Comments Comments are invited on: (a) Whether the...

  3. The Comprehension and Validation of Social Information.

    ERIC Educational Resources Information Center

    Wyer, Robert S., Jr.; Radvansky, Gabriel A.

    1999-01-01

    Proposes a theory of social cognition to account for the comprehension and verification of social information. The theory views comprehension as a process of constructing situation models of new information on the basis of previously formed models about its referents. The comprehension of both single statements and multiple pieces of information…

  4. 76 FR 34713 - Proposed Establishment of a Federally Funded Research and Development Center-Third Notice

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-06-14

    ... (FFRDC) to facilitate the modernization of business processes and supporting systems and their operations... processes and supporting systems and their operations. Some of the broad task areas that will be utilized..., organizational planning, research and development, continuous process improvement, Independent Verification and...

  5. 76 FR 72306 - Federal Housing Administration (FHA) Appraiser Roster: Appraiser Qualifications for Placement on...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-11-23

    ... Appraiser Roster regulations by replacing the obsolete references to the Credit Alert Interactive Voice Response System (CAIVRS) with references to its successor, the online-based Credit Alert Verification... propose the elimination references to the Credit Alert Interactive Voice Response System (CAIVRS). On July...

  6. Issues of planning trajectory of parallel robots taking into account zones of singularity

    NASA Astrophysics Data System (ADS)

    Rybak, L. A.; Khalapyan, S. Y.; Gaponenko, E. V.

    2018-03-01

    A method is developed for determining the design characteristics of a parallel robot needed to provide specified working-space parameters that satisfy the controllability requirement. The proposed method was verified experimentally using an approximate planar 3-RPR mechanism.

  7. MO-AB-BRA-03: Development of Novel Real Time in Vivo EPID Treatment Verification for Brachytherapy

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Fonseca, G; Podesta, M; Reniers, B

    2016-06-15

    Purpose: High Dose Rate (HDR) brachytherapy treatments are employed worldwide to treat a wide variety of cancers. However, in vivo dose verification remains a challenge, with no commercial dosimetry system available to verify the treatment dose delivered to the patient. We propose a novel dosimetry system that couples an independent Monte Carlo (MC) simulation platform and an amorphous silicon Electronic Portal Imaging Device (EPID) to provide real time treatment verification. Methods: MC calculations predict the EPID response to the photon fluence emitted by the HDR source by simulating the patient, the source dwell positions and times, and treatment complexities such as tissue compositions/densities and different applicators. Simulated results are then compared against EPID measurements acquired with ∼0.14 s time resolution, which allows dose measurements for each dwell position. The EPID has been calibrated using an Ir-192 HDR source, and experiments were performed using different phantoms, including tissue equivalent materials (PMMA, lung and bone). A source positioning accuracy of 0.2 mm, without including the afterloader uncertainty, was ensured using a robotic arm moving the source. Results: An EPID can acquire 3D Cartesian source positions, and its response varies significantly due to differences in the material composition/density of the irradiated object, allowing detection of changes in patient geometry. The panel time resolution allows dose rate and dwell time measurements. Moreover, predicted EPID images obtained from clinical treatment plans provide anatomical information that can be related to the patient anatomy, mostly bone and air cavities, localizing the source inside the patient using its anatomy as reference. Conclusion: The results obtained show the feasibility of the proposed dose verification system, which is capable of verifying all the brachytherapy treatment steps in real time, providing data about treatment delivery quality as well as applicator/structure motion during or between treatments.
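    The per-dwell comparison of measured and predicted EPID signals described above can be illustrated with a toy check. The arrays and the 5% relative tolerance in the following sketch are assumptions for illustration only, not values or logic from the authors' system.

```python
# Illustrative sketch (not the authors' system): flag dwell positions where
# the measured EPID signal deviates from the Monte Carlo prediction by more
# than a relative tolerance. The arrays and the 5% tolerance are assumptions.
import numpy as np

def flag_dwells(measured, predicted, tol=0.05):
    """Return indices of dwell positions whose relative deviation exceeds tol."""
    measured = np.asarray(measured, dtype=float)
    predicted = np.asarray(predicted, dtype=float)
    rel_dev = np.abs(measured - predicted) / np.maximum(predicted, 1e-12)
    return np.flatnonzero(rel_dev > tol)

predicted = np.array([1.00, 0.80, 0.65, 0.50])   # simulated per-dwell signal
measured  = np.array([1.02, 0.79, 0.55, 0.51])   # measured per-dwell signal
print(flag_dwells(measured, predicted))           # -> [2]
```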

  8. Computer Security Models

    DTIC Science & Technology

    1984-09-01

    "Verification Technique for a Class of Security Kernels," International Symposium on Programming, Lecture Notes in Computer Science 137, Springer-Verlag, New York... September 1984. J. K. Millen and C. M. Cerniglia, Computer Security Models. Abstract: The purpose of this report is to provide a basis for evaluating security models in the context of secure computer system development.

  9. Integrating Requirements Engineering, Modeling, and Verification Technologies into Software and Systems Engineering

    DTIC Science & Technology

    2007-10-28

    Software Engineering, FASE'05, volume 3442 of Lecture Notes in Computer Science, pages 175-189. Springer, 2005. Andreas Bauer, Martin Leucker, and Jonathan ... Personnel receiving master's degrees: Markus Strohmeier, Gerrit Hanselmann, Jonathan Streit, Ernst Sassen (4 total). ... developed and documented mainly within the master's thesis by Jonathan Streit [Str06]: Jonathan Streit, Development of a programming language like tem

  10. Verification of Spatial Forecasts of Continuous Meteorological Variables Using Categorical and Object-Based Methods

    DTIC Science & Technology

    2016-08-01

    ... Using Categorical and Object-Based Methods, by John W Raby and Huaqing Cai, Computational and Information Sciences Directorate, ARL. Approved for public release; distribution ...

  11. Open and Crowd-Sourced Data for Treaty Verification

    DTIC Science & Technology

    2014-10-01

    the case of user-generated Internet content, such as Wikipedia, or Amazon reviews. Another example is the Zooniverse citizen science project, which ... The number of potential observations can in many instances make up for relatively crude measurements made by the public. Users are motivated to

  12. James Webb Space Telescope (JWST) Integrated Science Instruments Module (ISIM) Cryo-Vacuum (CV) Test Campaign Summary

    NASA Technical Reports Server (NTRS)

    Yew, Calinda; Whitehouse, Paul; Lui, Yan; Banks, Kimberly

    2016-01-01

    JWST Integrated Science Instruments Module (ISIM) has completed its system-level testing program at the NASA Goddard Space Flight Center (GSFC). In March 2016, ISIM was successfully delivered for integration with the Optical Telescope Element (OTE) after the successful verification of the system through a series of three cryo-vacuum (CV) tests. The first test served as a risk reduction test; the second test provided the initial verification of the fully-integrated flight instruments; and the third test verified the system in its final flight configuration. The complexity of the mission has generated challenging requirements that demand highly reliable system performance and capabilities from the Space Environment Simulator (SES) vacuum chamber. As JWST progressed through its CV testing campaign, deficiencies in the test configuration and support equipment were uncovered from one test to the next. Subsequent upgrades and modifications were implemented to improve the facility support capabilities required to achieve test requirements. This paper: (1) provides an overview of the integrated mechanical and thermal facility systems required to achieve the objectives of JWST ISIM testing, (2) compares the overall facility performance and instrumentation results from the three ISIM CV tests, and (3) summarizes lessons learned from the ISIM testing campaign.

  13. Galaxy-galaxy lensing in the Dark Energy Survey Science Verification data

    NASA Astrophysics Data System (ADS)

    Clampitt, J.; Sánchez, C.; Kwan, J.; Krause, E.; MacCrann, N.; Park, Y.; Troxel, M. A.; Jain, B.; Rozo, E.; Rykoff, E. S.; Wechsler, R. H.; Blazek, J.; Bonnett, C.; Crocce, M.; Fang, Y.; Gaztanaga, E.; Gruen, D.; Jarvis, M.; Miquel, R.; Prat, J.; Ross, A. J.; Sheldon, E.; Zuntz, J.; Abbott, T. M. C.; Abdalla, F. B.; Armstrong, R.; Becker, M. R.; Benoit-Lévy, A.; Bernstein, G. M.; Bertin, E.; Brooks, D.; Burke, D. L.; Carnero Rosell, A.; Carrasco Kind, M.; Cunha, C. E.; D'Andrea, C. B.; da Costa, L. N.; Desai, S.; Diehl, H. T.; Dietrich, J. P.; Doel, P.; Estrada, J.; Evrard, A. E.; Fausti Neto, A.; Flaugher, B.; Fosalba, P.; Frieman, J.; Gruendl, R. A.; Honscheid, K.; James, D. J.; Kuehn, K.; Kuropatkin, N.; Lahav, O.; Lima, M.; March, M.; Marshall, J. L.; Martini, P.; Melchior, P.; Mohr, J. J.; Nichol, R. C.; Nord, B.; Plazas, A. A.; Romer, A. K.; Sanchez, E.; Scarpine, V.; Schubnell, M.; Sevilla-Noarbe, I.; Smith, R. C.; Soares-Santos, M.; Sobreira, F.; Suchyta, E.; Swanson, M. E. C.; Tarle, G.; Thomas, D.; Vikram, V.; Walker, A. R.

    2017-03-01

    We present galaxy-galaxy lensing results from 139 deg2 of Dark Energy Survey (DES) Science Verification (SV) data. Our lens sample consists of red galaxies, known as redMaGiC, which are specifically selected to have a low photometric redshift error and outlier rate. The lensing measurement has a total signal-to-noise ratio of 29 over scales 0.09 < R < 15 Mpc h-1, including all lenses over a wide redshift range 0.2 < z < 0.8. Dividing the lenses into three redshift bins for this constant moving number density sample, we find no evidence for evolution in the halo mass with redshift. We obtain consistent results for the lensing measurement with two independent shear pipelines, NGMIX and IM3SHAPE. We perform a number of null tests on the shear and photometric redshift catalogues and quantify resulting systematic uncertainties. Covariances from jackknife subsamples of the data are validated with a suite of 50 mock surveys. The result and systematic checks in this work provide a critical input for future cosmological and galaxy evolution studies with the DES data and redMaGiC galaxy samples. We fit a halo occupation distribution (HOD) model, and demonstrate that our data constrain the mean halo mass of the lens galaxies, despite strong degeneracies between individual HOD parameters.

  14. redMaGiC: selecting luminous red galaxies from the DES Science Verification data

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rozo, E.

    We introduce redMaGiC, an automated algorithm for selecting Luminous Red Galaxies (LRGs). The algorithm was developed to minimize photometric redshift uncertainties in photometric large-scale structure studies. redMaGiC achieves this by self-training the color-cuts necessary to produce a luminosity-thresholded LRG sample of constant comoving density. Additionally, we demonstrate that redMaGiC photo-zs are very nearly as accurate as the best machine-learning based methods, yet they require minimal spectroscopic training, do not suffer from extrapolation biases, and are very nearly Gaussian. We apply our algorithm to Dark Energy Survey (DES) Science Verification (SV) data to produce a redMaGiC catalog sampling the redshift range z ∈ [0.2, 0.8]. Our fiducial sample has a comoving space density of 10^-3 (h^-1 Mpc)^-3, and a median photo-z bias (z_spec - z_photo) and scatter (σ_z/(1 + z)) of 0.005 and 0.017, respectively. The corresponding 5σ outlier fraction is 1.4%. We also test our algorithm with Sloan Digital Sky Survey (SDSS) Data Release 8 (DR8) and Stripe 82 data, and discuss how spectroscopic training can be used to control photo-z biases at the 0.1% level.

  15. Suite of Benchmark Tests to Conduct Mesh-Convergence Analysis of Nonlinear and Non-constant Coefficient Transport Codes

    NASA Astrophysics Data System (ADS)

    Zamani, K.; Bombardelli, F. A.

    2014-12-01

    Verification of geophysics codes is imperative to avoid serious academic as well as practical consequences. When access to a given source code is not possible, the Method of Manufactured Solution (MMS) cannot be employed in code verification. In contrast, employing the Method of Exact Solution (MES) has several practical advantages. In this research, we first provide four new one-dimensional analytical solutions designed for code verification; these solutions are able to uncover particular imperfections in solvers of the advection-diffusion-reaction (ADR) equation, such as nonlinear advection, diffusion or source terms, as well as non-constant coefficient equations. After that, we provide a solution of Burgers' equation in a novel setup. The proposed solutions satisfy the continuity of mass for the ambient flow, which is a crucial factor for coupled hydrodynamics-transport solvers. Then, we use the derived analytical solutions for code verification. To clarify gray-literature issues in the verification of transport codes, we designed a comprehensive test suite to uncover any imperfection in transport solvers via a hierarchical increase in the level of tests' complexity. The test suite includes hundreds of unit tests and system tests to check the individual portions of the code. The tests start with a simple case of unidirectional advection, move on to bidirectional advection and tidal flow, and build up to nonlinear cases. We design tests to check nonlinearity in velocity, dispersivity and reactions. The concealing effect of scales (Peclet and Damkohler numbers) on the mesh-convergence study and appropriate remedies are also discussed. For cases in which appropriate benchmarks for the mesh-convergence study are not available, we utilize symmetry. Auxiliary subroutines for automation of the test suite and report generation are designed. All in all, the test package is not only a robust tool for code verification but also provides comprehensive insight into the capabilities of ADR solvers. Such information is essential for any rigorous computational modeling of the ADR equation for surface/subsurface pollution transport. We also convey our experiences in finding several errors that were not detectable with routine verification techniques.
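    The mesh-convergence analysis that the suite supports reduces, at its core, to estimating an observed order of accuracy from errors measured against an exact (MES) solution on successively refined grids. The sketch below illustrates that calculation; the error values and refinement ratio are made up for illustration.

```python
# Sketch of a mesh-convergence check against an exact (MES) solution:
# the observed order of accuracy p is estimated from errors on grids
# refined by a constant factor r. Error values here are illustrative.
import math

def observed_order(err_coarse: float, err_fine: float, r: float = 2.0) -> float:
    """p = log(E_coarse / E_fine) / log(r) for a refinement ratio r."""
    return math.log(err_coarse / err_fine) / math.log(r)

errors = [4.0e-3, 1.1e-3, 2.8e-4]   # L2 errors on successively halved meshes
for coarse, fine in zip(errors, errors[1:]):
    print(f"observed order ~ {observed_order(coarse, fine):.2f}")
# A second-order scheme should approach p ~ 2 as the mesh is refined.
```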

  16. Automation and uncertainty analysis of a method for in-vivo range verification in particle therapy.

    PubMed

    Frey, K; Unholtz, D; Bauer, J; Debus, J; Min, C H; Bortfeld, T; Paganetti, H; Parodi, K

    2014-10-07

    We introduce the automation of the range difference calculation deduced from particle-irradiation induced β(+)-activity distributions with the so-called most-likely-shift approach, and evaluate its reliability via the monitoring of algorithm- and patient-specific uncertainty factors. The calculation of the range deviation is based on the minimization of the absolute profile differences in the distal part of two activity depth profiles shifted against each other. Depending on the workflow of positron emission tomography (PET)-based range verification, the two profiles under evaluation can correspond to measured and simulated distributions, or only measured data from different treatment sessions. In comparison to previous work, the proposed approach includes an automated identification of the distal region of interest for each pair of PET depth profiles and under consideration of the planned dose distribution, resulting in the optimal shift distance. Moreover, it introduces an estimate of uncertainty associated to the identified shift, which is then used as weighting factor to 'red flag' problematic large range differences. Furthermore, additional patient-specific uncertainty factors are calculated using available computed tomography (CT) data to support the range analysis. The performance of the new method for in-vivo treatment verification in the clinical routine is investigated with in-room PET images for proton therapy as well as with offline PET images for proton and carbon ion therapy. The comparison between measured PET activity distributions and predictions obtained by Monte Carlo simulations or measurements from previous treatment fractions is performed. For this purpose, a total of 15 patient datasets were analyzed, which were acquired at Massachusetts General Hospital and Heidelberg Ion-Beam Therapy Center with in-room PET and offline PET/CT scanners, respectively. Calculated range differences between the compared activity distributions are reported in a 2D map in beam-eye-view. In comparison to previously proposed approaches, the new most-likely-shift method shows more robust results for assessing in-vivo the range from strongly varying PET distributions caused by differing patient geometry, ion beam species, beam delivery techniques, PET imaging concepts and counting statistics. The additional visualization of the uncertainties and the dedicated weighting strategy contribute to the understanding of the reliability of observed range differences and the complexity in the prediction of activity distributions. The proposed method promises to offer a feasible technique for clinical routine of PET-based range verification.
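    The shift search at the heart of the most-likely-shift idea can be sketched compactly: slide one activity depth profile against the other and keep the shift that minimizes the summed absolute difference over a distal window. The profiles, the fixed distal window and the shift grid below are synthetic assumptions; the published method adds automated region-of-interest selection and the uncertainty weighting described above.

```python
# Illustrative sketch of a most-likely-shift style range comparison:
# shift one beta+ activity depth profile against the other and keep the
# shift minimizing the summed absolute difference over a distal window.
# Profiles and the distal-window choice are synthetic assumptions.
import numpy as np

def most_likely_shift(reference, measured, distal, shifts_mm, dz_mm=1.0):
    """Return the shift (mm) minimizing |reference - shifted measured| on the distal ROI."""
    z = np.arange(len(reference)) * dz_mm
    costs = []
    for s in shifts_mm:
        # Evaluate the measured profile at depth z + s (positive s: measured
        # profile lies ~s mm deeper than the reference).
        shifted = np.interp(z, z - s, measured)
        costs.append(np.sum(np.abs(reference[distal] - shifted[distal])))
    return shifts_mm[int(np.argmin(costs))]

z = np.arange(0, 150, 1.0)                               # depth [mm]
reference = 1.0 / (1.0 + np.exp((z - 100.0) / 3.0))      # sigmoid-like distal fall-off
measured  = 1.0 / (1.0 + np.exp((z - 104.0) / 3.0))      # same shape, 4 mm deeper
distal = slice(80, 130)                                  # assumed distal region of interest
shifts = np.arange(-10.0, 10.5, 0.5)
print(most_likely_shift(reference, measured, distal, shifts))  # ~ 4.0
```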

  17. Automation and uncertainty analysis of a method for in-vivo range verification in particle therapy

    NASA Astrophysics Data System (ADS)

    Frey, K.; Unholtz, D.; Bauer, J.; Debus, J.; Min, C. H.; Bortfeld, T.; Paganetti, H.; Parodi, K.

    2014-10-01

    We introduce the automation of the range difference calculation deduced from particle-irradiation induced β+-activity distributions with the so-called most-likely-shift approach, and evaluate its reliability via the monitoring of algorithm- and patient-specific uncertainty factors. The calculation of the range deviation is based on the minimization of the absolute profile differences in the distal part of two activity depth profiles shifted against each other. Depending on the workflow of positron emission tomography (PET)-based range verification, the two profiles under evaluation can correspond to measured and simulated distributions, or only measured data from different treatment sessions. In comparison to previous work, the proposed approach includes an automated identification of the distal region of interest for each pair of PET depth profiles and under consideration of the planned dose distribution, resulting in the optimal shift distance. Moreover, it introduces an estimate of uncertainty associated to the identified shift, which is then used as weighting factor to ‘red flag’ problematic large range differences. Furthermore, additional patient-specific uncertainty factors are calculated using available computed tomography (CT) data to support the range analysis. The performance of the new method for in-vivo treatment verification in the clinical routine is investigated with in-room PET images for proton therapy as well as with offline PET images for proton and carbon ion therapy. The comparison between measured PET activity distributions and predictions obtained by Monte Carlo simulations or measurements from previous treatment fractions is performed. For this purpose, a total of 15 patient datasets were analyzed, which were acquired at Massachusetts General Hospital and Heidelberg Ion-Beam Therapy Center with in-room PET and offline PET/CT scanners, respectively. Calculated range differences between the compared activity distributions are reported in a 2D map in beam-eye-view. In comparison to previously proposed approaches, the new most-likely-shift method shows more robust results for assessing in-vivo the range from strongly varying PET distributions caused by differing patient geometry, ion beam species, beam delivery techniques, PET imaging concepts and counting statistics. The additional visualization of the uncertainties and the dedicated weighting strategy contribute to the understanding of the reliability of observed range differences and the complexity in the prediction of activity distributions. The proposed method promises to offer a feasible technique for clinical routine of PET-based range verification.

  18. Array Detector Modules for Spent Fuel Verification

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bolotnikov, Aleksey

    Brookhaven National Laboratory (BNL) proposes to evaluate arrays of position-sensitive virtual Frisch-grid (VFG) detectors for passive gamma-ray emission tomography (ET) to verify the spent fuel in storage casks before they are placed in geo-repositories. Our primary objective is to conduct a preliminary analysis of the arrays' capabilities and to perform field measurements to validate the effectiveness of the proposed array modules. The outcome of this proposal will consist of baseline designs for the future ET system, which can ultimately be used together with neutron detectors. This will demonstrate the use of this technology for spent fuel storage casks.

  19. Advanced user support programme—TEMPUS IML-2

    NASA Astrophysics Data System (ADS)

    Diefenbach, A.; Kratz, M.; Uffelmann, D.; Willnecker, R.

    1995-05-01

    The DLR Microgravity User Support Centre (MUSC) in Cologne has supported microgravity experiments in the field of materials and life sciences since 1979. In the beginning of user support activities, MUSC tasks comprised the basic ground and mission support, whereas present programmes are expanded on, for example, powerful telescience and advanced real time data acquisition capabilities for efficient experiment operation and monitoring. In view of the Space Station era, user support functions will increase further. Additional tasks and growing responsibilities must be covered, e.g. extended science support as well as experiment and facility operations. The user support for TEMPUS IML-2, under contract of the German Space Agency DARA, represents a further step towards the required new-generation of future ground programme. TEMPUS is a new highly sophisticated Spacelab multi-user facility for containerless processing of metallic samples. Electromagnetic levitation technique is applied and various experiment diagnosis tools are offered. Experiments from eight U.S. and German investigator groups have been selected for flight on the second International Microgravity Laboratory Mission IML-2 in 1994. Based on the experience gained in the research programme of the DLR Institute for Space Simulation since 1984, MUSC is performing a comprehensive experiment preparation programme in close collaboration with the investigator teams. Complex laboratory equipment has been built up for technology and experiment preparation development. New experiment techniques have been developed for experiment verification tests. The MUSC programme includes thorough analysis and testing of scientific requirements of every proposed experiment with respect to the facility hard- and software capabilities. In addition, studies on the experiment-specific operation requirements have been performed and suitable telescience scenarios were analysed. The present paper will give a survey of the TEMPUS user support tasks emphasizing the advanced science support activities, which are considered significant for future ground programmes.

  20. On flattening filter‐free portal dosimetry

    PubMed Central

    Novais, Juan Castro; Molina López, María Yolanda; Maqueda, Sheila Ruiz

    2016-01-01

    Varian introduced (in 2010) the option of removing the flattening filter (FF) in their C‐Arm linacs for intensity‐modulated treatments. This mode, called flattening filter‐free (FFF), offers the advantage of a greater dose rate. Varian's “Portal Dosimetry” is an electronic portal imager device (EPID)‐based tool for IMRT verification. This tool lacks the capability of verifying flattening filter‐free (FFF) modes due to saturation and lack of an image prediction algorithm. (Note: the latest versions of this software and EPID correct these issues.) The objective of the present study is to research the feasibility of said verifications (with the older versions of the software and EPID). By placing the EPID at a greater distance, the images can be acquired without saturation, yielding a linearity similar to the flattened mode. For the image prediction, a method was optimized based on the clinically used algorithm (analytical anisotropic algorithm (AAA)) over a homogeneous phantom. The depth inside the phantom and its electronic density were tailored. An application was developed to allow the conversion of a dose plane (in DICOM format) to Varian's custom format for Portal Dosimetry. The proposed method was used for the verification of test and clinical fields for the three qualities used in our institution for IMRT: 6X, 6FFF and 10FFF. The method developed yielded a positive verification (more than 95% of the points pass a 2%/2 mm gamma) for both the clinical and test fields. This method was also capable of “predicting” static and wedged fields. A workflow for the verification of FFF fields was developed. This method relies on the clinical algorithm used for dose calculation and is able to verify the FFF modes, as well as being useful for machine quality assurance. The procedure described does not require new hardware. This method could be used as a verification of Varian's Portal Dose Image Prediction. PACS number(s): 87.53.Kn, 87.55.T‐, 87.56.bd, 87.59.‐e PMID:27455487

  1. Science verification of operational aerosol and cloud products for TROPOMI on Sentinel-5 precursor

    NASA Astrophysics Data System (ADS)

    Lelli, Luca; Gimeno-Garcia, Sebastian; Sanders, Abram; Sneep, Maarten; Rozanov, Vladimir V.; Kokhanvosky, Alexander A.; Loyola, Diego; Burrows, John P.

    2016-04-01

    With the approaching launch of the Sentinel-5 precursor (S-5P) satellite, scheduled for mid-2016, one preparatory task of the L2 working group (composed of the Institute of Environmental Physics IUP Bremen, the Royal Netherlands Meteorological Institute KNMI De Bilt, and the German Aerospace Center DLR Oberpfaffenhofen) has been the assessment of biases among aerosol and cloud products that are going to be inferred by the respective algorithms from measurements of the platform's payload TROPOspheric Monitoring Instrument (TROPOMI). The instrument will measure terrestrial radiance with varying moderate spectral resolutions from the ultraviolet through the shortwave infrared. Specifically, all the operational and verification algorithms involved in this comparison exploit the sensitivity of molecular oxygen absorption (the A-band, 755-775 nm, with a resolution of 0.54 nm) to changes in optical and geometrical parameters of tropospheric scattering layers. Therefore, aerosol layer height (ALH) and thickness (AOT), cloud top height (CTH), thickness (COT) and albedo (CA) are the targeted properties. First, the verification of these properties has been accomplished upon synchronisation of the respective forward radiative transfer models for a variety of atmospheric scenarios. Then, biases against independent techniques have been evaluated with real measurements of selected GOME-2 orbits. Global seasonal bias assessment has been carried out for CTH, CA and COT, whereas the verification of ALH and AOT is based on the analysis of the ash plume emitted during the eruption of the Icelandic volcano Eyjafjallajökull in May 2010 and selected dust scenes off the Saharan west coast sensed by SCIAMACHY in 2009.

  2. Root-sum-square structural strength verification approach

    NASA Technical Reports Server (NTRS)

    Lee, Henry M.

    1994-01-01

    Utilizing a proposed fixture design or some variation thereof, this report presents a verification approach to strength test space flight payload components, electronics boxes, mechanisms, lines, fittings, etc., which traditionally do not lend themselves to classical static loading. The fixture, through use of ordered Euler rotation angles derived herein, can be mounted on existing vibration shakers and can provide an innovative method of applying single axis flight load vectors. The versatile fixture effectively loads protoflight or prototype components in all three axes simultaneously by use of a sinusoidal burst of desired magnitude at less than one-third the first resonant frequency. Cost savings along with improved hardware confidence are shown. The end product is an efficient way to verify experiment hardware for both random vibration and strength.

  3. FY2017 Final Report: Power of the People: A technical ethical and experimental examination of the use of crowdsourcing to support international nuclear safeguards verification.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gastelum, Zoe Nellie; Sentz, Kari; Swanson, Meili Claire

    Recent advances in information technology have led to an expansion of crowdsourcing activities that utilize the “power of the people” harnessed via online games, communities of interest, and other platforms to collect, analyze, verify, and provide technological solutions for challenges from a multitude of domains. In response to this surge in popularity, the research team developed a taxonomy of crowdsourcing activities as they relate to international nuclear safeguards, evaluated the potential legal and ethical issues surrounding the use of crowdsourcing to support safeguards, and proposed experimental designs to test the capabilities and prospects for the use of crowdsourcing to support nuclear safeguards verification.

  4. 75 FR 81315 - Earth Sciences Proposal Review Panel; Notice of Meeting

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-12-27

    ... NATIONAL SCIENCE FOUNDATION Earth Sciences Proposal Review Panel; Notice of Meeting In accordance... announces the following meeting. Name: Proposal Review Panel in Earth Sciences (1569). Date and Time... Kelz, Program Director, Instrumentation & Facilities Program, Division of Earth Sciences, Room 785...

  5. Multicentre validation of IMRT pre-treatment verification: comparison of in-house and external audit.

    PubMed

    Jornet, Núria; Carrasco, Pablo; Beltrán, Mercè; Calvo, Juan Francisco; Escudé, Lluís; Hernández, Victor; Quera, Jaume; Sáez, Jordi

    2014-09-01

    We performed a multicentre intercomparison of IMRT optimisation and dose planning and of IMRT pre-treatment verification methods and results. The aims were to check consistency between dose plans and to validate whether in-house pre-treatment verification results agreed with those of an external audit. Participating centres used two mock cases (prostate and head and neck) for the intercomparison and audit. Compliance with dosimetric goals and the total number of MU per plan were collected. A simple quality index to compare the different plans was proposed. We compared gamma-index pass rates obtained with each centre's equipment and methodology to those of an external audit. While all centres fulfilled the dosimetric goals for the prostate case and plan quality was homogeneous, this was not so for the head and neck case. The number of MU did not correlate with the plan quality index. Pre-treatment verification results of the external audit did not agree with those of the in-house measurements for two centres: results were within tolerance for the in-house measurements but unacceptable for the audit, or the other way round. Although all plans fulfilled dosimetric constraints, plan quality is highly dependent on the planner's expertise. External audits are an excellent tool to detect errors in IMRT implementation and cannot be replaced by an intercomparison of results obtained by the centres themselves. Copyright © 2014 Elsevier Ireland Ltd. All rights reserved.
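    The gamma-index pass rate used in such pre-treatment comparisons can be sketched for one-dimensional dose profiles as below. The 3%/3 mm criterion, the Gaussian profiles and the brute-force search are illustrative simplifications; clinical tools operate on 2D/3D dose distributions with interpolation and local/global normalization options.

```python
# Illustrative 1D global gamma-index sketch (3%/3 mm): for each measured
# point, search reference points for the minimum combined dose-difference /
# distance-to-agreement metric. Profiles and criteria are example values.
import numpy as np

def gamma_pass_rate(ref, meas, dx_mm, dose_tol=0.03, dta_mm=3.0):
    ref = np.asarray(ref, float)
    meas = np.asarray(meas, float)
    x = np.arange(len(ref)) * dx_mm
    norm = ref.max()                      # global normalization
    pass_count = 0
    for xi, di in zip(x, meas):
        dose_term = ((di - ref) / (dose_tol * norm)) ** 2
        dist_term = ((xi - x) / dta_mm) ** 2
        gamma = np.sqrt(np.min(dose_term + dist_term))
        pass_count += gamma <= 1.0
    return 100.0 * pass_count / len(meas)

ref = np.exp(-((np.arange(100) - 50.0) / 20.0) ** 2)           # reference profile
meas = 1.02 * np.exp(-((np.arange(100) - 51.0) / 20.0) ** 2)   # 2% / 1 mm off
print(f"{gamma_pass_rate(ref, meas, dx_mm=1.0):.1f}% of points pass 3%/3 mm")
```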

  6. Personal Verification/Identification via Analysis of the Peripheral ECG Leads: Influence of the Personal Health Status on the Accuracy

    PubMed Central

    Bortolan, Giovanni

    2015-01-01

    Traditional means of identity validation (PIN codes, passwords) and physiological and behavioral biometric characteristics (fingerprint, iris, and speech) are susceptible to hacker attacks and/or falsification. This paper presents a method for person verification/identification based on the correlation of present-to-previous limb ECG leads: lead I (r_I), lead II (r_II), the first principal ECG component calculated from them (r_PCA), and linear and nonlinear combinations of r_I, r_II, and r_PCA. For the verification task, the one-to-one scenario is applied and threshold values for r_I, r_II, r_PCA and their combinations are derived. The identification task assumes a one-to-many scenario, and the tested subject is identified according to the maximal correlation with a previously recorded ECG in a database. The population-based ECG-ILSA database of 540 patients (147 healthy subjects, 175 patients with cardiac diseases, and 218 with hypertension) has been considered. In addition, a common reference PTB dataset (14 healthy individuals) with a short time interval between the two acquisitions has been taken into account. The results on the ECG-ILSA database were satisfactory for healthy people, and there was no significant decrease for non-healthy patients, demonstrating the robustness of the proposed method. With the PTB database, the method provides an identification accuracy of 92.9% and a verification sensitivity and specificity of 100% and 89.9%. PMID:26568954
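    A minimal sketch of the correlation step reads as follows: correlate the present recording of leads I and II with a previously enrolled recording and accept the identity claim when both correlations exceed a threshold. The synthetic signals and the 0.9 threshold are assumptions for illustration, not values from the paper.

```python
# Illustrative sketch of correlation-based ECG verification: correlate a
# present recording of leads I and II with a previously enrolled recording
# and accept when both correlations exceed a threshold. Signals and the
# 0.9 threshold are synthetic assumptions.
import numpy as np

def lead_correlation(present, previous):
    """Pearson correlation between two single-lead ECG segments."""
    return np.corrcoef(present, previous)[0, 1]

def verify(present_leads, enrolled_leads, threshold=0.9):
    r_i = lead_correlation(present_leads["I"], enrolled_leads["I"])
    r_ii = lead_correlation(present_leads["II"], enrolled_leads["II"])
    return (r_i >= threshold) and (r_ii >= threshold), (r_i, r_ii)

t = np.linspace(0, 1, 500)
template = {"I": np.sin(2 * np.pi * 7 * t), "II": np.sin(2 * np.pi * 9 * t)}
rng = np.random.default_rng(1)
noisy = {k: v + 0.05 * rng.normal(size=t.size) for k, v in template.items()}
print(verify(noisy, template))   # expected: (True, (r_I, r_II)) with r close to 1
```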

  7. Warhead verification as inverse problem: Applications of neutron spectrum unfolding from organic-scintillator measurements

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lawrence, Chris C.; Flaska, Marek; Pozzi, Sara A.

    2016-08-14

    Verification of future warhead-dismantlement treaties will require detection of certain warhead attributes without the disclosure of sensitive design information, and this presents an unusual measurement challenge. Neutron spectroscopy—commonly eschewed as an ill-posed inverse problem—may hold special advantages for warhead verification by virtue of its insensitivity to certain neutron-source parameters like plutonium isotopics. In this article, we investigate the usefulness of unfolded neutron spectra obtained from organic-scintillator data for verifying a particular treaty-relevant warhead attribute: the presence of high-explosive and neutron-reflecting materials. Toward this end, several improvements on current unfolding capabilities are demonstrated: deuterated detectors are shown to have superior response-matrix condition to that of standard hydrogen-based scintillators; a novel data-discretization scheme is proposed which removes important detector nonlinearities; and a technique is described for re-parameterizing the unfolding problem in order to constrain the parameter space of solutions sought, sidestepping the inverse problem altogether. These improvements are demonstrated with trial measurements and verified using accelerator-based time-of-flight calculation of reference spectra. Then, a demonstration is presented in which the elemental compositions of low-Z neutron-attenuating materials are estimated to within 10%. These techniques could have direct application in verifying the presence of high-explosive materials in a neutron-emitting test item, as well as for other treaty verification challenges.
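    The unfolding problem the abstract refers to can be illustrated, in its barest form, as a non-negative least-squares inversion of a small response matrix. The toy 4x4 response matrix and spectrum below are assumptions standing in for measured detector response functions; real unfolding, as the article discusses, must also contend with ill-conditioning and regularization.

```python
# Toy illustration of spectrum unfolding as an inverse problem: recover a
# non-negative neutron spectrum phi from measured pulse-height data m given
# a detector response matrix R (m = R @ phi). The 4x4 response matrix and
# spectrum are synthetic stand-ins for measured response functions.
import numpy as np
from scipy.optimize import nnls

R = np.array([[0.9, 0.3, 0.1, 0.0],     # response of channel i to energy bin j
              [0.1, 0.8, 0.3, 0.1],
              [0.0, 0.2, 0.7, 0.3],
              [0.0, 0.0, 0.2, 0.6]])
phi_true = np.array([1.0, 2.0, 1.5, 0.5])
measured = R @ phi_true + 0.01 * np.random.default_rng(2).normal(size=4)

phi_unfolded, residual = nnls(R, measured)
print(np.round(phi_unfolded, 2), round(residual, 4))
```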

  8. Warhead verification as inverse problem: Applications of neutron spectrum unfolding from organic-scintillator measurements

    NASA Astrophysics Data System (ADS)

    Lawrence, Chris C.; Febbraro, Michael; Flaska, Marek; Pozzi, Sara A.; Becchetti, F. D.

    2016-08-01

    Verification of future warhead-dismantlement treaties will require detection of certain warhead attributes without the disclosure of sensitive design information, and this presents an unusual measurement challenge. Neutron spectroscopy—commonly eschewed as an ill-posed inverse problem—may hold special advantages for warhead verification by virtue of its insensitivity to certain neutron-source parameters like plutonium isotopics. In this article, we investigate the usefulness of unfolded neutron spectra obtained from organic-scintillator data for verifying a particular treaty-relevant warhead attribute: the presence of high-explosive and neutron-reflecting materials. Toward this end, several improvements on current unfolding capabilities are demonstrated: deuterated detectors are shown to have superior response-matrix condition to that of standard hydrogen-based scintillators; a novel data-discretization scheme is proposed which removes important detector nonlinearities; and a technique is described for re-parameterizing the unfolding problem in order to constrain the parameter space of solutions sought, sidestepping the inverse problem altogether. These improvements are demonstrated with trial measurements and verified using accelerator-based time-of-flight calculation of reference spectra. Then, a demonstration is presented in which the elemental compositions of low-Z neutron-attenuating materials are estimated to within 10%. These techniques could have direct application in verifying the presence of high-explosive materials in a neutron-emitting test item, as well as for other treaty verification challenges.

  9. Illumination-tolerant face verification of low-bit-rate JPEG2000 wavelet images with advanced correlation filters for handheld devices

    NASA Astrophysics Data System (ADS)

    Wijaya, Surya Li; Savvides, Marios; Vijaya Kumar, B. V. K.

    2005-02-01

    Face recognition on mobile devices, such as personal digital assistants and cell phones, is a big challenge owing to the limited computational resources available to run verifications on the devices themselves. One approach is to transmit the captured face images by use of the cell-phone connection and to run the verification on a remote station. However, owing to limitations in communication bandwidth, it may be necessary to transmit a compressed version of the image. We propose using the image compression standard JPEG2000, which is a wavelet-based compression engine used to compress the face images to low bit rates suitable for transmission over low-bandwidth communication channels. At the receiver end, the face images are reconstructed with a JPEG2000 decoder and are fed into the verification engine. We explore how advanced correlation filters, such as the minimum average correlation energy filter [Appl. Opt. 26, 3633 (1987)] and its variants, perform by using face images captured under different illumination conditions and encoded with different bit rates under the JPEG2000 wavelet-encoding standard. We evaluate the performance of these filters by using illumination variations from the Carnegie Mellon University's Pose, Illumination, and Expression (PIE) face database. We also demonstrate the tolerance of these filters to noisy versions of images with illumination variations.

  10. Personal Verification/Identification via Analysis of the Peripheral ECG Leads: Influence of the Personal Health Status on the Accuracy.

    PubMed

    Jekova, Irena; Bortolan, Giovanni

    2015-01-01

    Traditional means of identity validation (PIN codes, passwords) and physiological and behavioral biometric characteristics (fingerprint, iris, and speech) are susceptible to hacker attacks and/or falsification. This paper presents a method for person verification/identification based on the correlation of present-to-previous limb ECG leads: lead I (r_I), lead II (r_II), the first principal ECG component calculated from them (r_PCA), and linear and nonlinear combinations of r_I, r_II, and r_PCA. For the verification task, the one-to-one scenario is applied and threshold values for r_I, r_II, r_PCA and their combinations are derived. The identification task assumes a one-to-many scenario, and the tested subject is identified according to the maximal correlation with a previously recorded ECG in a database. The population-based ECG-ILSA database of 540 patients (147 healthy subjects, 175 patients with cardiac diseases, and 218 with hypertension) has been considered. In addition, a common reference PTB dataset (14 healthy individuals) with a short time interval between the two acquisitions has been taken into account. The results on the ECG-ILSA database were satisfactory for healthy people, and there was no significant decrease for non-healthy patients, demonstrating the robustness of the proposed method. With the PTB database, the method provides an identification accuracy of 92.9% and a verification sensitivity and specificity of 100% and 89.9%.

  11. International Energy Agency Ocean Energy Systems Task 10 Wave Energy Converter Modeling Verification and Validation: Preprint

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wendt, Fabian F; Yu, Yi-Hsiang; Nielsen, Kim

    This is the first joint reference paper for the Ocean Energy Systems (OES) Task 10 Wave Energy Converter modeling verification and validation group. The group is established under the OES Energy Technology Network program under the International Energy Agency. OES was founded in 2001 and Task 10 was proposed by Bob Thresher (National Renewable Energy Laboratory) in 2015 and approved by the OES Executive Committee EXCO in 2016. The kickoff workshop took place in September 2016, wherein the initial baseline task was defined. Experience from similar offshore wind validation/verification projects (OC3-OC5 conducted within the International Energy Agency Wind Task 30) [1], [2] showed that a simple test case would help the initial cooperation to present results in a comparable way. A heaving sphere was chosen as the first test case. The team of project participants simulated different numerical experiments, such as heave decay tests and regular and irregular wave cases. The simulation results are presented and discussed in this paper.

  12. Comprehensive predictions of target proteins based on protein-chemical interaction using virtual screening and experimental verifications.

    PubMed

    Kobayashi, Hiroki; Harada, Hiroko; Nakamura, Masaomi; Futamura, Yushi; Ito, Akihiro; Yoshida, Minoru; Iemura, Shun-Ichiro; Shin-Ya, Kazuo; Doi, Takayuki; Takahashi, Takashi; Natsume, Tohru; Imoto, Masaya; Sakakibara, Yasubumi

    2012-04-05

    Identification of the target proteins of bioactive compounds is critical for elucidating the mode of action; however, target identification has generally been difficult, mostly due to the low sensitivity of detection using affinity chromatography followed by CBB staining and MS/MS analysis. We applied our protocol of predicting target proteins, which combines in silico screening and experimental verification, to incednine, which inhibits the anti-apoptotic function of Bcl-xL by an unknown mechanism. One hundred eighty-two target protein candidates were computationally predicted to bind to incednine by the statistical prediction method, and the predictions were verified by in vitro binding of incednine to seven proteins whose expression could be confirmed in our cell system. As a result, 40% accuracy of the computational predictions was achieved, and we newly identified three incednine-binding proteins. This study revealed that our proposed protocol of predicting target proteins, combining in silico screening and experimental verification, is useful and provides new insight into strategies for identifying the target proteins of small molecules.

  13. Authentication Based on Pole-zero Models of Signature Velocity

    PubMed Central

    Rashidi, Saeid; Fallah, Ali; Towhidkhah, Farzad

    2013-01-01

    With the increase of communication and financial transactions through the internet, on-line signature verification is an accepted biometric technology for access control and plays a significant role in authentication and authorization in modern society. Therefore, fast and precise algorithms for signature verification are very attractive. The goal of this paper is to model the velocity signal, whose pattern and properties are stable for each person. Using pole-zero models based on the discrete cosine transform, a precise modeling method is proposed, and features are then extracted from the strokes. Using linear, Parzen-window and support vector machine classifiers, the signature verification technique was tested with a large number of authentic and forged signatures and demonstrated good potential. The signatures were collected from three different databases: a proprietary database, and the SVC2004 and Sabanci University (SUSIG) benchmark databases. Experimental results based on the Persian, SVC2004 and SUSIG databases show that our method achieves equal error rates of 5.91%, 5.62% and 3.91% on skilled forgeries, respectively. PMID:24696797

  14. Precision cleaning verification of fluid components by air/water impingement and total carbon analysis

    NASA Technical Reports Server (NTRS)

    Barile, Ronald G.; Fogarty, Chris; Cantrell, Chris; Melton, Gregory S.

    1994-01-01

    NASA personnel at Kennedy Space Center's Material Science Laboratory have developed new environmentally sound precision cleaning and verification techniques for systems and components found at the center. This technology is required to replace existing methods traditionally employing CFC-113. The new patent-pending technique of precision cleaning verification is for large components of cryogenic fluid systems. These are stainless steel, sand cast valve bodies with internal surface areas ranging from 0.2 to 0.9 sq m. Extrapolation of this technique to components of even larger sizes (by orders of magnitude) is planned. Currently, the verification process is completely manual. In the new technique, a high velocity, low volume water stream impacts the part to be verified. This process is referred to as Breathing Air/Water Impingement and forms the basis for the Impingement Verification System (IVS). The system is unique in that a gas stream is used to accelerate the water droplets to high speeds. Water is injected into the gas stream in a small, continuous amount. The air/water mixture is then passed through a converging/diverging nozzle where the gas is accelerated to supersonic velocities. These droplets impart sufficient energy to the precision cleaned surface to place non-volatile residue (NVR) contaminants into suspension in the water. The sample water is collected and its NVR level is determined by total organic carbon (TOC) analysis at 880 C. The TOC, in ppm carbon, is used to establish the NVR level. A correlation between the present gravimetric CFC113 NVR and the IVS NVR is found from experimental sensitivity factors measured for various contaminants. The sensitivity has the units of ppm of carbon per mg/sq ft of contaminant. In this paper, the equipment is described and data are presented showing the development of the sensitivity factors from a test set including four NVRs impinged from witness plates of 0.05 to 0.75 sq m.

  15. Precision Cleaning Verification of Fluid Components by Air/Water Impingement and Total Carbon Analysis

    NASA Technical Reports Server (NTRS)

    Barile, Ronald G.; Fogarty, Chris; Cantrell, Chris; Melton, Gregory S.

    1995-01-01

    NASA personnel at Kennedy Space Center's Material Science Laboratory have developed new environmentally sound precision cleaning and verification techniques for systems and components found at the center. This technology is required to replace existing methods traditionally employing CFC-113. The new patent-pending technique of precision cleaning verification is for large components of cryogenic fluid systems. These are stainless steel, sand cast valve bodies with internal surface areas ranging from 0.2 to 0.9 sq m. Extrapolation of this technique to components of even larger sizes (by orders of magnitude) is planned. Currently, the verification process is completely manual. In the new technique, a high velocity, low volume water stream impacts the part to be verified. This process is referred to as Breathing Air/Water Impingement and forms the basis for the Impingement Verification System (IVS). The system is unique in that a gas stream is used to accelerate the water droplets to high speeds. Water is injected into the gas stream in a small, continuous amount. The air/water mixture is then passed through a converging-diverging nozzle where the gas is accelerated to supersonic velocities. These droplets impart sufficient energy to the precision cleaned surface to place non-volatile residue (NVR) contaminants into suspension in the water. The sample water is collected and its NVR level is determined by total organic carbon (TOC) analysis at 880 C. The TOC, in ppm carbon, is used to establish the NVR level. A correlation between the present gravimetric CFC-113 NVR and the IVS NVR is found from experimental sensitivity factors measured for various contaminants. The sensitivity has the units of ppm of carbon per mg/sq ft of contaminant. In this paper, the equipment is described and data are presented showing the development of the sensitivity factors from a test set including four NVRs impinged from witness plates of 0.05 to 0.75 sq m.

  16. Verification and Improvement of ERS-1/2 Altimeter Geophysical Data Records for Global Change Studies

    NASA Technical Reports Server (NTRS)

    Shum, C. K.

    2000-01-01

    This Final Technical Report summarizes the research work conducted under NASA's Physical Oceanography Program entitled Verification and Improvement of ERS-1/2 Altimeter Geophysical Data Records for Global Change Studies, for the period from January 1, 2000 through June 30, 2000. This report also provides a summary of the investigation from July 1, 1997 through June 30, 2000. The primary objectives of this investigation include verification and improvement of the ERS-1 and ERS-2 radar altimeter geophysical data records for distribution of the data to the ESA-approved U.S. ERS-1/-2 investigators for global climate change studies. Specifically, the investigation is to verify and improve the ERS geophysical data record products by calibrating the instrument and assessing the accuracy of the ERS-1/-2 orbital, geophysical, media, and instrument corrections. The purpose is to ensure the consistency of constants, standards and algorithms with the TOPEX/POSEIDON radar altimeter for global climate change studies such as the monitoring and interpretation of long-term sea level change. This investigation has provided the current best precise orbits, with the radial orbit accuracy for ERS-1 (Phases C-G) and ERS-2 estimated at the 3-5 cm rms level, a 30-fold improvement compared to the 1993 accuracy. We have finalized the production and verification of the value-added ERS-1 mission data (Phases A, B, C, D, E, F, and G), in collaboration with JPL PODAAC and the University of Texas. Orbit and data verification and improvement of algorithms led to the best data product available to date. ERS-2 altimeter data have been improved, and we have been active in the Envisat (2001 launch) GDR algorithm review and improvement. The data improvement of ERS-1 and ERS-2 led to improvements in the global mean sea surface, marine gravity anomaly and bathymetry models, and to a study of Antarctic mass balance published in Science in 1998.

  17. Temporal evolution modeling of hydraulic and water quality performance of permeable pavements

    NASA Astrophysics Data System (ADS)

    Huang, Jian; He, Jianxun; Valeo, Caterina; Chu, Angus

    2016-02-01

    A mathematical model for predicting hydraulic and water quality performance in both the short and long term is proposed, based on field measurements for three types of permeable pavements: porous asphalt (PA), porous concrete (PC), and permeable interlocking concrete pavers (PICP). The model was applied to three field-scale test sites in Calgary, Alberta, Canada. Model performance was assessed in terms of hydraulic parameters, including time to peak, peak flow, and water balance, and a water quality variable (the removal rate of total suspended solids, TSS). A total of 20 simulated storm events were used for the model calibration and verification processes. The proposed model can simulate the outflow hydrographs with a coefficient of determination (R²) ranging from 0.762 to 0.907 and a normalized root-mean-square deviation (NRMSD) ranging from 13.78% to 17.83%. Comparison of the time to peak, peak flow, runoff volume, and TSS removal rates between measured and modeled values in the model verification phase showed a maximum difference of 11%. The results demonstrate that the proposed model is capable of capturing the temporal dynamics of pavement performance; it therefore has great potential as a practical modeling tool for permeable pavement design and performance assessment.
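
    For reference, the two goodness-of-fit measures quoted above can be computed as in the short sketch below. The paper's exact NRMSD normalization is not stated here, so the sketch assumes normalization by the range of the observed values; the hydrograph numbers are made up for illustration.

    ```python
    import numpy as np

    def r_squared(observed: np.ndarray, modeled: np.ndarray) -> float:
        """Coefficient of determination (R^2) between observed and modeled series."""
        ss_res = np.sum((observed - modeled) ** 2)
        ss_tot = np.sum((observed - observed.mean()) ** 2)
        return 1.0 - ss_res / ss_tot

    def nrmsd_percent(observed: np.ndarray, modeled: np.ndarray) -> float:
        """Root-mean-square deviation normalized by the observed range, in percent."""
        rmsd = np.sqrt(np.mean((observed - modeled) ** 2))
        return 100.0 * rmsd / (observed.max() - observed.min())

    # Made-up outflow hydrograph values (arbitrary units), for illustration only.
    observed = np.array([0.0, 0.4, 1.1, 1.8, 1.3, 0.7, 0.3])
    modeled  = np.array([0.0, 0.5, 1.0, 1.7, 1.4, 0.6, 0.2])
    print(f"R^2 = {r_squared(observed, modeled):.3f}, NRMSD = {nrmsd_percent(observed, modeled):.1f}%")
    ```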

  18. Security Architecture and Protocol for Trust Verifications Regarding the Integrity of Files Stored in Cloud Services.

    PubMed

    Pinheiro, Alexandre; Dias Canedo, Edna; de Sousa Junior, Rafael Timoteo; de Oliveira Albuquerque, Robson; García Villalba, Luis Javier; Kim, Tai-Hoon

    2018-03-02

    Cloud computing is considered an interesting paradigm due to its scalability, availability, and virtually unlimited storage capacity. However, it is challenging to organize a cloud storage service (CSS) that is safe from the client's point of view and to implement this CSS in public clouds, since it is not advisable to blindly consider this configuration as fully trustworthy. Ideally, owners of large amounts of data should be able to entrust their data to the cloud for a long period of time, without the burden of keeping copies of the original data or of accessing the whole content for verifications regarding data preservation. Due to these requirements, integrity, availability, privacy, and trust are still challenging issues for the adoption of cloud storage services, especially when losing or leaking information can bring significant damage, be it legal or business-related. With such concerns in mind, this paper proposes an architecture for periodically monitoring both the information stored in the cloud and the service provider's behavior. The architecture operates with a proposed protocol based on trust and encryption concepts to ensure cloud data integrity without compromising confidentiality and without overloading storage services. Extensive tests and simulations of the proposed architecture and protocol validate their functional behavior and performance.
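
    The abstract does not reproduce the protocol's details, but the general idea of verifying stored data without retaining a full copy or retrieving the whole file can be illustrated with a simple precomputed-challenge scheme, sketched below. All names and the block size are hypothetical, and the paper's actual protocol additionally relies on trust and encryption mechanisms not shown here.

    ```python
    # Minimal sketch of block-level integrity spot checks, not the paper's protocol.
    import hashlib
    import os
    import secrets

    BLOCK_SIZE = 4096  # hypothetical block size

    def read_block(path: str, index: int) -> bytes:
        with open(path, "rb") as f:
            f.seek(index * BLOCK_SIZE)
            return f.read(BLOCK_SIZE)

    def precompute_challenges(path: str, count: int) -> list[tuple[int, bytes, bytes]]:
        """Owner side, before upload: prepare single-use challenges so the full
        file need not be retained for later verification."""
        size = os.path.getsize(path)
        blocks = max(1, -(-size // BLOCK_SIZE))  # ceiling division
        challenges = []
        for _ in range(count):
            index = secrets.randbelow(blocks)
            nonce = secrets.token_bytes(16)
            expected = hashlib.sha256(nonce + read_block(path, index)).digest()
            challenges.append((index, nonce, expected))
        return challenges

    def storage_service_response(path: str, index: int, nonce: bytes) -> bytes:
        """Cloud side: prove possession of the challenged block."""
        return hashlib.sha256(nonce + read_block(path, index)).digest()

    def verify(challenge: tuple[int, bytes, bytes], response: bytes) -> bool:
        """Owner/verifier side: compare the proof with the precomputed digest."""
        _, _, expected = challenge
        return secrets.compare_digest(expected, response)
    ```

    Because each challenge embeds a fresh nonce, the storage service cannot answer from a cached digest; it must still hold the challenged block at verification time.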

  19. Security Architecture and Protocol for Trust Verifications Regarding the Integrity of Files Stored in Cloud Services †

    PubMed Central

    2018-01-01

    Cloud computing is considered an interesting paradigm due to its scalability, availability, and virtually unlimited storage capacity. However, it is challenging to organize a cloud storage service (CSS) that is safe from the client's point of view and to implement this CSS in public clouds, since it is not advisable to blindly consider this configuration as fully trustworthy. Ideally, owners of large amounts of data should be able to entrust their data to the cloud for a long period of time, without the burden of keeping copies of the original data or of accessing the whole content for verifications regarding data preservation. Due to these requirements, integrity, availability, privacy, and trust are still challenging issues for the adoption of cloud storage services, especially when losing or leaking information can bring significant damage, be it legal or business-related. With such concerns in mind, this paper proposes an architecture for periodically monitoring both the information stored in the cloud and the service provider's behavior. The architecture operates with a proposed protocol based on trust and encryption concepts to ensure cloud data integrity without compromising confidentiality and without overloading storage services. Extensive tests and simulations of the proposed architecture and protocol validate their functional behavior and performance. PMID:29498641

  20. Communication Architecture in Mixed-Reality Simulations of Unmanned Systems

    PubMed Central

    2018-01-01

    Verification of the correct functionality of multi-vehicle systems in high-fidelity scenarios is required before any deployment of such a complex system, e.g., in remote sensing missions or in mobile sensor networks. Mixed-reality simulations, in which virtual and physical entities coexist and interact, have been shown to be beneficial for the development, testing, and verification of such systems. This paper deals with the problem of designing the communication subsystem for such highly realistic simulations. Requirements of this communication subsystem, including proper addressing, transparent routing, visibility modeling, and message management, are specified prior to designing an appropriate solution. A suitable architecture for this communication subsystem is then proposed, together with solutions to the challenges that arise when virtual and physical message transmissions occur simultaneously. The proposed architecture can be utilized as a high-fidelity network simulator for vehicular systems with implicit mobility models given by the real trajectories of the vehicles. The architecture has been utilized within multiple projects dealing with the development and practical deployment of multi-UAV systems, which supports its viability and advantages. The experimental results provided show that the communication characteristics achieved with the proposed mixed-reality architecture closely match those of the fully deployed hardware setup. PMID:29538290
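
    The abstract does not give concrete interfaces, but the interplay of addressing, routing, and visibility modeling it describes can be pictured with the minimal sketch below, in which a router delivers messages between virtual and physical nodes only when a simple range-based visibility model, driven by the vehicles' positions, allows it. All class and method names are hypothetical.

    ```python
    # Illustrative sketch of visibility-aware message routing in a mixed-reality
    # simulation; not the architecture implemented in the paper.
    import math
    from dataclasses import dataclass, field

    @dataclass
    class Node:
        """A simulated (virtual) or real (physical) vehicle with a network address."""
        address: str
        physical: bool
        position: tuple[float, float, float]  # taken from the vehicle's trajectory
        inbox: list = field(default_factory=list)

    class VisibilityModel:
        """Range-based visibility: two nodes can exchange messages when within range."""
        def __init__(self, max_range_m: float):
            self.max_range_m = max_range_m

        def can_communicate(self, a: Node, b: Node) -> bool:
            return math.dist(a.position, b.position) <= self.max_range_m

    class Router:
        """Delivers messages between virtual and physical nodes transparently,
        subject to the visibility model."""
        def __init__(self, visibility: VisibilityModel):
            self.visibility = visibility
            self.nodes: dict[str, Node] = {}

        def register(self, node: Node) -> None:
            self.nodes[node.address] = node

        def send(self, src_addr: str, dst_addr: str, payload: bytes) -> bool:
            src, dst = self.nodes[src_addr], self.nodes[dst_addr]
            if not self.visibility.can_communicate(src, dst):
                return False  # dropped: nodes are outside the modeled radio range
            dst.inbox.append((src_addr, payload))
            return True

    if __name__ == "__main__":
        router = Router(VisibilityModel(max_range_m=500.0))
        router.register(Node("uav-virtual-1", physical=False, position=(0.0, 0.0, 50.0)))
        router.register(Node("uav-real-1", physical=True, position=(120.0, 80.0, 60.0)))
        delivered = router.send("uav-virtual-1", "uav-real-1", b"waypoint update")
        print("delivered" if delivered else "dropped")
    ```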
