Sample records for addition verification studies

  1. ENVIRONMENTAL TECHNOLOGY VERIFICATION REPORT: ENVIROFUELS DIESEL FUEL CATALYZER FUEL ADDITIVE

    EPA Science Inventory

    EPA's Environmental Technology Verification Program has tested EnviroFuels' diesel fuel additive, called the Diesel Fuel Catalyzer. EnviroFuels has stated that heavy-duty on- and off-road diesel engines are the intended market for the catalyzer. Preliminary tests conducted indicate...

  2. Simulation verification techniques study

    NASA Technical Reports Server (NTRS)

    Schoonmaker, P. B.; Wenglinski, T. H.

    1975-01-01

    Results are summarized of the simulation verification techniques study which consisted of two tasks: to develop techniques for simulator hardware checkout and to develop techniques for simulation performance verification (validation). The hardware verification task involved definition of simulation hardware (hardware units and integrated simulator configurations), survey of current hardware self-test techniques, and definition of hardware and software techniques for checkout of simulator subsystems. The performance verification task included definition of simulation performance parameters (and critical performance parameters), definition of methods for establishing standards of performance (sources of reference data or validation), and definition of methods for validating performance. Both major tasks included definition of verification software and assessment of verification data base impact. An annotated bibliography of all documents generated during this study is provided.

  3. Shuttle payload interface verification equipment study. Volume 2: Technical document, part 1

    NASA Technical Reports Server (NTRS)

    1976-01-01

    The technical analysis performed during the shuttle payload interface verification equipment study is reported. It describes: (1) the background and intent of the study; (2) the study approach and philosophy, covering all facets of shuttle payload/cargo integration; (3) shuttle payload integration requirements; (4) the preliminary design of the horizontal IVE; (5) the vertical IVE concept; and (6) IVE program development plans, schedule and cost. Also included is a payload integration analysis task to identify potential uses in addition to payload interface verification.

  4. A Single-Boundary Accumulator Model of Response Times in an Addition Verification Task

    PubMed Central

    Faulkenberry, Thomas J.

    2017-01-01

    Current theories of mathematical cognition offer competing accounts of the interplay between encoding and calculation in mental arithmetic. Additive models propose that manipulations of problem format do not interact with the cognitive processes used in calculation. Alternatively, interactive models suppose that format manipulations have a direct effect on calculation processes. In the present study, we tested these competing models by fitting participants' RT distributions in an arithmetic verification task with a single-boundary accumulator model (the shifted Wald distribution). We found that in addition to providing a more complete description of RT distributions, the accumulator model afforded a potentially more sensitive test of format effects. Specifically, we found that format affected drift rate, which implies that problem format has a direct impact on calculation processes. These data give further support for an interactive model of mental arithmetic. PMID:28769853
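
    A minimal sketch of the model-fitting idea, assuming SciPy's parameterization of the inverse Gaussian (Wald) distribution; the mapping to boundary and drift uses the common convention (lambda = a^2, mean decision time = a/v) and is not code from the paper:

        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(1)
        # Simulated response times (s): 0.3 s non-decision shift plus a
        # Wald (inverse Gaussian) decision time
        rts = 0.3 + stats.invgauss.rvs(mu=0.5, scale=1.0, size=500, random_state=rng)

        # Maximum-likelihood fit of the shifted Wald; loc is the shift
        mu_hat, loc_hat, scale_hat = stats.invgauss.fit(rts)

        # Boundary a = sqrt(lambda), drift v = a / mean decision time
        a_hat = np.sqrt(scale_hat)
        v_hat = a_hat / (mu_hat * scale_hat)
        print(f"shift={loc_hat:.3f} s  boundary={a_hat:.3f}  drift={v_hat:.3f}")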

  5. Students' Verification Strategies for Combinatorial Problems

    ERIC Educational Resources Information Center

    Mashiach Eizenberg, Michal; Zaslavsky, Orit

    2004-01-01

    We focus on a major difficulty in solving combinatorial problems, namely, on the verification of a solution. Our study aimed at identifying undergraduate students' tendencies to verify their solutions, and the verification strategies that they employ when solving these problems. In addition, an attempt was made to evaluate the level of efficiency…

  6. Environmental Technology Verification Report: Taconic Energy, Inc. TEA Fuel Additive

    EPA Science Inventory

    The Greenhouse Gas Technology Center (GHG Center) is one of six verification organizations operating under EPA’s ETV program. One sector of significant interest to GHG Center stakeholders is transportation - particularly technologies that result in fuel economy improvements. Taco...

  7. Spacecraft attitude calibration/verification baseline study

    NASA Technical Reports Server (NTRS)

    Chen, L. C.

    1981-01-01

    A baseline study for a generalized spacecraft attitude calibration/verification system is presented. It can be used to define software specifications for three major functions required by a mission: the pre-launch parameter observability and data collection strategy study; the in-flight sensor calibration; and the post-calibration attitude accuracy verification. Analytical considerations are given for both single-axis and three-axis spacecraft. The three-axis attitudes considered include inertial-pointing attitudes, reference-pointing attitudes, and attitudes undergoing specific maneuvers. The attitude sensors and hardware considered include Earth horizon sensors, plane-field Sun sensors, coarse and fine two-axis digital Sun sensors, three-axis magnetometers, fixed-head star trackers, and inertial reference gyros.

  8. Can self-verification strivings fully transcend the self-other barrier? Seeking verification of ingroup identities.

    PubMed

    Gómez, Angel; Seyle, D Conor; Huici, Carmen; Swann, William B

    2009-12-01

    Recent research has demonstrated self-verification strivings in groups, such that people strive to verify collective identities, which are personal self-views (e.g., "sensitive") associated with group membership (e.g., "women"). Such demonstrations stop short of showing that the desire for self-verification can fully transcend the self-other barrier, as in people working to verify ingroup identities (e.g., "Americans are loud") even when such identities are not self-descriptive ("I am quiet and unassuming"). Five studies focus on such ingroup verification strivings. Results indicate that people prefer to interact with individuals who verify their ingroup identities over those who enhance these identities (Experiments 1-5). Strivings for ingroup identity verification were independent of the extent to which the identities were self-descriptive but were stronger among participants who were highly invested in their ingroup identities, as reflected in high certainty of these identities (Experiments 1-4) and high identification with the group (Experiments 1-5). In addition, whereas past demonstrations of self-verification strivings have been limited to efforts to verify the content of identities (Experiments 1-3), the findings also show that people strive to verify the valence of their identities (i.e., the extent to which the identities are valued; Experiments 4 and 5). Self-verification strivings, rather than self-enhancement strivings, appeared to motivate participants' strivings for ingroup identity verification. Links to collective self-verification strivings and social identity theory are discussed.

  9. Verification, Validation and Sensitivity Studies in Computational Biomechanics

    PubMed Central

    Anderson, Andrew E.; Ellis, Benjamin J.; Weiss, Jeffrey A.

    2012-01-01

    Computational techniques and software for the analysis of problems in mechanics have naturally moved from their origins in the traditional engineering disciplines to the study of cell, tissue and organ biomechanics. Increasingly complex models have been developed to describe and predict the mechanical behavior of such biological systems. While the availability of advanced computational tools has led to exciting research advances in the field, the utility of these models is often the subject of criticism due to inadequate model verification and validation. The objective of this review is to present the concepts of verification, validation and sensitivity studies with regard to the construction, analysis and interpretation of models in computational biomechanics. Specific examples from the field are discussed. It is hoped that this review will serve as a guide to the use of verification and validation principles in the field of computational biomechanics, thereby improving the peer acceptance of studies that use computational modeling techniques. PMID:17558646

  10. 40 CFR 1065.920 - PEMS calibrations and verifications.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    § 1065.920 PEMS calibrations and verifications. (a) Subsystem calibrations and verifications. Use all the... verifications and analysis. It may also be necessary to limit the range of conditions under which the PEMS can... additional information or analysis to support your conclusions. (b) Overall verification. This paragraph (b...

  11. Verification of a Viscous Computational Aeroacoustics Code using External Verification Analysis

    NASA Technical Reports Server (NTRS)

    Ingraham, Daniel; Hixon, Ray

    2015-01-01

    The External Verification Analysis approach to code verification is extended to solve the three-dimensional Navier-Stokes equations with constant properties, and is used to verify a high-order computational aeroacoustics (CAA) code. After a brief review of the relevant literature, the details of the EVA approach are presented and compared to the similar Method of Manufactured Solutions (MMS). Pseudocode representations of EVA's algorithms are included, along with the recurrence relations needed to construct the EVA solution. The code verification results show that EVA was able to convincingly verify a high-order, viscous CAA code without the addition of MMS-style source terms, or any other modifications to the code.
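
    For contrast with EVA, here is a minimal SymPy sketch (illustrative, not the paper's code) of the MMS step mentioned above: choose a smooth manufactured solution, apply the governing operator to it, and add the resulting residual to the code under verification as a source term.

        import sympy as sp

        x, t, c, nu = sp.symbols("x t c nu")
        u = sp.sin(x - c * t) * sp.exp(-t)   # manufactured solution, chosen freely

        # Residual of a 1D advection-diffusion operator: u_t + c*u_x - nu*u_xx = S
        S = sp.diff(u, t) + c * sp.diff(u, x) - nu * sp.diff(u, x, 2)
        print(sp.simplify(S))                # source term S(x, t) to add to the code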

  12. Study of techniques for redundancy verification without disrupting systems, phases 1-3

    NASA Technical Reports Server (NTRS)

    1970-01-01

    The problem of verifying the operational integrity of redundant equipment and the impact of a requirement for verification on such equipment are considered. Redundant circuits are examined and the characteristics which determine adaptability to verification are identified. Mutually exclusive and exhaustive categories for verification approaches are established. The range of applicability of these techniques is defined in terms of signal characteristics and redundancy features. Verification approaches are discussed and a methodology for the design of redundancy verification is developed. A case study is presented which involves the design of a verification system for a hypothetical communications system. Design criteria for redundant equipment are presented. Recommendations for the development of technological areas pertinent to the goal of increased verification capabilities are given.

  13. QPF verification using different radar-based analyses: a case study

    NASA Astrophysics Data System (ADS)

    Moré, J.; Sairouni, A.; Rigo, T.; Bravo, M.; Mercader, J.

    2009-09-01

    Verification of QPF in NWP models has always been challenging, not only for knowing which scores best quantify a particular skill of a model but also for choosing the more appropriate methodology when comparing forecasts with observations. On the one hand, an objective verification technique can provide conclusions that are not in agreement with those obtained by the "eyeball" method. Consequently, QPF can provide valuable information to forecasters in spite of having poor scores. On the other hand, there are difficulties in knowing the "truth", so different results can be achieved depending on the procedures used to obtain the precipitation analysis. The aim of this study is to show the importance of combining different precipitation analyses and verification methodologies to obtain a better knowledge of the skills of a forecasting system. In particular, a short-range precipitation forecasting system based on MM5 at 12 km coupled with LAPS is studied in a local convective precipitation event that took place in the NE Iberian Peninsula on October 3rd, 2008. For this purpose, a variety of verification methods (dichotomous, recalibration, and object-oriented methods) are used to verify this case study. At the same time, different precipitation analyses, obtained by interpolating radar data using different techniques, are used in the verification process.
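
    As an illustration of the dichotomous verification mentioned above, a minimal sketch (assumed, not the study's code) of the usual scores from a 2x2 forecast/observation contingency table:

        def dichotomous_scores(hits, false_alarms, misses, correct_negatives):
            pod = hits / (hits + misses)                    # probability of detection
            far = false_alarms / (hits + false_alarms)      # false alarm ratio
            csi = hits / (hits + misses + false_alarms)     # critical success index
            bias = (hits + false_alarms) / (hits + misses)  # frequency bias
            return {"POD": pod, "FAR": far, "CSI": csi, "bias": bias}

        # Hypothetical counts for a rain/no-rain threshold (correct negatives
        # are not used by these particular scores)
        print(dichotomous_scores(hits=42, false_alarms=18, misses=11, correct_negatives=929))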

  14. Study of space shuttle orbiter system management computer function. Volume 2: Automated performance verification concepts

    NASA Technical Reports Server (NTRS)

    1975-01-01

    The findings are presented of investigations on concepts and techniques in automated performance verification. The investigations were conducted to provide additional insight into the design methodology and to develop a consolidated technology base from which to analyze performance verification design approaches. Other topics discussed include data smoothing, function selection, flow diagrams, data storage, and shuttle hydraulic systems.

  15. Systematic study of source mask optimization and verification flows

    NASA Astrophysics Data System (ADS)

    Ben, Yu; Latypov, Azat; Chua, Gek Soon; Zou, Yi

    2012-06-01

    Source mask optimization (SMO) has emerged as a powerful resolution enhancement technique (RET) for advanced technology nodes. However, there is a plethora of flows and verification metrics in the field, confounding the end user of the technique, and a systematic study of the different flows and their possible unification is missing. This contribution is intended to reveal the pros and cons of different SMO approaches and verification metrics, understand their commonality and differences, and provide a generic guideline for RET selection via SMO. The paper discusses three different types of variation that commonly arise in SMO, namely pattern preparation and selection, availability of a relevant OPC recipe for a freeform source, and the metrics used in source verification. Several pattern selection algorithms are compared and the advantages of systematic pattern selection algorithms are discussed. In the absence of a full resist model for SMO, an alternative SMO flow without a full resist model is reviewed. A preferred verification flow with the quality metrics of DOF and MEEF is examined.

  16. Verification Games: Crowd-Sourced Formal Verification

    DTIC Science & Technology

    2016-03-01

    Final technical report from the University of Washington, March 2016, covering June 2012 - September 2015 (contract FA8750...). Abstract: Over the more than three years of the project Verification Games: Crowd-Sourced Formal Verification...

  17. 24 CFR 4001.112 - Income verification.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    § 4001.112 Income verification. The mortgagee shall use FHA's procedures to verify the mortgagor's income and shall comply with the following additional requirements: (a...

  18. Verification study of an emerging fire suppression system

    DOE PAGES

    Cournoyer, Michael E.; Waked, R. Ryan; Granzow, Howard N.; ...

    2016-01-01

    Self-contained fire extinguishers are a robust, reliable and minimally invasive means of fire suppression for gloveboxes. Moreover, plutonium gloveboxes present harsh environmental conditions for polymer materials; these include radiation damage and chemical exposure, both of which tend to degrade the lifetime of engineered polymer components. Several studies have been conducted to determine the robustness of self-contained fire extinguishers in plutonium gloveboxes; before deployment in a nuclear facility, verification tests must be performed. These tests include activation and mass loss calorimeter tests. In addition, compatibility issues with chemical components of the self-contained fire extinguishers need to be addressed. Our study presents activation and mass loss calorimeter test results. After extensive studies, no critical areas of concern have been identified for the plutonium glovebox application of Fire Foe™, except for glovebox operations that use large quantities of bulk plutonium or uranium metal, such as metal casting and pyro-chemistry operations.

  19. Verification and Validation Studies for the LAVA CFD Solver

    NASA Technical Reports Server (NTRS)

    Moini-Yekta, Shayan; Barad, Michael F; Sozer, Emre; Brehm, Christoph; Housman, Jeffrey A.; Kiris, Cetin C.

    2013-01-01

    The verification and validation of the Launch Ascent and Vehicle Aerodynamics (LAVA) computational fluid dynamics (CFD) solver is presented. A modern strategy for verification and validation is described incorporating verification tests, validation benchmarks, continuous integration and version control methods for automated testing in a collaborative development environment. The purpose of the approach is to integrate the verification and validation process into the development of the solver and improve productivity. This paper uses the Method of Manufactured Solutions (MMS) for the verification of 2D Euler equations, 3D Navier-Stokes equations as well as turbulence models. A method for systematic refinement of unstructured grids is also presented. Verification using inviscid vortex propagation and flow over a flat plate is highlighted. Simulation results using laminar and turbulent flow past a NACA 0012 airfoil and ONERA M6 wing are validated against experimental and numerical data.
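
    A minimal sketch (assumed) of the grid-refinement check behind MMS verification: the observed order of accuracy, computed from errors against the manufactured solution on systematically refined grids, should approach the scheme's formal order.

        import math

        def observed_order(err_coarse, err_fine, refinement_ratio=2.0):
            return math.log(err_coarse / err_fine) / math.log(refinement_ratio)

        # Hypothetical L2 errors on three grids, each refined by a factor of 2
        errors = [4.1e-3, 1.05e-3, 2.6e-4]
        for e_coarse, e_fine in zip(errors, errors[1:]):
            print(f"observed order ~ {observed_order(e_coarse, e_fine):.2f}")  # ~2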

  20. CASL Verification and Validation Plan

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mousseau, Vincent Andrew; Dinh, Nam

    2016-06-30

    This report documents the Consortium for Advanced Simulation of LWRs (CASL) verification and validation plan. The document builds upon input from CASL subject matter experts, most notably the CASL Challenge Problem Product Integrators, CASL Focus Area leaders, and CASL code development and assessment teams. This will be a living document that tracks CASL's progress on verification and validation for both the CASL codes (including MPACT, CTF, BISON, MAMBA) and the CASL challenge problems (CIPS, PCI, DNB). The CASL codes and the CASL challenge problems are at differing levels of maturity with respect to validation and verification. The gap analysis will summarize additional work that needs to be done. Additional VVUQ work will be done as resources permit. This report is prepared for the Department of Energy's (DOE's) CASL program in support of milestone CASL.P13.02.

  21. SU-F-T-494: A Multi-Institutional Study of Independent Dose Verification Using Golden Beam Data

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Itano, M; Yamazaki, T; Tachibana, R

    Purpose: In general, the beam data of an individual linac is measured and registered in independent dose verification software, and the verification is performed as a secondary check. In this study, independent dose verification using golden beam data was compared to verification using the individual linac's beam data. Methods: Six institutions participated and three different beam data sets were prepared. One was the individually measured data (Original Beam Data, OBD). The others were generated from all measurements of the same linac model (Model-GBD) and of all linac models (All-GBD). The three beam data sets were registered in the independent verification software program at each institute. Subsequently, patients' plans for eight sites (brain, head and neck, lung, esophagus, breast, abdomen, pelvis and bone) were analyzed using the verification program to compare doses calculated using the three different beam data sets. Results: 1116 plans were collected from the six institutes. Compared to the OBD, the variation using the Model-GBD-based calculation and the All-GBD was 0.0 ± 0.3% and 0.0 ± 0.6%, respectively. The maximum variations were 1.2% and 2.3%, respectively. In the plans with variation over 1%, the reference points were located away from the central axis, with or without a physical wedge. Conclusion: The confidence limit (2SD) using the Model-GBD and the All-GBD was within 0.6% and 1.2%, respectively. Thus, the use of golden beam data may be feasible for independent verification. In addition, verification using golden beam data provides quality assurance of planning from the viewpoint of an audit. This research is partially supported by the Japan Agency for Medical Research and Development (AMED).
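
    A minimal sketch, with made-up numbers, of the summary statistics quoted above: per-plan percent deviations reduced to mean +/- SD, with the confidence limit taken as 2 SD.

        import statistics

        deviations = [0.1, -0.2, 0.4, 0.0, -0.3, 0.2, -0.1]   # % dose differences
        mean = statistics.fmean(deviations)
        sd = statistics.stdev(deviations)
        print(f"mean={mean:+.2f}%  SD={sd:.2f}%  confidence limit (2SD)={2 * sd:.2f}%")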

  22. Generic Verification Protocol for Verification of Online Turbidimeters

    EPA Science Inventory

    This protocol provides generic procedures for implementing a verification test for the performance of online turbidimeters. The verification tests described in this document will be conducted under the Environmental Technology Verification (ETV) Program. Verification tests will...

  23. Adjusting for partial verification or workup bias in meta-analyses of diagnostic accuracy studies.

    PubMed

    de Groot, Joris A H; Dendukuri, Nandini; Janssen, Kristel J M; Reitsma, Johannes B; Brophy, James; Joseph, Lawrence; Bossuyt, Patrick M M; Moons, Karel G M

    2012-04-15

    A key requirement in the design of diagnostic accuracy studies is that all study participants receive both the test under evaluation and the reference standard test. For a variety of practical and ethical reasons, sometimes only a proportion of patients receive the reference standard, which can bias the accuracy estimates. Numerous methods have been described for correcting this partial verification bias or workup bias in individual studies. In this article, the authors describe a Bayesian method for obtaining adjusted results from a diagnostic meta-analysis when partial verification or workup bias is present in a subset of the primary studies. The method corrects for verification bias without having to exclude primary studies with verification bias, thus preserving the main advantages of a meta-analysis: increased precision and better generalizability. The results of this method are compared with the existing methods for dealing with verification bias in diagnostic meta-analyses. For illustration, the authors use empirical data from a systematic review of studies of the accuracy of the immunohistochemistry test for diagnosis of human epidermal growth factor receptor 2 status in breast cancer patients.
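
    For intuition, a minimal sketch of the classic single-study correction of Begg and Greenes, which the Bayesian meta-analytic method generalizes; the counts are hypothetical, and the correction assumes verification depends only on the index test result.

        def begg_greenes(n_pos, n_neg, ver_pos, ver_neg, tp, fp, fn, tn):
            # n_pos/n_neg: all index-test positives/negatives; ver_pos/ver_neg:
            # how many of each were verified; tp, fp among verified positives,
            # fn, tn among verified negatives
            d_pos = n_pos * tp / ver_pos    # expected diseased among test-positives
            d_neg = n_neg * fn / ver_neg    # expected diseased among test-negatives
            nd_pos = n_pos * fp / ver_pos
            nd_neg = n_neg * tn / ver_neg
            sensitivity = d_pos / (d_pos + d_neg)
            specificity = nd_neg / (nd_pos + nd_neg)
            return sensitivity, specificity

        print(begg_greenes(n_pos=300, n_neg=700, ver_pos=290, ver_neg=140,
                           tp=250, fp=40, fn=10, tn=130))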

  24. Formulating face verification with semidefinite programming.

    PubMed

    Yan, Shuicheng; Liu, Jianzhuang; Tang, Xiaoou; Huang, Thomas S

    2007-11-01

    This paper presents a unified solution to three unsolved problems existing in face verification with subspace learning techniques: selection of verification threshold, automatic determination of subspace dimension, and deducing feature fusing weights. In contrast to previous algorithms which search for the projection matrix directly, our new algorithm investigates a similarity metric matrix (SMM). With a certain verification threshold, this matrix is learned by a semidefinite programming approach, along with the constraints of the kindred pairs with similarity larger than the threshold, and inhomogeneous pairs with similarity smaller than the threshold. Then, the subspace dimension and the feature fusing weights are simultaneously inferred from the singular value decomposition of the derived SMM. In addition, the weighted and tensor extensions are proposed to further improve the algorithmic effectiveness and efficiency, respectively. Essentially, the verification is conducted within an affine subspace in this new algorithm and is, hence, called the affine subspace for verification (ASV). Extensive experiments show that the ASV can achieve encouraging face verification accuracy in comparison to other subspace algorithms, even without the need to explore any parameters.
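
    A minimal sketch of the core idea, assuming CVXPY and a soft-margin variant (the paper's exact SDP formulation differs): learn a positive semidefinite similarity metric matrix M so that kindred pairs score above the verification threshold b and inhomogeneous pairs score below it.

        import cvxpy as cp
        import numpy as np

        d = 8
        rng = np.random.default_rng(0)
        kindred = [(x, x + 0.1 * rng.normal(size=d)) for x in rng.normal(size=(20, d))]
        inhomog = [(rng.normal(size=d), rng.normal(size=d)) for _ in range(20)]

        M = cp.Variable((d, d), PSD=True)   # similarity metric matrix, forced PSD
        b = 1.0                             # verification threshold

        def sim(x, y):                      # similarity x^T M y, affine in M
            return cp.sum(cp.multiply(M, np.outer(x, y)))

        loss = sum(cp.pos(b - sim(x, y)) for x, y in kindred) \
             + sum(cp.pos(sim(x, y) - b) for x, y in inhomog)
        cp.Problem(cp.Minimize(loss + 0.1 * cp.norm(M, "fro"))).solve()
        print(np.round(np.linalg.eigvalsh(M.value)[:3], 4))  # eigenvalues >= 0 (PSD)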

  25. Cognitive Bias in the Verification and Validation of Space Flight Systems

    NASA Technical Reports Server (NTRS)

    Larson, Steve

    2012-01-01

    Cognitive bias is generally recognized as playing a significant role in virtually all domains of human decision making. Insight into this role is informally built into many of the system engineering practices employed in the aerospace industry. The review process, for example, typically has features that help to counteract the effect of bias. This paper presents a discussion of how commonly recognized biases may affect the verification and validation process. Verifying and validating a system is arguably more challenging than development, both technically and cognitively. Whereas there may be a relatively limited number of options available for the design of a particular aspect of a system, there is a virtually unlimited number of potential verification scenarios that may be explored. The probability of any particular scenario occurring in operations is typically very difficult to estimate, which increases reliance on judgment that may be affected by bias. Implementing a verification activity often presents technical challenges that, if they can be overcome at all, often result in a departure from actual flight conditions (e.g., 1-g testing, simulation, time compression, artificial fault injection) that may raise additional questions about the meaningfulness of the results, and create opportunities for the introduction of additional biases. In addition to mitigating the biases it can introduce directly, the verification and validation process must also overcome the cumulative effect of biases introduced during all previous stages of development. A variety of cognitive biases will be described, with research results for illustration. A handful of case studies will be presented that show how cognitive bias may have affected the verification and validation process on recent JPL flight projects, identify areas of strength and weakness, and identify potential changes or additions to commonly used techniques that could provide a more robust verification and validation of

  26. Structural dynamics verification facility study

    NASA Technical Reports Server (NTRS)

    Kiraly, L. J.; Hirchbein, M. S.; Mcaleese, J. M.; Fleming, D. P.

    1981-01-01

    The need for a structural dynamics verification facility to support structures programs was studied. Most of the industry-operated facilities are used for highly focused research, component development, and problem solving, and are not used for the generic understanding of the coupled dynamic response of major engine subsystems. Capabilities for the proposed facility include: the ability to both excite and measure coupled structural dynamic response of elastic blades on elastic shafting, the mechanical simulation of various dynamical loadings representative of those seen in operating engines, and the measurement of engine dynamic deflections and interface forces caused by alternative engine mounting configurations and compliances.

  27. Expert system verification and validation study

    NASA Technical Reports Server (NTRS)

    French, Scott W.; Hamilton, David

    1992-01-01

    Five workshops on verification and validation (V&V) of expert systems (ES) were taught during this recent period of performance. Two key activities, previously performed under this contract, supported these recent workshops: (1) a survey of the state-of-the-practice of V&V of ES, and (2) development of workshop material and the first class. The first activity involved performing an extensive survey of ES developers in order to answer several questions regarding the state-of-the-practice in V&V of ES. These questions related to the amount and type of V&V done and the successfulness of this V&V. The next key activity involved developing an intensive hands-on workshop in V&V of ES. This activity involved surveying a large number of V&V techniques, conventional as well as ES-specific ones. In addition to explaining the techniques, we showed how each technique could be applied on a sample problem. References were included in the workshop material, and cross-referenced to techniques, so that students would know where to go to find additional information about each technique. In addition to teaching specific techniques, we included an extensive amount of material on V&V concepts and how to develop a V&V plan for an ES project. We felt this material was necessary so that developers would be prepared to develop an orderly and structured approach to V&V; that is, they would have a process that supported the use of the specific techniques. Finally, to provide hands-on experience, we developed a set of case study exercises. These exercises provided an opportunity for the students to apply all the material (concepts, techniques, and planning material) to a realistic problem.

  28. Exploring the Possible Use of Information Barriers for future Biological Weapons Verification Regimes

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Luke, S J

    2011-12-20

    This report describes a path forward for implementing information barriers in a future generic biological arms-control verification regime. Information barriers have become a staple of discussion in the area of arms control verification approaches for nuclear weapons and components. Information barriers, when used with a measurement system, allow for the determination that an item has sensitive characteristics without releasing any of the sensitive information. Over the last 15 years the United States (with the Russian Federation) has led the development of information barriers in the area of the verification of nuclear weapons and nuclear components. The work of the US and the Russian Federation has prompted other states (e.g., UK and Norway) to consider the merits of information barriers for possible verification regimes. In the context of a biological weapons control verification regime, the dual-use nature of the biotechnology will require protection of sensitive information while allowing for the verification of treaty commitments. A major question that has arisen is whether, in a biological weapons verification regime, the presence or absence of a weapon pathogen can be determined without revealing any information about possible sensitive or proprietary information contained in the genetic materials being declared under a verification regime. This study indicates that a verification regime could be constructed using a small number of pathogens that spans the range of known biological weapons agents. Since the number of possible pathogens is small, it is possible and prudent to treat these pathogens as analogies to attributes in a nuclear verification regime. This study has determined that there may be some information that needs to be protected in a biological weapons control verification regime. To protect this information, the study concludes that the Lawrence Livermore Microbial Detection Array may be a suitable technology for the detection of

  29. Influence of the Redundant Verification and the Non-Redundant Verification on the Hydraulic Tomography

    NASA Astrophysics Data System (ADS)

    Wei, T. B.; Chen, Y. L.; Lin, H. R.; Huang, S. Y.; Yeh, T. C. J.; Wen, J. C.

    2016-12-01

    In groundwater studies, hydraulic tomography (HT) has been used to estimate the heterogeneous spatial distribution of hydraulic properties from field-site pumping tests, and such work has shown that most field-site aquifers exhibit heterogeneous spatial distributions of hydrogeological parameters. Among the proposed HT methods, Huang et al. [2011] used a non-redundant verification analysis, in which the pumping wells were changed and the observation wells were fixed for the inverse and forward analyses, to demonstrate the feasibility of estimating the heterogeneous spatial distribution of hydraulic properties of a field-site aquifer with a steady-state model. The literature to date, however, covers only steady-state, non-redundant verification with changed pumping-well locations and fixed observation-well locations; the various other combinations, pumping wells fixed or changed with observation wells fixed (redundant verification) or observation wells changed (non-redundant verification), have not yet been explored for their influence on the hydraulic tomography method. In this study, we carried out both redundant and non-redundant verification analyses for the forward problem to examine their influence on hydraulic tomography in the transient case. We discuss an actual case at the NYUST campus site to demonstrate the effectiveness of hydraulic tomography methods and confirm the feasibility of the inverse and forward analyses from the analysis results. Keywords: Hydraulic Tomography, Redundant Verification, Heterogeneous, Inverse, Forward

  30. Feasibility of biochemical verification in a web-based smoking cessation study.

    PubMed

    Cha, Sarah; Ganz, Ollie; Cohn, Amy M; Ehlke, Sarah J; Graham, Amanda L

    2017-10-01

    Cogent arguments have been made against the need for biochemical verification in population-based studies with low-demand characteristics. Despite this fact, studies involving digital interventions (low-demand) are often required in peer review to report biochemically verified abstinence. To address this discrepancy, we examined the feasibility and costs of biochemical verification in a web-based study conducted with a national sample. Participants were 600 U.S. adult current smokers who registered on a web-based smoking cessation program and completed surveys at baseline and 3 months. Saliva sampling kits were sent to participants who reported 7-day abstinence at 3 months, and returned samples were analyzed for cotinine. The response rate at 3 months was 41.2% (n=247): 93 participants reported 7-day abstinence (38%) and were mailed a saliva kit (71% returned). The discordance rate was 36.4%. Participants with discordant responses were more likely to report 3-month use of nicotine replacement therapy or e-cigarettes than those with concordant responses (79.2% vs. 45.2%, p=0.007). The total cost of saliva sampling was $8280 ($125/sample). Biochemical verification was both time- and cost-intensive, and yielded a relatively small number of samples due to low response rates and use of other nicotine products during the follow-up period. There was a high rate of discordance between self-reported abstinence and saliva testing. Costs for data collection may be prohibitive for studies with large sample sizes or limited budgets. Our findings echo previous statements that biochemical verification is not necessary in population-based studies, and add evidence specific to technology-based studies.

  31. Verification of Ceramic Structures

    NASA Astrophysics Data System (ADS)

    Behar-Lafenetre, Stephanie; Cornillon, Laurence; Rancurel, Michael; De Graaf, Dennis; Hartmann, Peter; Coe, Graham; Laine, Benoit

    2012-07-01

    In the framework of the “Mechanical Design and Verification Methodologies for Ceramic Structures” contract [1] awarded by ESA, Thales Alenia Space has investigated literature and practices in affiliated industries to propose a methodological guideline for verification of ceramic spacecraft and instrument structures. It has been written in order to be applicable to most types of ceramic or glass-ceramic materials - typically Cesic®, HBCesic®, Silicon Nitride, Silicon Carbide and ZERODUR®. The proposed guideline describes the activities to be performed at material level in order to cover all the specific aspects of ceramics (Weibull distribution, brittle behaviour, sub-critical crack growth). Elementary tests and their post-processing methods are described, and recommendations for optimization of the test plan are given in order to have a consistent database. The application of this method is shown on an example in a dedicated article [7]. Then the verification activities to be performed at system level are described. This includes classical verification activities based on relevant standard (ECSS Verification [4]), plus specific analytical, testing and inspection features. The analysis methodology takes into account the specific behaviour of ceramic materials, especially the statistical distribution of failures (Weibull) and the method to transfer it from elementary data to a full-scale structure. The demonstration of the efficiency of this method is described in a dedicated article [8]. The verification is completed by classical full-scale testing activities. Indications about proof testing, case of use and implementation are given and specific inspection and protection measures are described. These additional activities are necessary to ensure the required reliability. The aim of the guideline is to describe how to reach the same reliability level as for structures made of more classical materials (metals, composites).
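
    As a worked example of the Weibull machinery involved, the standard two-parameter strength model (illustrative; the guideline's exact formulation may differ), where m is the Weibull modulus measured on coupons of volume V_0 with characteristic strength sigma_0:

        % Failure probability of a component of stressed volume V under stress sigma
        P_f(\sigma, V) = 1 - \exp\!\left[-\frac{V}{V_0}\left(\frac{\sigma}{\sigma_0}\right)^{m}\right],
        \qquad
        \sigma_V = \sigma_0 \left(\frac{V_0}{V}\right)^{1/m}

    Equating failure probabilities across volumes is what transfers the statistical distribution of coupon failures to a full-scale structure, as described above.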

  32. SU-E-T-49: A Multi-Institutional Study of Independent Dose Verification for IMRT

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Baba, H; Tachibana, H; Kamima, T

    2015-06-15

    Purpose: AAPM TG114 does not cover independent verification for IMRT. We conducted a study of independent dose verification for IMRT in seven institutes to show its feasibility. Methods: 384 IMRT plans for prostate and head and neck (HN) sites were collected from the institutes, where planning was performed using Eclipse and Pinnacle3 with the two techniques of step and shoot (S&S) and sliding window (SW). All of the institutes used the same independent dose verification software program (Simple MU Analysis: SMU, Triangle Product, Ishikawa, JP), which is Clarkson-based and uses CT images to compute radiological path length. An ion-chamber measurement in a water-equivalent slab phantom was performed to compare the doses computed using the TPS and the independent dose verification program. Additionally, the agreement in dose computed in patient CT images between the TPS and the SMU was assessed. The dose of the composite beams in each plan was evaluated. Results: The agreement between the measurement and the SMU was −2.3±1.9% and −5.6±3.6% for the prostate and HN sites, respectively. The agreement between the TPSs and the SMU was −2.1±1.9% and −3.0±3.7% for the prostate and HN sites, respectively. There was a negative systematic difference with similar standard deviation, and the difference was larger in the HN site. The S&S technique showed a statistically significant difference from the SW technique, because the Clarkson-based method in the independent program cannot consider (and thus underestimates) the dose under the MLC. Conclusion: The accuracy would be improved if the Clarkson-based algorithm were modified for IMRT; the tolerance level would then be within 5%.

  33. Considerations in STS payload environmental verification

    NASA Technical Reports Server (NTRS)

    Keegan, W. B.

    1978-01-01

    Considerations regarding the Space Transportation System (STS) payload environmental verification are reviewed. It is noted that emphasis is placed on testing at the subassembly level and that the basic objective of structural dynamic payload verification is to ensure reliability in a cost-effective manner. Structural analyses consist of: (1) stress analysis for critical loading conditions, (2) modal analysis for launch and orbital configurations, (3) flight loads analysis, (4) test simulation analysis to verify models, (5) kinematic analysis of deployment/retraction sequences, and (6) structural-thermal-optical program analysis. In addition to these approaches, payload verification programs are being developed in the thermal-vacuum area. These include exposure to extreme temperatures, temperature cycling, thermal-balance testing and thermal-vacuum testing.

  34. Requirements, Verification, and Compliance (RVC) Database Tool

    NASA Technical Reports Server (NTRS)

    Rainwater, Neil E., II; McDuffee, Patrick B.; Thomas, L. Dale

    2001-01-01

    This paper describes the development, design, and implementation of the Requirements, Verification, and Compliance (RVC) database used on the International Space Welding Experiment (ISWE) project managed at Marshall Space Flight Center. The RVC is a systems engineer's tool for automating and managing the following information: requirements; requirements traceability; verification requirements; verification planning; verification success criteria; and compliance status. This information, normally contained within documents (e.g., specifications and plans), is stored in an electronic database that allows the project team members to access, query, and status the requirements, verification, and compliance information from their individual desktop computers. Using commercial-off-the-shelf (COTS) database software that contains networking capabilities, the RVC was developed not only with cost savings in mind but primarily for the purpose of providing a more efficient and effective automated method of maintaining and distributing the systems engineering information. In addition, the RVC approach provides the systems engineer the capability to develop and tailor various reports containing the requirements, verification, and compliance information that meets the needs of the project team members. The automated approach of the RVC for capturing and distributing the information improves the productivity of the systems engineer by allowing that person to concentrate more on the job of developing good requirements and verification programs and not on the effort of being a "document developer".
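
    A minimal sketch (assumed; the actual RVC tool was built on COTS database software, and the field names here are hypothetical) of the kind of record such a tool manages, linking each requirement to its verification method, success criteria, and compliance status:

        from dataclasses import dataclass
        from typing import Optional

        @dataclass
        class Requirement:
            req_id: str
            text: str
            parent_id: Optional[str] = None   # traceability to a parent requirement
            method: str = "TBD"               # test / analysis / inspection / demo
            success_criteria: str = "TBD"
            compliance: str = "open"          # open / verified / waived

        # Hypothetical ISWE-style entry and a simple status query
        reqs = [Requirement("ISWE-042", "Chamber shall maintain vacuum during welding",
                            method="test", success_criteria="pressure within spec for 1 h")]
        reqs[0].compliance = "verified"
        print([r.req_id for r in reqs if r.compliance != "verified"])  # open items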

  35. Structural verification for GAS experiments

    NASA Technical Reports Server (NTRS)

    Peden, Mark Daniel

    1992-01-01

    The purpose of this paper is to assist the Get Away Special (GAS) experimenter in conducting a thorough structural verification of the experiment's structural configuration, thus expediting the structural review/approval process and the safety process in general. Material selection for structural subsystems will be covered, with an emphasis on fasteners (GSFC fastener integrity requirements) and primary support structures (Stress Corrosion Cracking requirements and National Space Transportation System (NSTS) requirements). Different approaches to structural verification (tests and analyses) will be outlined, especially those stemming from lessons learned on load and fundamental frequency verification. In addition, fracture control will be covered for those payloads that utilize a door assembly or modify the containment provided by the standard GAS Experiment Mounting Plate (EMP). Structural hazard assessment and the preparation of structural hazard reports will be reviewed to form a summation of structural safety issues for inclusion in the safety data package.

  36. A study of applications scribe frame data verifications using design rule check

    NASA Astrophysics Data System (ADS)

    Saito, Shoko; Miyazaki, Masaru; Sakurai, Mitsuo; Itoh, Takahisa; Doi, Kazumasa; Sakurai, Norioko; Okada, Tomoyuki

    2013-06-01

    In semiconductor manufacturing, scribe frame data is generally generated for each LSI product according to its specific process design. Scribe frame data is designed based on definition tables for scanner alignment, wafer inspection and customer-specified marks, and at the end we check that the scribe frame design conforms to the specifications of the alignment and inspection marks. Recently, in COT (customer owned tooling) business or new technology development, there has been no effective verification method for scribe frame data, and verification takes a lot of time. Therefore, we tried to establish a new verification method for scribe frame data by applying pattern matching and DRC (Design Rule Check), which are used in device verification. We show the scheme of scribe frame data verification using DRC that we applied. First, verification rules are created based on the specifications of the scanner, inspection and others, and a mark library is also created for pattern matching. Next, DRC verification is performed on the scribe frame data; this DRC verification includes pattern matching using the mark library. As a result, our experiments demonstrated that by use of pattern matching and DRC verification our new method can yield speed improvements of more than 12 percent compared to the conventional mark checks by visual inspection, and the inspection time can be reduced to less than 5 percent if multi-CPU processing is used. Our method delivers both short processing time and excellent accuracy when checking many marks. It is easy to maintain and provides an easy way for COT customers to use original marks. We believe that our new DRC verification method for scribe frame data is indispensable and mutually beneficial.

  37. Space Station automated systems testing/verification and the Galileo Orbiter fault protection design/verification

    NASA Technical Reports Server (NTRS)

    Landano, M. R.; Easter, R. W.

    1984-01-01

    Aspects of Space Station automated systems testing and verification are discussed, taking into account several program requirements. It is found that these requirements lead to a number of issues and uncertainties which require study and resolution during the Space Station definition phase. Most, if not all, of the considered uncertainties have implications for the overall testing and verification strategy adopted by the Space Station Program. A description is given of the Galileo Orbiter fault protection design/verification approach. Attention is given to a mission description, an Orbiter description, the design approach and process, the fault protection design verification approach/process, and problems of 'stress' testing.

  38. Separating stages of arithmetic verification: An ERP study with a novel paradigm.

    PubMed

    Avancini, Chiara; Soltész, Fruzsina; Szűcs, Dénes

    2015-08-01

    In studies of arithmetic verification, participants typically encounter two operands and they carry out an operation on these (e.g. adding them). Operands are followed by a proposed answer and participants decide whether this answer is correct or incorrect. However, interpretation of results is difficult because multiple parallel, temporally overlapping numerical and non-numerical processes of the human brain may contribute to task execution. In order to overcome this problem here we used a novel paradigm specifically designed to tease apart the overlapping cognitive processes active during arithmetic verification. Specifically, we aimed to separate effects related to detection of arithmetic correctness, detection of the violation of strategic expectations, detection of physical stimulus properties mismatch and numerical magnitude comparison (numerical distance effects). Arithmetic correctness, physical stimulus properties and magnitude information were not task-relevant properties of the stimuli. We distinguished between a series of temporally highly overlapping cognitive processes which in turn elicited overlapping ERP effects with distinct scalp topographies. We suggest that arithmetic verification relies on two major temporal phases which include parallel running processes. Our paradigm offers a new method for investigating specific arithmetic verification processes in detail.

  39. Feasibility study on dosimetry verification of volumetric-modulated arc therapy-based total marrow irradiation.

    PubMed

    Liang, Yun; Kim, Gwe-Ya; Pawlicki, Todd; Mundt, Arno J; Mell, Loren K

    2013-03-04

    The purpose of this study was to develop dosimetry verification procedures for volumetric-modulated arc therapy (VMAT)-based total marrow irradiation (TMI). VMAT-based TMI plans were generated for three patients: one child and two adults. The planning target volume (PTV) was defined as the bony skeleton, from head to mid-femur, with a 3 mm margin. A planning strategy similar to published studies was adopted. The PTV was divided into head and neck, chest, and pelvic regions, with a separate plan for each, composed of 2-3 arcs/fields. Multiple isocenters were evenly distributed along the patient's axial direction. The focus of this study was to establish a dosimetry quality assurance procedure involving both two-dimensional (2D) and three-dimensional (3D) volumetric verification, which is desirable for a large PTV treated with multiple isocenters. The 2D dose verification was performed with film for gamma evaluation, and absolute point dose was measured with an ion chamber, with attention to hot/cold spots at the junctions between neighboring plans. The 3D volumetric dose verification used commercial dose reconstruction software to reconstruct dose from electronic portal imaging device (EPID) images. The gamma evaluation criteria in both 2D and 3D verification were 5% absolute point dose difference and 3 mm distance to agreement. With film dosimetry, the overall average gamma passing rate was 98.2% and the absolute dose difference was 3.9% in junction areas among the test patients; with volumetric portal dosimetry, the corresponding numbers were 90.7% and 2.4%. A dosimetry verification procedure involving both 2D and 3D was thus developed for VMAT-based TMI. The initial results are encouraging and warrant further investigation in clinical trials.
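
    A minimal 1D sketch (assumed; clinical tools search in 2D/3D) of the gamma evaluation with the study's 5%/3 mm criteria: each evaluated point searches the reference profile for the smallest combined dose-difference and distance penalty, and passes if gamma <= 1.

        import numpy as np

        def gamma_1d(x, dose_eval, dose_ref, dd=0.05, dta=3.0):
            """Gamma per point for two profiles sampled on the same grid x (mm)."""
            norm = dd * dose_ref.max()          # global 5% dose criterion
            out = []
            for xe, de in zip(x, dose_eval):
                cap_gamma = ((x - xe) / dta) ** 2 + ((dose_ref - de) / norm) ** 2
                out.append(np.sqrt(cap_gamma.min()))
            return np.array(out)

        x = np.linspace(0.0, 100.0, 201)                  # positions (mm)
        ref = np.exp(-((x - 50.0) / 20.0) ** 2)           # reference dose profile
        meas = 1.02 * np.exp(-((x - 51.0) / 20.0) ** 2)   # 2% high, shifted 1 mm
        g = gamma_1d(x, meas, ref)
        print(f"gamma passing rate: {100.0 * (g <= 1).mean():.1f}%")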

  40. MARATHON Verification (MARV)

    DTIC Science & Technology

    2017-08-01

    comparable with MARATHON 1 in terms of output. Rather, the MARATHON 2 verification cases were designed to ensure correct implementation of the new algorithms... for employment against demands. This study is a comparative verification of the functionality of MARATHON 4 (our newest implementation of MARATHON

  41. Using Small-Step Refinement for Algorithm Verification in Computer Science Education

    ERIC Educational Resources Information Center

    Simic, Danijela

    2015-01-01

    Stepwise program refinement techniques can be used to simplify program verification. Programs are better understood since their main properties are clearly stated, and verification of rather complex algorithms is reduced to proving simple statements connecting successive program specifications. Additionally, it is easy to analyse similar…

  42. Simulation environment based on the Universal Verification Methodology

    NASA Astrophysics Data System (ADS)

    Fiergolski, A.

    2017-01-01

    Universal Verification Methodology (UVM) is a standardized approach to verifying integrated circuit designs, targeting Coverage-Driven Verification (CDV). It combines automatic test generation, self-checking testbenches, and coverage metrics to indicate progress in the design verification. The flow of CDV differs from the traditional directed-testing approach. With CDV, a testbench developer starts with a structured plan by setting the verification goals. Those goals are then targeted by the developed testbench, which generates legal stimuli and sends them to a device under test (DUT). Progress is measured by coverage monitors added to the simulation environment, so that non-exercised functionality can be identified. Moreover, additional scoreboards indicate undesired DUT behaviour. Such verification environments were developed for three recent ASIC and FPGA projects which have successfully implemented the new work-flow: (1) the CLICpix2 65 nm CMOS hybrid pixel readout ASIC design; (2) the C3PD 180 nm HV-CMOS active sensor ASIC design; (3) the FPGA-based DAQ system of the CLICpix chip. This paper, based on the experience from the above projects, briefly introduces UVM and presents a set of tips and advice applicable at different stages of the verification process cycle.
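
    A language-neutral sketch of the CDV loop described above (the environments in the paper are SystemVerilog/UVM; this Python stand-in is only illustrative): constrained-random stimuli drive the DUT, a self-checking comparison plays the role of the scoreboard, and coverage bins expose non-exercised functionality.

        import random

        def dut(a, b):                    # stand-in for the device under test
            return (a + b) & 0xFF         # 8-bit adder with wrap-around

        coverage = {("carry", flag): 0 for flag in (False, True)}
        random.seed(7)
        for _ in range(1000):
            a, b = random.randrange(256), random.randrange(256)   # legal stimuli
            assert dut(a, b) == (a + b) % 256, "scoreboard mismatch"
            coverage[("carry", a + b > 255)] += 1                 # coverage monitor

        print(coverage)   # a bin left at 0 would flag non-exercised functionality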

  43. Verification of the FtCayuga fault-tolerant microprocessor system. Volume 1: A case study in theorem prover-based verification

    NASA Technical Reports Server (NTRS)

    Srivas, Mandayam; Bickford, Mark

    1991-01-01

    The design and formal verification of a hardware system for a task that is an important component of a fault-tolerant computer architecture for flight control systems is presented. The hardware system implements an algorithm for obtaining interactive consistency (Byzantine agreement) among four microprocessors as a special instruction on the processors. The property verified ensures that an execution of the special instruction by the processors correctly accomplishes interactive consistency, provided certain preconditions hold. An assumption is made that the processors execute synchronously. For verification, the authors used a computer-aided hardware design verification tool, Spectool, and the theorem prover Clio. A major contribution of the work is the demonstration of a significant fault-tolerant hardware design that is mechanically verified by a theorem prover.
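
    A minimal functional sketch (assumed; the paper verifies a synchronous hardware implementation, not this code) of one interactive-consistency round for four processors tolerating one Byzantine fault (the OM(1) scheme): each receiver relays the source's value, then majority-votes.

        from collections import Counter

        def om1(source_value, relay_faults):
            """relay_faults maps a lieutenant to the value it lies with (if any)."""
            lieutenants = [0, 1, 2]
            received = {l: source_value for l in lieutenants}      # round 1
            decided = {}
            for l in lieutenants:                                  # round 2: relays
                votes = [received[l]]
                for other in lieutenants:
                    if other != l:
                        lie = relay_faults.get(other)
                        votes.append(lie if lie is not None else received[other])
                decided[l] = Counter(votes).most_common(1)[0][0]   # majority vote
            return decided

        # Lieutenant 2 relays a lie; the honest processors still agree on 1
        print(om1(source_value=1, relay_faults={2: 0}))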

  44. Design for Verification: Enabling Verification of High Dependability Software-Intensive Systems

    NASA Technical Reports Server (NTRS)

    Mehlitz, Peter C.; Penix, John; Markosian, Lawrence Z.; Koga, Dennis (Technical Monitor)

    2003-01-01

    Strategies to achieve confidence that high-dependability applications are correctly implemented include testing and automated verification. Testing deals mainly with a limited number of expected execution paths. Verification usually attempts to deal with a larger number of possible execution paths. While the impact of architecture design on testing is well known, its impact on most verification methods is not as well understood. The Design for Verification approach considers verification from the application development perspective, in which system architecture is designed explicitly according to the application's key properties. The D4V-hypothesis is that the same general architecture and design principles that lead to good modularity, extensibility and complexity/functionality ratio can be adapted to overcome some of the constraints on verification tools, such as the production of hand-crafted models and the limits on dynamic and static analysis caused by state space explosion.

  7. Self-verification motives at the collective level of self-definition.

    PubMed

    Chen, Serena; Chen, Karen Y; Shaw, Lindsay

    2004-01-01

    Three studies examined self-verification motives in relation to collective aspects of the self. Several moderators of collective self-verification were also examined--namely, the certainty with which collective self-views are held, the nature of one's ties to a source of self-verification, the salience of the collective self, and the importance of group identification. Evidence for collective self-verification emerged across all studies, particularly when collective self-views were held with high certainty (Studies 1 and 2), perceivers were somehow tied to the source of self-verification (Study 1), the collective self was salient (Study 2), and group identification was important (Study 3). To the authors' knowledge, these studies are the first to examine self-verification at the collective level of self-definition. The parallel and distinct ways in which self-verification processes may operate at different levels of self-definition are discussed.

  8. Working Memory Mechanism in Proportional Quantifier Verification

    ERIC Educational Resources Information Center

    Zajenkowski, Marcin; Szymanik, Jakub; Garraffa, Maria

    2014-01-01

    The paper explores the cognitive mechanisms involved in the verification of sentences with proportional quantifiers (e.g. "More than half of the dots are blue"). The first study shows that the verification of proportional sentences is more demanding than the verification of sentences such as: "There are seven blue and eight yellow…

  9. An Efficient Location Verification Scheme for Static Wireless Sensor Networks.

    PubMed

    Kim, In-Hwan; Kim, Bo-Sung; Song, JooSeok

    2017-01-24

    In wireless sensor networks (WSNs), the accuracy of location information is vital to support many interesting applications. Unfortunately, sensors have difficulty in estimating their location when malicious sensors attack the location estimation process. Even though secure localization schemes have been proposed to protect the location estimation process from attacks, they are not enough to eliminate wrong location estimates in some situations. Location verification can be a solution in these situations, or serve as a second line of defense. The problem with most location verification schemes is the explicit involvement of many sensors in the verification process, together with requirements such as special hardware, a dedicated verifier and a trusted third party, all of which cause more communication and computation overhead. In this paper, we propose an efficient location verification scheme for static WSNs called mutually-shared region-based location verification (MSRLV), which reduces those overheads by utilizing the implicit involvement of sensors and eliminating several requirements. In order to achieve this, we use the mutually-shared region between the location claimant and the verifier for the location verification. The analysis shows that MSRLV reduces communication overhead by 77% and computation overhead by 92% on average, when compared with the other location verification schemes, in a single sensor verification. In addition, simulation results for the verification of the whole network show that MSRLV can detect over 90% of malicious sensors when sensors in the network have five or more neighbors.
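
    The MSRLV protocol details are not reproduced here, but the geometric idea of a mutually-shared region can be sketched as follows: a location claim is only plausible if it falls inside the intersection of the radio ranges of every node that directly heard the claim. The radius, coordinates, and node roles below are invented.

```python
import math

# Illustrative geometry only: the MSRLV paper's actual protocol is not
# reproduced. This sketch checks whether a claimed position is consistent
# with the mutually-shared region of two sensors' radio ranges.

RADIO_RANGE = 100.0  # assumed communication radius in meters

def dist(p, q):
    return math.hypot(p[0] - q[0], p[1] - q[1])

def in_shared_region(claimed_pos, verifier_pos, witness_pos, r=RADIO_RANGE):
    """If both nodes heard the claimant directly, the claimed position must
    lie inside the intersection of their ranges."""
    return dist(claimed_pos, verifier_pos) <= r and dist(claimed_pos, witness_pos) <= r

verifier, witness = (0.0, 0.0), (120.0, 0.0)
honest_claim = (60.0, 10.0)   # inside both ranges
false_claim = (-80.0, 60.0)   # reachable by the verifier only

print(in_shared_region(honest_claim, verifier, witness))  # True
print(in_shared_region(false_claim, verifier, witness))   # False -> flag claim
```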

  10. An Efficient Location Verification Scheme for Static Wireless Sensor Networks

    PubMed Central

    Kim, In-hwan; Kim, Bo-sung; Song, JooSeok

    2017-01-01

    In wireless sensor networks (WSNs), the accuracy of location information is vital to support many interesting applications. Unfortunately, sensors have difficulty in estimating their location when malicious sensors attack the location estimation process. Even though secure localization schemes have been proposed to protect the location estimation process from attacks, they are not enough to eliminate wrong location estimates in some situations. Location verification can be a solution in these situations, or serve as a second line of defense. The problem with most location verification schemes is the explicit involvement of many sensors in the verification process, together with requirements such as special hardware, a dedicated verifier and a trusted third party, all of which cause more communication and computation overhead. In this paper, we propose an efficient location verification scheme for static WSNs called mutually-shared region-based location verification (MSRLV), which reduces those overheads by utilizing the implicit involvement of sensors and eliminating several requirements. In order to achieve this, we use the mutually-shared region between the location claimant and the verifier for the location verification. The analysis shows that MSRLV reduces communication overhead by 77% and computation overhead by 92% on average, when compared with the other location verification schemes, in a single sensor verification. In addition, simulation results for the verification of the whole network show that MSRLV can detect over 90% of malicious sensors when sensors in the network have five or more neighbors. PMID:28125007

  11. Synesthesia affects verification of simple arithmetic equations.

    PubMed

    Ghirardelli, Thomas G; Mills, Carol Bergfeld; Zilioli, Monica K C; Bailey, Leah P; Kretschmar, Paige K

    2010-01-01

    To investigate the effects of color-digit synesthesia on numerical representation, we presented a synesthete (referred to here as SE) and controls with mathematical equations for verification. In Experiment 1, SE verified addition equations made up of digits that either matched or mismatched her color-digit photisms or were in black. In Experiment 2A, the addends were presented in the different color conditions and the solution was presented in black, whereas in Experiment 2B the addends were presented in black and the solutions were presented in the different color conditions. In Experiment 3, multiplication and division equations were presented in the same color conditions as in Experiment 1. SE responded significantly faster to equations that matched her photisms than to those that did not; controls did not show this effect. These results suggest that photisms influence the processing of digits in arithmetic verification, replicating and extending previous findings.

  12. VEG-01: Veggie Hardware Verification Testing

    NASA Technical Reports Server (NTRS)

    Massa, Gioia; Newsham, Gary; Hummerick, Mary; Morrow, Robert; Wheeler, Raymond

    2013-01-01

    The Veggie plant/vegetable production system is scheduled to fly on ISS at the end of 2013. Since much of the technology associated with Veggie has not been previously tested in microgravity, a hardware validation flight was initiated. This test will allow data to be collected about Veggie hardware functionality on ISS, allow crew interactions to be vetted for future improvements, validate the ability of the hardware to grow and sustain plants, and collect data that will be helpful to future Veggie investigators as they develop their payloads. Additionally, food safety data on the lettuce plants grown will be collected to help support the development of a pathway for the crew to safely consume produce grown on orbit. Significant background research has been performed on the Veggie plant growth system, with early tests focusing on the development of the rooting pillow concept, and the selection of fertilizer, rooting medium and plant species. More recent testing has been conducted to integrate the pillow concept into the Veggie hardware and to ensure that adequate water is provided throughout the growth cycle. Seed sanitation protocols have been established for flight, and hardware sanitation between experiments has been studied. Methods for shipping and storage of rooting pillows and the development of crew procedures and crew training videos for plant activities on-orbit have been established. Science verification testing was conducted and lettuce plants were successfully grown in prototype Veggie hardware, microbial samples were taken, plants were harvested, frozen, stored and later analyzed for microbial growth, nutrients, and ATP levels. An additional verification test, prior to the final payload verification testing, is desired to demonstrate similar growth in the flight hardware and also to test a second set of pillows containing zinnia seeds. Issues with root mat water supply are being resolved, with final testing and flight scheduled for later in 2013.

  13. Verification of chemistry reference ranges using a simple method in sub-Saharan Africa

    PubMed Central

    Taylor, Douglas; Mandala, Justin; Nanda, Kavita; Van Campenhout, Christel; Agingu, Walter; Madurai, Lorna; Barsch, Eva-Maria; Deese, Jennifer; Van Damme, Lut; Crucitti, Tania

    2016-01-01

    Background: Chemistry safety assessments are interpreted by using chemistry reference ranges (CRRs). Verification of CRRs is time-consuming and often requires a statistical background. Objectives: We report on an easy and cost-saving method to verify CRRs. Methods: Using a former method introduced by Sigma Diagnostics, three study sites in sub-Saharan Africa (Bondo, Kenya; and Pretoria and Bloemfontein, South Africa) verified the CRRs for hepatic and renal biochemistry assays performed during a clinical trial of HIV antiretroviral pre-exposure prophylaxis. The aspartate aminotransferase/alanine aminotransferase, creatinine and phosphorus results from 10 clinically healthy participants at the screening visit were used. In the event the CRRs did not pass the verification, new CRRs had to be calculated based on 40 clinically healthy participants. Results: Within a few weeks, the study sites accomplished verification of the CRRs without additional costs. The aspartate aminotransferase reference ranges for the Bondo, Kenya site and the alanine aminotransferase reference ranges for the Pretoria, South Africa site required adjustment. The phosphorus CRR passed verification, and the creatinine CRR required adjustment at every site. The newly established CRR intervals were narrower than the CRRs used previously at these study sites, due to decreases in the upper limits of the reference ranges. As a result, more toxicities were detected. Conclusion: To ensure the safety of clinical trial participants, verification of CRRs should be standard practice in clinical trials conducted in settings where the CRR has not been validated for the local population. This verification method is simple, inexpensive, and can be performed by any medical laboratory. PMID:28879112
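
    The abstract does not spell out the Sigma Diagnostics acceptance rule, so the sketch below uses a generic rule in the spirit of CLSI-style reference-interval verification: accept the CRR if at most a fixed number of healthy-participant results fall outside it, otherwise recalculate from a larger cohort. All values and the `max_outside` cutoff are assumptions.

```python
# A minimal sketch of reference-range verification under an assumed
# CLSI-style acceptance rule; the paper's exact Sigma Diagnostics procedure
# is not reproduced, and all values below are invented.

def verify_reference_range(results, low, high, max_outside=1):
    """Accept the candidate reference range if at most `max_outside` of the
    healthy-participant results fall outside [low, high]."""
    outside = [x for x in results if not (low <= x <= high)]
    return len(outside) <= max_outside, outside

# Hypothetical ALT results (U/L) from 10 clinically healthy participants
alt_results = [18, 22, 25, 31, 17, 28, 40, 21, 26, 54]

ok, outliers = verify_reference_range(alt_results, low=10, high=45)
if ok:
    print("CRR verified for local use")
else:
    print(f"CRR failed verification ({outliers} outside); "
          "recalculate from a larger healthy cohort (e.g., n = 40)")
```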

  14. Verification of chemistry reference ranges using a simple method in sub-Saharan Africa.

    PubMed

    De Baetselier, Irith; Taylor, Douglas; Mandala, Justin; Nanda, Kavita; Van Campenhout, Christel; Agingu, Walter; Madurai, Lorna; Barsch, Eva-Maria; Deese, Jennifer; Van Damme, Lut; Crucitti, Tania

    2016-01-01

    Chemistry safety assessments are interpreted by using chemistry reference ranges (CRRs). Verification of CRRs is time-consuming and often requires a statistical background. We report on an easy and cost-saving method to verify CRRs. Using a former method introduced by Sigma Diagnostics, three study sites in sub-Saharan Africa (Bondo, Kenya; and Pretoria and Bloemfontein, South Africa) verified the CRRs for hepatic and renal biochemistry assays performed during a clinical trial of HIV antiretroviral pre-exposure prophylaxis. The aspartate aminotransferase/alanine aminotransferase, creatinine and phosphorus results from 10 clinically healthy participants at the screening visit were used. In the event the CRRs did not pass the verification, new CRRs had to be calculated based on 40 clinically healthy participants. Within a few weeks, the study sites accomplished verification of the CRRs without additional costs. The aspartate aminotransferase reference ranges for the Bondo, Kenya site and the alanine aminotransferase reference ranges for the Pretoria, South Africa site required adjustment. The phosphorus CRR passed verification, and the creatinine CRR required adjustment at every site. The newly established CRR intervals were narrower than the CRRs used previously at these study sites, due to decreases in the upper limits of the reference ranges. As a result, more toxicities were detected. To ensure the safety of clinical trial participants, verification of CRRs should be standard practice in clinical trials conducted in settings where the CRR has not been validated for the local population. This verification method is simple, inexpensive, and can be performed by any medical laboratory.

  15. The Maximal Oxygen Uptake Verification Phase: a Light at the End of the Tunnel?

    PubMed

    Schaun, Gustavo Z

    2017-12-08

    Commonly performed during an incremental test to exhaustion, maximal oxygen uptake (V̇O2max) assessment has become a recurring practice in clinical and experimental settings. To validate the test, several criteria have been proposed. In this context, the plateau in oxygen uptake (V̇O2) is inconsistent in its frequency, reducing its usefulness as a robust method to determine "true" V̇O2max. Moreover, secondary criteria previously suggested, such as expiratory exchange ratios or percentages of maximal heart rate, are highly dependent on protocol design and often are achieved at V̇O2 percentages well below V̇O2max. Thus, an alternative method termed the verification phase was proposed. Currently, it is clear that the verification phase can be a practical and sensitive method to confirm V̇O2max; however, procedures to conduct it are not standardized across the literature and no previous research has tried to summarize how it has been employed. Therefore, this review updates the knowledge on the verification phase and provides suggestions on how it can be performed (e.g. intensity, duration, recovery) according to population and protocol design. Future studies should focus on identifying a verification protocol feasible for different populations and on comparing square-wave and multistage verification phases. Additionally, studies assessing verification phases in different patient populations are still warranted.

  16. Self-verification and contextualized self-views.

    PubMed

    Chen, Serena; English, Tammy; Peng, Kaiping

    2006-07-01

    Whereas most self-verification research has focused on people's desire to verify their global self-conceptions, the present studies examined self-verification with regard to contextualized self-views, that is, views of the self in particular situations and relationships. It was hypothesized that individuals whose core self-conceptions include contextualized self-views should seek to verify these self-views. In Study 1, the more individuals defined the self in dialectical terms, the more their judgments were biased in favor of verifying over nonverifying feedback about a negative, situation-specific self-view. In Study 2, consistent with research on gender differences in the importance of relationships to the self-concept, women but not men showed a similar bias toward feedback about a negative, relationship-specific self-view, a pattern not seen for global self-views. Together, the results support the notion that self-verification occurs for core self-conceptions, whatever form(s) they may take. Individual differences in self-verification and the nature of selfhood and authenticity are discussed.

  17. Signature Verification Based on Handwritten Text Recognition

    NASA Astrophysics Data System (ADS)

    Viriri, Serestina; Tapamo, Jules-R.

    Signatures continue to be an important biometric trait because they remain widely used for authenticating the identity of human beings. This paper presents an efficient text-based directional signature recognition algorithm which verifies signatures even when they are composed of special unconstrained cursive characters which are superimposed and embellished. This algorithm extends the character-based signature verification technique. The experiments carried out on the GPDS signature database and an additional database created from signatures captured using the ePadInk tablet show that the approach is effective and efficient, with a positive verification rate of 94.95%.

  18. Verification of VLSI designs

    NASA Technical Reports Server (NTRS)

    Windley, P. J.

    1991-01-01

    In this paper we explore the specification and verification of VLSI designs. The paper focuses on abstract specification and verification of functionality using mathematical logic, as opposed to low-level Boolean equivalence verification such as that done using BDDs and model checking. Specification and verification, sometimes called formal methods, are one tool for increasing computer dependability in the face of an exponentially increasing testing effort.

  19. Verification and classification bias interactions in diagnostic test accuracy studies for fine-needle aspiration biopsy.

    PubMed

    Schmidt, Robert L; Walker, Brandon S; Cohen, Michael B

    2015-03-01

    Reliable estimates of accuracy are important for any diagnostic test. Diagnostic accuracy studies are subject to unique sources of bias. Verification bias and classification bias are 2 sources of bias that commonly occur in diagnostic accuracy studies. Statistical methods are available to estimate the impact of these sources of bias when they occur alone. The impact of interactions when these types of bias occur together has not been investigated. We developed mathematical relationships to show the combined effect of verification bias and classification bias. A wide range of case scenarios was generated to assess the impact of bias components and interactions on total bias. Interactions between verification bias and classification bias caused overestimation of sensitivity and underestimation of specificity. Interactions had more effect on sensitivity than on specificity. Sensitivity was overestimated by at least 7% in approximately 6% of the tested scenarios. Specificity was underestimated by at least 7% in less than 0.1% of the scenarios. Interactions between verification bias and classification bias create distortions in accuracy estimates that are greater than would be predicted from each source of bias acting independently. © 2014 American Cancer Society.
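
    The paper's closed-form relationships are not reproduced in the abstract, but the direction of the interaction can be illustrated with a small Monte Carlo sketch: test-positive cases are verified more often than test-negatives (verification bias) while the reference standard occasionally misclassifies (classification bias), inflating the naive sensitivity estimate. All rates below are invented.

```python
import random

# Small Monte Carlo sketch (invented parameters) of how partial verification
# plus an imperfect reference standard distorts sensitivity estimates; it
# illustrates, not reproduces, the paper's mathematical relationships.

random.seed(1)
N = 200_000
prevalence, true_sens, true_spec = 0.3, 0.85, 0.9
p_verify_pos, p_verify_neg = 0.9, 0.3   # verification bias: positives verified more
ref_error = 0.05                        # classification bias: reference errs 5%

tp = fn = 0
for _ in range(N):
    diseased = random.random() < prevalence
    test_pos = random.random() < (true_sens if diseased else 1 - true_spec)
    # Only verified cases reach the reference standard.
    if random.random() >= (p_verify_pos if test_pos else p_verify_neg):
        continue
    ref_says_diseased = diseased ^ (random.random() < ref_error)
    if ref_says_diseased:
        tp += test_pos
        fn += not test_pos

print(f"true sensitivity: {true_sens:.2f}, biased estimate: {tp / (tp + fn):.2f}")
```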

  20. The SeaHorn Verification Framework

    NASA Technical Reports Server (NTRS)

    Gurfinkel, Arie; Kahsai, Temesghen; Komuravelli, Anvesh; Navas, Jorge A.

    2015-01-01

    In this paper, we present SeaHorn, a software verification framework. The key distinguishing feature of SeaHorn is its modular design that separates the concerns of the syntax of the programming language, its operational semantics, and the verification semantics. SeaHorn encompasses several novelties: it (a) encodes verification conditions using an efficient yet precise inter-procedural technique, (b) provides flexibility in the verification semantics to allow different levels of precision, (c) leverages the state-of-the-art in software model checking and abstract interpretation for verification, and (d) uses Horn-clauses as an intermediate language to represent verification conditions which simplifies interfacing with multiple verification tools based on Horn-clauses. SeaHorn provides users with a powerful verification tool and researchers with an extensible and customizable framework for experimenting with new software verification techniques. The effectiveness and scalability of SeaHorn are demonstrated by an extensive experimental evaluation using benchmarks from SV-COMP 2015 and real avionics code.
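
    SeaHorn's own pipeline (an LLVM front end and Horn-clause back ends) is not reproduced here; as a toy stand-in, the sketch below phrases the three classic verification conditions for a one-variable loop in Horn-clause style and discharges them with the Z3 SMT solver (assuming the `z3-solver` Python package is installed).

```python
# Toy illustration (not SeaHorn itself): verification conditions for
#   x = 0; while (*) x = x + 1; assert x >= 0;
# phrased Horn-clause style with candidate invariant inv(x) := x >= 0,
# and discharged with the Z3 SMT solver.
from z3 import Int, Implies, And, Not, Solver, unsat

x, x1 = Int("x"), Int("x'")
inv = lambda v: v >= 0  # candidate inductive invariant

vcs = [
    Implies(x == 0, inv(x)),                      # initiation
    Implies(And(inv(x), x1 == x + 1), inv(x1)),   # consecution
    Implies(inv(x), x >= 0),                      # safety: invariant => assert
]

for vc in vcs:
    s = Solver()
    s.add(Not(vc))                  # a VC is valid iff its negation is unsat
    assert s.check() == unsat, f"VC failed: {vc}"
print("all verification conditions discharged")
```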

  1. Industrial methodology for process verification in research (IMPROVER): toward systems biology verification

    PubMed Central

    Meyer, Pablo; Hoeng, Julia; Rice, J. Jeremy; Norel, Raquel; Sprengel, Jörg; Stolle, Katrin; Bonk, Thomas; Corthesy, Stephanie; Royyuru, Ajay; Peitsch, Manuel C.; Stolovitzky, Gustavo

    2012-01-01

    Motivation: Analyses and algorithmic predictions based on high-throughput data are essential for the success of systems biology in academic and industrial settings. Organizations, such as companies and academic consortia, conduct large multi-year scientific studies that entail the collection and analysis of thousands of individual experiments, often over many physical sites and with internal and outsourced components. To extract maximum value, the interested parties need to verify the accuracy and reproducibility of data and methods before the initiation of such large multi-year studies. However, systematic and well-established verification procedures do not exist for automated collection and analysis workflows in systems biology, which could lead to inaccurate conclusions. Results: We present here a review of the current state of systems biology verification and a detailed methodology to address its shortcomings. This methodology, named 'Industrial Methodology for Process Verification in Research' or IMPROVER, consists of evaluating a research program by dividing a workflow into smaller building blocks that are individually verified. The verification of each building block can be done internally by members of the research program or externally by 'crowd-sourcing' to an interested community (www.sbvimprover.com). Implementation: This methodology could become the preferred choice for verifying systems biology research workflows that are becoming increasingly complex and sophisticated in industrial and academic settings. Contact: gustavo@us.ibm.com PMID:22423044

  2. Physics Verification Overview

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Doebling, Scott William

    The purpose of the verification project is to establish, through rigorous convergence analysis, that each ASC computational physics code correctly implements a set of physics models and algorithms (code verification); to evaluate and analyze the uncertainties of code outputs associated with the choice of temporal and spatial discretization (solution or calculation verification); and to develop and maintain the capability to expand and update these analyses on demand. This presentation describes project milestones.

  3. EMC: Verification

    Science.gov Websites

    Real-time verification of NCEP operational models (NAM, GFS, RAP, HRRR, HIRESW, SREF mean, NAM CONUS nest, and international global models) against observations and HPC analyses, including precipitation skill scores from 1995 to the present.

  4. Requirement Assurance: A Verification Process

    NASA Technical Reports Server (NTRS)

    Alexander, Michael G.

    2011-01-01

    Requirement Assurance is an act of requirement verification which assures the stakeholder or customer that a product requirement has produced its "as realized product" and has been verified with conclusive evidence. Product requirement verification answers the question, "Did the product meet the stated specification, performance, or design documentation?" In order to ensure the system was built correctly, the practicing system engineer must verify each product requirement using the verification methods of inspection, analysis, demonstration, or test. The products of these methods are the "verification artifacts" or "closure artifacts" which are the objective evidence needed to prove the product requirements meet the verification success criteria. Institutional direction is given to the system engineer in NPR 7123.1A, NASA Systems Engineering Processes and Requirements, with regard to the requirement verification process. In response, the verification methodology offered in this report meets both the institutional process and requirement verification best practices.

  5. Integrated guidance, navigation and control verification plan primary flight system. [space shuttle avionics integration

    NASA Technical Reports Server (NTRS)

    1978-01-01

    The verification process and requirements for the ascent guidance interfaces and the ascent integrated guidance, navigation and control system for the space shuttle orbiter are defined as well as portions of supporting systems which directly interface with the system. The ascent phase of verification covers the normal and ATO ascent through the final OMS-2 circularization burn (all of OPS-1), the AOA ascent through the OMS-1 burn, and the RTLS ascent through ET separation (all of MM 601). In addition, OPS translation verification is defined. Verification trees and roadmaps are given.

  6. Voltage verification unit

    DOEpatents

    Martin, Edward J [Virginia Beach, VA

    2008-01-15

    A voltage verification unit and method for determining the absence of potentially dangerous potentials within a power supply enclosure without Mode 2 work is disclosed. With this device and method, a qualified worker follows a relatively simple protocol that involves a function test (hot, cold, hot) of the voltage verification unit before Lock Out/Tag Out; once the Lock Out/Tag Out is completed, testing or "trying" can be accomplished by simply reading a display on the voltage verification unit, without exposure of the operator to the interior of the voltage supply enclosure. According to a preferred embodiment, the voltage verification unit includes test leads to allow diagnostics with other meters, without the necessity of accessing potentially dangerous bus bars or the like.

  7. ICAN/PART: Particulate composite analyzer, user's manual and verification studies

    NASA Technical Reports Server (NTRS)

    Goldberg, Robert K.; Murthy, Pappu L. N.; Mital, Subodh K.

    1996-01-01

    A methodology for predicting the equivalent properties and constituent microstresses for particulate matrix composites, based on the micromechanics approach, is developed. These equations are integrated into a computer code developed to predict the equivalent properties and microstresses of fiber reinforced polymer matrix composites to form a new computer code, ICAN/PART. Details of the flowchart, input and output for ICAN/PART are described, along with examples of the input and output. Only the differences between ICAN/PART and the original ICAN code are described in detail, and the user is assumed to be familiar with the structure and usage of the original ICAN code. Detailed verification studies, utilizing finite element and boundary element analyses, are conducted in order to verify that the micromechanics methodology accurately models the mechanics of particulate matrix composites. The equivalent properties computed by ICAN/PART fall within bounds established by the finite element and boundary element results. Furthermore, constituent microstresses computed by ICAN/PART agree in an average sense with results computed using the finite element method. The verification studies indicate that the micromechanics programmed into ICAN/PART do indeed accurately model the mechanics of particulate matrix composites.
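
    ICAN/PART's actual micromechanics equations are not given in the abstract; the sketch below instead shows the classical Voigt and Reuss bounds, which allow a bounds check of the kind described above: a predicted equivalent property should land between a lower and an upper estimate. All material values are invented.

```python
# Not ICAN/PART's actual equations: a minimal micromechanics sketch using the
# classical Voigt (iso-strain) and Reuss (iso-stress) bounds, between which a
# particulate composite's equivalent modulus should fall. Values are invented.

def voigt_modulus(e_matrix, e_particle, vf):
    """Upper bound: rule of mixtures under the iso-strain assumption."""
    return (1 - vf) * e_matrix + vf * e_particle

def reuss_modulus(e_matrix, e_particle, vf):
    """Lower bound: inverse rule of mixtures under the iso-stress assumption."""
    return 1.0 / ((1 - vf) / e_matrix + vf / e_particle)

e_m, e_p, vf = 3.5, 70.0, 0.3   # matrix GPa, particle GPa, volume fraction
lower, upper = reuss_modulus(e_m, e_p, vf), voigt_modulus(e_m, e_p, vf)
print(f"equivalent modulus bounds: {lower:.2f} .. {upper:.2f} GPa")

# A verification check in the spirit of the study: a predicted equivalent
# property should fall within bounds established by detailed FE/BE analyses.
predicted = 5.1
assert lower <= predicted <= upper, "prediction outside micromechanics bounds"
```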

  8. KSOS Secure Unix Verification Plan (Kernelized Secure Operating System).

    DTIC Science & Technology

    1980-12-01

    shall be handled as proprietary information until 5 April 1978. After that time, the Government may distribute the document as it sees fit. UNIX and PWB... WDL-TR7809, KSOS VERIFICATION PLAN, SECTION I, INTRODUCTION: "The purpose... funding, additional tools may be available by the time they are needed for FSOS verification. We intend to use the best available technology in

  9. The 2014 Sandia Verification and Validation Challenge: Problem statement

    DOE PAGES

    Hu, Kenneth; Orient, George

    2016-01-18

    This paper presents a case study in utilizing information from experiments, models, and verification and validation (V&V) to support a decision. It consists of a simple system with data and models provided, plus a safety requirement to assess. The goal is to pose a problem that is flexible enough to allow challengers to demonstrate a variety of approaches, but constrained enough to focus attention on a theme. This was accomplished by providing a good deal of background information in addition to the data, models, and code, but directing the participants' activities with specific deliverables. In this challenge, the theme is how to gather and present evidence about the quality of model predictions, in order to support a decision. This case study formed the basis of the 2014 Sandia V&V Challenge Workshop and this resulting special edition of the ASME Journal of Verification, Validation, and Uncertainty Quantification.

  10. Full-chip level MEEF analysis using model based lithography verification

    NASA Astrophysics Data System (ADS)

    Kim, Juhwan; Wang, Lantian; Zhang, Daniel; Tang, Zongwu

    2005-11-01

    MEEF (Mask Error Enhancement Factor) has become a critical factor in CD uniformity control since the optical lithography process moved into the sub-resolution era. Many studies have quantified the impact of mask CD (Critical Dimension) errors on wafer CD errors [1-2]. However, the benefits from those studies were restricted to small pattern areas of the full-chip data due to long simulation times. As fast turnaround time can be achieved for complicated verifications on very large data sets by linearly scalable distributed processing technology, model-based lithography verification becomes feasible for various types of applications, such as post-mask-synthesis data sign-off for mask tape-out in production and lithography process development with full-chip data [3,4,5]. In this study, we introduce two useful methodologies for full-chip level verification of mask error impact on the wafer lithography patterning process. One methodology is to check the MEEF distribution, in addition to the CD distribution, through the process window, which can be used for RET/OPC optimization at the R&D stage. The other is to check mask error sensitivity on potential pinch and bridge hotspots through lithography process variation, where the outputs can be passed on to mask CD metrology to add CD measurements at those hotspot locations. Two different OPC data sets were compared using the two methodologies in this study.
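
    MEEF is commonly defined as the derivative of wafer CD with respect to mask CD (with the mask scaled to wafer dimensions), so a verification flow can estimate it by perturbing the mask CD and re-simulating. The sketch below does this with central differences against a stand-in printing model; the model and numbers are invented.

```python
import math

# Minimal sketch of a MEEF computation by finite differences: MEEF is the
# ratio of the induced wafer CD change to a small mask CD perturbation.
# The printing model below is a stand-in, not a lithography simulator.

def wafer_cd(mask_cd_nm):
    """Hypothetical nonlinear printing model returning wafer CD in nm."""
    return 0.8 * mask_cd_nm + 12.0 * math.tanh((mask_cd_nm - 90.0) / 15.0)

def meef(mask_cd_nm, delta=0.5):
    """Central-difference MEEF at a given mask CD (mask at wafer scale)."""
    return (wafer_cd(mask_cd_nm + delta) - wafer_cd(mask_cd_nm - delta)) / (2 * delta)

for cd in (80.0, 90.0, 100.0):
    print(f"mask CD {cd:5.1f} nm -> wafer CD {wafer_cd(cd):6.1f} nm, MEEF {meef(cd):4.2f}")
```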

  11. Verification of Functional Fault Models and the Use of Resource Efficient Verification Tools

    NASA Technical Reports Server (NTRS)

    Bis, Rachael; Maul, William A.

    2015-01-01

    Functional fault models (FFMs) are a directed graph representation of the failure effect propagation paths within a system's physical architecture and are used to support development and real-time diagnostics of complex systems. Verification of these models is required to confirm that the FFMs are correctly built and accurately represent the underlying physical system. However, a manual, comprehensive verification process applied to the FFMs was found to be error prone, owing to the intensive and customized process necessary to verify each individual component model, and to require a burdensome level of resources. To address this problem, automated verification tools have been developed and utilized to mitigate these key pitfalls. This paper discusses the verification of the FFMs and presents the tools that were developed to make the verification process more efficient and effective.

  12. HDL to verification logic translator

    NASA Technical Reports Server (NTRS)

    Gambles, J. W.; Windley, P. J.

    1992-01-01

    The increasingly higher number of transistors possible in VLSI circuits compounds the difficulty of ensuring correct designs. As the number of possible test cases required to exhaustively simulate a circuit design explodes, a better method is required to confirm the absence of design faults. Formal verification methods provide a way to prove, using logic, that a circuit structure correctly implements its specification. Before verification is accepted by VLSI design engineers, the stand-alone verification tools that are in use in the research community must be integrated with the CAD tools used by the designers. One problem facing the acceptance of formal verification into circuit design methodology is that the structural circuit descriptions used by the designers are not appropriate for verification work, and those required for verification lack some of the features needed for design. We offer a solution to this dilemma: an automatic translation from the designers' HDL models into definitions for the higher-order logic (HOL) verification system. The translated definitions become the low-level basis of circuit verification, which in turn increases the designer's confidence in the correctness of higher-level behavioral models.
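
    As a toy illustration of such a translation, the sketch below turns a small gate-level netlist into a relational, conjunction-of-predicates definition of the kind used in HOL-style hardware verification. The netlist format, relation names, and output syntax are all invented for the example.

```python
# Toy illustration of translating a gate-level netlist into logic
# definitions, in the spirit of an HDL-to-HOL translator; the dictionary
# format and relational encoding here are invented for the example.

netlist = {
    "inputs": ["a", "b"],
    "outputs": ["sum", "carry"],
    "gates": [("xor", ["a", "b"], "sum"),      # half adder
              ("and", ["a", "b"], "carry")],
}

GATE_RELATIONS = {"and": "AND", "or": "OR", "xor": "XOR", "not": "NOT"}

def to_logic(net):
    """Emit a relational (conjunction-of-gate-predicates) definition, the
    style commonly used for hardware verification in HOL-like systems."""
    sig = ", ".join(net["inputs"] + net["outputs"])
    body = " /\\ ".join(
        f"{GATE_RELATIONS[kind]}({', '.join(ins)}, {out})"
        for kind, ins, out in net["gates"])
    return f"HALF_ADDER({sig}) := {body}"

print(to_logic(netlist))
# HALF_ADDER(a, b, sum, carry) := XOR(a, b, sum) /\ AND(a, b, carry)
```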

  13. PET/CT imaging for treatment verification after proton therapy: A study with plastic phantoms and metallic implants

    PubMed Central

    Parodi, Katia; Paganetti, Harald; Cascio, Ethan; Flanz, Jacob B.; Bonab, Ali A.; Alpert, Nathaniel M.; Lohmann, Kevin; Bortfeld, Thomas

    2008-01-01

    The feasibility of off-line positron emission tomography/computed tomography (PET/CT) for routine three-dimensional in-vivo treatment verification of proton radiation therapy is currently under investigation at Massachusetts General Hospital in Boston. In preparation for clinical trials, phantom experiments were carried out to investigate the sensitivity and accuracy of the method depending on irradiation and imaging parameters. Furthermore, they addressed the feasibility of PET/CT as a robust verification tool in the presence of metallic implants. These produce x-ray CT artifacts and fluence perturbations which may compromise the accuracy of treatment planning algorithms. Spread-out Bragg peak proton fields were delivered to different phantoms consisting of polymethylmethacrylate (PMMA), PMMA stacked with lung and bone equivalent materials, and PMMA with titanium rods to mimic implants in patients. PET data were acquired in list mode starting within 20 min after irradiation at a commercial lutetium-oxyorthosilicate (LSO)-based PET/CT scanner. The amount and spatial distribution of the measured activity could be well reproduced by calculations based on the GEANT4 and FLUKA Monte Carlo codes. This phantom study supports the potential of millimeter accuracy for range monitoring and lateral field position verification even after low therapeutic dose exposures of 2 Gy, despite the delay between irradiation and imaging. It also indicates the value of PET for treatment verification in the presence of metallic implants, demonstrating a higher sensitivity to fluence perturbations in comparison to a commercial analytical treatment planning system. Finally, it addresses the suitability of LSO-based PET detectors for hadron therapy monitoring. This unconventional application of PET involves count rates which are orders of magnitude lower than in diagnostic tracer imaging, i.e., the signal of interest is comparable to the noise originating from the intrinsic radioactivity of

  14. An Approach to Biometric Verification Based on Human Body Communication in Wearable Devices.

    PubMed

    Li, Jingzhen; Liu, Yuhang; Nie, Zedong; Qin, Wenjian; Pang, Zengyao; Wang, Lei

    2017-01-10

    In this paper, an approach to biometric verification based on human body communication (HBC) is presented for wearable devices. For this purpose, the transmission gain S21 of each volunteer's forearm is measured by a vector network analyzer (VNA). Specifically, in order to determine the chosen frequency for biometric verification, 1800 groups of data are acquired from 10 volunteers in the frequency range 0.3 MHz to 1500 MHz, and each group includes 1601 sample data points. In addition, to achieve rapid verification, 30 groups of data for each volunteer are acquired at the chosen frequency, and each group contains only 21 sample data points. Furthermore, a threshold-adaptive template matching (TATM) algorithm based on weighted Euclidean distance is proposed for rapid verification in this work. The results indicate that the chosen frequency for biometric verification is from 650 MHz to 750 MHz. The false acceptance rate (FAR) and false rejection rate (FRR) based on TATM are approximately 5.79% and 6.74%, respectively. In contrast, the FAR and FRR were 4.17% and 37.5%, 3.37% and 33.33%, and 3.80% and 34.17% using K-nearest neighbor (KNN) classification, support vector machines (SVM), and the naive Bayesian method (NBM), respectively. In addition, the running time of TATM is 0.019 s, whereas the running times of KNN, SVM and NBM are 0.310 s, 0.0385 s, and 0.168 s, respectively. Therefore, TATM is suggested to be appropriate for rapid verification use in wearable devices.
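
    The paper's exact weighting and threshold-adaptation rules are not given in the abstract, so the sketch below shows one plausible reading of TATM: build a per-user template from enrollment S21 measurements, derive an adaptive threshold from enrollment variability, and accept a sample if its weighted Euclidean distance to the template stays below that threshold. All values, the unit weights, and the mean-plus-two-sigma rule are assumptions.

```python
import math

# Minimal sketch of threshold-adaptive template matching (TATM) by weighted
# Euclidean distance; the weighting scheme, threshold rule, and sample
# values here are assumptions, not the paper's published algorithm.

def weighted_euclidean(sample, template, weights):
    return math.sqrt(sum(w * (s - t) ** 2
                         for s, t, w in zip(sample, template, weights)))

def verify(sample, template, weights, threshold):
    return weighted_euclidean(sample, template, weights) <= threshold

# Enrollment: average several S21 measurements (dB) into a template and set a
# per-user threshold from enrollment variability (assumed mean + 2*sigma rule).
enroll = [[-52.1, -50.3, -49.8], [-51.7, -50.9, -49.5], [-52.4, -50.1, -50.0]]
template = [sum(col) / len(col) for col in zip(*enroll)]
dists = [weighted_euclidean(s, template, [1.0] * 3) for s in enroll]
mu = sum(dists) / len(dists)
sigma = math.sqrt(sum((d - mu) ** 2 for d in dists) / len(dists))
threshold = mu + 2 * sigma  # adaptive per-user threshold

genuine = [-52.0, -50.4, -49.7]
impostor = [-47.2, -55.8, -53.1]
print(verify(genuine, template, [1.0] * 3, threshold))   # True: accept
print(verify(impostor, template, [1.0] * 3, threshold))  # False: reject
```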

  15. ENVIRONMENTAL TECHNOLOGY VERIFICATION--TEST REPORT OF MOBILE SOURCE EMISSION CONTROL DEVICES, FLINT HILLS RESOURCES, LP, CCD15010 DIESEL FUEL FORMULATION WITH HITEC4121 ADDITIVE

    EPA Science Inventory

    The U.S. Environmental Protection Agency (EPA) has created the Environmental Technology Verification (ETV) Program to facilitate the deployment of innovative or improved environmental technologies through performance verification and dissemination of information. ETV seeks to ach...

  16. Statistical Design for Biospecimen Cohort Size in Proteomics-based Biomarker Discovery and Verification Studies

    PubMed Central

    Skates, Steven J.; Gillette, Michael A.; LaBaer, Joshua; Carr, Steven A.; Anderson, N. Leigh; Liebler, Daniel C.; Ransohoff, David; Rifai, Nader; Kondratovich, Marina; Težak, Živana; Mansfield, Elizabeth; Oberg, Ann L.; Wright, Ian; Barnes, Grady; Gail, Mitchell; Mesri, Mehdi; Kinsinger, Christopher R.; Rodriguez, Henry; Boja, Emily S.

    2014-01-01

    Protein biomarkers are needed to deepen our understanding of cancer biology and to improve our ability to diagnose, monitor and treat cancers. Important analytical and clinical hurdles must be overcome to allow the most promising protein biomarker candidates to advance into clinical validation studies. Although contemporary proteomics technologies support the measurement of large numbers of proteins in individual clinical specimens, sample throughput remains comparatively low. This problem is amplified in typical clinical proteomics research studies, which routinely suffer from a lack of proper experimental design, resulting in analysis of too few biospecimens to achieve adequate statistical power at each stage of a biomarker pipeline. To address this critical shortcoming, a joint workshop was held by the National Cancer Institute (NCI), National Heart, Lung and Blood Institute (NHLBI), and American Association for Clinical Chemistry (AACC), with participation from the U.S. Food and Drug Administration (FDA). An important output from the workshop was a statistical framework for the design of biomarker discovery and verification studies. Herein, we describe the use of quantitative clinical judgments to set statistical criteria for clinical relevance, and the development of an approach to calculate biospecimen sample size for proteomic studies in discovery and verification stages prior to clinical validation stage. This represents a first step towards building a consensus on quantitative criteria for statistical design of proteomics biomarker discovery and verification research. PMID:24063748

  17. Statistical design for biospecimen cohort size in proteomics-based biomarker discovery and verification studies.

    PubMed

    Skates, Steven J; Gillette, Michael A; LaBaer, Joshua; Carr, Steven A; Anderson, Leigh; Liebler, Daniel C; Ransohoff, David; Rifai, Nader; Kondratovich, Marina; Težak, Živana; Mansfield, Elizabeth; Oberg, Ann L; Wright, Ian; Barnes, Grady; Gail, Mitchell; Mesri, Mehdi; Kinsinger, Christopher R; Rodriguez, Henry; Boja, Emily S

    2013-12-06

    Protein biomarkers are needed to deepen our understanding of cancer biology and to improve our ability to diagnose, monitor, and treat cancers. Important analytical and clinical hurdles must be overcome to allow the most promising protein biomarker candidates to advance into clinical validation studies. Although contemporary proteomics technologies support the measurement of large numbers of proteins in individual clinical specimens, sample throughput remains comparatively low. This problem is amplified in typical clinical proteomics research studies, which routinely suffer from a lack of proper experimental design, resulting in analysis of too few biospecimens to achieve adequate statistical power at each stage of a biomarker pipeline. To address this critical shortcoming, a joint workshop was held by the National Cancer Institute (NCI), National Heart, Lung, and Blood Institute (NHLBI), and American Association for Clinical Chemistry (AACC) with participation from the U.S. Food and Drug Administration (FDA). An important output from the workshop was a statistical framework for the design of biomarker discovery and verification studies. Herein, we describe the use of quantitative clinical judgments to set statistical criteria for clinical relevance and the development of an approach to calculate biospecimen sample size for proteomic studies in discovery and verification stages prior to clinical validation stage. This represents a first step toward building a consensus on quantitative criteria for statistical design of proteomics biomarker discovery and verification research.
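
    The workshop's statistical framework itself is not reproduced in these abstracts, but the flavor of a biospecimen sample-size calculation can be illustrated with the standard two-sample normal-approximation formula below. The effect sizes, alpha, and power are invented placeholders, not the workshop's recommendations.

```python
import math
from statistics import NormalDist

# Generic two-group sample-size sketch for a verification-stage biomarker
# study (normal-approximation z-test formula); parameters are invented and
# this is not the NCI/NHLBI/AACC workshop's specific framework.

def n_per_group(effect_size, alpha=0.05, power=0.9):
    """Biospecimens per group needed to detect a standardized mean
    difference (Cohen's d) with a two-sided, two-sample z-test."""
    z_alpha = NormalDist().inv_cdf(1 - alpha / 2)
    z_beta = NormalDist().inv_cdf(power)
    return math.ceil(2 * ((z_alpha + z_beta) / effect_size) ** 2)

for d in (0.3, 0.5, 0.8):
    print(f"Cohen's d = {d}: {n_per_group(d)} biospecimens per group")
```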

  18. Modality Switching in a Property Verification Task: An ERP Study of What Happens When Candles Flicker after High Heels Click

    PubMed Central

    Collins, Jennifer; Pecher, Diane; Zeelenberg, René; Coulson, Seana

    2011-01-01

    The perceptual modalities associated with property words, such as flicker or click, have previously been demonstrated to affect subsequent property verification judgments (Pecher et al., 2003). Known as the conceptual modality switch effect, this finding supports the claim that brain systems for perception and action help subserve the representation of concepts. The present study addressed the cognitive and neural substrate of this effect by recording event-related potentials (ERPs) as participants performed a property verification task with visual or auditory properties in key trials. We found that for visual property verifications, modality switching was associated with an increased amplitude N400. For auditory verifications, switching led to a larger late positive complex. Observed ERP effects of modality switching suggest property words access perceptual brain systems. Moreover, the timing and pattern of the effects suggest perceptual systems impact the decision-making stage in the verification of auditory properties, and the semantic stage in the verification of visual properties. PMID:21713128

  19. Modality Switching in a Property Verification Task: An ERP Study of What Happens When Candles Flicker after High Heels Click.

    PubMed

    Collins, Jennifer; Pecher, Diane; Zeelenberg, René; Coulson, Seana

    2011-01-01

    The perceptual modalities associated with property words, such as flicker or click, have previously been demonstrated to affect subsequent property verification judgments (Pecher et al., 2003). Known as the conceptual modality switch effect, this finding supports the claim that brain systems for perception and action help subserve the representation of concepts. The present study addressed the cognitive and neural substrate of this effect by recording event-related potentials (ERPs) as participants performed a property verification task with visual or auditory properties in key trials. We found that for visual property verifications, modality switching was associated with an increased amplitude N400. For auditory verifications, switching led to a larger late positive complex. Observed ERP effects of modality switching suggest property words access perceptual brain systems. Moreover, the timing and pattern of the effects suggest perceptual systems impact the decision-making stage in the verification of auditory properties, and the semantic stage in the verification of visual properties.

  20. Exploring implementation practices in results-based financing: the case of the verification in Benin.

    PubMed

    Antony, Matthieu; Bertone, Maria Paola; Barthes, Olivier

    2017-03-14

    Results-based financing (RBF) has been introduced in many countries across Africa and a growing literature is building around the assessment of its impact. These studies are usually quantitative and often silent on the paths and processes through which results are achieved and on the wider health system effects of RBF. To address this gap, our study aims at exploring the implementation of an RBF pilot in Benin, focusing on the verification of results. The study is based on action research carried out by authors involved in the pilot as part of the agency supporting the RBF implementation in Benin. While our participant observation and operational collaboration with the project's stakeholders informed the study, the analysis is mostly based on quantitative and qualitative secondary data, collected throughout the project's implementation and documentation processes. Data include project documents, reports and budgets, RBF data on service outputs and on the outcome of the verification, daily activity timesheets of the technical assistants in the districts, as well as focus groups with community-based organizations and informal interviews with technical assistants and district medical officers. Our analysis focuses on the actual practices of quantitative, qualitative and community verification. Results show that the verification processes are complex, costly and time-consuming, and in practice they end up differing from what was designed originally. We explore the consequences of this for the operation of the scheme and for its potential to generate the envisaged change. We find, for example, that the time taken up by verification procedures limits the time available for data analysis and feedback to facility staff, thus limiting the potential to improve service delivery. Verification challenges also result in delays in bonus payment, which delink effort and reward. Additionally, the limited integration of the verification activities of district teams with their routine tasks

  1. Feasibility Study on Applying Radiophotoluminescent Glass Dosimeters for CyberKnife SRS Dose Verification

    PubMed Central

    Hsu, Shih-Ming; Hung, Chao-Hsiung; Liao, Yi-Jen; Fu, Hsiao-Mei; Tsai, Jo-Ting

    2017-01-01

    CyberKnife is one of multiple modalities for stereotactic radiosurgery (SRS). Due to the nature of CyberKnife and the characteristics of SRS, dose evaluation of the CyberKnife procedure is critical. A radiophotoluminescent glass dosimeter was used to verify the dose accuracy of the CyberKnife procedure and to validate a viable dose verification system for CyberKnife treatment. A radiophotoluminescent glass dosimeter, a thermoluminescent dosimeter, and Kodak EDR2 film were used to measure the lateral dose profile and percent depth dose of CyberKnife. A Monte Carlo simulation for dose verification was performed using BEAMnrc to verify the measured results. This study also used a radiophotoluminescent glass dosimeter coupled with an anthropomorphic phantom to evaluate the accuracy of the dose given by CyberKnife. Measurements from the radiophotoluminescent glass dosimeter were compared with the results of the thermoluminescent dosimeter and EDR2 film, and the differences found were less than 5%. The radiophotoluminescent glass dosimeter has advantages for CyberKnife dose measurements, such as repeatability, stability, and small effective size. These advantages make radiophotoluminescent glass dosimeters a potential candidate dosimeter for the CyberKnife procedure. This study concludes that radiophotoluminescent glass dosimeters are a promising and reliable dosimeter for CyberKnife dose verification, with clinically acceptable accuracy within 5%. PMID:28046056

  2. SU-F-T-268: A Feasibility Study of Independent Dose Verification for Vero4DRT

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Yamashita, M; Kokubo, M; Institute of Biomedical Research and Innovation, Kobe, Hyogo

    2016-06-15

    Purpose: Vero4DRT (Mitsubishi Heavy Industries Ltd.) has been released for a few years. The treatment planning system (TPS) of Vero4DRT is dedicated, so measurement is the only method of dose verification. There have been no reports of independent dose verification using a Clarkson-based algorithm for Vero4DRT. An independent dose verification software program for general-purpose linacs, using a modified Clarkson-based algorithm, was adapted for Vero4DRT. In this study, we evaluated the accuracy of the independent dose verification program and the feasibility of the secondary check for Vero4DRT. Methods: iPlan (Brainlab AG) was used as the TPS. Pencil Beam Convolution was used as the dose calculation algorithm for IMRT, and X-ray Voxel Monte Carlo was used for the others. Simple MU Analysis (SMU, Triangle Products, Japan) was used as the independent dose verification software program, in which CT-based dose calculation was performed using a modified Clarkson-based algorithm. In this study, 120 patients' treatment plans were collected in our institute. The treatments were performed using conventional irradiation for lung and prostate, SBRT for lung, and step-and-shoot IMRT for prostate. Doses from the TPS and the SMU were compared, and confidence limits (CLs, mean ± 2SD %) were compared to those from a general-purpose linac. Results: As the results of the CLs, the conventional irradiation (lung, prostate), SBRT (lung) and IMRT (prostate) show 2.2 ± 3.5% (CL of the general-purpose linac: 2.4 ± 5.3%), 1.1 ± 1.7% (−0.3 ± 2.0%), 4.8 ± 3.7% (5.4 ± 5.3%) and −0.5 ± 2.5% (−0.1 ± 3.6%), respectively. The CLs for Vero4DRT show similar results to those for the general-purpose linac. Conclusion: The independent dose verification for the new linac is clinically available as a secondary check, and we performed the check with a tolerance level similar to that of the general-purpose linac. This research is partially supported by Japan Agency for Medical

  3. SU-E-T-32: A Feasibility Study of Independent Dose Verification for IMAT

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kamima, T; Takahashi, R; Sato, Y

    2015-06-15

    Purpose: To assess the feasibility of independent dose verification (Indp) for intensity modulated arc therapy (IMAT). Methods: An independent dose calculation software program (Simple MU Analysis, Triangle Products, JP) was used in this study, which can compute the radiological path length from the surface to the reference point for each control point using the patient's CT image dataset; the MLC aperture shape was simultaneously modeled from the MLC information in the DICOM-RT plan. Dose calculation was performed using a modified Clarkson method considering MLC transmission and dosimetric leaf gap. In this study, a retrospective analysis was conducted in which IMAT plans from 120 patients for two sites (prostate / head and neck) from four institutes were analyzed to compare the Indp to the TPS using patient CT images. In addition, an ion-chamber measurement was performed to verify the accuracy of the TPS and the Indp in a water-equivalent phantom. Results: The agreements between the Indp and the TPS (mean ± 1SD) were −0.8 ± 2.4% and −1.3 ± 3.8% for the prostate and the head and neck regions, respectively. The measurement comparison showed similar results (−0.8 ± 1.6% and 0.1 ± 4.6% for prostate and head and neck). The variation was larger in the head and neck because the number of segments in which the reference point was under the MLC increased, and the modified Clarkson method cannot consider the smooth falloff of the leaf penumbra. Conclusion: The independent verification program would be practical and effective as a secondary check for IMAT, with sufficient accuracy in both the measurement and the CT-based calculation. The accuracy would be improved by considering the falloff of the leaf penumbra.
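
    For readers unfamiliar with the Clarkson method referenced in both records above, the sketch below shows the core sector-integration idea: scatter at the calculation point is the average of a radius-dependent scatter term over equal-angle sectors drawn to the field edge. The scatter table, sector count, and aperture are invented; real programs such as SMU additionally model MLC transmission and dosimetric leaf gap.

```python
import math

# Condensed sketch of Clarkson-style sector integration. The scatter table
# below is invented for illustration; clinical programs use measured data.

SMR = {0: 0.0, 2: 0.05, 4: 0.09, 6: 0.12, 8: 0.14, 10: 0.15}  # radius cm -> SMR

def smr(radius_cm):
    """Nearest-entry lookup standing in for proper table interpolation."""
    key = min(SMR, key=lambda r: abs(r - radius_cm))
    return SMR[key]

def clarkson_scatter(sector_radii_cm):
    """Average scatter term over equal-angle sectors drawn from the point of
    calculation to the (possibly irregular) field edge."""
    return sum(smr(r) for r in sector_radii_cm) / len(sector_radii_cm)

# 36 sectors of 10 degrees for a roughly elliptical aperture (radii in cm)
radii = [4 + 2 * abs(math.cos(math.radians(10 * i))) for i in range(36)]
print(f"effective scatter component: {clarkson_scatter(radii):.3f}")
```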

  4. Viability Study for an Unattended UF 6 Cylinder Verification Station: Phase I Final Report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Smith, Leon E.; Miller, Karen A.; Garner, James R.

    In recent years, the International Atomic Energy Agency (IAEA) has pursued innovative techniques and an integrated suite of safeguards measures to address the verification challenges posed by the front end of the nuclear fuel cycle. Among the unattended instruments currently being explored by the IAEA is an Unattended Cylinder Verification Station (UCVS) that could provide automated, independent verification of the declared relative enrichment, 235U mass, total uranium mass and identification for all declared UF6 cylinders in a facility (e.g., uranium enrichment plants and fuel fabrication plants). Under the auspices of the United States and European Commission Support Programs to the IAEA, a project was undertaken to assess the technical and practical viability of the UCVS concept. The US Support Program team consisted of Pacific Northwest National Laboratory (PNNL, lead), Los Alamos National Laboratory (LANL), Oak Ridge National Laboratory (ORNL) and Savannah River National Laboratory (SRNL). At the core of the viability study is a long-term field trial of a prototype UCVS system at a Westinghouse fuel fabrication facility. A key outcome of the study is a quantitative performance evaluation of two nondestructive assay (NDA) methods being considered for inclusion in a UCVS: the Hybrid Enrichment Verification Array (HEVA) and the Passive Neutron Enrichment Meter (PNEM). This report provides context for the UCVS concept and the field trial: potential UCVS implementation concepts at an enrichment facility; an overview of the UCVS prototype design; and field trial objectives and activities. Field trial results and interpretation are presented, with a focus on the performance of PNEM and HEVA for the assay of over 200 "typical" Type 30B cylinders, and the viability of an "NDA Fingerprint" concept as a high-fidelity means to periodically verify that the contents of a given cylinder are consistent with previous scans. A modeling study, combined with field

  5. Hydrologic data-verification management program plan

    USGS Publications Warehouse

    Alexander, C.W.

    1982-01-01

    Data verification refers to the performance of quality control on hydrologic data that have been retrieved from the field and are being prepared for dissemination to water-data users. Water-data users now have access to computerized data files containing unpublished, unverified hydrologic data. Therefore, it is necessary to develop techniques and systems whereby the computer can perform some data-verification functions before the data are stored in user-accessible files. Computerized data-verification routines can be developed for this purpose. A single, unified concept, describing a master data-verification program that uses multiple special-purpose subroutines and a screen file containing verification criteria, can probably be adapted to any type and size of computer-processing system. Some traditional manual-verification procedures can be adapted for computerized verification, but new procedures can also be developed that take advantage of the powerful statistical tools and data-handling procedures available to the computer. Prototype data-verification systems should be developed for all three data-processing environments as soon as possible. The WATSTORE system probably affords the greatest opportunity for long-range research and testing of new verification subroutines. (USGS)
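
    A screen-file-driven verification routine of the kind proposed here can be sketched in a few lines: each incoming series is checked against range and rate-of-change criteria read from the screen file before it reaches user-accessible storage. The field names, limits, and sample values below are invented.

```python
# Sketch of screen-file-style data verification: criteria from a "screen
# file" are applied to incoming records before they reach user-accessible
# files. Field names, limits, and records are invented for illustration.

screen_file = {
    "stage_ft":      {"min": 0.0, "max": 30.0, "max_jump": 2.0},
    "discharge_cfs": {"min": 0.0, "max": 50000.0, "max_jump": 5000.0},
}

def verify_series(name, values):
    """Return (index, reason) pairs failing range or rate-of-change checks."""
    crit, flagged = screen_file[name], []
    for i, v in enumerate(values):
        if not (crit["min"] <= v <= crit["max"]):
            flagged.append((i, "range"))
        elif i > 0 and abs(v - values[i - 1]) > crit["max_jump"]:
            flagged.append((i, "jump"))
    return flagged

stage = [3.1, 3.2, 3.3, 9.9, 3.4, -0.5]
print(verify_series("stage_ft", stage))  # [(3, 'jump'), (4, 'jump'), (5, 'range')]
```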

  6. Optical security verification for blurred fingerprints

    NASA Astrophysics Data System (ADS)

    Soon, Boon Y.; Karim, Mohammad A.; Alam, Mohammad S.

    1998-12-01

    Optical fingerprint security verification is gaining popularity, as it has the potential to perform correlation at the speed of light. With advancements in optical security verification techniques, the authentication process can be made almost foolproof and reliable for financial transactions, banking, etc. In law enforcement, when a fingerprint is obtained from a crime scene, it may be blurred and can be an unhealthy candidate for correlation purposes. Therefore, the blurred fingerprint needs to be clarified before it is used in the correlation process. There are several different types of blur, such as linear motion blur and defocus blur, induced by aberrations of the imaging system. In addition, we may or may not know the blur function. In this paper, we propose non-singularity inverse filtering in the frequency/power domain for deblurring known motion-induced blur in fingerprints. This filtering process is incorporated with the power spectrum subtraction technique, a uniqueness comparison scheme, and the separated target and reference planes method in the joint transform correlator. The proposed hardware implementation is a hybrid electronic-optical correlator system. The performance of the proposed system is verified with computer simulation for both cases: with and without additive random noise corruption.
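
    The paper's non-singularity inverse filter is not specified in the abstract, so the sketch below uses a common regularized variant of frequency-domain inverse filtering for a known linear motion blur: divide by the blur response while damping near-zero frequencies. The kernel length, image, and epsilon are invented.

```python
import numpy as np

# Compact sketch of inverse filtering for a known blur, with a small
# regularization term to avoid division by near-zero frequency response
# (one way to sidestep singularities; the paper's exact scheme may differ).

rng = np.random.default_rng(0)
image = rng.random((64, 64))

# Known horizontal motion blur kernel, embedded in a full-size array.
kernel = np.zeros((64, 64))
kernel[0, :7] = 1.0 / 7.0

H = np.fft.fft2(kernel)
blurred = np.real(np.fft.ifft2(np.fft.fft2(image) * H))

eps = 1e-3  # regularization: keeps |H|^2 away from zero in the divisor
restored = np.real(np.fft.ifft2(
    np.fft.fft2(blurred) * np.conj(H) / (np.abs(H) ** 2 + eps)))

print(f"RMS error blurred:  {np.sqrt(np.mean((blurred - image) ** 2)):.4f}")
print(f"RMS error restored: {np.sqrt(np.mean((restored - image) ** 2)):.4f}")
```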

  7. Hierarchical Representation Learning for Kinship Verification.

    PubMed

    Kohli, Naman; Vatsa, Mayank; Singh, Richa; Noore, Afzel; Majumdar, Angshul

    2017-01-01

    Kinship verification has a number of applications such as organizing large collections of images and recognizing resemblances among humans. In this paper, first, a human study is conducted to understand the capabilities of the human mind and to identify the discriminatory areas of a face that provide kinship cues. The visual stimuli presented to the participants determine their ability to recognize kin relationships using the whole face as well as specific facial regions. The effect of participant gender and age and kin-relation pair of the stimulus is analyzed using quantitative measures such as accuracy, discriminability index d', and perceptual information entropy. Utilizing the information obtained from the human study, a hierarchical kinship verification via representation learning (KVRL) framework is utilized to learn the representation of different face regions in an unsupervised manner. We propose a novel approach for feature representation termed filtered contractive deep belief networks (fcDBN). The proposed feature representation encodes relational information present in images using filters and a contractive regularization penalty. A compact representation of facial images of kin is extracted as an output from the learned model and a multi-layer neural network is utilized to verify the kin accurately. A new WVU kinship database is created, which consists of multiple images per subject to facilitate kinship verification. The results show that the proposed deep learning framework (KVRL-fcDBN) yields the state-of-the-art kinship verification accuracy on the WVU kinship database and on four existing benchmark data sets. Furthermore, kinship information is used as a soft biometric modality to boost the performance of face verification via product of likelihood ratio and support vector machine based approaches. Using the proposed KVRL-fcDBN framework, an improvement of over 20% is observed in the performance of face verification.

  8. An Approach to Biometric Verification Based on Human Body Communication in Wearable Devices

    PubMed Central

    Li, Jingzhen; Liu, Yuhang; Nie, Zedong; Qin, Wenjian; Pang, Zengyao; Wang, Lei

    2017-01-01

    In this paper, an approach to biometric verification based on human body communication (HBC) is presented for wearable devices. For this purpose, the transmission gain S21 of a volunteer's forearm is measured with a vector network analyzer (VNA). Specifically, in order to determine the chosen frequency for biometric verification, 1800 groups of data are acquired from 10 volunteers in the frequency range 0.3 MHz to 1500 MHz, and each group includes 1601 sample data. In addition, to achieve rapid verification, 30 groups of data for each volunteer are acquired at the chosen frequency, and each group contains only 21 sample data. Furthermore, a threshold-adaptive template matching (TATM) algorithm based on weighted Euclidean distance is proposed for rapid verification in this work. The results indicate that the chosen frequency for biometric verification is from 650 MHz to 750 MHz. The false acceptance rate (FAR) and false rejection rate (FRR) based on TATM are approximately 5.79% and 6.74%, respectively. In contrast, the FAR and FRR were 4.17% and 37.5%, 3.37% and 33.33%, and 3.80% and 34.17% using K-nearest neighbor (KNN) classification, support vector machines (SVM), and the naive Bayesian method (NBM), respectively. In addition, the running time of TATM is 0.019 s, whereas the running times of KNN, SVM and NBM are 0.310 s, 0.0385 s, and 0.168 s, respectively. Therefore, TATM is suggested to be appropriate for rapid verification use in wearable devices. PMID:28075375
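
    The following sketch conveys the flavor of threshold-adaptive template matching with a weighted Euclidean distance; the weighting scheme, the mean-plus-two-sigma threshold rule, and the data are illustrative assumptions, not the authors' parameters.

        import numpy as np

        def weighted_dist(x, template, weights):
            return np.sqrt(np.sum(weights * (x - template) ** 2))

        def enroll(samples):
            """samples: (n_groups, 21) S21 measurements for one user."""
            template = samples.mean(axis=0)
            weights = 1.0 / (samples.std(axis=0) + 1e-9)    # trust stable frequencies more
            dists = [weighted_dist(s, template, weights) for s in samples]
            threshold = np.mean(dists) + 2 * np.std(dists)  # per-user adaptive threshold
            return template, weights, threshold

        def verify(sample, template, weights, threshold):
            return weighted_dist(sample, template, weights) <= threshold

        rng = np.random.default_rng(1)
        tpl, w, thr = enroll(0.5 + 0.01 * rng.standard_normal((30, 21)))
        print(verify(0.5 + 0.01 * rng.standard_normal(21), tpl, w, thr))  # likely True
        print(verify(0.8 + 0.01 * rng.standard_normal(21), tpl, w, thr))  # False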

  9. Identity Verification, Control, and Aggression in Marriage

    ERIC Educational Resources Information Center

    Stets, Jan E.; Burke, Peter J.

    2005-01-01

    In this research we study the identity verification process and its effects in marriage. Drawing on identity control theory, we hypothesize that a lack of verification in the spouse identity (1) threatens stable self-meanings and interaction patterns between spouses, and (2) challenges a (nonverified) spouse's perception of control over the…

  10. Cleaning and Cleanliness Verification Techniques for Mars Returned Sample Handling

    NASA Technical Reports Server (NTRS)

    Mickelson, E. T.; Lindstrom, D. J.; Allton, J. H.; Hittle, J. D.

    2002-01-01

    Precision cleaning and cleanliness verification techniques are examined as a subset of a comprehensive contamination control strategy for a Mars sample return mission. Additional information is contained in the original extended abstract.

  11. Reactive system verification case study: Fault-tolerant transputer communication

    NASA Technical Reports Server (NTRS)

    Crane, D. Francis; Hamory, Philip J.

    1993-01-01

    A reactive program is one which engages in an ongoing interaction with its environment. A system which is controlled by an embedded reactive program is called a reactive system. Examples of reactive systems are aircraft flight management systems, bank automatic teller machine (ATM) networks, airline reservation systems, and computer operating systems. Reactive systems are often naturally modeled (for logical design purposes) as a composition of autonomous processes which progress concurrently and which communicate to share information and/or to coordinate activities. Formal (i.e., mathematical) frameworks for system verification are tools used to increase the users' confidence that a system design satisfies its specification. A framework for reactive system verification includes formal languages for system modeling and for behavior specification and decision procedures and/or proof-systems for verifying that the system model satisfies the system specifications. Using the Ostroff framework for reactive system verification, an approach to achieving fault-tolerant communication between transputers was shown to be effective. The key components of the design, the decoupler processes, may be viewed as discrete-event-controllers introduced to constrain system behavior such that system specifications are satisfied. The Ostroff framework was also effective. The expressiveness of the modeling language permitted construction of a faithful model of the transputer network. The relevant specifications were readily expressed in the specification language. The set of decision procedures provided was adequate to verify the specifications of interest. The need for improved support for system behavior visualization is emphasized.

  12. Acoustic-based proton range verification in heterogeneous tissue: simulation studies

    NASA Astrophysics Data System (ADS)

    Jones, Kevin C.; Nie, Wei; Chu, James C. H.; Turian, Julius V.; Kassaee, Alireza; Sehgal, Chandra M.; Avery, Stephen

    2018-01-01

    Acoustic-based proton range verification (protoacoustics) is a potential in vivo technique for determining the Bragg peak position. Previous measurements and simulations have been restricted to homogeneous water tanks. Here, a CT-based simulation method is proposed and applied to a liver and prostate case to model the effects of tissue heterogeneity on the protoacoustic amplitude and time-of-flight range verification accuracy. For the liver case, posterior irradiation with a single proton pencil beam was simulated for detectors placed on the skin. In the prostate case, a transrectal probe measured the protoacoustic pressure generated by irradiation with five separate anterior proton beams. After calculating the proton beam dose deposition, each CT voxel’s material properties were mapped based on Hounsfield Unit values, and thermoacoustically-generated acoustic wave propagation was simulated with the k-Wave MATLAB toolbox. By comparing the simulation results for the original liver CT to homogenized variants, the effects of heterogeneity were assessed. For the liver case, 1.4 cGy of dose at the Bragg peak generated 50 mPa of pressure (13 cm distal), a 2× lower amplitude than simulated in a homogeneous water tank. Protoacoustic triangulation of the Bragg peak based on multiple detector measurements resulted in 0.4 mm accuracy for a δ-function proton pulse irradiation of the liver. For the prostate case, higher amplitudes are simulated (92-1004 mPa) for closer detectors (<8 cm). For four of the prostate beams, the protoacoustic range triangulation was accurate to ⩽1.6 mm (δ-function proton pulse). Based on the results, application of protoacoustic range verification to heterogeneous tissue will result in decreased signal amplitudes relative to homogeneous water tank measurements, but accurate range verification is still expected to be possible.
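
    The time-of-flight step reduces to multilateration: each detector constrains the Bragg peak to a sphere of radius c·t. Below is a minimal sketch under simplifying assumptions (uniform speed of sound, illustrative detector geometry, and an ad hoc Gauss-Newton solver), not the paper's simulation pipeline.

        import numpy as np

        C = 1540.0  # m/s, nominal soft-tissue speed of sound; heterogeneity shifts this

        def triangulate(detectors, arrival_times, iters=50):
            """Least-squares source position from |x - d_i| = C * t_i."""
            x = detectors.mean(axis=0)                    # start at detector centroid
            for _ in range(iters):
                diff = x - detectors
                dist = np.linalg.norm(diff, axis=1)
                r = dist - C * arrival_times              # range residuals
                J = diff / dist[:, None]                  # Jacobian of |x - d_i|
                x = x - np.linalg.solve(J.T @ J, J.T @ r) # Gauss-Newton update
            return x

        bragg = np.array([0.0, 0.0, 0.13])                # e.g. 13 cm distal
        dets = np.array([[0.1, 0, 0], [-0.1, 0, 0], [0, 0.1, 0], [0, 0, -0.05]])
        times = np.linalg.norm(dets - bragg, axis=1) / C
        print(triangulate(dets, times))                   # ~ [0, 0, 0.13]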

  13. 40 CFR 1065.550 - Gas analyzer range verification and drift verification.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... with a CLD and the removed water is corrected based on measured CO2, CO, THC, and NOX concentrations... concentration subcomponents (e.g., THC and CH4 for NMHC) separately. For example, for NMHC measurements, perform drift verification on NMHC; do not verify THC and CH4 separately. (2) Drift verification requires two...

  14. A Roadmap for the Implementation of Continued Process Verification.

    PubMed

    Boyer, Marcus; Gampfer, Joerg; Zamamiri, Abdel; Payne, Robin

    2016-01-01

    In 2014, the members of the BioPhorum Operations Group (BPOG) produced a 100-page continued process verification case study, entitled "Continued Process Verification: An Industry Position Paper with Example Protocol". This case study captures the thought processes involved in creating a continued process verification plan for a new product in response to the U.S. Food and Drug Administration's guidance on the subject introduced in 2011. In so doing, it provided the specific example of a plan developed for a new monoclonal antibody product based on the "A MAb Case Study" that preceded it in 2009. This document provides a roadmap that draws on the content of the continued process verification case study to provide a step-by-step guide in a more accessible form, with reference to a process map of the product life cycle. It could be used as a basis for continued process verification implementation in a number of different scenarios: for a single product and process; for a single site; to assist in the sharing of data monitoring responsibilities among sites; or to assist in establishing data monitoring agreements between a customer company and a contract manufacturing organization. The U.S. Food and Drug Administration issued guidance on the management of manufacturing processes designed to improve quality and control of drug products. This involved an increased focus on regular monitoring of manufacturing processes, reporting of the results, and taking opportunities to improve. The guidance and the practice associated with it are known as continued process verification. This paper summarizes good practice in responding to continued process verification guidance, gathered from subject matter experts in the biopharmaceutical industry. © PDA, Inc. 2016.

  15. TH-AB-201-01: A Feasibility Study of Independent Dose Verification for CyberKnife

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sato, A; Noda, T; Keduka, Y

    2016-06-15

    Purpose: CyberKnife irradiation is composed of tiny, multiple, intensity-modulated beams compared to conventional linacs. Few publications on independent dose calculation verification for CyberKnife have been reported. In this study, we evaluated the feasibility of independent dose verification for CyberKnife treatment as a secondary check. Methods: The following were measured: test plans using static single beams, clinical plans in a phantom, and clinical plans using patient CT data. 75 patient plans were collected from several treatment sites: brain, lung, liver and bone. In the test plans and the phantom plans, a pinpoint ion-chamber measurement was performed to assess dose deviation for a treatment planning system (TPS) and an independent verification program, Simple MU Analysis (SMU). In the clinical plans, the dose deviation between the SMU and the TPS was evaluated. Results: In the test plans, the dose deviations were 3.3±4.5% and 4.1±4.4% for the TPS and the SMU, respectively. In the phantom measurements for the clinical plans, the dose deviations were −0.2±3.6% for the TPS and −2.3±4.8% for the SMU. In the clinical plans using the patient CT data, the dose deviations were −3.0±2.1% (mean±1SD). The systematic difference was partially derived from the inverse square law and penumbra calculation. Conclusion: The independent dose calculation for CyberKnife shows −3.0±4.2% (mean±2SD); in our study, the confidence limit was within the 5% tolerance level from AAPM Task Group 114 for non-IMRT treatment. Thus, it may be feasible to use independent dose calculation verification for CyberKnife treatment as a secondary check. This research is partially supported by the Japan Agency for Medical Research and Development (AMED).

  16. Collaborative Localization and Location Verification in WSNs

    PubMed Central

    Miao, Chunyu; Dai, Guoyong; Ying, Kezhen; Chen, Qingzhang

    2015-01-01

    Localization is one of the most important technologies in wireless sensor networks. A lightweight distributed node localization scheme is proposed by considering the limited computational capacity of WSNs. The proposed scheme introduces the virtual force model to determine the location by incremental refinement. Aiming at solving the drifting problem and the malicious anchor problem, a location verification algorithm based on the virtual force model is presented. In addition, an anchor promotion algorithm using the localization reliability model is proposed to re-locate the drifted nodes. Extended simulation experiments indicate that the localization algorithm has relatively high precision and the location verification algorithm has relatively high accuracy. The communication overhead of these algorithms is relatively low, and the whole set of reliable localization methods is practical as well as comprehensive. PMID:25954948
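
    The virtual-force idea can be pictured as springs between the position estimate and each anchor: an anchor whose measured range exceeds the current estimated distance pulls the node outward, and vice versa. A minimal sketch of that incremental refinement follows (the update rule and gain are illustrative assumptions, not the paper's algorithm).

        import numpy as np

        def refine(anchors, ranges, steps=200, gain=0.5):
            """Iteratively move the estimate until estimated ranges match measured ones."""
            pos = anchors.mean(axis=0)                     # initial guess
            for _ in range(steps):
                force = np.zeros(2)
                for a, r in zip(anchors, ranges):
                    d = np.linalg.norm(pos - a)
                    force += (r - d) * (pos - a) / d       # spring stretch/compression
                pos = pos + gain * force / len(anchors)
            return pos

        anchors = np.array([[0.0, 0.0], [10.0, 0.0], [0.0, 10.0]])
        true_pos = np.array([3.0, 4.0])
        ranges = np.linalg.norm(anchors - true_pos, axis=1)
        print(refine(anchors, ranges))                     # ~ [3, 4]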

  17. Validation and verification of expert systems

    NASA Technical Reports Server (NTRS)

    Gilstrap, Lewey

    1991-01-01

    Validation and verification (V&V) are procedures used to evaluate system structure or behavior with respect to a set of requirements. Although expert systems are often developed as a series of prototypes without requirements, it is not possible to perform V&V on any system for which requirements have not been prepared. In addition, there are special problems associated with the evaluation of expert systems that do not arise in the evaluation of conventional systems, such as verification of the completeness and accuracy of the knowledge base. The criticality of most NASA missions makes it important to be able to certify the performance of the expert systems used to support these missions. Recommendations for the most appropriate method for integrating V&V into the Expert System Development Methodology (ESDM) and suggestions for the most suitable approaches for each stage of ESDM development are presented.

  18. Verification of road databases using multiple road models

    NASA Astrophysics Data System (ADS)

    Ziems, Marcel; Rottensteiner, Franz; Heipke, Christian

    2017-08-01

    In this paper a new approach for automatic road database verification based on remote sensing images is presented. In contrast to existing methods, the applicability of the new approach is not restricted to specific road types, context areas or geographic regions. This is achieved by combining several state-of-the-art road detection and road verification approaches that work well under different circumstances. Each one serves as an independent module representing a unique road model and a specific processing strategy. All modules provide independent solutions for the verification problem of each road object stored in the database in the form of two probability distributions, the first one for the state of a database object (correct or incorrect), and a second one for the state of the underlying road model (applicable or not applicable). In accordance with the Dempster-Shafer Theory, both distributions are mapped to a new state space comprising the classes correct, incorrect and unknown. Statistical reasoning is applied to obtain the optimal state of a road object. A comparison with state-of-the-art road detection approaches using benchmark datasets shows that in general the proposed approach provides results with greater completeness. Additional experiments reveal that based on the proposed method a highly reliable semi-automatic approach for road database verification can be designed.
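
    The fusion step can be sketched with Dempster-Shafer mass functions: each module's pair of distributions collapses to masses on {correct}, {incorrect}, and the full frame (the "unknown" state), and modules are combined with Dempster's rule. The mapping below is one plausible reading of the approach, not the authors' exact construction.

        def to_mass(p_correct, p_applicable):
            """An inapplicable road model contributes only 'unknown' mass."""
            return {"C": p_applicable * p_correct,
                    "I": p_applicable * (1 - p_correct),
                    "U": 1 - p_applicable}

        def dempster(m1, m2):
            """Dempster's rule of combination over the frame {C, I}."""
            k = 1 - (m1["C"] * m2["I"] + m1["I"] * m2["C"])   # discard conflict
            return {"C": (m1["C"] * m2["C"] + m1["C"] * m2["U"] + m1["U"] * m2["C"]) / k,
                    "I": (m1["I"] * m2["I"] + m1["I"] * m2["U"] + m1["U"] * m2["I"]) / k,
                    "U": m1["U"] * m2["U"] / k}

        # Two modules: one confident and applicable, one barely applicable.
        print(dempster(to_mass(0.9, 0.8), to_mass(0.4, 0.3)))
        # Residual mass on "U" expresses the 'unknown' verification state.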

  19. PERFORMANCE VERIFICATION OF ANIMAL WATER TREATMENT TECHNOLOGIES THROUGH EPA'S ENVIRONMENTAL TECHNOLOGY VERIFICATION PROGRAM

    EPA Science Inventory

    The U.S. Environmental Protection Agency created the Environmental Technology Verification Program (ETV) to further environmental protection by accelerating the commercialization of new and innovative technology through independent performance verification and dissemination of in...

  20. A study of compositional verification based IMA integration method

    NASA Astrophysics Data System (ADS)

    Huang, Hui; Zhang, Guoquan; Xu, Wanmeng

    2018-03-01

    The rapid development of avionics systems is driving the application of integrated modular avionics (IMA) systems. While IMA improves avionics system integration, it also increases the complexity of system test, so the IMA system test method needs to be simplified. An IMA system provides a modular platform that runs multiple applications and shares processing resources. Compared with a federated avionics system, failures in an IMA system are difficult to isolate. The critical problem for IMA system verification is therefore how to test resources shared by multiple applications. For a simple avionics system, traditional test methods can readily exercise the whole system, but for a large, highly integrated avionics system, exhaustive testing is impractical. This paper therefore proposes applying compositional-verification theory to IMA system test, reducing the number of test processes, improving efficiency, and consequently lowering the cost of IMA system integration.

  1. PERFORMANCE VERIFICATION OF STORMWATER TREATMENT DEVICES UNDER EPA'S ENVIRONMENTAL TECHNOLOGY VERIFICATION PROGRAM

    EPA Science Inventory

    The Environmental Technology Verification (ETV) Program was created to facilitate the deployment of innovative or improved environmental technologies through performance verification and dissemination of information. The program's goal is to further environmental protection by a...

  2. Fingerprint changes and verification failure among patients with hand dermatitis.

    PubMed

    Lee, Chew Kek; Chang, Choong Chor; Johar, Asmah; Puwira, Othman; Roshidah, Baba

    2013-03-01

    To determine the prevalence of fingerprint verification failure and to define and quantify the fingerprint changes associated with fingerprint verification failure. Case-control study. Referral public dermatology center. The study included 100 consecutive patients with clinical hand dermatitis involving the palmar distal phalanx of either thumb and 100 age-, sex-, and ethnicity-matched controls. Patients with an altered thumb print due to other causes and palmar hyperhidrosis were excluded. Fingerprint verification (pass/fail) and hand eczema severity index score. Twenty-seven percent of patients failed fingerprint verification compared with 2% of controls. Fingerprint verification failure was associated with a higher hand eczema severity index score (P < .001). The main fingerprint abnormalities were fingerprint dystrophy (42.0%) and abnormal white lines (79.5%). The number of abnormal white lines was significantly higher among the patients with hand dermatitis compared with controls (P = .001). Among the patients with hand dermatitis, the odds of failing fingerprint verification with fingerprint dystrophy were 4.01. The presence of broad lines and long lines was associated with greater odds of fingerprint verification failure (odds ratio [OR], 8.04; 95% CI, 3.56-18.17 and OR, 2.37; 95% CI, 1.31-4.27, respectively), while the presence of thin lines was protective against verification failure (OR, 0.45; 95% CI, 0.23-0.89). Fingerprint verification failure is a significant problem among patients with more severe hand dermatitis. It is mainly due to fingerprint dystrophy and abnormal white lines. Malaysian National Medical Research Register Identifier: NMRR-11-30-8226

  3. PFLOTRAN Verification: Development of a Testing Suite to Ensure Software Quality

    NASA Astrophysics Data System (ADS)

    Hammond, G. E.; Frederick, J. M.

    2016-12-01

    In scientific computing, code verification ensures the reliability and numerical accuracy of a model simulation by comparing the simulation results to experimental data or known analytical solutions. The model is typically defined by a set of partial differential equations with initial and boundary conditions, and verification checks whether the mathematical model is solved correctly by the software. Code verification is especially important if the software is used to model high-consequence systems which cannot be physically tested in a fully representative environment [Oberkampf and Trucano (2007)]. Justified confidence in a particular computational tool requires clarity in the exercised physics and transparency in its verification process with proper documentation. We present a quality assurance (QA) testing suite developed by Sandia National Laboratories that performs code verification for PFLOTRAN, an open source, massively-parallel subsurface simulator. PFLOTRAN solves systems of generally nonlinear partial differential equations describing multiphase, multicomponent and multiscale reactive flow and transport processes in porous media. PFLOTRAN's QA test suite compares the numerical solutions of benchmark problems in heat and mass transport against known, closed-form, analytical solutions, including documentation of the exercised physical process models implemented in each PFLOTRAN benchmark simulation. The QA test suite development strives to follow the recommendations given by Oberkampf and Trucano (2007), which describes four essential elements in high-quality verification benchmark construction: (1) conceptual description, (2) mathematical description, (3) accuracy assessment, and (4) additional documentation and user information. Several QA tests within the suite will be presented, including details of the benchmark problems and their closed-form analytical solutions, implementation of benchmark problems in PFLOTRAN simulations, and the criteria used to
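
    A minimal sketch of the kind of check such a suite automates (not PFLOTRAN's harness): solve a one-dimensional heat-conduction benchmark, compare against its closed-form solution, and confirm that the error falls at the expected second-order rate under grid refinement.

        import numpy as np

        def solve_heat_1d(n, t_end=0.01, alpha=1.0):
            """Explicit finite differences for u_t = alpha*u_xx, u(0)=u(1)=0."""
            x = np.linspace(0.0, 1.0, n)
            dx = x[1] - x[0]
            steps = int(np.ceil(t_end / (0.25 * dx**2 / alpha)))  # stable step count
            dt = t_end / steps
            u = np.sin(np.pi * x)                                 # initial condition
            for _ in range(steps):
                u[1:-1] += dt * alpha * (u[2:] - 2 * u[1:-1] + u[:-2]) / dx**2
            return x, u

        for n in (17, 33, 65):
            x, u = solve_heat_1d(n)
            exact = np.sin(np.pi * x) * np.exp(-np.pi**2 * 0.01)  # analytical solution
            print(n, np.abs(u - exact).max())    # error shrinks ~4x per refinement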

  4. New Aspects of Probabilistic Forecast Verification Using Information Theory

    NASA Astrophysics Data System (ADS)

    Tödter, Julian; Ahrens, Bodo

    2013-04-01

    This work deals with information-theoretical methods in probabilistic forecast verification, particularly concerning ensemble forecasts. Recent findings concerning the "Ignorance Score" are briefly reviewed, then a consistent generalization to continuous forecasts is motivated. For ensemble-generated forecasts, the presented measures can be calculated exactly. The Brier Score (BS) and its generalizations to the multi-categorical Ranked Probability Score (RPS) and to the Continuous Ranked Probability Score (CRPS) are prominent verification measures for probabilistic forecasts. Particularly, their decompositions into measures quantifying the reliability, resolution and uncertainty of the forecasts are attractive. Information theory sets up a natural framework for forecast verification. Recently, it has been shown that the BS is a second-order approximation of the information-based Ignorance Score (IGN), which also contains easily interpretable components and can also be generalized to a ranked version (RIGN). Here, the IGN, its generalizations and decompositions are systematically discussed in analogy to the variants of the BS. Additionally, a Continuous Ranked IGN (CRIGN) is introduced in analogy to the CRPS. The useful properties of the conceptually appealing CRIGN are illustrated, together with an algorithm to evaluate its components (reliability, resolution, and uncertainty) for ensemble-generated forecasts. This algorithm can also be used to calculate the decomposition of the more traditional CRPS exactly. The applicability of the "new" measures is demonstrated in a small evaluation study of ensemble-based precipitation forecasts.
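
    For a binary event the two scores reduce to simple averages, which makes the contrast easy to demonstrate. The sketch below uses toy assumptions (a skill-free 20-member ensemble, eps-clipping to guard the logarithm) and shows that IGN punishes the ensemble's occasional 0/1 relative frequencies more sharply than the BS does.

        import numpy as np

        def brier(p, o):
            return np.mean((p - o) ** 2)

        def ignorance(p, o, eps=1e-3):
            p = np.clip(p, eps, 1 - eps)          # guard against log(0)
            return np.mean(-(o * np.log2(p) + (1 - o) * np.log2(1 - p)))

        rng = np.random.default_rng(0)
        obs = (rng.random(1000) < 0.3).astype(float)          # event, true probability 0.3
        p_ens = (rng.random((1000, 20)) < 0.3).mean(axis=1)   # ensemble relative frequency
        p_true = np.full(1000, 0.3)                           # the "perfect" probability forecast
        print("BS :", brier(p_ens, obs), "vs", brier(p_true, obs))
        print("IGN:", ignorance(p_ens, obs), "vs", ignorance(p_true, obs))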

  5. Block 2 SRM conceptual design studies. Volume 1, Book 2: Preliminary development and verification plan

    NASA Technical Reports Server (NTRS)

    1986-01-01

    Activities that will be conducted in support of the development and verification of the Block 2 Solid Rocket Motor (SRM) are described. Development includes design, fabrication, processing, and testing activities in which the results are fed back into the project. Verification includes analytical and test activities which demonstrate SRM component/subassembly/assembly capability to perform its intended function. The management organization responsible for formulating and implementing the verification program is introduced. It also identifies the controls which will monitor and track the verification program. Integral with the design and certification of the SRM are other pieces of equipment used in transportation, handling, and testing which influence the reliability and maintainability of the SRM configuration. The certification of this equipment is also discussed.

  6. A Systematic Method for Verification and Validation of Gyrokinetic Microstability Codes

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bravenec, Ronald

    My original proposal for the period Feb. 15, 2014 through Feb. 14, 2017 called for an integrated validation and verification effort carried out by myself with collaborators. The validation component would require experimental profile and power-balance analysis. In addition, it would require running the gyrokinetic codes varying the input profiles within experimental uncertainties to seek agreement with experiment before discounting a code as invalidated. Therefore, validation would require a major increase of effort over my previous grant periods, which covered only code verification (code benchmarking). Consequently, I had requested full-time funding. Instead, I am being funded at somewhat less than half time (5 calendar months per year). As a consequence, I decided to forego the validation component and to only continue the verification efforts.

  7. 18 CFR 281.213 - Data Verification Committee.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... preparation. (e) The Data Verification Committee shall prepare a report concerning the proposed index of... § 281.213 Data Verification Committee. (a) Each interstate pipeline shall establish a Data Verification...

  8. 18 CFR 281.213 - Data Verification Committee.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... preparation. (e) The Data Verification Committee shall prepare a report concerning the proposed index of... § 281.213 Data Verification Committee. (a) Each interstate pipeline shall establish a Data Verification...

  9. A Practitioners Perspective on Verification

    NASA Astrophysics Data System (ADS)

    Steenburgh, R. A.

    2017-12-01

    NOAA's Space Weather Prediction Center offers a wide range of products and services to meet the needs of an equally wide range of customers. A robust verification program is essential to the informed use of model guidance and other tools by forecasters and end users alike. In this talk, we present current SWPC practices and results, and examine emerging requirements and potential approaches to satisfy them. We explore the varying verification needs of forecasters and end users, as well as the role of subjective and objective verification. Finally, we describe a vehicle used in the meteorological community to unify approaches to model verification and facilitate intercomparison.

  10. GENERIC VERIFICATION PROTOCOL FOR THE VERIFICATION OF PESTICIDE SPRAY DRIFT REDUCTION TECHNOLOGIES FOR ROW AND FIELD CROPS

    EPA Science Inventory

    This ETV program generic verification protocol was prepared and reviewed for the Verification of Pesticide Drift Reduction Technologies project. The protocol provides a detailed methodology for conducting and reporting results from a verification test of pesticide drift reductio...

  11. The effectiveness of ID readers and remote age verification in enhancing compliance with the legal age limit for alcohol.

    PubMed

    Van Hoof, Joris J

    2017-04-01

    Currently, two different age verification systems (AVS) are implemented to enhance compliance with legal age limits for the sale of alcohol in the Netherlands. In this study, we tested the operational procedures and effectiveness of ID readers and remote age verification technology in supermarkets during the sale of alcohol. Following a trained alcohol purchase protocol, eight mystery shoppers (both underage and in the branch's reference age) conducted 132 alcohol purchase attempts in stores that were equipped with ID readers or remote age verification, or were part of a control group. In stores equipped with an ID reader, 34% of the purchases were conducted without any mistakes (full compliance). In stores with remote age verification, full compliance was achieved in 87% of the cases. The control group reached 57% compliance, which is in line with the national average. Stores with ID readers perform worse than stores with remote age verification, and also worse than stores without any AVS. For both systems, in addition to effectiveness, public support and user friendliness need to be investigated. This study shows that remote age verification technology is a promising intervention that increases vendor compliance during the sale of age-restricted products. © The Author 2016. Published by Oxford University Press on behalf of the European Public Health Association. All rights reserved.

  12. 42 CFR 457.380 - Eligibility verification.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    Requirements: Eligibility, Screening, Applications, and Enrollment. § 457.380 Eligibility verification. (a) The State may establish reasonable eligibility verification mechanisms to promote enrollment of eligible...

  13. Explaining Verification Conditions

    NASA Technical Reports Server (NTRS)

    Denney, Ewen; Fischer, Bernd

    2006-01-01

    The Hoare approach to program verification relies on the construction and discharge of verification conditions (VCs) but offers no support to trace, analyze, and understand the VCs themselves. We describe a systematic extension of the Hoare rules by labels so that the calculus itself can be used to build up explanations of the VCs. The labels are maintained through the different processing steps and rendered as natural language explanations. The explanations can easily be customized and can capture different aspects of the VCs; here, we focus on their structure and purpose. The approach is fully declarative and the generated explanations are based only on an analysis of the labels rather than directly on the logical meaning of the underlying VCs or their proofs. Keywords: program verification, Hoare calculus, traceability.
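
    As a worked example of what a VC is, take the Hoare triple {x > 0} x := x + 1 {x > 1}: the assignment rule substitutes x + 1 for x in the postcondition, yielding the VC (x > 0) => (x + 1 > 1). The sketch below (an illustration, not the paper's labeled calculus) encodes the rule as weakest-precondition substitution and spot-checks the VC on sample states.

        def wp_assign(var, expr, post):
            """Weakest precondition of 'var := expr' with respect to 'post'."""
            return lambda env: post({**env, var: expr(env)})

        pre = lambda env: env["x"] > 0                   # asserted precondition
        post = lambda env: env["x"] > 1                  # required postcondition
        wp = wp_assign("x", lambda env: env["x"] + 1, post)

        # The VC pre => wp must hold in every state; spot-check a few:
        for x0 in (-3, 0, 1, 7):
            env = {"x": x0}
            print(x0, (not pre(env)) or wp(env))         # True for all x0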

  14. Quantum money with classical verification

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gavinsky, Dmitry

    We propose and construct a quantum money scheme that allows verification through classical communication with a bank. This is the first demonstration that a secure quantum money scheme exists that does not require quantum communication for coin verification. Our scheme is secure against adaptive adversaries - this property is not directly related to the possibility of classical verification, nevertheless none of the earlier quantum money constructions is known to possess it.

  15. Quantum money with classical verification

    NASA Astrophysics Data System (ADS)

    Gavinsky, Dmitry

    2014-12-01

    We propose and construct a quantum money scheme that allows verification through classical communication with a bank. This is the first demonstration that a secure quantum money scheme exists that does not require quantum communication for coin verification. Our scheme is secure against adaptive adversaries - this property is not directly related to the possibility of classical verification, nevertheless none of the earlier quantum money constructions is known to possess it.

  16. Kleene Algebra and Bytecode Verification

    DTIC Science & Technology

    2016-04-27

    computing the star (Kleene closure) of a matrix of transfer functions. In this paper we show how this general framework applies to the problem of Java bytecode verification. We show how to specify transfer functions arising in Java bytecode verification in such a way that the Kleene algebra operations... potentially improve the performance over the standard worklist algorithm when a small cutset can be found. Keywords: Java, bytecode, verification, static
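
    The algebraic core is the star A* = I + A + A·A + ... of a matrix over a semiring. The sketch below computes it over the boolean semiring, where plain reachability stands in for the dataflow transfer functions used in bytecode verification (a simplifying assumption for illustration).

        import numpy as np

        def star(A):
            """Kleene star (reflexive-transitive closure) over the boolean semiring."""
            S = np.eye(len(A), dtype=bool)
            while True:
                T = S | ((S.astype(int) @ A.astype(int)) > 0)   # S + S*A
                if (T == S).all():
                    return S
                S = T

        # Control-flow edges of a tiny 4-block program: 0->1, 1->2, 2->1, 1->3
        A = np.zeros((4, 4), dtype=bool)
        for i, j in [(0, 1), (1, 2), (2, 1), (1, 3)]:
            A[i, j] = True
        print(star(A).astype(int))   # entry [i][j]: block j reachable from block i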

  17. Verification and Improvement of ERS-1/2 Altimeter Geophysical Data Records for Global Change Studies

    NASA Technical Reports Server (NTRS)

    Shum, C. K.

    2000-01-01

    This Final Technical Report summarizes the research work conducted under NASA's Physical Oceanography Program entitled "Verification and Improvement of ERS-1/2 Altimeter Geophysical Data Records for Global Change Studies," for the time period from January 1, 2000 through June 30, 2000. This report also provides a summary of the investigation from July 1, 1997 - June 30, 2000. The primary objectives of this investigation include verification and improvement of the ERS-1 and ERS-2 radar altimeter geophysical data records for distribution of the data to the ESA-approved U.S. ERS-1/-2 investigators for global climate change studies. Specifically, the investigation is to verify and improve the ERS geophysical data record products by calibrating the instrument and assessing accuracy for the ERS-1/-2 orbital, geophysical, media, and instrument corrections. The purpose is to ensure the consistency of constants, standards and algorithms with the TOPEX/POSEIDON radar altimeter for global climate change studies such as the monitoring and interpretation of long-term sea level change. This investigation has provided the current best precise orbits, with the radial orbit accuracy for ERS-1 (Phases C-G) and ERS-2 estimated at the 3-5 cm rms level, a 30-fold improvement compared to the 1993 accuracy. We have finalized the production and verification of the value-added data for the ERS-1 mission (Phases A, B, C, D, E, F, and G), in collaboration with JPL PODAAC and the University of Texas. Orbit and data verification and improvement of algorithms led to the best data product available to date. ERS-2 altimeter data have been improved, and we have been active in the Envisat (2001 launch) GDR algorithm review and improvement. The data improvement of ERS-1 and ERS-2 led to improvements in the global mean sea surface, marine gravity anomaly and bathymetry models, and a study of Antarctic mass balance, which was published in Science in 1998.

  18. Code Verification Capabilities and Assessments in Support of ASC V&V Level 2 Milestone #6035

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Doebling, Scott William; Budzien, Joanne Louise; Ferguson, Jim Michael

    This document provides a summary of the code verification activities supporting the FY17 Level 2 V&V milestone entitled “Deliver a Capability for V&V Assessments of Code Implementations of Physics Models and Numerical Algorithms in Support of Future Predictive Capability Framework Pegposts.” The physics validation activities supporting this milestone are documented separately. The objectives of this portion of the milestone are: 1) Develop software tools to support code verification analysis; 2) Document standard definitions of code verification test problems; and 3) Perform code verification assessments (focusing on error behavior of algorithms). This report and a set of additional standalone documents serve as the compilation of results demonstrating accomplishment of these objectives.

  19. The politics of verification and the control of nuclear tests, 1945-1980

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gallagher, N.W.

    1990-01-01

    This dissertation addresses two questions: (1) why has agreement been reached on verification regimes to support some arms control accords but not others; and (2) what determines the extent to which verification arrangements promote stable cooperation. This study develops an alternative framework for analysis by examining the politics of verification at two levels. The logical politics of verification are shaped by the structure of the problem of evaluating cooperation under semi-anarchical conditions. The practical politics of verification are driven by players' attempts to use verification arguments to promote their desired security outcome. The historical material shows that agreements on verification regimes are reached when key domestic and international players desire an arms control accord and believe that workable verification will not have intolerable costs. Clearer understanding of how verification is itself a political problem, and how players manipulate it to promote other goals, is necessary if the politics of verification are to support rather than undermine the development of stable cooperation.

  20. Exomars Mission Verification Approach

    NASA Astrophysics Data System (ADS)

    Cassi, Carlo; Gilardi, Franco; Bethge, Boris

    According to the long-term cooperation plan established by ESA and NASA in June 2009, the ExoMars project now consists of two missions. A first mission will be launched in 2016 under ESA lead, with the objectives to demonstrate the European capability to safely land a surface package on Mars, to perform Mars atmosphere investigation, and to provide communication capability for present and future ESA/NASA missions. For this mission ESA provides a spacecraft composite, made up of an Entry Descent & Landing Demonstrator Module (EDM) and a Mars Orbiter Module (OM); NASA provides the launch vehicle and the scientific instruments located on the orbiter for Mars atmosphere characterisation. A second mission, with its launch foreseen in 2018, is led by NASA, who provides the spacecraft, the launcher, the EDL system, and a rover. ESA contributes the ExoMars Rover Module (RM) to provide surface mobility. It includes a drill system allowing drilling down to 2 meters, collecting samples and investigating them for signs of past and present life with exobiological experiments, and investigating the Mars water/geochemical environment. In this scenario Thales Alenia Space Italia, as ESA prime industrial contractor, is in charge of the design, manufacturing, integration and verification of the ESA ExoMars modules, i.e. the spacecraft composite (OM + EDM) for the 2016 mission, the RM for the 2018 mission, and the Rover Operations Control Centre, which will be located at Altec-Turin (Italy). The verification process of the above products is quite complex and will include some peculiarities with limited or no heritage in Europe. Furthermore, the verification approach has to be optimised to allow full verification despite significant schedule and budget constraints. The paper presents the verification philosophy tailored for the ExoMars mission in line with the above considerations, starting from the model philosophy, and showing the verification activities flow and the sharing of tests

  1. Preventing illegal tobacco and alcohol sales to minors through electronic age-verification devices: a field effectiveness study.

    PubMed

    Krevor, Brad; Capitman, John A; Oblak, Leslie; Cannon, Joanna B; Ruwe, Mathilda

    2003-01-01

    Efforts to prohibit the sale of tobacco and alcohol products to minors are widespread. Electronic Age Verification (EAV) devices are one possible means to improve compliance with sales-to-minors laws. The purpose of this study was to evaluate the implementation and effectiveness of EAV devices in terms of the frequency and accuracy of age verification, as well as to examine the impact of EAVs on the retailer environment. Two study locations were selected: Tallahassee, Florida and Iowa City, Iowa. Retail stores were invited to participate in the study, producing a self-selected experimental group. Stores that did not elect to test the EAVs comprised the comparison group. The data sources included: 1) mystery shopper inspections: two pre- and five post-EAV installation mystery shopper inspections of tobacco and alcohol retailers; 2) retail clerk and manager interviews; and 3) customer interviews. The study found that installing EAV devices with minimal training and encouragement did not increase age verification and underage sales refusal. Surveyed clerks reported positive experiences using the electronic ID readers and customers reported almost no discomfort about being asked to swipe their IDs. Observations from this study support the need for a more comprehensive system for responsible retailing.

  2. A hybrid Bayesian hierarchical model combining cohort and case-control studies for meta-analysis of diagnostic tests: Accounting for partial verification bias.

    PubMed

    Ma, Xiaoye; Chen, Yong; Cole, Stephen R; Chu, Haitao

    2016-12-01

    To account for between-study heterogeneity in meta-analysis of diagnostic accuracy studies, bivariate random effects models have been recommended to jointly model the sensitivities and specificities. As study design and population vary, the definition of disease status or severity could differ across studies. Consequently, sensitivity and specificity may be correlated with disease prevalence. To account for this dependence, a trivariate random effects model had been proposed. However, the proposed approach can only include cohort studies with information estimating study-specific disease prevalence. In addition, some diagnostic accuracy studies only select a subset of samples to be verified by the reference test. It is known that ignoring unverified subjects may lead to partial verification bias in the estimation of prevalence, sensitivities, and specificities in a single study. However, the impact of this bias on a meta-analysis has not been investigated. In this paper, we propose a novel hybrid Bayesian hierarchical model combining cohort and case-control studies and correcting partial verification bias at the same time. We investigate the performance of the proposed methods through a set of simulation studies. Two case studies on assessing the diagnostic accuracy of gadolinium-enhanced magnetic resonance imaging in detecting lymph node metastases and of adrenal fluorine-18 fluorodeoxyglucose positron emission tomography in characterizing adrenal masses are presented. © The Author(s) 2014.

  3. A Hybrid Bayesian Hierarchical Model Combining Cohort and Case-control Studies for Meta-analysis of Diagnostic Tests: Accounting for Partial Verification Bias

    PubMed Central

    Ma, Xiaoye; Chen, Yong; Cole, Stephen R.; Chu, Haitao

    2014-01-01

    To account for between-study heterogeneity in meta-analysis of diagnostic accuracy studies, bivariate random effects models have been recommended to jointly model the sensitivities and specificities. As study design and population vary, the definition of disease status or severity could differ across studies. Consequently, sensitivity and specificity may be correlated with disease prevalence. To account for this dependence, a trivariate random effects model had been proposed. However, the proposed approach can only include cohort studies with information estimating study-specific disease prevalence. In addition, some diagnostic accuracy studies only select a subset of samples to be verified by the reference test. It is known that ignoring unverified subjects may lead to partial verification bias in the estimation of prevalence, sensitivities and specificities in a single study. However, the impact of this bias on a meta-analysis has not been investigated. In this paper, we propose a novel hybrid Bayesian hierarchical model combining cohort and case-control studies and correcting partial verification bias at the same time. We investigate the performance of the proposed methods through a set of simulation studies. Two case studies on assessing the diagnostic accuracy of gadolinium-enhanced magnetic resonance imaging in detecting lymph node metastases and of adrenal fluorine-18 fluorodeoxyglucose positron emission tomography in characterizing adrenal masses are presented. PMID:24862512

  4. Generalization of information-based concepts in forecast verification

    NASA Astrophysics Data System (ADS)

    Tödter, J.; Ahrens, B.

    2012-04-01

    This work deals with information-theoretical methods in probabilistic forecast verification. Recent findings concerning the Ignorance Score are briefly reviewed, then the generalization to continuous forecasts is shown. For ensemble forecasts, the presented measures can be calculated exactly. The Brier Score (BS) and its generalizations to the multi-categorical Ranked Probability Score (RPS) and to the Continuous Ranked Probability Score (CRPS) are the prominent verification measures for probabilistic forecasts. Particularly, their decompositions into measures quantifying the reliability, resolution and uncertainty of the forecasts are attractive. Information theory sets up the natural framework for forecast verification. Recently, it has been shown that the BS is a second-order approximation of the information-based Ignorance Score (IGN), which also contains easily interpretable components and can also be generalized to a ranked version (RIGN). Here, the IGN, its generalizations and decompositions are systematically discussed in analogy to the variants of the BS. Additionally, a Continuous Ranked IGN (CRIGN) is introduced in analogy to the CRPS. The applicability and usefulness of the conceptually appealing CRIGN is illustrated, together with an algorithm to evaluate its components (reliability, resolution, and uncertainty) for ensemble-generated forecasts. This is also directly applicable to the more traditional CRPS.

  5. ENVIRONMENTAL TECHNOLOGY VERIFICATION (ETV) PROGRAM CASE STUDIES: DEMONSTRATING PROGRAM OUTCOMES

    EPA Science Inventory

    The U.S. Environmental Protection Agency (EPA) Environmental Technology Verification (ETV) Program evaluates the performance of innovative air, water, pollution prevention and monitoring technologies that have the potential to improve human health and the environment. This bookle...

  6. Microcode Verification Project.

    DTIC Science & Technology

    1980-05-01

    numerical constant. The internal syntax for these minimum and maximum values is REALMIN and REALMAX. ISPSSIMP is the file simplifying bitstring... To be fair, it is quite clear that much of the labor in the verification task can be reduced if verification and code development are carried out... the language we have chosen for both encoding our descriptions of machines and reasoning about the course of computations. Internally, our

  7. Feedback Seeking in Early Adolescence: Self-Enhancement or Self-Verification?

    PubMed

    Rosen, Lisa H; Principe, Connor P; Langlois, Judith H

    2013-02-13

    The authors examined whether early adolescents (N = 90) solicit self-enhancing feedback (i.e., positive feedback) or self-verifying feedback (i.e., feedback congruent with self-views, even when these views are negative). Sixth, seventh, and eighth graders first completed a self-perception measure and then selected whether to receive positive or negative feedback from an unknown peer in different domains of self. Results were consistent with self-verification theory; adolescents who perceived themselves as having both strengths and weaknesses were more likely to seek negative feedback regarding a self-perceived weakness compared to a self-perceived strength. The authors found similar support for self-verification processes when they considered the entire sample regardless of perceived strengths and weaknesses; hierarchical linear modeling (HLM) examined the predictive power of ratings of self-perceived ability, certainty, and importance on feedback seeking for all participants and provided additional evidence of self-verification strivings in adolescence.

  8. Feedback Seeking in Early Adolescence: Self-Enhancement or Self-Verification?

    PubMed Central

    Rosen, Lisa H.; Principe, Connor P.; Langlois, Judith H.

    2012-01-01

    The authors examined whether early adolescents (N = 90) solicit self-enhancing feedback (i.e., positive feedback) or self-verifying feedback (i.e., feedback congruent with self-views, even when these views are negative). Sixth, seventh, and eighth graders first completed a self-perception measure and then selected whether to receive positive or negative feedback from an unknown peer in different domains of self. Results were consistent with self-verification theory; adolescents who perceived themselves as having both strengths and weaknesses were more likely to seek negative feedback regarding a self-perceived weakness compared to a self-perceived strength. The authors found similar support for self-verification processes when they considered the entire sample regardless of perceived strengths and weaknesses; hierarchical linear modeling (HLM) examined the predictive power of ratings of self-perceived ability, certainty, and importance on feedback seeking for all participants and provided additional evidence of self-verification strivings in adolescence. PMID:23543746

  9. Environmental Technology Verification Program - ETV - Case Studies: Demonstrating Program Outcomes

    EPA Science Inventory

    The U.S. Environmental Protection Agency (EPA) Environmental Technology Verification (ETV) Program evaluates the performance of innovative air, water, pollution prevention and monitoring technologies that have the potential to improve human health and the environment. This cd con...

  10. Face verification with balanced thresholds.

    PubMed

    Yan, Shuicheng; Xu, Dong; Tang, Xiaoou

    2007-01-01

    The process of face verification is guided by a pre-learned global threshold, which, however, is often inconsistent with class-specific optimal thresholds. It is, hence, beneficial to pursue a balance of the class-specific thresholds in the model-learning stage. In this paper, we present a new dimensionality reduction algorithm tailored to the verification task that ensures threshold balance. This is achieved by the following aspects. First, feasibility is guaranteed by employing an affine transformation matrix, instead of the conventional projection matrix, for dimensionality reduction, and, hence, we call the proposed algorithm threshold balanced transformation (TBT). Then, the affine transformation matrix, constrained as the product of an orthogonal matrix and a diagonal matrix, is optimized to improve the threshold balance and classification capability in an iterative manner. Unlike most algorithms for face verification which are directly transplanted from face identification literature, TBT is specifically designed for face verification and clarifies the intrinsic distinction between these two tasks. Experiments on three benchmark face databases demonstrate that TBT significantly outperforms the state-of-the-art subspace techniques for face verification.
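
    The constrained transform itself is easy to picture: an orthogonal factor, a diagonal scaling, and an offset, i.e. y = D·Qᵀ·(x − b). The sketch below applies such a transform and measures how unbalanced the class-specific thresholds are; the matrices, the data, and the mean-radius threshold rule are random placeholders, and the paper's iterative optimization of Q and D is omitted.

        import numpy as np

        rng = np.random.default_rng(0)
        d, k = 20, 5
        Q, _ = np.linalg.qr(rng.standard_normal((d, d)))   # orthogonal factor
        D = np.diag(rng.random(k) + 0.5)                   # diagonal factor
        b = rng.standard_normal(d)                         # offset (the affine part)

        def transform(X):
            """Affine dimensionality reduction y = D Q_k^T (x - b)."""
            return (D @ Q[:, :k].T @ (X - b).T).T

        def class_threshold(X):
            """Toy class-specific threshold: mean distance to the class center."""
            Y = transform(X)
            return np.linalg.norm(Y - Y.mean(axis=0), axis=1).mean()

        classes = [rng.standard_normal(d) + 0.1 * rng.standard_normal((40, d))
                   for _ in range(3)]
        thresholds = [class_threshold(X) for X in classes]
        print(thresholds, "imbalance:", np.std(thresholds))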

  11. Space transportation system payload interface verification

    NASA Technical Reports Server (NTRS)

    Everline, R. T.

    1977-01-01

    The paper considers STS payload-interface verification requirements and the capability provided by STS to support verification. The intent is to standardize as many interfaces as possible, not only through the design, development, test and evaluation (DDT and E) phase of the major payload carriers but also into the operational phase. The verification process is discussed in terms of its various elements, such as the Space Shuttle DDT and E (including the orbital flight test program) and the major payload carriers DDT and E (including the first flights). Five tools derived from the Space Shuttle DDT and E are available to support the verification process: mathematical (structural and thermal) models, the Shuttle Avionics Integration Laboratory, the Shuttle Manipulator Development Facility, and interface-verification equipment (cargo-integration test equipment).

  12. EXAMINING THE ROLE AND RESEARCH CHALLENGES OF SOCIAL MEDIA AS A TOOL FOR NONPROLIFERATION AND ARMS CONTROL TREATY VERIFICATION

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Henry, Michael J.; Cramer, Nicholas O.; Benz, Jacob M.

    Traditional arms control treaty verification activities typically involve a combination of technical measurements via physical and chemical sensors, state declarations, political agreements, and on-site inspections involving international subject matter experts. However, the ubiquity of the internet, and the electronic sharing of data that it enables, has made available a wealth of open source information with the potential to benefit verification efforts. Open source information is already being used by organizations such as the International Atomic Energy Agency to support the verification of state-declared information, prepare inspectors for in-field activities, and to maintain situational awareness. The recent explosion in social media use has opened new doors to exploring the attitudes, moods, and activities around a given topic. Social media platforms, such as Twitter, Facebook, and YouTube, offer an opportunity for individuals, as well as institutions, to participate in a global conversation at minimal cost. Social media data can also provide a more data-rich environment, with text data being augmented with images, videos, and location data. The research described in this paper investigates the utility of applying social media signatures as potential arms control and nonproliferation treaty verification tools and technologies, as determined through a series of case studies. The treaty relevant events that these case studies touch upon include detection of undeclared facilities or activities, determination of unknown events recorded by the International Monitoring System (IMS), and the global media response to the occurrence of an Indian missile launch. The case studies examine how social media can be used to fill an information gap and provide additional confidence to a verification activity. The case studies represent, either directly or through a proxy, instances where social media information may be available that could potentially augment the

  13. Verification in Referral-Based Crowdsourcing

    PubMed Central

    Naroditskiy, Victor; Rahwan, Iyad; Cebrian, Manuel; Jennings, Nicholas R.

    2012-01-01

    Online social networks offer unprecedented potential for rallying a large number of people to accomplish a given task. Here we focus on information gathering tasks where rare information is sought through “referral-based crowdsourcing”: the information request is propagated recursively through invitations among members of a social network. Whereas previous work analyzed incentives for the referral process in a setting with only correct reports, misreporting is known to be both pervasive in crowdsourcing applications, and difficult/costly to filter out. A motivating example for our work is the DARPA Red Balloon Challenge where the level of misreporting was very high. In order to undertake a formal study of verification, we introduce a model where agents can exert costly effort to perform verification and false reports can be penalized. This is the first model of verification and it provides many directions for future research, which we point out. Our main theoretical result is the compensation scheme that minimizes the cost of retrieving the correct answer. Notably, this optimal compensation scheme coincides with the winning strategy of the Red Balloon Challenge. PMID:23071530

  14. 25 CFR 61.8 - Verification forms.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... using the last address of record. The verification form will be used to ascertain the previous enrollee... death. Name and/or address changes will only be made if the verification form is signed by an adult... 25 Indians 1 2010-04-01 2010-04-01 false Verification forms. 61.8 Section 61.8 Indians BUREAU OF...

  15. High stakes in INF verification

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Krepon, M.

    1987-06-01

    The stakes involved in negotiating INF verification arrangements are high. While these proposals deal only with intermediate-range ground-launched cruise and mobile missiles, if properly devised they could help pave the way for comprehensive limits on other cruise missiles and strategic mobile missiles. In contrast, poorly drafted monitoring provisions could compromise national industrial security and generate numerous compliance controversies. Any verification regime will require new openness on both sides, but that means significant risks as well as opportunities. US and Soviet negotiators could spend weeks, months, and even years working out in painstaking detail verification provisions for medium-range missiles. Alternatively, if the two sides wished to conclude an INF agreement quickly, they could defer most of the difficult verification issues to the strategic arms negotiations.

  16. Analytical Verifications in Cryogenic Testing of NGST Advanced Mirror System Demonstrators

    NASA Technical Reports Server (NTRS)

    Cummings, Ramona; Levine, Marie; VanBuren, Dave; Kegley, Jeff; Green, Joseph; Hadaway, James; Presson, Joan; Cline, Todd; Stahl, H. Philip (Technical Monitor)

    2002-01-01

    Ground-based testing is a critical and costly part of component, assembly, and system verifications of large space telescopes. At such tests, however, with integral teamwork by planners, analysts, and test personnel, segments can be included to validate specific analytical parameters and algorithms at relatively low additional cost. This paper opens with the strategy of analytical verification segments added to vacuum cryogenic testing of Advanced Mirror System Demonstrator (AMSD) assemblies. These AMSD assemblies incorporate material and architecture concepts being considered in the Next Generation Space Telescope (NGST) design. The test segments for workmanship testing, cold survivability, and cold operation optical throughput are supplemented by segments for analytical verifications of specific structural, thermal, and optical parameters. Utilizing integrated modeling and separate materials testing, the paper continues with the support plan for analyses, data, and observation requirements during the AMSD testing, currently slated for late calendar year 2002 to mid calendar year 2003. The paper includes anomaly resolution as gleaned by the authors from similar analytical verification support of a previous large space telescope, then closes with a draft of plans for parameter extrapolations, to form a well-verified portion of the integrated modeling being done for NGST performance predictions.

  17. A verification library for multibody simulation software

    NASA Technical Reports Server (NTRS)

    Kim, Sung-Soo; Haug, Edward J.; Frisch, Harold P.

    1989-01-01

    A multibody dynamics verification library that maintains and manages test and validation data is proposed, based on RRC robot arm and CASE backhoe validation and a comparative study of DADS, DISCOS, and CONTOPS, which are existing public-domain and commercial multibody dynamics simulation programs. Using simple representative problems, simulation results from each program are cross-checked, and the validation results are presented. Functionalities of the verification library are defined in order to automate the validation procedure.

  18. Authoring and verification of clinical guidelines: a model driven approach.

    PubMed

    Pérez, Beatriz; Porres, Ivan

    2010-08-01

    The goal of this research is to provide a framework to enable authoring and verification of clinical guidelines. The framework is part of a larger research project aimed at improving the representation, quality and application of clinical guidelines in daily clinical practice. The verification process of a guideline is based on (1) model checking techniques to verify guidelines against semantic errors and inconsistencies in their definition, (2) combined with Model Driven Development (MDD) techniques, which enable us to automatically process manually created guideline specifications and temporal-logic statements to be checked and verified against these specifications, making the verification process faster and more cost-effective. In particular, we use UML statecharts to represent the dynamics of guidelines and, based on these manually defined guideline specifications, we use an MDD-based tool chain to automatically process them to generate the input model of a model checker. The model checker takes the resulting model together with the specific guideline requirements and verifies whether the guideline fulfils such properties. The overall framework has been implemented as an Eclipse plug-in named GBDSSGenerator which, starting from the UML statechart representing a guideline, allows the verification of the guideline against specific requirements. Additionally, we have established a pattern-based approach for defining commonly occurring types of requirements in guidelines. We have successfully validated our overall approach by verifying properties in different clinical guidelines, resulting in the detection of some inconsistencies in their definition. The proposed framework allows (1) the authoring and (2) the verification of clinical guidelines against specific requirements defined based on a set of property specification patterns, enabling non-experts to easily write formal specifications and thus easing the verification process. Copyright 2010 Elsevier Inc. All rights reserved.

  19. The formal verification of generic interpreters

    NASA Technical Reports Server (NTRS)

    Windley, P.; Levitt, K.; Cohen, G. C.

    1991-01-01

    Task assignment 3 of the design and validation of digital flight control systems suitable for fly-by-wire applications is studied. Task 3 is associated with the formal verification of embedded systems. In particular, results are presented that provide a methodological approach to microprocessor verification. A hierarchical decomposition strategy for specifying microprocessors is also presented. A theory of generic interpreters is presented that can be used to model microprocessor behavior. The generic interpreter theory abstracts away the details of instruction functionality, leaving a general model of what an interpreter does.

  20. Environmental Testing Campaign and Verification of Satellite Deimos-2 at INTA

    NASA Astrophysics Data System (ADS)

    Hernandez, Daniel; Vazquez, Mercedes; Anon, Manuel; Olivo, Esperanza; Gallego, Pablo; Morillo, Pablo; Parra, Javier; Capraro; Luengo, Mar; Garcia, Beatriz; Villacorta, Pablo

    2014-06-01

    In this paper the environmental test campaign and verification of the DEIMOS-2 (DM2) satellite are presented and described. DM2 will be ready for launch in 2014. Firstly, a short description of the satellite is presented, including its physical characteristics and intended optical performance. DEIMOS-2 is a LEO satellite for earth observation that will provide high resolution imaging services for agriculture, civil protection, environmental issues, disaster monitoring, climate change, urban planning, cartography, security and intelligence. Then, the verification and test campaign carried out on the SM and FM models at INTA is described, including mechanical tests for the SM and climatic, mechanical and electromagnetic compatibility tests for the FM. In addition, this paper includes centre of gravity and moment of inertia measurements for both models, and other verification activities carried out in order to ensure the satellite's health during launch and its in-orbit performance.

  1. CD volume design and verification

    NASA Technical Reports Server (NTRS)

    Li, Y. P.; Hughes, J. S.

    1993-01-01

    In this paper, we describe a prototype for CD-ROM volume design and verification. This prototype allows users to create their own model of CD volumes by modifying a prototypical model. Rule-based verification of the test volumes can then be performed against the volume definition. This working prototype has proven the concept of model-driven, rule-based design and verification for large quantities of data. The model defined for the CD-ROM volumes becomes a data model as well as an executable specification.

  2. [Uniqueness seeking behavior as a self-verification: an alternative approach to the study of uniqueness].

    PubMed

    Yamaoka, S

    1995-06-01

    Uniqueness theory explains that extremely high perceived similarity between self and others evokes negative emotional reactions and causes uniqueness seeking behavior. However, the theory conceptualizes similarity so ambiguously that it appears to suffer from low predictive validity. The purpose of the current article is to propose an alternative explanation of uniqueness seeking behavior. It posits that perceived uniqueness deprivation is a threat to self-concepts, and therefore causes self-verification behavior. Two levels of self-verification are conceived: one based on personal categorization and the other on social categorization. The present approach regards uniqueness seeking behavior as personal-level self-verification. To test these propositions, a 2 (very high or moderate similarity information) x 2 (with or without outgroup information) x 2 (high or low need for uniqueness) between-subject factorial-design experiment was conducted with 95 university students. Results supported the self-verification approach, and were discussed in terms of effects of uniqueness deprivation, levels of self-categorization, and individual differences in need for uniqueness.

  3. A Design Verification of the Parallel Pipelined Image Processings

    NASA Astrophysics Data System (ADS)

    Wasaki, Katsumi; Harai, Toshiaki

    2008-11-01

    This paper presents a case study of the design and verification of a parallel, pipelined image processing unit based on an extended Petri net, called a Logical Colored Petri Net (LCPN). This net is suitable for Flexible Manufacturing System (FMS) modeling and discussion of structural properties. LCPN is another family of colored place/transition net (CPN) with the addition of the following features: integer value assignment of marks, representation of firing conditions as formulae based on marks' values, and coupling of output procedures with transition firing. Therefore, to study the behavior of a system modeled with this net, we provide a means of searching the reachability tree for markings.

  4. Monitoring and verification R&D

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Pilat, Joseph F; Budlong - Sylvester, Kory W; Fearey, Bryan L

    2011-01-01

    The 2010 Nuclear Posture Review (NPR) report outlined the Administration's approach to promoting the agenda put forward by President Obama in Prague on April 5, 2009. The NPR calls for a national monitoring and verification R&D program to meet future challenges arising from the Administration's nonproliferation, arms control and disarmament agenda. Verification of a follow-on to New START could have to address warheads and possibly components along with delivery capabilities. Deeper cuts and disarmament would need to address all of these elements along with nuclear weapon testing, nuclear material and weapon production facilities, virtual capabilities from old weapon and existing energy programs and undeclared capabilities. We only know how to address some elements of these challenges today, and the requirements may be more rigorous in the context of deeper cuts as well as disarmament. Moreover, there is a critical need for multiple options to sensitive problems and to address other challenges. There will be other verification challenges in a world of deeper cuts and disarmament, some of which we are already facing. At some point, if the reductions process is progressing, uncertainties about past nuclear materials and weapons production will have to be addressed. IAEA safeguards will need to continue to evolve to meet current and future challenges, and to take advantage of new technologies and approaches. Transparency/verification of nuclear and dual-use exports will also have to be addressed, and there will be a need to make nonproliferation measures more watertight and transparent. In this context, and recognizing we will face all of these challenges even if disarmament is not achieved, this paper will explore possible agreements and arrangements; verification challenges; gaps in monitoring and verification technologies and approaches; and the R&D required to address these gaps and other monitoring and verification challenges.

  5. Hard and Soft Safety Verifications

    NASA Technical Reports Server (NTRS)

    Wetherholt, Jon; Anderson, Brenda

    2012-01-01

    The purpose of this paper is to examine the differences between and the effects of hard and soft safety verifications. Initially, the terminology should be defined and clarified. A hard safety verification is a datum which demonstrates how a safety control is enacted. An example of this is relief valve testing. A soft safety verification is something usually described as nice to have, but not necessary to prove safe operation. An example of a soft verification is the loss of the Solid Rocket Booster (SRB) casings from Shuttle flight STS-4. When the main parachutes failed, the casings impacted the water and sank. In the nose cap of the SRBs, video cameras recorded the release of the parachutes to determine safe operation and to provide information for potential anomaly resolution. Generally, examination of the casings and nozzles contributed to understanding of the newly developed boosters and their operation. Safety verification of SRB operation was demonstrated by examination for erosion or wear of the casings and nozzle. Loss of the SRBs and associated data did not delay the launch of the next Shuttle flight.

  6. Verification of forecast ensembles in complex terrain including observation uncertainty

    NASA Astrophysics Data System (ADS)

    Dorninger, Manfred; Kloiber, Simon

    2017-04-01

    Traditionally, verification means to verify a forecast (ensemble) against the truth represented by observations. The observation errors are quite often neglected, on the argument that they are small compared to the forecast error. In this study, part of the MesoVICT (Mesoscale Verification Inter-comparison over Complex Terrain) project, it will be shown that observation errors have to be taken into account for verification purposes. The observation uncertainty is estimated from the VERA (Vienna Enhanced Resolution Analysis) and represented via two analysis ensembles which are compared to the forecast ensemble. For the whole study, results from COSMO-LEPS provided by Arpae-SIMC Emilia-Romagna are used as the forecast ensemble. The time period covers the MesoVICT core case from 20-22 June 2007. In a first step, all ensembles are investigated concerning their distribution. Several tests have been executed (Kolmogorov-Smirnov test, Finkelstein-Schafer test, chi-square test, etc.), none of which identified an exact mathematical distribution. So the main focus is on non-parametric statistics (e.g., kernel density estimation, boxplots) and also the deviation between "forced" normally distributed data and the kernel density estimates. In a next step, the observational deviations due to the analysis ensembles are analysed. In a first approach, scores are calculated multiple times, with each member of the analysis ensemble in turn regarded as the "true" observation. The results are presented as boxplots for the different scores and parameters. Additionally, the bootstrapping method is applied to the ensembles. These possible approaches to incorporating observational uncertainty into the computation of statistics will be discussed in the talk.
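
    The member-by-member scoring approach described above can be sketched in a few lines. The shapes and values below are hypothetical stand-ins for the COSMO-LEPS forecast ensemble and the VERA-based analysis ensemble; the study's actual scores and data are not reproduced here.

        import numpy as np

        rng = np.random.default_rng(0)
        # Hypothetical ensembles over 500 grid points: 20 forecast members,
        # 8 analysis members representing observation uncertainty.
        forecast = rng.normal(15.0, 2.0, size=(20, 500))
        analysis = rng.normal(15.2, 0.5, size=(8, 500))

        def rmse(field, truth):
            return np.sqrt(np.mean((field - truth) ** 2))

        # Treat each analysis member in turn as the "true" observation,
        # turning a single score into a distribution of scores.
        scores = [rmse(forecast.mean(axis=0), member) for member in analysis]
        print(f"RMSE range due to observation uncertainty: "
              f"{min(scores):.2f}-{max(scores):.2f}")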

  7. Optimal Verification of Entangled States with Local Measurements

    NASA Astrophysics Data System (ADS)

    Pallister, Sam; Linden, Noah; Montanaro, Ashley

    2018-04-01

    Consider the task of verifying that a given quantum device, designed to produce a particular entangled state, does indeed produce that state. One natural approach would be to characterize the output state by quantum state tomography, or alternatively, to perform some kind of Bell test, tailored to the state of interest. We show here that neither approach is optimal among local verification strategies for 2-qubit states. We find the optimal strategy in this case and show that quadratically fewer total measurements are needed to verify to within a given fidelity than in published results for quantum state tomography, Bell test, or fidelity estimation protocols. We also give efficient verification protocols for any stabilizer state. Additionally, we show that requiring that the strategy be constructed from local, nonadaptive, and noncollective measurements only incurs a constant-factor penalty over a strategy without these restrictions.

  8. Analyzing Personalized Policies for Online Biometric Verification

    PubMed Central

    Sadhwani, Apaar; Yang, Yan; Wein, Lawrence M.

    2014-01-01

    Motivated by India’s nationwide biometric program for social inclusion, we analyze verification (i.e., one-to-one matching) in the case where we possess similarity scores for 10 fingerprints and two irises between a resident’s biometric images at enrollment and his biometric images during his first verification. At subsequent verifications, we allow individualized strategies based on these 12 scores: we acquire a subset of the 12 images, get new scores for this subset that quantify the similarity to the corresponding enrollment images, and use the likelihood ratio (i.e., the likelihood of observing these scores if the resident is genuine divided by the corresponding likelihood if the resident is an imposter) to decide whether a resident is genuine or an imposter. We also consider two-stage policies, where additional images are acquired in a second stage if the first-stage results are inconclusive. Using performance data from India’s program, we develop a new probabilistic model for the joint distribution of the 12 similarity scores and find near-optimal individualized strategies that minimize the false reject rate (FRR) subject to constraints on the false accept rate (FAR) and mean verification delay for each resident. Our individualized policies achieve the same FRR as a policy that acquires (and optimally fuses) 12 biometrics for each resident, which represents a five (four, respectively) log reduction in FRR relative to fingerprint (iris, respectively) policies previously proposed for India’s biometric program. The mean delay is sec for our proposed policy, compared to 30 sec for a policy that acquires one fingerprint and 107 sec for a policy that acquires all 12 biometrics. This policy acquires iris scans from 32–41% of residents (depending on the FAR) and acquires an average of 1.3 fingerprints per resident. PMID:24787752

  9. Analyzing personalized policies for online biometric verification.

    PubMed

    Sadhwani, Apaar; Yang, Yan; Wein, Lawrence M

    2014-01-01

    Motivated by India's nationwide biometric program for social inclusion, we analyze verification (i.e., one-to-one matching) in the case where we possess similarity scores for 10 fingerprints and two irises between a resident's biometric images at enrollment and his biometric images during his first verification. At subsequent verifications, we allow individualized strategies based on these 12 scores: we acquire a subset of the 12 images, get new scores for this subset that quantify the similarity to the corresponding enrollment images, and use the likelihood ratio (i.e., the likelihood of observing these scores if the resident is genuine divided by the corresponding likelihood if the resident is an imposter) to decide whether a resident is genuine or an imposter. We also consider two-stage policies, where additional images are acquired in a second stage if the first-stage results are inconclusive. Using performance data from India's program, we develop a new probabilistic model for the joint distribution of the 12 similarity scores and find near-optimal individualized strategies that minimize the false reject rate (FRR) subject to constraints on the false accept rate (FAR) and mean verification delay for each resident. Our individualized policies achieve the same FRR as a policy that acquires (and optimally fuses) 12 biometrics for each resident, which represents a five (four, respectively) log reduction in FRR relative to fingerprint (iris, respectively) policies previously proposed for India's biometric program. The mean delay is [Formula: see text] sec for our proposed policy, compared to 30 sec for a policy that acquires one fingerprint and 107 sec for a policy that acquires all 12 biometrics. This policy acquires iris scans from 32-41% of residents (depending on the FAR) and acquires an average of 1.3 fingerprints per resident.
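
    A minimal sketch of the likelihood-ratio decision rule described in both records above. The Gaussian score densities and the independence of scores are simplifying assumptions for illustration; the paper fits a richer joint distribution to program data.

        import numpy as np
        from scipy.stats import norm

        # Assumed (illustrative) score models for genuine users and imposters.
        GENUINE = norm(loc=0.8, scale=0.10)
        IMPOSTER = norm(loc=0.3, scale=0.15)

        def likelihood_ratio(scores):
            # Product of per-score density ratios, assuming independence.
            scores = np.asarray(scores)
            return float(np.prod(GENUINE.pdf(scores) / IMPOSTER.pdf(scores)))

        def verify(scores, threshold=1.0):
            # Accept when observing these scores is likelier under "genuine".
            return "accept" if likelihood_ratio(scores) > threshold else "reject"

        print(verify([0.75, 0.82]))  # two acquired similarity scores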

  10. Towards the formal verification of the requirements and design of a processor interface unit: HOL listings

    NASA Technical Reports Server (NTRS)

    Fura, David A.; Windley, Phillip J.; Cohen, Gerald C.

    1993-01-01

    This technical report contains the Higher-Order Logic (HOL) listings of the partial verification of the requirements and design for a commercially developed processor interface unit (PIU). The PIU is an interface chip performing memory interface, bus interface, and additional support services for a commercial microprocessor within a fault tolerant computer system. This system, the Fault Tolerant Embedded Processor (FTEP), is targeted towards applications in avionics and space requiring extremely high levels of mission reliability, extended maintenance-free operation, or both. This report contains the actual HOL listings of the PIU verification as it currently exists. Section 2 of this report contains general-purpose HOL theories and definitions that support the PIU verification. These include arithmetic theories dealing with inequalities and associativity, and a collection of tactics used in the PIU proofs. Section 3 contains the HOL listings for the completed PIU design verification. Section 4 contains the HOL listings for the partial requirements verification of the P-Port.

  11. 40 CFR 1066.240 - Torque transducer verification.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... POLLUTION CONTROLS VEHICLE-TESTING PROCEDURES Dynamometer Specifications § 1066.240 Torque transducer verification. Verify torque-measurement systems by performing the verifications described in §§ 1066.270 and... 40 Protection of Environment 33 2014-07-01 2014-07-01 false Torque transducer verification. 1066...

  12. Compromises produced by the dialectic between self-verification and self-enhancement.

    PubMed

    Morling, B; Epstein, S

    1997-12-01

    Three studies of people's reactions to evaluative feedback demonstrated that the dialectic between self-enhancement and self-verification results in compromises between these 2 motives, as hypothesized in cognitive-experiential self-theory. The demonstration was facilitated by 2 procedural improvements: Enhancement and verification were established by calibrating evaluative feedback against self-appraisals, and degree of enhancement and of verification were varied along a continuum, rather than categorically. There was also support for the hypotheses that processing in an intuitive-experiential mode favors enhancement and processing in an analytical-rational mode favors verification in the kinds of situations investigated.

  13. Verification of EPA's " Preliminary remediation goals for radionuclides" (PRG) electronic calculator

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Stagich, B. H.

    The U.S. Environmental Protection Agency (EPA) requested an external, independent verification study of their "Preliminary Remediation Goals for Radionuclides" (PRG) electronic calculator. The calculator provides information on establishing PRGs for radionuclides at Comprehensive Environmental Response, Compensation, and Liability Act (CERCLA) sites with radioactive contamination (Verification Study Charge, Background). These risk-based PRGs set concentration limits using carcinogenic toxicity values under specific exposure conditions (PRG User's Guide, Section 1). The purpose of this verification study is to ascertain that the computer code has no inherent numerical problems with obtaining solutions and to ensure that the equations are programmed correctly.

  14. Multibody modeling and verification

    NASA Technical Reports Server (NTRS)

    Wiens, Gloria J.

    1989-01-01

    A summary of a ten week project on flexible multibody modeling, verification and control is presented. Emphasis was on the need for experimental verification. A literature survey was conducted to gather information on the existence of experimental work related to flexible multibody systems. The first portion of the assigned task encompassed the modeling aspects of flexible multibodies that can undergo large angular displacements. Research in the area of modeling aspects was also surveyed, with special attention given to the component mode approach. Resulting from this is a research plan on various modeling aspects to be investigated over the next year. The relationship between the large angular displacements, boundary conditions, mode selection, and system modes is of particular interest. The other portion of the assigned task was the generation of a test plan for experimental verification of analytical and/or computer analysis techniques used for flexible multibody systems. Based on current and expected frequency ranges of flexible multibody systems to be used in space applications, an initial test article was selected and designed. A preliminary TREETOPS computer analysis was run to ensure frequency content in the low frequency range, 0.1 to 50 Hz. The initial specifications of experimental measurement and instrumentation components were also generated. Resulting from this effort is the initial multi-phase plan for a Ground Test Facility of Flexible Multibody Systems for Modeling Verification and Control. The plan focuses on the Multibody Modeling and Verification (MMV) Laboratory. General requirements of the Unobtrusive Sensor and Effector (USE) and the Robot Enhancement (RE) laboratories were considered during the laboratory development.

  15. Ascertainment and verification of diabetes in the EPIC-NL study.

    PubMed

    Sluijs, I; van der A, D L; Beulens, J W J; Spijkerman, A M W; Ros, M M; Grobbee, D E; van der Schouw, Y T

    2010-08-01

    The objectives of this study were to describe in detail the ascertainment and verification of prevalent and incident diabetes in the Dutch contributor to the European Prospective Investigation into Cancer and Nutrition (EPIC-NL cohort) and to examine to what extent ascertained diabetes agreed with general practitioner (GP) and pharmacy records. In total, 40,011 adults, aged 21 to 70 years at baseline, were included. Diabetes was ascertained via self-report, linkage to registers of hospital discharge diagnoses (HDD) and a urinary glucose strip test. Ascertained diabetes cases were verified against GP or pharmacist information using mailed questionnaires. At baseline, 795 (2.0%) diabetes cases were ascertained, and 1494 (3.7%) during a mean follow-up of ten years. The majority was ascertained via self-report only (56.7%), or self-report in combination with HDD (18.0%). After verification of ascertained diabetes cases, 1532 (66.9%) [corrected] were defined as having diabetes, 495 (21.6%) as non-diabetic individuals, and 262 (11.5%) as uncertain. Of the 1538 cases ascertained by self-report, 1350 (positive predictive value: 87.8%) were confirmed by GP or pharmacist. Cases ascertained via self-report in combination with HDD were most often confirmed (334; positive predictive value: 96.0%). Two out of three ascertained diabetes cases were confirmed to have been diagnosed with diabetes by their GP or pharmacist. Diabetes cases ascertained via self-report in combination with HDD had the highest confirmation rate.
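
    The abstract's headline positive predictive value can be reproduced directly from the reported counts; a one-line check:

        # Numbers from the abstract: 1350 of 1538 self-reported cases confirmed.
        confirmed, self_reported = 1350, 1538
        print(f"PPV = {confirmed / self_reported:.1%}")  # -> 87.8%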

  16. Deductive Verification of Cryptographic Software

    NASA Technical Reports Server (NTRS)

    Almeida, Jose Barcelar; Barbosa, Manuel; Pinto, Jorge Sousa; Vieira, Barbara

    2009-01-01

    We report on the application of an off-the-shelf verification platform to the RC4 stream cipher cryptographic software implementation (as available in the openSSL library), and introduce a deductive verification technique based on self-composition for proving the absence of error propagation.

  17. 14 CFR 211.11 - Verification.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 14 Aeronautics and Space 4 2010-01-01 2010-01-01 false Verification. 211.11 Section 211.11 Aeronautics and Space OFFICE OF THE SECRETARY, DEPARTMENT OF TRANSPORTATION (AVIATION PROCEEDINGS) ECONOMIC REGULATIONS APPLICATIONS FOR PERMITS TO FOREIGN AIR CARRIERS General Requirements § 211.11 Verification...

  18. Verification of operational solar flare forecast: Case of Regional Warning Center Japan

    NASA Astrophysics Data System (ADS)

    Kubo, Yûki; Den, Mitsue; Ishii, Mamoru

    2017-08-01

    In this article, we discuss a verification study of an operational solar flare forecast in the Regional Warning Center (RWC) Japan. The RWC Japan has been issuing four-categorical deterministic solar flare forecasts for a long time. In this forecast verification study, we used solar flare forecast data accumulated over 16 years (from 2000 to 2015). We compiled the forecast data together with solar flare data obtained with the Geostationary Operational Environmental Satellites (GOES). Using the compiled data sets, we estimated some conventional scalar verification measures with 95% confidence intervals. We also estimated a multi-categorical scalar verification measure. These scalar verification measures were compared with those obtained by the persistence method and the recurrence method. As solar activity varied during the 16 years, we also applied verification analyses to four subsets of forecast-observation pair data with different solar activity levels. We cannot conclude definitely that there are significant performance differences between the forecasts of RWC Japan and the persistence method, although a slightly significant difference is found for some event definitions. We propose to use a scalar verification measure to assess the judgment skill of the operational solar flare forecast. Finally, we propose a verification strategy for deterministic operational solar flare forecasting. For dichotomous forecasts, the proposed set of verification measures comprises frequency bias for bias; proportion correct and critical success index for accuracy; probability of detection for discrimination; false alarm ratio for reliability; Peirce skill score for forecast skill; and symmetric extremal dependence index for association. For multi-categorical forecasts, we propose a set of verification measures comprising marginal distributions of forecast and observation for bias, proportion correct for accuracy, correlation coefficient and joint probability distribution for association, the
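
    The dichotomous measures proposed in this record all derive from a 2x2 contingency table of forecasts versus observed flares. A sketch with the standard definitions follows; the counts are illustrative, not the RWC Japan data.

        def dichotomous_scores(hits, false_alarms, misses, correct_negatives):
            # a = hits, b = false alarms, c = misses, d = correct negatives.
            a, b, c, d = hits, false_alarms, misses, correct_negatives
            n = a + b + c + d
            return {
                "frequency_bias": (a + b) / (a + c),
                "proportion_correct": (a + d) / n,
                "critical_success_index": a / (a + b + c),
                "probability_of_detection": a / (a + c),
                "false_alarm_ratio": b / (a + b),
                # Peirce skill score = POD - probability of false detection.
                "peirce_skill_score": a / (a + c) - b / (b + d),
            }

        print(dichotomous_scores(hits=42, false_alarms=18, misses=11,
                                 correct_negatives=929))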

  19. Aerospace Nickel-cadmium Cell Verification

    NASA Technical Reports Server (NTRS)

    Manzo, Michelle A.; Strawn, D. Michael; Hall, Stephen W.

    2001-01-01

    During the early years of satellites, NASA successfully flew "NASA-Standard" nickel-cadmium (Ni-Cd) cells manufactured by GE/Gates/SAFT on a variety of spacecraft. In 1992 a NASA Battery Review Board determined that the strategy of a NASA Standard Cell and Battery Specification and the accompanying NASA control of a standard manufacturing control document (MCD) for Ni-Cd cells and batteries was unwarranted. As a result of that determination, standards were abandoned and the use of cells other than the NASA Standard was required. In order to gain insight into the performance and characteristics of the various aerospace Ni-Cd products available, tasks were initiated within the NASA Aerospace Flight Battery Systems Program that involved the procurement and testing of representative aerospace Ni-Cd cell designs. A standard set of test conditions was established in order to provide similar information about the products from various vendors. The objective of this testing was to provide independent verification of representative commercial flight cells available in the marketplace today. This paper will provide a summary of the verification tests run on cells from various manufacturers: Sanyo 35 ampere-hour (Ah) standard and 35 Ah advanced Ni-Cd cells, SAFT 50 Ah Ni-Cd cells, and Eagle-Picher 21 Ah Magnum and 21 Ah Super Ni-Cd™ cells were put through a full evaluation. A limited number of 18 and 55 Ah cells from Acme Electric were also tested to provide an initial evaluation of the Acme aerospace cell designs. Additionally, 35 Ah aerospace design Ni-MH cells from Sanyo were evaluated under the standard conditions established for this program. The test program is essentially complete. The cell design parameters, the verification test plan and the details of the test results will be discussed.

  20. Automated verification of flight software. User's manual

    NASA Technical Reports Server (NTRS)

    Saib, S. H.

    1982-01-01

    AVFS (Automated Verification of Flight Software), a collection of tools for analyzing source programs written in FORTRAN and AED, is documented. The quality and the reliability of flight software are improved by: (1) indented listings of source programs, (2) static analysis to detect inconsistencies in the use of variables and parameters, (3) automated documentation, (4) instrumentation of source code, (5) retesting guidance, (6) analysis of assertions, (7) symbolic execution, (8) generation of verification conditions, and (9) simplification of verification conditions. Use of AVFS in the verification of flight software is described.

  1. ENVIRONMENTAL TECHNOLOGY VERIFICATION REPORT - PERFORMANCE VERIFICATION OF THE W.L. GORE & ASSOCIATES GORE-SORBER SCREENING SURVEY

    EPA Science Inventory

    The U.S. Environmental Protection Agency (EPA) has created the Environmental Technology Verification Program (ETV) to facilitate the deployment of innovative or improved environmental technologies through performance verification and dissemination of information. The goal of the...

  2. 18 CFR 158.5 - Verification.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... 18 Conservation of Power and Water Resources 1 2010-04-01 2010-04-01 false Verification. 158.5 Section 158.5 Conservation of Power and Water Resources FEDERAL ENERGY REGULATORY COMMISSION, DEPARTMENT....5 Verification. The facts stated in the memorandum must be sworn to by persons having knowledge...

  3. [Determinants of task preferences when performance is indicative of individual characteristics: self-assessment motivation and self-verification motivation].

    PubMed

    Numazaki, M; Kudo, E

    1995-04-01

    The present study was conducted to examine determinants of information-gathering behavior with regard to one's own characteristics. Four tasks with different self-congruent and incongruent diagnosticity were presented to subjects. As self-assessment theory predicted, high diagnostic tasks were preferred to low diagnostic tasks. And as self-verification theory predicted, self-congruent diagnosticity had a stronger effect on task preference than self-incongruent diagnosticity. In addition, subjects who perceived the relevant characteristics as important were inclined to choose self-assessment behavior more than those who did not. Also, subjects who were certain of their self-concept were inclined to choose self-verification behavior more than those who were not. These results suggest that both self-assessment and self-verification motivations play important roles in information-gathering behavior regarding one's characteristics, and that the strength of these motivations is determined by the importance of the relevant characteristics or the certainty of the self-concept.

  4. A new method to address verification bias in studies of clinical screening tests: cervical cancer screening assays as an example.

    PubMed

    Xue, Xiaonan; Kim, Mimi Y; Castle, Philip E; Strickler, Howard D

    2014-03-01

    Studies to evaluate clinical screening tests often face the problem that the "gold standard" diagnostic approach is costly and/or invasive. It is therefore common to verify only a subset of negative screening tests using the gold standard method. However, undersampling the screen negatives can lead to substantial overestimation of the sensitivity and underestimation of the specificity of the diagnostic test. Our objective was to develop a simple and accurate statistical method to address this "verification bias." We developed a weighted generalized estimating equation approach to estimate, in a single model, the accuracy (eg, sensitivity/specificity) of multiple assays and simultaneously compare results between assays while addressing verification bias. This approach can be implemented using standard statistical software. Simulations were conducted to assess the proposed method. An example is provided using a cervical cancer screening trial that compared the accuracy of human papillomavirus and Pap tests, with histologic data as the gold standard. The proposed approach performed well in estimating and comparing the accuracy of multiple assays in the presence of verification bias. The proposed approach is an easy to apply and accurate method for addressing verification bias in studies of multiple screening methods. Copyright © 2014 Elsevier Inc. All rights reserved.
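
    The core weighting idea behind the paper's approach can be sketched as follows: verified subjects are up-weighted by the inverse of their verification probability, so that the under-sampled screen-negatives count fully. This is a minimal inverse-probability-weighted estimator, not the published GEE model; the data and verification fraction below are invented for illustration.

        import numpy as np

        P_NEG = 0.5  # assumed probability that a screen-negative is verified

        screen   = np.array([1, 1, 1, 0, 0, 0, 0, 0])  # 1 = screen positive
        disease  = np.array([1, 1, 0, 1, 0, 0, 0, 0])  # gold standard (if known)
        verified = np.array([1, 1, 1, 1, 1, 1, 0, 0])  # 0 = never verified

        def weighted_sens_spec(screen, disease, verified, p_neg=P_NEG):
            # Weight = 1 / Pr(verified); screen-positives are always verified.
            w = np.where(screen == 1, 1.0, 1.0 / p_neg)
            m = verified == 1
            w, s, d = w[m], screen[m], disease[m]
            sens = np.sum(w * s * d) / np.sum(w * d)
            spec = np.sum(w * (1 - s) * (1 - d)) / np.sum(w * (1 - d))
            return sens, spec

        print(weighted_sens_spec(screen, disease, verified))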

  5. Ramifications of the Children's Surgery Verification Program for Patients and Hospitals.

    PubMed

    Baxter, Katherine J; Gale, Bonnie F; Travers, Curtis D; Heiss, Kurt F; Raval, Mehul V

    2018-05-01

    The American College of Surgeons in 2015 instituted the Children's Surgery Verification program delineating requirements for hospitals providing pediatric surgical care. Our purpose was to examine possible effects of the Children's Surgery Verification program by evaluating neonates undergoing high-risk operations. Using the Kids' Inpatient Database 2009, we identified infants undergoing operations for 5 high-risk neonatal conditions. We considered all children's hospitals and children's units to be Level I centers and considered all others Level II/III. We estimated the number of neonates requiring relocation and the additional distance traveled. We used propensity score adjusted logistic regression to model mortality at Level I vs Level II/III hospitals. Overall, 7,938 neonates were identified across 21 states at 91 Level I and 459 Level II/III hospitals. Based on our classifications, 2,744 (34.6%) patients would need to relocate to Level I centers. The median additional distance traveled was 6.6 miles. The maximum distance traveled varied by state, from <55 miles (New Jersey and Rhode Island) to >200 miles (Montana, Oregon, Colorado, and California). The adjusted odds ratio for mortality at Level II/III vs Level I centers was 1.67 (95% CI 1.44 to 1.93). We estimate 1 life would be saved for every 32 neonates moved. Although this conservative estimate demonstrates that more than one-third of complex surgical neonates in 2009 would have needed to relocate under the Children's Surgery Verification program, the additional distance traveled is relatively short for most but not all, and this program might improve mortality. Local level ramifications of this novel national program require additional investigation. Copyright © 2018 American College of Surgeons. Published by Elsevier Inc. All rights reserved.

  6. Expert system verification and validation study. Delivery 3A and 3B: Trip summaries

    NASA Technical Reports Server (NTRS)

    French, Scott

    1991-01-01

    Key results are documented from attending the 4th workshop on verification, validation, and testing. The most interesting part of the workshop was when representatives from the U.S., Japan, and Europe presented surveys of VV&T within their respective regions. Another interesting part focused on current efforts to define industry standards for artificial intelligence and how that might affect approaches to VV&T of expert systems. The next part of the workshop focused on VV&T methods of applying mathematical techniques to verification of rule bases and techniques for capturing information relating to the process of developing software. The final part focused on software tools. A summary is also presented of the EPRI conference on 'Methodologies, Tools, and Standards for Cost Effective Reliable Software Verification and Validation'. The conference was divided into discussion sessions on the following issues: development process, automated tools, software reliability, methods, standards, and cost/benefit considerations.

  7. Code Verification Results of an LLNL ASC Code on Some Tri-Lab Verification Test Suite Problems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Anderson, S R; Bihari, B L; Salari, K

    As scientific codes become more complex and involve larger numbers of developers and algorithms, chances for algorithmic implementation mistakes increase. In this environment, code verification becomes essential to building confidence in the code implementation. This paper will present the first results of a new code verification effort within LLNL's B Division. In particular, we will show results of code verification of the LLNL ASC ARES code on the test problems: Su-Olson non-equilibrium radiation diffusion, Sod shock tube, Sedov point blast modeled with shock hydrodynamics, and Noh implosion.

  8. CHEMICAL INDUCTION MIXER VERIFICATION - ENVIRONMENTAL TECHNOLOGY VERIFICATION PROGRAM

    EPA Science Inventory

    The Wet-Weather Flow Technologies Pilot of the Environmental Technology Verification (ETV) Program, which is supported by the U.S. Environmental Protection Agency and facilitated by NSF International, has recently evaluated the performance of chemical induction mixers used for di...

  9. 78 FR 56266 - Consent Based Social Security Number Verification (CBSV) Service

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-09-12

    ... developed CBSV as a user-friendly, internet-based application with safeguards that protect the public's information. In addition to the benefit of providing high volume, centralized SSN verification services to users in a secure manner, CBSV provides us with cost and workload management benefits. New Information...

  10. 4D offline PET-based treatment verification in scanned ion beam therapy: a phantom study

    NASA Astrophysics Data System (ADS)

    Kurz, Christopher; Bauer, Julia; Unholtz, Daniel; Richter, Daniel; Stützer, Kristin; Bert, Christoph; Parodi, Katia

    2015-08-01

    At the Heidelberg Ion-Beam Therapy Center, patient irradiation with scanned proton and carbon ion beams is verified by offline positron emission tomography (PET) imaging: the β+ activity measured within the patient is compared to a prediction calculated on the basis of the treatment planning data in order to identify potential delivery errors. Currently, this monitoring technique is limited to the treatment of static target structures. However, intra-fractional organ motion imposes considerable additional challenges to scanned ion beam radiotherapy. In this work, the feasibility and potential of time-resolved (4D) offline PET-based treatment verification with a commercial full-ring PET/CT (x-ray computed tomography) device are investigated for the first time, based on an experimental campaign with moving phantoms. Motion was monitored during the gated beam delivery as well as the subsequent PET acquisition and was taken into account in the corresponding 4D Monte-Carlo simulations and data evaluation. Under the given experimental conditions, millimeter agreement between the prediction and measurement was found. Dosimetric consequences due to the phantom motion could be reliably identified. The agreement between PET measurement and prediction in the presence of motion was found to be similar to that in static reference measurements, thus demonstrating the potential of 4D PET-based treatment verification for future clinical applications.

  11. Definition of ground test for verification of large space structure control

    NASA Technical Reports Server (NTRS)

    Glaese, John R.

    1994-01-01

    Under this contract, the Large Space Structure Ground Test Verification (LSSGTV) Facility at the George C. Marshall Space Flight Center (MSFC) was developed. Planning in coordination with NASA was finalized and implemented. The contract was modified and extended with several increments of funding to procure additional hardware and to continue support for the LSSGTV facility. Additional tasks were defined for the performance of studies in the dynamics, control and simulation of tethered satellites. When the LSSGTV facility development task was completed, support and enhancement activities were funded through a new competitive contract won by LCD. All work related to LSSGTV performed under NAS8-35835 has been completed and documented. No further discussion of these activities will appear in this report. This report summarizes the tether dynamics and control studies performed.

  12. 14 CFR 460.17 - Verification program.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... software in an operational flight environment before allowing any space flight participant on board during a flight. Verification must include flight testing. ... TRANSPORTATION LICENSING HUMAN SPACE FLIGHT REQUIREMENTS Launch and Reentry with Crew § 460.17 Verification...

  13. 14 CFR 460.17 - Verification program.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... software in an operational flight environment before allowing any space flight participant on board during a flight. Verification must include flight testing. ... TRANSPORTATION LICENSING HUMAN SPACE FLIGHT REQUIREMENTS Launch and Reentry with Crew § 460.17 Verification...

  14. 14 CFR 460.17 - Verification program.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... software in an operational flight environment before allowing any space flight participant on board during a flight. Verification must include flight testing. ... TRANSPORTATION LICENSING HUMAN SPACE FLIGHT REQUIREMENTS Launch and Reentry with Crew § 460.17 Verification...

  15. 14 CFR 460.17 - Verification program.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... software in an operational flight environment before allowing any space flight participant on board during a flight. Verification must include flight testing. ... TRANSPORTATION LICENSING HUMAN SPACE FLIGHT REQUIREMENTS Launch and Reentry with Crew § 460.17 Verification...

  16. 14 CFR 460.17 - Verification program.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... software in an operational flight environment before allowing any space flight participant on board during a flight. Verification must include flight testing. ... TRANSPORTATION LICENSING HUMAN SPACE FLIGHT REQUIREMENTS Launch and Reentry with Crew § 460.17 Verification...

  17. Cleaning and Cleanliness Measurement of Additive Manufactured Parts

    NASA Technical Reports Server (NTRS)

    Mitchell, Mark A.; Edwards, Kevin; Fox, Eric; Boothe, Richard

    2017-01-01

    Additive Manufacturing processes allow for the manufacture of complex three-dimensional components that otherwise could not be manufactured. Post-treatment processes require the removal of any remnant bulk powder that may become entrapped within small cavities and channels within a component. This project focuses on several gross cleaning methods and the verification metrics associated with additive manufactured parts for oxygen propulsion usage.

  18. 76 FR 60112 - Consent Based Social Security Number Verification (CBSV) Service

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-09-28

    ... protect the public's information. In addition to the benefit of providing high volume, centralized SSN verification services to the business community in a secure manner, CBSV provides us with cost and workload management benefits. New Information: To use CBSV, interested parties must pay a one-time non-refundable...

  19. ADEN ALOS PALSAR Product Verification

    NASA Astrophysics Data System (ADS)

    Wright, P. A.; Meadows, P. J.; Mack, G.; Miranda, N.; Lavalle, M.

    2008-11-01

    Within the ALOS Data European Node (ADEN) the verification of PALSAR products is an important and continuing activity, to ensure data utility for the users. The paper will give a summary of the verification activities, the status of the ADEN PALSAR processor and the current quality issues that are important for users of ADEN PALSAR data.

  20. Sierra/SolidMechanics 4.48 Verification Tests Manual.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Plews, Julia A.; Crane, Nathan K; de Frias, Gabriel Jose

    2018-03-01

    Presented in this document is a small portion of the tests that exist in the Sierra/SolidMechanics (Sierra/SM) verification test suite. Most of these tests are run nightly with the Sierra/SM code suite, and the results of the test are checked against the correct analytical result. For each of the tests presented in this document, the test setup, a description of the analytic solution, and a comparison of the Sierra/SM code results to the analytic solution are provided. Mesh convergence is also checked on a nightly basis for several of these tests. This document can be used to confirm that a given code capability is verified or referenced as a compilation of example problems. Additional example problems are provided in the Sierra/SM Example Problems Manual. Note, many other verification tests exist in the Sierra/SM test suite, but have not yet been included in this manual.
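
    The mesh-convergence check mentioned above typically reduces to computing an observed order of accuracy from errors on successively refined meshes and comparing it to the scheme's expected order. A generic sketch follows; the error values are illustrative, not Sierra/SM results.

        import math

        def observed_order(err_coarse, err_fine, refinement_ratio=2.0):
            # Observed order p from e_coarse/e_fine ~ r**p on meshes whose
            # spacing differs by refinement_ratio.
            return math.log(err_coarse / err_fine) / math.log(refinement_ratio)

        p = observed_order(4.0e-3, 1.1e-3)
        print(f"observed order ~ {p:.2f}")  # ~1.86, close to a 2nd-order scheme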

  1. Sierra/SolidMechanics 4.48 Verification Tests Manual.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Plews, Julia A.; Crane, Nathan K.; de Frias, Gabriel Jose

    Presented in this document is a small portion of the tests that exist in the Sierra/SolidMechanics (Sierra/SM) verification test suite. Most of these tests are run nightly with the Sierra/SM code suite, and the results of the test are checked against the correct analytical result. For each of the tests presented in this document, the test setup, a description of the analytic solution, and a comparison of the Sierra/SM code results to the analytic solution are provided. Mesh convergence is also checked on a nightly basis for several of these tests. This document can be used to confirm that a given code capability is verified or referenced as a compilation of example problems. Additional example problems are provided in the Sierra/SM Example Problems Manual. Note, many other verification tests exist in the Sierra/SM test suite, but have not yet been included in this manual.

  2. A Verification Method for MASOES.

    PubMed

    Perozo, N; Aguilar Perozo, J; Terán, O; Molina, H

    2013-02-01

    MASOES is a 3-agent architecture for designing and modeling self-organizing and emergent systems. This architecture describes the elements, relationships, and mechanisms, both at the individual and the collective levels, that favor the analysis of the self-organizing and emergent phenomenon without mathematically modeling the system. In this paper, a method is proposed for verifying MASOES from the point of view of design in order to study the self-organizing and emergent behaviors of the modeled systems. The verification criteria are set according to what is proposed in MASOES for modeling self-organizing and emergent systems and the principles of the wisdom-of-crowds paradigm and fuzzy cognitive map (FCM) theory. The verification method for MASOES has been implemented in a tool called FCM Designer and has been tested to model a community of free software developers that works under the bazaar style as well as a Wikipedia community in order to study their behavior and determine their self-organizing and emergent capacities.
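
    Fuzzy cognitive map theory, one of the foundations named above, iterates concept activations through a weighted influence matrix until they stabilize; stable fixed points are one signal of self-organized behavior. A minimal FCM update loop follows; the weights and concepts are invented for illustration and are not taken from MASOES or FCM Designer.

        import numpy as np

        # W[i, j] is the causal influence of concept i on concept j.
        W = np.array([[0.0,  0.6, -0.3],
                      [0.2,  0.0,  0.5],
                      [0.0, -0.4,  0.0]])

        def step(a, W):
            # Sigmoid squashing of the summed incoming influence.
            return 1.0 / (1.0 + np.exp(-(a @ W)))

        a = np.array([0.5, 0.5, 0.5])  # initial concept activations
        for _ in range(50):
            a = step(a, W)
        print(a)  # activations after 50 steps; convergence suggests stability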

  3. Assessment of Galileo modal test results for mathematical model verification

    NASA Technical Reports Server (NTRS)

    Trubert, M.

    1984-01-01

    The modal test program for the Galileo Spacecraft was completed at the Jet Propulsion Laboratory in the summer of 1983. The multiple sine dwell method was used for the baseline test. The Galileo Spacecraft is a rather complex 2433 kg structure made of a central core on which seven major appendages representing 30 percent of the total mass are attached, resulting in a high modal density structure. The test revealed a strong nonlinearity in several major modes. This nonlinearity discovered in the course of the test necessitated running additional tests at the unusually high response levels of up to about 21 g. The high levels of response were required to obtain a model verification valid at the level of loads for which the spacecraft was designed. Because of the high modal density and the nonlinearity, correlation between the dynamic mathematical model and the test results becomes a difficult task. Significant changes in the pre-test analytical model are necessary to establish confidence in the upgraded analytical model used for the final load verification. This verification, using a test verified model, is required by NASA to fly the Galileo Spacecraft on the Shuttle/Centaur launch vehicle in 1986.

  4. Formal verification of software-based medical devices considering medical guidelines.

    PubMed

    Daw, Zamira; Cleaveland, Rance; Vetter, Marcus

    2014-01-01

    Software-based devices have increasingly become an important part of several clinical scenarios. Due to their critical impact on human life, medical devices have very strict safety requirements. It is therefore necessary to apply verification methods to ensure that the safety requirements are met. Verification of software-based devices is commonly limited to the verification of their internal elements, without considering the interaction that these elements have with other devices as well as the application environment in which they are used. Medical guidelines define clinical procedures, which contain the necessary information to completely verify medical devices. The objective of this work was to incorporate medical guidelines into the verification process in order to increase the reliability of software-based medical devices. Medical devices are developed using the model-driven method deterministic models for signal processing of embedded systems (DMOSES). This method uses unified modeling language (UML) models as a basis for the development of medical devices. The UML activity diagram is used to describe medical guidelines as workflows. The functionality of the medical devices is abstracted as a set of actions that is modeled within these workflows. In this paper, the UML models are verified using the UPPAAL model checker. For this purpose, a formalization approach for the UML models using timed automata (TA) is presented. A set of requirements is verified by the proposed approach for the navigation-guided biopsy. This shows the capability for identifying errors or optimization points both in the workflow and in the system design of the navigation device. In addition to the above, an open-source Eclipse plug-in was developed for the automated transformation of UML models into TA models that are automatically verified using UPPAAL. The proposed method enables developers to model medical devices and their clinical environment using clinical workflows as one

  5. Probabilistic verification of cloud fraction from three different products with CALIPSO

    NASA Astrophysics Data System (ADS)

    Jung, B. J.; Descombes, G.; Snyder, C.

    2017-12-01

    In this study, we present how Cloud-Aerosol Lidar and Infrared Pathfinder Satellite Observation (CALIPSO) data can be used for probabilistic verification of cloud fraction, and we apply this probabilistic approach to three cloud fraction products: a) the Air Force Weather (AFW) World Wide Merged Cloud Analysis (WWMCA), b) the Satellite Cloud Observations and Radiative Property retrieval Systems (SatCORPS) from NASA Langley Research Center, and c) the Multi-sensor Advection Diffusion nowCast (MADCast) from NCAR. Although they differ in their details, both WWMCA and SatCORPS retrieve cloud fraction from satellite observations, mainly infrared radiances. MADCast additionally utilizes a short-range forecast of cloud fraction (provided by the Model for Prediction Across Scales, assuming cloud fraction is advected as a tracer) and a column-by-column particle filter implemented within the Gridpoint Statistical Interpolation (GSI) data-assimilation system. The probabilistic verification treats the retrieved or analyzed cloud fraction as predicting the probability of cloud at any location within a grid cell, and the 5-km vertical feature mask (VFM) from the CALIPSO level-2 products as a point observation of cloud.
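
    As a sketch of what such probabilistic verification can look like in practice (with made-up arrays standing in for matched grid cells; the real study uses gridded products and the CALIPSO VFM), the retrieved cloud fraction can be scored as a probability forecast against the binary lidar observation, e.g. with a Brier score:

        import numpy as np

        # Hypothetical data: cloud fraction in [0, 1] per matched grid cell,
        # and the CALIPSO VFM reduced to a binary cloud/no-cloud observation.
        cloud_fraction = np.array([0.9, 0.1, 0.5, 0.0, 0.7])
        calipso_cloud = np.array([1, 0, 1, 0, 1])

        brier = np.mean((cloud_fraction - calipso_cloud) ** 2)
        print(f"Brier score: {brier:.3f}")  # 0 is perfect, 1 is worst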

  6. Post-OPC verification using a full-chip pattern-based simulation verification method

    NASA Astrophysics Data System (ADS)

    Hung, Chi-Yuan; Wang, Ching-Heng; Ma, Cliff; Zhang, Gary

    2005-11-01

    In this paper, we evaluated and investigated techniques for performing fast full-chip post-OPC verification using a commercial product platform. A number of databases from several technology nodes, i.e., 0.13um, 0.11um, and 90nm, were used in the investigation. Although our OPC technology has proven robust for most cases, the variety of tape-outs with complicated design styles and technologies makes it difficult to develop a "complete or bullet-proof" OPC algorithm that would cover every possible layout pattern. In the evaluation, model-based post-OPC checking found errors in some of the dozens of OPC databases examined; such errors can be costly in manufacturing - reticle, wafer processing, and, more importantly, production delay. From this full-chip OPC database verification we have learned that optimizing OPC models and recipes on a limited set of test chip designs may not provide sufficient coverage across the range of designs to be produced in the process, and fatal errors (such as pinching or bridging), poor CD distribution, and process-sensitive patterns may still occur. As a result, more than one reticle tape-out cycle is not uncommon to prove models and recipes that approach the center of the process window for a range of designs. We therefore describe a full-chip pattern-based simulation verification flow that serves both OPC model and recipe development as well as post-OPC verification after production release of the OPC. Lastly, we discuss how the new pattern-based verification tool differs from conventional edge-based tools and summarize the advantages of our new tool and methodology: 1) Accuracy: superior inspection algorithms, down to 1nm accuracy with the new pattern-based approach; 2) High-speed performance: pattern-centric algorithms give the best full-chip inspection efficiency; 3) Powerful analysis capability: flexible error distribution, grouping, interactive viewing, and hierarchical pattern extraction to narrow

  7. 45 CFR 95.626 - Independent Verification and Validation.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... 45 Public Welfare 1 2013-10-01 2013-10-01 false Independent Verification and Validation. 95.626... (FFP) Specific Conditions for Ffp § 95.626 Independent Verification and Validation. (a) An assessment for independent verification and validation (IV&V) analysis of a State's system development effort may...

  8. 45 CFR 95.626 - Independent Verification and Validation.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... 45 Public Welfare 1 2014-10-01 2014-10-01 false Independent Verification and Validation. 95.626... (FFP) Specific Conditions for Ffp § 95.626 Independent Verification and Validation. (a) An assessment for independent verification and validation (IV&V) analysis of a State's system development effort may...

  9. 45 CFR 95.626 - Independent Verification and Validation.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... 45 Public Welfare 1 2011-10-01 2011-10-01 false Independent Verification and Validation. 95.626... (FFP) Specific Conditions for Ffp § 95.626 Independent Verification and Validation. (a) An assessment for independent verification and validation (IV&V) analysis of a State's system development effort may...

  10. 45 CFR 95.626 - Independent Verification and Validation.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... 45 Public Welfare 1 2012-10-01 2012-10-01 false Independent Verification and Validation. 95.626... (FFP) Specific Conditions for Ffp § 95.626 Independent Verification and Validation. (a) An assessment for independent verification and validation (IV&V) analysis of a State's system development effort may...

  11. 40 CFR 1065.395 - Inertial PM balance verifications.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 40 Protection of Environment 32 2010-07-01 2010-07-01 false Inertial PM balance verifications... Inertial PM balance verifications. This section describes how to verify the performance of an inertial PM balance. (a) Independent verification. Have the balance manufacturer (or a representative approved by the...

  12. 40 CFR 1065.395 - Inertial PM balance verifications.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... 40 Protection of Environment 33 2011-07-01 2011-07-01 false Inertial PM balance verifications... Inertial PM balance verifications. This section describes how to verify the performance of an inertial PM balance. (a) Independent verification. Have the balance manufacturer (or a representative approved by the...

  13. The Effect of Mystery Shopper Reports on Age Verification for Tobacco Purchases

    PubMed Central

    KREVOR, BRAD S.; PONICKI, WILLIAM R.; GRUBE, JOEL W.; DeJONG, WILLIAM

    2011-01-01

    Mystery shops (MS) involving attempted tobacco purchases by young buyers have been employed to monitor retail stores’ performance in refusing underage sales. Anecdotal evidence suggests that MS visits with immediate feedback to store personnel can improve age verification. This study investigated the impact of monthly and twice-monthly MS reports on age verification. Forty-five Walgreens stores were each visited 20 times by mystery shoppers. The stores were randomly assigned to one of three conditions. Control group stores received no feedback, whereas two treatment groups received feedback communications every visit (twice monthly) or every second visit (monthly) after baseline. Logit regression models tested whether each treatment group improved verification rates relative to the control group. Post-baseline verification rates were higher in both treatment groups than in the control group, but only the stores receiving monthly communications had a significantly greater improvement than control group stores. Verification rates increased significantly during the study period for all three groups, with delayed improvement among control group stores. Communication between managers regarding the MS program may account for the delayed age-verification improvements observed in the control group stores. Encouraging inter-store communication might extend the benefits of MS programs beyond those stores that receive this intervention. PMID:21541874

  14. The effect of mystery shopper reports on age verification for tobacco purchases.

    PubMed

    Krevor, Brad S; Ponicki, William R; Grube, Joel W; DeJong, William

    2011-09-01

    Mystery shops involving attempted tobacco purchases by young buyers have been implemented in order to monitor retail stores' performance in refusing underage sales. Anecdotal evidence suggests that mystery shop visits with immediate feedback to store personnel can improve age verification. This study investigated the effect of monthly and twice-monthly mystery shop reports on age verification. Mystery shoppers visited 45 Walgreens stores 20 times. The stores were randomly assigned to 1 of 3 conditions. Control group stores received no feedback, whereas 2 treatment groups received feedback communications on every visit (twice monthly) or on every second visit (monthly) after baseline. Logit regression models tested whether each treatment group improved verification rates relative to the control group. Postbaseline verification rates were higher in both treatment groups than in the control group, but only the stores receiving monthly communications had a significantly greater improvement compared with the control group stores. Verification rates increased significantly during the study period for all 3 groups, with delayed improvement among control group stores. Communication between managers regarding the mystery shop program may account for the delayed age-verification improvements observed in the control group stores. Encouraging interstore communication might extend the benefits of mystery shop programs beyond those stores that receive this intervention. Copyright © Taylor & Francis Group, LLC
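
    The logit analysis described in both versions of this study can be sketched as follows; the data, effect sizes, and variable names are invented, and statsmodels is used only as one convenient way to fit such a model:

        import numpy as np
        import statsmodels.api as sm

        # Hypothetical store visits: outcome 1 = age verified, with indicators
        # for the two feedback arms and a post-baseline flag.
        rng = np.random.default_rng(0)
        n = 900
        monthly = rng.integers(0, 2, n)
        twice_monthly = (1 - monthly) * rng.integers(0, 2, n)
        post = rng.integers(0, 2, n)
        logit_p = -0.2 + 0.8 * monthly * post + 0.4 * twice_monthly * post
        verified = rng.binomial(1, 1 / (1 + np.exp(-logit_p)))

        X = sm.add_constant(np.column_stack([monthly * post, twice_monthly * post]))
        result = sm.Logit(verified, X).fit(disp=False)
        print(result.params)  # positive treatment coefficients = improved verification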

  15. Usefulness of biological fingerprint in magnetic resonance imaging for patient verification.

    PubMed

    Ueda, Yasuyuki; Morishita, Junji; Kudomi, Shohei; Ueda, Katsuhiko

    2016-09-01

    The purpose of our study is to investigate the feasibility of automated patient verification using multi-planar reconstruction (MPR) images generated from three-dimensional magnetic resonance (MR) imaging of the brain. Several anatomy-related MPR images generated from the three-dimensional fast scout scan of each MR examination were used as biological fingerprint images in this study. The database for this study consisted of 730 temporal pairs of MR examinations of the brain. We calculated the correlation value between current and prior biological fingerprint images of the same patient, and also for all combinations of two images from different patients, to evaluate the effectiveness of our method for patient verification. The best performance of our system was as follows: a half-total error rate of 1.59% with a false acceptance rate of 0.023% and a false rejection rate of 3.15%, an equal error rate of 1.37%, and a rank-one identification rate of 98.6%. Our method makes it possible to verify the identity of a patient using only existing medical images, without additional equipment, and will contribute to managing patient-misidentification errors caused by human error.
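
    The reported error rates can be reproduced schematically: given correlation scores for genuine (same-patient) and impostor (different-patient) image pairs, sweep a decision threshold and compute the false acceptance and false rejection rates; the equal error rate is where the two curves cross. The scores below are synthetic.

        import numpy as np

        rng = np.random.default_rng(1)
        genuine = rng.normal(0.95, 0.03, 730)    # same-patient correlation values
        impostor = rng.normal(0.70, 0.10, 5000)  # different-patient pairs

        best = None
        for t in np.linspace(0.5, 1.0, 501):
            far = np.mean(impostor >= t)  # false acceptance rate
            frr = np.mean(genuine < t)    # false rejection rate
            if best is None or abs(far - frr) < abs(best[1] - best[2]):
                best = (t, far, frr)

        t, far, frr = best
        print(f"EER ~ {(far + frr) / 2:.4f} at threshold {t:.3f}")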

  16. Alternative sample sizes for verification dose experiments and dose audits

    NASA Astrophysics Data System (ADS)

    Taylor, W. A.; Hansen, J. M.

    1999-01-01

    ISO 11137 (1995), "Sterilization of Health Care Products—Requirements for Validation and Routine Control—Radiation Sterilization", provides sampling plans for performing initial verification dose experiments and quarterly dose audits. Alternative sampling plans are presented which provide equivalent protection; these sampling plans can significantly reduce the cost of testing and have been included in a draft ISO Technical Report (type 2). This paper examines the rationale behind the proposed alternative sampling plans. The protection provided by the current verification and audit sampling plans is first examined. Then methods for identifying equivalent plans are highlighted. Finally, methods for comparing the costs associated with the different plans are provided. This paper includes additional guidance, not included in the technical report, for selecting between the original and alternative sampling plans.
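
    The "equivalent protection" of two attribute sampling plans can be compared through their operating characteristic curves, i.e. the binomial probability of acceptance as a function of the true positive-test fraction. The plan parameters below are placeholders, not the ISO 11137 values:

        from math import comb

        def p_accept(n, c, p):
            """Probability of accepting the lot when each of n items tests
            positive independently with probability p and at most c positives
            are allowed."""
            return sum(comb(n, k) * p**k * (1 - p)**(n - k) for k in range(c + 1))

        # Hypothetical plans: 100 items / accept on <= 2 positives,
        # versus 55 items / accept on <= 1 positive.
        for p in (0.01, 0.02, 0.05):
            print(p, round(p_accept(100, 2, p), 3), round(p_accept(55, 1, p), 3))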

  17. Security Verification of Secure MANET Routing Protocols

    DTIC Science & Technology

    2012-03-22

    SECURITY VERIFICATION OF SECURE MANET ROUTING PROTOCOLS. Thesis, presented to the Faculty, Department of Electrical and Computer Engineering, Air Force Institute of Technology. Matthew F. Steele, B.S.E.E., Captain, USAF. AFIT/GCS/ENG/12-03. Distribution unlimited.

  18. ENVIRONMENTAL TECHNOLOGY VERIFICATION AND INDOOR AIR

    EPA Science Inventory

    The paper discusses environmental technology verification and indoor air. RTI has responsibility for a pilot program for indoor air products as part of the U.S. EPA's Environmental Technology Verification (ETV) program. The program objective is to further the development of sel...

  19. Gender Verification of Female Olympic Athletes.

    ERIC Educational Resources Information Center

    Dickinson, Barry D.; Genel, Myron; Robinowitz, Carolyn B.; Turner, Patricia L.; Woods, Gary L.

    2002-01-01

    Gender verification of female athletes has long been criticized by geneticists, endocrinologists, and others in the medical community. Recently, the International Olympic Committee's Athletic Commission called for discontinuation of mandatory laboratory-based gender verification of female athletes. This article discusses normal sexual…

  20. Spatial Evaluation and Verification of Earthquake Simulators

    NASA Astrophysics Data System (ADS)

    Wilson, John Max; Yoder, Mark R.; Rundle, John B.; Turcotte, Donald L.; Schultz, Kasey W.

    2017-06-01

    In this paper, we address the problem of verifying earthquake simulators with observed data. Earthquake simulators are a class of computational simulations which attempt to mirror the topological complexity of fault systems on which earthquakes occur. In addition, the physics of friction and elastic interactions between fault elements are included in these simulations. Simulation parameters are adjusted so that natural earthquake sequences are matched in their scaling properties. Physically based earthquake simulators can generate many thousands of years of simulated seismicity, allowing for a robust capture of the statistical properties of large, damaging earthquakes that have long recurrence time scales. Verification of simulations against currently observed earthquake seismicity is necessary, and following past simulator and forecast-model verification methods, we address the challenges of applying spatial forecast verification to simulators; namely, that simulator outputs are confined to the modeled faults, while observed earthquake epicenters often occur off of known faults. We present two methods for addressing this discrepancy: a simplistic approach whereby observed earthquakes are shifted to the nearest fault element, and a smoothing method based on the power laws of the epidemic-type aftershock sequence (ETAS) model, which distributes the seismicity of each simulated earthquake over the entire test region at a rate that decays with epicentral distance. To test these methods, a receiver operating characteristic plot was produced by comparing the rate maps to observed m>6.0 earthquakes in California since 1980. We found that the nearest-neighbor mapping produced poor forecasts, while the ETAS power-law method produced rate maps that agreed reasonably well with observations.
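
    A minimal sketch of the power-law smoothing idea (not the authors' exact kernel or normalization): each simulated event spreads a unit of rate over the grid, decaying with epicentral distance r as (r + r0)^(-q), with r0 and q as tunable, here invented, parameters.

        import numpy as np

        def smoothed_rate_map(events, grid_xy, r0=5.0, q=1.5):
            """Distribute each event's rate over all grid cells with a
            power-law decay in distance (a hypothetical ETAS-like kernel)."""
            rates = np.zeros(len(grid_xy))
            for ex, ey in events:
                r = np.hypot(grid_xy[:, 0] - ex, grid_xy[:, 1] - ey)
                kernel = (r + r0) ** (-q)
                rates += kernel / kernel.sum()  # each event contributes unit rate
            return rates

        grid = np.array([(x, y) for x in range(50) for y in range(50)], float)
        print(smoothed_rate_map([(10.0, 10.0), (40.0, 25.0)], grid).sum())  # ~2.0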

  1. Regression Verification Using Impact Summaries

    NASA Technical Reports Server (NTRS)

    Backes, John; Person, Suzette J.; Rungta, Neha; Thachuk, Oksana

    2013-01-01

    Regression verification techniques are used to prove equivalence of syntactically similar programs. Checking equivalence of large programs, however, can be computationally expensive. Existing regression verification techniques rely on abstraction and decomposition techniques to reduce the computational effort of checking equivalence of the entire program. These techniques are sound but not complete. In this work, we propose a novel approach to improve scalability of regression verification by classifying the program behaviors generated during symbolic execution as either impacted or unimpacted. Our technique uses a combination of static analysis and symbolic execution to generate summaries of impacted program behaviors. The impact summaries are then checked for equivalence using an off-the-shelf decision procedure. We prove that our approach is both sound and complete for sequential programs, with respect to the depth bound of symbolic execution. Our evaluation on a set of sequential C artifacts shows that reducing the size of the summaries can help reduce the cost of software equivalence checking. Various reduction, abstraction, and compositional techniques have been developed to help scale software verification techniques to industrial-sized systems. Although such techniques have greatly increased the size and complexity of systems that can be checked, analysis of large software systems remains costly. Regression analysis techniques, e.g., regression testing [16], regression model checking [22], and regression verification [19], restrict the scope of the analysis by leveraging the differences between program versions. These techniques are based on the idea that if code is checked early in development, then subsequent versions can be checked against a prior (checked) version, leveraging the results of the previous analysis to reduce analysis cost of the current version. Regression verification addresses the problem of proving equivalence of closely related program
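
    The core step, checking two behavior summaries for equivalence with an off-the-shelf decision procedure, can be illustrated with the Z3 solver; this toy stand-in skips the static analysis and symbolic execution that produce the impact summaries in the paper:

        from z3 import Int, Solver, sat

        x = Int("x")
        old_version = 2 * x + 2    # behavior summary from the prior version
        new_version = 2 * (x + 1)  # syntactically different successor

        s = Solver()
        s.add(old_version != new_version)  # search for a distinguishing input
        if s.check() == sat:
            print("not equivalent, e.g. x =", s.model()[x])
        else:
            print("behaviors are equivalent")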

  2. SU-E-T-50: A Multi-Institutional Study of Independent Dose Verification Software Program for Lung SBRT

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kawai, D; Takahashi, R; Kamima, T

    2015-06-15

    Purpose: The accuracy of dose distribution depends on the treatment planning system, especially in heterogeneous regions. The tolerance level (TL) of the secondary check using independent dose verification may be variable for lung SBRT plans. We conducted a multi-institutional study to evaluate the tolerance level for lung SBRT plans using the approach shown in AAPM TG-114. Methods: Five institutes in Japan participated in this study. All of the institutes used the same independent dose verification software program (Simple MU Analysis: SMU, Triangle Product, Ishikawa, JP), which is Clarkson-based and uses CT images to compute the radiological path length. The Analytical Anisotropic Algorithm (AAA), Pencil Beam Convolution with the modified Batho method (PBC-B), and Adaptive Convolve (AC) were used for lung SBRT planning. A measurement using an ion chamber was performed in a heterogeneous phantom to compare doses from the three different algorithms and the SMU to the measured dose. In addition, a retrospective analysis using clinical lung SBRT plans (547 beams from 77 patients) was conducted to evaluate the confidence limit (CL, average ± 2SD) of the dose differences between the three algorithms and the SMU. Results: Compared to the measurement, the AAA showed a larger systematic dose error (2.9±3.2%) than PBC-B and AC; the Clarkson-based SMU showed a still larger error of 5.8±3.8%. The CLs for clinical plans were 7.7±6.0% (AAA), 5.3±3.3% (AC), and 5.7±3.4% (PBC-B), respectively. Conclusion: The TLs were evaluated from the CLs. A Clarkson-based system shows large systematic variation because of the inhomogeneity correction, and the AAA also showed significant variation. Thus, the difference in inhomogeneity correction, as well as the dependence on the dose calculation engine, must be considered.
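
    The confidence limit used in the study (average ± 2SD of the per-beam dose deviations) is straightforward to reproduce; the deviations below are invented:

        import numpy as np

        # Hypothetical per-beam deviations (%) between the TPS dose and the
        # independent Clarkson-based calculation.
        dev = np.array([4.1, 6.3, 7.0, 3.2, 5.5, 8.1, 4.8])

        mean, spread = dev.mean(), 2 * dev.std(ddof=1)
        print(f"CL = {mean:.1f} ± {spread:.1f} %")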

  3. 46 CFR 61.40-3 - Design verification testing.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... 46 Shipping 2 2011-10-01 2011-10-01 false Design verification testing. 61.40-3 Section 61.40-3... INSPECTIONS Design Verification and Periodic Testing of Vital System Automation § 61.40-3 Design verification testing. (a) Tests must verify that automated vital systems are designed, constructed, and operate in...

  4. Magnetic cleanliness verification approach on tethered satellite

    NASA Technical Reports Server (NTRS)

    Messidoro, Piero; Braghin, Massimo; Grande, Maurizio

    1990-01-01

    Magnetic cleanliness testing was performed on the Tethered Satellite as the last step of an articulated verification campaign aimed at demonstrating the capability of the satellite to support its TEMAG (TEthered MAgnetometer) experiment. Tests at unit level and analytical predictions/correlations using a dedicated mathematical model (GANEW program) are also part of the verification activities. Details of the tests are presented, and the results of the verification are described together with recommendations for later programs.

  5. Software verification plan for GCS. [guidance and control software

    NASA Technical Reports Server (NTRS)

    Dent, Leslie A.; Shagnea, Anita M.; Hayhurst, Kelly J.

    1990-01-01

    This verification plan is written as part of an experiment designed to study the fundamental characteristics of the software failure process. The experiment will be conducted using several implementations of software that were produced according to industry-standard guidelines, namely the Radio Technical Commission for Aeronautics RTCA/DO-178A guidelines, Software Consideration in Airborne Systems and Equipment Certification, for the development of flight software. This plan fulfills the DO-178A requirements for providing instructions on the testing of each implementation of software. The plan details the verification activities to be performed at each phase in the development process, contains a step by step description of the testing procedures, and discusses all of the tools used throughout the verification process.

  6. Verification of Geometric Model-Based Plant Phenotyping Methods for Studies of Xerophytic Plants

    PubMed Central

    Drapikowski, Paweł; Kazimierczak-Grygiel, Ewa; Korecki, Dominik; Wiland-Szymańska, Justyna

    2016-01-01

    This paper presents the results of verification of certain non-contact measurement methods of plant scanning to estimate morphological parameters such as length, width, area, volume of leaves and/or stems on the basis of computer models. The best results in reproducing the shape of scanned objects up to 50 cm in height were obtained with the structured-light DAVID Laserscanner. The optimal triangle mesh resolution for scanned surfaces was determined with the measurement error taken into account. The research suggests that measuring morphological parameters from computer models can supplement or even replace phenotyping with classic methods. Calculating precise values of area and volume makes determination of the S/V (surface/volume) ratio for cacti and other succulents possible, whereas for classic methods the result is an approximation only. In addition, the possibility of scanning and measuring plant species which differ in morphology was investigated. PMID:27355949

  7. Verification of Geometric Model-Based Plant Phenotyping Methods for Studies of Xerophytic Plants.

    PubMed

    Drapikowski, Paweł; Kazimierczak-Grygiel, Ewa; Korecki, Dominik; Wiland-Szymańska, Justyna

    2016-06-27

    This paper presents the results of verification of certain non-contact measurement methods of plant scanning to estimate morphological parameters such as length, width, area, volume of leaves and/or stems on the basis of computer models. The best results in reproducing the shape of scanned objects up to 50 cm in height were obtained with the structured-light DAVID Laserscanner. The optimal triangle mesh resolution for scanned surfaces was determined with the measurement error taken into account. The research suggests that measuring morphological parameters from computer models can supplement or even replace phenotyping with classic methods. Calculating precise values of area and volume makes determination of the S/V (surface/volume) ratio for cacti and other succulents possible, whereas for classic methods the result is an approximation only. In addition, the possibility of scanning and measuring plant species which differ in morphology was investigated.
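
    Given a closed, consistently oriented triangle mesh from such a scan, surface area and volume, and hence the S/V ratio, follow from summing triangle areas and signed origin-anchored tetrahedron volumes (divergence theorem). A sketch using a unit tetrahedron as a stand-in for a scanned plant mesh:

        import numpy as np

        def area_volume(vertices, faces):
            tri = vertices[faces]  # shape (F, 3, 3)
            cross = np.cross(tri[:, 1] - tri[:, 0], tri[:, 2] - tri[:, 0])
            area = 0.5 * np.linalg.norm(cross, axis=1).sum()
            # Signed tetrahedra spanned with the origin; assumes a closed,
            # consistently oriented mesh.
            volume = abs(np.einsum("ij,ij->i", tri[:, 0],
                                   np.cross(tri[:, 1], tri[:, 2])).sum()) / 6.0
            return area, volume

        V = np.array([[0, 0, 0], [1, 0, 0], [0, 1, 0], [0, 0, 1]], float)
        F = np.array([[0, 2, 1], [0, 1, 3], [0, 3, 2], [1, 2, 3]])
        a, v = area_volume(V, F)
        print(f"area {a:.3f}, volume {v:.3f}, S/V {a / v:.1f}")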

  8. HDM/PASCAL Verification System User's Manual

    NASA Technical Reports Server (NTRS)

    Hare, D.

    1983-01-01

    The HDM/Pascal verification system is a tool for proving the correctness of programs written in PASCAL and specified in the Hierarchical Development Methodology (HDM). This document assumes an understanding of PASCAL, HDM, program verification, and the STP system. The steps toward verification which this tool provides are parsing programs and specifications, checking the static semantics, and generating verification conditions. Some support functions are provided such as maintaining a data base, status management, and editing. The system runs under the TOPS-20 and TENEX operating systems and is written in INTERLISP. However, no knowledge is assumed of these operating systems or of INTERLISP. The system requires three executable files, HDMVCG, PARSE, and STP. Optionally, the editor EMACS should be on the system in order for the editor to work. The file HDMVCG is invoked to run the system. The files PARSE and STP are used as lower forks to perform the functions of parsing and proving.

  9. ENVIRONMENTAL TECHNOLOGY VERIFICATION (ETV) PROGRAM CASE STUDIES: DEMONSTRATING PROGRAM OUTCOMES, VOLUME II

    EPA Science Inventory

    The U.S. Environmental Protection Agency (EPA) Environmental Technology Verification (ETV) Program evaluates the performance of innovative air, water, pollution prevention and monitoring technologies that have the potential to improve human health and the environment. This bookle...

  10. 30 CFR 250.909 - What is the Platform Verification Program?

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... 30 Mineral Resources 2 2011-07-01 2011-07-01 false What is the Platform Verification Program? 250... Platforms and Structures Platform Verification Program § 250.909 What is the Platform Verification Program? The Platform Verification Program is the MMS approval process for ensuring that floating platforms...

  11. 30 CFR 250.913 - When must I resubmit Platform Verification Program plans?

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... Structures Platform Verification Program § 250.913 When must I resubmit Platform Verification Program plans? (a) You must resubmit any design verification, fabrication verification, or installation verification... 30 Mineral Resources 2 2010-07-01 2010-07-01 false When must I resubmit Platform Verification...

  12. Built-in-Test Verification Techniques

    DTIC Science & Technology

    1987-02-01

    This report documents the results of the effort for the Rome Air Development Center Contract F30602-84-C-0021, BIT Verification Techniques. The work was... Richard Spillman of Spillman Research Associates. The principal investigators were Mike Partridge and subsequently Jeffrey Albert. The contract was... a two-year effort to develop techniques for Built-In Test (BIT) verification. The objective of the contract was to develop specifications and technical

  13. Code Verification of the HIGRAD Computational Fluid Dynamics Solver

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Van Buren, Kendra L.; Canfield, Jesse M.; Hemez, Francois M.

    2012-05-04

    The purpose of this report is to outline code and solution verification activities applied to HIGRAD, a Computational Fluid Dynamics (CFD) solver of the compressible Navier-Stokes equations developed at the Los Alamos National Laboratory and used to simulate phenomena such as the propagation of wildfires and atmospheric hydrodynamics. Code verification efforts, as described in this report, are an important first step in establishing the credibility of numerical simulations. They provide evidence that the mathematical formulation is properly implemented without significant mistakes that would adversely impact the application of interest. Highly accurate analytical solutions are derived for four code verification test problems that exercise different aspects of the code. These test problems are referred to as: (i) the quiet start, (ii) the passive advection, (iii) the passive diffusion, and (iv) the piston-like problem. These problems are simulated using HIGRAD with different levels of mesh discretization, and the numerical solutions are compared to their analytical counterparts. In addition, the rates of convergence are estimated to verify the numerical performance of the solver. The first three test problems produce numerical approximations as expected. The fourth test problem (piston-like) indicates the extent to which the code is able to simulate a 'mild' discontinuity, a condition that would typically be better handled by a Lagrangian formulation. The current investigation concludes that the numerical implementation of the solver performs as expected. The quality of solutions is sufficient to provide credible simulations of fluid flows around wind turbines. The main caveat associated with these findings is the low coverage provided by these four problems and the somewhat limited verification activities; a more comprehensive evaluation of HIGRAD may be beneficial for future studies.
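
    The rate-of-convergence estimate mentioned here is typically the observed order of accuracy: with errors e1 and e2 on meshes refined by a factor r, p = ln(e1/e2)/ln(r). A quick sketch with invented error values:

        import math

        # Hypothetical discretization errors from three mesh levels (r = 2).
        errors = [4.0e-3, 1.1e-3, 2.9e-4]
        r = 2.0
        for e_coarse, e_fine in zip(errors, errors[1:]):
            p = math.log(e_coarse / e_fine) / math.log(r)
            print(f"observed order ~ {p:.2f}")  # should approach the formal order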

  14. 30 CFR 250.909 - What is the Platform Verification Program?

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 30 Mineral Resources 2 2010-07-01 2010-07-01 false What is the Platform Verification Program? 250... Verification Program § 250.909 What is the Platform Verification Program? The Platform Verification Program is the MMS approval process for ensuring that floating platforms; platforms of a new or unique design...

  15. Functions of social support and self-verification in association with loneliness, depression, and stress.

    PubMed

    Wright, Kevin B; King, Shawn; Rosenberg, Jenny

    2014-01-01

    This study investigated the influence of social support and self-verification on loneliness, depression, and stress among 477 college students. The authors propose and test a theoretical model using structural equation modeling. The results indicated empirical support for the model, with self-verification mediating the relation between social support and health outcomes. The results have implications for social support and self-verification research, which are discussed along with directions for future research and limitations of the study.

  16. Geometrical verification system using Adobe Photoshop in radiotherapy.

    PubMed

    Ishiyama, Hiromichi; Suzuki, Koji; Niino, Keiji; Hosoya, Takaaki; Hayakawa, Kazushige

    2005-02-01

    Adobe Photoshop is used worldwide and is useful for comparing portal films with simulation films, since it is possible to scan images and then view them simultaneously with this software. The purpose of this study was to assess the accuracy of a geometrical verification system using Adobe Photoshop. We prepared the following two conditions for verification. Under one condition, films were hung on light boxes, and examiners measured the distances between the isocenter on simulation films and that on portal films by aligning the bony structures. Under the other condition, films were scanned into a computer and displayed using Adobe Photoshop, and examiners measured the distances in the same way. To obtain control data, lead balls were used as fiducial points for matching the films accurately. The errors, defined as the differences between the control data and the measurement data, were assessed. Errors in the data obtained using Adobe Photoshop were significantly smaller than those in the data obtained from films on light boxes (p < 0.007). The geometrical verification system using Adobe Photoshop is available on any PC with this software and is useful for improving the accuracy of verification.

  17. Verification Challenges at Low Numbers

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Benz, Jacob M.; Booker, Paul M.; McDonald, Benjamin S.

    2013-07-16

    This paper will explore the difficulties of deep reductions by examining the technical verification challenges. At each step on the road to low numbers, the verification required to ensure compliance of all parties will increase significantly. Looking beyond New START, the next step will likely include warhead limits in the neighborhood of 1000 (Pifer 2010). Further reductions will include stepping stones at hundreds of warheads, and then tens of warheads, before final elimination of the last few remaining warheads and weapons could be considered. This paper focuses on these three threshold reduction levels: 1000, hundreds, and tens. For each, the issues and challenges are discussed, potential solutions are identified, and the verification technologies and chain-of-custody measures that address these solutions are surveyed. It is important to note that many of the issues that need to be addressed have no current solution. In these cases, the paper explores new or novel technologies that could be applied. These technologies draw from the research and development that is ongoing throughout the national lab complex, and from technologies utilized in other areas of industry, for their application to arms control verification.

  18. Verification of monitor unit calculations for non-IMRT clinical radiotherapy: report of AAPM Task Group 114.

    PubMed

    Stern, Robin L; Heaton, Robert; Fraser, Martin W; Goddu, S Murty; Kirby, Thomas H; Lam, Kwok Leung; Molineu, Andrea; Zhu, Timothy C

    2011-01-01

    The requirement of an independent verification of the monitor units (MU) or time calculated to deliver the prescribed dose to a patient has been a mainstay of radiation oncology quality assurance. The need for and value of such a verification was obvious when calculations were performed by hand using look-up tables, and the verification was achieved by a second person independently repeating the calculation. However, in a modern clinic using CT/MR/PET simulation, computerized 3D treatment planning, heterogeneity corrections, and complex calculation algorithms such as convolution/superposition and Monte Carlo, the purpose of and methodology for the MU verification have come into question. In addition, since the verification is often performed using a simpler geometrical model and calculation algorithm than the primary calculation, exact or almost exact agreement between the two can no longer be expected. Guidelines are needed to help the physicist set clinically reasonable action levels for agreement. This report addresses the following charges of the task group: (1) To re-evaluate the purpose and methods of the "independent second check" for monitor unit calculations for non-IMRT radiation treatment in light of the complexities of modern-day treatment planning. (2) To present recommendations on how to perform verification of monitor unit calculations in a modern clinic. (3) To provide recommendations on establishing action levels for agreement between primary calculations and verification, and to provide guidance in addressing discrepancies outside the action levels. These recommendations are to be used as guidelines only and shall not be interpreted as requirements.
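
    Operationally, the second check reduces to comparing the primary and verification results and flagging disagreement beyond an action level; TG-114 tabulates levels by clinical situation, so the 5% below is only a placeholder:

        def check_mu(primary_mu, verify_mu, action_level_pct=5.0):
            """Flag a monitor-unit verification that disagrees with the
            primary calculation by more than the action level (percent)."""
            diff_pct = 100.0 * (verify_mu - primary_mu) / primary_mu
            status = "OK" if abs(diff_pct) <= action_level_pct else "INVESTIGATE"
            return diff_pct, status

        print(check_mu(212.0, 206.5))  # within the action level
        print(check_mu(212.0, 190.0))  # outside: investigate the discrepancy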

  19. Model Based Verification of Cyber Range Event Environments

    DTIC Science & Technology

    2015-12-10

    Model Based Verification of Cyber Range Event Environments. Suresh K. Damodaran, MIT Lincoln Laboratory, 244 Wood St., Lexington, MA, USA... apply model-based verification to cyber range event environment configurations, allowing for the early detection of errors in event environment... Environment Representation (CCER) ontology. We also provide an overview of a methodology to specify verification rules and the corresponding error

  20. Active alignment/contact verification system

    DOEpatents

    Greenbaum, William M.

    2000-01-01

    A system involving an active (i.e., electrical) technique for the verification of: 1) close-tolerance mechanical alignment between two components, and 2) electrical contact between mating parts through an elastomeric interface. For example, the two components may be an alumina carrier and a printed circuit board: two mating parts that are extremely small, high-density parts requiring alignment within a fraction of a mil, as well as a specified interface point of engagement between the parts. The system comprises pairs of conductive structures defined in the surface layers of the alumina carrier and the printed circuit board, for example. The first pair of conductive structures relates to item (1) above and permits alignment verification between mating parts. The second pair of conductive structures relates to item (2) above and permits verification of electrical contact between mating parts.

  1. Advanced verification methods for OVI security ink

    NASA Astrophysics Data System (ADS)

    Coombs, Paul G.; McCaffery, Shaun F.; Markantes, Tom

    2006-02-01

    OVI security ink +, incorporating OVP security pigment* microflakes, enjoys a history of effective document protection. This security feature provides not only first-line recognition by the person on the street, but also facilitates machine-readability. This paper explores the evolution of OVI reader technology from proof-of-concept to miniaturization. Three different instruments have been built to advance the technology of OVI machine verification. A bench-top unit has been constructed which allows users to automatically verify a multitude of different banknotes and OVI images. In addition, high speed modules were fabricated and tested in a state of the art banknote sorting machine. Both units demonstrate the ability of modern optical components to illuminate and collect light reflected from the interference platelets within OVI ink. Electronic hardware and software convert and process the optical information in milliseconds to accurately determine the authenticity of the security feature. Most recently, OVI ink verification hardware has been miniaturized and simplified providing yet another platform for counterfeit protection. These latest devices provide a tool for store clerks and bank tellers to unambiguously determine the validity of banknotes in the time period it takes the cash drawer to be opened.

  2. Automated System Calibration and Verification of the Position Measurements for the Los Alamos Isotope Production Facility and the Switchyard Kicker Facilities

    NASA Astrophysics Data System (ADS)

    Barr, D.; Gilpatrick, J. D.; Martinez, D.; Shurter, R. B.

    2004-11-01

    The Los Alamos Neutron Science Center (LANSCE) facility at Los Alamos National Laboratory has constructed both an Isotope Production Facility (IPF) and a Switchyard Kicker (XDK) as additions to the H+ and H- accelerator. These additions contain eleven Beam Position Monitors (BPMs) that measure the beam's position throughout the transport. The analog electronics within each processing module determines the beam position using the log-ratio technique. For system reliability, calibrations compensate for various temperature drifts and other imperfections in the processing electronics components. Additionally, verifications are periodically implemented by a PC running a National Instruments LabVIEW virtual instrument (VI) to verify continued system and cable integrity. The VI communicates with the processor cards via a PCI/MXI-3 VXI-crate communication module. Previously, accelerator operators performed BPM system calibrations typically once per day while beam was explicitly turned off. One of this new measurement system's unique achievements is its automated calibration and verification capability. Taking advantage of the pulsed nature of the LANSCE-facility beams, the integrated electronics hardware and VI perform calibration and verification operations between beam pulses without interrupting production beam delivery. The design, construction, and performance results of the automated calibration and verification portion of this position measurement system will be the topic of this paper.
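
    The log-ratio technique mentioned here infers beam position from the ratio of the signals on opposing pickup electrodes, approximately x = k * log10(A/B) near the axis; a schematic version with a hypothetical sensitivity constant:

        import math

        def beam_position(signal_a, signal_b, k_mm_per_decade=10.0):
            """Log-ratio position estimate from opposing BPM electrode
            amplitudes; the sensitivity constant is illustrative."""
            return k_mm_per_decade * math.log10(signal_a / signal_b)

        print(beam_position(1.05, 0.95))  # small offset toward electrode A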

  3. Enhanced Verification Test Suite for Physics Simulation Codes

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kamm, J R; Brock, J S; Brandon, S T

    2008-10-10

    This document discusses problems with which to augment, in quantity and in quality, the existing tri-laboratory suite of verification problems used by Los Alamos National Laboratory (LANL), Lawrence Livermore National Laboratory (LLNL), and Sandia National Laboratories (SNL). The purpose of verification analysis is to demonstrate whether the numerical results of the discretization algorithms in physics and engineering simulation codes provide correct solutions of the corresponding continuum equations. The key points of this document are: (1) Verification deals with the mathematical correctness of the numerical algorithms in a code, while validation deals with the physical correctness of a simulation in a regime of interest. This document is about verification. (2) The current seven-problem Tri-Laboratory Verification Test Suite, which has been used for approximately five years at the DOE WP laboratories, is limited. (3) Both the methodology for and the technology used in verification analysis have evolved and been improved since the original test suite was proposed. (4) The proposed test problems are in three basic areas: (a) hydrodynamics; (b) transport processes; and (c) dynamic strength of materials. (5) For several of the proposed problems we provide a 'strong-sense verification benchmark', consisting of (i) a clear mathematical statement of the problem with sufficient information to run a computer simulation, (ii) an explanation of how the code result and benchmark solution are to be evaluated, and (iii) a description of the acceptance criterion for simulation code results. (6) It is proposed that the set of verification test problems with which any particular code is evaluated include some of the problems described in this document. Analysis of the proposed verification test problems constitutes part of a necessary, but not sufficient, step that builds confidence in physics and engineering simulation codes. More complicated test cases, including physics models of

  4. General Environmental Verification Specification

    NASA Technical Reports Server (NTRS)

    Milne, J. Scott, Jr.; Kaufman, Daniel S.

    2003-01-01

    The NASA Goddard Space Flight Center's General Environmental Verification Specification (GEVS) for STS and ELV Payloads, Subsystems, and Components is currently being revised based on lessons learned from GSFC engineering and flight assurance. The GEVS has been used by Goddard flight projects for the past 17 years as a baseline from which to tailor their environmental test programs. A summary of the requirements and updates is presented, along with the rationale behind the changes. The major test areas covered by the GEVS include mechanical, thermal, and EMC, as well as more general requirements for planning and tracking of the verification program.

  5. 22 CFR 123.14 - Import certificate/delivery verification procedure.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... REGULATIONS LICENSES FOR THE EXPORT OF DEFENSE ARTICLES § 123.14 Import certificate/delivery verification procedure. (a) The Import Certificate/Delivery Verification Procedure is designed to assure that a commodity... 22 Foreign Relations 1 2010-04-01 2010-04-01 false Import certificate/delivery verification...

  6. 22 CFR 123.14 - Import certificate/delivery verification procedure.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... REGULATIONS LICENSES FOR THE EXPORT OF DEFENSE ARTICLES § 123.14 Import certificate/delivery verification procedure. (a) The Import Certificate/Delivery Verification Procedure is designed to assure that a commodity... 22 Foreign Relations 1 2011-04-01 2011-04-01 false Import certificate/delivery verification...

  7. DFM flow by using combination between design based metrology system and model based verification at sub-50nm memory device

    NASA Astrophysics Data System (ADS)

    Kim, Cheol-kyun; Kim, Jungchan; Choi, Jaeseung; Yang, Hyunjo; Yim, Donggyu; Kim, Jinwoong

    2007-03-01

    As the minimum transistor length gets smaller, the variation and uniformity of transistor length seriously affect device performance, so the importance of optical proximity correction (OPC) and resolution enhancement technology (RET) cannot be overemphasized. However, the OPC process is regarded by some as a necessary evil for device performance. In fact, every group, including process and design, is interested in the whole-chip CD variation trend and CD uniformity, which represent the real wafer. Recently, design-based metrology systems have become capable of detecting differences between the design database and wafer SEM images, and can extract information on whole-chip CD variation. Based on such results, OPC abnormalities were identified and design feedback items were also disclosed. Other approaches have been pursued by EDA companies, such as model-based OPC verification, in which the full chip area is verified using a well-calibrated model. The object of model-based verification is the prediction of potential weak points on the wafer and fast feedback to OPC and design before reticle fabrication. In order to achieve robust design and sufficient device margin, an appropriate combination of a design-based metrology system and model-based verification tools is very important. Therefore, we evaluated a design-based metrology system and matched it with a model-based verification system to find the optimum combination of the two. In our study, a large amount of wafer-result data is classified and analyzed by statistical methods and sorted into OPC feedback and design feedback items. Additionally, a novel DFM flow is proposed that uses the combination of design-based metrology and model-based verification tools.

  8. Nonlinear 3D MHD verification study: SpeCyl and PIXIE3D codes for RFP and Tokamak plasmas

    NASA Astrophysics Data System (ADS)

    Bonfiglio, D.; Cappello, S.; Chacon, L.

    2010-11-01

    A strong emphasis is presently placed in the fusion community on reaching predictive capability of computational models. An essential requirement of such an endeavor is the process of assessing the mathematical correctness of computational tools, termed verification [1]. We present here a successful nonlinear cross-benchmark verification study between the 3D nonlinear MHD codes SpeCyl [2] and PIXIE3D [3]. Excellent quantitative agreement is obtained in both 2D and 3D nonlinear visco-resistive dynamics for reversed-field pinch (RFP) and tokamak configurations [4]. RFP dynamics, in particular, lends itself as an ideal, non-trivial test-bed for 3D nonlinear verification. Perspectives for future application of the fully implicit parallel code PIXIE3D to RFP physics, in particular to address open issues in RFP helical self-organization, will be provided. [1] M. Greenwald, Phys. Plasmas 17, 058101 (2010) [2] S. Cappello and D. Biskamp, Nucl. Fusion 36, 571 (1996) [3] L. Chacón, Phys. Plasmas 15, 056103 (2008) [4] D. Bonfiglio, L. Chacón and S. Cappello, Phys. Plasmas 17 (2010)

  9. Critical Surface Cleaning and Verification Alternatives

    NASA Technical Reports Server (NTRS)

    Melton, Donald M.; McCool, A. (Technical Monitor)

    2000-01-01

    As a result of federal and state requirements, historical critical-cleaning and verification solvents such as Freon 113, Freon TMC, and Trichloroethylene (TCE) are either highly regulated or no longer available. Interim replacements such as HCFC 225 have been qualified; however, toxicity and future phase-out regulations necessitate long-term solutions. The scope of this project was to qualify a safe and environmentally compliant LOX surface-verification alternative to Freon 113, TCE, and HCFC 225. The main effort was focused on initiating the evaluation and qualification of HCFC 225G as an alternate LOX verification solvent. The project was scoped in FY 99/00 to perform LOX compatibility, cleaning efficiency, and qualification on flight hardware.

  10. 24 CFR 5.512 - Verification of eligible immigration status.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... immigration status. 5.512 Section 5.512 Housing and Urban Development Office of the Secretary, Department of... Noncitizens § 5.512 Verification of eligible immigration status. (a) General. Except as described in paragraph...) Primary verification—(1) Automated verification system. Primary verification of the immigration status of...

  11. Formal verification of medical monitoring software using Z language: a representative sample.

    PubMed

    Babamir, Seyed Morteza; Borhani, Mehdi

    2012-08-01

    Medical monitoring systems are useful aids that assist physicians in keeping patients under constant surveillance; however, whether the systems take sound decisions is a physician's concern. As a result, verification of the systems' behavior in monitoring patients is a matter of significance. In modern medical systems, patient monitoring is undertaken by software, so software verification of such systems has attracted attention. This verification can be achieved with formal languages, which have mathematical foundations; among others, the Z language is a suitable formal language that has been used for formal verification of systems. This study aims to present a constructive method to verify a representative sample of a medical system, in which the system is visually specified and formally verified against patient constraints stated in the Z language. Exploiting our past experience in formally modeling the Continuous Infusion Insulin Pump (CIIP), we take the CIIP system as a representative sample of medical systems in this study. The system is responsible for monitoring a diabetic's blood sugar.
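
    The flavor of the constraint being verified, though in Z it is expressed as schemas and discharged by proof rather than runtime checks, can be suggested in a few lines; the bounds and dosing rule are invented for illustration:

        # Hypothetical safety constraint for a CIIP-like monitor: insulin may
        # only be dispensed when measured blood sugar exceeds the safe range.
        SAFE_MAX = 10.0  # mmol/L, illustrative bound

        def dose_allowed(blood_sugar, proposed_dose):
            if proposed_dose < 0:
                return False               # doses are non-negative
            if blood_sugar <= SAFE_MAX:
                return proposed_dose == 0  # no insulin in or below range
            return True                    # hyperglycemia: dosing permitted

        assert dose_allowed(5.2, 0)
        assert not dose_allowed(3.1, 2.0)  # would violate the invariant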

  12. Simulation-based MDP verification for leading-edge masks

    NASA Astrophysics Data System (ADS)

    Su, Bo; Syrel, Oleg; Pomerantsev, Michael; Hagiwara, Kazuyuki; Pearman, Ryan; Pang, Leo; Fujimara, Aki

    2017-07-01

    For IC design starts below the 20nm technology node, the assist features on photomasks shrink well below 60nm, and the printed patterns of those features on masks written by VSB eBeam writers start to show a large deviation from the mask designs. Traditional geometry-based fracturing starts to show large errors for those small features. As a result, other mask data preparation (MDP) methods have become available and been adopted, such as rule-based Mask Process Correction (MPC), model-based MPC, and eventually model-based MDP. The new MDP methods may place shot edges slightly differently from the target to compensate for mask process effects, so that the final patterns on a mask are much closer to the design (which can be viewed as the ideal mask), especially for the assist features. Such an alteration generally produces better masks that are closer to the intended mask design. Traditional XOR-based MDP verification cannot detect problems caused by eBeam effects. Much like model-based OPC verification, which became a necessity for OPC a decade ago, we see the same trend in MDP today. A simulation-based MDP verification solution requires a GPU-accelerated computational geometry engine with simulation capabilities. To have a meaningful simulation-based mask check, a good mask process model is needed. The TrueModel® system is a field-tested physical mask model developed by D2S. The GPU-accelerated D2S Computational Design Platform (CDP) is used to run simulation-based mask checks, as well as model-based MDP. In addition to simulation-based checks such as mask EPE or dose margin, geometry-based rules are also available to detect quality issues such as slivers or CD splits. Dose-margin-related hotspots can also be detected by setting a correct detection threshold. In this paper, we demonstrate GPU acceleration for geometry processing and give examples of mask check results and performance data. GPU acceleration is necessary to make simulation-based mask MDP verification

  13. High-resolution face verification using pore-scale facial features.

    PubMed

    Li, Dong; Zhou, Huiling; Lam, Kin-Man

    2015-08-01

    Face recognition methods, which usually represent face images using holistic or local facial features, rely heavily on alignment. Their performance also suffers severe degradation under variations in expression or pose, especially when there is only one gallery image per subject. With the easy access to high-resolution (HR) face images nowadays, some HR face databases have recently been developed. However, few studies have tackled the use of HR information for face recognition or verification. In this paper, we propose a pose-invariant face-verification method, robust to alignment errors, that uses the HR information in pore-scale facial features. A new keypoint descriptor, namely the pore-Principal Component Analysis-Scale Invariant Feature Transform (PPCASIFT), adapted from PCA-SIFT, is devised for the extraction of a compact set of distinctive pore-scale facial features. Having matched the pore-scale features of two face regions, an effective robust-fitting scheme is proposed for the face-verification task. Experiments show that, with only one frontal-view gallery image per subject, our proposed method outperforms a number of standard verification methods and can achieve excellent accuracy even when the faces are under large variations in expression and pose.

  14. The monitoring and verification of nuclear weapons

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Garwin, Richard L., E-mail: RLG2@us.ibm.com

    2014-05-09

    This paper partially reviews and updates the potential for monitoring and verification of nuclear weapons, including verification of their destruction. Cooperative monitoring with templates of the gamma-ray spectrum is an important tool, dependent on the use of information barriers.

  15. ENVIRONMENTAL TECHNOLOGY VERIFICATION: ADD-ON NOX CONTROLS

    EPA Science Inventory

    The paper discusses the environmental technology verification (ETV) of add-on nitrogen oxide (NOx) controls. Research Triangle Institute (RTI) is EPA's cooperating partner for the Air Pollution Control Technology (APCT) Program, one of a dozen ETV pilot programs. Verification of ...

  16. Verification and quality control of routine hematology analyzers.

    PubMed

    Vis, J Y; Huisman, A

    2016-05-01

    Verification of hematology analyzers (automated blood cell counters) is mandatory before new hematology analyzers may be used in routine clinical care. The verification process consists of several items, comprising among others: precision, accuracy, comparability, carryover, background, and linearity throughout the expected range of results. Yet which standard should be met, or which verification limit should be used, is at the discretion of the laboratory specialist. This paper offers practical guidance on verification and quality control of automated hematology analyzers and provides an expert opinion on the performance standard that should be met by the contemporary generation of hematology analyzers. To this end, (i) the state-of-the-art performance of hematology analyzers for complete blood count parameters is summarized, (ii) considerations, challenges, and pitfalls concerning the development of a verification plan are discussed, (iii) guidance is given regarding the establishment of reference intervals, and (iv) different methods of quality control for hematology analyzers are reviewed. © 2016 John Wiley & Sons Ltd.
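
    Of the listed verification items, carryover has a compact standard formulation: run a high sample in triplicate followed by a low sample in triplicate and compare the first and last low results. The formula below is a commonly used version and should be checked against the laboratory's own protocol; the counts are invented.

        def carryover_pct(high_runs, low_runs):
            """Carryover (%) from three consecutive high then three low
            measurements: (L1 - L3) / (H3 - L3) * 100."""
            h3, (l1, _, l3) = high_runs[2], low_runs
            return 100.0 * (l1 - l3) / (h3 - l3)

        # Hypothetical WBC counts (10^9/L).
        print(f"{carryover_pct([98.5, 99.1, 98.8], [0.32, 0.25, 0.24]):.3f} %")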

  17. Multi-canister overpack project -- verification and validation, MCNP 4A

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Goldmann, L.H.

    This supporting document contains the software verification and validation (V&V) package used for Phase 2 design of the Spent Nuclear Fuel Multi-Canister Overpack; V&V packages for both ANSYS and MCNP are included. Description of verification run(s): This software must be compiled specifically for the machine on which it is to be used. Therefore, to facilitate the verification process, the software automatically runs 25 sample problems to ensure proper installation and compilation. Once the runs are completed, the software checks for verification by performing a file comparison between the new output file and the old output file. Any difference between any of the files will cause a verification error. Because of the manner in which the verification is completed, a verification error does not necessarily indicate a problem; it indicates that a closer look at the output files is needed to determine the cause of the error.

  18. A UVM simulation environment for the study, optimization and verification of HL-LHC digital pixel readout chips

    NASA Astrophysics Data System (ADS)

    Marconi, S.; Conti, E.; Christiansen, J.; Placidi, P.

    2018-05-01

    The operating conditions of the High Luminosity upgrade of the Large Hadron Collider are very demanding for the design of next generation hybrid pixel readout chips in terms of particle rate, radiation level and data bandwidth. To this purpose, the RD53 Collaboration has developed for the ATLAS and CMS experiments a dedicated simulation and verification environment using industry-consolidated tools and methodologies, such as SystemVerilog and the Universal Verification Methodology (UVM). This paper presents how the so-called VEPIX53 environment has first guided the design of digital architectures, optimized for processing and buffering very high particle rates, and secondly how it has been reused for the functional verification of the first large scale demonstrator chip designed by the collaboration, which has recently been submitted.

  19. Design of verification platform for wireless vision sensor networks

    NASA Astrophysics Data System (ADS)

    Ye, Juanjuan; Shang, Fei; Yu, Chuang

    2017-08-01

    At present, the majority of research on wireless vision sensor networks (WVSNs) remains in the software-simulation stage, and very few verification platforms for WVSNs are available. This situation seriously restricts the transformation of WVSN theory into practical application. Therefore, it is necessary to study the construction of a verification platform for WVSNs. This paper combines a wireless transceiver module, a visual-information acquisition module, and a power acquisition module to design a high-performance wireless vision sensor node built around an ARM11 microprocessor, and selects AODV as the routing protocol to set up a verification platform for WVSNs called AdvanWorks. Experiments show that AdvanWorks can successfully achieve image acquisition, coding, and wireless transmission, and can obtain the effective distance parameters between nodes, which lays a good foundation for follow-up applications of WVSNs.

  20. Cognitive Bias in Systems Verification

    NASA Technical Reports Server (NTRS)

    Larson, Steve

    2012-01-01

    Working definition of cognitive bias: patterns by which information is sought and interpreted that can lead to systematic errors in decisions. Cognitive bias is studied in diverse fields: economics, politics, intelligence, and marketing, to name a few. Attempts to ground cognitive science in physical characteristics of the cognitive apparatus exceed our current knowledge. Studies are based on correlations; strict cause and effect is difficult to pinpoint. The effects cited in the paper and discussed here have been replicated many times over, and appear sound. Many biases have been described, but it is still unclear whether they are all distinct; there may only be a handful of fundamental biases, which manifest in various ways. Bias can affect system verification in many ways: overconfidence -> questionable decisions to deploy; availability -> inability to conceive critical tests; representativeness -> overinterpretation of results; positive test strategies -> confirmation bias. Debiasing at the individual level is very difficult. The potential effect of bias on the verification process can be managed, but not eliminated, and is worth considering at key points in the process.

  1. ENVIRONMENTAL TECHNOLOGY VERIFICATION FOR INDOOR AIR PRODUCTS

    EPA Science Inventory

    The paper discusses environmental technology verification (ETV) for indoor air products. RTI is developing the framework for a verification testing program for indoor air products, as part of EPA's ETV program. RTI is establishing test protocols for products that fit into three...

  2. IMPROVING AIR QUALITY THROUGH ENVIRONMENTAL TECHNOLOGY VERIFICATIONS

    EPA Science Inventory

    The U.S. Environmental Protection Agency (EPA) began the Environmental Technology Verification (ETV) Program in 1995 as a means of working with the private sector to establish a market-based verification process available to all environmental technologies. Under EPA's Office of R...

  3. Glove-based approach to online signature verification.

    PubMed

    Kamel, Nidal S; Sayeed, Shohel; Ellis, Grant A

    2008-06-01

    Utilizing the multiple degrees of freedom offered by the data glove for each finger and the hand, a novel online signature-verification system using the Singular Value Decomposition (SVD) numerical tool for signature classification and verification is presented. The proposed technique uses the SVD to find the r singular vectors that sense the maximal energy of the glove data matrix A, called the principal subspace, so that the effective dimensionality of A can be reduced. Having modeled the data-glove signature through its r-principal subspace, signature authentication is performed by finding the angles between the different subspaces. A demonstration of the data glove as an effective high-bandwidth data-entry device for signature verification is presented. This SVD-based signature-verification technique is tested, and its performance is shown to be able to detect forged signatures with a false acceptance rate of less than 1.2%.
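
    The core computation can be sketched as follows, assuming each signature is recorded as a glove-data matrix A: take the r leading left singular vectors as the principal subspace, then compare two signatures by the principal angles between their subspaces. The choice r = 3 and the final distance (the norm of the angle vector) are assumptions for illustration, not the authors' exact formulation.

      import numpy as np
      from scipy.linalg import subspace_angles

      def principal_subspace(A, r):
          """r leading left singular vectors of the glove-data matrix A,
          i.e., its r-dimensional principal subspace."""
          U, _, _ = np.linalg.svd(A, full_matrices=False)
          return U[:, :r]

      def signature_distance(A_ref, A_test, r=3):
          """Principal angles between the two subspaces; small angles
          (small distance) indicate similar signatures."""
          angles = subspace_angles(principal_subspace(A_ref, r),
                                   principal_subspace(A_test, r))
          return float(np.linalg.norm(angles))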

  4. A methodology for model-based development and automated verification of software for aerospace systems

    NASA Astrophysics Data System (ADS)

    Martin, L.; Schatalov, M.; Hagner, M.; Goltz, U.; Maibaum, O.

    Today's software for aerospace systems is typically very complex. This is due to the increasing number of features as well as the high demands for safety, reliability, and quality. This complexity also leads to significantly higher software development costs. To handle the software complexity, a structured development process is necessary. Additionally, compliance with relevant standards for quality assurance is a mandatory concern. To assure high software quality, techniques for verification are necessary. Besides traditional techniques like testing, automated verification techniques like model checking are becoming more popular. The latter examine the whole state space and, consequently, result in full test coverage. Nevertheless, despite the obvious advantages, this technique is not yet widely used for the development of aerospace systems. In this paper, we propose a tool-supported methodology for the development and formal verification of safety-critical software in the aerospace domain. The methodology relies on the V-Model and defines a comprehensive workflow for model-based software development as well as automated verification in compliance with the European standard series ECSS-E-ST-40C. Furthermore, our methodology supports the generation and deployment of code. For tool support we use SCADE Suite (Esterel Technologies), an integrated design environment that covers all the requirements of our methodology. The SCADE Suite is well established in the avionics and defense, rail transportation, energy, and heavy equipment industries. For evaluation purposes, we apply our approach to an up-to-date case study of the TET-1 satellite bus. In particular, the attitude and orbit control software is considered. The behavioral models for the subsystem are developed, formally verified, and optimized.

  5. Verification of S&D Solutions for Network Communications and Devices

    NASA Astrophysics Data System (ADS)

    Rudolph, Carsten; Compagna, Luca; Carbone, Roberto; Muñoz, Antonio; Repp, Jürgen

    This chapter describes the tool-supported verification of S&D Solutions at the level of network communications and devices. First, the general goals and challenges of verification in the context of AmI systems are highlighted, and the role of verification and validation within the SERENITY processes is explained. Then, SERENITY extensions to the SH Verification Tool are explained using small examples. Finally, the applicability of existing verification tools is discussed in the context of the AVISPA toolset. The two different tools show that, for the security analysis of S&D Patterns for networks and devices, relevant complementary approaches exist and can be used.

  6. Test/QA Plan for Verification of Semi-Continuous Ambient Air Monitoring Systems - Second Round

    EPA Science Inventory

    Test/QA Plan for Verification of Semi-Continuous Ambient Air Monitoring Systems - Second Round. Changes reflect performance of second round of testing at new location and with various changes to personnel. Additional changes reflect general improvements to the Version 1 test/QA...

  7. Formal Multilevel Hierarchical Verification of Synchronous MOS VLSI Circuits.

    DTIC Science & Technology

    1987-06-01

    [Table-of-contents fragment: 12.4 Capacitance Coupling; 12.5 Multiple Abstraction Functions] ...depend on whether it is performing flat verification or hierarchical verification. The primary operations of Silica Pithecus when performing flat... signals never arise. The primary operation of Silica Pithecus when performing hierarchical verification is processing constraints to show they hold.

  8. 40 CFR 1066.250 - Base inertia verification.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... 40 Protection of Environment 34 2013-07-01 2013-07-01 false Base inertia verification. 1066.250... CONTROLS VEHICLE-TESTING PROCEDURES Dynamometer Specifications § 1066.250 Base inertia verification. (a) Overview. This section describes how to verify the dynamometer's base inertia. (b) Scope and frequency...

  9. 40 CFR 1066.250 - Base inertia verification.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... 40 Protection of Environment 34 2012-07-01 2012-07-01 false Base inertia verification. 1066.250... CONTROLS VEHICLE-TESTING PROCEDURES Dynamometer Specifications § 1066.250 Base inertia verification. (a) Overview. This section describes how to verify the dynamometer's base inertia. (b) Scope and frequency...

  10. 40 CFR 1066.250 - Base inertia verification.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... 40 Protection of Environment 33 2014-07-01 2014-07-01 false Base inertia verification. 1066.250... CONTROLS VEHICLE-TESTING PROCEDURES Dynamometer Specifications § 1066.250 Base inertia verification. (a) Overview. This section describes how to verify the dynamometer's base inertia. (b) Scope and frequency...

  11. Trajectory Based Behavior Analysis for User Verification

    NASA Astrophysics Data System (ADS)

    Pao, Hsing-Kuo; Lin, Hong-Yi; Chen, Kuan-Ta; Fadlil, Junaidillah

    Many of our activities on computers require a verification step for authorized access. The goal of verification is to tell apart the true account owner from intruders. We propose a general approach for user verification based on user trajectory inputs. The approach is labor-free for users and is likely to avoid possible copying or simulation by other non-authorized users or even automatic programs like bots. Our study focuses on finding the hidden patterns embedded in the trajectories produced by account users. We employ a Markov chain model with Gaussian distributions in its transitions to describe the behavior in the trajectory. To distinguish between two trajectories, we propose a novel dissimilarity measure combined with a manifold-learnt tuning for catching the pairwise relationship. Based on the pairwise relationship, we can plug in any effective classification or clustering method for the detection of unauthorized access. The method can also be applied to the task of recognition, predicting the trajectory type without a pre-defined identity. Given a trajectory input, the results show that the proposed method can accurately verify the user identity, or suggest who owns the trajectory if the input identity is not provided.
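
    A minimal sketch of the modeling idea, simplified to a single-state Gaussian transition model (the paper's Markov chain has multiple states, and its manifold-learnt dissimilarity is not reproduced here): fit a Gaussian to the enrolled user's step vectors, then score a new trajectory by its average log-likelihood under that model.

      import numpy as np

      def fit_transition_model(traj):
          """Fit a Gaussian to the step vectors of a 2-D trajectory
          (rows are successive positions)."""
          steps = np.diff(traj, axis=0)
          mu = steps.mean(axis=0)
          cov = np.cov(steps.T) + 1e-6 * np.eye(2)   # regularize
          return mu, cov

      def avg_log_likelihood(traj, mu, cov):
          """Average Gaussian log-likelihood of the trajectory's steps;
          low values suggest the trajectory is not the enrolled user's."""
          steps = np.diff(traj, axis=0)
          inv = np.linalg.inv(cov)
          logdet = np.linalg.slogdet(cov)[1]
          d = steps - mu
          quad = np.einsum("ij,jk,ik->i", d, inv, d)
          return float(np.mean(-0.5 * (quad + logdet + 2 * np.log(2 * np.pi))))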

  12. Online 3D EPID-based dose verification: Proof of concept

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Spreeuw, Hanno; Rozendaal, Roel, E-mail: r.rozenda

    Purpose: Delivery errors during radiotherapy may lead to medical harm and reduced life expectancy for patients. Such serious incidents can be avoided by performing dose verification online, i.e., while the patient is being irradiated, creating the possibility of halting the linac in case of a large overdosage or underdosage. The offline EPID-based 3D in vivo dosimetry system clinically employed at our institute is in principle suited for online treatment verification, provided the system is able to complete 3D dose reconstruction and verification within 420 ms, the present acquisition time of a single EPID frame. It is the aim of this study to show that our EPID-based dosimetry system can be made fast enough to achieve online 3D in vivo dose verification. Methods: The current dose verification system was sped up in two ways. First, a new software package was developed to perform all computations that are not dependent on portal image acquisition separately, thus removing the need for doing these calculations in real time. Second, the 3D dose reconstruction algorithm was sped up via a new, multithreaded implementation. Dose verification was implemented by comparing planned with reconstructed 3D dose distributions delivered to two regions in a patient: the target volume and the nontarget volume receiving at least 10 cGy. In both volumes, the mean dose is compared, while in the nontarget volume, the near-maximum dose (D2) is compared as well. The real-time dosimetry system was tested by irradiating an anthropomorphic phantom with three VMAT plans: a 6 MV head-and-neck treatment plan, a 10 MV rectum treatment plan, and a 10 MV prostate treatment plan. In all plans, two types of serious delivery errors were introduced. The functionality of automatically halting the linac was also implemented and tested. Results: The precomputation time per treatment was ∼180 s/treatment arc, depending on gantry angle resolution. The complete processing of a single portal frame

  13. Online 3D EPID-based dose verification: Proof of concept.

    PubMed

    Spreeuw, Hanno; Rozendaal, Roel; Olaciregui-Ruiz, Igor; González, Patrick; Mans, Anton; Mijnheer, Ben; van Herk, Marcel

    2016-07-01

    Delivery errors during radiotherapy may lead to medical harm and reduced life expectancy for patients. Such serious incidents can be avoided by performing dose verification online, i.e., while the patient is being irradiated, creating the possibility of halting the linac in case of a large overdosage or underdosage. The offline EPID-based 3D in vivo dosimetry system clinically employed at our institute is in principle suited for online treatment verification, provided the system is able to complete 3D dose reconstruction and verification within 420 ms, the present acquisition time of a single EPID frame. It is the aim of this study to show that our EPID-based dosimetry system can be made fast enough to achieve online 3D in vivo dose verification. The current dose verification system was sped up in two ways. First, a new software package was developed to perform all computations that are not dependent on portal image acquisition separately, thus removing the need for doing these calculations in real time. Second, the 3D dose reconstruction algorithm was sped up via a new, multithreaded implementation. Dose verification was implemented by comparing planned with reconstructed 3D dose distributions delivered to two regions in a patient: the target volume and the nontarget volume receiving at least 10 cGy. In both volumes, the mean dose is compared, while in the nontarget volume, the near-maximum dose (D2) is compared as well. The real-time dosimetry system was tested by irradiating an anthropomorphic phantom with three VMAT plans: a 6 MV head-and-neck treatment plan, a 10 MV rectum treatment plan, and a 10 MV prostate treatment plan. In all plans, two types of serious delivery errors were introduced. The functionality of automatically halting the linac was also implemented and tested. The precomputation time per treatment was ∼180 s/treatment arc, depending on gantry angle resolution. The complete processing of a single portal frame, including dose verification, took
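
    The comparison described in these two records can be illustrated with a short sketch over planned and reconstructed 3-D dose grids: mean dose in the target and non-target volumes, plus the near-maximum dose D2 in the non-target volume. The masks, the 98th-percentile stand-in for D2, and the 5% tolerance are assumptions for illustration, not values prescribed by the paper.

      import numpy as np

      def dose_checks(planned, reconstructed, target_mask, nontarget_mask,
                      tol=0.05):
          """Flag dose metrics whose relative deviation between planned and
          EPID-reconstructed dose exceeds the tolerance."""
          pairs = {
              "target_mean": (planned[target_mask].mean(),
                              reconstructed[target_mask].mean()),
              "nontarget_mean": (planned[nontarget_mask].mean(),
                                 reconstructed[nontarget_mask].mean()),
              # D2: dose to the hottest 2% of voxels (near-maximum dose)
              "nontarget_D2": (np.percentile(planned[nontarget_mask], 98),
                               np.percentile(reconstructed[nontarget_mask], 98)),
          }
          return {name: abs(rec - plan) / plan > tol
                  for name, (plan, rec) in pairs.items()}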

  14. 47 CFR 2.952 - Limitation on verification.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 47 Telecommunication 1 2010-10-01 2010-10-01 false Limitation on verification. 2.952 Section 2.952 Telecommunication FEDERAL COMMUNICATIONS COMMISSION GENERAL FREQUENCY ALLOCATIONS AND RADIO TREATY MATTERS; GENERAL... person shall, in any advertising matter, brochure, etc., use or make reference to a verification in a...

  15. Response to "Improving Patient Safety With Error Identification in Chemotherapy Orders by Verification Nurses"

    PubMed

    Zhu, Ling-Ling; Lv, Na; Zhou, Quan

    2016-12-01

    We read, with great interest, the study by Baldwin and Rodriguez (2016), which describes the role of the verification nurse and details the verification process in identifying errors related to chemotherapy orders. We strongly agree with their finding that a verification nurse, collaborating closely with the prescribing physician, pharmacist, and treating nurse, can better identify errors and maintain safety during chemotherapy administration.

  16. A feasibility study of treatment verification using EPID cine images for hypofractionated lung radiotherapy

    NASA Astrophysics Data System (ADS)

    Tang, Xiaoli; Lin, Tong; Jiang, Steve

    2009-09-01

    We propose a novel approach for potential online treatment verification using cine EPID (electronic portal imaging device) images for hypofractionated lung radiotherapy based on a machine learning algorithm. Hypofractionated radiotherapy requires high precision. It is essential to effectively monitor the target to ensure that the tumor is within the beam aperture. We modeled the treatment verification problem as a two-class classification problem and applied an artificial neural network (ANN) to classify the cine EPID images acquired during the treatment into corresponding classes—with the tumor inside or outside of the beam aperture. Training samples were generated for the ANN using digitally reconstructed radiographs (DRRs) with artificially added shifts in the tumor location—to simulate cine EPID images with different tumor locations. Principal component analysis (PCA) was used to reduce the dimensionality of the training samples and cine EPID images acquired during the treatment. The proposed treatment verification algorithm was tested on five hypofractionated lung patients in a retrospective fashion. On average, our proposed algorithm achieved a 98.0% classification accuracy, a 97.6% recall rate and a 99.7% precision rate. This work was first presented at the Seventh International Conference on Machine Learning and Applications, San Diego, CA, USA, 11-13 December 2008.
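
    A minimal sketch of the classification pipeline, with synthetic stand-ins for the data (in the paper, training samples are DRRs with artificially shifted tumor positions); the layer sizes and number of principal components are illustrative, not the paper's settings.

      import numpy as np
      from sklearn.decomposition import PCA
      from sklearn.neural_network import MLPClassifier
      from sklearn.pipeline import make_pipeline

      rng = np.random.default_rng(0)
      X_train = rng.normal(size=(200, 64 * 64))   # flattened DRR-like images
      y_train = rng.integers(0, 2, size=200)      # 1 = tumor inside aperture

      model = make_pipeline(
          PCA(n_components=20),                   # dimensionality reduction
          MLPClassifier(hidden_layer_sizes=(32,), max_iter=500,
                        random_state=0),          # two-class ANN
      )
      model.fit(X_train, y_train)
      labels = model.predict(rng.normal(size=(5, 64 * 64)))  # cine EPID frames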

  17. Handbook: Design of automated redundancy verification

    NASA Technical Reports Server (NTRS)

    Ford, F. A.; Hasslinger, T. W.; Moreno, F. J.

    1971-01-01

    The use of the handbook is discussed and the design progress is reviewed. A description of the problem is presented, and examples are given to illustrate the necessity for redundancy verification, along with the types of situations to which it is typically applied. Reusable space vehicles, such as the space shuttle, are recognized as being significant in the development of the automated redundancy verification problem.

  18. Earth Science Enterprise Scientific Data Purchase Project: Verification and Validation

    NASA Technical Reports Server (NTRS)

    Jenner, Jeff; Policelli, Fritz; Fletcher, Rosea; Holecamp, Kara; Owen, Carolyn; Nicholson, Lamar; Dartez, Deanna

    2000-01-01

    This paper presents viewgraphs on the Earth Science Enterprise Scientific Data Purchase Project's verification and validation process. The topics include: 1) What is Verification and Validation? 2) Why Verification and Validation? 3) Background; 4) ESE Data Purchase Validation Process; 5) Data Validation System and Ingest Queue; 6) Shipment Verification; 7) Tracking and Metrics; 8) Validation of Contract Specifications; 9) Earth Watch Data Validation; 10) Validation of Vertical Accuracy; and 11) Results of Vertical Accuracy Assessment.

  19. Being known, intimate, and valued: global self-verification and dyadic adjustment in couples and roommates.

    PubMed

    Katz, Jennifer; Joiner, Thomas E

    2002-02-01

    We contend that close relationships provide adults with optimal opportunities for personal growth when relationship partners provide accurate, honest feedback. Accordingly, it was predicted that young adults would experience the greatest relationship quality with partners who evaluated them in a manner consistent with their own self-evaluations. Three empirical tests of this self-verification hypothesis as applied to close dyads were conducted. In Study 1, young adults in dating relationships were most intimate with, and somewhat more committed to, partners when they perceived that those partners evaluated them as they evaluated themselves. Self-verification effects were pronounced for those involved in more serious dating relationships. In Study 2, men reported the greatest esteem for same-sex roommates who evaluated them in a self-verifying manner. Results from Study 2 were replicated and extended to both male and female roommate dyads in Study 3. Further, self-verification effects were most pronounced for young adults with high emotional empathy. Results suggest that self-verification theory is useful for understanding dyadic adjustment across a variety of relational contexts in young adulthood. Implications of self-verification processes for adult personal development are outlined within an identity negotiation framework.

  20. Calibration and verification of models of organic carbon removal kinetics in Aerated Submerged Fixed-Bed Biofilm Reactors (ASFBBR): a case study of wastewater from an oil-refinery.

    PubMed

    Trojanowicz, Karol; Wójcik, Włodzimierz

    2011-01-01

    The article presents a case-study on the calibration and verification of mathematical models of organic carbon removal kinetics in biofilm. The chosen Harremöes and Wanner & Reichert models were calibrated with a set of model parameters obtained both during dedicated studies conducted at pilot- and lab-scales for petrochemical wastewater conditions and from the literature. Next, the models were successfully verified through studies carried out utilizing a pilot ASFBBR type bioreactor installed in an oil-refinery wastewater treatment plant. During verification the pilot biofilm reactor worked under varying surface organic loading rates (SOL), dissolved oxygen concentrations and temperatures. The verification proved that the models can be applied in practice to petrochemical wastewater treatment engineering for e.g. biofilm bioreactor dimensioning.

  1. Land Surface Verification Toolkit (LVT) - A Generalized Framework for Land Surface Model Evaluation

    NASA Technical Reports Server (NTRS)

    Kumar, Sujay V.; Peters-Lidard, Christa D.; Santanello, Joseph; Harrison, Ken; Liu, Yuqiong; Shaw, Michael

    2011-01-01

    Model evaluation and verification are key to improving the usage and applicability of simulation models for real-world applications. In this article, the development and capabilities of a formal system for land surface model evaluation called the Land surface Verification Toolkit (LVT) are described. LVT is designed to provide an integrated environment for systematic land model evaluation and facilitates a range of verification approaches and analysis capabilities. LVT operates across multiple temporal and spatial scales and employs a large suite of in-situ, remotely sensed, and other model and reanalysis datasets in their native formats. In addition to the traditional accuracy-based measures, LVT also includes uncertainty and ensemble diagnostics, information theory measures, spatial similarity metrics, and scale decomposition techniques that provide novel ways of performing diagnostic model evaluations. Though LVT was originally designed to support the land surface modeling and data assimilation framework known as the Land Information System (LIS), it also supports hydrological data products from other, non-LIS environments. In addition, the analysis of diagnostics from various computational subsystems of LIS, including data assimilation, optimization, and uncertainty estimation, is supported within LVT. Together, LIS and LVT provide a robust end-to-end environment for enabling the concepts of model-data fusion for hydrological applications. The evolving capabilities of the LVT framework are expected to facilitate rapid model evaluation efforts and aid the definition and refinement of formal evaluation procedures for the land surface modeling community.
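
    Three of the traditional accuracy-based measures such a toolkit reports can be sketched in a few lines on collocated model/observation arrays; this is a generic illustration, not LVT's API.

      import numpy as np

      def accuracy_metrics(model, obs):
          """Bias, RMSE, and correlation between model output and a
          reference observation series."""
          diff = model - obs
          return {
              "bias": float(diff.mean()),
              "rmse": float(np.sqrt((diff ** 2).mean())),
              "corr": float(np.corrcoef(model, obs)[0, 1]),
          }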

  2. Self-verification and depression among youth psychiatric inpatients.

    PubMed

    Joiner, T E; Katz, J; Lew, A S

    1997-11-01

    According to self-verification theory (e.g., W.B. Swann, 1983), people are motivated to preserve stable self-concepts by seeking self-confirming interpersonal responses, even if the responses are negative. In the current study of 72 youth psychiatric inpatients (36 boys; 36 girls; ages 7-17, M = 13.18; SD = 2.59), the authors provide the 1st test of self-verification theory among a youth sample. Participants completed self-report questionnaires on depression, self-esteem, anxiety, negative and positive affect, and interest in negative feedback from others. The authors made chart diagnoses available, and they collected peer rejection ratings. Consistent with hypotheses, the authors found that interest in negative feedback was associated with depression, was predictive of peer rejection (but only within relatively longer peer relationships), was more highly related to cognitive than emotional aspects of depression, and was specifically associated with depression, rather than being generally associated with emotional distress. The authors discuss implications for self-verification theory and for the phenomenology of youth depression.

  3. Formal specification and verification of Ada software

    NASA Technical Reports Server (NTRS)

    Hird, Geoffrey R.

    1991-01-01

    The use of formal methods in software development achieves levels of quality assurance unobtainable by other means. The Larch approach to specification is described, and the specification of avionics software designed to implement the logic of a flight control system is given as an example. Penelope, an Ada verification environment, is described. The Penelope user inputs mathematical definitions, Larch-style specifications, and Ada code, and performs machine-assisted proofs that the code obeys its specifications. As an example, the verification of a binary search function is considered. Emphasis is given to techniques assisting the reuse of a verification effort on modified code.

  4. Formal verification of mathematical software

    NASA Technical Reports Server (NTRS)

    Sutherland, D.

    1984-01-01

    Methods are investigated for formally specifying and verifying the correctness of mathematical software (software which uses floating point numbers and arithmetic). Previous work in the field was reviewed. A new model of floating point arithmetic called the asymptotic paradigm was developed and formalized. Two different conceptual approaches to program verification, the classical Verification Condition approach and the more recently developed Programming Logic approach, were adapted to use the asymptotic paradigm. These approaches were then used to verify several programs; the programs chosen were simplified versions of actual mathematical software.

  5. Current status of verification practices in clinical biochemistry in Spain.

    PubMed

    Gómez-Rioja, Rubén; Alvarez, Virtudes; Ventura, Montserrat; Alsina, M Jesús; Barba, Núria; Cortés, Mariano; Llopis, María Antonia; Martínez, Cecilia; Ibarz, Mercè

    2013-09-01

    Verification uses logical algorithms to detect potential errors before laboratory results are released to the clinician. Even though verification is one of the main processes in all laboratories, there is a lack of standardization, mainly in the algorithms used and in the criteria and verification limits applied. A survey of clinical laboratories in Spain was conducted in order to assess the verification process, particularly the use of autoverification. Questionnaires were sent to the laboratories involved in the External Quality Assurance Program organized by the Spanish Society of Clinical Biochemistry and Molecular Pathology. Seven common biochemical parameters were included (glucose, cholesterol, triglycerides, creatinine, potassium, calcium, and alanine aminotransferase). Completed questionnaires were received from 85 laboratories. Nearly all the laboratories reported using the following seven verification criteria: internal quality control, instrument warnings, sample deterioration, reference limits, clinical data, concordance between parameters, and verification of results. The use of each verification criterion varied according to the type of verification (automatic, technical, or medical). Verification limits for these parameters are similar to biological reference ranges. Delta check was used in 24% of laboratories. Most laboratories (64%) reported using autoverification systems. Autoverification use was related to laboratory size, ownership, and type of laboratory information system, but the amount of use (percentage of tests autoverified) was not related to laboratory size. A total of 36% of Spanish laboratories do not use autoverification, despite the general implementation of laboratory information systems, most of which have autoverification capability. Criteria and rules for seven routine biochemical tests were obtained.
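
    A toy sketch of the rule structure such autoverification systems combine, here for a single potassium result; the limits and the delta-check cutoff are illustrative only and are not clinical recommendations.

      def autoverify_potassium(result, previous=None):
          """Release a potassium result (mmol/L) only if it passes the
          verification limits and, when available, a delta check against
          the patient's previous value."""
          REF_LOW, REF_HIGH = 2.5, 6.5    # illustrative verification limits
          DELTA_MAX = 1.0                 # illustrative delta-check cutoff
          if not (REF_LOW <= result <= REF_HIGH):
              return "hold: outside verification limits"
          if previous is not None and abs(result - previous) > DELTA_MAX:
              return "hold: delta check failed"
          return "release"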

  6. Dynamic testing for shuttle design verification

    NASA Technical Reports Server (NTRS)

    Green, C. E.; Leadbetter, S. A.; Rheinfurth, M. H.

    1972-01-01

    Space shuttle design verification requires dynamic data from full scale structural component and assembly tests. Wind tunnel and other scaled model tests are also required early in the development program to support the analytical models used in design verification. Presented is a design philosophy based on mathematical modeling of the structural system strongly supported by a comprehensive test program; some of the types of required tests are outlined.

  7. GHG MITIGATION TECHNOLOGY PERFORMANCE EVALUATIONS UNDERWAY AT THE GHG TECHNOLOGY VERIFICATION CENTER

    EPA Science Inventory

    The paper outlines the verification approach and activities of the Greenhouse Gas (GHG) Technology Verification Center, one of 12 independent verification entities operating under the U.S. EPA-sponsored Environmental Technology Verification (ETV) program. (NOTE: The ETV program...

  8. INF verification: a guide for the perplexed

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mendelsohn, J.

    1987-09-01

    The administration has dug itself some deep holes on the verification issue. It will have to conclude an arms control treaty without having resolved earlier (but highly questionable) compliance issues on which it has placed great emphasis. It will probably have to abandon its more sweeping (and unnecessary) on-site inspection (OSI) proposals because of adverse security and political implications for the United States and its allies. And, finally, it will probably have to present to the Congress an INF treaty that will provide for a considerably less-stringent (but nonetheless adequate) verification regime than it had originally demanded. It is difficult to dispel the impression that, when the likelihood of concluding an INF treaty seemed remote, the administration indulged its penchant for intrusive and non-negotiable verification measures. As the possibility of, and eagerness for, a treaty increased, and as the Soviet Union shifted its policy from one of resistance to OSI to one of indicating that on-site verification involved reciprocal obligations, the administration was forced to scale back its OSI rhetoric. This re-evaluation of OSI by the administration does not make the INF treaty any less verifiable; from the outset the Reagan administration was asking for a far more extensive verification package than was necessary, practicable, acceptable, or negotiable.

  9. WTS-4 system verification unit for wind/hydroelectric integration study

    NASA Technical Reports Server (NTRS)

    Watts, A. W.

    1982-01-01

    The Bureau of Reclamation (Reclamation) initiated a study to investigate the concept of integrating 100 MW of wind energy from megawatt-size wind turbines with the Federal hydroelectric system. As a part of the study, one large wind turbine was purchased through the competitive bid process and is now being installed to serve as a system verification unit (SVU). Reclamation negotiated an agreement with NASA to provide technical management of the project for the design, fabrication, installation, testing, and initial operation. Hamilton Standard was awarded a contract to furnish and install its WTS-4 wind turbine rated at 4 MW at a site near Medicine Bow, Wyoming. The purposes for installing the SVU are to fully evaluate the wind/hydro integration concept, make technical evaluation of the hardware design, train personnel in the technology, evaluate operation and maintenance aspects, and evaluate associated environmental impacts. The SVU will be operational in June 1982. Data from the WTS-4 and from a second SVU, Boeing's MOD-2, will be used to prepare a final design for a 100-MW farm if Congress authorizes the project.

  10. 78 FR 58492 - Generator Verification Reliability Standards

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-09-24

    ... Control Functions), MOD-027-1 (Verification of Models and Data for Turbine/Governor and Load Control or...), MOD-027-1 (Verification of Models and Data for Turbine/Governor and Load Control or Active Power... Category B and C contingencies, as required by wind generators in Order No. 661, or that those generators...

  11. 24 CFR 985.3 - Indicators, HUD verification methods and ratings.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... 24 Housing and Urban Development 4 2010-04-01 2010-04-01 false Indicators, HUD verification..., HUD verification methods and ratings. This section states the performance indicators that are used to assess PHA Section 8 management. HUD will use the verification method identified for each indicator in...

  12. A robust method using propensity score stratification for correcting verification bias for binary tests

    PubMed Central

    He, Hua; McDermott, Michael P.

    2012-01-01

    Sensitivity and specificity are common measures of the accuracy of a diagnostic test. The usual estimators of these quantities are unbiased if data on the diagnostic test result and the true disease status are obtained from all subjects in an appropriately selected sample. In some studies, verification of the true disease status is performed only for a subset of subjects, possibly depending on the result of the diagnostic test and other characteristics of the subjects. Estimators of sensitivity and specificity based on this subset of subjects are typically biased; this is known as verification bias. Methods have been proposed to correct verification bias under the assumption that the missing data on disease status are missing at random (MAR), that is, the probability of missingness depends on the true (missing) disease status only through the test result and observed covariate information. When some of the covariates are continuous, or the number of covariates is relatively large, the existing methods require parametric models for the probability of disease or the probability of verification (given the test result and covariates), and hence are subject to model misspecification. We propose a new method for correcting verification bias based on the propensity score, defined as the predicted probability of verification given the test result and observed covariates. This is estimated separately for those with positive and negative test results. The new method classifies the verified sample into several subsamples that have homogeneous propensity scores and allows correction for verification bias. Simulation studies demonstrate that the new estimators are more robust to model misspecification than existing methods, but still perform well when the models for the probability of disease and probability of verification are correctly specified. PMID:21856650
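
    A sketch of the general stratification idea (not the paper's exact estimator, which fits the propensity model separately for test-positive and test-negative subjects): model the probability of verification, cut it into strata, treat the verified subjects within each stratum as representative of that stratum, and recombine.

      import numpy as np
      from sklearn.linear_model import LogisticRegression

      def corrected_sensitivity(test, verified, disease, covariates,
                                n_strata=5):
          """Sensitivity corrected for verification bias by propensity-score
          stratification; `disease` is only meaningful where verified == 1."""
          X = np.column_stack([test, covariates])
          ps = LogisticRegression(max_iter=1000).fit(
              X, verified).predict_proba(X)[:, 1]
          edges = np.quantile(ps, np.linspace(0, 1, n_strata + 1))
          strata = np.clip(np.digitize(ps, edges[1:-1]), 0, n_strata - 1)
          p_td = p_d = 0.0
          for s in range(n_strata):
              in_s = strata == s
              v = in_s & (verified == 1)
              if not v.any():
                  continue
              w = in_s.mean()                          # stratum weight
              p_d += w * (disease[v] == 1).mean()      # P(D=1 | stratum)
              p_td += w * ((disease[v] == 1)
                           & (test[v] == 1)).mean()    # P(T=1, D=1 | stratum)
          return p_td / p_d                            # P(T=1 | D=1)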

  13. ENVIRONMENTAL TECHNOLOGY VERIFICATION PROGRAM FOR MONITORING AND CHARACTERIZATION

    EPA Science Inventory

    The Environmental Technology Verification Program is a service of the Environmental Protection Agency designed to accelerate the development and commercialization of improved environmental technology through third party verification and reporting of performance. The goal of ETV i...

  14. The influence of verification jig on framework fit for nonsegmented fixed implant-supported complete denture.

    PubMed

    Ercoli, Carlo; Geminiani, Alessandro; Feng, Changyong; Lee, Heeje

    2012-05-01

    The purpose of this retrospective study was to assess if there was a difference in the likelihood of achieving passive fit when an implant-supported full-arch prosthesis framework is fabricated with or without the aid of a verification jig. This investigation was approved by the University of Rochester Research Subject Review Board (protocol #RSRB00038482). Thirty edentulous patients, 49 to 73 years old (mean 61 years old), rehabilitated with a nonsegmented fixed implant-supported complete denture were included in the study. During the restorative process, final impressions were made using the pickup impression technique and elastomeric impression materials. For 16 patients, a verification jig was made (group J), while for the remaining 14 patients, a verification jig was not used (group NJ) and the framework was fabricated directly on the master cast. During the framework try-in appointment, the fit was assessed by clinical (Sheffield test) and radiographic inspection and recorded as passive or nonpassive. When a verification jig was used (group J, n = 16), all frameworks exhibited clinically passive fit, while when a verification jig was not used (group NJ, n = 14), only two frameworks fit. This difference was statistically significant (p < .001). Within the limitations of this retrospective study, the fabrication of a verification jig ensured clinically passive fit of metal frameworks in nonsegmented fixed implant-supported complete denture. © 2011 Wiley Periodicals, Inc.
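
    The reported counts (16/16 passive fits with a jig, 2/14 without) can be checked with Fisher's exact test on the 2x2 table. The paper reports p < .001 without naming the test in this record, so Fisher's exact is one reasonable choice for a small 2x2 table, not necessarily the authors' method.

      from scipy.stats import fisher_exact

      table = [[16, 0],    # group J:  passive fit, non-passive fit
               [2, 12]]    # group NJ: passive fit, non-passive fit
      odds_ratio, p_value = fisher_exact(table)
      print(p_value)       # well below .001, consistent with the report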

  15. Land Ice Verification and Validation Kit

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    2015-07-15

    To address a pressing need to better understand the behavior and complex interaction of ice sheets within the global Earth system, significant development of continental-scale, dynamical ice-sheet models is underway. The associated verification and validation process of these models is being coordinated through a new, robust, python-based extensible software package, the Land Ice Verification and Validation toolkit (LIVV). This release provides robust and automated verification and a performance evaluation on LCF platforms. The performance V&V involves a comprehensive comparison of model performance relative to expected behavior on a given computing platform. LIVV operates on a set of benchmark and test data, and provides comparisons for a suite of community-prioritized tests, including configuration and parameter variations, bit-for-bit evaluation, and plots of tests where differences occur.
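
    The bit-for-bit evaluation mentioned above reduces to a simple idea, sketched here independently of LIVV's actual API: two outputs pass only if their bytes are identical, which a cryptographic hash detects cheaply.

      import hashlib
      from pathlib import Path

      def bit_for_bit(file_a, file_b):
          """True only if the test output is byte-identical to the benchmark."""
          digest = lambda p: hashlib.sha256(Path(p).read_bytes()).hexdigest()
          return digest(file_a) == digest(file_b)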

  16. SU-E-T-48: A Multi-Institutional Study of Independent Dose Verification for Conventional, SRS and SBRT

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Takahashi, R; Kamima, T; Tachibana, H

    2015-06-15

    Purpose: To show the results of a multi-institutional study of independent dose verification for conventional, stereotactic radiosurgery and stereotactic body radiotherapy (SRS and SBRT) plans based on the action level of AAPM TG-114. Methods: This study was performed at 12 institutions in Japan. To eliminate the bias of the independent dose verification program (Indp), all of the institutions used the same CT-based independent dose verification software (Simple MU Analysis, Triangle Products, JP) with the Clarkson-based algorithm. Eclipse (AAA, PBC), Pinnacle3 (Adaptive Convolve) and Xio (Superposition) were used as treatment planning systems (TPS). The confidence limits (CL, mean ± 2SD) for 18 sites (head, breast, lung, pelvis, etc.) were evaluated by comparing the dose between the TPS and the Indp. Results: A retrospective analysis of 6352 treatment fields was conducted. The CLs for conventional, SRS and SBRT were 1.0 ± 3.7%, 2.0 ± 2.5% and 6.2 ± 4.4%, respectively. In conventional plans, most of the sites were within the 5% action level of TG-114. However, there were systematic differences (4.0 ± 4.0% and 2.5 ± 5.8% for breast and lung, respectively). In SRS plans, our results showed good agreement with the action level. In SBRT plans, the discrepancy from the Indp varied depending on the dose calculation algorithm of the TPS. Conclusion: The dose calculation algorithms of the TPS and the Indp affect the action level. It is effective to set site-specific tolerances, especially for sites where inhomogeneity correction can strongly affect the dose distribution.
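
    The confidence-limit statistic used above is simple to reproduce; the sample values and the 5% action-level check are illustrative, not data from the study.

      import numpy as np

      def confidence_limit(percent_differences):
          """Mean and 2 SD of the percent dose differences between the TPS
          and the independent verification program (CL = mean +/- 2SD)."""
          d = np.asarray(percent_differences, dtype=float)
          return d.mean(), 2.0 * d.std(ddof=1)

      mean, two_sd = confidence_limit([1.2, -0.4, 3.0, 0.8, -1.5])  # toy data
      exceeds = abs(mean) + two_sd > 5.0   # compare against a 5% action level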

  17. Formal Verification at System Level

    NASA Astrophysics Data System (ADS)

    Mazzini, S.; Puri, S.; Mari, F.; Melatti, I.; Tronci, E.

    2009-05-01

    System Level Analysis calls for a language comprehensible to experts with different backgrounds and yet precise enough to support meaningful analyses. SysML is emerging as an effective balance between such conflicting goals. In this paper we outline some of the results obtained on SysML-based system-level functional formal verification by an ESA/ESTEC study, carried out in collaboration between INTECS and La Sapienza University of Rome. The study focuses on SysML-based system-level functional requirements techniques.

  18. 7 CFR 272.8 - State income and eligibility verification system.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 7 Agriculture 4 2010-01-01 2010-01-01 false State income and eligibility verification system. 272... PARTICIPATING STATE AGENCIES § 272.8 State income and eligibility verification system. (a) General. (1) State agencies may maintain and use an income and eligibility verification system (IEVS), as specified in this...

  19. Guidelines for qualifying cleaning and verification materials

    NASA Technical Reports Server (NTRS)

    Webb, D.

    1995-01-01

    This document is intended to provide guidance in identifying technical issues which must be addressed in a comprehensive qualification plan for materials used in cleaning and cleanliness verification processes. Information presented herein is intended to facilitate development of a definitive checklist that should address all pertinent materials issues when down-selecting cleaning/verification media.

  20. 21 CFR 21.44 - Verification of identity.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... 21 Food and Drugs 1 2010-04-01 2010-04-01 false Verification of identity. 21.44 Section 21.44 Food... Verification of identity. (a) An individual seeking access to records in a Privacy Act Record System may be... identity. The identification required shall be suitable considering the nature of the records sought. No...

  1. 21 CFR 21.44 - Verification of identity.

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ... 21 Food and Drugs 1 2013-04-01 2013-04-01 false Verification of identity. 21.44 Section 21.44 Food... Verification of identity. (a) An individual seeking access to records in a Privacy Act Record System may be... identity. The identification required shall be suitable considering the nature of the records sought. No...

  2. 21 CFR 21.44 - Verification of identity.

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... 21 Food and Drugs 1 2014-04-01 2014-04-01 false Verification of identity. 21.44 Section 21.44 Food... Verification of identity. (a) An individual seeking access to records in a Privacy Act Record System may be... identity. The identification required shall be suitable considering the nature of the records sought. No...

  3. 21 CFR 21.44 - Verification of identity.

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... 21 Food and Drugs 1 2012-04-01 2012-04-01 false Verification of identity. 21.44 Section 21.44 Food... Verification of identity. (a) An individual seeking access to records in a Privacy Act Record System may be... identity. The identification required shall be suitable considering the nature of the records sought. No...

  4. GFO-1 Geophysical Data Record and Orbit Verifications for Global Change Studies

    NASA Technical Reports Server (NTRS)

    Shum, C. K.

    2000-01-01

    This final report summarizes the research work conducted under NASA's Physical Oceanography Program, entitled GFO-1 Geophysical Data Record and Orbit Verifications for Global Change Studies, for the investigation period from December 1, 1997 through November 30, 2000. The primary objectives of the investigation include providing verification and improvement of the precise orbit, media, geophysical, and instrument corrections needed to accurately reduce the U.S. Navy's Geosat Follow-On 1 (GFO-1) radar altimeter data to sea level measurements. The status of the GFO satellite (instrument and spacecraft operations, orbital tracking, and altimeter) is summarized. The GFO spacecraft has been accepted by the Navy from Ball Aerospace and has been declared operational since November 2000. We have participated in four official GFO calibration/validation periods (Cal/Val I-IV), spanning June 1999 through October 2000. Results of verification of the GFO orbit and geophysical data record measurements, both from NOAA (IGDR) and from the Navy (NGDR), are reported. Our preliminary results indicate that: (1) the precise orbit (GSFC and OSU) can be determined to approx. 5-6 cm rms radially using SLR and altimeter crossovers; (2) estimated GFO MOE (GSFC or NRL) radial orbit accuracy is approx. 7-30 cm and operational Doppler orbit accuracy is approx. 60-350 cm. After bias and tilt adjustment (1000 km arc), estimated Doppler orbit accuracy is approx. 1.2-6.5 cm rms and MOE accuracy is approx. 1.0-2.3 cm; (3) the geophysical and media corrections have been validated against in situ measurements and measurements from other operating altimeters (T/P and ERS-2). Altimeter time bias is insignificant at 0-2 ms. Sea state bias is about 3-4.5% of SWH. The wet troposphere correction has approx. 1 cm bias and approx. 3 cm rms when compared with ERS-2 data. Use of GIM and IRI95 provides ionosphere corrections accurate to 2-3 cm rms during medium to high solar activity; (4

  5. Verification of Autonomous Systems for Space Applications

    NASA Technical Reports Server (NTRS)

    Brat, G.; Denney, E.; Giannakopoulou, D.; Frank, J.; Jonsson, A.

    2006-01-01

    Autonomous software, especially if it is model-based, can play an important role in future space applications. For example, it can help streamline ground operations, assist in autonomous rendezvous and docking operations, or even help recover from problems (e.g., planners can be used to explore the space of recovery actions for a power subsystem and implement a solution without, or with minimal, human intervention). In general, the exploration capabilities of model-based systems give them great flexibility. Unfortunately, this also makes them unpredictable to our human eyes, both in terms of their execution and their verification. Traditional verification techniques are inadequate for these systems since they are mostly based on testing, which implies a very limited exploration of their behavioral space. In our work, we explore how advanced V&V techniques, such as static analysis, model checking, and compositional verification, can be used to gain trust in model-based systems. We also describe how synthesis can be used in the context of system reconfiguration and in the context of verification.

  6. You Can't See the Real Me: Attachment Avoidance, Self-Verification, and Self-Concept Clarity.

    PubMed

    Emery, Lydia F; Gardner, Wendi L; Carswell, Kathleen L; Finkel, Eli J

    2018-03-01

    Attachment shapes people's experiences in their close relationships and their self-views. Although attachment avoidance and anxiety both undermine relationships, past research has primarily emphasized detrimental effects of anxiety on the self-concept. However, as partners can help people maintain stable self-views, avoidant individuals' negative views of others might place them at risk for self-concept confusion. We hypothesized that avoidance would predict lower self-concept clarity and that less self-verification from partners would mediate this association. Attachment avoidance was associated with lower self-concept clarity (Studies 1-5), an effect that was mediated by low self-verification (Studies 2-3). The association between avoidance and self-verification was mediated by less self-disclosure and less trust in partner feedback (Study 4). Longitudinally, avoidance predicted changes in self-verification, which in turn predicted changes in self-concept clarity (Study 5). Thus, avoidant individuals' reluctance to trust or become too close to others may result in hidden costs to the self-concept.

  7. Formal Verification for a Next-Generation Space Shuttle

    NASA Technical Reports Server (NTRS)

    Nelson, Stacy D.; Pecheur, Charles; Koga, Dennis (Technical Monitor)

    2002-01-01

    This paper discusses the verification and validation (V&V) of advanced software used for integrated vehicle health monitoring (IVHM), in the context of NASA's next-generation space shuttle. We survey the current V&V practice and standards used in selected NASA projects, review applicable formal verification techniques, and discuss their integration into existing development practice and standards. We also describe two verification tools, JMPL2SMV and Livingstone PathFinder, that can be used to thoroughly verify diagnosis applications that use model-based reasoning, such as the Livingstone system.

  8. 30 CFR 250.913 - When must I resubmit Platform Verification Program plans?

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... CONTINENTAL SHELF Platforms and Structures Platform Verification Program § 250.913 When must I resubmit Platform Verification Program plans? (a) You must resubmit any design verification, fabrication... 30 Mineral Resources 2 2011-07-01 2011-07-01 false When must I resubmit Platform Verification...

  9. A digital flight control system verification laboratory

    NASA Technical Reports Server (NTRS)

    De Feo, P.; Saib, S.

    1982-01-01

    A NASA/FAA program has been established for the verification and validation of digital flight control systems (DFCS), with the primary objective being the development and analysis of automated verification tools. In order to enhance the capabilities, effectiveness, and ease of use of the test environment, software verification tools can be applied. Tool design includes a static analyzer, an assertion generator, a symbolic executor, a dynamic analysis instrument, and an automated documentation generator. Static and dynamic tools are integrated with error detection capabilities, resulting in a facility which analyzes a representative testbed of DFCS software. Future investigations will focus particularly on increasing the number of software test tools and on assessing cost effectiveness.

  10. A methodology for the rigorous verification of plasma simulation codes

    NASA Astrophysics Data System (ADS)

    Riva, Fabio

    2016-10-01

    The methodology used to assess the reliability of numerical simulation codes constitutes the Verification and Validation (V&V) procedure. V&V is composed of two separate tasks: verification, which is a mathematical exercise targeted at assessing that the physical model is correctly solved, and validation, which determines the consistency of the code results, and therefore of the physical model, with experimental data. In the present talk we focus our attention on verification, which in turn is composed of code verification, targeted at assessing that a physical model is correctly implemented in a simulation code, and solution verification, which quantifies the numerical error affecting a simulation. Bridging the gap between plasma physics and other scientific domains, we introduced for the first time in our domain a rigorous methodology for code verification, based on the method of manufactured solutions, as well as solution verification based on Richardson extrapolation. This methodology was applied to GBS, a three-dimensional fluid code based on a finite difference scheme, used to investigate plasma turbulence in basic plasma physics experiments and in the tokamak scrape-off layer. Overcoming the difficulty of dealing with a numerical method intrinsically affected by statistical noise, we have now generalized the rigorous verification methodology to simulation codes based on the particle-in-cell algorithm, which are employed to solve the Vlasov equation in the investigation of a number of plasma physics phenomena.
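
    Solution verification by Richardson extrapolation can be summarized in two small functions, assuming scalar results from three grids refined by a constant factor r; this is the textbook procedure, not GBS-specific code.

      import numpy as np

      def observed_order(f_coarse, f_medium, f_fine, r=2.0):
          """Observed order of convergence p from three successively
          refined grids."""
          return (np.log(abs(f_coarse - f_medium) / abs(f_medium - f_fine))
                  / np.log(r))

      def richardson(f_medium, f_fine, p, r=2.0):
          """Extrapolated solution and estimated numerical error of the
          fine-grid result."""
          f_exact = f_fine + (f_fine - f_medium) / (r**p - 1.0)
          return f_exact, abs(f_exact - f_fine)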

  11. ENVIRONMENTAL TECHNOLOGY VERIFICATION TEST PROTOCOL, GENERAL VENTILATION FILTERS

    EPA Science Inventory

    The Environmental Technology Verification Test Protocol, General Ventilation Filters provides guidance for verification tests.

    Reference is made in the protocol to the ASHRAE 52.2P "Method of Testing General Ventilation Air-cleaning Devices for Removal Efficiency by P...

  12. Signature Verification Using N-tuple Learning Machine.

    PubMed

    Maneechot, Thanin; Kitjaidure, Yuttana

    2005-01-01

    This research presents a new algorithm for signature verification using an N-tuple learning machine. The features are taken from handwritten signatures on a digital tablet (online). The recognition algorithm uses four extracted features, namely horizontal and vertical pen-tip position (x-y position), pen-tip pressure, and pen altitude angles. Verification uses the N-tuple technique with Gaussian thresholding.
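
    A minimal sketch of the n-tuple technique in general (not the paper's system): random tuples of bit positions over a binarized feature vector memorize the patterns seen in genuine signatures, and a test signature is scored by the fraction of tuples recognized. Acceptance would then come from thresholding the score, for instance relative to the mean and spread of genuine-signature scores, in the spirit of the paper's Gaussian thresholding.

      import numpy as np

      class NTupleVerifier:
          """RAM-based n-tuple classifier over binary feature vectors."""

          def __init__(self, n_bits, n_tuples=50, tuple_size=4, seed=0):
              rng = np.random.default_rng(seed)
              self.addr = [rng.choice(n_bits, tuple_size, replace=False)
                           for _ in range(n_tuples)]
              self.memory = [set() for _ in range(n_tuples)]

          def train(self, x):
              """x: 1-D 0/1 feature vector from a genuine signature."""
              for mem, idx in zip(self.memory, self.addr):
                  mem.add(tuple(x[idx]))

          def score(self, x):
              """Fraction of tuples whose pattern was seen in training."""
              hits = sum(tuple(x[idx]) in mem
                         for mem, idx in zip(self.memory, self.addr))
              return hits / len(self.addr)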

  13. Verification bias an underrecognized source of error in assessing the efficacy of medical imaging.

    PubMed

    Petscavage, Jonelle M; Richardson, Michael L; Carr, Robert B

    2011-03-01

    Diagnostic tests are validated by comparison against a "gold standard" reference test. When the reference test is invasive or expensive, it may not be applied to all patients. This can result in biased estimates of the sensitivity and specificity of the diagnostic test. This type of bias is called "verification bias" and is a common problem in imaging research. The purpose of our study is to estimate the prevalence of verification bias in the recent radiology literature. All issues of the American Journal of Roentgenology (AJR), Academic Radiology, Radiology, and European Journal of Radiology (EJR) between November 2006 and October 2009 were reviewed for original research articles mentioning sensitivity or specificity as endpoints. Articles were read to determine whether verification bias was present, and were searched for author acknowledgment of verification bias in the study design. During the 3 years, these journals published 2969 original research articles. A total of 776 articles used sensitivity or specificity as an outcome. Of these, 211 articles demonstrated potential verification bias. The fraction of articles with potential bias was 36.4%, 23.4%, 29.5%, and 13.4%, respectively, for AJR, Academic Radiology, Radiology, and EJR. The fraction of papers with potential bias in which the authors acknowledged this bias was 17.1%. Verification bias is a common and frequently unacknowledged source of error in efficacy studies of diagnostic imaging. Bias can often be eliminated by proper study design. When it cannot be eliminated, it should be estimated and acknowledged. Published by Elsevier Inc.

  14. 38 CFR 74.11 - How does CVE process applications for VetBiz VIP Verification Program?

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... electronic means. (The Office of Management and Budget has approved the information collection requirements... Veterans Enterprise, is authorized to approve or deny applications for VetBiz VIP Verification. The CVE... complete and suitable for evaluation and, if not, what additional information or clarification is required...

  15. 38 CFR 74.11 - How does CVE process applications for VetBiz VIP Verification Program?

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... electronic means. (The Office of Management and Budget has approved the information collection requirements... Veterans Enterprise, is authorized to approve or deny applications for VetBiz VIP Verification. The CVE... complete and suitable for evaluation and, if not, what additional information or clarification is required...

  16. 38 CFR 74.11 - How does CVE process applications for VetBiz VIP Verification Program?

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... electronic means. (The Office of Management and Budget has approved the information collection requirements... Veterans Enterprise, is authorized to approve or deny applications for VetBiz VIP Verification. The CVE... complete and suitable for evaluation and, if not, what additional information or clarification is required...

  17. 38 CFR 74.11 - How does CVE process applications for VetBiz VIP Verification Program?

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... electronic means. (The Office of Management and Budget has approved the information collection requirements... Veterans Enterprise, is authorized to approve or deny applications for VetBiz VIP Verification. The CVE... complete and suitable for evaluation and, if not, what additional information or clarification is required...

  18. 38 CFR 74.11 - How does CVE process applications for VetBiz VIP Verification Program?

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... electronic means. (The Office of Management and Budget has approved the information collection requirements... Veterans Enterprise, is authorized to approve or deny applications for VetBiz VIP Verification. The CVE... complete and suitable for evaluation and, if not, what additional information or clarification is required...

  19. 24 CFR 5.659 - Family information and verification.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... 24 Housing and Urban Development 1 2011-04-01 2011-04-01 false Family information and verification... Assisted Housing Serving Persons with Disabilities: Family Income and Family Payment; Occupancy... § 5.659 Family information and verification. (a) Applicability. This section states requirements for...

  20. 24 CFR 5.659 - Family information and verification.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... 24 Housing and Urban Development 1 2010-04-01 2010-04-01 false Family information and verification... Assisted Housing Serving Persons with Disabilities: Family Income and Family Payment; Occupancy... § 5.659 Family information and verification. (a) Applicability. This section states requirements for...

  1. 40 CFR 1065.920 - PEMS calibrations and verifications.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... POLLUTION CONTROLS ENGINE-TESTING PROCEDURES Field Testing and Portable Emission Measurement Systems § 1065... that your new configuration meets this verification. The verification consists of operating an engine... with data simultaneously generated and recorded by laboratory equipment as follows: (1) Mount an engine...

  2. 40 CFR 1065.920 - PEMS calibrations and verifications.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... POLLUTION CONTROLS ENGINE-TESTING PROCEDURES Field Testing and Portable Emission Measurement Systems § 1065... that your new configuration meets this verification. The verification consists of operating an engine... with data simultaneously generated and recorded by laboratory equipment as follows: (1) Mount an engine...

  3. 40 CFR 1065.920 - PEMS calibrations and verifications.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... POLLUTION CONTROLS ENGINE-TESTING PROCEDURES Field Testing and Portable Emission Measurement Systems § 1065... that your new configuration meets this verification. The verification consists of operating an engine... with data simultaneously generated and recorded by laboratory equipment as follows: (1) Mount an engine...

  4. 40 CFR 1066.275 - Daily dynamometer readiness verification.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ...) AIR POLLUTION CONTROLS VEHICLE-TESTING PROCEDURES Dynamometer Specifications § 1066.275 Daily... automated process for this verification procedure, perform this evaluation by setting the initial speed and... your dynamometer does not perform this verification with an automated process: (1) With the dynamometer...

  5. Software Verification of Orion Cockpit Displays

    NASA Technical Reports Server (NTRS)

    Biswas, M. A. Rafe; Garcia, Samuel; Prado, Matthew; Hossain, Sadad; Souris, Matthew; Morin, Lee

    2017-01-01

    NASA's latest spacecraft, Orion, is in the development process of taking humans deeper into space. Orion is equipped with three main displays to monitor and control the spacecraft. To ensure the software behind the glass displays operates without faults, rigorous testing is needed. To conduct such testing, the Rapid Prototyping Lab at NASA's Johnson Space Center, along with the University of Texas at Tyler, employed a software verification tool, EggPlant Functional by TestPlant. It is an image-based test automation tool that allows users to create scripts to verify the functionality within a program. An edge key framework and a set of Common EggPlant Functions were developed to enable efficient script creation. This framework standardized the way to code and to simulate user inputs in the verification process. Moreover, the Common EggPlant Functions can be reused in the verification of different displays.

  6. A Methodology for Evaluating Artifacts Produced by a Formal Verification Process

    NASA Technical Reports Server (NTRS)

    Siminiceanu, Radu I.; Miner, Paul S.; Person, Suzette

    2011-01-01

    The goal of this study is to produce a methodology for evaluating the claims and arguments employed in, and the evidence produced by, formal verification activities. To illustrate the process, we conduct a full assessment of a representative case study for the Enabling Technology Development and Demonstration (ETDD) program. We assess the model checking and satisfiability solving techniques as applied to a suite of abstract models of fault tolerant algorithms which were selected to be deployed in Orion, namely the TTEthernet startup services specified and verified in the Symbolic Analysis Laboratory (SAL) by TTTech. To this end, we introduce the Modeling and Verification Evaluation Score (MVES), a metric that is intended to estimate the amount of trust that can be placed on the evidence that is obtained. The results of the evaluation process and the MVES can then be used by non-experts and evaluators in assessing the credibility of the verification results.

  7. Verification bias: an under-recognized source of error in assessing the efficacy of MRI of the menisci.

    PubMed

    Richardson, Michael L; Petscavage, Jonelle M

    2011-11-01

    The sensitivity and specificity of magnetic resonance imaging (MRI) for diagnosis of meniscal tears have been studied extensively, with tears usually verified by surgery. However, surgically unverified cases are often not considered in these studies, leading to verification bias, which can falsely increase the sensitivity and decrease the specificity estimates. Our study suggests that such bias may be very common in the meniscal MRI literature, and illustrates techniques to detect and correct for such bias. PubMed was searched for articles estimating the sensitivity and specificity of MRI for meniscal tears. These were assessed for verification bias, deemed potentially present if a study included any patients whose MRI findings were not surgically verified. Retrospective global sensitivity analysis (GSA) was performed when possible. Thirty-nine of the 314 studies retrieved from PubMed specifically dealt with meniscal tears. All 39 included unverified patients and, hence, potential verification bias. Only seven articles included sufficient information to perform GSA. Of these, one showed definite verification bias, two showed no bias, and four others showed bias within certain ranges of disease prevalence. Only 9 of 39 acknowledged the possibility of verification bias. Verification bias is underrecognized and potentially common in published estimates of the sensitivity and specificity of MRI for the diagnosis of meniscal tears. When possible, it should be avoided by proper study design. If unavoidable, it should be acknowledged. Investigators should tabulate unverified as well as verified data. Finally, verification bias should be estimated; if present, corrected estimates of sensitivity and specificity should be used. Our online web-based calculator makes this process relatively easy.

  8. 37 CFR 262.7 - Verification of royalty payments.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... Designated Agent have agreed as to proper verification methods. (b) Frequency of verification. A Copyright Owner or a Performer may conduct a single audit of the Designated Agent upon reasonable notice and... COPYRIGHT ARBITRATION ROYALTY PANEL RULES AND PROCEDURES RATES AND TERMS FOR CERTAIN ELIGIBLE...

  9. 46 CFR 61.40-3 - Design verification testing.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... Shipping COAST GUARD, DEPARTMENT OF HOMELAND SECURITY (CONTINUED) MARINE ENGINEERING PERIODIC TESTS AND INSPECTIONS Design Verification and Periodic Testing of Vital System Automation § 61.40-3 Design verification testing. (a) Tests must verify that automated vital systems are designed, constructed, and operate in...

  10. 40 CFR 1065.920 - PEMS calibrations and verifications.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... POLLUTION CONTROLS ENGINE-TESTING PROCEDURES Field Testing and Portable Emission Measurement Systems § 1065... verification. The verification consists of operating an engine over a duty cycle in the laboratory and... by laboratory equipment as follows: (1) Mount an engine on a dynamometer for laboratory testing...

  11. Formal verification of human-automation interaction

    NASA Technical Reports Server (NTRS)

    Degani, Asaf; Heymann, Michael

    2002-01-01

    This paper discusses a formal and rigorous approach to the analysis of operator interaction with machines. It addresses the acute problem of detecting design errors in human-machine interaction and focuses on verifying the correctness of the interaction in complex and automated control systems. The paper describes a systematic methodology for evaluating whether the interface provides the necessary information about the machine to enable the operator to perform a specified task successfully and unambiguously. It also addresses the adequacy of information provided to the user via training material (e.g., user manual) about the machine's behavior. The essentials of the methodology, which can be automated and applied to the verification of large systems, are illustrated by several examples and through a case study of pilot interaction with an autopilot aboard a modern commercial aircraft. The expected application of this methodology is an augmentation and enhancement, by formal verification, of human-automation interfaces.

  12. A framework of multitemplate ensemble for fingerprint verification

    NASA Astrophysics Data System (ADS)

    Yin, Yilong; Ning, Yanbin; Ren, Chunxiao; Liu, Li

    2012-12-01

    Improving the performance of an automatic fingerprint verification system (AFVS) is a long-standing challenge in the biometric verification field. Recently, it has become popular to improve the performance of an AFVS by using ensemble learning to fuse related information from fingerprints. In this article, we propose a novel framework for fingerprint verification based on a multitemplate ensemble method. The framework consists of three stages, as sketched after this paragraph. In the first stage, the enrollment stage, an effective template selection method selects the fingerprints that best represent a finger; a polyhedron is then created from the matching results of the multiple template fingerprints, and a virtual centroid of the polyhedron is computed. In the second stage, the verification stage, we measure the distance between the centroid of the polyhedron and a query image. In the final stage, a fusion rule is used to choose a proper distance from a distance set. The experimental results on the FVC2004 database demonstrate the effectiveness of the new framework for fingerprint verification. With a minutiae-based matching method, the average EER of the four databases in FVC2004 drops from 10.85 to 0.88, and with a ridge-based matching method, the average EER of these four databases also decreases from 14.58 to 2.51.
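
    A minimal sketch of the centroid idea described in the abstract is given below; the paper's exact polyhedron construction and fusion rule are not reproduced, and `match` stands for any fingerprint matcher returning a similarity score.

    ```python
    import numpy as np

    def enroll(match, templates):
        """Build the 'polyhedron' of pairwise template match scores and its centroid.
        match(a, b) -> similarity score; templates: selected enrolled fingerprints."""
        vertices = np.array([[match(t, u) for u in templates] for t in templates])
        centroid = vertices.mean(axis=0)  # virtual centroid of the polyhedron
        return centroid

    def verify(match, templates, centroid, query, threshold):
        """Accept the query if its score vector lies close enough to the centroid.
        The Euclidean distance here is a stand-in for the paper's fusion rule."""
        scores = np.array([match(query, t) for t in templates])
        return np.linalg.norm(scores - centroid) <= threshold
    ```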

  13. Comparative study of the swabbing properties of seven commercially available swab materials for cleaning verification.

    PubMed

    Corrigan, Damion K; Piletsky, Sergey; McCrossen, Sean

    2009-01-01

    This article compares the technical performance of several different commercially available swabbing materials for the purpose of cleaning verification. A steel surface was soiled with solutions of acetaminophen, nicotinic acid, diclofenac, and benzamidine and wiped with each swabbing material. The compounds were extracted with water or ethanol (depending on the polarity of the analyte) and their concentrations in the extracts were quantified spectrophotometrically. The study also investigated swab debris left on the wiped surface. The swab performances were compared and the best swab material was identified.

  14. Proton Therapy Verification with PET Imaging

    PubMed Central

    Zhu, Xuping; Fakhri, Georges El

    2013-01-01

    Proton therapy is very sensitive to uncertainties introduced during treatment planning and dose delivery. PET imaging of proton-induced positron emitter distributions is the only practical approach for in vivo, in situ verification of proton therapy. This article reviews the current status of proton therapy verification with PET imaging. The different data detection systems (in-beam, in-room, and off-line PET), calculation methods for the prediction of proton-induced PET activity distributions, and approaches for data evaluation are discussed. PMID:24312147

  15. INF and IAEA: A comparative analysis of verification strategy

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Scheinman, L.; Kratzer, M.

    1992-07-01

    This is the final report of a study on the relevance and possible lessons of Intermediate Range Nuclear Force (INF) verification to the International Atomic Energy Agency (IAEA) international safeguards activities.

  16. Formal verification of an oral messages algorithm for interactive consistency

    NASA Technical Reports Server (NTRS)

    Rushby, John

    1992-01-01

    The formal specification and verification of an algorithm for Interactive Consistency based on the Oral Messages algorithm for Byzantine Agreement is described. We compare our treatment with that of Bevier and Young, who presented a formal specification and verification for a very similar algorithm. Unlike Bevier and Young, who observed that 'the invariant maintained in the recursive subcases of the algorithm is significantly more complicated than is suggested by the published proof' and who found its formal verification 'a fairly difficult exercise in mechanical theorem proving,' our treatment is very close to the previously published analysis of the algorithm, and our formal specification and verification are straightforward. This example illustrates how delicate choices in the formulation of the problem can have significant impact on the readability of its formal specification and on the tractability of its formal verification.
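
    For context, the algorithm being verified is Lamport, Shostak, and Pease's recursive Oral Messages algorithm OM(m). The Python sketch below is a toy executable rendering of that recursion under one simple adversary model (traitors invert every value they relay); it is not Rushby's formal specification.

    ```python
    from collections import Counter

    def majority(values):
        """Majority vote; ties resolve arbitrarily, which is adequate for a sketch."""
        return Counter(values).most_common(1)[0][0]

    def om(m, commander, lieutenants, order, is_traitor):
        """Oral Messages algorithm OM(m): returns {lieutenant: decided value}.
        Traitors here invert every value they send -- one simple adversary model."""
        def sent(sender, value):
            return (not value) if is_traitor(sender) else value

        received = {lt: sent(commander, order) for lt in lieutenants}
        if m == 0:
            return received
        decided = {}
        for lt in lieutenants:
            votes = [received[lt]]
            for other in lieutenants:
                if other != lt:
                    # 'other' acts as commander of OM(m-1), relaying what it received.
                    sub = om(m - 1, other,
                             [x for x in lieutenants if x != other],
                             received[other], is_traitor)
                    votes.append(sub[lt])
            decided[lt] = majority(votes)
        return decided

    # Four nodes (commander C, lieutenants L1-L3), one traitorous lieutenant:
    # with m = 1 the loyal lieutenants still agree on the commander's order.
    print(om(1, "C", ["L1", "L2", "L3"], True, lambda n: n == "L3"))
    ```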

  17. 10 CFR 60.47 - Facility information and verification.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 10 Energy 2 2010-01-01 2010-01-01 false Facility information and verification. 60.47 Section 60.47 Energy NUCLEAR REGULATORY COMMISSION (CONTINUED) DISPOSAL OF HIGH-LEVEL RADIOACTIVE WASTES IN GEOLOGIC REPOSITORIES Licenses Us/iaea Safeguards Agreement § 60.47 Facility information and verification. (a) In...

  18. Three Lectures on Theorem-proving and Program Verification

    NASA Technical Reports Server (NTRS)

    Moore, J. S.

    1983-01-01

    Topics concerning theorem proving and program verification are discussed with particular emphasis on the Boyer/Moore theorem prover, and approaches to program verification such as the functional and interpreter methods and the inductive assertion approach. A history of the discipline and specific program examples are included.

  19. 10 CFR 61.32 - Facility information and verification.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... 10 Energy 2 2012-01-01 2012-01-01 false Facility information and verification. 61.32 Section 61.32 Energy NUCLEAR REGULATORY COMMISSION (CONTINUED) LICENSING REQUIREMENTS FOR LAND DISPOSAL OF RADIOACTIVE WASTE Licenses Us/iaea Safeguards Agreement § 61.32 Facility information and verification. (a) In...

  20. 10 CFR 61.32 - Facility information and verification.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... 10 Energy 2 2014-01-01 2014-01-01 false Facility information and verification. 61.32 Section 61.32 Energy NUCLEAR REGULATORY COMMISSION (CONTINUED) LICENSING REQUIREMENTS FOR LAND DISPOSAL OF RADIOACTIVE WASTE Licenses Us/iaea Safeguards Agreement § 61.32 Facility information and verification. (a) In...

  1. 10 CFR 61.32 - Facility information and verification.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... 10 Energy 2 2011-01-01 2011-01-01 false Facility information and verification. 61.32 Section 61.32 Energy NUCLEAR REGULATORY COMMISSION (CONTINUED) LICENSING REQUIREMENTS FOR LAND DISPOSAL OF RADIOACTIVE WASTE Licenses Us/iaea Safeguards Agreement § 61.32 Facility information and verification. (a) In...

  2. Distilling the Verification Process for Prognostics Algorithms

    NASA Technical Reports Server (NTRS)

    Roychoudhury, Indranil; Saxena, Abhinav; Celaya, Jose R.; Goebel, Kai

    2013-01-01

    The goal of prognostics and health management (PHM) systems is to ensure system safety, and reduce downtime and maintenance costs. It is important that a PHM system is verified and validated before it can be successfully deployed. Prognostics algorithms are integral parts of PHM systems. This paper investigates a systematic process of verification of such prognostics algorithms. To this end, first, this paper distinguishes between technology maturation and product development. Then, the paper describes the verification process for a prognostics algorithm as it moves up to higher maturity levels. This process is shown to be an iterative process where verification activities are interleaved with validation activities at each maturation level. In this work, we adopt the concept of technology readiness levels (TRLs) to represent the different maturity levels of a prognostics algorithm. It is shown that at each TRL, the verification of a prognostics algorithm depends on verifying the different components of the algorithm according to the requirements laid out by the PHM system that adopts this prognostics algorithm. Finally, using simplified examples, the systematic process for verifying a prognostics algorithm is demonstrated as the prognostics algorithm moves up TRLs.

  3. Numerical verification of composite rods theory on multi-story buildings analysis

    NASA Astrophysics Data System (ADS)

    El-Din Mansour, Alaa; Filatov, Vladimir; Gandzhuntsev, Michael; Ryasny, Nikita

    2018-03-01

    The article proposes a verification of the theory of composite rods for the structural analysis of skeletons of high-rise buildings. A test design model is formed in which the horizontal elements are represented by a multilayer cantilever beam working in transverse bending, whose slabs are connected by moment-non-transferring connections, and the vertical elements are represented by multilayer columns. These connections are sufficient to develop a shearing action that can be approximated by a certain shear-force function, which significantly reduces the overall degree of static indeterminacy of the structural model. A system of differential equations describes the behavior of the multilayer rods and is solved numerically by the method of successive approximations. The proposed methodology is intended for preliminary calculations that determine the rigidity characteristics of the structure, and for a qualitative assessment of results obtained by other methods when calculations are performed for verification purposes.

  4. Land surface Verification Toolkit (LVT)

    NASA Technical Reports Server (NTRS)

    Kumar, Sujay V.

    2017-01-01

    LVT is a framework developed to provide an automated, consolidated environment for systematic land surface model evaluation. It includes support for a range of in-situ, remote-sensing, and other model and reanalysis products, and supports the analysis of outputs from various LIS subsystems, including LIS-DA, LIS-OPT, and LIS-UE. The Land Information System Verification Toolkit (LVT) is a NASA software tool designed to enable the evaluation, analysis, and comparison of outputs generated by the Land Information System (LIS). The LVT software is released under the terms and conditions of the NASA Open Source Agreement (NOSA) Version 1.1 or later.

  5. EOS-AM precision pointing verification

    NASA Technical Reports Server (NTRS)

    Throckmorton, A.; Braknis, E.; Bolek, J.

    1993-01-01

    The Earth Observing System (EOS) AM mission requires tight pointing knowledge to meet scientific objectives, in a spacecraft with low-frequency flexible appendage modes. As the spacecraft controller reacts to various disturbance sources and as the inherent appendage modes are excited by this control action, verification of precision pointing knowledge becomes particularly challenging for the EOS-AM mission. As presently conceived, this verification includes a complementary set of multi-disciplinary analyses, hardware tests, and real-time computer-in-the-loop simulations, followed by collection and analysis of hardware test and flight data, supported by a comprehensive database repository for validated program values.

  6. 78 FR 27882 - VA Veteran-Owned Small Business (VOSB) Verification Guidelines

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-05-13

    ... Verification Self-Assessment Tool that walks the veteran through the regulation and how it applies to the...) Verification Guidelines AGENCY: Department of Veterans Affairs. ACTION: Advanced notice of proposed rulemaking... regulations governing the Department of Veterans Affairs (VA) Veteran-Owned Small Business (VOSB) Verification...

  7. 78 FR 32010 - Pipeline Safety: Public Workshop on Integrity Verification Process

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-05-28

    .... PHMSA-2013-0119] Pipeline Safety: Public Workshop on Integrity Verification Process AGENCY: Pipeline and... announcing a public workshop to be held on the concept of ``Integrity Verification Process.'' The Integrity Verification Process shares similar characteristics with fitness for service processes. At this workshop, the...

  8. Crowd Sourced Formal Verification-Augmentation (CSFV-A)

    DTIC Science & Technology

    2016-06-01

    The Crowd Sourced Formal Verification (CSFV) program built games that recast FV problems into puzzles to make these problems more accessible, increasing the manpower available to construct FV proofs. This effort supported the CSFV program by hosting the games on a public website, and analyzed the gameplay for its efficiency in providing FV proofs.

  9. NHEXAS PHASE I ARIZONA STUDY--STANDARD OPERATING PROCEDURE FOR DATA ENTRY AND DATA VERIFICATION (HAND ENTRY) (UA-D-15.0)

    EPA Science Inventory

    The purpose of this SOP is to define the steps involved in data entry and data verification of physical forms. It applies to the data entry and data verification of all physical forms. The procedure defined herein was developed for use in the Arizona NHEXAS project and the "Bor...

  10. 24 CFR 960.259 - Family information and verification.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... 24 Housing and Urban Development 4 2011-04-01 2011-04-01 false Family information and verification... URBAN DEVELOPMENT ADMISSION TO, AND OCCUPANCY OF, PUBLIC HOUSING Rent and Reexamination § 960.259 Family information and verification. (a) Family obligation to supply information. (1) The family must supply any...

  11. 24 CFR 960.259 - Family information and verification.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... 24 Housing and Urban Development 4 2010-04-01 2010-04-01 false Family information and verification... URBAN DEVELOPMENT ADMISSION TO, AND OCCUPANCY OF, PUBLIC HOUSING Rent and Reexamination § 960.259 Family information and verification. (a) Family obligation to supply information. (1) The family must supply any...

  12. Student-Teacher Linkage Verification: Model Process and Recommendations

    ERIC Educational Resources Information Center

    Watson, Jeffery; Graham, Matthew; Thorn, Christopher A.

    2012-01-01

    As momentum grows for tracking the role of individual educators in student performance, school districts across the country are implementing projects that involve linking teachers to their students. Programs that link teachers to student outcomes require a verification process for student-teacher linkages. Linkage verification improves accuracy by…

  13. Overview of the TOPEX/Poseidon Platform Harvest Verification Experiment

    NASA Technical Reports Server (NTRS)

    Morris, Charles S.; DiNardo, Steven J.; Christensen, Edward J.

    1995-01-01

    An overview is given of the in situ measurement system installed on Texaco's Platform Harvest for verification of the sea level measurement from the TOPEX/Poseidon satellite. The prelaunch error budget suggested that the total root mean square (RMS) error due to measurements made at this verification site would be less than 4 cm. The actual error budget for the verification site is within these original specifications. However, evaluation of the sea level data from three measurement systems at the platform has resulted in unexpectedly large differences between the systems. Comparison of the sea level measurements from the different tide gauge systems has led to a better understanding of the problems of measuring sea level in relatively deep ocean. As of May 1994, the Platform Harvest verification site has successfully supported 60 TOPEX/Poseidon overflights.

  14. International Space Station Requirement Verification for Commercial Visiting Vehicles

    NASA Technical Reports Server (NTRS)

    Garguilo, Dan

    2017-01-01

    The COTS program demonstrated NASA could rely on commercial providers for safe, reliable, and cost-effective cargo delivery to ISS. The ISS Program has developed a streamlined process to safely integrate commercial visiting vehicles and ensure requirements are met: levy a minimum requirement set (down from thousands to hundreds) focusing on the ISS interface and safety, reducing the level of NASA oversight/insight and the burden on the commercial partner. Partners provide a detailed verification and validation plan documenting how they will show they have met NASA requirements. NASA conducts process sampling to ensure that the established verification processes are being followed. NASA participates in joint verification events and analysis for requirements that both parties must verify. Verification compliance is approved by NASA, and launch readiness is certified at mission readiness reviews.

  15. Property-driven functional verification technique for high-speed vision system-on-chip processor

    NASA Astrophysics Data System (ADS)

    Nshunguyimfura, Victor; Yang, Jie; Liu, Liyuan; Wu, Nanjian

    2017-04-01

    The implementation of functional verification in a fast, reliable, and effective manner is a challenging task in a vision chip verification process. The main reason for this challenge is the stepwise nature of existing functional verification techniques. The complexity of vision chip verification is also related to the fact that in most vision chip design cycles, extensive effort is focused on optimizing chip metrics such as performance, power, and area, while design functional verification is not explicitly considered at the earlier stages, at which the soundest decisions are made. In this paper, we propose a semi-automatic property-driven verification technique in which the implementation of all verification components is based on design properties. We introduce a low-dimension property space between the specification space and the implementation space. The aim of this technique is to speed up the verification process for high-performance parallel processing vision chips. Our experimental results show that the proposed technique can reduce the verification effort by up to 20% for a complex vision chip design while also reducing the simulation and debugging overheads.

  16. Challenges in High-Assurance Runtime Verification

    NASA Technical Reports Server (NTRS)

    Goodloe, Alwyn E.

    2016-01-01

    Safety-critical systems are growing more complex and becoming increasingly autonomous. Runtime Verification (RV) has the potential to provide protections when a system cannot be assured by conventional means, but only if the RV itself can be trusted. In this paper, we proffer a number of challenges to realizing high-assurance RV and illustrate how we have addressed them in our research. We argue that high-assurance RV provides a rich target for automated verification tools in hope of fostering closer collaboration among the communities.

  17. Formal Verification of the AAMP-FV Microcode

    NASA Technical Reports Server (NTRS)

    Miller, Steven P.; Greve, David A.; Wilding, Matthew M.; Srivas, Mandayam

    1999-01-01

    This report describes the experiences of Collins Avionics & Communications and SRI International in formally specifying and verifying the microcode in a Rockwell proprietary microprocessor, the AAMP-FV, using the PVS verification system. This project built extensively on earlier experiences using PVS to verify the microcode in the AAMP5, a complex, pipelined microprocessor designed for use in avionics displays and global positioning systems. While the AAMP5 experiment demonstrated the technical feasibility of formal verification of microcode, the steep learning curve encountered left unanswered the question of whether it could be performed at reasonable cost. The AAMP-FV project was conducted to determine whether the experience gained on the AAMP5 project could be used to make formal verification of microcode cost effective for safety-critical and high volume devices.

  18. Verification test report on a solar heating and hot water system

    NASA Technical Reports Server (NTRS)

    1978-01-01

    Information is provided on the development, qualification, and acceptance verification of commercial solar heating and hot water systems and components. The verification covers the performances, the efficiencies, and the various methods used, such as similarity, analysis, inspection, and test, that are applicable to satisfying the verification requirements.

  19. Volumetric Verification of Multiaxis Machine Tool Using Laser Tracker

    PubMed Central

    Aguilar, Juan José

    2014-01-01

    This paper aims to present a method of volumetric verification for machine tools with linear and rotary axes using a laser tracker. Beyond a method for a particular machine, it presents a methodology that can be used for any machine type. The paper presents the schema and kinematic model of a machine with three axes of movement, two linear and one rotational, including the measurement system and the nominal rotation matrix of the rotational axis. Using this, the machine tool volumetric error is obtained, and nonlinear optimization techniques are employed to improve the accuracy of the machine tool. The verification provides a mathematical, not physical, compensation, in less time than other verification methods, by means of the indirect measurement of the geometric errors of the machine from the linear and rotary axes. This paper presents an extensive study of the appropriateness and drawbacks of the regression function employed depending on the types of movement of the axes of any machine. In the same way, strengths and weaknesses of measurement methods and optimization techniques depending on the space available to place the measurement system are presented. These studies provide the most appropriate strategies to verify each machine tool, taking into consideration its configuration and its available work space. PMID:25202744
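
    The nonlinear-optimization step of such a volumetric verification can be pictured with the toy sketch below, which fits a handful of hypothetical geometric error parameters to laser-tracker readings by least squares; the actual kinematic model in the paper is machine-specific and considerably richer.

    ```python
    import numpy as np
    from scipy.optimize import least_squares

    def predicted(points, params):
        """Toy kinematic error model: small rotations (rx, ry, rz), translations
        (tx, ty, tz), and one xy squareness term s_xy. Purely illustrative."""
        rx, ry, rz, tx, ty, tz, s_xy = params
        R = np.array([[1, -rz, ry], [rz, 1, -rx], [-ry, rx, 1]])  # small-angle rotation
        p = points @ R.T + np.array([tx, ty, tz])
        p[:, 1] += s_xy * points[:, 0]  # xy squareness error
        return p

    def residuals(params, commanded, measured):
        return (predicted(commanded, params) - measured).ravel()

    # commanded: positions sent to the machine; measured: laser-tracker readings
    commanded = np.random.default_rng(2).uniform(0, 500, (200, 3))
    true_params = np.array([1e-4, -2e-4, 5e-5, 0.02, -0.01, 0.03, 3e-5])
    measured = predicted(commanded, true_params)
    fit = least_squares(residuals, np.zeros(7), args=(commanded, measured))
    # fit.x approximates the error parameters; applying the inverse model then
    # compensates the machine mathematically rather than physically.
    print(fit.x)
    ```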

  20. Cognitive Addition: Comparison of Learning Disabled and Academically Normal Children.

    ERIC Educational Resources Information Center

    Geary, David C.; And Others

    To isolate the process deficits underlying a specific learning disability in mathematics achievement, 77 academically normal and 46 learning disabled (LD) students in second, fourth or sixth grade were presented 140 simple addition problems using a true-false reaction time verification paradigm. (The problems were on a video screen controlled by…

  1. ENVIRONMENTAL TECHNOLOGY VERIFICATION REPORT FOR AMMONIA RECOVERY PROCESS

    EPA Science Inventory

    This Technology Verification report describes the nature and scope of an environmental evaluation of ThermoEnergy Corporation’s Ammonia Recovery Process (ARP) system. The information contained in this report represents data that were collected over a 3-month pilot study. The ti...

  2. TEST/QA PLAN FOR THE VERIFICATION TESTING OF ALTERNATIVE OR REFORMULATED LIQUID FUELS, FUEL ADDITIVES, FUEL EMULSIONS, AND LUBRICANTS FOR HIGHWAY AND NONROAD USE HEAVY DUTY DIESEL ENGINES AND LIGHT DUTY GASOLINE ENGINES AND VEHICLES

    EPA Science Inventory

    The U.S. Environmental Protection Agency established the Environmental Technology Verification Program to accelerate the development and commercialization of improved environmental technology through third party verification and reporting of product performance. Research Triangl...

  3. Temporal Specification and Verification of Real-Time Systems.

    DTIC Science & Technology

    1991-08-30

    of concrete real-time systems can be modeled adequately. Specification: We present two conservative extensions of temporal logic that allow for the...logic. We present both model-checking algorithms for the automatic verification of finite-state real-time systems and proof methods for the deductive verification of real-time systems.

  4. 20 CFR 212.5 - Verification of military service.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... 20 Employees' Benefits 1 2010-04-01 2010-04-01 false Verification of military service. 212.5... MILITARY SERVICE § 212.5 Verification of military service. Military service may be verified by the... armed forces that shows the beginning and ending dates of the individual's active military service; or a...

  5. 10 CFR 9.54 - Verification of identity of individuals making requests.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 10 Energy 1 2010-01-01 2010-01-01 false Verification of identity of individuals making requests. 9... About Them § 9.54 Verification of identity of individuals making requests. (a) Identification... respecting records about himself, except that no verification of identity shall be required if the records...

  6. 10 CFR 9.54 - Verification of identity of individuals making requests.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... 10 Energy 1 2013-01-01 2013-01-01 false Verification of identity of individuals making requests. 9... About Them § 9.54 Verification of identity of individuals making requests. (a) Identification... respecting records about himself, except that no verification of identity shall be required if the records...

  7. 10 CFR 9.54 - Verification of identity of individuals making requests.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... 10 Energy 1 2012-01-01 2012-01-01 false Verification of identity of individuals making requests. 9... About Them § 9.54 Verification of identity of individuals making requests. (a) Identification... respecting records about himself, except that no verification of identity shall be required if the records...

  8. 10 CFR 9.54 - Verification of identity of individuals making requests.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... 10 Energy 1 2014-01-01 2014-01-01 false Verification of identity of individuals making requests. 9... About Them § 9.54 Verification of identity of individuals making requests. (a) Identification... respecting records about himself, except that no verification of identity shall be required if the records...

  9. LIVVkit: An extensible, python-based, land ice verification and validation toolkit for ice sheet models

    NASA Astrophysics Data System (ADS)

    Kennedy, Joseph H.; Bennett, Andrew R.; Evans, Katherine J.; Price, Stephen; Hoffman, Matthew; Lipscomb, William H.; Fyke, Jeremy; Vargo, Lauren; Boghozian, Adrianna; Norman, Matthew; Worley, Patrick H.

    2017-06-01

    To address the pressing need to better understand the behavior and complex interaction of ice sheets within the global Earth system, significant development of continental-scale, dynamical ice sheet models is underway. Concurrent to the development of the Community Ice Sheet Model (CISM), the corresponding verification and validation (V&V) process is being coordinated through a new, robust, Python-based extensible software package, the Land Ice Verification and Validation toolkit (LIVVkit). Incorporated into the typical ice sheet model development cycle, it provides robust and automated numerical verification, software verification, performance validation, and physical validation analyses on a variety of platforms, from personal laptops to the largest supercomputers. LIVVkit operates on sets of regression test and reference data sets, and provides comparisons for a suite of community prioritized tests, including configuration and parameter variations, bit-for-bit evaluation, and plots of model variables to indicate where differences occur. LIVVkit also provides an easily extensible framework to incorporate and analyze results of new intercomparison projects, new observation data, and new computing platforms. LIVVkit is designed for quick adaptation to additional ice sheet models via abstraction of model specific code, functions, and configurations into an ice sheet model description bundle outside the main LIVVkit structure. Ultimately, through shareable and accessible analysis output, LIVVkit is intended to help developers build confidence in their models and enhance the credibility of ice sheet models overall.
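
    A toy example of the bit-for-bit evaluation idea is sketched below; it is not LIVVkit's API, just an illustration of comparing a test run's output fields against a reference archive (here assumed to be NumPy `.npz` files, with hypothetical field names).

    ```python
    import numpy as np

    def bit_for_bit(test_file, ref_file, fields):
        """Toy bit-for-bit regression check in the spirit of LIVVkit's model
        verification: compare each output field of a test run against a reference."""
        test, ref = np.load(test_file), np.load(ref_file)
        report = {}
        for name in fields:
            same = np.array_equal(test[name], ref[name])
            # When bit-for-bit fails, record how large the differences are.
            diff = np.abs(test[name].astype(float) - ref[name].astype(float))
            report[name] = {"bit_for_bit": same, "max_abs_diff": float(diff.max())}
        return report

    # e.g. bit_for_bit("candidate.npz", "reference.npz", ["thickness", "velocity"])
    ```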

  10. Patient-specific IMRT verification using independent fluence-based dose calculation software: experimental benchmarking and initial clinical experience.

    PubMed

    Georg, Dietmar; Stock, Markus; Kroupa, Bernhard; Olofsson, Jörgen; Nyholm, Tufve; Ahnesjö, Anders; Karlsson, Mikael

    2007-08-21

    Experimental methods are commonly used for patient-specific intensity-modulated radiotherapy (IMRT) verification. The purpose of this study was to investigate the accuracy and performance of independent dose calculation software (denoted as 'MUV' (monitor unit verification)) for patient-specific quality assurance (QA). 52 patients receiving step-and-shoot IMRT were considered. IMRT plans were recalculated by the treatment planning systems (TPS) in a dedicated QA phantom, in which an experimental 1D and 2D verification (0.3 cm³ ionization chamber; films) was performed. Additionally, an independent dose calculation was performed. The fluence-based algorithm of MUV accounts for collimator transmission, rounded leaf ends, tongue-and-groove effect, backscatter to the monitor chamber, and scatter from the flattening filter. The dose calculation utilizes a pencil beam model based on a beam quality index. DICOM RT files from patient plans, exported from the TPS, were directly used as patient-specific input data in MUV. For composite IMRT plans, average deviations in the high dose region between ionization chamber measurements and point dose calculations performed with the TPS and MUV were 1.6 ± 1.2% and 0.5 ± 1.1% (1 S.D.). The dose deviations between MUV and TPS slightly depended on the distance from the isocentre position. For individual intensity-modulated beams (total 367), an average deviation of 1.1 ± 2.9% was determined between calculations performed with the TPS and with MUV, with maximum deviations up to 14%. However, absolute dose deviations were mostly less than 3 cGy. Based on the current results, we aim to apply a confidence limit of 3% (with respect to the prescribed dose) or 6 cGy for routine IMRT verification. For off-axis points at distances larger than 5 cm and for low dose regions, we consider 5% dose deviation or 10 cGy acceptable. The time needed for an independent calculation compares very favourably with the net time for an experimental verification.

  11. Offline signature verification using convolution Siamese network

    NASA Astrophysics Data System (ADS)

    Xing, Zi-Jian; Yin, Fei; Wu, Yi-Chao; Liu, Cheng-Lin

    2018-04-01

    This paper presents an offline signature verification approach using a convolutional Siamese neural network. Unlike the existing methods, which consider feature extraction and metric learning as two independent stages, we adopt a deep-learning based framework which combines the two stages and can be trained end to end. The experimental results on two public offline databases (GPDSsynthetic and CEDAR) demonstrate the superiority of our method on the offline signature verification problem.
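
    A minimal sketch of a Siamese verification setup with a contrastive loss is shown below (assuming PyTorch is available); the architecture, input size, and margin are placeholders and do not reproduce the paper's network.

    ```python
    import torch
    import torch.nn as nn
    import torch.nn.functional as F

    class SiameseNet(nn.Module):
        """Twin CNN with shared weights; verification thresholds the embedding distance."""
        def __init__(self):
            super().__init__()
            self.encoder = nn.Sequential(
                nn.Conv2d(1, 16, 5), nn.ReLU(), nn.MaxPool2d(2),
                nn.Conv2d(16, 32, 5), nn.ReLU(), nn.MaxPool2d(2),
                nn.Flatten(), nn.LazyLinear(128),
            )

        def forward(self, a, b):
            # Both branches share the same encoder weights.
            return self.encoder(a), self.encoder(b)

    def contrastive_loss(za, zb, same, margin=1.0):
        """same = 1 for genuine pairs, 0 for forgeries."""
        d = F.pairwise_distance(za, zb)
        return (same * d.pow(2) + (1 - same) * F.relu(margin - d).pow(2)).mean()

    # Toy usage on random "signature" images (1 x 64 x 64)
    net = SiameseNet()
    a, b = torch.randn(8, 1, 64, 64), torch.randn(8, 1, 64, 64)
    za, zb = net(a, b)
    loss = contrastive_loss(za, zb, torch.randint(0, 2, (8,)).float())
    loss.backward()
    ```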

  12. Prompt Gamma Imaging for In Vivo Range Verification of Pencil Beam Scanning Proton Therapy.

    PubMed

    Xie, Yunhe; Bentefour, El Hassane; Janssens, Guillaume; Smeets, Julien; Vander Stappen, François; Hotoiu, Lucian; Yin, Lingshu; Dolney, Derek; Avery, Stephen; O'Grady, Fionnbarr; Prieels, Damien; McDonough, James; Solberg, Timothy D; Lustig, Robert A; Lin, Alexander; Teo, Boon-Keng K

    2017-09-01

    To report the first clinical results and value assessment of prompt gamma imaging for in vivo proton range verification in pencil beam scanning mode. A stand-alone, trolley-mounted, prototype prompt gamma camera utilizing a knife-edge slit collimator design was used to record the prompt gamma signal emitted along the proton tracks during delivery of proton therapy for a brain cancer patient. The recorded prompt gamma depth detection profiles of individual pencil beam spots were compared with the expected profiles simulated from the treatment plan. In 6 treatment fractions recorded over 3 weeks, the mean (± standard deviation) range shifts aggregated over all spots in 9 energy layers were -0.8 ± 1.3 mm for the lateral field, 1.7 ± 0.7 mm for the right-superior-oblique field, and -0.4 ± 0.9 mm for the vertex field. This study demonstrates the feasibility and illustrates the distinctive benefits of prompt gamma imaging in pencil beam scanning treatment mode. Accuracy in range verification was found in this first clinical case to be better than the range uncertainty margin applied in the treatment plan. These first results lay the foundation for additional work toward tighter integration of the system for in vivo proton range verification and quantification of range uncertainties.

  13. Compressive sensing using optimized sensing matrix for face verification

    NASA Astrophysics Data System (ADS)

    Oey, Endra; Jeffry; Wongso, Kelvin; Tommy

    2017-12-01

    Biometrics offers a solution to problems that occur with password-based data access, such as forgetting a password or having to recall many different passwords. With biometrics, the physical characteristics of a person can be captured and used in the identification process. In this research, facial biometrics is used in the verification process to determine whether a user has the authority to access the data. Facial biometrics is chosen for its low-cost implementation and its fairly accurate results for user identification. The face verification system adopted in this research uses the Compressive Sensing (CS) technique, which aims to reduce the dimension of, as well as encrypt, the facial test image by representing it as a sparse signal. The encrypted data can be reconstructed using a sparse coding algorithm. Two types of sparse coding, namely Orthogonal Matching Pursuit (OMP) and Iteratively Reweighted Least Squares-ℓp (IRLS-ℓp), are compared in this face verification research. The reconstructed sparse signal is then compared, via the Euclidean norm, with the sparse signal of the user previously saved in the system to determine the validity of the facial test image. The system accuracies obtained in this research are 99% for IRLS with a face verification response time of 4.917 seconds and 96.33% for OMP with a response time of 0.4046 seconds using a non-optimized sensing matrix, and 99% for IRLS with a response time of 13.4791 seconds and 98.33% for OMP with a response time of 3.1571 seconds using an optimized sensing matrix.
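
    Of the two sparse coding algorithms compared, OMP is the simpler; a self-contained sketch is given below. The sensing matrix, sparsity level, and signal are illustrative only and do not correspond to the study's data.

    ```python
    import numpy as np

    def omp(A, y, k, tol=1e-8):
        """Orthogonal Matching Pursuit: greedily recover a k-sparse x with y ~ A @ x.
        A: (m, n) sensing matrix; y: (m,) compressed measurement."""
        m, n = A.shape
        residual, support = y.copy(), []
        x = np.zeros(n)
        for _ in range(k):
            # Pick the column most correlated with the current residual.
            j = int(np.argmax(np.abs(A.T @ residual)))
            if j not in support:
                support.append(j)
            # Least-squares fit on the chosen support, then update the residual.
            coef, *_ = np.linalg.lstsq(A[:, support], y, rcond=None)
            x[:] = 0.0
            x[support] = coef
            residual = y - A @ x
            if np.linalg.norm(residual) < tol:
                break
        return x

    # Toy example: recover a 3-sparse signal from 40 random measurements
    rng = np.random.default_rng(0)
    A = rng.standard_normal((40, 100)) / np.sqrt(40)
    x_true = np.zeros(100); x_true[[5, 42, 77]] = [1.0, -0.5, 2.0]
    x_hat = omp(A, A @ x_true, k=3)
    print(np.flatnonzero(np.abs(x_hat) > 1e-6))  # recovered support
    ```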

  14. Considerations in STS payload environmental verification

    NASA Technical Reports Server (NTRS)

    Keegan, W. B.

    1978-01-01

    The current philosophy of GSFC regarding environmental verification of Shuttle payloads is reviewed. In the structures area, increased emphasis will be placed on the use of analysis for design verification, with selective testing performed as necessary. Furthermore, as a result of recent cost optimization analysis, the multitier test program will presumably give way to a comprehensive test program at the major payload subassembly level after adequate workmanship at the component level has been verified. In the thermal vacuum area, thought is being given to modifying the approaches used for conventional spacecraft.

  15. Comments for A Conference on Verification in the 21st Century

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Doyle, James E.

    2012-06-12

    The author offers 5 points for the discussion of Verification and Technology: (1) Experience with the implementation of arms limitation and arms reduction agreements confirms that technology alone has never been relied upon to provide effective verification. (2) The historical practice of verification of arms control treaties between Cold War rivals may constrain the cooperative and innovative use of technology for transparency, verification, and confidence building in the future. (3) An area that has been identified by many, including the US State Department and NNSA, as being rich for exploration for potential uses of technology for transparency and verification is information and communications technology (ICT). This includes social media, crowd-sourcing, the internet of things, and the concept of societal verification, but there are issues. (4) On the issue of the extent to which verification technologies are keeping pace with the demands of future protocols and agreements, the more direct question is "are they effective in supporting the objectives of the treaty or agreement?" In this regard it is important to acknowledge that there is a verification grand challenge at our doorstep: how does one verify limitations on nuclear warheads in national stockpiles? (5) Finally, while recognizing the daunting political and security challenges of such an approach, multilateral engagement and cooperation at the conceptual and technical levels provide benefits for addressing future verification challenges.

  16. Security Verification Techniques Applied to PatchLink COTS Software

    NASA Technical Reports Server (NTRS)

    Gilliam, David P.; Powell, John D.; Bishop, Matt; Andrew, Chris; Jog, Sameer

    2006-01-01

    Verification of the security of software artifacts is a challenging task. An integrated approach that combines verification techniques can increase the confidence in the security of software artifacts. Such an approach has been developed by the Jet Propulsion Laboratory (JPL) and the University of California at Davis (UC Davis). Two security verification instruments were developed and then piloted on PatchLink's UNIX Agent, a Commercial-Off-The-Shelf (COTS) software product, to assess the value of the instruments and the approach. The two instruments are the Flexible Modeling Framework (FMF) -- a model-based verification instrument (JPL), and a Property-Based Tester (UC Davis). Security properties were formally specified for the COTS artifact and then verified using these instruments. The results were then reviewed to determine the effectiveness of the approach and the security of the COTS product.

  17. NHEXAS PHASE I ARIZONA STUDY--STANDARD OPERATING PROCEDURE FOR PERFORMANCE OF COMPUTER SOFTWARE: VERIFICATION AND VALIDATION (UA-D-2.0)

    EPA Science Inventory

    The purpose of this SOP is to define the procedures used for the initial and periodic verification and validation of computer programs used during the Arizona NHEXAS project and the "Border" study. Keywords: Computers; Software; QA/QC.

    The National Human Exposure Assessment Sur...

  18. Ontology Matching with Semantic Verification.

    PubMed

    Jean-Mary, Yves R; Shironoshita, E Patrick; Kabuka, Mansur R

    2009-09-01

    ASMOV (Automated Semantic Matching of Ontologies with Verification) is a novel algorithm that uses lexical and structural characteristics of two ontologies to iteratively calculate a similarity measure between them, derives an alignment, and then verifies it to ensure that it does not contain semantic inconsistencies. In this paper, we describe the ASMOV algorithm, and then present experimental results that measure its accuracy using the OAEI 2008 tests, and that evaluate its use with two different thesauri: WordNet, and the Unified Medical Language System (UMLS). These results show the increased accuracy obtained by combining lexical, structural and extensional matchers with semantic verification, and demonstrate the advantage of using a domain-specific thesaurus for the alignment of specialized ontologies.

  19. What is the Final Verification of Engineering Requirements?

    NASA Technical Reports Server (NTRS)

    Poole, Eric

    2010-01-01

    This slide presentation reviews the development process through the final verification of engineering requirements. The definition of the requirements is driven by basic needs and should be reviewed by both the supplier and the customer. All involved need to agree upon a formal requirements document, including changes to the original requirements. After the requirements have been developed, the engineering team begins to design the system; the final design is reviewed by other organizations. The final operational system must satisfy the original requirements, though many verifications should be performed during the process. The verification methods used are test, inspection, analysis, and demonstration. The plan for verification should be created once the system requirements are documented. The plan should include assurances that every requirement is formally verified, that the methods and the responsible organizations are specified, and that the plan is reviewed by all parties. The presentation also discusses the option of having the engineering team involved in all phases of development, as opposed to having another organization continue the process once the design is complete.

  20. ENVIRONMENTAL TECHNOLOGY VERIFICATION PROGRAM

    EPA Science Inventory

    This presentation will be given at the EPA Science Forum 2005 in Washington, DC. The Environmental Technology Verification Program (ETV) was initiated in 1995 to speed implementation of new and innovative commercial-ready environmental technologies by providing objective, 3rd pa...

  1. Linear models to perform treaty verification tasks for enhanced information security

    NASA Astrophysics Data System (ADS)

    MacGahan, Christopher J.; Kupinski, Matthew A.; Brubaker, Erik M.; Hilton, Nathan R.; Marleau, Peter A.

    2017-02-01

    Linear mathematical models were applied to binary-discrimination tasks relevant to arms control verification measurements in which a host party wishes to convince a monitoring party that an item is or is not treaty accountable. These models process data in list-mode format and can compensate for the presence of variability in the source, such as uncertain object orientation and location. The Hotelling observer applies an optimal set of weights to binned detector data, yielding a test statistic that is thresholded to make a decision. The channelized Hotelling observer applies a channelizing matrix to the vectorized data, resulting in a lower dimensional vector available to the monitor to make decisions. We demonstrate how incorporating additional terms in this channelizing-matrix optimization offers benefits for treaty verification. We present two methods to increase shared information and trust between the host and monitor. The first method penalizes individual channel performance in order to maximize the information available to the monitor while maintaining optimal performance. Second, we present a method that penalizes predefined sensitive information while maintaining the capability to discriminate between binary choices. Data used in this study was generated using Monte Carlo simulations for fission neutrons, accomplished with the GEANT4 toolkit. Custom models for plutonium inspection objects were measured in simulation by a radiation imaging system. Model performance was evaluated and presented using the area under the receiver operating characteristic curve.
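
    The Hotelling observer at the core of these models reduces to a simple linear template; the sketch below estimates it from training samples under each hypothesis using synthetic data. The channelized variant and the information-security penalty terms described in the abstract are not shown.

    ```python
    import numpy as np

    def hotelling_observer(g0, g1):
        """Hotelling template from training data. Rows are measurements (binned
        detector data); g0/g1 are samples under hypotheses H0 and H1."""
        s = g1.mean(axis=0) - g0.mean(axis=0)                 # mean signal difference
        K = 0.5 * (np.cov(g0, rowvar=False) + np.cov(g1, rowvar=False))
        return np.linalg.solve(K, s)                          # optimal linear weights

    def decide(w, g, threshold):
        """Thresholded test statistic yields the binary decision."""
        return float(w @ g) > threshold

    # Toy data: 500 noisy measurements per hypothesis, 64 detector bins
    rng = np.random.default_rng(1)
    signal = np.linspace(0.0, 1.0, 64)
    g0 = rng.normal(0.0, 1.0, (500, 64))
    g1 = rng.normal(0.0, 1.0, (500, 64)) + signal
    w = hotelling_observer(g0, g1)
    t0, t1 = g0 @ w, g1 @ w   # test-statistic distributions, e.g. for an ROC curve
    ```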

  2. Design and Realization of Controllable Ultrasonic Fault Detector Automatic Verification System

    NASA Astrophysics Data System (ADS)

    Sun, Jing-Feng; Liu, Hui-Ying; Guo, Hui-Juan; Shu, Rong; Wei, Kai-Li

    The ultrasonic flaw detection equipment with remote control interface is researched and the automatic verification system is developed. According to use extensible markup language, the building of agreement instruction set and data analysis method database in the system software realizes the controllable designing and solves the diversification of unreleased device interfaces and agreements. By using the signal generator and a fixed attenuator cascading together, a dynamic error compensation method is proposed, completes what the fixed attenuator does in traditional verification and improves the accuracy of verification results. The automatic verification system operating results confirms that the feasibility of the system hardware and software architecture design and the correctness of the analysis method, while changes the status of traditional verification process cumbersome operations, and reduces labor intensity test personnel.

  3. Evaluation of HCFC AK 225 Alternatives for Precision Cleaning and Verification

    NASA Technical Reports Server (NTRS)

    Melton, D. M.

    1998-01-01

    Maintaining qualified cleaning and verification processes is essential in a production environment. Environmental regulations have impacted, and continue to impact, cleaning and verification processing for components and large structures, both at the Michoud Assembly Facility and at component suppliers. The goal of the effort was to assure that cleaning and verification proceed unimpeded and that qualified, environmentally compliant material and process replacements are implemented and perform to specifications. The approach consisted of (1) selection of a Supersonic Gas-Liquid Cleaning System; (2) selection and evaluation of three cleaning and verification solvents as candidate alternatives to HCFC 225 (Vertrel 423 (HCFC), Vertrel MCA (HFC/1,2-Dichloroethylene), and HFE 7100DE (HFE/1,2-Dichloroethylene)); and (3) evaluation of an analytical instrumental post-cleaning verification technique. This document is presented in viewgraph format.

  4. Implementation and verification of global optimization benchmark problems

    NASA Astrophysics Data System (ADS)

    Posypkin, Mikhail; Usov, Alexander

    2017-12-01

    The paper considers the implementation and verification of a test suite containing 150 benchmarks for global deterministic box-constrained optimization. A C++ library for describing standard mathematical expressions was developed for this purpose. The library automates the generation of the value of a function and its gradient at a given point, and of interval estimates of the function and its gradient on a given box, from a single description. Based on this functionality, we have developed a collection of tests for automatic verification of the proposed benchmarks. The verification showed that literature sources contain mistakes in the benchmark descriptions. The library and the test suite are available for download and can be used freely.
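
    The single-description idea, one expression yielding both point values and interval enclosures, can be illustrated with a toy interval type; this is a hedged sketch, not the authors' C++ library, and the function f is an illustrative example.

        from dataclasses import dataclass

        @dataclass
        class Interval:
            lo: float
            hi: float
            def __add__(self, other):
                return Interval(self.lo + other.lo, self.hi + other.hi)
            def __mul__(self, other):
                p = (self.lo * other.lo, self.lo * other.hi,
                     self.hi * other.lo, self.hi * other.hi)
                return Interval(min(p), max(p))

        def f(x):
            # One description works for both floats and Intervals: f(x) = x*x + x
            return x * x + x

        print(f(2.0))                  # point evaluation: 6.0
        # Interval enclosure of f on the box [-1, 2]; the naive enclosure
        # [-3, 6] safely overestimates the true range [-0.25, 6].
        print(f(Interval(-1.0, 2.0)))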

  5. Projected Impact of Compositional Verification on Current and Future Aviation Safety Risk

    NASA Technical Reports Server (NTRS)

    Reveley, Mary S.; Withrow, Colleen A.; Leone, Karen M.; Jones, Sharon M.

    2014-01-01

    The projected impact of compositional verification research conducted by the National Aeronautics and Space Administration System-Wide Safety and Assurance Technologies on aviation safety risk was assessed. Software and compositional verification were described. Traditional verification techniques have two major problems: testing at the prototype stage, where error discovery can be quite costly, and the inability to test for all potential interactions, which leaves some errors undetected until encountered by the end user. Increasingly complex and nondeterministic aviation systems are becoming too large for these tools to check and verify. Compositional verification is a "divide and conquer" solution to addressing increasingly larger and more complex systems. A review of compositional verification research being conducted by academia, industry, and Government agencies is provided. Forty-four aviation safety risks in the Biennial NextGen Safety Issues Survey were identified that could be impacted by compositional verification and grouped into five categories: automation design; system complexity; software, flight control, or equipment failure or malfunction; new technology or operations; and verification and validation. One capability, one research action, five operational improvements, and 13 enablers within the Federal Aviation Administration Joint Planning and Development Office Integrated Work Plan that could be addressed by compositional verification were identified.

  6. Age and gender-invariant features of handwritten signatures for verification systems

    NASA Astrophysics Data System (ADS)

    AbdAli, Sura; Putz-Leszczynska, Joanna

    2014-11-01

    The handwritten signature is one of the most natural biometrics, i.e., measurable human physiological and behavioral patterns. Behavioral biometrics includes signatures, which may differ with the owner's gender or age because of intrinsic or extrinsic factors. This paper presents the results of the authors' research on the influence of age and gender on verification factors. The experiments were conducted using a database that contains signatures and their associated metadata. The algorithm used is based on the universal forgery feature idea, where a global classifier is able to classify a signature as genuine or forged without actual knowledge of the signature template or its owner. Additionally, dimensionality reduction with the MRMR method is discussed.

  7. Ada(R) Test and Verification System (ATVS)

    NASA Technical Reports Server (NTRS)

    Strelich, Tom

    1986-01-01

    The Ada Test and Verification System (ATVS) functional description and high level design are completed and summarized. The ATVS will provide a comprehensive set of test and verification capabilities specifically addressing the features of the Ada language, support for embedded system development, distributed environments, and advanced user interface capabilities. Its design emphasis was on effective software development environment integration and flexibility to ensure its long-term use in the Ada software development community.

  8. Distance Metric Learning Using Privileged Information for Face Verification and Person Re-Identification.

    PubMed

    Xu, Xinxing; Li, Wen; Xu, Dong

    2015-12-01

    In this paper, we propose a new approach to improve face verification and person re-identification in RGB images by leveraging a set of RGB-D data, in which we have additional depth images in the training data captured using depth cameras such as Kinect. In particular, we extract visual features and depth features from the RGB images and depth images, respectively. As the depth features are available only in the training data, we treat the depth features as privileged information, and we formulate this task as a distance metric learning with privileged information problem. Unlike the traditional face verification and person re-identification tasks that only use visual features, we further employ the extra depth features in the training data to improve the learning of the distance metric in the training process. Based on the information-theoretic metric learning (ITML) method, we propose a new formulation called ITML with privileged information (ITML+) for this task. We also present an efficient algorithm based on the cyclic projection method for solving the proposed ITML+ formulation. Extensive experiments on the challenging face data sets EUROCOM and CurtinFaces for face verification, as well as the BIWI RGBD-ID data set for person re-identification, demonstrate the effectiveness of our proposed approach.
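
    In metric-learning approaches such as ITML+, the verification decision ultimately thresholds a learned Mahalanobis distance; the sketch below shows only that decision rule, not the ITML+ optimization itself, and the metric matrix and threshold are illustrative placeholders.

        import numpy as np

        def mahalanobis_sq(x, y, M):
            """Squared Mahalanobis distance d_M(x, y)^2 = (x - y)^T M (x - y),
            where M is a learned positive semi-definite matrix."""
            diff = x - y
            return diff @ M @ diff

        # Illustrative: with M = I this reduces to squared Euclidean distance;
        # a learned metric from training would replace it.
        rng = np.random.default_rng(0)
        x, y = rng.normal(size=5), rng.normal(size=5)
        M = np.eye(5)
        same_person = mahalanobis_sq(x, y, M) < 4.0  # threshold tuned on validation data
        print(same_person)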

  9. Developing a NASA strategy for the verification of large space telescope observatories

    NASA Astrophysics Data System (ADS)

    Crooke, Julie A.; Gunderson, Johanna A.; Hagopian, John G.; Levine, Marie

    2006-06-01

    In July 2005, the Office of Program Analysis and Evaluation (PA&E) at NASA Headquarters was directed to develop a strategy for verification of the performance of large space telescope observatories, which occurs predominantly in a thermal vacuum test facility. A mission model of the expected astronomical observatory missions over the next 20 years was identified along with performance, facility and resource requirements. Ground testing versus alternatives was analyzed to determine the pros, cons and break points in the verification process. Existing facilities and their capabilities were examined across NASA, industry and other government agencies as well as the future demand for these facilities across NASA's Mission Directorates. Options were developed to meet the full suite of mission verification requirements, and performance, cost, risk and other analyses were performed. Findings and recommendations from the study were presented to the NASA Administrator and the NASA Strategic Management Council (SMC) in February 2006. This paper details the analysis, results, and findings from this study.

  10. Formal verification of an avionics microprocessor

    NASA Technical Reports Server (NTRS)

    Srivas, Mandayam K.; Miller, Steven P.

    1995-01-01

    Formal specification combined with mechanical verification is a promising approach for achieving the extremely high levels of assurance required of safety-critical digital systems. However, many questions remain regarding their use in practice: Can these techniques scale up to industrial systems, where are they likely to be useful, and how should industry go about incorporating them into practice? This report discusses a project undertaken to answer some of these questions, the formal verification of the AAMP5 microprocessor. This project consisted of formally specifying in the PVS language a Rockwell proprietary microprocessor at both the instruction-set and register-transfer levels and using the PVS theorem prover to show that the microcode correctly implemented the instruction-level specification for a representative subset of instructions. Notable aspects of this project include the use of a formal specification language by practicing hardware and software engineers, the integration of traditional inspections with formal specifications, and the use of a mechanical theorem prover to verify a portion of a commercial, pipelined microprocessor that was not explicitly designed for formal verification.

  11. SITE CHARACTERIZATION AND MONITORING TECHNOLOGY VERIFICATION: PROGRESS AND RESULTS

    EPA Science Inventory

    The Site Characterization and Monitoring Technology Pilot of the U.S. Environmental Protection Agency's Environmental Technology Verification Program (ETV) has been engaged in verification activities since the fall of 1994 (U.S. EPA, 1997). The purpose of the ETV is to promote th...

  12. Certification and verification for Calmac flat plate solar collector

    NASA Technical Reports Server (NTRS)

    1978-01-01

    Information used in the certification and verification of the Calmac Flat Plate Collector is presented. Contained are such items as test procedures and results, information on materials used, installation, operation, and maintenance manuals, and other information pertaining to the verification and certification.

  13. 40 CFR 1066.240 - Torque transducer verification and calibration.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ...) AIR POLLUTION CONTROLS VEHICLE-TESTING PROCEDURES Dynamometer Specifications § 1066.240 Torque transducer verification and calibration. Calibrate torque-measurement systems as described in 40 CFR 1065.310. ... 40 Protection of Environment 34 2013-07-01 2013-07-01 false Torque transducer verification and...

  14. 40 CFR 1066.240 - Torque transducer verification and calibration.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ...) AIR POLLUTION CONTROLS VEHICLE-TESTING PROCEDURES Dynamometer Specifications § 1066.240 Torque transducer verification and calibration. Calibrate torque-measurement systems as described in 40 CFR 1065.310. ... 40 Protection of Environment 34 2012-07-01 2012-07-01 false Torque transducer verification and...

  15. 48 CFR 552.204-9 - Personal Identity Verification requirements.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... 48 Federal Acquisition Regulations System 4 2014-10-01 2014-10-01 false Personal Identity....204-9 Personal Identity Verification requirements. As prescribed in 504.1303, insert the following clause: Personal Identity Verification Requirements (OCT 2012) (a) The contractor shall comply with GSA...

  16. 48 CFR 552.204-9 - Personal Identity Verification requirements.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... 48 Federal Acquisition Regulations System 4 2012-10-01 2012-10-01 false Personal Identity....204-9 Personal Identity Verification requirements. As prescribed in 504.1303, insert the following clause: Personal Identity Verification Requirements (OCT 2012) (a) The contractor shall comply with GSA...

  17. 48 CFR 552.204-9 - Personal Identity Verification requirements.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... 48 Federal Acquisition Regulations System 4 2013-10-01 2013-10-01 false Personal Identity....204-9 Personal Identity Verification requirements. As prescribed in 504.1303, insert the following clause: Personal Identity Verification Requirements (OCT 2012) (a) The contractor shall comply with GSA...

  18. Joint ETV/NOWATECH verification protocol for the Sorbisense GSW40 passive sampler

    EPA Science Inventory

    Environmental technology verification (ETV) is an independent (third party) assessment of the performance of a technology or a product for a specified application, under defined conditions and quality assurance. This verification is a joint verification with the US EPA ETV schem...

  19. Measurements for liquid rocket engine performance code verification

    NASA Technical Reports Server (NTRS)

    Praharaj, Sarat C.; Palko, Richard L.

    1986-01-01

    The goal of the rocket engine performance code verification tests is to obtain Isp with an accuracy of 0.25% or better. This needs to be done during the sequence of four related tests (two reactive and two hot-gas simulation) to best utilize the loss separation technique recommended in this study. In addition to Isp, measurements of the input and output parameters for the codes are needed. This study has shown two things regarding the 0.25% Isp uncertainty target. First, the target is generally not being realized at the present time; second, the instrumentation and testing technology does exist to achieve it. However, achieving this goal will require carefully planned, designed, and conducted testing. In addition, the test-stand (or system) dynamics must be evaluated in the pre-test and post-test phases of the design of the experiment and the data analysis, respectively, always keeping in mind that a 0.25% overall uncertainty in Isp is targeted. A table gives the maximum allowable uncertainty required for obtaining Isp with 0.25% uncertainty, the currently quoted instrument specifications, and the present test uncertainty for the parameters. In general, measurement of the mass flow parameter within the required uncertainty appears to be the most difficult.
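
    Since Isp = F/(mdot * g0), the relative uncertainty of Isp combines the thrust and mass-flow uncertainties in root-sum-square fashion, which is why the mass-flow measurement dominates the budget; the numbers below are illustrative, not from the study.

        import math

        def isp_rel_uncertainty(rel_u_thrust, rel_u_mass_flow):
            # Isp = F / (mdot * g0): independent relative uncertainties
            # combine in quadrature (g0 is an exact constant).
            return math.hypot(rel_u_thrust, rel_u_mass_flow)

        # Illustrative budget: 0.15% thrust and 0.20% mass-flow uncertainty
        u = isp_rel_uncertainty(0.0015, 0.0020)
        print(f"combined Isp uncertainty: {u:.4%}")  # 0.2500%, exactly at the target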

  20. Improved Detection Technique for Solvent Rinse Cleanliness Verification

    NASA Technical Reports Server (NTRS)

    Hornung, S. D.; Beeson, H. D.

    2001-01-01

    The NASA White Sands Test Facility (WSTF) has an ongoing effort to reduce or eliminate usage of cleaning solvents such as CFC-113 and its replacements. These solvents are used in the final clean and cleanliness verification processes for flight and ground support hardware, especially for oxygen systems where organic contaminants can pose an ignition hazard. For the final cleanliness verification in the standard process, the equivalent of one square foot of surface area of parts is rinsed with the solvent, and the final 100 mL of the rinse is captured. The amount of nonvolatile residue (NVR) in the solvent is determined by weight after the evaporation of the solvent. An improved process of sampling this rinse, developed at WSTF, requires evaporation of less than 2 mL of the solvent to make the cleanliness verification. Small amounts of the solvent are evaporated in a clean stainless steel cup, and the cleanliness of the stainless steel cup is measured using a commercially available surface quality monitor. The effectiveness of this new cleanliness verification technique was compared to the accepted NVR sampling procedures. Testing with known contaminants in solution, such as hydraulic fluid, fluorinated lubricants, and cutting and lubricating oils, was performed to establish a correlation between amount in solution and the process response. This report presents the approach and results and discusses the issues in establishing the surface quality monitor-based cleanliness verification.

  1. Transmutation Fuel Performance Code Thermal Model Verification

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gregory K. Miller; Pavel G. Medvedev

    2007-09-01

    The FRAPCON fuel performance code is being modified to model the performance of nuclear fuels of interest to the Global Nuclear Energy Partnership (GNEP). The present report documents the verification of the FRAPCON thermal model. It was found that, with minor modifications, the FRAPCON thermal model's temperature calculation agrees with that of the commercial software ABAQUS (Version 6.4-4). This report outlines the verification methodology, the code input, and the calculation results.

  2. Specification, Validation and Verification of Mobile Application Behavior

    DTIC Science & Technology

    2013-03-01

    Master's thesis: Specification, Validation and Verification of Mobile Application Behavior, by Christopher B. Bonine, Lieutenant, United States Navy, B.S. Southern Polytechnic State..., Naval Postgraduate School, Monterey, CA 93943-5000, March 2013. Thesis advisor: Man-Tak Shing. Thesis co...

  3. Control and Non-Payload Communications (CNPC) Prototype Radio Verification Test Report

    NASA Technical Reports Server (NTRS)

    Bishop, William D.; Frantz, Brian D.; Thadhani, Suresh K.; Young, Daniel P.

    2017-01-01

    This report provides an overview of, and results from, the verification of the specifications that define the operational capabilities of the airborne and ground, L-band and C-band, Control and Non-Payload Communications radio link system. An overview of system verification is provided along with an overview of the operation of the radio. Measurement results are presented for verification of the radio's operation.

  4. A novel verification method using a plastic scintillator imaging system for assessment of gantry sag in radiotherapy.

    PubMed

    Tsuneda, Masato; Nishio, Teiji; Saito, Akito; Tanaka, Sodai; Suzuki, Tatsuhiko; Kawahara, Daisuke; Matsushita, Keiichiro; Nishio, Aya; Ozawa, Shuichi; Karasawa, Kumiko; Nagata, Yasushi

    2018-06-01

    High accuracy of the beam-irradiated position is required for high-precision radiation therapy such as stereotactic body radiation therapy (SBRT), volumetric modulated arc therapy (VMAT), and intensity modulated radiation therapy (IMRT). Users generally perform verification of the mechanical and radiation isocenters using the star shot test and the Winston-Lutz test, which allow evaluation of the displacement at the isocenter. However, these methods are unable to directly and quantitatively evaluate the sagging angle along the gantry rotation axis that is caused by the weight of the gantry itself. In addition, verification of the central axis of the irradiated beam that does not depend on the isocenter is needed for the mechanical quality assurance of nonisocentric irradiation techniques. In this study, we have developed a prototype system for the verification of three-dimensional (3D) beam alignment, and we have verified the system concept for 3D isocentricity. Our system allows detection of the central axis in 3D coordinates and evaluation of the oblique angle of the irradiated beam to the gantry rotation axis, i.e., the sagging angle. In order to measure the central axis of the irradiated beam in 3D coordinates, we constructed a prototype verification system consisting of a column-shaped plastic scintillator (CoPS), a truncated cone-shaped mirror (TCsM), and a cooled charge-coupled device (CCD) camera. This verification system was irradiated with 6-MV photon beams and the scintillation light was measured using the CCD camera. The central axis on the axial plane (the two-dimensional (2D) central axis) was acquired from the integration of the scintillation light along the major axis of the CoPS, and the central axis in 3D coordinates (the 3D central axis) was acquired from two curve-shaped profiles reflected by the TCsM. We verified the calculation accuracy of the gantry rotation axis, θz. Additionally, we calculated the 3D central axis and the sagging angle

  5. Space shuttle propulsion estimation development verification, volume 1

    NASA Technical Reports Server (NTRS)

    Rogers, Robert M.

    1989-01-01

    The results of the Propulsion Estimation Development Verification are summarized. A computer program developed under a previous contract (NAS8-35324) was modified to include improved models for the Solid Rocket Booster (SRB) internal ballistics, the Space Shuttle Main Engine (SSME) power coefficient model, the vehicle dynamics using quaternions, and an improved Kalman filter algorithm based on the U-D factorized algorithm. As additional output, the estimated propulsion performances for each device are computed with the associated 1-sigma bounds. The outputs of the estimation program are provided in graphical plots. An additional effort was expended to examine the use of the estimation approach to evaluate single engine test data. In addition to the propulsion estimation program PFILTER, a program was developed to produce a best estimate of trajectory (BET). This program, LFILTER, also uses the U-D factorized form of the Kalman filter, as in the propulsion estimation program PFILTER. The necessary definitions and equations explaining the Kalman filtering approach for the PFILTER program, the dynamics and measurement models used in this application, a program description, and the program operation are presented.
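
    The estimators described are built on the U-D factorized Kalman filter; the plain covariance form of the measurement update, to which the U-D form is algebraically equivalent, is sketched below for orientation. All names and values are illustrative, not from the PFILTER/LFILTER programs.

        import numpy as np

        def kalman_update(x, P, z, H, R):
            """Standard Kalman measurement update. The U-D variant factors
            P = U D U^T and propagates U and D for numerical stability,
            but computes the same estimate."""
            S = H @ P @ H.T + R               # innovation covariance
            K = P @ H.T @ np.linalg.inv(S)    # Kalman gain
            x_new = x + K @ (z - H @ x)       # state estimate update
            P_new = (np.eye(len(x)) - K @ H) @ P
            return x_new, P_new

        # One scalar-measurement update of a 2-state estimate
        x = np.array([0.0, 1.0]); P = np.eye(2)
        H = np.array([[1.0, 0.0]]); R = np.array([[0.1]])
        x, P = kalman_update(x, P, np.array([0.2]), H, R)
        print(x)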

  6. A Hybrid On-line Verification Method of Relay Setting

    NASA Astrophysics Data System (ADS)

    Gao, Wangyuan; Chen, Qing; Si, Ji; Huang, Xin

    2017-05-01

    Along with the rapid development of the power industry, grid structures have grown more sophisticated. The validity and rationality of protective relaying are vital to the security of power systems, so it is essential to verify the setting values of relays online. Traditional verification methods mainly include comparison of protection range and comparison of calculated setting value. For on-line verification, verifying speed is the key. Comparing protection ranges gives accurate results, but the computational burden is heavy and the verification is slow. Comparing calculated setting values is much faster, but the results are conservative and inaccurate. Taking overcurrent protection as an example, this paper analyses the advantages and disadvantages of the two traditional methods and proposes a hybrid on-line verification method that synthesizes their advantages. This hybrid method can meet the requirements of accurate on-line verification.

  7. Mapping ¹⁵O Production Rate for Proton Therapy Verification

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Grogg, Kira; Alpert, Nathaniel M.; Zhu, Xuping

    Purpose: This work was a proof-of-principle study for the evaluation of oxygen-15 (¹⁵O) production as an imaging target through the use of positron emission tomography (PET), to improve verification of proton treatment plans and to study the effects of perfusion. Methods and Materials: Dynamic PET measurements of irradiation-produced isotopes were made for a phantom and rabbit thigh muscles. The rabbit muscle was irradiated and imaged under both live and dead conditions. A differential equation was fitted to phantom and in vivo data, yielding estimates of ¹⁵O production and clearance rates, which were compared to live versus dead rates for the rabbit and to Monte Carlo predictions. Results: PET clearance rates agreed with decay constants of the dominant radionuclide species in 3 different phantom materials. In 2 oxygen-rich materials, the ratio of ¹⁵O production rates agreed with the expected ratio. In the dead rabbit thighs, the dynamic PET concentration histories were accurately described using the ¹⁵O decay constant, whereas the live thigh activity decayed faster. Most importantly, the ¹⁵O production rates agreed within 2% (P>.5) between conditions. Conclusions: We developed a new method for quantitative measurement of ¹⁵O production and clearance rates in the period immediately following proton therapy. Measurements in the phantom and rabbits were well described in terms of ¹⁵O production and clearance rates, plus a correction for other isotopes. These proof-of-principle results support the feasibility of detailed verification of proton therapy treatment delivery. In addition, ¹⁵O clearance rates may be useful in monitoring permeability changes due to therapy.
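
    A hedged sketch of the kind of rate fitting described: post-irradiation activity modeled as an initial amplitude cleared by radioactive decay plus biological washout, fit with scipy. The model form, symbols, and data are illustrative assumptions, not the authors' exact differential equation.

        import numpy as np
        from scipy.optimize import curve_fit

        O15_DECAY = np.log(2) / 122.24          # 15O decay constant, 1/s

        def post_irradiation_activity(t, a0, k_clear):
            # Activity = initial amplitude times radioactive decay plus
            # biological clearance (washout) in live tissue
            return a0 * np.exp(-(O15_DECAY + k_clear) * t)

        # Illustrative dynamic-PET samples (seconds, arbitrary activity units)
        t = np.linspace(0, 300, 31)
        rng = np.random.default_rng(1)
        y = post_irradiation_activity(t, 100.0, 2e-3) * rng.normal(1, 0.02, t.size)

        (a0, k_clear), _ = curve_fit(post_irradiation_activity, t, y, p0=(90.0, 1e-3))
        print(f"fitted amplitude {a0:.1f}, clearance rate {k_clear:.2e} 1/s")
        # Dead tissue: k_clear ~ 0 (pure decay); live tissue decays faster.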

  8. LIVVkit: An extensible, python-based, land ice verification and validation toolkit for ice sheet models

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kennedy, Joseph H.; Bennett, Andrew R.; Evans, Katherine J.

    To address the pressing need to better understand the behavior and complex interaction of ice sheets within the global Earth system, significant development of continental-scale, dynamical ice sheet models is underway. Concurrent to the development of the Community Ice Sheet Model (CISM), the corresponding verification and validation (V&V) process is being coordinated through a new, robust, Python-based extensible software package, the Land Ice Verification and Validation toolkit (LIVVkit). Incorporated into the typical ice sheet model development cycle, it provides robust and automated numerical verification, software verification, performance validation, and physical validation analyses on a variety of platforms, from personal laptops to the largest supercomputers. LIVVkit operates on sets of regression test and reference data sets, and provides comparisons for a suite of community prioritized tests, including configuration and parameter variations, bit-for-bit evaluation, and plots of model variables to indicate where differences occur. LIVVkit also provides an easily extensible framework to incorporate and analyze results of new intercomparison projects, new observation data, and new computing platforms. LIVVkit is designed for quick adaptation to additional ice sheet models via abstraction of model specific code, functions, and configurations into an ice sheet model description bundle outside the main LIVVkit structure. Furthermore, through shareable and accessible analysis output, LIVVkit is intended to help developers build confidence in their models and enhance the credibility of ice sheet models overall.

  9. LIVVkit: An extensible, python-based, land ice verification and validation toolkit for ice sheet models

    DOE PAGES

    Kennedy, Joseph H.; Bennett, Andrew R.; Evans, Katherine J.; ...

    2017-03-23

    To address the pressing need to better understand the behavior and complex interaction of ice sheets within the global Earth system, significant development of continental-scale, dynamical ice sheet models is underway. Concurrent to the development of the Community Ice Sheet Model (CISM), the corresponding verification and validation (V&V) process is being coordinated through a new, robust, Python-based extensible software package, the Land Ice Verification and Validation toolkit (LIVVkit). Incorporated into the typical ice sheet model development cycle, it provides robust and automated numerical verification, software verification, performance validation, and physical validation analyses on a variety of platforms, from personal laptops to the largest supercomputers. LIVVkit operates on sets of regression test and reference data sets, and provides comparisons for a suite of community prioritized tests, including configuration and parameter variations, bit-for-bit evaluation, and plots of model variables to indicate where differences occur. LIVVkit also provides an easily extensible framework to incorporate and analyze results of new intercomparison projects, new observation data, and new computing platforms. LIVVkit is designed for quick adaptation to additional ice sheet models via abstraction of model specific code, functions, and configurations into an ice sheet model description bundle outside the main LIVVkit structure. Furthermore, through shareable and accessible analysis output, LIVVkit is intended to help developers build confidence in their models and enhance the credibility of ice sheet models overall.

  10. Verification of Space Weather Forecasts using Terrestrial Weather Approaches

    NASA Astrophysics Data System (ADS)

    Henley, E.; Murray, S.; Pope, E.; Stephenson, D.; Sharpe, M.; Bingham, S.; Jackson, D.

    2015-12-01

    The Met Office Space Weather Operations Centre (MOSWOC) provides a range of 24/7 operational space weather forecasts, alerts, and warnings, which provide valuable information on space weather that can degrade electricity grids, radio communications, and satellite electronics. Forecasts issued include arrival times of coronal mass ejections (CMEs), and probabilistic forecasts for flares, geomagnetic storm indices, and energetic particle fluxes and fluences. These forecasts are produced twice daily using a combination of output from models such as Enlil, near-real-time observations, and forecaster experience. Verification of forecasts is crucial for users, researchers, and forecasters to understand the strengths and limitations of forecasts, and to assess forecaster added value. To this end, the Met Office (in collaboration with Exeter University) has been adapting verification techniques from terrestrial weather, and has been working closely with the International Space Environment Service (ISES) to standardise verification procedures. We will present the results of part of this work, analysing forecast and observed CME arrival times, assessing skill using 2x2 contingency tables. These MOSWOC forecasts can be objectively compared to those produced by the NASA Community Coordinated Modelling Center, a useful benchmark. This approach cannot be taken for the other forecasts, as they are probabilistic and categorical (e.g., geomagnetic storm forecasts give probabilities of exceeding levels from minor to extreme). We will present appropriate verification techniques being developed to address these forecasts, such as rank probability skill score, and comparing forecasts against climatology and persistence benchmarks. As part of this, we will outline the use of discrete time Markov chains to assess and improve the performance of our geomagnetic storm forecasts. We will also discuss work to adapt a terrestrial verification visualisation system to space weather, to help
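
    For the 2x2 contingency-table assessment of CME arrival forecasts described above, the standard categorical scores can be computed as follows; the counts used here are illustrative, not MOSWOC data.

        def contingency_scores(hits, misses, false_alarms, correct_negatives):
            """Standard 2x2 categorical verification scores."""
            pod = hits / (hits + misses)                # probability of detection
            far = false_alarms / (hits + false_alarms)  # false alarm ratio
            n = hits + misses + false_alarms + correct_negatives
            # Heidke skill score: accuracy relative to random chance
            expected = ((hits + misses) * (hits + false_alarms)
                        + (correct_negatives + misses)
                        * (correct_negatives + false_alarms)) / n
            hss = (hits + correct_negatives - expected) / (n - expected)
            return pod, far, hss

        # Illustrative event counts for a yes/no arrival-time forecast
        print(contingency_scores(hits=20, misses=5,
                                 false_alarms=8, correct_negatives=40))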

  11. Integrating Fingerprint Verification into the Smart Card-Based Healthcare Information System

    NASA Astrophysics Data System (ADS)

    Moon, Daesung; Chung, Yongwha; Pan, Sung Bum; Park, Jin-Won

    2009-12-01

    As VLSI technology has been improved, a smart card employing 32-bit processors has been released, and more personal information such as medical, financial data can be stored in the card. Thus, it becomes important to protect personal information stored in the card. Verification of the card holder's identity using a fingerprint has advantages over the present practices of Personal Identification Numbers (PINs) and passwords. However, the computational workload of fingerprint verification is much heavier than that of the typical PIN-based solution. In this paper, we consider three strategies to implement fingerprint verification in a smart card environment and how to distribute the modules of fingerprint verification between the smart card and the card reader. We first evaluate the number of instructions of each step of a typical fingerprint verification algorithm, and estimate the execution time of several cryptographic algorithms to guarantee the security/privacy of the fingerprint data transmitted in the smart card with the client-server environment. Based on the evaluation results, we analyze each scenario with respect to the security level and the real-time execution requirements in order to implement fingerprint verification in the smart card with the client-server environment.

  12. Verification and validation of a rapid heat transfer calculation methodology for transient melt pool solidification conditions in powder bed metal additive manufacturing

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Plotkowski, A.; Kirka, M. M.; Babu, S. S.

    A fundamental understanding of spatial and temporal thermal distributions is crucial for predicting solidification and solid-state microstructural development in parts made by additive manufacturing. While sophisticated numerical techniques that are based on finite element or finite volume methods are useful for gaining insight into these phenomena at the length scale of the melt pool (100 - 500 µm), they are ill-suited for predicting engineering trends over full part cross-sections (> 10 x 10 cm) or many layers over long process times (> many days) due to the necessity of fully resolving the heat source characteristics. On the other hand, it is extremely difficult to resolve the highly dynamic nature of the process using purely in-situ characterization techniques. This article proposes a pragmatic alternative based on a semi-analytical approach to predicting the transient heat conduction during the powder bed metal additive manufacturing process. The model calculations were theoretically verified for selective laser melting of AlSi10Mg and electron beam melting of IN718 powders for simple cross-sectional geometries, and the transient results are compared to steady state predictions from the Rosenthal equation. It is shown that the transient effects of the scan strategy create significant variations in the melt pool geometry and solid-liquid interface velocity, especially as the thermal diffusivity of the material decreases and the pre-heat of the process increases. With positive verification of the strategy, the model was then experimentally validated to simulate two point-melt scan strategies during electron beam melting of IN718, one intended to produce a columnar and one an equiaxed grain structure. Lastly, through comparison of the solidification conditions (i.e., transient and spatial variations of thermal gradient and liquid-solid interface velocity) predicted by the model to phenomenological CET theory, the model accurately predicted the experimental

  13. Verification and validation of a rapid heat transfer calculation methodology for transient melt pool solidification conditions in powder bed metal additive manufacturing

    DOE PAGES

    Plotkowski, A.; Kirka, M. M.; Babu, S. S.

    2017-10-16

    A fundamental understanding of spatial and temporal thermal distributions is crucial for predicting solidification and solid-state microstructural development in parts made by additive manufacturing. While sophisticated numerical techniques that are based on finite element or finite volume methods are useful for gaining insight into these phenomena at the length scale of the melt pool (100 - 500 µm), they are ill-suited for predicting engineering trends over full part cross-sections (> 10 x 10 cm) or many layers over long process times (> many days) due to the necessity of fully resolving the heat source characteristics. On the other hand, it is extremely difficult to resolve the highly dynamic nature of the process using purely in-situ characterization techniques. This article proposes a pragmatic alternative based on a semi-analytical approach to predicting the transient heat conduction during the powder bed metal additive manufacturing process. The model calculations were theoretically verified for selective laser melting of AlSi10Mg and electron beam melting of IN718 powders for simple cross-sectional geometries, and the transient results are compared to steady state predictions from the Rosenthal equation. It is shown that the transient effects of the scan strategy create significant variations in the melt pool geometry and solid-liquid interface velocity, especially as the thermal diffusivity of the material decreases and the pre-heat of the process increases. With positive verification of the strategy, the model was then experimentally validated to simulate two point-melt scan strategies during electron beam melting of IN718, one intended to produce a columnar and one an equiaxed grain structure. Lastly, through comparison of the solidification conditions (i.e., transient and spatial variations of thermal gradient and liquid-solid interface velocity) predicted by the model to phenomenological CET theory, the model accurately predicted the experimental

  14. Simulation based mask defect repair verification and disposition

    NASA Astrophysics Data System (ADS)

    Guo, Eric; Zhao, Shirley; Zhang, Skin; Qian, Sandy; Cheng, Guojie; Vikram, Abhishek; Li, Ling; Chen, Ye; Hsiang, Chingyun; Zhang, Gary; Su, Bo

    2009-10-01

    As the industry moves towards sub-65nm technology nodes, mask inspection, with increased sensitivity and shrinking critical defect size, catches more and more nuisance and false defects. Increased defect counts pose great challenges in post-inspection defect classification and disposition: which defects are real, and among the real defects, which should be repaired and how should the post-repair defects be verified. In this paper, we address the challenges in mask defect verification and disposition, in particular post-repair defect verification, with an efficient methodology using SEM mask defect images and optical inspection mask defect images (the latter only for verification of phase- and transmission-related defects). We demonstrate the flow using programmed mask defects in a sub-65nm technology node design. In total, 20 types of defects were designed, including defects found in typical real circuit environments, with 30 different sizes designed for each type. An SEM image was taken for each programmed defect after the test mask was made. Selected defects were repaired and SEM images from the test mask were taken again. Wafers were printed with the test mask before and after repair as defect printability references. A software tool, SMDD (Simulation-based Mask Defect Disposition), has been used in this study. The software is used to extract edges from the mask SEM images and convert them into polygons saved in GDSII format. Then, the converted polygons from the SEM images were filled with the correct tone to form mask patterns and merged back into the original GDSII design file. This merge is for the purpose of contour simulation, since the SEM images normally cover only a small area (~1 μm) and accurate simulation requires including a larger area for the optical proximity effect. With a lithography process model, the resist contour of the area of interest (AOI, the area surrounding a mask defect) can be simulated. If such a complicated model is not available, a simple

  15. Formal System Verification for Trustworthy Embedded Systems

    DTIC Science & Technology

    2011-04-19

    We had previously achieved code-level formal verification of the seL4 microkernel [3]. In the present project, over 12 months with 0.6 FTE..., we designed and implemented a secure network access device (SAC) on top of the verified seL4 microkernel. The device allows a trusted front... [Cited reference: Engelhardt, Rafal Kolanski, Michael Norrish, Thomas Sewell, Harvey Tuch, and Simon Winwood. seL4: Formal verification of an OS kernel. CACM, 53(6):107]

  16. Two-Black Box Concept for Warhead Verification

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bates, Cameron Russell; Frame, Katherine Chiyoko; Mckigney, Edward Allen

    2017-03-06

    We have created a possible solution to meeting the requirements of certification/authentication while still employing complicated criteria. Technical solutions for protecting information from the host in an inspection environment need to be assessed by those with specific expertise, but LANL can still study the verification problem. The two-black-box framework developed provides another potential solution to the confidence vs. certification paradox.

  17. 18 CFR 385.2005 - Subscription and verification (Rule 2005).

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... 18 Conservation of Power and Water Resources 1 2011-04-01 2011-04-01 false Subscription and verification (Rule 2005). 385.2005 Section 385.2005 Conservation of Power and Water Resources FEDERAL ENERGY... Requirements for Filings in Proceedings Before the Commission § 385.2005 Subscription and verification (Rule...

  18. 18 CFR 385.2005 - Subscription and verification (Rule 2005).

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ... 18 Conservation of Power and Water Resources 1 2013-04-01 2013-04-01 false Subscription and verification (Rule 2005). 385.2005 Section 385.2005 Conservation of Power and Water Resources FEDERAL ENERGY... Requirements for Filings in Proceedings Before the Commission § 385.2005 Subscription and verification (Rule...

  19. 18 CFR 385.2005 - Subscription and verification (Rule 2005).

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... 18 Conservation of Power and Water Resources 1 2014-04-01 2014-04-01 false Subscription and verification (Rule 2005). 385.2005 Section 385.2005 Conservation of Power and Water Resources FEDERAL ENERGY... Requirements for Filings in Proceedings Before the Commission § 385.2005 Subscription and verification (Rule...

  20. 18 CFR 385.2005 - Subscription and verification (Rule 2005).

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... 18 Conservation of Power and Water Resources 1 2012-04-01 2012-04-01 false Subscription and verification (Rule 2005). 385.2005 Section 385.2005 Conservation of Power and Water Resources FEDERAL ENERGY... Requirements for Filings in Proceedings Before the Commission § 385.2005 Subscription and verification (Rule...

  1. 45 CFR 1626.7 - Verification of eligible alien status.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... 45 Public Welfare 4 2014-10-01 2014-10-01 false Verification of eligible alien status. 1626.7... CORPORATION RESTRICTIONS ON LEGAL ASSISTANCE TO ALIENS § 1626.7 Verification of eligible alien status. (a) An alien seeking representation shall submit appropriate documents to verify eligibility, unless the only...

  2. 45 CFR 1626.7 - Verification of eligible alien status.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... 45 Public Welfare 4 2012-10-01 2012-10-01 false Verification of eligible alien status. 1626.7... CORPORATION RESTRICTIONS ON LEGAL ASSISTANCE TO ALIENS § 1626.7 Verification of eligible alien status. (a) An alien seeking representation shall submit appropriate documents to verify eligibility, unless the only...

  3. 45 CFR 1626.7 - Verification of eligible alien status.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... 45 Public Welfare 4 2013-10-01 2013-10-01 false Verification of eligible alien status. 1626.7... CORPORATION RESTRICTIONS ON LEGAL ASSISTANCE TO ALIENS § 1626.7 Verification of eligible alien status. (a) An alien seeking representation shall submit appropriate documents to verify eligibility, unless the only...

  4. 45 CFR 1626.7 - Verification of eligible alien status.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... 45 Public Welfare 4 2011-10-01 2011-10-01 false Verification of eligible alien status. 1626.7... CORPORATION RESTRICTIONS ON LEGAL ASSISTANCE TO ALIENS § 1626.7 Verification of eligible alien status. (a) An alien seeking representation shall submit appropriate documents to verify eligibility, unless the only...

  5. 45 CFR 1626.7 - Verification of eligible alien status.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 45 Public Welfare 4 2010-10-01 2010-10-01 false Verification of eligible alien status. 1626.7... CORPORATION RESTRICTIONS ON LEGAL ASSISTANCE TO ALIENS § 1626.7 Verification of eligible alien status. (a) An alien seeking representation shall submit appropriate documents to verify eligibility, unless the only...

  6. 29 CFR 403.8 - Dissemination and verification of reports.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... LABOR-MANAGEMENT STANDARDS LABOR ORGANIZATION ANNUAL FINANCIAL REPORTS § 403.8 Dissemination and verification of reports. (a) Every labor organization required to submit a report under section 201(b) of the... 29 Labor 2 2010-07-01 2010-07-01 false Dissemination and verification of reports. 403.8 Section...

  7. A Quantitative Approach to the Formal Verification of Real-Time Systems.

    DTIC Science & Technology

    1996-09-01

    A Quantitative Approach to the Formal Verification of Real-Time Systems, by Sergio Vale Aguiar Campos. Carnegie Mellon University School of Computer Science, September 1996, CMU-CS-96-199. The views and conclusions are those of the author and not necessarily those, expressed or implied, of NSF, the Semiconductor Research Corporation, ARPA, or the U.S. government. Keywords: real-time systems, formal verification, symbolic

  8. Enhanced verification test suite for physics simulation codes

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kamm, James R.; Brock, Jerry S.; Brandon, Scott T.

    2008-09-01

    This document discusses problems with which to augment, in quantity and in quality, the existing tri-laboratory suite of verification problems used by Los Alamos National Laboratory (LANL), Lawrence Livermore National Laboratory (LLNL), and Sandia National Laboratories (SNL). The purpose of verification analysis is to demonstrate whether the numerical results of the discretization algorithms in physics and engineering simulation codes provide correct solutions of the corresponding continuum equations.
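
    Verification analysis of this kind commonly checks the observed order of accuracy of a discretization against an exact solution of the continuum equations; a minimal computation of that diagnostic is sketched below, with illustrative error values.

        import math

        def observed_order(error_coarse, error_fine, refinement_ratio):
            """Observed order of accuracy from errors on two grids:
            p = ln(e_coarse / e_fine) / ln(r)."""
            return math.log(error_coarse / error_fine) / math.log(refinement_ratio)

        # Illustrative L2 errors from a grid-refinement study, halving h each time
        errors = [4.0e-2, 1.0e-2, 2.5e-3]
        for e1, e2 in zip(errors, errors[1:]):
            print(f"observed order: {observed_order(e1, e2, 2.0):.2f}")  # ~2 => 2nd order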

  9. Impact of radiation attenuation by a carbon fiber couch on patient dose verification

    NASA Astrophysics Data System (ADS)

    Yu, Chun-Yen; Chou, Wen-Tsae; Liao, Yi-Jen; Lee, Jeng-Hung; Liang, Ji-An; Hsu, Shih-Ming

    2017-02-01

    The aim of this study was to understand the difference between the measured and calculated irradiation attenuations obtained using two algorithms and to identify the influence of couch attenuation on patient dose verification. We performed eight tests of couch attenuation with two photon energies, two longitudinal couch positions, and two rail positions. The couch attenuation was determined using a radiation treatment planning system, and the measured and calculated attenuations were compared. We also performed 12 verifications of head-and-neck and rectum cases using a Delta phantom. The dose deviation (DD), distance to agreement (DTA), and gamma index of the pencil-beam convolution (PBC) verifications were nearly the same. The agreement was least consistent for the anisotropic analytical algorithm (AAA) without the couch: for the head-and-neck case, the DD, DTA, and gamma index were 74.4%, 99.3%, and 89%, respectively; for the rectum case, the corresponding values were 56.2%, 95.1%, and 92.4%. We suggest that dose verification be performed using the following three metrics simultaneously: DD, DTA, and the gamma index.
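
    For reference, the gamma index mentioned above combines DD and DTA into a single pass/fail quantity; a minimal one-dimensional version under common 3%/3 mm criteria with local dose normalization is sketched below, with illustrative profiles.

        import numpy as np

        def gamma_1d(ref_pos, ref_dose, eval_pos, eval_dose, dd=0.03, dta=3.0):
            """1D gamma index: for each reference point, the minimum combined
            dose-difference / distance-to-agreement metric over evaluated points."""
            gammas = []
            for r, d in zip(ref_pos, ref_dose):
                dose_term = (eval_dose - d) / (dd * d)   # local dose normalization
                dist_term = (eval_pos - r) / dta         # normalized distance, mm
                gammas.append(np.sqrt(dose_term**2 + dist_term**2).min())
            return np.array(gammas)

        # Illustrative profiles (positions in mm, doses in Gy)
        x = np.linspace(-20, 20, 81)
        ref = 2.0 * np.exp(-x**2 / 200.0)
        meas = 2.0 * np.exp(-(x - 0.5)**2 / 205.0)       # slightly shifted, broadened
        g = gamma_1d(x, ref, x, meas)
        print(f"gamma pass rate: {(g <= 1).mean():.1%}")  # gamma <= 1 is a pass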

  10. Development of independent MU/treatment time verification algorithm for non-IMRT treatment planning: A clinical experience

    NASA Astrophysics Data System (ADS)

    Tatli, Hamza; Yucel, Derya; Yilmaz, Sercan; Fayda, Merdan

    2018-02-01

    The aim of this study is to develop an algorithm for independent MU/treatment time (TT) verification for non-IMRT treatment plans, as part of a QA program to ensure treatment delivery accuracy. Two radiotherapy delivery units and their treatment planning systems (TPSs) were commissioned at the Liv Hospital Radiation Medicine Center, Tbilisi, Georgia. Beam data were collected according to the vendors' collection guidelines and AAPM report recommendations, and processed in Microsoft Excel during in-house algorithm development. The algorithm is designed and optimized for calculating SSD and SAD treatment plans, based on the AAPM TG-114 dose calculation recommendations, and is coded and embedded in an MS Excel spreadsheet as a preliminary verification algorithm (VA). Treatment verification plans were created by the TPSs based on IAEA TRS-430 recommendations and were also calculated by the VA; point measurements were collected with a solid water phantom and compared. The study showed that the in-house VA can be used for MU/TT verification of non-IMRT plans.
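
    A minimal sketch of a TG-114-style independent point-dose/MU check of the kind the VA performs, for an SSD plan; the factor names and numerical values are illustrative placeholders, not the authors' commissioning data.

        def verify_mu(dose_cgy, output_cgy_per_mu, pdd, field_size_factor):
            """Independent MU check for an SSD plan:
            MU = prescribed dose / (reference output * PDD(depth) * Sc,p)."""
            return dose_cgy / (output_cgy_per_mu * pdd * field_size_factor)

        # Illustrative: 200 cGy prescribed at 10 cm depth, 10x10 cm field
        mu_independent = verify_mu(dose_cgy=200.0, output_cgy_per_mu=1.0,
                                   pdd=0.67, field_size_factor=1.0)
        mu_tps = 301.0                                # value reported by the TPS
        deviation = (mu_independent - mu_tps) / mu_tps
        print(f"MU = {mu_independent:.1f}, deviation vs TPS = {deviation:.2%}")
        # A deviation within the chosen action level (e.g., 2-3%) passes the check.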

  11. 34 CFR 668.54 - Selection of applications for verification.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 34 Education 3 2010-07-01 2010-07-01 false Selection of applications for verification. 668.54 Section 668.54 Education Regulations of the Offices of the Department of Education (Continued) OFFICE OF POSTSECONDARY EDUCATION, DEPARTMENT OF EDUCATION STUDENT ASSISTANCE GENERAL PROVISIONS Verification of Student Aid Application Information § 668.54...

  12. Self-verification and social anxiety: preference for negative social feedback and low social self-esteem.

    PubMed

    Valentiner, David P; Skowronski, John J; McGrath, Patrick B; Smith, Sarah A; Renner, Kerry A

    2011-10-01

    A self-verification model of social anxiety views negative social self-esteem as a core feature of social anxiety. This core feature is proposed to be maintained through self-verification processes, such as by leading individuals with negative social self-esteem to prefer negative social feedback. This model is tested in two studies. In Study 1, questionnaires were administered to a college sample (N = 317). In Study 2, questionnaires were administered to anxiety disordered patients (N = 62) before and after treatment. Study 1 developed measures of preference for negative social feedback and social self-esteem, and provided evidence of their incremental validity in a college sample. Study 2 found that these two variables are not strongly related to fears of evaluation, are relatively unaffected by a treatment that targets such fears, and predict residual social anxiety following treatment. Overall, these studies provide preliminary evidence for a self-verification model of social anxiety.

  13. NCEP Model Verification

    Science.gov Websites

    NCEP model verification is based on daily and monthly statistics. The daily and monthly verification processing is broken down into three regions: the Northern Hemisphere, the Southern Hemisphere, and the Tropics, covering geopotential height and wind using daily statistics from the gdas1 prepbufr files at 00Z, 06Z, 12Z, and 18Z. Daily S1 scores from the GFS and NAM models are

  14. The Verification-based Analysis of Reliable Multicast Protocol

    NASA Technical Reports Server (NTRS)

    Wu, Yunqing

    1996-01-01

    Reliable Multicast Protocol (RMP) is a communication protocol that provides an atomic, totally ordered, reliable multicast service on top of unreliable IP Multicasting. In this paper, we develop formal models for RMP using existing automatic verification systems, and perform verification-based analysis on the formal RMP specifications. We also use the formal models of the RMP specifications to generate a test suite for conformance testing of the RMP implementation. Throughout the process of RMP development, we follow an iterative, interactive approach that emphasizes concurrent and parallel progress between the implementation and verification processes. Through this approach, we incorporate formal techniques into our development process, promote a common understanding for the protocol, increase the reliability of our software, and maintain high fidelity between the specifications of RMP and its implementation.

  15. Mitigating errors caused by interruptions during medication verification and administration: interventions in a simulated ambulatory chemotherapy setting

    PubMed Central

    Prakash, Varuna; Koczmara, Christine; Savage, Pamela; Trip, Katherine; Stewart, Janice; McCurdie, Tara; Cafazzo, Joseph A; Trbovich, Patricia

    2014-01-01

    Background Nurses are frequently interrupted during medication verification and administration; however, few interventions exist to mitigate resulting errors, and the impact of these interventions on medication safety is poorly understood. Objective The study objectives were to (A) assess the effects of interruptions on medication verification and administration errors, and (B) design and test the effectiveness of targeted interventions at reducing these errors. Methods The study focused on medication verification and administration in an ambulatory chemotherapy setting. A simulation laboratory experiment was conducted to determine interruption-related error rates during specific medication verification and administration tasks. Interventions to reduce these errors were developed through a participatory design process, and their error reduction effectiveness was assessed through a postintervention experiment. Results Significantly more nurses committed medication errors when interrupted than when uninterrupted. With use of interventions when interrupted, significantly fewer nurses made errors in verifying medication volumes contained in syringes (16/18; 89% preintervention error rate vs 11/19; 58% postintervention error rate; p=0.038; Fisher's exact test) and programmed in ambulatory pumps (17/18; 94% preintervention vs 11/19; 58% postintervention; p=0.012). The rate of error commission significantly decreased with use of interventions when interrupted during intravenous push (16/18; 89% preintervention vs 6/19; 32% postintervention; p=0.017) and pump programming (7/18; 39% preintervention vs 1/19; 5% postintervention; p=0.017). No statistically significant differences were observed for other medication verification tasks. Conclusions Interruptions can lead to medication verification and administration errors. Interventions were highly effective at reducing unanticipated errors of commission in medication administration tasks, but showed mixed effectiveness at
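
    The Fisher's exact comparisons above can be checked from the stated counts; a sketch using scipy for the syringe-volume verification comparison (16 of 18 nurses erring pre-intervention vs 11 of 19 post-intervention) follows.

        from scipy.stats import fisher_exact

        # Rows: pre-/post-intervention; columns: error / no error
        table = [[16, 2],
                 [11, 8]]
        stat, p_value = fisher_exact(table, alternative="greater")
        print(f"p = {p_value:.3f}")  # ~0.038, consistent with the value reported above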

  16. Effect of verification cores on tip capacity of drilled shafts.

    DOT National Transportation Integrated Search

    2009-02-01

    This research addressed two key issues: 1) Will verification core holes fill during concrete backfilling? If so, what are the mechanical properties of the filling material? In dry conditions, verification core holes always completely fill with c...

  17. An unattended verification station for UF 6 cylinders: Field trial findings

    DOE PAGES

    Smith, L. E.; Miller, K. A.; McDonald, B. S.; ...

    2017-08-26

    In recent years, the International Atomic Energy Agency (IAEA) has pursued innovative techniques and an integrated suite of safeguards measures to address the verification challenges posed by the front end of the nuclear fuel cycle. Among the unattended instruments currently being explored by the IAEA is an Unattended Cylinder Verification Station (UCVS), which could provide automated, independent verification of the declared relative enrichment, 235U mass, total uranium mass, and identification for all declared uranium hexafluoride cylinders in a facility (e.g., uranium enrichment plants and fuel fabrication plants). Under the auspices of the United States and European Commission Support Programs to the IAEA, a project was undertaken to assess the technical and practical viability of the UCVS concept. The first phase of the UCVS viability study was centered on a long-term field trial of a prototype UCVS system at a fuel fabrication facility. A key outcome of the study was a quantitative performance evaluation of two nondestructive assay (NDA) methods being considered for inclusion in a UCVS: Hybrid Enrichment Verification Array (HEVA), and Passive Neutron Enrichment Meter (PNEM). This paper provides a description of the UCVS prototype design and an overview of the long-term field trial. In conclusion, analysis results and interpretation are presented with a focus on the performance of PNEM and HEVA for the assay of over 200 “typical” Type 30B cylinders, and the viability of an “NDA Fingerprint” concept as a high-fidelity means to periodically verify that material diversion has not occurred.

  18. An unattended verification station for UF 6 cylinders: Field trial findings

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Smith, L. E.; Miller, K. A.; McDonald, B. S.

    In recent years, the International Atomic Energy Agency (IAEA) has pursued innovative techniques and an integrated suite of safeguards measures to address the verification challenges posed by the front end of the nuclear fuel cycle. Among the unattended instruments currently being explored by the IAEA is an Unattended Cylinder Verification Station (UCVS), which could provide automated, independent verification of the declared relative enrichment, 235U mass, total uranium mass, and identification for all declared uranium hexafluoride cylinders in a facility (e.g., uranium enrichment plants and fuel fabrication plants). Under the auspices of the United States and European Commission Support Programs to the IAEA, a project was undertaken to assess the technical and practical viability of the UCVS concept. The first phase of the UCVS viability study was centered on a long-term field trial of a prototype UCVS system at a fuel fabrication facility. A key outcome of the study was a quantitative performance evaluation of two nondestructive assay (NDA) methods being considered for inclusion in a UCVS: Hybrid Enrichment Verification Array (HEVA), and Passive Neutron Enrichment Meter (PNEM). This paper provides a description of the UCVS prototype design and an overview of the long-term field trial. In conclusion, analysis results and interpretation are presented with a focus on the performance of PNEM and HEVA for the assay of over 200 “typical” Type 30B cylinders, and the viability of an “NDA Fingerprint” concept as a high-fidelity means to periodically verify that material diversion has not occurred.

  19. An unattended verification station for UF6 cylinders: Field trial findings

    NASA Astrophysics Data System (ADS)

    Smith, L. E.; Miller, K. A.; McDonald, B. S.; Webster, J. B.; Zalavadia, M. A.; Garner, J. R.; Stewart, S. L.; Branney, S. J.; Todd, L. C.; Deshmukh, N. S.; Nordquist, H. A.; Kulisek, J. A.; Swinhoe, M. T.

    2017-12-01

    In recent years, the International Atomic Energy Agency (IAEA) has pursued innovative techniques and an integrated suite of safeguards measures to address the verification challenges posed by the front end of the nuclear fuel cycle. Among the unattended instruments currently being explored by the IAEA is an Unattended Cylinder Verification Station (UCVS), which could provide automated, independent verification of the declared relative enrichment, 235U mass, total uranium mass, and identification for all declared uranium hexafluoride cylinders in a facility (e.g., uranium enrichment plants and fuel fabrication plants). Under the auspices of the United States and European Commission Support Programs to the IAEA, a project was undertaken to assess the technical and practical viability of the UCVS concept. The first phase of the UCVS viability study was centered on a long-term field trial of a prototype UCVS system at a fuel fabrication facility. A key outcome of the study was a quantitative performance evaluation of two nondestructive assay (NDA) methods being considered for inclusion in a UCVS: Hybrid Enrichment Verification Array (HEVA), and Passive Neutron Enrichment Meter (PNEM). This paper provides a description of the UCVS prototype design and an overview of the long-term field trial. Analysis results and interpretation are presented with a focus on the performance of PNEM and HEVA for the assay of over 200 "typical" Type 30B cylinders, and the viability of an "NDA Fingerprint" concept as a high-fidelity means to periodically verify that material diversion has not occurred.
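
    The "NDA Fingerprint" concept amounts to testing whether a re-measured cylinder signature is statistically consistent with its baseline. A hedged sketch of that idea, not the UCVS algorithm itself: a Poisson-weighted reduced chi-square between two synthetic count spectra.

```python
# Compare a revisit spectrum to a baseline fingerprint; all spectra are synthetic.
import numpy as np

def fingerprint_chi2(baseline: np.ndarray, remeasured: np.ndarray) -> float:
    """Reduced chi-square between two count spectra, assuming Poisson errors."""
    variance = baseline + remeasured        # Var(a - b) for independent Poisson counts
    mask = variance > 0
    return np.sum((remeasured[mask] - baseline[mask]) ** 2 / variance[mask]) / mask.sum()

rng = np.random.default_rng(0)
truth = np.linspace(50, 500, 64)                    # hypothetical mean counts per channel
baseline = rng.poisson(truth).astype(float)
revisit = rng.poisson(truth).astype(float)          # unchanged cylinder: chi2/ndf ~ 1
diverted = rng.poisson(truth * 0.9).astype(float)   # 10% material change: elevated chi2/ndf

print(f"unchanged: chi2/ndf = {fingerprint_chi2(baseline, revisit):.2f}")
print(f"diverted:  chi2/ndf = {fingerprint_chi2(baseline, diverted):.2f}")
```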

  20. Generic Protocol for the Verification of Ballast Water Treatment Technology

    EPA Science Inventory

    In anticipation of the need to address performance verification and subsequent approval of new and innovative ballast water treatment technologies for shipboard installation, the U.S. Coast Guard and the Environmental Protection Agency's Environmental Technology Verification Progr...

  1. The Cooking and Pneumonia Study (CAPS) in Malawi: Implementation of Remote Source Data Verification

    PubMed Central

    Weston, William; Smedley, James; Bennett, Andrew; Mortimer, Kevin

    2016-01-01

    Background Source data verification (SDV) is a data monitoring procedure which compares the original records with the Case Report Form (CRF). Traditionally, on-site SDV relies on monitors making multiple visits to study sites, requiring extensive resources. The Cooking And Pneumonia Study (CAPS) is a 24-month village-level cluster randomized controlled trial assessing the effectiveness of an advanced cook-stove intervention in preventing pneumonia in children under five in rural Malawi (www.capstudy.org). CAPS used smartphones to capture digital images of the original records on an electronic CRF (eCRF). In the present study, descriptive statistics are used to report the experience of electronic data capture with remote SDV in a challenging research setting in rural Malawi. Methods At three-monthly intervals, fieldworkers, who were employed by CAPS, captured pneumonia data from the original records onto the eCRF. Fieldworkers also captured digital images of the original records. Once Internet connectivity was available, the data captured on the eCRF and the digital images of the original records were uploaded to a web-based SDV application. This enabled SDV to be conducted remotely from the UK. We conducted SDV of the pneumonia data (occurrence, severity, and clinical indicators) recorded in the eCRF with the data in the digital images of the original records. Results 664 episodes of pneumonia were recorded after 6 months of follow-up. Of these 664 episodes, 611 (92%) had a finding of pneumonia in the original records. All digital images of the original records were clear and legible. Conclusion Electronic data capture using eCRFs on mobile technology is feasible in rural Malawi. Capturing digital images of the original records in the field allows remote SDV to be conducted efficiently and securely without requiring additional field visits. We recommend these approaches in similar settings, especially those with health endpoints. PMID:27355447
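
    Remote SDV ultimately reduces to a field-by-field comparison of eCRF entries against what a verifier reads off the imaged source records. A minimal sketch with hypothetical field names; the study's 92% figure is the analogous agreement rate over 664 episodes.

```python
# Compare eCRF entries against values transcribed from source-document images.
def verify_record(ecrf: dict, source: dict, fields=("pneumonia", "severity")) -> bool:
    """Return True when every monitored field on the eCRF matches the source."""
    return all(ecrf.get(f) == source.get(f) for f in fields)

ecrf_entries = [{"pneumonia": True, "severity": "severe"},
                {"pneumonia": True, "severity": "mild"}]
source_entries = [{"pneumonia": True, "severity": "severe"},
                  {"pneumonia": False, "severity": "mild"}]

verified = sum(verify_record(e, s) for e, s in zip(ecrf_entries, source_entries))
print(f"verified {verified}/{len(ecrf_entries)} records "
      f"({100 * verified / len(ecrf_entries):.0f}% agreement)")
```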

  2. The Cooking and Pneumonia Study (CAPS) in Malawi: Implementation of Remote Source Data Verification.

    PubMed

    Weston, William; Smedley, James; Bennett, Andrew; Mortimer, Kevin

    2016-01-01

    Source data verification (SDV) is a data monitoring procedure which compares the original records with the Case Report Form (CRF). Traditionally, on-site SDV relies on monitors making multiple visits to study sites, requiring extensive resources. The Cooking And Pneumonia Study (CAPS) is a 24-month village-level cluster randomized controlled trial assessing the effectiveness of an advanced cook-stove intervention in preventing pneumonia in children under five in rural Malawi (www.capstudy.org). CAPS used smartphones to capture digital images of the original records on an electronic CRF (eCRF). In the present study, descriptive statistics are used to report the experience of electronic data capture with remote SDV in a challenging research setting in rural Malawi. At three-monthly intervals, fieldworkers, who were employed by CAPS, captured pneumonia data from the original records onto the eCRF. Fieldworkers also captured digital images of the original records. Once Internet connectivity was available, the data captured on the eCRF and the digital images of the original records were uploaded to a web-based SDV application. This enabled SDV to be conducted remotely from the UK. We conducted SDV of the pneumonia data (occurrence, severity, and clinical indicators) recorded in the eCRF with the data in the digital images of the original records. 664 episodes of pneumonia were recorded after 6 months of follow-up. Of these 664 episodes, 611 (92%) had a finding of pneumonia in the original records. All digital images of the original records were clear and legible. Electronic data capture using eCRFs on mobile technology is feasible in rural Malawi. Capturing digital images of the original records in the field allows remote SDV to be conducted efficiently and securely without requiring additional field visits. We recommend these approaches in similar settings, especially those with health endpoints.

  3. 24 CFR 5.512 - Verification of eligible immigration status.

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... (b) of this section and § 5.514, no individual or family applying for assistance may receive such assistance prior to the verification of the eligibility of at least the individual or one family member. Verification of eligibility consistent with § 5.514 occurs when the individual or family members have submitted...

  4. 24 CFR 5.512 - Verification of eligible immigration status.

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... (b) of this section and § 5.514, no individual or family applying for assistance may receive such assistance prior to the verification of the eligibility of at least the individual or one family member. Verification of eligibility consistent with § 5.514 occurs when the individual or family members have submitted...

  5. 24 CFR 5.512 - Verification of eligible immigration status.

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ... (b) of this section and § 5.514, no individual or family applying for assistance may receive such assistance prior to the verification of the eligibility of at least the individual or one family member. Verification of eligibility consistent with § 5.514 occurs when the individual or family members have submitted...

  6. Gate-Level Commercial Microelectronics Verification with Standard Cell Recognition

    DTIC Science & Technology

    2015-03-26

    Table-of-contents fragments recovered from the report: 2.2.1.4 Algorithm Insufficiencies as Applied to DARPA's Circuit Verification Efforts; 4.2 Discussion of SCR Algorithm and Code; 4.2.1 Explication of SCR Algorithm; 4.2.2 Algorithm Attributes; 4.3 Advantages of Transistor-level Verification with SCR.

  7. The Challenge for Arms Control Verification in the Post-New START World

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wuest, C R

    Nuclear weapon arms control treaty verification is a key aspect of any agreement between signatories to establish that the terms and conditions spelled out in the treaty are being met. Historically, arms control negotiations have focused more on the rules and protocols for reducing the numbers of warheads and delivery systems - sometimes resorting to complex and arcane procedures for counting forces - in an attempt to address perceived or real imbalances in a nation's strategic posture that could lead to instability. Verification procedures are generally defined in arms control treaties and supporting documents and tend to focus on technical means and measures designed to ensure that a country is following the terms of the treaty and that it is not liable to engage in deception or outright cheating in an attempt to circumvent the spirit and the letter of the agreement. As the Obama Administration implements the articles, terms, and conditions of the recently ratified and entered-into-force New START treaty, there are already efforts within and outside of government to move well below the specified New START levels of 1550 warheads, 700 deployed strategic delivery vehicles, and 800 deployed and nondeployed strategic launchers (Inter-Continental Ballistic Missile (ICBM) silos, Submarine-Launched Ballistic Missile (SLBM) tubes on submarines, and bombers). A number of articles and opinion pieces have appeared that advocate for significantly deeper cuts in the U.S. nuclear stockpile, with some suggesting that unilateral reductions on the part of the U.S. would help coax Russia and others to follow our lead. Papers and studies prepared for the U.S. Department of Defense and at the U.S. Air War College have also been published, suggesting that nuclear forces totaling no more than about 300 warheads would be sufficient to meet U.S. national security and deterrence needs. (Davis 2011, Schaub and Forsyth 2010) Recent articles by James M. Acton and others suggest that

  8. International Cooperative for Aerosol Prediction Workshop on Aerosol Forecast Verification

    NASA Technical Reports Server (NTRS)

    Benedetti, Angela; Reid, Jeffrey S.; Colarco, Peter R.

    2011-01-01

    The purpose of this workshop was to reinforce the working partnership between centers who are actively involved in global aerosol forecasting, and to discuss issues related to forecast verification. Participants included representatives from operational centers with global aerosol forecasting requirements, a panel of experts on Numerical Weather Prediction and Air Quality forecast verification, data providers, and several observers from the research community. The presentations centered on a review of current NWP and AQ practices with subsequent discussion focused on the challenges in defining appropriate verification measures for the next generation of aerosol forecast systems.

  9. 47 CFR 64.1120 - Verification of orders for telecommunications service.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... 47 Telecommunication 3 2013-10-01 2013-10-01 false Verification of orders for telecommunications service. 64.1120 Section 64.1120 Telecommunication FEDERAL COMMUNICATIONS COMMISSION (CONTINUED) COMMON... Telecommunications Service Providers § 64.1120 Verification of orders for telecommunications service. (a) No...

  10. 47 CFR 64.1120 - Verification of orders for telecommunications service.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... 47 Telecommunication 3 2012-10-01 2012-10-01 false Verification of orders for telecommunications service. 64.1120 Section 64.1120 Telecommunication FEDERAL COMMUNICATIONS COMMISSION (CONTINUED) COMMON... Telecommunications Service Providers § 64.1120 Verification of orders for telecommunications service. (a) No...

  11. 47 CFR 64.1120 - Verification of orders for telecommunications service.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... 47 Telecommunication 3 2014-10-01 2014-10-01 false Verification of orders for telecommunications service. 64.1120 Section 64.1120 Telecommunication FEDERAL COMMUNICATIONS COMMISSION (CONTINUED) COMMON... Telecommunications Service Providers § 64.1120 Verification of orders for telecommunications service. (a) No...

  12. 76 FR 23861 - Documents Acceptable for Employment Eligibility Verification; Correction

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-04-29

    ... Documents Acceptable for Employment Eligibility Verification; Correction AGENCY: U.S. Citizenship and... titled Documents Acceptable for Employment Eligibility Verification published in the Federal Register on... a final rule in the Federal Register at 76 FR 21225 establishing Documents Acceptable for Employment...

  13. 47 CFR 64.1120 - Verification of orders for telecommunications service.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... 47 Telecommunication 3 2011-10-01 2011-10-01 false Verification of orders for telecommunications service. 64.1120 Section 64.1120 Telecommunication FEDERAL COMMUNICATIONS COMMISSION (CONTINUED) COMMON... Telecommunications Service Providers § 64.1120 Verification of orders for telecommunications service. (a) No...

  14. Precision segmented reflector, figure verification sensor

    NASA Technical Reports Server (NTRS)

    Manhart, Paul K.; Macenka, Steve A.

    1989-01-01

    The Precision Segmented Reflector (PSR) program currently under way at the Jet Propulsion Laboratory is a test bed and technology demonstration program designed to develop and study the structural and material technologies required for lightweight, precision segmented reflectors. A Figure Verification Sensor (FVS), designed to monitor the active control system of the segments, is described; a best-fit surface is defined; and the image or wavefront quality of the assembled array of reflecting panels is assessed.

  15. Monte Carlo simulations to replace film dosimetry in IMRT verification.

    PubMed

    Goetzfried, Thomas; Rickhey, Mark; Treutwein, Marius; Koelbl, Oliver; Bogner, Ludwig

    2011-01-01

    Patient-specific verification of intensity-modulated radiation therapy (IMRT) plans can be done by dosimetric measurements or by independent dose or monitor unit calculations. The aim of this study was the clinical evaluation of IMRT verification based on a fast Monte Carlo (MC) program with regard to possible benefits compared to commonly used film dosimetry. 25 head-and-neck IMRT plans were recalculated by a pencil-beam-based treatment planning system (TPS) using an appropriate quality assurance (QA) phantom. All plans were verified both by film and diode dosimetry and compared to MC simulations. The irradiated films, the results of diode measurements, and the computed dose distributions were evaluated, and the data were compared on the basis of gamma maps and dose-difference histograms. Average deviations in the high-dose region between diode measurements and point dose calculations performed with the TPS and MC program were 0.7 ± 2.7% and 1.2 ± 3.1%, respectively. For film measurements, the mean gamma values with 3% dose difference and 3 mm distance-to-agreement were 0.74 ± 0.28 (TPS as reference), with dose deviations up to 10%. Corresponding values were significantly reduced to 0.34 ± 0.09 for MC dose calculation. The total time needed for the two verification procedures is comparable; however, MC simulations are far less labor intensive. The presented study showed that independent dose calculation verification of IMRT plans with a fast MC program has the potential to progressively replace film dosimetry in the near future. Thus, the linac-specific QA part will necessarily become more important. In combination with MC simulations, and owing to the simple set-up, point-dose measurements for dosimetric plausibility checks are recommended, at least in the IMRT introduction phase. Copyright © 2010. Published by Elsevier GmbH.
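
    The gamma evaluation used above combines a dose-difference criterion with a distance-to-agreement criterion. A minimal 1-D sketch of the 3%/3 mm metric on synthetic profiles; real film/MC comparisons are 2-D or 3-D, so this only illustrates the formula.

```python
# 1-D gamma index with global dose normalization (3%/3 mm criterion).
import numpy as np

def gamma_1d(ref_dose, eval_dose, positions, dose_crit=0.03, dta_mm=3.0):
    """Gamma value at each reference point."""
    dd = dose_crit * ref_dose.max()                   # absolute dose-difference criterion
    gamma = np.empty_like(ref_dose)
    for i, (x_r, d_r) in enumerate(zip(positions, ref_dose)):
        dist2 = ((positions - x_r) / dta_mm) ** 2
        dose2 = ((eval_dose - d_r) / dd) ** 2
        gamma[i] = np.sqrt(np.min(dist2 + dose2))
    return gamma

x = np.arange(0.0, 100.0, 1.0)                        # positions in mm
reference = np.exp(-((x - 50.0) / 20.0) ** 2)         # synthetic dose profile
evaluated = 1.02 * np.exp(-((x - 51.0) / 20.0) ** 2)  # 2% scaled, 1 mm shifted
g = gamma_1d(reference, evaluated, x)
print(f"gamma pass rate (gamma <= 1): {100 * np.mean(g <= 1):.1f}%")
```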

  16. Generic interpreters and microprocessor verification

    NASA Technical Reports Server (NTRS)

    Windley, Phillip J.

    1990-01-01

    The following topics are covered in viewgraph form: (1) generic interpreters; (2) Viper microprocessors; (3) microprocessor verification; (4) determining correctness; (5) hierarchical decomposition; (6) interpreter theory; (7) AVM-1; (8) phase-level specification; and future work.

  17. Linear models to perform treaty verification tasks for enhanced information security

    DOE PAGES

    MacGahan, Christopher J.; Kupinski, Matthew A.; Brubaker, Erik M.; ...

    2016-11-12

    Linear mathematical models were applied to binary-discrimination tasks relevant to arms control verification measurements in which a host party wishes to convince a monitoring party that an item is or is not treaty accountable. These models process data in list-mode format and can compensate for the presence of variability in the source, such as uncertain object orientation and location. The Hotelling observer applies an optimal set of weights to binned detector data, yielding a test statistic that is thresholded to make a decision. The channelized Hotelling observer applies a channelizing matrix to the vectorized data, resulting in a lower-dimensional vector available to the monitor to make decisions. We demonstrate how incorporating additional terms in this channelizing-matrix optimization offers benefits for treaty verification. We present two methods to increase shared information and trust between the host and monitor. The first method penalizes individual channel performance in order to maximize the information available to the monitor while maintaining optimal performance. Second, we present a method that penalizes predefined sensitive information while maintaining the capability to discriminate between binary choices. Data used in this study were generated using Monte Carlo simulations for fission neutrons, accomplished with the GEANT4 toolkit. Custom models for plutonium inspection objects were measured in simulation by a radiation imaging system. Model performance was evaluated and presented using the area under the receiver operating characteristic curve.
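
    A minimal numerical sketch of the Hotelling observer described above: optimal linear weights are formed from the class means and a pooled covariance, applied to binned detector data, and scored with the empirical area under the ROC curve. All dimensions and statistics below are illustrative, not the paper's simulated data.

```python
# Hotelling observer on synthetic binned detector data.
import numpy as np

rng = np.random.default_rng(1)
n_bins, n_train = 16, 500
signal_mean = np.full(n_bins, 10.0)
signal_mean[4:8] += 2.0                                        # hypothetical excess counts
background_mean = np.full(n_bins, 10.0)

g1 = rng.normal(signal_mean, 2.0, size=(n_train, n_bins))      # "treaty accountable"
g0 = rng.normal(background_mean, 2.0, size=(n_train, n_bins))  # "not accountable"

# Hotelling template: w = K^{-1} (mean difference), with pooled covariance K
K = 0.5 * (np.cov(g1, rowvar=False) + np.cov(g0, rowvar=False))
w = np.linalg.solve(K, g1.mean(0) - g0.mean(0))

t1, t0 = g1 @ w, g0 @ w   # test statistics; thresholding makes the decision
# Empirical AUC: probability a signal-present statistic exceeds a signal-absent one
auc = np.mean(t1[:, None] > t0[None, :])
print(f"empirical AUC = {auc:.3f}")
```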

  18. Linear models to perform treaty verification tasks for enhanced information security

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    MacGahan, Christopher J.; Kupinski, Matthew A.; Brubaker, Erik M.

    Linear mathematical models were applied to binary-discrimination tasks relevant to arms control verification measurements in which a host party wishes to convince a monitoring party that an item is or is not treaty accountable. These models process data in list-mode format and can compensate for the presence of variability in the source, such as uncertain object orientation and location. The Hotelling observer applies an optimal set of weights to binned detector data, yielding a test statistic that is thresholded to make a decision. The channelized Hotelling observer applies a channelizing matrix to the vectorized data, resulting in a lower-dimensional vector available to the monitor to make decisions. We demonstrate how incorporating additional terms in this channelizing-matrix optimization offers benefits for treaty verification. We present two methods to increase shared information and trust between the host and monitor. The first method penalizes individual channel performance in order to maximize the information available to the monitor while maintaining optimal performance. Second, we present a method that penalizes predefined sensitive information while maintaining the capability to discriminate between binary choices. Data used in this study were generated using Monte Carlo simulations for fission neutrons, accomplished with the GEANT4 toolkit. Custom models for plutonium inspection objects were measured in simulation by a radiation imaging system. Model performance was evaluated and presented using the area under the receiver operating characteristic curve.

  19. Spot scanning proton therapy plan assessment: design and development of a dose verification application for use in routine clinical practice

    NASA Astrophysics Data System (ADS)

    Augustine, Kurt E.; Walsh, Timothy J.; Beltran, Chris J.; Stoker, Joshua B.; Mundy, Daniel W.; Parry, Mark D.; Bues, Martin; Fatyga, Mirek

    2016-04-01

    The use of radiation therapy for the treatment of cancer has been carried out clinically since the late 1800s. Early on, however, it was discovered that a radiation dose sufficient to destroy cancer cells can also cause severe injury to surrounding healthy tissue. Radiation oncologists continually strive to find the perfect balance between a dose high enough to destroy the cancer and one that avoids damage to healthy organs. Spot scanning or "pencil beam" proton radiotherapy offers another option to improve on this. Unlike traditional photon therapy, proton beams stop in the target tissue, thus better sparing all organs beyond the targeted tumor. In addition, the beams are far narrower and thus can be more precisely "painted" onto the tumor, avoiding exposure to surrounding healthy tissue. To safely treat patients with proton beam radiotherapy, dose verification should be carried out for each plan prior to treatment. Proton dose verification systems are not currently commercially available, so the Department of Radiation Oncology at the Mayo Clinic developed its own, called DOSeCHECK, which offers two distinct dose simulation methods: GPU-based Monte Carlo and CPU-based analytical. The three major components of the system include the web-based user interface, the Linux-based dose verification simulation engines, and the supporting services and components. The architecture integrates multiple applications, libraries, platforms, programming languages, and communication protocols, and was successfully deployed in time for Mayo Clinic's first proton beam therapy patient. Having a simple, efficient application for dose verification greatly reduces staff workload and provides additional quality assurance, ultimately improving patient safety.

  20. Experimental preparation and verification of quantum money

    NASA Astrophysics Data System (ADS)

    Guan, Jian-Yu; Arrazola, Juan Miguel; Amiri, Ryan; Zhang, Weijun; Li, Hao; You, Lixing; Wang, Zhen; Zhang, Qiang; Pan, Jian-Wei

    2018-03-01

    A quantum money scheme enables a trusted bank to provide untrusted users with verifiable quantum banknotes that cannot be forged. In this work, we report a proof-of-principle experimental demonstration of the preparation and verification of unforgeable quantum banknotes. We employ a security analysis that takes experimental imperfections fully into account. We measure a total of 3.6 × 10^6 states in one verification round, limiting the forging probability to 10^-7 based on the security analysis. Our results demonstrate the feasibility of preparing and verifying quantum banknotes using currently available experimental techniques.

  1. Standardized verification of fuel cycle modeling

    DOE PAGES

    Feng, B.; Dixon, B.; Sunny, E.; ...

    2016-04-05

    A nuclear fuel cycle systems modeling and code-to-code comparison effort was coordinated across multiple national laboratories to verify the tools needed to perform fuel cycle analyses of the transition from a once-through nuclear fuel cycle to a sustainable potential future fuel cycle. For this verification study, a simplified example transition scenario was developed to serve as a test case for the four systems codes involved (DYMOND, VISION, ORION, and MARKAL), each used by a different laboratory participant. In addition, all participants produced spreadsheet solutions for the test case to check all the mass flows and reactor/facility profiles on a year-by-year basis throughout the simulation period. The test case specifications describe a transition from the current US fleet of light water reactors to a future fleet of sodium-cooled fast reactors that continuously recycle transuranic elements as fuel. After several initial coordinated modeling and calculation attempts, it was revealed that most of the differences in code results were not due to different code algorithms or calculation approaches, but due to different interpretations of the input specifications among the analysts. Therefore, the specifications for the test case itself were iteratively updated to remove ambiguity and to help calibrate interpretations. In addition, a few corrections and modifications were made to the codes as well, which led to excellent agreement between all codes and spreadsheets for this test case. Although no fuel cycle transition analysis codes matched the spreadsheet results exactly, all remaining differences in the results were due to fundamental differences in code structure and/or were thoroughly explained. As a result, the specifications and example results are provided so that they can be used to verify additional codes in the future for such fuel cycle transition scenarios.
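
    The year-by-year checking described above is essentially a tolerance comparison of each code's mass-flow time series against the spreadsheet solution. A hedged sketch with stand-in numbers; the code names come from the abstract, but the data and tolerance are invented.

```python
# Flag years where a code's mass flow deviates from the spreadsheet solution.
import numpy as np

years = np.arange(2015, 2025)
spreadsheet = np.linspace(100.0, 55.0, years.size)   # hypothetical LWR fuel flow, t/yr
code_results = {
    "DYMOND": spreadsheet * (1 + 0.002 * np.sin(years)),   # stand-in code outputs
    "VISION": spreadsheet + 0.1,
}

for code, series in code_results.items():
    rel_dev = np.abs(series - spreadsheet) / spreadsheet
    worst_year = years[np.argmax(rel_dev)]
    status = "OK" if rel_dev.max() < 0.01 else "INVESTIGATE"
    print(f"{code}: max deviation {100 * rel_dev.max():.2f}% in {worst_year} -> {status}")
```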

  2. 78 FR 56268 - Pipeline Safety: Public Workshop on Integrity Verification Process, Comment Extension

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-09-12

    .... PHMSA-2013-0119] Pipeline Safety: Public Workshop on Integrity Verification Process, Comment Extension... public workshop on ``Integrity Verification Process'' which took place on August 7, 2013. The notice also sought comments on the proposed ``Integrity Verification Process.'' In response to the comments received...

  3. Verification and validation of RADMODL Version 1.0

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kimball, K.D.

    1993-03-01

    RADMODL is a system of linked computer codes designed to calculate the radiation environment following an accident in which nuclear materials are released. The RADMODL code and the corresponding Verification and Validation (V&V) calculations (Appendix A) were developed for Westinghouse Savannah River Company (WSRC) by EGS Corporation (EGS). Each module of RADMODL is an independent code and was verified separately. The full system was validated by comparing the output of the various modules with the corresponding output of a previously verified version of the modules. The results of the verification and validation tests show that RADMODL correctly calculates the transport of radionuclides and radiation doses. As a result of this verification and validation effort, RADMODL Version 1.0 is certified for use in calculating the radiation environment following an accident.

  4. Static test induced loads verification beyond elastic limit

    NASA Technical Reports Server (NTRS)

    Verderaime, V.; Harrington, F.

    1996-01-01

    Increasing demands for reliable and least-cost high-performance aerostructures are pressing design analyses, materials, and manufacturing processes to new and narrowly experienced performance and verification technologies. This study assessed the adequacy of current experimental verification of the traditional binding ultimate safety factor which covers rare events in which no statistical design data exist. Because large high-performance structures are inherently very flexible, boundary rotations and deflections under externally applied loads approaching fracture may distort their transmission and unknowingly accept submarginal structures or prematurely fracturing reliable ones. A technique was developed, using measured strains from back-to-back surface mounted gauges, to analyze, define, and monitor induced moments and plane forces through progressive material changes from total-elastic to total-inelastic zones within the structural element cross section. Deviations from specified test loads are identified by the consecutively changing ratios of moment-to-axial load.
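
    In the elastic range, the back-to-back gauge technique separates the surface strains into membrane and bending parts and converts them to an axial force and an induced moment whose ratio can be monitored during loading. A minimal sketch with assumed section properties (rectangular section, aluminum-like modulus); the paper's extension through inelastic zones requires a material model beyond this.

```python
# Decompose back-to-back surface strains into axial force and bending moment.
E = 71e9            # Pa, elastic modulus (assumed)
b, h = 0.10, 0.02   # m, section width and thickness (assumed)
A, I, c = b * h, b * h**3 / 12, h / 2

def loads_from_strains(eps_front: float, eps_back: float) -> tuple[float, float]:
    eps_membrane = 0.5 * (eps_front + eps_back)   # uniform (axial) component
    eps_bending = 0.5 * (eps_front - eps_back)    # linear (bending) component
    N = E * A * eps_membrane                      # axial force, N
    M = E * I * eps_bending / c                   # induced moment, N*m
    return N, M

for ef, eb in [(900e-6, 700e-6), (1400e-6, 800e-6)]:   # illustrative gauge readings
    N, M = loads_from_strains(ef, eb)
    print(f"N = {N/1e3:6.1f} kN, M = {M:6.1f} N*m, M/N = {1000 * M / N:.2f} mm")
```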

  5. Static test induced loads verification beyond elastic limit

    NASA Technical Reports Server (NTRS)

    Verderaime, V.; Harrington, F.

    1996-01-01

    Increasing demands for reliable and least-cost high-performance aerostructures are pressing design analyses, materials, and manufacturing processes to new and narrowly experienced performance and verification technologies. This study assessed the adequacy of current experimental verification of the traditional binding ultimate safety factor which covers rare events in which no statistical design data exist. Because large, high-performance structures are inherently very flexible, boundary rotations and deflections under externally applied loads approaching fracture may distort their transmission and unknowingly accept submarginal structures or prematurely fracturing reliable ones. A technique was developed, using measured strains from back-to-back surface-mounted gauges, to analyze, define, and monitor induced moments and plane forces through progressive material changes from total-elastic to total-inelastic zones within the structural element cross section. Deviations from specified test loads are identified by the consecutively changing ratios of moment-to-axial load.

  6. Fostering group identification and creativity in diverse groups: the role of individuation and self-verification.

    PubMed

    Swann, William B; Kwan, Virginia S Y; Polzer, Jeffrey T; Milton, Laurie P

    2003-11-01

    A longitudinal study examined the interplay of identity negotiation processes and diversity in small groups of master's of business administration (MBA) students. When perceivers formed relatively positive impressions of other group members, higher diversity predicted more individuation of targets. When perceivers formed relatively neutral impressions of other group members, however, higher diversity predicted less individuation of targets. Individuation at the outset of the semester predicted self-verification effects several weeks later, and self-verification, in turn, predicted group identification and creative task performance. The authors conclude that contrary to self-categorization theory, fostering individuation and self-verification in diverse groups may maximize group identification and productivity.

  7. CMOS VLSI Layout and Verification of a SIMD Computer

    NASA Technical Reports Server (NTRS)

    Zheng, Jianqing

    1996-01-01

    A CMOS VLSI layout and verification of a 3 x 3 processor parallel computer has been completed. The layout was done using the MAGIC tool and the verification using HSPICE. Suggestions for expanding the computer into a million processor network are presented. Many problems that might be encountered when implementing a massively parallel computer are discussed.

  8. Verification Challenges at Low Numbers

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Benz, Jacob M.; Booker, Paul M.; McDonald, Benjamin S.

    2013-06-01

    Many papers have dealt with the political difficulties and ramifications of deep nuclear arms reductions, and the issues of “Going to Zero”. Political issues include extended deterrence, conventional weapons, ballistic missile defense, and regional and geo-political security issues. At each step on the road to low numbers, the verification required to ensure compliance of all parties will increase significantly. Looking past New START, the next step will likely include warhead limits in the neighborhood of 1000. Further reductions will include stepping stones at 1000 warheads, 100's of warheads, and then 10's of warheads before final elimination of the last few remaining warheads and weapons could be considered. This paper will focus on these three threshold reduction levels: 1000, 100's, and 10's. For each, the issues and challenges will be discussed, potential solutions will be identified, and the verification technologies and chain of custody measures that address these solutions will be surveyed. It is important to note that many of the issues that need to be addressed have no current solution. In these cases, the paper will explore new or novel technologies that could be applied. These technologies will draw from the research and development that is ongoing throughout the national laboratory complex, and will look at technologies utilized in other areas of industry for their application to arms control verification.

  9. Implementation of Precision Verification Solvents on the External Tank

    NASA Technical Reports Server (NTRS)

    Campbell, M.

    1998-01-01

    This paper presents the Implementation of Precision Verification Solvents on the External Tank. The topics include: 1) Background; 2) Solvent Usages; 3) TCE (Trichloroethylene) Reduction; 4) Solvent Replacement Studies; 5) Implementation; 6) Problems Occurring During Implementation; and 7) Future Work. This paper is presented in viewgraph form.

  10. Formal Methods for Verification and Validation of Partial Specifications: A Case Study

    NASA Technical Reports Server (NTRS)

    Easterbrook, Steve; Callahan, John

    1997-01-01

    This paper describes our work exploring the suitability of formal specification methods for independent verification and validation (IV&V) of software specifications for large, safety critical systems. An IV&V contractor often has to perform rapid analysis on incomplete specifications, with no control over how those specifications are represented. Lightweight formal methods show significant promise in this context, as they offer a way of uncovering major errors, without the burden of full proofs of correctness. We describe a case study of the use of partial formal models for V&V of the requirements for Fault Detection Isolation and Recovery on the space station. We conclude that the insights gained from formalizing a specification are valuable, and it is the process of formalization, rather than the end product that is important. It was only necessary to build enough of the formal model to test the properties in which we were interested. Maintenance of fidelity between multiple representations of the same requirements (as they evolve) is still a problem, and deserves further study.

  11. Verification Assessment of Flow Boundary Conditions for CFD Analysis of Supersonic Inlet Flows

    NASA Technical Reports Server (NTRS)

    Slater, John W.

    2002-01-01

    Boundary conditions for subsonic inflow, bleed, and subsonic outflow as implemented into the WIND CFD code are assessed with respect to verification for steady and unsteady flows associated with supersonic inlets. Verification procedures include grid convergence studies and comparisons to analytical data. The objective is to examine errors, limitations, capabilities, and behavior of the boundary conditions. Computational studies were performed on configurations derived from a "parameterized" supersonic inlet. These include steady supersonic flows with normal and oblique shocks, steady subsonic flow in a diffuser, and unsteady flow with the propagation and reflection of an acoustic disturbance.

  12. Verification of floating-point software

    NASA Technical Reports Server (NTRS)

    Hoover, Doug N.

    1990-01-01

    Floating point computation presents a number of problems for formal verification. Should one treat the actual details of floating point operations, accept them as imprecisely defined, or ignore round-off error altogether and behave as if floating point operations are perfectly accurate? There is the further problem that a numerical algorithm usually only approximately computes some mathematical function, and we often do not know just how good the approximation is, even in the absence of round-off error. ORA has developed a theory of asymptotic correctness which allows one to verify floating point software with minimal entanglement in these problems. This theory and its implementation in the Ariel C verification system are described. The theory is illustrated using a simple program which finds a zero of a given function by bisection. This paper is presented in viewgraph form.
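
    The bisection example mentioned in the abstract is easy to sketch. Under asymptotic correctness, one would verify that, ignoring round-off, the returned midpoint approaches a true zero as the tolerance shrinks; the version below is a minimal stand-in, not ORA's verified program.

```python
# Bisection zero-finder: halve an interval known to bracket a sign change.
def bisect(f, lo: float, hi: float, tol: float = 1e-12) -> float:
    """Find a zero of f in [lo, hi], assuming f(lo) and f(hi) differ in sign."""
    if f(lo) * f(hi) > 0:
        raise ValueError("f must change sign on [lo, hi]")
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        if f(lo) * f(mid) <= 0:   # sign change stays in the left half
            hi = mid
        else:
            lo = mid
    return 0.5 * (lo + hi)

print(bisect(lambda x: x * x - 2.0, 0.0, 2.0))  # ~1.414213..., i.e. sqrt(2)
```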

  13. Annual verifications--a tick-box exercise?

    PubMed

    Walker, Gwen; Williams, David

    2014-09-01

    With the onus on healthcare providers and their staff to protect patients against all elements of 'avoidable harm' perhaps never greater, Gwen Walker, a highly experienced infection prevention control nurse specialist, and David Williams, MD of Approved Air, who has 30 years' experience in validation and verification of ventilation and ultraclean ventilation systems, examine changing requirements for, and trends in, operating theatre ventilation. Validation and verification reporting on such vital HVAC equipment should not, they argue, merely be viewed as a 'tick-box exercise'; it should instead 'comprehensively inform key stakeholders, and ultimately form part of clinical governance, thus protecting those ultimately named responsible for organisation-wide safety at Trust board level'.

  14. INF and IAEA: A comparative analysis of verification strategy. [Final report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Scheinman, L.; Kratzer, M.

    1992-07-01

    This is the final report of a study on the relevance and possible lessons of Intermediate Range Nuclear Force (INF) verification to the International Atomic Energy Agency (IAEA) international safeguards activities.

  15. 78 FR 52085 - VA Veteran-Owned Small Business Verification Guidelines

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-08-22

    ... DEPARTMENT OF VETERANS AFFAIRS 38 CFR Part 74 RIN 2900-AO49 VA Veteran-Owned Small Business Verification Guidelines AGENCY: Department of Veterans Affairs. ACTION: Final rule. SUMMARY: This document... Domestic Assistance This final rule affects the verification guidelines of veteran- owned small businesses...

  16. Time-space modal logic for verification of bit-slice circuits

    NASA Astrophysics Data System (ADS)

    Hiraishi, Hiromi

    1996-03-01

    The major goal of this paper is to propose a new modal logic aimed at the formal verification of bit-slice circuits. The new logic is called time-space modal logic, and its major feature is that it can handle two transition relations: one for time transitions and the other for space transitions. As a verification algorithm, a symbolic model checking algorithm for the new logic is shown. This could be applicable to the verification of bit-slice microprocessors of infinite bit width and 1D systolic arrays of infinite length. A simple benchmark result shows the effectiveness of the proposed approach.

  17. High-Resolution Fast-Neutron Spectrometry for Arms Control and Treaty Verification

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    David L. Chichester; James T. Johnson; Edward H. Seabury

    2012-07-01

    Many nondestructive nuclear analysis techniques have been developed to support the measurement needs of arms control and treaty verification, including gross photon and neutron counting, low- and high-resolution gamma spectrometry, time-correlated neutron measurements, and photon and neutron imaging. One notable measurement technique that has not been extensively studied to date for these applications is high-resolution fast-neutron spectrometry (HRFNS). Applied to arms control and treaty verification, HRFNS has the potential to serve as a complementary measurement approach to these other techniques by providing a means to either qualitatively or quantitatively determine the composition and thickness of non-nuclear materials surrounding neutron-emitting materials. The technique uses the normally-occurring neutrons present in arms control and treaty verification objects of interest as an internal source of neutrons for performing active-interrogation transmission measurements. Most low-Z nuclei of interest for arms control and treaty verification, including 9Be, 12C, 14N, and 16O, possess fast-neutron resonance features in their absorption cross sections in the 0.5- to 5-MeV energy range. Measurement of the selective removal of source neutrons over this energy range, assuming for example a fission-spectrum starting distribution, may be used to estimate the stoichiometric composition of intervening materials between the neutron source and detector. At a simpler level, determination of the emitted fast-neutron spectrum may be used for fingerprinting 'known' assemblies for later use in template-matching tests. As with photon spectrometry, automated analysis of fast-neutron spectra may be performed to support decision making and reporting systems protected behind information barriers. This paper will report recent work at Idaho National Laboratory to explore the feasibility of using HRFNS for arms control and treaty verification applications, including simulations
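
    The transmission idea can be illustrated with Beer-Lambert attenuation, I(E) = I0(E) * exp(-n * sigma(E) * t): dips at resonance energies in the transmitted spectrum encode the areal density of intervening low-Z material. A hedged sketch with an invented resonance shape, not evaluated nuclear data.

```python
# Resonance transmission through a slab, with an invented cross-section.
import numpy as np

E = np.linspace(0.5, 5.0, 500)                        # neutron energy, MeV
sigma = 1.5 + 2.0 * np.exp(-((E - 2.0) / 0.1) ** 2)   # barns; fake resonance at 2 MeV
n_t = 0.08 * 2.0                                      # atoms/(barn*cm) x cm, illustrative

source = np.exp(-E) * np.sinh(np.sqrt(2.0 * E))       # fission-like (Watt) spectrum shape
transmitted = source * np.exp(-n_t * sigma)

baseline = source * np.exp(-n_t * 1.5)                # attenuation by the smooth part alone
i = np.argmin(np.abs(E - 2.0))
print(f"extra attenuation at the 2 MeV resonance: "
      f"{100 * (1 - transmitted[i] / baseline[i]):.1f}%")
```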

  18. Signal verification can promote reliable signalling.

    PubMed

    Broom, Mark; Ruxton, Graeme D; Schaefer, H Martin

    2013-11-22

    The central question in communication theory is whether communication is reliable, and if so, which mechanisms select for reliability. The primary approach in the past has been to attribute reliability to strategic costs associated with signalling as predicted by the handicap principle. Yet, reliability can arise through other mechanisms, such as signal verification; but the theoretical understanding of such mechanisms has received relatively little attention. Here, we model whether verification can lead to reliability in repeated interactions that typically characterize mutualisms. Specifically, we model whether fruit consumers that discriminate among poor- and good-quality fruits within a population can select for reliable fruit signals. In our model, plants either signal or they do not; costs associated with signalling are fixed and independent of plant quality. We find parameter combinations where discriminating fruit consumers can select for signal reliability by abandoning unprofitable plants more quickly. This self-serving behaviour imposes costs upon plants as a by-product, rendering it unprofitable for unrewarding plants to signal. Thus, strategic costs to signalling are not a prerequisite for reliable communication. We expect verification to more generally explain signal reliability in repeated consumer-resource interactions that typify mutualisms but also in antagonistic interactions such as mimicry and aposematism.

  19. Survey of Verification and Validation Techniques for Small Satellite Software Development

    NASA Technical Reports Server (NTRS)

    Jacklin, Stephen A.

    2015-01-01

    The purpose of this paper is to provide an overview of the current trends and practices in small-satellite software verification and validation. This document is not intended to promote a specific software assurance method. Rather, it seeks to present an unbiased survey of software assurance methods used to verify and validate small satellite software and to make mention of the benefits and value of each approach. These methods include simulation and testing, verification and validation with model-based design, formal methods, and fault-tolerant software design with run-time monitoring. Although the literature reveals that simulation and testing has by far the longest legacy, model-based design methods are proving to be useful for software verification and validation. Some work in formal methods, though not widely used for any satellites, may offer new ways to improve small satellite software verification and validation. These methods need to be further advanced to deal with the state explosion problem and to make them more usable by small-satellite software engineers to be regularly applied to software verification. Last, it is explained how run-time monitoring, combined with fault-tolerant software design methods, provides an important means to detect and correct software errors that escape the verification process or those errors that are produced after launch through the effects of ionizing radiation.

  20. A service-oriented architecture for integrating the modeling and formal verification of genetic regulatory networks

    PubMed Central

    2009-01-01

    Background The study of biological networks has led to the development of increasingly large and detailed models. Computer tools are essential for the simulation of the dynamical behavior of the networks from the model. However, as the size of the models grows, it becomes infeasible to manually verify the predictions against experimental data or identify interesting features in a large number of simulation traces. Formal verification based on temporal logic and model checking provides promising methods to automate and scale the analysis of the models. However, a framework that tightly integrates modeling and simulation tools with model checkers is currently missing, on both the conceptual and the implementational level. Results We have developed a generic and modular web service, based on a service-oriented architecture, for integrating the modeling and formal verification of genetic regulatory networks. The architecture has been implemented in the context of the qualitative modeling and simulation tool GNA and the model checkers NUSMV and CADP. GNA has been extended with a verification module for the specification and checking of biological properties. The verification module also allows the display and visual inspection of the verification results. Conclusions The practical use of the proposed web service is illustrated by means of a scenario involving the analysis of a qualitative model of the carbon starvation response in E. coli. The service-oriented architecture allows modelers to define the model and proceed with the specification and formal verification of the biological properties by means of a unified graphical user interface. This guarantees a transparent access to formal verification technology for modelers of genetic regulatory networks. PMID:20042075

  1. Using SysML for verification and validation planning on the Large Synoptic Survey Telescope (LSST)

    NASA Astrophysics Data System (ADS)

    Selvy, Brian M.; Claver, Charles; Angeli, George

    2014-08-01

    This paper provides an overview of the tool, language, and methodology used for Verification and Validation Planning on the Large Synoptic Survey Telescope (LSST) Project. LSST has implemented a Model Based Systems Engineering (MBSE) approach as a means of defining all systems engineering planning and definition activities that have historically been captured in paper documents. Specifically, LSST has adopted the Systems Modeling Language (SysML) standard and is utilizing a software tool called Enterprise Architect, developed by Sparx Systems. Much of the historical use of SysML has focused on the early phases of the project life cycle. Our approach is to extend the advantages of MBSE into later stages of the construction project. This paper details the methodology employed to use the tool to document the verification planning phases, including the extension of the language to accommodate the project's needs. The process includes defining the Verification Plan for each requirement, which in turn consists of a Verification Requirement, Success Criteria, Verification Method(s), Verification Level, and Verification Owner. Each Verification Method for each Requirement is defined as a Verification Activity and mapped into Verification Events, which are collections of activities that can be executed concurrently in an efficient and complementary way. Verification Event dependency and sequences are modeled using Activity Diagrams. The methodology employed also ties in to the Project Management Control System (PMCS), which utilizes Primavera P6 software, mapping each Verification Activity as a step in a planned activity. This approach leads to full traceability from initial Requirement to scheduled, costed, and resource loaded PMCS task-based activities, ensuring all requirements will be verified.

  2. VAVUQ, Python and Matlab freeware for Verification and Validation, Uncertainty Quantification

    NASA Astrophysics Data System (ADS)

    Courtney, J. E.; Zamani, K.; Bombardelli, F. A.; Fleenor, W. E.

    2015-12-01

    A package of scripts is presented for automated Verification and Validation (V&V) and Uncertainty Quantification (UQ) for engineering codes that approximate Partial Differential Equations (PDEs). The code post-processes model results to produce V&V and UQ information. This information can be used to assess model performance. Automated information on code performance can allow for a systematic methodology to assess the quality of model approximations. The software implements common and accepted code verification schemes. The software uses the Method of Manufactured Solutions (MMS), the Method of Exact Solution (MES), Cross-Code Verification, and Richardson Extrapolation (RE) for solution (calculation) verification. It also includes common statistical measures that can be used for model skill assessment. Complete RE can be conducted for complex geometries by implementing high-order non-oscillating numerical interpolation schemes within the software. Model approximation uncertainty is quantified by calculating lower and upper bounds of numerical error from the RE results. The software is also able to calculate the Grid Convergence Index (GCI), and to handle adaptive meshes and models that implement mixed order schemes. Four examples are provided to demonstrate the use of the software for code and solution verification, model validation and uncertainty quantification. The software is used for code verification of a mixed-order compact difference heat transport solver; the solution verification of a 2D shallow-water-wave solver for tidal flow modeling in estuaries; the model validation of a two-phase flow computation in a hydraulic jump compared to experimental data; and numerical uncertainty quantification for 3D CFD modeling of the flow patterns in a Gust erosion chamber.
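
    For three solutions on systematically refined grids, Richardson extrapolation yields an observed order of accuracy, an extrapolated solution, and the GCI error band. A minimal sketch of the standard formulas the package automates; the solution values and refinement ratio are invented.

```python
# Richardson extrapolation and GCI from three grid-refined solutions.
import numpy as np

f3, f2, f1 = 0.9712, 0.9860, 0.9897   # hypothetical coarse -> fine solutions
r = 2.0                               # grid refinement ratio

p = np.log(abs(f3 - f2) / abs(f2 - f1)) / np.log(r)   # observed order of accuracy
f_exact = f1 + (f1 - f2) / (r**p - 1)                 # Richardson-extrapolated value
gci_fine = 1.25 * abs((f1 - f2) / f1) / (r**p - 1)    # GCI with safety factor 1.25

print(f"observed order p = {p:.2f}")
print(f"extrapolated solution = {f_exact:.5f}")
print(f"GCI (fine grid) = {100 * gci_fine:.2f}%")
```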

  3. Towards the formal verification of the requirements and design of a processor interface unit

    NASA Technical Reports Server (NTRS)

    Fura, David A.; Windley, Phillip J.; Cohen, Gerald C.

    1993-01-01

    The formal verification of the design and partial requirements for a Processor Interface Unit (PIU) using the Higher Order Logic (HOL) theorem-proving system is described. The processor interface unit is a single-chip subsystem within a fault-tolerant embedded system under development within the Boeing Defense and Space Group. It provides the opportunity to investigate the specification and verification of a real-world subsystem within a commercially-developed fault-tolerant computer. An overview of the PIU verification effort is given. The actual HOL listings from the verification effort are documented in a companion NASA contractor report entitled 'Towards the Formal Verification of the Requirements and Design of a Processor Interface Unit - HOL Listings', including the general-purpose HOL theories and definitions that support the PIU verification as well as tactics used in the proofs.

  4. Probabilistic Sizing and Verification of Space Ceramic Structures

    NASA Astrophysics Data System (ADS)

    Denaux, David; Ballhause, Dirk; Logut, Daniel; Lucarelli, Stefano; Coe, Graham; Laine, Benoit

    2012-07-01

    Sizing of ceramic parts is best optimised using a probabilistic approach which takes into account the preexisting flaw distribution in the ceramic part to compute a probability of failure of the part depending on the applied load, instead of a maximum allowable load as for a metallic part. This requires extensive knowledge of the material itself but also an accurate control of the manufacturing process. In the end, risk reduction approaches such as proof testing may be used to lower the final probability of failure of the part. Sizing and verification of ceramic space structures have been performed by Astrium for more than 15 years, both with Zerodur and SiC: Silex telescope structure, Seviri primary mirror, Herschel telescope, Formosat-2 instrument, and other ceramic structures flying today. Throughout this period of time, Astrium has investigated and developed experimental ceramic analysis tools based on the Weibull probabilistic approach. In the scope of the ESA/ESTEC study: “Mechanical Design and Verification Methodologies for Ceramic Structures”, which is to be concluded in the beginning of 2012, existing theories, technical state-of-the-art from international experts, and Astrium experience with probabilistic analysis tools have been synthesized into a comprehensive sizing and verification method for ceramics. Both classical deterministic and more optimised probabilistic methods are available, depending on the criticality of the item and on optimisation needs. The methodology, based on proven theory, has been successfully applied to demonstration cases and has shown its practical feasibility.
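
    A minimal sketch of the two-parameter Weibull strength model underlying this probabilistic approach; the scale stress and Weibull modulus below are illustrative stand-ins for values obtained from coupon tests, and the closing comment notes how proof testing truncates the flaw population.

```python
# Two-parameter Weibull failure probability for a ceramic part under stress.
import numpy as np

def failure_probability(sigma, sigma_0=120.0, m=10.0):
    """P_f = 1 - exp(-(sigma/sigma_0)^m), sigma in MPa (parameters illustrative)."""
    return 1.0 - np.exp(-(np.asarray(sigma) / sigma_0) ** m)

for stress in (40.0, 60.0, 80.0):
    print(f"sigma = {stress:5.1f} MPa -> P_f = {failure_probability(stress):.2e}")

# Proof testing at sigma_p removes the weakest flaws; for survivors,
# P_f(sigma | survived sigma_p) = 1 - exp(-((sigma/sigma_0)**m - (sigma_p/sigma_0)**m)),
# which is how a proof load lowers the in-service probability of failure.
```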

  5. Monte Carlo verification of radiotherapy treatments with CloudMC.

    PubMed

    Miras, Hector; Jiménez, Rubén; Perales, Álvaro; Terrón, José Antonio; Bertolet, Alejandro; Ortiz, Antonio; Macías, José

    2018-06-27

    A new implementation has been made on CloudMC, a cloud-based platform presented in a previous work, in order to provide services for radiotherapy treatment verification by means of Monte Carlo in a fast, easy and economical way. A description of the architecture of the application and the new developments implemented is presented, together with the results of the tests carried out to validate its performance. CloudMC has been developed over Microsoft Azure cloud. It is based on a map/reduce implementation for distributing Monte Carlo calculations over a dynamic cluster of virtual machines in order to reduce calculation time. CloudMC has been updated with new methods to read and process the information related to radiotherapy treatment verification: CT image set, treatment plan, structures and dose distribution files in DICOM format. Some tests have been designed in order to determine, for the different tasks, the most suitable type of virtual machines from those available in Azure. Finally, the performance of Monte Carlo verification in CloudMC is studied through three real cases that involve different treatment techniques, linac models and Monte Carlo codes. Considering computational and economic factors, D1_v2 and G1 virtual machines were selected as the default type for the Worker Roles and the Reducer Role, respectively. Calculation times up to 33 min and costs of 16 € were achieved for the verification cases presented when a statistical uncertainty below 2% (2σ) was required. The costs were reduced to 3-6 € when uncertainty requirements were relaxed to 4%. Advantages like high computational power, scalability, easy access and a pay-per-usage model make Monte Carlo cloud-based solutions, like the one presented in this work, an important step forward to solve the long-lived problem of truly introducing Monte Carlo algorithms into the daily routine of the radiotherapy planning process.
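
    The map/reduce pattern splits the requested histories across workers and combines the partial tallies into a dose score with its statistical uncertainty. A hedged sketch of that bookkeeping only; the worker below is a stand-in for a real Monte Carlo job, not CloudMC's API.

```python
# Map: run independent batches of histories. Reduce: pool sums into mean and error.
import numpy as np

def run_worker(n_histories: int, seed: int) -> tuple[float, float, int]:
    """Stand-in Monte Carlo job: returns (sum, sum of squares, n) of its tallies."""
    rng = np.random.default_rng(seed)
    scores = rng.exponential(scale=2.0, size=n_histories)  # hypothetical per-history scores
    return scores.sum(), (scores ** 2).sum(), n_histories

n_total, n_workers = 1_000_000, 20
partials = [run_worker(n_total // n_workers, seed) for seed in range(n_workers)]

s, s2, n = (sum(col) for col in zip(*partials))            # reduce step
mean = s / n
sigma_mean = np.sqrt((s2 / n - mean ** 2) / n)             # standard error of the mean
print(f"dose score = {mean:.4f} +/- {100 * sigma_mean / mean:.2f}% (1 sigma)")
```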

  6. Verification of Gyrokinetic codes: Theoretical background and applications

    NASA Astrophysics Data System (ADS)

    Tronko, Natalia; Bottino, Alberto; Görler, Tobias; Sonnendrücker, Eric; Told, Daniel; Villard, Laurent

    2017-05-01

    In fusion plasmas, the strong magnetic field allows the fast gyro-motion to be systematically removed from the description of the dynamics, resulting in a considerable simplification of the model and a gain in computational time. Nowadays, gyrokinetic (GK) codes play a major role in the understanding of the development and the saturation of turbulence and in the prediction of the subsequent transport. Naturally, these codes require thorough verification and validation. Here, we present a new and generic theoretical framework and specific numerical applications to test the faithfulness of the implemented models to theory and to verify the domain of applicability of existing GK codes. For a sound verification process, the underlying theoretical GK model and the numerical scheme must be considered at the same time, which has rarely been done and therefore makes this approach pioneering. At the analytical level, the main novelty consists in using advanced mathematical tools, such as the variational formulation of dynamics, to systematize the basic equations of GK codes and assess the limits of their applicability. The verification of the numerical scheme is proposed via a benchmark effort. In this work, specific examples of code verification are presented for two GK codes: the multi-species electromagnetic ORB5 (PIC) and the radially global version of GENE (Eulerian). The proposed methodology can be applied to any existing GK code. We establish a hierarchy of reduced GK Vlasov-Maxwell equations implemented in the ORB5 and GENE codes using the Lagrangian variational formulation. At the computational level, detailed verifications of global electromagnetic test cases developed from the CYCLONE Base Case are considered, including a parametric β-scan covering the transition from ITG to KBM and the spectral properties at the nominal β value.

  7. SU-E-T-762: Toward Volume-Based Independent Dose Verification as Secondary Check

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Tachibana, H; Tachibana, R

    2015-06-15

    Purpose: Lung SBRT planning has shifted to a volume prescription technique. However, point dose agreement is still verified using independent dose verification at the secondary check. Volume dose verification is more affected by inhomogeneity correction than the point dose verification currently used as the check. A feasibility study of volume dose verification was conducted for lung SBRT plans. Methods: Six SBRT plans were collected in our institute. Two dose distributions, with and without inhomogeneity correction, were generated using Adaptive Convolve (AC) in Pinnacle3. Simple MU Analysis (SMU, Triangle Product, Ishikawa, JP) was used as the independent dose verification software program, in which a modified Clarkson-based algorithm was implemented and the radiological path length was computed from the CT images independently of the treatment planning system. The agreement in point dose and mean dose between the AC with and without the correction and the SMU was assessed. Results: In the point dose evaluation at the center of the GTV, the difference showed a systematic shift (4.5 ± 1.9%) in comparison to the AC with the inhomogeneity correction; on the other hand, there was good agreement of 0.2 ± 0.9% between the SMU and the AC without the correction. In the volume evaluation, there were significant differences in mean dose not only for the PTV (14.2 ± 5.1%) but also for the GTV (8.0 ± 5.1%) compared to the AC with the correction. Without the correction, the SMU showed good agreement for the GTV (1.5 ± 0.9%) as well as the PTV (0.9 ± 1.0%). Conclusion: Volume evaluation as a secondary check may be possible in homogeneous regions. However, volumes including inhomogeneous media would produce larger discrepancies. The dose calculation algorithm for independent verification needs to be modified to take the inhomogeneity correction into account.
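
    The radiological path length at the heart of such modified Clarkson-style checks can be illustrated by integrating relative electron density along a ray through the CT-derived grid. Production codes typically use exact voxel-traversal schemes such as Siddon's algorithm; the uniform-sampling sketch below, with hypothetical names, only conveys the idea.

```python
import numpy as np

def radiological_path_length(density, start, end, n_steps=512):
    """Water-equivalent depth along the ray from `start` to `end` (voxel
    coordinates) through a 3D relative-electron-density grid: the line
    integral of density * ds, approximated by uniform midpoint sampling."""
    start = np.asarray(start, dtype=float)
    end = np.asarray(end, dtype=float)
    step = (end - start) / n_steps
    ds = np.linalg.norm(step)  # geometric step length (voxel units)
    points = start + step * (np.arange(n_steps) + 0.5)[:, None]
    idx = np.clip(np.rint(points).astype(int), 0, np.array(density.shape) - 1)
    return ds * density[idx[:, 0], idx[:, 1], idx[:, 2]].sum()

# Water phantom with a lung-like low-density slab: the radiological depth
# (about 41.5) is shorter than the geometric depth (49 voxel units).
grid = np.ones((50, 50, 50))
grid[:, :, 20:30] = 0.25
print(radiological_path_length(grid, (25, 25, 0), (25, 25, 49)))
```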

  8. Towards Verification and Validation for Increased Autonomy

    NASA Technical Reports Server (NTRS)

    Giannakopoulou, Dimitra

    2017-01-01

    This presentation covers the work we have performed over the last few years on verification and validation of the next-generation onboard collision avoidance system, ACAS X, for commercial aircraft. It describes our work on probabilistic verification and synthesis of the model that ACAS X is based on, and goes on to the validation of that model with respect to actual simulation and flight data. The presentation then moves on to identify the characteristics of ACAS X that are related to autonomy and to discuss the challenges that autonomy poses for V&V. All work presented has already been published.

  9. Electric power system test and verification program

    NASA Technical Reports Server (NTRS)

    Rylicki, Daniel S.; Robinson, Frank, Jr.

    1994-01-01

    Space Station Freedom's (SSF's) electric power system (EPS) hardware and software verification is performed at all levels of integration, from components to assembly- and system-level tests. Careful planning is essential to ensure the EPS is tested properly on the ground prior to launch. The results of tests performed on breadboard model hardware and the analyses completed to date have been evaluated and used to plan for the design qualification and flight acceptance test phases. These results and plans indicate the verification program for SSF's 75-kW EPS would have been successful and completed in time to support the scheduled first element launch.

  10. Mitigating errors caused by interruptions during medication verification and administration: interventions in a simulated ambulatory chemotherapy setting.

    PubMed

    Prakash, Varuna; Koczmara, Christine; Savage, Pamela; Trip, Katherine; Stewart, Janice; McCurdie, Tara; Cafazzo, Joseph A; Trbovich, Patricia

    2014-11-01

    Nurses are frequently interrupted during medication verification and administration; however, few interventions exist to mitigate resulting errors, and the impact of these interventions on medication safety is poorly understood. The study objectives were to (A) assess the effects of interruptions on medication verification and administration errors, and (B) design and test the effectiveness of targeted interventions at reducing these errors. The study focused on medication verification and administration in an ambulatory chemotherapy setting. A simulation laboratory experiment was conducted to determine interruption-related error rates during specific medication verification and administration tasks. Interventions to reduce these errors were developed through a participatory design process, and their error reduction effectiveness was assessed through a postintervention experiment. Significantly more nurses committed medication errors when interrupted than when uninterrupted. With use of interventions when interrupted, significantly fewer nurses made errors in verifying medication volumes contained in syringes (16/18; 89% preintervention error rate vs 11/19; 58% postintervention error rate; p=0.038; Fisher's exact test) and programmed in ambulatory pumps (17/18; 94% preintervention vs 11/19; 58% postintervention; p=0.012). The rate of error commission significantly decreased with use of interventions when interrupted during intravenous push (16/18; 89% preintervention vs 6/19; 32% postintervention; p=0.017) and pump programming (7/18; 39% preintervention vs 1/19; 5% postintervention; p=0.017). No statistically significant differences were observed for other medication verification tasks. Interruptions can lead to medication verification and administration errors. Interventions were highly effective at reducing unanticipated errors of commission in medication administration tasks, but showed mixed effectiveness at reducing predictable errors of detection in medication verification tasks.

  11. Test and Verification Approach for the NASA Constellation Program

    NASA Technical Reports Server (NTRS)

    Strong, Edward

    2008-01-01

    This viewgraph presentation is a test and verification approach for the NASA Constellation Program. The contents include: 1) The Vision for Space Exploration: Foundations for Exploration; 2) Constellation Program Fleet of Vehicles; 3) Exploration Roadmap; 4) Constellation Vehicle Approximate Size Comparison; 5) Ares I Elements; 6) Orion Elements; 7) Ares V Elements; 8) Lunar Lander; 9) Map of Constellation content across NASA; 10) CxP T&V Implementation; 11) Challenges in CxP T&V Program; 12) T&V Strategic Emphasis and Key Tenets; 13) CxP T&V Mission & Vision; 14) Constellation Program Organization; 15) Test and Evaluation Organization; 16) CxP Requirements Flowdown; 17) CxP Model Based Systems Engineering Approach; 18) CxP Verification Planning Documents; 19) Environmental Testing; 20) Scope of CxP Verification; 21) CxP Verification - General Process Flow; 22) Avionics and Software Integrated Testing Approach; 23) A-3 Test Stand; 24) Space Power Facility; 25) MEIT and FEIT; 26) Flight Element Integrated Test (FEIT); 27) Multi-Element Integrated Testing (MEIT); 28) Flight Test Driving Principles; and 29) Constellation's Integrated Flight Test Strategy - Low Earth Orbit Servicing Capability.

  12. Ares I-X Range Safety Simulation Verification and Analysis Independent Validation and Verification

    NASA Technical Reports Server (NTRS)

    Merry, Carl M.; Tarpley, Ashley F.; Craig, A. Scott; Tartabini, Paul V.; Brewer, Joan D.; Davis, Jerel G.; Dulski, Matthew B.; Gimenez, Adrian; Barron, M. Kyle

    2011-01-01

    NASA's Ares I-X vehicle launched on a suborbital test flight from the Eastern Range in Florida on October 28, 2009. To obtain approval for launch, a range safety final flight data package was generated to meet the data requirements defined in the Air Force Space Command Manual 91-710 Volume 2. The delivery included products such as a nominal trajectory, trajectory envelopes, stage disposal data and footprints, and a malfunction turn analysis. The Air Force's 45th Space Wing uses these products to ensure public and launch area safety. Due to the criticality of these data, an independent validation and verification effort was undertaken to ensure data quality and adherence to requirements. As a result, the product package was delivered with the confidence that independent organizations using separate simulation software generated data to meet the range requirements and yielded consistent results. This document captures the Ares I-X final flight data package verification and validation analysis, including the methodology used to validate and verify simulation inputs, execution, and results, and presents lessons learned during the process.

  13. A Scala DSL for RETE-Based Runtime Verification

    NASA Technical Reports Server (NTRS)

    Havelund, Klaus

    2013-01-01

    Runtime verification (RV) consists in part of checking execution traces against formalized specifications. Several systems have emerged, most of which support specification notations based on state machines, regular expressions, temporal logic, or grammars. The field of Artificial Intelligence (AI) has for an even longer period of time studied rule-based production systems, which on closer inspection appear relevant for RV, although seemingly focused on slightly different application domains, such as business processes and expert systems. The core algorithm in many of these systems is the Rete algorithm. We have implemented a Rete-based runtime verification system, named LogFire (originally intended for offline log analysis but also applicable to online analysis), as an internal DSL in the Scala programming language, using Scala's support for defining DSLs. This combination appears attractive from a practical point of view. Our contribution is in part conceptual, in arguing that such rule-based frameworks originating in AI may be well suited for RV.
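
    As a flavor of the rule-based style, the toy monitor below keeps a fact memory that rules update as events arrive, flagging a violation when a rule's expectation is not met. It is a drastically simplified stand-in, in Python rather than Scala, for what LogFire's Rete-based engine and DSL provide.

```python
class Monitor:
    """Tiny rule-based trace monitor: rules react to events and a fact memory."""
    def __init__(self):
        self.facts, self.errors, self.rules = set(), [], []
    def rule(self, func):
        self.rules.append(func)
        return func
    def submit(self, event):
        for rule in self.rules:
            rule(self, event)

m = Monitor()

@m.rule
def open_before_close(mon, event):
    """Property: a resource may only be closed if it was opened before."""
    kind, resource = event
    if kind == "open":
        mon.facts.add(resource)
    elif kind == "close":
        if resource not in mon.facts:
            mon.errors.append(f"close of {resource} without open")
        mon.facts.discard(resource)

for event in [("open", "f1"), ("close", "f1"), ("close", "f2")]:
    m.submit(event)
print(m.errors)  # ['close of f2 without open']
```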

  14. Inverse probability weighting estimation of the volume under the ROC surface in the presence of verification bias.

    PubMed

    Zhang, Ying; Alonzo, Todd A

    2016-11-01

    In diagnostic medicine, the volume under the receiver operating characteristic (ROC) surface (VUS) is a commonly used index to quantify the ability of a continuous diagnostic test to discriminate between three disease states. In practice, verification of the true disease status may be performed only for a subset of subjects under study, since the verification procedure can be invasive, risky, or expensive. The selection for disease examination might depend on the results of the diagnostic test and other clinical characteristics of the patients, which in turn can bias estimates of the VUS. This bias is referred to as verification bias. Existing verification bias correction in three-way ROC analysis focuses on ordinal tests. We propose verification bias-correction methods, based on inverse probability weighting, to construct the ROC surface and estimate the VUS for a continuous diagnostic test. By applying U-statistics theory, we develop asymptotic properties for the estimator. A jackknife estimator of variance is also derived. Extensive simulation studies are performed to evaluate the performance of the new estimators in terms of bias correction and variance. The proposed methods are used to assess the ability of a biomarker to accurately identify stages of Alzheimer's disease. © 2016 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
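
    A minimal sketch of the inverse-probability-weighted VUS estimator follows, assuming three ordered disease states and verification probabilities that have already been estimated; function and variable names are illustrative, not the authors' software.

```python
import numpy as np

def ipw_vus(test, disease, verified, pi):
    """IPW estimate of the VUS, P(X0 < X1 < X2), for a continuous test and
    three ordered disease states 0 < 1 < 2. Each verified subject is
    weighted by 1/pi, the inverse of its estimated verification probability."""
    w = np.where(verified, 1.0 / pi, 0.0)
    x, wt = [], []
    for k in range(3):
        sel = verified & (disease == k)
        x.append(test[sel])
        wt.append(w[sel])
    # Indicator of correctly ordered triples (one subject from each class).
    ordered = (x[0][:, None, None] < x[1][None, :, None]) & \
              (x[1][None, :, None] < x[2][None, None, :])
    wprod = wt[0][:, None, None] * wt[1][None, :, None] * wt[2][None, None, :]
    return (wprod * ordered).sum() / wprod.sum()

# Toy data where verification depends on the test result (the source of bias).
rng = np.random.default_rng(1)
disease = rng.integers(0, 3, 600)
test = disease + rng.normal(0.0, 1.0, 600)
pi = np.clip(0.2 + 0.25 * (test - test.min()), 0.05, 0.95)
verified = rng.random(600) < pi
print(ipw_vus(test, disease, verified, pi))  # bias-corrected VUS estimate
```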

  15. Verification of Advective Bar Elements Implemented in the Aria Thermal Response Code.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mills, Brantley

    2016-01-01

    A verification effort was undertaken to evaluate the implementation of the new advective bar capability in the Aria thermal response code. Several approaches to the verification process were taken: a mesh refinement study to demonstrate solution convergence in the fluid and the solid, visual examination of the mapping of the advective bar element nodes to the surrounding surfaces, and a comparison of solutions produced using the advective bars for simple geometries with solutions from commercial CFD software. The mesh refinement study showed solution convergence for simple pipe flow in both temperature and velocity. Guidelines were provided to achieve appropriate meshes between the advective bar elements and the surrounding volume. Simulations of pipe flow using advective bar elements in Aria were compared to simulations using the commercial CFD software ANSYS Fluent® and provided comparable solutions in temperature and velocity, supporting proper implementation of the new capability.

  16. Online fingerprint verification.

    PubMed

    Upendra, K; Singh, S; Kumar, V; Verma, H K

    2007-01-01

    As organizations search for more secure authentication methods for user access, e-commerce, and other security applications, biometrics is gaining increasing attention. With an increasing emphasis on emerging automatic personal identification applications, fingerprint-based identification is becoming more popular. The most widely used fingerprint representation is the minutiae-based representation. The main drawback of this representation is that it does not utilize a significant component of the rich discriminatory information available in fingerprints. Local ridge structures cannot be completely characterized by minutiae. It is also difficult to quickly match two fingerprint images containing different numbers of unregistered minutiae points. In this study a filter-bank-based representation, which eliminates these weaknesses, is implemented and the overall performance of the developed system is tested. The results have shown that this system can be used effectively for secure online verification applications.
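
    A rough sketch of a filter-bank representation in this spirit: Gabor responses at several orientations, pooled as average absolute deviation over a grid of cells, yield a fixed-length feature vector on which two fingerprints can be compared. The parameters, pooling and threshold are illustrative, not the paper's exact design.

```python
import numpy as np
from skimage.filters import gabor

def filterbank_features(img, frequency=0.1, n_orientations=8, grid=8):
    """Fixed-length fingerprint features: Gabor-filter the image at several
    orientations and pool each response over a grid of rectangular cells."""
    feats = []
    for k in range(n_orientations):
        real, _ = gabor(img, frequency=frequency, theta=k * np.pi / n_orientations)
        h, w = real.shape
        for gy in range(grid):
            for gx in range(grid):
                cell = real[gy * h // grid:(gy + 1) * h // grid,
                            gx * w // grid:(gx + 1) * w // grid]
                feats.append(np.abs(cell - cell.mean()).mean())  # avg abs deviation
    return np.asarray(feats)

def verify_fingerprint(query, enrolled, threshold=1.0):
    """Accept when the two feature vectors are closer than a tuned threshold."""
    d = np.linalg.norm(filterbank_features(query) - filterbank_features(enrolled))
    return d < threshold
```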

  17. Gender verification testing in sport.

    PubMed

    Ferris, E A

    1992-07-01

    Gender verification testing in sport, first introduced in 1966 by the International Amateur Athletic Federation (IAAF) in response to fears that males, with a physical advantage in terms of muscle mass and strength, were cheating by masquerading as females in women's competition, has led to unfair disqualifications of women athletes and untold psychological harm. The discredited sex chromatin test, which identifies only the sex chromosome component of gender and is therefore misleading, was abandoned in 1991 by the IAAF in favour of medical checks for all athletes, women and men, which preclude the need for gender testing. However, women athletes will still be tested at the Olympic Games at Albertville and Barcelona using the polymerase chain reaction (PCR) to amplify DNA sequences on the Y chromosome, which identifies genetic sex only. Gender verification testing may in time be abolished when the sporting community is fully cognizant of its scientific and ethical implications.

  18. An Overview of the Runtime Verification Tool Java PathExplorer

    NASA Technical Reports Server (NTRS)

    Havelund, Klaus; Rosu, Grigore; Clancy, Daniel (Technical Monitor)

    2002-01-01

    We present an overview of the Java PathExplorer runtime verification tool, in short referred to as JPAX. JPAX can monitor the execution of a Java program and check that it conforms to a set of user-provided properties formulated in temporal logic. JPAX can in addition analyze the program for concurrency errors such as deadlocks and data races. The concurrency analysis requires no user-provided specification. The tool facilitates automated instrumentation of a program's bytecode, which when executed will emit an event stream, the execution trace, to an observer. The observer dispatches the incoming event stream to a set of observer processes, each performing a specialized analysis, such as temporal logic verification, deadlock analysis and data race analysis. Temporal logic specifications can be formulated by the user in the Maude rewriting logic, where Maude is a high-speed rewriting system for equational logic, here extended with executable temporal logic. The Maude rewriting engine is then activated as an event-driven monitoring process. Alternatively, temporal specifications can be translated into efficient automata, which check the event stream. JPAX can be used during program testing to gain increased information about program executions, and can potentially also be applied during operation to monitor safety-critical systems.
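
    The trace-checking side of such a tool can be conveyed with a tiny monitor for one response property, G(trigger -> F response), over a finite trace. JPAX itself evaluates Maude temporal specifications or translated automata; this Python sketch only shows the general idea.

```python
def monitor_response(trace, trigger, response):
    """Check G(trigger -> F response) on a finite trace: every trigger event
    must eventually be followed by a response event. Returns the verdict and
    the positions of triggers left unanswered at the end of the trace."""
    pending = []
    for position, event in enumerate(trace):
        if event == trigger:
            pending.append(position)
        elif event == response:
            pending.clear()  # one response answers every earlier trigger
    return (not pending, pending)

ok, unanswered = monitor_response(
    ["open", "read", "close", "open", "write"], "open", "close")
print(ok, unanswered)  # False [3]: the second open is never closed
```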

  19. 48 CFR 4.1302 - Acquisition of approved products and services for personal identity verification.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... products and services for personal identity verification. 4.1302 Section 4.1302 Federal Acquisition Regulations System FEDERAL ACQUISITION REGULATION GENERAL ADMINISTRATIVE MATTERS Personal Identity Verification 4.1302 Acquisition of approved products and services for personal identity verification. (a) In...

  20. 48 CFR 4.1302 - Acquisition of approved products and services for personal identity verification.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... products and services for personal identity verification. 4.1302 Section 4.1302 Federal Acquisition Regulations System FEDERAL ACQUISITION REGULATION GENERAL ADMINISTRATIVE MATTERS Personal Identity Verification 4.1302 Acquisition of approved products and services for personal identity verification. (a) In...

  1. 48 CFR 4.1302 - Acquisition of approved products and services for personal identity verification.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... products and services for personal identity verification. 4.1302 Section 4.1302 Federal Acquisition Regulations System FEDERAL ACQUISITION REGULATION GENERAL ADMINISTRATIVE MATTERS Personal Identity Verification 4.1302 Acquisition of approved products and services for personal identity verification. (a) In...

  2. 48 CFR 4.1302 - Acquisition of approved products and services for personal identity verification.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... products and services for personal identity verification. 4.1302 Section 4.1302 Federal Acquisition Regulations System FEDERAL ACQUISITION REGULATION GENERAL ADMINISTRATIVE MATTERS Personal Identity Verification 4.1302 Acquisition of approved products and services for personal identity verification. (a) In...

  3. 48 CFR 4.1302 - Acquisition of approved products and services for personal identity verification.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... products and services for personal identity verification. 4.1302 Section 4.1302 Federal Acquisition Regulations System FEDERAL ACQUISITION REGULATION GENERAL ADMINISTRATIVE MATTERS Personal Identity Verification 4.1302 Acquisition of approved products and services for personal identity verification. (a) In...

  4. Verification of Java Programs using Symbolic Execution and Invariant Generation

    NASA Technical Reports Server (NTRS)

    Pasareanu, Corina; Visser, Willem

    2004-01-01

    Software verification is recognized as an important and difficult problem. We present a novel framework, based on symbolic execution, for the automated verification of software. The framework uses annotations in the form of method specifications and loop invariants. We present a novel iterative technique that uses invariant strengthening and approximation for discovering these loop invariants automatically. The technique handles different types of data (e.g. boolean and numeric constraints, dynamically allocated structures and arrays) and allows for checking universally quantified formulas. Our framework is built on top of the Java PathFinder model checking toolset and was used for the verification of several non-trivial Java programs.
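
    The inductive character of the invariant check can be illustrated concretely: a candidate loop invariant must hold in the initial state and be preserved by the loop body under the loop guard. The real framework discharges these conditions symbolically over Java programs; the brute-force Python sketch below, with hypothetical names, checks them over a small box of states instead.

```python
def check_invariant(invariant, init, body, guard, bound=200):
    """Check a candidate loop invariant the way an inductive verifier would,
    but by enumerating a small box of states instead of reasoning symbolically:
    (1) the invariant holds initially; (2) from any state satisfying the
    invariant and the guard, executing the body preserves the invariant."""
    if not invariant(init()):
        return "fails in the initial state"
    for i in range(bound):
        for s in range(bound):
            state = {"i": i, "s": s}
            if invariant(state) and guard(state) and not invariant(body(dict(state))):
                return f"not preserved from {state}"
    return "inductive on all states in the search box"

# Loop under verification:  i = s = 0;  while i < n: s += i; i += 1
invariant = lambda st: st["s"] == st["i"] * (st["i"] - 1) // 2  # s = 0+1+...+(i-1)
init = lambda: {"i": 0, "s": 0}
guard = lambda st: st["i"] < 100                                # n = 100
def body(st):
    st["s"] += st["i"]
    st["i"] += 1
    return st

print(check_invariant(invariant, init, body, guard))
```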

  5. ENVIRONMENTAL TECHNOLOGY VERIFICATION REPORT: DUST SUPPRESSANT PRODUCTS: SYNTECH PRODUCTS CORPORATION'S TECHSUPPRESS

    EPA Science Inventory

    Dust suppressant products used to control particulate emissions from unpaved roads are among the technologies evaluated by the Air Pollution Control Technology (APCT) Verification Center, part of the U.S. Environmental Protection Agency's Environmental Technology Verification (ET...

  6. ENVIRONMENTAL TECHNOLOGY VERIFICATION REPORT: DUST SUPPRESSANT PRODUCTS: SYNTECH PRODUCTS CORPORATION'S PETROTAC

    EPA Science Inventory

    Dust suppressant products used to control particulate emissions from unpaved roads are among the technologies evaluated by the Air Pollution Control Technology (APCT) Verification Center, part of the U.S. Environmental Protection Agency's Environmental Technology Verification (ET...

  7. PERFORMANCE VERIFICATION TEST FOR FIELD-PORTABLE MEASUREMENTS OF LEAD IN DUST

    EPA Science Inventory

    The US Environmental Protection Agency (EPA) Environmental Technology Verification (ETV) program (www.epa.gov/etv) conducts performance verification tests of technologies used for the characterization and monitoring of contaminated media. The program exists to provide high-quali...

  8. Interferometric step gauge for CMM verification

    NASA Astrophysics Data System (ADS)

    Hemming, B.; Esala, V.-P.; Laukkanen, P.; Rantanen, A.; Viitala, R.; Widmaier, T.; Kuosmanen, P.; Lassila, A.

    2018-07-01

    The verification of the measurement capability of coordinate measuring machines (CMMs) is usually performed using gauge blocks or step gauges as reference standards. Gauge blocks and step gauges are robust and easy to use, but have some limitations, such as finite lengths and the uncertainty of thermal expansion. This paper describes the development, testing and uncertainty evaluation of an interferometric step gauge (ISG) for CMM verification. The idea of the ISG is to move a carriage bearing a gauge block along a rail and to measure its position with an interferometer. For a displacement of 1 m the standard uncertainty of the position of the gauge block is 0.2 µm. Short-range periodic errors of a CMM can also be detected.
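
    The dominant terms of such a position-uncertainty budget combine in quadrature. The sketch below uses illustrative magnitudes, not the paper's actual budget, to show how a length-proportional interferometer term, a thermal-expansion term and a fixed alignment (Abbe) term yield an uncertainty of the observed order.

```python
import math

def isg_position_uncertainty_um(L_m, u_interf_nm_per_m=150.0, alpha_per_K=0.05e-6,
                                u_temp_K=0.1, u_abbe_um=0.05):
    """Standard uncertainty (um) of the gauge block position for a displacement
    of L_m metres: an interferometer term proportional to length, a thermal
    term for a low-expansion carriage/rail, and a fixed Abbe/alignment term,
    combined in quadrature. All magnitudes are illustrative placeholders."""
    u_interf_um = u_interf_nm_per_m * L_m / 1000.0
    u_thermal_um = alpha_per_K * u_temp_K * L_m * 1e6
    return math.sqrt(u_interf_um**2 + u_thermal_um**2 + u_abbe_um**2)

print(f"u(1 m) = {isg_position_uncertainty_um(1.0):.2f} um")
# ~0.16 um with these placeholder inputs, the same order as the reported 0.2 um
```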

  9. 28 CFR 541.29 - Staff verification of need for protection.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... 28 Judicial Administration 2 2014-07-01 2014-07-01 false Staff verification of need for protection... MANAGEMENT INMATE DISCIPLINE AND SPECIAL HOUSING UNITS Special Housing Units § 541.29 Staff verification of need for protection. If a staff investigation verifies your need for placement in the SHU as a...

  10. 28 CFR 541.29 - Staff verification of need for protection.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... 28 Judicial Administration 2 2012-07-01 2012-07-01 false Staff verification of need for protection... MANAGEMENT INMATE DISCIPLINE AND SPECIAL HOUSING UNITS Special Housing Units § 541.29 Staff verification of need for protection. If a staff investigation verifies your need for placement in the SHU as a...

  11. 28 CFR 541.29 - Staff verification of need for protection.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... 28 Judicial Administration 2 2011-07-01 2011-07-01 false Staff verification of need for protection... MANAGEMENT INMATE DISCIPLINE AND SPECIAL HOUSING UNITS Special Housing Units § 541.29 Staff verification of need for protection. If a staff investigation verifies your need for placement in the SHU as a...

  12. ENVIRONMENTAL TECHNOLOGY VERIFICATION--GENERIC VERIFICATION PROTOCOL FOR BIOLOGICAL AND AEROSOL TESTING OF GENERAL VENTILATION AIR CLEANERS

    EPA Science Inventory

    Under EPA's Environmental Technology Verification Program, Research Triangle Institute (RTI) will operate the Air Pollution Control Technology Center to verify the filtration efficiency and bioaerosol inactivation efficiency of heating, ventilation and air conditioning air cleane...

  13. Experimental verification of layout physical verification of silicon photonics

    NASA Astrophysics Data System (ADS)

    El Shamy, Raghi S.; Swillam, Mohamed A.

    2018-02-01

    Silicon photonics has proven to be one of the best platforms for dense integration of photonic integrated circuits (PICs) due to the high refractive index contrast among its materials. Silicon on insulator (SOI) is a widespread photonics technology which supports a variety of devices for many applications. As the photonics market grows, the number of components in PICs increases, which increases the need for an automated physical verification (PV) process. This PV process will assure reliable fabrication of the PICs, as it checks both the manufacturability and the reliability of the circuit. However, the PV process is challenging in the case of PICs, as it otherwise requires running exhaustive electromagnetic (EM) simulations. Our group has recently proposed empirical closed-form models for the directional coupler and waveguide bends in SOI technology. The models have shown very good agreement with both finite element method (FEM) and finite difference time domain (FDTD) solvers. These models save the considerable time required by 3D EM simulations and can easily be included in any electronic design automation (EDA) flow, as the model parameters can be extracted directly from the layout. In this paper we present experimental verification of our previously proposed models. SOI directional couplers with different dimensions have been fabricated using electron beam lithography and measured. The measurement results of the fabricated devices have been compared to the derived models and show very good agreement. The matching can reach 100% by calibrating certain parameters in the model.
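
    As a flavor of what such closed-form compact models look like, the sketch below uses the generic lossless coupled-mode transfer function with an assumed exponential dependence of the coupling length on the gap; the functional form and placeholder constants are textbook-style illustrations, not the authors' fitted SOI model.

```python
import math

def coupling_length_um(gap_nm, Lc0_um=5.0, g0_nm=150.0):
    """Illustrative compact model: the full cross-over length grows roughly
    exponentially with the coupling gap (placeholder constants)."""
    return Lc0_um * math.exp(gap_nm / g0_nm)

def cross_port_power(length_um, gap_nm):
    """Fraction of the input power in the cross port of a lossless
    directional coupler with the given length and gap."""
    return math.sin(math.pi * length_um / (2.0 * coupling_length_um(gap_nm))) ** 2

# A 3 dB splitter needs half the cross-over length for a given gap:
gap_nm = 200.0
print(cross_port_power(coupling_length_um(gap_nm) / 2.0, gap_nm))  # 0.5
```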

  14. Verification of Triple Modular Redundancy (TMR) Insertion for Reliable and Trusted Systems

    NASA Technical Reports Server (NTRS)

    Berg, Melanie; LaBel, Kenneth A.

    2016-01-01

    We propose a method for TMR insertion verification that satisfies the process for reliable and trusted systems. If a system is expected to be protected using TMR, improper insertion can jeopardize the reliability and security of the system. Due to the complexity of the verification process, there are currently no available techniques that can provide complete and reliable confirmation of TMR insertion. This manuscript addresses the challenge of confirming that TMR has been inserted without corruption of functionality and with correct application of the expected TMR topology. The proposed verification method combines the usage of existing formal analysis tools with a novel search-detect-and-verify tool. Keywords: field programmable gate array (FPGA), triple modular redundancy (TMR), verification, trust, reliability.

  15. ESTEST: A Framework for the Verification and Validation of Electronic Structure Codes

    NASA Astrophysics Data System (ADS)

    Yuan, Gary; Gygi, Francois

    2011-03-01

    ESTEST is a verification and validation (V&V) framework for electronic structure codes that supports Qbox, Quantum Espresso, ABINIT and the Exciting Code, with support planned for many more. We discuss various approaches to the electronic structure V&V problem implemented in ESTEST, related to parsing, formats, data management, search, comparison and analyses. Additionally, an early experiment in the distribution of ESTEST V&V servers among the electronic structure community will be presented. Supported by NSF-OCI 0749217 and DOE FC02-06ER25777.

  16. VERIFICATION TESTING OF HIGH-RATE MECHANICAL INDUCTION MIXERS FOR CHEMICAL DISINFECTANTS

    EPA Science Inventory

    This paper describes the results of verification testing of mechanical induction mixers for dispersion of chemical disinfectants in wet-weather flow (WWF) conducted under the U.S. Environmental Protection Agency's Environmental Technology Verification (ETV) WWF Pilot Program. Th...

  17. Toward Automatic Verification of Goal-Oriented Flow Simulations

    NASA Technical Reports Server (NTRS)

    Nemec, Marian; Aftosmis, Michael J.

    2014-01-01

    We demonstrate the power of adaptive mesh refinement with adjoint-based error estimates in verification of simulations governed by the steady Euler equations. The flow equations are discretized using a finite volume scheme on a Cartesian mesh with cut cells at the wall boundaries. The discretization error in selected simulation outputs is estimated using the method of adjoint-weighted residuals. Practical aspects of the implementation are emphasized, particularly in the formulation of the refinement criterion and the mesh adaptation strategy. Following a thorough code verification example, we demonstrate simulation verification of two- and three-dimensional problems. These involve an airfoil performance database, a pressure signature of a body in supersonic flow and a launch abort with strong jet interactions. The results show reliable estimates and automatic control of discretization error in all simulations at an affordable computational cost. Moreover, the approach remains effective even when theoretical assumptions, e.g., steady-state and solution smoothness, are relaxed.
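
    For a linear discrete problem the adjoint-weighted residual identity is exact, which makes the mechanism easy to demonstrate in a few lines. The generic linear-algebra sketch below is not the paper's Cartesian cut-cell Euler setting, only the underlying error-estimation idea.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 50
M = rng.standard_normal((n, n))
A = M @ M.T + n * np.eye(n)              # SPD matrix standing in for a discretization
f = rng.standard_normal(n)               # right-hand side
g = rng.standard_normal(n)               # output functional J(u) = g @ u

u = np.linalg.solve(A, f)                # "exact" discrete solution
u_h = u + 1e-3 * rng.standard_normal(n)  # surrogate for an approximate solution

psi = np.linalg.solve(A.T, g)            # adjoint solve for the output g
estimate = psi @ (f - A @ u_h)           # adjoint-weighted residual
actual = g @ (u - u_h)                   # true error in the output
print(np.isclose(estimate, actual))      # True: the identity is exact when linear
```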

  18. A Secure Framework for Location Verification in Pervasive Computing

    NASA Astrophysics Data System (ADS)

    Liu, Dawei; Lee, Moon-Chuen; Wu, Dan

    The way people use computing devices has been changed by the relatively new pervasive computing paradigm. For example, a person can use a mobile device to obtain its location information at any time and anywhere. There are several security issues concerning whether this information is reliable in a pervasive environment. For example, a malicious user may disable the localization system by broadcasting a forged location, and may impersonate other users by eavesdropping on their locations. In this paper, we address the verification of location information in a secure manner. We first present the design challenges for location verification, and then propose a two-layer framework, VerPer, for secure location verification in a pervasive computing environment. Real-world GPS-based wireless sensor network experiments confirm the effectiveness of the proposed framework.
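
    A toy version of the anchor-based consistency check behind such schemes: trusted anchor nodes measure their distances to the device (e.g., via time of flight) and a claimed position is accepted only if it agrees with those measurements. Names and tolerance are illustrative, not the VerPer protocol itself.

```python
import math

def verify_claimed_location(claimed, anchors, measured_dist, tol_m=5.0):
    """Reject a claimed (x, y) position inconsistent with the distances
    independently measured by trusted anchor nodes."""
    for (ax, ay), dist in zip(anchors, measured_dist):
        if abs(math.hypot(claimed[0] - ax, claimed[1] - ay) - dist) > tol_m:
            return False
    return True

anchors = [(0.0, 0.0), (100.0, 0.0), (0.0, 100.0)]
measured = [50.0, 80.6, 67.1]                       # distances to the real device
print(verify_claimed_location((30.0, 40.0), anchors, measured))  # True: honest claim
print(verify_claimed_location((90.0, 90.0), anchors, measured))  # False: forged claim
```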

  19. Static Verification for Code Contracts

    NASA Astrophysics Data System (ADS)

    Fähndrich, Manuel

    The Code Contracts project [3] at Microsoft Research enables programmers on the .NET platform to author specifications in existing languages such as C# and VisualBasic. To take advantage of these specifications, we provide tools for documentation generation, runtime contract checking, and static contract verification.

  20. Expert system verification and validation study: ES V/V Workshop

    NASA Technical Reports Server (NTRS)

    French, Scott; Hamilton, David

    1992-01-01

    The primary purpose of this document is to build a foundation for applying principles of verification and validation (V&V) to expert systems. To achieve this, some background on V&V as applied to conventionally implemented software is required. Part one discusses the background of V&V from the perspective of (1) what V&V of software is and (2) V&V's role in developing software. Part one also overviews some common analysis techniques that are applied when performing V&V of software. All of these materials are presented on the assumption that the reader has little or no background in V&V or in developing procedural software. The primary purpose of part two is to explain the major techniques that have been developed for V&V of expert systems.