Sample records for "verification study strongly"

  1. Response to "Improving Patient Safety With Error Identification in Chemotherapy Orders by Verification Nurses".

    PubMed

    Zhu, Ling-Ling; Lv, Na; Zhou, Quan

    2016-12-01

    We read, with great interest, the study by Baldwin and Rodriguez (2016), which described the role of the verification nurse and detailed the verification process in identifying errors related to chemotherapy orders. We strongly agree with their findings that a verification nurse, collaborating closely with the prescribing physician, pharmacist, and treating nurse, can better identify errors and maintain safety during chemotherapy administration.

  2. Self-verification and social anxiety: preference for negative social feedback and low social self-esteem.

    PubMed

    Valentiner, David P; Skowronski, John J; McGrath, Patrick B; Smith, Sarah A; Renner, Kerry A

    2011-10-01

    A self-verification model of social anxiety views negative social self-esteem as a core feature of social anxiety. This core feature is proposed to be maintained through self-verification processes, such as by leading individuals with negative social self-esteem to prefer negative social feedback. This model is tested in two studies. In Study 1, questionnaires were administered to a college sample (N = 317). In Study 2, questionnaires were administered to anxiety-disordered patients (N = 62) before and after treatment. Study 1 developed measures of preference for negative social feedback and social self-esteem, and provided evidence of their incremental validity in a college sample. Study 2 found that these two variables are not strongly related to fears of evaluation, are relatively unaffected by a treatment that targets such fears, and predict residual social anxiety following treatment. Overall, these studies provide preliminary evidence for a self-verification model of social anxiety.

  3. Dynamic testing for shuttle design verification

    NASA Technical Reports Server (NTRS)

    Green, C. E.; Leadbetter, S. A.; Rheinfurth, M. H.

    1972-01-01

    Space shuttle design verification requires dynamic data from full scale structural component and assembly tests. Wind tunnel and other scaled model tests are also required early in the development program to support the analytical models used in design verification. Presented is a design philosophy based on mathematical modeling of the structural system strongly supported by a comprehensive test program; some of the types of required tests are outlined.

  4. Research on key technology of the verification system of steel rule based on vision measurement

    NASA Astrophysics Data System (ADS)

    Jia, Siyuan; Wang, Zhong; Liu, Changjie; Fu, Luhua; Li, Yiming; Lu, Ruijun

    2018-01-01

    The steel rule plays an important role in quantity transmission. However, the traditional verification method for steel rules, based on manual operation and reading, suffers from low precision and low efficiency. A machine vision based verification system for steel rules was designed with reference to JJG 1-1999, Verification Regulation of Steel Rule [1]. What differentiates this system is that it uses a new calibration method for the pixel equivalent and decontaminates the surface of the steel rule. Experiments show that these two methods fully meet the requirements of the verification system. Measurement results strongly prove that these methods not only meet the precision required by the verification regulation, but also improve the reliability and efficiency of the verification system.
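
    The abstract's key step is converting image pixels to millimetres. As a rough illustration of what a pixel-equivalent calibration does (the function names and numbers below are hypothetical, not taken from the paper), one might write:

```python
# Hypothetical sketch of pixel-equivalent calibration for a vision-based
# rule check: calibrate mm-per-pixel from a reference length, then score
# each graduation interval against its nominal value.

def pixel_equivalent(ref_length_mm: float, ref_length_px: float) -> float:
    """Millimetres represented by one image pixel."""
    return ref_length_mm / ref_length_px

def graduation_error_mm(nominal_mm: float, measured_px: float, mm_per_px: float) -> float:
    """Deviation of a measured graduation interval from its nominal length."""
    return measured_px * mm_per_px - nominal_mm

mm_per_px = pixel_equivalent(10.0, 2053.7)           # a 10 mm reference spans ~2054 px
print(f"{graduation_error_mm(10.0, 2051.2, mm_per_px):+.4f} mm")
```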

  5. Shop floor compliance with age restrictions for tobacco sales: remote versus in-store age verification.

    PubMed

    van Hoof, Joris J; Gosselt, Jordy F; de Jong, Menno D T

    2010-02-01

    To compare traditional in-store age verification with a newly developed remote age verification system, 100 cigarette purchase attempts were made by 15-year-old "mystery shoppers." The remote system led to a strong increase in compliance (96% vs. 12%), reflecting more identification requests and more sale refusals when adolescents showed their identification cards.

  6. Validation (not just verification) of Deep Space Missions

    NASA Technical Reports Server (NTRS)

    Duren, Riley M.

    2006-01-01

    Verification & Validation (V&V) is a widely recognized and critical systems engineering function. However, the often-used definition 'Verification proves the design is right; validation proves it is the right design' is rather vague. And while Verification is a reasonably well standardized systems engineering process, Validation is a far more abstract concept and the rigor and scope applied to it vary widely between organizations and individuals. This is reflected in the findings of recent Mishap Reports for several NASA missions, in which shortfalls in Validation (not just Verification) were cited as root or contributing factors in catastrophic mission loss. Furthermore, although there is strong agreement in the community that Test is the preferred method for V&V, many people equate 'V&V' with 'Test', such that Analysis and Modeling aren't given comparable attention. Another strong motivator is a realization that the rapid growth in complexity of deep-space missions (particularly Planetary Landers and Space Observatories, given their inherent unknowns) is placing greater demands on systems engineers to 'get it right' with Validation.

  7. Patterns relationships of student’s creativity with its indicators in learning optical instrument

    NASA Astrophysics Data System (ADS)

    Sukarmin; Dhian, T. E. V.; Nonoh, S. A.; Delisma, W. A.

    2017-01-01

    This study aims to identify patterns of relationship between students' creativity and its indicators in learning about optical instruments. The study was conducted at SMPN 2 Sawo, SMPN 1 Jetis, SMPIT Darut Taqwa, SMPN 1 Dander, Bojonegoro, and SMPN 3 Plus Al-Fatima. Data analysis used descriptive analysis based on Confirmatory Factor Analysis. The creativity test instruments used had previously had their parameters tested. The creativity indicators used are personal (self-confidence, perseverance), press (spirit, unyielding), process (preparation, incubation, illumination, verification), and product (knowledge, skills). The results show that perseverance and incubation are the strongest capabilities and verification the weakest. All indicators of student creativity can still be improved. The relationships between creativity and its indicators are grouped into strong, moderate, weak, and no relation. Indicators with a strong relationship (r ≥ 0.50) are personal (self-confidence, perseverance) and process (illumination). Indicators with a moderate relationship (0.30 ≤ r ≤ 0.49) are press (spirit) and process (verification). Indicators with a weak relationship (0.10 ≤ r ≤ 0.29) are press (unyielding), process (preparation), process (incubation), and product (skills), as shown in Figure 1. The indicator with no relationship to students' creativity is product (knowledge).

  8. The DES Bright Arcs Survey: Hundreds of Candidate Strongly Lensed Galaxy Systems from the Dark Energy Survey Science Verification and Year 1 Observations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Diehl, H. T.; Buckley-Geer, E. J.; Lindgren, K. A.

    We report the results of searches for strong gravitational lens systems in the Dark Energy Survey (DES) Science Verification and Year 1 observations. The Science Verification data span approximately 250 sq. deg. with a median i-band limiting magnitude for extended objects (10σ) of 23.0. The Year 1 data span approximately 2000 sq. deg. and have an i-band limiting magnitude for extended objects (10σ) of 22.9. As these data sets are both wide and deep, they are particularly useful for identifying strong gravitational lens candidates. Potential strong gravitational lens candidate systems were initially identified based on a color and magnitude selection in the DES object catalogs or because the system is at the location of a previously identified galaxy cluster. Cutout images of potential candidates were then visually scanned using an object viewer and numerically ranked according to whether or not we judged them to be likely strong gravitational lens systems. Having scanned nearly 400,000 cutouts, we present 374 candidate strong lens systems, of which 348 are identified for the first time. We provide the R.A. and decl., the magnitudes and photometric properties of the lens and source objects, and the distance (radius) of the source(s) from the lens center for each system.

  9. The DES Bright Arcs Survey: Hundreds of Candidate Strongly Lensed Galaxy Systems from the Dark Energy Survey Science Verification and Year 1 Observations

    NASA Astrophysics Data System (ADS)

    Diehl, H. T.; Buckley-Geer, E. J.; Lindgren, K. A.; Nord, B.; Gaitsch, H.; Gaitsch, S.; Lin, H.; Allam, S.; Collett, T. E.; Furlanetto, C.; Gill, M. S. S.; More, A.; Nightingale, J.; Odden, C.; Pellico, A.; Tucker, D. L.; da Costa, L. N.; Fausti Neto, A.; Kuropatkin, N.; Soares-Santos, M.; Welch, B.; Zhang, Y.; Frieman, J. A.; Abdalla, F. B.; Annis, J.; Benoit-Lévy, A.; Bertin, E.; Brooks, D.; Burke, D. L.; Carnero Rosell, A.; Carrasco Kind, M.; Carretero, J.; Cunha, C. E.; D'Andrea, C. B.; Desai, S.; Dietrich, J. P.; Drlica-Wagner, A.; Evrard, A. E.; Finley, D. A.; Flaugher, B.; García-Bellido, J.; Gerdes, D. W.; Goldstein, D. A.; Gruen, D.; Gruendl, R. A.; Gschwend, J.; Gutierrez, G.; James, D. J.; Kuehn, K.; Kuhlmann, S.; Lahav, O.; Li, T. S.; Lima, M.; Maia, M. A. G.; Marshall, J. L.; Menanteau, F.; Miquel, R.; Nichol, R. C.; Nugent, P.; Ogando, R. L. C.; Plazas, A. A.; Reil, K.; Romer, A. K.; Sako, M.; Sanchez, E.; Santiago, B.; Scarpine, V.; Schindler, R.; Schubnell, M.; Sevilla-Noarbe, I.; Sheldon, E.; Smith, M.; Sobreira, F.; Suchyta, E.; Swanson, M. E. C.; Tarle, G.; Thomas, D.; Walker, A. R.; DES Collaboration

    2017-09-01

    We report the results of searches for strong gravitational lens systems in the Dark Energy Survey (DES) Science Verification and Year 1 observations. The Science Verification data span approximately 250 sq. deg. with a median i-band limiting magnitude for extended objects (10σ) of 23.0. The Year 1 data span approximately 2000 sq. deg. and have an i-band limiting magnitude for extended objects (10σ) of 22.9. As these data sets are both wide and deep, they are particularly useful for identifying strong gravitational lens candidates. Potential strong gravitational lens candidate systems were initially identified based on a color and magnitude selection in the DES object catalogs or because the system is at the location of a previously identified galaxy cluster. Cutout images of potential candidates were then visually scanned using an object viewer and numerically ranked according to whether or not we judged them to be likely strong gravitational lens systems. Having scanned nearly 400,000 cutouts, we present 374 candidate strong lens systems, of which 348 are identified for the first time. We provide the R.A. and decl., the magnitudes and photometric properties of the lens and source objects, and the distance (radius) of the source(s) from the lens center for each system.

  10. Perceptual Processing Affects Conceptual Processing

    ERIC Educational Resources Information Center

    van Dantzig, Saskia; Pecher, Diane; Zeelenberg, Rene; Barsalou, Lawrence W.

    2008-01-01

    According to the Perceptual Symbols Theory of cognition (Barsalou, 1999), modality-specific simulations underlie the representation of concepts. A strong prediction of this view is that perceptual processing affects conceptual processing. In this study, participants performed a perceptual detection task and a conceptual property-verification task…

  11. L band push broom microwave radiometer: Soil moisture verification and time series experiment Delmarva Peninsula

    NASA Technical Reports Server (NTRS)

    Jackson, T. J.; Shiue, J.; O'Neill, P.; Wang, J.; Fuchs, J.; Owe, M.

    1984-01-01

    The verification of a multi-sensor aircraft system developed to study soil moisture applications is discussed. This system consisted of a three-beam push broom L band microwave radiometer, a thermal infrared scanner, a multispectral scanner, video and photographic cameras, and an onboard navigational instrument. Ten flights were made over agricultural sites in Maryland and Delaware with little or no vegetation cover. Comparisons of aircraft and ground measurements showed that the system was reliable and consistent. Time series analysis of microwave and evaporation data showed a strong similarity that indicates a potential direction for future research.

  12. Nonlinear 3D MHD verification study: SpeCyl and PIXIE3D codes for RFP and Tokamak plasmas

    NASA Astrophysics Data System (ADS)

    Bonfiglio, D.; Cappello, S.; Chacon, L.

    2010-11-01

    A strong emphasis is presently placed in the fusion community on reaching predictive capability of computational models. An essential requirement of such an endeavor is the process of assessing the mathematical correctness of computational tools, termed verification [1]. We present here a successful nonlinear cross-benchmark verification study between the 3D nonlinear MHD codes SpeCyl [2] and PIXIE3D [3]. Excellent quantitative agreement is obtained in both 2D and 3D nonlinear visco-resistive dynamics for reversed-field pinch (RFP) and tokamak configurations [4]. RFP dynamics, in particular, lends itself as an ideal nontrivial test-bed for 3D nonlinear verification. Perspectives for future application of the fully-implicit parallel code PIXIE3D to RFP physics, in particular to address open issues in RFP helical self-organization, will be provided. [1] M. Greenwald, Phys. Plasmas 17, 058101 (2010). [2] S. Cappello and D. Biskamp, Nucl. Fusion 36, 571 (1996). [3] L. Chacón, Phys. Plasmas 15, 056103 (2008). [4] D. Bonfiglio, L. Chacón and S. Cappello, Phys. Plasmas 17 (2010).

  13. Decision forests for learning prostate cancer probability maps from multiparametric MRI

    NASA Astrophysics Data System (ADS)

    Ehrenberg, Henry R.; Cornfeld, Daniel; Nawaf, Cayce B.; Sprenkle, Preston C.; Duncan, James S.

    2016-03-01

    Objectives: Advances in multiparametric magnetic resonance imaging (mpMRI) and ultrasound/MRI fusion imaging offer a powerful alternative to the typical undirected approach to diagnosing prostate cancer. However, these methods require the time and expertise needed to interpret mpMRI image scenes. In this paper, a machine learning framework for automatically detecting and localizing cancerous lesions within the prostate is developed and evaluated. Methods: Two studies were performed to gather MRI and pathology data. The 12 patients in the first study underwent an MRI session to obtain structural, diffusion-weighted, and dynamic contrast enhanced image volumes of the prostate, and regions suspected of being cancerous from the MRI data were manually contoured by radiologists. Whole-mount slices of the prostate were obtained for the patients in the second study, in addition to structural and diffusion-weighted MRI data, for pathology verification. A 3-D feature set for voxel-wise appearance description combining intensity data, textural operators, and zonal approximations was generated. Voxels in a test set were classified as normal or cancer using a decision forest-based model initialized using Gaussian discriminant analysis. A leave-one-patient-out cross-validation scheme was used to assess the predictions against the expert manual segmentations confirmed as cancer by biopsy. Results: We achieved an area under the average receiver operating characteristic curve of 0.923 for the first study, and visual assessment of the probability maps showed 21 out of 22 tumors were identified while a high level of specificity was maintained. In addition to evaluating the model against related approaches, the effects of the individual MRI parameter types were explored, and pathological verification using whole-mount slices from the second study was performed. Conclusions: The results of this paper show that the combination of mpMRI and machine learning is a powerful tool for quantitatively diagnosing prostate cancer.
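
    As a rough sketch of the evaluation protocol described here (a decision forest scored voxel-wise under leave-one-patient-out cross-validation), the following uses a plain scikit-learn random forest on synthetic stand-in data; the paper's actual model is initialized with Gaussian discriminant analysis and uses real mpMRI features:

```python
# Sketch of voxel-wise classification with leave-one-patient-out
# cross-validation. X, y, and patient assignments are synthetic stand-ins.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import LeaveOneGroupOut

rng = np.random.default_rng(0)
X = rng.normal(size=(1200, 16))           # 1200 voxels x 16 appearance features
y = rng.integers(0, 2, size=1200)         # 1 = cancer, 0 = normal (toy labels)
patient = rng.integers(0, 12, size=1200)  # voxel -> patient assignment

aucs = []
for train_idx, test_idx in LeaveOneGroupOut().split(X, y, groups=patient):
    forest = RandomForestClassifier(n_estimators=200, random_state=0)
    forest.fit(X[train_idx], y[train_idx])
    prob_map = forest.predict_proba(X[test_idx])[:, 1]  # per-voxel cancer probability
    aucs.append(roc_auc_score(y[test_idx], prob_map))

print(f"mean held-out AUC: {np.mean(aucs):.3f}")
```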

  14. Environmental Technology Verification Program Advanced Monitoring Systems Center Quality Assurance Project Plan for Verification of Black Carbon Monitors

    EPA Science Inventory

    Black carbon is a term that is commonly used to describe strongly light-absorbing carbon (LAC), which is thought to play a significant role in global climate change through direct absorption of light, interaction with clouds, and by reducing the reflectivity of snow and ice. BC ...

  15. Perceptual processing affects conceptual processing.

    PubMed

    Van Dantzig, Saskia; Pecher, Diane; Zeelenberg, René; Barsalou, Lawrence W

    2008-04-05

    According to the Perceptual Symbols Theory of cognition (Barsalou, 1999), modality-specific simulations underlie the representation of concepts. A strong prediction of this view is that perceptual processing affects conceptual processing. In this study, participants performed a perceptual detection task and a conceptual property-verification task in alternation. Responses on the property-verification task were slower for those trials that were preceded by a perceptual trial in a different modality than for those that were preceded by a perceptual trial in the same modality. This finding of a modality-switch effect across perceptual processing and conceptual processing supports the hypothesis that perceptual and conceptual representations are partially based on the same systems.

  16. The DES Bright Arcs Survey: Hundreds of Candidate Strongly Lensed Galaxy Systems from the Dark Energy Survey Science Verification and Year 1 Observations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Diehl, H. T.; Buckley-Geer, E. J.; Lindgren, K. A.

    We report the results of our searches for strong gravitational lens systems in the Dark Energy Survey (DES) Science Verification and Year 1 observations. The Science Verification data span approximately 250 sq. deg. with median i

  17. The DES Bright Arcs Survey: Hundreds of Candidate Strongly Lensed Galaxy Systems from the Dark Energy Survey Science Verification and Year 1 Observations

    DOE PAGES

    Diehl, H. T.; Buckley-Geer, E. J.; Lindgren, K. A.; ...

    2017-09-01

    We report the results of our searches for strong gravitational lens systems in the Dark Energy Survey (DES) Science Verification and Year 1 observations. The Science Verification data span approximately 250 sq. deg. with median i

  18. Toward Automatic Verification of Goal-Oriented Flow Simulations

    NASA Technical Reports Server (NTRS)

    Nemec, Marian; Aftosmis, Michael J.

    2014-01-01

    We demonstrate the power of adaptive mesh refinement with adjoint-based error estimates in verification of simulations governed by the steady Euler equations. The flow equations are discretized using a finite volume scheme on a Cartesian mesh with cut cells at the wall boundaries. The discretization error in selected simulation outputs is estimated using the method of adjoint-weighted residuals. Practical aspects of the implementation are emphasized, particularly in the formulation of the refinement criterion and the mesh adaptation strategy. Following a thorough code verification example, we demonstrate simulation verification of two- and three-dimensional problems. These involve an airfoil performance database, a pressure signature of a body in supersonic flow and a launch abort with strong jet interactions. The results show reliable estimates and automatic control of discretization error in all simulations at an affordable computational cost. Moreover, the approach remains effective even when theoretical assumptions, e.g., steady-state and solution smoothness, are relaxed.
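
    For reference, the adjoint-weighted residual estimate mentioned here has the standard textbook form (our notation, not necessarily the paper's exact variant): the error in an output J due to discretization is approximated by weighting the fine-space residual of the coarse solution with the adjoint solution,

```latex
\delta J \;\approx\; -\,\Psi_h^{\top}\, R_h\!\left(Q_H\right),
\qquad\text{with}\qquad
\left(\frac{\partial R_h}{\partial Q_h}\right)^{\!\top}\Psi_h \;=\; \frac{\partial J_h}{\partial Q_h},
```

    where R_h is the residual operator on the refined space and Q_H the current solution injected into that space. Cells contributing a large adjoint-weighted residual are the ones flagged for refinement.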

  19. Nonlinear three-dimensional verification of the SPECYL and PIXIE3D magnetohydrodynamics codes for fusion plasmas

    NASA Astrophysics Data System (ADS)

    Bonfiglio, D.; Chacón, L.; Cappello, S.

    2010-08-01

    With the increasing impact of scientific discovery via advanced computation, there is presently a strong emphasis on ensuring the mathematical correctness of computational simulation tools. Such endeavor, termed verification, is now at the center of most serious code development efforts. In this study, we address a cross-benchmark nonlinear verification study between two three-dimensional magnetohydrodynamics (3D MHD) codes for fluid modeling of fusion plasmas, SPECYL [S. Cappello and D. Biskamp, Nucl. Fusion 36, 571 (1996)] and PIXIE3D [L. Chacón, Phys. Plasmas 15, 056103 (2008)], in their common limit of application: the simple viscoresistive cylindrical approximation. SPECYL is a serial code in cylindrical geometry that features a spectral formulation in space and a semi-implicit temporal advance, and has been used extensively to date for reversed-field pinch studies. PIXIE3D is a massively parallel code in arbitrary curvilinear geometry that features a conservative, solenoidal finite-volume discretization in space, and a fully implicit temporal advance. The present study is, in our view, a first mandatory step in assessing the potential of any numerical 3D MHD code for fluid modeling of fusion plasmas. Excellent agreement is demonstrated over a wide range of parameters for several fusion-relevant cases in both two- and three-dimensional geometries.

  20. Nonlinear three-dimensional verification of the SPECYL and PIXIE3D magnetohydrodynamics codes for fusion plasmas

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bonfiglio, Daniele; Chacon, Luis; Cappello, Susanna

    2010-01-01

    With the increasing impact of scientific discovery via advanced computation, there is presently a strong emphasis on ensuring the mathematical correctness of computational simulation tools. Such endeavor, termed verification, is now at the center of most serious code development efforts. In this study, we address a cross-benchmark nonlinear verification study between two three-dimensional magnetohydrodynamics (3D MHD) codes for fluid modeling of fusion plasmas, SPECYL [S. Cappello and D. Biskamp, Nucl. Fusion 36, 571 (1996)] and PIXIE3D [L. Chacón, Phys. Plasmas 15, 056103 (2008)], in their common limit of application: the simple viscoresistive cylindrical approximation. SPECYL is a serial code in cylindrical geometry that features a spectral formulation in space and a semi-implicit temporal advance, and has been used extensively to date for reversed-field pinch studies. PIXIE3D is a massively parallel code in arbitrary curvilinear geometry that features a conservative, solenoidal finite-volume discretization in space, and a fully implicit temporal advance. The present study is, in our view, a first mandatory step in assessing the potential of any numerical 3D MHD code for fluid modeling of fusion plasmas. Excellent agreement is demonstrated over a wide range of parameters for several fusion-relevant cases in both two- and three-dimensional geometries.

  1. Source Physics Experiment: Research in Support of Verification and Nonproliferation

    DTIC Science & Technology

    2011-09-01

    designed to provide a carefully controlled seismic and strong motion data set from buried explosions at the Nevada National Security Site (NNSS). The...deposition partitioned into internal (heat and plastic strain) and kinetic (e.g., radiated seismic) energy, giving more confidence in predicted free...ample information to study dry and water-saturated fractures, local lithology and topography on the radiated seismic wavefield. Spallation on

  2. Enhanced Verification Test Suite for Physics Simulation Codes

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kamm, J R; Brock, J S; Brandon, S T

    2008-10-10

    This document discusses problems with which to augment, in quantity and in quality, the existing tri-laboratory suite of verification problems used by Los Alamos National Laboratory (LANL), Lawrence Livermore National Laboratory (LLNL), and Sandia National Laboratories (SNL). The purpose of verification analysis is to demonstrate whether the numerical results of the discretization algorithms in physics and engineering simulation codes provide correct solutions of the corresponding continuum equations. The key points of this document are: (1) Verification deals with mathematical correctness of the numerical algorithms in a code, while validation deals with physical correctness of a simulation in a regime of interest. This document is about verification. (2) The current seven-problem Tri-Laboratory Verification Test Suite, which has been used for approximately five years at the DOE WP laboratories, is limited. (3) Both the methodology for and technology used in verification analysis have evolved and been improved since the original test suite was proposed. (4) The proposed test problems are in three basic areas: (a) Hydrodynamics; (b) Transport processes; and (c) Dynamic strength-of-materials. (5) For several of the proposed problems we provide a 'strong sense verification benchmark', consisting of (i) a clear mathematical statement of the problem with sufficient information to run a computer simulation, (ii) an explanation of how the code result and benchmark solution are to be evaluated, and (iii) a description of the acceptance criterion for simulation code results. (6) It is proposed that the set of verification test problems with which any particular code is evaluated include some of the problems described in this document. Analysis of the proposed verification test problems constitutes part of a necessary--but not sufficient--step that builds confidence in physics and engineering simulation codes. More complicated test cases, including physics models of greater sophistication or other physics regimes (e.g., energetic material response, magneto-hydrodynamics), would represent a scientifically desirable complement to the fundamental test cases discussed in this report. The authors believe that this document can be used to enhance the verification analyses undertaken at the DOE WP Laboratories and, thus, to improve the quality, credibility, and usefulness of the simulation codes that are analyzed with these problems.

  3. SU-E-T-48: A Multi-Institutional Study of Independent Dose Verification for Conventional, SRS and SBRT

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Takahashi, R; Kamima, T; Tachibana, H

    2015-06-15

    Purpose: To show the results of a multi-institutional study of independent dose verification for conventional, stereotactic radiosurgery, and stereotactic body radiotherapy (SRS and SBRT) plans based on the action level of AAPM TG-114. Methods: This study was performed at 12 institutions in Japan. To eliminate the bias of the independent dose verification program (Indp), all of the institutions used the same CT-based independent dose verification software (Simple MU Analysis, Triangle Products, JP) with a Clarkson-based algorithm. Eclipse (AAA, PBC), Pinnacle³ (Adaptive Convolve) and Xio (Superposition) were used as treatment planning systems (TPS). The confidence limits (CL, mean ± 2SD) for 18 sites (head, breast, lung, pelvis, etc.) were evaluated by comparing dose between the TPS and the Indp. Results: A retrospective analysis of 6352 treatment fields was conducted. The CLs for conventional, SRS and SBRT were 1.0 ± 3.7%, 2.0 ± 2.5% and 6.2 ± 4.4%, respectively. In conventional plans, most of the sites were within the 5% TG-114 action level. However, there were systematic differences (4.0 ± 4.0% and 2.5 ± 5.8% for breast and lung, respectively). In SRS plans, our results showed good agreement with the action level. In SBRT plans, the discrepancy from the Indp varied depending on the dose calculation algorithm of the TPS. Conclusion: The dose calculation algorithms of the TPS and the Indp affect the action level. It is effective to set site-specific tolerances, especially for sites where inhomogeneity corrections strongly affect the dose distribution.
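
    The confidence-limit statistic used here is simply the mean ± 2 SD of the per-field percentage dose differences; a minimal sketch with invented numbers:

```python
# TG-114-style confidence limit (mean ± 2 SD) for per-field percentage dose
# differences between the TPS and the independent check. Numbers are made up.
import numpy as np

diff_pct = np.array([1.2, -0.4, 3.1, 0.8, -1.5, 2.2, 0.3, 1.9])  # (TPS - Indp)/Indp * 100
mean, sd = diff_pct.mean(), diff_pct.std(ddof=1)
print(f"CL = {mean:.1f} % +/- {2 * sd:.1f} %")  # compare against the 5 % action level
```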

  4. Fluorescent nanodiamonds as highly stable biomarker for endotoxin verification

    NASA Astrophysics Data System (ADS)

    Bergmann, Thorsten; Burg, Jan Michael; Lilholt, Maria; Maeder, Ulf; Beer, Sebastian; Salzig, Denise; Ebrahimi, Mehrdad; Czermak, Peter; Fiebich, Martin

    2012-03-01

    Fluorescent nanodiamonds (ND) provide advantageous properties as a fluorescent biomarker for in vitro and in vivo studies. Their maximum fluorescence occurs around 700 nm, they do not show photobleaching or blinking, and they seem to be nontoxic. After pretreatment with strong acid, fluorescent ND can be functionalized and coupled to endotoxin. Endotoxin is a decay product of bacteria and causes strong immune reactions. Therefore endotoxin has to be removed for most applications. An effective removal procedure is membrane filtration. The endotoxin coupled to fluorescent ND can be visualized using confocal microscopy, which allows the investigation of the separation mechanisms of the filtration process within the membranes.

  5. Self-verification motives at the collective level of self-definition.

    PubMed

    Chen, Serena; Chen, Karen Y; Shaw, Lindsay

    2004-01-01

    Three studies examined self-verification motives in relation to collective aspects of the self. Several moderators of collective self-verification were also examined--namely, the certainty with which collective self-views are held, the nature of one's ties to a source of self-verification, the salience of the collective self, and the importance of group identification. Evidence for collective self-verification emerged across all studies, particularly when collective self-views were held with high certainty (Studies 1 and 2), perceivers were somehow tied to the source of self-verification (Study 1), the collective self was salient (Study 2), and group identification was important (Study 3). To the authors' knowledge, these studies are the first to examine self-verification at the collective level of self-definition. The parallel and distinct ways in which self-verification processes may operate at different levels of self-definition are discussed.

  6. Proceedings of the Second NASA Formal Methods Symposium

    NASA Technical Reports Server (NTRS)

    Munoz, Cesar (Editor)

    2010-01-01

    This publication contains the proceedings of the Second NASA Formal Methods Symposium sponsored by the National Aeronautics and Space Administration and held in Washington D.C. April 13-15, 2010. Topics covered include: Decision Engines for Software Analysis using Satisfiability Modulo Theories Solvers; Verification and Validation of Flight-Critical Systems; Formal Methods at Intel -- An Overview; Automatic Review of Abstract State Machines by Meta Property Verification; Hardware-independent Proofs of Numerical Programs; Slice-based Formal Specification Measures -- Mapping Coupling and Cohesion Measures to Formal Z; How Formal Methods Impels Discovery: A Short History of an Air Traffic Management Project; A Machine-Checked Proof of A State-Space Construction Algorithm; Automated Assume-Guarantee Reasoning for Omega-Regular Systems and Specifications; Modeling Regular Replacement for String Constraint Solving; Using Integer Clocks to Verify the Timing-Sync Sensor Network Protocol; Can Regulatory Bodies Expect Efficient Help from Formal Methods?; Synthesis of Greedy Algorithms Using Dominance Relations; A New Method for Incremental Testing of Finite State Machines; Verification of Faulty Message Passing Systems with Continuous State Space in PVS; Phase Two Feasibility Study for Software Safety Requirements Analysis Using Model Checking; A Prototype Embedding of Bluespec System Verilog in the PVS Theorem Prover; SimCheck: An Expressive Type System for Simulink; Coverage Metrics for Requirements-Based Testing: Evaluation of Effectiveness; Software Model Checking of ARINC-653 Flight Code with MCP; Evaluation of a Guideline by Formal Modelling of Cruise Control System in Event-B; Formal Verification of Large Software Systems; Symbolic Computation of Strongly Connected Components Using Saturation; Towards the Formal Verification of a Distributed Real-Time Automotive System; Slicing AADL Specifications for Model Checking; Model Checking with Edge-valued Decision Diagrams; and Data-flow based Model Analysis.

  7. Coherent Beam-Beam Instability in Collisions with a Large Crossing Angle

    NASA Astrophysics Data System (ADS)

    Ohmi, K.; Kuroo, N.; Oide, K.; Zhou, D.; Zimmermann, F.

    2017-09-01

    In recent years the "crab-waist collision" scheme [P. Raimondi, Proceedings of 2nd SuperB Workshop, Frascati, 2006.; M. Zobov et al., Phys. Rev. Lett. 104, 174801 (2010), 10.1103/PhysRevLett.104.174801] has become popular for circular e+ e- colliders. The designs of several future colliders are based on this scheme. So far the beam-beam effects for collisions under a large crossing angle with or without crab waist were mostly studied using weak-strong simulations. We present here strong-strong simulations showing a novel strong coherent head-tail instability, which can limit the performance of proposed future colliders. We explain the underlying instability mechanism starting from the "cross-wake force" induced by the beam-beam interaction. Using this beam-beam wake, the beam-beam head tail modes are studied by an eigenmode analysis. The instability may affect all collider designs based on the crab-waist scheme. We suggest an experimental verification at SuperKEKB during its commissioning phase II.

  8. A New Integrated Threshold Selection Methodology for Spatial Forecast Verification of Extreme Events

    NASA Astrophysics Data System (ADS)

    Kholodovsky, V.

    2017-12-01

    Extreme weather and climate events such as heavy precipitation, heat waves and strong winds can cause extensive damage to society in terms of human lives and financial losses. As the climate changes, it is important to understand how extreme weather events may change as a result. Climate and statistical models are often independently used to model those phenomena. To better assess the performance of climate models, a variety of spatial forecast verification methods have been developed. However, the spatial verification metrics widely used for comparing mean states in most cases lack an adequate theoretical justification for benchmarking extreme weather events. We propose a new integrated threshold selection methodology for spatial forecast verification of extreme events that couples existing pattern recognition indices with high threshold choices. This integrated approach has three main steps: 1) dimension reduction; 2) geometric domain mapping; and 3) threshold clustering. We apply this approach to an observed precipitation dataset over CONUS. The results are evaluated by displaying the threshold distribution seasonally, monthly and annually. The method offers the user the flexibility of selecting a high threshold that is linked to desired geometrical properties. The proposed high-threshold methodology could either complement existing spatial verification methods, where threshold selection is arbitrary, or be directly applicable in extreme value theory.
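
    The extreme-value ingredient referred to here is, in its standard form, a peaks-over-threshold analysis: exceedances above a high threshold are modeled with a Generalized Pareto distribution. A minimal sketch of that standard step (synthetic data; not the authors' integrated pipeline):

```python
# Standard peaks-over-threshold step: pick a high quantile as threshold and
# fit a Generalized Pareto distribution to the exceedances. Data are synthetic.
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
precip = rng.gamma(shape=2.0, scale=5.0, size=10_000)  # toy daily precipitation

u = np.quantile(precip, 0.95)                 # candidate high threshold
exceedances = precip[precip > u] - u
shape, _, scale = stats.genpareto.fit(exceedances, floc=0.0)
print(f"threshold={u:.1f}, GPD shape={shape:.2f}, scale={scale:.2f}")
```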

  9. Probabilistic Material Strength Degradation Model for Inconel 718 Components Subjected to High Temperature, High-Cycle and Low-Cycle Mechanical Fatigue, Creep and Thermal Fatigue Effects

    NASA Technical Reports Server (NTRS)

    Bast, Callie C.; Boyce, Lola

    1995-01-01

    The development of a methodology for probabilistic material strength degradation is described. The probabilistic model, in the form of a postulated randomized multifactor equation, provides for quantification of uncertainty in the lifetime material strength of aerospace propulsion system components subjected to a number of diverse random effects. This model is embodied in the computer program entitled PROMISS, which can include up to eighteen different effects. Presently, the model includes five effects that typically reduce lifetime strength: high temperature, high-cycle mechanical fatigue, low-cycle mechanical fatigue, creep and thermal fatigue. Results, in the form of cumulative distribution functions, illustrate the sensitivity of lifetime strength to the current value of an effect. In addition, verification studies comparing predictions of high-cycle mechanical fatigue and high temperature effects with experiments are presented. Results from this limited verification study strongly support the representation of material degradation by randomized multifactor interaction models.
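
    The "randomized multifactor equation" family used in PROMISS-type models is multiplicative; schematically (our generic rendering, with symbols defined below rather than taken verbatim from the report):

```latex
\frac{S}{S_0} \;=\; \prod_{i=1}^{n} \left( \frac{A_{i,U} - A_i}{A_{i,U} - A_{i,0}} \right)^{q_i}
```

    where S is the current lifetime strength, S_0 the reference strength, A_i the current value of effect i (temperature, fatigue cycles, creep time, ...), A_{i,U} its ultimate value, A_{i,0} its reference value, and q_i an empirical exponent; treating these quantities as random variables yields the cumulative distribution functions the abstract reports.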

  10. Adapted RF pulse design for SAR reduction in parallel excitation with experimental verification at 9.4 T.

    PubMed

    Wu, Xiaoping; Akgün, Can; Vaughan, J Thomas; Andersen, Peter; Strupp, John; Uğurbil, Kâmil; Van de Moortele, Pierre-François

    2010-07-01

    Parallel excitation holds strong promise to mitigate the impact of large transmit B1 (B1+) distortion at very high magnetic field. Accelerated RF pulses, however, inherently tend to require larger values of RF peak power, which may result in a substantial increase in Specific Absorption Rate (SAR) in tissues, a constant concern for patient safety at very high field. In this study, we demonstrate adapted-rate RF pulse design allowing for SAR reduction while preserving excitation target accuracy. Compared with other proposed implementations of adapted-rate RF pulses, our approach is compatible with any k-space trajectory, does not require an analytical expression of the gradient waveform, and can be used for large flip angle excitation. We demonstrate our method with numerical simulations based on electromagnetic modeling, and we include an experimental verification of transmit pattern accuracy on an 8-transmit-channel 9.4 T system.

  11. LISA verification binaries with updated distances from Gaia Data Release 2

    NASA Astrophysics Data System (ADS)

    Kupfer, T.; Korol, V.; Shah, S.; Nelemans, G.; Marsh, T. R.; Ramsay, G.; Groot, P. J.; Steeghs, D. T. H.; Rossi, E. M.

    2018-06-01

    Ultracompact binaries with orbital periods less than a few hours will dominate the gravitational wave signal in the mHz regime. Until recently, 10 systems were expected to have a predicted gravitational wave signal strong enough to be detectable by the Laser Interferometer Space Antenna (LISA), the so-called 'verification binaries'. System parameters, including distances, are needed to provide an accurate prediction of the expected gravitational wave strength to be measured by LISA. Using parallaxes from Gaia Data Release 2 we calculate signal-to-noise ratios (SNR) for ≈50 verification binary candidates. We find that 11 binaries reach an SNR ≥ 20, two further binaries reach an SNR ≥ 5, and three more systems are expected to have an SNR ≈ 5 after four years of integration with LISA. For these 16 systems we present predictions of the gravitational wave amplitude (A) and, from the Fisher information matrix, parameter uncertainties on the amplitude and the inclination (ι).
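
    For context, the amplitude A quoted for such monochromatic binaries is commonly defined in the LISA literature from the chirp mass, gravitational wave frequency and distance as A = 2(GMc)^(5/3)(πf)^(2/3)/(c⁴d); a quick sketch with a made-up system (not one of the paper's candidates):

```python
# Dimensionless GW amplitude of a monochromatic binary, using the standard
# definition A = 2 (G Mc)^(5/3) (pi f)^(2/3) / (c^4 d). Example values invented.
import math

G, C = 6.674e-11, 2.998e8          # SI units
MSUN, KPC = 1.989e30, 3.086e19     # kg, m

def gw_amplitude(chirp_mass_msun, f_gw_hz, distance_kpc):
    mc, d = chirp_mass_msun * MSUN, distance_kpc * KPC
    return 2.0 * (G * mc) ** (5 / 3) * (math.pi * f_gw_hz) ** (2 / 3) / (C ** 4 * d)

print(f"{gw_amplitude(0.3, 3e-3, 1.0):.2e}")  # ~1.7e-22 for this toy system
```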

  12. Strong Cosmic Censorship

    NASA Astrophysics Data System (ADS)

    Isenberg, James

    2017-01-01

    The Hawking-Penrose theorems tell us that solutions of Einstein's equations are generally singular, in the sense of the incompleteness of causal geodesics (the paths of physical observers). These singularities might be marked by the blowup of curvature and therefore crushing tidal forces, or by the breakdown of physical determinism. Penrose has conjectured (in his "Strong Cosmic Censorship Conjecture") that it is generically unbounded curvature that causes singularities, rather than causal breakdown. The verification that "AVTD behavior" (marked by the domination of time derivatives over space derivatives) is generically present in a family of solutions has proven to be a useful tool for studying model versions of Strong Cosmic Censorship in that family. I discuss some of the history of Strong Cosmic Censorship, then discuss what is known about AVTD behavior and Strong Cosmic Censorship in families of solutions defined by varying degrees of isometry, and discuss recent results which we believe will extend this knowledge and provide new support for Strong Cosmic Censorship. I also comment on some of the recent work on "Weak Null Singularities", and how this relates to Strong Cosmic Censorship.

  13. Simulation verification techniques study

    NASA Technical Reports Server (NTRS)

    Schoonmaker, P. B.; Wenglinski, T. H.

    1975-01-01

    Results are summarized of the simulation verification techniques study, which consisted of two tasks: to develop techniques for simulator hardware checkout and to develop techniques for simulation performance verification (validation). The hardware verification task involved definition of simulation hardware (hardware units and integrated simulator configurations), a survey of current hardware self-test techniques, and definition of hardware and software techniques for checkout of simulator subsystems. The performance verification task included definition of simulation performance parameters (and critical performance parameters), definition of methods for establishing standards of performance (sources of reference data or validation), and definition of methods for validating performance. Both major tasks included definition of verification software and assessment of verification data base impact. An annotated bibliography of all documents generated during this study is provided.

  14. Study of techniques for redundancy verification without disrupting systems, phases 1-3

    NASA Technical Reports Server (NTRS)

    1970-01-01

    The problem of verifying the operational integrity of redundant equipment and the impact of a requirement for verification on such equipment are considered. Redundant circuits are examined and the characteristics which determine adaptability to verification are identified. Mutually exclusive and exhaustive categories for verification approaches are established. The range of applicability of these techniques is defined in terms of signal characteristics and redundancy features. Verification approaches are discussed and a methodology for the design of redundancy verification is developed. A case study is presented which involves the design of a verification system for a hypothetical communications system. Design criteria for redundant equipment are presented. Recommendations for the development of technological areas pertinent to the goal of increased verification capabilities are given.

  15. Assessment of Galileo modal test results for mathematical model verification

    NASA Technical Reports Server (NTRS)

    Trubert, M.

    1984-01-01

    The modal test program for the Galileo Spacecraft was completed at the Jet Propulsion Laboratory in the summer of 1983. The multiple sine dwell method was used for the baseline test. The Galileo Spacecraft is a rather complex 2433 kg structure made of a central core on which seven major appendages representing 30 percent of the total mass are attached, resulting in a high modal density structure. The test revealed a strong nonlinearity in several major modes. This nonlinearity discovered in the course of the test necessitated running additional tests at the unusually high response levels of up to about 21 g. The high levels of response were required to obtain a model verification valid at the level of loads for which the spacecraft was designed. Because of the high modal density and the nonlinearity, correlation between the dynamic mathematical model and the test results becomes a difficult task. Significant changes in the pre-test analytical model are necessary to establish confidence in the upgraded analytical model used for the final load verification. This verification, using a test verified model, is required by NASA to fly the Galileo Spacecraft on the Shuttle/Centaur launch vehicle in 1986.

  16. Seismic rehabilitation of skewed and curved bridges using a new generation of buckling restrained braces.

    DOT National Transportation Integrated Search

    2016-12-01

    The objective of this project is to find effective configurations for using buckling restrained braces (BRBs) in both skewed and curved bridges for reducing the effects of strong earthquakes. Verification is performed by numerical simulation using an...

  17. Verification of Argentine ant defensive compounds and their behavioral effects on heterospecific competitors and conspecific nestmates

    USDA-ARS?s Scientific Manuscript database

    The invasive Argentine ant (Linepithema humile) has become established worldwide in regions with Mediterranean or subtropical climates. The species typically disrupts the balance of natural ecosystems by competitively displacing some native ant species via strong exploitation and interference compet...

  18. Verification of Gyrokinetic codes: Theoretical background and applications

    NASA Astrophysics Data System (ADS)

    Tronko, Natalia; Bottino, Alberto; Görler, Tobias; Sonnendrücker, Eric; Told, Daniel; Villard, Laurent

    2017-05-01

    In fusion plasmas, the strong magnetic field allows the fast gyro-motion to be systematically removed from the description of the dynamics, resulting in a considerable model simplification and gain of computational time. Nowadays, the gyrokinetic (GK) codes play a major role in the understanding of the development and the saturation of turbulence and in the prediction of the subsequent transport. Naturally, these codes require thorough verification and validation. Here, we present a new and generic theoretical framework and specific numerical applications to test the faithfulness of the implemented models to theory and to verify the domain of applicability of existing GK codes. For a sound verification process, the underlying theoretical GK model and the numerical scheme must be considered at the same time, which has rarely been done and therefore makes this approach pioneering. At the analytical level, the main novelty consists in using advanced mathematical tools such as variational formulation of dynamics for systematization of basic GK code's equations to access the limits of their applicability. The verification of the numerical scheme is proposed via the benchmark effort. In this work, specific examples of code verification are presented for two GK codes: the multi-species electromagnetic ORB5 (PIC) and the radially global version of GENE (Eulerian). The proposed methodology can be applied to any existing GK code. We establish a hierarchy of reduced GK Vlasov-Maxwell equations implemented in the ORB5 and GENE codes using the Lagrangian variational formulation. At the computational level, detailed verifications of global electromagnetic test cases developed from the CYCLONE Base Case are considered, including a parametric β-scan covering the transition from ITG to KBM and the spectral properties at the nominal β value.

  19. Beyond the Caster Semenya controversy: the case of the use of genetics for gender testing in sport.

    PubMed

    Wonkam, Ambroise; Fieggen, Karen; Ramesar, Raj

    2010-12-01

    Caster Semenya won the eight-hundred-meter title at the Berlin World Athletics Championships in 2009. A few hours later, Caster was at the center of a harsh contestation of her gender. The International Association of Athletics Federations started an investigation, which was not respectful of her privacy. Caster's case highlights the need for improved awareness of genetic counseling principles amongst professionals, the public and various stakeholders. We critically examine the historical steps of gender verification in the Olympics and the violation of genetic counseling principles in Caster's case, and outline some reflections on the complexity of the genetics of disorders of sex development (DSD). Variability in both genotypes and phenotypes in DSD may not allow any etiological or functional classification at this point in time that could permit uncontroversial gender verification for fairer sport participation. We strongly suggest revisiting the pertinence of gender verification, and the process whereby this is done.

  20. Energetic arousal and language: predictions from the computational theory of quantifiers processing.

    PubMed

    Zajenkowski, Marcin

    2013-10-01

    The author examines the relationship between energetic arousal (EA) and the processing of sentences containing natural-language quantifiers. Previous studies and theories have shown that energy may differentially affect various cognitive functions. Recent investigations devoted to quantifiers strongly support the theory that various types of quantifiers involve different cognitive functions in the sentence-picture verification task. In the present study, 201 students were presented with a sentence-picture verification task consisting of simple propositions containing a quantifier that referred to the color of a car on display. Color pictures of cars accompanied the propositions. In addition, the level of participants' EA was measured before and after the verification task. It was found that EA and performance on proportional quantifiers (e.g., "More than half of the cars are red") are in an inverted U-shaped relationship. This result may be explained by the fact that proportional sentences engage working memory to a high degree, and previous models of EA-cognition associations have been based on the assumption that tasks that require parallel attentional and memory processes are best performed when energy is moderate. The research described in the present article has several applications, as it shows the optimal human conditions for verbal comprehension. For instance, it may be important in workplace design to control the level of arousal experienced by office staff when work is mostly related to the processing of complex texts. Energy level may be influenced by many factors, such as noise, time of day, or thermal conditions.

  1. Multipartite entanglement verification resistant against dishonest parties.

    PubMed

    Pappa, Anna; Chailloux, André; Wehner, Stephanie; Diamanti, Eleni; Kerenidis, Iordanis

    2012-06-29

    Future quantum information networks will consist of quantum and classical agents, who have the ability to communicate in a variety of ways with trusted and untrusted parties and securely delegate computational tasks to untrusted large-scale quantum computing servers. Multipartite quantum entanglement is a fundamental resource for such a network and, hence, it is imperative to study the possibility of verifying a multipartite entanglement source in a way that is efficient and provides strong guarantees even in the presence of multiple dishonest parties. In this Letter, we show how an agent of a quantum network can perform a distributed verification of a source creating multipartite Greenberger-Horne-Zeilinger (GHZ) states with minimal resources, which is, nevertheless, resistant against any number of dishonest parties. Moreover, we provide a tight tradeoff between the level of security and the distance between the state produced by the source and the ideal GHZ state. Last, by adding the resource of a trusted common random source, we can further provide security guarantees for all honest parties in the quantum network simultaneously.
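
    For reference, the state being verified is the n-party GHZ state,

```latex
|\mathrm{GHZ}_n\rangle \;=\; \tfrac{1}{\sqrt{2}}\left(|0\rangle^{\otimes n} + |1\rangle^{\otimes n}\right),
```

    and the protocol's guarantee, as the abstract states, bounds the distance between the source's actual state and this ideal state as a function of the achieved level of security.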

  2. Multi-centre audit of VMAT planning and pre-treatment verification.

    PubMed

    Jurado-Bruggeman, Diego; Hernández, Victor; Sáez, Jordi; Navarro, David; Pino, Francisco; Martínez, Tatiana; Alayrach, Maria-Elena; Ailleres, Norbert; Melero, Alejandro; Jornet, Núria

    2017-08-01

    We performed a multi-centre intercomparison of VMAT dose planning and pre-treatment verification. The aims were to analyse the dose plans in terms of dosimetric quality and deliverability, and to validate whether in-house pre-treatment verification results agreed with those of an external audit. The nine participating centres encompassed different machines, equipment, and methodologies. Two mock cases (prostate and head and neck) were planned using one and two arcs. A plan quality index was defined to compare the plans, and different complexity indices were calculated to check their deliverability. We compared gamma index pass rates obtained with each centre's equipment and methodology to those of an external audit (global 3D gamma, absolute dose differences, 10% of maximum dose threshold). Log-file analysis was performed to look for delivery errors. All centres fulfilled the dosimetric goals, but plan quality and delivery complexity were heterogeneous and uncorrelated, depending on the manufacturer and the planner's methodology. Pre-treatment verification results were within tolerance in all cases for the gamma 3%-3mm evaluation. Nevertheless, differences between the external audit and in-house measurements arose from differences in equipment or methodology, especially for the 2%-2mm criteria, with differences of up to 20%. No correlation was found between complexity indices and verification results amongst centres. All plans fulfilled dosimetric constraints, but plan quality and complexity did not correlate and were strongly dependent on the planner and the vendor. In-house measurements cannot completely replace external audits for credentialing.
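
    For readers unfamiliar with the gamma criterion quoted here, the sketch below shows the standard global gamma computation reduced to one dimension (a Low-style combined dose-difference / distance-to-agreement score); the profiles and numbers are illustrative, not the audit's data:

```python
# 1D sketch of the global gamma index: for each reference point, find the
# minimum combined dose-difference / distance-to-agreement score over the
# evaluated distribution, then report the fraction of points with gamma <= 1.
import numpy as np

def gamma_pass_rate(x, d_ref, d_eval, dose_crit=0.03, dist_crit_mm=3.0, thresh=0.10):
    d_max = d_ref.max()
    keep = d_ref >= thresh * d_max                # 10% of maximum dose threshold
    gammas = []
    for xi, di in zip(x[keep], d_ref[keep]):
        dose_term = (d_eval - di) / (dose_crit * d_max)  # global (% of max) dose diff
        dist_term = (x - xi) / dist_crit_mm              # distance to agreement
        gammas.append(np.sqrt(dose_term**2 + dist_term**2).min())
    return (np.array(gammas) <= 1.0).mean()

x = np.linspace(0.0, 100.0, 501)                 # position in mm
ref = np.exp(-((x - 50.0) / 18.0) ** 2)          # toy reference dose profile
meas = 1.01 * np.exp(-((x - 50.8) / 18.0) ** 2)  # slightly shifted, rescaled measurement
print(f"gamma 3%/3mm pass rate: {gamma_pass_rate(x, ref, meas):.1%}")
```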

  3. Verification of performance specifications for a US Food and Drug Administration-approved molecular microbiology test: Clostridium difficile cytotoxin B using the Becton, Dickinson and Company GeneOhm Cdiff assay.

    PubMed

    Schlaberg, Robert; Mitchell, Michael J; Taggart, Edward W; She, Rosemary C

    2012-01-01

    US Food and Drug Administration (FDA)-approved diagnostic tests based on molecular genetic technologies are becoming available for an increasing number of microbial pathogens. Advances in technology and lower costs have moved molecular diagnostic tests formerly performed for research purposes only into much wider use in clinical microbiology laboratories. To provide an example of laboratory studies performed to verify the performance of an FDA-approved assay for the detection of Clostridium difficile cytotoxin B compared with the manufacturer's performance standards. We describe the process and protocols used by a laboratory for verification of an FDA-approved assay, assess data from the verification studies, and implement the assay after verification. Performance data from the verification studies conducted by the laboratory were consistent with the manufacturer's performance standards and the assay was implemented into the laboratory's test menu. Verification studies are required for FDA-approved diagnostic assays prior to use in patient care. Laboratories should develop a standardized approach to verification studies that can be adapted and applied to different types of assays. We describe the verification of an FDA-approved real-time polymerase chain reaction assay for the detection of a toxin gene in a bacterial pathogen.

  4. A family of metric gravities

    NASA Astrophysics Data System (ADS)

    Shuler, Robert

    2018-04-01

    The goal of this paper is to take a completely fresh approach to metric gravity, in which the metric principle is strictly adhered to but its properties in local space-time are derived from conservation principles, not inferred from a global field equation. The global field strength variation then gains some flexibility, but only in the regime of very strong fields (2nd-order terms) whose measurement is now being contemplated. Doing so provides a family of similar gravities, differing only in strong fields, which could be developed into meaningful verification targets for strong fields after the manner in which far-field variations were used in the 20th century. General Relativity (GR) is shown to be a member of the family, and this is demonstrated by deriving the Schwarzschild metric exactly from a suitable field strength assumption. The method of doing so is interesting in itself because it involves only one differential equation rather than the usual four. Exact static symmetric field solutions are also given for one pedagogical alternative based on potential, and one theoretical alternative based on inertia, and the prospects of experimentally differentiating these are analyzed. Whether the method overturns the conventional wisdom that GR is the only metric theory of gravity and that alternatives must introduce additional interactions and fields is somewhat semantic, depending on whether one views the field strength assumption as a field and whether the assumption that produces GR is considered unique in some way. It is of course possible to have other fields, and the local space-time principle can be applied to field gravities which usually are weak-field approximations having only time dilation, giving them the spatial factor and promoting them to full metric theories. Though usually pedagogical, some of them are interesting from a quantum gravity perspective. Cases are noted where mass measurement errors, or distributions of dark matter, can cause one theory to mimic another, implying that such estimates or distributions should first be obtained from weak-field measurements before being used to discriminate verification candidates. By this method theorists gain insight into the local constraints on space-time, and GR verification gains strong-field comparative objectives.

  5. Working Memory Mechanism in Proportional Quantifier Verification

    ERIC Educational Resources Information Center

    Zajenkowski, Marcin; Szymanik, Jakub; Garraffa, Maria

    2014-01-01

    The paper explores the cognitive mechanisms involved in the verification of sentences with proportional quantifiers (e.g. "More than half of the dots are blue"). The first study shows that the verification of proportional sentences is more demanding than the verification of sentences such as: "There are seven blue and eight yellow…

  6. Adjusting for partial verification or workup bias in meta-analyses of diagnostic accuracy studies.

    PubMed

    de Groot, Joris A H; Dendukuri, Nandini; Janssen, Kristel J M; Reitsma, Johannes B; Brophy, James; Joseph, Lawrence; Bossuyt, Patrick M M; Moons, Karel G M

    2012-04-15

    A key requirement in the design of diagnostic accuracy studies is that all study participants receive both the test under evaluation and the reference standard test. For a variety of practical and ethical reasons, sometimes only a proportion of patients receive the reference standard, which can bias the accuracy estimates. Numerous methods have been described for correcting this partial verification bias or workup bias in individual studies. In this article, the authors describe a Bayesian method for obtaining adjusted results from a diagnostic meta-analysis when partial verification or workup bias is present in a subset of the primary studies. The method corrects for verification bias without having to exclude primary studies with verification bias, thus preserving the main advantages of a meta-analysis: increased precision and better generalizability. The results of this method are compared with the existing methods for dealing with verification bias in diagnostic meta-analyses. For illustration, the authors use empirical data from a systematic review of studies of the accuracy of the immunohistochemistry test for diagnosis of human epidermal growth factor receptor 2 status in breast cancer patients.
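
    For a single primary study, the classic non-Bayesian correction, under the assumption that verification depends only on the index test result, is that of Begg and Greenes (1983). A minimal sketch with hypothetical counts follows; note this is the simpler single-study correction, not the Bayesian meta-analysis model the authors describe.

      # Begg & Greenes correction for partial verification bias in one study,
      # assuming verification depends only on the index test result (MAR).
      # All counts are hypothetical.
      def begg_greenes(n_pos, n_neg, v_pos_dis, v_pos_nondis, v_neg_dis, v_neg_nondis):
          """n_pos/n_neg: all test-positives/negatives (verified or not);
          v_*: disease status counts among the verified subset only."""
          p_dis_pos = v_pos_dis / (v_pos_dis + v_pos_nondis)   # P(D+ | T+)
          p_dis_neg = v_neg_dis / (v_neg_dis + v_neg_nondis)   # P(D+ | T-)
          # Reconstruct the full 2x2 table under the missing-at-random assumption
          tp, fp = n_pos * p_dis_pos, n_pos * (1 - p_dis_pos)
          fn, tn = n_neg * p_dis_neg, n_neg * (1 - p_dis_neg)
          return tp / (tp + fn), tn / (tn + fp)   # corrected (sens, spec)

      sens, spec = begg_greenes(n_pos=200, n_neg=800,
                                v_pos_dis=90, v_pos_nondis=60,
                                v_neg_dis=5, v_neg_nondis=95)
      print(f"corrected sensitivity={sens:.3f}, specificity={spec:.3f}")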

  7. ARSENIC SPECIATION IN WATER AND DIETARY SAMPLES BY IC-ICP-MS WITH STRUCTURAL VERIFICATION VIA IC-ESI-MS/MS

    EPA Science Inventory

    The two predominate sources of arsenic exposure are water and dietary ingestion. Dietary sources can easily exceed drinking water exposures based on "total" arsenic measurements. This can be deceiving because arsenic's toxicity is strongly dependent on its chemical form and the...

  8. Rudolph A. Marcus and His Theory of Electron Transfer Reactions

    Science.gov Websites

    Marcus began developing his theory of electron transfer reactions in the early 1950s. A strong experimental program on electron transfer at Brookhaven provided the first verification of several of the theory's predictions, most notably experimental evidence for the so-called "inverted region," in which reaction rates decrease as the driving force becomes very large.

  9. The politics of verification and the control of nuclear tests, 1945-1980

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gallagher, N.W.

    1990-01-01

    This dissertation addresses two questions: (1) why has agreement been reached on verification regimes to support some arms control accords but not others; and (2) what determines the extent to which verification arrangements promote stable cooperation. This study develops an alternative framework for analysis by examining the politics of verification at two levels. The logical politics of verification are shaped by the structure of the problem of evaluating cooperation under semi-anarchical conditions. The practical politics of verification are driven by players' attempts to use verification arguments to promote their desired security outcome. The historical material shows that agreements on verification regimes are reached when key domestic and international players desire an arms control accord and believe that workable verification will not have intolerable costs. Clearer understanding of how verification is itself a political problem, and how players manipulate it to promote other goals, is necessary if the politics of verification are to support rather than undermine the development of stable cooperation.

  10. Verification of Gyrokinetic codes: theoretical background and applications

    NASA Astrophysics Data System (ADS)

    Tronko, Natalia

    2016-10-01

    In fusion plasmas the strong magnetic field allows the fast gyro motion to be systematically removed from the description of the dynamics, resulting in a considerable simplification of the model and a gain in computational time. Nowadays, gyrokinetic (GK) codes play a major role in understanding the development and saturation of turbulence and in predicting the consequent transport. We present a new and generic theoretical framework and specific numerical applications to test the validity and the domain of applicability of existing GK codes. For a sound verification process, the underlying theoretical GK model and the numerical scheme must be considered at the same time, which makes this approach pioneering. At the analytical level, the main novelty consists in using advanced mathematical tools, such as the variational formulation of dynamics, to systematize the basic equations of GK codes and assess the limits of their applicability. Indirect verification of the numerical scheme is proposed via a benchmark process. In this work, specific examples of code verification are presented for two GK codes: the multi-species electromagnetic ORB5 (PIC) and the radially global version of GENE (Eulerian). The proposed methodology can be applied to any existing GK code. We establish a hierarchy of reduced GK Vlasov-Maxwell equations using the generic variational formulation. Then, we derive the models implemented in ORB5 and GENE and place them within this hierarchy. At the computational level, detailed verification of global electromagnetic test cases based on the CYCLONE benchmark is considered, including a parametric β-scan covering the transition from ITG to KBM modes and the spectral properties at the nominal β value.

  11. Scope and verification of a Fissile Material (Cutoff) Treaty

    DOE PAGES

    von Hippel, Frank N.

    2014-01-01

    A Fissile Material Cutoff Treaty (FMCT) would ban the production of fissile material – in practice, highly enriched uranium and separated plutonium – for weapons. It has been supported by strong majorities in the United Nations. After it came into force, fissile material could be newly produced only under international – most likely International Atomic Energy Agency – monitoring. Many non-weapon states argue that the treaty should also place under safeguards pre-existing stocks of fissile material in civilian use or declared excess for weapons, so as to make nuclear-weapons reductions irreversible. Our paper discusses the scope of the FMCT, the ability to detect clandestine production, and verification challenges in the nuclear-weapons states.

  12. Influence of the Redundant Verification and the Non-Redundant Verification on the Hydraulic Tomography

    NASA Astrophysics Data System (ADS)

    Wei, T. B.; Chen, Y. L.; Lin, H. R.; Huang, S. Y.; Yeh, T. C. J.; Wen, J. C.

    2016-12-01

    In groundwater studies, many researchers have used hydraulic tomography (HT) with field-site pumping tests to estimate the heterogeneous spatial distribution of aquifer hydraulic properties, and such work has shown that most field aquifers have heterogeneous parameter fields. Huang et al. [2011] used a non-redundant verification analysis (pumping well locations changed, observation wells fixed) in inverse and forward modeling to demonstrate the feasibility of estimating the heterogeneous hydraulic property distribution of a field-site aquifer with a steady-state model. The literature to date, however, has considered only this steady-state, non-redundant configuration; the various combinations of fixed or changed pumping wells with fixed observation wells (redundant verification) or changed observation wells (non-redundant verification) have not yet been explored for their influence on the HT method. In this study, we carried out both the redundant verification method and the non-redundant verification method in forward modeling to examine their influence on hydraulic tomography under transient conditions. We then applied the above to an actual case at the NYUST campus site to demonstrate the effectiveness of the HT method and to confirm the feasibility of the inverse and forward analyses from the analysis results. Keywords: Hydraulic Tomography, Redundant Verification, Heterogeneous, Inverse, Forward

  13. Plasma protein absolute quantification by nano-LC Q-TOF UDMSE for clinical biomarker verification

    PubMed Central

    ILIES, MARIA; IUGA, CRISTINA ADELA; LOGHIN, FELICIA; DHOPLE, VISHNU MUKUND; HAMMER, ELKE

    2017-01-01

    Background and aims Proteome-based biomarker studies target proteins that could serve as diagnostic, prognostic, and predictive molecules. In the clinical routine, immunoassays are currently used for the absolute quantification of such biomarkers, with the major limitation that only one molecule can be targeted per assay. The aim of our study was to test a mass spectrometry based absolute quantification method for the verification of plasma protein sets which might serve as reliable biomarker panels for clinical practice. Methods Six EDTA plasma samples were analyzed after tryptic digestion using a high-throughput data-independent acquisition nano-LC Q-TOF UDMSE proteomics approach. Synthetic Escherichia coli standard peptides were spiked in each sample for the absolute quantification. Data analysis was performed using ProgenesisQI v2.0 software (Waters Corporation). Results Our method ensured absolute quantification of 242 non-redundant plasma proteins in a single-run analysis. The dynamic range covered spanned five orders of magnitude (10⁵), and 86% of the quantified proteins were classical plasma proteins. The overall median coefficient of variation was 0.36, and a set of 63 proteins was found to be highly stable. Absolute protein concentrations strongly correlated with values reported in the literature. Conclusions Nano-LC Q-TOF UDMSE proteomic analysis can be used for a simple and rapid determination of absolute amounts of plasma proteins. A large number of plasma proteins could be analyzed, while a wide dynamic range was covered with a low coefficient of variation at the protein level. The method proved to be a reliable tool for the quantification of protein panels for biomarker verification in clinical practice. PMID:29151793
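
    One common route to label-free absolute quantification with spiked standards is the Top3 ("Hi3") approach, in which the average signal of a protein's three most intense peptides is taken as proportional to molar amount. The sketch below illustrates that idea under hypothetical intensities; it is an assumption-laden simplification, not the study's actual ProgenesisQI pipeline.

      # Minimal Top3 ("Hi3") absolute-quantification sketch: the mean MS signal
      # of a protein's three most intense peptides is assumed proportional to
      # molar amount, calibrated against a spiked standard of known amount.
      # Intensities and amounts below are hypothetical.
      def top3(intensities):
          return sum(sorted(intensities, reverse=True)[:3]) / 3.0

      std_amount_fmol = 50.0                      # spiked standard, known amount
      std_top3 = top3([8.1e6, 7.4e6, 6.9e6, 1.2e6])
      response = std_top3 / std_amount_fmol       # signal per fmol

      for name, peps in {"ALBU": [9.5e8, 8.8e8, 7.2e8, 3.1e8],
                         "APOA1": [2.2e7, 1.9e7, 1.5e7]}.items():
          print(name, f"{top3(peps) / response:.1f} fmol")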

  14. Modality Switching in a Property Verification Task: An ERP Study of What Happens When Candles Flicker after High Heels Click

    PubMed Central

    Collins, Jennifer; Pecher, Diane; Zeelenberg, René; Coulson, Seana

    2011-01-01

    The perceptual modalities associated with property words, such as flicker or click, have previously been demonstrated to affect subsequent property verification judgments (Pecher et al., 2003). Known as the conceptual modality switch effect, this finding supports the claim that brain systems for perception and action help subserve the representation of concepts. The present study addressed the cognitive and neural substrate of this effect by recording event-related potentials (ERPs) as participants performed a property verification task with visual or auditory properties in key trials. We found that for visual property verifications, modality switching was associated with an increased amplitude N400. For auditory verifications, switching led to a larger late positive complex. Observed ERP effects of modality switching suggest property words access perceptual brain systems. Moreover, the timing and pattern of the effects suggest perceptual systems impact the decision-making stage in the verification of auditory properties, and the semantic stage in the verification of visual properties. PMID:21713128

  15. Modality Switching in a Property Verification Task: An ERP Study of What Happens When Candles Flicker after High Heels Click.

    PubMed

    Collins, Jennifer; Pecher, Diane; Zeelenberg, René; Coulson, Seana

    2011-01-01

    The perceptual modalities associated with property words, such as flicker or click, have previously been demonstrated to affect subsequent property verification judgments (Pecher et al., 2003). Known as the conceptual modality switch effect, this finding supports the claim that brain systems for perception and action help subserve the representation of concepts. The present study addressed the cognitive and neural substrate of this effect by recording event-related potentials (ERPs) as participants performed a property verification task with visual or auditory properties in key trials. We found that for visual property verifications, modality switching was associated with an increased amplitude N400. For auditory verifications, switching led to a larger late positive complex. Observed ERP effects of modality switching suggest property words access perceptual brain systems. Moreover, the timing and pattern of the effects suggest perceptual systems impact the decision-making stage in the verification of auditory properties, and the semantic stage in the verification of visual properties.

  16. Functions of social support and self-verification in association with loneliness, depression, and stress.

    PubMed

    Wright, Kevin B; King, Shawn; Rosenberg, Jenny

    2014-01-01

    This study investigated the influence of social support and self-verification on loneliness, depression, and stress among 477 college students. The authors propose and test a theoretical model using structural equation modeling. The results indicated empirical support for the model, with self-verification mediating the relation between social support and health outcomes. The results have implications for social support and self-verification research, which are discussed along with directions for future research and limitations of the study.

  17. New generalized Noh solutions for HEDP hydrocode verification

    NASA Astrophysics Data System (ADS)

    Velikovich, A. L.; Giuliani, J. L.; Zalesak, S. T.; Tangri, V.

    2017-10-01

    The classic Noh solution describing stagnation of a cold ideal gas in a strong accretion shock wave has been the workhorse of compressible hydrocode verification for over three decades. We describe a number of its generalizations available for HEDP code verification. First, for an ideal gas, we have obtained self-similar solutions that describe adiabatic convergence either of a finite-pressure gas into an empty cavity or of a finite-amplitude sound wave into a uniform resting gas surrounding the center or axis of symmetry. At the moment of collapse, such a flow produces a uniform gas whose velocity at each point is constant and directed towards the axis or the center, i.e., an initial condition similar to that of the classic solution but with a finite pressure in the converging gas. After that, a constant-velocity accretion shock propagates into the incident gas, whose pressure and velocity profiles are not flat, in contrast with the classic solution. Second, for an arbitrary equation of state, we demonstrate the existence of self-similar solutions of the Noh problem in cylindrical and spherical geometry. Examples of such solutions with a three-term equation of state that includes cold, thermal ion/lattice, and thermal electron contributions are presented for aluminum and copper. These analytic solutions are compared to our numerical simulation results as an example of their use for code verification. Work supported by the US DOE/NNSA.
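
    For reference, the classic solution being generalized here has a simple closed form. In spherical geometry, for adiabatic index γ and uniform inflow speed u₀, the standard result (quoted from the general literature, not from this abstract) is:

      \[
      D = \frac{(\gamma - 1)\, u_0}{2}, \qquad
      \rho_{\mathrm{post}} = \rho_0 \left( \frac{\gamma + 1}{\gamma - 1} \right)^{3}, \qquad
      \rho(r,t)\Big|_{r > Dt} = \rho_0 \left( 1 + \frac{u_0 t}{r} \right)^{2},
      \]

    so that for γ = 5/3 the shock moves outward at u₀/3 and the post-shock density is 64ρ₀.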

  18. Exploring the Possible Use of Information Barriers for future Biological Weapons Verification Regimes

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Luke, S J

    2011-12-20

    This report describes a path forward for implementing information barriers in a future generic biological arms-control verification regime. Information barriers have become a staple of discussion in the area of arms control verification approaches for nuclear weapons and components. Information barriers when used with a measurement system allow for the determination that an item has sensitive characteristics without releasing any of the sensitive information. Over the last 15 years the United States (with the Russian Federation) has led on the development of information barriers in the area of the verification of nuclear weapons and nuclear components. The work of the US and the Russian Federation has prompted other states (e.g., UK and Norway) to consider the merits of information barriers for possible verification regimes. In the context of a biological weapons control verification regime, the dual-use nature of the biotechnology will require protection of sensitive information while allowing for the verification of treaty commitments. A major question that has arisen is whether - in a biological weapons verification regime - the presence or absence of a weapon pathogen can be determined without revealing any information about possible sensitive or proprietary information contained in the genetic materials being declared under a verification regime. This study indicates that a verification regime could be constructed using a small number of pathogens that spans the range of known biological weapons agents. Since the number of possible pathogens is small it is possible and prudent to treat these pathogens as analogies to attributes in a nuclear verification regime. This study has determined that there may be some information that needs to be protected in a biological weapons control verification regime. To protect this information, the study concludes that the Lawrence Livermore Microbial Detection Array may be a suitable technology for the detection of the genetic information associated with the various pathogens. In addition, it has been determined that a suitable information barrier could be applied to this technology when the verification regime has been defined. Finally, the report posits a path forward for additional development of information barriers in a biological weapons verification regime. This path forward has shown that a new analysis approach coined as Information Loss Analysis might need to be pursued so that a numerical understanding of how information can be lost in specific measurement systems can be achieved.

  19. Space Station automated systems testing/verification and the Galileo Orbiter fault protection design/verification

    NASA Technical Reports Server (NTRS)

    Landano, M. R.; Easter, R. W.

    1984-01-01

    Aspects of Space Station automated systems testing and verification are discussed, taking into account several program requirements. It is found that these requirements raise a number of issues and uncertainties that require study and resolution during the Space Station definition phase. Most, if not all, of the considered uncertainties have implications for the overall testing and verification strategy adopted by the Space Station Program. A description is given of the Galileo Orbiter fault protection design/verification approach. Attention is given to a mission description, an Orbiter description, the design approach and process, the fault protection design verification approach/process, and problems of 'stress' testing.

  20. Study on verifying the angle measurement performance of the rotary-laser system

    NASA Astrophysics Data System (ADS)

    Zhao, Jin; Ren, Yongjie; Lin, Jiarui; Yin, Shibin; Zhu, Jigui

    2018-04-01

    An angle verification method to assess the angle measurement performance of the rotary-laser system was developed. Angle measurement performance has a great impact on measuring accuracy. Although there is some previous research on verifying the angle measuring uncertainty of the rotary-laser system, it still has limitations. High-precision reference angles are used in the method, and an integrated verification platform is set up to evaluate the performance of the system. This paper also probes the error that has the largest influence on the verification system. Some errors of the verification system are avoided via the experimental method, and some are compensated through a computational formula and curve fitting. Experimental results show that the angle measurement performance meets the requirements for coordinate measurement. The verification platform can efficiently evaluate the uncertainty of angle measurement for the rotary-laser system.

  1. Verification bias: an under-recognized source of error in assessing the efficacy of MRI of the menisci.

    PubMed

    Richardson, Michael L; Petscavage, Jonelle M

    2011-11-01

    The sensitivity and specificity of magnetic resonance imaging (MRI) for diagnosis of meniscal tears has been studied extensively, with tears usually verified by surgery. However, surgically unverified cases are often not considered in these studies, leading to verification bias, which can falsely increase the sensitivity and decrease the specificity estimates. Our study suggests that such bias may be very common in the meniscal MRI literature, and illustrates techniques to detect and correct for such bias. PubMed was searched for articles estimating sensitivity and specificity of MRI for meniscal tears. These were assessed for verification bias, deemed potentially present if a study included any patients whose MRI findings were not surgically verified. Retrospective global sensitivity analysis (GSA) was performed when possible. Thirty-nine of the 314 studies retrieved from PubMed specifically dealt with meniscal tears. All 39 included unverified patients, and hence, potential verification bias. Only seven articles included sufficient information to perform GSA. Of these, one showed definite verification bias, two showed no bias, and four others showed bias within certain ranges of disease prevalence. Only 9 of 39 acknowledged the possibility of verification bias. Verification bias is underrecognized and potentially common in published estimates of the sensitivity and specificity of MRI for the diagnosis of meniscal tears. When possible, it should be avoided by proper study design. If unavoidable, it should be acknowledged. Investigators should tabulate unverified as well as verified data. Finally, verification bias should be estimated; if present, corrected estimates of sensitivity and specificity should be used. Our online web-based calculator makes this process relatively easy. Copyright © 2011 AUR. Published by Elsevier Inc. All rights reserved.
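
    A toy calculation (all numbers hypothetical, not data from the review) shows the direction of the distortion the authors describe: when surgical verification is far more likely after a positive MRI, the estimated sensitivity rises and the estimated specificity falls.

      # Toy numeric illustration of verification bias: if surgery (the
      # reference standard) is performed mostly on MRI-positive knees, false
      # negatives are rarely verified and drop out of the estimates.
      # All numbers are hypothetical.
      true_sens, true_spec, prevalence, n = 0.85, 0.90, 0.40, 1000

      tp = true_sens * prevalence * n                 # 340 true positives
      fn = (1 - true_sens) * prevalence * n           # 60 false negatives
      tn = true_spec * (1 - prevalence) * n           # 540 true negatives
      fp = (1 - true_spec) * (1 - prevalence) * n     # 60 false positives

      p_verify_pos, p_verify_neg = 0.95, 0.20  # MRI+ far more likely to reach surgery
      obs_sens = (tp * p_verify_pos) / (tp * p_verify_pos + fn * p_verify_neg)
      obs_spec = (tn * p_verify_neg) / (tn * p_verify_neg + fp * p_verify_pos)
      print(f"true sens={true_sens:.2f} -> observed {obs_sens:.2f}")  # ~0.96
      print(f"true spec={true_spec:.2f} -> observed {obs_spec:.2f}")  # ~0.65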

  2. Swarm Verification

    NASA Technical Reports Server (NTRS)

    Holzmann, Gerard J.; Joshi, Rajeev; Groce, Alex

    2008-01-01

    Reportedly, supercomputer designer Seymour Cray once said that he would sooner use two strong oxen to plow a field than a thousand chickens. Although this is undoubtedly wise when it comes to plowing a field, it is not so clear for other types of tasks. Model checking problems are of the proverbial "search the needle in a haystack" type. Such problems can often be parallelized easily. Alas, none of the usual divide and conquer methods can be used to parallelize the working of a model checker. Given that it has become easier than ever to gain access to large numbers of computers to perform even routine tasks it is becoming more and more attractive to find alternate ways to use these resources to speed up model checking tasks. This paper describes one such method, called swarm verification.
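
    A minimal illustration of the idea, assuming nothing about the authors' actual implementation (which extends the Spin model checker): many small, independently diversified searches run in parallel, and the first counterexample found by any worker wins.

      # Illustrative sketch of the swarm idea (not the actual Spin "swarm"
      # tool): run many independently randomized searches in parallel and stop
      # as soon as any worker finds a violation. The "model" here is a
      # stand-in randomized state-space walk.
      import random
      from concurrent.futures import ProcessPoolExecutor, as_completed

      def worker(seed, steps=200_000):
          rng = random.Random(seed)        # each worker explores differently
          state = 0
          for _ in range(steps):
              state = (state * 3 + rng.randrange(7)) % 1_000_003
              if state == 424_242:         # stand-in for a property violation
                  return seed
          return None                      # this worker found nothing

      if __name__ == "__main__":
          with ProcessPoolExecutor() as pool:
              futures = [pool.submit(worker, seed) for seed in range(32)]
              for fut in as_completed(futures):
                  if fut.result() is not None:
                      print(f"worker seed={fut.result()} found a violation")
                      break  # remaining workers are left to finish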

  3. Verification of EPA's " Preliminary remediation goals for radionuclides" (PRG) electronic calculator

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Stagich, B. H.

    The U.S. Environmental Protection Agency (EPA) requested an external, independent verification study of their “Preliminary Remediation Goals for Radionuclides” (PRG) electronic calculator. The calculator provides information on establishing PRGs for radionuclides at Comprehensive Environmental Response, Compensation, and Liability Act (CERCLA) sites with radioactive contamination (Verification Study Charge, Background). These risk-based PRGs set concentration limits using carcinogenic toxicity values under specific exposure conditions (PRG User’s Guide, Section 1). The purpose of this verification study is to ascertain that the computer code has no inherent numerical problems in obtaining solutions and to ensure that the equations are programmed correctly.
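
    One generic pattern for this kind of independent verification is to re-implement each documented equation and compare the result against the calculator's output within a tolerance. The sketch below uses a simplified, hypothetical dose-to-concentration equation and a made-up calculator output purely for illustration; it is not the EPA calculator's actual equation.

      # Generic external-verification pattern: re-implement the documented
      # equation independently and compare with the tool under test.
      # The equation and all numbers below are hypothetical illustrations,
      # NOT the EPA PRG calculator's actual equation or output.
      def prg_soil(target_risk, slope_factor, intake_rate, exposure_years):
          """Hypothetical form: limit = risk / (SF * intake * duration)."""
          return target_risk / (slope_factor * intake_rate * exposure_years)

      calculator_output = 1.23e-2   # value read from the tool under test (made up)
      independent = prg_soil(1e-6, 5.0e-7, 100.0, 1.626)
      rel_err = abs(independent - calculator_output) / calculator_output
      assert rel_err < 1e-3, f"discrepancy {rel_err:.2%} needs investigation"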

  4. Fingerprint changes and verification failure among patients with hand dermatitis.

    PubMed

    Lee, Chew Kek; Chang, Choong Chor; Johar, Asmah; Puwira, Othman; Roshidah, Baba

    2013-03-01

    To determine the prevalence of fingerprint verification failure and to define and quantify the fingerprint changes associated with fingerprint verification failure. Case-control study. Referral public dermatology center. The study included 100 consecutive patients with clinical hand dermatitis involving the palmar distal phalanx of either thumb and 100 age-, sex-, and ethnicity-matched controls. Patients with an altered thumb print due to other causes and palmar hyperhidrosis were excluded. Fingerprint verification (pass/fail) and hand eczema severity index score. Twenty-seven percent of patients failed fingerprint verification compared with 2% of controls. Fingerprint verification failure was associated with a higher hand eczema severity index score (P < .001). The main fingerprint abnormalities were fingerprint dystrophy (42.0%) and abnormal white lines (79.5%). The number of abnormal white lines was significantly higher among the patients with hand dermatitis compared with controls (P = .001). Among the patients with hand dermatitis, the odds of failing fingerprint verification with fingerprint dystrophy was 4.01. The presence of broad lines and long lines was associated with a greater odds of fingerprint verification failure (odds ratio [OR], 8.04; 95% CI, 3.56-18.17 and OR, 2.37; 95% CI, 1.31-4.27, respectively), while the presence of thin lines was protective of verification failure (OR, 0.45; 95% CI, 0.23-0.89). Fingerprint verification failure is a significant problem among patients with more severe hand dermatitis. It is mainly due to fingerprint dystrophy and abnormal white lines. Malaysian National Medical Research Register Identifier: NMRR-11-30-8226

  5. Collisionless Electrostatic Shock Modeling and Simulation

    DTIC Science & Technology

    2016-10-21

    Under-damped shocks, in which dissipation is weak, retain a persistent wave train of ripples, whereas over-damped shocks, in which dissipation is strong, damp the ripples out. Model verification was performed by comparing simulations against a linearized solution for the evolution of the first ripple wavelength.

  6. Space-borne survey instrument operations: lessons learned and new concepts for the Euclid NISP instrument

    NASA Astrophysics Data System (ADS)

    Valenziano, L.; Gregorio, A.; Butler, R. C.; Amiaux, J.; Bonoli, C.; Bortoletto, F.; Burigana, C.; Corcione, L.; Ealet, A.; Frailis, M.; Jahnke, K.; Ligori, S.; Maiorano, E.; Morgante, G.; Nicastro, L.; Pasian, F.; Riva, M.; Scaramella, R.; Schiavone, F.; Tavagnacco, D.; Toledo-Moreo, R.; Trifoglio, M.; Zacchei, A.; Zerbi, F. M.; Maciaszek, T.

    2012-09-01

    Euclid is a future ESA mission mainly devoted to cosmology. Like WMAP and Planck, it is a survey mission, to be launched in 2019 and injected into an orbit far away from the Earth, for a nominal lifetime of 7 years. Euclid has two instruments on board, the Visible Imager (VIS) and the Near-Infrared Spectro-Photometer (NISP). The NISP instrument includes cryogenic mechanisms, active thermal control, and a high-performance Data Processing Unit, and it requires periodic in-flight calibrations and monitoring of instrument parameters. To fully exploit the capability of the NISP, careful control of systematic effects is required. From previous experiments, we have built the concept of an integrated instrument development and verification approach, in which scientific, instrument, and ground-segment expertise interact strongly from the early phases of the project. In particular, we discuss the close integration of test and calibration activities with the Ground Segment, starting from early pre-launch verification activities. We report here the expertise acquired by the Euclid team in previous missions, citing the literature for detailed reference, and indicate how it is applied in the Euclid mission framework.

  7. Recruitment Strategies of Methamphetamine-Using Men Who Have Sex with Men into an Online Survey

    PubMed Central

    Wilkerson, J. Michael; Shenk, Jared E.; Grey, Jeremy A.; Simon Rosser, B. R.; Noor, Syed W.

    2014-01-01

    Recruiting hidden populations into online research remains challenging. In this manuscript, we report lessons learned from our efforts to recruit methamphetamine-using men who have sex with men. Between July and October 2012, we implemented a four-phase recruitment strategy to enroll a total of 343 methamphetamine-using MSM into an online survey about recent substance use, sexual behavior, and various psychosocial measures. The four phases were implemented sequentially. During phase one, we placed advertisements on mobile applications, and during phase two, we placed advertisements on traditional websites formatted for browsers. During phase three, we used e-mail to initiate snowball recruitment, and during phase four, we used social media for snowball recruitment. Advertisements on mobile devices and websites formatted for browsers proved to be expensive options and resulted in few eligible participants. Our attempts to initiate a snowball through e-mail also proved unsuccessful. The majority (n=320) of observations in our final dataset came from our use of social media. However, participant fraud was a concern, requiring us to implement a strong participant verification protocol. For maximum recruitment and cost-effectiveness, researchers should use social media for recruitment provided they employ strong participant verification protocols. PMID:25642143

  8. Recruitment Strategies of Methamphetamine-Using Men Who Have Sex with Men into an Online Survey.

    PubMed

    Wilkerson, J Michael; Shenk, Jared E; Grey, Jeremy A; Simon Rosser, B R; Noor, Syed W

    Recruiting hidden populations into online research remains challenging. In this manuscript, we report lessons learned from our efforts to recruit methamphetamine-using men who have sex with men. Between July and October 2012, we implemented a four-phase recruitment strategy to enroll a total of 343 methamphetamine-using MSM into an online survey about recent substance use, sexual behavior, and various psychosocial measures. The four phases were implemented sequentially. During phase one, we placed advertisements on mobile applications, and during phase two, we placed advertisements on traditional websites formatted for browsers. During phase three, we used e-mail to initiate snowball recruitment, and during phase four, we used social media for snowball recruitment. Advertisements on mobile devices and websites formatted for browsers proved to be expensive options and resulted in few eligible participants. Our attempts to initiate a snowball through e-mail also proved unsuccessful. The majority (n=320) of observations in our final dataset came from our use of social media. However, participant fraud was a concern, requiring us to implement a strong participant verification protocol. For maximum recruitment and cost-effectiveness, researchers should use social media for recruitment provided they employ strong participant verification protocols.

  9. Being known, intimate, and valued: global self-verification and dyadic adjustment in couples and roommates.

    PubMed

    Katz, Jennifer; Joiner, Thomas E

    2002-02-01

    We contend that close relationships provide adults with optimal opportunities for personal growth when relationship partners provide accurate, honest feedback. Accordingly, it was predicted that young adults would experience greater relationship quality with partners who evaluated them in a manner consistent with their own self-evaluations. Three empirical tests of this self-verification hypothesis as applied to close dyads were conducted. In Study 1, young adults in dating relationships were most intimate with, and somewhat more committed to, partners when they perceived that those partners evaluated them as they evaluated themselves. Self-verification effects were pronounced for those involved in more serious dating relationships. In Study 2, men reported the greatest esteem for same-sex roommates who evaluated them in a self-verifying manner. Results from Study 2 were replicated and extended to both male and female roommate dyads in Study 3. Further, self-verification effects were most pronounced for young adults with high emotional empathy. Results suggest that self-verification theory is useful for understanding dyadic adjustment across a variety of relational contexts in young adulthood. Implications of self-verification processes for adult personal development are outlined within an identity negotiation framework.

  10. Compromises produced by the dialectic between self-verification and self-enhancement.

    PubMed

    Morling, B; Epstein, S

    1997-12-01

    Three studies of people's reactions to evaluative feedback demonstrated that the dialectic between self-enhancement and self-verification results in compromises between these 2 motives, as hypothesized in cognitive-experiential self-theory. The demonstration was facilitated by 2 procedural improvements: Enhancement and verification were established by calibrating evaluative feedback against self appraisals, and degree of enhancement and of verification were varied along a continuum, rather than categorically. There was also support for the hypotheses that processing in an intuitive-experiential mode favors enhancement and processing in an analytical-rational mode favors verification in the kinds of situations investigated.

  11. Verification and Validation Studies for the LAVA CFD Solver

    NASA Technical Reports Server (NTRS)

    Moini-Yekta, Shayan; Barad, Michael F; Sozer, Emre; Brehm, Christoph; Housman, Jeffrey A.; Kiris, Cetin C.

    2013-01-01

    The verification and validation of the Launch Ascent and Vehicle Aerodynamics (LAVA) computational fluid dynamics (CFD) solver is presented. A modern strategy for verification and validation is described incorporating verification tests, validation benchmarks, continuous integration and version control methods for automated testing in a collaborative development environment. The purpose of the approach is to integrate the verification and validation process into the development of the solver and improve productivity. This paper uses the Method of Manufactured Solutions (MMS) for the verification of 2D Euler equations, 3D Navier-Stokes equations as well as turbulence models. A method for systematic refinement of unstructured grids is also presented. Verification using inviscid vortex propagation and flow over a flat plate is highlighted. Simulation results using laminar and turbulent flow past a NACA 0012 airfoil and ONERA M6 wing are validated against experimental and numerical data.
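
    A minimal sketch of the MMS workflow named above, reduced to a 1-D Poisson problem for brevity (illustrative only; not part of the LAVA test suite): choose an exact solution, derive the forcing term analytically, solve numerically, and confirm the discretization's observed order of accuracy under grid refinement.

      # Minimal Method of Manufactured Solutions sketch: pick u(x) = sin(pi x),
      # derive the forcing for -u'' = f analytically, solve with second-order
      # finite differences, and verify ~2nd-order grid convergence.
      import numpy as np

      def solve_poisson(n):
          """Solve -u'' = f on (0,1) with u(0) = u(1) = 0; return max error."""
          x = np.linspace(0.0, 1.0, n + 1)
          h = 1.0 / n
          f = np.pi**2 * np.sin(np.pi * x)      # manufactured forcing term
          A = (np.diag(np.full(n - 1, 2.0))
               - np.diag(np.ones(n - 2), 1)
               - np.diag(np.ones(n - 2), -1)) / h**2
          u = np.zeros(n + 1)
          u[1:-1] = np.linalg.solve(A, f[1:-1])
          return np.max(np.abs(u - np.sin(np.pi * x)))  # error vs exact solution

      e1, e2 = solve_poisson(32), solve_poisson(64)
      print(f"observed order of accuracy: {np.log2(e1 / e2):.2f}")  # ~2.0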

  12. You Can't See the Real Me: Attachment Avoidance, Self-Verification, and Self-Concept Clarity.

    PubMed

    Emery, Lydia F; Gardner, Wendi L; Carswell, Kathleen L; Finkel, Eli J

    2018-03-01

    Attachment shapes people's experiences in their close relationships and their self-views. Although attachment avoidance and anxiety both undermine relationships, past research has primarily emphasized detrimental effects of anxiety on the self-concept. However, as partners can help people maintain stable self-views, avoidant individuals' negative views of others might place them at risk for self-concept confusion. We hypothesized that avoidance would predict lower self-concept clarity and that less self-verification from partners would mediate this association. Attachment avoidance was associated with lower self-concept clarity (Studies 1-5), an effect that was mediated by low self-verification (Studies 2-3). The association between avoidance and self-verification was mediated by less self-disclosure and less trust in partner feedback (Study 4). Longitudinally, avoidance predicted changes in self-verification, which in turn predicted changes in self-concept clarity (Study 5). Thus, avoidant individuals' reluctance to trust or become too close to others may result in hidden costs to the self-concept.

  13. The Challenge for Arms Control Verification in the Post-New START World

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wuest, C R

    Nuclear weapon arms control treaty verification is a key aspect of any agreement between signatories to establish that the terms and conditions spelled out in the treaty are being met. Historically, arms control negotiations have focused more on the rules and protocols for reducing the numbers of warheads and delivery systems - sometimes resorting to complex and arcane procedures for counting forces - in an attempt to address perceived or real imbalances in a nation's strategic posture that could lead to instability. Verification procedures are generally defined in arms control treaties and supporting documents and tend to focus on technical means and measures designed to ensure that a country is following the terms of the treaty and that it is not liable to engage in deception or outright cheating in an attempt to circumvent the spirit and the letter of the agreement. As the Obama Administration implements the articles, terms, and conditions of the recently ratified and entered-into-force New START treaty, there are already efforts within and outside of government to move well below the specified New START levels of 1550 warheads, 700 deployed strategic delivery vehicles, and 800 deployed and nondeployed strategic launchers (Inter-Continental Ballistic Missile (ICBM) silos, Submarine-Launched Ballistic Missile (SLBM) tubes on submarines, and bombers). A number of articles and opinion pieces have appeared that advocate for significantly deeper cuts in the U.S. nuclear stockpile, with some suggesting that unilateral reductions on the part of the U.S. would help coax Russia and others to follow our lead. Papers and studies prepared for the U.S. Department of Defense and at the U.S. Air War College have also been published, suggesting that nuclear forces totaling no more than about 300 warheads would be sufficient to meet U.S. national security and deterrence needs. (Davis 2011, Schaub and Forsyth 2010) Recent articles by James M. Acton and others suggest that maintaining U.S. security and minimizing the chances of nuclear war while deliberately reducing stockpiles to a few hundred weapons is possible but not without risk. While the question of the appropriate level of cuts to U.S. nuclear forces is being actively debated, a key issue continues to be whether verification procedures are strong enough to ensure that both the U.S. and Russia are fulfilling their obligations under the current New START treaty and any future arms reduction treaties. A recent opinion piece by Henry Kissinger and Brent Scowcroft (2012) raised a number of issues with respect to governing a policy to enhance strategic stability, including: in deciding on force levels and lower numbers, verification is crucial. Particularly important is a determination of what level of uncertainty threatens the calculation of stability. At present, that level is well within the capabilities of the existing verification systems. We must be certain that projected levels maintain - and when possible, reinforce - that confidence. The strengths and weaknesses of the New START verification regime should inform and give rise to stronger regimes for future arms control agreements. These future arms control agreements will likely need to include other nuclear weapons states, and so any verification regime will need to be acceptable to all parties.
Currently, China is considered the most challenging party to include in any future arms control agreement, and China's willingness to enter into verification regimes such as those implemented in New START may only be possible when it feels it has reached nuclear parity with the U.S. and Russia. Similarly, in keeping with its goals of reaching peer status with the U.S. and Russia, Frieman (2004) suggests that China would be more willing to accept internationally accepted and applied verification regimes rather than bilateral ones. The current verification protocols specified in the New START treaty are treated as the baseline case and are contrasted with possible alternative verification protocols that could be effective in a post-New START era of significant reductions in U.S. and other countries' nuclear stockpiles. Of particular concern is the possibility of deception and breakout when declared and observed numbers of weapons are below the level considered to pose an existential threat to the U.S. In a regime of very low stockpile numbers, 'traditional' verification protocols as currently embodied in the New START treaty might prove less than adequate. I introduce and discuss a number of issues that need to be considered in future verification protocols, many of which do not have immediate solutions and so require further study. I also discuss alternatives and enhancements to traditional verification protocols, for example, confidence building measures such as burden sharing against the common threat of weapon of mass destruction (WMD) terrorism, and joint research and development.

  14. Supporting Technology for Chain of Custody of Nuclear Weapons and Materials throughout the Dismantlement and Disposition Processes

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bunch, Kyle J.; Jones, Anthony M.; Ramuhalli, Pradeep

    The ratification and ongoing implementation of the New START Treaty have been widely regarded as noteworthy global security achievements for both the Obama Administration and the Putin (formerly Medvedev) regime. But deeper cuts that move beyond the United States and Russia to engage the P-5 and other nuclear weapons possessor states are envisioned under future arms control regimes, and are indeed required for the P-5 in accordance with their Article VI disarmament obligations in the Nuclear Non-Proliferation Treaty. Future verification needs will include monitoring the cessation of production of new fissile material for weapons, monitoring storage of warhead components and fissile materials, and verifying dismantlement of warheads, pits, secondary stages, and other materials. A fundamental challenge to implementing a nuclear disarmament regime is the ability to thwart unauthorized material diversion throughout the dismantlement and disposition process through strong chain of custody implementation. Verifying the declared presence, or absence, of nuclear materials and weapons components throughout the dismantlement and disposition lifecycle is a critical aspect of the disarmament process. From both the diplomatic and technical perspectives, verification under these future arms control regimes will require new solutions. Since any acceptable verification technology must protect sensitive design information and attributes to prevent the release of classified or other proliferation-sensitive information, non-nuclear, non-sensitive modalities may provide significant new verification tools which do not require the use of additional information barriers. Alternative verification technologies based upon electromagnetics and acoustics could potentially play an important role in fulfilling the challenging requirements of future verification regimes. For example, researchers at the Pacific Northwest National Laboratory (PNNL) have demonstrated that low frequency electromagnetic signatures of sealed metallic containers can be used to rapidly confirm the presence of specific components on a yes/no basis without revealing classified information. PNNL researchers have also used ultrasonic measurements to obtain images of material microstructures which may be used as templates or unique identifiers of treaty-limited items. Such alternative technologies are suitable for application in various stages of weapons dismantlement and often include the advantage of an inherent information barrier due to the inability to extract classified weapon design information from the collected data. As a result, these types of technologies complement radiation-based verification methods for arms control. This article presents an overview of several alternative verification technologies that are suitable for supporting a future, broader and more intrusive arms control regime that spans the nuclear weapons disarmament lifecycle. The general capabilities and limitations of each verification modality are discussed and example technologies are presented. Potential applications are defined in the context of the nuclear material and weapons lifecycle. Example applications range from authentication (e.g., tracking and signatures within the chain of custody from downloading through weapons storage, unclassified templates and unique identification) to verification of absence and final material disposition.

  15. Shuttle payload interface verification equipment study. Volume 2: Technical document, part 1

    NASA Technical Reports Server (NTRS)

    1976-01-01

    The technical analysis is reported that was performed during the shuttle payload interface verification equipment study. It describes: (1) the background and intent of the study; (2) study approach and philosophy covering all facets of shuttle payload/cargo integration; (3) shuttle payload integration requirements; (4) preliminary design of the horizontal IVE; (5) vertical IVE concept; and (6) IVE program development plans, schedule and cost. Also included is a payload integration analysis task to identify potential uses in addition to payload interface verification.

  16. A Roadmap for the Implementation of Continued Process Verification.

    PubMed

    Boyer, Marcus; Gampfer, Joerg; Zamamiri, Abdel; Payne, Robin

    2016-01-01

    In 2014, the members of the BioPhorum Operations Group (BPOG) produced a 100-page continued process verification case study, entitled "Continued Process Verification: An Industry Position Paper with Example Protocol". This case study captures the thought processes involved in creating a continued process verification plan for a new product in response to the U.S. Food and Drug Administration's guidance on the subject introduced in 2011. In so doing, it provided the specific example of a plan developed for a new monoclonal antibody product based on the "A MAb Case Study" that preceded it in 2009. This document provides a roadmap that draws on the content of the continued process verification case study to provide a step-by-step guide in a more accessible form, with reference to a process map of the product life cycle. It could be used as a basis for continued process verification implementation in a number of different scenarios: for a single product and process; for a single site; to assist in the sharing of data monitoring responsibilities among sites; or to assist in establishing data monitoring agreements between a customer company and a contract manufacturing organization. The U.S. Food and Drug Administration issued guidance on the management of manufacturing processes designed to improve the quality and control of drug products. This involved an increased focus on regular monitoring of manufacturing processes, reporting of the results, and the taking of opportunities to improve. The guidance and the practice associated with it are known as continued process verification. This paper summarizes good practice in responding to continued process verification guidance, gathered from subject matter experts in the biopharmaceutical industry. © PDA, Inc. 2016.

  17. Verification of operational solar flare forecast: Case of Regional Warning Center Japan

    NASA Astrophysics Data System (ADS)

    Kubo, Yûki; Den, Mitsue; Ishii, Mamoru

    2017-08-01

    In this article, we discuss a verification study of an operational solar flare forecast in the Regional Warning Center (RWC) Japan. The RWC Japan has long issued four-category deterministic solar flare forecasts. In this forecast verification study, we used solar flare forecast data accumulated over 16 years (from 2000 to 2015). We compiled the forecast data together with solar flare data obtained with the Geostationary Operational Environmental Satellites (GOES). Using the compiled data sets, we estimated several conventional scalar verification measures with 95% confidence intervals, as well as a multi-categorical scalar verification measure. These scalar verification measures were compared with those obtained by the persistence method and the recurrence method. As solar activity varied during the 16 years, we also applied verification analyses to four subsets of forecast-observation pair data with different solar activity levels. We cannot conclude definitely that there are significant performance differences between the forecasts of RWC Japan and the persistence method, although a slightly significant difference is found for some event definitions. We propose to use a scalar verification measure to assess the judgment skill of the operational solar flare forecast. Finally, we propose a verification strategy for deterministic operational solar flare forecasting. For dichotomous forecasts, the proposed set of verification measures comprises frequency bias for bias; proportion correct and critical success index for accuracy; probability of detection for discrimination; false alarm ratio for reliability; Peirce skill score for forecast skill; and symmetric extremal dependence index for association. For multi-categorical forecasts, we propose the marginal distributions of forecast and observation for bias; proportion correct for accuracy; correlation coefficient and joint probability distribution for association; the likelihood distribution for discrimination; the calibration distribution for reliability and resolution; and the Gandin-Murphy-Gerrity score and judgment skill score for skill.
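
    The dichotomous measures listed above are all simple functions of a 2x2 contingency table. A sketch with hypothetical counts follows (the symmetric extremal dependence index is omitted for brevity):

      # Dichotomous forecast verification scores from a 2x2 contingency table:
      # hits a, false alarms b, misses c, correct negatives d. Counts are
      # hypothetical, not taken from the study.
      def dichotomous_scores(a, b, c, d):
          n = a + b + c + d
          pod = a / (a + c)                  # probability of detection
          return {
              "frequency bias": (a + b) / (a + c),
              "proportion correct": (a + d) / n,
              "critical success index": a / (a + b + c),
              "probability of detection": pod,
              "false alarm ratio": b / (a + b),
              # Peirce skill score: POD minus probability of false detection
              "Peirce skill score": pod - b / (b + d),
          }

      for name, val in dichotomous_scores(a=42, b=25, c=18, d=915).items():
          print(f"{name}: {val:.3f}")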

  18. The influence of verification jig on framework fit for nonsegmented fixed implant-supported complete denture.

    PubMed

    Ercoli, Carlo; Geminiani, Alessandro; Feng, Changyong; Lee, Heeje

    2012-05-01

    The purpose of this retrospective study was to assess if there was a difference in the likelihood of achieving passive fit when an implant-supported full-arch prosthesis framework is fabricated with or without the aid of a verification jig. This investigation was approved by the University of Rochester Research Subject Review Board (protocol #RSRB00038482). Thirty edentulous patients, 49 to 73 years old (mean 61 years old), rehabilitated with a nonsegmented fixed implant-supported complete denture were included in the study. During the restorative process, final impressions were made using the pickup impression technique and elastomeric impression materials. For 16 patients, a verification jig was made (group J), while for the remaining 14 patients, a verification jig was not used (group NJ) and the framework was fabricated directly on the master cast. During the framework try-in appointment, the fit was assessed by clinical (Sheffield test) and radiographic inspection and recorded as passive or nonpassive. When a verification jig was used (group J, n = 16), all frameworks exhibited clinically passive fit, while when a verification jig was not used (group NJ, n = 14), only two frameworks fit. This difference was statistically significant (p < .001). Within the limitations of this retrospective study, the fabrication of a verification jig ensured clinically passive fit of metal frameworks in nonsegmented fixed implant-supported complete denture. © 2011 Wiley Periodicals, Inc.

  19. Identity Verification, Control, and Aggression in Marriage

    ERIC Educational Resources Information Center

    Stets, Jan E.; Burke, Peter J.

    2005-01-01

    In this research we study the identity verification process and its effects in marriage. Drawing on identity control theory, we hypothesize that a lack of verification in the spouse identity (1) threatens stable self-meanings and interaction patterns between spouses, and (2) challenges a (nonverified) spouse's perception of control over the…

  20. Students' Verification Strategies for Combinatorial Problems

    ERIC Educational Resources Information Center

    Mashiach Eizenberg, Michal; Zaslavsky, Orit

    2004-01-01

    We focus on a major difficulty in solving combinatorial problems, namely, on the verification of a solution. Our study aimed at identifying undergraduate students' tendencies to verify their solutions, and the verification strategies that they employ when solving these problems. In addition, an attempt was made to evaluate the level of efficiency…

  1. QPF verification using different radar-based analyses: a case study

    NASA Astrophysics Data System (ADS)

    Moré, J.; Sairouni, A.; Rigo, T.; Bravo, M.; Mercader, J.

    2009-09-01

    Verification of QPF in NWP models has always been challenging, not only for knowing which scores best quantify a particular skill of a model but also for choosing the most appropriate methodology when comparing forecasts with observations. On the one hand, an objective verification technique can provide conclusions that are not in agreement with those obtained by the "eyeball" method; consequently, QPF can provide valuable information to forecasters in spite of having poor scores. On the other hand, there are difficulties in knowing the "truth," so different results can be achieved depending on the procedures used to obtain the precipitation analysis. The aim of this study is to show the importance of combining different precipitation analyses and verification methodologies to obtain better knowledge of the skills of a forecasting system. In particular, a short-range precipitation forecasting system based on MM5 at 12 km coupled with LAPS is studied in a local convective precipitation event that took place in the NE Iberian Peninsula on 3 October 2008. For this purpose, a variety of verification methods (dichotomous, recalibration, and object-oriented methods) are used to verify this case study. At the same time, different precipitation analyses, obtained by interpolating radar data using different techniques, are used in the verification process.

  2. Self-verification and contextualized self-views.

    PubMed

    Chen, Serena; English, Tammy; Peng, Kaiping

    2006-07-01

    Whereas most self-verification research has focused on people's desire to verify their global self-conceptions, the present studies examined self-verification with regard to contextualized self-views, that is, views of the self in particular situations and relationships. It was hypothesized that individuals whose core self-conceptions include contextualized self-views should seek to verify these self-views. In Study 1, the more individuals defined the self in dialectical terms, the more their judgments were biased in favor of verifying over nonverifying feedback about a negative, situation-specific self-view. In Study 2, consistent with research on gender differences in the importance of relationships to the self-concept, women but not men showed a similar bias toward feedback about a negative, relationship-specific self-view, a pattern not seen for global self-views. Together, the results support the notion that self-verification occurs for core self-conceptions, whatever form(s) they may take. Individual differences in self-verification and the nature of selfhood and authenticity are discussed.

  3. Systematic study of source mask optimization and verification flows

    NASA Astrophysics Data System (ADS)

    Ben, Yu; Latypov, Azat; Chua, Gek Soon; Zou, Yi

    2012-06-01

    Source mask optimization (SMO) has emerged as a powerful resolution enhancement technique (RET) for advanced technology nodes. However, there is a plethora of flows and verification metrics in the field, confounding the end user of the technique, and a systematic study of the different flows and their possible unification is missing. This contribution is intended to reveal the pros and cons of different SMO approaches and verification metrics, to understand their commonalities and differences, and to provide a generic guideline for RET selection via SMO. The paper discusses three types of variation that commonly arise in SMO, namely pattern preparation and selection, the availability of a relevant OPC recipe for a freeform source, and the metrics used in source verification. Several pattern selection algorithms are compared and the advantages of systematic pattern selection algorithms are discussed. In the absence of a full resist model for SMO, an alternative SMO flow without a full resist model is reviewed. A preferred verification flow with the quality metrics of DOF and MEEF is examined.

  4. The Effect of Mystery Shopper Reports on Age Verification for Tobacco Purchases

    PubMed Central

    KREVOR, BRAD S.; PONICKI, WILLIAM R.; GRUBE, JOEL W.; DeJONG, WILLIAM

    2011-01-01

    Mystery shops (MS) involving attempted tobacco purchases by young buyers have been employed to monitor retail stores’ performance in refusing underage sales. Anecdotal evidence suggests that MS visits with immediate feedback to store personnel can improve age verification. This study investigated the impact of monthly and twice-monthly MS reports on age verification. Forty-five Walgreens stores were each visited 20 times by mystery shoppers. The stores were randomly assigned to one of three conditions. Control group stores received no feedback, whereas two treatment groups received feedback communications every visit (twice monthly) or every second visit (monthly) after baseline. Logit regression models tested whether each treatment group improved verification rates relative to the control group. Post-baseline verification rates were higher in both treatment groups than in the control group, but only the stores receiving monthly communications had a significantly greater improvement than control group stores. Verification rates increased significantly during the study period for all three groups, with delayed improvement among control group stores. Communication between managers regarding the MS program may account for the delayed age-verification improvements observed in the control group stores. Encouraging inter-store communication might extend the benefits of MS programs beyond those stores that receive this intervention. PMID:21541874

  5. The effect of mystery shopper reports on age verification for tobacco purchases.

    PubMed

    Krevor, Brad S; Ponicki, William R; Grube, Joel W; DeJong, William

    2011-09-01

    Mystery shops involving attempted tobacco purchases by young buyers have been implemented in order to monitor retail stores' performance in refusing underage sales. Anecdotal evidence suggests that mystery shop visits with immediate feedback to store personnel can improve age verification. This study investigated the effect of monthly and twice-monthly mystery shop reports on age verification. Mystery shoppers visited 45 Walgreens stores 20 times. The stores were randomly assigned to 1 of 3 conditions. Control group stores received no feedback, whereas 2 treatment groups received feedback communications on every visit (twice monthly) or on every second visit (monthly) after baseline. Logit regression models tested whether each treatment group improved verification rates relative to the control group. Postbaseline verification rates were higher in both treatment groups than in the control group, but only the stores receiving monthly communications had a significantly greater improvement compared with the control group stores. Verification rates increased significantly during the study period for all 3 groups, with delayed improvement among control group stores. Communication between managers regarding the mystery shop program may account for the delayed age-verification improvements observed in the control group stores. Encouraging interstore communication might extend the benefits of mystery shop programs beyond those stores that receive this intervention. Copyright © Taylor & Francis Group, LLC
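
    The logit models referenced in both records can be sketched as follows; the simulated data and effect sizes are hypothetical, and the actual analyses also accounted for repeated visits per store.

    ```python
    import numpy as np
    import pandas as pd
    import statsmodels.formula.api as smf

    # Simulated post-baseline mystery-shop visits (hypothetical data).
    rng = np.random.default_rng(1)
    n = 900
    group = rng.choice(["control", "monthly", "twice_monthly"], size=n)
    logit_by_group = {"control": 0.4, "monthly": 1.2, "twice_monthly": 0.9}
    p = 1 / (1 + np.exp(-np.array([logit_by_group[g] for g in group])))
    df = pd.DataFrame({"group": group, "verified": rng.binomial(1, p)})

    # Logit model of age verification on feedback condition,
    # with the control group as the reference category.
    model = smf.logit("verified ~ C(group, Treatment('control'))", data=df).fit()
    print(model.summary())
    ```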

  6. The identification of post-starburst galaxies at z ˜ 1 using multiwavelength photometry: a spectroscopic verification

    NASA Astrophysics Data System (ADS)

    Maltby, David T.; Almaini, Omar; Wild, Vivienne; Hatch, Nina A.; Hartley, William G.; Simpson, Chris; McLure, Ross J.; Dunlop, James; Rowlands, Kate; Cirasuolo, Michele

    2016-06-01

    Despite decades of study, we still do not fully understand why some massive galaxies abruptly switch off their star formation in the early Universe, and what causes their rapid transition to the red sequence. Post-starburst galaxies provide a rare opportunity to study this transition phase, but few have currently been spectroscopically identified at high redshift (z > 1). In this paper, we present the spectroscopic verification of a new photometric technique to identify post-starbursts in high-redshift surveys. The method classifies the broad-band optical-near-infrared spectral energy distributions (SEDs) of galaxies using three spectral shape parameters (supercolours), derived from a principal component analysis of model SEDs. When applied to the multiwavelength photometric data in the UKIDSS Ultra Deep Survey, this technique identified over 900 candidate post-starbursts at redshifts 0.5 < z < 2.0. In this study, we present deep optical spectroscopy for a subset of these galaxies, in order to confirm their post-starburst nature. Where a spectroscopic assessment was possible, we find the majority (19/24 galaxies; ˜80 per cent) exhibit the strong Balmer absorption (H δ equivalent width Wλ > 5 Å) and Balmer break, characteristic of post-starburst galaxies. We conclude that photometric methods can be used to select large samples of recently-quenched galaxies in the distant Universe.
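
    A rough sketch of how a supercolour-style classification can be set up (our reading of the workflow, not the authors' pipeline): fit a principal component analysis to a library of model SEDs and project the observed photometry onto the first three components.

    ```python
    import numpy as np
    from sklearn.decomposition import PCA

    # Placeholder data: real work fits PCA to model spectra, not random numbers.
    rng = np.random.default_rng(5)
    model_seds = rng.lognormal(size=(500, 12))   # 500 model SEDs x 12 bands
    pca = PCA(n_components=3).fit(np.log10(model_seds))

    observed = rng.lognormal(size=(10, 12))      # 10 observed galaxies
    supercolours = pca.transform(np.log10(observed))  # SC1..SC3 per galaxy
    print(supercolours.shape)                    # (10, 3)
    ```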

  7. Calibration and verification of models of organic carbon removal kinetics in Aerated Submerged Fixed-Bed Biofilm Reactors (ASFBBR): a case study of wastewater from an oil-refinery.

    PubMed

    Trojanowicz, Karol; Wójcik, Włodzimierz

    2011-01-01

    The article presents a case study of the calibration and verification of mathematical models of organic carbon removal kinetics in biofilm. The chosen Harremöes and Wanner & Reichert models were calibrated with a set of model parameters obtained both during dedicated studies conducted at pilot and lab scales under petrochemical wastewater conditions and from the literature. Next, the models were successfully verified through studies carried out utilizing a pilot ASFBBR-type bioreactor installed in an oil-refinery wastewater treatment plant. During verification the pilot biofilm reactor worked under varying surface organic loading rates (SOL), dissolved oxygen concentrations, and temperatures. The verification proved that the models can be applied in practice to petrochemical wastewater treatment engineering, e.g., for dimensioning biofilm bioreactors.

  8. Verification bias an underrecognized source of error in assessing the efficacy of medical imaging.

    PubMed

    Petscavage, Jonelle M; Richardson, Michael L; Carr, Robert B

    2011-03-01

    Diagnostic tests are validated by comparison against a "gold standard" reference test. When the reference test is invasive or expensive, it may not be applied to all patients. This can result in biased estimates of the sensitivity and specificity of the diagnostic test. This type of bias is called "verification bias," and is a common problem in imaging research. The purpose of our study is to estimate the prevalence of verification bias in the recent radiology literature. All issues of the American Journal of Roentgenology (AJR), Academic Radiology, Radiology, and European Journal of Radiology (EJR) between November 2006 and October 2009 were reviewed for original research articles mentioning sensitivity or specificity as endpoints. Articles were read to determine whether verification bias was present and were searched for the authors' acknowledgment of verification bias in the study design. Over these 3 years, the journals published 2969 original research articles. A total of 776 articles used sensitivity or specificity as an outcome. Of these, 211 articles demonstrated potential verification bias. The fraction of articles with potential bias was 36.4%, 23.4%, 29.5%, and 13.4%, respectively, for AJR, Academic Radiology, Radiology, and EJR. The total fraction of papers with potential bias in which the authors acknowledged this bias was 17.1%. Verification bias is a common and frequently unacknowledged source of error in efficacy studies of diagnostic imaging. Bias can often be eliminated by proper study design. When it cannot be eliminated, it should be estimated and acknowledged. Published by Elsevier Inc.
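
    A quick simulation makes the mechanism concrete (all rates here are invented for illustration): when verification is more likely after a positive test, naive sensitivity is inflated and naive specificity deflated.

    ```python
    import numpy as np

    # True sensitivity 0.8 and specificity 0.9, but 90% of test-positives
    # versus only 10% of test-negatives receive the reference test.
    rng = np.random.default_rng(2)
    n = 100_000
    disease = rng.random(n) < 0.10
    test_pos = np.where(disease, rng.random(n) < 0.8, rng.random(n) < 0.1)
    verified = rng.random(n) < np.where(test_pos, 0.9, 0.1)

    d, t = disease[verified], test_pos[verified]
    print(f"naive sensitivity {np.mean(t[d]):.3f}")    # ~0.97, biased up
    print(f"naive specificity {np.mean(~t[~d]):.3f}")  # ~0.50, biased down
    ```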

  9. A robust method using propensity score stratification for correcting verification bias for binary tests

    PubMed Central

    He, Hua; McDermott, Michael P.

    2012-01-01

    Sensitivity and specificity are common measures of the accuracy of a diagnostic test. The usual estimators of these quantities are unbiased if data on the diagnostic test result and the true disease status are obtained from all subjects in an appropriately selected sample. In some studies, verification of the true disease status is performed only for a subset of subjects, possibly depending on the result of the diagnostic test and other characteristics of the subjects. Estimators of sensitivity and specificity based on this subset of subjects are typically biased; this is known as verification bias. Methods have been proposed to correct verification bias under the assumption that the missing data on disease status are missing at random (MAR), that is, the probability of missingness depends on the true (missing) disease status only through the test result and observed covariate information. When some of the covariates are continuous, or the number of covariates is relatively large, the existing methods require parametric models for the probability of disease or the probability of verification (given the test result and covariates), and hence are subject to model misspecification. We propose a new method for correcting verification bias based on the propensity score, defined as the predicted probability of verification given the test result and observed covariates. This is estimated separately for those with positive and negative test results. The new method classifies the verified sample into several subsamples that have homogeneous propensity scores and allows correction for verification bias. Simulation studies demonstrate that the new estimators are more robust to model misspecification than existing methods, but still perform well when the models for the probability of disease and probability of verification are correctly specified. PMID:21856650
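
    A minimal sketch of the estimator's structure under MAR follows, with made-up column names and a toy simulation; the paper's actual method and its asymptotic theory are not reproduced here.

    ```python
    import numpy as np
    import pandas as pd
    from sklearn.linear_model import LogisticRegression

    def ps_corrected_sens_spec(df, n_strata=5):
        """Verification-bias-corrected sensitivity/specificity via propensity-score
        stratification; assumes every stratum contains verified subjects."""
        cells = []
        for t, sub in df.groupby("test"):
            # Propensity of verification given covariates, per test result.
            ps = LogisticRegression().fit(sub[["x"]], sub["verified"])
            sub = sub.assign(ps=ps.predict_proba(sub[["x"]])[:, 1])
            sub["stratum"] = pd.qcut(sub["ps"], n_strata, labels=False, duplicates="drop")
            for _, cell in sub.groupby("stratum"):
                p_d = cell.loc[cell["verified"] == 1, "disease"].mean()  # MAR within stratum
                cells.append((t, len(cell), p_d))
        tbl = pd.DataFrame(cells, columns=["test", "n", "p_disease"])
        diseased = tbl["n"] * tbl["p_disease"]
        healthy = tbl["n"] * (1 - tbl["p_disease"])
        sens = diseased[tbl["test"] == 1].sum() / diseased.sum()
        spec = healthy[tbl["test"] == 0].sum() / healthy.sum()
        return sens, spec

    # Toy data: disease depends on x; verification depends on test result and x.
    rng = np.random.default_rng(3)
    n = 5000
    x = rng.normal(size=n)
    disease = rng.random(n) < 1 / (1 + np.exp(-(x - 1)))
    test = np.where(disease, rng.random(n) < 0.85, rng.random(n) < 0.15)
    verified = rng.random(n) < 1 / (1 + np.exp(-(0.5 * x + np.where(test, 1.0, -1.0))))
    df = pd.DataFrame({"x": x, "test": test.astype(int),
                       "verified": verified.astype(int),
                       "disease": np.where(verified, disease, np.nan)})
    print(ps_corrected_sens_spec(df))
    ```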

  10. Electrically tunable artificial gauge potential for polaritons

    PubMed Central

    Lim, Hyang-Tag; Togan, Emre; Kroner, Martin; Miguel-Sanchez, Javier; Imamoğlu, Atac

    2017-01-01

    Neutral particles subject to artificial gauge potentials can behave as charged particles in magnetic fields. This fascinating premise has led to demonstrations of one-way waveguides, topologically protected edge states and Landau levels for photons. In ultracold neutral atoms, effective gauge fields have allowed the emulation of matter under strong magnetic fields leading to realization of Harper-Hofstadter and Haldane models. Here we show that application of perpendicular electric and magnetic fields effects a tunable artificial gauge potential for two-dimensional microcavity exciton polaritons. For verification, we perform interferometric measurements of the associated phase accumulated during coherent polariton transport. Since the gauge potential originates from the magnetoelectric Stark effect, it can be realized for photons strongly coupled to excitations in any polarizable medium. Together with strong polariton–polariton interactions and engineered polariton lattices, artificial gauge fields could play a key role in investigation of non-equilibrium dynamics of strongly correlated photons. PMID:28230047

  11. Ultrasonography in diagnosing clinically occult groin hernia: systematic review and meta-analysis.

    PubMed

    Kwee, Robert M; Kwee, Thomas C

    2018-05-14

    To provide an updated systematic review on the performance of ultrasonography (US) in diagnosing clinically occult groin hernia. A systematic search was performed in MEDLINE and Embase. The methodological quality of included studies was assessed. Accuracy data of US in detecting clinically occult groin hernia were extracted. Positive predictive value (PPV) was pooled with a random effects model. For studies investigating the performance of US in hernia type classification (inguinal vs femoral), the correctly classified proportion was assessed. Sixteen studies were included. In the two studies without verification bias, sensitivities were 29.4% [95% confidence interval (CI), 15.1-47.5%] and 90.9% (95% CI, 70.8-98.9%); specificities were 90.0% (95% CI, 80.5-95.9%) and 90.6% (95% CI, 83.0-95.6%). Verification bias or a variation of it (i.e., a study limited to only subjects with definitive proof of disease status) was present in all other studies. Sensitivity, specificity, and negative predictive value (NPV) were not pooled. PPV ranged from 58.8% to 100%. Pooled PPV, based on data from ten studies with low risk of bias and no applicability concerns with respect to patient selection, was 85.6% (95% CI, 76.5-92.7%). The proportion of correctly classified hernias, based on data from four studies, ranged between 94.4% and 99.1%. Sensitivity, specificity and NPV of US in detecting clinically occult groin hernia cannot reliably be determined based on current evidence. Further studies are necessary. Accuracy may strongly depend on the examiner's skills. PPV is high. Inguinal and femoral hernias can reliably be differentiated by US. • Sensitivity, specificity and NPV of ultrasound in detecting clinically occult groin hernia cannot reliably be determined based on current evidence. • Accuracy may strongly depend on the examiner's skills. • PPV of US in detection of clinically occult groin hernia is high [pooled PPV of 85.6% (95% confidence interval, 76.5-92.7%)]. • US has very high performance in correctly differentiating between clinically occult inguinal and femoral hernia (94.4-99.1% correctly classified).
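
    For readers unfamiliar with pooling proportions under a random-effects model, here is a generic DerSimonian-Laird sketch on the logit scale; the per-study PPVs and sample sizes below are invented, and the paper's exact model may differ.

    ```python
    import numpy as np

    def pooled_proportion(p, n):
        """Random-effects (DerSimonian-Laird) pooling of proportions on the
        logit scale; returns the pooled estimate and a 95% CI."""
        p = np.clip(np.asarray(p, float), 1e-3, 1 - 1e-3)
        y = np.log(p / (1 - p))                   # logit-transformed proportions
        v = 1 / (np.asarray(n) * p * (1 - p))     # approximate within-study variances
        w = 1 / v
        q = np.sum(w * (y - np.sum(w * y) / np.sum(w)) ** 2)
        tau2 = max(0.0, (q - (len(y) - 1)) / (np.sum(w) - np.sum(w**2) / np.sum(w)))
        w_star = 1 / (v + tau2)                   # weights with heterogeneity
        pooled = np.sum(w_star * y) / np.sum(w_star)
        se = np.sqrt(1 / np.sum(w_star))
        expit = lambda z: 1 / (1 + np.exp(-z))
        return expit(pooled), (expit(pooled - 1.96 * se), expit(pooled + 1.96 * se))

    # Hypothetical per-study PPVs and verified-positive counts.
    print(pooled_proportion([0.80, 0.92, 0.86, 0.78], [40, 55, 33, 60]))
    ```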

  12. Can self-verification strivings fully transcend the self-other barrier? Seeking verification of ingroup identities.

    PubMed

    Gómez, Angel; Seyle, D Conor; Huici, Carmen; Swann, William B

    2009-12-01

    Recent research has demonstrated self-verification strivings in groups, such that people strive to verify collective identities, which are personal self-views (e.g., "sensitive") associated with group membership (e.g., "women"). Such demonstrations stop short of showing that the desire for self-verification can fully transcend the self-other barrier, as in people working to verify ingroup identities (e.g., "Americans are loud") even when such identities are not self-descriptive ("I am quiet and unassuming"). Five studies focus on such ingroup verification strivings. Results indicate that people prefer to interact with individuals who verify their ingroup identities over those who enhance these identities (Experiments 1-5). Strivings for ingroup identity verification were independent of the extent to which the identities were self-descriptive but were stronger among participants who were highly invested in their ingroup identities, as reflected in high certainty of these identities (Experiments 1-4) and high identification with the group (Experiments 1-5). In addition, whereas past demonstrations of self-verification strivings have been limited to efforts to verify the content of identities (Experiments 1 to 3), the findings also show that they strive to verify the valence of their identities (i.e., the extent to which the identities are valued; Experiments 4 and 5). Self-verification strivings, rather than self-enhancement strivings, appeared to motivate participants' strivings for ingroup identity verification. Links to collective self-verification strivings and social identity theory are discussed.

  13. Building the Qualification File of EGNOS with DOORS

    NASA Astrophysics Data System (ADS)

    Fabre, J.

    2008-08-01

    EGNOS, the European Satellite-Based Augmentation System (SBAS) for GPS, is approaching final deployment and initial operations, moving towards qualification and certification to reach operational capability by 2008/2009. A very important milestone in the development process is the System Qualification Review (QR). As the verification phase aims at demonstrating that the EGNOS system design meets the applicable requirements, the QR declares the completion of verification activities. The main document to present at the QR is a consolidated, consistent and complete Qualification file. The information included shall give confidence to the QR reviewers that the performed qualification activities are complete. An important issue for the project team is therefore to focus on synthetic and consistent information, and to make the presentation as clear as possible. Traceability to applicable requirements shall be systematically presented. Moreover, in order to support verification justification, references to details shall be available, and the reviewer shall have the possibility to link automatically to the documents containing this detailed information. In that frame, Thales Alenia Space has implemented strong methodological and tool support, providing the System Engineering and Verification teams with a single reference technical database in which all team members consult the applicable requirements, compliance, justification and design data, and record the information necessary to build the final Qualification file. This paper presents the EGNOS context, the Qualification file contents, and the methodology implemented, based on Thales Alenia Space practices and in line with ECSS. Finally, it shows how the Qualification file is built in a DOORS environment.

  14. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Scribner, R.A.

    Sea-launched cruise missiles (SLCMs) present some particularly striking problems for both national security and arms control. These small, dual-purpose, difficult-to-detect weapons present some formidable challenges for verification in any scheme that attempts to limit rather than eliminate them. Conventionally armed SLCMs offer the navies of both superpowers important offensive and defensive capabilities. Nuclear-armed, long-range, land-attack SLCMs, on the other hand, seem to pose destabilizing threats and otherwise have questionable value, despite strong US support for extensive deployment of them. If these weapons are not constrained, their deployment could circumvent gains which might be made in agreements directly reducing strategic nuclear weapons. This paper reviews the technology and planned deployments of SLCMs and the verification schemes which have been discussed and are being investigated to try to deal with the problem, and examines the proposed need for and possible uses of SLCMs. It presents an overview of the problem technically, militarily, and politically.

  15. Electronic cigarette sales to minors via the internet.

    PubMed

    Williams, Rebecca S; Derrick, Jason; Ribisl, Kurt M

    2015-03-01

    Electronic cigarettes (e-cigarettes) entered the US market in 2007 and, with little regulatory oversight, grew into a $2-billion-a-year industry by 2013. The Centers for Disease Control and Prevention has reported a trend of increasing e-cigarette use among teens, with use rates doubling from 2011 to 2012. While several studies have documented that teens can and do buy cigarettes online, to our knowledge, no studies have yet examined age verification among Internet tobacco vendors selling e-cigarettes. To estimate the extent to which minors can successfully purchase e-cigarettes online and assess compliance with North Carolina's 2013 e-cigarette age-verification law. In this cross-sectional study conducted from February 2014 to June 2014, 11 nonsmoking minors aged 14 to 17 years made supervised e-cigarette purchase attempts from 98 Internet e-cigarette vendors. Purchase attempts were made at the University of North Carolina Internet Tobacco Vendors Study project offices using credit cards. Rate at which minors can successfully purchase e-cigarettes on the Internet. Minors successfully received deliveries of e-cigarettes from 76.5% of purchase attempts, with no attempts by delivery companies to verify their ages at delivery and 95% of delivered orders simply left at the door. All delivered packages came from shipping companies that, according to company policy or federal regulation, do not ship cigarettes to consumers. Of the total orders, 18 failed for reasons unrelated to age verification. Only 5 of the remaining 80 youth purchase attempts were rejected owing to age verification, resulting in a youth buy rate of 93.7%. None of the vendors complied with North Carolina's e-cigarette age-verification law. Minors are easily able to purchase e-cigarettes from the Internet because of an absence of age-verification measures used by Internet e-cigarette vendors. Federal law should require and enforce rigorous age verification for all e-cigarette sales as with the federal PACT (Prevent All Cigarette Trafficking) Act's requirements for age verification in Internet cigarette sales.

  16. A study of applications scribe frame data verifications using design rule check

    NASA Astrophysics Data System (ADS)

    Saito, Shoko; Miyazaki, Masaru; Sakurai, Mitsuo; Itoh, Takahisa; Doi, Kazumasa; Sakurai, Norioko; Okada, Tomoyuki

    2013-06-01

    In semiconductor manufacturing, scribe frame data is generally generated for each LSI product according to its specific process design. Scribe frame data is designed based on definition tables of scanner alignment, wafer inspection and customer-specified marks, and at the end we check that the scribe frame design conforms to the specifications of the alignment and inspection marks. Recently, in the COT (customer owned tooling) business and in new technology development, there has been no effective verification method for scribe frame data, and verification takes a lot of time. Therefore, we tried to establish a new verification method for scribe frame data by applying pattern matching and DRC (Design Rule Check), which are used in device verification. We show the scheme of the scribe frame data verification using DRC that we tried to apply. First, verification rules are created based on the specifications of the scanner, inspection, and others, and a mark library is also created for pattern matching. Next, DRC verification, which includes pattern matching against the mark library, is performed on the scribe frame data. As a result, our experiments demonstrated that, by use of pattern matching and DRC verification, our new method can yield speed improvements of more than 12 percent compared to conventional mark checks by visual inspection, and the inspection time can be reduced to less than 5 percent if multi-CPU processing is used. Our method delivers both short processing time and excellent accuracy when checking many marks. It is easy to maintain and provides an easy way for COT customers to use original marks. We believe that our new DRC verification method for scribe frame data is indispensable and mutually beneficial.
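
    To give a flavor of what a rule in such a flow checks, here is a toy spacing rule between scribe-frame marks; production flows run commercial DRC engines on real layout data, and the geometry and rule value here are invented.

    ```python
    from dataclasses import dataclass

    @dataclass
    class Mark:
        name: str
        x: float  # lower-left corner, um
        y: float
        w: float
        h: float

    def check_min_spacing(marks, min_space_um):
        """Toy DRC-style rule: flag mark pairs closer than min_space_um."""
        violations = []
        for i, a in enumerate(marks):
            for b in marks[i + 1:]:
                dx = max(0.0, b.x - (a.x + a.w), a.x - (b.x + b.w))
                dy = max(0.0, b.y - (a.y + a.h), a.y - (b.y + b.h))
                space = (dx**2 + dy**2) ** 0.5
                if space < min_space_um:
                    violations.append((a.name, b.name, space))
        return violations

    marks = [Mark("align1", 0, 0, 10, 10), Mark("inspect1", 12, 0, 10, 10)]
    print(check_min_spacing(marks, min_space_um=5.0))  # 2 um gap -> violation
    ```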

  17. Verification, Validation and Sensitivity Studies in Computational Biomechanics

    PubMed Central

    Anderson, Andrew E.; Ellis, Benjamin J.; Weiss, Jeffrey A.

    2012-01-01

    Computational techniques and software for the analysis of problems in mechanics have naturally moved from their origins in the traditional engineering disciplines to the study of cell, tissue and organ biomechanics. Increasingly complex models have been developed to describe and predict the mechanical behavior of such biological systems. While the availability of advanced computational tools has led to exciting research advances in the field, the utility of these models is often the subject of criticism due to inadequate model verification and validation. The objective of this review is to present the concepts of verification, validation and sensitivity studies with regard to the construction, analysis and interpretation of models in computational biomechanics. Specific examples from the field are discussed. It is hoped that this review will serve as a guide to the use of verification and validation principles in the field of computational biomechanics, thereby improving the peer acceptance of studies that use computational modeling techniques. PMID:17558646

  18. The effectiveness of ID readers and remote age verification in enhancing compliance with the legal age limit for alcohol.

    PubMed

    Van Hoof, Joris J

    2017-04-01

    Currently, two different age verification systems (AVS) are implemented to enhance compliance with legal age limits for the sale of alcohol in the Netherlands. In this study, we tested the operational procedures and effectiveness of ID readers and remote age verification technology in supermarkets during the sale of alcohol. Following a trained alcohol purchase protocol, eight mystery shoppers (both underage and of the branch's reference age) conducted 132 alcohol purchase attempts in stores that were equipped with ID readers or remote age verification or were part of a control group. In stores equipped with an ID reader, 34% of the purchases were conducted without any mistakes (full compliance). In stores with remote age verification, full compliance was achieved in 87% of the cases. The control group reached 57% compliance, which is in line with the national average. Stores with ID readers perform worse than stores with remote age verification, and also worse than stores without any AVS. For both systems, in addition to effectiveness, public support and user friendliness need to be investigated. This study shows that remote age verification technology is a promising intervention that increases vendor compliance during the sale of age-restricted products. © The Author 2016. Published by Oxford University Press on behalf of the European Public Health Association. All rights reserved.

  19. The Maximal Oxygen Uptake Verification Phase: a Light at the End of the Tunnel?

    PubMed

    Schaun, Gustavo Z

    2017-12-08

    Commonly performed during an incremental test to exhaustion, maximal oxygen uptake (V̇O2max) assessment has become a recurring practice in clinical and experimental settings. To validate the test, several criteria have been proposed. In this context, the plateau in oxygen uptake (V̇O2) is inconsistent in its frequency, reducing its usefulness as a robust method to determine "true" V̇O2max. Moreover, the secondary criteria previously suggested, such as expiratory exchange ratios or percentages of maximal heart rate, are highly dependent on protocol design and often are achieved at V̇O2 percentages well below V̇O2max. Thus, an alternative method termed the verification phase was proposed. Currently, it is clear that the verification phase can be a practical and sensitive method to confirm V̇O2max; however, procedures to conduct it are not standardized across the literature and no previous research has tried to summarize how it has been employed. Therefore, in this review the knowledge on the verification phase is updated, and suggestions on how it can be performed (e.g., intensity, duration, recovery) are provided according to population and protocol design. Future studies should focus on identifying a verification protocol feasible for different populations and on comparing square-wave and multistage verification phases. Additionally, studies assessing verification phases in different patient populations are still warranted.
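
    In code, the verification logic reduces to a simple criterion check; the 3% tolerance below is an assumed cut-off for illustration, since, as the review notes, procedures are not standardized across the literature.

    ```python
    def confirms_vo2max(incremental_vo2, verification_vo2, tolerance=0.03):
        """True if the verification-phase peak V̇O2 does not exceed the
        incremental-test value by more than `tolerance` (mL/kg/min)."""
        return verification_vo2 <= incremental_vo2 * (1 + tolerance)

    print(confirms_vo2max(52.1, 51.4))  # True: incremental value confirmed
    print(confirms_vo2max(52.1, 55.9))  # False: incremental test underestimated V̇O2max
    ```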

  20. Feasibility study on dosimetry verification of volumetric-modulated arc therapy-based total marrow irradiation.

    PubMed

    Liang, Yun; Kim, Gwe-Ya; Pawlicki, Todd; Mundt, Arno J; Mell, Loren K

    2013-03-04

    The purpose of this study was to develop dosimetry verification procedures for volumetric-modulated arc therapy (VMAT)-based total marrow irradiation (TMI). VMAT-based TMI plans were generated for three patients: one child and two adults. The planning target volume (PTV) was defined as the bony skeleton, from head to mid-femur, with a 3 mm margin. A plan strategy similar to published studies was adopted. The PTV was divided into head and neck, chest, and pelvic regions, with separate plans, each composed of 2-3 arcs/fields. Multiple isocenters were evenly distributed along the patient's axial direction. The focus of this study is to establish a dosimetry quality assurance procedure involving both two-dimensional (2D) and three-dimensional (3D) volumetric verifications, which is desirable for a large PTV treated with multiple isocenters. The 2D dose verification was performed with film for gamma evaluation, and absolute point dose was measured with an ion chamber, with attention to hot/cold spots at the junctions between neighboring plans. The 3D volumetric dose verification used commercial dose reconstruction software to reconstruct dose from electronic portal imaging device (EPID) images. The gamma evaluation criteria in both 2D and 3D verification were 5% absolute point dose difference and 3 mm distance to agreement. With film dosimetry, the overall average gamma passing rate was 98.2% and the absolute dose difference was 3.9% in junction areas among the test patients; with volumetric portal dosimetry, the corresponding numbers were 90.7% and 2.4%. A dosimetry verification procedure involving both 2D and 3D was developed for VMAT-based TMI. The initial results are encouraging and warrant further investigation in clinical trials.
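
    The 5%/3 mm gamma evaluation used above can be sketched with a brute-force 2D implementation on synthetic dose planes; this simplified global-gamma illustration is not the clinical software used in the study.

    ```python
    import numpy as np

    def gamma_index(dose_eval, dose_ref, spacing_mm, dd=0.05, dta_mm=3.0):
        """Brute-force 2D global gamma (dose difference dd, distance dta_mm);
        dd is taken relative to the reference maximum."""
        ny, nx = dose_ref.shape
        yy, xx = np.mgrid[0:ny, 0:nx]
        d_norm = dd * dose_ref.max()
        search = int(np.ceil(3 * dta_mm / spacing_mm))  # limit the search window
        gamma = np.full(dose_ref.shape, np.inf)
        for j in range(ny):
            for i in range(nx):
                j0, j1 = max(0, j - search), min(ny, j + search + 1)
                i0, i1 = max(0, i - search), min(nx, i + search + 1)
                r2 = ((yy[j0:j1, i0:i1] - j) ** 2
                      + (xx[j0:j1, i0:i1] - i) ** 2) * spacing_mm**2
                d2 = (dose_eval[j0:j1, i0:i1] - dose_ref[j, i]) ** 2
                gamma[j, i] = np.sqrt(np.min(r2 / dta_mm**2 + d2 / d_norm**2))
        return gamma

    # Toy dose planes: a uniform field delivered 2% hot.
    ref = np.pad(np.ones((20, 20)), 10)
    g = gamma_index(ref * 1.02, ref, spacing_mm=1.0)
    print("pass rate:", np.mean(g <= 1.0))
    ```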

  1. NHEXAS PHASE I ARIZONA STUDY--STANDARD OPERATING PROCEDURE FOR DATA ENTRY AND DATA VERIFICATION (HAND ENTRY) (UA-D-15.0)

    EPA Science Inventory

    The purpose of this SOP is to define the steps involved in data entry and data verification of physical forms. It applies to the data entry and data verification of all physical forms. The procedure defined herein was developed for use in the Arizona NHEXAS project and the "Bor...

  2. Verification of chemistry reference ranges using a simple method in sub-Saharan Africa

    PubMed Central

    Taylor, Douglas; Mandala, Justin; Nanda, Kavita; Van Campenhout, Christel; Agingu, Walter; Madurai, Lorna; Barsch, Eva-Maria; Deese, Jennifer; Van Damme, Lut; Crucitti, Tania

    2016-01-01

    Background Chemistry safety assessments are interpreted by using chemistry reference ranges (CRRs). Verification of CRRs is time consuming and often requires a statistical background. Objectives We report on an easy and cost-saving method to verify CRRs. Methods Using a former method introduced by Sigma Diagnostics, three study sites in sub-Saharan Africa, Bondo, Kenya, and Pretoria and Bloemfontein, South Africa, verified the CRRs for hepatic and renal biochemistry assays performed during a clinical trial of HIV antiretroviral pre-exposure prophylaxis. The aspartate aminotransferase/alanine aminotransferase, creatinine and phosphorus results from 10 clinically-healthy participants at the screening visit were used. In the event the CRRs did not pass the verification, new CRRs had to be calculated based on 40 clinically-healthy participants. Results Within a few weeks, the study sites accomplished verification of the CRRs without additional costs. The aspartate aminotransferase reference ranges for the Bondo, Kenya site and the alanine aminotransferase reference ranges for the Pretoria, South Africa site required adjustment. The phosphorus CRR passed verification and the creatinine CRR required adjustment at every site. The newly-established CRR intervals were narrower than the CRRs used previously at these study sites due to decreases in the upper limits of the reference ranges. As a result, more toxicities were detected. Conclusion To ensure the safety of clinical trial participants, verification of CRRs should be standard practice in clinical trials conducted in settings where the CRR has not been validated for the local population. This verification method is simple, inexpensive, and can be performed by any medical laboratory. PMID:28879112

  3. Verification of chemistry reference ranges using a simple method in sub-Saharan Africa.

    PubMed

    De Baetselier, Irith; Taylor, Douglas; Mandala, Justin; Nanda, Kavita; Van Campenhout, Christel; Agingu, Walter; Madurai, Lorna; Barsch, Eva-Maria; Deese, Jennifer; Van Damme, Lut; Crucitti, Tania

    2016-01-01

    Chemistry safety assessments are interpreted by using chemistry reference ranges (CRRs). Verification of CRRs is time consuming and often requires a statistical background. We report on an easy and cost-saving method to verify CRRs. Using a former method introduced by Sigma Diagnostics, three study sites in sub-Saharan Africa, Bondo, Kenya, and Pretoria and Bloemfontein, South Africa, verified the CRRs for hepatic and renal biochemistry assays performed during a clinical trial of HIV antiretroviral pre-exposure prophylaxis. The aspartate aminotransferase/alanine aminotransferase, creatinine and phosphorus results from 10 clinically-healthy participants at the screening visit were used. In the event the CRRs did not pass the verification, new CRRs had to be calculated based on 40 clinically-healthy participants. Within a few weeks, the study sites accomplished verification of the CRRs without additional costs. The aspartate aminotransferase reference ranges for the Bondo, Kenya site and the alanine aminotransferase reference ranges for the Pretoria, South Africa site required adjustment. The phosphorus CRR passed verification and the creatinine CRR required adjustment at every site. The newly-established CRR intervals were narrower than the CRRs used previously at these study sites due to decreases in the upper limits of the reference ranges. As a result, more toxicities were detected. To ensure the safety of clinical trial participants, verification of CRRs should be standard practice in clinical trials conducted in settings where the CRR has not been validated for the local population. This verification method is simple, inexpensive, and can be performed by any medical laboratory.
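
    In code, such a verification reduces to counting how many healthy-subject results fall inside the existing range; the acceptance rule below (at most one of ten outside) is our illustrative assumption, not necessarily the exact Sigma Diagnostics criterion.

    ```python
    def verify_crr(values, low, high, n_allowed_outside=1):
        """Accept the reference range if at most `n_allowed_outside` of the
        (typically 10) healthy-subject results fall outside [low, high]."""
        outside = sum(1 for v in values if not (low <= v <= high))
        return outside <= n_allowed_outside, outside

    alt_u_per_l = [22, 31, 18, 27, 35, 24, 29, 33, 21, 26]  # hypothetical ALT results
    print(verify_crr(alt_u_per_l, low=10, high=40))          # (True, 0): range verified
    ```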

  4. Development of independent MU/treatment time verification algorithm for non-IMRT treatment planning: A clinical experience

    NASA Astrophysics Data System (ADS)

    Tatli, Hamza; Yucel, Derya; Yilmaz, Sercan; Fayda, Merdan

    2018-02-01

    The aim of this study is to develop an algorithm for independent MU/treatment time (TT) verification of non-IMRT treatment plans, as part of a QA program to ensure treatment delivery accuracy. Two radiotherapy delivery units and their treatment planning systems (TPSs) were commissioned in the Liv Hospital Radiation Medicine Center, Tbilisi, Georgia. Beam data were collected according to the vendors' collection guidelines and AAPM report recommendations, and were processed in Microsoft Excel during in-house algorithm development. The algorithm is designed and optimized for calculating SSD and SAD treatment plans, based on the AAPM TG-114 dose calculation recommendations, and is coded and embedded in an MS Excel spreadsheet as a preliminary verification algorithm (VA). Treatment verification plans were created with the TPSs based on IAEA TRS-430 recommendations and were also calculated by the VA; point measurements were collected with a solid water phantom and compared. The study showed that the in-house VA can be used for MU/TT verification of non-IMRT plans.
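
    A minimal sketch of the kind of TG-114-style hand calculation such a spreadsheet encodes, for an SSD setup; the factor values below are placeholders, since real Sc, Sp, PDD, and wedge factors are machine- and commissioning-specific.

    ```python
    def mu_ssd(dose_cgy, dose_rate_ref_cgy_per_mu, sc, sp, pdd_percent, wedge_factor=1.0):
        """Monitor units for an SSD setup, following the generic TG-114 form
        MU = D / (D'_ref * Sc * Sp * PDD/100 * WF)."""
        return dose_cgy / (dose_rate_ref_cgy_per_mu * sc * sp
                           * (pdd_percent / 100.0) * wedge_factor)

    # Hypothetical check: 200 cGy at 10 cm depth, 10x10 cm2 open field.
    print(round(mu_ssd(200.0, dose_rate_ref_cgy_per_mu=1.0,
                       sc=1.0, sp=1.0, pdd_percent=66.4), 1))  # ~301.2 MU
    ```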

  5. Formal verification of medical monitoring software using Z language: a representative sample.

    PubMed

    Babamir, Seyed Morteza; Borhani, Mehdi

    2012-08-01

    Medical monitoring systems are useful aids that assist physicians in keeping patients under constant surveillance; however, whether such systems take sound decisions is a physician's concern. As a result, verification of a system's behavior in monitoring patients is a matter of significance. In modern medical systems, patient monitoring is undertaken by software, so software verification of modern medical systems has attracted attention. Such verification can be achieved with formal languages, which have mathematical foundations. Among others, the Z language is a suitable formal language that has been used for the formal verification of systems. This study aims to present a constructive method to verify a representative sample of a medical system, in which the system is visually specified and formally verified against patient constraints stated in the Z language. Exploiting our past experience in formally modeling the Continuous Infusion Insulin Pump (CIIP), we take the CIIP system as a representative sample of medical systems in the present study. The system is responsible for monitoring a diabetic's blood sugar.
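
    While the paper works in Z, the flavor of checking a patient constraint can be conveyed with an executable predicate; the thresholds and rule below are hypothetical, not the paper's actual CIIP schema.

    ```python
    def safe_dose(blood_sugar_mg_dl: float, dose_units: float) -> bool:
        """Toy invariant: never infuse insulin into a hypoglycemic patient,
        and always infuse when the patient is hyperglycemic."""
        LOW, HIGH = 70.0, 180.0
        if blood_sugar_mg_dl <= LOW:
            return dose_units == 0.0
        if blood_sugar_mg_dl >= HIGH:
            return dose_units > 0.0
        return True  # discretionary range

    assert safe_dose(60.0, 0.0) and not safe_dose(60.0, 1.5)
    print("invariant holds on the sample cases")
    ```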

  6. Multicriterion problem of allocation of resources in the heterogeneous distributed information processing systems

    NASA Astrophysics Data System (ADS)

    Antamoshkin, O. A.; Kilochitskaya, T. R.; Ontuzheva, G. A.; Stupina, A. A.; Tynchenko, V. S.

    2018-05-01

    This study reviews the problem of allocation of resources in heterogeneous distributed information processing systems, which may be formalized as a multicriterion multi-index problem with linear constraints of the transport type. Algorithms for solving this problem search for the entire set of Pareto-optimal solutions. For some classes of hierarchical systems, it is possible to significantly speed up the procedure of checking a system of linear algebraic inequalities for consistency, owing to their reducibility to flow models or to the applicability of other solution schemes (for strongly connected structures) that take the specifics of the hierarchies under consideration into account.
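
    As a small aid to intuition, the sketch below filters a candidate set of allocations down to its Pareto-optimal subset (all objectives minimized); it is a generic helper, not the authors' algorithm for hierarchical systems.

    ```python
    import numpy as np

    def pareto_front(points):
        """Return the rows of `points` not dominated by any other row."""
        pts = np.asarray(points, float)
        keep = []
        for i, p in enumerate(pts):
            dominated = np.any(np.all(pts <= p, axis=1) & np.any(pts < p, axis=1))
            if not dominated:
                keep.append(i)
        return pts[keep]

    # Candidate allocations scored on (cost, latency).
    print(pareto_front([(3, 5), (2, 7), (4, 4), (3, 4), (5, 6)]))
    # -> [[2. 7.] [3. 4.]]
    ```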

  7. Numerically robust and efficient nonlocal electron transport in 2D DRACO simulations

    NASA Astrophysics Data System (ADS)

    Cao, Duc; Chenhall, Jeff; Moses, Greg; Delettrez, Jacques; Collins, Tim

    2013-10-01

    An improved implicit algorithm based on the Schurtz, Nicolai and Busquet (SNB) algorithm for nonlocal electron transport is presented. Validation against direct-drive shock timing experiments and verification against the Goncharov nonlocal model in 1D LILAC simulations demonstrate the viability of this efficient algorithm for producing 2D Lagrangian radiation-hydrodynamics direct-drive simulations. Additionally, the simulations provide strong incentive to further modify key parameters within the SNB theory, namely the "mean free path." An example 2D polar-drive simulation studying the 2D effects of the nonlocal flux, as well as mean-free-path modifications, will also be presented. This research was supported by the University of Rochester Laboratory for Laser Energetics.

  8. Block 2 SRM conceptual design studies. Volume 1, Book 2: Preliminary development and verification plan

    NASA Technical Reports Server (NTRS)

    1986-01-01

    Activities that will be conducted in support of the development and verification of the Block 2 Solid Rocket Motor (SRM) are described. Development includes design, fabrication, processing, and testing activities in which the results are fed back into the project. Verification includes analytical and test activities which demonstrate SRM component/subassembly/assembly capability to perform its intended function. The management organization responsible for formulating and implementing the verification program is introduced. It also identifies the controls which will monitor and track the verification program. Integral with the design and certification of the SRM are other pieces of equipment used in transportation, handling, and testing which influence the reliability and maintainability of the SRM configuration. The certification of this equipment is also discussed.

  9. MARATHON Verification (MARV)

    DTIC Science & Technology

    2017-08-01

…comparable with MARATHON 1 in terms of output. Rather, the MARATHON 2 verification cases were designed to ensure correct implementation of the new algorithms… for employment against demands. This study is a comparative verification of the functionality of MARATHON 4 (our newest implementation of MARATHON)…

  10. Study of space shuttle orbiter system management computer function. Volume 2: Automated performance verification concepts

    NASA Technical Reports Server (NTRS)

    1975-01-01

    The findings are presented of investigations on concepts and techniques in automated performance verification. The investigations were conducted to provide additional insight into the design methodology and to develop a consolidated technology base from which to analyze performance verification design approaches. Other topics discussed include data smoothing, function selection, flow diagrams, data storage, and shuttle hydraulic systems.

  11. Dosimetric verification of stereotactic radiosurgery/stereotactic radiotherapy dose distributions using Gafchromic EBT3.

    PubMed

    Cusumano, Davide; Fumagalli, Maria L; Marchetti, Marcello; Fariselli, Laura; De Martin, Elena

    2015-01-01

    The aim of this study is to examine the feasibility of using the new Gafchromic EBT3 film in a high-dose stereotactic radiosurgery and radiotherapy quality assurance procedure. Owing to the reduced dimensions of the involved lesions, the feasibility of scanning plan verification films on the scanner plate area with the best uniformity, rather than using a correction mask, was evaluated. For this purpose, the dispersion of signal values and the reproducibility of film scans were investigated. Uniformity was then quantified in the selected area and was found to be within 1.5% for doses up to 8 Gy. A high-dose threshold level for analyses using this procedure was established by evaluating the sensitivity of the irradiated films. Sensitivity was found to be of the order of centigray for doses up to 6.2 Gy, decreasing for higher doses. The obtained results were used to implement a procedure comparing dose distributions delivered with a CyberKnife system to planned ones. The procedure was validated through single-beam irradiation on a Gafchromic film. The agreement between dose distributions was then evaluated for 13 patients (brain lesions, 5 Gy/die, prescription isodose ~80%) using gamma analysis. Results obtained using gamma test criteria of 5%/1 mm show a pass rate of 94.3%. Calculation of the gamma frequency parameters for EBT3 films was shown to depend strongly on subtraction of unexposed film pixel values from irradiated ones. In the framework of the described dosimetric procedure, EBT3 films proved to be effective in the verification of high doses delivered to lesions with complex shapes and adjacent to organs at risk. Copyright © 2015 American Association of Medical Dosimetrists. Published by Elsevier Inc. All rights reserved.

  12. Bias-corrected diagnostic performance of the naked-eye single-tube red-cell osmotic fragility test (NESTROFT): an effective screening tool for beta-thalassemia.

    PubMed

    Mamtani, Manju; Jawahirani, Anil; Das, Kishor; Rughwani, Vinky; Kulkarni, Hemant

    2006-08-01

    It is being increasingly recognized that a majority of the countries in the thalassemia belt need a cost-effective screening program as the first step towards control of thalassemia. Although the naked-eye single-tube red-cell osmotic fragility test (NESTROFT) has been considered a very effective screening tool for the beta-thalassemia trait, assessment of its diagnostic performance has been affected by reference-test and verification bias. Here, we set out to provide estimates of the sensitivity and specificity of NESTROFT corrected for these potential biases. We conducted a cross-sectional diagnostic test evaluation study using data from 1563 subjects from Central India with a high prevalence of beta-thalassemia. We used latent class modelling, after ensuring its validity, to account for the reference-test bias, and global sensitivity analysis to control the verification bias. We also compared the results of latent class modelling with those of five discriminant indexes. We observed that, across a range of cut-offs for the mean corpuscular volume (MCV) and the hemoglobin A2 (HbA2) concentration, the average sensitivity and specificity of NESTROFT obtained from latent class modelling were 99.8% and 83.7%, respectively. These estimates were comparable to those characterizing the diagnostic performance of HbA2, which is considered by many as the reference test to detect beta-thalassemia. After correction for the verification bias, these estimates were 93.4% and 97.2%, respectively. Combined with the inexpensive and quick disposition of NESTROFT, these results strongly support its candidature as a screening tool, especially in resource-poor, high-prevalence settings.

  13. Dosimetric verification of stereotactic radiosurgery/stereotactic radiotherapy dose distributions using Gafchromic EBT3

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cusumano, Davide, E-mail: davide.cusumano@unimi.it; Fumagalli, Maria L.; Marchetti, Marcello

    2015-10-01

    The aim of this study is to examine the feasibility of using the new Gafchromic EBT3 film in a high-dose stereotactic radiosurgery and radiotherapy quality assurance procedure. Owing to the reduced dimensions of the involved lesions, the feasibility of scanning plan verification films on the scanner plate area with the best uniformity, rather than using a correction mask, was evaluated. For this purpose, the dispersion of signal values and the reproducibility of film scans were investigated. Uniformity was then quantified in the selected area and was found to be within 1.5% for doses up to 8 Gy. A high-dose threshold level for analyses using this procedure was established by evaluating the sensitivity of the irradiated films. Sensitivity was found to be of the order of centigray for doses up to 6.2 Gy, decreasing for higher doses. The obtained results were used to implement a procedure comparing dose distributions delivered with a CyberKnife system to planned ones. The procedure was validated through single-beam irradiation on a Gafchromic film. The agreement between dose distributions was then evaluated for 13 patients (brain lesions, 5 Gy/die, prescription isodose ~80%) using gamma analysis. Results obtained using gamma test criteria of 5%/1 mm show a pass rate of 94.3%. Calculation of the gamma frequency parameters for EBT3 films was shown to depend strongly on subtraction of unexposed film pixel values from irradiated ones. In the framework of the described dosimetric procedure, EBT3 films proved to be effective in the verification of high doses delivered to lesions with complex shapes and adjacent to organs at risk.

  14. Exploring implementation practices in results-based financing: the case of the verification in Benin.

    PubMed

    Antony, Matthieu; Bertone, Maria Paola; Barthes, Olivier

    2017-03-14

    Results-based financing (RBF) has been introduced in many countries across Africa and a growing literature is building around the assessment of its impact. These studies are usually quantitative and often silent on the paths and processes through which results are achieved and on the wider health system effects of RBF. To address this gap, our study aims at exploring the implementation of an RBF pilot in Benin, focusing on the verification of results. The study is based on action research carried out by authors involved in the pilot as part of the agency supporting the RBF implementation in Benin. While our participant observation and operational collaboration with the project's stakeholders informed the study, the analysis is mostly based on quantitative and qualitative secondary data, collected throughout the project's implementation and documentation processes. Data include project documents, reports and budgets, RBF data on service outputs and on the outcome of the verification, daily activity timesheets of the technical assistants in the districts, as well as focus groups with community-based organizations and informal interviews with technical assistants and district medical officers. Our analysis focuses on the actual practices of quantitative, qualitative and community verification. Results show that the verification processes are complex, costly and time-consuming, and in practice they end up differing from what was originally designed. We explore the consequences of this for the operation of the scheme and for its potential to generate the envisaged change. We find, for example, that the time taken up by verification procedures limits the time available for data analysis and feedback to facility staff, thus limiting the potential to improve service delivery. Verification challenges also result in delays in bonus payment, which delinks effort from reward. Additionally, the limited integration of the verification activities of district teams with their routine tasks causes a further verticalization of the health system. Our results highlight the potential disconnect between the theory of change behind RBF and the actual scheme's implementation. The implications are relevant at the methodological level, stressing the importance of analyzing implementation processes to fully understand results, as well as at the operational level, pointing to the need to carefully adapt the design of RBF schemes (including verification and other key functions) to the context and to allow room to iteratively modify it during implementation. They also question whether the rationale for thorough and costly verification is justified, or whether adaptations are possible.

  15. A field study of the accuracy and reliability of a biometric iris recognition system.

    PubMed

    Latman, Neal S; Herb, Emily

    2013-06-01

    The iris of the eye appears to satisfy the criteria for a good anatomical characteristic for use in a biometric system. The purpose of this study was to evaluate a biometric iris recognition system: Mobile-Eyes™. The enrollment, verification, and identification applications were evaluated in a field study for accuracy and reliability using both irises of 277 subjects. Independent variables included a wide range of subject demographics, ambient light, and ambient temperature. A sub-set of 35 subjects had alcohol-induced nystagmus. There were 2710 identification and verification attempts, which resulted in 1,501,340 and 5540 iris comparisons respectively. In this study, the system successfully enrolled all subjects on the first attempt. All 277 subjects were successfully verified and identified on the first day of enrollment. None of the current or prior eye conditions prevented enrollment, verification, or identification. All 35 subjects with alcohol-induced nystagmus were successfully verified and identified. There were no false verifications or false identifications. Two conditions were identified that potentially could circumvent the use of iris recognitions systems in general. The Mobile-Eyes™ iris recognition system exhibited accurate and reliable enrollment, verification, and identification applications in this study. It may have special applications in subjects with nystagmus. Copyright © 2012 Forensic Science Society. Published by Elsevier Ireland Ltd. All rights reserved.

  16. JPRS Report, Soviet Union, World Economy and International Relations, Number 2, February 1990.

    DTIC Science & Technology

    1990-06-21

…of the simplistic line of reasoning was too strong, and the new concept was too unusual. The main reason, however, was the exceptional complexity… many other problems which were previously considered strictly internal. This also applies to participation in the joint resolution of global problems… would give rise to the need to resolve the complex issues of the verification of the elimination itself and, possibly, of the production of fissionable…

  17. Mitigating errors caused by interruptions during medication verification and administration: interventions in a simulated ambulatory chemotherapy setting.

    PubMed

    Prakash, Varuna; Koczmara, Christine; Savage, Pamela; Trip, Katherine; Stewart, Janice; McCurdie, Tara; Cafazzo, Joseph A; Trbovich, Patricia

    2014-11-01

    Nurses are frequently interrupted during medication verification and administration; however, few interventions exist to mitigate resulting errors, and the impact of these interventions on medication safety is poorly understood. The study objectives were to (A) assess the effects of interruptions on medication verification and administration errors, and (B) design and test the effectiveness of targeted interventions at reducing these errors. The study focused on medication verification and administration in an ambulatory chemotherapy setting. A simulation laboratory experiment was conducted to determine interruption-related error rates during specific medication verification and administration tasks. Interventions to reduce these errors were developed through a participatory design process, and their error reduction effectiveness was assessed through a postintervention experiment. Significantly more nurses committed medication errors when interrupted than when uninterrupted. With use of interventions when interrupted, significantly fewer nurses made errors in verifying medication volumes contained in syringes (16/18; 89% preintervention error rate vs 11/19; 58% postintervention error rate; p=0.038; Fisher's exact test) and programmed in ambulatory pumps (17/18; 94% preintervention vs 11/19; 58% postintervention; p=0.012). The rate of error commission significantly decreased with use of interventions when interrupted during intravenous push (16/18; 89% preintervention vs 6/19; 32% postintervention; p=0.017) and pump programming (7/18; 39% preintervention vs 1/19; 5% postintervention; p=0.017). No statistically significant differences were observed for other medication verification tasks. Interruptions can lead to medication verification and administration errors. Interventions were highly effective at reducing unanticipated errors of commission in medication administration tasks, but showed mixed effectiveness at reducing predictable errors of detection in medication verification tasks. These findings can be generalised and adapted to mitigate interruption-related errors in other settings where medication verification and administration are required. Published by the BMJ Publishing Group Limited. For permission to use (where not already granted under a licence) please go to http://group.bmj.com/group/rights-licensing/permissions.

  18. Mitigating errors caused by interruptions during medication verification and administration: interventions in a simulated ambulatory chemotherapy setting

    PubMed Central

    Prakash, Varuna; Koczmara, Christine; Savage, Pamela; Trip, Katherine; Stewart, Janice; McCurdie, Tara; Cafazzo, Joseph A; Trbovich, Patricia

    2014-01-01

    Background Nurses are frequently interrupted during medication verification and administration; however, few interventions exist to mitigate resulting errors, and the impact of these interventions on medication safety is poorly understood. Objective The study objectives were to (A) assess the effects of interruptions on medication verification and administration errors, and (B) design and test the effectiveness of targeted interventions at reducing these errors. Methods The study focused on medication verification and administration in an ambulatory chemotherapy setting. A simulation laboratory experiment was conducted to determine interruption-related error rates during specific medication verification and administration tasks. Interventions to reduce these errors were developed through a participatory design process, and their error reduction effectiveness was assessed through a postintervention experiment. Results Significantly more nurses committed medication errors when interrupted than when uninterrupted. With use of interventions when interrupted, significantly fewer nurses made errors in verifying medication volumes contained in syringes (16/18; 89% preintervention error rate vs 11/19; 58% postintervention error rate; p=0.038; Fisher's exact test) and programmed in ambulatory pumps (17/18; 94% preintervention vs 11/19; 58% postintervention; p=0.012). The rate of error commission significantly decreased with use of interventions when interrupted during intravenous push (16/18; 89% preintervention vs 6/19; 32% postintervention; p=0.017) and pump programming (7/18; 39% preintervention vs 1/19; 5% postintervention; p=0.017). No statistically significant differences were observed for other medication verification tasks. Conclusions Interruptions can lead to medication verification and administration errors. Interventions were highly effective at reducing unanticipated errors of commission in medication administration tasks, but showed mixed effectiveness at reducing predictable errors of detection in medication verification tasks. These findings can be generalised and adapted to mitigate interruption-related errors in other settings where medication verification and administration are required. PMID:24906806
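
    The Fisher's exact comparisons reported in both versions of this record are easy to reproduce; the sketch below rebuilds the 2x2 table for the syringe-volume task from the abstract's counts.

    ```python
    from scipy.stats import fisher_exact

    # Rows: pre- and post-intervention; columns: nurses with / without errors.
    table = [[16, 18 - 16],
             [11, 19 - 11]]
    odds_ratio, p = fisher_exact(table)  # two-sided by default
    print(f"OR = {odds_ratio:.2f}, p = {p:.3f}")  # the abstract reports p = 0.038
    ```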

  19. SU-F-T-494: A Multi-Institutional Study of Independent Dose Verification Using Golden Beam Data

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Itano, M; Yamazaki, T; Tachibana, R

    Purpose: In general, the beam data of an individual linac is measured for an independent dose verification software program, and the verification is performed as a secondary check. In this study, independent dose verification using golden beam data was compared to verification using the individual linac's beam data. Methods: Six institutions participated, and three different sets of beam data were prepared. The first was the individually measured data (Original Beam Data, OBD). The others were generated from all measurements of the same linac model (Model-GBD) and of all linac models (All-GBD). The three beam data sets were registered to the independent verification software program at each institute. Subsequently, patients' plans for eight sites (brain, head and neck, lung, esophagus, breast, abdomen, pelvis and bone) were analyzed using the verification program to compare doses calculated using the three different beam data sets. Results: 1116 plans were collected from the six institutes. Compared to the OBD-based calculation, the variation using the Model-GBD-based and the All-GBD-based calculations was 0.0 ± 0.3% and 0.0 ± 0.6%, respectively. The maximum variations were 1.2% and 2.3%, respectively. The plans with a variation over 1% had reference points located away from the central axis, with or without a physical wedge. Conclusion: The confidence limit (2SD) using the Model-GBD and the All-GBD was within 0.6% and 1.2%, respectively. Thus, the use of golden beam data may be feasible for independent verification. In addition, verification using golden beam data provides quality assurance of planning from the viewpoint of an audit. This research is partially supported by the Japan Agency for Medical Research and Development (AMED).
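
    A minimal sketch (an assumption, not the study's analysis code) of how a variation statistic and a 2SD confidence limit of this kind can be computed from per-plan dose deviations:

```python
import numpy as np

rng = np.random.default_rng(0)
# Hypothetical per-plan dose deviations (%) of a golden-beam-data calculation
# from the original-beam-data calculation; the study pooled 1116 plans.
deviation = rng.normal(loc=0.0, scale=0.3, size=1116)

mean, sd = deviation.mean(), deviation.std(ddof=1)
print(f"variation = {mean:.1f} +/- {sd:.1f} %")
print(f"confidence limit (2SD) = {2 * sd:.1f} %")  # cf. 0.6% for Model-GBD
```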

  20. TROIKA: a general framework for heart rate monitoring using wrist-type photoplethysmographic signals during intensive physical exercise.

    PubMed

    Zhang, Zhilin; Pi, Zhouyue; Liu, Benyuan

    2015-02-01

    Heart rate monitoring using wrist-type photoplethysmographic (PPG) signals during subjects' intensive exercise is a difficult problem, since the signals are contaminated by extremely strong motion artifacts caused by subjects' hand movements. So far, few works have studied this problem. In this study, a general framework, termed TROIKA, is proposed, which consists of signal decomposiTion for denoising, sparse signal RecOnstructIon for high-resolution spectrum estimation, and spectral peaK trAcking with verification. The TROIKA framework has high estimation accuracy and is robust to strong motion artifacts. Many variants can be straightforwardly derived from this framework. Experimental results on datasets recorded from 12 subjects during fast running at a peak speed of 15 km/h showed that the average absolute error of heart rate estimation was 2.34 beats per minute, and the Pearson correlation between the estimates and the ground truth of heart rate was 0.992. This framework is of great value to wearable devices, such as smartwatches, which use PPG signals to monitor heart rate for fitness.
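
    A minimal sketch (not the TROIKA implementation) of the two summary metrics reported above, given per-window heart-rate estimates and ECG-derived ground truth in beats per minute; the data here are hypothetical:

```python
import numpy as np

def evaluate(hr_estimated: np.ndarray, hr_true: np.ndarray):
    """Average absolute error (bpm) and Pearson correlation."""
    abs_err = np.abs(hr_estimated - hr_true)
    pearson_r = np.corrcoef(hr_estimated, hr_true)[0, 1]
    return abs_err.mean(), pearson_r

truth = np.linspace(80, 170, 200)                         # hypothetical run
estimate = truth + np.random.default_rng(1).normal(0, 2.3, 200)
mae, r = evaluate(estimate, truth)
print(f"average absolute error = {mae:.2f} bpm, r = {r:.3f}")
```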

  1. Inverse probability weighting estimation of the volume under the ROC surface in the presence of verification bias.

    PubMed

    Zhang, Ying; Alonzo, Todd A

    2016-11-01

    In diagnostic medicine, the volume under the receiver operating characteristic (ROC) surface (VUS) is a commonly used index to quantify the ability of a continuous diagnostic test to discriminate between three disease states. In practice, verification of the true disease status may be performed only for a subset of subjects under study since the verification procedure is invasive, risky, or expensive. The selection for disease examination might depend on the results of the diagnostic test and other clinical characteristics of the patients, which in turn can cause bias in estimates of the VUS. This bias is referred to as verification bias. Existing verification bias correction in three-way ROC analysis focuses on ordinal tests. We propose verification bias-correction methods to construct ROC surface and estimate the VUS for a continuous diagnostic test, based on inverse probability weighting. By applying U-statistics theory, we develop asymptotic properties for the estimator. A Jackknife estimator of variance is also derived. Extensive simulation studies are performed to evaluate the performance of the new estimators in terms of bias correction and variance. The proposed methods are used to assess the ability of a biomarker to accurately identify stages of Alzheimer's disease. © 2016 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
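
    A simplified sketch of the inverse-probability-weighting idea (hypothetical data and a deliberately naive triple loop; the paper's estimator and its U-statistics asymptotics are more involved):

```python
import numpy as np
from itertools import product

def ipw_vus(test, disease, verified, pi):
    """VUS over verified subjects, each weighted by 1 / P(verification).

    `disease` (stages 1..3) is only used where `verified` is True, mirroring
    a design in which the gold standard is applied to a subset of subjects.
    """
    w = 1.0 / pi
    idx = [np.where((disease == s) & verified)[0] for s in (1, 2, 3)]
    num = den = 0.0
    for i, j, k in product(*idx):
        wt = w[i] * w[j] * w[k]
        num += wt * (test[i] < test[j] < test[k])
        den += wt
    return num / den

rng = np.random.default_rng(2)
n = 300
disease = rng.integers(1, 4, n)                  # true stage 1..3
test = disease + rng.normal(0, 1, n)             # continuous marker
pi = np.clip(0.2 + 0.2 * (test - test.min()), 0.05, 1.0)  # depends on test
verified = rng.random(n) < pi
print(f"IPW VUS ~ {ipw_vus(test, disease, verified, pi):.3f}")
```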

  2. The use of robots for arms control treaty verification

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Michalowski, S.J.

    1991-01-01

    Many aspects of the superpower relationship now present a new set of challenges and opportunities, including the vital area of arms control. This report addresses one such possibility: the use of robots for the verification of arms control treaties. The central idea of this report is far from commonly accepted; in fact, it was encountered only once in the bibliographic review phase of the project. Nonetheless, the incentive for using robots is simple and coincides with that of industrial applications: to replace or supplement human activity in the performance of tasks for which human participation is unnecessary, undesirable, impossible, too dangerous or too expensive. As in industry, robots should replace workers (in this case, arms control inspectors) only when questions of efficiency, reliability, safety, security and cost-effectiveness have been answered satisfactorily. In writing this report, it is not our purpose to strongly advocate the application of robots in verification. Rather, we wish to explore the significant aspects, pro and con, of applying experience from the field of flexible automation to the complex task of assuring arms control treaty compliance. We want to establish a framework for further discussion of this topic and to define criteria for evaluating future proposals. The author's expertise is in robots, not arms control. His practical experience has been in developing systems for use in the rehabilitation of severely disabled persons (such as quadriplegics), who can use robots for assistance during activities of everyday living, as well as in vocational applications. This creates a special interest in implementations that, in some way, include a human operator in the control scheme of the robot. As we hope to show in this report, such interactive systems offer the greatest promise of making a contribution to the challenging problems of treaty verification. 15 refs.

  3. Clinical evaluation of 4D PET motion compensation strategies for treatment verification in ion beam therapy

    NASA Astrophysics Data System (ADS)

    Gianoli, Chiara; Kurz, Christopher; Riboldi, Marco; Bauer, Julia; Fontana, Giulia; Baroni, Guido; Debus, Jürgen; Parodi, Katia

    2016-06-01

    A clinical trial named PROMETHEUS is currently ongoing for inoperable hepatocellular carcinoma (HCC) at the Heidelberg Ion Beam Therapy Center (HIT, Germany). In this framework, 4D PET-CT datasets are acquired shortly after the therapeutic treatment to compare the irradiation-induced PET image with a Monte Carlo PET prediction resulting from the simulation of treatment delivery. The extremely low count statistics of this measured PET image represents a major limitation of the technique, especially in the presence of target motion. The purpose of the study is to investigate two different 4D PET motion compensation strategies towards the recovery of the whole count statistics for improved image quality of the 4D PET-CT datasets for PET-based treatment verification. The well-known 4D-MLEM reconstruction algorithm, which embeds the motion compensation in the reconstruction process of the 4D PET sinograms, was compared to a recently proposed pre-reconstruction motion compensation strategy, which operates in the sinogram domain by applying the motion compensation to the 4D PET sinograms. With reference to phantom and patient datasets, advantages and drawbacks of the two 4D PET motion compensation strategies were identified. The 4D-MLEM algorithm was strongly affected by inverse inconsistency of the motion model but demonstrated the capability to mitigate noise break-up effects. Conversely, the pre-reconstruction warping showed less sensitivity to inverse inconsistency but also more noise in the reconstructed images. The comparison relied on quantification of PET activity and ion range differences, which typically yielded similar results. The study demonstrated that treatment verification of moving targets could be accomplished by relying on the image quality of the whole count statistics, as obtained from the application of 4D PET motion compensation strategies. In particular, the pre-reconstruction warping was shown to represent a promising choice when combined with intra-reconstruction smoothing.

  4. Verification and classification bias interactions in diagnostic test accuracy studies for fine-needle aspiration biopsy.

    PubMed

    Schmidt, Robert L; Walker, Brandon S; Cohen, Michael B

    2015-03-01

    Reliable estimates of accuracy are important for any diagnostic test. Diagnostic accuracy studies are subject to unique sources of bias. Verification bias and classification bias are 2 sources of bias that commonly occur in diagnostic accuracy studies. Statistical methods are available to estimate the impact of these sources of bias when they occur alone. The impact of interactions when these types of bias occur together has not been investigated. We developed mathematical relationships to show the combined effect of verification bias and classification bias. A wide range of case scenarios were generated to assess the impact of bias components and interactions on total bias. Interactions between verification bias and classification bias caused overestimation of sensitivity and underestimation of specificity. Interactions had more effect on sensitivity than specificity. Sensitivity was overestimated by at least 7% in approximately 6% of the tested scenarios. Specificity was underestimated by at least 7% in less than 0.1% of the scenarios. Interactions between verification bias and classification bias create distortions in accuracy estimates that are greater than would be predicted from each source of bias acting independently. © 2014 American Cancer Society.
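
    A hedged numeric illustration of verification bias acting alone (hypothetical counts, not taken from the paper); the reweighting shown at the end is the classic Begg-Greenes-style correction:

```python
# Full (usually unobserved) 2x2 counts: sensitivity 0.80, specificity 0.90.
tp, fn, fp, tn = 800, 200, 900, 8100
p_verify_neg = 0.10            # only 10% of test-negatives are verified
v_tp, v_fp = tp, fp            # all test-positives verified
v_fn, v_tn = fn * p_verify_neg, tn * p_verify_neg

naive_sens = v_tp / (v_tp + v_fn)        # 0.976: inflated from 0.800
naive_spec = v_tn / (v_tn + v_fp)        # 0.474: deflated from 0.900
w = 1 / p_verify_neg                     # reweight verified negatives
corrected_sens = v_tp / (v_tp + v_fn * w)        # back to 0.800
corrected_spec = v_tn * w / (v_tn * w + v_fp)    # back to 0.900
print(naive_sens, naive_spec, corrected_sens, corrected_spec)
```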

  5. Electronic Cigarette Sales to Minors via the Internet

    PubMed Central

    Williams, Rebecca S.; Derrick, Jason; Ribisl, Kurt M.

    2015-01-01

    Importance Electronic cigarettes (e-cigarettes) entered the US market in 2007 and, with little regulatory oversight, grew into a $2-billion-a-year industry by 2013. The Centers for Disease Control and Prevention has reported a trend of increasing e-cigarette use among teens, with use rates doubling from 2011 to 2012. While several studies have documented that teens can and do buy cigarettes online, to our knowledge, no studies have yet examined age verification among Internet tobacco vendors selling e-cigarettes. Objective To estimate the extent to which minors can successfully purchase e-cigarettes online and assess compliance with North Carolina's 2013 e-cigarette age-verification law. Design, Setting, and Participants In this cross-sectional study conducted from February 2014 to June 2014, 11 nonsmoking minors aged 14 to 17 years made supervised e-cigarette purchase attempts from 98 Internet e-cigarette vendors. Purchase attempts were made at the University of North Carolina Internet Tobacco Vendors Study project offices using credit cards. Main Outcome and Measure Rate at which minors can successfully purchase e-cigarettes on the Internet. Results Minors successfully received deliveries of e-cigarettes from 76.5% of purchase attempts, with no attempts by delivery companies to verify their ages at delivery and 95% of delivered orders simply left at the door. All delivered packages came from shipping companies that, according to company policy or federal regulation, do not ship cigarettes to consumers. Of the total orders, 18 failed for reasons unrelated to age verification. Only 5 of the remaining 80 youth purchase attempts were rejected owing to age verification, resulting in a youth buy rate of 93.7%. None of the vendors complied with North Carolina's e-cigarette age-verification law. Conclusions and Relevance Minors are easily able to purchase e-cigarettes from the Internet because of an absence of age-verification measures used by Internet e-cigarette vendors. Federal law should require and enforce rigorous age verification for all e-cigarette sales as with the federal PACT (Prevent All Cigarette Trafficking) Act's requirements for age verification in Internet cigarette sales. PMID:25730697
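
    Worked arithmetic behind the quoted buy rate: of the 98 purchase attempts, 18 failed for reasons unrelated to age verification, leaving 80 assessable attempts; 5 of these were rejected on age grounds, so minors succeeded in 75 of 80 attempts, i.e. 75/80 ≈ 93.7%.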

  6. Verification of the FtCayuga fault-tolerant microprocessor system. Volume 1: A case study in theorem prover-based verification

    NASA Technical Reports Server (NTRS)

    Srivas, Mandayam; Bickford, Mark

    1991-01-01

    The design and formal verification of a hardware system for a task that is an important component of a fault-tolerant computer architecture for flight control systems is presented. The hardware system implements an algorithm for obtaining interactive consistency (Byzantine agreement) among four microprocessors as a special instruction on the processors. The property verified ensures that an execution of the special instruction by the processors correctly accomplishes interactive consistency, provided certain preconditions hold. An assumption is made that the processors execute synchronously. For verification, the authors used a computer-aided hardware design verification tool, Spectool, and the theorem prover, Clio. A major contribution of the work is the demonstration of a significant fault-tolerant hardware design that is mechanically verified by a theorem prover.

  7. 30 CFR 250.913 - When must I resubmit Platform Verification Program plans?

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... Structures Platform Verification Program § 250.913 When must I resubmit Platform Verification Program plans? (a) You must resubmit any design verification, fabrication verification, or installation verification...

  8. INF and IAEA: A comparative analysis of verification strategy

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Scheinman, L.; Kratzer, M.

    1992-07-01

    This is the final report of a study on the relevance and possible lessons of Intermediate Range Nuclear Force (INF) verification to the International Atomic Energy Agency (IAEA) international safeguards activities.

  9. The advantage of being oneself: The role of applicant self-verification in organizational hiring decisions.

    PubMed

    Moore, Celia; Lee, Sun Young; Kim, Kawon; Cable, Daniel M

    2017-11-01

    In this paper, we explore whether individuals who strive to self-verify flourish or flounder on the job market. Using placement data from 2 very different field samples, we found that individuals rated by the organization as being in the top 10% of candidates were significantly more likely to receive a job offer if they have a stronger drive to self-verify. A third study, using a quasi-experimental design, explored the mechanism behind this effect and tested whether individuals who are high and low on this disposition communicate differently in a structured mock job interview. Text analysis (LIWC) of interview transcripts revealed systematic differences in candidates' language use as a function of their self-verification drives. These differences led an expert rater to perceive candidates with a strong drive to self-verify as less inauthentic and less misrepresentative than their low self-verifying peers, making her more likely to recommend these candidates for a job. Taken together, our results suggest that authentic self-presentation is an unidentified route to success on the job market, amplifying the chances that high-quality candidates can convert organizations' positive evaluations into tangible job offers. We discuss implications for job applicants, organizations, and the labor market. (PsycINFO Database Record (c) 2017 APA, all rights reserved).

  10. Galaxy-galaxy lensing in the Dark Energy Survey Science Verification data

    NASA Astrophysics Data System (ADS)

    Clampitt, J.; Sánchez, C.; Kwan, J.; Krause, E.; MacCrann, N.; Park, Y.; Troxel, M. A.; Jain, B.; Rozo, E.; Rykoff, E. S.; Wechsler, R. H.; Blazek, J.; Bonnett, C.; Crocce, M.; Fang, Y.; Gaztanaga, E.; Gruen, D.; Jarvis, M.; Miquel, R.; Prat, J.; Ross, A. J.; Sheldon, E.; Zuntz, J.; Abbott, T. M. C.; Abdalla, F. B.; Armstrong, R.; Becker, M. R.; Benoit-Lévy, A.; Bernstein, G. M.; Bertin, E.; Brooks, D.; Burke, D. L.; Carnero Rosell, A.; Carrasco Kind, M.; Cunha, C. E.; D'Andrea, C. B.; da Costa, L. N.; Desai, S.; Diehl, H. T.; Dietrich, J. P.; Doel, P.; Estrada, J.; Evrard, A. E.; Fausti Neto, A.; Flaugher, B.; Fosalba, P.; Frieman, J.; Gruendl, R. A.; Honscheid, K.; James, D. J.; Kuehn, K.; Kuropatkin, N.; Lahav, O.; Lima, M.; March, M.; Marshall, J. L.; Martini, P.; Melchior, P.; Mohr, J. J.; Nichol, R. C.; Nord, B.; Plazas, A. A.; Romer, A. K.; Sanchez, E.; Scarpine, V.; Schubnell, M.; Sevilla-Noarbe, I.; Smith, R. C.; Soares-Santos, M.; Sobreira, F.; Suchyta, E.; Swanson, M. E. C.; Tarle, G.; Thomas, D.; Vikram, V.; Walker, A. R.

    2017-03-01

    We present galaxy-galaxy lensing results from 139 deg² of Dark Energy Survey (DES) Science Verification (SV) data. Our lens sample consists of red galaxies, known as redMaGiC, which are specifically selected to have a low photometric redshift error and outlier rate. The lensing measurement has a total signal-to-noise ratio of 29 over scales 0.09 < R < 15 Mpc h⁻¹, including all lenses over a wide redshift range 0.2 < z < 0.8. Dividing the lenses into three redshift bins for this constant moving number density sample, we find no evidence for evolution in the halo mass with redshift. We obtain consistent results for the lensing measurement with two independent shear pipelines, NGMIX and IM3SHAPE. We perform a number of null tests on the shear and photometric redshift catalogues and quantify the resulting systematic uncertainties. Covariances from jackknife subsamples of the data are validated with a suite of 50 mock surveys. The results and systematic checks in this work provide a critical input for future cosmological and galaxy evolution studies with the DES data and redMaGiC galaxy samples. We fit a halo occupation distribution (HOD) model, and demonstrate that our data constrain the mean halo mass of the lens galaxies, despite strong degeneracies between individual HOD parameters.
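
    A minimal sketch (not the DES pipeline) of a delete-one jackknife covariance estimate of a binned signal, the quantity validated here against the mock surveys; `signal_jk` is a hypothetical array of leave-one-out measurements:

```python
import numpy as np

def jackknife_covariance(signal_jk: np.ndarray) -> np.ndarray:
    """signal_jk: (n_subsamples, n_bins) leave-one-out measurements."""
    n = signal_jk.shape[0]
    resid = signal_jk - signal_jk.mean(axis=0)
    return (n - 1) / n * resid.T @ resid

jk = np.random.default_rng(6).normal(1.0, 0.05, size=(100, 10))
cov = jackknife_covariance(jk)      # (10, 10) covariance of the 10 bins
print(np.sqrt(np.diag(cov)))        # per-bin jackknife errors
```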

  11. Independent verification and validation report of Washington state ferries' wireless high speed data project

    DOT National Transportation Integrated Search

    2008-06-30

    The following Independent Verification and Validation (IV&V) report documents and presents the results of a study of the Washington State Ferries Prototype Wireless High Speed Data Network. The purpose of the study was to evaluate and determine if re...

  12. THE DEVELOPMENT AND INTER-LABORATORY VERIFICATION OF LC-MS LIBRARIES FOR ORGANIC CHEMICALS OF ENVIRONMENTAL CONCERN

    EPA Science Inventory

    The development, verification, and comparison study between LC-MS libraries for two manufacturers’ instruments and a verified protocol are discussed. The LC-MS library protocol was verified through an inter-laboratory study that involved Federal, State, and private laboratories. ...

  13. A verification library for multibody simulation software

    NASA Technical Reports Server (NTRS)

    Kim, Sung-Soo; Haug, Edward J.; Frisch, Harold P.

    1989-01-01

    A multibody dynamics verification library that maintains and manages test and validation data is proposed, based on RRC robot arm and CASE backhoe validation and on a comparative study of DADS, DISCOS, and CONTOPS, which are existing public-domain and commercial multibody dynamics simulation programs. Using simple representative problems, simulation results from each program are cross-checked, and the validation results are presented. Functionalities of the verification library are defined in order to automate the validation procedure.

  14. Generic Verification Protocol for Verification of Online Turbidimeters

    EPA Science Inventory

    This protocol provides generic procedures for implementing a verification test for the performance of online turbidimeters. The verification tests described in this document will be conducted under the Environmental Technology Verification (ETV) Program. Verification tests will...

  15. Observation and confirmation of six strong-lensing systems in the Dark Energy Survey science verification data

    DOE PAGES

    Nord, B.; Buckley-Geer, E.; Lin, H.; ...

    2016-08-05

    We report the observation and confirmation of the first group- and cluster-scale strong gravitational lensing systems found in Dark Energy Survey data. Through visual inspection of data from the Science Verification season, we identified 53 candidate systems. We then obtained spectroscopic follow-up of 21 candidates using the Gemini Multi-Object Spectrograph at the Gemini South telescope and the Inamori-Magellan Areal Camera and Spectrograph at the Magellan/Baade telescope. With this follow-up, we confirmed six candidates as gravitational lenses: three of the systems are newly discovered, and the remaining three were previously known. Of the 21 observed candidates, the remaining 15 either were not detected in spectroscopic observations, were observed and did not exhibit continuum emission (or spectral features), or were ruled out as lensing systems. The confirmed sample consists of one group-scale and five galaxy-cluster-scale lenses. The lensed sources range in redshift z ~ 0.80–3.2 and in i-band surface brightness i_SB ~ 23–25 mag arcsec⁻² (2'' aperture). For each of the six systems, we estimate the Einstein radius θ_E and the enclosed mass M_enc, which have ranges θ_E ~ 5''–9'' and M_enc ~ 8 × 10¹² to 6 × 10¹³ M_⊙, respectively.
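
    The enclosed mass follows from the Einstein radius through the standard point-lens relation sketched below (an illustration; the paper's actual estimator may treat the lens model in more detail):

```latex
% D_l, D_s, D_ls: angular diameter distances to the lens, to the source,
% and between lens and source. Illustrative form only.
\[
  \theta_E^{2} = \frac{4 G M_{\mathrm{enc}}}{c^{2}}\,\frac{D_{ls}}{D_l D_s}
  \quad\Longrightarrow\quad
  M_{\mathrm{enc}} = \frac{c^{2}\,\theta_E^{2}}{4G}\,\frac{D_l D_s}{D_{ls}}
\]
```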

  17. Quantification of Greenhouse Gas Emission Rates from strong Point Sources by Airborne IPDA-Lidar Measurements: Methodology and Experimental Results

    NASA Astrophysics Data System (ADS)

    Ehret, G.; Amediek, A.; Wirth, M.; Fix, A.; Kiemle, C.; Quatrevalet, M.

    2016-12-01

    We report on a new method and on the first demonstration to quantify emission rates from strong greenhouse gas (GHG) point sources using airborne Integrated Path Differential Absorption (IPDA) lidar measurements. In order to build trust in the emission rates self-reported by countries, verification against independent monitoring systems is a prerequisite for checking the reported budget. A significant fraction of the total anthropogenic emission of CO2 and CH4 originates from localized strong point sources at large energy production sites or landfills. Neither is monitored with sufficient accuracy by the current observation system. There is a debate as to whether airborne remote sensing could fill this gap by inferring emission rates from budgeting or from Gaussian plume inversion approaches, whereby measurements of the GHG column abundance beneath the aircraft can be used to constrain inverse models. In contrast to passive sensors, the use of an active instrument like CHARM-F for such emission verification measurements is new. CHARM-F is a new airborne IPDA lidar devised for the German research aircraft HALO for the simultaneous measurement of the column-integrated dry-air mixing ratios of CO2 and CH4, commonly denoted as XCO2 and XCH4, respectively. It has successfully been tested in a series of flights over Central Europe to assess its performance under various reflectivity conditions and in strongly varying topography such as the Alps. The analysis of a methane plume measured in the crosswind direction of a coal mine ventilation shaft revealed an instantaneous emission rate of 9.9 ± 1.7 kt CH4 yr⁻¹. We discuss the methodology of our point source estimation approach and give an outlook on the CoMet field experiment scheduled in 2017 for the measurement of anthropogenic and natural GHG emissions by a combination of active and passive remote sensing instruments on research aircraft.
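
    The plume inversion described above typically rests on a cross-sectional flux (mass-balance) relation of the following form; this is a hedged sketch, not necessarily the exact formulation used with CHARM-F:

```latex
% Q: source emission rate; u: mean wind speed normal to the flight track;
% \Delta\Omega(y): measured column enhancement (mass per unit area) along
% the track coordinate y crossing the plume.
\[
  Q \;=\; u \int_{\mathrm{plume}} \Delta\Omega(y)\,\mathrm{d}y
\]
```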

  19. Verification Assessment of Flow Boundary Conditions for CFD Analysis of Supersonic Inlet Flows

    NASA Technical Reports Server (NTRS)

    Slater, John W.

    2002-01-01

    Boundary conditions for subsonic inflow, bleed, and subsonic outflow as implemented into the WIND CFD code are assessed with respect to verification for steady and unsteady flows associated with supersonic inlets. Verification procedures include grid convergence studies and comparisons to analytical data. The objective is to examine errors, limitations, capabilities, and behavior of the boundary conditions. Computational studies were performed on configurations derived from a "parameterized" supersonic inlet. These include steady supersonic flows with normal and oblique shocks, steady subsonic flow in a diffuser, and unsteady flow with the propagation and reflection of an acoustic disturbance.

  20. Life cycle management of analytical methods.

    PubMed

    Parr, Maria Kristina; Schmidt, Alexander H

    2018-01-05

    In modern process management, the life cycle concept is gaining more and more importance. It focuses on the total costs of the process, from investment to operation and finally retirement. In recent years, an increasing interest in this concept has also developed for analytical procedures. The life cycle of an analytical method consists of design, development, validation (including instrument qualification, continuous method performance verification and method transfer) and finally retirement of the method. It appears that regulatory bodies have also increased their awareness of life cycle management for analytical methods. Thus, the International Council for Harmonisation of Technical Requirements for Pharmaceuticals for Human Use (ICH), as well as the United States Pharmacopeial Forum, discuss the introduction of new guidelines that include life cycle management of analytical methods. The US Pharmacopeia (USP) Validation and Verification expert panel has already proposed a new General Chapter 〈1220〉 "The Analytical Procedure Lifecycle" for integration into the USP. Furthermore, a growing interest in life cycle management is also seen in the non-regulated environment. Quality-by-design based method development results in increased method robustness. This decreases the effort needed for method performance verification and post-approval changes, and minimizes the risk of method-related out-of-specification results, which strongly contributes to reduced costs of the method during its life cycle. Copyright © 2017 Elsevier B.V. All rights reserved.

  1. Verification Games: Crowd-Sourced Formal Verification

    DTIC Science & Technology

    2016-03-01

    Verification Games: Crowd-Sourced Formal Verification. University of Washington, March 2016, Final Technical Report; dates covered June 2012 – September 2015; contract FA8750... Abstract (truncated): Over the more than three years of the project Verification Games: Crowd-sourced...

  2. Fostering group identification and creativity in diverse groups: the role of individuation and self-verification.

    PubMed

    Swann, William B; Kwan, Virginia S Y; Polzer, Jeffrey T; Milton, Laurie P

    2003-11-01

    A longitudinal study examined the interplay of identity negotiation processes and diversity in small groups of master of business administration (MBA) students. When perceivers formed relatively positive impressions of other group members, higher diversity predicted more individuation of targets. When perceivers formed relatively neutral impressions of other group members, however, higher diversity predicted less individuation of targets. Individuation at the outset of the semester predicted self-verification effects several weeks later, and self-verification, in turn, predicted group identification and creative task performance. The authors conclude that, contrary to self-categorization theory, fostering individuation and self-verification in diverse groups may maximize group identification and productivity.

  3. Verification of natural infection of peridomestic rodents by PCV2 on commercial swine farms.

    PubMed

    Pinheiro, Albanno Leonard Braz Campos; Bulos, Luiz Henrique Silva; Onofre, Thiago Souza; de Paula Gabardo, Michelle; de Carvalho, Otávio Valério; Fausto, Mariana Costa; Guedes, Roberto Maurício Carvalho; de Almeida, Márcia Rogéria; Silva Júnior, Abelardo

    2013-06-01

    The porcine circovirus-2 (PCV2) is the main agent responsible for porcine circovirus associated diseases (PCVAD). Few studies have been done regarding PCV2 infection in other species. The purpose of this study was to investigate the occurrence of PCV2 infection in the peridomestic rodent species Mus musculus and Rattus rattus on commercial pig farms in Brazil. Immunohistochemistry assay demonstrated PCV2 in the spleen, lung and kidney. Viral DNA was detected in tissues by nested PCR assay. Partial sequences of PCV2 genomes detected in the rodents had strong identity with gene sequences of PCV2 isolates from pigs. These results show that the studied peridomestic rodent species can be naturally infected by PCV2. However, further studies are needed to confirm PCV2 transmission from rodents to pigs. Copyright © 2012 Elsevier Ltd. All rights reserved.

  4. Developing a Test for Assessing Elementary Students' Comprehension of Science Texts

    ERIC Educational Resources Information Center

    Wang, Jing-Ru; Chen, Shin-Feng; Tsay, Reuy-Fen; Chou, Ching-Ting; Lin, Sheau-Wen; Kao, Huey-Lien

    2012-01-01

    This study reports on the process of developing a test to assess students' reading comprehension of scientific materials and on the statistical results of the verification study. A combination of classic test theory and item response theory approaches was used to analyze the assessment data from a verification study. Data analysis indicates the…

  5. Industrial methodology for process verification in research (IMPROVER): toward systems biology verification

    PubMed Central

    Meyer, Pablo; Hoeng, Julia; Rice, J. Jeremy; Norel, Raquel; Sprengel, Jörg; Stolle, Katrin; Bonk, Thomas; Corthesy, Stephanie; Royyuru, Ajay; Peitsch, Manuel C.; Stolovitzky, Gustavo

    2012-01-01

    Motivation: Analyses and algorithmic predictions based on high-throughput data are essential for the success of systems biology in academic and industrial settings. Organizations, such as companies and academic consortia, conduct large multi-year scientific studies that entail the collection and analysis of thousands of individual experiments, often over many physical sites and with internal and outsourced components. To extract maximum value, the interested parties need to verify the accuracy and reproducibility of data and methods before the initiation of such large multi-year studies. However, systematic and well-established verification procedures do not exist for automated collection and analysis workflows in systems biology, which could lead to inaccurate conclusions. Results: We present here a review of the current state of systems biology verification and a detailed methodology to address its shortcomings. This methodology, named 'Industrial Methodology for Process Verification in Research' or IMPROVER, consists of evaluating a research program by dividing a workflow into smaller building blocks that are individually verified. The verification of each building block can be done internally by members of the research program or externally by 'crowd-sourcing' to an interested community. www.sbvimprover.com Implementation: This methodology could become the preferred choice to verify systems biology research workflows that are becoming increasingly complex and sophisticated in industrial and academic settings. Contact: gustavo@us.ibm.com PMID:22423044

  6. EXAMINING THE ROLE AND RESEARCH CHALLENGES OF SOCIAL MEDIA AS A TOOL FOR NONPROLIFERATION AND ARMS CONTROL TREATY VERIFICATION

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Henry, Michael J.; Cramer, Nicholas O.; Benz, Jacob M.

    Traditional arms control treaty verification activities typically involve a combination of technical measurements via physical and chemical sensors, state declarations, political agreements, and on-site inspections involving international subject matter experts. However, the ubiquity of the internet, and the electronic sharing of data that it enables, has made available a wealth of open source information with the potential to benefit verification efforts. Open source information is already being used by organizations such as the International Atomic Energy Agency to support the verification of state-declared information, prepare inspectors for in-field activities, and maintain situational awareness. The recent explosion in social media use has opened new doors to exploring the attitudes, moods, and activities around a given topic. Social media platforms, such as Twitter, Facebook, and YouTube, offer an opportunity for individuals, as well as institutions, to participate in a global conversation at minimal cost. Social media data can also provide a more data-rich environment, with text data being augmented with images, videos, and location data. The research described in this paper investigates the utility of applying social media signatures as potential arms control and nonproliferation treaty verification tools and technologies, as determined through a series of case studies. The treaty-relevant events that these case studies touch upon include detection of undeclared facilities or activities, determination of unknown events recorded by the International Monitoring System (IMS), and the global media response to the occurrence of an Indian missile launch. The case studies examine how social media can be used to fill an information gap and provide additional confidence to a verification activity. The case studies represent, either directly or through a proxy, instances where social media information may be available that could potentially augment the evaluation of an event. The goal of this paper is to instigate a discussion within the verification community as to where and how social media can be effectively utilized to complement and enhance traditional treaty verification efforts. In addition, this paper seeks to identify areas of future research and development necessary to adapt social media analytic tools and techniques, and to form the seed for social media analytics to aid and inform arms control and nonproliferation policymakers and analysts. While social media analysis (as well as open source analysis as a whole) will never be able to replace traditional arms control verification measures, it does supply unique signatures that can augment existing analysis.

  7. Rapid determination of alpha emitters using Actinide resin.

    PubMed

    Navarro, N; Rodriguez, L; Alvarez, A; Sancho, C

    2004-01-01

    The European Commission has recently published the recommended radiological protection criteria for the clearance of building and building rubble from the dismantling of nuclear installations. Radionuclide specific clearance levels for actinides are very low (between 0.1 and 1 Bq g(-1)). The prevalence of natural radionuclides in rubble materials makes the verification of these levels by direct alpha counting impossible. The capability of Actinide resin (Eichrom Industries, Inc.) for extracting plutonium and americium from rubble samples has been tested in this work. Besides a strong affinity for actinides in the tri, tetra and hexavalent oxidation states, this extraction chromatographic resin presents an easy recovery of absorbed radionuclides. The retention capability was evaluated on rubble samples spiked with certified radionuclide standards (239Pu and 241Am). Samples were leached with nitric acid, passed through a chromatographic column containing the resin and the elution fraction was measured by LSC. Actinide retention varies from 60% to 80%. Based on these results, a rapid method for the verification of clearance levels for actinides in rubble samples is proposed.

  8. Measurement of a True V̇O2max during a Ramp Incremental Test Is Not Confirmed by a Verification Phase.

    PubMed

    Murias, Juan M; Pogliaghi, Silvia; Paterson, Donald H

    2018-01-01

    The accuracy of an exhaustive ramp incremental (RI) test to determine maximal oxygen uptake (V̇O2max) was recently questioned and the utilization of a verification phase proposed as a gold standard. This study compared the oxygen uptake (V̇O2) during a RI test to that obtained during a verification phase aimed to confirm attainment of V̇O2max. Sixty-one healthy males [31 older (O) 65 ± 5 yrs; 30 younger (Y) 25 ± 4 yrs] performed a RI test (15-20 W/min for O and 25 W/min for Y). At the end of the RI test, a 5-min recovery period was followed by a verification phase of constant load cycling to fatigue at either 85% (n = 16) or 105% (n = 45) of the peak power output obtained from the RI test. The highest V̇O2 after the RI test (39.8 ± 11.5 mL·kg⁻¹·min⁻¹) and the verification phase (40.1 ± 11.2 mL·kg⁻¹·min⁻¹) were not different (p = 0.33) and they were highly correlated (r = 0.99; p < 0.01). This response was not affected by age or intensity of the verification phase. The Bland-Altman analysis revealed a very small absolute bias (-0.25 mL·kg⁻¹·min⁻¹, not different from 0) and a precision of ±1.56 mL·kg⁻¹·min⁻¹ between measures. This study indicated that a verification phase does not highlight an under-estimation of V̇O2max derived from a RI test, in a large and heterogeneous group of healthy younger and older men naïve to laboratory testing procedures. Moreover, only minor within-individual differences were observed between the maximal V̇O2 elicited during the RI and the verification phase. Thus a verification phase does not add any validation of the determination of a V̇O2max. Therefore, the recommendation that a verification phase should become a gold standard procedure, although initially appealing, is not supported by the experimental data.
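
    A minimal sketch (not the authors' code) of the Bland-Altman quantities quoted above, for paired ramp-test and verification-phase values; the data here are hypothetical:

```python
import numpy as np

def bland_altman(a: np.ndarray, b: np.ndarray):
    """Mean difference (bias) and 1.96*SD precision of paired measures."""
    diff = a - b
    return diff.mean(), 1.96 * diff.std(ddof=1)

rng = np.random.default_rng(3)
ramp = rng.normal(40, 11, 61)                     # V̇O2, mL/kg/min
verification = ramp + rng.normal(-0.25, 0.8, 61)  # small within-subject shift
bias, precision = bland_altman(verification, ramp)
print(f"bias = {bias:.2f}, precision ~ +/-{precision:.2f} mL/kg/min")
```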

  9. Software engineering and automatic continuous verification of scientific software

    NASA Astrophysics Data System (ADS)

    Piggott, M. D.; Hill, J.; Farrell, P. E.; Kramer, S. C.; Wilson, C. R.; Ham, D.; Gorman, G. J.; Bond, T.

    2011-12-01

    Software engineering of scientific code is challenging for a number of reasons, including pressure to publish and a lack of awareness of the pitfalls of software engineering among scientists. The Applied Modelling and Computation Group at Imperial College is a diverse group of researchers that employs best-practice software engineering methods whilst developing open source scientific software. Our main code is Fluidity - a multi-purpose computational fluid dynamics (CFD) code that can be used for a wide range of scientific applications, from earth-scale mantle convection, through basin-scale ocean dynamics, to laboratory-scale classic CFD problems, and is coupled to a number of other codes, including nuclear radiation and solid modelling. Our software development infrastructure consists of a number of free tools that could be employed by any group that develops scientific code, and it has been developed over a number of years with many lessons learnt. A single code base is developed by over 30 people, for which we use bazaar for revision control, making good use of its strong branching and merging capabilities. Using features of Canonical's Launchpad platform, such as code review, blueprints for designing features and bug reporting, gives the group, partners and other Fluidity users an easy-to-use platform to collaborate and allows the induction of new members of the group into an environment where software development forms a central part of their work. The code repository is coupled to an automated test and verification system which performs over 20,000 tests, including unit tests, short regression tests, code verification and large parallel tests. Included in these tests are build tests on HPC systems, including local and UK National HPC services. Testing code in this manner leads to a continuous verification process, not a discrete event performed once development has ceased. Much of the code verification is done against the "gold standard" of analytical solutions via the method of manufactured solutions. By developing and verifying code in tandem we avoid a number of pitfalls in scientific software development, and we advocate similar procedures for other scientific code applications.
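
    As a concrete illustration of the method of manufactured solutions mentioned above, here is a minimal sketch on a 1D diffusion problem (a hypothetical example, unrelated to Fluidity's actual test cases): pick an exact solution, derive the forcing that makes it exact, and check that the discrete error falls at the scheme's formal rate.

```python
import numpy as np

def forcing(x, k=1.0):
    # Manufactured: for u_m(x) = sin(pi x) and -k u'' = f,
    # the required forcing is f(x) = k pi^2 sin(pi x).
    return k * np.pi**2 * np.sin(np.pi * x)

def solve_poisson(n, k=1.0):
    """Second-order central differences for -k u'' = f, u(0)=u(1)=0."""
    x = np.linspace(0.0, 1.0, n + 1)
    h = x[1] - x[0]
    A = (np.diag(np.full(n - 1, 2.0))
         - np.diag(np.ones(n - 2), 1)
         - np.diag(np.ones(n - 2), -1)) * k / h**2
    u = np.zeros(n + 1)
    u[1:-1] = np.linalg.solve(A, forcing(x[1:-1], k))
    return x, u

for n in (16, 32, 64):
    x, u = solve_poisson(n)
    print(n, np.max(np.abs(u - np.sin(np.pi * x))))
# The max error should fall by ~4x per doubling of n (second order).
```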

  10. A Methodology for Evaluating Artifacts Produced by a Formal Verification Process

    NASA Technical Reports Server (NTRS)

    Siminiceanu, Radu I.; Miner, Paul S.; Person, Suzette

    2011-01-01

    The goal of this study is to produce a methodology for evaluating the claims and arguments employed in, and the evidence produced by, formal verification activities. To illustrate the process, we conduct a full assessment of a representative case study for the Enabling Technology Development and Demonstration (ETDD) program. We assess the model checking and satisfiability solving techniques as applied to a suite of abstract models of fault-tolerant algorithms which were selected to be deployed in Orion, namely the TTEthernet startup services specified and verified in the Symbolic Analysis Laboratory (SAL) by TTTech. To this end, we introduce the Modeling and Verification Evaluation Score (MVES), a metric that is intended to estimate the amount of trust that can be placed on the evidence that is obtained. The results of the evaluation process and the MVES can then be used by non-experts and evaluators in assessing the credibility of the verification results.

  11. Results from an Independent View on The Validation of Safety-Critical Space Systems

    NASA Astrophysics Data System (ADS)

    Silva, N.; Lopes, R.; Esper, A.; Barbosa, R.

    2013-08-01

    Independent verification and validation (IV&V) has been a key process for decades and is considered in several international standards. One of the activities described in the "ESA ISVV Guide" is independent test verification (stated as Integration/Unit Test Procedures and Test Data Verification). This activity is commonly overlooked, since customers do not really see the added value of thoroughly checking the validation team's work (it could be seen as testing the testers' work). This article presents the consolidated results of a large set of independent test verification activities, including the main difficulties, the results obtained, and the advantages and disadvantages of these activities for industry. This study will support customers in opting in or out of this task in future IV&V contracts, since we provide concrete results from real case studies in the space embedded systems domain.

  12. A new method to address verification bias in studies of clinical screening tests: cervical cancer screening assays as an example.

    PubMed

    Xue, Xiaonan; Kim, Mimi Y; Castle, Philip E; Strickler, Howard D

    2014-03-01

    Studies to evaluate clinical screening tests often face the problem that the "gold standard" diagnostic approach is costly and/or invasive. It is therefore common to verify only a subset of negative screening tests using the gold standard method. However, undersampling the screen negatives can lead to substantial overestimation of the sensitivity and underestimation of the specificity of the diagnostic test. Our objective was to develop a simple and accurate statistical method to address this "verification bias." We developed a weighted generalized estimating equation approach to estimate, in a single model, the accuracy (eg, sensitivity/specificity) of multiple assays and simultaneously compare results between assays while addressing verification bias. This approach can be implemented using standard statistical software. Simulations were conducted to assess the proposed method. An example is provided using a cervical cancer screening trial that compared the accuracy of human papillomavirus and Pap tests, with histologic data as the gold standard. The proposed approach performed well in estimating and comparing the accuracy of multiple assays in the presence of verification bias. The proposed approach is an easy to apply and accurate method for addressing verification bias in studies of multiple screening methods. Copyright © 2014 Elsevier Inc. All rights reserved.
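
    A hedged sketch of the weighting idea underlying the proposed estimator (hypothetical data; in the paper the weights enter a generalized estimating equation so that several assays can be estimated and compared in one model):

```python
import numpy as np

rng = np.random.default_rng(5)
n = 5000
disease = rng.random(n) < 0.05
assay = (disease & (rng.random(n) < 0.90)) | (~disease & (rng.random(n) < 0.06))
pi = np.where(assay, 1.0, 0.15)        # screen-negatives verified less often
verified = rng.random(n) < pi

w = verified / pi                      # weight 0 for unverified subjects
sens = np.sum(w * (assay & disease)) / np.sum(w * disease)
spec = np.sum(w * (~assay & ~disease)) / np.sum(w * ~disease)
print(f"IPW sensitivity = {sens:.3f}, specificity = {spec:.3f}")
```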

  13. Benchmark solution of the dynamic response of a spherical shell at finite strain

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Versino, Daniele; Brock, Jerry S.

    2016-09-28

    Our paper describes the development of high-fidelity solutions for the study of homogeneous (elastic and inelastic) spherical shells subject to dynamic loading and undergoing finite deformations. The goal of the activity is to provide high-accuracy results that can be used as benchmark solutions for the verification of computational physics codes. The equilibrium equations for the geometrically nonlinear problem are solved through a mode expansion of the displacement field, and the boundary conditions are enforced in strong form. Time integration is performed through high-order implicit Runge–Kutta schemes. Finally, we evaluate the accuracy and convergence of the proposed method by means of numerical examples with finite deformations, material nonlinearities and inelasticity.

  14. First XMM-Newton Observations of a Cataclysmic Variable II: Spectral Studies of OY Car

    NASA Technical Reports Server (NTRS)

    Ramsay, Gavin; Cordova, France; Cottam, Jean; Mason, Keith; Much, Rudu; Osborne, Julian; Pandel, Dirk; Poole, Tracey; Wheatley, Peter

    2000-01-01

    We present XMM-Newton X-ray spectra of the disc-accreting cataclysmic variable OY Car, which were obtained during the performance verification phase of the mission. These data were taken 4 days after a short outburst. In the EPIC spectra we find strong iron Kβ emission with weaker iron Kα emission, together with silicon and sulphur lines. The spectra are best fitted with a three-temperature plasma model with a partial covering absorber. Multiple-temperature emission is confirmed by the emission lines seen in the RGS spectrum and by the H-like/He-like intensity ratios for iron and sulphur, which imply temperatures of approx. 7 keV and approx. 3 keV, respectively.

  15. Verification of the effects of Schumann frequency range electromagnetic fields on the human cardiovascular system

    NASA Astrophysics Data System (ADS)

    Tuzhilkin, D. A.; Borodin, A. S.

    2017-11-01

    The results of a study of variations in the electromagnetic background parameters in the Schumann resonance frequency range, and of the variability indices of the human heart period during free activity, are presented on the basis of 24-hour synchronous monitoring data. It is shown that the integral estimate of the dependence of the heart rate variability indices on the Schumann resonance parameters is extremely weak. However, a differential estimate of this dependence, computed separately for characteristic intervals of the day corresponding to different motor activity of the subjects, becomes significantly higher. The number of volunteers for whom this dependence shows a strong correlation reaches 35 percent of the sample in some cases.

  16. Spacecraft attitude calibration/verification baseline study

    NASA Technical Reports Server (NTRS)

    Chen, L. C.

    1981-01-01

    A baseline study for a generalized spacecraft attitude calibration/verification system is presented. It can be used to define software specifications for three major functions required by a mission: the pre-launch parameter observability and data collection strategy study; the in-flight sensor calibration; and the post-calibration attitude accuracy verification. Analytical considerations are given for both single-axis and three-axis spacecraft. The three-axis attitudes considered include the inertial-pointing attitudes, the reference-pointing attitudes, and attitudes undergoing specific maneuvers. The attitude sensors and hardware considered include the Earth horizon sensors, the plane-field Sun sensors, the coarse and fine two-axis digital Sun sensors, the three-axis magnetometers, the fixed-head star trackers, and the inertial reference gyros.

  17. ENVIRONMENTAL TECHNOLOGY VERIFICATION (ETV) PROGRAM CASE STUDIES: DEMONSTRATING PROGRAM OUTCOMES

    EPA Science Inventory

    The U.S. Environmental Protection Agency (EPA) Environmental Technology Verification (ETV) Program evaluates the performance of innovative air, water, pollution prevention and monitoring technologies that have the potential to improve human health and the environment. This bookle...

  18. Is it Code Imperfection or "Garbage In, Garbage Out"? Outline of Experiences from a Comprehensive ADR Code Verification

    NASA Astrophysics Data System (ADS)

    Zamani, K.; Bombardelli, F. A.

    2013-12-01

    The ADR equation describes many physical phenomena of interest in the field of water quality in natural streams and groundwater. In many cases, such as density-driven flow, multiphase reactive transport, and sediment transport, one or more terms in the ADR equation may become nonlinear. For that reason, numerical tools are the only practical choice to solve these PDEs. All numerical solvers developed for the transport equation need to undergo a code verification procedure before they are put into practice. Code verification is a mathematical activity to uncover failures and check for rigorous discretization of PDEs and implementation of initial/boundary conditions. In the context of computational PDEs, verification is not a well-defined procedure with a clear path. Thus, verification tests should be designed and implemented with in-depth knowledge of the numerical algorithms and the physics of the phenomena, as well as the mathematical behavior of the solution. Even test results need to be mathematically analyzed to distinguish between an inherent limitation of the algorithm and a coding error. Therefore, it is well known that code verification remains something of an art, in which innovative methods and case-based tricks are very common. This study presents the full verification of a general transport code. To that end, a complete test suite was designed to probe the ADR solver comprehensively and discover all possible imperfections. In this study we convey our experiences in finding several errors which were not detectable with routine verification techniques. We developed a test suite including hundreds of unit tests and system tests. The test package increases gradually in complexity, such that tests start simple and build up to the most sophisticated level. Appropriate verification metrics are defined for the required capabilities of the solver as follows: mass conservation, convergence order, capability in handling stiff problems, nonnegative concentration, shape preservation, and absence of spurious wiggles. Thereby, we provide objective, quantitative values as opposed to subjective, qualitative descriptions such as 'weak' or 'satisfactory' agreement with those metrics. We start testing from a simple case of unidirectional advection, then bidirectional advection and tidal flow, and build up to nonlinear cases. We design tests to check nonlinearity in velocity, dispersivity and reactions. For all of the mentioned cases we conduct mesh convergence tests. These tests compare the observed order of accuracy of the results against the formal order of accuracy of the discretization. The concealing effect of scales (Peclet and Damkohler numbers) on the mesh convergence study, and appropriate remedies, are also discussed. For the cases in which appropriate benchmarks for a mesh convergence study are not available, we utilize symmetry, complete Richardson extrapolation and the method of false injection to uncover bugs. Detailed discussions of the capabilities of the mentioned code verification techniques are given. Auxiliary subroutines for automation of the test suite and report generation are designed. All in all, the test package is not only a robust tool for code verification but also provides comprehensive insight into the ADR solver's capabilities. Such information is essential for any rigorous computational modeling of the ADR equation for surface/subsurface pollution transport.
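
    Since the mesh convergence tests described here hinge on comparing the observed order of accuracy against the formal order, a minimal sketch of that computation may help (assumed three-grid form with a constant refinement ratio; not the authors' test suite):

```python
import math

def observed_order(f_coarse: float, f_medium: float, f_fine: float,
                   r: float) -> float:
    """Observed order of accuracy from three grids refined by ratio r > 1."""
    return math.log(abs(f_coarse - f_medium) / abs(f_medium - f_fine), r)

# Example: errors shrinking 4x per grid halving indicate a 2nd-order scheme.
print(observed_order(1.04, 1.01, 1.0025, r=2.0))   # -> 2.0
```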

  19. Separating stages of arithmetic verification: An ERP study with a novel paradigm.

    PubMed

    Avancini, Chiara; Soltész, Fruzsina; Szűcs, Dénes

    2015-08-01

    In studies of arithmetic verification, participants typically encounter two operands and they carry out an operation on these (e.g. adding them). Operands are followed by a proposed answer and participants decide whether this answer is correct or incorrect. However, interpretation of results is difficult because multiple parallel, temporally overlapping numerical and non-numerical processes of the human brain may contribute to task execution. In order to overcome this problem here we used a novel paradigm specifically designed to tease apart the overlapping cognitive processes active during arithmetic verification. Specifically, we aimed to separate effects related to detection of arithmetic correctness, detection of the violation of strategic expectations, detection of physical stimulus properties mismatch and numerical magnitude comparison (numerical distance effects). Arithmetic correctness, physical stimulus properties and magnitude information were not task-relevant properties of the stimuli. We distinguished between a series of temporally highly overlapping cognitive processes which in turn elicited overlapping ERP effects with distinct scalp topographies. We suggest that arithmetic verification relies on two major temporal phases which include parallel running processes. Our paradigm offers a new method for investigating specific arithmetic verification processes in detail. Copyright © 2015 Elsevier Ltd. All rights reserved.

  20. Hierarchical Representation Learning for Kinship Verification.

    PubMed

    Kohli, Naman; Vatsa, Mayank; Singh, Richa; Noore, Afzel; Majumdar, Angshul

    2017-01-01

    Kinship verification has a number of applications such as organizing large collections of images and recognizing resemblances among humans. In this paper, first, a human study is conducted to understand the capabilities of the human mind and to identify the discriminatory areas of a face that provide kinship cues. The visual stimuli presented to the participants determine their ability to recognize kin relationships using the whole face as well as specific facial regions. The effects of participant gender and age and of the kin-relation pair of the stimulus are analyzed using quantitative measures such as accuracy, discriminability index d′, and perceptual information entropy. Utilizing the information obtained from the human study, a hierarchical kinship verification via representation learning (KVRL) framework is utilized to learn the representation of different face regions in an unsupervised manner. We propose a novel approach for feature representation termed filtered contractive deep belief networks (fcDBN). The proposed feature representation encodes relational information present in images using filters and a contractive regularization penalty. A compact representation of facial images of kin is extracted as an output from the learned model, and a multi-layer neural network is utilized to verify the kin accurately. A new WVU kinship database is created, which consists of multiple images per subject to facilitate kinship verification. The results show that the proposed deep learning framework (KVRL-fcDBN) yields state-of-the-art kinship verification accuracy on the WVU kinship database and on four existing benchmark data sets. Furthermore, kinship information is used as a soft biometric modality to boost the performance of face verification via product-of-likelihood-ratio and support vector machine based approaches. Using the proposed KVRL-fcDBN framework, an improvement of over 20% is observed in the performance of face verification.
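
    As a hedged illustration of the product-of-likelihood-ratios fusion mentioned in the last sentences (the Gaussian score models below are invented for demonstration, not the paper's learned distributions):

        import numpy as np
        from scipy.stats import norm

        # Hypothetical genuine/impostor score distributions for two modalities
        # (a face verifier and a kinship verifier), assumed Gaussian here.
        models = {
            "face":    {"gen": norm(0.7, 0.10), "imp": norm(0.3, 0.10)},
            "kinship": {"gen": norm(0.6, 0.15), "imp": norm(0.4, 0.15)},
        }

        def fused_likelihood_ratio(scores):
            """Product-of-likelihood-ratios fusion over independent modalities."""
            lr = 1.0
            for name, s in scores.items():
                m = models[name]
                lr *= m["gen"].pdf(s) / m["imp"].pdf(s)
            return lr

        lr = fused_likelihood_ratio({"face": 0.65, "kinship": 0.58})
        print("accept" if lr > 1.0 else "reject", lr)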

  1. Software verification plan for GCS. [guidance and control software

    NASA Technical Reports Server (NTRS)

    Dent, Leslie A.; Shagnea, Anita M.; Hayhurst, Kelly J.

    1990-01-01

    This verification plan is written as part of an experiment designed to study the fundamental characteristics of the software failure process. The experiment will be conducted using several implementations of software that were produced according to industry-standard guidelines, namely the Radio Technical Commission for Aeronautics RTCA/DO-178A guidelines, Software Considerations in Airborne Systems and Equipment Certification, for the development of flight software. This plan fulfills the DO-178A requirements for providing instructions on the testing of each implementation of software. The plan details the verification activities to be performed at each phase in the development process, contains a step-by-step description of the testing procedures, and discusses all of the tools used throughout the verification process.

  2. ENVIRONMENTAL TECHNOLOGY VERIFICATION (ETV) PROGRAM CASE STUDIES: DEMONSTRATING PROGRAM OUTCOMES, VOLUME II

    EPA Science Inventory

    The U.S. Environmental Protection Agency (EPA) Environmental Technology Verification (ETV) Program evaluates the performance of innovative air, water, pollution prevention and monitoring technologies that have the potential to improve human health and the environment. This bookle...

  3. Environmental Technology Verification Program - ETV - Case Studies: Demonstrating Program Outcomes

    EPA Science Inventory

    The U.S. Environmental Protection Agency (EPA) Environmental Technology Verification (ETV) Program evaluates the performance of innovative air, water, pollution prevention and monitoring technologies that have the potential to improve human health and the environment. This cd con...

  4. The SeaHorn Verification Framework

    NASA Technical Reports Server (NTRS)

    Gurfinkel, Arie; Kahsai, Temesghen; Komuravelli, Anvesh; Navas, Jorge A.

    2015-01-01

    In this paper, we present SeaHorn, a software verification framework. The key distinguishing feature of SeaHorn is its modular design that separates the concerns of the syntax of the programming language, its operational semantics, and the verification semantics. SeaHorn encompasses several novelties: it (a) encodes verification conditions using an efficient yet precise inter-procedural technique, (b) provides flexibility in the verification semantics to allow different levels of precision, (c) leverages the state-of-the-art in software model checking and abstract interpretation for verification, and (d) uses Horn-clauses as an intermediate language to represent verification conditions which simplifies interfacing with multiple verification tools based on Horn-clauses. SeaHorn provides users with a powerful verification tool and researchers with an extensible and customizable framework for experimenting with new software verification techniques. The effectiveness and scalability of SeaHorn are demonstrated by an extensive experimental evaluation using benchmarks from SV-COMP 2015 and real avionics code.
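
    To make the Horn-clause encoding concrete, consider a sketch (ours, not SeaHorn's actual output) of the loop "x := 0; while * do x := x + 1" with the safety property x >= 0, expressed as constrained Horn clauses and discharged with the z3 Python API:

        from z3 import And, BoolSort, Fixedpoint, Function, IntSort, Ints

        fp = Fixedpoint()
        fp.set(engine='spacer')  # PDR-based Horn-clause solver

        inv = Function('inv', IntSort(), BoolSort())  # loop invariant relation
        err = Function('err', BoolSort())             # error relation
        x, xp = Ints('x xp')
        fp.register_relation(inv, err)
        fp.declare_var(x, xp)

        fp.rule(inv(x), x == 0)                     # initialization: x = 0
        fp.rule(inv(xp), And(inv(x), xp == x + 1))  # loop body: x := x + 1
        fp.rule(err(), And(inv(x), x < 0))          # violation of x >= 0

        print(fp.query(err()))  # unsat: the error state is unreachable, so the property holds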

  5. [Uniqueness seeking behavior as a self-verification: an alternative approach to the study of uniqueness].

    PubMed

    Yamaoka, S

    1995-06-01

    Uniqueness theory explains that extremely high perceived similarity between self and others evokes negative emotional reactions and causes uniqueness seeking behavior. However, the theory conceptualizes similarity so ambiguously that it appears to suffer from low predictive validity. The purpose of the current article is to propose an alternative explanation of uniqueness seeking behavior. It posits that perceived uniqueness deprivation is a threat to self-concepts, and therefore causes self-verification behavior. Two levels of self-verification are conceived: one based on personal categorization and the other on social categorization. The present approach regards uniqueness seeking behavior as personal-level self-verification. To test these propositions, a 2 (very high or moderate similarity information) x 2 (with or without outgroup information) x 2 (high or low need for uniqueness) between-subject factorial-design experiment was conducted with 95 university students. Results supported the self-verification approach, and were discussed in terms of effects of uniqueness deprivation, levels of self-categorization, and individual differences in need for uniqueness.

  6. U.S.-MEXICO BORDER PROGRAM ARIZONA BORDER STUDY--STANDARD OPERATING PROCEDURE FOR PERFORMANCE OF COMPUTER SOFTWARE: VERIFICATION AND VALIDATION (IIT-A-2.0)

    EPA Science Inventory

    The purpose of this SOP is to define the procedures for the initial and periodic verification and validation of computer programs. The programs are used during the Arizona NHEXAS project and Border study at the Illinois Institute of Technology (IIT) site. Keywords: computers; s...

  7. NHEXAS PHASE I ARIZONA STUDY--STANDARD OPERATING PROCEDURE FOR PERFORMANCE OF COMPUTER SOFTWARE: VERIFICATION AND VALIDATION (UA-D-2.0)

    EPA Science Inventory

    The purpose of this SOP is to define the procedures used for the initial and periodic verification and validation of computer programs used during the Arizona NHEXAS project and the "Border" study. Keywords: Computers; Software; QA/QC.

    The National Human Exposure Assessment Sur...

  8. ENVIRONMENTAL TECHNOLOGY VERIFICATION REPORT FOR AMMONIA RECOVERY PROCESS

    EPA Science Inventory

    This Technology Verification report describes the nature and scope of an environmental evaluation of ThermoEnergy Corporation’s Ammonia Recovery Process (ARP) system. The information contained in this report represents data that were collected over a 3-month pilot study. The ti...

  9. Probabilistic Material Strength Degradation Model for Inconel 718 Components Subjected to High Temperature, Mechanical Fatigue, Creep and Thermal Fatigue Effects

    NASA Technical Reports Server (NTRS)

    Bast, Callie Corinne Scheidt

    1994-01-01

    This thesis presents the ongoing development of methodology for a probabilistic material strength degradation model. The probabilistic model, in the form of a postulated randomized multifactor equation, provides for quantification of uncertainty in the lifetime material strength of aerospace propulsion system components subjected to a number of diverse random effects. This model is embodied in the computer program entitled PROMISS, which can include up to eighteen different effects. Presently, the model includes four effects that typically reduce lifetime strength: high temperature, mechanical fatigue, creep, and thermal fatigue. Statistical analysis was conducted on experimental Inconel 718 data obtained from the open literature. This analysis provided regression parameters for use as the model's empirical material constants, thus calibrating the model specifically for Inconel 718. Model calibration was carried out for four variables, namely, high temperature, mechanical fatigue, creep, and thermal fatigue. Methodology to estimate standard deviations of these material constants for input into the probabilistic material strength model was developed. Using the current version of PROMISS, entitled PROMISS93, a sensitivity study for the combined effects of mechanical fatigue, creep, and thermal fatigue was performed. Results, in the form of cumulative distribution functions, illustrated the sensitivity of lifetime strength to any current value of an effect. In addition, verification studies comparing a combination of mechanical fatigue and high temperature effects by model to the combination by experiment were conducted. Thus, for Inconel 718, the basic model assumption of independence between effects was evaluated. Results from this limited verification study strongly supported this assumption.
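
    As a loose illustration only: the thesis's randomized multifactor equation and its Inconel 718 constants are not reproduced here, but the general idea of propagating randomized effect exponents through a postulated product-form degradation model to a cumulative distribution function of lifetime strength can be sketched as follows (every functional form and number below is an assumption for demonstration):

        import numpy as np

        rng = np.random.default_rng(0)
        n = 10_000

        # Assumed product-form degradation: each effect contributes a factor
        # ((A_ult - a) / (A_ult - a_0))**q, with a randomized exponent q.
        def strength_ratio(a, a0, a_ult, q):
            return ((a_ult - a) / (a_ult - a0)) ** q

        # Hypothetical current / reference / ultimate values for two effects:
        temp    = strength_ratio(a=900.0, a0=20.0, a_ult=1300.0, q=rng.normal(0.5, 0.05, n))
        fatigue = strength_ratio(a=1e5,   a0=1e3,  a_ult=1e7,    q=rng.normal(0.1, 0.02, n))

        ratio = temp * fatigue          # independence assumption, as evaluated in the thesis
        ratio.sort()
        cdf = np.arange(1, n + 1) / n   # empirical CDF ordinates for plotting against ratio

        print(f"median strength ratio ~ {ratio[n // 2]:.3f}; "
              f"5th percentile ~ {ratio[int(0.05 * n)]:.3f}")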

  10. GHG MITIGATION TECHNOLOGY PERFORMANCE EVALUATIONS UNDERWAY AT THE GHG TECHNOLOGY VERIFICATION CENTER

    EPA Science Inventory

    The paper outlines the verification approach and activities of the Greenhouse Gas (GHG) Technology Verification Center, one of 12 independent verification entities operating under the U.S. EPA-sponsored Environmental Technology Verification (ETV) program. (NOTE: The ETV program...

  11. OBSERVATION AND CONFIRMATION OF SIX STRONG-LENSING SYSTEMS IN THE DARK ENERGY SURVEY SCIENCE VERIFICATION DATA

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Nord, B.; Buckley-Geer, E.; Lin, H.

    We report the observation and confirmation of the first group- and cluster-scale strong gravitational lensing systems found in Dark Energy Survey data. Through visual inspection of data from the Science Verification season, we identified 53 candidate systems. We then obtained spectroscopic follow-up of 21 candidates using the Gemini Multi-object Spectrograph at the Gemini South telescope and the Inamori-Magellan Areal Camera and Spectrograph at the Magellan/Baade telescope. With this follow-up, we confirmed six candidates as gravitational lenses: three of the systems are newly discovered, and the remaining three were previously known. Of the 21 observed candidates, the remaining 15 either were not detected in spectroscopic observations, were observed and did not exhibit continuum emission (or spectral features), or were ruled out as lensing systems. The confirmed sample consists of one group-scale and five galaxy-cluster-scale lenses. The lensed sources range in redshift z ∼ 0.80–3.2 and in i-band surface brightness i_SB ∼ 23–25 mag arcsec⁻² (2″ aperture). For each of the six systems, we estimate the Einstein radius θ_E and the enclosed mass M_enc, which have ranges θ_E ∼ 5″–9″ and M_enc ∼ 8 × 10¹² to 6 × 10¹³ M_⊙, respectively.

  12. Observation and Confirmation of Six Strong-lensing Systems in the Dark Energy Survey Science Verification Data

    NASA Astrophysics Data System (ADS)

    Nord, B.; Buckley-Geer, E.; Lin, H.; Diehl, H. T.; Helsby, J.; Kuropatkin, N.; Amara, A.; Collett, T.; Allam, S.; Caminha, G. B.; De Bom, C.; Desai, S.; Dúmet-Montoya, H.; Pereira, M. Elidaiana da S.; Finley, D. A.; Flaugher, B.; Furlanetto, C.; Gaitsch, H.; Gill, M.; Merritt, K. W.; More, A.; Tucker, D.; Saro, A.; Rykoff, E. S.; Rozo, E.; Birrer, S.; Abdalla, F. B.; Agnello, A.; Auger, M.; Brunner, R. J.; Carrasco Kind, M.; Castander, F. J.; Cunha, C. E.; da Costa, L. N.; Foley, R. J.; Gerdes, D. W.; Glazebrook, K.; Gschwend, J.; Hartley, W.; Kessler, R.; Lagattuta, D.; Lewis, G.; Maia, M. A. G.; Makler, M.; Menanteau, F.; Niernberg, A.; Scolnic, D.; Vieira, J. D.; Gramillano, R.; Abbott, T. M. C.; Banerji, M.; Benoit-Lévy, A.; Brooks, D.; Burke, D. L.; Capozzi, D.; Carnero Rosell, A.; Carretero, J.; D'Andrea, C. B.; Dietrich, J. P.; Doel, P.; Evrard, A. E.; Frieman, J.; Gaztanaga, E.; Gruen, D.; Honscheid, K.; James, D. J.; Kuehn, K.; Li, T. S.; Lima, M.; Marshall, J. L.; Martini, P.; Melchior, P.; Miquel, R.; Neilsen, E.; Nichol, R. C.; Ogando, R.; Plazas, A. A.; Romer, A. K.; Sako, M.; Sanchez, E.; Scarpine, V.; Schubnell, M.; Sevilla-Noarbe, I.; Smith, R. C.; Soares-Santos, M.; Sobreira, F.; Suchyta, E.; Swanson, M. E. C.; Tarle, G.; Thaler, J.; Walker, A. R.; Wester, W.; Zhang, Y.; DES Collaboration

    2016-08-01

    We report the observation and confirmation of the first group- and cluster-scale strong gravitational lensing systems found in Dark Energy Survey data. Through visual inspection of data from the Science Verification season, we identified 53 candidate systems. We then obtained spectroscopic follow-up of 21 candidates using the Gemini Multi-object Spectrograph at the Gemini South telescope and the Inamori-Magellan Areal Camera and Spectrograph at the Magellan/Baade telescope. With this follow-up, we confirmed six candidates as gravitational lenses: three of the systems are newly discovered, and the remaining three were previously known. Of the 21 observed candidates, the remaining 15 either were not detected in spectroscopic observations, were observed and did not exhibit continuum emission (or spectral features), or were ruled out as lensing systems. The confirmed sample consists of one group-scale and five galaxy-cluster-scale lenses. The lensed sources range in redshift z ∼ 0.80–3.2 and in i-band surface brightness i_SB ∼ 23–25 mag arcsec⁻² (2″ aperture). For each of the six systems, we estimate the Einstein radius θ_E and the enclosed mass M_enc, which have ranges θ_E ∼ 5″–9″ and M_enc ∼ 8 × 10¹² to 6 × 10¹³ M_⊙, respectively. This paper includes data gathered with the 6.5 m Magellan Telescopes located at Las Campanas Observatory, Chile.
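
    The quoted enclosed masses follow from the standard relation M_enc = (c²/4G)(D_l·D_s/D_ls)·θ_E²; an order-of-magnitude check with illustrative redshifts (not the catalog values for any particular system) might look like this:

        from astropy import units as u
        from astropy.constants import c, G
        from astropy.cosmology import Planck15 as cosmo

        def enclosed_mass(theta_E_arcsec, z_lens, z_src):
            """Mass inside the Einstein radius for a circularly symmetric lens."""
            theta = (theta_E_arcsec * u.arcsec).to(u.rad).value
            D_l = cosmo.angular_diameter_distance(z_lens)
            D_s = cosmo.angular_diameter_distance(z_src)
            D_ls = cosmo.angular_diameter_distance_z1z2(z_lens, z_src)
            M = (c**2 / (4 * G)) * (D_l * D_s / D_ls) * theta**2
            return M.to(u.Msun)

        # Illustrative values in the range reported above (not a specific system):
        print(enclosed_mass(7.0, z_lens=0.5, z_src=2.0))  # ~1e13 solar masses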

  13. 18 CFR 281.213 - Data Verification Committee.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... preparation. (e) The Data Verification Committee shall prepare a report concerning the proposed index of... 18 Conservation of Power and Water Resources 1 2011-04-01 2011-04-01 false Data Verification....213 Data Verification Committee. (a) Each interstate pipeline shall establish a Data Verification...

  14. 18 CFR 281.213 - Data Verification Committee.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... preparation. (e) The Data Verification Committee shall prepare a report concerning the proposed index of... 18 Conservation of Power and Water Resources 1 2010-04-01 2010-04-01 false Data Verification....213 Data Verification Committee. (a) Each interstate pipeline shall establish a Data Verification...

  15. A preliminary study on the use of FX-Glycine gel and an in-house optical cone beam CT readout for IMRT and RapidArc verification

    NASA Astrophysics Data System (ADS)

    Ravindran, Paul B.; Ebenezer, Suman Babu S.; Winfred, Michael Raj; Amalan, S.

    2017-05-01

    The radiochromic FX gel with optical CT readout has been investigated by several authors and has shown promising results for 3D dosimetry. One application of gel dosimeters is 3D dose verification for IMRT and RapidArc quality assurance. Though polymer gel has been used successfully for clinical dose verification, the use of FX gel for clinical dose verification with optical cone beam CT needs further validation. In this work, we have used FX gel and an in-house optical readout system for gamma analysis between the measured dose distribution and a treatment planning system (TPS) calculated dose distribution for a few test cases.
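
    For reference, a brute-force global gamma analysis (e.g. the common 3%/3 mm criterion) between a measured plane and a TPS-calculated plane can be sketched as below; this is our generic illustration, not the authors' analysis code:

        import numpy as np

        def gamma_pass_rate(d_ref, d_eval, spacing_mm, dd=0.03, dta_mm=3.0):
            """Global 2D gamma index (dose difference + distance to agreement),
            brute force; d_ref and d_eval are same-shape dose planes."""
            ny, nx = d_ref.shape
            y, x = np.mgrid[0:ny, 0:nx] * spacing_mm
            d_max = d_ref.max()
            gammas = np.empty_like(d_ref)
            for i in range(ny):
                for j in range(nx):
                    dist2 = ((y - y[i, j])**2 + (x - x[i, j])**2) / dta_mm**2
                    dose2 = ((d_eval - d_ref[i, j]) / (dd * d_max))**2
                    gammas[i, j] = np.sqrt((dist2 + dose2).min())
            return (gammas <= 1.0).mean()

        # Toy check: identical planes pass trivially.
        ref = np.random.default_rng(1).random((40, 40))
        print(gamma_pass_rate(ref, ref, spacing_mm=1.0))  # 1.0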

  16. Depression and selection of positive and negative social feedback: motivated preference or cognitive balance?

    PubMed

    Alloy, L B; Lipman, A J

    1992-05-01

    In this commentary we examine Swann, Wenzlaff, Krull, and Pelham's (1992) findings with respect to each of 5 central propositions in self-verification theory. We conclude that although the data are consistent with self-verification theory, none of the 5 components of the theory have been demonstrated convincingly as yet. Specifically, we argue that depressed subjects' selection of social feedback appears to be balanced or evenhanded rather than biased toward negative feedback and that there is little evidence to indicate that depressives actively seek negative appraisals. Furthermore, we suggest that the studies are silent with respect to the motivational postulates of self-verification theory and that a variety of competing cognitive and motivational models can explain Swann et al.'s findings as well as self-verification theory.

  17. Investigation, Development, and Evaluation of Performance Proving for Fault-tolerant Computers

    NASA Technical Reports Server (NTRS)

    Levitt, K. N.; Schwartz, R.; Hare, D.; Moore, J. S.; Melliar-Smith, P. M.; Shostak, R. E.; Boyer, R. S.; Green, M. W.; Elliott, W. D.

    1983-01-01

    A number of methodologies for verifying systems, and computer-based tools that assist users in verifying their systems, were developed. These tools were applied to verify in part the SIFT ultrareliable aircraft computer. Topics covered included: STP theorem prover; design verification of SIFT; high level language code verification; assembly language level verification; numerical algorithm verification; verification of flight control programs; and verification of hardware logic.

  18. The formal verification of generic interpreters

    NASA Technical Reports Server (NTRS)

    Windley, P.; Levitt, K.; Cohen, G. C.

    1991-01-01

    This study addresses task assignment 3 of the design and validation of digital flight control systems suitable for fly-by-wire applications; task 3 is concerned with the formal verification of embedded systems. In particular, results are presented that provide a methodological approach to microprocessor verification. A hierarchical decomposition strategy for specifying microprocessors is also presented. A theory of generic interpreters is presented that can be used to model microprocessor behavior. The generic interpreter theory abstracts away the details of instruction functionality, leaving a general model of what an interpreter does.
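
    In the spirit of the generic interpreter idea (a Python cartoon of the abstraction, not the formal theory developed in the report), an interpreter can be parameterized by an instruction-semantics map so that instruction functionality is abstracted away; the toy ISA below is our invention:

        from typing import Callable, Dict, NamedTuple, Tuple

        class State(NamedTuple):
            pc: int
            regs: Tuple[int, ...]

        # An instruction set is just a map from opcode to a semantic function.
        Semantics = Dict[str, Callable[[State, Tuple[int, ...]], State]]

        def interpreter(step_semantics: Semantics):
            """Generic interpreter: the same skeleton can be instantiated
            at different abstraction levels by swapping the semantics map."""
            def run(program, state: State, fuel: int) -> State:
                while fuel > 0 and state.pc < len(program):
                    op, args = program[state.pc]
                    state = step_semantics[op](state, args)
                    fuel -= 1
                return state
            return run

        # One concrete instantiation (a toy two-instruction ISA):
        def _inc(s, a):
            r = list(s.regs); r[a[0]] += 1
            return State(s.pc + 1, tuple(r))
        def _jmp(s, a):
            return State(a[0], s.regs)

        run = interpreter({"INC": _inc, "JMP": _jmp})
        print(run([("INC", (0,)), ("JMP", (0,))], State(0, (0,)), fuel=5))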

  19. RF model of the distribution system as a communication channel, phase 2. Volume 2: Task reports

    NASA Technical Reports Server (NTRS)

    Rustay, R. C.; Gajjar, J. T.; Rankin, R. W.; Wentz, R. C.; Wooding, R.

    1982-01-01

    Based on the established feasibility of predicting, via a model, the propagation of Power Line Frequency on radial-type distribution feeders, verification studies comparing model predictions against measurements were undertaken using more complicated feeder circuits and situations. Detailed accounts of the major tasks are presented. These include: (1) verification of the model; (2) extension, implementation, and verification of perturbation theory; (3) parameter sensitivity; (4) transformer modeling; and (5) compensation of power distribution systems for enhancement of power line carrier communication reliability.

  20. Developing a NASA strategy for the verification of large space telescope observatories

    NASA Astrophysics Data System (ADS)

    Crooke, Julie A.; Gunderson, Johanna A.; Hagopian, John G.; Levine, Marie

    2006-06-01

    In July 2005, the Office of Program Analysis and Evaluation (PA&E) at NASA Headquarters was directed to develop a strategy for verification of the performance of large space telescope observatories, which occurs predominantly in a thermal vacuum test facility. A mission model of the expected astronomical observatory missions over the next 20 years was identified along with performance, facility and resource requirements. Ground testing versus alternatives was analyzed to determine the pros, cons and break points in the verification process. Existing facilities and their capabilities were examined across NASA, industry and other government agencies as well as the future demand for these facilities across NASA's Mission Directorates. Options were developed to meet the full suite of mission verification requirements, and performance, cost, risk and other analyses were performed. Findings and recommendations from the study were presented to the NASA Administrator and the NASA Strategic Management Council (SMC) in February 2006. This paper details the analysis, results, and findings from this study.

  1. 40 CFR 1065.920 - PEMS calibrations and verifications.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ....920 PEMS calibrations and verifications. (a) Subsystem calibrations and verifications. Use all the... verifications and analysis. It may also be necessary to limit the range of conditions under which the PEMS can... additional information or analysis to support your conclusions. (b) Overall verification. This paragraph (b...

  2. GENERIC VERIFICATION PROTOCOL FOR THE VERIFICATION OF PESTICIDE SPRAY DRIFT REDUCTION TECHNOLOGIES FOR ROW AND FIELD CROPS

    EPA Science Inventory

    This ETV program generic verification protocol was prepared and reviewed for the Verification of Pesticide Drift Reduction Technologies project. The protocol provides a detailed methodology for conducting and reporting results from a verification test of pesticide drift reductio...

  3. Feasibility of biochemical verification in a web-based smoking cessation study.

    PubMed

    Cha, Sarah; Ganz, Ollie; Cohn, Amy M; Ehlke, Sarah J; Graham, Amanda L

    2017-10-01

    Cogent arguments have been made against the need for biochemical verification in population-based studies with low-demand characteristics. Despite this fact, studies involving digital interventions (low-demand) are often required in peer review to report biochemically verified abstinence. To address this discrepancy, we examined the feasibility and costs of biochemical verification in a web-based study conducted with a national sample. Participants were 600 U.S. adult current smokers who registered on a web-based smoking cessation program and completed surveys at baseline and 3 months. Saliva sampling kits were sent to participants who reported 7-day abstinence at 3 months, and returned samples were analyzed for cotinine. The response rate at 3 months was 41.2% (n=247): 93 participants reported 7-day abstinence (38%) and were mailed a saliva kit (71% returned). The discordance rate was 36.4%. Participants with discordant responses were more likely to report 3-month use of nicotine replacement therapy or e-cigarettes than those with concordant responses (79.2% vs. 45.2%, p=0.007). The total cost of saliva sampling was $8280 ($125/sample). Biochemical verification was both time- and cost-intensive, and yielded a relatively small number of samples due to low response rates and use of other nicotine products during the follow-up period. There was a high rate of discordance between self-reported abstinence and saliva testing. Costs for data collection may be prohibitive for studies with large sample sizes or limited budgets. Our findings echo previous statements that biochemical verification is not necessary in population-based studies, and add evidence specific to technology-based studies. Copyright © 2017 Elsevier Ltd. All rights reserved.

  4. Experimental quantum verification in the presence of temporally correlated noise

    NASA Astrophysics Data System (ADS)

    Mavadia, S.; Edmunds, C. L.; Hempel, C.; Ball, H.; Roy, F.; Stace, T. M.; Biercuk, M. J.

    2018-02-01

    Growth in the capabilities of quantum information hardware mandates access to techniques for performance verification that function under realistic laboratory conditions. Here we experimentally characterise the impact of common temporally correlated noise processes on both randomised benchmarking (RB) and gate-set tomography (GST). Our analysis highlights the role of sequence structure in enhancing or suppressing the sensitivity of quantum verification protocols to either slowly or rapidly varying noise, which we treat in the limiting cases of quasi-DC miscalibration and white noise power spectra. We perform experiments with a single trapped ¹⁷¹Yb⁺ ion-qubit and inject engineered noise (∝ σ_z) to probe protocol performance. Experiments on RB validate predictions that measured fidelities over sequences are described by a gamma distribution varying between approximately Gaussian, and a broad, highly skewed distribution for rapidly and slowly varying noise, respectively. Similarly we find a strong gate-set dependence of default experimental GST procedures in the presence of correlated errors, leading to significant deviations between estimated and calculated diamond distances in the presence of correlated σ_z errors. Numerical simulations demonstrate that expansion of the gate set to include negative rotations can suppress these discrepancies and increase reported diamond distances by orders of magnitude for the same error processes. Similar effects do not occur for correlated σ_x or σ_y errors or depolarising noise processes, highlighting the critical interplay of the selected gate set and the gauge optimisation process on the meaning of the reported diamond norm in correlated noise environments.
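
    A toy numerical illustration of the sequence-structure effect described above (our simplified over-rotation model, not the paper's analysis): draw a miscalibration error once per sequence for quasi-DC noise, or independently per gate for white noise, and compare the resulting distributions of sequence fidelities:

        import numpy as np

        rng = np.random.default_rng(42)
        m, n_seq, sigma = 100, 5000, 0.01  # gates per sequence, sequences, noise scale

        # Quasi-DC: one miscalibration per sequence; gate errors add coherently.
        eps_dc = rng.normal(0.0, sigma, n_seq)
        f_dc = np.cos(m * eps_dc / 2) ** 2

        # White noise: independent error per gate; errors accumulate diffusively.
        eps_w = rng.normal(0.0, sigma, (n_seq, m))
        f_w = np.cos(eps_w.sum(axis=1) / 2) ** 2

        for name, f in [("quasi-DC", f_dc), ("white", f_w)]:
            print(f"{name:8s} mean={f.mean():.4f} "
                  f"skew-proxy={np.mean((f - f.mean())**3):+.2e}")
        # The quasi-DC case yields a broad, highly skewed fidelity distribution;
        # the white-noise case stays narrow and close to Gaussian.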

  5. A Preliminary Experimental Examination of Worldview Verification, Perceived Racism, and Stress Reactivity in African Americans

    PubMed Central

    Lucas, Todd; Lumley, Mark A.; Flack, John M.; Wegner, Rhiana; Pierce, Jennifer; Goetz, Stefan

    2016-01-01

    Objective According to worldview verification theory, inconsistencies between lived experiences and worldviews are psychologically threatening. These inconsistencies may be key determinants of stress processes that influence cardiovascular health disparities. This preliminary examination considers how experiencing injustice can affect perceived racism and biological stress reactivity among African Americans. Guided by worldview verification theory, it was hypothesized that responses to receiving an unfair outcome would be moderated by fairness of the accompanying decision process, and that this effect would further depend on the consistency of the decision process with preexisting justice beliefs. Method A sample of 118 healthy African American adults completed baseline measures of justice beliefs, followed by a laboratory-based social-evaluative stressor task. Two randomized fairness manipulations were implemented during the task: participants were given either high or low levels of distributive (outcome) and procedural (decision process) justice. Glucocorticoid (cortisol) and inflammatory (C-reactive protein) biological responses were measured in oral fluids, and attributions of racism were also measured. Results The hypothesized 3-way interaction was generally obtained. Among African Americans with a strong belief in justice, perceived racism, cortisol and C-reactive protein responses to low distributive justice were higher when procedural justice was low. Among African Americans with a weak belief in justice however, these responses were higher when a low level of distributive justice was coupled with high procedural justice. Conclusions Biological and psychological processes that contribute to cardiovascular health disparities are affected by consistency between individual-level and contextual justice factors. PMID:27018728

  6. Two barriers to realizing the benefits of biometrics: a chain perspective on biometrics and identity fraud as biometrics' real challenge

    NASA Astrophysics Data System (ADS)

    Grijpink, Jan

    2004-06-01

    Biometric systems may vary along at least twelve dimensions. We need to exploit this variety to manoeuvre biometrics into place and realise its social potential. Two perspectives on biometrics are then proposed, revealing that biometrics will probably be ineffective in combating identity fraud, organised crime and terrorism: (1) the value chain perspective explains the first barrier: our strong preference for large-scale biometric systems for general compulsory use. These biometric systems cause successful infringements to spread unnoticed. A biometric system will only function adequately if biometrics is indispensable for solving the dominant chain problem; multi-chain use of biometrics takes it beyond the boundaries of good manageability. (2) the identity fraud perspective exposes the second barrier: our traditional approach to identity verification. We focus on identity documents, neglecting the person and the situation involved. Moreover, western legal cultures have made identity verification procedures known, transparent, uniform and predictable. Thus, we have developed a blind spot to identity fraud. Biometrics offers good potential for better checking of persons, but will probably be used merely to enhance identity documents. Biometrics will only pay off if it confronts the identity fraudster with less predictable verification processes and more risk of his identity fraud being spotted. Standardised large-scale applications of biometrics for general compulsory use, without countervailing measures, will probably produce the reverse. This contribution tentatively presents a few headlines for an overall biometrics strategy that could better resist identity fraud.

  7. A service-oriented architecture for integrating the modeling and formal verification of genetic regulatory networks

    PubMed Central

    2009-01-01

    Background The study of biological networks has led to the development of increasingly large and detailed models. Computer tools are essential for the simulation of the dynamical behavior of the networks from the model. However, as the size of the models grows, it becomes infeasible to manually verify the predictions against experimental data or identify interesting features in a large number of simulation traces. Formal verification based on temporal logic and model checking provides promising methods to automate and scale the analysis of the models. However, a framework that tightly integrates modeling and simulation tools with model checkers is currently missing, on both the conceptual and the implementational level. Results We have developed a generic and modular web service, based on a service-oriented architecture, for integrating the modeling and formal verification of genetic regulatory networks. The architecture has been implemented in the context of the qualitative modeling and simulation tool GNA and the model checkers NUSMV and CADP. GNA has been extended with a verification module for the specification and checking of biological properties. The verification module also allows the display and visual inspection of the verification results. Conclusions The practical use of the proposed web service is illustrated by means of a scenario involving the analysis of a qualitative model of the carbon starvation response in E. coli. The service-oriented architecture allows modelers to define the model and proceed with the specification and formal verification of the biological properties by means of a unified graphical user interface. This guarantees a transparent access to formal verification technology for modelers of genetic regulatory networks. PMID:20042075
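
    To make the model-checking step concrete (a generic explicit-state sketch, not GNA, NUSMV, or CADP code), a reachability property EF(goal) over a toy two-gene Boolean network can be checked by breadth-first search:

        from collections import deque

        # Toy asynchronous Boolean network: genes a and b repress each other.
        def successors(state):
            a, b = state
            for nxt in [((0 if b else 1), b), (a, (0 if a else 1))]:
                if nxt != state:
                    yield nxt

        def ef(initial, goal):
            """Explicit-state check of the CTL property EF(goal) by BFS."""
            seen, queue = {initial}, deque([initial])
            while queue:
                s = queue.popleft()
                if goal(s):
                    return True
                for t in successors(s):
                    if t not in seen:
                        seen.add(t)
                        queue.append(t)
            return False

        # Is a state with gene a ON and gene b OFF reachable from (0, 0)?
        print(ef((0, 0), lambda s: s == (1, 0)))  # True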

  8. Differential evolution algorithm based photonic structure design: numerical and experimental verification of subwavelength λ/5 focusing of light.

    PubMed

    Bor, E; Turduev, M; Kurt, H

    2016-08-01

    Photonic structure designs based on optimization algorithms provide superior properties compared to those using intuition-based approaches. In the present study, we numerically and experimentally demonstrate subwavelength focusing of light using wavelength scale absorption-free dielectric scattering objects embedded in an air background. An optimization algorithm based on differential evolution integrated into the finite-difference time-domain method was applied to determine the locations of each circular dielectric object with a constant radius and refractive index. The multiobjective cost function defined inside the algorithm ensures strong focusing of light with low intensity side lobes. The temporal and spectral responses of the designed compact photonic structure provided a beam spot size in air with a full width at half maximum value of 0.19λ, where λ is the wavelength of light. The experiments were carried out in the microwave region to verify numerical findings, and very good agreement between the two approaches was found. The subwavelength light focusing is associated with a strong interference effect due to nonuniformly arranged scatterers and an irregular index gradient. Improving the focusing capability of optical elements by surpassing the diffraction limit of light is of paramount importance in optical imaging, lithography, data storage, and strong light-matter interaction.
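
    Structurally, the optimization loop couples differential evolution with an FDTD cost evaluation; as a sketch only, with a toy analytic cost standing in for the FDTD solve and all names below invented, the scipy implementation of differential evolution can drive such a scatterer-placement problem:

        import numpy as np
        from scipy.optimize import differential_evolution

        N = 8  # number of circular scatterers, one (x, y) pair each

        def cost(flat_xy):
            """Toy stand-in for the FDTD objective: reward clustering of
            scatterers near a focal point while penalizing overlaps."""
            pts = flat_xy.reshape(N, 2)
            focus_term = np.sum((pts - np.array([0.0, 2.0]))**2)
            d = np.linalg.norm(pts[:, None] - pts[None, :], axis=-1)
            overlap = np.sum(np.maximum(0.0, 0.5 - d[np.triu_indices(N, 1)]))
            return focus_term + 100.0 * overlap

        bounds = [(-3.0, 3.0)] * (2 * N)
        result = differential_evolution(cost, bounds, seed=0, maxiter=200, tol=1e-8)
        print(result.fun, result.x.reshape(N, 2)[0])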

  9. Differential evolution algorithm based photonic structure design: numerical and experimental verification of subwavelength λ/5 focusing of light

    PubMed Central

    Bor, E.; Turduev, M.; Kurt, H.

    2016-01-01

    Photonic structure designs based on optimization algorithms provide superior properties compared to those using intuition-based approaches. In the present study, we numerically and experimentally demonstrate subwavelength focusing of light using wavelength scale absorption-free dielectric scattering objects embedded in an air background. An optimization algorithm based on differential evolution integrated into the finite-difference time-domain method was applied to determine the locations of each circular dielectric object with a constant radius and refractive index. The multiobjective cost function defined inside the algorithm ensures strong focusing of light with low intensity side lobes. The temporal and spectral responses of the designed compact photonic structure provided a beam spot size in air with a full width at half maximum value of 0.19λ, where λ is the wavelength of light. The experiments were carried out in the microwave region to verify numerical findings, and very good agreement between the two approaches was found. The subwavelength light focusing is associated with a strong interference effect due to nonuniformly arranged scatterers and an irregular index gradient. Improving the focusing capability of optical elements by surpassing the diffraction limit of light is of paramount importance in optical imaging, lithography, data storage, and strong light-matter interaction. PMID:27477060

  10. Galaxy-galaxy lensing in the Dark Energy Survey Science Verification data

    DOE PAGES

    Clampitt, J.; Sánchez, C.; Kwan, J.; ...

    2016-11-22

    We present galaxy-galaxy lensing results from 139 square degrees of Dark Energy Survey (DES) Science Verification (SV) data. Our lens sample consists of red galaxies, known as redMaGiC, which are specifically selected to have a low photometric redshift error and outlier rate. The lensing measurement has a total signal-to-noise of 29 over scales 0.09 < R < 15 Mpc/h, including all lenses over a wide redshift range 0.2 < z < 0.8. Dividing the lenses into three redshift bins for this constant moving number density sample, we find no evidence for evolution in the halo mass with redshift. We obtain consistent results for the lensing measurement with two independent shear pipelines, ngmix and im3shape. We perform a number of null tests on the shear and photometric redshift catalogs and quantify resulting systematic uncertainties. Covariances from jackknife subsamples of the data are validated with a suite of 50 mock surveys. The results and systematics checks in this work provide a critical input for future cosmological and galaxy evolution studies with the DES data and redMaGiC galaxy samples. We fit a Halo Occupation Distribution (HOD) model, and demonstrate that our data constrain the mean halo mass of the lens galaxies, despite strong degeneracies between individual HOD parameters.

  11. Complexity of major UK companies between 2006 and 2010: Hierarchical structure method approach

    NASA Astrophysics Data System (ADS)

    Ulusoy, Tolga; Keskin, Mustafa; Shirvani, Ayoub; Deviren, Bayram; Kantar, Ersin; Çağrı Dönmez, Cem

    2012-11-01

    This study reports on the topology of the top 40 UK companies, analysed for predictive verification of markets over the period 2006-2010 by applying the concepts of the minimal spanning tree (MST) and the hierarchical tree (HT). We briefly describe the methodology for constructing the MST and HT and define the correlation function between a pair of companies, based on the London Stock Exchange (LSE) index, in order to quantify synchronization between the companies. Hierarchical organization is derived, and minimal spanning and hierarchical trees are constructed for the 2006-2008 and 2008-2010 periods; the results validate the predictive verification of the applied semantics. The trees are known to be useful tools for perceiving and detecting the global structure, taxonomy and hierarchy in financial data. From these trees, two different clusters of companies in 2006 were detected. They also show three clusters in 2008 and two between 2008 and 2010, according to their proximity. The clusters match each other as regards their common production activities or their strong interrelationship. The key companies generally correspond to major economic activities, as expected. This work gives a comparative approach between the MST and HT methods from statistical physics and information theory, with analysis of financial markets that may give new, valuable and useful information on financial market dynamics.
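
    The standard construction (sketched here with synthetic returns, since the LSE data are not reproduced) maps the correlation coefficient ρ_ij to the distance d_ij = √(2(1 − ρ_ij)) and extracts the MST from the resulting distance matrix:

        import numpy as np
        from scipy.sparse.csgraph import minimum_spanning_tree

        rng = np.random.default_rng(7)
        returns = rng.normal(size=(250, 40))      # synthetic daily returns, 40 companies

        rho = np.corrcoef(returns, rowvar=False)  # pairwise correlations
        dist = np.sqrt(2.0 * (1.0 - rho))         # metric distance from correlation
        np.fill_diagonal(dist, 0.0)

        mst = minimum_spanning_tree(dist)         # sparse matrix with N-1 edges
        rows, cols = mst.nonzero()
        print(f"{len(rows)} MST edges; total tree length = {mst.sum():.2f}")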

  12. Galaxy-galaxy lensing in the Dark Energy Survey Science Verification data

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Clampitt, J.; Sánchez, C.; Kwan, J.

    We present galaxy-galaxy lensing results from 139 square degrees of Dark Energy Survey (DES) Science Verification (SV) data. Our lens sample consists of red galaxies, known as redMaGiC, which are specifically selected to have a low photometric redshift error and outlier rate. The lensing measurement has a total signal-to-noise of 29 over scales 0.09 < R < 15 Mpc/h, including all lenses over a wide redshift range 0.2 < z < 0.8. Dividing the lenses into three redshift bins for this constant moving number density sample, we find no evidence for evolution in the halo mass with redshift. We obtain consistent results for the lensing measurement with two independent shear pipelines, ngmix and im3shape. We perform a number of null tests on the shear and photometric redshift catalogs and quantify resulting systematic uncertainties. Covariances from jackknife subsamples of the data are validated with a suite of 50 mock surveys. The results and systematics checks in this work provide a critical input for future cosmological and galaxy evolution studies with the DES data and redMaGiC galaxy samples. We fit a Halo Occupation Distribution (HOD) model, and demonstrate that our data constrain the mean halo mass of the lens galaxies, despite strong degeneracies between individual HOD parameters.

  13. U.S.-MEXICO BORDER PROGRAM ARIZONA BORDER STUDY--STANDARD OPERATING PROCEDURE FOR PERFORMANCE OF COMPUTER SOFTWARE: VERIFICATION AND VALIDATION (UA-D-2.0)

    EPA Science Inventory

    The purpose of this SOP is to define the procedures used for the initial and periodic verification and validation of computer programs used during the Arizona NHEXAS project and the Border study. Keywords: Computers; Software; QA/QC.

    The U.S.-Mexico Border Program is sponsored ...

  14. An Investigation into Solution Verification for CFD-DEM

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Fullmer, William D.; Musser, Jordan

    This report presents a study of the convergence behavior of the computational fluid dynamics-discrete element method (CFD-DEM), specifically the National Energy Technology Laboratory's (NETL) open source MFiX code (MFiX-DEM) with a diffusion-based particle-to-continuum filtering scheme. In particular, this study focused on determining whether the numerical method has a solution in the high-resolution limit where the grid size is smaller than the particle size. To address this uncertainty, fixed particle beds of two primary configurations were studied: i) fictitious beds where the particles are seeded with a random particle generator, and ii) instantaneous snapshots from a transient simulation of an experimentally relevant problem. Both problems considered a uniform inlet boundary and a pressure outflow. The CFD grid was refined from a few particle diameters down to 1/6th of a particle diameter. The pressure drop between two vertical elevations, averaged across the bed cross-section, was considered as the system response quantity of interest. A least-squares regression method was used to extrapolate the grid-dependent results to an approximate "grid-free" solution in the limit of infinite resolution. The results show that the diffusion-based scheme does yield a converging solution. However, the convergence is more complicated than encountered in simpler, single-phase flow problems, showing strong oscillations and, at times, oscillations superimposed on globally non-monotonic behavior. The challenging convergence behavior highlights the importance of using at least four grid resolutions in solution verification problems so that (over-determined) regression-based extrapolation methods may be applied to approximate the grid-free solution. The grid-free solution is very important in solution verification, and in VVUQ exercises in general, because the difference between it and the reference solution largely determines the numerical uncertainty. By testing different randomized particle configurations of the same general problem (for the fictitious case) or different instances of freezing a transient simulation, the numerical uncertainties appeared to be on the same order of magnitude as ensemble or time-averaging uncertainties. By testing different drag laws, almost all cases studied show that the model form uncertainty in this one very important closure relation was larger than the numerical uncertainty, at least with a reasonable CFD grid of roughly five particle diameters. In this study, the diffusion width (filtering length scale) was mostly held constant at six particle diameters. A few exploratory tests showed that similar convergence behavior was observed for diffusion widths greater than approximately two particle diameters. However, this subject was not investigated in great detail because determining an appropriate filter size is really a validation question, which must be settled by comparison to experimental or highly accurate numerical data. Future studies are being considered targeting solution verification of transient simulations as well as validation of the filter size with direct numerical simulation data.
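
    The regression-based extrapolation mentioned above can be illustrated generically (this is not MFiX post-processing; the grid spacings and pressure drops below are invented): with four or more spacings h_i and responses φ_i, fit φ(h) = φ₀ + C·h^p and read off the grid-free estimate φ₀:

        import numpy as np
        from scipy.optimize import curve_fit

        def model(h, phi0, C, p):
            return phi0 + C * h**p

        # Hypothetical grid spacings (in particle diameters) and pressure drops:
        h = np.array([5.0, 2.0, 1.0, 0.5, 0.25])
        phi = np.array([412.0, 439.6, 448.8, 453.4, 455.7])

        (phi0, C, p), _ = curve_fit(model, h, phi, p0=(460.0, -10.0, 1.0))
        print(f"grid-free estimate phi0 ~ {phi0:.1f}, observed order p ~ {p:.2f}")
        print(f"numerical uncertainty proxy ~ {abs(phi[-1] - phi0):.2f}")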

  15. 30 CFR 250.913 - When must I resubmit Platform Verification Program plans?

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... CONTINENTAL SHELF Platforms and Structures Platform Verification Program § 250.913 When must I resubmit Platform Verification Program plans? (a) You must resubmit any design verification, fabrication... 30 Mineral Resources 2 2011-07-01 2011-07-01 false When must I resubmit Platform Verification...

  16. 30 CFR 250.909 - What is the Platform Verification Program?

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... 30 Mineral Resources 2 2011-07-01 2011-07-01 false What is the Platform Verification Program? 250... Platforms and Structures Platform Verification Program § 250.909 What is the Platform Verification Program? The Platform Verification Program is the MMS approval process for ensuring that floating platforms...

  17. Requirement Assurance: A Verification Process

    NASA Technical Reports Server (NTRS)

    Alexander, Michael G.

    2011-01-01

    Requirement Assurance is an act of requirement verification which assures the stakeholder or customer that a product requirement has produced its "as realized product" and has been verified with conclusive evidence. Product requirement verification answers the question, "Did the product meet the stated specification, performance, or design documentation?" In order to ensure the system was built correctly, the practicing systems engineer must verify each product requirement using the verification methods of inspection, analysis, demonstration, or test. The products of these methods are the "verification artifacts" or "closure artifacts", which are the objective evidence needed to prove that the product requirements meet the verification success criteria. Institutional direction is given to the systems engineer in NPR 7123.1A, NASA Systems Engineering Processes and Requirements, with regard to the requirement verification process. In response, the verification methodology offered in this report meets both the institutional process and requirement verification best practices.

  18. Preventing illegal tobacco and alcohol sales to minors through electronic age-verification devices: a field effectiveness study.

    PubMed

    Krevor, Brad; Capitman, John A; Oblak, Leslie; Cannon, Joanna B; Ruwe, Mathilda

    2003-01-01

    Efforts to prohibit the sales of tobacco and alcohol products to minors are widespread. Electronic Age Verification (EAV) devices are one possible means to improve compliance with sales-to-minors laws. The purpose of this study was to evaluate the implementation and effectiveness of EAV devices in terms of the frequency and accuracy of age verification, as well as to examine the impact of EAVs on the retail environment. Two study locations were selected: Tallahassee, Florida and Iowa City, Iowa. Retail stores were invited to participate in the study, producing a self-selected experimental group. Stores that did not elect to test the EAVs comprised the comparison group. The data sources included: 1) mystery shopper inspections: two pre- and five post-EAV installation mystery shopper inspections of tobacco and alcohol retailers; 2) retail clerk and manager interviews; and 3) customer interviews. The study found that installing EAV devices with minimal training and encouragement did not increase age verification and underage sales refusal. Surveyed clerks reported positive experiences using the electronic ID readers, and customers reported almost no discomfort about being asked to swipe their IDs. Observations from this study support the need for a more comprehensive system for responsible retailing.

  19. Characterization of novel preclinical dose distributions for micro irradiator

    NASA Astrophysics Data System (ADS)

    Kodra, J.; Miles, D.; Yoon, S. W.; Kirsch, D. G.; Oldham, M.

    2017-05-01

    This work explores and demonstrates the feasibility of utilizing new 3D printing techniques to implement advanced micro radiation therapy for pre-clinical small animal studies. 3D printed blocks and compensators were designed and printed from a strongly x-ray attenuating material at sub-millimeter resolution. These techniques enable a powerful range of new preclinical treatment capabilities including grid therapy, lattice therapy, and IMRT. At small scales, verification of these treatments is exceptionally challenging, and high-resolution 3D dosimetry (0.5 mm³) is an essential capability for characterizing and verifying them. Here, we investigate the 2D and 3D dosimetry of several novel pre-clinical treatments using a combination of EBT film and Presage/optical-CT 3D dosimetry in rodent-morphic dosimeters.

  20. Heaviest Nuclei: New Element with Atomic Number 117

    ScienceCinema

    Oganessian, Yuri

    2018-01-24

    One of the fundamental outcomes of the nuclear shell model is the prediction of 'stability islands' in the domain of the hypothetical superheavy elements. The talk is devoted to the experimental verification of these predictions: the synthesis and study of both the decay and chemical properties of the superheavy elements. The discovery of a new chemical element with atomic number Z = 117 is reported. The isotopes ²⁹³117 and ²⁹⁴117 were produced in fusion reactions between ⁴⁸Ca and ²⁴⁹Bk. Decay chains involving 11 new nuclei were identified by means of the Dubna gas-filled recoil separator. The measured decay properties show a strong rise of stability for heavier isotopes with Z = 111, validating the concept of the long-sought island of enhanced stability for the heaviest nuclei.

  1. Advanced Global Atmospheric Gases Experiment (AGAGE)

    NASA Technical Reports Server (NTRS)

    Prinn, Ronald G.; Kurylo, Michael (Technical Monitor)

    2004-01-01

    We seek funding from NASA for the third year (2005) of the four-year period January 1, 2003 - December 31, 2006 for continued support of the MIT contributions to the multi-national global atmospheric trace species measurement program entitled Advanced Global Atmospheric Gases Experiment (AGAGE). The case for real-time high-frequency measurement networks like AGAGE is very strong and the observations and their interpretation are widely recognized for their importance to ozone depletion and climate change studies and to verification issues arising from the Montreal Protocol (ozone) and Kyoto Protocol (climate). The proposed AGAGE program is distinguished by its capability to measure over the globe at high frequency almost all of the important species in the Montreal Protocol and almost all of the significant non-CO2 gases in the Kyoto Protocol.

  2. 30 CFR 250.909 - What is the Platform Verification Program?

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 30 Mineral Resources 2 2010-07-01 2010-07-01 false What is the Platform Verification Program? 250... Verification Program § 250.909 What is the Platform Verification Program? The Platform Verification Program is the MMS approval process for ensuring that floating platforms; platforms of a new or unique design...

  3. Property-driven functional verification technique for high-speed vision system-on-chip processor

    NASA Astrophysics Data System (ADS)

    Nshunguyimfura, Victor; Yang, Jie; Liu, Liyuan; Wu, Nanjian

    2017-04-01

    The implementation of functional verification in a fast, reliable, and effective manner is a challenging task in a vision chip verification process. The main reason for this challenge is the stepwise nature of existing functional verification techniques. The complexity of vision chip verification is also related to the fact that, in most vision chip design cycles, extensive effort is focused on optimizing chip metrics such as performance, power, and area, while design functional verification is not explicitly considered at the earlier stages at which the soundest decisions are made. In this paper, we propose a semi-automatic property-driven verification technique in which the implementation of all verification components is based on design properties. We introduce a low-dimension property space between the specification space and the implementation space. The aim of this technique is to speed up the verification process for high-performance parallel processing vision chips. Our experimental results show that the proposed technique can improve verification efficiency by up to 20% for a complex vision chip design while reducing simulation and debugging overheads.

  4. Hydrologic data-verification management program plan

    USGS Publications Warehouse

    Alexander, C.W.

    1982-01-01

    Data verification refers to the performance of quality control on hydrologic data that have been retrieved from the field and are being prepared for dissemination to water-data users. Water-data users now have access to computerized data files containing unpublished, unverified hydrologic data. Therefore, it is necessary to develop techniques and systems whereby the computer can perform some data-verification functions before the data are stored in user-accessible files. Computerized data-verification routines can be developed for this purpose. A single, unified concept describing a master data-verification program, using multiple special-purpose subroutines and a screen file containing verification criteria, can probably be adapted to any type and size of computer-processing system. Some traditional manual-verification procedures can be adapted for computerized verification, but new procedures can also be developed that would take advantage of the powerful statistical tools and data-handling procedures available to the computer. Prototype data-verification systems should be developed for all three data-processing environments as soon as possible. The WATSTORE system probably affords the greatest opportunity for long-range research and testing of new verification subroutines. (USGS)
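
    As a cartoon of such a computerized screening subroutine (the actual WATSTORE criteria are not reproduced; the station number and thresholds below are invented), a range check and a rate-of-change check driven by a screen file of per-station criteria might look like:

        import numpy as np

        # Hypothetical "screen file": per-station verification criteria.
        SCREEN = {"09380000": {"min": 0.0, "max": 5000.0, "max_step": 800.0}}

        def verify_series(station, values):
            """Flag samples failing range or rate-of-change criteria."""
            c = SCREEN[station]
            v = np.asarray(values, dtype=float)
            out_of_range = (v < c["min"]) | (v > c["max"])
            jump = np.abs(np.diff(v, prepend=v[0])) > c["max_step"]
            return out_of_range | jump

        flags = verify_series("09380000", [120.0, 135.0, 6200.0, 140.0])
        print(np.flatnonzero(flags))  # indices needing manual review: [2 3]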

  5. The ``X component'' of the radio background

    NASA Astrophysics Data System (ADS)

    Semenova, T. A.; Pariiskii, Yu. N.; Bursov, N. N.

    2009-01-01

    The recent publication of evidence for a new mechanism producing background radio emission of the Galaxy at centimeter wavelengths (in addition to synchrotron radiation, free-free transitions in ionized gas, and the weak radio emission of standard dust) gave rise to a strong reaction among observers, and requires independent experimental verification. This signal is of special concern in connection with studies of the polarization of the cosmic microwave background (CMB) using new-generation experiments. We have derived independent estimates of the validity of the "spinning-dust" hypothesis (dipole emission of macromolecules) using multi-frequency RATAN-600 observations. Test studies in the Perseus molecular cloud show evidence for anomalous extended emission in the absence of strong radio sources (compact HII regions) that could imitate an anomalous radio spectrum in this region. A statistical analysis at centimeter wavelengths over the Ratan Zenith Field shows that the upper limit for the polarized noise from this new component in the spinning-dust hypothesis is unlikely to exceed 1 µK at wavelengths of 1 cm or shorter on the main scales of the EE mode of Sakharov oscillations. Thus, this emission should not hinder studies of this mode, at least to within several percent of the predicted level of polarization of the CMB emission.

  6. Preservation of potassium balance is strongly associated with insect cold tolerance in the field: a seasonal study of Drosophila subobscura.

    PubMed

    MacMillan, Heath A; Schou, Mads F; Kristensen, Torsten N; Overgaard, Johannes

    2016-05-01

    There is interest in pinpointing genes and physiological mechanisms explaining intra- and interspecific variations in cold tolerance, because thermal tolerance phenotypes strongly impact the distribution and abundance of wild animals. Laboratory studies have highlighted that the capacity to preserve water and ion homeostasis is linked to low temperature survival in insects. It remains unknown, however, whether adaptive seasonal acclimatization in free-ranging insects is governed by the same physiological mechanisms. Here, we test whether cold tolerance in field-caught Drosophila subobscura is high in early spring and lower during summer and whether this transition is associated with seasonal changes in the capacity of flies to preserve water and ion balance during cold stress. Indeed, flies caught during summer were less cold tolerant, and exposure of these flies to sub-zero temperatures caused a loss of haemolymph water and increased the concentration of K⁺ in the haemolymph (as in laboratory-reared insects). This pattern of ion and water balance disruption was not observed in more cold-tolerant flies caught in early spring. Thus, we here provide a field verification of hypotheses based on laboratory studies and conclude that the ability to maintain ion homeostasis is important for the ability of free-ranging insects to cope with chilling. © 2016 The Author(s).

  7. Evaluation of a cosmic-ray neutron sensor network for improved land surface model prediction

    NASA Astrophysics Data System (ADS)

    Baatz, Roland; Hendricks Franssen, Harrie-Jan; Han, Xujun; Hoar, Tim; Reemt Bogena, Heye; Vereecken, Harry

    2017-05-01

    In situ soil moisture sensors provide highly accurate but very local soil moisture measurements, while remotely sensed soil moisture is strongly affected by vegetation and surface roughness. In contrast, cosmic-ray neutron sensors (CRNSs) allow highly accurate soil moisture estimation on the field scale which could be valuable to improve land surface model predictions. In this study, the potential of a network of CRNSs installed in the 2354 km² Rur catchment (Germany) for estimating soil hydraulic parameters and improving soil moisture states was tested. Data measured by the CRNSs were assimilated with the local ensemble transform Kalman filter in the Community Land Model version 4.5. Data of four, eight and nine CRNSs were assimilated for the years 2011 and 2012 (with and without soil hydraulic parameter estimation), followed by a verification year 2013 without data assimilation. This was done using (i) a regional high-resolution soil map, (ii) the FAO soil map and (iii) an erroneous, biased soil map as input information for the simulations. For the regional soil map, soil moisture characterization was only improved in the assimilation period but not in the verification period. For the FAO soil map and the biased soil map, soil moisture predictions improved strongly to a root mean square error of 0.03 cm³ cm⁻³ for the assimilation period and 0.05 cm³ cm⁻³ for the evaluation period. Improvements were limited by the measurement error of CRNSs (0.03 cm³ cm⁻³). The positive results obtained with data assimilation of nine CRNSs were confirmed by the jackknife experiments with four and eight CRNSs used for assimilation. The results demonstrate that assimilated data of a CRNS network can improve the characterization of soil moisture content on the catchment scale by updating spatially distributed soil hydraulic parameters of a land surface model.
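
    The analysis step of an ensemble Kalman filter, the family of methods to which the local ensemble transform variant used here belongs, can be sketched compactly. The Python sketch below implements the simpler stochastic (perturbed-observation) EnKF update, not the LETKF itself, and all dimensions, values, and error levels are illustrative only:

        import numpy as np

        # Generic stochastic EnKF analysis step (a simpler relative of the
        # local ensemble transform variant used in the study). Dimensions
        # and error levels are illustrative, not taken from the paper.
        rng = np.random.default_rng(0)
        n_state, n_ens, n_obs = 50, 32, 9              # e.g., soil moisture states, 9 CRNS sites
        X = rng.normal(0.25, 0.05, (n_state, n_ens))   # prior ensemble (cm3/cm3)
        H = np.zeros((n_obs, n_state))                 # observation operator
        H[np.arange(n_obs), np.arange(n_obs)] = 1.0
        R = (0.03 ** 2) * np.eye(n_obs)                # ~0.03 cm3/cm3 CRNS error (per abstract)
        y = rng.normal(0.30, 0.03, n_obs)              # observations

        Xm = X.mean(axis=1, keepdims=True)
        A = X - Xm                                     # ensemble anomalies
        HA = H @ A
        P_yy = HA @ HA.T / (n_ens - 1) + R             # innovation covariance
        P_xy = A @ HA.T / (n_ens - 1)                  # state-observation cross covariance
        K = P_xy @ np.linalg.inv(P_yy)                 # Kalman gain

        # Perturbed-observation update of each ensemble member
        Y = y[:, None] + rng.multivariate_normal(np.zeros(n_obs), R, n_ens).T
        Xa = X + K @ (Y - H @ X)                       # analysis ensemble
        print(Xa.mean(axis=1)[:5])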

  8. Cross-Language Phonological Activation of Meaning: Evidence from Category Verification

    ERIC Educational Resources Information Center

    Friesen, Deanna C.; Jared, Debra

    2012-01-01

    The study investigated phonological processing in bilingual reading for meaning. English-French and French-English bilinguals performed a category verification task in either their first or second language. Interlingual homophones (words that share phonology across languages but not orthography or meaning) and single language control words served…

  9. Academic Self-Esteem and Perceived Validity of Grades: A Test of Self-Verification Theory.

    ERIC Educational Resources Information Center

    Okun, Morris A.; Fournet, Lee M.

    1993-01-01

    The hypothesis derived from self-verification theory that semester grade point average would be positively related to perceived validity of grade scores among high self-esteem undergraduates and inversely related for low self-esteem students was not supported in a study with 281 undergraduates. (SLD)

  10. Implementation of Precision Verification Solvents on the External Tank

    NASA Technical Reports Server (NTRS)

    Campbell, M.

    1998-01-01

    This paper presents the Implementation of Precision Verification Solvents on the External Tank. The topics include: 1) Background; 2) Solvent Usages; 3) TCE (Trichloroethylene) Reduction; 4) Solvent Replacement Studies; 5) Implementation; 6) Problems Occurring During Implementation; and 7) Future Work. This paper is presented in viewgraph form.

  11. A UVM simulation environment for the study, optimization and verification of HL-LHC digital pixel readout chips

    NASA Astrophysics Data System (ADS)

    Marconi, S.; Conti, E.; Christiansen, J.; Placidi, P.

    2018-05-01

    The operating conditions of the High Luminosity upgrade of the Large Hadron Collider are very demanding for the design of next generation hybrid pixel readout chips in terms of particle rate, radiation level and data bandwidth. To this purpose, the RD53 Collaboration has developed for the ATLAS and CMS experiments a dedicated simulation and verification environment using industry-consolidated tools and methodologies, such as SystemVerilog and the Universal Verification Methodology (UVM). This paper presents how the so-called VEPIX53 environment has first guided the design of digital architectures optimized for processing and buffering very high particle rates, and second, how it has been reused for the functional verification of the first large-scale demonstrator chip designed by the collaboration, which has recently been submitted.
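
    A UVM testbench is organized around reusable components such as drivers, monitors, and scoreboards that exchange transactions, with a reference model predicting expected behavior. As a toy, language-neutral illustration of the scoreboard idea only (the actual VEPIX53 environment is written in SystemVerilog/UVM), here is a Python analogue; the transaction fields and the stand-in device under test (DUT) are invented:

        from dataclasses import dataclass
        import random

        # Toy analogue of a UVM-style scoreboard: each stimulus transaction
        # goes to both the DUT and a reference model; the scoreboard compares.
        # The "DUT" is a stand-in function, not the RD53 pixel chip.

        @dataclass
        class HitTxn:
            row: int
            col: int
            tot: int  # time over threshold

        def reference_model(txn):          # golden prediction
            return (txn.row, txn.col, txn.tot)

        def dut(txn):                      # stand-in DUT with an injected bug
            tot = txn.tot if txn.row != 3 else txn.tot + 1
            return (txn.row, txn.col, tot)

        mismatches = 0
        for _ in range(1000):
            txn = HitTxn(random.randrange(8), random.randrange(8), random.randrange(16))
            if dut(txn) != reference_model(txn):
                mismatches += 1
        print("mismatches:", mismatches)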

  12. Design of verification platform for wireless vision sensor networks

    NASA Astrophysics Data System (ADS)

    Ye, Juanjuan; Shang, Fei; Yu, Chuang

    2017-08-01

    At present, the majority of research on wireless vision sensor networks (WVSNs) still remains in the software simulation stage, and very few verification platforms for WVSNs are available for use. This situation seriously restricts the transformation of WVSNs from theoretical research to practical application. Therefore, it is necessary to study the construction of a verification platform for WVSNs. This paper combines a wireless transceiver module, a visual information acquisition module, and a power acquisition module to design a high-performance wireless vision sensor node built around an ARM11 microprocessor, selects AODV as the routing protocol, and sets up a verification platform for WVSNs called AdvanWorks. Experiments show that AdvanWorks can successfully achieve image acquisition, coding, and wireless transmission, and can obtain effective inter-node distance parameters, which lays a good foundation for follow-up applications of WVSNs.

  13. Quantum integrability and functional equations

    NASA Astrophysics Data System (ADS)

    Volin, Dmytro

    2010-03-01

    In this thesis a general procedure to represent the integral Bethe Ansatz equations in the form of a Riemann-Hilbert problem is given. This allows us to study integrable spin chains in the thermodynamic limit in a simple way. Based on the functional equations, we give a procedure for finding the subleading orders in the solution of various integral equations solved to leading order by the Wiener-Hopf technique. The integral equations are studied in the context of the AdS/CFT correspondence, where their solution allows verification of the integrability conjecture up to two loops of the strong coupling expansion. In the context of two-dimensional sigma models, we analyze the large-order behavior of the asymptotic perturbative expansion. The experience gained with the functional representation of the integral equations also allowed us to solve explicitly the crossing equations that appear in the AdS/CFT spectral problem.

  14. SU-F-T-268: A Feasibility Study of Independent Dose Verification for Vero4DRT

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Yamashita, M; Kokubo, M; Institute of Biomedical Research and Innovation, Kobe, Hyogo

    2016-06-15

    Purpose: Vero4DRT (Mitsubishi Heavy Industries Ltd.) has been released for a few years. The treatment planning system (TPS) of Vero4DRT is dedicated, so measurement has been the only method of dose verification. There have been no reports of independent dose verification using a Clarkson-based algorithm for Vero4DRT. An independent dose verification software program for general-purpose linacs, using a modified Clarkson-based algorithm, was modified for Vero4DRT. In this study, we evaluated the accuracy of the independent dose verification program and the feasibility of the secondary check for Vero4DRT. Methods: iPlan (Brainlab AG) was used as the TPS. Pencil Beam Convolution was used as the dose calculation algorithm for IMRT and X-ray Voxel Monte Carlo was used for the others. Simple MU Analysis (SMU, Triangle Products, Japan) was used as the independent dose verification software program, in which CT-based dose calculation is performed using a modified Clarkson-based algorithm. In this study, 120 patients’ treatment plans were collected in our institute. The treatments were performed using conventional irradiation for lung and prostate, SBRT for lung, and step-and-shoot IMRT for prostate. Doses from the TPS and the SMU were compared, and confidence limits (CLs, mean ± 2SD %) were compared to those from general-purpose linacs. Results: The CLs for conventional irradiation (lung), conventional irradiation (prostate), SBRT (lung) and IMRT (prostate) were 2.2 ± 3.5% (CL of the general-purpose linac: 2.4 ± 5.3%), 1.1 ± 1.7% (−0.3 ± 2.0%), 4.8 ± 3.7% (5.4 ± 5.3%) and −0.5 ± 2.5% (−0.1 ± 3.6%), respectively. The CLs for Vero4DRT are similar to those for the general-purpose linac. Conclusion: The independent dose verification for the new linac is clinically available as a secondary check, and we performed the check with a tolerance level similar to that of the general-purpose linac. This research is partially supported by the Japan Agency for Medical Research and Development (AMED).
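
    The Clarkson method referenced here estimates the scatter contribution for an irregular field by dividing the field into equal angular sectors about the calculation point and averaging a tabulated scatter quantity evaluated at each sector's field-edge radius. A schematic Python sketch of the sector-integration idea follows; the scatter table, sector count, and field shape are hypothetical stand-ins, not the SMU implementation:

        import math

        # Schematic Clarkson sector integration: average a tabulated scatter
        # factor over equal angular sectors of an irregular field boundary.
        # The scatter table and field shape below are hypothetical.

        def scatter_factor(radius_cm):
            # Hypothetical monotone scatter table (stand-in for measured data).
            return 0.20 * (1.0 - math.exp(-radius_cm / 8.0))

        def clarkson_scatter(boundary_radius, n_sectors=36):
            """boundary_radius(theta) gives field-edge distance from the calc point."""
            total = 0.0
            for k in range(n_sectors):
                theta = 2.0 * math.pi * (k + 0.5) / n_sectors
                total += scatter_factor(boundary_radius(theta))
            return total / n_sectors

        # Example: an elliptical field with 8 cm x 5 cm semi-axes, centered
        # on the calculation point.
        ellipse = lambda t: 1.0 / math.sqrt((math.cos(t) / 8.0) ** 2 + (math.sin(t) / 5.0) ** 2)
        print(round(clarkson_scatter(ellipse), 4))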

  15. Impact of radiation attenuation by a carbon fiber couch on patient dose verification

    NASA Astrophysics Data System (ADS)

    Yu, Chun-Yen; Chou, Wen-Tsae; Liao, Yi-Jen; Lee, Jeng-Hung; Liang, Ji-An; Hsu, Shih-Ming

    2017-02-01

    The aim of this study was to understand the difference between the measured and calculated irradiation attenuations obtained using two algorithms and to identify the influence of couch attenuation on patient dose verification. We performed eight tests of couch attenuation with two photon energies, two longitudinal couch positions, and two rail positions. The couch attenuation was determined using a radiation treatment planning system. The measured and calculated attenuations were compared. We also performed 12 verifications of head-and-neck and rectum cases by using a Delta phantom. The dose deviation (DD), distance to agreement (DTA), and gamma index of pencil-beam convolution (PBC) verifications were nearly the same. The agreement was least consistent for the anisotropic analytical algorithm (AAA) without the couch for the head-and-neck case, in which the DD, DTA, and gamma index were 74.4%, 99.3%, and 89%, respectively; for the rectum case, the corresponding values were 56.2%, 95.1%, and 92.4%. We suggest that dose verification should be performed using the following three metrics simultaneously: DD, DTA, and the gamma index.
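
    The gamma index used above combines the dose deviation (DD) and distance to agreement (DTA) into a single pass/fail metric: an evaluated point passes if some reference point lies within the combined dose/distance tolerance. A simplified 1D Python sketch, assuming the common 3%/3 mm criterion (the study's criterion is not stated in the abstract), with toy profiles:

        import numpy as np

        # 1D gamma index: for each evaluated point, search reference points
        # for the minimum combined dose-difference/distance metric.
        def gamma_1d(x_ref, d_ref, x_eval, d_eval, dd=0.03, dta_mm=3.0):
            d_norm = dd * d_ref.max()                 # global dose normalization
            gammas = []
            for xe, de in zip(x_eval, d_eval):
                g2 = ((x_ref - xe) / dta_mm) ** 2 + ((d_ref - de) / d_norm) ** 2
                gammas.append(np.sqrt(g2.min()))
            return np.array(gammas)

        x = np.linspace(0, 100, 501)                  # positions in mm
        ref = np.exp(-((x - 50) / 15) ** 2)           # toy reference profile
        meas = 1.02 * np.exp(-((x - 51) / 15) ** 2)   # 2% scaling, 1 mm shift
        g = gamma_1d(x, ref, x, meas)
        print("pass rate:", (g <= 1.0).mean())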

  16. Psychiatric Residents' Attitudes toward and Experiences with the Clinical-Skills Verification Process: A Pilot Study on U.S. and International Medical Graduates

    ERIC Educational Resources Information Center

    Rao, Nyapati R.; Kodali, Rahul; Mian, Ayesha; Ramtekkar, Ujjwal; Kamarajan, Chella; Jibson, Michael D.

    2012-01-01

    Objective: The authors report on a pilot study of the experiences and perceptions of foreign international medical graduate (F-IMG), United States international medical graduate (US-IMG), and United States medical graduate (USMG) psychiatric residents with the newly mandated Clinical Skills Verification (CSV) process. The goal was to identify and…

  17. Verification of a three-dimensional viscous flow analysis for a single stage compressor

    NASA Astrophysics Data System (ADS)

    Matsuoka, Akinori; Hashimoto, Keisuke; Nozaki, Osamu; Kikuchi, Kazuo; Fukuda, Masahiro; Tamura, Atsuhiro

    1992-12-01

    A transonic flowfield around the rotor blades of a highly loaded single-stage axial compressor was numerically analyzed by a three-dimensional compressible Navier-Stokes code using a Chakravarthy-Osher type total variation diminishing (TVD) scheme. A stage analysis, which calculates the flowfields around the inlet guide vane (IGV) and the rotor blades simultaneously, was carried out. Compared with design values and experimental data, the computed results show slight quantitative differences, but the numerical calculation simulates well the pressure-rise characteristics of the compressor and its flow pattern, including a strong shock surface.

  18. Automated Verification of Design Patterns with LePUS3

    NASA Technical Reports Server (NTRS)

    Nicholson, Jonathan; Gasparis, Epameinondas; Eden, Ammon H.; Kazman, Rick

    2009-01-01

    Specification and [visual] modelling languages are expected to combine strong abstraction mechanisms with rigour, scalability, and parsimony. LePUS3 is a visual, object-oriented design description language axiomatized in a decidable subset of the first-order predicate logic. We demonstrate how LePUS3 is used to formally specify a structural design pattern and prove (verify) whether any Java™ 1.4 program satisfies that specification. We also show how LePUS3 specifications (charts) are composed and how they are verified fully automatically in the Two-Tier Programming Toolkit.

  19. Quantitative ultrasonic evaluation of engineering properties in metals, composites and ceramics

    NASA Technical Reports Server (NTRS)

    Vary, A.

    1980-01-01

    Ultrasonic technology from the perspective of nondestructive evaluation approaches to material strength prediction and property verification is reviewed. Emergent advanced technology involving quantitative ultrasonic techniques for materials characterization is described. Ultrasonic methods are particularly useful in this area because they involve mechanical elastic waves that are strongly modulated by the same morphological factors that govern mechanical strength and dynamic failure processes. It is emphasized that the technology is in its infancy and that much effort is still required before all the available techniques can be transferred from laboratory to industrial environments.

  20. Space shuttle engineering and operations support. Avionics system engineering

    NASA Technical Reports Server (NTRS)

    Broome, P. A.; Neubaur, R. J.; Welsh, R. T.

    1976-01-01

    The shuttle avionics integration laboratory (SAIL) requirements for supporting the Spacelab/orbiter avionics verification process are defined. The principal topics are a Spacelab avionics hardware assessment, test operations center/electronic systems test laboratory (TOC/ESL) data processing requirements definition, SAIL (Building 16) payload accommodations study, and projected funding and test scheduling. Because of the complex nature of the Spacelab/orbiter computer systems, the PCM data link, and the high rate digital data system hardware/software relationships, early avionics interface verification is required. The SAIL is a prime candidate test location to accomplish this early avionics verification.

  1. Quantitative Prediction of the Effect of CYP3A Inhibitors and Inducers on Venetoclax Pharmacokinetics Using a Physiologically Based Pharmacokinetic Model.

    PubMed

    Freise, Kevin J; Shebley, Mohamad; Salem, Ahmed Hamed

    2017-06-01

    The objectives of the analysis were to develop and verify a venetoclax physiologically based pharmacokinetic (PBPK) model to predict the effects of cytochrome P450 3A (CYP3A) inhibitors and inducers on the PK of venetoclax and inform dosing recommendations. A minimal PBPK model was developed based on prior in vitro and in vivo clinical data using a “middle-out” approach. The PBPK model was independently verified against clinical studies of the strong CYP3A inhibitor ketoconazole, the strong CYP3A inducer rifampin (multiple dose), and the steady-state venetoclax PK in chronic lymphocytic leukemia (CLL) subjects by comparing predicted to observed ratios of the venetoclax maximum concentration (Cmax R) and area under the curve from time 0 to infinity (AUC∞ R) from these studies. The verified PBPK model was then used to simulate the effects of different CYP3A inhibitors and inducers on the venetoclax PK. Comparison of the PBPK-model-predicted to the observed PK parameters indicated good agreement. Verification of the PBPK model demonstrated that the ratios of the predicted:observed Cmax R and AUC∞ R of venetoclax were within the 0.8- to 1.25-fold range for strong CYP3A inhibitors and inducers. Model simulations indicated no effect of weak CYP3A inhibitors or inducers on Cmax or AUC∞, while both moderate and strong CYP3A inducers were estimated to decrease venetoclax exposure. Moderate and strong CYP3A inhibitors were estimated to increase venetoclax AUC∞ by 100% to 390% and 480% to 680%, respectively. The recommended venetoclax dose reductions of at least 50% and 75% when coadministered with moderate and strong CYP3A inhibitors, respectively, maintain venetoclax exposures between therapeutic and maximally administered safe doses. © 2017, The American College of Clinical Pharmacology.
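
    The verification criterion in this abstract is simple to state programmatically: predicted:observed ratios of Cmax R and AUC∞ R must fall within the 0.8- to 1.25-fold range. A minimal check, with purely illustrative numbers (the study's actual values are not reproduced here):

        # Verification check as described in the abstract: predicted:observed
        # ratios must lie within 0.8-1.25. All numbers are illustrative.
        def within_fold_bounds(predicted, observed, lo=0.8, hi=1.25):
            ratio = predicted / observed
            return lo <= ratio <= hi, ratio

        cases = {
            "ketoconazole AUC_R": (6.1, 6.4),   # hypothetical predicted, observed
            "rifampin AUC_R":     (0.29, 0.27),
        }
        for name, (pred, obs) in cases.items():
            ok, r = within_fold_bounds(pred, obs)
            print(f"{name}: ratio={r:.2f} {'PASS' if ok else 'FAIL'}")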

  2. Systematic Model-in-the-Loop Test of Embedded Control Systems

    NASA Astrophysics Data System (ADS)

    Krupp, Alexander; Müller, Wolfgang

    Current model-based development processes offer new opportunities for verification automation, e.g., in automotive development. The duty of functional verification is the detection of design flaws. Current functional verification approaches exhibit a major gap between requirement definition and formal property definition, especially when analog signals are involved. Besides the lack of methodical support for natural-language formalization, there does not exist a standardized and accepted means for formal property definition as a target for verification planning. This article addresses several shortcomings of embedded system verification. An Enhanced Classification Tree Method is developed based on the established Classification Tree Method for Embedded Systems (CTM/ES), which applies a hardware verification language to define a verification environment.

  3. Verification of VLSI designs

    NASA Technical Reports Server (NTRS)

    Windley, P. J.

    1991-01-01

    In this paper we explore the specification and verification of VLSI designs. The paper focuses on abstract specification and verification of functionality using mathematical logic, as opposed to low-level boolean equivalence verification such as that done using BDDs and model checking. Specification and verification, sometimes called formal methods, is one tool for increasing computer dependability in the face of an exponentially increasing testing effort.

  4. Using SysML for verification and validation planning on the Large Synoptic Survey Telescope (LSST)

    NASA Astrophysics Data System (ADS)

    Selvy, Brian M.; Claver, Charles; Angeli, George

    2014-08-01

    This paper provides an overview of the tool, language, and methodology used for Verification and Validation Planning on the Large Synoptic Survey Telescope (LSST) Project. LSST has implemented a Model Based Systems Engineering (MBSE) approach as a means of defining all systems engineering planning and definition activities that have historically been captured in paper documents. Specifically, LSST has adopted the Systems Modeling Language (SysML) standard and is utilizing a software tool called Enterprise Architect, developed by Sparx Systems. Much of the historical use of SysML has focused on the early phases of the project life cycle. Our approach is to extend the advantages of MBSE into later stages of the construction project. This paper details the methodology employed to use the tool to document the verification planning phases, including the extension of the language to accommodate the project's needs. The process includes defining the Verification Plan for each requirement, which in turn consists of a Verification Requirement, Success Criteria, Verification Method(s), Verification Level, and Verification Owner. Each Verification Method for each Requirement is defined as a Verification Activity and mapped into Verification Events, which are collections of activities that can be executed concurrently in an efficient and complementary way. Verification Event dependency and sequences are modeled using Activity Diagrams. The methodology employed also ties in to the Project Management Control System (PMCS), which utilizes Primavera P6 software, mapping each Verification Activity as a step in a planned activity. This approach leads to full traceability from initial Requirement to scheduled, costed, and resource loaded PMCS task-based activities, ensuring all requirements will be verified.
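
    The plan structure described above (each requirement mapped to a Verification Requirement, Success Criteria, Method(s), Level, and Owner, with activities grouped into concurrently executable events) translates naturally into a small data model. A Python sketch outside of SysML/Enterprise Architect follows; all example field values are hypothetical, not drawn from the LSST model:

        from dataclasses import dataclass, field
        from enum import Enum

        # Sketch of the verification-planning structure described above.
        # All example values are hypothetical.

        class Method(Enum):
            TEST = "test"
            ANALYSIS = "analysis"
            INSPECTION = "inspection"
            DEMONSTRATION = "demonstration"

        @dataclass
        class VerificationPlan:
            requirement_id: str
            verification_requirement: str
            success_criteria: str
            methods: list          # list of Method
            level: str             # e.g., subsystem, system
            owner: str

        @dataclass
        class VerificationEvent:
            name: str
            activities: list = field(default_factory=list)  # plans run together

        plan = VerificationPlan(
            requirement_id="REQ-OPT-042",
            verification_requirement="Measure wavefront error under nominal gravity load",
            success_criteria="RMS wavefront error within allocation",
            methods=[Method.TEST, Method.ANALYSIS],
            level="subsystem",
            owner="Optics IPT",
        )
        event = VerificationEvent(name="Mirror cell acceptance campaign", activities=[plan])
        print(event.name, "->", [a.requirement_id for a in event.activities])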

  5. Design for Verification: Enabling Verification of High Dependability Software-Intensive Systems

    NASA Technical Reports Server (NTRS)

    Mehlitz, Peter C.; Penix, John; Markosian, Lawrence Z.; Koga, Dennis (Technical Monitor)

    2003-01-01

    Strategies to achieve confidence that high-dependability applications are correctly implemented include testing and automated verification. Testing deals mainly with a limited number of expected execution paths. Verification usually attempts to deal with a larger number of possible execution paths. While the impact of architecture design on testing is well known, its impact on most verification methods is not as well understood. The Design for Verification approach considers verification from the application development perspective, in which system architecture is designed explicitly according to the application's key properties. The D4V-hypothesis is that the same general architecture and design principles that lead to good modularity, extensibility and complexity/functionality ratio can be adapted to overcome some of the constraints on verification tools, such as the production of hand-crafted models and the limits on dynamic and static analysis caused by state space explosion.

  6. Mode conversion at density irregularities in the LAPD

    NASA Astrophysics Data System (ADS)

    Kersten, Kristopher; Cattell, Cynthia; van Compernolle, Bart; Gekelman, Walter; Pribyl, Pat; Vincena, Steve

    2010-11-01

    Mode conversion of electrostatic plasma oscillations to electromagnetic radiation is commonly observed in space plasmas as Type II and III radio bursts. Much theoretical work has addressed the phenomenon, but due to the transient nature and generation location of the bursts, experimental verification via in situ observation has proved difficult. The Large Plasma Device (LAPD) provides a reproducible plasma environment that can be tailored for the study of space plasma phenomena. A highly configurable axial magnetic field and flexible diagnostics make the device well suited for the study of plasma instabilities at density gradients. We present preliminary results of mode conversion studies performed at the LAPD. The studies employed an electron beam source configured to drive Langmuir waves towards high density plasma near the cathode discharge. Internal floating potential probes show the expected plasma oscillations ahead of the beam cathode, and external microwave antenna signals reveal a strong band of radiation near the plasma frequency that persists into the low density plasma afterglow.

  7. Space Shuttle Range Safety Command Destruct System Analysis and Verification. Phase 1. Destruct System Analysis and Verification

    DTIC Science & Technology

    1981-03-01

    overcome the shortcomings of this system. A phase III study develops the breakup model of the Space Shuttle cluster at various times into flight. The...

  8. Why Verifying Diagnostic Decisions with a Checklist Can Help: Insights from Eye Tracking

    ERIC Educational Resources Information Center

    Sibbald, Matthew; de Bruin, Anique B. H.; Yu, Eric; van Merrienboer, Jeroen J. G.

    2015-01-01

    Making a diagnosis involves ratifying or verifying a proposed answer. Formalizing this verification process with checklists, which highlight key variables involved in the diagnostic decision, is often advocated. However, the mechanisms by which a checklist might allow clinicians to improve their verification process have not been well studied. We…

  9. 38 CFR 21.7652 - Certification of enrollment and verification of pursuit.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... maintain daily attendance records for any course leading to a standard college degree. (a) Content of... breaks between school years. (3) When a reservist enrolls in independent study leading to a standard...) Verification of pursuit. (1) A reservist who is pursuing a course leading to a standard college degree must...

  10. The Interaction between Surface Color and Color Knowledge: Behavioral and Electrophysiological Evidence

    ERIC Educational Resources Information Center

    Bramao, Ines; Faisca, Luis; Forkstam, Christian; Inacio, Filomena; Araujo, Susana; Petersson, Karl Magnus; Reis, Alexandra

    2012-01-01

    In this study, we used event-related potentials (ERPs) to evaluate the contribution of surface color and color knowledge information in object identification. We constructed two color-object verification tasks--a surface and a knowledge verification task--using high color diagnostic objects; both typical and atypical color versions of the same…

  11. 38 CFR 21.7652 - Certification of enrollment and verification of pursuit.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... maintain daily attendance records for any course leading to a standard college degree. (a) Content of... breaks between school years. (3) When a reservist enrolls in independent study leading to a standard...) Verification of pursuit. (1) A reservist who is pursuing a course leading to a standard college degree must...

  12. Experimental verification of a Monte Carlo-based MLC simulation model for IMRT dose calculations in heterogeneous media

    NASA Astrophysics Data System (ADS)

    Tyagi, N.; Curran, B. H.; Roberson, P. L.; Moran, J. M.; Acosta, E.; Fraass, B. A.

    2008-02-01

    IMRT often requires delivering small fields which may suffer from electronic disequilibrium effects. The presence of heterogeneities, particularly low-density tissues in patients, complicates such situations. In this study, we report on verification of the DPM MC code for IMRT treatment planning in heterogeneous media, using a previously developed model of the Varian 120-leaf MLC. The purpose of this study is twofold: (a) design a comprehensive list of experiments in heterogeneous media for verification of any dose calculation algorithm and (b) verify our MLC model in these heterogeneous type geometries that mimic an actual patient geometry for IMRT treatment. The measurements have been done using an IMRT head and neck phantom (CIRS phantom) and slab phantom geometries. Verification of the MLC model has been carried out using point doses measured with an A14 slim line (SL) ion chamber inside a tissue-equivalent and a bone-equivalent material using the CIRS phantom. Planar doses using lung and bone equivalent slabs have been measured and compared using EDR films (Kodak, Rochester, NY).

  13. TRIAD: The Translational Research Informatics and Data Management Grid

    PubMed Central

    Payne, P.; Ervin, D.; Dhaval, R.; Borlawsky, T.; Lai, A.

    2011-01-01

    Objective Multi-disciplinary and multi-site biomedical research programs frequently require infrastructures capable of enabling the collection, management, analysis, and dissemination of heterogeneous, multi-dimensional, and distributed data and knowledge collections spanning organizational boundaries. We report on the design and initial deployment of an extensible biomedical informatics platform that is intended to address such requirements. Methods A common approach to distributed data, information, and knowledge management needs in the healthcare and life science settings is the deployment and use of a service-oriented architecture (SOA). Such SOA technologies provide for strongly-typed, semantically annotated, and stateful data and analytical services that can be combined into data and knowledge integration and analysis “pipelines.” Using this overall design pattern, we have implemented and evaluated an extensible SOA platform for clinical and translational science applications known as the Translational Research Informatics and Data-management grid (TRIAD). TRIAD is a derivative and extension of the caGrid middleware and has an emphasis on supporting agile “working interoperability” between data, information, and knowledge resources. Results Based upon initial verification and validation studies conducted in the context of a collection of driving clinical and translational research problems, we have been able to demonstrate that TRIAD achieves agile “working interoperability” between distributed data and knowledge sources. Conclusion Informed by our initial verification and validation studies, we believe TRIAD provides an example instance of a lightweight and readily adoptable approach to the use of SOA technologies in the clinical and translational research setting. Furthermore, our initial use cases illustrate the importance and efficacy of enabling “working interoperability” in heterogeneous biomedical environments. PMID:23616879

  14. TRIAD: The Translational Research Informatics and Data Management Grid.

    PubMed

    Payne, P; Ervin, D; Dhaval, R; Borlawsky, T; Lai, A

    2011-01-01

    Multi-disciplinary and multi-site biomedical research programs frequently require infrastructures capable of enabling the collection, management, analysis, and dissemination of heterogeneous, multi-dimensional, and distributed data and knowledge collections spanning organizational boundaries. We report on the design and initial deployment of an extensible biomedical informatics platform that is intended to address such requirements. A common approach to distributed data, information, and knowledge management needs in the healthcare and life science settings is the deployment and use of a service-oriented architecture (SOA). Such SOA technologies provide for strongly-typed, semantically annotated, and stateful data and analytical services that can be combined into data and knowledge integration and analysis "pipelines." Using this overall design pattern, we have implemented and evaluated an extensible SOA platform for clinical and translational science applications known as the Translational Research Informatics and Data-management grid (TRIAD). TRIAD is a derivative and extension of the caGrid middleware and has an emphasis on supporting agile "working interoperability" between data, information, and knowledge resources. Based upon initial verification and validation studies conducted in the context of a collection of driving clinical and translational research problems, we have been able to demonstrate that TRIAD achieves agile "working interoperability" between distributed data and knowledge sources. Informed by our initial verification and validation studies, we believe TRIAD provides an example instance of a lightweight and readily adoptable approach to the use of SOA technologies in the clinical and translational research setting. Furthermore, our initial use cases illustrate the importance and efficacy of enabling "working interoperability" in heterogeneous biomedical environments.

  15. Online 3D EPID-based dose verification: Proof of concept.

    PubMed

    Spreeuw, Hanno; Rozendaal, Roel; Olaciregui-Ruiz, Igor; González, Patrick; Mans, Anton; Mijnheer, Ben; van Herk, Marcel

    2016-07-01

    Delivery errors during radiotherapy may lead to medical harm and reduced life expectancy for patients. Such serious incidents can be avoided by performing dose verification online, i.e., while the patient is being irradiated, creating the possibility of halting the linac in case of a large overdosage or underdosage. The offline EPID-based 3D in vivo dosimetry system clinically employed at our institute is in principle suited for online treatment verification, provided the system is able to complete 3D dose reconstruction and verification within 420 ms, the present acquisition time of a single EPID frame. It is the aim of this study to show that our EPID-based dosimetry system can be made fast enough to achieve online 3D in vivo dose verification. The current dose verification system was sped up in two ways. First, a new software package was developed to perform all computations that are not dependent on portal image acquisition separately, thus removing the need for doing these calculations in real time. Second, the 3D dose reconstruction algorithm was sped up via a new, multithreaded implementation. Dose verification was implemented by comparing planned with reconstructed 3D dose distributions delivered to two regions in a patient: the target volume and the nontarget volume receiving at least 10 cGy. In both volumes, the mean dose is compared, while in the nontarget volume, the near-maximum dose (D2) is compared as well. The real-time dosimetry system was tested by irradiating an anthropomorphic phantom with three VMAT plans: a 6 MV head-and-neck treatment plan, a 10 MV rectum treatment plan, and a 10 MV prostate treatment plan. In all plans, two types of serious delivery errors were introduced. The functionality of automatically halting the linac was also implemented and tested. The precomputation time per treatment was ∼180 s/treatment arc, depending on gantry angle resolution. The complete processing of a single portal frame, including dose verification, took 266 ± 11 ms on a dual octocore Intel Xeon E5-2630 CPU running at 2.40 GHz. The introduced delivery errors were detected after 5-10 s irradiation time. A prototype online 3D dose verification tool using portal imaging has been developed and successfully tested for two different kinds of gross delivery errors. Thus, online 3D dose verification has been technologically achieved.
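
    The control flow described above (reconstruct each EPID frame's cumulative dose within the 420 ms frame period and halt the linac on a gross deviation) reduces to a per-frame loop with a deadline and a threshold. A schematic Python sketch follows; the deviation limit, frame source, reconstruction, and linac interface are all hypothetical stand-ins:

        import time

        # Schematic per-frame verification loop: reconstruct, compare, and
        # halt on a gross deviation. All hooks below are stand-ins.
        FRAME_PERIOD_S = 0.420        # EPID frame acquisition time (per abstract)
        DEVIATION_LIMIT = 0.20        # hypothetical cumulative-dose deviation limit

        def acquire_frame(i):                    # stand-in for portal image acquisition
            return {"index": i}

        def reconstruct_cumulative_dose(frame):  # stand-in for 3D dose reconstruction
            return 1.0 + (0.3 if frame["index"] >= 20 else 0.0)  # injected "error"

        def planned_cumulative_dose(i):
            return 1.0

        def halt_linac():
            print("beam hold requested")

        for i in range(30):
            t0 = time.monotonic()
            frame = acquire_frame(i)
            dev = abs(reconstruct_cumulative_dose(frame) / planned_cumulative_dose(i) - 1.0)
            if dev > DEVIATION_LIMIT:
                halt_linac()
                break
            # Processing must finish within one frame period to keep up.
            assert time.monotonic() - t0 < FRAME_PERIOD_S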

  16. A quantification of the effectiveness of EPID dosimetry and software-based plan verification systems in detecting incidents in radiotherapy

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bojechko, Casey; Phillps, Mark; Kalet, Alan

    Purpose: Complex treatments in radiation therapy require robust verification in order to prevent errors that can adversely affect the patient. For this purpose, the authors estimate the effectiveness of detecting errors with a “defense in depth” system composed of electronic portal imaging device (EPID) based dosimetry and a software-based system composed of rules-based and Bayesian network verifications. Methods: The authors analyzed incidents with a high potential severity score, scored as a 3 or 4 on a 4 point scale, recorded in an in-house voluntary incident reporting system, collected from February 2012 to August 2014. The incidents were categorized into different failure modes. The detectability, defined as the number of incidents that are detectable divided by the total number of incidents, was calculated for each failure mode. Results: In total, 343 incidents were used in this study. Of the incidents, 67% were related to photon external beam therapy (EBRT). The majority of the EBRT incidents were related to patient positioning and only a small number of these could be detected by EPID dosimetry when performed prior to treatment (6%). A large fraction could be detected by in vivo dosimetry performed during the first fraction (74%). Rules-based and Bayesian network verifications were found to be complementary to EPID dosimetry, able to detect errors related to patient prescriptions and documentation, and errors unrelated to photon EBRT. Combining all of the verification steps together, 91% of all EBRT incidents could be detected. Conclusions: This study shows that the defense in depth system is potentially able to detect a large majority of incidents. The most effective EPID-based dosimetry verification is in vivo measurements during the first fraction and is complemented by rules-based and Bayesian network plan checking.
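
    Detectability as defined in this abstract is a straightforward ratio per failure mode, and layered ("defense in depth") checks combine as a union over detection mechanisms. A small Python sketch with hypothetical incident counts and labels (not the study's data):

        # Detectability = detectable incidents / total incidents, per layer or
        # combination of layers. Incident records below are hypothetical.
        incidents = [
            # (failure mode, set of checks that would catch it)
            ("patient positioning", {"in_vivo"}),
            ("patient positioning", set()),
            ("prescription", {"bayesian", "rules"}),
            ("documentation", {"rules"}),
            ("plan transfer", {"epid_pre", "in_vivo"}),
        ]

        def detectability(incs, checks):
            hits = sum(1 for _, caught_by in incs if caught_by & checks)
            return hits / len(incs)

        print(detectability(incidents, {"epid_pre"}))                       # single layer
        print(detectability(incidents, {"epid_pre", "in_vivo", "rules", "bayesian"}))  # all layers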

  17. Impact of a quality-assessment dashboard on the comprehensive review of pharmacist performance.

    PubMed

    Trinh, Long D; Roach, Erin M; Vogan, Eric D; Lam, Simon W; Eggers, Garrett G

    2017-09-01

    The impact of a quality-assessment dashboard and individualized pharmacist performance feedback on the adherence of order verification was evaluated. A before-and-after study was conducted at a 1,440-bed academic medical center. Adherence of order verification was defined as orders verified according to institution-derived, medication-related guidelines and policies. Formulas were developed to assess the adherence of verified orders to dosing guidelines using patient-specific height, weight, and serum creatinine clearance values from the electronic medical record at the time of pharmacist verification. A total of 5 medications were assessed by the formulas for adherence and displayed on the dashboard: ampicillin-sulbactam, ciprofloxacin, piperacillin-tazobactam, acyclovir, and enoxaparin. Adherence of order verification was assessed before (May 1-July 31, 2015) and after (November 1, 2015-January 31, 2016) individualized performance feedback was given based on trends identified by the quality-assessment dashboard. There was a significant increase in the overall adherence rate postintervention (90.1% versus 91.9%, p = 0.040). Among the 34 pharmacists who participated, the percentage of pharmacists with at least 90% overall adherence increased postintervention (52.9% versus 70.6%, p = 0.103). Time to verification was similar before and after the study intervention (median, 6.0 minutes; interquartile range, 3-13 minutes). The rate of documentation for nonadherent orders increased significantly postintervention (57.1% versus 68.5%, p = 0.019). The implementation of the quality-assessment dashboard, educational sessions, and individualized performance feedback significantly improved pharmacist order-verification adherence to institution-derived, medication-related guidelines and policies and the documentation rate of nonadherent orders. Copyright © 2017 by the American Society of Health-System Pharmacists, Inc. All rights reserved.
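
    The dashboard logic described amounts to evaluating each verified order against a dosing rule keyed to patient-specific values available at verification time. A toy Python sketch of that shape follows; the renal dosing threshold below is invented for illustration and is not the institution's policy:

        # Toy adherence check in the spirit of the dashboard: flag verified
        # orders whose dose conflicts with a renal-function rule.
        def adherent(drug, dose_mg, crcl_ml_min):
            if drug == "enoxaparin":
                # Invented rule: reduced dose expected when CrCl < 30 mL/min.
                return dose_mg <= 40 if crcl_ml_min < 30 else True
            return True  # other drugs: no rule modeled in this sketch

        orders = [("enoxaparin", 80, 25), ("enoxaparin", 30, 25), ("enoxaparin", 80, 90)]
        flags = [adherent(*o) for o in orders]
        print("adherence rate:", sum(flags) / len(flags))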

  18. [Determinants of task preferences when performance is indicative of individual characteristics: self-assessment motivation and self-verification motivation].

    PubMed

    Numazaki, M; Kudo, E

    1995-04-01

    The present study was conducted to examine determinants of information-gathering behavior with regard to one's own characteristics. Four tasks with different self-congruent and self-incongruent diagnosticity were presented to subjects. As self-assessment theory predicted, high-diagnosticity tasks were preferred to low ones. And as self-verification theory predicted, self-congruent diagnosticity had a stronger effect on task preference than self-incongruent diagnosticity. In addition, subjects who perceived the relevant characteristics as important were more inclined to choose self-assessment behavior than those who did not. Also, subjects who were certain of their self-concept were more inclined to choose self-verification behavior than those who were not. These results suggest that both self-assessment and self-verification motivations play important roles in information-gathering behavior regarding one's characteristics, and that the strength of these motivations is determined by the importance of the relevant characteristics or the certainty of the self-concept.

  19. Comprehending how visual context influences incremental sentence processing: insights from ERPs and picture-sentence verification

    PubMed Central

    Knoeferle, Pia; Urbach, Thomas P.; Kutas, Marta

    2010-01-01

    To re-establish picture-sentence verification, discredited possibly for its over-reliance on post-sentence response time (RT) measures, as a task for situated comprehension, we collected event-related brain potentials (ERPs) as participants read a subject-verb-object sentence, and RTs indicating whether or not the verb matched a previously depicted action. For mismatches (vs. matches), speeded RTs were longer, verb N400s over centro-parietal scalp larger, and ERPs to the object noun more negative. RTs (congruence effect) correlated inversely with the centro-parietal verb N400s, and positively with the object ERP congruence effects. Verb N400s, object ERPs, and verbal working memory scores predicted more variance in RT effects (50%) than N400s alone. Thus, (1) verification processing is not all post-sentence; (2) simple priming cannot account for these results; and (3) verification tasks can inform studies of situated comprehension. PMID:20701712

  20. Expert system verification and validation study. Delivery 3A and 3B: Trip summaries

    NASA Technical Reports Server (NTRS)

    French, Scott

    1991-01-01

    Key results are documented from attending the 4th workshop on verification, validation, and testing. The most interesting part of the workshop was when representatives from the U.S., Japan, and Europe presented surveys of VV&T within their respective regions. Another interesting part focused on current efforts to define industry standards for artificial intelligence and how that might affect approaches to VV&T of expert systems. The next part of the workshop focused on VV&T methods of applying mathematical techniques to verification of rule bases and techniques for capturing information relating to the process of developing software. The final part focused on software tools. A summary is also presented of the EPRI conference on 'Methodologies, Tools, and Standards for Cost Effective Reliable Software Verification and Validation.' The conference was divided into discussion sessions on the following issues: development process, automated tools, software reliability, methods, standards, and cost/benefit considerations.

  1. PERFORMANCE VERIFICATION OF ANIMAL WATER TREATMENT TECHNOLOGIES THROUGH EPA'S ENVIRONMENTAL TECHNOLOGY VERIFICATION PROGRAM

    EPA Science Inventory

    The U.S. Environmental Protection Agency created the Environmental Technology Verification Program (ETV) to further environmental protection by accelerating the commercialization of new and innovative technology through independent performance verification and dissemination of in...

  2. National Centers for Environmental Prediction

    Science.gov Websites

    Index of operational and experimental model forecast graphics and verification/diagnostics pages, including developmental air quality forecasts and verification.

  3. National Centers for Environmental Prediction

    Science.gov Websites

    Index of operational and experimental model forecast graphics, model configuration, and verification/diagnostics pages, including developmental air quality forecasts and verification.

  4. HDL to verification logic translator

    NASA Technical Reports Server (NTRS)

    Gambles, J. W.; Windley, P. J.

    1992-01-01

    The increasingly higher number of transistors possible in VLSI circuits compounds the difficulty of ensuring correct designs. As the number of test cases required to exhaustively simulate a circuit design explodes, a better method is required to confirm the absence of design faults. Formal verification methods provide a way to prove, using logic, that a circuit structure correctly implements its specification. Before verification is accepted by VLSI design engineers, the stand-alone verification tools in use in the research community must be integrated with the CAD tools used by the designers. One problem facing the acceptance of formal verification into circuit design methodology is that the structural circuit descriptions used by the designers are not appropriate for verification work, and those required for verification lack some of the features needed for design. We offer a solution to this dilemma: an automatic translation from the designers' HDL models into definitions for the higher-order logic (HOL) verification system. The translated definitions become the low-level basis of circuit verification, which in turn increases the designer's confidence in the correctness of higher-level behavioral models.

  5. 42 CFR 457.380 - Eligibility verification.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 42 Public Health 4 2010-10-01 2010-10-01 false Eligibility verification. 457.380 Section 457.380... Requirements: Eligibility, Screening, Applications, and Enrollment § 457.380 Eligibility verification. (a) The... State may establish reasonable eligibility verification mechanisms to promote enrollment of eligible...

  6. PERFORMANCE VERIFICATION OF STORMWATER TREATMENT DEVICES UNDER EPA'S ENVIRONMENTAL TECHNOLOGY VERIFICATION PROGRAM

    EPA Science Inventory

    The Environmental Technology Verification (ETV) Program was created to facilitate the deployment of innovative or improved environmental technologies through performance verification and dissemination of information. The program's goal is to further environmental protection by a...

  7. Property Values Associated with the Failure of Individual Links in a System with Multiple Weak and Strong Links.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Helton, Jon C.; Brooks, Dusty Marie; Sallaberry, Cedric Jean-Marie.

    Representations are developed and illustrated for the distribution of link property values at the time of link failure in the presence of aleatory uncertainty in link properties. The following topics are considered: (i) defining properties for weak links and strong links, (ii) cumulative distribution functions (CDFs) for link failure time, (iii) integral-based derivation of CDFs for link property at time of link failure, (iv) sampling-based approximation of CDFs for link property at time of link failure, (v) verification of integral-based and sampling-based determinations of CDFs for link property at time of link failure, (vi) distributions of link properties conditional on time of link failure, and (vii) equivalence of two different integral-based derivations of CDFs for link property at time of link failure.
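
    Item (iv), the sampling-based approximation, is the easiest to illustrate: sample link properties from their aleatory distributions, map properties to failure times, condition on which link fails first, and accumulate the empirical CDF of the failing link's property. A Python sketch follows; the property distributions and the property-to-failure-time map are hypothetical stand-ins, not the report's models:

        import numpy as np

        # Sampling-based approximation of the CDF of a weak link's property
        # value at its failure time. Distributions and the property -> failure
        # time map are hypothetical.
        rng = np.random.default_rng(1)
        n = 100_000

        weak_prop = rng.normal(1.0, 0.1, n)       # weak-link property samples
        strong_prop = rng.normal(1.5, 0.1, n)     # strong-link property samples

        def failure_time(prop):
            return 10.0 * prop                    # hypothetical: higher property fails later

        t_weak, t_strong = failure_time(weak_prop), failure_time(strong_prop)
        weak_first = t_weak < t_strong            # desired outcome: weak link fails first

        # Empirical CDF of weak-link property, conditional on weak-first failure
        vals = np.sort(weak_prop[weak_first])
        ecdf_at = lambda x: np.searchsorted(vals, x, side="right") / vals.size
        print("P(weak fails first) =", weak_first.mean())
        print("CDF of weak-link property at 1.0, given weak-first:", ecdf_at(1.0))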

  8. Consistency, Verification, and Validation of Turbulence Models for Reynolds-Averaged Navier-Stokes Applications

    NASA Technical Reports Server (NTRS)

    Rumsey, Christopher L.

    2009-01-01

    In current practice, it is often difficult to draw firm conclusions about turbulence model accuracy when performing multi-code CFD studies ostensibly using the same model because of inconsistencies in model formulation or implementation in different codes. This paper describes an effort to improve the consistency, verification, and validation of turbulence models within the aerospace community through a website database of verification and validation cases. Some of the variants of two widely-used turbulence models are described, and two independent computer codes (one structured and one unstructured) are used in conjunction with two specific versions of these models to demonstrate consistency with grid refinement for several representative problems. Naming conventions, implementation consistency, and thorough grid resolution studies are key factors necessary for success.
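
    Grid-refinement studies of the kind emphasized here are commonly summarized by an observed order of accuracy: with solutions f1 (fine), f2 (medium), and f3 (coarse) on grids refined by a constant ratio r, the observed order is p = ln((f3 - f2)/(f2 - f1))/ln(r). A minimal Python sketch with synthetic values (standard code-verification practice, not data from this paper):

        import math

        # Observed order of accuracy from three systematically refined grids.
        def observed_order(f_coarse, f_medium, f_fine, r):
            return math.log((f_coarse - f_medium) / (f_medium - f_fine)) / math.log(r)

        # Synthetic second-order behavior: f(h) = f_exact + C*h^2, r = 2
        f_exact, C = 1.0, 0.3
        f3, f2, f1 = (f_exact + C * h ** 2 for h in (0.4, 0.2, 0.1))
        print(observed_order(f3, f2, f1, r=2.0))   # -> 2.0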

  9. A preliminary experimental examination of worldview verification, perceived racism, and stress reactivity in African Americans.

    PubMed

    Lucas, Todd; Lumley, Mark A; Flack, John M; Wegner, Rhiana; Pierce, Jennifer; Goetz, Stefan

    2016-04-01

    According to worldview verification theory, inconsistencies between lived experiences and worldviews are psychologically threatening. These inconsistencies may be key determinants of stress processes that influence cardiovascular health disparities. This preliminary examination considers how experiencing injustice can affect perceived racism and biological stress reactivity among African Americans. Guided by worldview verification theory, it was hypothesized that responses to receiving an unfair outcome would be moderated by fairness of the accompanying decision process, and that this effect would further depend on the consistency of the decision process with preexisting justice beliefs. A sample of 118 healthy African American adults completed baseline measures of justice beliefs, followed by a laboratory-based social-evaluative stressor task. Two randomized fairness manipulations were implemented during the task: participants were given either high or low levels of distributive (outcome) and procedural (decision process) justice. Glucocorticoid (cortisol) and inflammatory (C-reactive protein) biological responses were measured in oral fluids, and attributions of racism were also measured. The hypothesized 3-way interaction was generally obtained. Among African Americans with a strong belief in justice, perceived racism, cortisol, and C-reactive protein responses to low distributive justice were higher when procedural justice was low. Among African Americans with a weak belief in justice however, these responses were higher when a low level of distributive justice was coupled with high procedural justice. Biological and psychological processes that contribute to cardiovascular health disparities are affected by consistency between individual-level and contextual justice factors. (PsycINFO Database Record (c) 2016 APA, all rights reserved).

  10. Environmental Technology Verification Report -- Baghouse filtration products, GE Energy QG061 filtration media ( tested May 2007)

    EPA Science Inventory

    EPA has created the Environmental Technology Verification Program to facilitate the deployment of innovative or improved environmental technologies through performance verification and dissemination of information. The Air Pollution Control Technology Verification Center, a cente...

  11. 40 CFR 1066.240 - Torque transducer verification.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... POLLUTION CONTROLS VEHICLE-TESTING PROCEDURES Dynamometer Specifications § 1066.240 Torque transducer verification. Verify torque-measurement systems by performing the verifications described in §§ 1066.270 and... 40 Protection of Environment 33 2014-07-01 2014-07-01 false Torque transducer verification. 1066...

  12. Study of Measurement Strategies of Geometric Deviation of the Position of the Threaded Holes

    NASA Astrophysics Data System (ADS)

    Drbul, Mário; Martikan, Pavol; Sajgalik, Michal; Czan, Andrej; Broncek, Jozef; Babik, Ondrej

    2017-12-01

    Verification of products and quality control are an integral part of the current production process. In terms of functional requirements and product interoperability, it is necessary to analyze their dimensional and also geometric specifications. Threaded holes are verified elements too, which are a substantial part of detachable screw connections and have a broad presence in engineering products. This paper deals with the analysis of measurement strategies for verifying the geometric deviation of the position of threaded holes by the indirect method of measuring threaded pins; applying different measurement strategies can affect the result of the product verification.
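
    With the indirect method, a threaded pin is inserted and the pin's axis location is measured; the position deviation then follows the usual true-position evaluation, in which the diametral deviation is twice the radial distance from nominal. A minimal Python sketch with made-up coordinates and tolerance:

        import math

        # True-position evaluation for a threaded hole measured indirectly
        # via a threaded pin. Coordinates and tolerance are made up.
        def position_deviation(nominal_xy, measured_xy):
            dx = measured_xy[0] - nominal_xy[0]
            dy = measured_xy[1] - nominal_xy[1]
            return 2.0 * math.hypot(dx, dy)   # diametral true-position deviation

        nominal = (25.000, 40.000)            # mm
        measured = (25.018, 39.988)           # pin-axis location from probing
        dev = position_deviation(nominal, measured)
        print(f"position deviation = {dev:.3f} mm, "
              f"tolerance 0.1 mm: {'OK' if dev <= 0.1 else 'NOK'}")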

  13. Statistical Design for Biospecimen Cohort Size in Proteomics-based Biomarker Discovery and Verification Studies

    PubMed Central

    Skates, Steven J.; Gillette, Michael A.; LaBaer, Joshua; Carr, Steven A.; Anderson, N. Leigh; Liebler, Daniel C.; Ransohoff, David; Rifai, Nader; Kondratovich, Marina; Težak, Živana; Mansfield, Elizabeth; Oberg, Ann L.; Wright, Ian; Barnes, Grady; Gail, Mitchell; Mesri, Mehdi; Kinsinger, Christopher R.; Rodriguez, Henry; Boja, Emily S.

    2014-01-01

    Protein biomarkers are needed to deepen our understanding of cancer biology and to improve our ability to diagnose, monitor and treat cancers. Important analytical and clinical hurdles must be overcome to allow the most promising protein biomarker candidates to advance into clinical validation studies. Although contemporary proteomics technologies support the measurement of large numbers of proteins in individual clinical specimens, sample throughput remains comparatively low. This problem is amplified in typical clinical proteomics research studies, which routinely suffer from a lack of proper experimental design, resulting in analysis of too few biospecimens to achieve adequate statistical power at each stage of a biomarker pipeline. To address this critical shortcoming, a joint workshop was held by the National Cancer Institute (NCI), National Heart, Lung and Blood Institute (NHLBI), and American Association for Clinical Chemistry (AACC), with participation from the U.S. Food and Drug Administration (FDA). An important output from the workshop was a statistical framework for the design of biomarker discovery and verification studies. Herein, we describe the use of quantitative clinical judgments to set statistical criteria for clinical relevance, and the development of an approach to calculate biospecimen sample size for proteomic studies in discovery and verification stages prior to clinical validation stage. This represents a first step towards building a consensus on quantitative criteria for statistical design of proteomics biomarker discovery and verification research. PMID:24063748

  14. Statistical design for biospecimen cohort size in proteomics-based biomarker discovery and verification studies.

    PubMed

    Skates, Steven J; Gillette, Michael A; LaBaer, Joshua; Carr, Steven A; Anderson, Leigh; Liebler, Daniel C; Ransohoff, David; Rifai, Nader; Kondratovich, Marina; Težak, Živana; Mansfield, Elizabeth; Oberg, Ann L; Wright, Ian; Barnes, Grady; Gail, Mitchell; Mesri, Mehdi; Kinsinger, Christopher R; Rodriguez, Henry; Boja, Emily S

    2013-12-06

    Protein biomarkers are needed to deepen our understanding of cancer biology and to improve our ability to diagnose, monitor, and treat cancers. Important analytical and clinical hurdles must be overcome to allow the most promising protein biomarker candidates to advance into clinical validation studies. Although contemporary proteomics technologies support the measurement of large numbers of proteins in individual clinical specimens, sample throughput remains comparatively low. This problem is amplified in typical clinical proteomics research studies, which routinely suffer from a lack of proper experimental design, resulting in analysis of too few biospecimens to achieve adequate statistical power at each stage of a biomarker pipeline. To address this critical shortcoming, a joint workshop was held by the National Cancer Institute (NCI), National Heart, Lung, and Blood Institute (NHLBI), and American Association for Clinical Chemistry (AACC) with participation from the U.S. Food and Drug Administration (FDA). An important output from the workshop was a statistical framework for the design of biomarker discovery and verification studies. Herein, we describe the use of quantitative clinical judgments to set statistical criteria for clinical relevance and the development of an approach to calculate biospecimen sample size for proteomic studies in discovery and verification stages prior to clinical validation stage. This represents a first step toward building a consensus on quantitative criteria for statistical design of proteomics biomarker discovery and verification research.
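
    A common starting point for a verification-stage cohort calculation is the two-sample normal approximation: detecting a mean shift of delta standard deviations with significance alpha and power 1 - beta needs roughly n = 2(z_{1-alpha/2} + z_{1-beta})^2 / delta^2 biospecimens per group. A Python sketch of this textbook formula follows; the workshop's framework itself is more elaborate than this:

        import math
        from statistics import NormalDist

        # Cases per group via the two-sample normal approximation. This is a
        # textbook starting point, not the full workshop framework.
        def n_per_group(effect_sd, alpha=0.05, power=0.90):
            """Biospecimens per group to detect a mean shift of effect_sd SDs."""
            z = NormalDist().inv_cdf
            return math.ceil(2 * (z(1 - alpha / 2) + z(power)) ** 2 / effect_sd ** 2)

        print(n_per_group(0.5))   # ~0.5 SD shift -> about 85 biospecimens per group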

  15. High-resolution face verification using pore-scale facial features.

    PubMed

    Li, Dong; Zhou, Huiling; Lam, Kin-Man

    2015-08-01

    Face recognition methods, which usually represent face images using holistic or local facial features, rely heavily on alignment. Their performances also suffer a severe degradation under variations in expressions or poses, especially when there is only one gallery image per subject. With the easy access to high-resolution (HR) face images nowadays, some HR face databases have recently been developed. However, few studies have tackled the use of HR information for face recognition or verification. In this paper, we propose a pose-invariant face-verification method, which is robust to alignment errors, using HR information based on pore-scale facial features. A new keypoint descriptor, namely pore-Principal Component Analysis (PCA)-Scale Invariant Feature Transform (PPCASIFT), adapted from PCA-SIFT, is devised for the extraction of a compact set of distinctive pore-scale facial features. Having matched the pore-scale features of two face regions, an effective robust-fitting scheme is proposed for the face-verification task. Experiments show that, with only one frontal-view gallery image per subject, our proposed method outperforms a number of standard verification methods, and can achieve excellent accuracy even when the faces are under large variations in expression and pose.
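
    Keypoint-descriptor matching of the kind that precedes robust fitting typically uses a nearest-neighbor ratio test to keep only unambiguous correspondences. A generic numpy sketch of a Lowe-style ratio test follows, as a stand-in for, not a reproduction of, the paper's PPCASIFT matching stage; the descriptor data are random placeholders:

        import numpy as np

        # Generic nearest-neighbor ratio test for keypoint descriptor matching.
        rng = np.random.default_rng(2)
        desc_a = rng.normal(size=(200, 36))   # descriptors from face region A (placeholder)
        desc_b = rng.normal(size=(220, 36))   # descriptors from face region B (placeholder)

        def ratio_test_matches(da, db, ratio=0.8):
            matches = []
            for i, d in enumerate(da):
                dists = np.linalg.norm(db - d, axis=1)
                j1, j2 = np.argsort(dists)[:2]
                if dists[j1] < ratio * dists[j2]:   # accept only unambiguous matches
                    matches.append((i, j1))
            return matches

        print(len(ratio_test_matches(desc_a, desc_b)), "candidate matches")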

  16. Geometrical verification system using Adobe Photoshop in radiotherapy.

    PubMed

    Ishiyama, Hiromichi; Suzuki, Koji; Niino, Keiji; Hosoya, Takaaki; Hayakawa, Kazushige

    2005-02-01

    Adobe Photoshop is used worldwide and is useful for comparing portal films with simulation films: images can be scanned and then viewed simultaneously with this software. The purpose of this study was to assess the accuracy of a geometrical verification system using Adobe Photoshop. We prepared two conditions for verification. Under one condition, films were hung on light boxes, and examiners measured the distances between the isocenter on simulation films and that on portal films by aligning the bony structures. Under the other condition, films were scanned into a computer and displayed using Adobe Photoshop, and examiners measured the same distances in the same way. To obtain control data, lead balls were used as fiducial points for matching the films accurately. The errors, defined as the differences between the control data and the measurement data, were assessed. Errors of the data obtained using Adobe Photoshop were significantly smaller than those of the data obtained from films on light boxes (p < 0.007). The geometrical verification system using Adobe Photoshop is available on any PC with this software and is useful for improving the accuracy of verification.

  17. DFM flow by using combination between design based metrology system and model based verification at sub-50nm memory device

    NASA Astrophysics Data System (ADS)

    Kim, Cheol-kyun; Kim, Jungchan; Choi, Jaeseung; Yang, Hyunjo; Yim, Donggyu; Kim, Jinwoong

    2007-03-01

    As the minimum transistor length shrinks, the variation and uniformity of transistor length seriously affect device performance, so the importance of optical proximity correction (OPC) and resolution enhancement technology (RET) cannot be overemphasized. However, the OPC process is regarded by some as a necessary evil for device performance. In fact, every group, including process and design, is interested in the whole-chip CD variation trend and CD uniformity, which represent the real wafer. Recently, design based metrology systems have become capable of detecting differences between the design database and wafer SEM images, and of extracting information on whole-chip CD variation. Based on these results, OPC abnormalities were identified and design feedback items were disclosed. Complementary approaches have been developed by EDA companies, such as model based OPC verification, in which a well-calibrated model is applied over the full chip area. The object of model based verification is to predict potential weak points on the wafer and to feed the results back to OPC and design quickly, before reticle fabrication. In order to achieve a robust design and sufficient device margin, an appropriate combination of a design based metrology system and model based verification tools is very important. Therefore, we evaluated a design based metrology system and matched it with a model based verification system to find the optimum combination of the two. In our study, a large amount of wafer data was classified and analyzed by statistical methods and sorted into OPC feedback and design feedback items. Additionally, a novel DFM flow is proposed that combines design based metrology and model based verification tools.
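
    As a toy illustration of the statistical classification step described above, the sketch below flags measurement sites whose wafer CD deviates from the database target by more than three standard deviations. The column names and the 3-sigma rule are assumptions for the example, not the authors' actual criteria.

```python
# Illustrative post-processing of design-based-metrology CD measurements:
# flag sites whose wafer CD deviates from the target by more than 3 sigma.
# Column names and the 3-sigma rule are assumptions for this sketch.
import pandas as pd

def flag_hotspots(df, nsigma=3.0):
    """df columns: site, target_cd_nm, wafer_cd_nm."""
    dev = df["wafer_cd_nm"] - df["target_cd_nm"]
    sigma = dev.std()
    out = df.assign(deviation_nm=dev, hotspot=dev.abs() > nsigma * sigma)
    # worst deviations first, for OPC/design feedback triage
    return out.sort_values("deviation_nm", key=abs, ascending=False)
```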

  18. Commissioning and quality assurance of an integrated system for patient positioning and setup verification in particle therapy.

    PubMed

    Pella, A; Riboldi, M; Tagaste, B; Bianculli, D; Desplanques, M; Fontana, G; Cerveri, P; Seregni, M; Fattori, G; Orecchia, R; Baroni, G

    2014-08-01

    In an increasing number of clinical indications, radiotherapy with accelerated particles shows relevant advantages when compared with high energy X-ray irradiation. However, due to the finite range of ions, particle therapy can be severely compromised by setup errors and geometric uncertainties. The purpose of this work is to describe the commissioning and the design of the quality assurance procedures for the patient positioning and setup verification systems at the Italian National Center for Oncological Hadrontherapy (CNAO). The accuracy of the systems installed at CNAO for patient positioning and setup verification was assessed using a laser tracking device. The accuracy of calibration and of image based setup verification relying on the in-room X-ray imaging system was also quantified. Quality assurance tests to check the integration among all patient setup systems were designed, and records of daily QA tests since the start of clinical operation (2011) are presented. The overall motion accuracy of the patient positioning system and the patient verification system proved to be below 0.5 mm under all the examined conditions, with median values below the 0.3 mm threshold. Image based registration in phantom studies exhibited sub-millimetric accuracy in setup verification at both cranial and extra-cranial sites. The calibration residuals of the optical tracking system (OTS) were found to be consistent with expectations, with peak values below 0.3 mm. Quality assurance tests, performed daily before clinical operation, confirm adequate integration and sub-millimetric setup accuracy. Robotic patient positioning was successfully integrated with optical tracking and stereoscopic X-ray verification for patient setup in particle therapy. Sub-millimetric setup accuracy was achieved and consistently verified in daily clinical operation.
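
    A common way to quantify this kind of sub-millimetric setup accuracy is to fit the rigid transform that best maps measured marker positions onto the planned ones and report the per-marker residuals. The sketch below does this with the Kabsch algorithm; it is a generic illustration, not the CNAO analysis pipeline.

```python
# Sketch of a rigid-registration residual check of the kind used to
# quantify sub-millimetre setup accuracy: fit the rotation/translation that
# best maps measured marker positions onto their planned positions (Kabsch
# algorithm) and report the per-marker residuals in mm.
import numpy as np

def setup_residuals_mm(planned, measured):
    """planned, measured: (N, 3) arrays of marker coordinates in mm."""
    p0 = planned - planned.mean(axis=0)
    m0 = measured - measured.mean(axis=0)
    u, _, vt = np.linalg.svd(m0.T @ p0)
    d = np.sign(np.linalg.det(u @ vt))          # guard against reflections
    rot = u @ np.diag([1.0, 1.0, d]) @ vt
    fitted = m0 @ rot + planned.mean(axis=0)    # best rigid fit of 'measured'
    return np.linalg.norm(fitted - planned, axis=1)
```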

  19. Cognitive Bias in the Verification and Validation of Space Flight Systems

    NASA Technical Reports Server (NTRS)

    Larson, Steve

    2012-01-01

    Cognitive bias is generally recognized as playing a significant role in virtually all domains of human decision making. Insight into this role is informally built into many of the system engineering practices employed in the aerospace industry. The review process, for example, typically has features that help to counteract the effect of bias. This paper presents a discussion of how commonly recognized biases may affect the verification and validation process. Verifying and validating a system is arguably more challenging than development, both technically and cognitively. Whereas there may be a relatively limited number of options available for the design of a particular aspect of a system, there is a virtually unlimited number of potential verification scenarios that may be explored. The probability of any particular scenario occurring in operations is typically very difficult to estimate, which increases reliance on judgment that may be affected by bias. Implementing a verification activity often presents technical challenges that, if they can be overcome at all, often result in a departure from actual flight conditions (e.g., 1-g testing, simulation, time compression, artificial fault injection) that may raise additional questions about the meaningfulness of the results, and create opportunities for the introduction of additional biases. In addition to mitigating the biases it can introduce directly, the verification and validation process must also overcome the cumulative effect of biases introduced during all previous stages of development. A variety of cognitive biases will be described, with research results for illustration. A handful of case studies will be presented that show how cognitive bias may have affected the verification and validation process on recent JPL flight projects, identify areas of strength and weakness, and identify potential changes or additions to commonly used techniques that could provide a more robust verification and validation of future systems.

  20. Mechanism of alkoxy groups substitution by Grignard reagents on aromatic rings and experimental verification of theoretical predictions of anomalous reactions.

    PubMed

    Jiménez-Osés, Gonzalo; Brockway, Anthony J; Shaw, Jared T; Houk, K N

    2013-05-01

    The mechanism of the direct displacement of alkoxy groups in vinylogous and aromatic esters by Grignard reagents, a reaction that is not observed with expectedly better tosyloxy leaving groups, is elucidated computationally. The reaction proceeds through the inner-sphere attack of nucleophilic alkyl groups from magnesium on the reacting carbons via a metallaoxetane transition state. The formation of a strong magnesium chelate with the reacting alkoxy and carbonyl groups dictates the observed reactivity and selectivity. The influence of ester, ketone, and aldehyde substituents was investigated. In some cases, the calculations predicted the formation of products different from those previously reported; these predictions were then verified experimentally. The importance of computationally studying the actual system, rather than simplified model systems, is demonstrated.

  1. Mechanism of Alkoxy Groups Substitution by Grignard Reagents on Aromatic Rings and Experimental Verification of Theoretical Predictions of Anomalous Reactions

    PubMed Central

    Jiménez-Osés, Gonzalo; Brockway, Anthony J.; Shaw, Jared T.; Houk, K. N.

    2013-01-01

    The mechanism of the direct displacement of alkoxy groups in vinylogous and aromatic esters by Grignard reagents, a reaction that is not observed with expectedly better tosyloxy leaving groups, is elucidated computationally. The reaction proceeds through the inner-sphere attack of nucleophilic alkyl groups from magnesium on the reacting carbons via a metallaoxetane transition state. The formation of a strong magnesium chelate with the reacting alkoxy and carbonyl groups dictates the observed reactivity and selectivity. The influence of ester, ketone, and aldehyde substituents was investigated. In some cases, the calculations predicted the formation of products different from those previously reported; these predictions were then verified experimentally. The importance of computationally studying the actual system, rather than simplified model systems, is demonstrated. PMID:23601086

  2. A Reliability Study for Robust Planar GaAs Varactors: Design and Operational Considerations

    NASA Technical Reports Server (NTRS)

    Maiwald, Frank; Schlecht, Erich; Ward, John; Lin, Robert; Leon, Rosa; Pearson, John; Mehdi, Imran

    2003-01-01

    Preliminary conclusions include: limits for reverse currents cannot be set; based on current data, any reverse bias current should be avoided, and 1 micro-A is known to be too high. Leakage current is suppressed when the device is operated at 120 K. Migration and verification: (a) the reverse bias voltage will be limited; (b) health check with the I/V curve: (1) the minimal reverse voltage shall be 0.75× the calculated breakdown voltage Vbr; (2) degradation of the reverse bias voltage at a given current will be used as an indication of ESD incidents or other damage (high RF power, heat); (3) diode parameters are calculated to verify the initial health check results in the forward direction. RF output power starts to degrade only when the diode I/V curve is very strongly degraded. Experience is based on 400 GHz and 200 GHz doublers.
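
    The health-check rules sketched in these notes can be encoded directly; the snippet below is a hedged reading of them (any reverse leakage fails the check, and the reverse voltage sustained at the check current must remain at or above 0.75× the calculated breakdown voltage). Thresholds and argument names come from the notes, not from a flight specification.

```python
# Hedged encoding of the health-check rules from the notes above: reject a
# diode if any reverse leakage is observed, or if the reverse voltage held
# at the check current has degraded below 0.75 x the calculated breakdown
# voltage Vbr. Illustrative only, not a flight acceptance criterion.
def diode_healthy(reverse_leakage_a, v_reverse_at_check, v_breakdown_calc):
    if reverse_leakage_a > 0.0:          # "avoid any reverse bias current"
        return False
    return abs(v_reverse_at_check) >= 0.75 * abs(v_breakdown_calc)
```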

  3. 48 CFR 4.1302 - Acquisition of approved products and services for personal identity verification.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... products and services for personal identity verification. 4.1302 Section 4.1302 Federal Acquisition Regulations System FEDERAL ACQUISITION REGULATION GENERAL ADMINISTRATIVE MATTERS Personal Identity Verification 4.1302 Acquisition of approved products and services for personal identity verification. (a) In...

  4. 48 CFR 4.1302 - Acquisition of approved products and services for personal identity verification.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... products and services for personal identity verification. 4.1302 Section 4.1302 Federal Acquisition Regulations System FEDERAL ACQUISITION REGULATION GENERAL ADMINISTRATIVE MATTERS Personal Identity Verification 4.1302 Acquisition of approved products and services for personal identity verification. (a) In...

  5. 48 CFR 4.1302 - Acquisition of approved products and services for personal identity verification.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... products and services for personal identity verification. 4.1302 Section 4.1302 Federal Acquisition Regulations System FEDERAL ACQUISITION REGULATION GENERAL ADMINISTRATIVE MATTERS Personal Identity Verification 4.1302 Acquisition of approved products and services for personal identity verification. (a) In...

  6. 48 CFR 4.1302 - Acquisition of approved products and services for personal identity verification.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... products and services for personal identity verification. 4.1302 Section 4.1302 Federal Acquisition Regulations System FEDERAL ACQUISITION REGULATION GENERAL ADMINISTRATIVE MATTERS Personal Identity Verification 4.1302 Acquisition of approved products and services for personal identity verification. (a) In...

  7. 48 CFR 4.1302 - Acquisition of approved products and services for personal identity verification.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... products and services for personal identity verification. 4.1302 Section 4.1302 Federal Acquisition Regulations System FEDERAL ACQUISITION REGULATION GENERAL ADMINISTRATIVE MATTERS Personal Identity Verification 4.1302 Acquisition of approved products and services for personal identity verification. (a) In...

  8. Joint ETV/NOWATECH verification protocol for the Sorbisense GSW40 passive sampler

    EPA Science Inventory

    Environmental technology verification (ETV) is an independent (third party) assessment of the performance of a technology or a product for a specified application, under defined conditions and quality assurance. This verification is a joint verification with the US EPA ETV schem...

  9. ENVIRONMENTAL TECHNOLOGY VERIFICATION REPORT - PERFORMANCE VERIFICATION OF THE W.L. GORE & ASSOCIATES GORE-SORBER SCREENING SURVEY

    EPA Science Inventory

    The U.S. Environmental Protection Agency (EPA) has created the Environmental Technology Verification Program (ETV) to facilitate the deployment of innovative or improved environmental technologies through performance verification and dissemination of information. The goal of the...

  10. A Comparative Study of Two Azimuth Based Non Standard Location Methods

    DTIC Science & Technology

    2017-03-23

    Rongsong Jih, U.S. Department of State, Arms Control, Verification, and Compliance Bureau, 2201 C Street NW, Washington... A comparative study of two azimuth-based non-standard location methods. The so-called "Yin Zhong Xian" ("引中线" in Chinese) algorithm, hereafter the YZX method, is an Oriental version of an IPB-based procedure...

  11. A Methodology for Formal Hardware Verification, with Application to Microprocessors.

    DTIC Science & Technology

    1993-08-29

    concurrent programming languages. Proceedings of the NATO Advanced Study Institute on Logics and Models of Concurrent Systems (Colle-sur-Loup, France, 8-19...restricted class of formulas. Bose and Fisher [26] developed a symbolic model checker based on a Cosmos switch-level model. Their modeling approach...verification using SDVS: the method and a case study. 17th Annual Microprogramming Workshop (New Orleans, LA, 30 October-2 November 1984). Published as

  12. Multi-canister overpack project -- verification and validation, MCNP 4A

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Goldmann, L.H.

    This supporting document contains the software verification and validation (V and V) package used for Phase 2 design of the Spent Nuclear Fuel Multi-Canister Overpack. V and V packages for both ANSYS and MCNP are included. Description of verification run(s): This software must be compiled specifically for the machine on which it is to be used. Therefore, to facilitate the verification process, the software automatically runs 25 sample problems to ensure proper installation and compilation. Once the runs are completed, the software checks for verification by performing a file comparison between the new output file and the old output file. Any difference between the files will cause a verification error. Due to the manner in which the verification is completed, a verification error does not necessarily indicate a problem; it indicates that a closer look at the output files is needed to determine the cause of the error.
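
    The install-verification pattern described here (run the bundled sample problems, then diff the new outputs against reference outputs) is easy to automate. The sketch below assumes hypothetical file naming and an "mcnp" command-line invocation; both are illustrative, not the package's actual layout.

```python
# Sketch of the install-verification pattern described above: run bundled
# sample problems, then diff each new output file against its reference.
# Paths, the executable name "mcnp", and the file naming are assumptions.
import difflib
import subprocess
from pathlib import Path

def verify_install(sample_dir):
    flagged = []
    for inp in sorted(Path(sample_dir).glob("*.inp")):
        out_new = inp.with_suffix(".out.new")
        subprocess.run(["mcnp", f"inp={inp}", f"outp={out_new}"], check=True)
        ref = inp.with_suffix(".out.ref")
        diff = list(difflib.unified_diff(ref.read_text().splitlines(),
                                         out_new.read_text().splitlines()))
        if diff:
            # As the record notes, a difference is not necessarily a bug;
            # it flags the file pair for closer inspection.
            flagged.append(inp.name)
    return flagged
```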

  13. AN ENVIRONMENTAL TECHNOLOGY VERIFICATION (ETV) PERFORMANCE TESTING OF THE INDUSTRIAL TEST SYSTEM, INC. CYANIDE REAGENTSTRIP™ TEST KIT

    EPA Science Inventory

    Cyanide can be present in various forms in water. The cyanide test kit evaluated in this verification study (Industrial Test System, Inc. Cyanide Reagent Strip™ Test Kit) was designed to detect free cyanide in water. This is done by converting cyanide in water to cyanogen...

  14. Characterizing irrigation water requirements for rice production from the Arkansas Rice Research Verification Program

    USDA-ARS?s Scientific Manuscript database

    This study investigated rice irrigation water use in the University of Arkansas Rice Research Verification Program between the years of 2003 and 2011. Irrigation water use averaged 747 mm (29.4 inches) over the nine years. A significant 40% water savings was reported for rice grown under a zero gr...

  15. Verification of S&D Solutions for Network Communications and Devices

    NASA Astrophysics Data System (ADS)

    Rudolph, Carsten; Compagna, Luca; Carbone, Roberto; Muñoz, Antonio; Repp, Jürgen

    This chapter describes the tool-supported verification of S&D Solutions at the level of network communications and devices. First, the general goals and challenges of verification in the context of AmI systems are highlighted, and the role of verification and validation within the SERENITY processes is explained. Then, SERENITY extensions to the SH Verification tool are explained using small examples. Finally, the applicability of existing verification tools is discussed in the context of the AVISPA toolset. The two different tools show that, for the security analysis of networks and devices, relevant complementary approaches to S&D Patterns exist and can be used.

  16. Projected Impact of Compositional Verification on Current and Future Aviation Safety Risk

    NASA Technical Reports Server (NTRS)

    Reveley, Mary S.; Withrow, Colleen A.; Leone, Karen M.; Jones, Sharon M.

    2014-01-01

    The projected impact of compositional verification research conducted by the National Aeronautics and Space Administration System-Wide Safety and Assurance Technologies project on aviation safety risk was assessed. Software and compositional verification are described. Traditional verification techniques have two major problems: testing at the prototype stage, where error discovery can be quite costly, and the inability to test for all potential interactions, which leaves some errors undetected until the system is used by the end user. Increasingly complex and nondeterministic aviation systems are becoming too large for these tools to check and verify. Compositional verification is a "divide and conquer" solution for addressing increasingly large and complex systems. A review of compositional verification research being conducted by academia, industry, and Government agencies is provided. Forty-four aviation safety risks in the Biennial NextGen Safety Issues Survey were identified that could be impacted by compositional verification and grouped into five categories: automation design; system complexity; software, flight control, or equipment failure or malfunction; new technology or operations; and verification and validation. One capability, one research action, five operational improvements, and 13 enablers within the Federal Aviation Administration Joint Planning and Development Office Integrated Work Plan that could be addressed by compositional verification were identified.

  17. Binding of two-electron metastable states in semiconductor quantum dots under a magnetic field

    NASA Astrophysics Data System (ADS)

    Garagiola, Mariano; Pont, Federico M.; Osenda, Omar

    2018-04-01

    Applying a sufficiently strong magnetic field results in the binding of few-electron resonant states. The mechanism was proposed many years ago, but its verification under laboratory conditions is far more recent. In this work we study the binding of two-electron resonant states. The electrons are confined in a cylindrical quantum dot embedded in a semiconductor wire. The geometry considered is similar to that used in actual experimental setups. The low-energy two-electron spectrum is calculated numerically from an effective-mass approximation Hamiltonian modelling the system. Methods for binding-threshold calculations in systems with one and two electrons are thoroughly studied; in particular, we use quantum information quantities to assess when the strong lateral confinement approximation can be used to obtain reliable low-energy spectra. For simplicity, only cases without bound states in the absence of an external field are considered. Under these conditions, the binding threshold for the one-electron case is given by the lowest Landau energy level. Moreover, the energy of the one-electron bound resonance can be used to obtain the two-electron binding threshold. It is shown that for realistic values of the two-electron model parameters it is feasible to bind resonances with field strengths of a few tens of tesla.
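
    For reference, the one-electron binding threshold quoted above (the lowest Landau level) can be written out explicitly; the numerical evaluation below assumes a GaAs-like effective mass m* = 0.067 m_e and B = 30 T, which are illustrative values rather than parameters taken from the paper.

```latex
% Lowest Landau level as the one-electron binding threshold.
% m* = 0.067 m_e (GaAs-like) and B = 30 T are illustrative values.
E_{0} \;=\; \tfrac{1}{2}\,\hbar\omega_{c},
\qquad
\omega_{c} \;=\; \frac{eB}{m^{*}},
\qquad
\hbar\omega_{c} \;=\; \frac{\hbar e B}{0.067\, m_{e}}
\;\approx\; 52~\text{meV at } B = 30~\text{T}
\;\;\Longrightarrow\;\;
E_{0} \approx 26~\text{meV}.
```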

  18. 78 FR 27882 - VA Veteran-Owned Small Business (VOSB) Verification Guidelines

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-05-13

    ... Verification Self-Assessment Tool that walks the veteran through the regulation and how it applies to the...) Verification Guidelines AGENCY: Department of Veterans Affairs. ACTION: Advanced notice of proposed rulemaking... regulations governing the Department of Veterans Affairs (VA) Veteran-Owned Small Business (VOSB) Verification...

  19. 78 FR 32010 - Pipeline Safety: Public Workshop on Integrity Verification Process

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-05-28

    .... PHMSA-2013-0119] Pipeline Safety: Public Workshop on Integrity Verification Process AGENCY: Pipeline and... announcing a public workshop to be held on the concept of ``Integrity Verification Process.'' The Integrity Verification Process shares similar characteristics with fitness for service processes. At this workshop, the...

  20. Interim Letter Report - Verification Survey of Partial Grid E9, David Witherspoon, Inc. 1630 Site Knoxville, Tennessee

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    P.C. Weaver

    2008-06-12

    The objective was to conduct verification surveys of available grids at the DWI 1630 site in Knoxville, Tennessee. A representative of the Independent Environmental Assessment and Verification (IEAV) team from ORISE conducted a verification survey of a partial area within Grid E9.

  1. 40 CFR 1065.395 - Inertial PM balance verifications.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 40 Protection of Environment 32 2010-07-01 2010-07-01 false Inertial PM balance verifications... Inertial PM balance verifications. This section describes how to verify the performance of an inertial PM balance. (a) Independent verification. Have the balance manufacturer (or a representative approved by the...

  2. 40 CFR 1065.395 - Inertial PM balance verifications.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... 40 Protection of Environment 33 2011-07-01 2011-07-01 false Inertial PM balance verifications... Inertial PM balance verifications. This section describes how to verify the performance of an inertial PM balance. (a) Independent verification. Have the balance manufacturer (or a representative approved by the...

  3. 22 CFR 123.14 - Import certificate/delivery verification procedure.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... REGULATIONS LICENSES FOR THE EXPORT OF DEFENSE ARTICLES § 123.14 Import certificate/delivery verification procedure. (a) The Import Certificate/Delivery Verification Procedure is designed to assure that a commodity... 22 Foreign Relations 1 2010-04-01 2010-04-01 false Import certificate/delivery verification...

  4. 22 CFR 123.14 - Import certificate/delivery verification procedure.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... REGULATIONS LICENSES FOR THE EXPORT OF DEFENSE ARTICLES § 123.14 Import certificate/delivery verification procedure. (a) The Import Certificate/Delivery Verification Procedure is designed to assure that a commodity... 22 Foreign Relations 1 2011-04-01 2011-04-01 false Import certificate/delivery verification...

  5. 45 CFR 95.626 - Independent Verification and Validation.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... 45 Public Welfare 1 2013-10-01 2013-10-01 false Independent Verification and Validation. 95.626... (FFP) Specific Conditions for Ffp § 95.626 Independent Verification and Validation. (a) An assessment for independent verification and validation (IV&V) analysis of a State's system development effort may...

  6. 45 CFR 95.626 - Independent Verification and Validation.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... 45 Public Welfare 1 2014-10-01 2014-10-01 false Independent Verification and Validation. 95.626... (FFP) Specific Conditions for Ffp § 95.626 Independent Verification and Validation. (a) An assessment for independent verification and validation (IV&V) analysis of a State's system development effort may...

  7. 45 CFR 95.626 - Independent Verification and Validation.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... 45 Public Welfare 1 2011-10-01 2011-10-01 false Independent Verification and Validation. 95.626... (FFP) Specific Conditions for Ffp § 95.626 Independent Verification and Validation. (a) An assessment for independent verification and validation (IV&V) analysis of a State's system development effort may...

  8. 45 CFR 95.626 - Independent Verification and Validation.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... 45 Public Welfare 1 2012-10-01 2012-10-01 false Independent Verification and Validation. 95.626... (FFP) Specific Conditions for Ffp § 95.626 Independent Verification and Validation. (a) An assessment for independent verification and validation (IV&V) analysis of a State's system development effort may...

  9. 24 CFR 5.512 - Verification of eligible immigration status.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... immigration status. 5.512 Section 5.512 Housing and Urban Development Office of the Secretary, Department of... Noncitizens § 5.512 Verification of eligible immigration status. (a) General. Except as described in paragraph...) Primary verification—(1) Automated verification system. Primary verification of the immigration status of...

  10. Quantitative assessment of the physical potential of proton beam range verification with PET/CT.

    PubMed

    Knopf, A; Parodi, K; Paganetti, H; Cascio, E; Bonab, A; Bortfeld, T

    2008-08-07

    A recent clinical pilot study demonstrated the feasibility of offline PET/CT range verification for proton therapy treatments. In vivo PET measurements are challenged by blood perfusion, variations of tissue compositions, patient motion and image co-registration uncertainties. Besides these biological and treatment specific factors, the accuracy of the method is constrained by the underlying physical processes. This phantom study distinguishes physical factors from other factors, assessing the reproducibility, consistency and sensitivity of the PET/CT range verification method. A spread-out Bragg-peak (SOBP) proton field was delivered to a phantom consisting of poly-methyl methacrylate (PMMA), lung and bone equivalent material slabs. PET data were acquired in listmode at a commercial PET/CT scanner available within 10 min walking distance from the proton therapy unit. The measured PET activity distributions were compared to simulations of the PET signal based on Geant4 and FLUKA Monte Carlo (MC) codes. To test the reproducibility of the measured PET signal, data from two independent measurements at the same geometrical position in the phantom were compared. Furthermore, activation depth profiles within identical material arrangements but at different positions within the irradiation field were compared to test the consistency of the measured PET signal. Finally, activation depth profiles through air/lung, air/bone and lung/bone interfaces parallel as well as at 6 degrees to the beam direction were studied to investigate the sensitivity of the PET/CT range verification method. The reproducibility and the consistency of the measured PET signal were found to be of the same order of magnitude. They determine the physical accuracy of the PET measurement to be about 1 mm. However, range discrepancies up to 2.6 mm between two measurements and range variations up to 2.6 mm within one measurement were found at the beam edge and at the edge of the field of view (FOV) of the PET scanner. PET/CT range verification was found to be able to detect small range modifications in the presence of complex tissue inhomogeneities. This study indicates the physical potential of the PET/CT verification method to detect the full-range characteristic of the delivered dose in the patient.
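
    A range comparison of the kind reported here is often reduced to a single number by locating the distal falloff of each depth-activity profile. The sketch below takes the depth at which each profile drops to 50% of its maximum and differences the two; the 50% level is a conventional choice for illustration, not necessarily the exact metric used in the study.

```python
# Sketch of one common way to quantify a range shift between a measured and
# a simulated PET activity depth profile: locate the depth at which each
# distal falloff crosses 50% of its maximum and take the difference.
import numpy as np

def distal_50_depth(depth_mm, activity):
    a = activity / activity.max()
    distal = slice(int(np.argmax(a)), len(a))        # falling edge only
    # np.interp needs increasing x-values, so reverse the falling edge
    return float(np.interp(0.5, a[distal][::-1], depth_mm[distal][::-1]))

def range_shift_mm(depth_mm, measured, simulated):
    return distal_50_depth(depth_mm, measured) - distal_50_depth(depth_mm, simulated)
```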

  11. Quantitative assessment of the physical potential of proton beam range verification with PET/CT

    NASA Astrophysics Data System (ADS)

    Knopf, A.; Parodi, K.; Paganetti, H.; Cascio, E.; Bonab, A.; Bortfeld, T.

    2008-08-01

    A recent clinical pilot study demonstrated the feasibility of offline PET/CT range verification for proton therapy treatments. In vivo PET measurements are challenged by blood perfusion, variations of tissue compositions, patient motion and image co-registration uncertainties. Besides these biological and treatment specific factors, the accuracy of the method is constrained by the underlying physical processes. This phantom study distinguishes physical factors from other factors, assessing the reproducibility, consistency and sensitivity of the PET/CT range verification method. A spread-out Bragg-peak (SOBP) proton field was delivered to a phantom consisting of poly-methyl methacrylate (PMMA), lung and bone equivalent material slabs. PET data were acquired in listmode at a commercial PET/CT scanner available within 10 min walking distance from the proton therapy unit. The measured PET activity distributions were compared to simulations of the PET signal based on Geant4 and FLUKA Monte Carlo (MC) codes. To test the reproducibility of the measured PET signal, data from two independent measurements at the same geometrical position in the phantom were compared. Furthermore, activation depth profiles within identical material arrangements but at different positions within the irradiation field were compared to test the consistency of the measured PET signal. Finally, activation depth profiles through air/lung, air/bone and lung/bone interfaces parallel as well as at 6° to the beam direction were studied to investigate the sensitivity of the PET/CT range verification method. The reproducibility and the consistency of the measured PET signal were found to be of the same order of magnitude. They determine the physical accuracy of the PET measurement to be about 1 mm. However, range discrepancies up to 2.6 mm between two measurements and range variations up to 2.6 mm within one measurement were found at the beam edge and at the edge of the field of view (FOV) of the PET scanner. PET/CT range verification was found to be able to detect small range modifications in the presence of complex tissue inhomogeneities. This study indicates the physical potential of the PET/CT verification method to detect the full-range characteristic of the delivered dose in the patient.

  12. Earth Science Enterprise Scientific Data Purchase Project: Verification and Validation

    NASA Technical Reports Server (NTRS)

    Jenner, Jeff; Policelli, Fritz; Fletcher, Rosea; Holecamp, Kara; Owen, Carolyn; Nicholson, Lamar; Dartez, Deanna

    2000-01-01

    This paper presents viewgraphs on the Earth Science Enterprise Scientific Data Purchase Project's verification and validation process. The topics include: 1) What is Verification and Validation? 2) Why Verification and Validation? 3) Background; 4) ESE Data Purchase Validation Process; 5) Data Validation System and Ingest Queue; 6) Shipment Verification; 7) Tracking and Metrics; 8) Validation of Contract Specifications; 9) Earth Watch Data Validation; 10) Validation of Vertical Accuracy; and 11) Results of Vertical Accuracy Assessment.

  13. An Efficient Location Verification Scheme for Static Wireless Sensor Networks.

    PubMed

    Kim, In-Hwan; Kim, Bo-Sung; Song, JooSeok

    2017-01-24

    In wireless sensor networks (WSNs), the accuracy of location information is vital to support many interesting applications. Unfortunately, sensors have difficulty in estimating their location when malicious sensors attack the location estimation process. Even though secure localization schemes have been proposed to protect location estimation process from attacks, they are not enough to eliminate the wrong location estimations in some situations. The location verification can be the solution to the situations or be the second-line defense. The problem of most of the location verifications is the explicit involvement of many sensors in the verification process and requirements, such as special hardware, a dedicated verifier and the trusted third party, which causes more communication and computation overhead. In this paper, we propose an efficient location verification scheme for static WSN called mutually-shared region-based location verification (MSRLV), which reduces those overheads by utilizing the implicit involvement of sensors and eliminating several requirements. In order to achieve this, we use the mutually-shared region between location claimant and verifier for the location verification. The analysis shows that MSRLV reduces communication overhead by 77% and computation overhead by 92% on average, when compared with the other location verification schemes, in a single sensor verification. In addition, simulation results for the verification of the whole network show that MSRLV can detect the malicious sensors by over 90% when sensors in the network have five or more neighbors.
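
    The geometric core of a mutually-shared-region test can be illustrated in a few lines: the claimed position must lie inside both the verifier's and the claimant's communication disks, whose intersection is the mutually-shared region. The disk model and single shared radius below are simplifications of the actual MSRLV protocol.

```python
# Toy illustration of the geometric core of a mutually-shared-region test:
# the claimed position must lie inside both the verifier's and the claimant's
# communication disks (their intersection is the mutually-shared region).
# The disk model and single radius are simplifications of the MSRLV protocol.
import math

def in_shared_region(claim, verifier_pos, claimant_pos, radio_range):
    def dist(a, b):
        return math.hypot(a[0] - b[0], a[1] - b[1])
    return (dist(claim, verifier_pos) <= radio_range and
            dist(claim, claimant_pos) <= radio_range)

# e.g., a claim at (4, 4) with both nodes within a 10 m radio range:
print(in_shared_region((4, 4), (0, 0), (8, 0), 10.0))  # True
```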

  14. An Efficient Location Verification Scheme for Static Wireless Sensor Networks

    PubMed Central

    Kim, In-hwan; Kim, Bo-sung; Song, JooSeok

    2017-01-01

    In wireless sensor networks (WSNs), the accuracy of location information is vital to support many interesting applications. Unfortunately, sensors have difficulty in estimating their location when malicious sensors attack the location estimation process. Even though secure localization schemes have been proposed to protect location estimation process from attacks, they are not enough to eliminate the wrong location estimations in some situations. The location verification can be the solution to the situations or be the second-line defense. The problem of most of the location verifications is the explicit involvement of many sensors in the verification process and requirements, such as special hardware, a dedicated verifier and the trusted third party, which causes more communication and computation overhead. In this paper, we propose an efficient location verification scheme for static WSN called mutually-shared region-based location verification (MSRLV), which reduces those overheads by utilizing the implicit involvement of sensors and eliminating several requirements. In order to achieve this, we use the mutually-shared region between location claimant and verifier for the location verification. The analysis shows that MSRLV reduces communication overhead by 77% and computation overhead by 92% on average, when compared with the other location verification schemes, in a single sensor verification. In addition, simulation results for the verification of the whole network show that MSRLV can detect the malicious sensors by over 90% when sensors in the network have five or more neighbors. PMID:28125007

  15. Verification of Functional Fault Models and the Use of Resource Efficient Verification Tools

    NASA Technical Reports Server (NTRS)

    Bis, Rachael; Maul, William A.

    2015-01-01

    Functional fault models (FFMs) are a directed graph representation of the failure effect propagation paths within a system's physical architecture and are used to support development and real-time diagnostics of complex systems. Verification of these models is required to confirm that the FFMs are correctly built and accurately represent the underlying physical system. However, a manual, comprehensive verification process applied to the FFMs was found to be error prone due to the intensive and customized process necessary to verify each individual component model and to require a burdensome level of resources. To address this problem, automated verification tools have been developed and utilized to mitigate these key pitfalls. This paper discusses the verification of the FFMs and presents the tools that were developed to make the verification process more efficient and effective.
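
    One concrete example of such an automated FFM check is a reachability test: every failure-mode node in the directed graph should propagate to at least one monitored-effect node. The sketch below shows this with a plain breadth-first search; the graph encoding and node names are illustrative assumptions, not the NASA toolchain's format.

```python
# Minimal sketch of one automated FFM check: confirm by BFS that a
# failure-mode node can propagate to at least one monitored-effect node.
# The dict-of-lists graph encoding and node names are assumptions.
from collections import deque

def reaches_effect(graph, source, effects):
    """graph: dict mapping a node to its downstream nodes."""
    seen, queue = {source}, deque([source])
    while queue:
        node = queue.popleft()
        if node in effects:
            return True
        for nxt in graph.get(node, []):
            if nxt not in seen:
                seen.add(nxt)
                queue.append(nxt)
    return False

fmodel = {"valve_stuck": ["low_flow"], "low_flow": ["pressure_alarm"]}
assert reaches_effect(fmodel, "valve_stuck", {"pressure_alarm"})
```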

  16. Towards the formal verification of the requirements and design of a processor interface unit

    NASA Technical Reports Server (NTRS)

    Fura, David A.; Windley, Phillip J.; Cohen, Gerald C.

    1993-01-01

    The formal verification of the design and partial requirements for a Processor Interface Unit (PIU) using the Higher Order Logic (HOL) theorem-proving system is described. The processor interface unit is a single-chip subsystem within a fault-tolerant embedded system under development within the Boeing Defense and Space Group. It provides the opportunity to investigate the specification and verification of a real-world subsystem within a commercially developed fault-tolerant computer. An overview of the PIU verification effort is given. The actual HOL listings from the verification effort are documented in a companion NASA contractor report, 'Towards the Formal Verification of the Requirements and Design of a Processor Interface Unit - HOL Listings', which includes the general-purpose HOL theories and definitions that support the PIU verification as well as the tactics used in the proofs.

  17. Contribution of the infrasound technology to characterize large scale atmospheric disturbances and impact on infrasound monitoring

    NASA Astrophysics Data System (ADS)

    Blanc, Elisabeth; Le Pichon, Alexis; Ceranna, Lars; Pilger, Christoph; Charlton Perez, Andrew; Smets, Pieter

    2016-04-01

    The International Monitoring System (IMS) developed for the verification of the Comprehensive Nuclear-Test-Ban Treaty (CTBT) provides a unique global description of atmospheric disturbances generating infrasound, such as extreme events (e.g., meteors, volcanoes, earthquakes, and severe weather) or human activity (e.g., explosions and supersonic airplanes). The analysis of the detected signals, recorded at global scales and over nearly 15 years at some stations, demonstrates that large-scale atmospheric disturbances strongly affect infrasound propagation. Their time scales vary from several tens of minutes to hours and days. Their effects are on average well resolved by current model predictions; however, an accurate spatial and temporal description is lacking in both weather and climate models. This study reviews recent results using the infrasound technology to characterize these large-scale disturbances, including (i) wind fluctuations induced by gravity waves, which generate infrasound partial reflections and modifications of the infrasound waveguide; (ii) convection from thunderstorms and mountain waves, which generates gravity waves; (iii) stratospheric warming events, which yield wind inversions in the stratosphere; and (iv) planetary waves, which control the global atmospheric circulation. Improved knowledge of these disturbances and their assimilation in future models is an important objective of the ARISE (Atmospheric dynamics Research InfraStructure in Europe) project. This is essential in the context of the future verification of the CTBT, as enhanced atmospheric models are necessary to assess the IMS network performance at higher resolution, reduce source location errors, and improve characterization methods.

  18. 7 CFR 272.8 - State income and eligibility verification system.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 7 Agriculture 4 2010-01-01 2010-01-01 false State income and eligibility verification system. 272... PARTICIPATING STATE AGENCIES § 272.8 State income and eligibility verification system. (a) General. (1) State agencies may maintain and use an income and eligibility verification system (IEVS), as specified in this...

  19. 24 CFR 985.3 - Indicators, HUD verification methods and ratings.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... 24 Housing and Urban Development 4 2010-04-01 2010-04-01 false Indicators, HUD verification..., HUD verification methods and ratings. This section states the performance indicators that are used to assess PHA Section 8 management. HUD will use the verification method identified for each indicator in...

  20. 78 FR 56268 - Pipeline Safety: Public Workshop on Integrity Verification Process, Comment Extension

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-09-12

    .... PHMSA-2013-0119] Pipeline Safety: Public Workshop on Integrity Verification Process, Comment Extension... public workshop on ``Integrity Verification Process'' which took place on August 7, 2013. The notice also sought comments on the proposed ``Integrity Verification Process.'' In response to the comments received...

  1. 10 CFR 9.54 - Verification of identity of individuals making requests.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 10 Energy 1 2010-01-01 2010-01-01 false Verification of identity of individuals making requests. 9... About Them § 9.54 Verification of identity of individuals making requests. (a) Identification... respecting records about himself, except that no verification of identity shall be required if the records...

  2. 76 FR 50164 - Protocol Gas Verification Program and Minimum Competency Requirements for Air Emission Testing...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-08-12

    ...-AQ06 Protocol Gas Verification Program and Minimum Competency Requirements for Air Emission Testing... correct certain portions of the Protocol Gas Verification Program and Minimum Competency Requirements for... final rule that amends the Agency's Protocol Gas Verification Program (PGVP) and the minimum competency...

  3. 30 CFR 227.601 - What are a State's responsibilities if it performs automated verification?

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... performs automated verification? 227.601 Section 227.601 Mineral Resources MINERALS MANAGEMENT SERVICE... Perform Delegated Functions § 227.601 What are a State's responsibilities if it performs automated verification? To perform automated verification of production reports or royalty reports, you must: (a) Verify...

  4. 10 CFR 9.54 - Verification of identity of individuals making requests.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... 10 Energy 1 2013-01-01 2013-01-01 false Verification of identity of individuals making requests. 9... About Them § 9.54 Verification of identity of individuals making requests. (a) Identification... respecting records about himself, except that no verification of identity shall be required if the records...

  5. 10 CFR 9.54 - Verification of identity of individuals making requests.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... 10 Energy 1 2012-01-01 2012-01-01 false Verification of identity of individuals making requests. 9... About Them § 9.54 Verification of identity of individuals making requests. (a) Identification... respecting records about himself, except that no verification of identity shall be required if the records...

  6. 10 CFR 9.54 - Verification of identity of individuals making requests.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... 10 Energy 1 2014-01-01 2014-01-01 false Verification of identity of individuals making requests. 9... About Them § 9.54 Verification of identity of individuals making requests. (a) Identification... respecting records about himself, except that no verification of identity shall be required if the records...

  7. Verification test report on a solar heating and hot water system

    NASA Technical Reports Server (NTRS)

    1978-01-01

    Information is provided on the development, qualification, and acceptance verification of commercial solar heating and hot water systems and components. The verification covers the performance, the efficiencies, and the various methods used, such as similarity, analysis, inspection, and test, that are applicable to satisfying the verification requirements.

  8. 46 CFR 61.40-3 - Design verification testing.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... 46 Shipping 2 2011-10-01 2011-10-01 false Design verification testing. 61.40-3 Section 61.40-3... INSPECTIONS Design Verification and Periodic Testing of Vital System Automation § 61.40-3 Design verification testing. (a) Tests must verify that automated vital systems are designed, constructed, and operate in...

  9. Feasibility Study on Applying Radiophotoluminescent Glass Dosimeters for CyberKnife SRS Dose Verification

    PubMed Central

    Hsu, Shih-Ming; Hung, Chao-Hsiung; Liao, Yi-Jen; Fu, Hsiao-Mei; Tsai, Jo-Ting

    2017-01-01

    CyberKnife is one of multiple modalities for stereotactic radiosurgery (SRS). Due to the nature of CyberKnife and the characteristics of SRS, dose evaluation of the CyberKnife procedure is critical. A radiophotoluminescent glass dosimeter was used to verify the dose accuracy of the CyberKnife procedure and to validate a viable dose verification system for CyberKnife treatment. A radiophotoluminescent glass dosimeter, a thermoluminescent dosimeter, and Kodak EDR2 film were used to measure the lateral dose profile and percent depth dose of CyberKnife. A Monte Carlo simulation for dose verification was performed using BEAMnrc to verify the measured results. This study also used a radiophotoluminescent glass dosimeter coupled with an anthropomorphic phantom to evaluate the accuracy of the dose delivered by CyberKnife. Measurements from the radiophotoluminescent glass dosimeter were compared with the results of the thermoluminescent dosimeter and EDR2 film, and the differences found were less than 5%. The radiophotoluminescent glass dosimeter has several advantages for CyberKnife dose measurements, such as repeatability, stability, and small effective size. These advantages make radiophotoluminescent glass dosimeters a candidate dosimeter for the CyberKnife procedure. This study concludes that radiophotoluminescent glass dosimeters are a promising and reliable dosimeter for CyberKnife dose verification, with clinically acceptable accuracy within 5%. PMID:28046056

  10. Polymer gel dosimeters for pretreatment radiotherapy verification using the three-dimensional gamma evaluation and pass rate maps.

    PubMed

    Hsieh, Ling-Ling; Shieh, Jiunn-I; Wei, Li-Ju; Wang, Yi-Chun; Cheng, Kai-Yuan; Shih, Cheng-Ting

    2017-05-01

    Polymer gel dosimeters (PGDs) have been widely studied for use in the pretreatment verification of clinical radiation therapy. However, the readability of PGDs in three-dimensional (3D) dosimetry remains unclear. In this study, pretreatment verifications of clinical radiation therapy were performed using an N-isopropyl-acrylamide (NIPAM) PGD, and the results were used to evaluate the performance of the NIPAM PGD for 3D dose measurement. A gel phantom was used to measure the dose distribution of a clinical case of intensity-modulated radiation therapy. Magnetic resonance imaging scans were performed for dose readouts. The measured dose volumes were compared with the planned dose volume. The relative volume histograms showed that the relative volumes with a negative percent dose difference decreased as time elapsed, and the histograms revealed few changes after 24 h post-irradiation. For the 3%/3 mm and 2%/2 mm criteria, the pass rates of the 12- and 24-h dose volumes were higher than 95%. This study thus concludes that the pass rate map can be used to evaluate the dose-temporal readability of PGDs and that the NIPAM PGD can be used for clinical pretreatment verifications. Copyright © 2017 Associazione Italiana di Fisica Medica. Published by Elsevier Ltd. All rights reserved.
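
    The 3%/3 mm and 2%/2 mm figures above are gamma-index pass rates. As a minimal sketch of a global gamma evaluation, assuming identical reference and evaluated grids and using a brute-force search (production codes interpolate the evaluated dose and search locally), one can write:

```python
# Brute-force global gamma index (simplified, 3D) of the kind behind the
# 3%/3 mm and 2%/2 mm pass rates quoted above. No low-dose cutoff is
# applied, and the O(N^2) search is only practical for small grids.
import numpy as np

def gamma_pass_rate(ref, ev, spacing_mm, dd=0.03, dta_mm=3.0):
    idx = np.array(np.meshgrid(*[np.arange(n) for n in ref.shape],
                               indexing="ij")).reshape(3, -1).T
    pos = idx * spacing_mm                    # voxel positions in mm
    d_ref, d_ev = ref.ravel(), ev.ravel()
    dmax = d_ref.max()                        # global dose normalization
    passed = 0
    for p, d in zip(pos, d_ref):
        dist2 = ((pos - p) ** 2).sum(axis=1) / dta_mm ** 2
        dose2 = ((d_ev - d) / (dd * dmax)) ** 2
        if np.min(dist2 + dose2) <= 1.0:      # gamma <= 1 at this voxel
            passed += 1
    return passed / len(d_ref)
```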

  11. Verification of Advective Bar Elements Implemented in the Aria Thermal Response Code.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mills, Brantley

    2016-01-01

    A verification effort was undertaken to evaluate the implementation of the new advective bar capability in the Aria thermal response code. Several approaches to the verification process were taken: a mesh refinement study to demonstrate solution convergence in the fluid and the solid, visual examination of the mapping of the advective bar element nodes to the surrounding surfaces, and a comparison of solutions produced using the advective bars for simple geometries with solutions from commercial CFD software. The mesh refinement study has shown solution convergence for simple pipe flow in both temperature and velocity. Guidelines were provided to achieve appropriate meshes between the advective bar elements and the surrounding volume. Simulations of pipe flow using advective bar elements in Aria have been compared to simulations using the commercial CFD software ANSYS Fluent® and provided comparable solutions in temperature and velocity, supporting proper implementation of the new capability. Acknowledgements: A special thanks goes to Dean Dobranich for his guidance and expertise through all stages of this effort. His advice and feedback were instrumental to its completion. Thanks also goes to Sam Subia and Tolu Okusanya for helping to plan many of the verification activities performed in this document. Thank you to Sam, Justin Lamb, and Victor Brunini for their assistance in resolving issues encountered with running the advective bar element model. Finally, thanks goes to Dean, Sam, and Adam Hetzler for reviewing the document and providing very valuable comments.
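
    A mesh refinement study like the one described is usually summarized by the observed order of accuracy. The sketch below estimates it from results on three systematically refined meshes (Richardson-style); the numbers in the usage line are made up for illustration.

```python
# Standard observed-order-of-accuracy check for a mesh refinement study:
# given a scalar result from three meshes refined by a constant ratio r,
# estimate the convergence order p. The example values are made up.
import math

def observed_order(f_coarse, f_medium, f_fine, r=2.0):
    return math.log((f_coarse - f_medium) / (f_medium - f_fine)) / math.log(r)

# e.g., a pipe outlet temperature from three meshes refined by 2x:
print(observed_order(351.2, 350.3, 350.08))  # ~2 suggests 2nd-order convergence
```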

  12. An unattended verification station for UF6 cylinders: Field trial findings

    NASA Astrophysics Data System (ADS)

    Smith, L. E.; Miller, K. A.; McDonald, B. S.; Webster, J. B.; Zalavadia, M. A.; Garner, J. R.; Stewart, S. L.; Branney, S. J.; Todd, L. C.; Deshmukh, N. S.; Nordquist, H. A.; Kulisek, J. A.; Swinhoe, M. T.

    2017-12-01

    In recent years, the International Atomic Energy Agency (IAEA) has pursued innovative techniques and an integrated suite of safeguards measures to address the verification challenges posed by the front end of the nuclear fuel cycle. Among the unattended instruments currently being explored by the IAEA is an Unattended Cylinder Verification Station (UCVS), which could provide automated, independent verification of the declared relative enrichment, 235U mass, total uranium mass, and identification for all declared uranium hexafluoride cylinders in a facility (e.g., uranium enrichment plants and fuel fabrication plants). Under the auspices of the United States and European Commission Support Programs to the IAEA, a project was undertaken to assess the technical and practical viability of the UCVS concept. The first phase of the UCVS viability study was centered on a long-term field trial of a prototype UCVS system at a fuel fabrication facility. A key outcome of the study was a quantitative performance evaluation of two nondestructive assay (NDA) methods being considered for inclusion in a UCVS: the Hybrid Enrichment Verification Array (HEVA) and the Passive Neutron Enrichment Meter (PNEM). This paper provides a description of the UCVS prototype design and an overview of the long-term field trial. Analysis results and interpretation are presented with a focus on the performance of PNEM and HEVA for the assay of over 200 "typical" Type 30B cylinders, and the viability of an "NDA Fingerprint" concept as a high-fidelity means to periodically verify that material diversion has not occurred.

  13. Suite of Benchmark Tests to Conduct Mesh-Convergence Analysis of Nonlinear and Non-constant Coefficient Transport Codes

    NASA Astrophysics Data System (ADS)

    Zamani, K.; Bombardelli, F. A.

    2014-12-01

    Verification of geophysics codes is imperative to avoid serious academic as well as practical consequences. When access to a given source code is not possible, the Method of Manufactured Solutions (MMS) cannot be employed in code verification. In contrast, employing the Method of Exact Solutions (MES) has several practical advantages. In this research, we first provide four new one-dimensional analytical solutions designed for code verification; these solutions are able to uncover particular imperfections of advection-diffusion-reaction (ADR) solvers, such as nonlinear advection, diffusion, or source terms, as well as non-constant coefficient equations. After that, we provide a solution of Burgers' equation in a novel setup. The proposed solutions satisfy the continuity of mass for the ambient flow, which is a crucial factor for coupled hydrodynamics-transport solvers. Then, we use the derived analytical solutions for code verification. To clarify gray-literature issues in the verification of transport codes, we designed a comprehensive test suite to uncover any imperfection in transport solvers via a hierarchical increase in the level of the tests' complexity. The test suite includes hundreds of unit tests and system tests that check the code portion by portion. Example checks start with a simple case of unidirectional advection, proceed to bidirectional advection and tidal flow, and build up to nonlinear cases. We design tests to check nonlinearity in velocity, dispersivity, and reactions. The concealing effect of scales (Peclet and Damkohler numbers) on the mesh-convergence study, and appropriate remedies, are also discussed. For cases in which appropriate benchmarks for a mesh convergence study are not available, we utilize symmetry. Auxiliary subroutines for automation of the test suite and report generation are designed. All in all, the test package is not only a robust tool for code verification but also provides comprehensive insight into the capabilities of ADR solvers. Such information is essential for any rigorous computational modeling of the ADR equation for surface/subsurface pollution transport. We also convey our experience in finding several errors that were not detectable with routine verification techniques.
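
    In the spirit of this suite, an MES-style unit test pairs a discretization with an exact solution and asserts the expected convergence behavior. The sketch below solves constant-coefficient 1D advection with first-order upwinding on a periodic domain and checks that the L2 error drops by roughly 2x under 2x refinement; the scheme and initial profile are illustrative choices, not the paper's benchmarks.

```python
# MES-style unit test: first-order upwind advection on a periodic domain,
# compared against the exact translated profile. The ~2x error drop under
# 2x refinement confirms first-order convergence. Illustrative only.
import numpy as np

def l2_error(n, u=1.0, t_end=0.25, cfl=0.5):
    x = np.linspace(0.0, 1.0, n, endpoint=False)
    dx = 1.0 / n
    dt = cfl * dx / u
    steps = int(round(t_end / dt))
    c = np.exp(-200.0 * (x - 0.3) ** 2)                # smooth initial profile
    for _ in range(steps):
        c = c - (u * dt / dx) * (c - np.roll(c, 1))    # periodic upwind step
    d = (x - 0.3 - u * steps * dt + 0.5) % 1.0 - 0.5   # periodic distance to exact center
    exact = np.exp(-200.0 * d ** 2)
    return float(np.sqrt(np.mean((c - exact) ** 2)))

ratio = l2_error(200) / l2_error(400)
assert 1.5 < ratio < 2.5, ratio    # ~2x error drop = first-order convergence
```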

  14. Real-Time Verification of a High-Dose-Rate Iridium 192 Source Position Using a Modified C-Arm Fluoroscope

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Nose, Takayuki, E-mail: nose-takayuki@nms.ac.jp; Chatani, Masashi; Otani, Yuki

    Purpose: High-dose-rate (HDR) brachytherapy misdeliveries can occur at any institution, and they can cause disastrous results. Even a patient's death has been reported. Misdeliveries could be avoided with real-time verification methods. In 1996, we developed a modified C-arm fluoroscopic verification of the HDR Iridium 192 source position to prevent these misdeliveries. This method provided image quality sufficient to detect errors, and it has been in clinical use at our institutions for 20 years. The purpose of the current study is to introduce the mechanisms and validity of our straightforward C-arm fluoroscopic verification method. Methods and Materials: Conventional X-ray fluoroscopic images are degraded by spurious signals and quantum noise from Iridium 192 photons, which make source verification impractical. To improve image quality, we quadrupled the C-arm fluoroscopic X-ray dose per pulse. The pulse rate was reduced by a factor of 4 to keep the average exposure compliant with Japanese medical regulations. The images were then displayed at quarter-frame rates. Results: Sufficient quality was obtained to enable observation of the source position relative to both the applicators and the anatomy. With this method, 2 errors were detected among 2031 treatment sessions for 370 patients within a 6-year period. Conclusions: With the use of a modified C-arm fluoroscopic verification method, treatment errors that would otherwise have been overlooked were detected in real time. This method should be given consideration for widespread use.

  15. Real-Time Verification of a High-Dose-Rate Iridium 192 Source Position Using a Modified C-Arm Fluoroscope.

    PubMed

    Nose, Takayuki; Chatani, Masashi; Otani, Yuki; Teshima, Teruki; Kumita, Shinichirou

    2017-03-15

    High-dose-rate (HDR) brachytherapy misdeliveries can occur at any institution, and they can cause disastrous results. Even a patient's death has been reported. Misdeliveries could be avoided with real-time verification methods. In 1996, we developed a modified C-arm fluoroscopic verification of an HDR Iridium 192 source position to prevent these misdeliveries. This method provided excellent image quality sufficient to detect errors, and it has been in clinical use at our institutions for 20 years. The purpose of the current study is to introduce the mechanisms and validity of our straightforward C-arm fluoroscopic verification method. Conventional X-ray fluoroscopic images are degraded by spurious signals and quantum noise from Iridium 192 photons, which make source verification impractical. To improve image quality, we quadrupled the C-arm fluoroscopic X-ray dose per pulse. The pulse rate was reduced by a factor of 4 to keep the average exposure compliant with Japanese medical regulations. The images were then displayed at quarter-frame rates. Sufficient quality was obtained to enable observation of the source position relative to both the applicators and the anatomy. With this method, 2 errors were detected among 2031 treatment sessions for 370 patients within a 6-year period. With the use of a modified C-arm fluoroscopic verification method, treatment errors that were otherwise overlooked were detected in real time. This method should be given consideration for widespread use. Copyright © 2016 Elsevier Inc. All rights reserved.
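
    The dose-rate bookkeeping behind the modification is simple and worth making explicit: the average exposure rate is the per-pulse dose times the pulse rate, so quadrupling the former while quartering the latter leaves the regulatory average unchanged while each displayed frame carries four times the signal (notation ours):

        \dot{D}_{avg} = D_{pulse} \cdot f_{pulse}, \qquad (4 D_{pulse}) \cdot \tfrac{1}{4} f_{pulse} = D_{pulse} f_{pulse} = \dot{D}_{avg}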

  16. Towards the development of a rapid, portable, surface enhanced Raman spectroscopy based cleaning verification system for the drug nelarabine.

    PubMed

    Corrigan, Damion K; Salton, Neale A; Preston, Chris; Piletsky, Sergey

    2010-09-01

    Cleaning verification is a scientific and economic problem for the pharmaceutical industry. A large amount of potential manufacturing time is lost to the process of cleaning verification, which involves the analysis of residues on soiled manufacturing equipment, with high-performance liquid chromatography (HPLC) being the predominant analytical technique. The aim of this study was to develop a rapid, portable cleaning verification system for nelarabine using surface enhanced Raman spectroscopy (SERS), conducted with a portable Raman spectrometer and a commercially available SERS substrate. Samples of standard solutions and swab extracts were deposited onto the SERS active surfaces, allowed to dry and then subjected to spectroscopic analysis. Nelarabine was amenable to analysis by SERS and the necessary levels of sensitivity were achievable, making it possible to use this technology for a semi-quantitative limits test. Replicate precision, however, was poor due to the heterogeneous drying pattern of nelarabine on the SERS active surface; understanding and improving the drying process in order to produce a consistent SERS signal for quantitative analysis is desirable. This work shows the potential application of SERS for cleaning verification analysis. SERS may not replace HPLC as the definitive analytical technique, but it could be used in conjunction with HPLC so that swabbing is only carried out once the portable SERS equipment has demonstrated that the manufacturing equipment is below the threshold contamination level.
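
    A minimal sketch of the semi-quantitative limits-test idea, under stated assumptions: the characteristic Raman shift, the mock swab spectrum, and the pass/fail threshold intensity (which would be calibrated from standard solutions at the contamination limit) are all hypothetical placeholders, not values from the paper:

        import numpy as np

        CHARACTERISTIC_SHIFT = 730.0   # cm^-1, hypothetical band for the analyte
        THRESHOLD_INTENSITY = 1500.0   # counts, from standards at the limit level

        def limits_test(shifts, intensities):
            """Pass/fail cleaning check from one SERS spectrum of a swab extract."""
            idx = np.argmin(np.abs(shifts - CHARACTERISTIC_SHIFT))
            peak = intensities[max(0, idx - 2): idx + 3].max()   # local peak height
            return "FAIL (residue above limit)" if peak > THRESHOLD_INTENSITY else "PASS"

        shifts = np.linspace(400, 1800, 1401)
        spectrum = 100 + 2000 * np.exp(-0.5 * ((shifts - 730) / 5) ** 2)  # mock spectrum
        print(limits_test(shifts, spectrum))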

  17. Formal Verification at System Level

    NASA Astrophysics Data System (ADS)

    Mazzini, S.; Puri, S.; Mari, F.; Melatti, I.; Tronci, E.

    2009-05-01

    System Level Analysis calls for a language comprehensible to experts with different backgrounds and yet precise enough to support meaningful analyses. SysML is emerging as an effective balance between these conflicting goals. In this paper we outline some of the results obtained on SysML-based system-level functional formal verification in an ESA/ESTEC study carried out in collaboration between INTECS and La Sapienza University of Rome. The study focuses on SysML-based system-level functional requirements techniques.

  18. Full-chip level MEEF analysis using model based lithography verification

    NASA Astrophysics Data System (ADS)

    Kim, Juhwan; Wang, Lantian; Zhang, Daniel; Tang, Zongwu

    2005-11-01

    MEEF (Mask Error Enhancement Factor) has become a critical factor in CD uniformity control since the optical lithography process moved into the sub-resolution era. Many studies have quantified the impact of mask CD (Critical Dimension) errors on wafer CD errors [1-2]; however, their benefits were restricted to small pattern areas of the full-chip data due to long simulation times. Because fast turnaround can now be achieved for complicated verifications on very large data sets using linearly scalable distributed processing, model-based lithography verification has become feasible for applications such as post-mask-synthesis data sign-off for mask tape-out in production and lithography process development with full-chip data [3-5]. In this study, we introduce two useful methodologies for full-chip-level verification of mask error impact on the wafer lithography patterning process. One methodology checks the MEEF distribution in addition to the CD distribution through the process window, which can be used for RET/OPC optimization at the R&D stage. The other checks mask error sensitivity on potential pinch and bridge hotspots through lithography process variation; the outputs can be passed on to mask CD metrology to add CD measurements at those hotspot locations. Two different OPC data sets were compared using the two methodologies in this study.
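
    For reference, MEEF is conventionally defined as the derivative of the printed wafer CD with respect to the mask CD (with the mask dimension scaled to wafer scale), so a mask CD error is amplified on the wafer whenever MEEF exceeds unity:

        \mathrm{MEEF} = \frac{\partial CD_{wafer}}{\partial CD_{mask}}, \qquad \Delta CD_{wafer} \approx \mathrm{MEEF} \cdot \Delta CD_{mask}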

  19. Verification of relationship model between Korean new elderly class's recovery resilience and productive aging.

    PubMed

    Cho, Gun-Sang; Kim, Dae-Sung; Yi, Eun-Surk

    2015-12-01

    The purpose of this study was to verify a relationship model between the Korean new elderly class's recovery resilience and productive aging. As of 2013, this study sampled preliminary elderly people in Gyeonggi-do and other provinces nationwide, and data from a total of 484 effective subjects were analyzed. The collected data were processed using IBM SPSS 20.0 and AMOS 20.0 and underwent descriptive statistical analysis, confirmatory factor analysis, and structural model verification. The path coefficient associated with model fitness was examined. The standardized path coefficient between recovery resilience and productive aging was β=0.975 (t=14.790), revealing a statistically significant positive effect. Thus, the proposed basic model on the direct path between recovery resilience and productive aging was found to fit the data.

  20. Verification of relationship model between Korean new elderly class’s recovery resilience and productive aging

    PubMed Central

    Cho, Gun-Sang; Kim, Dae-Sung; Yi, Eun-Surk

    2015-01-01

    The purpose of this study was to verify a relationship model between the Korean new elderly class’s recovery resilience and productive aging. As of 2013, this study sampled preliminary elderly people in Gyeonggi-do and other provinces nationwide, and data from a total of 484 effective subjects were analyzed. The collected data were processed using IBM SPSS 20.0 and AMOS 20.0 and underwent descriptive statistical analysis, confirmatory factor analysis, and structural model verification. The path coefficient associated with model fitness was examined. The standardized path coefficient between recovery resilience and productive aging was β=0.975 (t=14.790), revealing a statistically significant positive effect. Thus, the proposed basic model on the direct path between recovery resilience and productive aging was found to fit the data. PMID:26730383

  1. Online 3D EPID-based dose verification: Proof of concept

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Spreeuw, Hanno; Rozendaal, Roel, E-mail: r.rozenda

    Purpose: Delivery errors during radiotherapy may lead to medical harm and reduced life expectancy for patients. Such serious incidents can be avoided by performing dose verification online, i.e., while the patient is being irradiated, creating the possibility of halting the linac in case of a large overdosage or underdosage. The offline EPID-based 3D in vivo dosimetry system clinically employed at our institute is in principle suited for online treatment verification, provided the system is able to complete 3D dose reconstruction and verification within 420 ms, the present acquisition time of a single EPID frame. It is the aim of this study to show that our EPID-based dosimetry system can be made fast enough to achieve online 3D in vivo dose verification. Methods: The current dose verification system was sped up in two ways. First, a new software package was developed to separately perform all computations that do not depend on portal image acquisition, thus removing the need to do these calculations in real time. Second, the 3D dose reconstruction algorithm was sped up via a new, multithreaded implementation. Dose verification was implemented by comparing planned with reconstructed 3D dose distributions delivered to two regions in a patient: the target volume and the nontarget volume receiving at least 10 cGy. In both volumes, the mean dose is compared, while in the nontarget volume, the near-maximum dose (D2) is compared as well. The real-time dosimetry system was tested by irradiating an anthropomorphic phantom with three VMAT plans: a 6 MV head-and-neck treatment plan, a 10 MV rectum treatment plan, and a 10 MV prostate treatment plan. In all plans, two types of serious delivery errors were introduced. The functionality of automatically halting the linac was also implemented and tested. Results: The precomputation time per treatment was ∼180 s/treatment arc, depending on gantry angle resolution. The complete processing of a single portal frame, including dose verification, took 266 ± 11 ms on a dual octocore Intel Xeon E5-2630 CPU running at 2.40 GHz. The introduced delivery errors were detected after 5-10 s of irradiation time. Conclusions: A prototype online 3D dose verification tool using portal imaging has been developed and successfully tested for two different kinds of gross delivery errors. Thus, online 3D dose verification has been technologically achieved.
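
    A minimal sketch of the per-frame dose comparison step described above, assuming hypothetical 3-D planned and reconstructed dose arrays and boolean masks for the target and nontarget volumes; D2 denotes the dose exceeded by the hottest 2% of voxels, and the tolerance is an illustrative placeholder:

        import numpy as np

        def d2(dose_values):
            # Near-maximum dose: dose received by the hottest 2% of voxels.
            return np.percentile(dose_values, 98)

        def verify_frame(planned, reconstructed, target_mask, nontarget_mask, tol=0.10):
            """Flag quantities deviating more than `tol` (fractional) from the plan."""
            checks = {
                "target mean": (planned[target_mask].mean(), reconstructed[target_mask].mean()),
                "nontarget mean": (planned[nontarget_mask].mean(), reconstructed[nontarget_mask].mean()),
                "nontarget D2": (d2(planned[nontarget_mask]), d2(reconstructed[nontarget_mask])),
            }
            return {name: abs(r - p) / p > tol for name, (p, r) in checks.items()}

        rng = np.random.default_rng(0)
        planned = rng.uniform(0.1, 2.0, size=(32, 32, 32))
        reconstructed = planned * 1.02                  # mock small delivery deviation
        target = np.zeros_like(planned, dtype=bool)
        target[10:20, 10:20, 10:20] = True
        print(verify_frame(planned, reconstructed, target, ~target))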

  2. Guidelines for preparing software user documentation

    NASA Technical Reports Server (NTRS)

    Miller, Diane F.

    1987-01-01

    Clear, easy-to-use software user's manuals place strong demands on special technical communication techniques. Principles and guidelines are given for analyzing the audience and dealing with the wide-ranging backgrounds of potential users. Types of information to be included in a complete manual are suggested, with a technique for creating a user-oriented rather than process-oriented organization. Accuracy verification is emphasized. Simple tips are given for formatting for quick comprehension and reference, for deciding on packaging, for creating helpful illustrations and examples, and for setting up clear and consistent conventions. Simple guidelines are offered for writing clearly and concisely and for editing.

  3. Time crystals: a review

    NASA Astrophysics Data System (ADS)

    Sacha, Krzysztof; Zakrzewski, Jakub

    2018-01-01

    Time crystals are time-periodic self-organized structures postulated by Frank Wilczek in 2012. While the original concept was strongly criticized, it at the same time stimulated intensive research leading to propositions and experimental verifications of discrete (or Floquet) time crystals: structures that appear in the time domain due to spontaneous breaking of discrete time translation symmetry. The struggle to observe discrete time crystals is reviewed here, together with propositions that generalize this concept, introducing condensed-matter-like physics in the time domain. We shall also revisit Wilczek's original idea and review strategies aimed at spontaneous breaking of continuous time translation symmetry.

  4. Integral method for the calculation of Hawking radiation in dispersive media. II. Asymmetric asymptotics.

    PubMed

    Robertson, Scott

    2014-11-01

    Analog gravity experiments make feasible the realization of black hole space-times in a laboratory setting and the observational verification of Hawking radiation. Since such analog systems are typically dominated by dispersion, efficient techniques for calculating the predicted Hawking spectrum in the presence of strong dispersion are required. In the preceding paper, an integral method in Fourier space was proposed for stationary 1+1-dimensional backgrounds which are asymptotically symmetric. Here, this method is generalized to backgrounds which differ in the asymptotic regions to the left and right of the scattering region.

  5. Computational-hydrodynamic studies of the Noh compressible flow problem using non-ideal equations of state

    NASA Astrophysics Data System (ADS)

    Honnell, Kevin; Burnett, Sarah; Yorke, Chloe'; Howard, April; Ramsey, Scott

    2017-06-01

    The Noh problem is a classic verification problem in the field of compressible flows. Simple to conceptualize, it is nonetheless difficult for numerical codes to predict correctly, making it an ideal code-verification test bed. In its original incarnation, the fluid is a simple ideal gas; once validated, however, these codes are often used to study highly non-ideal fluids and solids. In this work the classic Noh problem is extended beyond the commonly studied polytropic ideal gas to more realistic equations of state (EOS), including the stiff gas, the Noble-Abel gas, and the Carnahan-Starling hard-sphere fluid, thus enabling verification studies to be performed on more physically realistic fluids. Exact solutions are compared with numerical results obtained from the Lagrangian hydrocode FLAG, developed at Los Alamos. For these more realistic EOSs, the simulation errors decreased in magnitude both at the origin and at the shock, but also spread more broadly about these points compared to the ideal EOS. The overall spatial convergence rate remained first order.
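
    For orientation, below is a sketch of the textbook ideal-gas Noh solution that such verification studies compare against (ours, not the paper's code), for inflow speed u0 and initial density rho0, with n = 1, 2, 3 for planar, cylindrical, spherical geometry; the paper's non-ideal-EOS extensions generalize this closed form:

        import numpy as np

        def noh_exact(r, t, gamma=5.0 / 3.0, u0=1.0, rho0=1.0, n=3):
            """Ideal-gas Noh solution (density, velocity, pressure) at radius r, time t."""
            r = np.asarray(r, dtype=float)
            rs = 0.5 * (gamma - 1.0) * u0 * t            # shock position
            ratio = (gamma + 1.0) / (gamma - 1.0)
            rho = np.where(r < rs,
                           rho0 * ratio ** n,            # shocked region, gas at rest
                           rho0 * (1.0 + u0 * t / np.maximum(r, 1e-300)) ** (n - 1))
            u = np.where(r < rs, 0.0, -u0)
            p = np.where(r < rs,
                         0.5 * rho0 * u0 ** 2 * (gamma + 1.0) * ratio ** (n - 1),
                         0.0)                            # cold pre-shock gas
            return rho, u, p

        rho, u, p = noh_exact(np.array([0.1, 0.5]), t=0.6)
        print(rho, u, p)   # shocked density 64 inside r = t/3 for gamma = 5/3, n = 3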

  6. Verified compilation of Concurrent Managed Languages

    DTIC Science & Technology

    2017-11-01

    designs for compiler intermediate representations that facilitate mechanized proofs and verification; and (d) a realistic case study that combines these... ideas to prove the correctness of a state-of-the-art concurrent garbage collector. Subject terms: program verification, compiler design. ...Even though concurrency is a pervasive part of modern software and hardware systems, it has often been ignored in safety-critical system designs.

  7. Pharmacy Automation in Navy Medicine: A Study of Naval Medical Center San Diego

    DTIC Science & Technology

    2015-09-01

    ... to pharmacist verification. Figure 3: Robotic Delivery System Installed at Naval... The machine counts the medication, caps the vial, and affixes the label. This completed prescription is then placed on the conveyor belt for routing to pharmacist... performing all steps, including transportation, up to pharmacist verification via the conveyor belt. Manual fills are located along the conveyor system

  8. Two-Black Box Concept for Warhead Verification

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bates, Cameron Russell; Frame, Katherine Chiyoko; Mckigney, Edward Allen

    2017-03-06

    We have created a possible solution for meeting the requirements of certification/authentication while still employing complicated criteria. Technical solutions for protecting information from the host in an inspection environment need to be assessed by those with specific expertise, but LANL can still study the verification problem. The two-black-box framework developed provides another potential solution to the confidence vs. certification paradox.

  9. ENVIRONMENTAL TECHNOLOGY VERIFICATION REPORT - PHYSICAL REMOVAL OF PARTICULATE CONTAMINANTS IN DRINKING WATER: POLYMEM UF 120 S2 ULTRAFILTRATION MEMBRANE MODULE, LUXEMBURG, WISCONSIN

    EPA Science Inventory

    Verification testing of the Polymem UF120 S2 Ultrafiltration Membrane Module was conducted over a 46-day period at the Green Bay Water Utility Filtration Plant, Luxemburg, Wisconsin. The ETV testing described herein was funded in conjunction with a 12-month membrane pilot study f...

  10. Review of waste package verification tests. Semiannual report, October 1982-March 1983

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Soo, P.

    1983-08-01

    The current study is part of an ongoing task to specify tests that may be used to verify that engineered waste package/repository systems comply with NRC radionuclide containment and controlled release performance objectives. Work covered in this report analyzes verification tests for borosilicate glass waste forms and bentonite- and zeolite-based packing materials (discrete backfills). 76 references.

  11. NHEXAS PHASE I ARIZONA STUDY--STANDARD OPERATING PROCEDURE FOR CODING AND CODING VERIFICATION (HAND ENTRY) (UA-D-14.0)

    EPA Science Inventory

    The purpose of this SOP is to define the coding strategy for coding and coding verification of hand-entered data. It applies to the coding of all physical forms, especially those coded by hand. The strategy was developed for use in the Arizona NHEXAS project and the "Border" st...

  12. The Verification of a Method for Detecting and Quantifying Diethylene Glycol, Triethylene Glycol, Tetraethylene Glycol, 2-Butoxyethanol and 2-Methoxyethanol in Ground and Surface Waters

    EPA Science Inventory

    This verification study was a special project designed to determine the efficacy of a draft standard operating procedure (SOP) developed by US EPA Region 3 for the determination of selected glycols in drinking waters that may have been impacted by active unconventional oil and ga...

  13. Free and Reduced-Price Meal Application and Income Verification Practices in School Nutrition Programs in the United States

    ERIC Educational Resources Information Center

    Kwon, Junehee; Lee, Yee Ming; Park, Eunhye; Wang, Yujia; Rushing, Keith

    2017-01-01

    Purpose/Objectives: This study assessed current practices and attitudes of school nutrition program (SNP) management staff regarding free and reduced-price (F-RP) meal application and verification in SNPs. Methods: A stratified random sample of 1,500 SNP management staff in 14 states received a link to an online questionnaire and/or a printed…

  14. 37 CFR 201.29 - Access to, and confidentiality of, Statements of Account, Verification Auditor's Reports, and...

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... confidentiality of, Statements of Account, Verification Auditor's Reports, and other verification information... GENERAL PROVISIONS § 201.29 Access to, and confidentiality of, Statements of Account, Verification Auditor... Account, including the Primary Auditor's Reports, filed under 17 U.S.C. 1003(c) and access to a Verifying...

  15. Formal Multilevel Hierarchical Verification of Synchronous MOS VLSI Circuits.

    DTIC Science & Technology

    1987-06-01

    [Table-of-contents fragments: 12.4 Capacitance Coupling; 12.5 Multiple Abstraction Functions] ...depend on whether it is performing flat verification or hierarchical verification. The primary operations of Silica Pithecus when performing flat verification... The primary operation of Silica Pithecus when performing hierarchical verification is processing constraints to show they hold

  16. 49 CFR 802.7 - Requests: How, where, and when presented; verification of identity of individuals making requests...

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ...; verification of identity of individuals making requests; accompanying persons; and procedures for... Procedures and Requirements § 802.7 Requests: How, where, and when presented; verification of identity of... which the record is contained. (d) Verification of identity of requester. (1) For written requests, the...

  17. 49 CFR 802.7 - Requests: How, where, and when presented; verification of identity of individuals making requests...

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ...; verification of identity of individuals making requests; accompanying persons; and procedures for... Procedures and Requirements § 802.7 Requests: How, where, and when presented; verification of identity of... which the record is contained. (d) Verification of identity of requester. (1) For written requests, the...

  18. 76 FR 54810 - Submission for Review: 3206-0215, Verification of Full-Time School Attendance, RI 25-49

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-09-02

    ... OFFICE OF PERSONNEL MANAGEMENT Submission for Review: 3206-0215, Verification of Full-Time School...) 3206-0215, Verification of Full-Time School Attendance. As required by the Paperwork Reduction Act of... or faxed to (202) 395-6974. SUPPLEMENTARY INFORMATION: RI 25-49, Verification of Full-Time School...

  19. 25 CFR 61.8 - Verification forms.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... using the last address of record. The verification form will be used to ascertain the previous enrollee... death. Name and/or address changes will only be made if the verification form is signed by an adult... 25 Indians 1 2010-04-01 2010-04-01 false Verification forms. 61.8 Section 61.8 Indians BUREAU OF...

  20. Audio-visual imposture

    NASA Astrophysics Data System (ADS)

    Karam, Walid; Mokbel, Chafic; Greige, Hanna; Chollet, Gerard

    2006-05-01

    A GMM-based audio-visual speaker verification system is described, and an Active Appearance Model with a linear speaker transformation system is used to evaluate the robustness of the verification. An Active Appearance Model (AAM) is used to automatically locate and track a speaker's face in a video recording, and a Gaussian Mixture Model (GMM) based classifier (BECARS) is used for face verification. GMM training and testing are accomplished on DCT-based features extracted from the detected faces. On the audio side, speech features are extracted and used for speaker verification with the GMM-based classifier. Fusion of both audio and video modalities for audio-visual speaker verification is compared with face verification and speaker verification systems alone. To test the robustness of the multimodal biometric identity verification system, an audio-visual imposture system is envisioned. It consists of an automatic voice transformation technique that an impostor may use to assume the identity of an authorized client. Features of the transformed voice are then combined with the corresponding appearance features and fed into the GMM-based system BECARS for training. An attempt is made to increase the acceptance rate of the impostor and to analyze the robustness of the verification system. Experiments are being conducted on the BANCA database, with a prospect of experimenting on the newly developed PDAtabase within the scope of the SecurePhone project.
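
    A minimal sketch of GMM-based speaker verification of the kind described, using scikit-learn in place of BECARS (an assumption; the paper's own toolchain differs) and mock feature vectors: a client model and a background model are fit on training features, and a test utterance is accepted when its average log-likelihood ratio exceeds a threshold.

        import numpy as np
        from sklearn.mixture import GaussianMixture

        rng = np.random.default_rng(1)
        client_feats = rng.normal(0.0, 1.0, size=(500, 12))   # mock MFCC/DCT features
        world_feats = rng.normal(0.5, 1.5, size=(2000, 12))   # mock background data

        client_gmm = GaussianMixture(n_components=8, covariance_type="diag",
                                     random_state=0).fit(client_feats)
        world_gmm = GaussianMixture(n_components=16, covariance_type="diag",
                                    random_state=0).fit(world_feats)

        def verify(test_feats, threshold=0.0):
            # Average per-frame log-likelihood ratio: client model vs. background.
            llr = client_gmm.score(test_feats) - world_gmm.score(test_feats)
            return llr > threshold, llr

        accepted, score = verify(rng.normal(0.0, 1.0, size=(300, 12)))
        print(accepted, round(score, 3))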

  1. Current status of verification practices in clinical biochemistry in Spain.

    PubMed

    Gómez-Rioja, Rubén; Alvarez, Virtudes; Ventura, Montserrat; Alsina, M Jesús; Barba, Núria; Cortés, Mariano; Llopis, María Antonia; Martínez, Cecilia; Ibarz, Mercè

    2013-09-01

    Verification uses logical algorithms to detect potential errors before laboratory results are released to the clinician. Even though verification is one of the main processes in all laboratories, there is a lack of standardization, mainly in the algorithms used and in the criteria and verification limits applied. A survey of clinical laboratories in Spain was conducted in order to assess the verification process, particularly the use of autoverification. Questionnaires were sent to the laboratories involved in the External Quality Assurance Program organized by the Spanish Society of Clinical Biochemistry and Molecular Pathology. Seven common biochemical parameters were included (glucose, cholesterol, triglycerides, creatinine, potassium, calcium, and alanine aminotransferase). Completed questionnaires were received from 85 laboratories. Nearly all the laboratories reported using the following seven verification criteria: internal quality control, instrument warnings, sample deterioration, reference limits, clinical data, concordance between parameters, and verification of results. The use of each verification criterion varied according to the type of verification (automatic, technical, or medical). Verification limits for these parameters are similar to biological reference ranges. Delta check was used in 24% of laboratories. Most laboratories (64%) reported using autoverification systems. Autoverification use was related to laboratory size, ownership, and type of laboratory information system, but the amount of use (percentage of tests autoverified) was not related to laboratory size. A total of 36% of Spanish laboratories do not use autoverification, despite the general implementation of laboratory information systems, most of which have autoverification capability. Criteria and rules for seven routine biochemical tests were obtained.
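
    A minimal sketch of the kind of autoverification logic surveyed above, with hypothetical limits and tolerance: a result is released automatically only if it passes a verification-limit check and a delta check against the patient's previous value; anything else is held for technical or medical review.

        # Hypothetical verification limits (mmol/L) and delta-check tolerance.
        LIMITS = {"glucose": (2.0, 25.0), "potassium": (2.5, 6.5)}
        DELTA_FRAC = 0.25

        def autoverify(analyte, value, previous=None):
            """Return 'release' or 'hold for review' for a single result."""
            lo, hi = LIMITS[analyte]
            if not lo <= value <= hi:
                return "hold for review"      # outside verification limits
            if previous and abs(value - previous) / previous > DELTA_FRAC:
                return "hold for review"      # failed delta check vs. prior result
            return "release"

        print(autoverify("potassium", 4.1, previous=4.0))   # release
        print(autoverify("glucose", 30.0))                  # hold for review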

  2. Simulation validation and management

    NASA Astrophysics Data System (ADS)

    Illgen, John D.

    1995-06-01

    Illgen Simulation Technologies, Inc., has been working on interactive verification and validation programs for the past six years. As a result, the company has evolved a methodology that has been adopted and successfully implemented by a number of different verification and validation programs. This methodology employs a unique set of computer-aided software engineering (CASE) tools to reverse engineer source code and produce analytical outputs (flow charts and tables) that aid the engineer/analyst in the verification and validation process. We have found that the use of CASE tools saves time, which equates to improvements in both schedule and cost. This paper will describe the ISTI-developed methodology and how CASE tools are used in its support. Case studies will be discussed.

  3. Correlation between root respiration and the levels of biomass and glycyrrhizic acid in Glycyrrhiza uralensis

    PubMed Central

    Liu, Wenlan; Sun, Zhirong; Qu, Jixu; Yang, Chunning; Zhang, Xiaomin; Wei, Xinxin

    2017-01-01

    The aim of the present study was to investigate the correlation between root respiration and the levels of biomass and glycyrrhizic acid in Glycyrrhiza uralensis. Root respiration was determined using a biological oxygen analyzer. Respiration-related enzymes including glucose-6-phosphate dehydrogenase plus 6-phosphogluconate dehydrogenase, phosphohexose isomerase and succinate dehydrogenase, and respiratory pathways were evaluated. Biomass was determined by a drying-weighing method. In addition, the percentage of glycyrrhizic acid was detected using high-performance liquid chromatography. The association between root respiration and the levels of biomass and glycyrrhizic acid was investigated. The glycolysis pathway (EMP), tricarboxylic acid cycle (TCA) and pentose phosphate (PPP) pathway acted concurrently in the roots of G. uralensis. Grey correlation analysis showed that TCA had the strongest correlation (correlation coefficient, 0.8003) with biomass. Starch and acetyl coenzyme A had the closest association with above-ground biomass, while soluble sugar correlated less strongly with above-ground biomass. Grey correlation analysis between biochemical pathways and the intermediates showed that pyruvic acid had the strongest correlation with EMP, while acetyl coenzyme A correlated most strongly with TCA. Among the intermediates and pathways, pyruvic acid and EMP exhibited the greatest correlation with glycyrrhizic acid, while acetyl coenzyme A and TCA correlated with glycyrrhizic acid less closely. The results of this study may aid the cultivation of G. uralensis. However, these results require verification in further studies. PMID:28962162

  4. Correlation between root respiration and the levels of biomass and glycyrrhizic acid in Glycyrrhiza uralensis.

    PubMed

    Liu, Wenlan; Sun, Zhirong; Qu, Jixu; Yang, Chunning; Zhang, Xiaomin; Wei, Xinxin

    2017-09-01

    The aim of the present study was to investigate the correlation between root respiration and the levels of biomass and glycyrrhizic acid in Glycyrrhiza uralensis. Root respiration was determined using a biological oxygen analyzer. Respiration-related enzymes including glucose-6-phosphate dehydrogenase plus 6-phosphogluconate dehydrogenase, phosphohexose isomerase and succinate dehydrogenase, and respiratory pathways were evaluated. Biomass was determined by a drying-weighing method. In addition, the percentage of glycyrrhizic acid was detected using high-performance liquid chromatography. The association between root respiration and the levels of biomass and glycyrrhizic acid was investigated. The glycolysis pathway (EMP), tricarboxylic acid cycle (TCA) and pentose phosphate (PPP) pathway acted concurrently in the roots of G. uralensis. Grey correlation analysis showed that TCA had the strongest correlation (correlation coefficient, 0.8003) with biomass. Starch and acetyl coenzyme A had the closest association with above-ground biomass, while soluble sugar correlated less strongly with above-ground biomass. Grey correlation analysis between biochemical pathways and the intermediates showed that pyruvic acid had the strongest correlation with EMP, while acetyl coenzyme A correlated most strongly with TCA. Among the intermediates and pathways, pyruvic acid and EMP exhibited the greatest correlation with glycyrrhizic acid, while acetyl coenzyme A and TCA correlated with glycyrrhizic acid less closely. The results of this study may aid the cultivation of G. uralensis. However, these results require verification in further studies.

  5. Design Authority in the Test Programme Definition: The Alenia Spazio Experience

    NASA Astrophysics Data System (ADS)

    Messidoro, P.; Sacchi, E.; Beruto, E.; Fleming, P.; Marucchi Chierro, P.-P.

    2004-08-01

    In addition, since the Verification and Test Programme is a significant part of the spacecraft development life cycle in terms of cost and time, the discussion mentioned above very often has the objective of optimizing the verification campaign by deleting or limiting some testing activities. The increased market pressure to reduce project schedule and cost is generating a dialectic process inside project teams, involving programme management and design authorities, in order to optimize the verification and testing programme. The paper introduces the Alenia Spazio experience in this context, coming from real project life on different products and missions (science, TLC, EO, manned, transportation, military, commercial, recurrent and one-of-a-kind). Usually the applicable verification and testing standards (e.g. ECSS-E-10 part 2 "Verification" and ECSS-E-10 part 3 "Testing" [1]) are tailored to the specific project on the basis of its peculiar mission constraints. The Model Philosophy and the associated verification and test programme are defined following an iterative process which suitably combines several aspects (including, for example, test requirements and facilities), as shown in Fig. 1 (from ECSS-E-10: "Model philosophy, Verification and Test Programme definition"). The considered cases are mainly oriented to thermal and mechanical verification, where the benefits of possible test programme optimizations are more significant. Considering thermal qualification and acceptance testing (i.e. Thermal Balance and Thermal Vacuum), the lessons learned from the development of several satellites are presented together with the corresponding recommended approaches. In particular, cases are indicated in which a proper Thermal Balance Test is mandatory, and others, in the presence of a more recurrent design, where qualification by analysis could be envisaged. The importance of a proper Thermal Vacuum exposure for workmanship verification is also highlighted. Similar considerations are summarized for mechanical testing, with particular emphasis on the importance of Modal Survey, Static and Sine Vibration Tests in the qualification stage, in combination with the effectiveness of the Vibro-Acoustic Test in acceptance. The apparent relative importance of the Sine Vibration Test for workmanship verification in specific circumstances is also highlighted. The verification of project requirements is planned through a combination of suitable verification methods (in particular Analysis and Test) at the different verification levels (from System down to Equipment) and in the proper verification stages (e.g. Qualification and Acceptance).

  6. Self-verification and depression among youth psychiatric inpatients.

    PubMed

    Joiner, T E; Katz, J; Lew, A S

    1997-11-01

    According to self-verification theory (e.g., W.B. Swann, 1983), people are motivated to preserve stable self-concepts by seeking self-confirming interpersonal responses, even if the responses are negative. In the current study of 72 youth psychiatric inpatients (36 boys; 36 girls; ages 7-17, M = 13.18; SD = 2.59), the authors provide the 1st test of self-verification theory among a youth sample. Participants completed self-report questionnaires on depression, self-esteem, anxiety, negative and positive affect, and interest in negative feedback from others. The authors made chart diagnoses available, and they collected peer rejection ratings. Consistent with hypotheses, the authors found that interest in negative feedback was associated with depression, was predictive of peer rejection (but only within relatively longer peer relationships), was more highly related to cognitive than emotional aspects of depression, and was specifically associated with depression, rather than being generally associated with emotional distress. The authors discuss implications for self-verification theory and for the phenomenology of youth depression.

  7. Improved Hip-Based Individual Recognition Using Wearable Motion Recording Sensor

    NASA Astrophysics Data System (ADS)

    Gafurov, Davrondzhon; Bours, Patrick

    In today's society the demand for reliable verification of a user's identity is increasing. Although biometric technologies based on fingerprint or iris can provide accurate and reliable recognition performance, they are inconvenient for periodic or frequent re-verification. In this paper we propose a hip-based user recognition method suitable for implicit and periodic re-verification of identity. In our approach we use a wearable accelerometer sensor attached to the hip of the person, and the measured hip motion signal is then analysed for identity verification purposes. The main analysis steps consist of detecting gait cycles in the signal and matching two sets of detected gait cycles. Evaluating the approach on a hip data set consisting of 400 gait sequences (samples) from 100 subjects, we obtained an equal error rate (EER) of 7.5% and a rank-1 identification rate of 81.4%. These numbers are improvements of 37.5% and 11.2%, respectively, over a previous study using the same data set.
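
    A minimal sketch of the two analysis steps named above (cycle detection, then cycle matching), under simplifying assumptions of ours: cycle length is estimated from the autocorrelation peak of the acceleration signal, cycles are resampled to a fixed length, and two recordings are compared by the distance between their average cycles.

        import numpy as np

        def average_gait_cycle(signal, min_lag=40, max_lag=160, cycle_len=100):
            """Estimate the gait period via autocorrelation, then average all cycles."""
            x = signal - signal.mean()
            ac = np.correlate(x, x, mode="full")[len(x) - 1:]
            period = min_lag + int(np.argmax(ac[min_lag:max_lag]))
            starts = range(0, len(x) - period, period)
            cycles = [np.interp(np.linspace(0, period - 1, cycle_len),
                                np.arange(period), x[s:s + period]) for s in starts]
            return np.mean(cycles, axis=0)

        def match_score(sig_a, sig_b):
            # Smaller distance between average cycles = more likely the same person.
            return np.linalg.norm(average_gait_cycle(sig_a) - average_gait_cycle(sig_b))

        t = np.arange(2000)
        walker = np.sin(2 * np.pi * t / 90)              # mock periodic hip signal
        probe = walker + 0.1 * np.random.default_rng(2).normal(size=2000)
        print(round(match_score(walker, probe), 3))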

  8. Ramgarh Crater, Rajasthan, India - Study of multispectral images obtained by Indian remote sensing satellite (IRS-IA)

    NASA Technical Reports Server (NTRS)

    Murali, A. V.; Lulla, Kamlesh P.

    1992-01-01

    Ramgarh Crater, Rajasthan, India is a potential impact crater that has not been studied so far. The proximity of Ramgarh Crater to the Deccan flood basalt terrain makes it important to examine the spatial and temporal relationship of this crater to Deccan Volcanism because recent studies propose a strong link between impact cratering and major flood basalt eruptions. A detailed multidisciplinary study is necessary to evaluate the structure and lithology of Ramgarh Crater and its temporal relationship to the emplacement of Deccan eruptions in India. Application of the IRS-IA data to study the lithologic/surface characteristics of Ramgarh Crater (attempted for the first time) indicates the potential application of remote sensing data in these studies. The IRS-IA data are of good quality and resolution. Our preliminary assessment has shown that these data are helpful in generating lithology soil vegetation profiles of Ramgarh Crater region. These 'profile maps' would be useful for targeting the specific areas in the region for a closer look and ground truth verification during the field work and sample collection in the region.

  9. Verification and Improvement of ERS-1/2 Altimeter Geophysical Data Records for Global Change Studies

    NASA Technical Reports Server (NTRS)

    Shum, C. K.

    2000-01-01

    This Final Technical Report summarizes the research work conducted under NASA's Physical Oceanography Program entitled "Verification and Improvement of ERS-1/2 Altimeter Geophysical Data Records for Global Change Studies" for the period from January 1, 2000 through June 30, 2000, and also provides a summary of the investigation from July 1, 1997 - June 30, 2000. The primary objectives of this investigation include verification and improvement of the ERS-1 and ERS-2 radar altimeter geophysical data records for distribution to the ESA-approved U.S. ERS-1/-2 investigators for global climate change studies. Specifically, the investigation is to verify and improve the ERS geophysical data record products by calibrating the instrument and assessing the accuracy of the ERS-1/-2 orbital, geophysical, media, and instrument corrections, the purpose being to ensure consistency of constants, standards and algorithms with the TOPEX/POSEIDON radar altimeter for global climate change studies such as the monitoring and interpretation of long-term sea level change. This investigation has provided the current best precise orbits, with the radial orbit accuracy for ERS-1 (Phases C-G) and ERS-2 estimated at the 3-5 cm rms level, a 30-fold improvement compared to the 1993 accuracy. We have finalized the production and verification of the value-added ERS-1 mission products (Phases A, B, C, D, E, F, and G) in collaboration with JPL PODAAC and the University of Texas. Orbit and data verification and improvement of algorithms led to the best data product available to date. ERS-2 altimeter data have been improved, and we have been active in the Envisat (2001 launch) GDR algorithm review and improvement. The data improvement of ERS-1 and ERS-2 led to improvements in the global mean sea surface, marine gravity anomaly and bathymetry models, and to a study of Antarctica mass balance, which was published in Science in 1998.

  10. Spacecraft Testing Programs: Adding Value to the Systems Engineering Process

    NASA Technical Reports Server (NTRS)

    Britton, Keith J.; Schaible, Dawn M.

    2011-01-01

    Testing has long been recognized as a critical component of spacecraft development activities, yet many major system failures might have been prevented with more rigorous testing programs. The question is: why is more testing not being conducted? Given unlimited resources, more testing would likely be included in a spacecraft development program. Striking the right balance between too much testing and not enough has been a long-term challenge for many industries. The objective of this paper is to discuss some of the barriers, enablers, and best practices for developing and sustaining a strong test program and testing team. This paper will also explore the testing decision factors used by managers; the varying attitudes toward testing; methods to develop strong test engineers; and the influence of behavior, culture and processes on testing programs. KEY WORDS: Risk, Integration and Test, Validation, Verification, Test Program Development

  11. Symbolic Computation of Strongly Connected Components Using Saturation

    NASA Technical Reports Server (NTRS)

    Zhao, Yang; Ciardo, Gianfranco

    2010-01-01

    Finding strongly connected components (SCCs) in the state space of discrete-state models is a critical task in formal verification of LTL and fair CTL properties, but the potentially huge number of reachable states and SCCs constitutes a formidable challenge. This paper is concerned with computing the sets of states in SCCs or terminal SCCs of asynchronous systems. Because of its advantages in many applications, we employ saturation in two previously proposed approaches: the Xie-Beerel algorithm and transitive closure. First, saturation speeds up state-space exploration when computing each SCC in the Xie-Beerel algorithm. Then, our main contribution is a novel algorithm to compute the transitive closure using saturation. Experimental results indicate that our improved algorithms achieve a clear speedup over previous algorithms in some cases. With the help of the new transitive closure computation algorithm, up to 10^150 SCCs can be explored within a few seconds.
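
    A minimal explicit-state sketch of the Xie-Beerel idea the paper builds on (the paper's contribution is the symbolic, saturation-based realization, which this sketch does not attempt): pick a state v, intersect its forward and backward reachable sets to obtain the SCC containing v, then recurse on the remainder.

        def reachable(start, succ, universe):
            """Fixed-point reachability within `universe` via the `succ` relation."""
            seen, frontier = {start}, [start]
            while frontier:
                s = frontier.pop()
                for t in succ(s):
                    if t in universe and t not in seen:
                        seen.add(t)
                        frontier.append(t)
            return seen

        def xie_beerel_sccs(states, succ, pred):
            """Enumerate all SCCs: SCC(v) = forward(v) & backward(v)."""
            remaining = set(states)
            while remaining:
                v = next(iter(remaining))
                scc = reachable(v, succ, remaining) & reachable(v, pred, remaining)
                yield scc
                remaining -= scc

        edges = {1: [2], 2: [3], 3: [1, 4], 4: []}
        rev = {1: [3], 2: [1], 3: [2], 4: [3]}
        print(list(xie_beerel_sccs(edges, lambda s: edges[s], lambda s: rev[s])))
        # [{1, 2, 3}, {4}] (order may vary)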

  12. Quantum adiabatic machine learning

    NASA Astrophysics Data System (ADS)

    Pudenz, Kristen L.; Lidar, Daniel A.

    2013-05-01

    We develop an approach to machine learning and anomaly detection via quantum adiabatic evolution. This approach consists of two quantum phases, with some amount of classical preprocessing to set up the quantum problems. In the training phase we identify an optimal set of weak classifiers, to form a single strong classifier. In the testing phase we adiabatically evolve one or more strong classifiers on a superposition of inputs in order to find certain anomalous elements in the classification space. Both the training and testing phases are executed via quantum adiabatic evolution. All quantum processing is strictly limited to two-qubit interactions so as to ensure physical feasibility. We apply and illustrate this approach in detail to the problem of software verification and validation, with a specific example of the learning phase applied to a problem of interest in flight control systems. Beyond this example, the algorithm can be used to attack a broad class of anomaly detection problems.
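
    In this formulation, the strong classifier is a thresholded weighted vote over the weak classifiers, and training reduces to choosing a binary weight vector that minimizes training error plus a sparsity penalty, a combinatorial optimization naturally suited to adiabatic quantum hardware (schematic form, notation ours):

        C(x) = \mathrm{sign}\Big(\sum_{i=1}^{N} w_i\, c_i(x)\Big), \qquad w^{*} = \operatorname*{arg\,min}_{w \in \{0,1\}^N} \Big[\sum_{s} L\big(C_w(x_s), y_s\big) + \lambda \sum_i w_i\Big]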

  13. Structural analysis and evaluation of actual PC bridge using 950 keV/3.95 MeV X-band linacs

    NASA Astrophysics Data System (ADS)

    Takeuchi, H.; Yano, R.; Ozawa, I.; Mitsuya, Y.; Dobashi, K.; Uesaka, M.; Kusano, J.; Oshima, Y.; Ishida, M.

    2017-07-01

    In Japan, bridges constructed during the era of strong economic growth are facing an aging problem, and advanced maintenance methods have recently become strongly required. To meet this demand, we are developing an on-site inspection system using 950 keV/3.95 MeV X-band (9.3 GHz) linac X-ray sources. These systems can visualize in seconds the inner state of bridges, including cracks in concrete, the location and state of tendons (wires), and other imperfections. At on-site inspections, the 950 keV linac exhibited sufficient performance, but for thicker concrete it is difficult to visualize the internal state with the 950 keV linac. Therefore, we proceeded with the installation of the 3.95 MeV linac for on-site bridge inspection. In addition, for accurate evaluation, verification of the parallel-motion CT technique and FEM analysis are in progress.

  14. Viability Study for an Unattended UF 6 Cylinder Verification Station: Phase I Final Report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Smith, Leon E.; Miller, Karen A.; Garner, James R.

    In recent years, the International Atomic Energy Agency (IAEA) has pursued innovative techniques and an integrated suite of safeguards measures to address the verification challenges posed by the front end of the nuclear fuel cycle. Among the unattended instruments currently being explored by the IAEA is an Unattended Cylinder Verification Station (UCVS) that could provide automated, independent verification of the declared relative enrichment, 235U mass, total uranium mass, and identification for all declared UF6 cylinders in a facility (e.g., uranium enrichment plants and fuel fabrication plants). Under the auspices of the United States and European Commission Support Programs to the IAEA, a project was undertaken to assess the technical and practical viability of the UCVS concept. The US Support Program team consisted of Pacific Northwest National Laboratory (PNNL, lead), Los Alamos National Laboratory (LANL), Oak Ridge National Laboratory (ORNL), and Savannah River National Laboratory (SRNL). At the core of the viability study is a long-term field trial of a prototype UCVS system at a Westinghouse fuel fabrication facility. A key outcome of the study is a quantitative performance evaluation of two nondestructive assay (NDA) methods being considered for inclusion in a UCVS: the Hybrid Enrichment Verification Array (HEVA) and the Passive Neutron Enrichment Meter (PNEM). This report provides context for the UCVS concept and the field trial: potential UCVS implementation concepts at an enrichment facility; an overview of the UCVS prototype design; and field trial objectives and activities. Field trial results and interpretation are presented, with a focus on the performance of PNEM and HEVA for the assay of over 200 “typical” Type 30B cylinders, and on the viability of an “NDA Fingerprint” concept as a high-fidelity means to periodically verify that the contents of a given cylinder are consistent with previous scans. A modeling study, combined with field-measured instrument uncertainties, provides an assessment of the partial-defect sensitivity of HEVA and PNEM for both one-time assay and (repeated) NDA Fingerprint verification scenarios. The findings presented in this report represent a significant step forward in the community's understanding of the strengths and limitations of the PNEM and HEVA NDA methods, and of the viability of the UCVS concept in front-end fuel cycle facilities. This experience will inform Phase II of the UCVS viability study, should the IAEA pursue it.
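
    A minimal sketch of the "NDA Fingerprint" comparison idea, under assumptions of ours (the feature-vector contents and the uncertainty model are illustrative, not the study's): a cylinder's stored fingerprint is a vector of measured signatures with known uncertainties, and a re-scan is flagged when any component deviates by more than k standard deviations.

        import numpy as np

        def fingerprint_consistent(stored, stored_sigma, rescan, k=3.0):
            """True if every signature in the re-scan lies within k-sigma of the
            stored fingerprint (measurement uncertainties assumed known)."""
            z = np.abs(np.asarray(rescan) - np.asarray(stored)) / np.asarray(stored_sigma)
            return bool(np.all(z <= k)), z

        stored = [1.02e4, 3.4e2, 0.044]   # e.g., neutron rate, gamma peak, enrichment proxy
        sigma = [1.0e2, 8.0, 0.001]
        ok, z = fingerprint_consistent(stored, sigma, [1.01e4, 3.5e2, 0.0445])
        print(ok, np.round(z, 2))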

  15. Environmental Technology Verification: Supplement to Test/QA Plan for Biological and Aerosol Testing of General Ventilation Air Cleaners; Bioaerosol Inactivation Efficiency by HVAC In-Duct Ultraviolet Light Air Cleaners

    EPA Science Inventory

    The Air Pollution Control Technology Verification Center has selected general ventilation air cleaners as a technology area. The Generic Verification Protocol for Biological and Aerosol Testing of General Ventilation Air Cleaners is on the Environmental Technology Verification we...

  16. 49 CFR 40.135 - What does the MRO tell the employee at the beginning of the verification interview?

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... beginning of the verification interview? 40.135 Section 40.135 Transportation Office of the Secretary of... verification interview? (a) As the MRO, you must tell the employee that the laboratory has determined that the... finding of adulteration or substitution. (b) You must explain the verification interview process to the...

  17. 40 CFR 1065.550 - Gas analyzer range verification and drift verification.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... with a CLD and the removed water is corrected based on measured CO2, CO, THC, and NOX concentrations... concentration subcomponents (e.g., THC and CH4 for NMHC) separately. For example, for NMHC measurements, perform drift verification on NMHC; do not verify THC and CH4 separately. (2) Drift verification requires two...

  18. 76 FR 44051 - Submission for Review: Verification of Who Is Getting Payments, RI 38-107 and RI 38-147

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-07-22

    .... SUPPLEMENTARY INFORMATION: RI 38-107, Verification of Who is Getting Payments, is designed for use by the... OFFICE OF PERSONNEL MANAGEMENT Submission for Review: Verification of Who Is Getting Payments, RI... currently approved information collection request (ICR) 3206-0197, Verification of Who is Getting Payments...

  19. Automated verification of flight software. User's manual

    NASA Technical Reports Server (NTRS)

    Saib, S. H.

    1982-01-01

    AVFS (Automated Verification of Flight Software), a collection of tools for analyzing source programs written in FORTRAN and AED, is documented. The quality and reliability of flight software are improved by: (1) indented listings of source programs, (2) static analysis to detect inconsistencies in the use of variables and parameters, (3) automated documentation, (4) instrumentation of source code, (5) retesting guidance, (6) analysis of assertions, (7) symbolic execution, (8) generation of verification conditions, and (9) simplification of verification conditions. Use of AVFS in the verification of flight software is described.

  20. Space transportation system payload interface verification

    NASA Technical Reports Server (NTRS)

    Everline, R. T.

    1977-01-01

    The paper considers STS payload-interface verification requirements and the capability provided by STS to support verification. The intent is to standardize as many interfaces as possible, not only through the design, development, test and evaluation (DDT and E) phase of the major payload carriers but also into the operational phase. The verification process is discussed in terms of its various elements, such as the Space Shuttle DDT and E (including the orbital flight test program) and the major payload carriers DDT and E (including the first flights). Five tools derived from the Space Shuttle DDT and E are available to support the verification process: mathematical (structural and thermal) models, the Shuttle Avionics Integration Laboratory, the Shuttle Manipulator Development Facility, and interface-verification equipment (cargo-integration test equipment).

  1. How do we know? An assessment of integrated community case management data quality in four districts of Malawi.

    PubMed

    Yourkavitch, Jennifer; Zalisk, Kirsten; Prosnitz, Debra; Luhanga, Misheck; Nsona, Humphreys

    2016-11-01

    The World Health Organization contracted annual data quality assessments of Rapid Access Expansion (RAcE) projects to review integrated community case management (iCCM) data quality and the monitoring and evaluation (M&E) system for iCCM, and to suggest ways to improve data quality. The first RAcE data quality assessment was conducted in Malawi in January 2014 and we present findings pertaining to data from the health management information system at the community, facility and other sub-national levels because RAcE grantees rely on that for most of their monitoring data. We randomly selected 10 health facilities (10% of eligible facilities) from the four RAcE project districts, and collected quantitative data with an adapted and comprehensive tool that included an assessment of Malawi's M&E system for iCCM data and a data verification exercise that traced selected indicators through the reporting system. We rated the iCCM M&E system across five function areas based on interviews and observations, and calculated verification ratios for each data reporting level. We also conducted key informant interviews with Health Surveillance Assistants and facility, district and central Ministry of Health staff. Scores show a high-functioning M&E system for iCCM with some deficiencies in data management processes. The system lacks quality controls, including data entry verification, a protocol for addressing errors, and written procedures for data collection, entry, analysis and management. Data availability was generally high except for supervision data. The data verification process identified gaps in completeness and consistency, particularly in Health Surveillance Assistants' record keeping. Staff at all levels would like more training in data management. This data quality assessment illuminates where an otherwise strong M&E system for iCCM fails to ensure some aspects of data quality. Prioritizing data management with documented protocols, additional training and approaches to create efficient supervision practices may improve iCCM data quality. © The Author 2016. Published by Oxford University Press in association with The London School of Hygiene and Tropical Medicine. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.
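
    The verification ratios mentioned above are conventionally computed, at each reporting level, as the value recounted from source documents divided by the value reported upstream, so ratios below 1 indicate over-reporting and ratios above 1 under-reporting (a standard data-quality-audit convention; the assessment's exact formula is an assumption here):

        VR = \frac{\text{recounted from source documents}}{\text{reported at the next level}}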

  2. Image Hashes as Templates for Verification

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Janik, Tadeusz; Jarman, Kenneth D.; Robinson, Sean M.

    2012-07-17

    Imaging systems can provide measurements that confidently assess characteristics of nuclear weapons and dismantled weapon components, and such assessment will be needed in future verification for arms control. Yet imaging is often viewed as too intrusive, raising concern about the ability to protect sensitive information. In particular, the prospect of using image-based templates for verifying the presence or absence of a warhead, or the declared configuration of fissile material in storage, may be rejected out of hand as too vulnerable to violation of information barrier (IB) principles. Development of a rigorous approach for generating and comparing reduced-information templates from images, and for assessing the security, sensitivity, and robustness of verification using such templates, is needed to address these concerns. We discuss our efforts to develop such a rigorous approach based on a combination of image-feature extraction and encryption, utilizing hash functions to confirm proffered declarations while providing strong classified-data security and maintaining high confidence for verification. The proposed work is focused on developing secure, robust, tamper-sensitive, and automatic techniques that may enable the comparison of non-sensitive hashed image data outside an IB. It is rooted in research on so-called perceptual hash functions for image comparison, at the interface of signal/image processing, pattern recognition, cryptography, and information theory. Such perceptual or robust image hashing (which, strictly speaking, is not truly cryptographic hashing) has extensive application in content authentication and information retrieval, database search, and security assurance. Applying and extending the principles of perceptual hashing to imaging for arms control, we propose techniques that are sensitive to altering, forging, and tampering of the imaged object yet robust and tolerant to content-preserving image distortions and noise. Ensuring that the information contained in the hashed image data (available outside the IB) cannot be used to extract sensitive information about the imaged object is of primary concern; thus the techniques are characterized by high unpredictability to guarantee security. We will present an assessment of the performance of our techniques with respect to security, sensitivity, and robustness on the basis of a methodical and mathematically precise framework.
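
    A minimal sketch of the perceptual ("average") hash family the abstract builds on, for illustration only (the paper's own techniques are more elaborate and tamper-sensitive): an image is reduced to a tiny grayscale grid, thresholded at its mean to form a bit string, and two images are compared by the Hamming distance between their hashes.

        import numpy as np

        def average_hash(image, grid=8):
            """Reduce a 2-D grayscale array to a grid x grid bit string (aHash)."""
            h, w = image.shape
            # Block-average down to grid x grid (assumes h, w divisible by grid).
            small = image[:h - h % grid, :w - w % grid].reshape(
                grid, h // grid, grid, w // grid).mean(axis=(1, 3))
            return (small > small.mean()).astype(np.uint8).ravel()

        def hamming(hash_a, hash_b):
            return int(np.count_nonzero(hash_a != hash_b))   # 0 = identical templates

        rng = np.random.default_rng(3)
        img = rng.uniform(0, 255, size=(64, 64))
        noisy = img + rng.normal(0, 5, size=(64, 64))        # content-preserving noise
        print(hamming(average_hash(img), average_hash(noisy)))  # small distance expected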

  3. Estimation of Ecosystem Parameters of the Community Land Model with DREAM: Evaluation of the Potential for Upscaling Net Ecosystem Exchange

    NASA Astrophysics Data System (ADS)

    Hendricks Franssen, H. J.; Post, H.; Vrugt, J. A.; Fox, A. M.; Baatz, R.; Kumbhar, P.; Vereecken, H.

    2015-12-01

    Estimation of net ecosystem exchange (NEE) by land surface models is strongly affected by uncertain ecosystem parameters and initial conditions. A possible approach is the estimation of plant functional type (PFT) specific parameters at sites with measurement data such as NEE, and application of those parameters at other sites with the same PFT and no measurements. This upscaling strategy was evaluated in this work for sites in Germany and France. Ecosystem parameters and initial conditions were estimated with NEE time series of one year, or of only one season. The DREAM(zs) algorithm was used for the estimation of parameters and initial conditions; DREAM(zs) is not limited to Gaussian distributions and can condition on large time series of measurement data simultaneously. DREAM(zs) was used in combination with the Community Land Model (CLM) v4.5. Parameter estimates were evaluated by model predictions at the same site for an independent verification period. In addition, the parameter estimates were evaluated at other, independent sites situated >500 km away with the same PFT. The main conclusions are: i) simulations with estimated parameters better reproduced the NEE measurement data in the verification periods, including the annual NEE sum (23% improvement), the annual NEE cycle, and the average diurnal NEE course (error reduction by a factor of 1.6); ii) estimated parameters based on seasonal NEE data outperformed estimated parameters based on yearly data; iii) those seasonal parameters were also often significantly different from their yearly equivalents; iv) estimated parameters were significantly different if initial conditions were estimated together with the parameters. We conclude that estimated PFT-specific parameters improve land surface model predictions significantly at independent verification sites and for independent verification periods, so that their potential for upscaling is demonstrated. However, simulation results also indicate that the estimated parameters possibly mask other model errors, which would imply that their application at climatic time scales would not improve model predictions. A central question is whether the integration of many different data streams (e.g., biomass, remotely sensed LAI) could solve the problems indicated here.

  4. Verification and intercomparison of mesoscale ensemble prediction systems in the Beijing 2008 Olympics Research and Development Project

    NASA Astrophysics Data System (ADS)

    Kunii, Masaru; Saito, Kazuo; Seko, Hiromu; Hara, Masahiro; Hara, Tabito; Yamaguchi, Munehiko; Gong, Jiandong; Charron, Martin; Du, Jun; Wang, Yong; Chen, Dehui

    2011-05-01

    During the period around the Beijing 2008 Olympic Games, the Beijing 2008 Olympics Research and Development Project (B08RDP) was conducted as part of the World Weather Research Programme short-range weather forecasting research project. Mesoscale ensemble prediction (MEP) experiments were carried out by six organizations in near-real time, in order to share their experiences in the development of MEP systems. The purpose of this study is to objectively verify these experiments and to clarify the problems associated with current MEP systems through a common verification exercise. Verification was performed using the MEP outputs interpolated onto a common verification domain with a horizontal resolution of 15 km. For all systems, the ensemble spread grew as the forecast time increased, and the ensemble mean improved on the forecast errors of the individual control forecasts in the verification against the analysis fields. However, each system exhibited individual characteristics according to the MEP method. Some participants used physical perturbation methods, and the significance of these methods was confirmed by the verification. However, the mean error (ME) of the ensemble forecast in some systems was worse than that of the individual control forecast. This result suggests that it is necessary to pay careful attention to physical perturbations.
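
    The verification described here reduces, at each grid point, to comparing the error of the ensemble mean against the control forecast and tracking the growth of ensemble spread. A minimal sketch with synthetic data follows; all numbers are assumed and no real MEP output is used.

    ```python
    import numpy as np

    rng = np.random.default_rng(2)

    # Synthetic setup: truth, a control forecast, and a 10-member ensemble,
    # all on a common verification grid (flattened to 1-D points here).
    truth = rng.normal(0.0, 1.0, 500)
    control = truth + rng.normal(0.3, 1.0, 500)        # biased control run
    members = truth + rng.normal(0.0, 1.2, (10, 500))  # perturbed members

    ens_mean = members.mean(axis=0)
    spread = members.std(axis=0, ddof=1).mean()        # average ensemble spread

    def rmse(fcst, obs):
        return float(np.sqrt(np.mean((fcst - obs) ** 2)))

    def mean_error(fcst, obs):
        return float(np.mean(fcst - obs))              # ME (bias)

    print("control  RMSE:", rmse(control, truth), " ME:", mean_error(control, truth))
    print("ens-mean RMSE:", rmse(ens_mean, truth), " ME:", mean_error(ens_mean, truth))
    print("mean spread:", spread)
    ```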

  5. Evaluation of the 29-km Eta Model for Weather Support to the United States Space Program

    NASA Technical Reports Server (NTRS)

    Manobianco, John; Nutter, Paul

    1997-01-01

    The Applied Meteorology Unit (AMU) conducted a year-long evaluation of NCEP's 29-km mesoscale Eta (meso-eta) weather prediction model in order to identify added value to forecast operations in support of the United States space program. The evaluation was stratified over warm and cool seasons and considered both objective and subjective verification methodologies. Objective verification results generally indicate that meso-eta model point forecasts at selected stations exhibit minimal error growth in terms of RMS errors and are reasonably unbiased. Conversely, results from the subjective verification demonstrate that model forecasts of developing weather events such as thunderstorms, sea breezes, and cold fronts are not always as accurate as implied by the seasonal error statistics. Sea-breeze case studies reveal that the model generates a dynamically consistent, thermally direct circulation over the Florida peninsula, although at a larger scale than observed. Thunderstorm verification reveals that the meso-eta model is capable of predicting areas of organized convection, particularly during the late afternoon hours, but is not capable of forecasting individual thunderstorms. Verification of cold fronts during the cool season reveals that the model forecasts a majority of cold frontal passages through east central Florida to within ±1 h of the observed frontal passage.

  6. Resource Theory of Quantum Memories and Their Faithful Verification with Minimal Assumptions

    NASA Astrophysics Data System (ADS)

    Rosset, Denis; Buscemi, Francesco; Liang, Yeong-Cherng

    2018-04-01

    We provide a complete set of game-theoretic conditions equivalent to the existence of a transformation from one quantum channel into another one, by means of classically correlated preprocessing and postprocessing maps only. Such conditions naturally induce tests to certify that a quantum memory is capable of storing quantum information, as opposed to memories that can be simulated by measurement and state preparation (corresponding to entanglement-breaking channels). These results are formulated as a resource theory of genuine quantum memories (correlated in time), mirroring the resource theory of entanglement in quantum states (correlated spatially). As the set of conditions is complete, the corresponding tests are faithful, in the sense that any non-entanglement-breaking channel can be certified. Moreover, they only require the assumption of trusted inputs, known to be unavoidable for quantum channel verification. As such, the tests we propose are intrinsically different from the usual process tomography, for which the probes of both the input and the output of the channel must be trusted. An explicit construction is provided and shown to be experimentally realizable, even in the presence of arbitrarily strong losses in the memory or detectors.
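
    For reference, the "measurement followed by state preparation" memories that these tests rule out are exactly the entanglement-breaking channels. In the standard characterization (due to Horodecki, Shor and Ruskai; standard notation, not restated from this paper):

    ```latex
    \[
      \mathcal{N}(\rho) \;=\; \sum_{k} \operatorname{Tr}\!\left[M_k\,\rho\right]\sigma_k,
      \qquad M_k \ge 0, \quad \sum_k M_k = \mathbb{1},
    \]
    ```

    where $\{M_k\}$ is a POVM and each $\sigma_k$ is a fixed prepared state. A memory is certified as genuinely quantum precisely when its channel admits no decomposition of this form.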

  7. SU-D-BRC-03: Development and Validation of an Online 2D Dose Verification System for Daily Patient Plan Delivery Accuracy Check

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zhao, J; Hu, W; Xing, Y

    Purpose: All plan verification systems for particle therapy are designed to do plan verification before treatment. However, the actual dose distributions during patient treatment are not known. This study develops an online 2D dose verification tool to check the daily dose delivery accuracy. Methods: A Siemens particle treatment system with a modulated scanning spot beam is used in our center. In order to do online dose verification, we wrote a program to reconstruct the delivered 2D dose distributions based on the daily treatment log files and depth dose distributions. From the log files we can get the focus size, position, and particle number for each spot. A gamma analysis is used to compare the reconstructed dose distributions with the dose distributions from the TPS to assess the daily dose delivery accuracy. To verify the dose reconstruction algorithm, we compared the reconstructed dose distributions to dose distributions measured using a PTW 729XDR ion chamber matrix for 13 real patient plans. Then we analyzed 100 treatment beams (58 carbon and 42 proton) for prostate, lung, ACC, NPC and chordoma patients. Results: For algorithm verification, the gamma passing rate was 97.95% for the 3%/3mm and 92.36% for the 2%/2mm criteria. For patient treatment analysis, the results were 97.7%±1.1% and 91.7%±2.5% for carbon and 89.9%±4.8% and 79.7%±7.7% for proton using the 3%/3mm and 2%/2mm criteria, respectively. The reason for the lower passing rate for the proton beam is that the focus size deviations were larger than for the carbon beam. The average focus size deviations were −14.27% and −6.73% for proton and −5.26% and −0.93% for carbon in the x and y directions, respectively. Conclusion: The verification software meets our requirements to check for daily dose delivery discrepancies. Such tools can enhance the current treatment plan and delivery verification processes and improve the safety of clinical treatments.
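
    The gamma analysis referred to throughout combines a dose-difference tolerance with a distance-to-agreement tolerance. Below is a brute-force 2-D sketch of a global gamma pass rate (e.g., 3%/3 mm) on synthetic dose grids; a clinical implementation would add interpolation and faster spatial search, and nothing here reproduces the study's software.

    ```python
    import numpy as np

    def gamma_pass_rate(ref, meas, spacing_mm, dose_tol=0.03, dist_tol_mm=3.0):
        """Brute-force global 2-D gamma analysis.

        ref, meas: dose arrays on the same grid; spacing_mm: pixel pitch.
        Returns the fraction of reference points above a 10% dose threshold
        with gamma <= 1.
        """
        norm = ref.max()
        ys, xs = np.indices(ref.shape)
        pts = np.stack([ys.ravel(), xs.ravel()], axis=1) * spacing_mm
        dose = meas.ravel()
        passing = total = 0
        for (i, j), d_ref in np.ndenumerate(ref):
            if d_ref < 0.10 * norm:                    # skip low-dose region
                continue
            total += 1
            dist2 = np.sum((pts - np.array([i, j]) * spacing_mm) ** 2, axis=1)
            ddose2 = ((dose - d_ref) / (dose_tol * norm)) ** 2
            if np.min(dist2 / dist_tol_mm ** 2 + ddose2) <= 1.0:
                passing += 1
        return passing / total

    rng = np.random.default_rng(3)
    planned = np.exp(-((np.indices((40, 40)) - 20) ** 2).sum(axis=0) / 100.0)
    delivered = planned * (1 + rng.normal(0, 0.01, planned.shape))
    print(gamma_pass_rate(planned, delivered, spacing_mm=2.0))
    ```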

  8. Improvement on post-OPC verification efficiency for contact/via coverage check by final CD biasing of metal lines and considering their location on the metal layout

    NASA Astrophysics Data System (ADS)

    Kim, Youngmi; Choi, Jae-Young; Choi, Kwangseon; Choi, Jung-Hoe; Lee, Sooryong

    2011-04-01

    As IC design complexity keeps increasing, it is more and more difficult to ensure pattern transfer after optical proximity correction (OPC) due to the continuous reduction of layout dimensions and the lithographic limits of the k1 factor. To guarantee imaging fidelity, resolution enhancement technologies (RET) such as off-axis illumination (OAI), different types of phase-shift masks, and OPC techniques have been developed. For model-based OPC, post-OPC verification solutions that cross-check the contour image against the target layout continue to be developed: methods for generating contours and matching them to target structures, and methods for filtering and sorting patterns to eliminate false errors and duplicates. Detecting only real errors while excluding false errors is the most important requirement for an accurate and fast verification process, saving not only review time and engineering resources but also overall wafer process time. In the typical post-OPC verification of metal-contact/via coverage (CC), the verification tool reports a huge number of errors due to borderless design, so it is too difficult to review and correct all of them. This can lead OPC engineers to miss real defects and, at the least, delay time to market. In this paper, we studied a method for increasing the efficiency of post-OPC verification, especially for CC checks. For metal layers, the final CD after the etch process shows various CD biases depending on the distance to neighboring patterns, so it is more reasonable to consider the final metal shape when confirming contact/via coverage. Through optimization of the biasing rule for different pitches and shapes of metal lines, we obtained more accurate and efficient verification results and decreased the review time needed to find real errors. We present this approach for increasing the efficiency of the OPC verification process by applying a simple biasing rule to the metal layout instead of an etch model.

  9. Evaluation of the 29-km Eta Model. Part 1; Objective Verification at Three Selected Stations

    NASA Technical Reports Server (NTRS)

    Nutter, Paul A.; Manobianco, John; Merceret, Francis J. (Technical Monitor)

    1998-01-01

    This paper describes an objective verification of the National Centers for Environmental Prediction (NCEP) 29-km eta model from May 1996 through January 1998. The evaluation was designed to assess the model's surface and upper-air point forecast accuracy at three selected locations during separate warm (May - August) and cool (October - January) season periods. In order to enhance sample sizes available for statistical calculations, the objective verification includes two consecutive warm and cool season periods. Systematic model deficiencies comprise the larger portion of the total error in most of the surface forecast variables that were evaluated. The error characteristics for both surface and upper-air forecasts vary widely by parameter, season, and station location. At upper levels, a few characteristic biases are identified. Overall however, the upper-level errors are more nonsystematic in nature and could be explained partly by observational measurement uncertainty. With a few exceptions, the upper-air results also indicate that 24-h model error growth is not statistically significant. In February and August 1997, NCEP implemented upgrades to the eta model's physical parameterizations that were designed to change some of the model's error characteristics near the surface. The results shown in this paper indicate that these upgrades led to identifiable and statistically significant changes in forecast accuracy for selected surface parameters. While some of the changes were expected, others were not consistent with the intent of the model updates and further emphasize the need for ongoing sensitivity studies and localized statistical verification efforts. Objective verification of point forecasts is a stringent measure of model performance, but when used alone, is not enough to quantify the overall value that model guidance may add to the forecast process. Therefore, results from a subjective verification of the meso-eta model over the Florida peninsula are discussed in the companion paper by Manobianco and Nutter. Overall verification results presented here and in part two should establish a reasonable benchmark from which model users and developers may pursue the ongoing eta model verification strategies in the future.
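
    The distinction drawn here between systematic and nonsystematic error follows from the identity MSE = bias² + var(error): removing a constant bias leaves the nonsystematic component. A small sketch of that decomposition on synthetic point forecasts (all values assumed):

    ```python
    import numpy as np

    def decompose_error(forecast, observed):
        """Split total RMS error into systematic (bias) and nonsystematic parts."""
        err = forecast - observed
        bias = err.mean()
        rmse = np.sqrt((err ** 2).mean())
        nonsystematic = err.std(ddof=0)   # error spread after removing the bias
        return bias, rmse, nonsystematic

    rng = np.random.default_rng(4)
    obs = rng.normal(15.0, 3.0, 1000)                  # e.g., 2-m temperature (deg C)
    fcst = obs + 1.2 + rng.normal(0, 1.5, obs.size)    # constant bias + random error
    bias, rmse, nonsys = decompose_error(fcst, obs)
    print(f"bias={bias:.2f}  rmse={rmse:.2f}  nonsystematic={nonsys:.2f}")
    assert abs(rmse**2 - (bias**2 + nonsys**2)) < 1e-9  # identity holds exactly
    ```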

  10. A novel rail defect detection method based on undecimated lifting wavelet packet transform and Shannon entropy-improved adaptive line enhancer

    NASA Astrophysics Data System (ADS)

    Hao, Qiushi; Zhang, Xin; Wang, Yan; Shen, Yi; Makis, Viliam

    2018-07-01

    Acoustic emission (AE) technology is sensitive to subliminal rail defects; however, strong wheel-rail rolling-contact noise under high-speed conditions has severely impeded the detection of rail defects using traditional denoising methods. In this context, the paper develops an adaptive detection method for rail cracks, which combines multiresolution analysis with an improved adaptive line enhancer (ALE). To obtain detailed multiresolution information on transient crack signals at low computational cost, a lifting scheme-based undecimated wavelet packet transform is adopted. To capture the impulsive character of crack signals, a Shannon entropy-improved ALE is proposed as a signal-enhancement approach, where Shannon entropy is introduced to improve the cost function. A rail defect detection plan based on the proposed method for high-speed conditions is then put forward. Theoretical analysis and experimental verification demonstrate that the proposed method has superior performance in enhancing the rail defect AE signal and reducing the strong background noise, offering an effective multiresolution approach for rail defect detection under high-speed, strong-noise conditions.
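
    For orientation, a classical LMS-based adaptive line enhancer is sketched below: the delayed input decorrelates broadband noise while correlated components remain predictable, so the filter output enhances them. The paper's actual contribution, a Shannon entropy-improved cost function, is not reproduced here; the signal and every parameter value are synthetic assumptions.

    ```python
    import numpy as np

    def adaptive_line_enhancer(x, delay=10, taps=32, mu=1e-3):
        """Classical LMS adaptive line enhancer (ALE).

        Predicts x[n] from samples delayed by `delay`; the delay decorrelates
        broadband noise, so only predictable components survive in the output.
        """
        w = np.zeros(taps)
        y = np.zeros_like(x)
        for n in range(delay + taps, len(x)):
            u = x[n - delay - taps:n - delay][::-1]   # delayed tap vector
            y[n] = w @ u                              # enhanced output
            e = x[n] - y[n]                           # prediction error
            w += mu * e * u                           # LMS weight update
        return y

    rng = np.random.default_rng(5)
    n = np.arange(20000)
    signal = 0.5 * np.sin(2 * np.pi * 0.01 * n)       # correlated component
    noise = rng.normal(0, 1.0, n.size)                # strong broadband noise
    enhanced = adaptive_line_enhancer(signal + noise)
    # After convergence the output tracks the correlated component far more
    # closely than the raw noisy input does.
    print(np.std(enhanced[-5000:] - signal[-5000:]), np.std(noise))
    ```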

  11. Accuracy of self-reported smoking abstinence in clinical trials of hospital-initiated smoking interventions.

    PubMed

    Scheuermann, Taneisha S; Richter, Kimber P; Rigotti, Nancy A; Cummins, Sharon E; Harrington, Kathleen F; Sherman, Scott E; Zhu, Shu-Hong; Tindle, Hilary A; Preacher, Kristopher J

    2017-12-01

    To estimate the prevalence and predictors of failed biochemical verification of self-reported abstinence among participants enrolled in trials of hospital-initiated smoking cessation interventions. Comparison of characteristics between participants who verified and those who failed to verify self-reported abstinence. Multi-site randomized clinical trials conducted between 2010 and 2014 in hospitals throughout the United States. Recently hospitalized smokers who reported tobacco abstinence 6 months post-randomization and provided a saliva sample for verification purposes (n = 822). Outcomes were salivary cotinine-verified smoking abstinence at 10 and 15 ng/ml cut-points. Predictors and correlates included participant demographics and tobacco use; hospital diagnoses and treatment; and study characteristics collected via surveys and electronic medical records. Usable samples were returned by 69.8% of the 1178 eligible trial participants who reported 7-day point prevalence abstinence. The proportion of participants verified as quit was 57.8% [95% confidence interval (CI) = 54.4, 61.2; 10 ng/ml cut-off] or 60.6% (95% CI = 57.2, 63.9; 15 ng/ml). Factors associated independently with verification at 10 ng/ml were education beyond high school education [odds ratio (OR) = 1.51; 95% CI = 1.07, 2.11], continuous abstinence since hospitalization (OR = 2.82; 95% CI = 2.02, 3.94), mailed versus in-person sample (OR = 3.20; 95% CI = 1.96, 5.21) and race. African American participants were less likely to verify abstinence than white participants (OR = 0.64; 95% CI = 0.44, 0.93). Findings were similar for verification at 15 ng/ml. Verification rates did not differ by treatment group. In the United States, high rates (40%) of recently hospitalized smokers enrolled in smoking cessation trials fail biochemical verification of their self-reported abstinence. © 2017 Society for the Study of Addiction.
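
    Analyses of this kind typically fit a logistic regression and report exponentiated coefficients as odds ratios. The sketch below fabricates data whose generating coefficients roughly echo the reported ORs (1.51, 2.82, 3.20) purely to show the mechanics; it is not the study's dataset or model specification.

    ```python
    import numpy as np
    import statsmodels.api as sm

    rng = np.random.default_rng(6)
    n = 800

    # Synthetic analogue of the verification analysis (all data fabricated):
    educ = rng.integers(0, 2, n)           # education beyond high school
    contin = rng.integers(0, 2, n)         # continuously abstinent
    mailed = rng.integers(0, 2, n)         # mailed vs in-person sample
    logit = -0.5 + 0.41 * educ + 1.04 * contin + 1.16 * mailed
    verified = rng.random(n) < 1 / (1 + np.exp(-logit))

    X = sm.add_constant(np.column_stack([educ, contin, mailed]).astype(float))
    fit = sm.Logit(verified.astype(float), X).fit(disp=0)
    odds_ratios = np.exp(fit.params)       # e.g., exp(1.04) ~ 2.8 for abstinence
    print(dict(zip(["const", "educ", "contin", "mailed"], odds_ratios.round(2))))
    ```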

  12. Gyrokinetic Particle Simulation of Turbulent Transport in Burning Plasmas (GPS - TTBP) Final Report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chame, Jacqueline

    2011-05-27

    The goal of this project is the development of the Gyrokinetic Toroidal Code (GTC) Framework and its application to problems related to the physics of turbulence and turbulent transport in tokamaks. The project involves physics studies, code development, noise-effect mitigation, supporting computer science efforts, diagnostics and advanced visualizations, and verification and validation. Its main scientific themes are mesoscale dynamics and non-locality effects on transport, the physics of secondary structures such as zonal flows, and strongly coherent wave-particle interaction phenomena at magnetic precession resonances. Special emphasis is placed on the implications of these themes for rho-star and current scalings and for the turbulent transport of momentum. GTC-TTBP also explores applications to electron thermal transport, particle transport, ITB formation, and cross-cuts such as edge-core coupling, interaction of energetic particles with turbulence, and neoclassical tearing mode trigger dynamics. Code development focuses on major initiatives in the development of full-f formulations and the capacity to simulate flux-driven transport. In addition to the full-f formulation, the project includes the development of numerical collision models and methods for coarse-graining in phase space. Verification is pursued by linear stability comparisons with the FULL and HD7 codes and by benchmarking with the GKV, GYSELA and other gyrokinetic simulation codes. Validation of gyrokinetic models of ion and electron thermal transport is pursued by systematic stressing comparisons with fluctuation and transport data from the DIII-D and NSTX tokamaks. The physics and code development research programs are supported by complementary efforts in computer science, high-performance computing, and data management.

  13. Evolution of Galaxy Luminosity and Stellar-Mass Functions since $z=1$ with the Dark Energy Survey Science Verification Data

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Capozzi, D.; et al.

    We present the first study of the evolution of the galaxy luminosity and stellar-mass functions (GLF and GSMF) carried out by the Dark Energy Survey (DES). We describe the COMMODORE galaxy catalogue selected from Science Verification images. This catalogue is made of $\sim 4\times 10^{6}$ galaxies at $0 < z$ …

  14. Face Verification across Age Progression using Discriminative Methods

    DTIC Science & Technology

    2008-01-01

    progression. The most related study to our work is [30], where the probabilistic eigenspace framework [22] is adapted for face identification across...solution has the same CAR and CRR, is frequently used to measure verification performance, B. Gradient Orientation and Gradient Orientation Pyramid Now we...proposed GOP representation. The other five approaches are different from our method in both representations and classification frameworks. For

  15. Identification of squid species by melting temperature shifts on fluorescence melting curve analysis (FMCA) using single dual-labeled probe

    NASA Astrophysics Data System (ADS)

    Koh, Eunjung; Song, Ha Jeong; Kwon, Na Young; Kim, Gi Won; Lee, Kwang Ho; Jo, Soyeon; Park, Sujin; Park, Jihyun; Park, Eun Kyeong; Hwang, Seung Yong

    2017-06-01

    Real-time PCR is a standard method for identification of species. One limitation of qPCR is that false-positive results can occur due to mismatched hybridization between the target sequence and probe, depending on the annealing temperature of the PCR conditions. As an alternative, fluorescence melting curve analysis (FMCA) can be applied for species identification. FMCA is based on a dual-labeled probe: even with a subtle difference in target sequence, there is a visible melting temperature (Tm) shift. One application of FMCA is distinguishing organisms that are distributed and consumed globally as popular food ingredients. Their prices are set by species or country of origin; however, counterfeiting, and distribution without any verification procedure, are becoming social problems and threatening food safety. Moreover, distinguishing these species with the naked eye is very difficult and almost impossible in any processed form, so it is necessary to identify species at the molecular level. In this research, three species of squid, which differ from one another by 1-2 base pairs, were selected as samples because they present exactly this issue. We designed a probe that perfectly matches one species while the others mismatch by 2 and 1 base pairs, respectively, and labeled it with a fluorophore and quencher. In an experiment with this single probe, we successfully distinguished the species by Tm shifts that depend on the base-pair differences. By combining FMCA and a qPCR chip, a smaller-scale assay with higher sensitivity and resolution could be possible and, furthermore, enabling result analysis with a smartphone would realize point-of-care testing (POCT).

  16. Comments for A Conference on Verification in the 21st Century

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Doyle, James E.

    2012-06-12

    The author offers 5 points for the discussion of Verification and Technology: (1) Experience with the implementation of arms limitation and arms reduction agreements confirms that technology alone has never been relied upon to provide effective verification. (2) The historical practice of verification of arms control treaties between Cold War rivals may constrain the cooperative and innovative use of technology for transparency, verification and confidence building in the future. (3) An area that has been identified by many, including the US State Department and NNSA, as being rich for exploration for potential uses of technology for transparency and verification is information and communications technology (ICT). This includes social media, crowd-sourcing, the internet of things, and the concept of societal verification, but there are issues. (4) On the issue of the extent to which verification technologies are keeping pace with the demands of future protocols and agreements, I think the more direct question is "are they effective in supporting the objectives of the treaty or agreement?" In this regard it is important to acknowledge that there is a verification grand challenge at our doorstep: how does one verify limitations on nuclear warheads in national stockpiles? (5) Finally, while recognizing the daunting political and security challenges of such an approach, multilateral engagement and cooperation at the conceptual and technical levels provides benefits for addressing future verification challenges.

  17. ENVIRONMENTAL TECHNOLOGY VERIFICATION: JOINT (NSF-EPA) VERIFICATION STATEMENT AND REPORT FOR TREATMENT OF WASTEWATER GENERATED DURING DECONTAMINATION ACTIVITIES - ULTRASTRIP SYSTEMS, INC., MOBILE EMERGENCY FILTRATION SYSTEM (MEFS) - 04/14/WQPC-HS

    EPA Science Inventory

    Performance verification testing of the UltraStrip Systems, Inc., Mobile Emergency Filtration System (MEFS) was conducted under EPA's Environmental Technology Verification (ETV) Program at the EPA Test and Evaluation (T&E) Facility in Cincinnati, Ohio, during November, 2003, thr...

  18. Model Based Verification of Cyber Range Event Environments

    DTIC Science & Technology

    2015-12-10

    Model Based Verification of Cyber Range Event Environments Suresh K. Damodaran MIT Lincoln Laboratory 244 Wood St., Lexington, MA, USA...apply model based verification to cyber range event environment configurations, allowing for the early detection of errors in event environment...Environment Representation (CCER) ontology. We also provide an overview of a methodology to specify verification rules and the corresponding error

  19. Process Document, Joint Verification Protocol, and Joint Test Plan for Verification of HACH-LANGE GmbH LUMIStox 300 Bench Top Luminometer and ECLOX Handheld Luminometer for Luminescent Bacteria Test for use in Wastewater

    EPA Science Inventory

    The Danish Environmental Technology Verification program (DANETV) Water Test Centre operated by DHI, is supported by the Danish Ministry for Science, Technology and Innovation. DANETV, the United States Environmental Protection Agency Environmental Technology Verification Progra...

  20. Certification and verification for Northrup Model NSC-01-0732 Fresnel lens concentrating solar collector

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Not Available

    1979-03-01

    The certification and verification of the Northrup Model NSC-01-0732 Fresnel lens tracking solar collector are presented. A certification statement is included with signatures and a separate report on the structural analysis of the collector system. System verification against the Interim Performance Criteria are indicated by matrices with verification discussion, analysis, and enclosed test results.

  1. Simple method to verify OPC data based on exposure condition

    NASA Astrophysics Data System (ADS)

    Moon, James; Ahn, Young-Bae; Oh, Sey-Young; Nam, Byung-Ho; Yim, Dong Gyu

    2006-03-01

    In a world where sub-100 nm lithography tools are everyday equipment for device makers, devices are shrinking at a rate no one ever imagined. With the device shrinking at such a high rate, the demands placed on optical proximity correction (OPC) are like never before. To keep up with the shrink rate of the device, more aggressive OPC tactics are employed. Aggressive OPC tactics are a must for sub-100 nm lithography, but they eventually leave greater room for OPC error and increase the complexity of the OPC data. Until now, optical rule check (ORC) or design rule check (DRC) was used to verify this complex OPC data, but each of these methods has its pros and cons. ORC verification of OPC data is rather accurate process-wise, but inspection of a full-chip device requires a lot of money (computers, software, ...) and patience (run time). DRC has no such disadvantage, but the accuracy of its verification is poor process-wise. In this study, we created a new method for OPC data verification that combines the best of both the ORC and DRC verification methods: a method that inspects the biasing of the OPC data with respect to the illumination condition of the process involved. This new verification method was applied to the 80 nm tech ISOLATION and GATE layers of a 512M DRAM device and showed accuracy equivalent to ORC inspection with the run time of DRC verification.

  2. The 2014 Sandia Verification and Validation Challenge: Problem statement

    DOE PAGES

    Hu, Kenneth; Orient, George

    2016-01-18

    This paper presents a case study in utilizing information from experiments, models, and verification and validation (V&V) to support a decision. It consists of a simple system with data and models provided, plus a safety requirement to assess. The goal is to pose a problem that is flexible enough to allow challengers to demonstrate a variety of approaches, but constrained enough to focus attention on a theme. This was accomplished by providing a good deal of background information in addition to the data, models, and code, but directing the participants' activities with specific deliverables. In this challenge, the theme is how to gather and present evidence about the quality of model predictions, in order to support a decision. This case study formed the basis of the 2014 Sandia V&V Challenge Workshop and this resulting special edition of the ASME Journal of Verification, Validation, and Uncertainty Quantification.

  3. A new technique for measuring listening and reading literacy in developing countries

    NASA Astrophysics Data System (ADS)

    Greene, Barbara A.; Royer, James M.; Anzalone, Stephen

    1990-03-01

    One problem in evaluating educational interventions in developing countries is the absence of tests that adequately reflect the culture and curriculum. The Sentence Verification Technique is a new procedure for measuring reading and listening comprehension that allows for the development of tests based on materials indigenous to a given culture. The validity of using the Sentence Verification Technique to measure reading comprehension in Grenada was evaluated in the present study. The study involved 786 students at standards 3, 4 and 5. The tests for each standard consisted of passages that varied in difficulty. The students identified as high-ability students in all three standards performed better than those identified as low-ability. All students performed better with easier passages. Additionally, students in higher standards performed better than students in lower standards on a given passage. These results supported the claim that the Sentence Verification Technique is a valid measure of reading comprehension in Grenada.

  4. Influenza forecasting with Google Flu Trends.

    PubMed

    Dugas, Andrea Freyer; Jalalpour, Mehdi; Gel, Yulia; Levin, Scott; Torcaso, Fred; Igusa, Takeru; Rothman, Richard E

    2013-01-01

    We developed a practical influenza forecast model based on real-time, geographically focused, and easy to access data, designed to provide individual medical centers with advanced warning of the expected number of influenza cases, thus allowing for sufficient time to implement interventions. Secondly, we evaluated the effects of incorporating a real-time influenza surveillance system, Google Flu Trends, and meteorological and temporal information on forecast accuracy. Forecast models designed to predict one week in advance were developed from weekly counts of confirmed influenza cases over seven seasons (2004-2011) divided into seven training and out-of-sample verification sets. Forecasting procedures using classical Box-Jenkins, generalized linear models (GLM), and generalized linear autoregressive moving average (GARMA) methods were employed to develop the final model and assess the relative contribution of external variables such as Google Flu Trends, meteorological data, and temporal information. A GARMA(3,0) forecast model with Negative Binomial distribution integrating Google Flu Trends information provided the most accurate influenza case predictions. The model, on average, predicts weekly influenza cases during 7 out-of-sample outbreaks within 7 cases for 83% of estimates. Google Flu Trends data were the only source of external information to provide statistically significant forecast improvements over the base model in four of the seven out-of-sample verification sets. Overall, the p-value of adding this external information to the model is 0.0005. The other exogenous variables did not yield a statistically significant improvement in any of the verification sets. Integer-valued autoregression of influenza cases provides a strong base forecast model, which is enhanced by the addition of Google Flu Trends, confirming the predictive capabilities of search query based syndromic surveillance. This accessible and flexible forecast model can be used by individual medical centers to provide advanced warning of future influenza cases.
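
    As a rough sketch of the modeling approach, the code below fits a negative binomial GLM with a lagged log-count term and one exogenous covariate standing in for Google Flu Trends. The study's actual GARMA(3,0) specification and data are not reproduced; every series and coefficient here is synthetic.

    ```python
    import numpy as np
    import statsmodels.api as sm

    rng = np.random.default_rng(7)

    # Synthetic weekly influenza counts with autoregressive structure and an
    # exogenous "search activity" covariate (all values invented).
    weeks = 300
    gft = np.abs(np.sin(2 * np.pi * np.arange(weeks) / 52)) * 2.0
    counts = np.zeros(weeks)
    for t in range(1, weeks):
        lam = np.exp(0.5 + 0.55 * np.log1p(counts[t - 1]) + 0.6 * gft[t])
        counts[t] = rng.poisson(lam)

    # GARMA-style mean structure approximated with a GLM on lagged log-counts.
    y = counts[1:]
    X = sm.add_constant(np.column_stack([np.log1p(counts[:-1]), gft[1:]]))
    model = sm.GLM(y, X, family=sm.families.NegativeBinomial(alpha=0.5)).fit()

    x_next = np.array([[1.0, np.log1p(counts[-1]), gft[-1]]])
    print("one-week-ahead forecast:", model.predict(x_next)[0])
    ```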

  5. Lightning Jump Algorithm Development for the GOES-R Geostationary Lightning Mapper

    NASA Technical Reports Server (NTRS)

    Schultz, E.; Schultz, C.; Chronis, T.; Stough, S.; Carey, L.; Calhoun, K.; Ortega, K.; Stano, G.; Cecil, D.; Bateman, M.; ...

    2014-01-01

    Current work on the lightning jump algorithm to be used in GOES-R Geostationary Lightning Mapper (GLM)'s data stream is multifaceted due to the intricate interplay between the storm tracking, GLM proxy data, and the performance of the lightning jump itself. This work outlines the progress of the last year, where analysis and performance of the lightning jump algorithm with automated storm tracking and GLM proxy data were assessed using over 700 storms from North Alabama. The cases analyzed coincide with previous semi-objective work performed using total lightning mapping array (LMA) measurements in Schultz et al. (2011). Analysis shows that key components of the algorithm (flash rate and sigma thresholds) have the greatest influence on the performance of the algorithm when validating using severe storm reports. Automated objective analysis using the GLM proxy data has shown probability of detection (POD) values around 60% with false alarm rates (FAR) around 73% using similar methodology to Schultz et al. (2011). However, when applying verification methods similar to those employed by the National Weather Service, POD values increase slightly (69%) and FAR values decrease (63%). The relationship between storm tracking and lightning jump has also been tested in a real-time framework at NSSL. This system includes fully automated tracking by radar alone, real-time LMA and radar observations and the lightning jump. Results indicate that the POD is strong at 65%. However, the FAR is significantly higher than in Schultz et al. (2011) (50-80% depending on various tracking/lightning jump parameters) when using storm reports for verification. Given known issues with Storm Data, the performance of the real-time jump algorithm is also being tested with high density radar and surface observations from the NSSL Severe Hazards Analysis & Verification Experiment (SHAVE).
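
    Two computational pieces recur in this work: the sigma-level jump test on the flash-rate time derivative (DFRDT) and categorical verification via POD and FAR. A minimal sketch with invented numbers follows; the 2-sigma form mirrors the published lightning jump framework, but the window length and example data are assumptions.

    ```python
    import numpy as np

    def lightning_jump(flash_rates, nsigma=2.0, window=6):
        """Flag a jump when the flash-rate change (DFRDT) exceeds the mean of
        the recent history by nsigma standard deviations."""
        dfrdt = np.diff(flash_rates)
        jumps = np.zeros(flash_rates.size, dtype=bool)
        for t in range(window, dfrdt.size):
            hist = dfrdt[t - window:t]
            if dfrdt[t] > hist.mean() + nsigma * hist.std(ddof=1):
                jumps[t + 1] = True
        return jumps

    rates = np.array([2, 3, 3, 4, 3, 4, 5, 4, 18, 25, 30, 28], dtype=float)
    print(np.nonzero(lightning_jump(rates))[0])   # detects the 4 -> 18 jump

    def pod_far(hits, misses, false_alarms):
        """Categorical verification from a contingency table of storm reports."""
        pod = hits / (hits + misses)
        far = false_alarms / (hits + false_alarms)
        return pod, far

    # Purely illustrative counts:
    print(pod_far(hits=65, misses=35, false_alarms=120))   # POD=0.65, FAR~0.65
    ```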

  6. Imputation of microsatellite alleles from dense SNP genotypes for parentage verification across multiple Bos taurus and Bos indicus breeds

    PubMed Central

    McClure, Matthew C.; Sonstegard, Tad S.; Wiggans, George R.; Van Eenennaam, Alison L.; Weber, Kristina L.; Penedo, Cecilia T.; Berry, Donagh P.; Flynn, John; Garcia, Jose F.; Carmo, Adriana S.; Regitano, Luciana C. A.; Albuquerque, Milla; Silva, Marcos V. G. B.; Machado, Marco A.; Coffey, Mike; Moore, Kirsty; Boscher, Marie-Yvonne; Genestout, Lucie; Mazza, Raffaele; Taylor, Jeremy F.; Schnabel, Robert D.; Simpson, Barry; Marques, Elisa; McEwan, John C.; Cromie, Andrew; Coutinho, Luiz L.; Kuehn, Larry A.; Keele, John W.; Piper, Emily K.; Cook, Jim; Williams, Robert; Van Tassell, Curtis P.

    2013-01-01

    To assist cattle producers in the transition from microsatellite (MS) to single nucleotide polymorphism (SNP) genotyping for parental verification, we previously devised an effective and inexpensive method to impute MS alleles from SNP haplotypes. While the reported method was verified with only a limited data set (N = 479) from Brown Swiss, Guernsey, Holstein, and Jersey cattle, some of the MS-SNP haplotype associations were concordant across these phylogenetically diverse breeds. This implied that some haplotypes predate modern breed formation and remain in strong linkage disequilibrium. To expand the utility of MS allele imputation across breeds, MS and SNP data from more than 8000 animals representing 39 breeds (Bos taurus and B. indicus) were used to predict 9410 SNP haplotypes, incorporating an average of 73 SNPs per haplotype, for which alleles from 12 MS markers could accurately be imputed. Approximately 25% of the MS-SNP haplotypes were present in multiple breeds (N = 2 to 36 breeds). These shared haplotypes allowed for MS imputation in breeds that were not represented in the reference population with only a small increase in Mendelian inheritance inconsistencies. Our reported reference haplotypes can be used for any cattle breed, and the reported methods can be applied to any species to aid the transition from MS to SNP genetic markers. While ~91% of the animals with imputed alleles for 12 MS markers had ≤1 Mendelian inheritance conflicts with their parents' reported MS genotypes, this figure was 96% for our reference animals, indicating potential errors in the reported MS genotypes. The workflow we suggest autocorrects for genotyping errors and rare haplotypes, by MS genotyping animals whose imputed MS alleles fail parentage verification, and then incorporating those animals into the reference dataset. PMID:24065982
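
    Conceptually, the imputation reduces to a lookup from phased SNP haplotypes to their associated MS alleles, followed by a Mendelian consistency check. The toy sketch below uses invented haplotypes and allele sizes purely to show the workflow's shape.

    ```python
    # Reference table mapping each SNP haplotype (string of alleles) to the MS
    # allele observed with it; all entries are hypothetical.
    reference = {
        "ACGTA": 142,   # SNP haplotype -> MS allele (repeat length)
        "ACGTG": 146,
        "TCGTA": 150,
    }

    def impute_ms(haplotype_pair):
        """Impute the MS genotype from the animal's two phased haplotypes."""
        return tuple(sorted(reference[h] for h in haplotype_pair))

    def mendelian_conflict(offspring, sire, dam):
        # An offspring must share at least one allele with each parent.
        return not (set(offspring) & set(sire)) or not (set(offspring) & set(dam))

    calf = impute_ms(("ACGTA", "ACGTG"))   # (142, 146)
    sire = impute_ms(("ACGTA", "TCGTA"))   # (142, 150)
    dam = impute_ms(("ACGTG", "ACGTG"))    # (146, 146)
    print(calf, mendelian_conflict(calf, sire, dam))   # False -> parentage OK
    ```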

  7. An unattended verification station for UF6 cylinders: Field trial findings

    DOE PAGES

    Smith, L. E.; Miller, K. A.; McDonald, B. S.; ...

    2017-08-26

    In recent years, the International Atomic Energy Agency (IAEA) has pursued innovative techniques and an integrated suite of safeguards measures to address the verification challenges posed by the front end of the nuclear fuel cycle. Among the unattended instruments currently being explored by the IAEA is an Unattended Cylinder Verification Station (UCVS), which could provide automated, independent verification of the declared relative enrichment, 235U mass, total uranium mass, and identification for all declared uranium hexafluoride cylinders in a facility (e.g., uranium enrichment plants and fuel fabrication plants). Under the auspices of the United States and European Commission Support Programs to the IAEA, a project was undertaken to assess the technical and practical viability of the UCVS concept. The first phase of the UCVS viability study was centered on a long-term field trial of a prototype UCVS system at a fuel fabrication facility. A key outcome of the study was a quantitative performance evaluation of two nondestructive assay (NDA) methods being considered for inclusion in a UCVS: Hybrid Enrichment Verification Array (HEVA), and Passive Neutron Enrichment Meter (PNEM). This paper provides a description of the UCVS prototype design and an overview of the long-term field trial. In conclusion, analysis results and interpretation are presented with a focus on the performance of PNEM and HEVA for the assay of over 200 “typical” Type 30B cylinders, and the viability of an “NDA Fingerprint” concept as a high-fidelity means to periodically verify that material diversion has not occurred.

  8. An unattended verification station for UF6 cylinders: Field trial findings

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Smith, L. E.; Miller, K. A.; McDonald, B. S.

    In recent years, the International Atomic Energy Agency (IAEA) has pursued innovative techniques and an integrated suite of safeguards measures to address the verification challenges posed by the front end of the nuclear fuel cycle. Among the unattended instruments currently being explored by the IAEA is an Unattended Cylinder Verification Station (UCVS), which could provide automated, independent verification of the declared relative enrichment, 235U mass, total uranium mass, and identification for all declared uranium hexafluoride cylinders in a facility (e.g., uranium enrichment plants and fuel fabrication plants). Under the auspices of the United States and European Commission Support Programs to the IAEA, a project was undertaken to assess the technical and practical viability of the UCVS concept. The first phase of the UCVS viability study was centered on a long-term field trial of a prototype UCVS system at a fuel fabrication facility. A key outcome of the study was a quantitative performance evaluation of two nondestructive assay (NDA) methods being considered for inclusion in a UCVS: Hybrid Enrichment Verification Array (HEVA), and Passive Neutron Enrichment Meter (PNEM). This paper provides a description of the UCVS prototype design and an overview of the long-term field trial. In conclusion, analysis results and interpretation are presented with a focus on the performance of PNEM and HEVA for the assay of over 200 “typical” Type 30B cylinders, and the viability of an “NDA Fingerprint” concept as a high-fidelity means to periodically verify that material diversion has not occurred.

  9. Design and Realization of Controllable Ultrasonic Fault Detector Automatic Verification System

    NASA Astrophysics Data System (ADS)

    Sun, Jing-Feng; Liu, Hui-Ying; Guo, Hui-Juan; Shu, Rong; Wei, Kai-Li

    An ultrasonic flaw detector with a remote-control interface is studied and an automatic verification system for it is developed. By describing the protocol instruction set and the data-analysis method database in extensible markup language (XML), the system software achieves a configurable design and accommodates the diversity of undisclosed device interfaces and protocols. By cascading a signal generator with a fixed attenuator, a dynamic error compensation method is proposed that performs the role the fixed attenuator plays in traditional verification and improves the accuracy of the verification results. Operating results of the automatic verification system confirm the feasibility of the hardware and software architecture and the correctness of the analysis method, while replacing the cumbersome operations of the traditional verification process and reducing the labor intensity for test personnel.
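
    The XML-driven design described above can be pictured as a protocol file per instrument model that the software loads at run time, keeping the verification logic device-agnostic. A minimal sketch follows; the XML schema, device model name, and command strings are all hypothetical.

    ```python
    import xml.etree.ElementTree as ET

    # Hypothetical XML protocol description: each instrument model declares its
    # command strings, so the verification software stays device-agnostic.
    PROTOCOL_XML = """
    <protocols>
      <device model="FD-A">
        <command name="set_gain">GAIN {value:.1f}</command>
        <command name="read_amplitude">AMP?</command>
      </device>
    </protocols>
    """

    def build_instruction_set(xml_text, model):
        """Load the command table for one device model from the XML protocol."""
        root = ET.fromstring(xml_text)
        device = root.find(f"./device[@model='{model}']")
        return {c.get("name"): c.text for c in device.findall("command")}

    commands = build_instruction_set(PROTOCOL_XML, "FD-A")
    print(commands["set_gain"].format(value=24.0))   # -> "GAIN 24.0"
    ```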

  10. Evaluation of HCFC AK 225 Alternatives for Precision Cleaning and Verification

    NASA Technical Reports Server (NTRS)

    Melton, D. M.

    1998-01-01

    Maintaining qualified cleaning and verification processes is essential in a production environment. Environmental regulations have impacted, and continue to impact, cleaning and verification processing for components and large structures, both at the Michoud Assembly Facility and at component suppliers. The goal of the effort was to assure that cleaning and verification proceed unimpeded and that qualified, environmentally compliant material and process replacements are implemented and perform to specifications. The approach consisted of (1) selection of a supersonic gas-liquid cleaning system; (2) selection and evaluation of three cleaning and verification solvents as candidate alternatives to HCFC 225 (Vertrel 423 (HCFC), Vertrel MCA (HFC/1,2-dichloroethylene), and HFE 7100DE (HFE/1,2-dichloroethylene)); and (3) evaluation of an analytical instrumental post-cleaning verification technique. This document is presented in viewgraph format.

  11. Simulation and experimental verification of prompt gamma-ray emissions during proton irradiation.

    PubMed

    Schumann, A; Petzoldt, J; Dendooven, P; Enghardt, W; Golnik, C; Hueso-González, F; Kormoll, T; Pausch, G; Roemer, K; Fiedler, F

    2015-05-21

    Irradiation with protons and light ions offers new possibilities for tumor therapy but has a strong need for novel imaging modalities for treatment verification. The development of new detector systems, which can provide an in vivo range assessment or dosimetry, requires an accurate knowledge of the secondary radiation field and reliable Monte Carlo simulations. This paper presents multiple measurements to characterize the prompt γ-ray emissions during proton irradiation and benchmarks the latest Geant4 code against the experimental findings. Within the scope of this work, the total photon yield for different target materials, the energy spectra, and the γ-ray depth profile were assessed. Experiments were performed at the superconducting AGOR cyclotron at KVI-CART, University of Groningen. Properties of the γ-ray emissions were experimentally determined. The prompt γ-ray emissions were measured utilizing a conventional HPGe detector system (Clover) and quantitatively compared to simulations. With the selected physics list QGSP_BIC_HP, Geant4 strongly overestimates the photon yield in most cases, sometimes by up to 50%. The shape of the spectrum and the qualitative occurrence of discrete γ lines are reproduced accurately. A sliced phantom was designed to determine the depth profile of the photons. The position of the distal fall-off in the simulations agrees with the measurements, albeit the peak height is also overestimated. Hence, Geant4 simulations of prompt γ-ray emissions from irradiation with protons are currently far less reliable than simulations of the electromagnetic processes. Deviations from experimental findings were observed and quantified. Although there has been constant improvement of Geant4 in the hadronic sector, there is still a gap to close.

  12. WRF Simulation over the Eastern Africa by use of Land Surface Initialization

    NASA Astrophysics Data System (ADS)

    Sakwa, V. N.; Case, J.; Limaye, A. S.; Zavodsky, B.; Kabuchanga, E. S.; Mungai, J.

    2014-12-01

    The East Africa region experiences severe weather events associated with hazards of varying magnitude. Heavy precipitation leads to widespread flooding, while a lack of sufficient rainfall in some areas results in drought. Flooding and drought are two key forecasting challenges for the Kenya Meteorological Service (KMS). The supply of heat and moisture depends on the state of the land surface, which interacts with the atmospheric boundary layer to produce excessive precipitation or, in its absence, severe drought. The development and evolution of precipitation systems are affected by heat and moisture fluxes from the land surface within weakly sheared environments, such as in the tropics and subtropics. These daytime heat and moisture fluxes can be strongly influenced by land cover, vegetation, and soil moisture content. It is therefore important to represent the land surface state as accurately as possible in numerical weather prediction models. Improved modeling capabilities within the region have the potential to enhance forecast guidance in support of daily operations and high-impact weather over East Africa. KMS currently runs a configuration of the Weather Research and Forecasting (WRF) model in real time to support its daily forecasting operations, invoking the Non-hydrostatic Mesoscale Model (NMM) dynamical core. KMS uses the National Oceanic and Atmospheric Administration / National Weather Service Science and Training Resource Center's Environmental Modeling System (EMS) to manage and produce the WRF-NMM model runs on a 7-km regional grid over Eastern Africa. SPoRT and SERVIR provide land surface initialization datasets and a model verification tool. The NASA Land Information System (LIS) provides real-time, daily soil initialization data in place of interpolated Global Forecast System soil moisture and temperature data. Model verification is done using the Model Evaluation Tools (MET) package, in order to quantify possible improvements in simulated temperature, moisture and precipitation resulting from the experimental land surface initialization. These MET tools enable KMS to monitor model forecast accuracy in near real time. This study highlights verification results of WRF runs over East Africa using the LIS land surface initialization.

  13. A fully non-linear multi-species Fokker–Planck–Landau collision operator for simulation of fusion plasma

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hager, Robert, E-mail: rhager@pppl.gov; Yoon, E.S., E-mail: yoone@rpi.edu; Ku, S., E-mail: sku@pppl.gov

    2016-06-15

    Fusion edge plasmas can be far from thermal equilibrium and require the use of a non-linear collision operator for accurate numerical simulations. In this article, the non-linear single-species Fokker–Planck–Landau collision operator developed by Yoon and Chang (2014) [9] is generalized to include multiple particle species. The finite volume discretization used in this work naturally yields exact conservation of mass, momentum, and energy. The implementation of this new non-linear Fokker–Planck–Landau operator in the gyrokinetic particle-in-cell codes XGC1 and XGCa is described and results of a verification study are discussed. Finally, the numerical techniques that make our non-linear collision operator viable on high-performance computing systems are described, including specialized load balancing algorithms and nested OpenMP parallelization. The collision operator's good weak and strong scaling behavior is shown.

  14. Skyrmion states in thin confined polygonal nanostructures

    NASA Astrophysics Data System (ADS)

    Pepper, Ryan Alexander; Beg, Marijan; Cortés-Ortuño, David; Kluyver, Thomas; Bisotti, Marc-Antonio; Carey, Rebecca; Vousden, Mark; Albert, Maximilian; Wang, Weiwei; Hovorka, Ondrej; Fangohr, Hans

    2018-03-01

    Recent studies have demonstrated that skyrmionic states can be the ground state in thin-film FeGe disk nanostructures in the absence of a stabilising applied magnetic field. In this work, we advance this understanding by investigating to what extent this stabilisation of skyrmionic structures through confinement exists in geometries that do not match the cylindrical symmetry of the skyrmion—such as squares and triangles. Using simulation, we show that skyrmionic states can form the ground state for a range of system sizes in both triangular and square-shaped FeGe nanostructures of 10 nm thickness in the absence of an applied field. We further provide data to assist in the experimental verification of our prediction; to imitate an experiment where the system is saturated with a strong applied field before the field is removed, we compute the time evolution and show the final equilibrium configuration of magnetization fields, starting from a uniform alignment.

  15. Transonic flow about a thick circular-arc airfoil

    NASA Technical Reports Server (NTRS)

    Mcdevitt, J. B.; Levy, L. L., Jr.; Deiwert, G. S.

    1975-01-01

    An experimental and theoretical study of transonic flow over a thick airfoil, prompted by a need for adequately documented experiments that could provide rigorous verification of viscous flow simulation computer codes, is reported. Special attention is given to the shock-induced separation phenomenon in the turbulent regime. Measurements presented include surface pressures, streamline and flow separation patterns, and shadowgraphs. For a limited range of free-stream Mach numbers the airfoil flow field is found to be unsteady. Dynamic pressure measurements and high-speed shadowgraph movies were taken to investigate this phenomenon. Comparisons of experimentally determined and numerically simulated steady flows using a new viscous-turbulent code are also included. The comparisons show the importance of including an accurate turbulence model. When the shock-boundary layer interaction is weak the turbulence model employed appears adequate, but when the interaction is strong, and extensive regions of separation are present, the model is inadequate and needs further development.

  16. A fully non-linear multi-species Fokker–Planck–Landau collision operator for simulation of fusion plasma

    DOE PAGES

    Hager, Robert; Yoon, E. S.; Ku, S.; ...

    2016-04-04

    Fusion edge plasmas can be far from thermal equilibrium and require the use of a non-linear collision operator for accurate numerical simulations. The non-linear single-species Fokker–Planck–Landau collision operator developed by Yoon and Chang (2014) [9] is generalized to include multiple particle species. Moreover, the finite volume discretization used in this work naturally yields exact conservation of mass, momentum, and energy. The implementation of this new non-linear Fokker–Planck–Landau operator in the gyrokinetic particle-in-cell codes XGC1 and XGCa is described and results of a verification study are discussed. Finally, the numerical techniques that make our non-linear collision operator viable on high-performance computing systems are described, including specialized load balancing algorithms and nested OpenMP parallelization. As a result, the collision operator's good weak and strong scaling behavior is shown.

  17. Dynamic tire pressure sensor for measuring ground vibration.

    PubMed

    Wang, Qi; McDaniel, James Gregory; Wang, Ming L

    2012-11-07

    This work presents a convenient and non-contact acoustic sensing approach for measuring ground vibration. This approach, which uses an instantaneous dynamic tire pressure sensor (DTPS), possesses the capability to replace the accelerometer or directional microphone currently being used for inspecting pavement conditions. By measuring dynamic pressure changes inside the tire, ground vibration can be amplified and isolated from environmental noise. In this work, verifications of the DTPS concept of sensing inside the tire have been carried out. In addition, comparisons between a DTPS, ground-mounted accelerometer, and directional microphone are made. A data analysis algorithm has been developed and optimized to reconstruct ground acceleration from DTPS data. Numerical and experimental studies of this DTPS reveal a strong potential for measuring ground vibration caused by a moving vehicle. A calibration of the transfer function between dynamic tire pressure change and ground acceleration may be needed for different tire systems or for more accurate applications.
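
    The reconstruction step described, recovering ground acceleration from dynamic tire pressure through a calibrated transfer function, can be sketched as regularized frequency-domain deconvolution. The transfer function and signals below are assumed stand-ins, not the authors' calibration.

    ```python
    import numpy as np

    def accel_from_pressure(p, h_freq):
        """Regularized frequency-domain inversion of a pressure response.

        h_freq: complex transfer function H(f) (acceleration -> pressure),
        sampled on the np.fft.rfftfreq bins of p. Wiener-style regularization
        keeps the division stable where |H| is small.
        """
        P = np.fft.rfft(p)
        eps = 1e-3 * np.abs(h_freq).max()
        A = P * np.conj(h_freq) / (np.abs(h_freq) ** 2 + eps ** 2)
        return np.fft.irfft(A, n=len(p))

    fs = 2000.0
    t = np.arange(0, 2.0, 1 / fs)
    accel_true = np.sin(2 * np.pi * 30 * t)           # ground vibration
    freqs = np.fft.rfftfreq(t.size, 1 / fs)
    H = 1.0 / (1 + 1j * freqs / 150.0)                # assumed tire response
    pressure = np.fft.irfft(np.fft.rfft(accel_true) * H, n=t.size)

    recovered = accel_from_pressure(pressure, H)
    print(np.max(np.abs(recovered - accel_true)))     # small residual
    ```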

  18. Dynamic Tire Pressure Sensor for Measuring Ground Vibration

    PubMed Central

    Wang, Qi; McDaniel, James Gregory; Wang, Ming L.

    2012-01-01

    This work presents a convenient and non-contact acoustic sensing approach for measuring ground vibration. This approach, which uses an instantaneous dynamic tire pressure sensor (DTPS), possesses the capability to replace the accelerometer or directional microphone currently being used for inspecting pavement conditions. By measuring dynamic pressure changes inside the tire, ground vibration can be amplified and isolated from environmental noise. In this work, verifications of the DTPS concept of sensing inside the tire have been carried out. In addition, comparisons between a DTPS, ground-mounted accelerometer, and directional microphone are made. A data analysis algorithm has been developed and optimized to reconstruct ground acceleration from DTPS data. Numerical and experimental studies of this DTPS reveal a strong potential for measuring ground vibration caused by a moving vehicle. A calibration of the transfer function between dynamic tire pressure change and ground acceleration may be needed for different tire systems or for more accurate applications. PMID:23202206

  19. Code Verification Results of an LLNL ASC Code on Some Tri-Lab Verification Test Suite Problems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Anderson, S R; Bihari, B L; Salari, K

    As scientific codes become more complex and involve larger numbers of developers and algorithms, chances for algorithmic implementation mistakes increase. In this environment, code verification becomes essential to building confidence in the code implementation. This paper will present first results of a new code verification effort within LLNL's B Division. In particular, we will show results of code verification of the LLNL ASC ARES code on the test problems: Su Olson non-equilibrium radiation diffusion, Sod shock tube, Sedov point blast modeled with shock hydrodynamics, and Noh implosion.

  20. Integrated guidance, navigation and control verification plan primary flight system. [space shuttle avionics integration

    NASA Technical Reports Server (NTRS)

    1978-01-01

    The verification process and requirements for the ascent guidance interfaces and the ascent integrated guidance, navigation and control system for the space shuttle orbiter are defined as well as portions of supporting systems which directly interface with the system. The ascent phase of verification covers the normal and ATO ascent through the final OMS-2 circularization burn (all of OPS-1), the AOA ascent through the OMS-1 burn, and the RTLS ascent through ET separation (all of MM 601). In addition, OPS translation verification is defined. Verification trees and roadmaps are given.

  1. Bias in estimating accuracy of a binary screening test with differential disease verification

    PubMed Central

    Brinton, John T.; Ringham, Brandy M.; Glueck, Deborah H.

    2011-01-01

    Sensitivity, specificity, and positive and negative predictive value are typically used to quantify the accuracy of a binary screening test. In some studies it may not be ethical or feasible to obtain definitive disease ascertainment for all subjects using a gold standard test. When a gold standard test cannot be used, an imperfect reference test that is less than 100% sensitive and specific may be used instead. In breast cancer screening, for example, follow-up for cancer diagnosis is used as an imperfect reference test for women where it is not possible to obtain gold standard results. This incomplete ascertainment of true disease, or differential disease verification, can result in biased estimates of accuracy. In this paper, we derive the apparent accuracy values for studies subject to differential verification. We determine how the bias is affected by the accuracy of the imperfect reference test, the percentage of subjects who receive the imperfect reference test rather than the gold standard, the prevalence of the disease, and the correlation between the results of the screening test and the imperfect reference test. It is shown that designs with differential disease verification can yield biased estimates of accuracy. Estimates of sensitivity in cancer screening trials may be substantially biased. However, careful design decisions, including selection of the imperfect reference test, can help to minimize bias. A hypothetical breast cancer screening study is used to illustrate the problem. PMID:21495059
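
    The bias mechanism is easy to reproduce by simulation: false positives from the imperfect reference inflate the reference-positive denominator, dragging apparent sensitivity down. A sketch with assumed accuracy values (not taken from the paper):

    ```python
    import numpy as np

    rng = np.random.default_rng(8)
    n = 200_000

    disease = rng.random(n) < 0.01                # 1% prevalence (assumed)
    screen_pos = np.where(disease, rng.random(n) < 0.85, rng.random(n) < 0.05)

    # Differential verification: screen-positives get the gold standard; the
    # rest get an imperfect reference (follow-up) with sens 0.70, spec 0.99.
    ref_pos = np.where(
        screen_pos,
        disease,                                                    # gold standard
        np.where(disease, rng.random(n) < 0.70, rng.random(n) < 0.01),
    )

    apparent_sens = (screen_pos & ref_pos).sum() / ref_pos.sum()
    true_sens = (screen_pos & disease).sum() / disease.sum()
    print(f"true sensitivity: {true_sens:.3f}  apparent: {apparent_sens:.3f}")
    ```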

  2. Study on micro-water measurement method based on SF6 insulation equipment in high altitude area

    NASA Astrophysics Data System (ADS)

    Zhang, Han; Liu, Yajin; Yan, Jun; Liu, Zhijian; Yan, Yongfei

    2018-06-01

    Moisture content is an important indicator of the insulation and arc-extinguishing performance of SF6-insulated electrical equipment. Research shows that moisture measurements are strongly influenced by altitude (pressure) and that applying the pressure and temperature corrections in different orders yields different results. This paper therefore studies the pressure and temperature conditions for moisture testing of SF6 gas-insulated equipment in the power industry. First, the PVT characteristics of pure SF6 gas and water vapor were analyzed, establishing the necessity of pressure correction; the Pitzer-Veli equations for SF6 gas and for water were then combined to fit a PVT equation of state for SF6-H2O suitable for the power industry, from which a correction formula for moisture measurement in SF6 gas was derived. Finally, the correction formula was optimized and verified through experiments on SF6 electrical equipment, demonstrating its applicability and effectiveness.

  3. Verification of causal influences of reasoning skills and epistemology on physics conceptual learning

    NASA Astrophysics Data System (ADS)

    Ding, Lin

    2014-12-01

    This study seeks to test the causal influences of reasoning skills and epistemologies on student conceptual learning in physics. A causal model, integrating multiple variables that were investigated separately in the prior literature, is proposed and tested through path analysis. These variables include student preinstructional reasoning skills measured by the Classroom Test of Scientific Reasoning, pre- and postepistemological views measured by the Colorado Learning Attitudes about Science Survey, and pre- and postperformance on Newtonian concepts measured by the Force Concept Inventory. Students from a traditionally taught calculus-based introductory mechanics course at a research university participated in the study. Results largely support the postulated causal model and reveal strong influences of reasoning skills and preinstructional epistemology on student conceptual learning gains. Interestingly enough, postinstructional epistemology does not appear to have a significant influence on student learning gains. Moreover, pre- and postinstructional epistemology, although barely different from each other on average, have little causal connection between them.

  4. A calibration method for patient specific IMRT QA using a single therapy verification film

    PubMed Central

    Shukla, Arvind Kumar; Oinam, Arun S.; Kumar, Sanjeev; Sandhu, I.S.; Sharma, S.C.

    2013-01-01

    Aim The aim of the present study is to develop and verify a single-film calibration procedure for intensity-modulated radiation therapy (IMRT) quality assurance. Background Radiographic films have been regularly used in routine commissioning of treatment modalities and verification of treatment planning systems (TPS). Film-based radiation dosimetry can yield absolute two-dimensional dose distributions and is well suited to IMRT quality assurance; a single therapy verification film, in particular, offers a quick and reliably accurate method for IMRT verification. Materials and methods A single extended dose range (EDR2) film was used to generate the sensitometric curve relating film optical density to radiation dose. The EDR2 film was exposed with nine 6 cm × 6 cm fields of a 6 MV photon beam from a medical linear accelerator at 5-cm depth in a solid water phantom. The nine regions of the single film were exposed with radiation doses ranging from 10 to 362 cGy. The actual dose inside each field region was measured using a 0.6 cm3 ionization chamber. After processing, the exposed film was digitized with a VIDAR film scanner and the optical density of each region was recorded. Ten IMRT plans of head and neck carcinoma, delivered with a dynamic IMRT technique, were verified against the TPS-calculated dose distributions using the gamma index method. Results A sensitometric curve was generated from the single film exposed at nine field regions and used for quantitative dose verification of IMRT treatments. The radiation scatter factor was observed to decrease exponentially with distance from the centre of each field region. The IMRT plans verified against this calibration curve met the gamma index acceptance criteria. Conclusion The single-film method proved superior to the traditional calibration method and provides a fast daily film calibration for highly accurate IMRT verification. PMID:24416558
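    As a sketch of the calibration step, the snippet below (Python/NumPy) fits a smooth dose-versus-optical-density curve to nine dose points spanning the study's 10-362 cGy range; the optical-density values are hypothetical placeholders, since the record reports no raw numbers.

        import numpy as np

        # Nine calibration regions on one film: delivered dose (cGy) and the
        # net optical density read from the scanned film (hypothetical values).
        dose = np.array([10., 54., 98., 142., 186., 230., 274., 318., 362.])
        od   = np.array([0.07, 0.33, 0.58, 0.80, 1.00, 1.18, 1.34, 1.48, 1.61])

        # Fit dose as a cubic function of optical density, so scanned OD values
        # from a verification film map directly to absolute dose.
        od_to_dose = np.poly1d(np.polyfit(od, dose, deg=3))

        print(f"OD 0.90 -> {od_to_dose(0.90):.1f} cGy")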

  5. SU-E-T-762: Toward Volume-Based Independent Dose Verification as Secondary Check

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Tachibana, H; Tachibana, R

    2015-06-15

    Purpose: Lung SBRT planning has shifted to volume-prescription techniques. However, at the secondary check, agreement is still verified for a point dose using independent dose verification. Volume dose verification is more sensitive to the inhomogeneity correction than the point dose verification currently used for the check. A feasibility study of volume dose verification was conducted for lung SBRT plans. Methods: Six SBRT plans were collected at our institute. Two dose distributions, with and without inhomogeneity correction, were generated using Adaptive Convolve (AC) in Pinnacle3. Simple MU Analysis (SMU, Triangle Product, Ishikawa, JP) was used as the independent dose verification software program, in which a modified Clarkson-based algorithm was implemented and the radiological path length was computed from the CT images independently of the treatment planning system. Agreement in point dose and mean dose between the AC (with and without the correction) and the SMU was assessed. Results: In the point dose evaluation at the center of the GTV, the comparison with the inhomogeneity-corrected AC showed a systematic shift (4.5% ± 1.9%); by contrast, there was good agreement of 0.2% ± 0.9% between the SMU and the AC without the correction. In the volume evaluation, there were significant differences in mean dose not only for the PTV (14.2% ± 5.1%) but also for the GTV (8.0% ± 5.1%) compared to the AC with the correction. Without the correction, the SMU showed good agreement for the GTV (1.5% ± 0.9%) as well as the PTV (0.9% ± 1.0%). Conclusion: Volume evaluation at the secondary check may be feasible in homogeneous regions; however, volumes that include inhomogeneous media produce larger discrepancies. The dose calculation algorithm for independent verification needs to be modified to take the inhomogeneity correction into account.
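    The point-versus-volume comparison reduces to a few array operations once both dose grids are available. The sketch below (Python/NumPy) is a minimal illustration of the two agreement metrics used above; the dose arrays, the 2% offset, and the GTV mask are invented for the example.

        import numpy as np

        def agreement(dose_tps, dose_indep, mask, center):
            """Percent difference at a point (e.g. the GTV centre) and in the
            mean dose over a structure, independent calculation vs. TPS."""
            pt = 100 * (dose_indep[center] - dose_tps[center]) / dose_tps[center]
            vol = (100 * (dose_indep[mask].mean() - dose_tps[mask].mean())
                   / dose_tps[mask].mean())
            return pt, vol

        rng = np.random.default_rng(1)
        tps = 2.0 + 0.1 * rng.random((40, 40, 40))   # Gy per fraction, invented
        indep = 1.02 * tps                           # assume a 2% systematic offset
        gtv = np.zeros(tps.shape, dtype=bool)
        gtv[15:25, 15:25, 15:25] = True

        print(agreement(tps, indep, gtv, (20, 20, 20)))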

  6. From Verified Models to Verifiable Code

    NASA Technical Reports Server (NTRS)

    Lensink, Leonard; Munoz, Cesar A.; Goodloe, Alwyn E.

    2009-01-01

    Declarative specifications of digital systems often contain parts that can be automatically translated into executable code. Automated code generation may reduce or eliminate the kinds of errors typically introduced through manual code writing. For this approach to be effective, the generated code should be reasonably efficient and, more importantly, verifiable. This paper presents a prototype code generator for the Prototype Verification System (PVS) that translates a subset of PVS functional specifications into an intermediate language and subsequently to multiple target programming languages. Several case studies are presented to illustrate the tool's functionality. The generated code can be analyzed by software verification tools such as verification condition generators, static analyzers, and software model-checkers to increase the confidence that the generated code is correct.

  7. SSME Alternate Turbopump Development Program: Design verification specification for high-pressure fuel turbopump

    NASA Technical Reports Server (NTRS)

    1989-01-01

    The design and verification requirements appropriate to hardware at the detail, subassembly, component, and engine levels are defined, and these requirements are correlated to the development demonstrations that verify the design objectives are achieved. The high-pressure fuel turbopump requirements verification matrix provides correlation between design requirements and the tests required to verify that the requirements have been met.

  8. Quantum money with classical verification

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gavinsky, Dmitry

    We propose and construct a quantum money scheme that allows verification through classical communication with a bank. This is the first demonstration that a secure quantum money scheme exists that does not require quantum communication for coin verification. Our scheme is secure against adaptive adversaries - this property is not directly related to the possibility of classical verification, nevertheless none of the earlier quantum money constructions is known to possess it.

  9. Quantum money with classical verification

    NASA Astrophysics Data System (ADS)

    Gavinsky, Dmitry

    2014-12-01

    We propose and construct a quantum money scheme that allows verification through classical communication with a bank. This is the first demonstration that a secure quantum money scheme exists that does not require quantum communication for coin verification. Our scheme is secure against adaptive adversaries - this property is not directly related to the possibility of classical verification, nevertheless none of the earlier quantum money constructions is known to possess it.

  10. Physics Verification Overview

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Doebling, Scott William

    The purpose of the verification project is to establish, through rigorous convergence analysis, that each ASC computational physics code correctly implements its set of physics models and algorithms (code verification); to evaluate and analyze the uncertainties of code outputs associated with the choice of temporal and spatial discretization (solution or calculation verification); and to develop and maintain the capability to expand and update these analyses on demand. This presentation describes project milestones.

  11. Towards composition of verified hardware devices

    NASA Technical Reports Server (NTRS)

    Schubert, E. Thomas; Levitt, K.; Cohen, G. C.

    1991-01-01

    Computers are being used where no affordable level of testing is adequate. Safety- and life-critical systems must find a replacement for exhaustive testing to guarantee their correctness. Hardware verification research, which establishes correctness through mathematical proof, has focused on device verification and has largely ignored the verification of system composition. To address this deficiency, we examine how the current hardware verification methodology can be extended to verify complete systems.

  12. Control and Non-Payload Communications (CNPC) Prototype Radio Verification Test Report

    NASA Technical Reports Server (NTRS)

    Bishop, William D.; Frantz, Brian D.; Thadhani, Suresh K.; Young, Daniel P.

    2017-01-01

    This report provides an overview of, and results from, the verification of the specifications that define the operational capabilities of the airborne and ground, L-band and C-band, Control and Non-Payload Communications radio link system. An overview of system verification is provided, along with an overview of the operation of the radio. Measurement results are presented for verification of the radio's operation.

  13. 40 CFR 1065.372 - NDUV analyzer HC and H2O interference verification.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... compensation algorithms that utilize measurements of other gases to meet this interference verification, simultaneously conduct such measurements to test the algorithms during the analyzer interference verification. (c...

  14. Formal verification of an oral messages algorithm for interactive consistency

    NASA Technical Reports Server (NTRS)

    Rushby, John

    1992-01-01

    The formal specification and verification of an algorithm for Interactive Consistency based on the Oral Messages algorithm for Byzantine Agreement is described. We compare our treatment with that of Bevier and Young, who presented a formal specification and verification for a very similar algorithm. Unlike Bevier and Young, who observed that 'the invariant maintained in the recursive subcases of the algorithm is significantly more complicated than is suggested by the published proof' and who found its formal verification 'a fairly difficult exercise in mechanical theorem proving,' our treatment is very close to the previously published analysis of the algorithm, and our formal specification and verification are straightforward. This example illustrates how delicate choices in the formulation of the problem can have significant impact on the readability of its formal specification and on the tractability of its formal verification.

  15. A methodology for the rigorous verification of plasma simulation codes

    NASA Astrophysics Data System (ADS)

    Riva, Fabio

    2016-10-01

    The methodology used to assess the reliability of numerical simulation codes constitutes the Verification and Validation (V&V) procedure. V&V is composed of two separate tasks: verification, a mathematical issue aimed at assessing that the physical model is correctly solved, and validation, which determines the consistency of the code results, and therefore of the physical model, with experimental data. In the present talk we focus on verification, which in turn comprises code verification, aimed at assessing that a physical model is correctly implemented in a simulation code, and solution verification, which quantifies the numerical error affecting a simulation. Bridging the gap between plasma physics and other scientific domains, we introduced for the first time in our domain a rigorous methodology for code verification, based on the method of manufactured solutions, as well as a solution verification based on Richardson extrapolation. This methodology was applied to GBS, a three-dimensional fluid code based on a finite difference scheme, used to investigate plasma turbulence in basic plasma physics experiments and in the tokamak scrape-off layer. Overcoming the difficulty of dealing with a numerical method intrinsically affected by statistical noise, we have now generalized the rigorous verification methodology to simulation codes based on the particle-in-cell algorithm, which are employed to solve the Vlasov equation in the investigation of a number of plasma physics phenomena.
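    For readers unfamiliar with the solution-verification step, the following sketch (Python; the grid values are invented for illustration) shows the standard Richardson-extrapolation calculation: three solutions on successively refined grids yield an observed order of accuracy and an estimate of the numerical error.

        import math

        def observed_order(f_coarse, f_medium, f_fine, r):
            """Observed order of accuracy from three grids with refinement ratio r."""
            return math.log(abs(f_coarse - f_medium) / abs(f_medium - f_fine)) / math.log(r)

        def richardson(f_medium, f_fine, r, p):
            """Extrapolated value and the numerical-error estimate for the fine grid."""
            f_exact = f_fine + (f_fine - f_medium) / (r**p - 1)
            return f_exact, abs(f_exact - f_fine)

        f1, f2, f3 = 0.9700, 0.9925, 0.9981      # coarse, medium, fine (illustrative)
        p = observed_order(f1, f2, f3, r=2.0)
        print(f"observed order ~ {p:.2f}")        # ~2, matching a 2nd-order scheme
        print(richardson(f2, f3, 2.0, p))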

  16. The Effects of Sentence Imitation and Picture Verification on the Recall of Subsequent Digits. Lektos: Interdisciplinary Working Papers in Language Sciences, Vol. 3, No. 1.

    ERIC Educational Resources Information Center

    Scholes, Robert J.; And Others

    The effects of sentence imitation and picture verification on the recall of subsequent digits were studied. Stimuli consisted of 20 sentences, each sentence followed by a string of five digit names, and five structural types of sentences were presented. Subjects were instructed to listen to the sentence and digit string and then either immediately…

  17. ADVANCED SURVEILLANCE OF ENVIRONMENTAL RADIATION IN AUTOMATIC NETWORKS.

    PubMed

    Benito, G; Sáez, J C; Blázquez, J B; Quiñones, J

    2018-06-01

    The objective of this study is the verification of the operation of a radiation monitoring network composed of several sensors. A malfunction of a surveillance network has security and economic consequences, deriving from its maintenance, that could be avoided with early detection. The proposed method is based on a form of multivariate distance, and the methodology has been verified on CIEMAT's local radiological early warning network.
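    As a sketch of the kind of multivariate-distance check described, the following Python/NumPy snippet flags time steps whose vector of station readings departs from the network's joint historical behaviour (a Mahalanobis distance); the data and threshold are invented for illustration.

        import numpy as np

        def mahalanobis_flags(readings, threshold=5.0):
            """Distance of each reading vector from the network mean, taking
            inter-station correlations into account; large values suggest a
            sensor malfunction rather than a genuine radiological event."""
            diff = readings - readings.mean(axis=0)
            cov_inv = np.linalg.pinv(np.cov(readings, rowvar=False))
            d = np.sqrt(np.einsum("ij,jk,ik->i", diff, cov_inv, diff))
            return d > threshold

        rng = np.random.default_rng(2)
        x = rng.normal(100.0, 5.0, size=(500, 6))   # 500 readings from 6 stations
        x[-5:, 3] += 40.0                           # station 3 drifts at the end
        print(np.where(mahalanobis_flags(x))[0])    # flags the final drifting steps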

  18. The verification of LANDSAT data in the geographical analysis of wetlands in west Tennessee

    NASA Technical Reports Server (NTRS)

    Rehder, J.; Quattrochi, D. A.

    1978-01-01

    The reliability of LANDSAT imagery as a medium for identifying, delimiting, monitoring, measuring, and mapping wetlands in west Tennessee was assessed to verify LANDSAT as an accurate, efficient cartographic tool that could be employed by a wide range of users to study wetland dynamics. The verification procedure was based on the visual interpretation and measurement of multispectral imagery. The accuracy-testing procedure was predicated on surrogate ground truth data gleaned from medium-altitude imagery of the wetlands. Fourteen sites, or case study areas, were selected from individual 9 x 9 inch photo frames of the aerial photography. These sites were then used as control calibration parameters for assessing the cartographic accuracy of the LANDSAT imagery. An analysis of the verification tests indicated that 1:250,000-scale LANDSAT data were the most reliable scale of imagery for visually mapping and measuring wetlands using the area grid technique. The mean areal percentage of accuracy was 93.54 percent (real) and 96.93 percent (absolute). As a test of accuracy, the LANDSAT 1:250,000-scale overall wetland measurements were compared with an area cell mensuration of the swamplands from 1:130,000-scale color infrared U-2 aircraft imagery. The comparative totals substantiated the results of the LANDSAT verification procedure.

  19. Volumetric Verification of Multiaxis Machine Tool Using Laser Tracker

    PubMed Central

    Aguilar, Juan José

    2014-01-01

    This paper aims to present a method of volumetric verification for machine tools with linear and rotary axes using a laser tracker. Beyond a method for a particular machine, it presents a methodology that can be applied to any machine type. The paper presents the schema and kinematic model of a machine with three axes of movement (two linear and one rotary), including the measurement system and the nominal rotation matrix of the rotary axis. Using this model, the volumetric error of the machine tool is obtained, and nonlinear optimization techniques are employed to improve its accuracy. The verification provides a mathematical, rather than physical, compensation, in less time than verification methods based on the indirect measurement of the geometric errors of the linear and rotary axes. The paper includes an extensive study of the appropriateness and drawbacks of the regression function employed, depending on the types of movement of the axes of the machine, as well as the strengths and weaknesses of the measurement methods and optimization techniques, depending on the space available for the measurement system. These studies provide the most appropriate strategies to verify each machine tool, taking into consideration its configuration and available workspace. PMID:25202744
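    The core of such a verification is a nonlinear least-squares fit of error-model parameters to laser-tracker measurements. The sketch below (Python/SciPy) does this for a deliberately tiny, hypothetical two-parameter model of one linear axis (a scale error and a straightness term); production volumetric models carry many more parameters per axis, but the fitting pattern is the same.

        import numpy as np
        from scipy.optimize import least_squares

        def predicted(params, x_nominal):
            """Hypothetical error model: an axis scale error and a straightness
            term that deflects the carriage in y as it travels in x."""
            scale, straightness = params
            return np.column_stack([x_nominal * (1 + scale),
                                    straightness * x_nominal])

        def residuals(params, x_nominal, measured):
            return (predicted(params, x_nominal) - measured).ravel()

        rng = np.random.default_rng(3)
        x_nom = np.linspace(0.0, 1000.0, 25)        # commanded positions, mm
        measured = predicted([2e-4, 5e-5], x_nom) + rng.normal(0, 0.002, (25, 2))

        fit = least_squares(residuals, x0=[0.0, 0.0], args=(x_nom, measured))
        print(fit.x)   # recovered scale and straightness parameters

    Once the parameters are identified, the controller can apply the inverse model, which is the sense in which the verification compensates mathematically rather than physically.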

  20. Verification of intensity modulated radiation therapy beams using a tissue equivalent plastic scintillator dosimetry system

    NASA Astrophysics Data System (ADS)

    Petric, Martin Peter

    This thesis describes the development and implementation of a novel method for the dosimetric verification of intensity modulated radiation therapy (IMRT) fields with several advantages over current techniques. Through the use of a tissue equivalent plastic scintillator sheet viewed by a charge-coupled device (CCD) camera, this method provides a truly tissue equivalent dosimetry system capable of efficiently and accurately performing field-by-field verification of IMRT plans. This work was motivated by an initial study comparing two IMRT treatment planning systems. The clinical functionality of BrainLAB's BrainSCAN and Varian's Helios IMRT treatment planning systems were compared in terms of implementation and commissioning, dose optimization, and plan assessment. Implementation and commissioning revealed differences in the beam data required to characterize the beam prior to use with the BrainSCAN system requiring higher resolution data compared to Helios. This difference was found to impact on the ability of the systems to accurately calculate dose for highly modulated fields, with BrainSCAN being more successful than Helios. The dose optimization and plan assessment comparisons revealed that while both systems use considerably different optimization algorithms and user-control interfaces, they are both capable of producing substantially equivalent dose plans. The extensive use of dosimetric verification techniques in the IMRT treatment planning comparison study motivated the development and implementation of a novel IMRT dosimetric verification system. The system consists of a water-filled phantom with a tissue equivalent plastic scintillator sheet built into the top surface. Scintillation light is reflected by a plastic mirror within the phantom towards a viewing window where it is captured using a CCD camera. Optical photon spread is removed using a micro-louvre optical collimator and by deconvolving a glare kernel from the raw images. Characterization of this new dosimetric verification system indicates excellent dose response and spatial linearity, high spatial resolution, and good signal uniformity and reproducibility. Dosimetric results from square fields, dynamic wedged fields, and a 7-field head and neck IMRT treatment plan indicate good agreement with film dosimetry distributions. Efficiency analysis of the system reveals a 50% reduction in time requirements for field-by-field verification of a 7-field IMRT treatment plan compared to film dosimetry.
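    The glare-removal step mentioned above is, in signal-processing terms, a deconvolution. The snippet below (Python/NumPy) shows one standard way to perform it, a regularized (Wiener-style) inverse filter; the kernel shape and the regularization constant are illustrative assumptions, not the thesis's measured glare kernel.

        import numpy as np

        n = 128
        ax = np.arange(n) - n // 2
        xx, yy = np.meshgrid(ax, ax)
        r2 = xx**2 + yy**2
        # Illustrative glare kernel: a sharp core plus a broad, low-level halo.
        kernel = np.exp(-r2 / 2.0) + 0.01 * np.exp(-r2 / 200.0)
        kernel /= kernel.sum()

        def deglare(image, kernel, reg=1e-3):
            """Deconvolve a known glare kernel from a CCD frame in the Fourier
            domain; the reg term keeps the inverse filter from amplifying noise."""
            K = np.fft.fft2(np.fft.ifftshift(kernel))   # kernel centred on the grid
            W = np.conj(K) / (np.abs(K) ** 2 + reg)
            return np.real(np.fft.ifft2(np.fft.fft2(image) * W))

        raw = np.random.default_rng(4).random((n, n))   # stand-in CCD frame
        clean = deglare(raw, kernel)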

  1. A FUNCTIONAL NEUROIMAGING INVESTIGATION OF THE ROLES OF STRUCTURAL COMPLEXITY AND TASK-DEMAND DURING AUDITORY SENTENCE PROCESSING

    PubMed Central

    Love, Tracy; Haist, Frank; Nicol, Janet; Swinney, David

    2009-01-01

    Using functional magnetic resonance imaging (fMRI), this study directly examined an issue that bridges the potential language processing and multi-modal views of the role of Broca’s area: the effects of task-demands in language comprehension studies. We presented syntactically simple and complex sentences for auditory comprehension under three different (differentially complex) task-demand conditions: passive listening, probe verification, and theme judgment. Contrary to many language imaging findings, we found that both simple and complex syntactic structures activated left inferior frontal cortex (L-IFC). Critically, we found activation in these frontal regions increased together with increased task-demands. Specifically, tasks that required greater manipulation and comparison of linguistic material recruited L-IFC more strongly; independent of syntactic structure complexity. We argue that much of the presumed syntactic effects previously found in sentence imaging studies of L-IFC may, among other things, reflect the tasks employed in these studies and that L-IFC is a region underlying mnemonic and other integrative functions, on which much language processing may rely. PMID:16881268

  2. A Quantitative Approach to the Formal Verification of Real-Time Systems.

    DTIC Science & Technology

    1996-09-01

    A Quantitative Approach to the Formal Verification of Real-Time Systems. Sergio Vale Aguiar Campos, September 1996, CMU-CS-96-199, School of Computer Science, Carnegie Mellon University. The views and conclusions are those of the author and should not be interpreted as representing the official policies, either expressed or implied, of NSF, the Semiconductor Research Corporation, ARPA or the U.S. government. Keywords: real-time systems, formal verification, symbolic

  3. Kleene Algebra and Bytecode Verification

    DTIC Science & Technology

    2016-04-27

    computing the star (Kleene closure) of a matrix of transfer functions. In this paper we show how this general framework applies to the problem of Java bytecode verification. We show how to specify transfer functions arising in Java bytecode verification in such a way that the Kleene algebra operations ... potentially improve the performance over the standard worklist algorithm when a small cutset can be found. Key words: Java, bytecode, verification, static
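    The simplest instance of this framework is the Boolean Kleene algebra, where the star of an adjacency matrix is the reflexive-transitive closure of a control-flow graph. The Python sketch below illustrates that special case; in actual bytecode verification, the matrix entries would be dataflow transfer functions rather than Booleans, but the algebra is the same.

        import numpy as np

        def kleene_star(a):
            """A* = I | A | A@A | ... for a square Boolean matrix: entry (i, j)
            says whether instruction j is reachable from instruction i."""
            star = np.eye(a.shape[0], dtype=bool) | a
            while True:
                nxt = star | ((star.astype(np.uint8) @ star.astype(np.uint8)) > 0)
                if (nxt == star).all():
                    return star
                star = nxt

        # Control-flow graph of a 4-instruction method, including a loop 1 <-> 2.
        a = np.zeros((4, 4), dtype=bool)
        a[0, 1] = a[1, 2] = a[2, 1] = a[2, 3] = True
        print(kleene_star(a).astype(int))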

  4. Security Verification of Secure MANET Routing Protocols

    DTIC Science & Technology

    2012-03-22

    Security Verification of Secure MANET Routing Protocols. Thesis, Matthew F. Steele, B.S.E.E., Captain, USAF. AFIT/GCS/ENG/12-03, Department of the Air Force, Air Force Institute of Technology. Presented to the Faculty, Department of Electrical and Computer Engineering. Distribution unlimited.

  5. Requirements, Verification, and Compliance (RVC) Database Tool

    NASA Technical Reports Server (NTRS)

    Rainwater, Neil E., II; McDuffee, Patrick B.; Thomas, L. Dale

    2001-01-01

    This paper describes the development, design, and implementation of the Requirements, Verification, and Compliance (RVC) database used on the International Space Welding Experiment (ISWE) project managed at Marshall Space Flight Center. The RVC is a systems engineer's tool for automating and managing the following information: requirements; requirements traceability; verification requirements; verification planning; verification success criteria; and compliance status. This information, normally contained within documents (e.g., specifications and plans), is held in an electronic database that allows project team members to access, query, and track the status of the requirements, verification, and compliance information from their individual desktop computers. Using commercial-off-the-shelf (COTS) database software with networking capabilities, the RVC was developed not only with cost savings in mind but primarily to provide a more efficient and effective automated method of maintaining and distributing the systems engineering information. In addition, the RVC approach gives the systems engineer the capability to develop and tailor reports containing the requirements, verification, and compliance information to meet the needs of the project team. The automated approach of the RVC for capturing and distributing the information improves the productivity of the systems engineer by allowing that person to concentrate on developing good requirements and verification programs rather than on being a "document developer".
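    A minimal sketch of such a database is easy to express in code. The Python/sqlite3 snippet below invents a two-table schema (the actual RVC schema is not described here) to show how requirements, verification events, and compliance status can be linked and queried; the inserted rows are hypothetical.

        import sqlite3

        con = sqlite3.connect(":memory:")
        con.executescript("""
            CREATE TABLE requirement (
                id TEXT PRIMARY KEY, text TEXT, parent TEXT);
            CREATE TABLE verification (
                req_id TEXT REFERENCES requirement(id),
                method TEXT CHECK (method IN
                    ('test','analysis','inspection','demonstration')),
                success_criteria TEXT,
                status TEXT DEFAULT 'open');
        """)
        con.execute("INSERT INTO requirement VALUES ('ISWE-101', "
                    "'Welding arc shall ignite in vacuum', NULL)")   # hypothetical
        con.execute("INSERT INTO verification VALUES ('ISWE-101', 'test', "
                    "'ignition within 2 s', 'closed')")

        # Compliance report: every requirement with its verification status.
        for row in con.execute("""SELECT r.id, v.method, v.status
                                  FROM requirement r
                                  LEFT JOIN verification v ON v.req_id = r.id"""):
            print(row)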

  6. A Practitioners Perspective on Verification

    NASA Astrophysics Data System (ADS)

    Steenburgh, R. A.

    2017-12-01

    NOAA's Space Weather Prediction Center offers a wide range of products and services to meet the needs of an equally wide range of customers. A robust verification program is essential to the informed use of model guidance and other tools by forecasters and end users alike. In this talk, we present current SWPC practices and results, and examine emerging requirements and potential approaches to satisfy them. We explore the varying verification needs of forecasters and end users, as well as the role of subjective and objective verification. Finally, we describe a vehicle used in the meteorological community to unify approaches to model verification and facilitate intercomparison.

  7. Evaluation of geotechnical monitoring data from the ESF North Ramp Starter Tunnel, April 1994 to June 1995. Revision 1

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    NONE

    1995-11-01

    This report presents the results of instrumentation measurements and observations made during construction of the North Ramp Starter Tunnel (NRST) of the Exploratory Studies Facility (ESF). The information in this report was developed as part of the Design Verification Study, Section 8.3.1.15.1.8 of the Yucca Mountain Site Characterization Plan (DOE 1988). The ESF is being constructed by the US Department of Energy (DOE) to evaluate the feasibility of locating a potential high-level nuclear waste repository on lands within and adjacent to the Nevada Test Site (NTS), Nye County, Nevada. The Design Verification Studies are performed to collect information during construction of the ESF that will be useful for design and construction of the potential repository. Four experiments make up the Design Verification Study: Evaluation of Mining Methods, Monitoring Drift Stability, Monitoring of Ground Support Systems, and The Air Quality and Ventilation Experiment. This report describes Sandia National Laboratories' (SNL) efforts in the first three of these experiments in the NRST.

  8. Evaluation of AK-225(R), Vertrel(R) MCA and HFE A 7100 as Alternative Solvents for Precision Cleaning and Verification Technology

    NASA Technical Reports Server (NTRS)

    Melendez, Orlando; Trizzino, Mary; Fedderson, Bryan

    1997-01-01

    The National Aeronautics and Space Administration (NASA), Kennedy Space Center (KSC) Materials Science Division conducted a study to evaluate alternative solvents to CFC-113 for precision cleaning and verification on typical samples used in the KSC environment. The effects of AK-225(R), Vertrel(R) MCA, and HFE A 7100 on selected metal and polymer materials were studied over 1-, 7-, and 30-day test times. This report addresses the materials compatibility of these replacement solvents in aerospace applications.

  9. Environmental Technology Verification: Pesticide Spray Drift Reduction Technologies for Row and Field Crops

    EPA Pesticide Factsheets

    The Environmental Technology Verification Program, established by the EPA, is designed to accelerate the development and commercialization of new or improved technologies through third-party verification and reporting of performance.

  10. 14 CFR 460.17 - Verification program.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... software in an operational flight environment before allowing any space flight participant on board during a flight. Verification must include flight testing. ... TRANSPORTATION LICENSING HUMAN SPACE FLIGHT REQUIREMENTS Launch and Reentry with Crew § 460.17 Verification...

  11. 14 CFR 460.17 - Verification program.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... software in an operational flight environment before allowing any space flight participant on board during a flight. Verification must include flight testing. ... TRANSPORTATION LICENSING HUMAN SPACE FLIGHT REQUIREMENTS Launch and Reentry with Crew § 460.17 Verification...

  12. 14 CFR 460.17 - Verification program.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... software in an operational flight environment before allowing any space flight participant on board during a flight. Verification must include flight testing. ... TRANSPORTATION LICENSING HUMAN SPACE FLIGHT REQUIREMENTS Launch and Reentry with Crew § 460.17 Verification...

  13. 14 CFR 460.17 - Verification program.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... software in an operational flight environment before allowing any space flight participant on board during a flight. Verification must include flight testing. ... TRANSPORTATION LICENSING HUMAN SPACE FLIGHT REQUIREMENTS Launch and Reentry with Crew § 460.17 Verification...

  14. 14 CFR 460.17 - Verification program.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... software in an operational flight environment before allowing any space flight participant on board during a flight. Verification must include flight testing. ... TRANSPORTATION LICENSING HUMAN SPACE FLIGHT REQUIREMENTS Launch and Reentry with Crew § 460.17 Verification...

  15. How easily can omission of patients, or selection amongst poorly-reproducible measurements, create artificial correlations? Methods for detection and implications for observational research design in cardiology.

    PubMed

    Francis, Darrel P

    2013-07-15

    When reported correlation coefficients seem too high to be true, does investigative verification of source data provide suitable reassurance? This study tests how easily omission of patients, or selection amongst irreproducible measurements, can generate fictitious strong correlations without data fabrication. Two forms of manipulation are applied to a pair of normally-distributed, uncorrelated variables: first, exclusion of the patients least favourable to a hypothesised association and, second, making multiple poorly-reproducible measurements per patient and choosing the most supportive. Excluding patients raises correlations powerfully, from 0.0 ± 0.11 (no patients omitted) to 0.40 ± 0.11 (one-fifth omitted), 0.59 ± 0.08 (one-third omitted) and 0.78 ± 0.05 (half omitted). Study size offers no protection: omitting just one-fifth of 75 patients (i.e. publishing 60) makes 92% of correlations statistically significant. Worse, simply selecting the most favourable amongst several measurements raises correlations from 0.0 ± 0.12 (single measurement of each variable) to 0.73 ± 0.06 (best of 2) and 0.90 ± 0.03 (best of 4); 100% of correlation coefficients become statistically significant. Scatterplots may reveal a telltale "shave sign" or "bite sign". Simple statistical tests are presented for these suspicious signatures in single or multiple studies. Correlations are vulnerable to data manipulation. Cardiology is especially vulnerable to patient deletion (because cardiologists ourselves might completely control enrolment and measurement) and to selection of "best" measurements (because alternative heartbeats are numerous, and some modalities are poorly reproducible). Source data verification cannot detect these, but tests might highlight suspicious data and, aggregating across studies, unreliable laboratories or research fields. Cardiological correlation research needs adequately informed planning and guarantees of integrity, with teeth. Copyright © 2011 Elsevier Ireland Ltd. All rights reserved.
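    Both manipulations are short enough to simulate directly. The Python/NumPy sketch below is a loose re-implementation of the paper's two manipulations, with invented selection rules: starting from genuinely uncorrelated variables, omitting unfavourable patients or keeping the "best" of several measurements manufactures a strong correlation.

        import numpy as np

        rng = np.random.default_rng(5)

        def drop_unfavourable(x, y, frac):
            """Omit the fraction of patients least favourable to a positive
            association (most negative centred cross-products)."""
            order = np.argsort((x - x.mean()) * (y - y.mean()))
            keep = order[int(frac * len(x)):]
            return x[keep], y[keep]

        def best_of_m(x, m):
            """Of m poorly reproducible measurements per patient, keep the one
            most supportive of a positive association with x."""
            reps = rng.normal(size=(len(x), m))
            return np.where(x > 0, reps.max(axis=1), reps.min(axis=1))

        x, y = rng.normal(size=75), rng.normal(size=75)
        print(np.corrcoef(x, y)[0, 1])                           # ~0.0
        print(np.corrcoef(*drop_unfavourable(x, y, 0.2))[0, 1])  # strongly positive
        print(np.corrcoef(x, best_of_m(x, 4))[0, 1])             # strongly positive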

  16. Formulating face verification with semidefinite programming.

    PubMed

    Yan, Shuicheng; Liu, Jianzhuang; Tang, Xiaoou; Huang, Thomas S

    2007-11-01

    This paper presents a unified solution to three unsolved problems existing in face verification with subspace learning techniques: selection of verification threshold, automatic determination of subspace dimension, and deducing feature fusing weights. In contrast to previous algorithms which search for the projection matrix directly, our new algorithm investigates a similarity metric matrix (SMM). With a certain verification threshold, this matrix is learned by a semidefinite programming approach, along with the constraints of the kindred pairs with similarity larger than the threshold, and inhomogeneous pairs with similarity smaller than the threshold. Then, the subspace dimension and the feature fusing weights are simultaneously inferred from the singular value decomposition of the derived SMM. In addition, the weighted and tensor extensions are proposed to further improve the algorithmic effectiveness and efficiency, respectively. Essentially, the verification is conducted within an affine subspace in this new algorithm and is, hence, called the affine subspace for verification (ASV). Extensive experiments show that the ASV can achieve encouraging face verification accuracy in comparison to other subspace algorithms, even without the need to explore any parameters.

  17. Limitations in learning: How treatment verifications fail and what to do about it?

    PubMed

    Richardson, Susan; Thomadsen, Bruce

    The purposes of this study were to provide dialog on why classic incident-learning systems have been insufficient for patient safety improvement, to discuss failures in treatment verification, and to provide context for the reasons and lessons that can be learned from these failures. Historically, incident learning in brachytherapy has been performed via database mining, which might include reading event reports and incidents and then incorporating verification procedures to prevent similar incidents. A description of both classic event-reporting databases and current incident learning and reporting systems is given. Real examples of treatment failures, based on firsthand knowledge, are presented to evaluate the effectiveness of verification; these failures are described and analyzed by outlining potential pitfalls and problems. Databases and incident learning systems can be limited in value and fail to provide enough detail for physicists seeking process improvement. Four examples of treatment verification failures experienced firsthand by experienced brachytherapy physicists are described, including both under-verification and over-verification of various treatment processes. Database mining is an insufficient method for achieving substantial improvements in the practice of brachytherapy, and new incident learning systems are still immature and being tested. Instead, a new method of shared learning and implementation of changes must be created. Copyright © 2017 American Brachytherapy Society. Published by Elsevier Inc. All rights reserved.

  18. Approaching the investigation of plasma turbulence through a rigorous verification and validation procedure: A practical example

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ricci, P., E-mail: paolo.ricci@epfl.ch; Riva, F.; Theiler, C.

    In the present work, a Verification and Validation procedure is presented and applied, showing through a practical example how it can contribute to advancing our physics understanding of plasma turbulence. Bridging the gap between plasma physics and other scientific domains, in particular the computational fluid dynamics community, a rigorous methodology for the verification of a plasma simulation code is presented, based on the method of manufactured solutions. This methodology assesses that the model equations are correctly solved, within the order of accuracy of the numerical scheme. The technique to carry out a solution verification is described to provide a rigorous estimate of the uncertainty affecting the numerical results. A methodology for plasma turbulence code validation is also discussed, focusing on quantitative assessment of the agreement between experiments and simulations. The Verification and Validation methodology is then applied to the study of plasma turbulence in the basic plasma physics experiment TORPEX [Fasoli et al., Phys. Plasmas 13, 055902 (2006)], considering both two-dimensional and three-dimensional simulations carried out with the GBS code [Ricci et al., Plasma Phys. Controlled Fusion 54, 124047 (2012)]. The validation procedure allows progress in the understanding of the turbulent dynamics in TORPEX, by pinpointing the presence of a turbulent regime transition due to the competition between the resistive and ideal interchange instabilities.

  19. Verification in Referral-Based Crowdsourcing

    PubMed Central

    Naroditskiy, Victor; Rahwan, Iyad; Cebrian, Manuel; Jennings, Nicholas R.

    2012-01-01

    Online social networks offer unprecedented potential for rallying a large number of people to accomplish a given task. Here we focus on information gathering tasks where rare information is sought through “referral-based crowdsourcing”: the information request is propagated recursively through invitations among members of a social network. Whereas previous work analyzed incentives for the referral process in a setting with only correct reports, misreporting is known to be both pervasive in crowdsourcing applications, and difficult/costly to filter out. A motivating example for our work is the DARPA Red Balloon Challenge where the level of misreporting was very high. In order to undertake a formal study of verification, we introduce a model where agents can exert costly effort to perform verification and false reports can be penalized. This is the first model of verification and it provides many directions for future research, which we point out. Our main theoretical result is the compensation scheme that minimizes the cost of retrieving the correct answer. Notably, this optimal compensation scheme coincides with the winning strategy of the Red Balloon Challenge. PMID:23071530

  20. Fuzzy Logic Controller Stability Analysis Using a Satisfiability Modulo Theories Approach

    NASA Technical Reports Server (NTRS)

    Arnett, Timothy; Cook, Brandon; Clark, Matthew A.; Rattan, Kuldip

    2017-01-01

    While many widely accepted methods and techniques exist for the validation and verification of traditional controllers, no solutions have yet been accepted for Fuzzy Logic Controllers (FLCs). Due to the highly nonlinear nature of such systems, and the fact that developing a valid FLC does not require a mathematical model of the system, it is quite difficult to use conventional techniques to prove controller stability. Since safety-critical systems must be tested and verified to work as expected under all possible circumstances, the inability to verify FLCs to such requirements limits the applications of the technology. Therefore, alternative methods for the verification and validation of FLCs need to be explored. In this study, a novel approach using formal verification methods to ensure the stability of an FLC is proposed. The main research challenges include specification of requirements for a complex system, conversion of a traditional FLC to a piecewise polynomial representation, and use of a formal verification tool in a nonlinear solution space. Using the proposed architecture, the Fuzzy Logic Controller was found always to generate negative feedback, but the analysis was inconclusive for Lyapunov stability.
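    As an illustration of the kind of check described (not the authors' actual tool chain), the sketch below uses the Z3 SMT solver's Python bindings on a hypothetical three-piece linear abstraction of a controller: the negative-feedback property e*u(e) <= 0 is verified by showing its negation unsatisfiable over an assumed operating range.

        from z3 import And, If, Real, Solver, unsat

        e = Real('e')
        # Hypothetical piecewise-linear controller output over the error signal.
        u = If(e < -1, -2*e - 1,
               If(e <= 1, -e,
                  -2*e + 1))

        s = Solver()
        s.add(And(e >= -10, e <= 10))   # assumed operating range
        s.add(e * u > 0)                # negation of 'always negative feedback'

        if s.check() == unsat:
            print("negative feedback verified over the range")
        else:
            print("counterexample:", s.model())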

  1. Using hardware-in-the-loop (HWIL) simulation to provide low-cost testing of TMD IR missile systems

    NASA Astrophysics Data System (ADS)

    Buford, James A., Jr.; Paone, Thad

    1998-07-01

    A greater awareness of and increased interest in the use of modeling and simulation (M&S) has been demonstrated at many levels within the Department of Defense (DoD) and all the Armed Services agencies in recent years. M&S application is regarded as a viable means of lowering the life cycle costs of theater missile defense (TMD) weapon system acquisition beginning with studies of new concepts of warfighting through user training and post-deployment support. The Missile Research, Engineering, and Development Center (MRDEC) of the U.S. Army Aviation and Missile Command (AMCOM) has an extensive history of applying all types of M&S to TMD weapon system development and has been a particularly strong advocate of hardware-in-the-loop (HWIL) simulation for many years. Over the past 10 years MRDEC has developed specific and dedicated HWIL capabilities for TMD applications in both the infrared and radio frequency sensor domains. This paper provides an overview of the infrared-based TMD HWIL missile facility known as the Imaging Infrared System Simulation (I2RSS) which is used to support the Theater High Altitude Air Defense (THAAD) missile system. This facility uses M&S to conduct daily THAAD HWIL missile simulations to support flight tests, missile/system development, independent verification and validation of weapon system embedded software and simulations, and missile/system performance against current and future threat environments. This paper describes the THAAD TMD HWIL role, process, major components, HWIL verification/validation, and daily HWIL support areas in terms of both missile and complete system.

  2. Cross-correlation redshift calibration without spectroscopic calibration samples in DES Science Verification Data

    NASA Astrophysics Data System (ADS)

    Davis, C.; Rozo, E.; Roodman, A.; Alarcon, A.; Cawthon, R.; Gatti, M.; Lin, H.; Miquel, R.; Rykoff, E. S.; Troxel, M. A.; Vielzeuf, P.; Abbott, T. M. C.; Abdalla, F. B.; Allam, S.; Annis, J.; Bechtol, K.; Benoit-Lévy, A.; Bertin, E.; Brooks, D.; Buckley-Geer, E.; Burke, D. L.; Carnero Rosell, A.; Carrasco Kind, M.; Carretero, J.; Castander, F. J.; Crocce, M.; Cunha, C. E.; D'Andrea, C. B.; da Costa, L. N.; Desai, S.; Diehl, H. T.; Doel, P.; Drlica-Wagner, A.; Fausti Neto, A.; Flaugher, B.; Fosalba, P.; Frieman, J.; García-Bellido, J.; Gaztanaga, E.; Gerdes, D. W.; Giannantonio, T.; Gruen, D.; Gruendl, R. A.; Gutierrez, G.; Honscheid, K.; Jain, B.; James, D. J.; Jeltema, T.; Krause, E.; Kuehn, K.; Kuhlmann, S.; Kuropatkin, N.; Lahav, O.; Li, T. S.; Lima, M.; March, M.; Marshall, J. L.; Martini, P.; Melchior, P.; Ogando, R. L. C.; Plazas, A. A.; Romer, A. K.; Sanchez, E.; Scarpine, V.; Schindler, R.; Schubnell, M.; Sevilla-Noarbe, I.; Smith, M.; Soares-Santos, M.; Sobreira, F.; Suchyta, E.; Swanson, M. E. C.; Tarle, G.; Thomas, D.; Vikram, V.; Walker, A. R.; Wechsler, R. H.

    2018-06-01

    Galaxy cross-correlations with high-fidelity redshift samples hold the potential to precisely calibrate systematic photometric redshift uncertainties arising from the unavailability of complete and representative training and validation samples of galaxies. However, application of this technique in the Dark Energy Survey (DES) is hampered by the relatively low number density, small area, and modest redshift overlap between photometric and spectroscopic samples. We propose instead using photometric catalogues with reliable photometric redshifts for photo-z calibration via cross-correlations. We verify the viability of our proposal using redMaPPer clusters from the Sloan Digital Sky Survey (SDSS) to successfully recover the redshift distribution of SDSS spectroscopic galaxies. We demonstrate how to combine photo-z with cross-correlation data to calibrate photometric redshift biases while marginalizing over possible clustering bias evolution in either the calibration or unknown photometric samples. We apply our method to DES Science Verification (DES SV) data in order to constrain the photometric redshift distribution of a galaxy sample selected for weak lensing studies, constraining the mean of the tomographic redshift distributions to a statistical uncertainty of Δz ˜ ±0.01. We forecast that our proposal can, in principle, control photometric redshift uncertainties in DES weak lensing experiments at a level near the intrinsic statistical noise of the experiment over the range of redshifts where redMaPPer clusters are available. Our results provide strong motivation to launch a programme to fully characterize the systematic errors from bias evolution and photo-z shapes in our calibration procedure.
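    The core estimator behind such clustering-redshift methods is compact. The Python/NumPy sketch below shows the textbook (Newman-style) form, in which the unknown sample's redshift distribution is proportional to its cross-correlation with reference bins, normalized by the reference autocorrelation; all inputs are invented, and the sketch deliberately ignores the clustering-bias evolution that the paper marginalizes over.

        import numpy as np

        def clustering_nz(z, w_ur, w_rr):
            """n(z) of the unknown sample from the angular cross-correlation
            w_ur(z) with reference bins and the reference autocorrelation w_rr(z)."""
            nz = w_ur / np.sqrt(w_rr)
            return nz / np.trapz(nz, z)

        z = np.linspace(0.1, 0.9, 9)                      # reference bin centres
        w_rr = np.full_like(z, 0.05)                      # invented measurements
        w_ur = 0.02 * np.exp(-0.5 * ((z - 0.5) / 0.12) ** 2)
        print(np.round(clustering_nz(z, w_ur, w_rr), 3))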

  3. Cross-correlation redshift calibration without spectroscopic calibration samples in DES Science Verification Data

    DOE PAGES

    Davis, C.; Rozo, E.; Roodman, A.; ...

    2018-03-26

    Galaxy cross-correlations with high-fidelity redshift samples hold the potential to precisely calibrate systematic photometric redshift uncertainties arising from the unavailability of complete and representative training and validation samples of galaxies. However, application of this technique in the Dark Energy Survey (DES) is hampered by the relatively low number density, small area, and modest redshift overlap between photometric and spectroscopic samples. We propose instead using photometric catalogs with reliable photometric redshifts for photo-z calibration via cross-correlations. We verify the viability of our proposal using redMaPPer clusters from the Sloan Digital Sky Survey (SDSS) to successfully recover the redshift distribution of SDSS spectroscopic galaxies. We demonstrate how to combine photo-z with cross-correlation data to calibrate photometric redshift biases while marginalizing over possible clustering bias evolution in either the calibration or unknown photometric samples. We apply our method to DES Science Verification (DES SV) data in order to constrain the photometric redshift distribution of a galaxy sample selected for weak lensing studies, constraining the mean of the tomographic redshift distributions to a statistical uncertainty of Δz ~ ±0.01. We forecast that our proposal can in principle control photometric redshift uncertainties in DES weak lensing experiments at a level near the intrinsic statistical noise of the experiment over the range of redshifts where redMaPPer clusters are available. Our results provide strong motivation to launch a program to fully characterize the systematic errors from bias evolution and photo-z shapes in our calibration procedure.

  4. Evaluation of WRF-based convection-permitting multi-physics ensemble forecasts over China for an extreme rainfall event on 21 July 2012 in Beijing

    NASA Astrophysics Data System (ADS)

    Zhu, Kefeng; Xue, Ming

    2016-11-01

    On 21 July 2012, an extreme rainfall event with a maximum 24-hour accumulation of 460 mm occurred in Beijing, China. Most operational models failed to predict such an extreme amount. In this study, a convection-permitting ensemble forecast system (CEFS), at 4-km grid spacing and covering the entire mainland of China, is applied to this extreme rainfall case. CEFS consists of 22 members and uses multiple physics parameterizations. For this event, the predicted maximum is 415 mm per day in the probability-matched ensemble mean, and the predicted high-probability heavy-rain region is located in southwest Beijing, as observed. Ensemble-based verification scores are then investigated. For a small verification domain covering Beijing and its surrounding areas, the precipitation rank histogram of CEFS is much flatter than that of a reference global ensemble. CEFS has a lower Brier score and a higher resolution than the global ensemble for precipitation, indicating more reliable probabilistic forecasting by CEFS. Additionally, forecasts of different ensemble members are compared and discussed. Most of the extreme rainfall comes from convection in the warm sector east of an approaching cold front. A few members of CEFS successfully reproduce such precipitation, and orographic lift of highly moist low-level flows with a significant southeasterly component is suggested to have played an important role in producing the initial convection. Comparisons between good and bad forecast members indicate a strong sensitivity of the extreme rainfall to the mesoscale environmental conditions and, to a lesser extent, to the model physics.
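    The two verification scores named here are straightforward to compute. The Python/NumPy sketch below evaluates a Brier score for a precipitation-exceedance threshold and builds an ensemble rank histogram; the 22-member forecast data are synthetic stand-ins, not the CEFS output.

        import numpy as np

        def brier_score(p_forecast, occurred):
            """Mean squared difference between forecast probability and outcome."""
            return np.mean((p_forecast - occurred) ** 2)

        def rank_histogram(ensemble, obs):
            """Rank of each observation within its sorted ensemble; a flat
            histogram indicates a statistically reliable ensemble."""
            ranks = (ensemble < obs[:, None]).sum(axis=1)
            return np.bincount(ranks, minlength=ensemble.shape[1] + 1)

        rng = np.random.default_rng(6)
        ens = rng.gamma(2.0, 12.0, size=(1000, 22))   # synthetic 22-member rainfall
        obs = rng.gamma(2.0, 12.0, size=1000)         # synthetic observations

        p50 = (ens > 50.0).mean(axis=1)               # P(rain > 50 mm/day)
        print(brier_score(p50, (obs > 50.0).astype(float)))
        print(rank_histogram(ens, obs))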

  5. Cross-correlation redshift calibration without spectroscopic calibration samples in DES Science Verification Data

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Davis, C.; Rozo, E.; Roodman, A.

    Galaxy cross-correlations with high-fidelity redshift samples hold the potential to precisely calibrate systematic photometric redshift uncertainties arising from the unavailability of complete and representative training and validation samples of galaxies. However, application of this technique in the Dark Energy Survey (DES) is hampered by the relatively low number density, small area, and modest redshift overlap between photometric and spectroscopic samples. We propose instead using photometric catalogs with reliable photometric redshifts for photo-z calibration via cross-correlations. We verify the viability of our proposal using redMaPPer clusters from the Sloan Digital Sky Survey (SDSS) to successfully recover the redshift distribution of SDSS spectroscopic galaxies. We demonstrate how to combine photo-z with cross-correlation data to calibrate photometric redshift biases while marginalizing over possible clustering bias evolution in either the calibration or unknown photometric samples. We apply our method to DES Science Verification (DES SV) data in order to constrain the photometric redshift distribution of a galaxy sample selected for weak lensing studies, constraining the mean of the tomographic redshift distributions to a statistical uncertainty of Δz ~ ±0.01. We forecast that our proposal can in principle control photometric redshift uncertainties in DES weak lensing experiments at a level near the intrinsic statistical noise of the experiment over the range of redshifts where redMaPPer clusters are available. Our results provide strong motivation to launch a program to fully characterize the systematic errors from bias evolution and photo-z shapes in our calibration procedure.

  6. CFD modeling and experimental verification of oscillating flow and heat transfer processes in the micro coaxial Stirling-type pulse tube cryocooler operating at 90-170 Hz

    NASA Astrophysics Data System (ADS)

    Zhao, Yibo; Yu, Guorui; Tan, Jun; Mao, Xiaochen; Li, Jiaqi; Zha, Rui; Li, Ning; Dang, Haizheng

    2018-03-01

    This paper presents CFD modeling and experimental verification of the oscillating flow and heat transfer processes in a micro coaxial Stirling-type pulse tube cryocooler (MCSPTC) operating at 90-170 Hz. It uses neither double-inlet nor multi-bypass, leaving an inertance tube with a gas reservoir as the only phase shifter. The effects of frequency on the flow and heat transfer processes in the pulse tube are investigated. The results indicate that a frequency that is too low leads to strong mixing between warm and cold fluids, significantly deteriorating the cooling performance, whereas a frequency that is too high produces downward-sloping streams flowing from the warm end toward the axis and almost puncturing the gas displacer from the warm end, thereby creating larger radial temperature gradients and undermining the cooling performance. The influence of the pulse tube length on the temperature and velocity at frequencies much higher than the optimum is also discussed. An MCSPTC with an overall mass of 1.1 kg was built and tested. With an input electric power of 59 W and operating at 144 Hz, it achieved a no-load temperature of 61.4 K and a cooling capacity of 1.0 W at 77 K. The measured trends are in good agreement with the simulations. These studies will help in thoroughly understanding the underlying mechanisms of inertance-type MCSPTCs operating at very high frequencies.

  7. Comparative study of the swabbing properties of seven commercially available swab materials for cleaning verification.

    PubMed

    Corrigan, Damion K; Piletsky, Sergey; McCrossen, Sean

    2009-01-01

    This article compares the technical performance of seven commercially available swab materials for the purpose of cleaning verification. A steel surface was soiled with solutions of acetaminophen, nicotinic acid, diclofenac, and benzamidine and wiped with each swabbing material. The compounds were extracted with water or ethanol (depending on the polarity of the analyte) and their concentrations in the extracts were quantified spectrophotometrically. The study also investigated swab debris left on the wiped surface. The swab performances were compared and the best swab material was identified.

  8. Potential capabilities of Reynolds stress turbulence model in the COMMIX-RSM code

    NASA Technical Reports Server (NTRS)

    Chang, F. C.; Bottoni, M.

    1994-01-01

    A Reynolds stress turbulence model has been implemented in the COMMIX code, together with transport equations describing turbulent heat fluxes, the variance of temperature fluctuations, and the dissipation of turbulence kinetic energy. The model has been partially verified by simulating homogeneous turbulent shear flow and stably and unstably stratified shear flows in which buoyancy strongly suppresses or enhances turbulence. This article outlines the model, explains the verifications performed thus far, and discusses potential applications of the COMMIX-RSM code in several domains, including, but not limited to, analysis of thermal striping in engineering systems, simulation of turbulence in combustors, and prediction of bubbly and particulate flows.

  9. Sequence verification as quality-control step for production of cDNA microarrays.

    PubMed

    Taylor, E; Cogdell, D; Coombes, K; Hu, L; Ramdas, L; Tabor, A; Hamilton, S; Zhang, W

    2001-07-01

    To generate cDNA arrays in our core laboratory, we amplified about 2300 PCR products from a human, sequence-verified cDNA clone library. As a quality-control step, we sequenced the PCR products immediately before printing. The sequence information was used to search the GenBank database to confirm the identities. Although these clones were previously sequence verified by the company, we found that only 79% of the clones matched the original database after handling. Our experience strongly indicates the necessity to sequence verify the clones at the final stage before printing on microarray slides and to modify the gene list accordingly.
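
    The sketch below gives a toy version of such a post-handling identity check. The authors searched GenBank; here only a naive ungapped percent-identity comparison against the expected clone sequence is shown, with made-up sequences and a hypothetical 95% threshold.

        def percent_identity(seq_a: str, seq_b: str) -> float:
            """Naive ungapped percent identity over the shorter sequence."""
            n = min(len(seq_a), len(seq_b))
            matches = sum(1 for x, y in zip(seq_a[:n].upper(), seq_b[:n].upper()) if x == y)
            return 100.0 * matches / n

        def clone_verified(read: str, expected: str, threshold: float = 95.0) -> bool:
            """Flag a clone as verified when the read matches the expected sequence."""
            return percent_identity(read, expected) >= threshold

        # Hypothetical reads vs. expected clone sequences.
        print(clone_verified("ATGGCCATTGTAATGGGCCGC", "ATGGCCATTGTAATGGGCCGC"))  # True
        print(clone_verified("ATGGCCATTGTAATGGGCCGC", "ATGGCATTTGTAATGCGCCGA"))  # False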

  10. Initial Ship Design Using a Pearson Correlation Coefficient and Artificial Intelligence Techniques

    NASA Astrophysics Data System (ADS)

    Moon, Byung Young; Kim, Soo Young; Kang, Gyung Ju

    In this paper we analyzed the correlation between ships' geometric characteristics and their resistance and effective horsepower using the Pearson correlation coefficient, a common data-mining method. Geometric characteristics with strong correlations to the output data were then used as input data, and effective horsepower and resistance were calculated with a Neuro-Fuzzy system. To verify the calculation, data from 9 of 11 container ships were used to train the Neuro-Fuzzy system and the remaining ships served as verification data. After analyzing the error rate between the existing data and the calculated data, we concluded that the calculated data agree well with the existing data.
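
    A minimal sketch of the screening step described above: compute the Pearson correlation coefficient of each candidate hull parameter against the target quantity and keep only strongly correlated parameters as Neuro-Fuzzy inputs. The parameter names, synthetic data, and the 0.5 threshold are all assumptions for illustration.

        import numpy as np

        rng = np.random.default_rng(0)
        X = rng.normal(size=(11, 4))  # 11 ships x 4 hypothetical hull parameters
        ehp = 2.0 * X[:, 0] + 0.5 * X[:, 2] + rng.normal(scale=0.1, size=11)  # EHP proxy

        names = ["length", "beam", "draft", "block_coeff"]
        selected = []
        for j, name in enumerate(names):
            r = np.corrcoef(X[:, j], ehp)[0, 1]  # Pearson correlation coefficient
            print(f"{name:12s} r = {r:+.2f}")
            if abs(r) > 0.5:
                selected.append(name)
        print("selected Neuro-Fuzzy inputs:", selected)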

  11. Intelligent content fitting for digital publishing

    NASA Astrophysics Data System (ADS)

    Lin, Xiaofan

    2006-02-01

    One recurring problem in Variable Data Printing (VDP) is that existing content cannot satisfy the VDP task as-is, so there is a strong need for content-fitting technologies to support high-value digital publishing applications, in which text and images are the two major content types. This paper presents the meta-Autocrop framework for image fitting and the TextFlex technology for text fitting. The meta-Autocrop framework supports multiple modes: fixed aspect-ratio mode, advice mode, and verification mode. The TextFlex technology supports non-rectangular text wrapping and paragraph-based line breaking. We also demonstrate how these content-fitting technologies are utilized in the overall automated composition and layout system.

  12. Verification Testing of Air Pollution Control Technology Quality Management Plan Revision 2.3

    EPA Pesticide Factsheets

    The Air Pollution Control Technology Verification Center was established in 1995 as part of the EPA's Environmental Technology Verification Program to accelerate the development and commercialization of improved environmental technologies through third-party verification and reporting of their performance.

  13. National Centers for Environmental Prediction

    Science.gov Websites

    EMC NAM pages covering operational and parallel/experimental model forecast graphics, with operational and parallel verification and diagnostics.

  14. ENVIRONMENTAL TECHNOLOGY VERIFICATION PROGRAM FOR MONITORING AND CHARACTERIZATION

    EPA Science Inventory

    The Environmental Technology Verification Program is a service of the Environmental Protection Agency designed to accelerate the development and commercialization of improved environmental technology through third party verification and reporting of performance. The goal of ETV i...

  15. Magnetic cleanliness verification approach on tethered satellite

    NASA Technical Reports Server (NTRS)

    Messidoro, Piero; Braghin, Massimo; Grande, Maurizio

    1990-01-01

    Magnetic cleanliness testing was performed on the Tethered Satellite as the last step of an articulated verification campaign aimed at demonstrating the capability of the satellite to support its TEMAG (TEthered MAgnetometer) experiment. Tests at unit level and analytical predictions/correlations using a dedicated mathematical model (GANEW program) are also part of the verification activities. Details of the tests are presented, and the results of the verification are described together with recommendations for later programs.

  16. Verification and quality control of routine hematology analyzers.

    PubMed

    Vis, J Y; Huisman, A

    2016-05-01

    Verification of hematology analyzers (automated blood cell counters) is mandatory before new hematology analyzers may be used in routine clinical care. The verification process consists of several items, comprising among others precision, accuracy, comparability, carryover, background, and linearity throughout the expected range of results. Yet which standard should be met, or which verification limit should be used, is at the discretion of the laboratory specialist. This paper offers practical guidance on verification and quality control of automated hematology analyzers and provides an expert opinion on the performance standard that should be met by the contemporary generation of hematology analyzers. Therefore (i) the state-of-the-art performance of hematology analyzers for complete blood count parameters is summarized, (ii) considerations, challenges, and pitfalls concerning the development of a verification plan are discussed, (iii) guidance is given regarding the establishment of reference intervals, and (iv) different methods of quality control of hematology analyzers are reviewed. © 2016 John Wiley & Sons Ltd.
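
    A minimal sketch of how a few of the listed verification items can be quantified from replicate measurements. The formulas follow common laboratory conventions (CV% for precision, mean percent bias for accuracy, a three-high/three-low carryover estimate); the WBC values are invented, and none of this reflects a specific guideline endorsed by the paper.

        import numpy as np

        def cv_percent(replicates):
            """Within-run precision as coefficient of variation (%)."""
            x = np.asarray(replicates, float)
            return 100.0 * x.std(ddof=1) / x.mean()

        def bias_percent(measured, reference):
            """Accuracy as mean percent bias against reference values."""
            m, r = np.asarray(measured, float), np.asarray(reference, float)
            return 100.0 * np.mean((m - r) / r)

        def carryover_percent(high, low):
            """Carryover from three high samples followed by three low samples."""
            h, l = np.asarray(high, float), np.asarray(low, float)
            return 100.0 * (l[0] - l[2]) / (h[2] - l[2])

        # Hypothetical WBC counts (10^9/L).
        print(f"precision CV: {cv_percent([6.1, 6.0, 6.2, 6.1, 6.0]):.1f}%")
        print(f"bias: {bias_percent([6.1, 8.9], [6.0, 9.0]):+.1f}%")
        print(f"carryover: {carryover_percent([95.0, 94.8, 95.2], [0.30, 0.26, 0.25]):.2f}%")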

  17. Satellite Power System (SPS) concept definition study (exhibit C)

    NASA Technical Reports Server (NTRS)

    Haley, G. M.

    1979-01-01

    The major outputs of the study are the constructability studies, which defined the concepts for satellite, rectenna, and satellite construction base construction. Transportation analyses defined the heavy-lift launch vehicle, electric orbit transfer vehicle, personnel orbit transfer vehicle, and intra-orbit transfer vehicle, as well as overall operations related to transportation systems. The experiment/verification program definition produced the elements of the Ground-Based Experimental Research and Key Technology plans, and these studies also yielded conceptual approaches for early space technology verification. The cost analysis defined the overall program and cost data for all program elements and phases.

  18. SU-E-T-49: A Multi-Institutional Study of Independent Dose Verification for IMRT

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Baba, H; Tachibana, H; Kamima, T

    2015-06-15

    Purpose: AAPM TG-114 does not cover independent verification for IMRT. We conducted a study of independent dose verification for IMRT in seven institutes to show its feasibility. Methods: 384 IMRT plans for prostate and head-and-neck (HN) sites were collected from the institutes, where planning was performed in Eclipse and Pinnacle3 with the step-and-shoot (S&S) and sliding-window (SW) techniques. All of the institutes used the same independent dose verification software program (Simple MU Analysis: SMU, Triangle Product, Ishikawa, JP), which is Clarkson-based and uses CT images to compute the radiological path length. An ion-chamber measurement in a water-equivalent slab phantom was performed to compare the doses computed by the TPS and by the independent dose verification program. Additionally, the agreement between doses computed in patient CT images by the TPS and by the SMU was assessed. The dose of the composite beams in each plan was evaluated. Results: Agreement between the measurement and the SMU was −2.3±1.9% and −5.6±3.6% for the prostate and HN sites, respectively. Agreement between the TPSs and the SMU was −2.1±1.9% and −3.0±3.7% for the prostate and HN sites, respectively. There was a negative systematic difference with similar standard deviation, and the difference was larger for the HN site. The S&S technique showed a statistically significant difference from the SW technique, because the Clarkson-based method in the independent program cannot account for, and therefore underestimates, the dose under the MLC. Conclusion: Accuracy would improve if the Clarkson-based algorithm were modified for IMRT; the tolerance level would then be within 5%.
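
    A small sketch of the agreement statistic used above: per-plan percent deviation of the independent calculation from the TPS, summarized as mean ± SD and compared with a tolerance in the |mean| + 2SD sense. The dose values are invented for illustration.

        import numpy as np

        def percent_deviation(independent, tps):
            """Percent dose deviation of the independent check against the TPS."""
            ind, ref = np.asarray(independent, float), np.asarray(tps, float)
            return 100.0 * (ind - ref) / ref

        tps = np.array([2.00, 2.01, 1.98, 2.02, 1.99])  # hypothetical plan doses, Gy
        smu = np.array([1.95, 1.97, 1.93, 1.99, 1.94])

        dev = percent_deviation(smu, tps)
        mean, sd = dev.mean(), dev.std(ddof=1)
        print(f"deviation: {mean:+.1f} +/- {sd:.1f} %")
        print("within 5% tolerance:", abs(mean) + 2.0 * sd <= 5.0)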

  19. Verification of endotracheal intubation in obese patients - temporal comparison of ultrasound vs. auscultation and capnography.

    PubMed

    Pfeiffer, P; Bache, S; Isbye, D L; Rudolph, S S; Rovsing, L; Børglum, J

    2012-05-01

    Ultrasound (US) may have an emerging role as an adjunct in verification of endotracheal intubation. Obtaining optimal US images in obese patients is generally regarded as more difficult than in other patients. This study compared the time consumption of bilateral lung US with auscultation and capnography for verifying endotracheal intubation in obese patients. It was a prospective, paired, investigator-blinded study performed in the operating theatre. Twenty-four adult patients requiring endotracheal intubation for bariatric surgery were included. During post-intubation bag ventilation, bilateral lung US was performed to detect lung sliding indicating lung ventilation, simultaneously with capnography and auscultation of the epigastrium and chest. The primary outcome measure was the time difference to confirmed endotracheal intubation between US and auscultation alone; the secondary outcome measure was the time difference between US and auscultation combined with capnography. Both methods verified endotracheal tube placement in all patients. No significant difference was found between US and auscultation alone: median time for verification by auscultation alone was 47.5 s [interquartile range (IQR) 40-51 s], with a mean difference of -0.3 s in favor of US (95% confidence interval -3.5 to 2.9 s), P = 0.87. Comparing US with the combination of auscultation and capnography, there was a significant difference between the two methods: median time for verification by US was 43 s (IQR 40-51 s) vs. 55 s (IQR 46-65 s), P < 0.0001. In obese patients, verification of endotracheal tube placement with US is as fast as auscultation alone and faster than the standard method of auscultation and capnography. © 2012 The Authors. Acta Anaesthesiologica Scandinavica © 2012 The Acta Anaesthesiologica Scandinavica Foundation.
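
    For readers reproducing this kind of paired timing comparison, the sketch below summarizes two sets of per-patient verification times as median (IQR) and applies a Wilcoxon signed-rank test. The times are invented and the test choice is an assumption; the abstract does not state the exact statistical procedure used.

        import numpy as np
        from scipy import stats

        # Hypothetical paired verification times (s): US vs. auscultation+capnography.
        us = np.array([43, 41, 50, 40, 46, 44, 51, 42])
        ausc_capno = np.array([55, 49, 64, 46, 60, 52, 65, 54])

        for name, x in (("US", us), ("ausc+capno", ausc_capno)):
            q1, med, q3 = np.percentile(x, [25, 50, 75])
            print(f"{name:10s} median {med:.0f} s (IQR {q1:.0f}-{q3:.0f} s)")

        stat, p = stats.wilcoxon(us, ausc_capno)  # paired nonparametric comparison
        print(f"Wilcoxon signed-rank: W={stat:.0f}, p={p:.4f}")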

  20. Comparison between In-house developed and Diamond commercial software for patient specific independent monitor unit calculation and verification with heterogeneity corrections.

    PubMed

    Kuppusamy, Vijayalakshmi; Nagarajan, Vivekanandan; Jeevanandam, Prakash; Murugan, Lavanya

    2016-02-01

    The study aimed to compare two different monitor unit (MU) or dose verification software packages for volumetric modulated arc therapy (VMAT), both using a modified Clarkson's integration technique for 6 MV photon beams. An in-house Excel-spreadsheet-based monitor unit verification calculation (MUVC) program and PTW's DIAMOND secondary check software (SCS), version 6, were used as secondary checks to verify the MU or dose calculated by the treatment planning system (TPS). In this study 180 patients were grouped into 61 head-and-neck, 39 thorax, and 80 pelvic sites. Verification plans were created using the PTW OCTAVIUS-4D phantom and measured using the 729 ion-chamber array, with the isocentre as the point of measurement for each field. In the analysis of 154 clinically approved VMAT plans with the isocentre in a region above -350 HU, using heterogeneity corrections, the in-house spreadsheet-based MUVC program and DIAMOND SCS showed good agreement with the TPS. The overall average percentage deviations for all sites were (-0.93% ± 1.59%) and (1.37% ± 2.72%) for the in-house MUVC program and DIAMOND SCS, respectively. The 26 clinically approved VMAT plans with the isocentre in a region below -350 HU showed higher variations for both programs. It can be concluded that for patient-specific quality assurance (QA), the in-house spreadsheet-based MUVC program and DIAMOND SCS can be used as a simple and fast complement to measurement-based verification for plans with the isocentre in a region above -350 HU. Copyright © 2016 Associazione Italiana di Fisica Medica. Published by Elsevier Ltd. All rights reserved.
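
    The core of a Clarkson-style check is sector integration: divide the field around the calculation point into angular sectors, look up a scatter quantity at each sector's field-edge radius, and average. The sketch below shows that step only, with an invented scatter-air-ratio table and field shape; it is not the authors' spreadsheet implementation.

        import numpy as np

        # Hypothetical SAR(radius) table at a fixed depth (illustration only).
        SAR_RADII = np.array([0.0, 2.0, 4.0, 6.0, 8.0, 10.0])        # cm
        SAR_VALUES = np.array([0.0, 0.10, 0.18, 0.24, 0.28, 0.31])

        def clarkson_mean_sar(edge_radii_cm):
            """Mean scatter-air ratio over per-sector field-edge radii."""
            r = np.asarray(edge_radii_cm, float)
            return np.interp(r, SAR_RADII, SAR_VALUES).mean()

        # 36 sectors of 10 degrees describing an irregular field edge.
        edge_radii = 4.0 + 1.5 * np.sin(np.linspace(0.0, 2.0 * np.pi, 36, endpoint=False))
        sar_bar = clarkson_mean_sar(edge_radii)

        TAR0 = 0.75  # hypothetical zero-area tissue-air ratio at the same depth
        print(f"mean SAR = {sar_bar:.3f}, effective TAR = {TAR0 + sar_bar:.3f}")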

  1. Simulation verification techniques study. Task report 4: Simulation module performance parameters and performance standards

    NASA Technical Reports Server (NTRS)

    1974-01-01

    Shuttle simulation software modules in the environment, crew station, vehicle configuration and vehicle dynamics categories are discussed. For each software module covered, a description of the module functions and operational modes, its interfaces with other modules, its stored data, inputs, performance parameters and critical performance parameters is given. Reference data sources which provide standards of performance are identified for each module. Performance verification methods are also discussed briefly.

  2. Analysis of Nerve Agent Metabolites from Hair for Long-Term Verification of Nerve Agent Exposure

    DTIC Science & Technology

    2016-05-09

    Analysis of Nerve Agent Metabolites from Hair for Long-Term Verification of Nerve Agent Exposure. Appel, Amanda S.; McDonough, John H.; Joseph D. ... feasible. In this study, hair was evaluated as a long-term repository of nerve agent hydrolysis products. Pinacolyl methylphosphonic acid (PMPA; hydrolysis product of soman) and isopropyl methylphosphonic acid (IMPA; hydrolysis product of sarin) were extracted from hair samples with N,N

  3. Expert system verification and validation study. Phase 2: Requirements Identification. Delivery 2: Current requirements applicability

    NASA Technical Reports Server (NTRS)

    1991-01-01

    The second phase of a task is described which has the ultimate purpose of ensuring that adequate Expert Systems (ESs) Verification and Validation (V and V) tools and techniques are available for Space Station Freedom Program Knowledge Based Systems development. The purpose of this phase is to recommend modifications to current software V and V requirements which will extend the applicability of the requirements to NASA ESs.

  4. Field Verification Program (Aquatic Disposal). Bioenergetic Effects of Black Rock Harbor Dredged Material on the Polychaete Nephtys incisa: A Field Verification.

    DTIC Science & Technology

    1988-03-01

    observed in the laboratory, and to determine the degree of correlation between the bioaccumulation of contaminants and bioenergetic responses ... toxicity of liquid, suspended particulate, and solid phases; (c) estimating the potential contaminant bioaccumulation; and (d) describing the initial ... bioaccumulation of dredged material contaminants with biological responses from laboratory and field exposure to dredged material. However, this study

  5. The monitoring and verification of nuclear weapons

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Garwin, Richard L., E-mail: RLG2@us.ibm.com

    2014-05-09

    This paper partially reviews and updates the potential for monitoring and verification of nuclear weapons, including verification of their destruction. Cooperative monitoring with templates of the gamma-ray spectrum is an important tool, dependent on the use of information barriers.

  6. Effect of verification cores on tip capacity of drilled shafts.

    DOT National Transportation Integrated Search

    2009-02-01

    This research addressed two key issues: 1) Will verification core holes fill during concrete backfilling? If so, what are the mechanical properties of the filling material? In dry conditions, verification core holes always completely fill with c...

  7. ENVIRONMENTAL TECHNOLOGY VERIFICATION: ADD-ON NOX CONTROLS

    EPA Science Inventory

    The paper discusses the environmental technology verification (ETV) of add-on nitrogen oxide (NOx) controls. Research Triangle Institute (RTI) is EPA's cooperating partner for the Air Pollution Control Technology (APCT) Program, one of a dozen ETV pilot programs. Verification of ...

  8. Certification and verification for Northrup model NSC-01-0732 fresnel lens concentrating solar collector

    NASA Technical Reports Server (NTRS)

    1979-01-01

    Structural analysis and certification of the collector system is presented. System verification against the interim performance criteria is presented and indicated by matrices. The verification discussion, analysis, and test results are also given.

  9. Reducing software security risk through an integrated approach research initiative model based verification of the Secure Socket Layer (SSL) Protocol

    NASA Technical Reports Server (NTRS)

    Powell, John D.

    2003-01-01

    This document discusses the verification of the Secure Socket Layer (SSL) communication protocol as a demonstration of the Model Based Verification (MBV) portion of the verification instrument set being developed under the Reducing Software Security Risk (RSSR) Through an Integrated Approach research initiative. Code Q of the National Aeronautics and Space Administration (NASA) funds this project. The NASA Goddard Independent Verification and Validation (IV&V) facility manages this research program at the NASA agency level, and the Assurance Technology Program Office (ATPO) manages the research locally at the Jet Propulsion Laboratory (California Institute of Technology), where the research is being carried out.

  10. Verification of a Viscous Computational Aeroacoustics Code using External Verification Analysis

    NASA Technical Reports Server (NTRS)

    Ingraham, Daniel; Hixon, Ray

    2015-01-01

    The External Verification Analysis approach to code verification is extended to solve the three-dimensional Navier-Stokes equations with constant properties, and is used to verify a high-order computational aeroacoustics (CAA) code. After a brief review of the relevant literature, the details of the EVA approach are presented and compared to the similar Method of Manufactured Solutions (MMS). Pseudocode representations of EVA's algorithms are included, along with the recurrence relations needed to construct the EVA solution. The code verification results show that EVA was able to convincingly verify a high-order, viscous CAA code without the addition of MMS-style source terms, or any other modifications to the code.
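
    Code verification of this kind typically ends in an observed-order-of-accuracy check: errors against the exact (EVA or manufactured) solution on successively refined grids should decay at the scheme's formal order. A minimal sketch, with invented error values:

        import numpy as np

        def observed_order(e_coarse, e_fine, refinement=2.0):
            """Observed order of accuracy from errors on grids h and h/refinement."""
            return np.log(e_coarse / e_fine) / np.log(refinement)

        errors = [1.6e-3, 1.1e-4, 7.2e-6]  # hypothetical errors, grid spacing halved each time
        for e1, e2 in zip(errors, errors[1:]):
            print(f"observed order ~ {observed_order(e1, e2):.2f}")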

  12. SU-E-T-602: Patient-Specific Online Dose Verification Based On Transmission Detector Measurements

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Thoelking, J; Yuvaraj, S; Jens, F

    Purpose: Intensity-modulated radiotherapy requires a comprehensive quality assurance program in general and, ideally, independent verification of dose delivery. Since conventional 2D detector arrays allow only pre-treatment verification, there is a debate concerning the need for online dose verification. This study presents the clinical performance, including dosimetric plan verification in 2D as well as in 3D, and the error-detection abilities of a new transmission detector (TD) for online dose verification of a 6 MV photon beam. Methods: To validate the dosimetric performance of the new device, dose reconstructions based on TD measurements were compared to a conventional pre-treatment verification method (reference) and to the treatment planning system (TPS) for 18 IMRT and VMAT treatment plans. Furthermore, dose reconstruction inside the patient based on TD read-out was evaluated by comparing various dose-volume indices and 3D gamma evaluations against independent dose computation and the TPS. To investigate the sensitivity of the new device, different types of systematic and random errors in leaf positions and linac output were introduced into IMRT treatment sequences. Results: The 2D gamma index evaluation of the transmission-detector-based dose reconstruction showed excellent agreement for all IMRT and VMAT plans compared with the reference measurements (99.3±1.2)% and the TPS (99.1±0.7)%. Good agreement was also obtained for 3D dose reconstruction based on TD read-out compared with dose computation (mean gamma value of PTV = 0.27±0.04). Only a minimal dose underestimation within the target volume was observed when analyzing DVH indices (<1%). Positional errors in leaf banks larger than 1 mm and errors in linac output larger than 2% could be clearly identified with the TD. Conclusion: Since the 2D and 3D evaluations for all IMRT and VMAT treatment plans were in excellent agreement with reference measurements and dose computation, the new TD is suitable for routine treatment plan verification. Funding Support, Disclosures, and Conflict of Interest: COIs: Frank Lohr: Elekta: research grant, travel grants, teaching honoraria; IBA: research grant, travel grants, teaching honoraria, advisory board; C-Rad: board honoraria, travel grants. Frederik Wenz: Elekta: research grant, teaching honoraria, consultant, advisory board; Zeiss: research grant, teaching honoraria, patent. Hansjoerg Wertz: Elekta: research grant, teaching honoraria; IBA: research grant.
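
    The 2D gamma evaluation quoted above combines a dose-difference and a distance-to-agreement criterion. A brute-force sketch of a global gamma pass rate between two dose planes follows; grids, criteria handling, and normalization are simplified assumptions, not the detector vendor's algorithm.

        import numpy as np

        def gamma_pass_rate(ref, ev, spacing_mm, dta_mm=3.0, dd_percent=3.0):
            """Global 2D gamma pass rate (brute force) of ev against ref."""
            ny, nx = ref.shape
            yy, xx = np.meshgrid(np.arange(ny), np.arange(nx), indexing="ij")
            dd = dd_percent / 100.0 * ref.max()  # global dose-difference criterion
            gamma = np.empty_like(ref)
            for j in range(ny):
                for i in range(nx):
                    dist2 = ((yy - j) ** 2 + (xx - i) ** 2) * spacing_mm ** 2
                    dose2 = (ev - ref[j, i]) ** 2
                    gamma[j, i] = np.sqrt(np.min(dist2 / dta_mm ** 2 + dose2 / dd ** 2))
            return 100.0 * np.mean(gamma <= 1.0)

        # Hypothetical dose planes: a Gaussian spot, slightly shifted and scaled.
        y, x = np.mgrid[0:40, 0:40]
        ref = np.exp(-((x - 20.0) ** 2 + (y - 20.0) ** 2) / 200.0)
        ev = 1.01 * np.exp(-((x - 20.5) ** 2 + (y - 20.0) ** 2) / 200.0)
        print(f"gamma pass rate: {gamma_pass_rate(ref, ev, spacing_mm=1.0):.1f}%")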

  13. Interpreter composition issues in the formal verification of a processor-memory module

    NASA Technical Reports Server (NTRS)

    Fura, David A.; Cohen, Gerald C.

    1994-01-01

    This report describes interpreter composition techniques suitable for the formal specification and verification of a processor-memory module using the HOL theorem proving system. The processor-memory module is a multichip subsystem within a fault-tolerant embedded system under development within the Boeing Defense and Space Group. Modeling and verification methods were developed that permit provably secure composition at the transaction-level of specification, significantly reducing the complexity of the hierarchical verification of the system.

  14. Application of the precipitation-runoff modeling system to the Ah- shi-sle-pah Wash watershed, San Juan County, New Mexico

    USGS Publications Warehouse

    Hejl, H.R.

    1989-01-01

    The precipitation-runoff modeling system was applied to the 8.21-sq-mi drainage area of the Ah-shi-sle-pah Wash watershed in northwestern New Mexico. The calibration periods were May to September of 1981 and 1982, and the verification period was May to September 1983. Twelve storms were available for calibration and 8 storms were available for verification. For calibration A (hydraulic conductivity estimated from onsite data and other storm-mode parameters optimized), the computed standard error of estimate was 50% for runoff volumes and 72% for peak discharges. Calibration B included hydraulic conductivity in the optimization, which reduced the standard error of estimate to 28% for runoff volumes and 50% for peak discharges. Optimized values for hydraulic conductivity resulted in reductions from 1.00 to 0.26 in/h and from 0.20 to 0.03 in/h for the two general soil groups in the calibrations. Simulated runoff volumes using 7 of the 8 storms occurring during the verification period had a standard error of estimate of 40% for verification A and 38% for verification B. Simulated peak discharges had a standard error of estimate of 120% for verification A and 56% for verification B. Including the eighth storm, which had a relatively small magnitude, in the verification analysis more than doubled the standard errors of estimate for volumes and peaks. (USGS)
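
    For concreteness, one common way to express a standard error of estimate in percent is the root-mean-square of the relative simulation errors; the report may use a log-based variant, so treat this convention as an assumption. The storm values below are invented.

        import numpy as np

        def standard_error_percent(simulated, observed):
            """Standard error of estimate as percent of observed (RMS of relative errors)."""
            s, o = np.asarray(simulated, float), np.asarray(observed, float)
            rel = (s - o) / o
            return 100.0 * np.sqrt(np.sum(rel ** 2) / (len(rel) - 1))

        obs = np.array([12.0, 30.5, 8.2, 19.9, 44.0, 15.1, 27.3])  # hypothetical volumes
        sim = np.array([10.1, 36.0, 7.0, 23.5, 39.8, 17.9, 24.0])
        print(f"SE of estimate: {standard_error_percent(sim, obs):.0f}%")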

  15. A hardware-software system for the automation of verification and calibration of oil metering units secondary equipment

    NASA Astrophysics Data System (ADS)

    Boyarnikov, A. V.; Boyarnikova, L. V.; Kozhushko, A. A.; Sekachev, A. F.

    2017-08-01

    The article considers the process of verification (calibration) of the secondary equipment of oil metering units. The purpose of the work is to increase the reliability and reduce the complexity of this process by developing a hardware-software system that provides automated verification and calibration. The hardware part of the complex commutates the measuring channels of the verified controller and the reference channels of the calibrator in accordance with the introduced algorithm. The developed software controls the commutation of channels, sets values on the calibrator, reads the measured data from the controller, calculates errors, and compiles protocols. The system can be used for checking the controllers of the secondary equipment of oil metering units in automatic verification mode (with an open communication protocol) or in semi-automatic mode (without one). A distinctive feature of the approach is a universal signal switch operating under software control, which can be configured for various verification (calibration) methods and thus covers the entire range of secondary-equipment controllers of metering units. Automatic verification with this hardware-software system shortens verification time by a factor of 5-10 and increases the reliability of measurements by excluding the influence of the human factor.
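
    The verification loop the article describes (command the calibrator, read the controller, compute the error, log a protocol row) can be sketched as below. The device drivers here are simulated stand-ins; a real system would talk to the instruments over their communication protocols, and all names and tolerances are assumptions.

        import random
        from dataclasses import dataclass

        @dataclass
        class TestPoint:
            channel: int
            reference: float   # value commanded on the calibrator
            tolerance: float   # permitted absolute error

        _calibrator = {}  # simulated calibrator state, keyed by channel

        def set_calibrator(channel, value):
            _calibrator[channel] = value  # a real driver would command the instrument

        def read_controller(channel):
            # A real driver would query the verified controller; noise simulates measurement.
            return _calibrator[channel] + random.gauss(0.0, 0.002)

        def verify(points):
            """One automated pass: command, read back, compute error, build protocol rows."""
            rows = []
            for p in points:
                set_calibrator(p.channel, p.reference)
                measured = read_controller(p.channel)
                err = measured - p.reference
                rows.append((p.channel, p.reference, measured, err, abs(err) <= p.tolerance))
            return rows

        for ch, ref, meas, err, ok in verify([TestPoint(1, 4.0, 0.01), TestPoint(1, 12.0, 0.01)]):
            print(f"ch{ch} ref={ref:.3f} meas={meas:.3f} err={err:+.4f} pass={ok}")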

  16. ENVIRONMENTAL TECHNOLOGY VERIFICATION REPORT, TNT DETECTION TECHNOLOGY, TEXAS INSTRUMENTS, SPREETA TM SENSOR

    EPA Science Inventory

    The U.S. Environmental Protection Agency (EPA) has created the Environmental Technology Verification Program (ETV) to facilitate the deployment of innovative or improved environmental technologies through performance verification and dissemination of information. The goal of the ...

  17. ENVIRONMENTAL TECHNOLOGY VERIFICATION REPORT, GROUNDWATER SAMPLING TECHNOLOGIES, CLEAN ENVIRONMENT EQUIPMENT, SAMPLEASE BLADDER PUMP

    EPA Science Inventory

    The U.S. Environmental Protection Agency (EPA) has created the Environmental Technology Verification Program (ETV) to facilitate the deployment of innovative or improved environmental technologies through performance verification and dissemination of information. The goal of the...

  18. ENVIRONMENTAL TECHNOLOGY VERIFICATION FOR INDOOR AIR PRODUCTS

    EPA Science Inventory

    The paper discusses environmental technology verification (ETV) for indoor air products. RTI is developing the framework for a verification testing program for indoor air products, as part of EPA's ETV program. RTI is establishing test protocols for products that fit into three...

  19. Use of Radio Frequency Identification (RFID) for Tracking Hazardous Waste Shipments Across International Borders -Test/QA Plan

    EPA Science Inventory

    The Environmental Technology Verification (ETV) – Environmental and Sustainable Technology Evaluations (ESTE) Program conducts third-party verification testing of commercially available technologies that may accomplish environmental program management goals. In this verification...

  20. ENVIRONMENTAL TECHNOLOGY VERIFICATION AND INDOOR AIR

    EPA Science Inventory

    The paper discusses environmental technology verification and indoor air. RTI has responsibility for a pilot program for indoor air products as part of the U.S. EPA's Environmental Technology Verification (ETV) program. The program objective is to further the development of sel...

  1. ENVIRONMENTAL TECHNOLOGY VERIFICATION REPORT: DUST SUPPRESSANT PRODUCTS: NORTH AMERICAN SALT COMPANY'S DUSTGARD

    EPA Science Inventory

    Dust suppressant products used to control particulate emissions from unpaved roads are among the technologies evaluated by the Air Pollution Control Technology (APCT) Verification Center, part of the U.S. Environmental Protection Agency's Environmental Technology Verification (ET...

  2. ENVIRONMENTAL TECHNOLOGY VERIFICATION REPORT: DUST SUPPRESSANT PRODUCTS: SYNTECH PRODUCTS CORPORATION'S TECHSUPPRESS

    EPA Science Inventory

    Dust suppressant products used to control particulate emissions from unpaved roads are among the technologies evaluated by the Air Pollution Control Technology (APCT) Verification Center, part of the U.S. Environmental Protection Agency's Environmental Technology Verification (ET...

  3. ENVIRONMENTAL TECHNOLOGY VERIFICATION REPORT: DUST SUPPRESSANT PRODUCTS: SYNTECH PRODUCTS CORPORATION'S PETROTAC

    EPA Science Inventory

    Dust suppressant products used to control particulate emissions from unpaved roads are among the technologies evaluated by the Air Pollution Control Technology (APCT) Verification Center, part of the U.S. Environmental Protection Agency's Environmental Technology Verification (ET...

  4. 48 CFR 4.1300 - Scope of subpart.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... ADMINISTRATIVE MATTERS Personal Identity Verification 4.1300 Scope of subpart. This subpart provides policy and procedures associated with Personal Identity Verification as required by— (a) Federal Information Processing Standards Publication (FIPS PUB) Number 201, “Personal Identity Verification of Federal Employees and...

  5. ENVIRONMENTAL TECHNOLOGY VERIFICATION TEST PROTOCOL, GENERAL VENTILATION FILTERS

    EPA Science Inventory

    The Environmental Technology Verification Test Protocol, General Ventilation Filters provides guidance for verification tests.

    Reference is made in the protocol to the ASHRAE 52.2P "Method of Testing General Ventilation Air-cleaning Devices for Removal Efficiency by P...

  6. Generic Protocol for the Verification of Ballast Water Treatment Technology

    EPA Science Inventory

    In anticipation of the need to address performance verification and subsequent approval of new and innovative ballast water treatment technologies for shipboard installation, the U.S Coast Guard and the Environmental Protection Agency‘s Environmental Technology Verification Progr...

  7. IMPROVING AIR QUALITY THROUGH ENVIRONMENTAL TECHNOLOGY VERIFICATIONS

    EPA Science Inventory

    The U.S. Environmental Protection Agency (EPA) began the Environmental Technology Verification (ETV) Program in 1995 as a means of working with the private sector to establish a market-based verification process available to all environmental technologies. Under EPA's Office of R...

  8. Environmental Technology Verification Report - NATCO Group, Inc. – Paques THIOPAQ Gas Purification Technology

    EPA Science Inventory

    The U.S. Environmental Protection Agency (EPA) has created the Environmental Technology Verification (ETV) program to facilitate the deployment of innovative or improved environmental technologies through performance verification and dissemination of information. The goal of the ...

  9. ENVIRONMENTAL TECHNOLOGY VERIFICATION REPORT, GROUNDWATER SAMPLING TECHNOLOGIES, BURGE ENVIRONMENTAL INC. MULTIPROBE 100

    EPA Science Inventory

    The U.S. Environmental Protection Agency (EPA) has created the Environmental Technology Verification Program (ETV) to facilitate the deployment of innovative or improved environmental technologies through performance verification and dissemination of information. The goal of the...

  10. Gender Verification of Female Olympic Athletes.

    ERIC Educational Resources Information Center

    Dickinson, Barry D.; Genel, Myron; Robinowitz, Carolyn B.; Turner, Patricia L.; Woods, Gary L.

    2002-01-01

    Gender verification of female athletes has long been criticized by geneticists, endocrinologists, and others in the medical community. Recently, the International Olympic Committee's Athletic Commission called for discontinuation of mandatory laboratory-based gender verification of female athletes. This article discusses normal sexual…

  11. CHEMICAL INDUCTION MIXER VERIFICATION - ENVIRONMENTAL TECHNOLOGY VERIFICATION PROGRAM

    EPA Science Inventory

    The Wet-Weather Flow Technologies Pilot of the Environmental Technology Verification (ETV) Program, which is supported by the U.S. Environmental Protection Agency and facilitated by NSF International, has recently evaluated the performance of chemical induction mixers used for di...

  12. 40 CFR 1066.215 - Summary of verification and calibration procedures for chassis dynamometers.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) AIR POLLUTION CONTROLS VEHICLE-TESTING PROCEDURES Dynamometer... manufacturer instructions and good engineering judgment. (c) Automated dynamometer verifications and... accomplish the verifications and calibrations specified in this subpart. You may use these automated...

  13. 40 CFR 1066.215 - Summary of verification and calibration procedures for chassis dynamometers.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) AIR POLLUTION CONTROLS VEHICLE-TESTING PROCEDURES Dynamometer... manufacturer instructions and good engineering judgment. (c) Automated dynamometer verifications and... accomplish the verifications and calibrations specified in this subpart. You may use these automated...

  14. ENVIRONMENTAL TECHNOLOGY VERIFICATION REPORT, ENVIRONMENTAL DECISION SUPPORT SOFTWARE, DECISION FX, INC., GROUNDWATER FX

    EPA Science Inventory

    The U.S. Environmental Protection Agency (EPA) has created the Environmental Technology Verification Program (ETV) to facilitate the deployment of innovative or improved environmental technologies through performance verification and dissemination of information. The goal of the...

  15. ENVIRONMENTAL TECHNOLOGY VERIFICATION REPORT, ENVIRONMENTAL DECISION SUPPORT SOFTWARE, DECISION FX, INC. SAMPLING FX

    EPA Science Inventory

    The U.S. Environmental Protection Agency (EPA) has created the Environmental Technology Verification Program (ETV) to facilitate the deployment of innovative or improved environmental technologies through performance verification and dissemination of information. The goal of the...

  16. PERFORMANCE VERIFICATION TEST FOR FIELD-PORTABLE MEASUREMENTS OF LEAD IN DUST

    EPA Science Inventory

    The US Environmental Protection Agency (EPA) Environmental Technology Verification (ETV) program (www.epa.gov/etv) conducts performance verification tests of technologies used for the characterization and monitoring of contaminated media. The program exists to provide high-quali...

  17. VERIFICATION TESTING OF HIGH-RATE MECHANICAL INDUCTION MIXERS FOR CHEMICAL DISINFECTANTS

    EPA Science Inventory

    This paper describes the results of verification testing of mechanical induction mixers for dispersion of chemical disinfectants in wet-weather flow (WWF) conducted under the U.S. Environmental Protection Agency's Environmental Technology Verification (ETV) WWF Pilot Program. Th...

  18. TH-AB-201-01: A Feasibility Study of Independent Dose Verification for CyberKnife

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sato, A; Noda, T; Keduka, Y

    2016-06-15

    Purpose: CyberKnife irradiation is composed of many tiny, intensity-modulated beams compared with conventional linacs. Few publications on independent dose calculation verification for CyberKnife have been reported. In this study, we evaluated the feasibility of independent dose verification for CyberKnife treatment as a secondary check. Methods: The following were measured: test plans using static, single beams, clinical plans in a phantom, and clinical plans using patients' CT. 75 patient plans were collected from several treatment sites: brain, lung, liver, and bone. For the test plans and the phantom plans, a pinpoint ion-chamber measurement was performed to assess the dose deviation for the treatment planning system (TPS) and for an independent verification program, Simple MU Analysis (SMU). For the clinical plans, the dose deviation between the SMU and the TPS was assessed. Results: For the test plans, the dose deviations were 3.3±4.5% and 4.1±4.4% for the TPS and the SMU, respectively. In the phantom measurements for the clinical plans, the dose deviations were −0.2±3.6% for the TPS and −2.3±4.8% for the SMU. For the clinical plans using patients' CT, the dose deviation between the SMU and the TPS was −3.0±2.1% (mean±1SD). The systematic difference was partially derived from the inverse square law and the penumbra calculation. Conclusion: The independent dose calculation for CyberKnife shows −3.0±4.2% (mean±2SD); in our study the confidence limit was within the 5% tolerance level from AAPM Task Group 114 for non-IMRT treatment. Thus, it may be feasible to use independent dose calculation verification for CyberKnife treatment as a secondary check. This research is partially supported by the Japan Agency for Medical Research and Development (AMED).

  19. Runtime Verification of Pacemaker Functionality Using Hierarchical Fuzzy Colored Petri-nets.

    PubMed

    Majma, Negar; Babamir, Seyed Morteza; Monadjemi, Amirhassan

    2017-02-01

    Today, implanted medical devices are increasingly used for many patients with diverse health problems. However, several runtime problems and errors have been reported by the relevant organizations, some even resulting in patient death. One such device is the pacemaker, which helps the patient regulate the heartbeat by connecting to the cardiac vessels. The device is directed by its software, so any failure in this software causes a serious malfunction. This study therefore aims at a better way to monitor the device's software behavior in order to decrease the failure risk. Accordingly, we supervise the runtime function and status of the software: verification here means examining, while the software runs, whether it respects the limitations and needs of the system's users. In this paper, a method to verify the pacemaker software based on the fuzzy function of the device is presented. The functional limitations of the device are identified and expressed as fuzzy rules, and the device is then verified against a hierarchical Fuzzy Colored Petri-net (HFCPN) formed from those software limits. Building on our previous studies, which used 1) Fuzzy Petri-nets (FPN) to verify insulin pumps, 2) Colored Petri-nets (CPN) to verify the pacemaker, and 3) a software agent with Petri-net-based knowledge to verify the pacemaker, this paper examines the runtime behavior of the pacemaker software with an HFCPN, a step forward from the earlier work. Compared to the FPN and CPN used in our previous studies, the HFCPN reduces complexity: by presenting the Petri-net (PN) in hierarchical form, the verification runtime decreased by 90.61% compared with the earlier work. Since runtime verification needs an inference engine, we used the HFCPN to enhance the performance of the inference engine.
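
    As a flavor of runtime verification against fuzzy rules (a deliberately flat sketch, not the paper's hierarchical Fuzzy Colored Petri-net), the monitor below evaluates fuzzy memberships over the observed pacing rate and flags a violation when the abnormal degree dominates. All membership ranges and the threshold are invented.

        def tri(x, a, b, c):
            """Triangular fuzzy membership function."""
            if x <= a or x >= c:
                return 0.0
            return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

        # Hypothetical fuzzy sets over pacing rate (beats per minute).
        def too_slow(bpm): return tri(bpm, 0, 30, 60)
        def normal(bpm):   return tri(bpm, 50, 80, 120)
        def too_fast(bpm): return tri(bpm, 110, 160, 250)

        def violation(observed_bpm, threshold=0.5):
            """Flag a malfunction when 'abnormal' dominates 'normal' and the threshold."""
            abnormal = max(too_slow(observed_bpm), too_fast(observed_bpm))
            return abnormal > max(normal(observed_bpm), threshold)

        for bpm in (25, 70, 95, 180):
            print(bpm, "violation" if violation(bpm) else "ok")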

  20. Perception of mental health in Pakistani nomads: An interpretative phenomenological analyses

    PubMed Central

    Choudhry, Fahad Riaz; Bokharey, Iram Zehra

    2013-01-01

    The study was conducted to explore the mental health issues of Pakistani nomads and to uncover their concept, ideation, and perception of mental health and illnesses. It was an exploratory study situated in the qualitative paradigm. The research strategy used was Interpretative Phenomenological Analysis (IPA), as the study was planned to explore the lived experiences of nomads regarding mental health and coping strategies and how they interpret those experiences. For data collection, focus group discussions (FGDs) were used: seven participants were included, and two FGDs, composed of both genders, were conducted. The responses were recorded, and the data were transcribed and analysed using IPA. Data verification procedures of peer review, which helps clarify researcher bias, and rich, thick description were used. The major themes were lack of resources and myriad unfulfilled needs, specifically the basic needs (food, shelter, and drinking and bathing water). Moreover, a strong desire to fulfil the secondary needs of enjoyment and having luxuries was also reflected. A list of recommendations was put forward to inform policy making for this marginalized community and to create awareness regarding mental health. PMID:24369779
