Science.gov

Sample records for enhancing CTBT verification

  1. Visual inspection for CTBT verification

    SciTech Connect

    Hawkins, W.; Wohletz, K.

    1997-03-01

    On-site visual inspection will play an essential role in future Comprehensive Test Ban Treaty (CTBT) verification. Although seismic and remote sensing techniques are the best understood and most developed methods for detection of evasive testing of nuclear weapons, visual inspection can greatly augment the certainty and detail of understanding provided by these more traditional methods. Not only can visual inspection offer "ground truth" in cases of suspected nuclear testing, but it also can provide accurate source location and testing media properties necessary for detailed analysis of seismic records. For testing in violation of the CTBT, an offending party may attempt to conceal the test, which most likely will be achieved by underground burial. While such concealment may not prevent seismic detection, evidence of test deployment, location, and yield can be disguised. In this light, if a suspicious event is detected by seismic or other remote methods, visual inspection of the event area is necessary to document any evidence that might support a claim of nuclear testing and provide data needed to further interpret seismic records and guide further investigations. However, the methods for visual inspection are neither widely known nor appreciated, and experience is presently limited. Visual inspection can be achieved by simple, non-intrusive means, primarily geological in nature, and it is the purpose of this report to describe the considerations, procedures, and equipment required to field such an inspection.

  2. CTBT integrated verification system evaluation model supplement

    SciTech Connect

    Edenburn, Michael W.; Bunting, Marcus; Payne, Arthur C. Jr.; Trost, Lawrence C.

    2000-03-02

    Sandia National Laboratories has developed a computer-based model called IVSEM (Integrated Verification System Evaluation Model) to estimate the performance of a nuclear detonation monitoring system. The IVSEM project was initiated in June 1994 by Sandia's Monitoring Systems and Technology Center and has been funded by the U.S. Department of Energy's Office of Nonproliferation and National Security (DOE/NN). IVSEM is a simple, "top-level" modeling tool which estimates the performance of a Comprehensive Nuclear Test Ban Treaty (CTBT) monitoring system and can help explore the impact of various sensor system concepts and technology advancements on CTBT monitoring. One of IVSEM's unique features is that it integrates results from the various CTBT sensor technologies (seismic, infrasound, radionuclide, and hydroacoustic) and allows the user to investigate synergy among the technologies. Specifically, IVSEM estimates the detection effectiveness (probability of detection), location accuracy, and identification capability of the integrated system and of each technology subsystem individually. The model attempts to accurately estimate the monitoring system's performance at medium interfaces (air-land, air-water) and for some evasive testing methods such as seismic decoupling. The original IVSEM report, CTBT Integrated Verification System Evaluation Model, SAND97-2518, described version 1.2 of IVSEM. This report describes the changes made to IVSEM version 1.2 and the addition of identification capability estimates that have been incorporated into IVSEM version 2.0.

  3. CTBT Integrated Verification System Evaluation Model

    SciTech Connect

    Edenburn, M.W.; Bunting, M.L.; Payne, A.C. Jr.

    1997-10-01

    Sandia National Laboratories has developed a computer-based model called IVSEM (Integrated Verification System Evaluation Model) to estimate the performance of a nuclear detonation monitoring system. The IVSEM project was initiated in June 1994 by Sandia's Monitoring Systems and Technology Center and has been funded by the US Department of Energy's Office of Nonproliferation and National Security (DOE/NN). IVSEM is a simple, top-level modeling tool which estimates the performance of a Comprehensive Nuclear Test Ban Treaty (CTBT) monitoring system and can help explore the impact of various sensor system concepts and technology advancements on CTBT monitoring. One of IVSEM's unique features is that it integrates results from the various CTBT sensor technologies (seismic, infrasound, radionuclide, and hydroacoustic) and allows the user to investigate synergy among the technologies. Specifically, IVSEM estimates the detection effectiveness (probability of detection) and location accuracy of the integrated system and of each technology subsystem individually. The model attempts to accurately estimate the monitoring system's performance at medium interfaces (air-land, air-water) and for some evasive testing methods such as seismic decoupling. This report describes version 1.2 of IVSEM.
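
    These abstracts do not spell out IVSEM's internal algorithms, but the integration step they describe can be illustrated with a toy calculation. The sketch below (Python; all probability values are invented, and independence of the technology subsystems is an assumption made for illustration, not a statement about IVSEM itself) combines per-technology detection probabilities into an integrated probability of detection:

    ```python
    # Hypothetical top-level integration of per-technology detection
    # probabilities, in the spirit of (but not taken from) IVSEM:
    # assuming the subsystems detect independently, the integrated
    # probability of detection is one minus the product of the misses.

    def integrated_pd(subsystem_pd):
        """Combine per-technology detection probabilities, assuming independence."""
        p_miss = 1.0
        for pd in subsystem_pd:
            p_miss *= (1.0 - pd)
        return 1.0 - p_miss

    # Invented detection probabilities for one illustrative scenario:
    pd_by_technology = {
        "seismic": 0.90,
        "infrasound": 0.40,
        "radionuclide": 0.60,
        "hydroacoustic": 0.05,  # e.g., an event far from any ocean
    }

    print(f"Integrated P(detect) = {integrated_pd(pd_by_technology.values()):.3f}")
    ```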

  4. Completing and sustaining IMS network for the CTBT Verification Regime

    NASA Astrophysics Data System (ADS)

    Meral Ozel, N.

    2015-12-01

    The CTBT International Monitoring System will comprise 337 facilities located all over the world for the purpose of detecting and locating nuclear test explosions. Major challenges remain, namely the completion of the network, where most of the remaining stations have environmental, logistical, and/or political issues to surmount (89% of the stations have already been built), and the sustainment of a reliable and state-of-the-art network covering four technologies: seismic, infrasound, hydroacoustic, and radionuclide. To have a credible and trustworthy verification system ready for entry into force of the Treaty, the CTBTO is protecting and enhancing its investment in its global network of stations and is providing effective data to the International Data Centre (IDC) and Member States. Regarding the protection of the CTBTO's investment and enhanced sustainment of IMS station operations, the IMS Division is enhancing the capabilities of the monitoring system by applying advances in instrumentation and introducing new software applications that are fit for purpose. Some examples are the development of noble gas laboratory systems to process and analyse subsoil samples, development of a mobile noble gas system for on-site inspection purposes, optimization of beta-gamma detectors for xenon detection, assessing and improving the efficiency of wind noise reduction systems for infrasound stations, development and testing of infrasound stations with a self-calibrating capability, and research into the use of modular designs for the hydroacoustic network.

  5. Decision analysis framework for evaluating CTBT seismic verification options

    SciTech Connect

    Judd, B.R.; Strait, R.S.; Younker, L.W.

    1986-09-01

    This report describes a decision analysis framework for evaluating seismic verification options for a Comprehensive Test Ban Treaty (CTBT). In addition to providing policy makers with insights into the relative merits of different options, the framework is intended to assist in formulating and evaluating political decisions - such as responses to evidence of violations - and in setting research priorities related to the options. To provide these broad analytical capabilities to decision makers, the framework incorporates a wide variety of issues. These include seismic monitoring capabilities, evasion possibilities, evidence produced by seismic systems, US response to the evidence, the dependence between US and Soviet decision-making, and the relative values of possible outcomes to the US and the Soviet Union. An added benefit of the framework is its potential use to improve communication about these CTBT verification issues among US experts and decision makers. The framework has been implemented on a portable microcomputer to facilitate this communication through demonstration and rapid evaluation of alternative judgments and policy choices. The report presents the framework and its application in four parts. The first part describes the decision analysis framework and the types of analytical results produced. In the second part, the framework is used to evaluate representative seismic verification options. The third part describes the results of sensitivity analyses that determine the relative importance of the uncertainties or subjective judgments that influence the evaluation of the options. The fourth (and final) part summarizes conclusions and presents implications of the sample analytical results for further research and for policy-making related to CTBT verification. The fourth section also describes the next steps in the development and use of the decision analysis framework.
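
    The abstract gives the shape of the analysis rather than its numbers. As a minimal sketch of the expected-utility mechanics such a framework rests on (Python; the option names, outcome probabilities, and utilities below are all hypothetical, and the real model also folds in evasion scenarios, response decisions, and US/Soviet value trade-offs):

    ```python
    # Toy expected-utility comparison between two hypothetical seismic
    # verification options. Each option is scored over a small set of
    # possible outcomes as (probability, utility) pairs; all numbers
    # are invented for illustration only.

    options = {
        "internal_stations": {
            "outcomes": [(0.7, 100), (0.2, 20), (0.1, -50)],
        },
        "external_arrays_only": {
            "outcomes": [(0.5, 100), (0.3, 20), (0.2, -50)],
        },
    }

    for name, opt in options.items():
        # Expected utility = probability-weighted sum of outcome utilities.
        eu = sum(p * u for p, u in opt["outcomes"])
        print(f"{name}: expected utility = {eu:.1f}")
    ```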

  6. Ground-based visual inspection for CTBT verification

    SciTech Connect

    Hawkins, W.; Wohletz, K.

    1997-11-01

    Ground-based visual inspection will play an essential role in On-Site Inspection (OSI) for Comprehensive Test Ban Treaty (CTBT) verification. Although seismic and remote sensing techniques are the best understood and most developed methods for detection of evasive testing of nuclear weapons, visual inspection will greatly augment the certainty and detail of understanding provided by these more traditional methods. Not only can ground-based visual inspection offer effective documentation in cases of suspected nuclear testing, but it also can provide accurate source location and testing media properties necessary for detailed analysis of seismic records. For testing in violation of the CTBT, an offending state may attempt to conceal the test, which most likely will be achieved by underground burial. While such concealment may not prevent seismic detection, evidence of test deployment, location, and yield can be disguised. In this light, if a suspicious event is detected by seismic or other remote methods, visual inspection of the event area is necessary to document any evidence that might support a claim of nuclear testing and provide data needed to further interpret seismic records and guide further investigations. However, the methods for visual inspection are neither widely known nor appreciated, and experience is presently limited. Visual inspection can be achieved by simple, non-intrusive means, primarily geological in nature, and it is the purpose of this report to describe the considerations, procedures, and equipment required to field such an inspection. The inspections will be carried out by inspectors from member states of the CTBT Organization.

  7. How to tackle wet-deposition of radionuclides in the context of RN threshold-monitoring for CTBT verification?

    NASA Astrophysics Data System (ADS)

    Becker, A.; Ceranna, L.; Ross, O.; Schneider, U.; Meyer-Christoffer, A.; Ziese, M.; Rudolf, B.

    2011-12-01

    As a contribution to the World Climate Research Program (WCRP) and in support of the Global Climate Observing System (GCOS) of the World Meteorological Organization (WMO), the Deutscher Wetterdienst (DWD) operates the Global Precipitation Climatology Centre (GPCC). The GPCC re-analysis and near-real-time monitoring products are recognized world-wide as the most reliable global data sets of rain-gauge-based (in-situ) precipitation measurements. The GPCC Monitoring Product (Rudolf and Becker, 2010) is available two months after the fact, based on SYNOP and CLIMAT messages fetched from the GTS. This product also serves as the reference data for calibrating satellite-based precipitation measurements, yielding the Global Precipitation Climatology Project (GPCP) data set (Huffman et al., 2009). The quickest GPCC product is the First Guess version of the GPCC Monitoring Product, available already 3-5 days after the month in question. Both the GPCC and the GPCP products can serve as the database for computationally lightweight post-processing of the wet-deposition impact on the radionuclide monitoring capability of the CTBT network (Wotawa et al., 2009), on the regional and global scale, respectively. This is of major importance whenever a reliable quantitative assessment of the source-receptor sensitivity is needed, e.g. for the analysis of isotopic ratios. Indeed, recognition of wet deposition is a prerequisite whenever ratios of particulate and noble gas measurements come into play. This is so far a largely unexplored field of investigation, but it would help clear as bogus several apparently CTBT-relevant detections encountered in the past, and would provide an assessment of the hitherto overestimated IMS RN detection capability (Figure 1). References: Huffman, G.J., et al., 2009: Improving the Global Precipitation Record: GPCP Version 2.1. Geophys. Res. Lett., 36, L17808, doi:10.1029/2009GL040000. Rudolf, B. and A
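
    The post-processing itself is not detailed in this abstract. A common bulk treatment of below-cloud scavenging (an assumption chosen for illustration, not necessarily the authors' method) depletes an airborne concentration by exp(-Λ·Δt), with a precipitation-dependent scavenging coefficient Λ = a·P^b. A minimal Python sketch with illustrative constants:

    ```python
    import math

    # Minimal sketch of wet-deposition post-processing driven by gridded
    # precipitation data: a standard bulk parameterization (assumed here,
    # not taken from the paper) depletes airborne aerosol activity by
    # exp(-lambda * dt), with lambda = A * P**B and P in mm/h.

    A, B = 8.4e-5, 0.79   # illustrative scavenging constants for aerosols

    def wet_depletion_factor(precip_mm_per_h, dt_seconds):
        """Fraction of aerosol activity surviving scavenging over dt."""
        lam = A * precip_mm_per_h ** B   # scavenging coefficient, 1/s
        return math.exp(-lam * dt_seconds)

    # Example: 3 mm/h rain acting on a plume segment for 6 hours.
    f = wet_depletion_factor(3.0, 6 * 3600)
    print(f"Surviving airborne fraction: {f:.3f}")
    ```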

  8. CTBT technical issues handbook

    SciTech Connect

    Zucca, J.J.

    1994-05-01

    The purpose of this handbook is to give the nonspecialist in nuclear explosion physics and nuclear test monitoring an introduction to the topic as it pertains to a Comprehensive Test Ban Treaty (CTBT). The authors have tried to make the handbook visually oriented, with figures paired to short discussions. As such, the handbook may be read straight through or in sections. The handbook covers five main areas and ends with a glossary, which includes both scientific terms and acronyms likely to be encountered during CTBT negotiations. The following topics are covered: (1) Physics of nuclear explosion experiments. This is a description of basic nuclear physics and elementary nuclear weapon design. Also discussed are testing practices. (2) Other nuclear experiments. This section discusses experiments that produce small amounts of nuclear energy but differ from the explosion experiments discussed in the first chapter. This includes the type of activities, such as laser fusion, that would continue after a CTBT is in force. (3) Monitoring tests in various environments. This section describes the different physical environments in which a test could be conducted (underground, in the atmosphere, in space, underwater, and in the laboratory); the sources of non-nuclear events (such as earthquakes and mining operations); and the opportunities for evasion. (4) On-site inspections. A CTBT is likely to include these inspections as an element of the verification provisions, in order to resolve the nature of ambiguous events. This chapter describes some technical considerations and technologies that are likely to be useful. (5) Selecting verification measures. This chapter discusses the uncertain nature of the evidence from monitoring systems and how compliance judgments could be made, taking the uncertainties into account. It also discusses how to allocate monitoring resources, given the likelihood of testing by various countries in various environments.

  9. CTBT on-site inspections

    SciTech Connect

    Zucca, J. J.

    2014-05-09

    On-site inspection (OSI) is a critical part of the verification regime for the Comprehensive Nuclear-Test-Ban Treaty (CTBT). The OSI verification regime provides for international inspectors to make a suite of measurements and observations on site at the location of an event of interest. The other critical component of the verification regime is the International Monitoring System (IMS), which is a globally distributed network of monitoring stations. The IMS, along with technical monitoring data from CTBT member countries as appropriate, will be used to trigger an OSI. After the decision is made to carry out an OSI, it is important for the inspectors to deploy to the field site rapidly to be able to detect short-lived phenomena such as the aftershocks that may be observable after an underground nuclear explosion. The inspectors will be on site from weeks to months and will be working with many tens of tons of equipment. Parts of the OSI regime will be tested in a field exercise in the country of Jordan late in 2014. The build-up of the OSI regime has been proceeding steadily since the CTBT was signed in 1996 and is on track to become a deterrent to anyone considering conducting a nuclear explosion in violation of the Treaty.

  10. Proceedings of the 22nd Annual DoD/DOE Seismic Research Symposium: Planning for Verification of and Compliance with the Comprehensive Nuclear-Test-Ban Treaty (CTBT)

    SciTech Connect

    Nichols, James W., LTC

    2000-09-15

    These proceedings contain papers prepared for the 22nd Annual DoD/DOE Seismic Research Symposium: Planning for Verification of and Compliance with the Comprehensive Nuclear-Test-Ban Treaty (CTBT), held 13-15 September 2000 in New Orleans, Louisiana. These papers represent the combined research related to ground-based nuclear explosion monitoring funded by the National Nuclear Security Administration (NNSA), Defense Threat Reduction Agency (DTRA), Air Force Technical Applications Center (AFTAC), Department of Defense (DoD), US Army Space and Missile Defense Command, Defense Special Weapons Agency (DSWA), and other invited sponsors. The scientific objectives of the research are to improve the United States capability to detect, locate, and identify nuclear explosions. The purpose of the meeting is to provide the sponsoring agencies, as well as potential users, an opportunity to review research accomplished during the preceding year and to discuss areas of investigation for the coming year. For the researchers, it provides a forum for the exchange of scientific information toward achieving program goals, and an opportunity to discuss results and future plans. Paper topics include: seismic regionalization and calibration; detection and location of sources; wave propagation from source to receiver; the nature of seismic sources, including mining practices; hydroacoustic, infrasound, and radionuclide methods; on-site inspection; and data processing.

  11. Verification and enhancement high resolution layers 2012 for Bulgaria

    NASA Astrophysics Data System (ADS)

    Dimitrov, Ventzeslav; Lubenov, Todor

    Production of high-resolution layers (HRLs) is a substantial part of the pan-European component of the GMES/Copernicus initial operations (GIO) land monitoring service. The focus of this paper is on the results of the implementation of HRL verification and enhancement tasks for Bulgarian territory. For the reference year 2012, five HRLs on land cover characteristics were produced by service providers through sophisticated classification of multi-sensor and multi-temporal satellite images: imperviousness, forests, grasslands, wetlands, and permanent water bodies. As a result of the verification, systematic classification errors relevant to the subsequent enhancement procedure were identified. The verification was carried out through visual inspection of stratified samples in the HRLs against reliable reference spatial data sets, checking for commission and omission errors. The applied procedure included three major parts, the first two obligatory: a general overview of data quality, look-and-feel control of critical strata, and statistically based quantitative verification. The enhancement task consisted of correcting the errors revealed by the verification, giving as a result the final enhanced HRL products. Stratification schemes, evaluation grades by strata and HRL from the look-and-feel verification, and accuracy values from the statistical verification are presented. The types and quantities of mistakes removed during the enhancement are structured and summarised. Results show that all HRLs except the grasslands layer meet the 85% accuracy requirement.
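
    The accuracy figures quoted above come from exactly the kind of commission/omission bookkeeping that verification against reference samples produces. A minimal sketch (Python; the counts are invented for illustration, not Bulgarian results) of how user's, producer's, and overall accuracy fall out of a 2x2 confusion matrix:

    ```python
    # Accuracy check of one binary layer (e.g., forest / non-forest)
    # from a stratified validation sample. Rows of the confusion matrix
    # are the mapped class, columns the reference class; counts invented.

    tp, fp = 430, 35    # mapped forest: correct / commission errors
    fn, tn = 40, 495    # mapped non-forest: omission errors / correct

    users_accuracy = tp / (tp + fp)        # 1 - commission error rate
    producers_accuracy = tp / (tp + fn)    # 1 - omission error rate
    overall = (tp + tn) / (tp + fp + fn + tn)

    print(f"User's accuracy:     {users_accuracy:.1%}")
    print(f"Producer's accuracy: {producers_accuracy:.1%}")
    print(f"Overall accuracy:    {overall:.1%}  (requirement: >= 85%)")
    ```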

  12. CTBT (7-chlorotetrazolo[5,1-c]benzo[1,2,4]triazine) producing ROS affects growth and viability of filamentous fungi.

    PubMed

    Culakova, Hana; Dzugasova, Vladimira; Gbelska, Yvetta; Subik, Julius

    2012-03-01

    CTBT (7-chlorotetrazolo[5,1-c]benzo[1,2,4]triazine) causes intracellular superoxide production and oxidative stress and enhances the susceptibility of Saccharomyces cerevisiae, Candida albicans, and C. glabrata cells to cycloheximide, 5-fluorocytosine, and azole antimycotic drugs. Here, we demonstrate the antifungal activity of CTBT against 14 tested filamentous fungi. CTBT prevented spore germination and mycelial proliferation of Aspergillus niger and the pathogenic Aspergillus fumigatus. The action of CTBT is fungicidal. CTBT increased the formation of reactive oxygen species in fungal mycelium, as detected by 2',7'-dichlorodihydrofluorescein diacetate, and reduced the radial growth of colonies in a dose-dependent manner. Co-application of CTBT and itraconazole led to complete inhibition of fungal growth at dosages lower than those of either chemical alone. The antifungal and chemosensitizing activities of CTBT in filamentous fungi may be useful in combination treatments of infections caused by drug-resistant fungal pathogens. PMID:22212016

  13. Scientific Meetings Database: A New Tool for CTBT-Related International Cooperation

    SciTech Connect

    Knapik, Jerzy F.; Girven, Mary L.

    1999-08-20

    The mission of international cooperation is defined in the Comprehensive Nuclear-Test-Ban Treaty (CTBT). Ways and means of implementation were the subject of discussion during the International Cooperation Workshop held in Vienna in November 1998, and during the Regional Workshop for CTBTO International Cooperation held in Cairo, Egypt in June 1999. In particular, a database of ''Scientific and Technical Meetings Directly or Indirectly Related to CTBT Verification-Related Technologies'' was developed by the CTBTO PrepCom/PTS/International Cooperation section and integrated into the organization's various web sites in cooperation with the U.S. Department of Energy CTBT Research and Development Program. This database, whose structure and use are described in this paper/presentation, is meant to assist the CTBT-related scientific community in identifying worldwide expertise in the CTBT verification-related technologies and should help experts, particularly those of less technologically advanced States Signatories, to strengthen contacts and to pursue international cooperation under the Treaty regime. Specific opportunities for international cooperation, in particular those provided by active participation in the use and further development of this database, are presented in this paper/presentation.

  14. Three years of operational experience from Schauinsland CTBT monitoring station.

    PubMed

    Zähringer, M; Bieringer, J; Schlosser, C

    2008-04-01

    Data from three years of operation of a low-level aerosol sampler and analyzer (RASA) at the Schauinsland monitoring station are reported. The system is part of the International Monitoring System (IMS) for verification of the Comprehensive Nuclear-Test-Ban Treaty (CTBT). The fully automatic system is capable of measuring aerosol-borne gamma emitters with high sensitivity and routinely quantifies 7Be and 212Pb. The system had a high level of data availability, 90%, within the reporting period. A daily screening process yielded 66 tentative identifications of verification-relevant radionuclides since the system entered IMS operation in February 2004. Two of these were real events and were associated with a plausible source. The remaining 64 cases can consistently be explained by detector background and statistical phenomena. Inter-comparison with data from a weekly sampler operated at the same station shows instabilities of the calibration during the test phase and good agreement since certification of the system. PMID:18053622

  15. Enhanced Verification Test Suite for Physics Simulation Codes

    SciTech Connect

    Kamm, J R; Brock, J S; Brandon, S T; Cotrell, D L; Johnson, B; Knupp, P; Rider, W; Trucano, T; Weirs, V G

    2008-10-10

    sophistication or other physics regimes (e.g., energetic material response, magneto-hydrodynamics), would represent a scientifically desirable complement to the fundamental test cases discussed in this report. The authors believe that this document can be used to enhance the verification analyses undertaken at the DOE WP Laboratories and, thus, to improve the quality, credibility, and usefulness of the simulation codes that are analyzed with these problems.

  16. Development to Release of CTBT Knowledge Base Datasets

    SciTech Connect

    Moore, S.G.; Shepherd, E.R.

    1998-10-20

    For the CTBT Knowledge Base to be useful as a tool for improving U.S. monitoring capabilities, the contents of the Knowledge Base must be subjected to a well-defined set of procedures to ensure integrity and relevance of the constituent datasets. This paper proposes a possible set of procedures for datasets that are delivered to Sandia National Laboratories (SNL) for inclusion in the Knowledge Base. The proposed procedures include defining preliminary acceptance criteria, performing verification and validation activities, and subjecting the datasets to approval by domain experts. Preliminary acceptance criteria include receipt of the data, its metadata, and a proposal for its usability for U.S. National Data Center operations. Verification activities establish the correctness and completeness of the data, while validation activities establish the relevance of the data to its proposed use. Results from these activities are presented to domain experts, such as analysts and peers, for final approval of the datasets for release to the Knowledge Base. Formats and functionality will vary across datasets, so the procedures proposed herein define an overall plan for establishing the integrity and relevance of each dataset. Specific procedures for verification, validation, and approval will be defined for each dataset, or for each type of dataset, as appropriate. Potential dataset sources, including Los Alamos National Laboratory and Lawrence Livermore National Laboratory, have contributed significantly to the development of this process.

  17. Construction of a Shallow Underground Low-background Detector for a CTBT Radionuclide Laboratory

    SciTech Connect

    Forrester, Joel B.; Greenwood, Lawrence R.; Miley, Harry S.; Myers, Allan W.; Overman, Cory T.

    2013-05-01

    The International Monitoring System (IMS) is a verification component of the Comprehensive Nuclear-Test-Ban Treaty (CTBT) and, in addition to a series of radionuclide monitoring stations, contains sixteen radionuclide laboratories capable of verifying radionuclide station measurements. This paper presents an overview of a new, commercially obtained low-background detector system for radionuclide aerosol measurements recently installed in a shallow (>30 meters water equivalent) underground clean-room facility at Pacific Northwest National Laboratory. Specifics such as low-background shielding materials, active shielding methods, and improvements in sensitivity to IMS isotopes will be covered.

  18. Scenario details of NPE 2012 - Independent performance assessment by simulated CTBT violation

    NASA Astrophysics Data System (ADS)

    Gestermann, N.; Bönnemann, C.; Ceranna, L.; Ross, O.; Schlosser, C.

    2012-04-01

    The Comprehensive Nuclear-Test-Ban Treaty (CTBT) was opened for signature on 24 September 1996. The technical preparations for monitoring CTBT compliance are moving ahead rapidly, and many efforts have been made since then to establish the verification system. In that regard, the two underground nuclear explosions conducted by the Democratic People's Republic of Korea in 2006 and 2009 were the first real performance tests of the system. In the light of these events, National Data Centres (NDCs) realized the need to become more familiar with the details of the verification regime. The idea of an independent annual exercise to evaluate the processing and analysis procedures applied at the National Data Centres of the CTBT was born at the NDC Evaluation Workshop in Kiev, Ukraine, 2006. The exercises simulate a fictitious violation of the CTBT, and all NDCs are invited to clarify the nature of the selected event. This exercise should help to evaluate the effectiveness of procedures applied at NDCs, as well as the quality, completeness, and usefulness of IDC products. Moreover, the National Data Centres Preparedness Exercise (NPE) is a measure of the readiness of the NDCs to fulfill their duties with regard to CTBT verification, i.e. to make treaty-compliance judgments about the nature of events as natural or artificial and chemical or nuclear, respectively. NPEs have proved to be an efficient indicative tool for testing the performance of the verification system and its elements. In 2007 and 2008 the exercises focused on seismic waveform data analysis. Since 2009 the analysis of infrasound data has been included, and additional attention has been paid to the radionuclide component. In 2010 a realistic noble gas release scenario was selected as the trigger event which could be expected after an underground nuclear test. The epicenter location of an event from the Reviewed Event Bulletin (REB), unknown to participants of the exercise, was selected as the source of the noble gas

  19. Atmospheric transport modelling in support of CTBT verification—overview and basic concepts

    NASA Astrophysics Data System (ADS)

    Wotawa, Gerhard; De Geer, Lars-Erik; Denier, Philippe; Kalinowski, Martin; Toivonen, Harri; D'Amours, Real; Desiato, Franco; Issartel, Jean-Pierre; Langer, Matthias; Seibert, Petra; Frank, Andreas; Sloan, Craig; Yamazawa, Hiromi

    Under the provisions of the Comprehensive Nuclear-Test-Ban Treaty (CTBT), a global monitoring system comprising different verification technologies is currently being set up. The network will include 80 radionuclide (RN) stations distributed all over the globe that measure treaty-relevant radioactive species. While the seismic subsystem cannot distinguish between chemical and nuclear explosions, RN monitoring would provide the "smoking gun" of a possible treaty violation. Atmospheric transport modelling (ATM) will be an integral part of CTBT verification, since it provides a geo-temporal location capability for the RN technology. In this paper, the basic concept for the future ATM software system to be installed at the International Data Centre is laid out. The system is based on the operational computation of multi-dimensional source-receptor sensitivity fields for all RN samples by means of adjoint tracer transport modelling. While the source-receptor matrix methodology has already been applied in the past, the system that we suggest will be unique and unprecedented, since it is global, real-time and aims at uncovering source scenarios that are compatible with measurements. Furthermore, it has to deal with source dilution ratios that are orders of magnitude larger than in typical transport model applications. This new verification software will need continuous scientific attention, and may well provide a prototype system for future applications in areas of environmental monitoring, emergency response and verification of other international agreements and treaties.
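
    The source-receptor matrix idea lends itself to a compact illustration. In the sketch below (Python/NumPy; dimensions and values are invented), each row of M is the adjoint-derived sensitivity of one radionuclide sample to unit emissions in every source grid cell, so predicted sample concentrations for any hypothesized source term are a single matrix-vector product, and many source scenarios can be screened without re-running the transport model:

    ```python
    import numpy as np

    # Sketch of the source-receptor matrix methodology: adjoint transport
    # runs yield, for each radionuclide sample i, a sensitivity M[i, j] of
    # the measured concentration to a unit emission in source grid cell j.
    # All numbers below are placeholders for illustration.

    n_samples, n_cells = 4, 1000
    rng = np.random.default_rng(0)
    M = rng.exponential(1e-14, size=(n_samples, n_cells))  # (s/m^3) per (Bq/s)

    s = np.zeros(n_cells)
    s[417] = 1e15        # hypothetical release rate (Bq/s) in one grid cell

    c_predicted = M @ s  # predicted concentrations (Bq/m^3) at each sampler
    print(c_predicted)
    ```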

  20. The new geospatial tools: global transparency enhancing safeguards verification

    SciTech Connect

    Pabian, Frank Vincent

    2010-09-16

    This paper focuses on the importance and potential role of the new, freely available, geospatial tools for enhancing IAEA safeguards and how, together with commercial satellite imagery, they can be used to promote 'all-source synergy'. As additional 'open sources', these new geospatial tools have heralded a new era of 'global transparency', and they can be used to substantially augment existing information-driven safeguards gathering techniques, procedures, and analyses in the remote detection of undeclared facilities, as well as support ongoing monitoring and verification of various treaty (e.g., NPT, FMCT) relevant activities and programs. As an illustration of how these new geospatial tools may be applied, an original exemplar case study shows how it is possible to derive value-added follow-up information on some recent public media reporting of a former clandestine underground plutonium production complex (now being converted to a 'Tourist Attraction' given the site's abandonment by China in the early 1980s). That open source media reporting, when combined with subsequent commentary found in various Internet-based blogs and wikis, led to independent verification of the reporting with additional ground truth via 'crowdsourcing' (tourist photos as found on 'social networking' venues like Google Earth's Panoramio layer and Twitter). Confirmation of the precise geospatial location of the site (along with a more complete facility characterization incorporating 3-D modeling and visualization) was only made possible following the acquisition of higher resolution commercial satellite imagery that could be correlated with the reporting, ground photos, and an interior diagram, through original imagery analysis of the overhead imagery.

  1. The data dictionary: A view into the CTBT knowledge base

    SciTech Connect

    Shepherd, E.R.; Keyser, R.G.; Armstrong, H.M.

    1997-08-01

    The data dictionary for the Comprehensive Test Ban Treaty (CTBT) knowledge base provides a comprehensive, current catalog of the projected contents of the knowledge base. It is written from a data definition view of the knowledge base and therefore organizes information in a fashion that allows logical storage within the computer. The data dictionary introduces two organizational categories of data: the datatype, which is a broad, high-level category of data, and the dataset, which is a specific instance of a datatype. The knowledge base, and thus the data dictionary, consists of a fixed, relatively small number of datatypes, but new datasets are expected to be added on a regular basis. The data dictionary is a tangible result of the design effort for the knowledge base and is intended to be used by anyone who accesses the knowledge base for any purpose, such as populating the knowledge base with data, accessing the data for use with automatic data processing (ADP) routines, or browsing through the data for verification purposes. For these two reasons, it is important to discuss the development of the data dictionary as well as to describe its contents to better understand its usefulness; that is the purpose of this paper.
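
    The datatype/dataset distinction is easy to make concrete. A minimal sketch (Python; the field names and example values are hypothetical, not the actual data dictionary schema):

    ```python
    from dataclasses import dataclass, field

    # Hypothetical rendering of the two organizational categories the
    # data dictionary introduces: a fixed set of datatypes, and datasets
    # as specific instances of a datatype, added on a regular basis.

    @dataclass
    class Datatype:
        """A broad, high-level category of knowledge-base data."""
        name: str                   # e.g. "travel-time correction surface"
        description: str
        required_metadata: list[str] = field(default_factory=list)

    @dataclass
    class Dataset:
        """A specific instance of a datatype."""
        datatype: Datatype
        name: str                   # e.g. "Pn corrections, western US"
        source: str                 # contributing organization
        version: str

    tt = Datatype("travel-time correction", "Regional travel-time surfaces",
                  ["region", "phase", "reference model"])
    ds = Dataset(tt, "Pn corrections, western US", "LANL", "1.0")
    print(ds.datatype.name, "->", ds.name)
    ```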

  2. Automatic radioxenon analyzer for CTBT monitoring

    SciTech Connect

    Bowyer, T.W.; Abel, K.H.; Hensley, W.K.

    1996-12-01

    Over the past 3 years, with support from US DOE's NN-20 Comprehensive Test Ban Treaty (CTBT) R&D program, PNNL has developed and demonstrated a fully automatic analyzer for collecting and measuring the four Xe radionuclides, 131mXe (11.9 d), 133mXe (2.19 d), 133Xe (5.24 d), and 135Xe (9.10 h), in the atmosphere. These radionuclides are important signatures in monitoring for compliance with a CTBT. Activity ratios permit discriminating radioxenon from a nuclear detonation from that from nuclear reactor operations, nuclear fuel reprocessing, or medical isotope production and usage. In the analyzer, Xe is continuously and automatically separated from the atmosphere at flow rates of about 7 m³/h on a sorption bed. Aliquots collected for 6-12 h are automatically analyzed by electron-photon coincidence spectrometry to produce sensitivities in the range of 20-100 µBq/m³ of air, about 100-fold better than with reported laboratory-based procedures for short collection intervals. Spectral data are automatically analyzed, and the calculated radioxenon concentrations and raw gamma-ray spectra are automatically transmitted to data centers.
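
    The abstract does not give the discrimination rule itself. A commonly published approach (assumed here for illustration) screens samples in the plane of the 133mXe/131mXe and 135Xe/133Xe activity ratios, where fresh fission debris and steady reactor releases separate. A sketch with placeholder line coefficients:

    ```python
    import math

    # Activity-ratio screening sketch: the metastable ratio
    # 133mXe/131mXe and the 135Xe/133Xe ratio evolve differently for a
    # fresh detonation and for routine reactor/medical releases, so a
    # sample can be screened against a line in log-ratio space. The
    # line coefficients below are placeholders, not operational values.

    def explosion_like(act):
        """act: dict of activity concentrations (Bq/m^3) for the four radioxenons."""
        r_meta = act["Xe-133m"] / act["Xe-131m"]
        r_135 = act["Xe-135"] / act["Xe-133"]
        # Placeholder screening line in log10-log10 space:
        return math.log10(r_135) > 0.6 * math.log10(r_meta) - 0.5

    sample = {"Xe-131m": 2e-4, "Xe-133m": 5e-3, "Xe-133": 8e-2, "Xe-135": 3e-1}
    print("explosion-like" if explosion_like(sample) else "reactor-like")
    ```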

  3. Global Monitoring of the CTBT: Progress, Capabilities and Plans (Invited)

    NASA Astrophysics Data System (ADS)

    Zerbo, L.

    2013-12-01

    The Preparatory Commission for the Comprehensive Nuclear-Test-Ban Treaty Organization (CTBTO), established in 1996, is tasked with building up the verification regime of the CTBT. The regime includes a global system for monitoring the earth, the oceans and the atmosphere for nuclear tests, and an on-site inspection (OSI) capability. More than 80% of the 337 facilities of the International Monitoring System (IMS) have been installed and are sending data to the International Data Centre (IDC) in Vienna, Austria for processing. These IMS data, along with IDC processed and reviewed products, are available to all States that have signed the Treaty. Concurrent with the build-up of the global monitoring networks, near-field geophysical methods are being developed and tested for OSIs. The monitoring system is currently operating in a provisional mode, as the Treaty has not yet entered into force. Progress in installing and operating the IMS and the IDC and in building up an OSI capability will be described. The capabilities of the monitoring networks have progressively improved as stations are added to the IMS and IDC processing techniques are refined. Detection thresholds for seismic, hydroacoustic, infrasound and radionuclide events have been measured and in general are equal to or lower than the predictions used during the Treaty negotiations. The measurements have led to improved models and tools that allow more accurate predictions of future capabilities and network performance under any configuration. Unplanned tests of the monitoring network occurred when the DPRK announced nuclear tests in 2006, 2009, and 2013. All three tests were well above the detection threshold and easily detected and located by the seismic monitoring network. In addition, noble gas consistent with the nuclear tests in 2006 and 2013 (according to atmospheric transport models) was detected by stations in the network. On-site inspections of these tests were not conducted as the Treaty has not entered

  4. Nuclear explosion source terms for CTBT monitoring

    SciTech Connect

    Lougheed, R.W.; Wild, J.F.; Harvey, T.

    1996-10-01

    Detection of radionuclides from a suspected nuclear explosion is required to provide absolute proof that the event was nuclear. Various evasion scenarios could be employed to attempt to hide the radionuclide signals. We will present estimates of the possible reduction in specific gaseous and particulate fission products for evasion scenarios, from underground and underwater explosions to the use of rain storms, that a potential CTBT violator might employ to evade detection. We will consider the effect that the chemical behavior of the fission products initially formed in nuclear explosions will have on the possible release and transport of the longer-lived fission products that would actually be measured by remote monitoring stations or by on-site inspection techniques.

  5. Geophysics, Remote Sensing, and the Comprehensive Nuclear-Test-Ban Treaty (CTBT) Integrated Field Exercise 2014

    NASA Astrophysics Data System (ADS)

    Sussman, A. J.; Macleod, G.; Labak, P.; Malich, G.; Rowlands, A. P.; Craven, J.; Sweeney, J. J.; Chiappini, M.; Tuckwell, G.; Sankey, P.

    2015-12-01

    The Integrated Field Exercise of 2014 (IFE14) was an event held in the Hashemite Kingdom of Jordan (with concurrent activities in Austria) that tested the operational and technical capabilities of an on-site inspection (OSI) within the CTBT verification regime. During an OSI, up to 40 international inspectors will search an area for evidence of a nuclear explosion. Over 250 experts from ~50 countries were involved in IFE14 (the largest simulation of a real OSI to date) and worked in a number of different capacities, such as on the Exercise Management and Control Teams (which executed the scenario in which the exercise was played) and as members of the Inspection Team (IT). One of the main objectives of IFE14 was to test and integrate Treaty-allowed inspection techniques, including a number of geophysical and remote sensing methods. In order to develop a scenario in which the simulated exercise could be carried out, suites of physical features in the IFE14 inspection area were designed and engineered by the Scenario Task Force (STF) so that the IT could detect them by applying the geophysical and remote sensing inspection technologies, in addition to other techniques allowed by the CTBT. For example, in preparation for IFE14, the STF modeled a seismic triggering event that was provided to the IT to prompt them to detect and localize aftershocks in the vicinity of a possible explosion. Similarly, the STF planted shallow targets such as borehole casings and pipes for detection using other geophysical methods. In addition, airborne technologies, which included multi-spectral imaging, were deployed such that the IT could identify freshly exposed surfaces, imported materials, and other areas that had been subject to modification. This presentation will introduce the CTBT and OSI, explain the IFE14 in terms of the goals specific to geophysical and remote sensing methods, and show how both the preparation for and execution of IFE14 met those goals.

  6. Cluster Analysis for CTBT Seismic Event Monitoring

    SciTech Connect

    Carr, Dorthe B.; Young, Chris J.; Aster, Richard C.; Zhang, Xioabing

    1999-08-03

    Mines at regional distances are expected to be continuing sources of small, ambiguous events which must be correctly identified as part of the Comprehensive Nuclear-Test-Ban Treaty (CTBT) monitoring process. Many of these events are small enough that they are seen by only one or two stations, so locating them by traditional methods may be impossible or at best leads to poorly resolved parameters. To further complicate matters, these events have parametric characteristics (explosive sources, shallow depths) which make them difficult to identify as definite non-nuclear events using traditional discrimination methods. Fortunately, explosions from the same mines tend to have similar waveforms, making it possible to identify an unknown event by comparison with characteristic archived events that have been associated with specific mines. In this study we examine the use of hierarchical cluster methods to identify groups of similar events. These methods produce dendrograms, which are tree-like structures showing the relationships between entities. Hierarchical methods are well suited to event clustering because they are well documented, easy to implement, computationally cheap enough to run multiple times for a given data set, and produce results which can be readily interpreted. To aid in determining the proper threshold value for defining event families for a given dendrogram, we use cophenetic correlation (which compares a model of the similarity behavior to actual behavior), variance, and a new metric developed for this study. Clustering methods are compared using archived regional- and local-distance mining blasts recorded at two sites in the western U.S. with different tectonic and instrumentation characteristics: the three-component broadband DSVS station in Pinedale, Wyoming and the short-period New Mexico Tech (NMT) network in central New Mexico. Ground truth for the events comes from the mining industry and local network locations
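
    The workflow the abstract describes maps directly onto standard hierarchical-clustering tools. A minimal sketch (Python/SciPy; the similarity matrix is invented, whereas in practice it would come from cross-correlating archived mining-blast waveforms, and the 0.5 cut threshold is a placeholder):

    ```python
    import numpy as np
    from scipy.cluster.hierarchy import linkage, fcluster, cophenet
    from scipy.spatial.distance import squareform

    # Waveform cross-correlation similarities -> distances -> dendrogram.
    # The cophenetic correlation measures how faithfully the tree
    # reproduces the pairwise behavior; fcluster cuts the tree into
    # event families at a chosen distance threshold.

    cc = np.array([          # pairwise max cross-correlation, 5 events
        [1.00, 0.92, 0.88, 0.15, 0.10],
        [0.92, 1.00, 0.85, 0.12, 0.08],
        [0.88, 0.85, 1.00, 0.18, 0.11],
        [0.15, 0.12, 0.18, 1.00, 0.81],
        [0.10, 0.08, 0.11, 0.81, 1.00],
    ])

    dist = squareform(1.0 - cc, checks=False)   # condensed distance vector
    Z = linkage(dist, method="average")

    coph_corr, _ = cophenet(Z, dist)            # tree vs. raw distances
    print(f"Cophenetic correlation: {coph_corr:.3f}")

    families = fcluster(Z, t=0.5, criterion="distance")
    print("Event family labels:", families)
    ```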

  7. Enhanced global Radionuclide Source Attribution for the Nuclear-Test-Ban Verification by means of the Adjoint Ensemble Dispersion Modeling Technique applied at the IDC/CTBTO.

    NASA Astrophysics Data System (ADS)

    Becker, A.; Wotawa, G.; de Geer, L.

    2006-05-01

    findings of the ensemble dispersion modeling (EDM) technique No. 5 efforts performed by Galmarini et al., 2004 (Atmos. Env. 38, 4607-4617). As the scope of the adjoint EDM methodology is not limited to CTBT verification but can be applied to any kind of nuclear event monitoring and location, it bears the potential to improve the design of manifold emergency response systems towards the preparedness concepts needed for mitigation of disasters (like Chernobyl) and pre-emptive estimation of pollution hazards.

  8. DOE program on seismic characterization for regions of interest to CTBT monitoring

    SciTech Connect

    Ryall, A.S.; Weaver, T.A.

    1995-07-01

    The primary goal of the DOE programs on Geophysical Characterization of (1) the Middle East and North Africa (ME-NA) and (2) Southern Asia (SA) is to provide the Air Force Technical Applications Center (AFTAC) with the analytic tools and knowledge base to permit effective verification of Comprehensive Test Ban Treaty (CTBT) compliance in those regions. The program also aims at using these regionalizations as models for the development of a detailed prescription for seismic calibration and knowledge base compilation in areas where the US has had little or no previous monitoring experience. In any given region, the CTBT seismic monitoring system will depend heavily on a few key arrays and/or three-component stations, and it will be important to know as much as possible about the physical properties of the earth's crust and upper mantle: (1) in the vicinity of these stations, (2) in areas of potential earthquake activity or commercial blasting in the region containing the stations, and (3) along the propagation path from the sources to the stations. To be able to discriminate between various source types, we will also need to know how well the various event characterization techniques perform when they are transported from one tectonic or geologic environment to another. The Department of Energy's CTBT R&D program plan (DOE, 1994), which includes the ME-NA and SA characterization programs, incorporates an iterative process that combines field experiments, computer modeling and data analysis for the development, testing, evaluation and modification of data processing algorithms as appropriate to achieve specific US monitoring objectives. This process will be applied to seismic event detection, location and identification.

  9. Site and Event Characterization Using the CTBT On-Site Inspection Techniques (Invited)

    NASA Astrophysics Data System (ADS)

    Labak, P.; Gaya Pique, L. R.; Rowlands, A. P.; Arndt, R. H.

    2013-12-01

    One of the four elements of the CTBT verification regime is On-Site Inspection (OSI). The sole purpose of an OSI is to clarify whether a nuclear weapon test explosion or any other nuclear explosion has been conducted in violation of the CTBT. An OSI would be conducted within an area no bigger than 1000 km² and by no more than 40 inspectors at any one time, applying search logic and inspection techniques with the aim of collecting relevant information that will be the basis for the inspection report. During the course of an OSI, less intrusive techniques applied over broad areas (usually with lower spatial resolution) are supplemented with more intrusive techniques applied to more targeted areas (usually at a higher spatial resolution). Environmental setting and the evolution of OSI-relevant observables over time will influence the application of OSI techniques. In the course of the development of OSI methodology and relevant techniques, field tests and exercises have been conducted. While earlier activities mainly focused on progress of individual techniques (such as visual observation, passive seismological monitoring for aftershocks and measurements of radioactivity), recent work covered both technique development (such as multi-spectral imaging including infrared measurements, and environmental sampling and analysis of solids, liquids and gases) as well as the integration of techniques, search logic and data flow. We will highlight examples of application of OSI technologies for site and event characterization from recently conducted field tests and exercises and demonstrate the synthesis of techniques and data necessary for the conduct of an OSI.

  10. Progress in CTBT Monitoring since its 1999 Senate Defeat

    NASA Astrophysics Data System (ADS)

    Hafemeister, David

    2009-05-01

    Progress in monitoring the Comprehensive Nuclear Test Ban Treaty (CTBT) is examined, beginning with the 2002 National Academy of Sciences CTBT study, followed by recent findings on regional seismology, array monitoring, correlation detection, seismic modeling and non-seismic technologies. The NAS CTBT study concluded that the fully completed International Monitoring System (IMS) will reliably detect and identify underground nuclear explosions with a threshold of 0.1 kt in hard rock, if conducted anywhere in Europe, Asia, North Africa, and North America. In some locations the threshold is 0.01 kt or lower, using arrays or regional seismic stations. As an example, the 0.6-kiloton North Korean test of October 9, 2006 was promptly detected by seismometers in Australia, Europe, North America and Asia. The P/S ratio between 1-15 Hz clearly showed that the event was an explosion and not an earthquake. Radioactive venting, observed as far away as Canada, proved that it was a nuclear explosion. Advances in seismic monitoring strengthen the conclusions of the NAS study. Interferometric synthetic aperture radar (InSAR) can, in some cases, identify and locate 1-kt tests at 500 m depth by measuring subsidence of 2-5 mm. InSAR can discriminate between earthquakes and explosions from the subsidence pattern, and it can locate nuclear tests to within 100 meters.
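
    The P/S discriminant mentioned above reduces to a ratio of bandpassed amplitudes in the P and S arrival windows: explosions are relatively rich in P energy, earthquakes in S. A minimal sketch (Python; the synthetic trace, window positions, and the 1.5 decision threshold are all placeholders, not operational values):

    ```python
    import numpy as np

    # Toy P/S amplitude-ratio discriminant on a single (nominally
    # 1-15 Hz bandpassed) record. Real practice uses calibrated windows,
    # multiple stations, and regionally tuned thresholds.

    def rms(x):
        return np.sqrt(np.mean(np.square(x)))

    def p_s_ratio(trace, p_window, s_window):
        """RMS amplitude ratio of the P and S arrival windows (sample indices)."""
        p = trace[p_window[0]:p_window[1]]
        s = trace[s_window[0]:s_window[1]]
        return rms(p) / rms(s)

    rng = np.random.default_rng(1)
    trace = rng.normal(size=6000)      # stand-in for a bandpassed record
    trace[1000:1400] *= 6.0            # strong "P" energy
    trace[3000:3800] *= 2.0            # weaker "S" energy

    ratio = p_s_ratio(trace, (1000, 1400), (3000, 3800))
    print(f"P/S = {ratio:.2f} ->",
          "explosion-like" if ratio > 1.5 else "earthquake-like")
    ```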

  11. Global Communications Infrastructure: CTBT Treaty monitoring using space communications

    NASA Astrophysics Data System (ADS)

    Kebeasy, R.; Abaya, E.; Ricker, R.; Demeules, G.

    Article 1 on Basic Obligations of the Comprehensive Nuclear-Test-Ban Treaty (CTBT) states that: "Each State Party undertakes not to carry out any nuclear weapon test explosion or any other nuclear explosion, and to prohibit and prevent any such nuclear explosion at any place under its jurisdiction or control. Each State Party undertakes, furthermore, to refrain from causing, encouraging, or in any way participating in the carrying out of any nuclear weapon test explosion or any other nuclear explosion." To monitor States Parties compliance with these Treaty provisions, an International Monitoring System (IMS) consisting of 321 monitoring stations and 16 laboratories in some 91 countries is being implemented to cover the whole globe, including its oceans and polar regions. The IMS employs four technologies--seismic, hydroacoustic, infrasound and radionuclide--to detect, locate and identify any seismic event of Richter magnitude 4 and above (equivalent to one kiloton of TNT) that may be associated with a nuclear test explosion. About one-half of this monitoring system is now operational in 67 countries. Monitoring stations send data in near real-time to an International Data Centre (IDC) in Vienna over a Global Communications Infrastructure (GCI) incorporating 10 geostationary satellites plus three satellites in inclined orbits. The satellites relay the data to commercial earth stations, from where they are transferred by terrestrial circuits to the IDC. The IDC automatically processes and interactively analyzes the monitoring data, and distributes the raw data and reports relevant to Treaty verification to National Data Centers in Member States over the same communications network. The GCI will eventually support about 250 thin-route VSAT links to the monitoring stations, many of them at remote or harsh locations on the earth, plus additional links to national data centres in various countries. Off-the-shelf VSAT and networking hardware are deployed. This is the

  12. A verification strategy for web services composition using enhanced stacked automata model.

    PubMed

    Nagamouttou, Danapaquiame; Egambaram, Ilavarasan; Krishnan, Muthumanickam; Narasingam, Poonkuzhali

    2015-01-01

    Currently, Service-Oriented Architecture (SOA) is becoming the most popular software architecture of contemporary enterprise applications, and one crucial technique of its implementation is web services. An individual service offered by some service provider may symbolize limited business functionality; however, by composing individual services from different service providers, a composite service describing the intact business process of an enterprise can be made. Many new standards have been defined to address the web service composition problem, notably the Business Process Execution Language (BPEL). BPEL provides initial groundwork for an Extensible Markup Language (XML) specification language for defining and implementing business practice workflows for web services. The main problem with most realistic approaches to service composition is the verification of the composed web services; formal verification methods are needed to ensure the correctness of composed services. A few research works have addressed verification of web services for deterministic systems. Moreover, the existing models did not address verification properties such as dead transitions, deadlock, reachability, and safety. In this paper, a new model to verify composed web services using an Enhanced Stacked Automata Model (ESAM) has been proposed. The correctness properties of the non-deterministic system have been evaluated based on properties such as dead transitions, deadlock, safety, liveness, and reachability. Initially web services are composed using the Business Process Execution Language for Web Services (BPEL4WS); the composition is converted into an ESAM (a combination of Muller Automata (MA) and Push-Down Automata (PDA)), which is then transformed into Promela, the input language of the Simple ProMeLa Interpreter (SPIN) tool. The model is verified using the SPIN tool, and the results revealed better performance in finding dead transitions and deadlocks in contrast to the
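
    Two of the properties named above are easy to illustrate outside the ESAM/SPIN machinery. A minimal sketch (Python; the states, transitions, and final-state set are invented, standing in for a composed-service control flow): reachability by breadth-first search, and a deadlock as a reachable non-final state with no outgoing transitions.

    ```python
    from collections import deque

    # Explicit finite transition system standing in for a composed
    # web-service workflow. This is NOT the ESAM construction, just an
    # illustration of reachability and deadlock on a toy state graph.

    transitions = {
        "start":    ["invoke_A", "invoke_B"],
        "invoke_A": ["reply_A"],
        "invoke_B": [],            # deadlock: no outgoing transition
        "reply_A":  ["done"],
        "done":     [],
    }
    final_states = {"done"}

    def reachable(init):
        """All states reachable from init via BFS."""
        seen, queue = {init}, deque([init])
        while queue:
            s = queue.popleft()
            for t in transitions.get(s, []):
                if t not in seen:
                    seen.add(t)
                    queue.append(t)
        return seen

    reach = reachable("start")
    deadlocks = [s for s in reach
                 if not transitions.get(s) and s not in final_states]
    print("Reachable:", sorted(reach))
    print("Deadlocked states:", deadlocks)
    ```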

  13. Verification and Validation of NASA-Supported Enhancements to Decision Support Tools of PECAD

    NASA Technical Reports Server (NTRS)

    Ross, Kenton W.; McKellip, Rodney; Moore, Roxzana F.; Fendley, Debbie

    2005-01-01

    This section of the evaluation report summarizes the verification and validation (V&V) of recently implemented, NASA-supported enhancements to the decision support tools of the Production Estimates and Crop Assessment Division (PECAD). The implemented enhancements include operationally tailored Moderate Resolution Imaging Spectroradiometer (MODIS) products and products of the Global Reservoir and Lake Monitor (GRLM). The MODIS products are currently made available through two separate decision support tools: the MODIS Image Gallery and the U.S. Department of Agriculture (USDA) Foreign Agricultural Service (FAS) MODIS Normalized Difference Vegetation Index (NDVI) Database. Both the Global Reservoir and Lake Monitor and MODIS Image Gallery provide near-real-time products through PECAD's CropExplorer. This discussion addresses two areas: 1. Assessments of the standard NASA products on which these enhancements are based. 2. Characterizations of the performance of the new operational products.

  14. Synthetics vs. real waveforms from underground nuclear explosions as master templates for CTBT monitoring with cross-correlation

    NASA Astrophysics Data System (ADS)

    Rozhkov, M.; Kitov, I. O.; Bobrov, D.

    2013-12-01

    The cross-correlation (CC) master-event technique is efficient in Comprehensive Nuclear-Test-Ban Treaty (CTBT) monitoring. Two primary goals of CTBT monitoring are detection and location of nuclear explosions. Therefore, CC global monitoring should be focused on finding such events. The use of physically adequate masters may increase the number of valid events in the Reviewed Event Bulletin (REB) of the International Data Centre by a factor of 2. Inadequate master events may increase the number of irrelevant events in the REB and reduce the sensitivity of the CC technique to valid events. In order to cover the entire earth, including vast aseismic territories, with CC-based nuclear test monitoring, we conducted thorough research and defined the most appropriate real and synthetic master events representing underground explosion sources. A procedure was developed for optimizing the master-event simulation, based on principal component analysis with bootstrap aggregation as a dimension-reduction technique, narrowing the classes of CC templates used in the global detection and location process. Actual waveforms and metadata from the DTRA Verification Database (http://www.rdss.info) were used to validate our approach. The detection and location results based on real and synthetic master events were compared
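
    The core detection step behind the master-event technique is a sliding normalized cross-correlation of a template against continuous data. A minimal sketch (Python/NumPy; the synthetic data, template, and the 0.7 detection threshold are placeholders, not the authors' configuration):

    ```python
    import numpy as np

    # Master-event detection sketch: correlate a (real or synthetic)
    # master waveform against continuous data; windows whose normalized
    # cross-correlation exceeds a threshold are declared detections.

    def sliding_ncc(data, template):
        """Pearson correlation of the template with every data window."""
        n = len(template)
        t = (template - template.mean()) / (template.std() * n)
        out = np.empty(len(data) - n + 1)
        for i in range(len(out)):
            w = data[i:i + n]
            sd = w.std()
            out[i] = 0.0 if sd == 0 else np.dot(t, (w - w.mean()) / sd)
        return out

    rng = np.random.default_rng(2)
    template = np.sin(np.linspace(0, 12 * np.pi, 200)) * np.hanning(200)
    data = rng.normal(scale=0.3, size=5000)
    data[3100:3300] += template          # bury one "event" in the noise

    cc = sliding_ncc(data, template)
    detections = np.flatnonzero(cc > 0.7)
    print("Detections near sample(s):", detections)
    ```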

  15. Alternatives for Laboratory Measurement of Aerosol Samples from the International Monitoring System of the CTBT

    NASA Astrophysics Data System (ADS)

    Miley, H.; Forrester, J. B.; Greenwood, L. R.; Keillor, M. E.; Eslinger, P. W.; Regmi, R.; Biegalski, S.; Erikson, L. E.

    2013-12-01

    The aerosol samples taken from CTBT International Monitoring System stations are measured in the field with a minimum detectable concentration (MDC) of ~30 microBq/m3 of Ba-140. This is sufficient to detect far less than 1 kt of aerosol fission products in the atmosphere when the station is in the plume from such an event. Recent thinking about minimizing the potential source region (PSR) of a detection has led to a desire for multi-station or multi-time-period detections. These would be connected through the concept of 'event formation', analogous to event formation in seismic studies. However, to form such events, samples from the nearest neighbors of the detection would require re-analysis in a more sensitive laboratory to gain a substantially lower MDC and potentially find radionuclide concentrations undetected by the station. The authors will present recent laboratory work with air filters showing various cost-effective means of enhancing laboratory sensitivity.
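
    The gain from a laboratory recount can be illustrated with the standard Currie-style detection-limit estimate; in the sketch below all counts, efficiencies, count times, and sampled air volumes are placeholder values, not actual IMS station or laboratory parameters.

        # Currie-style MDC estimate: longer counts, lower background, and higher
        # detector efficiency all lower the detection limit. Numbers are invented.
        import math

        def mdc_uBq_per_m3(bkg_counts, eff, gamma_yield, live_time_s, air_volume_m3):
            l_d = 2.71 + 4.65 * math.sqrt(bkg_counts)  # Currie detection limit, counts
            activity_bq = l_d / (eff * gamma_yield * live_time_s)
            return activity_bq / air_volume_m3 * 1e6   # Bq/m3 -> microBq/m3

        # field-style 1-day count vs. a week-long, low-background lab recount
        print("field: %6.2f microBq/m3" % mdc_uBq_per_m3(500, 0.05, 0.25, 86400, 20000))
        print("lab:   %6.2f microBq/m3" % mdc_uBq_per_m3(50, 0.30, 0.25, 7 * 86400, 20000))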

  16. Enhancement of electrophoretic mobility of microparticles near a solid wall--experimental verification.

    PubMed

    Liang, Qian; Zhao, Cunlu; Yang, Chun

    2015-03-01

    Although existing theories predict enhancement of the electrophoretic mobility of microparticles near a solid wall, relevant experimental studies are rare, mainly because of the difficulty of controlling and measuring particle-wall separations under dynamic electrophoretic conditions. This paper reports an experimental verification of the enhancement of the electrophoretic mobility of a microparticle moving near the wall of a microchannel. This is achieved by balancing dielectrophoretic and lift forces against the gravitational force acting on the microparticle so as to control the particle-wall separation gap. A simple experimental setup is configured and a fabrication method is developed to measure this separation gap. The experiments are conducted for various particle sizes under different electric field strengths. Our experimental results are compared against the available theoretical predictions in the literature. PMID:25421107

  17. Transitioning Enhanced Land Surface Initialization and Model Verification Capabilities to the Kenya Meteorological Department (KMD)

    NASA Technical Reports Server (NTRS)

    Case, Jonathan L.; Mungai, John; Sakwa, Vincent; Zavodsky, Bradley T.; Srikishen, Jayanthi; Limaye, Ashutosh; Blankenship, Clay B.

    2016-01-01

    Flooding, severe weather, and drought are key forecasting challenges for the Kenya Meteorological Department (KMD), based in Nairobi, Kenya. Atmospheric processes leading to convection, excessive precipitation and/or prolonged drought can be strongly influenced by land cover, vegetation, and soil moisture content, especially during anomalous conditions and dry/wet seasonal transitions. It is thus important to accurately represent land surface state variables (green vegetation fraction, soil moisture, and soil temperature) in Numerical Weather Prediction (NWP) models. The NASA SERVIR and Short-term Prediction Research and Transition (SPoRT) programs in Huntsville, AL have established a working partnership with KMD to enhance its regional modeling capabilities. SPoRT and SERVIR are providing experimental land surface initialization datasets and model verification capabilities for capacity building at KMD. To support its forecasting operations, KMD is running experimental configurations of the Weather Research and Forecasting (WRF; Skamarock et al. 2008) model on a 12-km/4-km nested regional domain over eastern Africa, incorporating the land surface datasets provided by NASA SPoRT and SERVIR. SPoRT, SERVIR, and KMD participated in two training sessions in March 2014 and June 2015 to foster the collaboration and use of unique land surface datasets and model verification capabilities. Enhanced regional modeling capabilities have the potential to improve guidance in support of daily operations and high-impact weather and climate outlooks over Eastern Africa. For enhanced land-surface initialization, the NASA Land Information System (LIS) is run over Eastern Africa at 3-km resolution, providing real-time land surface initialization data in place of interpolated global model soil moisture and temperature data available at coarser resolutions. Additionally, real-time green vegetation fraction (GVF) composites from the Suomi-NPP VIIRS instrument are being incorporated
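
    On the verification side, a minimal sketch of two common categorical precipitation scores (frequency bias and equitable threat score) computed from a forecast/observation contingency table; the counts below are invented for illustration and are not KMD statistics.

        # Categorical verification scores from a 2x2 contingency table.
        def verification_scores(hits, misses, false_alarms, correct_negs):
            total = hits + misses + false_alarms + correct_negs
            bias = (hits + false_alarms) / (hits + misses)      # frequency bias
            hits_random = (hits + misses) * (hits + false_alarms) / total
            ets = (hits - hits_random) / (hits + misses + false_alarms - hits_random)
            return bias, ets

        bias, ets = verification_scores(hits=42, misses=18, false_alarms=25,
                                        correct_negs=915)
        print("frequency bias = %.2f, ETS = %.2f" % (bias, ets))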

  18. Verification and Validation of NASA-Supported Enhancements to PECAD's Decision Support Tools

    NASA Technical Reports Server (NTRS)

    McKellip, Rodney; Ross, Kenton W.

    2006-01-01

    The NASA Applied Sciences Directorate (ASD), part of the Earth-Sun System Division of NASA's Science Mission Directorate, has partnered with the U.S. Department of Agriculture (USDA) to enhance decision support in the area of agricultural efficiency, an application of national importance. The ASD integrated the results of NASA Earth science research into USDA decision support tools employed by the USDA Foreign Agricultural Service (FAS) Production Estimates and Crop Assessment Division (PECAD), which supports national decision making by gathering, analyzing, and disseminating global crop intelligence. Verification and validation of the following enhancements are summarized: 1) near-real-time Moderate Resolution Imaging Spectroradiometer (MODIS) products through PECAD's MODIS Image Gallery; 2) MODIS Normalized Difference Vegetation Index (NDVI) time series data through the USDA-FAS MODIS NDVI Database; and 3) Jason-1 and TOPEX/Poseidon lake level estimates through PECAD's Global Reservoir and Lake Monitor (GRLM). Where possible, each enhanced product was characterized for accuracy, timeliness, and coverage, and the characterized performance was compared to PECAD operational requirements. The MODIS Image Gallery and the GRLM are more mature and have achieved a semi-operational status, whereas the USDA-FAS MODIS NDVI Database is still evolving and should be considered

  19. A key management concept for the CTBT International Monitoring System

    SciTech Connect

    Herrington, P.; Draelos, T.; Craft, R.; Brickell, E.; Frankel, Y.; Silvestri, M.

    1997-08-01

    Cryptographic authentication (commonly referred to as "technical authentication" in Working Group B) is an enabling technology which ensures the integrity of sensor data and the security of digital networks under various data security compromise scenarios. The use of cryptographic authentication, however, implies the development of a key management infrastructure for establishing trust in the generation and distribution of cryptographic keys. This paper proposes security and operational requirements for a CTBT (Comprehensive Test Ban Treaty) key management system and, furthermore, presents a public-key-based solution satisfying the requirements. The key management system is instantiated with trust distribution technologies similar to those currently implemented in industrial public key infrastructures. A complete system solution is developed.

  20. Enhanced spacer-is-dielectric (sid) decomposition flow with model-based verification

    NASA Astrophysics Data System (ADS)

    Du, Yuelin; Song, Hua; Shiely, James; Wong, Martin D. F.

    2013-03-01

    Self-aligned double patterning (SADP) lithography is a leading candidate for 14-nm node lower-metal layer fabrication. Besides its intrinsic overlay-tolerance capability, accurate spacer width and uniformity control enable this technology to fabricate very narrow and dense patterns. Spacer-is-dielectric (SID) is the most popular flavor of SADP, offering higher flexibility in design. In the SID process, uniform spacer deposition causes the spacer shape to become rounded at convex mandrel corners, and disregarding this corner rounding during SID decomposition may result in severe residue artifacts on device patterns. Previously, SADP decomposition was merely verified by Boolean operations on the decomposed layers, where such residue artifacts are not even identifiable. This paper proposes a model-based verification method for SID decomposition to identify the artifacts caused by spacer corner rounding. Then, targeting residue artifact removal, an enhanced SID decomposition flow is introduced. Simulation results show that residue artifacts are removed effectively through the enhanced SID decomposition strategy.

  1. Dosimetric verification of enhanced dynamic wedges by a 2D ion chamber array

    NASA Astrophysics Data System (ADS)

    Oh, Se An; Kim, Sung Kyu; Kang, Min Kyu; Yea, Ji Woon; Kim, Eng Chan

    2013-12-01

    Wedge filters are commonly used to achieve dose uniformity in the target volume in radiotherapy and can be categorized as physical wedges (PWs) and enhanced dynamic wedges (EDWs). The EDW generates PW-like dose profiles by moving the upper jaw in the Y direction with a varying dose rate during the treatment beam. Task Group 53 of the AAPM (American Association of Physicists in Medicine) recommended that the dynamic wedge be verified before implementation in the radiation treatment planning (RTP) system. The aim of this study was to use the I'mRT MatriXX to verify the dose profiles of the EDWs manufactured by Varian. We used the Pencil Beam Convolution algorithm (Eclipse 8.6) for the calculations and the I'mRT MatriXX with the Plastic Water® phantom MULTICube for dose measurements. The gamma passing rates between the calculations and the measurements for the EDWs were 84.84% and 86.54% at 2%/2 mm tolerance, and 99.47% and 99.64% at 3%/3 mm tolerance, for wedge angles of 15°, 30°, 45° and 60°. The dose distributions differed between the calculations and the measurements in the penumbra and the outer beam regions of the wedge fields. We confirmed that the dosimetric verification of the EDWs was acceptable under the criterion for external beam dose calculations of Task Group 53.
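
    The comparison behind the quoted passing rates can be sketched as a simplified one-dimensional gamma analysis (global dose-difference and distance-to-agreement criterion); the profiles and criteria below are synthetic, not the study's measured data.

        # Simplified 1-D gamma analysis between calculated and measured profiles.
        import numpy as np

        def gamma_pass_rate(x_mm, d_calc, d_meas, dd=0.03, dta_mm=3.0):
            d_norm = d_meas.max()                      # global normalization
            gammas = np.empty(len(x_mm))
            for i in range(len(x_mm)):
                dose_term = (d_calc - d_meas[i]) / (dd * d_norm)
                dist_term = (x_mm - x_mm[i]) / dta_mm
                gammas[i] = np.sqrt(dose_term**2 + dist_term**2).min()
            return 100.0 * np.mean(gammas <= 1.0)

        x = np.linspace(-100, 100, 201)                # off-axis position, mm
        d_meas = 1.0 - 0.004 * x                       # wedged profile stand-in
        d_calc = 1.01 * d_meas + 0.002                 # small modeling error
        print("gamma pass rate (3%%/3 mm): %.1f%%" % gamma_pass_rate(x, d_calc, d_meas))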

  2. Evaluation of database technologies for the CTBT Knowledge Base prototype

    SciTech Connect

    Keyser, R.; Shepard-Dombroski, E.; Baur, D.; Hipp, J.; Moore, S.; Young, C.; Chael, E.

    1996-11-01

    This document examines a number of different software technologies in the rapidly changing field of database management systems, evaluates these systems in light of the expected needs of the Comprehensive Test Ban Treaty (CTBT) Knowledge Base, and makes some recommendations for the initial prototypes of the Knowledge Base. The Knowledge Base requirements are examined and then used as criteria for evaluating the database management options. A mock-up of the data expected in the Knowledge Base is used as a basis for examining how four different database technologies deal with the problems of storing and retrieving the data. Based on these requirements and the results of the evaluation, the recommendation is that the Illustra database be considered for the initial prototype of the Knowledge Base. Illustra offers a unique blend of performance, flexibility, and features that will aid in the implementation of the prototype. At the same time, Illustra provides a high level of compatibility with the hardware and software environments present at the US NDC (National Data Center) and the PIDC (Prototype International Data Center).

  3. Evaluation of infrasonic detection capability for the CTBT/IMS

    SciTech Connect

    Armstrong, W.T.; Whitaker, R.W.; Olson, J.V.

    1996-09-01

    Evaluation of infrasonic detection capability for the International Monitoring System of the Comprehensive Test Ban Treaty (CTBT/IMS) is made with respect to signal analysis and global coverage. Signal analysis is anecdotally reviewed with respect to composite power, correlation and F-statistic detection algorithms. In the absence of adaptive pre-filtering, either cross-correlation or F-statistic detection is required; as an unbounded quantity, the F-statistic offers potentially greater sensitivity to signals of interest. With PURE-state pre-filtering, power detection begins to become competitive with correlation and F-statistic detection. Additional application of simple post-filters of minimum duration and maximum bearing deviation results in unique positive detection of an identified impulsive infrasonic signal. Global coverage estimates are performed as a useful deterministic evaluation of networks, offering easily interpreted network performance figures that complement previous probabilistic network evaluations. In particular, adequate coverage (2 sites), uniform coverage, and redundant coverage (3 to 4 sites) provide figures of merit for evaluating detection, location and vulnerability, respectively. Coverage estimates of the I60 network indicate generally adequate coverage for the majority of the globe. A modest increase in station gain (an increase in the number of elements from 4 to 7) results in a significant increase in coverage for mean signal values. Ineffective sites and vulnerable sites are identified, which suggests that further refinement of the network is possible.
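
    The F-statistic detector mentioned above can be illustrated on a toy array: the ratio of beam power to residual power is near 1 for uncorrelated noise and grows when a coherent signal crosses the array. The sketch below uses a synthetic, zero-delay (unsteered) beam; a real detector scans over slowness and azimuth.

        # Fisher (F-) statistic on a small array: beam power vs. residual power.
        import numpy as np

        rng = np.random.default_rng(1)
        n_sensors, n_samples = 4, 512
        noise = rng.standard_normal((n_sensors, n_samples))
        signal = np.zeros(n_samples)
        signal[200:264] = np.sin(2 * np.pi * np.arange(64) / 16.0)  # coherent burst
        data = noise + signal                   # same signal on every channel

        def fisher_stat(x):
            beam = x.mean(axis=0)               # zero-delay beam
            resid = x - beam
            n = x.shape[0]
            return n * (n - 1) * np.sum(beam**2) / np.sum(resid**2)

        print("F with coherent signal: %.2f" % fisher_stat(data))
        print("F noise only:           %.2f" % fisher_stat(noise))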

  4. Investigation of CTBT OSI Radionuclide Techniques at the DILUTED WATERS Nuclear Test Site

    SciTech Connect

    Baciak, James E.; Milbrath, Brian D.; Detwiler, Rebecca S.; Kirkham, Randy R.; Keillor, Martin E.; Lepel, Elwood A.; Seifert, Allen; Emer, Dudley; Floyd, Michael

    2012-11-01

    Under the Comprehensive Nuclear-Test-Ban Treaty (CTBT), a verification regime that includes the ability to conduct an On-Site Inspection (OSI) will be established. The Treaty allows an OSI to include many techniques, including the radionuclide techniques of gamma radiation surveying and spectrometry and environmental sampling and analysis. Such radioactivity detection techniques can provide the "smoking gun" evidence that a nuclear test has occurred through the detection and quantification of indicative recent fission products. An OSI faces restrictions in time and manpower, as dictated by the Treaty, as well as possible logistical difficulties due to the location and climate of the suspected explosion site. It is thus necessary to have a good understanding of the possible source term an OSI will encounter and of the techniques that will be necessary for an effective OSI regime. One of the challenges during an OSI is to locate radioactive debris that has escaped an underground nuclear explosion (UNE) and settled on the surface near and downwind of ground zero. To support the understanding and selection of sampling and survey techniques for use in an OSI, we are currently designing an experiment, the Particulate Release Experiment (PRex), to simulate a small-scale vent from an underground nuclear explosion. PRex will take place at the Nevada National Security Site (NNSS). The project is conducted under the National Center for Nuclear Security (NCNS), funded by the National Nuclear Security Administration (NNSA). Prior to the release experiment, scheduled for spring 2013, the project carried out a number of activities at the NNSS both to prepare for the release experiment and to utilize the nuclear testing past of the NNSS for the development of OSI techniques for the CTBT. One such activity, the focus of this report, was a survey and sampling campaign at the site of an old UNE that vented: DILUTED WATERS. Activities at DILUTED WATERS included vehicle-based survey

  5. Towards a new daily in-situ precipitation data set supporting parameterization of wet-deposition of CTBT relevant radionuclides

    NASA Astrophysics Data System (ADS)

    Becker, A.; Ceranna, L.; Ross, O.; Schneider, U.; Meyer-Christoffer, A.; Ziese, M.; Lehner, K.; Rudolf, B.

    2012-04-01

    As a contribution to the World Climate Research Programme (WCRP) and in support of the Global Climate Observing System (GCOS) of the World Meteorological Organization (WMO), the Deutscher Wetterdienst (DWD) operates the Global Precipitation Climatology Centre (GPCC). The GPCC re-analysis and near-real-time monitoring products are recognized worldwide as the most reliable global data sets of rain-gauge-based (in-situ) precipitation measurements. The GPCC Monitoring Product (Schneider et al., 2011; Becker et al., 2012; Ziese et al., EGU2012-5442) is available two months after the fact, based on data gathered by listening to the GTS for SYNOP and CLIMAT messages. This product also serves as the reference data for calibrating satellite-based precipitation measurements, yielding the Global Precipitation Climatology Project (GPCP) data set (Huffman et al., 2009). The quickest GPCC product is the First Guess version of the GPCC Monitoring Product, available as early as 3-5 days after the month in question. Both the GPCC and the GPCP products can serve as the data basis for computationally lightweight post-processing of the wet-deposition impact on the radionuclide (RN) monitoring capability of the CTBT network (Wotawa et al., 2009) on the regional and global scale, respectively. This is of major importance whenever a reliable quantitative assessment of the source-receptor sensitivity is needed, e.g. for the analysis of isotopic ratios. Indeed, accounting for wet deposition is a prerequisite whenever ratios of particulate and noble gas measurements come into play. This is so far a largely unexplored field of investigation, but it would help clear several apparently CTBT-relevant detections encountered in the past as bogus and would provide an assessment of the hitherto overestimated RN detection capability of the CTBT network. Besides the climatological kind of wet-deposition assessment for threshold monitoring purposes, there are also singular
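
    The computationally lightweight use of such precipitation data can be illustrated with a below-cloud scavenging correction of the common form lambda = a * P**b; the coefficients and precipitation rates in this sketch are placeholders, not GPCC or CTBT operational values.

        # Fraction of an airborne radionuclide burden surviving wet scavenging,
        # with scavenging coefficient lambda = a * P**b (placeholder constants).
        import math

        A_COEF, B_EXP = 8.4e-5, 0.79     # example parameterization constants

        def surviving_fraction(precip_mm_per_h, hours):
            lam = A_COEF * precip_mm_per_h ** B_EXP    # scavenging coeff., 1/s
            return math.exp(-lam * hours * 3600.0)

        for p in (0.0, 0.5, 2.0, 10.0):
            print("P = %4.1f mm/h -> fraction left after 3 h: %.3f"
                  % (p, surviving_fraction(p, 3.0)))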

  6. Surface-wave calibration studies for improved monitoring of a CTBT

    SciTech Connect

    Patton, H.J.; Jones, L.E.

    1998-12-31

    Seismic calibration of the International Monitoring System (IMS) and other key monitoring stations is critical for effective verification of a Comprehensive Test Ban Treaty (CTBT). Detection, location, and identification all depend upon calibration of source and path effects to ensure maximum efficiency of the IMS in monitoring at small magnitudes. This project gathers information about the effects of source and propagation on surface waves for key monitoring areas in central Asia, with an initial focus on western China. Source calibration focuses on surface-wave determinations of focal depth and seismic moment, M{sub o}, for key earthquakes, which serve as calibration sources in location studies and for developing regional magnitude scales. The authors present a calibration procedure for Lg attenuation which exploits an empirical relationship between M{sub o} and 1-Hz Lg amplitude for stable and tectonic continental regions. The procedure uses this relationship and estimates of M{sub o} to predict Lg amplitudes at a reference distance of 10 km from each calibrated source. Path-specific estimates of Q{sub o} in the power-law formula for Q (Q = Q{sub o}f{sup {zeta}}) are made using measurements of 1-Hz Lg amplitudes observed at the station and amplitudes predicted for the reference distance. Nuttli's formula for m{sub b}(Lg) is thus calibrated for the source region of interest and for paths to key monitoring stations. Path calibration focuses on measurement of surface-wave group velocity dispersion curves in the period range of 5 to 50 s. Concentrating initially on the Lop Nor source region, they employ broadband data recorded at CDSN stations, regional events (M > 4.0), and source-receiver path lengths from 200 to 2000 km. Their approach emphasizes path-specific calibration of key stations and source regions and will result in a family of regionally appropriate phase-match filters, designed to extract fundamental-mode surface-wave arrivals for each region of interest
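
    The Q{sub o} calibration step can be sketched as follows, assuming r**-0.5 geometrical spreading and Q = Q0 * f**zeta; the amplitudes, distance, and velocity below are hypothetical, chosen only to show how an observed 1-Hz Lg amplitude and a predicted 10-km reference amplitude yield a path Q0.

        # Solve A(r) = A_ref * (r/r_ref)**-0.5 * exp(-pi*f*(r - r_ref)/(Q*v)) for Q0.
        import math

        def q0_from_amplitudes(a_obs, a_ref, r_km, r_ref_km=10.0, f_hz=1.0,
                               v_km_s=3.5, zeta=0.0):
            spreading = (r_km / r_ref_km) ** -0.5
            decay = math.log(a_obs / (a_ref * spreading))  # = -pi*f*dr/(Q*v)
            q_at_f = -math.pi * f_hz * (r_km - r_ref_km) / (decay * v_km_s)
            return q_at_f / f_hz ** zeta                   # back out Q0

        # hypothetical predicted reference amplitude vs. station observation
        print("path Q0 = %.0f" % q0_from_amplitudes(a_obs=2.0e-5, a_ref=1.0e-2,
                                                    r_km=800.0))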

  7. Meteosat Second Generation: in-flight verification and calibration of the spinning enhanced visible and infrared imager (SEVIRI)

    NASA Astrophysics Data System (ADS)

    Ottenbacher, Andreas; Aminou, Donny M. A.

    2001-02-01

    The Meteosat Second Generation (MSG) program consists of a series of 3 geostationary satellites. The objectives for this program have been defined by the European meteorological community, led by the European Organisation for the Exploitation of Meteorological Satellites (EUMETSAT) and the European Space Agency (ESA). The first MSG satellite (MSG-1) is currently being developed by ESA and is scheduled for launch in the year 2002. MSG-2 and MSG-3 will be procured by ESA on behalf of EUMETSAT. They will be built by European industry under ESA contract and are scheduled for launch in 2004 and 2009. EUMETSAT will launch and operate the three satellites and provide data until 2014. This paper focuses on the foreseen in-flight verification and calibration of the main payload on board the MSG satellites: the Spinning Enhanced Visible and Infrared Imager (SEVIRI). It commences with a brief description of the main elements of the MSG satellite and the SEVIRI instrument. Then the in-flight verification approach for functional and performance specifications is presented. Finally, the paper provides the envisaged in-flight calibration method for the SEVIRI instrument.

  8. Verification of Precipitation Enhancement due to Winter Orographic Cloud Seeding in the Payette River Basin of Western Idaho

    NASA Astrophysics Data System (ADS)

    Holbrook, V. P.; Kunkel, M. L.; Blestrud, D.

    2013-12-01

    The Idaho Power Company (IPCo) is a hydroelectric-based utility serving eastern Oregon and most of southern Idaho. Snowpack is critical to IPCo operations, and the company has invested in a winter orographic cloud seeding program for the Payette, Boise, and Upper Snake River basins to augment the snowpack. IPCo and the National Center for Atmospheric Research (NCAR) are in the middle of a two-year study to determine the precipitation enhancement due to winter orographic cloud seeding in the Payette River basin. NCAR developed a cloud seeding module, as an enhancement to the Weather Research and Forecasting (WRF) model, that takes as input silver iodide released from ground-based and/or aircraft generators; the module then increases the precipitation as a function of the cloud seeding. The WRF model used for this program is run at the University of Arizona at a resolution of 1.8 kilometers using Thompson microphysics and the Mellor-Yamada-Janjic boundary layer scheme. Two types of verification schemes are being used to determine the precipitation enhancement: model versus model, and model versus precipitation gauges. In the model-versus-model method, a control model run uses NCAR-developed criteria to identify the best times to operate ground or airborne seeding generators and also establishes the baseline precipitation. The model is then rerun with the cloud seeding module turned on for the time periods determined by the control run; the precipitation enhancement due to cloud seeding is then the difference in precipitation between the control and seeding model runs. The second verification method is to use the model forecast precipitation in the seeded and non-seeded areas, compare it against observed precipitation (mainly from SNOTEL gauges), and determine the precipitation enhancement due to cloud seeding. Up to 15 SNOTEL gauges in or near the Payette River basin along with 14 IPCo high-resolution rain gauges will be used with this target

  9. A Visual Analytics Approach to Structured Data Analysis to Enhance Nonproliferation and Arms Control Verification Activities

    SciTech Connect

    Gillen, David S.

    2014-08-07

    Analysis activities for Nonproliferation and Arms Control verification require the use of many types of data. Tabular structured data, such as Excel spreadsheets and relational databases, have traditionally been used for data mining activities, where specific queries are issued against the data to look for matching results. The application of visual analytics tools to structured data enables further exploration of datasets to promote discovery of previously unknown results. This paper discusses the application of a specific visual analytics tool to datasets related to the field of Arms Control and Nonproliferation to promote the use of visual analytics more broadly in this domain. Visual analytics focuses on analytical reasoning facilitated by interactive visual interfaces (Wong and Thomas 2004). It promotes exploratory analysis of data and complements data mining technologies, which mine for known patterns; with a human in the loop, it also brings domain knowledge and subject matter expertise to bear. Visual analytics has not been widely applied to this domain. In this paper, we focus on one type of data, structured data, and show the results of applying a specific visual analytics tool to answer questions in the Arms Control and Nonproliferation domain. We chose the T.Rex tool, a visual analytics tool developed at PNNL, which uses a variety of visual exploration patterns to discover relationships in structured datasets, including a facet view, graph view, matrix view, and timeline view. The facet view enables discovery of relationships between categorical information, such as countries and locations. The graph tool visualizes node-link relationship patterns, such as the flow of materials being shipped between parties. The matrix visualization shows highly correlated categories of information. The timeline view shows temporal patterns in data. In this paper, we will use T.Rex with two different datasets to demonstrate how interactive exploration of

  10. Test report for the infrasound prototype: For a CTBT IMS station

    SciTech Connect

    Breding, D.R.; Kromer, R.P.; Whitaker, R.W.; Sandoval, T.

    1997-11-01

    This document describes the results of the Comprehensive Test Ban Treaty (CTBT) Infrasound Prototype Development Test and Evaluation (DT&E). During DT&E the infrasound prototype was evaluated against requirements listed in the System Requirements Document (SRD), based on Conference on Disarmament/Ad Hoc Committee on a Nuclear Test Ban/Working Papers 224 and 283 and the Preparatory Commission specifications as defined in CTBT/PC/II/1/Add.2, Appendix X, Table 5. The evaluation was conducted during a two-day period, August 6-7, 1997. The System Test Plan (STP) defined the plan and methods used to test the infrasound prototype. The specific tests that were performed are detailed in the Test Procedures (TP).

  11. Feedback Seeking in Early Adolescence: Self-Enhancement or Self-Verification?

    ERIC Educational Resources Information Center

    Rosen, Lisa H.; Principe, Connor P.; Langlois, Judith H.

    2013-01-01

    The authors examined whether early adolescents ("N" = 90) solicit self-enhancing feedback (i.e., positive feedback) or self-verifying feedback (i.e., feedback congruent with self-views, even when these views are negative). Sixth, seventh, and eighth graders first completed a self-perception measure and then selected whether to receive…

  12. SU-E-J-16: Automatic Image Contrast Enhancement Based On Automatic Parameter Optimization for Radiation Therapy Setup Verification

    SciTech Connect

    Qiu, J; Li, H. Harold; Zhang, T; Yang, D; Ma, F

    2015-06-15

    Purpose: In RT patient setup 2D images, tissues often cannot be seen well due to a lack of image contrast. The contrast enhancement features provided by image reviewing software, e.g. Mosaiq and ARIA, require manual selection of image processing filters and parameters; they are therefore inefficient and cannot be automated. In this work, we developed a novel method to automatically enhance 2D RT image contrast to allow automatic verification of patient daily setups as a prerequisite step of automatic patient safety assurance. Methods: The new method is based on contrast-limited adaptive histogram equalization (CLAHE) and high-pass filtering algorithms. The most important innovation is to automatically select the optimal parameters by optimizing the image contrast. The image processing procedure includes the following steps: 1) background and noise removal, 2) high-pass filtering by subtracting the Gaussian-smoothed result, and 3) histogram equalization using the CLAHE algorithm. Three parameters were determined through an iterative optimization based on the interior-point constrained optimization algorithm: the Gaussian smoothing weighting factor, and the CLAHE algorithm block size and clip-limiting parameters. The goal of the optimization is to maximize the entropy of the processed result. Results: A total of 42 RT images were processed. The results were visually evaluated by RT physicians and physicists. About 48% of the images processed by the new method were ranked as excellent. In comparison, only 29% and 18% of the images processed by the basic CLAHE algorithm and by basic window-level adjustment, respectively, were ranked as excellent. Conclusion: This new image contrast enhancement method is robust and automatic, and significantly outperforms the basic CLAHE algorithm and the manual window-level adjustment process that are currently used in clinical 2D image review software tools.
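
    A minimal sketch of the pipeline described above, with a coarse grid search standing in for the paper's interior-point optimizer and a synthetic low-contrast image standing in for an RT setup image (the parameter grids are invented):

        # High-pass (Gaussian subtraction) + CLAHE, choosing parameters that
        # maximize the entropy of the processed image. Synthetic data only.
        import numpy as np
        from scipy.ndimage import gaussian_filter
        from skimage import exposure

        rng = np.random.default_rng(2)
        img = gaussian_filter(rng.random((128, 128)), 8)  # low-contrast stand-in
        img = (img - img.min()) / (img.max() - img.min())

        def entropy(a, bins=64):
            p, _ = np.histogram(a, bins=bins, range=(0, 1))
            p = p[p > 0] / p.sum()
            return float(-np.sum(p * np.log2(p)))

        best = None
        for sigma in (2.0, 4.0, 8.0):             # smoothing scale of high-pass
            for clip in (0.01, 0.02, 0.05):       # CLAHE clip limit
                hp = np.clip(img - 0.5 * gaussian_filter(img, sigma) + 0.25, 0, 1)
                out = exposure.equalize_adapthist(hp, clip_limit=clip)
                h = entropy(out)
                if best is None or h > best[0]:
                    best = (h, sigma, clip)
        print("best entropy %.2f at sigma=%.1f, clip=%.2f" % best)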

  13. An experimental verification of metamaterial coupled enhanced transmission for antenna applications

    SciTech Connect

    Pushpakaran, Sarin V.; Raj, Rohith K.; Pradeep, Anju; Ouseph, Lindo; Hari, Mridula; Chandroth, Aanandan; Pezholil, Mohanan; Kesavath, Vasudevan

    2014-02-10

    Inspired by the work of Bethe on electromagnetic transmission through a subwavelength hole, there has been immense interest in extraordinary transmission through subwavelength slots/slits in metal plates. The invention of metamaterials has further boosted extraordinary transmission through subwavelength slots. We examine, computationally and experimentally, the concept of a metamaterial cover using an array of split-ring resonators (SRRs) for enhancing the transmission of a stacked dipole antenna working in the S band. The front-to-back ratio is considerably improved by enhancing the magnetic resonant strength in close proximity to the slit of the upper parasitic dipole. The effect of the stacking height of the SRR monolayer on the resonant characteristics of the split-ring resonators, and its consequences for the antenna radiation characteristics, has been studied.

  14. Cimstation/IM enhanced data verification, CRADA final report for CRADA number Y-1292-0162

    SciTech Connect

    Biddix, M.D.; Turner, J.

    1994-05-16

    This report discusses a CRADA code used to enhance the Cimstation in the verification of inspection part programs as they are being developed. The report briefly discusses the following topics and contains a code listing at the back: algorithm explanation; Cimstation CAD models; importing inspection point data; the need for a new algorithm; details of the algorithms; formulas/mathematics used for the algorithm; an algorithm software design diagram; and software function descriptions.

  15. Exhaustive and Systematic Accuracy Verification and Enhancement of STI Stress Compact Model for General Realistic Layout Patterns

    NASA Astrophysics Data System (ADS)

    Yamada, Kenta; Syo, Toshiyuki; Yoshimura, Hisao; Ito, Masaru; Kunikiyo, Tatsuya; Kanamoto, Toshiki; Kumashiro, Shigetaka

    Layout-aware compact models proposed so far have been generally verified only for simple test patterns. However, real designs use much more complicated layout patterns. Therefore, models must be verified for such patterns to establish their practicality. This paper proposes a methodology and test patterns for exhaustively and systematically validating layout-aware compact models for general layout patterns for the first time. The methodology and test patterns are concretely shown through validation of a shallow trench isolation (STI) stress compact model proposed in [1]. First, the model parameters for a 55-nm CMOS technology are extracted, and then the model is verified and established to be accurate for the basic patterns used for parameter extraction. Next, fundamental ideas of model operation for general layout patterns are verified using various verification patterns. These tests revealed that the model is relatively weak in some cases not included in the basic patterns. Finally, the errors for these cases are eliminated by enhancing the algorithm. Consequently, the model is confirmed to have high generality. This methodology will be effective for validating other layout-aware compact models for general layout patterns.

  16. Localized surface plasmon resonance immunoassay and verification using surface-enhanced Raman spectroscopy

    NASA Astrophysics Data System (ADS)

    Yonzon, Chanda R.; Zhang, Xiaoyu; Van Duyne, Richard P.

    2003-11-01

    This work exploits the localized surface plasmon resonance (LSPR) spectroscopy of noble metal nanoparticles to achieve sensitive and selective detection of biological analytes. Noble metal nanoparticles exhibit an LSPR that is strongly dependent on their size, shape, material, and the local dielectric environment. The LSPR is also responsible for the intense signals observed in surface-enhanced Raman scattering (SERS). Ag nanoparticles fabricated using the nanosphere lithography (NSL) technique exploit this LSPR sensitivity as a signal transduction method in biosensing applications. The current work implements LSPR biosensing for the anti-dinitrophenyl (antiDNP) immunoassay system. Upon forming the 2,4-dinitrobenzoic acid/antiDNP complex, this system shows a large LSPR shift of 44 nm when exposed to an antiDNP concentration of 1.5 x 10^-6 M. In addition, due to the unique molecular characteristics of the functional groups on the biosensor, it can also be characterized using SERS. First, the nanoparticles are functionalized with a mixed self-assembled monolayer (SAM) comprised of 2:1 octanethiol and 11-amino-undecanethiol. The SAM is exposed to 2,4-dinitrobenzoic acid with the 1-ethyl-3-[3-dimethylaminopropyl]carbodiimide hydrochloride (EDC) coupling reagent. Finally, the 2,4-dinitrophenyl-terminated SAM is exposed to various concentrations of antiDNP. LSPR shifts indicate the occurrence of a binding event, and SER spectra confirm binding of 2,4-dinitrobenzoic acid to the amine-terminated SAM. This LSPR/SERS biosensing method can be generalized to a myriad of biologically relevant systems.

  17. Physical verification of contaminated sediment remediation: Capping, confined aquatic disposal, and enhanced natural recovery

    SciTech Connect

    Browning, D.

    1995-12-31

    Dredging and disposal in a confined aquatic disposal (CAD) site, capping with clean sediment, and natural recovery are commonly used, cost-effective remedial practices for contaminated sediments. Recent projects in Puget Sound, Washington, and Southern California involved dredging and use of the material for capping and CAD fill. Both of these projects required physical monitoring to document sediment placement. Dredged sediments placed at these sites were optically identified using sediment vertical profile system (SVPS) photography. Optical criteria used to distinguish cap/construction materials include grain size, reflectance, and texture. Environmental parameters such as the extent and thickness of the CAD material or sediment cap deposits are evaluated against design and performance goals, typically the isolation of contaminants from the biologically active portion of the sediment column. Using SVPS, coring and other technologies, the stratigraphic contact between the capping/CAD sediment and the native sediment can be discerned. These observations can ground-truth and be coupled with remote sensing to provide a more complete characterization of the entire remedial area. Physical isolation of the benthic community can be discerned by examining SVPS images for depth of bioturbation and sediment stratigraphy. On the periphery of cap/CAD deposits, thin layers of clean sediment ranging upwards from 1 mm thick can be identified. Depending on the pre-remediation benthic community at the site, these thin layers of cap/CAD sediment can be bioturbated by resident benthic infauna immediately after placement. The deposition and subsequent assimilation of the clean cap material into the contaminated sediments effectively reduces the concentration of contaminants in the biologically active zone, thereby enhancing natural recovery in areas where regulatory criteria are focused on the biologically active zone.

  18. Hardware design document for the Infrasound Prototype for a CTBT IMS station

    SciTech Connect

    Breding, D.R.; Kromer, R.P.; Whitaker, R.W.; Sandoval, T.

    1997-11-01

    The Hardware Design Document (HDD) describes the various hardware components used in the Comprehensive Test Ban Treaty (CTBT) Infrasound Prototype and their interrelationships. It divides the infrasound prototype into hardware configuration items (HWCIs). The HDD uses techniques such as block diagrams and parts lists to present this information. The level of detail provided in the following sections should be sufficient to allow potential users to procure and install the infrasound system. Infrasonic monitoring is a low-cost, robust, and effective technology for detecting atmospheric explosions: low frequencies from explosion signals propagate to long ranges (a few thousand kilometers), where they can be detected with an array of sensors.

  19. Big Data Solution for CTBT Monitoring Using Global Cross Correlation

    NASA Astrophysics Data System (ADS)

    Gaillard, P.; Bobrov, D.; Dupont, A.; Grenouille, A.; Kitov, I. O.; Rozhkov, M.

    2014-12-01

    Due to the mismatch between data volume and the performance of the Information Technology infrastructure used in seismic data centers, it is becoming more and more difficult to process all the data with traditional applications in a reasonable elapsed time. To fulfill their missions, the International Data Centre of the Comprehensive Nuclear-Test-Ban Treaty Organization (CTBTO/IDC) and the Département Analyse Surveillance Environnement of the Commissariat à l'Energie atomique et aux énergies alternatives (CEA/DASE) collect, process and produce complex data sets whose volume is growing exponentially. In the medium term, computer architectures, data management systems and application algorithms will require fundamental changes to meet the needs. This problem is well known and identified as a "Big Data" challenge. To tackle this major task, the CEA/DASE is taking part for two years in the "DataScale" project. Started in September 2013, DataScale gathers a large set of partners (research laboratories, SMEs and big companies). The common objective is to design efficient solutions exploiting the synergy between Big Data solutions and High Performance Computing (HPC). The project will evaluate the relevance of these technological solutions by implementing a demonstrator for seismic event detection based on massive waveform correlation. The IDC has developed expertise in such techniques, leading to an algorithm called "Master Event", and provides a high-quality dataset for an extensive cross-correlation study. The objective of the project is to enhance the Master Event algorithm and to reanalyze 10 years of waveform data from the International Monitoring System (IMS) network using a dedicated HPC infrastructure operated by the "Centre de Calcul Recherche et Technologie" at the CEA site of Bruyères-le-Châtel. The dataset used for the demonstrator includes more than 300,000 seismic events, tens of millions of raw detections and more than 30 terabytes of continuous seismic data
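
    The fan-out pattern behind such a demonstrator, reduced to a toy Python sketch: template-station correlation jobs are distributed across worker processes and candidate detections gathered back. The per-job correlation work is replaced by a placeholder here, and the job grid and threshold are hypothetical.

        # Embarrassingly parallel job fan-out for massive cross-correlation;
        # the per-job correlation is a placeholder, not a real CC computation.
        from concurrent.futures import ProcessPoolExecutor
        from itertools import product
        import numpy as np

        def correlate_job(args):
            template_id, station_id = args
            rng = np.random.default_rng(template_id * 1000 + station_id)
            cc_peak = float(rng.random())       # stand-in for real CC work
            return template_id, station_id, cc_peak

        jobs = list(product(range(4), range(8)))  # 4 masters x 8 stations
        if __name__ == "__main__":
            with ProcessPoolExecutor(max_workers=4) as pool:
                results = list(pool.map(correlate_job, jobs))
            hits = [r for r in results if r[2] > 0.9]
            print("%d jobs run, %d candidate detections" % (len(results), len(hits)))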

  20. Complete regional waveform modeling to estimate seismic velocity structure and source parameters for CTBT monitoring

    SciTech Connect

    Bredbeck, T; Rodgers, A; Walter, W

    1999-07-23

    The velocity structures and source parameters estimated by waveform modeling provide valuable information for CTBT monitoring. The inferred crustal and uppermost mantle structures advance understanding of tectonics and guide regionalization for event location and identification efforts. Estimation of source parameters such as seismic moment, depth and mechanism (whether earthquake, explosion or collapse) is crucial to event identification. In this paper we briefly outline some of the waveform modeling research for CTBT monitoring performed in the last year. In the future we will estimate structure for new regions by modeling waveforms of large, well-observed events along additional paths. Of particular interest will be the estimation of velocity structure in aseismic regions such as most of Africa and the former Soviet Union. Our previous work on aseismic regions in the Middle East, north Africa and south Asia gives us confidence to proceed with our current methods. Using the inferred velocity models, we plan to estimate source parameters for smaller events. It is especially important to obtain seismic moments of earthquakes for use in applying the Magnitude-Distance Amplitude Correction (MDAC; Taylor et al., 1999) to regional body-wave amplitudes for discrimination and for calibrating coda-based magnitude scales.

  1. Acoustic-Seismic Coupling of Broadband Signals - Analysis of Potential Disturbances during CTBT On-Site Inspection Measurements

    NASA Astrophysics Data System (ADS)

    Liebsch, Mattes; Altmann, Jürgen

    2015-04-01

    For the verification of the Comprehensive Nuclear-Test-Ban Treaty (CTBT), the precise localisation of possible underground nuclear explosion sites is important. During an on-site inspection (OSI), sensitive seismic measurements of aftershocks can be performed, which, however, can be disturbed by other signals. To improve the quality and effectiveness of these measurements it is essential to understand those disturbances so that they can be reduced or prevented. In our work we focus on disturbing signals caused by airborne sources: when the sound of aircraft (as often used by the inspectors themselves) hits the ground, it propagates through pores in the soil; its energy is transferred to the ground, and the resulting soil vibrations can mask weak aftershock signals. The understanding of the coupling of acoustic waves to the ground is still incomplete, yet it is necessary for improving the performance of an OSI, e.g. to address potential consequences for sensor placement, helicopter trajectories, etc. We present our recent advances in this field. We performed several measurements to record the sound pressure and soil velocity produced by various sources, e.g. broadband excitation by jet aircraft passing overhead and signals artificially produced by a speaker. For our experimental set-up, microphones were placed close to the ground and geophones were buried at different depths in the soil. Several sensors were shielded from the directly incident acoustic signals by a box coated with acoustic damping material. While the sound pressure under the box was strongly reduced, the soil velocity measured under the box was only slightly smaller than outside of it. Thus these soil vibrations were mostly created outside the box and travelled through the soil to the sensors. This information is used to estimate characteristic propagation lengths of the acoustically induced signals in the soil. In the seismic data we observed interference patterns which are likely caused by the

  2. Applying monitoring, verification, and accounting techniques to a real-world, enhanced oil recovery operational CO2 leak

    USGS Publications Warehouse

    Wimmer, B.T.; Krapac, I.G.; Locke, R.; Iranmanesh, A.

    2011-01-01

    The use of carbon dioxide (CO2) for enhanced oil recovery (EOR) is being tested in oil fields in the Illinois Basin, USA. While this technology has shown promise for improving oil production, it has raised some issues about the safety of CO2 injection and storage. The Midwest Geological Sequestration Consortium (MGSC) organized a Monitoring, Verification, and Accounting (MVA) team to develop and deploy monitoring programs at three EOR sites in Illinois, Indiana, and Kentucky, USA. MVA goals include establishing baseline conditions to evaluate potential impacts from CO2 injection, demonstrating that project activities are protective of human health and the environment, and providing an accurate accounting of stored CO2. This paper focuses on the use of MVA techniques in monitoring a small CO2 leak from a supply line at an EOR facility under real-world conditions. The ability of shallow monitoring techniques to detect and quantify a CO2 leak under real-world conditions has been largely unproven. In July of 2009, a leak in the pipe supplying pressurized CO2 to an injection well was observed at an MGSC EOR site located in west-central Kentucky. Carbon dioxide was escaping from the supply pipe, located approximately 1 m underground. The leak was discovered visually by site personnel and injection was halted immediately. At its largest extent, the hole created by the leak was approximately 1.9 m long by 1.7 m wide and 0.7 m deep. This circumstance provided an excellent opportunity to evaluate the performance of several monitoring techniques, including soil CO2 flux measurements, portable infrared gas analysis, thermal infrared imagery, and aerial hyperspectral imagery. Valuable experience was gained during this effort. Lessons learned included: 1) hyperspectral imagery was not effective in detecting this relatively small, short-term CO2 leak; and 2) even though injection was halted, the leak remained dynamic and presented a safety risk concern
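
    Of the techniques listed, the soil CO2 flux measurement has the most compact arithmetic: an accumulation-chamber measurement converts the rate of concentration rise in a closed chamber to a flux via the ideal gas law. The chamber geometry and concentration slope below are made-up example values, not the study's measurements.

        # Accumulation-chamber soil CO2 flux from a ppm/s concentration slope.
        R_GAS = 8.314                   # J / (mol K)

        def co2_flux_umol_m2_s(dcdt_ppm_s, vol_m3, area_m2,
                               pres_pa=101325.0, temp_k=293.15):
            molar_density = pres_pa / (R_GAS * temp_k)  # mol air per m3
            # ppm/s -> mol CO2 m-3 s-1, times chamber volume, over footprint
            return dcdt_ppm_s * 1e-6 * molar_density * vol_m3 / area_m2 * 1e6

        # e.g. a 0.8 ppm/s rise in a 10 L chamber over a 0.03 m2 collar
        print("flux = %.1f umol m-2 s-1" % co2_flux_umol_m2_s(0.8, 0.010, 0.03))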

  3. Seismic Characterization of Coal-Mining Seismicity in Utah for CTBT Monitoring

    SciTech Connect

    Arabasz, W J; Pechmann, J C

    2001-03-01

    Underground coal mining (down to {approx}0.75 km depth) in the contiguous Wasatch Plateau (WP) and Book Cliffs (BC) mining districts of east-central Utah induces abundant seismicity that is monitored by the University of Utah regional seismic network. This report presents the results of a systematic characterization of mining seismicity (magnitude {le} 4.2) in the WP-BC region from January 1978 to June 2000, together with an evaluation of three seismic events (magnitude {le} 4.3) associated with underground trona mining in southwestern Wyoming during January-August 2000. (Unless specified otherwise, magnitude implies Richter local magnitude, M{sub L}.) The University of Utah Seismograph Stations (UUSS) undertook this cooperative project to assist the University of California Lawrence Livermore National Laboratory (LLNL) in research and development relating to monitoring the Comprehensive Test Ban Treaty (CTBT). The project, which formally began February 28, 1998, and ended September 1, 2000, had three basic objectives: (1) strategically install a three-component broadband digital seismic station in the WP-BC region to ensure the continuous recording of high-quality waveform data to meet the long-term needs of LLNL, UUSS, and other interested parties, including the international CTBT community; (2) determine source mechanisms, to the extent that available source data and resources allowed, for comparative seismic characterization of stress release in mines versus earthquakes in the WP-BC study region; (3) gather and report to LLNL local information on mine operations and associated seismicity, including ''ground truth'' for significant events. Following guidance from LLNL's Technical Representative, the focus of Objective 2 was changed slightly to place emphasis on three mining-related events that occurred in and near the study area after the original work plan had been made, thus posing new targets of opportunity. These included: a magnitude 3.8 shock that occurred

  4. Engineering Upgrades to the Radionuclide Aerosol Sampler/Analyzer for the CTBT International Monitoring System

    SciTech Connect

    Forrester, Joel B.; Carty, Fitz; Comes, Laura; Hayes, James C.; Miley, Harry S.; Morris, Scott J.; Ripplinger, Mike D.; Slaugh, Ryan W.; Van Davelaar, Peter

    2013-05-13

    The Radionuclide Aerosol Sampler/Analyzer (RASA) is an automated aerosol collection and analysis system designed by Pacific Northwest National Laboratory in the 1990s and deployed in several locations around the world as part of the International Monitoring System (IMS) required under the Comprehensive Nuclear-Test-Ban Treaty (CTBT). The utility of such an automated system lies in the reduction of human intervention and the production of perfectly uniform results. However, maintainability and downtime issues threaten this utility, even for systems with over 90% data availability. Engineering upgrades to the RASA are currently being pursued to address these issues, as well as lessons learned from Fukushima. Current work includes a new automation control unit, and other potential improvements, such as alternative detector cooling and sampling options, are under review. This paper presents the current state of the upgrades and improvements under investigation

  5. Capability of the CTBT infrasound stations detecting the 2013 Russian fireball

    NASA Astrophysics Data System (ADS)

    Pilger, Christoph; Ceranna, Lars; Ross, J. Ole; Le Pichon, Alexis; Mialle, Pierrick; Garces, Milton

    2015-04-01

    The explosive fragmentation of the 2013 Chelyabinsk meteorite generated a large airburst with an equivalent yield of 500 kT TNT. It is the most energetic event recorded by the infrasound component of the CTBT-IMS, globally detected by 20 out of 42 operational stations. This study performs a station-by-station estimation of the IMS detection capability to explain infrasound detections and non-detections from short to long distances, using the Chelyabinsk meteorite as global reference event. Investigated parameters influencing the detection capability are the directivity of the line source signal, the ducting of acoustic energy and the individual noise conditions at each station. Findings include a clear detection preference for stations perpendicular to the meteorite trajectory, even over large distances. Only a weak influence of stratospheric ducting is observed for this low-frequency case. Furthermore, a strong dependence on the diurnal variability of background noise levels at each station is observed, favoring nocturnal detections.

  6. Verification and Validation of NASA-Supported Enhancements to the Near Real Time Harmful Algal Blooms Observing System (HABSOS)

    NASA Technical Reports Server (NTRS)

    Spruce, Joseph P.; Hall, Calllie; McPherson, Terry; Spiering, Bruce; Brown, Richard; Estep, Lee; Lunde, Bruce; Guest, DeNeice; Navard, Andy; Pagnutti, Mary; Ryan, Robert E.

    2006-01-01

    This report discusses the verification and validation (V&V) assessment of Moderate Resolution Imaging Spectroradiometer (MODIS) ocean data products contributed by the Naval Research Laboratory (NRL) and Applied Coherent Technologies (ACT) Corporation to the National Oceanic and Atmospheric Administration's (NOAA) Near Real Time (NRT) Harmful Algal Blooms Observing System (HABSOS). HABSOS is a maturing decision support tool (DST) used by NOAA and its partners involved in coastal and public health management.

  7. Blind Source Separation of Seismic Events with Independent Component Analysis: CTBT related exercise

    NASA Astrophysics Data System (ADS)

    Rozhkov, Mikhail; Kitov, Ivan

    2015-04-01

    Blind Source Separation (BSS) methods used in signal recovery applications are attractive because they use minimal a priori information about the signals they are dealing with. Homomorphic deconvolution and cepstrum estimation are probably the only methods used to any extent in CTBT applications that can be attributed to this branch of technology. However, Expert Technical Analysis (ETA) conducted in the CTBTO to improve the estimated values of the standard signal and event parameters according to the Protocol to the CTBT may face problems which cannot be resolved with certified CTBTO applications and may demand specific techniques not presently in use. The problem considered within the ETA framework is the unambiguous separation of signals with close arrival times. Here, we examine two scenarios of interest: (1) separation of two almost co-located explosions conducted within fractions of a second of each other, and (2) extraction of explosion signals merged with wavetrains from a strong earthquake. The importance of resolving the problem in case 1 is connected with correct explosion yield estimation; case 2 is a well-known scenario for conducting clandestine nuclear tests. While the first case can be approached to some degree with cepstral methods, the second can hardly be resolved with the conventional methods implemented at the International Data Centre, especially if the signals have close slowness and azimuth. Independent Component Analysis (in its FastICA implementation), which assumes non-Gaussianity of the underlying processes in the signal mixture, is a blind source separation method that we apply to resolve the problems mentioned above. We have tested this technique with synthetic waveforms, seismic data from DPRK explosions and mining blasts conducted within the East European platform, as well as signals from strong teleseismic events (the Sumatra, April 2012, Mw=8.6, and Tohoku, March 2011, Mw=9.0 earthquakes). The data were recorded by seismic arrays of the
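
    The separation idea can be demonstrated in a few lines with scikit-learn's FastICA on toy data: two synthetic sources with overlapping arrivals are mixed onto two channels and unmixed without prior knowledge of the sources or the mixing matrix (the waveforms are invented, not DPRK or earthquake records).

        # Blind source separation of two mixed synthetic "signals" with FastICA.
        import numpy as np
        from sklearn.decomposition import FastICA

        t = np.linspace(0, 10, 2000)
        s1 = np.sin(2 * np.pi * 1.0 * t) * np.exp(-0.5 * (t - 4.0) ** 2)
        s2 = np.sign(np.sin(2 * np.pi * 0.7 * t)) * np.exp(-0.5 * (t - 5.0) ** 2)
        S = np.c_[s1, s2]                          # true sources (overlapping)
        A = np.array([[1.0, 0.6], [0.4, 1.0]])     # unknown mixing matrix
        X = S @ A.T                                # two observed channels

        ica = FastICA(n_components=2, random_state=0)
        S_est = ica.fit_transform(X)               # recovered (scaled/permuted)
        for k in range(2):
            r = max(abs(np.corrcoef(S_est[:, k], S[:, j])[0, 1]) for j in range(2))
            print("component %d: best |corr| with a true source = %.2f" % (k, r))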

  8. CTBT infrasound network performance to detect the 2013 Russian fireball event

    NASA Astrophysics Data System (ADS)

    Pilger, Christoph; Ceranna, Lars; Ross, J. Ole; Le Pichon, Alexis; Mialle, Pierrick; Garcés, Milton A.

    2015-04-01

    The explosive fragmentation of the 2013 Chelyabinsk meteorite generated a large airburst with an equivalent yield of 500 kT TNT. It is the most energetic event recorded by the infrasound component of the Comprehensive Nuclear-Test-Ban Treaty-International Monitoring System (CTBT-IMS), globally detected by 20 out of 42 operational stations. This study performs a station-by-station estimation of the IMS detection capability to explain infrasound detections and nondetections from short to long distances, using the Chelyabinsk meteorite as global reference event. Investigated parameters influencing the detection capability are the directivity of the line source signal, the ducting of acoustic energy, and the individual noise conditions at each station. Findings include a clear detection preference for stations perpendicular to the meteorite trajectory, even over large distances. Only a weak influence of stratospheric ducting is observed for this low-frequency case. Furthermore, a strong dependence on the diurnal variability of background noise levels at each station is observed, favoring nocturnal detections.

  9. Geometric verification

    NASA Technical Reports Server (NTRS)

    Grebowsky, G. J.

    1982-01-01

    Present LANDSAT data formats are reviewed to clarify how the geodetic location and registration capabilities were defined for P-tape products and RBV data. Since there is only one geometric model used in the master data processor, the geometric location accuracy of P-tape products depends on the absolute accuracy of the model, and registration accuracy is determined by the stability of the model. Due primarily to inaccuracies in data provided by the LANDSAT attitude management system, the desired accuracies are obtained only by using ground control points and a correlation process. Verification of system performance with regard to geodetic location requires the capability to determine pixel positions of map points in a P-tape array. Verification of registration performance requires the capability to determine pixel positions of common points (not necessarily map points) in 2 or more P-tape arrays for a given world reference system scene. Techniques for registration verification can be more varied and automated, since map data are not required. The verification of LACIE extractions is used as an example.

  10. Verification Results of Jet Resonance-enhanced Multiphoton Ionization as a Real-time PCDD/F Emission Monitor

    EPA Science Inventory

    The Jet REMPI (Resonance Enhanced Multiphoton Ionization) monitor was tested on a hazardous waste firing boiler for its ability to determine concentrations of polychlorinated dibenzodioxins and dibenzofurans (PCDDs/Fs). Jet REMPI is a real time instrument capable of highly selec...

  11. Multibody modeling and verification

    NASA Technical Reports Server (NTRS)

    Wiens, Gloria J.

    1989-01-01

    A summary of a ten week project on flexible multibody modeling, verification and control is presented. Emphasis was on the need for experimental verification. A literature survey was conducted for gathering information on the existence of experimental work related to flexible multibody systems. The first portion of the assigned task encompassed the modeling aspects of flexible multibodies that can undergo large angular displacements. Research in the area of modeling aspects were also surveyed, with special attention given to the component mode approach. Resulting from this is a research plan on various modeling aspects to be investigated over the next year. The relationship between the large angular displacements, boundary conditions, mode selection, and system modes is of particular interest. The other portion of the assigned task was the generation of a test plan for experimental verification of analytical and/or computer analysis techniques used for flexible multibody systems. Based on current and expected frequency ranges of flexible multibody systems to be used in space applications, an initial test article was selected and designed. A preliminary TREETOPS computer analysis was run to ensure frequency content in the low frequency range, 0.1 to 50 Hz. The initial specifications of experimental measurement and instrumentation components were also generated. Resulting from this effort is the initial multi-phase plan for a Ground Test Facility of Flexible Multibody Systems for Modeling Verification and Control. The plan focusses on the Multibody Modeling and Verification (MMV) Laboratory. General requirements of the Unobtrusive Sensor and Effector (USE) and the Robot Enhancement (RE) laboratories were considered during the laboratory development.

  12. Uranium systems to enhance benchmarks for use in the verification of criticality safety computer models. Final report, February 16, 1990--December 31, 1994

    SciTech Connect

    Busch, R.D.

    1995-02-24

    Dr. Robert Busch of the Department of Chemical and Nuclear Engineering was the principal investigator on this project, with technical direction provided by the staff in the Nuclear Criticality Safety Group at Los Alamos. During the period of the contract, he had a number of graduate and undergraduate students working on subtasks. The objective of this work was to develop information on uranium systems to enhance benchmarks for use in the verification of criticality safety computer models. During the first year of this project, most of the work was focused on setting up the SUN SPARC-1 Workstation and acquiring the literature which described the critical experiments. By August 1990, the Workstation was operational with the current version of TWODANT loaded on the system. The MCNP version 4 tape was made available from Los Alamos late in 1990. Various documents were acquired which provide the initial descriptions of the critical experiments under consideration as benchmarks. The next four years were spent working on various benchmark projects. A number of publications and presentations were made on this material. These are briefly discussed in this report.

  13. NG09 And CTBT On-Site Inspection Noble Gas Sampling and Analysis Requirements

    NASA Astrophysics Data System (ADS)

    Carrigan, Charles R.; Tanaka, Junichi

    2010-05-01

    A provision of the Comprehensive Test Ban Treaty (CTBT) allows on-site inspections (OSIs) of suspect nuclear sites to determine whether a detected event is nuclear in origin. For an underground nuclear explosion (UNE), the potential success of an OSI depends significantly on the containment scenario of the alleged event as well as on the application of air and soil-gas radionuclide sampling techniques in a manner that takes into account both the suspect site geology and the gas transport physics. UNE scenarios may be broadly divided into categories by level of containment. The simplest to detect is a UNE that vents a significant portion of its radionuclide inventory and is readily detectable at distance by the International Monitoring System (IMS). The most well-contained subsurface events will only be detectable during an OSI. In such cases, 37Ar and radioactive xenon cavity gases may reach the surface through either "micro-seepage" or the barometric pumping process, and only the careful siting of sampling locations, timing of sampling, and application of the most site-appropriate atmospheric and soil-gas capturing methods will result in a confirmatory signal. The OSI noble gas field test NG09 was recently held in Stupava, Slovakia to consider, in addition to other field sampling and analysis techniques, drilling and subsurface noble gas extraction methods that might be applied during an OSI. One of the experiments focused on challenges to soil-gas sampling near the soil-atmosphere interface. During withdrawal of soil gas from shallow, subsurface sample points, atmospheric dilution of the sample and the potential for introduction of unwanted atmospheric gases were considered. Tests were designed to evaluate surface infiltration and the ability of inflatable well-packers to seal out atmospheric gases during sample acquisition. We discuss these tests along with some model-based predictions regarding infiltration under different near

  14. Optimal design of antireflection coating and experimental verification by plasma enhanced chemical vapor deposition in small displays

    SciTech Connect

    Yang, S. M.; Hsieh, Y. C.; Jeng, C. A.

    2009-03-15

    Conventional antireflection coatings based on thin films of quarter-wavelength thickness are limited by material selection and the available refractive indices of those films. An optimal design using non-quarter-wavelength thicknesses is presented in this study. A multilayer thin-film model is developed using admittance loci to show that a two-layer thin film of SiN{sub x}/SiO{sub y} at 124/87 nm and a three-layer film of SiN{sub x}/SiN{sub y}/SiO{sub z} at 58/84/83 nm can achieve average transmittances of 94.4% and 94.9%, respectively, on polymer, glass, and silicon substrates. The optimal design is validated by plasma enhanced chemical vapor deposition of N{sub 2}O/SiH{sub 4} and NH{sub 3}/SiH{sub 4} to achieve the desired optical constants. Application of the antireflection coating to a 4 in. liquid crystal display demonstrates that the transmittance is over 94%, the mean luminance can be increased by 25%, and the total reflection angle increased from 41 deg. to 58 deg.
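
    For readers unfamiliar with how such multilayer transmittances are evaluated, here is a minimal characteristic-matrix sketch at normal incidence. The 124/87 nm thicknesses follow the abstract, but the refractive-index values, wavelength, layer ordering, and lossless-film assumption are ours, not the paper's measured optical constants.

    ```python
    # Sketch: normal-incidence transmittance of a two-layer stack via the
    # standard characteristic-matrix method. Indices below are assumptions.
    import numpy as np

    def layer_matrix(n, d_nm, wl_nm):
        delta = 2 * np.pi * n * d_nm / wl_nm          # layer phase thickness
        return np.array([[np.cos(delta), 1j * np.sin(delta) / n],
                         [1j * n * np.sin(delta), np.cos(delta)]])

    def transmittance(layers, n_air=1.0, n_sub=1.52, wl_nm=550.0):
        M = np.eye(2, dtype=complex)
        for n, d in layers:                           # listed from the air side
            M = M @ layer_matrix(n, d, wl_nm)
        B, C = M @ np.array([1.0, n_sub])
        r = (n_air * B - C) / (n_air * B + C)         # amplitude reflectance
        return 1.0 - abs(r) ** 2                      # lossless films: T = 1 - R

    # SiNx/SiOy at 124/87 nm on glass, with assumed indices ~2.0 and ~1.46
    print(f"T = {transmittance([(2.0, 124.0), (1.46, 87.0)]):.3f}")
    ```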

  15. Guide to good practices for independent verification

    SciTech Connect

    1998-12-01

    This Guide to Good Practices is written to enhance understanding of, and provide direction for, Independent Verification, Chapter X of Department of Energy (DOE) Order 5480.19, Conduct of Operations Requirements for DOE Facilities. The practices in this guide should be considered when planning or reviewing independent verification activities. Contractors are advised to adopt procedures that meet the intent of DOE Order 5480.19. Independent Verification is an element of an effective Conduct of Operations program. The complexity and array of activities performed in DOE facilities dictate the necessity for coordinated independent verification activities to promote safe and efficient operations.

  16. ETV - VERIFICATION TESTING (ENVIRONMENTAL TECHNOLOGY VERIFICATION PROGRAM)

    EPA Science Inventory

    Verification testing is a major component of the Environmental Technology Verification (ETV) program. The ETV Program was instituted to verify the performance of innovative technical solutions to problems that threaten human health or the environment and was created to substantia...

  17. Swarm Verification

    NASA Technical Reports Server (NTRS)

    Holzmann, Gerard J.; Joshi, Rajeev; Groce, Alex

    2008-01-01

    Reportedly, supercomputer designer Seymour Cray once said that he would sooner use two strong oxen to plow a field than a thousand chickens. Although this is undoubtedly wise when it comes to plowing a field, it is not so clear for other types of tasks. Model checking problems are of the proverbial "needle in a haystack" type. Such problems can often be parallelized easily. Alas, none of the usual divide and conquer methods can be used to parallelize the working of a model checker. Given that it has become easier than ever to gain access to large numbers of computers to perform even routine tasks, it is becoming more and more attractive to find alternate ways to use these resources to speed up model checking tasks. This paper describes one such method, called swarm verification.
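
    A toy sketch of the swarm idea, not the authors' SPIN-based implementation: launch many small, differently randomized, bounded searches in parallel and stop as soon as any of them reaches a target ("needle") state. The state space, transition function, and budgets below are placeholders.

    ```python
    # Toy sketch of swarm verification: many small, differently seeded,
    # bounded searches run in parallel; the first to reach a target state
    # wins. The state space below is a placeholder, not a real model.
    import random
    from multiprocessing import Pool

    NEEDLE = 987_654          # hypothetical "error" state we hope to reach
    STATE_LIMIT = 200_000     # per-worker budget: bounded, unlike full search

    def successors(state):
        # Stand-in for a model's next-state relation.
        return [(state * 3 + 1) % 1_000_003, (state + 17) % 1_000_003]

    def bounded_random_search(seed):
        rng = random.Random(seed)          # diversification: each seed walks
        state, visited = 1, set()          # a different slice of the space
        for _ in range(STATE_LIMIT):
            if state == NEEDLE:
                return seed
            visited.add(state)
            nxt = [s for s in successors(state) if s not in visited]
            state = rng.choice(nxt) if nxt else rng.randrange(1_000_003)
        return None                        # budget exhausted, no needle

    if __name__ == "__main__":
        found = None
        with Pool(8) as pool:              # eight "chickens" instead of one ox
            for seed in pool.imap_unordered(bounded_random_search, range(64)):
                if seed is not None:
                    found = seed
                    pool.terminate()
                    break
        print("needle found by seed:", found)  # None: no worker hit it in budget
    ```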

  18. Generic Verification Protocol for Verification of Online Turbidimeters

    EPA Science Inventory

    This protocol provides generic procedures for implementing a verification test for the performance of online turbidimeters. The verification tests described in this document will be conducted under the Environmental Technology Verification (ETV) Program. Verification tests will...

  19. Verification of VLSI designs

    NASA Technical Reports Server (NTRS)

    Windley, P. J.

    1991-01-01

    In this paper we explore the specification and verification of VLSI designs. The paper focuses on abstract specification and verification of functionality using mathematical logic, as opposed to low-level boolean equivalence verification such as that done using BDDs and Model Checking. Specification and verification, sometimes called formal methods, are one tool for increasing computer dependability in the face of an exponentially increasing testing effort.

  20. Post-installation activities in the Comprehensive Nuclear Test Ban Treaty (CTBT) International Monitoring System (IMS) infrasound network

    NASA Astrophysics Data System (ADS)

    Vivas Veloso, J. A.; Christie, D. R.; Hoffmann, T. L.; Campus, P.; Bell, M.; Langlois, A.; Martysevich, P.; Demirovik, E.; Carvalho, J.; Kramer, A.; Wu, Sean F.

    2002-11-01

    The provisional operation and maintenance of IMS infrasound stations after installation and subsequent certification has the objective of preparing the infrasound network for entry into force of the Comprehensive Nuclear-Test-Ban Treaty (CTBT). The goal is to maintain and fine-tune the technical capabilities of the network, to repair faulty equipment, and to ensure that stations continue to meet the minimum specifications through evaluation of data quality and station recalibration. Due to the globally dispersed nature of the network, this program constitutes a significant undertaking that requires careful consideration of possible logistic approaches and their financial implications. Currently, 11 of the 60 IMS infrasound stations are transmitting data in the post-installation Testing & Evaluation mode. Another 5 stations are under provisional operation and are maintained in post-certification mode. It is expected that 20% of the infrasound network will be certified by the end of 2002. This presentation will focus on the different phases of post-installation activities of the IMS infrasound program and the logistical challenges to be tackled to ensure cost-efficient management of the network. Specific topics will include Testing & Evaluation and Certification of Infrasound Stations, as well as Configuration Management and Network Sustainment.

  1. Verification of Adaptive Systems

    SciTech Connect

    Pullum, Laura L; Cui, Xiaohui; Vassev, Emil; Hinchey, Mike; Rouff, Christopher; Buskens, Richard

    2012-01-01

    Adaptive systems are critical for future space and other unmanned and intelligent systems. Verification of these systems is also critical for their use in systems with potential harm to human life or with large financial investments. Due to their nondeterministic nature and extremely large state space, current methods for verification of software systems are not adequate to provide a high level of assurance for them. The combination of stabilization science, high performance computing simulations, compositional verification and traditional verification techniques, plus operational monitors, provides a complete approach to verification and deployment of adaptive systems that has not been used before. This paper gives an overview of this approach.

  2. The Case of the 12 May 2010 Event in North Korea: the Role of Temporary Seismic Deployments as National Technical Means for CTBT Verification

    NASA Astrophysics Data System (ADS)

    Koch, K.; Kim, W. Y.; Schaff, D. P.; Richards, P. G.

    2015-12-01

    Since 2012 there has been debate about a low-yield nuclear explosion within North Korea, initially claimed to have occurred in April/May 2010 on the basis of a number of Level 5 radionuclide detections from stations of the radionuclide subnetwork of the International Monitoring System (IMS) and additional reports from similar national facilities. Whereas the announced nuclear tests in North Korea in 2006, 2009 and 2013 were clearly detected seismically, there was initially a lack of detections from the seismological component of the IMS corresponding to a possible nuclear test in 2010. Work published recently by Zhang and Wen in Seismological Research Letters (Jan/Feb 2015), inferring seismological evidence for an explosion in North Korea at about 0009 hours on 12 May 2010 (UTC), has attracted further attention. Previous studies of seismicity of the North Korean test site for days prior to this date had not found any such evidence from IMS or non-IMS stations. The data used by Zhang and Wen were from stations in northeastern China about 80 to 200 km from the North Korean test site and are currently not available for open research. A search for openly available data was undertaken, resulting in relevant waveforms obtained both from the IRIS Consortium (from a PASSCAL experiment in northeastern China, as noted also by Ford and Walter, 2015), and from another temporary seismic deployment, also in China. The data from these stations showed signals consistent with the seismic disturbance found by Zhang and Wen. These supplementary stations thus constitute a monitoring resource providing objective data, in the present case for an event even below magnitude 2 and thus much smaller than can be monitored by the usual assets. Efforts are currently underway to use the data from these stations to investigate the compatibility of the event with other explosion-type events, or with an earthquake.

  3. Simulation verification techniques study

    NASA Technical Reports Server (NTRS)

    Schoonmaker, P. B.; Wenglinski, T. H.

    1975-01-01

    Results are summarized of the simulation verification techniques study which consisted of two tasks: to develop techniques for simulator hardware checkout and to develop techniques for simulation performance verification (validation). The hardware verification task involved definition of simulation hardware (hardware units and integrated simulator configurations), survey of current hardware self-test techniques, and definition of hardware and software techniques for checkout of simulator subsystems. The performance verification task included definition of simulation performance parameters (and critical performance parameters), definition of methods for establishing standards of performance (sources of reference data or validation), and definition of methods for validating performance. Both major tasks included definition of verification software and assessment of verification data base impact. An annotated bibliography of all documents generated during this study is provided.

  4. Software verification and testing

    NASA Technical Reports Server (NTRS)

    1985-01-01

    General procedures for software verification and validation are provided as a guide for managers, programmers, and analysts involved in software development. The verification and validation procedures described are based primarily on testing techniques. Testing refers to the execution of all or part of a software system for the purpose of detecting errors. Planning, execution, and analysis of tests are outlined in this document. Code reading and static analysis techniques for software verification are also described.

  5. Use of open source information and commercial satellite imagery for nuclear nonproliferation regime compliance verification by a community of academics

    NASA Astrophysics Data System (ADS)

    Solodov, Alexander

    The proliferation of nuclear weapons is a great threat to world peace and stability. The question of strengthening the nonproliferation regime has been open for a long period of time. In 1997 the International Atomic Energy Agency (IAEA) Board of Governors (BOG) adopted the Additional Safeguards Protocol. The purpose of the protocol is to enhance the IAEA's ability to detect undeclared production of fissile materials in member states. However, the IAEA does not always have sufficient human and financial resources to accomplish this task. Developed here is a concept for making use of human and technical resources available in academia to enhance the IAEA's mission. The objective of this research was to study the feasibility of an academic community using commercially or publicly available sources of information and products for the purpose of detecting covert facilities and activities intended for the unlawful acquisition of fissile materials or production of nuclear weapons. In this study, the availability and use of commercial satellite imagery systems, commercial computer codes for satellite imagery analysis, the Comprehensive Test Ban Treaty (CTBT) verification International Monitoring System (IMS), publicly available information sources such as watchdog groups and press reports, and Customs Services information were explored. A system for integrating these data sources to form conclusions was also developed. The results proved that publicly and commercially available sources of information and data analysis can be a powerful tool in tracking violations of the international nuclear nonproliferation regime, and a framework for implementing these tools in an academic community was developed. As a result of this study, the formation of an International Nonproliferation Monitoring Academic Community (INMAC) is proposed. This would be an independent organization consisting of academics (faculty, staff and students) from both nuclear weapon states (NWS) and

  6. Verification System: First System-Wide Performance Test

    NASA Astrophysics Data System (ADS)

    Chernobay, I.; Zerbo, L.

    2006-05-01

    System-wide performance tests are essential for the development, testing and evaluation of individual components of the verification system. In addition to evaluating global readiness, such a test helps establish the practical and financial requirements for eventual operations. The first system-wide performance test (SPT1) was conducted in three phases: a preparatory phase in May-June 2004, a performance testing phase in April-June 2005, and an evaluation phase in the last half of 2005. The preparatory phase was developmental in nature. The main objectives for the performance testing phase included establishment of a performance baseline under the current provisional mode of operation (CTBT/PC-19/1/Annex II, CTBT/WGB-21/1) and examination of established requirements and procedures for operation and maintenance. To establish a system-wide performance baseline, the system configuration was fixed for April-May 2005. The third month (June 2005) was used for implementation of 21 test case scenarios to examine either particular operational procedures or the response of the system components to failures simulated under controlled conditions. A total of 163 stations and 5 certified radionuclide laboratories of the International Monitoring System (IMS) participated in the performance testing phase - about 50% of the eventual IMS network. 156 IMS facilities and 40 National Data Centres (NDCs) were connected to the International Data Centre (IDC) via Global Communication Infrastructure (GCI) communication links. In addition, 12 legacy stations in the auxiliary seismic network sent data to the IDC over the Internet. During the performance testing phase, the IDC produced all required products and analysed more than 6100 seismic events and 1700 radionuclide spectra. Performance of all system elements was documented and analysed. IDC products were compared with results of data processing at the NDCs. On the basis of statistics and information collected during the SPT1, a system-wide performance

  7. Enhanced detectability of fluorinated derivatives of N,N-dialkylamino alcohols and precursors of nitrogen mustards by gas chromatography coupled to Fourier transform infrared spectroscopy analysis for verification of chemical weapons convention.

    PubMed

    Garg, Prabhat; Purohit, Ajay; Tak, Vijay K; Dubey, D K

    2009-11-01

    N,N-Dialkylamino alcohols, N-methyldiethanolamine, N-ethyldiethanolamine and triethanolamine are the precursors of VX-type nerve agents and of three different nitrogen mustards, respectively. Their detection and identification are of paramount importance for verification analysis under the Chemical Weapons Convention. GC-FTIR is used as a complementary technique to GC-MS analysis for identification of these analytes. One constraint of GC-FTIR, its low sensitivity, was overcome by converting the analytes to their fluorinated derivatives. Owing to their high absorptivity in the IR region, these derivatives facilitated detection by GC-FTIR analysis. Derivatizing reagents having trimethylsilyl, trifluoroacyl and heptafluorobutyryl groups on an imidazole moiety were screened, and the derivatives formed were analyzed quantitatively by GC-FTIR. Of the reagents studied, heptafluorobutyrylimidazole (HFBI) produced the greatest increase in sensitivity by GC-FTIR detection; sensitivity enhancements of 60-125-fold were observed for the analytes after HFBI derivatization. Absorbances due to the various functional groups responsible for the enhanced sensitivity were compared by determining their corresponding relative molar extinction coefficients ([Formula: see text]), assuming uniform optical path length. The RSDs for intraday repeatability and interday reproducibility for the various derivatives were 0.2-1.1% and 0.3-1.8%, respectively. A limit of detection (LOD) of 10-15 ng was achieved, and the applicability of the method was tested with unknown samples obtained in international proficiency tests. PMID:19796767

  8. Verification of sensitivity enhancement of SWIR imager technology in advanced multispectral SWIR/VIS zoom cameras with constant and variable F-number

    NASA Astrophysics Data System (ADS)

    Hübner, M.; Achtner, B.; Kraus, M.; Siemens, C.; Münzberg, M.

    2016-05-01

    Current designs of combined VIS-color/SWIR camera optics use a constant F-number over the full field of view (FOV) range. Especially in the SWIR, limited space for camera integration in existing system volumes and relatively high pitch dimensions of 15 μm or even 20 μm force the use of relatively high F-numbers to accomplish narrow fields of view of less than 2.0° with reasonable resolution for long-range observation and targeting applications. Constant F-number designs have already been reported and considered [1] for submarine applications. The comparison of electro-optical performance was based on the detector noise performance and sensitivity data provided by the detector manufacturer [1] and on further modelling of the imaging chain within linear MTF system theory. The visible channel provides limited twilight capability at F/2.6, but in the SWIR the twilight capability is degraded due to the relatively high F-number of F/7 or F/5.25 for 20 μm and 15 μm pitch, respectively. Differences between prediction and experimental verification of sensitivity in terms of noise equivalent irradiance (NEI) and scenery-based limiting illumination levels are shown for the visible and the SWIR spectral range. Within this context, currently developed improvements using optical zoom designs for the multispectral SWIR/VIS camera optics with continuously variable F-number are discussed, offering increased low-light-level capabilities at wide and medium fields of view while still enabling a NFOV < 2° with superior long-range targeting capabilities under limited atmospheric sight conditions at daytime.

  9. Voltage verification unit

    DOEpatents

    Martin, Edward J.

    2008-01-15

    A voltage verification unit and method for determining the absence of potentially dangerous potentials within a power supply enclosure without Mode 2 work is disclosed. With this device and method, a qualified worker follows a relatively simple protocol that involves a function test (hot, cold, hot) of the voltage verification unit before Lock Out/Tag Out; once the Lock Out/Tag Out is completed, testing or "trying" by simply reading a display on the voltage verification unit can be accomplished without exposure of the operator to the interior of the voltage supply enclosure. According to a preferred embodiment, the voltage verification unit includes test leads to allow diagnostics with other meters, without the necessity of accessing potentially dangerous bus bars or the like.

  10. Verification of RADTRAN

    SciTech Connect

    Kanipe, F.L.; Neuhauser, K.S.

    1995-12-31

    This document presents details of the verification process of the RADTRAN computer code which was established for the calculation of risk estimates for radioactive materials transportation by highway, rail, air, and waterborne modes.

  11. On-site inspection for verification of a Comprehensive Test Ban Treaty

    SciTech Connect

    Heckrotte, W.

    1986-10-01

    A seismic monitoring system and on-site inspections are the major components of a verification system for a Comprehensive Test Ban Treaty (CTBT) to give parties assurance that clandestine underground nuclear weapon tests are not taking place. The primary task lies with the seismic monitoring system which must be capable of identifying most earthquakes in the magnitude range of concern as earthquakes, leaving a small number of unidentified events. If any unidentified event on the territory of one party appeared suspicious to another party, and thus potentially an explosion, an on-site inspection could be invoked to decide whether or not a nuclear explosion had taken place. Over the years, on-site inspections have been one of the most contentious issues in test ban negotiations and discussions. In the uncompleted test ban negotiations of 1977-80 between the US, UK, and USSR, voluntary OSIs were established as a basis for negotiation. Voluntary OSIs would require between the parties a common interest and cooperation toward resolving suspicions if OSIs were to serve the purpose of confidence building. On the technical level, an OSI could not assure identification of a clandestine test, but an evader would probably reject any request for an OSI at the site of an evasive test, rather than run the risk of an OSI. The verification system does not provide direct physical evidence of a violation. This could pose a difficult and controversial decision on compliance. 16 refs.

  12. Secure optical verification using dual phase-only correlation

    NASA Astrophysics Data System (ADS)

    Liu, Wei; Zhang, Yan; Xie, Zhenwei; Liu, Zhengjun; Liu, Shutian

    2015-02-01

    We introduce a security-enhanced optical verification system using dual phase-only correlation based on a novel correlation algorithm. By employing a nonlinear encoding, the inherent locks of the verification system are obtained as real-valued random distributions, and the identity keys assigned to authorized users are designed as pure phases. The verification process is implemented as a two-step correlation, so only authorized identity keys can output the discriminating auto-correlation and cross-correlation signals that satisfy the preset threshold values. Compared with traditional phase-only-correlation-based verification systems, a higher security level against counterfeiting and collisions is obtained, which is demonstrated by cryptanalysis using known attacks, such as the known-plaintext attack and the chosen-plaintext attack. Optical experiments as well as necessary numerical simulations are carried out to support the proposed verification method.
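
    The core of any phase-only correlation check can be sketched in a few lines. The toy below is our construction (it omits the paper's nonlinear encoding and dual two-step correlation): a presented key is verified against a stored lock by thresholding the phase-only correlation peak.

    ```python
    # Toy phase-only correlation (POC) verification; patterns and the 0.5
    # threshold are illustrative assumptions, not the paper's scheme.
    import numpy as np

    rng = np.random.default_rng(1)
    lock = rng.random((64, 64))            # stored real-valued random lock
    key = lock.copy()                      # an authorized key matches the lock

    F1, F2 = np.fft.fft2(lock), np.fft.fft2(key)
    cross = F1 * np.conj(F2)
    poc = np.fft.ifft2(cross / (np.abs(cross) + 1e-12))  # keep phase only

    peak = np.max(np.abs(poc))             # sharp peak at (0, 0) for a match
    print("verified" if peak > 0.5 else "rejected", f"(peak={peak:.2f})")
    ```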

  13. Explaining Verification Conditions

    NASA Technical Reports Server (NTRS)

    Deney, Ewen; Fischer, Bernd

    2006-01-01

    The Hoare approach to program verification relies on the construction and discharge of verification conditions (VCs) but offers no support to trace, analyze, and understand the VCs themselves. We describe a systematic extension of the Hoare rules by labels so that the calculus itself can be used to build up explanations of the VCs. The labels are maintained through the different processing steps and rendered as natural language explanations. The explanations can easily be customized and can capture different aspects of the VCs; here, we focus on their structure and purpose. The approach is fully declarative and the generated explanations are based only on an analysis of the labels rather than directly on the logical meaning of the underlying VCs or their proofs. Keywords: program verification, Hoare calculus, traceability.
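
    As a concrete, minimal example of the kind of verification condition being explained (our illustration, not one of the paper's generated VCs), consider a single assignment verified with the Hoare assignment rule:

    ```latex
    % To prove the triple {x >= 0} x := x + 1 {x > 0}, substitute x+1 for x
    % in the postcondition; the calculus emits one VC, an implication:
    \[
      \underbrace{x \ge 0}_{\text{precondition}}
      \;\Longrightarrow\;
      \underbrace{x + 1 > 0}_{(x > 0)[x \mapsto x + 1]}
    \]
    % A label attached by the extended calculus would record that the
    % conclusion stems from the assignment rule applied to "x := x + 1",
    % which is what the natural-language explanation is rendered from.
    ```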

  14. Nuclear disarmament verification

    SciTech Connect

    DeVolpi, A.

    1993-12-31

    Arms control treaties, unilateral actions, and cooperative activities -- reflecting the defusing of East-West tensions -- are causing nuclear weapons to be disarmed and dismantled worldwide. In order to provide for future reductions and to build confidence in the permanency of this disarmament, verification procedures and technologies would play an important role. This paper outlines arms-control objectives, treaty organization, and actions that could be undertaken. For the purposes of this Workshop on Verification, nuclear disarmament has been divided into five topical subareas: Converting nuclear-weapons production complexes, Eliminating and monitoring nuclear-weapons delivery systems, Disabling and destroying nuclear warheads, Demilitarizing or non-military utilization of special nuclear materials, and Inhibiting nuclear arms in non-nuclear-weapons states. This paper concludes with an overview of potential methods for verification.

  15. Wind gust warning verification

    NASA Astrophysics Data System (ADS)

    Primo, Cristina

    2016-07-01

    Operational meteorological centres around the world increasingly include warnings as one of their regular forecast products. Warnings are issued to alert the public to extreme weather situations that might occur, leading to damage and losses. By forecasting these extreme events, meteorological centres help their potential users prevent the damage or losses they might suffer. However, verifying these warnings requires specific methods. This is due not only to the fact that such events happen rarely, but also to the new temporal dimension that is added when defining a warning, namely the time window of the forecasted event. This paper analyses the issues that might appear when dealing with warning verification. It also proposes some new verification approaches that can be applied to wind warnings. These new techniques are then applied to a real-life example, the verification of wind gust warnings at the German Meteorological Centre ("Deutscher Wetterdienst"). Finally, the results obtained from the latter are discussed.
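
    As a minimal illustration of window-based warning verification (our toy example, not the DWD procedure), the sketch below counts a warning as a hit when an observed gust falls inside its validity window, then forms the usual categorical scores:

    ```python
    # Toy event-based warning verification with time windows; data invented.
    from datetime import datetime

    warnings = [  # (start, end) validity windows of issued gust warnings
        (datetime(2016, 7, 1, 6), datetime(2016, 7, 1, 12)),
        (datetime(2016, 7, 2, 0), datetime(2016, 7, 2, 6)),
    ]
    observed = [datetime(2016, 7, 1, 9), datetime(2016, 7, 3, 14)]  # gusts

    hits = sum(any(s <= obs <= e for s, e in warnings) for obs in observed)
    misses = len(observed) - hits
    false_alarms = sum(not any(s <= obs <= e for obs in observed)
                       for s, e in warnings)

    pod = hits / (hits + misses)                  # probability of detection
    far = false_alarms / (hits + false_alarms)    # false alarm ratio
    print(f"POD={pod:.2f}  FAR={far:.2f}")
    ```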

  16. Verification and validation benchmarks.

    SciTech Connect

    Oberkampf, William Louis; Trucano, Timothy Guy

    2007-02-01

    Verification and validation (V&V) are the primary means to assess the accuracy and reliability of computational simulations. V&V methods and procedures have fundamentally improved the credibility of simulations in several high-consequence fields, such as nuclear reactor safety, underground nuclear waste storage, and nuclear weapon safety. Although the terminology is not uniform across engineering disciplines, code verification deals with assessing the reliability of the software coding, and solution verification deals with assessing the numerical accuracy of the solution to a computational model. Validation addresses the physics modeling accuracy of a computational simulation by comparing the computational results with experimental data. Code verification benchmarks and validation benchmarks have been constructed for a number of years in every field of computational simulation. However, no comprehensive guidelines have been proposed for the construction and use of V&V benchmarks. For example, the field of nuclear reactor safety has not focused on code verification benchmarks, but it has placed great emphasis on developing validation benchmarks. Many of these validation benchmarks are closely related to the operations of actual reactors at near-safety-critical conditions, as opposed to being more fundamental-physics benchmarks. This paper presents recommendations for the effective design and use of code verification benchmarks based on manufactured solutions, classical analytical solutions, and highly accurate numerical solutions. In addition, this paper presents recommendations for the design and use of validation benchmarks, highlighting the careful design of building-block experiments, the estimation of experimental measurement uncertainty for both inputs and outputs to the code, validation metrics, and the role of model calibration in validation. It is argued that the understanding of predictive capability of a computational model is built on the level of
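
    A compact example of a code-verification benchmark built from an exact solution, in the spirit of the manufactured and analytical solutions recommended above (our toy, not one of the paper's benchmarks): a second-order Poisson solver should exhibit an observed order of accuracy near 2 under grid refinement.

    ```python
    # Code-verification sketch: exact solution u = sin(pi x) manufactures the
    # forcing f = pi^2 sin(pi x) for -u'' = f on (0,1) with u(0) = u(1) = 0.
    import numpy as np

    def solve_poisson(n):
        """Second-order central differences on n interior points."""
        h = 1.0 / (n + 1)
        x = np.linspace(h, 1 - h, n)
        f = np.pi**2 * np.sin(np.pi * x)
        A = (np.diag(2 * np.ones(n)) - np.diag(np.ones(n - 1), 1)
             - np.diag(np.ones(n - 1), -1)) / h**2
        return x, np.linalg.solve(A, f)

    errors = []
    for n in (16, 32, 64):
        x, u = solve_poisson(n)
        errors.append(np.max(np.abs(u - np.sin(np.pi * x))))

    orders = [np.log2(errors[i] / errors[i + 1]) for i in range(2)]
    print("observed orders:", orders)      # should approach the formal order 2
    ```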

  17. TFE verification program

    NASA Astrophysics Data System (ADS)

    1994-01-01

    This is the final semiannual progress report for the Thermionic Fuel Element (TFE) Verification Program. A decision was made in August 1993 to begin a Close Out Program on October 1, 1993. Final reports summarizing the design analyses and test activities of the TFE Verification Program will be written as stand-alone documents for each task. The objective of the semiannual progress report is to summarize the technical results obtained during the latest reporting period. The information presented herein includes evaluated test data, design evaluations, the results of analyses and the significance of results.

  18. General Environmental Verification Specification

    NASA Technical Reports Server (NTRS)

    Milne, J. Scott, Jr.; Kaufman, Daniel S.

    2003-01-01

    The NASA Goddard Space Flight Center's General Environmental Verification Specification (GEVS) for STS and ELV Payloads, Subsystems, and Components is currently being revised based on lessons learned from GSFC engineering and flight assurance. The GEVS has been used by Goddard flight projects for the past 17 years as a baseline from which to tailor their environmental test programs. A summary of the requirements and updates is presented along with the rationale behind the changes. The major test areas covered by the GEVS include mechanical, thermal, and EMC, as well as more general requirements for planning and tracking of the verification programs.

  19. Requirement Assurance: A Verification Process

    NASA Technical Reports Server (NTRS)

    Alexander, Michael G.

    2011-01-01

    Requirement Assurance is an act of requirement verification which assures the stakeholder or customer that a product requirement has produced its "as realized product" and has been verified with conclusive evidence. Product requirement verification answers the question, "did the product meet the stated specification, performance, or design documentation?". In order to ensure the system was built correctly, the practicing system engineer must verify each product requirement using verification methods of inspection, analysis, demonstration, or test. The products of these methods are the "verification artifacts" or "closure artifacts" which are the objective evidence needed to prove the product requirements meet the verification success criteria. Institutional direction is given to the System Engineer in NPR 7123.1A NASA Systems Engineering Processes and Requirements with regards to the requirement verification process. In response, the verification methodology offered in this report meets both the institutional process and requirement verification best practices.

  20. Context Effects in Sentence Verification.

    ERIC Educational Resources Information Center

    Kiger, John I.; Glass, Arnold L.

    1981-01-01

    Three experiments examined what happens to reaction time to verify easy items when they are mixed with difficult items in a verification task. Subjects' verification of simple arithmetic equations and sentences took longer when these items were placed in a difficult list. Difficult sentences also slowed the verification of easy arithmetic equations. (Author/RD)

  1. Research required to support comprehensive nuclear test ban treaty monitoring. Final report

    SciTech Connect

    1997-08-01

    After years of negotiation, the Comprehensive Nuclear Test-Ban Treaty (CTBT) was signed at the United Nations in September 1996. The treaty creates a need for global monitoring in the context of national and international efforts to control nuclear arms. To meet this technical challenge, the United States is at a time of pivotal decision-making with regard to the level and nature of basic research in support of CTBT verification. To address this problem, this study identifies the basic research questions in the fields of seismology, hydroacoustics, infrasonics, and radionuclide monitoring that should be supported to enhance the capabilities to monitor and verify the CTBT.

  2. Computer Graphics Verification

    NASA Technical Reports Server (NTRS)

    1992-01-01

    Video processing creates technical animation sequences using studio-quality equipment to realistically represent fluid flow over space shuttle surfaces, helicopter rotors, and turbine blades. The accompanying image, part of a Co-op brochure, shows computer systems Co-op Tim Weatherford performing computer graphics verification.

  3. FPGA Verification Accelerator (FVAX)

    NASA Technical Reports Server (NTRS)

    Oh, Jane; Burke, Gary

    2008-01-01

    Is verification acceleration possible? Increasing the visibility of the internal nodes of the FPGA results in much faster debug time, and forcing internal signals directly allows a problem condition to be set up very quickly. Is this all? No; this is part of a comprehensive effort to improve the JPL FPGA design and V&V process.

  4. ENVIRONMENTAL TECHNOLOGY VERIFICATION PROGRAM

    EPA Science Inventory

    This presentation will be given at the EPA Science Forum 2005 in Washington, DC. The Environmental Technology Verification Program (ETV) was initiated in 1995 to speed implementation of new and innovative commercial-ready environmental technologies by providing objective, 3rd pa...

  5. Telescope performance verification

    NASA Astrophysics Data System (ADS)

    Swart, Gerhard P.; Buckley, David A. H.

    2004-09-01

    While Systems Engineering appears to be widely applied on the very large telescopes, it is lacking in the development of many of the medium and small telescopes currently in progress. The latter projects rely heavily on the experience of the project team, verbal requirements and conjecture based on the successes and failures of other telescopes. Furthermore, it is considered an unaffordable luxury to "close-the-loop" by carefully analysing and documenting the requirements and then verifying the telescope's compliance with them. In this paper the authors contend that a Systems Engineering approach is a keystone in the development of any telescope and that verification of the telescope's performance is not only an important management tool but also forms the basis upon which successful telescope operation can be built. The development of the Southern African Large Telescope (SALT) has followed such an approach and is now in the verification phase of its development. Parts of the SALT verification process will be discussed in some detail to illustrate the suitability of this approach, including oversight by the telescope shareholders, recording of requirements and results, design verification and performance testing. Initial test results will be presented where appropriate.

  6. Exomars Mission Verification Approach

    NASA Astrophysics Data System (ADS)

    Cassi, Carlo; Gilardi, Franco; Bethge, Boris

    According to the long-term cooperation plan established by ESA and NASA in June 2009, the ExoMars project now consists of two missions. A first mission will be launched in 2016 under ESA lead, with the objectives of demonstrating the European capability to safely land a surface package on Mars, performing Mars atmosphere investigation, and providing communication capability for present and future ESA/NASA missions. For this mission ESA provides a spacecraft composite, made up of an Entry Descent & Landing Demonstrator Module (EDM) and a Mars Orbiter Module (OM); NASA provides the launch vehicle and the scientific instruments located on the orbiter for Mars atmosphere characterisation. A second mission, with its launch foreseen in 2018, is led by NASA, who provides the spacecraft and launcher, the EDL system, and a rover. ESA contributes the ExoMars Rover Module (RM) to provide surface mobility. It includes a drill system allowing drilling down to 2 meters, collecting samples and investigating them for signs of past and present life with exobiological experiments, and investigating the Mars water/geochemical environment. In this scenario Thales Alenia Space Italia, as ESA Prime industrial contractor, is in charge of the design, manufacturing, integration and verification of the ESA ExoMars modules, i.e.: the Spacecraft Composite (OM + EDM) for the 2016 mission, the RM for the 2018 mission and the Rover Operations Control Centre, which will be located at Altec-Turin (Italy). The verification process of the above products is quite complex and will include some peculiarities with limited or no heritage in Europe. Furthermore the verification approach has to be optimised to allow full verification despite significant schedule and budget constraints. The paper presents the verification philosophy tailored for the ExoMars mission in line with the above considerations, starting from the model philosophy, showing the verification activities flow and the sharing of tests

  7. Supporting the President's Arms Control and Nonproliferation Agenda: Transparency and Verification for Nuclear Arms Reductions

    SciTech Connect

    Doyle, James E; Meek, Elizabeth

    2009-01-01

    The President's arms control and nonproliferation agenda is still evolving and the details of initiatives supporting it remain undefined. This means that DOE, NNSA, NA-20, NA-24 and the national laboratories can help define the agenda, and the policies and the initiatives to support it. This will require effective internal and interagency coordination. The arms control and nonproliferation agenda is broad and includes the path-breaking goal of creating conditions for the elimination of nuclear weapons. Responsibility for various elements of the agenda will be widely scattered across the interagency. Therefore an interagency mapping exercise should be performed to identify the key points of engagement within NNSA and other agencies for creating effective policy coordination mechanisms. These can include informal networks, working groups, coordinating committees, interagency task forces, etc. It will be important for NA-20 and NA-24 to get a seat at the table and a functional role in many of these coordinating bodies. The arms control and nonproliferation agenda comprises both mature and developing policy initiatives. The more mature elements such as CTBT ratification and a follow-on strategic nuclear arms treaty with Russia have defined milestones. However, recent press reports indicate that even the START follow-on strategic arms pact that is planned to be complete by the end of 2009 may take significantly longer and be more expansive in scope. The Russians called for proposals to count non-deployed as well as deployed warheads. Other elements of the agenda such as FMCT, future bilateral nuclear arms reductions following a START follow-on treaty, nuclear posture changes, preparations for an international nuclear security summit, strengthened international safeguards and multilateral verification are in much earlier stages of development. For this reason any survey of arms control capabilities within the USG should be structured to address potential needs across the

  8. Robust verification analysis

    NASA Astrophysics Data System (ADS)

    Rider, William; Witkowski, Walt; Kamm, James R.; Wildey, Tim

    2016-02-01

    We introduce a new methodology for inferring the accuracy of computational simulations through the practice of solution verification. We demonstrate this methodology on examples from computational heat transfer, fluid dynamics and radiation transport. Our methodology is suited to both well- and ill-behaved sequences of simulations. Our approach to the analysis of these sequences incorporates expert judgment into the process directly via a flexible optimization framework, together with the application of robust statistics. The expert judgment is systematically applied as constraints to the analysis, and together with the robust statistics guards against over-emphasis on anomalous analysis results. We have named our methodology Robust Verification. It is based on solving multiple constrained optimization problems for the verification model in a manner that varies the underlying assumptions of the analysis. Constraints applied in the analysis can include expert judgment regarding convergence rates (bounds and expectations) as well as bounding values for physical quantities (e.g., positivity of energy or density). This approach produces a number of error models, which are then analyzed through robust statistical techniques (median instead of mean statistics). This provides self-contained, data- and expert-informed error estimation, including uncertainties for both the solution itself and the order of convergence. Our method produces high-quality results for well-behaved cases, relatively consistent with existing practice, and can also produce reliable results for ill-behaved circumstances predicated on appropriate expert judgment. We demonstrate the method and compare the results with standard approaches used for both code and solution verification on well-behaved and ill-behaved simulations.
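
    The flavor of the approach can be sketched in a few lines (our simplification, not the authors' constrained-optimization framework): estimate the convergence order from all pairwise grid refinements and take the median, so a single anomalous error level does not drive the estimate.

    ```python
    # Median-based (robust) estimate of convergence order p in E(h) ~ A*h^p.
    # Mesh sizes and error values are invented; one error level is an outlier.
    import numpy as np
    from itertools import combinations

    h = np.array([0.1, 0.05, 0.025, 0.0125])          # mesh sizes
    E = np.array([3.9e-3, 1.1e-3, 2.4e-4, 9.0e-5])    # error estimates

    pairwise_p = [np.log(E[i] / E[j]) / np.log(h[i] / h[j])
                  for i, j in combinations(range(len(h)), 2)]
    p_robust = np.median(pairwise_p)       # resistant to the anomalous pair
    # Expert-judgment bounds on admissible orders could be applied here,
    # e.g. p_robust = np.clip(p_robust, 0.5, 3.0).
    print(f"pairwise orders: {np.round(pairwise_p, 2)}, median p = {p_robust:.2f}")
    ```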

  9. RESRAD-BUILD verification.

    SciTech Connect

    Kamboj, S.; Yu, C.; Biwer, B. M.; Klett, T.

    2002-01-31

    The results generated by the RESRAD-BUILD code (version 3.0) were verified with hand or spreadsheet calculations using equations given in the RESRAD-BUILD manual for different pathways. For verification purposes, different radionuclides--H-3, C-14, Na-22, Al-26, Cl-36, Mn-54, Co-60, Au-195, Ra-226, Ra-228, Th-228, and U-238--were chosen to test all pathways and models. Tritium, Ra-226, and Th-228 were chosen because of the special tritium and radon models in the RESRAD-BUILD code. Other radionuclides were selected to represent a spectrum of radiation types and energies. Verification of the RESRAD-BUILD code was conducted with an initial check of all the input parameters for correctness against their original source documents. Verification of the calculations was performed external to the RESRAD-BUILD code with Microsoft Excel to verify all the major portions of the code. In some cases, RESRAD-BUILD results were compared with those of external codes, such as MCNP (Monte Carlo N-particle) and RESRAD. The verification was conducted on a step-by-step basis and used different test cases as templates. The following types of calculations were investigated: (1) source injection rate, (2) air concentration in the room, (3) air particulate deposition, (4) radon pathway model, (5) tritium model for volume source, (6) external exposure model, (7) different pathway doses, and (8) time dependence of dose. Some minor errors were identified in version 3.0; these errors have been corrected in later versions of the code. Some possible improvements in the code were also identified.

  10. Experimental verification of acoustic trace wavelength enhancement.

    PubMed

    Cray, Benjamin A

    2015-12-01

    Directivity is essentially a measure of a sonar array's beamwidth that can be obtained in a spherically isotropic ambient noise field; narrow array mainbeam widths are more directive than broader mainbeam widths. For common sonar systems, the directivity factor (or directivity index) is directly proportional to the ratio of an incident acoustic trace wavelength to the sonar array's physical length (which is always constrained). Increasing this ratio, by creating additional trace wavelengths for a fixed array length, will increase array directivity. Embedding periodic structures within an array generates Bragg scattering of the incident acoustic plane wave along the array's surface. The Bragg scattered propagating waves are shifted in a precise manner and create shorter wavelength replicas of the original acoustic trace wavelength. These replicated trace wavelengths (which contain identical signal arrival information) increase an array's wavelength to length ratio and thus directivity. Therefore, a smaller array, in theory, can have the equivalent directivity of a much larger array. Measurements completed in January 2015 at the Naval Undersea Warfare Center's Acoustic Test Facility, in Newport, RI, verified, near perfectly, these replicated, shorter, trace wavelengths. PMID:26723331
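
    The quoted proportionality can be made concrete with the textbook line-array approximation DF ≈ 2L/λ (a back-of-envelope illustration of ours, with made-up numbers): halving the effective trace wavelength doubles the directivity factor, a 3 dB gain in directivity index.

    ```python
    # Illustration of directivity vs. trace wavelength for a fixed-length
    # line array, using the standard approximation DF ~ 2L/lambda.
    import math

    L = 1.0                                  # array length in metres (assumed)
    for lam in (0.5, 0.25):                  # original vs Bragg-replicated trace
        df = 2 * L / lam                     # directivity factor
        di = 10 * math.log10(df)             # directivity index in dB
        print(f"lambda={lam} m  DF={df:.0f}  DI={di:.1f} dB")
    ```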

  11. TFE verification program

    NASA Astrophysics Data System (ADS)

    1990-03-01

    The information presented herein will include evaluated test data, design evaluations, the results of analyses and the significance of results. The program objective is to demonstrate the technology readiness of a Thermionic Fuel Element (TFE) suitable for use as the basic element in a thermionic reactor with electric power output in the 0.5 to 5.0 MW(e) range, and a full-power life of 7 years. The TFE Verification Program builds directly on the technology and data base developed in the 1960s and 1970s in an AEC/NASA program, and in the SP-100 program conducted in 1983, 1984 and 1985. In the SP-100 program, the attractive features of thermionic power conversion technology were recognized but concern was expressed over the lack of fast reactor irradiation data. The TFE Verification Program addresses this concern. The general logic and strategy of the program to achieve its objectives is shown. Five prior programs form the basis for the TFE Verification Program: (1) AEC/NASA program of the 1960s and early 1970s; (2) SP-100 concept development program; (3) SP-100 thermionic technology program; (4) Thermionic irradiations program in TRIGA in FY-88; and (5) Thermionic Technology Program in 1986 and 1987.

  12. TFE Verification Program

    SciTech Connect

    Not Available

    1990-03-01

    The objective of the semiannual progress report is to summarize the technical results obtained during the latest reporting period. The information presented herein will include evaluated test data, design evaluations, the results of analyses and the significance of results. The program objective is to demonstrate the technology readiness of a TFE suitable for use as the basic element in a thermionic reactor with electric power output in the 0.5 to 5.0 MW(e) range, and a full-power life of 7 years. The TFE Verification Program builds directly on the technology and data base developed in the 1960s and 1970s in an AEC/NASA program, and in the SP-100 program conducted in 1983, 1984 and 1985. In the SP-100 program, the attractive features of thermionic power conversion technology were recognized but concern was expressed over the lack of fast reactor irradiation data. The TFE Verification Program addresses this concern. The general logic and strategy of the program to achieve its objectives is shown on Fig. 1-1. Five prior programs form the basis for the TFE Verification Program: (1) AEC/NASA program of the 1960s and early 1970s; (2) SP-100 concept development program; (3) SP-100 thermionic technology program; (4) Thermionic irradiations program in TRIGA in FY-86; and (5) Thermionic Technology Program in 1986 and 1987. 18 refs., 64 figs., 43 tabs.

  13. Systematic study of source mask optimization and verification flows

    NASA Astrophysics Data System (ADS)

    Ben, Yu; Latypov, Azat; Chua, Gek Soon; Zou, Yi

    2012-06-01

    Source mask optimization (SMO) has emerged as a powerful resolution enhancement technique (RET) for advanced technology nodes. However, there is a plethora of flows and verification metrics in the field, confounding the end user of the technique, and a systematic study of the different flows and their possible unification is missing. This contribution is intended to reveal the pros and cons of different SMO approaches and verification metrics, understand their commonality and differences, and provide a generic guideline for RET selection via SMO. The paper discusses three types of variation that commonly arise in SMO, namely pattern preparation and selection, the availability of a relevant OPC recipe for a freeform source, and finally the metrics used in source verification. Several pattern selection algorithms are compared and the advantages of systematic pattern selection algorithms are discussed. In the absence of a full resist model for SMO, an alternative SMO flow without a full resist model is reviewed. A preferred verification flow with the quality metrics of DOF and MEEF is examined.

  14. Continuous verification using multimodal biometrics.

    PubMed

    Sim, Terence; Zhang, Sheng; Janakiraman, Rajkumar; Kumar, Sandeep

    2007-04-01

    Conventional verification systems, such as those controlling access to a secure room, do not usually require the user to reauthenticate himself for continued access to the protected resource. This may not be sufficient for high-security environments in which the protected resource needs to be continuously monitored for unauthorized use. In such cases, continuous verification is needed. In this paper, we present the theory, architecture, implementation, and performance of a multimodal biometrics verification system that continuously verifies the presence of a logged-in user. Two modalities are currently used--face and fingerprint--but our theory can be readily extended to include more modalities. We show that continuous verification imposes additional requirements on multimodal fusion when compared to conventional verification systems. We also argue that the usual performance metrics of false accept and false reject rates are insufficient yardsticks for continuous verification and propose new metrics against which we benchmark our system. PMID:17299225
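
    A minimal sketch of the time-decay intuition behind continuous multimodal fusion (our simplification; the paper's fusion theory, modalities, and thresholds differ): each modality's most recent score is down-weighted as it ages, and the session locks when fused trust falls below a threshold.

    ```python
    # Toy continuous-verification loop; decay rates and threshold are assumed.
    import math

    DECAY = {"face": 0.05, "fingerprint": 0.02}   # per-second decay (assumed)

    def fused_trust(observations, now):
        """observations: {modality: (score in [0,1], timestamp in seconds)}"""
        total, weight = 0.0, 0.0
        for mod, (score, t) in observations.items():
            w = math.exp(-DECAY[mod] * (now - t))  # older evidence counts less
            total += w * score
            weight += w
        return total / weight if weight else 0.0

    obs = {"face": (0.9, 100.0), "fingerprint": (0.8, 40.0)}
    print("locked" if fused_trust(obs, now=130.0) < 0.6 else "session continues")
    ```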

  15. Quantum money with classical verification

    SciTech Connect

    Gavinsky, Dmitry

    2014-12-04

    We propose and construct a quantum money scheme that allows verification through classical communication with a bank. This is the first demonstration that a secure quantum money scheme exists that does not require quantum communication for coin verification. Our scheme is secure against adaptive adversaries - this property is not directly related to the possibility of classical verification, nevertheless none of the earlier quantum money constructions is known to possess it.

  16. Quantum money with classical verification

    NASA Astrophysics Data System (ADS)

    Gavinsky, Dmitry

    2014-12-01

    We propose and construct a quantum money scheme that allows verification through classical communication with a bank. This is the first demonstration that a secure quantum money scheme exists that does not require quantum communication for coin verification. Our scheme is secure against adaptive adversaries - this property is not directly related to the possibility of classical verification, nevertheless none of the earlier quantum money constructions is known to possess it.

  17. A systems perspective of Comprehensive Test Ban Treaty monitoring and verification

    SciTech Connect

    Walker, L.S.

    1996-11-01

    On September 24, 1996, after decades of discussion and more than two years of intensive international negotiations, President Clinton, followed by representatives of (to date) more than 125 other countries, including the other four declared nuclear weapons states, signed the Comprehensive Test Ban Treaty. Each signatory now faces a complex set of technical and political considerations regarding the advisability of joining the treaty. Those considerations vary from country to country, but for many countries one of the key issues is the extent to which the treaty can be verified. In the case of the US, it is anticipated that treaty verifiability will be an important issue in the US Senate Advice and Consent Hearings. This paper will address treaty verifiability, with an emphasis on the interplay between the various elements of the International monitoring regime, as prescribed in the CTBT Treaty Text and its associated Protocol. These elements, coupled with the National regimes, will serve as an integrated set of overlapping, interlocking measures to support treaty verification. Taken as a whole, they present a formidable challenge to potential testers who wish not to be caught.

  18. Verification of LHS distributions.

    SciTech Connect

    Swiler, Laura Painton

    2006-04-01

This document provides verification test results for the normal, lognormal, and uniform distributions that are used in Sandia's Latin Hypercube Sampling (LHS) software. The purpose of this testing is to verify that the sample values being generated in LHS are distributed according to the desired distribution types. The testing of distribution correctness is done by examining summary statistics, graphical comparisons using quantile-quantile plots, and formal statistical tests such as the Chi-square test, the Kolmogorov-Smirnov test, and the Anderson-Darling test. The overall results from the testing indicate that the generation of normal, lognormal, and uniform distributions in LHS is acceptable.
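
    A minimal sketch of the kind of check the report describes, using a Kolmogorov-Smirnov test on a batch of samples (ordinary pseudo-random draws stand in here for LHS output, and the distribution parameters are illustrative):

    ```python
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(42)
    samples = rng.normal(loc=0.0, scale=1.0, size=1000)   # stand-in for LHS output

    # Summary statistics plus a formal goodness-of-fit test against the
    # intended distribution, mirroring the report's testing strategy.
    print(f"mean = {samples.mean():.3f}, std = {samples.std(ddof=1):.3f}")
    ks_stat, p_value = stats.kstest(samples, "norm", args=(0.0, 1.0))
    print(f"KS statistic = {ks_stat:.4f}, p-value = {p_value:.3f}")   # large p: no evidence of mismatch
    ```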

  19. TFE Verification Program

    SciTech Connect

    Not Available

    1993-05-01

    The objective of the semiannual progress report is to summarize the technical results obtained during the latest reporting period. The information presented herein will include evaluated test data, design evaluations, the results of analyses and the significance of results. The program objective is to demonstrate the technology readiness of a TFE (thermionic fuel element) suitable for use as the basic element in a thermionic reactor with electric power output in the 0.5 to 5.0 MW(e) range, and a full-power life of 7 years. The TFE Verification Program builds directly on the technology and data base developed in the 1960s and early 1970s in an AEC/NASA program, and in the SP-100 program conducted in 1983, 1984 and 1985. In the SP-100 program, the attractive features of thermionic power conversion technology were recognized but concern was expressed over the lack of fast reactor irradiation data. The TFE Verification Program addresses this concern.

  20. Software Verification and Validation Procedure

    SciTech Connect

    Olund, Thomas S.

    2008-09-15

    This Software Verification and Validation procedure provides the action steps for the Tank Waste Information Network System (TWINS) testing process. The primary objective of the testing process is to provide assurance that the software functions as intended, and meets the requirements specified by the client. Verification and validation establish the primary basis for TWINS software product acceptance.

  1. Deductive Verification of Cryptographic Software

    NASA Technical Reports Server (NTRS)

    Almeida, Jose Barcelar; Barbosa, Manuel; Pinto, Jorge Sousa; Vieira, Barbara

    2009-01-01

We report on the application of an off-the-shelf verification platform to the RC4 stream cipher cryptographic software implementation (as available in the OpenSSL library), and introduce a deductive verification technique based on self-composition for proving the absence of error propagation.
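
    The self-composition idea can be pictured with a toy stand-in (the paper works deductively on the RC4 C code with a verification platform; the routine below and the bounded exhaustive check are illustrative assumptions):

    ```python
    from itertools import product

    def visible_output(public, secret):
        """Toy routine under verification: its return value must not be
        influenced by `secret` (i.e., nothing propagates to the public output)."""
        table = [(public + i) % 16 for i in range(16)]
        _ = table[secret % 16]          # secret affects only an internal read
        return table[0]

    # Self-composition: run two copies that agree on the public input but may
    # differ on the secret, and check that the observable outputs coincide.
    ok = all(visible_output(p, s1) == visible_output(p, s2)
             for p, s1, s2 in product(range(16), range(16), range(16)))
    print(ok)   # True on this bounded domain; the paper proves it deductively
    ```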

  3. HDL to verification logic translator

    NASA Technical Reports Server (NTRS)

    Gambles, J. W.; Windley, P. J.

    1992-01-01

The increasingly higher number of transistors possible in VLSI circuits compounds the difficulty of ensuring correct designs. As the number of test cases required to exhaustively simulate a circuit design explodes, a better method is required to confirm the absence of design faults. Formal verification methods provide a way to prove, using logic, that a circuit structure correctly implements its specification. Before verification is accepted by VLSI design engineers, the stand-alone verification tools in use in the research community must be integrated with the CAD tools used by the designers. One problem facing the acceptance of formal verification into circuit design methodology is that the structural circuit descriptions used by the designers are not appropriate for verification work, and those required for verification lack some of the features needed for design. We offer a solution to this dilemma: an automatic translation from the designers' HDL models into definitions for the higher-order logic (HOL) verification system. The translated definitions become the low-level basis of circuit verification, which in turn increases the designer's confidence in the correctness of higher-level behavioral models.
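
    The flavor of the translation can be suggested with a toy example (the paper emits definitions for the HOL prover; here Python relations over a finite domain stand in for the logic):

    ```python
    from itertools import product

    # The gate-level structure becomes a relational definition, and
    # verification shows the structure implements its behavioral spec.

    def half_adder_structure(a, b, s, c):
        """Relation derived from the netlist: an XOR gate and an AND gate."""
        return s == (a ^ b) and c == (a & b)

    def half_adder_spec(a, b, s, c):
        """Behavioral specification: (c, s) is the binary sum a + b."""
        return 2 * c + s == a + b

    print(all(half_adder_spec(a, b, s, c)
              for a, b, s, c in product((0, 1), repeat=4)
              if half_adder_structure(a, b, s, c)))   # True: structure implements spec
    ```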

  4. Cancelable face verification using optical encryption and authentication.

    PubMed

    Taheri, Motahareh; Mozaffari, Saeed; Keshavarzi, Parviz

    2015-10-01

In a cancelable biometric system, each instance of enrollment is distorted by a transform function, and the output should not be re-transformable to the original data. This paper presents a new cancelable face verification system in the encrypted domain. Encrypted facial images are generated by a double random phase encoding (DRPE) algorithm using two keys (RPM1 and RPM2). To make the system noninvertible, a photon counting (PC) method is utilized, which requires a photon distribution mask (PDM) for information reduction. Verification of sparse images that are not recognizable by direct visual inspection is performed by an unconstrained minimum average correlation energy filter. In the proposed method, the encryption keys (RPM1, RPM2, and PDM) are used on the sender side, and the receiver needs only encrypted images and correlation filters. In this manner, the system preserves privacy even if the correlation filters are obtained by an adversary. Performance of the PC-DRPE verification system is evaluated under illumination variation, pose changes, and facial expressions. Experimental results show that utilizing encrypted images not only strengthens security but also enhances verification performance. This improvement can be attributed to the fact that, in the proposed system, the face verification problem is converted into a key verification task. PMID:26479930
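
    A minimal numpy sketch of the DRPE step as it is usually formulated, followed by a photon-counting reduction (mask sizes, the photon budget, and the Poisson model are assumptions; the paper's correlation-filter verification stage is omitted):

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    def drpe_encrypt(img, rpm1, rpm2):
        """Double random phase encoding: one random phase mask in the spatial
        domain (RPM1), a second in the Fourier domain (RPM2)."""
        field = img * np.exp(2j * np.pi * rpm1)
        spectrum = np.fft.fft2(field) * np.exp(2j * np.pi * rpm2)
        return np.fft.ifft2(spectrum)              # complex-valued ciphertext

    def photon_count(intensity, n_photons=500):
        """Photon counting for non-invertibility: keep only a sparse Poisson
        record of the normalized intensity (information reduction)."""
        p = intensity / intensity.sum()
        return rng.poisson(n_photons * p)

    img = rng.random((64, 64))                      # stand-in for a face image
    rpm1, rpm2 = rng.random((2, 64, 64))            # the two phase-mask keys
    sparse = photon_count(np.abs(drpe_encrypt(img, rpm1, rpm2)) ** 2)
    ```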

  5. On-machine dimensional verification. Final report

    SciTech Connect

    Rendulic, W.

    1993-08-01

General technology for automating in-process verification of machined products has been studied and implemented on a variety of machines and products at AlliedSignal Inc., Kansas City Division (KCD). Tests have been performed to establish system accuracy and probe reliability on two numerically controlled machining centers. Commercial software has been revised, and new cycles, such as skew check and skew machining, have been developed to enhance and expand probing capabilities. Probe benefits have been demonstrated in the areas of setup, cycle time, part quality, tooling cost, and product sampling.

  6. Online fingerprint verification.

    PubMed

    Upendra, K; Singh, S; Kumar, V; Verma, H K

    2007-01-01

As organizations search for more secure authentication methods for user access, e-commerce, and other security applications, biometrics is gaining increasing attention. With an increasing emphasis on emerging automatic personal identification applications, fingerprint-based identification is becoming more popular. The most widely used fingerprint representation is the minutiae-based representation. The main drawback of this representation is that it does not utilize a significant component of the rich discriminatory information available in fingerprints. Local ridge structures cannot be completely characterized by minutiae. Also, it is difficult to quickly match two fingerprint images containing different numbers of unregistered minutiae points. In this study a filter bank based representation, which eliminates these weaknesses, is implemented and the overall performance of the developed system is tested. The results have shown that this system can be used effectively for secure online verification applications. PMID:17365425
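
    The filter-bank representation referred to above is typically built from oriented Gabor filters; a compact sketch (the kernel parameters and the whole-image pooling are simplifications of the sector-based FingerCode scheme):

    ```python
    import numpy as np
    from scipy.signal import fftconvolve

    def gabor_kernel(theta, freq=0.1, sigma=4.0, size=33):
        """Oriented Gabor filter tuned to local ridge frequency and direction."""
        half = size // 2
        y, x = np.mgrid[-half:half + 1, -half:half + 1]
        xr = x * np.cos(theta) + y * np.sin(theta)
        return np.exp(-(x**2 + y**2) / (2 * sigma**2)) * np.cos(2 * np.pi * freq * xr)

    def filterbank_features(img, n_orient=8):
        """Mean absolute response per orientation; real FingerCode pools over
        sectors around the core point rather than over the whole image."""
        return np.array([np.mean(np.abs(fftconvolve(img, gabor_kernel(np.pi * k / n_orient),
                                                    mode="same")))
                         for k in range(n_orient)])

    # Verification: small Euclidean distance between feature vectors => match.
    feats = filterbank_features(np.random.default_rng(0).random((128, 128)))
    ```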

  7. Verification of VENTSAR

    SciTech Connect

    Simpkins, A.A.

    1995-01-01

The VENTSAR code is an upgraded and improved version of the VENTX code, which estimates concentrations on or near a building from a release at a nearby location. The code calculates the concentrations either for a given meteorological exceedance probability or for a given stability and wind speed combination. A single building lying in the path of the plume can be modeled, or a penthouse can be added to the top of the building. Plume rise may also be considered. Release types can be either chemical or radioactive. Downwind concentrations are determined at user-specified incremental distances. This verification report was prepared to demonstrate that VENTSAR properly executes all algorithms and transfers data. Hand calculations were also performed to ensure proper application of the methodologies.
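
    For context, hand-verifiable dispersion codes of this kind build on the Gaussian plume formula; a sketch of the flat-terrain version with ground reflection (VENTSAR's building-wake and plume-rise algorithms are more involved than this, and the inputs below are invented):

    ```python
    import numpy as np

    def gaussian_plume(q, u, y, z, h, sigma_y, sigma_z):
        """Concentration (g/m^3) for release rate q (g/s), wind speed u (m/s),
        effective release height h (m), crosswind offset y and height z (m),
        with dispersion coefficients sigma_y, sigma_z (m) evaluated at the
        downwind distance of interest. Includes the ground-reflection image term."""
        lateral = np.exp(-y**2 / (2 * sigma_y**2))
        vertical = (np.exp(-(z - h)**2 / (2 * sigma_z**2)) +
                    np.exp(-(z + h)**2 / (2 * sigma_z**2)))
        return q / (2 * np.pi * u * sigma_y * sigma_z) * lateral * vertical

    # Comparing such output against a hand calculation is exactly the style of
    # check the verification report describes.
    print(gaussian_plume(q=1.0, u=3.0, y=0.0, z=1.5, h=10.0, sigma_y=20.0, sigma_z=10.0))
    ```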

  8. 40 CFR 1065.390 - PM balance verifications and weighing process verification.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... 40 Protection of Environment 34 2013-07-01 2013-07-01 false PM balance verifications and weighing... § 1065.390 PM balance verifications and weighing process verification. (a) Scope and frequency. This section describes three verifications. (1) Independent verification of PM balance performance within...

  9. 40 CFR 1065.390 - PM balance verifications and weighing process verification.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 40 Protection of Environment 32 2010-07-01 2010-07-01 false PM balance verifications and weighing... § 1065.390 PM balance verifications and weighing process verification. (a) Scope and frequency. This section describes three verifications. (1) Independent verification of PM balance performance within...

  10. 40 CFR 1065.390 - PM balance verifications and weighing process verification.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... 40 Protection of Environment 34 2012-07-01 2012-07-01 false PM balance verifications and weighing... § 1065.390 PM balance verifications and weighing process verification. (a) Scope and frequency. This section describes three verifications. (1) Independent verification of PM balance performance within...

  11. 40 CFR 1065.390 - PM balance verifications and weighing process verification.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... 40 Protection of Environment 33 2011-07-01 2011-07-01 false PM balance verifications and weighing... § 1065.390 PM balance verifications and weighing process verification. (a) Scope and frequency. This section describes three verifications. (1) Independent verification of PM balance performance within...

  12. 40 CFR 1065.390 - PM balance verifications and weighing process verification.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... 40 Protection of Environment 33 2014-07-01 2014-07-01 false PM balance verifications and weighing... § 1065.390 PM balance verifications and weighing process verification. (a) Scope and frequency. This section describes three verifications. (1) Independent verification of PM balance performance within...

  13. Hardware proofs using EHDM and the RSRE verification methodology

    NASA Technical Reports Server (NTRS)

    Butler, Ricky W.; Sjogren, Jon A.

    1988-01-01

Examined is a methodology for hardware verification developed by the Royal Signals and Radar Establishment (RSRE) in the context of SRI International's Enhanced Hierarchical Design Methodology (EHDM) specification/verification system. The methodology utilizes a four-level specification hierarchy with the following levels: functional level, finite automata model, block model, and circuit level. The properties of a level are proved as theorems in the level below it. This methodology is applied to a 6-bit counter problem and is critically examined. The specifications are written in EHDM's specification language, Extended Special, and the proof exercise suggests improvements to both the RSRE methodology and the EHDM system.

  14. Biometric verification with correlation filters.

    PubMed

    Vijaya Kumar, B V K; Savvides, Marios; Xie, Chunyan; Venkataramani, Krithika; Thornton, Jason; Mahalanobis, Abhijit

    2004-01-10

    Using biometrics for subject verification can significantly improve security over that of approaches based on passwords and personal identification numbers, both of which people tend to lose or forget. In biometric verification the system tries to match an input biometric (such as a fingerprint, face image, or iris image) to a stored biometric template. Thus correlation filter techniques are attractive candidates for the matching precision needed in biometric verification. In particular, advanced correlation filters, such as synthetic discriminant function filters, can offer very good matching performance in the presence of variability in these biometric images (e.g., facial expressions, illumination changes, etc.). We investigate the performance of advanced correlation filters for face, fingerprint, and iris biometric verification. PMID:14735958
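
    The decision statistic usually paired with these filters is the peak-to-sidelobe ratio of the correlation plane; a sketch (the filter design itself, e.g. MACE/SDF synthesis, is omitted, and the sidelobe exclusion window is an assumed parameter):

    ```python
    import numpy as np

    def correlate(filter_spectrum, image):
        """Apply a frequency-domain correlation filter to a test image."""
        return np.fft.ifft2(np.fft.fft2(image) * np.conj(filter_spectrum)).real

    def peak_to_sidelobe_ratio(plane, exclude=5):
        """Sharp, isolated peak => authentic; flat plane => impostor."""
        r, c = np.unravel_index(np.argmax(plane), plane.shape)
        mask = np.ones(plane.shape, dtype=bool)
        mask[max(0, r - exclude):r + exclude + 1,
             max(0, c - exclude):c + exclude + 1] = False
        sidelobe = plane[mask]
        return (plane[r, c] - sidelobe.mean()) / sidelobe.std()
    ```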

  16. TPS verification with UUT simulation

    NASA Astrophysics Data System (ADS)

    Wang, Guohua; Meng, Xiaofeng; Zhao, Ruixian

    2006-11-01

Verification of a TPS (Test Program Set), or first-article acceptance testing, commonly depends on fault-insertion experiments on the UUT (Unit Under Test). However, the failure modes that can be injected into a UUT are limited, and the approach is almost infeasible when the UUT is still in development or is in a distributed state. To resolve this problem, a TPS verification method based on simulation of the UUT's interface signals is put forward. Interoperability between the ATS (automatic test system) and the UUT simulation platform is essential for automatic TPS verification. After analyzing the ATS software architecture, an approach to realizing interoperability between the ATS software and the UUT simulation platform is proposed, and the software architecture of the UUT simulation platform is then derived from the ATS software architecture. The hardware composition and software architecture of the UUT simulator are described in detail. The UUT simulation platform has been used in avionics equipment TPS development, debugging, and verification.

  17. Fusion strategies for boosting cancelable online signature verification

    NASA Astrophysics Data System (ADS)

    Muramatsu, Daigo; Inuma, Manabu; Shikata, Junji; Otsuka, Akira

    2010-04-01

Cancelable approaches for biometric person authentication have been studied to protect enrolled biometric data, and several algorithms have been proposed. One drawback of cancelable approaches is that the performance is inferior to that of non-cancelable approaches. As one solution, we proposed a scheme to enhance the performance of a cancelable approach for online signature verification by combining scores calculated from two transformed datasets generated using two keys. Generally, in cancelable approaches the same verification algorithm is used for transformed data as for raw (non-transformed) data, and in our previous work a verification system developed for a non-transformed dataset was used to calculate the scores from transformed data. In this paper, we modify the verification system by using transformed data for training. Several experiments were performed using public databases, and the experimental results show that the modification of the verification system improved performance. Our cancelable system combines two scores to make a decision. Several fusion strategies are also considered, and the experimental results are reported here.

  18. Verification & Validation (V&V) Guidelines and Quantitative Reliability at Confidence (QRC): Basis for an Investment Strategy

    SciTech Connect

    Logan, R W; Nitta, C K

    2002-07-17

This paper represents an attempt to summarize our thoughts regarding various methods and potential guidelines for Verification and Validation (V&V) and Uncertainty Quantification (UQ) that we have observed within the broader V&V community or generated ourselves. Our goals are to evaluate these various methods, to apply them to computational simulation analyses, and to integrate them into methods for Quantitative Certification techniques for the nuclear stockpile. We describe the critical nature of high-quality analyses with quantified V&V, and the essential role of V&V and UQ at specified confidence levels in evaluating system certification status. Only after V&V has contributed to UQ at confidence can rational tradeoffs of various scenarios be made. UQ of performance and safety margins for various scenarios and issues is applied in assessments of Quantified Reliability at Confidence (QRC), and we summarize with a brief description of how these V&V-generated QRC quantities fold into a Value-Engineering methodology for evaluating investment strategies. V&V contributes directly to the decision process for investment, through quantification of uncertainties at confidence for margin and reliability assessments. These contributions play an even greater role in a Comprehensive Test Ban Treaty (CTBT) environment than ever before, when reliance on simulation in the absence of the ability to perform nuclear testing is critical.
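
    One concrete way to turn pass/fail test or simulation outcomes into a "reliability at confidence" number is a one-sided Clopper-Pearson bound (the paper's QRC methodology is much broader; this is only an illustrative fragment):

    ```python
    from scipy.stats import beta

    def reliability_lower_bound(successes, trials, confidence=0.90):
        """One-sided Clopper-Pearson lower confidence bound on reliability."""
        if successes == 0:
            return 0.0
        return float(beta.ppf(1 - confidence, successes, trials - successes + 1))

    # 29 passes in 30 trials: the reliability demonstrated at 90% confidence.
    print(reliability_lower_bound(29, 30))
    ```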

  19. Verification and validation in railway signalling engineering - an application of enterprise systems techniques

    NASA Astrophysics Data System (ADS)

    Chen, Xiangxian; Wang, Dong; Huang, Hai; Wang, Zheng

    2014-07-01

Verification and validation of a railway signalling system is a crucial part of the workflow in railway signalling enterprises. Typically, the verification and validation of this type of safety-critical system is performed by means of on-site tests, which leads to low efficiency and high costs. A novel method for the verification and validation of a railway signalling system is proposed as an application of the enterprise information system (EIS) technique. In this application, the EIS and the simulation test platform are combined, which enhances the coherence and consistency of the information exchange between system development and system verification and improves work efficiency. The simulation and auto-test technology used in system verification also lowers human and financial costs.

  20. Appendix: Conjectures concerning proof, design, and verification.

    SciTech Connect

    Wos, L.

    2000-05-31

This article focuses on an esoteric but practical use of automated reasoning that may indeed be new to many, especially those concerned primarily with verification of both hardware and software. Specifically, featured are a discussion and some methodology for taking an existing design -- of a circuit, a chip, a program, or the like -- and refining and improving it in various ways. Although the methodology is general and does not require the use of a specific program, McCune's program OTTER does offer what is needed. OTTER has played and continues to play the key role in the author's research, and an interested person can gain access to this program in various ways, not the least of which is through the included CD-ROM in [3]. When success occurs, the result is a new design that may require fewer components, avoid the use of certain costly components, offer more reliability and ease of verification, and, perhaps most important, be more efficient in the contexts of speed and heat generation. Although the author has minimal experience in circuit design, circuit validation, program synthesis, program verification, and similar concerns, (at the encouragement of colleagues based on successes to be cited) he presents materials that might indeed be of substantial interest to manufacturers and programmers. He writes this article in part prompted by the recent activities of chip designers, including Intel and AMD, activities heavily emphasizing the proving of theorems. As for research that appears relevant, the author has made an intense and most profitable study of finding proofs that are shorter [2,3], some that avoid the use of various types of term, some that are far less complex than previously known, and the like. Those results suggest a strong possible connection between more appealing proofs (in mathematics and in logic) and enhanced and improved design of both hardware and software. Here the author explores diverse conjectures that elucidate some of the

  1. Hard and Soft Safety Verifications

    NASA Technical Reports Server (NTRS)

    Wetherholt, Jon; Anderson, Brenda

    2012-01-01

The purpose of this paper is to examine the differences between and the effects of hard and soft safety verifications. Initially, the terminology should be defined and clarified. A hard safety verification is datum which demonstrates how a safety control is enacted. An example of this is relief valve testing. A soft safety verification is something which is usually described as nice to have but not necessary to prove safe operation. An example of a soft verification is the loss of the Solid Rocket Booster (SRB) casings from Shuttle flight STS-4. When the main parachutes failed, the casings impacted the water and sank. In the nose cap of the SRBs, video cameras recorded the release of the parachutes to determine safe operation and to provide information for potential anomaly resolution. Generally, examination of the casings and nozzles contributed to understanding of the newly developed boosters and their operation. Safety verification of SRB operation was demonstrated by examination for erosion or wear of the casings and nozzle. Loss of the SRBs and associated data did not delay the launch of the next Shuttle flight.

  2. Structural verification for GAS experiments

    NASA Technical Reports Server (NTRS)

    Peden, Mark Daniel

    1992-01-01

    The purpose of this paper is to assist the Get Away Special (GAS) experimenter in conducting a thorough structural verification of its experiment structural configuration, thus expediting the structural review/approval process and the safety process in general. Material selection for structural subsystems will be covered with an emphasis on fasteners (GSFC fastener integrity requirements) and primary support structures (Stress Corrosion Cracking requirements and National Space Transportation System (NSTS) requirements). Different approaches to structural verifications (tests and analyses) will be outlined especially those stemming from lessons learned on load and fundamental frequency verification. In addition, fracture control will be covered for those payloads that utilize a door assembly or modify the containment provided by the standard GAS Experiment Mounting Plate (EMP). Structural hazard assessment and the preparation of structural hazard reports will be reviewed to form a summation of structural safety issues for inclusion in the safety data package.

  3. Space Telescope performance and verification

    NASA Technical Reports Server (NTRS)

    Wright, W. F.

    1980-01-01

    The verification philosophy for the Space Telescope (ST) has evolved from years of experience with multispacecraft programs modified by the new factors introduced by the Space Transportation System. At the systems level of test, the ST will undergo joint qualification/acceptance tests with environment simulation using Lockheed's large spacecraft test facilities. These tests continue the process of detecting workmanship defects and module interface incompatibilities. The test program culminates in an 'all up' ST environmental test verification program resulting in a 'ready to launch' ST.

  4. Formal verification of mathematical software

    NASA Technical Reports Server (NTRS)

    Sutherland, D.

    1984-01-01

    Methods are investigated for formally specifying and verifying the correctness of mathematical software (software which uses floating point numbers and arithmetic). Previous work in the field was reviewed. A new model of floating point arithmetic called the asymptotic paradigm was developed and formalized. Two different conceptual approaches to program verification, the classical Verification Condition approach and the more recently developed Programming Logic approach, were adapted to use the asymptotic paradigm. These approaches were then used to verify several programs; the programs chosen were simplified versions of actual mathematical software.

  5. A verification system of RMAP protocol controller

    NASA Astrophysics Data System (ADS)

    Khanov, V. Kh; Shakhmatov, A. V.; Chekmarev, S. A.

    2015-01-01

The functional verification of an RMAP protocol controller IP block is considered. The use of a verification method employing fully functional models of the processor and the internal bus of a system-on-chip is justified. Principles for constructing a verification system based on this approach are proposed, and practical results from creating a verification system for the RMAP protocol controller IP block are presented.

  6. Working Memory Mechanism in Proportional Quantifier Verification

    ERIC Educational Resources Information Center

    Zajenkowski, Marcin; Szymanik, Jakub; Garraffa, Maria

    2014-01-01

    The paper explores the cognitive mechanisms involved in the verification of sentences with proportional quantifiers (e.g. "More than half of the dots are blue"). The first study shows that the verification of proportional sentences is more demanding than the verification of sentences such as: "There are seven blue and eight yellow…

  7. 78 FR 58492 - Generator Verification Reliability Standards

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-09-24

... Energy Regulatory Commission 18 CFR Part 40 Generator Verification Reliability Standards AGENCY: Federal... Organization: MOD-025-2 (Verification and Data Reporting of Generator Real and Reactive Power Capability and Synchronous Condenser Reactive Power Capability), MOD-026-1 (Verification of Models and Data for...

  8. Verification Challenges at Low Numbers

    SciTech Connect

    Benz, Jacob M.; Booker, Paul M.; McDonald, Benjamin S.

    2013-07-16

This paper will explore the difficulties of deep reductions by examining the technical verification challenges. At each step on the road to low numbers, the verification required to ensure compliance of all parties will increase significantly. Looking beyond New START, the next step will likely include warhead limits in the neighborhood of 1000 (Pifer 2010). Further reductions will include stepping stones at 100's of warheads, and then 10's of warheads, before final elimination of the last few remaining warheads and weapons could be considered. This paper will focus on these three threshold reduction levels: 1000, 100's, and 10's. For each, the issues and challenges will be discussed, potential solutions will be identified, and the verification technologies and chain-of-custody measures that address these solutions will be surveyed. It is important to note that many of the issues that need to be addressed have no current solution. In these cases, the paper will explore new or novel technologies that could be applied. These technologies will draw from the research and development that is ongoing throughout the national lab complex, and will look at technologies utilized in other areas of industry for their application to arms control verification.

  9. Visual Attention During Sentence Verification.

    ERIC Educational Resources Information Center

    Lucas, Peter A.

    Eye movement data were collected for 28 college students reading 32 sentences with sentence verification questions. The factors observed were target sentence voice (active/passive), probe voice, and correct response (true/false). Pairs of subjects received the same set of stimuli, but with agents and objects in the sentences reversed. As expected,…

  10. Improved method for coliform verification.

    PubMed

    Diehl, J D

    1991-02-01

    Modification of a method for coliform verification presented in Standard Methods for the Examination of Water and Wastewater is described. Modification of the method, which is based on beta-galactosidase production, involves incorporation of a lactose operon inducer in medium upon which presumptive coliform isolates are cultured prior to beta-galactosidase assay. PMID:1901712

  12. A scheme for symmetrization verification

    NASA Astrophysics Data System (ADS)

    Sancho, Pedro

    2011-08-01

    We propose a scheme for symmetrization verification in two-particle systems, based on one-particle detection and state determination. In contrast to previous proposals, it does not follow a Hong-Ou-Mandel-type approach. Moreover, the technique can be used to generate superposition states of single particles.
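
    The property being verified can be stated compactly: a two-particle state is symmetrized when it is an eigenvector of the exchange operator. A numpy check of that textbook condition (the paper's point is to infer this from one-particle detections, which is not modeled here):

    ```python
    import numpy as np

    d = 2                                          # single-particle dimension
    rng = np.random.default_rng(0)
    a = rng.normal(size=(d, d)) + 1j * rng.normal(size=(d, d))
    psi = (a + a.T).reshape(-1)                    # symmetric two-particle amplitude
    psi /= np.linalg.norm(psi)

    # Exchange operator S|a,b> = |b,a> on the d*d product space.
    S = np.zeros((d * d, d * d))
    for i in range(d):
        for j in range(d):
            S[j * d + i, i * d + j] = 1.0

    print(np.allclose(S @ psi, psi))   # True: symmetric (bosonic); -psi would be fermionic
    ```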

  13. VERIFICATION OF WATER QUALITY MODELS

    EPA Science Inventory

    The basic concepts of water quality models are reviewed and the need to recognize calibration and verification of models with observed data is stressed. Post auditing of models after environmental control procedures are implemented is necessary to determine true model prediction ...
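
    The summary above does not prescribe a statistic; one customary choice when verifying a calibrated water quality model against observed data is the Nash-Sutcliffe efficiency (the function and its interpretation below reflect common practice, not this EPA record):

    ```python
    import numpy as np

    def nash_sutcliffe(observed, predicted):
        """NSE = 1 - SSE / variance of observations: 1.0 is a perfect fit;
        values <= 0 mean the model predicts no better than the observed mean."""
        o, p = np.asarray(observed, float), np.asarray(predicted, float)
        return 1.0 - np.sum((o - p) ** 2) / np.sum((o - o.mean()) ** 2)

    print(nash_sutcliffe([2.1, 3.4, 4.8, 3.9], [2.3, 3.1, 4.6, 4.2]))
    ```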

  14. Kate's Model Verification Tools

    NASA Technical Reports Server (NTRS)

    Morgan, Steve

    1991-01-01

    Kennedy Space Center's Knowledge-based Autonomous Test Engineer (KATE) is capable of monitoring electromechanical systems, diagnosing their errors, and even repairing them when they crash. A survey of KATE's developer/modelers revealed that they were already using a sophisticated set of productivity enhancing tools. They did request five more, however, and those make up the body of the information presented here: (1) a transfer function code fitter; (2) a FORTRAN-Lisp translator; (3) three existing structural consistency checkers to aid in syntax checking their modeled device frames; (4) an automated procedure for calibrating knowledge base admittances to protect KATE's hardware mockups from inadvertent hand valve twiddling; and (5) three alternatives for the 'pseudo object', a programming patch that currently apprises KATE's modeling devices of their operational environments.

  15. COS Internal NUV Wavelength Verification

    NASA Astrophysics Data System (ADS)

    Keyes, Charles

    2009-07-01

    This program will be executed after the uplink of the OSM2 position updates derived from the determination of the wavelength-scale zero points and desired spectral ranges for each grating in activity COS14 {program 11474 - COS NUV Internal/External Wavelength Scales}. This program will verify that the operational spectral ranges for each grating, central wavelength, and FP-POS are those desired. Subsequent to a successful verification, COS NUV ERO observations and NUV science can be enabled. An internal wavelength calibration spectrum using the default PtNe lamp {lamp 1} with each NUV grating at each central wavelength setting and each FP-POS position will be obtained for the verification. Additional exposures and waits between certain exposures will be required to avoid - and to evaluate - mechanism drifts.

  16. COS Internal FUV Wavelength Verification

    NASA Astrophysics Data System (ADS)

    Keyes, Charles

    2009-07-01

    This program will be executed after the uplink of the OSM1 position updates derived from the determination of the wavelength-scale zero points and desired spectral ranges for each grating in activity COS29 {program 11487 - COS FUV Internal/External Wavelength Scales}. This program will verify that the operational spectral ranges for each grating, central wavelength, and FP-POS are those desired. Subsequent to a successful verification, COS FUV ERO observations that require accurate wavelength scales {if any} and FUV science can be enabled. An internal wavelength calibration spectrum using the default PtNe lamp {lamp 1} with each FUV grating at each central wavelength setting and each FP-POS position will be obtained for the verification. Additional exposures and waits between certain exposures will be required to avoid - and to evaluate - mechanism drifts.

  17. NEXT Thruster Component Verification Testing

    NASA Technical Reports Server (NTRS)

    Pinero, Luis R.; Sovey, James S.

    2007-01-01

Component testing is a critical part of thruster life validation activities under NASA's Evolutionary Xenon Thruster (NEXT) project. The high-voltage propellant isolators were selected for design verification testing. Even though they are based on a heritage design, design changes were made because the isolators will be operated under different environmental conditions, including temperature, voltage, and pressure. The life test of two NEXT isolators was therefore initiated and has accumulated more than 10,000 hr of operation. Measurements to date indicate only a negligibly small increase in leakage current. The cathode heaters were also selected for verification testing. The technology to fabricate these heaters, developed for the International Space Station plasma contactor hollow cathode assembly, was transferred to Aerojet for the fabrication of the NEXT prototype model ion thrusters. Testing the contractor-fabricated heaters is necessary to validate fabrication processes for high-reliability heaters. This paper documents the status of the propellant isolator and cathode heater tests.

  18. Formal verification of AI software

    NASA Technical Reports Server (NTRS)

    Rushby, John; Whitehurst, R. Alan

    1989-01-01

    The application of formal verification techniques to Artificial Intelligence (AI) software, particularly expert systems, is investigated. Constraint satisfaction and model inversion are identified as two formal specification paradigms for different classes of expert systems. A formal definition of consistency is developed, and the notion of approximate semantics is introduced. Examples are given of how these ideas can be applied in both declarative and imperative forms.

  19. Verification and transparency in future arms control

    SciTech Connect

    Pilat, J.F.

    1996-09-01

Verification's importance has changed dramatically over time, although it always has been in the forefront of arms control. The goals and measures of verification and the criteria for success have changed with the times as well, reflecting such factors as the centrality of the prospective agreement to East-West relations during the Cold War, the state of relations between the United States and the Soviet Union, and the technologies available for monitoring. Verification's role may be declining in the post-Cold War period. The prospects for such a development will depend, first and foremost, on the high costs of traditional arms control, especially those associated with requirements for verification. Moreover, the growing interest in informal, or non-negotiated, arms control does not allow for verification provisions by the very nature of these arrangements. Multilateral agreements are also becoming more prominent and argue against highly effective verification measures, in part because of fears of promoting proliferation by opening sensitive facilities to inspectors from potential proliferant states. As a result, it is likely that transparency and confidence-building measures will achieve greater prominence, both as supplements to and substitutes for traditional verification. Such measures are not panaceas and do not offer all that we came to expect from verification during the Cold War. But they may be the best possible means to deal with current problems of arms reductions and restraints at acceptable levels of expenditure.

  20. Verification Challenges at Low Numbers

    SciTech Connect

    Benz, Jacob M.; Booker, Paul M.; McDonald, Benjamin S.

    2013-06-01

Many papers have dealt with the political difficulties and ramifications of deep nuclear arms reductions, and the issues of "going to zero". Political issues include extended deterrence, conventional weapons, ballistic missile defense, and regional and geopolitical security issues. At each step on the road to low numbers, the verification required to ensure compliance of all parties will increase significantly. Looking beyond New START, the next step will likely include warhead limits in the neighborhood of 1000. Further reductions will include stepping stones at 1000 warheads, 100's of warheads, and then 10's of warheads, before final elimination of the last few remaining warheads and weapons could be considered. This paper will focus on these three threshold reduction levels: 1000, 100's, and 10's. For each, the issues and challenges will be discussed, potential solutions will be identified, and the verification technologies and chain-of-custody measures that address these solutions will be surveyed. It is important to note that many of the issues that need to be addressed have no current solution. In these cases, the paper will explore new or novel technologies that could be applied. These technologies will draw from the research and development that is ongoing throughout the national laboratory complex, and will look at technologies utilized in other areas of industry for their application to arms control verification.

  1. Nuclear Data Verification and Standardization

    SciTech Connect

    Karam, Lisa R.; Arif, Muhammad; Thompson, Alan K.

    2011-10-01

The objective of this interagency program is to provide accurate neutron interaction verification and standardization data for the U.S. Department of Energy Division of Nuclear Physics programs, which include astrophysics, radioactive beam studies, and heavy-ion reactions. The measurements made in this program are also useful to other programs that indirectly use the unique properties of the neutron for diagnostic and analytical purposes. These include homeland security, personnel health and safety, nuclear waste disposal, treaty verification, national defense, and nuclear-based energy production. The work includes the verification of reference standard cross sections and related neutron data employing the unique facilities and capabilities at NIST and other laboratories as required; leadership and participation in international intercomparisons and collaborations; and the preservation of standard reference deposits. An essential element of the program is critical evaluation of neutron interaction data standards, including international coordination. Data testing of critical data for important applications is included. The program is jointly supported by the Department of Energy and the National Institute of Standards and Technology.

  2. Regression Verification Using Impact Summaries

    NASA Technical Reports Server (NTRS)

    Backes, John; Person, Suzette J.; Rungta, Neha; Thachuk, Oksana

    2013-01-01

Regression verification techniques are used to prove equivalence of syntactically similar programs. Checking equivalence of large programs, however, can be computationally expensive. Existing regression verification techniques rely on abstraction and decomposition techniques to reduce the computational effort of checking equivalence of the entire program. These techniques are sound but not complete. In this work, we propose a novel approach to improve scalability of regression verification by classifying the program behaviors generated during symbolic execution as either impacted or unimpacted. Our technique uses a combination of static analysis and symbolic execution to generate summaries of impacted program behaviors. The impact summaries are then checked for equivalence using an off-the-shelf decision procedure. We prove that our approach is both sound and complete for sequential programs, with respect to the depth bound of symbolic execution. Our evaluation on a set of sequential C artifacts shows that reducing the size of the summaries can help reduce the cost of software equivalence checking. Various reduction, abstraction, and compositional techniques have been developed to help scale software verification techniques to industrial-sized systems. Although such techniques have greatly increased the size and complexity of systems that can be checked, analysis of large software systems remains costly. Regression analysis techniques, e.g., regression testing [16], regression model checking [22], and regression verification [19], restrict the scope of the analysis by leveraging the differences between program versions. These techniques are based on the idea that if code is checked early in development, then subsequent versions can be checked against a prior (checked) version, leveraging the results of the previous analysis to reduce analysis cost of the current version. Regression verification addresses the problem of proving equivalence of closely related program
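
    A toy rendering of the underlying equivalence question (the paper answers it with impact summaries, symbolic execution, and a decision procedure rather than the bounded brute force shown here):

    ```python
    def abs_v1(x):                  # previous, already-verified version
        return x if x >= 0 else -x

    def abs_v2(x):                  # refactored version under regression check
        return abs(x)

    def equivalent(f, g, domain):
        """Bounded equivalence check: a stand-in for proving f == g on all inputs."""
        return all(f(x) == g(x) for x in domain)

    print(equivalent(abs_v1, abs_v2, range(-1000, 1001)))   # True
    ```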

  3. Gender verification in competitive sports.

    PubMed

    Simpson, J L; Ljungqvist, A; de la Chapelle, A; Ferguson-Smith, M A; Genel, M; Carlson, A S; Ehrhardt, A A; Ferris, E

    1993-11-01

    The possibility that men might masquerade as women and be unfair competitors in women's sports is accepted as outrageous by athletes and the public alike. Since the 1930s, media reports have fuelled claims that individuals who once competed as female athletes subsequently appeared to be men. In most of these cases there was probably ambiguity of the external genitalia, possibly as a result of male pseudohermaphroditism. Nonetheless, beginning at the Rome Olympic Games in 1960, the International Amateur Athletics Federation (IAAF) began establishing rules of eligibility for women athletes. Initially, physical examination was used as a method for gender verification, but this plan was widely resented. Thus, sex chromatin testing (buccal smear) was introduced at the Mexico City Olympic Games in 1968. The principle was that genetic females (46,XX) show a single X-chromatic mass, whereas males (46,XY) do not. Unfortunately, sex chromatin analysis fell out of common diagnostic use by geneticists shortly after the International Olympic Committee (IOC) began its implementation for gender verification. The lack of laboratories routinely performing the test aggravated the problem of errors in interpretation by inexperienced workers, yielding false-positive and false-negative results. However, an even greater problem is that there exist phenotypic females with male sex chromatin patterns (e.g. androgen insensitivity, XY gonadal dysgenesis). These individuals have no athletic advantage as a result of their congenital abnormality and reasonably should not be excluded from competition. That is, only the chromosomal (genetic) sex is analysed by sex chromatin testing, not the anatomical or psychosocial status. For all the above reasons sex chromatin testing unfairly excludes many athletes. Although the IOC offered follow-up physical examinations that could have restored eligibility for those 'failing' sex chromatin tests, most affected athletes seemed to prefer to 'retire'. All

  4. Formal verification of a fault tolerant clock synchronization algorithm

    NASA Technical Reports Server (NTRS)

    Rushby, John; Vonhenke, Frieder

    1989-01-01

A formal specification and mechanically assisted verification of the interactive convergence clock synchronization algorithm of Lamport and Melliar-Smith is described. Several technical flaws in the analysis given by Lamport and Melliar-Smith were discovered, even though their presentation is unusually precise and detailed. It seems that these flaws were not detected by informal peer scrutiny. The flaws are discussed and a revised presentation of the analysis is given that not only corrects the flaws but is also more precise and easier to follow. Some of the corrections to the flaws require slight modifications to the original assumptions underlying the algorithm and to the constraints on its parameters, and thus change the external specifications of the algorithm. The formal analysis of the interactive convergence clock synchronization algorithm was performed using the Enhanced Hierarchical Development Methodology (EHDM) formal specification and verification environment. This application of EHDM provides a demonstration of some of the capabilities of the system.
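
    A small simulation of one round of the interactive convergence algorithm that the verification targets (delta and the clock offsets are made-up numbers; the formal analysis concerns worst-case bounds, not this simulation):

    ```python
    import numpy as np

    def ica_round(readings, delta):
        """Each clock averages all readings, first replacing any reading that
        differs from its own by more than delta with its own value."""
        corrected = np.empty(len(readings))
        for i, r_i in enumerate(readings):
            egocentric = np.where(np.abs(readings - r_i) <= delta, readings, r_i)
            corrected[i] = egocentric.mean()
        return corrected

    clocks = np.array([0.0, 0.2, -0.1, 5.0])   # offsets in ms; the last clock is faulty
    print(ica_round(clocks, delta=1.0))        # nonfaulty clocks draw closer together
    ```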

  5. ENVIRONMENTAL TECHNOLOGY VERIFICATION: GENERIC VERIFICATION PROTOCOL FOR BIOLOGICAL AND AEROSOL TESTING OF GENERAL VENTILATION AIR CLEANERS

    EPA Science Inventory

    The U.S. Environmental Protection Agency established the Environmental Technology Verification Program to accelerate the development and commercialization of improved environmental technology through third party verification and reporting of product performance. Research Triangl...

  6. PERFORMANCE VERIFICATION OF STORMWATER TREATMENT DEVICES UNDER EPA'S ENVIRONMENTAL TECHNOLOGY VERIFICATION PROGRAM

    EPA Science Inventory

The Environmental Technology Verification (ETV) Program was created to facilitate the deployment of innovative or improved environmental technologies through performance verification and dissemination of information. The program's goal is to further environmental protection by a...

  7. PERFORMANCE VERIFICATION OF ANIMAL WATER TREATMENT TECHNOLOGIES THROUGH EPA'S ENVIRONMENTAL TECHNOLOGY VERIFICATION PROGRAM

    EPA Science Inventory

    The U.S. Environmental Protection Agency created the Environmental Technology Verification Program (ETV) to further environmental protection by accelerating the commercialization of new and innovative technology through independent performance verification and dissemination of in...

  8. Magnetic cleanliness verification approach on tethered satellite

    NASA Technical Reports Server (NTRS)

    Messidoro, Piero; Braghin, Massimo; Grande, Maurizio

    1990-01-01

    Magnetic cleanliness testing was performed on the Tethered Satellite as the last step of an articulated verification campaign aimed at demonstrating the capability of the satellite to support its TEMAG (TEthered MAgnetometer) experiment. Tests at unit level and analytical predictions/correlations using a dedicated mathematical model (GANEW program) are also part of the verification activities. Details of the tests are presented, and the results of the verification are described together with recommendations for later programs.

  9. Application of Nuclear Physics Methods in the Verification System for the Comprehensive Nuclear-Test-Ban Treaty

    NASA Astrophysics Data System (ADS)

    Wilhelmsen, K.; Elmgren, K.; Jansson, P.

    2005-04-01

Elements of the Comprehensive Nuclear-Test-Ban Treaty (CTBT) and its International Monitoring System (IMS) are briefly described. Two different radionuclide detection systems, developed by the Swedish Defence Research Agency (FOI), are treated in more detail.

  10. Imaging quality full chip verification for yield improvement

    NASA Astrophysics Data System (ADS)

    Yang, Qing; Zhou, CongShu; Quek, ShyueFong; Lu, Mark; Foong, YeeMei; Qiu, JianHong; Pandey, Taksh; Dover, Russell

    2013-04-01

Basic image intensity parameters, such as the minimum and maximum intensity values (Imin and Imax), image logarithm slope (ILS), normalized image logarithm slope (NILS), and mask error enhancement factor (MEEF), are well known as indexes of photolithography imaging quality. For full-chip verification, hotspot detection is typically based on threshold values for line pinching or bridging. For image intensity parameters it is generally harder to quantify an absolute value defining where the process limit will occur, and at which process stage: lithography, etch, or post-CMP. However, it is easy to conclude that hot spots captured by image intensity parameters are more susceptible to process variation and very likely to impact yield. In addition, these image intensity hot spots can be missed by resist model verification, because the resist model is normally calibrated to wafer data on a single resist plane and is an empirical model that fits the resist critical dimension by a mathematical algorithm combined with an optical calculation. Also, at the resolution enhancement technology (RET) development stage, a full-chip imaging quality check is a way to qualify the RET solution, such as optical proximity correction (OPC) performance. Adding full-chip verification based on image intensity parameters is also not as costly as adding one more resist model simulation. From a foundry yield-improvement and cost-saving perspective, it is valuable to quantify imaging quality in order to find design hot spots and correctly define the inline process control margin. This paper studies the correlation between image intensity parameters and process weakness or catastrophic hard failures at different process stages. It also demonstrates how the OPC solution can improve full-chip image intensity parameters. Rigorous 3D resist profile simulation across the full height of the resist stack was also performed to identify correlations with the image intensity parameters. A methodology of post-OPC full
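
    The two slope metrics named above have simple definitions on a sampled aerial image; a sketch (uniform sampling, a known nominal edge position, and the synthetic image profile are assumptions):

    ```python
    import numpy as np

    def ils_and_nils(x, intensity, edge_x, cd):
        """ILS = |d(ln I)/dx| at the feature edge; NILS = ILS * CD.
        x and intensity sample a 1-D aerial image cut; cd is the nominal
        critical dimension in the same units as x."""
        slope = np.gradient(np.log(intensity), x)
        i = np.argmin(np.abs(x - edge_x))
        ils = abs(slope[i])
        return ils, ils * cd

    x = np.linspace(-100.0, 100.0, 401)                    # nm
    aerial = 0.55 + 0.45 * np.tanh(x / 25.0)               # synthetic image cut
    print(ils_and_nils(x, aerial, edge_x=0.0, cd=90.0))
    ```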

  11. Automatic verification methods for finite state systems

    SciTech Connect

Sifakis, J.

    1990-01-01

    This volume contains the proceedings of a workshop devoted to the verification of finite state systems. The workshop focused on the development and use of methods, tools and theories for automatic verification of finite state systems. The goal at the workshop was to compare verification methods and tools to assist the applications designer. The papers review verification techniques for finite state systems and evaluate their relative advantages. The techniques considered cover various specification formalisms such as process algebras, automata and logics. Most of the papers focus on exploitation of existing results in three application areas: hardware design, communication protocols and real-time systems.
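
    The simplest member of the tool family surveyed there is an explicit-state reachability check; a sketch (the toy transition system is invented for illustration):

    ```python
    from collections import deque

    def bad_state_reachable(init, transitions, bad):
        """Breadth-first exploration of a finite transition relation: the core
        of explicit-state safety verification."""
        seen, frontier = {init}, deque([init])
        while frontier:
            s = frontier.popleft()
            if s in bad:
                return True
            for t in transitions.get(s, ()):
                if t not in seen:
                    seen.add(t)
                    frontier.append(t)
        return False

    T = {"idle": ["busy"], "busy": ["idle", "done"], "done": []}
    print(bad_state_reachable("idle", T, {"err"}))   # False: the unsafe state is unreachable
    ```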

  12. The SeaHorn Verification Framework

    NASA Technical Reports Server (NTRS)

    Gurfinkel, Arie; Kahsai, Temesghen; Komuravelli, Anvesh; Navas, Jorge A.

    2015-01-01

    In this paper, we present SeaHorn, a software verification framework. The key distinguishing feature of SeaHorn is its modular design that separates the concerns of the syntax of the programming language, its operational semantics, and the verification semantics. SeaHorn encompasses several novelties: it (a) encodes verification conditions using an efficient yet precise inter-procedural technique, (b) provides flexibility in the verification semantics to allow different levels of precision, (c) leverages the state-of-the-art in software model checking and abstract interpretation for verification, and (d) uses Horn-clauses as an intermediate language to represent verification conditions which simplifies interfacing with multiple verification tools based on Horn-clauses. SeaHorn provides users with a powerful verification tool and researchers with an extensible and customizable framework for experimenting with new software verification techniques. The effectiveness and scalability of SeaHorn are demonstrated by an extensive experimental evaluation using benchmarks from SV-COMP 2015 and real avionics code.

  13. Input apparatus for dynamic signature verification systems

    DOEpatents

    EerNisse, Errol P.; Land, Cecil E.; Snelling, Jay B.

    1978-01-01

    The disclosure relates to signature verification input apparatus comprising a writing instrument and platen containing piezoelectric transducers which generate signals in response to writing pressures.

  14. Turbulence Modeling Verification and Validation

    NASA Technical Reports Server (NTRS)

    Rumsey, Christopher L.

    2014-01-01

    Computational fluid dynamics (CFD) software that solves the Reynolds-averaged Navier-Stokes (RANS) equations has been in routine use for more than a quarter of a century. It is currently employed not only for basic research in fluid dynamics, but also for the analysis and design processes in many industries worldwide, including aerospace, automotive, power generation, chemical manufacturing, polymer processing, and petroleum exploration. A key feature of RANS CFD is the turbulence model. Because the RANS equations are unclosed, a model is necessary to describe the effects of the turbulence on the mean flow, through the Reynolds stress terms. The turbulence model is one of the largest sources of uncertainty in RANS CFD, and most models are known to be flawed in one way or another. Alternative methods such as direct numerical simulations (DNS) and large eddy simulations (LES) rely less on modeling and hence include more physics than RANS. In DNS all turbulent scales are resolved, and in LES the large scales are resolved and the effects of the smallest turbulence scales are modeled. However, both DNS and LES are too expensive for most routine industrial usage on today's computers. Hybrid RANS-LES, which blends RANS near walls with LES away from walls, helps to moderate the cost while still retaining some of the scale-resolving capability of LES, but for some applications it can still be too expensive. Even considering its associated uncertainties, RANS turbulence modeling has proved to be very useful for a wide variety of applications. For example, in the aerospace field, many RANS models are considered to be reliable for computing attached flows. However, existing turbulence models are known to be inaccurate for many flows involving separation. Research has been ongoing for decades in an attempt to improve turbulence models for separated and other nonequilibrium flows. When developing or improving turbulence models, both verification and validation are important

  15. Conformance Verification of Privacy Policies

    NASA Astrophysics Data System (ADS)

    Fu, Xiang

    Web applications are both the consumers and providers of information. To increase customer confidence, many websites choose to publish their privacy protection policies. However, policy conformance is often neglected. We propose a logic based framework for formally specifying and reasoning about the implementation of privacy protection by a web application. A first order extension of computation tree logic is used to specify a policy. A verification paradigm, built upon a static control/data flow analysis, is presented to verify if a policy is satisfied.

  16. Why do verification and validation?

    DOE PAGES Beta

    Hu, Kenneth T.; Paez, Thomas L.

    2016-02-19

    In this discussion paper, we explore different ways to assess the value of verification and validation (V&V) of engineering models. We first present a literature review on the value of V&V and then use value chains and decision trees to show how value can be assessed from a decision maker's perspective. In this context, the value is what the decision maker is willing to pay for V&V analysis with the understanding that the V&V results are uncertain. Finally, the 2014 Sandia V&V Challenge Workshop is used to illustrate these ideas.

  17. Science verification results from PMAS

    NASA Astrophysics Data System (ADS)

    Roth, M. M.; Becker, T.; Böhm, P.; Kelz, A.

    2004-02-01

    PMAS, the Potsdam Multi-Aperture Spectrophotometer, is a new integral field instrument which was commissioned at the Calar Alto 3.5m Telescope in May 2001. We report on results obtained from a science verification run in October 2001. We present observations of the low-metallicity blue compact dwarf galaxy SBS0335-052, the ultra-luminous X-ray Source X-1 in the Holmberg II galaxy, the quadruple gravitational lens system Q2237+0305 (the ``Einstein Cross''), the Galactic planetary nebula NGC7027, and extragalactic planetary nebulae in M31. PMAS is now available as a common user instrument at Calar Alto Observatory.

  18. Structural dynamics verification facility study

    NASA Technical Reports Server (NTRS)

    Kiraly, L. J.; Hirchbein, M. S.; Mcaleese, J. M.; Fleming, D. P.

    1981-01-01

    The need for a structural dynamics verification facility to support structures programs was studied. Most of the industry operated facilities are used for highly focused research, component development, and problem solving, and are not used for the generic understanding of the coupled dynamic response of major engine subsystems. Capabilities for the proposed facility include: the ability to both excite and measure coupled structural dynamic response of elastic blades on elastic shafting, the mechanical simulation of various dynamical loadings representative of those seen in operating engines, and the measurement of engine dynamic deflections and interface forces caused by alternative engine mounting configurations and compliances.

  19. Radioxenon detections in the CTBT international monitoring system likely related to the announced nuclear test in North Korea on February 12, 2013.

    PubMed

    Ringbom, A; Axelsson, A; Aldener, M; Auer, M; Bowyer, T W; Fritioff, T; Hoffman, I; Khrustalev, K; Nikkinen, M; Popov, V; Popov, Y; Ungar, K; Wotawa, G

    2014-02-01

    Observations made in April 2013 of the radioxenon isotopes (133)Xe and (131m)Xe at measurement stations in Japan and Russia, belonging to the International Monitoring System for verification of the Comprehensive Nuclear-Test-Ban Treaty, are unique with respect to the measurement history of these stations. Comparison of measured data with calculated isotopic ratios as well as analysis using atmospheric transport modeling indicate that it is likely that the xenon measured was created in the underground nuclear test conducted by North Korea on February 12, 2013, and released 7-8 weeks later. More than one release is required to explain all observations. The (131m)Xe source terms for each release were calculated to be 0.7 TBq, corresponding to about 1-10% of the total xenon inventory for a 10 kt explosion, depending on fractionation and release scenario. The observed ratios could not be used to obtain any information regarding the fissile material that was used in the test. PMID:24316684

  20. GENERIC VERIFICATION PROTOCOL FOR THE VERIFICATION OF PESTICIDE SPRAY DRIFT REDUCTION TECHNOLOGIES FOR ROW AND FIELD CROPS

    EPA Science Inventory

    This ETV program generic verification protocol was prepared and reviewed for the Verification of Pesticide Drift Reduction Technologies project. The protocol provides a detailed methodology for conducting and reporting results from a verification test of pesticide drift reductio...

  1. Runtime Verification with State Estimation

    NASA Technical Reports Server (NTRS)

    Stoller, Scott D.; Bartocci, Ezio; Seyster, Justin; Grosu, Radu; Havelund, Klaus; Smolka, Scott A.; Zadok, Erez

    2011-01-01

    We introduce the concept of Runtime Verification with State Estimation and show how this concept can be applied to estimate the probability that a temporal property is satisfied by a run of a program when monitoring overhead is reduced by sampling. In such situations, there may be gaps in the observed program executions, thus making accurate estimation challenging. To deal with the effects of sampling on runtime verification, we view event sequences as observation sequences of a Hidden Markov Model (HMM), use an HMM model of the monitored program to "fill in" sampling-induced gaps in observation sequences, and extend the classic forward algorithm for HMM state estimation (which determines the probability of a state sequence, given an observation sequence) to compute the probability that the property is satisfied by an execution of the program. To validate our approach, we present a case study based on the mission software for a Mars rover. The results of our case study demonstrate high prediction accuracy for the probabilities computed by our algorithm. They also show that our technique is much more accurate than simply evaluating the temporal property on the given observation sequences, ignoring the gaps.
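
    For reference, a minimal sketch of the classic forward algorithm that the approach extends is given below; the matrices are illustrative assumptions, and the paper's handling of sampling-induced gaps and of the temporal property is not shown.

        import numpy as np

        def forward(pi, A, B, obs):
            """P(observation sequence) under an HMM.
            pi: initial state distribution, shape (n,)
            A:  transitions, A[i, j] = P(state j | state i), shape (n, n)
            B:  emissions,   B[i, k] = P(symbol k | state i), shape (n, m)
            """
            alpha = pi * B[:, obs[0]]             # joint prob. of state and first symbol
            for o in obs[1:]:
                alpha = (alpha @ A) * B[:, o]     # propagate one step, weight by emission
            return alpha.sum()

        pi = np.array([0.6, 0.4])
        A = np.array([[0.7, 0.3], [0.4, 0.6]])
        B = np.array([[0.9, 0.1], [0.2, 0.8]])
        print(forward(pi, A, B, [0, 1, 0]))       # probability of observing 0, 1, 0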

  2. RISKIND verification and benchmark comparisons

    SciTech Connect

    Biwer, B.M.; Arnish, J.J.; Chen, S.Y.; Kamboj, S.

    1997-08-01

    This report presents verification calculations and benchmark comparisons for RISKIND, a computer code designed to estimate potential radiological consequences and health risks to individuals and the population from exposures associated with the transportation of spent nuclear fuel and other radioactive materials. Spreadsheet calculations were performed to verify the proper operation of the major options and calculational steps in RISKIND. The program is unique in that it combines a variety of well-established models into a comprehensive treatment for assessing risks from the transportation of radioactive materials. Benchmark comparisons with other validated codes that incorporate similar models were also performed. For instance, the external gamma and neutron dose rate curves for a shipping package estimated by RISKIND were compared with those estimated by using the RADTRAN 4 code and NUREG-0170 methodology. Atmospheric dispersion of released material and dose estimates were also compared with results from the GENII and CAP88-PC codes. Verification results have shown the program to be performing its intended function correctly. The benchmark results indicate that the predictions made by RISKIND are within acceptable limits when compared with predictions from similar existing models.

  3. Cognitive Bias in Systems Verification

    NASA Technical Reports Server (NTRS)

    Larson, Steve

    2012-01-01

    Working definition of cognitive bias: Patterns by which information is sought and interpreted that can lead to systematic errors in decisions. Cognitive bias is used in diverse fields: Economics, Politics, Intelligence, Marketing, to name a few. Attempts to ground cognitive science in physical characteristics of the cognitive apparatus exceed our knowledge. Studies are based on correlations; strict cause and effect is difficult to pinpoint. Effects cited in the paper and discussed here have been replicated many times over, and appear sound. Many biases have been described, but it is still unclear whether they are all distinct. There may only be a handful of fundamental biases, which manifest in various ways. Bias can affect system verification in many ways: Overconfidence -> Questionable decisions to deploy. Availability -> Inability to conceive critical tests. Representativeness -> Overinterpretation of results. Positive Test Strategies -> Confirmation bias. Debiasing at the individual level is very difficult. The potential effect of bias on the verification process can be managed, but not eliminated. Worth considering at key points in the process.

  4. Video-Based Fingerprint Verification

    PubMed Central

    Qin, Wei; Yin, Yilong; Liu, Lili

    2013-01-01

    Conventional fingerprint verification systems use only static information. In this paper, fingerprint videos, which contain dynamic information, are utilized for verification. Fingerprint videos are acquired by the same capture device that acquires conventional fingerprint images, and the user experience of providing a fingerprint video is the same as that of providing a single impression. After preprocessing and aligning processes, “inside similarity” and “outside similarity” are defined and calculated to take advantage of both dynamic and static information contained in fingerprint videos. Match scores between two matching fingerprint videos are then calculated by combining the two kinds of similarity. Experimental results show that the proposed video-based method leads to a relative reduction of 60 percent in the equal error rate (EER) in comparison to the conventional single impression-based method. We also analyze the time complexity of our method when different combinations of strategies are used. Our method still outperforms the conventional method, even if both methods have the same time complexity. Finally, experimental results demonstrate that the proposed video-based method can lead to better accuracy than the multiple impressions fusion method, and the proposed method has a much lower false acceptance rate (FAR) when the false rejection rate (FRR) is quite low. PMID:24008283
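
    A conventional way to combine two similarity scores and estimate the equal error rate is sketched below; the fusion weight and the synthetic score distributions are assumptions of this sketch, not the paper's actual combination rule.

        import numpy as np

        def fuse(inside, outside, w=0.5):
            """Weighted-sum fusion of 'inside' and 'outside' similarity scores."""
            return w * inside + (1.0 - w) * outside

        def equal_error_rate(genuine, impostor):
            """EER: threshold where false rejection and false acceptance rates cross."""
            thresholds = np.sort(np.concatenate([genuine, impostor]))
            frr = np.array([(genuine < t).mean() for t in thresholds])
            far = np.array([(impostor >= t).mean() for t in thresholds])
            i = int(np.argmin(np.abs(frr - far)))
            return (frr[i] + far[i]) / 2.0

        rng = np.random.default_rng(0)
        genuine = fuse(rng.normal(0.8, 0.1, 500), rng.normal(0.7, 0.1, 500))
        impostor = fuse(rng.normal(0.4, 0.1, 500), rng.normal(0.3, 0.1, 500))
        print(equal_error_rate(genuine, impostor))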

  5. Application of Hadamard spectroscopy to automated structure verification in high-throughput NMR.

    PubMed

    Ruan, Ke; Yang, Shengtian; Van Sant, Karey A; Likos, John J

    2009-08-01

    Combined verification using 1-D proton and HSQC spectra has proved quite successful; the acquisition time of HSQC spectra, however, can be limiting in high-throughput applications. Replacing it with Hadamard HSQC can significantly enhance throughput. We hereby propose a protocol to optimize the grouping of the predicted carbon chemical shifts from the proposed structure and the associated Hadamard frequencies and bandwidths. The resulting Hadamard HSQC spectra compare favorably with their Fourier-transformed counterparts and have been demonstrated to perform equivalently in combined verification, but with a severalfold enhancement in throughput, as illustrated for 21 commercially available molecules and 16 prototypical drug compounds. Further improvement in verification accuracy can be achieved by cross-validation with Hadamard TOCSY, which can be acquired without much sacrifice in throughput. PMID:19496061

  6. The Ontogeny of the Verification System.

    ERIC Educational Resources Information Center

    Akiyama, M. Michael; Guillory, Andrea W.

    1983-01-01

    Young children found it difficult to verify negative statements, but found affirmative statements, affirmative questions, and negative questions equally easy to deal with. It is proposed that children acquire the answering system earlier than the verification system, and use answering to verify statements before acquiring the verification system.…

  7. The monitoring and verification of nuclear weapons

    SciTech Connect

    Garwin, Richard L.

    2014-05-09

    This paper partially reviews and updates the potential for monitoring and verification of nuclear weapons, including verification of their destruction. Cooperative monitoring with templates of the gamma-ray spectrum is an important tool, dependent on the use of information barriers.

  8. ENVIRONMENTAL TECHNOLOGY VERIFICATION AND INDOOR AIR

    EPA Science Inventory

    The paper discusses environmental technology verification and indoor air. RTI has responsibility for a pilot program for indoor air products as part of the U.S. EPA's Environmental Technology Verification (ETV) program. The program objective is to further the development of sel...

  9. 25 CFR 61.8 - Verification forms.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... INDIAN AFFAIRS, DEPARTMENT OF THE INTERIOR TRIBAL GOVERNMENT PREPARATION OF ROLLS OF INDIANS § 61.8... enrollment, a verification form, to be completed and returned, shall be mailed to each previous enrollee using the last address of record. The verification form will be used to ascertain the previous...

  10. ENVIRONMENTAL TECHNOLOGY VERIFICATION FOR INDOOR AIR PRODUCTS

    EPA Science Inventory

    The paper discusses environmental technology verification (ETV) for indoor air products. RTI is developing the framework for a verification testing program for indoor air products, as part of EPA's ETV program. RTI is establishing test protocols for products that fit into three...

  11. Gender Verification of Female Olympic Athletes.

    ERIC Educational Resources Information Center

    Dickinson, Barry D.; Genel, Myron; Robinowitz, Carolyn B.; Turner, Patricia L.; Woods, Gary L.

    2002-01-01

    Gender verification of female athletes has long been criticized by geneticists, endocrinologists, and others in the medical community. Recently, the International Olympic Committee's Athletic Commission called for discontinuation of mandatory laboratory-based gender verification of female athletes. This article discusses normal sexual…

  12. The monitoring and verification of nuclear weapons

    NASA Astrophysics Data System (ADS)

    Garwin, Richard L.

    2014-05-01

    This paper partially reviews and updates the potential for monitoring and verification of nuclear weapons, including verification of their destruction. Cooperative monitoring with templates of the gamma-ray spectrum is an important tool, dependent on the use of information barriers.

  13. 40 CFR 1066.220 - Linearity verification.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... 40 Protection of Environment 34 2012-07-01 2012-07-01 false Linearity verification. 1066.220 Section 1066.220 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) AIR POLLUTION CONTROLS VEHICLE-TESTING PROCEDURES Dynamometer Specifications § 1066.220 Linearity verification. (a) Scope and frequency. Perform linearity...

  14. IMPROVING AIR QUALITY THROUGH ENVIRONMENTAL TECHNOLOGY VERIFICATIONS

    EPA Science Inventory

    The U.S. Environmental Protection Agency (EPA) began the Environmental Technology Verification (ETV) Program in 1995 as a means of working with the private sector to establish a market-based verification process available to all environmental technologies. Under EPA's Office of R...

  15. HTGR analytical methods and design verification

    SciTech Connect

    Neylan, A.J.; Northup, T.E.

    1982-05-01

    Analytical methods for the high-temperature gas-cooled reactor (HTGR) include development, update, verification, documentation, and maintenance of all computer codes for HTGR design and analysis. This paper presents selected nuclear, structural mechanics, seismic, and systems analytical methods related to the HTGR core. This paper also reviews design verification tests in the reactor core, reactor internals, steam generator, and thermal barrier.

  16. Students' Verification Strategies for Combinatorial Problems

    ERIC Educational Resources Information Center

    Mashiach Eizenberg, Michal; Zaslavsky, Orit

    2004-01-01

    We focus on a major difficulty in solving combinatorial problems, namely, on the verification of a solution. Our study aimed at identifying undergraduate students' tendencies to verify their solutions, and the verification strategies that they employ when solving these problems. In addition, an attempt was made to evaluate the level of efficiency…

  17. Verification testing of advanced environmental monitoring systems

    SciTech Connect

    Kelly, T.J.; Riggs, K.B.; Fuerst, R.G.

    1999-03-01

    This paper describes the Advanced Monitoring Systems (AMS) pilot project, one of 12 pilots comprising the US EPA's Environmental Technology Verification (ETV) program. The aim of ETV is to promote the acceptance of environmental technologies in the marketplace, through objective third-party verification of technology performance.

  18. Hierarchical Design and Verification for VLSI

    NASA Technical Reports Server (NTRS)

    Shostak, R. E.; Elliott, W. D.; Levitt, K. N.

    1983-01-01

    The specification and verification work is described in detail, and some of the problems and issues to be resolved in their application to Very Large Scale Integration (VLSI) systems are examined. The hierarchical design methodologies enable a system architect or design team to decompose a complex design into a formal hierarchy of levels of abstraction. The first step in program verification is tree formation. The next step after tree formation is the generation from the trees of the verification conditions themselves. The approach taken here is similar in spirit to the corresponding step in program verification but requires modeling of the semantics of circuit elements rather than program statements. The last step is that of proving the verification conditions using a mechanical theorem-prover.

  19. New method of verifying optical flat flatness

    NASA Astrophysics Data System (ADS)

    Sun, Hao; Li, Xueyuan; Han, Sen; Zhu, Jianrong; Guo, Zhenglai; Fu, Yuegang

    2014-11-01

    Optical flats are commonly used in optical testing instruments, and flatness is their most important form-error parameter. As a measurement criterion, the optical flat flatness (OFF) index must be determined with good precision. Current measurement practice in China depends heavily on manual visual interpretation, characterizing flatness through discrete points. The efficiency and accuracy of this method cannot meet the demands of industrial development. In order to improve testing efficiency and measurement accuracy, it is necessary to develop an optical flat verification system that can acquire full-surface information rapidly and efficiently while complying with current national metrological verification procedures. This paper reviews the current optical flat verification method and addresses the problems of previous tests with a new method and its supporting software. Final results, compared against the JJG 28-2000 metrological verification procedure, show that the new system improves verification efficiency and accuracy.
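
    A full-surface flatness evaluation of the kind such a system performs can be sketched as the peak-to-valley residual after best-fit plane removal from a measured height map; the synthetic surface below is purely illustrative.

        import numpy as np

        def flatness_pv(z):
            """Peak-to-valley flatness of height map z after removing its best-fit plane."""
            ny, nx = z.shape
            yy, xx = np.mgrid[0:ny, 0:nx]
            A = np.column_stack([xx.ravel(), yy.ravel(), np.ones(z.size)])
            coeffs, *_ = np.linalg.lstsq(A, z.ravel(), rcond=None)  # least-squares plane
            residual = z.ravel() - A @ coeffs
            return residual.max() - residual.min()

        # Synthetic 64 x 64 height map (tilt plus a slight bow), in arbitrary units
        yy, xx = np.mgrid[0:64, 0:64] / 63.0
        z = 0.20 * xx + 0.05 * yy + 0.02 * (xx - 0.5) ** 2
        print(flatness_pv(z))   # the residual bow remains after the plane is removed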

  20. Space transportation system payload interface verification

    NASA Technical Reports Server (NTRS)

    Everline, R. T.

    1977-01-01

    The paper considers STS payload-interface verification requirements and the capability provided by STS to support verification. The intent is to standardize as many interfaces as possible, not only through the design, development, test and evaluation (DDT and E) phase of the major payload carriers but also into the operational phase. The verification process is discussed in terms of its various elements, such as the Space Shuttle DDT and E (including the orbital flight test program) and the major payload carriers DDT and E (including the first flights). Five tools derived from the Space Shuttle DDT and E are available to support the verification process: mathematical (structural and thermal) models, the Shuttle Avionics Integration Laboratory, the Shuttle Manipulator Development Facility, and interface-verification equipment (cargo-integration test equipment).

  1. MACCS2 development and verification efforts

    SciTech Connect

    Young, M.; Chanin, D.

    1997-03-01

    MACCS2 represents a major enhancement of the capabilities of its predecessor MACCS, the MELCOR Accident Consequence Code System. MACCS, released in 1987, was developed to estimate the potential impacts to the surrounding public of severe accidents at nuclear power plants. The principal phenomena considered in MACCS/MACCS2 are atmospheric transport and deposition under time-variant meteorology, short-term and long-term mitigative actions and exposure pathways, deterministic and stochastic health effects, and economic costs. MACCS2 was developed as a general-purpose analytical tool applicable to diverse reactor and nonreactor facilities. The MACCS2 package includes three primary enhancements: (1) a more flexible emergency response model, (2) an expanded library of radionuclides, and (3) a semidynamic food-chain model. In addition, errors that had been identified in MACCS version 1.5.11.1 were corrected, including an error that prevented the code from providing intermediate-phase results. The MACCS2 version 1.10 beta test was released to the beta-test group in May 1995. In addition, the University of New Mexico (UNM) has completed an independent verification study of the code package. Since the beta-test release of MACCS2 version 1.10, a number of minor errors have been identified and corrected, and a number of enhancements have been added to the code package. The code enhancements added since the beta-test release of version 1.10 include: (1) an option to allow the user to input the σy and σz plume expansion parameters in a table-lookup form for incremental downwind distances, (2) an option to define different initial dimensions for up to four segments of a release, (3) an enhancement to the COMIDA2 food-chain model preprocessor to allow the user to supply externally calculated tables of tritium food-chain dose per unit deposition on farmland to support analyses of tritium releases, and (4) the capability to calculate direction-dependent doses.
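
    The first of the listed code enhancements amounts to interpolating user-supplied plume expansion parameters at incremental downwind distances; the sketch below illustrates that kind of table lookup with made-up values and is not MACCS2 code.

        import numpy as np

        # Hypothetical user-supplied table: downwind distance vs. plume sigma-y
        distance_km = np.array([0.5, 1.0, 2.0, 5.0, 10.0])
        sigma_y_m = np.array([36.0, 68.0, 130.0, 290.0, 550.0])

        def sigma_y(x_km):
            """Linear table lookup of the lateral plume expansion parameter."""
            return np.interp(x_km, distance_km, sigma_y_m)

        print(sigma_y(3.0))   # interpolated sigma-y at 3 km downwind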

  2. Stennis Space Center Verification & Validation Capabilities

    NASA Technical Reports Server (NTRS)

    Pagnutti, Mary; Ryan, Robert E.; Holekamp, Kara; O'Neal, Duane; Knowlton, Kelly; Ross, Kenton; Blonski, Slawomir

    2007-01-01

    Scientists within NASA's Applied Research & Technology Project Office (formerly the Applied Sciences Directorate) have developed a well-characterized remote sensing Verification & Validation (V&V) site at the John C. Stennis Space Center (SSC). This site enables the in-flight characterization of satellite and airborne high spatial resolution remote sensing systems and their products. The smaller scale of the newer high resolution remote sensing systems allows scientists to characterize geometric, spatial, and radiometric data properties using a single V&V site. The targets and techniques used to characterize data from these newer systems can differ significantly from the techniques used to characterize data from the earlier, coarser spatial resolution systems. Scientists have used the SSC V&V site to characterize thermal infrared systems. Enhancements are being considered to characterize active lidar systems. SSC employs geodetic targets, edge targets, radiometric tarps, atmospheric monitoring equipment, and thermal calibration ponds to characterize remote sensing data products. Similar techniques are used to characterize moderate spatial resolution sensing systems at selected nearby locations. The SSC Instrument Validation Lab is a key component of the V&V capability and is used to calibrate field instrumentation and to provide National Institute of Standards and Technology traceability. This poster presents a description of the SSC characterization capabilities and examples of calibration data.

  3. Speaker verification using combined acoustic and EM sensor signal processing

    SciTech Connect

    Ng, L C; Gable, T J; Holzrichter, J F

    2000-11-10

    Low-power EM radar-like sensors have made it possible to measure properties of the human speech production system in real time, without acoustic interference. This greatly enhances the quality and quantity of information for many speech-related applications. See Holzrichter, Burnett, Ng, and Lea, J. Acoust. Soc. Am. 103(1), 622 (1998). By combining the Glottal-EM-Sensor (GEMS) with the acoustic signals, we have demonstrated an almost 10-fold reduction in error rates in a speaker verification system experiment under a moderately noisy environment (-10 dB).

  4. Performing Verification and Validation in Reuse-Based Software Engineering

    NASA Technical Reports Server (NTRS)

    Addy, Edward A.

    1999-01-01

    The implementation of reuse-based software engineering not only introduces new activities to the software development process, such as domain analysis and domain modeling, it also impacts other aspects of software engineering. Other areas of software engineering that are affected include Configuration Management, Testing, Quality Control, and Verification and Validation (V&V). Activities in each of these areas must be adapted to address the entire domain or product line rather than a specific application system. This paper discusses changes and enhancements to the V&V process, in order to adapt V&V to reuse-based software engineering.

  5. TEST DESIGN FOR ENVIRONMENTAL TECHNOLOGY VERIFICATION (ETV) OF ADD-ON NOX CONTROL UTILIZING OZONE INJECTION

    EPA Science Inventory

    The paper discusses the test design for environmental technology verification (ETV) of add-on nitrogen oxides (NOx) control utilizing ozone injection. (NOTE: ETV is an EPA-established program to enhance domestic and international market acceptance of new or improved commercially...

  6. Verification of NASA Emergent Systems

    NASA Technical Reports Server (NTRS)

    Rouff, Christopher; Vanderbilt, Amy K. C. S.; Truszkowski, Walt; Rash, James; Hinchey, Mike

    2004-01-01

    NASA is studying advanced technologies for a future robotic exploration mission to the asteroid belt. This mission, the prospective ANTS (Autonomous Nano Technology Swarm) mission, will comprise 1,000 autonomous robotic agents designed to cooperate in asteroid exploration. The emergent properties of swarm-type missions make them powerful, but at the same time more difficult to design and to assure that the proper behaviors will emerge. We are currently investigating formal methods and techniques for verification and validation of future swarm-based missions. The advantage of using formal methods is their ability to mathematically assure the behavior of a swarm, emergent or otherwise. The ANTS mission is being used as an example and case study of swarm-based missions with which to experiment and test current formal methods on intelligent swarms. Using the ANTS mission, we have evaluated multiple formal methods to determine their effectiveness in modeling and assuring swarm behavior.

  7. Verification of FANTASTIC integrated code

    NASA Technical Reports Server (NTRS)

    Chauhan, Rajinder Singh

    1987-01-01

    FANTASTIC is an acronym for Failure Analysis Nonlinear Thermal and Structural Integrated Code. This program was developed by Failure Analysis Associates, Palo Alto, Calif., for MSFC to improve the accuracy of solid rocket motor nozzle analysis. FANTASTIC has three modules: FACT - thermochemical analysis; FAHT - heat transfer analysis; and FAST - structural analysis. All modules have keywords for data input. Work is in progress for the verification of the FAHT module, which is done by using data for various problems with known solutions as inputs to the FAHT module. The information obtained is used to identify problem areas of the code and passed on to the developer for debugging purposes. Failure Analysis Associates have revised the first version of the FANTASTIC code and a new improved version has been released to the Thermal Systems Branch.

  8. Automated claim and payment verification.

    PubMed

    Segal, Mark J; Morris, Susan; Rubin, James M O

    2002-01-01

    Since the start of managed care, there has been steady deterioration in the ability of physicians, hospitals, payors, and patients to understand reimbursement and the contracts and payment policies that drive it. This lack of transparency has generated administrative costs, confusion, and mistrust. It is therefore essential that physicians, hospitals, and payors have rapid access to accurate information on contractual payment terms. This article summarizes problems with contract-based reimbursement and needed responses by medical practices. It describes an innovative, Internet-based claims and payment verification service, Phynance, which automatically verifies the accuracy of all claims and payments by payor, contract and line item. This service enables practices to know and apply the one, true, contractually obligated allowable. The article details implementation costs and processes and anticipated return on investment. The resulting transparency improves business processes throughout health care, increasing efficiency and lowering costs for physicians, hospitals, payors, employers--and patients. PMID:12122814

  9. Retail applications of signature verification

    NASA Astrophysics Data System (ADS)

    Zimmerman, Thomas G.; Russell, Gregory F.; Heilper, Andre; Smith, Barton A.; Hu, Jianying; Markman, Dmitry; Graham, Jon E.; Drews, Clemens

    2004-08-01

    The dramatic rise in identity theft, the ever-pressing need to provide convenience in checkout services to attract and retain loyal customers, and the growing use of multi-function signature capture devices in the retail sector provide favorable conditions for the deployment of dynamic signature verification (DSV) in retail settings. We report on the development of a DSV system to meet the needs of the retail sector. We currently have a database of approximately 10,000 signatures collected from 600 subjects and forgers. Previous work at IBM on DSV has been merged and extended to achieve robust performance on pen position data available from commercial point-of-sale hardware, achieving equal error rates on skilled forgeries and authentic signatures of 1.5% to 4%.

  10. Requirements, Verification, and Compliance (RVC) Database Tool

    NASA Technical Reports Server (NTRS)

    Rainwater, Neil E., II; McDuffee, Patrick B.; Thomas, L. Dale

    2001-01-01

    This paper describes the development, design, and implementation of the Requirements, Verification, and Compliance (RVC) database used on the International Space Welding Experiment (ISWE) project managed at Marshall Space Flight Center. The RVC is a systems engineer's tool for automating and managing the following information: requirements; requirements traceability; verification requirements; verification planning; verification success criteria; and compliance status. This information normally contained within documents (e.g. specifications, plans) is contained in an electronic database that allows the project team members to access, query, and status the requirements, verification, and compliance information from their individual desktop computers. Using commercial-off-the-shelf (COTS) database software that contains networking capabilities, the RVC was developed not only with cost savings in mind but primarily for the purpose of providing a more efficient and effective automated method of maintaining and distributing the systems engineering information. In addition, the RVC approach provides the systems engineer the capability to develop and tailor various reports containing the requirements, verification, and compliance information that meets the needs of the project team members. The automated approach of the RVC for capturing and distributing the information improves the productivity of the systems engineer by allowing that person to concentrate more on the job of developing good requirements and verification programs and not on the effort of being a "document developer".

  11. Technology Foresight and nuclear test verification: a structured and participatory approach

    NASA Astrophysics Data System (ADS)

    Noack, Patrick; Gaya-Piqué, Luis; Haralabus, Georgios; Auer, Matthias; Jain, Amit; Grenard, Patrick

    2013-04-01

    Maturity: the number of years until the technology in question reaches Development Stage 3 (i.e. prototype validated). 6. Integration effort: the anticipated level of effort required by the PTS to fully integrate the technology, process, concept or idea into its verification environment. 7. Time to impact: the number of years until the technology is fully developed and integrated into the PTS verification environment and delivers on its full potential. The resulting database is coupled to Pivot, a novel information management software tool which offers powerful visualisation of the taxonomy's parameters for each technology. Pivot offers many advantages over conventional spreadsheet-interfaced database tools: based on shared categories in the taxonomy, users can quickly and intuitively discover linkages, commonalities and various interpretations of prospective CTBT-pertinent technologies. It is easily possible to visualise a resulting sub-set of technologies that conform to specific user-selected attributes from the full range of taxonomy categories. In this presentation we will illustrate the range of future technologies, processes, concepts and ideas; we will demonstrate how the Pivot tool can be fruitfully applied to assist in strategic planning and development, and to identify gaps apparent on the technology development horizon. Finally, we will show how the Pivot tool, together with the taxonomy, offers real and emerging insights to make sense of large amounts of disparate technologies.

  12. Liquefied Natural Gas (LNG) dispenser verification device

    NASA Astrophysics Data System (ADS)

    Xiong, Maotao; Yang, Jie-bin; Zhao, Pu-jun; Yu, Bo; Deng, Wan-quan

    2013-01-01

    The working principle and calibration status of LNG (Liquefied Natural Gas) dispensers in China are introduced. To address the shortcomings of the weighing method for calibrating LNG dispensers, an LNG dispenser verification device has been developed. The device uses the master meter method to verify LNG dispensers in the field. Experimental results indicate that the device has stable performance, a high accuracy level, and a flexible construction, reaching an internationally advanced level. The LNG dispenser verification device is expected to promote the development of the LNG dispenser industry in China and to improve the technical level of LNG dispenser manufacture.

  13. Design for Verification: Enabling Verification of High Dependability Software-Intensive Systems

    NASA Technical Reports Server (NTRS)

    Mehlitz, Peter C.; Penix, John; Markosian, Lawrence Z.; Koga, Dennis (Technical Monitor)

    2003-01-01

    Strategies to achieve confidence that high-dependability applications are correctly implemented include testing and automated verification. Testing deals mainly with a limited number of expected execution paths. Verification usually attempts to deal with a larger number of possible execution paths. While the impact of architecture design on testing is well known, its impact on most verification methods is not as well understood. The Design for Verification approach considers verification from the application development perspective, in which system architecture is designed explicitly according to the application's key properties. The D4V-hypothesis is that the same general architecture and design principles that lead to good modularity, extensibility and complexity/functionality ratio can be adapted to overcome some of the constraints on verification tools, such as the production of hand-crafted models and the limits on dynamic and static analysis caused by state space explosion.

  14. ETV - ENVIRONMENTAL TECHNOLOGY VERIFICATION (ETV) - RISK MANAGEMENT

    EPA Science Inventory

    In October 1995, the Environmental Technology Verification (ETV) Program was established by EPA. The goal of ETV is to provide credible performance data for commercial-ready environmental technologies to speed their implementation for the benefit of vendors, purchasers, permitter...

  15. The PASCAL-HDM Verification System

    NASA Technical Reports Server (NTRS)

    1983-01-01

    The PASCAL-HDM verification system is described. This system supports the mechanical generation of verification conditions from PASCAL programs and HDM-SPECIAL specifications using the Floyd-Hoare axiomatic method. Tools are provided to parse programs and specifications, check their static semantics, generate verification conditions from Hoare rules, and translate the verification conditions appropriately for proof using the Shostak Theorem Prover. The differences between standard PASCAL and the language handled by this system are explained. These consist mostly of restrictions to the standard language definition, the only extensions or modifications being the addition of specifications to the code and the requirement that references to a function of no arguments carry empty parentheses.
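
    The Floyd-Hoare step being mechanized can be illustrated with weakest preconditions for assignment, wp(var := e, Q) = Q[e/var]; the sketch below uses sympy for the substitution, with an illustrative program and postcondition (this is not the PASCAL-HDM implementation).

        import sympy as sp

        x, y = sp.symbols("x y")

        def wp_assign(post, var, expr):
            """wp(var := expr, post) = post with expr substituted for var."""
            return post.subs(var, expr)

        # Program: x := x + 1; y := 2*x   with postcondition y > 4.
        # Sequencing composes right to left: wp(S1; S2, Q) = wp(S1, wp(S2, Q)).
        post = sp.Gt(y, 4)
        vc = wp_assign(wp_assign(post, y, 2 * x), x, x + 1)
        print(vc)   # 2*x + 2 > 4, i.e. the precondition x > 1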

  16. Engineering drawing field verification program. Revision 3

    SciTech Connect

    Ulk, P.F.

    1994-10-12

    Safe, efficient operation of waste tank farm facilities is dependent in part upon the availability of accurate, up-to-date plant drawings. Accurate plant drawings are also required in support of facility upgrades and future engineering remediation projects. This supporting document establishes the procedure for performing a visual field verification of engineering drawings, the degree of visual observation to be performed, and the documentation of results. A copy of the drawing attesting to the degree of visual observation will be paginated into the released Engineering Change Notice (ECN) documenting the field verification for future retrieval and reference. All waste tank farm essential and support drawings within the scope of this program will be converted from manual to computer-aided drafting (CAD) drawings. A permanent reference to the field verification status will be placed along the right border of the CAD-converted drawing, referencing the revision level at which the visual verification was performed and documented.

  17. VERIFICATION OF GLOBAL CLIMATE CHANGE MITIGATION TECHNOLOGIES

    EPA Science Inventory

    This is a continuation of independent performance evaluations of environmental technologies under EPA's Environmental Technology Verification Program. Emissions of some greenhouse gases, most notably methane. can be controlled profitably now, even in the absence of regulations. ...

  18. MAMA Software Features: Quantification Verification Documentation-1

    SciTech Connect

    Ruggiero, Christy E.; Porter, Reid B.

    2014-05-21

    This document reviews the verification of the basic shape quantification attributes in the MAMA software against hand calculations in order to show that the calculations are implemented mathematically correctly and give the expected quantification results.

  19. U.S. Environmental Technology Verification Program

    EPA Science Inventory

    Overview of the U.S. Environmental Technology Verification Program (ETV), the ETV Greenhouse Gas Technology Center, and energy-related ETV projects. Presented at the Department of Energy's National Renewable Laboratory in Boulder, Colorado on June 23, 2008.

  20. HDM/PASCAL Verification System User's Manual

    NASA Technical Reports Server (NTRS)

    Hare, D.

    1983-01-01

    The HDM/Pascal verification system is a tool for proving the correctness of programs written in PASCAL and specified in the Hierarchical Development Methodology (HDM). This document assumes an understanding of PASCAL, HDM, program verification, and the STP system. The steps toward verification which this tool provides are parsing programs and specifications, checking the static semantics, and generating verification conditions. Some support functions are provided such as maintaining a data base, status management, and editing. The system runs under the TOPS-20 and TENEX operating systems and is written in INTERLISP. However, no knowledge is assumed of these operating systems or of INTERLISP. The system requires three executable files, HDMVCG, PARSE, and STP. Optionally, the editor EMACS should be on the system in order for the editor to work. The file HDMVCG is invoked to run the system. The files PARSE and STP are used as lower forks to perform the functions of parsing and proving.

  1. Calibration and verification of environmental models

    NASA Technical Reports Server (NTRS)

    Lee, S. S.; Sengupta, S.; Weinberg, N.; Hiser, H.

    1976-01-01

    The problems of calibration and verification of mesoscale models used for investigating power plant discharges are considered. The value of remote sensors for data acquisition is discussed as well as an investigation of Biscayne Bay in southern Florida.

  2. Verification timer for AECL 780 Cobalt unit.

    PubMed

    Smathers, J B; Holly, F E

    1984-05-01

    To obtain verification of the proper time setting of the motorized run-down timer for an AECL 780 cobalt unit, a digital timer is described which can be added to the system for under $300. PMID:6735762

  3. Electronic Verification at the Kennedy Space Center

    NASA Technical Reports Server (NTRS)

    Johnson, T. W.

    1995-01-01

    This document reviews some current applications of Electronic Verification and the benefits such applications are providing the Kennedy Space Center (KSC). It also previews some new technologies, including statistics regarding performance and possible utilization of the technology.

  4. THE EPA'S ENVIRONMENTAL TECHNOLOGY VERIFICATION (ETV) PROGRAM

    EPA Science Inventory

    The Environmental Protection Agency (EPA) instituted the Environmental Technology Verification Program--or ETV--to verify the performance of innovative technical solutions to problems that threaten human health or the environment. ETV was created to substantially accelerate the e...

  5. ENVIRONMENTAL TECHNOLOGY VERIFICATION (ETV) PROGRAM: FUEL CELLS

    EPA Science Inventory

    The U.S. Environmental Protection Agency (EPA) Environmental Technology Verification (ETV) Program evaluates the performance of innovative air, water, pollution prevention and monitoring technologies that have the potential to improve human health and the environment. This techno...

  6. ENVIRONMENTAL TECHNOLOGY VERIFICATION (ETV) PROGRAM: STORMWATER TECHNOLOGIES

    EPA Science Inventory

    The U.S. Environmental Protection Agency (EPA) Environmental Technology Verification (ETV) program evaluates the performance of innovative air, water, pollution prevention and monitoring technologies that have the potential to improve human health and the environment. This techn...

  7. Handbook: Design of automated redundancy verification

    NASA Technical Reports Server (NTRS)

    Ford, F. A.; Hasslinger, T. W.; Moreno, F. J.

    1971-01-01

    The use of the handbook is discussed and the design progress is reviewed. A description of the problem is presented, and examples are given to illustrate the necessity for redundancy verification, along with the types of situations to which it is typically applied. Reusable space vehicles, such as the space shuttle, are recognized as being significant in the development of the automated redundancy verification problem.

  8. The NPARC Alliance Verification and Validation Archive

    NASA Technical Reports Server (NTRS)

    Slater, John W.; Dudek, Julianne C.; Tatum, Kenneth E.

    2000-01-01

    The NPARC Alliance (National Project for Applications oriented Research in CFD) maintains a publicly-available, web-based verification and validation archive as part of the development and support of the WIND CFD code. The verification and validation methods used for the cases attempt to follow the policies and guidelines of the ASME and AIAA. The emphasis is on air-breathing propulsion flow fields with Mach numbers ranging from low-subsonic to hypersonic.

  9. A verification library for multibody simulation software

    NASA Technical Reports Server (NTRS)

    Kim, Sung-Soo; Haug, Edward J.; Frisch, Harold P.

    1989-01-01

    A multibody dynamics verification library that maintains and manages test and validation data is proposed, based on RRC robot arm and CASE backhoe validation and a comparative study of DADS, DISCOS, and CONTOPS, which are existing public-domain and commercial multibody dynamics simulation programs. Using simple representative problems, simulation results from each program are cross-checked, and the validation results are presented. Functionalities of the verification library are defined in order to automate the validation procedure.

  10. Transmutation Fuel Performance Code Thermal Model Verification

    SciTech Connect

    Gregory K. Miller; Pavel G. Medvedev

    2007-09-01

    The FRAPCON fuel performance code is being modified to model the performance of nuclear fuels of interest to the Global Nuclear Energy Partnership (GNEP). The present report documents the effort to verify the FRAPCON thermal model. It was found that, with minor modifications, the FRAPCON thermal model's temperature calculation agrees with that of the commercial software ABAQUS (version 6.4-4). This report outlines the methodology of the verification, the code input, and the calculation results.

  11. Verification and Validation in Computational Fluid Dynamics

    SciTech Connect

    OBERKAMPF, WILLIAM L.; TRUCANO, TIMOTHY G.

    2002-03-01

    Verification and validation (V and V) are the primary means to assess accuracy and reliability in computational simulations. This paper presents an extensive review of the literature in V and V in computational fluid dynamics (CFD), discusses methods and procedures for assessing V and V, and develops a number of extensions to existing ideas. The review of the development of V and V terminology and methodology points out the contributions from members of the operations research, statistics, and CFD communities. Fundamental issues in V and V are addressed, such as code verification versus solution verification, model validation versus solution validation, the distinction between error and uncertainty, conceptual sources of error and uncertainty, and the relationship between validation and prediction. The fundamental strategy of verification is the identification and quantification of errors in the computational model and its solution. In verification activities, the accuracy of a computational solution is primarily measured relative to two types of highly accurate solutions: analytical solutions and highly accurate numerical solutions. Methods for determining the accuracy of numerical solutions are presented and the importance of software testing during verification activities is emphasized.
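
    One standard code-verification calculation in this setting is the observed order of accuracy from solutions on systematically refined grids; the sketch below applies the common constant-refinement-ratio formula to illustrative values.

        import math

        def observed_order(f_coarse, f_medium, f_fine, r=2.0):
            """Observed order p from three grid levels with constant refinement ratio r:
            p = ln(|f_coarse - f_medium| / |f_medium - f_fine|) / ln(r)."""
            return math.log(abs(f_coarse - f_medium) / abs(f_medium - f_fine)) / math.log(r)

        # Illustrative functional values on coarse, medium, and fine grids
        print(observed_order(1.110, 1.026, 1.006))   # close to 2 for a second-order scheme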

  12. Development of advanced seal verification

    NASA Technical Reports Server (NTRS)

    Workman, Gary L.; Kosten, Susan E.; Abushagur, Mustafa A.

    1992-01-01

    The purpose of this research is to develop a technique to monitor and ensure seal integrity with a sensor that has no active elements to burn out during a long-duration activity, such as a leakage test or especially during a mission in space. The original concept proposed is that by implementing fiber optic sensors, changes in the integrity of a seal can be monitored in real time, and at no time should the optical fiber sensor fail. The electrical components which provide optical excitation and detection through the fiber are not part of the seal; hence, if these electrical components fail, they can be easily changed without breaking the seal. The optical connections required for the concept to work do present a functional problem to work out. The utility of the optical fiber sensor for seal monitoring should be general enough that the degradation of a seal can be determined before catastrophic failure occurs and appropriate action taken. Two parallel efforts were performed in determining the feasibility of using optical fiber sensors for seal verification. In one study, interferometric measurement of the mechanical response of the optical fiber sensors to seal integrity was studied. In a second study, the optical fiber was implemented in a typical vacuum chamber and feasibility studies on microbend experiments in the chamber were performed. Also, an attempt was made to quantify the amount of pressure actually being applied to the optical fiber, using finite element analysis software by Algor.

  13. Learning Assumptions for Compositional Verification

    NASA Technical Reports Server (NTRS)

    Cobleigh, Jamieson M.; Giannakopoulou, Dimitra; Pasareanu, Corina; Clancy, Daniel (Technical Monitor)

    2002-01-01

    Compositional verification is a promising approach to addressing the state explosion problem associated with model checking. One compositional technique advocates proving properties of a system by checking properties of its components in an assume-guarantee style. However, the application of this technique is difficult because it involves non-trivial human input. This paper presents a novel framework for performing assume-guarantee reasoning in an incremental and fully automated fashion. To check a component against a property, our approach generates assumptions that the environment needs to satisfy for the property to hold. These assumptions are then discharged on the rest of the system. Assumptions are computed by a learning algorithm. They are initially approximate, but become gradually more precise by means of counterexamples obtained by model checking the component and its environment, alternately. This iterative process may at any stage conclude that the property is either true or false in the system. We have implemented our approach in the LTSA tool and applied it to the analysis of a NASA system.
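
    The control structure of the iterative assume-guarantee loop can be sketched as follows; every callable below is a hypothetical placeholder for the model checker and the learning algorithm (not the LTSA API), so the skeleton only shows the rule "if <A> M1 <P> and <true> M2 <A> then <true> M1 || M2 <P>" being discharged incrementally.

        def assume_guarantee(M1, M2, P, learner, check, cex_is_real):
            """Skeleton of incremental assume-guarantee reasoning.
            check(M, A, P) -> counterexample or None; learner conjectures assumptions."""
            A = learner.initial_conjecture()
            while True:
                cex = check(M1, A, P)              # premise 1: <A> M1 <P>
                if cex is not None:
                    if cex_is_real(M2, cex):       # counterexample replays in M2
                        return False, cex          # genuine violation of P
                    learner.refine(cex)            # spurious: strengthen A
                    A = learner.next_conjecture()
                    continue
                cex = check(M2, None, A)           # premise 2: <true> M2 <A> (None = true)
                if cex is None:
                    return True, A                 # both premises hold: P is proved
                learner.refine(cex)                # adjust A from environment behavior
                A = learner.next_conjecture()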

  14. Cold Flow Verification Test Facility

    SciTech Connect

    Shamsi, A.; Shadle, L.J.

    1996-12-31

    The cold flow verification test facility consists of a 15-foot high, 3-foot diameter, domed vessel made of clear acrylic in two flanged sections. The unit can operate at pressures up to 14 psig. The internals include a 10-foot high jetting fluidized bed, a cylindrical baffle that hangs from the dome, and a rotating grate for control of continuous solids removal. The fluid bed is continuously fed solids (20 to 150 lb/hr) through a central nozzle made up of concentric pipes. It can be configured as either a half or a full cylinder of various dimensions. The fluid bed has flow loops for separate air flow control for conveying solids (inner jet, 500 to 100000 scfh), make-up into the jet (outer jet, 500 to 8000 scfh), spargers in the solids removal annulus (100 to 2000 scfh), and 6 air jets (20 to 200 scfh) on the sloping conical grid. Additional air (500 to 10000 scfh) can be added to the top of the dome and under the rotating grate. The outer vessel, the hanging cylindrical baffle or skirt, and the rotating grate can be used to study issues concerning moving bed reactors. There is ample allowance for access and instrumentation in the outer shell. Furthermore, this facility is available for future Cooperative Research and Development Agreements (CRADAs) to study issues and problems associated with fluid- and fixed-bed reactors. The design allows testing of different dimensions and geometries.

  15. Verification of excess defense material

    SciTech Connect

    Fearey, B.L.; Pilat, J.F.; Eccleston, G.W.; Nicholas, N.J.; Tape, J.W.

    1997-12-01

    The international community in the post-Cold War period has expressed an interest in the International Atomic Energy Agency (IAEA) using its expertise in support of the arms control and disarmament process in unprecedented ways. The pledges of the US and Russian presidents to place excess defense materials under some type of international inspections raises the prospect of using IAEA safeguards approaches for monitoring excess materials, which include both classified and unclassified materials. Although the IAEA has suggested the need to address inspections of both types of materials, the most troublesome and potentially difficult problems involve approaches to the inspection of classified materials. The key issue for placing classified nuclear components and materials under IAEA safeguards is the conflict between these traditional IAEA materials accounting procedures and the US classification laws and nonproliferation policy designed to prevent the disclosure of critical weapon-design information. Possible verification approaches to classified excess defense materials could be based on item accountancy, attributes measurements, and containment and surveillance. Such approaches are not wholly new; in fact, they are quite well established for certain unclassified materials. Such concepts may be applicable to classified items, but the precise approaches have yet to be identified, fully tested, or evaluated for technical and political feasibility, or for their possible acceptability in an international inspection regime. Substantial work remains in these areas. This paper examines many of the challenges presented by international inspections of classified materials.

  16. National Verification System of National Meteorological Center, China

    NASA Astrophysics Data System (ADS)

    Zhang, Jinyan; Wei, Qing; Qi, Dan

    2016-04-01

    The Product Quality Verification Division for official weather forecasting of China was founded in April 2011. It is affiliated with the Forecast System Laboratory (FSL), National Meteorological Center (NMC), China. There are three employees in this department; I am one of them, and I am in charge of the Product Quality Verification Division in NMC, China. After five years of construction, an integrated real-time National Verification System of NMC, China has been established. At present, its primary roles are: 1) to verify the official weather forecasting quality of NMC, China; 2) to verify the official city weather forecasting quality of the Provincial Meteorological Bureaus; and 3) to evaluate the forecasting quality of each forecaster in NMC, China. To verify the official weather forecasting quality of NMC, China, we have developed:
    • Grid QPF verification module (including upscaling)
    • Grid temperature, humidity and wind forecast verification module
    • Severe convective weather forecast verification module
    • Typhoon forecast verification module
    • Disaster forecast verification module
    • Disaster warning verification module
    • Medium and extended period forecast verification module
    • Objective elements forecast verification module
    • Ensemble precipitation probabilistic forecast verification module
    To verify the official city weather forecasting quality of the Provincial Meteorological Bureaus, we have developed:
    • City elements forecast verification module
    • Public heavy rain forecast verification module
    • City air quality forecast verification module
    To evaluate the forecasting quality of each forecaster in NMC, China, we have developed:
    • Off-duty forecaster QPF practice evaluation module
    • QPF evaluation module for forecasters
    • Severe convective weather forecast evaluation module
    • Typhoon track forecast evaluation module for forecasters
    • Disaster warning evaluation module for forecasters
    • Medium and extended period forecast evaluation module
    The further...
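
    A representative computation inside such QPF verification modules is a categorical skill score from a 2x2 contingency table; the sketch below computes the equitable threat score with illustrative counts (the system's actual score suite is not specified above).

        def equitable_threat_score(hits, misses, false_alarms, correct_negatives):
            """ETS = (hits - hits_random) / (hits + misses + false_alarms - hits_random)."""
            n = hits + misses + false_alarms + correct_negatives
            hits_random = (hits + misses) * (hits + false_alarms) / n
            return (hits - hits_random) / (hits + misses + false_alarms - hits_random)

        # Illustrative 24 h QPF contingency counts at a rain/no-rain threshold
        print(round(equitable_threat_score(42, 18, 25, 915), 3))   # about 0.469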

  17. 40 CFR 1065.550 - Gas analyzer range verification and drift verification.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... 40 Protection of Environment 33 2014-07-01 2014-07-01 false Gas analyzer range verification and drift verification. 1065.550 Section 1065.550 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) AIR POLLUTION CONTROLS ENGINE-TESTING PROCEDURES Performing an Emission Test Over Specified Duty Cycles § 1065.550 Gas analyzer...

  18. Monitoring and verification R&D

    SciTech Connect

    Pilat, Joseph F; Budlong - Sylvester, Kory W; Fearey, Bryan L

    2011-01-01

    The 2010 Nuclear Posture Review (NPR) report outlined the Administration's approach to promoting the agenda put forward by President Obama in Prague on April 5, 2009. The NPR calls for a national monitoring and verification R&D program to meet future challenges arising from the Administration's nonproliferation, arms control and disarmament agenda. Verification of a follow-on to New START could have to address warheads and possibly components along with delivery capabilities. Deeper cuts and disarmament would need to address all of these elements along with nuclear weapon testing, nuclear material and weapon production facilities, virtual capabilities from old weapon and existing energy programs and undeclared capabilities. We only know how to address some elements of these challenges today, and the requirements may be more rigorous in the context of deeper cuts as well as disarmament. Moreover, there is a critical need for multiple options to sensitive problems and to address other challenges. There will be other verification challenges in a world of deeper cuts and disarmament, some of which we are already facing. At some point, if the reductions process is progressing, uncertainties about past nuclear materials and weapons production will have to be addressed. IAEA safeguards will need to continue to evolve to meet current and future challenges, and to take advantage of new technologies and approaches. Transparency/verification of nuclear and dual-use exports will also have to be addressed, and there will be a need to make nonproliferation measures more watertight and transparent. In this context, and recognizing we will face all of these challenges even if disarmament is not achieved, this paper will explore possible agreements and arrangements; verification challenges; gaps in monitoring and verification technologies and approaches; and the R&D required to address these gaps and other monitoring and verification challenges.

  19. Verification of Functional Fault Models and the Use of Resource Efficient Verification Tools

    NASA Technical Reports Server (NTRS)

    Bis, Rachael; Maul, William A.

    2015-01-01

    Functional fault models (FFMs) are a directed-graph representation of the failure-effect propagation paths within a system's physical architecture and are used to support development and real-time diagnostics of complex systems. Verification of these models is required to confirm that the FFMs are correctly built and accurately represent the underlying physical system. However, a manual, comprehensive verification process applied to the FFMs proved error prone, because each individual component model demands an intensive, customized review, and it consumed a burdensome level of resources. To address this problem, automated verification tools have been developed and utilized to mitigate these key pitfalls. This paper discusses the verification of the FFMs and presents the tools that were developed to make the verification process more efficient and effective.
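    As a sketch of one check such automated tools can perform, the code below treats an FFM as a directed graph and verifies that a fault's effects reach the alarm the physical system documentation says they must. The node names and the expected effect are invented for illustration.

```python
from collections import deque

# Hypothetical failure-effect graph: an edge points from a failure mode
# or effect to the downstream effects it propagates to.
ffm = {
    "valve_stuck_closed": ["low_fuel_flow"],
    "low_fuel_flow": ["low_chamber_pressure"],
    "low_chamber_pressure": ["thrust_loss_alarm"],
    "sensor_bias": ["thrust_loss_alarm"],
}

def reachable_effects(graph, fault):
    """All effects reachable from a fault, via breadth-first search."""
    seen, queue = set(), deque([fault])
    while queue:
        for nxt in graph.get(queue.popleft(), []):
            if nxt not in seen:
                seen.add(nxt)
                queue.append(nxt)
    return seen

# One automated verification check: the model must propagate this known
# fault to the documented alarm.
assert "thrust_loss_alarm" in reachable_effects(ffm, "valve_stuck_closed")
```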

  20. Ozone Monitoring Instrument geolocation verification

    NASA Astrophysics Data System (ADS)

    Kroon, M.; Dobber, M. R.; Dirksen, R.; Veefkind, J. P.; van den Oord, G. H. J.; Levelt, P. F.

    2008-08-01

    Verification of the geolocation assigned to individual ground pixels as measured by the Ozone Monitoring Instrument (OMI) aboard the NASA EOS-Aura satellite was performed by comparing geophysical Earth surface details as observed in OMI false color images with the high-resolution continental outline vector map provided by the Interactive Data Language (IDL) software tool from ITT Visual Information Solutions. The OMI false color images are generated from the OMI visible channel by integration over 20-nm-wide spectral bands of the Earth radiance intensity around 484 nm, 420 nm, and 360 nm wavelength per ground pixel. Proportional to the integrated intensity, we assign color values composed of CRT standard red, green, and blue to the OMI ground pixels. Earth surface details studied are mostly high-contrast coast lines where arid land or desert meets deep blue ocean. The IDL high-resolution vector map is based on the 1993 CIA World Database II Map with a 1-km accuracy. Our results indicate that the average OMI geolocation offset over the years 2005-2006 is 0.79 km in latitude and 0.29 km in longitude, with a standard deviation of 1.64 km in latitude and 2.04 km in longitude, respectively. Relative to the OMI nadir pixel size, one obtains mean displacements of ˜6.1% in latitude and ˜1.2% in longitude, with standard deviations of 12.6% and 7.9%, respectively. We conclude that the geolocation assigned to individual OMI ground pixels is sufficiently accurate to support scientific studies of atmospheric features as observed in OMI level 2 satellite data products, such as air quality issues on urban scales or volcanic eruptions and their plumes, which occur on spatial scales comparable to or smaller than OMI nadir pixels.
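    A minimal sketch of the offset statistics quoted above: given per-scene displacements derived from coastline matching, compute the mean and standard deviation in kilometers and express the means relative to an assumed 13 km x 24 km nadir footprint (consistent with the percentages in the abstract, but an assumption in this sketch).

```python
import numpy as np

def offset_stats(lat_km, lon_km, pixel_lat_km=13.0, pixel_lon_km=24.0):
    """Mean/std of geolocation offsets (km) and means as % of pixel size."""
    lat, lon = np.asarray(lat_km, float), np.asarray(lon_km, float)
    return {
        "mean_lat_km": lat.mean(), "std_lat_km": lat.std(ddof=1),
        "mean_lon_km": lon.mean(), "std_lon_km": lon.std(ddof=1),
        "mean_lat_pct": 100.0 * lat.mean() / pixel_lat_km,
        "mean_lon_pct": 100.0 * lon.mean() / pixel_lon_km,
    }
```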

  1. Fuel Retrieval System (FRS) Design Verification

    SciTech Connect

    YANOCHKO, R.M.

    2000-01-27

    This document was prepared as part of an independent review to explain design verification activities already completed, and to define the remaining design verification actions for the Fuel Retrieval System. The Fuel Retrieval Subproject was established as part of the Spent Nuclear Fuel Project (SNF Project) to retrieve and repackage the SNF located in the K Basins. The Fuel Retrieval System (FRS) construction work is complete in the KW Basin, and start-up testing is underway. Design modifications and construction planning are also underway for the KE Basin. An independent review of the design verification process as applied to the K Basin projects was initiated in support of preparation for the SNF Project operational readiness review (ORR).

  2. PBL Verification with Radiosonde and Aircraft Data

    NASA Astrophysics Data System (ADS)

    Tsidulko, M.; McQueen, J.; Dimego, G.; Ek, M.

    2008-12-01

    Boundary layer depth is an important characteristic in weather forecasting, and it is a key parameter in air quality modeling, determining the extent of turbulence and dispersion for pollutants. Real-time PBL depths from the NAM (WRF/NMM) model are verified against different types of observations. PBL depth verification is incorporated into the NCEP verification system, including an ability to provide a range of statistical characteristics for the boundary layer heights. For the model, several types of boundary layer definitions are used: the PBL height from the TKE scheme and from the critical Ri number approach, as well as the mixed layer depth, are compared with observations. Observed PBL depths are determined by applying the Ri number approach to radiosonde profiles. A preliminary study of using ACARS data for PBL verification is also presented.
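    The critical-Ri approach named above can be sketched directly: scan a sounding upward from the surface and take the first level where the bulk Richardson number exceeds a critical value as the PBL top. The 0.25 threshold and the variable names are common choices assumed here, not necessarily the NCEP implementation.

```python
import numpy as np

def pbl_height_bulk_ri(z_m, theta_v_K, u_ms, v_ms, ri_crit=0.25):
    """PBL depth diagnosed as the lowest level where the bulk Richardson
    number, computed between the surface and that level, exceeds ri_crit."""
    g = 9.81
    for k in range(1, len(z_m)):
        shear_sq = (u_ms[k] - u_ms[0])**2 + (v_ms[k] - v_ms[0])**2
        if shear_sq == 0.0:
            continue  # avoid dividing by zero shear
        ri = (g * (theta_v_K[k] - theta_v_K[0]) * (z_m[k] - z_m[0])
              / (theta_v_K[0] * shear_sq))
        if ri > ri_crit:
            return z_m[k]
    return np.nan  # no level exceeded the critical value
```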

  3. Critical Surface Cleaning and Verification Alternatives

    NASA Technical Reports Server (NTRS)

    Melton, Donald M.; McCool, A. (Technical Monitor)

    2000-01-01

    As a result of federal and state requirements, historical critical cleaning and verification solvents such as Freon 113, Freon TMC, and trichloroethylene (TCE) are either highly regulated or no longer available. Interim replacements such as HCFC 225 have been qualified; however, toxicity and future phase-out regulations necessitate long-term solutions. The scope of this project was to qualify a safe and environmentally compliant LOX surface verification alternative to Freon 113, TCE and HCFC 225. The main effort was focused on initiating the evaluation and qualification of HCFC 225G as an alternate LOX verification solvent. The project was scoped in FY 99/00 to perform LOX compatibility, cleaning efficiency and qualification on flight hardware.

  4. Land Ice Verification and Validation Kit

    SciTech Connect

    2015-07-15

    To address a pressing need to better understand the behavior and complex interaction of ice sheets within the global Earth system, significant development of continental-scale, dynamical ice-sheet models is underway. The associated verification and validation process of these models is being coordinated through a new, robust, python-based extensible software package, the Land Ice Verification and Validation toolkit (LIVV). This release provides robust and automated verification and a performance evaluation on LCF platforms. The performance V&V involves a comprehensive comparison of model performance relative to expected behavior on a given computing platform. LIVV operates on a set of benchmark and test data, and provides comparisons for a suite of community prioritized tests, including configuration and parameter variations, bit-for-bit evaluation, and plots of tests where differences occur.
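    A minimal sketch of the bit-for-bit evaluation idea (not LIVV's actual API): compare a test output array against a benchmark and, when they are not identical, report how many cells differ and by how much.

```python
import numpy as np

def bit4bit(test, benchmark):
    """True when two model output arrays match bit-for-bit."""
    test, benchmark = np.asarray(test), np.asarray(benchmark)
    if test.shape != benchmark.shape:
        print(f"shape mismatch: {test.shape} vs {benchmark.shape}")
        return False
    if np.array_equal(test, benchmark):
        return True
    diff = test.astype(float) - benchmark.astype(float)
    print(f"{np.count_nonzero(diff)} differing cells, "
          f"max |diff| = {np.abs(diff).max():.3e}")
    return False
```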

  5. Packaged low-level waste verification system

    SciTech Connect

    Tuite, K.T.; Winberg, M.; Flores, A.Y.; Killian, E.W.; McIsaac, C.V.

    1996-08-01

    Currently, states and low-level radioactive waste (LLW) disposal site operators have no method of independently verifying the radionuclide content of packaged LLW that arrives at disposal sites for disposal. At this time, disposal sites rely on LLW generator shipping manifests and accompanying records to ensure that LLW received meets the waste acceptance criteria. An independent verification system would provide a method of checking generator LLW characterization methods and help ensure that LLW disposed of at disposal facilities meets requirements. The Mobile Low-Level Waste Verification System (MLLWVS) provides the equipment, software, and methods to enable the independent verification of LLW shipping records to ensure that disposal site waste acceptance criteria are being met. The MLLWVS was developed under a cost-share subcontract between WMG, Inc., and Lockheed Martin Idaho Technologies through the Department of Energy's National Low-Level Waste Management Program at the Idaho National Engineering Laboratory (INEL).

  6. Land Ice Verification and Validation Kit

    Energy Science and Technology Software Center (ESTSC)

    2015-07-15

    To address a pressing need to better understand the behavior and complex interaction of ice sheets within the global Earth system, significant development of continental-scale, dynamical ice-sheet models is underway. The associated verification and validation process of these models is being coordinated through a new, robust, python-based extensible software package, the Land Ice Verification and Validation toolkit (LIVV). This release provides robust and automated verification and a performance evaluation on LCF platforms. The performance V&V involves a comprehensive comparison of model performance relative to expected behavior on a given computing platform. LIVV operates on a set of benchmark and test data, and provides comparisons for a suite of community prioritized tests, including configuration and parameter variations, bit-for-bit evaluation, and plots of tests where differences occur.

  7. Active alignment/contact verification system

    DOEpatents

    Greenbaum, William M.

    2000-01-01

    A system involving an active (i.e., electrical) technique for the verification of: 1) close-tolerance mechanical alignment between two components, and 2) electrical contact between mating parts through an elastomeric interface. For example, the two components may be an alumina carrier and a printed circuit board: two extremely small, high-density mating parts that require alignment within a fraction of a mil, as well as a specified point of engagement at the interface between the parts. The system comprises pairs of conductive structures defined in the surface layers of the alumina carrier and the printed circuit board, for example. The first pair of conductive structures relates to item (1) above and permits alignment verification between mating parts. The second pair of conductive structures relates to item (2) above and permits verification of electrical contact between mating parts.

  8. GHG MITIGATION TECHNOLOGY PERFORMANCE EVALUATIONS UNDERWAY AT THE GHG TECHNOLOGY VERIFICATION CENTER

    EPA Science Inventory

    The paper outlines the verification approach and activities of the Greenhouse Gas (GHG) Technology Verification Center, one of 12 independent verification entities operating under the U.S. EPA-sponsored Environmental Technology Verification (ETV) program. (NOTE: The ETV program...

  9. On Backward-Style Anonymity Verification

    NASA Astrophysics Data System (ADS)

    Kawabe, Yoshinobu; Mano, Ken; Sakurada, Hideki; Tsukada, Yasuyuki

    Many Internet services and protocols should guarantee anonymity; for example, an electronic voting system should guarantee to prevent the disclosure of who voted for which candidate. To prove trace anonymity, which is an extension of the formulation of anonymity by Schneider and Sidiropoulos, this paper presents an inductive method based on backward anonymous simulations. We show that the existence of an image-finite backward anonymous simulation implies trace anonymity. We also demonstrate the anonymity verification of an e-voting protocol (the FOO protocol) with our backward anonymous simulation technique. When proving the trace anonymity, this paper employs a computer-assisted verification tool based on a theorem prover.

  10. 340 and 310 drawing field verification

    SciTech Connect

    Langdon, J.

    1996-09-27

    The purpose of the drawing field verification work plan is to provide reliable drawings for the 310 Treated Effluent Disposal Facility (TEDF) and 340 Waste Handling Facility (340 Facility). The initial scope of this work plan is to provide field verified and updated versions of all the 340 Facility essential drawings. This plan can also be used for field verification of any other drawings that the facility management directs to be so updated. Any drawings revised by this work plan will be issued in an AutoCAD format.

  11. Verification of Plan Models Using UPPAAL

    NASA Technical Reports Server (NTRS)

    Khatib, Lina; Muscettola, Nicola; Haveland, Klaus; Lau, Sonic (Technical Monitor)

    2001-01-01

    This paper describes work on the verification of HSTS, the planner and scheduler of the Remote Agent autonomous control system deployed in Deep Space 1 (DS1). The verification is done using UPPAAL, a real time model checking tool. We start by motivating our work in the introduction. Then we give a brief description of HSTS and UPPAAL. After that, we give a mapping of HSTS models into UPPAAL and we present samples of plan model properties one may want to verify. Finally, we conclude with a summary.

  12. The formal verification of generic interpreters

    NASA Technical Reports Server (NTRS)

    Windley, P.; Levitt, K.; Cohen, G. C.

    1991-01-01

    Task assignment 3 of the design and validation of digital flight control systems suitable for fly-by-wire applications is studied. Task 3 is associated with the formal verification of embedded systems. In particular, results are presented that provide a methodological approach to microprocessor verification. A hierarchical decomposition strategy for specifying microprocessors is also presented. A theory of generic interpreters is presented that can be used to model microprocessor behavior. The generic interpreter theory abstracts away the details of instruction functionality, leaving a general model of what an interpreter does.

  13. Proton Therapy Verification with PET Imaging

    PubMed Central

    Zhu, Xuping; Fakhri, Georges El

    2013-01-01

    Proton therapy is very sensitive to uncertainties introduced during treatment planning and dose delivery. PET imaging of proton-induced positron emitter distributions is the only practical approach for in vivo, in situ verification of proton therapy. This article reviews the current status of proton therapy verification with PET imaging. The different data detection systems (in-beam, in-room and off-line PET), calculation methods for the prediction of proton-induced PET activity distributions, and approaches for data evaluation are discussed. PMID:24312147

  14. Design and ground verification of proximity operations

    NASA Astrophysics Data System (ADS)

    Tobias, A.; Ankersen, F.; Fehse, W.; Pauvert, C.; Pairot, J.

    This paper describes the approach to guidance, navigation, and control (GNC) design and verification for proximity operations. The most critical part of the rendezvous mission is the proximity operations phase when the distance between chaser and target is below approximately 20 m. Safety is the overriding consideration in the design of the GNC system. Requirements on the GNC system also stem from the allocation of performance between proximity operations and the mating process, docking, or capture for berthing. Whereas the design process follows a top down approach, the verification process goes bottom up in a stepwise way according to the development stage.

  15. Electric power system test and verification program

    NASA Technical Reports Server (NTRS)

    Rylicki, Daniel S.; Robinson, Frank, Jr.

    1994-01-01

    Space Station Freedom's (SSF's) electric power system (EPS) hardware and software verification is performed at all levels of integration, from components to assembly and system level tests. Careful planning is essential to ensure the EPS is tested properly on the ground prior to launch. The results of the test performed on breadboard model hardware and analyses completed to date have been evaluated and used to plan for design qualification and flight acceptance test phases. These results and plans indicate the verification program for SSF's 75-kW EPS would have been successful and completed in time to support the scheduled first element launch.

  16. VERIFICATION OF A TOXIC ORGANIC SUBSTANCE TRANSPORT AND BIOACCUMULATION MODEL

    EPA Science Inventory

    A field verification of the Toxic Organic Substance Transport and Bioaccumulation Model (TOXIC) was conducted using the insecticide dieldrin and the herbicides alachlor and atrazine as the test compounds. The test sites were two Iowa reservoirs. The verification procedure include...

  17. 19 CFR 181.74 - Verification visit procedures.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... generates a reliable receipt, to the CBP officer who gave the notification provided for in § 181.73 of this... otherwise cooperate during the verification visit shall mean that the verification visit never took...

  18. 19 CFR 181.74 - Verification visit procedures.

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ... generates a reliable receipt, to the CBP officer who gave the notification provided for in § 181.73 of this... otherwise cooperate during the verification visit shall mean that the verification visit never took...

  19. Generic Protocol for the Verification of Ballast Water Treatment Technology

    EPA Science Inventory

    In anticipation of the need to address performance verification and subsequent approval of new and innovative ballast water treatment technologies for shipboard installation, the U.S Coast Guard and the Environmental Protection Agency‘s Environmental Technology Verification Progr...

  20. ETV INTERNATIONAL OUTREACH ACTIVITIES (ENVIRONMENTAL TECHNOLOGY VERIFICATION (ETV) PROGRAM

    EPA Science Inventory

    The Environmental Technology Verification (ETV) Program's international outreach activities have extended as far as Canada, Germany, Taiwan and the Philippines. Vendors from Canada and Germany were hosted at verification tests of turbidimeters. In May 1999, EPA's ETV Coordinator...

  1. ENVIRONMENTAL TECHNOLOGY VERIFICATION PROGRAM FOR MONITORING AND CHARACTERIZATION

    EPA Science Inventory

    The Environmental Technology Verification Program is a service of the Environmental Protection Agency designed to accelerate the development and commercialization of improved environmental technology through third party verification and reporting of performance. The goal of ETV i...

  2. Earth Science Enterprise Scientific Data Purchase Project: Verification and Validation

    NASA Technical Reports Server (NTRS)

    Jenner, Jeff; Policelli, Fritz; Fletcher, Rosea; Holecamp, Kara; Owen, Carolyn; Nicholson, Lamar; Dartez, Deanna

    2000-01-01

    This paper presents viewgraphs on the Earth Science Enterprise Scientific Data Purchase Project's verification and validation process. The topics include: 1) What is Verification and Validation? 2) Why Verification and Validation? 3) Background; 4) ESE Data Purchase Validation Process; 5) Data Validation System and Ingest Queue; 6) Shipment Verification; 7) Tracking and Metrics; 8) Validation of Contract Specifications; 9) Earth Watch Data Validation; 10) Validation of Vertical Accuracy; and 11) Results of Vertical Accuracy Assessment.

  3. ENVIRONMENTAL TECHNOLOGY VERIFICATION REPORT - PERFORMANCE VERIFICATION OF THE W.L. GORE & ASSOCIATES GORE-SORBER SCREENING SURVEY

    EPA Science Inventory

    The U.S. Environmental Protection Agency (EPA) has created the Environmental Technology Verification Program (ETV) to facilitate the deployment of innovative or improved environmental technologies through performance verification and dissemination of information. The goal of the...

  4. 24 CFR 5.512 - Verification of eligible immigration status.

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ... immigration status. 5.512 Section 5.512 Housing and Urban Development Office of the Secretary, Department of... Noncitizens § 5.512 Verification of eligible immigration status. (a) General. Except as described in paragraph...) Primary verification—(1) Automated verification system. Primary verification of the immigration status...

  5. 24 CFR 5.512 - Verification of eligible immigration status.

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... immigration status. 5.512 Section 5.512 Housing and Urban Development Office of the Secretary, Department of... Noncitizens § 5.512 Verification of eligible immigration status. (a) General. Except as described in paragraph...) Primary verification—(1) Automated verification system. Primary verification of the immigration status...

  6. 24 CFR 5.512 - Verification of eligible immigration status.

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... immigration status. 5.512 Section 5.512 Housing and Urban Development Office of the Secretary, Department of... Noncitizens § 5.512 Verification of eligible immigration status. (a) General. Except as described in paragraph...) Primary verification—(1) Automated verification system. Primary verification of the immigration status...

  7. 24 CFR 5.512 - Verification of eligible immigration status.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... immigration status. 5.512 Section 5.512 Housing and Urban Development Office of the Secretary, Department of... Noncitizens § 5.512 Verification of eligible immigration status. (a) General. Except as described in paragraph...) Primary verification—(1) Automated verification system. Primary verification of the immigration status...

  8. 24 CFR 5.512 - Verification of eligible immigration status.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... immigration status. 5.512 Section 5.512 Housing and Urban Development Office of the Secretary, Department of... Noncitizens § 5.512 Verification of eligible immigration status. (a) General. Except as described in paragraph...) Primary verification—(1) Automated verification system. Primary verification of the immigration status...

  9. 34 CFR 668.54 - Selection of applications for verification.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 34 Education 3 2010-07-01 2010-07-01 false Selection of applications for verification. 668.54... Aid Application Information § 668.54 Selection of applications for verification. (a) General... to the institution, had previously completed the verification process at the institution from...

  10. 19 CFR 181.74 - Verification visit procedures.

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... 19 Customs Duties 2 2012-04-01 2012-04-01 false Verification visit procedures. 181.74 Section 181.74 Customs Duties U.S. CUSTOMS AND BORDER PROTECTION, DEPARTMENT OF HOMELAND SECURITY; DEPARTMENT OF THE TREASURY (CONTINUED) NORTH AMERICAN FREE TRADE AGREEMENT Origin Verifications and Determinations § 181.74 Verification visit procedures....

  11. 40 CFR 1065.395 - Inertial PM balance verifications.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... 40 Protection of Environment 33 2014-07-01 2014-07-01 false Inertial PM balance verifications... Inertial PM balance verifications. This section describes how to verify the performance of an inertial PM balance. (a) Independent verification. Have the balance manufacturer (or a representative approved by...

  12. 40 CFR 1065.395 - Inertial PM balance verifications.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... 40 Protection of Environment 34 2013-07-01 2013-07-01 false Inertial PM balance verifications... Inertial PM balance verifications. This section describes how to verify the performance of an inertial PM balance. (a) Independent verification. Have the balance manufacturer (or a representative approved by...

  13. 40 CFR 1065.395 - Inertial PM balance verifications.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... 40 Protection of Environment 34 2012-07-01 2012-07-01 false Inertial PM balance verifications... Inertial PM balance verifications. This section describes how to verify the performance of an inertial PM balance. (a) Independent verification. Have the balance manufacturer (or a representative approved by...

  14. 40 CFR 1065.395 - Inertial PM balance verifications.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 40 Protection of Environment 32 2010-07-01 2010-07-01 false Inertial PM balance verifications... Inertial PM balance verifications. This section describes how to verify the performance of an inertial PM balance. (a) Independent verification. Have the balance manufacturer (or a representative approved by...

  15. 40 CFR 1065.395 - Inertial PM balance verifications.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... 40 Protection of Environment 33 2011-07-01 2011-07-01 false Inertial PM balance verifications... Inertial PM balance verifications. This section describes how to verify the performance of an inertial PM balance. (a) Independent verification. Have the balance manufacturer (or a representative approved by...

  16. 45 CFR 95.626 - Independent Verification and Validation.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... 45 Public Welfare 1 2011-10-01 2011-10-01 false Independent Verification and Validation. 95.626... (FFP) Specific Conditions for Ffp § 95.626 Independent Verification and Validation. (a) An assessment for independent verification and validation (IV&V) analysis of a State's system development effort...

  17. 40 CFR 1066.240 - Torque transducer verification.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... 40 Protection of Environment 33 2014-07-01 2014-07-01 false Torque transducer verification. 1066... POLLUTION CONTROLS VEHICLE-TESTING PROCEDURES Dynamometer Specifications § 1066.240 Torque transducer verification. Verify torque-measurement systems by performing the verifications described in §§ 1066.270...

  18. Seven Years of ACTS Technology Verification Experiments Reviewed

    NASA Technical Reports Server (NTRS)

    Acosta, Roberto J.; Johnson, Sandra K.; McEntee, Kathleen M.; Gauntner, William; Feliciano, Walber

    2000-01-01

    The Advanced Communications Technology Satellite (ACTS) was designed to achieve a 99.5-percent system availability rate and signals with less than one error in 10^7 bits throughout the continental United States. To accomplish such a high rate of system availability, ACTS uses multiple narrow hopping beams and very small aperture terminal (VSAT) technology. In addition, ACTS uses an adaptive rain fade compensation protocol to reduce the negative effects of propagation on the system. To enhance knowledge on how propagation and system variances affect system availability, researchers at the NASA Glenn Research Center at Lewis Field performed technology verification experiments over a 7-yr period (from September 1993 to the present). These experiments include T1VSAT System Availability, Statistical Rain Fade Compensation Characterization, Statistical Characterization of Ka-Band Propagation Effects on Communication Link Performance, and Multibeam Antenna Performance.

  19. Digital video system for on-line portal verification

    NASA Astrophysics Data System (ADS)

    Leszczynski, Konrad W.; Shalev, Shlomo; Cosby, N. Scott

    1990-07-01

    A digital system has been developed for on-line acquisition, processing and display of portal images during radiation therapy treatment. A metal/phosphor screen combination is the primary detector, where the conversion from high-energy photons to visible light takes place. A mirror angled at 45 degrees reflects the primary image to a low-light-level camera, which is removed from the direct radiation beam. The image registered by the camera is digitized, processed and displayed on a CRT monitor. Advanced digital techniques for processing of on-line images have been developed and implemented to enhance image contrast and suppress noise. Some elements of automated radiotherapy treatment verification have been introduced.
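    A simplified sketch of the two processing steps named above: temporal frame averaging to suppress noise, then a percentile contrast stretch. The parameters are assumptions for illustration; the authors' actual algorithms are not specified here.

```python
import numpy as np

def enhance_portal_frames(frames, low_pct=2.0, high_pct=98.0):
    """Average a burst of video frames (noise suppression), then stretch
    the contrast between two intensity percentiles."""
    avg = np.mean(np.asarray(frames, dtype=float), axis=0)
    lo, hi = np.percentile(avg, [low_pct, high_pct])
    stretched = np.clip((avg - lo) / max(hi - lo, 1e-9), 0.0, 1.0)
    return (255.0 * stretched).astype(np.uint8)
```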

  20. Synthesis, verification, and optimization of systolic arrays

    SciTech Connect

    Rajopadhye, S.V.

    1986-01-01

    This dissertation addresses the issue of providing a sound theoretical basis for three important issues relating to systolic arrays, namely synthesis, verification, and optimization. Former research has concentrated on analysis of the dependency structure of the computation, and there have been numerous approaches to map this dependency structure onto a locally interconnected network. This study pursues a similar approach, but with a major generalization of the class of problems analyzed. In earlier research, it was essential that the dependencies were expressible as constant vectors (from a point in the domain to the points that it depended on); here they are permitted to be arbitrary linear functions of the point. Theory for synthesizing systolic architectures from such generalized specifications is developed. Also, a systematic (mechanizable) approach to the synthesis of systolic architectures that have control signals is presented. In the areas of verification and optimization, a rigorous mathematical framework is presented that permits reasoning about the behavior of systolic arrays as functions on streams of data. Using this approach, the verification of such architectures reduces to the problem of verification of functional programs.

  1. PROMOTING AIR QUALITY THROUGH ENVIRONMENTAL TECHNOLOGY VERIFICATION

    EPA Science Inventory

    The paper discusses the promotion of improved air quality through environmental technology verifications (ETVs). In 1995, the U.S. EPA's Office of Research and Development began the ETV Program in response to President Clinton's "Bridge to a Sustainable Future" and Vice Presiden...

  2. 18 CFR 12.13 - Verification form.

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... 18 Conservation of Power and Water Resources 1 2012-04-01 2012-04-01 false Verification form. 12.13 Section 12.13 Conservation of Power and Water Resources FEDERAL ENERGY REGULATORY COMMISSION, DEPARTMENT OF ENERGY REGULATIONS UNDER THE FEDERAL POWER ACT SAFETY OF WATER POWER PROJECTS AND PROJECT...

  3. 18 CFR 12.13 - Verification form.

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... 18 Conservation of Power and Water Resources 1 2014-04-01 2014-04-01 false Verification form. 12.13 Section 12.13 Conservation of Power and Water Resources FEDERAL ENERGY REGULATORY COMMISSION, DEPARTMENT OF ENERGY REGULATIONS UNDER THE FEDERAL POWER ACT SAFETY OF WATER POWER PROJECTS AND PROJECT...

  4. 18 CFR 12.13 - Verification form.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... 18 Conservation of Power and Water Resources 1 2011-04-01 2011-04-01 false Verification form. 12.13 Section 12.13 Conservation of Power and Water Resources FEDERAL ENERGY REGULATORY COMMISSION, DEPARTMENT OF ENERGY REGULATIONS UNDER THE FEDERAL POWER ACT SAFETY OF WATER POWER PROJECTS AND PROJECT...

  5. 18 CFR 12.13 - Verification form.

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ... 18 Conservation of Power and Water Resources 1 2013-04-01 2013-04-01 false Verification form. 12.13 Section 12.13 Conservation of Power and Water Resources FEDERAL ENERGY REGULATORY COMMISSION, DEPARTMENT OF ENERGY REGULATIONS UNDER THE FEDERAL POWER ACT SAFETY OF WATER POWER PROJECTS AND PROJECT...

  6. 18 CFR 12.13 - Verification form.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... 18 Conservation of Power and Water Resources 1 2010-04-01 2010-04-01 false Verification form. 12.13 Section 12.13 Conservation of Power and Water Resources FEDERAL ENERGY REGULATORY COMMISSION, DEPARTMENT OF ENERGY REGULATIONS UNDER THE FEDERAL POWER ACT SAFETY OF WATER POWER PROJECTS AND PROJECT...

  7. Gender, Legitimation, and Identity Verification in Groups

    ERIC Educational Resources Information Center

    Burke, Peter J.; Stets, Jan E.; Cerven, Christine

    2007-01-01

    Drawing upon identity theory, expectation states theory, and legitimation theory, we examine how the task leader identity in task-oriented groups is more likely to be verified for persons with high status characteristics. We hypothesize that identity verification will be accomplished more readily for male group members and legitimated task leaders…

  8. Verification of Autonomous Systems for Space Applications

    NASA Technical Reports Server (NTRS)

    Brat, G.; Denney, E.; Giannakopoulou, D.; Frank, J.; Jonsson, A.

    2006-01-01

    Autonomous software, especially if it is model based, can play an important role in future space applications. For example, it can help streamline ground operations, assist in autonomous rendezvous and docking operations, or even help recover from problems (e.g., planners can be used to explore the space of recovery actions for a power subsystem and implement a solution without, or with minimal, human intervention). In general, the exploration capabilities of model-based systems give them great flexibility. Unfortunately, this also makes them unpredictable to our human eyes, both in terms of their execution and their verification. Traditional verification techniques are inadequate for these systems since they are mostly based on testing, which implies a very limited exploration of their behavioral space. In our work, we explore how advanced V&V techniques, such as static analysis, model checking, and compositional verification, can be used to gain trust in model-based systems. We also describe how synthesis can be used in the context of system reconfiguration and in the context of verification.

  9. The Verification Guide, 1998-99.

    ERIC Educational Resources Information Center

    Office of Postsecondary Education, Washington DC. Student Financial Assistance Programs.

    This guide is intended to assist financial aid administrators at postsecondary education institutions in completing verification, the process of checking the accuracy of the information students provide when they apply for financial aid under student financial assistance (SFA) programs administered by the U.S. Department of Education. The first…

  10. Alternative Model Learner Verification and Revision Statutes.

    ERIC Educational Resources Information Center

    Geffert, Hannah N.; And Others

    A learner verification and revision (LVR) process attempts to discover difficulties learners experience in using instructional materials and to formulate possible ways of modifying the instructional materials to eliminate the difficulties. It is a means of ensuring useful learner input into the prepublication development and postpublication…

  11. Environmental Technology Verification Program Fact Sheet

    EPA Science Inventory

    This is a Fact Sheet for the ETV Program. The EPA Environmental Technology Verification Program (ETV) develops test protocols and verifies the performance of innovative technologies that have the potential to improve protection of human health and the environment. The program ...

  12. An Interactive System for Graduation Verification.

    ERIC Educational Resources Information Center

    Wang, Y.; Dasarathy, B.

    1981-01-01

    This description of a computerized graduation verification system developed and implemented at the University of South Carolina at Columbia discusses the "table-driven" feature of the programs and details the implementation of the system, including examples of the Extended Backus Naur Form (EBNF) notation used to represent the system language…

  13. RELAP-7 SOFTWARE VERIFICATION AND VALIDATION PLAN

    SciTech Connect

    Smith, Curtis L; Choi, Yong-Joon; Zou, Ling

    2014-09-01

    This INL plan comprehensively describes the software for RELAP-7 and documents the software, interface, and software design requirements for the application. The plan also describes the testing-based software verification and validation (SV&V) process—a set of specially designed software models used to test RELAP-7.

  14. 20 CFR 325.6 - Verification procedures.

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... Employees' Benefits RAILROAD RETIREMENT BOARD REGULATIONS UNDER THE RAILROAD UNEMPLOYMENT INSURANCE ACT REGISTRATION FOR RAILROAD UNEMPLOYMENT BENEFITS § 325.6 Verification procedures. The Board's procedures for adjudicating and processing applications and claims for unemployment benefits filed pursuant to this part...

  15. 20 CFR 325.6 - Verification procedures.

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ... Employees' Benefits RAILROAD RETIREMENT BOARD REGULATIONS UNDER THE RAILROAD UNEMPLOYMENT INSURANCE ACT REGISTRATION FOR RAILROAD UNEMPLOYMENT BENEFITS § 325.6 Verification procedures. The Board's procedures for adjudicating and processing applications and claims for unemployment benefits filed pursuant to this part...

  16. 20 CFR 325.6 - Verification procedures.

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... Employees' Benefits RAILROAD RETIREMENT BOARD REGULATIONS UNDER THE RAILROAD UNEMPLOYMENT INSURANCE ACT REGISTRATION FOR RAILROAD UNEMPLOYMENT BENEFITS § 325.6 Verification procedures. The Board's procedures for adjudicating and processing applications and claims for unemployment benefits filed pursuant to this part...

  17. 16 CFR 315.5 - Prescriber verification.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... Commercial Practices FEDERAL TRADE COMMISSION REGULATIONS UNDER SPECIFIC ACTS OF CONGRESS CONTACT LENS RULE § 315.5 Prescriber verification. (a) Prescription requirement. A seller may sell contact lenses only in accordance with a contact lens prescription for the patient that is: (1) Presented to the seller by...

  18. 16 CFR 315.5 - Prescriber verification.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... Commercial Practices FEDERAL TRADE COMMISSION REGULATIONS UNDER SPECIFIC ACTS OF CONGRESS CONTACT LENS RULE § 315.5 Prescriber verification. (a) Prescription requirement. A seller may sell contact lenses only in accordance with a contact lens prescription for the patient that is: (1) Presented to the seller by...

  19. 16 CFR 315.5 - Prescriber verification.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... Commercial Practices FEDERAL TRADE COMMISSION REGULATIONS UNDER SPECIFIC ACTS OF CONGRESS CONTACT LENS RULE § 315.5 Prescriber verification. (a) Prescription requirement. A seller may sell contact lenses only in accordance with a contact lens prescription for the patient that is: (1) Presented to the seller by...

  20. 16 CFR 315.5 - Prescriber verification.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... Commercial Practices FEDERAL TRADE COMMISSION REGULATIONS UNDER SPECIFIC ACTS OF CONGRESS CONTACT LENS RULE § 315.5 Prescriber verification. (a) Prescription requirement. A seller may sell contact lenses only in accordance with a contact lens prescription for the patient that is: (1) Presented to the seller by...

  1. 16 CFR 315.5 - Prescriber verification.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... Commercial Practices FEDERAL TRADE COMMISSION REGULATIONS UNDER SPECIFIC ACTS OF CONGRESS CONTACT LENS RULE § 315.5 Prescriber verification. (a) Prescription requirement. A seller may sell contact lenses only in accordance with a contact lens prescription for the patient that is: (1) Presented to the seller by...

  2. 20 CFR 325.6 - Verification procedures.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... Employees' Benefits RAILROAD RETIREMENT BOARD REGULATIONS UNDER THE RAILROAD UNEMPLOYMENT INSURANCE ACT REGISTRATION FOR RAILROAD UNEMPLOYMENT BENEFITS § 325.6 Verification procedures. The Board's procedures for adjudicating and processing applications and claims for unemployment benefits filed pursuant to this part...

  3. Learner Verification: A Publisher's Case Study.

    ERIC Educational Resources Information Center

    Wilson, George

    Learner verification, a process by which publishers monitor the effectiveness of their products and strive to improve their services to schools, is a practice that most companies take seriously. The quality of educational materials may be ensured in many ways: by analysis of sales, through firsthand investigation, and by employing a system of…

  4. ENVIRONMENTAL TECHNOLOGY VERIFICATION OF BAGHOUSE FILTRATION PRODUCTS

    EPA Science Inventory

    The Environmental Technology Verification Program (ETV) was started by EPA in 1995 to generate independent credible data on the performance of innovative technologies that have potential to improve protection of public health and the environment. ETV does not approve or certify p...

  5. 42 CFR 457.380 - Eligibility verification.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... 42 Public Health 4 2013-10-01 2013-10-01 false Eligibility verification. 457.380 Section 457.380 Public Health CENTERS FOR MEDICARE & MEDICAID SERVICES, DEPARTMENT OF HEALTH AND HUMAN SERVICES (CONTINUED) STATE CHILDREN'S HEALTH INSURANCE PROGRAMS (SCHIPs) ALLOTMENTS AND GRANTS TO STATES State Plan Requirements: Eligibility,...

  6. ENVIRONMENTAL TECHNOLOGY VERIFICATION OF URBAN RUNOFF MODELS

    EPA Science Inventory

    This paper will present the verification process and available results for the XP-SWMM modeling system produced by XP-Software, conducted under the USEPA's ETV Program. Wet weather flow (WWF) models are used throughout the US for the evaluation of storm and combined sewer systems. M...

  7. Formal hardware verification of digital circuits

    NASA Technical Reports Server (NTRS)

    Joyce, J.; Seger, C.-J.

    1991-01-01

    The use of formal methods to verify the correctness of digital circuits is less constrained by the growing complexity of digital circuits than conventional methods based on exhaustive simulation. This paper briefly outlines three main approaches to formal hardware verification: symbolic simulation, state machine analysis, and theorem-proving.
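    As a toy illustration of the symbolic flavor of these approaches, the sketch below checks a gate-level full adder against its specification by asking whether any input assignment can violate the equivalence (using sympy's boolean logic; a small stand-in for real symbolic simulation or theorem proving, with the gate structure assumed).

```python
from sympy import symbols
from sympy.logic.boolalg import And, Or, Xor, Equivalent
from sympy.logic.inference import satisfiable

a, b, cin = symbols("a b cin")

# Gate-level implementation of a 1-bit full adder.
sum_impl = Xor(Xor(a, b), cin)
carry_impl = Or(And(a, b), And(cin, Xor(a, b)))

# Specification: sum and carry of a + b + cin.
sum_spec = Xor(a, b, cin)
carry_spec = Or(And(a, b), And(a, cin), And(b, cin))  # majority

# Correct iff no assignment makes the implementation differ from the spec.
for impl, spec in ((sum_impl, sum_spec), (carry_impl, carry_spec)):
    assert satisfiable(~Equivalent(impl, spec)) is False
```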

  8. 20 CFR 325.6 - Verification procedures.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... Employees' Benefits RAILROAD RETIREMENT BOARD REGULATIONS UNDER THE RAILROAD UNEMPLOYMENT INSURANCE ACT REGISTRATION FOR RAILROAD UNEMPLOYMENT BENEFITS § 325.6 Verification procedures. The Board's procedures for adjudicating and processing applications and claims for unemployment benefits filed pursuant to this part...

  9. Environmental Technology Verification (ETV) Quality Program (Poster)

    EPA Science Inventory

    This is a poster created for the ETV Quality Program. The EPA Environmental Technology Verification Program (ETV) develops test protocols and verifies the performance of innovative technologies that have the potential to improve protection of human health and the environment. The...

  10. Distilling the Verification Process for Prognostics Algorithms

    NASA Technical Reports Server (NTRS)

    Roychoudhury, Indranil; Saxena, Abhinav; Celaya, Jose R.; Goebel, Kai

    2013-01-01

    The goal of prognostics and health management (PHM) systems is to ensure system safety, and reduce downtime and maintenance costs. It is important that a PHM system is verified and validated before it can be successfully deployed. Prognostics algorithms are integral parts of PHM systems. This paper investigates a systematic process of verification of such prognostics algorithms. To this end, first, this paper distinguishes between technology maturation and product development. Then, the paper describes the verification process for a prognostics algorithm as it moves up to higher maturity levels. This process is shown to be an iterative process where verification activities are interleaved with validation activities at each maturation level. In this work, we adopt the concept of technology readiness levels (TRLs) to represent the different maturity levels of a prognostics algorithm. It is shown that at each TRL, the verification of a prognostics algorithm depends on verifying the different components of the algorithm according to the requirements laid out by the PHM system that adopts this prognostics algorithm. Finally, using simplified examples, the systematic process for verifying a prognostics algorithm is demonstrated as the prognostics algorithm moves up TRLs.

  11. Hardware verification at Computational Logic, Inc.

    NASA Technical Reports Server (NTRS)

    Brock, Bishop C.; Hunt, Warren A., Jr.

    1990-01-01

    The following topics are covered in viewgraph form: (1) hardware verification; (2) Boyer-Moore logic; (3) core RISC; (4) the FM8502 fabrication, implementation specification, and pinout; (5) hardware description language; (6) arithmetic logic generator; (7) near term expected results; (8) present trends; (9) future directions; (10) collaborations and technology transfer; and (11) technology enablers.

  12. 40 CFR 1066.135 - Linearity verification.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... that are unique to testing under this part. (Note: See the definition of “linearity” in 40 CFR 1065... as described in 40 CFR 1065.307, with the exceptions noted in this section. (a) For gas analyzer... verification between 15 and 50% of the full-scale analyzer range. (2) Use the linearity requirements of 40...

  13. On the role of code comparisons in verification and validation.

    SciTech Connect

    Oberkampf, William Louis; Trucano, Timothy Guy; Pilch, Martin M.

    2003-08-01

    This report presents a perspective on the role of code comparison activities in verification and validation. We formally define the act of code comparison as the Code Comparison Principle (CCP) and investigate its application in both verification and validation. One of our primary conclusions is that the use of code comparisons for validation is improper and dangerous. We also conclude that while code comparisons may be argued to provide a beneficial component in code verification activities, there are higher quality code verification tasks that should take precedence. Finally, we provide a process for application of the CCP that we believe is minimal for achieving benefit in verification processes.

  14. The AdaptiV Approach to Verification of Adaptive Systems

    SciTech Connect

    Rouff, Christopher; Buskens, Richard; Pullum, Laura L; Cui, Xiaohui; Hinchey, Mike

    2012-01-01

    Adaptive systems are critical for future space and other unmanned and intelligent systems. Verification of these systems is also critical for their use in systems with potential harm to human life or with large financial investments. Due to their nondeterministic nature and extremely large state space, current methods for verification of software systems are not adequate to provide a high level of assurance. The combination of stabilization science, high performance computing simulations, compositional verification and traditional verification techniques, plus operational monitors, provides a complete approach to verification and deployment of adaptive systems that has not been used before. This paper gives an overview of this approach.

  15. Infrasound workshop for CTBT monitoring: Proceedings

    SciTech Connect

    Christie, D.; Whitaker, R.

    1998-11-01

    It is expected that the establishment of new infrasound stations in the global IMS network by the Provisional Technical Secretariat of the CTBTO in Vienna will commence in the middle of 1998. Thus, decisions on the final operational design for IMS infrasound stations will have to be made within the next 12 months. Though many of the basic design problems have been resolved, it is clear that further work needs to be carried out during the coming year to ensure that IMS infrasound stations will operate with maximum capability in accord with the specifications determined during the May 1997 PrepCom Meeting. Some of the papers presented at the Workshop suggest that it may be difficult to design a four-element infrasound array station that will reliably detect and locate infrasound signals at all frequencies in the specified range from 0.02 to 4.0 Hz in all noise environments. Hence, if the basic design of an infrasound array is restricted to four array elements, the final optimized design may be suited only to the detection and location of signals in a more limited pass-band. Several participants have also noted that the reliable discrimination of infrasound signals could be quite difficult if the detection system leads to signal distortion. Thus, it has been emphasized that the detection system should not, if possible, compromise signal fidelity. This report contains the workshop agenda, a list of participants, and abstracts and viewgraphs from each presentation.

  16. Verification and validation in computational fluid dynamics

    NASA Astrophysics Data System (ADS)

    Oberkampf, William L.; Trucano, Timothy G.

    2002-04-01

    Verification and validation (V&V) are the primary means to assess accuracy and reliability in computational simulations. This paper presents an extensive review of the literature in V&V in computational fluid dynamics (CFD), discusses methods and procedures for assessing V&V, and develops a number of extensions to existing ideas. The review of the development of V&V terminology and methodology points out the contributions from members of the operations research, statistics, and CFD communities. Fundamental issues in V&V are addressed, such as code verification versus solution verification, model validation versus solution validation, the distinction between error and uncertainty, conceptual sources of error and uncertainty, and the relationship between validation and prediction. The fundamental strategy of verification is the identification and quantification of errors in the computational model and its solution. In verification activities, the accuracy of a computational solution is primarily measured relative to two types of highly accurate solutions: analytical solutions and highly accurate numerical solutions. Methods for determining the accuracy of numerical solutions are presented and the importance of software testing during verification activities is emphasized. The fundamental strategy of validation is to assess how accurately the computational results compare with the experimental data, with quantified error and uncertainty estimates for both. This strategy employs a hierarchical methodology that segregates and simplifies the physical and coupling phenomena involved in the complex engineering system of interest. A hypersonic cruise missile is used as an example of how this hierarchical structure is formulated. The discussion of validation assessment also encompasses a number of other important topics. A set of guidelines is proposed for designing and conducting validation experiments, supported by an explanation of how validation experiments are different
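    One concrete solution-verification computation in this vein is the observed order of accuracy from three systematically refined grids (Richardson-extrapolation style). The sketch below is generic, with made-up sample values.

```python
import numpy as np

def observed_order(f_coarse, f_medium, f_fine, r=2.0):
    """Observed order of accuracy from solutions on three grids that are
    systematically refined by a constant ratio r."""
    return np.log((f_coarse - f_medium) / (f_medium - f_fine)) / np.log(r)

# Example with invented drag values from grids refined by a factor of 2;
# (1.0480 - 1.0120) / (1.0120 - 1.0030) = 4, so the observed order is 2.
print(f"observed order = {observed_order(1.0480, 1.0120, 1.0030):.2f}")
```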

  17. Systems Approach to Arms Control Verification

    SciTech Connect

    Allen, K; Neimeyer, I; Listner, C; Stein, G; Chen, C; Dreicer, M

    2015-05-15

    Using the decades of experience of developing concepts and technologies for verifying bilateral and multilateral arms control agreements, a broad conceptual systems approach is being developed that takes into account varying levels of information and risk. The IAEA has already demonstrated the applicability of a systems approach by implementing safeguards at the State level, with acquisition path analysis as the key element. In order to test whether such an approach could also be implemented for arms control verification, an exercise was conducted in November 2014 at the JRC ITU Ispra. Based on the scenario of a hypothetical treaty between two model nuclear weapons states aimed at capping their nuclear arsenals at existing levels, the goal of this exercise was to explore how to use acquisition path analysis in an arms control context. Our contribution will present the scenario, objectives and results of this exercise, and attempt to define future workshops aimed at further developing verification measures that will deter or detect treaty violations.

  18. Collaborative Localization and Location Verification in WSNs

    PubMed Central

    Miao, Chunyu; Dai, Guoyong; Ying, Kezhen; Chen, Qingzhang

    2015-01-01

    Localization is one of the most important technologies in wireless sensor networks. A lightweight distributed node localization scheme is proposed by considering the limited computational capacity of WSNs. The proposed scheme introduces the virtual force model to determine the location by incremental refinement. Aiming at solving the drifting problem and the malicious anchor problem, a location verification algorithm based on the virtual force model is presented. In addition, an anchor promotion algorithm using the localization reliability model is proposed to re-locate the drifted nodes. Extended simulation experiments indicate that the localization algorithm has relatively high precision and the location verification algorithm has relatively high accuracy. The communication overhead of these algorithms is relatively low, and the whole set of reliable localization methods is practical as well as comprehensive. PMID:25954948
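    A minimal sketch of virtual-force refinement under assumed names and parameters: each anchor pushes or pulls the position estimate along their connecting line in proportion to the ranging residual. This illustrates the general idea only, not the authors' algorithm.

```python
import numpy as np

def virtual_force_refine(pos, anchors, measured_d, step=0.1, iters=200):
    """Incrementally refine a 2-D position estimate from anchor ranges."""
    pos = np.array(pos, dtype=float)
    for _ in range(iters):
        force = np.zeros(2)
        for anchor, d_meas in zip(np.asarray(anchors, float), measured_d):
            vec = pos - anchor
            d_est = np.linalg.norm(vec)
            if d_est < 1e-9:
                continue
            # Estimate too close to the anchor: push outward along vec;
            # too far: pull back toward the anchor.
            force += (d_meas - d_est) * vec / d_est
        pos += step * force
    return pos
```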

  19. Conceptual design. Final report: TFE Verification Program

    SciTech Connect

    Not Available

    1994-03-01

    This report documents the TFE Conceptual Design, which provided the design guidance for the TFE Verification Program. The primary goals of this design effort were to: (1) establish the conceptual design of an in-core thermionic reactor for a 2 MW(e) space nuclear power system with a 7-year operating lifetime; (2) demonstrate scalability of the above concept over the output power range of 500 kW(e) to 5 MW(e); and (3) define the TFE which is the basis for the 2 MW(e) reactor design. This TFE specification provided the basis for the test program. These primary goals were achieved. The technical approach taken in the conceptual design effort is discussed in Section 2, and the results are discussed in Section 3. The remainder of this introduction draws a perspective on the role that this conceptual design task played in the TFE Verification Program.

  20. Sensor-fusion-based biometric identity verification

    SciTech Connect

    Carlson, J.J.; Bouchard, A.M.; Osbourn, G.C.; Martinez, R.F.; Bartholomew, J.W.; Jordan, J.B.; Flachs, G.M.; Bao, Z.; Zhu, L.

    1998-02-01

    Future generation automated human biometric identification and verification will require multiple features/sensors together with internal and external information sources to achieve high performance, accuracy, and reliability in uncontrolled environments. The primary objective of the proposed research is to develop a theoretical and practical basis for identifying and verifying people using standoff biometric features that can be obtained with minimal inconvenience during the verification process. The basic problem involves selecting sensors and discovering features that provide sufficient information to reliably verify a person's identity under the uncertainties caused by measurement errors and tactics of uncooperative subjects. A system was developed for discovering hand, face, ear, and voice features and fusing them to verify the identity of people. The system obtains its robustness and reliability by fusing many coarse and easily measured features into a near minimal probability of error decision algorithm.
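    The simplest form of the fusion step is a weighted combination of per-modality match scores; the sketch below shows that baseline with invented weights and threshold. It is not the near-minimal-probability-of-error decision algorithm the authors describe.

```python
import numpy as np

def fused_verify(scores, weights, threshold=0.7):
    """Accept/reject an identity claim from per-modality match scores
    (e.g., hand, face, ear, voice) via a normalized weighted sum."""
    scores, weights = np.asarray(scores, float), np.asarray(weights, float)
    fused = float(np.dot(weights, scores) / weights.sum())
    return fused >= threshold, fused

accept, fused = fused_verify([0.91, 0.62, 0.70, 0.85], [0.4, 0.2, 0.1, 0.3])
print(f"accept={accept}, fused={fused:.3f}")  # accept=True, fused=0.813
```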

  1. Validation and verification of expert systems

    NASA Technical Reports Server (NTRS)

    Gilstrap, Lewey

    1991-01-01

    Validation and verification (V&V) are procedures used to evaluate system structure or behavior with respect to a set of requirements. Although expert systems are often developed as a series of prototypes without requirements, it is not possible to perform V&V on any system for which requirements have not been prepared. In addition, there are special problems associated with the evaluation of expert systems that do not arise in the evaluation of conventional systems, such as verification of the completeness and accuracy of the knowledge base. The criticality of most NASA missions makes it important to be able to certify the performance of the expert systems used to support these missions. Recommendations for the most appropriate method for integrating V&V into the Expert System Development Methodology (ESDM) and suggestions for the most suitable approaches for each stage of ESDM development are presented.

  2. Packaged low-level waste verification system

    SciTech Connect

    Tuite, K.; Winberg, M.R.; McIsaac, C.V.

    1995-12-31

    The Department of Energy through the National Low-Level Waste Management Program and WMG Inc. have entered into a joint development effort to design, build, and demonstrate the Packaged Low-Level Waste Verification System. Currently, states and low-level radioactive waste disposal site operators have no method to independently verify the radionuclide content of packaged low-level waste that arrives at disposal sites for disposition. At this time, the disposal site relies on the low-level waste generator shipping manifests and accompanying records to ensure that low-level waste received meets the site's waste acceptance criteria. The subject invention provides the equipment, software, and methods to enable the independent verification of low-level waste shipping records to ensure that the site's waste acceptance criteria are being met. The objective of the prototype system is to demonstrate a mobile system capable of independently verifying the content of packaged low-level waste.

  3. Test load verification through strain data analysis

    NASA Technical Reports Server (NTRS)

    Verderaime, V.; Harrington, F.

    1995-01-01

    A traditional binding acceptance criterion on polycrystalline structures is the experimental verification of the ultimate factor of safety. At fracture, the induced strain is inelastic and about an order of magnitude greater than the design maximum expected operational limit. In this extreme strained condition, the structure may rotate and displace under the applied verification load such that the load transfer into the static test article is unknowingly distorted. The test may then erroneously accept a submarginal design or reject a reliable one. A technique was developed to identify, monitor, and assess the load transmission error using two back-to-back surface-measured strain data sets. The technique is programmed for expediency and convenience. Though the method was developed to support affordable aerostructures, it is also applicable to most high-performance air and surface transportation structural systems.

  4. Cleaning verification by air/water impingement

    NASA Technical Reports Server (NTRS)

    Jones, Lisa L.; Littlefield, Maria D.; Melton, Gregory S.; Caimi, Raoul E. B.; Thaxton, Eric A.

    1995-01-01

    This paper will discuss how the Kennedy Space Center intends to perform precision cleaning verification by Air/Water Impingement in lieu of chlorofluorocarbon-113 gravimetric nonvolatile residue analysis (NVR). Test results will be given that demonstrate the effectiveness of the Air/Water system. A brief discussion of the Total Carbon method via the use of a high temperature combustion analyzer will also be given. The necessary equipment for impingement will be shown along with other possible applications of this technology.

  5. Dose Verification in IMRT and VMAT

    SciTech Connect

    Feygelman, Vladimir; Nelms, Benjamin E.

    2011-05-05

    This is a review paper of the current IMRT dosimetric verification methods, with the emphasis on the solid state dosimeters. Different types of IMRT treatments and the associated quality assurance challenges are described. The prevailing techniques of quantifying dosimetric agreement and their weaknesses in terms of clinical relevance are discussed. A variety of empirical, semi-empirical, and purely calculational methods are summarized from a clinical practice point of view. A number of available commercial devices and emerging technologies are described.

  6. Verification system for postoperative autologous blood retransfusion.

    PubMed

    Yoshikawa, Takeki; Kimura, Eizen; Kobayashi, Shinji; Ishihara, Ken

    2013-01-01

    Medical staff members should match blood products with patients using a barcode authentication system for blood transfusion to prevent medical accidents. However, our hospital only verifies the blood products of the Japanese Red Cross Society and preserved autologous blood, not the autologous blood salvaged during the operation or from the oxygenator. In this study, we developed a barcode medication administration system and a mobile device for verification. This system will prevent blood transfusion errors in the ward setting. PMID:23920751

  7. Component Verification and Certification in NASA Missions

    NASA Technical Reports Server (NTRS)

    Giannakopoulou, Dimitra; Penix, John; Norvig, Peter (Technical Monitor)

    2001-01-01

    Software development for NASA missions is a particularly challenging task. Missions are extremely ambitious scientifically, have very strict time frames, and must be accomplished with a maximum degree of reliability. Verification technologies must therefore be pushed far beyond their current capabilities. Moreover, reuse and adaptation of software architectures and components must be incorporated in software development within and across missions. This paper discusses NASA applications that we are currently investigating from these perspectives.

  8. Verification and performance tests of HYCAR program

    NASA Technical Reports Server (NTRS)

    Bhatia, Veena

    1985-01-01

    The HYCAR program simulates the network protocols of HYPERchannel and Fiber Optic Demonstration System (FODS) and other related protocols. Verification tests of the program were conducted using the FODS protocol. The tests validated the operation of the program through deterministic and analytical means. Extensive experimentation with the simulator produced a set of performance characteristics for the FODS protocol under varied loading conditions. These characteristics are consistent with those expected, and are documented with the validation tests.

  9. Survey of Existing Tools for Formal Verification.

    SciTech Connect

    Punnoose, Ratish J.; Armstrong, Robert C.; Wong, Matthew H.; Mayo, Jackson

    2014-12-01

    Formal methods have come into wide use because of their effectiveness in verifying "safety and security" requirements of digital systems; a set of requirements for which testing is mostly ineffective. Formal methods are routinely used in the design and verification of high-consequence digital systems in industry. This report outlines our work in assessing the capabilities of commercial and open source formal tools and the ways in which they can be leveraged in digital design workflows.

  10. Verification technologies, January--February 1990

    SciTech Connect

    Staehle, G.; Stull, S.

    1990-01-01

    The purpose of this article is to summarize some of the work at the Department of Energy (DOE) national laboratories applicable to on-site inspections (OSIs) to assist in the verification of arms-reduction treaties covering nuclear and other armaments. Not included are technologies that would be specifically applicable to treaties placing limitations on nuclear testing or technologies normally characterized as national technical means (NTM).

  11. Secure Image Hash Comparison for Warhead Verification

    SciTech Connect

    Bruillard, Paul J.; Jarman, Kenneth D.; Robinson, Sean M.

    2014-06-06

    The effort to inspect and verify warheads in the context of possible future arms control treaties is rife with security and implementation issues. In this paper we review prior work on perceptual image hashing for template-based warhead verification. Furthermore, we formalize the notion of perceptual hashes and demonstrate that large classes of such functions are likely not cryptographically secure. We close with a brief discussion of fully homomorphic encryption as an alternative technique.
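
    As context for why perceptual hashes are generally not cryptographically secure, the sketch below shows a generic "average hash" (not the paper's construction): perceptually close images map to hashes that are close in Hamming distance, which is exactly the locality property a cryptographic hash must not have.

      # Minimal average-hash sketch over a tiny grayscale "image".
      def average_hash(image):
          # Bit i is 1 if pixel i is brighter than the image mean, so
          # nearby images produce nearby hashes.
          pixels = [p for row in image for p in row]
          mean = sum(pixels) / len(pixels)
          return [1 if p > mean else 0 for p in pixels]

      def hamming(h1, h2):
          return sum(a != b for a, b in zip(h1, h2))

      a = [[10, 200], [30, 180]]
      b = [[12, 198], [28, 185]]  # slightly perturbed version of `a`
      print(hamming(average_hash(a), average_hash(b)))  # 0 -> "same" image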

  12. Dynamic analysis for shuttle design verification

    NASA Technical Reports Server (NTRS)

    Fralich, R. W.; Green, C. E.; Rheinfurth, M. H.

    1972-01-01

    Two approaches that are used for determining the modes and frequencies of space shuttle structures are discussed. The first method, direct numerical analysis, involves finite element mathematical modeling of the space shuttle structure in order to use computer programs for dynamic structural analysis. The second method utilizes modal-coupling techniques for experimental verification, in which only spacecraft components are vibrated and the modes and frequencies of the complete vehicle are deduced from the results obtained in the component tests.

  13. Systematic physical verification with topological patterns

    NASA Astrophysics Data System (ADS)

    Dai, Vito; Lai, Ya-Chieh; Gennari, Frank; Teoh, Edward; Capodieci, Luigi

    2014-03-01

    Design rule checks (DRC) are the industry workhorse for constraining design to ensure both physical and electrical manufacturability. Where DRCs fail to fully capture the concept of manufacturability, pattern-based approaches, such as DRC Plus, fill the gap using a library of patterns to capture and identify problematic 2D configurations. Today, both a DRC deck and a pattern matching deck may be found in advanced node process development kits. Major electronic design automation (EDA) vendors offer both DRC and pattern matching solutions for physical verification; in fact, both are frequently integrated into the same physical verification tool. In physical verification, DRCs represent dimensional constraints relating directly to process limitations. On the other hand, patterns represent the 2D placement of surrounding geometries that can introduce systematic process effects. It is possible to combine both DRCs and patterns in a single topological pattern representation. A topological pattern has two separate components: a bitmap representing the placement and alignment of polygon edges, and a vector of dimensional constraints. The topological pattern is unique and unambiguous; there is no code to write, and no two different ways to represent the same physical structure. Furthermore, markers aligned to the pattern can be generated to designate specific layout optimizations for improving manufacturability. In this paper, we describe how to do systematic physical verification with just topological patterns. Common mappings between traditional design rules and topological pattern rules are presented. We describe techniques that can be used during the development of a topological rule deck such as: taking constraints defined on one rule, and systematically projecting it onto other related rules; systematically separating a single rule into two or more rules, when the single rule is not sufficient to capture manufacturability constraints; creating test layout which
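
    A minimal sketch of the two-part representation described above, with hypothetical names and constraint values: a topology bitmap plus a vector of dimensional constraint windows, matched against a layout clip.

      from dataclasses import dataclass

      @dataclass
      class TopologicalPattern:
          bitmap: tuple      # placement/alignment of polygon edges (topology)
          constraints: dict  # name -> (min, max) dimensional window

          def matches(self, bitmap, dims):
              # A layout clip matches when its topology is identical and
              # every measured dimension falls inside its window.
              if bitmap != self.bitmap:
                  return False
              return all(lo <= dims[name] <= hi
                         for name, (lo, hi) in self.constraints.items())

      # Hypothetical tip-to-tip pattern with a 40-60 nm gap window.
      tip2tip = TopologicalPattern(bitmap=(0, 1, 0, 1),
                                   constraints={"gap": (40, 60)})
      print(tip2tip.matches((0, 1, 0, 1), {"gap": 48}))  # True
      print(tip2tip.matches((0, 1, 0, 1), {"gap": 30}))  # False: gap too small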

  14. Signal verification can promote reliable signalling.

    PubMed

    Broom, Mark; Ruxton, Graeme D; Schaefer, H Martin

    2013-11-22

    The central question in communication theory is whether communication is reliable, and if so, which mechanisms select for reliability. The primary approach in the past has been to attribute reliability to strategic costs associated with signalling as predicted by the handicap principle. Yet, reliability can arise through other mechanisms, such as signal verification; but the theoretical understanding of such mechanisms has received relatively little attention. Here, we model whether verification can lead to reliability in repeated interactions that typically characterize mutualisms. Specifically, we model whether fruit consumers that discriminate among poor- and good-quality fruits within a population can select for reliable fruit signals. In our model, plants either signal or they do not; costs associated with signalling are fixed and independent of plant quality. We find parameter combinations where discriminating fruit consumers can select for signal reliability by abandoning unprofitable plants more quickly. This self-serving behaviour imposes costs upon plants as a by-product, rendering it unprofitable for unrewarding plants to signal. Thus, strategic costs to signalling are not a prerequisite for reliable communication. We expect verification to more generally explain signal reliability in repeated consumer-resource interactions that typify mutualisms but also in antagonistic interactions such as mimicry and aposematism. PMID:24068354

  15. Tags and seals for arms control verification

    SciTech Connect

    DeVolpi, A.

    1990-09-18

    Tags and seals have long been recognized as important tools in arms control. The trend in control of armaments is to limit militarily significant equipment that is capable of being verified through direct and cooperative means, chiefly on-site inspection or monitoring. Although this paper will focus on the CFE treaty, the role of tags and seals for other treaties will also be addressed. Published technology and concepts will be reviewed, based on open sources. Arms control verification tags are defined as unique identifiers designed to be tamper-revealing; in that respect, seals are similar, being used as indicators of unauthorized access. Tamper-revealing tags might be considered as single-point markers, seals as two-point couplings, and nets as volume containment. The functions of an arms control tag can be considered to be two-fold: to provide field verification of the identity of a treaty-limited item (TLI), and to have a means of authentication of the tag and its tamper-revealing features. Authentication could take place in the field or be completed elsewhere. For CFE, the goal of tags and seals can be to reduce the overall cost of the entire verification system.

  16. Formal verification of an avionics microprocessor

    NASA Technical Reports Server (NTRS)

    Srivas, Mandayam K.; Miller, Steven P.

    1995-01-01

    Formal specification combined with mechanical verification is a promising approach for achieving the extremely high levels of assurance required of safety-critical digital systems. However, many questions remain regarding their use in practice: Can these techniques scale up to industrial systems, where are they likely to be useful, and how should industry go about incorporating them into practice? This report discusses a project undertaken to answer some of these questions, the formal verification of the AAMP5 microprocessor. This project consisted of formally specifying in the PVS language a Rockwell proprietary microprocessor at both the instruction-set and register-transfer levels and using the PVS theorem prover to show that the microcode correctly implemented the instruction-level specification for a representative subset of instructions. Notable aspects of this project include the use of a formal specification language by practicing hardware and software engineers, the integration of traditional inspections with formal specifications, and the use of a mechanical theorem prover to verify a portion of a commercial, pipelined microprocessor that was not explicitly designed for formal verification.

  17. On code verification of RANS solvers

    NASA Astrophysics Data System (ADS)

    Eça, L.; Klaij, C. M.; Vaz, G.; Hoekstra, M.; Pereira, F. S.

    2016-04-01

    This article discusses Code Verification of Reynolds-Averaged Navier Stokes (RANS) solvers that rely on face-based finite volume discretizations for volumes of arbitrary shape. The study includes test cases with known analytical solutions (generated with the method of manufactured solutions) corresponding to laminar and turbulent flow, with the latter using eddy-viscosity turbulence models. The procedure to perform Code Verification based on grid refinement studies is discussed and the requirements for its correct application are illustrated in a simple one-dimensional problem. It is shown that geometrically similar grids are recommended for proper Code Verification, so that the data exhibit no scatter and least-squares fits become unnecessary. Results show that it may be advantageous to determine the extrapolated error at cell size/time step zero instead of assuming that it is zero, especially when it is hard to determine the asymptotic order of grid convergence. In the RANS examples, several of the features of the ReFRESCO solver are checked, including the effects of the available turbulence models on the convergence properties of the code. It is shown that it is necessary to account for non-orthogonality effects in the discretization of the diffusion terms and that the turbulence quantities transport equations can deteriorate the order of grid convergence of mean flow quantities.
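
    A small sketch of the grid refinement arithmetic this kind of Code Verification relies on (generic textbook formulas, not the ReFRESCO tooling): the observed order of convergence from three geometrically similar grids with constant refinement ratio r, and a Richardson-style extrapolation of the solution to zero cell size.

      import math

      def observed_order(f_fine, f_medium, f_coarse, r):
          # p = ln((f3 - f2) / (f2 - f1)) / ln(r), with f1 on the finest grid;
          # assumes monotonic convergence in the asymptotic range.
          return math.log((f_coarse - f_medium) / (f_medium - f_fine)) / math.log(r)

      def richardson_extrapolate(f_fine, f_medium, r, p):
          # Estimate the solution at zero cell size rather than assuming
          # the extrapolated error is zero.
          return f_fine + (f_fine - f_medium) / (r**p - 1.0)

      f1, f2, f3, r = 1.001, 1.004, 1.016, 2.0    # made-up functional values
      p = observed_order(f1, f2, f3, r)
      print(p)                                     # 2.0 -> second-order convergence
      print(richardson_extrapolate(f1, f2, r, p))  # 1.0 -> exact-solution estimate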

  18. Automating engineering verification in ALMA subsystems

    NASA Astrophysics Data System (ADS)

    Ortiz, José; Castillo, Jorge

    2014-08-01

    The Atacama Large Millimeter/submillimeter Array is an interferometer comprising 66 individual high precision antennas located at over 5000 meters altitude in the north of Chile. Several complex electronic subsystems need to be meticulously tested at different stages of an antenna commissioning, both independently and when integrated together. First subsystem integration takes place at the Operations Support Facilities (OSF), at an altitude of 3000 meters. Second integration occurs at the high altitude Array Operations Site (AOS), where combined performance with the Central Local Oscillator (CLO) and Correlator is also assessed. In addition, there are several other events requiring complete or partial verification of instrument specifications compliance, such as parts replacements, calibration, relocation within the AOS, preventive maintenance, and troubleshooting due to poor performance in scientific observations. Restricted engineering time allocation and the constant pressure of minimizing downtime in a 24/7 astronomical observatory impose the need to complete (and report) the aforementioned verifications in the least possible time. Array-wide disturbances, such as global power interruptions and the subsequent recovery, generate the added challenge of executing this checkout on multiple antenna elements at once. This paper presents the outcome of automating the setup, execution, notification, and reporting of engineering verification in ALMA, and how these efforts have resulted in a dramatic reduction of both the time and the operator training required. Signal Path Connectivity (SPC) checkout is introduced as a notable case of such automation.

  19. Trajectory Based Behavior Analysis for User Verification

    NASA Astrophysics Data System (ADS)

    Pao, Hsing-Kuo; Lin, Hong-Yi; Chen, Kuan-Ta; Fadlil, Junaidillah

    Many of our activities on computers need a verification step for authorized access. The goal of verification is to tell apart the true account owner from intruders. We propose a general approach for user verification based on user trajectory inputs. The approach is labor-free for users and is likely to avoid possible copying or simulation by other non-authorized users or even automatic programs like bots. Our study focuses on finding the hidden patterns embedded in the trajectories produced by account users. We employ a Markov chain model with Gaussian distributions in its transitions to describe the behavior in the trajectory. To distinguish between two trajectories, we propose a novel dissimilarity measure combined with a manifold-learnt tuning for capturing the pairwise relationship. Based on the pairwise relationship, we can plug in any effective classification or clustering method for the detection of unauthorized access. The method can also be applied to the task of recognition, predicting the trajectory type when no identity is given. Given a trajectory input, the results show that the proposed method can accurately verify the user identity, or suggest who owns the trajectory if the input identity is not provided.
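
    As a toy illustration of the modeling idea (invented parameters; the paper's manifold-learnt dissimilarity is not reproduced), the sketch below scores a trajectory under a Markov chain whose transitions are Gaussian displacements; a genuine user's smooth movements score higher than a bot's large uniform jumps.

      import math

      def step_logpdf(dx, dy, sigma):
          # Log-density of one 2D displacement under an isotropic Gaussian.
          return (-(dx * dx + dy * dy) / (2.0 * sigma * sigma)
                  - math.log(2.0 * math.pi * sigma * sigma))

      def trajectory_score(points, sigma=5.0):
          # Average log-likelihood of the displacement sequence; higher
          # means the trajectory is more typical of the modeled user.
          steps = [(x2 - x1, y2 - y1)
                   for (x1, y1), (x2, y2) in zip(points, points[1:])]
          return sum(step_logpdf(dx, dy, sigma) for dx, dy in steps) / len(steps)

      genuine = [(0, 0), (3, 2), (7, 4), (10, 7)]     # small, smooth steps
      bot     = [(0, 0), (40, 0), (80, 0), (120, 0)]  # large, uniform jumps
      print(trajectory_score(genuine) > trajectory_score(bot))  # True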

  20. Signal verification can promote reliable signalling

    PubMed Central

    Broom, Mark; Ruxton, Graeme D.; Schaefer, H. Martin

    2013-01-01

    The central question in communication theory is whether communication is reliable, and if so, which mechanisms select for reliability. The primary approach in the past has been to attribute reliability to strategic costs associated with signalling as predicted by the handicap principle. Yet, reliability can arise through other mechanisms, such as signal verification; but the theoretical understanding of such mechanisms has received relatively little attention. Here, we model whether verification can lead to reliability in repeated interactions that typically characterize mutualisms. Specifically, we model whether fruit consumers that discriminate among poor- and good-quality fruits within a population can select for reliable fruit signals. In our model, plants either signal or they do not; costs associated with signalling are fixed and independent of plant quality. We find parameter combinations where discriminating fruit consumers can select for signal reliability by abandoning unprofitable plants more quickly. This self-serving behaviour imposes costs upon plants as a by-product, rendering it unprofitable for unrewarding plants to signal. Thus, strategic costs to signalling are not a prerequisite for reliable communication. We expect verification to more generally explain signal reliability in repeated consumer–resource interactions that typify mutualisms but also in antagonistic interactions such as mimicry and aposematism. PMID:24068354

  1. Verification of micro-beam irradiation

    NASA Astrophysics Data System (ADS)

    Li, Qiongge; Juang, Titania; Beth, Rachel; Chang, Sha; Oldham, Mark

    2015-01-01

    Micro-beam Radiation Therapy (MRT) is an experimental radiation therapy with provocative experimental data indicating potential for improved efficacy in some diseases. Here we demonstrate a comprehensive micro-beam verification method utilizing high resolution (50 μm) PRESAGE/Micro-Optical-CT 3D dosimetry. A small cylindrical PRESAGE dosimeter was irradiated by a novel compact Carbon-Nano-Tube (CNT) field emission based MRT system. The Percentage Depth Dose (PDD), Peak-to-Valley Dose Ratio (PVDR), and beam width (FWHM) data were obtained and analyzed from a three-strip radiation experiment. This verification revealed a fast dose drop-off with depth, a beam width preserved with depth (the FWHM averaged across the three beams remains constant at 405.3 μm, σ = 13.2 μm, between depths of 3.0 and 14.0 mm), and a high PVDR value (increasing with depth from 6.3 at 3.0 mm to 8.6 at 14.0 mm). Operating procedures such as precise dosimeter mounting, robust mechanical motions (especially rotation), and stray-light artifact management were optimized and developed to achieve a more accurate dosimetric verification method.
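
    The reported metrics are straightforward to compute from a sampled dose profile; the sketch below extracts PVDR and FWHM from a synthetic lateral profile (illustrative values, not the MRT data above).

      def pvdr(profile, peak_idx, valley_idx):
          # Peak-to-Valley Dose Ratio from sampled doses.
          return profile[peak_idx] / profile[valley_idx]

      def fwhm(positions, profile):
          # Full width at half maximum, by scanning for half-height samples.
          half = max(profile) / 2.0
          above = [x for x, d in zip(positions, profile) if d >= half]
          return above[-1] - above[0]

      x = [0, 100, 200, 300, 400, 500, 600]    # position in microns
      d = [1.0, 1.2, 8.0, 9.6, 8.2, 1.1, 1.0]  # relative dose
      print(pvdr(d, peak_idx=3, valley_idx=0))  # 9.6
      print(fwhm(x, d))                          # 200 (microns)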

  2. Verification in Referral-Based Crowdsourcing

    PubMed Central

    Naroditskiy, Victor; Rahwan, Iyad; Cebrian, Manuel; Jennings, Nicholas R.

    2012-01-01

    Online social networks offer unprecedented potential for rallying a large number of people to accomplish a given task. Here we focus on information gathering tasks where rare information is sought through “referral-based crowdsourcing”: the information request is propagated recursively through invitations among members of a social network. Whereas previous work analyzed incentives for the referral process in a setting with only correct reports, misreporting is known to be both pervasive in crowdsourcing applications, and difficult/costly to filter out. A motivating example for our work is the DARPA Red Balloon Challenge where the level of misreporting was very high. In order to undertake a formal study of verification, we introduce a model where agents can exert costly effort to perform verification and false reports can be penalized. This is the first model of verification and it provides many directions for future research, which we point out. Our main theoretical result is the compensation scheme that minimizes the cost of retrieving the correct answer. Notably, this optimal compensation scheme coincides with the winning strategy of the Red Balloon Challenge. PMID:23071530
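
    The winning Red Balloon Challenge strategy, which the abstract notes coincides with the paper's optimal compensation scheme, is a geometric split contract: the finder receives half of a fixed node budget, the person who recruited the finder a quarter, and so on up the referral chain. A sketch with an invented budget:

      def referral_payments(budget, chain_length):
          # Payments from the reporter up to the root recruiter; each
          # level receives half of the level below it.
          return [budget / 2.0**k for k in range(1, chain_length + 1)]

      print(referral_payments(4000, 3))               # [2000.0, 1000.0, 500.0]
      print(sum(referral_payments(4000, 10)) < 4000)  # True: stays within budget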

  3. SMAP Verification and Validation Project - Final Report

    NASA Technical Reports Server (NTRS)

    Murry, Michael

    2012-01-01

    In 2007, the National Research Council (NRC) released the Decadal Survey of Earth science. The survey identified 15 new space missions of significant scientific and application value for the National Aeronautics and Space Administration (NASA) to undertake in the coming decade. One of these missions was the Soil Moisture Active Passive (SMAP) mission, which NASA assigned to the Jet Propulsion Laboratory (JPL) in 2008. The goal of SMAP is to provide global, high resolution mapping of soil moisture and its freeze/thaw states. The SMAP project recently passed its Critical Design Review and is proceeding with its fabrication and testing phase. Verification and Validation (V&V) is widely recognized as a critical component in systems engineering and is vital to the success of any space mission. V&V is a process used to check that a system meets its design requirements and specifications in order to fulfill its intended purpose. Verification often refers to the question "Have we built the system right?" whereas Validation asks "Have we built the right system?" Currently the SMAP V&V team is verifying design requirements through inspection, demonstration, analysis, or testing. An example of the SMAP V&V process is the verification of the antenna pointing accuracy with mathematical models, since it is not possible to provide the appropriate micro-gravity environment for testing the antenna on Earth before launch.

  4. Runtime Verification of C Memory Safety

    NASA Astrophysics Data System (ADS)

    Roşu, Grigore; Schulte, Wolfram; Şerbănuţă, Traian Florin

    C is the most widely used imperative systems implementation language. While C provides types and high-level abstractions, its design goal has been to provide the highest performance, which often requires low-level access to memory. As a consequence, C supports arbitrary pointer arithmetic, casting, and explicit allocation and deallocation. These operations are difficult to use correctly, resulting in programs that often have software bugs like buffer overflows and dangling pointers that cause security vulnerabilities. We say a C program is memory safe if at runtime it never goes wrong with such a memory access error. Based on standards for writing “good” C code, this paper proposes strong memory safety as the least restrictive formal definition of memory safety amenable to runtime verification. We show that although verification of memory safety is in general undecidable, even when restricted to closed, terminating programs, runtime verification of strong memory safety is a decision procedure for this class of programs. We verify strong memory safety of a program by executing the program using a symbolic, deterministic definition of the dynamic semantics. A prototype implementation of these ideas shows the feasibility of this approach.
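
    A toy, interpreter-level illustration of the runtime check being formalized (far simpler than the paper's symbolic dynamic semantics): every access is validated against the set of live allocations, so out-of-bounds and dangling accesses fail immediately.

      class SafeHeap:
          def __init__(self):
              self.live = {}          # base address -> size of allocation
              self.next_addr = 0x1000

          def malloc(self, size):
              base = self.next_addr
              self.live[base] = size
              self.next_addr += size
              return base

          def free(self, base):
              if base not in self.live:
                  raise RuntimeError("invalid or double free")
              del self.live[base]

          def access(self, addr):
              # Raise on out-of-bounds or dangling access (strong safety).
              for base, size in self.live.items():
                  if base <= addr < base + size:
                      return
              raise RuntimeError(f"unsafe access at {hex(addr)}")

      h = SafeHeap()
      p = h.malloc(8)
      h.access(p + 7)   # fine: inside the allocation
      h.free(p)
      # h.access(p)     # would raise: dangling-pointer access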

  5. Verification in referral-based crowdsourcing.

    PubMed

    Naroditskiy, Victor; Rahwan, Iyad; Cebrian, Manuel; Jennings, Nicholas R

    2012-01-01

    Online social networks offer unprecedented potential for rallying a large number of people to accomplish a given task. Here we focus on information gathering tasks where rare information is sought through "referral-based crowdsourcing": the information request is propagated recursively through invitations among members of a social network. Whereas previous work analyzed incentives for the referral process in a setting with only correct reports, misreporting is known to be both pervasive in crowdsourcing applications, and difficult/costly to filter out. A motivating example for our work is the DARPA Red Balloon Challenge where the level of misreporting was very high. In order to undertake a formal study of verification, we introduce a model where agents can exert costly effort to perform verification and false reports can be penalized. This is the first model of verification and it provides many directions for future research, which we point out. Our main theoretical result is the compensation scheme that minimizes the cost of retrieving the correct answer. Notably, this optimal compensation scheme coincides with the winning strategy of the Red Balloon Challenge. PMID:23071530

  6. Speaker Verification Using Subword Neural Tree Networks.

    NASA Astrophysics Data System (ADS)

    Liou, Han-Sheng

    1995-01-01

    In this dissertation, a new neural-network-based algorithm for text-dependent speaker verification is presented. The algorithm uses a set of concatenated Neural Tree Networks (NTN's) trained on subword units to model a password. In contrast to the conventional stochastic approaches, which model the subword units by Hidden Markov Models (HMM's), the new approach utilizes a discriminative training scheme to train a NTN for each subword unit. Two types of subword unit are investigated: phone-like units (PLU's) and HMM state-based units (HSU's). The training of the models includes the following steps. The training utterances of a password are first segmented into subword units using a HMM-based segmentation method. A NTN is then trained for each subword unit. In order to retrieve the temporal information, which is relatively important in text-dependent speaker verification, the proposed paradigm integrates the discriminatory ability of the NTN with the temporal models of the HMM. A new scoring method using phonetic weighting to improve the speaker verification performance is also introduced. The proposed algorithms are evaluated by experiments on a TI isolated-word database, the YOHO database, and several hundred utterances collected over a telephone channel. Performance improvements are obtained over conventional techniques.
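
    A toy sketch of the final scoring step described above, with invented scores and weights (the NTN training and HMM-based segmentation are not reproduced): per-subword scores are combined by phonetic weighting and the weighted average is thresholded.

      def password_score(subword_scores, phonetic_weights):
          # Weighted average of subword verification scores; higher weights
          # go to subwords that discriminate speakers better.
          total_w = sum(phonetic_weights)
          return sum(s * w for s, w in zip(subword_scores, phonetic_weights)) / total_w

      scores  = [0.91, 0.40, 0.85]  # NTN output per subword unit
      weights = [1.0, 0.3, 1.2]     # phonetically informative units count more
      print(password_score(scores, weights) > 0.7)  # True -> accept claimant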

  7. Evaluation of the 29-km Eta Model. Part 1; Objective Verification at Three Selected Stations

    NASA Technical Reports Server (NTRS)

    Nutter, Paul A.; Manobianco, John; Merceret, Francis J. (Technical Monitor)

    1998-01-01

    This paper describes an objective verification of the National Centers for Environmental Prediction (NCEP) 29-km eta model from May 1996 through January 1998. The evaluation was designed to assess the model's surface and upper-air point forecast accuracy at three selected locations during separate warm (May - August) and cool (October - January) season periods. In order to enhance sample sizes available for statistical calculations, the objective verification includes two consecutive warm and cool season periods. Systematic model deficiencies comprise the larger portion of the total error in most of the surface forecast variables that were evaluated. The error characteristics for both surface and upper-air forecasts vary widely by parameter, season, and station location. At upper levels, a few characteristic biases are identified. Overall however, the upper-level errors are more nonsystematic in nature and could be explained partly by observational measurement uncertainty. With a few exceptions, the upper-air results also indicate that 24-h model error growth is not statistically significant. In February and August 1997, NCEP implemented upgrades to the eta model's physical parameterizations that were designed to change some of the model's error characteristics near the surface. The results shown in this paper indicate that these upgrades led to identifiable and statistically significant changes in forecast accuracy for selected surface parameters. While some of the changes were expected, others were not consistent with the intent of the model updates and further emphasize the need for ongoing sensitivity studies and localized statistical verification efforts. Objective verification of point forecasts is a stringent measure of model performance, but when used alone, is not enough to quantify the overall value that model guidance may add to the forecast process. Therefore, results from a subjective verification of the meso-eta model over the Florida peninsula are

  8. Verification of a Viscous Computational Aeroacoustics Code Using External Verification Analysis

    NASA Technical Reports Server (NTRS)

    Ingraham, Daniel; Hixon, Ray

    2015-01-01

    The External Verification Analysis approach to code verification is extended to solve the three-dimensional Navier-Stokes equations with constant properties, and is used to verify a high-order computational aeroacoustics (CAA) code. After a brief review of the relevant literature, the details of the EVA approach are presented and compared to the similar Method of Manufactured Solutions (MMS). Pseudocode representations of EVA's algorithms are included, along with the recurrence relations needed to construct the EVA solution. The code verification results show that EVA was able to convincingly verify a high-order, viscous CAA code without the addition of MMS-style source terms, or any other modifications to the code.
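
    For contrast with EVA, the Method of Manufactured Solutions mentioned above can be sketched in a few lines: choose a smooth solution, apply the governing operator symbolically, and add the residual to the code under test as a source term. A 1D advection-diffusion example using sympy (not the CAA equations of the paper):

      import sympy as sp

      x, t, a, nu = sp.symbols("x t a nu")
      u = sp.sin(x - t)                        # manufactured solution
      # Residual of u_t + a*u_x - nu*u_xx = 0 becomes the MMS source term.
      source = sp.simplify(sp.diff(u, t) + a * sp.diff(u, x) - nu * sp.diff(u, x, 2))
      print(source)  # mathematically (a - 1)*cos(x - t) + nu*sin(x - t)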

  10. Delivery verification and dose reconstruction in tomotherapy

    NASA Astrophysics Data System (ADS)

    Kapatoes, Jeffrey Michael

    2000-11-01

    It has long been a desire in photon-beam radiation therapy to make use of the significant fraction of the beam exiting the patient to infer how much of the beam energy was actually deposited in the patient. With a linear accelerator and corresponding exit detector mounted on the same ring gantry, tomotherapy provides a unique opportunity to accomplish this. Dose reconstruction describes the process in which the full three-dimensional dose actually deposited in a patient is computed. Dose reconstruction requires two inputs: an image of the patient at the time of treatment and the actual energy fluence delivered. Dose is reconstructed by computing the dose in the CT with the verified energy fluence using any model-based algorithm such as convolution/superposition or Monte Carlo. In tomotherapy, the CT at the time of treatment is obtained by megavoltage CT, the merits of which have been studied and proven. The actual energy fluence delivered to the patient is computed in a process called delivery verification. Methods for delivery verification and dose reconstruction in tomotherapy were investigated in this work. It is shown that delivery verification can be realized by a linear model of the tomotherapy system. However, due to the measurements required with this initial approach, clinical implementation would be difficult. Therefore, a clinically viable method for delivery verification was established, the details of which are discussed. With the verified energy fluence from delivery verification, an assessment of the accuracy and usefulness of dose reconstruction is performed. The latter two topics are presented in the context of a generalized dose comparison tool developed for intensity modulated radiation therapy. Finally, the importance of having a CT from the time of treatment for reconstructing the dose is shown. This is currently a point of contention in modern clinical radiotherapy and it is proven that using the incorrect CT for dose reconstruction can lead
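
    The linear-model idea behind delivery verification can be sketched abstractly (illustrative matrices, not the dissertation's formulation): if the exit-detector signal is modeled as d = A f for energy fluence f, the delivered fluence is recovered by least squares.

      import numpy as np

      A = np.array([[0.9, 0.1, 0.0],
                    [0.1, 0.8, 0.1],
                    [0.0, 0.1, 0.9]])   # detector response per fluence bin
      f_true = np.array([1.0, 2.0, 0.5])
      d = A @ f_true                    # simulated exit-detector signal
      f_verified, *_ = np.linalg.lstsq(A, d, rcond=None)
      print(np.allclose(f_verified, f_true))  # True -> fluence recovered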

  11. Thoughts on Verification of Nuclear Disarmament

    SciTech Connect

    Dunlop, W H

    2007-09-26

    It is my pleasure to be here today to participate in this Conference. My thanks to the organizers for preparing such an interesting agenda on a very difficult topic. My effort in preparing my presentation was performed under the auspices of the U.S. Department of Energy by the University of California, Lawrence Livermore National Laboratory under Contract W-7405-Eng-48. And as many of you know, Lawrence Livermore National Laboratory is now, as of Oct 1st, under contract to the Lawrence Livermore National Security LLC. There has been a long history of how to view verification of arms control agreements. The basis for verification during the days of SALT was that verification would rely on each country's national technical means. For treaties dealing with strategic missiles this worked well, as the individual items subject to verification were of such a size that they were visible to the National Technical Means available at the time. And it was felt that the counting of missiles and launchers could be verified by our National Technical Means. For nuclear testing treaties, the use of seismic measurements developed into a capability that was reasonably robust for all but the smallest of nuclear tests. However, once we had the Threshold Test Ban Treaty, there was a significant problem in that the fidelity of the measurements was not sufficient to determine if a test was slightly above the 150 kt limit or slightly below the 150 kt limit. This led some in the US to believe that the Soviet Union was not living up to the TTBT agreement. An on-site verification protocol was negotiated in 1988 and 1989 that allowed the US to make hydrodynamic yield measurements on Soviet tests above 50 kt yield and regional seismic measurements on all tests above 35 kt of yield, and the Soviets to make the same type of measurements on US tests to ensure that they were not over 150 kt. These on-site measurements were considered reasonably intrusive. Again the measurement capability was not

  12. Electron photon verification calculations using MCNP4B

    SciTech Connect

    Gierga, D.P.; Adams, K.J.

    1998-07-01

    MCNP4B was released in February 1997 with significant enhancements to electron/photon transport methods. These enhancements have been verified against a wide range of published electron/photon experiments, spanning from high energy bremsstrahlung production to electron transmission and reflection. Three sets of bremsstrahlung experiments were simulated. The first verification calculations for bremsstrahlung production used the experimental results of Faddegon for 15 MeV electrons incident on lead, aluminum, and beryllium targets. The calculated integrated bremsstrahlung yields, the bremsstrahlung energy spectra, and the mean energy of the bremsstrahlung beam were compared with experiment. The impact of several MCNP tally options and physics parameters was explored in detail. The second was the experiment of O'Dell, which measured the bremsstrahlung spectra from 10 and 20.9 MeV electrons incident on a gold/tungsten target. The final set was a comparison of relative experimental spectra with calculated results for 9.66 MeV electrons incident on tungsten based on the experiment of Starfelt and Koch. The transmission experiments of Ebert were also studied, including comparisons of transmission coefficients for 10.2 MeV electrons incident on carbon, silver, and uranium foils. The agreement between experiment and simulation was usually within two standard deviations of the experimental and calculational errors.

  13. Simulated Order Verification and Medication Reconciliation during an Introductory Pharmacy Practice Experience

    PubMed Central

    Chesson, Melissa M.; Momary, Kathryn M.

    2015-01-01

    Objective. To create, implement, and assess a simulated medication reconciliation and an order verification activity using hospital training software. Design. A simulated patient with medication orders and home medications was built into existing hospital training software. Students in an institutional introductory pharmacy practice experience (IPPE) reconciled the patient’s medications and determined whether or not to verify the inpatient orders based on his medical history and laboratory data. After reconciliation, students identified medication discrepancies and documented their rationale for rejecting inpatient orders. Assessment. For a 3-year period, the majority of students agreed the simulation enhanced their learning, taught valuable clinical decision-making skills, integrated material from previous courses, and stimulated their interest in institutional pharmacy. Overall feedback from student evaluations about the IPPE also was favorable. Conclusion. Use of existing hospital training software can affordably simulate the pharmacist’s role in order verification and medication reconciliation, as well as improve clinical decision-making. PMID:27168609

  14. METCAN updates for high temperature composite behavior: Simulation/verification

    NASA Technical Reports Server (NTRS)

    Lee, H.-J.; Murthy, P. L. N.; Chamis, Christos C.

    1991-01-01

    This report updates the continued verification (comparisons with experimental data) of the METCAN (Metal Matrix Composite Analyzer) computer code. Verification includes comparisons at room and high temperatures for two composites, SiC/Ti-15-3 and SiC/Ti-6-4. Specifically, verification of the SiC/Ti-15-3 composite includes comparisons of strength, modulus, and Poisson's ratio as well as stress-strain curves for four laminates at room temperature. High temperature verification includes comparisons of strength and stress-strain curves for two laminates. Verification of SiC/Ti-6-4 covers a transverse room temperature stress-strain curve and comparisons of transverse strength at three temperatures. The results of the verification indicate that METCAN can be used with confidence to simulate the high temperature nonlinear behavior of metal matrix composites.

  15. Automatic Verification of Timing Constraints for Safety Critical Space Systems

    NASA Astrophysics Data System (ADS)

    Fernandez, Javier; Parra, Pablo; Sanchez Prieto, Sebastian; Polo, Oscar; Bernat, Guillem

    2015-09-01

    This paper presents an automatic verification process, focused on the verification of scheduling analysis parameters. The proposal is part of a process, based on Model Driven Engineering, that automates the Verification and Validation of software on board satellites. The process has been implemented in the software control unit of an energetic particle detector that is a payload of the Solar Orbiter mission. From the design model, a scheduling analysis model and its verification model are generated. The verification is defined as constraints in the form of finite timed automata. When the system is deployed on the target, verification evidence is extracted at instrumented points. The constraints are fed with this evidence; if any constraint is not satisfied by the on-target evidence, the scheduling analysis is not valid.
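
    A hypothetical sketch of the final checking step (task names and limits invented, and the timed-automata machinery reduced to simple bound checks): measured execution times extracted from instrumented points are tested against per-task WCET and deadline constraints, and the scheduling analysis is accepted only if no observation violates them.

      def check_constraints(evidence, constraints):
          # evidence: task -> list of measured execution times (ms).
          # constraints: task -> (wcet_ms, deadline_ms). The analysis is
          # valid only if every observation respects both bounds.
          violations = []
          for task, times in evidence.items():
              wcet, deadline = constraints[task]
              for tm in times:
                  if tm > wcet or tm > deadline:
                      violations.append((task, tm))
          return violations

      constraints = {"acquire": (2.0, 10.0), "process": (6.0, 20.0)}
      evidence = {"acquire": [1.2, 1.9], "process": [5.5, 6.4]}
      print(check_constraints(evidence, constraints))  # [('process', 6.4)]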

  16. Verification and Validation Studies for the LAVA CFD Solver

    NASA Technical Reports Server (NTRS)

    Moini-Yekta, Shayan; Barad, Michael F.; Sozer, Emre; Brehm, Christoph; Housman, Jeffrey A.; Kiris, Cetin C.

    2013-01-01

    The verification and validation of the Launch Ascent and Vehicle Aerodynamics (LAVA) computational fluid dynamics (CFD) solver is presented. A modern strategy for verification and validation is described incorporating verification tests, validation benchmarks, continuous integration and version control methods for automated testing in a collaborative development environment. The purpose of the approach is to integrate the verification and validation process into the development of the solver and improve productivity. This paper uses the Method of Manufactured Solutions (MMS) for the verification of 2D Euler equations, 3D Navier-Stokes equations as well as turbulence models. A method for systematic refinement of unstructured grids is also presented. Verification using inviscid vortex propagation and flow over a flat plate is highlighted. Simulation results using laminar and turbulent flow past a NACA 0012 airfoil and ONERA M6 wing are validated against experimental and numerical data.

  17. Verification of post permanently manned configuration Space Station elements

    NASA Technical Reports Server (NTRS)

    Scully, E. J.; Edwards, M. D.

    1986-01-01

    An account is given of the techniques and ground systems designed to fulfill post permanently manned configuration (PMC) Space Station verification tasks. Consideration is given to analysis using computer math models and computer-aided interface verification systems, testing using simulators and interface fixtures, and special inspection. It is noted that an initial Space Station design that accommodates and facilitates verification is crucial to an effective verification program, as are proper instrumentation, built-in test capability, and a precise configuration management, control, and record system. It is concluded that post-PMC verification should be accounted for both in the initial Space Station design and in the subsequent development of initial assembly flight verification techniques and capabilities.

  18. NEMVP: North American energy measurement and verification protocol

    SciTech Connect

    1996-03-01

    This measurement and verification protocol discusses procedures that, when implemented, allow buyers, sellers, and financiers of energy projects to quantify energy conservation measure performance and savings.

  19. 19 CFR 181.74 - Verification visit procedures.

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... THE TREASURY (CONTINUED) NORTH AMERICAN FREE TRADE AGREEMENT Origin Verifications and Determinations... Canadian or Mexican producer of a material, to maintain records or provide access to such records...

  20. Validation, Verification and Evaluation of Visualization Software: Position Statement

    NASA Technical Reports Server (NTRS)

    Globus, Al; Kutler, Paul (Technical Monitor)

    1998-01-01

    Visualization software needs rigorous verification in the form of much better testing, and experiments with human subjects are essential to scientifically validate and evaluate visualization techniques.

  1. Application of Annotated Logic Program to Pipeline Process Safety Verification

    NASA Astrophysics Data System (ADS)

    Nakamatsu, Kazumi

    We have developed a paraconsistent annotated logic program called Extended Vector Annotated Logic Program with Strong Negation (abbr. EVALPSN), which can deal with defeasible deontic reasoning and contradiction, and we have already applied it to safety verification and control such as railway interlocking safety verification and traffic signal control. In this paper, we introduce process safety verification as an application of EVALPSN with a small example for brewery pipeline valve control. The safety verification control is based on EVALPSN defeasible deontic reasoning to avoid unexpected mix of different sorts of liquid in pipeline networks.

  2. Verification issues for rule-based expert systems

    NASA Technical Reports Server (NTRS)

    Culbert, Chris; Riley, Gary; Savely, Robert T.

    1987-01-01

    Verification and validation of expert systems is very important for the future success of this technology. Software will never be used in non-trivial applications unless the program developers can assure both users and managers that the software is reliable and generally free from error. Therefore, verification and validation of expert systems must be done. The primary hindrance to effective verification and validation is the use of methodologies which do not produce testable requirements. An extension of the flight technique panels used in previous NASA programs should provide both documented requirements and very high levels of verification for expert systems.

  3. Evaluation of Mesoscale Model Phenomenological Verification Techniques

    NASA Technical Reports Server (NTRS)

    Lambert, Winifred

    2006-01-01

    Forecasters at the Spaceflight Meteorology Group, 45th Weather Squadron, and National Weather Service in Melbourne, FL use mesoscale numerical weather prediction model output in creating their operational forecasts. These models aid in forecasting weather phenomena that could compromise the safety of launch, landing, and daily ground operations and must produce reasonable weather forecasts in order for their output to be useful in operations. Considering the importance of model forecasts to operations, their accuracy in forecasting critical weather phenomena must be verified to determine their usefulness. The currently-used traditional verification techniques involve an objective point-by-point comparison of model output and observations valid at the same time and location. The resulting statistics can unfairly penalize high-resolution models that make realistic forecasts of certain phenomena but are offset from the observations by small time and/or space increments. Manual subjective verification can provide a more valid representation of model performance, but is time-consuming and prone to personal biases. An objective technique that verifies specific meteorological phenomena, much in the way a human would in a subjective evaluation, would likely produce a more realistic assessment of model performance. Such techniques are being developed in the research community. The Applied Meteorology Unit (AMU) was tasked to conduct a literature search to identify phenomenological verification techniques being developed, determine if any are ready to use operationally, and outline the steps needed to implement any operationally-ready techniques into the Advanced Weather Information Processing System (AWIPS). The AMU conducted a search of all literature on the topic of phenomenological-based mesoscale model verification techniques and found 10 different techniques in various stages of development. Six of the techniques were developed to verify precipitation forecasts, one

  4. VEG-01: Veggie Hardware Verification Testing

    NASA Technical Reports Server (NTRS)

    Massa, Gioia; Newsham, Gary; Hummerick, Mary; Morrow, Robert; Wheeler, Raymond

    2013-01-01

    The Veggie plant/vegetable production system is scheduled to fly on ISS at the end of 2013. Since much of the technology associated with Veggie has not been previously tested in microgravity, a hardware validation flight was initiated. This test will allow data to be collected about Veggie hardware functionality on ISS, allow crew interactions to be vetted for future improvements, validate the ability of the hardware to grow and sustain plants, and collect data that will be helpful to future Veggie investigators as they develop their payloads. Additionally, food safety data on the lettuce plants grown will be collected to help support the development of a pathway for the crew to safely consume produce grown on orbit. Significant background research has been performed on the Veggie plant growth system, with early tests focusing on the development of the rooting pillow concept, and the selection of fertilizer, rooting medium and plant species. More recent testing has been conducted to integrate the pillow concept into the Veggie hardware and to ensure that adequate water is provided throughout the growth cycle. Seed sanitation protocols have been established for flight, and hardware sanitation between experiments has been studied. Methods for shipping and storage of rooting pillows and the development of crew procedures and crew training videos for plant activities on-orbit have been established. Science verification testing was conducted and lettuce plants were successfully grown in prototype Veggie hardware, microbial samples were taken, plants were harvested, frozen, stored and later analyzed for microbial growth, nutrients, and ATP levels. An additional verification test, prior to the final payload verification testing, is desired to demonstrate similar growth in the flight hardware and also to test a second set of pillows containing zinnia seeds. Issues with root mat water supply are being resolved, with final testing and flight scheduled for later in 2013.

  5. Aerospace Nickel-cadmium Cell Verification

    NASA Technical Reports Server (NTRS)

    Manzo, Michelle A.; Strawn, D. Michael; Hall, Stephen W.

    2001-01-01

    During the early years of satellites, NASA successfully flew "NASA-Standard" nickel-cadmium (Ni-Cd) cells manufactured by GE/Gates/SAFT on a variety of spacecraft. In 1992 a NASA Battery Review Board determined that the strategy of a NASA Standard Cell and Battery Specification and the accompanying NASA control of a standard manufacturing control document (MCD) for Ni-Cd cells and batteries was unwarranted. As a result of that determination, standards were abandoned and the use of cells other than the NASA Standard was required. In order to gain insight into the performance and characteristics of the various aerospace Ni-Cd products available, tasks were initiated within the NASA Aerospace Flight Battery Systems Program that involved the procurement and testing of representative aerospace Ni-Cd cell designs. A standard set of test conditions was established in order to provide similar information about the products from various vendors. The objective of this testing was to provide independent verification of representative commercial flight cells available in the marketplace today. This paper will provide a summary of the verification tests run on cells from various manufacturers: Sanyo 35 Ampere-hour (Ah) standard and 35 Ah advanced Ni-Cd cells, SAFT 50 Ah Ni-Cd cells, and Eagle-Picher 21 Ah Magnum and 21 Ah Super Ni-Cd(TM) cells were put through a full evaluation. A limited number of 18 and 55 Ah cells from Acme Electric were also tested to provide an initial evaluation of the Acme aerospace cell designs. Additionally, 35 Ah aerospace design Ni-MH cells from Sanyo were evaluated under the standard conditions established for this program. The test program is essentially complete. The cell design parameters, the verification test plan, and the details of the test results will be discussed.

  6. ENVIRONMENTAL TECHNOLOGY VERIFICATION COATINGS AND COATINGS EQUIPMENT PROGRAM, UV-CURABLE COATINGS GENERIC VERIFICATION PROTOCOL

    EPA Science Inventory

    The primary purpose of this document is to establish the Generic Verification Protocol (GVP) for ultraviolet (UV)-curable coatings, to be referred to as the UV-Curable Coatings GVP. The secondary purpose is to establish the generic format and guidelines for product specific Tes...

  7. Industrial methodology for process verification in research (IMPROVER): toward systems biology verification

    PubMed Central

    Meyer, Pablo; Hoeng, Julia; Rice, J. Jeremy; Norel, Raquel; Sprengel, Jörg; Stolle, Katrin; Bonk, Thomas; Corthesy, Stephanie; Royyuru, Ajay; Peitsch, Manuel C.; Stolovitzky, Gustavo

    2012-01-01

    Motivation: Analyses and algorithmic predictions based on high-throughput data are essential for the success of systems biology in academic and industrial settings. Organizations, such as companies and academic consortia, conduct large multi-year scientific studies that entail the collection and analysis of thousands of individual experiments, often over many physical sites and with internal and outsourced components. To extract maximum value, the interested parties need to verify the accuracy and reproducibility of data and methods before the initiation of such large multi-year studies. However, systematic and well-established verification procedures do not exist for automated collection and analysis workflows in systems biology, which could lead to inaccurate conclusions. Results: We present here a review of the current state of systems biology verification and a detailed methodology to address its shortcomings. This methodology, named ‘Industrial Methodology for Process Verification in Research’ or IMPROVER, consists of evaluating a research program by dividing a workflow into smaller building blocks that are individually verified. The verification of each building block can be done internally by members of the research program or externally by ‘crowd-sourcing’ to an interested community. www.sbvimprover.com Implementation: This methodology could become the preferred choice to verify systems biology research workflows that are becoming increasingly complex and sophisticated in industrial and academic settings. Contact: gustavo@us.ibm.com PMID:22423044

  8. [Research on metrological verification and the development of the verification equipment for digital electrocardiographs].

    PubMed

    Duan, Cheng-hua; Jia, Jian-ge; Li, Ying-xue; Duan, Xin-an; Zhang, Shu-wang

    2005-07-01

    In order to guarantee that digital electrocardiographs with adequate accuracy and reliability are available for measurements in clinical practice, it is very important to determine the metrological characteristics and verification methods and to develop the equipment for verification. This paper elaborates the basic functions, specifications, schematic drawing, and key techniques of the calibration equipment, as well as the test items for digital electrocardiographs. PMID:16419941

  9. Pion treatment procedures and verification techniques

    SciTech Connect

    Zink, S.R.; Bush, S.E.; Gilman, C.J.; Hilko, R.H.; Justice, R.K.; Osborne, E.C.; Smith, A.R.; Berardo, P.A.

    1984-05-01

    Procedures and techniques developed for the negative pi-meson (pion) radiotherapy program at the Los Alamos Meson Physics Facility, Los Alamos, NM, are reviewed and described. A particular pion patient is followed through the entire planning and treatment sequence to describe CT scanning procedures, bolus and collimator techniques, and treatment techniques developed to minimize positioning errors (less than 5 mm). Comparison of 2-D and 3-D isodose calculations developed at Los Alamos showed differences of less than 10%, attributable to multiple scattering effects and the computational models used. Treatment verification methods using in vivo ion chamber dosimetry generally confirmed the prescribed dose delivery within 10%, and using TLD within 18%.

  10. A comparison of software verification techniques

    NASA Technical Reports Server (NTRS)

    1985-01-01

    A controlled experiment performed by the Software Engineering Laboratory (SEL) to compare the effectiveness of code reading, functional testing, and structural testing as software verification techniques is described. The experiment results indicate that code reading provides the greatest error detection capability at the lowest cost, whereas structural testing is the least effective technique. The experiment plan is explained, the experiment results are described, and related results from other studies are discussed. The application of these results to the development of software in the flight dynamics environment is considered. Appendices summarize the experiment data and list the test programs.

  11. Low level vapor verification of monomethyl hydrazine

    NASA Technical Reports Server (NTRS)

    Mehta, Narinder

    1990-01-01

    The vapor scrubbing system and the coulometric test procedure for the low level vapor verification of monomethyl hydrazine (MMH) are evaluated. Experimental data on precision, efficiency of the scrubbing liquid, instrument response, detection and reliable quantitation limits, stability of the vapor scrubbed solution, and interference were obtained to assess the applicability of the method for the low ppb level detection of the analyte vapor in air. The results indicated that the analyte vapor scrubbing system and the coulometric test procedure can be utilized for the quantitative detection of low ppb level vapor of MMH in air.
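
    As a rough illustration of how detection and quantitation limits of the kind reported here are commonly derived from calibration data (the conventional 3-sigma/10-sigma rule; the numbers below are invented, not the study's):

    ```python
    # Hedged sketch: LOD = 3*s_blank/slope, LOQ = 10*s_blank/slope, the
    # standard convention for reporting detection and quantitation limits.
    # Blank responses and calibration slope are illustrative values only.
    import statistics

    blank_responses = [0.12, 0.15, 0.11, 0.14, 0.13, 0.12]  # instrument blanks
    slope = 0.045   # assumed coulometric response per ppb of MMH vapor

    s_blank = statistics.stdev(blank_responses)
    lod_ppb = 3 * s_blank / slope
    loq_ppb = 10 * s_blank / slope
    print(f"LOD ~ {lod_ppb:.2f} ppb, LOQ ~ {loq_ppb:.2f} ppb")
    ```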

  12. Damage inspection and verification of tethers (DIVOT)

    NASA Technical Reports Server (NTRS)

    Howard, George E.; Gray, John; Levi, Alejandro

    1989-01-01

    An optically based, noncontacting Damage Inspection and Verification of Tethers (DIVOT) system has been developed, and its feasibility successfully demonstrated. DIVOT can be used to inspect a tether for micrometeorite-sized damage sites quickly, at tether scanning rates exceeding 1 m/s. The DIVOT system concept is described and the DIVOT experimental setup for demonstrating the system is reviewed. The experiments are described and the results are reported. The establishment of a relationship between damage and optical response is discussed.

  13. Experimental verification of a large flexible manipulator

    NASA Technical Reports Server (NTRS)

    Lee, Jac Won; Huggins, James D.; Book, Wayne J.

    1988-01-01

    A large experimental lightweight manipulator would be useful for material handling, for welding, or for ultrasonic inspection of a large structure, such as an airframe. The flexible parallel link mechanism is designed for high rigidity without increasing weight. This constrained system is analyzed by singular value decomposition of the constraint Jacobian matrix. A verification of the modeling using the assumed mode method is presented. Eigenvalues and eigenvectors of the linearized model are compared to the measured system natural frequencies and their associated mode shapes. The modeling results for large motions are compared to the time response data from the experiments. The hydraulic actuator is verified.
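
    As a minimal numeric illustration of the constraint analysis mentioned above, the following sketch takes the SVD of an arbitrary stand-in constraint Jacobian and extracts the null-space directions in which the constrained mechanism can still move:

    ```python
    # Sketch of constraint analysis by SVD: split joint-rate space into
    # constrained directions (row space of J) and admissible motions (null
    # space of J). The Jacobian below is an arbitrary numeric stand-in for
    # the parallel-link mechanism's actual constraint Jacobian.
    import numpy as np

    J = np.array([[1.0, 0.0, -1.0, 0.5],
                  [0.0, 2.0,  1.0, 0.0]])   # 2 constraints on 4 joint rates

    U, s, Vt = np.linalg.svd(J)
    rank = int(np.sum(s > 1e-10))
    admissible = Vt[rank:].T   # null-space basis: joint rates with J q' = 0
    print("rank:", rank)
    print("admissible motion directions:\n", admissible)
    print("check J @ basis ~ 0:", np.allclose(J @ admissible, 0.0))
    ```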

  14. Thermionic fuel element Verification Program - Overview

    NASA Astrophysics Data System (ADS)

    Bohl, Richard J.; Dahlberg, Richard C.; Dutt, Dale S.; Wood, John T.

    The TFE Verification Program is in the sixth year of a program to demonstrate the performance and lifetime of thermionic fuel elements for high power space applications. Data from accelerated tests in FFTF and EBR-II show component lifetimes longer than 7 yr. Alumina insulators have shown good performance at high fast fluence. Graphite-cesium reservoirs based on isotropic graphite also meet requirements. Three TFEs are currently operating in the TRIGA reactor, the oldest having accumulated 15,000 hr of irradiation as of 1 October 1990.

  15. Formal Verification of Large Software Systems

    NASA Technical Reports Server (NTRS)

    Yin, Xiang; Knight, John

    2010-01-01

    We introduce a scalable proof structure to facilitate formal verification of large software systems. In our approach, we mechanically synthesize an abstract specification from the software implementation, match its static operational structure to that of the original specification, and organize the proof as the conjunction of a series of lemmas about the specification structure. By setting up a different lemma for each distinct element and proving each lemma independently, we obtain the important benefit that the proof scales easily for large systems. We present details of the approach and an illustration of its application on a challenge problem from the security domain.
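
    As a toy illustration of this proof organization, the following Lean sketch proves one trivial stand-in lemma per specification element and then conjoins them; the real approach operates on synthesized abstract specifications, not arithmetic facts:

    ```lean
    -- Minimal sketch of a proof organized as a conjunction of independently
    -- proved lemmas. The properties are trivial stand-ins for real
    -- specification clauses.
    theorem elem1 (n : Nat) : n + 0 = n := rfl
    theorem elem2 (n : Nat) : 0 + n = n := Nat.zero_add n

    theorem spec : (∀ n : Nat, n + 0 = n) ∧ (∀ n : Nat, 0 + n = n) :=
      ⟨fun n => elem1 n, fun n => elem2 n⟩
    ```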

  16. Accelerating functional verification of an integrated circuit

    SciTech Connect

    Deindl, Michael; Ruedinger, Jeffrey Joseph; Zoellin, Christian G.

    2015-10-27

    Illustrative embodiments include a method, system, and computer program product for accelerating functional verification in simulation testing of an integrated circuit (IC). Using a processor and a memory, a serial operation is replaced with a direct register access operation, wherein the serial operation is configured to perform a bit-shifting operation using a register in a simulation of the IC. The serial operation is blocked from manipulating the register in the simulation of the IC. Using the register in the simulation of the IC, the direct register access operation is performed in place of the serial operation.
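
    A minimal sketch of the speed-up idea, with invented names (this is not the embodiment's actual code): loading a simulated register bit-by-bit costs one operation per bit, while a direct register write costs one operation total:

    ```python
    # Sketch contrasting a serial (bit-by-bit) register load with a direct
    # register write, as in the acceleration idea described above. All names
    # are illustrative; this is not the patented implementation.

    class SimRegister:
        """A width-bit register in a software simulation of an IC."""
        def __init__(self, width: int):
            self.width = width
            self.value = 0

        def shift_in(self, bit: int) -> None:
            """Serial load: one simulated clock cycle per bit."""
            self.value = ((self.value << 1) | (bit & 1)) & ((1 << self.width) - 1)

        def write(self, value: int) -> None:
            """Direct access: load the whole register in one operation."""
            self.value = value & ((1 << self.width) - 1)

    def serial_load(reg: SimRegister, value: int) -> None:
        # width simulated cycles: slow when repeated millions of times
        for i in reversed(range(reg.width)):
            reg.shift_in((value >> i) & 1)

    reg = SimRegister(32)
    serial_load(reg, 0xDEADBEEF)   # O(width) simulated cycles
    reg.write(0xDEADBEEF)          # O(1): the serial path is bypassed
    ```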

  17. Thermionic fuel element verification program—overview

    NASA Astrophysics Data System (ADS)

    Bohl, Richard J.; Dutt, Dale S.; Dahlberg, Richard C.; Wood, John T.

    1991-01-01

    The TFE Verification Program is in the sixth year of a program to demonstrate the performance and lifetime of thermionic fuel elements for high power space applications. It is jointly funded by SDIO and DOE. Data from accelerated tests in FFTF and EBR-II show component lifetimes longer than 7 years. Alumina insulators have shown good performance at high fast fluence. Graphite-cesium reservoirs based on isotropic graphite also meet requirements. Three TFEs are currently operating in the TRIGA reactor, the oldest having accumulated 15,000 hours of irradiation as of 1 October 1990.

  18. Infrared scanner concept verification test report

    NASA Technical Reports Server (NTRS)

    Bachtel, F. D.

    1980-01-01

    The test results from a concept verification test conducted to assess the use of an infrared scanner as a remote temperature sensing device for the space shuttle program are presented. The temperature and geometric resolution limits, atmospheric attenuation effects including conditions with fog and rain, and the problem of surface emissivity variations are included. It is concluded that the basic concept of using an infrared scanner to determine near-freezing surface temperatures is feasible. The major problem identified concerns infrared reflections, which result in significant errors if not controlled. Actions taken to manage these errors result in design and operational constraints to control the viewing angle and surface emissivity.
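
    As a hedged sketch of why emissivity and reflections matter here: the scanner measures emitted plus reflected radiance, so recovering a surface temperature requires an emissivity estimate. The following uses a simplified total-radiance (Stefan-Boltzmann) approximation with invented numbers:

    ```python
    # Sketch of an emissivity/reflection correction. What the scanner sees is
    # L_meas = eps*sigma*T^4 + (1-eps)*sigma*T_bg^4; inverting this recovers
    # the true surface temperature. A total-radiance approximation with
    # illustrative values, not the report's calibration procedure.
    SIGMA = 5.670e-8  # Stefan-Boltzmann constant, W m^-2 K^-4

    def apparent_to_true_kelvin(t_apparent: float, emissivity: float,
                                t_background: float) -> float:
        """Invert L_meas = eps*sigma*T^4 + (1-eps)*sigma*T_bg^4."""
        l_meas = SIGMA * t_apparent**4        # unity-emissivity reading
        l_emitted = l_meas - (1.0 - emissivity) * SIGMA * t_background**4
        return (l_emitted / (emissivity * SIGMA)) ** 0.25

    # A surface read as 274 K under a blackbody assumption, with eps = 0.85
    # and a 230 K sky reflecting into the view (all numbers assumed):
    print(f"{apparent_to_true_kelvin(274.0, 0.85, 230.0):.1f} K")
    ```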

  19. Environmental Technology Verification Report -- Baghouse filtration products, GE Energy QG061 filtration media (tested May 2007)

    EPA Science Inventory

    EPA has created the Environmental Technology Verification Program to facilitate the deployment of innovative or improved environmental technologies through performance verification and dissemination of information. The Air Pollution Control Technology Verification Center, a cente...

  20. Subsurface barrier integrity verification using perfluorocarbon tracers

    SciTech Connect

    Sullivan, T.M.; Heiser, J.; Milian, L.; Senum, G.

    1996-12-01

    Subsurface barriers are an extremely promising remediation option for many waste management problems. Gas-phase tracers include perfluorocarbon tracers (PFTs) and chlorofluorocarbon tracers (CFCs). Both have been applied for leak detection in subsurface systems. The focus of this report is to describe the barrier verification tests conducted using PFTs and the analysis of the data from the tests. PFT verification tests have been performed on a simulated waste pit at the Hanford Geotechnical facility and on an actual waste pit at Brookhaven National Laboratory (BNL). The objective of these tests was to demonstrate proof of concept that PFT technology can be used to determine whether small breaches form in the barrier and to estimate the effectiveness of the barrier in preventing migration of the gas tracer to the monitoring wells. The subsurface barrier systems created at Hanford and BNL are described. The experimental results and the analysis of the data follow. Based on the findings of this study, conclusions are offered and suggestions for future work are presented.

  1. SPR Hydrostatic Column Model Verification and Validation.

    SciTech Connect

    Bettin, Giorgia; Lord, David; Rudeen, David Keith

    2015-10-01

    A Hydrostatic Column Model (HCM) was developed to help differentiate between normal "tight" well behavior and small-leak behavior under nitrogen for testing the pressure integrity of crude oil storage wells at the U.S. Strategic Petroleum Reserve. This effort was motivated by steady, yet distinct, pressure behavior of a series of Big Hill caverns that have been placed under nitrogen for extended periods of time. This report describes the HCM, its functional requirements, the model structure, and the verification and validation process. Different modes of operation are also described, which illustrate how the software can be used to model extended nitrogen monitoring and Mechanical Integrity Tests by predicting wellhead pressures along with nitrogen interface movements. Model verification has shown that the program runs correctly and that it is implemented as intended. The cavern BH101 long-term nitrogen test was used to validate the model, which showed very good agreement with measured data. This supports the claim that the model is, in fact, capturing the relevant physical phenomena and can be used to make accurate predictions of both wellhead pressure and interface movements.
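
    A hedged sketch of the hydrostatic balance such a model rests on (illustrative densities and depths; a real model treats the nitrogen column as compressible):

    ```python
    # Sketch of the hydrostatic balance underlying a wellhead-pressure model
    # of this kind: P_wellhead = P_cavern - sum(rho_i * g * h_i) over the
    # fluid columns in the well. Densities and depths are invented, and this
    # constant-density version only shows the structure of the calculation.
    G = 9.81  # m/s^2

    def wellhead_pressure(p_cavern_pa: float,
                          columns: list[tuple[float, float]]) -> float:
        """columns: list of (density kg/m^3, column height m), top to bottom."""
        return p_cavern_pa - sum(rho * G * h for rho, h in columns)

    # Example: nitrogen over crude oil over brine (made-up numbers).
    columns = [
        (90.0, 300.0),    # pressurized nitrogen, treated as constant density
        (850.0, 200.0),   # crude oil
        (1200.0, 100.0),  # brine
    ]
    p_cavern = 12.0e6  # Pa, assumed pressure at the deepest interface
    print(f"wellhead pressure ~ {wellhead_pressure(p_cavern, columns)/1e6:.2f} MPa")
    ```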

  2. Monitoring/Verification Using DMS: TATP Example

    SciTech Connect

    Kevin Kyle; Stephan Weeks

    2008-03-01

    Field-rugged and field-programmable differential mobility spectrometry (DMS) networks provide highly selective, universal monitoring of vapors and aerosols at detectable levels from persons or areas involved with illicit chemical/biological/explosives (CBE) production. CBE sensor motes used in conjunction with automated fast gas chromatography with DMS detection (GC/DMS) verification instrumentation integrated into situational operations-management systems can be readily deployed and optimized for changing application scenarios. The feasibility of developing selective DMS motes for a “smart dust” sampling approach with guided, highly selective, fast GC/DMS verification analysis is a compelling approach to minimize or prevent the illegal use of explosives or chemical and biological materials. DMS is currently one of the foremost emerging technologies for field separation and detection of gas-phase chemical species. This is due to trace-level detection limits, high selectivity, and small size. GC is the leading analytical method for the separation of chemical species in complex mixtures. Low-thermal-mass GC columns have led to compact, low-power field systems capable of complete analyses in 15–300 seconds. A collaborative effort optimized a handheld, fast GC/DMS, equipped with a non-rad ionization source, for peroxide-based explosive measurements.

  3. Monitoring/Verification using DMS: TATP Example

    SciTech Connect

    Stephan Weeks, Kevin Kyle, Manuel Manard

    2008-05-30

    Field-rugged and field-programmable differential mobility spectrometry (DMS) networks provide highly selective, universal monitoring of vapors and aerosols at detectable levels from persons or areas involved with illicit chemical/biological/explosives (CBE) production. CBE sensor motes used in conjunction with automated fast gas chromatography with DMS detection (GC/DMS) verification instrumentation integrated into situational operations-management systems can be readily deployed and optimized for changing application scenarios. The feasibility of developing selective DMS motes for a “smart dust” sampling approach with guided, highly selective, fast GC/DMS verification analysis is a compelling approach to minimize or prevent the illegal use of explosives or chemical and biological materials. DMS is currently one of the foremost emerging technologies for field separation and detection of gas-phase chemical species. This is due to trace-level detection limits, high selectivity, and small size. Fast GC is the leading field analytical method for gas phase separation of chemical species in complex mixtures. Low-thermal-mass GC columns have led to compact, low-power field systems capable of complete analyses in 15–300 seconds. A collaborative effort optimized a handheld, fast GC/DMS, equipped with a non-rad ionization source, for peroxide-based explosive measurements.

  4. Verification and validation for induction heating

    SciTech Connect

    Lam, Kin; Tippetts, Trevor B; Allen, David W

    2008-01-01

    Truchas is a software package being developed at LANL within the Telluride project for predicting the complex physical processes in metal alloy casting. The software was designed to be massively parallel, multi-material, multi-physics, and to run on 3D, fully unstructured meshes. This work describes a Verification and Validation assessment of Truchas for simulating the induction heating phase of a casting process. We used existing data from a simple experiment involving the induction heating of a graphite cylinder, as graphite is a common material used for mold assemblies. Because we do not have complete knowledge of all the conditions and properties in this experiment (as is the case in many other experiments), we performed a parameter sensitivity study, modeled the uncertainties of the most sensitive parameters, and quantified how these uncertainties propagate to the Truchas output response. A verification analysis produced estimates of the numerical error of the Truchas solution to our computational model. The outputs from Truchas runs with randomly sampled parameter values were used for the validation study.
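
    A minimal sketch of this propagate-the-uncertainty workflow, with a one-line surrogate standing in for the actual Truchas simulation and invented parameter distributions:

    ```python
    # Sketch of Monte Carlo uncertainty propagation: sample the most
    # sensitive inputs from assumed distributions, run the model for each
    # sample, and examine the spread of the output. The "model" is a
    # stand-in lumped heating law, not Truchas itself.
    import random
    import statistics

    def peak_temperature(power_w: float, h_conv: float, emissivity: float) -> float:
        # Illustrative surrogate for a full induction-heating simulation.
        return 300.0 + power_w / (h_conv + 50.0 * emissivity)

    samples = []
    for _ in range(10_000):
        power = random.gauss(5000.0, 250.0)   # assumed +/-5% heater power
        h = random.gauss(12.0, 2.0)           # assumed convection coefficient
        eps = min(max(random.gauss(0.85, 0.05), 0.0), 1.0)  # graphite emissivity
        samples.append(peak_temperature(power, h, eps))

    print(f"mean = {statistics.mean(samples):.1f} K, "
          f"stdev = {statistics.stdev(samples):.1f} K")
    ```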

  5. Verification and validation of control system software

    SciTech Connect

    Munro, J.K. Jr.; Kisner, R.A.; Bhadtt, S.C.

    1991-01-01

    The following guidelines are proposed for verification and validation (V&V) of nuclear power plant control system software: (a) use risk management to decide what and how much V&V is needed; (b) classify each software application using a scheme that reflects what type and how much V&V is needed; (c) maintain a set of reference documents with current information about each application; (d) use Program Inspection as the initial basic verification method; and (e) establish a deficiencies log for each software application. The following additional practices are strongly recommended: (a) use a computer-based configuration management system to track all aspects of development and maintenance; (b) establish reference baselines of the software, associated reference documents, and development tools at regular intervals during development; (c) use object-oriented design and programming to promote greater software reliability and reuse; (d) provide a copy of the software development environment as part of the package of deliverables; and (e) initiate an effort to use formal methods for preparation of Technical Specifications. The paper provides background information and reasons for the guidelines and recommendations. 3 figs., 3 tabs.

  6. Advanced verification methods for OVI security ink

    NASA Astrophysics Data System (ADS)

    Coombs, Paul G.; McCaffery, Shaun F.; Markantes, Tom

    2006-02-01

    OVI security ink, incorporating OVP security pigment microflakes, enjoys a history of effective document protection. This security feature provides not only first-line recognition by the person on the street, but also facilitates machine readability. This paper explores the evolution of OVI reader technology from proof-of-concept to miniaturization. Three different instruments have been built to advance the technology of OVI machine verification. A bench-top unit has been constructed which allows users to automatically verify a multitude of different banknotes and OVI images. In addition, high-speed modules were fabricated and tested in a state-of-the-art banknote sorting machine. Both units demonstrate the ability of modern optical components to illuminate and collect light reflected from the interference platelets within OVI ink. Electronic hardware and software convert and process the optical information in milliseconds to accurately determine the authenticity of the security feature. Most recently, OVI ink verification hardware has been miniaturized and simplified, providing yet another platform for counterfeit protection. These latest devices provide a tool for store clerks and bank tellers to unambiguously determine the validity of banknotes in the time it takes the cash drawer to be opened.

  7. Optical security verification for blurred fingerprints

    NASA Astrophysics Data System (ADS)

    Soon, Boon Y.; Karim, Mohammad A.; Alam, Mohammad S.

    1998-12-01

    Optical fingerprint security verification is gaining popularity, as it has the potential to perform correlation at the speed of light. With advancements in optical security verification techniques, the authentication process can be almost foolproof and reliable for financial transactions, banking, etc. In law enforcement, when a fingerprint is obtained from a crime scene, it may be blurred and can be an unhealthy candidate for correlation purposes. Therefore, the blurred fingerprint needs to be clarified before it is used for the correlation process. There are several different types of blur, such as linear motion blur and defocus blur, induced by aberrations of the imaging system. In addition, we may or may not know the blur function. In this paper, we propose non-singularity inverse filtering in the frequency/power domain for deblurring known motion-induced blur in fingerprints. This filtering process is incorporated with the power-spectrum subtraction technique, the uniqueness comparison scheme, and the separated target and reference planes method in the joint transform correlator. The proposed hardware implementation is a hybrid electronic-optical correlator system. The performance of the proposed system is verified with computer simulation for both cases: with and without additive random noise corruption.
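
    A minimal sketch of non-singular inverse filtering for a known motion blur (the clamping constant and the random stand-in image are illustrative; the paper's power-spectrum subtraction step is not reproduced):

    ```python
    # Sketch of inverse filtering with a non-singularity safeguard: divide in
    # the frequency domain, but clamp near-zero values of the blur transfer
    # function so the inverse never blows up.
    import numpy as np

    def deblur_known_psf(blurred: np.ndarray, psf: np.ndarray,
                         eps: float = 1e-2) -> np.ndarray:
        H = np.fft.fft2(psf, s=blurred.shape)       # blur transfer function
        H_safe = np.where(np.abs(H) < eps, eps, H)  # avoid singular division
        F = np.fft.fft2(blurred) / H_safe
        return np.real(np.fft.ifft2(F))

    # Horizontal motion blur over 9 pixels (assumed known blur function).
    psf = np.zeros((9, 9))
    psf[4, :] = 1.0 / 9.0
    image = np.random.rand(64, 64)                  # stand-in fingerprint image
    blurred = np.real(np.fft.ifft2(np.fft.fft2(image)
                                   * np.fft.fft2(psf, s=image.shape)))
    restored = deblur_known_psf(blurred, psf)
    ```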

  8. Fifty years of progress in speaker verification

    NASA Astrophysics Data System (ADS)

    Rosenberg, Aaron E.

    2004-10-01

    The modern era in speaker recognition started about 50 years ago at Bell Laboratories with the controversial invention of the voiceprint technique for speaker identification based on expert analysis of speech spectrograms. Early speaker recognition research concentrated on finding acoustic-phonetic features effective in discriminating speakers. The first truly automatic text-dependent speaker verification systems were based on time contours or templates of speaker-specific acoustic features. An important element of these systems was the ability to time-warp sample templates with model templates in order to provide useful comparisons. Most modern text-dependent speaker verification systems are based on statistical representations of acoustic features analyzed as a function of time over specified utterances, most particularly the hidden Markov model (HMM) representation. Modern text-independent systems are based on vector quantization representations and, more recently, on Gaussian mixture model (GMM) representations. An important ingredient of statistically based systems is likelihood-ratio decision techniques making use of speaker background models. Some recent research has shown how to extract higher-level features based on speaking behavior and combine them with lower-level acoustic features for improved performance. The talk will present these topics in historical order, showing the evolution of techniques.
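
    A minimal sketch of the template time-warping idea from the early systems described above, using dynamic time warping over random stand-in feature sequences:

    ```python
    # Sketch of dynamic time warping (DTW) for template-based verification:
    # align a sample feature sequence to an enrolled template and use the
    # warped distance as the score. Feature vectors are random stand-ins for
    # per-frame acoustic features.
    import numpy as np

    def dtw_distance(a: np.ndarray, b: np.ndarray) -> float:
        n, m = len(a), len(b)
        D = np.full((n + 1, m + 1), np.inf)
        D[0, 0] = 0.0
        for i in range(1, n + 1):
            for j in range(1, m + 1):
                cost = np.linalg.norm(a[i - 1] - b[j - 1])
                D[i, j] = cost + min(D[i - 1, j], D[i, j - 1], D[i - 1, j - 1])
        return D[n, m] / (n + m)  # length-normalized alignment cost

    template = np.random.rand(40, 12)  # enrolled template (40 frames x 12 features)
    sample = np.random.rand(55, 12)    # test utterance, different duration
    accept = dtw_distance(sample, template) < 0.9  # threshold is illustrative
    ```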

  9. DESIGN INFORMATION VERIFICATION FOR NUCLEAR SAFEGUARDS

    SciTech Connect

    Robert S. Bean; Richard R. M. Metcalf; Phillip C. Durst

    2009-07-01

    A critical aspect of international safeguards activities performed by the International Atomic Energy Agency (IAEA) is the verification that facility design and construction (including upgrades and modifications) do not create opportunities for nuclear proliferation. These Design Information Verification (DIV) activities require that IAEA inspectors compare current and past information about the facility to verify the operator’s declaration of proper use. The actual practice of DIV presents challenges to the inspectors due to the large amount of data generated, concerns about sensitive or proprietary data, the overall complexity of the facility, and the effort required to extract just the safeguards-relevant information. Planned and anticipated facilities (especially reprocessing plants) will be ever larger and increasingly complex, thus exacerbating the challenges. This paper reports the results of a workshop held at the Idaho National Laboratory in March 2009, which considered technologies and methods to address these challenges. The use of 3D Laser Range Finding, Outdoor Visualization System, Gamma-LIDAR, and virtual facility modeling, as well as methods to handle the facility data issues (quantity, sensitivity, and accessibility and portability for the inspector), were presented. The workshop attendees drew conclusions about successfully employing these techniques in an operating environment, using a Fuel Conditioning Facility walk-through as a baseline for discussion.

  10. Orbit attitude processor. STS-1 bench program verification test plan

    NASA Technical Reports Server (NTRS)

    Mcclain, C. R.

    1980-01-01

    A plan for the static verification of the STS-1 ATT PROC ORBIT software requirements is presented. The orbit version of the SAPIENS bench program is used to generate the verification data. A brief discussion of the simulation software and flight software modules is presented along with a description of the test cases.

  11. 40 CFR 1065.920 - PEMS calibrations and verifications.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... 40 Protection of Environment 33 2014-07-01 2014-07-01 false PEMS calibrations and verifications. 1065.920 Section 1065.920 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) AIR POLLUTION CONTROLS ENGINE-TESTING PROCEDURES Field Testing and Portable Emission Measurement Systems § 1065.920 PEMS calibrations and verifications....

  12. 28 CFR 811.9 - Periodic verification of registration information.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... 28 Judicial Administration 2 2014-07-01 2014-07-01 false Periodic verification of registration... THE DISTRICT OF COLUMBIA SEX OFFENDER REGISTRATION § 811.9 Periodic verification of registration... photograph that is five or more years old. (e) CSOSA, either on its own accord or with its law...

  13. 28 CFR 811.9 - Periodic verification of registration information.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... 28 Judicial Administration 2 2013-07-01 2013-07-01 false Periodic verification of registration... THE DISTRICT OF COLUMBIA SEX OFFENDER REGISTRATION § 811.9 Periodic verification of registration... photograph that is five or more years old. (e) CSOSA, either on its own accord or with its law...

  14. Review and verification of CARE 3 mathematical model and code

    NASA Technical Reports Server (NTRS)

    Rose, D. M.; Altschul, R. E.; Manke, J. W.; Nelson, D. L.

    1983-01-01

    The CARE-III mathematical model and code verification performed by Boeing Computer Services were documented. The mathematical model was verified for permanent and intermittent faults. The transient fault model was not addressed. The code verification was performed on CARE-III, Version 3. A CARE III Version 4, which corrects deficiencies identified in Version 3, is being developed.

  15. 24 CFR 960.259 - Family information and verification.

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... 24 Housing and Urban Development 4 2014-04-01 2014-04-01 false Family information and verification... URBAN DEVELOPMENT ADMISSION TO, AND OCCUPANCY OF, PUBLIC HOUSING Rent and Reexamination § 960.259 Family information and verification. (a) Family obligation to supply information. (1) The family must supply...

  16. 24 CFR 5.659 - Family information and verification.

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ... 24 Housing and Urban Development 1 2013-04-01 2013-04-01 false Family information and verification... Assisted Housing Serving Persons with Disabilities: Family Income and Family Payment; Occupancy... § 5.659 Family information and verification. (a) Applicability. This section states requirements...

  17. 24 CFR 5.659 - Family information and verification.

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... 24 Housing and Urban Development 1 2012-04-01 2012-04-01 false Family information and verification... Assisted Housing Serving Persons with Disabilities: Family Income and Family Payment; Occupancy... § 5.659 Family information and verification. (a) Applicability. This section states requirements...

  18. 24 CFR 960.259 - Family information and verification.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... 24 Housing and Urban Development 4 2010-04-01 2010-04-01 false Family information and verification... URBAN DEVELOPMENT ADMISSION TO, AND OCCUPANCY OF, PUBLIC HOUSING Rent and Reexamination § 960.259 Family information and verification. (a) Family obligation to supply information. (1) The family must supply...

  19. 24 CFR 960.259 - Family information and verification.

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ... 24 Housing and Urban Development 4 2013-04-01 2013-04-01 false Family information and verification... URBAN DEVELOPMENT ADMISSION TO, AND OCCUPANCY OF, PUBLIC HOUSING Rent and Reexamination § 960.259 Family information and verification. (a) Family obligation to supply information. (1) The family must supply...

  20. 24 CFR 960.259 - Family information and verification.

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... 24 Housing and Urban Development 4 2012-04-01 2012-04-01 false Family information and verification... URBAN DEVELOPMENT ADMISSION TO, AND OCCUPANCY OF, PUBLIC HOUSING Rent and Reexamination § 960.259 Family information and verification. (a) Family obligation to supply information. (1) The family must supply...

  1. 24 CFR 5.659 - Family information and verification.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... 24 Housing and Urban Development 1 2010-04-01 2010-04-01 false Family information and verification... Assisted Housing Serving Persons with Disabilities: Family Income and Family Payment; Occupancy... § 5.659 Family information and verification. (a) Applicability. This section states requirements...

  2. 24 CFR 5.659 - Family information and verification.

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... 24 Housing and Urban Development 1 2014-04-01 2014-04-01 false Family information and verification... Assisted Housing Serving Persons with Disabilities: Family Income and Family Payment; Occupancy... § 5.659 Family information and verification. (a) Applicability. This section states requirements...

  3. 24 CFR 960.259 - Family information and verification.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... 24 Housing and Urban Development 4 2011-04-01 2011-04-01 false Family information and verification... URBAN DEVELOPMENT ADMISSION TO, AND OCCUPANCY OF, PUBLIC HOUSING Rent and Reexamination § 960.259 Family information and verification. (a) Family obligation to supply information. (1) The family must supply...

  4. 24 CFR 5.659 - Family information and verification.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... 24 Housing and Urban Development 1 2011-04-01 2011-04-01 false Family information and verification... Assisted Housing Serving Persons with Disabilities: Family Income and Family Payment; Occupancy... § 5.659 Family information and verification. (a) Applicability. This section states requirements...

  5. ENVIRONMENTAL TECHNOLOGY VERIFICATION: ADD-ON NOX CONTROLS

    EPA Science Inventory

    The paper discusses the environmental technology verification (ETV) of add-on nitrogen oxide (NOx) controls. Research Triangle Institute (RTI) is EPA's cooperating partner for the Air Pollution Control Technology (APCT) Program, one of a dozen ETV pilot programs. Verification of ...

  6. 47 CFR 2.952 - Limitation on verification.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 47 Telecommunication 1 2010-10-01 2010-10-01 false Limitation on verification. 2.952 Section 2.952 Telecommunication FEDERAL COMMUNICATIONS COMMISSION GENERAL FREQUENCY ALLOCATIONS AND RADIO TREATY MATTERS; GENERAL RULES AND REGULATIONS Equipment Authorization Procedures Verification § 2.952 Limitation on...

  7. 12 CFR 741.202 - Audit and verification requirements.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 12 Banks and Banking 6 2010-01-01 2010-01-01 false Audit and verification requirements. 741.202... Unions That Also Apply to Federally Insured State-Chartered Credit Unions § 741.202 Audit and verification requirements. (a) The supervisory committee of each credit union insured pursuant to Title II...

  8. 45 CFR 1626.7 - Verification of eligible alien status.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... 45 Public Welfare 4 2012-10-01 2012-10-01 false Verification of eligible alien status. 1626.7... CORPORATION RESTRICTIONS ON LEGAL ASSISTANCE TO ALIENS § 1626.7 Verification of eligible alien status. (a) An alien seeking representation shall submit appropriate documents to verify eligibility, unless the...

  9. 45 CFR 1626.7 - Verification of eligible alien status.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 45 Public Welfare 4 2010-10-01 2010-10-01 false Verification of eligible alien status. 1626.7... CORPORATION RESTRICTIONS ON LEGAL ASSISTANCE TO ALIENS § 1626.7 Verification of eligible alien status. (a) An alien seeking representation shall submit appropriate documents to verify eligibility, unless the...

  10. 45 CFR 1626.7 - Verification of eligible alien status.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... 45 Public Welfare 4 2013-10-01 2013-10-01 false Verification of eligible alien status. 1626.7... CORPORATION RESTRICTIONS ON LEGAL ASSISTANCE TO ALIENS § 1626.7 Verification of eligible alien status. (a) An alien seeking representation shall submit appropriate documents to verify eligibility, unless the...

  11. 45 CFR 1626.7 - Verification of eligible alien status.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... 45 Public Welfare 4 2011-10-01 2011-10-01 false Verification of eligible alien status. 1626.7... CORPORATION RESTRICTIONS ON LEGAL ASSISTANCE TO ALIENS § 1626.7 Verification of eligible alien status. (a) An alien seeking representation shall submit appropriate documents to verify eligibility, unless the...

  12. 45 CFR 1626.6 - Verification of citizenship.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 45 Public Welfare 4 2010-10-01 2010-10-01 false Verification of citizenship. 1626.6 Section 1626.6 Public Welfare Regulations Relating to Public Welfare (Continued) LEGAL SERVICES CORPORATION RESTRICTIONS ON LEGAL ASSISTANCE TO ALIENS § 1626.6 Verification of citizenship. (a) A recipient shall require...

  13. Pattern-based full-chip process verification

    NASA Astrophysics Data System (ADS)

    Ying, Changsheng; Kwon, Yongjun; Fornari, Paul; Perçin, Gökhan; Liu, Anwei

    2014-03-01

    This paper discusses a novel pattern-based standalone process verification technique that meets current and future needs for semiconductor manufacturing of memory and logic devices. Choosing the right process verification technique is essential to bridging the discrepancy between the intended and the printed pattern. As the industry moves to very low-k1 patterning solutions at each technology node, the challenges of process verification are becoming a nightmare for lithography engineers, with large numbers of possible verification defects and difficult defect disposition. In low-k1 lithography, demand for full-chip process verification is increasing. Full-chip process verification is applied after the process and optical proximity correction (OPC) steps. The current challenges in process verification are the large number of defects reported, disposition difficulties, long defect review times, and the lack of feedback to OPC. The technique presented here is based on pattern-based verification, where each reported defect is classified in terms of patterns and these patterns are saved to a database. This database is later used to screen incoming new designs prior to the OPC step.
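
    A minimal sketch of the pattern-database idea (toy binary clips and exact-match hashing stand in for real layout geometry):

    ```python
    # Sketch of pattern-based screening: store a canonical key for a small
    # window of layout around each reported defect, then flag incoming
    # designs that contain a previously seen pattern. Real flows hash
    # polygon geometry, not tiny binary clips.
    import numpy as np

    def pattern_key(clip: np.ndarray) -> bytes:
        return clip.astype(np.uint8).tobytes()   # canonical key for the clip

    defect_db: set[bytes] = set()

    def record_defect(clip: np.ndarray) -> None:
        defect_db.add(pattern_key(clip))

    def screen_design(layout: np.ndarray, size: int = 4) -> list[tuple[int, int]]:
        """Return locations whose size x size window matches a known defect."""
        hits = []
        for i in range(layout.shape[0] - size + 1):
            for j in range(layout.shape[1] - size + 1):
                if pattern_key(layout[i:i+size, j:j+size]) in defect_db:
                    hits.append((i, j))
        return hits

    bad = np.eye(4)                       # a toy "defective" pattern
    record_defect(bad)
    design = np.zeros((32, 32))
    design[10:14, 10:14] = np.eye(4)
    print(screen_design(design))          # -> [(10, 10)]
    ```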

  14. 46 CFR 42.13-45 - Verification of marks.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... 46 Shipping 2 2011-10-01 2011-10-01 false Verification of marks. 42.13-45 Section 42.13-45 Shipping COAST GUARD, DEPARTMENT OF HOMELAND SECURITY (CONTINUED) LOAD LINES DOMESTIC AND FOREIGN VOYAGES BY SEA General Rules for Determining Load Lines § 42.13-45 Verification of marks. (a)...

  15. 46 CFR 42.13-45 - Verification of marks.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... 46 Shipping 2 2014-10-01 2014-10-01 false Verification of marks. 42.13-45 Section 42.13-45 Shipping COAST GUARD, DEPARTMENT OF HOMELAND SECURITY (CONTINUED) LOAD LINES DOMESTIC AND FOREIGN VOYAGES BY SEA General Rules for Determining Load Lines § 42.13-45 Verification of marks. (a)...

  16. 46 CFR 42.13-45 - Verification of marks.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... 46 Shipping 2 2012-10-01 2012-10-01 false Verification of marks. 42.13-45 Section 42.13-45 Shipping COAST GUARD, DEPARTMENT OF HOMELAND SECURITY (CONTINUED) LOAD LINES DOMESTIC AND FOREIGN VOYAGES BY SEA General Rules for Determining Load Lines § 42.13-45 Verification of marks. (a)...

  17. 46 CFR 42.13-45 - Verification of marks.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... 46 Shipping 2 2013-10-01 2013-10-01 false Verification of marks. 42.13-45 Section 42.13-45 Shipping COAST GUARD, DEPARTMENT OF HOMELAND SECURITY (CONTINUED) LOAD LINES DOMESTIC AND FOREIGN VOYAGES BY SEA General Rules for Determining Load Lines § 42.13-45 Verification of marks. (a)...

  18. 46 CFR 42.13-45 - Verification of marks.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 46 Shipping 2 2010-10-01 2010-10-01 false Verification of marks. 42.13-45 Section 42.13-45 Shipping COAST GUARD, DEPARTMENT OF HOMELAND SECURITY (CONTINUED) LOAD LINES DOMESTIC AND FOREIGN VOYAGES BY SEA General Rules for Determining Load Lines § 42.13-45 Verification of marks. (a)...

  19. 21 CFR 21.44 - Verification of identity.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... 21 Food and Drugs 1 2010-04-01 2010-04-01 false Verification of identity. 21.44 Section 21.44 Food... Verification of identity. (a) An individual seeking access to records in a Privacy Act Record System may be... identity. The identification required shall be suitable considering the nature of the records sought....

  20. CMOS VLSI Layout and Verification of a SIMD Computer

    NASA Technical Reports Server (NTRS)

    Zheng, Jianqing

    1996-01-01

    A CMOS VLSI layout and verification of a 3 x 3 processor parallel computer has been completed. The layout was done using the MAGIC tool and the verification using HSPICE. Suggestions for expanding the computer into a million processor network are presented. Many problems that might be encountered when implementing a massively parallel computer are discussed.

  1. Student-Teacher Linkage Verification: Model Process and Recommendations

    ERIC Educational Resources Information Center

    Watson, Jeffery; Graham, Matthew; Thorn, Christopher A.

    2012-01-01

    As momentum grows for tracking the role of individual educators in student performance, school districts across the country are implementing projects that involve linking teachers to their students. Programs that link teachers to student outcomes require a verification process for student-teacher linkages. Linkage verification improves accuracy by…

  2. 26 CFR 1.6065-1 - Verification of returns.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... 26 Internal Revenue 13 2010-04-01 2010-04-01 false Verification of returns. 1.6065-1 Section 1... (CONTINUED) INCOME TAXES Signing and Verifying of Returns and Other Documents § 1.6065-1 Verification of... compensation or as an incident to the performance of other services for which such person receives...

  3. 40 CFR 1066.235 - Speed verification procedure.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... 40 Protection of Environment 34 2012-07-01 2012-07-01 false Speed verification procedure. 1066.235 Section 1066.235 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) AIR POLLUTION CONTROLS VEHICLE-TESTING PROCEDURES Dynamometer Specifications § 1066.235 Speed verification procedure. (a) Overview. This section describes how...

  4. 40 CFR 1066.235 - Speed verification procedure.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... 40 Protection of Environment 34 2013-07-01 2013-07-01 false Speed verification procedure. 1066.235 Section 1066.235 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) AIR POLLUTION CONTROLS VEHICLE-TESTING PROCEDURES Dynamometer Specifications § 1066.235 Speed verification procedure. (a) Overview. This section describes how...

  5. SITE CHARACTERIZATION AND MONITORING TECHNOLOGY VERIFICATION: PROGRESS AND RESULTS

    EPA Science Inventory

    The Site Characterization and Monitoring Technology Pilot of the U.S. Environmental Protection Agency's Environmental Technology Verification Program (ETV) has been engaged in verification activities since the fall of 1994 (U.S. EPA, 1997). The purpose of the ETV is to promote th...

  6. 76 FR 23861 - Documents Acceptable for Employment Eligibility Verification; Correction

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-04-29

    ... a final rule in the Federal Register at 76 FR 21225 establishing Documents Acceptable for Employment... RIN 1615-AB69 Documents Acceptable for Employment Eligibility Verification; Correction AGENCY: U.S... final rule titled Documents Acceptable for Employment Eligibility Verification published in the...

  7. 45 CFR 1626.7 - Verification of eligible alien status.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... 45 Public Welfare 4 2014-10-01 2014-10-01 false Verification of eligible alien status. 1626.7... CORPORATION RESTRICTIONS ON LEGAL ASSISTANCE TO ALIENS § 1626.7 Verification of eligible alien status. (a) An alien seeking representation shall submit appropriate documents to verify eligibility, unless the...

  8. Validation and verification plan for safety and PRA codes

    SciTech Connect

    Ades, M.J.; Crowe, R.D.; Toffer, H.

    1991-04-01

    This report discusses a verification and validation (V&V) plan for computer codes used for safety analysis and probabilistic risk assessment calculations. The present plan fulfills the commitments by Westinghouse Savannah River Company (WSRC) to the Department of Energy Savannah River Office (DOE-SRO) to bring the essential safety analysis and probabilistic risk assessment codes into compliance with verification and validation requirements.

  9. 19 CFR 181.72 - Verification scope and method.

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ... 19 Customs Duties 2 2013-04-01 2013-04-01 false Verification scope and method. 181.72 Section 181.72 Customs Duties U.S. CUSTOMS AND BORDER PROTECTION, DEPARTMENT OF HOMELAND SECURITY; DEPARTMENT OF THE TREASURY (CONTINUED) NORTH AMERICAN FREE TRADE AGREEMENT Origin Verifications and...

  10. 19 CFR 10.309 - Verification of documentation.

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ... 19 Customs Duties 1 2013-04-01 2013-04-01 false Verification of documentation. 10.309 Section 10.309 Customs Duties U.S. CUSTOMS AND BORDER PROTECTION, DEPARTMENT OF HOMELAND SECURITY; DEPARTMENT OF THE TREASURY ARTICLES CONDITIONALLY FREE, SUBJECT TO A REDUCED RATE, ETC. United States-Canada Free Trade Agreement § 10.309 Verification...

  11. 19 CFR 181.73 - Notification of verification visit.

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... 19 Customs Duties 2 2012-04-01 2012-04-01 false Notification of verification visit. 181.73 Section 181.73 Customs Duties U.S. CUSTOMS AND BORDER PROTECTION, DEPARTMENT OF HOMELAND SECURITY; DEPARTMENT OF THE TREASURY (CONTINUED) NORTH AMERICAN FREE TRADE AGREEMENT Origin Verifications...

  12. 19 CFR 181.73 - Notification of verification visit.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... 19 Customs Duties 2 2010-04-01 2010-04-01 false Notification of verification visit. 181.73 Section 181.73 Customs Duties U.S. CUSTOMS AND BORDER PROTECTION, DEPARTMENT OF HOMELAND SECURITY; DEPARTMENT OF THE TREASURY (CONTINUED) NORTH AMERICAN FREE TRADE AGREEMENT Origin Verifications...

  13. 19 CFR 181.72 - Verification scope and method.

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... 19 Customs Duties 2 2012-04-01 2012-04-01 false Verification scope and method. 181.72 Section 181.72 Customs Duties U.S. CUSTOMS AND BORDER PROTECTION, DEPARTMENT OF HOMELAND SECURITY; DEPARTMENT OF THE TREASURY (CONTINUED) NORTH AMERICAN FREE TRADE AGREEMENT Origin Verifications and...

  14. 19 CFR 181.73 - Notification of verification visit.

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ... 19 Customs Duties 2 2013-04-01 2013-04-01 false Notification of verification visit. 181.73 Section 181.73 Customs Duties U.S. CUSTOMS AND BORDER PROTECTION, DEPARTMENT OF HOMELAND SECURITY; DEPARTMENT OF THE TREASURY (CONTINUED) NORTH AMERICAN FREE TRADE AGREEMENT Origin Verifications...

  15. 19 CFR 181.73 - Notification of verification visit.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... 19 Customs Duties 2 2011-04-01 2011-04-01 false Notification of verification visit. 181.73 Section 181.73 Customs Duties U.S. CUSTOMS AND BORDER PROTECTION, DEPARTMENT OF HOMELAND SECURITY; DEPARTMENT OF THE TREASURY (CONTINUED) NORTH AMERICAN FREE TRADE AGREEMENT Origin Verifications...

  16. A Tutorial on Text-Independent Speaker Verification

    NASA Astrophysics Data System (ADS)

    Bimbot, Frédéric; Bonastre, Jean-François; Fredouille, Corinne; Gravier, Guillaume; Magrin-Chagnolleau, Ivan; Meignier, Sylvain; Merlin, Teva; Ortega-García, Javier; Petrovska-Delacrétaz, Dijana; Reynolds, Douglas A.

    2004-12-01

    This paper presents an overview of a state-of-the-art text-independent speaker verification system. First, an introduction proposes a modular scheme of the training and test phases of a speaker verification system. Then, the speech parameterization most commonly used in speaker verification, namely cepstral analysis, is detailed. Gaussian mixture modeling, which is the speaker modeling technique used in most systems, is then explained. A few speaker modeling alternatives, namely neural networks and support vector machines, are mentioned. Normalization of scores is then explained, as this is a very important step in dealing with real-world data. The evaluation of a speaker verification system is then detailed, and the detection error trade-off (DET) curve is explained. Several extensions of speaker verification are then enumerated, including speaker tracking and segmentation by speakers. Then, some applications of speaker verification are proposed, including on-site applications, remote applications, applications relative to structuring audio information, and games. Issues concerning the forensic area are then recalled, as we believe it is very important to inform people about the actual performance and limitations of speaker verification systems. This paper concludes by giving a few research trends in speaker verification for the next couple of years.
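
    A minimal sketch of the GMM-plus-background-model scoring described above, using scikit-learn as illustrative tooling and random stand-ins for cepstral features:

    ```python
    # Sketch of GMM scoring against a universal background model (UBM):
    # accept when the mean log-likelihood ratio of the claimed-speaker model
    # versus the background model exceeds a threshold. Training data are
    # random stand-ins for cepstral feature frames.
    import numpy as np
    from sklearn.mixture import GaussianMixture

    rng = np.random.default_rng(0)
    speaker_feats = rng.normal(0.5, 1.0, size=(500, 13))      # enrollment
    background_feats = rng.normal(0.0, 1.5, size=(5000, 13))  # many speakers

    speaker_gmm = GaussianMixture(n_components=8, random_state=0).fit(speaker_feats)
    ubm = GaussianMixture(n_components=8, random_state=0).fit(background_feats)

    test = rng.normal(0.5, 1.0, size=(200, 13))  # claimed-speaker utterance
    llr = speaker_gmm.score(test) - ubm.score(test)  # mean log-likelihood ratio
    accept = llr > 0.0   # threshold would be set from a DET curve in practice
    ```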

  17. Certification and verification for Calmac flat plate solar collector

    NASA Technical Reports Server (NTRS)

    1978-01-01

    Information used in the certification and verification of the Calmac Flat Plate Collector is presented. Contained are such items as test procedures and results, information on materials used, installation, operation, and maintenance manuals, and other information pertaining to the verification and certification.

  18. 40 CFR 1066.235 - Speed verification procedure.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... 40 Protection of Environment 33 2014-07-01 2014-07-01 false Speed verification procedure. 1066.235 Section 1066.235 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) AIR POLLUTION CONTROLS VEHICLE-TESTING PROCEDURES Dynamometer Specifications § 1066.235 Speed verification procedure. (a) Overview. This section describes how...

  19. 40 CFR 1066.250 - Base inertia verification.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... mean values as described in 40 CFR 1065.602(b). (7) Calculate the base inertia error, I berror, for... 40 Protection of Environment 34 2013-07-01 2013-07-01 false Base inertia verification. 1066.250... CONTROLS VEHICLE-TESTING PROCEDURES Dynamometer Specifications § 1066.250 Base inertia verification....

  20. Design, analysis, and test verification of advanced encapsulation system

    NASA Technical Reports Server (NTRS)

    Garcia, A.; Minning, C.

    1981-01-01

    Procurement of 4 in x 4 in polycrystalline solar cells were proceeded with some delays. A total of 1200 cells were procured for use in both the verification testing and qualification testing. Additional thermal structural analyses were run and the data are presented. An outline of the verification testing is included with information on test specimen construction.

  1. 20 CFR 212.5 - Verification of military service.

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... 20 Employees' Benefits 1 2014-04-01 2012-04-01 true Verification of military service. 212.5... MILITARY SERVICE § 212.5 Verification of military service. Military service may be verified by the... armed forces that shows the beginning and ending dates of the individual's active military service; or...

  2. 20 CFR 212.5 - Verification of military service.

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ... 20 Employees' Benefits 1 2013-04-01 2012-04-01 true Verification of military service. 212.5... MILITARY SERVICE § 212.5 Verification of military service. Military service may be verified by the... armed forces that shows the beginning and ending dates of the individual's active military service; or...

  3. 20 CFR 212.5 - Verification of military service.

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... 20 Employees' Benefits 1 2012-04-01 2012-04-01 false Verification of military service. 212.5... MILITARY SERVICE § 212.5 Verification of military service. Military service may be verified by the... armed forces that shows the beginning and ending dates of the individual's active military service; or...

  4. 40 CFR 1065.345 - Vacuum-side leak verification.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... 40 Protection of Environment 34 2012-07-01 2012-07-01 false Vacuum-side leak verification. 1065....345 Vacuum-side leak verification. (a) Scope and frequency. Verify that there are no significant vacuum-side leaks using one of the leak tests described in this section. For laboratory testing,...

  5. 40 CFR 1065.345 - Vacuum-side leak verification.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... 40 Protection of Environment 34 2013-07-01 2013-07-01 false Vacuum-side leak verification. 1065....345 Vacuum-side leak verification. (a) Scope and frequency. Verify that there are no significant vacuum-side leaks using one of the leak tests described in this section. For laboratory testing,...

  6. 25 CFR 39.405 - How will verifications be conducted?

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... 25 Indians 1 2011-04-01 2011-04-01 false How will verifications be conducted? 39.405 Section 39.405 Indians BUREAU OF INDIAN AFFAIRS, DEPARTMENT OF THE INTERIOR EDUCATION THE INDIAN SCHOOL EQUALIZATION PROGRAM Accountability § 39.405 How will verifications be conducted? The eligibility of every student shall be verified. The ELO will take...

  7. 19 CFR 181.72 - Verification scope and method.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... 19 Customs Duties 2 2010-04-01 2010-04-01 false Verification scope and method. 181.72 Section 181... § 181.72 Verification scope and method. (a) General. Subject to paragraph (e) of this section, Customs... any other method that produces a confirmation of receipt by the exporter or producer; or (B) By...

  8. 46 CFR 61.40-3 - Design verification testing.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... 46 Shipping 2 2014-10-01 2014-10-01 false Design verification testing. 61.40-3 Section 61.40-3 Shipping COAST GUARD, DEPARTMENT OF HOMELAND SECURITY (CONTINUED) MARINE ENGINEERING PERIODIC TESTS AND INSPECTIONS Design Verification and Periodic Testing of Vital System Automation § 61.40-3 Design...

  9. 46 CFR 61.40-3 - Design verification testing.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 46 Shipping 2 2010-10-01 2010-10-01 false Design verification testing. 61.40-3 Section 61.40-3 Shipping COAST GUARD, DEPARTMENT OF HOMELAND SECURITY (CONTINUED) MARINE ENGINEERING PERIODIC TESTS AND INSPECTIONS Design Verification and Periodic Testing of Vital System Automation § 61.40-3 Design...

  10. 46 CFR 61.40-3 - Design verification testing.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... 46 Shipping 2 2011-10-01 2011-10-01 false Design verification testing. 61.40-3 Section 61.40-3 Shipping COAST GUARD, DEPARTMENT OF HOMELAND SECURITY (CONTINUED) MARINE ENGINEERING PERIODIC TESTS AND INSPECTIONS Design Verification and Periodic Testing of Vital System Automation § 61.40-3 Design...

  11. 46 CFR 61.40-3 - Design verification testing.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... 46 Shipping 2 2012-10-01 2012-10-01 false Design verification testing. 61.40-3 Section 61.40-3 Shipping COAST GUARD, DEPARTMENT OF HOMELAND SECURITY (CONTINUED) MARINE ENGINEERING PERIODIC TESTS AND INSPECTIONS Design Verification and Periodic Testing of Vital System Automation § 61.40-3 Design...

  12. 46 CFR 61.40-3 - Design verification testing.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... 46 Shipping 2 2013-10-01 2013-10-01 false Design verification testing. 61.40-3 Section 61.40-3 Shipping COAST GUARD, DEPARTMENT OF HOMELAND SECURITY (CONTINUED) MARINE ENGINEERING PERIODIC TESTS AND INSPECTIONS Design Verification and Periodic Testing of Vital System Automation § 61.40-3 Design...

  13. Three Lectures on Theorem-proving and Program Verification

    NASA Technical Reports Server (NTRS)

    Moore, J. S.

    1983-01-01

    Topics concerning theorem proving and program verification are discussed with particular emphasis on the Boyer/Moore theorem prover and on approaches to program verification such as the functional and interpreter methods and the inductive assertion approach. A history of the discipline and specific program examples are included.

  14. 10 CFR 60.47 - Facility information and verification.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... 10 Energy 2 2014-01-01 2014-01-01 false Facility information and verification. 60.47 Section 60.47 Energy NUCLEAR REGULATORY COMMISSION (CONTINUED) DISPOSAL OF HIGH-LEVEL RADIOACTIVE WASTES IN GEOLOGIC REPOSITORIES Licenses Us/iaea Safeguards Agreement § 60.47 Facility information and verification. (a)...

  15. 10 CFR 60.47 - Facility information and verification.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 10 Energy 2 2010-01-01 2010-01-01 false Facility information and verification. 60.47 Section 60.47 Energy NUCLEAR REGULATORY COMMISSION (CONTINUED) DISPOSAL OF HIGH-LEVEL RADIOACTIVE WASTES IN GEOLOGIC REPOSITORIES Licenses Us/iaea Safeguards Agreement § 60.47 Facility information and verification. (a)...

  16. 10 CFR 60.47 - Facility information and verification.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... 10 Energy 2 2013-01-01 2013-01-01 false Facility information and verification. 60.47 Section 60.47 Energy NUCLEAR REGULATORY COMMISSION (CONTINUED) DISPOSAL OF HIGH-LEVEL RADIOACTIVE WASTES IN GEOLOGIC REPOSITORIES Licenses Us/iaea Safeguards Agreement § 60.47 Facility information and verification. (a)...

  17. 10 CFR 60.47 - Facility information and verification.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... 10 Energy 2 2012-01-01 2012-01-01 false Facility information and verification. 60.47 Section 60.47 Energy NUCLEAR REGULATORY COMMISSION (CONTINUED) DISPOSAL OF HIGH-LEVEL RADIOACTIVE WASTES IN GEOLOGIC REPOSITORIES Licenses Us/iaea Safeguards Agreement § 60.47 Facility information and verification. (a)...

  18. 10 CFR 60.47 - Facility information and verification.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... 10 Energy 2 2011-01-01 2011-01-01 false Facility information and verification. 60.47 Section 60.47 Energy NUCLEAR REGULATORY COMMISSION (CONTINUED) DISPOSAL OF HIGH-LEVEL RADIOACTIVE WASTES IN GEOLOGIC REPOSITORIES Licenses Us/iaea Safeguards Agreement § 60.47 Facility information and verification. (a)...

  19. 20 CFR 212.5 - Verification of military service.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... 20 Employees' Benefits 1 2010-04-01 2010-04-01 false Verification of military service. 212.5... MILITARY SERVICE § 212.5 Verification of military service. Military service may be verified by the... armed forces that shows the beginning and ending dates of the individual's active military service; or...

  20. 20 CFR 212.5 - Verification of military service.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... 20 Employees' Benefits 1 2011-04-01 2011-04-01 false Verification of military service. 212.5... MILITARY SERVICE § 212.5 Verification of military service. Military service may be verified by the... armed forces that shows the beginning and ending dates of the individual's active military service; or...

  1. ENVIRONMENTAL TECHNOLOGY VERIFICATION TEST PROTOCOL, GENERAL VENTILATION FILTERS

    EPA Science Inventory

    The Environmental Technology Verification Test Protocol, General Ventilation Filters provides guidance for verification tests.

    Reference is made in the protocol to the ASHRAE 52.2P "Method of Testing General Ventilation Air-cleaning Devices for Removal Efficiency by P...

  2. A verification logic representation of indeterministic signal states

    NASA Technical Reports Server (NTRS)

    Gambles, J. W.; Windley, P. J.

    1991-01-01

    The integration of modern CAD tools with formal verification environments requires translation from a hardware description language to the verification logic. A signal representation including both unknown state and a degree of strength indeterminacy is essential for the correct modeling of many VLSI circuit designs. A higher-order logic theory of indeterministic logic signals is presented.
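
    A minimal sketch of a signal value carrying both level and strength indeterminacy (the resolution rule is a simplified illustration, not the paper's higher-order theory):

    ```python
    # Sketch of a signal lattice with logic level (0/1/X) and drive strength
    # (high-impedance, weak, strong), the kind of value space a verification
    # logic for indeterministic signals must model.
    from dataclasses import dataclass

    STRENGTHS = {"Z": 0, "weak": 1, "strong": 2}   # increasing drive strength

    @dataclass(frozen=True)
    class Signal:
        level: str      # "0", "1", or "X" (unknown)
        strength: str   # "Z", "weak", or "strong"

    def resolve(a: Signal, b: Signal) -> Signal:
        """Wired resolution: stronger driver wins; equal strengths conflict to X."""
        sa, sb = STRENGTHS[a.strength], STRENGTHS[b.strength]
        if sa > sb:
            return a
        if sb > sa:
            return b
        if a.level == b.level:
            return a
        return Signal("X", a.strength)

    print(resolve(Signal("1", "strong"), Signal("0", "weak")))   # strong 1 wins
    print(resolve(Signal("1", "weak"), Signal("0", "weak")))     # conflict -> X
    ```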

  3. 19 CFR 10.309 - Verification of documentation.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... THE TREASURY ARTICLES CONDITIONALLY FREE, SUBJECT TO A REDUCED RATE, ETC. United States-Canada Free Trade Agreement § 10.309 Verification of documentation. Any evidence of country of origin or of direct shipment submitted in support of a preference under the Agreement shall be subject to such verification...

  4. 40 CFR 1066.240 - Torque transducer verification and calibration.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ...) AIR POLLUTION CONTROLS VEHICLE-TESTING PROCEDURES Dynamometer Specifications § 1066.240 Torque transducer verification and calibration. Calibrate torque-measurement systems as described in 40 CFR 1065.310. ... 40 Protection of Environment 34 2012-07-01 2012-07-01 false Torque transducer verification...

  5. 40 CFR 1066.240 - Torque transducer verification and calibration.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ...) AIR POLLUTION CONTROLS VEHICLE-TESTING PROCEDURES Dynamometer Specifications § 1066.240 Torque transducer verification and calibration. Calibrate torque-measurement systems as described in 40 CFR 1065.310. ... 40 Protection of Environment 34 2013-07-01 2013-07-01 false Torque transducer verification...

  6. 40 CFR 1065.372 - NDUV analyzer HC and H2O interference verification.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... compensation algorithms that utilize measurements of other gases to meet this interference verification, simultaneously conduct such measurements to test the algorithms during the analyzer interference verification....

  7. 40 CFR 1065.372 - NDUV analyzer HC and H2O interference verification.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... compensation algorithms that utilize measurements of other gases to meet this interference verification, simultaneously conduct such measurements to test the algorithms during the analyzer interference verification....

  8. 40 CFR 1065.372 - NDUV analyzer HC and H2O interference verification.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... compensation algorithms that utilize measurements of other gases to meet this interference verification, simultaneously conduct such measurements to test the algorithms during the analyzer interference verification....

  9. 40 CFR 1065.372 - NDUV analyzer HC and H2O interference verification.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... compensation algorithms that utilize measurements of other gases to meet this interference verification, simultaneously conduct such measurements to test the algorithms during the analyzer interference verification....

  10. 40 CFR 1065.372 - NDUV analyzer HC and H2O interference verification.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... compensation algorithms that utilize measurements of other gases to meet this interference verification, simultaneously conduct such measurements to test the algorithms during the analyzer interference verification....

  11. Observatory verification: principles and lessons learned in commissioning the Hubble Observatory following shuttle servicing

    NASA Astrophysics Data System (ADS)

    Biagetti, Carl

    2002-12-01

    The Hubble Space Telescope (HST) was designed for periodic servicing by Space Shuttle astronauts. These servicing missions enable state-of-the-art upgrades to the Observatory's scientific capabilities, engineering upgrades and refurbishments, and, when needed, repairs. Since its launch and deployment in 1990, there have been four space shuttle missions to service the HST. (A fifth is currently scheduled for March 2004.) In each case, upon completion of a servicing mission and the astronauts' release of the telescope, HST undergoes a period of intense and highly coordinated verification activities designed to commission the Observatory's new capabilities and components for normal operations. The commissioning program following the 1990 deployment mission was known as OV/SV (orbital verification/science verification), while each of those following the subsequent Shuttle servicings has become known as servicing mission observatory verification, or SMOV. The 1990 OV/SV activities were hampered and greatly complicated by the problem of spherical aberration of the primary optics. The first servicing mission, SM1, in December 1993, is still remembered as the Hubble repair mission, having restored HST's optics to within the original mission specifications. SMOV1 was important not only for confirming the optical fixes with spectacular early images, but also for demonstrating the effectiveness of "success-oriented" scheduling as a technique for orbital verification. The second servicing mission, SM2, in February 1997, greatly enhanced the scientific capabilities of HST but did so at the cost of greatly increased mechanical and operational complexity. The resulting SMOV2 program was accordingly the most complicated and ambitious till then and, as it turned out, the most responsive and resilient, as the newly installed instruments presented serious, unforeseen on-orbit problems. The third servicing mission, SM3a, carried out in December 1999, was essentially an emergency mission

  12. DFM flow by using combination between design based metrology system and model based verification at sub-50nm memory device

    NASA Astrophysics Data System (ADS)

    Kim, Cheol-kyun; Kim, Jungchan; Choi, Jaeseung; Yang, Hyunjo; Yim, Donggyu; Kim, Jinwoong

    2007-03-01

    As the minimum transistor length shrinks, the variation and uniformity of transistor length seriously affect device performance, so the importance of optical proximity correction (OPC) and resolution enhancement technology (RET) cannot be overemphasized. However, the OPC process is regarded by some as a necessary evil for device performance. In fact, every group, including process and design, is interested in the whole-chip CD variation trend and CD uniformity, which represent the real wafer. Recently, design-based metrology systems have become capable of detecting differences between the design database and wafer SEM images, and thus of extracting whole-chip CD variation. Using these results, OPC abnormalities were identified and design feedback items were disclosed. Complementary approaches have been developed by EDA companies, such as model-based OPC verification, in which a well-calibrated model is applied over the full chip area. The object of model-based verification is to predict potential weak points on the wafer and to feed back quickly to OPC and design before reticle fabrication. To achieve a robust design and sufficient device margin, an appropriate combination of a design-based metrology system and model-based verification tools is very important; we therefore evaluated a design-based metrology system and a matched model-based verification system to find the optimum combination of the two. In our study, a large amount of wafer data was classified and analyzed by statistical methods and sorted into OPC-feedback and design-feedback items. Additionally, a novel DFM flow is proposed that combines design-based metrology and model-based verification tools.
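
    The abstract describes statistical screening of whole-chip CD data but gives no formulas. A minimal hypothetical sketch of such a screen, flagging sites whose measured CD deviates from target by more than an assumed tolerance (the data, units, and 1 nm tolerance are invented, not values from the paper):

        # Illustrative only: flag potential OPC weak points against a fixed
        # screening tolerance; this is not the flow described in the paper.
        def flag_hotspots(cd_measured, cd_target, tol_nm=1.0):
            """Indices where |measured - target| exceeds the tolerance."""
            return [i for i, m in enumerate(cd_measured) if abs(m - cd_target) > tol_nm]

        cds = [45.1, 44.9, 45.0, 47.8, 45.2]        # hypothetical gate CDs, nm
        print(flag_hotspots(cds, cd_target=45.0))   # -> [3]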

  13. Color information verification system based on singular value decomposition in gyrator transform domains

    NASA Astrophysics Data System (ADS)

    Abuturab, Muhammad Rafiq

    2014-06-01

    A new color image security system based on singular value decomposition (SVD) in gyrator transform (GT) domains is proposed. In the encryption process, a color image is decomposed into red, green, and blue channels. Each channel is independently modulated by random phase masks and then separately gyrator transformed at different parameters. The three gyrator spectra are joined by multiplication to give one gray ciphertext. The ciphertext is separated into U, S, and V parts by SVD, and each part is individually gyrator transformed at a different transformation angle. The three encoded parts can be assigned to different authorized users for highly secure verification. Only when all the authorized users place the U, S, and V parts in the correct multiplication order in the verification system, and hold all the right keys, can the correct information be obtained. In the proposed method, SVD offers a one-way asymmetric decomposition algorithm and is an optimal matrix decomposition in the least-squares sense. The transformation angles of the GT provide very sensitive additional keys, and the pre-generated keys for the red, green, and blue channels serve as decryption (private) keys. Because all three encrypted parts are gray-scale ciphertexts with stationary white-noise distributions, they have a camouflage property to some extent. These advantages enhance the security and robustness. Numerical simulations are presented to support the viability of the proposed verification system.
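
    The SVD step described above can be made concrete with a short numpy sketch; the gyrator transforms and phase masks of the actual scheme are omitted, so this shows only why the U, S, and V parts must be recombined in the correct multiplication order:

        # Minimal sketch of the SVD split/recombine step only.
        import numpy as np

        rng = np.random.default_rng(0)
        ciphertext = rng.random((4, 4))          # stand-in for the gray ciphertext

        U, s, Vt = np.linalg.svd(ciphertext)     # split into the three parts
        S = np.diag(s)

        print(np.allclose(U @ S @ Vt, ciphertext))   # True: correct order recovers it
        print(np.allclose(Vt @ S @ U, ciphertext))   # False (almost surely): wrong order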

  14. Verification and quality control of routine hematology analyzers.

    PubMed

    Vis, J Y; Huisman, A

    2016-05-01

    Verification of hematology analyzers (automated blood cell counters) is mandatory before new hematology analyzers may be used in routine clinical care. The verification process consists of several items, among them precision, accuracy, comparability, carryover, background, and linearity throughout the expected range of results. Yet which standard should be met, or which verification limit should be used, is at the discretion of the laboratory specialist. This paper offers practical guidance on verification and quality control of automated hematology analyzers and provides an expert opinion on the performance standard that should be met by the contemporary generation of hematology analyzers. To that end, (i) the state-of-the-art performance of hematology analyzers for complete blood count parameters is summarized, (ii) considerations, challenges, and pitfalls concerning the development of a verification plan are discussed, (iii) guidance is given regarding the establishment of reference intervals, and (iv) different methods of quality control of hematology analyzers are reviewed. PMID:27161194
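
    As a minimal sketch of two of the verification items listed above, the following computes within-run precision as a coefficient of variation and a carryover estimate using one commonly cited formulation, (L1 - L3) / (H3 - L3) x 100; the data are placeholders, and no acceptance limit is implied:

        # Illustrative verification calculations; not a recommended standard.
        import statistics

        def precision_cv(replicates):
            """Within-run precision as a coefficient of variation (%)."""
            return 100 * statistics.stdev(replicates) / statistics.mean(replicates)

        def carryover_pct(high_runs, low_runs):
            """Carryover: triplicate high sample followed by triplicate low sample."""
            return 100 * (low_runs[0] - low_runs[-1]) / (high_runs[-1] - low_runs[-1])

        wbc = [6.1, 6.0, 6.2, 6.1, 6.0]          # 10^9/L replicate counts
        print(f"CV = {precision_cv(wbc):.1f}%")
        print(f"carryover = {carryover_pct([25.0, 24.8, 24.9], [3.1, 3.0, 3.0]):.2f}%")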

  15. Approaches to verification of two-dimensional water quality models

    SciTech Connect

    Butkus, S.R. . Water Quality Dept.)

    1990-11-01

    Verification of a water quality model is the procedure most needed by decision makers evaluating model predictions, but it is often inadequate or not done at all. The results of a properly conducted verification provide decision makers with an estimate of the uncertainty associated with model predictions. Several statistical tests are available for quantifying the performance of a model. Six methods of verification were evaluated using an application of the BETTER two-dimensional water quality model for Chickamauga Reservoir. Model predictions for ten state variables were compared to observed conditions from 1989. Spatial distributions of the verification measures showed that the model predictions were generally adequate, except at a few specific locations in the reservoir. The most useful statistic was the mean standard error of the residuals. Quantifiable measures of model performance should be calculated during calibration and verification of future applications of the BETTER model. 25 refs., 5 figs., 7 tabs.
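
    Two of the residual-based verification measures discussed above are easy to state concretely; the sketch below computes the mean error (bias) and root-mean-square error of paired predicted/observed values. The variable names and data are illustrative, not output of the BETTER model:

        # Generic residual statistics for model verification.
        import math

        def mean_error(pred, obs):
            """Average residual (bias)."""
            return sum(p - o for p, o in zip(pred, obs)) / len(obs)

        def rmse(pred, obs):
            """Root-mean-square error of the residuals."""
            return math.sqrt(sum((p - o) ** 2 for p, o in zip(pred, obs)) / len(obs))

        do_pred = [7.9, 8.1, 6.5, 5.2]   # e.g. dissolved oxygen, mg/L (hypothetical)
        do_obs  = [8.0, 7.8, 6.9, 5.0]
        print(mean_error(do_pred, do_obs), rmse(do_pred, do_obs))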

  16. Formal verification of an oral messages algorithm for interactive consistency

    NASA Technical Reports Server (NTRS)

    Rushby, John

    1992-01-01

    The formal specification and verification of an algorithm for Interactive Consistency based on the Oral Messages algorithm for Byzantine Agreement is described. We compare our treatment with that of Bevier and Young, who presented a formal specification and verification for a very similar algorithm. Unlike Bevier and Young, who observed that 'the invariant maintained in the recursive subcases of the algorithm is significantly more complicated than is suggested by the published proof' and who found its formal verification 'a fairly difficult exercise in mechanical theorem proving,' our treatment is very close to the previously published analysis of the algorithm, and our formal specification and verification are straightforward. This example illustrates how delicate choices in the formulation of the problem can have significant impact on the readability of its formal specification and on the tractability of its formal verification.
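
    For readers unfamiliar with the algorithm being verified, the recursion of Lamport's Oral Messages algorithm OM(m) can be illustrated with a short simulation. This is an informal sketch in which traitors simply invert relayed values (one possible Byzantine behavior); it is not the formally verified specification discussed in the record:

        # Compact simulation of the recursive Oral Messages algorithm OM(m).
        def send(sender, value, traitors):
            """Deliver a binary value; traitors invert whatever they relay."""
            return 1 - value if sender in traitors else value

        def majority(votes):
            return 1 if 2 * sum(votes) > len(votes) else 0   # ties default to 0

        def om(m, commander, value, lieutenants, traitors):
            """Return {lieutenant: decided value} after OM(m)."""
            received = {lt: send(commander, value, traitors) for lt in lieutenants}
            if m == 0:
                return received
            sub = {}
            for j in lieutenants:
                others = [x for x in lieutenants if x != j]
                # j acts as commander in OM(m-1), relaying what it received.
                sub[j] = om(m - 1, j, received[j], others, traitors)
            return {lt: majority([received[lt]] +
                                 [sub[j][lt] for j in lieutenants if j != lt])
                    for lt in lieutenants}

        # n = 4, m = 1 tolerates one traitor: loyal lieutenants 1 and 2 agree on 1.
        print(om(1, 0, 1, [1, 2, 3], traitors={3}))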

  17. Structural Verification of the Redesigned Space Shuttle Bipod Foam Closeout

    NASA Technical Reports Server (NTRS)

    Poole, Eric L.; Rogers, Patrick R.

    2005-01-01

    This document outlines the structural verification approach for the Space Shuttle External Tank Forward Bipod Foam Closeout. As a result of the Space Shuttle Columbia accident, debris has become a major concern. The intent of the structural verification is to ensure that any debris shed from the bipod is within acceptable limits. Since cohesive failure due to internal defects was identified as the most likely cause of the STS-107 bipod ramp foam failure, verification for this failure mode receives particular emphasis. However, all failure modes for TPS are considered and appropriate verification rationale is developed for each. Figure 1 depicts the structural verification of a production design where analysis and test are the primary methods of verification. Successful completion of structural verification depends on three main areas: 1. Production process control and quality assurance must ensure that test articles and/or analytical models are representative of (or conservatively envelope) production hardware in terms of geometry, materials, and processing; variability and defects must be considered. 2. Flight environments must be sufficiently characterized to bound the driving environments for all failure modes; applied environments, whether test or analytical, must be representative of flight environments and carry a load factor that satisfies design requirements. 3. Structural verification must include all failure modes; a comprehensive list of failure modes and the underlying failure mechanisms has been generated based on flight and test experience, and verification tests and/or analyses must address each one. ET TPS verification is accomplished by a combination of analysis, test, and similarity.

  18. Computer aided production planning - SWZ system of order verification

    NASA Astrophysics Data System (ADS)

    Krenczyk, D.; Skolud, B.

    2015-11-01

    SWZ (System of Order Verification) is a computer implementation of a methodology that supports fast decision making on the acceptability of a production order; it determines not the best possible solution, but an admissible solution that can be found in an acceptable time (a feasible solution) and is acceptable under the existing constraints. The methodology uses constraint propagation techniques and reduces the decision to testing a sequence of arbitrarily selected conditions; fulfilment of all the conditions (their conjunction) establishes that the production order can be performed. In the paper, examples of the application of the SWZ system covering the steps of planning and control are presented. The obtained results allow the determination of an acceptable production flow in the system, that is, of the manufacturing system parameters that ensure execution of orders in time under the resource constraints. SWZ can also generate dispatching rules as a sequence of processing operations for each production resource, performed periodically during the production flow in the system. Furthermore, an example of the integration of SWZ with a simulation system is shown: SWZ has been enhanced with a module generating files containing the script code of the system model in the internal language of the simulation and visualization system.
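
    The condition-conjunction idea is straightforward to illustrate: an order is admissible only if every test in an arbitrarily ordered sequence passes, so checking can stop at the first failure. The conditions and fields below are invented examples, not those used by SWZ:

        # Illustrative conjunction check for order admissibility.
        def verify_order(order, conditions):
            """Stop at the first violated condition; report which one failed."""
            for name, test in conditions:
                if not test(order):
                    return False, name
            return True, None

        order = {"qty": 120, "rate_per_h": 4, "due_h": 40, "capacity_h": 25}

        conditions = [
            ("due date", lambda o: o["qty"] / o["rate_per_h"] <= o["due_h"]),
            ("capacity", lambda o: o["qty"] / o["rate_per_h"] <= o["capacity_h"]),
        ]
        print(verify_order(order, conditions))   # (False, 'capacity')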

  19. On-site inspection for verification of compliance with treaties

    SciTech Connect

    Dorn, D.W.

    1989-12-01

    OSIs (On-Site Inspections) can be divided into three types: (1) routine inspections at declared facilities, (2) short-notice inspections at declared facilities, and (3) short-notice inspections at undeclared facilities. Routine inspections at declared facilities are basically cooperative measures and will likely be confidence building. Short-notice inspections at declared facilities can increase the difficulty of conducting non-compliant activities and can serve either as confidence-building measures or as a more confrontational action. Short-notice inspections at undeclared facilities are confrontational and are likely to be most useful for obtaining information that will confirm that obtained through NTM (National Technical Means). Although some information will be compromised during the conduct of an OSI, specific procedures followed prior to and during the course of the inspections can mitigate this loss. In addition, OSIs at declared facilities serve to enhance NTM and to provide an on-site presence. On the other hand, short-notice inspections at undeclared facilities are unlikely to uncover gross violations and may well lead to very significant losses of national security information. It may be, though, that challenge or suspect-site inspections will provide a perceived increase in effective verification of compliance. This perception may be necessary in order for the national authorities to ratify the treaty.

  20. Simple metrics for verification and validation of macrosegregation model predictions

    NASA Astrophysics Data System (ADS)

    Vušanović, I.; Voller, V. R.

    2016-03-01

    While the numerical simulation of macrosegregation is now a commonplace activity, efforts can still be enhanced by developing quantitative measures of the results. Here, by treating the nodal field of concentration predictions from a macrosegregation simulation as a sample from a statistical distribution, we demonstrate how statistical measures can be used in verification and validation. The first set of such measures is simply the central moments of the distribution, i.e., the mean, the standard deviation, and the skewness; measures that provide quantitative checks of mass balance and grid convergence. In addition, building on recently reported work [1], we also demonstrate how to construct and use a cumulative distribution function (CDF) of the nodal concentration field; a measure that can be used to determine the fraction of the casting volume with concentrations less than a specified value. We show how the CDF can be used to compare the influence of various process conditions and phenomena related to domain size, cooling rate, permeability, and micro-segregation.
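
    Both kinds of measures described above are simple to compute once the nodal concentration field is in hand; the sketch below uses numpy/scipy on a synthetic field and assumes equal nodal volumes (weight by cell volume otherwise):

        # Central moments and an empirical CDF query on a stand-in nodal field.
        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(1)
        c = rng.normal(3.2, 0.4, size=10_000)   # hypothetical nodal concentrations, wt%

        # Central moments: checks on mass balance (mean) and grid convergence.
        print(c.mean(), c.std(), stats.skew(c))

        # Empirical CDF: fraction of the casting volume below a concentration c0.
        c0 = 3.0
        print(np.mean(c <= c0))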

  1. Verification of Minimum Detectable Activity for Radiological Threat Source Search

    NASA Astrophysics Data System (ADS)

    Gardiner, Hannah; Myjak, Mitchell; Baciak, James; Detwiler, Rebecca; Seifert, Carolyn

    2015-10-01

    The Department of Homeland Security's Domestic Nuclear Detection Office is working to develop advanced technologies that will improve the ability to detect, localize, and identify radiological and nuclear sources from airborne platforms. The Airborne Radiological Enhanced-sensor System (ARES) program is developing advanced data fusion algorithms for analyzing data from a helicopter-mounted radiation detector. This detector platform provides a rapid, wide-area assessment of radiological conditions at ground level. The NSCRAD (Nuisance-rejection Spectral Comparison Ratios for Anomaly Detection) algorithm was developed to distinguish low-count sources of interest from benign naturally occurring radiation and irrelevant nuisance sources. It uses a number of broad, overlapping regions of interest to statistically compare each newly measured spectrum with the current estimate for the background to identify anomalies. We recently developed a method to estimate the minimum detectable activity (MDA) of NSCRAD in real time. We present this method here and report on the MDA verification using both laboratory measurements and simulated injects on measured backgrounds at or near the detection limits. This work is supported by the US Department of Homeland Security, Domestic Nuclear Detection Office, under competitively awarded contract/IAA HSHQDC-12-X-00376. This support does not constitute an express or implied endorsement on the part of the Gov't.
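
    The paper's real-time NSCRAD MDA method is not reproduced in the abstract; as a generic point of reference, the sketch below uses the classic Currie formulation, MDA = (2.71 + 4.65*sqrt(B)) / (eff x t x yield), with purely illustrative numbers:

        # Currie-style MDA as a reference calculation; not the NSCRAD method.
        import math

        def currie_mda(background_counts, efficiency, live_time_s, gamma_yield):
            """Currie detection limit converted to a minimum detectable activity (Bq)."""
            ld = 2.71 + 4.65 * math.sqrt(background_counts)   # detection limit, counts
            return ld / (efficiency * live_time_s * gamma_yield)

        # Hypothetical airborne dwell: 400 background counts in 2 s, 0.1% absolute
        # efficiency at standoff range, 85% gamma branching ratio.
        print(f"{currie_mda(400, 1e-3, 2.0, 0.85):.3g} Bq")   # ~5.63e+04 Bq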

  2. Independent verification and validation of large software requirement specification databases

    SciTech Connect

    Twitchell, K.E.

    1992-04-01

    To enhance quality, an independent verification and validation (IV&V) review is conducted as software requirements are defined. Requirements are inspected for consistency and completeness. IV&V strives to detect defects early in the software development life cycle and to prevent problems before they occur. The IV&V review process of a massive software requirements specification, the Reserve Component Automation System (RCAS) Functional Description (FD), is explored. Analysis of the RCAS FD error history determined that there are no predictors of errors. The size of the FD mandates electronic analysis of the databases. Software which successfully performs automated consistency and completeness checks is discussed. The process of verifying the quality of the analysis software is described. The use of intuitive ad hoc techniques, in addition to the automatic analysis of the databases, is required because of the varying content of the requirements databases. The ad hoc investigation process is discussed. Case studies are provided to illustrate how the process works. This thesis demonstrates that it is possible to perform an IV&V review on a massive software requirements specification. Automatic analysis enables inspecting for completeness and consistency. The work with the RCAS FD clearly indicates that the IV&V review process is not static; it must continually grow, adapt, and change as conditions warrant. The ad hoc investigation process provides this required flexibility. This process also analyzes errors discovered by manual review and automatic processing. The analysis results in the development of new algorithms and the addition of new programs to the automatic inspection software.
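
    The kind of automated consistency and completeness check described above can be illustrated with a toy example that finds duplicated requirement IDs and references to undefined requirements; the field names are invented, not those of the RCAS FD:

        # Toy completeness/consistency pass over a requirements database.
        def check_requirements(reqs):
            """Return (duplicate IDs, references to undefined IDs)."""
            ids = [r["id"] for r in reqs]
            duplicates = {i for i in ids if ids.count(i) > 1}
            defined = set(ids)
            dangling = [(r["id"], ref) for r in reqs
                        for ref in r.get("refs", []) if ref not in defined]
            return duplicates, dangling

        reqs = [
            {"id": "R1", "refs": ["R2"]},
            {"id": "R2", "refs": ["R9"]},   # R9 is never defined
            {"id": "R2", "refs": []},       # duplicate ID
        ]
        print(check_requirements(reqs))     # ({'R2'}, [('R2', 'R9')])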

  3. Static corrections for enhanced signal detection at IMS seismic arrays

    NASA Astrophysics Data System (ADS)

    Wilkins, Neil; Wookey, James; Selby, Neil

    2016-04-01

    Seismic monitoring forms an important part of the International Monitoring System (IMS) for verifying the Comprehensive Nuclear-Test-Ban Treaty (CTBT). Analysis of seismic data can be used to discriminate between nuclear explosions and the tens of thousands of natural earthquakes of similar magnitude that occur every year. This is known as "forensic seismology", and techniques include measuring the P-to-S wave amplitude ratio, the body-to-surface wave magnitude ratio (mb/Ms), and source depth. Measurement of these seismic discriminants requires very high signal-to-noise ratio (SNR) data, and this has led to the development and deployment of seismic arrays as part of the IMS. Array processing methodologies such as stacking can be used, but optimum SNR improvement needs an accurate estimate of the arrival time of the particular seismic phase. To enhance the imaging capability of IMS arrays, we aim to develop site-specific static corrections to the arrival time as a function of frequency, slowness, and backazimuth. Here, we present initial results for the IMS TORD array in Niger. Vespagrams are calculated for various events using the F-statistic to clearly identify seismic phases and measure their arrival times. Observed arrival times are compared with those predicted by 1D and 3D velocity models, and residuals are calculated for a range of backazimuths and slownesses. Finally, we demonstrate the improvement in signal fidelity provided by these corrections.
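
    To show where such static corrections enter array processing, the following is a schematic delay-and-sum stack in which per-station statics are added to the plane-wave delays; the geometry, slowness, and static values are invented, not TORD parameters:

        # Schematic delay-and-sum beam with per-station static corrections.
        import numpy as np

        def beam(traces, dt, offsets_km, slowness_s_km, statics_s):
            """Align traces for a plane wave (statics added to the delays) and stack."""
            out = np.zeros(traces.shape[1])
            for tr, x, st in zip(traces, offsets_km, statics_s):
                shift = int(round((x * slowness_s_km + st) / dt))
                out += np.roll(tr, -shift)
            return out / len(traces)

        dt, n = 0.05, 400
        offsets = np.array([0.0, 1.0, 2.0])      # km along the slowness vector
        s = 0.08                                 # s/km horizontal slowness
        statics = np.array([0.0, 0.02, -0.01])   # s, per-station corrections
        t0 = 5.0                                 # arrival time at reference station

        traces = np.zeros((3, n))
        for i, (x, st) in enumerate(zip(offsets, statics)):
            traces[i, int(round((t0 + x * s + st) / dt))] = 1.0   # impulsive arrival

        stacked = beam(traces, dt, offsets, s, statics)
        print(stacked.argmax() * dt)             # ~5.0 s: coherent stack peak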

  4. Image Hashes as Templates for Verification

    SciTech Connect

    Janik, Tadeusz; Jarman, Kenneth D.; Robinson, Sean M.; Seifert, Allen; McDonald, Benjamin S.; White, Timothy A.

    2012-07-17

    Imaging systems can provide measurements that confidently assess characteristics of nuclear weapons and dismantled weapon components, and such assessment will be needed in future verification for arms control. Yet imaging is often viewed as too intrusive, raising concern about the ability to protect sensitive information. In particular, the prospect of using image-based templates for verifying the presence or absence of a warhead, or of the declared configuration of fissile material in storage, may be rejected out-of-hand as being too vulnerable to violation of information barrier (IB) principles. Development of a rigorous approach for generating and comparing reduced-information templates from images, and assessing the security, sensitivity, and robustness of verification using such templates, are needed to address these concerns. We discuss our efforts to develop such a rigorous approach based on a combination of image-feature extraction and encryption, utilizing hash functions to confirm proffered declarations, providing strong classified data security while maintaining high confidence for verification. The proposed work is focused on developing secure, robust, tamper-sensitive and automatic techniques that may enable the comparison of non-sensitive hashed image data outside an IB. It is rooted in research on so-called perceptual hash functions for image comparison, at the interface of signal/image processing, pattern recognition, cryptography, and information theory. Such perceptual or robust image hashing—which, strictly speaking, is not truly cryptographic hashing—has extensive application in content authentication and information retrieval, database search, and security assurance. Applying and extending the principles of perceptual hashing to imaging for arms control, we propose techniques that are sensitive to altering, forging and tampering of the imaged object yet robust and tolerant to content-preserving image distortions and noise. Ensuring that the
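
    The abstract does not specify the hash construction; as a generic illustration of perceptual hashing, the sketch below implements a simple average hash (block means thresholded at their mean) and compares hashes by Hamming distance. None of the paper's encryption or information-barrier machinery is represented:

        # Generic average-hash ("aHash") sketch: robust to small noise,
        # sensitive to a localized alteration. Not the paper's scheme.
        import numpy as np

        def ahash(img, grid=8):
            """Block-mean downsample to grid x grid, then threshold at the mean."""
            h, w = img.shape
            img = img[: h - h % grid, : w - w % grid]    # make shape divisible
            blocks = img.reshape(grid, img.shape[0] // grid,
                                 grid, img.shape[1] // grid).mean(axis=(1, 3))
            return (blocks > blocks.mean()).ravel()

        def hamming(a, b):
            return int(np.sum(a != b))

        rng = np.random.default_rng(2)
        scene = rng.random((64, 64))                       # stand-in for an image
        noisy = scene + rng.normal(0, 0.01, scene.shape)   # content-preserving noise
        tampered = scene.copy()
        tampered[:16, :16] += 1.0                          # altered region

        print(hamming(ahash(scene), ahash(noisy)))      # small distance: robust
        print(hamming(ahash(scene), ahash(tampered)))   # larger distance: sensitive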

  5. Integrated Medical Model Verification, Validation, and Credibility

    NASA Technical Reports Server (NTRS)

    Walton, Marlei; Kerstman, Eric; Foy, Millennia; Shah, Ronak; Saile, Lynn; Boley, Lynn; Butler, Doug; Myers, Jerry

    2014-01-01

    The Integrated Medical Model (IMM) was designed to forecast relative changes for a specified set of crew health and mission success risk metrics by using a probabilistic (stochastic process) model based on historical data, cohort data, and subject matter expert opinion. A probabilistic approach is taken since exact (deterministic) results would not appropriately reflect the uncertainty in the IMM inputs. Once the IMM was conceptualized, a plan was needed to rigorously assess input information, framework and code, and output results of the IMM, and ensure that end user requests and requirements were considered during all stages of model development and implementation. METHODS: In 2008, the IMM team developed a comprehensive verification and validation (V&V) plan, which specified internal and external review criteria encompassing 1) verification of data and IMM structure to ensure proper implementation of the IMM, 2) several validation techniques to confirm that the simulation capability of the IMM appropriately represents occurrences and consequences of medical conditions during space missions, and 3) credibility processes to develop user confidence in the information derived from the IMM. When the NASA-STD-7009 (7009) was published, the IMM team updated their verification, validation, and credibility (VVC) project plan to meet 7009 requirements and include 7009 tools in reporting VVC status of the IMM. RESULTS: IMM VVC updates are compiled recurrently and include 7009 Compliance and Credibility matrices, IMM V&V Plan status, and a synopsis of any changes or updates to the IMM during the reporting period. Reporting tools have evolved over the lifetime of the IMM project to better communicate VVC status. This has included refining original 7009 methodology with augmentation from the NASA-STD-7009 Guidance Document. End user requests and requirements are being satisfied as evidenced by ISS Program acceptance of IMM risk forecasts, transition to an operational model and

  6. Experimental verification of vapor deposition model in Mach 0.3 burner rigs

    NASA Technical Reports Server (NTRS)

    Gokoglu, S. A.

    1984-01-01

    A comprehensive theoretical framework for deposition from combustion gases was developed, covering the spectrum of mass delivery mechanisms including vapor, thermophoretically enhanced small-particle, and inertially impacting large-particle deposition. Rational yet simple correlations were provided to facilitate engineering predictions of surface arrival rates. The deposition theory was then verified experimentally using burner rigs. Toward this end, a Mach 0.3 burner rig apparatus was designed to measure deposition rates from salt-seeded combustion gases on an internally cooled cylindrical collector.

  7. Specification of Selected Performance Monitoring and Commissioning Verification Algorithms for CHP Systems

    SciTech Connect

    Brambley, Michael R.; Katipamula, Srinivas

    2006-10-06

    Pacific Northwest National Laboratory (PNNL) is assisting the U.S. Department of Energy (DOE) Distributed Energy (DE) Program by developing advanced control algorithms that would lead to development of tools to enhance performance and reliability, and reduce emissions of distributed energy technologies, including combined heat and power technologies. This report documents phase 2 of the program, providing a detailed functional specification for algorithms for performance monitoring and commissioning verification, scheduled for development in FY 2006. The report identifies the systems for which algorithms will be developed, the specific functions of each algorithm, metrics which the algorithms will output, and inputs required by each algorithm.
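
    As one illustrative metric of the kind such specifications describe, the sketch below computes overall CHP efficiency from fuel input and the two useful outputs and applies a simple commissioning-verification threshold; the numbers and the 70% limit are placeholders, not values from the report:

        # Illustrative CHP performance-monitoring metric and threshold check.
        def chp_total_efficiency(elec_kw, useful_heat_kw, fuel_kw):
            """Overall efficiency: (electric output + useful heat) / fuel input."""
            return (elec_kw + useful_heat_kw) / fuel_kw

        eta = chp_total_efficiency(elec_kw=300.0, useful_heat_kw=450.0, fuel_kw=1000.0)
        print(f"total efficiency = {eta:.0%}")   # 75%
        print("commissioning check:", "pass" if eta >= 0.70 else "investigate")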

  8. Selected Examples of LDRD Projects Supporting Test Ban Treaty Verification and Nonproliferation

    SciTech Connect

    Jackson, K.; Al-Ayat, R.; Walter, W. R.

    2015-02-23

    The Laboratory Directed Research and Development (LDRD) Program at the DOE National Laboratories was established to ensure the scientific and technical vitality of these institutions and to enhance their ability to respond to evolving missions and anticipate national needs. LDRD allows the laboratory directors to invest a percentage of their total annual budget in cutting-edge research and development projects within their mission areas. We highlight a selected set of LDRD-funded projects, in chronological order, that have helped provide capabilities, people, and infrastructure that contributed greatly to our ability to respond to technical challenges in support of test ban treaty verification and nonproliferation.

  9. EXAMINING THE ROLE AND RESEARCH CHALLENGES OF SOCIAL MEDIA AS A TOOL FOR NONPROLIFERATION AND ARMS CONTROL TREATY VERIFICATION

    SciTech Connect

    Henry, Michael J.; Cramer, Nicholas O.; Benz, Jacob M.; Gastelum, Zoe N.; Kreyling, Sean J.; West, Curtis L.

    2014-05-13

    of an event. The goal of this paper is to instigate a discussion within the verification community as to where and how social media can be effectively utilized to complement and enhance traditional treaty verification efforts. In addition, this paper seeks to identify areas of future research and development necessary to adapt social media analytic tools and techniques, and to form the seed for social media analytics to aid and inform arms control and nonproliferation policymakers and analysts. While social media analysis (as well as open source analysis as a whole) will not ever be able to replace traditional arms control verification measures, they do supply unique signatures that can augment existing analysis.

  10. MOV reliability evaluation and periodic verification scheduling

    SciTech Connect

    Bunte, B.D.

    1996-12-01

    The purpose of this paper is to establish a periodic verification testing schedule based on the expected long-term reliability of gate or globe motor-operated valves (MOVs). The methodology in this position paper determines the nominal (best estimate) design margin for any MOV based on the best available information pertaining to the MOV's design requirements, design parameters, existing hardware design, and present setup. The uncertainty in this margin is then determined using statistical means. By comparing the nominal margin to the uncertainty, the reliability of the MOV is estimated. The methodology is appropriate for evaluating the reliability of MOVs in the GL 89-10 program. It may be used following periodic testing to evaluate and trend MOV performance and reliability. It may also be used to evaluate the impact of proposed modifications and maintenance activities such as packing adjustments. In addition, it may be used to assess the impact of new information of a generic nature which impacts safety-related MOVs.
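
    The margin-versus-uncertainty comparison described above can be sketched by treating the design margin as normally distributed about its best estimate, so that reliability is read from the normal CDF; the numbers are illustrative, and the paper's statistical treatment may differ:

        # Reliability from nominal margin and its uncertainty, normal model assumed.
        from statistics import NormalDist

        def mov_reliability(nominal_margin_pct, margin_sigma_pct):
            """Probability that the actual margin is positive."""
            return 1.0 - NormalDist(nominal_margin_pct, margin_sigma_pct).cdf(0.0)

        # e.g. a 15% best-estimate margin with a 6% (1-sigma) uncertainty:
        print(f"{mov_reliability(15.0, 6.0):.4f}")   # ~0.9938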

  11. The Role of science in treaty verification

    SciTech Connect

    Gavron, A. I.

    2004-01-01

    Technologically advanced nations are currently applying more science to treaty verification than ever before. Satellites gather a multitude of information relating to proliferation concerns using thermal imaging analysis, nuclear radiation measurements, and optical and radio frequency signal detection. Ground stations gather complementary signals such as seismic events and radioactive emissions. Export controls in many countries attempt to intercept materials and technical means that could be used for nuclear proliferation. Nevertheless, we have witnessed a plethora of nuclear proliferation episodes that went undetected (or were belatedly detected) by these technologies: the Indian nuclear tests in 1998, the Libyan nuclear buildup, the Iranian enrichment program, and the North Korean nuclear weapons program are some prime examples. In this talk we will discuss some of the technologies used for proliferation detection. In particular, we will note some of the issues relating to nuclear materials control agreements that epitomize political difficulties as they impact the implementation of science and technology.

  12. Understanding correlation coefficients in treaty verification

    SciTech Connect

    DeVolpi, A.

    1991-11-01

    When a pair of images is compared on a point-by-point basis, the linear-correlation coefficient is usually used as a measure of similarity or dissimilarity. This paper evaluates the theoretical underpinnings and limitations of the linear-correlation coefficient, as well as other related statistics, particularly for cases where inherent white noise is present. As a result of the limitations of linear correlation, an additional step has been derived -- local-sum clustering -- in order to improve recognition of small dissimilarities in a pair of images. The results support a three-stage procedure: first establishing congruence of the two images, then using the linear-correlation coefficient as a test for true negatives, and finally qualifying a true positive by using the cluster (local-sum) method. These algorithmic stages would be especially useful in arms control treaty verification.
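
    The first two stages are easy to demonstrate: a global linear-correlation coefficient can remain high despite a small localized alteration, which local sums of the absolute difference image then expose. The window size and data below are illustrative, and the "local sum" here is a plain moving-window mean rather than the paper's exact method:

        # Global correlation vs. local-difference clustering on synthetic images.
        import numpy as np
        from scipy.ndimage import uniform_filter

        rng = np.random.default_rng(3)
        ref = rng.random((128, 128))
        test = ref + rng.normal(0, 0.05, ref.shape)   # inherent white noise
        test[60:64, 60:64] += 0.8                     # small localized alteration

        r = np.corrcoef(ref.ravel(), test.ravel())[0, 1]
        print(f"global correlation r = {r:.3f}")      # still high despite tampering

        local = uniform_filter(np.abs(test - ref), size=9)   # local mean difference
        print("peak local difference at", np.unravel_index(local.argmax(), local.shape))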

  13. Freeze verification: time for a fresh approach

    SciTech Connect

    Paine, C.

    1983-01-01

    The administration claims that some elements of a comprehensive nuclear freeze are unverifiable, but it does not specify the nature of those elements or whether they would represent a real threat to national security if the US trusted the USSR to comply. The author contends that clandestine development of new weapons would have little strategic effect, since both sides already possess total destructive power. The risks of noncompliance are largely political and less than the risks of a continued arms buildup. Since the USSR would also want the US to be bound by freeze terms, deterrence would come from mutual benefit. Hardliners argue that cheating is easier in a closed society, and that our democracy would tend to relax while the USSR moved ahead with its plans for world domination. The author argues that, over time, a freeze would diminish Soviet confidence in its nuclear war-fighting capabilities and that adequate verification is possible with monitoring and warning arrangements. (DCK)

  14. Towards Formal Verification of a Separation Microkernel

    NASA Astrophysics Data System (ADS)

    Butterfield, Andrew; Sanan, David; Hinchey, Mike

    2013-08-01

    The best approach to verifying an IMA separation kernel is to use a (fixed) time-space partitioning kernel with a multiple independent levels of separation (MILS) architecture. We describe an activity that explores the cost and feasibility of doing a formal verification of such a kernel to the Common Criteria (CC) levels mandated by the Separation Kernel Protection Profile (SKPP). We are developing a Reference Specification of such a kernel, and are using higher-order logic (HOL) to construct formal models of this specification and key separation properties. We then plan to do a dry run of part of a formal proof of those properties using the Isabelle/HOL theorem prover.

  15. The future of large optical system verification

    NASA Astrophysics Data System (ADS)

    Matthews, Gary

    2005-08-01

    As optical systems grow in size, there comes a point at which traditional system verification prior to launch becomes impossible. This implies that observatory ground testing will not be completed. Our history does not support this premise, and the omission therefore represents an unacceptable programmatic risk. But if the dream of building 20-30 meter systems is ever to come true, these realities must be accepted. To make this possible, new and better analytical tools and processes must be developed and certified on programs that can be tested on the ground. This change in paradigm does not eliminate critical testing; it just performs it at different assembly levels and most likely adds alignment flexibility to correct optical errors after launch. This paper provides ideas on how the hardware, analysis tools, and testing may evolve to support these ambitious future programs.

  16. Coherent Lidar Design and Performance Verification

    NASA Technical Reports Server (NTRS)

    Frehlich, Rod

    1996-01-01

    This final report summarizes the investigative results from the three complete years of funding, and the corresponding publications are listed. The first year saw the verification of beam alignment for coherent Doppler lidar in space by using the surface return. The second year saw the analysis and computerized simulation of using heterodyne efficiency as an absolute measure of the performance of coherent Doppler lidar. A new method was proposed to determine the estimation error for Doppler lidar wind measurements without the need for an independent wind measurement. Coherent Doppler lidar signal covariance, including wind shear and turbulence, was derived and calculated for typical atmospheric conditions. The effects of wind turbulence defined by Kolmogorov spatial statistics were investigated theoretically and with simulations. The third year saw the performance of coherent Doppler lidar in the weak-signal regime determined by computer simulations using the best velocity estimators. Improved algorithms for extracting the performance of velocity estimators with wind turbulence included were also produced.

  17. Nondestructive verification of continuous-variable entanglement

    NASA Astrophysics Data System (ADS)

    de Faria, Alencar J.

    2016-07-01

    An optical procedure in the context of continuous variables is proposed to verify bipartite entanglement without destroying either system or their entanglement. To perform the nondestructive verification of entanglement, the method relies on beam-splitter and quantum nondemolition (QND) interactions of the signal modes with two ancillary probe modes. The probe modes are measured by homodyne detection, and the obtained information is used to feed forward modulation of the signal modes, concluding the procedure. Characterizing the method by figures of merit used in QND processes, we establish the conditions for an effectively quantum scheme. Based on these conditions, it is shown that the classical information acquired from the homodyne detection of the probe modes is sufficient to verify the entanglement of the output signal modes. The impact of the noise added by the process on the output entanglement is assessed in the case of Gaussian modes.

  18. Figures of Merit for Control Verification

    NASA Technical Reports Server (NTRS)

    Crespo, Luis G.; Kenny, Sean P.; Giesy, Daniel P.

    2008-01-01

    This paper proposes a methodology for evaluating a controller's ability to satisfy a set of closed-loop specifications when the plant has an arbitrary functional dependency on uncertain parameters. Control verification metrics applicable to deterministic and probabilistic uncertainty models are proposed. These metrics, which result from sizing the largest uncertainty set of a given class for which the specifications are satisfied, enable systematic assessment of competing control alternatives regardless of the methods used to derive them. A particularly attractive feature of the tools derived is that their efficiency and accuracy do not depend on the robustness of the controller. This is in sharp contrast to Monte Carlo based methods where the number of simulations required to accurately approximate the failure probability grows exponentially with its closeness to zero. This framework allows for the integration of complex, high-fidelity simulations of the integrated system and only requires standard optimization algorithms for its implementation.

  19. Biometric Subject Verification Based on Electrocardiographic Signals

    NASA Technical Reports Server (NTRS)

    Dusan, Sorin V. (Inventor); Jorgensen, Charles C. (Inventor)

    2014-01-01

    A method of authenticating or declining to authenticate an asserted identity of a candidate-person. In an enrollment phase, a reference PQRST heart action graph is provided or constructed from information obtained from a plurality of graphs that resemble each other for a known reference person, using a first graph comparison metric. In a verification phase, a candidate-person asserts his/her identity and presents a plurality of his/her heart cycle graphs. If a sufficient number of the candidate-person's measured graphs resemble each other, a representative composite graph is constructed from the candidate-person's graphs and is compared with a composite reference graph, for the person whose identity is asserted, using a second graph comparison metric. When the second metric value lies in a selected range, the candidate-person's assertion of identity is accepted.

  20. Thermionic Fuel Element Verification Program - Overview

    NASA Astrophysics Data System (ADS)

    Bohl, Richard J.; Dahlberg, Richard C.; Dutt, Dale S.; Wood, John T.

    The Thermionic Fuel Element (TFE) Verification program was established in 1986 to resolve the technology concerns raised in Phase 1 of the SP-100 program, namely, the performance and lifetime of thermionic fuel elements in a fast spectrum reactor. The program builds directly on an extensive database developed in the 1960s and early 1970s in an AEC/NASA-sponsored program, when TFEs were developed and tested at design conditions for over 10,000 h. The current effort has reestablished that technology and is extending the lifetime up to 7 to 10 yr. A TFE lifetime of more than 2 yr has been demonstrated in the TRIGA reactor. Component lifetimes of more than 10 yr have been demonstrated in accelerated tests in the FFTF (Richland) and EBR-II (Idaho) test reactors. Program completion is scheduled for FY-95.