Science.gov

Sample records for enhancing ctbt verification

  1. Visual inspection for CTBT verification

    SciTech Connect

    Hawkins, W.; Wohletz, K.

    1997-03-01

    On-site visual inspection will play an essential role in future Comprehensive Test Ban Treaty (CTBT) verification. Although seismic and remote sensing techniques are the best understood and most developed methods for detection of evasive testing of nuclear weapons, visual inspection can greatly augment the certainty and detail of understanding provided by these more traditional methods. Not only can visual inspection offer "ground truth" in cases of suspected nuclear testing, but it can also provide accurate source location and testing-media properties necessary for detailed analysis of seismic records. For testing in violation of the CTBT, an offending party may attempt to conceal the test, most likely by underground burial. While such concealment may not prevent seismic detection, evidence of test deployment, location, and yield can be disguised. In this light, if a suspicious event is detected by seismic or other remote methods, visual inspection of the event area is necessary to document any evidence that might support a claim of nuclear testing and to provide data needed to further interpret seismic records and guide further investigations. However, the methods for visual inspection are neither widely known nor widely appreciated, and experience is presently limited. Visual inspection can be achieved by simple, non-intrusive means, primarily geological in nature, and the purpose of this report is to describe the considerations, procedures, and equipment required to field such an inspection.

  2. CTBT integrated verification system evaluation model supplement

    SciTech Connect

    EDENBURN,MICHAEL W.; BUNTING,MARCUS; PAYNE JR.,ARTHUR C.; TROST,LAWRENCE C.

    2000-03-02

    Sandia National Laboratories has developed a computer-based model called IVSEM (Integrated Verification System Evaluation Model) to estimate the performance of a nuclear detonation monitoring system. The IVSEM project was initiated in June 1994 by Sandia's Monitoring Systems and Technology Center and has been funded by the U.S. Department of Energy's Office of Nonproliferation and National Security (DOE/NN). IVSEM is a simple, "top-level" modeling tool which estimates the performance of a Comprehensive Nuclear Test Ban Treaty (CTBT) monitoring system and can help explore the impact of various sensor system concepts and technology advancements on CTBT monitoring. One of IVSEM's unique features is that it integrates results from the various CTBT sensor technologies (seismic, infrasound, radionuclide, and hydroacoustic) and allows the user to investigate synergy among the technologies. Specifically, IVSEM estimates the detection effectiveness (probability of detection), location accuracy, and identification capability of the integrated system and of each technology subsystem individually. The model attempts to accurately estimate the monitoring system's performance at medium interfaces (air-land, air-water) and for some evasive testing methods such as seismic decoupling. The original IVSEM report, CTBT Integrated Verification System Evaluation Model, SAND97-2518, described version 1.2 of IVSEM. This report describes the changes made to IVSEM version 1.2 and the addition of the identification capability estimates that have been incorporated into IVSEM version 2.0.
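The kind of cross-technology integration described above can be illustrated with a toy calculation. The sketch below assumes, purely for illustration, that each sensor technology detects a given event independently with some probability; the subsystem values are invented and are not IVSEM outputs.

```python
# Hypothetical sketch of integrating detection probabilities across CTBT
# sensor technologies: if each subsystem independently detects an event
# with probability p_i, the integrated probability of detection is
# 1 - prod(1 - p_i). All numbers below are invented for illustration.

def integrated_pod(probs):
    """Probability that at least one subsystem detects the event,
    assuming statistically independent detections."""
    miss = 1.0
    for p in probs:
        miss *= (1.0 - p)
    return 1.0 - miss

subsystem_pod = {
    "seismic": 0.90,
    "infrasound": 0.40,
    "radionuclide": 0.55,
    "hydroacoustic": 0.10,
}

# The integrated network outperforms any single subsystem.
print(round(integrated_pod(subsystem_pod.values()), 4))
```

The independence assumption is the simplest possible synergy model; a tool like IVSEM also estimates location accuracy and identification capability, which this sketch omits.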

  3. CTBT Integrated Verification System Evaluation Model

    SciTech Connect

    Edenburn, M.W.; Bunting, M.L.; Payne, A.C. Jr.

    1997-10-01

    Sandia National Laboratories has developed a computer-based model called IVSEM (Integrated Verification System Evaluation Model) to estimate the performance of a nuclear detonation monitoring system. The IVSEM project was initiated in June 1994 by Sandia's Monitoring Systems and Technology Center and has been funded by the US Department of Energy's Office of Nonproliferation and National Security (DOE/NN). IVSEM is a simple, top-level modeling tool which estimates the performance of a Comprehensive Nuclear Test Ban Treaty (CTBT) monitoring system and can help explore the impact of various sensor system concepts and technology advancements on CTBT monitoring. One of IVSEM's unique features is that it integrates results from the various CTBT sensor technologies (seismic, infrasound, radionuclide, and hydroacoustic) and allows the user to investigate synergy among the technologies. Specifically, IVSEM estimates the detection effectiveness (probability of detection) and location accuracy of the integrated system and of each technology subsystem individually. The model attempts to accurately estimate the monitoring system's performance at medium interfaces (air-land, air-water) and for some evasive testing methods such as seismic decoupling. This report describes version 1.2 of IVSEM.

  4. Completing and sustaining IMS network for the CTBT Verification Regime

    NASA Astrophysics Data System (ADS)

    Meral Ozel, N.

    2015-12-01

    The CTBT International Monitoring System will comprise 337 facilities located all over the world for the purpose of detecting and locating nuclear test explosions. Major challenges remain, namely completion of the network, where most of the remaining stations have environmental, logistical, and/or political issues to surmount (89% of the stations have already been built), and sustainment of a reliable, state-of-the-art network covering four technologies: seismic, infrasound, hydroacoustic, and radionuclide. To have a credible and trustworthy verification system ready for entry into force of the Treaty, the CTBTO is protecting and enhancing the investment in its global network of stations and is providing effective data to the International Data Centre (IDC) and Member States. Regarding the protection of the CTBTO's investment and enhanced sustainment of IMS station operations, the IMS Division is enhancing the capabilities of the monitoring system by applying advances in instrumentation and introducing new software applications that are fit for purpose. Some examples are the development of noble gas laboratory systems to process and analyse subsoil samples, development of a mobile noble gas system for on-site inspection purposes, optimization of beta-gamma detectors for xenon detection, assessment and improvement of the efficiency of wind-noise-reduction systems for infrasound stations, development and testing of infrasound stations with a self-calibrating capability, and research into the use of modular designs for the hydroacoustic network.

  5. Decision analysis framework for evaluating CTBT seismic verification options

    SciTech Connect

    Judd, B.R.; Strait, R.S.; Younker, L.W.

    1986-09-01

    This report describes a decision analysis framework for evaluating seismic verification options for a Comprehensive Test Ban Treaty (CTBT). In addition to providing policy makers with insights into the relative merits of different options, the framework is intended to assist in formulating and evaluating political decisions, such as responses to evidence of violations, and in setting research priorities related to the options. To provide these broad analytical capabilities to decision makers, the framework incorporates a wide variety of issues. These include seismic monitoring capabilities, evasion possibilities, evidence produced by seismic systems, the US response to the evidence, the interdependence of US and Soviet decision-making, and the relative values of possible outcomes to the US and the Soviet Union. An added benefit of the framework is its potential use to improve communication about these CTBT verification issues among US experts and decision makers. The framework has been implemented on a portable microcomputer to facilitate this communication through demonstration and rapid evaluation of alternative judgments and policy choices. The report presents the framework and its application in four parts. The first part describes the decision analysis framework and the types of analytical results produced. In the second part, the framework is used to evaluate representative seismic verification options. The third part describes the results of sensitivity analyses that determine the relative importance of the uncertainties or subjective judgments that influence the evaluation of the options. The fourth (and final) part summarizes conclusions and presents implications of the sample analytical results for further research and for policy-making related to CTBT verification. The fourth section also describes the next steps in the development and use of the decision analysis framework.
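The core of such a decision analysis can be reduced to comparing options by expected utility. The sketch below is a minimal illustration in that spirit; the option names, probabilities, costs, and outcome values are all invented, and a real framework of this kind would model evasion, evidence, and response chains in far more detail.

```python
# Toy expected-utility comparison of two hypothetical seismic verification
# options. All probabilities, costs, and utilities are invented for
# illustration; they are not taken from the report described above.

def expected_value(p_detect, value_detect, value_miss):
    """Expected utility of an option given its detection probability."""
    return p_detect * value_detect + (1 - p_detect) * value_miss

options = {
    "in-country seismic stations": {"p_detect": 0.8, "cost": 10},
    "external stations only":      {"p_detect": 0.5, "cost": 3},
}

# Utility of detecting a violation vs. missing it (arbitrary units).
V_DETECT, V_MISS = 100, -50

for name, o in options.items():
    net = expected_value(o["p_detect"], V_DETECT, V_MISS) - o["cost"]
    print(f"{name}: net expected utility = {net:.1f}")
```

A sensitivity analysis, as described in the report, would then vary `p_detect` and the utilities to see which judgments actually drive the ranking of options.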

  6. Ground-based visual inspection for CTBT verification

    SciTech Connect

    Hawkins, W.; Wohletz, K.

    1997-11-01

    Ground-based visual inspection will play an essential role in On-Site Inspection (OSI) for Comprehensive Test Ban Treaty (CTBT) verification. Although seismic and remote sensing techniques are the best understood and most developed methods for detection of evasive testing of nuclear weapons, visual inspection will greatly augment the certainty and detail of understanding provided by these more traditional methods. Not only can ground-based visual inspection offer effective documentation in cases of suspected nuclear testing, but it can also provide accurate source location and testing-media properties necessary for detailed analysis of seismic records. For testing in violation of the CTBT, an offending state may attempt to conceal the test, most likely by underground burial. While such concealment may not prevent seismic detection, evidence of test deployment, location, and yield can be disguised. In this light, if a suspicious event is detected by seismic or other remote methods, visual inspection of the event area is necessary to document any evidence that might support a claim of nuclear testing and to provide data needed to further interpret seismic records and guide further investigations. However, the methods for visual inspection are neither widely known nor widely appreciated, and experience is presently limited. Visual inspection can be achieved by simple, non-intrusive means, primarily geological in nature, and the purpose of this report is to describe the considerations, procedures, and equipment required to field such an inspection. The inspections will be carried out by inspectors from members of the CTBT Organization.

  7. How to tackle wet-deposition of radionuclides in the context of RN threshold-monitoring for CTBT verification?

    NASA Astrophysics Data System (ADS)

    Becker, A.; Ceranna, L.; Ross, O.; Schneider, U.; Meyer-Christoffer, A.; Ziese, M.; Rudolf, B.

    2011-12-01

    As a contribution to the World Climate Research Program (WCRP) and in support of the Global Climate Observing System (GCOS) of the World Meteorological Organization (WMO), the Deutscher Wetterdienst (DWD) operates the Global Precipitation Climatology Centre (GPCC). The GPCC re-analysis and near-real-time monitoring products are recognized worldwide as the most reliable global data set of rain-gauge-based (in-situ) precipitation measurements. The GPCC Monitoring Product (Rudolf and Becker, 2010) is available two months after the fact, based on data gathered by listening to the GTS to fetch the SYNOP and CLIMAT messages. This product also serves as the reference data for calibrating satellite-based precipitation measurements, yielding the Global Precipitation Climatology Project (GPCP) data set (Huffman et al., 2009). The quickest GPCC product is the First Guess version of the GPCC Monitoring Product, available already 3-5 days after the month in question. Both the GPCC and GPCP products can serve as the data basis for computationally lightweight post-processing of the wet-deposition impact on the radionuclide monitoring capability of the CTBT network (Wotawa et al., 2009) on the regional and global scale, respectively. This is of major importance whenever a reliable quantitative assessment of the source-receptor sensitivity is needed, e.g. for the analysis of isotopic ratios. Indeed, wet-deposition recognition is a prerequisite if ratios of particulate and noble gas measurements come into play. This is so far a largely unexplored field of investigation, but it would help clear as bogus several apparently CTBT-relevant detections encountered in the past, and it would provide an assessment of the hitherto overestimated IMS RN detection capability (Figure 1). References: Huffman, G.J., et al., 2009: Improving the Global Precipitation Record: GPCP Version 2.1. Geophys. Res. Lett., 36, L17808, doi:10.1029/2009GL040000. Rudolf, B. and A

  8. CTBT technical issues handbook

    SciTech Connect

    Zucca, J.J.

    1994-05-01

    The purpose of this handbook is to give the nonspecialist in nuclear explosion physics and nuclear test monitoring an introduction to the topic as it pertains to a Comprehensive Test Ban Treaty (CTBT). The authors have tried to make the handbook visually oriented, with figures paired to short discussions. As such, the handbook may be read straight through or in sections. The handbook covers five main areas and ends with a glossary, which includes both scientific terms and acronyms likely to be encountered during CTBT negotiations. The following topics are covered: (1) Physics of nuclear explosion experiments. This is a description of basic nuclear physics and elementary nuclear weapon design. Also discussed are testing practices. (2) Other nuclear experiments. This section discusses experiments that produce small amounts of nuclear energy but differ from the explosion experiments discussed in the first chapter. This includes the types of activities, such as laser fusion, that would continue after a CTBT is in force. (3) Monitoring tests in various environments. This section describes the different physical environments in which a test could be conducted (underground, in the atmosphere, in space, underwater, and in the laboratory); the sources of non-nuclear events (such as earthquakes and mining operations); and the opportunities for evasion. (4) On-site inspections. A CTBT is likely to include these inspections as an element of the verification provisions, in order to resolve the nature of ambiguous events. This chapter describes some technical considerations and technologies that are likely to be useful. (5) Selecting verification measures. This chapter discusses the uncertain nature of the evidence from monitoring systems and how compliance judgments could be made, taking the uncertainties into account. It also discusses how to allocate monitoring resources, given the likelihood of testing by various countries in various environments.

  9. CTBT on-site inspections

    SciTech Connect

    Zucca, J. J.

    2014-05-09

    On-site inspection (OSI) is a critical part of the verification regime for the Comprehensive Nuclear-Test-Ban Treaty (CTBT). The OSI verification regime provides for international inspectors to make a suite of measurements and observations on site at the location of an event of interest. The other critical component of the verification regime is the International Monitoring System (IMS), which is a globally distributed network of monitoring stations. The IMS, along with technical monitoring data from CTBT member countries as appropriate, will be used to trigger an OSI. After the decision is made to carry out an OSI, it is important for the inspectors to deploy to the field site rapidly to be able to detect short-lived phenomena such as the aftershocks that may be observable after an underground nuclear explosion. The inspectors will be on site from weeks to months and will be working with many tens of tons of equipment. Parts of the OSI regime will be tested in a field exercise in the country of Jordan late in 2014. The build-up of the OSI regime has been proceeding steadily since the CTBT was signed in 1996 and is on track to become a deterrent to anyone considering conducting a nuclear explosion in violation of the Treaty.

  10. Proceedings of the 22nd Annual DoD/DOE Seismic Research Symposium: Planning for Verification of and Compliance with the Comprehensive Nuclear-Test-Ban Treaty (CTBT)

    SciTech Connect

    Nichols, James W., LTC

    2000-09-15

    These proceedings contain papers prepared for the 22nd Annual DoD/DOE Seismic Research Symposium: Planning for Verification of and Compliance with the Comprehensive Nuclear-Test-Ban Treaty (CTBT), held 13-15 September 2000 in New Orleans, Louisiana. These papers represent the combined research related to ground-based nuclear explosion monitoring funded by the National Nuclear Security Administration (NNSA), Defense Threat Reduction Agency (DTRA), Air Force Technical Applications Center (AFTAC), Department of Defense (DoD), US Army Space and Missile Defense Command, Defense Special Weapons Agency (DSWA), and other invited sponsors. The scientific objectives of the research are to improve the United States capability to detect, locate, and identify nuclear explosions. The purpose of the meeting is to provide the sponsoring agencies, as well as potential users, an opportunity to review research accomplished during the preceding year and to discuss areas of investigation for the coming year. For the researchers, it provides a forum for the exchange of scientific information toward achieving program goals, and an opportunity to discuss results and future plans. Paper topics include: seismic regionalization and calibration; detection and location of sources; wave propagation from source to receiver; the nature of seismic sources, including mining practices; hydroacoustic, infrasound, and radionuclide methods; on-site inspection; and data processing.

  11. Verification and enhancement high resolution layers 2012 for Bulgaria

    NASA Astrophysics Data System (ADS)

    Dimitrov, Ventzeslav; Lubenov, Todor

    Production of high-resolution layers (HRL) is a substantial part of the pan-European component of the GMES/Copernicus initial operations (GIO) land monitoring service. The focus of this paper is on the results of the implementation of HRL verification and enhancement tasks for Bulgarian territory. For the reference year 2012, five HRLs on land cover characteristics were produced by service providers through sophisticated classification of multi-sensor and multi-temporal satellite images: imperviousness, forests, grasslands, wetlands, and permanent water bodies. As a result of the verification, systematic classification errors relevant to the subsequent enhancement procedure were identified. The verification was carried out through visual inspection of stratified samples in the HRLs using reliable reference spatial data sets, checking for commission and omission errors. The applied procedure included three major parts, the first two obligatory: a general overview of data quality, look-and-feel control of critical strata, and statistically based quantitative verification. The enhancement task consisted of correcting the errors revealed by the verification, yielding the final enhanced HRL products. Stratification schemes, evaluation grades by strata and HRL from the look-and-feel verification, and accuracy values from the statistical verification are presented. Types and quantities of mistakes removed during the enhancement are structured and summarised. Results show that all HRLs except the grasslands layer meet the 85% accuracy requirement.
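The quantitative verification step above boils down to estimating accuracy from sample counts of commission errors (mapped but wrong) and omission errors (present but unmapped). A minimal sketch, with invented counts rather than the paper's actual Bulgarian sample data:

```python
# Sketch of statistically based quantitative verification of a land-cover
# layer: overall accuracy from stratified-sample confusion counts.
# Commission = sample mapped in the layer but wrong on reference data;
# omission = sample present in reference data but missing from the layer.
# All counts are invented for illustration.

def overall_accuracy(true_pos, true_neg, commission, omission):
    """Fraction of verified samples where the layer and reference agree."""
    correct = true_pos + true_neg
    total = correct + commission + omission
    return correct / total

acc = overall_accuracy(true_pos=420, true_neg=460, commission=60, omission=60)
print(f"overall accuracy {acc:.2%}, meets 85% requirement: {acc >= 0.85}")
```

Per-class (user's and producer's) accuracies would be computed the same way from the per-stratum counts, which is what reveals the kind of systematic errors the enhancement step then corrects.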

  12. CTBT (7-chlorotetrazolo[5,1-c]benzo[1,2,4]triazine) producing ROS affects growth and viability of filamentous fungi.

    PubMed

    Culakova, Hana; Dzugasova, Vladimira; Gbelska, Yvetta; Subik, Julius

    2012-03-01

    CTBT (7-chlorotetrazolo[5,1-c]benzo[1,2,4]triazine) causes intracellular superoxide production and oxidative stress and enhances the susceptibility of Saccharomyces cerevisiae, Candida albicans, and C. glabrata cells to cycloheximide, 5-fluorocytosine, and azole antimycotic drugs. Here, we demonstrate the antifungal activity of CTBT against 14 tested filamentous fungi. CTBT prevented spore germination and mycelial proliferation of Aspergillus niger and the pathogenic Aspergillus fumigatus. The action of CTBT is fungicidal. CTBT increased the formation of reactive oxygen species in fungal mycelium as detected by 2',7'-dichlorodihydrofluorescein diacetate and reduced the radial growth of colonies in a dose-dependent manner. Co-application of CTBT and itraconazole led to complete inhibition of fungal growth at dosages lower than the chemicals alone. Antifungal and chemosensitizing activities of CTBT in filamentous fungi may be useful in combination treatments of infections caused by drug-resistant fungal pathogens. PMID:22212016

  13. Scientific Meetings Database: A New Tool for CTBT-Related International Cooperation

    SciTech Connect

    Knapik, Jerzy F.; Girven, Mary L.

    1999-08-20

    The mission of international cooperation is defined in the Comprehensive Nuclear-Test-Ban Treaty (CTBT). Ways and means of implementation were the subject of discussion during the International Cooperation Workshop held in Vienna in November 1998, and during the Regional Workshop for CTBTO International Cooperation held in Cairo, Egypt in June 1999. In particular, a database of ''Scientific and Technical Meetings Directly or Indirectly Related to CTBT Verification-Related Technologies'' was developed by the CTBTO PrepCom/PTS/International Cooperation section and integrated into the organization's various web sites in cooperation with the U.S. Department of Energy CTBT Research and Development Program. This database, whose structure and use are described in this paper, is meant to assist the CTBT-related scientific community in identifying worldwide expertise in the CTBT verification-related technologies and should help experts, particularly those of less technologically advanced States Signatories, to strengthen contacts and to pursue international cooperation under the Treaty regime. Specific opportunities for international cooperation, in particular those provided by active participation in the use and further development of this database, are also presented.

  14. Three years of operational experience from Schauinsland CTBT monitoring station.

    PubMed

    Zähringer, M; Bieringer, J; Schlosser, C

    2008-04-01

    Data from three years of operation of a low-level aerosol sampler and analyzer (RASA) at the Schauinsland monitoring station are reported. The system is part of the International Monitoring System (IMS) for verification of the Comprehensive Nuclear-Test-Ban Treaty (CTBT). The fully automatic system is capable of measuring aerosol-borne gamma emitters with high sensitivity and routinely quantifies 7Be and 212Pb. The system had a high level of data availability, 90%, within the reporting period. A daily screening process yielded 66 tentative identifications of verification-relevant radionuclides since the system entered IMS operation in February 2004. Two of these were real events associated with a plausible source. The remaining 64 cases can be explained consistently by detector background and statistical phenomena. Inter-comparison with data from a weekly sampler operated at the same station shows calibration instabilities during the test phase and good agreement since certification of the system. PMID:18053622
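Why detector background and counting statistics alone can generate tentative identifications is easy to see with a standard decision-threshold sketch. Under a Poisson background model, a common screening rule (Currie's critical level) flags a net peak-region signal only above roughly 2.33 times the square root of the expected background counts, for about a 5% false-positive rate per decision; with daily screening over many nuclides, occasional flags are expected by chance. The counts below are invented for illustration.

```python
# Hedged sketch of a gamma-spectrometry screening decision: a net signal in
# a peak region is flagged only above Currie's critical level,
# L_C ~ 2.33 * sqrt(B) for Poisson background B (approx. 5% false-positive
# rate). Counts are invented and not from the Schauinsland system.
import math

def critical_limit(background_counts):
    """Currie-style critical level for a net-count detection decision."""
    return 2.33 * math.sqrt(background_counts)

B = 400.0        # expected background counts in the peak region
gross = 435.0    # observed gross counts
net = gross - B

flagged = net > critical_limit(B)
print(f"net = {net}, L_C = {critical_limit(B):.1f}, flagged: {flagged}")
```

Run over hundreds of nuclide-days, a 5% per-decision rate makes a few dozen statistical "tentative identifications" over several years entirely plausible, consistent with the 64 cases attributed to background above.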

  15. Enhanced Verification Test Suite for Physics Simulation Codes

    SciTech Connect

    Kamm, J R; Brock, J S; Brandon, S T; Cotrell, D L; Johnson, B; Knupp, P; Rider, W; Trucano, T; Weirs, V G

    2008-10-10

    sophistication or other physics regimes (e.g., energetic material response, magneto-hydrodynamics), would represent a scientifically desirable complement to the fundamental test cases discussed in this report. The authors believe that this document can be used to enhance the verification analyses undertaken at the DOE WP Laboratories and, thus, to improve the quality, credibility, and usefulness of the simulation codes that are analyzed with these problems.

  16. Development to Release of CTBT Knowledge Base Datasets

    SciTech Connect

    Moore, S.G.; Shepherd, E.R.

    1998-10-20

    For the CTBT Knowledge Base to be useful as a tool for improving U.S. monitoring capabilities, the contents of the Knowledge Base must be subjected to a well-defined set of procedures to ensure the integrity and relevance of the constituent datasets. This paper proposes a possible set of procedures for datasets that are delivered to Sandia National Laboratories (SNL) for inclusion in the Knowledge Base. The proposed procedures include defining preliminary acceptance criteria, performing verification and validation activities, and submitting the datasets for approval by domain experts. Preliminary acceptance criteria include receipt of the data, its metadata, and a proposal for its usability for U.S. National Data Center operations. Verification activities establish the correctness and completeness of the data, while validation activities establish the relevance of the data to its proposed use. Results from these activities are presented to domain experts, such as analysts and peers, for final approval of the datasets for release to the Knowledge Base. Formats and functionality will vary across datasets, so the procedures proposed herein define an overall plan for establishing the integrity and relevance of each dataset. Specific procedures for verification, validation, and approval will be defined for each dataset, or for each type of dataset, as appropriate. Potential dataset sources, including Los Alamos National Laboratory and Lawrence Livermore National Laboratory, have contributed significantly to the development of this process.

  17. Construction of a Shallow Underground Low-background Detector for a CTBT Radionuclide Laboratory

    SciTech Connect

    Forrester, Joel B.; Greenwood, Lawrence R.; Miley, Harry S.; Myers, Allan W.; Overman, Cory T.

    2013-05-01

    The International Monitoring System (IMS) is a verification component of the Comprehensive Nuclear-Test-Ban Treaty (CTBT), and in addition to a series of radionuclide monitoring stations, contains sixteen radionuclide laboratories capable of verification of radionuclide station measurements. This paper presents an overview of a new commercially obtained low-background detector system for radionuclide aerosol measurements recently installed in a shallow (>30 meters water equivalent) underground clean-room facility at Pacific Northwest National Laboratory. Specifics such as low-background shielding materials, active shielding methods, and improvements in sensitivity to IMS isotopes will be covered.

  18. Scenario details of NPE 2012 - Independent performance assessment by simulated CTBT violation

    NASA Astrophysics Data System (ADS)

    Gestermann, N.; Bönnemann, C.; Ceranna, L.; Ross, O.; Schlosser, C.

    2012-04-01

    The Comprehensive Nuclear-Test-Ban Treaty (CTBT) was opened for signature on 24 September 1996. The technical preparations for monitoring CTBT compliance are moving ahead rapidly, and many efforts have been made since then to establish the verification system. In that regard, the two underground nuclear explosions conducted by the Democratic People's Republic of Korea in 2006 and 2009 were the first real performance tests of the system. In the light of these events, National Data Centres (NDCs) realized the need to become more familiar with the details of the verification regime. The idea of an independent annual exercise to evaluate the processing and analysis procedures applied at the National Data Centres of the CTBT was born at the NDC Evaluation Workshop in Kiev, Ukraine, in 2006. The exercises simulate a fictitious violation of the CTBT, and all NDCs are invited to clarify the nature of the selected event. This exercise should help to evaluate the effectiveness of procedures applied at NDCs, as well as the quality, completeness, and usefulness of IDC products. Moreover, the National Data Centres Preparedness Exercise (NPE) is a measure of the readiness of the NDCs to fulfill their duties with regard to CTBT verification, namely treaty-compliance judgments about the nature of events as natural or artificial and chemical or nuclear, respectively. NPEs have proved to be an efficient indicative tool for testing the performance of the verification system and its elements. In 2007 and 2008 the exercises focused on seismic waveform data analysis. Since 2009 the analysis of infrasound data has been included, and additional attention has been paid to the radionuclide component. In 2010 a realistic noble gas release scenario was selected as the trigger event, of the kind that could be expected after an underground nuclear test. The epicenter location of an event from the Reviewed Event Bulletin (REB), unknown to participants of the exercise, was selected as the source of the noble gas

  19. Atmospheric transport modelling in support of CTBT verification—overview and basic concepts

    NASA Astrophysics Data System (ADS)

    Wotawa, Gerhard; De Geer, Lars-Erik; Denier, Philippe; Kalinowski, Martin; Toivonen, Harri; D'Amours, Real; Desiato, Franco; Issartel, Jean-Pierre; Langer, Matthias; Seibert, Petra; Frank, Andreas; Sloan, Craig; Yamazawa, Hiromi

    Under the provisions of the Comprehensive Nuclear-Test-Ban Treaty (CTBT), a global monitoring system comprising different verification technologies is currently being set up. The network will include 80 radionuclide (RN) stations distributed all over the globe that measure treaty-relevant radioactive species. While the seismic subsystem cannot distinguish between chemical and nuclear explosions, RN monitoring would provide the "smoking gun" of a possible treaty violation. Atmospheric transport modelling (ATM) will be an integral part of CTBT verification, since it provides a geo-temporal location capability for the RN technology. In this paper, the basic concept for the future ATM software system to be installed at the International Data Centre is laid out. The system is based on the operational computation of multi-dimensional source-receptor sensitivity fields for all RN samples by means of adjoint tracer transport modelling. While the source-receptor matrix methodology has already been applied in the past, the system that we suggest will be unique and unprecedented, since it is global, real-time and aims at uncovering source scenarios that are compatible with measurements. Furthermore, it has to deal with source dilution ratios that are by orders of magnitude larger than in typical transport model applications. This new verification software will need continuous scientific attention, and may well provide a prototype system for future applications in areas of environmental monitoring, emergency response and verification of other international agreements and treaties.
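The source-receptor matrix idea described above can be sketched in a few lines: each matrix row maps a gridded source-term hypothesis to the concentration predicted at one radionuclide sample, and scenarios whose predictions disagree with the measurements are screened out. The grid size, sensitivity values, release magnitude, and tolerance below are all invented for illustration and bear no relation to the operational system.

```python
# Minimal sketch of adjoint-derived source-receptor sensitivities (SRS):
# M[i, j] is the sensitivity (s/m^3) of sample i to a release in grid
# cell j, so M @ s gives predicted concentrations (Bq/m^3) for a source
# vector s (Bq). All numbers are invented for illustration.
import numpy as np

rng = np.random.default_rng(0)
n_samples, n_cells = 4, 6
M = rng.uniform(0.0, 1e-12, size=(n_samples, n_cells))  # SRS fields

release = np.zeros(n_cells)
release[2] = 1e15        # hypothetical 1e15 Bq release in grid cell 2

predicted = M @ release          # predicted sample concentrations
measured = predicted * 1.1       # pretend measurements with a +10% bias

# A source scenario is kept as "compatible with the measurements" if
# prediction and measurement agree within a factor of 3 at every sample.
compatible = np.all((predicted > measured / 3) & (predicted < measured * 3))
print(bool(compatible))
```

Note the huge dilution: a 1e15 Bq source maps to concentrations many orders of magnitude smaller, which is exactly the regime the paper flags as unusual for transport-model applications.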

  20. The new geospatial tools: global transparency enhancing safeguards verification

    SciTech Connect

    Pabian, Frank Vincent

    2010-09-16

    This paper focuses on the importance and potential role of the new, freely available geospatial tools for enhancing IAEA safeguards and on how, together with commercial satellite imagery, they can be used to promote 'all-source synergy'. As additional 'open sources', these new geospatial tools have heralded a new era of 'global transparency', and they can be used to substantially augment existing information-driven safeguards gathering techniques, procedures, and analyses in the remote detection of undeclared facilities, as well as to support ongoing monitoring and verification of various treaty-relevant (e.g., NPT, FMCT) activities and programs. As an illustration of how these new geospatial tools may be applied, an original exemplar case study shows how value-added follow-up information can be derived from some recent public media reporting of a former clandestine underground plutonium production complex (now being converted to a 'Tourist Attraction', given the site's abandonment by China in the early 1980s). That open-source media reporting, when combined with subsequent commentary found in various Internet-based blogs and wikis, led to independent verification of the reporting, with additional ground truth obtained via 'crowdsourcing' (tourist photos as found on 'social networking' venues like Google Earth's Panoramio layer and Twitter). Confirmation of the precise geospatial location of the site (along with a more complete facility characterization incorporating 3-D modeling and visualization) was only made possible following the acquisition of higher-resolution commercial satellite imagery that could be correlated with the reporting, ground photos, and an interior diagram, through original imagery analysis of the overhead imagery.

  1. The data dictionary: A view into the CTBT knowledge base

    SciTech Connect

    Shepherd, E.R.; Keyser, R.G.; Armstrong, H.M.

    1997-08-01

    The data dictionary for the Comprehensive Test Ban Treaty (CTBT) knowledge base provides a comprehensive, current catalog of the projected contents of the knowledge base. It is written from a data definition view of the knowledge base and therefore organizes information in a fashion that allows logical storage within the computer. The data dictionary introduces two organizational categories of data: the datatype, which is a broad, high-level category of data, and the dataset, which is a specific instance of a datatype. The knowledge base, and thus the data dictionary, consists of a fixed, relatively small number of datatypes, but new datasets are expected to be added on a regular basis. The data dictionary is a tangible result of the design effort for the knowledge base and is intended to be used by anyone who accesses the knowledge base for any purpose, such as populating the knowledge base with data, accessing the data for use with automatic data processing (ADP) routines, or browsing through the data for verification purposes. For these reasons, it is important to discuss the development of the data dictionary as well as to describe its contents so that its usefulness can be better understood; that is the purpose of this paper.
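
    The datatype/dataset split described above is essentially a small, fixed catalog of schemas with an open-ended set of instances. A minimal sketch of that organization, with all names, attributes and the example record invented for illustration:

```python
from dataclasses import dataclass, field

# Hypothetical illustration of the datatype/dataset organization described
# above; none of these names come from the actual CTBT knowledge base.

@dataclass
class Datatype:
    name: str          # broad, high-level category of data
    attributes: tuple  # fixed schema every dataset of this type follows

@dataclass
class Dataset:
    datatype: Datatype  # each dataset is a specific instance of one datatype
    name: str
    records: list = field(default_factory=list)

# A fixed, small number of datatypes...
station_params = Datatype("station-parameters", ("code", "lat", "lon"))

# ...but new datasets added on a regular basis.
ds = Dataset(station_params, "primary-seismic-stations-1997")
ds.records.append({"code": "ABC", "lat": 36.5, "lon": 138.2})
```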

  2. Automatic radioxenon analyzer for CTBT monitoring

    SciTech Connect

    Bowyer, T.W.; Abel, K.H.; Hensley, W.K.

    1996-12-01

    Over the past 3 years, with support from the US DOE's NN-20 Comprehensive Test Ban Treaty (CTBT) R&D program, PNNL has developed and demonstrated a fully automatic analyzer for collecting and measuring the four Xe radionuclides, 131mXe (11.9 d), 133mXe (2.19 d), 133Xe (5.24 d), and 135Xe (9.10 h), in the atmosphere. These radionuclides are important signatures in monitoring for compliance with a CTBT. Activity ratios permit discriminating radioxenon from a nuclear detonation from that produced by nuclear reactor operations, nuclear fuel reprocessing, or medical isotope production and usage. In the analyzer, Xe is continuously and automatically separated from the atmosphere at flow rates of about 7 m^3/h on a sorption bed. Aliquots collected for 6-12 h are automatically analyzed by electron-photon coincidence spectrometry to produce sensitivities in the range of 20-100 microBq/m^3 of air, about 100-fold better than with reported laboratory-based procedures for short collection intervals. Spectral data are automatically analyzed, and the calculated radioxenon concentrations and raw gamma-ray spectra are automatically transmitted to data centers.
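
    The activity-ratio discrimination mentioned above can be sketched as a simple screen. The half-lives are those quoted in the abstract; the decision thresholds and sample values below are invented for illustration and are not the operational criteria.

```python
import math

# Illustrative screen (thresholds invented): fresh fission debris shows
# elevated 135Xe/133Xe and 133mXe/131mXe activity ratios relative to
# reactor, reprocessing or medical-isotope releases.

HALF_LIFE_H = {"Xe-131m": 11.9 * 24, "Xe-133m": 2.19 * 24,
               "Xe-133": 5.24 * 24, "Xe-135": 9.10}

def decay_correct(activity, isotope, hours):
    """Correct a measured activity back to the sampling-stop time."""
    lam = math.log(2) / HALF_LIFE_H[isotope]
    return activity * math.exp(lam * hours)

def looks_explosion_like(a, delay_h=0.0):
    """a: dict isotope -> measured activity concentration (uBq/m^3)."""
    a0 = {iso: decay_correct(v, iso, delay_h) for iso, v in a.items()}
    r_meta = a0["Xe-133m"] / a0["Xe-131m"]
    r_135 = a0["Xe-135"] / a0["Xe-133"]
    return r_meta > 1.0 and r_135 > 0.5  # hypothetical screening line

sample = {"Xe-131m": 20.0, "Xe-133m": 80.0, "Xe-133": 500.0, "Xe-135": 400.0}
```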

  3. Global Monitoring of the CTBT: Progress, Capabilities and Plans (Invited)

    NASA Astrophysics Data System (ADS)

    Zerbo, L.

    2013-12-01

    The Preparatory Commission for the Comprehensive Nuclear-Test-Ban Treaty Organization (CTBTO), established in 1996, is tasked with building up the verification regime of the CTBT. The regime includes a global system for monitoring the earth, the oceans and the atmosphere for nuclear tests, and an on-site inspection (OSI) capability. More than 80% of the 337 facilities of the International Monitoring System (IMS) have been installed and are sending data to the International Data Centre (IDC) in Vienna, Austria for processing. These IMS data along with IDC processed and reviewed products are available to all States that have signed the Treaty. Concurrent with the build-up of the global monitoring networks, near-field geophysical methods are being developed and tested for OSIs. The monitoring system is currently operating in a provisional mode, as the Treaty has not yet entered into force. Progress in installing and operating the IMS and the IDC and in building up an OSI capability will be described. The capabilities of the monitoring networks have progressively improved as stations are added to the IMS and IDC processing techniques refined. Detection thresholds for seismic, hydroacoustic, infrasound and radionuclide events have been measured and in general are equal to or lower than the predictions used during the Treaty negotiations. The measurements have led to improved models and tools that allow more accurate predictions of future capabilities and network performance under any configuration. Unplanned tests of the monitoring network occurred when the DPRK announced nuclear tests in 2006, 2009, and 2013. All three tests were well above the detection threshold and easily detected and located by the seismic monitoring network. In addition, noble gas consistent with the nuclear tests in 2006 and 2013 (according to atmospheric transport models) was detected by stations in the network. 
On-site inspections of these tests were not conducted, as the Treaty has not entered into force.

  4. Nuclear explosion source terms for CTBT monitoring

    SciTech Connect

    Lougheed, R.W.; Wild, J.F.; Harvey, T.

    1996-10-01

    Detection of radionuclides from a suspected nuclear explosion is required to provide absolute proof that the event was nuclear. Various evasion scenarios could be employed to attempt to hide the radionuclide signals. We will present estimates of the possible reduction in specific gaseous and particulate fission products for explosion scenarios from underground to underwater and the use of rain storms that a potential CTBT violator might employ to evade detection. We will consider the effect that the chemical behavior of the fission products that are initially formed in nuclear explosions will have on the possible release and transport of the longer-lived fission products that would actually be measured by remote monitoring stations or by using on-site inspection techniques.

  5. Geophysics, Remote Sensing, and the Comprehensive Nuclear-Test-Ban Treaty (CTBT) Integrated Field Exercise 2014

    NASA Astrophysics Data System (ADS)

    Sussman, A. J.; Macleod, G.; Labak, P.; Malich, G.; Rowlands, A. P.; Craven, J.; Sweeney, J. J.; Chiappini, M.; Tuckwell, G.; Sankey, P.

    2015-12-01

    The Integrated Field Exercise of 2014 (IFE14) was an event held in the Hashemite Kingdom of Jordan (with concurrent activities in Austria) that tested the operational and technical capabilities of an on-site inspection (OSI) within the CTBT verification regime. During an OSI, up to 40 international inspectors will search an area for evidence of a nuclear explosion. Over 250 experts from ~50 countries were involved in IFE14 (the largest simulation of a real OSI to date) and worked from a number of different directions, such as the Exercise Management and Control Teams (which executed the scenario in which the exercise was played) and those participants performing as members of the Inspection Team (IT). One of the main objectives of IFE14 was to test and integrate Treaty allowed inspection techniques, including a number of geophysical and remote sensing methods. In order to develop a scenario in which the simulated exercise could be carried out, suites of physical features in the IFE14 inspection area were designed and engineered by the Scenario Task Force (STF) that the IT could detect by applying the geophysical and remote sensing inspection technologies, in addition to other techniques allowed by the CTBT. For example, in preparation for IFE14, the STF modeled a seismic triggering event that was provided to the IT to prompt them to detect and localize aftershocks in the vicinity of a possible explosion. Similarly, the STF planted shallow targets such as borehole casings and pipes for detection using other geophysical methods. In addition, airborne technologies, which included multi-spectral imaging, were deployed such that the IT could identify freshly exposed surfaces, imported materials, and other areas that had been subject to modification. This presentation will introduce the CTBT and OSI, explain the IFE14 in terms of the goals specific to geophysical and remote sensing methods, and show how both the preparation for and execution of IFE14 meet those goals.

  6. Cluster Analysis for CTBT Seismic Event Monitoring

    SciTech Connect

    Carr, Dorthe B.; Young, Chris J.; Aster, Richard C.; Zhang, Xiaobing

    1999-08-03

    Mines at regional distances are expected to be continuing sources of small, ambiguous events which must be correctly identified as part of the Comprehensive Nuclear-Test-Ban Treaty (CTBT) monitoring process. Many of these events are small enough that they are only seen by one or two stations, so locating them by traditional methods may be impossible or at best leads to poorly resolved parameters. To further complicate matters, these events have parametric characteristics (explosive sources, shallow depths) which make them difficult to identify as definite non-nuclear events using traditional discrimination methods. Fortunately, explosions from the same mines tend to have similar waveforms, making it possible to identify an unknown event by comparison with characteristic archived events that have been associated with specific mines. In this study we examine the use of hierarchical cluster methods to identify groups of similar events. These methods produce dendrograms, which are tree-like structures showing the relationships between entities. Hierarchical methods are well-suited for event clustering because they are well documented, easy to implement, computationally cheap enough to run multiple times for a given data set, and because these methods produce results which can be readily interpreted. To aid in determining the proper threshold value for defining event families for a given dendrogram, we use cophenetic correlation (which compares a model of the similarity behavior to actual behavior), variance, and a new metric developed for this study. Clustering methods are compared using archived regional and local distance mining blasts recorded at two sites in the western U.S. with different tectonic and instrumentation characteristics: the three-component broadband DSVS station in Pinedale, Wyoming and the short-period New Mexico Tech (NMT) network in central New Mexico. Ground truth for the events comes from the mining industry and local network locations.
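
    The family-grouping step can be sketched with single-linkage agglomerative clustering, one of the hierarchical methods the study examines, cut at a similarity threshold. Distances here stand in for 1 minus the maximum waveform cross-correlation; the matrix below is invented for illustration.

```python
# Minimal single-linkage sketch of the event-family idea. Distances are
# 1 - CC (cross-correlation); the 5x5 matrix is invented: events 0-2 mimic
# one mine (similar waveforms), events 3-4 another.

def single_linkage_families(dist, threshold):
    """Repeatedly merge the two clusters with the smallest minimum
    inter-cluster distance, stopping when no pair is below threshold."""
    clusters = [{i} for i in range(len(dist))]
    while True:
        best = None
        for i in range(len(clusters)):
            for j in range(i + 1, len(clusters)):
                d = min(dist[a][b] for a in clusters[i] for b in clusters[j])
                if d < threshold and (best is None or d < best[0]):
                    best = (d, i, j)
        if best is None:
            return [sorted(c) for c in clusters]
        _, i, j = best
        clusters[i] |= clusters[j]
        del clusters[j]

D = [[0.00, 0.10, 0.15, 0.80, 0.85],
     [0.10, 0.00, 0.12, 0.82, 0.79],
     [0.15, 0.12, 0.00, 0.88, 0.90],
     [0.80, 0.82, 0.88, 0.00, 0.08],
     [0.85, 0.79, 0.90, 0.08, 0.00]]
families = single_linkage_families(D, threshold=0.5)
```

    The merge distances recorded along the way are what a dendrogram plots; the cophenetic correlation mentioned in the abstract compares those merge heights with the original pairwise distances.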

  7. Enhanced global Radionuclide Source Attribution for the Nuclear-Test-Ban Verification by means of the Adjoint Ensemble Dispersion Modeling Technique applied at the IDC/CTBTO.

    NASA Astrophysics Data System (ADS)

    Becker, A.; Wotawa, G.; de Geer, L.

    2006-05-01

    findings of the ensemble dispersion modeling (EDM) technique No. 5 efforts performed by Galmarini et al, 2004 (Atmos. Env. 38, 4607-4617). As the scope of the adjoint EDM methodology is not limited to CTBT verification but can be applied to any kind of nuclear event monitoring and location, it bears the potential to improve the design of manifold emergency response systems towards preparedness concepts as needed for mitigation of disasters (like Chernobyl) and pre-emptive estimation of pollution hazards.

  8. DOE program on seismic characterization for regions of interest to CTBT monitoring

    SciTech Connect

    Ryall, A.S.; Weaver, T.A.

    1995-07-01

    The primary goal of the DOE programs on Geophysical Characterization of (1) the Middle East and North Africa (ME-NA) and (2) Southern Asia (SA) is to provide the Air Force Technical Applications Center (AFTAC) with the analytic tools and knowledge base to permit effective verification of Comprehensive Test Ban Treaty (CTBT) compliance in those regions. The program also aims at using these regionalizations as models for the development of a detailed prescription for seismic calibration and knowledge base compilation in areas where the US has had little or no previous monitoring experience. In any given region, the CTBT seismic monitoring system will depend heavily on a few key arrays and/or three-component stations, and it will be important to know as much as possible about the physical properties of the earth's crust and upper mantle: (1) in the vicinity of these stations, (2) in areas of potential earthquake activity or commercial blasting in the region containing the stations, and (3) along the propagation path from the sources to the stations. To be able to discriminate between various source types, we will also need to know how well the various event characterization techniques perform when they are transported from one tectonic or geologic environment to another. The Department of Energy's CTBT R&D program plan (DOE, 1994), which includes the ME-NA and SA characterization programs, incorporates an iterative process that combines field experiments, computer modeling and data analysis for the development, testing, evaluation and modification of data processing algorithms as appropriate to achieve specific US monitoring objectives. This process will be applied to seismic event detection, location and identification.

  9. Site and Event Characterization Using the CTBT On-Site Inspection Techniques (Invited)

    NASA Astrophysics Data System (ADS)

    Labak, P.; Gaya Pique, L. R.; Rowlands, A. P.; Arndt, R. H.

    2013-12-01

    One of the four elements of the CTBT verification regime is On-Site Inspection (OSI). The sole purpose of an OSI is to clarify whether a nuclear weapon test explosion or any other nuclear explosion has been conducted in violation of the CTBT. An OSI would be conducted within an area no bigger than 1000 km2 and by no more than 40 inspectors at any one time, applying search logic and inspection techniques with the aim of collecting relevant information that will be the basis for the inspection report. During the course of an OSI less intrusive techniques applied over broad areas (usually with lower spatial resolution) are supplemented with more intrusive techniques applied to more targeted areas (usually at a higher spatial resolution). Environmental setting and the evolution of OSI-relevant observables over time will influence the application of OSI techniques. In the course of the development of OSI methodology and relevant techniques, field tests and exercises have been conducted. While earlier activities mainly focused on progress of individual techniques (such as visual observation, passive seismological monitoring for aftershocks and measurements of radioactivity), recent work covered both technique development (such as multi-spectral imaging including infrared measurements, and environmental sampling and analysis of solids, liquids and gases) as well as the integration of techniques, search logic and data flow. We will highlight examples of application of OSI technologies for site and event characterization from recently conducted field tests and exercises and demonstrate the synthesis of techniques and data necessary for the conduct of an OSI.

  10. Progress in CTBT Monitoring since its 1999 Senate Defeat

    NASA Astrophysics Data System (ADS)

    Hafemeister, David

    2009-05-01

    Progress in monitoring the Comprehensive Nuclear Test Ban Treaty (CTBT) is examined, beginning with the 2002 National Academy of Sciences CTBT study, followed by recent findings on regional seismology, array monitoring, correlation detection, seismic modeling and non-seismic technologies. The NAS CTBT study concluded that the fully completed International Monitoring System (IMS) will reliably detect and identify underground nuclear explosions with a threshold of 0.1 kt in hard rock, if conducted anywhere in Europe, Asia, North Africa, and North America. In some locations the threshold is 0.01 kt or lower, using arrays or regional seismic stations. As an example, the 0.6-kiloton North Korean test of October 9, 2006 was promptly detected by seismometers in Australia, Europe, North America and Asia. The P/S ratio between 1-15 Hz clearly showed that the event was an explosion and not an earthquake. Radioactive venting, observed as far away as Canada, proved that it was a nuclear explosion. Advances in seismic monitoring strengthen the conclusions of the NAS study. Interferometric synthetic aperture radar can, in some cases, identify and locate 1-kt tests at 500 m depth by measuring subsidence to 2-5 mm. InSAR can discriminate between earthquakes and explosions from the subsidence pattern and it can locate nuclear tests to within 100 meters.

  11. Global Communications Infrastructure: CTBT Treaty monitoring using space communications

    NASA Astrophysics Data System (ADS)

    Kebeasy, R.; Abaya, E.; Ricker, R.; Demeules, G.

    Article 1 on Basic Obligations of the Comprehensive Nuclear-Test-Ban Treaty (CTBT) states that: "Each State Party undertakes not to carry out any nuclear weapon test explosion or any other nuclear explosion, and to prohibit and prevent any such nuclear explosion at any place under its jurisdiction or control. Each State Party undertakes, furthermore, to refrain from causing, encouraging, or in any way participating in the carrying out of any nuclear weapon test explosion or any other nuclear explosion." To monitor States Parties' compliance with these Treaty provisions, an International Monitoring System (IMS) consisting of 321 monitoring stations and 16 laboratories in some 91 countries is being implemented to cover the whole globe, including its oceans and polar regions. The IMS employs four technologies--seismic, hydroacoustic, infrasound and radionuclide--to detect, locate and identify any seismic event of Richter magnitude 4 and above (equivalent to one kiloton of TNT) that may be associated with a nuclear test explosion. About one-half of this monitoring system is now operational in 67 countries. Monitoring stations send data in near real-time to an International Data Centre (IDC) in Vienna over a Global Communications Infrastructure (GCI) incorporating 10 geostationary satellites plus three satellites in inclined orbits. The satellites relay the data to commercial earth stations, from where they are transferred by terrestrial circuits to the IDC. The IDC automatically processes and interactively analyzes the monitoring data, and distributes the raw data and reports relevant to Treaty verification to National Data Centers in Member States over the same communications network. The GCI will eventually support about 250 thin route VSAT links to the monitoring stations, many of them at remote or harsh locations on the earth, plus additional links to national data centres in various countries. Off-the-shelf VSAT and networking hardware are deployed. This is the

  12. A verification strategy for web services composition using enhanced stacked automata model.

    PubMed

    Nagamouttou, Danapaquiame; Egambaram, Ilavarasan; Krishnan, Muthumanickam; Narasingam, Poonkuzhali

    2015-01-01

    Currently, Service-Oriented Architecture (SOA) is becoming the most popular software architecture for contemporary enterprise applications, and one crucial technique for its implementation is web services. An individual service offered by a service provider may represent only limited business functionality; however, by composing individual services from different service providers, a composite service describing the intact business process of an enterprise can be made. Many new standards have been defined to address the web service composition problem, notably the Business Process Execution Language (BPEL). BPEL provides an initial framework for an Extensible Markup Language (XML) specification language for defining and implementing business process workflows for web services. The main problem with most realistic approaches to service composition is the verification of the composed web services, which has to rely on formal verification methods to ensure the correctness of the composed services. A few research works have been carried out in the literature on the verification of web services for deterministic systems. Moreover, the existing models did not address verification properties like dead transition, deadlock, reachability and safety. In this paper, a new model to verify composed web services using the Enhanced Stacked Automata Model (ESAM) is proposed. The correctness properties of the non-deterministic system are evaluated based on properties like dead transition, deadlock, safety, liveness and reachability. Initially, web services are composed using the Business Process Execution Language for Web Services (BPEL4WS); the composition is converted into an ESAM (a combination of Muller Automata (MA) and Push Down Automata (PDA)) and then transformed into Promela, the input language of the Simple ProMeLa Interpreter (SPIN) tool. The model is verified using the SPIN tool, and the results revealed better performance in finding dead transitions and deadlocks in contrast to the
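
    The reachability and deadlock properties named above are checked, in SPIN-style explicit-state model checking, by exhaustively enumerating the composed system's states. A toy sketch of that idea (this is not ESAM itself, and the transition system is invented):

```python
from collections import deque

# Toy explicit-state exploration in the spirit of SPIN-style checking:
# enumerate reachable states of a composed-service transition system and
# flag deadlocks, i.e. reachable non-final states with no outgoing
# transition. The service states below are hypothetical.

def find_deadlocks(initial, transitions, finals):
    """transitions: dict state -> list of successor states."""
    seen, queue, deadlocks = {initial}, deque([initial]), []
    while queue:
        s = queue.popleft()
        succs = transitions.get(s, [])
        if not succs and s not in finals:
            deadlocks.append(s)
        for t in succs:
            if t not in seen:
                seen.add(t)
                queue.append(t)
    return sorted(deadlocks), sorted(seen)

# 'wait_pay' has no successor and is not a final state -> deadlock.
T = {"start": ["invoke", "wait_pay"], "invoke": ["reply"], "reply": []}
deadlocks, reachable = find_deadlocks("start", T, finals={"reply"})
```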

  13. Verification and Validation of NASA-Supported Enhancements to Decision Support Tools of PECAD

    NASA Technical Reports Server (NTRS)

    Ross, Kenton W.; McKellip, Rodney; Moore, Roxzana F.; Fendley, Debbie

    2005-01-01

    This section of the evaluation report summarizes the verification and validation (V&V) of recently implemented, NASA-supported enhancements to the decision support tools of the Production Estimates and Crop Assessment Division (PECAD). The implemented enhancements include operationally tailored Moderate Resolution Imaging Spectroradiometer (MODIS) products and products of the Global Reservoir and Lake Monitor (GRLM). The MODIS products are currently made available through two separate decision support tools: the MODIS Image Gallery and the U.S. Department of Agriculture (USDA) Foreign Agricultural Service (FAS) MODIS Normalized Difference Vegetation Index (NDVI) Database. Both the Global Reservoir and Lake Monitor and MODIS Image Gallery provide near-real-time products through PECAD's CropExplorer. This discussion addresses two areas: 1. Assessments of the standard NASA products on which these enhancements are based. 2. Characterizations of the performance of the new operational products.

  14. Synthetics vs. real waveforms from underground nuclear explosions as master templates for CTBT monitoring with cross-correlation

    NASA Astrophysics Data System (ADS)

    Rozhkov, M.; Kitov, I. O.; Bobrov, D.

    2013-12-01

    The cross-correlation (CC) and master event technique is efficient in Comprehensive Nuclear-Test-Ban Treaty (CTBT) monitoring. Two primary goals of CTBT monitoring are detection and location of nuclear explosions. Therefore, CC global monitoring should be focused on finding such events. The use of physically adequate masters may increase the number of valid events in the Reviewed Event Bulletin (REB) of the International Data Centre by a factor of 2. Inadequate master events may increase the number of irrelevant events in the REB and reduce the sensitivity of the CC technique to valid events. In order to cover the entire earth, including vast aseismic territories, with CC-based nuclear test monitoring, we conducted thorough research and defined the most appropriate real and synthetic master events representing underground explosion sources. A procedure was developed for optimizing the master event simulation based on principal component analysis with bootstrap aggregation as a dimension reduction technique, narrowing the classes of CC templates used in the global detection and location process. Actual waveforms and metadata from the DTRA Verification Database (http://www.rdss.info) were used to validate our approach. The detection and location results based on real and synthetic master events were compared.
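
    The master-event detection itself is a matched filter: slide the template along the continuous trace and flag windows whose normalized cross-correlation exceeds a threshold. A minimal sketch on invented synthetic data (the IDC pipeline operates on real array waveforms and is far more elaborate):

```python
import math

# Sketch of master-event matched filtering: compute the normalized
# cross-correlation of a template at every lag and report detections above
# a threshold. The template, noise and threshold are invented.

def normalized_cc(trace, template):
    m = len(template)
    t_mean = sum(template) / m
    t0 = [x - t_mean for x in template]
    t_norm = math.sqrt(sum(x * x for x in t0))
    out = []
    for i in range(len(trace) - m + 1):
        w = trace[i:i + m]
        w_mean = sum(w) / m
        w0 = [x - w_mean for x in w]
        w_norm = math.sqrt(sum(x * x for x in w0))
        num = sum(a * b for a, b in zip(w0, t0))
        out.append(num / (w_norm * t_norm) if w_norm > 0 else 0.0)
    return out

template = [0.0, 1.0, -1.0, 0.5, 0.0]
noise = [0.05, -0.04, 0.02, 0.01, -0.03, 0.04, -0.02, 0.03]
trace = noise[:4] + [2 * x for x in template] + noise[4:]  # buried copy
cc = normalized_cc(trace, template)
detections = [i for i, v in enumerate(cc) if v > 0.9]
```

    Because the statistic is amplitude-normalized, a scaled copy of the master correlates at 1.0, which is what lets small repeating events be found well below conventional detection thresholds.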

  15. Alternatives for Laboratory Measurement of Aerosol Samples from the International Monitoring System of the CTBT

    NASA Astrophysics Data System (ADS)

    Miley, H.; Forrester, J. B.; Greenwood, L. R.; Keillor, M. E.; Eslinger, P. W.; Regmi, R.; Biegalski, S.; Erikson, L. E.

    2013-12-01

    The aerosol samples taken from the CTBT International Monitoring System stations are measured in the field with a minimum detectable concentration (MDC) of ~30 microBq/m3 of Ba-140. This is sufficient to detect far less than 1 kt of aerosol fission products in the atmosphere when the station is in the plume from such an event. Recent thinking about minimizing the potential source region (PSR) from a detection has led to a desire for a multi-station or multi-time period detection. These would be connected through the concept of 'event formation', analogous to event formation in seismic event study. However, to form such events, samples from the nearest neighbors of the detection would require re-analysis with a more sensitive laboratory to gain a substantially lower MDC, and potentially find radionuclide concentrations undetected by the station. The authors will present recent laboratory work with air filters showing various cost effective means for enhancing laboratory sensitivity.
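
    Laboratory re-analysis lowers the MDC mainly through longer counts, lower backgrounds and higher detection efficiency. A textbook Currie-style MDC estimate with invented numbers (real IMS calculations also include decay and ingrowth corrections omitted here):

```python
import math

# Simplified Currie-style MDC sketch: detectable counts
# L_D = 2.71 + 4.65 * sqrt(B), converted to an air concentration via
# detector efficiency, gamma branching ratio, live time and sampled air
# volume. All input values are invented for illustration.

def mdc_bq_per_m3(background_counts, efficiency, branching, live_s, volume_m3):
    l_d = 2.71 + 4.65 * math.sqrt(background_counts)
    return l_d / (efficiency * branching * live_s * volume_m3)

# Ba-140-like case: ~24% branching for the 537 keV line (rounded),
# 20000 m^3 air sample, one-day count, modest HPGe efficiency.
mdc = mdc_bq_per_m3(background_counts=400, efficiency=0.05,
                    branching=0.24, live_s=86400, volume_m3=20000)
mdc_microbq = mdc * 1e6  # comes out a few microBq/m^3, well below ~30
```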

  16. Enhancement of electrophoretic mobility of microparticles near a solid wall--experimental verification.

    PubMed

    Liang, Qian; Zhao, Cunlu; Yang, Chun

    2015-03-01

    Although the existing theories have predicted enhancement of electrophoretic mobility of microparticles near a solid wall, the relevant experimental studies are rare. This is mainly due to difficulties in experimentally controlling and measuring particle-wall separations under dynamic electrophoretic conditions. This paper reports an experimental verification of the enhancement of electrophoretic mobility of a microparticle moving near the wall of a microchannel. This is achieved by balancing dielectrophoretic and lift forces against gravitational force acting on the microparticle so as to control the gap of particle-wall separation. A simple experimental setup is configured and a fabrication method is developed to measure such separation gap. The experiments are conducted for various particle sizes under different electric field strengths. Our experimental results are compared against the available theoretical predictions in the literature. PMID:25421107

  17. Transitioning Enhanced Land Surface Initialization and Model Verification Capabilities to the Kenya Meteorological Department (KMD)

    NASA Technical Reports Server (NTRS)

    Case, Jonathan L.; Mungai, John; Sakwa, Vincent; Zavodsky, Bradley T.; Srikishen, Jayanthi; Limaye, Ashutosh; Blankenship, Clay B.

    2016-01-01

    Flooding, severe weather, and drought are key forecasting challenges for the Kenya Meteorological Department (KMD), based in Nairobi, Kenya. Atmospheric processes leading to convection, excessive precipitation and/or prolonged drought can be strongly influenced by land cover, vegetation, and soil moisture content, especially during anomalous conditions and dry/wet seasonal transitions. It is thus important to represent land surface state variables (green vegetation fraction, soil moisture, and soil temperature) accurately in Numerical Weather Prediction (NWP) models. The NASA SERVIR and the Short-term Prediction Research and Transition (SPoRT) programs in Huntsville, AL have established a working partnership with KMD to enhance its regional modeling capabilities. SPoRT and SERVIR are providing experimental land surface initialization datasets and model verification capabilities for capacity building at KMD. To support its forecasting operations, KMD is running experimental configurations of the Weather Research and Forecasting (WRF; Skamarock et al. 2008) model on a 12-km/4-km nested regional domain over eastern Africa, incorporating the land surface datasets provided by NASA SPoRT and SERVIR. SPoRT, SERVIR, and KMD participated in two training sessions in March 2014 and June 2015 to foster the collaboration and use of unique land surface datasets and model verification capabilities. Enhanced regional modeling capabilities have the potential to improve guidance in support of daily operations and high-impact weather and climate outlooks over Eastern Africa. For enhanced land-surface initialization, the NASA Land Information System (LIS) is run over Eastern Africa at 3-km resolution, providing real-time land surface initialization data in place of interpolated global model soil moisture and temperature data available at coarser resolutions. Additionally, real-time green vegetation fraction (GVF) composites from the Suomi-NPP VIIRS instrument are being incorporated

  18. Verification and Validation of NASA-Supported Enhancements to PECAD's Decision Support Tools

    NASA Technical Reports Server (NTRS)

    McKellip, Rodney; Ross, Kenton W.

    2006-01-01

    The NASA Applied Sciences Directorate (ASD), part of the Earth-Sun System Division of NASA's Science Mission Directorate, has partnered with the U.S. Department of Agriculture (USDA) to enhance decision support in the area of agricultural efficiency, an application of national importance. The ASD integrated the results of NASA Earth science research into USDA decision support tools employed by the USDA Foreign Agricultural Service (FAS) Production Estimates and Crop Assessment Division (PECAD), which supports national decision making by gathering, analyzing, and disseminating global crop intelligence. Verification and validation of the following enhancements are summarized: 1) Near-real-time Moderate Resolution Imaging Spectroradiometer (MODIS) products through PECAD's MODIS Image Gallery; 2) MODIS Normalized Difference Vegetation Index (NDVI) time series data through the USDA-FAS MODIS NDVI Database; and 3) Jason-1 and TOPEX/Poseidon lake level estimates through PECAD's Global Reservoir and Lake Monitor. Where possible, each enhanced product was characterized for accuracy, timeliness, and coverage, and the characterized performance was compared to PECAD operational requirements. The MODIS Image Gallery and the GRLM are more mature and have achieved a semi-operational status, whereas the USDA-FAS MODIS NDVI Database is still evolving and should be considered

  19. A key management concept for the CTBT International Monitoring System

    SciTech Connect

    Herrington, P.; Draelos, T.; Craft, R.; Brickell, E.; Frankel, Y.; Silvestri, M.

    1997-08-01

    Cryptographic authentication (commonly referred to as 'technical authentication' in Working Group B) is an enabling technology which ensures the integrity of sensor data and the security of digital networks under various data security compromise scenarios. The use of cryptographic authentication, however, implies the development of a key management infrastructure for establishing trust in the generation and distribution of cryptographic keys. This paper proposes security and operational requirements for a CTBT (Comprehensive Test Ban Treaty) key management system and, furthermore, presents a public-key-based solution satisfying the requirements. The key management system is instantiated with trust distribution technologies similar to those currently implemented in industrial public key infrastructures. A complete system solution is developed.
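
    The integrity check performed at the receiving data centre can be sketched as follows. The paper proposes a public-key solution; since Python's standard library offers no signature primitives, this sketch substitutes an HMAC as a symmetric stand-in, and the key handling and frame layout are invented for illustration.

```python
import hashlib
import hmac

# Stand-in for signature-based data authentication: a keyed MAC appended to
# each sensor frame lets the receiver reject tampered data. The real
# proposal uses public-key signatures so stations need not share secrets.

def sign_frame(key: bytes, payload: bytes) -> bytes:
    return payload + hmac.new(key, payload, hashlib.sha256).digest()

def verify_frame(key: bytes, frame: bytes) -> bytes:
    payload, tag = frame[:-32], frame[-32:]
    expected = hmac.new(key, payload, hashlib.sha256).digest()
    if not hmac.compare_digest(tag, expected):  # constant-time comparison
        raise ValueError("authentication failure: frame rejected")
    return payload

# Hypothetical per-station key provisioned by the key management system.
station_key = b"per-station secret from the key management infrastructure"
frame = sign_frame(station_key, b"SEISMIC 1997-08-01T00:00:00Z 0.031 ...")
payload = verify_frame(station_key, frame)
```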

  20. Enhanced spacer-is-dielectric (sid) decomposition flow with model-based verification

    NASA Astrophysics Data System (ADS)

    Du, Yuelin; Song, Hua; Shiely, James; Wong, Martin D. F.

    2013-03-01

    Self-aligned double patterning (SADP) lithography is a leading candidate for 14nm node lower-metal layer fabrication. Besides the intrinsic overlay-tolerance capability, the accurate spacer width and uniformity control enables such technology to fabricate very narrow and dense patterns. Spacer-is-dielectric (SID) is the most popular flavor of SADP with higher flexibility in design. In the SID process, due to uniform spacer deposition, the spacer shape gets rounded at convex mandrel corners, and disregarding the corner rounding issue during SID decomposition may result in severe residue artifacts on device patterns. Previously, SADP decomposition was merely verified by Boolean operations on the decomposed layers, where the residue artifacts are not even identifiable. This paper proposes a model-based verification method for SID decomposition to identify the artifacts caused by spacer corner rounding. Then targeting residue artifact removal, an enhanced SID decomposition flow is introduced. Simulation results show that residue artifacts are removed effectively through the enhanced SID decomposition strategy.

  1. Dosimetric verification of enhanced dynamic wedges by a 2D ion chamber array

    NASA Astrophysics Data System (ADS)

    Oh, Se An; Kim, Sung Kyu; Kang, Min Kyu; Yea, Ji Woon; Kim, Eng Chan

    2013-12-01

    Wedge filters are commonly used to achieve dose uniformity in the target volume in radiotherapy and can be categorized as physical wedges (PWs) and enhanced dynamic wedges (EDWs). The EDW generates PW-like dose profiles by moving the upper jaw in the Y direction with a varying dose rate during the treatment beams. Task Group 53 of the AAPM (American Association of Physicists in Medicine) recommended that the dynamic wedge be verified before implementation in the radiation treatment planning (RTP) system. The aim of this study was to use the I'mRT MatriXX to verify the dose profiles of the EDWs manufactured by Varian. We used the Pencil Beam Convolution algorithm (Eclipse 8.6) for the calculations and the I'mRT MatriXX with the Plastic Water® phantom MULTICube for dose measurements. For wedge angles of 15°, 30°, 45° and 60°, the gamma passing rates between calculations and measurements for the EDWs were 84.84% and 86.54% at 2%/2 mm tolerance, and 99.47% and 99.64% at 3%/3 mm tolerance, respectively. The dose distributions differed between the calculations and the measurements in the penumbra and the outer beam regions of the wedge fields. We confirmed that the dosimetric verification of the EDWs was acceptable under the Task Group 53 criterion for external beam dose calculations.
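The gamma evaluation quoted above combines a dose-difference and a distance-to-agreement (DTA) criterion. A minimal 1D sketch of the standard gamma-index computation, using illustrative profile values rather than the paper's data:

```python
import math

def gamma_index(measured, calculated, positions, dta_mm, dose_tol):
    """1D gamma for each measured point: minimum over calculated points of
    sqrt((dr/DTA)**2 + (dD/dose_tol)**2); a point passes when gamma <= 1."""
    gammas = []
    for rm, dm in zip(positions, measured):
        g = min(
            math.sqrt(((rm - rc) / dta_mm) ** 2 + ((dm - dc) / dose_tol) ** 2)
            for rc, dc in zip(positions, calculated)
        )
        gammas.append(g)
    return gammas

positions = [0.0, 2.0, 4.0, 6.0, 8.0]    # mm along the wedge direction
calc = [100.0, 90.0, 80.0, 70.0, 60.0]   # planned dose (%)
meas = [101.0, 91.5, 79.0, 69.0, 61.0]   # measured dose (%)

gammas = gamma_index(meas, calc, positions, dta_mm=3.0, dose_tol=3.0)
pass_rate = 100.0 * sum(g <= 1.0 for g in gammas) / len(gammas)
```

A 3%/3 mm criterion, as in the abstract, means `dose_tol=3.0` (percent) and `dta_mm=3.0`; the passing rate is the fraction of points with gamma at or below 1.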

  2. Evaluation of infrasonic detection capability for the CTBT/IMS

    SciTech Connect

    Armstrong, W.T.; Whitaker, R.W.; Olson, J.V.

    1996-09-01

    Evaluation of infrasonic detection capability for the International Monitoring System of the Comprehensive Test Ban Treaty (IMS/CTBT) is made with respect to signal analysis and global coverage. Signal analysis is anecdotally reviewed with respect to composite power, correlation, and F-statistic detection algorithms. In the absence of adaptive pre-filtering, either cross-correlation or F-statistic detection is required. As an unbounded quantity, the F-statistic offers potentially greater sensitivity to signals of interest. With pure-state pre-filtering, power detection begins to become competitive with correlation and F-statistic detection. Additional application of simple post-filters of minimum duration and maximum bearing deviation results in unique positive detection of an identified impulsive infrasonic signal. Global coverage estimates are performed as a useful deterministic evaluation of networks, offering an easily interpreted measure of network performance that complements previous probabilistic network evaluations. In particular, adequate coverage (2 sites), uniform coverage, and redundant coverage (3 to 4 sites) provide figures of merit for evaluating detection, location, and vulnerability, respectively. Coverage estimates of the I60 network indicate generally adequate coverage for the majority of the globe. A modest increase of station gain (increasing the number of elements from 4 to 7) results in a significant increase in coverage for mean signal values. Ineffective and vulnerable sites are identified, suggesting that further refinement of the network is possible.
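A minimal sketch of the F-statistic detector discussed above, for an array of time-aligned traces: the ratio of beam power to residual power, each normalized by its degrees of freedom, is near 1 for uncorrelated noise and grows without bound for a coherent signal. The synthetic data, window length, and noise levels are illustrative.

```python
import math
import random

def f_statistic(traces):
    """Fisher detector for an N-channel array of time-aligned traces."""
    n = len(traces)                  # number of sensors
    t = len(traces[0])               # samples per window
    beam = [sum(col) / n for col in zip(*traces)]
    # Beam power (coherent part), normalized by its degrees of freedom.
    beam_power = sum(n * b * b for b in beam) / t
    # Residual power (incoherent part) across all channels.
    resid_power = sum(
        (x - b) ** 2 for trace in traces for x, b in zip(trace, beam)
    ) / (t * (n - 1))
    return beam_power / resid_power

random.seed(0)
signal = [math.sin(0.2 * k) for k in range(400)]
noise_only = [[random.gauss(0, 1) for _ in range(400)] for _ in range(4)]
with_signal = [[s + random.gauss(0, 0.3) for s in signal] for _ in range(4)]

f_noise = f_statistic(noise_only)     # near 1
f_signal = f_statistic(with_signal)   # well above 1
```

The unbounded growth of F with signal-to-noise ratio is what the abstract refers to as "potentially greater sensitivity" compared with bounded correlation measures.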

  3. Evaluation of database technologies for the CTBT Knowledge Base prototype

    SciTech Connect

    Keyser, R.; Shepard-Dombroski, E.; Baur, D.; Hipp, J.; Moore, S.; Young, C.; Chael, E.

    1996-11-01

    This document examines a number of different software technologies in the rapidly changing field of database management systems, evaluates these systems in light of the expected needs of the Comprehensive Test Ban Treaty (CTBT) Knowledge Base, and makes some recommendations for the initial prototypes of the Knowledge Base. The Knowledge Base requirements are examined and then used as criteria for evaluation of the database management options. A mock-up of the data expected in the Knowledge Base is used as a basis for examining how four different database technologies deal with the problems of storing and retrieving the data. Based on these requirements and the results of the evaluation, the recommendation is that the Illustra database be considered for the initial prototype of the Knowledge Base. Illustra offers a unique blend of performance, flexibility, and features that will aid in the implementation of the prototype. At the same time, Illustra provides a high level of compatibility with the hardware and software environments present at the US NDC (National Data Center) and the PIDC (Prototype International Data Center).

  4. Investigation of CTBT OSI Radionuclide Techniques at the DILUTED WATERS Nuclear Test Site

    SciTech Connect

    Baciak, James E.; Milbrath, Brian D.; Detwiler, Rebecca S.; Kirkham, Randy R.; Keillor, Martin E.; Lepel, Elwood A.; Seifert, Allen; Emer, Dudley; Floyd, Michael

    2012-11-01

    Under the Comprehensive Nuclear-Test-Ban Treaty (CTBT), a verification regime that includes the ability to conduct an On-Site Inspection (OSI) will be established. The Treaty allows an OSI to include many techniques, including the radionuclide techniques of gamma radiation surveying and spectrometry and environmental sampling and analysis. Such radioactivity detection techniques can provide the “smoking gun” evidence that a nuclear test has occurred through the detection and quantification of indicative recent fission products. An OSI faces restrictions in time and manpower, as dictated by the Treaty, as well as possible logistical difficulties due to the location and climate of the suspected explosion site. It is thus necessary to have a good understanding of the possible source term an OSI will encounter and of the techniques needed for an effective OSI regime. One of the challenges during an OSI is to locate radioactive debris that has escaped an underground nuclear explosion (UNE) and settled on the surface near and downwind of ground zero. To support the understanding and selection of sampling and survey techniques for use in an OSI, we are currently designing an experiment, the Particulate Release Experiment (PRex), to simulate a small-scale vent from an underground nuclear explosion. PRex will occur at the Nevada National Security Site (NNSS). The project is conducted under the National Center for Nuclear Security (NCNS) funded by the National Nuclear Security Administration (NNSA). Prior to the release experiment, scheduled for Spring 2013, the project scheduled a number of activities at the NNSS to prepare for the release experiment and to utilize the nuclear-testing past of the NNSS for the development of OSI techniques for the CTBT. One such activity, the focus of this report, was a survey and sampling campaign at the site of an old UNE that vented: DILUTED WATERS. Activities at DILUTED WATERS included vehicle-based survey

  5. Towards a new daily in-situ precipitation data set supporting parameterization of wet-deposition of CTBT relevant radionuclides

    NASA Astrophysics Data System (ADS)

    Becker, A.; Ceranna, L.; Ross, O.; Schneider, U.; Meyer-Christoffer, A.; Ziese, M.; Lehner, K.; Rudolf, B.

    2012-04-01

    As a contribution to the World Climate Research Programme (WCRP) and in support of the Global Climate Observing System (GCOS) of the World Meteorological Organization (WMO), the Deutscher Wetterdienst (DWD) operates the Global Precipitation Climatology Centre (GPCC). The GPCC re-analysis and near-real-time monitoring products are recognized worldwide as the most reliable global data sets of rain-gauge-based (in-situ) precipitation measurements. The GPCC Monitoring Product (Schneider et al., 2011; Becker et al., 2012; Ziese et al., EGU2012-5442) is available two months after the fact, based on SYNOP and CLIMAT messages gathered from the GTS. This product also serves as the reference data for calibrating satellite-based precipitation measurements, yielding the Global Precipitation Climatology Project (GPCP) data set (Huffman et al., 2009). The quickest GPCC product is the First Guess version of the GPCC Monitoring Product, available only 3-5 days after the month in question. Both the GPCC and the GPCP products can serve as the data base for computationally lightweight post-processing of the wet-deposition impact on the radionuclide (RN) monitoring capability of the CTBT network (Wotawa et al., 2009) on the regional and global scale, respectively. This is of major importance whenever a reliable quantitative assessment of the source-receptor sensitivity is needed, e.g. for the analysis of isotopic ratios. Indeed, accounting for wet deposition is a prerequisite whenever ratios of particulate and noble gas measurements come into play. This field is still largely unexplored, but such accounting would help clear several apparently CTBT-relevant detections encountered in the past as bogus, and would provide an assessment of the hitherto overestimated RN detection capability of the CTBT network.
Besides this climatological kind of wet-deposition assessment for threshold monitoring purposes, there are also singular
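A minimal sketch of the kind of wet-deposition post-processing described above: airborne particulate activity is depleted by a power-law scavenging coefficient driven by a gauge-based rain rate. The coefficients `a` and `b` are generic particulate washout values used for illustration, not those of any specific CTBT model.

```python
import math

def scavenging_coefficient(precip_mm_per_h, a=8.4e-5, b=0.79):
    """Wet-scavenging coefficient Lambda (1/s) from the rain rate P,
    using the common power-law form Lambda = a * P**b.  The values of
    a and b here are illustrative particulate coefficients."""
    return a * precip_mm_per_h ** b

def deplete(activity, precip_mm_per_h, hours):
    """Fraction of airborne activity remaining after wet deposition:
    exponential depletion exp(-Lambda * t) over the rain event."""
    lam = scavenging_coefficient(precip_mm_per_h)
    return activity * math.exp(-lam * hours * 3600.0)

# A 6 h, 2 mm/h rain event removes most of a particulate plume segment.
remaining = deplete(1.0, 2.0, 6.0)
removed_fraction = 1.0 - remaining
```

Driving such a depletion term with the GPCC daily in-situ product, rather than modeled precipitation, is the use case the abstract motivates.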

  6. Surface-wave calibration studies for improved monitoring of a CTBT

    SciTech Connect

    Patton, H.J.; Jones, L.E.

    1998-12-31

    Seismic calibration of the International Monitoring System (IMS) and other key monitoring stations is critical for effective verification of a Comprehensive Test Ban Treaty (CTBT). Detection, location, and identification all depend upon calibration of source and path effects to ensure maximum efficiency of the IMS to monitor at small magnitudes. This project gathers information about the effects of source and propagation on surface waves for key monitoring areas in central Asia, with initial focus on western China. Source calibration focuses on surface-wave determinations of focal depth and seismic moment, M{sub o}, for key earthquakes, which serve as calibration sources in location studies and for developing regional magnitude scales. The authors present a calibration procedure for Lg attenuation, which exploits an empirical relationship between M{sub o} and 1-Hz Lg amplitude for stable and tectonic continental regions. The procedure uses this relationship and estimates of M{sub o} to predict Lg amplitudes at a reference distance of 10 km from each calibrated source. Path-specific estimates of Q{sub o} in the power-law formula of Q (Q = Q{sub o}f{sup {zeta}}) are made using measurements of 1-Hz Lg amplitudes observed at the station and amplitudes predicted for the reference distance. Nuttli's formula for m{sub b}(Lg) is thus calibrated for the source region of interest, and for paths to key monitoring stations. Path calibration focuses on measurement of surface-wave group velocity dispersion curves in the period range of 5 to 50 s. Concentrating on the Lop Nor source region initially, they employ broadband data recorded at CDSN stations, regional events (M > 4.0), and source-receiver path lengths from 200 to 2000 km. Their approach emphasizes path-specific calibration of key stations and source regions and will result in a family of regionally appropriate phase-match filters, designed to extract fundamental mode surface-wave arrivals for each region of interest
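The Lg calibration procedure above can be sketched as a round trip: forward-model the amplitude from a reference distance with geometrical spreading and anelastic attenuation, then invert that relation for path-average Q from one observed station amplitude. The numbers below are illustrative; only the 10 km reference distance follows the abstract, and the spreading exponent and Lg group velocity are common assumed values.

```python
import math

def predicted_lg_amplitude(a0, r0_km, r_km, q, f_hz=1.0, v_km_s=3.5):
    """Lg amplitude at distance r, from the amplitude a0 at reference
    distance r0: (r0/r)**0.5 spreading times exp(-pi*f*(r-r0)/(Q*v))."""
    spreading = math.sqrt(r0_km / r_km)
    attenuation = math.exp(-math.pi * f_hz * (r_km - r0_km) / (q * v_km_s))
    return a0 * spreading * attenuation

def calibrate_q(a0, a_obs, r0_km, r_km, f_hz=1.0, v_km_s=3.5):
    """Invert the relation above for path-average Q, given the predicted
    reference amplitude a0 and the observed station amplitude a_obs."""
    x = math.log(a0 / a_obs * math.sqrt(r0_km / r_km))
    return math.pi * f_hz * (r_km - r0_km) / (v_km_s * x)

# Round trip: synthesize a station amplitude with Q = 400, then recover Q.
a_station = predicted_lg_amplitude(1000.0, 10.0, 800.0, q=400.0)
q_est = calibrate_q(1000.0, a_station, 10.0, 800.0)
```

In the actual procedure, `a0` comes from the empirical M{sub o}-versus-Lg relationship rather than being assumed.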

  7. Meteosat Second Generation: in-flight verification and calibration of the spinning enhanced visible and infrared imager (SEVIRI)

    NASA Astrophysics Data System (ADS)

    Ottenbacher, Andreas; Aminou, Donny M. A.

    2001-02-01

    The Meteosat Second Generation (MSG) program consists of a series of 3 geostationary satellites. The objectives for this program have been defined by the European meteorological community led by the European Organisation for the Exploitation of Meteorological Satellites (EUMETSAT) and the European Space Agency (ESA). The first MSG satellite (MSG-1) is currently being developed by ESA and scheduled for launch in 2002. MSG-2 and MSG-3 will be procured by ESA on behalf of EUMETSAT. They will be built by European industry under ESA contract and are scheduled for launch in 2004 and 2009. EUMETSAT will launch and operate the three satellites and provide data until 2014. This paper focuses on the foreseen in-flight verification and calibration of the main payload on board the MSG satellites: the Spinning Enhanced Visible and Infrared Imager (SEVIRI). It commences with a brief description of the main elements of the MSG satellite and the SEVIRI instrument. Then the in-flight verification approach for functional and performance specifications will be presented. Finally, the paper will provide the envisaged in-flight calibration method for the SEVIRI instrument.

  8. Verification of Precipitation Enhancement due to Winter Orographic Cloud Seeding in the Payette River Basin of Western Idaho

    NASA Astrophysics Data System (ADS)

    Holbrook, V. P.; Kunkel, M. L.; Blestrud, D.

    2013-12-01

    The Idaho Power Company (IPCo) is a hydroelectric-based utility serving eastern Oregon and most of southern Idaho. Snowpack is critical to IPCo operations, and the company has invested in a winter orographic cloud seeding program for the Payette, Boise, and Upper Snake River basins to augment the snowpack. IPCo and the National Center for Atmospheric Research (NCAR) are in the middle of a two-year study to determine the precipitation enhancement due to winter orographic cloud seeding in the Payette River basin. NCAR developed a cloud seeding module, as an enhancement to the Weather Research and Forecasting (WRF) model, that inputs silver iodide released from ground-based and/or aircraft generators. The cloud seeding module then increases the precipitation as a function of the cloud seeding. The WRF model used for this program is run at the University of Arizona at a resolution of 1.8 kilometers using Thompson microphysics and the Mellor-Yamada-Janjic boundary layer scheme. Two different verification schemes are being used to determine precipitation enhancement: model versus model, and model versus precipitation gauges. In the model-versus-model method, a control model run uses NCAR-developed criteria to identify the best times to operate ground-based or airborne seeding generators and also establishes the baseline precipitation. The model is then rerun with the cloud seeding module turned on for the time periods determined by the control run. The precipitation enhancement due to cloud seeding is then the difference in precipitation between the control and seeding model runs. The second verification method is to take the model forecast precipitation in the seeded and non-seeded areas, compare it against observed precipitation (mainly from SNOTEL gauges), and determine the precipitation enhancement due to cloud seeding. Up to 15 SNOTEL gauges in or near the Payette River basin along with 14 IPCo high-resolution rain gauges will be used with this target
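Both verification schemes reduce to simple differencing once the model runs and gauge observations are in hand. A minimal sketch with illustrative numbers (not IPCo data):

```python
def seeding_enhancement(control_mm, seeded_mm):
    """Model-vs-model verification: per-point and total precipitation
    added by the seeding module (seeded run minus control run)."""
    diffs = [s - c for c, s in zip(control_mm, seeded_mm)]
    return diffs, sum(diffs)

def gauge_bias(model_mm, gauge_mm):
    """Model-vs-gauge verification: mean model-minus-observation bias
    over the SNOTEL / rain-gauge sites."""
    return sum(m - g for m, g in zip(model_mm, gauge_mm)) / len(gauge_mm)

control = [10.2, 8.5, 12.0, 9.1]   # control-run precipitation (mm)
seeded = [11.0, 9.3, 12.4, 9.8]    # rerun with the seeding module on
gauges = [10.5, 8.1, 12.3, 9.0]    # observed precipitation at gauge sites

diffs, total_added = seeding_enhancement(control, seeded)
bias = gauge_bias(control, gauges)
```

In practice the gauge comparison is done for both seeded and non-seeded areas so that model bias can be separated from the seeding signal.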

  9. A Visual Analytics Approach to Structured Data Analysis to Enhance Nonproliferation and Arms Control Verification Activities

    SciTech Connect

    Gillen, David S.

    2014-08-07

    Analysis activities for Nonproliferation and Arms Control verification require the use of many types of data. Tabular structured data, such as Excel spreadsheets and relational databases, have traditionally been used for data mining activities, where specific queries are issued against data to look for matching results. The application of visual analytics tools to structured data enables further exploration of datasets to promote discovery of previously unknown results. This paper discusses the application of a specific visual analytics tool to datasets related to the field of Arms Control and Nonproliferation to promote the use of visual analytics more broadly in this domain. Visual analytics focuses on analytical reasoning facilitated by interactive visual interfaces (Wong and Thomas 2004). It promotes exploratory analysis of data and complements data mining technologies, which search for known patterns. With a human in the loop, visual analytics also brings domain knowledge and subject matter expertise to bear. Visual analytics has not been widely applied to this domain. In this paper, we focus on one type of data, structured data, and show the results of applying a specific visual analytics tool to answer questions in the Arms Control and Nonproliferation domain. We chose the T.Rex tool, a visual analytics tool developed at PNNL, which uses a variety of visual exploration patterns to discover relationships in structured datasets, including a facet view, graph view, matrix view, and timeline view. The facet view enables discovery of relationships between categorical information, such as countries and locations. The graph tool visualizes node-link relationship patterns, such as the flow of materials being shipped between parties. The matrix visualization shows highly correlated categories of information. The timeline view shows temporal patterns in data.
In this paper, we will use T.Rex with two different datasets to demonstrate how interactive exploration of

  10. Test report for the infrasound prototype: For a CTBT IMS station

    SciTech Connect

    Breding, D.R.; Kromer, R.P.; Whitaker, R.W.; Sandoval, T.

    1997-11-01

    This document describes the results of the Comprehensive Test Ban Treaty (CTBT) Infrasound Prototype Development Test and Evaluation (DT&E). During DT&E the infrasound prototype was evaluated against requirements listed in the System Requirements Document (SRD) based on the Conference on Disarmament/Ad Hoc Committee on a Nuclear Test Ban/Working Papers 224 and 283 and the Preparatory Commission specifications as defined in CTBT/PC/II/1/Add.2, Appendix X, Table 5. The evaluation was conducted during a two-day period, August 6-7, 1997. The System Test Plan (STP) defined the plan and methods to test the infrasound prototype. Specific tests that were performed are detailed in the Test Procedures (TP).

  11. Feedback Seeking in Early Adolescence: Self-Enhancement or Self-Verification?

    ERIC Educational Resources Information Center

    Rosen, Lisa H.; Principe, Connor P.; Langlois, Judith H.

    2013-01-01

    The authors examined whether early adolescents ("N" = 90) solicit self-enhancing feedback (i.e., positive feedback) or self-verifying feedback (i.e., feedback congruent with self-views, even when these views are negative). Sixth, seventh, and eighth graders first completed a self-perception measure and then selected whether to receive…

  12. SU-E-J-16: Automatic Image Contrast Enhancement Based On Automatic Parameter Optimization for Radiation Therapy Setup Verification

    SciTech Connect

    Qiu, J; Li, H. Harold; Zhang, T; Yang, D; Ma, F

    2015-06-15

    Purpose: In RT patient setup 2D images, tissues often cannot be seen well due to the lack of image contrast. Contrast enhancement features provided by image reviewing software, e.g. Mosaiq and ARIA, require manual selection of the image processing filters and parameters, which is inefficient and cannot be automated. In this work, we developed a novel method to automatically enhance 2D RT image contrast to allow automatic verification of patient daily setups as a prerequisite step of automatic patient safety assurance. Methods: The new method is based on the contrast limited adaptive histogram equalization (CLAHE) and high-pass filtering algorithms. The most important innovation is to automatically select the optimal parameters by optimizing the image contrast. The image processing procedure includes the following steps: 1) background and noise removal, 2) high-pass filtering by subtracting the Gaussian-smoothed result, and 3) histogram equalization using the CLAHE algorithm. Three parameters were determined through an iterative optimization based on the interior-point constrained optimization algorithm: the Gaussian smoothing weighting factor, and the CLAHE algorithm block size and clip limit parameters. The goal of the optimization is to maximize the entropy of the processed result. Results: A total of 42 RT images were processed. The results were visually evaluated by RT physicians and physicists. About 48% of the images processed by the new method were ranked as excellent. In comparison, only 29% and 18% of the images processed by the basic CLAHE algorithm and by basic window-level adjustment, respectively, were ranked as excellent. Conclusion: This new image contrast enhancement method is robust and automatic, and is able to significantly outperform the basic CLAHE algorithm and the manual window-level adjustment process that are currently used in clinical 2D image review software tools.
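A minimal sketch of two ingredients of the method above: the histogram-entropy objective the optimization maximizes, and histogram equalization itself. Plain global equalization stands in here for CLAHE, which applies clipped equalization per tile; the "image" is a synthetic low-contrast array, not clinical data.

```python
import math
from collections import Counter

def entropy(pixels):
    """Shannon entropy (bits) of the gray-level histogram: the quantity
    the paper's parameter optimization maximizes."""
    counts = Counter(pixels)
    n = len(pixels)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

def equalize(pixels, levels=256):
    """Plain global histogram equalization via the cumulative histogram
    (a simplified stand-in for tile-based, clip-limited CLAHE)."""
    n = len(pixels)
    hist = Counter(pixels)
    cdf, acc = {}, 0
    for v in sorted(hist):
        acc += hist[v]
        cdf[v] = acc
    return [round((cdf[p] - 1) / (n - 1) * (levels - 1)) for p in pixels]

# A low-contrast image: gray values crowded into a narrow band.
flat_image = [100 + (i % 8) for i in range(640)]
enhanced = equalize(flat_image)

spread_before = max(flat_image) - min(flat_image)   # narrow band
spread_after = max(enhanced) - min(enhanced)        # stretched range
```

CLAHE additionally clips each tile's histogram before equalizing, which limits noise amplification; the block size and clip limit are exactly the parameters the abstract's optimizer tunes.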

  13. An experimental verification of metamaterial coupled enhanced transmission for antenna applications

    SciTech Connect

    Pushpakaran, Sarin V.; Raj, Rohith K.; Pradeep, Anju; Ouseph, Lindo; Hari, Mridula; Chandroth, Aanandan; Pezholil, Mohanan; Kesavath, Vasudevan

    2014-02-10

    Inspired by the work of Bethe on electromagnetic transmission through a subwavelength hole, there has been immense interest in the extraordinary transmission through subwavelength slots/slits on metal plates. The invention of metamaterials has boosted the extraordinary transmission through subwavelength slots. We examine computationally and experimentally the concept of a metamaterial cover using an array of split ring resonators (SRRs) for enhancing the transmission in a stacked dipole antenna working in the S band. The front-to-back ratio is considerably improved by enhancing the magnetic resonant strength in close proximity to the slit of the upper parasitic dipole. The effect of the stacking height of the SRR monolayer on the resonant characteristics of the split ring resonators, and its effect on the antenna radiation characteristics, has been studied.

  14. Cimstation/IM enhanced data verification, CRADA final report for CRADA number Y-1292-0162

    SciTech Connect

    Biddix, M.D.; Turner, J.

    1994-05-16

    This report discusses software developed under a CRADA to enhance Cimstation verification of inspection part programs as they are being developed. The report briefly discusses the following topics and includes a code listing: algorithm explanation; Cimstation CAD models; importing inspection point data; the need for a new algorithm; details of the algorithms; formulas/mathematics used for the algorithm; the algorithm software design diagram; and software function descriptions.

  15. Exhaustive and Systematic Accuracy Verification and Enhancement of STI Stress Compact Model for General Realistic Layout Patterns

    NASA Astrophysics Data System (ADS)

    Yamada, Kenta; Syo, Toshiyuki; Yoshimura, Hisao; Ito, Masaru; Kunikiyo, Tatsuya; Kanamoto, Toshiki; Kumashiro, Shigetaka

    Layout-aware compact models proposed so far have been generally verified only for simple test patterns. However, real designs use much more complicated layout patterns. Therefore, models must be verified for such patterns to establish their practicality. This paper proposes a methodology and test patterns for exhaustively and systematically validating layout-aware compact models for general layout patterns for the first time. The methodology and test patterns are concretely shown through validation of a shallow trench isolation (STI) stress compact model proposed in [1]. First, the model parameters for a 55-nm CMOS technology are extracted, and then the model is verified and established to be accurate for the basic patterns used for parameter extraction. Next, fundamental ideas of model operation for general layout patterns are verified using various verification patterns. These tests revealed that the model is relatively weak in some cases not included in the basic patterns. Finally, the errors for these cases are eliminated by enhancing the algorithm. Consequently, the model is confirmed to have high generality. This methodology will be effective for validating other layout-aware compact models for general layout patterns.

  16. Physical verification of contaminated sediment remediation: Capping, confined aquatic disposal, and enhanced natural recovery

    SciTech Connect

    Browning, D.

    1995-12-31

    Dredging and disposal in a confined aquatic disposal (CAD) site, capping with clean sediment, and natural recovery are commonly used, cost-effective remedial practices for contaminated sediments. Recent projects in Puget Sound, Washington and Southern California involved dredging and use of the material for capping and CAD fill. Both of these projects required physical monitoring to document sediment placement. Dredged sediments placed at these sites were optically identified using sediment vertical profile system (SVPS) photography. Optical criteria to distinguish cap/construction materials include grain size, reflectance, and texture. Environmental parameters such as the extent and thickness of the CAD material or sediment cap deposits are evaluated against design and performance goals, typically the isolation of contaminants from the biologically active portion of the sediment column. Using SVPS, coring, and other technologies, the stratigraphic contact between the capping/CAD sediment and the native sediment can be discerned. These observations can ground-truth remote sensing and be coupled with it to provide a more complete characterization of the entire remedial area. Physical isolation of the benthic community can be discerned by examining SVPS images for depth of bioturbation and sediment stratigraphy. On the periphery of cap/CAD deposits, thin layers of clean sediment ranging upwards from 1 mm thick can be identified. Depending on the pre-remediation benthic community at the site, these thin layers of cap/CAD sediment can be bioturbated by resident benthic infauna immediately after placement. The deposition and subsequent assimilation of the clean cap material into the contaminated sediments effectively reduces the concentration of contaminants in the biologically active zone, thereby enhancing natural recovery in areas where regulatory criteria are focused on the biologically active zone.

  17. Localized surface plasmon resonance immunoassay and verification using surface-enhanced Raman spectroscopy

    NASA Astrophysics Data System (ADS)

    Yonzon, Chanda R.; Zhang, Xiaoyu; Van Duyne, Richard P.

    2003-11-01

    This work exploits the localized surface plasmon resonance (LSPR) spectroscopy of noble metal nanoparticles to achieve sensitive and selective detection of biological analytes. Noble metal nanoparticles exhibit an LSPR that is strongly dependent on their size, shape, material, and the local dielectric environment. The LSPR is also responsible for the intense signals observed in surface-enhanced Raman scattering (SERS). Ag nanoparticles fabricated using the nanosphere lithography (NSL) technique exploit this LSPR sensitivity as a signal transduction method in biosensing applications. The current work implements LSPR biosensing for the anti-dinitrophenyl (antiDNP) immunoassay system. Upon forming the 2,4-dinitrobenzoic acid/antiDNP complex, this system shows a large LSPR shift of 44 nm when exposed to an antiDNP concentration of 1.5 x 10-6 M. In addition, due to the unique molecular characteristics of the functional groups on the biosensor, it can also be characterized using SERS. First, the nanoparticles are functionalized with a mixed self-assembled monolayer (SAM) comprised of 2:1 octanethiol and 11-amino undecanethiol. The SAM is exposed to 2,4-dinitrobenzoic acid with the 1-ethyl-3-[3-dimethylaminopropyl]carbodiimide hydrochloride (EDC) coupling reagent. Finally, the 2,4-dinitrophenyl-terminated SAM is exposed to various concentrations of antiDNP. LSPR shifts indicate the occurrence of a binding event. SER spectra confirm binding of 2,4-dinitrobenzoic acid with the amine-terminated SAM. This LSPR/SERS biosensing method can be generalized to a myriad of biologically relevant systems.

  18. Hardware design document for the Infrasound Prototype for a CTBT IMS station

    SciTech Connect

    Breding, D.R.; Kromer, R.P.; Whitaker, R.W.; Sandoval, T.

    1997-11-01

    The Hardware Design Document (HDD) describes the various hardware components used in the Comprehensive Test Ban Treaty (CTBT) Infrasound Prototype and their interrelationships. It divides the infrasound prototype into hardware configuration items (HWCIs). The HDD uses techniques such as block diagrams and parts lists to present this information. The level of detail provided in the following sections should be sufficient to allow potential users to procure and install the infrasound system. Infrasonic monitoring is a low-cost, robust, and effective technology for detecting atmospheric explosions. Low frequencies from explosion signals propagate to long ranges (a few thousand kilometers) where they can be detected with an array of sensors.

  19. Big Data Solution for CTBT Monitoring Using Global Cross Correlation

    NASA Astrophysics Data System (ADS)

    Gaillard, P.; Bobrov, D.; Dupont, A.; Grenouille, A.; Kitov, I. O.; Rozhkov, M.

    2014-12-01

    Due to the mismatch between data volume and the performance of the Information Technology infrastructure used in seismic data centers, it is becoming more and more difficult to process all the data with traditional applications in a reasonable elapsed time. To fulfill their missions, the International Data Centre of the Comprehensive Nuclear-Test-Ban Treaty Organization (CTBTO/IDC) and the Département Analyse Surveillance Environnement of the Commissariat à l'Energie atomique et aux énergies alternatives (CEA/DASE) collect, process and produce complex data sets whose volume is growing exponentially. In the medium term, computer architectures, data management systems and application algorithms will require fundamental changes to meet the needs. This problem is well known and identified as a "Big Data" challenge. To tackle this major task, the CEA/DASE is taking part for two years in the "DataScale" project. Started in September 2013, DataScale gathers a large set of partners (research laboratories, SMEs and big companies). The common objective is to design efficient solutions exploiting the synergy between Big Data solutions and High Performance Computing (HPC). The project will evaluate the relevance of these technological solutions by implementing a demonstrator for seismic event detection based on massive waveform cross correlation. The IDC has developed expertise in such techniques, leading to an algorithm called "Master Event", and provides a high-quality dataset for an extensive cross correlation study. The objective of the project is to enhance the Master Event algorithm and to reanalyze 10 years of waveform data from the International Monitoring System (IMS) network using a dedicated HPC infrastructure operated by the "Centre de Calcul Recherche et Technologie" at the CEA site of Bruyères-le-Châtel. The dataset used for the demonstrator includes more than 300,000 seismic events, tens of millions of raw detections and more than 30 terabytes of continuous seismic data

  20. Complete regional waveform modeling to estimate seismic velocity structure and source parameters for CTBT monitoring

    SciTech Connect

    Bredbeck, T; Rodgers, A; Walter, W

    1999-07-23

    The velocity structures and source parameters estimated by waveform modeling provide valuable information for CTBT monitoring. The inferred crustal and uppermost mantle structures advance understanding of tectonics and guide regionalization for event location and identification efforts. Estimation of source parameters such as seismic moment, depth and mechanism (whether earthquake, explosion or collapse) is crucial to event identification. In this paper we briefly outline some of the waveform modeling research for CTBT monitoring performed in the last year. In the future we will estimate structure for new regions by modeling waveforms of large well-observed events along additional paths. Of particular interest will be the estimation of velocity structure in aseismic regions such as most of Africa and the Former Soviet Union. Our previous work on aseismic regions in the Middle East, north Africa and south Asia gives us confidence to proceed with our current methods. Using the inferred velocity models we plan to estimate source parameters for smaller events. It is especially important to obtain seismic moments of earthquakes for use in applying the Magnitude-Distance Amplitude Correction (MDAC; Taylor et al., 1999) to regional body-wave amplitudes for discrimination and calibrating the coda-based magnitude scales.

  1. Acoustic-Seismic Coupling of Broadband Signals - Analysis of Potential Disturbances during CTBT On-Site Inspection Measurements

    NASA Astrophysics Data System (ADS)

    Liebsch, Mattes; Altmann, Jürgen

    2015-04-01

    For verification of the Comprehensive Nuclear-Test-Ban Treaty (CTBT), the precise localisation of possible underground nuclear explosion sites is important. During an on-site inspection (OSI), sensitive seismic measurements of aftershocks can be performed, which, however, can be disturbed by other signals. To improve the quality and effectiveness of these measurements it is essential to understand those disturbances so that they can be reduced or prevented. In our work we focus on disturbing signals caused by airborne sources: when the sound of aircraft (as often used by the inspectors themselves) hits the ground, it propagates through pores in the soil. Its energy is transferred to the ground, creating soil vibrations that can mask weak aftershock signals. The coupling of acoustic waves to the ground is still incompletely understood, yet this understanding is needed to improve the performance of an OSI, e.g. to address potential consequences for sensor placement, helicopter trajectories, etc. We present our recent advances in this field. We performed several measurements to record the sound pressure and soil velocity produced by various sources, e.g. broadband excitation by jet aircraft passing overhead and signals artificially produced by a speaker. For our experimental set-up, microphones were placed close to the ground and geophones were buried at different depths in the soil. Several sensors were shielded from the directly incident acoustic signals by a box coated with acoustic damping material. While the sound pressure under the box was strongly reduced, the soil velocity measured under the box was only slightly smaller than outside of it. Thus these soil vibrations were mostly created outside the box and travelled through the soil to the sensors. This information is used to estimate characteristic propagation lengths of the acoustically induced signals in the soil. In the seismic data we observed interference patterns which are likely caused by the

  2. Applying monitoring, verification, and accounting techniques to a real-world, enhanced oil recovery operational CO2 leak

    USGS Publications Warehouse

    Wimmer, B.T.; Krapac, I.G.; Locke, R.; Iranmanesh, A.

    2011-01-01

    The use of carbon dioxide (CO2) for enhanced oil recovery (EOR) is being tested in oil fields of the Illinois Basin, USA. While this technology has shown promise for improving oil production, it has raised some issues about the safety of CO2 injection and storage. The Midwest Geological Sequestration Consortium (MGSC) organized a Monitoring, Verification, and Accounting (MVA) team to develop and deploy monitoring programs at three EOR sites in Illinois, Indiana, and Kentucky, USA. MVA goals include establishing baseline conditions to evaluate potential impacts from CO2 injection, demonstrating that project activities are protective of human health and the environment, and providing an accurate accounting of stored CO2. This paper focuses on the use of MVA techniques in monitoring a small CO2 leak from a supply line at an EOR facility under real-world conditions. The ability of shallow monitoring techniques to detect and quantify a CO2 leak under such conditions has been largely unproven. In July 2009, a leak in the pipe supplying pressurized CO2 to an injection well was observed at an MGSC EOR site located in west-central Kentucky. Carbon dioxide was escaping from the supply pipe, located approximately 1 m underground. The leak was discovered visually by site personnel and injection was halted immediately. At its largest extent, the hole created by the leak was approximately 1.9 m long by 1.7 m wide and 0.7 m deep. This circumstance provided an excellent opportunity to evaluate the performance of several monitoring techniques, including soil CO2 flux measurements, portable infrared gas analysis, thermal infrared imagery, and aerial hyperspectral imagery. Valuable experience was gained during this effort. Lessons learned included: 1) hyperspectral imagery was not effective in detecting this relatively small, short-term CO2 leak; and 2) even though injection was halted, the leak remained dynamic and presented a safety concern

  3. Seismic Characterization of Coal-Mining Seismicity in Utah for CTBT Monitoring

    SciTech Connect

    Arabasz, W J; Pechmann, J C

    2001-03-01

    Underground coal mining (down to ~0.75 km depth) in the contiguous Wasatch Plateau (WP) and Book Cliffs (BC) mining districts of east-central Utah induces abundant seismicity that is monitored by the University of Utah regional seismic network. This report presents the results of a systematic characterization of mining seismicity (magnitude ≤ 4.2) in the WP-BC region from January 1978 to June 2000, together with an evaluation of three seismic events (magnitude ≤ 4.3) associated with underground trona mining in southwestern Wyoming during January-August 2000. (Unless specified otherwise, magnitude implies Richter local magnitude, ML.) The University of Utah Seismograph Stations (UUSS) undertook this cooperative project to assist the University of California Lawrence Livermore National Laboratory (LLNL) in research and development relating to monitoring the Comprehensive Test Ban Treaty (CTBT). The project, which formally began February 28, 1998, and ended September 1, 2000, had three basic objectives: (1) strategically install a three-component broadband digital seismic station in the WP-BC region to ensure the continuous recording of high-quality waveform data to meet the long-term needs of LLNL, UUSS, and other interested parties, including the international CTBT community; (2) determine source mechanisms, to the extent that available source data and resources allowed, for comparative seismic characterization of stress release in mines versus earthquakes in the WP-BC study region; and (3) gather and report to LLNL local information on mine operations and associated seismicity, including "ground truth" for significant events. Following guidance from LLNL's Technical Representative, the focus of Objective 2 was changed slightly to place emphasis on three mining-related events that occurred in and near the study area after the original work plan had been made, thus posing new targets of opportunity. These included: a magnitude 3.8 shock that occurred

  4. Engineering Upgrades to the Radionuclide Aerosol Sampler/Analyzer for the CTBT International Monitoring System

    SciTech Connect

    Forrester, Joel B.; Carty, Fitz; Comes, Laura; Hayes, James C.; Miley, Harry S.; Morris, Scott J.; Ripplinger, Mike D.; Slaugh, Ryan W.; Van Davelaar, Peter

    2013-05-13

    The Radionuclide Aerosol Sampler/Analyzer (RASA) is an automated aerosol collection and analysis system designed by Pacific Northwest National Laboratory in the 1990s and deployed at several locations around the world as part of the International Monitoring System (IMS) required under the Comprehensive Nuclear-Test-Ban Treaty (CTBT). The utility of such an automated system lies in reducing human intervention and producing highly uniform results. However, maintainability and downtime issues threaten this utility, even for systems with over 90% data availability. Engineering upgrades to the RASA are currently being pursued to address these issues, as well as lessons learned from Fukushima. Current work includes a new automation control unit; other potential improvements, such as alternative detector cooling and sampling options, are under review. This paper presents the current state of the upgrades and improvements under investigation.

  5. Capability of the CTBT infrasound stations detecting the 2013 Russian fireball

    NASA Astrophysics Data System (ADS)

    Pilger, Christoph; Ceranna, Lars; Ross, J. Ole; Le Pichon, Alexis; Mialle, Pierrick; Garces, Milton

    2015-04-01

    The explosive fragmentation of the 2013 Chelyabinsk meteorite generated a large airburst with an equivalent yield of 500 kt TNT. It is the most energetic event recorded by the infrasound component of the CTBT International Monitoring System (IMS), detected globally by 20 out of 42 operational stations. This study performs a station-by-station estimation of the IMS detection capability to explain infrasound detections and non-detections from short to long distances, using the Chelyabinsk meteorite as a global reference event. The parameters investigated for their influence on detection capability are the directivity of the line-source signal, the ducting of acoustic energy and the individual noise conditions at each station. Findings include a clear detection preference for stations perpendicular to the meteorite trajectory, even over large distances. Only a weak influence of stratospheric ducting is observed for this low-frequency case. Furthermore, a strong dependence on the diurnal variability of background noise levels at each station is observed, favoring nocturnal detections.

  6. Verification and Validation of NASA-Supported Enhancements to the Near Real Time Harmful Algal Blooms Observing System (HABSOS)

    NASA Technical Reports Server (NTRS)

    Spruce, Joseph P.; Hall, Callie; McPherson, Terry; Spiering, Bruce; Brown, Richard; Estep, Lee; Lunde, Bruce; Guest, DeNeice; Navard, Andy; Pagnutti, Mary; Ryan, Robert E.

    2006-01-01

    This report discusses the verification and validation (V&V) assessment of Moderate Resolution Imaging Spectroradiometer (MODIS) ocean data products contributed by the Naval Research Laboratory (NRL) and Applied Coherent Technologies (ACT) Corporation to the National Oceanic and Atmospheric Administration's (NOAA) Near Real Time (NRT) Harmful Algal Blooms Observing System (HABSOS). HABSOS is a maturing decision support tool (DST) used by NOAA and its partners involved in coastal and public health management.

  7. Blind Source Separation of Seismic Events with Independent Component Analysis: CTBT related exercise

    NASA Astrophysics Data System (ADS)

    Rozhkov, Mikhail; Kitov, Ivan

    2015-04-01

    Blind Source Separation (BSS) methods used in signal recovery applications are attractive because they use minimal a priori information about the signals they are dealing with. Homomorphic deconvolution and cepstrum estimation are probably the only methods used to any extent in CTBT applications that can be attributed to this branch of technology. However, Expert Technical Analysis (ETA) conducted at the CTBTO to improve the estimated values of the standard signal and event parameters according to the Protocol to the CTBT may face problems that cannot be resolved with certified CTBTO applications and may demand specific techniques not presently used. The problem considered within the ETA framework is the unambiguous separation of signals with close arrival times. Here, we examine two scenarios of interest: (1) separation of two almost co-located explosions conducted within fractions of a second, and (2) extraction of explosion signals merged with wavetrains from a strong earthquake. The importance of resolving the problem in case 1 is connected with correct explosion yield estimation. Case 2 is a well-known scenario for conducting clandestine nuclear tests. While the first case can be approached to some degree with cepstral methods, the second can hardly be resolved with the conventional methods implemented at the International Data Centre, especially if the signals have close slowness and azimuth. Independent Component Analysis (in its FastICA implementation), which assumes non-Gaussianity of the underlying processes in the signal mixture, is the blind source separation method we apply to resolve the problems mentioned above. We have tested this technique with synthetic waveforms, seismic data from DPRK explosions and mining blasts conducted within the East European Platform, as well as with signals from strong teleseismic events (the April 2012 Sumatra, Mw=8.6, and March 2011 Tohoku, Mw=9.0, earthquakes). The data was recorded by seismic arrays of the
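
    The FastICA idea can be sketched in a few lines of numpy: whiten a multichannel mixture, then iterate the fixed-point update with symmetric decorrelation until the rotation that maximizes non-Gaussianity is found. The two synthetic "signals", the mixing matrix, and the iteration count below are illustrative assumptions; this is a toy sketch, not the IDC processing chain or real IMS data.

```python
import numpy as np

def fastica_2src(X, iters=200, seed=0):
    """Minimal symmetric FastICA (tanh nonlinearity) for a
    two-channel mixture X of shape (2, n_samples)."""
    # Center and whiten (ZCA).
    X = X - X.mean(axis=1, keepdims=True)
    cov = X @ X.T / X.shape[1]
    d, E = np.linalg.eigh(cov)
    Xw = (E / np.sqrt(d)) @ E.T @ X
    rng = np.random.default_rng(seed)
    W = rng.normal(size=(2, 2))
    for _ in range(iters):
        WX = W @ Xw
        g, g_prime = np.tanh(WX), 1.0 - np.tanh(WX) ** 2
        # Fixed-point update: w <- E[x g(w'x)] - E[g'(w'x)] w
        W = g @ Xw.T / Xw.shape[1] - np.diag(g_prime.mean(axis=1)) @ W
        # Symmetric decorrelation: W <- (W W')^(-1/2) W, via SVD.
        u, _, vt = np.linalg.svd(W)
        W = u @ vt
    return W @ Xw

# Demo: two non-Gaussian sources mixed into two "channels".
t = np.linspace(0, 1, 4000)
s1 = np.sign(np.sin(2 * np.pi * 7 * t))     # square wave (sub-Gaussian)
s2 = np.sin(2 * np.pi * 13 * t) ** 3        # peaky sinusoid
S = np.vstack([s1, s2])
A = np.array([[1.0, 0.6], [0.4, 1.0]])      # unknown mixing matrix
X = A @ S
S_hat = fastica_2src(X)                     # recovered up to sign/order
```

    As in all BSS, the sources come back in arbitrary order and sign, with unit variance; any physical scaling (e.g. for yield estimation) must be recovered separately.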

  8. CTBT infrasound network performance to detect the 2013 Russian fireball event

    NASA Astrophysics Data System (ADS)

    Pilger, Christoph; Ceranna, Lars; Ross, J. Ole; Le Pichon, Alexis; Mialle, Pierrick; Garcés, Milton A.

    2015-04-01

    The explosive fragmentation of the 2013 Chelyabinsk meteorite generated a large airburst with an equivalent yield of 500 kt TNT. It is the most energetic event recorded by the infrasound component of the Comprehensive Nuclear-Test-Ban Treaty International Monitoring System (CTBT-IMS), detected globally by 20 out of 42 operational stations. This study performs a station-by-station estimation of the IMS detection capability to explain infrasound detections and nondetections from short to long distances, using the Chelyabinsk meteorite as a global reference event. The parameters investigated for their influence on detection capability are the directivity of the line-source signal, the ducting of acoustic energy, and the individual noise conditions at each station. Findings include a clear detection preference for stations perpendicular to the meteorite trajectory, even over large distances. Only a weak influence of stratospheric ducting is observed for this low-frequency case. Furthermore, a strong dependence on the diurnal variability of background noise levels at each station is observed, favoring nocturnal detections.

  9. Geometric verification

    NASA Technical Reports Server (NTRS)

    Grebowsky, G. J.

    1982-01-01

    Present LANDSAT data formats are reviewed to clarify how the geodetic location and registration capabilities were defined for P-tape products and RBV data. Since only one geometric model is used in the master data processor, the geometric location accuracy of P-tape products depends on the absolute accuracy of the model, and registration accuracy is determined by the stability of the model. Due primarily to inaccuracies in data provided by the LANDSAT attitude management system, the desired accuracies are obtained only by using ground control points and a correlation process. Verifying system performance with regard to geodetic location requires the capability to determine pixel positions of map points in a P-tape array. Verifying registration performance requires the capability to determine pixel positions of common points (not necessarily map points) in two or more P-tape arrays for a given World Reference System scene. Techniques for registration verification can be more varied and automated, since map data are not required. The verification of LACIE extractions is used as an example.

  10. Verification Results of Jet Resonance-enhanced Multiphoton Ionization as a Real-time PCDD/F Emission Monitor

    EPA Science Inventory

    The Jet REMPI (Resonance Enhanced Multiphoton Ionization) monitor was tested on a hazardous waste firing boiler for its ability to determine concentrations of polychlorinated dibenzodioxins and dibenzofurans (PCDDs/Fs). Jet REMPI is a real time instrument capable of highly selec...

  11. Multibody modeling and verification

    NASA Technical Reports Server (NTRS)

    Wiens, Gloria J.

    1989-01-01

    A summary of a ten-week project on flexible multibody modeling, verification, and control is presented. Emphasis was on the need for experimental verification. A literature survey was conducted to gather information on existing experimental work related to flexible multibody systems. The first portion of the assigned task encompassed the modeling aspects of flexible multibodies that can undergo large angular displacements. Research on these modeling aspects was also surveyed, with special attention given to the component mode approach. Resulting from this is a research plan on various modeling aspects to be investigated over the next year. The relationship among large angular displacements, boundary conditions, mode selection, and system modes is of particular interest. The other portion of the assigned task was the generation of a test plan for experimental verification of analytical and/or computer analysis techniques used for flexible multibody systems. Based on current and expected frequency ranges of flexible multibody systems to be used in space applications, an initial test article was selected and designed. A preliminary TREETOPS computer analysis was run to ensure frequency content in the low-frequency range, 0.1 to 50 Hz. Initial specifications of experimental measurement and instrumentation components were also generated. Resulting from this effort is the initial multi-phase plan for a Ground Test Facility of Flexible Multibody Systems for Modeling Verification and Control. The plan focuses on the Multibody Modeling and Verification (MMV) Laboratory. General requirements of the Unobtrusive Sensor and Effector (USE) and Robot Enhancement (RE) laboratories were considered during the laboratory development.

  12. Uranium systems to enhance benchmarks for use in the verification of criticality safety computer models. Final report, February 16, 1990--December 31, 1994

    SciTech Connect

    Busch, R.D.

    1995-02-24

    Dr. Robert Busch of the Department of Chemical and Nuclear Engineering was the principal investigator on this project, with technical direction provided by the staff of the Nuclear Criticality Safety Group at Los Alamos. During the period of the contract, he had a number of graduate and undergraduate students working on subtasks. The objective of this work was to develop information on uranium systems to enhance benchmarks for use in the verification of criticality safety computer models. During the first year of the project, most of the work was focused on setting up the SUN SPARC-1 workstation and acquiring the literature describing the critical experiments. By August 1990, the workstation was operational with the current version of TWODANT loaded on the system. The MCNP version 4 tape was made available from Los Alamos late in 1990. Various documents were acquired that provide the initial descriptions of the critical experiments under consideration as benchmarks. The next four years were spent working on various benchmark projects. A number of publications and presentations were made on this material; these are briefly discussed in this report.

  13. NG09 And CTBT On-Site Inspection Noble Gas Sampling and Analysis Requirements

    NASA Astrophysics Data System (ADS)

    Carrigan, Charles R.; Tanaka, Junichi

    2010-05-01

    A provision of the Comprehensive Nuclear-Test-Ban Treaty (CTBT) allows on-site inspections (OSIs) of suspect nuclear sites to determine whether a detected event is nuclear in origin. For an underground nuclear explosion (UNE), the potential success of an OSI depends significantly on the containment scenario of the alleged event as well as on the application of air and soil-gas radionuclide sampling techniques in a manner that takes into account both the suspect site geology and the gas transport physics. UNE scenarios may be broadly divided into categories by level of containment. The simplest to detect is a UNE that vents a significant portion of its radionuclide inventory and is readily detectable at distance by the International Monitoring System (IMS). The most well-contained subsurface events will only be detectable during an OSI. In such cases, 37Ar and radioactive xenon cavity gases may reach the surface through either "micro-seepage" or the barometric pumping process, and only careful siting of sampling locations, timing of sampling, and application of the most site-appropriate atmospheric and soil-gas capturing methods will result in a confirmatory signal. The OSI noble gas field test NG09 was recently held in Stupava, Slovakia to consider, in addition to other field sampling and analysis techniques, drilling and subsurface noble gas extraction methods that might be applied during an OSI. One of the experiments focused on challenges to soil-gas sampling near the soil-atmosphere interface. During withdrawal of soil gas from shallow subsurface sample points, atmospheric dilution of the sample and the potential for introduction of unwanted atmospheric gases were considered. Tests were designed to evaluate surface infiltration and the ability of inflatable well packers to seal out atmospheric gases during sample acquisition. We discuss these tests along with some model-based predictions regarding infiltration under different near

  14. Optimal design of antireflection coating and experimental verification by plasma enhanced chemical vapor deposition in small displays

    SciTech Connect

    Yang, S. M.; Hsieh, Y. C.; Jeng, C. A.

    2009-03-15

    Conventional antireflection coating by thin films of quarter-wavelength thickness is limited by material selection and the films' refractive indices. An optimal design using non-quarter-wavelength thicknesses is presented in this study. A multilayer thin-film model is developed using admittance loci to show that a two-layer thin film of SiNx/SiOy at 124/87 nm and a three-layer film of SiNx/SiNy/SiOz at 58/84/83 nm can achieve average transmittances of 94.4% and 94.9%, respectively, on polymer, glass, and silicon substrates. The optimal design is validated by plasma-enhanced chemical vapor deposition of N2O/SiH4 and NH3/SiH4 to achieve the desired optical constants. Application of the antireflection coating to a 4 in. liquid crystal display demonstrates that the transmittance is over 94%, the mean luminance can be increased by 25%, and the total reflection angle increased from 41° to 58°.
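
    The admittance/characteristic-matrix calculation underlying such designs can be sketched for normal incidence: multiply each layer's characteristic matrix, form the stack admittance, and compute the reflectance. The refractive indices in the demo (n ~ 1.9 for SiNx, ~ 1.46 for SiOy, 1.52 for glass) and the neglect of dispersion, absorption, and the back surface are assumptions for illustration, not the values or model of the paper.

```python
import numpy as np

def transmittance(n_layers, d_layers, n_sub, wavelength, n0=1.0):
    """Normal-incidence transmittance of a lossless thin-film stack on a
    substrate, via the characteristic-matrix (admittance) method.
    n_layers/d_layers are ordered from the incident-medium (air) side."""
    M = np.eye(2, dtype=complex)
    for n, d in zip(n_layers, d_layers):
        delta = 2 * np.pi * n * d / wavelength   # layer phase thickness
        M = M @ np.array([[np.cos(delta), 1j * np.sin(delta) / n],
                          [1j * n * np.sin(delta), np.cos(delta)]])
    B, C = M @ np.array([1.0, n_sub])            # stack admittance terms
    return 1.0 - abs((n0 * B - C) / (n0 * B + C)) ** 2

# Two-layer stack with the thicknesses quoted in the abstract and
# *assumed* indices, on glass, at 550 nm (air-side layer first).
print(transmittance([1.46, 1.9], [87.0, 124.0], 1.52, 550.0))
```

    A quarter-wave layer of an intermediate-index material always beats the bare substrate at the design wavelength, which is a convenient sanity check on the matrix arithmetic.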

  15. Guide to good practices for independent verification

    SciTech Connect

    1998-12-01

    This Guide to Good Practices is written to enhance understanding of, and provide direction for, Independent Verification, Chapter X of Department of Energy (DOE) Order 5480.19, Conduct of Operations Requirements for DOE Facilities. The practices in this guide should be considered when planning or reviewing independent verification activities. Contractors are advised to adopt procedures that meet the intent of DOE Order 5480.19. Independent Verification is an element of an effective Conduct of Operations program. The complexity and array of activities performed in DOE facilities dictate the necessity for coordinated independent verification activities to promote safe and efficient operations.

  16. ETV - VERIFICATION TESTING (ENVIRONMENTAL TECHNOLOGY VERIFICATION PROGRAM)

    EPA Science Inventory

    Verification testing is a major component of the Environmental Technology Verification (ETV) program. The ETV Program was instituted to verify the performance of innovative technical solutions to problems that threaten human health or the environment and was created to substantia...

  17. Swarm Verification

    NASA Technical Reports Server (NTRS)

    Holzmann, Gerard J.; Joshi, Rajeev; Groce, Alex

    2008-01-01

    Reportedly, supercomputer designer Seymour Cray once said that he would sooner use two strong oxen to plow a field than a thousand chickens. Although this is undoubtedly wise when it comes to plowing a field, it is not so clear for other types of tasks. Model checking problems are of the proverbial "needle in a haystack" type. Such problems can often be parallelized easily. Alas, none of the usual divide-and-conquer methods can be used to parallelize the workings of a model checker. Given that it has become easier than ever to gain access to large numbers of computers to perform even routine tasks, it is becoming more and more attractive to find alternate ways to use these resources to speed up model checking tasks. This paper describes one such method, called swarm verification.
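
    The swarm idea can be sketched as many small, differently-randomized searches with a fixed budget instead of one exhaustive run: each worker diversifies its search order, so together the workers cover far more of the state space than any single bounded search. The toy state space, budgets, and "error state" below are invented for illustration; this is not the actual Spin swarm tooling.

```python
import random

def random_dfs(successors, start, is_error, max_steps, seed):
    """One swarm worker: depth-first search with randomized successor
    ordering and a fixed step budget."""
    rng = random.Random(seed)
    stack, seen = [start], {start}
    steps = 0
    while stack and steps < max_steps:
        state = stack.pop()
        steps += 1
        if is_error(state):
            return state, seen
        succ = successors(state)
        rng.shuffle(succ)                 # diversification between workers
        for nxt in succ:
            if nxt not in seen:
                seen.add(nxt)
                stack.append(nxt)
    return None, seen

# Toy model: two bounded counters; the "bug" sits in one rare corner.
def successors(state):
    x, y = state
    return [(x + 1, y), (x, y + 1)] if x < 40 and y < 40 else []

def is_error(state):
    return state == (30, 35)

# A swarm of small, differently-seeded searches replaces one big run.
for seed in range(8):
    hit, seen = random_dfs(successors, (0, 0), is_error, 2000, seed)
    if hit is not None:
        print(f"worker {seed} hit {hit} after visiting {len(seen)} states")
        break
```

    In a real swarm deployment the workers also vary hash functions and search strategies, not just seeds, and run on separate machines; the embarrassingly parallel structure is the point.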

  18. Generic Verification Protocol for Verification of Online Turbidimeters

    EPA Science Inventory

    This protocol provides generic procedures for implementing a verification test for the performance of online turbidimeters. The verification tests described in this document will be conducted under the Environmental Technology Verification (ETV) Program. Verification tests will...

  19. Verification of VLSI designs

    NASA Technical Reports Server (NTRS)

    Windley, P. J.

    1991-01-01

    In this paper we explore the specification and verification of VLSI designs. The paper focuses on abstract specification and verification of functionality using mathematical logic, as opposed to low-level boolean equivalence verification such as that done using BDDs and model checking. Specification and verification, sometimes called formal methods, are one tool for increasing computer dependability in the face of an exponentially increasing testing effort.

  20. Post-installation activities in the Comprehensive Nuclear Test Ban Treaty (CTBT) International Monitoring System (IMS) infrasound network

    NASA Astrophysics Data System (ADS)

    Vivas Veloso, J. A.; Christie, D. R.; Hoffmann, T. L.; Campus, P.; Bell, M.; Langlois, A.; Martysevich, P.; Demirovik, E.; Carvalho, J.; Kramer, A.; Wu, Sean F.

    2002-11-01

    The provisional operation and maintenance of IMS infrasound stations after installation and subsequent certification has the objective of preparing the infrasound network for entry into force of the Comprehensive Nuclear-Test-Ban Treaty (CTBT). The goal is to maintain and fine-tune the technical capabilities of the network, to repair faulty equipment, and to ensure that stations continue to meet the minimum specifications through evaluation of data quality and station recalibration. Due to the globally dispersed nature of the network, this program constitutes a significant undertaking that requires careful consideration of possible logistic approaches and their financial implications. Currently, 11 of the 60 IMS infrasound stations are transmitting data in the post-installation Testing & Evaluation mode. Another 5 stations are under provisional operation and are maintained in post-certification mode. It is expected that 20% of the infrasound network will be certified by the end of 2002. This presentation will focus on the different phases of post-installation activities of the IMS infrasound program and the logistical challenges to be tackled to ensure cost-efficient management of the network. Specific topics will include Testing & Evaluation and Certification of infrasound stations, as well as Configuration Management and Network Sustainment.

  1. Verification of Adaptive Systems

    SciTech Connect

    Pullum, Laura L; Cui, Xiaohui; Vassev, Emil; Hinchey, Mike; Rouff, Christopher; Buskens, Richard

    2012-01-01

    Adaptive systems are critical for future space and other unmanned and intelligent systems. Verification of these systems is also critical for their use in systems with potential harm to human life or with large financial investments. Due to their nondeterministic nature and extremely large state space, current methods for verification of software systems are not adequate to provide a high level of assurance for them. The combination of stabilization science, high performance computing simulations, compositional verification and traditional verification techniques, plus operational monitors, provides a complete approach to verification and deployment of adaptive systems that has not been used before. This paper gives an overview of this approach.

  2. The Case of the 12 May 2010 Event in North Korea: the Role of Temporary Seismic Deployments as National Technical Means for CTBT Verification

    NASA Astrophysics Data System (ADS)

    Koch, K.; Kim, W. Y.; Schaff, D. P.; Richards, P. G.

    2015-12-01

    Since 2012 there has been debate about a low-yield nuclear explosion in North Korea, initially claimed to have occurred in April/May 2010 on the basis of a number of Level 5 radionuclide detections from stations of the radionuclide subnetwork of the International Monitoring System (IMS) and additional reports from similar national facilities. Whereas the announced nuclear tests in North Korea in 2006, 2009 and 2013 were clearly detected seismically, there was initially a lack of detections from the seismological component of the IMS corresponding to a possible nuclear test in 2010. Work published recently by Zhang and Wen in Seismological Research Letters (Jan/Feb 2015), inferring seismological evidence for an explosion in North Korea at about 0009 hours (UTC) on 12 May 2010, has attracted further attention. Previous studies of seismicity at the North Korean test site for days prior to this date had not found any such evidence from IMS or non-IMS stations. The data used by Zhang and Wen were from stations in northeastern China about 80 to 200 km from the North Korean test site and are currently not available for open research. A search for openly available data was undertaken, resulting in relevant waveforms obtained both from the IRIS Consortium (from a PASSCAL experiment in northeastern China, as noted also by Ford and Walter, 2015) and from another temporary seismic deployment, also in China. The data from these stations showed signals consistent with the seismic disturbance found by Zhang and Wen. These supplementary stations thus constitute a monitoring resource providing objective data, in the present case for an event below magnitude 2 and thus much smaller than can be monitored by the usual assets. Efforts are currently underway to use the data from these stations to investigate the compatibility of the event with other explosion-type events, or with an earthquake.

  3. Simulation verification techniques study

    NASA Technical Reports Server (NTRS)

    Schoonmaker, P. B.; Wenglinski, T. H.

    1975-01-01

    Results are summarized of the simulation verification techniques study, which consisted of two tasks: to develop techniques for simulator hardware checkout and to develop techniques for simulation performance verification (validation). The hardware verification task involved definition of simulation hardware (hardware units and integrated simulator configurations), a survey of current hardware self-test techniques, and definition of hardware and software techniques for checkout of simulator subsystems. The performance verification task included definition of simulation performance parameters (and critical performance parameters), definition of methods for establishing standards of performance (sources of reference data or validation), and definition of methods for validating performance. Both major tasks included definition of verification software and assessment of the verification database impact. An annotated bibliography of all documents generated during this study is provided.

  4. Software verification and testing

    NASA Technical Reports Server (NTRS)

    1985-01-01

    General procedures for software verification and validation are provided as a guide for managers, programmers, and analysts involved in software development. The verification and validation procedures described are based primarily on testing techniques. Testing refers to the execution of all or part of a software system for the purpose of detecting errors. Planning, execution, and analysis of tests are outlined in this document. Code reading and static analysis techniques for software verification are also described.

  5. Use of open source information and commercial satellite imagery for nuclear nonproliferation regime compliance verification by a community of academics

    NASA Astrophysics Data System (ADS)

    Solodov, Alexander

    The proliferation of nuclear weapons is a great threat to world peace and stability, and the question of how to strengthen the nonproliferation regime has long been open. In 1997 the International Atomic Energy Agency (IAEA) Board of Governors (BOG) adopted the Additional Safeguards Protocol. The purpose of the protocol is to enhance the IAEA's ability to detect undeclared production of fissile materials in member states. However, the IAEA does not always have sufficient human and financial resources to accomplish this task. Developed here is a concept for making use of human and technical resources available in academia to support the IAEA's mission. The objective of this research was to study the feasibility of an academic community using commercially or publicly available sources of information and products for the purpose of detecting covert facilities and activities intended for the unlawful acquisition of fissile materials or production of nuclear weapons. In this study, the availability and use of commercial satellite imagery systems, commercial computer codes for satellite imagery analysis, the Comprehensive Nuclear-Test-Ban Treaty (CTBT) verification International Monitoring System (IMS), publicly available information sources such as watchdog groups and press reports, and customs services information were explored. A system for integrating these data sources to form conclusions was also developed. The results showed that publicly and commercially available sources of information and data analysis can be a powerful tool in tracking violations of the international nuclear nonproliferation regime, and a framework for implementing these tools in the academic community was developed. As a result of this study, the formation of an International Nonproliferation Monitoring Academic Community (INMAC) is proposed. This would be an independent organization consisting of academics (faculty, staff and students) from both nuclear weapon states (NWS) and

  6. Verification System: First System-Wide Performance Test

    NASA Astrophysics Data System (ADS)

    Chernobay, I.; Zerbo, L.

    2006-05-01

    System-wide performance tests are essential for the development, testing and evaluation of individual components of the verification system. In addition to evaluating global readiness, such a test helps establish the practical and financial requirements for eventual operations. The first system-wide performance test (SPT1) was conducted in three phases: a preparatory phase in May-June 2004, a performance testing phase in April-June 2005, and an evaluation phase in the second half of 2005. The preparatory phase was developmental in nature. The main objectives of the performance testing phase included establishment of a performance baseline under the current provisional mode of operation (CTBT/PC-19/1/Annex II, CTBT/WGB-21/1) and examination of established requirements and procedures for operation and maintenance. To establish a system-wide performance baseline, the system configuration was fixed for April-May 2005. The third month (June 2005) was used to implement 21 test case scenarios examining either particular operational procedures or the response of system components to failures simulated under controlled conditions. A total of 163 stations and 5 certified radionuclide laboratories of the International Monitoring System (IMS) participated in the performance testing phase - about 50% of the eventual IMS network. 156 IMS facilities and 40 National Data Centres (NDCs) were connected to the International Data Centre (IDC) via Global Communications Infrastructure (GCI) links. In addition, 12 legacy stations in the auxiliary seismic network sent data to the IDC over the Internet. During the performance testing phase, the IDC produced all required products and analysed more than 6100 seismic events and 1700 radionuclide spectra. Performance of all system elements was documented and analysed, and IDC products were compared with the results of data processing at the NDCs. On the basis of statistics and information collected during the SPT1 a system-wide performance

  7. Enhanced detectability of fluorinated derivatives of N,N-dialkylamino alcohols and precursors of nitrogen mustards by gas chromatography coupled to Fourier transform infrared spectroscopy analysis for verification of chemical weapons convention.

    PubMed

    Garg, Prabhat; Purohit, Ajay; Tak, Vijay K; Dubey, D K

    2009-11-01

    N,N-Dialkylamino alcohols are the precursors of VX-type nerve agents, and N-methyldiethanolamine, N-ethyldiethanolamine and triethanolamine are the precursors of three different nitrogen mustards. Their detection and identification are of paramount importance for verification analysis under the chemical weapons convention. GC-FTIR is used as a complementary technique to GC-MS analysis for identification of these analytes. One constraint of GC-FTIR, its low sensitivity, was overcome by converting the analytes to their fluorinated derivatives. Owing to their high absorptivity in the IR region, these derivatives facilitated detection by GC-FTIR analysis. Derivatizing reagents having trimethylsilyl, trifluoroacyl and heptafluorobutyryl groups on an imidazole moiety were screened, and the derivatives formed were analyzed quantitatively by GC-FTIR. Of the reagents studied, heptafluorobutyrylimidazole (HFBI) produced the greatest increase in sensitivity by GC-FTIR detection: 60- to 125-fold sensitivity enhancements were observed for the analytes after HFBI derivatization. The absorbances of the various functional groups responsible for the enhanced sensitivity were compared by determining their corresponding relative molar extinction coefficients ( [Formula: see text] ), assuming a uniform optical path length. The RSDs for intraday repeatability and interday reproducibility for the various derivatives were 0.2-1.1% and 0.3-1.8%, respectively. Limits of detection (LOD) down to 10-15 ng were achieved, and the applicability of the method was tested with unknown samples obtained in international proficiency tests. PMID:19796767
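
    As a rough illustration of why the fluorinated derivatives are easier to detect in the IR, the gain can be framed with the Beer-Lambert law A = εcl. The absorbance and concentration values below are hypothetical, chosen only to land inside the 60-125-fold range reported above:

```python
# Illustrative (hypothetical numbers): sensitivity gain from derivatization,
# framed with the Beer-Lambert law A = eps * c * l.
def molar_extinction(absorbance, conc_mol_per_l, path_cm=1.0):
    """Apparent molar extinction coefficient eps = A / (c * l)."""
    return absorbance / (conc_mol_per_l * path_cm)

# Same analyte concentration measured underivatized vs. as an HFB derivative.
c = 1e-4  # mol/L (assumed)
eps_free = molar_extinction(0.008, c)   # weak IR absorber (assumed absorbance)
eps_hfb  = molar_extinction(0.72, c)    # strong C-F / C=O bands (assumed)

enhancement = eps_hfb / eps_free
print(f"fold enhancement: {enhancement:.0f}")  # -> 90, within the reported 60-125x
```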

  8. Verification of sensitivity enhancement of SWIR imager technology in advanced multispectral SWIR/VIS zoom cameras with constant and variable F-number

    NASA Astrophysics Data System (ADS)

    Hübner, M.; Achtner, B.; Kraus, M.; Siemens, C.; Münzberg, M.

    2016-05-01

    Current designs of combined VIS-color/SWIR camera optics use a constant F-number over the full field of view (FOV) range. Especially in the SWIR, limited space for camera integration in existing system volumes and relatively large pixel pitches of 15 μm or even 20 μm force the use of relatively high F-numbers to accomplish narrow fields of view of less than 2.0° with reasonable resolution for long-range observation and targeting applications. Constant F-number designs have already been reported and considered [1] for submarine applications. The comparison of electro-optical performance was based on the detector noise performance and sensitivity data given by the detector manufacturer [1] and further modelling of the imaging chain within linear MTF system theory. The visible channel provides limited twilight capability at F/2.6, but in the SWIR the twilight capability is degraded due to the relatively high F-number of F/7 or F/5.25 for 20 μm and 15 μm pitch, respectively. Differences between prediction and experimental verification of sensitivity in terms of noise equivalent irradiance (NEI) and scenery-based limiting illumination levels are shown for the visible and the SWIR spectral range. Within this context, currently developed improvements using optical zoom designs for the multispectral SWIR/VIS camera optics with continuously variable F-number are discussed, offering increased low-light-level capabilities at wide and medium fields of view while still enabling an NFOV < 2° with superior long-range targeting capabilities under limited atmospheric sight conditions at daytime.
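
    The twilight penalty quoted above follows from basic photometry: focal-plane irradiance for an extended scene scales roughly as 1/F². A quick sketch, using the F-numbers from the text, of how much light the SWIR channel gives up relative to the F/2.6 visible channel:

```python
import math

def relative_irradiance(f_number, ref_f_number):
    """Focal-plane irradiance scales roughly as 1/F^2 for an extended scene."""
    return (ref_f_number / f_number) ** 2

# SWIR F-numbers for 20 um and 15 um pitch vs. the F/2.6 visible channel.
for f in (7.0, 5.25):
    ratio = relative_irradiance(f, ref_f_number=2.6)
    stops = math.log2(1.0 / ratio)
    print(f"F/{f}: {ratio:.2f}x the F/2.6 irradiance ({stops:.1f} stops slower)")
```

This is why a continuously variable F-number design can recover low-light capability at the wider fields of view.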

  9. Voltage verification unit

    DOEpatents

    Martin, Edward J.

    2008-01-15

    A voltage verification unit and method for determining the absence of potentially dangerous potentials within a power supply enclosure without Mode 2 work is disclosed. With this device and method, a qualified worker follows a relatively simple protocol that involves a function test (hot, cold, hot) of the voltage verification unit before Lock Out/Tag Out and, once the Lock Out/Tag Out is completed, testing or "trying" by simply reading a display on the voltage verification unit, all without exposure of the operator to the interior of the voltage supply enclosure. According to a preferred embodiment, the voltage verification unit includes test leads to allow diagnostics with other meters, without the necessity of accessing potentially dangerous bus bars or the like.

  10. Verification of RADTRAN

    SciTech Connect

    Kanipe, F.L.; Neuhauser, K.S.

    1995-12-31

    This document presents details of the verification process of the RADTRAN computer code which was established for the calculation of risk estimates for radioactive materials transportation by highway, rail, air, and waterborne modes.

  11. On-site inspection for verification of a Comprehensive Test Ban Treaty

    SciTech Connect

    Heckrotte, W.

    1986-10-01

    A seismic monitoring system and on-site inspections are the major components of a verification system for a Comprehensive Test Ban Treaty (CTBT), giving parties assurance that clandestine underground nuclear weapon tests are not taking place. The primary task lies with the seismic monitoring system, which must be capable of identifying most earthquakes in the magnitude range of concern as earthquakes, leaving a small number of unidentified events. If any unidentified event on the territory of one party appeared suspicious to another party, and thus potentially an explosion, an on-site inspection could be invoked to decide whether or not a nuclear explosion had taken place. Over the years, on-site inspections have been one of the most contentious issues in test ban negotiations and discussions. In the uncompleted test ban negotiations of 1977-80 between the US, UK, and USSR, voluntary OSIs were established as a basis for negotiation. Voluntary OSIs would require a common interest and cooperation between the parties toward resolving suspicions if OSIs were to serve the purpose of confidence building. On the technical level, an OSI could not assure identification of a clandestine test, but an evader would probably reject any request for an OSI at the site of an evasive test rather than run the risk of an OSI. The verification system does not provide direct physical evidence of a violation, which could pose a difficult and controversial decision on compliance. 16 refs.

  12. Secure optical verification using dual phase-only correlation

    NASA Astrophysics Data System (ADS)

    Liu, Wei; Zhang, Yan; Xie, Zhenwei; Liu, Zhengjun; Liu, Shutian

    2015-02-01

    We introduce a security-enhanced optical verification system using dual phase-only correlation based on a novel correlation algorithm. By employing a nonlinear encoding, the inherent locks of the verification system are obtained as real-valued random distributions, and the identity keys assigned to authorized users are designed as pure phases. The verification process is implemented as a two-step correlation, so only authorized identity keys can output the discriminating auto-correlation and cross-correlation signals that satisfy the preset threshold values. Compared with traditional phase-only-correlation-based verification systems, a higher security level against counterfeiting and collisions is obtained, as demonstrated by cryptanalysis using known attacks such as the known-plaintext attack and the chosen-plaintext attack. Optical experiments as well as the necessary numerical simulations are carried out to support the proposed verification method.

  13. Nuclear disarmament verification

    SciTech Connect

    DeVolpi, A.

    1993-12-31

    Arms control treaties, unilateral actions, and cooperative activities -- reflecting the defusing of East-West tensions -- are causing nuclear weapons to be disarmed and dismantled worldwide. In order to provide for future reductions and to build confidence in the permanency of this disarmament, verification procedures and technologies would play an important role. This paper outlines arms-control objectives, treaty organization, and actions that could be undertaken. For the purposes of this Workshop on Verification, nuclear disarmament has been divided into five topical subareas: Converting nuclear-weapons production complexes, Eliminating and monitoring nuclear-weapons delivery systems, Disabling and destroying nuclear warheads, Demilitarizing or non-military utilization of special nuclear materials, and Inhibiting nuclear arms in non-nuclear-weapons states. This paper concludes with an overview of potential methods for verification.

  14. Wind gust warning verification

    NASA Astrophysics Data System (ADS)

    Primo, Cristina

    2016-07-01

    Operational meteorological centres around the world increasingly include warnings among their regular forecast products. Warnings are issued to alert the public to extreme weather situations that might occur, leading to damage and losses. By forecasting these extreme events, meteorological centres help their potential users prevent the damage or losses they might otherwise suffer. However, verifying these warnings requires specific methods, not only because such events happen rarely, but also because a new temporal dimension is added when defining a warning, namely the time window of the forecasted event. This paper analyses the issues that can arise in warning verification. It also proposes some new verification approaches that can be applied to wind warnings. These new techniques are then applied to a real-life example, the verification of wind gust warnings at the German meteorological service ("Deutscher Wetterdienst"), and the results obtained are discussed.
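
    As background to the verification problem discussed above, the standard categorical scores for yes/no warnings can be sketched in a few lines; the warned/observed flags below are invented for illustration, and the time-window matching the paper focuses on is assumed to have already been done:

```python
# Minimal sketch of categorical warning verification: build the 2x2
# contingency counts from warned/observed event flags and derive the
# standard scores POD (probability of detection) and FAR (false-alarm ratio).
warned   = [True, True, False, True, False, False, True, False]
observed = [True, False, False, True, True, False, True, False]

hits    = sum(w and o for w, o in zip(warned, observed))
misses  = sum((not w) and o for w, o in zip(warned, observed))
false_a = sum(w and (not o) for w, o in zip(warned, observed))

pod = hits / (hits + misses)          # fraction of events that were warned
far = false_a / (hits + false_a)      # fraction of warnings that were false
print(f"POD = {pod:.2f}, FAR = {far:.2f}")  # -> POD = 0.75, FAR = 0.25
```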

  15. Explaining Verification Conditions

    NASA Technical Reports Server (NTRS)

    Denney, Ewen; Fischer, Bernd

    2006-01-01

    The Hoare approach to program verification relies on the construction and discharge of verification conditions (VCs) but offers no support to trace, analyze, and understand the VCs themselves. We describe a systematic extension of the Hoare rules by labels so that the calculus itself can be used to build up explanations of the VCs. The labels are maintained through the different processing steps and rendered as natural language explanations. The explanations can easily be customized and can capture different aspects of the VCs; here, we focus on their structure and purpose. The approach is fully declarative and the generated explanations are based only on an analysis of the labels rather than directly on the logical meaning of the underlying VCs or their proofs. Keywords: program verification, Hoare calculus, traceability.
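
    A minimal sketch of where a VC comes from may help: for an assignment, the weakest precondition is textual substitution into the postcondition, and the VC is the implication from the stated precondition. The toy string-based calculator below is purely illustrative and unrelated to the authors' implementation:

```python
# Minimal sketch of how a verification condition (VC) arises in the Hoare
# approach: for an assignment x := e, wp(x := e, Q) is Q with x replaced
# by e, and the VC is P ==> wp. Expressions are plain strings here.
import re

def wp_assign(var, expr, post):
    """Weakest precondition of `var := expr` w.r.t. postcondition `post`."""
    return re.sub(rf"\b{re.escape(var)}\b", f"({expr})", post)

def vc(pre, var, expr, post):
    """VC for {pre} var := expr {post}, rendered as an implication."""
    return f"{pre} ==> {wp_assign(var, expr, post)}"

# {x >= 0} x := x + 1 {x > 0} yields the VC: x >= 0 ==> (x + 1) > 0
print(vc("x >= 0", "x", "x + 1", "x > 0"))
```

The labelling scheme the paper describes would attach provenance to each such generated implication so it can later be rendered as a natural-language explanation.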

  16. Verification and validation benchmarks.

    SciTech Connect

    Oberkampf, William Louis; Trucano, Timothy Guy

    2007-02-01

    Verification and validation (V&V) are the primary means to assess the accuracy and reliability of computational simulations. V&V methods and procedures have fundamentally improved the credibility of simulations in several high-consequence fields, such as nuclear reactor safety, underground nuclear waste storage, and nuclear weapon safety. Although the terminology is not uniform across engineering disciplines, code verification deals with assessing the reliability of the software coding, and solution verification deals with assessing the numerical accuracy of the solution to a computational model. Validation addresses the physics modeling accuracy of a computational simulation by comparing the computational results with experimental data. Code verification benchmarks and validation benchmarks have been constructed for a number of years in every field of computational simulation. However, no comprehensive guidelines have been proposed for the construction and use of V&V benchmarks. For example, the field of nuclear reactor safety has not focused on code verification benchmarks, but it has placed great emphasis on developing validation benchmarks. Many of these validation benchmarks are closely related to the operations of actual reactors at near-safety-critical conditions, as opposed to being more fundamental-physics benchmarks. This paper presents recommendations for the effective design and use of code verification benchmarks based on manufactured solutions, classical analytical solutions, and highly accurate numerical solutions. In addition, this paper presents recommendations for the design and use of validation benchmarks, highlighting the careful design of building-block experiments, the estimation of experimental measurement uncertainty for both inputs and outputs to the code, validation metrics, and the role of model calibration in validation. It is argued that the understanding of predictive capability of a computational model is built on the level of
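
    A code verification benchmark of the kind recommended here can be sketched in a few lines: compute a quantity with a known analytical solution, refine the grid, and confirm that the observed order of convergence matches the theoretical order (2 for the trapezoidal rule). This is an illustrative example, not one of the paper's benchmarks:

```python
# Code-verification sketch against a known analytical solution: check that
# composite trapezoidal integration exhibits its theoretical second-order
# convergence, p ~ 2, under grid refinement.
import math

def trapezoid(f, a, b, n):
    h = (b - a) / n
    return h * (0.5 * f(a) + sum(f(a + i * h) for i in range(1, n)) + 0.5 * f(b))

exact = 1.0 - math.cos(1.0)              # integral of sin(x) on [0, 1]
errors = [abs(trapezoid(math.sin, 0.0, 1.0, n) - exact) for n in (16, 32, 64)]

# Observed order from successive refinements with ratio r = 2:
orders = [math.log(errors[i] / errors[i + 1], 2) for i in range(2)]
print(orders)  # both values close to the theoretical order 2
```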

  17. TFE verification program

    NASA Astrophysics Data System (ADS)

    1994-01-01

    This is the final semiannual progress report for the Thermionic Fuel Element (TFE) Verification Program. A decision was made in August 1993 to begin a Close Out Program on October 1, 1993. Final reports summarizing the design analyses and test activities of the TFE Verification Program will be written as stand-alone documents for each task. The objective of the semiannual progress report is to summarize the technical results obtained during the latest reporting period. The information presented herein includes evaluated test data, design evaluations, the results of analyses and the significance of results.

  18. General Environmental Verification Specification

    NASA Technical Reports Server (NTRS)

    Milne, J. Scott, Jr.; Kaufman, Daniel S.

    2003-01-01

    The NASA Goddard Space Flight Center's General Environmental Verification Specification (GEVS) for STS and ELV Payloads, Subsystems, and Components is currently being revised based on lessons learned from GSFC engineering and flight assurance. The GEVS has been used by Goddard flight projects for the past 17 years as a baseline from which to tailor their environmental test programs. A summary of the requirements and updates are presented along with the rationale behind the changes. The major test areas covered by the GEVS include mechanical, thermal, and EMC, as well as more general requirements for planning, tracking of the verification programs.

  19. Requirement Assurance: A Verification Process

    NASA Technical Reports Server (NTRS)

    Alexander, Michael G.

    2011-01-01

    Requirement Assurance is an act of requirement verification which assures the stakeholder or customer that a product requirement has produced its "as realized product" and has been verified with conclusive evidence. Product requirement verification answers the question, "did the product meet the stated specification, performance, or design documentation?" In order to ensure the system was built correctly, the practicing system engineer must verify each product requirement using the verification methods of inspection, analysis, demonstration, or test. The products of these methods are the "verification artifacts" or "closure artifacts", which are the objective evidence needed to prove the product requirements meet the verification success criteria. Institutional direction is given to the system engineer in NPR 7123.1A, NASA Systems Engineering Processes and Requirements, with regard to the requirement verification process. In response, the verification methodology offered in this report meets both the institutional process and requirement verification best practices.

  20. Context Effects in Sentence Verification.

    ERIC Educational Resources Information Center

    Kiger, John I.; Glass, Arnold L.

    1981-01-01

    Three experiments examined what happens to reaction time for verifying easy items when they are mixed with difficult items in a verification task. Subjects' verification of simple arithmetic equations and sentences took longer when those items were placed in a difficult list. Difficult sentences also slowed the verification of easy arithmetic equations. (Author/RD)

  1. Research required to support comprehensive nuclear test ban treaty monitoring. Final report

    SciTech Connect

    1997-08-01

    After years of negotiation, the Comprehensive Nuclear Test-Ban Treaty (CTBT) was signed at the United Nations in September 1996. The treaty creates a need for global monitoring in the context of national and international efforts to control nuclear arms. To meet this technical challenge, the United States is at a time of pivotal decision-making with regard to the level and nature of basic research in support of CTBT verification. To address this problem, this study identifies the basic research questions in the fields of seismology, hydroacoustics, infrasonics, and radionuclide monitoring that should be supported to enhance the capability to monitor and verify the CTBT.

  2. ENVIRONMENTAL TECHNOLOGY VERIFICATION PROGRAM

    EPA Science Inventory

    This presentation will be given at the EPA Science Forum 2005 in Washington, DC. The Environmental Technology Verification Program (ETV) was initiated in 1995 to speed implementation of new and innovative commercial-ready environmental technologies by providing objective, 3rd pa...

  3. Computer Graphics Verification

    NASA Technical Reports Server (NTRS)

    1992-01-01

    Video processing creates technical animation sequences using studio quality equipment to realistically represent fluid flow over space shuttle surfaces, helicopter rotors, and turbine blades. Computer systems Co-op, Tim Weatherford, performing computer graphics verification. Part of Co-op brochure.

  4. FPGA Verification Accelerator (FVAX)

    NASA Technical Reports Server (NTRS)

    Oh, Jane; Burke, Gary

    2008-01-01

    Is verification acceleration possible? Increasing the visibility of the internal nodes of the FPGA results in much faster debug time, and forcing internal signals directly allows a problem condition to be set up very quickly. Is this all? No: this is part of a comprehensive effort to improve the JPL FPGA design and V&V process.

  5. Telescope performance verification

    NASA Astrophysics Data System (ADS)

    Swart, Gerhard P.; Buckley, David A. H.

    2004-09-01

    While Systems Engineering appears to be widely applied on the very large telescopes, it is lacking in the development of many of the medium and small telescopes currently in progress. The latter projects rely heavily on the experience of the project team, verbal requirements and conjecture based on the successes and failures of other telescopes. Furthermore, it is considered an unaffordable luxury to "close-the-loop" by carefully analysing and documenting the requirements and then verifying the telescope's compliance with them. In this paper the authors contend that a Systems Engineering approach is a keystone in the development of any telescope and that verification of the telescope's performance is not only an important management tool but also forms the basis upon which successful telescope operation can be built. The development of the Southern African Large Telescope (SALT) has followed such an approach and is now in the verification phase of its development. Parts of the SALT verification process will be discussed in some detail to illustrate the suitability of this approach, including oversight by the telescope shareholders, recording of requirements and results, design verification and performance testing. Initial test results will be presented where appropriate.

  6. Exomars Mission Verification Approach

    NASA Astrophysics Data System (ADS)

    Cassi, Carlo; Gilardi, Franco; Bethge, Boris

    According to the long-term cooperation plan established by ESA and NASA in June 2009, the ExoMars project now consists of two missions. The first mission will be launched in 2016 under ESA lead, with the objectives of demonstrating the European capability to safely land a surface package on Mars, performing Mars atmosphere investigation, and providing communication capability for present and future ESA/NASA missions. For this mission ESA provides a spacecraft composite, made up of an "Entry Descent & Landing Demonstrator Module (EDM)" and a Mars Orbiter Module (OM); NASA provides the launch vehicle and the scientific instruments located on the orbiter for Mars atmosphere characterisation. The second mission, with its launch foreseen in 2018, is led by NASA, which provides the spacecraft, the launcher, the EDL system, and a rover. ESA contributes the ExoMars Rover Module (RM) to provide surface mobility. It includes a drill system allowing drilling down to 2 meters, collecting samples and investigating them for signs of past and present life with exobiological experiments, as well as investigating the Mars water/geochemical environment. In this scenario Thales Alenia Space Italia, as ESA prime industrial contractor, is in charge of the design, manufacturing, integration and verification of the ESA ExoMars modules, i.e. the spacecraft composite (OM + EDM) for the 2016 mission, the RM for the 2018 mission, and the Rover Operations Control Centre, which will be located at Altec-Turin (Italy). The verification process of the above products is quite complex and will include some peculiarities with limited or no heritage in Europe. Furthermore, the verification approach has to be optimised to allow full verification despite significant schedule and budget constraints. The paper presents the verification philosophy tailored for the ExoMars mission in line with the above considerations, starting from the model philosophy and showing the verification activities flow and the sharing of tests

  7. Supporting the President's Arms Control and Nonproliferation Agenda: Transparency and Verification for Nuclear Arms Reductions

    SciTech Connect

    Doyle, James E; Meek, Elizabeth

    2009-01-01

    The President's arms control and nonproliferation agenda is still evolving and the details of initiatives supporting it remain undefined. This means that DOE, NNSA, NA-20, NA-24 and the national laboratories can help define the agenda, and the policies and the initiatives to support it. This will require effective internal and interagency coordination. The arms control and nonproliferation agenda is broad and includes the path-breaking goal of creating conditions for the elimination of nuclear weapons. Responsibility for various elements of the agenda will be widely scattered across the interagency. Therefore an interagency mapping exercise should be performed to identify the key points of engagement within NNSA and other agencies for creating effective policy coordination mechanisms. These can include informal networks, working groups, coordinating committees, interagency task forces, etc. It will be important for NA-20 and NA-24 to get a seat at the table and a functional role in many of these coordinating bodies. The arms control and nonproliferation agenda comprises both mature and developing policy initiatives. The more mature elements such as CTBT ratification and a follow-on strategic nuclear arms treaty with Russia have defined milestones. However, recent press reports indicate that even the START follow-on strategic arms pact that is planned to be complete by the end of 2009 may take significantly longer and be more expansive in scope. The Russians called for proposals to count non-deployed as well as deployed warheads. Other elements of the agenda such as FMCT, future bilateral nuclear arms reductions following a START follow-on treaty, nuclear posture changes, preparations for an international nuclear security summit, strengthened international safeguards and multilateral verification are in much earlier stages of development. 
For this reason any survey of arms control capabilities within the USG should be structured to address potential needs across the

  8. RESRAD-BUILD verification.

    SciTech Connect

    Kamboj, S.; Yu, C.; Biwer, B. M.; Klett, T.

    2002-01-31

    The results generated by the RESRAD-BUILD code (version 3.0) were verified with hand or spreadsheet calculations using equations given in the RESRAD-BUILD manual for different pathways. For verification purposes, different radionuclides--H-3, C-14, Na-22, Al-26, Cl-36, Mn-54, Co-60, Au-195, Ra-226, Ra-228, Th-228, and U-238--were chosen to test all pathways and models. Tritium, Ra-226, and Th-228 were chosen because of the special tritium and radon models in the RESRAD-BUILD code. Other radionuclides were selected to represent a spectrum of radiation types and energies. Verification of the RESRAD-BUILD code was conducted with an initial check of all the input parameters for correctness against their original source documents. Verification of the calculations was performed external to the RESRAD-BUILD code with Microsoft® Excel to verify all the major portions of the code. In some cases, RESRAD-BUILD results were compared with those of external codes, such as MCNP (Monte Carlo N-particle) and RESRAD. The verification was conducted on a step-by-step basis and used different test cases as templates. The following types of calculations were investigated: (1) source injection rate, (2) air concentration in the room, (3) air particulate deposition, (4) radon pathway model, (5) tritium model for volume source, (6) external exposure model, (7) different pathway doses, and (8) time dependence of dose. Some minor errors were identified in version 3.0; these errors have been corrected in later versions of the code.
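
    The external, step-by-step verification pattern described above can be illustrated schematically; the decay check below is a generic hand-calculation comparison, not RESRAD-BUILD's actual models or data:

```python
# Illustrative sketch (not RESRAD-BUILD's actual models): the verification
# pattern described above, where code output is checked pathway by pathway
# against an independent hand/spreadsheet calculation within a tolerance.
import math

def code_activity(a0_bq, t_yr, half_life_yr):
    """'Code under test': radioactive decay A(t) = A0 * exp(-ln2 * t / T1/2)."""
    return a0_bq * math.exp(-math.log(2.0) * t_yr / half_life_yr)

# Independent spreadsheet-style check: Ra-226 (T1/2 ~ 1600 y), 10 y elapsed.
a0, t, t_half = 1.0e6, 10.0, 1600.0
hand_value = a0 * 0.5 ** (t / t_half)       # equivalent closed form

rel_diff = abs(code_activity(a0, t, t_half) - hand_value) / hand_value
print(f"relative difference: {rel_diff:.2e}")  # agreement to round-off
```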

  9. Robust verification analysis

    NASA Astrophysics Data System (ADS)

    Rider, William; Witkowski, Walt; Kamm, James R.; Wildey, Tim

    2016-02-01

    We introduce a new methodology for inferring the accuracy of computational simulations through the practice of solution verification. We demonstrate this methodology on examples from computational heat transfer, fluid dynamics and radiation transport. Our methodology is suited to both well- and ill-behaved sequences of simulations. Our approach to the analysis of these sequences of simulations incorporates expert judgment into the process directly via a flexible optimization framework and the application of robust statistics. The expert judgment is systematically applied as constraints to the analysis, and together with the robust statistics guards against over-emphasis on anomalous analysis results. We have named our methodology Robust Verification. Our methodology is based on utilizing multiple constrained optimization problems to solve the verification model in a manner that varies the analysis's underlying assumptions. Constraints applied in the analysis can include expert judgment regarding convergence rates (bounds and expectations) as well as bounding values for physical quantities (e.g., positivity of energy or density). This approach then produces a number of error models, which are then analyzed through robust statistical techniques (median instead of mean statistics). This provides self-contained, data- and expert-informed error estimation, including uncertainties for both the solution itself and the order of convergence. Our method produces high-quality results for the well-behaved cases, largely consistent with existing practice. The methodology can also produce reliable results for ill-behaved circumstances predicated on appropriate expert judgment. We demonstrate the method and compare the results with standard approaches used for both code and solution verification on well-behaved and ill-behaved simulations.
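
    The median-versus-mean point can be illustrated with a toy refinement study: one anomalous error level badly skews the mean estimate of the convergence order but barely moves the median. The numbers below are invented (true order 2, one corrupted level), and this sketch omits the constrained-optimization machinery of the actual methodology:

```python
# Sketch of the robust-statistics idea behind "Robust Verification": estimate
# the convergence order p from pairwise error ratios in a refinement sequence
# and summarize with the median, which resists one anomalous level.
import math
import statistics

h      = [0.1, 0.05, 0.025, 0.0125, 0.00625]
errors = [2.0e-2, 5.1e-3, 1.26e-3, 1.05e-3, 2.6e-4]   # 4th level is anomalous

orders = [math.log(errors[i] / errors[i + 1]) / math.log(h[i] / h[i + 1])
          for i in range(len(h) - 1)]

print(f"mean p   = {statistics.mean(orders):.2f}")    # dragged down by outlier
print(f"median p = {statistics.median(orders):.2f}")  # close to the true p = 2
```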

  10. Experimental verification of acoustic trace wavelength enhancement.

    PubMed

    Cray, Benjamin A

    2015-12-01

    Directivity is essentially a measure of a sonar array's beamwidth that can be obtained in a spherically isotropic ambient noise field; narrow array mainbeam widths are more directive than broader mainbeam widths. For common sonar systems, the directivity factor (or directivity index) is directly proportional to the ratio of an incident acoustic trace wavelength to the sonar array's physical length (which is always constrained). Increasing this ratio, by creating additional trace wavelengths for a fixed array length, will increase array directivity. Embedding periodic structures within an array generates Bragg scattering of the incident acoustic plane wave along the array's surface. The Bragg scattered propagating waves are shifted in a precise manner and create shorter wavelength replicas of the original acoustic trace wavelength. These replicated trace wavelengths (which contain identical signal arrival information) increase an array's wavelength to length ratio and thus directivity. Therefore, a smaller array, in theory, can have the equivalent directivity of a much larger array. Measurements completed in January 2015 at the Naval Undersea Warfare Center's Acoustic Test Facility, in Newport, RI, verified, near perfectly, these replicated, shorter, trace wavelengths. PMID:26723331
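
    The wavelength-to-length argument can be made concrete with the commonly quoted line-array approximation DI ≈ 10·log10(2L/λ), valid for arrays long compared with the wavelength; the array length and trace wavelength below are assumed values for illustration:

```python
# Sketch of the trace-wavelength argument: for a line array, the directivity
# index is roughly DI = 10*log10(2L / lambda), so halving the effective trace
# wavelength (as the Bragg-scattered replicas do) buys ~3 dB for the same L.
import math

def directivity_index_db(array_length_m, trace_wavelength_m):
    return 10.0 * math.log10(2.0 * array_length_m / trace_wavelength_m)

L, lam = 10.0, 0.5           # 10 m array, 0.5 m trace wavelength (assumed)
di_base = directivity_index_db(L, lam)
di_repl = directivity_index_db(L, lam / 2.0)  # replicated, halved wavelength
print(f"{di_base:.1f} dB -> {di_repl:.1f} dB")
```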

  11. TFE verification program

    NASA Astrophysics Data System (ADS)

    1990-03-01

    The information presented herein will include evaluated test data, design evaluations, the results of analyses and the significance of results. The program objective is to demonstrate the technology readiness of a Thermionic Fuel Element (TFE) suitable for use as the basic element in a thermionic reactor with electric power output in the 0.5 to 5.0 MW(e) range, and a full-power life of 7 years. The TFE Verification Program builds directly on the technology and data base developed in the 1960s and 1970s in an AEC/NASA program, and in the SP-100 program conducted in 1983, 1984 and 1985. In the SP-100 program, the attractive features of thermionic power conversion technology were recognized, but concern was expressed over the lack of fast reactor irradiation data. The TFE Verification Program addresses this concern. The general logic and strategy of the program to achieve its objectives is shown. Five prior programs form the basis for the TFE Verification Program: (1) the AEC/NASA program of the 1960s and early 1970s; (2) the SP-100 concept development program; (3) the SP-100 thermionic technology program; (4) the thermionic irradiations program in TRIGA in FY-88; and (5) the Thermionic Program in 1986 and 1987.

  12. TFE Verification Program

    SciTech Connect

    Not Available

    1990-03-01

    The objective of the semiannual progress report is to summarize the technical results obtained during the latest reporting period. The information presented herein will include evaluated test data, design evaluations, the results of analyses and the significance of results. The program objective is to demonstrate the technology readiness of a TFE suitable for use as the basic element in a thermionic reactor with electric power output in the 0.5 to 5.0 MW(e) range, and a full-power life of 7 years. The TFE Verification Program builds directly on the technology and data base developed in the 1960s and 1970s in an AEC/NASA program, and in the SP-100 program conducted in 1983, 1984 and 1985. In the SP-100 program, the attractive features of thermionic power conversion technology were recognized but concern was expressed over the lack of fast reactor irradiation data. The TFE Verification Program addresses this concern. The general logic and strategy of the program to achieve its objectives is shown on Fig. 1-1. Five prior programs form the basis for the TFE Verification Program: (1) the AEC/NASA program of the 1960s and early 1970s; (2) the SP-100 concept development program; (3) the SP-100 thermionic technology program; (4) the thermionic irradiations program in TRIGA in FY-86; and (5) the Thermionic Technology Program in 1986 and 1987. 18 refs., 64 figs., 43 tabs.

  13. Systematic study of source mask optimization and verification flows

    NASA Astrophysics Data System (ADS)

    Ben, Yu; Latypov, Azat; Chua, Gek Soon; Zou, Yi

    2012-06-01

    Source mask optimization (SMO) has emerged as a powerful resolution enhancement technique (RET) for advanced technology nodes. However, there is a plethora of flows and verification metrics in the field, confounding the end user of the technique. A systematic study of the different flows, and of their possible unification, has been missing. This contribution is intended to reveal the pros and cons of different SMO approaches and verification metrics, to identify their commonalities and differences, and to provide a generic guideline for RET selection via SMO. The paper discusses three types of variation that commonly arise in SMO: pattern preparation and selection, the availability of a relevant OPC recipe for a freeform source, and the metrics used in source verification. Several pattern selection algorithms are compared and the advantages of systematic pattern selection algorithms are discussed. In the absence of a full resist model for SMO, an alternative SMO flow without a full resist model is reviewed. A preferred verification flow with the quality metrics of DOF and MEEF is examined.
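
    The two verification metrics named above are simple to state numerically. The sketch below is illustrative only, with hypothetical through-focus CD data; it is not the paper's flow. MEEF is the wafer CD change per unit mask CD change (mask bias quoted in wafer scale), and DOF is taken here as the width of the focus window over which printed CD stays within tolerance.

```python
def meef(cd_wafer_perturbed, cd_wafer_nominal, delta_mask_cd_wafer_scale):
    # Mask Error Enhancement Factor: wafer CD change per unit mask CD change,
    # with the mask bias already divided by the reduction ratio (wafer scale).
    return (cd_wafer_perturbed - cd_wafer_nominal) / delta_mask_cd_wafer_scale

def depth_of_focus(focus_cd_pairs, cd_target, tolerance):
    # DOF: width of the focus window where printed CD stays within tolerance.
    in_spec = [f for f, cd in focus_cd_pairs if abs(cd - cd_target) <= tolerance]
    return (max(in_spec) - min(in_spec)) if in_spec else 0.0

# Hypothetical through-focus data: (focus in um, printed CD in nm).
data = [(-0.15, 52.0), (-0.10, 45.5), (-0.05, 45.2), (0.0, 45.0),
        (0.05, 45.3), (0.10, 46.0), (0.15, 51.0)]

m = meef(46.2, 45.0, 0.5)              # ~2.4: a 0.5 nm mask error prints as 1.2 nm
dof = depth_of_focus(data, 45.0, 1.5)  # ~0.2 um in-spec focus window
print(m, dof)
```

    In a verification flow, a source that lowers worst-case MEEF and widens DOF across the selected patterns is preferred.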

  14. Continuous verification using multimodal biometrics.

    PubMed

    Sim, Terence; Zhang, Sheng; Janakiraman, Rajkumar; Kumar, Sandeep

    2007-04-01

    Conventional verification systems, such as those controlling access to a secure room, do not usually require the user to reauthenticate himself for continued access to the protected resource. This may not be sufficient for high-security environments in which the protected resource needs to be continuously monitored for unauthorized use. In such cases, continuous verification is needed. In this paper, we present the theory, architecture, implementation, and performance of a multimodal biometrics verification system that continuously verifies the presence of a logged-in user. Two modalities are currently used--face and fingerprint--but our theory can be readily extended to include more modalities. We show that continuous verification imposes additional requirements on multimodal fusion when compared to conventional verification systems. We also argue that the usual performance metrics of false accept and false reject rates are insufficient yardsticks for continuous verification and propose new metrics against which we benchmark our system. PMID:17299225
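
    One way to picture the extra requirement continuous verification imposes is that stale biometric evidence must lose its authenticating power over time. The sketch below is a hypothetical decay-weighted fusion, not the authors' actual fusion theory: each time-stamped modality score decays toward an uninformative 0.5 as it ages, so trust in the logged-in user's presence fades without fresh observations.

```python
def holistic_trust(observations, now, half_life_s=30.0):
    # Fuse time-stamped modality scores (0..1) into a current trust level.
    # Each score decays toward 0.5 (no information) as it ages, so a stale
    # observation neither authenticates nor rejects the user.
    fused, count = 0.0, 0
    for t, score in observations:
        w = 0.5 ** ((now - t) / half_life_s)  # exponential decay with age
        fused += w * score + (1 - w) * 0.5    # decayed score drifts to 0.5
        count += 1
    return fused / count

obs = [(0.0, 0.9),   # face match observed at t = 0 s
       (10.0, 0.8)]  # fingerprint match observed at t = 10 s

t_fresh = holistic_trust(obs, now=10.0)   # recent evidence: high trust
t_stale = holistic_trust(obs, now=300.0)  # old evidence: trust decays to ~0.5
print(t_fresh, t_stale)
```

    A policy layer would then lock the session when trust falls below a threshold, prompting reauthentication; the half-life and threshold are tuning parameters, not values from the paper.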

  15. Quantum money with classical verification

    SciTech Connect

    Gavinsky, Dmitry

    2014-12-04

    We propose and construct a quantum money scheme that allows verification through classical communication with a bank. This is the first demonstration that a secure quantum money scheme exists that does not require quantum communication for coin verification. Our scheme is secure against adaptive adversaries - this property is not directly related to the possibility of classical verification, nevertheless none of the earlier quantum money constructions is known to possess it.

  16. Quantum money with classical verification

    NASA Astrophysics Data System (ADS)

    Gavinsky, Dmitry

    2014-12-01

    We propose and construct a quantum money scheme that allows verification through classical communication with a bank. This is the first demonstration that a secure quantum money scheme exists that does not require quantum communication for coin verification. Our scheme is secure against adaptive adversaries - this property is not directly related to the possibility of classical verification, nevertheless none of the earlier quantum money constructions is known to possess it.

  17. A systems perspective of Comprehensive Test Ban Treaty monitoring and verification

    SciTech Connect

    Walker, L.S.

    1996-11-01

    On September 24, 1996, after decades of discussion and more than two years of intensive international negotiations, President Clinton, followed by representatives of (to date) more than 125 other countries, including the other four declared nuclear weapons states, signed the Comprehensive Test Ban Treaty. Each signatory now faces a complex set of technical and political considerations regarding the advisability of joining the treaty. Those considerations vary from country to country, but for many countries one of the key issues is the extent to which the treaty can be verified. In the case of the US, it is anticipated that treaty verifiability will be an important issue in the US Senate Advice and Consent Hearings. This paper will address treaty verifiability, with an emphasis on the interplay between the various elements of the International monitoring regime, as prescribed in the CTBT Treaty Text and its associated Protocol. These elements, coupled with the National regimes, will serve as an integrated set of overlapping, interlocking measures to support treaty verification. Taken as a whole, they present a formidable challenge to potential testers who wish not to be caught.

  18. Verification of LHS distributions.

    SciTech Connect

    Swiler, Laura Painton

    2006-04-01

    This document provides verification test results for normal, lognormal, and uniform distributions that are used in Sandia's Latin Hypercube Sampling (LHS) software. The purpose of this testing is to verify that the sample values being generated in LHS are distributed according to the desired distribution types. The testing of distribution correctness is done by examining summary statistics, graphical comparisons using quantile-quantile plots, and formal statistical tests such as the Chi-square test, the Kolmogorov-Smirnov test, and the Anderson-Darling test. The overall results from the testing indicate that the generation of normal, lognormal, and uniform distributions in LHS is acceptable.
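
    The Kolmogorov-Smirnov check mentioned above compares the empirical CDF of the generated samples against the target CDF. The sketch below is a minimal one-sample KS statistic for the normal case only (plain random sampling rather than LHS, and with hypothetical sample sizes), just to show the shape of the test.

```python
import math, random

def normal_cdf(x, mu=0.0, sigma=1.0):
    return 0.5 * (1.0 + math.erf((x - mu) / (sigma * math.sqrt(2.0))))

def ks_statistic(samples, cdf):
    # One-sample Kolmogorov-Smirnov statistic: the maximum gap between the
    # empirical CDF of the samples and the hypothesized CDF.
    xs = sorted(samples)
    n = len(xs)
    d = 0.0
    for i, x in enumerate(xs):
        f = cdf(x)
        d = max(d, abs((i + 1) / n - f), abs(f - i / n))
    return d

random.seed(1)
samples = [random.gauss(0.0, 1.0) for _ in range(2000)]
d = ks_statistic(samples, normal_cdf)
# For genuine normal samples this typically falls well below the 5%
# critical value of roughly 1.36/sqrt(n) ~ 0.030 for n = 2000.
print(d)
```

    A distribution-verification suite would run this for each supported distribution and flag any statistic exceeding its critical value.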

  19. TFE Verification Program

    SciTech Connect

    Not Available

    1993-05-01

    The objective of the semiannual progress report is to summarize the technical results obtained during the latest reporting period. The information presented herein will include evaluated test data, design evaluations, the results of analyses and the significance of results. The program objective is to demonstrate the technology readiness of a TFE (thermionic fuel element) suitable for use as the basic element in a thermionic reactor with electric power output in the 0.5 to 5.0 MW(e) range, and a full-power life of 7 years. The TFE Verification Program builds directly on the technology and data base developed in the 1960s and early 1970s in an AEC/NASA program, and in the SP-100 program conducted in 1983, 1984 and 1985. In the SP-100 program, the attractive features of thermionic power conversion technology were recognized but concern was expressed over the lack of fast reactor irradiation data. The TFE Verification Program addresses this concern.

  20. Deductive Verification of Cryptographic Software

    NASA Technical Reports Server (NTRS)

    Almeida, Jose Barcelar; Barbosa, Manuel; Pinto, Jorge Sousa; Vieira, Barbara

    2009-01-01

    We report on the application of an off-the-shelf verification platform to the RC4 stream cipher cryptographic software implementation (as available in the openSSL library), and introduce a deductive verification technique based on self-composition for proving the absence of error propagation.

  1. HDL to verification logic translator

    NASA Astrophysics Data System (ADS)

    Gambles, J. W.; Windley, P. J.

    The increasingly higher number of transistors possible in VLSI circuits compounds the difficulty in insuring correct designs. As the number of possible test cases required to exhaustively simulate a circuit design explodes, a better method is required to confirm the absence of design faults. Formal verification methods provide a way to prove, using logic, that a circuit structure correctly implements its specification. Before verification is accepted by VLSI design engineers, the stand alone verification tools that are in use in the research community must be integrated with the CAD tools used by the designers. One problem facing the acceptance of formal verification into circuit design methodology is that the structural circuit descriptions used by the designers are not appropriate for verification work and those required for verification lack some of the features needed for design. We offer a solution to this dilemma: an automatic translation from the designers' HDL models into definitions for the higher-ordered logic (HOL) verification system. The translated definitions become the low level basis of circuit verification which in turn increases the designer's confidence in the correctness of higher level behavioral models.

  2. Software Verification and Validation Procedure

    SciTech Connect

    Olund, Thomas S.

    2008-09-15

    This Software Verification and Validation procedure provides the action steps for the Tank Waste Information Network System (TWINS) testing process. The primary objective of the testing process is to provide assurance that the software functions as intended, and meets the requirements specified by the client. Verification and validation establish the primary basis for TWINS software product acceptance.

  3. HDL to verification logic translator

    NASA Technical Reports Server (NTRS)

    Gambles, J. W.; Windley, P. J.

    1992-01-01

    The increasingly higher number of transistors possible in VLSI circuits compounds the difficulty in insuring correct designs. As the number of possible test cases required to exhaustively simulate a circuit design explodes, a better method is required to confirm the absence of design faults. Formal verification methods provide a way to prove, using logic, that a circuit structure correctly implements its specification. Before verification is accepted by VLSI design engineers, the stand alone verification tools that are in use in the research community must be integrated with the CAD tools used by the designers. One problem facing the acceptance of formal verification into circuit design methodology is that the structural circuit descriptions used by the designers are not appropriate for verification work and those required for verification lack some of the features needed for design. We offer a solution to this dilemma: an automatic translation from the designers' HDL models into definitions for the higher-ordered logic (HOL) verification system. The translated definitions become the low level basis of circuit verification which in turn increases the designer's confidence in the correctness of higher level behavioral models.
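
    To make the translation idea concrete: in the HOL hardware-verification style, a structural netlist becomes a relational predicate, with each gate instance contributing one conjunct over its port signals. The toy generator below is purely illustrative (the half-adder netlist and the `*_spec` predicate names are hypothetical), not the translator described in the abstract.

```python
def netlist_to_hol(name, inputs, outputs, gates):
    # Sketch of an HDL-to-HOL translation: each structural gate instance
    # becomes one conjunct of a relational predicate over its port signals.
    body = " /\\ ".join(f"{g}_spec {' '.join(ports)}" for g, ports in gates)
    sig = " ".join(inputs + outputs)
    return f"|- {name}_imp {sig} = ({body})"

# Hypothetical half-adder netlist: one XOR gate and one AND gate.
defn = netlist_to_hol("half_adder", ["a", "b"], ["s", "c"],
                      [("xor", ["a", "b", "s"]), ("and", ["a", "b", "c"])])
print(defn)
```

    Verification then amounts to proving that the generated implementation predicate implies the behavioral specification, e.g. `half_adder_imp a b s c ==> (s = a XOR b) /\ (c = a AND b)`.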

  4. Cancelable face verification using optical encryption and authentication.

    PubMed

    Taheri, Motahareh; Mozaffari, Saeed; Keshavarzi, Parviz

    2015-10-01

    In a cancelable biometric system, each instance of enrollment is distorted by a transform function, and the output should not be retransformable to the original data. This paper presents a new cancelable face verification system in the encrypted domain. Encrypted facial images are generated by a double random phase encoding (DRPE) algorithm using two keys (RPM1 and RPM2). To make the system noninvertible, a photon counting (PC) method is utilized, which requires a photon distribution mask (PDM) for information reduction. Verification of the resulting sparse images, which are not recognizable by direct visual inspection, is performed by an unconstrained minimum average correlation energy filter. In the proposed method, the encryption keys (RPM1, RPM2, and PDM) are used on the sender side, and the receiver needs only the encrypted images and correlation filters. In this manner, the system preserves privacy even if the correlation filters are obtained by an adversary. Performance of the PC-DRPE verification system is evaluated under illumination variation, pose changes, and facial expression. Experimental results show that utilizing encrypted images not only increases security but also enhances verification performance. This improvement can be attributed to the fact that, in the proposed system, the face verification problem is converted to a key verification task. PMID:26479930
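
    The photon-counting step is the information-reduction idea that makes the scheme noninvertible: only a handful of detection events survive, each pixel firing with probability proportional to its intensity. The sketch below is a toy 1-D version with a hypothetical 16-pixel "image"; it illustrates the thinning only, not the DRPE encryption or the actual PC-DRPE pipeline.

```python
import random

def photon_count(image, n_photons, seed=0):
    # Photon-counting information reduction: retain roughly n_photons
    # detection events, each pixel firing with probability proportional to
    # its normalized intensity. The result is a sparse binary image that is
    # not recognizable by direct visual inspection.
    rng = random.Random(seed)
    total = sum(image)
    return [1 if rng.random() < n_photons * p / total else 0 for p in image]

# Toy 'image' of 16 intensity values; keep roughly 4 photons.
img = [0, 1, 2, 8, 9, 1, 0, 3, 7, 2, 1, 0, 5, 6, 2, 1]
sparse = photon_count(img, 4)
print(sparse, sum(sparse))
```

    Because almost all intensity information is discarded, the transform cannot be inverted; yet a correlation filter trained on similarly thinned enrollment images can still verify that the surviving events came from the right key/face combination.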

  5. On-machine dimensional verification. Final report

    SciTech Connect

    Rendulic, W.

    1993-08-01

    General technology for automating in-process verification of machined products has been studied and implemented on a variety of machines and products at AlliedSignal Inc., Kansas City Division (KCD). Tests have been performed to establish system accuracy and probe reliability on two numerically controlled machining centers. Commercial software has been revised, and new cycles such as skew check and skew machining, have been developed to enhance and expand probing capabilities. Probe benefits have been demonstrated in the area of setup, cycle time, part quality, tooling cost, and product sampling.

  6. Verification of VENTSAR

    SciTech Connect

    Simpkins, A.A.

    1995-01-01

    The VENTSAR code is an upgraded and improved version of the VENTX code, which estimates concentrations on or near a building from a release at a nearby location. The code calculates the concentrations either for a given meteorological exceedance probability or for a given stability and wind speed combination. A single building can be modeled which lies in the path of the plume, or a penthouse can be added to the top of the building. Plume rise may also be considered. Release types can be either chemical or radioactive. Downwind concentrations are determined at user-specified incremental distances. This verification report was prepared to demonstrate that VENTSAR is properly executing all algorithms and transferring data. Hand calculations were also performed to ensure proper application of methodologies.
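
    The hand calculations mentioned above can be mimicked for the basic dispersion step. The sketch below is the textbook Gaussian plume form with a ground-reflection image term, using hypothetical release parameters; it is not VENTSAR's actual algorithm, which adds building-wake and plume-rise effects.

```python
import math

def ground_level_concentration(q, u, sigma_y, sigma_z, y, z, h):
    # Gaussian plume concentration (g/m^3) at receptor offset y (m) and
    # height z (m), for release rate q (g/s), wind speed u (m/s), dispersion
    # coefficients sigma_y/sigma_z (m), and effective release height h (m).
    # The second exponential is the usual ground-reflection image term.
    lateral = math.exp(-0.5 * (y / sigma_y) ** 2)
    vertical = (math.exp(-0.5 * ((z - h) / sigma_z) ** 2)
                + math.exp(-0.5 * ((z + h) / sigma_z) ** 2))
    return q / (2.0 * math.pi * u * sigma_y * sigma_z) * lateral * vertical

# Hypothetical release: 1 g/s at 20 m, 3 m/s wind, centerline ground receptor.
c = ground_level_concentration(q=1.0, u=3.0, sigma_y=30.0, sigma_z=15.0,
                               y=0.0, z=0.0, h=20.0)
print(c)  # concentration in g/m^3
```

    Verifying a code like VENTSAR amounts to reproducing numbers such as this one by hand for each algorithm branch and comparing against the code's output at user-specified downwind distances.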

  7. Online fingerprint verification.

    PubMed

    Upendra, K; Singh, S; Kumar, V; Verma, H K

    2007-01-01

    As organizations search for more secure authentication methods for user access, e-commerce, and other security applications, biometrics is gaining increasing attention. With an increasing emphasis on emerging automatic personal identification applications, fingerprint-based identification is becoming more popular. The most widely used fingerprint representation is the minutiae-based representation. The main drawback of this representation is that it does not utilize a significant component of the rich discriminatory information available in fingerprints: local ridge structures cannot be completely characterized by minutiae. It is also difficult to quickly match two fingerprint images containing different numbers of unregistered minutiae points. In this study a filter-bank-based representation, which eliminates these weaknesses, is implemented and the overall performance of the developed system is tested. The results show that this system can be used effectively for secure online verification applications. PMID:17365425
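
    The speed advantage of a filter-bank representation comes from reducing each fingerprint to a fixed-length feature vector (e.g. ridge energies from a bank of Gabor filters), so matching is a simple distance computation with no minutiae registration. The sketch below illustrates only that final comparison step, with hypothetical six-element feature vectors; the filter bank itself is not shown.

```python
import math

def match_score(template, query):
    # Fixed-length feature vectors are compared by Euclidean distance;
    # the distance is mapped to a 0..1 similarity score.
    d = math.sqrt(sum((a - b) ** 2 for a, b in zip(template, query)))
    return 1.0 / (1.0 + d)

enrolled = [0.80, 0.20, 0.50, 0.90, 0.10, 0.40]  # hypothetical enrolled vector
genuine  = [0.78, 0.22, 0.48, 0.91, 0.12, 0.41]  # same finger, new impression
impostor = [0.10, 0.90, 0.80, 0.20, 0.70, 0.90]  # different finger

print(match_score(enrolled, genuine), match_score(enrolled, impostor))
```

    The genuine score exceeds the impostor score by a wide margin, and because every comparison is a constant-time vector operation, the approach scales to online verification where minutiae alignment would be too slow.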

  8. 40 CFR 1065.390 - PM balance verifications and weighing process verification.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... 40 Protection of Environment 34 2013-07-01 2013-07-01 false PM balance verifications and weighing... § 1065.390 PM balance verifications and weighing process verification. (a) Scope and frequency. This section describes three verifications. (1) Independent verification of PM balance performance within...

  9. 40 CFR 1065.390 - PM balance verifications and weighing process verification.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 40 Protection of Environment 32 2010-07-01 2010-07-01 false PM balance verifications and weighing... § 1065.390 PM balance verifications and weighing process verification. (a) Scope and frequency. This section describes three verifications. (1) Independent verification of PM balance performance within...

  10. 40 CFR 1065.390 - PM balance verifications and weighing process verification.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... 40 Protection of Environment 34 2012-07-01 2012-07-01 false PM balance verifications and weighing... § 1065.390 PM balance verifications and weighing process verification. (a) Scope and frequency. This section describes three verifications. (1) Independent verification of PM balance performance within...

  11. 40 CFR 1065.390 - PM balance verifications and weighing process verification.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... 40 Protection of Environment 33 2011-07-01 2011-07-01 false PM balance verifications and weighing... § 1065.390 PM balance verifications and weighing process verification. (a) Scope and frequency. This section describes three verifications. (1) Independent verification of PM balance performance within...

  12. 40 CFR 1065.390 - PM balance verifications and weighing process verification.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... 40 Protection of Environment 33 2014-07-01 2014-07-01 false PM balance verifications and weighing... § 1065.390 PM balance verifications and weighing process verification. (a) Scope and frequency. This section describes three verifications. (1) Independent verification of PM balance performance within...

  13. Hardware proofs using EHDM and the RSRE verification methodology

    NASA Technical Reports Server (NTRS)

    Butler, Ricky W.; Sjogren, Jon A.

    1988-01-01

    Examined is a methodology for hardware verification developed by the Royal Signals and Radar Establishment (RSRE) in the context of SRI International's Enhanced Hierarchical Design Methodology (EHDM) specification/verification system. The methodology utilizes a four-level specification hierarchy with the following levels: functional level, finite automata model, block model, and circuit level. The properties of each level are proved as theorems in the level below it. This methodology is applied to a 6-bit counter problem and is critically examined. The specifications are written in EHDM's specification language, Extended Special, and the exercise yields suggestions for improving both the RSRE methodology and the EHDM system.

  14. TPS verification with UUT simulation

    NASA Astrophysics Data System (ADS)

    Wang, Guohua; Meng, Xiaofeng; Zhao, Ruixian

    2006-11-01

    Verification of a TPS (Test Program Set), or first-article acceptance testing, commonly depends on fault-insertion experiments on the UUT (Unit Under Test). However, the failure modes that can be injected on a UUT are limited, and the approach is almost infeasible when the UUT is still in development or is geographically distributed. To resolve this problem, a TPS verification method based on UUT interface signal simulation is put forward. Interoperability between the ATS (automatic test system) and the UUT simulation platform is essential to realizing automatic TPS verification. After analyzing the ATS software architecture, an approach to realizing interoperability between the ATS software and the UUT simulation platform is proposed, and the UUT simulation platform's software architecture is then derived from the ATS software architecture. The hardware composition and software architecture of the UUT simulation platform are described in detail. The platform has been applied to avionics equipment TPS development, debugging, and verification.

  15. Biometric verification with correlation filters.

    PubMed

    Vijaya Kumar, B V K; Savvides, Marios; Xie, Chunyan; Venkataramani, Krithika; Thornton, Jason; Mahalanobis, Abhijit

    2004-01-10

    Using biometrics for subject verification can significantly improve security over that of approaches based on passwords and personal identification numbers, both of which people tend to lose or forget. In biometric verification the system tries to match an input biometric (such as a fingerprint, face image, or iris image) to a stored biometric template. Thus correlation filter techniques are attractive candidates for the matching precision needed in biometric verification. In particular, advanced correlation filters, such as synthetic discriminant function filters, can offer very good matching performance in the presence of variability in these biometric images (e.g., facial expressions, illumination changes, etc.). We investigate the performance of advanced correlation filters for face, fingerprint, and iris biometric verification. PMID:14735958

  16. Biometric verification with correlation filters

    NASA Astrophysics Data System (ADS)

    Vijaya Kumar, B. V. K.; Savvides, Marios; Xie, Chunyan; Venkataramani, Krithika; Thornton, Jason; Mahalanobis, Abhijit

    2004-01-01

    Using biometrics for subject verification can significantly improve security over that of approaches based on passwords and personal identification numbers, both of which people tend to lose or forget. In biometric verification the system tries to match an input biometric (such as a fingerprint, face image, or iris image) to a stored biometric template. Thus correlation filter techniques are attractive candidates for the matching precision needed in biometric verification. In particular, advanced correlation filters, such as synthetic discriminant function filters, can offer very good matching performance in the presence of variability in these biometric images (e.g., facial expressions, illumination changes, etc.). We investigate the performance of advanced correlation filters for face, fingerprint, and iris biometric verification.
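
    A common decision statistic when matching with correlation filters is the peak-to-sidelobe ratio (PSR): a genuine match produces a sharp correlation peak standing far above the sidelobe statistics. The sketch below computes a crude PSR on a tiny hypothetical 3x3 correlation plane; a real system would correlate full images and exclude a small window around the peak rather than a single sample.

```python
def peak_to_sidelobe_ratio(corr_plane):
    # PSR = (peak - sidelobe mean) / sidelobe standard deviation.
    flat = [v for row in corr_plane for v in row]
    peak = max(flat)
    # Crude sidelobe region: everything except the peak sample itself.
    side = [v for v in flat if v != peak]
    mean = sum(side) / len(side)
    var = sum((v - mean) ** 2 for v in side) / len(side)
    return (peak - mean) / (var ** 0.5)

# Hypothetical correlation output for a genuine match: one dominant peak.
plane = [[0.1, 0.2, 0.1],
         [0.2, 5.0, 0.2],
         [0.1, 0.2, 0.1]]
psr = peak_to_sidelobe_ratio(plane)
print(psr)  # large PSR suggests a genuine match
```

    Verification then reduces to thresholding the PSR: advanced filters such as synthetic discriminant function filters are designed so genuine images keep the peak sharp despite expression or illumination changes, while impostors yield flat correlation planes and low PSR.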

  17. Fusion strategies for boosting cancelable online signature verification

    NASA Astrophysics Data System (ADS)

    Muramatsu, Daigo; Inuma, Manabu; Shikata, Junji; Otsuka, Akira

    2010-04-01

    Cancelable approaches for biometric person authentication have been studied to protect enrolled biometric data, and several algorithms have been proposed. One drawback of cancelable approaches is that their performance is inferior to that of non-cancelable approaches. As one solution, we proposed a scheme that enhances the performance of a cancelable approach for online signature verification by combining scores calculated from two transformed datasets generated using two keys. Generally, in cancelable approaches the same verification algorithm is used for transformed data as for raw (non-transformed) data, and in our previous work a verification system developed for a non-transformed dataset was used to calculate the scores from transformed data. In this paper, we modify the verification system by using transformed data for training. Several experiments were performed using public databases, and the results show that the modification improved performance. Our cancelable system combines two scores to make a decision; several fusion strategies are also considered, and the experimental results are reported here.
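
    The abstract does not state which combiner the authors settled on, but the standard fixed fusion rules for combining two match scores are easy to sketch. The scores and weight below are hypothetical; the point is only the shape of the decision step.

```python
def fuse(score1, score2, strategy="sum", w=0.5):
    # Combine the scores from the two key-transformed datasets using one of
    # the common fixed fusion rules: sum (average), max, min, weighted sum.
    if strategy == "sum":
        return (score1 + score2) / 2.0
    if strategy == "max":
        return max(score1, score2)
    if strategy == "min":
        return min(score1, score2)
    if strategy == "weighted":
        return w * score1 + (1 - w) * score2
    raise ValueError(strategy)

s1, s2 = 0.72, 0.60  # hypothetical scores from the key-1 and key-2 transforms
for rule in ("sum", "max", "min", "weighted"):
    print(rule, fuse(s1, s2, rule, w=0.7))

# Final decision: accept if the fused score clears a threshold tuned on
# training data (the 0.65 here is illustrative).
print(fuse(s1, s2, "sum") >= 0.65)
```

    Sum-rule fusion tends to be robust when the two scores are comparably reliable, while a weighted rule helps when one transformed dataset is known to verify better than the other.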

  18. Verification & Validation (V&V) Guidelines and Quantitative Reliability at Confidence (QRC): Basis for an Investment Strategy

    SciTech Connect

    Logan, R W; Nitta, C K

    2002-07-17

    This paper represents an attempt to summarize our thoughts regarding various methods and potential guidelines for Verification and Validation (V&V) and Uncertainty Quantification (UQ) that we have observed within the broader V&V community or generated ourselves. Our goals are to evaluate these various methods, to apply them to computational simulation analyses, and integrate them into methods for Quantitative Certification techniques for the nuclear stockpile. We describe the critical nature of high quality analyses with quantified V&V, and the essential role of V&V and UQ at specified Confidence levels in evaluating system certification status. Only after V&V has contributed to UQ at confidence can rational tradeoffs of various scenarios be made. UQ of performance and safety margins for various scenarios and issues are applied in assessments of Quantified Reliability at Confidence (QRC) and we summarize with a brief description of how these V&V generated QRC quantities fold into a Value-Engineering methodology for evaluating investment strategies. V&V contributes directly to the decision process for investment, through quantification of uncertainties at confidence for margin and reliability assessments. These contributions play an even greater role in a Comprehensive Test Ban Treaty (CTBT) environment than ever before, when reliance on simulation in the absence of the ability to perform nuclear testing is critical.
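
    The margin-over-uncertainty arithmetic underlying reliability-at-confidence assessments can be sketched in a few lines. This is a minimal illustration assuming a normally distributed performance margin with hypothetical numbers, not the paper's QRC methodology.

```python
import math

def reliability(margin_mean, margin_std):
    # Probability that the performance margin stays positive, assuming the
    # margin is normally distributed: R = Phi(M / U).
    return 0.5 * (1.0 + math.erf(margin_mean / (margin_std * math.sqrt(2.0))))

def margin_to_uncertainty_ratio(margin_mean, margin_std):
    # The M/U ratio: how many 'sigmas' of quantified uncertainty separate the
    # expected performance from the failure boundary.
    return margin_mean / margin_std

# Hypothetical system: mean margin of 3 units, quantified uncertainty of 1 unit.
mu_ratio = margin_to_uncertainty_ratio(3.0, 1.0)
r = reliability(3.0, 1.0)
print(mu_ratio, r)  # M/U = 3 implies reliability near 0.9987
```

    In an investment-strategy setting, V&V and UQ efforts shrink the uncertainty U; the same mean margin then yields a higher M/U ratio and a tighter, higher-confidence reliability claim, which is what the tradeoff analysis weighs against the cost of the V&V work.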

  19. Verification and validation in railway signalling engineering - an application of enterprise systems techniques

    NASA Astrophysics Data System (ADS)

    Chen, Xiangxian; Wang, Dong; Huang, Hai; Wang, Zheng

    2014-07-01

    Verification and validation of a railway signalling system is a crucial part of the workflow in railway signalling enterprises. Typically, the verification and validation of this type of safety-critical system is performed by means of on-site tests, which leads to low efficiency and high costs. A novel method for the verification and validation of a railway signalling system is proposed as an application of enterprise information system (EIS) techniques. In this application, the EIS and the simulation test platform are combined, which enhances the coherence and consistency of the information exchange between system development and system verification and improves work efficiency. The simulation and auto-test technology used in system verification also lowers the human and financial costs.

  20. Appendix: Conjectures concerning proof, design, and verification.

    SciTech Connect

    Wos, L.

    2000-05-31

    This article focuses on an esoteric but practical use of automated reasoning that may indeed be new to many, especially those concerned primarily with verification of both hardware and software. Specifically, featured are a discussion and some methodology for taking an existing design -- of a circuit, a chip, a program, or the like -- and refining and improving it in various ways. Although the methodology is general and does not require the use of a specific program, McCune's program OTTER does offer what is needed. OTTER has played and continues to play the key role in this research, and an interested person can gain access to this program in various ways, not the least of which is through the included CD-ROM in [3]. When success occurs, the result is a new design that may require fewer components, avoid the use of certain costly components, offer more reliability and ease of verification, and, perhaps most important, be more efficient in the contexts of speed and heat generation. Although the author has minimal experience in circuit design, circuit validation, program synthesis, program verification, and similar concerns, (at the encouragement of colleagues based on successes to be cited) he presents materials that might indeed be of substantial interest to manufacturers and programmers. He writes this article in part prompted by the recent activities of chip designers, including Intel and AMD, that heavily emphasize the proving of theorems. In research that appears relevant, the author has made an intense and most profitable study of finding proofs that are shorter [2,3], some that avoid the use of various types of terms, and some that are far less complex than previously known. Those results suggest a strong possible connection between more appealing proofs (in mathematics and in logic) and enhanced and improved design of both hardware and software.
Here the author explores diverse conjectures that elucidate some of the

  1. Hard and Soft Safety Verifications

    NASA Technical Reports Server (NTRS)

    Wetherholt, Jon; Anderson, Brenda

    2012-01-01

    The purpose of this paper is to examine the differences between, and the effects of, hard and soft safety verifications. Initially, the terminology should be defined and clarified. A hard safety verification is a datum that demonstrates how a safety control is enacted. An example of this is relief valve testing. A soft safety verification is something usually described as nice to have but not necessary to prove safe operation. An example of a soft verification is the loss of the Solid Rocket Booster (SRB) casings from Shuttle flight STS-4. When the main parachutes failed, the casings impacted the water and sank. In the nose cap of the SRBs, video cameras recorded the release of the parachutes to determine safe operation and to provide information for potential anomaly resolution. Generally, examination of the casings and nozzles contributed to understanding of the newly developed boosters and their operation. Safety verification of SRB operation was demonstrated by examination of the casings and nozzles for erosion or wear. Loss of the SRBs and the associated data did not delay the launch of the next Shuttle flight.

  2. Structural verification for GAS experiments

    NASA Technical Reports Server (NTRS)

    Peden, Mark Daniel

    1992-01-01

    The purpose of this paper is to assist the Get Away Special (GAS) experimenter in conducting a thorough structural verification of its experiment structural configuration, thus expediting the structural review/approval process and the safety process in general. Material selection for structural subsystems will be covered with an emphasis on fasteners (GSFC fastener integrity requirements) and primary support structures (Stress Corrosion Cracking requirements and National Space Transportation System (NSTS) requirements). Different approaches to structural verifications (tests and analyses) will be outlined especially those stemming from lessons learned on load and fundamental frequency verification. In addition, fracture control will be covered for those payloads that utilize a door assembly or modify the containment provided by the standard GAS Experiment Mounting Plate (EMP). Structural hazard assessment and the preparation of structural hazard reports will be reviewed to form a summation of structural safety issues for inclusion in the safety data package.

  3. Space Telescope performance and verification

    NASA Technical Reports Server (NTRS)

    Wright, W. F.

    1980-01-01

    The verification philosophy for the Space Telescope (ST) has evolved from years of experience with multispacecraft programs modified by the new factors introduced by the Space Transportation System. At the systems level of test, the ST will undergo joint qualification/acceptance tests with environment simulation using Lockheed's large spacecraft test facilities. These tests continue the process of detecting workmanship defects and module interface incompatibilities. The test program culminates in an 'all up' ST environmental test verification program resulting in a 'ready to launch' ST.

  4. Formal verification of mathematical software

    NASA Technical Reports Server (NTRS)

    Sutherland, D.

    1984-01-01

    Methods are investigated for formally specifying and verifying the correctness of mathematical software (software which uses floating point numbers and arithmetic). Previous work in the field was reviewed. A new model of floating point arithmetic called the asymptotic paradigm was developed and formalized. Two different conceptual approaches to program verification, the classical Verification Condition approach and the more recently developed Programming Logic approach, were adapted to use the asymptotic paradigm. These approaches were then used to verify several programs; the programs chosen were simplified versions of actual mathematical software.
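    The need for a specialized model such as the asymptotic paradigm can be made concrete with a minimal illustration of our own (not taken from the paper): exact-real reasoning, e.g. the associativity of addition, fails under IEEE-754 floating point, so correctness of mathematical software cannot be specified as exact equality with the real-valued result.

```python
# Exact arithmetic says (a + b) + c == a + (b + c); doubles disagree, because
# each intermediate sum is rounded to the nearest representable value.
a, b, c = 1e16, -1e16, 1.0
left = (a + b) + c    # a + b cancels exactly to 0.0, then adding c gives 1.0
right = a + (b + c)   # b + c rounds back to -1e16, so the final sum is 0.0
print(left, right)
```

Any verification condition generated for such code must therefore be stated up to rounding error rather than as an exact identity.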

  5. A verification system of RMAP protocol controller

    NASA Astrophysics Data System (ADS)

    Khanov, V. Kh; Shakhmatov, A. V.; Chekmarev, S. A.

    2015-01-01

    The functional verification problem for the IP block of an RMAP protocol controller is considered. The application of a verification method using fully-functional models of the processor and the internal bus of a system-on-chip is justified. Principles for constructing a verification system based on this approach are proposed. Practical results of creating a verification system for the RMAP protocol controller IP block are presented.

  6. 78 FR 58492 - Generator Verification Reliability Standards

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-09-24

    ... Energy Regulatory Commission 18 CFR Part 40 Generator Verification Reliability Standards AGENCY: Federal... Organization: MOD-025-2 (Verification and Data Reporting of Generator Real and Reactive Power Capability and Synchronous Condenser Reactive Power Capability), MOD- 026-1 (Verification of Models and Data for...

  7. Working Memory Mechanism in Proportional Quantifier Verification

    ERIC Educational Resources Information Center

    Zajenkowski, Marcin; Szymanik, Jakub; Garraffa, Maria

    2014-01-01

    The paper explores the cognitive mechanisms involved in the verification of sentences with proportional quantifiers (e.g. "More than half of the dots are blue"). The first study shows that the verification of proportional sentences is more demanding than the verification of sentences such as: "There are seven blue and eight yellow…
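    The contrast the study draws can be made concrete with a toy verifier (illustrative only; the function names and data are invented): verifying a proportional quantifier such as "more than half" requires comparing two cardinalities, rather than counting up to a fixed number as with "there are seven blue dots".

```python
def more_than_half(objects, predicate):
    """Verify a proportional quantifier: strictly more than half satisfy predicate."""
    hits = sum(1 for o in objects if predicate(o))
    return 2 * hits > len(objects)  # compares two counts, not a count to a constant

dots = ["blue"] * 9 + ["yellow"] * 7
print(more_than_half(dots, lambda d: d == "blue"))  # True: 9 of 16 are blue
```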

  8. Verification Challenges at Low Numbers

    SciTech Connect

    Benz, Jacob M.; Booker, Paul M.; McDonald, Benjamin S.

    2013-07-16

    This paper will explore the difficulties of deep reductions by examining the technical verification challenges. At each step on the road to low numbers, the verification required to ensure compliance of all parties will increase significantly. Looking post New START, the next step will likely include warhead limits in the neighborhood of 1000 (Pifer 2010). Further reductions will include stepping stones at 100’s of warheads, and then 10’s of warheads before final elimination could be considered of the last few remaining warheads and weapons. This paper will focus on these three threshold reduction levels, 1000, 100’s, 10’s. For each, the issues and challenges will be discussed, potential solutions will be identified, and the verification technologies and chain of custody measures that address these solutions will be surveyed. It is important to note that many of the issues that need to be addressed have no current solution. In these cases, the paper will explore new or novel technologies that could be applied. These technologies will draw from the research and development that is ongoing throughout the national lab complex, and will look at technologies utilized in other areas of industry for their application to arms control verification.

  9. Visual Attention During Sentence Verification.

    ERIC Educational Resources Information Center

    Lucas, Peter A.

    Eye movement data were collected for 28 college students reading 32 sentences with sentence verification questions. The factors observed were target sentence voice (active/passive), probe voice, and correct response (true/false). Pairs of subjects received the same set of stimuli, but with agents and objects in the sentences reversed. As expected,…

  10. Improved method for coliform verification.

    PubMed

    Diehl, J D

    1991-02-01

    Modification of a method for coliform verification presented in Standard Methods for the Examination of Water and Wastewater is described. Modification of the method, which is based on beta-galactosidase production, involves incorporation of a lactose operon inducer in medium upon which presumptive coliform isolates are cultured prior to beta-galactosidase assay. PMID:1901712

  11. Improved method for coliform verification.

    PubMed Central

    Diehl, J D

    1991-01-01

    Modification of a method for coliform verification presented in Standard Methods for the Examination of Water and Wastewater is described. Modification of the method, which is based on beta-galactosidase production, involves incorporation of a lactose operon inducer in medium upon which presumptive coliform isolates are cultured prior to beta-galactosidase assay. PMID:1901712

  12. A scheme for symmetrization verification

    NASA Astrophysics Data System (ADS)

    Sancho, Pedro

    2011-08-01

    We propose a scheme for symmetrization verification in two-particle systems, based on one-particle detection and state determination. In contrast to previous proposals, it does not follow a Hong-Ou-Mandel-type approach. Moreover, the technique can be used to generate superposition states of single particles.

  13. VERIFICATION OF WATER QUALITY MODELS

    EPA Science Inventory

    The basic concepts of water quality models are reviewed and the need to recognize calibration and verification of models with observed data is stressed. Post auditing of models after environmental control procedures are implemented is necessary to determine true model prediction ...

  14. Kate's Model Verification Tools

    NASA Technical Reports Server (NTRS)

    Morgan, Steve

    1991-01-01

    Kennedy Space Center's Knowledge-based Autonomous Test Engineer (KATE) is capable of monitoring electromechanical systems, diagnosing their errors, and even repairing them when they crash. A survey of KATE's developer/modelers revealed that they were already using a sophisticated set of productivity enhancing tools. They did request five more, however, and those make up the body of the information presented here: (1) a transfer function code fitter; (2) a FORTRAN-Lisp translator; (3) three existing structural consistency checkers to aid in syntax checking their modeled device frames; (4) an automated procedure for calibrating knowledge base admittances to protect KATE's hardware mockups from inadvertent hand valve twiddling; and (5) three alternatives for the 'pseudo object', a programming patch that currently apprises KATE's modeling devices of their operational environments.

  15. COS Internal NUV Wavelength Verification

    NASA Astrophysics Data System (ADS)

    Keyes, Charles

    2009-07-01

    This program will be executed after the uplink of the OSM2 position updates derived from the determination of the wavelength-scale zero points and desired spectral ranges for each grating in activity COS14 {program 11474 - COS NUV Internal/External Wavelength Scales}. This program will verify that the operational spectral ranges for each grating, central wavelength, and FP-POS are those desired. Subsequent to a successful verification, COS NUV ERO observations and NUV science can be enabled. An internal wavelength calibration spectrum using the default PtNe lamp {lamp 1} with each NUV grating at each central wavelength setting and each FP-POS position will be obtained for the verification. Additional exposures and waits between certain exposures will be required to avoid - and to evaluate - mechanism drifts.

  16. COS Internal FUV Wavelength Verification

    NASA Astrophysics Data System (ADS)

    Keyes, Charles

    2009-07-01

    This program will be executed after the uplink of the OSM1 position updates derived from the determination of the wavelength-scale zero points and desired spectral ranges for each grating in activity COS29 {program 11487 - COS FUV Internal/External Wavelength Scales}. This program will verify that the operational spectral ranges for each grating, central wavelength, and FP-POS are those desired. Subsequent to a successful verification, COS FUV ERO observations that require accurate wavelength scales {if any} and FUV science can be enabled. An internal wavelength calibration spectrum using the default PtNe lamp {lamp 1} with each FUV grating at each central wavelength setting and each FP-POS position will be obtained for the verification. Additional exposures and waits between certain exposures will be required to avoid - and to evaluate - mechanism drifts.

  17. NEXT Thruster Component Verification Testing

    NASA Technical Reports Server (NTRS)

    Pinero, Luis R.; Sovey, James S.

    2007-01-01

    Component testing is a critical part of thruster life validation activities under NASA's Evolutionary Xenon Thruster (NEXT) project. The high voltage propellant isolators were selected for design verification testing. Even though they are based on a heritage design, design changes were made because the isolators will be operated under different environmental conditions including temperature, voltage, and pressure. The life test of two NEXT isolators was therefore initiated and has accumulated more than 10,000 hr of operation. Measurements to date indicate only a negligibly small increase in leakage current. The cathode heaters were also selected for verification testing. The technology to fabricate these heaters, developed for the International Space Station plasma contactor hollow cathode assembly, was transferred to Aerojet for the fabrication of the NEXT prototype model ion thrusters. Testing the contractor-fabricated heaters is necessary to validate fabrication processes for high reliability heaters. This paper documents the status of the propellant isolator and cathode heater tests.

  18. Formal verification of AI software

    NASA Technical Reports Server (NTRS)

    Rushby, John; Whitehurst, R. Alan

    1989-01-01

    The application of formal verification techniques to Artificial Intelligence (AI) software, particularly expert systems, is investigated. Constraint satisfaction and model inversion are identified as two formal specification paradigms for different classes of expert systems. A formal definition of consistency is developed, and the notion of approximate semantics is introduced. Examples are given of how these ideas can be applied in both declarative and imperative forms.

  19. Verification and transparency in future arms control

    SciTech Connect

    Pilat, J.F.

    1996-09-01

    Verification's importance has changed dramatically over time, although it always has been in the forefront of arms control. The goals and measures of verification and the criteria for success have changed with the times as well, reflecting such factors as the centrality of the prospective agreement to East-West relations during the Cold War, the state of relations between the United States and the Soviet Union, and the technologies available for monitoring. Verification's role may be declining in the post-Cold War period. The prospects for such a development will depend, first and foremost, on the high costs of traditional arms control, especially those associated with requirements for verification. Moreover, the growing interest in informal, or non-negotiated arms control does not allow for verification provisions by the very nature of these arrangements. Multilateral agreements are also becoming more prominent and argue against highly effective verification measures, in part because of fears of promoting proliferation by opening sensitive facilities to inspectors from potential proliferant states. As a result, it is likely that transparency and confidence-building measures will achieve greater prominence, both as supplements to and substitutes for traditional verification. Such measures are not panaceas and do not offer all that we came to expect from verification during the Cold War. But they may be the best possible means to deal with current problems of arms reductions and restraints at acceptable levels of expenditure.

  20. Verification Challenges at Low Numbers

    SciTech Connect

    Benz, Jacob M.; Booker, Paul M.; McDonald, Benjamin S.

    2013-06-01

    Many papers have dealt with the political difficulties and ramifications of deep nuclear arms reductions, and the issues of “Going to Zero”. Political issues include extended deterrence, conventional weapons, ballistic missile defense, and regional and geo-political security issues. At each step on the road to low numbers, the verification required to ensure compliance of all parties will increase significantly. Looking post New START, the next step will likely include warhead limits in the neighborhood of 1000. Further reductions will include stepping stones at 100’s of warheads, and then 10’s of warheads before final elimination could be considered of the last few remaining warheads and weapons. This paper will focus on these three threshold reduction levels, 1000, 100’s, 10’s. For each, the issues and challenges will be discussed, potential solutions will be identified, and the verification technologies and chain of custody measures that address these solutions will be surveyed. It is important to note that many of the issues that need to be addressed have no current solution. In these cases, the paper will explore new or novel technologies that could be applied. These technologies will draw from the research and development that is ongoing throughout the national laboratory complex, and will look at technologies utilized in other areas of industry for their application to arms control verification.

  1. Nuclear Data Verification and Standardization

    SciTech Connect

    Karam, Lisa R.; Arif, Muhammad; Thompson, Alan K.

    2011-10-01

    The objective of this interagency program is to provide accurate neutron interaction verification and standardization data for the U.S. Department of Energy Division of Nuclear Physics programs which include astrophysics, radioactive beam studies, and heavy-ion reactions. The measurements made in this program are also useful to other programs that indirectly use the unique properties of the neutron for diagnostic and analytical purposes. These include homeland security, personnel health and safety, nuclear waste disposal, treaty verification, national defense, and nuclear based energy production. The work includes the verification of reference standard cross sections and related neutron data employing the unique facilities and capabilities at NIST and other laboratories as required; leadership and participation in international intercomparisons and collaborations; and the preservation of standard reference deposits. An essential element of the program is critical evaluation of neutron interaction data standards including international coordinations. Data testing of critical data for important applications is included. The program is jointly supported by the Department of Energy and the National Institute of Standards and Technology.

  2. Regression Verification Using Impact Summaries

    NASA Technical Reports Server (NTRS)

    Backes, John; Person, Suzette J.; Rungta, Neha; Thachuk, Oksana

    2013-01-01

    Regression verification techniques are used to prove equivalence of syntactically similar programs. Checking equivalence of large programs, however, can be computationally expensive. Existing regression verification techniques rely on abstraction and decomposition techniques to reduce the computational effort of checking equivalence of the entire program. These techniques are sound but not complete. In this work, we propose a novel approach to improve scalability of regression verification by classifying the program behaviors generated during symbolic execution as either impacted or unimpacted. Our technique uses a combination of static analysis and symbolic execution to generate summaries of impacted program behaviors. The impact summaries are then checked for equivalence using an off-the-shelf decision procedure. We prove that our approach is both sound and complete for sequential programs, with respect to the depth bound of symbolic execution. Our evaluation on a set of sequential C artifacts shows that reducing the size of the summaries can help reduce the cost of software equivalence checking. Various reduction, abstraction, and compositional techniques have been developed to help scale software verification techniques to industrial-sized systems. Although such techniques have greatly increased the size and complexity of systems that can be checked, analysis of large software systems remains costly. Regression analysis techniques, e.g., regression testing [16], regression model checking [22], and regression verification [19], restrict the scope of the analysis by leveraging the differences between program versions. These techniques are based on the idea that if code is checked early in development, then subsequent versions can be checked against a prior (checked) version, leveraging the results of the previous analysis to reduce analysis cost of the current version.
Regression verification addresses the problem of proving equivalence of closely related program
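    The idea of restricting equivalence checking to impacted behaviors can be sketched in miniature (a toy stand-in of our own: real regression verification uses static analysis, symbolic execution, and a decision procedure rather than input enumeration, and the function names here are invented):

```python
# Two versions of a function; a static diff shows only the x >= 0 branch changed.
def v1(x):
    return x * 2 if x >= 0 else -x

def v2(x):
    return x + x if x >= 0 else -x  # refactored positive branch; negative branch untouched

def impacted(x):
    # Summary of impacted behaviors: only inputs reaching the changed branch
    # need to be re-checked for equivalence; the rest carry over from the
    # previous verification of the unchanged code.
    return x >= 0

domain = range(-100, 101)  # brute-force enumeration stands in for symbolic execution
assert all(v1(x) == v2(x) for x in domain if impacted(x))
print("impacted behaviors equivalent")
```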

  3. Gender verification in competitive sports.

    PubMed

    Simpson, J L; Ljungqvist, A; de la Chapelle, A; Ferguson-Smith, M A; Genel, M; Carlson, A S; Ehrhardt, A A; Ferris, E

    1993-11-01

    The possibility that men might masquerade as women and be unfair competitors in women's sports is accepted as outrageous by athletes and the public alike. Since the 1930s, media reports have fuelled claims that individuals who once competed as female athletes subsequently appeared to be men. In most of these cases there was probably ambiguity of the external genitalia, possibly as a result of male pseudohermaphroditism. Nonetheless, beginning at the Rome Olympic Games in 1960, the International Amateur Athletics Federation (IAAF) began establishing rules of eligibility for women athletes. Initially, physical examination was used as a method for gender verification, but this plan was widely resented. Thus, sex chromatin testing (buccal smear) was introduced at the Mexico City Olympic Games in 1968. The principle was that genetic females (46,XX) show a single X-chromatic mass, whereas males (46,XY) do not. Unfortunately, sex chromatin analysis fell out of common diagnostic use by geneticists shortly after the International Olympic Committee (IOC) began its implementation for gender verification. The lack of laboratories routinely performing the test aggravated the problem of errors in interpretation by inexperienced workers, yielding false-positive and false-negative results. However, an even greater problem is that there exist phenotypic females with male sex chromatin patterns (e.g. androgen insensitivity, XY gonadal dysgenesis). These individuals have no athletic advantage as a result of their congenital abnormality and reasonably should not be excluded from competition. That is, only the chromosomal (genetic) sex is analysed by sex chromatin testing, not the anatomical or psychosocial status. For all the above reasons sex chromatin testing unfairly excludes many athletes. Although the IOC offered follow-up physical examinations that could have restored eligibility for those 'failing' sex chromatin tests, most affected athletes seemed to prefer to 'retire'. All

  4. Formal verification of a fault tolerant clock synchronization algorithm

    NASA Technical Reports Server (NTRS)

    Rushby, John; Vonhenke, Frieder

    1989-01-01

    A formal specification and mechanically assisted verification of the interactive convergence clock synchronization algorithm of Lamport and Melliar-Smith is described. Several technical flaws in the analysis given by Lamport and Melliar-Smith were discovered, even though their presentation is unusually precise and detailed. It seems that these flaws were not detected by informal peer scrutiny. The flaws are discussed and a revised presentation of the analysis is given that not only corrects the flaws but is also more precise and easier to follow. Some of the corrections to the flaws require slight modifications to the original assumptions underlying the algorithm and to the constraints on its parameters, and thus change the external specifications of the algorithm. The formal analysis of the interactive convergence clock synchronization algorithm was performed using the Enhanced Hierarchical Development Methodology (EHDM) formal specification and verification environment. This application of EHDM provides a demonstration of some of the capabilities of the system.
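    At the core of the interactive convergence algorithm is an egocentric fault-tolerant average: each clock replaces any reported difference larger than a threshold with zero before averaging. The sketch below shows one resynchronization round under that scheme; the parameter names and the threshold value are ours, and the real algorithm derives constraints on the threshold rather than fixing it.

```python
DELTA = 10.0  # assumed clipping threshold (the algorithm's parameter constraints bound it)

def correction(diffs, delta=DELTA):
    """diffs[i] is this clock's estimate of (clock_i - own_clock), including self as 0.0.

    Differences exceeding delta are replaced by 0.0 (treated as agreement with
    our own clock), which bounds the influence any faulty clock can exert.
    """
    clipped = [d if abs(d) <= delta else 0.0 for d in diffs]
    return sum(clipped) / len(clipped)

# A wildly faulty reading (+1000) contributes nothing to the correction.
print(correction([0.0, 2.0, -2.0, 1000.0]))  # 0.0
```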

  5. PERFORMANCE VERIFICATION OF ANIMAL WATER TREATMENT TECHNOLOGIES THROUGH EPA'S ENVIRONMENTAL TECHNOLOGY VERIFICATION PROGRAM

    EPA Science Inventory

    The U.S. Environmental Protection Agency created the Environmental Technology Verification Program (ETV) to further environmental protection by accelerating the commercialization of new and innovative technology through independent performance verification and dissemination of in...

  6. ENVIRONMENTAL TECHNOLOGY VERIFICATION: GENERIC VERIFICATION PROTOCOL FOR BIOLOGICAL AND AEROSOL TESTING OF GENERAL VENTILATION AIR CLEANERS

    EPA Science Inventory

    The U.S. Environmental Protection Agency established the Environmental Technology Verification Program to accelerate the development and commercialization of improved environmental technology through third party verification and reporting of product performance. Research Triangl...

  7. PERFORMANCE VERIFICATION OF STORMWATER TREATMENT DEVICES UNDER EPA�S ENVIRONMENTAL TECHNOLOGY VERIFICATION PROGRAM

    EPA Science Inventory

    The Environmental Technology Verification (ETV) Program was created to facilitate the deployment of innovative or improved environmental technologies through performance verification and dissemination of information. The program�s goal is to further environmental protection by a...

  8. Magnetic cleanliness verification approach on tethered satellite

    NASA Technical Reports Server (NTRS)

    Messidoro, Piero; Braghin, Massimo; Grande, Maurizio

    1990-01-01

    Magnetic cleanliness testing was performed on the Tethered Satellite as the last step of an articulated verification campaign aimed at demonstrating the capability of the satellite to support its TEMAG (TEthered MAgnetometer) experiment. Tests at unit level and analytical predictions/correlations using a dedicated mathematical model (GANEW program) are also part of the verification activities. Details of the tests are presented, and the results of the verification are described together with recommendations for later programs.

  9. Application of Nuclear Physics Methods in the Verification System for the Comprehensive Nuclear-Test-Ban Treaty

    NASA Astrophysics Data System (ADS)

    Wilhelmsen, K.; Elmgren, K.; Jansson, P.

    2005-04-01

    Elements of the Comprehensive Nuclear-Test-Ban Treaty (CTBT) and its International Monitoring System (IMS) are briefly described. Two different radionuclide detection systems, developed by the Swedish Defence Research Agency (FOI), are treated in more detail.

  10. Imaging quality full chip verification for yield improvement

    NASA Astrophysics Data System (ADS)

    Yang, Qing; Zhou, CongShu; Quek, ShyueFong; Lu, Mark; Foong, YeeMei; Qiu, JianHong; Pandey, Taksh; Dover, Russell

    2013-04-01

    Basic image intensity parameters, like maximum and minimum intensity values (Imin and Imax), image logarithm slope (ILS), normalized image logarithm slope (NILS) and mask error enhancement factor (MEEF), are well known as indexes of photolithography imaging quality. For full chip verification, hotspot detection is typically based on threshold values for line pinching or bridging. For image intensity parameters it is generally harder to quantify an absolute value to define where the process limit will occur, and at which process stage: lithography, etch, or post-CMP. However, it is easy to conclude that hot spots captured by image intensity parameters are more susceptible to process variation and very likely to impact yield. In addition, these image intensity hot spots can be missed by resist model verification, because the resist model is normally calibrated with wafer data on a single resist plane and is an empirical model that fits the resist critical dimension with a mathematical algorithm combined with optical calculation. At the resolution enhancement technology (RET) development stage, a full chip imaging quality check is also a way to qualify the RET solution, such as Optical Proximity Correction (OPC) performance. Adding full chip verification using image intensity parameters is also less costly than adding one more resist model simulation. From a foundry yield improvement and cost saving perspective, it is valuable to quantify the imaging quality to find design hot spots and to correctly define the inline process control margin. This paper studies the correlation between image intensity parameters and process weakness or catastrophic hard failures at different process stages. It also demonstrates how the OPC solution can improve full chip image intensity parameters. Rigorous 3D resist profile simulation across the full height of the resist stack was also performed to identify a correlation to the image intensity parameters.
A methodology of post-OPC full
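    The intensity metrics named in this record can be estimated directly from a sampled aerial-image profile. As a rough illustration (synthetic profile and an assumed critical dimension, not data from the paper): ILS is the spatial derivative of ln(I) at the feature edge, and NILS scales it by the critical dimension CD.

```python
import math

def image_log_slope(x, intensity, i):
    """Central-difference estimate of ILS = d(ln I)/dx at sample index i."""
    return (math.log(intensity[i + 1]) - math.log(intensity[i - 1])) / (x[i + 1] - x[i - 1])

# Synthetic profile I(x) = exp(2x): the log slope is exactly 2 everywhere.
x = [0.0, 0.5, 1.0]
I = [math.exp(2 * xi) for xi in x]
ils = image_log_slope(x, I, 1)
nils = ils * 0.5  # NILS = ILS * CD, with an assumed CD of 0.5 (arbitrary units)
print(ils, nils)
```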

  11. Automatic verification methods for finite state systems

    SciTech Connect

    Sifakis, J.

    1990-01-01

    This volume contains the proceedings of a workshop devoted to the verification of finite state systems. The workshop focused on the development and use of methods, tools and theories for automatic verification of finite state systems. The goal at the workshop was to compare verification methods and tools to assist the applications designer. The papers review verification techniques for finite state systems and evaluate their relative advantages. The techniques considered cover various specification formalisms such as process algebras, automata and logics. Most of the papers focus on exploitation of existing results in three application areas: hardware design, communication protocols and real-time systems.

  12. The SeaHorn Verification Framework

    NASA Technical Reports Server (NTRS)

    Gurfinkel, Arie; Kahsai, Temesghen; Komuravelli, Anvesh; Navas, Jorge A.

    2015-01-01

    In this paper, we present SeaHorn, a software verification framework. The key distinguishing feature of SeaHorn is its modular design that separates the concerns of the syntax of the programming language, its operational semantics, and the verification semantics. SeaHorn encompasses several novelties: it (a) encodes verification conditions using an efficient yet precise inter-procedural technique, (b) provides flexibility in the verification semantics to allow different levels of precision, (c) leverages the state-of-the-art in software model checking and abstract interpretation for verification, and (d) uses Horn-clauses as an intermediate language to represent verification conditions which simplifies interfacing with multiple verification tools based on Horn-clauses. SeaHorn provides users with a powerful verification tool and researchers with an extensible and customizable framework for experimenting with new software verification techniques. The effectiveness and scalability of SeaHorn are demonstrated by an extensive experimental evaluation using benchmarks from SV-COMP 2015 and real avionics code.
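    The Horn-clause encoding SeaHorn uses as an intermediate language can be illustrated on a one-loop program (a toy example of ours; a real Horn-clause backend solves for the loop invariant, whereas here we merely check a guessed candidate by brute force):

```python
# For the program  x = 0; while x < 5: x += 1; assert x == 5
# the verification conditions are the Horn clauses
#   true            -> Inv(0)                 (initiation)
#   Inv(x) & x < 5  -> Inv(x + 1)             (consecution)
#   Inv(x) & x >= 5 -> x == 5                 (safety)
# Candidate interpretation of the unknown predicate Inv:
def inv(x):
    return 0 <= x <= 5

assert inv(0)                                                       # initiation
assert all(inv(x + 1) for x in range(-10, 11) if inv(x) and x < 5)  # consecution
assert all(x == 5 for x in range(-10, 11) if inv(x) and x >= 5)     # safety
print("candidate invariant verified")
```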

  13. Input apparatus for dynamic signature verification systems

    DOEpatents

    EerNisse, Errol P.; Land, Cecil E.; Snelling, Jay B.

    1978-01-01

    The disclosure relates to signature verification input apparatus comprising a writing instrument and platen containing piezoelectric transducers which generate signals in response to writing pressures.

  14. Turbulence Modeling Verification and Validation

    NASA Technical Reports Server (NTRS)

    Rumsey, Christopher L.

    2014-01-01

    Computational fluid dynamics (CFD) software that solves the Reynolds-averaged Navier-Stokes (RANS) equations has been in routine use for more than a quarter of a century. It is currently employed not only for basic research in fluid dynamics, but also for the analysis and design processes in many industries worldwide, including aerospace, automotive, power generation, chemical manufacturing, polymer processing, and petroleum exploration. A key feature of RANS CFD is the turbulence model. Because the RANS equations are unclosed, a model is necessary to describe the effects of the turbulence on the mean flow, through the Reynolds stress terms. The turbulence model is one of the largest sources of uncertainty in RANS CFD, and most models are known to be flawed in one way or another. Alternative methods such as direct numerical simulations (DNS) and large eddy simulations (LES) rely less on modeling and hence include more physics than RANS. In DNS all turbulent scales are resolved, and in LES the large scales are resolved and the effects of the smallest turbulence scales are modeled. However, both DNS and LES are too expensive for most routine industrial usage on today's computers. Hybrid RANS-LES, which blends RANS near walls with LES away from walls, helps to moderate the cost while still retaining some of the scale-resolving capability of LES, but for some applications it can still be too expensive. Even considering its associated uncertainties, RANS turbulence modeling has proved to be very useful for a wide variety of applications. For example, in the aerospace field, many RANS models are considered to be reliable for computing attached flows. However, existing turbulence models are known to be inaccurate for many flows involving separation. Research has been ongoing for decades in an attempt to improve turbulence models for separated and other nonequilibrium flows. When developing or improving turbulence models, both verification and validation are important

  15. Conformance Verification of Privacy Policies

    NASA Astrophysics Data System (ADS)

    Fu, Xiang

    Web applications are both the consumers and providers of information. To increase customer confidence, many websites choose to publish their privacy protection policies. However, policy conformance is often neglected. We propose a logic based framework for formally specifying and reasoning about the implementation of privacy protection by a web application. A first order extension of computation tree logic is used to specify a policy. A verification paradigm, built upon a static control/data flow analysis, is presented to verify if a policy is satisfied.
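    The kind of safety policy targeted here, e.g. "data is never shared before consent is obtained", can be checked by a small search over the application's control flow (all state names and the transition graph below are invented for illustration; the paper's actual framework uses a first-order extension of computation tree logic over a static control/data flow analysis):

```python
# Made-up control-flow graph of a web application's request handling.
graph = {
    "start": ["collect", "obtain_consent"],
    "collect": ["obtain_consent"],
    "obtain_consent": ["share_data"],
    "share_data": [],
}

def conforms(graph, start, guarded, guard):
    """True if every path from start passes through `guard` before `guarded`."""
    stack, seen = [(start, False)], set()
    while stack:
        node, guard_seen = stack.pop()
        if node == guarded and not guard_seen:
            return False  # found a path reaching the guarded state prematurely
        if (node, guard_seen) in seen:
            continue
        seen.add((node, guard_seen))
        for nxt in graph[node]:
            stack.append((nxt, guard_seen or node == guard))
    return True

print(conforms(graph, "start", "share_data", "obtain_consent"))  # True
```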

  16. Why do verification and validation?

    DOE PAGESBeta

    Hu, Kenneth T.; Paez, Thomas L.

    2016-02-19

    In this discussion paper, we explore different ways to assess the value of verification and validation (V&V) of engineering models. We first present a literature review on the value of V&V and then use value chains and decision trees to show how value can be assessed from a decision maker's perspective. In this context, the value is what the decision maker is willing to pay for V&V analysis, with the understanding that the V&V results are uncertain. Finally, the 2014 Sandia V&V Challenge Workshop is used to illustrate these ideas.

  17. Science verification results from PMAS

    NASA Astrophysics Data System (ADS)

    Roth, M. M.; Becker, T.; Böhm, P.; Kelz, A.

    2004-02-01

    PMAS, the Potsdam Multi-Aperture Spectrophotometer, is a new integral field instrument which was commissioned at the Calar Alto 3.5m Telescope in May 2001. We report on results obtained from a science verification run in October 2001. We present observations of the low-metallicity blue compact dwarf galaxy SBS0335-052, the ultra-luminous X-ray source X-1 in the Holmberg II galaxy, the quadruple gravitational lens system Q2237+0305 (the ``Einstein Cross''), the Galactic planetary nebula NGC7027, and extragalactic planetary nebulae in M31. PMAS is now available as a common user instrument at Calar Alto Observatory.

  18. Structural dynamics verification facility study

    NASA Technical Reports Server (NTRS)

    Kiraly, L. J.; Hirchbein, M. S.; Mcaleese, J. M.; Fleming, D. P.

    1981-01-01

    The need for a structural dynamics verification facility to support structures programs was studied. Most of the industry operated facilities are used for highly focused research, component development, and problem solving, and are not used for the generic understanding of the coupled dynamic response of major engine subsystems. Capabilities for the proposed facility include: the ability to both excite and measure coupled structural dynamic response of elastic blades on elastic shafting, the mechanical simulation of various dynamical loadings representative of those seen in operating engines, and the measurement of engine dynamic deflections and interface forces caused by alternative engine mounting configurations and compliances.

  19. Radioxenon detections in the CTBT international monitoring system likely related to the announced nuclear test in North Korea on February 12, 2013.

    PubMed

    Ringbom, A; Axelsson, A; Aldener, M; Auer, M; Bowyer, T W; Fritioff, T; Hoffman, I; Khrustalev, K; Nikkinen, M; Popov, V; Popov, Y; Ungar, K; Wotawa, G

    2014-02-01

    Observations made in April 2013 of the radioxenon isotopes (133)Xe and (131m)Xe at measurement stations in Japan and Russia, belonging to the International Monitoring System for verification of the Comprehensive Nuclear-Test-Ban Treaty, are unique with respect to the measurement history of these stations. Comparison of measured data with calculated isotopic ratios as well as analysis using atmospheric transport modeling indicate that it is likely that the xenon measured was created in the underground nuclear test conducted by North Korea on February 12, 2013, and released 7-8 weeks later. More than one release is required to explain all observations. The (131m)Xe source terms for each release were calculated to be 0.7 TBq, corresponding to about 1-10% of the total xenon inventory for a 10 kt explosion, depending on fractionation and release scenario. The observed ratios could not be used to obtain any information regarding the fissile material that was used in the test. PMID:24316684

  20. GENERIC VERIFICATION PROTOCOL FOR THE VERIFICATION OF PESTICIDE SPRAY DRIFT REDUCTION TECHNOLOGIES FOR ROW AND FIELD CROPS

    EPA Science Inventory

    This ETV program generic verification protocol was prepared and reviewed for the Verification of Pesticide Drift Reduction Technologies project. The protocol provides a detailed methodology for conducting and reporting results from a verification test of pesticide drift reductio...

  1. Application of Hadamard spectroscopy to automated structure verification in high-throughput NMR.

    PubMed

    Ruan, Ke; Yang, Shengtian; Van Sant, Karey A; Likos, John J

    2009-08-01

    Combined verification using 1-D proton and HSQC spectra has proved quite successful; the acquisition time of HSQC spectra, however, can be limiting in high-throughput applications. Replacing it with Hadamard HSQC can significantly enhance the throughput. We hereby propose a protocol to optimize the grouping of the predicted carbon chemical shifts from the proposed structure and the associated Hadamard frequencies and bandwidths. The resulting Hadamard HSQC spectra compare favorably with their Fourier-transformed counterparts and have been shown to perform equivalently in combined verification, but with severalfold enhancement in throughput, as illustrated for 21 commercially available molecules and 16 prototypical drug compounds. Further improvement of the verification accuracy can be achieved by cross validation from Hadamard TOCSY, which can be acquired without much sacrifice in throughput. PMID:19496061
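The grouping step in the proposed protocol, clustering predicted carbon chemical shifts so that each Hadamard irradiation band covers nearby resonances, can be sketched as a greedy one-dimensional clustering. The threshold, shift values, and minimum bandwidth below are invented for illustration, not the paper's optimized values:

```python
# Sketch: greedy grouping of predicted 13C chemical shifts (ppm) into
# Hadamard bands. Shifts within `max_width` of a band's first member
# share one band; each band is summarized by its center and width.
# Parameters and shifts are illustrative, not the paper's protocol.

def group_shifts(shifts, max_width=4.0, min_width=1.0):
    """Return (center_ppm, width_ppm) bands covering the sorted shifts."""
    ordered = sorted(shifts)
    bands, current = [], [ordered[0]]
    for ppm in ordered[1:]:
        if ppm - current[0] <= max_width:
            current.append(ppm)
        else:
            bands.append(current)
            current = [ppm]
    bands.append(current)
    return [((b[0] + b[-1]) / 2.0, max(b[-1] - b[0], min_width))
            for b in bands]

# Hypothetical predicted shifts for a small molecule
bands = group_shifts([14.1, 22.7, 24.0, 31.9, 128.5, 129.1, 140.2])
```

Fewer bands mean fewer Hadamard encoding steps, which is where the throughput gain over a full Fourier-transformed HSQC comes from.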

  2. Cognitive Bias in Systems Verification

    NASA Technical Reports Server (NTRS)

    Larson, Steve

    2012-01-01

    Working definition of cognitive bias: patterns by which information is sought and interpreted that can lead to systematic errors in decisions. Cognitive bias is studied in diverse fields: economics, politics, intelligence, and marketing, to name a few. Attempts to ground cognitive science in physical characteristics of the cognitive apparatus exceed our knowledge. Studies are based on correlations; strict cause and effect is difficult to pinpoint. Effects cited in the paper and discussed here have been replicated many times over and appear sound. Many biases have been described, but it is still unclear whether they are all distinct; there may be only a handful of fundamental biases, which manifest in various ways. Bias can affect system verification in many ways: overconfidence -> questionable decisions to deploy; availability -> inability to conceive critical tests; representativeness -> overinterpretation of results; positive test strategies -> confirmation bias. Debiasing at the individual level is very difficult. The potential effect of bias on the verification process can be managed, but not eliminated, and is worth considering at key points in the process.

  3. RISKIND verification and benchmark comparisons

    SciTech Connect

    Biwer, B.M.; Arnish, J.J.; Chen, S.Y.; Kamboj, S.

    1997-08-01

    This report presents verification calculations and benchmark comparisons for RISKIND, a computer code designed to estimate potential radiological consequences and health risks to individuals and the population from exposures associated with the transportation of spent nuclear fuel and other radioactive materials. Spreadsheet calculations were performed to verify the proper operation of the major options and calculational steps in RISKIND. The program is unique in that it combines a variety of well-established models into a comprehensive treatment for assessing risks from the transportation of radioactive materials. Benchmark comparisons with other validated codes that incorporate similar models were also performed. For instance, the external gamma and neutron dose rate curves for a shipping package estimated by RISKIND were compared with those estimated using the RADTRAN 4 code and NUREG-0170 methodology, and atmospheric dispersion of released material and the resulting dose estimates were compared with results from the GENII and CAP88-PC codes. Verification results have shown the program to be performing its intended function correctly. The benchmark results indicate that the predictions made by RISKIND are within acceptable limits when compared with predictions from similar existing models.

  4. Video-Based Fingerprint Verification

    PubMed Central

    Qin, Wei; Yin, Yilong; Liu, Lili

    2013-01-01

    Conventional fingerprint verification systems use only static information. In this paper, fingerprint videos, which contain dynamic information, are utilized for verification. Fingerprint videos are acquired by the same capture device that acquires conventional fingerprint images, and the user experience of providing a fingerprint video is the same as that of providing a single impression. After preprocessing and aligning processes, “inside similarity” and “outside similarity” are defined and calculated to take advantage of both dynamic and static information contained in fingerprint videos. Match scores between two matching fingerprint videos are then calculated by combining the two kinds of similarity. Experimental results show that the proposed video-based method leads to a relative reduction of 60 percent in the equal error rate (EER) in comparison to the conventional single impression-based method. We also analyze the time complexity of our method when different combinations of strategies are used. Our method still outperforms the conventional method, even if both methods have the same time complexity. Finally, experimental results demonstrate that the proposed video-based method can lead to better accuracy than the multiple impressions fusion method, and the proposed method has a much lower false acceptance rate (FAR) when the false rejection rate (FRR) is quite low. PMID:24008283
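The equal error rate (EER) reported above is the operating point where the false acceptance rate and false rejection rate coincide. A minimal sketch of locating it by sweeping a decision threshold over match scores, with invented score lists (not the paper's data):

```python
# Sketch: locating the equal error rate (EER) of a verification system
# by sweeping a decision threshold over match scores. Higher score
# means better match. Score lists are invented for illustration.

def rates(genuine, impostor, threshold):
    far = sum(s >= threshold for s in impostor) / len(impostor)  # false accepts
    frr = sum(s < threshold for s in genuine) / len(genuine)     # false rejects
    return far, frr

def equal_error_rate(genuine, impostor):
    """Sweep thresholds; return (eer, threshold) minimizing |FAR - FRR|."""
    best = None
    for t in sorted(set(genuine) | set(impostor)):
        far, frr = rates(genuine, impostor, t)
        gap = abs(far - frr)
        if best is None or gap < best[0]:
            best = (gap, (far + frr) / 2.0, t)
    return best[1], best[2]

genuine = [0.9, 0.8, 0.85, 0.7, 0.95, 0.6]   # matching-pair scores
impostor = [0.2, 0.3, 0.4, 0.35, 0.5, 0.65]  # non-matching-pair scores

eer, thr = equal_error_rate(genuine, impostor)
```

A "relative reduction of 60 percent in EER", as claimed above, would mean this number dropping to 40% of its single-impression value at the corresponding threshold.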

  5. Runtime Verification with State Estimation

    NASA Technical Reports Server (NTRS)

    Stoller, Scott D.; Bartocci, Ezio; Seyster, Justin; Grosu, Radu; Havelund, Klaus; Smolka, Scott A.; Zadok, Erez

    2011-01-01

    We introduce the concept of Runtime Verification with State Estimation and show how this concept can be applied to estimate the probability that a temporal property is satisfied by a run of a program when monitoring overhead is reduced by sampling. In such situations, there may be gaps in the observed program executions, thus making accurate estimation challenging. To deal with the effects of sampling on runtime verification, we view event sequences as observation sequences of a Hidden Markov Model (HMM), use an HMM model of the monitored program to "fill in" sampling-induced gaps in observation sequences, and extend the classic forward algorithm for HMM state estimation (which determines the probability of a state sequence, given an observation sequence) to compute the probability that the property is satisfied by an execution of the program. To validate our approach, we present a case study based on the mission software for a Mars rover. The results of our case study demonstrate high prediction accuracy for the probabilities computed by our algorithm. They also show that our technique is much more accurate than simply evaluating the temporal property on the given observation sequences, ignoring the gaps.
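The classic forward algorithm that the authors extend computes the probability of an observation sequence under an HMM, step by step over the hidden states. A textbook sketch with a toy two-state monitoring model (invented for illustration, not the rover HMM):

```python
# Sketch: the classic HMM forward algorithm, which the approach above
# extends to score temporal properties under sampling gaps. The
# two-state model below is a toy example, not the paper's rover HMM.

def forward(obs, states, start_p, trans_p, emit_p):
    """Return P(observation sequence) under the HMM."""
    alpha = {s: start_p[s] * emit_p[s][obs[0]] for s in states}
    for o in obs[1:]:
        alpha = {s: emit_p[s][o] * sum(alpha[r] * trans_p[r][s] for r in states)
                 for s in states}
    return sum(alpha.values())

states = ("ok", "error")
start_p = {"ok": 0.9, "error": 0.1}
trans_p = {"ok": {"ok": 0.95, "error": 0.05},
           "error": {"ok": 0.3, "error": 0.7}}
emit_p = {"ok": {"heartbeat": 0.8, "alarm": 0.2},
          "error": {"heartbeat": 0.1, "alarm": 0.9}}

p = forward(["heartbeat", "heartbeat", "alarm"], states,
            start_p, trans_p, emit_p)
```

The paper's extension replaces the plain emission term with the property automaton's progress, so the same recursion yields the probability that the property holds despite unobserved events.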

  6. ENVIRONMENTAL TECHNOLOGY VERIFICATION FOR INDOOR AIR PRODUCTS

    EPA Science Inventory

    The paper discusses environmental technology verification (ETV) for indoor air products. RTI is developing the framework for a verification testing program for indoor air products, as part of EPA's ETV program. RTI is establishing test protocols for products that fit into three...

  7. IMPROVING AIR QUALITY THROUGH ENVIRONMENTAL TECHNOLOGY VERIFICATIONS

    EPA Science Inventory

    The U.S. Environmental Protection Agency (EPA) began the Environmental Technology Verification (ETV) Program in 1995 as a means of working with the private sector to establish a market-based verification process available to all environmental technologies. Under EPA's Office of R...

  8. The Ontogeny of the Verification System.

    ERIC Educational Resources Information Center

    Akiyama, M. Michael; Guillory, Andrea W.

    1983-01-01

    Young children found it difficult to verify negative statements, but found affirmative statements, affirmative questions, and negative questions equally easy to deal with. It is proposed that children acquire the answering system earlier than the verification system, and use answering to verify statements before acquiring the verification system.…

  9. The monitoring and verification of nuclear weapons

    SciTech Connect

    Garwin, Richard L.

    2014-05-09

    This paper partially reviews and updates the potential for monitoring and verification of nuclear weapons, including verification of their destruction. Cooperative monitoring with templates of the gamma-ray spectrum are an important tool, dependent on the use of information barriers.

  10. ENVIRONMENTAL TECHNOLOGY VERIFICATION AND INDOOR AIR

    EPA Science Inventory

    The paper discusses environmental technology verification and indoor air. RTI has responsibility for a pilot program for indoor air products as part of the U.S. EPA's Environmental Technology Verification (ETV) program. The program objective is to further the development of sel...

  11. 25 CFR 61.8 - Verification forms.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... INDIAN AFFAIRS, DEPARTMENT OF THE INTERIOR TRIBAL GOVERNMENT PREPARATION OF ROLLS OF INDIANS § 61.8... enrollment, a verification form, to be completed and returned, shall be mailed to each previous enrollee using the last address of record. The verification form will be used to ascertain the previous...

  12. 40 CFR 1066.220 - Linearity verification.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... 40 Protection of Environment 34 2012-07-01 2012-07-01 false Linearity verification. 1066.220 Section 1066.220 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) AIR POLLUTION CONTROLS VEHICLE-TESTING PROCEDURES Dynamometer Specifications § 1066.220 Linearity verification. (a) Scope and frequency. Perform linearity...

  13. HTGR analytical methods and design verification

    SciTech Connect

    Neylan, A.J.; Northup, T.E.

    1982-05-01

    Analytical methods for the high-temperature gas-cooled reactor (HTGR) include development, update, verification, documentation, and maintenance of all computer codes for HTGR design and analysis. This paper presents selected nuclear, structural mechanics, seismic, and systems analytical methods related to the HTGR core. This paper also reviews design verification tests in the reactor core, reactor internals, steam generator, and thermal barrier.

  14. Students' Verification Strategies for Combinatorial Problems

    ERIC Educational Resources Information Center

    Mashiach Eizenberg, Michal; Zaslavsky, Orit

    2004-01-01

    We focus on a major difficulty in solving combinatorial problems, namely, on the verification of a solution. Our study aimed at identifying undergraduate students' tendencies to verify their solutions, and the verification strategies that they employ when solving these problems. In addition, an attempt was made to evaluate the level of efficiency…

  15. The monitoring and verification of nuclear weapons

    NASA Astrophysics Data System (ADS)

    Garwin, Richard L.

    2014-05-01

    This paper partially reviews and updates the potential for monitoring and verification of nuclear weapons, including verification of their destruction. Cooperative monitoring with templates of the gamma-ray spectrum are an important tool, dependent on the use of information barriers.

  16. Verification testing of advanced environmental monitoring systems

    SciTech Connect

    Kelly, T.J.; Riggs, K.B.; Fuerst, R.G.

    1999-03-01

    This paper describes the Advanced Monitoring Systems (AMS) pilot project, one of 12 pilots comprising the US EPA's Environmental Technology Verification (ETV) program. The aim of ETV is to promote the acceptance of environmental technologies in the marketplace, through objective third-party verification of technology performance.

  17. Gender Verification of Female Olympic Athletes.

    ERIC Educational Resources Information Center

    Dickinson, Barry D.; Genel, Myron; Robinowitz, Carolyn B.; Turner, Patricia L.; Woods, Gary L.

    2002-01-01

    Gender verification of female athletes has long been criticized by geneticists, endocrinologists, and others in the medical community. Recently, the International Olympic Committee's Athletic Commission called for discontinuation of mandatory laboratory-based gender verification of female athletes. This article discusses normal sexual…

  18. Space transportation system payload interface verification

    NASA Technical Reports Server (NTRS)

    Everline, R. T.

    1977-01-01

    The paper considers STS payload-interface verification requirements and the capability provided by STS to support verification. The intent is to standardize as many interfaces as possible, not only through the design, development, test and evaluation (DDT and E) phase of the major payload carriers but also into the operational phase. The verification process is discussed in terms of its various elements, such as the Space Shuttle DDT and E (including the orbital flight test program) and the major payload carriers DDT and E (including the first flights). Five tools derived from the Space Shuttle DDT and E are available to support the verification process: mathematical (structural and thermal) models, the Shuttle Avionics Integration Laboratory, the Shuttle Manipulator Development Facility, and interface-verification equipment (cargo-integration test equipment).

  19. Hierarchical Design and Verification for VLSI

    NASA Technical Reports Server (NTRS)

    Shostak, R. E.; Elliott, W. D.; Levitt, K. N.

    1983-01-01

    The specification and verification work is described in detail, and some of the problems and issues to be resolved in their application to Very Large Scale Integration (VLSI) systems are examined. The hierarchical design methodologies enable a system architect or design team to decompose a complex design into a formal hierarchy of levels of abstraction. The first step in program verification is tree formation. The next step after tree formation is the generation from the trees of the verification conditions themselves. The approach taken here is similar in spirit to the corresponding step in program verification but requires modeling of the semantics of circuit elements rather than program statements. The last step is that of proving the verification conditions using a mechanical theorem-prover.

  20. New method of verificating optical flat flatness

    NASA Astrophysics Data System (ADS)

    Sun, Hao; Li, Xueyuan; Han, Sen; Zhu, Jianrong; Guo, Zhenglai; Fu, Yuegang

    2014-11-01

    An optical flat is commonly used in optical testing instruments, and flatness is its most important form-error parameter. As a measurement standard, the optical flat flatness (OFF) index needs good precision. Current measurement in China depends heavily on artificial visual interpretation, characterizing flatness through discrete points. The efficiency and accuracy of this method cannot meet the demands of industrial development. To improve testing efficiency and measurement accuracy, it is necessary to develop an optical flat verification system that can obtain full surface information rapidly and efficiently while complying with current national metrological verification procedures. This paper reviews the current optical flat verification method and solves the problems of previous tests using a new method and its supporting software. Final results show that the new system improves verification efficiency and accuracy compared with the method of the JJG 28-2000 metrological verification procedure.

  1. MACCS2 development and verification efforts

    SciTech Connect

    Young, M.; Chanin, D.

    1997-03-01

    MACCS2 represents a major enhancement of the capabilities of its predecessor MACCS, the MELCOR Accident Consequence Code System. MACCS, released in 1987, was developed to estimate the potential impacts to the surrounding public of severe accidents at nuclear power plants. The principal phenomena considered in MACCS/MACCS2 are atmospheric transport and deposition under time-variant meteorology, short-term and long-term mitigative actions and exposure pathways, deterministic and stochastic health effects, and economic costs. MACCS2 was developed as a general-purpose analytical tool applicable to diverse reactor and nonreactor facilities. The MACCS2 package includes three primary enhancements: (1) a more flexible emergency response model, (2) an expanded library of radionuclides, and (3) a semidynamic food-chain model. In addition, errors that had been identified in MACCS version 1.5.11.1 were corrected, including an error that prevented the code from providing intermediate-phase results. The MACCS2 version 1.10 beta test was released to the beta-test group in May 1995. In addition, the University of New Mexico (UNM) has completed an independent verification study of the code package. Since the beta-test release of MACCS2 version 1.10, a number of minor errors have been identified and corrected, and a number of enhancements have been added to the code package. The code enhancements added since the beta-test release of version 1.10 include: (1) an option to allow the user to input the σy and σz plume expansion parameters in table-lookup form for incremental downwind distances, (2) an option to define different initial dimensions for up to four segments of a release, (3) an enhancement to the COMIDA2 food-chain model preprocessor to allow the user to supply externally calculated tables of tritium food-chain dose per unit deposition on farmland to support analyses of tritium releases, and (4) the capability to calculate direction-dependent doses.
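The table-lookup option for plume-expansion parameters described in this record amounts to interpolating σy (or σz) between user-supplied downwind distances. A hedged sketch of that mechanism, with invented table values (not MACCS2 data):

```python
# Sketch: table lookup with linear interpolation for a Gaussian-plume
# expansion parameter sigma_y(x), in the spirit of the MACCS2
# user-supplied-table option described above. Table values are invented.

def interp_sigma(x, table):
    """Linearly interpolate sigma [m] at downwind distance x [m] from a
    sorted (distance, sigma) table; clamp outside the table range."""
    if x <= table[0][0]:
        return table[0][1]
    if x >= table[-1][0]:
        return table[-1][1]
    for (x0, s0), (x1, s1) in zip(table, table[1:]):
        if x0 <= x <= x1:
            return s0 + (s1 - s0) * (x - x0) / (x1 - x0)

# Hypothetical sigma_y table for one atmospheric stability class
sigma_y_table = [(100.0, 8.0), (500.0, 35.0), (1000.0, 65.0), (5000.0, 250.0)]

sigma = interp_sigma(750.0, sigma_y_table)  # between the 500 m and 1000 m rows
```

Supplying the table directly, rather than a built-in power-law fit, is what lets users represent site-specific dispersion behavior.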

  2. Stennis Space Center Verification & Validation Capabilities

    NASA Technical Reports Server (NTRS)

    Pagnutti, Mary; Ryan, Robert E.; Holekamp, Kara; O'Neal, Duane; Knowlton, Kelly; Ross, Kenton; Blonski, Slawomir

    2007-01-01

    Scientists within NASA's Applied Research & Technology Project Office (formerly the Applied Sciences Directorate) have developed a well-characterized remote sensing Verification & Validation (V&V) site at the John C. Stennis Space Center (SSC). This site enables the in-flight characterization of satellite and airborne high spatial resolution remote sensing systems and their products. The smaller scale of the newer high resolution remote sensing systems allows scientists to characterize geometric, spatial, and radiometric data properties using a single V&V site. The targets and techniques used to characterize data from these newer systems can differ significantly from the techniques used to characterize data from the earlier, coarser spatial resolution systems. Scientists have used the SSC V&V site to characterize thermal infrared systems. Enhancements are being considered to characterize active lidar systems. SSC employs geodetic targets, edge targets, radiometric tarps, atmospheric monitoring equipment, and thermal calibration ponds to characterize remote sensing data products. Similar techniques are used to characterize moderate spatial resolution sensing systems at selected nearby locations. The SSC Instrument Validation Lab is a key component of the V&V capability and is used to calibrate field instrumentation and to provide National Institute of Standards and Technology traceability. This poster presents a description of the SSC characterization capabilities and examples of calibration data.

  3. Performing Verification and Validation in Reuse-Based Software Engineering

    NASA Technical Reports Server (NTRS)

    Addy, Edward A.

    1999-01-01

    The implementation of reuse-based software engineering not only introduces new activities to the software development process, such as domain analysis and domain modeling, it also impacts other aspects of software engineering. Other areas of software engineering that are affected include Configuration Management, Testing, Quality Control, and Verification and Validation (V&V). Activities in each of these areas must be adapted to address the entire domain or product line rather than a specific application system. This paper discusses changes and enhancements to the V&V process, in order to adapt V&V to reuse-based software engineering.

  4. Speaker verification using combined acoustic and EM sensor signal processing

    SciTech Connect

    Ng, L C; Gable, T J; Holzrichter, J F

    2000-11-10

    Low-power EM radar-like sensors have made it possible to measure properties of the human speech production system in real time, without acoustic interference. This greatly enhances the quality and quantity of information for many speech-related applications; see Holzrichter, Burnett, Ng, and Lea, J. Acoust. Soc. Am. 103(1), 622 (1998). By combining the glottal EM sensor (GEMS) signals with the acoustic signals, we have demonstrated an almost 10-fold reduction in error rates in a speaker verification system experiment under a moderately noisy environment (-10 dB).

  5. TEST DESIGN FOR ENVIRONMENTAL TECHNOLOGY VERIFICATION (ETV) OF ADD-ON NOX CONTROL UTILIZING OZONE INJECTION

    EPA Science Inventory

    The paper discusses the test design for environmental technology verification (ETV) of add-on nitrogen oxides (NOx) control utilizing ozone injection. (NOTE: ETV is an EPA-established program to enhance domestic and international market acceptance of new or improved commercially...

  6. Retail applications of signature verification

    NASA Astrophysics Data System (ADS)

    Zimmerman, Thomas G.; Russell, Gregory F.; Heilper, Andre; Smith, Barton A.; Hu, Jianying; Markman, Dmitry; Graham, Jon E.; Drews, Clemens

    2004-08-01

    The dramatic rise in identity theft, the ever-pressing need to provide convenience in checkout services to attract and retain loyal customers, and the growing use of multi-function signature capture devices in the retail sector provide favorable conditions for the deployment of dynamic signature verification (DSV) in retail settings. We report on the development of a DSV system to meet the needs of the retail sector. We currently have a database of approximately 10,000 signatures collected from 600 subjects and forgers. Previous work at IBM on DSV has been merged and extended to achieve robust performance on pen position data available from commercial point of sale hardware, achieving equal error rates on skilled forgeries and authentic signatures of 1.5% to 4%.

  7. Verification of NASA Emergent Systems

    NASA Technical Reports Server (NTRS)

    Rouff, Christopher; Vanderbilt, Amy K. C. S.; Truszkowski, Walt; Rash, James; Hinchey, Mike

    2004-01-01

    NASA is studying advanced technologies for a future robotic exploration mission to the asteroid belt. This mission, the prospective ANTS (Autonomous Nano Technology Swarm) mission, will comprise 1,000 autonomous robotic agents designed to cooperate in asteroid exploration. The emergent properties of swarm-type missions make them powerful, but at the same time more difficult to design and to assure that the proper behaviors will emerge. We are currently investigating formal methods and techniques for verification and validation of future swarm-based missions. The advantage of using formal methods is their ability to mathematically assure the behavior of a swarm, emergent or otherwise. The ANTS mission is being used as an example and case study of swarm-based missions with which to experiment and test current formal methods with intelligent swarms. Using the ANTS mission, we have evaluated multiple formal methods to determine their effectiveness in modeling and assuring swarm behavior.

  8. Verification of FANTASTIC integrated code

    NASA Technical Reports Server (NTRS)

    Chauhan, Rajinder Singh

    1987-01-01

    FANTASTIC is an acronym for Failure Analysis Nonlinear Thermal and Structural Integrated Code. This program was developed by Failure Analysis Associates, Palo Alto, Calif., for MSFC to improve the accuracy of solid rocket motor nozzle analysis. FANTASTIC has three modules: FACT - thermochemical analysis; FAHT - heat transfer analysis; and FAST - structural analysis. All modules have keywords for data input. Work is in progress on the verification of the FAHT module, which is done by using data for various problems with known solutions as inputs. The information obtained is used to identify problem areas of the code and is passed on to the developer for debugging. Failure Analysis Associates has revised the first version of the FANTASTIC code, and a new improved version has been released to the Thermal Systems Branch.

  9. Automated claim and payment verification.

    PubMed

    Segal, Mark J; Morris, Susan; Rubin, James M O

    2002-01-01

    Since the start of managed care, there has been steady deterioration in the ability of physicians, hospitals, payors, and patients to understand reimbursement and the contracts and payment policies that drive it. This lack of transparency has generated administrative costs, confusion, and mistrust. It is therefore essential that physicians, hospitals, and payors have rapid access to accurate information on contractual payment terms. This article summarizes problems with contract-based reimbursement and needed responses by medical practices. It describes an innovative, Internet-based claims and payment verification service, Phynance, which automatically verifies the accuracy of all claims and payments by payor, contract and line item. This service enables practices to know and apply the one, true, contractually obligated allowable. The article details implementation costs and processes and anticipated return on investment. The resulting transparency improves business processes throughout health care, increasing efficiency and lowering costs for physicians, hospitals, payors, employers--and patients. PMID:12122814

  10. Requirements, Verification, and Compliance (RVC) Database Tool

    NASA Technical Reports Server (NTRS)

    Rainwater, Neil E., II; McDuffee, Patrick B.; Thomas, L. Dale

    2001-01-01

    This paper describes the development, design, and implementation of the Requirements, Verification, and Compliance (RVC) database used on the International Space Welding Experiment (ISWE) project managed at Marshall Space Flight Center. The RVC is a systems engineer's tool for automating and managing the following information: requirements; requirements traceability; verification requirements; verification planning; verification success criteria; and compliance status. This information normally contained within documents (e.g. specifications, plans) is contained in an electronic database that allows the project team members to access, query, and status the requirements, verification, and compliance information from their individual desktop computers. Using commercial-off-the-shelf (COTS) database software that contains networking capabilities, the RVC was developed not only with cost savings in mind but primarily for the purpose of providing a more efficient and effective automated method of maintaining and distributing the systems engineering information. In addition, the RVC approach provides the systems engineer the capability to develop and tailor various reports containing the requirements, verification, and compliance information that meets the needs of the project team members. The automated approach of the RVC for capturing and distributing the information improves the productivity of the systems engineer by allowing that person to concentrate more on the job of developing good requirements and verification programs and not on the effort of being a "document developer".
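The kind of query the RVC database automates, for example deriving a compliance status from the verification events recorded against each requirement, can be sketched with a small in-memory schema. The tables, field names, and records below are hypothetical, not ISWE data:

```python
# Sketch: a minimal requirements/verification/compliance query of the
# kind the RVC tool automates. Schema and records are hypothetical.

requirements = {
    "R-001": "Weld chamber shall maintain vacuum below 1e-4 torr",
    "R-002": "Electrode feed rate shall be commandable remotely",
    "R-003": "Experiment mass shall not exceed 150 kg",
}

verifications = [  # (requirement id, method, status)
    ("R-001", "test", "pass"),
    ("R-002", "demonstration", "fail"),
    ("R-002", "demonstration", "pass"),  # re-run after a fix
]

def compliance_status(req_id):
    """'compliant' if any verification passed, 'open' if none recorded,
    'non-compliant' if attempted but never passed."""
    runs = [status for rid, _, status in verifications if rid == req_id]
    if not runs:
        return "open"
    return "compliant" if "pass" in runs else "non-compliant"

report = {rid: compliance_status(rid) for rid in requirements}
```

Keeping this join in a networked database, rather than in documents, is what lets every team member pull a current compliance report from a desktop.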

  11. Technology Foresight and nuclear test verification: a structured and participatory approach

    NASA Astrophysics Data System (ADS)

    Noack, Patrick; Gaya-Piqué, Luis; Haralabus, Georgios; Auer, Matthias; Jain, Amit; Grenard, Patrick

    2013-04-01

    maturity: the number of years until the technology in question reaches Development Stage 3 (i.e. prototype validated). 6. Integration effort: the anticipated level of effort required by the PTS to fully integrate the technology, process, concept or idea into its verification environment. 7. Time to impact: the number of years until the technology is fully developed and integrated into the PTS verification environment and delivers on its full potential. The resulting database is coupled to Pivot, a novel information management software tool which offers powerful visualisation of the taxonomy's parameters for each technology. Pivot offers many advantages over conventional spreadsheet-interfaced database tools: based on shared categories in the taxonomy, users can quickly and intuitively discover linkages, commonalities and various interpretations of prospective CTBT-pertinent technologies. It is easily possible to visualise a resulting subset of technologies that conforms to specific user-selected attributes from the full range of taxonomy categories. In this presentation we will illustrate the range of future technologies, processes, concepts and ideas; we will demonstrate how the Pivot tool can be fruitfully applied to assist in strategic planning and development, and to identify gaps on the technology development horizon. Finally, we will show how the Pivot tool together with the taxonomy offers real and emerging insights for making sense of large amounts of disparate technologies.
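The faceted filtering Pivot provides, selecting the subset of technologies that match user-chosen taxonomy attributes, reduces to a predicate filter over records. A sketch with invented records whose fields follow the taxonomy items quoted in the abstract (maturity, integration effort, time to impact):

```python
# Sketch: faceted filtering over a technology-foresight taxonomy, in
# the spirit of the Pivot visualisation described above. Records are
# invented; field names follow the taxonomy items in the abstract.

technologies = [
    {"name": "quantum gravimeter", "maturity_years": 8,
     "integration_effort": "high", "time_to_impact": 12},
    {"name": "low-cost infrasound array", "maturity_years": 2,
     "integration_effort": "low", "time_to_impact": 4},
    {"name": "autonomous noble-gas sampler", "maturity_years": 4,
     "integration_effort": "medium", "time_to_impact": 6},
]

def facet_filter(records, **criteria):
    """Keep records satisfying every criterion; numeric criteria act as
    upper bounds, string criteria as exact matches."""
    def ok(rec):
        for field, wanted in criteria.items():
            value = rec[field]
            if isinstance(wanted, (int, float)):
                if value > wanted:
                    return False
            elif value != wanted:
                return False
        return True
    return [r for r in records if ok(r)]

# "Which technologies mature within 5 years and deliver within 10?"
near_term = facet_filter(technologies, maturity_years=5, time_to_impact=10)
```

Each user-selected facet narrows the visible set, which is how such a tool exposes gaps on the technology development horizon.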

  12. Liquefied Natural Gas (LNG) dispenser verification device

    NASA Astrophysics Data System (ADS)

    Xiong, Maotao; Yang, Jie-bin; Zhao, Pu-jun; Yu, Bo; Deng, Wan-quan

    2013-01-01

    The composition, working principle, and calibration status of LNG (Liquefied Natural Gas) dispensers in China are introduced. To address the shortcomings of the weighing method for calibrating LNG dispensers, an LNG dispenser verification device has been developed. The device uses the master meter method to verify LNG dispensers in the field. Experimental results indicate that the device offers stable performance, a high accuracy level, and flexible construction, reaching an internationally advanced level. The LNG dispenser verification device will promote the development of the LNG dispenser industry in China and improve the technical level of LNG dispenser manufacture.
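    The master meter method reduces to simple arithmetic: the dispenser's indicated quantity is compared against a reference ("master") meter reading taken in the field. The numbers and the 1.0 % tolerance below are illustrative assumptions, not values from the paper.

```python
# Master-meter check (illustrative values): relative indication error of the
# dispenser with respect to the reference meter, in percent.
def meter_error_percent(indicated_kg: float, master_kg: float) -> float:
    return (indicated_kg - master_kg) / master_kg * 100.0

error = meter_error_percent(indicated_kg=50.4, master_kg=50.0)
print(round(error, 2))  # 0.8
# The dispenser passes if |error| is within the maximum permissible error,
# e.g. 1.0 % (assumed tolerance for illustration).
print(abs(error) <= 1.0)  # True
```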

  13. Design for Verification: Enabling Verification of High Dependability Software-Intensive Systems

    NASA Technical Reports Server (NTRS)

    Mehlitz, Peter C.; Penix, John; Markosian, Lawrence Z.; Koga, Dennis (Technical Monitor)

    2003-01-01

    Strategies to achieve confidence that high-dependability applications are correctly implemented include testing and automated verification. Testing deals mainly with a limited number of expected execution paths. Verification usually attempts to deal with a larger number of possible execution paths. While the impact of architecture design on testing is well known, its impact on most verification methods is not as well understood. The Design for Verification approach considers verification from the application development perspective, in which system architecture is designed explicitly according to the application's key properties. The D4V-hypothesis is that the same general architecture and design principles that lead to good modularity, extensibility and complexity/functionality ratio can be adapted to overcome some of the constraints on verification tools, such as the production of hand-crafted models and the limits on dynamic and static analysis caused by state space explosion.

  14. VERIFICATION OF GLOBAL CLIMATE CHANGE MITIGATION TECHNOLOGIES

    EPA Science Inventory

    This is a continuation of independent performance evaluations of environmental technologies under EPA's Environmental Technology Verification Program. Emissions of some greenhouse gases, most notably methane, can be controlled profitably now, even in the absence of regulations. ...

  15. Electronic Verification at the Kennedy Space Center

    NASA Technical Reports Server (NTRS)

    Johnson, T. W.

    1995-01-01

    This document reviews some current applications of Electronic Verification and the benefits such applications are providing the Kennedy Space Center (KSC). It also previews some new technologies, including statistics regarding performance and possible utilization of the technology.

  16. THE EPA'S ENVIRONMENTAL TECHNOLOGY VERIFICATION (ETV) PROGRAM

    EPA Science Inventory

    The Environmental Protection Agency (EPA) instituted the Environmental Technology Verification Program--or ETV--to verify the performance of innovative technical solutions to problems that threaten human health or the environment. ETV was created to substantially accelerate the e...

  17. ETV - ENVIRONMENTAL TECHNOLOGY VERIFICATION (ETV) - RISK MANAGEMENT

    EPA Science Inventory

    In October 1995, the Environmental Technology Verification (ETV) Program was established by EPA. The goal of ETV is to provide credible performance data for commercial-ready environmental technologies to speed their implementation for the benefit of vendors, purchasers, permitter...

  18. The PASCAL-HDM Verification System

    NASA Technical Reports Server (NTRS)

    1983-01-01

    The PASCAL-HDM verification system is described. This system supports the mechanical generation of verification conditions from PASCAL programs and HDM-SPECIAL specifications using the Floyd-Hoare axiomatic method. Tools are provided to parse programs and specifications, check their static semantics, generate verification conditions from Hoare rules, and translate the verification conditions appropriately for proof using the Shostak Theorem Prover. The differences between standard PASCAL and the language handled by this system are explained; these consist mostly of restrictions to the standard language definition, the only extensions or modifications being the addition of specifications to the code and the requirement that references to a function of no arguments carry empty parentheses.
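    The core of Floyd-Hoare verification-condition generation can be illustrated with the assignment rule: for {P} x := e {Q}, the verification condition is P implies Q with e substituted for x. The toy substitution below is a deliberately naive textual sketch (it would mis-handle variables appearing inside longer identifiers), not the PASCAL-HDM implementation.

```python
# Toy weakest-precondition step in the Floyd-Hoare style (illustrative only):
# for an assignment x := e with postcondition Q, the VC for {P} x := e {Q}
# is  P -> Q[x := e].
def wp_assign(post: str, var: str, expr: str) -> str:
    """Substitute expr for var in the postcondition (naive textual substitution)."""
    return post.replace(var, f"({expr})")

# VC for {y >= 0} x := y + 1 {x > 0}  is  y >= 0 -> (y + 1) > 0
vc = f"y >= 0 -> {wp_assign('x > 0', 'x', 'y + 1')}"
print(vc)  # y >= 0 -> (y + 1) > 0
```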

  19. Engineering drawing field verification program. Revision 3

    SciTech Connect

    Ulk, P.F.

    1994-10-12

    Safe, efficient operation of waste tank farm facilities is dependent in part upon the availability of accurate, up-to-date plant drawings. Accurate plant drawings are also required in support of facility upgrades and future engineering remediation projects. This supporting document establishes the procedure for performing a visual field verification of engineering drawings, defines the degree of visual observation to be performed, and documents the results. A copy of the drawing attesting to the degree of visual observation will be paginated into the released Engineering Change Notice (ECN) documenting the field verification for future retrieval and reference. All waste tank farm essential and support drawings within the scope of this program will be converted from manual to computer aided drafting (CAD) drawings. A permanent reference to the field verification status will be placed along the right border of the CAD-converted drawing, referencing the revision level at which the visual verification was performed and documented.

  20. Verification timer for AECL 780 Cobalt unit.

    PubMed

    Smathers, J B; Holly, F E

    1984-05-01

    To verify the proper time setting of the motorized run-down timer on an AECL 780 Cobalt Unit, a digital timer that can be added to the system for under $300 is described. PMID:6735762

  1. HDM/PASCAL Verification System User's Manual

    NASA Technical Reports Server (NTRS)

    Hare, D.

    1983-01-01

    The HDM/Pascal verification system is a tool for proving the correctness of programs written in PASCAL and specified in the Hierarchical Development Methodology (HDM). This document assumes an understanding of PASCAL, HDM, program verification, and the STP system. The steps toward verification which this tool provides are parsing programs and specifications, checking the static semantics, and generating verification conditions. Some support functions are provided such as maintaining a data base, status management, and editing. The system runs under the TOPS-20 and TENEX operating systems and is written in INTERLISP. However, no knowledge is assumed of these operating systems or of INTERLISP. The system requires three executable files, HDMVCG, PARSE, and STP. Optionally, the editor EMACS should be on the system in order for the editor to work. The file HDMVCG is invoked to run the system. The files PARSE and STP are used as lower forks to perform the functions of parsing and proving.

  2. Calibration and verification of environmental models

    NASA Technical Reports Server (NTRS)

    Lee, S. S.; Sengupta, S.; Weinberg, N.; Hiser, H.

    1976-01-01

    The problems of calibration and verification of mesoscale models used for investigating power plant discharges are considered. The value of remote sensors for data acquisition is discussed as well as an investigation of Biscayne Bay in southern Florida.

  3. MAMA Software Features: Quantification Verification Documentation-1

    SciTech Connect

    Ruggiero, Christy E.; Porter, Reid B.

    2014-05-21

    This document reviews the verification of the basic shape quantification attributes in the MAMA software against hand calculations, in order to show that the calculations are implemented mathematically correctly and give the expected quantification results.
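    Verification against hand calculations, as described above, amounts to computing a shape attribute in code and asserting it equals the analytically known value. The shoelace-formula example below is a generic illustration of this practice, not code from MAMA.

```python
# Hand-calculation check: compute a basic shape attribute and compare it
# against the analytic (hand) value.
def polygon_area(pts):
    """Shoelace formula for a simple polygon given as (x, y) vertex pairs."""
    n = len(pts)
    s = sum(pts[i][0] * pts[(i + 1) % n][1] - pts[(i + 1) % n][0] * pts[i][1]
            for i in range(n))
    return abs(s) / 2.0

square = [(0, 0), (4, 0), (4, 4), (0, 4)]
assert polygon_area(square) == 16.0   # hand calculation: 4 x 4 = 16
print(polygon_area(square))  # 16.0
```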

  4. U.S. Environmental Technology Verification Program

    EPA Science Inventory

    Overview of the U.S. Environmental Technology Verification Program (ETV), the ETV Greenhouse Gas Technology Center, and energy-related ETV projects. Presented at the Department of Energy's National Renewable Laboratory in Boulder, Colorado on June 23, 2008.

  5. ENVIRONMENTAL TECHNOLOGY VERIFICATION (ETV) PROGRAM: FUEL CELLS

    EPA Science Inventory

    The U.S. Environmental Protection Agency (EPA) Environmental Technology Verification (ETV) Program evaluates the performance of innovative air, water, pollution prevention and monitoring technologies that have the potential to improve human health and the environment. This techno...

  6. ENVIRONMENTAL TECHNOLOGY VERIFICATION (ETV) PROGRAM: STORMWATER TECHNOLOGIES

    EPA Science Inventory

    The U.S. Environmental Protection Agency (EPA) Environmental Technology Verification (ETV) program evaluates the performance of innovative air, water, pollution prevention and monitoring technologies that have the potential to improve human health and the environment. This techn...

  7. Handbook: Design of automated redundancy verification

    NASA Technical Reports Server (NTRS)

    Ford, F. A.; Hasslinger, T. W.; Moreno, F. J.

    1971-01-01

    The use of the handbook is discussed and the design progress is reviewed. A description of the problem is presented, and examples are given to illustrate the necessity for redundancy verification, along with the types of situations to which it is typically applied. Reusable space vehicles, such as the space shuttle, are recognized as being significant in the development of the automated redundancy verification problem.

  8. The NPARC Alliance Verification and Validation Archive

    NASA Technical Reports Server (NTRS)

    Slater, John W.; Dudek, Julianne C.; Tatum, Kenneth E.

    2000-01-01

    The NPARC Alliance (National Project for Applications oriented Research in CFD) maintains a publicly-available, web-based verification and validation archive as part of the development and support of the WIND CFD code. The verification and validation methods used for the cases attempt to follow the policies and guidelines of the ASME and AIAA. The emphasis is on air-breathing propulsion flow fields with Mach numbers ranging from low-subsonic to hypersonic.

  9. A verification library for multibody simulation software

    NASA Technical Reports Server (NTRS)

    Kim, Sung-Soo; Haug, Edward J.; Frisch, Harold P.

    1989-01-01

    A multibody dynamics verification library that maintains and manages test and validation data is proposed, based on RRC robot arm and CASE backhoe validation and on a comparative study of DADS, DISCOS, and CONTOPS, which are existing public-domain and commercial multibody dynamic simulation programs. Using simple representative problems, simulation results from each program are cross-checked, and the validation results are presented. Functionalities of the verification library are defined in order to automate the validation procedure.

  10. Transmutation Fuel Performance Code Thermal Model Verification

    SciTech Connect

    Gregory K. Miller; Pavel G. Medvedev

    2007-09-01

    FRAPCON fuel performance code is being modified to be able to model performance of the nuclear fuels of interest to the Global Nuclear Energy Partnership (GNEP). The present report documents the effort for verification of the FRAPCON thermal model. It was found that, with minor modifications, FRAPCON thermal model temperature calculation agrees with that of the commercial software ABAQUS (Version 6.4-4). This report outlines the methodology of the verification, code input, and calculation results.
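    Code-to-code verification of the kind reported above boils down to comparing the modified model's output against a reference solution within a tolerance. The temperatures and the 1 % tolerance below are illustrative assumptions, not values from the report.

```python
# Code-to-code comparison sketch: temperatures from the modified thermal model
# vs. a reference solution (e.g. a commercial FE code), checked against a
# relative tolerance. All numbers are illustrative.
model_T = [612.0, 745.3, 901.8]   # K, modified model
ref_T   = [610.5, 748.0, 905.0]   # K, reference solution

def max_rel_diff(a, b):
    return max(abs(x - y) / y for x, y in zip(a, b))

d = max_rel_diff(model_T, ref_T)
print(round(d, 4))  # 0.0036
print(d < 0.01)     # agrees within the assumed 1 % tolerance
```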

  11. Verification and Validation in Computational Fluid Dynamics

    SciTech Connect

    OBERKAMPF, WILLIAM L.; TRUCANO, TIMOTHY G.

    2002-03-01

    Verification and validation (V and V) are the primary means to assess accuracy and reliability in computational simulations. This paper presents an extensive review of the literature in V and V in computational fluid dynamics (CFD), discusses methods and procedures for assessing V and V, and develops a number of extensions to existing ideas. The review of the development of V and V terminology and methodology points out the contributions from members of the operations research, statistics, and CFD communities. Fundamental issues in V and V are addressed, such as code verification versus solution verification, model validation versus solution validation, the distinction between error and uncertainty, conceptual sources of error and uncertainty, and the relationship between validation and prediction. The fundamental strategy of verification is the identification and quantification of errors in the computational model and its solution. In verification activities, the accuracy of a computational solution is primarily measured relative to two types of highly accurate solutions: analytical solutions and highly accurate numerical solutions. Methods for determining the accuracy of numerical solutions are presented and the importance of software testing during verification activities is emphasized.
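    One standard piece of the verification arithmetic surveyed above is the observed order of accuracy: given solutions on three grids refined by a constant ratio r, the order is p = ln((f3 - f2)/(f2 - f1)) / ln(r). The values below are chosen so a second-order scheme is recovered; they are illustrative, not from the paper.

```python
import math

# Observed order of accuracy from three systematically refined grids:
#   p = ln((f3 - f2) / (f2 - f1)) / ln(r)
def observed_order(f1, f2, f3, r):
    return math.log((f3 - f2) / (f2 - f1)) / math.log(r)

# fine (h), medium (2h), coarse (4h) solutions; error behaves like C * h^2
f1, f2, f3 = 1.01, 1.04, 1.16
p = observed_order(f1, f2, f3, r=2.0)
print(round(p, 2))  # 2.0
```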

  12. Development of advanced seal verification

    NASA Technical Reports Server (NTRS)

    Workman, Gary L.; Kosten, Susan E.; Abushagur, Mustafa A.

    1992-01-01

    The purpose of this research is to develop a technique to monitor and ensure seal integrity with a sensor that has no active elements to burn out during a long duration activity, such as a leakage test or especially during a mission in space. The original concept proposed is that by implementing fiber optic sensors, changes in the integrity of a seal can be monitored in real time and at no time should the optical fiber sensor fail. The electrical components which provide optical excitation and detection through the fiber are not part of the seal; hence, if these electrical components fail, they can be easily changed without breaking the seal. The optical connections required for the concept to work do present a functional problem to work out. The utility of the optical fiber sensor for seal monitoring should be general enough that the degradation of a seal can be determined before catastrophic failure occurs and appropriate action taken. Two parallel efforts were performed in determining the feasibility of using optical fiber sensors for seal verification. In one study, research on interferometric measurements of the mechanical response of the optical fiber sensors to seal integrity was studied. In a second study, the implementation of the optical fiber to a typical vacuum chamber was implemented and feasibility studies on microbend experiments in the vacuum chamber were performed. Also, an attempt was made to quantify the amount of pressure actually being applied to the optical fiber using finite element analysis software by Algor.

  13. Learning Assumptions for Compositional Verification

    NASA Technical Reports Server (NTRS)

    Cobleigh, Jamieson M.; Giannakopoulou, Dimitra; Pasareanu, Corina; Clancy, Daniel (Technical Monitor)

    2002-01-01

    Compositional verification is a promising approach to addressing the state explosion problem associated with model checking. One compositional technique advocates proving properties of a system by checking properties of its components in an assume-guarantee style. However, the application of this technique is difficult because it involves non-trivial human input. This paper presents a novel framework for performing assume-guarantee reasoning in an incremental and fully automated fashion. To check a component against a property, our approach generates assumptions that the environment needs to satisfy for the property to hold. These assumptions are then discharged on the rest of the system. Assumptions are computed by a learning algorithm. They are initially approximate, but become gradually more precise by means of counterexamples obtained by model checking the component and its environment, alternately. This iterative process may at any stage conclude that the property is either true or false in the system. We have implemented our approach in the LTSA tool and applied it to the analysis of a NASA system.
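    The assume-guarantee rule can be caricatured over finite trace sets: if component M1, constrained by assumption A, satisfies property P, and component M2 satisfies A, then the composition satisfies P. The sets below are a drastic simplification of the LTSA machinery (real components are labelled transition systems, not trace sets), intended only to show the shape of the rule.

```python
# Assume-guarantee premises over finite trace sets (illustrative caricature):
#   premise 1: traces(M1) restricted by A are within P
#   premise 2: traces(M2) are within the assumption A
M1 = {"ab", "ac", "b"}    # component under analysis
M2 = {"ab", "b"}          # its environment
A  = {"ab", "b"}          # learned assumption about the environment
P  = {"ab", "b", "c"}     # property to establish

premise1 = (M1 & A) <= P
premise2 = M2 <= A
print(premise1 and premise2)   # both premises hold ...
print((M1 & M2) <= P)          # ... hence the composition satisfies P
```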

  14. Cold Flow Verification Test Facility

    SciTech Connect

    Shamsi, A.; Shadle, L.J.

    1996-12-31

    The cold flow verification test facility consists of a 15-foot high, 3-foot diameter, domed vessel made of clear acrylic in two flanged sections. The unit can operate up to pressures of 14 psig. The internals include a 10-foot high jetting fluidized bed, a cylindrical baffle that hangs from the dome, and a rotating grate for control of continuous solids removal. The fluid bed is continuously fed solids (20 to 150 lb/hr) through a central nozzle made up of concentric pipes. It can either be configured as a half or full cylinder of various dimensions. The fluid bed has flow loops for separate air flow control for conveying solids (inner jet, 500 to 100000 scfh), make-up into the jet (outer jet, 500 to 8000 scfh), spargers in the solids removal annulus (100 to 2000 scfh), and 6 air jets (20 to 200 scfh) on the sloping conical grid. Additional air (500 to 10000 scfh) can be added to the top of the dome and under the rotating grate. The outer vessel, the hanging cylindrical baffles or skirt, and the rotating grate can be used to study issues concerning moving bed reactors. There is ample allowance for access and instrumentation in the outer shell. Furthermore, this facility is available for future Cooperative Research and Development Agreements (CRADAs) to study issues and problems associated with fluid- and fixed-bed reactors. The design allows testing of different dimensions and geometries.

  15. Verification of excess defense material

    SciTech Connect

    Fearey, B.L.; Pilat, J.F.; Eccleston, G.W.; Nicholas, N.J.; Tape, J.W.

    1997-12-01

    The international community in the post-Cold War period has expressed an interest in the International Atomic Energy Agency (IAEA) using its expertise in support of the arms control and disarmament process in unprecedented ways. The pledges of the US and Russian presidents to place excess defense materials under some type of international inspections raises the prospect of using IAEA safeguards approaches for monitoring excess materials, which include both classified and unclassified materials. Although the IAEA has suggested the need to address inspections of both types of materials, the most troublesome and potentially difficult problems involve approaches to the inspection of classified materials. The key issue for placing classified nuclear components and materials under IAEA safeguards is the conflict between these traditional IAEA materials accounting procedures and the US classification laws and nonproliferation policy designed to prevent the disclosure of critical weapon-design information. Possible verification approaches to classified excess defense materials could be based on item accountancy, attributes measurements, and containment and surveillance. Such approaches are not wholly new; in fact, they are quite well established for certain unclassified materials. Such concepts may be applicable to classified items, but the precise approaches have yet to be identified, fully tested, or evaluated for technical and political feasibility, or for their possible acceptability in an international inspection regime. Substantial work remains in these areas. This paper examines many of the challenges presented by international inspections of classified materials.

  16. National Verification System of National Meteorological Center, China

    NASA Astrophysics Data System (ADS)

    Zhang, Jinyan; Wei, Qing; Qi, Dan

    2016-04-01

    Product Quality Verification Division for official weather forecasting of China was founded in April 2011. It is affiliated to the Forecast System Laboratory (FSL), National Meteorological Center (NMC), China. There are three employees in this department. I am one of the employees and I am in charge of the Product Quality Verification Division in NMC, China. After five years of construction, an integrated real-time National Verification System of NMC, China has been established. At present, its primary roles include: 1) to verify official weather forecasting quality of NMC, China; 2) to verify the official city weather forecasting quality of Provincial Meteorological Bureau; 3) to evaluate forecasting quality for each forecaster in NMC, China. To verify official weather forecasting quality of NMC, China, we have developed:
    • Grid QPF verification module (including upscale)
    • Grid temperature, humidity and wind forecast verification module
    • Severe convective weather forecast verification module
    • Typhoon forecast verification module
    • Disaster forecast verification
    • Disaster warning verification module
    • Medium and extended period forecast verification module
    • Objective elements forecast verification module
    • Ensemble precipitation probabilistic forecast verification module
    To verify the official city weather forecasting quality of Provincial Meteorological Bureau, we have developed:
    • City elements forecast verification module
    • Public heavy rain forecast verification module
    • City air quality forecast verification module
    To evaluate forecasting quality for each forecaster in NMC, China, we have developed:
    • Off-duty forecaster QPF practice evaluation module
    • QPF evaluation module for forecasters
    • Severe convective weather forecast evaluation module
    • Typhoon track forecast evaluation module for forecasters
    • Disaster warning evaluation module for forecasters
    • Medium and extended period forecast evaluation module
    The further

  17. 40 CFR 1065.550 - Gas analyzer range verification and drift verification.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... 40 Protection of Environment 33 2014-07-01 2014-07-01 false Gas analyzer range verification and drift verification. 1065.550 Section 1065.550 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) AIR POLLUTION CONTROLS ENGINE-TESTING PROCEDURES Performing an Emission Test Over Specified Duty Cycles § 1065.550 Gas analyzer...

  18. Monitoring and verification R&D

    SciTech Connect

    Pilat, Joseph F; Budlong - Sylvester, Kory W; Fearey, Bryan L

    2011-01-01

    The 2010 Nuclear Posture Review (NPR) report outlined the Administration's approach to promoting the agenda put forward by President Obama in Prague on April 5, 2009. The NPR calls for a national monitoring and verification R&D program to meet future challenges arising from the Administration's nonproliferation, arms control and disarmament agenda. Verification of a follow-on to New START may have to address warheads and possibly components along with delivery capabilities. Deeper cuts and disarmament would need to address all of these elements along with nuclear weapon testing, nuclear material and weapon production facilities, virtual capabilities from old weapon and existing energy programs, and undeclared capabilities. We only know how to address some elements of these challenges today, and the requirements may be more rigorous in the context of deeper cuts as well as disarmament. Moreover, there is a critical need for multiple options for sensitive problems and for addressing other challenges. There will be other verification challenges in a world of deeper cuts and disarmament, some of which we are already facing. At some point, if the reductions process is progressing, uncertainties about past nuclear materials and weapons production will have to be addressed. IAEA safeguards will need to continue to evolve to meet current and future challenges, and to take advantage of new technologies and approaches. Transparency/verification of nuclear and dual-use exports will also have to be addressed, and there will be a need to make nonproliferation measures more watertight and transparent. In this context, and recognizing we will face all of these challenges even if disarmament is not achieved, this paper will explore possible agreements and arrangements; verification challenges; gaps in monitoring and verification technologies and approaches; and the R&D required to address these gaps and other monitoring and verification challenges.

  19. Verification of Functional Fault Models and the Use of Resource Efficient Verification Tools

    NASA Technical Reports Server (NTRS)

    Bis, Rachael; Maul, William A.

    2015-01-01

    Functional fault models (FFMs) are a directed graph representation of the failure effect propagation paths within a system's physical architecture and are used to support development and real-time diagnostics of complex systems. Verification of these models is required to confirm that the FFMs are correctly built and accurately represent the underlying physical system. However, a manual, comprehensive verification process applied to the FFMs was found to be error prone due to the intensive and customized process necessary to verify each individual component model and to require a burdensome level of resources. To address this problem, automated verification tools have been developed and utilized to mitigate these key pitfalls. This paper discusses the verification of the FFMs and presents the tools that were developed to make the verification process more efficient and effective.

  20. Ozone Monitoring Instrument geolocation verification

    NASA Astrophysics Data System (ADS)

    Kroon, M.; Dobber, M. R.; Dirksen, R.; Veefkind, J. P.; van den Oord, G. H. J.; Levelt, P. F.

    2008-08-01

    Verification of the geolocation assigned to individual ground pixels as measured by the Ozone Monitoring Instrument (OMI) aboard the NASA EOS-Aura satellite was performed by comparing geophysical Earth surface details as observed in OMI false color images with the high-resolution continental outline vector map as provided by the Interactive Data Language (IDL) software tool from ITT Visual Information Solutions. The OMI false color images are generated from the OMI visible channel by integration over 20-nm-wide spectral bands of the Earth radiance intensity around 484 nm, 420 nm, and 360 nm wavelength per ground pixel. Proportional to the integrated intensity, we assign color values composed of CRT standard red, green, and blue to the OMI ground pixels. Earth surface details studied are mostly high-contrast coast lines where arid land or desert meets deep blue ocean. The IDL high-resolution vector map is based on the 1993 CIA World Database II Map with a 1-km accuracy. Our results indicate that the average OMI geolocation offset over the years 2005-2006 is 0.79 km in latitude and 0.29 km in longitude, with a standard deviation of 1.64 km in latitude and 2.04 km in longitude, respectively. Relative to the OMI nadir pixel size, one obtains mean displacements of ˜6.1% in latitude and ˜1.2% in longitude, with standard deviations of 12.6% and 7.9%, respectively. We conclude that the geolocation assigned to individual OMI ground pixels is sufficiently accurate to support scientific studies of atmospheric features as observed in OMI level 2 satellite data products, such as air quality issues on urban scales or volcanic eruptions and its plumes, that occur on spatial scales comparable to or smaller than OMI nadir pixels.
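    The summary statistics quoted above (mean and standard deviation of per-pixel offsets, plus displacement relative to nadir pixel size) are straightforward to reproduce. The offsets below are synthetic samples, not the OMI dataset; the 13 km nadir pixel dimension is assumed for illustration.

```python
import statistics

# Geolocation-offset summary statistics (synthetic samples, not OMI data).
lat_offsets_km = [0.5, 1.2, -0.8, 2.1, 0.9]   # hypothetical per-pixel offsets
mean = statistics.mean(lat_offsets_km)
std = statistics.stdev(lat_offsets_km)
nadir_pixel_km = 13.0                         # assumed nadir pixel dimension
print(round(mean, 2), round(std, 2))          # 0.78 1.06
print(f"{mean / nadir_pixel_km:.1%}")         # mean displacement vs pixel: 6.0%
```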

  1. Fuel Retrieval System (FRS) Design Verification

    SciTech Connect

    YANOCHKO, R.M.

    2000-01-27

    This document was prepared as part of an independent review to explain design verification activities already completed, and to define the remaining design verification actions for the Fuel Retrieval System. The Fuel Retrieval Subproject was established as part of the Spent Nuclear Fuel Project (SNF Project) to retrieve and repackage the SNF located in the K Basins. The Fuel Retrieval System (FRS) construction work is complete in the KW Basin, and start-up testing is underway. Design modifications and construction planning are also underway for the KE Basin. An independent review of the design verification process as applied to the K Basin projects was initiated in support of preparation for the SNF Project operational readiness review (ORR).

  2. PBL Verification with Radiosonde and Aircraft Data

    NASA Astrophysics Data System (ADS)

    Tsidulko, M.; McQueen, J.; Dimego, G.; Ek, M.

    2008-12-01

    Boundary layer depth is an important characteristic in weather forecasting and a key parameter in air quality modeling, determining the extent of turbulence and dispersion for pollutants. Real-time PBL depths from the NAM (WRF/NMM) model are verified with different types of observations. PBL depth verification is incorporated into the NCEP verification system, including the ability to provide a range of statistical characteristics for the boundary layer heights. For the model, several types of boundary layer definitions are used: PBL height from the TKE scheme, the critical Ri number approach, and mixed layer depth are compared with observations. Observed PBL depths are determined by applying the Ri number approach to radiosonde profiles. A preliminary study of using ACARS data for PBL verification is also conducted.
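    The bulk Richardson number approach applied to radiosonde profiles above can be sketched as follows: compute Rib(z) = g (theta(z) - theta_s)(z - z_s) / (theta_s (u^2 + v^2)) level by level, and take the PBL top as the first level where Rib exceeds a critical value. The profile and the critical value of 0.25 are illustrative assumptions using a simplified dry formulation.

```python
# Bulk-Richardson diagnosis of PBL depth (synthetic profile, simplified dry formula).
G, RI_CRIT = 9.81, 0.25   # gravity (m/s^2); assumed critical Ri

def pbl_height(z, theta, u, v):
    """First level where the bulk Richardson number exceeds RI_CRIT."""
    zs, ths = z[0], theta[0]                     # surface reference level
    for k in range(1, len(z)):
        wind2 = max(u[k] ** 2 + v[k] ** 2, 1e-6)  # avoid division by zero
        rib = G * (theta[k] - ths) * (z[k] - zs) / (ths * wind2)
        if rib > RI_CRIT:
            return z[k]
    return z[-1]

z     = [10, 200, 500, 900, 1300]                 # height, m
theta = [300.0, 300.1, 300.2, 300.4, 302.0]       # potential temperature, K
u     = [2.0, 5.0, 7.0, 8.0, 8.0]                 # wind components, m/s
v     = [0.0, 1.0, 1.0, 1.0, 1.0]
print(pbl_height(z, theta, u, v))  # 1300
```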

  3. Packaged low-level waste verification system

    SciTech Connect

    Tuite, K.T.; Winberg, M.; Flores, A.Y.; Killian, E.W.; McIsaac, C.V.

    1996-08-01

    Currently, states and low-level radioactive waste (LLW) disposal site operators have no method of independently verifying the radionuclide content of packaged LLW that arrives at disposal sites for disposal. At this time, disposal sites rely on LLW generator shipping manifests and accompanying records to ensure that LLW received meets the waste acceptance criteria. An independent verification system would provide a method of checking generator LLW characterization methods and help ensure that LLW disposed of at disposal facilities meets requirements. The Mobile Low-Level Waste Verification System (MLLWVS) provides the equipment, software, and methods to enable the independent verification of LLW shipping records to ensure that disposal site waste acceptance criteria are being met. The MLLWVS system was developed under a cost share subcontract between WMG, Inc., and Lockheed Martin Idaho Technologies through the Department of Energy's National Low-Level Waste Management Program at the Idaho National Engineering Laboratory (INEL).

  4. Land Ice Verification and Validation Kit

    Energy Science and Technology Software Center (ESTSC)

    2015-07-15

    To address a pressing need to better understand the behavior and complex interaction of ice sheets within the global Earth system, significant development of continental-scale, dynamical ice-sheet models is underway. The associated verification and validation process of these models is being coordinated through a new, robust, python-based extensible software package, the Land Ice Verification and Validation toolkit (LIVV). This release provides robust and automated verification and a performance evaluation on LCF platforms. The performance V&V involves a comprehensive comparison of model performance relative to expected behavior on a given computing platform. LIVV operates on a set of benchmark and test data, and provides comparisons for a suite of community prioritized tests, including configuration and parameter variations, bit-4-bit evaluation, and plots of tests where differences occur.

  5. Active alignment/contact verification system

    DOEpatents

    Greenbaum, William M.

    2000-01-01

    A system involving an active (i.e. electrical) technique for the verification of: 1) close-tolerance mechanical alignment between two components, and 2) electrical contact between mating parts through an elastomeric interface. For example, the two components may be an alumina carrier and a printed circuit board, two extremely small, high-density mating parts that require alignment within a fraction of a mil, as well as a specified interface point of engagement between the parts. The system comprises pairs of conductive structures defined in the surface layers of the alumina carrier and the printed circuit board, for example. The first pair of conductive structures relates to item (1) above and permits alignment verification between mating parts. The second pair of conductive structures relates to item (2) above and permits verification of electrical contact between mating parts.

  6. Land Ice Verification and Validation Kit

    SciTech Connect

    2015-07-15

    To address a pressing need to better understand the behavior and complex interaction of ice sheets within the global Earth system, significant development of continental-scale, dynamical ice-sheet models is underway. The associated verification and validation process of these models is being coordinated through a new, robust, Python-based extensible software package, the Land Ice Verification and Validation toolkit (LIVV). This release provides robust and automated verification and a performance evaluation on LCF platforms. The performance V&V involves a comprehensive comparison of model performance relative to expected behavior on a given computing platform. LIVV operates on a set of benchmark and test data, and provides comparisons for a suite of community prioritized tests, including configuration and parameter variations, bit-for-bit evaluation, and plots of tests where differences occur.
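    The bit-for-bit evaluation mentioned above can be illustrated with a minimal Python sketch. The function name, NumPy array representation, and tolerance argument are assumptions for illustration, not LIVV's actual API; the core idea is a strict comparison of a model output field against stored benchmark data, optionally relaxed on platforms where exact reproducibility is not expected.

    ```python
    import numpy as np

    def bit_for_bit(model_output, benchmark, abs_tol=0.0):
        """Compare a model output field against a stored benchmark field.

        Returns (passed, max_abs_diff). With abs_tol=0.0 this is a strict
        bit-for-bit check; a small positive tolerance relaxes it.
        (Hypothetical sketch -- not LIVV's real interface.)
        """
        model_output = np.asarray(model_output, dtype=float)
        benchmark = np.asarray(benchmark, dtype=float)
        if model_output.shape != benchmark.shape:
            # Shape mismatch can never pass a bit-for-bit comparison.
            return False, float("inf")
        diff = np.abs(model_output - benchmark)
        max_diff = float(diff.max()) if diff.size else 0.0
        return max_diff <= abs_tol, max_diff
    ```

    Reporting the maximum absolute difference alongside the pass/fail flag is useful in practice: when a platform change breaks reproducibility, the magnitude of the difference tells the developer whether the change is roundoff-level or a genuine regression.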

  7. Critical Surface Cleaning and Verification Alternatives

    NASA Technical Reports Server (NTRS)

    Melton, Donald M.; McCool, A. (Technical Monitor)

    2000-01-01

    As a result of federal and state requirements, historical critical cleaning and verification solvents such as Freon 113, Freon TMC, and Trichloroethylene (TCE) are either highly regulated or no longer available. Interim replacements such as HCFC 225 have been qualified; however, toxicity and future phase-out regulations necessitate long term solutions. The scope of this project was to qualify a safe and environmentally compliant LOX surface verification alternative to Freon 113, TCE and HCFC 225. The main effort was focused on initiating the evaluation and qualification of HCFC 225G as an alternate LOX verification solvent. The project was scoped in FY 99/00 to perform LOX compatibility, cleaning efficiency and qualification on flight hardware.

  8. GHG MITIGATION TECHNOLOGY PERFORMANCE EVALUATIONS UNDERWAY AT THE GHG TECHNOLOGY VERIFICATION CENTER

    EPA Science Inventory

    The paper outlines the verification approach and activities of the Greenhouse Gas (GHG) Technology Verification Center, one of 12 independent verification entities operating under the U.S. EPA-sponsored Environmental Technology Verification (ETV) program. (NOTE: The ETV program...

  9. Proton Therapy Verification with PET Imaging

    PubMed Central

    Zhu, Xuping; Fakhri, Georges El

    2013-01-01

    Proton therapy is very sensitive to uncertainties introduced during treatment planning and dose delivery. PET imaging of proton induced positron emitter distributions is the only practical approach for in vivo, in situ verification of proton therapy. This article reviews the current status of proton therapy verification with PET imaging. The different data detecting systems (in-beam, in-room and off-line PET), calculation methods for the prediction of proton induced PET activity distributions, and approaches for data evaluation are discussed. PMID:24312147
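    A common way to evaluate such PET data against a prediction is to estimate the longitudinal shift between the predicted and measured depth-activity profiles. The sketch below uses cross-correlation for this; the function name, millimetre binning, and NumPy-based approach are illustrative assumptions rather than a method from the review.

    ```python
    import numpy as np

    def range_shift_mm(predicted, measured, bin_mm=1.0):
        """Estimate the longitudinal shift between a predicted and a measured
        depth-activity profile via cross-correlation, in millimetres.

        A nonzero shift flags a possible deviation between the planned and
        delivered proton range. (Hypothetical sketch -- real evaluations use
        more robust measures, e.g. the distal activity falloff position.)
        """
        predicted = np.asarray(predicted, dtype=float)
        measured = np.asarray(measured, dtype=float)
        # Subtract the mean so the correlation peak tracks profile shape,
        # not overall activity level.
        corr = np.correlate(measured - measured.mean(),
                            predicted - predicted.mean(), mode="full")
        lag = int(np.argmax(corr)) - (len(predicted) - 1)
        return lag * bin_mm
    ```

    In `mode="full"`, zero lag sits at index `len(predicted) - 1` of the correlation output, so subtracting that offset converts the argmax index into a signed shift in bins.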

  10. Design and ground verification of proximity operations

    NASA Astrophysics Data System (ADS)

    Tobias, A.; Ankersen, F.; Fehse, W.; Pauvert, C.; Pairot, J.

    This paper describes the approach to guidance, navigation, and control (GNC) design and verification for proximity operations. The most critical part of the rendezvous mission is the proximity operations phase, when the distance between chaser and target is below approximately 20 m. Safety is the overriding consideration in the design of the GNC system. Requirements on the GNC system also stem from the allocation of performance between proximity operations and the mating process, docking, or capture for berthing. Whereas the design process follows a top-down approach, the verification process goes bottom-up in a stepwise way according to the development stage.
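    The safety-first design philosophy in the final approach phase is often realized as a range-dependent closing-rate constraint: the closer the chaser gets, the slower it must approach. The sketch below illustrates one such guard; the function name, rate fraction, and floor value are invented for illustration and are not taken from the paper.

    ```python
    def approach_safe(range_m, closing_rate_mps,
                      max_rate_per_range=0.01, min_rate_mps=0.002):
        """Range-dependent closing-rate guard for final approach (< 20 m).

        The chaser may not close faster than a fixed fraction of the
        remaining range per second, with a small floor so the approach can
        still complete. (Hypothetical parameter values for illustration.)
        """
        limit = max(max_rate_per_range * range_m, min_rate_mps)
        # Negative closing rates (opening) are also flagged: the guard only
        # certifies a controlled, forward approach.
        return 0.0 <= closing_rate_mps <= limit
    ```

    A monitor of this form supports the bottom-up verification described above: it can be unit-tested in isolation, then exercised in closed-loop simulation, and finally checked against flight-like hardware in the loop.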