Science.gov

Sample records for enhancing CTBT verification

  1. Visual inspection for CTBT verification

    SciTech Connect

    Hawkins, W.; Wohletz, K.

    1997-03-01

On-site visual inspection will play an essential role in future Comprehensive Test Ban Treaty (CTBT) verification. Although seismic and remote sensing techniques are the best understood and most developed methods for detection of evasive testing of nuclear weapons, visual inspection can greatly augment the certainty and detail of understanding provided by these more traditional methods. Not only can visual inspection offer "ground truth" in cases of suspected nuclear testing, but it also can provide accurate source location and testing media properties necessary for detailed analysis of seismic records. For testing in violation of the CTBT, an offending party may attempt to conceal the test, which most likely will be achieved by underground burial. While such concealment may not prevent seismic detection, evidence of test deployment, location, and yield can be disguised. In this light, if a suspicious event is detected by seismic or other remote methods, visual inspection of the event area is necessary to document any evidence that might support a claim of nuclear testing and provide data needed to further interpret seismic records and guide further investigations. However, the methods for visual inspection are neither widely known nor appreciated, and experience is presently limited. Visual inspection can be achieved by simple, non-intrusive means, primarily geological in nature, and it is the purpose of this report to describe the considerations, procedures, and equipment required to field such an inspection.

  2. CTBT integrated verification system evaluation model supplement

    SciTech Connect

    EDENBURN,MICHAEL W.; BUNTING,MARCUS; PAYNE JR.,ARTHUR C.; TROST,LAWRENCE C.

    2000-03-02

Sandia National Laboratories has developed a computer-based model called IVSEM (Integrated Verification System Evaluation Model) to estimate the performance of a nuclear detonation monitoring system. The IVSEM project was initiated in June 1994, by Sandia's Monitoring Systems and Technology Center and has been funded by the U.S. Department of Energy's Office of Nonproliferation and National Security (DOE/NN). IVSEM is a simple, top-level, modeling tool which estimates the performance of a Comprehensive Nuclear Test Ban Treaty (CTBT) monitoring system and can help explore the impact of various sensor system concepts and technology advancements on CTBT monitoring. One of IVSEM's unique features is that it integrates results from the various CTBT sensor technologies (seismic, infrasound, radionuclide, and hydroacoustic) and allows the user to investigate synergy among the technologies. Specifically, IVSEM estimates the detection effectiveness (probability of detection), location accuracy, and identification capability of the integrated system and of each technology subsystem individually. The model attempts to accurately estimate the monitoring system's performance at medium interfaces (air-land, air-water) and for some evasive testing methods such as seismic decoupling. The original IVSEM report, CTBT Integrated Verification System Evaluation Model, SAND97-2518, described version 1.2 of IVSEM. This report describes the changes made to IVSEM version 1.2 and the addition of identification capability estimates that have been incorporated into IVSEM version 2.0.
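The integration idea described here can be sketched in a few lines. This is not IVSEM itself, only an illustration of how per-technology detection probabilities combine under an independence assumption; the technology names are from the abstract, but the probability values are hypothetical.

```python
def combined_detection_probability(p_detect):
    """P(at least one technology detects) = 1 - prod(1 - p_i),
    assuming the technologies detect independently."""
    p_miss = 1.0
    for p in p_detect:
        p_miss *= (1.0 - p)
    return 1.0 - p_miss

# Hypothetical per-technology detection probabilities for one event:
p_by_technology = {
    "seismic": 0.90,
    "infrasound": 0.40,
    "hydroacoustic": 0.10,
    "radionuclide": 0.55,
}

p_total = combined_detection_probability(p_by_technology.values())
print(f"integrated P(detect) = {p_total:.4f}")  # 0.9757
```

The synergy the abstract mentions shows up here directly: the integrated probability exceeds that of any single technology.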

  3. CTBT Integrated Verification System Evaluation Model

    SciTech Connect

    Edenburn, M.W.; Bunting, M.L.; Payne, A.C. Jr.

    1997-10-01

Sandia National Laboratories has developed a computer-based model called IVSEM (Integrated Verification System Evaluation Model) to estimate the performance of a nuclear detonation monitoring system. The IVSEM project was initiated in June 1994, by Sandia's Monitoring Systems and Technology Center and has been funded by the US Department of Energy's Office of Nonproliferation and National Security (DOE/NN). IVSEM is a simple, top-level, modeling tool which estimates the performance of a Comprehensive Nuclear Test Ban Treaty (CTBT) monitoring system and can help explore the impact of various sensor system concepts and technology advancements on CTBT monitoring. One of IVSEM's unique features is that it integrates results from the various CTBT sensor technologies (seismic, infrasound, radionuclide, and hydroacoustic) and allows the user to investigate synergy among the technologies. Specifically, IVSEM estimates the detection effectiveness (probability of detection) and location accuracy of the integrated system and of each technology subsystem individually. The model attempts to accurately estimate the monitoring system's performance at medium interfaces (air-land, air-water) and for some evasive testing methods such as seismic decoupling. This report describes version 1.2 of IVSEM.

  4. Completing and sustaining IMS network for the CTBT Verification Regime

    NASA Astrophysics Data System (ADS)

    Meral Ozel, N.

    2015-12-01

The CTBT International Monitoring System will comprise 337 facilities located all over the world for the purpose of detecting and locating nuclear test explosions. Major challenges remain, namely the completion of the network, where most of the remaining stations face environmental, logistical and/or political issues to surmount (89% of the stations have already been built), and the sustainment of a reliable and state-of-the-art network covering four technologies: seismic, infrasound, hydroacoustic and radionuclide. To have a credible and trustworthy verification system ready for entry into force of the Treaty, the CTBTO is protecting and enhancing its investment in its global network of stations and is providing effective data to the International Data Centre (IDC) and Member States. Regarding the protection of the CTBTO's investment and enhanced sustainment of IMS station operations, the IMS Division is enhancing the capabilities of the monitoring system by applying advances in instrumentation and introducing new software applications that are fit for purpose. Some examples are the development of noble gas laboratory systems to process and analyse subsoil samples, development of a mobile noble gas system for onsite inspection purposes, optimization of Beta Gamma detectors for Xenon detection, assessing and improving the efficiency of wind noise reduction systems for infrasound stations, development and testing of infrasound stations with a self-calibrating capability, and research into the use of modular designs for the hydroacoustic network.

  5. Ground-based visual inspection for CTBT verification

    SciTech Connect

    Hawkins, W.; Wohletz, K.

    1997-11-01

Ground-based visual inspection will play an essential role in On-Site Inspection (OSI) for Comprehensive Test Ban Treaty (CTBT) verification. Although seismic and remote sensing techniques are the best understood and most developed methods for detection of evasive testing of nuclear weapons, visual inspection will greatly augment the certainty and detail of understanding provided by these more traditional methods. Not only can ground-based visual inspection offer effective documentation in cases of suspected nuclear testing, but it also can provide accurate source location and testing media properties necessary for detailed analysis of seismic records. For testing in violation of the CTBT, an offending state may attempt to conceal the test, which most likely will be achieved by underground burial. While such concealment may not prevent seismic detection, evidence of test deployment, location, and yield can be disguised. In this light, if a suspicious event is detected by seismic or other remote methods, visual inspection of the event area is necessary to document any evidence that might support a claim of nuclear testing and provide data needed to further interpret seismic records and guide further investigations. However, the methods for visual inspection are neither widely known nor appreciated, and experience is presently limited. Visual inspection can be achieved by simple, non-intrusive means, primarily geological in nature, and it is the purpose of this report to describe the considerations, procedures, and equipment required to field such an inspection. The inspections will be carried out by inspectors from member States of the CTBT Organization.

  6. Tracking the Fukushima releases: from environmental monitoring to a showcase of CTBT verification

    NASA Astrophysics Data System (ADS)

    Steinhauser, Georg

    2013-04-01

In the course of the Fukushima nuclear accident large amounts of anthropogenic radionuclides relevant to the Comprehensive Nuclear-Test-Ban Treaty (CTBT) were released and detected globally. Our group participated in the large European monitoring campaign and tracked fission products in various environmental media in Austria. We could show that the intake of environmental I-131 into the thyroids of wild animals can be used for verification of the CTBT. Due to continuous and highly specific accumulation of I-131, its apparent half-life in the thyroid biomonitor exceeds the physical one, thus making I-131 detectable three weeks longer than using conventional CTBT-grade high volume air samplers. This means an increase in sensitivity of almost one order of magnitude compared with conventional systems. In a second campaign we analysed the large data set of Japanese food analyses. Food was regarded as a geographically well localized environmental sample. The objective of this study was to determine the radiocesium activity ratio (Cs-134/137) in foods from each geographic area to possibly identify the radioactive signature of the four different reactors (i.e. four independent sources) in the distinct regions. No clear deviations from the average value (0.98) could be confirmed in the various regions. Hence, the releases from reactor No. 4 (carrying a significantly smaller activity ratio) are assumed to be small when compared with the other three reactor releases. The individual radioisotopic signatures of reactors No. 1, 2, and 3 could not be identified in various Japanese regions using the food samples, indicating integral radiocesium contamination from these sources.
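Comparing Cs-134/Cs-137 ratios across samples measured on different dates requires a decay correction, since Cs-134 decays much faster than Cs-137. The sketch below shows that correction under standard half-life values; the measured activities and the half-year delay are hypothetical, not data from the study.

```python
import math

T_HALF_CS134 = 2.0652   # years
T_HALF_CS137 = 30.08    # years

def ratio_at_reference(a134, a137, dt_years):
    """Back-correct a measured Cs-134/Cs-137 activity ratio to a
    reference date dt_years before the measurement (e.g. the release
    date), using A(t0) = A(t) * exp(+lambda * dt)."""
    lam134 = math.log(2) / T_HALF_CS134
    lam137 = math.log(2) / T_HALF_CS137
    return (a134 * math.exp(lam134 * dt_years)) / (a137 * math.exp(lam137 * dt_years))

# Hypothetical sample measured half a year after the release:
print(round(ratio_at_reference(83.0, 100.0, 0.5), 3))
```

Only ratios corrected to a common reference date can be compared against the campaign-wide average the abstract cites.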

  7. CTBT technical issues handbook

    SciTech Connect

    Zucca, J.J.

    1994-05-01

The purpose of this handbook is to give the nonspecialist in nuclear explosion physics and nuclear test monitoring an introduction to the topic as it pertains to a Comprehensive Test Ban Treaty (CTBT). The authors have tried to make the handbook visually oriented, with figures paired to short discussions. As such, the handbook may be read straight through or in sections. The handbook covers five main areas and ends with a glossary, which includes both scientific terms and acronyms likely to be encountered during CTBT negotiations. The following topics are covered: (1) Physics of nuclear explosion experiments. This is a description of basic nuclear physics and elementary nuclear weapon design. Also discussed are testing practices. (2) Other nuclear experiments. This section discusses experiments that produce small amounts of nuclear energy but differ from the explosion experiments discussed in the first chapter. This includes the types of activities, such as laser fusion, that would continue after a CTBT is in force. (3) Monitoring tests in various environments. This section describes the different physical environments in which a test could be conducted (underground, in the atmosphere, in space, underwater, and in the laboratory); the sources of non-nuclear events (such as earthquakes and mining operations); and the opportunities for evasion. (4) On-site inspections. A CTBT is likely to include these inspections as an element of the verification provisions, in order to resolve the nature of ambiguous events. This chapter describes some technical considerations and technologies that are likely to be useful. (5) Selecting verification measures. This chapter discusses the uncertain nature of the evidence from monitoring systems and how compliance judgments could be made, taking the uncertainties into account. It also discusses how to allocate monitoring resources, given the likelihood of testing by various countries in various environments.

  8. CTBT on-site inspections

    NASA Astrophysics Data System (ADS)

    Zucca, J. J.

    2014-05-01

    On-site inspection (OSI) is a critical part of the verification regime for the Comprehensive Nuclear-Test-Ban Treaty (CTBT). The OSI verification regime provides for international inspectors to make a suite of measurements and observations on site at the location of an event of interest. The other critical component of the verification regime is the International Monitoring System (IMS), which is a globally distributed network of monitoring stations. The IMS along with technical monitoring data from CTBT member countries, as appropriate, will be used to trigger an OSI. After the decision is made to carry out an OSI, it is important for the inspectors to deploy to the field site rapidly to be able to detect short-lived phenomena such as the aftershocks that may be observable after an underground nuclear explosion. The inspectors will be on site from weeks to months and will be working with many tens of tons of equipment. Parts of the OSI regime will be tested in a field exercise in the country of Jordan late in 2014. The build-up of the OSI regime has been proceeding steadily since the CTBT was signed in 1996 and is on track to becoming a deterrent to someone considering conducting a nuclear explosion in violation of the Treaty.

  9. CTBT on-site inspections

    SciTech Connect

    Zucca, J. J.

    2014-05-09

    On-site inspection (OSI) is a critical part of the verification regime for the Comprehensive Nuclear-Test-Ban Treaty (CTBT). The OSI verification regime provides for international inspectors to make a suite of measurements and observations on site at the location of an event of interest. The other critical component of the verification regime is the International Monitoring System (IMS), which is a globally distributed network of monitoring stations. The IMS along with technical monitoring data from CTBT member countries, as appropriate, will be used to trigger an OSI. After the decision is made to carry out an OSI, it is important for the inspectors to deploy to the field site rapidly to be able to detect short-lived phenomena such as the aftershocks that may be observable after an underground nuclear explosion. The inspectors will be on site from weeks to months and will be working with many tens of tons of equipment. Parts of the OSI regime will be tested in a field exercise in the country of Jordan late in 2014. The build-up of the OSI regime has been proceeding steadily since the CTBT was signed in 1996 and is on track to becoming a deterrent to someone considering conducting a nuclear explosion in violation of the Treaty.

  10. Proceedings of the 22nd Annual DoD/DOE Seismic Research Symposium: Planning for Verification of and Compliance with the Comprehensive Nuclear-Test-Ban Treaty (CTBT)

    SciTech Connect

    Nichols, James W., LTC

    2000-09-15

    These proceedings contain papers prepared for the 22nd Annual DoD/DOE Seismic Research Symposium: Planning for Verification of and Compliance with the Comprehensive Nuclear-Test-Ban Treaty (CTBT), held 13-15 September 2000 in New Orleans, Louisiana. These papers represent the combined research related to ground-based nuclear explosion monitoring funded by the National Nuclear Security Administration (NNSA), Defense Threat Reduction Agency (DTRA), Air Force Technical Applications Center (AFTAC), Department of Defense (DoD), US Army Space and Missile Defense Command, Defense Special Weapons Agency (DSWA), and other invited sponsors. The scientific objectives of the research are to improve the United States capability to detect, locate, and identify nuclear explosions. The purpose of the meeting is to provide the sponsoring agencies, as well as potential users, an opportunity to review research accomplished during the preceding year and to discuss areas of investigation for the coming year. For the researchers, it provides a forum for the exchange of scientific information toward achieving program goals, and an opportunity to discuss results and future plans. Paper topics include: seismic regionalization and calibration; detection and location of sources; wave propagation from source to receiver; the nature of seismic sources, including mining practices; hydroacoustic, infrasound, and radionuclide methods; on-site inspection; and data processing.

  11. The DOE CTBT R&D effort at Livermore: calibrating to enhance international monitoring for clandestine nuclear explosions

    SciTech Connect

    Myers, S; Harris, D; Mayeda, K; Rodgers, A; Schultz, C; Walters, W; Zucca, J

    1999-04-01

The Comprehensive Nuclear-Test-Ban Treaty (CTBT), which was signed in 1996 and still needs to be ratified by the US, forbids all nuclear tests and creates an international monitoring system (IMS) to search for evidence of clandestine nuclear explosions. As specified in the treaty, the IMS will consist of 170 seismic stations that record underground elastic waves, 60 infrasound stations to record low-frequency sound waves in the air, 11 hydroacoustic stations to record underwater sound waves, and 80 radionuclide stations to record airborne radionuclide gases or particles. The International Data Center (IDC), located in Vienna, receives data from the IMS system and applies standard event screening criteria to any detected events with the objective of characterizing and highlighting events considered to be consistent with natural phenomena or with non-nuclear man-made phenomena. The National Data Center (NDC) for each country must go a step further than the IDC and identify events as consistent with natural phenomena, non-nuclear man-made phenomena, or a banned nuclear test using these monitoring technologies.

  12. Technology Innovation for the CTBT, the National Laboratory Contribution

    NASA Astrophysics Data System (ADS)

    Goldstein, W. H.

    2016-12-01

    The Comprehensive Nuclear-Test-Ban Treaty (CTBT) and its Protocol are the result of a long history of scientific engagement and international technical collaboration. The U.S. Department of Energy National Laboratories have been conducting nuclear explosive test-ban research for over 50 years and have made significant contributions to this legacy. Recent examples include the RSTT (regional seismic travel time) computer code and the Smart Sampler—both of these products are the result of collaborations among Livermore, Sandia, Los Alamos, and Pacific Northwest National Laboratories. The RSTT code enables fast and accurate seismic event locations using regional data. This code solves the long-standing problem of using teleseismic and regional seismic data together to locate events. The Smart Sampler is designed for use in On-site Inspections to sample soil gases to look for noble gas fission products from a potential underground nuclear explosive test. The Smart Sampler solves the long-standing problem of collecting soil gases without contaminating the sample with gases from the atmosphere by operating only during atmospheric low-pressure events. Both these products are being evaluated by the Preparatory Commission for the CTBT Organization and the international community. In addition to R&D, the National Laboratories provide experts to support U.S. policy makers in ongoing discussions such as CTBT Working Group B, which sets policy for the development of the CTBT monitoring and verification regime.

  13. Optimizing master event templates for CTBT monitoring with dimensionality reduction techniques: real waveforms vs. synthetics.

    NASA Astrophysics Data System (ADS)

    Rozhkov, Mikhail; Bobrov, Dmitry; Kitov, Ivan

    2014-05-01

The Master Event technique is a powerful tool for Expert Technical Analysis within the CTBT framework as well as for real-time monitoring with the waveform cross-correlation (CC) (match filter) approach. The primary goal of CTBT monitoring is detection and location of nuclear explosions. Therefore, cross-correlation monitoring should be focused on finding such events. The use of physically adequate waveform templates may significantly increase the number of valid, both natural and manmade, events in the Reviewed Event Bulletin (REB) of the International Data Centre. Inadequate templates for master events may increase the number of CTBT-irrelevant events in the REB and reduce the sensitivity of the CC technique to valid events. In order to cover the entire Earth, including vast aseismic territories, with CC-based nuclear test monitoring, we conducted thorough research and defined the most appropriate real and synthetic master events representing underground explosion sources. A procedure was developed for optimizing the master event template simulation and for narrowing the classes of CC templates used in the detection and location process, based on principal and independent component analysis (PCA and ICA). Actual waveforms and metadata from the DTRA Verification Database were used to validate our approach. The detection and location results based on real and synthetic master events were compared. The prototype of the CC-based Global Grid monitoring system developed in the IDC during the last year was populated with different hybrid waveform templates (synthetics, synthetics components, and real components) and its performance was assessed with the world seismicity data flow, including the DPRK-2013 event. The specific features revealed in this study for the P-waves from the DPRK underground nuclear explosions (UNEs) can reduce the global detection threshold of seismic monitoring under the CTBT by 0.5 units of magnitude. This corresponds to the reduction in the test yield by a
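The match-filter idea behind master-event detection can be sketched compactly: slide a template over a continuous trace and flag windows whose normalized cross-correlation exceeds a threshold. This is only an illustration of the technique in pure NumPy; the synthetic "trace" below is generated noise with a buried template copy, not IMS waveform data.

```python
import numpy as np

def cc_detect(trace, template, threshold=0.8):
    """Return (index, cc) for every window whose normalized
    cross-correlation with the template exceeds the threshold."""
    n = len(template)
    t = template - template.mean()
    t /= np.linalg.norm(t)
    hits = []
    for i in range(len(trace) - n + 1):
        w = trace[i:i + n] - trace[i:i + n].mean()
        norm = np.linalg.norm(w)
        if norm == 0:
            continue
        cc = float(np.dot(w / norm, t))
        if cc >= threshold:
            hits.append((i, cc))
    return hits

rng = np.random.default_rng(0)
template = np.sin(np.linspace(0, 6 * np.pi, 60))
trace = 0.1 * rng.standard_normal(500)
trace[200:260] += template          # bury a copy of the template

hits = cc_detect(trace, template)
best = max(hits, key=lambda h: h[1])
print(best[0])                      # strongest detection at/near sample 200
```

Real systems correlate in the frequency domain for speed and stack correlations over array channels, but the detection statistic is the same normalized dot product.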

  14. Scientific Meetings Database: A New Tool for CTBT-Related International Cooperation

    SciTech Connect

    Knapik, Jerzy F.; Girven, Mary L.

    1999-08-20

The mission of international cooperation is defined in the Comprehensive Nuclear-Test-Ban Treaty (CTBT). Ways and means of implementation were the subject of discussion during the International Cooperation Workshop held in Vienna in November 1998, and during the Regional Workshop for CTBTO International Cooperation held in Cairo, Egypt in June 1999. In particular, a database of ''Scientific and Technical Meetings Directly or Indirectly Related to CTBT Verification-Related Technologies'' was developed by the CTBTO PrepCom/PTS/International Cooperation section and integrated into the organization's various web sites in cooperation with the U.S. Department of Energy CTBT Research and Development Program. This database, the structure and use of which are described in this paper/presentation, is meant to assist the CTBT-related scientific community in identifying worldwide expertise in the CTBT verification-related technologies and should help experts, particularly those of less technologically advanced States Signatories, to strengthen contacts and to pursue international cooperation under the Treaty regime. Specific opportunities for international cooperation, in particular those provided by active participation in the use and further development of this database, are presented in this paper and/or presentation.

  15. Modeling and Verification of Privacy Enhancing Protocols

    NASA Astrophysics Data System (ADS)

    Suriadi, Suriadi; Ouyang, Chun; Smith, Jason; Foo, Ernest

    Privacy enhancing protocols (PEPs) are a family of protocols that allow secure exchange and management of sensitive user information. They are important in preserving users' privacy in today's open environment. Proof of the correctness of PEPs is necessary before they can be deployed. However, the traditional provable security approach, though well established for verifying cryptographic primitives, is not applicable to PEPs. We apply the formal method of Coloured Petri Nets (CPNs) to construct an executable specification of a representative PEP, namely the Private Information Escrow Bound to Multiple Conditions Protocol (PIEMCP). Formal semantics of the CPN specification allow us to reason about various security properties of PIEMCP using state space analysis techniques. This investigation provides us with preliminary insights for modeling and verification of PEPs in general, demonstrating the benefit of applying the CPN-based formal approach to proving the correctness of PEPs.

  16. Three years of operational experience from Schauinsland CTBT monitoring station.

    PubMed

    Zähringer, M; Bieringer, J; Schlosser, C

    2008-04-01

Data from three years of operation of a low-level aerosol sampler and analyzer (RASA) at Schauinsland monitoring station are reported. The system is part of the International Monitoring System (IMS) for verification of the Comprehensive Nuclear-Test-Ban Treaty (CTBT). The fully automatic system is capable of measuring aerosol-borne gamma emitters with high sensitivity and routinely quantifies 7Be and 212Pb. The system achieved a high data availability of 90% within the reporting period. A daily screening process rendered 66 tentative identifications of verification-relevant radionuclides since the system entered IMS operation in February 2004. Two of these were real events and associated with a plausible source. The remaining 64 cases can consistently be explained by detector background and statistical phenomena. Inter-comparison with data from a weekly sampler operated at the same station shows instabilities of the calibration during the test phase and a good agreement since certification of the system.
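The distinction between a "tentative identification" and mere background fluctuation is typically drawn with a decision threshold on the net count of a gamma line. The sketch below uses a Currie-style critical limit under a Poisson background assumption; it is a generic illustration, not the RASA screening algorithm, and the counts are hypothetical.

```python
import math

def decision_threshold(background_counts, k=1.645):
    """Currie critical limit L_C = k * sqrt(2B) for a gross-minus-
    background measurement; k = 1.645 gives roughly a 5% false-alarm
    rate per line tested, so occasional tentative hits from pure
    background are expected."""
    return k * math.sqrt(2.0 * background_counts)

# Hypothetical gamma-line counts in one daily spectrum:
gross, background = 460.0, 400.0
net = gross - background
lc = decision_threshold(background)
print(net > lc)   # True -> flag as tentative identification for review
```

Testing many nuclide lines every day at this false-alarm rate makes a few dozen statistical "identifications" over several years unsurprising, which matches the 64 background-explained cases reported above.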

  17. Enhanced Cancelable Biometrics for Online Signature Verification

    NASA Astrophysics Data System (ADS)

    Muramatsu, Daigo; Inuma, Manabu; Shikata, Junji; Otsuka, Akira

Cancelable approaches for biometric person authentication have been studied to protect enrolled biometric data, and several algorithms have been proposed. One drawback of cancelable approaches is that their performance is inferior to that of non-cancelable approaches. In this paper, we propose a scheme to improve the performance of a cancelable approach for online signature verification. Our scheme generates two cancelable datasets from one raw dataset and uses them for verification. Preliminary experiments were performed using a distance-based online signature verification algorithm. The experimental results show that our proposed scheme is promising.

  18. Observation on the CTBT and Nonproliferation

    SciTech Connect

    Shotts, W.J.

    2000-07-19

    The CTBT sits in a broad national security context. The stated purpose of the treaty is to ban nuclear testing and thereby slow nuclear proliferation. However, it also heightens issues of concern for U.S. national security related to stockpile stewardship, worldwide monitoring, and the status of other countries' nuclear weapons programs. These issues were recognized during the negotiation of the CTBT and articulated, in August 1995, as the set of safeguards under which the U.S. would be willing to sign a CTBT. Safeguards A, B, C, and F address maintenance of the U.S. nuclear stockpile, Safeguard D addresses improved monitoring capabilities, and Safeguard E addresses the need to be knowledgeable about foreign nuclear programs.

  19. Construction of a Shallow Underground Low-background Detector for a CTBT Radionuclide Laboratory

    SciTech Connect

    Forrester, Joel B.; Greenwood, Lawrence R.; Miley, Harry S.; Myers, Allan W.; Overman, Cory T.

    2013-05-01

    The International Monitoring System (IMS) is a verification component of the Comprehensive Nuclear-Test-Ban Treaty (CTBT), and in addition to a series of radionuclide monitoring stations, contains sixteen radionuclide laboratories capable of verification of radionuclide station measurements. This paper presents an overview of a new commercially obtained low-background detector system for radionuclide aerosol measurements recently installed in a shallow (>30 meters water equivalent) underground clean-room facility at Pacific Northwest National Laboratory. Specifics such as low-background shielding materials, active shielding methods, and improvements in sensitivity to IMS isotopes will be covered.

  20. Scenario details of NPE 2012 - Independent performance assessment by simulated CTBT violation

    NASA Astrophysics Data System (ADS)

    Gestermann, N.; Bönnemann, C.; Ceranna, L.; Ross, O.; Schlosser, C.

    2012-04-01

The Comprehensive Nuclear-Test-Ban Treaty (CTBT) was opened for signature on 24 September 1996. The technical preparations for monitoring CTBT compliance are moving ahead rapidly and many efforts have been made since then to establish the verification system. In that regard the two underground nuclear explosions conducted by the Democratic People's Republic of Korea in 2006 and 2009 were the first real performance tests of the system. In the light of these events National Data Centres (NDCs) realized the need to become more familiar with the details of the verification regime. The idea of an independent annual exercise to evaluate the processing and analysis procedures applied at the National Data Centres of the CTBT was born at the NDC Evaluation Workshop in Kiev, Ukraine, 2006. The exercises should simulate a fictitious violation of the CTBT, and all NDCs are invited to clarify the nature of the selected event. This exercise should help to evaluate the effectiveness of procedures applied at NDCs, as well as the quality, completeness, and usefulness of IDC products. Moreover, the National Data Centres Preparedness Exercise (NPE) is a measure of the readiness of the NDCs to fulfill their duties with regard to CTBT verification, i.e., treaty-compliance judgments about the nature of events as natural or artificial and chemical or nuclear, respectively. NPEs proved to be an efficient indicative tool for testing the performance of the verification system and its elements. In 2007 and 2008 the exercises were focused on seismic waveform data analysis. Since 2009 the analysis of infrasound data has been included and additional attention was given to the radionuclide component. In 2010 a realistic noble gas release scenario was selected as the trigger event which could be expected after an underground nuclear test. The epicenter location of an event from the Reviewed Event Bulletin (REB), unknown for participants of the exercise, was selected as the source of the noble gas

  1. Enhanced Verification Test Suite for Physics Simulation Codes

    SciTech Connect

    Kamm, J R; Brock, J S; Brandon, S T; Cotrell, D L; Johnson, B; Knupp, P; Rider, W; Trucano, T; Weirs, V G

    2008-10-10

    sophistication or other physics regimes (e.g., energetic material response, magneto-hydrodynamics), would represent a scientifically desirable complement to the fundamental test cases discussed in this report. The authors believe that this document can be used to enhance the verification analyses undertaken at the DOE WP Laboratories and, thus, to improve the quality, credibility, and usefulness of the simulation codes that are analyzed with these problems.

  2. Technical Readiness of CTBT Monitoring Regime and Next Steps for Entry into Force

    NASA Astrophysics Data System (ADS)

    Zerbo, L.

    2016-12-01

2016 marks the 20th anniversary of the opening for signature of the Comprehensive Nuclear-Test-Ban Treaty (CTBT). The strength of the CTBT lies in the fact that it is supported by a solid and proven verification regime. Made up of 337 monitoring facilities (using seismic, hydroacoustic, infrasound and radionuclide technologies) as well as 250 communication facilities, the International Monitoring System has global reach and is approximately 90% complete. Significant opportunities for the further build-up of the IMS network are being realized and technological advancements are being pursued in earnest. This represents an unprecedented achievement in the history of multilateral verification. The timely detection of the four announced nuclear tests by the Democratic People's Republic of Korea is proof of the effectiveness and utility of our data. Following the most recent DPRK nuclear test in January of this year, States were provided with automated and reviewed data on the event well within the short timelines provided for by the Treaty. Despite the advances in monitoring and detection capability, the CTBT has not yet become international law. The Treaty has 183 States Signatories and 164 ratifying States, but will not enter into force until the remaining 8 so-called "Annex 2" States complete their ratification procedures. This presentation will reflect on the de facto benefits that the Treaty has already brought, but also on the challenges that remain, both in terms of ensuring that the verification regime keeps pace with developments in science and technology and in securing a de jure global ban on nuclear testing.

  3. The data dictionary: A view into the CTBT knowledge base

    SciTech Connect

    Shepherd, E.R.; Keyser, R.G.; Armstrong, H.M.

    1997-08-01

    The data dictionary for the Comprehensive Test Ban Treaty (CTBT) knowledge base provides a comprehensive, current catalog of the projected contents of the knowledge base. It is written from a data definition view of the knowledge base and therefore organizes information in a fashion that allows logical storage within the computer. The data dictionary introduces two organizational categories of data: the datatype, which is a broad, high-level category of data, and the dataset, which is a specific instance of a datatype. The knowledge base, and thus the data dictionary, consist of a fixed, relatively small number of datatypes, but new datasets are expected to be added on a regular basis. The data dictionary is a tangible result of the design effort for the knowledge base and is intended to be used by anyone who accesses the knowledge base for any purpose, such as populating the knowledge base with data, accessing the data for use with automatic data processing (ADP) routines, or browsing through the data for verification purposes. For these reasons, it is important to discuss the development of the data dictionary as well as to describe its contents to better understand its usefulness; that is the purpose of this paper.
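
The datatype/dataset distinction described above can be sketched as a small catalog structure. The following is a hypothetical Python illustration, not the actual CTBT knowledge base schema; all names (Datatype, Dataset, the travel-time example) are invented.

```python
from dataclasses import dataclass

# Hypothetical sketch of the datatype/dataset distinction: a fixed, small
# set of broad datatypes, with new datasets registered over time.

@dataclass(frozen=True)
class Datatype:
    name: str          # broad, high-level category of data
    description: str

@dataclass
class Dataset:
    name: str          # specific instance of a datatype
    datatype: Datatype
    region: str

class DataDictionary:
    """Catalog mapping each fixed datatype to its registered datasets."""
    def __init__(self, datatypes):
        self._catalog = {dt.name: [] for dt in datatypes}

    def register(self, ds):
        # Datasets are expected to be added on a regular basis.
        self._catalog[ds.datatype.name].append(ds)

    def datasets_of(self, datatype_name):
        return list(self._catalog[datatype_name])

tt = Datatype("travel-time model", "Regional seismic travel-time corrections")
dd = DataDictionary([tt])
dd.register(Dataset("ME-NA Pn travel times", tt, region="Middle East / North Africa"))
names = [d.name for d in dd.datasets_of("travel-time model")]
print(names)
```

The key property mirrored here is that the set of datatypes is fixed at construction time, while datasets accumulate through `register`.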

  4. Automatic radioxenon analyzer for CTBT monitoring

    SciTech Connect

    Bowyer, T.W.; Abel, K.H.; Hensley, W.K.

    1996-12-01

    Over the past 3 years, with support from the US DOE's NN-20 Comprehensive Test Ban Treaty (CTBT) R&D program, PNNL has developed and demonstrated a fully automatic analyzer for collecting and measuring the four Xe radionuclides, 131mXe (11.9 d), 133mXe (2.19 d), 133Xe (5.24 d), and 135Xe (9.10 h), in the atmosphere. These radionuclides are important signatures in monitoring for compliance with a CTBT. Activity ratios permit discriminating radioxenon produced by a nuclear detonation from that produced by nuclear reactor operations, nuclear fuel reprocessing, or medical isotope production and usage. In the analyzer, Xe is continuously and automatically separated from the atmosphere at flow rates of about 7 m³/h on a sorption bed. Aliquots collected for 6-12 h are automatically analyzed by electron-photon coincidence spectrometry to produce sensitivities in the range of 20-100 µBq/m³ of air, about 100-fold better than with reported laboratory-based procedures for short collection intervals. Spectral data are automatically analyzed, and the calculated radioxenon concentrations and raw gamma-ray spectra are automatically transmitted to data centers.
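
The activity-ratio discrimination mentioned above follows directly from the listed half-lives: the short-lived 135Xe decays away much faster than 133Xe, so a high 135Xe/133Xe activity ratio points to a fresh release. A minimal sketch, assuming simple exponential decay and an illustrative initial ratio (not an actual source term):

```python
import math

# Half-lives from the abstract, converted to hours.
HALF_LIFE_H = {
    "Xe-131m": 11.9 * 24,
    "Xe-133m": 2.19 * 24,
    "Xe-133":  5.24 * 24,
    "Xe-135":  9.10,
}

def decay_const(isotope):
    """Decay constant lambda = ln 2 / T_half, per hour."""
    return math.log(2) / HALF_LIFE_H[isotope]

def activity_ratio(iso_a, iso_b, ratio0, t_hours):
    """Activity ratio A_a/A_b after t_hours, starting from ratio0 at t = 0."""
    lam_a, lam_b = decay_const(iso_a), decay_const(iso_b)
    return ratio0 * math.exp(-(lam_a - lam_b) * t_hours)

# Illustrative initial ratio of 1.0: after three days the Xe-135/Xe-133
# ratio has collapsed, which is why it signals a recent event.
r_fresh = activity_ratio("Xe-135", "Xe-133", ratio0=1.0, t_hours=0)
r_aged  = activity_ratio("Xe-135", "Xe-133", ratio0=1.0, t_hours=72)
print(f"{r_fresh:.3f} -> {r_aged:.6f}")
```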

  5. Global Monitoring of the CTBT: Progress, Capabilities and Plans (Invited)

    NASA Astrophysics Data System (ADS)

    Zerbo, L.

    2013-12-01

    The Preparatory Commission for the Comprehensive Nuclear-Test-Ban Treaty Organization (CTBTO), established in 1996, is tasked with building up the verification regime of the CTBT. The regime includes a global system for monitoring the earth, the oceans and the atmosphere for nuclear tests, and an on-site inspection (OSI) capability. More than 80% of the 337 facilities of the International Monitoring System (IMS) have been installed and are sending data to the International Data Centre (IDC) in Vienna, Austria for processing. These IMS data along with IDC processed and reviewed products are available to all States that have signed the Treaty. Concurrent with the build-up of the global monitoring networks, near-field geophysical methods are being developed and tested for OSIs. The monitoring system is currently operating in a provisional mode, as the Treaty has not yet entered into force. Progress in installing and operating the IMS and the IDC and in building up an OSI capability will be described. The capabilities of the monitoring networks have progressively improved as stations are added to the IMS and IDC processing techniques refined. Detection thresholds for seismic, hydroacoustic, infrasound and radionuclide events have been measured and in general are equal to or lower than the predictions used during the Treaty negotiations. The measurements have led to improved models and tools that allow more accurate predictions of future capabilities and network performance under any configuration. Unplanned tests of the monitoring network occurred when the DPRK announced nuclear tests in 2006, 2009, and 2013. All three tests were well above the detection threshold and easily detected and located by the seismic monitoring network. In addition, noble gas consistent with the nuclear tests in 2006 and 2013 (according to atmospheric transport models) was detected by stations in the network. 
On-site inspections of these tests were not conducted, as the Treaty has not entered into force.

  6. Geophysics, Remote Sensing, and the Comprehensive Nuclear-Test-Ban Treaty (CTBT) Integrated Field Exercise 2014

    NASA Astrophysics Data System (ADS)

    Sussman, A. J.; Macleod, G.; Labak, P.; Malich, G.; Rowlands, A. P.; Craven, J.; Sweeney, J. J.; Chiappini, M.; Tuckwell, G.; Sankey, P.

    2015-12-01

    The Integrated Field Exercise of 2014 (IFE14) was an event held in the Hashemite Kingdom of Jordan (with concurrent activities in Austria) that tested the operational and technical capabilities of an on-site inspection (OSI) within the CTBT verification regime. During an OSI, up to 40 international inspectors will search an area for evidence of a nuclear explosion. Over 250 experts from ~50 countries were involved in IFE14 (the largest simulation of a real OSI to date) and worked in a number of different capacities, from the Exercise Management and Control Teams (which executed the scenario in which the exercise was played) to the participants serving as members of the Inspection Team (IT). One of the main objectives of IFE14 was to test and integrate Treaty-allowed inspection techniques, including a number of geophysical and remote sensing methods. In order to develop a scenario in which the simulated exercise could be carried out, the Scenario Task Force (STF) designed and engineered suites of physical features in the IFE14 inspection area that the IT could detect by applying the geophysical and remote sensing inspection technologies, in addition to other techniques allowed by the CTBT. For example, in preparation for IFE14, the STF modeled a seismic triggering event that was provided to the IT to prompt them to detect and localize aftershocks in the vicinity of a possible explosion. Similarly, the STF planted shallow targets such as borehole casings and pipes for detection using other geophysical methods. In addition, airborne technologies, which included multi-spectral imaging, were deployed such that the IT could identify freshly exposed surfaces, imported materials, and other areas that had been subject to modification. This presentation will introduce the CTBT and OSI, explain the IFE14 in terms of the goals specific to geophysical and remote sensing methods, and show how both the preparation for and execution of IFE14 meet those goals.

  7. Technology Advancement and the CTBT: Taking One Step Back from the Nuclear Brink

    NASA Astrophysics Data System (ADS)

    Perry, W. J.

    2016-12-01

    Technology plays a pivotal role in international nuclear security, and technological advancement continues to support a path toward stability. One near-term and readily obtainable step back from the nuclear brink is the Comprehensive Nuclear-Test-Ban Treaty (CTBT). The technology to independently verify adherence to the CTBT has matured in the 20 years since the Treaty was opened for signature. Technology has also improved the safety and reliability of the US nuclear stockpile in the absence of testing. Due to these advances over the past two decades, neither verification nor stockpile effectiveness should be an impediment to the Treaty's entry into force. Other technical and geo-political evolution in this same period has changed the perceived benefit of nuclear weapons as instruments of security. Recognizing the change technology has brought to deliberations on nuclear security, nations are encouraged to take this one step away from instability. This presentation will reflect on the history and assumptions that have been used to justify the build-up and configuration of nuclear stockpiles, the changes in technology and conditions that alter the basis of these original assumptions, and the re-analysis of security using current and future assumptions that point to the need for revised nuclear policies. The author has a unique and well-informed perspective as both the most senior US Defense Official and a technologist.

  8. The new geospatial tools: global transparency enhancing safeguards verification

    SciTech Connect

    Pabian, Frank Vincent

    2010-09-16

    This paper focuses on the importance and potential role of the new, freely available geospatial tools for enhancing IAEA safeguards and how, together with commercial satellite imagery, they can be used to promote 'all-source synergy'. As additional 'open sources', these new geospatial tools have heralded a new era of 'global transparency', and they can be used to substantially augment existing information-driven safeguards gathering techniques, procedures, and analyses in the remote detection of undeclared facilities, as well as support ongoing monitoring and verification of various treaty (e.g., NPT, FMCT) relevant activities and programs. As an illustration of how these new geospatial tools may be applied, an original case study shows how it is possible to derive value-added follow-up information on some recent public media reporting of a former clandestine underground plutonium production complex (now being converted to a 'Tourist Attraction' given the site's abandonment by China in the early 1980s). That open source media reporting, when combined with subsequent commentary found in various Internet-based blogs and wikis, led to independent verification of the reporting with additional ground truth via 'crowdsourcing' (tourist photos as found on 'social networking' venues like Google Earth's Panoramio layer and Twitter). Confirmation of the precise geospatial location of the site (along with a more complete facility characterization incorporating 3-D modeling and visualization) was only made possible following the acquisition of higher-resolution commercial satellite imagery that could be correlated with the reporting, ground photos, and an interior diagram, through original imagery analysis of the overhead imagery.

  9. The browser prototype for the CTBT knowledge base

    SciTech Connect

    Armstrong, H.M.; Keyser, R.G.

    1997-07-02

    As part of the United States Department of Energy's (DOE) Comprehensive Test Ban Treaty (CTBT) research and development effort, a Knowledge Base is being developed. This Knowledge Base will store the regional geophysical research results as well as geographic contextual information and make this information available to the Automated Data Processing (ADP) routines as well as to the human analysts involved in CTBT monitoring. This paper focuses on the initial development of a browser prototype to be used to interactively examine the contents of the CTBT Knowledge Base. The browser prototype is intended to be a research tool to experiment with different ways to display and integrate the datasets. An initial prototype version has been developed using Environmental Systems Research Institute's (ESRI) ARC/INFO Geographic Information System (GIS) product. The conceptual requirements, design, initial implementation, current status, and future work plans are discussed. 4 refs., 2 figs.

  10. Enhanced verification test suite for physics simulation codes

    SciTech Connect

    Kamm, James R.; Brock, Jerry S.; Brandon, Scott T.; Cotrell, David L.; Johnson, Bryan; Knupp, Patrick; Rider, William J.; Trucano, Timothy G.; Weirs, V. Gregory

    2008-09-01

    This document discusses problems with which to augment, in quantity and in quality, the existing tri-laboratory suite of verification problems used by Los Alamos National Laboratory (LANL), Lawrence Livermore National Laboratory (LLNL), and Sandia National Laboratories (SNL). The purpose of verification analysis is to demonstrate whether the numerical results of the discretization algorithms in physics and engineering simulation codes provide correct solutions of the corresponding continuum equations.
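
The kind of verification analysis described here typically measures the observed order of convergence of a discretization against an exact continuum solution. A minimal, generic sketch (not one of the report's actual test problems), using composite trapezoidal integration, whose theoretical order of accuracy is 2:

```python
import math

def trapezoid(f, a, b, n):
    """Composite trapezoidal rule on [a, b] with n subintervals."""
    h = (b - a) / n
    return h * (f(a) / 2 + sum(f(a + i * h) for i in range(1, n)) + f(b) / 2)

# Verification against an exact solution: integral of x^3 on [0, 1] is 1/4.
exact = 0.25
e_coarse = abs(trapezoid(lambda x: x**3, 0.0, 1.0, 8) - exact)
e_fine   = abs(trapezoid(lambda x: x**3, 0.0, 1.0, 16) - exact)

# Observed order of accuracy from two grids differing by a factor of 2;
# agreement with the theoretical order (2) is the verification evidence.
p = math.log(e_coarse / e_fine, 2)
print(f"observed order ~ {p:.2f}")
```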

  11. Site and Event Characterization Using the CTBT On-Site Inspection Techniques (Invited)

    NASA Astrophysics Data System (ADS)

    Labak, P.; Gaya Pique, L. R.; Rowlands, A. P.; Arndt, R. H.

    2013-12-01

    One of the four elements of the CTBT verification regime is On-Site Inspection (OSI). The sole purpose of an OSI is to clarify whether a nuclear weapon test explosion or any other nuclear explosion has been conducted in violation of the CTBT. An OSI would be conducted within an area no bigger than 1000 km² and by no more than 40 inspectors at any one time, applying search logic and inspection techniques with the aim of collecting relevant information that will be the basis for the inspection report. During the course of an OSI, less intrusive techniques applied over broad areas (usually with lower spatial resolution) are supplemented with more intrusive techniques applied to more targeted areas (usually at a higher spatial resolution). Environmental setting and the evolution of OSI-relevant observables over time will influence the application of OSI techniques. In the course of the development of OSI methodology and relevant techniques, field tests and exercises have been conducted. While earlier activities mainly focused on progress of individual techniques (such as visual observation, passive seismological monitoring for aftershocks and measurements of radioactivity), recent work covered both technique development (such as multi-spectral imaging including infrared measurements, and environmental sampling and analysis of solids, liquids and gases) as well as the integration of techniques, search logic and data flow. We will highlight examples of application of OSI technologies for site and event characterization from recently conducted field tests and exercises and demonstrate the synthesis of techniques and data necessary for the conduct of an OSI.

  12. DOE program on seismic characterization for regions of interest to CTBT monitoring

    SciTech Connect

    Ryall, A.S.; Weaver, T.A.

    1995-07-01

    The primary goal of the DOE programs on Geophysical Characterization of (1) the Middle East and North Africa (ME-NA) and (2) Southern Asia (SA) is to provide the Air Force Technical Applications Center (AFTAC) with the analytic tools and knowledge base to permit effective verification of Comprehensive Test Ban Treaty (CTBT) compliance in those regions. The program also aims at using these regionalizations as models for the development of a detailed prescription for seismic calibration and knowledge base compilation in areas where the US has had little or no previous monitoring experience. In any given region, the CTBT seismic monitoring system will depend heavily on a few key arrays and/or three-component stations, and it will be important to know as much as possible about the physical properties of the earth's crust and upper mantle: (1) in the vicinity of these stations, (2) in areas of potential earthquake activity or commercial blasting in the region containing the stations, and (3) along the propagation path from the sources to the stations. To be able to discriminate between various source types, we will also need to know how well the various event characterization techniques perform when they are transported from one tectonic or geologic environment to another. The Department of Energy's CTBT R&D program plan (DOE, 1994), which includes the ME-NA and SA characterization programs, incorporates an iterative process that combines field experiments, computer modeling and data analysis for the development, testing, evaluation and modification of data processing algorithms as appropriate to achieve specific US monitoring objectives. This process will be applied to seismic event detection, location and identification.

  13. The CTBT's International Monitoring System and On-Site Inspection Capabilities: a Status Report

    NASA Astrophysics Data System (ADS)

    Zerbo, Lassina

    2017-01-01

    At its 20th anniversary, the Comprehensive Nuclear-Test-Ban Treaty has now gathered 183 States Signatories, of which 166 have ratified. But 8 States must still ratify before entry into force. In the meantime the CTBT verification regime has accumulated two decades' worth of experience, and has achieved proven results. The regime includes a global system for monitoring the earth, the oceans and the atmosphere and an on-site inspection (OSI) capability. It uses seismic, hydroacoustic, infrasound and radionuclide technologies to do so. More than 90% of the 337 facilities of the International Monitoring System (IMS) have been installed and are sending data to the International Data Centre (IDC) in Vienna, Austria for processing. These IMS data along with IDC processed and reviewed products are available to all States that have signed the Treaty. The monitoring system has been put to the test and has demonstrated its effectiveness by detecting, locating and reporting on the DPRK announced nuclear tests in 2006, 2009, 2013 and twice in 2016. In addition to detecting radioxenon consistent with the nuclear tests in 2006 and 2013, the IMS radionuclide network also added value in the response to the tragic events in Fukushima in 2011. We continue to find new civil and scientific applications of the IMS that are made available to the international community to deal with major societal issues such as sustainable development, disaster risk reduction and climate change. OSI capabilities continue to be developed and tested. The Integrated Field Exercise in Jordan in 2014 demonstrated that they have reached a high level of operational readiness. The CTBT has been a catalyst for the development of new scientific fields, in particular in noble gas monitoring technology. CTBTO seeks to continuously improve its technologies and methods through interaction with the scientific community.

  14. Progress in CTBT Monitoring since its 1999 Senate Defeat

    NASA Astrophysics Data System (ADS)

    Hafemeister, David

    2009-05-01

    Progress in monitoring the Comprehensive Nuclear Test Ban Treaty (CTBT) is examined, beginning with the 2002 National Academy of Sciences (NAS) CTBT study, followed by recent findings on regional seismology, array monitoring, correlation detection, seismic modeling and non-seismic technologies. The NAS CTBT study concluded that the fully completed International Monitoring System (IMS) will reliably detect and identify underground nuclear explosions with a threshold of 0.1 kt in hard rock, if conducted anywhere in Europe, Asia, North Africa, and North America. In some locations the threshold is 0.01 kt or lower, using arrays or regional seismic stations. As an example, the 0.6-kiloton North Korean test of October 9, 2006 was promptly detected by seismometers in Australia, Europe, North America and Asia. The P/S ratio between 1 and 15 Hz clearly showed that the event was an explosion and not an earthquake. Radioactive venting, observed as far as Canada, proved that it was a nuclear explosion. Advances in seismic monitoring strengthen the conclusions of the NAS study. Interferometric synthetic aperture radar (InSAR) can, in some cases, identify and locate 1-kt tests at 500 m depth by measuring subsidence to 2-5 mm. InSAR can discriminate between earthquakes and explosions from the subsidence pattern, and it can locate nuclear tests to within 100 meters.
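
The P/S discriminant mentioned above can be sketched on synthetic data: explosions excite relatively more P-wave than S-wave energy, so the amplitude ratio between the two arrival windows separates the populations. Window positions, amplitudes, and noise levels below are invented for illustration and are not the actual IMS processing:

```python
import math, random

def rms(samples):
    """Root-mean-square amplitude of a sample window."""
    return math.sqrt(sum(s * s for s in samples) / len(samples))

def p_s_ratio(trace, p_window, s_window):
    """RMS amplitude ratio between the P and S time windows (sample indices)."""
    return rms(trace[p_window[0]:p_window[1]]) / rms(trace[s_window[0]:s_window[1]])

rng = random.Random(0)

def synth(p_amp, s_amp, n=400):
    # Noise plus a synthetic P arrival (samples 100-150) and S arrival (250-330).
    tr = [0.1 * rng.gauss(0, 1) for _ in range(n)]
    for i in range(100, 150):
        tr[i] += p_amp * math.sin(2 * math.pi * 8 * i / 100.0)
    for i in range(250, 330):
        tr[i] += s_amp * math.sin(2 * math.pi * 4 * i / 100.0)
    return tr

explosion  = synth(p_amp=1.0, s_amp=0.3)   # P-rich, explosion-like
earthquake = synth(p_amp=0.3, s_amp=1.0)   # S-rich, earthquake-like
r_ex = p_s_ratio(explosion,  (100, 150), (250, 330))
r_eq = p_s_ratio(earthquake, (100, 150), (250, 330))
print(f"explosion P/S ~ {r_ex:.2f}, earthquake P/S ~ {r_eq:.2f}")
```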

  15. Global Communications Infrastructure: CTBT Treaty monitoring using space communications

    NASA Astrophysics Data System (ADS)

    Kebeasy, R.; Abaya, E.; Ricker, R.; Demeules, G.

    Article 1 on Basic Obligations of the Comprehensive Nuclear-Test-Ban Treaty (CTBT) states that: "Each State Party undertakes not to carry out any nuclear weapon test explosion or any other nuclear explosion, and to prohibit and prevent any such nuclear explosion at any place under its jurisdiction or control. Each State Party undertakes, furthermore, to refrain from causing, encouraging, or in any way participating in the carrying out of any nuclear weapon test explosion or any other nuclear explosion." To monitor States Parties compliance with these Treaty provisions, an International Monitoring System (IMS) consisting of 321 monitoring stations and 16 laboratories in some 91 countries is being implemented to cover the whole globe, including its oceans and polar regions. The IMS employs four technologies--seismic, hydroacoustic, infrasound and radionuclide--to detect, locate and identify any seismic event of Richter magnitude 4 and above (equivalent to one kiloton of TNT) that may be associated with a nuclear test explosion. About one-half of this monitoring system is now operational in 67 countries. Monitoring stations send data in near real-time to an International Data Centre (IDC) in Vienna over a Global Communications Infrastructure (GCI) incorporating 10 geostationary satellites plus three satellites in inclined orbits. The satellites relay the data to commercial earth stations, from where they are transferred by terrestrial circuits to the IDC. The IDC automatically processes and interactively analyzes the monitoring data, and distributes the raw data and reports relevant to Treaty verification to National Data Centers in Member States over the same communications network. The GCI will eventually support about 250 thin route VSAT links to the monitoring stations, many of them at remote or harsh locations on the earth, plus additional links to national data centres in various countries. Off-the-shelf VSAT and networking hardware are deployed. This is the

  16. NPE 2010 results - Independent performance assessment by simulated CTBT violation scenarios

    NASA Astrophysics Data System (ADS)

    Ross, O.; Bönnemann, C.; Ceranna, L.; Gestermann, N.; Hartmann, G.; Plenefisch, T.

    2012-04-01

    For verification of compliance with the Comprehensive Nuclear-Test-Ban Treaty (CTBT) the global International Monitoring System (IMS) is currently being built up. The IMS is designed to detect nuclear explosions through their seismic, hydroacoustic, infrasound, and radionuclide signature. The IMS data are collected, processed to analysis products, and distributed to the state signatories by the International Data Centre (IDC) in Vienna. The state signatories themselves may operate National Data Centers (NDC) giving technical advice concerning CTBT verification to the government. NDC Preparedness Exercises (NPE) are regularly performed to practice the verification procedures for the detection of nuclear explosions in the framework of CTBT monitoring. The initial focus of the NPE 2010 was on the component of radionuclide detections and the application of Atmospheric Transport Modeling (ATM) for defining the source region of a radionuclide event. The exercise was triggered by fictitious radioactive noble gas detections which were calculated beforehand, in secret, by forward ATM for a hypothetical xenon release scenario starting at the location and time of a real seismic event. The task for the exercise participants was to find potential source events by atmospheric backtracking and then to analyze promising candidate events with respect to their waveform signals. The study shows one possible solution approach for NPE 2010, as performed at the German NDC by a team without prior knowledge of the selected event and release scenario. The ATM Source Receptor Sensitivity (SRS) fields as provided by the IDC were evaluated in a logical approach in order to define probable source regions for several days before the first reported fictitious radioactive xenon finding. Additional information on likely event times was derived from xenon isotopic ratios where applicable. Of the considered seismic events in the potential source region all except one could be identified as
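
The use of xenon isotopic ratios to constrain event time, mentioned above, can be sketched from the radioactive decay law: a ratio of a fast-decaying to a slow-decaying isotope shrinks at a known rate, so a measured ratio can be inverted for elapsed time. The assumed zero-time ratio R0 below is an illustrative value, not an IDC parameter:

```python
import math

# Half-lives in hours (133mXe: 2.19 d, 131mXe: 11.9 d, as cited elsewhere
# in these abstracts).
T_HALF = {"Xe-133m": 2.19 * 24, "Xe-131m": 11.9 * 24}

def elapsed_hours(r_measured, r0, iso_fast="Xe-133m", iso_slow="Xe-131m"):
    """Invert R(t) = R0 * exp(-(lam_fast - lam_slow) * t) for t."""
    lam_fast = math.log(2) / T_HALF[iso_fast]
    lam_slow = math.log(2) / T_HALF[iso_slow]
    return math.log(r0 / r_measured) / (lam_fast - lam_slow)

# Example: the 133mXe/131mXe activity ratio has decayed to half its
# (assumed) initial value.
t = elapsed_hours(r_measured=0.5, r0=1.0)
print(f"elapsed ~ {t:.1f} h")
```

In practice the initial ratio depends on the source term (explosion vs. reactor inventory), which is exactly why isotopic ratios also serve as a source discriminant.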

  17. Enhanced global Radionuclide Source Attribution for the Nuclear-Test-Ban Verification by means of the Adjoint Ensemble Dispersion Modeling Technique applied at the IDC/CTBTO.

    NASA Astrophysics Data System (ADS)

    Becker, A.; Wotawa, G.; de Geer, L.

    2006-05-01

    findings of the ensemble dispersion modeling (EDM) technique No. 5 efforts performed by Galmarini et al, 2004 (Atmos. Env. 38, 4607-4617). As the scope of the adjoint EDM methodology is not limited to CTBT verification but can be applied to any kind of nuclear event monitoring and location it bears the potential to improve the design of manifold emergency response systems towards preparedness concepts as needed for mitigation of disasters (like Chernobyl) and pre-emptive estimation of pollution hazards.

  18. Aerial and ground-based inspections of mine sites in the Western U.S.-implications for on-site inspection overflights, under the CTBT

    SciTech Connect

    Heuze, F.E.

    1997-07-01

    The verification regime of the Comprehensive Test Ban Treaty (CTBT) provides for the possibility of On-Site Inspections (OSIs) to resolve questions concerning suspicious events which may have been clandestine nuclear tests. Overflights by fixed-wing or rotary-wing aircraft, as part of an OSI, are permitted by the Treaty. These flights are intended to facilitate the narrowing of the inspection area, from an initial permissible 1000 km², and to help select the locations to deploy observers and ground-based sensors (seismic, radionuclide, etc.). Because of the substantial amount of seismicity generated by mining operations worldwide, it is expected that mine sites and mine districts would be prime candidates for OSIs. To gain experience in this context, a number of aerial and ground-based mine site inspections have been performed in the Western U.S. by Lawrence Livermore National Laboratory since 1994. These inspections are part of a broad range of CTBT mining-related projects conducted by the U.S. Department of Energy and its National Laboratories. The various sites are described next, and inferences are made concerning CTBT OSIs. All the mines are legitimate operations, with no implication whatsoever of any clandestine tests.

  19. Aircraft geometry verification with enhanced computer generated displays

    NASA Technical Reports Server (NTRS)

    Cozzolongo, J. V.

    1982-01-01

    A method for visual verification of aerodynamic geometries using computer generated, color shaded images is described. The mathematical models representing aircraft geometries are created for use in theoretical aerodynamic analyses and in computer aided manufacturing. The aerodynamic shapes are defined using parametric bi-cubic splined patches. This mathematical representation is then used as input to an algorithm that generates a color shaded image of the geometry. A discussion of the techniques used in the mathematical representation of the geometry and in the rendering of the color shaded display is presented. The results include examples of color shaded displays, which are contrasted with wire frame type displays. The examples also show the use of mapped surface pressures in terms of color shaded images of V/STOL fighter/attack aircraft and advanced turboprop aircraft.
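
The parametric bi-cubic patches mentioned above can be illustrated with a generic bicubic Bézier patch, a common bi-cubic form (the report's actual patch representation may differ): each surface point is a Bernstein-weighted blend of a 4x4 control net.

```python
from math import comb

def bernstein3(i, t):
    """Cubic Bernstein basis polynomial B3_i(t)."""
    return comb(3, i) * t**i * (1 - t) ** (3 - i)

def eval_patch(ctrl, u, v):
    """Evaluate P(u, v) = sum_ij B3_i(u) B3_j(v) P_ij.

    ctrl: 4x4 grid of (x, y, z) control points; u, v in [0, 1].
    """
    x = y = z = 0.0
    for i in range(4):
        for j in range(4):
            w = bernstein3(i, u) * bernstein3(j, v)
            px, py, pz = ctrl[i][j]
            x += w * px
            y += w * py
            z += w * pz
    return (x, y, z)

# Flat patch: control points on the z = 0 plane, spaced on a unit grid.
ctrl = [[(float(i), float(j), 0.0) for j in range(4)] for i in range(4)]
corner = eval_patch(ctrl, 0.0, 0.0)   # corners interpolate the control net
center = eval_patch(ctrl, 0.5, 0.5)
print(corner, center)
```

A rendering pipeline like the one described would evaluate many such (u, v) samples per patch and shade the resulting mesh.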

  1. A verification strategy for web services composition using enhanced stacked automata model.

    PubMed

    Nagamouttou, Danapaquiame; Egambaram, Ilavarasan; Krishnan, Muthumanickam; Narasingam, Poonkuzhali

    2015-01-01

    Currently, Service-Oriented Architecture (SOA) is becoming the most popular software architecture for contemporary enterprise applications, and one crucial technique for its implementation is web services. An individual service offered by a service provider may represent only limited business functionality; however, by composing individual services from different service providers, a composite service describing the entire business process of an enterprise can be built. Many new standards have been defined to address the web service composition problem, notably the Business Process Execution Language (BPEL). BPEL provides an initial framework for an Extensible Markup Language (XML) specification language for defining and implementing business process workflows for web services. The main problem with most realistic approaches to service composition is the verification of the composed web services; formal verification methods are needed to ensure the correctness of composed services. A few research works in the literature have addressed the verification of web services for deterministic systems. Moreover, the existing models did not address verification properties such as dead transitions, deadlock, reachability, and safety. In this paper, a new model to verify composed web services using an Enhanced Stacked Automata Model (ESAM) has been proposed. The correctness properties of the non-deterministic system have been evaluated based on properties such as dead transitions, deadlock, safety, liveness, and reachability. Initially, web services are composed using the Business Process Execution Language for Web Services (BPEL4WS); the composition is converted into an ESAM (a combination of Muller Automata (MA) and Push Down Automata (PDA)), which is then transformed into Promela, the input language of the Simple Promela Interpreter (SPIN) tool.
The model is verified using the SPIN tool, and the results revealed better performance in finding dead transitions and deadlocks in contrast to the
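
The properties the abstract names (reachability, deadlock, dead transitions) can be illustrated by plain graph search over an explicit transition system. This is a toy sketch, not ESAM or SPIN; the workflow states and labels are invented:

```python
from collections import deque

def reachable(transitions, init):
    """BFS over {state: [(label, next_state), ...]}; returns reachable states."""
    seen, queue = {init}, deque([init])
    while queue:
        s = queue.popleft()
        for _label, t in transitions.get(s, []):
            if t not in seen:
                seen.add(t)
                queue.append(t)
    return seen

def deadlocks(transitions, init):
    """Reachable states with no outgoing transition (potential deadlocks)."""
    return {s for s in reachable(transitions, init) if not transitions.get(s)}

def dead_transitions(transitions, init):
    """Transitions whose source state is never reached (dead transitions)."""
    live = reachable(transitions, init)
    return [(s, lbl, t) for s, outs in transitions.items()
            for lbl, t in outs if s not in live]

# Toy composed-service workflow: 'paid' has no outgoing edges (deadlock),
# and the transition out of the unreachable 'refund' state is dead.
ts = {
    "start":  [("invoke", "booked")],
    "booked": [("pay", "paid"), ("cancel", "start")],
    "paid":   [],
    "refund": [("notify", "start")],
}
dl = deadlocks(ts, "start")
dt = dead_transitions(ts, "start")
print(dl, dt)
```

A model checker like SPIN performs the same kind of state-space exploration, but on states generated on the fly from a Promela model rather than an explicit graph.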

  2. Verification and Validation of NASA-Supported Enhancements to Decision Support Tools of PECAD

    NASA Technical Reports Server (NTRS)

    Ross, Kenton W.; McKellip, Rodney; Moore, Roxzana F.; Fendley, Debbie

    2005-01-01

    This section of the evaluation report summarizes the verification and validation (V&V) of recently implemented, NASA-supported enhancements to the decision support tools of the Production Estimates and Crop Assessment Division (PECAD). The implemented enhancements include operationally tailored Moderate Resolution Imaging Spectroradiometer (MODIS) products and products of the Global Reservoir and Lake Monitor (GRLM). The MODIS products are currently made available through two separate decision support tools: the MODIS Image Gallery and the U.S. Department of Agriculture (USDA) Foreign Agricultural Service (FAS) MODIS Normalized Difference Vegetation Index (NDVI) Database. Both the Global Reservoir and Lake Monitor and MODIS Image Gallery provide near-real-time products through PECAD's CropExplorer. This discussion addresses two areas: 1. Assessments of the standard NASA products on which these enhancements are based. 2. Characterizations of the performance of the new operational products.

  3. Evaluation of infrasonic detection capability for the CTBT/IMS

    SciTech Connect

    Armstrong, W.T.; Whitaker, R.W.; Olson, J.V.

    1996-09-01

    Evaluation of infrasonic detection capability for the International Monitoring System of the Comprehensive Test Ban Treaty (IMS/CTBT) is made with respect to signal analysis and global coverage. Signal analysis is reviewed anecdotally with respect to composite power, correlation, and F-statistic detection algorithms. In the absence of adaptive pre-filtering, either cross-correlation or F-statistic detection is required. As an unbounded quantity, the F-statistic offers potentially greater sensitivity to signals of interest. With PURE-state pre-filtering, power detection begins to become competitive with correlation and F-statistic detection. Additional application of simple post-filters of minimum duration and maximum bearing deviation results in unique positive detection of an identified impulsive infrasonic signal. Global coverage estimates are performed as a useful deterministic evaluation of networks, offering an easily interpreted measure of network performance that complements previous probabilistic network evaluations. In particular, adequate coverage (2 sites), uniform coverage, and redundant coverage (3 to 4 sites) provide figures of merit for evaluating detection, location, and vulnerability, respectively. Coverage estimates for the I60 network indicate generally adequate coverage for the majority of the globe. A modest increase in station gain (increasing the number of array elements from 4 to 7) results in a significant increase in coverage for mean signal values. Ineffective sites and vulnerable sites are identified, which suggests that further refinement of the network is possible.
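The F-statistic detector discussed above compares beam power against residual power across the array channels. The sketch below uses one common normalization of that ratio on synthetic, already time-aligned data; it is an illustration of the statistic, not the IMS implementation, and the channel counts and signal are invented.

```python
import numpy as np

def array_f_statistic(x):
    """F-statistic for N time-aligned array channels x (shape N x T):
    ratio of beam power to mean residual power per degree of freedom.
    ~1 for uncorrelated noise, large when a coherent signal is present."""
    n, t = x.shape
    beam = x.mean(axis=0)                          # delay-and-sum beam (pre-aligned)
    beam_power = n * np.sum(beam ** 2) / t
    residual = np.sum((x - beam) ** 2) / (t * (n - 1))
    return beam_power / residual

rng = np.random.default_rng(0)
noise = rng.standard_normal((4, 1024))             # 4-channel uncorrelated noise
tone = np.sin(np.linspace(0.0, 20.0 * np.pi, 1024))
coherent = noise + 5.0 * tone                      # same signal on every channel

f_noise = array_f_statistic(noise)
f_signal = array_f_statistic(coherent)
```

For noise alone the statistic hovers near 1, while the coherent case is larger by orders of magnitude, which is why an unbounded F-statistic can be more sensitive than a bounded correlation coefficient.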

  4. Evaluation of database technologies for the CTBT Knowledge Base prototype

    SciTech Connect

    Keyser, R.; Shepard-Dombroski, E.; Baur, D.; Hipp, J.; Moore, S.; Young, C.; Chael, E.

    1996-11-01

    This document examines a number of different software technologies in the rapidly changing field of database management systems, evaluates these systems in light of the expected needs of the Comprehensive Test Ban Treaty (CTBT) Knowledge Base, and makes some recommendations for the initial prototypes of the Knowledge Base. The Knowledge Base requirements are examined and then used as criteria for evaluating the database management options. A mock-up of the data expected in the Knowledge Base is used as a basis for examining how four different database technologies deal with the problems of storing and retrieving the data. Based on these requirements and the results of the evaluation, the recommendation is that the Illustra database be considered for the initial prototype of the Knowledge Base. Illustra offers a unique blend of performance, flexibility, and features that will aid in the implementation of the prototype. At the same time, Illustra provides a high level of compatibility with the hardware and software environments present at the US NDC (National Data Center) and the PIDC (Prototype International Data Center).

  5. Field test for treatment verification of an in-situ enhanced bioremediation study

    SciTech Connect

    Taur, C.K.; Chang, S.C.

    1995-09-01

    Due to a leak from a 12-inch pressurized diesel steel pipe four years ago, an area of approximately 30,000 square meters was contaminated. A pilot study applying the technology of in-situ enhanced bioremediation was conducted. In the study, a field test kit and on-site monitoring equipment were used for site characterization and treatment verification. Physically, the enhanced bioremediation system consisted of an air extraction and air supply system and a nutrient supply network. A consistent sampling methodology was employed, and progress was verified by daily monitoring and monthly verification. The objective of this study was to evaluate the capability of indigenous microorganisms to biodegrade petroleum hydrocarbons when provided with oxygen and nutrients. Nine extraction wells and eight air sparging wells were installed; the air sparging wells injected air into the formation, and the extraction wells provided underground air circulation. Soil samples were obtained monthly for treatment verification using a Minuteman drilling machine with 2.5-foot-long hollow-stem augers. The samples were analyzed on site for TPH-diesel concentration with a field test kit manufactured by HNU-Hanby, Houston, Texas, and the analytical results from the field test kit were compared with those from an environmental laboratory. The TVPH concentrations of the air extracted from the vadose zone by a vacuum blower and the extraction wells were routinely monitored with a Foxboro FID and a Cosmos XP-311A combustible air detector. The daily monitoring of TVPH concentrations provided reliable data for assessing remedial progress.

  6. Investigation of CTBT OSI Radionuclide Techniques at the DILUTED WATERS Nuclear Test Site

    SciTech Connect

    Baciak, James E.; Milbrath, Brian D.; Detwiler, Rebecca S.; Kirkham, Randy R.; Keillor, Martin E.; Lepel, Elwood A.; Seifert, Allen; Emer, Dudley; Floyd, Michael

    2012-11-01

    Under the Comprehensive Nuclear-Test-Ban Treaty (CTBT), a verification regime that includes the ability to conduct an On-Site Inspection (OSI) will be established. The Treaty allows an OSI to include many techniques, including the radionuclide techniques of gamma radiation surveying and spectrometry and environmental sampling and analysis. Such radioactivity detection techniques can provide the “smoking gun” evidence that a nuclear test has occurred through the detection and quantification of indicative recent fission products. An OSI faces restrictions in time and manpower, as dictated by the Treaty, as well as possible logistical difficulties due to the location and climate of the suspected explosion site. It is thus necessary to have a good understanding of the possible source term an OSI will encounter and of the techniques that will be necessary for an effective OSI regime. One of the challenges during an OSI is to locate radioactive debris that has escaped from an underground nuclear explosion (UNE) and settled on the surface near and downwind of ground zero. To support the understanding and selection of sampling and survey techniques for use in an OSI, we are currently designing an experiment, the Particulate Release Experiment (PRex), to simulate a small-scale vent from an underground nuclear explosion. PRex will take place at the Nevada National Security Site (NNSS). The project is conducted under the National Center for Nuclear Security (NCNS), funded by the National Nuclear Security Administration (NNSA). Prior to the release experiment, scheduled for spring 2013, the project planned a number of activities at the NNSS to prepare for the release experiment and to exploit the nuclear-testing past of the NNSS for the development of OSI techniques for the CTBT. One such activity, the focus of this report, was a survey and sampling campaign at the site of an old UNE that vented: DILUTED WATERS. Activities at DILUTED WATERS included vehicle-based survey

  7. Towards a new daily in-situ precipitation data set supporting parameterization of wet-deposition of CTBT relevant radionuclides

    NASA Astrophysics Data System (ADS)

    Becker, A.; Ceranna, L.; Ross, O.; Schneider, U.; Meyer-Christoffer, A.; Ziese, M.; Lehner, K.; Rudolf, B.

    2012-04-01

    As a contribution to the World Climate Research Program (WCRP) and in support of the Global Climate Observing System (GCOS) of the World Meteorological Organization (WMO), the Deutscher Wetterdienst (DWD) operates the Global Precipitation Climatology Centre (GPCC). The GPCC re-analysis and near-real-time monitoring products are recognized worldwide as the most reliable global data sets of rain-gauge-based (in-situ) precipitation measurements. The GPCC Monitoring Product (Schneider et al., 2011; Becker et al., 2012; Ziese et al., EGU2012-5442) is available two months after the fact, based on SYNOP and CLIMAT messages gathered from the GTS. This product also serves as the reference data for calibrating satellite-based precipitation measurements, yielding the Global Precipitation Climatology Project (GPCP) data set (Huffmann et al., 2009). The quickest GPCC product is the First Guess version of the GPCC Monitoring Product, available as early as 3-5 days after the month in question. Both the GPCC and GPCP products can serve as the data base for computationally lightweight post-processing of the wet-deposition impact on the radionuclide (RN) monitoring capability of the CTBT network (Wotawa et al., 2009) on the regional and global scale, respectively. This is of major importance whenever a reliable quantitative assessment of the source-receptor sensitivity is needed, e.g. for the analysis of isotopic ratios. Wet-deposition recognition is in fact a prerequisite whenever ratios of particulate and noble gas measurements come into play. This field is so far largely unexplored, but exploring it would help clear as bogus several apparently CTBT-relevant detections encountered in the past, and would provide an assessment of the extent to which the RN detection capability of the CTBT network has so far been overestimated.
Besides the climatological kind of wet-deposition assessment for threshold monitoring purposes, there are also singular

  8. Transitioning Enhanced Land Surface Initialization and Model Verification Capabilities to the Kenya Meteorological Department (KMD)

    NASA Technical Reports Server (NTRS)

    Case, Jonathan L.; Mungai, John; Sakwa, Vincent; Zavodsky, Bradley T.; Srikishen, Jayanthi; Limaye, Ashutosh; Blankenship, Clay B.

    2016-01-01

    Flooding, severe weather, and drought are key forecasting challenges for the Kenya Meteorological Department (KMD), based in Nairobi, Kenya. Atmospheric processes leading to convection, excessive precipitation, and/or prolonged drought can be strongly influenced by land cover, vegetation, and soil moisture content, especially during anomalous conditions and dry/wet seasonal transitions. It is thus important to represent accurately the land surface state variables (green vegetation fraction, soil moisture, and soil temperature) in Numerical Weather Prediction (NWP) models. The NASA SERVIR and Short-term Prediction Research and Transition (SPoRT) programs in Huntsville, AL have established a working partnership with KMD to enhance its regional modeling capabilities. SPoRT and SERVIR are providing experimental land surface initialization datasets and model verification capabilities for capacity building at KMD. To support its forecasting operations, KMD is running experimental configurations of the Weather Research and Forecasting (WRF; Skamarock et al. 2008) model on a 12-km/4-km nested regional domain over eastern Africa, incorporating the land surface datasets provided by NASA SPoRT and SERVIR. SPoRT, SERVIR, and KMD participated in two training sessions in March 2014 and June 2015 to foster the collaboration and the use of unique land surface datasets and model verification capabilities. Enhanced regional modeling capabilities have the potential to improve guidance in support of daily operations and high-impact weather and climate outlooks over Eastern Africa. For enhanced land surface initialization, the NASA Land Information System (LIS) is run over Eastern Africa at 3-km resolution, providing real-time land surface initialization data in place of the interpolated global model soil moisture and temperature data available at coarser resolutions. Additionally, real-time green vegetation fraction (GVF) composites from the Suomi-NPP VIIRS instrument are being incorporated

  9. OPEN DATA EXCHANGE IN SUPPORT OF CTBT RESEARCH

    NASA Astrophysics Data System (ADS)

    Willemann, R. J.; Simpson, D. W.; Suarez, G.; Ahern, T. K.

    2009-12-01

    Openly exchanged seismological data contribute to research and monitoring related to the CTBT and complement the more formally structured and restricted data procedures of the International Monitoring System (IMS). In addition to providing an increasingly rich source of data for fundamental research programs and monitoring, the open exchange of data encourages international and multi-disciplinary collaboration and leads to improvements in data quality and network practices. The IRIS Consortium and its members are committed to the mission to “promote exchange of geophysical data and knowledge, through use of standards for network operations, data formats, and exchange protocols, and through pursuing policies of free and unrestricted data access”. Data collected through IRIS programs supported by the US National Science Foundation are archived at the IRIS Data Management Center (DMC), from which they are freely and openly available to researchers and the public. Continuous data from the IRIS/USGS Global Seismographic Network and the EarthScope Transportable Array are available in real time. Investigators using portable PASSCAL and EarthScope Flexible Array equipment can request a two-year proprietary period, after which all data must be archived at the DMC and made openly available. In addition to data from IRIS programs, the DMC archives and distributes data from many networks affiliated with the International Federation of Digital Seismograph Networks (FDSN), other national networks, and regional networks in the U.S.A. Networked data services provide access to the archives of additional data centers worldwide. Data exchange with the USGS National Earthquake Information Center, regional networks in the U.S., and international mission agencies contributes to global and national earthquake monitoring. All of the data in the DMC can be accessed with a common set of tools, providing easy access to waveforms from thousands of sensors in hundreds of networks

  10. Verification and Validation of NASA-Supported Enhancements to PECAD's Decision Support Tools

    NASA Technical Reports Server (NTRS)

    McKellip, Rodney; Ross, Kenton W.

    2006-01-01

    The NASA Applied Sciences Directorate (ASD), part of the Earth-Sun System Division of NASA's Science Mission Directorate, has partnered with the U.S. Department of Agriculture (USDA) to enhance decision support in the area of agricultural efficiency, an application of national importance. The ASD integrated the results of NASA Earth science research into USDA decision support tools employed by the USDA Foreign Agricultural Service (FAS) Production Estimates and Crop Assessment Division (PECAD), which supports national decision making by gathering, analyzing, and disseminating global crop intelligence. Verification and validation of the following enhancements are summarized: 1) Near-real-time Moderate Resolution Imaging Spectroradiometer (MODIS) products through PECAD's MODIS Image Gallery; 2) MODIS Normalized Difference Vegetation Index (NDVI) time series data through the USDA-FAS MODIS NDVI Database; and 3) Jason-1 and TOPEX/Poseidon lake level estimates through PECAD's Global Reservoir and Lake Monitor (GRLM). Where possible, each enhanced product was characterized for accuracy, timeliness, and coverage, and the characterized performance was compared to PECAD operational requirements. The MODIS Image Gallery and the GRLM are more mature and have achieved a semi-operational status, whereas the USDA-FAS MODIS NDVI Database is still evolving and should be considered

  11. Enhanced spacer-is-dielectric (sid) decomposition flow with model-based verification

    NASA Astrophysics Data System (ADS)

    Du, Yuelin; Song, Hua; Shiely, James; Wong, Martin D. F.

    2013-03-01

    Self-aligned double patterning (SADP) lithography is a leading candidate for 14nm node lower-metal layer fabrication. Besides the intrinsic overlay-tolerance capability, the accurate spacer width and uniformity control enables such technology to fabricate very narrow and dense patterns. Spacer-is-dielectric (SID) is the most popular flavor of SADP with higher flexibility in design. In the SID process, due to uniform spacer deposition, the spacer shape gets rounded at convex mandrel corners, and disregarding the corner rounding issue during SID decomposition may result in severe residue artifacts on device patterns. Previously, SADP decomposition was merely verified by Boolean operations on the decomposed layers, where the residue artifacts are not even identifiable. This paper proposes a model-based verification method for SID decomposition to identify the artifacts caused by spacer corner rounding. Then targeting residue artifact removal, an enhanced SID decomposition flow is introduced. Simulation results show that residue artifacts are removed effectively through the enhanced SID decomposition strategy.

  12. Test report for the infrasound prototype: For a CTBT IMS station

    SciTech Connect

    Breding, D.R.; Kromer, R.P.; Whitaker, R.W.; Sandoval, T.

    1997-11-01

    This document describes the results of the Comprehensive Test Ban Treaty (CTBT) Infrasound Prototype Development Test and Evaluation (DT&E). During DT&E the infrasound prototype was evaluated against requirements listed in the System Requirements Document (SRD), based on the Conference on Disarmament/Ad Hoc Committee on a Nuclear Test Ban/Working Papers 224 and 283 and the Preparatory Commission specifications as defined in CTBT/PC/II/1/Add.2, Appendix X, Table 5. The evaluation was conducted during a two-day period, August 6-7, 1997. The System Test Plan (STP) defined the plan and methods used to test the infrasound prototype. The specific tests that were performed are detailed in the Test Procedures (TP).

  13. OPC verification and hotspot management for yield enhancement through layout analysis

    NASA Astrophysics Data System (ADS)

    Yoo, Gyun; Kim, Jungchan; Lee, Taehyeong; Jung, Areum; Yang, Hyunjo; Yim, Donggyu; Park, Sungki; Maruyama, Kotaro; Yamamoto, Masahiro; Vikram, Abhishek; Park, Sangho

    2011-03-01

    As design rules shrink, various techniques such as RET and DFM have been continuously developed and applied in the lithography field, and we have worked not only to obtain a sufficient process window with those techniques but also to feed hot spots back into the OPC process for yield improvement in mass production. The OPC verification procedure, which iterates from OPC to wafer verification until CD targets are met and hot spots are cleared, is becoming more important for ensuring robust, accurate patterning and tight hot-spot management. Generally, wafer verification results, which demonstrate how well OPC corrections are made, need to be fed back to the OPC engineer in an effective and accurate way. First, however, it is not possible to cover all transistors in a full chip with the limited set of OPC monitoring points that has been used for wafer verification. Second, the hot spots extracted by an OPC simulator are not always reliable enough to represent defect information for the full chip. Finally, doing this with CD-SEM measurement requires substantial turnaround time (TAT) and labor. These difficulties in wafer verification can be mitigated by design-based analysis. The optimal OPC monitoring points are created by classifying all transistors in the full-chip layout, and a hotspot set is selected by a pattern matching process using NanoScopeTM, known as a fast design-based analysis tool, seeded with a very small number of hotspots extracted by the OPC simulator from the full-chip layout. Each set is then used for wafer verification with the design-based inspection tool NGR2150TM. In this paper, a new verification methodology based on design-based analysis is introduced as an alternative method for effective control of OPC accuracy and hot-spot management.

  14. Using Academia-Industry Partnerships to Enhance Software Verification & Validation Education via Active Learning Tools

    ERIC Educational Resources Information Center

    Acharya, Sushil; Manohar, Priyadarshan; Wu, Peter; Schilling, Walter

    2017-01-01

    Imparting real world experiences in a software verification and validation (SV&V) course is often a challenge due to the lack of effective active learning tools. This pedagogical requirement is important because graduates are expected to develop software that meets rigorous quality standards in functional and application domains. Realizing the…

  15. Getting to Zero Yield: The Evolution of the U.S. Position on the CTBT

    NASA Astrophysics Data System (ADS)

    Zimmerman, Peter D.

    1998-03-01

    In 1994 the United States favored a Comprehensive Test Ban Treaty (CTBT) which permitted tiny "hydronuclear" experiments with a nuclear energy release of four pounds or less. Other nuclear powers supported yield limits as high as large fractions of a kiloton, while most non-nuclear nations participating in the discussions at the United Nations Conference on Disarmament wanted to prohibit all nuclear explosions -- some even favoring an end to computer simulations. On the other hand, China wished an exception to permit high yield "peaceful" nuclear explosions. For the United States to adopt a new position favoring a "true zero" several pieces had to fall into place: 1) The President had to be assured that the U.S. could preserve the safety and reliability of the enduring stockpile without yield testing; 2) the U.S. needed to be sure that the marginal utility of zero-yield experiments was at least as great for this country as for any other; 3) that tests with any nuclear yield might have more marginal utility for nuclear proliferators than for the United States, thus marginally eroding this country's position; 4) the United States required a treaty which would permit maintenance of the capacity to return to testing should a national emergency requiring a nuclear test arise; and 5) all of the five nuclear weapons states had to realize that only a true-zero CTBT would have the desired political effects. This paper will outline the physics near zero yield and show why President Clinton was persuaded by arguments from many viewpoints to endorse a true test ban in August, 1996 and to sign the CTBT in September, 1997.

  16. Hardware design document for the Infrasound Prototype for a CTBT IMS station

    SciTech Connect

    Breding, D.R.; Kromer, R.P.; Whitaker, R.W.; Sandoval, T.

    1997-11-01

    The Hardware Design Document (HDD) describes the various hardware components used in the Comprehensive Test Ban Treaty (CTBT) Infrasound Prototype and their interrelationships. It divides the infrasound prototype into hardware configuration items (HWCIs). The HDD uses techniques such as block diagrams and parts lists to present this information. The level of detail provided in the following sections should be sufficient to allow potential users to procure and install the infrasound system. Infrasonic monitoring is a low-cost, robust, and effective technology for detecting atmospheric explosions. Low frequencies from explosion signals propagate to long ranges (a few thousand kilometers), where they can be detected with an array of sensors.

  17. Analysis of an indirect neutron signature for enhanced UF6 cylinder verification

    NASA Astrophysics Data System (ADS)

    Kulisek, J. A.; McDonald, B. S.; Smith, L. E.; Zalavadia, M. A.; Webster, J. B.

    2017-02-01

    The International Atomic Energy Agency (IAEA) currently uses handheld gamma-ray spectrometers combined with ultrasonic wall-thickness gauges to verify the declared enrichment of uranium hexafluoride (UF6) cylinders. The current method provides relatively low accuracy for the assay of 235U enrichment, especially for natural and depleted UF6. Furthermore, the current method provides no capability to assay the absolute mass of 235U in the cylinder, due to the localized instrument geometry and the limited penetration of the 186-keV gamma-ray signature from 235U. The current verification process is also a time-consuming component of on-site inspections at uranium enrichment plants. Toward the goal of a more capable cylinder assay method, the Pacific Northwest National Laboratory has developed the hybrid enrichment verification array (HEVA). HEVA measures both the traditional 186-keV direct signature and a non-traditional, high-energy neutron-induced signature (HEVANT). HEVANT enables full-volume assay of UF6 cylinders by exploiting the relatively larger mean free paths of the neutrons emitted from the UF6. In this work, Monte Carlo modeling is used as the basis for characterizing HEVANT in terms of the individual contributions from nuclides and hardware components. Monte Carlo modeling is also used to quantify the intrinsic efficiency of HEVA for neutron detection in a cylinder-assay geometry. Modeling predictions are validated against neutron-induced gamma-ray spectra from laboratory measurements and a relatively large population of Type 30B cylinders spanning a range of enrichments. Implications of the analysis and findings for the viability of HEVA for cylinder verification are discussed, such as the resistance of the HEVANT signature to manipulation by the nearby placement of neutron-conversion materials.
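The intrinsic-efficiency estimates above rely on full radiation-transport Monte Carlo (far beyond a short example). As a hedged illustration of the sampling idea only, the sketch below estimates the purely geometric efficiency of a disk detector viewing an isotropic point source and compares it to the analytic solid-angle result; all geometry values are invented and there is no transport physics.

```python
import math
import random

def geometric_efficiency(det_radius, distance, n=200_000, seed=3):
    """Toy Monte Carlo: fraction of isotropically emitted particles from a
    point source on the detector axis that hit a disk detector of radius
    det_radius at the given distance (geometry only, no transport)."""
    rng = random.Random(seed)
    hits = 0
    for _ in range(n):
        cos_t = rng.uniform(-1.0, 1.0)       # isotropic: cos(theta) uniform
        if cos_t <= 0:
            continue                         # emitted away from the detector
        sin_t = math.sqrt(1.0 - cos_t * cos_t)
        # radial offset where the ray crosses the detector plane
        if distance * sin_t / cos_t <= det_radius:
            hits += 1
    return hits / n

eff = geometric_efficiency(det_radius=5.0, distance=50.0)
# analytic solid-angle fraction for an on-axis disk, for comparison
exact = 0.5 * (1.0 - 50.0 / math.sqrt(50.0**2 + 5.0**2))
```

With 200,000 histories the Monte Carlo estimate agrees with the analytic value to within a few percent, which is the same convergence behavior that governs how many histories a real transport calculation needs.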

  18. Complete regional waveform modeling to estimate seismic velocity structure and source parameters for CTBT monitoring

    SciTech Connect

    Bredbeck, T; Rodgers, A; Walter, W

    1999-07-23

    The velocity structures and source parameters estimated by waveform modeling provide valuable information for CTBT monitoring. The inferred crustal and uppermost mantle structures advance the understanding of tectonics and guide regionalization for event location and identification efforts. Estimation of source parameters such as seismic moment, depth, and mechanism (whether earthquake, explosion, or collapse) is crucial to event identification. In this paper we briefly outline some of the waveform modeling research for CTBT monitoring performed in the last year. In the future we will estimate structure for new regions by modeling waveforms of large, well-observed events along additional paths. Of particular interest will be the estimation of velocity structure in aseismic regions such as most of Africa and the Former Soviet Union. Our previous work on aseismic regions in the Middle East, north Africa, and south Asia gives us confidence to proceed with our current methods. Using the inferred velocity models, we plan to estimate source parameters for smaller events. It is especially important to obtain seismic moments of earthquakes for use in applying the Magnitude-Distance Amplitude Correction (MDAC; Taylor et al., 1999) to regional body-wave amplitudes for discrimination and in calibrating coda-based magnitude scales.

  19. Big Data Solution for CTBT Monitoring Using Global Cross Correlation

    NASA Astrophysics Data System (ADS)

    Gaillard, P.; Bobrov, D.; Dupont, A.; Grenouille, A.; Kitov, I. O.; Rozhkov, M.

    2014-12-01

    Due to the mismatch between data volume and the performance of the information technology infrastructure used in seismic data centers, it is becoming more and more difficult to process all the data with traditional applications in a reasonable elapsed time. To fulfill their missions, the International Data Centre of the Comprehensive Nuclear-Test-Ban Treaty Organization (CTBTO/IDC) and the Département Analyse Surveillance Environnement of the Commissariat à l'Energie atomique et aux énergies alternatives (CEA/DASE) collect, process, and produce complex data sets whose volume is growing exponentially. In the medium term, computer architectures, data management systems, and application algorithms will require fundamental changes to meet the needs. This problem is well known and identified as a "Big Data" challenge. To tackle this major task, the CEA/DASE is participating for two years in the "DataScale" project. Started in September 2013, DataScale gathers a large set of partners (research laboratories, SMEs, and large companies). The common objective is to design efficient solutions using the synergy between Big Data solutions and High Performance Computing (HPC). The project will evaluate the relevance of these technological solutions by implementing a demonstrator for seismic event detection based on massive waveform correlation. The IDC has developed expertise in such techniques, leading to an algorithm called the "Master Event", and provides a high-quality dataset for an extensive cross-correlation study. The objective of the project is to enhance the Master Event algorithm and to reanalyze 10 years of waveform data from the International Monitoring System (IMS) network using a dedicated HPC infrastructure operated by the "Centre de Calcul Recherche et Technologie" at the CEA of Bruyères-le-Châtel. The dataset used for the demonstrator includes more than 300,000 seismic events, tens of millions of raw detections, and more than 30 terabytes of continuous seismic data
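The master-event approach rests on sliding a template waveform along continuous data and flagging lags where the normalized cross-correlation is high. A hedged, single-channel sketch on synthetic data is shown below; the production algorithm is far more elaborate (multi-channel, multi-template, HPC-scale), and the trace, template, and event position here are all invented.

```python
import numpy as np

def master_event_scan(trace, template):
    """Slide a master-event template along a continuous trace and return
    the normalized cross-correlation coefficient at every lag."""
    nt = len(template)
    tpl = template - template.mean()
    tpl = tpl / np.linalg.norm(tpl)
    cc = np.empty(len(trace) - nt + 1)
    for k in range(len(cc)):
        win = trace[k:k + nt] - trace[k:k + nt].mean()
        norm = np.linalg.norm(win)
        cc[k] = (tpl @ win) / norm if norm > 0 else 0.0
    return cc

rng = np.random.default_rng(1)
template = rng.standard_normal(100)          # stand-in for a master-event waveform
trace = 0.1 * rng.standard_normal(5000)      # continuous background noise
trace[3000:3100] += template                 # a buried repeat of the event

cc = master_event_scan(trace, template)
detection = int(np.argmax(cc))               # lag of the best match
```

The repeat buried at sample 3000 stands out with a correlation near 1 even though it is invisible against the noise amplitude-wise, which is the property that makes correlation detectors so much more sensitive than energy detectors for repeating sources.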

  20. Enhanced high-level Petri nets with multiple colors for knowledge verification/validation of rule-based expert systems.

    PubMed

    Wu, C H; Lee, S J

    1997-01-01

    Exploring the properties of rule-based expert systems through Petri net models has received a lot of attention. Traditional Petri nets provide a straightforward but inadequate method for knowledge verification/validation of rule-based expert systems. We propose an enhanced high-level Petri net model in which variables and negative information can be represented and processed properly. Rule inference is modeled exactly, and some important aspects of rule-based systems (RBSs), such as conservation of facts, refraction, and the closed-world assumption, are considered in this model. With the coloring scheme proposed in this paper, the tasks involved in checking the logic structure and output correctness of an RBS are formally investigated. We focus on the detection of redundancy, conflicts, cycles, unnecessary conditions, dead ends, and unreachable goals in an RBS. These knowledge verification/validation (KVV) tasks are formulated as reachability problems, and improper knowledge can be detected by solving a set of equations with respect to multiple colors. The complexity of our method is discussed, and a comparison of our model with other Petri net models is presented.
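The KVV tasks formulated above as reachability problems can be illustrated in miniature by forward chaining over a toy rule base (to expose unreachable goals) and cycle detection on its dependency graph. This sketch is not the enhanced Petri-net model itself, and the rules below are invented.

```python
def forward_chain(rules, facts):
    """Naive forward chaining: rules are (frozenset_of_premises, conclusion)
    pairs; returns every fact derivable from the initial facts."""
    derived = set(facts)
    changed = True
    while changed:
        changed = False
        for premises, conclusion in rules:
            if conclusion not in derived and premises <= derived:
                derived.add(conclusion)
                changed = True
    return derived

def has_cycle(rules):
    """Detect inference cycles via depth-first search on the
    premise -> conclusion dependency graph."""
    graph = {}
    for premises, conclusion in rules:
        for p in premises:
            graph.setdefault(p, set()).add(conclusion)
    WHITE, GRAY, BLACK = 0, 1, 2
    color = {}
    def dfs(u):
        color[u] = GRAY
        for v in graph.get(u, ()):
            c = color.get(v, WHITE)
            if c == GRAY or (c == WHITE and dfs(v)):
                return True               # back edge: a rule chain feeds itself
        color[u] = BLACK
        return False
    return any(color.get(u, WHITE) == WHITE and dfs(u) for u in graph)

# Invented toy rule base: a & b -> c, c -> d, d -> a (a cycle), and
# x -> e, where 'x' is never given, so goal 'e' is unreachable.
rules = [
    (frozenset({"a", "b"}), "c"),
    (frozenset({"c"}), "d"),
    (frozenset({"d"}), "a"),
    (frozenset({"x"}), "e"),
]
derived = forward_chain(rules, {"a", "b"})
cyclic = has_cycle(rules)
```

The Petri-net formulation generalizes this: places carry colored tokens for facts, transitions fire rules, and the same anomalies fall out of reachability analysis on the marking graph rather than ad hoc graph searches.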

  1. Acoustic-Seismic Coupling of Broadband Signals - Analysis of Potential Disturbances during CTBT On-Site Inspection Measurements

    NASA Astrophysics Data System (ADS)

    Liebsch, Mattes; Altmann, Jürgen

    2015-04-01

    For the verification of the Comprehensive Nuclear Test Ban Treaty (CTBT), the precise localisation of possible underground nuclear explosion sites is important. During an on-site inspection (OSI), sensitive seismic measurements of aftershocks can be performed, which, however, can be disturbed by other signals. To improve the quality and effectiveness of these measurements it is essential to understand those disturbances so that they can be reduced or prevented. In our work we focus on disturbing signals caused by airborne sources: when the sound of aircraft (as often used by the inspectors themselves) hits the ground, it propagates through pores in the soil. Its energy is transferred to the ground and soil vibrations are created which can mask weak aftershock signals. The understanding of the coupling of acoustic waves to the ground is still incomplete. However, it is necessary for improving the performance of an OSI, e.g. to address potential consequences for sensor placement, helicopter trajectories, etc. We present our recent advances in this field. We performed several measurements to record the sound pressure and soil velocity produced by various sources, e.g. broadband excitation by jet aircraft passing overhead and signals artificially produced by a speaker. For our experimental set-up, microphones were placed close to the ground and geophones were buried at different depths in the soil. Several sensors were shielded from the directly incident acoustic signals by a box coated with acoustic damping material. While the sound pressure under the box was strongly reduced, the soil velocity measured under the box was only slightly smaller than outside of it. Thus these soil vibrations were mostly created outside the box and travelled through the soil to the sensors. This information is used to estimate characteristic propagation lengths of the acoustically induced signals in the soil. In the seismic data we observed interference patterns which are likely caused by the

  2. Linear models to perform treaty verification tasks for enhanced information security

    SciTech Connect

    MacGahan, Christopher J.; Kupinski, Matthew A.; Brubaker, Erik M.; Hilton, Nathan R.; Marleau, Peter A.

    2016-11-12

    Linear mathematical models were applied to binary-discrimination tasks relevant to arms control verification measurements in which a host party wishes to convince a monitoring party that an item is or is not treaty accountable. These models process data in list-mode format and can compensate for the presence of variability in the source, such as uncertain object orientation and location. The Hotelling observer applies an optimal set of weights to binned detector data, yielding a test statistic that is thresholded to make a decision. The channelized Hotelling observer applies a channelizing matrix to the vectorized data, resulting in a lower dimensional vector available to the monitor to make decisions. We demonstrate how incorporating additional terms in this channelizing-matrix optimization offers benefits for treaty verification. We present two methods to increase shared information and trust between the host and monitor. The first method penalizes individual channel performance in order to maximize the information available to the monitor while maintaining optimal performance. Second, we present a method that penalizes predefined sensitive information while maintaining the capability to discriminate between binary choices. Data used in this study was generated using Monte Carlo simulations for fission neutrons, accomplished with the GEANT4 toolkit. Custom models for plutonium inspection objects were measured in simulation by a radiation imaging system. Model performance was evaluated and presented using the area under the receiver operating characteristic curve.
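The Hotelling observer described above has a compact closed form: the weight vector is the inverse covariance applied to the mean difference, and the test statistic is the dot product of those weights with the data. The sketch below uses synthetic Gaussian data as a hypothetical stand-in for the simulated fission-neutron measurements (the bin count, training size, and noise model are illustrative assumptions, not the paper's):

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic binned detector data under two hypotheses (hypothetical
# stand-in for the simulated fission-neutron measurements).
n_bins, n_train = 16, 500
mean_a = rng.uniform(5.0, 10.0, n_bins)           # "treaty accountable" item
mean_b = mean_a + rng.uniform(-1.0, 1.0, n_bins)  # "non-accountable" item
g_a = rng.normal(mean_a, 1.0, (n_train, n_bins))
g_b = rng.normal(mean_b, 1.0, (n_train, n_bins))

# Hotelling observer: w = S^-1 (mean_a - mean_b); test statistic t = w . g
S = 0.5 * (np.cov(g_a, rowvar=False) + np.cov(g_b, rowvar=False))
w = np.linalg.solve(S, g_a.mean(axis=0) - g_b.mean(axis=0))
t_a, t_b = g_a @ w, g_b @ w

# Area under the ROC curve via the rank-sum (Mann-Whitney) estimator
auc = (t_a[:, None] > t_b[None, :]).mean()
print(f"AUC = {auc:.3f}")
```

A channelized Hotelling observer would first project the data onto a small set of channels and apply the same machinery in the reduced space; the penalty terms described in the abstract enter that channel-matrix optimization.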

  3. Linear models to perform treaty verification tasks for enhanced information security

    DOE PAGES

    MacGahan, Christopher J.; Kupinski, Matthew A.; Brubaker, Erik M.; ...

    2016-11-12

    Linear mathematical models were applied to binary-discrimination tasks relevant to arms control verification measurements in which a host party wishes to convince a monitoring party that an item is or is not treaty accountable. These models process data in list-mode format and can compensate for the presence of variability in the source, such as uncertain object orientation and location. The Hotelling observer applies an optimal set of weights to binned detector data, yielding a test statistic that is thresholded to make a decision. The channelized Hotelling observer applies a channelizing matrix to the vectorized data, resulting in a lower dimensional vector available to the monitor to make decisions. We demonstrate how incorporating additional terms in this channelizing-matrix optimization offers benefits for treaty verification. We present two methods to increase shared information and trust between the host and monitor. The first method penalizes individual channel performance in order to maximize the information available to the monitor while maintaining optimal performance. Second, we present a method that penalizes predefined sensitive information while maintaining the capability to discriminate between binary choices. Data used in this study was generated using Monte Carlo simulations for fission neutrons, accomplished with the GEANT4 toolkit. Custom models for plutonium inspection objects were measured in simulation by a radiation imaging system. Model performance was evaluated and presented using the area under the receiver operating characteristic curve.

  4. Linear models to perform treaty verification tasks for enhanced information security

    NASA Astrophysics Data System (ADS)

    MacGahan, Christopher J.; Kupinski, Matthew A.; Brubaker, Erik M.; Hilton, Nathan R.; Marleau, Peter A.

    2017-02-01

    Linear mathematical models were applied to binary-discrimination tasks relevant to arms control verification measurements in which a host party wishes to convince a monitoring party that an item is or is not treaty accountable. These models process data in list-mode format and can compensate for the presence of variability in the source, such as uncertain object orientation and location. The Hotelling observer applies an optimal set of weights to binned detector data, yielding a test statistic that is thresholded to make a decision. The channelized Hotelling observer applies a channelizing matrix to the vectorized data, resulting in a lower dimensional vector available to the monitor to make decisions. We demonstrate how incorporating additional terms in this channelizing-matrix optimization offers benefits for treaty verification. We present two methods to increase shared information and trust between the host and monitor. The first method penalizes individual channel performance in order to maximize the information available to the monitor while maintaining optimal performance. Second, we present a method that penalizes predefined sensitive information while maintaining the capability to discriminate between binary choices. Data used in this study was generated using Monte Carlo simulations for fission neutrons, accomplished with the GEANT4 toolkit. Custom models for plutonium inspection objects were measured in simulation by a radiation imaging system. Model performance was evaluated and presented using the area under the receiver operating characteristic curve.

  5. Development of a bottom-hole gamma-ray diagnostic capability for high-level environments, during CTBT on-site inspection drilling

    SciTech Connect

    Fontenot, Robert

    1998-05-01

    The verification regime of the Comprehensive Test Ban Treaty (CTBT) provides for the possibility of on-site inspections (OSIs) to resolve questions concerning suspicious events which may have been clandestine nuclear tests. The initial phase of an OSI may provide enough evidence to justify a request to the CTBT Organization to allow drilling, so as to recover further evidence of a nuclear event. The equipment that was used for such 're-entry' drilling in the days of U.S. underground nuclear testing is considered too heavy and cumbersome for OSI deployments, so an effort was initiated in 1995 to define, assemble, and demonstrate a new OSI drilling capability. Coiled tubing (C-T) was selected as the most attractive technology because of its portability and its directional drilling capability (1). Following this selection, a preliminary engineering design was performed in 1996 for a Rapid Deployment Drilling System (RDDS). This system must have capabilities for downhole diagnostics of temperature and gamma-rays, since both types of data could be used to confirm the presence of an underground nuclear explosion. The study then identified two candidate downhole diagnostic systems operating with C-T: the VIPER system of Schlumberger-Anadrill, and the Transocean system (2). In the current phase of this continuing effort the VIPER system has been retained as the first candidate because, everything else being equal, it is readily accessible domestically. One project, conducted by Maurer Engineering of Houston, TX, is specifying the details of the proposed C-T system, its footprint, its modalities of air transport, and its costs of deployment and operation. The expected rate of penetration in rocks with unconfined compressive strength up to 14,500 psi (100 MPa) is also being estimated, based on laboratory-scale drilling tests on rock cores. Another project, which is the subject of this report, has as its objective the development and calibration of a downhole gamma-ray diagnostic

  6. A Visual Analytics Approach to Structured Data Analysis to Enhance Nonproliferation and Arms Control Verification Activities

    SciTech Connect

    Gillen, David S.

    2014-08-07

    Analysis activities for Nonproliferation and Arms Control verification require the use of many types of data. Tabular structured data, such as Excel spreadsheets and relational databases, have traditionally been used for data mining activities, where specific queries are issued against data to look for matching results. The application of visual analytics tools to structured data enables further exploration of datasets to promote discovery of previously unknown results. This paper discusses the application of a specific visual analytics tool to datasets related to the field of Arms Control and Nonproliferation, to promote the use of visual analytics more broadly in this domain. Visual analytics focuses on analytical reasoning facilitated by interactive visual interfaces (Wong and Thomas 2004). It promotes exploratory analysis of data and complements data mining technologies, in which known patterns are mined for; with a human in the loop, it also brings domain knowledge and subject matter expertise to bear. Visual analytics has not been widely applied to this domain. In this paper, we focus on one type of data, structured data, and show the results of applying a specific visual analytics tool to answer questions in the Arms Control and Nonproliferation domain. We chose the T.Rex tool, a visual analytics tool developed at PNNL, which uses a variety of visual exploration patterns to discover relationships in structured datasets, including a facet view, graph view, matrix view, and timeline view. The facet view enables discovery of relationships between categorical information, such as countries and locations. The graph tool visualizes node-link relationship patterns, such as the flow of materials being shipped between parties. The matrix visualization shows highly correlated categories of information. The timeline view shows temporal patterns in data. In this paper, we will use T.Rex with two different datasets to demonstrate how interactive exploration of

  7. Verification of Precipitation Enhancement due to Winter Orographic Cloud Seeding in the Payette River Basin of Western Idaho

    NASA Astrophysics Data System (ADS)

    Holbrook, V. P.; Kunkel, M. L.; Blestrud, D.

    2013-12-01

    The Idaho Power Company (IPCo) is a hydroelectric-based utility serving eastern Oregon and most of southern Idaho. Snowpack is critical to IPCo operations, and the company has invested in a winter orographic cloud seeding program for the Payette, Boise, and Upper Snake River basins to augment the snowpack. IPCo and the National Center for Atmospheric Research (NCAR) are in the middle of a two-year study to determine the precipitation enhancement due to winter orographic cloud seeding in the Payette River basin. NCAR developed a cloud seeding module, as an enhancement to the Weather Research and Forecasting (WRF) model, that inputs silver iodide released from ground-based and/or aircraft generators. The cloud seeding module then increases the precipitation as a function of the cloud seeding. The WRF model used for this program is run at the University of Arizona at a resolution of 1.8 kilometers using Thompson microphysics and the Mellor-Yamada-Janjic boundary layer scheme. Two different verification schemes are being used to determine precipitation enhancement: model versus model, and model versus precipitation gauges. In the model-versus-model method, a control model run uses NCAR-developed criteria to identify the best times to operate ground-based or airborne seeding generators and also establishes the baseline precipitation. The model is then rerun with the cloud seeding module turned on for the time periods determined by the control run. The precipitation enhancement due to cloud seeding is then the difference in precipitation between the control and seeding model runs. The second verification method is to use the model forecast precipitation in the seeded and non-seeded areas, compare it against observed precipitation (mainly from SNOTEL gauges), and determine the precipitation enhancement due to cloud seeding. Up to 15 SNOTEL gauges in or near the Payette River basin along with 14 IPCo high-resolution rain gauges will be used with this target
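The model-versus-model scheme described above reduces to a difference of accumulated precipitation fields between the seeded and control runs. A toy sketch (the one-dimensional accumulation arrays and their values are purely illustrative, not program data):

```python
import numpy as np

# Toy accumulated-precipitation fields (mm) at a few grid points for one storm
control = np.array([12.0, 15.5, 9.8, 20.1])   # run with seeding module off
seeded  = np.array([13.1, 17.0, 9.9, 22.4])   # rerun with seeding module on

# Model-versus-model estimate: enhancement is the field difference
enhancement = seeded - control
print(enhancement, enhancement.sum())
```

The gauge-based scheme would instead compare each run's forecast precipitation at SNOTEL and rain-gauge sites against the observed accumulations.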

  8. ETV, LT2 and You: How the Environmental Technology Verification Program Can Assist with the Long Term 2 Enhanced Surface Water Treatment Rule

    EPA Science Inventory

    The Environmental Technology Verification (ETV) Drinking Water Systems (DWS) Center has verified the performance of treatment technologies that may be used by communities in meeting the newly promulgated (2006) U.S. Environmental Protection Agency (USEPA) Long Term 2 Enhanced Sur...

  10. Seismic Characterization of Coal-Mining Seismicity in Utah for CTBT Monitoring

    SciTech Connect

    Arabasz, W J; Pechmann, J C

    2001-03-01

    Underground coal mining (down to ~0.75 km depth) in the contiguous Wasatch Plateau (WP) and Book Cliffs (BC) mining districts of east-central Utah induces abundant seismicity that is monitored by the University of Utah regional seismic network. This report presents the results of a systematic characterization of mining seismicity (magnitude ≤ 4.2) in the WP-BC region from January 1978 to June 2000, together with an evaluation of three seismic events (magnitude ≤ 4.3) associated with underground trona mining in southwestern Wyoming during January-August 2000. (Unless specified otherwise, magnitude implies Richter local magnitude, ML.) The University of Utah Seismograph Stations (UUSS) undertook this cooperative project to assist the University of California Lawrence Livermore National Laboratory (LLNL) in research and development relating to monitoring the Comprehensive Test Ban Treaty (CTBT). The project, which formally began February 28, 1998, and ended September 1, 2000, had three basic objectives: (1) strategically install a three-component broadband digital seismic station in the WP-BC region to ensure the continuous recording of high-quality waveform data to meet the long-term needs of LLNL, UUSS, and other interested parties, including the international CTBT community; (2) determine source mechanisms, to the extent that available source data and resources allowed, for comparative seismic characterization of stress release in mines versus earthquakes in the WP-BC study region; and (3) gather and report to LLNL local information on mine operations and associated seismicity, including "ground truth" for significant events. Following guidance from LLNL's Technical Representative, the focus of Objective 2 was changed slightly to place emphasis on three mining-related events that occurred in and near the study area after the original work plan had been made, thus posing new targets of opportunity. These included a magnitude 3.8 shock that occurred

  11. Report on the test and evaluation of the Chaparral Physics Model 4.1.1 prototype microbarograph for CTBT infrasound array application

    SciTech Connect

    Kromer, R.P.; McDonald, T.S.

    1998-01-01

    Sandia National Laboratories has tested and evaluated the Chaparral Physics Model 4.1.1 prototype infrasound sensor against CTBT specifications. The sensor was characterized using a piston-phone chamber to set and measure its sensitivity. Side-by-side coherence analysis of multiple sensors provided a measure of relative sensor gain and phase; sensor self-noise was computed using the same technique. The performance of the sensor calibration circuitry was also evaluated, and sensor performance was compared to CTBT specifications. The Chaparral sensor met or exceeded all CTBT specifications.

  12. Capability of the CTBT infrasound stations detecting the 2013 Russian fireball

    NASA Astrophysics Data System (ADS)

    Pilger, Christoph; Ceranna, Lars; Ross, J. Ole; Le Pichon, Alexis; Mialle, Pierrick; Garces, Milton

    2015-04-01

    The explosive fragmentation of the 2013 Chelyabinsk meteorite generated a large airburst with an equivalent yield of 500 kT TNT. It is the most energetic event recorded by the infrasound component of the CTBT-IMS, globally detected by 20 out of 42 operational stations. This study performs a station-by-station estimation of the IMS detection capability to explain infrasound detections and non-detections from short to long distances, using the Chelyabinsk meteorite as global reference event. Investigated parameters influencing the detection capability are the directivity of the line source signal, the ducting of acoustic energy and the individual noise conditions at each station. Findings include a clear detection preference for stations perpendicular to the meteorite trajectory, even over large distances. Only a weak influence of stratospheric ducting is observed for this low-frequency case. Furthermore, a strong dependence on the diurnal variability of background noise levels at each station is observed, favoring nocturnal detections.

  13. Engineering Upgrades to the Radionuclide Aerosol Sampler/Analyzer for the CTBT International Monitoring System

    SciTech Connect

    Forrester, Joel B.; Carty, Fitz; Comes, Laura; Hayes, James C.; Miley, Harry S.; Morris, Scott J.; Ripplinger, Mike D.; Slaugh, Ryan W.; Van Davelaar, Peter

    2013-05-13

    The Radionuclide Aerosol Sampler/Analyzer (RASA) is an automated aerosol collection and analysis system designed by Pacific Northwest National Laboratory in the 1990s and deployed in several locations around the world as part of the International Monitoring System (IMS) required under the Comprehensive Nuclear-Test-Ban Treaty (CTBT). The utility of such an automated system is the reduction of human intervention and the production of uniform results. However, maintainability and downtime issues threaten this utility, even for systems with over 90% data availability. Engineering upgrades to the RASA are currently being pursued to address these issues, as well as lessons learned from Fukushima. Current work includes a new automation control unit; other potential improvements, such as alternative detector cooling and sampling options, are under review. This paper presents the current state of the upgrades and improvements under investigation.

  14. SU-E-J-16: Automatic Image Contrast Enhancement Based On Automatic Parameter Optimization for Radiation Therapy Setup Verification

    SciTech Connect

    Qiu, J; Li, H. Harold; Zhang, T; Yang, D; Ma, F

    2015-06-15

    Purpose: In 2D RT patient setup images, tissues often cannot be seen well due to the lack of image contrast. Contrast enhancement features provided by image reviewing software, e.g. Mosaiq and ARIA, require manual selection of the image processing filters and parameters; they are therefore inefficient and cannot be automated. In this work, we developed a novel method to automatically enhance 2D RT image contrast to allow automatic verification of patient daily setups as a prerequisite step of automatic patient safety assurance. Methods: The new method is based on the contrast limited adaptive histogram equalization (CLAHE) and high-pass filtering algorithms. The most important innovation is to automatically select the optimal parameters by optimizing the image contrast. The image processing procedure includes the following steps: 1) background and noise removal, 2) high-pass filtering by subtracting the Gaussian-smoothed result, and 3) histogram equalization using the CLAHE algorithm. Three parameters were determined through an iterative optimization based on the interior-point constrained optimization algorithm: the Gaussian smoothing weighting factor, and the CLAHE algorithm block size and clip limit parameters. The goal of the optimization is to maximize the entropy of the processed result. Results: A total of 42 RT images were processed. The results were visually evaluated by RT physicians and physicists. About 48% of the images processed by the new method were ranked as excellent. In comparison, only 29% and 18% of the images processed by the basic CLAHE algorithm and by basic window-level adjustment, respectively, were ranked as excellent. Conclusion: This new image contrast enhancement method is robust and automatic, and is able to significantly outperform the basic CLAHE algorithm and the manual window-level adjustment process that are currently used in clinical 2D image review software tools.
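The optimization objective named in the Methods section, the entropy of the processed image, is easy to state in code. The sketch below uses plain global histogram equalization as a simplified stand-in for CLAHE (the tiling, clip limit, and interior-point optimizer of the actual method are omitted) and checks that equalization raises the histogram entropy of a low-contrast image:

```python
import numpy as np

rng = np.random.default_rng(1)

def entropy(img, bins=64):
    """Shannon entropy (bits) of the grey-level histogram: the
    objective maximized by the paper's parameter optimization."""
    p, _ = np.histogram(img, bins=bins, range=(0.0, 1.0))
    p = p / p.sum()
    p = p[p > 0]
    return float(-(p * np.log2(p)).sum())

def equalize(img, bins=256):
    """Global histogram equalization: a simplified stand-in for CLAHE
    (no tiling, no clip limit)."""
    flat = img.ravel()
    hist, edges = np.histogram(flat, bins=bins, range=(0.0, 1.0))
    cdf = hist.cumsum() / flat.size
    return np.interp(flat, edges[:-1], cdf).reshape(img.shape)

# Low-contrast synthetic "setup image": grey levels squeezed into a band
img = 0.45 + 0.10 * rng.random((64, 64))
out = equalize(img)
print(entropy(img), entropy(out))  # the equalized image has higher entropy
```

An automated parameter search like the paper's would wrap a call such as this in an optimizer and score each candidate parameter set by the resulting entropy.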

  15. GENERIC VERIFICATION PROTOCOL FOR CHEMICALLY-ENHANCED HIGH-RATE SEPARATION

    EPA Science Inventory

    Chemically enhanced high rate separation is a type of physical-chemical treatment technology well suited to the treatment of wet weather flow. The CEHRS technology offers a robust treatment alternative for application to combined sewer overflows, sanitary sewer overflow and exces...

  16. Feedback Seeking in Early Adolescence: Self-Enhancement or Self-Verification?

    ERIC Educational Resources Information Center

    Rosen, Lisa H.; Principe, Connor P.; Langlois, Judith H.

    2013-01-01

    The authors examined whether early adolescents ("N" = 90) solicit self-enhancing feedback (i.e., positive feedback) or self-verifying feedback (i.e., feedback congruent with self-views, even when these views are negative). Sixth, seventh, and eighth graders first completed a self-perception measure and then selected whether to receive…

  17. Verification of enhanced sintering concepts using iron compacts alloyed with boron

    SciTech Connect

    Madan, D.S.

    1988-01-01

    The theories of enhanced sintering identify three criteria for a sintering enhancer, namely, high solubility of the base in the second phase, segregation of the additive at the interparticle boundaries, and easy diffusion of the base material through the segregated phase. This study proposes a figure of merit relating to thermodynamic quantities that is capable of reliably identifying a sintering enhancer. The new figure of merit was defined as a weighted sum of solubility, segregation, and diffusion terms. It is a dimensionless quantity and can be easily calculated from phase diagram features. The proposed figure of merit was used to successfully identify effective sintering enhancers for iron, and the results were compared to sintered density values obtained from a controlled experiment and from past literature. Based on these evaluations and past literature, boron was identified as a potential sintering enhancer for iron. A carefully planned experimental procedure was adopted to understand the sintering of the iron-boron system. Elemental boron and Fe2B were added to water-atomized and carbonyl iron powders. A total of 37 compositions containing 0 to 0.8 wt% boron were investigated. All samples were compacted to the same green density and sintered for one hour at 1200°C in dry hydrogen. Densification studies, dilatometry, thermal analysis, mechanical testing, and microstructural analysis were conducted. The addition of boron increased the sintered density, hardness, and strength of the iron compacts. The sintering of this system is dominated by the formation of the Fe-Fe2B eutectic liquid phase at about 1175°C. Densification occurred by particle rearrangement and the process of solution-reprecipitation. Higher boron contents resulted in a greater amount of liquid phase, lower porosity, and a smaller pore size.

  18. An experimental verification of metamaterial coupled enhanced transmission for antenna applications

    NASA Astrophysics Data System (ADS)

    Pushpakaran, Sarin V.; Raj, Rohith K.; Pradeep, Anju; Ouseph, Lindo; Hari, Mridula; Chandroth, Aanandan; Pezholil, Mohanan; Kesavath, Vasudevan

    2014-02-01

    Inspired by the work of Bethe on electromagnetic transmission through a subwavelength hole, there has been immense interest in extraordinary transmission through subwavelength slots/slits in metal plates. The invention of metamaterials has boosted extraordinary transmission through subwavelength slots. We examine computationally and experimentally the concept of a metamaterial cover using an array of split ring resonators (SRRs) for enhancing the transmission in a stacked dipole antenna working in the S band. The front-to-back ratio is considerably improved by enhancing the magnetic resonant strength in close proximity to the slit of the upper parasitic dipole. The effect of the stacking height of the SRR monolayer on the resonant characteristics of the split ring resonators, and its effect on the antenna radiation characteristics, have been studied.

  19. An experimental verification of metamaterial coupled enhanced transmission for antenna applications

    SciTech Connect

    Pushpakaran, Sarin V.; Raj, Rohith K.; Pradeep, Anju; Ouseph, Lindo; Hari, Mridula; Chandroth, Aanandan; Pezholil, Mohanan; Kesavath, Vasudevan

    2014-02-10

    Inspired by the work of Bethe on electromagnetic transmission through a subwavelength hole, there has been immense interest in extraordinary transmission through subwavelength slots/slits in metal plates. The invention of metamaterials has boosted extraordinary transmission through subwavelength slots. We examine computationally and experimentally the concept of a metamaterial cover using an array of split ring resonators (SRRs) for enhancing the transmission in a stacked dipole antenna working in the S band. The front-to-back ratio is considerably improved by enhancing the magnetic resonant strength in close proximity to the slit of the upper parasitic dipole. The effect of the stacking height of the SRR monolayer on the resonant characteristics of the split ring resonators, and its effect on the antenna radiation characteristics, have been studied.

  20. Appearance feedback in intimate relationships: the role of self-verification and self-enhancement.

    PubMed

    Brown, Jennifer N; Stukas, Arthur A; Evans, Lynette

    2013-01-01

    To better understand how body image operates within the context of intimate relationships, we investigated women's responses to appearance feedback from an intimate partner. Participants (N=192) imagined receiving feedback from their partner that was either consistent with their own appearance self-view (i.e., self-verifying), more positive (i.e., self-enhancing), or less positive (i.e., devaluing), and then provided their affective and cognitive reactions. As expected, women's perceptions of their own appearance moderated their reactions. Women with more negative self-views felt happier with enhancing feedback but thought that it meant their partner understood them less well. They also felt less happy when they received verifying feedback but felt more understood by their partners. Thus, women with body image dissatisfaction may find themselves stuck in the "cognitive-affective crossfire," reacting ambivalently whether their partner enhances their appearance or confirms their negative self-views. Further examination of partners' actual feedback is needed. Copyright © 2012 Elsevier Ltd. All rights reserved.

  1. Blind Source Separation of Seismic Events with Independent Component Analysis: CTBT related exercise

    NASA Astrophysics Data System (ADS)

    Rozhkov, Mikhail; Kitov, Ivan

    2015-04-01

    Blind Source Separation (BSS) methods used in signal recovery applications are attractive because they require minimal a priori information about the signals they are dealing with. Homomorphic deconvolution and cepstrum estimation are probably the only methods used to any extent in CTBT applications that can be attributed to this branch of technology. However, Expert Technical Analysis (ETA), conducted in the CTBTO to improve the estimated values of the standard signal and event parameters according to the Protocol to the CTBT, may face problems which cannot be resolved with certified CTBTO applications and may demand specific techniques not presently used. The problem considered within the ETA framework is the unambiguous separation of signals with close arrival times. Here, we examine two scenarios of interest: (1) separation of two almost co-located explosions conducted within fractions of a second, and (2) extraction of explosion signals merged with wavetrains from a strong earthquake. Resolving case 1 matters for correct explosion yield estimation. Case 2 is a well-known scenario for conducting clandestine nuclear tests. While the first case can be approached to some degree with cepstral methods, the second can hardly be resolved with the conventional methods implemented at the International Data Centre, especially if the signals have close slowness and azimuth. Independent Component Analysis (in its FastICA implementation), which assumes non-Gaussianity of the underlying processes in the signal mixture, is a blind source separation method that we apply to resolve the problems mentioned above. We have tested this technique with synthetic waveforms, seismic data from DPRK explosions and mining blasts conducted within the East European platform, as well as with signals from strong teleseismic events (the Sumatra, April 2012, Mw=8.6, and Tohoku, March 2011, Mw=9.0 earthquakes). The data were recorded by seismic arrays of the
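The FastICA algorithm named above fits in a few lines once the data are whitened. The toy sketch below separates two overlapping signals from a two-sensor mixture; the waveforms, mixing matrix, nonlinearity choice, and iteration count are illustrative assumptions standing in for the real seismic recordings:

```python
import numpy as np

rng = np.random.default_rng(2)

# Two synthetic sources with overlapping arrivals (toy stand-ins for the
# merged seismic signals discussed above): a square wave and a sine.
t = np.linspace(0, 8, 4000)
src = np.c_[np.sign(np.sin(3 * t)), np.sin(7 * t + 0.5)]
src += 0.01 * rng.normal(size=src.shape)    # small noise for stability

# Linear instantaneous mixture observed at two "sensors"
A = np.array([[1.0, 0.6], [0.7, 1.0]])      # illustrative mixing matrix
X = src @ A.T

# Whitening (zero mean, identity covariance)
Xc = X - X.mean(axis=0)
d, E = np.linalg.eigh(np.cov(Xc, rowvar=False))
Z = Xc @ E @ np.diag(d ** -0.5) @ E.T

# Symmetric FastICA with the tanh nonlinearity:
#   w_new = E[z g(w.z)] - E[g'(w.z)] w, then orthogonalize W
W = rng.normal(size=(2, 2))
for _ in range(200):
    G = np.tanh(Z @ W.T)
    W = (G.T @ Z) / len(Z) - np.diag((1 - G**2).mean(axis=0)) @ W
    U, _, Vt = np.linalg.svd(W)
    W = U @ Vt                              # symmetric decorrelation

Y = Z @ W.T                                 # recovered sources
corr = np.corrcoef(np.c_[Y, src], rowvar=False)[:2, 2:]
print(np.round(np.abs(corr), 2))
```

Each recovered component should correlate strongly (|r| near 1) with one original source; as in any ICA, the order and sign of the components are indeterminate.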

  2. An Improved Method for Seismic Event Depth and Moment Tensor Determination: CTBT Related Application

    NASA Astrophysics Data System (ADS)

    Stachnik, J.; Rozhkov, M.; Baker, B.

    2016-12-01

    According to the Protocol to the CTBT, the International Data Centre is required to conduct expert technical analysis and special studies to improve event parameters and assist States Parties in identifying the source of a specific event. Determination of a seismic event's source mechanism and depth is part of these tasks. It is typically done through a strategic linearized inversion of the waveforms for a complete or partial set of source parameters, or through a similarly defined grid search over precomputed Green's functions created for particular source models. We show preliminary results using the latter approach, obtained with an improved software design on a moderately powered computer. In this development we tried to be compliant with the different modes of the CTBT monitoring regime: to cover a wide range of source-receiver distances (regional to teleseismic), resolve shallow source depths, provide full moment tensor solutions based on body and surface wave recordings, be fast enough to satisfy both on-demand studies and automatic processing, properly incorporate observed waveforms and any a priori uncertainties, and accurately estimate a posteriori uncertainties. The implemented HDF5-based Green's function pre-packaging allows much greater flexibility in utilizing different software packages and methods for computation. Further additions will allow rapid use of Instaseis/AxiSEM full-waveform synthetics added to a pre-computed Green's function archive. Along with traditional post-processing analysis of waveform misfits through several objective functions and variance reduction, we follow a probabilistic approach to assess the robustness of the moment tensor solution. In the course of this project, full moment tensor and depth estimates are determined for the DPRK 2009, 2013, and 2016 events and for shallow earthquakes using a new implementation of waveform fitting of teleseismic P waves. A full grid search over the entire moment tensor space is used to appropriately sample all possible solutions. A recent method by
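Waveform-fit quality in grid searches like the one described above is commonly scored by variance reduction. A minimal sketch of that standard scoring formula (the synthetic traces and candidate solutions are illustrative, not the study's data):

```python
import numpy as np

def variance_reduction(obs, syn):
    """VR = 1 - sum((obs - syn)^2) / sum(obs^2); 1.0 is a perfect fit,
    and values can go negative for badly mismatched candidates."""
    return 1.0 - np.sum((obs - syn) ** 2) / np.sum(obs ** 2)

# Toy observed trace and two candidate synthetics from a "grid search"
t = np.linspace(0, 1, 200)
obs = np.sin(2 * np.pi * 5 * t)
candidates = {"good": 0.95 * obs,          # right mechanism, slight amplitude error
              "poor": np.roll(obs, 20)}    # badly misaligned candidate
scores = {k: variance_reduction(obs, s) for k, s in candidates.items()}
print(scores)
```

A full search would evaluate this score (or another objective function) for every moment tensor and depth on the grid and keep the best-fitting family of solutions.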

  3. Enhanced magnetic particle transport by integration of a magnetic flux guide: Experimental verification of simulated behavior

    NASA Astrophysics Data System (ADS)

    Wirix-Speetjens, Roel; Fyen, Wim; Boeck, Jo De; Borghs, Gustaaf

    2006-04-01

    In the past, magnetic biosensors have been shown to be promising alternatives to classical fluorescence-based microarrays, replacing the fluorescent label with a superparamagnetic particle. While on-chip detection of magnetic particles is firmly established, research groups continue to explore the unique ability to manipulate these particles by applying controlled magnetic forces. One of the challenging tasks in designing magnetic-force-generating structures remains the generation of large forces at minimal current consumption. Previously, a simple transporting device for single magnetic particles was demonstrated using a magnetic field generated by two tapered current-carrying conductors [R. Wirix-Speetjens, W. Fyen, K. Xu, J. De Boeck, and G. Borghs, IEEE Trans. Magn. 41(10), 4128 (2005)]. We also developed a model to accurately predict the motion of a magnetic particle moving in the vicinity of a solid wall. Using this model, we now present a technique that enhances the magnetic force by up to a factor of 3 using a magnetic flux guide. The larger magnetic force results in an average particle speed that likewise increases by a factor of 3. The simulations show good agreement with experimental results.

  4. Experimental verification of enhanced sound transmission from water to air at low frequencies.

    PubMed

    Calvo, David C; Nicholas, Michael; Orris, Gregory J

    2013-11-01

    Laboratory measurements of enhanced sound transmission from water to air at low frequencies are presented. The pressure at a monitoring hydrophone is found to decrease for shallow source depths, in agreement with the classical theory of a monopole source in proximity to a pressure-release interface. On the other hand, for source depths below 1/10 of an acoustic wavelength in water, the radiation pattern in the air measured by two microphones becomes progressively omnidirectional, in contrast to the classical geometrical-acoustics picture in which sound is contained within a cone of 13.4° half angle. The measured directivities agree with wavenumber integration results for a point source over a range of frequencies and source depths. The wider radiation pattern arises from the conversion of evanescent waves in the water into propagating waves in the air, which fill the angular space outside the cone. A ratio of pressure measurements made using an on-axis microphone and a near-axis hydrophone is also reported and compared with theory. Collectively, these pressure measurements are consistent with the theory of anomalous transparency of the water-air interface, in which a large fraction of the acoustic power emitted by a shallow source is radiated into the air.
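
    The 13.4° cone quoted above follows directly from Snell's law at the water-air interface. A short sketch (using assumed nominal sound speeds, since the abstract does not list them) reproduces the number and the shallow-depth rule of thumb:

    ```python
    import math

    # Assumed nominal sound speeds (not given in the abstract):
    c_air = 343.0     # m/s
    c_water = 1481.0  # m/s

    # Horizontal wavenumber is conserved across the interface, so propagating
    # components in the water emerge in air within sin(theta) <= c_air / c_water
    # of the vertical -- the classical geometrical-acoustics cone.
    half_angle = math.degrees(math.asin(c_air / c_water))
    print(f"cone half angle: {half_angle:.1f} deg")   # prints 13.4 deg

    # "Shallow" in the anomalous-transparency regime means a source depth
    # below ~1/10 of the acoustic wavelength in water:
    for f_hz in (50.0, 100.0, 500.0):
        print(f"{f_hz:5.0f} Hz: shallow-depth threshold ~ {c_water / f_hz / 10:.2f} m")
    ```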

  5. CTBT infrasound network performance to detect the 2013 Russian fireball event

    NASA Astrophysics Data System (ADS)

    Pilger, Christoph; Ceranna, Lars; Ross, J. Ole; Le Pichon, Alexis; Mialle, Pierrick; Garcés, Milton A.

    2015-04-01

    The explosive fragmentation of the 2013 Chelyabinsk meteorite generated a large airburst with an equivalent yield of 500 kT TNT. It is the most energetic event recorded by the infrasound component of the Comprehensive Nuclear-Test-Ban Treaty-International Monitoring System (CTBT-IMS), globally detected by 20 out of 42 operational stations. This study performs a station-by-station estimation of the IMS detection capability to explain infrasound detections and nondetections from short to long distances, using the Chelyabinsk meteorite as global reference event. Investigated parameters influencing the detection capability are the directivity of the line source signal, the ducting of acoustic energy, and the individual noise conditions at each station. Findings include a clear detection preference for stations perpendicular to the meteorite trajectory, even over large distances. Only a weak influence of stratospheric ducting is observed for this low-frequency case. Furthermore, a strong dependence on the diurnal variability of background noise levels at each station is observed, favoring nocturnal detections.

  6. CTBT infrasound network performance to detect the 2013 Russian fireball event

    DOE PAGES

    Pilger, Christoph; Ceranna, Lars; Ross, J. Ole; ...

    2015-03-18

    The explosive fragmentation of the 2013 Chelyabinsk meteorite generated a large airburst with an equivalent yield of 500 kT TNT. It is the most energetic event recorded by the infrasound component of the Comprehensive Nuclear-Test-Ban Treaty-International Monitoring System (CTBT-IMS), globally detected by 20 out of 42 operational stations. This study performs a station-by-station estimation of the IMS detection capability to explain infrasound detections and nondetections from short to long distances, using the Chelyabinsk meteorite as global reference event. Investigated parameters influencing the detection capability are the directivity of the line source signal, the ducting of acoustic energy, and the individual noise conditions at each station. Findings include a clear detection preference for stations perpendicular to the meteorite trajectory, even over large distances. Only a weak influence of stratospheric ducting is observed for this low-frequency case. As a result, a strong dependence on the diurnal variability of background noise levels at each station is observed, favoring nocturnal detections.

  7. CTBT infrasound network performance to detect the 2013 Russian fireball event

    SciTech Connect

    Pilger, Christoph; Ceranna, Lars; Ross, J. Ole; Le Pichon, Alexis; Mialle, Pierrick

    2015-03-18

    The explosive fragmentation of the 2013 Chelyabinsk meteorite generated a large airburst with an equivalent yield of 500 kT TNT. It is the most energetic event recorded by the infrasound component of the Comprehensive Nuclear-Test-Ban Treaty-International Monitoring System (CTBT-IMS), globally detected by 20 out of 42 operational stations. This study performs a station-by-station estimation of the IMS detection capability to explain infrasound detections and nondetections from short to long distances, using the Chelyabinsk meteorite as global reference event. Investigated parameters influencing the detection capability are the directivity of the line source signal, the ducting of acoustic energy, and the individual noise conditions at each station. Findings include a clear detection preference for stations perpendicular to the meteorite trajectory, even over large distances. Only a weak influence of stratospheric ducting is observed for this low-frequency case. As a result, a strong dependence on the diurnal variability of background noise levels at each station is observed, favoring nocturnal detections.

  8. Including oxygen enhancement ratio in ion beam treatment planning: model implementation and experimental verification

    NASA Astrophysics Data System (ADS)

    Scifoni, E.; Tinganelli, W.; Weyrather, W. K.; Durante, M.; Maier, A.; Krämer, M.

    2013-06-01

    We present a method for adapting biologically optimized treatment planning for particle beams to a spatially inhomogeneous tumor sensitivity due to hypoxia, as detected e.g. by PET functional imaging. TRiP98, an established treatment planning system for particles, has been extended to include the oxygen enhancement ratio (OER) explicitly in the biological effect calculation, providing the first dedicated ion beam treatment planning approach directed at hypoxic tumors, TRiP-OER, reported here together with experimental tests. A simple semi-empirical model for calculating the OER as a function of oxygen concentration and dose-averaged linear energy transfer, generating input tables for the program, is introduced. The code is then extended to import such tables, whether from the present or alternative models, and to perform forward and inverse planning, i.e., predicting the survival response of differently oxygenated areas as well as optimizing the dose required to restore a uniform survival effect in the whole irradiated target. The multiple-field optimization results show how the program selects the best beam components for treating the hypoxic regions. The calculations, performed for different ions, provide indications of the possible clinical advantages of a multi-ion treatment. Finally, the predictivity of the code is tested through dedicated cell culture experiments on extended-target irradiation using specially designed hypoxic chambers, showing qualitative agreement, despite some limits in the full survival calculations arising from the RBE assessment. A comparison of the predictions resulting from different model tables is also reported.
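
    The abstract does not give the model's functional form; the sketch below is a generic Alper-Howard-Flanders-type parameterization of the kind described, with a maximum OER damped at high dose-averaged LET. All parameter values are illustrative assumptions, not TRiP-OER's fitted values:

    ```python
    # Illustrative OER model (assumed parameterization, not TRiP-OER's):
    def oer(p_o2_mmHg, let_kev_um, oer_max_low_let=3.0, k_mmHg=3.0, let_half=100.0):
        """Oxygen enhancement ratio vs. oxygen tension and dose-averaged LET."""
        # High-LET radiation is less oxygen dependent: damp the anoxic OER toward 1.
        m = 1.0 + (oer_max_low_let - 1.0) / (1.0 + (let_kev_um / let_half) ** 2)
        # Alper-Howard-Flanders-type dependence: OER = m under full anoxia,
        # approaching 1 under full oxygenation.
        return (p_o2_mmHg + m * k_mmHg) / (p_o2_mmHg + k_mmHg)

    # Inverse-planning use: a hypoxic voxel needs its physical dose scaled up
    # by roughly the OER relative to a well-oxygenated voxel to restore a
    # uniform survival effect across the target.
    for p in (0.0, 2.5, 15.0, 160.0):      # mmHg; 160 ~ air saturation
        print(f"pO2 = {p:5.1f} mmHg, LET = 2 keV/um: OER = {oer(p, 2.0):.2f}")
    ```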

  9. Applying monitoring, verification, and accounting techniques to a real-world, enhanced oil recovery operational CO2 leak

    USGS Publications Warehouse

    Wimmer, B.T.; Krapac, I.G.; Locke, R.; Iranmanesh, A.

    2011-01-01

    The use of carbon dioxide (CO2) for enhanced oil recovery (EOR) is being tested for oil fields in the Illinois Basin, USA. While this technology has shown promise for improving oil production, it has raised some issues about the safety of CO2 injection and storage. The Midwest Geological Sequestration Consortium (MGSC) organized a Monitoring, Verification, and Accounting (MVA) team to develop and deploy monitoring programs at three EOR sites in Illinois, Indiana, and Kentucky, USA. MVA goals include establishing baseline conditions to evaluate potential impacts from CO2 injection, demonstrating that project activities are protective of human health and the environment, and providing an accurate accounting of stored CO2. This paper focuses on the use of MVA techniques in monitoring a small CO2 leak from a supply line at an EOR facility under real-world conditions. The ability of shallow monitoring techniques to detect and quantify a CO2 leak under real-world conditions has been largely unproven. In July of 2009, a leak in the pipe supplying pressurized CO2 to an injection well was observed at an MGSC EOR site located in west-central Kentucky. Carbon dioxide was escaping from the supply pipe located approximately 1 m underground. The leak was discovered visually by site personnel and injection was halted immediately. At its largest extent, the hole created by the leak was approximately 1.9 m long by 1.7 m wide and 0.7 m deep in the land surface. This circumstance provided an excellent opportunity to evaluate the performance of several monitoring techniques including soil CO2 flux measurements, portable infrared gas analysis, thermal infrared imagery, and aerial hyperspectral imagery. Valuable experience was gained during this effort. 
Lessons learned included: 1) hyperspectral imagery was not effective in detecting this relatively small, short-term CO2 leak, and 2) even though injection was halted, the leak remained dynamic and presented a safety risk concern
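
    Soil CO2 flux measurements of the kind listed above are commonly made with closed accumulation chambers. The following is a hedged sketch of the standard ideal-gas conversion from a chamber concentration rise to a surface flux; the chamber dimensions and rise rate are illustrative, not the MGSC deployment's:

    ```python
    R_GAS = 8.314  # J mol-1 K-1

    def co2_flux_umol_m2_s(dppm_dt, volume_m3, area_m2,
                           pressure_pa=101325.0, temp_k=293.15):
        """Closed accumulation-chamber flux: (dC/dt) * P * V / (R * T * A)."""
        dmolfrac_dt = dppm_dt * 1e-6                          # ppm/s -> mole fraction/s
        mol_air = pressure_pa * volume_m3 / (R_GAS * temp_k)  # ideal-gas moles in chamber
        return dmolfrac_dt * mol_air / area_m2 * 1e6          # mol -> umol

    # Illustrative survey point: a 10 L chamber over 0.03 m2 rising 2 ppm/s.
    print(round(co2_flux_umol_m2_s(2.0, 0.010, 0.03), 1))     # prints 27.7
    ```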

  10. NG09 And CTBT On-Site Inspection Noble Gas Sampling and Analysis Requirements

    NASA Astrophysics Data System (ADS)

    Carrigan, Charles R.; Tanaka, Junichi

    2010-05-01

    A provision of the Comprehensive Test Ban Treaty (CTBT) allows on-site inspections (OSIs) of suspect nuclear sites to determine if the occurrence of a detected event is nuclear in origin. For an underground nuclear explosion (UNE), the potential success of an OSI depends significantly on the containment scenario of the alleged event, as well as on the application of air and soil-gas radionuclide sampling techniques in a manner that takes into account both the suspect site geology and the gas transport physics. UNE scenarios may be broadly divided into categories involving the level of containment. The simplest to detect is a UNE that vents a significant portion of its radionuclide inventory and is readily detectable at a distance by the International Monitoring System (IMS). The most well-contained subsurface events will only be detectable during an OSI. In such cases, 37Ar and radioactive xenon cavity gases may reach the surface through either "micro-seepage" or the barometric pumping process, and only the careful siting of sampling locations, timing of sampling, and application of the most site-appropriate atmospheric and soil-gas capturing methods will result in a confirmatory signal. The OSI noble gas field test NG09 was recently held in Stupava, Slovakia, to consider, in addition to other field sampling and analysis techniques, drilling and subsurface noble gas extraction methods that might be applied during an OSI. One of the experiments focused on challenges to soil-gas sampling near the soil-atmosphere interface. During withdrawal of soil gas from shallow subsurface sample points, atmospheric dilution of the sample and the potential for introduction of unwanted atmospheric gases were considered. Tests were designed to evaluate surface infiltration and the ability of inflatable well packers to seal out atmospheric gases during sample acquisition. We discuss these tests along with some model-based predictions regarding infiltration under different near

  11. Verification and Validation of NASA-Supported Enhancements to the Near Real Time Harmful Algal Blooms Observing System (HABSOS)

    NASA Technical Reports Server (NTRS)

    Spruce, Joseph P.; Hall, Callie; McPherson, Terry; Spiering, Bruce; Brown, Richard; Estep, Lee; Lunde, Bruce; Guest, DeNeice; Navard, Andy; Pagnutti, Mary

    2006-01-01

    This report discusses verification and validation (V&V) assessment of Moderate Resolution Imaging Spectroradiometer (MODIS) ocean data products contributed by the Naval Research Laboratory (NRL) and Applied Coherent Technologies (ACT) Corporation to the National Oceanic and Atmospheric Administration's (NOAA) Near Real Time (NRT) Harmful Algal Blooms Observing System (HABSOS). HABSOS is a maturing decision support tool (DST) used by NOAA and its partners involved with coastal and public health management.

  12. Summary report of the workshop on the U.S. use of surface waves for monitoring the CTBT

    SciTech Connect

    Ritzwoller, M; Walter, W R

    1998-09-01

    The workshop addressed the following general research goals of relevance to monitoring and verifying the Comprehensive Test Ban Treaty (CTBT): A) To apprise participants of current and planned research in order to facilitate information exchange, collaboration, and peer review. B) To compare and discuss techniques for data selection, measurement, error assessment, modeling methodologies, etc.; to compare results in regions where they overlap and understand the causes of observed differences. C) To hear about the U.S. research customers' (AFTAC and the DOE Knowledge Base) current and anticipated interests in surface wave research. D) To discuss information flow and integration: how can research results be prepared for efficient use and integration into operational systems? E) To identify and discuss fruitful future directions for research.

  13. Multibody modeling and verification

    NASA Technical Reports Server (NTRS)

    Wiens, Gloria J.

    1989-01-01

    A summary of a ten-week project on flexible multibody modeling, verification and control is presented. Emphasis was on the need for experimental verification. A literature survey was conducted to gather information on the existence of experimental work related to flexible multibody systems. The first portion of the assigned task encompassed the modeling aspects of flexible multibodies that can undergo large angular displacements. Research in the area of modeling aspects was also surveyed, with special attention given to the component mode approach. Resulting from this is a research plan on various modeling aspects to be investigated over the next year. The relationship between large angular displacements, boundary conditions, mode selection, and system modes is of particular interest. The other portion of the assigned task was the generation of a test plan for experimental verification of analytical and/or computer analysis techniques used for flexible multibody systems. Based on current and expected frequency ranges of flexible multibody systems to be used in space applications, an initial test article was selected and designed. A preliminary TREETOPS computer analysis was run to ensure frequency content in the low frequency range, 0.1 to 50 Hz. The initial specifications of experimental measurement and instrumentation components were also generated. Resulting from this effort is the initial multi-phase plan for a Ground Test Facility of Flexible Multibody Systems for Modeling Verification and Control. The plan focuses on the Multibody Modeling and Verification (MMV) Laboratory. General requirements of the Unobtrusive Sensor and Effector (USE) and the Robot Enhancement (RE) laboratories were considered during the laboratory development.

  14. Seismic field measurements in Kylylahti, Finland, in support of the further development of geophysical seismic techniques for CTBT On-site Inspections

    NASA Astrophysics Data System (ADS)

    Labak, Peter; Lindblom, Pasi; Malich, Gregor

    2017-04-01

    The Integrated Field Exercise of 2014 (IFE14) was a field event held in the Hashemite Kingdom of Jordan (with concurrent activities in Austria) during which the operational and technical capabilities of a Comprehensive Test Ban Treaty (CTBT) on-site inspection (OSI) were tested in an integrated manner. Many of the inspection techniques permitted by the CTBT were applied during IFE14, including a range of geophysical techniques; however, one technique foreseen by the CTBT but not yet developed is resonance seismometry. During August and September 2016, seismic field measurements were conducted in the region of Kylylahti, Finland, in support of the further development of geophysical seismic techniques for OSIs. Forty-five seismic stations were used to continuously acquire seismic signals. During that period, data from local, regional and teleseismic natural and man-made events were acquired, including from a devastating earthquake in Italy and the nuclear explosion announced by the Democratic People's Republic of Korea on 9 September 2016. Data were also acquired following the small-scale use of man-made chemical explosives in the area and the use of vibratory sources. This presentation will show examples from the data set and discuss its use for the development of resonance seismometry for OSIs.

  15. Verification Results of Jet Resonance-enhanced Multiphoton Ionization as a Real-time PCDD/F Emission Monitor

    EPA Science Inventory

    The Jet REMPI (Resonance Enhanced Multiphoton Ionization) monitor was tested on a hazardous waste firing boiler for its ability to determine concentrations of polychlorinated dibenzodioxins and dibenzofurans (PCDDs/Fs). Jet REMPI is a real time instrument capable of highly selec...

  16. Verification Results of Jet Resonance-enhanced Multiphoton Ionization as a Real-time PCDD/F Emission Monitor

    EPA Science Inventory

    The Jet REMPI (Resonance Enhanced Multiphoton Ionization) monitor was tested on a hazardous waste firing boiler for its ability to determine concentrations of polychlorinated dibenzodioxins and dibenzofurans (PCDDs/Fs). Jet REMPI is a real time instrument capable of highly selec...

  17. Uranium systems to enhance benchmarks for use in the verification of criticality safety computer models. Final report, February 16, 1990--December 31, 1994

    SciTech Connect

    Busch, R.D.

    1995-02-24

    Dr. Robert Busch of the Department of Chemical and Nuclear Engineering was the principal investigator on this project, with technical direction provided by the staff of the Nuclear Criticality Safety Group at Los Alamos. During the period of the contract, he had a number of graduate and undergraduate students working on subtasks. The objective of this work was to develop information on uranium systems to enhance benchmarks for use in the verification of criticality safety computer models. During the first year of the project, most of the work focused on setting up the SUN SPARC-1 workstation and acquiring the literature describing the critical experiments. By August 1990, the workstation was operational with the current version of TWODANT loaded on the system. The MCNP version 4 tape was made available from Los Alamos late in 1990. Various documents were acquired which provide the initial descriptions of the critical experiments under consideration as benchmarks. The next four years were spent working on various benchmark projects. A number of publications and presentations were made on this material; these are briefly discussed in this report.

  18. Swarm Verification

    NASA Technical Reports Server (NTRS)

    Holzmann, Gerard J.; Joshi, Rajeev; Groce, Alex

    2008-01-01

    Reportedly, supercomputer designer Seymour Cray once said that he would sooner use two strong oxen to plow a field than a thousand chickens. Although this is undoubtedly wise when it comes to plowing a field, it is not so clear for other types of tasks. Model checking problems are of the proverbial "needle in a haystack" type. Such problems can often be parallelized easily. Alas, none of the usual divide-and-conquer methods can be used to parallelize the working of a model checker. Given that it has become easier than ever to gain access to large numbers of computers to perform even routine tasks, it is becoming more and more attractive to find alternate ways to use these resources to speed up model checking tasks. This paper describes one such method, called swarm verification.
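
    The swarm idea can be illustrated with a toy state space: many small, independently seeded, randomized searches over the same states, each exploring in a different order, any one of which may hit the "needle". This is only a sketch of the concept, not the Spin-based implementation the paper describes:

    ```python
    import random

    def successors(state):
        # Toy transition relation: each integer state has two successors.
        return [(state * 3 + 1) % 10_000, (state * 7 + 3) % 10_000]

    def swarm_member(seed, target, max_steps):
        """One small randomized search: DFS with a private shuffle seed."""
        rng = random.Random(seed)
        stack, seen, steps = [0], set(), 0
        while stack and steps < max_steps:
            steps += 1
            state = stack.pop()
            if state == target:
                return True
            if state in seen:
                continue
            seen.add(state)
            succs = successors(state)
            rng.shuffle(succs)      # each member explores in a different order
            stack.extend(succs)
        return False

    def swarm(target, members=16, max_steps=50_000):
        # Members are fully independent; in practice each would run as its
        # own process on a separate core or machine.
        return any(swarm_member(seed, target, max_steps) for seed in range(members))

    # 8403 is reachable from 0 (0 -> 3 -> 24 -> 171 -> 1200 -> 8403).
    print(swarm(target=8_403))      # prints True
    ```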

  19. Analysis and experimental verification of multiple scattering of acoustoelastic waves in thin plates for enhanced energy harvesting

    NASA Astrophysics Data System (ADS)

    Darabi, Amir; Leamy, Michael J.

    2017-08-01

    Acoustoelastic wave energy harvesting in thin plates and other structures has recently gained attention from the energy harvesting research community. Metamaterial-inspired concepts for enhancing wave power generation have been investigated, including metamaterial funnels, mirrors, and defect-based resonators. In support of such concepts, this paper presents an electromechanically coupled, multiple scattering formulation for accurately modeling, exploring, and optimizing metamaterial-based harvesting systems incorporating scatterers (e.g., cylindrical inclusions and voids). Following development, the formulation is applied to determining optimal arrangements of scatterers, nominally in a semi-elliptical path, which maximize the electrical power harvested. This is done, in part, by diminishing the side lobes resulting from ellipse truncation. Optimization results exhibit minimized side lobes and harvester power nearly ten times that of the non-optimized case. Finally, an experimental study is presented which confirms many of the model predictions.

  20. Optimal design of antireflection coating and experimental verification by plasma enhanced chemical vapor deposition in small displays

    SciTech Connect

    Yang, S. M.; Hsieh, Y. C.; Jeng, C. A.

    2009-03-15

    Conventional antireflection coating by thin films of quarter-wavelength thickness is limited by material selections and these films' refractive indices. The optimal design by non-quarter-wavelength thickness is presented in this study. A multilayer thin-film model is developed by the admittance loci to show that the two-layer thin film of SiN{sub x}/SiO{sub y} at 124/87 nm and three layer of SiN{sub x}/SiN{sub y}/SiO{sub z} at 58/84/83 nm can achieve average transmittances of 94.4% and 94.9%, respectively, on polymer, glass, and silicon substrates. The optimal design is validated by plasma enhanced chemical vapor deposition of N{sub 2}O/SiH{sub 4} and NH{sub 3}/SiH{sub 4} to achieve the desired optical constants. Application of the antireflection coating to a 4 in. liquid crystal display demonstrates that the transmittance is over 94%, the mean luminance can be increased by 25%, and the total reflection angle increased from 41 deg. to 58 deg.
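
    The admittance/characteristic-matrix calculation underlying such non-quarter-wave designs can be sketched at normal incidence using the standard lossless thin-film formulation; the refractive indices below are assumed round numbers for illustration, not the paper's measured optical constants:

    ```python
    import cmath

    def stack_reflectance(layers, n_inc, n_sub, wavelength_nm):
        """layers: [(refractive_index, thickness_nm), ...] from the incident side."""
        B, C = 1.0 + 0j, n_sub + 0j         # start from the substrate admittance
        for n, d in reversed(layers):        # apply characteristic matrices upward
            delta = 2 * cmath.pi * n * d / wavelength_nm
            cosd, sind = cmath.cos(delta), cmath.sin(delta)
            B, C = cosd * B + 1j * sind / n * C, 1j * n * sind * B + cosd * C
        r = (n_inc * B - C) / (n_inc * B + C)
        return abs(r) ** 2                   # T = 1 - R for a lossless stack

    # Two-layer design of the SiNx/SiOy type on glass (assumed indices):
    layers = [(1.46, 87.0), (1.90, 124.0)]   # air side first: low index over high
    for wl in (450.0, 550.0, 650.0):
        R = stack_reflectance(layers, n_inc=1.0, n_sub=1.52, wavelength_nm=wl)
        print(f"{wl:.0f} nm: R = {100 * R:.2f} %")
    ```

    As sanity checks, a bare glass surface gives R ≈ 4.3 %, and a single quarter-wave layer with n = sqrt(1.52) nulls the reflectance exactly, as the classical theory requires.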

  1. The Case of the 12 May 2010 Event in North Korea: the Role of Temporary Seismic Deployments as National Technical Means for CTBT Verification

    NASA Astrophysics Data System (ADS)

    Koch, K.; Kim, W. Y.; Schaff, D. P.; Richards, P. G.

    2015-12-01

    Since 2012 there has been debate about a low-yield nuclear explosion within North Korea, initially claimed to have occurred in April/May 2010 on the basis of a number of Level 5 radionuclide detections from stations of the radionuclide subnetwork of the International Monitoring System (IMS) and additional reports from similar national facilities. Whereas the announced nuclear tests in North Korea in 2006, 2009 and 2013 were clearly detected seismically, there was initially a lack of detections from the seismological component of the IMS corresponding to a possible nuclear test in 2010. Work published recently by Zhang and Wen in Seismological Research Letters (Jan/Feb 2015), inferring seismological evidence for an explosion in North Korea at about 0009 hours (UTC) on 12 May 2010, has attracted further attention. Previous studies of the seismicity of the North Korean test site for days prior to this date had not found any such evidence from IMS or non-IMS stations. The data used by Zhang and Wen were from stations in northeastern China about 80 to 200 km from the North Korean test site and are currently not available for open research. A search for openly available data was undertaken, resulting in relevant waveforms obtained both from the IRIS Consortium (from a PASSCAL experiment in northeastern China, as noted also by Ford and Walter, 2015) and from another temporary seismic deployment, also in China. The data from these stations showed signals consistent with the seismic disturbance found by Zhang and Wen. These supplementary stations thus constitute a monitoring resource providing objective data, in the present case for an event below magnitude 2 and thus much smaller than can be monitored by the usual assets. Efforts are currently underway to use the data from these stations to investigate the compatibility of the event with other explosion-type events, or with an earthquake.

  2. Verification Games: Crowd-Sourced Formal Verification

    DTIC Science & Technology

    2016-03-01

    The verification tools developed by the Programming Languages and Software Engineering group were improved. A series of games... software makes it imperative to find more effective and efficient mechanisms for improving software reliability. Formal verification is an important part of this effort, since it is the only way to be certain that a given piece of software is free of (certain types of) errors. To date, formal

  3. Generic Verification Protocol for Verification of Online Turbidimeters

    EPA Science Inventory

    This protocol provides generic procedures for implementing a verification test for the performance of online turbidimeters. The verification tests described in this document will be conducted under the Environmental Technology Verification (ETV) Program. Verification tests will...

  4. Big Data solution for CTBT monitoring: CEA-IDC joint global cross correlation project

    NASA Astrophysics Data System (ADS)

    Bobrov, Dmitry; Bell, Randy; Brachet, Nicolas; Gaillard, Pierre; Kitov, Ivan; Rozhkov, Mikhail

    2014-05-01

    Waveform cross-correlation, when applied to historical datasets of seismic records, provides dramatic improvements in the detection, location, and magnitude estimation of natural and manmade seismic events. With correlation techniques, the amplitude threshold of signal detection can be reduced globally by a factor of 2 to 3 relative to the currently standard beamforming and STA/LTA detectors. The gain in sensitivity corresponds to a body wave magnitude reduction of 0.3 to 0.4 units and doubles the number of events meeting high quality requirements (e.g. detected by three or more seismic stations of the International Monitoring System (IMS)). This gain is crucial for seismic monitoring under the Comprehensive Nuclear-Test-Ban Treaty. The International Data Centre (IDC) dataset includes more than 450,000 seismic events, tens of millions of raw detections, and continuous seismic data from the primary IMS stations since 2000. This high-quality dataset is a natural candidate for an extensive cross-correlation study and the basis of further enhancements in monitoring capabilities. Without this historical dataset recorded by the permanent IMS seismic network, any improvements would not be feasible. However, due to the mismatch between the volume of data and the performance of standard information technology infrastructure, it is impossible to process all the data within a tolerable elapsed time. To tackle this problem, known as "Big Data", the CEA/DASE is part of the French project "DataScale". One objective is to reanalyze 10 years of waveform data from the IMS network with the cross-correlation technique on a dedicated High Performance Computing (HPC) infrastructure operated by the Centre de Calcul Recherche et Technologie (CCRT) at the CEA of Bruyères-le-Châtel. Within 2 years we plan to enhance the detection and phase association algorithms (also using machine learning and automatic classification) and process about 30 terabytes of data provided by the IDC to
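
    A master-event correlation detector of the kind described can be sketched as follows: slide a normalized template over continuous data and detect on the correlation coefficient rather than on raw amplitude, which is how a signal buried at the noise level becomes detectable. All signals and parameters here are synthetic illustrations, not IDC data:

    ```python
    import numpy as np

    def normalized_cc(data, template):
        """Sliding Pearson correlation of a template against continuous data."""
        n = len(template)
        t = (template - template.mean()) / template.std()
        cc = np.empty(len(data) - n + 1)
        for i in range(len(cc)):
            w = data[i:i + n]
            s = w.std()
            cc[i] = ((w - w.mean()) @ t) / (n * s) if s > 0 else 0.0
        return cc

    rng = np.random.default_rng(0)
    template = np.sin(2 * np.pi * np.linspace(0.0, 4.0, 200))  # 4-cycle wavelet
    data = rng.normal(0.0, 1.0, 2000)      # unit-variance background noise
    data[1200:1400] += template            # buried signal at the noise level
    cc = normalized_cc(data, template)
    print("peak correlation near sample", int(np.argmax(cc)))

    # The quoted sensitivity gain follows from mb ~ log10(A): lowering the
    # detectable amplitude by a factor of 2 to 3 lowers the magnitude
    # threshold by log10(2) to log10(3), i.e. roughly 0.3 to 0.5 units.
    ```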

  5. Columbus pressurized module verification

    NASA Technical Reports Server (NTRS)

    Messidoro, Piero; Comandatore, Emanuele

    1986-01-01

    The baseline verification approach of the COLUMBUS Pressurized Module was defined during the A and B1 project phases. Peculiarities of the verification program are the testing requirements derived from the permanent manned presence in space. The model philosophy and the test program have been developed in line with the overall verification concept. Such critical areas as meteoroid protections, heat pipe radiators and module seals are identified and tested. Verification problem areas are identified and recommendations for the next development are proposed.

  6. Verification of VLSI designs

    NASA Technical Reports Server (NTRS)

    Windley, P. J.

    1991-01-01

    In this paper we explore the specification and verification of VLSI designs. The paper focuses on abstract specification and verification of functionality using mathematical logic as opposed to low-level boolean equivalence verification such as that done using BDD's and Model Checking. Specification and verification, sometimes called formal methods, is one tool for increasing computer dependability in the face of an exponentially increasing testing effort.

  7. A fast and robust method for moment tensor and depth determination of shallow seismic events in CTBT related studies.

    NASA Astrophysics Data System (ADS)

    Baker, Ben; Stachnik, Joshua; Rozhkov, Mikhail

    2017-04-01

    The International Data Center is required to conduct expert technical analysis and special studies to improve event parameters and assist States Parties in identifying the source of a specific event, according to the Protocol to the Comprehensive Nuclear-Test-Ban Treaty. Determination of a seismic event's source mechanism and depth is closely related to these tasks. It is typically done through a linearized inversion of the waveforms for a complete or partial set of source parameters, or through a similarly defined grid search over precomputed Green's functions created for particular source models. In this presentation we demonstrate preliminary results obtained with the latter approach from an improved software design. In this development we aimed to be compliant with the different modes of the CTBT monitoring regime: cover a wide range of source-receiver distances (regional to teleseismic), resolve shallow source depths, provide full moment tensor solutions based on body and surface wave recordings, be fast enough to satisfy both on-demand studies and automatic processing, properly incorporate the observed waveforms and any a priori uncertainties, and accurately estimate posterior uncertainties. Posterior distributions of the moment tensor parameters show narrow peaks where a significant number of reliable surface wave observations are available. For the earthquake examples, the fault orientation (strike, dip, and rake) posterior distributions also provide results consistent with published catalogues. Inclusion of observations on horizontal components will provide further constraints. In addition, the calculation of teleseismic P-wave Green's functions is improved through prior analysis to determine an appropriate attenuation parameter for each source-receiver path. The implemented HDF5-based pre-packaging of Green's functions allows much greater flexibility in utilizing different software packages and methods for computation. Further additions will have the rapid use of Instaseis
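
    The grid-search approach described, with synthetics formed as linear combinations of precomputed Green's functions and scored by variance reduction against the observed waveforms, can be sketched as follows. The Green's functions, depth grid, and moment tensor grid below are fabricated toy data, not the IDC implementation:

    ```python
    import numpy as np

    def grid_search(obs, greens_by_depth, mt_grid):
        """greens_by_depth: {depth_km: G of shape (6, nsamples)}; mt_grid: (m, 6).

        Returns the (variance reduction, depth, moment tensor) of the best trial.
        """
        best = (-np.inf, None, None)
        for depth, G in greens_by_depth.items():
            syn = mt_grid @ G                 # synthetics for all trial tensors
            vr = 1.0 - np.sum((obs - syn) ** 2, axis=1) / np.sum(obs ** 2)
            i = int(np.argmax(vr))
            if vr[i] > best[0]:
                best = (vr[i], depth, mt_grid[i])
        return best

    # Toy demonstration with fabricated Green's functions:
    rng = np.random.default_rng(1)
    depths = {d: rng.normal(size=(6, 500)) for d in (0.5, 1.0, 2.0)}  # km
    true_mt = np.array([1.0, 1.0, 1.0, 0.0, 0.0, 0.0])  # isotropic, explosion-like
    obs = true_mt @ depths[1.0] + rng.normal(0.0, 0.05, 500)
    mt_grid = np.vstack([rng.normal(size=(4000, 6)), true_mt])  # truth is sampled
    vr, depth, mt = grid_search(obs, depths, mt_grid)
    print(f"best depth = {depth} km, VR = {vr:.3f}")
    ```

    Because the synthetics are linear in the six independent moment tensor components, all trial mechanisms at one depth reduce to a single matrix product, which is what makes an exhaustive search over the tensor space affordable.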

  8. Verification of Adaptive Systems

    SciTech Connect

    Pullum, Laura L; Cui, Xiaohui; Vassev, Emil; Hinchey, Mike; Rouff, Christopher; Buskens, Richard

    2012-01-01

    Adaptive systems are critical for future space and other unmanned and intelligent systems. Verification of these systems is also critical for their use in systems with potential harm to human life or with large financial investments. Due to their nondeterministic nature and extremely large state space, current methods for verification of software systems are not adequate to provide a high level of assurance for them. The combination of stabilization science, high performance computing simulations, compositional verification and traditional verification techniques, plus operational monitors, provides a complete approach to verification and deployment of adaptive systems that has not been used before. This paper gives an overview of this approach.

  9. Use of open source information and commercial satellite imagery for nuclear nonproliferation regime compliance verification by a community of academics

    NASA Astrophysics Data System (ADS)

    Solodov, Alexander

    The proliferation of nuclear weapons is a great threat to world peace and stability. The question of strengthening the nonproliferation regime has been open for a long period of time. In 1997 the International Atomic Energy Agency (IAEA) Board of Governors (BOG) adopted the Additional Safeguards Protocol. The purpose of the protocol is to enhance the IAEA's ability to detect undeclared production of fissile materials in member states. However, the IAEA does not always have sufficient human and financial resources to accomplish this task. Developed here is a concept for making use of human and technical resources available in academia that could be used to enhance the IAEA's mission. The objective of this research was to study the feasibility of an academic community using commercially or publicly available sources of information and products for the purpose of detecting covert facilities and activities intended for the unlawful acquisition of fissile materials or production of nuclear weapons. In this study, the availability and use of commercial satellite imagery systems, commercial computer codes for satellite imagery analysis, the Comprehensive Test Ban Treaty (CTBT) verification International Monitoring System (IMS), publicly available information sources such as watchdog groups and press reports, and Customs Services information were explored. A system for integrating these data sources to form conclusions was also developed. The results proved that publicly and commercially available sources of information and data analysis can be a powerful tool in tracking violations in the international nuclear nonproliferation regime, and a framework for implementing these tools in the academic community was developed. As a result of this study, the formation of an International Nonproliferation Monitoring Academic Community (INMAC) is proposed. This would be an independent organization consisting of academics (faculty, staff and students) from both nuclear weapon states (NWS) and

  10. Simulation verification techniques study

    NASA Technical Reports Server (NTRS)

    Schoonmaker, P. B.; Wenglinski, T. H.

    1975-01-01

    Results are summarized of the simulation verification techniques study which consisted of two tasks: to develop techniques for simulator hardware checkout and to develop techniques for simulation performance verification (validation). The hardware verification task involved definition of simulation hardware (hardware units and integrated simulator configurations), survey of current hardware self-test techniques, and definition of hardware and software techniques for checkout of simulator subsystems. The performance verification task included definition of simulation performance parameters (and critical performance parameters), definition of methods for establishing standards of performance (sources of reference data for validation), and definition of methods for validating performance. Both major tasks included definition of verification software and assessment of verification data base impact. An annotated bibliography of all documents generated during this study is provided.

  11. Improvements in Calibration and Analysis of the CTBT-relevant Radioxenon Isotopes with High Resolution SiPIN-based Electron Detectors

    NASA Astrophysics Data System (ADS)

    Khrustalev, K.

    2016-12-01

    The current process for the calibration of the beta-gamma detectors used for radioxenon isotope measurements for CTBT purposes is laborious and time consuming. It uses a combination of point sources and gaseous sources, resulting in differences between the energy and resolution calibrations. The emergence of high resolution SiPIN-based electron detectors allows improvements to be made to the calibration and analysis process. Thanks to the high electron resolution of SiPIN detectors (~8-9 keV at 129 keV) compared to plastic scintillators (~35 keV at 129 keV), many more conversion-electron (CE) peaks (from radioxenon and radon progenies) can be resolved and used for energy and resolution calibration in the energy range of the CTBT-relevant radioxenon isotopes. The long-term stability of the SiPIN energy calibration allows one to significantly reduce the time spent on the QC measurements needed for checking the stability of the E/R calibration. The second-order polynomials currently used for E/R calibration fitting are unphysical and should be replaced by a linear energy calibration for NaI and SiPIN, owing to the high linearity and dynamic range of modern digital DAQ systems, and the resolution calibration functions should be modified to reflect the underlying physical processes. Alternatively, one can abandon fitting functions entirely and use only point values of E/R (similar to the efficiency calibration currently used) at the energies relevant for the isotopes of interest (ROI - Regions Of Interest). The current analysis treats the detector as a set of single-channel analysers, with an established set of coefficients relating the positions of the ROIs to the positions of the QC peaks. The analysis of the spectra can be made more robust by peak and background fitting in the ROIs, with a single free parameter (peak area) for the potential peaks from the known isotopes and a fixed set of E/R calibration values.
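
    The single-free-parameter ROI fit proposed above becomes a closed-form least-squares problem once the energy and resolution calibration fixes the peak position and width. The following is an illustrative sketch under that assumption; the numbers (peak energy, width, background) and the function names are made up for the example, not taken from station software.

```python
import numpy as np

def gaussian(x, mu, sigma):
    """Unit-area Gaussian peak shape."""
    return np.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * np.sqrt(2 * np.pi))

def fit_peak_area(counts, energies, mu, sigma, background):
    """Least-squares peak area with fixed shape (mu, sigma) and background.

    The model counts = background + area * shape is linear in `area`,
    so the estimate is shape . residual / (shape . shape).
    """
    shape = gaussian(energies, mu, sigma)
    residual = counts - background
    return float(shape @ residual / (shape @ shape))

# Synthetic ROI around a 129 keV conversion-electron line
energies = np.linspace(110.0, 150.0, 81)
true_area, mu, sigma = 500.0, 129.0, 3.6   # ~8.5 keV FWHM -> sigma ~3.6
background = np.full_like(energies, 2.0)
counts = background + true_area * gaussian(energies, mu, sigma)

area = fit_peak_area(counts, energies, mu, sigma, background)
```

    With position and width pinned by the calibration, no nonlinear optimizer is needed, which is what makes the fit robust for automated pipelines.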

  12. Software verification and testing

    NASA Technical Reports Server (NTRS)

    1985-01-01

    General procedures for software verification and validation are provided as a guide for managers, programmers, and analysts involved in software development. The verification and validation procedures described are based primarily on testing techniques. Testing refers to the execution of all or part of a software system for the purpose of detecting errors. Planning, execution, and analysis of tests are outlined in this document. Code reading and static analysis techniques for software verification are also described.

  13. On-site inspection for verification of a Comprehensive Test Ban Treaty

    SciTech Connect

    Heckrotte, W.

    1986-10-01

    A seismic monitoring system and on-site inspections are the major components of a verification system for a Comprehensive Test Ban Treaty (CTBT) to give parties assurance that clandestine underground nuclear weapon tests are not taking place. The primary task lies with the seismic monitoring system which must be capable of identifying most earthquakes in the magnitude range of concern as earthquakes, leaving a small number of unidentified events. If any unidentified event on the territory of one party appeared suspicious to another party, and thus potentially an explosion, an on-site inspection could be invoked to decide whether or not a nuclear explosion had taken place. Over the years, on-site inspections have been one of the most contentious issues in test ban negotiations and discussions. In the uncompleted test ban negotiations of 1977-80 between the US, UK, and USSR, voluntary OSIs were established as a basis for negotiation. Voluntary OSIs would require between the parties a common interest and cooperation toward resolving suspicions if OSIs were to serve the purpose of confidence building. On the technical level, an OSI could not assure identification of a clandestine test, but an evader would probably reject any request for an OSI at the site of an evasive test, rather than run the risk of an OSI. The verification system does not provide direct physical evidence of a violation. This could pose a difficult and controversial decision on compliance. 16 refs.

  14. Voltage verification unit

    DOEpatents

    Martin, Edward J [Virginia Beach, VA

    2008-01-15

    A voltage verification unit and method for determining the absence of potentially dangerous potentials within a power supply enclosure without Mode 2 work is disclosed. With this device and method, a qualified worker follows a relatively simple protocol that involves a function test (hot, cold, hot) of the voltage verification unit before Lock Out/Tag Out; once the Lock Out/Tag Out is completed, testing or "trying" by simply reading a display on the voltage verification unit can be accomplished without exposing the operator to the interior of the voltage supply enclosure. According to a preferred embodiment, the voltage verification unit includes test leads to allow diagnostics with other meters, without the necessity of accessing potentially dangerous bus bars or the like.

  15. Modular Machine Code Verification

    DTIC Science & Technology

    2007-05-01

    assembly level program verification is to design a type system for assembly language. Partly inspired by the Typed Intermediate Language (TIL) [57... designed to support direct verification of assembly programs with non-trivial properties not expressible in traditional types. Besides the examples... provably sound TAL for back-end optimization. In Proc. 2003 ACM Conference on Programming Language Design and Implementation, pages 208-219. ACM

  16. Can self-verification strivings fully transcend the self-other barrier? Seeking verification of ingroup identities.

    PubMed

    Gómez, Angel; Seyle, D Conor; Huici, Carmen; Swann, William B

    2009-12-01

    Recent research has demonstrated self-verification strivings in groups, such that people strive to verify collective identities, which are personal self-views (e.g., "sensitive") associated with group membership (e.g., "women"). Such demonstrations stop short of showing that the desire for self-verification can fully transcend the self-other barrier, as in people working to verify ingroup identities (e.g., "Americans are loud") even when such identities are not self-descriptive ("I am quiet and unassuming"). Five studies focus on such ingroup verification strivings. Results indicate that people prefer to interact with individuals who verify their ingroup identities over those who enhance these identities (Experiments 1-5). Strivings for ingroup identity verification were independent of the extent to which the identities were self-descriptive but were stronger among participants who were highly invested in their ingroup identities, as reflected in high certainty of these identities (Experiments 1-4) and high identification with the group (Experiments 1-5). In addition, whereas past demonstrations of self-verification strivings have been limited to efforts to verify the content of identities (Experiments 1-3), the findings also show that people strive to verify the valence of their identities (i.e., the extent to which the identities are valued; Experiments 4 and 5). Self-verification strivings, rather than self-enhancement strivings, appeared to motivate participants' strivings for ingroup identity verification. Links to collective self-verification strivings and social identity theory are discussed.

  17. Enhanced detectability of fluorinated derivatives of N,N-dialkylamino alcohols and precursors of nitrogen mustards by gas chromatography coupled to Fourier transform infrared spectroscopy analysis for verification of chemical weapons convention.

    PubMed

    Garg, Prabhat; Purohit, Ajay; Tak, Vijay K; Dubey, D K

    2009-11-06

    N,N-Dialkylamino alcohols, N-methyldiethanolamine, N-ethyldiethanolamine and triethanolamine are the precursors of VX-type nerve agents and of three different nitrogen mustards, respectively. Their detection and identification is of paramount importance for verification analysis under the chemical weapons convention. GC-FTIR is used as a complementary technique to GC-MS for identification of these analytes. One constraint of GC-FTIR, its low sensitivity, was overcome by converting the analytes to their fluorinated derivatives. Owing to their high absorptivity in the IR region, these derivatives facilitated detection by GC-FTIR analysis. Derivatizing reagents bearing trimethylsilyl, trifluoroacyl and heptafluorobutyryl groups on an imidazole moiety were screened, and the derivatives formed were analyzed quantitatively by GC-FTIR. Of the reagents studied, heptafluorobutyrylimidazole (HFBI) produced the greatest increase in sensitivity by GC-FTIR detection: sensitivity enhancements of 60- to 125-fold were observed for the analytes after HFBI derivatization. Absorbances due to the various functional groups responsible for the enhanced sensitivity were compared by determining their corresponding relative molar extinction coefficients ( [Formula: see text] ), assuming a uniform optical path length. The RSDs for intraday repeatability and interday reproducibility for the various derivatives were 0.2-1.1% and 0.3-1.8%, respectively. A limit of detection (LOD) of 10-15 ng was achieved, and the applicability of the method was tested with unknown samples obtained in international proficiency tests.

  18. Verification of sensitivity enhancement of SWIR imager technology in advanced multispectral SWIR/VIS zoom cameras with constant and variable F-number

    NASA Astrophysics Data System (ADS)

    Hübner, M.; Achtner, B.; Kraus, M.; Siemens, C.; Münzberg, M.

    2016-05-01

    Current designs of combined VIS-color/SWIR camera optics use a constant F-number over the full field of view (FOV) range. Especially in the SWIR, limited space for camera integration in existing system volumes and relatively large pitch dimensions of 15 μm or even 20 μm force the use of relatively high F-numbers to accomplish narrow fields of view of less than 2.0° with reasonable resolution for long range observation and targeting applications. Constant F-number designs are already reported and considered [1] for submarine applications. The comparison of electro-optical performance was based on the detector noise performance and sensitivity data given by the detector manufacturer [1] and further modelling of the imaging chain within linear MTF system theory. The visible channel provides limited twilight capability at F/2.6, but in the SWIR the twilight capability is degraded due to the relatively high F-number of F/7 or F/5.25 for 20 μm and 15 μm pitch, respectively. Differences between prediction and experimental verification of sensitivity in terms of noise equivalent irradiance (NEI) and scenery-based limiting illumination levels are shown for the visible and the SWIR spectral range. Within this context, currently developed improvements using optical zoom designs for the multispectral SWIR/VIS camera optics with continuously variable F-number are discussed, offering increased low light level capabilities at wide and medium fields of view while still enabling an NFOV < 2° with superior long range targeting capabilities under limited atmospheric visibility conditions in daytime.
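
    The sensitivity penalty of the high SWIR F-numbers discussed above follows from the standard radiometric result that focal-plane irradiance from an extended scene scales as 1/F². The following is a rough illustrative comparison under that textbook scaling only, not the authors' full NEI model:

```python
# Relative focal-plane irradiance for an extended source scales as
# (F_ref / F)^2, all other factors (transmission, QE, pitch) held equal.
def relative_irradiance(f_number: float, reference_f_number: float) -> float:
    return (reference_f_number / f_number) ** 2

# SWIR channel at F/7 versus the visible channel at F/2.6
swir_vs_vis = relative_irradiance(7.0, 2.6)   # roughly 0.14, i.e. ~7x less flux
```

    This factor-of-seven flux deficit is one way to see why a variable F-number zoom, which opens up at wide and medium fields of view, recovers low-light capability.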

  19. Explaining Verification Conditions

    NASA Technical Reports Server (NTRS)

    Deney, Ewen; Fischer, Bernd

    2006-01-01

    The Hoare approach to program verification relies on the construction and discharge of verification conditions (VCs) but offers no support to trace, analyze, and understand the VCs themselves. We describe a systematic extension of the Hoare rules by labels so that the calculus itself can be used to build up explanations of the VCs. The labels are maintained through the different processing steps and rendered as natural language explanations. The explanations can easily be customized and can capture different aspects of the VCs; here, we focus on their structure and purpose. The approach is fully declarative and the generated explanations are based only on an analysis of the labels rather than directly on the logical meaning of the underlying VCs or their proofs. Keywords: program verification, Hoare calculus, traceability.
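
    The label-propagation idea can be illustrated with a toy weakest-precondition calculator in which each rule application records an explanation of where the resulting condition came from. This is our own minimal sketch of the general technique, not the authors' system; the toy statement encoding and naive textual substitution are assumptions for the example.

```python
def subst(formula: str, var: str, expr: str) -> str:
    """Naive textual substitution formula[expr/var] for this toy language."""
    return formula.replace(var, f"({expr})")

def wp(stmts, post):
    """Weakest precondition of a statement list, with one label per rule.

    Statements are ("assign", var, expr) tuples, processed back to front
    as in the standard Hoare assignment rule {Q[e/x]} x := e {Q}.
    """
    labels = []
    for kind, *args in reversed(stmts):
        if kind == "assign":
            var, expr = args
            post = subst(post, var, expr)
            labels.append(f"substituted {expr} for {var} (assignment rule)")
    return post, labels

# VC for {?} x := x + 1; y := x * 2 {y > 0}
pre, labels = wp([("assign", "x", "x + 1"), ("assign", "y", "x * 2")], "y > 0")
```

    Rendering the accumulated labels in order already yields a crude natural-language trace of how the final condition was built, which is the structural core of the explanation mechanism the paper describes.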

  20. Voice verification upgrade

    NASA Astrophysics Data System (ADS)

    Davis, R. L.; Sinnamon, J. T.; Cox, D. L.

    1982-06-01

    This contract had two major objectives. The first was to build, test, and deliver to the government an entry control system using speaker verification (voice authentication) as the mechanism for verifying the user's claimed identity. This system included a physical mantrap, with an integral weight scale to prevent more than one user from gaining access with one verification (tailgating). The speaker verification part of the entry control system contained all the updates and embellishments to the algorithm that was developed earlier for the BISS (Base and Installation Security System) system under contract with the Electronic Systems Division of the USAF. These updates were tested prior to and during the contract on an operational system used at Texas Instruments in Dallas, Texas, for controlling entry to the Corporate Information Center (CIC).

  1. Wind gust warning verification

    NASA Astrophysics Data System (ADS)

    Primo, Cristina

    2016-07-01

    Operational meteorological centres around the world increasingly include warnings as one of their regular forecast products. Warnings are issued to warn the public about extreme weather situations that might occur leading to damages and losses. In forecasting these extreme events, meteorological centres help their potential users in preventing the damage or losses they might suffer. However, verifying these warnings requires specific methods. This is due not only to the fact that they happen rarely, but also because a new temporal dimension is added when defining a warning, namely the time window of the forecasted event. This paper analyses the issues that might appear when dealing with warning verification. It also proposes some new verification approaches that can be applied to wind warnings. These new techniques are later applied to a real life example, the verification of wind gust warnings at the German Meteorological Centre ("Deutscher Wetterdienst"). Finally, the results obtained from the latter are discussed.

  2. Nuclear disarmament verification

    SciTech Connect

    DeVolpi, A.

    1993-12-31

    Arms control treaties, unilateral actions, and cooperative activities -- reflecting the defusing of East-West tensions -- are causing nuclear weapons to be disarmed and dismantled worldwide. In order to provide for future reductions and to build confidence in the permanency of this disarmament, verification procedures and technologies would play an important role. This paper outlines arms-control objectives, treaty organization, and actions that could be undertaken. For the purposes of this Workshop on Verification, nuclear disarmament has been divided into five topical subareas: Converting nuclear-weapons production complexes, Eliminating and monitoring nuclear-weapons delivery systems, Disabling and destroying nuclear warheads, Demilitarizing or non-military utilization of special nuclear materials, and Inhibiting nuclear arms in non-nuclear-weapons states. This paper concludes with an overview of potential methods for verification.

  3. Verification and validation benchmarks.

    SciTech Connect

    Oberkampf, William Louis; Trucano, Timothy Guy

    2007-02-01

    Verification and validation (V&V) are the primary means to assess the accuracy and reliability of computational simulations. V&V methods and procedures have fundamentally improved the credibility of simulations in several high-consequence fields, such as nuclear reactor safety, underground nuclear waste storage, and nuclear weapon safety. Although the terminology is not uniform across engineering disciplines, code verification deals with assessing the reliability of the software coding, and solution verification deals with assessing the numerical accuracy of the solution to a computational model. Validation addresses the physics modeling accuracy of a computational simulation by comparing the computational results with experimental data. Code verification benchmarks and validation benchmarks have been constructed for a number of years in every field of computational simulation. However, no comprehensive guidelines have been proposed for the construction and use of V&V benchmarks. For example, the field of nuclear reactor safety has not focused on code verification benchmarks, but it has placed great emphasis on developing validation benchmarks. Many of these validation benchmarks are closely related to the operations of actual reactors at near-safety-critical conditions, as opposed to being more fundamental-physics benchmarks. This paper presents recommendations for the effective design and use of code verification benchmarks based on manufactured solutions, classical analytical solutions, and highly accurate numerical solutions. In addition, this paper presents recommendations for the design and use of validation benchmarks, highlighting the careful design of building-block experiments, the estimation of experimental measurement uncertainty for both inputs and outputs to the code, validation metrics, and the role of model calibration in validation. It is argued that the understanding of predictive capability of a computational model is built on the level of
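
    A code verification benchmark based on a manufactured or classical analytical solution, as recommended above, can be as small as checking a solver's observed order of accuracy. The sketch below is our illustration of the technique for a 1-D Poisson solver: the chosen solution u(x) = sin(πx) and the solver itself are stand-ins, not content from the paper.

```python
import numpy as np

def solve_poisson(n: int) -> float:
    """Solve -u'' = f on (0,1), u(0)=u(1)=0, with n interior points.

    The source f is derived from the chosen solution u = sin(pi x)
    (the manufactured-solution step), so the exact error is computable.
    Returns the max-norm error of the second-order finite-difference solve.
    """
    h = 1.0 / (n + 1)
    x = np.linspace(h, 1.0 - h, n)
    f = np.pi ** 2 * np.sin(np.pi * x)            # -u'' for u = sin(pi x)
    A = (np.diag(np.full(n, 2.0))
         + np.diag(np.full(n - 1, -1.0), 1)
         + np.diag(np.full(n - 1, -1.0), -1)) / h ** 2
    u = np.linalg.solve(A, f)
    return float(np.max(np.abs(u - np.sin(np.pi * x))))

# Halving h should cut the error by ~4x for a second-order scheme
e_coarse = solve_poisson(19)    # h = 1/20
e_fine = solve_poisson(39)      # h = 1/40
```

    The benchmark passes if the observed convergence ratio matches the scheme's formal order; a ratio drifting away from 4 signals a coding error even when individual solutions "look" reasonable.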

  4. Requirement Assurance: A Verification Process

    NASA Technical Reports Server (NTRS)

    Alexander, Michael G.

    2011-01-01

    Requirement assurance is an act of requirement verification which assures the stakeholder or customer that a product requirement has produced its "as realized product" and has been verified with conclusive evidence. Product requirement verification answers the question, "did the product meet the stated specification, performance, or design documentation?" In order to ensure the system was built correctly, the practicing systems engineer must verify each product requirement using the verification methods of inspection, analysis, demonstration, or test. The products of these methods are the "verification artifacts" or "closure artifacts", which are the objective evidence needed to prove the product requirements meet the verification success criteria. Institutional direction is given to the systems engineer in NPR 7123.1A, NASA Systems Engineering Processes and Requirements, with regard to the requirement verification process. In response, the verification methodology offered in this report meets both the institutional process and requirement verification best practices.

  5. Voice Verification Upgrade.

    DTIC Science & Technology

    1982-06-01

    to develop speaker verification techniques for use over degraded communication channels -- specifically telephone lines. A test of BISS-type speaker... verification technology was performed on a degraded channel and compensation techniques were then developed. The fifth program [103 (Total Voice SV... ROME AIR DEVELOPMENT CENTER

  6. General Environmental Verification Specification

    NASA Technical Reports Server (NTRS)

    Milne, J. Scott, Jr.; Kaufman, Daniel S.

    2003-01-01

    The NASA Goddard Space Flight Center's General Environmental Verification Specification (GEVS) for STS and ELV Payloads, Subsystems, and Components is currently being revised based on lessons learned from GSFC engineering and flight assurance. The GEVS has been used by Goddard flight projects for the past 17 years as a baseline from which to tailor their environmental test programs. A summary of the requirements and updates is presented along with the rationale behind the changes. The major test areas covered by the GEVS include mechanical, thermal, and EMC, as well as more general requirements for planning and tracking of the verification programs.

  7. Supporting the President's Arms Control and Nonproliferation Agenda: Transparency and Verification for Nuclear Arms Reductions

    SciTech Connect

    Doyle, James E; Meek, Elizabeth

    2009-01-01

    The President's arms control and nonproliferation agenda is still evolving and the details of initiatives supporting it remain undefined. This means that DOE, NNSA, NA-20, NA-24 and the national laboratories can help define the agenda, and the policies and the initiatives to support it. This will require effective internal and interagency coordination. The arms control and nonproliferation agenda is broad and includes the path-breaking goal of creating conditions for the elimination of nuclear weapons. Responsibility for various elements of the agenda will be widely scattered across the interagency. Therefore an interagency mapping exercise should be performed to identify the key points of engagement within NNSA and other agencies for creating effective policy coordination mechanisms. These can include informal networks, working groups, coordinating committees, interagency task forces, etc. It will be important for NA-20 and NA-24 to get a seat at the table and a functional role in many of these coordinating bodies. The arms control and nonproliferation agenda comprises both mature and developing policy initiatives. The more mature elements such as CTBT ratification and a follow-on strategic nuclear arms treaty with Russia have defined milestones. However, recent press reports indicate that even the START follow-on strategic arms pact that is planned to be complete by the end of 2009 may take significantly longer and be more expansive in scope. The Russians called for proposals to count non-deployed as well as deployed warheads. Other elements of the agenda such as FMCT, future bilateral nuclear arms reductions following a START follow-on treaty, nuclear posture changes, preparations for an international nuclear security summit, strengthened international safeguards and multilateral verification are in much earlier stages of development. 
For this reason any survey of arms control capabilities within the USG should be structured to address potential needs across the

  8. FMEF Electrical single line diagram and panel schedule verification process

    SciTech Connect

    FONG, S.K.

    1998-11-11

    Since the FMEF did not have a mission, a formal drawing verification program was not developed, however, a verification process on essential electrical single line drawings and panel schedules was established to benefit the operations lock and tag program and to enhance the electrical safety culture of the facility. The purpose of this document is to provide a basis by which future landlords and cognizant personnel can understand the degree of verification performed on the electrical single lines and panel schedules. It is the intent that this document be revised or replaced by a more formal requirements document if a mission is identified for the FMEF.

  9. ENVIRONMENTAL TECHNOLOGY VERIFICATION PROGRAM

    EPA Science Inventory

    This presentation will be given at the EPA Science Forum 2005 in Washington, DC. The Environmental Technology Verification Program (ETV) was initiated in 1995 to speed implementation of new and innovative commercial-ready environmental technologies by providing objective, 3rd pa...

  11. Computer Graphics Verification

    NASA Technical Reports Server (NTRS)

    1992-01-01

    Video processing creates technical animation sequences using studio quality equipment to realistically represent fluid flow over space shuttle surfaces, helicopter rotors, and turbine blades. Computer systems Co-op, Tim Weatherford, performing computer graphics verification. Part of Co-op brochure.

  12. FPGA Verification Accelerator (FVAX)

    NASA Technical Reports Server (NTRS)

    Oh, Jane; Burke, Gary

    2008-01-01

    Is verification acceleration possible? Increasing the visibility of the internal nodes of the FPGA results in much faster debug time, and forcing internal signals directly allows a problem condition to be set up very quickly. Is this all? No, this is part of a comprehensive effort to improve the JPL FPGA design and V&V process.

  13. Exomars Mission Verification Approach

    NASA Astrophysics Data System (ADS)

    Cassi, Carlo; Gilardi, Franco; Bethge, Boris

    According to the long-term cooperation plan established by ESA and NASA in June 2009, the ExoMars project now consists of two missions. The first mission will be launched in 2016 under ESA lead, with the objectives of demonstrating the European capability to safely land a surface package on Mars, performing Mars atmosphere investigation, and providing communication capability for present and future ESA/NASA missions. For this mission ESA provides a spacecraft composite, made up of an Entry, Descent & Landing Demonstrator Module (EDM) and a Mars Orbiter Module (OM); NASA provides the launch vehicle and the scientific instruments located on the Orbiter for Mars atmosphere characterisation. A second mission, with its launch foreseen in 2018, is led by NASA, who provides the spacecraft and launcher, the EDL system, and a rover. ESA contributes the ExoMars Rover Module (RM) to provide surface mobility. It includes a drill system allowing drilling down to 2 meters, collecting samples and investigating them for signs of past and present life with exobiological experiments, and investigating the Mars water/geochemical environment. In this scenario Thales Alenia Space Italia, as ESA prime industrial contractor, is in charge of the design, manufacturing, integration and verification of the ESA ExoMars modules, i.e. the spacecraft composite (OM + EDM) for the 2016 mission, the RM for the 2018 mission, and the Rover Operations Control Centre, which will be located at Altec-Turin (Italy). The verification process for the above products is quite complex and includes some peculiarities with limited or no heritage in Europe. Furthermore the verification approach has to be optimised to allow full verification despite significant schedule and budget constraints. The paper presents the verification philosophy tailored for the ExoMars mission in line with the above considerations, starting from the model philosophy, showing the verification activities flow and the sharing of tests

  14. A systems perspective of Comprehensive Test Ban Treaty monitoring and verification

    SciTech Connect

    Walker, L.S.

    1996-11-01

    On September 24, 1996, after decades of discussion and more than two years of intensive international negotiations, President Clinton, followed by representatives of (to date) more than 125 other countries, including the other four declared nuclear weapons states, signed the Comprehensive Test Ban Treaty. Each signatory now faces a complex set of technical and political considerations regarding the advisability of joining the treaty. Those considerations vary from country to country, but for many countries one of the key issues is the extent to which the treaty can be verified. In the case of the US, it is anticipated that treaty verifiability will be an important issue in the US Senate Advice and Consent Hearings. This paper will address treaty verifiability, with an emphasis on the interplay between the various elements of the International monitoring regime, as prescribed in the CTBT Treaty Text and its associated Protocol. These elements, coupled with the National regimes, will serve as an integrated set of overlapping, interlocking measures to support treaty verification. Taken as a whole, they present a formidable challenge to potential testers who wish not to be caught.

  15. Improved Verification for Aerospace Systems

    NASA Technical Reports Server (NTRS)

    Powell, Mark A.

    2008-01-01

    Aerospace systems are subject to many stringent performance requirements to be verified with low risk. This report investigates verification planning using conditional approaches vice the standard classical statistical methods, and usage of historical surrogate data for requirement validation and in verification planning. The example used in this report to illustrate the results of these investigations is a proposed mission assurance requirement with the concomitant maximum acceptable verification risk for the NASA Constellation Program Orion Launch Abort System (LAS). This report demonstrates the following improvements: 1) verification planning using conditional approaches vice classical statistical methods results in plans that are more achievable and feasible; 2) historical surrogate data can be used to bound validation of performance requirements; and, 3) incorporation of historical surrogate data in verification planning using conditional approaches produces even less costly and more reasonable verification plans. The procedures presented in this report may produce similar improvements and cost savings in verification for any stringent performance requirement for an aerospace system.
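The "conditional" verification planning described above is Bayesian in flavor: instead of a classical demonstration test, one asks how many trials are needed so that the posterior probability of meeting a reliability requirement reaches a target confidence. A minimal sketch under assumed toy numbers and a uniform prior (illustrative only, not the report's actual LAS figures or method):

```python
import math

def min_trials(reliability_req: float, confidence: float) -> int:
    """Smallest number of consecutive successful trials n such that,
    under a uniform Beta(1,1) prior, P(R >= reliability_req | n
    successes, 0 failures) >= confidence.  The posterior is
    Beta(n+1, 1), whose CDF at r is r**(n+1), so the condition is
    1 - reliability_req**(n+1) >= confidence."""
    n = math.ceil(math.log(1.0 - confidence)
                  / math.log(reliability_req) - 1.0)
    return max(n, 0)
```

For example, demonstrating 0.999 reliability at 90% confidence under this prior requires 2301 failure-free trials, which makes clear why incorporating surrogate data to tighten the prior can reduce verification cost.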

  16. Verification of the in vivo activity of three distinct cis-acting elements within the Gata1 gene promoter-proximal enhancer in mice.

    PubMed

    Shimizu, Ritsuko; Hasegawa, Atsushi; Ottolenghi, Sergio; Ronchi, Antonella; Yamamoto, Masayuki

    2013-11-01

    The transcription factor GATA1 is essential for erythroid and megakaryocytic cell differentiation. Gata1 hematopoietic regulatory domain (G1HRD) has been shown to recapitulate endogenous Gata1 gene expression in transgenic mouse assays in vivo. G1HRD contains a promoter-proximal enhancer composed of a GATA-palindrome motif, four CP2-binding sites and two CACCC boxes. We prepared transgenic reporter mouse lines in which green fluorescent protein and β-galactosidase expression are driven by wild-type G1HRD (as a positive control) and by G1HRD harboring mutations within these cis-acting elements (as the experimental conditions), respectively. Exploiting this transgenic dual reporter (TDR) assay, we show here that in definitive erythropoiesis, G1HRD activity was markedly affected by individual mutations in the GATA-palindrome motif and the CACCC boxes. Mutation of the CP2-binding sites also moderately decreased G1HRD activity. The combined mutation of the CP2-binding sites and the GATA-palindrome motif resulted in complete loss of G1HRD activity. In contrast, in primitive erythroid cells, individual mutations of each element did not affect G1HRD activity; G1HRD activity was abolished only when these three mutations were combined. These results thus show that all three elements contribute independently and cooperatively to G1HRD activity in vivo in definitive erythropoiesis, whereas they contribute redundantly in primitive erythropoiesis.

  17. RESRAD-BUILD verification.

    SciTech Connect

    Kamboj, S.; Yu, C.; Biwer, B. M.; Klett, T.

    2002-01-31

    The results generated by the RESRAD-BUILD code (version 3.0) were verified with hand or spreadsheet calculations using equations given in the RESRAD-BUILD manual for different pathways. For verification purposes, different radionuclides--H-3, C-14, Na-22, Al-26, Cl-36, Mn-54, Co-60, Au-195, Ra-226, Ra-228, Th-228, and U-238--were chosen to test all pathways and models. Tritium, Ra-226, and Th-228 were chosen because of the special tritium and radon models in the RESRAD-BUILD code. Other radionuclides were selected to represent a spectrum of radiation types and energies. Verification of the RESRAD-BUILD code was conducted with an initial check of all the input parameters for correctness against their original source documents. Verification of the calculations was performed external to the RESRAD-BUILD code with Microsoft® Excel to verify all the major portions of the code. In some cases, RESRAD-BUILD results were compared with those of external codes, such as MCNP (Monte Carlo N-Particle) and RESRAD. The verification was conducted on a step-by-step basis and used different test cases as templates. The following types of calculations were investigated: (1) source injection rate, (2) air concentration in the room, (3) air particulate deposition, (4) radon pathway model, (5) tritium model for volume source, (6) external exposure model, (7) different pathway doses, and (8) time dependence of dose. Some minor errors were identified in version 3.0; these errors have been corrected in later versions of the code. Some possible improvements in the code were also identified.

  18. Robust verification analysis

    SciTech Connect

    Rider, William; Witkowski, Walt; Kamm, James R.; Wildey, Tim

    2016-02-15

    We introduce a new methodology for inferring the accuracy of computational simulations through the practice of solution verification. We demonstrate this methodology on examples from computational heat transfer, fluid dynamics and radiation transport. Our methodology is suited to both well- and ill-behaved sequences of simulations. Our approach to the analysis of these sequences of simulations incorporates expert judgment into the process directly via a flexible optimization framework, and the application of robust statistics. The expert judgment is systematically applied as constraints to the analysis, and together with the robust statistics guards against over-emphasis on anomalous analysis results. We have named our methodology Robust Verification. Our methodology is based on utilizing multiple constrained optimization problems to solve the verification model in a manner that varies the analysis' underlying assumptions. Constraints applied in the analysis can include expert judgment regarding convergence rates (bounds and expectations) as well as bounding values for physical quantities (e.g., positivity of energy or density). This approach then produces a number of error models, which are then analyzed through robust statistical techniques (median instead of mean statistics). This provides self-contained, data and expert informed error estimation including uncertainties for both the solution itself and order of convergence. Our method produces high quality results for the well-behaved cases relatively consistent with existing practice. The methodology can also produce reliable results for ill-behaved circumstances predicated on appropriate expert judgment. We demonstrate the method and compare the results with standard approaches used for both code and solution verification on well-behaved and ill-behaved simulations.
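One concrete instance of the approach described above, an observed-order-of-convergence estimate that applies median statistics over consecutive grid triplets and expert-judgment bounds, might be sketched as follows (a simplified illustration, not the authors' actual Robust Verification implementation):

```python
import math
from statistics import median

def robust_order(solutions, r, p_bounds=(0.5, 4.0)):
    """Median-based estimate of the observed convergence order from a
    sequence of solutions on grids refined by a constant ratio r.
    For each consecutive triplet (f1, f2, f3) the classical estimate is
    p = log(|f2 - f1| / |f3 - f2|) / log(r); taking the median across
    triplets and clipping to expert-judgment bounds guards against
    over-emphasis on anomalous results."""
    ps = []
    for f1, f2, f3 in zip(solutions, solutions[1:], solutions[2:]):
        num, den = abs(f2 - f1), abs(f3 - f2)
        if num > 0 and den > 0:
            ps.append(math.log(num / den) / math.log(r))
    p = median(ps)
    lo, hi = p_bounds
    return min(max(p, lo), hi)
```

For a well-behaved second-order sequence every triplet yields p = 2 and the median changes nothing; for ill-behaved sequences the median discards outlier triplets and the bounds encode the expert's expected convergence-rate range.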

  19. Robust verification analysis

    NASA Astrophysics Data System (ADS)

    Rider, William; Witkowski, Walt; Kamm, James R.; Wildey, Tim

    2016-02-01

    We introduce a new methodology for inferring the accuracy of computational simulations through the practice of solution verification. We demonstrate this methodology on examples from computational heat transfer, fluid dynamics and radiation transport. Our methodology is suited to both well- and ill-behaved sequences of simulations. Our approach to the analysis of these sequences of simulations incorporates expert judgment into the process directly via a flexible optimization framework, and the application of robust statistics. The expert judgment is systematically applied as constraints to the analysis, and together with the robust statistics guards against over-emphasis on anomalous analysis results. We have named our methodology Robust Verification. Our methodology is based on utilizing multiple constrained optimization problems to solve the verification model in a manner that varies the analysis' underlying assumptions. Constraints applied in the analysis can include expert judgment regarding convergence rates (bounds and expectations) as well as bounding values for physical quantities (e.g., positivity of energy or density). This approach then produces a number of error models, which are then analyzed through robust statistical techniques (median instead of mean statistics). This provides self-contained, data and expert informed error estimation including uncertainties for both the solution itself and order of convergence. Our method produces high quality results for the well-behaved cases relatively consistent with existing practice. The methodology can also produce reliable results for ill-behaved circumstances predicated on appropriate expert judgment. We demonstrate the method and compare the results with standard approaches used for both code and solution verification on well-behaved and ill-behaved simulations.

  20. Microcode Verification Project.

    DTIC Science & Technology

    1980-05-01

    Microcode Verification Project. University of Southern California, Stephen D. ... in the production, testing, and maintenance of Air Force software. This effort was undertaken in response to that goal. The objective of the effort was ... rather than hard wiring, is a recent development in computer technology. Hardware diagnostics do not fulfill testing requirements for these computers

  1. Implementation and verification of an enhanced algorithm for the automatic computation of RR-interval series derived from 24 h 12-lead ECGs.

    PubMed

    Hagmair, Stefan; Braunisch, Matthias C; Bachler, Martin; Schmaderer, Christoph; Hasenau, Anna-Lena; Bauer, Axel; Rizas, Kostantinos D; Wassertheurer, Siegfried; Mayer, Christopher C

    2017-01-01

    An important tool in early diagnosis of cardiac dysfunctions is the analysis of electrocardiograms (ECGs) obtained from ambulatory long-term recordings. Heart rate variability (HRV) analysis became a significant tool for assessing the cardiac health. The usefulness of HRV assessment for the prediction of cardiovascular events in end-stage renal disease patients was previously reported. The aim of this work is to verify an enhanced algorithm to obtain an RR-interval time series in a fully automated manner. The multi-lead corrected R-peaks of each ECG lead are used for RR-series computation and the algorithm is verified by a comparison with manually reviewed reference RR-time series. Twenty-four hour 12-lead ECG recordings of 339 end-stage renal disease patients from the ISAR (rISk strAtification in end-stage Renal disease) study were used. Seven universal indicators were calculated to allow for a generalization of the comparison results. The median score of the indicator of synchronization, i.e. intraclass correlation coefficient, was 96.4% and the median of the root mean square error of the difference time series was 7.5 ms. The negligible error and high synchronization rate indicate high similarity and verified the agreement between the fully automated RR-interval series calculated with the AIT Multi-Lead ECGsolver and the reference time series. As a future perspective, HRV parameters calculated on this RR-time series can be evaluated in longitudinal studies to ensure clinical benefit.
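The two headline agreement indicators above, an intraclass correlation coefficient and the RMSE of the difference time series, can be sketched for a pair of RR-interval series as follows (a one-way ICC(1,1) variant is shown for illustration; the study's exact ICC formulation may differ):

```python
import math
from statistics import mean

def rmse(ref, test):
    """Root mean square error of the difference between two
    equal-length RR-interval series (e.g. in milliseconds)."""
    return math.sqrt(mean((r - t) ** 2 for r, t in zip(ref, test)))

def icc_oneway(ref, test):
    """One-way random-effects ICC(1,1) for two paired measurement
    series, computed from the between-target mean square (MSB) and
    within-target mean square (MSW): (MSB - MSW) / (MSB + MSW)."""
    n = len(ref)
    m = [(r + t) / 2 for r, t in zip(ref, test)]   # per-beat means
    gm = mean(m)
    msb = 2 * sum((mi - gm) ** 2 for mi in m) / (n - 1)
    msw = sum((r - mi) ** 2 + (t - mi) ** 2
              for r, t, mi in zip(ref, test, m)) / n
    return (msb - msw) / (msb + msw)
```

With identical series the ICC is exactly 1 and the RMSE 0; small beat-wise deviations of a few milliseconds, as reported above, leave the ICC close to 1.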

  2. TFE Verification Program

    SciTech Connect

    Not Available

    1990-03-01

    The objective of the semiannual progress report is to summarize the technical results obtained during the latest reporting period. The information presented herein will include evaluated test data, design evaluations, the results of analyses and the significance of results. The program objective is to demonstrate the technology readiness of a TFE suitable for use as the basic element in a thermionic reactor with electric power output in the 0.5 to 5.0 MW(e) range, and a full-power life of 7 years. The TFE Verification Program builds directly on the technology and data base developed in the 1960s and 1970s in an AEC/NASA program, and in the SP-100 program conducted in 1983, 1984 and 1985. In the SP-100 program, the attractive features of thermionic power conversion technology were recognized, but concern was expressed over the lack of fast reactor irradiation data. The TFE Verification Program addresses this concern. The general logic and strategy of the program to achieve its objectives is shown on Fig. 1-1. Five prior programs form the basis for the TFE Verification Program: (1) the AEC/NASA program of the 1960s and early 1970s; (2) the SP-100 concept development program; (3) the SP-100 thermionic technology program; (4) the thermionic irradiations program in TRIGA in FY-86; and (5) the Thermionic Technology Program in 1986 and 1987. 18 refs., 64 figs., 43 tabs.

  3. Does Z' equal 1 or 2? Enhanced powder NMR crystallography verification of a disordered room temperature crystal structure of a p38 inhibitor for chronic obstructive pulmonary disease.

    PubMed

    Widdifield, Cory M; Nilsson Lill, Sten O; Broo, Anders; Lindkvist, Maria; Pettersen, Anna; Svensk Ankarberg, Anna; Aldred, Peter; Schantz, Staffan; Emsley, Lyndon

    2017-06-28

    The crystal structure of the Form A polymorph of N-cyclopropyl-3-fluoro-4-methyl-5-[3-[[1-[2-[2-(methylamino)ethoxy]phenyl]cyclopropyl]amino]-2-oxo-pyrazin-1-yl]benzamide (i.e., AZD7624), determined using single-crystal X-ray diffraction (scXRD) at 100 K, contains two molecules in the asymmetric unit (Z' = 2) and has regions of local static disorder. This substance has been in phase IIa drug development trials for the treatment of chronic obstructive pulmonary disease, a disease which affects over 300 million people and contributes to nearly 3 million deaths annually. While attempting to verify the crystal structure using nuclear magnetic resonance crystallography (NMRX), we measured (13)C solid-state NMR (SSNMR) spectra at 295 K that appeared consistent with Z' = 1 rather than Z' = 2. To understand this surprising observation, we used multinuclear SSNMR ((1)H, (13)C, (15)N), gauge-including projector augmented-wave density functional theory (GIPAW DFT) calculations, crystal structure prediction (CSP), and powder XRD (pXRD) to determine the room temperature crystal structure. Due to the large size of AZD7624 (ca. 500 amu, 54 distinct (13)C environments for Z' = 2), static disorder at 100 K, and (as we show) dynamic disorder at ambient temperatures, NMR spectral assignment was a challenge. We introduce a method to enhance confidence in NMR assignments by comparing experimental (13)C isotropic chemical shifts against site-specific DFT-calculated shift distributions established using CSP-generated crystal structures. The assignment and room temperature NMRX structure determination process also included measurements of (13)C shift tensors and the observation of residual dipolar coupling between (13)C and (14)N. CSP generated ca. 90 reasonable candidate structures (Z' = 1 and Z' = 2), which when coupled with GIPAW DFT results, room temperature pXRD, and the assigned SSNMR data, establish Z' = 2 at room temperature. We find that the polymorphic Form A of AZD7624 is

  4. Systematic study of source mask optimization and verification flows

    NASA Astrophysics Data System (ADS)

    Ben, Yu; Latypov, Azat; Chua, Gek Soon; Zou, Yi

    2012-06-01

    Source mask optimization (SMO) has emerged as a powerful resolution enhancement technique (RET) for advanced technology nodes. However, there is a plethora of flow and verification metrics in the field, confounding the end user of the technique. A systematic study of the different flows, and of their possible unification, has been missing. This contribution is intended to reveal the pros and cons of different SMO approaches and verification metrics, understand their commonalities and differences, and provide a generic guideline for RET selection via SMO. The paper discusses three types of variation that commonly arise in SMO, namely pattern preparation and selection, availability of a relevant OPC recipe for a freeform source, and finally the metrics used in source verification. Several pattern selection algorithms are compared and the advantages of systematic pattern selection algorithms are discussed. In the absence of a full resist model for SMO, an alternative SMO flow without a full resist model is reviewed. A preferred verification flow with the quality metrics of DOF and MEEF is examined.

  5. Quantum money with classical verification

    NASA Astrophysics Data System (ADS)

    Gavinsky, Dmitry

    2014-12-01

    We propose and construct a quantum money scheme that allows verification through classical communication with a bank. This is the first demonstration that a secure quantum money scheme exists that does not require quantum communication for coin verification. Our scheme is secure against adaptive adversaries - this property is not directly related to the possibility of classical verification, nevertheless none of the earlier quantum money constructions is known to possess it.

  6. Quantum money with classical verification

    SciTech Connect

    Gavinsky, Dmitry

    2014-12-04

    We propose and construct a quantum money scheme that allows verification through classical communication with a bank. This is the first demonstration that a secure quantum money scheme exists that does not require quantum communication for coin verification. Our scheme is secure against adaptive adversaries - this property is not directly related to the possibility of classical verification, nevertheless none of the earlier quantum money constructions is known to possess it.

  7. Computation and Analysis of the Global Distribution of the Radioxenon Isotope 133Xe based on Emissions from Nuclear Power Plants and Radioisotope Production Facilities and its Relevance for the Verification of the Nuclear-Test-Ban Treaty

    NASA Astrophysics Data System (ADS)

    Wotawa, Gerhard; Becker, Andreas; Kalinowski, Martin; Saey, Paul; Tuma, Matthias; Zähringer, Matthias

    2010-05-01

    Monitoring of radioactive noble gases, in particular xenon isotopes, is a crucial element of the verification of the Comprehensive Nuclear-Test-Ban Treaty (CTBT). The capability of the noble gas network, which is currently under construction, to detect signals from a nuclear explosion critically depends on the background created by other sources. Therefore, the global distribution of these isotopes based on emissions and transport patterns needs to be understood. A significant xenon background exists in the reactor regions of North America, Europe and Asia. An emission inventory of the four relevant xenon isotopes has recently been created, which specifies source terms for each power plant. As the major emitters of xenon isotopes worldwide, a few medical radioisotope production facilities have been recently identified, in particular the facilities in Chalk River (Canada), Fleurus (Belgium), Pelindaba (South Africa) and Petten (Netherlands). Emissions from these sites are expected to exceed those of the other sources by orders of magnitude. In this study, emphasis is put on 133Xe, which is the most prevalent xenon isotope. First, based on the emissions known, the resulting 133Xe concentration levels at all noble gas stations of the final CTBT verification network were calculated and found to be consistent with observations. Second, it turned out that emissions from the radioisotope facilities can explain a number of observed peaks, meaning that atmospheric transport modelling is an important tool for the categorization of measurements. Third, it became evident that Nuclear Power Plant emissions are more difficult to treat in the models, since their temporal variation is high and not generally reported. Fourth, there are indications that the assumed annual emissions may be underestimated by factors of two to ten, while the general emission patterns seem to be well understood. Finally, it became evident that 133Xe sources mainly influence the sensitivity of the

  8. Formal verification of human-automation interaction.

    PubMed

    Degani, Asaf; Heymann, Michael

    2002-01-01

    This paper discusses a formal and rigorous approach to the analysis of operator interaction with machines. It addresses the acute problem of detecting design errors in human-machine interaction and focuses on verifying the correctness of the interaction in complex and automated control systems. The paper describes a systematic methodology for evaluating whether the interface provides the necessary information about the machine to enable the operator to perform a specified task successfully and unambiguously. It also addresses the adequacy of information provided to the user via training material (e.g., user manual) about the machine's behavior. The essentials of the methodology, which can be automated and applied to the verification of large systems, are illustrated by several examples and through a case study of pilot interaction with an autopilot aboard a modern commercial aircraft. The expected application of this methodology is an augmentation and enhancement, by formal verification, of human-automation interfaces.

  9. Formal verification of human-automation interaction

    NASA Technical Reports Server (NTRS)

    Degani, Asaf; Heymann, Michael

    2002-01-01

    This paper discusses a formal and rigorous approach to the analysis of operator interaction with machines. It addresses the acute problem of detecting design errors in human-machine interaction and focuses on verifying the correctness of the interaction in complex and automated control systems. The paper describes a systematic methodology for evaluating whether the interface provides the necessary information about the machine to enable the operator to perform a specified task successfully and unambiguously. It also addresses the adequacy of information provided to the user via training material (e.g., user manual) about the machine's behavior. The essentials of the methodology, which can be automated and applied to the verification of large systems, are illustrated by several examples and through a case study of pilot interaction with an autopilot aboard a modern commercial aircraft. The expected application of this methodology is an augmentation and enhancement, by formal verification, of human-automation interfaces.

  10. Production readiness verification testing

    NASA Technical Reports Server (NTRS)

    James, A. M.; Bohon, H. L.

    1980-01-01

    A Production Readiness Verification Testing (PRVT) program has been established to determine if structures fabricated from advanced composites can be committed on a production basis to commercial airline service. The program utilizes subcomponents which reflect the variabilities in structure that can realistically be expected from current production and quality control technology to estimate the production qualities, variation in static strength, and durability of advanced composite structures. The results of the static tests and a durability assessment after one year of continuous load/environment testing of twenty-two duplicates of each of two structural components (a segment of the front spar and cover of a vertical stabilizer box structure) are discussed.

  11. Verification of LHS distributions.

    SciTech Connect

    Swiler, Laura Painton

    2006-04-01

    This document provides verification test results for the normal, lognormal, and uniform distributions used in Sandia's Latin Hypercube Sampling (LHS) software. The purpose of this testing is to verify that the sample values being generated in LHS are distributed according to the desired distribution types. The testing of distribution correctness is done by examining summary statistics, graphical comparisons using quantile-quantile plots, and formal statistical tests such as the chi-square test, the Kolmogorov-Smirnov test, and the Anderson-Darling test. The overall results from the testing indicate that the generation of normal, lognormal, and uniform distributions in LHS is acceptable.
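The kind of distribution check described, comparing generated samples against the target CDF with a Kolmogorov-Smirnov statistic, can be illustrated for a one-dimensional Latin Hypercube sample of the uniform distribution (a self-contained sketch, not Sandia's LHS code):

```python
import math
import random

def lhs_uniform(n, rng):
    """One-dimensional Latin Hypercube sample of the uniform (0,1)
    distribution: one random draw from each of n equal-width strata,
    returned in shuffled order."""
    cells = [(i + rng.random()) / n for i in range(n)]
    rng.shuffle(cells)
    return cells

def ks_statistic_uniform(sample):
    """Kolmogorov-Smirnov statistic D_n of a sample against the
    uniform (0,1) CDF, F(x) = x."""
    xs = sorted(sample)
    n = len(xs)
    return max(max((i + 1) / n - x, x - i / n)
               for i, x in enumerate(xs))

rng = random.Random(42)
sample = lhs_uniform(1000, rng)
d = ks_statistic_uniform(sample)
# Stratification forces D_n <= 1/n, far below the asymptotic 5%
# critical value of about 1.36 / sqrt(n):
print(d < 1.36 / math.sqrt(len(sample)))  # → True
```

The same comparison against the normal or lognormal CDF only changes the reference CDF used inside the statistic.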

  12. Quality Analysis for Windows, 2002-2003: A Verification Tool. EDExpress Training. Participant Guide.

    ERIC Educational Resources Information Center

    Office of Federal Student Aid (ED), Washington, DC.

    This training manual introduces "Quality Analysis for Windows: A Verification Tool," personal computer software developed by the U.S. Department of Education to assist institutions of higher education in increasing the accuracy of student financial aid awards, improving campus verification procedures, and enhancing management…

  13. CIVET: Continuous Integration, Verification, Enhancement, and Testing

    SciTech Connect

    Alger, Brian; Gaston, Derek R.; Permann, Cody J; Peterson, John W.; Slaughter, Andrew E.; Andrs, David

    2016-11-29

    A Git server (GitHub, GitLab, BitBucket) sends event notifications to the Civet server. These are either a "Pull Request" or a "Push" notification. Civet then checks the database to determine what tests need to be run and marks them as ready to run. Civet clients, running on dedicated machines, query the server for available jobs that are ready to run. When a client gets a job, it executes the scripts attached to the job and reports the output and exit status back to the server. When the client updates the server, the server in turn updates the Git server with the result of the job, as well as updating the main web page.
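The job flow described above can be sketched as a toy in-memory server plus polling client (hypothetical names and methods; the real Civet is a database-backed web service that clients reach over HTTP):

```python
import subprocess

class CivetLikeServer:
    """Minimal in-memory sketch of the CI workflow: a push/PR event
    enqueues jobs, clients claim a ready job, run it, and report the
    result, which the server would then relay to the Git server as a
    commit status and show on its web page."""
    def __init__(self):
        self.ready = []      # jobs marked ready to run
        self.results = {}    # job name -> (exit_status, output)

    def on_git_event(self, jobs):
        """Called for a "Push" or "Pull Request" notification."""
        self.ready.extend(jobs)

    def claim_job(self):
        return self.ready.pop(0) if self.ready else None

    def report(self, name, exit_status, output):
        self.results[name] = (exit_status, output)

def run_client(server):
    """A client repeatedly claims a job, executes its script, and
    reports the output and exit status back to the server."""
    job = server.claim_job()
    while job is not None:
        name, script = job
        proc = subprocess.run(script, shell=True,
                              capture_output=True, text=True)
        server.report(name, proc.returncode, proc.stdout)
        job = server.claim_job()
```

Running a client against a server seeded with one job, e.g. `server.on_git_event([("unit_tests", "echo ok")])`, leaves the job's exit status and output in `server.results`.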

  14. Deductive Verification of Cryptographic Software

    NASA Technical Reports Server (NTRS)

    Almeida, Jose Barcelar; Barbosa, Manuel; Pinto, Jorge Sousa; Vieira, Barbara

    2009-01-01

    We report on the application of an off-the-shelf verification platform to the RC4 stream cipher cryptographic software implementation (as available in the openSSL library), and introduce a deductive verification technique based on self-composition for proving the absence of error propagation.

  15. Software Verification and Validation Procedure

    SciTech Connect

    Olund, Thomas S.

    2008-09-15

    This Software Verification and Validation procedure provides the action steps for the Tank Waste Information Network System (TWINS) testing process. The primary objective of the testing process is to provide assurance that the software functions as intended, and meets the requirements specified by the client. Verification and validation establish the primary basis for TWINS software product acceptance.

  16. HDL to verification logic translator

    NASA Astrophysics Data System (ADS)

    Gambles, J. W.; Windley, P. J.

    The increasingly higher number of transistors possible in VLSI circuits compounds the difficulty in insuring correct designs. As the number of possible test cases required to exhaustively simulate a circuit design explodes, a better method is required to confirm the absence of design faults. Formal verification methods provide a way to prove, using logic, that a circuit structure correctly implements its specification. Before verification is accepted by VLSI design engineers, the stand alone verification tools that are in use in the research community must be integrated with the CAD tools used by the designers. One problem facing the acceptance of formal verification into circuit design methodology is that the structural circuit descriptions used by the designers are not appropriate for verification work and those required for verification lack some of the features needed for design. We offer a solution to this dilemma: an automatic translation from the designers' HDL models into definitions for the higher-ordered logic (HOL) verification system. The translated definitions become the low level basis of circuit verification which in turn increases the designer's confidence in the correctness of higher level behavioral models.

  17. HDL to verification logic translator

    NASA Technical Reports Server (NTRS)

    Gambles, J. W.; Windley, P. J.

    1992-01-01

    The increasingly higher number of transistors possible in VLSI circuits compounds the difficulty in insuring correct designs. As the number of possible test cases required to exhaustively simulate a circuit design explodes, a better method is required to confirm the absence of design faults. Formal verification methods provide a way to prove, using logic, that a circuit structure correctly implements its specification. Before verification is accepted by VLSI design engineers, the stand alone verification tools that are in use in the research community must be integrated with the CAD tools used by the designers. One problem facing the acceptance of formal verification into circuit design methodology is that the structural circuit descriptions used by the designers are not appropriate for verification work and those required for verification lack some of the features needed for design. We offer a solution to this dilemma: an automatic translation from the designers' HDL models into definitions for the higher-ordered logic (HOL) verification system. The translated definitions become the low level basis of circuit verification which in turn increases the designer's confidence in the correctness of higher level behavioral models.

  18. TFE Verification Program

    SciTech Connect

    Not Available

    1993-05-01

    The objective of the semiannual progress report is to summarize the technical results obtained during the latest reporting period. The information presented herein will include evaluated test data, design evaluations, the results of analyses and the significance of results. The program objective is to demonstrate the technology readiness of a TFE (thermionic fuel element) suitable for use as the basic element in a thermionic reactor with electric power output in the 0.5 to 5.0 MW(e) range, and a full-power life of 7 years. The TFE Verification Program builds directly on the technology and data base developed in the 1960s and early 1970s in an AEC/NASA program, and in the SP-100 program conducted in 1983, 1984 and 1985. In the SP-100 program, the attractive features of thermionic power conversion technology were recognized but concern was expressed over the lack of fast reactor irradiation data. The TFE Verification Program addresses this concern.

  19. Cancelable face verification using optical encryption and authentication.

    PubMed

    Taheri, Motahareh; Mozaffari, Saeed; Keshavarzi, Parviz

    2015-10-01

    In a cancelable biometric system, each instance of enrollment is distorted by a transform function, and the output should not be retransformable to the original data. This paper presents a new cancelable face verification system in the encrypted domain. Encrypted facial images are generated by a double random phase encoding (DRPE) algorithm using two keys (RPM1 and RPM2). To make the system noninvertible, a photon counting (PC) method is utilized, which requires a photon distribution mask for information reduction. Verification of sparse images that are not recognizable by direct visual inspection is performed by an unconstrained minimum average correlation energy filter. In the proposed method, encryption keys (RPM1, RPM2, and PDM) are used on the sender side, and the receiver needs only encrypted images and correlation filters. In this manner, the system preserves privacy even if the correlation filters are obtained by an adversary. Performance of the PC-DRPE verification system is evaluated under illumination variation, pose changes, and facial expression. Experimental results show that utilizing encrypted images not only addresses security concerns but also enhances verification performance. This improvement can be attributed to the fact that, in the proposed system, the face verification problem is converted to a key verification task.
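The DRPE step itself is a classical optical-encryption construction: a random phase mask applied in the spatial domain followed by a second mask in the Fourier domain. A minimal numerical sketch (omitting the photon-counting information reduction that makes the paper's full system noninvertible):

```python
import numpy as np

def drpe_encrypt(img, rpm1, rpm2):
    """Double random phase encoding: multiply the image by the first
    random phase mask in the spatial domain, transform, multiply by
    the second mask in the Fourier domain, and transform back.  The
    masks hold phases as values in [0, 1)."""
    x = img * np.exp(2j * np.pi * rpm1)
    X = np.fft.fft2(x) * np.exp(2j * np.pi * rpm2)
    return np.fft.ifft2(X)

def drpe_decrypt(enc, rpm1, rpm2):
    """Inverse of drpe_encrypt given both keys: undo the Fourier-domain
    mask, transform back, and undo the spatial-domain mask."""
    X = np.fft.fft2(enc) * np.exp(-2j * np.pi * rpm2)
    x = np.fft.ifft2(X) * np.exp(-2j * np.pi * rpm1)
    return x.real
```

Without photon counting the round trip is exact given both keys; the ciphertext itself is complex-valued white-noise-like data with no visual resemblance to the face image.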

  20. Shift Verification and Validation

    SciTech Connect

    Pandya, Tara M.; Evans, Thomas M.; Davidson, Gregory G; Johnson, Seth R.; Godfrey, Andrew T.

    2016-09-07

    This documentation outlines the verification and validation of Shift for the Consortium for Advanced Simulation of Light Water Reactors (CASL). Five main types of problems were used for validation: small criticality benchmark problems; full-core reactor benchmarks for light water reactors; fixed-source coupled neutron-photon dosimetry benchmarks; depletion/burnup benchmarks; and full-core reactor performance benchmarks. We compared Shift results to measured data and other simulated Monte Carlo radiation transport code results, and found very good agreement in a variety of comparison measures. These include prediction of critical eigenvalue, radial and axial pin power distributions, rod worth, leakage spectra, and nuclide inventories over a burn cycle. Based on this validation of Shift, we are confident in Shift to provide reference results for CASL benchmarking.

  1. Online fingerprint verification.

    PubMed

    Upendra, K; Singh, S; Kumar, V; Verma, H K

    2007-01-01

    As organizations search for more secure authentication methods for user access, e-commerce, and other security applications, biometrics is gaining increasing attention. With the growing emphasis on emerging automatic personal identification applications, fingerprint-based identification is becoming more popular. The most widely used fingerprint representation is the minutiae-based representation. Its main drawback is that it does not utilize a significant component of the rich discriminatory information available in fingerprints: local ridge structures cannot be completely characterized by minutiae. It is also difficult to quickly match two fingerprint images containing different numbers of unregistered minutiae points. In this study, a filter-bank-based representation, which eliminates these weaknesses, is implemented, and the overall performance of the developed system is tested. The results show that this system can be used effectively for secure online verification applications.
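    A filter-bank representation of the kind described can be sketched as follows: the image is convolved with even-symmetric Gabor filters at several orientations, and blockwise average absolute deviations of the responses form a fixed-length feature vector. The filter parameters, block size, and feature definition here are illustrative assumptions, not the study's exact configuration.

```python
import numpy as np

def gabor_kernel(size, theta, freq=0.1, sigma=4.0):
    """Even-symmetric Gabor filter tuned to ridge frequency and orientation theta."""
    half = size // 2
    y, x = np.mgrid[-half:half + 1, -half:half + 1]
    xr = x * np.cos(theta) + y * np.sin(theta)
    return np.exp(-(x**2 + y**2) / (2 * sigma**2)) * np.cos(2 * np.pi * freq * xr)

def filterbank_features(img, n_orients=8, block=16):
    """Filter with each orientation, then take the blockwise average
    absolute deviation of the response as a local ridge-energy feature."""
    feats = []
    for k in range(n_orients):
        kern = gabor_kernel(17, np.pi * k / n_orients)
        # circular convolution via FFT (adequate for illustration)
        resp = np.real(np.fft.ifft2(np.fft.fft2(img) * np.fft.fft2(kern, img.shape)))
        for i in range(0, img.shape[0], block):
            for j in range(0, img.shape[1], block):
                cell = resp[i:i + block, j:j + block]
                feats.append(np.abs(cell - cell.mean()).mean())
    return np.array(feats)

rng = np.random.default_rng(1)
img = rng.random((64, 64))
fv = filterbank_features(img)  # fixed-length vector: 8 orientations x 16 blocks
```

    Because the vector has fixed length regardless of how many minutiae the print contains, two prints can be compared with a simple distance measure instead of point-set matching.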

  2. Verification of VENTSAR

    SciTech Connect

    Simpkins, A.A.

    1995-01-01

    The VENTSAR code is an upgraded and improved version of the VENTX code, which estimates concentrations on or near a building from a release at a nearby location. The code calculates concentrations either for a given meteorological exceedance probability or for a given stability and wind speed combination. A single building lying in the path of the plume can be modeled, optionally with a penthouse added to its top. Plume rise may also be considered. Releases can be either chemical or radioactive. Downwind concentrations are determined at user-specified incremental distances. This verification report was prepared to demonstrate that VENTSAR properly executes all algorithms and transfers data correctly. Hand calculations were also performed to ensure proper application of the methodologies.
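    Hand verification of a dispersion code typically means reproducing its concentration equations at a few points. The sketch below uses a generic ground-reflecting Gaussian plume formula with assumed dispersion coefficients; it illustrates the kind of downwind-concentration calculation involved, and is not VENTSAR's actual model.

```python
import numpy as np

def gaussian_plume(q, u, y, z, h, sigma_y, sigma_z):
    """Concentration (per unit volume) for a ground-reflecting Gaussian
    plume: release rate q, wind speed u, effective release height h."""
    lateral = np.exp(-y**2 / (2 * sigma_y**2))
    vertical = (np.exp(-(z - h)**2 / (2 * sigma_z**2)) +
                np.exp(-(z + h)**2 / (2 * sigma_z**2)))
    return q / (2 * np.pi * u * sigma_y * sigma_z) * lateral * vertical

# concentrations at user-specified downwind distances, with assumed
# (roughly neutral-stability) linear dispersion coefficients sigma = a * x
for x in (100.0, 200.0, 400.0):
    chi = gaussian_plume(q=1.0, u=3.0, y=0.0, z=1.5, h=10.0,
                         sigma_y=0.08 * x, sigma_z=0.06 * x)
    print(f"x = {x:4.0f} m: chi = {chi:.2e} (per unit release rate)")
```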

  3. Requirements for Nuclear Test Verification

    NASA Astrophysics Data System (ADS)

    Dreicer, M.

    2001-05-01

    Verification comprises the collection and assessment of reliable, relevant information for ascertaining the degree to which our foreign partners are adhering to their international security commitments. In the past, treaty verification was largely a bilateral activity, and the information used to make compliance judgments was under government control. Verification data was collected under jointly controlled bilateral conditions or at large distances. The international verification regime being developed by the Comprehensive Test Ban Treaty Preparatory Commission will provide a vast amount of data to a large number of Member States and scientific researchers. The increasingly rapid communication of data from many global sources, including the International Monitoring System, has shifted the traditional views of how verification should be implemented. The newly formed Bureau of Verification and Compliance in the U.S. Department of State is working to develop an overall concept of what sources of information and day-to-day activities are needed to carry out its verification and compliance functions. This presentation will set out preliminary ideas of how this will be done, including what types of research and development are needed.

  4. On-machine dimensional verification. Final report

    SciTech Connect

    Rendulic, W.

    1993-08-01

    General technology for automating in-process verification of machined products has been studied and implemented on a variety of machines and products at AlliedSignal Inc., Kansas City Division (KCD). Tests have been performed to establish system accuracy and probe reliability on two numerically controlled machining centers. Commercial software has been revised, and new cycles, such as skew check and skew machining, have been developed to enhance and expand probing capabilities. Probe benefits have been demonstrated in the areas of setup, cycle time, part quality, tooling cost, and product sampling.

  5. Site Specific Verification Guidelines.

    SciTech Connect

    Harding, Steve; Gordon, Frederick M.; Kennedy, Mike

    1992-05-01

    The Bonneville Power Administration (BPA) and the Northwest region have moved from energy surplus to a time when demand for energy is likely to exceed available supplies. The Northwest Power Planning Council is calling for a ``major push to acquire new resources.'' To meet anticipated loads in the next decade, BPA and the region must more than double the rate at which conservation resources are acquired. BPA hopes to achieve some of this doubling through programs independently designed and implemented by utilities and other parties without intensive BPA involvement. BPA will accept proposals for programs using performance-based payments, in which BPA bases its reimbursement to the sponsor on measured energy savings rather than program costs. To receive payment for conservation projects developed under performance-based programs, utilities and other project developers must propose verification plans to measure the amount of energy savings. BPA has traditionally measured conservation savings by analyzing billing histories, before and after measure installation, adjusted by a comparison group of non-participating customers. This approach does not work well for all conservation projects. For large or unusual facilities, the comparison-group approach is not reliable because there are not enough comparable non-participants to allow appropriate statistical analysis. For these facilities, which include large commercial and institutional buildings, industrial projects, and complex combinations of building types served by a single utility meter, savings must be verified on a site-specific basis. These guidelines were written to help proposers understand what Bonneville considers the important issues in site-specific verification of conservation performance. They also provide a toolbox of methods with guidance on their application and use. 15 refs.
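    The traditional billing-history approach mentioned above amounts to a comparison-group adjustment of pre/post consumption. A minimal sketch, with entirely hypothetical consumption figures:

```python
# Hypothetical annual electricity use (kWh) before and after measure installation
participant_pre, participant_post = 24000.0, 19800.0
comparison_pre, comparison_post = 23500.0, 23030.0   # non-participating customers

# The comparison-group trend removes background effects (weather, economy, etc.)
trend = comparison_post / comparison_pre          # here 0.98: a 2% background drop
expected_post = participant_pre * trend           # expected use absent the program
savings = expected_post - participant_post
print(f"verified savings: {savings:.0f} kWh/yr")
```

    The abstract's point is that this estimator needs a statistically adequate pool of comparable non-participants; for one-of-a-kind facilities no such pool exists, so savings must be verified site-specifically instead.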

  6. Verification & Validation (V&V) Guidelines and Quantitative Reliability at Confidence (QRC): Basis for an Investment Strategy

    SciTech Connect

    Logan, R W; Nitta, C K

    2002-07-17

    This paper represents an attempt to summarize our thoughts regarding various methods and potential guidelines for Verification and Validation (V&V) and Uncertainty Quantification (UQ) that we have observed within the broader V&V community or generated ourselves. Our goals are to evaluate these various methods, to apply them to computational simulation analyses, and to integrate them into methods for quantitative certification techniques for the nuclear stockpile. We describe the critical nature of high-quality analyses with quantified V&V, and the essential role of V&V and UQ at specified confidence levels in evaluating system certification status. Only after V&V has contributed to UQ at confidence can rational tradeoffs among various scenarios be made. UQ of performance and safety margins for various scenarios and issues is applied in assessments of Quantified Reliability at Confidence (QRC), and we conclude with a brief description of how these V&V-generated QRC quantities fold into a value-engineering methodology for evaluating investment strategies. V&V contributes directly to the investment decision process through quantification of uncertainties at confidence for margin and reliability assessments. These contributions play an even greater role than ever before in a Comprehensive Test Ban Treaty (CTBT) environment, where reliance on simulation in the absence of the ability to perform nuclear testing is critical.

  7. 40 CFR 1065.390 - PM balance verifications and weighing process verification.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... 40 Protection of Environment 34 2012-07-01 2012-07-01 false PM balance verifications and weighing... § 1065.390 PM balance verifications and weighing process verification. (a) Scope and frequency. This section describes three verifications. (1) Independent verification of PM balance performance within 370...

  8. 40 CFR 1065.390 - PM balance verifications and weighing process verification.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... 40 Protection of Environment 33 2011-07-01 2011-07-01 false PM balance verifications and weighing... § 1065.390 PM balance verifications and weighing process verification. (a) Scope and frequency. This section describes three verifications. (1) Independent verification of PM balance performance within 370...

  9. 40 CFR 1065.390 - PM balance verifications and weighing process verification.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... 40 Protection of Environment 33 2014-07-01 2014-07-01 false PM balance verifications and weighing... § 1065.390 PM balance verifications and weighing process verification. (a) Scope and frequency. This section describes three verifications. (1) Independent verification of PM balance performance within 370...

  10. 40 CFR 1065.390 - PM balance verifications and weighing process verification.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 40 Protection of Environment 32 2010-07-01 2010-07-01 false PM balance verifications and weighing... § 1065.390 PM balance verifications and weighing process verification. (a) Scope and frequency. This section describes three verifications. (1) Independent verification of PM balance performance within 370...

  11. 40 CFR 1065.390 - PM balance verifications and weighing process verification.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... 40 Protection of Environment 34 2013-07-01 2013-07-01 false PM balance verifications and weighing... § 1065.390 PM balance verifications and weighing process verification. (a) Scope and frequency. This section describes three verifications. (1) Independent verification of PM balance performance within 370...

  12. Generic interpreters and microprocessor verification

    NASA Technical Reports Server (NTRS)

    Windley, Phillip J.

    1990-01-01

    The following topics are covered in viewgraph form: (1) generic interpreters; (2) Viper microprocessors; (3) microprocessor verification; (4) determining correctness; (5) hierarchical decomposition; (6) interpreter theory; (7) AVM-1; (8) phase-level specification; and (9) future work.

  13. Cold fusion verification

    NASA Astrophysics Data System (ADS)

    North, M. H.; Mastny, G. F.; Wesley, E. J.

    1991-03-01

    The objective of this work was to verify and reproduce experimental observations of Cold Nuclear Fusion (CNF), as originally reported in 1989. The method was to start with the original report and add such additional information as became available to build a set of operational electrolytic CNF cells. Verification was to be achieved by first observing cells for neutron production; for those cells that demonstrated a nuclear effect, careful calorimetric measurements were planned. The authors concluded, after laboratory experience, reading published work, talking with others in the field, and attending conferences, that CNF is probably a chimera and will go the way of N-rays and polywater. The neutron detector used for these tests was a completely packaged unit built into a metal suitcase that afforded electrostatic shielding for the detectors and self-contained electronics. It was battery-powered, although it was on charge for most of the long tests. The sensor element consists of ³He detectors arranged in three independent layers in a solid moderating block. The count from each of the three layers, as well as the sum of all the detectors, was brought out and recorded separately. The neutron measurements were made with both the neutron detector and the sample tested in a cave made of thick moderating material that surrounded the two units on the sides and bottom.

  14. Woodward Effect Experimental Verifications

    NASA Astrophysics Data System (ADS)

    March, Paul

    2004-02-01

    The work of J. F. Woodward (1990; 1996a; 1996b; 1998; 2002a; 2002b; 2004) on the existence of ``mass fluctuations'' and their use in exotic propulsion schemes was examined for possible application to improving space flight propulsion and power generation. Woodward examined Einstein's General Relativity Theory (GRT) and assumed that if the strong Machian interpretation of GRT, as well as gravitational/inertial Wheeler-Feynman radiation reaction forces, holds, then when an elementary particle is accelerated through a potential gradient, its rest mass should fluctuate around its mean value during the acceleration. Woodward also used GRT to clarify the precise experimental conditions necessary for observing and exploiting these mass fluctuations, or the ``Woodward effect'' (W-E). Later, in collaboration with his former graduate student T. Mahood, he pushed the experimental verification boundaries of these proposals. If these purported mass fluctuations occur as Woodward claims, and his assumption that gravity and inertia are both byproducts of the same GRT-based phenomenon per Mach's Principle is correct, then many innovative applications such as propellantless propulsion and gravitational exotic matter generators may be feasible. This paper examines the reality of mass fluctuations and the feasibility of using the W-E to design propellantless propulsion devices in the near to mid-term future. The latest experimental results, utilizing MHD-like force rectification systems, are also presented.

  15. Reliable Entanglement Verification

    NASA Astrophysics Data System (ADS)

    Arrazola, Juan; Gittsovich, Oleg; Donohue, John; Lavoie, Jonathan; Resch, Kevin; Lütkenhaus, Norbert

    2013-05-01

    Entanglement plays a central role in quantum protocols. It is therefore important to be able to verify the presence of entanglement in physical systems from experimental data. In the evaluation of these data, the proper treatment of statistical effects requires special attention, as one can never claim to have verified the presence of entanglement with certainty. Recently, increased attention has been paid to the development of proper frameworks to pose and to answer these types of questions. In this work, we apply recent results by Christandl and Renner on reliable quantum state tomography to construct a reliable entanglement verification procedure based on the concept of confidence regions. The statements made require neither the specification of a prior distribution nor the assumption of an independent and identically distributed (i.i.d.) source of states. Moreover, we develop the efficient numerical tools necessary to employ this approach in practice, rendering the procedure ready to be employed in current experiments. We demonstrate this by analyzing data from an experiment in which entangled two-photon states were generated and their entanglement verified with an accessible nonlinear witness.
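    For intuition, the point-estimate version of witness-based verification can be sketched in a few lines. The witness below is a standard textbook choice for the two-qubit Bell state; the confidence-region machinery that is this paper's actual contribution is not reproduced here.

```python
import numpy as np

# Bell state |Phi+> = (|00> + |11>)/sqrt(2) and witness W = I/2 - |Phi+><Phi+|
phi = np.array([1.0, 0.0, 0.0, 1.0]) / np.sqrt(2)
proj = np.outer(phi, phi)
W = np.eye(4) / 2 - proj

rho_entangled = proj             # the Bell state itself
rho_separable = np.eye(4) / 4    # maximally mixed (separable) state

# Tr(W rho) < 0 certifies entanglement; a non-negative value is inconclusive
print(np.trace(W @ rho_entangled))   # ≈ -0.5  → entangled
print(np.trace(W @ rho_separable))   # ≈ 0.25  → not detected
```

    In an experiment, Tr(W ρ) is itself estimated from finite data, which is exactly why the statistical treatment via confidence regions matters.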

  16. Fusion strategies for boosting cancelable online signature verification

    NASA Astrophysics Data System (ADS)

    Muramatsu, Daigo; Inuma, Manabu; Shikata, Junji; Otsuka, Akira

    2010-04-01

    Cancelable approaches for biometric person authentication have been studied to protect enrolled biometric data, and several algorithms have been proposed. One drawback of cancelable approaches is that their performance is inferior to that of non-cancelable approaches. As one solution, we proposed a scheme to enhance the performance of a cancelable approach for online signature verification by combining scores calculated from two transformed datasets generated using two keys. Generally, in cancelable approaches the same verification algorithm is used for transformed data as for raw (non-transformed) data, and in our previous work a verification system developed for a non-transformed dataset was used to calculate the scores from transformed data. In this paper, we modify the verification system by using transformed data for training. Several experiments were performed using public databases, and the experimental results show that the modification of the verification system improved the performance. Our cancelable system combines two scores to make a decision. Several fusion strategies are also considered, and the experimental results are reported here.
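    The score-combination step can be sketched as follows. The strategies shown (weighted sum, max, min) are common fusion rules; the scores, weights, and threshold are hypothetical and not taken from the paper's experiments.

```python
import numpy as np

def fuse(s1, s2, strategy="sum", w=0.5):
    """Combine match scores computed on the two key-transformed datasets."""
    s1, s2 = np.asarray(s1, float), np.asarray(s2, float)
    if strategy == "sum":
        return w * s1 + (1 - w) * s2      # weighted-sum rule
    if strategy == "max":
        return np.maximum(s1, s2)
    if strategy == "min":
        return np.minimum(s1, s2)
    raise ValueError(f"unknown strategy: {strategy}")

# hypothetical similarity scores for two genuine and two forged signatures,
# each scored against both transformed enrollments
genuine = fuse([0.82, 0.74], [0.78, 0.81])
forgery = fuse([0.40, 0.55], [0.46, 0.38])
decisions = genuine > 0.6    # accept/reject with a single global threshold
```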

  17. Appendix: Conjectures concerning proof, design, and verification.

    SciTech Connect

    Wos, L.

    2000-05-31

    This article focuses on an esoteric but practical use of automated reasoning that may indeed be new to many, especially those concerned primarily with verification of both hardware and software. Specifically, featured are a discussion and some methodology for taking an existing design -- of a circuit, a chip, a program, or the like -- and refining and improving it in various ways. Although the methodology is general and does not require the use of a specific program, McCune's program OTTER does offer what is needed. OTTER has played and continues to play the key role in the author's research, and an interested person can gain access to this program in various ways, not the least of which is through the CD-ROM included in [3]. When success occurs, the result is a new design that may require fewer components, avoid the use of certain costly components, offer more reliability and ease of verification, and, perhaps most important, be more efficient in the contexts of speed and heat generation. Although the author has minimal experience in circuit design, circuit validation, program synthesis, program verification, and similar concerns, he presents (at the encouragement of colleagues, based on successes to be cited) materials that might indeed be of substantial interest to manufacturers and programmers. He writes this article in part prompted by the recent activities of chip designers, including Intel and AMD, activities heavily emphasizing the proving of theorems. As for research that appears relevant, the author has made an intense and most profitable study of finding proofs that are shorter [2,3], some that avoid the use of various types of terms, some that are far less complex than previously known, and the like. Those results suggest to the author a strong possible connection between more appealing proofs (in mathematics and in logic) and enhanced and improved design of both hardware and software.
Here the author explores diverse conjectures that elucidate some of the

  18. Hard and Soft Safety Verifications

    NASA Technical Reports Server (NTRS)

    Wetherholt, Jon; Anderson, Brenda

    2012-01-01

    The purpose of this paper is to examine the differences between, and the effects of, hard and soft safety verifications. Initially, the terminology should be defined and clarified. A hard safety verification is a datum that demonstrates how a safety control is enacted; an example is relief valve testing. A soft safety verification is something usually described as ``nice to have'' but not necessary to prove safe operation. An example of a soft verification is the loss of the Solid Rocket Booster (SRB) casings on Shuttle flight STS-4. When the main parachutes failed, the casings impacted the water and sank. In the nose cap of the SRBs, video cameras recorded the release of the parachutes to determine safe operation and to provide information for potential anomaly resolution. Generally, examination of the casings and nozzles contributed to understanding of the newly developed boosters and their operation. Safety verification of SRB operation was demonstrated by examination of the casings and nozzle for erosion or wear. Loss of the SRBs and the associated data did not delay the launch of the next Shuttle flight.

  19. Technical challenges for dismantlement verification

    SciTech Connect

    Olinger, C.T.; Stanbro, W.D.; Johnston, R.G.; Nakhleh, C.W.; Dreicer, J.S.

    1997-11-01

    In preparation for future nuclear arms reduction treaties, including any potential successor treaties to START I and II, the authors have been examining possible methods for bilateral warhead dismantlement verification. Warhead dismantlement verification raises significant challenges in the political, legal, and technical arenas. This discussion will focus on the technical issues raised by warhead arms controls. Technical complications arise from several sources. These will be discussed under the headings of warhead authentication, chain-of-custody, dismantlement verification, non-nuclear component tracking, component monitoring, and irreversibility. The authors will discuss possible technical options to address these challenges as applied to a generic dismantlement and disposition process, in the process identifying limitations and vulnerabilities. They expect that these considerations will play a large role in any future arms reduction effort and, therefore, should be addressed in a timely fashion.

  20. Structural verification for GAS experiments

    NASA Technical Reports Server (NTRS)

    Peden, Mark Daniel

    1992-01-01

    The purpose of this paper is to assist the Get Away Special (GAS) experimenter in conducting a thorough structural verification of the experiment's structural configuration, thus expediting the structural review/approval process and the safety process in general. Material selection for structural subsystems will be covered, with an emphasis on fasteners (GSFC fastener integrity requirements) and primary support structures (Stress Corrosion Cracking requirements and National Space Transportation System (NSTS) requirements). Different approaches to structural verification (tests and analyses) will be outlined, especially those stemming from lessons learned on load and fundamental frequency verification. In addition, fracture control will be covered for those payloads that utilize a door assembly or modify the containment provided by the standard GAS Experiment Mounting Plate (EMP). Structural hazard assessment and the preparation of structural hazard reports will be reviewed to form a summation of structural safety issues for inclusion in the safety data package.

  1. Formal verification of mathematical software

    NASA Technical Reports Server (NTRS)

    Sutherland, D.

    1984-01-01

    Methods are investigated for formally specifying and verifying the correctness of mathematical software (software which uses floating point numbers and arithmetic). Previous work in the field was reviewed. A new model of floating point arithmetic called the asymptotic paradigm was developed and formalized. Two different conceptual approaches to program verification, the classical Verification Condition approach and the more recently developed Programming Logic approach, were adapted to use the asymptotic paradigm. These approaches were then used to verify several programs; the programs chosen were simplified versions of actual mathematical software.

  2. Environmental Technology Verification Program Materials ...

    EPA Pesticide Factsheets

    The protocol provides generic procedures for implementing a verification test for the performance of in situ chemical oxidation (ISCO), focused specifically to expand the application of ISCO at manufactured gas plants with polyaromatic hydrocarbon (PAH) contamination (MGP/PAH) and at active gas station sites.

  3. CHEMICAL INDUCTION MIXER VERIFICATION - ENVIRONMENTAL TECHNOLOGY VERIFICATION PROGRAM

    EPA Science Inventory

    The Wet-Weather Flow Technologies Pilot of the Environmental Technology Verification (ETV) Program, which is supported by the U.S. Environmental Protection Agency and facilitated by NSF International, has recently evaluated the performance of chemical induction mixers used for di...

  5. 16 CFR 315.5 - Prescriber verification.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... § 315.5 Prescriber verification. (a) Prescription requirement. A seller may sell contact lenses only in accordance with a contact lens prescription for the patient that is: (1) Presented to the seller by the... for verification. When seeking verification of a contact lens prescription, a seller shall provide the...

  6. Working Memory Mechanism in Proportional Quantifier Verification

    ERIC Educational Resources Information Center

    Zajenkowski, Marcin; Szymanik, Jakub; Garraffa, Maria

    2014-01-01

    The paper explores the cognitive mechanisms involved in the verification of sentences with proportional quantifiers (e.g. "More than half of the dots are blue"). The first study shows that the verification of proportional sentences is more demanding than the verification of sentences such as: "There are seven blue and eight yellow…

  7. 78 FR 58492 - Generator Verification Reliability Standards

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-09-24

    ... Energy Regulatory Commission 18 CFR Part 40 Generator Verification Reliability Standards AGENCY: Federal... Organization: MOD-025-2 (Verification and Data Reporting of Generator Real and Reactive Power Capability and Synchronous Condenser Reactive Power Capability), MOD- 026-1 (Verification of Models and Data for Generator...

  9. Radioxenon detections in the CTBT international monitoring system likely related to the announced nuclear test in North Korea on February 12, 2013.

    PubMed

    Ringbom, A; Axelsson, A; Aldener, M; Auer, M; Bowyer, T W; Fritioff, T; Hoffman, I; Khrustalev, K; Nikkinen, M; Popov, V; Popov, Y; Ungar, K; Wotawa, G

    2014-02-01

    Observations made in April 2013 of the radioxenon isotopes ¹³³Xe and ¹³¹ᵐXe at measurement stations in Japan and Russia, belonging to the International Monitoring System for verification of the Comprehensive Nuclear-Test-Ban Treaty, are unique with respect to the measurement history of these stations. Comparison of the measured data with calculated isotopic ratios, as well as analysis using atmospheric transport modeling, indicates that the xenon measured was likely created in the underground nuclear test conducted by North Korea on February 12, 2013, and released 7-8 weeks later. More than one release is required to explain all observations. The ¹³¹ᵐXe source term for each release was calculated to be 0.7 TBq, corresponding to about 1-10% of the total xenon inventory of a 10 kt explosion, depending on fractionation and release scenario. The observed ratios could not be used to obtain any information regarding the fissile material used in the test. Copyright © 2013 The Authors. Published by Elsevier Ltd. All rights reserved.
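    The reason an isotopic ratio constrains the release date can be illustrated with simple decay arithmetic. The sketch below uses standard half-lives for the two isotopes (assumed here, not quoted from the paper) and shows how a 7-8 week delay suppresses ¹³³Xe relative to the longer-lived ¹³¹ᵐXe; the paper's actual analysis additionally uses atmospheric transport modeling.

```python
import numpy as np

# Standard half-lives in days (assumed values for illustration)
T_XE133, T_XE131M = 5.243, 11.84
LAM133 = np.log(2) / T_XE133
LAM131M = np.log(2) / T_XE131M

def activity_ratio(t_days, r0=1.0):
    """133Xe/131mXe activity ratio after t days of decay, from initial ratio r0."""
    return r0 * np.exp(-(LAM133 - LAM131M) * t_days)

# After ~52 days (7-8 weeks) the ratio has dropped to a few percent of its
# initial value, so a measured ratio constrains the time since the release.
print(f"ratio after 52 d: {activity_ratio(52.0):.3f} of its initial value")
```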

  10. Kleene Algebra and Bytecode Verification

    DTIC Science & Technology

    2016-04-27

    Languages, ACM SIGPLAN/SIGACT, 1998, pp. 149–160. [2] Coglio, A., Simple verification technique for complex Java bytecode subroutines, Concurrency and...of Programming Languages (POPL’73), ACM , 1973, pp. 194–206. [6] Kot, L. and D. Kozen, Second-order abstract interpretation via Kleene algebra

  11. Automated verification system user's guide

    NASA Technical Reports Server (NTRS)

    Hoffman, R. H.

    1972-01-01

    Descriptions of the operational requirements for all of the programs of the Automated Verification System (AVS) are provided. The AVS programs are: (1) FORTRAN code analysis and instrumentation program (QAMOD); (2) Test Effectiveness Evaluation Program (QAPROC); (3) Transfer Control Variable Tracking Program (QATRAK); (4) Program Anatomy Table Generator (TABGEN); and (5) Network Path Analysis Program (RAMBLE).

  12. Verification Challenges at Low Numbers

    SciTech Connect

    Benz, Jacob M.; Booker, Paul M.; McDonald, Benjamin S.

    2013-07-16

    This paper will explore the difficulties of deep reductions by examining the technical verification challenges. At each step on the road to low numbers, the verification required to ensure compliance of all parties will increase significantly. Looking beyond New START, the next step will likely include warhead limits in the neighborhood of 1000 (Pifer 2010). Further reductions will include stepping stones at hundreds of warheads, and then tens of warheads, before elimination of the last few remaining warheads and weapons could be considered. This paper will focus on these three threshold reduction levels: 1000, hundreds, and tens of warheads. For each, the issues and challenges will be discussed, potential solutions will be identified, and the verification technologies and chain-of-custody measures that address these solutions will be surveyed. It is important to note that many of the issues that need to be addressed have no current solution. In these cases, the paper will explore new or novel technologies that could be applied. These technologies will draw from the research and development ongoing throughout the national laboratory complex, and will look at technologies utilized in other areas of industry for their applicability to arms control verification.

  13. LANL measurements verification acceptance criteria

    SciTech Connect

    Chavez, D. M.

    2001-01-01

    The possibility of SNM diversion or theft is a major concern to organizations charged with control of Special Nuclear Material (SNM). Verification measurements are used to aid in the detection of SNM losses. The acceptance/rejection criteria for verification measurements depend on the facility-specific processes, the knowledge of the measured item, and the measurement technique applied. This paper discusses some of the LANL measurement control steps and the criteria applied in accepting a verification measurement. The process involves interaction among the facility operations personnel, the subject matter experts for a specific instrument or technique, the process knowledge of the matrix of the measured item, and the measurement-specific precision and accuracy values. This introduction to a site-specific application of measurement verification acceptance criteria helps safeguards personnel, material custodians, and SNM measurement professionals understand the acceptance/rejection process for measurements and that process's contribution to the detection of SNM diversion.

  14. Digital Imaging Techniques for Radiotherapy Treatment Verification

    NASA Astrophysics Data System (ADS)

    Leszczynski, Konrad Wojciech

    The curative effect of ionizing radiation depends strongly upon the precision with which dose is delivered to the prescribed target volume. The requirement for high geometric accuracy in patient positioning is even more stringent where complex treatment techniques are used, such as conformal, dynamic arc, or truly 3-D (non-coplanar) beams. It is expected that digital on-line portal imaging devices will play a key role in the monitoring of radiation therapy treatments. Different approaches to on-line portal image acquisition have been compared, and the basic imaging properties of a video portal imager have been evaluated and discussed in this thesis. Analysis of the system performance indicates the most efficient ways to effect improvements in spatial resolution and signal-to-noise ratio. Digital image processing techniques for noise suppression and contrast enhancement have been developed and implemented in order to facilitate visual analysis of on-line portal images. Results obtained with phantom and clinical images indicate that improvement in image quality can be achieved using adaptive filtering and local histogram modification. A novel study of observer performance with on-line portal images showed that enhancement of contrast by selective local histogram modification significantly improves perceptibility of anatomical landmarks and assures higher accuracy in quantitative computer-assisted treatment verification. Fully automated treatment verification is the ultimate goal of on-line digital portal imaging. It should include analysis of the size and shape of the radiation field as well as evaluation of the placement of the field with respect to the internal anatomy of the patient. A computerized technique has been developed for extraction of the treatment field edges and for parametrization of the field; examples of its application to automated analysis of the size and shape of the radiation field are presented.
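    The local histogram modification mentioned above can be illustrated with a crude blockwise variant: each pixel is remapped to its grey-level rank within its block, stretching local contrast. This is a simplified stand-in for the thesis's adaptive method, with assumed block size and a synthetic image.

```python
import numpy as np

def local_hist_eq(img, block=32):
    """Blockwise (local) histogram equalization: map each pixel to its
    grey-level rank within its block, stretching local contrast to [0, 1].
    Assumes the image dimensions are multiples of the block size."""
    out = np.empty(img.shape, dtype=float)
    for i in range(0, img.shape[0], block):
        for j in range(0, img.shape[1], block):
            cell = img[i:i + block, j:j + block]
            ranks = cell.ravel().argsort().argsort()   # rank of each pixel
            out[i:i + block, j:j + block] = ranks.reshape(cell.shape) / (cell.size - 1)
    return out

rng = np.random.default_rng(2)
portal = rng.normal(0.5, 0.05, (64, 64))   # synthetic low-contrast "portal image"
enhanced = local_hist_eq(portal)           # per-block contrast stretched to [0, 1]
```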

  15. Kate's Model Verification Tools

    NASA Technical Reports Server (NTRS)

    Morgan, Steve

    1991-01-01

    Kennedy Space Center's Knowledge-based Autonomous Test Engineer (KATE) is capable of monitoring electromechanical systems, diagnosing their errors, and even repairing them when they crash. A survey of KATE's developer/modelers revealed that they were already using a sophisticated set of productivity enhancing tools. They did request five more, however, and those make up the body of the information presented here: (1) a transfer function code fitter; (2) a FORTRAN-Lisp translator; (3) three existing structural consistency checkers to aid in syntax checking their modeled device frames; (4) an automated procedure for calibrating knowledge base admittances to protect KATE's hardware mockups from inadvertent hand valve twiddling; and (5) three alternatives for the 'pseudo object', a programming patch that currently apprises KATE's modeling devices of their operational environments.

  16. Gender verification testing in sport.

    PubMed

    Ferris, E A

    1992-07-01

    Gender verification testing in sport, first introduced in 1966 by the International Amateur Athletic Federation (IAAF) in response to fears that males with a physical advantage in terms of muscle mass and strength were cheating by masquerading as females in women's competition, has led to unfair disqualifications of women athletes and untold psychological harm. The discredited sex chromatin test, which identifies only the sex chromosome component of gender and is therefore misleading, was abandoned in 1991 by the IAAF in favour of medical checks for all athletes, women and men, which preclude the need for gender testing. But women athletes will still be tested at the Olympic Games at Albertville and Barcelona using polymerase chain reaction (PCR) to amplify DNA sequences on the Y chromosome, which identifies genetic sex only. Gender verification testing may in time be abolished when the sporting community is fully cognizant of its scientific and ethical implications.

  17. Wavelet Features Based Fingerprint Verification

    NASA Astrophysics Data System (ADS)

    Bagadi, Shweta U.; Thalange, Asha V.; Jain, Giridhar P.

    2010-11-01

    In this work, we present an automatic fingerprint identification system based on Level 3 features. Systems based only on minutiae features do not perform well for poor-quality images. In practice, we often encounter extremely dry or wet fingerprint images with cuts, warts, etc. Due to such fingerprints, minutiae-based systems show poor performance for real-time authentication applications. To alleviate the problem of poor-quality fingerprints, and to improve overall performance of the system, this paper proposes fingerprint verification based on wavelet statistical features and co-occurrence matrix features. The features include mean, standard deviation, energy, entropy, contrast, local homogeneity, cluster shade, cluster prominence, and information measure of correlation. In this method, matching can be done between the input image and the stored template without exhaustive search using the extracted features. The wavelet-transform-based approach is better than the existing minutiae-based method, takes less response time, and is hence suitable for on-line verification with high accuracy.
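    Several of the co-occurrence-matrix features the abstract lists (energy, entropy, contrast, local homogeneity) follow directly from the standard gray-level co-occurrence matrix (GLCM) definitions. The sketch below is an illustrative NumPy reimplementation of those textbook formulas, not the authors' code; the quantization level and pixel offset are assumptions.

```python
import numpy as np

def cooccurrence_features(img, levels=8, dx=1, dy=0):
    # Quantize the image to a small number of gray levels.
    scaled = img / img.max() * (levels - 1) if img.max() > 0 else np.zeros(img.shape)
    q = np.floor(scaled).astype(int)
    # Count gray-level pairs at the (dy, dx) offset.
    glcm = np.zeros((levels, levels))
    h, w = q.shape
    for i in range(h - dy):
        for j in range(w - dx):
            glcm[q[i, j], q[i + dy, j + dx]] += 1
    p = glcm / glcm.sum()  # normalize to a joint probability table
    ii, jj = np.indices((levels, levels))
    nz = p[p > 0]
    return {
        "energy": float((p ** 2).sum()),
        "entropy": float(-(nz * np.log2(nz)).sum()),
        "contrast": float((p * (ii - jj) ** 2).sum()),
        "homogeneity": float((p / (1.0 + (ii - jj) ** 2)).sum()),
    }

patch = np.random.default_rng(0).integers(0, 256, (32, 32)).astype(float)
feats = cooccurrence_features(patch)
```

In the verification setting, such feature vectors are compared between the probe image and the stored template instead of matching minutiae point sets.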

  18. NEXT Thruster Component Verification Testing

    NASA Technical Reports Server (NTRS)

    Pinero, Luis R.; Sovey, James S.

    2007-01-01

    Component testing is a critical part of thruster life validation activities under NASA's Evolutionary Xenon Thruster (NEXT) project. The high voltage propellant isolators were selected for design verification testing. Even though they are based on a heritage design, design changes were made because the isolators will be operated under different environmental conditions including temperature, voltage, and pressure. The life test of two NEXT isolators was therefore initiated and has accumulated more than 10,000 hr of operation. Measurements to date indicate only a negligibly small increase in leakage current. The cathode heaters were also selected for verification testing. The technology to fabricate these heaters, developed for the International Space Station plasma contactor hollow cathode assembly, was transferred to Aerojet for the fabrication of the NEXT prototype model ion thrusters. Testing the contractor-fabricated heaters is necessary to validate fabrication processes for high reliability heaters. This paper documents the status of the propellant isolator and cathode heater tests.

  19. Land surface Verification Toolkit (LVT)

    NASA Technical Reports Server (NTRS)

    Kumar, Sujay V.

    2017-01-01

    LVT is a framework developed to provide an automated, consolidated environment for systematic land surface model evaluation. It includes support for a range of in-situ, remote-sensing, and other model and reanalysis products, and supports the analysis of outputs from various LIS subsystems, including LIS-DA, LIS-OPT, and LIS-UE. The Land Information System Verification Toolkit (LVT) is a NASA software tool designed to enable the evaluation, analysis and comparison of outputs generated by the Land Information System (LIS). The LVT software is released under the terms and conditions of the NASA Open Source Agreement (NOSA) Version 1.1 or later.

  20. Ontology Matching with Semantic Verification

    PubMed Central

    Jean-Mary, Yves R.; Shironoshita, E. Patrick; Kabuka, Mansur R.

    2009-01-01

    ASMOV (Automated Semantic Matching of Ontologies with Verification) is a novel algorithm that uses lexical and structural characteristics of two ontologies to iteratively calculate a similarity measure between them, derives an alignment, and then verifies it to ensure that it does not contain semantic inconsistencies. In this paper, we describe the ASMOV algorithm, and then present experimental results that measure its accuracy using the OAEI 2008 tests, and that evaluate its use with two different thesauri: WordNet, and the Unified Medical Language System (UMLS). These results show the increased accuracy obtained by combining lexical, structural and extensional matchers with semantic verification, and demonstrate the advantage of using a domain-specific thesaurus for the alignment of specialized ontologies. PMID:20186256
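    As a toy illustration of the lexical component of such ontology matchers (not the ASMOV algorithm itself, whose iterative combination of lexical, structural, and extensional measures with semantic verification is far richer), a token-overlap score between two concept labels might look like this:

```python
def lexical_similarity(label_a, label_b):
    # Jaccard overlap of the underscore-separated tokens in two labels.
    a = set(label_a.lower().split("_"))
    b = set(label_b.lower().split("_"))
    return len(a & b) / len(a | b)

# Two of three tokens match, so the overlap is 2/4 = 0.5:
s = lexical_similarity("Blood_Pressure_Measurement", "blood_pressure_reading")
```

A thesaurus such as WordNet or UMLS extends this kind of matcher by also counting synonyms as overlapping tokens, which is the advantage the paper measures for domain-specific ontologies.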

  1. Verification and transparency in future arms control

    SciTech Connect

    Pilat, J.F.

    1996-09-01

    Verification's importance has changed dramatically over time, although it always has been in the forefront of arms control. The goals and measures of verification and the criteria for success have changed with the times as well, reflecting such factors as the centrality of the prospective agreement to East-West relations during the Cold War, the state of relations between the United States and the Soviet Union, and the technologies available for monitoring. Verification's role may be declining in the post-Cold War period. The prospects for such a development will depend, first and foremost, on the high costs of traditional arms control, especially those associated with requirements for verification. Moreover, the growing interest in informal, or non-negotiated, arms control does not allow for verification provisions by the very nature of these arrangements. Multilateral agreements are also becoming more prominent and argue against highly effective verification measures, in part because of fears of promoting proliferation by opening sensitive facilities to inspectors from potential proliferant states. As a result, it is likely that transparency and confidence-building measures will achieve greater prominence, both as supplements to and substitutes for traditional verification. Such measures are not panaceas and do not offer all that we came to expect from verification during the Cold War. But they may be the best possible means to deal with current problems of arms reductions and restraints at acceptable levels of expenditure.

  2. Structural System Identification Technology Verification

    DTIC Science & Technology

    1981-11-01

    USAAVRADCOM-TR-81-D-28, ADA109181. STRUCTURAL SYSTEM IDENTIFICATION TECHNOLOGY VERIFICATION. N. Giansante, A. Berman, W. G. Flannelly, E... Approved for public release; distribution unlimited. Prepared for the Applied Technology Laboratory, U.S. Army Research and Technology Laboratories (AVRADCOM), Fort Eustis, Va. 23604. APPLIED TECHNOLOGY LABORATORY POSITION STATEMENT: The Applied Technology Laboratory has been involved in the development of the Struc...

  3. Crowd-Sourced Program Verification

    DTIC Science & Technology

    2012-12-01

    ROBERT L. KAMINSKI, Work Unit Manager; WARREN H. DEBANY, JR., Technical Advisor, Information Exploitation & Operations... ...investigation, the contractor constructed a prototype of a crowd-sourced verification system that takes as input a given program and produces as output a

  4. Formal verification of AI software

    NASA Technical Reports Server (NTRS)

    Rushby, John; Whitehurst, R. Alan

    1989-01-01

    The application of formal verification techniques to Artificial Intelligence (AI) software, particularly expert systems, is investigated. Constraint satisfaction and model inversion are identified as two formal specification paradigms for different classes of expert systems. A formal definition of consistency is developed, and the notion of approximate semantics is introduced. Examples are given of how these ideas can be applied in both declarative and imperative forms.

  5. Regression Verification Using Impact Summaries

    NASA Technical Reports Server (NTRS)

    Backes, John; Person, Suzette J.; Rungta, Neha; Thachuk, Oksana

    2013-01-01

    Regression verification techniques are used to prove equivalence of syntactically similar programs. Checking equivalence of large programs, however, can be computationally expensive. Existing regression verification techniques rely on abstraction and decomposition techniques to reduce the computational effort of checking equivalence of the entire program. These techniques are sound but not complete. In this work, we propose a novel approach to improve scalability of regression verification by classifying the program behaviors generated during symbolic execution as either impacted or unimpacted. Our technique uses a combination of static analysis and symbolic execution to generate summaries of impacted program behaviors. The impact summaries are then checked for equivalence using an off-the-shelf decision procedure. We prove that our approach is both sound and complete for sequential programs, with respect to the depth bound of symbolic execution. Our evaluation on a set of sequential C artifacts shows that reducing the size of the summaries can help reduce the cost of software equivalence checking. Various reduction, abstraction, and compositional techniques have been developed to help scale software verification techniques to industrial-sized systems. Although such techniques have greatly increased the size and complexity of systems that can be checked, analysis of large software systems remains costly. Regression analysis techniques, e.g., regression testing [16], regression model checking [22], and regression verification [19], restrict the scope of the analysis by leveraging the differences between program versions. These techniques are based on the idea that if code is checked early in development, then subsequent versions can be checked against a prior (checked) version, leveraging the results of the previous analysis to reduce analysis cost of the current version.
Regression verification addresses the problem of proving equivalence of closely related program

  6. Earthquake Forecasting, Validation and Verification

    NASA Astrophysics Data System (ADS)

    Rundle, J.; Holliday, J.; Turcotte, D.; Donnellan, A.; Tiampo, K.; Klein, B.

    2009-05-01

    Techniques for earthquake forecasting are in development using both seismicity data mining methods, as well as numerical simulations. The former rely on the development of methods to recognize patterns in data, while the latter rely on the use of dynamical models that attempt to faithfully replicate the actual fault systems. Testing such forecasts is necessary not only to determine forecast quality, but also to improve forecasts. A large number of techniques to validate and verify forecasts have been developed for weather and financial applications. Many of these have been elaborated in public locations, including, for example, the URL as listed below. Typically, the goal is to test for forecast resolution, reliability and sharpness. A good forecast is characterized by consistency, quality and value. Most, if not all of these forecast verification procedures can be readily applied to earthquake forecasts as well. In this talk, we discuss both methods of forecasting, as well as validation and verification using a number of these standard methods. We show how these test methods might be useful for both fault-based forecasting, a group of forecast methods that includes the WGCEP and simulator-based renewal models, and grid-based forecasting, which includes the Relative Intensity, Pattern Informatics, and smoothed seismicity methods. We find that applying these standard methods of forecast verification is straightforward. Judgments about the quality of a given forecast method can often depend on the test applied, as well as on the preconceptions and biases of the persons conducting the tests.
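    One standard score from the weather-forecasting literature the authors draw on is the Brier score, which penalizes both unreliability and lack of sharpness in probabilistic forecasts. A minimal sketch, shown only to make those notions concrete (the function name is illustrative):

```python
import numpy as np

def brier_score(prob_forecasts, outcomes):
    # Mean squared difference between forecast probabilities and the
    # observed 0/1 outcomes; lower is better.
    p = np.asarray(prob_forecasts, dtype=float)
    o = np.asarray(outcomes, dtype=float)
    return float(np.mean((p - o) ** 2))

# A sharp, reliable forecast scores 0.0; an uninformative 0.5-everywhere
# forecast scores 0.25 on balanced outcomes.
perfect = brier_score([1.0, 0.0, 1.0, 0.0], [1, 0, 1, 0])
flat = brier_score([0.5] * 4, [1, 0, 1, 0])
```

Applied to earthquake forecasting, the "outcome" for each space-time cell is whether an event above the target magnitude occurred, so grid-based forecasts such as Relative Intensity or Pattern Informatics can be scored directly.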

  7. Nuclear Data Verification and Standardization

    SciTech Connect

    Karam, Lisa R.; Arif, Muhammad; Thompson, Alan K.

    2011-10-01

    The objective of this interagency program is to provide accurate neutron interaction verification and standardization data for the U.S. Department of Energy Division of Nuclear Physics programs which include astrophysics, radioactive beam studies, and heavy-ion reactions. The measurements made in this program are also useful to other programs that indirectly use the unique properties of the neutron for diagnostic and analytical purposes. These include homeland security, personnel health and safety, nuclear waste disposal, treaty verification, national defense, and nuclear based energy production. The work includes the verification of reference standard cross sections and related neutron data employing the unique facilities and capabilities at NIST and other laboratories as required; leadership and participation in international intercomparisons and collaborations; and the preservation of standard reference deposits. An essential element of the program is critical evaluation of neutron interaction data standards including international coordinations. Data testing of critical data for important applications is included. The program is jointly supported by the Department of Energy and the National Institute of Standards and Technology.

  8. Verification Challenges at Low Numbers

    SciTech Connect

    Benz, Jacob M.; Booker, Paul M.; McDonald, Benjamin S.

    2013-06-01

    Many papers have dealt with the political difficulties and ramifications of deep nuclear arms reductions and the issues of “Going to Zero”. Political issues include extended deterrence, conventional weapons, ballistic missile defense, and regional and geo-political security issues. At each step on the road to low numbers, the verification required to ensure compliance of all parties will increase significantly. Looking beyond New START, the next step will likely include warhead limits in the neighborhood of 1000. Further reductions will include stepping stones at 1000 warheads, hundreds of warheads, and then tens of warheads before final elimination of the last few remaining warheads and weapons could be considered. This paper will focus on these three threshold reduction levels: 1000 warheads, hundreds of warheads, and tens of warheads. For each, the issues and challenges will be discussed, potential solutions will be identified, and the verification technologies and chain-of-custody measures that address these solutions will be surveyed. It is important to note that many of the issues that need to be addressed have no current solution. In these cases, the paper will explore new or novel technologies that could be applied. These technologies will draw from the research and development that is ongoing throughout the national laboratory complex, and will look at technologies utilized in other areas of industry for their application to arms control verification.

  9. Verification Test Suite for Physics Simulation Codes

    SciTech Connect

    Brock, J S; Kamm, J R; Rider, W J; Brandon, S; Woodward, C; Knupp, P; Trucano, T G

    2006-12-21

    The DOE/NNSA Advanced Simulation & Computing (ASC) Program directs the development, demonstration and deployment of physics simulation codes. The defensible utilization of these codes for high-consequence decisions requires rigorous verification and validation of the simulation software. The physics and engineering codes used at Los Alamos National Laboratory (LANL), Lawrence Livermore National Laboratory (LLNL), and Sandia National Laboratory (SNL) are arguably among the most complex utilized in computational science. Verification represents an important aspect of the development, assessment and application of simulation software for physics and engineering. The purpose of this note is to formally document the existing tri-laboratory suite of verification problems used by LANL, LLNL, and SNL, i.e., the Tri-Lab Verification Test Suite. Verification is often referred to as ensuring that ''the [discrete] equations are solved [numerically] correctly''. More precisely, verification develops evidence of mathematical consistency between continuum partial differential equations (PDEs) and their discrete analogues, and provides an approach by which to estimate discretization errors. There are two variants of verification: (1) code verification, which compares simulation results to known analytical solutions, and (2) calculation verification, which estimates convergence rates and discretization errors without knowledge of a known solution. Together, these verification analyses support defensible verification and validation (V&V) of physics and engineering codes that are used to simulate complex problems that do not possess analytical solutions. Discretization errors (e.g., spatial and temporal errors) are embedded in the numerical solutions of the PDEs that model the relevant governing equations. Quantifying discretization errors, which comprise only a portion of the total numerical simulation error, is possible through code and calculation verification. 
Code verification
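    When an analytical solution is available (the code-verification variant above), the observed order of accuracy follows from errors measured on two grid resolutions: for error behaving like e(h) ~ C·h^p, p = log(e_coarse/e_fine)/log(r). A minimal sketch, with an illustrative function name not taken from the Tri-Lab suite:

```python
import math

def observed_order(e_coarse, e_fine, refinement_ratio=2.0):
    # For discretization error e(h) ~ C * h**p, two grid levels with
    # spacing ratio r give p = log(e_coarse / e_fine) / log(r).
    return math.log(e_coarse / e_fine) / math.log(refinement_ratio)

# A second-order scheme should roughly quarter its error when h is halved:
p = observed_order(4.0e-3, 1.0e-3)
```

Comparing the observed p against the scheme's theoretical order is the pass/fail criterion for a code-verification test problem; calculation verification uses the same fit, typically over three grids, when no exact solution exists.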

  10. Gender verification in competitive sports.

    PubMed

    Simpson, J L; Ljungqvist, A; de la Chapelle, A; Ferguson-Smith, M A; Genel, M; Carlson, A S; Ehrhardt, A A; Ferris, E

    1993-11-01

    The possibility that men might masquerade as women and be unfair competitors in women's sports is accepted as outrageous by athletes and the public alike. Since the 1930s, media reports have fuelled claims that individuals who once competed as female athletes subsequently appeared to be men. In most of these cases there was probably ambiguity of the external genitalia, possibly as a result of male pseudohermaphroditism. Nonetheless, beginning at the Rome Olympic Games in 1960, the International Amateur Athletics Federation (IAAF) began establishing rules of eligibility for women athletes. Initially, physical examination was used as a method for gender verification, but this plan was widely resented. Thus, sex chromatin testing (buccal smear) was introduced at the Mexico City Olympic Games in 1968. The principle was that genetic females (46,XX) show a single X-chromatic mass, whereas males (46,XY) do not. Unfortunately, sex chromatin analysis fell out of common diagnostic use by geneticists shortly after the International Olympic Committee (IOC) began its implementation for gender verification. The lack of laboratories routinely performing the test aggravated the problem of errors in interpretation by inexperienced workers, yielding false-positive and false-negative results. However, an even greater problem is that there exist phenotypic females with male sex chromatin patterns (e.g. androgen insensitivity, XY gonadal dysgenesis). These individuals have no athletic advantage as a result of their congenital abnormality and reasonably should not be excluded from competition. That is, only the chromosomal (genetic) sex is analysed by sex chromatin testing, not the anatomical or psychosocial status. For all the above reasons sex chromatin testing unfairly excludes many athletes. Although the IOC offered follow-up physical examinations that could have restored eligibility for those 'failing' sex chromatin tests, most affected athletes seemed to prefer to 'retire'. All

  11. PERFORMANCE VERIFICATION OF STORMWATER TREATMENT DEVICES UNDER EPA'S ENVIRONMENTAL TECHNOLOGY VERIFICATION PROGRAM

    EPA Science Inventory

    The Environmental Technology Verification (ETV) Program was created to facilitate the deployment of innovative or improved environmental technologies through performance verification and dissemination of information. The program's goal is to further environmental protection by a...

  12. PERFORMANCE VERIFICATION OF ANIMAL WATER TREATMENT TECHNOLOGIES THROUGH EPA'S ENVIRONMENTAL TECHNOLOGY VERIFICATION PROGRAM

    EPA Science Inventory

    The U.S. Environmental Protection Agency created the Environmental Technology Verification Program (ETV) to further environmental protection by accelerating the commercialization of new and innovative technology through independent performance verification and dissemination of in...

  14. ENVIRONMENTAL TECHNOLOGY VERIFICATION COATINGS AND COATING EQUIPMENT PROGRAM (ETV CCEP): LIQUID COATINGS--GENERIC VERIFICATION PROTOCOL

    EPA Science Inventory

    This report is a generic verification protocol (GVP) which provides standards for testing liquid coatings for their environmental impacts under the Environmental Technology Verification program. It provides generic guidelines for product-specific testing and quality assurance p...

  15. ENVIRONMENTAL TECHNOLOGY VERIFICATION: GENERIC VERIFICATION PROTOCOL FOR BIOLOGICAL AND AEROSOL TESTING OF GENERAL VENTILATION AIR CLEANERS

    EPA Science Inventory

    The U.S. Environmental Protection Agency established the Environmental Technology Verification Program to accelerate the development and commercialization of improved environmental technology through third party verification and reporting of product performance. Research Triangl...

  17. Using tools for verification, documentation and testing

    NASA Technical Reports Server (NTRS)

    Osterweil, L. J.

    1978-01-01

    Methodologies are discussed for four of the major approaches to program upgrading, namely dynamic testing, symbolic execution, formal verification, and static analysis. The different patterns of strengths, weaknesses, and applications of these approaches are shown. It is demonstrated that these patterns are in many ways complementary, offering the hope that they can be coordinated and unified into a single comprehensive program testing and verification system capable of performing a diverse and useful variety of error detection, verification, and documentation functions.

  18. Magnetic cleanliness verification approach on tethered satellite

    NASA Technical Reports Server (NTRS)

    Messidoro, Piero; Braghin, Massimo; Grande, Maurizio

    1990-01-01

    Magnetic cleanliness testing was performed on the Tethered Satellite as the last step of an articulated verification campaign aimed at demonstrating the capability of the satellite to support its TEMAG (TEthered MAgnetometer) experiment. Tests at unit level and analytical predictions/correlations using a dedicated mathematical model (GANEW program) are also part of the verification activities. Details of the tests are presented, and the results of the verification are described together with recommendations for later programs.

  19. Formal verification of a fault tolerant clock synchronization algorithm

    NASA Technical Reports Server (NTRS)

    Rushby, John; Vonhenke, Frieder

    1989-01-01

    A formal specification and mechanically assisted verification of the interactive convergence clock synchronization algorithm of Lamport and Melliar-Smith is described. Several technical flaws in the analysis given by Lamport and Melliar-Smith were discovered, even though their presentation is unusually precise and detailed. It seems that these flaws were not detected by informal peer scrutiny. The flaws are discussed and a revised presentation of the analysis is given that not only corrects the flaws but is also more precise and easier to follow. Some of the corrections to the flaws require slight modifications to the original assumptions underlying the algorithm and to the constraints on its parameters, and thus change the external specifications of the algorithm. The formal analysis of the interactive convergence clock synchronization algorithm was performed using the Enhanced Hierarchical Development Methodology (EHDM) formal specification and verification environment. This application of EHDM provides a demonstration of some of the capabilities of the system.

  20. How Formal Dynamic Verification Tools Facilitate Novel Concurrency Visualizations

    NASA Astrophysics Data System (ADS)

    Aananthakrishnan, Sriram; Delisi, Michael; Vakkalanka, Sarvani; Vo, Anh; Gopalakrishnan, Ganesh; Kirby, Robert M.; Thakur, Rajeev

    With the exploding scale of concurrency, presenting valuable pieces of information collected by formal verification tools intuitively and graphically can greatly enhance concurrent system debugging. Traditional MPI program debuggers present trace views of MPI program executions. Such views are redundant, often containing equivalent traces that permute independent MPI calls. In our ISP formal dynamic verifier for MPI programs, we present a collection of alternate views made possible by the use of formal dynamic verification. Some of ISP’s views help pinpoint errors, some facilitate discerning errors by eliminating redundancy, while others help understand the program better by displaying concurrent event orderings that must be respected by all MPI implementations, in the form of completes-before graphs. In this paper, we describe ISP’s graphical user interface (GUI) capabilities in all these areas, which are currently supported by a portable Java based GUI, a Microsoft Visual Studio GUI, and an Eclipse based GUI whose development is in progress.

  1. Input apparatus for dynamic signature verification systems

    DOEpatents

    EerNisse, Errol P.; Land, Cecil E.; Snelling, Jay B.

    1978-01-01

    The disclosure relates to signature verification input apparatus comprising a writing instrument and platen containing piezoelectric transducers which generate signals in response to writing pressures.

  2. The SeaHorn Verification Framework

    NASA Technical Reports Server (NTRS)

    Gurfinkel, Arie; Kahsai, Temesghen; Komuravelli, Anvesh; Navas, Jorge A.

    2015-01-01

    In this paper, we present SeaHorn, a software verification framework. The key distinguishing feature of SeaHorn is its modular design that separates the concerns of the syntax of the programming language, its operational semantics, and the verification semantics. SeaHorn encompasses several novelties: it (a) encodes verification conditions using an efficient yet precise inter-procedural technique, (b) provides flexibility in the verification semantics to allow different levels of precision, (c) leverages the state-of-the-art in software model checking and abstract interpretation for verification, and (d) uses Horn-clauses as an intermediate language to represent verification conditions which simplifies interfacing with multiple verification tools based on Horn-clauses. SeaHorn provides users with a powerful verification tool and researchers with an extensible and customizable framework for experimenting with new software verification techniques. The effectiveness and scalability of SeaHorn are demonstrated by an extensive experimental evaluation using benchmarks from SV-COMP 2015 and real avionics code.

  3. Automated verification of flight software. User's manual

    NASA Technical Reports Server (NTRS)

    Saib, S. H.

    1982-01-01

    AVFS (Automated Verification of Flight Software), a collection of tools for analyzing source programs written in FORTRAN and AED, is documented. The quality and the reliability of flight software are improved by: (1) indented listings of source programs, (2) static analysis to detect inconsistencies in the use of variables and parameters, (3) automated documentation, (4) instrumentation of source code, (5) retesting guidance, (6) analysis of assertions, (7) symbolic execution, (8) generation of verification conditions, and (9) simplification of verification conditions. Use of AVFS in the verification of flight software is described.

  4. Acceptance sampling methods for sample results verification

    SciTech Connect

    Jesse, C.A.

    1993-06-01

    This report proposes a statistical sampling method for use during the sample results verification portion of the validation of data packages. In particular, this method was derived specifically for the validation of data packages for metals target analyte analysis performed under United States Environmental Protection Agency Contract Laboratory Program protocols, where sample results verification can be quite time consuming. The purpose of such a statistical method is to provide options in addition to the "all or nothing" options that currently exist for sample results verification. The proposed method allows the amount of data validated during the sample results verification process to be based on a balance between risks and the cost of inspection.
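    A single-sampling acceptance plan of the kind such reports typically analyze can be characterized by its operating characteristic: the probability of accepting a data package as a function of its underlying discrepancy rate. The sketch below illustrates that balance between risk and inspection cost; the plan parameters are hypothetical, not taken from the report.

```python
import math

def acceptance_probability(n, c, defect_rate):
    # Binomial probability of observing at most c nonconforming results
    # when n results are inspected from a lot with the given defect rate.
    return sum(math.comb(n, k) * defect_rate ** k * (1 - defect_rate) ** (n - k)
               for k in range(c + 1))

# Inspect 20 sample results, accept the package if at most 1 is discrepant:
p_good = acceptance_probability(20, 1, 0.02)  # good packages almost always pass
p_bad = acceptance_probability(20, 1, 0.20)   # bad packages are usually rejected
```

Tuning n and c trades verification effort against the producer's and consumer's risks, which is exactly the middle ground between the "all or nothing" options the abstract mentions.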

  5. Hydrologic data-verification management program plan

    USGS Publications Warehouse

    Alexander, C.W.

    1982-01-01

    Data verification refers to the performance of quality control on hydrologic data that have been retrieved from the field and are being prepared for dissemination to water-data users. Water-data users now have access to computerized data files containing unpublished, unverified hydrologic data. Therefore, it is necessary to develop techniques and systems whereby the computer can perform some data-verification functions before the data are stored in user-accessible files. Computerized data-verification routines can be developed for this purpose. A single, unified concept, consisting of a master data-verification program that uses multiple special-purpose subroutines and a screen file containing verification criteria, can probably be adapted to any type and size of computer-processing system. Some traditional manual-verification procedures can be adapted for computerized verification, but new procedures can also be developed that would take advantage of the powerful statistical tools and data-handling procedures available to the computer. Prototype data-verification systems should be developed for all three data-processing environments as soon as possible. The WATSTORE system probably affords the greatest opportunity for long-range research and testing of new verification subroutines. (USGS)
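    The "master program plus screen file" concept described above might be sketched as follows; the station name, criteria fields, and limits are entirely hypothetical, and a production system would add many more special-purpose checks.

```python
# Screen file: per-station verification criteria (all values hypothetical).
SCREEN_FILE = {
    "station_A": {"min": 0.0, "max": 1200.0, "max_jump": 150.0},
}

def verify_records(station, values, screen=SCREEN_FILE):
    # Master routine: flag values that violate the station's criteria
    # before they reach user-accessible files.
    crit = screen[station]
    flags, prev = [], None
    for i, v in enumerate(values):
        if not crit["min"] <= v <= crit["max"]:
            flags.append((i, "out of range"))
        elif prev is not None and abs(v - prev) > crit["max_jump"]:
            flags.append((i, "abrupt change"))
        prev = v
    return flags

flags = verify_records("station_A", [100.0, 110.0, 980.0, 1010.0, -5.0])
```

Keeping the criteria in a separate screen file means new limits or stations can be added without changing the verification program itself, which is the portability the report argues for.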

  6. 40 CFR 1065.550 - Gas analyzer range verification and drift verification.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... Cycles § 1065.550 Gas analyzer range verification and drift verification. (a) Range verification. If an... with a CLD and the removed water is corrected based on measured CO2, CO, THC, and NOX concentrations...-specific emissions over the entire duty cycle for drift. For each constituent to be verified, both sets...

  7. Turbulence Modeling Verification and Validation

    NASA Technical Reports Server (NTRS)

    Rumsey, Christopher L.

    2014-01-01

    Computational fluid dynamics (CFD) software that solves the Reynolds-averaged Navier-Stokes (RANS) equations has been in routine use for more than a quarter of a century. It is currently employed not only for basic research in fluid dynamics, but also for the analysis and design processes in many industries worldwide, including aerospace, automotive, power generation, chemical manufacturing, polymer processing, and petroleum exploration. A key feature of RANS CFD is the turbulence model. Because the RANS equations are unclosed, a model is necessary to describe the effects of the turbulence on the mean flow, through the Reynolds stress terms. The turbulence model is one of the largest sources of uncertainty in RANS CFD, and most models are known to be flawed in one way or another. Alternative methods such as direct numerical simulations (DNS) and large eddy simulations (LES) rely less on modeling and hence include more physics than RANS. In DNS all turbulent scales are resolved, and in LES the large scales are resolved and the effects of the smallest turbulence scales are modeled. However, both DNS and LES are too expensive for most routine industrial usage on today's computers. Hybrid RANS-LES, which blends RANS near walls with LES away from walls, helps to moderate the cost while still retaining some of the scale-resolving capability of LES, but for some applications it can still be too expensive. Even considering its associated uncertainties, RANS turbulence modeling has proved to be very useful for a wide variety of applications. For example, in the aerospace field, many RANS models are considered to be reliable for computing attached flows. However, existing turbulence models are known to be inaccurate for many flows involving separation. Research has been ongoing for decades in an attempt to improve turbulence models for separated and other nonequilibrium flows. When developing or improving turbulence models, both verification and validation are important

  8. Why do verification and validation?

    SciTech Connect

    Hu, Kenneth T.; Paez, Thomas L.

    2016-02-19

    In this discussion paper, we explore different ways to assess the value of verification and validation (V&V) of engineering models. We first present a literature review on the value of V&V and then use value chains and decision trees to show how value can be assessed from a decision maker's perspective. In this context, the value is what the decision maker is willing to pay for V&V analysis with the understanding that the V&V results are uncertain. Finally, the 2014 Sandia V&V Challenge Workshop is used to illustrate these ideas.

  9. Approaches to wind resource verification

    NASA Technical Reports Server (NTRS)

    Barchet, W. R.

    1982-01-01

    Verification of the regional wind energy resource assessments produced by the Pacific Northwest Laboratory addresses the question: Is the magnitude of the resource given in the assessments truly representative of the area of interest? Approaches using qualitative indicators of wind speed (tree deformation, eolian features), old and new data of opportunity not at sites specifically chosen for their exposure to the wind, and data by design from locations specifically selected to be good wind sites are described. Data requirements and evaluation procedures for verifying the resource are discussed.

  10. Optimal Imaging for Treaty Verification

    SciTech Connect

    Brubaker, Erik; Hilton, Nathan R.; Johnson, William; Marleau, Peter; Kupinski, Matthew; MacGahan, Christopher Jonathan

    2014-09-01

    Future arms control treaty verification regimes may use radiation imaging measurements to confirm and track nuclear warheads or other treaty accountable items (TAIs). This project leverages advanced inference methods developed for medical and adaptive imaging to improve task performance in arms control applications. Additionally, we seek a method to acquire and analyze imaging data of declared TAIs without creating an image of those objects or otherwise storing or revealing any classified information. Such a method would avoid the use of classified-information barriers (IB).

  11. Structural dynamics verification facility study

    NASA Technical Reports Server (NTRS)

    Kiraly, L. J.; Hirchbein, M. S.; Mcaleese, J. M.; Fleming, D. P.

    1981-01-01

    The need for a structural dynamics verification facility to support structures programs was studied. Most of the industry operated facilities are used for highly focused research, component development, and problem solving, and are not used for the generic understanding of the coupled dynamic response of major engine subsystems. Capabilities for the proposed facility include: the ability to both excite and measure coupled structural dynamic response of elastic blades on elastic shafting, the mechanical simulation of various dynamical loadings representative of those seen in operating engines, and the measurement of engine dynamic deflections and interface forces caused by alternative engine mounting configurations and compliances.

  12. GENERIC VERIFICATION PROTOCOL FOR THE VERIFICATION OF PESTICIDE SPRAY DRIFT REDUCTION TECHNOLOGIES FOR ROW AND FIELD CROPS

    EPA Science Inventory

    This ETV program generic verification protocol was prepared and reviewed for the Verification of Pesticide Drift Reduction Technologies project. The protocol provides a detailed methodology for conducting and reporting results from a verification test of pesticide drift reductio...

  13. The monitoring and verification of nuclear weapons

    SciTech Connect

    Garwin, Richard L.

    2014-05-09

    This paper partially reviews and updates the potential for monitoring and verification of nuclear weapons, including verification of their destruction. Cooperative monitoring with templates of the gamma-ray spectrum is an important tool, dependent on the use of information barriers.

  14. IMPROVING AIR QUALITY THROUGH ENVIRONMENTAL TECHNOLOGY VERIFICATIONS

    EPA Science Inventory

    The U.S. Environmental Protection Agency (EPA) began the Environmental Technology Verification (ETV) Program in 1995 as a means of working with the private sector to establish a market-based verification process available to all environmental technologies. Under EPA's Office of R...

  16. Gender Verification of Female Olympic Athletes.

    ERIC Educational Resources Information Center

    Dickinson, Barry D.; Genel, Myron; Robinowitz, Carolyn B.; Turner, Patricia L.; Woods, Gary L.

    2002-01-01

    Gender verification of female athletes has long been criticized by geneticists, endocrinologists, and others in the medical community. Recently, the International Olympic Committee's Athletic Commission called for discontinuation of mandatory laboratory-based gender verification of female athletes. This article discusses normal sexual…

  17. 47 CFR 2.902 - Verification.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... 47 Telecommunication 1 2013-10-01 2013-10-01 false Verification. 2.902 Section 2.902 Telecommunication FEDERAL COMMUNICATIONS COMMISSION GENERAL FREQUENCY ALLOCATIONS AND RADIO TREATY MATTERS; GENERAL RULES AND REGULATIONS Equipment Authorization Procedures General Provisions § 2.902 Verification....

  18. 47 CFR 2.902 - Verification.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... 47 Telecommunication 1 2012-10-01 2012-10-01 false Verification. 2.902 Section 2.902 Telecommunication FEDERAL COMMUNICATIONS COMMISSION GENERAL FREQUENCY ALLOCATIONS AND RADIO TREATY MATTERS; GENERAL RULES AND REGULATIONS Equipment Authorization Procedures General Provisions § 2.902 Verification....

  19. 47 CFR 2.902 - Verification.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... 47 Telecommunication 1 2014-10-01 2014-10-01 false Verification. 2.902 Section 2.902 Telecommunication FEDERAL COMMUNICATIONS COMMISSION GENERAL FREQUENCY ALLOCATIONS AND RADIO TREATY MATTERS; GENERAL RULES AND REGULATIONS Equipment Authorization Procedures General Provisions § 2.902 Verification....

  20. 47 CFR 2.902 - Verification.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 47 Telecommunication 1 2010-10-01 2010-10-01 false Verification. 2.902 Section 2.902 Telecommunication FEDERAL COMMUNICATIONS COMMISSION GENERAL FREQUENCY ALLOCATIONS AND RADIO TREATY MATTERS; GENERAL RULES AND REGULATIONS Equipment Authorization Procedures General Provisions § 2.902 Verification....

  1. 47 CFR 2.902 - Verification.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... 47 Telecommunication 1 2011-10-01 2011-10-01 false Verification. 2.902 Section 2.902 Telecommunication FEDERAL COMMUNICATIONS COMMISSION GENERAL FREQUENCY ALLOCATIONS AND RADIO TREATY MATTERS; GENERAL RULES AND REGULATIONS Equipment Authorization Procedures General Provisions § 2.902 Verification....

  2. 14 CFR 460.17 - Verification program.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... a flight. Verification must include flight testing. ... 14 Aeronautics and Space 4 2010-01-01 2010-01-01 false Verification program. 460.17 Section 460.17... TRANSPORTATION LICENSING HUMAN SPACE FLIGHT REQUIREMENTS Launch and Reentry with Crew § 460.17...

  3. ENVIRONMENTAL TECHNOLOGY VERIFICATION FOR INDOOR AIR PRODUCTS

    EPA Science Inventory

    The paper discusses environmental technology verification (ETV) for indoor air products. RTI is developing the framework for a verification testing program for indoor air products, as part of EPA's ETV program. RTI is establishing test protocols for products that fit into three...

  4. Fingerprint verification prediction model in hand dermatitis.

    PubMed

    Lee, Chew K; Chang, Choong C; Johor, Asmah; Othman, Puwira; Baba, Roshidah

    2015-07-01

    Hand dermatitis-associated fingerprint changes are a significant problem and affect fingerprint verification processes. This study was done to develop a clinically useful prediction model for fingerprint verification in patients with hand dermatitis. A case-control study was conducted involving 100 patients with hand dermatitis. All patients verified their thumbprints against their identity card. Registered fingerprints were randomized into a model derivation and a model validation group. The predictive model was derived using multiple logistic regression. Validation was done using the goodness-of-fit test. The fingerprint verification prediction model consists of a major criterion (fingerprint dystrophy area of ≥ 25%) and two minor criteria (long horizontal lines and long vertical lines). The presence of the major criterion predicts that verification will almost always fail, while the presence of both minor criteria and the presence of one minor criterion predict high and low risk of verification failure, respectively. When none of the criteria are met, the fingerprint almost always passes verification. The area under the receiver operating characteristic curve was 0.937, and the goodness-of-fit test showed agreement between the observed and expected numbers (P = 0.26). The derived fingerprint verification failure prediction model is validated and highly discriminatory in predicting the risk of verification failure in patients with hand dermatitis. © 2014 The International Society of Dermatology.
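The decision rules quoted in the abstract translate directly into code. The Python sketch below applies the major/minor criteria as stated; the function name and the qualitative return labels are illustrative:

```python
def predict_verification(dystrophy_pct, long_horizontal, long_vertical):
    """Apply the major/minor criteria from the abstract and return a
    qualitative prediction of the fingerprint verification outcome."""
    if dystrophy_pct >= 25:                # major criterion: dystrophy area >= 25%
        return "almost always fails"
    minors = int(long_horizontal) + int(long_vertical)
    if minors == 2:                        # both minor criteria present
        return "high risk of failure"
    if minors == 1:                        # one minor criterion present
        return "low risk of failure"
    return "almost always passes"          # no criteria met
```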

  6. 40 CFR 1066.220 - Linearity verification.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... 40 Protection of Environment 34 2013-07-01 2013-07-01 false Linearity verification. 1066.220 Section 1066.220 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) AIR POLLUTION CONTROLS VEHICLE-TESTING PROCEDURES Dynamometer Specifications § 1066.220 Linearity verification. (a) Scope...

  7. 18 CFR 158.5 - Verification.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... 18 Conservation of Power and Water Resources 1 2011-04-01 2011-04-01 false Verification. 158.5 Section 158.5 Conservation of Power and Water Resources FEDERAL ENERGY REGULATORY COMMISSION, DEPARTMENT....5 Verification. The facts stated in the memorandum must be sworn to by persons having...

  8. 18 CFR 158.5 - Verification.

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... 18 Conservation of Power and Water Resources 1 2012-04-01 2012-04-01 false Verification. 158.5 Section 158.5 Conservation of Power and Water Resources FEDERAL ENERGY REGULATORY COMMISSION, DEPARTMENT....5 Verification. The facts stated in the memorandum must be sworn to by persons having...

  9. 18 CFR 286.107 - Verification.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... 18 Conservation of Power and Water Resources 1 2011-04-01 2011-04-01 false Verification. 286.107 Section 286.107 Conservation of Power and Water Resources FEDERAL ENERGY REGULATORY COMMISSION, DEPARTMENT... Contested Audit Findings and Proposed Remedies § 286.107 Verification. The facts stated in the...

  10. 18 CFR 41.5 - Verification.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... 18 Conservation of Power and Water Resources 1 2011-04-01 2011-04-01 false Verification. 41.5 Section 41.5 Conservation of Power and Water Resources FEDERAL ENERGY REGULATORY COMMISSION, DEPARTMENT OF....5 Verification. The facts stated in the memorandum must be sworn to by persons having...

  11. 18 CFR 349.5 - Verification.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... 18 Conservation of Power and Water Resources 1 2011-04-01 2011-04-01 false Verification. 349.5 Section 349.5 Conservation of Power and Water Resources FEDERAL ENERGY REGULATORY COMMISSION, DEPARTMENT... PROPOSED REMEDIES § 349.5 Verification. The facts stated in the memorandum must be sworn to by...

  12. 18 CFR 286.107 - Verification.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... 18 Conservation of Power and Water Resources 1 2010-04-01 2010-04-01 false Verification. 286.107 Section 286.107 Conservation of Power and Water Resources FEDERAL ENERGY REGULATORY COMMISSION, DEPARTMENT... Contested Audit Findings and Proposed Remedies § 286.107 Verification. The facts stated in the...

  13. 18 CFR 41.5 - Verification.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... 18 Conservation of Power and Water Resources 1 2010-04-01 2010-04-01 false Verification. 41.5 Section 41.5 Conservation of Power and Water Resources FEDERAL ENERGY REGULATORY COMMISSION, DEPARTMENT OF....5 Verification. The facts stated in the memorandum must be sworn to by persons having...

  14. 18 CFR 41.5 - Verification.

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... 18 Conservation of Power and Water Resources 1 2012-04-01 2012-04-01 false Verification. 41.5 Section 41.5 Conservation of Power and Water Resources FEDERAL ENERGY REGULATORY COMMISSION, DEPARTMENT OF....5 Verification. The facts stated in the memorandum must be sworn to by persons having...

  15. 18 CFR 349.5 - Verification.

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... 18 Conservation of Power and Water Resources 1 2012-04-01 2012-04-01 false Verification. 349.5 Section 349.5 Conservation of Power and Water Resources FEDERAL ENERGY REGULATORY COMMISSION, DEPARTMENT... PROPOSED REMEDIES § 349.5 Verification. The facts stated in the memorandum must be sworn to by...

  16. 18 CFR 158.5 - Verification.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... 18 Conservation of Power and Water Resources 1 2010-04-01 2010-04-01 false Verification. 158.5 Section 158.5 Conservation of Power and Water Resources FEDERAL ENERGY REGULATORY COMMISSION, DEPARTMENT....5 Verification. The facts stated in the memorandum must be sworn to by persons having...

  17. 18 CFR 349.5 - Verification.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... 18 Conservation of Power and Water Resources 1 2010-04-01 2010-04-01 false Verification. 349.5 Section 349.5 Conservation of Power and Water Resources FEDERAL ENERGY REGULATORY COMMISSION, DEPARTMENT... PROPOSED REMEDIES § 349.5 Verification. The facts stated in the memorandum must be sworn to by...

  18. 14 CFR 460.17 - Verification program.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... TRANSPORTATION LICENSING HUMAN SPACE FLIGHT REQUIREMENTS Launch and Reentry with Crew § 460.17 Verification... software in an operational flight environment before allowing any space flight participant on board during... 14 Aeronautics and Space 4 2012-01-01 2012-01-01 false Verification program. 460.17 Section...

  19. 14 CFR 460.17 - Verification program.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... TRANSPORTATION LICENSING HUMAN SPACE FLIGHT REQUIREMENTS Launch and Reentry with Crew § 460.17 Verification... software in an operational flight environment before allowing any space flight participant on board during... 14 Aeronautics and Space 4 2014-01-01 2014-01-01 false Verification program. 460.17 Section...

  20. 14 CFR 460.17 - Verification program.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... TRANSPORTATION LICENSING HUMAN SPACE FLIGHT REQUIREMENTS Launch and Reentry with Crew § 460.17 Verification... software in an operational flight environment before allowing any space flight participant on board during... 14 Aeronautics and Space 4 2013-01-01 2013-01-01 false Verification program. 460.17 Section...

  1. Identity Verification, Control, and Aggression in Marriage

    ERIC Educational Resources Information Center

    Stets, Jan E.; Burke, Peter J.

    2005-01-01

    In this research we study the identity verification process and its effects in marriage. Drawing on identity control theory, we hypothesize that a lack of verification in the spouse identity (1) threatens stable self-meanings and interaction patterns between spouses, and (2) challenges a (nonverified) spouse's perception of control over the…

  2. CFE verification: The decision to inspect

    SciTech Connect

    Allentuck, J.

    1990-01-01

    Verification of compliance with the provisions of the Treaty on Conventional Armed Forces in Europe (CFE) is subject to inspection quotas of various kinds. Thus the decision to carry out a specific inspection or verification activity must be prudently made. This decision process is outlined, and means for conserving quotas are suggested. 4 refs., 1 fig.

  3. 24 CFR 4001.112 - Income verification.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... 24 Housing and Urban Development 5 2010-04-01 2010-04-01 false Income verification. 4001.112... Requirements and Underwriting Procedures § 4001.112 Income verification. The mortgagee shall use FHA's procedures to verify the mortgagor's income and shall comply with the following additional requirements:...

  4. 24 CFR 4001.112 - Income verification.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... 24 Housing and Urban Development 5 2011-04-01 2011-04-01 false Income verification. 4001.112... Requirements and Underwriting Procedures § 4001.112 Income verification. The mortgagee shall use FHA's procedures to verify the mortgagor's income and shall comply with the following additional requirements:...

  5. 18 CFR 34.8 - Verification.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... 18 Conservation of Power and Water Resources 1 2010-04-01 2010-04-01 false Verification. 34.8 Section 34.8 Conservation of Power and Water Resources FEDERAL ENERGY REGULATORY COMMISSION, DEPARTMENT OF... SECURITIES OR THE ASSUMPTION OF LIABILITIES § 34.8 Verification. Link to an amendment published at 70 FR...

  6. ENVIRONMENTAL TECHNOLOGY VERIFICATION AND INDOOR AIR

    EPA Science Inventory

    The paper discusses environmental technology verification and indoor air. RTI has responsibility for a pilot program for indoor air products as part of the U.S. EPA's Environmental Technology Verification (ETV) program. The program objective is to further the development of sel...

  7. Guidelines for qualifying cleaning and verification materials

    NASA Technical Reports Server (NTRS)

    Webb, D.

    1995-01-01

    This document is intended to provide guidance in identifying technical issues which must be addressed in a comprehensive qualification plan for materials used in cleaning and cleanliness verification processes. Information presented herein is intended to facilitate development of a definitive checklist that should address all pertinent materials issues when down-selecting a cleaning/verification medium.

  9. Technology Foresight and nuclear test verification: a structured and participatory approach

    NASA Astrophysics Data System (ADS)

    Noack, Patrick; Gaya-Piqué, Luis; Haralabus, Georgios; Auer, Matthias; Jain, Amit; Grenard, Patrick

    2013-04-01

    maturity: the number of years until the technology in question reaches Development Stage 3 (i.e. prototype validated). 6. Integration effort: the anticipated level of effort required by the PTS to fully integrate the technology, process, concept or idea into its verification environment. 7. Time to impact: the number of years until the technology is fully developed and integrated into the PTS verification environment and delivers on its full potential. The resulting database is coupled to Pivot, a novel information management software tool which offers powerful visualisation of the taxonomy's parameters for each technology. Pivot offers many advantages over conventional spreadsheet-interfaced database tools: based on shared categories in the taxonomy, users can quickly and intuitively discover linkages, commonalities and various interpretations of prospective CTBT-pertinent technologies. It is easily possible to visualise a resulting sub-set of technologies that conform to the specific user-selected attributes from the full range of taxonomy categories. In this presentation we will illustrate the range of future technologies, processes, concepts and ideas; we will demonstrate how the Pivot tool can be fruitfully applied to assist in strategic planning and development, and to identify gaps apparent on the technology development horizon. Finally, we will show how the Pivot tool, together with the taxonomy, offers real and emerging insights that make sense of large amounts of disparate technologies.

  10. Video-based fingerprint verification.

    PubMed

    Qin, Wei; Yin, Yilong; Liu, Lili

    2013-09-04

    Conventional fingerprint verification systems use only static information. In this paper, fingerprint videos, which contain dynamic information, are utilized for verification. Fingerprint videos are acquired by the same capture device that acquires conventional fingerprint images, and the user experience of providing a fingerprint video is the same as that of providing a single impression. After preprocessing and aligning processes, "inside similarity" and "outside similarity" are defined and calculated to take advantage of both dynamic and static information contained in fingerprint videos. Match scores between two matching fingerprint videos are then calculated by combining the two kinds of similarity. Experimental results show that the proposed video-based method leads to a relative reduction of 60 percent in the equal error rate (EER) in comparison to the conventional single impression-based method. We also analyze the time complexity of our method when different combinations of strategies are used. Our method still outperforms the conventional method, even if both methods have the same time complexity. Finally, experimental results demonstrate that the proposed video-based method can lead to better accuracy than the multiple impressions fusion method, and the proposed method has a much lower false acceptance rate (FAR) when the false rejection rate (FRR) is quite low.
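The abstract defines "inside similarity" and "outside similarity" but does not give the exact fusion rule, so the sketch below simply uses a weighted average as a placeholder; the names, weight, and threshold are all assumptions:

```python
def match_score(inside_sim, outside_sim, weight=0.5):
    """Fuse the two similarities into a single match score; a plain
    weighted average stands in for the paper's combination rule."""
    return weight * inside_sim + (1.0 - weight) * outside_sim

def verify(inside_sim, outside_sim, threshold=0.6):
    """Accept the match only if the fused score clears the threshold."""
    return match_score(inside_sim, outside_sim) >= threshold
```

In a real system the weight and threshold would be tuned on a development set to trade off the false acceptance and false rejection rates discussed in the abstract.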

  11. Quantitative reactive modeling and verification.

    PubMed

    Henzinger, Thomas A

    Formal verification aims to improve the quality of software by detecting errors before they do harm. At the basis of formal verification is the logical notion of correctness, which purports to capture whether or not a program behaves as desired. We suggest that the boolean partition of software into correct and incorrect programs falls short of the practical need to assess the behavior of software in a more nuanced fashion against multiple criteria. We therefore propose to introduce quantitative fitness measures for programs, specifically for measuring the function, performance, and robustness of reactive programs such as concurrent processes. This article describes the goals of the ERC Advanced Investigator Project QUAREM. The project aims to build and evaluate a theory of quantitative fitness measures for reactive models. Such a theory must strive to obtain quantitative generalizations of the paradigms that have been success stories in qualitative reactive modeling, such as compositionality, property-preserving abstraction and abstraction refinement, model checking, and synthesis. The theory will be evaluated not only in the context of software and hardware engineering, but also in the context of systems biology. In particular, we will use the quantitative reactive models and fitness measures developed in this project for testing hypotheses about the mechanisms behind data from biological experiments.

  12. RISKIND verification and benchmark comparisons

    SciTech Connect

    Biwer, B.M.; Arnish, J.J.; Chen, S.Y.; Kamboj, S.

    1997-08-01

    This report presents verification calculations and benchmark comparisons for RISKIND, a computer code designed to estimate potential radiological consequences and health risks to individuals and the population from exposures associated with the transportation of spent nuclear fuel and other radioactive materials. Spreadsheet calculations were performed to verify the proper operation of the major options and calculational steps in RISKIND. The program is unique in that it combines a variety of well-established models into a comprehensive treatment for assessing risks from the transportation of radioactive materials. Benchmark comparisons with other validated codes that incorporate similar models were also performed. For instance, the external gamma and neutron dose rate curves for a shipping package estimated by RISKIND were compared with those estimated using the RADTRAN 4 code and NUREG-0170 methodology. Atmospheric dispersion of released material and the resulting dose estimates were compared with those from the GENII and CAP88-PC codes. Verification results have shown the program to be performing its intended function correctly. The benchmark results indicate that the predictions made by RISKIND are within acceptable limits when compared with predictions from similar existing models.
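A benchmark comparison of this kind ultimately reduces to an agreement test between two codes' predictions. The Python fragment below is a generic illustration, not the acceptance criterion used in the report; the 20% tolerance and the numeric values are invented:

```python
def within_benchmark(predicted, reference, tolerance=0.2):
    """True if a code's prediction agrees with the benchmark value to
    within a fractional tolerance of the reference."""
    return abs(predicted - reference) <= tolerance * abs(reference)

# Hypothetical dose-rate comparisons (values invented for illustration):
agreements = [within_benchmark(p, r) for p, r in [(1.1, 1.0), (0.95, 1.0), (1.5, 1.0)]]
```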

  13. Video-Based Fingerprint Verification

    PubMed Central

    Qin, Wei; Yin, Yilong; Liu, Lili

    2013-01-01

    Conventional fingerprint verification systems use only static information. In this paper, fingerprint videos, which contain dynamic information, are utilized for verification. Fingerprint videos are acquired by the same capture device that acquires conventional fingerprint images, and the user experience of providing a fingerprint video is the same as that of providing a single impression. After preprocessing and aligning processes, “inside similarity” and “outside similarity” are defined and calculated to take advantage of both dynamic and static information contained in fingerprint videos. Match scores between two matching fingerprint videos are then calculated by combining the two kinds of similarity. Experimental results show that the proposed video-based method leads to a relative reduction of 60 percent in the equal error rate (EER) in comparison to the conventional single impression-based method. We also analyze the time complexity of our method when different combinations of strategies are used. Our method still outperforms the conventional method, even if both methods have the same time complexity. Finally, experimental results demonstrate that the proposed video-based method can lead to better accuracy than the multiple impressions fusion method, and the proposed method has a much lower false acceptance rate (FAR) when the false rejection rate (FRR) is quite low. PMID:24008283

  14. Runtime Verification with State Estimation

    NASA Technical Reports Server (NTRS)

    Stoller, Scott D.; Bartocci, Ezio; Seyster, Justin; Grosu, Radu; Havelund, Klaus; Smolka, Scott A.; Zadok, Erez

    2011-01-01

    We introduce the concept of Runtime Verification with State Estimation and show how this concept can be applied to estimate the probability that a temporal property is satisfied by a run of a program when monitoring overhead is reduced by sampling. In such situations, there may be gaps in the observed program executions, thus making accurate estimation challenging. To deal with the effects of sampling on runtime verification, we view event sequences as observation sequences of a Hidden Markov Model (HMM), use an HMM model of the monitored program to "fill in" sampling-induced gaps in observation sequences, and extend the classic forward algorithm for HMM state estimation (which determines the probability of a state sequence, given an observation sequence) to compute the probability that the property is satisfied by an execution of the program. To validate our approach, we present a case study based on the mission software for a Mars rover. The results of our case study demonstrate high prediction accuracy for the probabilities computed by our algorithm. They also show that our technique is much more accurate than simply evaluating the temporal property on the given observation sequences, ignoring the gaps.
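The gap-filling idea, skipping the emission weighting at unsampled steps so that the unobserved event is marginalized out, can be sketched with a standard HMM forward pass. This is a simplified illustration, not the authors' extended algorithm; the transition convention (state step before emission) and all matrices are assumptions:

```python
import numpy as np

def forward(trans, emit, init, observations):
    """Forward pass over a run with sampling gaps: obs = None marks a gap,
    where the emission weighting is skipped, i.e. the unobserved event is
    marginalized out (each state's emission probabilities sum to one)."""
    alpha = np.asarray(init, dtype=float)
    trans = np.asarray(trans)
    emit = np.asarray(emit)
    for obs in observations:
        alpha = trans.T @ alpha            # propagate the state distribution
        if obs is not None:
            alpha = alpha * emit[:, obs]   # weight by emission likelihood
    return float(alpha.sum())              # probability of the observed events

trans = [[0.9, 0.1], [0.2, 0.8]]   # row-stochastic transition matrix
emit  = [[0.7, 0.3], [0.4, 0.6]]   # per-state emission probabilities
init  = [0.5, 0.5]                 # initial state distribution
```

By linearity, the probability of a sequence with a gap equals the sum of the probabilities over all ways of filling that gap, which is exactly the marginalization the gap handling performs.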

  15. Cognitive Bias in Systems Verification

    NASA Technical Reports Server (NTRS)

    Larson, Steve

    2012-01-01

    Working definition of cognitive bias: Patterns by which information is sought and interpreted that can lead to systematic errors in decisions. Cognitive bias is used in diverse fields: Economics, Politics, Intelligence, Marketing, to name a few. Attempts to ground cognitive science in physical characteristics of the cognitive apparatus exceed our knowledge. Studies based on correlations; strict cause and effect is difficult to pinpoint. Effects cited in the paper and discussed here have been replicated many times over, and appear sound. Many biases have been described, but it is still unclear whether they are all distinct. There may only be a handful of fundamental biases, which manifest in various ways. Bias can affect system verification in many ways. Overconfidence -> Questionable decisions to deploy. Availability -> Inability to conceive critical tests. Representativeness -> Overinterpretation of results. Positive Test Strategies -> Confirmation bias. Debiasing at individual level very difficult. The potential effect of bias on the verification process can be managed, but not eliminated. Worth considering at key points in the process.

  16. Modular verification of concurrent systems

    SciTech Connect

    Sobel, A.E.K.

    1986-01-01

    During the last ten years, a number of authors have proposed verification techniques that allow one to prove properties of individual processes by using global assumptions about the behavior of the remaining processes in the distributed program. As a result, one must justify these global assumptions before drawing any conclusions regarding the correctness of the entire program. This justification is often the most difficult part of the proof and presents a serious obstacle to hierarchical program development. This thesis develops a new approach to the verification of concurrent systems. The approach is modular and supports compositional development of programs, since the proofs of each individual process of a program are completely isolated from all others. The generality of this approach is illustrated by applying it to a representative set of contemporary concurrent programming languages, namely CSP, Ada, Distributed Processes, and a shared-variable language. In addition, it is shown how the approach may be used to deal with a number of other constructs that have been proposed for inclusion in concurrent languages: FORK and JOIN primitives, nested monitor calls, path expressions, atomic transactions, and asynchronous message passing. These results support the argument that the approach is universal and can be used to design proof systems for any concurrent language.

  17. Hierarchical Design and Verification for VLSI

    NASA Technical Reports Server (NTRS)

    Shostak, R. E.; Elliott, W. D.; Levitt, K. N.

    1983-01-01

    The specification and verification work is described in detail, and some of the problems and issues to be resolved in their application to Very Large Scale Integration (VLSI) systems are examined. The hierarchical design methodologies enable a system architect or design team to decompose a complex design into a formal hierarchy of levels of abstraction. The first step in program verification is tree formation. The next step after tree formation is the generation from the trees of the verification conditions themselves. The approach taken here is similar in spirit to the corresponding step in program verification but requires modeling of the semantics of circuit elements rather than program statements. The last step is that of proving the verification conditions using a mechanical theorem-prover.
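The shift from program statements to circuit-element semantics can be illustrated with a hypothetical miniature example (not from the report): gates are modeled as boolean functions, a one-bit full adder is composed from them, and its verification condition is discharged by exhaustive enumeration rather than a mechanical theorem prover, which is feasible only at this toy scale:

```python
# Hypothetical sketch: circuit elements modeled as boolean functions,
# with the full adder's verification condition checked by enumeration.

from itertools import product

def xor(a, b): return a ^ b
def and_(a, b): return a & b
def or_(a, b): return a | b

def full_adder(a, b, cin):
    s1 = xor(a, b)
    total = xor(s1, cin)                    # sum bit
    carry = or_(and_(a, b), and_(s1, cin))  # carry-out bit
    return total, carry

def verify_full_adder():
    # verification condition: sum + 2*carry == a + b + cin for all inputs
    for a, b, cin in product((0, 1), repeat=3):
        s, c = full_adder(a, b, cin)
        if s + 2 * c != a + b + cin:
            return False
    return True
```

At real VLSI scale, enumeration is replaced by the theorem-proving step the abstract describes; the point of the sketch is only that the semantics being modeled are those of circuit elements, not program statements.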

  18. Space transportation system payload interface verification

    NASA Technical Reports Server (NTRS)

    Everline, R. T.

    1977-01-01

    The paper considers STS payload-interface verification requirements and the capability provided by STS to support verification. The intent is to standardize as many interfaces as possible, not only through the design, development, test and evaluation (DDT and E) phase of the major payload carriers but also into the operational phase. The verification process is discussed in terms of its various elements, such as the Space Shuttle DDT and E (including the orbital flight test program) and the major payload carriers DDT and E (including the first flights). Five tools derived from the Space Shuttle DDT and E are available to support the verification process: mathematical (structural and thermal) models, the Shuttle Avionics Integration Laboratory, the Shuttle Manipulator Development Facility, and interface-verification equipment (cargo-integration test equipment).

  19. Gender verification of female athletes.

    PubMed

    Elsas, L J; Ljungqvist, A; Ferguson-Smith, M A; Simpson, J L; Genel, M; Carlson, A S; Ferris, E; de la Chapelle, A; Ehrhardt, A A

    2000-01-01

    The International Olympic Committee (IOC) officially mandated gender verification for female athletes beginning in 1968 and continuing through 1998. The rationale was to prevent masquerading males and women with "unfair, male-like" physical advantage from competing in female-only events. Visual observation and gynecological examination had been tried on a trial basis for two years at some competitions leading up to the 1968 Olympic Games, but these invasive and demeaning processes were jettisoned in favor of laboratory-based genetic tests. Sex chromatin and more recently DNA analyses for Y-specific male material were then required of all female athletes immediately preceding IOC-sanctioned sporting events, and many other international and national competitions following the IOC model. On-site gender verification has since been found to be highly discriminatory, and the cause of emotional trauma and social stigmatization for many females with problems of intersex who have been screened out from competition. Despite compelling evidence for the lack of scientific merit for chromosome-based screening for gender, as well as its functional and ethical inconsistencies, the IOC persisted in its policy for 30 years. The coauthors of this manuscript have worked with some success to rescind this policy through educating athletes and sports governors regarding the psychological and physical nature of sexual differentiation, and the inequities of genetic sex testing. In 1990, the International Amateur Athletics Federation (IAAF) called for abandonment of required genetic screening of women athletes, and by 1992 had adopted a fairer, medically justifiable model for preventing only male "impostors" in international track and field. At the recent recommendation of the IOC Athletes Commission, the Executive Board of the IOC has finally recognized the medical and functional inconsistencies and undue costs of chromosome-based methods. 
In 1999, the IOC ratified the abandonment of on-site gender verification.

  20. Pediatric Readiness and Facility Verification.

    PubMed

    Remick, Katherine; Kaji, Amy H; Olson, Lenora; Ely, Michael; Schmuhl, Patricia; McGrath, Nancy; Edgerton, Elizabeth; Gausche-Hill, Marianne

    2016-03-01

    We perform a needs assessment of pediatric readiness, using a novel scoring system in California emergency departments (EDs), and determine the effect of pediatric verification processes on pediatric readiness. ED nurse managers from all 335 acute care hospital EDs in California were sent a 60-question Web-based assessment. A weighted pediatric readiness score (WPRS), using a 100-point scale, and gap analysis were calculated for each participating ED. Nurse managers from 90% (300/335) of EDs completed the Web-based assessment, including 51 pediatric verified EDs, 67 designated trauma centers, and 31 EDs assessed for pediatric capabilities. Most pediatric visits (87%) occurred in nonchildren's hospitals. The overall median WPRS was 69 (interquartile range [IQR] 57.7, 85.9). Pediatric verified EDs had a higher WPRS (89.6; IQR 84.1, 94.1) compared with nonverified EDs (65.5; IQR 55.5, 76.3) and EDs assessed for pediatric capabilities (70.7; IQR 57.4, 88.9). When verification status and ED volume were controlled for, trauma center designation was not predictive of an increase in the WPRS. Forty-three percent of EDs reported the presence of a quality improvement plan that included pediatric elements, and 53% reported a pediatric emergency care coordinator. When coordinator and quality improvement plan were controlled for, the presence of at least 1 pediatric emergency care coordinator was associated with a higher WPRS (85; IQR 75, 93.1) versus EDs without a coordinator (58; IQR 50.1, 66.9), and the presence of a quality improvement plan was associated with a higher WPRS (88; IQR 76.7, 95) compared with that of hospitals without a plan (62; IQR 51.2, 68.7). Of pediatric verified EDs, 92% had a quality improvement plan for pediatric emergency care and 96% had a pediatric emergency care coordinator. We report on the first comprehensive statewide assessment of "pediatric readiness" in EDs according to the 2009 "Guidelines for Care of Children in the Emergency Department."

  1. Stennis Space Center Verification & Validation Capabilities

    NASA Technical Reports Server (NTRS)

    Pagnutti, Mary; Ryan, Robert E.; Holekamp, Kara; O'Neal, Duane; Knowlton, Kelly; Ross, Kenton; Blonski, Slawomir

    2007-01-01

    Scientists within NASA's Applied Research & Technology Project Office (formerly the Applied Sciences Directorate) have developed a well-characterized remote sensing Verification & Validation (V&V) site at the John C. Stennis Space Center (SSC). This site enables the in-flight characterization of satellite and airborne high spatial resolution remote sensing systems and their products. The smaller scale of the newer high resolution remote sensing systems allows scientists to characterize geometric, spatial, and radiometric data properties using a single V&V site. The targets and techniques used to characterize data from these newer systems can differ significantly from the techniques used to characterize data from the earlier, coarser spatial resolution systems. Scientists have used the SSC V&V site to characterize thermal infrared systems. Enhancements are being considered to characterize active lidar systems. SSC employs geodetic targets, edge targets, radiometric tarps, atmospheric monitoring equipment, and thermal calibration ponds to characterize remote sensing data products. Similar techniques are used to characterize moderate spatial resolution sensing systems at selected nearby locations. The SSC Instrument Validation Lab is a key component of the V&V capability and is used to calibrate field instrumentation and to provide National Institute of Standards and Technology traceability. This poster presents a description of the SSC characterization capabilities and examples of calibration data.

  2. Empirical verification of evolutionary theories of aging

    PubMed Central

    Glebov, Anastasia; Asbah, Nimara; Bruno, Luigi; Meunier, Carolynne; Iouk, Tatiana; Titorenko, Vladimir I.

    2016-01-01

    We recently selected 3 long-lived mutant strains of Saccharomyces cerevisiae by a lasting exposure to exogenous lithocholic acid. Each mutant strain can maintain the extended chronological lifespan after numerous passages in medium without lithocholic acid. In this study, we used these long-lived yeast mutants for empirical verification of evolutionary theories of aging. We provide evidence that the dominant polygenic trait extending longevity of each of these mutants 1) does not affect such key features of early-life fitness as the exponential growth rate, efficacy of post-exponential growth and fecundity; and 2) enhances such features of early-life fitness as susceptibility to chronic exogenous stresses, and the resistance to apoptotic and liponecrotic forms of programmed cell death. These findings validate evolutionary theories of programmed aging. We also demonstrate that under laboratory conditions that imitate the process of natural selection within an ecosystem, each of these long-lived mutant strains is forced out of the ecosystem by the parental wild-type strain exhibiting shorter lifespan. We therefore concluded that yeast cells have evolved some mechanisms for limiting their lifespan upon reaching a certain chronological age. These mechanisms drive the evolution of yeast longevity towards maintaining a finite yeast chronological lifespan within ecosystems. PMID:27783562

  3. Double patterning from design enablement to verification

    NASA Astrophysics Data System (ADS)

    Abercrombie, David; Lacour, Pat; El-Sewefy, Omar; Volkov, Alex; Levine, Evgueni; Arb, Kellen; Reid, Chris; Li, Qiao; Ghosh, Pradiptya

    2011-11-01

    Litho-etch-litho-etch (LELE) is the double patterning (DP) technology of choice for 20 nm contact, via, and lower metal layers. We discuss the unique design and process characteristics of LELE DP, the challenges they present, and various solutions. We examine DP design methodologies, current DP conflict feedback mechanisms, and how they can help designers identify and resolve conflicts. In place and route (P&R), the placement engine must now be aware of the assumptions made during IP cell design and use placement directives provided by the library designer. We examine the new effects DP introduces in detail routing, discuss how multiple choices of LELE cuts and overlap allowances can lead to different solutions, and describe new capabilities required by detail routers and P&R engines. We discuss why LELE DP cuts and overlaps are critical to optical proximity correction (OPC), and how a hybrid mechanism of rule- and model-based overlap generation can provide a fast and effective solution. With two litho-etch steps, mask misalignment and image rounding are now verification considerations. We present enhancements to the OPCVerify engine that check for pinching and bridging in the presence of DP overlay errors and acute angles.
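LELE decomposition assigns features that sit closer than the minimum same-mask spacing to different masks, so a conflict that must be fed back to the designer corresponds to an odd cycle in the "too close" graph. A simplified sketch of conflict detection as a two-coloring (bipartiteness) check follows; the graph model is an assumption for illustration, not this paper's engine:

```python
# Simplified DP conflict check: try to 2-color the conflict graph,
# where an edge joins two features closer than the same-mask minimum.
# An odd cycle makes 2-coloring impossible and is a DP conflict.

from collections import deque

def two_color(n, edges):
    """Returns a mask assignment (list of 0/1) for n features,
    or None if an odd cycle makes LELE decomposition infeasible."""
    adj = [[] for _ in range(n)]
    for a, b in edges:
        adj[a].append(b)
        adj[b].append(a)
    color = [None] * n
    for start in range(n):
        if color[start] is not None:
            continue
        color[start] = 0
        queue = deque([start])
        while queue:
            u = queue.popleft()
            for v in adj[u]:
                if color[v] is None:
                    color[v] = 1 - color[u]    # opposite mask
                    queue.append(v)
                elif color[v] == color[u]:
                    return None                # odd cycle: conflict
    return color
```

A triangle of mutually too-close features is the canonical unresolvable conflict; a chain of any length decomposes cleanly.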

  5. MACCS2 development and verification efforts

    SciTech Connect

    Young, M.; Chanin, D.

    1997-03-01

    MACCS2 represents a major enhancement of the capabilities of its predecessor MACCS, the MELCOR Accident Consequence Code System. MACCS, released in 1987, was developed to estimate the potential impacts to the surrounding public of severe accidents at nuclear power plants. The principal phenomena considered in MACCS/MACCS2 are atmospheric transport and deposition under time-variant meteorology, short-term and long-term mitigative actions and exposure pathways, deterministic and stochastic health effects, and economic costs. MACCS2 was developed as a general-purpose analytical tool applicable to diverse reactor and nonreactor facilities. The MACCS2 package includes three primary enhancements: (1) a more flexible emergency response model, (2) an expanded library of radionuclides, and (3) a semidynamic food-chain model. In addition, errors that had been identified in MACCS version 1.5.11.1 were corrected, including an error that prevented the code from providing intermediate-phase results. The MACCS2 version 1.10 beta test was released to the beta-test group in May 1995. In addition, the University of New Mexico (UNM) has completed an independent verification study of the code package. Since the beta-test release of MACCS2 version 1.10, a number of minor errors have been identified and corrected, and a number of enhancements have been added to the code package. The code enhancements added since the beta-test release of version 1.10 include: (1) an option to allow the user to input the σy and σz plume expansion parameters in a table-lookup form for incremental downwind distances, (2) an option to define different initial dimensions for up to four segments of a release, (3) an enhancement to the COMIDA2 food-chain model preprocessor to allow the user to supply externally calculated tables of tritium food-chain dose per unit deposition on farmland to support analyses of tritium releases, and (4) the capability to calculate direction-dependent doses.
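Enhancement (1), the table lookup of the σy and σz plume expansion parameters at incremental downwind distances, can be sketched as follows. The linear interpolation between table rows and the function name are illustrative assumptions, not MACCS2's actual scheme:

```python
# Hypothetical sketch of a sigma-parameter table lookup: given a table
# of plume expansion values at incremental downwind distances,
# interpolate linearly for an arbitrary downwind distance x.

import bisect

def sigma_lookup(distances, sigmas, x):
    """distances: sorted downwind distances (m); sigmas: matching
    sigma_y or sigma_z values (m); x: distance to evaluate."""
    if x <= distances[0]:
        return sigmas[0]
    if x >= distances[-1]:
        return sigmas[-1]
    i = bisect.bisect_right(distances, x)
    x0, x1 = distances[i - 1], distances[i]
    s0, s1 = sigmas[i - 1], sigmas[i]
    # linear interpolation between the bracketing table rows
    return s0 + (s1 - s0) * (x - x0) / (x1 - x0)
```

In practice dispersion parameters are often interpolated in log-log space; the linear form above is kept only for clarity.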

  6. MFTF sensor verification computer program

    SciTech Connect

    Chow, H.K.

    1984-11-09

    The design, requirements document, and implementation of the MFE Sensor Verification System were accomplished by the Measurement Engineering Section (MES), a group which provides instrumentation for the MFTF magnet diagnostics. The sensors, installed on and around the magnets and solenoids housed in a vacuum chamber, will supply information about the temperature, strain, pressure, liquid helium level, and magnet voltage to the facility operator for evaluation. As the sensors are installed, records must be maintained as to their initial resistance values. Also, as the work progresses, monthly checks will be made to ensure continued sensor health. Finally, after the MFTF-B demonstration, yearly checks will be performed, as well as checks of sensors as problems develop. The software to acquire and store the data was written by Harry Chow, Computations Department. The acquired data will be transferred to the MFE database computer system.

  7. KAT-7 Science Verification Highlights

    NASA Astrophysics Data System (ADS)

    Lucero, Danielle M.; Carignan, Claude; KAT-7 Science Data; Processing Team, KAT-7 Science Commissioning Team

    2015-01-01

    KAT-7 is a pathfinder of the Square Kilometer Array precursor MeerKAT, which is under construction. Its short baselines and low system temperature make it sensitive to large scale, low surface brightness emission. This makes it an ideal instrument to use in searches for faint extended radio emission and low surface density extraplanar gas. We present an update on the progress of several such ongoing KAT-7 science verification projects. These include a large scale radio continuum and polarization survey of the Galactic Center, deep HI observations (100+ hours) of nearby disk galaxies (e.g. NGC253 and NGC3109), and targeted searches for HI tidal tails in galaxy groups (e.g. IC1459). A brief status update for MeerKAT will also be presented if time permits.

  8. Verification of NASA Emergent Systems

    NASA Technical Reports Server (NTRS)

    Rouff, Christopher; Vanderbilt, Amy K. C. S.; Truszkowski, Walt; Rash, James; Hinchey, Mike

    2004-01-01

    NASA is studying advanced technologies for a future robotic exploration mission to the asteroid belt. This mission, the prospective ANTS (Autonomous Nano Technology Swarm) mission, will comprise 1,000 autonomous robotic agents designed to cooperate in asteroid exploration. The emergent properties of swarm-type missions make them powerful, but at the same time more difficult to design and to assure that the proper behaviors will emerge. We are currently investigating formal methods and techniques for verification and validation of future swarm-based missions. The advantage of using formal methods is their ability to mathematically assure the behavior of a swarm, emergent or otherwise. The ANTS mission is being used as an example and case study of swarm-based missions with which to experiment and test current formal methods on intelligent swarms. Using the ANTS mission, we have evaluated multiple formal methods to determine their effectiveness in modeling and assuring swarm behavior.

  9. Verification of FANTASTIC integrated code

    NASA Technical Reports Server (NTRS)

    Chauhan, Rajinder Singh

    1987-01-01

    FANTASTIC is an acronym for Failure Analysis Nonlinear Thermal and Structural Integrated Code. This program was developed by Failure Analysis Associates, Palo Alto, Calif., for MSFC to improve the accuracy of solid rocket motor nozzle analysis. FANTASTIC has three modules: FACT - thermochemical analysis; FAHT - heat transfer analysis; and FAST - structural analysis. All modules have keywords for data input. Work is in progress on the verification of the FAHT module, which is done by using data for various problems with known solutions as inputs to the FAHT module. The information obtained is used to identify problem areas of the code and is passed on to the developer for debugging purposes. Failure Analysis Associates has revised the first version of the FANTASTIC code, and a new improved version has been released to the Thermal Systems Branch.

  10. Retail applications of signature verification

    NASA Astrophysics Data System (ADS)

    Zimmerman, Thomas G.; Russell, Gregory F.; Heilper, Andre; Smith, Barton A.; Hu, Jianying; Markman, Dmitry; Graham, Jon E.; Drews, Clemens

    2004-08-01

    The dramatic rise in identity theft, the ever-pressing need to provide convenience in checkout services to attract and retain loyal customers, and the growing use of multi-function signature capture devices in the retail sector provide favorable conditions for the deployment of dynamic signature verification (DSV) in retail settings. We report on the development of a DSV system to meet the needs of the retail sector. We currently have a database of approximately 10,000 signatures collected from 600 subjects and forgers. Previous work at IBM on DSV has been merged and extended to achieve robust performance on pen position data available from commercial point-of-sale hardware, achieving equal error rates on skilled forgeries and authentic signatures of 1.5% to 4%.
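An equal error rate like the 1.5% to 4% figures quoted is the operating point where the false-reject rate on authentic signatures equals the false-accept rate on forgeries. A minimal sketch of estimating it from two score sets follows (the scoring model is assumed, not IBM's):

```python
# Estimate the equal error rate (EER) from genuine and forgery match
# scores, where a higher score means "more likely authentic". Sweep the
# decision threshold and report the point where FRR and FAR meet.

def equal_error_rate(genuine, forgery):
    thresholds = sorted(set(genuine) | set(forgery))
    best = None
    for t in thresholds:
        frr = sum(g < t for g in genuine) / len(genuine)   # false reject
        far = sum(f >= t for f in forgery) / len(forgery)  # false accept
        gap = abs(frr - far)
        if best is None or gap < best[0]:
            best = (gap, (frr + far) / 2)
    return best[1]
```

With perfectly separated score distributions the EER is zero; overlapping distributions push it up, which is what skilled forgeries do in practice.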

  11. Requirements, Verification, and Compliance (RVC) Database Tool

    NASA Technical Reports Server (NTRS)

    Rainwater, Neil E., II; McDuffee, Patrick B.; Thomas, L. Dale

    2001-01-01

    This paper describes the development, design, and implementation of the Requirements, Verification, and Compliance (RVC) database used on the International Space Welding Experiment (ISWE) project managed at Marshall Space Flight Center. The RVC is a systems engineer's tool for automating and managing the following information: requirements; requirements traceability; verification requirements; verification planning; verification success criteria; and compliance status. This information normally contained within documents (e.g. specifications, plans) is contained in an electronic database that allows the project team members to access, query, and status the requirements, verification, and compliance information from their individual desktop computers. Using commercial-off-the-shelf (COTS) database software that contains networking capabilities, the RVC was developed not only with cost savings in mind but primarily for the purpose of providing a more efficient and effective automated method of maintaining and distributing the systems engineering information. In addition, the RVC approach provides the systems engineer the capability to develop and tailor various reports containing the requirements, verification, and compliance information that meets the needs of the project team members. The automated approach of the RVC for capturing and distributing the information improves the productivity of the systems engineer by allowing that person to concentrate more on the job of developing good requirements and verification programs and not on the effort of being a "document developer".
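A minimal sketch of the kind of schema and compliance query such a database tool supports follows; the table and column names are invented for illustration and are not from the ISWE project:

```python
# Hypothetical requirements/verification/compliance schema in SQLite,
# with a status query of the sort a systems engineer would run.

import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE requirement (id TEXT PRIMARY KEY, text TEXT);
CREATE TABLE verification (
    req_id TEXT REFERENCES requirement(id),
    method TEXT,   -- e.g. test, analysis, inspection, demonstration
    status TEXT    -- e.g. open, passed, failed
);
""")
conn.executemany("INSERT INTO requirement VALUES (?, ?)",
                 [("R1", "Weld current within tolerance"),
                  ("R2", "Enclosure vents within 5 s")])
conn.executemany("INSERT INTO verification VALUES (?, ?, ?)",
                 [("R1", "test", "passed"), ("R2", "analysis", "open")])

# compliance report: each requirement with its verification status
report = conn.execute("""
    SELECT r.id, v.method, v.status
    FROM requirement r LEFT JOIN verification v ON v.req_id = r.id
    ORDER BY r.id
""").fetchall()
```

Traceability to parent requirements would add a self-referencing column; the point is only that a queryable schema replaces status-tracking inside documents.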

  12. Performing Verification and Validation in Reuse-Based Software Engineering

    NASA Technical Reports Server (NTRS)

    Addy, Edward A.

    1999-01-01

    The implementation of reuse-based software engineering not only introduces new activities to the software development process, such as domain analysis and domain modeling, it also impacts other aspects of software engineering. Other areas of software engineering that are affected include Configuration Management, Testing, Quality Control, and Verification and Validation (V&V). Activities in each of these areas must be adapted to address the entire domain or product line rather than a specific application system. This paper discusses changes and enhancements to the V&V process, in order to adapt V&V to reuse-based software engineering.

  14. Inclusion-type radiochromic gel dosimeter for three-dimensional dose verification

    NASA Astrophysics Data System (ADS)

    Usui, Shuji; Yoshioka, Munenori; Hayashi, Shin-ichiro; Tominaga, Takahiro

    2015-01-01

    For the verification of 3D dose distributions in modern radiation therapy, a new inclusion-type radiochromic gel detector has been developed. In this gel, a hydrophobic leuco dye (leucomalachite green: LMG) was dissolved in water as an inclusion complex with highly branched cyclic dextrin. The radiation-induced radical oxidation properties of the LMG gel with various sensitizers were investigated. As a result, the optical dose responses were enhanced by the addition of bromoacetic acid and manganese (II) chloride. Unfavorable auto-oxidation of the gel was reduced when it was stored at 4°C.

  15. Speaker verification using combined acoustic and EM sensor signal processing

    SciTech Connect

    Ng, L C; Gable, T J; Holzrichter, J F

    2000-11-10

    Low-power EM radar-like sensors have made it possible to measure properties of the human speech production system in real time, without acoustic interference. This greatly enhances the quality and quantity of information for many speech-related applications. See Holzrichter, Burnett, Ng, and Lea, J. Acoust. Soc. Am. 103(1), 622 (1998). By combining the glottal EM sensor (GEMS) signals with the acoustic signals, we have demonstrated an almost 10-fold reduction in error rates in a speaker verification system experiment under a moderately noisy environment (-10 dB).
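Score-level fusion of the two channels can be illustrated with a deliberately simple sketch; the linear weighting and fixed threshold are assumptions for illustration, not the combination method used in the experiment:

```python
# Illustrative score-level fusion of an acoustic match score and a
# GEMS-derived match score, both assumed normalized to [0, 1].

def fuse(acoustic_score, gems_score, w=0.5):
    """Weighted linear fusion of two normalized match scores."""
    return w * acoustic_score + (1 - w) * gems_score

def accept(acoustic_score, gems_score, threshold=0.5, w=0.5):
    """Verification decision on the fused score."""
    return fuse(acoustic_score, gems_score, w) >= threshold
```

The intuition behind the reported gain is that the GEMS channel is immune to acoustic noise, so its score stays informative exactly when the acoustic score degrades.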

  16. 37 CFR 382.6 - Verification of royalty payments.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... payments. (a) General. This section prescribes general rules pertaining to the verification of the payment... requesting the verification procedure shall retain the report of the verification for a period of three years... independent auditor, shall serve as an acceptable verification procedure for all interested parties. (f) Costs...

  17. Formal specification and verification of Ada software

    NASA Technical Reports Server (NTRS)

    Hird, Geoffrey R.

    1991-01-01

    The use of formal methods in software development achieves levels of quality assurance unobtainable by other means. The Larch approach to specification is described, and the specification of avionics software designed to implement the logic of a flight control system is given as an example. Penelope, an Ada verification environment, is described. The Penelope user inputs mathematical definitions, Larch-style specifications, and Ada code and performs machine-assisted proofs that the code obeys its specifications. As an example, the verification of a binary search function is considered. Emphasis is given to techniques assisting the reuse of a verification effort on modified code.

  18. Liquefied Natural Gas (LNG) dispenser verification device

    NASA Astrophysics Data System (ADS)

    Xiong, Maotao; Yang, Jie-bin; Zhao, Pu-jun; Yu, Bo; Deng, Wan-quan

    2013-01-01

    The working principle and calibration status of LNG (Liquefied Natural Gas) dispensers in China are introduced. Because of the shortcomings of the weighing method for calibrating LNG dispensers, an LNG dispenser verification device has been developed. The device uses the master meter method to verify LNG dispensers in the field. Experimental results indicate that the device has stable performance, a high accuracy level, and a flexible construction, reaching an internationally advanced level. The device should promote the development of the LNG dispenser industry in China and improve the technical level of LNG dispenser manufacturing.
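The master meter method compares the dispenser's indicated quantity against a calibrated reference meter run in series. A minimal sketch of the error computation follows; the maximum permissible error value is an assumption for illustration, not a figure from this work:

```python
# Master meter method, reduced to its core arithmetic: the relative
# indication error of the dispenser versus the reference meter, checked
# against a maximum permissible error (MPE).

def indication_error(indicated, reference):
    """Relative error of the dispenser indication, as a fraction."""
    return (indicated - reference) / reference

def within_mpe(indicated, reference, mpe=0.01):
    """True if the dispenser passes verification at the given MPE."""
    return abs(indication_error(indicated, reference)) <= mpe
```

Unlike the weighing method, this comparison can be made in the field with LNG flowing at normal dispensing conditions, which is the advantage the abstract claims for the device.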

  19. TEST DESIGN FOR ENVIRONMENTAL TECHNOLOGY VERIFICATION (ETV) OF ADD-ON NOX CONTROL UTILIZING OZONE INJECTION

    EPA Science Inventory

    The paper discusses the test design for environmental technology verification (ETV) of add-on nitrogen oxides (NOx) control utilizing ozone injection. (NOTE: ETV is an EPA-established program to enhance domestic and international market acceptance of new or improved commercially...

  1. Design for Verification: Enabling Verification of High Dependability Software-Intensive Systems

    NASA Technical Reports Server (NTRS)

    Mehlitz, Peter C.; Penix, John; Markosian, Lawrence Z.; Koga, Dennis (Technical Monitor)

    2003-01-01

    Strategies to achieve confidence that high-dependability applications are correctly implemented include testing and automated verification. Testing deals mainly with a limited number of expected execution paths. Verification usually attempts to deal with a larger number of possible execution paths. While the impact of architecture design on testing is well known, its impact on most verification methods is not as well understood. The Design for Verification approach considers verification from the application development perspective, in which system architecture is designed explicitly according to the application's key properties. The D4V-hypothesis is that the same general architecture and design principles that lead to good modularity, extensibility and complexity/functionality ratio can be adapted to overcome some of the constraints on verification tools, such as the production of hand-crafted models and the limits on dynamic and static analysis caused by state space explosion.

  2. Space Station automated systems testing/verification and the Galileo Orbiter fault protection design/verification

    NASA Technical Reports Server (NTRS)

    Landano, M. R.; Easter, R. W.

    1984-01-01

    Aspects of Space Station automated systems testing and verification are discussed, taking into account several program requirements. It is found that these requirements lead to a number of issues and uncertainties which require study and resolution during the Space Station definition phase. Most, if not all, of the considered uncertainties have implications for the overall testing and verification strategy adopted by the Space Station Program. A description is given of the Galileo Orbiter fault protection design/verification approach. Attention is given to a mission description, an Orbiter description, the design approach and process, the fault protection design verification approach/process, and problems of 'stress' testing.

  3. The PASCAL-HDM Verification System

    NASA Technical Reports Server (NTRS)

    1983-01-01

    The PASCAL-HDM verification system is described. This system supports the mechanical generation of verification conditions from PASCAL programs and HDM-SPECIAL specifications using the Floyd-Hoare axiomatic method. Tools are provided to parse programs and specifications, check their static semantics, generate verification conditions from Hoare rules, and translate the verification conditions appropriately for proof using the Shostak Theorem Prover. The differences between standard PASCAL and the language handled by this system are explained. These consist mostly of restrictions to the standard language definition; the only extensions or modifications are the addition of specifications to the code and the requirement that references to a function of no arguments be written with empty parentheses.
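
    The core of a Floyd-Hoare verification-condition generator of the kind described can be sketched as follows. This is an illustrative toy (textual substitution over straight-line assignments), not the PASCAL-HDM implementation:

```python
import re

def wp_assign(post: str, var: str, expr: str) -> str:
    # Hoare assignment axiom: wp(x := e, Q) = Q[e/x]
    # (substitute the expression for the variable in the postcondition)
    return re.sub(rf"\b{re.escape(var)}\b", f"({expr})", post)

def vc_for_program(pre: str, assignments, post: str) -> str:
    # Propagate the postcondition backwards through the assignments;
    # the verification condition handed to the prover is: pre ==> wp
    cond = post
    for var, expr in reversed(assignments):
        cond = wp_assign(cond, var, expr)
    return f"({pre}) ==> ({cond})"

# {x >= 0}  y := x + 1;  z := y * 2  {z >= 2}
vc = vc_for_program("x >= 0", [("y", "x + 1"), ("z", "y * 2")], "z >= 2")
print(vc)  # (x >= 0) ==> (((x + 1) * 2) >= 2)
```

    A real generator works over an abstract syntax tree and handles loops via invariants; the textual form above only conveys the backward-substitution idea.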

  4. HDM/PASCAL Verification System User's Manual

    NASA Technical Reports Server (NTRS)

    Hare, D.

    1983-01-01

    The HDM/Pascal verification system is a tool for proving the correctness of programs written in PASCAL and specified in the Hierarchical Development Methodology (HDM). This document assumes an understanding of PASCAL, HDM, program verification, and the STP system. The steps toward verification which this tool provides are parsing programs and specifications, checking the static semantics, and generating verification conditions. Some support functions are provided such as maintaining a data base, status management, and editing. The system runs under the TOPS-20 and TENEX operating systems and is written in INTERLISP. However, no knowledge is assumed of these operating systems or of INTERLISP. The system requires three executable files, HDMVCG, PARSE, and STP. Optionally, the editor EMACS should be on the system in order for the editor to work. The file HDMVCG is invoked to run the system. The files PARSE and STP are used as lower forks to perform the functions of parsing and proving.

  5. ENVIRONMENTAL TECHNOLOGY VERIFICATION (ETV) PROGRAM: STORMWATER TECHNOLOGIES

    EPA Science Inventory

    The U.S. Environmental Protection Agency (EPA) Environmental Technology Verification (ETV) program evaluates the performance of innovative air, water, pollution prevention and monitoring technologies that have the potential to improve human health and the environment. This techn...

  6. ENVIRONMENTAL TECHNOLOGY VERIFICATION (ETV) PROGRAM: FUEL CELLS

    EPA Science Inventory

    The U.S. Environmental Protection Agency (EPA) Environmental Technology Verification (ETV) Program evaluates the performance of innovative air, water, pollution prevention and monitoring technologies that have the potential to improve human health and the environment. This techno...

  7. The PASCAL-HDM Verification System

    NASA Technical Reports Server (NTRS)

    1983-01-01

    The PASCAL-HDM verification system is described. This system supports the mechanical generation of verification conditions from PASCAL programs and HDM-SPECIAL specifications using the Floyd-Hoare axiomatic method. Tools are provided to parse programs and specifications, check their static semantics, generate verification conditions from Hoare rules, and translate the verification conditions appropriately for proof using the Shostak Theorem Prover. The differences between standard PASCAL and the language handled by this system are explained. These consist mostly of restrictions to the standard language definition; the only extensions or modifications are the addition of specifications to the code and the requirement that references to a function of no arguments be written with empty parentheses.

  8. ENVIRONMENTAL TECHNOLOGY VERIFICATION (ETV) PROGRAM: FUEL CELLS

    EPA Science Inventory

    The U.S. Environmental Protection Agency (EPA) Environmental Technology Verification (ETV) Program evaluates the performance of innovative air, water, pollution prevention and monitoring technologies that have the potential to improve human health and the environment. This techno...

  9. ENVIRONMENTAL TECHNOLOGY VERIFICATION (ETV) PROGRAM: STORMWATER TECHNOLOGIES

    EPA Science Inventory

    The U.S. Environmental Protection Agency (EPA) Environmental Technology Verification (ETV) program evaluates the performance of innovative air, water, pollution prevention and monitoring technologies that have the potential to improve human health and the environment. This techn...

  10. Environmental Technology Verification Program (ETV) Policy Compendium

    EPA Science Inventory

    The Policy Compendium summarizes operational decisions made to date by participants in the U.S. Environmental Protection Agency's (EPA's) Environmental Technology Verification Program (ETV) to encourage consistency among the ETV centers. The policies contained herein evolved fro...

  11. U.S. Environmental Technology Verification Program

    EPA Science Inventory

    Overview of the U.S. Environmental Technology Verification Program (ETV), the ETV Greenhouse Gas Technology Center, and energy-related ETV projects. Presented at the Department of Energy's National Renewable Laboratory in Boulder, Colorado on June 23, 2008.

  12. HDM/PASCAL Verification System User's Manual

    NASA Technical Reports Server (NTRS)

    Hare, D.

    1983-01-01

    The HDM/Pascal verification system is a tool for proving the correctness of programs written in PASCAL and specified in the Hierarchical Development Methodology (HDM). This document assumes an understanding of PASCAL, HDM, program verification, and the STP system. The steps toward verification which this tool provides are parsing programs and specifications, checking the static semantics, and generating verification conditions. Some support functions are provided such as maintaining a data base, status management, and editing. The system runs under the TOPS-20 and TENEX operating systems and is written in INTERLISP. However, no knowledge is assumed of these operating systems or of INTERLISP. The system requires three executable files, HDMVCG, PARSE, and STP. Optionally, the editor EMACS should be on the system in order for the editor to work. The file HDMVCG is invoked to run the system. The files PARSE and STP are used as lower forks to perform the functions of parsing and proving.

  13. Engineering drawing field verification program. Revision 3

    SciTech Connect

    Ulk, P.F.

    1994-10-12

    Safe, efficient operation of waste tank farm facilities depends in part upon the availability of accurate, up-to-date plant drawings. Accurate plant drawings are also required in support of facility upgrades and future engineering remediation projects. This supporting document establishes the procedure for performing a visual field verification of engineering drawings, the degree of visual observation to be performed, and the documentation of the results. A copy of the drawing attesting to the degree of visual observation will be paginated into the released Engineering Change Notice (ECN) documenting the field verification, for future retrieval and reference. All waste tank farm essential and support drawings within the scope of this program will be converted from manual to computer-aided drafting (CAD) drawings. A permanent reference to the field verification status will be placed along the right border of the CAD-converted drawing, referencing the revision level at which the visual verification was performed and documented.

  14. Environmental Technology Verification Program (ETV) Policy Compendium

    EPA Science Inventory

    The Policy Compendium summarizes operational decisions made to date by participants in the U.S. Environmental Protection Agency's (EPA's) Environmental Technology Verification Program (ETV) to encourage consistency among the ETV centers. The policies contained herein evolved fro...

  15. 9 CFR 417.8 - Agency verification.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... ANALYSIS AND CRITICAL CONTROL POINT (HACCP) SYSTEMS § 417.8 Agency verification. FSIS will verify the... deviation occurs; (d) Reviewing the critical limits; (e) Reviewing other records pertaining to the...

  16. MAMA Software Features: Quantification Verification Documentation-1

    SciTech Connect

    Ruggiero, Christy E.; Porter, Reid B.

    2014-05-21

    This document reviews the verification of the basic shape quantification attributes in the MAMA software against hand calculations in order to show that the calculations are implemented mathematically correctly and give the expected quantification results.

  17. Assessment of the utility of on-site inspection for INF treaty verification. Sanitized. Technical report

    SciTech Connect

    Baker, J.C.; Hart, D.M.; Doherty, R.T.

    1983-11-10

    This report analyzes the utility of on-site inspection (OSI) for enhancing Intermediate-Range Nuclear Force (INF) treaty verification of Soviet compliance with US-proposed collateral limits on short-range ballistic missiles (SRBMs). It outlines a detailed verification regime that relies on manned OSI teams to help verify limitations on Soviet SRBM deployments. It also assesses the OSI regime's potential impact on US Pershing deployments. Finally, the report reviews the history of American policy concerning on-site inspection and evaluates the overall utility of OSI in support of National Technical Means.

  18. Dynamic testing for shuttle design verification

    NASA Technical Reports Server (NTRS)

    Green, C. E.; Leadbetter, S. A.; Rheinfurth, M. H.

    1972-01-01

    Space shuttle design verification requires dynamic data from full scale structural component and assembly tests. Wind tunnel and other scaled model tests are also required early in the development program to support the analytical models used in design verification. Presented is a design philosophy based on mathematical modeling of the structural system strongly supported by a comprehensive test program; some of the types of required tests are outlined.

  19. Modular Typestate Verification of Aliased Objects

    DTIC Science & Technology

    2007-03-01

    objects that change their state from one to another. Fugue [11] is the only existing typestate-based object-oriented protocol verification system that we...usage and implementation of such expressive typestate protocols. Our verification approach is highly inspired by Fugue [11]. We extend state invariants...packing, and frames to work in our context. We improve support for inheritance in comparison to Fugue by decoupling states of frames and reducing

  20. Certifiable Specification and Verification of C Programs

    NASA Astrophysics Data System (ADS)

    Lüth, Christoph; Walter, Dennis

    A novel approach to the specification and verification of C programs through an annotation language that is a mixture between JML and the language of Isabelle/HOL is proposed. This yields three benefits: specifications are concise and close to the underlying mathematical model; existing Isabelle theories can be reused; and the leap of faith from specification language to encoding in a logic is small. This is of particular relevance for software certification, and verification in application areas such as robotics.

  1. Identity verification in quantum key distribution

    NASA Astrophysics Data System (ADS)

    Zeng, Guihua; Zhang, Weiping

    2000-02-01

    The security of previous quantum key distribution protocols, though guaranteed by the laws of quantum physics, rests on the assumption that the communicating users are legitimate. In practice, however, impersonation of the legitimate communicators by eavesdroppers is inevitable. In this paper, we propose a quantum key verification scheme that can simultaneously distribute the quantum secret key and verify the communicators' identity. Investigation shows that the proposed identity verification scheme is secure.

  2. Handbook: Design of automated redundancy verification

    NASA Technical Reports Server (NTRS)

    Ford, F. A.; Hasslinger, T. W.; Moreno, F. J.

    1971-01-01

    The use of the handbook is discussed and the design progress is reviewed. A description of the problem is presented, and examples are given to illustrate the necessity for redundancy verification, along with the types of situations to which it is typically applied. Reusable space vehicles, such as the space shuttle, are recognized as being significant in the development of the automated redundancy verification problem.

  3. The NPARC Alliance Verification and Validation Archive

    NASA Technical Reports Server (NTRS)

    Slater, John W.; Dudek, Julianne C.; Tatum, Kenneth E.

    2000-01-01

    The NPARC Alliance (National Project for Applications oriented Research in CFD) maintains a publicly-available, web-based verification and validation archive as part of the development and support of the WIND CFD code. The verification and validation methods used for the cases attempt to follow the policies and guidelines of the ASME and AIAA. The emphasis is on air-breathing propulsion flow fields with Mach numbers ranging from low-subsonic to hypersonic.

  4. A verification library for multibody simulation software

    NASA Technical Reports Server (NTRS)

    Kim, Sung-Soo; Haug, Edward J.; Frisch, Harold P.

    1989-01-01

    A multibody dynamics verification library that maintains and manages test and validation data is proposed, based on RRC robot arm and CASE backhoe validation and a comparative study of DADS, DISCOS, and CONTOPS, which are existing public-domain and commercial multibody dynamics simulation programs. Using simple representative problems, simulation results from each program are cross-checked, and the validation results are presented. Functionalities of the verification library are defined in order to automate the validation procedure.
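
    The pairwise cross-checking of simulation results described above amounts to a tolerance-based comparison of time histories from two programs. The sketch below is a hypothetical harness illustrating that comparison, not the library's actual interface:

```python
def cross_check(ref, other, rtol=1e-3, atol=1e-6):
    """Compare two solver time histories sample by sample.

    Returns (agree, worst_excess): worst_excess is the largest amount
    by which the pointwise difference exceeded the mixed tolerance
    atol + rtol * |ref|; agreement means it never did.
    """
    if len(ref) != len(other):
        raise ValueError("time histories must share a common time grid")
    worst = float("-inf")
    for r, o in zip(ref, other):
        excess = abs(r - o) - (atol + rtol * abs(r))
        worst = max(worst, excess)
    return worst <= 0.0, worst

# Example: two programs integrating the same pendulum angle (radians)
ok, _ = cross_check([0.0, 0.0998, 0.1986], [0.0, 0.0999, 0.1985])
```

    In practice the interesting part is choosing tolerances per quantity (positions vs. accelerations) and a common output grid, which is exactly the bookkeeping such a library automates.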

  5. Transmutation Fuel Performance Code Thermal Model Verification

    SciTech Connect

    Gregory K. Miller; Pavel G. Medvedev

    2007-09-01

    FRAPCON fuel performance code is being modified to be able to model performance of the nuclear fuels of interest to the Global Nuclear Energy Partnership (GNEP). The present report documents the effort for verification of the FRAPCON thermal model. It was found that, with minor modifications, FRAPCON thermal model temperature calculation agrees with that of the commercial software ABAQUS (Version 6.4-4). This report outlines the methodology of the verification, code input, and calculation results.
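
    Code-to-code thermal verification of this kind is often supplemented by comparison against a closed-form solution. The sketch below (illustrative numbers; independent of FRAPCON's or ABAQUS's actual inputs) evaluates the textbook steady-state centerline temperature for a solid cylinder with uniform volumetric heating, and a figure of merit for comparing two codes point by point:

```python
def fuel_centerline_temp(q_vol, radius, k, t_surface):
    # Analytic steady-state solution for a solid cylinder with uniform
    # volumetric heating q''' [W/m^3]: T(0) = T_s + q''' R^2 / (4 k)
    return t_surface + q_vol * radius ** 2 / (4.0 * k)

def max_rel_diff(computed, reference):
    # Maximum relative difference between two temperature profiles
    return max(abs(c - r) / abs(r) for c, r in zip(computed, reference))

# Hypothetical case: 5 mm pellet radius, k = 3 W/m-K, 700 K surface
analytic = fuel_centerline_temp(3.0e8, 0.005, 3.0, 700.0)  # 1325.0 K
```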

  6. National Verification System of National Meteorological Center , China

    NASA Astrophysics Data System (ADS)

    Zhang, Jinyan; Wei, Qing; Qi, Dan

    2016-04-01

    The Product Quality Verification Division for official weather forecasting of China was founded in April 2011. It is affiliated with the Forecast System Laboratory (FSL), National Meteorological Center (NMC), China, and staffed by three employees; the author is in charge of the division. After five years of construction, an integrated real-time National Verification System of NMC, China has been established. At present, its primary roles are: 1) to verify the official weather forecast quality of NMC, China; 2) to verify the official city weather forecast quality of the Provincial Meteorological Bureaus; 3) to evaluate forecast quality for each forecaster in NMC, China. To verify the official weather forecast quality of NMC, China, we have developed:
    • Grid QPF verification module (including upscaling)
    • Grid temperature, humidity and wind forecast verification module
    • Severe convective weather forecast verification module
    • Typhoon forecast verification module
    • Disaster forecast verification module
    • Disaster warning verification module
    • Medium and extended period forecast verification module
    • Objective elements forecast verification module
    • Ensemble precipitation probabilistic forecast verification module
    To verify the official city weather forecast quality of the Provincial Meteorological Bureaus, we have developed:
    • City elements forecast verification module
    • Public heavy rain forecast verification module
    • City air quality forecast verification module
    To evaluate forecast quality for each forecaster in NMC, China, we have developed:
    • Off-duty forecaster QPF practice evaluation module
    • QPF evaluation module for forecasters
    • Severe convective weather forecast evaluation module
    • Typhoon track forecast evaluation module for forecasters
    • Disaster warning evaluation module for forecasters
    • Medium and extended period forecast evaluation module
    The further
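
    As an illustration of the kind of score a QPF verification module computes, the sketch below builds a 2x2 contingency table against a precipitation threshold and derives the threat score (CSI) and frequency bias. These are the standard textbook definitions, not the system's actual code:

```python
def contingency(obs, fcst, threshold):
    # Count hits, misses, false alarms and correct negatives for an
    # event defined as "precipitation >= threshold".
    hits = misses = false_alarms = correct_neg = 0
    for o, f in zip(obs, fcst):
        o_event, f_event = o >= threshold, f >= threshold
        if o_event and f_event:
            hits += 1
        elif o_event:
            misses += 1
        elif f_event:
            false_alarms += 1
        else:
            correct_neg += 1
    return hits, misses, false_alarms, correct_neg

def threat_score(h, m, fa):
    # CSI = hits / (hits + misses + false alarms)
    return h / (h + m + fa) if (h + m + fa) else float("nan")

def frequency_bias(h, m, fa):
    # Bias = (hits + false alarms) / (hits + misses)
    return (h + fa) / (h + m) if (h + m) else float("nan")
```

    Upscale verification as mentioned above would aggregate gridded forecasts and observations to a coarser grid before forming the same table.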

  7. Infrasound workshop for CTBT monitoring: Proceedings

    SciTech Connect

    Christie, D.; Whitaker, R.

    1998-11-01

    It is expected that the establishment of new infrasound stations in the global IMS network by the Provisional Technical Secretariat of the CTBTO in Vienna will commence in the middle of 1998. Thus, decisions on the final operational design for IMS infrasound stations will have to be made within the next 12 months. Though many of the basic design problems have been resolved, it is clear that further work needs to be carried out during the coming year to ensure that IMS infrasound stations will operate with maximum capability in accord with the specifications determined during the May 1997 PrepCom Meeting. Some of the papers presented at the Workshop suggest that it may be difficult to design a four-element infrasound array station that will reliably detect and locate infrasound signals at all frequencies in the specified range from 0.02 to 4.0 Hz in all noise environments. Hence, if the basic design of an infrasound array is restricted to four array elements, the final optimized design may be suited only to the detection and location of signals in a more limited pass-band. Several participants have also noted that the reliable discrimination of infrasound signals could be quite difficult if the detection system leads to signal distortion. Thus, it has been emphasized that the detection system should not, if possible, compromise signal fidelity. This report contains the workshop agenda, a list of participants, and abstracts and viewgraphs from each presentation.

  8. Development of advanced seal verification

    NASA Technical Reports Server (NTRS)

    Workman, Gary L.; Kosten, Susan E.; Abushagur, Mustafa A.

    1992-01-01

    The purpose of this research is to develop a technique to monitor and ensure seal integrity with a sensor that has no active elements to burn out during a long-duration activity, such as a leakage test or, especially, a mission in space. The original concept proposed is that, by implementing fiber optic sensors, changes in the integrity of a seal can be monitored in real time, and at no time should the optical fiber sensor fail. The electrical components which provide optical excitation and detection through the fiber are not part of the seal; hence, if these electrical components fail, they can be easily changed without breaking the seal. The optical connections required for the concept to work do present a functional problem to work out. The utility of the optical fiber sensor for seal monitoring should be general enough that the degradation of a seal can be determined before catastrophic failure occurs and appropriate action taken. Two parallel efforts were performed to determine the feasibility of using optical fiber sensors for seal verification. In one study, interferometric measurement of the mechanical response of the optical fiber sensors to seal integrity was studied. In a second study, the optical fiber was implemented in a typical vacuum chamber and feasibility studies on microbend experiments in the vacuum chamber were performed. Also, an attempt was made to quantify the amount of pressure actually being applied to the optical fiber using the Algor finite element analysis software.

  9. Verification of excess defense material

    SciTech Connect

    Fearey, B.L.; Pilat, J.F.; Eccleston, G.W.; Nicholas, N.J.; Tape, J.W.

    1997-12-01

    The international community in the post-Cold War period has expressed an interest in the International Atomic Energy Agency (IAEA) using its expertise in support of the arms control and disarmament process in unprecedented ways. The pledges of the US and Russian presidents to place excess defense materials under some type of international inspections raises the prospect of using IAEA safeguards approaches for monitoring excess materials, which include both classified and unclassified materials. Although the IAEA has suggested the need to address inspections of both types of materials, the most troublesome and potentially difficult problems involve approaches to the inspection of classified materials. The key issue for placing classified nuclear components and materials under IAEA safeguards is the conflict between these traditional IAEA materials accounting procedures and the US classification laws and nonproliferation policy designed to prevent the disclosure of critical weapon-design information. Possible verification approaches to classified excess defense materials could be based on item accountancy, attributes measurements, and containment and surveillance. Such approaches are not wholly new; in fact, they are quite well established for certain unclassified materials. Such concepts may be applicable to classified items, but the precise approaches have yet to be identified, fully tested, or evaluated for technical and political feasibility, or for their possible acceptability in an international inspection regime. Substantial work remains in these areas. This paper examines many of the challenges presented by international inspections of classified materials.

  10. INNOVATIVE TECHNOLOGY VERIFICATION REPORT XRF ...

    EPA Pesticide Factsheets

    The Innov-X XT400 Series (XT400) x-ray fluorescence (XRF) analyzer was demonstrated under the U.S. Environmental Protection Agency (EPA) Superfund Innovative Technology Evaluation (SITE) Program. The field portion of the demonstration was conducted in January 2005 at the Kennedy Athletic, Recreational and Social Park (KARS) at Kennedy Space Center on Merritt Island, Florida. The demonstration was designed to collect reliable performance and cost data for the XT400 analyzer and seven other commercially available XRF instruments for measuring trace elements in soil and sediment. The performance and cost data were evaluated to document the relative performance of each XRF instrument. This innovative technology verification report describes the objectives and the results of that evaluation and serves to verify the performance and cost of the XT400 analyzer. Separate reports have been prepared for the other XRF instruments that were evaluated as part of the demonstration. The objectives of the evaluation included determining each XRF instrument’s accuracy, precision, sample throughput, and tendency for matrix effects. To fulfill these objectives, the field demonstration incorporated the analysis of 326 prepared samples of soil and sediment that contained 13 target elements. The prepared samples included blends of environmental samples from nine different sample collection sites as well as spiked samples with certified element concentrations. Accuracy was as
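
    The accuracy and precision objectives described in these SITE evaluations are conventionally quantified as percent recovery against a certified concentration and percent relative standard deviation over replicate measurements. A minimal sketch of those two figures of merit (not the program's evaluation code):

```python
from statistics import mean, stdev

def percent_recovery(measured, certified):
    # Accuracy: measured concentration relative to the certified value
    return 100.0 * measured / certified

def relative_std_dev(replicates):
    # Precision: %RSD over replicate measurements of one sample
    m = mean(replicates)
    return 100.0 * stdev(replicates) / m

# Example: a spiked soil sample certified at 100 mg/kg lead
recovery = percent_recovery(95.0, 100.0)        # 95.0 %
rsd = relative_std_dev([95.0, 97.0, 93.0])      # spread of replicates
```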

  11. ALMA Band 5 Science Verification

    NASA Astrophysics Data System (ADS)

    Humphreys, L.; Biggs, A.; Immer, K.; Laing, R.; Liu, H. B.; Marconi, G.; Mroczkowski, T.; Testi, L.; Yagoubov, P.

    2017-03-01

    ALMA Band 5 (163–211 GHz) was recently commissioned and Science Verification (SV) observations were obtained in the latter half of 2016. A primary scientific focus of this band is the H2O line at 183.3 GHz, which can be observed around 15% of the time when the precipitable water vapour is sufficiently low (< 0.5 mm). Many more lines are covered in Band 5 and can be observed for over 70% of the time on Chajnantor, requiring similar restrictions to those for ALMA Bands 4 and 6. Examples include the H2^18O line at 203 GHz, some of the bright (3–2) lines of singly and doubly deuterated forms of formaldehyde, the (2–1) lines of HCO+, HCN, HNC, N2H+ and several of their isotopologues. A young star-forming region near the centre of the Milky Way, an evolved star also in our Galaxy, and a nearby ultraluminous infrared galaxy (ULIRG) were observed as part of the SV process and the data are briefly described. The reduced data, along with imaged data products, are now public and demonstrate the power of ALMA for high-resolution studies of H2O and other molecules in a variety of astronomical targets.

  12. Biometric verification in dynamic writing

    NASA Astrophysics Data System (ADS)

    George, Susan E.

    2002-03-01

    Pen-tablet devices capable of capturing the dynamics of writing record temporal and pressure information as well as the spatial pattern. This paper explores biometric verification based upon the dynamics of writing where writers are distinguished not on the basis of what they write (ie the signature), but how they write. We have collected samples of dynamic writing from 38 Chinese writers. Each writer was asked to provide 10 copies of a paragraph of text and the same number of signature samples. From the data we have extracted stroke-based primitives from the sentence data utilizing pen-up/down information and heuristic rules about the shape of the character. The x, y and pressure values of each primitive were interpolated into an even temporal range based upon a 20 msec sampling rate. We applied the Daubechies 1 wavelet transform to the x signal, y signal and pressure signal using the coefficients as inputs to a multi-layer perceptron trained with back-propagation on the sentence data. We found a sensitivity of 0.977 and specificity of 0.990 recognizing writers based on test primitives extracted from sentence data and measures of 0.916 and 0.961 respectively, from test primitives extracted from signature data.
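
    One level of the Daubechies-1 (Haar) transform applied to the resampled x, y and pressure signals can be sketched as follows. This is an illustrative stdlib-only implementation, not the authors' code:

```python
import math

def haar_step(signal):
    # One level of the Daubechies-1 (Haar) wavelet transform:
    # normalized pairwise sums give the approximation coefficients,
    # normalized pairwise differences give the detail coefficients.
    # An odd trailing sample is dropped for simplicity.
    s = math.sqrt(2.0)
    approx = [(signal[i] + signal[i + 1]) / s
              for i in range(0, len(signal) - 1, 2)]
    detail = [(signal[i] - signal[i + 1]) / s
              for i in range(0, len(signal) - 1, 2)]
    return approx, detail

# e.g. a pressure trace resampled to the 20 ms grid described above
approx, detail = haar_step([0.1, 0.3, 0.6, 0.6, 0.4, 0.2])
```

    In the scheme described, coefficients like these (from each of x, y and pressure) form the input vector to the back-propagation-trained perceptron.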

  13. Learning Assumptions for Compositional Verification

    NASA Technical Reports Server (NTRS)

    Cobleigh, Jamieson M.; Giannakopoulou, Dimitra; Pasareanu, Corina; Clancy, Daniel (Technical Monitor)

    2002-01-01

    Compositional verification is a promising approach to addressing the state explosion problem associated with model checking. One compositional technique advocates proving properties of a system by checking properties of its components in an assume-guarantee style. However, the application of this technique is difficult because it involves non-trivial human input. This paper presents a novel framework for performing assume-guarantee reasoning in an incremental and fully automated fashion. To check a component against a property, our approach generates assumptions that the environment needs to satisfy for the property to hold. These assumptions are then discharged on the rest of the system. Assumptions are computed by a learning algorithm. They are initially approximate, but become gradually more precise by means of counterexamples obtained by model checking the component and its environment, alternately. This iterative process may at any stage conclude that the property is either true or false in the system. We have implemented our approach in the LTSA tool and applied it to the analysis of a NASA system.
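
    The iterative assume-guarantee loop can be sketched schematically, with the learning algorithm, model checker and counterexample analysis abstracted behind callbacks (illustrative names, not the LTSA tool's API):

```python
def assume_guarantee(check_m1_with, m2_satisfies,
                     cex_is_real_violation, refine, assumption):
    # Schematic loop of learning-based assume-guarantee reasoning:
    #   check_m1_with(A)  -> counterexample or None: does <A> M1 <P> hold?
    #   m2_satisfies(A)   -> counterexample or None: does M2 satisfy A?
    #   cex_is_real_violation(cex) -> is the trace a real M1 behavior?
    #   refine(A, cex)    -> learner produces a more precise assumption
    while True:
        cex = check_m1_with(assumption)
        if cex is not None:
            assumption = refine(assumption, cex)   # A too coarse: refine
            continue
        cex = m2_satisfies(assumption)
        if cex is None:
            return ("holds", assumption)           # P holds on M1 || M2
        if cex_is_real_violation(cex):
            return ("violated", cex)               # genuine counterexample
        assumption = refine(assumption, cex)       # spurious: refine A
```

    The point of the framework is that `refine` is driven by an automata-learning algorithm, so the assumptions converge without hand-crafting.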

  14. Cold Flow Verification Test Facility

    SciTech Connect

    Shamsi, A.; Shadle, L.J.

    1996-12-31

    The cold flow verification test facility consists of a 15-foot high, 3-foot diameter, domed vessel made of clear acrylic in two flanged sections. The unit can operate up to pressures of 14 psig. The internals include a 10-foot high jetting fluidized bed, a cylindrical baffle that hangs from the dome, and a rotating grate for control of continuous solids removal. The fluid bed is continuously fed solids (20 to 150 lb/hr) through a central nozzle made up of concentric pipes. It can either be configured as a half or full cylinder of various dimensions. The fluid bed has flow loops for separate air flow control for conveying solids (inner jet, 500 to 100000 scfh), make-up into the jet (outer jet, 500 to 8000 scfh), spargers in the solids removal annulus (100 to 2000 scfh), and 6 air jets (20 to 200 scfh) on the sloping conical grid. Additional air (500 to 10000 scfh) can be added to the top of the dome and under the rotating grate. The outer vessel, the hanging cylindrical baffle or skirt, and the rotating grate can be used to study issues concerning moving bed reactors. There is ample allowance for access and instrumentation in the outer shell. Furthermore, this facility is available for future Cooperative Research and Development Agreements (CRADA) to study issues and problems associated with fluid- and fixed-bed reactors. The design allows testing of different dimensions and geometries.

  15. INNOVATIVE TECHNOLOGY VERIFICATION REPORT XRF ...

    EPA Pesticide Factsheets

    The Rontec PicoTAX x-ray fluorescence (XRF) analyzer was demonstrated under the U.S. Environmental Protection Agency (EPA) Superfund Innovative Technology Evaluation (SITE) Program. The field portion of the demonstration was conducted in January 2005 at the Kennedy Athletic, Recreational and Social Park (KARS) at Kennedy Space Center on Merritt Island, Florida. The demonstration was designed to collect reliable performance and cost data for the PicoTAX analyzer and seven other commercially available XRF instruments for measuring trace elements in soil and sediment. The performance and cost data were evaluated to document the relative performance of each XRF instrument. This innovative technology verification report describes the objectives and the results of that evaluation and serves to verify the performance and cost of the PicoTAX analyzer. Separate reports have been prepared for the other XRF instruments that were evaluated as part of the demonstration. The objectives of the evaluation included determining each XRF instrument’s accuracy, precision, sample throughput, and tendency for matrix effects. To fulfill these objectives, the field demonstration incorporated the analysis of 326 prepared samples of soil and sediment that contained 13 target elements. The prepared samples included blends of environmental samples from nine different sample collection sites as well as spiked samples with certified element concentrations. Accuracy was assessed by c

  16. INNOVATIVE TECHNOLOGY VERIFICATION REPORT XRF ...

    EPA Pesticide Factsheets

    The Elvatech, Ltd. ElvaX (ElvaX) x-ray fluorescence (XRF) analyzer distributed in the United States by Xcalibur XRF Services (Xcalibur), was demonstrated under the U.S. Environmental Protection Agency (EPA) Superfund Innovative Technology Evaluation (SITE) Program. The field portion of the demonstration was conducted in January 2005 at the Kennedy Athletic, Recreational and Social Park (KARS) at Kennedy Space Center on Merritt Island, Florida. The demonstration was designed to collect reliable performance and cost data for the ElvaX analyzer and seven other commercially available XRF instruments for measuring trace elements in soil and sediment. The performance and cost data were evaluated to document the relative performance of each XRF instrument. This innovative technology verification report describes the objectives and the results of that evaluation and serves to verify the performance and cost of the ElvaX analyzer. Separate reports have been prepared for the other XRF instruments that were evaluated as part of the demonstration. The objectives of the evaluation included determining each XRF instrument’s accuracy, precision, sample throughput, and tendency for matrix effects. To fulfill these objectives, the field demonstration incorporated the analysis of 326 prepared samples of soil and sediment that contained 13 target elements. The prepared samples included blends of environmental samples from nine different sample collection sites as well as s

  17. INNOVATIVE TECHNOLOGY VERIFICATION REPORT XRF ...

    EPA Pesticide Factsheets

    The Niton XLt 700 Series (XLt) XRF Services x-ray fluorescence (XRF) analyzer was demonstrated under the U.S. Environmental Protection Agency (EPA) Superfund Innovative Technology Evaluation (SITE) Program. The field portion of the demonstration was conducted in January 2005 at the Kennedy Athletic, Recreational and Social Park (KARS) at Kennedy Space Center on Merritt Island, Florida. The demonstration was designed to collect reliable performance and cost data for the XLt analyzer and seven other commercially available XRF instruments for measuring trace elements in soil and sediment. The performance and cost data were evaluated to document the relative performance of each XRF instrument. This innovative technology verification report describes the objectives and the results of that evaluation and serves to verify the performance and cost of the XLt analyzer. Separate reports have been prepared for the other XRF instruments that were evaluated as part of the demonstration. The objectives of the evaluation included determining each XRF instrument’s accuracy, precision, sample throughput, and tendency for matrix effects. To fulfill these objectives, the field demonstration incorporated the analysis of 326 prepared samples of soil and sediment that contained 13 target elements. The prepared samples included blends of environmental samples from nine different sample collection sites as well as spiked samples with certified element concentrations. Accuracy

  18. INNOVATIVE TECHNOLOGY VERIFICATION REPORT XRF ...

    EPA Pesticide Factsheets

    The Oxford ED2000 x-ray fluorescence (XRF) analyzer was demonstrated under the U.S. Environmental Protection Agency (EPA) Superfund Innovative Technology Evaluation (SITE) Program. The field portion of the demonstration was conducted in January 2005 at the Kennedy Athletic, Recreational and Social Park (KARS) at Kennedy Space Center on Merritt Island, Florida. The demonstration was designed to collect reliable performance and cost data for the ED2000 analyzer and seven other commercially available XRF instruments for measuring trace elements in soil and sediment. The performance and cost data were evaluated to document the relative performance of each XRF instrument. This innovative technology verification report describes the objectives and the results of that evaluation and serves to verify the performance and cost of the ED2000 analyzer. Separate reports have been prepared for the other XRF instruments that were evaluated as part of the demonstration. The objectives of the evaluation included determining each XRF instrument’s accuracy, precision, sample throughput, and tendency for matrix effects. To fulfill these objectives, the field demonstration incorporated the analysis of 326 prepared samples of soil and sediment that contained 13 target elements. The prepared samples included blends of environmental samples from nine different sample collection sites as well as spiked samples with certified element concentrations. Accuracy was assessed by com

  19. INNOVATIVE TECHNOLOGY VERIFICATION REPORT XRF ...

    EPA Pesticide Factsheets

    The Rigaku ZSX Mini II (ZSX Mini II) XRF Services x-ray fluorescence (XRF) analyzer was demonstrated under the U.S. Environmental Protection Agency (EPA) Superfund Innovative Technology Evaluation (SITE) Program. The field portion of the demonstration was conducted in January 2005 at the Kennedy Athletic, Recreational and Social Park (KARS) at Kennedy Space Center on Merritt Island, Florida. The demonstration was designed to collect reliable performance and cost data for the ZSX Mini II analyzer and seven other commercially available XRF instruments for measuring trace elements in soil and sediment. The performance and cost data were evaluated to document the relative performance of each XRF instrument. This innovative technology verification report describes the objectives and the results of that evaluation and serves to verify the performance and cost of the ZSX Mini II analyzer. Separate reports have been prepared for the other XRF instruments that were evaluated as part of the demonstration. The objectives of the evaluation included determining each XRF instrument’s accuracy, precision, sample throughput, and tendency for matrix effects. To fulfill these objectives, the field demonstration incorporated the analysis of 326 prepared samples of soil and sediment that contained 13 target elements. The prepared samples included blends of environmental samples from nine different sample collection sites as well as spiked samples with certified element con

  20. Speaker verification based on the fusion of speech acoustics and inverted articulatory signals☆

    PubMed Central

    Li, Ming; Kim, Jangwon; Lammert, Adam; Ghosh, Prasanta Kumar; Ramanarayanan, Vikram; Narayanan, Shrikanth

    2016-01-01

    We propose a practical, feature-level and score-level fusion approach by combining acoustic and estimated articulatory information for both text independent and text dependent speaker verification. From a practical point of view, we study how to improve speaker verification performance by combining dynamic articulatory information with the conventional acoustic features. On text independent speaker verification, we find that concatenating articulatory features obtained from measured speech production data with conventional Mel-frequency cepstral coefficients (MFCCs) improves the performance dramatically. However, since directly measuring articulatory data is not feasible in many real world applications, we also experiment with estimated articulatory features obtained through acoustic-to-articulatory inversion. We explore both feature level and score level fusion methods and find that the overall system performance is significantly enhanced even with estimated articulatory features. Such a performance boost could be due to the inter-speaker variation information embedded in the estimated articulatory features. Since the dynamics of articulation contain important information, we included inverted articulatory trajectories in text dependent speaker verification. We demonstrate that the articulatory constraints introduced by inverted articulatory features help to reject wrong password trials and improve the performance after score level fusion. We evaluate the proposed methods on the X-ray Microbeam database and the RSR 2015 database, respectively, for the aforementioned two tasks. Experimental results show that we achieve more than 15% relative equal error rate reduction for both speaker verification tasks. PMID:28496292
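
    The score-level fusion and equal error rate (EER) evaluation described above can be sketched generically. This is a toy illustration of the technique, not the authors' code; the fusion weight `alpha` and the score values are invented for demonstration.

```python
# Score-level fusion of acoustic and articulatory verification scores,
# followed by a simple equal error rate (EER) computation.

def fuse_scores(acoustic, articulatory, alpha=0.7):
    """Weighted-sum score-level fusion of two subsystem scores."""
    return [alpha * a + (1.0 - alpha) * b for a, b in zip(acoustic, articulatory)]

def equal_error_rate(genuine, impostor):
    """Scan thresholds; return the operating point where FAR ~= FRR."""
    best_gap, eer = 1.0, None
    for t in sorted(genuine + impostor):
        far = sum(s >= t for s in impostor) / len(impostor)  # false accepts
        frr = sum(s < t for s in genuine) / len(genuine)     # false rejects
        if abs(far - frr) < best_gap:
            best_gap, eer = abs(far - frr), (far + frr) / 2.0
    return eer

# hypothetical trial scores: higher = more likely the claimed speaker
genuine = fuse_scores([2.1, 1.8, 2.5], [1.9, 2.2, 2.0])
impostor = fuse_scores([0.4, 0.9, 0.2], [0.7, 0.3, 0.8])
eer = equal_error_rate(genuine, impostor)
```

With these perfectly separable toy scores the EER is zero; on real trials the fused EER is compared against each subsystem's EER to quantify the fusion gain.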

  1. Verification of Functional Fault Models and the Use of Resource Efficient Verification Tools

    NASA Technical Reports Server (NTRS)

    Bis, Rachael; Maul, William A.

    2015-01-01

    Functional fault models (FFMs) are a directed graph representation of the failure effect propagation paths within a system's physical architecture and are used to support development and real-time diagnostics of complex systems. Verification of these models is required to confirm that the FFMs are correctly built and accurately represent the underlying physical system. However, a manual, comprehensive verification process applied to the FFMs was found to be error prone due to the intensive and customized process necessary to verify each individual component model and to require a burdensome level of resources. To address this problem, automated verification tools have been developed and utilized to mitigate these key pitfalls. This paper discusses the verification of the FFMs and presents the tools that were developed to make the verification process more efficient and effective.
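
    The automated-verification idea above can be sketched minimally: an FFM is a directed graph, and a checking tool confirms that each fault propagates to exactly the effects the system analysis predicts. The graph and expected-effect sets below are invented for illustration, not drawn from the NASA models.

```python
from collections import deque

def propagate(graph, fault):
    """Return all nodes reachable from `fault`, i.e. its failure effects."""
    seen, queue = set(), deque([fault])
    while queue:
        node = queue.popleft()
        for nxt in graph.get(node, []):
            if nxt not in seen:
                seen.add(nxt)
                queue.append(nxt)
    return seen

def verify_ffm(graph, expected_effects):
    """Flag each fault whose propagated effects differ from expectation."""
    return {f: propagate(graph, f) == exp for f, exp in expected_effects.items()}

# hypothetical fault model: edges are failure-effect propagation paths
ffm = {"valve_stuck": ["low_flow"],
       "low_flow": ["pump_cavitation"],
       "sensor_bias": ["bad_reading"]}
expected = {"valve_stuck": {"low_flow", "pump_cavitation"},
            "sensor_bias": {"bad_reading"}}
report = verify_ffm(ffm, expected)
```

A report entry of False would point the modeler at a fault whose propagation paths were wired incorrectly, replacing the error-prone manual walk-through of each component model.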

  2. Monitoring and verification R&D

    SciTech Connect

    Pilat, Joseph F; Budlong - Sylvester, Kory W; Fearey, Bryan L

    2011-01-01

    The 2010 Nuclear Posture Review (NPR) report outlined the Administration's approach to promoting the agenda put forward by President Obama in Prague on April 5, 2009. The NPR calls for a national monitoring and verification R&D program to meet future challenges arising from the Administration's nonproliferation, arms control and disarmament agenda. Verification of a follow-on to New START could have to address warheads and possibly components along with delivery capabilities. Deeper cuts and disarmament would need to address all of these elements along with nuclear weapon testing, nuclear material and weapon production facilities, virtual capabilities from old weapon and existing energy programs and undeclared capabilities. We only know how to address some elements of these challenges today, and the requirements may be more rigorous in the context of deeper cuts as well as disarmament. Moreover, there is a critical need for multiple options to sensitive problems and to address other challenges. There will be other verification challenges in a world of deeper cuts and disarmament, some of which we are already facing. At some point, if the reductions process is progressing, uncertainties about past nuclear materials and weapons production will have to be addressed. IAEA safeguards will need to continue to evolve to meet current and future challenges, and to take advantage of new technologies and approaches. Transparency/verification of nuclear and dual-use exports will also have to be addressed, and there will be a need to make nonproliferation measures more watertight and transparent. In this context, and recognizing we will face all of these challenges even if disarmament is not achieved, this paper will explore possible agreements and arrangements; verification challenges; gaps in monitoring and verification technologies and approaches; and the R&D required to address these gaps and other monitoring and verification challenges.

  3. Verification of forecast ensembles in complex terrain including observation uncertainty

    NASA Astrophysics Data System (ADS)

    Dorninger, Manfred; Kloiber, Simon

    2017-04-01

    Traditionally, verification means verifying a forecast (ensemble) against the truth as represented by observations. Observation errors are quite often neglected on the argument that they are small compared to the forecast error. In this study, part of the MesoVICT (Mesoscale Verification Inter-comparison over Complex Terrain) project, it will be shown that observation errors have to be taken into account for verification purposes. The observation uncertainty is estimated from the VERA (Vienna Enhanced Resolution Analysis) and represented via two analysis ensembles, which are compared to the forecast ensemble. Throughout the study, results from COSMO-LEPS provided by Arpae-SIMC Emilia-Romagna are used as the forecast ensemble. The time period covers the MesoVICT core case from 20-22 June 2007. In a first step, all ensembles are investigated with respect to their distribution. Several tests were executed (Kolmogorov-Smirnov test, Finkelstein-Schafer test, chi-square test, etc.), none of which identified an exact mathematical distribution. The main focus is therefore on non-parametric statistics (e.g., kernel density estimation, boxplots) and on the deviation between "forced" normally distributed data and the kernel density estimates. In a next step, the observational deviations due to the analysis ensembles are analysed. In a first approach, scores are calculated multiple times, with each member of the analysis ensemble in turn regarded as the "true" observation. The results are presented as boxplots for the different scores and parameters. Additionally, the bootstrapping method is applied to the ensembles. These possible approaches to incorporating observational uncertainty into the computation of statistics will be discussed in the talk.
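
    The core idea (recomputing a score once per analysis-ensemble member treated as "truth") can be sketched in a few lines. The forecast and analysis values below are illustrative stand-ins, not MesoVICT data.

```python
# Recompute a verification score with each analysis-ensemble member
# regarded in turn as the "true" observation; the spread of the
# resulting scores reflects observation uncertainty.

def rmse(forecast, truth):
    """Root-mean-square error of a forecast against one candidate truth."""
    return (sum((f - t) ** 2 for f, t in zip(forecast, truth)) / len(truth)) ** 0.5

forecast_mean = [15.2, 16.1, 14.8, 15.9]   # forecast ensemble mean at 4 stations
analysis_ensemble = [                       # plausible "truths" from the analysis
    [15.0, 16.3, 14.6, 16.0],
    [15.4, 15.9, 15.1, 15.7],
    [14.9, 16.2, 14.7, 16.1],
]
scores = [rmse(forecast_mean, member) for member in analysis_ensemble]
spread = max(scores) - min(scores)  # score uncertainty due to observation uncertainty
```

In the study the resulting score distributions are summarized as boxplots; a forecast difference smaller than `spread` cannot be attributed to forecast quality alone.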

  4. INF verification: a guide for the perplexed

    SciTech Connect

    Mendelsohn, J.

    1987-09-01

    The administration has dug itself some deep holes on the verification issue. It will have to conclude an arms control treaty without having resolved earlier (but highly questionable) compliance issues on which it has placed great emphasis. It will probably have to abandon its more sweeping (and unnecessary) on-site inspection (OSI) proposals because of adverse security and political implications for the United States and its allies. And, finally, it will probably have to present to the Congress an INF treaty that provides for a considerably less stringent (but nonetheless adequate) verification regime than it had originally demanded. It is difficult to dispel the impression that, when the likelihood of concluding an INF treaty seemed remote, the administration indulged its penchant for intrusive and non-negotiable verification measures. As the possibility of, and eagerness for, a treaty increased, and as the Soviet Union shifted its policy from one of resistance to OSI to one indicating that on-site verification involved reciprocal obligations, the administration was forced to scale back its OSI rhetoric. This re-evaluation of OSI by the administration does not make the INF treaty any less verifiable; from the outset the Reagan administration was asking for a far more extensive verification package than was necessary, practicable, acceptable, or negotiable.

  5. Hybrid Deep Learning for Face Verification.

    PubMed

    Sun, Yi; Wang, Xiaogang; Tang, Xiaoou

    2016-10-01

    This paper proposes a hybrid convolutional network (ConvNet)-Restricted Boltzmann Machine (RBM) model for face verification. A key contribution of this work is to learn high-level relational visual features with rich identity similarity information. The deep ConvNets in our model start by extracting local relational visual features from two face images in comparison, which are further processed through multiple layers to extract high-level and global relational features. To keep enough discriminative information, we use the last hidden layer neuron activations of the ConvNet as features for face verification instead of those of the output layer. To characterize face similarities from different aspects, we concatenate the features extracted from different face region pairs by different deep ConvNets. The resulting high-dimensional relational features are classified by an RBM for face verification. After pre-training each ConvNet and the RBM separately, the entire hybrid network is jointly optimized to further improve the accuracy. Various aspects of the ConvNet structures, relational features, and face verification classifiers are investigated. Our model achieves the state-of-the-art face verification performance on the challenging LFW dataset under both the unrestricted protocol and the setting when outside data is allowed to be used for training.

  6. Complementary technologies for verification of excess plutonium

    SciTech Connect

    Langner, D.G.; Nicholas, N.J.; Ensslin, N.; Fearey, B.L.; Mitchell, D.J.; Marlow, K.W.; Luke, S.J.; Gosnell, T.B.

    1998-12-31

    Three complementary measurement technologies have been identified as candidates for use in the verification of excess plutonium of weapons origin. These technologies: high-resolution gamma-ray spectroscopy, neutron multiplicity counting, and low-resolution gamma-ray spectroscopy, are mature, robust technologies. The high-resolution gamma-ray system, Pu-600, uses the 630--670 keV region of the emitted gamma-ray spectrum to determine the ratio of {sup 240}Pu to {sup 239}Pu. It is useful in verifying the presence of plutonium and the presence of weapons-grade plutonium. Neutron multiplicity counting is well suited for verifying that the plutonium is of a safeguardable quantity and is weapons-quality material, as opposed to residue or waste. In addition, multiplicity counting can independently verify the presence of plutonium by virtue of a measured neutron self-multiplication and can detect the presence of non-plutonium neutron sources. The low-resolution gamma-ray spectroscopic technique is a template method that can provide continuity of knowledge that an item that enters a verification regime remains under the regime. In the initial verification of an item, multiple regions of the measured low-resolution spectrum form a unique, gamma-radiation-based template for the item that can be used for comparison in subsequent verifications. In this paper the authors discuss these technologies as they relate to the different attributes that could be used in a verification regime.
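
    The attribute logic behind the Pu-600-style check can be sketched as follows. The calibration constant, peak areas, and the roughly 7% threshold convention are illustrative assumptions, not values from the paper; a real analysis uses published branching ratios and measured detector-efficiency curves.

```python
# Toy sketch: estimate the 240Pu/239Pu ratio from background-subtracted
# peak areas in the 630-670 keV region and test a weapons-grade attribute.

def pu240_to_pu239_ratio(net_counts_240, net_counts_239, calibration=1.0):
    """Atom-ratio estimate from net peak areas (calibration is hypothetical)."""
    return calibration * net_counts_240 / net_counts_239

def is_weapons_grade(ratio, threshold=0.07):
    """Weapons-grade plutonium has roughly < 7% 240Pu relative to 239Pu."""
    return ratio < threshold

ratio = pu240_to_pu239_ratio(580.0, 10000.0)  # hypothetical peak areas
grade = is_weapons_grade(ratio)
```

In an attribute-based regime, only the boolean result (`grade`) would be disclosed, not the underlying spectrum, which is what makes the approach suitable for sensitive items.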

  7. Neighborhood Repulsed Metric Learning for Kinship Verification.

    PubMed

    Lu, Jiwen; Zhou, Xiuzhuang; Tan, Yap-Pen; Shang, Yuanyuan; Zhou, Jie

    2013-07-16

    Kinship verification from facial images is an interesting and challenging problem in computer vision, and there are very limited attempts to tackle this problem in the literature. In this paper, we propose a new neighborhood repulsed metric learning (NRML) method for kinship verification. Motivated by the fact that interclass samples (without kinship relations) with higher similarity usually lie in a neighborhood and are more easily misclassified than those with lower similarity, we aim to learn a distance metric under which the intraclass samples (with kinship relations) are pulled as close as possible and interclass samples lying in a neighborhood are repulsed and pushed away as far as possible, simultaneously, such that more discriminative information can be exploited for verification. To make better use of multiple feature descriptors to extract complementary information, we further propose a multiview NRML (MNRML) method to seek a common distance metric to perform multiple feature fusion to improve the kinship verification performance. Experimental results are presented to demonstrate the efficacy of our proposed methods. Lastly, we also test human ability in kinship verification from facial images and our experimental results show that our methods are comparable to that of human observers.

  8. Neighborhood repulsed metric learning for kinship verification.

    PubMed

    Lu, Jiwen; Zhou, Xiuzhuang; Tan, Yap-Pen; Shang, Yuanyuan; Zhou, Jie

    2014-02-01

    Kinship verification from facial images is an interesting and challenging problem in computer vision, and there are very limited attempts to tackle this problem in the literature. In this paper, we propose a new neighborhood repulsed metric learning (NRML) method for kinship verification. Motivated by the fact that interclass samples (without a kinship relation) with higher similarity usually lie in a neighborhood and are more easily misclassified than those with lower similarity, we aim to learn a distance metric under which the intraclass samples (with a kinship relation) are pulled as close as possible and interclass samples lying in a neighborhood are repulsed and pushed away as far as possible, simultaneously, such that more discriminative information can be exploited for verification. To make better use of multiple feature descriptors to extract complementary information, we further propose a multiview NRML (MNRML) method to seek a common distance metric to perform multiple feature fusion to improve the kinship verification performance. Experimental results are presented to demonstrate the efficacy of our proposed methods. Finally, we also test human ability in kinship verification from facial images and our experimental results show that our methods are comparable to that of human observers.

  9. GHG MITIGATION TECHNOLOGY PERFORMANCE EVALUATIONS UNDERWAY AT THE GHG TECHNOLOGY VERIFICATION CENTER

    EPA Science Inventory

    The paper outlines the verification approach and activities of the Greenhouse Gas (GHG) Technology Verification Center, one of 12 independent verification entities operating under the U.S. EPA-sponsored Environmental Technology Verification (ETV) program. (NOTE: The ETV program...

  10. GHG MITIGATION TECHNOLOGY PERFORMANCE EVALUATIONS UNDERWAY AT THE GHG TECHNOLOGY VERIFICATION CENTER

    EPA Science Inventory

    The paper outlines the verification approach and activities of the Greenhouse Gas (GHG) Technology Verification Center, one of 12 independent verification entities operating under the U.S. EPA-sponsored Environmental Technology Verification (ETV) program. (NOTE: The ETV program...

  11. Land Ice Verification and Validation Kit

    SciTech Connect

    2015-07-15

    To address a pressing need to better understand the behavior and complex interaction of ice sheets within the global Earth system, significant development of continental-scale, dynamical ice-sheet models is underway. The associated verification and validation process for these models is being coordinated through a new, robust, Python-based extensible software package, the Land Ice Verification and Validation toolkit (LIVV). This release provides robust and automated verification and a performance evaluation on LCF platforms. The performance V&V involves a comprehensive comparison of model performance relative to expected behavior on a given computing platform. LIVV operates on a set of benchmark and test data, and provides comparisons for a suite of community-prioritized tests, including configuration and parameter variations, bit-for-bit evaluation, and plots of tests where differences occur.
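
    The bit-for-bit evaluation mentioned above reduces to comparing model output against a stored benchmark byte by byte, which can be sketched with file hashing. The file names and contents below are invented for illustration; LIVV itself runs this over whole suites of community tests.

```python
import hashlib
import pathlib
import tempfile

def file_digest(path):
    """SHA-256 digest of a file's raw bytes."""
    return hashlib.sha256(pathlib.Path(path).read_bytes()).hexdigest()

def bit_for_bit(test_output, benchmark):
    """True only when the two output files are byte-identical."""
    return file_digest(test_output) == file_digest(benchmark)

# demo with temporary files standing in for model outputs
with tempfile.TemporaryDirectory() as d:
    bench = pathlib.Path(d, "bench.out"); bench.write_bytes(b"thickness=1.25\n")
    good = pathlib.Path(d, "test.out");   good.write_bytes(b"thickness=1.25\n")
    bad = pathlib.Path(d, "diff.out");    bad.write_bytes(b"thickness=1.26\n")
    same = bit_for_bit(good, bench)
    differ = bit_for_bit(bad, bench)
```

A failed bit-for-bit check does not by itself mean the model is wrong (compilers and platforms legitimately perturb floating point), which is why LIVV pairs it with statistical and plot-based comparisons.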

  12. Packaged low-level waste verification system

    SciTech Connect

    Tuite, K.T.; Winberg, M.; Flores, A.Y.; Killian, E.W.; McIsaac, C.V.

    1996-08-01

    Currently, states and low-level radioactive waste (LLW) disposal site operators have no method of independently verifying the radionuclide content of packaged LLW that arrives at disposal sites for disposal. At this time, disposal sites rely on LLW generator shipping manifests and accompanying records to ensure that LLW received meets the waste acceptance criteria. An independent verification system would provide a method of checking generator LLW characterization methods and help ensure that LLW disposed of at disposal facilities meets requirements. The Mobile Low-Level Waste Verification System (MLLWVS) provides the equipment, software, and methods to enable the independent verification of LLW shipping records to ensure that disposal site waste acceptance criteria are being met. The MLLWVS was developed under a cost-share subcontract between WMG, Inc., and Lockheed Martin Idaho Technologies through the Department of Energy's National Low-Level Waste Management Program at the Idaho National Engineering Laboratory (INEL).

  13. PBL Verification with Radiosonde and Aircraft Data

    NASA Astrophysics Data System (ADS)

    Tsidulko, M.; McQueen, J.; Dimego, G.; Ek, M.

    2008-12-01

    Boundary layer depth is an important characteristic in weather forecasting and a key parameter in air quality modeling, determining the extent of turbulence and dispersion for pollutants. Real-time PBL depths from the NAM (WRF/NMM) model are verified against different types of observations. PBL depth verification is incorporated into the NCEP verification system, including the ability to provide a range of statistical characteristics for the boundary layer heights. For the model, several types of boundary layer definitions are used. PBL height from the TKE scheme and from the critical Ri number approach, as well as mixed layer depth, are compared with observations. Observed PBL depths are determined by applying the Ri number approach to radiosonde profiles. A preliminary study of using ACARS data for PBL verification is also conducted.
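
    The critical-Ri-number diagnosis applied to radiosonde profiles can be sketched directly: scan upward from the surface and take the first level where the bulk Richardson number exceeds a critical value (commonly around 0.25). The sounding values below are invented for illustration.

```python
G = 9.81        # gravity, m s^-2
RI_CRIT = 0.25  # critical bulk Richardson number (common convention)

def bulk_richardson(z, theta_v, u, v, z0, theta_v0):
    """Bulk Richardson number between the surface level and height z."""
    wind_sq = u * u + v * v
    if wind_sq == 0.0:
        return float("inf")
    return (G / theta_v0) * (theta_v - theta_v0) * (z - z0) / wind_sq

def pbl_height(levels):
    """levels: (z [m], virtual potential temp [K], u [m/s], v [m/s]), surface first."""
    z0, th0, _, _ = levels[0]
    for z, th, u, v in levels[1:]:
        if bulk_richardson(z, th, u, v, z0, th0) > RI_CRIT:
            return z
    return None

# hypothetical sounding with a capping inversion near 900 m
sounding = [(10, 300.0, 1.0, 0.5),
            (200, 300.2, 4.0, 1.0),
            (600, 300.5, 6.0, 2.0),
            (900, 302.5, 6.5, 2.0),
            (1500, 305.0, 7.0, 2.5)]
h = pbl_height(sounding)  # first level where Ri exceeds RI_CRIT
```

Applying the same diagnostic to both model profiles and radiosondes, as the abstract describes, keeps the comparison consistent across data sources.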

  14. Active alignment/contact verification system

    DOEpatents

    Greenbaum, William M.

    2000-01-01

    A system involving an active (i.e., electrical) technique for the verification of: 1) close-tolerance mechanical alignment between two components, and 2) electrical contact between mating parts through an elastomeric interface. For example, the two components may be an alumina carrier and a printed circuit board, two extremely small, high-density mating parts that require alignment within a fraction of a mil, as well as a specified interface point of engagement between the parts. The system comprises pairs of conductive structures defined in the surface layers of the alumina carrier and the printed circuit board, for example. The first pair of conductive structures relates to item (1) above and permits alignment verification between mating parts. The second pair of conductive structures relates to item (2) above and permits verification of electrical contact between mating parts.

  15. Software Verification of Orion Cockpit Displays

    NASA Technical Reports Server (NTRS)

    Biswas, M. A. Rafe; Garcia, Samuel; Prado, Matthew; Hossain, Sadad; Souris, Matthew; Morin, Lee

    2017-01-01

    NASA's latest spacecraft, Orion, is in development to take humans deeper into space. Orion is equipped with three main displays to monitor and control the spacecraft. To ensure the software behind the glass displays operates without faults, rigorous testing is needed. To conduct such testing, the Rapid Prototyping Lab at NASA's Johnson Space Center, along with the University of Texas at Tyler, employed a software verification tool, EggPlant Functional by TestPlant. It is an image-based test automation tool that allows users to create scripts to verify the functionality within a program. An edge key framework and a set of Common EggPlant Functions were developed to enable efficient script creation. This framework standardized the way to code and to simulate user inputs in the verification process. Moreover, the Common EggPlant Functions can be reused in the verification of different displays.

  16. Land Ice Verification and Validation Kit

    SciTech Connect

    2015-07-15

    To address a pressing need to better understand the behavior and complex interaction of ice sheets within the global Earth system, significant development of continental-scale, dynamical ice-sheet models is underway. The associated verification and validation process for these models is being coordinated through a new, robust, Python-based extensible software package, the Land Ice Verification and Validation toolkit (LIVV). This release provides robust and automated verification and a performance evaluation on LCF platforms. The performance V&V involves a comprehensive comparison of model performance relative to expected behavior on a given computing platform. LIVV operates on a set of benchmark and test data, and provides comparisons for a suite of community-prioritized tests, including configuration and parameter variations, bit-for-bit evaluation, and plots of tests where differences occur.

  17. Critical Surface Cleaning and Verification Alternatives

    NASA Technical Reports Server (NTRS)

    Melton, Donald M.; McCool, A. (Technical Monitor)

    2000-01-01

    As a result of federal and state requirements, historical critical cleaning and verification solvents such as Freon 113, Freon TMC, and trichloroethylene (TCE) are either highly regulated or no longer available. Interim replacements such as HCFC 225 have been qualified; however, toxicity and future phase-out regulations necessitate long-term solutions. The scope of this project was to qualify a safe and environmentally compliant LOX surface verification alternative to Freon 113, TCE, and HCFC 225. The main effort was focused on initiating the evaluation and qualification of HCFC 225G as an alternate LOX verification solvent. The project was scoped in FY 99/00 to perform LOX compatibility, cleaning efficiency, and qualification testing on flight hardware.

  18. Hierarchical Representation Learning for Kinship Verification.

    PubMed

    Kohli, Naman; Vatsa, Mayank; Singh, Richa; Noore, Afzel; Majumdar, Angshul

    2017-01-01

    Kinship verification has a number of applications such as organizing large collections of images and recognizing resemblances among humans. In this paper, first, a human study is conducted to understand the capabilities of the human mind and to identify the discriminatory areas of a face that facilitate kinship-cues. The visual stimuli presented to the participants determine their ability to recognize kin relationship using the whole face as well as specific facial regions. The effect of participant gender and age and kin-relation pair of the stimulus is analyzed using quantitative measures such as accuracy, discriminability index d', and perceptual information entropy. Utilizing the information obtained from the human study, a hierarchical kinship verification via representation learning (KVRL) framework is utilized to learn the representation of different face regions in an unsupervised manner. We propose a novel approach for feature representation termed as filtered contractive deep belief networks (fcDBN). The proposed feature representation encodes relational information present in images using filters and contractive regularization penalty. A compact representation of facial images of kin is extracted as an output from the learned model and a multi-layer neural network is utilized to verify the kin accurately. A new WVU kinship database is created, which consists of multiple images per subject to facilitate kinship verification. The results show that the proposed deep learning framework (KVRL-fcDBN) yields the state-of-the-art kinship verification accuracy on the WVU kinship database and on four existing benchmark data sets. Furthermore, kinship information is used as a soft biometric modality to boost the performance of face verification via product of likelihood ratio and support vector machine based approaches. Using the proposed KVRL-fcDBN framework, an improvement of over 20% is observed in the performance of face verification.
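
    The product-of-likelihood-ratio fusion used above to fold kinship information into face verification can be sketched generically: each subsystem emits a likelihood ratio P(score | same identity) / P(score | different identity), and the fused ratio is their product. The Gaussian score models and thresholds below are illustrative stand-ins, not the paper's trained models.

```python
import math

def gaussian(x, mean, std):
    """Gaussian probability density."""
    return math.exp(-0.5 * ((x - mean) / std) ** 2) / (std * math.sqrt(2 * math.pi))

def likelihood_ratio(score, genuine=(0.8, 0.1), impostor=(0.3, 0.15)):
    """LR of one subsystem score under hypothetical genuine/impostor models."""
    return gaussian(score, *genuine) / gaussian(score, *impostor)

def fused_decision(face_score, kin_score, threshold=1.0):
    """Product-of-LR fusion: accept when the combined evidence favors 'same'."""
    lr = likelihood_ratio(face_score) * likelihood_ratio(kin_score)
    return lr > threshold, lr

accept, lr = fused_decision(0.75, 0.70)  # hypothetical subsystem scores
```

Because the fused statistic is a product, a confident kinship cue can rescue a borderline face score, which is the sense in which kinship acts as a soft biometric here.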

  19. Hierarchical Representation Learning for Kinship Verification.

    PubMed

    Kohli, Naman; Vatsa, Mayank; Singh, Richa; Noore, Afzel; Majumdar, Angshul

    2016-09-14

    Kinship verification has a number of applications such as organizing large collections of images and recognizing resemblances among humans. In this research, first, a human study is conducted to understand the capabilities of the human mind and to identify the discriminatory areas of a face that facilitate kinship-cues. The visual stimuli presented to the participants determine their ability to recognize kin relationship using the whole face as well as specific facial regions. The effect of participant gender and age and kin-relation pair of the stimulus is analyzed using quantitative measures such as accuracy, discriminability index d', and perceptual information entropy. Utilizing the information obtained from the human study, a hierarchical Kinship Verification via Representation Learning (KVRL) framework is utilized to learn the representation of different face regions in an unsupervised manner. We propose a novel approach for feature representation termed as filtered contractive deep belief networks (fcDBN). The proposed feature representation encodes relational information present in images using filters and contractive regularization penalty. A compact representation of facial images of kin is extracted as an output from the learned model and a multi-layer neural network is utilized to verify the kin accurately. A new WVU Kinship Database is created, which consists of multiple images per subject to facilitate kinship verification. The results show that the proposed deep learning framework (KVRL-fcDBN) yields state-of-the-art kinship verification accuracy on the WVU Kinship database and on four existing benchmark datasets. Further, kinship information is used as a soft biometric modality to boost the performance of face verification via product of likelihood ratio and support vector machine based approaches. Using the proposed KVRL-fcDBN framework, an improvement of over 20% is observed in the performance of face verification.

  20. Generic Protocol for the Verification of Ballast Water Treatment Technology

    EPA Science Inventory

    In anticipation of the need to address performance verification and subsequent approval of new and innovative ballast water treatment technologies for shipboard installation, the U.S. Coast Guard and the Environmental Protection Agency's Environmental Technology Verification Progr...

  1. Jet Propulsion Laboratory Environmental Verification Processes and Test Effectiveness

    NASA Technical Reports Server (NTRS)

    Hoffman, Alan R.; Green, Nelson W.

    2006-01-01

    Viewgraphs on the JPL processes for environmental verification and testing of aerospace systems are presented. The topics include: 1) Processes: a) JPL Design Principles, b) JPL Flight Project Practices; 2) Environmental Verification; and 3) Test Effectiveness Assessment: In-flight Anomaly Trends.

  3. On-Site Inspection as an Enhancement to Verification.

    DTIC Science & Technology

    1989-08-01

    and William J. Taylor, Jr., American National Security: Policy and Process (Baltimore, Maryland: The Johns Hopkins Press, 1984), 524. 34. Harvard...Study Group, Living With Nuclear Weapons, 195. 35. Jordan and Taylor, American National Security, 525. 36. Sidney N. Graybeal and Michael Krepon, "The...New York: Random House, 1984), 152. 38. Ibid., 155. 39. Jordan and Taylor, American National Security, 530-531. 40. Smoke, National Security, 170

  4. An Efficient Joint Formulation for Bayesian Face Verification.

    PubMed

    Chen, Dong; Cao, Xudong; Wipf, David; Wen, Fang; Sun, Jian

    2017-01-01

    This paper revisits the classical Bayesian face recognition algorithm from Baback Moghaddam et al. and proposes enhancements tailored to face verification, the problem of predicting whether or not a pair of facial images share the same identity. Like a variety of face verification algorithms, the original Bayesian face model only considers the appearance difference between two faces rather than the raw images themselves. However, we argue that such a fixed and blind projection may prematurely reduce the separability between classes. Consequently, we model two facial images jointly with an appropriate prior that considers intra- and extra-personal variations over the image pairs. This joint formulation is trained using a principled EM algorithm, while testing involves only efficient closed-form computations that are suitable for real-time practical deployment. Supporting theoretical analyses investigate computational complexity, scale-invariance properties, and convergence issues. We also detail important relationships with existing algorithms, such as probabilistic linear discriminant analysis and metric learning. Finally, in extensive experimental evaluations, the proposed model is superior to the classical Bayesian face algorithm and many alternative state-of-the-art supervised approaches, achieving the best test accuracy on three challenging datasets, Labeled Faces in the Wild, Multi-PIE, and YouTube Faces, all with unparalleled computational efficiency.
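    A scalar sketch of the joint formulation's closed-form likelihood-ratio test follows. Each feature is modeled as identity plus within-person noise, x = μ + ε; the variances below are illustrative assumptions (the paper fits high-dimensional covariances with EM), but the structure of the two hypotheses matches the joint-modeling idea: under the same-identity hypothesis the two observations share the μ component.

```python
import math

# Assumed model variances: identity variance S_MU, within-person noise S_EPS.
S_MU, S_EPS = 1.0, 0.25

def log_gauss2(x1, x2, a, b):
    # Log density of a zero-mean bivariate Gaussian with covariance
    # [[a, b], [b, a]] (the symmetric form both hypotheses produce).
    det = a * a - b * b
    quad = (a * (x1 * x1 + x2 * x2) - 2.0 * b * x1 * x2) / det
    return -0.5 * (2.0 * math.log(2.0 * math.pi) + math.log(det) + quad)

def joint_bayesian_llr(x1, x2):
    # H_intra: same identity, so the latent mu component is shared,
    # giving off-diagonal covariance S_MU between the two observations.
    intra = log_gauss2(x1, x2, S_MU + S_EPS, S_MU)
    # H_extra: different identities, so the observations are independent.
    extra = log_gauss2(x1, x2, S_MU + S_EPS, 0.0)
    return intra - extra
```

    Similar feature values yield a positive log-likelihood ratio (same identity favored), dissimilar values a negative one; thresholding this ratio gives the verification decision, and everything is closed-form at test time.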

  5. Verification of Plan Models Using UPPAAL

    NASA Technical Reports Server (NTRS)

    Khatib, Lina; Muscettola, Nicola; Havelund, Klaus; Lau, Sonic (Technical Monitor)

    2001-01-01

    This paper describes work on the verification of HSTS, the planner and scheduler of the Remote Agent autonomous control system deployed on Deep Space 1 (DS1). The verification is done using UPPAAL, a real-time model checking tool. We start by motivating our work in the introduction. Then we give brief descriptions of HSTS and UPPAAL. After that, we give a mapping of HSTS models into UPPAAL and present samples of plan model properties one may want to verify. Finally, we conclude with a summary.

  6. The formal verification of generic interpreters

    NASA Technical Reports Server (NTRS)

    Windley, P.; Levitt, K.; Cohen, G. C.

    1991-01-01

    The task assignment 3 of the design and validation of digital flight control systems suitable for fly-by-wire applications is studied. Task 3 is associated with formal verification of embedded systems. In particular, results are presented that provide a methodological approach to microprocessor verification. A hierarchical decomposition strategy for specifying microprocessors is also presented. A theory of generic interpreters is presented that can be used to model microprocessor behavior. The generic interpreter theory abstracts away the details of instruction functionality, leaving a general model of what an interpreter does.

  7. Proton Therapy Verification with PET Imaging

    PubMed Central

    Zhu, Xuping; Fakhri, Georges El

    2013-01-01

    Proton therapy is very sensitive to uncertainties introduced during treatment planning and dose delivery. PET imaging of proton induced positron emitter distributions is the only practical approach for in vivo, in situ verification of proton therapy. This article reviews the current status of proton therapy verification with PET imaging. The different data detecting systems (in-beam, in-room and off-line PET), calculation methods for the prediction of proton induced PET activity distributions, and approaches for data evaluation are discussed. PMID:24312147

  8. Challenges in High-Assurance Runtime Verification

    NASA Technical Reports Server (NTRS)

    Goodloe, Alwyn E.

    2016-01-01

    Safety-critical systems are growing more complex and becoming increasingly autonomous. Runtime Verification (RV) has the potential to provide protections when a system cannot be assured by conventional means, but only if the RV itself can be trusted. In this paper, we proffer a number of challenges to realizing high-assurance RV and illustrate how we have addressed them in our research. We argue that high-assurance RV provides a rich target for automated verification tools, in the hope of fostering closer collaboration among the communities.
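    The core runtime-verification idea described above can be illustrated minimally: a monitor observes a trace of events as the system runs and reports whether a property held. The "open"/"close" pairing property below is a hypothetical example, not one from the paper.

```python
# Minimal runtime-verification monitor for the (hypothetical) property
# "every 'close' matches a prior 'open', and nothing is left open".

class OpenCloseMonitor:
    def __init__(self):
        self.pending = 0       # opens not yet closed
        self.violated = False  # set when a close has no matching open

    def observe(self, event):
        # Called once per event as the monitored system emits its trace.
        if event == "open":
            self.pending += 1
        elif event == "close":
            if self.pending == 0:
                self.violated = True
            else:
                self.pending -= 1

    def verdict_at_end(self):
        # Property holds iff no stray close occurred and nothing stayed open.
        return not self.violated and self.pending == 0

def check_trace(trace):
    monitor = OpenCloseMonitor()
    for event in trace:
        monitor.observe(event)
    return monitor.verdict_at_end()
```

    The high-assurance concern the paper raises is exactly about components like this monitor: if its own logic is wrong, it may silently miss violations, so the monitor itself becomes a target for formal verification.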

  9. Electric power system test and verification program

    NASA Technical Reports Server (NTRS)

    Rylicki, Daniel S.; Robinson, Frank, Jr.

    1994-01-01

    Space Station Freedom's (SSF's) electric power system (EPS) hardware and software verification is performed at all levels of integration, from components to assembly and system level tests. Careful planning is essential to ensure the EPS is tested properly on the ground prior to launch. The results of the test performed on breadboard model hardware and analyses completed to date have been evaluated and used to plan for design qualification and flight acceptance test phases. These results and plans indicate the verification program for SSF's 75-kW EPS would have been successful and completed in time to support the scheduled first element launch.

  10. Verification Of Tooling For Robotic Welding

    NASA Technical Reports Server (NTRS)

    Osterloh, Mark R.; Sliwinski, Karen E.; Anderson, Ronald R.

    1991-01-01

    Computer simulations, robotic inspections, and visual inspections are performed to detect discrepancies. The method for verification of tooling for robotic welding involves a combination of computer simulations and visual inspections. The verification process ensures the accuracy of the mathematical model representing the tooling in the off-line programming system that numerically simulates the operation of the robotic welding system. The process helps prevent damaging collisions between welding equipment and the workpiece, ensures the tooling is positioned and oriented properly with respect to the workpiece, and determines whether the tooling must be modified or adjusted to achieve the foregoing objectives.

  11. Fuel Retrieval System Design Verification Report

    SciTech Connect

    GROTH, B.D.

    2000-04-11

    The Fuel Retrieval Subproject was established as part of the Spent Nuclear Fuel Project (SNF Project) to retrieve and repackage the SNF located in the K Basins. The Fuel Retrieval System (FRS) construction work is complete in the KW Basin, and start-up testing is underway. Design modifications and construction planning are also underway for the KE Basin. An independent review of the design verification process as applied to the K Basin projects was initiated in support of preparation for the SNF Project operational readiness review (ORR). A Design Verification Status Questionnaire, Table 1, is included which addresses Corrective Action SNF-EG-MA-EG-20000060, Item No.9 (Miller 2000).

  12. Towards Verification and Validation for Increased Autonomy

    NASA Technical Reports Server (NTRS)

    Giannakopoulou, Dimitra

    2017-01-01

    This presentation covers the work we have performed over the last few years on verification and validation (V&V) of the next-generation onboard collision avoidance system, ACAS X, for commercial aircraft. It describes our work on probabilistic verification and synthesis of the model that ACAS X is based on, and goes on to the validation of that model with respect to actual simulation and flight data. The presentation then identifies the characteristics of ACAS X that are related to autonomy and discusses the challenges that autonomy poses for V&V. All work presented has already been published.

  13. On Backward-Style Anonymity Verification

    NASA Astrophysics Data System (ADS)

    Kawabe, Yoshinobu; Mano, Ken; Sakurada, Hideki; Tsukada, Yasuyuki

    Many Internet services and protocols should guarantee anonymity; for example, an electronic voting system should prevent the disclosure of who voted for which candidate. To prove trace anonymity, which is an extension of the formulation of anonymity by Schneider and Sidiropoulos, this paper presents an inductive method based on backward anonymous simulations. We show that the existence of an image-finite backward anonymous simulation implies trace anonymity. We also demonstrate the anonymity verification of an e-voting protocol (the FOO protocol) with our backward anonymous simulation technique. When proving the trace anonymity, this paper employs a computer-assisted verification tool based on a theorem prover.

  14. Earth Science Enterprise Scientific Data Purchase Project: Verification and Validation

    NASA Technical Reports Server (NTRS)

    Jenner, Jeff; Policelli, Fritz; Fletcher, Rosea; Holecamp, Kara; Owen, Carolyn; Nicholson, Lamar; Dartez, Deanna

    2000-01-01

    This paper presents viewgraphs on the Earth Science Enterprise Scientific Data Purchase Project's verification and validation process. The topics include: 1) What is Verification and Validation? 2) Why Verification and Validation? 3) Background; 4) ESE Data Purchase Validation Process; 5) Data Validation System and Ingest Queue; 6) Shipment Verification; 7) Tracking and Metrics; 8) Validation of Contract Specifications; 9) Earth Watch Data Validation; 10) Validation of Vertical Accuracy; and 11) Results of Vertical Accuracy Assessment.

  15. ENVIRONMENTAL TECHNOLOGY VERIFICATION REPORT - PERFORMANCE VERIFICATION OF THE W.L. GORE & ASSOCIATES GORE-SORBER SCREENING SURVEY

    EPA Science Inventory

    The U.S. Environmental Protection Agency (EPA) has created the Environmental Technology Verification Program (ETV) to facilitate the deployment of innovative or improved environmental technologies through performance verification and dissemination of information. The goal of the...

  16. 45 CFR 95.626 - Independent Verification and Validation.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... 45 Public Welfare 1 2011-10-01 2011-10-01 false Independent Verification and Validation. 95.626... (FFP) Specific Conditions for Ffp § 95.626 Independent Verification and Validation. (a) An assessment for independent verification and validation (IV&V) analysis of a State's system development effort...

  17. 15 CFR 748.13 - Delivery Verification (DV).

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... 15 Commerce and Foreign Trade 2 2011-01-01 2011-01-01 false Delivery Verification (DV). 748.13... (CLASSIFICATION, ADVISORY, AND LICENSE) AND DOCUMENTATION § 748.13 Delivery Verification (DV). (a) Scope. (1) BIS may request the licensee to obtain verifications of delivery on a selective basis. A...

  18. 15 CFR 748.13 - Delivery Verification (DV).

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 15 Commerce and Foreign Trade 2 2010-01-01 2010-01-01 false Delivery Verification (DV). 748.13... (CLASSIFICATION, ADVISORY, AND LICENSE) AND DOCUMENTATION § 748.13 Delivery Verification (DV). (a) Scope. (1) BIS may request the licensee to obtain verifications of delivery on a selective basis. A...

  19. 40 CFR 1066.240 - Torque transducer verification.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... 40 Protection of Environment 33 2014-07-01 2014-07-01 false Torque transducer verification. 1066... POLLUTION CONTROLS VEHICLE-TESTING PROCEDURES Dynamometer Specifications § 1066.240 Torque transducer verification. Verify torque-measurement systems by performing the verifications described in §§ 1066.270...

  20. 24 CFR 5.512 - Verification of eligible immigration status.

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ... immigration status. 5.512 Section 5.512 Housing and Urban Development Office of the Secretary, Department of... Noncitizens § 5.512 Verification of eligible immigration status. (a) General. Except as described in paragraph...) Primary verification—(1) Automated verification system. Primary verification of the immigration status...

  1. 24 CFR 5.512 - Verification of eligible immigration status.

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... immigration status. 5.512 Section 5.512 Housing and Urban Development Office of the Secretary, Department of... Noncitizens § 5.512 Verification of eligible immigration status. (a) General. Except as described in paragraph...) Primary verification—(1) Automated verification system. Primary verification of the immigration status...

  2. 24 CFR 5.512 - Verification of eligible immigration status.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... immigration status. 5.512 Section 5.512 Housing and Urban Development Office of the Secretary, Department of... Noncitizens § 5.512 Verification of eligible immigration status. (a) General. Except as described in paragraph...) Primary verification—(1) Automated verification system. Primary verification of the immigration status...

  3. 24 CFR 5.512 - Verification of eligible immigration status.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... immigration status. 5.512 Section 5.512 Housing and Urban Development Office of the Secretary, Department of... Noncitizens § 5.512 Verification of eligible immigration status. (a) General. Except as described in paragraph...) Primary verification—(1) Automated verification system. Primary verification of the immigration status...

  4. 24 CFR 5.512 - Verification of eligible immigration status.

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... immigration status. 5.512 Section 5.512 Housing and Urban Development Office of the Secretary, Department of... Noncitizens § 5.512 Verification of eligible immigration status. (a) General. Except as described in paragraph...) Primary verification—(1) Automated verification system. Primary verification of the immigration status...

  5. 37 CFR 382.5 - Verification of statements of account.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... of account. (a) General. This section prescribes general rules pertaining to the verification of the... the verification procedure shall retain the report of the verification for a period of three years. (e... ordinary course of business according to generally accepted auditing standards by an independent auditor...

  6. 37 CFR 262.7 - Verification of royalty payments.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... Verification of royalty payments. (a) General. This section prescribes procedures by which any Copyright Owner... Designated Agent. Any such audit shall be conducted by an independent and qualified auditor identified in the... verification procedure shall retain the report of the verification for a period of not less than 3 years. (e...

  7. 37 CFR 261.6 - Verification of statements of account.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ...) General. This section prescribes general rules pertaining to the verification of the statements of account... auditor identified in the notice, and shall be binding on all Designated Agents, and all Copyright Owners.... The Designated Agent requesting the verification procedure shall retain the report of the verification...

  8. 21 CFR 120.11 - Verification and validation.

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... 21 Food and Drugs 2 2014-04-01 2014-04-01 false Verification and validation. 120.11 Section 120.11... § 120.11 Verification and validation. (a) Verification. Each processor shall verify that the Hazard...)(iv)(C) of this section, are subject to the recordkeeping requirements of § 120.12. (b) Validation of...

  9. 40 CFR 1065.395 - Inertial PM balance verifications.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 40 Protection of Environment 32 2010-07-01 2010-07-01 false Inertial PM balance verifications... Inertial PM balance verifications. This section describes how to verify the performance of an inertial PM balance. (a) Independent verification. Have the balance manufacturer (or a representative approved by the...

  10. 40 CFR 1065.395 - Inertial PM balance verifications.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... 40 Protection of Environment 33 2011-07-01 2011-07-01 false Inertial PM balance verifications... Inertial PM balance verifications. This section describes how to verify the performance of an inertial PM balance. (a) Independent verification. Have the balance manufacturer (or a representative approved by the...

  11. 40 CFR 1065.395 - Inertial PM balance verifications.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... 40 Protection of Environment 34 2013-07-01 2013-07-01 false Inertial PM balance verifications... Inertial PM balance verifications. This section describes how to verify the performance of an inertial PM balance. (a) Independent verification. Have the balance manufacturer (or a representative approved by the...

  12. 40 CFR 1065.395 - Inertial PM balance verifications.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... 40 Protection of Environment 33 2014-07-01 2014-07-01 false Inertial PM balance verifications... Inertial PM balance verifications. This section describes how to verify the performance of an inertial PM balance. (a) Independent verification. Have the balance manufacturer (or a representative approved by the...

  13. 40 CFR 1065.395 - Inertial PM balance verifications.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... 40 Protection of Environment 34 2012-07-01 2012-07-01 false Inertial PM balance verifications... Inertial PM balance verifications. This section describes how to verify the performance of an inertial PM balance. (a) Independent verification. Have the balance manufacturer (or a representative approved by the...

  14. 9 CFR 416.17 - Agency verification.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 9 Animals and Animal Products 2 2010-01-01 2010-01-01 false Agency verification. 416.17 Section 416.17 Animals and Animal Products FOOD SAFETY AND INSPECTION SERVICE, DEPARTMENT OF AGRICULTURE REGULATORY REQUIREMENTS UNDER THE FEDERAL MEAT INSPECTION ACT AND THE POULTRY PRODUCTS INSPECTION ACT...

  15. 14 CFR 460.17 - Verification program.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... Aeronautics and Space COMMERCIAL SPACE TRANSPORTATION, FEDERAL AVIATION ADMINISTRATION, DEPARTMENT OF TRANSPORTATION LICENSING HUMAN SPACE FLIGHT REQUIREMENTS Launch and Reentry with Crew § 460.17 Verification... software in an operational flight environment before allowing any space flight participant on board during...

  16. 18 CFR 12.13 - Verification form.

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... 18 Conservation of Power and Water Resources 1 2014-04-01 2014-04-01 false Verification form. 12.13 Section 12.13 Conservation of Power and Water Resources FEDERAL ENERGY REGULATORY COMMISSION, DEPARTMENT OF ENERGY REGULATIONS UNDER THE FEDERAL POWER ACT SAFETY OF WATER POWER PROJECTS AND PROJECT...

  17. 18 CFR 12.13 - Verification form.

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... 18 Conservation of Power and Water Resources 1 2012-04-01 2012-04-01 false Verification form. 12.13 Section 12.13 Conservation of Power and Water Resources FEDERAL ENERGY REGULATORY COMMISSION, DEPARTMENT OF ENERGY REGULATIONS UNDER THE FEDERAL POWER ACT SAFETY OF WATER POWER PROJECTS AND PROJECT...

  18. 18 CFR 12.13 - Verification form.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... 18 Conservation of Power and Water Resources 1 2010-04-01 2010-04-01 false Verification form. 12.13 Section 12.13 Conservation of Power and Water Resources FEDERAL ENERGY REGULATORY COMMISSION, DEPARTMENT OF ENERGY REGULATIONS UNDER THE FEDERAL POWER ACT SAFETY OF WATER POWER PROJECTS AND PROJECT...

  19. 18 CFR 12.13 - Verification form.

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ... 18 Conservation of Power and Water Resources 1 2013-04-01 2013-04-01 false Verification form. 12.13 Section 12.13 Conservation of Power and Water Resources FEDERAL ENERGY REGULATORY COMMISSION, DEPARTMENT OF ENERGY REGULATIONS UNDER THE FEDERAL POWER ACT SAFETY OF WATER POWER PROJECTS AND PROJECT...

  20. 18 CFR 12.13 - Verification form.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... 18 Conservation of Power and Water Resources 1 2011-04-01 2011-04-01 false Verification form. 12.13 Section 12.13 Conservation of Power and Water Resources FEDERAL ENERGY REGULATORY COMMISSION, DEPARTMENT OF ENERGY REGULATIONS UNDER THE FEDERAL POWER ACT SAFETY OF WATER POWER PROJECTS AND PROJECT...

  1. 9 CFR 416.17 - Agency verification.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... SOP's and the procedures specified therein by determining that they meet the requirements of this part. Such verification may include: (a) Reviewing the Sanitation SOP's; (b) Reviewing the daily records documenting the implementation of the Sanitation SOP's and the procedures specified therein and any...

  2. Formal hardware verification of digital circuits

    NASA Technical Reports Server (NTRS)

    Joyce, J.; Seger, C.-J.

    1991-01-01

    The use of formal methods to verify the correctness of digital circuits is less constrained by the growing complexity of digital circuits than conventional methods based on exhaustive simulation. This paper briefly outlines three main approaches to formal hardware verification: symbolic simulation, state machine analysis, and theorem-proving.
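    To make the contrast with exhaustive simulation concrete, here is a brute-force equivalence check of two three-input combinational functions. Both functions are illustrative, and this enumeration over all 2^n input vectors is precisely the approach that stops scaling as circuits grow, which is what motivates symbolic and theorem-proving techniques.

```python
from itertools import product

def spec(a, b, c):
    # Reference behavior: the three-input majority function.
    return (a and b) or (b and c) or (a and c)

def impl(a, b, c):
    # A candidate gate-level rewrite (assumed for illustration):
    # a AND (b OR c), OR (b AND c).
    return (a and (b or c)) or (b and c)

def equivalent(f, g, n_inputs):
    # Exhaustive simulation: compare outputs on every input vector.
    # Cost is 2**n_inputs evaluations, which is why this does not
    # scale to realistic circuits with hundreds of inputs.
    return all(bool(f(*bits)) == bool(g(*bits))
               for bits in product([0, 1], repeat=n_inputs))
```

    Formal approaches such as symbolic simulation reason about all inputs at once (e.g. with BDDs or SAT), avoiding this exponential enumeration.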

  3. 9 CFR 416.17 - Agency verification.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... 9 Animals and Animal Products 2 2012-01-01 2012-01-01 false Agency verification. 416.17 Section 416.17 Animals and Animal Products FOOD SAFETY AND INSPECTION SERVICE, DEPARTMENT OF AGRICULTURE... (d) Direct observation or testing to assess the sanitary conditions in the establishment. ...

  4. 9 CFR 416.17 - Agency verification.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... 9 Animals and Animal Products 2 2014-01-01 2014-01-01 false Agency verification. 416.17 Section 416.17 Animals and Animal Products FOOD SAFETY AND INSPECTION SERVICE, DEPARTMENT OF AGRICULTURE... (d) Direct observation or testing to assess the sanitary conditions in the establishment. ...

  5. 9 CFR 416.17 - Agency verification.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... 9 Animals and Animal Products 2 2011-01-01 2011-01-01 false Agency verification. 416.17 Section 416.17 Animals and Animal Products FOOD SAFETY AND INSPECTION SERVICE, DEPARTMENT OF AGRICULTURE... (d) Direct observation or testing to assess the sanitary conditions in the establishment. ...

  6. 42 CFR 457.380 - Eligibility verification.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 42 Public Health 4 2010-10-01 2010-10-01 false Eligibility verification. 457.380 Section 457.380 Public Health CENTERS FOR MEDICARE & MEDICAID SERVICES, DEPARTMENT OF HEALTH AND HUMAN SERVICES (CONTINUED) STATE CHILDREN'S HEALTH INSURANCE PROGRAMS (SCHIPs) ALLOTMENTS AND GRANTS TO STATES State...

  7. 42 CFR 457.380 - Eligibility verification.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... 42 Public Health 4 2011-10-01 2011-10-01 false Eligibility verification. 457.380 Section 457.380 Public Health CENTERS FOR MEDICARE & MEDICAID SERVICES, DEPARTMENT OF HEALTH AND HUMAN SERVICES (CONTINUED) STATE CHILDREN'S HEALTH INSURANCE PROGRAMS (SCHIPs) ALLOTMENTS AND GRANTS TO STATES State...

  8. ENVIRONMENTAL TECHNOLOGY VERIFICATION OF BAGHOUSE FILTRATION PRODUCTS

    EPA Science Inventory

    The Environmental Technology Verification Program (ETV) was started by EPA in 1995 to generate independent credible data on the performance of innovative technologies that have potential to improve protection of public health and the environment. ETV does not approve or certify p...

  9. Verification of method performance for clinical laboratories.

    PubMed

    Nichols, James H

    2009-01-01

    Method verification, a one-time process to determine performance characteristics before a test system is utilized for patient testing, is often confused with method validation, establishing the performance of a new diagnostic tool such as an internally developed or modified method. A number of international quality standards (International Organization for Standardization (ISO) and Clinical Laboratory Standards Institute (CLSI)), accreditation agency guidelines (College of American Pathologists (CAP), Joint Commission, U.K. Clinical Pathology Accreditation (CPA)), and regional laws (Clinical Laboratory Improvement Amendments of 1988 (CLIA'88)) exist describing the requirements for method verification and validation. Consumers of marketed test kits should verify method accuracy, precision, analytic measurement range, and the appropriateness of reference intervals to the institution's patient population. More extensive validation may be required for new methods and those manufacturer methods that have been modified by the laboratory, including analytic sensitivity and specificity. This manuscript compares the various recommendations for method verification and discusses the CLSI evaluation protocols (EP) that are available to guide laboratories in performing method verification experiments.
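    A simplified sketch of one such verification experiment follows: estimating imprecision (CV%) and bias from replicate measurements of a reference material. The acceptance limits are illustrative assumptions, not figures from any specific CLSI EP document; real protocols prescribe replicate counts, days, and statistically derived limits.

```python
import statistics

def verify_precision_and_bias(replicates, reference_value,
                              max_cv_pct=5.0, max_bias_pct=3.0):
    # Estimate imprecision as the coefficient of variation (CV%) and
    # bias as the percent deviation of the mean from the assigned
    # reference value. The limits are hypothetical acceptance criteria.
    mean = statistics.mean(replicates)
    cv_pct = 100.0 * statistics.stdev(replicates) / mean
    bias_pct = 100.0 * (mean - reference_value) / reference_value
    return {
        "mean": mean,
        "cv_pct": cv_pct,
        "bias_pct": bias_pct,
        "acceptable": cv_pct <= max_cv_pct and abs(bias_pct) <= max_bias_pct,
    }
```

    A laboratory would run such a check per analyte and compare the observed CV% and bias against the manufacturer's claims and its own acceptance criteria before releasing the method for patient testing.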

  10. 18 CFR 33.7 - Verification.

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... 18 Conservation of Power and Water Resources 1 2014-04-01 2014-04-01 false Verification. 33.7 Section 33.7 Conservation of Power and Water Resources FEDERAL ENERGY REGULATORY COMMISSION, DEPARTMENT OF... thereto and having knowledge of the matters therein set forth, and must be verified under oath. ...

  11. 18 CFR 33.7 - Verification.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... 18 Conservation of Power and Water Resources 1 2010-04-01 2010-04-01 false Verification. 33.7 Section 33.7 Conservation of Power and Water Resources FEDERAL ENERGY REGULATORY COMMISSION, DEPARTMENT OF... thereto and having knowledge of the matters therein set forth, and must be verified under oath. ...

  12. 18 CFR 34.8 - Verification.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... 18 Conservation of Power and Water Resources 1 2011-04-01 2011-04-01 false Verification. 34.8 Section 34.8 Conservation of Power and Water Resources FEDERAL ENERGY REGULATORY COMMISSION, DEPARTMENT OF... applicant, who has knowledge of the matters set forth therein, and it shall be verified under oath...

  13. 18 CFR 33.7 - Verification.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... 18 Conservation of Power and Water Resources 1 2011-04-01 2011-04-01 false Verification. 33.7 Section 33.7 Conservation of Power and Water Resources FEDERAL ENERGY REGULATORY COMMISSION, DEPARTMENT OF... thereto and having knowledge of the matters therein set forth, and must be verified under oath. ...

  14. 18 CFR 33.7 - Verification.

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... 18 Conservation of Power and Water Resources 1 2012-04-01 2012-04-01 false Verification. 33.7 Section 33.7 Conservation of Power and Water Resources FEDERAL ENERGY REGULATORY COMMISSION, DEPARTMENT OF... thereto and having knowledge of the matters therein set forth, and must be verified under oath. ...

  15. 18 CFR 34.8 - Verification.

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... 18 Conservation of Power and Water Resources 1 2014-04-01 2014-04-01 false Verification. 34.8 Section 34.8 Conservation of Power and Water Resources FEDERAL ENERGY REGULATORY COMMISSION, DEPARTMENT OF... applicant, who has knowledge of the matters set forth therein, and it shall be verified under oath...

  16. Environmental Technology Verification (ETV) Quality Program (Poster)

    EPA Science Inventory

    This is a poster created for the ETV Quality Program. The EPA Environmental Technology Verification Program (ETV) develops test protocols and verifies the performance of innovative technologies that have the potential to improve protection of human health and the environment. The...

  17. 2017 EPA Protocol Gas Verification Program Participants

    EPA Pesticide Factsheets

    A list of participants for 2016 EPA's Protocol Gas Verification Program (PGVP) for stationary source monitoring. The list also has vendor IDs, which are production site-specific, and are the same ones used in the PGVP for ambient air monitoring.

  18. Learner Verification: A Publisher's Case Study.

    ERIC Educational Resources Information Center

    Wilson, George

    Learner verification, a process by which publishers monitor the effectiveness of their products and strive to improve their services to schools, is a practice that most companies take seriously. The quality of educational materials may be ensured in many ways: by analysis of sales, through firsthand investigation, and by employing a system of…

  20. Gender, Legitimation, and Identity Verification in Groups

    ERIC Educational Resources Information Center

    Burke, Peter J.; Stets, Jan E.; Cerven, Christine

    2007-01-01

    Drawing upon identity theory, expectation states theory, and legitimation theory, we examine how the task leader identity in task-oriented groups is more likely to be verified for persons with high status characteristics. We hypothesize that identity verification will be accomplished more readily for male group members and legitimated task leaders…

  1. The Verification Guide, 1998-99.

    ERIC Educational Resources Information Center

    Office of Postsecondary Education, Washington DC. Student Financial Assistance Programs.

    This guide is intended to assist financial aid administrators at postsecondary education institutions in completing verification, the process of checking the accuracy of the information students provide when they apply for financial aid under student financial assistance (SFA) programs administered by the U.S. Department of Education. The first…

  2. The Assembly, Integration, and Verification (AIV) team

    NASA Astrophysics Data System (ADS)

    2009-06-01

    Assembly, Integration, and Verification (AIV) is the process by which the software and hardware deliveries from the distributed ALMA partners (North America, South America, Europe, and East Asia) are assembled and integrated into a working system, and the initial technical capabilities are tested to ensure that they will meet the observatory's exacting requirements for science.

  3. The ALMA Commissioning and Science Verification Team

    NASA Astrophysics Data System (ADS)

    Hales, A.; Sheth, K.; Wilson, T. L.

    2010-04-01

    The goal of Commissioning is to take ALMA from the stage reached at the end of AIV (a system that functions at an engineering level) to an instrument that meets the science and astronomy requirements. Science Verification is the quantitative confirmation that the data produced by the instrument are valid and have the required characteristics in terms of sensitivity, image quality, and accuracy.

  4. An Interactive System for Graduation Verification.

    ERIC Educational Resources Information Center

    Wang, Y.; Dasarathy, B.

    1981-01-01

    This description of a computerized graduation verification system developed and implemented at the University of South Carolina at Columbia discusses the "table-driven" feature of the programs and details the implementation of the system, including examples of the Extended Backus Naur Form (EBNF) notation used to represent the system…

  5. 18 CFR 34.8 - Verification.

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ... SECURITIES OR THE ASSUMPTION OF LIABILITIES § 34.8 Verification. Link to an amendment published at 70 FR.... Effective Date Note: At 70 FR 35375, June 20, 2005, § 34.8 was revised, effective at the time of the next e...

  6. 18 CFR 34.8 - Verification.

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... SECURITIES OR THE ASSUMPTION OF LIABILITIES § 34.8 Verification. Link to an amendment published at 70 FR.... Effective Date Note: At 70 FR 35375, June 20, 2005, § 34.8 was revised, effective at the time of the next e...

  7. Synthesis and verification of analog integrated circuits

    NASA Astrophysics Data System (ADS)

    Antao, Brian Anthony Amacio

    This dissertation addresses analog integrated circuit design from a systems perspective. An analog behavioral level is characterized in terms of transfer functions and state-space models; a behavioral synthesis process then defines high-level architectures independent of circuit implementation technologies. The concept of an intermediate architecture is introduced to synthesize the behavioral architectures into technology-specific architectures. The two levels provide for system-level design and design space exploration, which allows tradeoffs between various architectures and implementation technologies. A customized behavioral simulation methodology is developed for architectural verification. The behavioral simulator is implemented in a circuit simulation paradigm with time- and frequency-domain analysis, steady-state analysis, and sensitivity and distortion analysis. The verification process ensures that the behavioral specifications are met and provides a set of metrics, such as sensitivities, for design space exploration. The synthesis and verification functionalities are implemented as the two software modules ARCHGEN and ARCHSIM, which are coupled into an integrated synthesis and verification framework. A C++ based Architecture Specification Language is devised for hardware description. The implementation is open-ended, making it possible to apply software synthesis techniques to customize the tools to specific classes of analog systems. A constrained nonlinear optimization approach is formulated for circuit-level design of MOS integrated circuits. A C++ based circuit description language is developed for defining the analytical design models, i.e., describing circuit topologies and performance functions for the optimizer.

  8. PROMOTING AIR QUALITY THROUGH ENVIRONMENTAL TECHNOLOGY VERIFICATION

    EPA Science Inventory

    The paper discusses the promotion of improved air quality through environmental technology verifications (ETVs). In 1995, the U.S. EPA's Office of Research and Development began the ETV Program in response to President Clinton's "Bridge to a Sustainable Future" and Vice Presiden...

  9. ENVIRONMENTAL TECHNOLOGY VERIFICATION OF BAGHOUSE FILTRATION PRODUCTS

    EPA Science Inventory

    The Environmental Technology Verification Program (ETV) was started by EPA in 1995 to generate independent credible data on the performance of innovative technologies that have potential to improve protection of public health and the environment. ETV does not approve or certify p...

  10. ENVIRONMENTAL TECHNOLOGY VERIFICATION OF URBAN RUNOFF MODELS

    EPA Science Inventory

    This paper will present the verification process and available results of the XP-SWMM modeling system produced by XP-Software conducted unde the USEPA's ETV Program. Wet weather flow (WWF) models are used throughout the US for the evaluation of storm and combined sewer systems. M...

  11. Symbolic Model Checking for Sequential Circuit Verification

    DTIC Science & Technology

    1993-07-15

    Emerson, and Sistla [17] is modified to represent state graphs using binary decision diagrams (BDDs) [7] and partitioned transition relations [10, 11]. Because...Verlag. [17] E. M. Clarke, E. A. Emerson, and A. P. Sistla. Automatic verification of finite-state concurrent systems using temporal logic
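    The record above concerns symbolic model checking, where sets of states and the transition relation are manipulated symbolically (as BDDs in the cited work). As a minimal illustration of the underlying fixpoint computation only, the sketch below uses plain Python sets in place of BDDs; the tiny state graph and the function names are invented for the example, not taken from the paper.

    ```python
    # Reachability as a least fixpoint, the core iteration of symbolic
    # model checking. Real implementations encode the state sets and the
    # transition relation as BDDs; explicit sets stand in for them here.

    def image(states, transitions):
        """Successors of a set of states under a transition relation."""
        return {t for (s, t) in transitions if s in states}

    def reachable(initial, transitions):
        """All states reachable from `initial` (fixpoint iteration)."""
        reached = set(initial)
        frontier = set(initial)
        while frontier:
            # Add newly discovered states until nothing new appears.
            frontier = image(frontier, transitions) - reached
            reached |= frontier
        return reached

    # Tiny 4-state example: 0 -> 1 -> 2 (self-loop at 2); state 3 is isolated.
    trans = {(0, 1), (1, 2), (2, 2)}
    print(sorted(reachable({0}, trans)))  # [0, 1, 2]
    ```

    With a BDD encoding, the same loop runs over boolean formulas rather than enumerated states, which is what makes very large state graphs tractable.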

  12. Distilling the Verification Process for Prognostics Algorithms

    NASA Technical Reports Server (NTRS)

    Roychoudhury, Indranil; Saxena, Abhinav; Celaya, Jose R.; Goebel, Kai

    2013-01-01

    The goal of prognostics and health management (PHM) systems is to ensure system safety, and reduce downtime and maintenance costs. It is important that a PHM system is verified and validated before it can be successfully deployed. Prognostics algorithms are integral parts of PHM systems. This paper investigates a systematic process of verification of such prognostics algorithms. To this end, first, this paper distinguishes between technology maturation and product development. Then, the paper describes the verification process for a prognostics algorithm as it moves up to higher maturity levels. This process is shown to be an iterative process where verification activities are interleaved with validation activities at each maturation level. In this work, we adopt the concept of technology readiness levels (TRLs) to represent the different maturity levels of a prognostics algorithm. It is shown that at each TRL, the verification of a prognostics algorithm depends on verifying the different components of the algorithm according to the requirements laid out by the PHM system that adopts this prognostics algorithm. Finally, using simplified examples, the systematic process for verifying a prognostics algorithm is demonstrated as the prognostics algorithm moves up TRLs.

  13. 20 CFR 325.6 - Verification procedures.

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... Employees' Benefits RAILROAD RETIREMENT BOARD REGULATIONS UNDER THE RAILROAD UNEMPLOYMENT INSURANCE ACT REGISTRATION FOR RAILROAD UNEMPLOYMENT BENEFITS § 325.6 Verification procedures. The Board's procedures for adjudicating and processing applications and claims for unemployment benefits filed pursuant to this part...

  14. 20 CFR 325.6 - Verification procedures.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... Employees' Benefits RAILROAD RETIREMENT BOARD REGULATIONS UNDER THE RAILROAD UNEMPLOYMENT INSURANCE ACT REGISTRATION FOR RAILROAD UNEMPLOYMENT BENEFITS § 325.6 Verification procedures. The Board's procedures for adjudicating and processing applications and claims for unemployment benefits filed pursuant to this part...

  15. 20 CFR 325.6 - Verification procedures.

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ... Employees' Benefits RAILROAD RETIREMENT BOARD REGULATIONS UNDER THE RAILROAD UNEMPLOYMENT INSURANCE ACT REGISTRATION FOR RAILROAD UNEMPLOYMENT BENEFITS § 325.6 Verification procedures. The Board's procedures for adjudicating and processing applications and claims for unemployment benefits filed pursuant to this part...

  16. 20 CFR 325.6 - Verification procedures.

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... Employees' Benefits RAILROAD RETIREMENT BOARD REGULATIONS UNDER THE RAILROAD UNEMPLOYMENT INSURANCE ACT REGISTRATION FOR RAILROAD UNEMPLOYMENT BENEFITS § 325.6 Verification procedures. The Board's procedures for adjudicating and processing applications and claims for unemployment benefits filed pursuant to this part...

  17. 42 CFR 457.380 - Eligibility verification.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... 42 Public Health 4 2012-10-01 2012-10-01 false Eligibility verification. 457.380 Section 457.380 Public Health CENTERS FOR MEDICARE & MEDICAID SERVICES, DEPARTMENT OF HEALTH AND HUMAN SERVICES (CONTINUED) STATE CHILDREN'S HEALTH INSURANCE PROGRAMS (SCHIPs) ALLOTMENTS AND GRANTS TO STATES State...

  18. 16 CFR 315.5 - Prescriber verification.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... Commercial Practices FEDERAL TRADE COMMISSION REGULATIONS UNDER SPECIFIC ACTS OF CONGRESS CONTACT LENS RULE § 315.5 Prescriber verification. (a) Prescription requirement. A seller may sell contact lenses only in accordance with a contact lens prescription for the patient that is: (1) Presented to the seller by...

  19. 16 CFR 315.5 - Prescriber verification.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... Commercial Practices FEDERAL TRADE COMMISSION REGULATIONS UNDER SPECIFIC ACTS OF CONGRESS CONTACT LENS RULE § 315.5 Prescriber verification. (a) Prescription requirement. A seller may sell contact lenses only in accordance with a contact lens prescription for the patient that is: (1) Presented to the seller by...

  20. 16 CFR 315.5 - Prescriber verification.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... Commercial Practices FEDERAL TRADE COMMISSION REGULATIONS UNDER SPECIFIC ACTS OF CONGRESS CONTACT LENS RULE § 315.5 Prescriber verification. (a) Prescription requirement. A seller may sell contact lenses only in accordance with a contact lens prescription for the patient that is: (1) Presented to the seller by...

  1. 16 CFR 315.5 - Prescriber verification.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... Commercial Practices FEDERAL TRADE COMMISSION REGULATIONS UNDER SPECIFIC ACTS OF CONGRESS CONTACT LENS RULE § 315.5 Prescriber verification. (a) Prescription requirement. A seller may sell contact lenses only in accordance with a contact lens prescription for the patient that is: (1) Presented to the seller by...

  2. 21 CFR 123.8 - Verification.

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... 21 Food and Drugs 2 2012-04-01 2012-04-01 false Verification. 123.8 Section 123.8 Food and Drugs FOOD AND DRUG ADMINISTRATION, DEPARTMENT OF HEALTH AND HUMAN SERVICES (CONTINUED) FOOD FOR HUMAN... processor shall verify that the HACCP plan is adequate to control food safety hazards that are...

  3. PROMOTING AIR QUALITY THROUGH ENVIRONMENTAL TECHNOLOGY VERIFICATION

    EPA Science Inventory

    The paper discusses the promotion of improved air quality through environmental technology verifications (ETVs). In 1995, the U.S. EPA's Office of Research and Development began the ETV Program in response to President Clinton's "Bridge to a Sustainable Future" and Vice Presiden...

  4. 20 CFR 325.6 - Verification procedures.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... Employees' Benefits RAILROAD RETIREMENT BOARD REGULATIONS UNDER THE RAILROAD UNEMPLOYMENT INSURANCE ACT REGISTRATION FOR RAILROAD UNEMPLOYMENT BENEFITS § 325.6 Verification procedures. The Board's procedures for adjudicating and processing applications and claims for unemployment benefits filed pursuant to this part will...

  5. 10 CFR 300.11 - Independent verification.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... verifiers, and has been empowered to make decisions relevant to the provision of a verification statement... methods; and (v) Risk assessment and methodologies and materiality analysis procedures outlined by other... Accreditation Board program for Environmental Management System auditors (ANSI-RAB-EMS); Board of Environmental...

  6. 10 CFR 300.11 - Independent verification.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... verifiers, and has been empowered to make decisions relevant to the provision of a verification statement... methods; and (v) Risk assessment and methodologies and materiality analysis procedures outlined by other... Accreditation Board program for Environmental Management System auditors (ANSI-RAB-EMS); Board of Environmental...

  7. Verification of Autonomous Systems for Space Applications

    NASA Technical Reports Server (NTRS)

    Brat, G.; Denney, E.; Giannakopoulou, D.; Frank, J.; Jonsson, A.

    2006-01-01

    Autonomous software, especially if it is model-based, can play an important role in future space applications. For example, it can help streamline ground operations, assist in autonomous rendezvous and docking operations, or even help recover from problems (e.g., planners can be used to explore the space of recovery actions for a power subsystem and implement a solution without, or with minimal, human intervention). In general, the exploration capabilities of model-based systems give them great flexibility. Unfortunately, this also makes them unpredictable to our human eyes, both in terms of their execution and their verification. Traditional verification techniques are inadequate for these systems since they are mostly based on testing, which implies a very limited exploration of their behavioral space. In our work, we explore how advanced V&V techniques, such as static analysis, model checking, and compositional verification, can be used to gain trust in model-based systems. We also describe how synthesis can be used in the context of system reconfiguration and in the context of verification.

  8. 42 CFR 457.380 - Eligibility verification.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... 42 Public Health 4 2014-10-01 2014-10-01 false Eligibility verification. 457.380 Section 457.380 Public Health CENTERS FOR MEDICARE & MEDICAID SERVICES, DEPARTMENT OF HEALTH AND HUMAN SERVICES (CONTINUED) STATE CHILDREN'S HEALTH INSURANCE PROGRAMS (SCHIPs) ALLOTMENTS AND GRANTS TO STATES State...

  9. Seven Years of ACTS Technology Verification Experiments Reviewed

    NASA Technical Reports Server (NTRS)

    Acosta, Roberto J.; Johnson, Sandra K.; McEntee, Kathleen M.; Gauntner, William; Feliciano, Walber

    2000-01-01

    The Advanced Communications Technology Satellite (ACTS) was designed to achieve a 99.5-percent system availability rate and signals with less than one error in 10^7 bits throughout the continental United States. To accomplish such a high rate of system availability, ACTS uses multiple narrow hopping beams and very small aperture terminal (VSAT) technology. In addition, ACTS uses an adaptive rain fade compensation protocol to reduce the negative effects of propagation on the system. To enhance knowledge of how propagation and system variances affect system availability, researchers at the NASA Glenn Research Center at Lewis Field performed technology verification experiments over a 7-year period (from September 1993 to the present). These experiments include T1 VSAT System Availability, Statistical Rain Fade Compensation Characterization, Statistical Characterization of Ka-Band Propagation Effects on Communication Link Performance, and Multibeam Antenna Performance.

  10. Digital video system for on-line portal verification

    NASA Astrophysics Data System (ADS)

    Leszczynski, Konrad W.; Shalev, Shlomo; Cosby, N. Scott

    1990-07-01

    A digital system has been developed for on-line acquisition, processing and display of portal images during radiation therapy treatment. A metal/phosphor screen combination is the primary detector, where the conversion from high-energy photons to visible light takes place. A mirror angled at 45 degrees reflects the primary image to a low-light-level camera, which is removed from the direct radiation beam. The image registered by the camera is digitized, processed and displayed on a CRT monitor. Advanced digital techniques for processing of on-line images have been developed and implemented to enhance image contrast and suppress the noise. Some elements of automated radiotherapy treatment verification have been introduced.

  11. The AdaptiV Approach to Verification of Adaptive Systems

    SciTech Connect

    Rouff, Christopher; Buskens, Richard; Pullum, Laura L; Cui, Xiaohui; Hinchey, Mike

    2012-01-01

    Adaptive systems are critical for future space and other unmanned and intelligent systems. Verification of these systems is also critical for their use in systems with potential harm to human life or with large financial investments. Due to their nondeterministic nature and extremely large state space, current methods for verification of software systems are not adequate to provide a high level of assurance. The combination of stabilization science, high performance computing simulations, compositional verification and traditional verification techniques, plus operational monitors, provides a complete approach to verification and deployment of adaptive systems that has not been used before. This paper gives an overview of this approach.

  12. On the role of code comparisons in verification and validation.

    SciTech Connect

    Oberkampf, William Louis; Trucano, Timothy Guy; Pilch, Martin M.

    2003-08-01

    This report presents a perspective on the role of code comparison activities in verification and validation. We formally define the act of code comparison as the Code Comparison Principle (CCP) and investigate its application in both verification and validation. One of our primary conclusions is that the use of code comparisons for validation is improper and dangerous. We also conclude that while code comparisons may be argued to provide a beneficial component in code verification activities, there are higher quality code verification tasks that should take precedence. Finally, we provide a process for application of the CCP that we believe is minimal for achieving benefit in verification processes.

  13. Monitoring, reporting and verification for national REDD + programmes: two proposals

    NASA Astrophysics Data System (ADS)

    Herold, Martin; Skutsch, Margaret

    2011-01-01

    Different options have been suggested by Parties to the UNFCCC (United Nations Framework Convention on Climate Change) for inclusion in national approaches to REDD and REDD + (reduced deforestation, reduced degradation, enhancement of forest carbon stocks, sustainable management of forest, and conservation of forest carbon stocks). This paper proposes that from the practical and technical points of view of designing action for REDD and REDD + at local and sub-national level, as well as from the point of view of the necessary MRV (monitoring, reporting and verification), these should be grouped into three categories: conservation, which is rewarded on the basis of no changes in forest stock; reduced deforestation, in which lowered rates of forest area loss are rewarded; and positive impacts on carbon stock changes in forests remaining forest, which includes reduced degradation, sustainable management of forest of various kinds, and forest enhancement. Thus we have moved degradation, which conventionally is grouped with deforestation, into the forest management group reported as areas remaining forest land, with which it has in reality, and particularly as regards MRV, much more in common. Secondly, in the context of the fact that REDD/REDD + is to take the form of a national or near-national approach, we argue that while systematic national monitoring is important, it may not be necessary for REDD/REDD + activities, or for national MRV, to be started at equal levels of intensity all over the country. Rather, areas where interventions seem easiest to start may be targeted, and here data measurements may be more rigorous (Tier 3), for example based on stakeholder self-monitoring with independent verification, while in other, untreated areas, a lower level of monitoring may be pursued, at least in the first instance. Treated areas may be targeted for any of the three groups of activities (conservation, reduced deforestation, and positive impact on carbon stock increases in

  14. LIVVkit: An extensible, python-based, land ice verification and validation toolkit for ice sheet models

    DOE PAGES

    Kennedy, Joseph H.; Bennett, Andrew R.; Evans, Katherine J.; ...

    2017-03-23

    To address the pressing need to better understand the behavior and complex interaction of ice sheets within the global Earth system, significant development of continental-scale, dynamical ice sheet models is underway. Concurrent to the development of the Community Ice Sheet Model (CISM), the corresponding verification and validation (V&V) process is being coordinated through a new, robust, Python-based extensible software package, the Land Ice Verification and Validation toolkit (LIVVkit). Incorporated into the typical ice sheet model development cycle, it provides robust and automated numerical verification, software verification, performance validation, and physical validation analyses on a variety of platforms, from personal laptops to the largest supercomputers. LIVVkit operates on sets of regression test and reference data sets, and provides comparisons for a suite of community prioritized tests, including configuration and parameter variations, bit-for-bit evaluation, and plots of model variables to indicate where differences occur. LIVVkit also provides an easily extensible framework to incorporate and analyze results of new intercomparison projects, new observation data, and new computing platforms. LIVVkit is designed for quick adaptation to additional ice sheet models via abstraction of model specific code, functions, and configurations into an ice sheet model description bundle outside the main LIVVkit structure. Furthermore, through shareable and accessible analysis output, LIVVkit is intended to help developers build confidence in their models and enhance the credibility of ice sheet models overall.

  15. RADI's Airborne X-SAR with High Resolution: Performance, Characterization and Verification

    NASA Astrophysics Data System (ADS)

    Shen, T.; Li, J.; Wang, Z. R.; Huang, L.

    2016-11-01

    X-SAR is an airborne multi-mode synthetic aperture radar (SAR) system with high resolution, interferometry and full polarization, developed by the Institute of Remote Sensing and Digital Earth (RADI), Chinese Academy of Sciences (CAS), and funded by the CAS Large Research Infrastructures. Since 2009, the first development stage of the X-SAR system has been implemented as an operational SAR with high resolution (up to 0.5 meter). In May 2013, imaging verification flight tests were carried out. Data calibration based on the laboratory measurements was completed at the end of 2015. Many valuable results of imaging verification and data calibration have demonstrated the system's quantitative microwave measurement capabilities. This paper presents the results of X-SAR system performance, characterization, optimization, and verification as carried out during the flight trials and laboratory measurements. System performance and calibration parameters are presented, such as transmitter amplitude accuracy, phase noise, system gain change with temperature variation, and long-term radiometric stability. Imaging verification of the key performance parameters is discussed, including target-response function, target-pair discrimination, image noise, and radiometric resolution. Example imagery of radiometrically enhanced products for intensity change detection is also described.

  16. LIVVkit: An extensible, python-based, land ice verification and validation toolkit for ice sheet models

    NASA Astrophysics Data System (ADS)

    Kennedy, Joseph H.; Bennett, Andrew R.; Evans, Katherine J.; Price, Stephen; Hoffman, Matthew; Lipscomb, William H.; Fyke, Jeremy; Vargo, Lauren; Boghozian, Adrianna; Norman, Matthew; Worley, Patrick H.

    2017-06-01

    To address the pressing need to better understand the behavior and complex interaction of ice sheets within the global Earth system, significant development of continental-scale, dynamical ice sheet models is underway. Concurrent to the development of the Community Ice Sheet Model (CISM), the corresponding verification and validation (V&V) process is being coordinated through a new, robust, Python-based extensible software package, the Land Ice Verification and Validation toolkit (LIVVkit). Incorporated into the typical ice sheet model development cycle, it provides robust and automated numerical verification, software verification, performance validation, and physical validation analyses on a variety of platforms, from personal laptops to the largest supercomputers. LIVVkit operates on sets of regression test and reference data sets, and provides comparisons for a suite of community prioritized tests, including configuration and parameter variations, bit-for-bit evaluation, and plots of model variables to indicate where differences occur. LIVVkit also provides an easily extensible framework to incorporate and analyze results of new intercomparison projects, new observation data, and new computing platforms. LIVVkit is designed for quick adaptation to additional ice sheet models via abstraction of model specific code, functions, and configurations into an ice sheet model description bundle outside the main LIVVkit structure. Ultimately, through shareable and accessible analysis output, LIVVkit is intended to help developers build confidence in their models and enhance the credibility of ice sheet models overall.
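    Both LIVVkit records above mention bit-for-bit evaluation of regression outputs among the provided comparisons. As an illustration of the general idea only (exact byte comparison with a first-difference report), the sketch below is not LIVVkit's actual API; the function name and file layout are invented.

    ```python
    # Sketch of a bit-for-bit regression check: hash first for a fast
    # exact-match test, then locate the first differing byte on mismatch.
    import hashlib

    def bit_for_bit(test_bytes, ref_bytes):
        """Return (True, None) on exact match, else (False, first diff offset)."""
        if hashlib.sha256(test_bytes).digest() == hashlib.sha256(ref_bytes).digest():
            return True, None
        n = min(len(test_bytes), len(ref_bytes))
        for i in range(n):
            if test_bytes[i] != ref_bytes[i]:
                return False, i
        return False, n  # identical prefix, differing lengths

    ref = bytes([0, 1, 2, 3])
    print(bit_for_bit(ref, ref))                  # (True, None)
    print(bit_for_bit(bytes([0, 1, 9, 3]), ref))  # (False, 2)
    ```

    In a real toolkit the byte streams would come from model output files, and a failed bit-for-bit check would typically trigger the statistical and plotting comparisons the abstract describes.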

  17. Validation and verification of expert systems

    NASA Technical Reports Server (NTRS)

    Gilstrap, Lewey

    1991-01-01

    Validation and verification (V&V) are procedures used to evaluate system structure or behavior with respect to a set of requirements. Although expert systems are often developed as a series of prototypes without requirements, it is not possible to perform V&V on any system for which requirements have not been prepared. In addition, there are special problems associated with the evaluation of expert systems that do not arise in the evaluation of conventional systems, such as verification of the completeness and accuracy of the knowledge base. The criticality of most NASA missions makes it important to be able to certify the performance of the expert systems used to support these missions. Recommendations for the most appropriate method for integrating V&V into the Expert System Development Methodology (ESDM) and suggestions for the most suitable approaches for each stage of ESDM development are presented.

  18. On random transformations for changeable face verification.

    PubMed

    Wang, Yongjin; Hatzinakos, Dimitrios

    2011-06-01

    The generation of changeable and privacy-preserving biometric templates is important for the pervasive deployment of biometric technology in a wide variety of applications. This paper presents a systematic analysis of random transformation-based methods for addressing the changeability and privacy problems in biometrics-based verification systems. The proposed methods transform the original biometric feature vectors using random transformations, and the sorted index numbers (SIN) of the resulting vectors in the transformed domain are stored as the biometric templates. Three types of random transformations, namely, random additive transform, random multiplicative transform, and random projection, are discussed and analyzed. The random transformations, in combination with the SIN approach, constitute repeatable and noninvertible transformations; hence, the generated templates are changeable and provide privacy protection. The effectiveness of the proposed methods is well supported by both detailed analysis and extensive experimentation on a face verification problem.
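    The abstract above describes storing the sorted index numbers (SIN) of a randomly transformed feature vector as the template. The sketch below illustrates that idea under stated assumptions: the projection dimension, the Gaussian random projection keyed by a user-specific seed, and the function name `sin_template` are all illustrative, not the paper's implementation.

    ```python
    # SIN-style changeable template: random projection of the feature
    # vector, then keep only the sort order of the projected values.
    import random

    def sin_template(features, seed, out_dim=8):
        rng = random.Random(seed)  # user-specific key -> repeatable transform
        # Random projection: out_dim random Gaussian directions in R^d.
        projected = [
            sum(rng.gauss(0.0, 1.0) * x for x in features)
            for _ in range(out_dim)
        ]
        # The template is the ranking (sorted index numbers), not the
        # values, so the original features are not directly recoverable,
        # and reissuing a new seed yields a new, unlinkable template.
        return tuple(sorted(range(out_dim), key=projected.__getitem__))

    v = [0.2, 1.4, -0.7, 0.9]
    t1 = sin_template(v, seed=42)
    t2 = sin_template([x + 0.001 for x in v], seed=42)  # near-duplicate sample
    print(t1 == t2)  # small perturbations tend to preserve the ranking
    ```

    Changeability comes from the seed: if a template is compromised, a new seed produces a fresh template from the same biometric features.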

  19. Verification tests for contaminant transport codes

    SciTech Connect

    Rowe, R.K.; Nadarajah, P.

    1996-12-31

    The importance of verifying contaminant transport codes and the techniques that may be used in this verification process are discussed. Commonly used contaminant transport codes are characterized as belonging to one of several types or classes of solution, such as analytic, finite layer, boundary element, finite difference and finite element. Both the level of approximation and the solution methodology should be verified for each contaminant transport code. One powerful method that may be used in contaminant transport code verification is cross-checking (benchmarking) with other codes. This technique is used to check the results of codes from one solution class with the results of codes from another solution class. In this paper cross-checking is performed for three classes of solution; these are, analytic, finite layer, and finite element.
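    The cross-checking (benchmarking) idea described above can be illustrated by solving the same problem with two different solution classes and comparing the results. The sketch below benchmarks an explicit finite-difference solution of 1D diffusive contaminant transport against the closed-form erfc solution of the same problem; all parameter values and the grid setup are illustrative assumptions.

    ```python
    # Cross-check two solution classes on 1D diffusion from a constant-
    # concentration boundary: C(x,t) = C0 * erfc(x / (2*sqrt(D*t))) on a
    # semi-infinite domain, versus an explicit finite-difference scheme.
    import math

    D, C0 = 1e-9, 1.0            # diffusion coefficient (m^2/s), source conc.
    dx, nx = 0.005, 200          # grid spacing (m), number of nodes (1 m domain)
    dt = 0.4 * dx**2 / D         # explicit time step; r = D*dt/dx^2 = 0.4 <= 0.5
    steps = int(1e6 / dt)        # advance to t ~ 1e6 s (~11.6 days)

    # Finite-difference solution (far boundary held at 0 approximates
    # the semi-infinite domain while the plume stays short of it).
    c = [0.0] * nx
    c[0] = C0
    for _ in range(steps):
        nxt = c[:]
        for i in range(1, nx - 1):
            nxt[i] = c[i] + 0.4 * (c[i + 1] - 2 * c[i] + c[i - 1])
        c = nxt

    # Analytic solution at the same time, evaluated on the same grid.
    t = steps * dt
    analytic = [C0 * math.erfc(i * dx / (2 * math.sqrt(D * t))) for i in range(nx)]
    err = max(abs(a - b) for a, b in zip(c, analytic))
    print(f"max discrepancy between solution classes: {err:.4f}")
    ```

    Agreement within the expected discretization error supports both the level of approximation and the solution methodology; a large discrepancy would flag a defect in one of the codes or an inconsistency in the problem setup.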

  20. Sensor-fusion-based biometric identity verification

    SciTech Connect

    Carlson, J.J.; Bouchard, A.M.; Osbourn, G.C.; Martinez, R.F.; Bartholomew, J.W.; Jordan, J.B.; Flachs, G.M.; Bao, Z.; Zhu, L.

    1998-02-01

    Future generation automated human biometric identification and verification will require multiple features/sensors together with internal and external information sources to achieve high performance, accuracy, and reliability in uncontrolled environments. The primary objective of the proposed research is to develop a theoretical and practical basis for identifying and verifying people using standoff biometric features that can be obtained with minimal inconvenience during the verification process. The basic problem involves selecting sensors and discovering features that provide sufficient information to reliably verify a person's identity under the uncertainties caused by measurement errors and tactics of uncooperative subjects. A system was developed for discovering hand, face, ear, and voice features and fusing them to verify the identity of people. The system obtains its robustness and reliability by fusing many coarse and easily measured features into a near minimal probability of error decision algorithm.

  1. Collaborative Localization and Location Verification in WSNs

    PubMed Central

    Miao, Chunyu; Dai, Guoyong; Ying, Kezhen; Chen, Qingzhang

    2015-01-01

    Localization is one of the most important technologies in wireless sensor networks. A lightweight distributed node localization scheme is proposed by considering the limited computational capacity of WSNs. The proposed scheme introduces the virtual force model to determine the location by incremental refinement. Aiming at solving the drifting problem and the malicious anchor problem, a location verification algorithm based on the virtual force model is presented. In addition, an anchor promotion algorithm using the localization reliability model is proposed to re-locate the drifted nodes. Extended simulation experiments indicate that the localization algorithm has relatively high precision and the location verification algorithm has relatively high accuracy. The communication overhead of these algorithms is relatively low, and the whole set of reliable localization methods is practical as well as comprehensive. PMID:25954948
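    The virtual force model mentioned in this abstract can be pictured as incremental refinement: each anchor's ranging residual exerts a pull or push on the node's position estimate until the forces balance. The gains, geometry, and function names below are illustrative assumptions, not the paper's algorithm.

    ```python
    # Virtual-force-style position refinement: an anchor whose measured
    # range exceeds the current distance pulls the estimate outward; one
    # whose range is shorter pushes it back. Iterating drives the
    # estimate toward the point consistent with all range measurements.
    import math

    def refine(estimate, anchors, ranges, gain=0.5, iters=200):
        x, y = estimate
        for _ in range(iters):
            fx = fy = 0.0
            for (ax, ay), r in zip(anchors, ranges):
                dx, dy = x - ax, y - ay
                d = math.hypot(dx, dy) or 1e-9   # avoid division by zero
                f = (r - d) / d                   # signed residual force
                fx += f * dx
                fy += f * dy
            x += gain * fx / len(anchors)
            y += gain * fy / len(anchors)
        return x, y

    anchors = [(0, 0), (10, 0), (0, 10)]
    true_pos = (3.0, 4.0)
    ranges = [math.hypot(true_pos[0] - ax, true_pos[1] - ay) for ax, ay in anchors]
    est = refine((5.0, 5.0), anchors, ranges)
    print(round(est[0], 2), round(est[1], 2))  # converges near (3.0, 4.0)
    ```

    The same residual forces give a natural verification signal: a node whose claimed position leaves large unbalanced forces against trusted anchors is inconsistent with its range measurements.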

  2. Packaged low-level waste verification system

    SciTech Connect

    Tuite, K.; Winberg, M.R.; McIsaac, C.V.

    1995-12-31

    The Department of Energy through the National Low-Level Waste Management Program and WMG Inc. have entered into a joint development effort to design, build, and demonstrate the Packaged Low-Level Waste Verification System. Currently, states and low-level radioactive waste disposal site operators have no method to independently verify the radionuclide content of packaged low-level waste that arrives at disposal sites for disposition. At this time, the disposal site relies on the low-level waste generator shipping manifests and accompanying records to ensure that low-level waste received meets the site's waste acceptance criteria. The subject invention provides the equipment, software, and methods to enable the independent verification of low-level waste shipping records to ensure that the site's waste acceptance criteria are being met. The objective of the prototype system is to demonstrate a mobile system capable of independently verifying the content of packaged low-level waste.

  3. Conceptual design. Final report: TFE Verification Program

    SciTech Connect

    Not Available

    1994-03-01

    This report documents the TFE Conceptual Design, which provided the design guidance for the TFE Verification Program. The primary goals of this design effort were to: (1) establish the conceptual design of an in-core thermionic reactor for a 2 MW(e) space nuclear power system with a 7-year operating lifetime; (2) demonstrate scalability of the above concept over the output power range of 500 kW(e) to 5 MW(e); and (3) define the TFE which is the basis for the 2 MW(e) reactor design. This TFE specification provided the basis for the test program. These primary goals were achieved. The technical approach taken in the conceptual design effort is discussed in Section 2, and the results are discussed in Section 3. The remainder of this introduction draws a perspective on the role that this conceptual design task played in the TFE Verification Program.

  4. Spacecraft attitude calibration/verification baseline study

    NASA Technical Reports Server (NTRS)

    Chen, L. C.

    1981-01-01

    A baseline study for a generalized spacecraft attitude calibration/verification system is presented. It can be used to define software specifications for three major functions required by a mission: the pre-launch parameter observability and data collection strategy study; the in-flight sensor calibration; and the post-calibration attitude accuracy verification. Analytical considerations are given for both single-axis and three-axis spacecraft. The three-axis attitudes considered include inertial-pointing attitudes, reference-pointing attitudes, and attitudes undergoing specific maneuvers. The attitude sensors and hardware considered include Earth horizon sensors, plane-field Sun sensors, coarse and fine two-axis digital Sun sensors, three-axis magnetometers, fixed-head star trackers, and inertial reference gyros.

  5. Verification of floating-point software

    NASA Technical Reports Server (NTRS)

    Hoover, Doug N.

    1990-01-01

    Floating point computation presents a number of problems for formal verification. Should one treat the actual details of floating point operations, accept them as imprecisely defined, or ignore round-off error altogether and behave as if floating point operations were perfectly accurate? There is the further problem that a numerical algorithm usually only approximately computes some mathematical function, and we often do not know just how good the approximation is, even in the absence of round-off error. ORA has developed a theory of asymptotic correctness which allows one to verify floating point software with minimal entanglement in these problems. This theory and its implementation in the Ariel C verification system are described. The theory is illustrated using a simple program which finds a zero of a given function by bisection. This paper is presented in viewgraph form.
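    The bisection illustration mentioned above is easy to sketch. A minimal version (the tolerance handling is an assumption; the Ariel/ORA theory is the formal counterpart of the informal claim in the docstring):

```python
def bisect_zero(f, lo, hi, tol=1e-12):
    """Bisection zero finder, assuming f(lo) and f(hi) differ in sign.
    Over exact reals this converges to a true zero; in floating point it
    halts at a point within tol plus round-off, which is precisely the gap
    a theory of asymptotic correctness has to bridge."""
    flo = f(lo)
    while hi - lo > tol:
        mid = lo + (hi - lo) / 2.0      # avoids overflow of (lo + hi) / 2
        if mid == lo or mid == hi:      # interval no longer representable
            break
        fm = f(mid)
        if (fm < 0) == (flo < 0):
            lo, flo = mid, fm
        else:
            hi = mid
    return lo + (hi - lo) / 2.0

root = bisect_zero(lambda x: x * x - 2.0, 0.0, 2.0)
print(root)  # approximately 1.41421356...
```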

  6. NGSI: IAEA Verification of UF6 Cylinders

    SciTech Connect

    Curtis, Michael M.

    2012-06-05

    The International Atomic Energy Agency (IAEA) often does not know the location of declared uranium hexafluoride (UF6) cylinders following verification, because cylinders are not typically tracked onsite or off. This paper assesses various methods the IAEA uses to verify cylinder gross defects, and how the task could be eased through improved identification and monitoring. The assessment is restricted to current verification methods together with one that has been applied on a trial basis: short-notice random inspections coupled with mailbox declarations. This paper is part of the NNSA Office of Nonproliferation and International Security's Next Generation Safeguards Initiative (NGSI) program to investigate the concept of a global monitoring scheme that uniquely identifies and tracks UF6 cylinders.

  7. Systems Approach to Arms Control Verification

    SciTech Connect

    Allen, K; Neimeyer, I; Listner, C; Stein, G; Chen, C; Dreicer, M

    2015-05-15

    Using the decades of experience of developing concepts and technologies for verifying bilateral and multilateral arms control agreements, a broad conceptual systems approach is being developed that takes into account varying levels of information and risk. The IAEA has already demonstrated the applicability of a systems approach by implementing safeguards at the State level, with acquisition path analysis as the key element. In order to test whether such an approach could also be implemented for arms control verification, an exercise was conducted in November 2014 at the JRC ITU Ispra. Based on the scenario of a hypothetical treaty between two model nuclear weapons states aimed at capping their nuclear arsenals at existing levels, the goal of this exercise was to explore how to use acquisition path analysis in an arms control context. Our contribution will present the scenario, objectives and results of this exercise, and attempt to define future workshops aimed at further developing verification measures that will deter or detect treaty violations.

  8. Test load verification through strain data analysis

    NASA Technical Reports Server (NTRS)

    Verderaime, V.; Harrington, F.

    1995-01-01

    A traditional binding acceptance criterion for polycrystalline structures is experimental verification of the ultimate factor of safety. At fracture, the induced strain is inelastic and about an order of magnitude greater than the design maximum expected operational limit. In this extremely strained condition, the structure may rotate and displace under the applied verification load so as to unknowingly distort the load transfer into the static test article. Tests may result in erroneously accepting a submarginal design or rejecting a reliable one. A technique was developed to identify, monitor, and assess the load transmission error through two back-to-back surface-measured strain data. The technique is programmed for expediency and convenience. Though the method was developed to support affordable aerostructures, it is also applicable to most high-performance air and surface transportation structural systems.

  9. Precision segmented reflector, figure verification sensor

    NASA Technical Reports Server (NTRS)

    Manhart, Paul K.; Macenka, Steve A.

    1989-01-01

    The Precision Segmented Reflector (PSR) program currently under way at the Jet Propulsion Laboratory is a test bed and technology demonstration program designed to develop and study the structural and material technologies required for lightweight, precision segmented reflectors. A Figure Verification Sensor (FVS), designed to monitor the active control system of the segments, is described; a best-fit surface is defined; and the image or wavefront quality of the assembled array of reflecting panels is assessed.

  10. Survey of Existing Tools for Formal Verification.

    SciTech Connect

    Punnoose, Ratish J.; Armstrong, Robert C.; Wong, Matthew H.; Mayo, Jackson

    2014-12-01

    Formal methods have come into wide use because of their effectiveness in verifying "safety and security" requirements of digital systems, a set of requirements for which testing is mostly ineffective. Formal methods are routinely used in the design and verification of high-consequence digital systems in industry. This report outlines our work in assessing the capabilities of commercial and open-source formal tools and the ways in which they can be leveraged in digital design workflows.

  11. Cleaning verification by air/water impingement

    NASA Technical Reports Server (NTRS)

    Jones, Lisa L.; Littlefield, Maria D.; Melton, Gregory S.; Caimi, Raoul E. B.; Thaxton, Eric A.

    1995-01-01

    This paper will discuss how the Kennedy Space Center intends to perform precision cleaning verification by Air/Water Impingement in lieu of chlorofluorocarbon-113 gravimetric nonvolatile residue analysis (NVR). Test results will be given that demonstrate the effectiveness of the Air/Water system. A brief discussion of the Total Carbon method via the use of a high temperature combustion analyzer will also be given. The necessary equipment for impingement will be shown along with other possible applications of this technology.

  12. Verification tests of durable TPS concepts

    NASA Technical Reports Server (NTRS)

    Shideler, J. L.; Webb, G. L.; Pittman, C. M.

    1984-01-01

    Titanium multiwall, superalloy honeycomb, and Advanced Carbon-carbon (ACC) multipost Thermal Protection System (TPS) concepts are being developed to provide durable protection for surfaces of future space transportation systems. Verification tests including thermal, vibration, acoustic, water absorption, lightning strike, and aerothermal tests are described. Preliminary results indicate that the three TPS concepts are viable up to a surface temperature in excess of 2300 F.

  13. Secure Image Hash Comparison for Warhead Verification

    SciTech Connect

    Bruillard, Paul J.; Jarman, Kenneth D.; Robinson, Sean M.

    2014-06-06

    The effort to inspect and verify warheads in the context of possible future arms control treaties is rife with security and implementation issues. In this paper we review prior work on perceptual image hashing for template-based warhead verification. Furthermore, we formalize the notion of perceptual hashes and demonstrate that large classes of such functions are likely not cryptographically secure. We close with a brief discussion of fully homomorphic encryption as an alternative technique.
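    The notion of a perceptual hash, and why it resists cryptographic security, can be made concrete with a toy example: nearby inputs produce nearby hashes, so the hash necessarily leaks similarity information. A minimal sketch (the 8-pixel "images" and the average-hash scheme are illustrative, not the paper's construction):

```python
def average_hash(pixels):
    """Toy perceptual hash: threshold each pixel against the image mean.
    Similar images yield hashes at small Hamming distance, which is exactly
    the property that makes such hashes hard to make cryptographically
    secure."""
    mean = sum(pixels) / len(pixels)
    bits = 0
    for p in pixels:
        bits = (bits << 1) | (1 if p >= mean else 0)
    return bits

def hamming(a, b):
    return bin(a ^ b).count("1")

img = [10, 200, 30, 220, 15, 210, 25, 205]       # stand-in 8-pixel "image"
noisy = [p + 3 for p in img]                     # small perturbation
other = [200, 10, 220, 30, 210, 15, 205, 25]     # structurally different image
print(hamming(average_hash(img), average_hash(noisy)))   # small distance
print(hamming(average_hash(img), average_hash(other)))   # large distance
```

    A cryptographic hash, by contrast, is designed so that a one-bit input change flips about half the output bits, destroying the very similarity structure a template-comparison protocol needs.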

  14. Lithium-Ion Verification Test Program

    NASA Technical Reports Server (NTRS)

    McKissock, Barbara; Manzo, Michelle; Miller, Thomas; Reid, Concha; Bennett, William; Gemeiner, Russel

    2006-01-01

    Technology verification is needed for aerospace applications. The program is structured flexibly to allow assessment of current technology capabilities, provide information about various vendors, and support assessment of technology developments. A statistical design of experiments (DOE) was developed to interpret relationships in the data and to address program test goals and resource limitations. The data will be used to develop a model to predict the life of cells as a function of depth of discharge (DOD), temperature, and end-of-charge voltage (EOCV).

  15. Verification of the databases EXFOR and ENDF

    NASA Astrophysics Data System (ADS)

    Berton, Gottfried; Damart, Guillaume; Cabellos, Oscar; Beauzamy, Bernard; Soppera, Nicolas; Bossant, Manuel

    2017-09-01

    The objective of this work is the verification of large experimental (EXFOR) and evaluated nuclear reaction databases (JEFF, ENDF, JENDL, TENDL…). The work is applied to neutron reactions in EXFOR data, including threshold reactions, isomeric transitions, angular distributions, and data in the resonance region for both isotopes and natural elements. Finally, a comparison of the resonance integrals compiled in the EXFOR database with those derived from the evaluated libraries is also performed.

  16. Component Verification and Certification in NASA Missions

    NASA Technical Reports Server (NTRS)

    Giannakopoulou, Dimitra; Penix, John; Norvig, Peter (Technical Monitor)

    2001-01-01

    Software development for NASA missions is a particularly challenging task. Missions are extremely ambitious scientifically, have very strict time frames, and must be accomplished with a maximum degree of reliability. Verification technologies must therefore be pushed far beyond their current capabilities. Moreover, reuse and adaptation of software architectures and components must be incorporated in software development within and across missions. This paper discusses NASA applications that we are currently investigating from these perspectives.

  17. Validation, Verification and Certification of Embedded Systems

    DTIC Science & Technology

    2005-10-01

    Walkthroughs (pluralistic, cognitive) • Usability tests • Thinking aloud • Measured performance • Field usability testing • Follow-up studies… [Verification and Validation: Current and Best Practice, RTO-TR-IST-027] …Cognitive walkthrough is a technique for evaluating user interfaces by analysing the mental processes required of users. Like heuristic evaluation, the results are based on the judgement of the cognitive walkthrough analyst.

  18. Program Verification for Optimized Byte Copy

    DTIC Science & Technology

    1994-07-01

    …with advice from Matthias Felleisen and Robert Harper, produced a transformational proof similar to those described by Mason [6]. [Program Verification for Optimized Byte Copy, Edoardo S. Biagioni, Carnegie Mellon University, Pittsburgh, PA 15213, July 1994; report AD-A283 915; also published as Fox Memorandum CMU-CS-FOX-94-06.] Abstract: We present a program…

  19. Identity Verification Systems as a Critical Infrastructure

    DTIC Science & Technology

    2012-03-01

    …fraudulent credit card scanners, stolen purses or wallets, or phone and Internet scams. The FTC also reports that identity thieves steal information in… alternatives to knowledge- and token-based verification. Fingerprints, retinal scans, facial recognition software, and DNA provide technically and… utilizing these systems. U.S.-VISIT was intended to automate the entry and exit process for foreign travelers. Biometric fingerprint scanners and…

  20. Automating engineering verification in ALMA subsystems

    NASA Astrophysics Data System (ADS)

    Ortiz, José; Castillo, Jorge

    2014-08-01

    The Atacama Large Millimeter/submillimeter Array is an interferometer comprising 66 individual high-precision antennas located at over 5000 meters altitude in the north of Chile. Several complex electronic subsystems need to be meticulously tested at different stages of antenna commissioning, both independently and when integrated together. First subsystem integration takes place at the Operations Support Facilities (OSF), at an altitude of 3000 meters. Second integration occurs at the high-altitude Array Operations Site (AOS), where combined performance with the Central Local Oscillator (CLO) and Correlator is also assessed. In addition, several other events require complete or partial verification of compliance with instrument specifications, such as parts replacement, calibration, relocation within the AOS, preventive maintenance, and troubleshooting due to poor performance in scientific observations. Restricted engineering time allocation and the constant pressure of minimizing downtime in a 24/7 astronomical observatory impose the need to complete (and report) the aforementioned verifications in the least possible time. Array-wide disturbances, such as global power interruptions and the following recovery, add the challenge of executing this checkout on multiple antenna elements at once. This paper presents the outcome of automating engineering verification setup, execution, notification, and reporting in ALMA, and how these efforts have resulted in a dramatic reduction of both the time and the operator training required. Signal Path Connectivity (SPC) checkout is introduced as a notable case of such automation.

  1. Component testing for dynamic model verification

    NASA Technical Reports Server (NTRS)

    Hasselman, T. K.; Chrostowski, J. D.

    1984-01-01

    Dynamic model verification is the process whereby an analytical model of a dynamic system is compared with experimental data, adjusted if necessary to bring it into agreement with the data, and then qualified for future use in predicting system response in a different dynamic environment. There are various ways to conduct model verification. The approach taken here employs Bayesian statistical parameter estimation. Unlike curve fitting, whose objective is to minimize the difference between some analytical function and a given quantity of test data (or curve), Bayesian estimation attempts also to minimize the difference between the parameter values of that function (the model) and their initial estimates, in a least squares sense. The objectives of dynamic model verification, therefore, are to produce a model which: (1) is in agreement with test data; (2) will assist in the interpretation of test data; (3) can be used to help verify a design; (4) will reliably predict performance; and (5) in the case of space structures, will facilitate dynamic control.
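    The contrast with curve fitting can be seen in a one-parameter sketch: the estimate trades off misfit to the data against departure from the prior (initial) parameter estimate, in a least squares sense. The model, numbers, and variances below are hypothetical:

```python
def bayes_estimate(xs, ys, theta0, data_var=1.0, prior_var=1.0):
    """MAP estimate for the scalar linear model y = theta * x: minimize
    sum((y - theta*x)^2)/data_var + (theta - theta0)^2/prior_var.
    Setting the derivative to zero gives a closed form."""
    sxx = sum(x * x for x in xs)
    sxy = sum(x * y for x, y in zip(xs, ys))
    return (sxy / data_var + theta0 / prior_var) / (sxx / data_var + 1.0 / prior_var)

xs, ys = [1.0, 2.0, 3.0], [2.1, 3.9, 6.0]   # data roughly on y = 2x
print(bayes_estimate(xs, ys, theta0=1.0, prior_var=100.0))  # ~2.0, data dominates
print(bayes_estimate(xs, ys, theta0=1.0, prior_var=0.001))  # ~1.0, prior dominates
```

    A wide prior recovers the ordinary least squares fit; a tight prior keeps the parameter near its initial estimate, which is the behaviour the abstract attributes to Bayesian estimation.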

  2. Verification in referral-based crowdsourcing.

    PubMed

    Naroditskiy, Victor; Rahwan, Iyad; Cebrian, Manuel; Jennings, Nicholas R

    2012-01-01

    Online social networks offer unprecedented potential for rallying a large number of people to accomplish a given task. Here we focus on information gathering tasks where rare information is sought through "referral-based crowdsourcing": the information request is propagated recursively through invitations among members of a social network. Whereas previous work analyzed incentives for the referral process in a setting with only correct reports, misreporting is known to be both pervasive in crowdsourcing applications, and difficult/costly to filter out. A motivating example for our work is the DARPA Red Balloon Challenge where the level of misreporting was very high. In order to undertake a formal study of verification, we introduce a model where agents can exert costly effort to perform verification and false reports can be penalized. This is the first model of verification and it provides many directions for future research, which we point out. Our main theoretical result is the compensation scheme that minimizes the cost of retrieving the correct answer. Notably, this optimal compensation scheme coincides with the winning strategy of the Red Balloon Challenge.
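    A recursive-split compensation scheme in the spirit of the winning Red Balloon strategy can be sketched as follows (the $2000 finder reward and halving ratio are the MIT team's published parameters, used here purely for illustration; the paper derives the cost-minimizing scheme formally):

```python
def payouts(chain_length, finder_reward=2000.0):
    """Recursive-split incentives: the finder of the balloon receives a
    fixed reward, and each person up the referral chain receives half of
    what the person they invited received."""
    rewards, r = [], finder_reward
    for _ in range(chain_length):
        rewards.append(r)
        r /= 2.0
    return rewards

chain = payouts(4)
print(chain)        # [2000.0, 1000.0, 500.0, 250.0]
print(sum(chain))   # the geometric series keeps total cost < 2 * finder_reward
```

    Because the per-link payments form a geometric series, the total payout is bounded regardless of chain length, which is what makes such schemes budget-feasible for the requester.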

  3. Formal verification of an avionics microprocessor

    NASA Technical Reports Server (NTRS)

    Srivas, Mandayam, K.; Miller, Steven P.

    1995-01-01

    Formal specification combined with mechanical verification is a promising approach for achieving the extremely high levels of assurance required of safety-critical digital systems. However, many questions remain regarding their use in practice: Can these techniques scale up to industrial systems, where are they likely to be useful, and how should industry go about incorporating them into practice? This report discusses a project undertaken to answer some of these questions, the formal verification of the AAMP5 microprocessor. This project consisted of formally specifying, in the PVS language, a Rockwell proprietary microprocessor at both the instruction-set and register-transfer levels, and using the PVS theorem prover to show that the microcode correctly implemented the instruction-level specification for a representative subset of instructions. Notable aspects of this project include the use of a formal specification language by practicing hardware and software engineers, the integration of traditional inspections with formal specifications, and the use of a mechanical theorem prover to verify a portion of a commercial, pipelined microprocessor that was not explicitly designed for formal verification.

  4. Signal verification can promote reliable signalling

    PubMed Central

    Broom, Mark; Ruxton, Graeme D.; Schaefer, H. Martin

    2013-01-01

    The central question in communication theory is whether communication is reliable, and if so, which mechanisms select for reliability. The primary approach in the past has been to attribute reliability to strategic costs associated with signalling as predicted by the handicap principle. Yet, reliability can arise through other mechanisms, such as signal verification; but the theoretical understanding of such mechanisms has received relatively little attention. Here, we model whether verification can lead to reliability in repeated interactions that typically characterize mutualisms. Specifically, we model whether fruit consumers that discriminate among poor- and good-quality fruits within a population can select for reliable fruit signals. In our model, plants either signal or they do not; costs associated with signalling are fixed and independent of plant quality. We find parameter combinations where discriminating fruit consumers can select for signal reliability by abandoning unprofitable plants more quickly. This self-serving behaviour imposes costs upon plants as a by-product, rendering it unprofitable for unrewarding plants to signal. Thus, strategic costs to signalling are not a prerequisite for reliable communication. We expect verification to more generally explain signal reliability in repeated consumer–resource interactions that typify mutualisms but also in antagonistic interactions such as mimicry and aposematism. PMID:24068354

  5. SMAP Verification and Validation Project - Final Report

    NASA Technical Reports Server (NTRS)

    Murry, Michael

    2012-01-01

    In 2007, the National Research Council (NRC) released the Decadal Survey of Earth science, which identified 15 new space missions of significant scientific and application value for the National Aeronautics and Space Administration (NASA) to undertake in the coming decade. One of these missions was the Soil Moisture Active Passive (SMAP) mission, which NASA assigned to the Jet Propulsion Laboratory (JPL) in 2008. The goal of SMAP is to provide global, high-resolution mapping of soil moisture and its freeze/thaw states. The SMAP project recently passed its Critical Design Review and is proceeding with its fabrication and testing phase. Verification and Validation (V&V) is widely recognized as a critical component of systems engineering and is vital to the success of any space mission. V&V is a process used to check that a system meets its design requirements and specifications in order to fulfill its intended purpose. Verification often refers to the question "Have we built the system right?" whereas Validation asks "Have we built the right system?" Currently the SMAP V&V team is verifying design requirements through inspection, demonstration, analysis, or testing. An example of the SMAP V&V process is the verification of the antenna pointing accuracy with mathematical models, since it is not possible to provide the appropriate micro-gravity environment for testing the antenna on Earth before launch.

  7. Tags and seals for arms control verification

    SciTech Connect

    DeVolpi, A.

    1990-09-18

    Tags and seals have long been recognized as important tools in arms control. The trend in control of armaments is to limit militarily significant equipment in ways that are capable of being verified through direct and cooperative means, chiefly on-site inspection or monitoring. Although this paper focuses on the CFE treaty, the role of tags and seals for other treaties is also addressed. Published technology and concepts are reviewed, based on open sources. Arms control verification tags are defined as unique identifiers designed to be tamper-revealing; in that respect, seals are similar, being used as indicators of unauthorized access. Tamper-revealing tags might be considered single-point markers, seals as two-point couplings, and nets as volume containment. The function of an arms control tag is twofold: to provide field verification of the identity of a treaty-limited item (TLI), and to provide a means of authentication of the tag and its tamper-revealing features. Authentication could take place in the field or be completed elsewhere. For CFE, the goal of tags and seals can be to reduce the overall cost of the entire verification system.

  9. Trajectory Based Behavior Analysis for User Verification

    NASA Astrophysics Data System (ADS)

    Pao, Hsing-Kuo; Lin, Hong-Yi; Chen, Kuan-Ta; Fadlil, Junaidillah

    Many of our activities on computers need a verification step for authorized access. The goal of verification is to tell apart the true account owner from intruders. We propose a general approach for user verification based on user trajectory inputs. The approach is labor-free for users and is likely to avoid copying or simulation by other non-authorized users or even automatic programs like bots. Our study focuses on finding the hidden patterns embedded in the trajectories produced by account users. We employ a Markov chain model with Gaussian distributions in its transitions to describe the behavior in the trajectory. To distinguish between two trajectories, we propose a novel dissimilarity measure combined with a manifold-learnt tuning for capturing the pairwise relationship. Based on the pairwise relationship, we plug in any effective classification or clustering method for the detection of unauthorized access. The method can also be applied to the task of recognition, predicting the trajectory type without a pre-defined identity. Given a trajectory input, the results show that the proposed method can accurately verify the user identity, or suggest who owns the trajectory if the input identity is not provided.
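    The trajectory-scoring idea can be illustrated with a stripped-down stand-in for the paper's model: fit a Gaussian over per-step displacements from the owner's trajectories, then compare per-step log-likelihoods of new inputs (a single global Gaussian here, rather than the paper's Markov chain with Gaussian transitions):

```python
import math

def fit_step_model(trajectory):
    """Fit an (illustrative) Gaussian model of per-step displacements."""
    steps = [(x2 - x1, y2 - y1)
             for (x1, y1), (x2, y2) in zip(trajectory, trajectory[1:])]
    n = len(steps)
    mx = sum(dx for dx, _ in steps) / n
    my = sum(dy for _, dy in steps) / n
    var = sum((dx - mx) ** 2 + (dy - my) ** 2 for dx, dy in steps) / (2 * n) + 1e-6
    return mx, my, var

def log_likelihood(model, trajectory):
    """Average per-step log-likelihood of a trajectory under the model."""
    mx, my, var = model
    ll = 0.0
    for (x1, y1), (x2, y2) in zip(trajectory, trajectory[1:]):
        dx, dy = x2 - x1, y2 - y1
        ll += -((dx - mx) ** 2 + (dy - my) ** 2) / (2 * var) - math.log(2 * math.pi * var)
    return ll / (len(trajectory) - 1)

owner = [(i, 0.5 * i) for i in range(20)]                 # smooth, steady motion
intruder = [(i, (-1) ** i * 3.0) for i in range(20)]      # jerky motion
model = fit_step_model(owner)
print(log_likelihood(model, owner) > log_likelihood(model, intruder))
```

    Thresholding such a score (or feeding pairwise dissimilarities to a classifier, as the paper does) yields the accept/reject decision for verification.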

  10. Evaluation of the 29-km Eta Model. Part 1; Objective Verification at Three Selected Stations

    NASA Technical Reports Server (NTRS)

    Nutter, Paul A.; Manobianco, John; Merceret, Francis J. (Technical Monitor)

    1998-01-01

    This paper describes an objective verification of the National Centers for Environmental Prediction (NCEP) 29-km eta model from May 1996 through January 1998. The evaluation was designed to assess the model's surface and upper-air point forecast accuracy at three selected locations during separate warm (May - August) and cool (October - January) season periods. In order to enhance the sample sizes available for statistical calculations, the objective verification includes two consecutive warm and cool season periods. Systematic model deficiencies comprise the larger portion of the total error in most of the surface forecast variables that were evaluated. The error characteristics for both surface and upper-air forecasts vary widely by parameter, season, and station location. At upper levels, a few characteristic biases are identified. Overall, however, the upper-level errors are more nonsystematic in nature and could be explained partly by observational measurement uncertainty. With a few exceptions, the upper-air results also indicate that 24-h model error growth is not statistically significant. In February and August 1997, NCEP implemented upgrades to the eta model's physical parameterizations that were designed to change some of the model's error characteristics near the surface. The results shown in this paper indicate that these upgrades led to identifiable and statistically significant changes in forecast accuracy for selected surface parameters. While some of the changes were expected, others were not consistent with the intent of the model updates and further emphasize the need for ongoing sensitivity studies and localized statistical verification efforts. Objective verification of point forecasts is a stringent measure of model performance, but when used alone it is not enough to quantify the overall value that model guidance may add to the forecast process. Therefore, results from a subjective verification of the meso-eta model over the Florida peninsula are presented in Part 2.
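    The split into systematic and nonsystematic error used throughout such evaluations can be sketched directly: bias is the mean error, and the total RMSE decomposes into the bias plus the residual (nonsystematic) spread. The forecast/observation numbers below are hypothetical:

```python
import math

def bias(forecast, observed):
    """Mean (systematic) error: forecast minus observation."""
    return sum(f - o for f, o in zip(forecast, observed)) / len(forecast)

def rmse(forecast, observed):
    """Root-mean-square of the total error."""
    return math.sqrt(sum((f - o) ** 2 for f, o in zip(forecast, observed)) / len(forecast))

def nonsystematic(forecast, observed):
    """Error standard deviation after removing the bias: the nonsystematic
    part of the total error."""
    b = bias(forecast, observed)
    return math.sqrt(sum((f - o - b) ** 2 for f, o in zip(forecast, observed)) / len(forecast))

fcst = [21.0, 23.5, 25.0, 22.0]      # hypothetical 2-m temperatures (deg C)
obs = [20.0, 22.0, 24.5, 21.0]
print(bias(fcst, obs), rmse(fcst, obs), nonsystematic(fcst, obs))
```

    The identity rmse² = bias² + nonsystematic² is what lets such studies attribute the "larger portion of the total error" to systematic deficiencies.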

  11. Verification of a Viscous Computational Aeroacoustics Code using External Verification Analysis

    NASA Technical Reports Server (NTRS)

    Ingraham, Daniel; Hixon, Ray

    2015-01-01

    The External Verification Analysis approach to code verification is extended to solve the three-dimensional Navier-Stokes equations with constant properties, and is used to verify a high-order computational aeroacoustics (CAA) code. After a brief review of the relevant literature, the details of the EVA approach are presented and compared to the similar Method of Manufactured Solutions (MMS). Pseudocode representations of EVA's algorithms are included, along with the recurrence relations needed to construct the EVA solution. The code verification results show that EVA was able to convincingly verify a high-order, viscous CAA code without the addition of MMS-style source terms, or any other modifications to the code.

  13. Verification and Validation Studies for the LAVA CFD Solver

    NASA Technical Reports Server (NTRS)

    Moini-Yekta, Shayan; Barad, Michael F.; Sozer, Emre; Brehm, Christoph; Housman, Jeffrey A.; Kiris, Cetin C.

    2013-01-01

    The verification and validation of the Launch Ascent and Vehicle Aerodynamics (LAVA) computational fluid dynamics (CFD) solver is presented. A modern strategy for verification and validation is described, incorporating verification tests, validation benchmarks, continuous integration, and version control methods for automated testing in a collaborative development environment. The purpose of the approach is to integrate the verification and validation process into the development of the solver and improve productivity. This paper uses the Method of Manufactured Solutions (MMS) for the verification of the 2D Euler equations, the 3D Navier-Stokes equations, and turbulence models. A method for systematic refinement of unstructured grids is also presented. Verification using inviscid vortex propagation and flow over a flat plate is highlighted. Simulation results using laminar and turbulent flow past a NACA 0012 airfoil and an ONERA M6 wing are validated against experimental and numerical data.
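Systematic grid refinement of the kind the abstract describes is commonly paired with a grid-convergence analysis: given an integrated quantity on three systematically refined grids, one can estimate the observed order of accuracy and a grid-converged value via Richardson extrapolation. A sketch with invented values (the drag-like numbers below are hypothetical, not LAVA results):

```python
import math

# Hypothetical integrated quantity (e.g., a drag coefficient) from three
# systematically refined grids with refinement ratio r = 2 (coarse -> fine)
f_coarse, f_medium, f_fine = 0.02860, 0.02690, 0.02648
r = 2.0

# Observed order of accuracy from the three-grid sequence
p = math.log((f_coarse - f_medium) / (f_medium - f_fine)) / math.log(r)

# Richardson extrapolation to an estimated grid-converged value
f_exact_est = f_fine + (f_fine - f_medium) / (r**p - 1)
```

An observed order near the scheme's formal order indicates the grids are in the asymptotic range, which is a prerequisite for trusting the extrapolated value.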

  14. Automatic Verification of Timing Constraints for Safety Critical Space Systems

    NASA Astrophysics Data System (ADS)

    Fernandez, Javier; Parra, Pablo; Sanchez Prieto, Sebastian; Polo, Oscar; Bernat, Guillem

    2015-09-01

    This paper presents an automatic verification process focused on checking scheduling-analysis parameters. The proposal is part of a Model Driven Engineering approach to automating the verification and validation of on-board satellite software, and it has been applied to the software control unit of the energetic particle detector, a payload instrument on the Solar Orbiter mission. From the design model, a scheduling-analysis model and its corresponding verification model are generated; the verification conditions are expressed as constraints in the form of finite timed automata. When the system is deployed on the target, verification evidence is extracted at instrumented points. The constraints are then evaluated against this evidence; if any constraint is not satisfied by the on-target evidence, the scheduling analysis is not valid.
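The general idea of checking scheduling constraints against on-target trace evidence can be sketched as follows. This is a simplified illustration, not the paper's tooling: the task names, periods, deadlines, and timestamps are all invented, and real timed-automata checkers track far richer state than a per-job deadline test.

```python
constraints = {
    # task: (period_ms, deadline_ms) -- assumed scheduling-analysis parameters
    "acquire": (10.0, 8.0),
    "telemetry": (50.0, 20.0),
}

# (task, release_time_ms, completion_time_ms) records from instrumented points
trace = [
    ("acquire", 0.0, 6.2), ("acquire", 10.0, 17.1), ("acquire", 20.0, 26.9),
    ("telemetry", 0.0, 18.4), ("telemetry", 50.0, 66.0),
]

def verify(trace, constraints):
    """Return the trace records whose response time exceeds the task's deadline."""
    violations = []
    for task, release, completion in trace:
        _, deadline = constraints[task]
        if completion - release > deadline:
            violations.append((task, release, completion))
    return violations

violations = verify(trace, constraints)  # empty -> evidence satisfies the constraints
```

If any record violates its constraint, the assumptions behind the scheduling analysis do not hold for the deployed system, which is the failure condition the abstract describes.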

  15. Thoughts on Verification of Nuclear Disarmament

    SciTech Connect

    Dunlop, W H

    2007-09-26

    It is my pleasure to be here today to participate in this Conference. My thanks to the organizers for preparing such an interesting agenda on a very difficult topic. My effort in preparing my presentation was performed under the auspices of the U.S. Department of Energy by the University of California, Lawrence Livermore National Laboratory under Contract W-7405-Eng-48. And as many of you know, Lawrence Livermore National Laboratory is now, as of Oct 1st, under contract to Lawrence Livermore National Security, LLC. There is a long history of how verification of arms control agreements has been viewed. The basis for verification during the days of SALT was that verification would rely on each country's national technical means. For treaties dealing with strategic missiles this worked well, as the individual items subject to verification were of such a size that they were visible to the National Technical Means available at the time, and it was felt that the counting of missiles and launchers could be verified by those means. For nuclear testing treaties, the use of seismic measurements developed into a capability that was reasonably robust for all but the smallest of nuclear tests. However, once we had the Threshold Test Ban Treaty, there was a significant problem: the fidelity of the measurements was not sufficient to determine whether a test was slightly above or slightly below the 150 kt limit. This led some in the US to believe that the Soviet Union was not living up to the TTBT agreement. An on-site verification protocol was negotiated in 1988 and 1989 that allowed the US to make hydrodynamic yield measurements on Soviet tests above 50 kt yield and regional seismic measurements on all tests above 35 kt of yield, and the Soviets to make the same types of measurements on US tests to ensure that they were not over 150 kt. These on-site measurements were considered reasonably intrusive. Again the measurement capability was not

  16. Geologic constraints on clandestine nuclear testing in South Asia

    PubMed Central

    Davis, Dan M.; Sykes, Lynn R.

    1999-01-01

    Cavity decoupling in salt is the most plausible means by which a nation could conduct clandestine testing of militarily significant nuclear weapons. The conditions under which solution-mined salt can be used for this purpose are quite restrictive. The salt must be thick and reasonably pure. Containment of explosions sets a shallow limit on depth, and cavity stability sets a deep limit. These constraints are met in considerably <1% of the total land area of India and Pakistan. Most of that area is too dry for cavity construction by solution mining, and disposal of brine in rivers can be detected easily. Salt domes, the most favorable structures for constructing large cavities, are not present in India and Pakistan. Confidence that India and Pakistan are adhering to the Comprehensive Test Ban Treaty (CTBT) is enhanced by their geological conditions, which are quite favorable to verification, not evasion. Thus, their participation in the CTBT is constrained overwhelmingly by political, not scientific, issues. Confidence in the verification of the CTBT could be enhanced if India and Pakistan permitted stations of the various monitoring technologies that are now widely deployed elsewhere to be operated on their territories. PMID:10500134

  17. Gate-Level Commercial Microelectronics Verification with Standard Cell Recognition

    DTIC Science & Technology

    2015-03-26

    Gate-Level Commercial Microelectronics Verification with Standard Cell Recognition. Master's thesis by Leleia A. Hsia, Second Lieutenant, USAF; report number AFIT-ENG-MS-15-M-069. Declared a work of the U.S. Government and not subject to copyright protection in the United States. Approved for public release; distribution unlimited.

  18. Verification and validation plan for the SFR system analysis module

    SciTech Connect

    Hu, R.

    2014-12-18

    This report documents the Verification and Validation (V&V) Plan for software verification and validation of the SFR System Analysis Module (SAM), developed at Argonne National Laboratory for sodium fast reactor whole-plant transient analysis. SAM is developed under the DOE NEAMS program and is part of the Reactor Product Line toolkit. The SAM code, the phenomena and computational models of interest, the software quality assurance, and the verification and validation requirements and plans are discussed in this report.

  19. Simulated Order Verification and Medication Reconciliation during an Introductory Pharmacy Practice Experience

    PubMed Central

    Chesson, Melissa M.; Momary, Kathryn M.

    2015-01-01

    Objective. To create, implement, and assess a simulated medication reconciliation and an order verification activity using hospital training software. Design. A simulated patient with medication orders and home medications was built into existing hospital training software. Students in an institutional introductory pharmacy practice experience (IPPE) reconciled the patient’s medications and determined whether or not to verify the inpatient orders based on his medical history and laboratory data. After reconciliation, students identified medication discrepancies and documented their rationale for rejecting inpatient orders. Assessment. For a 3-year period, the majority of students agreed the simulation enhanced their learning, taught valuable clinical decision-making skills, integrated material from previous courses, and stimulated their interest in institutional pharmacy. Overall feedback from student evaluations about the IPPE also was favorable. Conclusion. Use of existing hospital training software can affordably simulate the pharmacist’s role in order verification and medication reconciliation, as well as improve clinical decision-making. PMID:27168609
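The core reconciliation step the students performed, comparing a patient's home medication list against the active inpatient orders to flag discrepancies, reduces to a set comparison. A minimal sketch with invented drug entries (not the simulation's actual patient data):

```python
# Hypothetical medication lists; each entry is a normalized drug/dose/frequency string
home_meds = {"lisinopril 10 mg daily", "metformin 500 mg twice daily",
             "atorvastatin 20 mg nightly"}
inpatient_orders = {"lisinopril 10 mg daily", "metformin 500 mg twice daily",
                    "heparin 5000 units q8h"}

omitted = home_meds - inpatient_orders     # home meds not ordered -> query the prescriber
new_orders = inpatient_orders - home_meds  # new inpatient therapy -> verify the indication
```

In practice, reconciliation also requires judgment about intentional changes (e.g., holding a home medication during admission), which is exactly the clinical decision-making the simulation was built to exercise.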

  20. Simulator verification effort at the South Texas project electric generating station

    SciTech Connect

    Bellmore, P.E.; Albury, C.R.

    1987-01-01

    This paper presents the work being done at Houston Lighting and Power Company to verify the South Texas Project Electric Generating Station (STPEGS) simulator. The purpose of that work is to assure that the STPEGS simulator adequately reflects plant response during normal and abnormal transients. The effort has also yielded an enhanced understanding of the engineering and organizational requirements of a simulator verification program. This paper presents the techniques used to develop a best-estimate model, which generates plant response data for comparison with the STPEGS simulator; a typical licensing model is inadequate for this work because of its conservative assumptions. The authors also examine the interaction between the various groups responsible for simulator verification.
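The comparison step in this kind of verification program, checking the simulator's transient response against best-estimate model data, can be sketched as a point-by-point tolerance test. The traces, parameter, and acceptance band below are invented for illustration:

```python
import numpy as np

# Hypothetical transient traces at matching sample times for one plant parameter
# (e.g., pressurizer pressure in MPa): best-estimate model vs. simulator response
best_estimate = np.array([15.51, 15.48, 15.30, 15.02, 14.85, 14.80])
simulator     = np.array([15.50, 15.45, 15.33, 15.08, 14.90, 14.83])

tolerance = 0.10  # assumed acceptance band for this parameter
deviation = np.abs(simulator - best_estimate)
passes = bool(np.all(deviation <= tolerance))  # True if every sample is within band
```

Real acceptance criteria are parameter-specific and may also weigh trends and timing of events, but the basic pass/fail logic per sampled point is as above.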