Science.gov

Sample records for nuclear verification helping

  1. Nuclear disarmament verification

    SciTech Connect

    DeVolpi, A.

    1993-12-31

    Arms control treaties, unilateral actions, and cooperative activities -- reflecting the defusing of East-West tensions -- are causing nuclear weapons to be disarmed and dismantled worldwide. In order to provide for future reductions and to build confidence in the permanency of this disarmament, verification procedures and technologies would play an important role. This paper outlines arms-control objectives, treaty organization, and actions that could be undertaken. For the purposes of this Workshop on Verification, nuclear disarmament has been divided into five topical subareas: Converting nuclear-weapons production complexes, Eliminating and monitoring nuclear-weapons delivery systems, Disabling and destroying nuclear warheads, Demilitarizing or non-military utilization of special nuclear materials, and Inhibiting nuclear arms in non-nuclear-weapons states. This paper concludes with an overview of potential methods for verification.

  2. Helping nuclear power help us

    SciTech Connect

    Schecker, Jay A

    2009-01-01

    After a prolonged absence, the word 'nuclear' has returned to the lexicon of sustainable domestic energy resources. Due in no small part to its demonstrated reliability, nuclear power is poised to play a greater role in the nation's energy future, producing clean, carbon-neutral electricity and contributing even more to our energy security. To nuclear scientists, the resurgence presents an opportunity to inject new technologies into the industry to maximize the benefits that nuclear energy can provide. 'By developing new options for waste management and exploiting new materials to make key technological advances, we can significantly impact the use of nuclear energy in our future energy mix,' says Chris Stanek, a materials scientist at Los Alamos National Laboratory. Stanek approaches the big technology challenges by thinking way small, all the way down to the atoms. He and his colleagues are using cutting-edge atomic-scale simulations to address a difficult aspect of nuclear waste -- predicting its behavior far into the future. Their research is part of a broader, coordinated effort on the part of the Laboratory to use its considerable experimental, theoretical, and computational capabilities to explore advanced materials central not only to waste issues, but to nuclear fuels as well.

  3. Nuclear Data Verification and Standardization

    SciTech Connect

    Karam, Lisa R.; Arif, Muhammad; Thompson, Alan K.

    2011-10-01

    The objective of this interagency program is to provide accurate neutron interaction verification and standardization data for the U.S. Department of Energy Division of Nuclear Physics programs, which include astrophysics, radioactive beam studies, and heavy-ion reactions. The measurements made in this program are also useful to other programs that indirectly use the unique properties of the neutron for diagnostic and analytical purposes. These include homeland security, personnel health and safety, nuclear waste disposal, treaty verification, national defense, and nuclear-based energy production. The work includes the verification of reference standard cross sections and related neutron data employing the unique facilities and capabilities at NIST and other laboratories as required; leadership and participation in international intercomparisons and collaborations; and the preservation of standard reference deposits. An essential element of the program is critical evaluation of neutron interaction data standards, including international coordination. Data testing of critical data for important applications is included. The program is jointly supported by the Department of Energy and the National Institute of Standards and Technology.

  4. The monitoring and verification of nuclear weapons

    SciTech Connect

    Garwin, Richard L.

    2014-05-09

    This paper partially reviews and updates the potential for monitoring and verification of nuclear weapons, including verification of their destruction. Cooperative monitoring with templates of the gamma-ray spectrum is an important tool, dependent on the use of information barriers.

  5. Thoughts on Verification of Nuclear Disarmament

    SciTech Connect

    Dunlop, W H

    2007-09-26

    perfect and it was expected that occasionally there might be a verification measurement that was slightly above 150 kt. But the accuracy was much improved over the earlier seismic measurements. In fact some of this improvement was because, as part of this verification protocol, the US and Soviet Union provided the yields of several past tests to improve seismic calibrations. This provided a much needed calibration for the seismic measurements. It was also accepted that, since nuclear tests were to a large part R&D related, there might occasionally be a test slightly above 150 kt, as one could not always predict the yield with high accuracy in advance of the test. While one could hypothesize that the Soviets could conduct a test at some location other than their test sites, if it were even a small fraction of 150 kt it would clearly be observed and would be a violation of the treaty. So the issue of clandestine tests of significance was easily covered for this particular treaty.
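
    To make the calibration step concrete, the Python sketch below fits the standard magnitude-yield relation mb = a + b log10(Y) to a handful of announced test yields and then inverts it for a new event. The yields, magnitudes, and resulting coefficients are hypothetical illustrations; only the log-linear form of the relation is standard, and none of the numbers come from the treaty record discussed above.

      import numpy as np

      # Hypothetical calibration data: announced yields (kt) and measured
      # body-wave magnitudes.  Illustrative numbers only.
      announced_yield_kt = np.array([20.0, 50.0, 90.0, 120.0, 150.0])
      measured_mb = np.array([5.05, 5.35, 5.55, 5.64, 5.72])

      # Fit mb = a + b * log10(Y) and keep the scatter of the fit.
      b, a = np.polyfit(np.log10(announced_yield_kt), measured_mb, 1)
      residual_std = np.std(measured_mb - (a + b * np.log10(announced_yield_kt)))

      def estimated_yield(mb):
          """Invert the calibrated relation; returns the best estimate and a 1-sigma range in kt."""
          log_y = (mb - a) / b
          spread = residual_std / b
          return 10**log_y, (10**(log_y - spread), 10**(log_y + spread))

      best, (lo, hi) = estimated_yield(5.70)
      print(f"calibration: mb = {a:.2f} + {b:.2f} * log10(Y)")
      print(f"event with mb 5.70: ~{best:.0f} kt (1-sigma range {lo:.0f}-{hi:.0f} kt)")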

  6. Seismic Surveillance. Nuclear Test Ban Verification

    DTIC Science & Technology

    1990-02-26

    The seismic results together with local geology information strongly imply that the north-eastern part of the NORESS siting area is... (Report GL-TR-90-0062; Eystein S. Husebye and Bent O. Ruud, University of Oslo, Department of Geology, P.O. Box 1047, Blindern, N-0316 Oslo 3, Norway.)

  7. DESIGN INFORMATION VERIFICATION FOR NUCLEAR SAFEGUARDS

    SciTech Connect

    Robert S. Bean; Richard R. M. Metcalf; Phillip C. Durst

    2009-07-01

    A critical aspect of international safeguards activities performed by the International Atomic Energy Agency (IAEA) is the verification that facility design and construction (including upgrades and modifications) do not create opportunities for nuclear proliferation. These Design Information Verification (DIV) activities require that IAEA inspectors compare current and past information about the facility to verify the operator’s declaration of proper use. The actual practice of DIV presents challenges to the inspectors due to the large amount of data generated, concerns about sensitive or proprietary data, the overall complexity of the facility, and the effort required to extract just the safeguards-relevant information. Planned and anticipated facilities will (especially in the case of reprocessing plants) be ever larger and increasingly complex, thus exacerbating the challenges. This paper reports the results of a workshop held at the Idaho National Laboratory in March 2009, which considered technologies and methods to address these challenges. The use of 3D Laser Range Finding, Outdoor Visualization System, Gamma-LIDAR, and virtual facility modeling, as well as methods to handle the facility data issues (quantity, sensitivity, and accessibility and portability for the inspector), were presented. The workshop attendees drew conclusions about the use of these techniques with respect to successfully employing them in an operating environment, using a Fuel Conditioning Facility walk-through as a baseline for discussion.

  8. A Zero Knowledge Protocol For Nuclear Warhead Verification

    SciTech Connect

    Glaser, Alexander; Goldston, Robert J.

    2014-03-14

    The verification of nuclear warheads for arms control faces a paradox: international inspectors must gain high confidence in the authenticity of submitted items while learning nothing about them. Conventional inspection systems featuring 'information barriers', designed to hide measurements stored in electronic systems, are at risk of tampering and snooping. Here we show the viability of a fundamentally new approach to nuclear warhead verification that incorporates a zero-knowledge protocol, designed such that sensitive information is never measured and so does not need to be hidden. We interrogate submitted items with energetic neutrons, making, in effect, differential measurements of neutron transmission and emission. Calculations of diversion scenarios show that a high degree of discrimination can be achieved while revealing zero information. Timely demonstration of the viability of such an approach could be critical for the next round of arms-control negotiations, which will likely require verification of individual warheads, rather than whole delivery systems.

  9. Nuclear reaction modeling, verification experiments, and applications

    SciTech Connect

    Dietrich, F.S.

    1995-10-01

    This presentation summarized the recent accomplishments and future promise of the neutron nuclear physics program at the Manuel Lujan Jr. Neutron Scattering Center (MLNSC) and the Weapons Neutron Research (WNR) facility. The unique capabilities of the spallation sources enable a broad range of experiments in weapons-related physics, basic science, nuclear technology, industrial applications, and medical physics.

  10. Physical cryptographic verification of nuclear warheads.

    PubMed

    Kemp, R Scott; Danagoulian, Areg; Macdonald, Ruaridh R; Vavrek, Jayson R

    2016-08-02

    How does one prove a claim about a highly sensitive object such as a nuclear weapon without revealing information about the object? This paradox has challenged nuclear arms control for more than five decades. We present a mechanism in the form of an interactive proof system that can validate the structure and composition of an object, such as a nuclear warhead, to arbitrary precision without revealing either its structure or composition. We introduce a tomographic method that simultaneously resolves both the geometric and isotopic makeup of an object. We also introduce a method of protecting information using a provably secure cryptographic hash that does not rely on electronics or software. These techniques, when combined with a suitable protocol, constitute an interactive proof system that could reject hoax items and clear authentic warheads with excellent sensitivity in reasonably short measurement times.

  11. Physical cryptographic verification of nuclear warheads

    NASA Astrophysics Data System (ADS)

    Kemp, R. Scott; Danagoulian, Areg; Macdonald, Ruaridh R.; Vavrek, Jayson R.

    2016-08-01

    How does one prove a claim about a highly sensitive object such as a nuclear weapon without revealing information about the object? This paradox has challenged nuclear arms control for more than five decades. We present a mechanism in the form of an interactive proof system that can validate the structure and composition of an object, such as a nuclear warhead, to arbitrary precision without revealing either its structure or composition. We introduce a tomographic method that simultaneously resolves both the geometric and isotopic makeup of an object. We also introduce a method of protecting information using a provably secure cryptographic hash that does not rely on electronics or software. These techniques, when combined with a suitable protocol, constitute an interactive proof system that could reject hoax items and clear authentic warheads with excellent sensitivity in reasonably short measurement times.

  12. Physical cryptographic verification of nuclear warheads

    PubMed Central

    Kemp, R. Scott; Danagoulian, Areg; Macdonald, Ruaridh R.; Vavrek, Jayson R.

    2016-01-01

    How does one prove a claim about a highly sensitive object such as a nuclear weapon without revealing information about the object? This paradox has challenged nuclear arms control for more than five decades. We present a mechanism in the form of an interactive proof system that can validate the structure and composition of an object, such as a nuclear warhead, to arbitrary precision without revealing either its structure or composition. We introduce a tomographic method that simultaneously resolves both the geometric and isotopic makeup of an object. We also introduce a method of protecting information using a provably secure cryptographic hash that does not rely on electronics or software. These techniques, when combined with a suitable protocol, constitute an interactive proof system that could reject hoax items and clear authentic warheads with excellent sensitivity in reasonably short measurement times. PMID:27432959

  13. As-Built Verification Plan Spent Nuclear Fuel Canister Storage Building MCO Handling Machine

    SciTech Connect

    SWENSON, C.E.

    2000-10-19

    This as-built verification plan outlines the methodology and responsibilities that will be implemented during the as-built field verification activity for the Canister Storage Building (CSB) MCO Handling Machine (MHM). This as-built verification plan covers the electrical portion of the construction performed by Power City under contract to Mowat. The as-built verifications will be performed in accordance with Administrative Procedure AP 6-012-00, Spent Nuclear Fuel Project As-Built Verification Plan Development Process, revision I. The results of the verification walkdown will be documented in a verification walkdown completion package, approved by the Design Authority (DA), and maintained in the CSB project files.

  14. A seismic event analyzer for nuclear test ban treaty verification

    SciTech Connect

    Mason, C.L.; Johnson, R.R. (Dept. of Applied Science; Lawrence Livermore National Lab., CA); Searfus, R.M.; Lager, D.; Canales, T.

    1988-01-01

    This paper presents an expert system that interprets seismic data from Norway's regional seismic array, NORESS, for underground nuclear weapons test ban treaty verification. Three important aspects of the expert system are (1) it emulates the problem-solving behavior of the human seismic analyst using an Assumption Based Truth Maintenance System, (2) it acts as an assistant to the human analyst by automatically interpreting and presenting events for review, and (3) it enables the analyst to interactively query the system's chain of reasoning and manually perform an interpretation. The general problem of seismic treaty verification is described. The expert system is presented in terms of knowledge representation structures, the assumption-based reasoning system, user interface elements, and initial performance results. 8 refs., 10 figs., 2 tabs.

  15. A zero-knowledge protocol for nuclear warhead verification

    NASA Astrophysics Data System (ADS)

    Glaser, Alexander; Barak, Boaz; Goldston, Robert J.

    2014-06-01

    The verification of nuclear warheads for arms control involves a paradox: international inspectors will have to gain high confidence in the authenticity of submitted items while learning nothing about them. Proposed inspection systems featuring `information barriers', designed to hide measurements stored in electronic systems, are at risk of tampering and snooping. Here we show the viability of a fundamentally new approach to nuclear warhead verification that incorporates a zero-knowledge protocol, which is designed in such a way that sensitive information is never measured and so does not need to be hidden. We interrogate submitted items with energetic neutrons, making, in effect, differential measurements of both neutron transmission and emission. Calculations for scenarios in which material is diverted from a test object show that a high degree of discrimination can be achieved while revealing zero information. Our ideas for a physical zero-knowledge system could have applications beyond the context of nuclear disarmament. The proposed technique suggests a way to perform comparisons or computations on personal or confidential data without measuring the data in the first place.

  16. A zero-knowledge protocol for nuclear warhead verification.

    PubMed

    Glaser, Alexander; Barak, Boaz; Goldston, Robert J

    2014-06-26

    The verification of nuclear warheads for arms control involves a paradox: international inspectors will have to gain high confidence in the authenticity of submitted items while learning nothing about them. Proposed inspection systems featuring 'information barriers', designed to hide measurements stored in electronic systems, are at risk of tampering and snooping. Here we show the viability of a fundamentally new approach to nuclear warhead verification that incorporates a zero-knowledge protocol, which is designed in such a way that sensitive information is never measured and so does not need to be hidden. We interrogate submitted items with energetic neutrons, making, in effect, differential measurements of both neutron transmission and emission. Calculations for scenarios in which material is diverted from a test object show that a high degree of discrimination can be achieved while revealing zero information. Our ideas for a physical zero-knowledge system could have applications beyond the context of nuclear disarmament. The proposed technique suggests a way to perform comparisons or computations on personal or confidential data without measuring the data in the first place.
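
    The following toy calculation in Python illustrates the zero-knowledge idea in purely numerical terms: the inspector learns only whether the recorded totals are statistically flat, never the underlying count pattern. The preload-to-the-complement mechanism is one way such a protocol can be realized and is assumed here for illustration; the abstract itself specifies only differential measurements of neutron transmission and emission, and the five-position 'template' and all count values are invented.

      import numpy as np

      rng = np.random.default_rng(0)

      # Hypothetical 'template': expected neutron counts per detector position for
      # the reference item.  The pattern itself is secret; illustrative numbers only.
      template = np.array([120.0, 340.0, 560.0, 340.0, 120.0])
      N_MAX = 700  # preload target: item counts plus preload should be flat at N_MAX

      def inspect(true_pattern):
          """One inspection: counts from the item land in detectors that the host
          preloaded with the complement of the secret template."""
          preload = rng.poisson(N_MAX - template)   # prepared by the host in advance
          item = rng.poisson(true_pattern)          # produced during interrogation
          return preload + item                     # the only record the inspector sees

      def passes(totals, n_sigma=4.0):
          """The inspector learns a single bit: are the totals statistically flat?"""
          return bool(np.all(np.abs(totals - N_MAX) < n_sigma * np.sqrt(N_MAX)))

      genuine = inspect(template)
      hoax = inspect(template * np.array([1.0, 1.0, 0.6, 1.0, 1.0]))  # material diverted

      print("genuine item passes:", passes(genuine))
      print("hoax item passes:   ", passes(hoax))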

  17. Nuclear Resonance Fluorescence for Material Verification in Dismantlement

    SciTech Connect

    Warren, Glen A.; Detwiler, Rebecca S.

    2011-10-01

    Nuclear resonance fluorescence (NRF) is a well-established physical process that provides an isotope-specific signature that can be exploited for isotopic detection and characterization of samples. Pacific Northwest National Laboratory has been investigating possible applications of NRF for national security. Of the investigated applications, the verification of material in the dismantlement process is the most promising. Through a combination of benchmarking measurements and radiation transport modeling, we have shown that NRF techniques with existing bremsstrahlung photon sources and a modest detection system can be used to detect highly enriched uranium in the quantities and time limits relevant to the dismantlement process. Issues such as orientation, placement and material geometry do not significantly impact the sensitivity of the technique. We have also investigated how shielding of the uranium would be observed through non-NRF processes to enable the accurate assay of the material. This paper will discuss our findings on how NRF and photon-interrogation techniques may be applied to the material verification in the dismantlement process.

  18. Supporting the President's Arms Control and Nonproliferation Agenda: Transparency and Verification for Nuclear Arms Reductions

    SciTech Connect

    Doyle, James E; Meek, Elizabeth

    2009-01-01

    The President's arms control and nonproliferation agenda is still evolving and the details of initiatives supporting it remain undefined. This means that DOE, NNSA, NA-20, NA-24 and the national laboratories can help define the agenda, and the policies and the initiatives to support it. This will require effective internal and interagency coordination. The arms control and nonproliferation agenda is broad and includes the path-breaking goal of creating conditions for the elimination of nuclear weapons. Responsibility for various elements of the agenda will be widely scattered across the interagency. Therefore an interagency mapping exercise should be performed to identify the key points of engagement within NNSA and other agencies for creating effective policy coordination mechanisms. These can include informal networks, working groups, coordinating committees, interagency task forces, etc. It will be important for NA-20 and NA-24 to get a seat at the table and a functional role in many of these coordinating bodies. The arms control and nonproliferation agenda comprises both mature and developing policy initiatives. The more mature elements such as CTBT ratification and a follow-on strategic nuclear arms treaty with Russia have defined milestones. However, recent press reports indicate that even the START follow-on strategic arms pact that is planned to be complete by the end of 2009 may take significantly longer and be more expansive in scope. The Russians called for proposals to count non-deployed as well as deployed warheads. Other elements of the agenda such as FMCT, future bilateral nuclear arms reductions following a START follow-on treaty, nuclear posture changes, preparations for an international nuclear security summit, strengthened international safeguards and multilateral verification are in much earlier stages of development. For this reason any survey of arms control capabilities within the USG should be structured to address potential needs across the

  19. Fuzzy-logic-based safety verification framework for nuclear power plants.

    PubMed

    Rastogi, Achint; Gabbar, Hossam A

    2013-06-01

    This article presents a practical implementation of a safety verification framework for nuclear power plants (NPPs) based on fuzzy logic, where hazard scenarios are identified in view of safety and control limits on different plant process values. Risk is estimated quantitatively and compared with safety limits in real time so that safety verification can be achieved. Fuzzy logic is used to define safety rules that map hazard conditions to the required safety protection in view of the risk estimate. Case studies from an NPP are analyzed to realize the proposed real-time safety verification framework. An automated system is developed to demonstrate the safety limit for different hazard scenarios.
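
    As a minimal illustration of the kind of rule base the abstract describes, the Python sketch below maps a normalized risk estimate to a required protection level through triangular fuzzy memberships and weighted-average defuzzification. The membership shapes, rule outputs, and thresholds are invented for illustration and are not taken from the article.

      # Minimal fuzzy-logic sketch of a risk-to-protection mapping.
      # Membership functions, rules, and numbers are illustrative assumptions.

      def tri(x, a, b, c):
          """Triangular membership function peaking at b, zero outside [a, c]."""
          if x <= a or x >= c:
              return 0.0
          return (x - a) / (b - a) if x < b else (c - x) / (c - b)

      def risk_memberships(risk):
          """Degree to which a normalized risk estimate (0..1) is low/medium/high."""
          return {
              "low":    tri(risk, -0.4, 0.0, 0.5),
              "medium": tri(risk,  0.2, 0.5, 0.8),
              "high":   tri(risk,  0.5, 1.0, 1.4),
          }

      # Rule base: each fuzzy risk level maps to a crisp protection index
      # (0 = continue monitoring, 1 = raise alarm, 2 = initiate trip/shutdown).
      RULES = {"low": 0.0, "medium": 1.0, "high": 2.0}

      def required_protection(risk):
          """Weighted-average defuzzification of the rule outputs."""
          mu = risk_memberships(risk)
          num = sum(mu[level] * RULES[level] for level in RULES)
          den = sum(mu.values()) or 1.0
          return num / den

      for r in (0.1, 0.45, 0.9):
          print(f"risk={r:.2f} -> protection index {required_protection(r):.2f}")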

  20. Verification and Uncertainty Reduction of Amchitka Underground Nuclear Testing Models

    SciTech Connect

    Ahmed Hassan; Jenny Chapman

    2006-02-01

    The modeling of Amchitka underground nuclear tests conducted in 2002 is verified, and uncertainty in model input parameters, as well as predictions, has been reduced using newly collected data obtained by the summer 2004 field expedition of CRESP. Newly collected data that pertain to the groundwater model include magnetotelluric (MT) surveys conducted on the island to determine the salinity and porosity structure of the subsurface, and bathymetric surveys to map the areas offshore from the Long Shot and Cannikin Sites. Analysis and interpretation of the MT data yielded information on the location of the transition zone, and porosity profiles showing porosity values decaying with depth. These new data sets are used to verify the original model in terms of model parameters, model structure, and model output. In addition, by using the new data along with the existing data (chemistry and head data), the uncertainty in model input and output is decreased by conditioning on all the available data. A Markov Chain Monte Carlo (MCMC) approach is adapted for developing new input parameter distributions conditioned on prior knowledge and new data. The MCMC approach is a form of Bayesian conditioning that is constructed in such a way that it produces samples of the model parameters that eventually converge to a stationary posterior distribution. The Bayesian MCMC approach enhances probabilistic assessment. Instead of simply propagating uncertainty forward from input parameters into model predictions (i.e., the traditional Monte Carlo approach), MCMC propagates uncertainty backward from data onto parameters, and then forward from parameters into predictions. Comparisons between new data and the original model, and conditioning on all available data using the MCMC method, yield the following results and conclusions: (1) Model structure is verified at Long Shot and Cannikin where the high-resolution bathymetric data collected by CRESP
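
    The Python sketch below shows the backward-then-forward flow of uncertainty that the abstract describes, using a Metropolis sampler on a deliberately simple stand-in problem: an exponential porosity-versus-depth model conditioned on a few synthetic observations, with the posterior samples then pushed through a prediction. The model form, priors, and data are illustrative only and bear no relation to the actual Amchitka groundwater model.

      import numpy as np

      rng = np.random.default_rng(1)

      # Illustrative stand-in for "new data": porosity observations versus depth.
      # Synthetic numbers; the real CRESP data and groundwater model are far richer.
      depth = np.array([100.0, 300.0, 600.0, 900.0, 1200.0])     # m
      phi_obs = np.array([0.28, 0.21, 0.14, 0.09, 0.06])         # porosity
      sigma_obs = 0.02

      def model(theta, z):
          """Exponential porosity decay with depth: phi0 * exp(-z / L)."""
          phi0, L = theta
          return phi0 * np.exp(-z / L)

      def log_post(theta):
          """Log posterior: flat priors on plausible ranges plus a Gaussian likelihood."""
          phi0, L = theta
          if not (0.0 < phi0 < 0.6 and 100.0 < L < 5000.0):
              return -np.inf
          resid = (phi_obs - model(theta, depth)) / sigma_obs
          return -0.5 * np.sum(resid**2)

      # Metropolis sampling: uncertainty flows "backward" from data onto parameters.
      theta = np.array([0.30, 800.0])
      lp = log_post(theta)
      chain = []
      for _ in range(20000):
          prop = theta + rng.normal(0.0, [0.01, 50.0])
          lp_prop = log_post(prop)
          if np.log(rng.uniform()) < lp_prop - lp:
              theta, lp = prop, lp_prop
          chain.append(theta.copy())
      chain = np.array(chain[5000:])          # discard burn-in

      # ...and "forward" from parameters into a prediction (porosity at 1500 m).
      pred = model(chain.T, 1500.0)
      print("phi0: %.3f +/- %.3f" % (chain[:, 0].mean(), chain[:, 0].std()))
      print("L   : %.0f +/- %.0f m" % (chain[:, 1].mean(), chain[:, 1].std()))
      print("predicted porosity at 1500 m: %.3f +/- %.3f" % (pred.mean(), pred.std()))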

  1. Design Verification Report Spent Nuclear Fuel (SNF) Project Canister Storage Building (CSB)

    SciTech Connect

    PICKETT, W.W.

    2000-09-22

    The Sub-project W379, "Spent Nuclear Fuel Canister Storage Building (CSB)," was established as part of the Spent Nuclear Fuel (SNF) Project. The primary mission of the CSB is to safely store spent nuclear fuel removed from the K Basins in dry storage until such time that it can be transferred to the national geological repository at Yucca Mountain, Nevada. This sub-project was initiated in late 1994 by a series of studies and conceptual designs. These studies determined that the partially constructed storage building, originally built as part of the Hanford Waste Vitrification Plant (HWVP) Project, could be redesigned to safely store the spent nuclear fuel. The scope of the CSB facility initially included a receiving station, a hot conditioning system, a storage vault, and a Multi-Canister Overpack (MCO) Handling Machine (MHM). Because of evolution of the project technical strategy, the hot conditioning system was deleted from the scope and MCO welding and sampling stations were added in its place. This report outlines the methods, procedures, and outputs developed by Project W379 to verify that the provided Structures, Systems, and Components (SSCs): satisfy the design requirements and acceptance criteria; perform their intended function; ensure that failure modes and hazards have been addressed in the design; and ensure that the SSCs as installed will not adversely impact other SSCs. Because this sub-project is still in the construction/start-up phase, not all verification activities have yet been performed (e.g., canister cover cap and welding fixture system verification, MCO Internal Gas Sampling equipment verification, and as-built verification). The verification activities identified in this report that are still to be performed will be added to the start-up punchlist and tracked to closure.

  2. Verification Study of Buoyancy-Driven Turbulent Nuclear Combustion

    SciTech Connect

    2010-01-01

    Buoyancy-driven turbulent nuclear combustion determines the rate of nuclear burning during the deflagration phase (i.e., the ordinary nuclear flame phase) of Type Ia supernovae, and hence the amount of nuclear energy released during this phase. It therefore determines the amount the white dwarf star expands prior to initiation of a detonation wave, and so the amount of radioactive nickel produced and thus the peak luminosity of the explosion. However, this key physical process is not fully understood. To better understand this process, the Flash Center has conducted an extensive series of large-scale 3D simulations of buoyancy-driven turbulent nuclear combustion for three different physical situations. This movie shows the results for some of these simulations. Credits: Science: Ray Bair, Katherine Riley, Argonne National Laboratory; Anshu Dubey, Don Lamb, Dongwook Lee, University of Chicago; Robert Fisher, University of Massachusetts at Dartmouth and Dean Townsley, University of Alabama

 Visualization: Jonathan Gallagher, University of Chicago; Randy Hudson, John Norris and Michael E. Papka, Argonne National Laboratory/University of Chicago This research used resources of the Argonne Leadership Computing Facility at Argonne National Laboratory, which is supported by the Office of Science of the U.S. Dept. of Energy under contract DE-AC02-06CH11357. This research was supported in part by the National Science Foundation through TeraGrid resources provided by the University of Chicago and Argonne National Laboratory.

  3. North Korea's nuclear weapons program: verification priorities and new challenges.

    SciTech Connect

    Moon, Duk-ho

    2003-12-01

    A comprehensive settlement of the North Korean nuclear issue may involve military, economic, political, and diplomatic components, many of which will require verification to ensure reciprocal implementation. This paper sets out potential verification methodologies that might address a wide range of objectives. The inspection requirements set by the International Atomic Energy Agency form the foundation, first as defined at the time of the Agreed Framework in 1994, and now as modified by the events since revelation of the North Korean uranium enrichment program in October 2002. In addition, refreezing the reprocessing facility and 5 MWe reactor, taking possession of possible weapons components and destroying weaponization capabilities add many new verification tasks. The paper also considers several measures for the short-term freezing of the North's nuclear weapon program during the process of negotiations, should that process be protracted. New inspection technologies and monitoring tools are applicable to North Korean facilities and may offer improved approaches over those envisioned just a few years ago. These are noted, and potential bilateral and regional verification regimes are examined.

  4. A gamma-ray verification system for special nuclear material

    SciTech Connect

    Lanier, R.G.; Prindle, A.L.; Friensehner, A.V.; Buckley, W.M.

    1994-07-01

    The Safeguards Technology Program at the Lawrence Livermore National Laboratory (LLNL) has developed a gamma-ray screening system for use by the Materials Management Section of the Engineering Sciences Division at LLNL for verifying the presence or absence of special nuclear material (SNM) in a sample. This system facilitates the measurements required under the "5610" series of US Department of Energy orders. MMGAM is an intelligent, menu-driven software application that runs on a personal computer and requires a precalibrated multi-channel analyzer and HPGe detector. It provides a very quick and easy-to-use means of determining the presence of SNM in a sample. After guiding the operator through a menu-driven set-up procedure, the system provides an on-screen GO/NO-GO indication after determining the system calibration status. This system represents advances over earlier systems in the areas of ease of use, operator training requirements, and quality assurance. The system records the gamma radiation from a sample using a sequence of measurements involving a background measurement followed immediately by a measurement of the unknown sample. Both spectra are stored and available for analysis or output. In the current application, the presence of the 235U, 238U, 239Pu, and 208Tl isotopes is indicated by extracting, from the stored spectra, four energy "windows" preset around gamma-ray lines characteristic of the radioactive decay of these nuclides. The system is easily extendible to more complicated problems.
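
    A minimal sketch of the background-then-sample, energy-window screening logic described above is given below in Python. The channel ranges, the 3-sigma decision rule, and the toy spectra are assumptions made for illustration; they are not the instrument's actual calibration or alarm logic.

      import numpy as np

      rng = np.random.default_rng(2)

      # Energy windows (detector channels) preset around gamma lines characteristic
      # of the listed nuclides.  The channel ranges are illustrative assumptions.
      WINDOWS = {"235U": (180, 192), "238U": (995, 1010),
                 "239Pu": (405, 420), "208Tl": (2600, 2625)}

      def screen(background, sample, n_sigma=3.0):
          """Flag each nuclide whose window shows a statistically significant excess
          of sample counts over the background measurement."""
          flags = {}
          for nuclide, (lo, hi) in WINDOWS.items():
              s, b = sample[lo:hi].sum(), background[lo:hi].sum()
              flags[nuclide] = (s - b) > n_sigma * np.sqrt(s + b)
          return flags

      # Toy spectra: flat background, plus a peak in the 235U window of the sample.
      background = rng.poisson(5.0, size=4096)
      sample = rng.poisson(5.0, size=4096)
      sample[183:189] += rng.poisson(40.0, size=6)

      for nuclide, present in screen(background, sample).items():
          print(f"{nuclide}: {'present' if present else 'not detected'}")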

  5. DOE/LLNL verification symposium on technologies for monitoring nuclear tests related to weapons proliferation

    SciTech Connect

    Nakanishi, K.K.

    1993-02-12

    The rapidly changing world situation has raised concerns regarding the proliferation of nuclear weapons and the ability to monitor a possible clandestine nuclear testing program. To address these issues, Lawrence Livermore National Laboratory's (LLNL) Treaty Verification Program sponsored a symposium funded by the US Department of Energy's (DOE) Office of Arms Control, Division of Systems and Technology. The DOE/LLNL Symposium on Technologies for Monitoring Nuclear Tests Related to Weapons Proliferation was held at the DOE's Nevada Operations Office in Las Vegas, May 6-7, 1992. This volume is a collection of several papers presented at the symposium. Several experts in monitoring technology presented invited talks assessing the status of monitoring technology with emphasis on the deficient areas requiring more attention in the future. In addition, several speakers discussed proliferation monitoring technologies being developed by the DOE's weapons laboratories.

  6. DOE/LLNL verification symposium on technologies for monitoring nuclear tests related to weapons proliferation

    SciTech Connect

    Nakanishi, K.K.

    1993-02-12

    The rapidly changing world situation has raised concerns regarding the proliferation of nuclear weapons and the ability to monitor a possible clandestine nuclear testing program. To address these issues, Lawrence Livermore National Laboratory's (LLNL) Treaty Verification Program sponsored a symposium funded by the US Department of Energy's (DOE) Office of Arms Control, Division of Systems and Technology. The DOE/LLNL Symposium on Technologies for Monitoring Nuclear Tests Related to Weapons Proliferation was held at the DOE's Nevada Operations Office in Las Vegas, May 6-7, 1992. This volume is a collection of several papers presented at the symposium. Several experts in monitoring technology presented invited talks assessing the status of monitoring technology with emphasis on the deficient areas requiring more attention in the future. In addition, several speakers discussed proliferation monitoring technologies being developed by the DOE's weapons laboratories.

  7. Development of a test system for verification and validation of nuclear transport simulations

    SciTech Connect

    White, Morgan C; Triplett, Brian S; Anghaie, Samim

    2008-01-01

    Verification and validation of nuclear data is critical to the accuracy of both stochastic and deterministic particle transport codes. In order to effectively test a set of nuclear data, the data must be applied to a wide variety of transport problems. Performing this task in a timely, efficient manner is tedious. The nuclear data team at Los Alamos National Laboratory, in collaboration with the University of Florida, has developed a methodology to automate the process of nuclear data verification and validation (V and V). This automated V and V process can efficiently test a number of data libraries using well-defined benchmark experiments, such as those in the International Criticality Safety Benchmark Experiment Project (ICSBEP). The process is implemented through an integrated set of Python scripts. Material and geometry data are read from an existing medium or given directly by the user to generate a benchmark experiment template file. The user specifies the choice of benchmark templates, codes, and libraries to form a V and V project. The Python scripts generate input decks for multiple transport codes from the templates, run and monitor individual jobs, and parse the relevant output automatically. The output can then be used to generate reports directly or can be stored in a database for later analysis. This methodology eases the burden on the user by reducing the amount of time and effort required for obtaining and compiling calculation results. The resource savings from this automated methodology could make it an enabling technology for more sophisticated data studies, such as nuclear data uncertainty quantification. Once deployed, this tool will allow the nuclear data community to more thoroughly test data libraries, leading to higher-fidelity data in the future.
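
    A skeletal version of the template-to-report flow described above might look like the Python sketch below: fill a benchmark template into a code-specific input deck, run the job, and parse a result from the output. The placeholder syntax, file names, command name, and the k-eff pattern are assumptions for illustration and do not reflect the actual LANL/University of Florida tool; the real transport-code launch is shown only as a commented call.

      import re
      import subprocess
      import tempfile
      from pathlib import Path

      def render_input(template_text: str, substitutions: dict) -> str:
          """Fill material/geometry/library fields into a template input deck."""
          for key, value in substitutions.items():
              template_text = template_text.replace("{{" + key + "}}", str(value))
          return template_text

      def run_job(command: list, log_path: Path) -> None:
          """Run one transport calculation, capturing its output for later parsing."""
          with log_path.open("w") as log:
              subprocess.run(command, stdout=log, stderr=subprocess.STDOUT, check=True)

      def parse_keff(output_text: str):
          """Pull k-eff and its uncertainty from an output listing (pattern is illustrative)."""
          match = re.search(r"k-?eff\s*=\s*([\d.]+)\s*\+/-\s*([\d.]+)", output_text, re.I)
          if match is None:
              raise ValueError("no k-eff found in output")
          return float(match.group(1)), float(match.group(2))

      if __name__ == "__main__":
          work = Path(tempfile.mkdtemp())
          template = "title HEU-MET-FAST-001\nlibrary {{LIBRARY}}\ntemperature {{TEMPERATURE_K}}\n"
          deck = render_input(template, {"LIBRARY": "endf71", "TEMPERATURE_K": 293.6})
          (work / "hmf001.inp").write_text(deck)

          # In the real workflow a transport code would be launched here, e.g.:
          #   run_job(["transport_code", "-i", str(work / "hmf001.inp")], work / "hmf001.out")
          # For this sketch, stand in a fake output listing instead.
          (work / "hmf001.out").write_text("... final keff = 0.99876 +/- 0.00042 ...\n")

          keff, sigma = parse_keff((work / "hmf001.out").read_text())
          print(f"HEU-MET-FAST-001 / endf71: k-eff = {keff:.5f} +/- {sigma:.5f}")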

  8. REPORT OF THE WORKSHOP ON NUCLEAR FACILITY DESIGN INFORMATION EXAMINATION AND VERIFICATION FOR SAFEGUARDS

    SciTech Connect

    Richard Metcalf; Robert Bean

    2009-10-01

    Executive Summary: The International Atomic Energy Agency (IAEA) implements nuclear safeguards and verifies that countries are compliant with their international nuclear safeguards agreements. One of the key provisions in the safeguards agreement is the requirement that the country provide nuclear facility design and operating information to the IAEA relevant to safeguarding the facility, and at a very early stage. This provides the opportunity for the IAEA to verify the safeguards-relevant features of the facility and to periodically ensure that those features have not changed. The national authorities (State System of Accounting for and Control of Nuclear Material - SSAC) provide the design information for all facilities within a country to the IAEA. The design information is conveyed using the IAEA’s Design Information Questionnaire (DIQ) and specifies: (1) Identification of the facility’s general character, purpose, capacity, and location; (2) Description of the facility’s layout and nuclear material form, location, and flow; (3) Description of the features relating to nuclear material accounting, containment, and surveillance; and (4) Description of existing and proposed procedures for nuclear material accounting and control, with identification of nuclear material balance areas. The DIQ is updated as required by written addendum. IAEA safeguards inspectors examine and verify this information in design information examination (DIE) and design information verification (DIV) activities to confirm that the facility has been constructed or is being operated as declared by the facility operator and national authorities, and to develop a suitable safeguards approach. Under the Next Generation Safeguards Initiative (NGSI), the National Nuclear Security Administration's (NNSA) Office of Non-Proliferation and International Security identified the need for more effective and efficient verification of design information by the IAEA for improving international safeguards

  9. Development of a Consensus Standard for Verification and Validation of Nuclear System Thermal-Fluids Software

    SciTech Connect

    Edwin A. Harvego; Richard R. Schultz; Ryan L. Crane

    2011-12-01

    With the resurgence of nuclear power and increased interest in advanced nuclear reactors as an option to supply abundant energy without the associated greenhouse gas emissions of the more conventional fossil fuel energy sources, there is a need to establish internationally recognized standards for the verification and validation (V&V) of software used to calculate the thermal-hydraulic behavior of advanced reactor designs for both normal operation and hypothetical accident conditions. To address this need, ASME (American Society of Mechanical Engineers) Standards and Certification has established the V&V 30 Committee, under the jurisdiction of the V&V Standards Committee, to develop a consensus standard for verification and validation of software used for design and analysis of advanced reactor systems. The initial focus of this committee will be on the V&V of system analysis and computational fluid dynamics (CFD) software for nuclear applications. To limit the scope of the effort, the committee will further limit its focus to software to be used in the licensing of High-Temperature Gas-Cooled Reactors. In this framework, the Standard should conform to Nuclear Regulatory Commission (NRC) and other regulatory practices, procedures and methods for licensing of nuclear power plants as embodied in the United States (U.S.) Code of Federal Regulations and other pertinent documents such as Regulatory Guide 1.203, 'Transient and Accident Analysis Methods' and NUREG-0800, 'NRC Standard Review Plan'. In addition, the Standard should be consistent with applicable sections of ASME NQA-1-2008 'Quality Assurance Requirements for Nuclear Facility Applications (QA)'. This paper describes the general requirements for the proposed V&V 30 Standard, which include: (a) applicable NRC and other regulatory requirements for defining the operational and accident domain of a nuclear system that must be considered if the system is to be licensed, (b) the corresponding calculation domain of

  10. Design Verification Report Spent Nuclear Fuel (SNF) Project Canister Storage Building (CSB)

    SciTech Connect

    BAZINET, G.D.

    2000-11-03

    The Sub-project W379, "Spent Nuclear Fuel Canister Storage Building (CSB)," was established as part of the Spent Nuclear Fuel (SNF) Project. The primary mission of the CSB is to safely store spent nuclear fuel removed from the K Basins in dry storage until such time that it can be transferred to the national geological repository at Yucca Mountain, Nevada. This sub-project was initiated in late 1994 by a series of studies and conceptual designs. These studies determined that the partially constructed storage building, originally built as part of the Hanford Waste Vitrification Plant (HWVP) Project, could be redesigned to safely store the spent nuclear fuel. The scope of the CSB facility initially included a receiving station, a hot conditioning system, a storage vault, and a Multi-Canister Overpack (MCO) Handling Machine (MHM). Because of evolution of the project technical strategy, the hot conditioning system was deleted from the scope and MCO welding and sampling stations were added in its place. This report outlines the methods, procedures, and outputs developed by Project W379 to verify that the provided Structures, Systems, and Components (SSCs): satisfy the design requirements and acceptance criteria; perform their intended function; ensure that failure modes and hazards have been addressed in the design; and ensure that the SSCs as installed will not adversely impact other SSCs. The original version of this document was prepared by Vista Engineering for the SNF Project. The purpose of this revision is to document completion of verification actions that were pending at the time the initial report was prepared. Verification activities for the installed and operational SSCs have been completed. Verification of future additions to the CSB related to the canister cover cap and welding fixture system and MCO Internal Gas Sampling equipment will be completed as appropriate for those components. The open items related to verification of those requirements are noted

  11. A physical zero-knowledge object-comparison system for nuclear warhead verification

    PubMed Central

    Philippe, Sébastien; Goldston, Robert J.; Glaser, Alexander; d'Errico, Francesco

    2016-01-01

    Zero-knowledge proofs are mathematical cryptographic methods to demonstrate the validity of a claim while providing no further information beyond the claim itself. The possibility of using such proofs to process classified and other sensitive physical data has attracted attention, especially in the field of nuclear arms control. Here we demonstrate a non-electronic fast neutron differential radiography technique using superheated emulsion detectors that can confirm that two objects are identical without revealing their geometry or composition. Such a technique could form the basis of a verification system that could confirm the authenticity of nuclear weapons without sharing any secret design information. More broadly, by demonstrating a physical zero-knowledge proof that can compare physical properties of objects, this experiment opens the door to developing other such secure proof-systems for other applications. PMID:27649477

  12. A physical zero-knowledge object-comparison system for nuclear warhead verification

    SciTech Connect

    Philippe, Sébastien; Goldston, Robert J.; Glaser, Alexander; d’Errico, Francesco

    2016-09-20

    Zero-knowledge proofs are mathematical cryptographic methods to demonstrate the validity of a claim while providing no further information beyond the claim itself. The possibility of using such proofs to process classified and other sensitive physical data has attracted attention, especially in the field of nuclear arms control. Here we demonstrate a non-electronic fast neutron differential radiography technique using superheated emulsion detectors that can confirm that two objects are identical without revealing their geometry or composition. Such a technique could form the basis of a verification system that could confirm the authenticity of nuclear weapons without sharing any secret design information. More broadly, by demonstrating a physical zero-knowledge proof that can compare physical properties of objects, this experiment opens the door to developing other such secure proof-systems for other applications.

  13. A physical zero-knowledge object-comparison system for nuclear warhead verification.

    PubMed

    Philippe, Sébastien; Goldston, Robert J; Glaser, Alexander; d'Errico, Francesco

    2016-09-20

    Zero-knowledge proofs are mathematical cryptographic methods to demonstrate the validity of a claim while providing no further information beyond the claim itself. The possibility of using such proofs to process classified and other sensitive physical data has attracted attention, especially in the field of nuclear arms control. Here we demonstrate a non-electronic fast neutron differential radiography technique using superheated emulsion detectors that can confirm that two objects are identical without revealing their geometry or composition. Such a technique could form the basis of a verification system that could confirm the authenticity of nuclear weapons without sharing any secret design information. More broadly, by demonstrating a physical zero-knowledge proof that can compare physical properties of objects, this experiment opens the door to developing other such secure proof-systems for other applications.

  14. A physical zero-knowledge object-comparison system for nuclear warhead verification

    DOE PAGES

    Philippe, Sébastien; Goldston, Robert J.; Glaser, Alexander; ...

    2016-09-20

    Zero-knowledge proofs are mathematical cryptographic methods to demonstrate the validity of a claim while providing no further information beyond the claim itself. The possibility of using such proofs to process classified and other sensitive physical data has attracted attention, especially in the field of nuclear arms control. Here we demonstrate a non-electronic fast neutron differential radiography technique using superheated emulsion detectors that can confirm that two objects are identical without revealing their geometry or composition. Such a technique could form the basis of a verification system that could confirm the authenticity of nuclear weapons without sharing any secret design information. More broadly, by demonstrating a physical zero-knowledge proof that can compare physical properties of objects, this experiment opens the door to developing other such secure proof-systems for other applications.

  15. A physical zero-knowledge object-comparison system for nuclear warhead verification

    NASA Astrophysics Data System (ADS)

    Philippe, Sébastien; Goldston, Robert J.; Glaser, Alexander; D'Errico, Francesco

    2016-09-01

    Zero-knowledge proofs are mathematical cryptographic methods to demonstrate the validity of a claim while providing no further information beyond the claim itself. The possibility of using such proofs to process classified and other sensitive physical data has attracted attention, especially in the field of nuclear arms control. Here we demonstrate a non-electronic fast neutron differential radiography technique using superheated emulsion detectors that can confirm that two objects are identical without revealing their geometry or composition. Such a technique could form the basis of a verification system that could confirm the authenticity of nuclear weapons without sharing any secret design information. More broadly, by demonstrating a physical zero-knowledge proof that can compare physical properties of objects, this experiment opens the door to developing other such secure proof-systems for other applications.

  16. Help

    ERIC Educational Resources Information Center

    Tollefson, Ann

    2009-01-01

    Planning to start or expand a K-8 critical language program? Looking for support in doing so? There "may" be help at the federal level for great ideas and strong programs. While there have been various pools of federal dollars available to support world language programs for a number of years, the federal government's interest in…

  17. Technology diffusion of a different nature: Applications of nuclear safeguards technology to the chemical weapons verification regime

    SciTech Connect

    Kadner, S.P.; Reisman, A.; Turpen, E.

    1996-10-01

    The following discussion focuses on the issue of arms control implementation from the standpoint of technology and technical assistance. Not only are the procedures and techniques for safeguarding nuclear materials undergoing substantial changes, but the implementation of the Chemical Weapons Convention (CWC) and the Biological Weapons Convention (BWC) will give rise to technical difficulties unprecedented in the implementation of arms control verification. Although these regimes present new challenges, an analysis of the similarities between the nuclear and chemical weapons non-proliferation verification regimes illustrates the overlap in technological solutions. Just as cost-effective and efficient technologies can solve the problems faced by the nuclear safeguards community, these same technologies offer solutions for the CWC safeguards regime. With this in mind, experts at the Organization for the Prohibition of Chemical Weapons (OPCW), who are responsible for verification implementation, need to devise a CWC verification protocol that considers the technology already available. The functional similarity of IAEA and the OPCW, in conjunction with the technical necessities of both verification regimes, should receive attention with respect to the establishment of a technical assistance program. Lastly, the advanced status of the nuclear and chemical regime vis-a-vis the biological non-proliferation regime can inform our approach to implementation of confidence building measures for biological weapons.

  18. Verification of 235U mass content in nuclear fuel plates by an absolute method

    NASA Astrophysics Data System (ADS)

    El-Gammal, W.

    2007-01-01

    Nuclear safeguards refers to a verification system by which a State can control all nuclear materials (NM) and nuclear activities under its authority. An effective and efficient safeguards system must include a system of measurements with capabilities sufficient to verify such NM. Measurements of NM using absolute methods could eliminate the dependency on NM standards, which are necessary for other relative or semi-absolute methods. In this work, an absolute method has been investigated to verify the 235U mass content in nuclear fuel plates of Material Testing Reactor (MTR) type. The most intense gamma-ray signature, at 185.7 keV, emitted after α-decay of the 235U nucleus was employed in the method. The measuring system (an HPGe spectrometer) was mathematically calibrated for efficiency using the general Monte Carlo transport code MCNP-4B. The calibration results and the measured net count rate were used to estimate the 235U mass content in fuel plates at different detector-to-fuel plate distances. Two sets of fuel plates, containing natural and low enriched uranium, were measured at the Fuel Fabrication Facility. Average accuracies for the estimated 235U masses of about 2.62% and 0.3% are obtained for the fuel plates containing natural and low enriched uranium, respectively, with a precision of about 3%.
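
    The absolute relation implied by the abstract can be written as net 185.7 keV peak rate = m × A_sp × I_gamma × epsilon, where A_sp is the specific activity of 235U, I_gamma the 185.7 keV emission probability, and epsilon the full-energy-peak efficiency obtained from the MCNP model. The Python sketch below inverts this relation for the mass; the efficiency and count rate are illustrative stand-ins, and self-attenuation in the plate is assumed to be folded into the simulated efficiency. The nuclear-data constants are standard values, not taken from the paper.

      # Worked sketch of the absolute 235U assay:
      #   net 185.7 keV peak rate = m * A_sp * I_gamma * eff
      # eff stands in for the MCNP-computed full-energy-peak efficiency.
      LAMBDA_235U = 0.693147 / (7.04e8 * 3.156e7)   # decay constant, 1/s (T1/2 = 7.04e8 y)
      N_A = 6.022e23
      A_SP_235U = LAMBDA_235U * N_A / 235.0         # specific activity, Bq/g (~8.0e4)
      I_GAMMA = 0.572                               # 185.7 keV emission probability

      def u235_mass(net_rate_cps: float, peak_efficiency: float) -> float:
          """Invert the count-rate equation for the 235U mass in grams."""
          return net_rate_cps / (A_SP_235U * I_GAMMA * peak_efficiency)

      if __name__ == "__main__":
          eff = 1.0e-4        # illustrative full-energy-peak efficiency at this distance
          rate = 45.7         # illustrative net counts per second in the 185.7 keV peak
          print(f"specific activity of 235U: {A_SP_235U:.3e} Bq/g")
          print(f"estimated 235U mass: {u235_mass(rate, eff):.2f} g")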

  19. Verification of Dismantlement of Nuclear Warheads and Controls on Nuclear Materials

    DTIC Science & Technology

    1993-01-12

    would be easy for us, in consultation with other countries, to buy a large fraction of the Russian stockpile of HEU and convert it to LEU for use in...require about 450,000 kg-SWU per year. However, if the feed is 20 percent HEU (the highest enrichment defined as LEU by the IAEA) and the tail assay is...nuclear warheads, bans on the production of additional quantities of plutonium (Pu) and highly enriched uranium (HEU) for nuclear weapons and agreements on

  20. A New Approach to Nuclear Warhead Verification Using a Zero-Knowledge Protocol

    SciTech Connect

    Glaser, Alexander

    2012-05-16

    Warhead verification systems proposed to date fundamentally rely on the use of information barriers to prevent the release of classified design information. Measurements with information barriers significantly increase the complexity of inspection systems, make their certification and authentication difficult, and may reduce the overall confidence in the verifiability of future arms-control agreements. This talk presents a proof-of-concept of a new approach to nuclear warhead verification that minimizes the role of information barriers from the outset and envisions instead an inspection system that a priori avoids leakage of sensitive information using a so-called zero-knowledge protocol. The proposed inspection system is based on the template-matching approach and relies on active interrogation of a test object with 14-MeV neutrons. The viability of the method is examined with MCNP Monte Carlo neutron transport calculations modeling the experimental setup, an investigation of different diversion scenarios, and an analysis of the simulated data showing that it does not contain information about the properties of the inspected object.

  1. Applications of a Fast Neutron Detector System to Verification of Special Nuclear Materials

    NASA Astrophysics Data System (ADS)

    Mayo, Douglas R.; Byrd, Roger C.; Ensslin, Norbert; Krick, Merlyn S.; Mercer, David J.; Miller, Michael C.; Prettyman, Thomas H.; Russo, Phyllis A.

    1998-04-01

    An array of boron-loaded plastic optically coupled to bismuth germanate scintillators has been developed to detect neutrons for measurement of special nuclear materials. The phoswiched detection system has the advantage of a high neutron detection efficiency and short die-away time. This is achieved by mixing the moderator (plastic) and the detector (^10B) at the molecular level. Simulations indicate that the neutron capture probabilities equal or exceed those of the current thermal neutron multiplicity techniques, which have the moderator (polyethylene) and detectors (^3He gas proportional tubes) macroscopically separate. Experiments have been performed to characterize the response of these detectors and validate computer simulations. The fast neutron detection system may be applied to the quantitative assay of plutonium in high (α,n) backgrounds, with emphasis on safeguards and environmental scenarios. Additional applications of the instrument, in a non-quantitative mode, have been tested for possible verification activities involving dismantlement of nuclear weapons. A description of the detector system, simulations, and preliminary data will be presented.
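
    The die-away time mentioned above is the characteristic decay constant of the neutron capture-time distribution in the detector. As a rough illustration (not a property of the LANL instrument), the Python sketch below generates exponential capture times and recovers the die-away constant from a histogram fit; the 5 microsecond value is an arbitrary stand-in.

      import numpy as np

      rng = np.random.default_rng(4)

      # Toy capture-time distribution: n(t) ~ exp(-t / tau), tau = die-away time.
      tau_true_us = 5.0
      capture_times = rng.exponential(tau_true_us, size=20000)

      # Histogram the capture times and fit the decay constant on a log scale.
      counts, edges = np.histogram(capture_times, bins=60, range=(0.0, 30.0))
      centers = 0.5 * (edges[:-1] + edges[1:])
      mask = counts > 0
      slope, intercept = np.polyfit(centers[mask], np.log(counts[mask]), 1)
      print(f"fitted die-away time: {-1.0 / slope:.2f} us (true value {tau_true_us} us)")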

  2. Stabilized, hand-held, gamma-ray verification instrument for special nuclear materials

    SciTech Connect

    Fehlau, P.E.; Wiig, G.

    1988-01-01

    For many years, Los Alamos has developed intelligent, hand-held, search instruments for use by non-specialists to search for special nuclear materials (SNM). The instruments sense SNM by detecting its emitted radiation with scintillation detectors monitored by digital alarm circuitry. Now, we have developed a new hand-held instrument that can verify the presence or absence of particular radioisotopes by analyzing gamma-ray spectra. The new instrument is similar to recent, microprocessor-based, search instruments, but has LED detector stabilization, three adjustable regions-of-interest, and additional operating programs for spectrum analysis. We call the new instrument an SNM verification instrument. Its spectrum analysis capability can verify the presence or absence of specific plutonium isotopes in containers or verify the presence of uranium and its enrichment. The instrument retains the search capability, light weight, and low-power requirement of its predecessors. Its ready portability, detector stabilization, and simple operation allow individuals with little technical training to verify the contents of SNM containers. 5 refs., 5 figs.

  3. Seismic surveillance: Nuclear test ban verification. Technical report, 1 Jan-31 Dec 90

    SciTech Connect

    Husebye, E.S.; Ruud, B.O.

    1991-02-27

    The project is aimed at seismic surveillance as part of on-going efforts to improve nuclear test ban verification capabilities. The problem is complex in the sense that underground explosions are most efficiently monitored by seismic means, but the distinction between signals emitted by natural earthquakes and explosions remains unclear, at least at local and regional distances. In other words, seismic wave propagation in a heterogeneous Earth may easily mask specific source signatures. Section 2 presents a new scheme for seismic signal detection on the basis of three-component (3C) seismograph recording systems. Dual test statistics are introduced, namely a conventional STA/LTA on a stand-alone basis or a combination of STA/LTA and a P-signal polarity measure. In the latter case a relatively low STA/LTA threshold can be used. An added advantage of the polarity estimate is that a corresponding slowness vector estimate is obtained. The detector is operated in three steps: (1) run the detector on a suite of bandpass-filtered records, (2) compress the raw detection log while retaining parameters from the best filter, and (3) group detected and identified phases into event families and estimate epicenter parameters.
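
    Step (1) of the scheme, a conventional STA/LTA trigger on a filtered trace, can be sketched in a few lines of Python as below. The window lengths, trigger threshold, and synthetic noise-plus-arrival trace are illustrative choices; the 3C polarity and slowness estimation of the combined detector is not reproduced here.

      import numpy as np

      rng = np.random.default_rng(3)

      def sta_lta(trace, fs, sta_win=1.0, lta_win=30.0):
          """Ratio of short-term to long-term average power (window lengths in seconds).
          The LTA window ends at sample i (noise level); the STA window starts at i."""
          n_sta, n_lta = int(sta_win * fs), int(lta_win * fs)
          power = trace ** 2
          csum = np.concatenate(([0.0], np.cumsum(power)))
          ratio = np.zeros(trace.size)
          for i in range(n_lta, trace.size - n_sta):
              lta = (csum[i] - csum[i - n_lta]) / n_lta
              sta = (csum[i + n_sta] - csum[i]) / n_sta
              ratio[i] = sta / max(lta, 1e-12)
          return ratio

      # Synthetic single-component record: unit-variance noise with a P-like
      # arrival 60 s into the trace (amplitudes and durations are illustrative).
      fs = 40.0
      t = np.arange(0.0, 120.0, 1.0 / fs)
      trace = rng.normal(0.0, 1.0, t.size)
      onset, dur = int(60 * fs), int(2 * fs)
      trace[onset:onset + dur] += 6.0 * np.exp(-np.arange(dur) / (0.5 * fs)) * rng.normal(0.0, 1.0, dur)

      ratio = sta_lta(trace, fs)
      trigger = np.flatnonzero(ratio > 4.0)
      if trigger.size:
          i = trigger[0]
          print(f"detection at ~{i / fs:.1f} s, STA/LTA = {ratio[i]:.1f}")
      else:
          print("no detection")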

  4. Positive nuclear BAP1 immunostaining helps differentiate non-small cell lung carcinomas from malignant mesothelioma.

    PubMed

    Carbone, Michele; Shimizu, David; Napolitano, Andrea; Tanji, Mika; Pass, Harvey I; Yang, Haining; Pastorino, Sandra

    2016-09-13

    The differential diagnosis between pleural malignant mesothelioma (MM) and lung cancer is often challenging. Immunohistochemical (IHC) stains used to distinguish these malignancies include markers that are most often positive in MM and less frequently positive in carcinomas, and vice versa. However, in about 10-20% of the cases, the IHC results can be confusing and inconclusive, and novel markers are sought to increase the diagnostic accuracy. We stained 45 non-small cell lung cancer samples (32 adenocarcinomas and 13 squamous cell carcinomas) with a monoclonal antibody for BRCA1-associated protein 1 (BAP1) and also with an IHC panel we routinely use to help differentiate MM from carcinomas, which includes calretinin, Wilms Tumor 1, cytokeratin 5, podoplanin D2-40, pankeratin CAM5.2, thyroid transcription factor 1, Napsin-A, and p63. Nuclear BAP1 expression was also analyzed in 35 MM biopsies. All 45 non-small cell lung cancer biopsies stained positive for nuclear BAP1, whereas 22/35 (63%) MM biopsies lacked nuclear BAP1 staining, consistent with previous data. Lack of BAP1 nuclear staining was associated with MM (two-tailed Fisher's Exact Test, P = 5.4 × 10⁻¹¹). Focal BAP1 staining was observed in a subset of samples, suggesting polyclonality. Diagnostic accuracy of other classical IHC markers was in agreement with previous studies. Our study indicated that absence of nuclear BAP1 staining helps differentiate MM from lung carcinomas. We suggest that BAP1 staining should be added to the IHC panel that is currently used to distinguish these malignancies.
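
    The reported test statistic can be checked directly from the counts given above (0 of 45 lung cancers versus 22 of 35 mesotheliomas lacking nuclear BAP1); the 2x2 table below is reconstructed from those counts purely for illustration.

    ```python
    from scipy.stats import fisher_exact

    # Rows: tumour type; columns: [nuclear BAP1 staining lost, retained].
    # Counts reconstructed from the abstract: 0/45 lung cancers and 22/35
    # mesotheliomas lacked nuclear BAP1 staining.
    table = [[0, 45],   # non-small cell lung cancer
             [22, 13]]  # malignant mesothelioma

    odds_ratio, p_value = fisher_exact(table, alternative="two-sided")
    print(f"two-tailed Fisher's exact test: P = {p_value:.1e}")  # on the order of 5e-11
    ```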

  5. Positive nuclear BAP1 immunostaining helps differentiate non-small cell lung carcinomas from malignant mesothelioma

    PubMed Central

    Carbone, Michele; Shimizu, David; Napolitano, Andrea; Tanji, Mika; Pass, Harvey I.; Yang, Haining; Pastorino, Sandra

    2016-01-01

    The differential diagnosis between pleural malignant mesothelioma (MM) and lung cancer is often challenging. Immunohistochemical (IHC) stains used to distinguish these malignancies include markers that are most often positive in MM and less frequently positive in carcinomas, and vice versa. However, in about 10–20% of the cases, the IHC results can be confusing and inconclusive, and novel markers are sought to increase the diagnostic accuracy. We stained 45 non-small cell lung cancer samples (32 adenocarcinomas and 13 squamous cell carcinomas) with a monoclonal antibody for BRCA1-associated protein 1 (BAP1) and also with an IHC panel we routinely use to help differentiate MM from carcinomas, which includes calretinin, Wilms Tumor 1, cytokeratin 5, podoplanin D2-40, pankeratin CAM5.2, thyroid transcription factor 1, Napsin-A, and p63. Nuclear BAP1 expression was also analyzed in 35 MM biopsies. All 45 non-small cell lung cancer biopsies stained positive for nuclear BAP1, whereas 22/35 (63%) MM biopsies lacked nuclear BAP1 staining, consistent with previous data. Lack of BAP1 nuclear staining was associated with MM (two-tailed Fisher's Exact Test, P = 5.4 × 10⁻¹¹). Focal BAP1 staining was observed in a subset of samples, suggesting polyclonality. Diagnostic accuracy of other classical IHC markers was in agreement with previous studies. Our study indicated that absence of nuclear BAP1 staining helps differentiate MM from lung carcinomas. We suggest that BAP1 staining should be added to the IHC panel that is currently used to distinguish these malignancies. PMID:27447750

  6. Technology Foresight and nuclear test verification: a structured and participatory approach

    NASA Astrophysics Data System (ADS)

    Noack, Patrick; Gaya-Piqué, Luis; Haralabus, Georgios; Auer, Matthias; Jain, Amit; Grenard, Patrick

    2013-04-01

    As part of its mandate, the CTBTO's nuclear explosion monitoring programme aims to maintain its sustainability, effectiveness and its long-term relevance to the verification regime. As such, the PTS is conducting a Technology Foresight programme of activities to identify technologies, processes, concepts and ideas that may serve said purpose and become applicable within the next 20 years. Through the Technology Foresight activities (online conferences, interviews, surveys, workshops and other formats) we have involved the wider science community in the fields of seismology, infrasound, hydroacoustics, radionuclide technology, remote sensing and geophysical techniques. We have assembled a catalogue of over 200 items, which incorporate technologies, processes, concepts and ideas that will have direct future relevance to the IMS (International Monitoring System), IDC (International Data Centre) and OSI (On-Site Inspection) activities within the PTS. In order to render this catalogue as applicable and useful as possible for strategy and planning, we have devised a "taxonomy" based on seven categories, against which each technology is assessed through a peer-review mechanism. These categories are: 1. Focus area of the technology in question: identify whether the technology relates to (one or more of the following) improving our understanding of source and source physics; propagation modelling; data acquisition; data transport; data processing; broad modelling concepts; quality assurance and data storage. 2. Current Development Stage of the technology in question. Based on a scale from one to six, this measure is specific to PTS needs and broadly reflects Technology Readiness Levels (TRLs). 3. Impact of the technology on each of the following capabilities: detection, location, characterization, sustainment and confidence building. 4. Development cost: the anticipated monetary cost of validating a prototype (i.e. Development Stage 3) of the technology in question. 5. Time to

  7. Development of a Standard for Verification and Validation of Software Used to Calculate Nuclear System Thermal Fluids Behavior

    SciTech Connect

    Richard R. Schultz; Edwin A. Harvego; Ryan L. Crane

    2010-05-01

    With the resurgence of nuclear power and increased interest in advanced nuclear reactors as an option to supply abundant energy without the associated greenhouse gas emissions of the more conventional fossil fuel energy sources, there is a need to establish internationally recognized standards for the verification and validation (V&V) of software used to calculate the thermal-hydraulic behavior of advanced reactor designs for both normal operation and hypothetical accident conditions. To address this need, ASME (American Society of Mechanical Engineers) Standards and Certification has established the V&V 30 Committee, under the responsibility of the V&V Standards Committee, to develop a consensus Standard for verification and validation of software used for design and analysis of advanced reactor systems. The initial focus of this committee will be on the V&V of system analysis and computational fluid dynamics (CFD) software for nuclear applications. To limit the scope of the effort, the committee will further limit its focus to software to be used in the licensing of High-Temperature Gas-Cooled Reactors. In this framework, the standard should conform to Nuclear Regulatory Commission (NRC) practices, procedures and methods for licensing of nuclear power plants as embodied in the United States (U.S.) Code of Federal Regulations and other pertinent documents such as Regulatory Guide 1.203, “Transient and Accident Analysis Methods” and NUREG-0800, “NRC Standard Review Plan”. In addition, the standard should be consistent with applicable sections of ASME Standard NQA-1 (“Quality Assurance Requirements for Nuclear Facility Applications (QA)”). This paper describes the general requirements for the V&V Standard, which include: (a) the definition of the operational and accident domain of a nuclear system that must be considered if the system is to be licensed, (b) the corresponding calculational domain of the software that should encompass the nuclear operational

  8. Nuclear Energy -- Knowledge Base for Advanced Modeling and Simulation (NE-KAMS) Code Verification and Validation Data Standards and Requirements: Fluid Dynamics Version 1.0

    SciTech Connect

    Greg Weirs; Hyung Lee

    2011-09-01

    V&V and UQ are the primary means to assess the accuracy and reliability of M&S and, hence, to establish confidence in M&S. Though other industries are establishing standards and requirements for the performance of V&V and UQ, at present, the nuclear industry has not established such standards or requirements. However, the nuclear industry is beginning to recognize that such standards are needed and that the resources needed to support V&V and UQ will be very significant. In fact, no single organization has sufficient resources or expertise required to organize, conduct and maintain a comprehensive V&V and UQ program. What is needed is a systematic and standardized approach to establish and provide V&V and UQ resources at a national or even international level, with a consortium of partners from government, academia and industry. Specifically, what is needed is a structured and cost-effective knowledge base that collects, evaluates and stores verification and validation data, and shows how it can be used to perform V&V and UQ, leveraging collaboration and sharing of resources to support existing engineering and licensing procedures as well as science-based V&V and UQ processes. The Nuclear Energy Knowledge base for Advanced Modeling and Simulation (NE-KAMS) is being developed at the Idaho National Laboratory in conjunction with Bettis Laboratory, Sandia National Laboratories, Argonne National Laboratory, Utah State University and others with the objective of establishing a comprehensive and web-accessible knowledge base to provide V&V and UQ resources for M&S for nuclear reactor design, analysis and licensing. The knowledge base will serve as an important resource for technical exchange and collaboration that will enable credible and reliable computational models and simulations for application to nuclear power. NE-KAMS will serve as a valuable resource for the nuclear industry, academia, the national laboratories, the U.S. Nuclear Regulatory Commission (NRC) and

  9. Implementation of neutron counting techniques at US facilities for IAEA verification of excess materials from nuclear weapons production

    SciTech Connect

    Stewart, J.E.; Krick, M.S.; Langner, D.G.; Reilly, T.D.; Theis, W.; Lemaire, R.J.; Xiao, J.

    1995-08-01

    The U.S. Nonproliferation and Export Control Policy, announced by President Clinton before the United Nations General Assembly on September 27, 1993, commits the U.S. to placing under International Atomic Energy Agency (IAEA) Safeguards excess nuclear materials no longer needed for the U.S. nuclear deterrent. As of July 1, 1995, the IAEA had completed Initial Physical Inventory Verification (IPIV) at two facilities: a storage vault in the Oak Ridge Y-12 plant containing highly enriched uranium (HEU) metal and another storage vault in the Hanford Plutonium Finishing Plant (PFP) containing plutonium oxide and plutonium-bearing residues. Another plutonium storage vault, located at Rocky Flats, is scheduled for the IPIV in the fall of 1995. Conventional neutron coincidence counting is one of the routinely applied IAEA nondestructive assay (NDA) methods for verification of uranium and plutonium. However, at all three facilities mentioned above, neutron NDA equipment had to be modified or developed for specific facility needs such as the type and configuration of material placed under safeguards. This document describes those modifications and developments.

  10. Methodology, verification, and performance of the continuous-energy nuclear data sensitivity capability in MCNP6

    SciTech Connect

    Kiedrowski, B. C.; Brown, F. B.

    2013-07-01

    A continuous-energy sensitivity coefficient capability has been introduced into MCNP6. The methods for generating energy-resolved and energy-integrated sensitivity profiles are discussed. Results from the verification exercises that were performed are given, and these show that MCNP6 compares favorably with analytic solutions, direct density perturbations, and comparisons to TSUNAMI-3D and MONK. Run-time and memory requirements are assessed for typical applications, and these are shown to be reasonable with modern computing resources. (authors)

  11. Human factors design, verification, and validation for two types of control room upgrades at a nuclear power plant

    SciTech Connect

    Boring, Laurids Ronald

    2014-10-01

    This paper describes the NUREG-0711 based human factors engineering (HFE) phases and associated elements required to support design, verification and validation (V&V), and implementation of a new plant process computer (PPC) and turbine control system (TCS) at a representative nuclear power plant. This paper reviews ways to take a human-system interface (HSI) specification and use it when migrating legacy PPC displays or designing displays with new functionality. These displays undergo iterative usability testing during the design phase and then undergo an integrated system validation (ISV) in a full scope control room training simulator. Following the successful demonstration of operator performance with the systems during the ISV, the new system is implemented at the plant, first in the training simulator and then in the main control room.

  12. Plastid DNA sequencing and nuclear SNP genotyping help resolve the puzzle of central American Platanus

    PubMed Central

    De Castro, Olga; Di Maio, Antonietta; Lozada García, José Armando; Piacenti, Danilo; Vázquez-Torres, Mario; De Luca, Paolo

    2013-01-01

    Background and Aims Recent research on the history of Platanus reveals that hybridization phenomena occurred in the central American species. This study has two goals: to help resolve the evolutive puzzle of central American Platanus, and to test the potential of real-time polymerase chain reaction (PCR) for detecting ancient hybridization. Methods Sequencing of a uniparental plastid DNA marker [psbA-trnH(GUG) intergenic spacer] and qualitative and quantitative single nucleotide polymorphism (SNP) genotyping of biparental nuclear ribosomal DNA (nrDNA) markers [LEAFY intron 2 (LFY-i2) and internal transcribed spacer 2 (ITS2)] were used. Key Results Based on the SNP genotyping results, several Platanus accessions show the presence of hybridization/introgression, including some accessions of P. rzedowskii and of P. mexicana var. interior and one of P. mexicana var. mexicana from Oaxaca (= P. oaxacana). Based on haplotype analyses of the psbA-trnH spacer, five haplotypes were detected. The most common of these is present in taxa belonging to P. orientalis, P. racemosa sensu lato, some accessions of P. occidentalis sensu stricto (s.s.) from Texas, P. occidentalis var. palmeri, P. mexicana s.s. and P. rzedowskii. This is highly relevant to genetic relationships with the haplotypes present in P. occidentalis s.s. and P. mexicana var. interior. Conclusions Hybridization and introgression events between lineages ancestral to modern central and eastern North American Platanus species occurred. Plastid haplotypes and qualitative and quantitative SNP genotyping provide information critical for understanding the complex history of Mexican Platanus. Compared with the usual molecular techniques of sub-cloning, sequencing and genotyping, real-time PCR assay is a quick and sensitive technique for analysing complex evolutionary patterns. PMID:23798602

  13. Routine inspection effort required for verification of a nuclear material production cutoff convention

    SciTech Connect

    Fishbone, L.G.; Sanborn, J.

    1995-08-01

    Preliminary estimates of the inspection effort to verify a Nuclear Material Cutoff Convention are presented. The estimates are based on a database of about 650 facilities in a total of eight states (the five nuclear-weapons states and three "threshold" states), plus facility-specific inspection requirements. Typical figures for inspection requirements for specific facility types derive from IAEA experience, where applicable. Alternative estimates of inspection effort are used in cutoff options where full IAEA safeguards are not stipulated.

  14. Use of open source information and commercial satellite imagery for nuclear nonproliferation regime compliance verification by a community of academics

    NASA Astrophysics Data System (ADS)

    Solodov, Alexander

    The proliferation of nuclear weapons is a great threat to world peace and stability. The question of strengthening the nonproliferation regime has been open for a long period of time. In 1997 the International Atomic Energy Agency (IAEA) Board of Governors (BOG) adopted the Additional Safeguards Protocol. The purpose of the protocol is to enhance the IAEA's ability to detect undeclared production of fissile materials in member states. However, the IAEA does not always have sufficient human and financial resources to accomplish this task. Developed here is a concept for making use of human and technical resources available in academia that could be used to enhance the IAEA's mission. The objective of this research was to study the feasibility of an academic community using commercially or publicly available sources of information and products for the purpose of detecting covert facilities and activities intended for the unlawful acquisition of fissile materials or production of nuclear weapons. In this study, the availability and use of commercial satellite imagery systems, commercial computer codes for satellite imagery analysis, Comprehensive Test Ban Treaty (CTBT) verification International Monitoring System (IMS), publicly available information sources such as watchdog groups and press reports, and Customs Services information were explored. A system for integrating these data sources to form conclusions was also developed. The results proved that publicly and commercially available sources of information and data analysis can be a powerful tool in tracking violations in the international nuclear nonproliferation regime and a framework for implementing these tools in academic community was developed. As a result of this study a formation of an International Nonproliferation Monitoring Academic Community (INMAC) is proposed. This would be an independent organization consisting of academics (faculty, staff and students) from both nuclear weapon states (NWS) and

  15. Potential opportunities for nano materials to help enable enhanced nuclear fuel performance

    SciTech Connect

    McClellan, Kenneth J.

    2012-06-06

    This presentation is an overview of the technical challenges for development of nuclear fuels with enhanced performance and accident tolerance. Key specific aspects of improved fuel performance are noted. Examples of existing nanonuclear projects and concepts are presented and areas of potential focus are suggested. The audience for this presentation includes representatives from: DOE-NE, other national laboratories, industry and academia. This audience is a mixture of nanotechnology experts and nuclear energy researchers and managers.

  16. Development and verification of design methods for ducts in a space nuclear shield

    NASA Technical Reports Server (NTRS)

    Cerbone, R. J.; Selph, W. E.; Read, P. A.

    1972-01-01

    A practical method for computing the effectiveness of a space nuclear shield perforated by small tubing and cavities is reported. The calculations use solutions from a two-dimensional transport code and evaluate perturbations of that solution using last-flight estimates and other kernel integration techniques. In general, perturbations are viewed as a change in source strength of scattered radiation and a change in attenuation properties of the region.

  17. Analytical three-dimensional neutron transport benchmarks for verification of nuclear engineering codes. Final report

    SciTech Connect

    Ganapol, B.D.; Kornreich, D.E.

    1997-07-01

    Because of the requirement of accountability and quality control in the scientific world, a demand for high-quality analytical benchmark calculations has arisen in the neutron transport community. The intent of these benchmarks is to provide a numerical standard to which production neutron transport codes may be compared in order to verify proper operation. The overall investigation as modified in the second year renewal application includes the following three primary tasks. Task 1 on two dimensional neutron transport is divided into (a) single medium searchlight problem (SLP) and (b) two-adjacent half-space SLP. Task 2 on three-dimensional neutron transport covers (a) point source in arbitrary geometry, (b) single medium SLP, and (c) two-adjacent half-space SLP. Task 3 on code verification, includes deterministic and probabilistic codes. The primary aim of the proposed investigation was to provide a suite of comprehensive two- and three-dimensional analytical benchmarks for neutron transport theory applications. This objective has been achieved. The suite of benchmarks in infinite media and the three-dimensional SLP are a relatively comprehensive set of one-group benchmarks for isotropically scattering media. Because of time and resource limitations, the extensions of the benchmarks to include multi-group and anisotropic scattering are not included here. Presently, however, enormous advances in the solution for the planar Green's function in an anisotropically scattering medium have been made and will eventually be implemented in the two- and three-dimensional solutions considered under this grant. Of particular note in this work are the numerical results for the three-dimensional SLP, which have never before been presented. The results presented were made possible only because of the tremendous advances in computing power that have occurred during the past decade.

  18. Taming the SQUID: How a nuclear physics education (mostly) helped my career in applied physics

    NASA Astrophysics Data System (ADS)

    Espy, Michelle

    2013-10-01

    My degree is in experimental nuclear physics, specifically studying the interaction of pions with nuclei. But after graduation I accepted a post-doctoral research position with a team based on applications of the Superconducting Quantum Interference Device (SQUID) to the study of the human brain. Despite knowing nothing about the brain or SQUIDs to start with, I have gone on to enjoy a career in applications of the SQUID and other sensors to the detection of weak magnetic fields in a variety of problems from brain studies (magnetoencephalography) to ultra-low field nuclear magnetic resonance for detection of explosives and illicit material. In this talk I will present some background on SQUIDs and their application to the detection of ultra-weak magnetic fields of biological and non-biological origin. I will also provide a little insight into what it has been like to use a nuclear physics background to pursue other types of science.

  19. Measurement and verification of positron emitter nuclei generated at each treatment site by target nuclear fragment reactions in proton therapy

    SciTech Connect

    Miyatake, Aya; Nishio, Teiji; Ogino, Takashi; Saijo, Nagahiro; Esumi, Hiroyasu; Uesaka, Mitsuru

    2010-08-15

    Purpose: The purpose of this study is to verify the characteristics of the positron emitter nuclei generated at each treatment site by proton irradiation. Methods: Proton therapy using a beam on-line PET system mounted on a rotating gantry port (BOLPs-RGp), which the authors developed, is provided at the National Cancer Center Kashiwa, Japan. BOLPs-RGp is a monitoring system that can confirm the activity distribution of the proton-irradiated volume by detecting pairs of annihilation gamma rays emitted coincidentally from positron emitter nuclei generated by target nuclear fragment reactions between the irradiated protons and nuclei in the human body. Activity was measured from the start of proton irradiation until 200 s after the end of the irradiation. The characteristics of the positron emitter nuclei generated in a patient's body were verified by measuring the activity distribution at each treatment site using BOLPs-RGp. Results: The decay curves of the measured activity could be approximated using two or three half-life values regardless of the treatment site. The activity component with a half-life of about 2 min was important for confirming the proton-irradiated volume. Conclusions: For each proton treatment site, the characteristics of the generated positron emitter nuclei were verified using BOLPs-RGp. For monitoring of the proton-irradiated volume, the detection of ¹⁵O generated in the human body was important.
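
    To illustrate the multi-component decay analysis described above, the sketch below fits a sum of two exponential half-life components to a synthetic post-irradiation activity curve with scipy; the amplitudes, the assumed components (122 s for ¹⁵O and roughly 20 min for ¹¹C), and the data are illustrative, not the paper's measurements.

    ```python
    import numpy as np
    from scipy.optimize import curve_fit

    def two_component_decay(t, a1, t_half1, a2, t_half2):
        """Sum of two exponential decays; half-lives in seconds."""
        return (a1 * np.exp(-np.log(2) * t / t_half1)
                + a2 * np.exp(-np.log(2) * t / t_half2))

    # Synthetic post-irradiation curve (illustrative only): a ~2 min component
    # such as 15O on top of a longer-lived component such as 11C.
    t = np.arange(0.0, 1200.0, 5.0)                 # seconds after end of irradiation
    rng = np.random.default_rng(1)
    counts = rng.poisson(two_component_decay(t, 800.0, 122.0, 300.0, 1220.0))

    popt, _ = curve_fit(two_component_decay, t, counts,
                        p0=[500.0, 100.0, 500.0, 1000.0])
    print(f"fitted half-lives: {popt[1]:.0f} s and {popt[3]:.0f} s")
    ```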

  20. Use of nuclear explosions to create gas condensate storage in the USSR. LLL Treaty Verification Program

    SciTech Connect

    Borg, I.Y.

    1982-08-23

    The Soviet Union has described industrial use of nuclear explosions to produce underground hydrocarbon storage. Two examples are in the giant Orenburg gas condensate field. There is good reason to believe that three additional cavities were created in bedded salt in the yet to be fully developed giant Astrakhan gas condensate field in the region of the lower Volga. Although contrary to usual western practice, the cavities are believed to be used to store H₂S-rich, unstable gas condensate prior to processing in the main gas plants located tens of kilometers from the producing fields. Detonations at Orenburg and Astrakhan preceded plant construction. The use of nuclear explosions at several sites to create underground storage of highly corrosive liquid hydrocarbons suggests that the Soviets consider this time and cost effective. The possible benefits from such a plan include degasification and stabilization of the condensate before final processing, and providing storage of condensate during periods of abnormally high natural gas production or during periods when condensate but not gas processing facilities are undergoing maintenance. Judging from information provided by Soviet specialists, the individual cavities have a maximum capacity on the order of 50,000 m³.

  1. Routine inspection effort required for verification of a nuclear material production cutoff convention

    SciTech Connect

    Dougherty, D.; Fainberg, A.; Sanborn, J.; Allentuck, J.; Sun, C.

    1996-11-01

    On 27 September 1993, President Clinton proposed "... a multilateral convention prohibiting the production of highly enriched uranium or plutonium for nuclear explosives purposes or outside of international safeguards." The UN General Assembly subsequently adopted a resolution recommending negotiation of a non-discriminatory, multilateral, and internationally and effectively verifiable treaty (hereinafter referred to as "the Cutoff Convention") banning the production of fissile material for nuclear weapons. The matter is now on the agenda of the Conference on Disarmament, although not yet under negotiation. This accord would, in effect, place all fissile material (defined as highly enriched uranium and plutonium) produced after entry into force (EIF) of the accord under international safeguards. "Production" would mean separation of the material in question from radioactive fission products, as in spent fuel reprocessing, or enrichment of uranium above the 20% level, which defines highly enriched uranium (HEU). Facilities where such production could occur would be safeguarded to verify that either such production is not occurring or that all material produced at these facilities is maintained under safeguards.

  2. Verification of screening level for decontamination implemented after Fukushima nuclear accident

    PubMed Central

    Ogino, Haruyuki; Ichiji, Takeshi; Hattori, Takatoshi

    2012-01-01

    The screening level for decontamination that has been applied to the surface of the human body and contaminated handled objects after the Fukushima nuclear accident was verified by assessing the doses that arise from external irradiation, ingestion, inhalation and skin contamination. The result shows that the annual effective dose that arises from handled objects contaminated with the screening level for decontamination (i.e. 100 000 counts per minute) is <1 mSv y⁻¹, which can be considered as the intervention exemption level in accordance with the International Commission on Radiological Protection recommendations. Furthermore, the screening level is also found to protect the skin from the incidence of a deterministic effect because the absorbed dose of the skin that arises from direct deposition on the surface of the human body is calculated to be lower than the threshold of the deterministic effect assuming a practical exposure duration. PMID:22228683

  3. Indian Point Nuclear Power Station: verification analysis of County Radiological Emergency-Response Plans

    SciTech Connect

    Nagle, J.; Whitfield, R.

    1983-05-01

    This report was developed as a management tool for use by the Federal Emergency Management Agency (FEMA) Region II staff. The analysis summarized in this report was undertaken to verify the extent to which procedures, training programs, and resources set forth in the County Radiological Emergency Response Plans (CRERPs) for Orange, Putnam, and Westchester counties in New York had been realized prior to the March 9, 1983, exercise of the Indian Point Nuclear Power Station near Buchanan, New York. To this end, a telephone survey of county emergency response organizations was conducted between January 19 and February 22, 1983. This report presents the results of responses obtained from this survey of county emergency response organizations.

  4. Ensuring Longevity: Ancient Glasses Help Predict Durability of Vitrified Nuclear Waste

    SciTech Connect

    Weaver, Jamie L.; McCloy, John S.; Ryan, Joseph V.; Kruger, Albert A.

    2016-05-01

    How does glass alter with time? For the last hundred years this has been an important question in the fields of object conservation and archeology to ensure the preservation of glass artifacts. This same question is part of the development and assessment of durable glass waste forms for the immobilization of nuclear wastes. Researchers have developed experiments ranging from simple to highly sophisticated to answer this question, and, as a result, have gained significant insight into the mechanisms that drive glass alteration. However, the gathered data have been predominantly applicable to only short-term alteration times, i.e. over the course of decades. What have remained elusive are the long-term mechanisms of glass alteration [1]. These mechanisms are of particular interest to the international nuclear waste glass community as they strive to ensure that vitrified products will be durable for thousands to tens of thousands of years. For the last thirty years this community has been working to fill this research gap by partnering with archeologists, museum curators, and geologists to identify hundred to million-year-old glass analogues that have altered in environments representative of those expected at potential nuclear waste disposal sites. The process of identifying a waste glass relevant analogue is challenging as it requires scientists to relate data collected from short-term laboratory experiments to observations made from long-term analogues and extensive geochemical modeling.

  5. Verification of the Cross Immunoreactivity of A60, a Mouse Monoclonal Antibody against Neuronal Nuclear Protein

    PubMed Central

    Mao, Shanping; Xiong, Guoxiang; Zhang, Lei; Dong, Huimin; Liu, Baohui; Cohen, Noam A.; Cohen, Akiva S.

    2016-01-01

    A60, the mouse monoclonal antibody against the neuronal nuclear protein (NeuN), is the most widely used neuronal marker in neuroscience research and neuropathological assays. Previous studies identified fragments of A60-immunoprecipitated protein as Synapsin I (Syn I), suggesting the antibody will demonstrate cross immunoreactivity. However, the likelihood of cross reactivity has never been verified by immunohistochemical techniques. Using our established tissue processing and immunofluorescent staining protocols, we found that A60 consistently labeled mossy fiber terminals in hippocampal area CA3. These A60-positive mossy fiber terminals could also be labeled by Syn I antibody. After treating brain slices with saponin in order to better preserve various membrane and/or vesicular proteins for immunostaining, we observed that A60 could also label additional synapses in various brain areas. Therefore, we used A60 together with a rabbit monoclonal NeuN antibody to confirm the existence of this cross reactivity. We showed that the putative band positive for A60 and Syn I could not be detected by the rabbit anti-NeuN in Western blotting. Although as efficient as Millipore A60 at recognizing neuronal nuclei, the rabbit NeuN antibody demonstrated no labeling of synaptic structures in immunofluorescent staining. The present study successfully verified the cross reactivity present in immunohistochemistry, cautioning that A60 may not be the ideal biomarker to verify neuronal identity due to its cross immunoreactivity. In contrast, the rabbit monoclonal NeuN antibody used in this study may be a better candidate to substitute for A60. PMID:27242450

  6. Independent Verification and Validation Of SAPHIRE 8 Software Quality Assurance Plan Project Number: N6423 U.S. Nuclear Regulatory Commission

    SciTech Connect

    Kent Norris

    2010-02-01

    This report provides an evaluation of the Software Quality Assurance Plan. The Software Quality Assurance Plan is intended to ensure that all actions necessary for the software life cycle (verification and validation activities; documentation and deliverables; project management; configuration management; nonconformance reporting and corrective action; and quality assessment and improvement) have been planned, and that a systematic pattern of actions is established to provide adequate confidence that the software product conforms to established technical requirements and meets the contractual commitments prepared by the sponsor, the Nuclear Regulatory Commission.

  7. Nuclear data verification based on Monte Carlo simulations of the LLNL pulsed-sphere benchmark experiments (1979 & 1986) using the Mercury code

    SciTech Connect

    Descalle, M; Pruet, J

    2008-06-09

    Livermore's nuclear data group developed a new verification and validation test suite to ensure the quality of data used in application codes. This is based on models of LLNL's pulsed sphere fusion shielding benchmark experiments. Simulations were done with Mercury, a 3D particle transport Monte Carlo code using continuous-energy cross-section libraries. Results were compared to measurements of neutron leakage spectra generated by 14 MeV neutrons in 17 target assemblies (for a blank target assembly, H₂O, Teflon, C, N₂, Al, Si, Ti, Fe, Cu, Ta, W, Au, Pb, ²³²Th, ²³⁵U, ²³⁸U, and ²³⁹Pu). We also tested the fidelity of simulations for photon production associated with neutron interactions in the different materials. Gamma-ray leakage energy per neutron was obtained from a simple 1D spherical geometry assembly and compared to three codes (TART, COG, MCNP5) and several versions of the Evaluated Nuclear Data File (ENDF) and Evaluated Nuclear Data Libraries (ENDL) cross-section libraries. These tests uncovered a number of errors in photon production cross-sections, and were instrumental to the V&V of different cross-section libraries. Development of the pulsed sphere tests also uncovered the need for new Mercury capabilities. To enable simulations of neutron time-of-flight experiments the nuclear data group implemented an improved treatment of biased angular scattering in MCAPM.
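
    As background for the time-of-flight comparisons mentioned above, the standard relativistic conversion from neutron flight time to kinetic energy can be sketched as follows; the flight path length and example arrival time are placeholder values and do not describe the Mercury/MCAPM treatment itself.

    ```python
    import math

    M_N_C2 = 939.565          # neutron rest energy, MeV
    C = 299_792_458.0         # speed of light, m/s

    def tof_to_energy(flight_time_s, path_length_m):
        """Relativistic kinetic energy (MeV) of a neutron from its time of flight."""
        beta = path_length_m / (C * flight_time_s)
        if beta >= 1.0:
            raise ValueError("flight time too short for the given path length")
        gamma = 1.0 / math.sqrt(1.0 - beta * beta)
        return M_N_C2 * (gamma - 1.0)

    # Example with a placeholder 9.5 m flight path: a neutron arriving about
    # 185 ns after the source burst corresponds to roughly 14 MeV.
    print(f"{tof_to_energy(185e-9, 9.5):.1f} MeV")
    ```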

  8. Proceedings of the 22nd Annual DoD/DOE Seismic Research Symposium: Planning for Verification of and Compliance with the Comprehensive Nuclear-Test-Ban Treaty (CTBT)

    SciTech Connect

    Nichols, James W., LTC

    2000-09-15

    These proceedings contain papers prepared for the 22nd Annual DoD/DOE Seismic Research Symposium: Planning for Verification of and Compliance with the Comprehensive Nuclear-Test-Ban Treaty (CTBT), held 13-15 September 2000 in New Orleans, Louisiana. These papers represent the combined research related to ground-based nuclear explosion monitoring funded by the National Nuclear Security Administration (NNSA), Defense Threat Reduction Agency (DTRA), Air Force Technical Applications Center (AFTAC), Department of Defense (DoD), US Army Space and Missile Defense Command, Defense Special Weapons Agency (DSWA), and other invited sponsors. The scientific objectives of the research are to improve the United States capability to detect, locate, and identify nuclear explosions. The purpose of the meeting is to provide the sponsoring agencies, as well as potential users, an opportunity to review research accomplished during the preceding year and to discuss areas of investigation for the coming year. For the researchers, it provides a forum for the exchange of scientific information toward achieving program goals, and an opportunity to discuss results and future plans. Paper topics include: seismic regionalization and calibration; detection and location of sources; wave propagation from source to receiver; the nature of seismic sources, including mining practices; hydroacoustic, infrasound, and radionuclide methods; on-site inspection; and data processing.

  9. Environmental Detection of Clandestine Nuclear Weapon Programs

    NASA Astrophysics Data System (ADS)

    Kemp, R. Scott

    2016-06-01

    Environmental sensing of nuclear activities has the potential to detect nuclear weapon programs at early stages, deter nuclear proliferation, and help verify nuclear accords. However, no robust system of detection has been deployed to date. This can be variously attributed to high costs, technical limitations in detector technology, simple countermeasures, and uncertainty about the magnitude or behavior of potential signals. In this article, current capabilities and promising opportunities are reviewed. Systematic research in a variety of areas could improve prospects for detecting covert nuclear programs, although the potential for countermeasures suggests long-term verification of nuclear agreements will need to rely on methods other than environmental sensing.

  10. Nuclear Energy Advanced Modeling and Simulation Waste Integrated Performance and Safety Codes (NEAMS Waste IPSC) verification and validation plan. version 1.

    SciTech Connect

    Bartlett, Roscoe Ainsworth; Arguello, Jose Guadalupe, Jr.; Urbina, Angel; Bouchard, Julie F.; Edwards, Harold Carter; Freeze, Geoffrey A.; Knupp, Patrick Michael; Wang, Yifeng; Schultz, Peter Andrew; Howard, Robert; McCornack, Marjorie Turner

    2011-01-01

    The objective of the U.S. Department of Energy Office of Nuclear Energy Advanced Modeling and Simulation Waste Integrated Performance and Safety Codes (NEAMS Waste IPSC) is to provide an integrated suite of computational modeling and simulation (M&S) capabilities to quantitatively assess the long-term performance of waste forms in the engineered and geologic environments of a radioactive-waste storage facility or disposal repository. To meet this objective, NEAMS Waste IPSC M&S capabilities will be applied to challenging spatial domains, temporal domains, multiphysics couplings, and multiscale couplings. A strategic verification and validation (V&V) goal is to establish evidence-based metrics for the level of confidence in M&S codes and capabilities. Because it is economically impractical to apply the maximum V&V rigor to each and every M&S capability, M&S capabilities will be ranked for their impact on the performance assessments of various components of the repository systems. Those M&S capabilities with greater impact will require a greater level of confidence and a correspondingly greater investment in V&V. This report includes five major components: (1) a background summary of the NEAMS Waste IPSC to emphasize M&S challenges; (2) the conceptual foundation for verification, validation, and confidence assessment of NEAMS Waste IPSC M&S capabilities; (3) specifications for the planned verification, validation, and confidence-assessment practices; (4) specifications for the planned evidence information management system; and (5) a path forward for the incremental implementation of this V&V plan.

  11. Verification and validation benchmarks.

    SciTech Connect

    Oberkampf, William Louis; Trucano, Timothy Guy

    2007-02-01

    Verification and validation (V&V) are the primary means to assess the accuracy and reliability of computational simulations. V&V methods and procedures have fundamentally improved the credibility of simulations in several high-consequence fields, such as nuclear reactor safety, underground nuclear waste storage, and nuclear weapon safety. Although the terminology is not uniform across engineering disciplines, code verification deals with assessing the reliability of the software coding, and solution verification deals with assessing the numerical accuracy of the solution to a computational model. Validation addresses the physics modeling accuracy of a computational simulation by comparing the computational results with experimental data. Code verification benchmarks and validation benchmarks have been constructed for a number of years in every field of computational simulation. However, no comprehensive guidelines have been proposed for the construction and use of V&V benchmarks. For example, the field of nuclear reactor safety has not focused on code verification benchmarks, but it has placed great emphasis on developing validation benchmarks. Many of these validation benchmarks are closely related to the operations of actual reactors at near-safety-critical conditions, as opposed to being more fundamental-physics benchmarks. This paper presents recommendations for the effective design and use of code verification benchmarks based on manufactured solutions, classical analytical solutions, and highly accurate numerical solutions. In addition, this paper presents recommendations for the design and use of validation benchmarks, highlighting the careful design of building-block experiments, the estimation of experimental measurement uncertainty for both inputs and outputs to the code, validation metrics, and the role of model calibration in validation. It is argued that the understanding of predictive capability of a computational model is built on the level of

  12. Computation and Analysis of the Global Distribution of the Radioxenon Isotope 133Xe based on Emissions from Nuclear Power Plants and Radioisotope Production Facilities and its Relevance for the Verification of the Nuclear-Test-Ban Treaty

    NASA Astrophysics Data System (ADS)

    Wotawa, Gerhard; Becker, Andreas; Kalinowski, Martin; Saey, Paul; Tuma, Matthias; Zähringer, Matthias

    2010-05-01

    Monitoring of radioactive noble gases, in particular xenon isotopes, is a crucial element of the verification of the Comprehensive Nuclear-Test-Ban Treaty (CTBT). The capability of the noble gas network, which is currently under construction, to detect signals from a nuclear explosion critically depends on the background created by other sources. Therefore, the global distribution of these isotopes based on emissions and transport patterns needs to be understood. A significant xenon background exists in the reactor regions of North America, Europe and Asia. An emission inventory of the four relevant xenon isotopes has recently been created, which specifies source terms for each power plant. As the major emitters of xenon isotopes worldwide, a few medical radioisotope production facilities have been recently identified, in particular the facilities in Chalk River (Canada), Fleurus (Belgium), Pelindaba (South Africa) and Petten (Netherlands). Emissions from these sites are expected to exceed those of the other sources by orders of magnitude. In this study, emphasis is put on 133Xe, which is the most prevalent xenon isotope. First, based on the emissions known, the resulting 133Xe concentration levels at all noble gas stations of the final CTBT verification network were calculated and found to be consistent with observations. Second, it turned out that emissions from the radioisotope facilities can explain a number of observed peaks, meaning that atmospheric transport modelling is an important tool for the categorization of measurements. Third, it became evident that Nuclear Power Plant emissions are more difficult to treat in the models, since their temporal variation is high and not generally reported. Fourth, there are indications that the assumed annual emissions may be underestimated by factors of two to ten, while the general emission patterns seem to be well understood. Finally, it became evident that 133Xe sources mainly influence the sensitivity of the
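
    One small piece of the picture above, the radioactive decay of ¹³³Xe during atmospheric transport, can be sketched as below; the transport time is an arbitrary example and the half-life is the commonly quoted literature value of roughly 5.25 days.

    ```python
    import math

    T_HALF_XE133_DAYS = 5.25   # approximate 133Xe half-life

    def decay_factor(transport_days):
        """Fraction of released 133Xe activity remaining after `transport_days`."""
        return math.exp(-math.log(2) * transport_days / T_HALF_XE133_DAYS)

    # A plume taking 7 days (illustrative) to reach a noble gas station retains
    # roughly 40% of the released 133Xe activity.
    print(f"remaining fraction after 7 days: {decay_factor(7.0):.2f}")
    ```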

  13. A REPRINT of a July 1991 Report to Congress, Executive Summary of Verification of Nuclear Warhead Dismantlement and Special Nuclear Material Controls

    SciTech Connect

    Fuller, James L.

    2008-11-20

    With the renewed thinking and debate about deep reductions in nuclear weapons, including recent proposals about eliminating nuclear warheads altogether, republishing the general conclusions of the Robinson Committee Report of 1992 appears useful. The report is sometimes referred to as the 3151 Report, after Section 3151 of the National Defense Authorization Act for FY1991, from which its requirement originated. This report contains the Executive Summary only and the forwarding letters from the Committee, the President of the United States, the Secretary of Energy, and C. Paul Robinson, the head of the Advisory Committee.

  14. Helping Kids Help

    ERIC Educational Resources Information Center

    Heiss, E. Renee

    2008-01-01

    Educators need to help kids help others so that they can help themselves. Volunteering does not involve competition or grades. This is one area where students don't have to worry about measuring up to the expectations of parents, teachers, and coaches. Students participate in charitable work to add another line to a college transcript or job…

  15. Specification and verification of nuclear-power-plant training-simulator response characteristics. Part II. Conclusions and recommendations

    SciTech Connect

    Haas, P M; Selby, D L; Kerlin, T W; Felkins, L

    1982-05-01

    The nuclear industry should adopt, and NRC regulatory and research actions should support, the systems approach to training as a structured framework for development and validation of personnel training systems. Potential exists for improving the ability to assess simulator fidelity. Systems Identification Technology offers a potential framework for model validation. Installation of the data collection/recording equipment required by NUREG-0696 could provide a vastly improved source of data for simulator fidelity assessment. The NRC needs to continue its post-TMI actions to involve itself more rigorously and more formally in the entire process of NPP personnel training system development. However, this involvement should be a participative one with industry. The existing simulator standards and guidelines should be reorganized to support the use of the systems approach to training. The standards should require and support a holistic approach to training system development that recognizes simulators and simulator training as only parts of the complete training program and full-scope, high-fidelity, site-specific simulators as only one useful training device. Some recommendations for adapting the SAT/ISD process to the nuclear industry are: the formation of an NRC/industry planning/coordination group, a program planning study to develop a programmatic plan, development of a user's guide, NRC/industry workshops to establish common terminology and practice, and a pilot study applying the adopted SAT/ISD methodology to an actual nuclear industry training program.

  16. On-line high-performance liquid chromatography-ultraviolet-nuclear magnetic resonance method of the markers of nerve agents for verification of the Chemical Weapons Convention.

    PubMed

    Mazumder, Avik; Gupta, Hemendra K; Garg, Prabhat; Jain, Rajeev; Dubey, Devendra K

    2009-07-03

    This paper details an on-flow liquid chromatography-ultraviolet-nuclear magnetic resonance (LC-UV-NMR) method for the retrospective detection and identification of alkyl alkylphosphonic acids (AAPAs) and alkylphosphonic acids (APAs), the markers of the toxic nerve agents, for verification of the Chemical Weapons Convention (CWC). Initially, the LC-UV-NMR parameters were optimized for benzyl derivatives of the APAs and AAPAs. The optimized parameters include stationary phase C(18), mobile phase methanol:water 78:22 (v/v), UV detection at 268 nm and ¹H NMR acquisition conditions. The protocol described herein allowed the detection of analytes through acquisition of high-quality NMR spectra from aqueous solutions of the APAs and AAPAs in which high concentrations of interfering background chemicals had been removed by the preceding sample preparation. The reported standard deviation for the quantification relates to the UV detector, which showed relative standard deviations (RSDs) for quantification within ±1.1%, while the lower limit of detection was up to 16 µg (absolute) for the NMR detector. Finally, the developed LC-UV-NMR method was applied to identify the APAs and AAPAs in real water samples, following solid phase extraction and derivatization. The method is fast (total experiment time approximately 2 h), sensitive, rugged and efficient.

  17. Making Sure Helping Helps.

    ERIC Educational Resources Information Center

    Gartner, Audrey; Riessman, Frank

    1993-01-01

    Benefits to the helper are important to consider in a national-service program, along with the benefits to the recipient. Some suggestions are offered to ensure reciprocity in community service. Democratizing help giving, that is making it available to the widest possible audience, could help remove some of the pitfalls associated with help…

  18. Application of cryoprobe ¹H nuclear magnetic resonance spectroscopy and multivariate analysis for the verification of Corsican honey.

    PubMed

    Donarski, James A; Jones, Stephen A; Charlton, Adrian J

    2008-07-23

    Proton nuclear magnetic resonance spectroscopy (¹H NMR) and multivariate analysis techniques have been used to classify honey into two groups by geographical origin. Honey from Corsica (Miel de Corse) was used as an example of a protected designation of origin product. Mathematical models were constructed to determine the feasibility of distinguishing between honey from Corsica and that from other geographical locations in Europe, using ¹H NMR spectroscopy. Honey from 10 different regions within five countries was analyzed. ¹H NMR spectra were used as input variables for projection to latent structures (PLS) followed by linear discriminant analysis (LDA) and genetic programming (GP). Models were generated using three methods, PLS-LDA, two-stage GP, and a combination of PLS and GP (PLS-GP). The PLS-GP model used variables selected by PLS for subsequent GP calculations. All models were generated using Venetian blind cross-validation. Overall classification rates for the discrimination of Corsican and non-Corsican honey of 75.8, 94.5, and 96.2% were determined using PLS-LDA, two-stage GP, and PLS-GP, respectively. The variables utilized by PLS-GP were related to their ¹H NMR chemical shifts, and this led to the identification of trigonelline in honey for the first time.
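
    As an illustration of the PLS-LDA stage of this workflow, the sketch below projects synthetic binned ¹H NMR spectra with partial least squares and classifies the scores with linear discriminant analysis using scikit-learn; the data, the number of latent variables, and the plain k-fold splitter (standing in for Venetian blind cross-validation, which scikit-learn does not provide) are assumptions for illustration only.

    ```python
    import numpy as np
    from sklearn.cross_decomposition import PLSRegression
    from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
    from sklearn.model_selection import KFold

    # Placeholder data: rows are binned 1H NMR spectra, y = 1 for "Corsican".
    rng = np.random.default_rng(2)
    X = rng.normal(size=(60, 200))                   # 60 honeys x 200 spectral bins
    y = np.r_[np.ones(30), np.zeros(30)].astype(int)
    X[y == 1, :20] += 0.5                            # weak class-dependent signal

    correct = 0
    for train, test in KFold(n_splits=5, shuffle=True, random_state=0).split(X):
        pls = PLSRegression(n_components=5).fit(X[train], y[train])
        lda = LinearDiscriminantAnalysis().fit(pls.transform(X[train]), y[train])
        correct += int(np.sum(lda.predict(pls.transform(X[test])) == y[test]))

    print(f"cross-validated classification rate: {100 * correct / len(y):.1f}%")
    ```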

  19. Assessment of the utility of on-site inspection for INF treaty verification. Sanitized. Technical report

    SciTech Connect

    Baker, J.C.; Hart, D.M.; Doherty, R.T.

    1983-11-10

    This report analyzes the utility of on-site inspection (OSI) for enhancing Intermediate-Range Nuclear Force (INF) treaty verification of Soviet compliance with US-proposed collateral limits on short-range ballistic missiles (SRBMs). It outlines a detailed verification regime that relies on manned OSI teams to help verify limitations on Soviet SRBM deployments. It also assesses the OSI regime's potential impact on US Pershing deployments. Finally, the report reviews the history of American policy concerning on-site inspection and evaluates the overall utility of OSI in support of National Technical Means.

  20. Utilization of the Differential Die-Away Self-Interrogation Technique for Characterization and Verification of Spent Nuclear Fuel

    SciTech Connect

    Trahan, Alexis Chanel

    2016-01-27

    New nondestructive assay techniques are sought to better characterize spent nuclear fuel. One of the NDA instruments selected for possible deployment is differential die-away self-interrogation (DDSI). The proposed DDSI approach for spent fuel assembly assay utilizes primarily the spontaneous fission and (α, n) neutrons in the assemblies as an internal interrogating radiation source. The neutrons released in spontaneous fission or (α,n) reactions are thermalized in the surrounding water and induce fission in fissile isotopes, thereby creating a measurable signal from isotopes of interest that would be otherwise difficult to measure. The DDSI instrument employs neutron coincidence counting with 3He tubes and list-mode-based data acquisition to allow for production of Rossi-alpha distributions (RADs) in post-processing. The list-mode approach to data collection and subsequent construction of RADs has expanded the analytical possibilities, as will be demonstrated throughout this thesis. One of the primary advantages is that the measured signal in the form of a RAD can be analyzed in its entirety including determination of die-away times in different time domains. This capability led to the development of the early die-away method, a novel leakage multiplication determination method which is tested throughout the thesis on different sources in simulation space and fresh fuel experiments. The early die-away method is a robust, accurate, improved method of determining multiplication without the need for knowledge of the (α,n) source term. The DDSI technique and instrument are presented along with the many novel capabilities enabled by and discovered through RAD analysis. Among the new capabilities presented are the early die-away method, total plutonium content determination, and highly sensitive missing pin detection. Simulation of hundreds of different spent and fresh fuel assemblies were used to develop the analysis algorithms and the techniques were tested on a
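
    A minimal sketch of how a Rossi-alpha distribution can be built from list-mode timestamps is given below; the synthetic event stream, coincidence window, and bin width are illustrative assumptions and do not reflect the DDSI instrument's actual acquisition settings or analysis algorithms.

    ```python
    import numpy as np

    def rossi_alpha(timestamps, window=512e-6, bin_width=4e-6):
        """Histogram of time differences between each trigger neutron and every
        later neutron arriving within `window` seconds (a Rossi-alpha distribution)."""
        edges = np.arange(0.0, window + bin_width, bin_width)
        hist = np.zeros(edges.size - 1)
        for i, t0 in enumerate(timestamps):
            later = timestamps[i + 1:np.searchsorted(timestamps, t0 + window)]
            hist += np.histogram(later - t0, bins=edges)[0]
        return 0.5 * (edges[:-1] + edges[1:]), hist

    # Synthetic list-mode data (illustrative): uncorrelated background plus
    # correlated triplets that mimic fission chains with a ~50 us die-away time.
    rng = np.random.default_rng(3)
    background = np.sort(rng.uniform(0.0, 10.0, size=20000))
    bursts = np.repeat(rng.uniform(0.0, 10.0, size=2000), 3)
    bursts += rng.exponential(50e-6, size=bursts.size)
    timestamps = np.sort(np.concatenate([background, bursts]))

    tau, rad = rossi_alpha(timestamps)
    # Fitting an exponential plus a constant to `rad` would yield a die-away time;
    # here we simply report the early-gate excess over the flat accidental level.
    accidentals = rad[-20:].mean()
    print(f"early-gate excess counts: {rad[:20].sum() - 20 * accidentals:.0f}")
    ```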

  1. Utilization of the Differential Die-Away Self-Interrogation Technique for Characterization and Verification of Spent Nuclear Fuel

    NASA Astrophysics Data System (ADS)

    Trahan, Alexis Chanel

    New nondestructive assay techniques are sought to better characterize spent nuclear fuel. One of the NDA instruments selected for possible deployment is differential die-away self-interrogation (DDSI). The proposed DDSI approach for spent fuel assembly assay utilizes primarily the spontaneous fission and (alpha, n) neutrons in the assemblies as an internal interrogating radiation source. The neutrons released in spontaneous fission or (alpha,n) reactions are thermalized in the surrounding water and induce fission in fissile isotopes, thereby creating a measurable signal from isotopes of interest that would be otherwise difficult to measure. The DDSI instrument employs neutron coincidence counting with 3He tubes and list-mode-based data acquisition to allow for production of Rossi-alpha distributions (RADs) in post-processing. The list-mode approach to data collection and subsequent construction of RADs has expanded the analytical possibilities, as will be demonstrated throughout this thesis. One of the primary advantages is that the measured signal in the form of a RAD can be analyzed in its entirety including determination of die-away times in different time domains. This capability led to the development of the early die-away method, a novel leakage multiplication determination method which is tested throughout the thesis on different sources in simulation space and fresh fuel experiments. The early die-away method is a robust, accurate, improved method of determining multiplication without the need for knowledge of the (alpha,n) source term. The DDSI technique and instrument are presented along with the many novel capabilities enabled by and discovered through RAD analysis. Among the new capabilities presented are the early die-away method, total plutonium content determination, and highly sensitive missing pin detection. Simulation of hundreds of different spent and fresh fuel assemblies were used to develop the analysis algorithms and the techniques were

  2. CTBT integrated verification system evaluation model supplement

    SciTech Connect

    EDENBURN,MICHAEL W.; BUNTING,MARCUS; PAYNE JR.,ARTHUR C.; TROST,LAWRENCE C.

    2000-03-02

    Sandia National Laboratories has developed a computer-based model called IVSEM (Integrated Verification System Evaluation Model) to estimate the performance of a nuclear detonation monitoring system. The IVSEM project was initiated in June 1994 by Sandia's Monitoring Systems and Technology Center and has been funded by the U.S. Department of Energy's Office of Nonproliferation and National Security (DOE/NN). IVSEM is a simple, "top-level" modeling tool which estimates the performance of a Comprehensive Nuclear Test Ban Treaty (CTBT) monitoring system and can help explore the impact of various sensor system concepts and technology advancements on CTBT monitoring. One of IVSEM's unique features is that it integrates results from the various CTBT sensor technologies (seismic, infrasound, radionuclide, and hydroacoustic) and allows the user to investigate synergy among the technologies. Specifically, IVSEM estimates the detection effectiveness (probability of detection), location accuracy, and identification capability of the integrated system and of each technology subsystem individually. The model attempts to accurately estimate the monitoring system's performance at medium interfaces (air-land, air-water) and for some evasive testing methods such as seismic decoupling. The original IVSEM report, CTBT Integrated Verification System Evaluation Model, SAND97-2518, described version 1.2 of IVSEM. This report describes the changes made to IVSEM version 1.2 and the addition of identification capability estimates that have been incorporated into IVSEM version 2.0.
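
    The abstract does not describe IVSEM's internal algorithms; purely as a hedged illustration of how an integrated model can combine per-technology detection estimates, the sketch below assumes statistically independent subsystems and made-up detection probabilities, and reports the probability that at least N technologies detect an event.

```python
from itertools import combinations
from math import prod

def integrated_pd(p_detect, min_technologies=1):
    """Probability that at least `min_technologies` subsystems detect an event.

    p_detect maps technology name -> detection probability for that subsystem.
    Independence between subsystems is an illustrative assumption.
    """
    names = list(p_detect)
    total = 0.0
    # Sum over all subsets of technologies that detect, with size >= min_technologies.
    for k in range(min_technologies, len(names) + 1):
        for detecting in combinations(names, k):
            total += prod(p_detect[n] if n in detecting else 1.0 - p_detect[n]
                          for n in names)
    return total

if __name__ == "__main__":
    # Hypothetical per-technology detection probabilities for one scenario.
    p = {"seismic": 0.80, "infrasound": 0.35, "radionuclide": 0.55, "hydroacoustic": 0.10}
    print(f"P(detected by >= 1 technology) = {integrated_pd(p, 1):.3f}")
    print(f"P(detected by >= 2 technologies) = {integrated_pd(p, 2):.3f}")
```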

  3. Helping Kids Help Themselves.

    ERIC Educational Resources Information Center

    Good, E. Perry

    This book explains how many of the behaviors that adults use to "help" kids are, at best, ineffective and, at worst, destructive to the adults' relationships with children. Adults traditionally believe that external cues prompt correct behavior--the premise of stimulus-response psychology. However, the ideas discussed here revolve around the…

  4. Answers to if the Lead Aprons are Really Helpful in Nuclear Medicine from the Perspective of Spectroscopy.

    PubMed

    He, X; Zhao, R; Rong, L; Yao, K; Chen, S; Wei, B

    2016-09-09

    Wearing lead X-ray-protective aprons is routine practice in nuclear medicine departments in parts of China. However, staff are often perplexed by questions such as whether it is imperative to wear aprons when injecting radioactive drugs, how much of the radiation dose can be shielded, and whether the apron will instead produce secondary radiation. To answer these questions, a semiconductor detector was employed to record different gamma and X-ray spectra with and without the lead apron or lead sheet. From these spectra, the shielding ratio of the lead apron for photons of different energies was estimated and compared with data measured in hospitals. In general, the two results coincided well. The spectral results showed that detrimental secondary X-ray irradiation rises when the gamma-ray energy exceeds the K absorption edge of lead (88 keV). Moreover, the aprons are not so effective for gamma rays of 364 keV emitted from (131)I and 511 keV emitted by positron-emitting nuclides. This work is purely a physical measurement in the laboratory. To the best of our knowledge, this is the first quantitative study on the level of gamma-ray protection offered by medical lead aprons, and the importance of the spectroscopic measurements is discussed in this paper.
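
    The paper's measured shielding ratios are not quoted in the abstract; as a rough, hedged illustration of why aprons help at diagnostic energies but much less at 364 or 511 keV, the sketch below evaluates narrow-beam transmission exp(-mu*t) through a lead layer. The linear attenuation coefficients are approximate textbook values, the 0.5 mm lead-equivalent thickness is an assumption, and buildup and the secondary X-rays discussed above are ignored.

```python
import math

# Approximate linear attenuation coefficients of lead (1/cm); illustrative values only.
MU_PB_PER_CM = {
    140: 27.0,   # ~Tc-99m gamma energy (keV)
    364: 2.8,    # I-131
    511: 1.8,    # positron annihilation
}

def transmission(energy_kev, thickness_mm):
    """Narrow-beam transmitted fraction through `thickness_mm` of lead (no buildup)."""
    mu = MU_PB_PER_CM[energy_kev]
    return math.exp(-mu * thickness_mm / 10.0)

if __name__ == "__main__":
    t_mm = 0.5  # assumed lead-equivalent thickness of a typical apron
    for e in (140, 364, 511):
        tr = transmission(e, t_mm)
        print(f"{e:3d} keV: transmitted {tr:5.1%}, attenuated {1 - tr:5.1%}")
```

    With these assumed values, roughly three quarters of 140 keV photons are stopped but only around a tenth of the 364 and 511 keV photons, which is consistent with the abstract's conclusion about high-energy emitters.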

  5. MELCOR Verification, Benchmarking, and Applications experience at BNL

    SciTech Connect

    Madni, I.K.

    1992-01-01

    This paper presents a summary of MELCOR Verification, Benchmarking and Applications experience at Brookhaven National Laboratory (BNL), sponsored by the US Nuclear Regulatory Commission (NRC). Under MELCOR verification over the past several years, all released versions of the code were installed on BNL's computer system, verification exercises were performed, and defect investigation reports were sent to SNL. Benchmarking calculations of integral severe fuel damage tests performed at BNL have helped to identify areas of modeling strengths and weaknesses in MELCOR; the most appropriate choices for input parameters; selection of axial nodalization for core cells and heat structures; and workarounds that extend the capabilities of MELCOR. These insights are explored in greater detail in the paper, with the help of selected results and comparisons. Full plant applications calculations at BNL have helped to evaluate the ability of MELCOR to successfully simulate various accident sequences and calculate source terms to the environment for both BWRs and PWRs. A summary of results, including timing of key events, thermal-hydraulic response, and environmental releases of fission products are presented for selected calculations, along with comparisons with Source Term Code Package (STCP) calculations of the same sequences. Differences in results are explained on the basis of modeling differences between the two codes. The results of a sensitivity calculation are also shown. The paper concludes by highlighting some insights on bottomline issues, and the contribution of the BNL program to MELCOR development, assessment, and the identification of user needs for optimum use of the code.

  6. MELCOR Verification, Benchmarking, and Applications experience at BNL

    SciTech Connect

    Madni, I.K.

    1992-12-31

    This paper presents a summary of MELCOR Verification, Benchmarking and Applications experience at Brookhaven National Laboratory (BNL), sponsored by the US Nuclear Regulatory Commission (NRC). Under MELCOR verification over the past several years, all released versions of the code were installed on BNL's computer system, verification exercises were performed, and defect investigation reports were sent to SNL. Benchmarking calculations of integral severe fuel damage tests performed at BNL have helped to identify areas of modeling strengths and weaknesses in MELCOR; the most appropriate choices for input parameters; selection of axial nodalization for core cells and heat structures; and workarounds that extend the capabilities of MELCOR. These insights are explored in greater detail in the paper, with the help of selected results and comparisons. Full plant applications calculations at BNL have helped to evaluate the ability of MELCOR to successfully simulate various accident sequences and calculate source terms to the environment for both BWRs and PWRs. A summary of results, including timing of key events, thermal-hydraulic response, and environmental releases of fission products are presented for selected calculations, along with comparisons with Source Term Code Package (STCP) calculations of the same sequences. Differences in results are explained on the basis of modeling differences between the two codes. The results of a sensitivity calculation are also shown. The paper concludes by highlighting some insights on bottomline issues, and the contribution of the BNL program to MELCOR development, assessment, and the identification of user needs for optimum use of the code.

  7. Technical challenges for dismantlement verification

    SciTech Connect

    Olinger, C.T.; Stanbro, W.D.; Johnston, R.G.; Nakhleh, C.W.; Dreicer, J.S.

    1997-11-01

    In preparation for future nuclear arms reduction treaties, including any potential successor treaties to START I and II, the authors have been examining possible methods for bilateral warhead dismantlement verification. Warhead dismantlement verification raises significant challenges in the political, legal, and technical arenas. This discussion will focus on the technical issues raised by warhead arms control. Technical complications arise from several sources. These will be discussed under the headings of warhead authentication, chain-of-custody, dismantlement verification, non-nuclear component tracking, component monitoring, and irreversibility. The authors will discuss possible technical options to address these challenges as applied to a generic dismantlement and disposition process, identifying limitations and vulnerabilities along the way. They expect that these considerations will play a large role in any future arms reduction effort and, therefore, should be addressed in a timely fashion.

  8. Transmutation Fuel Performance Code Thermal Model Verification

    SciTech Connect

    Gregory K. Miller; Pavel G. Medvedev

    2007-09-01

    The FRAPCON fuel performance code is being modified to model the performance of the nuclear fuels of interest to the Global Nuclear Energy Partnership (GNEP). The present report documents the verification of the FRAPCON thermal model. It was found that, with minor modifications, the FRAPCON thermal model's temperature calculation agrees with that of the commercial software ABAQUS (Version 6.4-4). This report outlines the verification methodology, code input, and calculation results.
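
    The report's verification compares FRAPCON with ABAQUS, which is not reproducible here; as a stand-in illustration of the same kind of thermal-model verification, the sketch below solves steady radial conduction in a fuel pellet with uniform heat generation by finite differences and checks it against the analytic solution T(r) = T_s + q'''(R^2 - r^2)/(4k). The geometry, conductivity, and power density are invented values.

```python
import numpy as np

def fd_pellet_temperature(q_vol, k, radius, t_surface, n=200):
    """Finite-difference solution of steady radial conduction in a cylindrical pellet:
    (1/r) d/dr(r k dT/dr) + q''' = 0, with dT/dr(0) = 0 and T(R) = t_surface."""
    dr = radius / n
    r = np.linspace(0.0, radius, n + 1)
    A = np.zeros((n + 1, n + 1))
    b = np.full(n + 1, -q_vol / k * dr**2)
    A[0, 0], A[0, 1] = -4.0, 4.0               # centerline symmetry condition
    for i in range(1, n):
        rm, rp = r[i] - 0.5 * dr, r[i] + 0.5 * dr
        A[i, i - 1] = rm / r[i]
        A[i, i] = -(rm + rp) / r[i]
        A[i, i + 1] = rp / r[i]
    A[n, n], b[n] = 1.0, t_surface             # Dirichlet surface temperature
    return r, np.linalg.solve(A, b)

def analytic_pellet_temperature(r, q_vol, k, radius, t_surface):
    return t_surface + q_vol * (radius**2 - r**2) / (4.0 * k)

if __name__ == "__main__":
    # Illustrative (made-up) values: 4.1 mm pellet radius, UO2-like conductivity.
    q_vol, k, radius, t_surf = 3.0e8, 3.0, 4.1e-3, 700.0   # W/m^3, W/m-K, m, K
    r, t_fd = fd_pellet_temperature(q_vol, k, radius, t_surf)
    t_an = analytic_pellet_temperature(r, q_vol, k, radius, t_surf)
    print(f"centerline: FD {t_fd[0]:.2f} K vs analytic {t_an[0]:.2f} K")
    print(f"max |difference| = {np.abs(t_fd - t_an).max():.3e} K")
```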

  9. Verification and Validation Plan for the Codes LSP and ICARUS (PEGASUS)

    SciTech Connect

    RILEY,MERLE E.; BUSS,RICHARD J.; CAMPBELL,ROBERT B.; HOPKINS,MATTHEW M.; MILLER,PAUL A.; MOATS,ANNE R.; WAMPLER,WILLIAM R.

    2002-02-01

    This report documents the strategies for verification and validation of the codes LSP and ICARUS used for simulating the operation of the neutron tubes used in all modern nuclear weapons. The codes will be used to assist in the design of next generation neutron generators and help resolve manufacturing issues for current and future production of neutron devices. Customers for the software are identified, tube phenomena are identified and ranked, software quality strategies are given, and the validation plan is set forth.

  10. Independent Verification and Validation Of SAPHIRE 8 Software Design and Interface Design Project Number: N6423 U.S. Nuclear Regulatory Commission

    SciTech Connect

    Kent Norris

    2010-03-01

    The purpose of the Independent Verification and Validation (IV&V) role in the evaluation of the SAPHIRE software design and interface design is to assess the activities that result in the development, documentation, and review of a software design that meets the requirements defined in the software requirements documentation. The IV&V team began this endeavor after the software engineering and software development of SAPHIRE had already been in production. IV&V reviewed the requirements specified in the NRC Form 189s to verify that these requirements were included in SAPHIRE’s Software Verification and Validation Plan (SVVP) design specification.

  11. Independent Verification and Validation Of SAPHIRE 8 Volume 3 Users' Guide Project Number: N6423 U.S. Nuclear Regulatory Commission

    SciTech Connect

    Kent Norris

    2010-03-01

    The purpose of the Independent Verification and Validation (IV&V) role in the evaluation of the SAPHIRE 8 Volume 3 Users’ Guide is to assess the user documentation for its completeness, correctness, and consistency with respect to requirements for user interface and for any functionality that can be invoked by the user. The IV&V team began this endeavor after the software engineering and software development of SAPHIRE had already been in production.

  12. Verification Games: Crowd-Sourced Formal Verification

    DTIC Science & Technology

    2016-03-01

    Under the Crowd-Sourced Formal Verification effort, the verification tools developed by the Programming Languages and Software Engineering group were improved. A series of games were developed by the Center for Game Science: Pipe Jam, Traffic Jam, Flow Jam, and Paradox. Verification tools and games were integrated to verify...

  13. Swarm Verification

    NASA Technical Reports Server (NTRS)

    Holzmann, Gerard J.; Joshi, Rajeev; Groce, Alex

    2008-01-01

    Reportedly, supercomputer designer Seymour Cray once said that he would sooner use two strong oxen to plow a field than a thousand chickens. Although this is undoubtedly wise when it comes to plowing a field, it is not so clear for other types of tasks. Model checking problems are of the proverbial "needle in a haystack" type. Such problems can often be parallelized easily. Alas, none of the usual divide and conquer methods can be used to parallelize the working of a model checker. Given that it has become easier than ever to gain access to large numbers of computers to perform even routine tasks, it is becoming more and more attractive to find alternate ways to use these resources to speed up model checking tasks. This paper describes one such method, called swarm verification.
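
    SPIN's swarm implementation diversifies many small verification runs (different search orders, hash functions, and bounds); the sketch below mimics only the basic idea on an invented toy reachability problem, launching several randomized, depth-bounded walks in parallel with different seeds and stopping when any worker finds a state that violates the property.

```python
import random
from concurrent.futures import ProcessPoolExecutor, as_completed

# Toy transition system: a state is a tuple of three small counters (invented model).
START = (0, 0, 0)

def successors(state):
    x, y, z = state
    return [((x + 1) % 12, y, z),      # bump first counter
            (x, (y + 2) % 12, z),      # bump second counter
            (x, y, (z + 3) % 12),      # bump third counter
            (y, z, x)]                 # rotate the counters

def violates(state):
    return state == (7, 10, 9)        # the "assertion violation" we hope to find

def swarm_worker(seed, restarts=2000, max_depth=60):
    """One member of the swarm: repeated random walks with its own seed and a
    bounded depth, returning a counterexample path if it stumbles on a violation."""
    rng = random.Random(seed)
    for _ in range(restarts):
        state, path = START, [START]
        for _ in range(max_depth):
            if violates(state):
                return seed, path
            state = rng.choice(successors(state))
            path.append(state)
    return seed, None

if __name__ == "__main__":
    with ProcessPoolExecutor(max_workers=4) as pool:
        futures = [pool.submit(swarm_worker, seed) for seed in range(16)]
        for fut in as_completed(futures):
            seed, path = fut.result()
            if path is not None:
                print(f"worker {seed} found a violation at depth {len(path) - 1}")
                break
        else:
            print("no worker found a violation within its budget")
```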

  14. Independent Verification and Validation Of SAPHIRE 8 Software Acceptance Test Plan Project Number: N6423 U.S. Nuclear Regulatory Commission

    SciTech Connect

    Kent Norris

    2010-03-01

    The purpose of the Independent Verification and Validation (IV&V) role in the evaluation of the SAPHIRE 8 Software Acceptance Test Plan is to assess the approach to be taken for intended testing activities. The plan typically identifies the items to be tested, the requirements being tested, the testing to be performed, test schedules, personnel requirements, reporting requirements, evaluation criteria, and any risks requiring contingency planning. The IV&V team began this endeavor after the software engineering and software development of SAPHIRE had already been in production.

  15. Monitoring and verification R&D

    SciTech Connect

    Pilat, Joseph F; Budlong - Sylvester, Kory W; Fearey, Bryan L

    2011-01-01

    The 2010 Nuclear Posture Review (NPR) report outlined the Administration's approach to promoting the agenda put forward by President Obama in Prague on April 5, 2009. The NPR calls for a national monitoring and verification R&D program to meet future challenges arising from the Administration's nonproliferation, arms control and disarmament agenda. Verification of a follow-on to New START could have to address warheads and possibly components along with delivery capabilities. Deeper cuts and disarmament would need to address all of these elements along with nuclear weapon testing, nuclear material and weapon production facilities, virtual capabilities from old weapon and existing energy programs, and undeclared capabilities. We only know how to address some elements of these challenges today, and the requirements may be more rigorous in the context of deeper cuts as well as disarmament. Moreover, there is a critical need for multiple options for sensitive problems and to address other challenges. There will be other verification challenges in a world of deeper cuts and disarmament, some of which we are already facing. At some point, if the reductions process is progressing, uncertainties about past nuclear materials and weapons production will have to be addressed. IAEA safeguards will need to continue to evolve to meet current and future challenges, and to take advantage of new technologies and approaches. Transparency/verification of nuclear and dual-use exports will also have to be addressed, and there will be a need to make nonproliferation measures more watertight and transparent. In this context, and recognizing we will face all of these challenges even if disarmament is not achieved, this paper will explore possible agreements and arrangements; verification challenges; gaps in monitoring and verification technologies and approaches; and the R&D required to address these gaps and other monitoring and verification challenges.

  16. CTBT Integrated Verification System Evaluation Model

    SciTech Connect

    Edenburn, M.W.; Bunting, M.L.; Payne, A.C. Jr.

    1997-10-01

    Sandia National Laboratories has developed a computer based model called IVSEM (Integrated Verification System Evaluation Model) to estimate the performance of a nuclear detonation monitoring system. The IVSEM project was initiated in June 1994, by Sandia's Monitoring Systems and Technology Center and has been funded by the US Department of Energy's Office of Nonproliferation and National Security (DOE/NN). IVSEM is a simple, top-level, modeling tool which estimates the performance of a Comprehensive Nuclear Test Ban Treaty (CTBT) monitoring system and can help explore the impact of various sensor system concepts and technology advancements on CTBT monitoring. One of IVSEM's unique features is that it integrates results from the various CTBT sensor technologies (seismic, infrasound, radionuclide, and hydroacoustic) and allows the user to investigate synergy among the technologies. Specifically, IVSEM estimates the detection effectiveness (probability of detection) and location accuracy of the integrated system and of each technology subsystem individually. The model attempts to accurately estimate the monitoring system's performance at medium interfaces (air-land, air-water) and for some evasive testing methods such as seismic decoupling. This report describes version 1.2 of IVSEM.

  17. Generic Verification Protocol for Verification of Online Turbidimeters

    EPA Science Inventory

    This protocol provides generic procedures for implementing a verification test for the performance of online turbidimeters. The verification tests described in this document will be conducted under the Environmental Technology Verification (ETV) Program. Verification tests will...

  18. 10 CFR 60.47 - Facility information and verification.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... 10 Energy 2 2012-01-01 2012-01-01 false Facility information and verification. 60.47 Section 60.47 Energy NUCLEAR REGULATORY COMMISSION (CONTINUED) DISPOSAL OF HIGH-LEVEL RADIOACTIVE WASTES IN GEOLOGIC REPOSITORIES Licenses Us/iaea Safeguards Agreement § 60.47 Facility information and verification. (a)...

  19. 10 CFR 60.47 - Facility information and verification.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... 10 Energy 2 2014-01-01 2014-01-01 false Facility information and verification. 60.47 Section 60.47 Energy NUCLEAR REGULATORY COMMISSION (CONTINUED) DISPOSAL OF HIGH-LEVEL RADIOACTIVE WASTES IN GEOLOGIC REPOSITORIES Licenses Us/iaea Safeguards Agreement § 60.47 Facility information and verification. (a)...

  20. 10 CFR 60.47 - Facility information and verification.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... 10 Energy 2 2011-01-01 2011-01-01 false Facility information and verification. 60.47 Section 60.47 Energy NUCLEAR REGULATORY COMMISSION (CONTINUED) DISPOSAL OF HIGH-LEVEL RADIOACTIVE WASTES IN GEOLOGIC REPOSITORIES Licenses Us/iaea Safeguards Agreement § 60.47 Facility information and verification. (a)...

  1. 10 CFR 60.47 - Facility information and verification.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... 10 Energy 2 2013-01-01 2013-01-01 false Facility information and verification. 60.47 Section 60.47 Energy NUCLEAR REGULATORY COMMISSION (CONTINUED) DISPOSAL OF HIGH-LEVEL RADIOACTIVE WASTES IN GEOLOGIC REPOSITORIES Licenses Us/iaea Safeguards Agreement § 60.47 Facility information and verification. (a)...

  2. 10 CFR 60.47 - Facility information and verification.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 10 Energy 2 2010-01-01 2010-01-01 false Facility information and verification. 60.47 Section 60.47 Energy NUCLEAR REGULATORY COMMISSION (CONTINUED) DISPOSAL OF HIGH-LEVEL RADIOACTIVE WASTES IN GEOLOGIC REPOSITORIES Licenses Us/iaea Safeguards Agreement § 60.47 Facility information and verification. (a)...

  3. Nuclear Scans

    MedlinePlus

    Nuclear scans use radioactive substances to see structures and functions inside your body. They use a special ... images. Most scans take 20 to 45 minutes. Nuclear scans can help doctors diagnose many conditions, including ...

  4. Sandia technology. Volume 13, number 2 Special issue : verification of arms control treaties.

    SciTech Connect

    Not Available

    1989-03-01

    Nuclear deterrence, a cornerstone of US national security policy, has helped prevent global conflict for over 40 years. The DOE and DoD share responsibility for this vital part of national security. The US will continue to rely on nuclear deterrence for the foreseeable future. In the late 1950s, Sandia developed satellite-borne nuclear burst detection systems to support the treaty banning atmospheric nuclear tests. This activity has continued to expand and diversify. When the Non-Proliferation Treaty was ratified in 1970, we began to develop technologies to protect nuclear materials from falling into unauthorized hands. This program grew and now includes systems for monitoring the movement and storage of nuclear materials, detecting tampering, and transmitting sensitive data securely. In the late 1970s, negotiations to further limit underground nuclear testing were being actively pursued. In less than 18 months, we fielded the National Seismic Station, an unattended observatory for in-country monitoring of nuclear tests. In the mid-1980s, arms-control interest shifted to facility monitoring and on-site inspection. Our Technical On-site Inspection Facility is the national test bed for perimeter and portal monitoring technology and the prototype for the inspection portal that was recently installed in the USSR under the Intermediate-Range Nuclear Forces accord. The articles in this special issue of Sandia Technology describe some of our current contributions to verification technology. This work supports the US policy to seek realistic arms control agreements while maintaining our national security.

  5. The nuclear freeze controversy

    SciTech Connect

    Payne, K.B.; Gray, C.S.

    1984-01-01

    This book presents papers on nuclear arms control. Topics considered include the background and rationale behind the nuclear freeze proposal, nuclear deterrence, national defense, arms races, arms buildup, warfare, the moral aspects of nuclear deterrence, treaty verification, the federal budget, the economy, a historical perspective on Soviet policy toward the freeze, the other side of the Soviet peace offensive, and making sense of the nuclear freeze debate.

  6. Cold fusion verification

    NASA Astrophysics Data System (ADS)

    North, M. H.; Mastny, G. F.; Wesley, E. J.

    1991-03-01

    The objective of this work was to verify and reproduce experimental observations of Cold Nuclear Fusion (CNF), as originally reported in 1989. The method was to start with the original report and add such additional information as became available to build a set of operational electrolytic CNF cells. Verification was to be achieved by first observing cells for neutron production, and for those cells that demonstrated a nuclear effect, careful calorimetric measurements were planned. The authors concluded, after laboratory experience, reading published work, talking with others in the field, and attending conferences, that CNF probably is a chimera and will go the way of N-rays and polywater. The neutron detector used for these tests was a completely packaged unit built into a metal suitcase that afforded electrostatic shielding for the detectors and self-contained electronics. It was battery-powered, although it was on charge for most of the long tests. The sensor element consists of He detectors arranged in three independent layers in a solid moderating block. The counts from each of the three layers, as well as the sum over all detectors, were brought out and recorded separately. The neutron measurements were made with both the neutron detector and the sample tested in a cave made of thick moderating material that surrounded the two units on the sides and bottom.
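
    The report does not give a numerical detection criterion; as a hedged illustration of the counting-statistics question such neutron measurements turn on, the sketch below compares a foreground count with a separately measured background, treating both as Poisson. All counts and times are invented.

```python
from math import sqrt

def net_rate_significance(fg_counts, fg_time_s, bg_counts, bg_time_s):
    """Significance (in sigma) of a foreground neutron count over background,
    treating both measurements as Poisson and propagating their variances."""
    net = fg_counts / fg_time_s - bg_counts / bg_time_s
    sigma = sqrt(fg_counts / fg_time_s**2 + bg_counts / bg_time_s**2)
    return net, sigma, (net / sigma if sigma > 0 else 0.0)

if __name__ == "__main__":
    # Invented numbers: a 36 h cell measurement versus a 36 h background run.
    net, sigma, z = net_rate_significance(fg_counts=1580, fg_time_s=36 * 3600,
                                          bg_counts=1495, bg_time_s=36 * 3600)
    print(f"net rate = {net * 3600:.2f} +/- {sigma * 3600:.2f} counts/h  ({z:.1f} sigma)")
```

    With the invented numbers the excess is below two sigma, i.e., not a significant neutron signal.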

  7. Verification of VLSI designs

    NASA Technical Reports Server (NTRS)

    Windley, P. J.

    1991-01-01

    In this paper we explore the specification and verification of VLSI designs. The paper focuses on abstract specification and verification of functionality using mathematical logic as opposed to low-level boolean equivalence verification such as that done using BDD's and Model Checking. Specification and verification, sometimes called formal methods, is one tool for increasing computer dependability in the face of an exponentially increasing testing effort.
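
    As a toy illustration of checking an implementation against an abstract functional specification (here by exhaustive enumeration over a small input space rather than the logic-based proof techniques the paper focuses on), the sketch below verifies a gate-level ripple-carry adder against the arithmetic it is supposed to implement. The circuit and bit width are invented.

```python
from itertools import product

def full_adder(a, b, cin):
    """Gate-level full adder (XOR/AND/OR on 0/1 values)."""
    s = (a ^ b) ^ cin
    cout = (a & b) | (cin & (a ^ b))
    return s, cout

def ripple_carry_add(x_bits, y_bits):
    """Add two little-endian bit vectors with a chain of full adders."""
    carry, out = 0, []
    for a, b in zip(x_bits, y_bits):
        s, carry = full_adder(a, b, carry)
        out.append(s)
    return out, carry

def to_bits(n, width):
    return [(n >> i) & 1 for i in range(width)]

def verify_adder(width=4):
    """Exhaustively check the implementation against the arithmetic spec x + y."""
    for x, y in product(range(2 ** width), repeat=2):
        out, carry = ripple_carry_add(to_bits(x, width), to_bits(y, width))
        value = sum(b << i for i, b in enumerate(out)) + (carry << width)
        if value != x + y:
            return False, (x, y)
    return True, None

if __name__ == "__main__":
    ok, counterexample = verify_adder(4)
    print("adder verified against spec" if ok else f"mismatch at {counterexample}")
```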

  8. Columbus pressurized module verification

    NASA Technical Reports Server (NTRS)

    Messidoro, Piero; Comandatore, Emanuele

    1986-01-01

    The baseline verification approach of the COLUMBUS Pressurized Module was defined during the A and B1 project phases. Peculiarities of the verification program are the testing requirements derived from the permanent manned presence in space. The model philosophy and the test program have been developed in line with the overall verification concept. Such critical areas as meteoroid protections, heat pipe radiators and module seals are identified and tested. Verification problem areas are identified and recommendations for the next development are proposed.

  9. Trashing the planet. [How Science can help us deal with environmental problems such as acid rain, depletion of the ozone, and nuclear waste (among other things)

    SciTech Connect

    Ray, D.L.; Guzzo, L.

    1990-01-01

    The authors use a common sense approach to their goals of clarifying environmental issues, separating fact from factoid, unmaking the dooms-crying opponents of all progress, and re-establishing a sense of reason and balance with respect to the environment, modern technology and science. The introductory section is a discussion of man, technology, and the environment. The authors point out the three major problem areas in the interface between science, the media, and the public: anxiety, factoids, and misinterpretation. They also discuss the reality of the economic and technological changes from the good old days. The second section of the book focuses on four major environmental issues: the greenhouse effect; acid rain; pesticides; and chemical toxins (asbestos, PCB, dioxin). In the third section the authors present a broad approach to the nuclear issues facing us: understanding of radiation; nuclear medicine; nuclear power; and nuclear waste. Finally, the book concludes with a section on environmentalism and the future. The authors discuss political environmental activism, governmental actions, and a global perspective. They also list four common sense approaches for ordinary citizens: pressure on the legislative branch of government; refusal to listen to the "just in case" argument; keeping a sense of perspective; and realizing that humans have the responsibility to be good stewards while at the same time they cannot live without altering the earth. At the end of the book there is a sizable section of endnotes and referenced citations.

  10. Optimal Imaging for Treaty Verification

    SciTech Connect

    Brubaker, Erik; Hilton, Nathan R.; Johnson, William; Marleau, Peter; Kupinski, Matthew; MacGahan, Christopher Jonathan

    2014-09-01

    Future arms control treaty verification regimes may use radiation imaging measurements to confirm and track nuclear warheads or other treaty accountable items (TAIs). This project leverages advanced inference methods developed for medical and adaptive imaging to improve task performance in arms control applications. Additionally, we seek a method to acquire and analyze imaging data of declared TAIs without creating an image of those objects or otherwise storing or revealing any classified information. Such a method would avoid the use of classified-information barriers (IB).

  11. Verification Challenges at Low Numbers

    SciTech Connect

    Benz, Jacob M.; Booker, Paul M.; McDonald, Benjamin S.

    2013-06-01

    Many papers have dealt with the political difficulties and ramifications of deep nuclear arms reductions, and the issues of “Going to Zero”. Political issues include extended deterrence, conventional weapons, ballistic missile defense, and regional and geo-political security issues. At each step on the road to low numbers, the verification required to ensure compliance of all parties will increase significantly. Looking beyond New START, the next step will likely include warhead limits in the neighborhood of 1000. Further reductions will include stepping stones at 1000 warheads, 100’s of warheads, and then 10’s of warheads before final elimination of the last few remaining warheads and weapons could be considered. This paper will focus on these three threshold reduction levels: 1000, 100’s, and 10’s. For each, the issues and challenges will be discussed, potential solutions will be identified, and the verification technologies and chain of custody measures that address these solutions will be surveyed. It is important to note that many of the issues that need to be addressed have no current solution. In these cases, the paper will explore new or novel technologies that could be applied. These technologies will draw from the research and development that is ongoing throughout the national laboratory complex, and will look at technologies utilized in other areas of industry for their application to arms control verification.

  12. Helping Children Help Themselves. Revised.

    ERIC Educational Resources Information Center

    Alberta Dept. of Agriculture, Edmonton.

    Youth leaders and parents can use this activity oriented publication to help children six to twelve years of age become more independent by acquiring daily living skills. The publication consists of five units, each of which contains an introduction, learning activities, and lists of resource materials. Age-ability levels are suggested for…

  13. Help Us to Help Ourselves

    ERIC Educational Resources Information Center

    Stanistreet, Paul

    2010-01-01

    Local authorities have a strong tradition of supporting communities to help themselves, and this is nowhere better illustrated than in the learning they commission and deliver through the Adult Safeguarded Learning budget. The budget was set up to protect at least a minimum of provision for adult liberal education, family learning and learning for…

  14. Verification of Scientific Simulations via Hypothesis-Driven Comparative and Quantitative Visualization

    SciTech Connect

    Ahrens, James P; Heitmann, Katrin; Petersen, Mark R; Woodring, Jonathan; Williams, Sean; Fasel, Patricia; Ahrens, Christine; Hsu, Chung-Hsing; Geveci, Berk

    2010-11-01

    This article presents a visualization-assisted process that verifies scientific-simulation codes. Code verification is necessary because scientists require accurate predictions to interpret data confidently. This verification process integrates iterative hypothesis verification with comparative, feature, and quantitative visualization. Following this process can help identify differences in cosmological and oceanographic simulations.

  15. Fuel Retrieval System Design Verification Report

    SciTech Connect

    GROTH, B.D.

    2000-04-11

    The Fuel Retrieval Subproject was established as part of the Spent Nuclear Fuel Project (SNF Project) to retrieve and repackage the SNF located in the K Basins. The Fuel Retrieval System (FRS) construction work is complete in the KW Basin, and start-up testing is underway. Design modifications and construction planning are also underway for the KE Basin. An independent review of the design verification process as applied to the K Basin projects was initiated in support of preparation for the SNF Project operational readiness review (ORR). A Design Verification Status Questionnaire, Table 1, is included which addresses Corrective Action SNF-EG-MA-EG-20000060, Item No.9 (Miller 2000).

  16. Software verification and testing

    NASA Technical Reports Server (NTRS)

    1985-01-01

    General procedures for software verification and validation are provided as a guide for managers, programmers, and analysts involved in software development. The verification and validation procedures described are based primarily on testing techniques. Testing refers to the execution of all or part of a software system for the purpose of detecting errors. Planning, execution, and analysis of tests are outlined in this document. Code reading and static analysis techniques for software verification are also described.

  17. 7.0T nuclear magnetic resonance evaluation of the amyloid beta (1-40) animal model of Alzheimer's disease: comparison of cytology verification.

    PubMed

    Zhang, Lei; Dong, Shuai; Zhao, Guixiang; Ma, Yu

    2014-02-15

    3.0T magnetic resonance spectroscopic imaging is a commonly used method in research on brain function in Alzheimer's disease. However, the role of 7.0T high-field magnetic resonance spectroscopic imaging in studying brain function in Alzheimer's disease remains unclear. In this study, 7.0T magnetic resonance spectroscopy showed that in the hippocampus of Alzheimer's disease rats, the N-acetylaspartate peak was reduced, and the creatine and choline peaks were elevated. This finding was further supported by hematoxylin-eosin staining, which showed a loss of hippocampal neurons and more glial cells. Moreover, electron microscopy showed neuronal shrinkage and mitochondrial rupture, and scanning electron microscopy revealed small hippocampal synaptic vesicles, incomplete synaptic structures, and reduced vesicle numbers. Overall, the results revealed that 7.0T high-field nuclear magnetic resonance spectroscopy detected the lesions and functional changes in hippocampal neurons of Alzheimer's disease rats in vivo, allowing assessment of the success rate and grading of the amyloid beta (1-40) animal model of Alzheimer's disease.

  18. Experimental verification of proton beam monitoring in a human body by use of activity image of positron-emitting nuclei generated by nuclear fragmentation reaction.

    PubMed

    Nishio, Teiji; Miyatake, Aya; Inoue, Kazumasa; Gomi-Miyagishi, Tomoko; Kohno, Ryosuke; Kameoka, Satoru; Nakagawa, Keiichi; Ogino, Takashi

    2008-01-01

    Proton therapy is a form of radiotherapy that enables concentration of dose on a tumor by use of a scanned or modulated Bragg peak. Therefore, it is very important to evaluate the proton-irradiated volume accurately. The proton-irradiated volume can be confirmed by detection of pair-annihilation gamma rays from positron-emitting nuclei generated by the nuclear fragmentation reaction of the incident protons on target nuclei using a PET apparatus. The activity of the positron-emitting nuclei generated in a patient was measured with a PET-CT apparatus after proton beam irradiation of the patient. Activity measurement was performed in patients with tumors of the brain, head and neck, liver, lungs, and sacrum. The 3-D PET image obtained on the CT image showed the visual correspondence with the irradiation area of the proton beam. Moreover, it was confirmed that there were differences in the strength of activity from the PET-CT images obtained at each irradiation site. The values of activity obtained from both measurement and calculation based on the reaction cross section were compared, and it was confirmed that the intensity and the distribution of the activity changed with the start time of the PET imaging after proton beam irradiation. The clinical use of this information about the positron-emitting nuclei will be important for promoting proton treatment with higher accuracy in the future.
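
    The paper's cross-section-based activity calculation is not reproduced here; the sketch below only illustrates why the measured activity depends on the PET start time, using a two-component decay model with the standard half-lives of 11C (about 20.4 min) and 15O (about 122 s), two of the dominant positron emitters from proton-induced fragmentation. The end-of-irradiation activities are invented.

```python
from math import exp, log

HALF_LIFE_S = {"C-11": 20.4 * 60, "O-15": 122.0}  # approximate half-lives

def collected_decays(initial_bq, start_delay_s, scan_length_s):
    """Decays occurring during a scan of length `scan_length_s` that starts
    `start_delay_s` after irradiation, for a mix of positron emitters."""
    total = 0.0
    for nuclide, a0 in initial_bq.items():
        lam = log(2.0) / HALF_LIFE_S[nuclide]
        # Integral of a0*exp(-lam*t) over the scan window.
        total += a0 / lam * (exp(-lam * start_delay_s)
                             - exp(-lam * (start_delay_s + scan_length_s)))
    return total

if __name__ == "__main__":
    mix = {"C-11": 4.0e4, "O-15": 6.0e4}  # invented end-of-irradiation activities (Bq)
    for delay_min in (2, 5, 10, 20):
        n = collected_decays(mix, delay_min * 60, scan_length_s=600)
        print(f"start {delay_min:2d} min after irradiation: ~{n:,.0f} decays in a 10 min scan")
```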

  19. Standardized verification of fuel cycle modeling

    SciTech Connect

    Feng, B.; Dixon, B.; Sunny, E.; Cuadra, A.; Jacobson, J.; Brown, N. R.; Powers, J.; Worrall, A.; Passerini, S.; Gregg, R.

    2016-04-05

    A nuclear fuel cycle systems modeling and code-to-code comparison effort was coordinated across multiple national laboratories to verify the tools needed to perform fuel cycle analyses of the transition from a once-through nuclear fuel cycle to a sustainable potential future fuel cycle. For this verification study, a simplified example transition scenario was developed to serve as a test case for the four systems codes involved (DYMOND, VISION, ORION, and MARKAL), each used by a different laboratory participant. In addition, all participants produced spreadsheet solutions for the test case to check all the mass flows and reactor/facility profiles on a year-by-year basis throughout the simulation period. The test case specifications describe a transition from the current US fleet of light water reactors to a future fleet of sodium-cooled fast reactors that continuously recycle transuranic elements as fuel. After several initial coordinated modeling and calculation attempts, it was revealed that most of the differences in code results were not due to different code algorithms or calculation approaches, but due to different interpretations of the input specifications among the analysts. Therefore, the specifications for the test case itself were iteratively updated to remove ambiguity and to help calibrate interpretations. In addition, a few corrections and modifications were made to the codes as well, which led to excellent agreement between all codes and spreadsheets for this test case. Although no fuel cycle transition analysis codes matched the spreadsheet results exactly, all remaining differences in the results were due to fundamental differences in code structure and/or were thoroughly explained. As a result, the specifications and example results are provided so that they can be used to verify additional codes in the future for such fuel cycle transition scenarios.
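
    As a hedged sketch of the kind of year-by-year mass-flow check described above (not the benchmark's actual data or tooling), the snippet below compares two hypothetical codes' annual results against a reference spreadsheet solution and flags the largest relative discrepancy for each quantity.

```python
# Year-by-year mass flows (invented numbers, tonnes) keyed by quantity name.
reference = {            # the hand-built spreadsheet solution
    "UOX loaded":   [2000, 2010, 2015, 1990, 1800],
    "TRU recycled": [0.0, 0.0, 5.2, 11.8, 19.5],
}
code_results = {
    "CodeA": {"UOX loaded": [2000, 2011, 2014, 1991, 1799],
              "TRU recycled": [0.0, 0.0, 5.1, 11.9, 19.4]},
    "CodeB": {"UOX loaded": [2000, 2010, 2016, 1988, 1805],
              "TRU recycled": [0.0, 0.0, 5.3, 11.7, 19.8]},
}

def max_relative_difference(series, ref, floor=1e-6):
    """Largest |code - reference| / max(|reference|, floor) over all years."""
    return max(abs(a - b) / max(abs(b), floor) for a, b in zip(series, ref))

if __name__ == "__main__":
    for code, results in code_results.items():
        for quantity, ref in reference.items():
            diff = max_relative_difference(results[quantity], ref)
            flag = "OK" if diff < 0.01 else "CHECK"
            print(f"{code:5s} {quantity:13s} max rel. diff = {diff:6.2%}  [{flag}]")
```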

  20. Standardized verification of fuel cycle modeling

    DOE PAGES

    Feng, B.; Dixon, B.; Sunny, E.; ...

    2016-04-05

    A nuclear fuel cycle systems modeling and code-to-code comparison effort was coordinated across multiple national laboratories to verify the tools needed to perform fuel cycle analyses of the transition from a once-through nuclear fuel cycle to a sustainable potential future fuel cycle. For this verification study, a simplified example transition scenario was developed to serve as a test case for the four systems codes involved (DYMOND, VISION, ORION, and MARKAL), each used by a different laboratory participant. In addition, all participants produced spreadsheet solutions for the test case to check all the mass flows and reactor/facility profiles on a year-by-year basis throughout the simulation period. The test case specifications describe a transition from the current US fleet of light water reactors to a future fleet of sodium-cooled fast reactors that continuously recycle transuranic elements as fuel. After several initial coordinated modeling and calculation attempts, it was revealed that most of the differences in code results were not due to different code algorithms or calculation approaches, but due to different interpretations of the input specifications among the analysts. Therefore, the specifications for the test case itself were iteratively updated to remove ambiguity and to help calibrate interpretations. In addition, a few corrections and modifications were made to the codes as well, which led to excellent agreement between all codes and spreadsheets for this test case. Although no fuel cycle transition analysis codes matched the spreadsheet results exactly, all remaining differences in the results were due to fundamental differences in code structure and/or were thoroughly explained. As a result, the specifications and example results are provided so that they can be used to verify additional codes in the future for such fuel cycle transition scenarios.

  1. The Challenge for Arms Control Verification in the Post-New START World

    SciTech Connect

    Wuest, C R

    2012-05-24

    Nuclear weapon arms control treaty verification is a key aspect of any agreement between signatories to establish that the terms and conditions spelled out in the treaty are being met. Historically, arms control negotiations have focused more on the rules and protocols for reducing the numbers of warheads and delivery systems - sometimes resorting to complex and arcane procedures for counting forces - in an attempt to address perceived or real imbalances in a nation's strategic posture that could lead to instability. Verification procedures are generally defined in arms control treaties and supporting documents and tend to focus on technical means and measures designed to ensure that a country is following the terms of the treaty and that it is not liable to engage in deception or outright cheating in an attempt to circumvent the spirit and the letter of the agreement. As the Obama Administration implements the articles, terms, and conditions of the recently ratified and entered-into-force New START treaty, there are already efforts within and outside of government to move well below the specified New START levels of 1550 warheads, 700 deployed strategic delivery vehicles, and 800 deployed and nondeployed strategic launchers (Inter-Continental Ballistic Missile (ICBM) silos, Submarine-Launched Ballistic Missile (SLBM) tubes on submarines, and bombers). A number of articles and opinion pieces have appeared that advocate for significantly deeper cuts in the U.S. nuclear stockpile, with some suggesting that unilateral reductions on the part of the U.S. would help coax Russia and others to follow our lead. Papers and studies prepared for the U.S. Department of Defense and at the U.S. Air War College have also been published, suggesting that nuclear forces totaling no more than about 300 warheads would be sufficient to meet U.S. national security and deterrence needs. (Davis 2011, Schaub and Forsyth 2010) Recent articles by James M. Acton and others suggest that the

  2. The Microbiology of Subsurface, Salt-Based Nuclear Waste Repositories: Using Microbial Ecology, Bioenergetics, and Projected Conditions to Help Predict Microbial Effects on Repository Performance

    SciTech Connect

    Swanson, Juliet S.; Cherkouk, Andrea; Arnold, Thuro; Meleshyn, Artur; Reed, Donald T.

    2016-11-17

    This report summarizes the potential role of microorganisms in salt-based nuclear waste repositories using available information on the microbial ecology of hypersaline environments, the bioenergetics of survival under high ionic strength conditions, and “repository microbiology” related studies. In areas where microbial activity is in question, there may be a need to shift the research focus toward feasibility studies rather than studies that generate actual input for performance assessments. In areas where activity is not necessary to affect performance (e.g., biocolloid transport), repository-relevant data should be generated. Both approaches will lend a realistic perspective to a safety case/performance scenario that will most likely underscore the conservative value of that case.

  3. Verification Of Tooling For Robotic Welding

    NASA Technical Reports Server (NTRS)

    Osterloh, Mark R.; Sliwinski, Karen E.; Anderson, Ronald R.

    1991-01-01

    Computer simulations, robotic inspections, and visual inspections performed to detect discrepancies. Method for verification of tooling for robotic welding involves combination of computer simulations and visual inspections. Verification process ensures accuracy of mathematical model representing tooling in off-line programming system that numerically simulates operation of robotic welding system. Process helps prevent damaging collisions between welding equipment and workpiece, ensures tooling positioned and oriented properly with respect to workpiece, and/or determines whether tooling to be modified or adjusted to achieve foregoing objectives.

  4. Natural Analogues - One Way to Help Build Public Confidence in the Predicted Performance of a Mined Geologic Repository for Nuclear Waste

    SciTech Connect

    Stuckless, J. S.

    2002-02-26

    The general public needs to have a way to judge the predicted long-term performance of the potential high-level nuclear waste repository at Yucca Mountain. The applicability and reliability of mathematical models used to make this prediction are neither easily understood nor accepted by the public. Natural analogues can provide the average person with a tool to assess the predicted performance and other scientific conclusions. For example, hydrologists with the Yucca Mountain Project have predicted that most of the water moving through the unsaturated zone at Yucca Mountain, Nevada will move through the host rock and around tunnels. Thus, seepage into tunnels is predicted to be a small percentage of available infiltration. This hypothesis can be tested experimentally and with some quantitative analogues. It can also be tested qualitatively using a variety of analogues such as (1) well-preserved Paleolithic to Neolithic paintings in caves and rock shelters, (2) biological remains preserved in caves and rock shelters, and (3) artifacts and paintings preserved in man-made underground openings. These examples can be found in materials that are generally available to the non-scientific public and can demonstrate the surprising degree of preservation of fragile and easily destroyed materials for very long periods of time within the unsaturated zone.

  5. Regression Verification Using Impact Summaries

    NASA Technical Reports Server (NTRS)

    Backes, John; Person, Suzette J.; Rungta, Neha; Thachuk, Oksana

    2013-01-01

    Regression verification techniques are used to prove equivalence of syntactically similar programs. Checking equivalence of large programs, however, can be computationally expensive. Existing regression verification techniques rely on abstraction and decomposition techniques to reduce the computational effort of checking equivalence of the entire program. These techniques are sound but not complete. In this work, we propose a novel approach to improve scalability of regression verification by classifying the program behaviors generated during symbolic execution as either impacted or unimpacted. Our technique uses a combination of static analysis and symbolic execution to generate summaries of impacted program behaviors. The impact summaries are then checked for equivalence using an off-the-shelf decision procedure. We prove that our approach is both sound and complete for sequential programs, with respect to the depth bound of symbolic execution. Our evaluation on a set of sequential C artifacts shows that reducing the size of the summaries can help reduce the cost of software equivalence checking. Various reduction, abstraction, and compositional techniques have been developed to help scale software verification techniques to industrial-sized systems. Although such techniques have greatly increased the size and complexity of systems that can be checked, analysis of large software systems remains costly. Regression analysis techniques, e.g., regression testing [16], regression model checking [22], and regression verification [19], restrict the scope of the analysis by leveraging the differences between program versions. These techniques are based on the idea that if code is checked early in development, then subsequent versions can be checked against a prior (checked) version, leveraging the results of the previous analysis to reduce analysis cost of the current version. Regression verification addresses the problem of proving equivalence of closely related program
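
    The paper's own toolchain (symbolic execution plus impact summaries) is not reproduced here; the sketch below shows only the final equivalence-check step on two versions of a tiny function, using the Z3 SMT solver (the z3-solver package) as the off-the-shelf decision procedure. The example expressions and the single "impacted" path condition are invented.

```python
# pip install z3-solver
from z3 import Int, Solver, If, Not, unsat

x = Int("x")

# Two syntactically different versions of the same small function (invented example).
old_version = If(x >= 0, 2 * x + 1, 1 - 2 * x)
new_version = 2 * If(x >= 0, x, -x) + 1

def equivalent_on(path_condition):
    """Equivalence check restricted to one 'impacted' path, via an SMT solver."""
    s = Solver()
    s.add(path_condition)                      # summary of the impacted behaviors
    s.add(Not(old_version == new_version))     # look for a distinguishing input
    return s.check() == unsat                  # unsat => equivalent on this path

if __name__ == "__main__":
    print("equivalent on impacted path x >= 0:", equivalent_on(x >= 0))
    print("equivalent on impacted path x < 0: ", equivalent_on(x < 0))
```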

  6. Multidimensional Analysis of Nuclear Detonations

    DTIC Science & Technology

    2015-09-17

    reconstructions and temperature distributions of the early time nuclear fireballs. Initial developments have resulted in the first 2-dimensional... temperature distribution of a nuclear fireball using digitized film. This temperature analysis underwent verification using the Digital Imaging and Remote... temperature profile of the nuclear fireball as a function of optical path length. A 3-dimensional reconstruction was performed using a variation of a

  7. Systems Approach to Arms Control Verification

    SciTech Connect

    Allen, K; Neimeyer, I; Listner, C; Stein, G; Chen, C; Dreicer, M

    2015-05-15

    Using the decades of experience of developing concepts and technologies for verifying bilateral and multilateral arms control agreements, a broad conceptual systems approach is being developed that takes into account varying levels of information and risk. The IAEA has already demonstrated the applicability of a systems approach by implementing safeguards at the State level, with acquisition path analysis as the key element. In order to test whether such an approach could also be implemented for arms control verification, an exercise was conducted in November 2014 at the JRC ITU Ispra. Based on the scenario of a hypothetical treaty between two model nuclear weapons states aimed at capping their nuclear arsenals at existing levels, the goal of this exercise was to explore how to use acquisition path analysis in an arms control context. Our contribution will present the scenario, objectives and results of this exercise, and attempt to define future workshops aimed at further developing verification measures that will deter or detect treaty violations.
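
    The exercise's actual acquisition path analysis methodology is not detailed in the abstract; as a hedged illustration of the underlying idea, the sketch below enumerates all acyclic paths through a small, invented directed graph of material and process states leading to weapons-usable material, the kind of path set such an analysis would then rank by detection probability or effort.

```python
# Invented, highly simplified acquisition network: nodes are material/process states,
# edges are steps a proliferator could take. Real analyses attach detection
# probabilities and effort estimates to each edge.
ACQUISITION_GRAPH = {
    "natural U":                ["conversion"],
    "conversion":               ["enrichment (declared)", "enrichment (clandestine)"],
    "enrichment (declared)":    ["HEU"],
    "enrichment (clandestine)": ["HEU"],
    "spent fuel":               ["reprocessing"],
    "reprocessing":             ["separated Pu"],
    "HEU":                      ["weapons-usable material"],
    "separated Pu":             ["weapons-usable material"],
    "weapons-usable material":  [],
}

def enumerate_paths(graph, start, goal, path=None):
    """Depth-first enumeration of all acyclic paths from start to goal."""
    path = (path or []) + [start]
    if start == goal:
        yield path
        return
    for nxt in graph.get(start, []):
        if nxt not in path:                      # avoid cycles
            yield from enumerate_paths(graph, nxt, goal, path)

if __name__ == "__main__":
    for source in ("natural U", "spent fuel"):
        for p in enumerate_paths(ACQUISITION_GRAPH, source, "weapons-usable material"):
            print(" -> ".join(p))
```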

  8. Packaged low-level waste verification system

    SciTech Connect

    Tuite, K.T.; Winberg, M.; Flores, A.Y.; Killian, E.W.; McIsaac, C.V.

    1996-08-01

    Currently, states and low-level radioactive waste (LLW) disposal site operators have no method of independently verifying the radionuclide content of packaged LLW that arrives at disposal sites. At this time, disposal sites rely on LLW generator shipping manifests and accompanying records to ensure that LLW received meets the waste acceptance criteria. An independent verification system would provide a method of checking generator LLW characterization methods and help ensure that LLW disposed of at disposal facilities meets requirements. The Mobile Low-Level Waste Verification System (MLLWVS) provides the equipment, software, and methods to enable the independent verification of LLW shipping records to ensure that disposal site waste acceptance criteria are being met. The MLLWVS system was developed under a cost share subcontract between WMG, Inc., and Lockheed Martin Idaho Technologies through the Department of Energy's National Low-Level Waste Management Program at the Idaho National Engineering Laboratory (INEL).

  9. National Center for Nuclear Security: The Nuclear Forensics Project (F2012)

    SciTech Connect

    Klingensmith, A. L.

    2012-03-21

    These presentation visuals introduce the National Center for Nuclear Security. Its chartered mission is to enhance the Nation’s verification and detection capabilities in support of nuclear arms control and nonproliferation through R&D activities at the NNSS. It has three focus areas: Treaty Verification Technologies, Nonproliferation Technologies, and Technical Nuclear Forensics. The objectives of nuclear forensics are to reduce uncertainty in the nuclear forensics process and improve the scientific defensibility of nuclear forensics conclusions when applied to near-surface nuclear detonations. Research is in four key areas: Nuclear Physics, Debris collection and analysis, Prompt diagnostics, and Radiochemistry.

  10. Voltage verification unit

    DOEpatents

    Martin, Edward J.

    2008-01-15

    A voltage verification unit and method for determining the absence of potentially dangerous potentials within a power supply enclosure without Mode 2 work is disclosed. With this device and method, a qualified worker follows a relatively simple protocol: a function test (hot, cold, hot) of the voltage verification unit before Lock Out/Tag Out and, once the Lock Out/Tag Out is completed, testing or "trying" by simply reading a display on the voltage verification unit, all without exposure of the operator to the interior of the voltage supply enclosure. According to a preferred embodiment, the voltage verification unit includes test leads to allow diagnostics with other meters, without the necessity of accessing potentially dangerous bus bars or the like.

  11. Seismic Surveillance - Nuclear Test Ban Verification

    DTIC Science & Technology

    1992-03-27


  12. Visual inspection for CTBT verification

    SciTech Connect

    Hawkins, W.; Wohletz, K.

    1997-03-01

    On-site visual inspection will play an essential role in future Comprehensive Test Ban Treaty (CTBT) verification. Although seismic and remote sensing techniques are the best understood and most developed methods for detection of evasive testing of nuclear weapons, visual inspection can greatly augment the certainty and detail of understanding provided by these more traditional methods. Not only can visual inspection offer "ground truth" in cases of suspected nuclear testing, but it also can provide accurate source location and testing media properties necessary for detailed analysis of seismic records. For testing in violation of the CTBT, an offending party may attempt to conceal the test, which most likely will be achieved by underground burial. While such concealment may not prevent seismic detection, evidence of test deployment, location, and yield can be disguised. In this light, if a suspicious event is detected by seismic or other remote methods, visual inspection of the event area is necessary to document any evidence that might support a claim of nuclear testing and provide data needed to further interpret seismic records and guide further investigations. However, the methods for visual inspection are not widely known nor appreciated, and experience is presently limited. Visual inspection can be achieved by simple, non-intrusive means, primarily geological in nature, and it is the purpose of this report to describe the considerations, procedures, and equipment required to field such an inspection.

  13. Verification of Autonomous Systems for Space Applications

    NASA Technical Reports Server (NTRS)

    Brat, G.; Denney, E.; Giannakopoulou, D.; Frank, J.; Jonsson, A.

    2006-01-01

    Autonomous software, especially if it is model-based, can play an important role in future space applications. For example, it can help streamline ground operations, assist in autonomous rendezvous and docking operations, or even help recover from problems (e.g., planners can be used to explore the space of recovery actions for a power subsystem and implement a solution without (or with minimal) human intervention). In general, the exploration capabilities of model-based systems give them great flexibility. Unfortunately, this also makes them unpredictable to our human eyes, both in terms of their execution and their verification. The traditional verification techniques are inadequate for these systems since they are mostly based on testing, which implies a very limited exploration of their behavioral space. In our work, we explore how advanced V&V techniques, such as static analysis, model checking, and compositional verification, can be used to gain trust in model-based systems. We also describe how synthesis can be used in the context of system reconfiguration and in the context of verification.

  14. Turbulence Modeling Verification and Validation

    NASA Technical Reports Server (NTRS)

    Rumsey, Christopher L.

    2014-01-01

    Computational fluid dynamics (CFD) software that solves the Reynolds-averaged Navier-Stokes (RANS) equations has been in routine use for more than a quarter of a century. It is currently employed not only for basic research in fluid dynamics, but also for the analysis and design processes in many industries worldwide, including aerospace, automotive, power generation, chemical manufacturing, polymer processing, and petroleum exploration. A key feature of RANS CFD is the turbulence model. Because the RANS equations are unclosed, a model is necessary to describe the effects of the turbulence on the mean flow, through the Reynolds stress terms. The turbulence model is one of the largest sources of uncertainty in RANS CFD, and most models are known to be flawed in one way or another. Alternative methods such as direct numerical simulations (DNS) and large eddy simulations (LES) rely less on modeling and hence include more physics than RANS. In DNS all turbulent scales are resolved, and in LES the large scales are resolved and the effects of the smallest turbulence scales are modeled. However, both DNS and LES are too expensive for most routine industrial usage on today's computers. Hybrid RANS-LES, which blends RANS near walls with LES away from walls, helps to moderate the cost while still retaining some of the scale-resolving capability of LES, but for some applications it can still be too expensive. Even considering its associated uncertainties, RANS turbulence modeling has proved to be very useful for a wide variety of applications. For example, in the aerospace field, many RANS models are considered to be reliable for computing attached flows. However, existing turbulence models are known to be inaccurate for many flows involving separation. Research has been ongoing for decades in an attempt to improve turbulence models for separated and other nonequilibrium flows. When developing or improving turbulence models, both verification and validation are important
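
    As textbook background for what "modeling the effects of turbulence through the Reynolds stress terms" means concretely (not something specific to this article), the linear eddy-viscosity closure used by many RANS models can be written as:

```latex
% Boussinesq eddy-viscosity closure for the Reynolds stresses
\[
-\overline{u_i' u_j'} \;=\; \nu_t \left( \frac{\partial \bar{u}_i}{\partial x_j}
      + \frac{\partial \bar{u}_j}{\partial x_i} \right)
      \;-\; \frac{2}{3}\, k\, \delta_{ij},
\qquad k \equiv \tfrac{1}{2}\,\overline{u_k' u_k'},
\]
where $\nu_t$ is the turbulent (eddy) viscosity supplied by the turbulence model
and $k$ is the turbulent kinetic energy.
```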

  15. Verification and Validation of Digitally Upgraded Control Rooms

    SciTech Connect

    Boring, Ronald; Lau, Nathan

    2015-09-01

    As nuclear power plants undertake main control room modernization, a challenge is the lack of a clearly defined human factors process to follow. Verification and validation (V&V) as applied in the nuclear power community has tended to involve efforts such as integrated system validation, which comes at the tail end of the design stage. To fill in guidance gaps and create a step-by-step process for control room modernization, we have developed the Guideline for Operational Nuclear Usability and Knowledge Elicitation (GONUKE). This approach builds on best practices in the software industry, which prescribe an iterative user-centered approach featuring multiple cycles of design and evaluation. Nuclear regulatory guidance for control room design emphasizes summative evaluation—which occurs after the design is complete. In the GONUKE approach, evaluation is also performed at the formative stage of design—early in the design cycle using mockups and prototypes for evaluation. The evaluation may involve expert review (e.g., software heuristic evaluation at the formative stage and design verification against human factors standards like NUREG-0700 at the summative stage). The evaluation may also involve user testing (e.g., usability testing at the formative stage and integrated system validation at the summative stage). An additional, often overlooked component of evaluation is knowledge elicitation, which captures operator insights into the system. In this report we outline these evaluation types across design phases that support the overall modernization process. The objective is to provide industry-suitable guidance for steps to be taken in support of the design and evaluation of a new human-machine interface (HMI) in the control room. We suggest the value of early-stage V&V and highlight how this early-stage V&V can help improve the design process for control room modernization. We argue that there is a need to overcome two shortcomings of V&V in current practice

  16. Multi-canister overpack project -- verification and validation, MCNP 4A

    SciTech Connect

    Goldmann, L.H.

    1997-11-10

    This supporting document contains the software verification and validation (V and V) package used for Phase 2 design of the Spent Nuclear Fuel Multi-Canister Overpack. V and V packages for both ANSYS and MCNP are included. Description of Verification Run(s): This software requires that it be compiled specifically for the machine it is to be used on. Therefore, to facilitate the verification process, the software automatically runs 25 sample problems to ensure proper installation and compilation. Once the runs are completed, the software checks for verification by performing a file comparison of each new output file against the corresponding old output file. Any difference between the files causes a verification error. Because of the manner in which the verification is performed, a verification error does not necessarily indicate a problem; it indicates that a closer look at the output files is needed to determine the cause of the error.
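    The installation check described above amounts to a regression comparison of output files. A rough sketch of that kind of check is shown below; the directory layout and file names are hypothetical, not the actual MCNP 4A installation scripts.

        # Illustrative sketch only: compare each new sample-problem output against
        # the stored reference output and flag any difference for closer review.
        import difflib
        from pathlib import Path

        def compare_outputs(new_dir, old_dir, problems):
            failures = {}
            for name in problems:
                new = Path(new_dir, f"{name}.out").read_text().splitlines()
                old = Path(old_dir, f"{name}.out").read_text().splitlines()
                diff = list(difflib.unified_diff(old, new, lineterm=""))
                if diff:
                    # A difference is only a flag for inspection; it may be a harmless
                    # platform artifact rather than a real defect.
                    failures[name] = diff
            return failures

        if __name__ == "__main__":
            problems = [f"inp{i:02d}" for i in range(1, 26)]   # the 25 sample runs
            for name, diff in compare_outputs("new_outputs", "old_outputs", problems).items():
                print(f"verification difference in {name}: {len(diff)} differing lines")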

  17. 77 FR 50723 - Verification, Validation, Reviews, and Audits for Digital Computer Software Used in Safety...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-08-22

    ... COMMISSION Verification, Validation, Reviews, and Audits for Digital Computer Software Used in Safety Systems... Audits for Digital Computer Software used in Safety Systems of Nuclear Power Plants.'' The DG-1210 is... development of, and compliance with, software verification and validation reviews and audits described in...

  18. Explaining Verification Conditions

    NASA Technical Reports Server (NTRS)

    Denney, Ewen; Fischer, Bernd

    2006-01-01

    The Hoare approach to program verification relies on the construction and discharge of verification conditions (VCs) but offers no support to trace, analyze, and understand the VCs themselves. We describe a systematic extension of the Hoare rules by labels so that the calculus itself can be used to build up explanations of the VCs. The labels are maintained through the different processing steps and rendered as natural language explanations. The explanations can easily be customized and can capture different aspects of the VCs; here, we focus on their structure and purpose. The approach is fully declarative and the generated explanations are based only on an analysis of the labels rather than directly on the logical meaning of the underlying VCs or their proofs. Keywords: program verification, Hoare calculus, traceability.
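    The flavor of the idea can be shown on straight-line code with a toy sketch (my own illustration, not the authors' system): a weakest-precondition calculator that attaches a label to each verification condition so that the labels, rather than the underlying logic, can later be rendered as an explanation.

        # Toy labeled weakest-precondition calculator for assignments and sequencing.
        def subst(pred, var, expr):
            """Naive textual substitution pred[expr/var]; adequate for this toy example."""
            return pred.replace(var, f"({expr})")

        def wp(stmt, post, labels):
            kind = stmt[0]
            if kind == "assign":                      # ("assign", var, expr)
                _, var, expr = stmt
                labels.append(f"establishes '{post}' after {var} := {expr}")
                return subst(post, var, expr)
            if kind == "seq":                         # ("seq", s1, s2)
                _, s1, s2 = stmt
                return wp(s1, wp(s2, post, labels), labels)
            raise ValueError(f"unsupported statement: {kind}")

        if __name__ == "__main__":
            prog = ("seq", ("assign", "x", "x + 1"), ("assign", "y", "x * 2"))
            labels = []
            vc = wp(prog, "y > 0", labels)
            print("VC: requires", vc)                 # precondition that must hold
            for why in labels:
                print("  because it", why)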

  19. Voice verification upgrade

    NASA Astrophysics Data System (ADS)

    Davis, R. L.; Sinnamon, J. T.; Cox, D. L.

    1982-06-01

    This contract had two major objectives. The first was to build, test, and deliver to the government an entry control system using speaker verification (voice authentication) as the mechanism for verifying the user's claimed identity. This system included a physical mantrap, with an integral weight scale to prevent more than one user from gaining access with one verification (tailgating). The speaker verification part of the entry control system contained all the updates and embellishments to the algorithm that was developed earlier for the BISS (Base and Installation Security System) system under contract with the Electronic Systems Division of the USAF. These updates were tested prior to and during the contract on an operational system used at Texas Instruments in Dallas, Texas, for controlling entry to the Corporate Information Center (CIC).

  20. Requirement Assurance: A Verification Process

    NASA Technical Reports Server (NTRS)

    Alexander, Michael G.

    2011-01-01

    Requirement Assurance is an act of requirement verification which assures the stakeholder or customer that a product requirement has produced its "as realized product" and has been verified with conclusive evidence. Product requirement verification answers the question, "did the product meet the stated specification, performance, or design documentation?". In order to ensure the system was built correctly, the practicing system engineer must verify each product requirement using verification methods of inspection, analysis, demonstration, or test. The products of these methods are the "verification artifacts" or "closure artifacts" which are the objective evidence needed to prove the product requirements meet the verification success criteria. Institutional direction is given to the System Engineer in NPR 7123.1A NASA Systems Engineering Processes and Requirements with regards to the requirement verification process. In response, the verification methodology offered in this report meets both the institutional process and requirement verification best practices.
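    A minimal sketch of this bookkeeping is shown below; the requirements and artifacts are hypothetical examples, not content of NPR 7123.1A. Each requirement carries its chosen verification method and is treated as closed only once objective-evidence artifacts are attached.

        from dataclasses import dataclass, field

        METHODS = {"inspection", "analysis", "demonstration", "test"}

        @dataclass
        class Requirement:
            req_id: str
            text: str
            method: str
            closure_artifacts: list = field(default_factory=list)

            def __post_init__(self):
                if self.method not in METHODS:
                    raise ValueError(f"unknown verification method: {self.method}")

            @property
            def closed(self):
                # Closed only when objective evidence (closure artifacts) exists.
                return bool(self.closure_artifacts)

        if __name__ == "__main__":
            reqs = [
                Requirement("SYS-001", "Peak acceleration shall not exceed 10 g.", "test",
                            ["TR-1042 vibration test report"]),
                Requirement("SYS-002", "Mass shall not exceed 250 kg.", "analysis"),
            ]
            for r in reqs:
                print(f"{r.req_id} [{r.method}] {'closed' if r.closed else 'OPEN'}")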

  1. General Environmental Verification Specification

    NASA Technical Reports Server (NTRS)

    Milne, J. Scott, Jr.; Kaufman, Daniel S.

    2003-01-01

    The NASA Goddard Space Flight Center's General Environmental Verification Specification (GEVS) for STS and ELV Payloads, Subsystems, and Components is currently being revised based on lessons learned from GSFC engineering and flight assurance. The GEVS has been used by Goddard flight projects for the past 17 years as a baseline from which to tailor their environmental test programs. A summary of the requirements and updates is presented along with the rationale behind the changes. The major test areas covered by the GEVS include mechanical, thermal, and EMC, as well as more general requirements for planning and tracking of the verification programs.

  2. Voice Verification Upgrade.

    DTIC Science & Technology

    1982-06-01

    ...to develop speaker verification techniques for use over degraded communication channels -- specifically telephone lines. A test of BISS-type speaker verification technology was performed on a degraded channel and compensation techniques were then developed.

  3. Design verification and validation plan for the cold vacuum drying facility

    SciTech Connect

    NISHIKAWA, L.D.

    1999-06-03

    The Cold Vacuum Drying Facility (CVDF) provides the required process systems, supporting equipment, and facilities needed for drying spent nuclear fuel removed from the K Basins. This document presents both the completed and the planned design verification and validation activities.

  4. Conceptual design. Final report: TFE Verification Program

    SciTech Connect

    Not Available

    1994-03-01

    This report documents the TFE Conceptual Design, which provided the design guidance for the TFE Verification Program. The primary goals of this design effort were: (1) establish the conceptual design of an in-core thermionic reactor for a 2 MW(e) space nuclear power system with a 7-year operating lifetime; (2) demonstrate scalability of the above concept over the output power range of 500 kW(e) to 5 MW(e); and (3) define the TFE which is the basis for the 2 MW(e) reactor design. This TFE specification provided the basis for the test program. These primary goals were achieved. The technical approach taken in the conceptual design effort is discussed in Section 2, and the results are discussed in Section 3. The remainder of this introduction draws a perspective on the role that this conceptual design task played in the TFE Verification Program.

  5. Computer Graphics Verification

    NASA Technical Reports Server (NTRS)

    1992-01-01

    Video processing creates technical animation sequences using studio-quality equipment to realistically represent fluid flow over space shuttle surfaces, helicopter rotors, and turbine blades. Computer systems co-op Tim Weatherford is shown performing computer graphics verification as part of a co-op brochure.

  6. ENVIRONMENTAL TECHNOLOGY VERIFICATION PROGRAM

    EPA Science Inventory

    This presentation will be given at the EPA Science Forum 2005 in Washington, DC. The Environmental Technology Verification Program (ETV) was initiated in 1995 to speed implementation of new and innovative commercial-ready environmental technologies by providing objective, 3rd pa...

  7. FPGA Verification Accelerator (FVAX)

    NASA Technical Reports Server (NTRS)

    Oh, Jane; Burke, Gary

    2008-01-01

    Is verification acceleration possible? Increasing the visibility of the internal nodes of the FPGA results in much faster debug time, and forcing internal signals directly allows a problem condition to be set up very quickly. Is this all? No; this is part of a comprehensive effort to improve the JPL FPGA design and V&V process.

  8. Exomars Mission Verification Approach

    NASA Astrophysics Data System (ADS)

    Cassi, Carlo; Gilardi, Franco; Bethge, Boris

    According to the long-term cooperation plan established by ESA and NASA in June 2009, the ExoMars project now consists of two missions: A first mission will be launched in 2016 under ESA lead, with the objectives to demonstrate the European capability to safely land a surface package on Mars, to perform Mars atmosphere investigation, and to provide communication capability for present and future ESA/NASA missions. For this mission ESA provides a spacecraft composite, made up of an "Entry Descent & Landing Demonstrator Module (EDM)" and a Mars Orbiter Module (OM); NASA provides the Launch Vehicle and the scientific instruments located on the Orbiter for Mars atmosphere characterisation. A second mission, with its launch foreseen in 2018, is led by NASA, which provides the spacecraft and launcher, the EDL system, and a rover. ESA contributes the ExoMars Rover Module (RM) to provide surface mobility. It includes a drill system allowing drilling down to 2 meters, collecting samples and investigating them for signs of past and present life with exobiological experiments, and investigating the Mars water/geochemical environment. In this scenario Thales Alenia Space Italia, as ESA prime industrial contractor, is in charge of the design, manufacturing, integration and verification of the ESA ExoMars modules, i.e.: the Spacecraft Composite (OM + EDM) for the 2016 mission, the RM for the 2018 mission and the Rover Operations Control Centre, which will be located at Altec-Turin (Italy). The verification process of the above products is quite complex and will include some peculiarities with limited or no heritage in Europe. Furthermore the verification approach has to be optimised to allow full verification despite significant schedule and budget constraints. The paper presents the verification philosophy tailored for the ExoMars mission in line with the above considerations, starting from the model philosophy, showing the verification activities flow and the sharing of tests

  9. 10 CFR 63.47 - Facility information and verification.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 10 Energy 2 2010-01-01 2010-01-01 false Facility information and verification. 63.47 Section 63.47 Energy NUCLEAR REGULATORY COMMISSION (CONTINUED) DISPOSAL OF HIGH-LEVEL RADIOACTIVE WASTES IN A GEOLOGIC REPOSITORY AT YUCCA MOUNTAIN, NEVADA Licenses Us/iaea Safeguards Agreement § 63.47 Facility information...

  10. 10 CFR 63.47 - Facility information and verification.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... 10 Energy 2 2012-01-01 2012-01-01 false Facility information and verification. 63.47 Section 63.47 Energy NUCLEAR REGULATORY COMMISSION (CONTINUED) DISPOSAL OF HIGH-LEVEL RADIOACTIVE WASTES IN A GEOLOGIC REPOSITORY AT YUCCA MOUNTAIN, NEVADA Licenses Us/iaea Safeguards Agreement § 63.47 Facility information...

  11. 10 CFR 63.47 - Facility information and verification.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... 10 Energy 2 2011-01-01 2011-01-01 false Facility information and verification. 63.47 Section 63.47 Energy NUCLEAR REGULATORY COMMISSION (CONTINUED) DISPOSAL OF HIGH-LEVEL RADIOACTIVE WASTES IN A GEOLOGIC REPOSITORY AT YUCCA MOUNTAIN, NEVADA Licenses Us/iaea Safeguards Agreement § 63.47 Facility information...

  12. 10 CFR 63.47 - Facility information and verification.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... 10 Energy 2 2014-01-01 2014-01-01 false Facility information and verification. 63.47 Section 63.47 Energy NUCLEAR REGULATORY COMMISSION (CONTINUED) DISPOSAL OF HIGH-LEVEL RADIOACTIVE WASTES IN A GEOLOGIC REPOSITORY AT YUCCA MOUNTAIN, NEVADA Licenses Us/iaea Safeguards Agreement § 63.47 Facility information...

  13. 10 CFR 63.47 - Facility information and verification.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... 10 Energy 2 2013-01-01 2013-01-01 false Facility information and verification. 63.47 Section 63.47 Energy NUCLEAR REGULATORY COMMISSION (CONTINUED) DISPOSAL OF HIGH-LEVEL RADIOACTIVE WASTES IN A GEOLOGIC REPOSITORY AT YUCCA MOUNTAIN, NEVADA Licenses Us/iaea Safeguards Agreement § 63.47 Facility information...

  14. Improved Verification for Aerospace Systems

    NASA Technical Reports Server (NTRS)

    Powell, Mark A.

    2008-01-01

    Aerospace systems are subject to many stringent performance requirements to be verified with low risk. This report investigates verification planning using conditional approaches vice the standard classical statistical methods, and usage of historical surrogate data for requirement validation and in verification planning. The example used in this report to illustrate the results of these investigations is a proposed mission assurance requirement with the concomitant maximum acceptable verification risk for the NASA Constellation Program Orion Launch Abort System (LAS). This report demonstrates the following improvements: 1) verification planning using conditional approaches vice classical statistical methods results in plans that are more achievable and feasible; 2) historical surrogate data can be used to bound validation of performance requirements; and, 3) incorporation of historical surrogate data in verification planning using conditional approaches produces even less costly and more reasonable verification plans. The procedures presented in this report may produce similar improvements and cost savings in verification for any stringent performance requirement for an aerospace system.
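    One way such a conditional approach can work is sketched below under my own assumptions (the numbers are illustrative, not Orion LAS values): historical surrogate data shape a Beta prior on demonstrated reliability, and the plan is the smallest number of additional failure-free tests that pushes the posterior probability of meeting the requirement above one minus the acceptable verification risk.

        # Hedged sketch of conditional (Bayesian) verification planning.
        from scipy.stats import beta

        def tests_needed(r_req, max_risk, prior_a, prior_b, max_tests=200):
            """Smallest n of consecutive successes with P(R >= r_req) >= 1 - max_risk."""
            for n in range(max_tests + 1):
                posterior = beta(prior_a + n, prior_b)        # n successes, no failures
                if posterior.sf(r_req) >= 1.0 - max_risk:     # posterior P(R >= r_req)
                    return n
            return None

        if __name__ == "__main__":
            # Hypothetical surrogate history: 18 successes and 1 failure on top of a
            # uniform Beta(1, 1) starting prior gives Beta(19, 2).
            n = tests_needed(r_req=0.95, max_risk=0.10, prior_a=19, prior_b=2)
            print("additional failure-free tests required:", n)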

  15. Parent Tookit: Homework Help. Helpful Tips.

    ERIC Educational Resources Information Center

    All Kinds of Minds, 2006

    2006-01-01

    This check list contains tips for parents to help students reinforce and build upon what children learn at school: (1) Set a consistent time each day for doing homework; (2) Encourage children to make a homework checklist; (3) Provide assistance to help get started on a task; (4) Help children make a list of all needed materials before starting…

  16. Teaching "The Nuclear Predicament."

    ERIC Educational Resources Information Center

    Carman, Philip; Kneeshaw, Stephen

    1987-01-01

    Contends that courses on nuclear war must help students examine the political, social, religious, philosophical, economic, and moral assumptions which characterized the dilemma of nuclear armament/disarmament. Describes the upper level undergraduate course taught by the authors. (JDH)

  17. Plan for a laser weapon verification research program

    SciTech Connect

    Karr, T.J.

    1990-03-01

    Currently there is great interest in the question of how, or even whether, a treaty limiting the development and deployment of laser weapons could be verified. The concept of cooperative laser weapon verification is that each party would place monitoring stations near the other party's declared or suspect laser weapon facilities. The monitoring stations would measure the "primary laser observables" such as power or energy, either directly or by collecting laser radiation scattered from the air or the target, and would verify that the laser is operated within treaty limits. This concept is modeled along the lines of the seismic network recently activated in the USSR as a joint project of the United States Geologic Survey and the Soviet Academy of Sciences. The seismic data, gathered cooperatively, can be used by each party as it wishes, including to support verification of future nuclear test ban treaties. For laser weapon verification the monitoring stations are envisioned as ground-based, and would verify treaty limitations on ground-based laser anti-satellite (ASAT) weapons and on the ground-based development of other laser weapons. They would also contribute to verification of limitations on air-, sea- and space-based laser weapons, and the technology developed for cooperative verification could also be used in national technical means of verification. 2 figs., 4 tabs.

  18. National Center for Nuclear Security - NCNS

    SciTech Connect

    2014-11-12

    As the United States embarks on a new era of nuclear arms control, the tools for treaty verification must be accurate and reliable, and must work at stand-off distances. The National Center for Nuclear Security, or NCNS, at the Nevada National Security Site, is poised to become the proving ground for these technologies. The center is a unique test bed for non-proliferation and arms control treaty verification technologies. The NNSS is an ideal location for these kinds of activities because of its multiple environments, its cadre of experienced nuclear personnel, and the artifacts of atmospheric and underground nuclear weapons explosions. The NCNS will provide future treaty negotiators with solid data on verification and inspection regimes and a realistic environment in which future treaty verification specialists can be trained. Work on warhead monitoring at the NCNS will also support future arms reduction treaties.

  19. National Center for Nuclear Security - NCNS

    ScienceCinema

    None

    2016-07-12

    As the United States embarks on a new era of nuclear arms control, the tools for treaty verification must be accurate and reliable, and must work at stand-off distances. The National Center for Nuclear Security, or NCNS, at the Nevada National Security Site, is poised to become the proving ground for these technologies. The center is a unique test bed for non-proliferation and arms control treaty verification technologies. The NNSS is an ideal location for these kinds of activities because of its multiple environments, its cadre of experienced nuclear personnel, and the artifacts of atmospheric and underground nuclear weapons explosions. The NCNS will provide future treaty negotiators with solid data on verification and inspection regimes and a realistic environment in which future treaty verification specialists can be trained. Work on warhead monitoring at the NCNS will also support future arms reduction treaties.

  20. Microcode Verification Project.

    DTIC Science & Technology

    1980-05-01

    ...in the production, testing, and maintenance of Air Force software. This effort was undertaken in response to that goal. The objective of the effort was ... rather than hard wiring, is a recent development in computer technology. Hardware diagnostics do not fulfill testing requirements for these computers

  1. Robust verification analysis

    NASA Astrophysics Data System (ADS)

    Rider, William; Witkowski, Walt; Kamm, James R.; Wildey, Tim

    2016-02-01

    We introduce a new methodology for inferring the accuracy of computational simulations through the practice of solution verification. We demonstrate this methodology on examples from computational heat transfer, fluid dynamics and radiation transport. Our methodology is suited to both well- and ill-behaved sequences of simulations. Our approach to the analysis of these sequences of simulations incorporates expert judgment into the process directly via a flexible optimization framework, and the application of robust statistics. The expert judgment is systematically applied as constraints to the analysis, and together with the robust statistics guards against over-emphasis on anomalous analysis results. We have named our methodology Robust Verification. Our methodology is based on utilizing multiple constrained optimization problems to solve the verification model in a manner that varies the analysis' underlying assumptions. Constraints applied in the analysis can include expert judgment regarding convergence rates (bounds and expectations) as well as bounding values for physical quantities (e.g., positivity of energy or density). This approach then produces a number of error models, which are then analyzed through robust statistical techniques (median instead of mean statistics). This provides self-contained, data and expert informed error estimation including uncertainties for both the solution itself and order of convergence. Our method produces high quality results for the well-behaved cases relatively consistent with existing practice. The methodology can also produce reliable results for ill-behaved circumstances predicated on appropriate expert judgment. We demonstrate the method and compare the results with standard approaches used for both code and solution verification on well-behaved and ill-behaved simulations.
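    A stripped-down sketch in the spirit of that description (not the authors' exact formulation) is given below: each consecutive grid triple yields its own error model, the observed convergence order is clipped to expert-judgment bounds, and the collection of models is summarized with median statistics.

        import math
        import statistics

        def error_models(f, r, p_bounds=(0.5, 4.0)):
            """Return (order, extrapolated value) for each consecutive solution triple."""
            models = []
            for k in range(len(f) - 2):
                f1, f2, f3 = f[k], f[k + 1], f[k + 2]        # coarse -> fine
                p = math.log(abs((f1 - f2) / (f2 - f3))) / math.log(r)
                p = min(max(p, p_bounds[0]), p_bounds[1])    # expert-judgment constraint
                f_exact = f3 + (f3 - f2) / (r**p - 1.0)      # Richardson extrapolation
                models.append((p, f_exact))
            return models

        if __name__ == "__main__":
            # Hypothetical mesh study with refinement ratio 2 and mildly noisy data.
            f = [1.0450, 1.0112, 1.0031, 1.0008]
            models = error_models(f, r=2.0)
            print("median observed order :", statistics.median(p for p, _ in models))
            print("median extrapolation  :", statistics.median(fe for _, fe in models))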

  2. Robust verification analysis

    SciTech Connect

    Rider, William; Witkowski, Walt; Kamm, James R.; Wildey, Tim

    2016-02-15

    We introduce a new methodology for inferring the accuracy of computational simulations through the practice of solution verification. We demonstrate this methodology on examples from computational heat transfer, fluid dynamics and radiation transport. Our methodology is suited to both well- and ill-behaved sequences of simulations. Our approach to the analysis of these sequences of simulations incorporates expert judgment into the process directly via a flexible optimization framework, and the application of robust statistics. The expert judgment is systematically applied as constraints to the analysis, and together with the robust statistics guards against over-emphasis on anomalous analysis results. We have named our methodology Robust Verification. Our methodology is based on utilizing multiple constrained optimization problems to solve the verification model in a manner that varies the analysis' underlying assumptions. Constraints applied in the analysis can include expert judgment regarding convergence rates (bounds and expectations) as well as bounding values for physical quantities (e.g., positivity of energy or density). This approach then produces a number of error models, which are then analyzed through robust statistical techniques (median instead of mean statistics). This provides self-contained, data and expert informed error estimation including uncertainties for both the solution itself and order of convergence. Our method produces high quality results for the well-behaved cases relatively consistent with existing practice. The methodology can also produce reliable results for ill-behaved circumstances predicated on appropriate expert judgment. We demonstrate the method and compare the results with standard approaches used for both code and solution verification on well-behaved and ill-behaved simulations.

  3. RESRAD-BUILD verification.

    SciTech Connect

    Kamboj, S.; Yu, C.; Biwer, B. M.; Klett, T.

    2002-01-31

    The results generated by the RESRAD-BUILD code (version 3.0) were verified with hand or spreadsheet calculations using equations given in the RESRAD-BUILD manual for different pathways. For verification purposes, different radionuclides--H-3, C-14, Na-22, Al-26, Cl-36, Mn-54, Co-60, Au-195, Ra-226, Ra-228, Th-228, and U-238--were chosen to test all pathways and models. Tritium, Ra-226, and Th-228 were chosen because of the special tritium and radon models in the RESRAD-BUILD code. Other radionuclides were selected to represent a spectrum of radiation types and energies. Verification of the RESRAD-BUILD code was conducted with an initial check of all the input parameters for correctness against their original source documents. Verification of the calculations was performed external to the RESRAD-BUILD code with Microsoft Excel to verify all the major portions of the code. In some cases, RESRAD-BUILD results were compared with those of external codes, such as MCNP (Monte Carlo N-particle) and RESRAD. The verification was conducted on a step-by-step basis and used different test cases as templates. The following types of calculations were investigated: (1) source injection rate, (2) air concentration in the room, (3) air particulate deposition, (4) radon pathway model, (5) tritium model for volume source, (6) external exposure model, (7) different pathway doses, and (8) time dependence of dose. Some minor errors were identified in version 3.0; these errors have been corrected in later versions of the code. Some possible improvements in the code were also identified.

  4. International and national security applications of cryogenic detectors - mostly nuclear safeguards

    SciTech Connect

    Rabin, Michael W

    2009-01-01

    As with science, so with security - in both arenas, the extraordinary sensitivity of cryogenic sensors enables high-confidence detection and high-precision measurement even of the faintest signals. Science applications are more mature, but several national and international security applications have been identified where cryogenic detectors have high potential payoff. International safeguards and nuclear forensics are areas needing new technology and methods to boost speed, sensitivity, precision and accuracy. Successfully applied, improved nuclear materials analysis will help constrain nuclear materials diversion pathways and contribute to treaty verification. Cryogenic microcalorimeter detectors for X-ray, gamma ray, neutron, and alpha particle spectrometry are under development with these aims in mind. In each case the unsurpassed energy resolution of microcalorimeters reveals previously invisible spectral features of nuclear materials. Preliminary results of quantitative analysis indicate substantial improvements are still possible, but significant work will be required to fully understand the ultimate performance limits.

  5. National and International Security Applications of Cryogenic Detectors—Mostly Nuclear Safeguards

    NASA Astrophysics Data System (ADS)

    Rabin, Michael W.

    2009-12-01

    As with science, so with security—in both arenas, the extraordinary sensitivity of cryogenic sensors enables high-confidence detection and high-precision measurement even of the faintest signals. Science applications are more mature, but several national and international security applications have been identified where cryogenic detectors have high potential payoff. International safeguards and nuclear forensics are areas needing new technology and methods to boost speed, sensitivity, precision and accuracy. Successfully applied, improved nuclear materials analysis will help constrain nuclear materials diversion pathways and contribute to treaty verification. Cryogenic microcalorimeter detectors for X-ray, gamma-ray, neutron, and alpha-particle spectrometry are under development with these aims in mind. In each case the unsurpassed energy resolution of microcalorimeters reveals previously invisible spectral features of nuclear materials. Preliminary results of quantitative analysis indicate substantial improvements are still possible, but significant work will be required to fully understand the ultimate performance limits.

  6. A Cherenkov viewing device for used-fuel verification

    NASA Astrophysics Data System (ADS)

    Attas, E. M.; Chen, J. D.; Young, G. J.

    1990-12-01

    A Cherenkov viewing device (CVD) has been developed to help verify declared inventories of used nuclear fuel stored in water bays. The device detects and amplifies the faint ultraviolet Cherenkov glow from the water surrounding the fuel, producing a real-time visible image on a phosphor screen. Quartz optics, a UV-pass filter and a microchannel-plate image-intensifier tube serve to form the image, which can be photographed or viewed directly through an eyepiece. Normal fuel bay lighting does not interfere with the Cherenkov light image. The CVD has been successfully used to detect anomalous PWR, BWR and CANDU (CANada Deuterium Uranium: registered trademark) fuel assemblies in the presence of normal-burnup assemblies stored in used-fuel bays. The latest version of the CVD, known as Mark IV, is being used by inspectors from the International Atomic Energy Agency for verification of light-water power-reactor fuel. Its design and operation are described, together with plans for further enhancements of the instrumentation.

  7. Quantum money with classical verification

    SciTech Connect

    Gavinsky, Dmitry

    2014-12-04

    We propose and construct a quantum money scheme that allows verification through classical communication with a bank. This is the first demonstration that a secure quantum money scheme exists that does not require quantum communication for coin verification. Our scheme is secure against adaptive adversaries; this property is not directly related to the possibility of classical verification, yet none of the earlier quantum money constructions is known to possess it.

  8. Quantum money with classical verification

    NASA Astrophysics Data System (ADS)

    Gavinsky, Dmitry

    2014-12-01

    We propose and construct a quantum money scheme that allows verification through classical communication with a bank. This is the first demonstration that a secure quantum money scheme exists that does not require quantum communication for coin verification. Our scheme is secure against adaptive adversaries; this property is not directly related to the possibility of classical verification, yet none of the earlier quantum money constructions is known to possess it.

  9. Help Seeking and Receiving.

    ERIC Educational Resources Information Center

    Nadler, Arie

    Although social psychology has always had an interest in helping behavior, only recently has the full complexity of helping relations begun to be researched. Help seeking and receiving in the educational setting raise many issues regarding the use and effectiveness of the help itself. Central to all helping relations is the seeking/receiving…

  10. Freeze verification: time for a fresh approach

    SciTech Connect

    Paine, C.

    1983-01-01

    The administration's claim that some elements of a comprehensive nuclear freeze are unverifiable does not specify the nature of those elements and whether they represent a real threat to national security if we trusted the USSR to comply. The author contends that clandestine development of new weapons will have little strategic effect since both sides already have total destructive power. The risks of noncompliance are largely political and less than the risks of continued arms buildup. Since the USSR would also want the US to be bound by freeze terms, deterrence would come from mutual benefit. Hardliners argue that cheating is easier in a closed society; that our democracy would tend to relax and the USSR would move ahead with its plans for world domination. The author argues that, over time, a freeze would diminish Soviet confidence in its nuclear war fighting capabilities and that adequate verification is possible with monitoring and warning arrangements. (DCK)

  11. International safeguards: Accounting for nuclear materials

    SciTech Connect

    Fishbone, L.G.

    1988-09-28

    Nuclear safeguards applied by the International Atomic Energy Agency (IAEA) are one element of the "non-proliferation regime", the collection of measures whose aim is to forestall the spread of nuclear weapons to countries that do not already possess them. Safeguards verifications provide evidence that nuclear materials in peaceful use for nuclear-power production are properly accounted for. Though carried out in cooperation with nuclear facility operators, the verifications can provide assurance because they are designed with the capability to detect diversion, should it occur. Traditional safeguards verification measures conducted by inspectors of the IAEA include book auditing; counting and identifying containers of nuclear material; measuring nuclear material; photographic and video surveillance; and sealing. Novel approaches to achieve greater efficiency and effectiveness in safeguards verifications are under investigation as the number and complexity of nuclear facilities grow. These include the zone approach, which entails carrying out verifications for groups of facilities collectively, and the randomization approach, which entails carrying out entire inspection visits some fraction of the time on a random basis. Both approaches show promise in particular situations, but, like traditional measures, must be tested to ensure their practical utility. These approaches are covered in this report. 15 refs., 16 figs., 3 tabs.
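    The randomization approach lends itself to a simple back-of-the-envelope calculation (the assumptions here are mine, not the IAEA's): if an inspection visit is carried out in any given period with probability q, a diversion activity spanning k consecutive periods is covered by at least one visit with probability 1 - (1 - q)^k.

        import random

        def detection_probability(q, k):
            # At least one inspection falls within the k-period diversion window.
            return 1.0 - (1.0 - q) ** k

        def simulate(q, k, trials=100_000, seed=1):
            rng = random.Random(seed)
            hits = sum(any(rng.random() < q for _ in range(k)) for _ in range(trials))
            return hits / trials

        if __name__ == "__main__":
            q, k = 0.25, 6          # inspect 25% of periods; diversion spans 6 periods
            print("analytic :", round(detection_probability(q, k), 3))
            print("simulated:", round(simulate(q, k), 3))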

  12. Helping Parents Help Their Children Toward Literacy.

    ERIC Educational Resources Information Center

    Nichols, G. Jeane

    A practicum was designed to help parents of kindergartners in a low income area help their children develop literacy. The primary goal was to secure the active involvement of parents in their children's learning experiences. Other goals included improving kindergarten teachers' communication skills and expanding their strategies for reaching out…

  13. Safeguards for spent fuels: Verification problems

    SciTech Connect

    Pillay, K.K.S.; Picard, R.R.

    1991-01-01

    The accumulation of large quantities of spent nuclear fuels world-wide is a serious problem for international safeguards. A number of International Atomic Energy Agency (IAEA) member states, including the US, consider spent fuel to be a material form for which safeguards cannot be terminated, even after permanent disposal in a geologic repository. Because safeguards requirements for spent fuels are different from those of conventional bulk-handling and item-accounting facilities, there is room for innovation to design a unique safeguards regime for spent fuels that satisfies the goals of the nuclear nonproliferation treaty at a reasonable cost to both the facility and the IAEA. Various strategies being pursued for long-term management of spent fuels are examined with a realistic example to illustrate the problems of verifying safeguards under the present regime. Verification of a safeguards regime for spent fuels requires a mix of standard safeguards approaches, such as quantitative verification and use of seals, with other measures that are unique to spent fuels. 17 refs.

  14. Component testing for dynamic model verification

    NASA Technical Reports Server (NTRS)

    Hasselman, T. K.; Chrostowski, J. D.

    1984-01-01

    Dynamic model verification is the process whereby an analytical model of a dynamic system is compared with experimental data, adjusted if necessary to bring it into agreement with the data, and then qualified for future use in predicting system response in a different dynamic environment. There are various ways to conduct model verification. The approach taken here employs Bayesian statistical parameter estimation. Unlike curve fitting, whose objective is to minimize the difference between some analytical function and a given quantity of test data (or curve), Bayesian estimation attempts also to minimize the difference between the parameter values of that function (the model) and their initial estimates, in a least squares sense. The objectives of dynamic model verification, therefore, are to produce a model which: (1) is in agreement with test data; (2) will assist in the interpretation of test data; (3) can be used to help verify a design; (4) will reliably predict performance; and (5) in the case of space structures, will facilitate dynamic control.
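    For a linear-in-parameters model the stated idea reduces to a regularized least-squares update, sketched below with invented data (this is an illustration of the principle, not the authors' estimator): the objective penalizes both the misfit to the test data and the departure of the parameters from their initial analytical estimates.

        import numpy as np

        def bayesian_update(X, y, theta0, data_weight, prior_weight):
            """Minimize data_weight*||y - X t||^2 + prior_weight*||t - theta0||^2."""
            A = data_weight * X.T @ X + prior_weight * np.eye(len(theta0))
            b = data_weight * X.T @ y + prior_weight * theta0
            return np.linalg.solve(A, b)

        if __name__ == "__main__":
            rng = np.random.default_rng(0)
            true_theta = np.array([2.0, 0.5])                # "true" stiffness/damping scales
            X = rng.normal(size=(20, 2))
            y = X @ true_theta + 0.05 * rng.normal(size=20)  # noisy synthetic test data
            theta0 = np.array([1.5, 0.8])                    # initial analytical estimates
            theta = bayesian_update(X, y, theta0, data_weight=1.0, prior_weight=0.1)
            print("prior estimates:", theta0)
            print("updated values :", np.round(theta, 3))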

  15. Constitutional and legal implications of arms control verification technologies

    SciTech Connect

    Tanzman, E.A.; Haffenden, R.

    1992-09-01

    United States law can both help and hinder the use of instrumentation as a component of arms control verification in this country. It can foster the general use of sophisticated verification technologies, where such devices are consistent with the value attached to privacy by the Fourth Amendment to the United States Constitution. On the other hand, law can hinder reliance on devices that cross this constitutional line, or where such technology itself threatens health, safety, or environment as such threats are defined in federal statutes. The purpose of this conference paper is to explain some of the lessons that have been learned about the relationship between law and verification technologies in the hope that law can help more than hinder. This paper has three parts. In order to start with a common understanding, Part 1 will briefly describe the hierarchy of treaties, the Constitution, federal statutes, and state and local laws. Part 2 will discuss how the specific constitutional requirement that the government respect the right of privacy in all of its endeavors may affect the use of verification technologies. Part 3 will explain the environmental law constraints on verification technology as exemplified by the system of on-site sampling embodied in the current Rolling Text of the Draft Chemical Weapons Convention.

  16. Verification of LHS distributions.

    SciTech Connect

    Swiler, Laura Painton

    2006-04-01

    This document provides verification test results for the normal, lognormal, and uniform distributions that are used in Sandia's Latin Hypercube Sampling (LHS) software. The purpose of this testing is to verify that the sample values being generated in LHS are distributed according to the desired distribution types. The testing of distribution correctness is done by examining summary statistics, graphical comparisons using quantile-quantile plots, and formal statistical tests such as the Chi-square test, the Kolmogorov-Smirnov test, and the Anderson-Darling test. The overall results from the testing indicate that the generation of normal, lognormal, and uniform distributions in LHS is acceptable.
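    A minimal analogue of those checks (not Sandia's actual LHS test harness) is sketched below: draw a one-dimensional Latin hypercube sample of a normal variable and apply the Kolmogorov-Smirnov and Anderson-Darling tests to the generated values.

        import numpy as np
        from scipy import stats

        def lhs_normal(n, mu=0.0, sigma=1.0, seed=0):
            """One-dimensional Latin hypercube sample of a normal distribution."""
            rng = np.random.default_rng(seed)
            strata = (np.arange(n) + rng.uniform(size=n)) / n   # one point per stratum
            rng.shuffle(strata)
            return stats.norm.ppf(strata, loc=mu, scale=sigma)

        if __name__ == "__main__":
            x = lhs_normal(1000)
            ks = stats.kstest(x, "norm")
            ad = stats.anderson(x, dist="norm")
            print(f"KS statistic {ks.statistic:.4f}, p-value {ks.pvalue:.3f}")
            print(f"AD statistic {ad.statistic:.4f}, critical values {ad.critical_values}")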

  17. Production readiness verification testing

    NASA Technical Reports Server (NTRS)

    James, A. M.; Bohon, H. L.

    1980-01-01

    A Production Readiness Verification Testing (PRVT) program has been established to determine if structures fabricated from advanced composites can be committed on a production basis to commercial airline service. The program utilizes subcomponents which reflect the variabilities in structure that can realistically be expected from current production and quality control technology to estimate the production qualities, variation in static strength, and durability of advanced composite structures. The results of the static tests and a durability assessment after one year of continuous load/environment testing of twenty-two duplicates of each of two structural components (a segment of the front spar and cover of a vertical stabilizer box structure) are discussed.

  18. Bibliography for Verification and Validation in Computational Simulations

    SciTech Connect

    Oberkampf, W.L.

    1998-10-01

    A bibliography has been compiled dealing with the verification and validation of computational simulations. The references listed in this bibliography are concentrated in the field of computational fluid dynamics (CFD). However, references from the following fields are also included: operations research, heat transfer, solid dynamics, software quality assurance, software accreditation, military systems, and nuclear reactor safety. This bibliography, containing 221 references, is not meant to be comprehensive. It was compiled during the last ten years in response to the author's interest and research in the methodology for verification and validation. The emphasis in the bibliography is in the following areas: philosophy of science underpinnings, development of terminology and methodology, high accuracy solutions for CFD verification, experimental datasets for CFD validation, and the statistical quantification of model validation. This bibliography should provide a starting point for individual researchers in many fields of computational simulation in science and engineering.

  19. Verification and validation of RADMODL Version 1.0

    SciTech Connect

    Kimball, K.D.

    1993-03-01

    RADMODL is a system of linked computer codes designed to calculate the radiation environment following an accident in which nuclear materials are released. The RADMODL code and the corresponding Verification and Validation (V&V) calculations (Appendix A), were developed for Westinghouse Savannah River Company (WSRC) by EGS Corporation (EGS). Each module of RADMODL is an independent code and was verified separately. The full system was validated by comparing the output of the various modules with the corresponding output of a previously verified version of the modules. The results of the verification and validation tests show that RADMODL correctly calculates the transport of radionuclides and radiation doses. As a result of this verification and validation effort, RADMODL Version 1.0 is certified for use in calculating the radiation environment following an accident.

  20. Exploring the Possible Use of Information Barriers for future Biological Weapons Verification Regimes

    SciTech Connect

    Luke, S J

    2011-12-20

    This report describes a path forward for implementing information barriers in a future generic biological arms-control verification regime. Information barriers have become a staple of discussion in the area of arms control verification approaches for nuclear weapons and components. Information barriers, when used with a measurement system, allow for the determination that an item has sensitive characteristics without releasing any of the sensitive information. Over the last 15 years the United States (with the Russian Federation) has led the development of information barriers in the area of the verification of nuclear weapons and nuclear components. The work of the US and the Russian Federation has prompted other states (e.g., UK and Norway) to consider the merits of information barriers for possible verification regimes. In the context of a biological weapons control verification regime, the dual-use nature of the biotechnology will require protection of sensitive information while allowing for the verification of treaty commitments. A major question that has arisen is whether - in a biological weapons verification regime - the presence or absence of a weapon pathogen can be determined without revealing any information about possible sensitive or proprietary information contained in the genetic materials being declared under a verification regime. This study indicates that a verification regime could be constructed using a small number of pathogens that spans the range of known biological weapons agents. Since the number of possible pathogens is small, it is possible and prudent to treat these pathogens as analogies to attributes in a nuclear verification regime. This study has determined that there may be some information that needs to be protected in a biological weapons control verification regime. To protect this information, the study concludes that the Lawrence Livermore Microbial Detection Array may be a suitable technology for the detection of the
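    The attribute-style barrier concept can be caricatured in a few lines (purely schematic; the threshold and "measurement" below are hypothetical): the raw value never leaves the barrier, and only a pass/fail attribute result is released to the inspecting party.

        class InformationBarrier:
            def __init__(self, attribute_threshold):
                self._threshold = attribute_threshold        # held inside the barrier

            def inspect(self, raw_measurement):
                """Return only a boolean attribute result; the raw value is discarded."""
                present = raw_measurement >= self._threshold
                del raw_measurement                          # no sensitive data retained
                return "GREEN (attribute present)" if present else "RED (attribute absent)"

        if __name__ == "__main__":
            barrier = InformationBarrier(attribute_threshold=0.9)
            # The readings below stand in for sensitive assay values.
            print(barrier.inspect(raw_measurement=0.97))
            print(barrier.inspect(raw_measurement=0.42))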

  1. Software Verification and Validation Procedure

    SciTech Connect

    Olund, Thomas S.

    2008-09-15

    This Software Verification and Validation procedure provides the action steps for the Tank Waste Information Network System (TWINS) testing process. The primary objective of the testing process is to provide assurance that the software functions as intended, and meets the requirements specified by the client. Verification and validation establish the primary basis for TWINS software product acceptance.

  2. HDL to verification logic translator

    NASA Technical Reports Server (NTRS)

    Gambles, J. W.; Windley, P. J.

    1992-01-01

    The increasingly high number of transistors possible in VLSI circuits compounds the difficulty of ensuring correct designs. As the number of possible test cases required to exhaustively simulate a circuit design explodes, a better method is required to confirm the absence of design faults. Formal verification methods provide a way to prove, using logic, that a circuit structure correctly implements its specification. Before verification is accepted by VLSI design engineers, the stand-alone verification tools that are in use in the research community must be integrated with the CAD tools used by the designers. One problem facing the acceptance of formal verification into circuit design methodology is that the structural circuit descriptions used by the designers are not appropriate for verification work and those required for verification lack some of the features needed for design. We offer a solution to this dilemma: an automatic translation from the designers' HDL models into definitions for the higher-order logic (HOL) verification system. The translated definitions become the low-level basis of circuit verification, which in turn increases the designer's confidence in the correctness of higher-level behavioral models.

  3. Deductive Verification of Cryptographic Software

    NASA Technical Reports Server (NTRS)

    Almeida, Jose Barcelar; Barbosa, Manuel; Pinto, Jorge Sousa; Vieira, Barbara

    2009-01-01

    We report on the application of an off-the-shelf verification platform to the RC4 stream cipher cryptographic software implementation (as available in the openSSL library), and introduce a deductive verification technique based on self-composition for proving the absence of error propagation.
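    Self-composition reasons about two runs of the same code at once. The sketch below is a hedged, test-based caricature of that idea (the paper proves the property deductively; the routine here is a toy, not RC4): run the routine twice on states that agree everywhere except one cell and confirm that an output which should not depend on that cell is indeed unchanged.

        import random

        def keystream_byte(state, i):
            """Toy stand-in for a cipher step: the output depends only on state[i]."""
            return (state[i] * 7 + 3) % 256

        def self_composition_check(trials=1000, size=16, seed=0):
            rng = random.Random(seed)
            for _ in range(trials):
                s1 = [rng.randrange(256) for _ in range(size)]
                s2 = list(s1)
                j = rng.randrange(1, size)       # perturb a cell the output must ignore
                s2[j] = (s2[j] + 1) % 256
                if keystream_byte(s1, 0) != keystream_byte(s2, 0):
                    return False                 # the perturbation propagated
            return True

        if __name__ == "__main__":
            print("no propagation observed:", self_composition_check())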

  4. TFE Verification Program

    SciTech Connect

    Not Available

    1993-05-01

    The objective of the semiannual progress report is to summarize the technical results obtained during the latest reporting period. The information presented herein will include evaluated test data, design evaluations, the results of analyses and the significance of results. The program objective is to demonstrate the technology readiness of a TFE (thermionic fuel element) suitable for use as the basic element in a thermionic reactor with electric power output in the 0.5 to 5.0 MW(e) range, and a full-power life of 7 years. The TFE Verification Program builds directly on the technology and data base developed in the 1960s and early 1970s in an AEC/NASA program, and in the SP-100 program conducted in 1983, 1984 and 1985. In the SP-100 program, the attractive features of thermionic power conversion technology were recognized but concern was expressed over the lack of fast reactor irradiation data. The TFE Verification Program addresses this concern.

  5. Neptunium flow-sheet verification at reprocessing plants

    SciTech Connect

    Rance, P.; Chesnay, B.; Killeen, T.; Murray, M.; Nikkinen, M.; Petoe, A.; Plumb, J.; Saukkonen, H.

    2007-07-01

    Due to their fissile nature, neptunium and americium have at least a theoretical potential application as nuclear explosives and their proliferation potential was considered by the IAEA in studies in the late 1990's. This work was motivated by an increased awareness of the proliferation potential of americium and neptunium and a number of emerging projects in peaceful nuclear programmes which could result in an increase in the available quantities of these minor actinides. The studies culminated in proposals for various voluntary measures including the reporting of international transfers of separated americium and neptunium, declarations concerning the amount of separated neptunium and americium held by states and the application of flow-sheet verification to ensure that facilities capable of separating americium or neptunium are operated in a manner consistent with that declared. This paper discusses the issue of neptunium flowsheet verification in reprocessing plants. The proliferation potential of neptunium is first briefly discussed and then the chemistry of neptunium relevant to reprocessing plants described with a view to indicating a number of issues relevant to the verification of neptunium flow-sheets. Finally, the scope of verification activities is discussed including analysis of process and engineering design information, plant monitoring and sampling and the potential application of containment and surveillance measures. (authors)

  6. Extremely accurate sequential verification of RELAP5-3D

    SciTech Connect

    Mesina, George L.; Aumiller, David L.; Buschman, Francis X.

    2015-11-19

    Large computer programs like RELAP5-3D solve complex systems of governing, closure and special process equations to model the underlying physics of nuclear power plants. Further, these programs incorporate many other features for physics, input, output, data management, user-interaction, and post-processing. For software quality assurance, the code must be verified and validated before being released to users. For RELAP5-3D, verification and validation are restricted to nuclear power plant applications. Verification means ensuring that the program is built right by checking that it meets its design specifications, comparing coding to algorithms and equations, and comparing calculations against analytical solutions and the method of manufactured solutions. Sequential verification performs these comparisons initially, but thereafter only compares code calculations between consecutive code versions to demonstrate that no unintended changes have been introduced. Recently, an automated, highly accurate sequential verification method has been developed for RELAP5-3D. The method also provides tests to confirm that no unintended consequences result from code development in the following code capabilities: repeating a timestep advancement, continuing a run from a restart file, multiple cases in a single code execution, and modes of coupled/uncoupled operation. In conclusion, mathematical analyses of the adequacy of the checks used in the comparisons are provided.
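    An illustrative regression harness in the spirit of sequential verification is sketched below (the data layout is hypothetical, not the actual RELAP5-3D tooling): plot-variable histories from consecutive code versions are compared to round-off tolerance, and a restarted run is checked against an uninterrupted one.

        import numpy as np

        def sequential_compare(prev_results, new_results, rel_tol=1e-12):
            """Return the variables whose histories changed beyond round-off."""
            changed = []
            for name, prev in prev_results.items():
                if not np.allclose(prev, new_results[name], rtol=rel_tol, atol=0.0):
                    changed.append(name)
            return changed

        def restart_consistent(full_run, restarted_run, rel_tol=1e-12):
            return np.allclose(full_run, restarted_run, rtol=rel_tol, atol=0.0)

        if __name__ == "__main__":
            t = np.linspace(0.0, 10.0, 101)
            prev = {"pressure": 1.0e5 + 50.0 * np.sin(t)}
            new = {"pressure": prev["pressure"].copy()}       # unchanged, so it passes
            print("changed variables :", sequential_compare(prev, new))
            print("restart consistent:", restart_consistent(prev["pressure"], new["pressure"]))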

  7. Extremely accurate sequential verification of RELAP5-3D

    DOE PAGES

    Mesina, George L.; Aumiller, David L.; Buschman, Francis X.

    2015-11-19

    Large computer programs like RELAP5-3D solve complex systems of governing, closure and special process equations to model the underlying physics of nuclear power plants. Further, these programs incorporate many other features for physics, input, output, data management, user-interaction, and post-processing. For software quality assurance, the code must be verified and validated before being released to users. For RELAP5-3D, verification and validation are restricted to nuclear power plant applications. Verification means ensuring that the program is built right by checking that it meets its design specifications, comparing coding to algorithms and equations, and comparing calculations against analytical solutions and the method of manufactured solutions. Sequential verification performs these comparisons initially, but thereafter only compares code calculations between consecutive code versions to demonstrate that no unintended changes have been introduced. Recently, an automated, highly accurate sequential verification method has been developed for RELAP5-3D. The method also provides tests to confirm that no unintended consequences result from code development in the following code capabilities: repeating a timestep advancement, continuing a run from a restart file, multiple cases in a single code execution, and modes of coupled/uncoupled operation. In conclusion, mathematical analyses of the adequacy of the checks used in the comparisons are provided.

  8. VEG-01: Veggie Hardware Verification Testing

    NASA Technical Reports Server (NTRS)

    Massa, Gioia; Newsham, Gary; Hummerick, Mary; Morrow, Robert; Wheeler, Raymond

    2013-01-01

    The Veggie plant/vegetable production system is scheduled to fly on ISS at the end of 2013. Since much of the technology associated with Veggie has not been previously tested in microgravity, a hardware validation flight was initiated. This test will allow data to be collected about Veggie hardware functionality on ISS, allow crew interactions to be vetted for future improvements, validate the ability of the hardware to grow and sustain plants, and collect data that will be helpful to future Veggie investigators as they develop their payloads. Additionally, food safety data on the lettuce plants grown will be collected to help support the development of a pathway for the crew to safely consume produce grown on orbit. Significant background research has been performed on the Veggie plant growth system, with early tests focusing on the development of the rooting pillow concept, and the selection of fertilizer, rooting medium and plant species. More recent testing has been conducted to integrate the pillow concept into the Veggie hardware and to ensure that adequate water is provided throughout the growth cycle. Seed sanitation protocols have been established for flight, and hardware sanitation between experiments has been studied. Methods for shipping and storage of rooting pillows and the development of crew procedures and crew training videos for plant activities on-orbit have been established. Science verification testing was conducted and lettuce plants were successfully grown in prototype Veggie hardware, microbial samples were taken, plants were harvested, frozen, stored and later analyzed for microbial growth, nutrients, and ATP levels. An additional verification test, prior to the final payload verification testing, is desired to demonstrate similar growth in the flight hardware and also to test a second set of pillows containing zinnia seeds. Issues with root mat water supply are being resolved, with final testing and flight scheduled for later in 2013.

  9. Online fingerprint verification.

    PubMed

    Upendra, K; Singh, S; Kumar, V; Verma, H K

    2007-01-01

    As organizations search for more secure authentication methods for user access, e-commerce, and other security applications, biometrics is gaining increasing attention. With an increasing emphasis on the emerging automatic personal identification applications, fingerprint-based identification is becoming more popular. The most widely used fingerprint representation is the minutiae-based representation. The main drawback with this representation is that it does not utilize a significant component of the rich discriminatory information available in the fingerprints. Local ridge structures cannot be completely characterized by minutiae. Also, it is difficult to quickly match two fingerprint images containing different numbers of unregistered minutiae points. In this study a filter-bank-based representation, which eliminates these weaknesses, is implemented and the overall performance of the developed system is tested. The results have shown that this system can be used effectively for secure online verification applications.

  10. Verification of VENTSAR

    SciTech Connect

    Simpkins, A.A.

    1995-01-01

    The VENTSAR code is an upgraded and improved version of the VENTX code, which estimates concentrations on or near a building from a release at a nearby location. The code calculates the concentrations either for a given meteorological exceedance probability or for a given stability and wind speed combination. A single building can be modeled which lies in the path of the plume, or a penthouse can be added to the top of the building. Plume rise may also be considered. Release types can be either chemical or radioactive. Downwind concentrations are determined at user-specified incremental distances. This verification report was prepared to demonstrate that VENTSAR is properly executing all algorithms and transferring data. Hand calculations were also performed to ensure proper application of methodologies.
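
    Hand checks of the kind mentioned above often start from the standard Gaussian plume relation. The sketch below implements that textbook formula with illustrative dispersion coefficients; it is not VENTSAR's actual algorithm and ignores the building and plume-rise effects the code models.

```python
# Hedged sketch: textbook Gaussian plume concentration for a hand check.
# This is NOT the VENTSAR algorithm; sigma coefficients below are illustrative.
import math

def gaussian_plume(Q, u, x, y, z, H, a_y=0.08, a_z=0.06):
    """
    Gaussian plume concentration (g/m^3) with ground reflection.
    Q: release rate (g/s), u: wind speed (m/s), x,y,z: receptor coords (m),
    H: effective release height (m). sigma_y = a_y*x, sigma_z = a_z*x (illustrative).
    """
    sigma_y, sigma_z = a_y * x, a_z * x
    lateral = math.exp(-y**2 / (2 * sigma_y**2))
    vertical = (math.exp(-(z - H)**2 / (2 * sigma_z**2)) +
                math.exp(-(z + H)**2 / (2 * sigma_z**2)))  # ground reflection term
    return Q / (2 * math.pi * u * sigma_y * sigma_z) * lateral * vertical

# Example hand check at 100 m downwind, plume centerline, ground level:
print(f"{gaussian_plume(Q=1.0, u=2.0, x=100.0, y=0.0, z=0.0, H=10.0):.3e} g/m^3")
```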

  11. Shift Verification and Validation

    SciTech Connect

    Pandya, Tara M.; Evans, Thomas M.; Davidson, Gregory G; Johnson, Seth R.; Godfrey, Andrew T.

    2016-09-07

    This documentation outlines the verification and validation of Shift for the Consortium for Advanced Simulation of Light Water Reactors (CASL). Five main types of problems were used for validation: small criticality benchmark problems; full-core reactor benchmarks for light water reactors; fixed-source coupled neutron-photon dosimetry benchmarks; depletion/burnup benchmarks; and full-core reactor performance benchmarks. We compared Shift results to measured data and to results from other Monte Carlo radiation transport codes, and found very good agreement across a variety of comparison measures. These include prediction of the critical eigenvalue, radial and axial pin power distributions, rod worth, leakage spectra, and nuclide inventories over a burn cycle. Based on this validation, we are confident that Shift can provide reference results for CASL benchmarking.
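
    To make two of the comparison measures above concrete, the sketch below computes the eigenvalue difference in pcm and the RMS of relative pin-power differences. The example numbers are made up, and the metric definitions, while standard, are our illustration rather than the report's exact procedures.

```python
# Hedged sketch: common reactor-benchmark comparison measures.
# Example values are made up; metric definitions are standard but illustrative here.
import math

def keff_diff_pcm(k_calc, k_ref):
    """Eigenvalue difference in pcm (1 pcm = 1e-5 in k)."""
    return (k_calc - k_ref) * 1.0e5

def pin_power_rms(calc, ref):
    """RMS of relative pin-power differences over pins with nonzero reference power."""
    diffs = [(c - r) / r for c, r in zip(calc, ref) if r > 0.0]
    return math.sqrt(sum(d * d for d in diffs) / len(diffs))

print(f"{keff_diff_pcm(1.00213, 1.00187):.1f} pcm")
print(f"{pin_power_rms([1.02, 0.98, 1.01, 0.99], [1.00, 1.00, 1.00, 1.00]):.4f} RMS")
```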

  12. Verification of Space Weather Forecasts using Terrestrial Weather Approaches

    NASA Astrophysics Data System (ADS)

    Henley, E.; Murray, S.; Pope, E.; Stephenson, D.; Sharpe, M.; Bingham, S.; Jackson, D.

    2015-12-01

    The Met Office Space Weather Operations Centre (MOSWOC) provides a range of 24/7 operational space weather forecasts, alerts, and warnings, which provide valuable information on space weather that can degrade electricity grids, radio communications, and satellite electronics. Forecasts issued include arrival times of coronal mass ejections (CMEs), and probabilistic forecasts for flares, geomagnetic storm indices, and energetic particle fluxes and fluences. These forecasts are produced twice daily using a combination of output from models such as Enlil, near-real-time observations, and forecaster experience. Verification of forecasts is crucial for users, researchers, and forecasters to understand the strengths and limitations of the forecasts, and to assess forecaster added value. To this end, the Met Office (in collaboration with Exeter University) has been adapting verification techniques from terrestrial weather, and has been working closely with the International Space Environment Service (ISES) to standardise verification procedures. We will present the results of part of this work, analysing forecast and observed CME arrival times, assessing skill using 2x2 contingency tables. These MOSWOC forecasts can be objectively compared to those produced by the NASA Community Coordinated Modelling Center - a useful benchmark. This approach cannot be taken for the other forecasts, as they are probabilistic and categorical (e.g., geomagnetic storm forecasts give probabilities of exceeding levels from minor to extreme). We will present appropriate verification techniques being developed to address these forecasts, such as the rank probability skill score, and comparing forecasts against climatology and persistence benchmarks. As part of this, we will outline the use of discrete time Markov chains to assess and improve the performance of our geomagnetic storm forecasts. We will also discuss work to adapt a terrestrial verification visualisation system to space weather, to help
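
    As a concrete illustration of the 2x2 contingency-table approach mentioned above, the sketch below computes a few standard categorical measures (probability of detection, false alarm ratio, frequency bias, and the Heidke skill score). The example counts are made up, and the choice of metrics is ours rather than necessarily MOSWOC's.

```python
# Hedged sketch: categorical verification scores from a 2x2 contingency table.
# The example counts are made up; metric selection is illustrative.

def contingency_scores(hits, misses, false_alarms, correct_negatives):
    n = hits + misses + false_alarms + correct_negatives
    pod = hits / (hits + misses)                      # probability of detection
    far = false_alarms / (hits + false_alarms)        # false alarm ratio
    bias = (hits + false_alarms) / (hits + misses)    # frequency bias
    # Heidke skill score: accuracy relative to random chance
    expected = ((hits + misses) * (hits + false_alarms) +
                (correct_negatives + misses) * (correct_negatives + false_alarms)) / n
    hss = (hits + correct_negatives - expected) / (n - expected)
    return {"POD": pod, "FAR": far, "bias": bias, "HSS": hss}

# e.g. CME arrival forecasts: 30 hits, 10 misses, 15 false alarms, 45 correct negatives
print(contingency_scores(30, 10, 15, 45))
```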

  13. 40 CFR 1065.390 - PM balance verifications and weighing process verification.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... 40 Protection of Environment 34 2012-07-01 2012-07-01 false PM balance verifications and weighing... § 1065.390 PM balance verifications and weighing process verification. (a) Scope and frequency. This section describes three verifications. (1) Independent verification of PM balance performance within...

  14. 40 CFR 1065.390 - PM balance verifications and weighing process verification.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... 40 Protection of Environment 33 2011-07-01 2011-07-01 false PM balance verifications and weighing... § 1065.390 PM balance verifications and weighing process verification. (a) Scope and frequency. This section describes three verifications. (1) Independent verification of PM balance performance within...

  15. 40 CFR 1065.390 - PM balance verifications and weighing process verification.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 40 Protection of Environment 32 2010-07-01 2010-07-01 false PM balance verifications and weighing... § 1065.390 PM balance verifications and weighing process verification. (a) Scope and frequency. This section describes three verifications. (1) Independent verification of PM balance performance within...

  16. 40 CFR 1065.390 - PM balance verifications and weighing process verification.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... 40 Protection of Environment 33 2014-07-01 2014-07-01 false PM balance verifications and weighing... § 1065.390 PM balance verifications and weighing process verification. (a) Scope and frequency. This section describes three verifications. (1) Independent verification of PM balance performance within...

  17. 40 CFR 1065.390 - PM balance verifications and weighing process verification.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... 40 Protection of Environment 34 2013-07-01 2013-07-01 false PM balance verifications and weighing... § 1065.390 PM balance verifications and weighing process verification. (a) Scope and frequency. This section describes three verifications. (1) Independent verification of PM balance performance within...

  18. Verification of Disarmament or Limitation of Armaments: Instruments, Negotiations, Proposals

    DTIC Science & Technology

    2007-11-02

    Catalog record fragments: UNIDIR/92/28; cataloged Aug 20, 1992; Record ID 24625. Citation fragments from the text: "...The Johns Hopkins Foreign Policy Institute, School of Advanced International Studies, 1989, pp. 33-54."; "For an overview of dismantlement, see ... Weapons Databook, Natural Resources Defense Council, Ballinger Press, New York, 1990."; "...Verification of Disarmament or Limitation of Armaments is not..."

  19. Help! It's Hair Loss!

    MedlinePlus

    ... a better look at what's going on to help decide what to do next. For a fungal ...

  20. Help with Hives

    MedlinePlus

    ... about what happened. The doctor can try to help figure out what might be causing your hives, ...

  1. EURATOM safeguards efforts in the development of spent fuel verification methods by non-destructive assay

    SciTech Connect

    Matloch, L.; Vaccaro, S.; Couland, M.; De Baere, P.; Schwalbach, P.

    2015-07-01

    The back end of the nuclear fuel cycle continues to develop. The European Commission, particularly the Nuclear Safeguards Directorate of the Directorate General for Energy, implements Euratom safeguards and needs to adapt to this situation. The verification methods for spent nuclear fuel, which EURATOM inspectors can use, require continuous improvement. Whereas the Euratom on-site laboratories provide accurate verification results for fuel undergoing reprocessing, the situation is different for spent fuel which is destined for final storage. In particular, new needs arise from the increasing number of cask loadings for interim dry storage and the advanced plans for the construction of encapsulation plants and geological repositories. Various scenarios present verification challenges. In this context, EURATOM Safeguards, often in cooperation with other stakeholders, is committed to further improvement of NDA methods for spent fuel verification. In this effort EURATOM plays various roles, ranging from definition of inspection needs to direct participation in development of measurement systems, including support of research in the framework of international agreements and via the EC Support Program to the IAEA. This paper presents recent progress in selected NDA methods. These methods have been conceived to satisfy different spent fuel verification needs, ranging from attribute testing to pin-level partial defect verification. (authors)

  2. Pairwise Identity Verification via Linear Concentrative Metric Learning.

    PubMed

    Zheng, Lilei; Duffner, Stefan; Idrissi, Khalid; Garcia, Christophe; Baskurt, Atilla

    2016-12-16

    This paper presents a study of metric learning systems on pairwise identity verification, including pairwise face verification and pairwise speaker verification, respectively. These problems are challenging because the individuals in training and testing are mutually exclusive, and also due to the probable setting of limited training data. For such pairwise verification problems, we present a general framework of metric learning systems and employ the stochastic gradient descent algorithm as the optimization solution. We have studied both similarity metric learning and distance metric learning systems, of either a linear or shallow nonlinear model, under both restricted and unrestricted training settings. Extensive experiments demonstrate that with limited training pairs, learning a linear system on similar pairs only is preferable due to its simplicity and superiority, i.e., it generally achieves competitive performance on both the Labeled Faces in the Wild face dataset and the NIST speaker dataset. It is also found that a pretrained deep nonlinear model helps to improve the face verification results significantly.
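
    A minimal sketch of pairwise similarity metric learning with stochastic gradient descent, in the spirit of the linear systems studied above: learn a linear projection so that projected same-identity pairs have high inner-product similarity. The hinge loss, margin, learning rate, and toy data are illustrative assumptions rather than the authors' exact formulation.

```python
# Hedged sketch: linear pairwise similarity metric learning with SGD.
# Loss, margin, learning rate, and the toy data are illustrative assumptions.
import numpy as np

def train_similarity_metric(pairs, labels, dim, proj_dim=32,
                            lr=0.01, margin=1.0, epochs=10, seed=0):
    """pairs: list of (x, y) feature vectors; labels: +1 same identity, -1 different.
    Learns W so that s(x, y) = (Wx) . (Wy) is large for same-identity pairs,
    using the hinge loss max(0, margin - label * s)."""
    rng = np.random.default_rng(seed)
    W = rng.normal(scale=1.0 / np.sqrt(dim), size=(proj_dim, dim))
    for _ in range(epochs):
        for (x, y), lab in zip(pairs, labels):
            wx, wy = W @ x, W @ y
            s = wx @ wy
            if margin - lab * s > 0:  # hinge active: take a gradient step
                # ds/dW = outer(wy, x) + outer(wx, y); dL/dW = -lab * ds/dW
                W += lr * lab * (np.outer(wy, x) + np.outer(wx, y))
    return W

def verify_pair(W, x, y, threshold=0.0):
    """Accept the pair as the same identity if the learned similarity exceeds a threshold."""
    return (W @ x) @ (W @ y) > threshold

# Toy usage with random vectors (illustration only):
rng = np.random.default_rng(1)
pairs = [(rng.normal(size=64), rng.normal(size=64)) for _ in range(200)]
labels = [1 if i % 2 == 0 else -1 for i in range(200)]
W = train_similarity_metric(pairs, labels, dim=64)
print(verify_pair(W, *pairs[0]))
```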

  3. Comments for A Conference on Verification in the 21st Century

    SciTech Connect

    Doyle, James E.

    2012-06-12

    The author offers 5 points for the discussion of Verification and Technology: (1) Experience with the implementation of arms limitation and arms reduction agreements confirms that technology alone has never been relied upon to provide effective verification. (2) The historical practice of verification of arms control treaties between Cold War rivals may constrain the cooperative and innovative use of technology for transparency, verification, and confidence building in the future. (3) An area that has been identified by many, including the US State Department and NNSA, as being rich for exploration for potential uses of technology for transparency and verification is information and communications technology (ICT). This includes social media, crowd-sourcing, the internet of things, and the concept of societal verification, but there are issues. (4) On the issue of the extent to which verification technologies are keeping pace with the demands of future protocols and agreements, I think the more direct question is ''are they effective in supporting the objectives of the treaty or agreement?'' In this regard it is important to acknowledge that there is a verification grand challenge at our doorstep. That is ''how does one verify limitations on nuclear warheads in national stockpiles?'' (5) Finally, while recognizing the daunting political and security challenges of such an approach, multilateral engagement and cooperation at the conceptual and technical levels provides benefits for addressing future verification challenges.

  4. Acoustic techniques in nuclear safeguards

    SciTech Connect

    Olinger, C.T.; Sinha, D.N.

    1995-07-01

    Acoustic techniques can be employed to address many questions relevant to current nuclear technology needs. These include establishing and monitoring intrinsic tags and seals, locating holdup in areas where conventional radiation-based measurements have limited capability, process monitoring, monitoring containers for corrosion or changes in pressure, and facility design verification. These acoustics applications are in their infancy with respect to safeguards and nuclear material management, but proof-of-principle has been demonstrated in many of the areas listed.

  5. Verification and validation of control system software

    SciTech Connect

    Munro, J.K. Jr.; Kisner, R.A.; Bhadtt, S.C.

    1991-01-01

    The following guidelines are proposed for verification and validation (V&V) of nuclear power plant control system software: (a) use risk management to decide what and how much V&V is needed; (b) classify each software application using a scheme that reflects what type and how much V&V is needed; (c) maintain a set of reference documents with current information about each application; (d) use Program Inspection as the initial basic verification method; and (e) establish a deficiencies log for each software application. The following additional practices are strongly recommended: (a) use a computer-based configuration management system to track all aspects of development and maintenance; (b) establish reference baselines of the software, associated reference documents, and development tools at regular intervals during development; (c) use object-oriented design and programming to promote greater software reliability and reuse; (d) provide a copy of the software development environment as part of the package of deliverables; and (e) initiate an effort to use formal methods for preparation of Technical Specifications. The paper provides background information and reasons for the guidelines and recommendations. 3 figs., 3 tabs.

  6. Generic interpreters and microprocessor verification

    NASA Technical Reports Server (NTRS)

    Windley, Phillip J.

    1990-01-01

    The following topics are covered in viewgraph form: (1) generic interpreters; (2) Viper microprocessors; (3) microprocessor verification; (4) determining correctness; (5) hierarchical decomposition; (6) interpreter theory; (7) AVM-1; (8) phase-level specification; and future work.

  7. U.S. verification method disputed

    NASA Astrophysics Data System (ADS)

    Maggs, William Ward

    Milo Nordyke, senior scientist at Lawrence Livermore National Laboratory in Livermore, Calif., testified October 6 at a Senate Foreign Affairs Committee hearing on Soviet test ban noncompliance and the recently concluded Joint Verification Experiment. He said that the government's method for on-site test monitoring is intrusive, expensive, and could limit some U.S. weapon design programs. In addition, Gregory Van der Vink of the congressional Office of Technology Assessment presented new evidence that White House charges that the Soviet Union has not complied with the current 150 kiloton test limit are probably without basis. Also testifying were Paul Robinson, U.S. negotiator for the Nuclear Testing Talks; Peter Sharfman, program manager for International Security and Commerce at OTA; and physicist David Hafemeister of California Polytechnic State University, San Luis Obispo.

  8. RELAP-7 Software Verification and Validation Plan

    SciTech Connect

    Smith, Curtis L.; Choi, Yong-Joon; Zou, Ling

    2014-09-25

    This INL plan comprehensively describes the software for RELAP-7 and documents the software, interface, and software design requirements for the application. The plan also describes the testing-based software verification and validation (SV&V) process—a set of specially designed software models used to test RELAP-7. The RELAP-7 (Reactor Excursion and Leak Analysis Program) code is a nuclear reactor system safety analysis code being developed at Idaho National Laboratory (INL). The code is based on the INL’s modern scientific software development framework – MOOSE (Multi-Physics Object-Oriented Simulation Environment). The overall design goal of RELAP-7 is to take advantage of the previous thirty years of advancements in computer architecture, software design, numerical integration methods, and physical models. The end result will be a reactor systems analysis capability that retains and improves upon RELAP5’s capability and extends the analysis capability for all reactor system simulation scenarios.

  9. Reliable Entanglement Verification

    NASA Astrophysics Data System (ADS)

    Arrazola, Juan; Gittsovich, Oleg; Donohue, John; Lavoie, Jonathan; Resch, Kevin; Lütkenhaus, Norbert

    2013-05-01

    Entanglement plays a central role in quantum protocols. It is therefore important to be able to verify the presence of entanglement in physical systems from experimental data. In the evaluation of these data, the proper treatment of statistical effects requires special attention, as one can never claim to have verified the presence of entanglement with certainty. Recently, increased attention has been paid to the development of proper frameworks to pose and to answer these types of questions. In this work, we apply recent results by Christandl and Renner on reliable quantum state tomography to construct a reliable entanglement verification procedure based on the concept of confidence regions. The statements made do not require the specification of a prior distribution nor the assumption of an independent and identically distributed (i.i.d.) source of states. Moreover, we develop efficient numerical tools that are necessary to employ this approach in practice, rendering the procedure ready to be employed in current experiments. We demonstrate this fact by analyzing the data of an experiment in which photonic entangled two-photon states were generated and whose entanglement is verified with the use of an accessible nonlinear witness.
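
    As a toy contrast, the sketch below reports an entanglement-witness estimate with a simple Hoeffding confidence interval; unlike the approach in the abstract, it does assume an i.i.d. source, and it is not the Christandl-Renner confidence-region construction or the nonlinear witness used in the experiment.

```python
# Hedged toy sketch: witness expectation value with a Hoeffding confidence bound.
# NOT the Christandl-Renner confidence-region method described in the abstract.
import math

def witness_confidence(outcomes, bound=1.0, alpha=0.05):
    """
    outcomes: per-shot witness estimates, each assumed bounded in [-bound, +bound].
    Returns (mean, epsilon) such that the true expectation lies within
    mean +/- epsilon with probability at least 1 - alpha (Hoeffding, i.i.d. assumption).
    A negative upper bound (mean + epsilon < 0) would indicate entanglement for a
    witness W with Tr(W * rho_sep) >= 0 for all separable states.
    """
    n = len(outcomes)
    mean = sum(outcomes) / n
    epsilon = bound * math.sqrt(2.0 * math.log(2.0 / alpha) / n)
    return mean, epsilon

# Toy usage with made-up per-shot values:
mean, eps = witness_confidence([-0.3, -0.25, -0.4, -0.35] * 500)
print(f"<W> = {mean:.3f} +/- {eps:.3f} (95% confidence, i.i.d. assumption)")
```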

  10. Woodward Effect Experimental Verifications

    NASA Astrophysics Data System (ADS)

    March, Paul

    2004-02-01

    The work of J. F. Woodward (1990; 1996a; 1996b; 1998; 2002a; 2002b; 2004) on the existence of ``mass fluctuations'' and their use in exotic propulsion schemes was examined for possible application in improving space flight propulsion and power generation. Woodward examined Einstein's General Relativity Theory (GRT) and assumed that if the strong Machian interpretation of GRT, as well as gravitational/inertial Wheeler-Feynman-like radiation reaction forces, holds, then when an elementary particle is accelerated through a potential gradient, its rest mass should fluctuate around its mean value during its acceleration. Woodward also used GRT to clarify the precise experimental conditions necessary for observing and exploiting these mass fluctuations or ``Woodward effect'' (W-E). Later, in collaboration with his former graduate student T. Mahood, Woodward also pushed the experimental verification boundaries of these proposals. If these purported mass fluctuations occur as Woodward claims, and his assumption that gravity and inertia are both byproducts of the same GRT based phenomenon per Mach's Principle is correct, then many innovative applications such as propellantless propulsion and gravitational exotic matter generators may be feasible. This paper examines the reality of mass fluctuations and the feasibility of using the W-E to design propellantless propulsion devices in the near to mid-term future. The latest experimental results, utilizing MHD-like force rectification systems, will also be presented.

  11. The CSMS (Configurable Seismic Monitoring System) Poorboy deployment: Seismic recording in Pinedale, Wyoming, of the Bullion NTS (Nevada Test Site) nuclear test under the verification provisions of the new TTBT protocol

    SciTech Connect

    Harben, P.E.; Rock, D.W.; Carlson, R.C.

    1990-07-10

    The Configurable Seismic Monitoring System (CSMS), developed at the Lawrence Livermore National Laboratory (LLNL) was deployed in a 13-m deep vault on the AFTAC facility at Pinedale, Wyoming to record the Bullion nuclear test. The purpose of the exercise was to meet all provisions of the new TTBT protocol on in-country seismic recording at a Designated Seismic Station (DSS). The CSMS successfully recorded the Bullion event consistent with and meeting all requirements in the new treaty protocol. In addition, desirable seismic system features not specified in the treaty protocol were determined; treaty protocol ambiguities were identified, and useful background noise recordings at the Pinedale site were obtained. 10 figs.

  12. Helping Children Understand Disabilities.

    ERIC Educational Resources Information Center

    Zakariya, Sally Banks

    1978-01-01

    The program described uses simulation activities; exposure to aids and appliances; guest speakers; books, movies, slides, and videotapes; and class discussion to help elementary students understand disabilities. (IRT)

  13. Studies in Seismic Verification

    DTIC Science & Technology

    1992-05-01

    features in the Earth. G(ω) includes source region effects such as free surface reflections, geometrical spreading which may be frequency dependent... pressure function at the elastic radius. They used a pressure function based on free-field observations of several underground nuclear explosions... show an increase in 10 and 30 Hz spectral amplitude by a factor of about 5 above the free surface effect. Therefore we expect the Anza spectral

  14. Modality Switching in a Property Verification Task: An ERP Study of What Happens When Candles Flicker after High Heels Click.

    PubMed

    Collins, Jennifer; Pecher, Diane; Zeelenberg, René; Coulson, Seana

    2011-01-01

    The perceptual modalities associated with property words, such as flicker or click, have previously been demonstrated to affect subsequent property verification judgments (Pecher et al., 2003). Known as the conceptual modality switch effect, this finding supports the claim that brain systems for perception and action help subserve the representation of concepts. The present study addressed the cognitive and neural substrate of this effect by recording event-related potentials (ERPs) as participants performed a property verification task with visual or auditory properties in key trials. We found that for visual property verifications, modality switching was associated with an increased amplitude N400. For auditory verifications, switching led to a larger late positive complex. Observed ERP effects of modality switching suggest property words access perceptual brain systems. Moreover, the timing and pattern of the effects suggest perceptual systems impact the decision-making stage in the verification of auditory properties, and the semantic stage in the verification of visual properties.

  15. Space station data management system - A common GSE test interface for systems testing and verification

    NASA Technical Reports Server (NTRS)

    Martinez, Pedro A.; Dunn, Kevin W.

    1987-01-01

    This paper examines the fundamental problems and goals associated with test, verification, and flight-certification of man-rated distributed data systems. First, a summary of the characteristics of modern computer systems that affect the testing process is provided. Then, verification requirements are expressed in terms of an overall test philosophy for distributed computer systems. This test philosophy stems from previous experience that was gained with centralized systems (Apollo and the Space Shuttle), and deals directly with the new problems that verification of distributed systems may present. Finally, a description of potential hardware and software tools to help solve these problems is provided.

  16. SPR Hydrostatic Column Model Verification and Validation.

    SciTech Connect

    Bettin, Giorgia; Lord, David; Rudeen, David Keith

    2015-10-01

    A Hydrostatic Column Model (HCM) was developed to help differentiate between normal "tight" well behavior and small-leak behavior under nitrogen for testing the pressure integrity of crude oil storage wells at the U.S. Strategic Petroleum Reserve. This effort was motivated by steady, yet distinct, pressure behavior of a series of Big Hill caverns that have been placed under nitrogen for extended periods of time. This report describes the HCM, its functional requirements, the model structure, and the verification and validation process. Different modes of operation are also described, which illustrate how the software can be used to model extended nitrogen monitoring and Mechanical Integrity Tests by predicting wellhead pressures along with nitrogen interface movements. Model verification has shown that the program runs correctly and is implemented as intended. The cavern BH101 long term nitrogen test was used to validate the model, which showed very good agreement with measured data. This supports the claim that the model is, in fact, capturing the relevant physical phenomena and can be used to make accurate predictions of both wellhead pressure and interface movements.
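
    A minimal sketch of the hydrostatic-column idea: the predicted wellhead pressure is the pressure at the fluid interface minus the hydrostatic head of the fluid columns above it. The densities, heights, and interface pressure below are illustrative assumptions, not the SPR HCM's actual model or data.

```python
# Hedged sketch: wellhead pressure from a stack of hydrostatic fluid columns.
# Densities, heights, and the interface pressure are illustrative assumptions.
G = 9.81  # m/s^2

def wellhead_pressure(interface_pressure_pa, columns):
    """
    columns: list of (density_kg_per_m3, height_m) from the interface up to the wellhead.
    Returns the predicted wellhead pressure in Pa (interface pressure minus the
    hydrostatic head of everything above the interface).
    """
    head = sum(rho * G * h for rho, h in columns)
    return interface_pressure_pa - head

# Example: 200 m of crude oil and 600 m of nitrogen above a brine interface at 12 MPa
columns = [(850.0, 200.0),   # crude oil column
           (90.0, 600.0)]    # compressed nitrogen column (density is a rough assumption)
print(f"Predicted wellhead pressure: {wellhead_pressure(12.0e6, columns)/1e6:.2f} MPa")
```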

  17. Help! It's Hair Loss!

    MedlinePlus

    ... part above the skin, is dead. (That's why it doesn't hurt to get a haircut!) This ...

  18. Helping Our Children.

    ERIC Educational Resources Information Center

    Polk, Sophie

    1987-01-01

    Describes the Ikaiyurluki Mikelnguut (Helping Our Children) project in the Yukon Kuskokwim Delta of Alaska where trained natural helpers are helping Yup'ik Eskimo villagers to cope with crisis situations--notably teenage suicide and drug and alcohol abuse. (Author/BB)

  19. Handi Helps, 1984.

    ERIC Educational Resources Information Center

    Handi Helps, 1984

    1984-01-01

    The eight issues of Handi Helps presented in this document focus on specific issues of concern to the disabled, parents, and those working with the disabled. The two-page handi help fact sheets focus on the following topics: child abuse, leukemia, arthritis, Tourette Syndrome, hemophilia, the puppet program "Meet the New Kids on the…

  20. Helping America's Youth

    ERIC Educational Resources Information Center

    Bush, Laura

    2005-01-01

    As First Lady of the United States, Laura Bush is leading the Helping America's Youth initiative of the federal government. She articulates the goal of enlisting public and volunteer resources to foster healthy growth by early intervention and mentoring of youngsters at risk. Helping America's Youth will benefit children and teenagers by…

  1. Handi Helps, 1985

    ERIC Educational Resources Information Center

    Handi Helps, 1985

    1985-01-01

    The six issues of Handi Helps presented here focus on specific issues of concern to the disabled, parents, and those working with the disabled. The two-page handi help fact sheets focus on the following topics: child sexual abuse prevention, asthma, scoliosis, the role of the occupational therapist, kidnapping, and muscular dystrophy. Each handi…

  2. Helping Friends and Family

    MedlinePlus

    ... Part of living well with Alzheimer’s is adjusting to your “new normal” and helping family and friends do the same. Knowing what to ...

  3. How NASA's Independent Verification and Validation (IV&V) Program Builds Reliability into a Space Mission Software System (SMSS)

    NASA Technical Reports Server (NTRS)

    Fisher, Marcus S.; Northey, Jeffrey; Stanton, William

    2014-01-01

    The purpose of this presentation is to outline how the NASA Independent Verification and Validation (IV&V) Program helps to build reliability into the Space Mission Software Systems (SMSSs) that its customers develop.

  4. Isocenter verification for linac-based stereotactic radiation therapy: review of principles and techniques.

    PubMed

    Rowshanfarzad, Pejman; Sabet, Mahsheed; O'Connor, Daryl J; Greer, Peter B

    2011-11-15

    There have been several manual, semi-automatic and fully-automatic methods proposed for verification of the position of the mechanical isocenter as part of the comprehensive quality assurance programs required for linear accelerator-based stereotactic radiosurgery/radiotherapy (SRS/SRT) treatments. In this paper, a systematic review has been carried out to discuss the present methods for isocenter verification and compare their characteristics, to help physicists decide on a quality assurance routine.
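
    For flavor, one widely used family of such methods (Winston-Lutz-style tests) reduces each image to the displacement between the radiation field center and a marker placed at the nominal isocenter, then summarizes those displacements. The sketch below shows only that summary step; the sample offsets and the 1 mm tolerance are illustrative assumptions, not a recommendation from the review.

```python
# Hedged sketch: summarizing Winston-Lutz-style isocenter displacement measurements.
# Input offsets (mm) between field center and marker, per gantry/couch/collimator angle,
# are made-up numbers; the 1.0 mm tolerance is an illustrative assumption.
import math

def isocenter_summary(offsets_mm, tolerance_mm=1.0):
    """offsets_mm: list of (dx, dy) displacements in the imager plane, in mm."""
    radii = [math.hypot(dx, dy) for dx, dy in offsets_mm]
    mean_dx = sum(dx for dx, _ in offsets_mm) / len(offsets_mm)
    mean_dy = sum(dy for _, dy in offsets_mm) / len(offsets_mm)
    return {
        "mean_offset_mm": (round(mean_dx, 3), round(mean_dy, 3)),
        "max_radial_mm": round(max(radii), 3),
        "within_tolerance": max(radii) <= tolerance_mm,
    }

measurements = [(0.3, -0.2), (-0.4, 0.1), (0.2, 0.5), (-0.1, -0.3)]
print(isocenter_summary(measurements))
```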

  5. How Formal Dynamic Verification Tools Facilitate Novel Concurrency Visualizations

    NASA Astrophysics Data System (ADS)

    Aananthakrishnan, Sriram; Delisi, Michael; Vakkalanka, Sarvani; Vo, Anh; Gopalakrishnan, Ganesh; Kirby, Robert M.; Thakur, Rajeev

    With the exploding scale of concurrency, presenting valuable pieces of information collected by formal verification tools intuitively and graphically can greatly enhance concurrent system debugging. Traditional MPI program debuggers present trace views of MPI program executions. Such views are redundant, often containing equivalent traces that permute independent MPI calls. In our ISP formal dynamic verifier for MPI programs, we present a collection of alternate views made possible by the use of formal dynamic verification. Some of ISP’s views help pinpoint errors, some facilitate discerning errors by eliminating redundancy, while others help users understand the program better by displaying concurrent event orderings that must be respected by all MPI implementations, in the form of completes-before graphs. In this paper, we describe ISP’s graphical user interface (GUI) capabilities in all these areas, which are currently supported by a portable Java based GUI, a Microsoft Visual Studio GUI, and an Eclipse based GUI whose development is in progress.

  6. Hard and Soft Safety Verifications

    NASA Technical Reports Server (NTRS)

    Wetherholt, Jon; Anderson, Brenda

    2012-01-01

    The purpose of this paper is to examine the differences between and the effects of hard and soft safety verifications. Initially, the terminology should be defined and clarified. A hard safety verification is a datum which demonstrates how a safety control is enacted. An example of this is relief valve testing. A soft safety verification is something usually described as nice to have but not necessary to prove safe operation. An example of a soft verification is the loss of the Solid Rocket Booster (SRB) casings from Shuttle flight STS-4. When the main parachutes failed, the casings impacted the water and sank. In the nose cap of the SRBs, video cameras recorded the release of the parachutes to determine safe operation and to provide information for potential anomaly resolution. Generally, examination of the casings and nozzles contributed to understanding of the newly developed boosters and their operation. Safety verification of SRB operation was demonstrated by examination for erosion or wear of the casings and nozzle. Loss of the SRBs and associated data did not delay the launch of the next Shuttle flight.

  7. Monte Carlo verification of IMRT treatment plans on grid.

    PubMed

    Gómez, Andrés; Fernández Sánchez, Carlos; Mouriño Gallego, José Carlos; López Cacheiro, Javier; González Castaño, Francisco J; Rodríguez-Silva, Daniel; Domínguez Carrera, Lorena; González Martínez, David; Pena García, Javier; Gómez Rodríguez, Faustino; González Castaño, Diego; Pombar Cameán, Miguel

    2007-01-01

    The eIMRT project is producing new remote computational tools for helping radiotherapists to plan and deliver treatments. The first available tool will be IMRT treatment verification using Monte Carlo, which is a computationally expensive problem that can be executed remotely on a grid. In this paper, the current implementation of this process using grid and SOA technologies is presented, describing the remote execution environment and the client.

  8. Safeguards Guidance Document for Designers of Commercial Nuclear Facilities: International Nuclear Safeguards Requirements and Practices For Uranium Enrichment Plants

    SciTech Connect

    Robert Bean; Casey Durst

    2009-10-01

    This report is the second in a series of guidelines on international safeguards requirements and practices, prepared expressly for the designers of nuclear facilities. The first document in this series is the description of generic international nuclear safeguards requirements pertaining to all types of facilities. These requirements should be understood and considered at the earliest stages of facility design as part of a new process called “Safeguards-by-Design.” This will help eliminate the costly retrofit of facilities that has occurred in the past to accommodate nuclear safeguards verification activities. The following summarizes the requirements for international nuclear safeguards implementation at enrichment plants, prepared under the Safeguards by Design project and funded by the U.S. Department of Energy (DOE) National Nuclear Security Administration (NNSA), Office of NA-243. The purpose of this guide is to provide designers of nuclear facilities around the world with a simplified set of design requirements and the most common practices for meeting them. The foundation for these requirements is the international safeguards agreement between the country and the International Atomic Energy Agency (IAEA), pursuant to the Treaty on the Non-Proliferation of Nuclear Weapons (NPT). Relevant safeguards requirements are also cited from the Safeguards Criteria for inspecting enrichment plants, found in the IAEA Safeguards Manual, Part SMC-8. IAEA definitions and terms are based on the IAEA Safeguards Glossary, published in 2002. The most current specification for safeguards measurement accuracy is found in the IAEA document STR-327, “International Target Values 2000 for Measurement Uncertainties in Safeguarding Nuclear Materials,” published in 2001. To make this guide easier for the designer to use, the requirements have been restated in plainer language per expert interpretation using the source documents noted. The safeguards agreement is fundamentally a

  9. Cold Fusion Verification.

    DTIC Science & Technology

    1991-03-01

    published work, talking with others in the field, and attending conferences, that CNF probably is chimera and will go the way of N-rays and polywater. To date, no one, including Pons and Fleischmann, has been able to construct a so-called CNF electrochemical cell that... Cold Nuclear Fusion (CNF), as originally reported in 1989. The conclusion is that CNF probably is chimera and will go the way of N-rays and polywater

  10. Structural verification for GAS experiments

    NASA Technical Reports Server (NTRS)

    Peden, Mark Daniel

    1992-01-01

    The purpose of this paper is to assist the Get Away Special (GAS) experimenter in conducting a thorough structural verification of its experiment structural configuration, thus expediting the structural review/approval process and the safety process in general. Material selection for structural subsystems will be covered with an emphasis on fasteners (GSFC fastener integrity requirements) and primary support structures (Stress Corrosion Cracking requirements and National Space Transportation System (NSTS) requirements). Different approaches to structural verifications (tests and analyses) will be outlined especially those stemming from lessons learned on load and fundamental frequency verification. In addition, fracture control will be covered for those payloads that utilize a door assembly or modify the containment provided by the standard GAS Experiment Mounting Plate (EMP). Structural hazard assessment and the preparation of structural hazard reports will be reviewed to form a summation of structural safety issues for inclusion in the safety data package.

  11. Help! Where to Look.

    ERIC Educational Resources Information Center

    Kneer, Marian E.

    1984-01-01

    Developing and maintaining effective physical education programs requires that teachers continually update their knowledge and skills. Books and journals, conferences, professional organizations, and consultants provide information to help teachers develop effective programs. (DF)

  12. Hooked on Helping

    ERIC Educational Resources Information Center

    Longhurst, James; McCord, Joan

    2014-01-01

    In this article, teens presenting at a symposium on peer-helping programs describe how caring for others fosters personal growth and builds positive group cultures. Their individual thoughts and opinions are expressed.

  13. Helping Parents Say No.

    ERIC Educational Resources Information Center

    Duel, Debra K.

    1988-01-01

    Provides some activities that are designed to help students understand some of the reasons why parents sometimes refuse to let their children have pets. Includes mathematics and writing lessons, a student checklist, and a set of tips for parents. (TW)

  14. Helping Teens Cope.

    ERIC Educational Resources Information Center

    Jones, Jami I.

    2003-01-01

    Considers the role of school library media specialists in helping teens cope with developmental and emotional challenges. Discusses resiliency research, and opportunities to develop programs and services especially for middle school and high school at-risk teens. (LRW)

  15. Help with Hearing

    MedlinePlus

    ... Foundation has shared over 7,000 Gund Teddy Bears with repaired cleft lips with children and families ... call the Cleftline for more information about our bears. If you are interested in helping us continue ...

  16. Can Reading Help?

    ERIC Educational Resources Information Center

    Crowe, Chris

    2003-01-01

    Ponders the effect of September 11th on teenagers. Proposes that reading books can help teenagers sort out complicated issues. Recommends young adult novels that offer hope for overcoming tragedy. Lists 50 short story collections worth reading. (PM)

  17. Grandparents Can Help

    ERIC Educational Resources Information Center

    Pieper, Elizabeth

    1976-01-01

    Although grandparents may have difficulty in accepting their handicapped grandchild due to such factors as the notion of "bad blood," they can be helpful to parents by drawing from their experience to give new perspectives to complex problems. (SB)

  18. Formal verification of mathematical software

    NASA Technical Reports Server (NTRS)

    Sutherland, D.

    1984-01-01

    Methods are investigated for formally specifying and verifying the correctness of mathematical software (software which uses floating point numbers and arithmetic). Previous work in the field was reviewed. A new model of floating point arithmetic called the asymptotic paradigm was developed and formalized. Two different conceptual approaches to program verification, the classical Verification Condition approach and the more recently developed Programming Logic approach, were adapted to use the asymptotic paradigm. These approaches were then used to verify several programs; the programs chosen were simplified versions of actual mathematical software.

  19. CHEMICAL INDUCTION MIXER VERIFICATION - ENVIRONMENTAL TECHNOLOGY VERIFICATION PROGRAM

    EPA Science Inventory

    The Wet-Weather Flow Technologies Pilot of the Environmental Technology Verification (ETV) Program, which is supported by the U.S. Environmental Protection Agency and facilitated by NSF International, has recently evaluated the performance of chemical induction mixers used for di...

  20. Tritium as an indicator of venues for nuclear tests.

    PubMed

    Lyakhova, O N; Lukashenko, S N; Mulgin, S I; Zhdanov, S V

    2013-10-01

    Currently, due to the Treaty on the Non-Proliferation of Nuclear Weapons, accurate verification of nuclear explosion venues is a highly topical issue. This paper proposes a new verification method that uses tritium as an indicator. Detailed studies of the tritium content in the air were carried out at the locations of underground nuclear tests - the "Balapan" and "Degelen" testing sites at the Semipalatinsk Test Site. The paper presents data on the levels and distribution of tritium in the air near tunnels and boreholes - at explosion epicentres, wellheads and tunnel portals, as well as in estuarine areas of the venues for the underground nuclear explosions (UNE).

  1. Working Memory Mechanism in Proportional Quantifier Verification

    ERIC Educational Resources Information Center

    Zajenkowski, Marcin; Szymanik, Jakub; Garraffa, Maria

    2014-01-01

    The paper explores the cognitive mechanisms involved in the verification of sentences with proportional quantifiers (e.g. "More than half of the dots are blue"). The first study shows that the verification of proportional sentences is more demanding than the verification of sentences such as: "There are seven blue and eight yellow…

  2. Keeping Nuclear Materials Secure

    SciTech Connect

    2016-10-19

    For 50 years, Los Alamos National Laboratory has been helping to keep nuclear materials secure. We do this by developing instruments and training inspectors that are deployed to other countries to make sure materials such as uranium are being used for peaceful purposes and not diverted for use in weapons. These measures are called “nuclear safeguards,” and they help make the world a safer place.

  3. Automated Installation Verification of COMSOL via LiveLink for MATLAB

    SciTech Connect

    Crowell, Michael W

    2015-01-01

    Verifying that a local software installation performs as the developer intends is a potentially time-consuming but necessary step for nuclear safety-related codes. Automating this process not only saves time, but can increase reliability and scope of verification compared to ‘hand’ comparisons. While COMSOL does not include automatic installation verification as many commercial codes do, it does provide tools such as LiveLink™ for MATLAB® and the COMSOL API for use with Java® through which the user can automate the process. Here we present a successful automated verification example of a local COMSOL 5.0 installation for nuclear safety-related calculations at the Oak Ridge National Laboratory’s High Flux Isotope Reactor (HFIR).
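
    A minimal sketch of the installation-verification pattern described above: run a set of benchmark cases and compare key computed quantities against stored acceptance values within tolerances. The harness below is a generic illustration with made-up case names, reference values, and tolerances; the actual ORNL implementation drives COMSOL through LiveLink for MATLAB and the COMSOL API rather than this stub.

```python
# Hedged sketch of an installation-verification harness: run benchmark cases and
# compare key outputs to stored acceptance values. Case names, reference values,
# and tolerances are illustrative assumptions, not the ORNL COMSOL test suite.

# reference: case -> (expected value, relative tolerance)
REFERENCE = {
    "steady_conduction_peak_temp_K": (355.2, 1e-3),
    "channel_flow_pressure_drop_Pa": (1.24e4, 5e-3),
}

def run_case(name):
    """Placeholder for launching the solver and extracting the key result."""
    raise NotImplementedError("hook this up to the actual solver run")

def verify_installation(run=run_case, reference=REFERENCE):
    failures = []
    for case, (expected, rtol) in reference.items():
        actual = run(case)
        if abs(actual - expected) > rtol * abs(expected):
            failures.append((case, expected, actual))
    for case, expected, actual in failures:
        print(f"FAIL {case}: expected {expected}, got {actual}")
    return not failures

if __name__ == "__main__":
    ok = verify_installation(run=lambda c: REFERENCE[c][0])  # stubbed run for illustration
    print("Installation verification", "PASSED" if ok else "FAILED")
```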

  4. Dispelling myths about verification of sea-launched cruise missiles

    SciTech Connect

    Lewis, G.N.; Ride, S.K.; Townsend, J.S.

    1989-11-10

    It is widely believed that an arms control limit on nuclear-armed sea-launched cruise missiles would be nearly impossible to verify. Among the reasons usually given are: these weapons are small, built in nondistinctive industrial facilities, deployed on a variety of ships and submarines, and difficult to distinguish from their conventionally armed counterparts. In this article, it is argued that the covert production and deployment of nuclear-armed sea-launched cruise missiles would not be so straightforward. A specific arms control proposal is described, namely a total ban on nuclear-armed sea-launched cruise missiles. This proposal is used to illustrate how an effective verification scheme might be constructed. 9 refs., 6 figs.

  5. Image Hashes as Templates for Verification

    SciTech Connect

    Janik, Tadeusz; Jarman, Kenneth D.; Robinson, Sean M.; Seifert, Allen; McDonald, Benjamin S.; White, Timothy A.

    2012-07-17

    Imaging systems can provide measurements that confidently assess characteristics of nuclear weapons and dismantled weapon components, and such assessment will be needed in future verification for arms control. Yet imaging is often viewed as too intrusive, raising concern about the ability to protect sensitive information. In particular, the prospect of using image-based templates for verifying the presence or absence of a warhead, or of the declared configuration of fissile material in storage, may be rejected out of hand as being too vulnerable to violation of information barrier (IB) principles. Development of a rigorous approach for generating and comparing reduced-information templates from images, and assessing the security, sensitivity, and robustness of verification using such templates, are needed to address these concerns. We discuss our efforts to develop such a rigorous approach based on a combination of image-feature extraction and encryption, utilizing hash functions to confirm proffered declarations while providing strong classified data security and maintaining high confidence for verification. The proposed work is focused on developing secure, robust, tamper-sensitive and automatic techniques that may enable the comparison of non-sensitive hashed image data outside an IB. It is rooted in research on so-called perceptual hash functions for image comparison, at the interface of signal/image processing, pattern recognition, cryptography, and information theory. Such perceptual or robust image hashing (which, strictly speaking, is not truly cryptographic hashing) has extensive application in content authentication and information retrieval, database search, and security assurance. Applying and extending the principles of perceptual hashing to imaging for arms control, we propose techniques that are sensitive to altering, forging and tampering of the imaged object yet robust and tolerant to content-preserving image distortions and noise. Ensuring that the
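
    A minimal sketch of the perceptual-hashing building block mentioned above: a simple average hash compared by Hamming distance. Real template schemes for arms control would add feature extraction, cryptographic protections, and tamper sensitivity, so the code below only illustrates the robust-comparison step; the hash size, noise level, and decision threshold are assumptions.

```python
# Hedged sketch: average-hash ("aHash") style perceptual hash and Hamming comparison.
# Hash size and decision threshold are illustrative; this is NOT the paper's scheme
# and provides none of the cryptographic/information-barrier protections it requires.
import numpy as np

def average_hash(image, hash_size=8):
    """image: 2D numpy array. Returns a flat boolean array of hash_size*hash_size bits."""
    h, w = image.shape
    # crude block-mean downsampling to hash_size x hash_size
    small = image[:h - h % hash_size, :w - w % hash_size]
    small = small.reshape(hash_size, small.shape[0] // hash_size,
                          hash_size, small.shape[1] // hash_size).mean(axis=(1, 3))
    return (small > small.mean()).ravel()

def hamming_distance(h1, h2):
    return int(np.count_nonzero(h1 != h2))

def matches_template(image, template_hash, max_bits_different=5):
    """Accept the declaration if the image hash is close to the enrolled template hash."""
    return hamming_distance(average_hash(image), template_hash) <= max_bits_different

# Toy usage: enroll a template, then verify a slightly noisy re-measurement of the same scene.
rng = np.random.default_rng(0)
reference = rng.random((64, 64))
template = average_hash(reference)
noisy = reference + 0.01 * rng.standard_normal((64, 64))
print(matches_template(noisy, template))
```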

  6. Automated verification system user's guide

    NASA Technical Reports Server (NTRS)

    Hoffman, R. H.

    1972-01-01

    Descriptions of the operational requirements for all of the programs of the Automated Verification System (AVS) are provided. The AVS programs are: (1) FORTRAN code analysis and instrumentation program (QAMOD); (2) Test Effectiveness Evaluation Program (QAPROC); (3) Transfer Control Variable Tracking Program (QATRAK); (4) Program Anatomy Table Generator (TABGEN); and (5) Network Path Analysis Program (RAMBLE).

  7. Verification Challenges at Low Numbers

    SciTech Connect

    Benz, Jacob M.; Booker, Paul M.; McDonald, Benjamin S.

    2013-07-16

    This paper will explore the difficulties of deep reductions by examining the technical verification challenges. At each step on the road to low numbers, the verification required to ensure compliance of all parties will increase significantly. Looking beyond New START, the next step will likely include warhead limits in the neighborhood of 1000 (Pifer 2010). Further reductions will include stepping stones at hundreds of warheads, and then tens of warheads, before elimination of the last few remaining warheads and weapons could be considered. This paper will focus on these three threshold reduction levels: 1000, hundreds, and tens of warheads. For each, the issues and challenges will be discussed, potential solutions will be identified, and the verification technologies and chain of custody measures that address these solutions will be surveyed. It is important to note that many of the issues that need to be addressed have no current solution. In these cases, the paper will explore new or novel technologies that could be applied. These technologies will draw from the research and development that is ongoing throughout the national lab complex, and will look at technologies utilized in other areas of industry for their application to arms control verification.

  8. Cleanup Verification Package for the 118-C-1, 105-C Solid Waste Burial Ground

    SciTech Connect

    M. J. Appel and J. M. Capron

    2007-07-25

    This cleanup verification package documents completion of remedial action for the 118-C-1, 105-C Solid Waste Burial Ground. This waste site was the primary burial ground for general wastes from the operation of the 105-C Reactor and received process tubes, aluminum fuel spacers, control rods, reactor hardware, spent nuclear fuel and soft wastes.

  9. Helpful hints to painless payload processing

    NASA Technical Reports Server (NTRS)

    Terhune, Terry; Carson, Maggie

    1995-01-01

    The helpful hints herein describe, from a system perspective, the functional flow of hardware and software. The flow will begin at the experiment development stage and continue through build-up, test, verification, delivery, launch and deintegration of the experiment. An effort will be made to identify those interfaces and transfer functions of processing that can be improved upon in the new world of 'Faster, Better, and Cheaper.' The documentation necessary to ensure configuration and processing requirements satisfaction will also be discussed. Hints and suggestions for improvements to enhance each phase of the flow will be derived from extensive experience and documented lessons learned. Charts will be utilized to define the functional flow and a list of 'lessons learned' will be addressed to show applicability. In conclusion, specific improvements for several areas of hardware processing, procedure development and quality assurance, that are generic to all Small Payloads, will be identified.

  10. Stretching: Does It Help?

    ERIC Educational Resources Information Center

    Vardiman, Phillip; Carrand, David; Gallagher, Philip M.

    2010-01-01

    Stretching prior to activity is universally accepted as an important way to improve performance and help prevent injury. Likewise, limited flexibility has been shown to decrease functional ability and predispose a person to injuries. Although this is commonly accepted, appropriate stretching for children and adolescents involved with sports and…

  11. Helping, Manipulation, and Magic

    ERIC Educational Resources Information Center

    Frey, Louise A.; Edinburg, Golda M.

    1978-01-01

    The thesis of this article is that an understanding of the primitive origins of the helping process in myth, magic, and ritual may prevent social workers from engaging in practices that negate their clients' ability to work out their own solutions to problems. (Author)

  12. Helping Perceptually Handicapped Children

    ERIC Educational Resources Information Center

    Rose, Helen S.

    1974-01-01

    Five children diagnosed as having a perceptual problem as revealed by the Bender Visual Motor Gestalt Test received special tutoring to help develop their visual discrimination abilities. The six-week program for teaching the concept of shapes employed kinesthetic, visual, tactile, and verbal processes. (CS)

  13. Helping Families Cope.

    ERIC Educational Resources Information Center

    Goodman, Carol R.

    The paper presents observations of families having adult members with learning disabilities and describes a residential program to facilitate the transition to independent living of lower functioning learning disabled young adults. The program, called Independence Center, involves placing participants in apartments with roommates and helping them…

  14. Helping Teachers Communicate

    ERIC Educational Resources Information Center

    Kise, Jane; Russell, Beth; Shumate, Carol

    2008-01-01

    Personality type theory describes normal differences in how people are energized, take in information, make decisions, and approach work and life--all key elements in how people teach and learn. Understanding one another's personality type preferences helps teachers share their instructional strategies and classroom information. Type theory…

  15. Ayudele! [Help Him!].

    ERIC Educational Resources Information Center

    Spencer, Maria Gutierrez, Comp.; Almance, Sofia, Comp.

    Written in Spanish and English, the booklet briefly discusses what parents can do to help their child learn at school. The booklet briefly notes the importance of getting enough sleep; eating breakfast; praising the child; developing the five senses; visiting the doctor; having a home and garden; talking, listening, and reading to the child;…

  16. What Helps Us Learn?

    ERIC Educational Resources Information Center

    Educational Leadership, 2010

    2010-01-01

    This article presents comments of high school students at the Howard Gardner School in Alexandria, Virginia, who were asked, What should teachers know about students to help them learn? Twelve high school students from the Howard Gardner School in Alexandria, Virginia, describe how their best teachers get to know them and thus were more able to…

  17. A Helping Hand.

    ERIC Educational Resources Information Center

    Renner, Jason M.

    2000-01-01

    Discusses how designing a hand washing-friendly environment can help to reduce the spread of germs in school restrooms. Use of electronic faucets, surface risk management, traffic flow, and user- friendly hand washing systems that are convenient and maximally hygienic are examined. (GR)

  18. Help With Schizophrenia

    MedlinePlus

    ... Schizophrenia is a chronic brain disorder that affects about one percent of ...

  19. Helping Adults to Spell.

    ERIC Educational Resources Information Center

    Moorhouse, Catherine

    This book presents a range of strategies for adult literacy tutors and offers a wealth of practical advice on teaching spelling within the context of writing. Chapters 1-3 offer basic information on talking with the student about spelling, finding out how the student spells and helping the student to see himself/herself as a "good" speller, and…

  20. Helping You Age Well

    MedlinePlus

    ... to keep family relationships and friendships over time. Exercise can also help prevent depression or lift your mood. Stay active and involved in life. Talk to your physician if you are feeling depressed. Teeth & ... Lungs: Regular aerobic exercise keeps lung capacity up. Smoking leads to chronic ...

  1. Help Teens Manage Diabetes

    MedlinePlus

    ... Grey, dean of the Yale University School of Nursing, developed and tested a program called Coping Skills Training (CST) as a part of routine diabetes ... is to improve diabetic teens' coping and communication skills, healthy ... sugar levels. "Nursing research is about helping people deal with the ...

  2. Self-Help Experiences

    ERIC Educational Resources Information Center

    Woody, Robert H.

    1973-01-01

    The author believes that there is a distinct need for professionals to become competent in providing materials for self-help lay efforts. Colleges and universities must provide for the facilitation of personal growth through self administered procedures by either a clinical approach (in counseling centers) or a didactic one (in classes as, for…

  3. Help for Stressed Students

    ERIC Educational Resources Information Center

    Pope, Denise Clarke; Simon, Richard

    2005-01-01

    The authors argue that increased focus and pressure for high academic achievement, particularly among more highly-motivated and successful students, may have serious negative consequences. They present a number of strategies designed to help reduce both causes and consequences associated with academic stress and improve students' mental and…

  4. Concepts of Model Verification and Validation

    SciTech Connect

    Thacker, B.H.; Doebling, S.W.; Hemez, F.M.; Anderson, M.C.; Pepin, J.E.; Rodriguez, E.A.

    2004-10-30

    Model verification and validation (V&V) is an enabling methodology for the development of computational models that can be used to make engineering predictions with quantified confidence. Model V&V procedures are needed by government and industry to reduce the time, cost, and risk associated with full-scale testing of products, materials, and weapon systems. Quantifying the confidence and predictive accuracy of model calculations provides the decision-maker with the information necessary for making high-consequence decisions. The development of guidelines and procedures for conducting a model V&V program are currently being defined by a broad spectrum of researchers. This report reviews the concepts involved in such a program. Model V&V is a current topic of great interest to both government and industry. In response to a ban on the production of new strategic weapons and nuclear testing, the Department of Energy (DOE) initiated the Science-Based Stockpile Stewardship Program (SSP). An objective of the SSP is to maintain a high level of confidence in the safety, reliability, and performance of the existing nuclear weapons stockpile in the absence of nuclear testing. This objective has challenged the national laboratories to develop high-confidence tools and methods that can be used to provide credible models needed for stockpile certification via numerical simulation. There has been a significant increase in activity recently to define V&V methods and procedures. The U.S. Department of Defense (DoD) Modeling and Simulation Office (DMSO) is working to develop fundamental concepts and terminology for V&V applied to high-level systems such as ballistic missile defense and battle management simulations. The American Society of Mechanical Engineers (ASME) has recently formed a Standards Committee for the development of V&V procedures for computational solid mechanics models. The Defense Nuclear Facilities Safety Board (DNFSB) has been a proponent of model V&V for all

  5. Helping pregnant teenagers.

    PubMed

    Bluestein, D; Starling, M E

    1994-08-01

    Teenagers who are pregnant face many difficult issues, and counseling by physicians can be an important source of help. We suggest guidelines for this counseling, beginning with a review of the scope and consequences of adolescent pregnancy. Communication strategies should be aimed at building rapport with techniques such as maintaining confidentiality, avoiding judgmental stances, and gearing communication to cognitive maturity. Techniques for exploring family relationships are useful because these relationships are key influences on subsequent decisions and behaviors. We discuss topics related to abortion and childbearing, such as safety, facilitation of balanced decision making, the use of prenatal care, and the formulation of long-term plans. Physicians who can effectively discuss these topics can help pregnant teenagers make informed decisions and improve their prospects for the future.

  6. Helping children talk.

    PubMed

    Day, L

    1998-01-01

    Many children and young people living in London are affected by HIV. Most such children come from families from sub-Saharan Africa. Some HIV-positive parents have died, some are ill, and some are well. Some older children know that their parents are infected with HIV, but most children are unaware. To help these children understand their situations, children with a parent or parents who have died or are very sick are invited to 6 half-days of storytelling and play, led by a family counselor and someone who uses drama. Trained volunteers come from local AIDS organizations. The sessions vary depending upon what the children choose to discuss. The adults' role is to help the children begin to reflect upon their feelings in a way which is easy for them to express. Sessions usually begin with the creation of a story using a toy animal, after which children subsequently act out the imaginary family in different ways.

  7. Measurement techniques for the verification of excess weapons materials

    SciTech Connect

    Tape, J.W.; Eccleston, G.W.; Yates, M.A.

    1998-12-01

    The end of the superpower arms race has resulted in an unprecedented reduction in stockpiles of deployed nuclear weapons. Numerous proposals have been put forward and actions have been taken to ensure the irreversibility of nuclear arms reductions, including unilateral initiatives such as those made by President Clinton in September 1993 to place fissile materials no longer needed for a deterrent under international inspection, and bilateral and multilateral measures currently being negotiated. For the technologist, there is a unique opportunity to develop the technical means to monitor nuclear materials that have been declared excess to nuclear weapons programs, to provide confidence that reductions are taking place and that the released materials are not being used again for nuclear explosive programs. However, because of the sensitive nature of these materials, a fundamental conflict exists between the desire to know that the bulk materials or weapon components in fact represent evidence of warhead reductions, and treaty commitments and national laws that require the protection of weapons design information. This conflict presents a unique challenge to technologists. The flow of excess weapons materials, from deployed warheads through storage, disassembly, component storage, conversion to bulk forms, and disposition, will be described in general terms. Measurement approaches based on the detection of passive or induced radiation will be discussed along with the requirement to protect sensitive information from release to unauthorized parties. Possible uses of measurement methods to assist in the verification of arms reductions will be described. The concept of measuring attributes of items rather than quantitative mass-based inventory verification will be discussed along with associated information-barrier concepts required to protect sensitive information.
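
    The distinction the abstract draws between attribute measurement behind an information barrier and quantitative mass-based inventory verification can be illustrated with a small sketch (hypothetical names and thresholds, not any fielded system): the barrier sees the raw, sensitive measurement but releases only unclassified pass/fail attributes.

```python
# Minimal sketch (hypothetical names and thresholds): an "information barrier"
# that reduces a sensitive raw measurement to pass/fail attributes instead of
# reporting a quantitative, mass-based inventory value.

from dataclasses import dataclass

@dataclass
class RawMeasurement:
    # Raw values such as these would be sensitive and never leave the barrier.
    pu_mass_g: float          # measured plutonium mass (illustrative)
    pu240_to_pu239: float     # isotopic ratio (illustrative)

def attribute_check(m: RawMeasurement,
                    mass_threshold_g: float = 500.0,
                    max_ratio: float = 0.1) -> dict:
    """Return only boolean attributes; the raw numbers are not exposed."""
    return {
        "pu_mass_above_threshold": m.pu_mass_g > mass_threshold_g,
        "weapons_grade_isotopics": m.pu240_to_pu239 < max_ratio,
    }

if __name__ == "__main__":
    item = RawMeasurement(pu_mass_g=2150.0, pu240_to_pu239=0.06)
    print(attribute_check(item))   # only pass/fail attributes are released
```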

  8. Wavelet Features Based Fingerprint Verification

    NASA Astrophysics Data System (ADS)

    Bagadi, Shweta U.; Thalange, Asha V.; Jain, Giridhar P.

    2010-11-01

    In this work, we present an automatic fingerprint identification system based on Level 3 features. Systems based only on minutiae features do not perform well for poor-quality images. In practice, we often encounter extremely dry or wet fingerprint images with cuts, warts, etc. Because of such fingerprints, minutiae-based systems show poor performance in real-time authentication applications. To alleviate the problem of poor-quality fingerprints and to improve the overall performance of the system, this paper proposes fingerprint verification based on wavelet statistical features and co-occurrence matrix features. The features include mean, standard deviation, energy, entropy, contrast, local homogeneity, cluster shade, cluster prominence, and information measure of correlation. In this method, matching can be done between the input image and the stored template without exhaustive search using the extracted features. The wavelet transform based approach is better than the existing minutiae-based method, takes less response time, and is hence suitable for on-line verification with high accuracy.
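
    A minimal sketch of the wavelet-statistics portion of such a feature vector (not the authors' implementation; it assumes NumPy and PyWavelets and omits the co-occurrence-matrix features listed above):

```python
# Illustrative sketch: wavelet statistical features of the kind listed in the
# abstract (mean, standard deviation, energy, entropy) computed from a single-level
# 2-D DWT of a fingerprint image, plus a simple distance-based match score.

import numpy as np
import pywt

def wavelet_features(image: np.ndarray) -> np.ndarray:
    """Return a small feature vector from the four level-1 DWT subbands."""
    cA, (cH, cV, cD) = pywt.dwt2(image.astype(float), "db1")
    feats = []
    for band in (cA, cH, cV, cD):
        coeffs = band.ravel()
        energy = np.sum(coeffs ** 2)
        # Shannon entropy of the normalized coefficient energies
        p = (coeffs ** 2) / (energy + 1e-12)
        entropy = -np.sum(p * np.log2(p + 1e-12))
        feats += [coeffs.mean(), coeffs.std(), energy, entropy]
    return np.array(feats)

def match_score(f1: np.ndarray, f2: np.ndarray) -> float:
    """Euclidean distance between feature vectors; smaller means more similar."""
    return float(np.linalg.norm(f1 - f2))

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    probe, template = rng.random((128, 128)), rng.random((128, 128))
    print(match_score(wavelet_features(probe), wavelet_features(template)))
```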

  9. NEXT Thruster Component Verification Testing

    NASA Technical Reports Server (NTRS)

    Pinero, Luis R.; Sovey, James S.

    2007-01-01

    Component testing is a critical part of thruster life validation activities under NASA s Evolutionary Xenon Thruster (NEXT) project testing. The high voltage propellant isolators were selected for design verification testing. Even though they are based on a heritage design, design changes were made because the isolators will be operated under different environmental conditions including temperature, voltage, and pressure. The life test of two NEXT isolators was therefore initiated and has accumulated more than 10,000 hr of operation. Measurements to date indicate only a negligibly small increase in leakage current. The cathode heaters were also selected for verification testing. The technology to fabricate these heaters, developed for the International Space Station plasma contactor hollow cathode assembly, was transferred to Aerojet for the fabrication of the NEXT prototype model ion thrusters. Testing the contractor-fabricated heaters is necessary to validate fabrication processes for high reliability heaters. This paper documents the status of the propellant isolator and cathode heater tests.

  10. Ontology Matching with Semantic Verification

    PubMed Central

    Jean-Mary, Yves R.; Shironoshita, E. Patrick; Kabuka, Mansur R.

    2009-01-01

    ASMOV (Automated Semantic Matching of Ontologies with Verification) is a novel algorithm that uses lexical and structural characteristics of two ontologies to iteratively calculate a similarity measure between them, derives an alignment, and then verifies it to ensure that it does not contain semantic inconsistencies. In this paper, we describe the ASMOV algorithm, and then present experimental results that measure its accuracy using the OAEI 2008 tests, and that evaluate its use with two different thesauri: WordNet, and the Unified Medical Language System (UMLS). These results show the increased accuracy obtained by combining lexical, structural and extensional matchers with semantic verification, and demonstrate the advantage of using a domain-specific thesaurus for the alignment of specialized ontologies. PMID:20186256
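
    The "semantic verification" step can be illustrated with a deliberately simplified toy (this is not the ASMOV code): given a candidate alignment between two class hierarchies, drop any correspondence that participates in a crisscross, i.e. a pair of mappings that inverts a subclass relation across the two ontologies.

```python
# Toy illustration of semantic verification of an ontology alignment.
# Hierarchies are plain {class: parent_class} dictionaries; the class names,
# hierarchies, and candidate alignment below are hypothetical.

def ancestors(cls, parent_of):
    """All classes strictly above `cls` in a {class: parent} hierarchy."""
    out = set()
    while cls in parent_of:
        cls = parent_of[cls]
        out.add(cls)
    return out

def crisscross(c1, c2, alignment, onto1, onto2):
    """True if (c1, c2) together with another pair inverts a subclass relation."""
    for d1, d2 in alignment.items():
        if d1 == c1:
            continue
        above_then_below = d1 in ancestors(c1, onto1) and c2 in ancestors(d2, onto2)
        below_then_above = c1 in ancestors(d1, onto1) and d2 in ancestors(c2, onto2)
        if above_then_below or below_then_above:
            return True
    return False

def verify_alignment(alignment, onto1, onto2):
    """Keep only correspondences that create no crisscross inconsistency."""
    return {c1: c2 for c1, c2 in alignment.items()
            if not crisscross(c1, c2, alignment, onto1, onto2)}

if __name__ == "__main__":
    onto1 = {"MRI_Scan": "ImagingStudy", "ImagingStudy": "ClinicalEvent"}
    onto2 = {"MagneticResonanceImaging": "Imaging", "Imaging": "Procedure"}
    # Candidate alignment from a hypothetical lexical/structural matcher: the first
    # two pairs invert the hierarchy (crisscross) and are dropped; the last survives.
    candidate = {"MRI_Scan": "Imaging",
                 "ImagingStudy": "MagneticResonanceImaging",
                 "ClinicalEvent": "Procedure"}
    print(verify_alignment(candidate, onto1, onto2))   # {'ClinicalEvent': 'Procedure'}
```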

  11. Verification and transparency in future arms control

    SciTech Connect

    Pilat, J.F.

    1996-09-01

    Verification's importance has changed dramatically over time, although it has always been at the forefront of arms control. The goals and measures of verification and the criteria for success have changed with the times as well, reflecting such factors as the centrality of the prospective agreement to East-West relations during the Cold War, the state of relations between the United States and the Soviet Union, and the technologies available for monitoring. Verification's role may be declining in the post-Cold War period. The prospects for such a development will depend, first and foremost, on the high costs of traditional arms control, especially those associated with requirements for verification. Moreover, the growing interest in informal, or non-negotiated, arms control does not allow for verification provisions by the very nature of these arrangements. Multilateral agreements are also becoming more prominent and argue against highly effective verification measures, in part because of fears of promoting proliferation by opening sensitive facilities to inspectors from potential proliferant states. As a result, it is likely that transparency and confidence-building measures will achieve greater prominence, both as supplements to and substitutes for traditional verification. Such measures are not panaceas and do not offer all that we came to expect from verification during the Cold War. But they may be the best possible means to deal with current problems of arms reductions and restraints at acceptable levels of expenditure.

  12. Crowd-Sourced Program Verification

    DTIC Science & Technology

    2012-12-01

    ... investigation, the contractor constructed a prototype of a crowd-sourced verification system that takes as input a given program and produces as output a ...

  13. Structural System Identification Technology Verification

    DTIC Science & Technology

    1981-11-01

    USAAVRADCOM-TR-81-D-28 (ADA109181), Structural System Identification Technology Verification, N. Giansante, A. Berman, W. Flannelly, E. ... Approved for public release; distribution unlimited. Prepared for the Applied Technology Laboratory, U.S. Army Research and Technology Laboratories (AVRADCOM), Fort Eustis, Va. 23604. Applied Technology Laboratory position statement: The Applied Technology Laboratory has been involved in the development of the Struc ...

  14. Formal verification of AI software

    NASA Technical Reports Server (NTRS)

    Rushby, John; Whitehurst, R. Alan

    1989-01-01

    The application of formal verification techniques to Artificial Intelligence (AI) software, particularly expert systems, is investigated. Constraint satisfaction and model inversion are identified as two formal specification paradigms for different classes of expert systems. A formal definition of consistency is developed, and the notion of approximate semantics is introduced. Examples are given of how these ideas can be applied in both declarative and imperative forms.

  15. Verification and validation guidelines for high integrity systems. Volume 1

    SciTech Connect

    Hecht, H.; Hecht, M.; Dinsmore, G.; Hecht, S.; Tang, D.

    1995-03-01

    High integrity systems include all protective (safety and mitigation) systems for nuclear power plants, and also systems for which comparable reliability requirements exist in other fields, such as in the process industries, in air traffic control, and in patient monitoring and other medical systems. Verification aims at determining that each stage in the software development completely and correctly implements requirements that were established in a preceding phase, while validation determines that the overall performance of a computer system completely and correctly meets system requirements. Volume I of the report reviews existing classifications for high integrity systems and for the types of errors that may be encountered, and makes recommendations for verification and validation procedures, based on assumptions about the environment in which these procedures will be conducted. The final chapter of Volume I deals with a framework for standards in this field. Volume II contains appendices dealing with specific methodologies for system classification, for dependability evaluation, and for two software tools that can automate otherwise very labor intensive verification and validation activities.

  16. Earthquake Forecasting, Validation and Verification

    NASA Astrophysics Data System (ADS)

    Rundle, J.; Holliday, J.; Turcotte, D.; Donnellan, A.; Tiampo, K.; Klein, B.

    2009-05-01

    Techniques for earthquake forecasting are in development using both seismicity data mining methods, as well as numerical simulations. The former rely on the development of methods to recognize patterns in data, while the latter rely on the use of dynamical models that attempt to faithfully replicate the actual fault systems. Testing such forecasts is necessary not only to determine forecast quality, but also to improve forecasts. A large number of techniques to validate and verify forecasts have been developed for weather and financial applications. Many of these have been elaborated in public locations, including, for example, the URL as listed below. Typically, the goal is to test for forecast resolution, reliability and sharpness. A good forecast is characterized by consistency, quality and value. Most, if not all of these forecast verification procedures can be readily applied to earthquake forecasts as well. In this talk, we discuss both methods of forecasting, as well as validation and verification using a number of these standard methods. We show how these test methods might be useful for both fault-based forecasting, a group of forecast methods that includes the WGCEP and simulator-based renewal models, and grid-based forecasting, which includes the Relative Intensity, Pattern Informatics, and smoothed seismicity methods. We find that applying these standard methods of forecast verification is straightforward. Judgments about the quality of a given forecast method can often depend on the test applied, as well as on the preconceptions and biases of the persons conducting the tests.
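
    Two of the standard verification quantities mentioned above, the Brier score and a reliability (calibration) table, are straightforward to compute for probabilistic forecasts against binary outcomes; the sketch below is illustrative only and uses synthetic data, not any of the forecast models named in the abstract.

```python
# Minimal sketch of probabilistic forecast verification: Brier score and a
# reliability table for forecast probabilities p against binary outcomes o.

import numpy as np

def brier_score(p, o):
    """Mean squared error of forecast probabilities against 0/1 outcomes."""
    p, o = np.asarray(p, float), np.asarray(o, float)
    return float(np.mean((p - o) ** 2))

def reliability_table(p, o, bins=10):
    """Observed frequency per forecast-probability bin; a well-calibrated
    forecast has observed frequency close to the bin's mean forecast value."""
    p, o = np.asarray(p, float), np.asarray(o, float)
    edges = np.linspace(0.0, 1.0, bins + 1)
    rows = []
    for lo, hi in zip(edges[:-1], edges[1:]):
        mask = (p >= lo) & (p <= hi) if hi == 1.0 else (p >= lo) & (p < hi)
        if mask.any():
            rows.append((p[mask].mean(), o[mask].mean(), int(mask.sum())))
    return rows  # (mean forecast probability, observed frequency, count) per bin

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    forecasts = rng.random(1000)
    outcomes = (rng.random(1000) < forecasts).astype(int)   # well-calibrated toy data
    print("Brier score:", round(brier_score(forecasts, outcomes), 3))
    for f_bar, o_bar, n in reliability_table(forecasts, outcomes):
        print(f"forecast {f_bar:.2f}  observed {o_bar:.2f}  n={n}")
```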

  17. Please Help Your Union

    NASA Astrophysics Data System (ADS)

    Killeen, Tim

    2006-03-01

    The continuing success of AGU relies entirely on the volunteer work of members. A major contribution to these efforts comes from the over 40 committees that plan, oversee, and have operational roles in our meetings, publications, finances, elections, awards, education, public information, and public affairs activities. The names of committees are provided in the accompanying text box; their current membership and descriptions can be found on the Web at the AGU site. One of the most important and challenging tasks of the incoming AGU President is to reestablish these committees by appointing hundreds of volunteers. I now solicit your help in staffing these committees. Ideally, participation in these important committees will reflect the overall membership and perspectives of AGU members, so please do consider volunteering yourself. Of course, nominations of others would also be very welcome. I am particularly interested in making sure that the gender balance, age, and geographic representation are appropriate and reflect our changing demographics. Any suggestions you might have will be more helpful if accompanied by a few sentences of background information relevant to the particular committee.

  18. History of Nuclear India

    NASA Astrophysics Data System (ADS)

    Chaturvedi, Ram

    2000-04-01

    India emerged as a free and democratic country in 1947, and entered the nuclear age in 1948 by establishing the Atomic Energy Commission (AEC), with Homi Bhabha as its chairman. Later, the Department of Atomic Energy (DAE) was created under the Office of the Prime Minister, Jawaharlal Nehru. Initially the AEC and DAE received international cooperation, and by 1963 India had two research reactors and four nuclear power reactors. In spite of the humiliating defeat in the border war with China in 1962 and China's nuclear test in 1964, India continued to adhere to the peaceful uses of nuclear energy. On May 18, 1974, India performed a 15 kt Peaceful Nuclear Explosion (PNE). The Western powers considered it nuclear weapons proliferation and cut off all financial and technical help, even for the production of nuclear power. However, India used its existing infrastructure to build nuclear power reactors and exploded both fission and fusion devices on May 11 and 13, 1998. The international community viewed the latter activity as a serious roadblock to the Non-Proliferation Treaty and the Comprehensive Test Ban Treaty, both deemed essential to stopping the spread of nuclear weapons. India considers these treaties to favor nuclear states and is prepared to sign them if genuine nuclear disarmament is included as an integral part of these treaties.

  19. PERFORMANCE VERIFICATION OF ANIMAL WATER TREATMENT TECHNOLOGIES THROUGH EPA'S ENVIRONMENTAL TECHNOLOGY VERIFICATION PROGRAM

    EPA Science Inventory

    The U.S. Environmental Protection Agency created the Environmental Technology Verification Program (ETV) to further environmental protection by accelerating the commercialization of new and innovative technology through independent performance verification and dissemination of in...

  20. PERFORMANCE VERIFICATION OF STORMWATER TREATMENT DEVICES UNDER EPA'S ENVIRONMENTAL TECHNOLOGY VERIFICATION PROGRAM

    EPA Science Inventory

    The Environmental Technology Verification (ETV) Program was created to facilitate the deployment of innovative or improved environmental technologies through performance verification and dissemination of information. The program's goal is to further environmental protection by a...

  1. Nuclear Lunar Logistics Study

    NASA Technical Reports Server (NTRS)

    1963-01-01

    This document has been prepared to incorporate all presentation aid material, together with some explanatory text, used during an oral briefing on the Nuclear Lunar Logistics System given at the George C. Marshall Space Flight Center, National Aeronautics and Space Administration, on 18 July 1963. The briefing and this document are intended to present the general status of the NERVA (Nuclear Engine for Rocket Vehicle Application) nuclear rocket development, the characteristics of certain operational NERVA-class engines, and appropriate technical and schedule information. Some of the information presented herein is preliminary in nature and will be subject to further verification, checking and analysis during the remainder of the study program. In addition, more detailed information will be prepared in many areas for inclusion in a final summary report. This work has been performed by REON, a division of Aerojet-General Corporation under Subcontract 74-10039 from the Lockheed Missiles and Space Company. The presentation and this document have been prepared in partial fulfillment of the provisions of the subcontract. From the inception of the NERVA program in July 1961, the stated emphasis has centered around the demonstration of the ability of a nuclear rocket to perform safely and reliably in the space environment, with the understanding that the assignment of a mission (or missions) would place undue emphasis on performance and operational flexibility. However, all were aware that the ultimate justification for the development program must lie in the application of the nuclear propulsion system to the national space objectives.

  2. Compendium of Arms Control Verification Proposals.

    DTIC Science & Technology

    1982-03-01

    ... ZONAL ON-SITE INSPECTION; CHAPTER D - CONTROL POSTS; CHAPTER E - RECORDS MONITORING ... describing in general the significant features of the verification method concerned. Chapters A to C deal with verification by direct on-site inspection (i.e., increasing as confidence develops), and Chapter D with control or observation posts. Chapter E deals with verification by examination of ...

  3. Magnetic cleanliness verification approach on tethered satellite

    NASA Technical Reports Server (NTRS)

    Messidoro, Piero; Braghin, Massimo; Grande, Maurizio

    1990-01-01

    Magnetic cleanliness testing was performed on the Tethered Satellite as the last step of an articulated verification campaign aimed at demonstrating the capability of the satellite to support its TEMAG (TEthered MAgnetometer) experiment. Tests at unit level and analytical predictions/correlations using a dedicated mathematical model (GANEW program) are also part of the verification activities. Details of the tests are presented, and the results of the verification are described together with recommendations for later programs.

  4. Using tools for verification, documentation and testing

    NASA Technical Reports Server (NTRS)

    Osterweil, L. J.

    1978-01-01

    Methodologies are discussed on four of the major approaches to program upgrading -- namely dynamic testing, symbolic execution, formal verification and static analysis. The different patterns of strengths, weaknesses and applications of these approaches are shown. It is demonstrated that these patterns are in many ways complementary, offering the hope that they can be coordinated and unified into a single comprehensive program testing and verification system capable of performing a diverse and useful variety of error detection, verification and documentation functions.

  5. Applying Human-performance Models to Designing and Evaluating Nuclear Power Plants: Review Guidance and Technical Basis

    SciTech Connect

    O'Hara, J.M.

    2009-11-30

    Human performance models (HPMs) are simulations of human behavior with which we can predict human performance. Designers use them to support their human factors engineering (HFE) programs for a wide range of complex systems, including commercial nuclear power plants. Applicants to the U.S. Nuclear Regulatory Commission (NRC) can use HPMs for design certifications, operating licenses, and license amendments. In the context of nuclear-plant safety, it is important to assure that HPMs are verified and validated and that their usage is consistent with their intended purpose. Using HPMs improperly may generate misleading or incorrect information, entailing safety concerns. The objective of this research was to develop guidance to support the NRC staff's reviews of an applicant's use of HPMs in an HFE program. The guidance is divided into three topical areas: (1) HPM Verification, (2) HPM Validation, and (3) User Interface Verification. Following this guidance will help ensure the benefits of HPMs are achieved in a technically sound, defensible manner. During the course of developing this guidance, I identified several issues that could not be addressed; they are also discussed.

  6. Scope and verification of a Fissile Material (Cutoff) Treaty

    SciTech Connect

    Hippel, Frank N. von

    2014-05-09

    A Fissile Material Cutoff Treaty (FMCT) would ban the production of fissile material - in practice highly-enriched uranium and separated plutonium - for weapons. It has been supported by strong majorities in the United Nations. After it comes into force, newly produced fissile materials could only be produced under international - most likely International Atomic Energy Agency - monitoring. Many non-weapon states argue that the treaty should also place under safeguards pre-existing stocks of fissile material in civilian use or declared excess for weapons so as to make nuclear-weapons reductions irreversible. This paper discusses the scope of the FMCT, the ability to detect clandestine production and verification challenges in the nuclear-weapons states.

  7. Automated verification of flight software. User's manual

    NASA Technical Reports Server (NTRS)

    Saib, S. H.

    1982-01-01

    AVFS (Automated Verification of Flight Software), a collection of tools for analyzing source programs written in FORTRAN and AED, is documented. The quality and reliability of flight software are improved by: (1) indented listings of source programs, (2) static analysis to detect inconsistencies in the use of variables and parameters, (3) automated documentation, (4) instrumentation of source code, (5) retesting guidance, (6) analysis of assertions, (7) symbolic execution, (8) generation of verification conditions, and (9) simplification of verification conditions. Use of AVFS in the verification of flight software is described.

  8. The SeaHorn Verification Framework

    NASA Technical Reports Server (NTRS)

    Gurfinkel, Arie; Kahsai, Temesghen; Komuravelli, Anvesh; Navas, Jorge A.

    2015-01-01

    In this paper, we present SeaHorn, a software verification framework. The key distinguishing feature of SeaHorn is its modular design that separates the concerns of the syntax of the programming language, its operational semantics, and the verification semantics. SeaHorn encompasses several novelties: it (a) encodes verification conditions using an efficient yet precise inter-procedural technique, (b) provides flexibility in the verification semantics to allow different levels of precision, (c) leverages the state-of-the-art in software model checking and abstract interpretation for verification, and (d) uses Horn-clauses as an intermediate language to represent verification conditions which simplifies interfacing with multiple verification tools based on Horn-clauses. SeaHorn provides users with a powerful verification tool and researchers with an extensible and customizable framework for experimenting with new software verification techniques. The effectiveness and scalability of SeaHorn are demonstrated by an extensive experimental evaluation using benchmarks from SV-COMP 2015 and real avionics code.
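
    The Horn-clause view of verification conditions can be illustrated at a much smaller scale with Z3's Fixedpoint (Spacer) engine, which SeaHorn builds on. The sketch below is not SeaHorn's own API or workflow; it assumes the z3-solver Python package and encodes one tiny loop program by hand.

```python
# Tiny illustration of verification conditions as constrained Horn clauses.
# Program being modelled:  x = 0; while (x < 10) x = x + 1;  assert(x <= 10);

from z3 import Function, IntSort, BoolSort, Ints, Fixedpoint, unsat

inv = Function("inv", IntSort(), BoolSort())   # loop invariant as an unknown predicate
err = Function("err", BoolSort())              # error relation: reachable iff the assertion can fail
x, xp = Ints("x xp")

fp = Fixedpoint()
fp.set(engine="spacer")
fp.register_relation(inv, err)
fp.declare_var(x, xp)

fp.rule(inv(0))                                   # initialization: x = 0
fp.rule(inv(xp), [inv(x), x < 10, xp == x + 1])   # loop body: x < 10  ->  x' = x + 1
fp.rule(err(), [inv(x), x >= 10, x > 10])         # loop exit (x >= 10) and assertion violation (x > 10)

result = fp.query(err())                          # unsat: err is unreachable, assertion holds
print("assertion holds" if result == unsat else "counterexample found")
```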

  9. Input apparatus for dynamic signature verification systems

    DOEpatents

    EerNisse, Errol P.; Land, Cecil E.; Snelling, Jay B.

    1978-01-01

    The disclosure relates to signature verification input apparatus comprising a writing instrument and platen containing piezoelectric transducers which generate signals in response to writing pressures.

  10. Hydrologic data-verification management program plan

    USGS Publications Warehouse

    Alexander, C.W.

    1982-01-01

    Data verification refers to the performance of quality control on hydrologic data that have been retrieved from the field and are being prepared for dissemination to water-data users. Water-data users now have access to computerized data files containing unpublished, unverified hydrologic data. Therefore, it is necessary to develop techniques and systems whereby the computer can perform some data-verification functions before the data are stored in user-accessible files. Computerized data-verification routines can be developed for this purpose. A single, unified concept, describing a master data-verification program that uses multiple special-purpose subroutines and a screen file containing verification criteria, can probably be adapted to any type and size of computer-processing system. Some traditional manual-verification procedures can be adapted for computerized verification, but new procedures can also be developed that would take advantage of the powerful statistical tools and data-handling procedures available to the computer. Prototype data-verification systems should be developed for all three data-processing environments as soon as possible. The WATSTORE system probably affords the greatest opportunity for long-range research and testing of new verification subroutines. (USGS)
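
    A sketch of the kind of master verification routine the abstract envisions, with a "screen file" of station-specific criteria driving special-purpose checks (the check names and thresholds are hypothetical, not WATSTORE code):

```python
# Illustrative sketch of computerized data-verification subroutines: a master
# routine applies special-purpose checks driven by per-station screen criteria.

import numpy as np

def range_check(values, lo, hi):
    """Flag values outside the physically plausible range for the station."""
    return (values < lo) | (values > hi)

def rate_of_change_check(values, max_step):
    """Flag samples that jump more than max_step from the previous sample."""
    flags = np.zeros(len(values), dtype=bool)
    flags[1:] = np.abs(np.diff(values)) > max_step
    return flags

def verify_record(values, screen):
    """Master routine: run every check listed in the screen criteria, OR the flags."""
    flags = range_check(values, screen["min"], screen["max"])
    flags |= rate_of_change_check(values, screen["max_step"])
    return flags

if __name__ == "__main__":
    # Hourly stage readings (ft) with one spike and one out-of-range value.
    stage = np.array([4.1, 4.2, 4.2, 9.8, 4.3, 4.3, -1.0, 4.4])
    screen = {"min": 0.0, "max": 8.0, "max_step": 1.5}    # per-station screen criteria
    print(np.flatnonzero(verify_record(stage, screen)))   # indices flagged for review
```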

  11. 40 CFR 1065.550 - Gas analyzer range verification and drift verification.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... Cycles § 1065.550 Gas analyzer range verification and drift verification. (a) Range verification. If an... with a CLD and the removed water is corrected based on measured CO2, CO, THC, and NOX concentrations...-specific emissions over the entire duty cycle for drift. For each constituent to be verified, both sets...

  12. Task Force Report: Assessment of Nuclear Monitoring and Verification Technologies

    DTIC Science & Technology

    2014-01-01

    ... and other players. 4. For both cooperative and unilateral actions, a paradigm shift is called for that includes: creating a national strategy ... a methodology which allows for the "back and forth" between the problem space and solution space; i.e., the tracking, integration, and trade ... foundation for a bridging methodology, or systematic mapping between the problem space and solution space, that enables increased ...

  13. Regional Seismic Arrays and Nuclear Test Ban Verification

    DTIC Science & Technology

    1990-12-01

    ... part of the Baltic Shield based on joint interpretation of seismic, gravity, magnetic and heat flow data, Tectonophysics 162, 151-164. ... Milanovsky, S. Yu. (1984). Deep Geothermal Structure and Mantle Heat Flow Along the Barents Sea - East Alps Geotraverse, Tectonophysics 103, 175-... ... and I. S. Sacks (1989). Anelasticity and thermal structure of the oceanic upper mantle: Temperature calibration with heat flow data, J. Geophys. Res.

  14. Nuclear Fabrication Consortium

    SciTech Connect

    Levesque, Stephen

    2013-04-05

    This report summarizes the activities undertaken by EWI while under contract from the Department of Energy (DOE) Office of Nuclear Energy (NE) for the management and operation of the Nuclear Fabrication Consortium (NFC). The NFC was established by EWI to independently develop, evaluate, and deploy fabrication approaches and data that support the re-establishment of the U.S. nuclear industry, ensuring that the supply chain will be competitive on a global stage and enabling more cost-effective and reliable nuclear power in a carbon-constrained environment. The NFC provided a forum for member original equipment manufacturers (OEMs), fabricators, manufacturers, and materials suppliers to effectively engage with each other and rebuild the capacity of this supply chain by: identifying and removing impediments to the implementation of new construction and fabrication techniques and approaches for nuclear equipment, including system components and nuclear plants; providing and facilitating detailed scientific-based studies on new approaches and technologies that will have positive impacts on the cost of building nuclear plants; analyzing and disseminating information about future nuclear fabrication technologies and how they could impact the North American and international nuclear marketplace; facilitating dialog and initiating alignment among fabricators, owners, trade associations, and government agencies; supporting industry in helping to create a larger qualified nuclear supplier network; acting as an unbiased technology resource to evaluate, develop, and demonstrate new manufacturing technologies; creating welder and inspector training programs to help enable the necessary workforce for the upcoming construction work; and serving as a focal point for technology, policy, and politically interested parties to share ideas and concepts associated with fabrication across the nuclear industry. The report presents the objectives and summaries of the Nuclear Fabrication Consortium activities.

  15. 80 FR 48955 - Pipeline Safety: Public Workshop on Hazardous Liquid Integrity Verification Process

    Federal Register 2010, 2011, 2012, 2013, 2014

    2015-08-14

    ... Verification Process AGENCY: Pipeline and Hazardous Materials Safety Administration, DOT. ACTION: Notice of... Process for gas transmission pipelines to help address several mandates in the Pipeline Safety, Regulatory...] [FR Doc No: 2015-20065] DEPARTMENT OF TRANSPORTATION Pipeline and Hazardous Materials...

  16. U.S. EPA Environmental Technology Verification Program, the Founder of the ETV Concept

    EPA Science Inventory

    The U.S. EPA Environmental Technology Verification (ETV) Program develops test protocols and verifies the performance of innovative technologies that have the potential to improve protection of human health and the environment. The program was created in 1995 to help accelerate t...

  17. Ionoacoustics: A new direct method for range verification

    NASA Astrophysics Data System (ADS)

    Parodi, Katia; Assmann, Walter

    2015-05-01

    The superior ballistic properties of ion beams may offer improved tumor-dose conformality and unprecedented sparing of organs at risk in comparison to other radiation modalities in external radiotherapy. However, these advantages come at the expense of increased sensitivity to uncertainties in the actual treatment delivery, resulting from inaccuracies of patient positioning, physiological motion and uncertainties in the knowledge of the ion range in living tissue. In particular, the dosimetric selectivity of ion beams depends on the longitudinal location of the Bragg peak, making in vivo knowledge of the actual beam range the greatest challenge to full clinical exploitation of ion therapy. Nowadays, in vivo range verification techniques, which are already being investigated in clinical practice or are close to it, rely on the detection of the secondary annihilation photons or prompt gammas resulting from nuclear interactions of the primary ion beam with the irradiated tissue. Despite the initial promising results, these methods rely on a correlation between nuclear and electromagnetic processes that is not straightforward, and they typically require massive and costly instrumentation. By contrast, the long-known yet only recently revisited process of "ionoacoustics", which is generated by local tissue heating especially at the Bragg peak, may offer a more direct approach to in vivo range verification, as reviewed here.

  18. Neutron spectrometry for UF6 enrichment verification in storage cylinders

    DOE PAGES

    Mengesha, Wondwosen; Kiff, Scott D.

    2015-01-29

    Verification of declared UF6 enrichment and mass in storage cylinders is of great interest in nuclear material nonproliferation. Nondestructive assay (NDA) techniques are commonly used for safeguards inspections to ensure accountancy of declared nuclear materials. Common NDA techniques used include gamma-ray spectrometry and both passive and active neutron measurements. In the present study, neutron spectrometry was investigated for verification of UF6 enrichment in 30B storage cylinders based on an unattended and passive measurement approach. MCNP5 and Geant4 simulated neutron spectra, for selected UF6 enrichments and filling profiles, were used in the investigation. The simulated neutron spectra were analyzed using principal component analysis (PCA). The PCA technique is a well-established technique and has a wide area of application including feature analysis, outlier detection, and gamma-ray spectral analysis. Results obtained demonstrate that neutron spectrometry supported by spectral feature analysis has potential for assaying UF6 enrichment in storage cylinders. The results from the present study also showed that difficulties associated with the UF6 filling profile and observed in other unattended passive neutron measurements can possibly be overcome using the approach presented.
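
    An illustrative sketch of PCA applied to a set of simulated spectra (not the study's code; the spectra below are synthetic stand-ins, and NumPy's SVD is used rather than any particular PCA library):

```python
# Minimal PCA sketch for spectral feature analysis: rows of `spectra` are
# individual neutron spectra (counts per energy bin); the principal components
# capture the dominant enrichment- or geometry-dependent spectral shapes.

import numpy as np

def pca(spectra, n_components=2):
    X = np.asarray(spectra, float)
    X = X / X.sum(axis=1, keepdims=True)           # normalize each spectrum
    Xc = X - X.mean(axis=0)                        # mean-center across spectra
    U, S, Vt = np.linalg.svd(Xc, full_matrices=False)
    scores = U[:, :n_components] * S[:n_components]   # projection of each spectrum
    loadings = Vt[:n_components]                       # principal spectral shapes
    explained = (S**2 / np.sum(S**2))[:n_components]
    return scores, loadings, explained

if __name__ == "__main__":
    rng = np.random.default_rng(2)
    base = np.exp(-np.linspace(0, 5, 64))              # toy spectral shape, 64 energy bins
    spectra = [base ** (1 + 0.05 * k) + rng.normal(0, 0.01, 64) + 1.0 for k in range(6)]
    scores, loadings, explained = pca(spectra)
    print("explained variance ratios:", np.round(explained, 3))
```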

  19. Hailstorms over Switzerland: Verification of Crowd-sourced Data

    NASA Astrophysics Data System (ADS)

    Noti, Pascal-Andreas; Martynov, Andrey; Hering, Alessandro; Martius, Olivia

    2016-04-01

    The reports of smartphone users witnessing hailstorms can be used as a source of independent, ground-based observation data on ground-reaching hailstorms with high temporal and spatial resolution. The presented work focuses on the verification of crowd-sourced data collected over Switzerland with the help of a smartphone application recently developed by MeteoSwiss. The precise location and time of hail precipitation and the hailstone size are included in the crowd-sourced data, assessed on the basis of the weather radar data of MeteoSwiss. Two radar-based hail detection algorithms in use at MeteoSwiss, POH (Probability of Hail) and MESHS (Maximum Expected Severe Hail Size), are compared against the crowd-sourced data. The available data and investigation period span June to August 2015. Filter criteria have been applied in order to remove false reports from the crowd-sourced data. Neighborhood methods have been introduced to reduce the uncertainties which result from spatial and temporal biases. The crowd-sourced and radar data are converted into binary sequences according to previously set thresholds, allowing for a categorical verification. Verification scores (e.g., hit rate) are then calculated from a 2x2 contingency table. The hail reporting activity and the patterns corresponding to "hail" and "no hail" reports sent from smartphones have been analyzed. The relationship between the reported hailstone sizes and both radar-based hail detection algorithms has been investigated.
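
    The categorical verification described above reduces to a 2x2 contingency table once both data sources are binarized; a minimal sketch with illustrative data (not the study's) is shown below.

```python
# Minimal sketch of categorical verification: two binary "hail"/"no hail"
# sequences are summarized in a 2x2 contingency table with standard scores.

import numpy as np

def contingency_scores(reference, detection):
    """reference, detection: binary arrays (1 = hail). Returns common skill scores."""
    reference, detection = np.asarray(reference, bool), np.asarray(detection, bool)
    hits = int(np.sum(detection & reference))
    misses = int(np.sum(~detection & reference))
    false_alarms = int(np.sum(detection & ~reference))
    correct_negatives = int(np.sum(~detection & ~reference))
    return {
        "hit_rate": hits / (hits + misses) if hits + misses else float("nan"),
        "false_alarm_ratio": false_alarms / (hits + false_alarms) if hits + false_alarms else float("nan"),
        "critical_success_index": hits / (hits + misses + false_alarms) if hits + misses + false_alarms else float("nan"),
        "table": (hits, misses, false_alarms, correct_negatives),
    }

if __name__ == "__main__":
    crowd_reports = [1, 1, 0, 0, 1, 0, 0, 1]   # smartphone reports per grid cell / time step
    radar_detect  = [1, 0, 0, 1, 1, 0, 0, 1]   # radar algorithm exceeding a chosen threshold
    print(contingency_scores(crowd_reports, radar_detect))
```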

  20. Verification of heterogeneous multi-agent system using MCMAS

    NASA Astrophysics Data System (ADS)

    Choi, Jiyoung; Kim, Seungkeun; Tsourdos, Antonios

    2015-03-01

    The focus of the paper is how to model the autonomous behaviours of heterogeneous multi-agent systems such that it can be verified that they will always operate within predefined mission requirements and constraints. This is done by using formal methods, with an abstraction of the behaviour modelling and model checking for their verification. Three case studies are presented to verify the decision-making behaviours of a heterogeneous multi-agent system using a convoy mission scenario. The multi-agent system in the case studies has been extended by gradually increasing the number of agents and the function complexity. For automatic verification, the model checker for multi-agent systems (MCMAS) is adopted due to its novel capability to accommodate the multi-agent system, and it successfully verifies the targeted behaviours of the team-level autonomous systems. The verification results retrospectively help improve the design of the decision-making algorithms by considering additional agents and behaviours during three steps of scenario modification. Consequently, the last scenario deals with a system composed of a ground control system, two unmanned aerial vehicles, and four unmanned ground vehicles with fault-tolerant and communication relay capabilities.
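
    MCMAS itself takes ISPL models and checks epistemic and temporal-logic specifications; as a language-agnostic illustration of the underlying idea, the sketch below exhaustively explores the reachable states of a toy convoy model (hypothetical, not the paper's model) and confirms a safety invariant in every state.

```python
# Toy explicit-state model check: breadth-first exploration of a two-vehicle
# convoy model's state space, confirming a safety property in every reachable state.

from collections import deque

LAST = 5                       # last cell of a one-dimensional convoy route

def successors(state):
    """UAV may hold or advance one cell; the UGV's controller advances only
    while it stays strictly behind the UAV (the design rule being verified)."""
    uav, ugv = state
    for d in (0, 1):
        nxt_uav = min(uav + d, LAST)
        yield (nxt_uav, ugv)                 # UGV holds position
        if ugv + 1 < nxt_uav:                # UGV advances, keeping a gap
            yield (nxt_uav, ugv + 1)

def safe(state):
    """Safety property: the ground vehicle never reaches or passes its escort."""
    uav, ugv = state
    return ugv < uav

def check_model(initial):
    """Exhaustive breadth-first exploration of the reachable state space."""
    seen, frontier = {initial}, deque([initial])
    while frontier:
        state = frontier.popleft()
        if not safe(state):
            return False, state              # counterexample state
        for nxt in successors(state):
            if nxt not in seen:
                seen.add(nxt)
                frontier.append(nxt)
    return True, None

if __name__ == "__main__":
    ok, witness = check_model(initial=(1, 0))
    print("property holds in all reachable states" if ok else f"violated in {witness}")
```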

  1. Nuclear rights - nuclear wrongs

    SciTech Connect

    Paul, E.F.; Miller, F.D.; Paul, J.; Ahrens, J.

    1986-01-01

    This book contains 11 selections. The titles are: Three Ways to Kill Innocent Bystanders: Some Conundrums Concerning the Morality of War; The International Defense of Liberty; Two Concepts of Deterrence; Nuclear Deterrence and Arms Control: Ethical Issues for the 1980s; The Moral Status of Nuclear Deterrent Threats; Optimal Deterrence; Morality and Paradoxical Deterrence; Immoral Risks: A Deontological Critique of Nuclear Deterrence; No War Without Dictatorship, No Peace Without Democracy: Foreign Policy as Domestic Politics; Marxism-Leninism and its Strategic Implications for the United States; Tocqueville War.

  2. Refiners get petchems help

    SciTech Connect

    Wood, A.; Cornitius, T.

    1997-06-11

    The U.S. refining industry is facing hard times. Slow growth, tough environmental regulations, and fierce competition - especially in retail gasoline - have squeezed margins and prompted a series of mergers and acquisitions. The trend has affected the smallest and largest players, and a series of transactions over the past two years has created a new industry lineup. Among the larger companies, Mobil and Amoco are the latest to consider a refining merger. That follows recent plans by Ashland and Marathon to merge their refining businesses, and the decision by Shell, Texaco, and Saudi Aramco to combine some U.S. operations. Many of the leading independent refiners have increased their scale by acquiring refinery capacity. With refining still in the doldrums, more independents are taking a closer look at boosting production of petrochemicals, which offer high growth and, usually, better margins. That is being helped by the shift to refinery processes that favor the increased production of light olefins for alkylation and the removal of aromatics, providing an opportunity to extract these materials for the petrochemical market. 5 figs., 3 tabs.

  3. Congress: how to help.

    PubMed

    James, J S

    1995-04-21

    Citizen input, through letters, calls, and visits to government representatives, is needed more urgently now than ever before. The fiscal 1996 budget and appropriations process is expected to provide disappointments. The House has eliminated HOPWA AIDS housing funding for the current year (although it could be reversed in the Senate). Moves are being made toward mandatory HIV testing, with no provisions for counseling or for care. There is no mass movement yet to support AIDS politically, and there is no single or consistent source for connecting with local organizations, or getting the necessary background information, as issues become current. This article lists several national and regional organizations which may be helpful in developing this process. National organizations with an AIDS focus include the National Association of People with AIDS, AIDS Action Council, Treatment Action Network, Mobilization Against AIDS, Center for Women Policy Studies, National Minority AIDS Council, Committee of Ten Thousand, and Mothers' Voice. Gay-focused national organizations include the Log Cabin Republicans and the Human Rights Campaign Fund. Many states have organizations which provide state and regional information on AIDS-related issues. Three major lobbying events include AIDSWATCH 95, Mother's Day Card Campaign, and the California AIDS Budget Lobby Day.

  4. Structural dynamics verification facility study

    NASA Technical Reports Server (NTRS)

    Kiraly, L. J.; Hirchbein, M. S.; Mcaleese, J. M.; Fleming, D. P.

    1981-01-01

    The need for a structural dynamics verification facility to support structures programs was studied. Most of the industry operated facilities are used for highly focused research, component development, and problem solving, and are not used for the generic understanding of the coupled dynamic response of major engine subsystems. Capabilities for the proposed facility include: the ability to both excite and measure coupled structural dynamic response of elastic blades on elastic shafting, the mechanical simulation of various dynamical loadings representative of those seen in operating engines, and the measurement of engine dynamic deflections and interface forces caused by alternative engine mounting configurations and compliances.

  5. Why do verification and validation?

    DOE PAGES

    Hu, Kenneth T.; Paez, Thomas L.

    2016-02-19

    In this discussion paper, we explore different ways to assess the value of verification and validation (V&V) of engineering models. We first present a literature review on the value of V&V and then use value chains and decision trees to show how value can be assessed from a decision maker's perspective. In this context, the value is what the decision maker is willing to pay for V&V analysis with the understanding that the V&V results are uncertain. As a result, the 2014 Sandia V&V Challenge Workshop is used to illustrate these ideas.

  6. Approaches to wind resource verification

    NASA Technical Reports Server (NTRS)

    Barchet, W. R.

    1982-01-01

    Verification of the regional wind energy resource assessments produced by the Pacific Northwest Laboratory addresses the question: Is the magnitude of the resource given in the assessments truly representative of the area of interest? Approaches using qualitative indicators of wind speed (tree deformation, eolian features), old and new data of opportunity not at sites specifically chosen for their exposure to the wind, and data by design from locations specifically selected to be good wind sites are described. Data requirements and evaluation procedures for verifying the resource are discussed.

  7. Double patterning from design enablement to verification

    NASA Astrophysics Data System (ADS)

    Abercrombie, David; Lacour, Pat; El-Sewefy, Omar; Volkov, Alex; Levine, Evgueni; Arb, Kellen; Reid, Chris; Li, Qiao; Ghosh, Pradiptya

    2011-11-01

    Litho-etch-litho-etch (LELE) is the double patterning (DP) technology of choice for 20 nm contact, via, and lower metal layers. We discuss the unique design and process characteristics of LELE DP, the challenges they present, and various solutions. We examine DP design methodologies, current DP conflict feedback mechanisms, and how they can help designers identify and resolve conflicts. In place and route (P&R), the placement engine must now be aware of the assumptions made during IP cell design and use the placement directives provided by the library designer. We examine the new effects DP introduces in detail routing, discuss how multiple choices of LELE and the cut allowances can lead to different solutions, and describe new capabilities required by detail routers and P&R engines. We discuss why LELE DP cuts and overlaps are critical to optical process correction (OPC), and how a hybrid mechanism of rule- and model-based overlap generation can provide a fast and effective solution. With two litho-etch steps, mask misalignment and image rounding are now verification considerations. We present enhancements to the OPCVerify engine that check for pinching and bridging in the presence of DP overlay errors and acute angles.
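
    LELE decomposition is commonly cast as two-coloring of a conflict graph, with odd cycles reported back to the designer as DP conflicts; the sketch below is a simplified illustration of that feedback idea, not a production decomposition engine, and the layout data are hypothetical.

```python
# Simplified double-patterning conflict detection: nodes are layout polygons,
# edges connect polygons closer than the same-mask spacing limit, and a failed
# two-coloring (odd cycle) is reported as a DP conflict.

from collections import deque

def two_color(n_polygons, conflict_edges):
    """Assign each polygon to mask 0 or 1; return (coloring, conflicts)."""
    adj = {i: [] for i in range(n_polygons)}
    for a, b in conflict_edges:
        adj[a].append(b)
        adj[b].append(a)

    color, conflicts = {}, []
    for start in range(n_polygons):
        if start in color:
            continue
        color[start] = 0
        queue = deque([start])
        while queue:
            u = queue.popleft()
            for v in adj[u]:
                if v not in color:
                    color[v] = 1 - color[u]
                    queue.append(v)
                elif color[v] == color[u] and u < v:
                    conflicts.append((u, v))   # odd cycle: DP conflict to feed back to the designer
    return color, conflicts

if __name__ == "__main__":
    # Polygons 0-1-2 form a triangle of sub-minimum spacings: not two-colorable.
    coloring, conflicts = two_color(4, [(0, 1), (1, 2), (2, 0), (2, 3)])
    print("mask assignment:", coloring)
    print("DP conflicts:", conflicts)
```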

  8. CFE verification: The decision to inspect

    SciTech Connect

    Allentuck, J.

    1990-01-01

    Verification of compliance with the provisions of the treaty on Conventional Forces-Europe (CFE) is subject to inspection quotas of various kinds. Thus the decision to carry out a specific inspection or verification activity must be prudently made. This decision process is outlined, and means for "conserving quotas" are suggested. 4 refs., 1 fig.

  9. Identity Verification, Control, and Aggression in Marriage

    ERIC Educational Resources Information Center

    Stets, Jan E.; Burke, Peter J.

    2005-01-01

    In this research we study the identity verification process and its effects in marriage. Drawing on identity control theory, we hypothesize that a lack of verification in the spouse identity (1) threatens stable self-meanings and interaction patterns between spouses, and (2) challenges a (nonverified) spouse's perception of control over the…

  10. 14 CFR 460.17 - Verification program.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... TRANSPORTATION LICENSING HUMAN SPACE FLIGHT REQUIREMENTS Launch and Reentry with Crew § 460.17 Verification... software in an operational flight environment before allowing any space flight participant on board during... 14 Aeronautics and Space 4 2014-01-01 2014-01-01 false Verification program. 460.17 Section...

  11. 14 CFR 460.17 - Verification program.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... TRANSPORTATION LICENSING HUMAN SPACE FLIGHT REQUIREMENTS Launch and Reentry with Crew § 460.17 Verification... software in an operational flight environment before allowing any space flight participant on board during... 14 Aeronautics and Space 4 2013-01-01 2013-01-01 false Verification program. 460.17 Section...

  12. 14 CFR 460.17 - Verification program.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... TRANSPORTATION LICENSING HUMAN SPACE FLIGHT REQUIREMENTS Launch and Reentry with Crew § 460.17 Verification... software in an operational flight environment before allowing any space flight participant on board during... 14 Aeronautics and Space 4 2012-01-01 2012-01-01 false Verification program. 460.17 Section...

  13. 24 CFR 4001.112 - Income verification.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... 24 Housing and Urban Development 5 2010-04-01 2010-04-01 false Income verification. 4001.112... Requirements and Underwriting Procedures § 4001.112 Income verification. The mortgagee shall use FHA's procedures to verify the mortgagor's income and shall comply with the following additional requirements:...

  14. 24 CFR 4001.112 - Income verification.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... 24 Housing and Urban Development 5 2011-04-01 2011-04-01 false Income verification. 4001.112... Requirements and Underwriting Procedures § 4001.112 Income verification. The mortgagee shall use FHA's procedures to verify the mortgagor's income and shall comply with the following additional requirements:...

  15. IMPROVING AIR QUALITY THROUGH ENVIRONMENTAL TECHNOLOGY VERIFICATIONS

    EPA Science Inventory

    The U.S. Environmental Protection Agency (EPA) began the Environmental Technology Verification (ETV) Program in 1995 as a means of working with the private sector to establish a market-based verification process available to all environmental technologies. Under EPA's Office of R...

  16. 18 CFR 158.5 - Verification.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... 18 Conservation of Power and Water Resources 1 2011-04-01 2011-04-01 false Verification. 158.5 Section 158.5 Conservation of Power and Water Resources FEDERAL ENERGY REGULATORY COMMISSION, DEPARTMENT....5 Verification. The facts stated in the memorandum must be sworn to by persons having...

  17. 18 CFR 158.5 - Verification.

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... 18 Conservation of Power and Water Resources 1 2012-04-01 2012-04-01 false Verification. 158.5 Section 158.5 Conservation of Power and Water Resources FEDERAL ENERGY REGULATORY COMMISSION, DEPARTMENT....5 Verification. The facts stated in the memorandum must be sworn to by persons having...

  18. 18 CFR 286.107 - Verification.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... 18 Conservation of Power and Water Resources 1 2011-04-01 2011-04-01 false Verification. 286.107 Section 286.107 Conservation of Power and Water Resources FEDERAL ENERGY REGULATORY COMMISSION, DEPARTMENT... Contested Audit Findings and Proposed Remedies § 286.107 Verification. The facts stated in the...

  19. 18 CFR 41.5 - Verification.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... 18 Conservation of Power and Water Resources 1 2011-04-01 2011-04-01 false Verification. 41.5 Section 41.5 Conservation of Power and Water Resources FEDERAL ENERGY REGULATORY COMMISSION, DEPARTMENT OF....5 Verification. The facts stated in the memorandum must be sworn to by persons having...

  20. 18 CFR 349.5 - Verification.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... 18 Conservation of Power and Water Resources 1 2011-04-01 2011-04-01 false Verification. 349.5 Section 349.5 Conservation of Power and Water Resources FEDERAL ENERGY REGULATORY COMMISSION, DEPARTMENT... PROPOSED REMEDIES § 349.5 Verification. The facts stated in the memorandum must be sworn to by...

  1. 18 CFR 286.107 - Verification.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... 18 Conservation of Power and Water Resources 1 2010-04-01 2010-04-01 false Verification. 286.107 Section 286.107 Conservation of Power and Water Resources FEDERAL ENERGY REGULATORY COMMISSION, DEPARTMENT... Contested Audit Findings and Proposed Remedies § 286.107 Verification. The facts stated in the...

  2. 18 CFR 41.5 - Verification.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... 18 Conservation of Power and Water Resources 1 2010-04-01 2010-04-01 false Verification. 41.5 Section 41.5 Conservation of Power and Water Resources FEDERAL ENERGY REGULATORY COMMISSION, DEPARTMENT OF....5 Verification. The facts stated in the memorandum must be sworn to by persons having...

  3. 18 CFR 41.5 - Verification.

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... 18 Conservation of Power and Water Resources 1 2012-04-01 2012-04-01 false Verification. 41.5 Section 41.5 Conservation of Power and Water Resources FEDERAL ENERGY REGULATORY COMMISSION, DEPARTMENT OF....5 Verification. The facts stated in the memorandum must be sworn to by persons having...

  4. 18 CFR 349.5 - Verification.

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... 18 Conservation of Power and Water Resources 1 2012-04-01 2012-04-01 false Verification. 349.5 Section 349.5 Conservation of Power and Water Resources FEDERAL ENERGY REGULATORY COMMISSION, DEPARTMENT... PROPOSED REMEDIES § 349.5 Verification. The facts stated in the memorandum must be sworn to by...

  5. 18 CFR 158.5 - Verification.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... 18 Conservation of Power and Water Resources 1 2010-04-01 2010-04-01 false Verification. 158.5 Section 158.5 Conservation of Power and Water Resources FEDERAL ENERGY REGULATORY COMMISSION, DEPARTMENT....5 Verification. The facts stated in the memorandum must be sworn to by persons having...

  6. 18 CFR 349.5 - Verification.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... 18 Conservation of Power and Water Resources 1 2010-04-01 2010-04-01 false Verification. 349.5 Section 349.5 Conservation of Power and Water Resources FEDERAL ENERGY REGULATORY COMMISSION, DEPARTMENT... PROPOSED REMEDIES § 349.5 Verification. The facts stated in the memorandum must be sworn to by...

  7. 47 CFR 2.902 - Verification.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... 47 Telecommunication 1 2013-10-01 2013-10-01 false Verification. 2.902 Section 2.902 Telecommunication FEDERAL COMMUNICATIONS COMMISSION GENERAL FREQUENCY ALLOCATIONS AND RADIO TREATY MATTERS; GENERAL RULES AND REGULATIONS Equipment Authorization Procedures General Provisions § 2.902 Verification....

  8. 47 CFR 2.902 - Verification.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... 47 Telecommunication 1 2012-10-01 2012-10-01 false Verification. 2.902 Section 2.902 Telecommunication FEDERAL COMMUNICATIONS COMMISSION GENERAL FREQUENCY ALLOCATIONS AND RADIO TREATY MATTERS; GENERAL RULES AND REGULATIONS Equipment Authorization Procedures General Provisions § 2.902 Verification....

  9. 47 CFR 2.902 - Verification.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... 47 Telecommunication 1 2014-10-01 2014-10-01 false Verification. 2.902 Section 2.902 Telecommunication FEDERAL COMMUNICATIONS COMMISSION GENERAL FREQUENCY ALLOCATIONS AND RADIO TREATY MATTERS; GENERAL RULES AND REGULATIONS Equipment Authorization Procedures General Provisions § 2.902 Verification....

  10. 47 CFR 2.902 - Verification.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 47 Telecommunication 1 2010-10-01 2010-10-01 false Verification. 2.902 Section 2.902 Telecommunication FEDERAL COMMUNICATIONS COMMISSION GENERAL FREQUENCY ALLOCATIONS AND RADIO TREATY MATTERS; GENERAL RULES AND REGULATIONS Equipment Authorization Procedures General Provisions § 2.902 Verification....

  11. 47 CFR 2.902 - Verification.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... 47 Telecommunication 1 2011-10-01 2011-10-01 false Verification. 2.902 Section 2.902 Telecommunication FEDERAL COMMUNICATIONS COMMISSION GENERAL FREQUENCY ALLOCATIONS AND RADIO TREATY MATTERS; GENERAL RULES AND REGULATIONS Equipment Authorization Procedures General Provisions § 2.902 Verification....

  12. ENVIRONMENTAL TECHNOLOGY VERIFICATION FOR INDOOR AIR PRODUCTS

    EPA Science Inventory

    The paper discusses environmental technology verification (ETV) for indoor air products. RTI is developing the framework for a verification testing program for indoor air products, as part of EPA's ETV program. RTI is establishing test protocols for products that fit into three...

  13. Guidelines for qualifying cleaning and verification materials

    NASA Technical Reports Server (NTRS)

    Webb, D.

    1995-01-01

    This document is intended to provide guidance in identifying technical issues which must be addressed in a comprehensive qualification plan for materials used in cleaning and cleanliness verification processes. Information presented herein is intended to facilitate development of a definitive checklist that should address all pertinent materials issues when down selecting a cleaning/verification media.

  14. Gender Verification of Female Olympic Athletes.

    ERIC Educational Resources Information Center

    Dickinson, Barry D.; Genel, Myron; Robinowitz, Carolyn B.; Turner, Patricia L.; Woods, Gary L.

    2002-01-01

    Gender verification of female athletes has long been criticized by geneticists, endocrinologists, and others in the medical community. Recently, the International Olympic Committee's Athletic Commission called for discontinuation of mandatory laboratory-based gender verification of female athletes. This article discusses normal sexual…

  15. 14 CFR 460.17 - Verification program.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... a flight. Verification must include flight testing. ... 14 Aeronautics and Space 4 2010-01-01 2010-01-01 false Verification program. 460.17 Section 460.17... TRANSPORTATION LICENSING HUMAN SPACE FLIGHT REQUIREMENTS Launch and Reentry with Crew § 460.17...

  16. ENVIRONMENTAL TECHNOLOGY VERIFICATION AND INDOOR AIR

    EPA Science Inventory

    The paper discusses environmental technology verification and indoor air. RTI has responsibility for a pilot program for indoor air products as part of the U.S. EPA's Environmental Technology Verification (ETV) program. The program objective is to further the development of sel...

  17. Groundwater flow code verification "benchmarking" activity (COVE-2A): Analysis of participants' work

    SciTech Connect

    Dykhuizen, R.C.; Barnard, R.W.

    1992-02-01

    The Nuclear Waste Repository Technology Department at Sandia National Laboratories (SNL) is investigating the suitability of Yucca Mountain as a potential site for underground burial of nuclear wastes. One element of the investigations is to assess the potential long-term effects of groundwater flow on the integrity of a potential repository. A number of computer codes are being used to model groundwater flow through geologic media in which the potential repository would be located. These codes compute numerical solutions for problems that are usually analytically intractable. Consequently, independent confirmation of the correctness of the solution is often not possible. Code verification is a process that permits the determination of the numerical accuracy of codes by comparing the results of several numerical solutions for the same problem. The international nuclear waste research community uses benchmarking for intercomparisons that partially satisfy the Nuclear Regulatory Commission (NRC) definition of code verification. This report presents the results from the COVE-2A (Code Verification) project, which is a subset of the COVE project.

  18. Cognitive Bias in Systems Verification

    NASA Technical Reports Server (NTRS)

    Larson, Steve

    2012-01-01

    Working definition of cognitive bias: Patterns by which information is sought and interpreted that can lead to systematic errors in decisions. Cognitive bias is used in diverse fields: Economics, Politics, Intelligence, Marketing, to name a few. Attempts to ground cognitive science in physical characteristics of the cognitive apparatus exceed our knowledge. Studies are based on correlations; strict cause and effect is difficult to pinpoint. Effects cited in the paper and discussed here have been replicated many times over, and appear sound. Many biases have been described, but it is still unclear whether they are all distinct. There may only be a handful of fundamental biases, which manifest in various ways. Bias can affect system verification in many ways: Overconfidence -> Questionable decisions to deploy. Availability -> Inability to conceive critical tests. Representativeness -> Overinterpretation of results. Positive Test Strategies -> Confirmation bias. Debiasing at the individual level is very difficult. The potential effect of bias on the verification process can be managed, but not eliminated. It is worth considering at key points in the process.

  19. Video-based fingerprint verification.

    PubMed

    Qin, Wei; Yin, Yilong; Liu, Lili

    2013-09-04

    Conventional fingerprint verification systems use only static information. In this paper, fingerprint videos, which contain dynamic information, are utilized for verification. Fingerprint videos are acquired by the same capture device that acquires conventional fingerprint images, and the user experience of providing a fingerprint video is the same as that of providing a single impression. After preprocessing and aligning processes, "inside similarity" and "outside similarity" are defined and calculated to take advantage of both dynamic and static information contained in fingerprint videos. Match scores between two matching fingerprint videos are then calculated by combining the two kinds of similarity. Experimental results show that the proposed video-based method leads to a relative reduction of 60 percent in the equal error rate (EER) in comparison to the conventional single impression-based method. We also analyze the time complexity of our method when different combinations of strategies are used. Our method still outperforms the conventional method, even if both methods have the same time complexity. Finally, experimental results demonstrate that the proposed video-based method can lead to better accuracy than the multiple impressions fusion method, and the proposed method has a much lower false acceptance rate (FAR) when the false rejection rate (FRR) is quite low.
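
    To make the score-fusion step concrete, the following minimal Python sketch combines an "inside" and an "outside" similarity into one match score and thresholds it for a verification decision. It is not the authors' code; the weighting and the threshold are hypothetical tuning parameters.

        def fuse_similarities(inside_sim, outside_sim, w_inside=0.5):
            # Weighted-sum fusion of the two similarity values; the weight is
            # an illustrative tuning parameter, not a value from the paper.
            return w_inside * inside_sim + (1.0 - w_inside) * outside_sim

        def verify(inside_sim, outside_sim, threshold=0.6):
            # Accept the claimed identity when the fused score clears a
            # threshold chosen on a development set (value is illustrative).
            return fuse_similarities(inside_sim, outside_sim) >= threshold

        print(verify(0.82, 0.74))  # a genuine attempt scoring high on both -> True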

  20. Quantitative reactive modeling and verification.

    PubMed

    Henzinger, Thomas A

    Formal verification aims to improve the quality of software by detecting errors before they do harm. At the basis of formal verification is the logical notion of correctness, which purports to capture whether or not a program behaves as desired. We suggest that the boolean partition of software into correct and incorrect programs falls short of the practical need to assess the behavior of software in a more nuanced fashion against multiple criteria. We therefore propose to introduce quantitative fitness measures for programs, specifically for measuring the function, performance, and robustness of reactive programs such as concurrent processes. This article describes the goals of the ERC Advanced Investigator Project QUAREM. The project aims to build and evaluate a theory of quantitative fitness measures for reactive models. Such a theory must strive to obtain quantitative generalizations of the paradigms that have been success stories in qualitative reactive modeling, such as compositionality, property-preserving abstraction and abstraction refinement, model checking, and synthesis. The theory will be evaluated not only in the context of software and hardware engineering, but also in the context of systems biology. In particular, we will use the quantitative reactive models and fitness measures developed in this project for testing hypotheses about the mechanisms behind data from biological experiments.

  1. Modular verification of concurrent systems

    SciTech Connect

    Sobel, A.E.K.

    1986-01-01

    During the last ten years, a number of authors have proposed verification techniques that allow one to prove properties of individual processes by using global assumptions about the behavior of the remaining processes in the distributed program. As a result, one must justify these global assumptions before drawing any conclusions regarding the correctness of the entire program. This justification is often the most difficult part of the proof and presents a serious obstacle to hierarchical program development. This thesis develops a new approach to the verification of concurrent systems. The approach is modular and supports compositional development of programs since the proofs of each individual process of a program are completely isolated from all others. The generality of this approach is illustrated by applying it to a representative set of contemporary concurrent programming languages, namely: CSP, ADA, Distributed Processes, and a shared variable language. In addition, it is also shown how the approach may be used to deal with a number of other constructs that have been proposed for inclusion in concurrent languages: FORK and JOIN primitives, nested monitor calls, path expressions, atomic transactions, and asynchronous message passing. These results support the argument that the approach is universal and can be used to design proof systems for any concurrent language.

  2. Space transportation system payload interface verification

    NASA Technical Reports Server (NTRS)

    Everline, R. T.

    1977-01-01

    The paper considers STS payload-interface verification requirements and the capability provided by STS to support verification. The intent is to standardize as many interfaces as possible, not only through the design, development, test and evaluation (DDT and E) phase of the major payload carriers but also into the operational phase. The verification process is discussed in terms of its various elements, such as the Space Shuttle DDT and E (including the orbital flight test program) and the major payload carriers DDT and E (including the first flights). Five tools derived from the Space Shuttle DDT and E are available to support the verification process: mathematical (structural and thermal) models, the Shuttle Avionics Integration Laboratory, the Shuttle Manipulator Development Facility, and interface-verification equipment (cargo-integration test equipment).

  3. Hierarchical Design and Verification for VLSI

    NASA Technical Reports Server (NTRS)

    Shostak, R. E.; Elliott, W. D.; Levitt, K. N.

    1983-01-01

    The specification and verification work is described in detail, and some of the problems and issues to be resolved in their application to Very Large Scale Integration (VLSI) systems are examined. The hierarchical design methodologies enable a system architect or design team to decompose a complex design into a formal hierarchy of levels of abstraction. The first step in program verification is tree formation. The next step after tree formation is the generation from the trees of the verification conditions themselves. The approach taken here is similar in spirit to the corresponding step in program verification but requires modeling of the semantics of circuit elements rather than program statements. The last step is that of proving the verification conditions using a mechanical theorem-prover.

  4. Why do ineffective treatments seem helpful? A brief review

    PubMed Central

    Hartman, Steve E

    2009-01-01

    After any therapy, when symptoms improve, healthcare providers (and patients) are tempted to award credit to treatment. Over time, a particular treatment can seem so undeniably helpful that scientific verification of efficacy is judged an inconvenient waste of time and resources. Unfortunately, practitioners' accumulated, day-to-day, informal impressions of diagnostic reliability and clinical efficacy are of limited value. To help clarify why even treatments entirely lacking in direct effect can seem helpful, I will explain why real signs and symptoms often improve, independent of treatment. Then, I will detail quirks of human perception, interpretation, and memory that often make symptoms seem improved, when they are not. I conclude that healthcare will grow to full potential only when judgments of clinical efficacy routinely are based in properly scientific, placebo-controlled, outcome analysis. PMID:19822008

  5. Help Helps, but Only so Much: Research on Help Seeking with Intelligent Tutoring Systems

    ERIC Educational Resources Information Center

    Aleven, Vincent; Roll, Ido; McLaren, Bruce M.; Koedinger, Kenneth R.

    2016-01-01

    Help seeking is an important process in self-regulated learning (SRL). It may influence learning with intelligent tutoring systems (ITSs), because many ITSs provide help, often at the student's request. The Help Tutor was a tutor agent that gave in-context, real-time feedback on students' help-seeking behavior, as they were learning with an ITS.…

  6. Guidelines for the verification and validation of expert system software and conventional software: Evaluation of knowledge base certification methods. Volume 4

    SciTech Connect

    Miller, L.A.; Hayes, J.E.; Mirsky, S.M.

    1995-03-01

    This report presents the results of the Knowledge Base Certification activity of the expert systems verification and validation (V&V) guideline development project which is jointly funded by the US Nuclear Regulatory Commission and the Electric Power Research Institute. The ultimate objective is the formulation of guidelines for the V&V of expert systems for use in nuclear power applications. This activity is concerned with the development and testing of various methods for assuring the quality of knowledge bases. The testing procedure used was that of a behavioral experiment, the first known such evaluation of any type of V&V activity. The value of such experimentation is its capability to provide empirical evidence for -- or against -- the effectiveness of plausible methods in helping people find problems in knowledge bases. The three-day experiment included 20 participants from three nuclear utilities, the Nuclear Regulatory Commission's Technical Training Center, the University of Maryland, EG&G Idaho, and SAIC. The study used two real nuclear expert systems: a boiling water reactor emergency operating procedures tracking system and a pressurized water reactor safety assessment system. Ten participants were assigned to each of the expert systems. All participants were trained in and then used a sequence of four different V&V methods selected as being the best and most appropriate for study on the basis of prior evaluation activities. These methods either involved the analysis and tracing of requirements to elements in the knowledge base (requirements grouping and requirements tracing) or else involved direct inspection of the knowledge base for various kinds of errors. Half of the subjects within each system group used the best manual variant of the V&V methods (the control group), while the other half were supported by the results of applying real or simulated automated tools to the knowledge bases (the experimental group).

  7. Nuclear Cryogenic Propulsion Stage

    NASA Technical Reports Server (NTRS)

    Houts, Michael G.; Borowski, S. K.; George, J. A.; Kim, T.; Emrich, W. J.; Hickman, R. R.; Broadway, J. W.; Gerrish, H. P.; Adams, R. B.

    2012-01-01

    The fundamental capability of Nuclear Thermal Propulsion (NTP) is game changing for space exploration. A first generation Nuclear Cryogenic Propulsion Stage (NCPS) based on NTP could provide high thrust at a specific impulse above 900 s, roughly double that of state of the art chemical engines. Characteristics of fission and NTP indicate that useful first generation systems will provide a foundation for future systems with extremely high performance. The role of the NCPS in the development of advanced nuclear propulsion systems could be analogous to the role of the DC-3 in the development of advanced aviation. Progress made under the NCPS project could help enable both advanced NTP and advanced NEP.

  8. Help Seeking and Help Design in Interactive Learning Environments

    ERIC Educational Resources Information Center

    Aleven, Vincent; Stahl, Elmar; Schworm, Silke; Fischer, Frank; Wallace, Raven

    2003-01-01

    Many interactive learning environments (ILEs) offer on-demand help, intended to positively influence learning. Recent studies report evidence that although effective help-seeking behavior in ILEs is related to better learning outcomes, learners are not using help facilities effectively. This selective review (a) examines theoretical perspectives…

  9. Nuclear Medicine.

    ERIC Educational Resources Information Center

    Badawi, Ramsey D.

    2001-01-01

    Describes the use of nuclear medicine techniques in diagnosis and therapy. Describes instrumentation in diagnostic nuclear medicine and predicts future trends in nuclear medicine imaging technology. (Author/MM)

  10. Field Verification Program (Aquatic Disposal). Sister Chromatid Exchange in Marine Polychaetes Exposed to Black Rock Harbor Sediment.

    DTIC Science & Technology

    1985-07-01

    Environmental Laboratory. Manager of the Environmental Effects of Dredging Programs was Dr. Robert M. Engler, with Mr. Robert L. Lazor, FVP Coordinator...

  11. Does Marijuana Help Treat Glaucoma?

    MedlinePlus

    Does Marijuana Help Treat Glaucoma? Why Eye ... Don't Recommend Marijuana for Glaucoma ... Written by: David Turbert, contributing ...

  12. Helping Teens Resist Sexual Pressure

    MedlinePlus

    Teens are more likely ... time they had intercourse. “The pressure on teenagers to have sex is ...

  13. Tourette Syndrome: Help Stop Bullying

    MedlinePlus

    ... you can increase acceptance by helping to stop bullying of children with TS. Bullying doesn't just ...

  14. Helping Kids Deal with Bullies

    MedlinePlus

    ... will be prepared if it does happen. Identifying Bullying: Most kids have been teased by a sibling ...

  15. KAT-7 Science Verification Highlights

    NASA Astrophysics Data System (ADS)

    Lucero, Danielle M.; Carignan, Claude; KAT-7 Science Data; Processing Team, KAT-7 Science Commissioning Team

    2015-01-01

    KAT-7 is a pathfinder of the Square Kilometer Array precursor MeerKAT, which is under construction. Its short baselines and low system temperature make it sensitive to large scale, low surface brightness emission. This makes it an ideal instrument to use in searches for faint extended radio emission and low surface density extraplanar gas. We present an update on the progress of several such ongoing KAT-7 science verification projects. These include a large scale radio continuum and polarization survey of the Galactic Center, deep HI observations (100+ hours) of nearby disk galaxies (e.g. NGC253 and NGC3109), and targeted searches for HI tidal tails in galaxy groups (e.g. IC1459). A brief status update for MeerKAT will also be presented if time permits.

  16. MFTF sensor verification computer program

    SciTech Connect

    Chow, H.K.

    1984-11-09

    The design, requirements document, and implementation of the MFE Sensor Verification System were accomplished by the Measurement Engineering Section (MES), a group which provides instrumentation for the MFTF magnet diagnostics. The sensors, installed on and around the magnets and solenoids, housed in a vacuum chamber, will supply information about the temperature, strain, pressure, liquid helium level and magnet voltage to the facility operator for evaluation. As the sensors are installed, records must be maintained as to their initial resistance values. Also, as the work progresses, monthly checks will be made to ensure continued sensor health. Finally, after the MFTF-B demonstration, yearly checks will be performed as well as checks of sensors as problems develop. The software to acquire and store the data was written by Harry Chow, Computations Department. The acquired data will be transferred to the MFE data base computer system.

  17. Verification of NASA Emergent Systems

    NASA Technical Reports Server (NTRS)

    Rouff, Christopher; Vanderbilt, Amy K. C. S.; Truszkowski, Walt; Rash, James; Hinchey, Mike

    2004-01-01

    NASA is studying advanced technologies for a future robotic exploration mission to the asteroid belt. This mission, the prospective ANTS (Autonomous Nano Technology Swarm) mission, will comprise 1,000 autonomous robotic agents designed to cooperate in asteroid exploration. The emergent properties of swarm-type missions make them powerful, but at the same time make it more difficult to design them and assure that the proper behaviors will emerge. We are currently investigating formal methods and techniques for verification and validation of future swarm-based missions. The advantage of using formal methods is their ability to mathematically assure the behavior of a swarm, emergent or otherwise. The ANTS mission is being used as an example and case study of swarm-based missions with which to experiment and test current formal methods on intelligent swarms. Using the ANTS mission, we have evaluated multiple formal methods to determine their effectiveness in modeling and assuring swarm behavior.

  18. Retail applications of signature verification

    NASA Astrophysics Data System (ADS)

    Zimmerman, Thomas G.; Russell, Gregory F.; Heilper, Andre; Smith, Barton A.; Hu, Jianying; Markman, Dmitry; Graham, Jon E.; Drews, Clemens

    2004-08-01

    The dramatic rise in identity theft, the ever-pressing need to provide convenience in checkout services to attract and retain loyal customers, and the growing use of multi-function signature capture devices in the retail sector provide favorable conditions for the deployment of dynamic signature verification (DSV) in retail settings. We report on the development of a DSV system to meet the needs of the retail sector. We currently have a database of approximately 10,000 signatures collected from 600 subjects and forgers. Previous work at IBM on DSV has been merged and extended to achieve robust performance on pen position data available from commercial point of sale hardware, achieving equal error rates on skilled forgeries and authentic signatures of 1.5% to 4%.
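
    As a reminder of how an equal error rate is obtained from match scores, the sketch below sweeps a decision threshold over made-up genuine and forgery scores (not data from this study) and returns the operating point where the false acceptance and false rejection rates meet.

        def far_frr(genuine, forgeries, threshold):
            # False acceptance: forgery scores at or above the threshold;
            # false rejection: genuine scores below it.
            far = sum(s >= threshold for s in forgeries) / len(forgeries)
            frr = sum(s < threshold for s in genuine) / len(genuine)
            return far, frr

        def equal_error_rate(genuine, forgeries):
            # Sweep candidate thresholds and return the rate where FAR and FRR
            # are closest; a production system would interpolate more finely.
            best = None
            for t in sorted(set(genuine) | set(forgeries)):
                far, frr = far_frr(genuine, forgeries, t)
                gap = abs(far - frr)
                if best is None or gap < best[0]:
                    best = (gap, (far + frr) / 2.0)
            return best[1]

        genuine = [0.91, 0.85, 0.78, 0.88, 0.69]     # invented scores
        forgeries = [0.40, 0.55, 0.62, 0.35, 0.71]
        print(f"EER ~ {equal_error_rate(genuine, forgeries):.2f}")  # 0.20 here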

  19. Verification of FANTASTIC integrated code

    NASA Technical Reports Server (NTRS)

    Chauhan, Rajinder Singh

    1987-01-01

    FANTASTIC is an acronym for Failure Analysis Nonlinear Thermal and Structural Integrated Code. This program was developed by Failure Analysis Associates, Palo Alto, Calif., for MSFC to improve the accuracy of solid rocket motor nozzle analysis. FANTASTIC has three modules: FACT - thermochemical analysis; FAHT - heat transfer analysis; and FAST - structural analysis. All modules have keywords for data input. Work is in progress for the verification of the FAHT module, which is done by using data for various problems with known solutions as inputs to the FAHT module. The information obtained is used to identify problem areas of the code and passed on to the developer for debugging purposes. Failure Analysis Associates have revised the first version of the FANTASTIC code and a new improved version has been released to the Thermal Systems Branch.

  20. Crowd-Sourced Help with Emergent Knowledge for Optimized Formal Verification (CHEKOFV)

    DTIC Science & Technology

    2016-03-01

    Table of contents excerpt: List of Figures ... Abstract Interpretation and Machine Learning ... interpretation, invariant learning, and crowd-sourcing ... 3.7 Frama

  1. Argonne nuclear pioneer: Leonard Koch

    SciTech Connect

    Koch, Leonard

    2012-01-01

    Leonard Koch joined Argonne National Laboratory in 1948. He helped design and build Experimental Breeder Reactor-1 (EBR-1), the first reactor to generate useable amounts of electricity from nuclear energy.

  2. Children's Help Seeking and Impulsivity

    ERIC Educational Resources Information Center

    Puustinen, Minna; Kokkonen, Marja; Tolvanen, Asko; Pulkkinen, Lea

    2004-01-01

    The aim of the present study was to analyze the relationship between students' (100 children aged 8 to 12) help-seeking behavior and impulsivity. Help-seeking behavior was evaluated using a naturalistic experimental paradigm in which children were placed in a problem-solving situation and had the opportunity to seek help from the experimenter, if…

  3. How Stitches Help Kids Heal

    MedlinePlus

    ... the directions carefully with your mom's or dad's help. Different kinds of materials — sutures, glue, and butterflies — ...

  4. Scenarios for exercising technical approaches to verified nuclear reductions

    SciTech Connect

    Doyle, James

    2010-01-01

    Presidents Obama and Medvedev in April 2009 committed to a continuing process of step-by-step nuclear arms reductions beyond the new START treaty that was signed April 8, 2010 and to the eventual goal of a world free of nuclear weapons. In addition, the US Nuclear Posture Review released April 6, 2010 commits the US to initiate a comprehensive national research and development program to support continued progress toward a world free of nuclear weapons, including expanded work on verification technologies and the development of transparency measures. It is impossible to predict the specific directions that US-RU nuclear arms reductions will take over the next 5-10 years. Additional bilateral treaties could be reached requiring effective verification as indicated by statements made by the Obama administration. There could also be transparency agreements or other initiatives (unilateral, bilateral or multilateral) that require monitoring with a standard of verification lower than formal arms control, but still needing to establish confidence among domestic, bilateral, and multilateral audiences that declared actions are implemented. The US Nuclear Posture Review and other statements give some indication of the kinds of actions and declarations that may need to be confirmed in a bilateral or multilateral setting. Several new elements of the nuclear arsenals could be directly limited. For example, it is likely that both strategic and nonstrategic nuclear warheads (deployed and in storage), warhead components, and aggregate stocks of such items could be accountable under a future treaty or transparency agreement. In addition, new initiatives or agreements may require the verified dismantlement of a certain number of nuclear warheads over a specified time period. Eventually, procedures for confirming the elimination of nuclear warheads, components and fissile materials from military stocks will need to be established. This paper is intended to provide useful background information

  5. Enhanced Cancelable Biometrics for Online Signature Verification

    NASA Astrophysics Data System (ADS)

    Muramatsu, Daigo; Inuma, Manabu; Shikata, Junji; Otsuka, Akira

    Cancelable approaches for biometric person authentication have been studied to protect enrolled biometric data, and several algorithms have been proposed. One drawback of cancelable approaches is that the performance is inferior to that of non-cancelable approaches. In this paper, we propose a scheme to improve the performance of a cancelable approach for online signature verification. Our scheme generates two cancelable datasets from one raw dataset and uses them for verification. Preliminary experiments were performed using a distance-based online signature verification algorithm. The experimental results show that our proposed scheme is promising.
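
    A minimal sketch of the general idea, under stated assumptions: one raw feature vector is transformed into two cancelable templates with user-specific seeds, and a query is accepted when its average distance to the two templates is small. The random-weight transform, the distance, and the threshold are placeholders, not the algorithm proposed in the paper.

        import random

        def cancelable_transform(features, seed):
            # Toy user-specific transform; a real scheme needs a properly
            # designed non-invertible, revocable transform.
            rng = random.Random(seed)
            return [f * rng.uniform(0.5, 1.5) for f in features]

        def distance(a, b):
            return sum((x - y) ** 2 for x, y in zip(a, b)) ** 0.5

        def verify(query, templates, seeds, threshold=1.0):
            # Transform the query with each enrolled seed and average the
            # distances to the corresponding cancelable templates.
            dists = [distance(cancelable_transform(query, s), t)
                     for t, s in zip(templates, seeds)]
            return sum(dists) / len(dists) < threshold

        enrolled = [0.2, 0.5, 0.9, 0.4]              # invented signature features
        seeds = (101, 202)
        templates = [cancelable_transform(enrolled, s) for s in seeds]
        print(verify([0.21, 0.48, 0.92, 0.41], templates, seeds))  # True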

  6. Formal specification and verification of Ada software

    NASA Technical Reports Server (NTRS)

    Hird, Geoffrey R.

    1991-01-01

    The use of formal methods in software development achieves levels of quality assurance unobtainable by other means. The Larch approach to specification is described, and the specification of avionics software designed to implement the logic of a flight control system is given as an example. Penelope, an Ada verification environment, is described. The Penelope user inputs mathematical definitions, Larch-style specifications, and Ada code, and performs machine-assisted proofs that the code obeys its specifications. As an example, the verification of a binary search function is considered. Emphasis is given to techniques assisting the reuse of a verification effort on modified code.

  7. Liquefied Natural Gas (LNG) dispenser verification device

    NASA Astrophysics Data System (ADS)

    Xiong, Maotao; Yang, Jie-bin; Zhao, Pu-jun; Yu, Bo; Deng, Wan-quan

    2013-01-01

    The working principle, composition, and calibration status of LNG (Liquefied Natural Gas) dispensers in China are introduced. To address the shortcomings of the weighing method for calibrating LNG dispensers, an LNG dispenser verification device has been developed. The verification device is based on the master meter method and verifies LNG dispensers in the field. Experimental results indicate that the device has steady performance, a high accuracy level, and a flexible construction, reaching an internationally advanced level. The LNG dispenser verification device will promote the development of the LNG dispenser industry in China and improve the technical level of LNG dispenser manufacturing.

  8. Delayed Gamma-ray Spectroscopy for Non-Destructive Assay of Nuclear Materials

    SciTech Connect

    Mozin, Vladimir; Ludewigt, Bernhard; Campbell, Luke; Favalli, Andrea; Hunt, Alan

    2014-10-09

    This project addresses the need for improved non-destructive assay techniques for quantifying the actinide composition of spent nuclear fuel and for the independent verification of declared quantities of special nuclear materials at key stages of the fuel cycle. High-energy delayed gamma-ray spectroscopy following neutron irradiation is a potential technique for directly assaying spent fuel assemblies and achieving the safeguards goal of quantifying nuclear material inventories for spent fuel handling, interim storage, reprocessing facilities, repository sites, and final disposal. Other potential applications include determination of MOX fuel composition, characterization of nuclear waste packages, and challenges in homeland security and arms control verification.

  9. Thermal hydraulic feasibility assessment for the Spent Nuclear Fuel Project

    SciTech Connect

    Heard, F.J.; Cramer, E.R.; Beaver, T.R.; Thurgood, M.J.

    1996-01-01

    A series of scoping analyses have been completed investigating the thermal-hydraulic performance and feasibility of the Spent Nuclear Fuel Project (SNFP) Integrated Process Strategy (IPS). The SNFP was established to develop engineered solutions for the expedited removal, stabilization, and storage of spent nuclear fuel from the K Basins at the U.S. Department of Energy's Hanford Site in Richland, Washington. The subject efforts focused on independently investigating, quantifying, and establishing the governing heat production and removal mechanisms for each of the IPS operations and configurations, obtaining preliminary results for comparison with and verification of other analyses, and providing technology-based recommendations for consideration and incorporation into the design bases for the SNFP. The goal was to develop a series of thermal-hydraulic models that could respond to all process and safety-related issues that may arise pertaining to the SNFP. A series of sensitivity analyses were also performed to help identify those parameters that have the greatest impact on energy transfer and hence temperature control. It is anticipated that the subject thermal-hydraulic models will form the basis for a series of advanced and more detailed models that will more accurately reflect the thermal performance of the IPS and alleviate the necessity for some of the more conservative assumptions and oversimplifications, as well as form the basis for the final process and safety analyses.

  10. DOE handbook: Integrated safety management systems (ISMS) verification team leader's handbook

    SciTech Connect

    1999-06-01

    The primary purpose of this handbook is to provide guidance to the ISMS verification Team Leader and the verification team in conducting ISMS verifications. The handbook describes methods and approaches for the review of the ISMS documentation (Phase I) and ISMS implementation (Phase II) and provides information useful to the Team Leader in preparing the review plan, selecting and training the team, coordinating the conduct of the verification, and documenting the results. The process and techniques described are based on the results of several pilot ISMS verifications that have been conducted across the DOE complex. A secondary purpose of this handbook is to provide information useful in developing DOE personnel to conduct these reviews. Specifically, this handbook describes methods and approaches to: (1) Develop the scope of the Phase 1 and Phase 2 review processes to be consistent with the history, hazards, and complexity of the site, facility, or activity; (2) Develop procedures for the conduct of the Phase 1 review, validating that the ISMS documentation satisfies the DEAR clause as amplified in DOE Policies 450.4, 450.5, 450.6 and associated guidance and that DOE can effectively execute responsibilities as described in the Functions, Responsibilities, and Authorities Manual (FRAM); (3) Develop procedures for the conduct of the Phase 2 review, validating that the description approved by the Approval Authority, following or concurrent with the Phase 1 review, has been implemented; and (4) Describe a methodology by which the DOE ISMS verification teams will be advised, trained, and/or mentored to conduct subsequent ISMS verifications. The handbook provides proven methods and approaches for verifying that commitments related to the DEAR, the FRAM, and associated amplifying guidance are in place and implemented in nuclear and high risk facilities. This handbook also contains useful guidance to line managers when preparing for a review of ISMS for radiological

  11. Certainty in Stockpile Computing: Recommending a Verification and Validation Program for Scientific Software

    SciTech Connect

    Lee, J.R.

    1998-11-01

    As computing assumes a more central role in managing the nuclear stockpile, the consequences of an erroneous computer simulation could be severe. Computational failures are common in other endeavors and have caused project failures, significant economic loss, and loss of life. This report examines the causes of software failure and proposes steps to mitigate them. A formal verification and validation program for scientific software is recommended and described.

  12. Integrated Verification Experiment data collected as part of the Los Alamos National Laboratory's Source Region Program

    SciTech Connect

    Fitzgerald, T.J.; Carlos, R.C.; Argo, P.E.

    1993-01-21

    As part of the integrated verification experiment (IVE), we deployed a network of HF ionospheric sounders to detect the effects of acoustic waves generated by surface ground motion following underground nuclear tests at the Nevada Test Site. The network sampled up to four geographic locations in the ionosphere from almost directly overhead of the surface ground zero out to a horizontal range of 60 km. We present sample results for four of the IVEs: Misty Echo, Texarkana, Mineral Quarry, and Bexar.

  13. Design for Verification: Enabling Verification of High Dependability Software-Intensive Systems

    NASA Technical Reports Server (NTRS)

    Mehlitz, Peter C.; Penix, John; Markosian, Lawrence Z.; Koga, Dennis (Technical Monitor)

    2003-01-01

    Strategies to achieve confidence that high-dependability applications are correctly implemented include testing and automated verification. Testing deals mainly with a limited number of expected execution paths. Verification usually attempts to deal with a larger number of possible execution paths. While the impact of architecture design on testing is well known, its impact on most verification methods is not as well understood. The Design for Verification approach considers verification from the application development perspective, in which system architecture is designed explicitly according to the application's key properties. The D4V-hypothesis is that the same general architecture and design principles that lead to good modularity, extensibility and complexity/functionality ratio can be adapted to overcome some of the constraints on verification tools, such as the production of hand-crafted models and the limits on dynamic and static analysis caused by state space explosion.

  14. Advanced Nuclear Measurements - Sensitivity Analysis Emerging Safeguards, Problems and Proliferation Risk

    SciTech Connect

    Dreicer, J.S.

    1999-07-15

    During the past year this component of the Advanced Nuclear Measurements LDRD-DR has focused on emerging safeguards problems and proliferation risk by investigating problems in two domains. The first is related to the analysis, quantification, and characterization of existing inventories of fissile materials, in particular, the minor actinides (MA) formed in the commercial fuel cycle. Understanding material forms and quantities helps identify and define future measurement problems, instrument requirements, and assists in prioritizing safeguards technology development. The second problem (dissertation research) has focused on the development of a theoretical foundation for sensor array anomaly detection. Remote and unattended monitoring or verification of safeguards activities is becoming a necessity due to domestic and international budgetary constraints. However, the ability to assess the trustworthiness of a sensor array has not been investigated. This research is developing an anomaly detection methodology to assess the sensor array.

  15. Space Station automated systems testing/verification and the Galileo Orbiter fault protection design/verification

    NASA Technical Reports Server (NTRS)

    Landano, M. R.; Easter, R. W.

    1984-01-01

    Aspects of Space Station automated systems testing and verification are discussed, taking into account several program requirements. It is found that these requirements lead to a number of issues and uncertainties that require study and resolution during the Space Station definition phase. Most, if not all, of the considered uncertainties have implications for the overall testing and verification strategy adopted by the Space Station Program. A description is given of the Galileo Orbiter fault protection design/verification approach. Attention is given to a mission description, an Orbiter description, the design approach and process, the fault protection design verification approach/process, and problems of 'stress' testing.

  16. Verification of operating software for cooperative monitoring applications

    SciTech Connect

    Tolk, K.M.; Rembold, R.K.

    1997-08-01

    Monitoring agencies often use computer-based equipment to control instruments and to collect data at sites that are being monitored under international safeguards or other cooperative monitoring agreements. In order for this data to be used as an independent verification of data supplied by the host at the facility, the software used must be trusted by the monitoring agency. The monitoring party must be sure that the software has not been altered to give results that could lead to erroneous conclusions about nuclear materials inventories or other operating conditions at the site. The host might also want to verify that the software being used is the software that has been previously inspected in order to be assured that only data that is allowed under the agreement is being collected. A description of a method to provide this verification using keyed hash functions, and how the proposed method overcomes possible vulnerabilities in methods currently in use, such as loading the software from trusted disks, is presented. The use of public key data authentication for this purpose is also discussed.
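
    The keyed-hash idea can be illustrated with standard-library calls. This is a hedged sketch of the concept rather than the method from the paper; the key, the file path, and the recorded digest are placeholders.

        import hashlib
        import hmac

        def software_hmac(path, key):
            # Recompute a keyed hash (HMAC-SHA-256) over the installed image.
            mac = hmac.new(key, digestmod=hashlib.sha256)
            with open(path, "rb") as f:
                for chunk in iter(lambda: f.read(65536), b""):
                    mac.update(chunk)
            return mac.hexdigest()

        def verify_software(path, key, recorded_hexdigest):
            # Constant-time comparison against the digest recorded at the
            # prior inspection.
            return hmac.compare_digest(software_hmac(path, key), recorded_hexdigest)

        # In-memory demonstration with a stand-in "image" (placeholder data):
        key, image = b"inspection-key", b"monitoring software image bytes"
        recorded = hmac.new(key, image, hashlib.sha256).hexdigest()
        print(hmac.compare_digest(hmac.new(key, image, hashlib.sha256).hexdigest(), recorded))  # True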

  17. ENVIRONMENTAL TECHNOLOGY VERIFICATION (ETV) PROGRAM: STORMWATER TECHNOLOGIES

    EPA Science Inventory

    The U.S. Environmental Protection Agency (EPA) Environmental Technology Verification (ETV) program evaluates the performance of innovative air, water, pollution prevention and monitoring technologies that have the potential to improve human health and the environment. This techn...

  18. ENVIRONMENTAL TECHNOLOGY VERIFICATION (ETV) PROGRAM: FUEL CELLS

    EPA Science Inventory

    The U.S. Environmental Protection Agency (EPA) Environmental Technology Verification (ETV) Program evaluates the performance of innovative air, water, pollution prevention and monitoring technologies that have the potential to improve human health and the environment. This techno...

  19. 78 FR 58492 - Generator Verification Reliability Standards

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-09-24

    ... Energy Regulatory Commission 18 CFR Part 40 Generator Verification Reliability Standards AGENCY: Federal... approve the following Reliability Standards that were submitted to the Commission for approval by the North American Electric Reliability Corporation, the Commission-certified Electric...

  20. Environmental Technology Verification Program (ETV) Policy Compendium

    EPA Science Inventory

    The Policy Compendium summarizes operational decisions made to date by participants in the U.S. Environmental Protection Agency's (EPA's) Environmental Technology Verification Program (ETV) to encourage consistency among the ETV centers. The policies contained herein evolved fro...

  1. HDM/PASCAL Verification System User's Manual

    NASA Technical Reports Server (NTRS)

    Hare, D.

    1983-01-01

    The HDM/Pascal verification system is a tool for proving the correctness of programs written in PASCAL and specified in the Hierarchical Development Methodology (HDM). This document assumes an understanding of PASCAL, HDM, program verification, and the STP system. The steps toward verification which this tool provides are parsing programs and specifications, checking the static semantics, and generating verification conditions. Some support functions are provided such as maintaining a data base, status management, and editing. The system runs under the TOPS-20 and TENEX operating systems and is written in INTERLISP. However, no knowledge is assumed of these operating systems or of INTERLISP. The system requires three executable files, HDMVCG, PARSE, and STP. Optionally, the editor EMACS should be on the system in order for the editor to work. The file HDMVCG is invoked to run the system. The files PARSE and STP are used as lower forks to perform the functions of parsing and proving.

  2. Engineering drawing field verification program. Revision 3

    SciTech Connect

    Ulk, P.F.

    1994-10-12

    Safe, efficient operation of waste tank farm facilities is dependent in part upon the availability of accurate, up-to-date plant drawings. Accurate plant drawings are also required in support of facility upgrades and future engineering remediation projects. This supporting document establishes the procedure for performing a visual field verification of engineering drawings, the degree of visual observation to be performed, and the documentation of the results. A copy of the drawing attesting to the degree of visual observation will be paginated into the released Engineering Change Notice (ECN) documenting the field verification for future retrieval and reference. All waste tank farm essential and support drawings within the scope of this program will be converted from manual to computer-aided drafting (CAD) drawings. A permanent reference to the field verification status will be placed along the right border of the CAD-converted drawing, referencing the revision level at which the visual verification was performed and documented.

  3. The PASCAL-HDM Verification System

    NASA Technical Reports Server (NTRS)

    1983-01-01

    The PASCAL-HDM verification system is described. This system supports the mechanical generation of verification conditions from PASCAL programs and HDM-SPECIAL specifications using the Floyd-Hoare axiomatic method. Tools are provided to parse programs and specifications, check their static semantics, generate verification conditions from Hoare rules, and translate the verification conditions appropriately for proof using the Shostak Theorem Prover. The differences between standard PASCAL and the language handled by this system are explained. This consists mostly of restrictions to the standard language definition, the only extensions or modifications being the addition of specifications to the code and the requirement that references to a function of no arguments have empty parentheses.

  4. MAMA Software Features: Quantification Verification Documentation-1

    SciTech Connect

    Ruggiero, Christy E.; Porter, Reid B.

    2014-05-21

    This document reviews the verification of the basic shape quantification attributes in the MAMA software against hand calculations in order to show that the calculations are implemented mathematically correctly and give the expected quantification results.
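
    The flavor of such a check can be shown with a toy stand-in (not the MAMA software): a shape attribute computed by code is compared against a hand calculation within a tolerance.

        import math

        def quantify_rectangle(width, height):
            # Stand-in for the attribute calculation being verified.
            return {"area": width * height, "perimeter": 2 * (width + height)}

        def matches_hand_calc(computed, expected, rel_tol=1e-9):
            return all(math.isclose(computed[k], expected[k], rel_tol=rel_tol)
                       for k in expected)

        hand_calc = {"area": 12.0, "perimeter": 14.0}   # 3 x 4 rectangle, by hand
        print(matches_hand_calc(quantify_rectangle(3.0, 4.0), hand_calc))  # True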

  5. Verification of the Calore thermal analysis code.

    SciTech Connect

    Dowding, Kevin J.; Blackwell, Bennie Francis

    2004-07-01

    Calore is the ASC code developed to model steady and transient thermal diffusion with chemistry and dynamic enclosure radiation. An integral part of the software development process is code verification, which addresses the question 'Are we correctly solving the model equations'? This process aids the developers in that it identifies potential software bugs and gives the thermal analyst confidence that a properly prepared input will produce satisfactory output. Grid refinement studies have been performed on problems for which we have analytical solutions. In this talk, the code verification process is overviewed and recent results are presented. Recent verification studies have focused on transient nonlinear heat conduction and verifying algorithms associated with (tied) contact and adaptive mesh refinement. In addition, an approach to measure the coverage of the verification test suite relative to intended code applications is discussed.
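
    One standard calculation in such a grid refinement study is the observed order of accuracy. The sketch below applies the usual Richardson-style estimate to invented numbers, purely to show the kind of evidence a verification study produces; it is not output from Calore.

        import math

        def observed_order(f_coarse, f_medium, f_fine, refinement_ratio):
            # Standard estimate p = ln((f3 - f2) / (f2 - f1)) / ln(r), where
            # f1 is the finest-grid value and r is the grid refinement ratio.
            return (math.log((f_coarse - f_medium) / (f_medium - f_fine))
                    / math.log(refinement_ratio))

        # Invented values of a quantity computed on grids refined by a factor of 2.
        p = observed_order(f_coarse=1.120, f_medium=1.030, f_fine=1.008,
                           refinement_ratio=2.0)
        print(f"observed order ~ {p:.2f}")  # close to 2 for a second-order scheme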

  6. 9 CFR 417.8 - Agency verification.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... ANALYSIS AND CRITICAL CONTROL POINT (HACCP) SYSTEMS § 417.8 Agency verification. FSIS will verify the... deviation occurs; (d) Reviewing the critical limits; (e) Reviewing other records pertaining to the...

  7. The NPARC Alliance Verification and Validation Archive

    NASA Technical Reports Server (NTRS)

    Slater, John W.; Dudek, Julianne C.; Tatum, Kenneth E.

    2000-01-01

    The NPARC Alliance (National Project for Applications oriented Research in CFD) maintains a publicly-available, web-based verification and validation archive as part of the development and support of the WIND CFD code. The verification and validation methods used for the cases attempt to follow the policies and guidelines of the ASME and AIAA. The emphasis is on air-breathing propulsion flow fields with Mach numbers ranging from low-subsonic to hypersonic.

  8. Dynamic testing for shuttle design verification

    NASA Technical Reports Server (NTRS)

    Green, C. E.; Leadbetter, S. A.; Rheinfurth, M. H.

    1972-01-01

    Space shuttle design verification requires dynamic data from full scale structural component and assembly tests. Wind tunnel and other scaled model tests are also required early in the development program to support the analytical models used in design verification. Presented is a design philosophy based on mathematical modeling of the structural system strongly supported by a comprehensive test program; some of the types of required tests are outlined.

  9. A verification library for multibody simulation software

    NASA Technical Reports Server (NTRS)

    Kim, Sung-Soo; Haug, Edward J.; Frisch, Harold P.

    1989-01-01

    A multibody dynamics verification library that maintains and manages test and validation data is proposed, based on RRC Robot arm and CASE backhoe validation and a comparative study of DADS, DISCOS, and CONTOPS, which are existing public domain and commercial multibody dynamics simulation programs. Using simple representative problems, simulation results from each program are cross-checked, and the validation results are presented. Functionalities of the verification library are defined, in order to automate the validation procedure.

  10. Nuclear weapons testing

    SciTech Connect

    Heylin, M.

    1988-02-15

    The author examines the history of efforts to ban, or at least constrain, nuclear tests. The issue has been marked by shifts in attitude by the superpowers in recent times. The Reagan Administration sees a comprehensive test ban only as a very long-term goal for the U.S. The Soviets, on the other hand, have been pushing extremely hard lately for a ban on all testing. The author discusses the pros and cons of such a ban by examining the arguments of the U.S. Department of Energy, Nobel Laureate Glenn T. Seaborg, and Associate Director for Defense Systems at Lawrence Livermore National Laboratory George H. Miller. Other issues that are discussed include verification, joint testing, and reliability. He concludes with a discussion of the future of the ban.

  11. Help!

    ERIC Educational Resources Information Center

    Adams, Caralee

    2006-01-01

    This article presents ten time-saving ideas for teachers. One time-saving tip is to come in an hour early once or twice a week to grade papers. Another is to avoid giving tests on Friday in order to reduce weekend work.

  12. Ground-based visual inspection for CTBT verification

    SciTech Connect

    Hawkins, W.; Wohletz, K.

    1997-11-01

    Ground-based visual inspection will play an essential role in On-Site Inspection (OSI) for Comprehensive Test Ban Treaty (CTBT) verification. Although seismic and remote sensing techniques are the best understood and most developed methods for detection of evasive testing of nuclear weapons, visual inspection will greatly augment the certainty and detail of understanding provided by these more traditional methods. Not only can ground-based visual inspection offer effective documentation in cases of suspected nuclear testing, but it also can provide accurate source location and testing media properties necessary for detailed analysis of seismic records. For testing in violation of the CTBT, an offending state may attempt to conceal the test, which most likely will be achieved by underground burial. While such concealment may not prevent seismic detection, evidence of test deployment, location, and yield can be disguised. In this light, if a suspicious event is detected by seismic or other remote methods, visual inspection of the event area is necessary to document any evidence that might support a claim of nuclear testing and provide data needed to further interpret seismic records and guide further investigations. However, the methods for visual inspection are not widely known nor appreciated, and experience is presently limited. Visual inspection can be achieved by simple, non-intrusive means, primarily geological in nature, and it is the purpose of this report to describe the considerations, procedures, and equipment required to field such an inspection. The inspections will be carried out by inspectors from members of the CTBT Organization.

  13. Verification of Minimum Detectable Activity for Radiological Threat Source Search

    NASA Astrophysics Data System (ADS)

    Gardiner, Hannah; Myjak, Mitchell; Baciak, James; Detwiler, Rebecca; Seifert, Carolyn

    2015-10-01

    The Department of Homeland Security's Domestic Nuclear Detection Office is working to develop advanced technologies that will improve the ability to detect, localize, and identify radiological and nuclear sources from airborne platforms. The Airborne Radiological Enhanced-sensor System (ARES) program is developing advanced data fusion algorithms for analyzing data from a helicopter-mounted radiation detector. This detector platform provides a rapid, wide-area assessment of radiological conditions at ground level. The NSCRAD (Nuisance-rejection Spectral Comparison Ratios for Anomaly Detection) algorithm was developed to distinguish low-count sources of interest from benign naturally occurring radiation and irrelevant nuisance sources. It uses a number of broad, overlapping regions of interest to statistically compare each newly measured spectrum with the current estimate for the background to identify anomalies. We recently developed a method to estimate the minimum detectable activity (MDA) of NSCRAD in real time. We present this method here and report on the MDA verification using both laboratory measurements and simulated injects on measured backgrounds at or near the detection limits. This work is supported by the US Department of Homeland Security, Domestic Nuclear Detection Office, under competitively awarded contract/IAA HSHQDC-12-X-00376. This support does not constitute an express or implied endorsement on the part of the Gov't.
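
    A heavily simplified sketch of the region-of-interest comparison idea: counts in broad, overlapping windows of a measured spectrum are compared against the background estimate and flagged when the excess exceeds its counting uncertainty. The windows, the statistic, and the 3-sigma rule are illustrative assumptions, not the NSCRAD algorithm as fielded on ARES.

        import math

        def roi_counts(spectrum, regions):
            # Sum counts over each (start, stop) channel window.
            return [sum(spectrum[a:b]) for a, b in regions]

        def anomaly_flags(measured, background, regions, n_sigma=3.0):
            flags = []
            for m, b in zip(roi_counts(measured, regions),
                            roi_counts(background, regions)):
                sigma = math.sqrt(m + b) or 1.0   # Poisson counting uncertainty
                flags.append((m - b) / sigma > n_sigma)
            return flags

        background = [50] * 64                        # flat synthetic background
        measured = [52] * 64
        measured[20:24] = [90, 120, 110, 85]          # injected low-count peak
        regions = [(0, 16), (12, 28), (24, 40), (36, 64)]  # broad, overlapping ROIs
        print(anomaly_flags(measured, background, regions))  # only the second ROI flags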

  14. High-Resolution Fast-Neutron Spectrometry for Arms Control and Treaty Verification

    SciTech Connect

    David L. Chichester; James T. Johnson; Edward H. Seabury

    2012-07-01

    Many nondestructive nuclear analysis techniques have been developed to support the measurement needs of arms control and treaty verification, including gross photon and neutron counting, low- and high-resolution gamma spectrometry, time-correlated neutron measurements, and photon and neutron imaging. One notable measurement technique that has not been extensively studied to date for these applications is high-resolution fast-neutron spectrometry (HRFNS). Applied for arms control and treaty verification, HRFNS has the potential to serve as a complementary measurement approach to these other techniques by providing a means to either qualitatively or quantitatively determine the composition and thickness of non-nuclear materials surrounding neutron-emitting materials. The technique uses the normally-occurring neutrons present in arms control and treaty verification objects of interest as an internal source of neutrons for performing active-interrogation transmission measurements. Most low-Z nuclei of interest for arms control and treaty verification, including 9Be, 12C, 14N, and 16O, possess fast-neutron resonance features in their absorption cross sections in the 0.5- to 5-MeV energy range. Measuring the selective removal of source neutrons over this energy range, assuming for example a fission-spectrum starting distribution, may be used to estimate the stoichiometric composition of intervening materials between the neutron source and detector. At a simpler level, determination of the emitted fast-neutron spectrum may be used for fingerprinting 'known' assemblies for later use in template-matching tests. As with photon spectrometry, automated analysis of fast-neutron spectra may be performed to support decision making and reporting systems protected behind information barriers. This paper will report recent work at Idaho National Laboratory to explore the feasibility of using HRFNS for arms control and treaty verification applications, including simulations and
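
    The transmission relation that underlies this kind of measurement can be written down directly. The sketch below inverts T = exp(-n * sigma * t) for thickness, using an assumed cross section and atom density; the values are placeholders, not measured data.

        import math

        BARN_CM2 = 1.0e-24  # one barn expressed in cm^2

        def areal_density_atoms_per_cm2(transmission, sigma_barns):
            # Invert T = exp(-n_areal * sigma) for the areal atom density.
            return -math.log(transmission) / (sigma_barns * BARN_CM2)

        def thickness_cm(transmission, sigma_barns, atom_density_per_cm3):
            return areal_density_atoms_per_cm2(transmission, sigma_barns) / atom_density_per_cm3

        # Assumed 2.0 b cross section near a resonance and a carbon-like atom
        # density of ~1.1e23 atoms/cm^3; 80% transmission implies roughly 1 cm.
        print(f"{thickness_cm(0.80, 2.0, 1.1e23):.2f} cm")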

  15. Verification study of an emerging fire suppression system

    DOE PAGES

    Cournoyer, Michael E.; Waked, R. Ryan; Granzow, Howard N.; ...

    2016-01-01

    Self-contained fire extinguishers are a robust, reliable, and minimally invasive means of fire suppression for gloveboxes. Moreover, plutonium gloveboxes present harsh environmental conditions for polymer materials; these include radiation damage and chemical exposure, both of which tend to degrade the lifetime of engineered polymer components. Several studies have been conducted to determine the robustness of self-contained fire extinguishers; before deployment in plutonium gloveboxes in a nuclear facility, verification tests must be performed. These tests include activation and mass loss calorimeter tests. In addition, compatibility issues with chemical components of the self-contained fire extinguishers need to be addressed. Our study presents activation and mass loss calorimeter test results. After extensive studies, no critical areas of concern have been identified for the plutonium glovebox application of Fire Foe™, except for glovebox operations that use large quantities of bulk plutonium or uranium metal, such as metal casting and pyro-chemistry operations.

  16. Verification study of an emerging fire suppression system

    SciTech Connect

    Cournoyer, Michael E.; Waked, R. Ryan; Granzow, Howard N.; Gubernatis, David C.

    2016-01-01

    Self-contained fire extinguishers are a robust, reliable and minimally invasive means of fire suppression for gloveboxes. Moreover, plutonium gloveboxes present harsh environmental conditions for polymer materials; these include radiation damage and chemical exposure, both of which tend to degrade the lifetime of engineered polymer components. Several studies have been conducted to determine the robustness of self-contained fire extinguishers in plutonium gloveboxes; before they can be deployed in a nuclear facility, verification tests must be performed. These tests include activation and mass loss calorimeter tests. In addition, compatibility issues with chemical components of the self-contained fire extinguishers need to be addressed. Our study presents activation and mass loss calorimeter test results. After extensive studies, no critical areas of concern have been identified for the plutonium glovebox application of Fire Foe™, except for glovebox operations that use large quantities of bulk plutonium or uranium metal such as metal casting and pyro-chemistry operations.

  17. National Verification System of National Meteorological Center , China

    NASA Astrophysics Data System (ADS)

    Zhang, Jinyan; Wei, Qing; Qi, Dan

    2016-04-01

    The Product Quality Verification Division for official weather forecasting of China was founded in April 2011. It is affiliated with the Forecast System Laboratory (FSL) of the National Meteorological Center (NMC), China, and has three employees; I am one of them and am in charge of the division. After five years of construction, an integrated real-time National Verification System of NMC, China has been established. At present, its primary roles are: 1) to verify the official weather forecast quality of NMC, China; 2) to verify the official city weather forecast quality of the Provincial Meteorological Bureau; and 3) to evaluate forecast quality for each forecaster in NMC, China. To verify the official weather forecast quality of NMC, China, we have developed a grid QPF verification module (including upscaling); a grid temperature, humidity and wind forecast verification module; a severe convective weather forecast verification module; a typhoon forecast verification module; a disaster forecast verification module; a disaster warning verification module; a medium and extended period forecast verification module; an objective elements forecast verification module; and an ensemble precipitation probabilistic forecast verification module. To verify the official city weather forecast quality of the Provincial Meteorological Bureau, we have developed a city elements forecast verification module; a public heavy rain forecast verification module; and a city air quality forecast verification module. To evaluate forecast quality for each forecaster in NMC, China, we have developed an off-duty forecaster QPF practice evaluation module; a QPF evaluation module for forecasters; a severe convective weather forecast evaluation module; a typhoon track forecast evaluation module for forecasters; a disaster warning evaluation module for forecasters; and a medium and extended period forecast evaluation module. The further

  18. Helping your teen with depression

    MedlinePlus

    Teen depression - helping; Teen depression - talk therapy; Teen depression - medicine ... teen the most. The most effective treatments for depression are: Talk therapy Antidepressant medicines If your teen ...

  19. INNOVATIVE TECHNOLOGY VERIFICATION REPORT XRF ...

    EPA Pesticide Factsheets

    The Niton XLt 700 Series (XLt) XRF Services x-ray fluorescence (XRF) analyzer was demonstrated under the U.S. Environmental Protection Agency (EPA) Superfund Innovative Technology Evaluation (SITE) Program. The field portion of the demonstration was conducted in January 2005 at the Kennedy Athletic, Recreational and Social Park (KARS) at Kennedy Space Center on Merritt Island, Florida. The demonstration was designed to collect reliable performance and cost data for the XLt analyzer and seven other commercially available XRF instruments for measuring trace elements in soil and sediment. The performance and cost data were evaluated to document the relative performance of each XRF instrument. This innovative technology verification report describes the objectives and the results of that evaluation and serves to verify the performance and cost of the XLt analyzer. Separate reports have been prepared for the other XRF instruments that were evaluated as part of the demonstration. The objectives of the evaluation included determining each XRF instrument’s accuracy, precision, sample throughput, and tendency for matrix effects. To fulfill these objectives, the field demonstration incorporated the analysis of 326 prepared samples of soil and sediment that contained 13 target elements. The prepared samples included blends of environmental samples from nine different sample collection sites as well as spiked samples with certified element concentrations. Accuracy

  20. INNOVATIVE TECHNOLOGY VERIFICATION REPORT XRF ...

    EPA Pesticide Factsheets

    The Elvatech, Ltd. ElvaX (ElvaX) x-ray fluorescence (XRF) analyzer distributed in the United States by Xcalibur XRF Services (Xcalibur), was demonstrated under the U.S. Environmental Protection Agency (EPA) Superfund Innovative Technology Evaluation (SITE) Program. The field portion of the demonstration was conducted in January 2005 at the Kennedy Athletic, Recreational and Social Park (KARS) at Kennedy Space Center on Merritt Island, Florida. The demonstration was designed to collect reliable performance and cost data for the ElvaX analyzer and seven other commercially available XRF instruments for measuring trace elements in soil and sediment. The performance and cost data were evaluated to document the relative performance of each XRF instrument. This innovative technology verification report describes the objectives and the results of that evaluation and serves to verify the performance and cost of the ElvaX analyzer. Separate reports have been prepared for the other XRF instruments that were evaluated as part of the demonstration. The objectives of the evaluation included determining each XRF instrument’s accuracy, precision, sample throughput, and tendency for matrix effects. To fulfill these objectives, the field demonstration incorporated the analysis of 326 prepared samples of soil and sediment that contained 13 target elements. The prepared samples included blends of environmental samples from nine different sample collection sites as well as s

  1. INNOVATIVE TECHNOLOGY VERIFICATION REPORT XRF ...

    EPA Pesticide Factsheets

    The Oxford ED2000 x-ray fluorescence (XRF) analyzer was demonstrated under the U.S. Environmental Protection Agency (EPA) Superfund Innovative Technology Evaluation (SITE) Program. The field portion of the demonstration was conducted in January 2005 at the Kennedy Athletic, Recreational and Social Park (KARS) at Kennedy Space Center on Merritt Island, Florida. The demonstration was designed to collect reliable performance and cost data for the ED2000 analyzer and seven other commercially available XRF instruments for measuring trace elements in soil and sediment. The performance and cost data were evaluated to document the relative performance of each XRF instrument. This innovative technology verification report describes the objectives and the results of that evaluation and serves to verify the performance and cost of the ED2000 analyzer. Separate reports have been prepared for the other XRF instruments that were evaluated as part of the demonstration. The objectives of the evaluation included determining each XRF instrument’s accuracy, precision, sample throughput, and tendency for matrix effects. To fulfill these objectives, the field demonstration incorporated the analysis of 326 prepared samples of soil and sediment that contained 13 target elements. The prepared samples included blends of environmental samples from nine different sample collection sites as well as spiked samples with certified element concentrations. Accuracy was assessed by com

  2. INNOVATIVE TECHNOLOGY VERIFICATION REPORT XRF ...

    EPA Pesticide Factsheets

    The Rigaku ZSX Mini II (ZSX Mini II) XRF Services x-ray fluorescence (XRF) analyzer was demonstrated under the U.S. Environmental Protection Agency (EPA) Superfund Innovative Technology Evaluation (SITE) Program. The field portion of the demonstration was conducted in January 2005 at the Kennedy Athletic, Recreational and Social Park (KARS) at Kennedy Space Center on Merritt Island, Florida. The demonstration was designed to collect reliable performance and cost data for the ZSX Mini II analyzer and seven other commercially available XRF instruments for measuring trace elements in soil and sediment. The performance and cost data were evaluated to document the relative performance of each XRF instrument. This innovative technology verification report describes the objectives and the results of that evaluation and serves to verify the performance and cost of the ZSX Mini II analyzer. Separate reports have been prepared for the other XRF instruments that were evaluated as part of the demonstration. The objectives of the evaluation included determining each XRF instrument’s accuracy, precision, sample throughput, and tendency for matrix effects. To fulfill these objectives, the field demonstration incorporated the analysis of 326 prepared samples of soil and sediment that contained 13 target elements. The prepared samples included blends of environmental samples from nine different sample collection sites as well as spiked samples with certified element con

  3. INNOVATIVE TECHNOLOGY VERIFICATION REPORT XRF ...

    EPA Pesticide Factsheets

    The Innov-X XT400 Series (XT400) x-ray fluorescence (XRF) analyzer was demonstrated under the U.S. Environmental Protection Agency (EPA) Superfund Innovative Technology Evaluation (SITE) Program. The field portion of the demonstration was conducted in January 2005 at the Kennedy Athletic, Recreational and Social Park (KARS) at Kennedy Space Center on Merritt Island, Florida. The demonstration was designed to collect reliable performance and cost data for the XT400 analyzer and seven other commercially available XRF instruments for measuring trace elements in soil and sediment. The performance and cost data were evaluated to document the relative performance of each XRF instrument. This innovative technology verification report describes the objectives and the results of that evaluation and serves to verify the performance and cost of the XT400 analyzer. Separate reports have been prepared for the other XRF instruments that were evaluated as part of the demonstration. The objectives of the evaluation included determining each XRF instrument’s accuracy, precision, sample throughput, and tendency for matrix effects. To fulfill these objectives, the field demonstration incorporated the analysis of 326 prepared samples of soil and sediment that contained 13 target elements. The prepared samples included blends of environmental samples from nine different sample collection sites as well as spiked samples with certified element concentrations. Accuracy was as

  4. INNOVATIVE TECHNOLOGY VERIFICATION REPORT XRF ...

    EPA Pesticide Factsheets

    The Rontec PicoTAX x-ray fluorescence (XRF) analyzer was demonstrated under the U.S. Environmental Protection Agency (EPA) Superfund Innovative Technology Evaluation (SITE) Program. The field portion of the demonstration was conducted in January 2005 at the Kennedy Athletic, Recreational and Social Park (KARS) at Kennedy Space Center on Merritt Island, Florida. The demonstration was designed to collect reliable performance and cost data for the PicoTAX analyzer and seven other commercially available XRF instruments for measuring trace elements in soil and sediment. The performance and cost data were evaluated to document the relative performance of each XRF instrument. This innovative technology verification report describes the objectives and the results of that evaluation and serves to verify the performance and cost of the PicoTAX analyzer. Separate reports have been prepared for the other XRF instruments that were evaluated as part of the demonstration. The objectives of the evaluation included determining each XRF instrument’s accuracy, precision, sample throughput, and tendency for matrix effects. To fulfill these objectives, the field demonstration incorporated the analysis of 326 prepared samples of soil and sediment that contained 13 target elements. The prepared samples included blends of environmental samples from nine different sample collection sites as well as spiked samples with certified element concentrations. Accuracy was assessed by c

  5. Learning Assumptions for Compositional Verification

    NASA Technical Reports Server (NTRS)

    Cobleigh, Jamieson M.; Giannakopoulou, Dimitra; Pasareanu, Corina; Clancy, Daniel (Technical Monitor)

    2002-01-01

    Compositional verification is a promising approach to addressing the state explosion problem associated with model checking. One compositional technique advocates proving properties of a system by checking properties of its components in an assume-guarantee style. However, the application of this technique is difficult because it involves non-trivial human input. This paper presents a novel framework for performing assume-guarantee reasoning in an incremental and fully automated fashion. To check a component against a property, our approach generates assumptions that the environment needs to satisfy for the property to hold. These assumptions are then discharged on the rest of the system. Assumptions are computed by a learning algorithm. They are initially approximate, but become gradually more precise by means of counterexamples obtained by model checking the component and its environment, alternately. This iterative process may at any stage conclude that the property is either true or false in the system. We have implemented our approach in the LTSA tool and applied it to the analysis of a NASA system.
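
    A schematic of the iterative assume-guarantee loop described above is sketched below in Python; the model_check and learner objects are hypothetical placeholders standing in for a model checker and an L*-style learning algorithm, not the LTSA tool's API:

      # Schematic of the incremental assume-guarantee loop: a learner proposes an
      # assumption A, <A> M1 <P> is model checked, A is then discharged on M2, and
      # counterexamples from either check refine A. model_check and learner are
      # hypothetical placeholders, not the LTSA tool's actual interface.

      def assume_guarantee(M1, M2, P, model_check, learner):
          A = learner.initial_assumption()
          while True:
              # Oracle 1: does M1 composed with assumption A satisfy P?
              ok, cex = model_check(M1, assumption=A, prop=P)
              if not ok:
                  A = learner.refine(A, cex)        # assumption not restrictive enough
                  continue
              # Oracle 2: does the environment M2 actually satisfy assumption A?
              ok, cex = model_check(M2, assumption=None, prop=A)
              if ok:
                  return True, None                 # P holds for M1 || M2
              # Replay the counterexample on M1 against P to see if it is genuine.
              ok2, trace = model_check(M1, assumption=cex, prop=P)
              if not ok2:
                  return False, trace               # genuine violation of P in the system
              A = learner.refine(A, cex)            # spurious: adjust the assumption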

  6. ALMA Band 5 Science Verification

    NASA Astrophysics Data System (ADS)

    Humphreys, L.; Biggs, A.; Immer, K.; Laing, R.; Liu, H. B.; Marconi, G.; Mroczkowski, T.; Testi, L.; Yagoubov, P.

    2017-03-01

    ALMA Band 5 (163–211 GHz) was recently commissioned and Science Verification (SV) observations were obtained in the latter half of 2016. A primary scientific focus of this band is the H2O line at 183.3 GHz, which can be observed around 15% of the time when the precipitable water vapour is sufficiently low (< 0.5 mm). Many more lines are covered in Band 5 and can be observed for over 70% of the time on Chajnantor, requiring similar restrictions to those for ALMA Bands 4 and 6. Examples include the H218O line at 203 GHz, some of the bright (3–2) lines of singly and doubly deuterated forms of formaldehyde, the (2–1) lines of HCO+, HCN, HNC, N2H+ and several of their isotopologues. A young star-forming region near the centre of the Milky Way, an evolved star also in our Galaxy, and a nearby ultraluminous infrared galaxy (ULIRG) were observed as part of the SV process and the data are briefly described. The reduced data, along with imaged data products, are now public and demonstrate the power of ALMA for high-resolution studies of H2O and other molecules in a variety of astronomical targets.

  7. Nuclear Technology for the Sustainable Development Goals

    NASA Astrophysics Data System (ADS)

    Darby, Iain

    2017-01-01

    Science, technology and innovation will play a crucial role in helping countries achieve the ambitious Sustainable Development Goals (SDGs). Since the discovery of nuclear fission in the 1930s, the peaceful applications of nuclear technology have helped many countries improve crops, fight pests, advance health, protect the environment and guarantee a stable supply of energy. Highlighting the goals related to health, hunger, energy and the environment, in this presentation I will discuss how nuclear technology contributes to the SDGs and how nuclear technology can further contribute to the well-being of people, help protect the planet and boost prosperity.

  8. Electromagnetic Signature Technique as a Promising Tool to Verify Nuclear Weapons Storage and Dismantlement under a Nuclear Arms Control Regime

    SciTech Connect

    Bunch, Kyle J.; Williams, Laura S.; Jones, Anthony M.; Ramuhalli, Pradeep

    2012-08-01

    The 2010 ratification of the New START Treaty has been widely regarded as a noteworthy national security achievement for both the Obama administration and the Medvedev-Putin regime, but deeper cuts are envisioned under future arms control regimes. Future verification needs will include monitoring the storage of warhead components and fissile materials and verifying dismantlement of warheads, pits, secondaries, and other materials. From both the diplomatic and technical perspectives, verification under future arms control regimes will pose new challenges. Since acceptable verification technology must protect sensitive design information and attributes, non-nuclear non-sensitive signatures may provide a significant verification tool without the use of additional information barriers. The use of electromagnetic signatures to monitor nuclear material storage containers is a promising technology with the potential to fulfill these challenging requirements. Research performed at Pacific Northwest National Laboratory (PNNL) has demonstrated that low frequency electromagnetic signatures of sealed metallic containers can be used to confirm the presence of specific components on a “yes/no” basis without revealing classified information. Arms control inspectors might use this technique to verify the presence or absence of monitored items, including both nuclear and non-nuclear materials. Although additional research is needed to study signature aspects such as uniqueness and investigate container-specific scenarios, the technique potentially offers a rapid and cost-effective tool to verify reduction and dismantlement of U.S. and Russian nuclear weapons.
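
    As a hedged sketch of the verification step (the distance metric, frequency band, and threshold are illustrative assumptions, not PNNL's measurement system), a measured low-frequency response can be compared against an enrolled reference signature and reduced to a yes/no attribute so that no design detail leaves the information barrier:

      # Sketch of a template-matching check on a low-frequency electromagnetic
      # response: compare a measured frequency sweep against an enrolled reference
      # signature and report only a yes/no result. Metric and threshold are assumed.
      import numpy as np

      def normalized_distance(measured, reference):
          m = (measured - measured.mean()) / (measured.std() + 1e-12)
          r = (reference - reference.mean()) / (reference.std() + 1e-12)
          return float(np.sqrt(np.mean((m - r) ** 2)))

      def verify_item(measured_sweep, reference_sweep, threshold=0.1):
          """Return only 'yes'/'no' -- the attribute reported outside the barrier."""
          return "yes" if normalized_distance(measured_sweep, reference_sweep) <= threshold else "no"

      # usage with synthetic sweeps (e.g., response magnitude vs. frequency)
      freqs = np.linspace(1e3, 1e5, 200)                      # Hz, illustrative band
      reference = 1.0 / (1.0 + (freqs / 2e4) ** 2)
      measured = reference + np.random.default_rng(1).normal(0, 0.005, freqs.size)
      print(verify_item(measured, reference))                 # -> "yes"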

  9. Self Help and Mental Health

    ERIC Educational Resources Information Center

    Gartner, Alan

    1976-01-01

    Suggests that the single most important common denominator of the various types of self-help groups examined may be that the role of the person who has already lived through the experience is critical for helping others. (Author/AM)

  10. High Level Requirements for the Nuclear Energy -- Knowledge Base for Advanced Modeling and Simulation (NE-KAMS)

    SciTech Connect

    Rich Johnson; Hyung Lee; Kimberlyn C. Mousseau

    2011-09-01

    The US Department of Energy, Office of Nuclear Energy (DOE-NE), has been tasked with the important mission of ensuring that nuclear energy remains a compelling and viable energy source in the U.S. The motivations behind this mission include cost-effectively meeting the expected increases in the power needs of the country, reducing carbon emissions and reducing dependence on foreign energy sources. In the near term, to ensure that nuclear power remains a key element of U.S. energy strategy and portfolio, the DOE-NE will be working with the nuclear industry to support safe and efficient operations of existing nuclear power plants. In the long term, to meet the increasing energy needs of the U.S., the DOE-NE will be investing in research and development (R&D) and working in concert with the nuclear industry to build and deploy new, safer and more efficient nuclear power plants. The safe and efficient operations of existing nuclear power plants and designing, licensing and deploying new reactor designs, however, will require focused R&D programs as well as the extensive use and leveraging of advanced modeling and simulation (M&S). M&S will play a key role in ensuring safe and efficient operations of existing and new nuclear reactors. The DOE-NE has been actively developing and promoting the use of advanced M&S in reactor design and analysis through its R&D programs, e.g., the Nuclear Energy Advanced Modeling and Simulation (NEAMS) and Consortium for Advanced Simulation of Light Water Reactors (CASL) programs. Also, nuclear reactor vendors are already using CFD and CSM, for design, analysis, and licensing. However, these M&S tools cannot be used with confidence for nuclear reactor applications unless accompanied and supported by verification and validation (V&V) and uncertainty quantification (UQ) processes and procedures which provide quantitative measures of uncertainty for specific applications. The Nuclear Energy Knowledge base for Advanced Modeling and Simulation

  11. Verification of Functional Fault Models and the Use of Resource Efficient Verification Tools

    NASA Technical Reports Server (NTRS)

    Bis, Rachael; Maul, William A.

    2015-01-01

    Functional fault models (FFMs) are a directed graph representation of the failure effect propagation paths within a system's physical architecture and are used to support development and real-time diagnostics of complex systems. Verification of these models is required to confirm that the FFMs are correctly built and accurately represent the underlying physical system. However, a manual, comprehensive verification process applied to the FFMs was found to be error prone due to the intensive and customized process necessary to verify each individual component model and to require a burdensome level of resources. To address this problem, automated verification tools have been developed and utilized to mitigate these key pitfalls. This paper discusses the verification of the FFMs and presents the tools that were developed to make the verification process more efficient and effective.
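
    One kind of automated check implied by the abstract can be sketched as a graph analysis: given the directed failure-effect propagation graph, confirm that every failure mode reaches at least one monitored effect and that no non-failure node lacks an incoming edge. The graph encoding and rule set below are assumptions for illustration, not the tools described in the paper:

      # Sketch of automated FFM checks on a directed failure-effect propagation graph.
      # The adjacency-list encoding and the two rules are illustrative assumptions.
      from collections import deque

      def reachable(graph, start):
          seen, queue = {start}, deque([start])
          while queue:
              for nxt in graph.get(queue.popleft(), ()):
                  if nxt not in seen:
                      seen.add(nxt)
                      queue.append(nxt)
          return seen

      def verify_ffm(graph, failure_modes, monitors):
          findings = []
          nodes = set(graph) | {n for outs in graph.values() for n in outs}
          for fm in failure_modes:
              if not (reachable(graph, fm) & set(monitors)):
                  findings.append(f"{fm}: no propagation path to any monitor")
          for node in sorted(nodes):
              if node not in failure_modes and not any(node in outs for outs in graph.values()):
                  findings.append(f"{node}: has no incoming propagation edge")
          return findings

      # usage with a toy model
      graph = {"pump_fail": ["low_flow"], "low_flow": ["high_temp"], "sensor_drift": []}
      print(verify_ffm(graph, failure_modes=["pump_fail", "sensor_drift"],
                       monitors=["high_temp"]))
      # -> ['sensor_drift: no propagation path to any monitor']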

  12. Data Storage Accounting and Verification at LHC experiments

    NASA Astrophysics Data System (ADS)

    Huang, C.-H.; Lanciotti, E.; Magini, N.; Ratnikova, N.; Sanchez-Hernandez, A.; Serfon, C.; Wildish, T.; Zhang, X.

    2012-12-01

    All major experiments at the Large Hadron Collider (LHC) need to measure real storage usage at the Grid sites. This information is equally important for resource management, planning, and operations. To verify the consistency of central catalogs, experiments are asking sites to provide a full list of the files they have on storage, including size, checksum, and other file attributes. Such storage dumps, provided at regular intervals, give a realistic view of the storage resource usage by the experiments. Regular monitoring of the space usage and data verification serve as additional internal checks of the system integrity and performance. Both the importance and the complexity of these tasks increase with the constant growth of the total data volumes during the active data taking period at the LHC. The use of common solutions helps to reduce the maintenance costs, both at the large Tier1 facilities supporting multiple virtual organizations and at the small sites that often lack manpower. We discuss requirements and solutions to the common tasks of data storage accounting and verification, and present experiment-specific strategies and implementations used within the LHC experiments according to their computing models.
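
    The consistency check described above can be sketched as a comparison of a site storage dump against the central catalog, reporting files that are missing on storage, "dark" (present on storage but not in the catalog), or mismatched in size or checksum. The record format below is an assumption, not any experiment's actual dump schema:

      # Sketch: compare a site's storage dump against the central catalog and report
      # missing, dark, or mismatched files. The (size, checksum) tuple format is assumed.

      def compare_catalog_and_dump(catalog, dump):
          """catalog, dump: dicts mapping logical file name -> (size, checksum)."""
          report = {"missing_on_storage": [], "dark_data": [], "mismatched": []}
          for lfn, meta in catalog.items():
              if lfn not in dump:
                  report["missing_on_storage"].append(lfn)
              elif dump[lfn] != meta:
                  report["mismatched"].append(lfn)
          report["dark_data"] = [lfn for lfn in dump if lfn not in catalog]
          return report

      catalog = {"/store/data/run1.root": (1024, "abc1"), "/store/data/run2.root": (2048, "abc2")}
      dump    = {"/store/data/run1.root": (1024, "abc1"), "/store/tmp/orphan.root": (10, "ffff")}
      print(compare_catalog_and_dump(catalog, dump))
      # {'missing_on_storage': ['/store/data/run2.root'], 'dark_data': ['/store/tmp/orphan.root'], 'mismatched': []}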

  13. MACCS2 development and verification efforts

    SciTech Connect

    Young, M.; Chanin, D.

    1997-03-01

    MACCS2 represents a major enhancement of the capabilities of its predecessor MACCS, the MELCOR Accident Consequence Code System. MACCS, released in 1987, was developed to estimate the potential impacts to the surrounding public of severe accidents at nuclear power plants. The principal phenomena considered in MACCS/MACCS2 are atmospheric transport and deposition under time-variant meteorology, short-term and long-term mitigative actions and exposure pathways, deterministic and stochastic health effects, and economic costs. MACCS2 was developed as a general-purpose analytical tool applicable to diverse reactor and nonreactor facilities. The MACCS2 package includes three primary enhancements: (1) a more flexible emergency response model, (2) an expanded library of radionuclides, and (3) a semidynamic food-chain model. In addition, errors that had been identified in MACCS version 1.5.11.1 were corrected, including an error that prevented the code from providing intermediate-phase results. MACCS2 version 1.10 beta test was released to the beta-test group in May, 1995. In addition, the University of New Mexico (UNM) has completed an independent verification study of the code package. Since the beta-test release of MACCS2 version 1.10, a number of minor errors have been identified and corrected, and a number of enhancements have been added to the code package. The code enhancements added since the beta-test release of version 1.10 include: (1) an option to allow the user to input the σy and σz plume expansion parameters in a table-lookup form for incremental downwind distances, (2) an option to define different initial dimensions for up to four segments of a release, (3) an enhancement to the COMIDA2 food-chain model preprocessor to allow the user to supply externally calculated tables of tritium food-chain dose per unit deposition on farmland to support analyses of tritium releases, and (4) the capability to calculate direction-dependent doses.
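
    The table-lookup option for the plume-expansion parameters can be illustrated with a small interpolation sketch; the table values and the log-log interpolation rule are assumptions for illustration, not MACCS2 internals:

      # Sketch: user-supplied sigma-y and sigma-z values at incremental downwind
      # distances, with log-log interpolation between table points (assumed rule).
      import numpy as np

      distance_m = np.array([100.0, 300.0, 1000.0, 3000.0, 10000.0])
      sigma_y_m  = np.array([8.0,  22.0,  68.0,  180.0,  520.0])
      sigma_z_m  = np.array([5.0,  14.0,  40.0,   90.0,  220.0])

      def lookup_sigma(x, table_x, table_val):
          """Log-log interpolation of a dispersion parameter at downwind distance x."""
          return float(np.exp(np.interp(np.log(x), np.log(table_x), np.log(table_val))))

      x = 1500.0   # metres downwind
      print(lookup_sigma(x, distance_m, sigma_y_m), lookup_sigma(x, distance_m, sigma_z_m))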

  14. Collective helping and bystander effects in coevolving helping networks

    NASA Astrophysics Data System (ADS)

    Jo, Hang-Hyun; Lee, Hyun Keun; Park, Hyunggyu

    2010-06-01

    We study collective helping behavior and bystander effects in a coevolving helping network model. A node and a link of the network represent an agent who renders or receives help and a friendly relation between agents, respectively. A helping trial of an agent depends on relations with other involved agents and its result (success or failure) updates the relation between the helper and the recipient. We study the network link dynamics and its steady states analytically and numerically. The full phase diagram is presented with various kinds of active and inactive phases, and the nature of the phase transitions is explored. We find various interesting bystander effects, consistent with the field study results, for which an underlying mechanism is proposed.
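
    A toy instantiation of the coevolving helping-network idea (the update rule, initial relation weight, and parameters below are assumptions for illustration, not the authors' model) might look like:

      # Toy coevolving helping network: the chance that an agent helps depends on the
      # current relation weight, and the outcome of each trial strengthens or weakens
      # that relation. Update rule and parameters are illustrative assumptions.
      import random

      def simulate(n_agents=50, steps=5000, p_success=0.6, delta=0.05, seed=0):
          rng = random.Random(seed)
          w = {}                                        # relation weight in [0, 1] per pair
          for _ in range(steps):
              helper, recipient = rng.sample(range(n_agents), 2)
              pair = (min(helper, recipient), max(helper, recipient))
              weight = w.get(pair, 0.5)
              if rng.random() < weight:                 # helping trial attempted
                  success = rng.random() < p_success
                  change = delta if success else -delta
                  w[pair] = min(1.0, max(0.0, weight + change))
          active = sum(1 for v in w.values() if v > 0.5)
          return active, len(w)

      print(simulate())   # (strengthened relations, pairs with at least one helping attempt)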

  15. Verification and Validation Methodology of Real-Time Adaptive Neural Networks for Aerospace Applications

    NASA Technical Reports Server (NTRS)

    Gupta, Pramod; Loparo, Kenneth; Mackall, Dale; Schumann, Johann; Soares, Fola

    2004-01-01

    Recent research has shown that adaptive neural-network-based control systems are very effective in restoring stability and control of an aircraft in the presence of damage or failures. The application of an adaptive neural network within a flight-critical control system requires a thorough and proven process to ensure safe and proper flight operation. Unique testing tools have been developed as part of a process to perform verification and validation (V&V) of the real-time adaptive neural networks used in a recent adaptive flight control system and to evaluate the performance of the online-trained neural networks. The tools will help in certification from the FAA and in the successful deployment of neural-network-based adaptive controllers in safety-critical applications. The process to perform verification and validation is evaluated against a typical neural adaptive controller and the results are discussed.

  16. Nuclear Winter.

    ERIC Educational Resources Information Center

    Ehrlich, Anne

    1984-01-01

    "Nuclear Winter" was recently coined to describe the climatic and biological effects of a nuclear war. These effects are discussed based on models, simulations, scenarios, and projections. Effects on human populations are also considered. (JN)

  17. Nuclear Chemistry.

    ERIC Educational Resources Information Center

    Chemical and Engineering News, 1979

    1979-01-01

    Provides a brief review of the latest developments in nuclear chemistry. Nuclear research today is directed toward increased activity in radiopharmaceuticals and formation of new isotopes by high-energy, heavy-ion collisions. (Author/BB)

  18. IMAGE-BASED VERIFICATION: SOME ADVANTAGES, CHALLENGES, AND ALGORITHM-DRIVEN REQUIREMENTS

    SciTech Connect

    Seifert, Allen; McDonald, Benjamin S.; Jarman, Kenneth D.; Robinson, Sean M.; Misner, Alex C.; Miller, Erin A.; White, Timothy A.; Pitts, William K.

    2011-06-10

    Imaging technologies may be particularly useful techniques for supporting monitoring and verification of deployed and stockpiled nuclear weapons and dismantlement components. However, protecting the sensitive design information requires processing the image behind an information barrier and reporting only non-sensitive attributes related to the image. Reducing images to attributes may destroy some sensitive information, but the challenge remains. For example, reducing the measurement to an attribute such as defined shape and X-ray transmission of an edge might reveal sensitive information relating to shape, size, and material composition. If enough additional information is available to analyze with the attribute, it may still be possible to extract sensitive design information. In spite of these difficulties, the implementation of new treaty requirements may demand image technology as an option. Two fundamental questions are raised: What (minimal) information is needed from imaging to enable verification, and what imaging technologies are appropriate? PNNL is currently developing a suite of image analysis algorithms to define and extract attributes from images for dismantlement and warhead verification and counting scenarios. In this talk, we discuss imaging requirements from the perspective of algorithms operating behind information barriers, and review imaging technologies and their potential advantages for verification. Companion talks will concentrate on the technical aspects of the algorithms.

  19. NEUTRON MULTIPLICITY AND ACTIVE WELL NEUTRON COINCIDENCE VERIFICATION MEASUREMENTS PERFORMED FOR MARCH 2009 SEMI-ANNUAL DOE INVENTORY

    SciTech Connect

    Dewberry, R.; Ayers, J.; Tietze, F.; Klapper, K.

    2010-02-05

    The Analytical Development (AD) Section field nuclear measurement group performed six 'best available technique' verification measurements to satisfy a DOE requirement instituted for the March 2009 semi-annual inventory. The requirement of reference 1 yielded the need for the SRNL Research Operations Department Material Control & Accountability (MC&A) group to measure the Pu content of five items and the highly enriched uranium (HEU) content of two. No 14Q-qualified measurement equipment was available to satisfy the requirement. The AD field nuclear group has routinely performed the required Confirmatory Measurements for the semi-annual inventories for fifteen years using sodium iodide and high purity germanium (HpGe) γ-ray pulse height analysis nondestructive assay (NDA) instruments. With appropriate γ-ray acquisition modeling, the HpGe spectrometers can be used to perform verification-type quantitative assay for Pu-isotopics and HEU content. The AD nuclear NDA group is widely experienced with this type of measurement and reports content for these species in requested process control, MC&A booking, and holdup measurement assays Site-wide. However, none of the AD HpGe γ-ray spectrometers have been 14Q-qualified, and the requirement of reference 1 specifically excluded a γ-ray PHA measurement from those it would accept for the required verification measurements. The requirement of reference 1 was a new requirement for which the Savannah River National Laboratory (SRNL) Research Operations Department (ROD) MC&A group was unprepared. The criteria for exemption from verification were: (1) isotope content below 50 grams; (2) intrinsically tamper indicating or TID sealed items which contain a Category IV quantity of material; (3) assembled components; and (4) laboratory samples. Therefore all (SRNL) Material Balance Area (MBA) items with greater than 50 grams total Pu or greater than 50 grams HEU were subject to a verification measurement. The pass

  20. Cognitive Bias in the Verification and Validation of Space Flight Systems

    NASA Technical Reports Server (NTRS)

    Larson, Steve

    2012-01-01

    Cognitive bias is generally recognized as playing a significant role in virtually all domains of human decision making. Insight into this role is informally built into many of the system engineering practices employed in the aerospace industry. The review process, for example, typically has features that help to counteract the effect of bias. This paper presents a discussion of how commonly recognized biases may affect the verification and validation process. Verifying and validating a system is arguably more challenging than development, both technically and cognitively. Whereas there may be a relatively limited number of options available for the design of a particular aspect of a system, there is a virtually unlimited number of potential verification scenarios that may be explored. The probability of any particular scenario occurring in operations is typically very difficult to estimate, which increases reliance on judgment that may be affected by bias. Implementing a verification activity often presents technical challenges that, if they can be overcome at all, often result in a departure from actual flight conditions (e.g., 1-g testing, simulation, time compression, artificial fault injection) that may raise additional questions about the meaningfulness of the results, and create opportunities for the introduction of additional biases. In addition to mitigating the biases it can introduce directly, the verification and validation process must also overcome the cumulative effect of biases introduced during all previous stages of development. A variety of cognitive biases will be described, with research results for illustration. A handful of case studies will be presented that show how cognitive bias may have affected the verification and validation process on recent JPL flight projects, identify areas of strength and weakness, and identify potential changes or additions to commonly used techniques that could provide a more robust verification and validation of

  1. Nuclear weapons, nuclear effects, nuclear war

    SciTech Connect

    Bing, G.F.

    1991-08-20

    This paper provides a brief and mostly non-technical description of the militarily important features of nuclear weapons, of the physical phenomena associated with individual explosions, and of the expected or possible results of the use of many weapons in a nuclear war. Most emphasis is on the effects of so-called "strategic exchanges."

  2. Nuclear Fuels.

    ERIC Educational Resources Information Center

    Nash, J. Thomas

    1983-01-01

    Trends in and factors related to the nuclear industry and nuclear fuel production are discussed. Topics addressed include nuclear reactors, survival of the U.S. uranium industry, production costs, budget cuts by the Department of Energy and U.S. Geological Survey for resource studies, mining, and research/development activities. (JN)

  3. U.S. and Russian Collaboration in the Area of Nuclear Forensics

    SciTech Connect

    Kristo, M J

    2007-10-22

    Nuclear forensics has become increasingly important in the fight against illicit trafficking in nuclear and other radioactive materials. The illicit trafficking of nuclear materials is, of course, an international problem; nuclear materials may be mined and milled in one country, manufactured in a second country, diverted at a third location, and detected at a fourth. There have been a number of articles in public policy journals in the past year that call for greater interaction between the U. S. and the rest of the world on the topic of nuclear forensics. Some believe that such international cooperation would help provide a more certain capability to identify the source of the nuclear material used in a terrorist event. An improved international nuclear forensics capability would also be important as part of the IAEA verification toolkit, particularly linked to increased access provided by the additional protocol. A recent study has found that, although international progress has been made in securing weapons-usable HEU and Pu, the effort is still insufficient. They found that nuclear material, located in 40 countries, could be obtained by terrorists and criminals and used for a crude nuclear weapon. Through 2006, the IAEA Illicit Trafficking Database had recorded a total of 607 confirmed events involving illegal possession, theft, or loss of nuclear and other radioactive materials. Although it is difficult to predict the future course of such illicit trafficking, increasingly such activities are viewed as significant threats that merit the development of special capabilities. As early as April, 1996, nuclear forensics was recognized at the G-8 Summit in Moscow as an important element of an illicit nuclear trafficking program. Given international events over the past several years, the value and need for nuclear forensics seems greater than ever. Determining how and where legitimate control of nuclear material was lost and tracing the route of the material from

  4. INF verification: a guide for the perplexed

    SciTech Connect

    Mendelsohn, J.

    1987-09-01

    The administration has dug itself some deep holes on the verification issue. It will have to conclude an arms control treaty without having resolved earlier (but highly questionable) compliance issues on which it has placed great emphasis. It will probably have to abandon its more sweeping (and unnecessary) on-site inspection (OSI) proposals because of adverse security and political implications for the United States and its allies. And, finally, it will probably have to present to the Congress an INF treaty that will provide for a considerably less-stringent (but nonetheless adequate) verification regime than it had originally demanded. It is difficult to dispel the impression that, when the likelihood of concluding an INF treaty seemed remote, the administration indulged its penchant for intrusive and non-negotiable verification measures. As the possibility of, and eagerness for, a treaty increased, and as the Soviet Union shifted its policy from one of resistance to OSI to one of indicating that on-site verification involved reciprocal obligations, the administration was forced to scale back its OSI rhetoric. This re-evaluation of OSI by the administration does not make the INF treaty any less verifiable; from the outset the Reagan administration was asking for a far more extensive verification package than was necessary, practicable, acceptable, or negotiable.

  5. Neighborhood Repulsed Metric Learning for Kinship Verification.

    PubMed

    Lu, Jiwen; Zhou, Xiuzhuang; Tan, Yap-Pen; Shang, Yuanyuan; Zhou, Jie

    2013-07-16

    Kinship verification from facial images is an interesting and challenging problem in computer vision, and there have been very limited attempts to tackle this problem in the literature. In this paper, we propose a new neighborhood repulsed metric learning (NRML) method for kinship verification. Motivated by the fact that interclass samples (without kinship relations) with higher similarity usually lie in a neighborhood and are more easily misclassified than those with lower similarity, we aim to learn a distance metric under which the intraclass samples (with kinship relations) are pulled as close as possible and interclass samples lying in a neighborhood are repulsed and pushed away as far as possible, simultaneously, such that more discriminative information can be exploited for verification. To make better use of multiple feature descriptors to extract complementary information, we further propose a multiview NRML (MNRML) method to seek a common distance metric to perform multiple feature fusion to improve the kinship verification performance. Experimental results are presented to demonstrate the efficacy of our proposed methods. Lastly, we also test human ability in kinship verification from facial images and our experimental results show that our methods are comparable to that of human observers.

  6. Neighborhood repulsed metric learning for kinship verification.

    PubMed

    Lu, Jiwen; Zhou, Xiuzhuang; Tan, Yap-Pen; Shang, Yuanyuan; Zhou, Jie

    2014-02-01

    Kinship verification from facial images is an interesting and challenging problem in computer vision, and there have been very limited attempts to tackle this problem in the literature. In this paper, we propose a new neighborhood repulsed metric learning (NRML) method for kinship verification. Motivated by the fact that interclass samples (without a kinship relation) with higher similarity usually lie in a neighborhood and are more easily misclassified than those with lower similarity, we aim to learn a distance metric under which the intraclass samples (with a kinship relation) are pulled as close as possible and interclass samples lying in a neighborhood are repulsed and pushed away as far as possible, simultaneously, such that more discriminative information can be exploited for verification. To make better use of multiple feature descriptors to extract complementary information, we further propose a multiview NRML (MNRML) method to seek a common distance metric to perform multiple feature fusion to improve the kinship verification performance. Experimental results are presented to demonstrate the efficacy of our proposed methods. Finally, we also test human ability in kinship verification from facial images and our experimental results show that our methods are comparable to that of human observers.
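
    A simplified sketch of the neighborhood-repulsed idea is given below: learn a Mahalanobis-type metric that pulls kin pairs together while pushing apart non-kin pairs that are still close under the current metric. The loss, neighborhood handling, and optimizer are simplifications for illustration, not the NRML algorithm itself:

      # Simplified metric-learning sketch in the spirit of NRML: pull kin pairs
      # together, repulse close non-kin pairs. Loss and optimizer are assumptions.
      import numpy as np

      def learn_metric(kin_pairs, nonkin_pairs, dim, lr=0.01, epochs=200, margin=1.0):
          """kin_pairs / nonkin_pairs: lists of (x, y) feature-vector tuples."""
          A = np.eye(dim)                                   # metric M = A.T @ A stays PSD
          for _ in range(epochs):
              grad = np.zeros((dim, dim))
              for x, y in kin_pairs:                        # pull kin pairs together
                  d = x - y
                  grad += 2 * A @ np.outer(d, d)
              for x, y in nonkin_pairs:                     # repulse neighborhood non-kin
                  d = x - y
                  if d @ (A.T @ A) @ d < margin:            # only if they are still close
                      grad -= 2 * A @ np.outer(d, d)
              A -= lr * grad / (len(kin_pairs) + len(nonkin_pairs))
          return A.T @ A

      def distance(M, x, y):
          d = x - y
          return float(d @ M @ d)

      # usage: verify a pair by thresholding the learned distance (threshold assumed)
      rng = np.random.default_rng(0)
      kin = [(v, v + rng.normal(0, 0.1, 4)) for v in rng.normal(0, 1, (20, 4))]
      nonkin = [(rng.normal(0, 1, 4), rng.normal(0, 1, 4)) for _ in range(20)]
      M = learn_metric(kin, nonkin, dim=4)
      print(distance(M, *kin[0]) < distance(M, *nonkin[0]))   # typically True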

  7. Hybrid Deep Learning for Face Verification.

    PubMed

    Sun, Yi; Wang, Xiaogang; Tang, Xiaoou

    2016-10-01

    This paper proposes a hybrid convolutional network (ConvNet)-Restricted Boltzmann Machine (RBM) model for face verification. A key contribution of this work is to learn high-level relational visual features with rich identity similarity information. The deep ConvNets in our model start by extracting local relational visual features from two face images in comparison, which are further processed through multiple layers to extract high-level and global relational features. To keep enough discriminative information, we use the last hidden layer neuron activations of the ConvNet as features for face verification instead of those of the output layer. To characterize face similarities from different aspects, we concatenate the features extracted from different face region pairs by different deep ConvNets. The resulting high-dimensional relational features are classified by an RBM for face verification. After pre-training each ConvNet and the RBM separately, the entire hybrid network is jointly optimized to further improve the accuracy. Various aspects of the ConvNet structures, relational features, and face verification classifiers are investigated. Our model achieves the state-of-the-art face verification performance on the challenging LFW dataset under both the unrestricted protocol and the setting when outside data is allowed to be used for training.

  8. A Quantitative Approach to the Formal Verification of Real-Time Systems.

    DTIC Science & Technology

    1996-09-01

    my parents Lia and Daniel and to my sister Daniela, for the help and support throughout my whole life. Even though they have not been present during...transient overload, scheduling of aperiodic tasks and priority granularity in communication scheduling [49]. For this reason, static scheduling algorithms...Quantitative temporal reasoning. In Lecture Notes in Computer Science, Computer-Aided Verification. Springer-Verlag, 1990. [34] J. Fernandez, H

  9. Nuclear War Survival Skills

    SciTech Connect

    Kearny, C.H.

    2002-06-24

    The purpose of this book is to provide Americans with information and instructions that will significantly increase their chances of surviving a possible nuclear attack. It brings together field-tested instructions that, if followed by a large fraction of Americans during a crisis that preceded an attack, could save millions of lives. The author is convinced that the vulnerability of our country to nuclear threat or attack must be reduced and that the wide dissemination of the information contained in this book would help achieve that objective of our overall defense strategy.

  10. GHG MITIGATION TECHNOLOGY PERFORMANCE EVALUATIONS UNDERWAY AT THE GHG TECHNOLOGY VERIFICATION CENTER

    EPA Science Inventory

    The paper outlines the verification approach and activities of the Greenhouse Gas (GHG) Technology Verification Center, one of 12 independent verification entities operating under the U.S. EPA-sponsored Environmental Technology Verification (ETV) program. (NOTE: The ETV program...

  11. New Vaccines Help Protect You

    MedlinePlus

    ... Navigation Bar Home Current Issue Past Issues New Vaccines Help Protect You Past Issues / Fall 2006 Table ... this page please turn Javascript on. Important new vaccines have recently been approved for use and are ...

  12. How Stitches Help Kids Heal

    MedlinePlus

    ... cuts is a small sticky strip called a butterfly bandage. It keeps the edges of a shallow ... help. Different kinds of materials — sutures, glue, and butterflies — need different kinds of care. The doctor probably ...

  13. Going Local to Find Help

    MedlinePlus

    ... Issues Cover Story: Traumatic Brain Injury Going Local to Find Help Past Issues / Fall 2008 Table of Contents ... description, phone numbers, maps and directions, such as To Find Out More: Visit www.ninds.nih.gov/disorders/ ...

  14. Emojis help young people communicate.

    PubMed

    2016-10-26

    'The use of technology to support communication in therapy is an exciting development, particularly the use of mobile device emojis to help young people express, and practitioners to assess, their mental distress'.

  15. Students Help Students with Sails.

    ERIC Educational Resources Information Center

    Toskas, Denny

    1987-01-01

    Outlines a student tutoring program called SAILS (Student Assistance in Learning and Support) that helps students who have chronic difficulties in mathematics, reading, English, and with personal problems. (MD)

  16. Exercises to help prevent falls

    MedlinePlus

    ... this page: //medlineplus.gov/ency/patientinstructions/000493.htm Exercises to help prevent falls To use the sharing ... and easily. DO NOT hold your breath. Balance Exercises You can do some balance exercises during everyday ...

  17. Efficient Verification of Holograms Using Mobile Augmented Reality.

    PubMed

    Hartl, Andreas Daniel; Arth, Clemens; Grubert, Jens; Schmalstieg, Dieter

    2016-07-01

    Paper documents such as passports, visas and banknotes are frequently checked by inspection of security elements. In particular, optically variable devices such as holograms are important, but difficult to inspect. Augmented Reality can provide all relevant information on standard mobile devices. However, hologram verification on mobile devices still takes a long time and provides lower accuracy than inspection by human individuals using appropriate reference information. We aim to address these drawbacks by automatic matching combined with a special parametrization of an efficient goal-oriented user interface which supports constrained navigation. We first evaluate a series of similarity measures for matching hologram patches to provide a sound basis for automatic decisions. Then a re-parametrized user interface is proposed based on observations of typical user behavior during document capture. These measures help to reduce capture time to approximately 15 s with better decisions regarding the evaluated samples than what can be achieved by untrained users.
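
    One of the patch-similarity measures that could be used for this kind of matching is zero-mean normalized cross-correlation between a captured hologram patch and a reference patch recorded under the same viewing direction; the threshold in the sketch below is an illustrative assumption, not the paper's tuned value:

      # Sketch: zero-mean normalized cross-correlation (ZNCC) between a captured
      # hologram patch and a reference patch. The decision threshold is assumed.
      import numpy as np

      def zncc(patch, reference):
          p = patch - patch.mean()
          r = reference - reference.mean()
          denom = np.linalg.norm(p) * np.linalg.norm(r)
          return float((p * r).sum() / denom) if denom > 0 else 0.0

      def hologram_patch_ok(patch, reference, threshold=0.7):
          return zncc(patch, reference) >= threshold

      # usage with synthetic 32x32 grayscale patches
      rng = np.random.default_rng(2)
      reference = rng.random((32, 32))
      captured = reference * 0.9 + rng.normal(0, 0.05, (32, 32))   # same view, some noise
      print(hologram_patch_ok(captured, reference))                # -> True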

  18. Battery Technology Life Verification Test Manual Revision 1

    SciTech Connect

    Jon P. Christophersen

    2012-12-01

    The purpose of this Technology Life Verification Test (TLVT) Manual is to help guide developers in their effort to successfully commercialize advanced energy storage devices such as battery and ultracapacitor technologies. The experimental design and data analysis discussed herein are focused on automotive applications based on the United States Advanced Battery Consortium (USABC) electric vehicle, hybrid electric vehicle, and plug-in hybrid electric vehicle (EV, HEV, and PHEV, respectively) performance targets. However, the methodology can be equally applied to other applications as well. This manual supersedes the February 2005 version of the TLVT Manual (Reference 1). It includes criteria for statistically-based life test matrix designs as well as requirements for test data analysis and reporting. Calendar life modeling and estimation techniques, including a user’s guide to the corresponding software tool is now provided in the Battery Life Estimator (BLE) Manual (Reference 2).

  19. Verification and Validation of Autonomy Software at NASA

    NASA Technical Reports Server (NTRS)

    Pecheur, Charles

    2000-01-01

    Autonomous software holds the promise of new operation possibilities, easier design and development and lower operating costs. However, as those systems close control loops and arbitrate resources on board with specialized reasoning, the range of possible situations becomes very large and uncontrollable from the outside, making conventional scenario-based testing very inefficient. Analytic verification and validation (V&V) techniques, and model checking in particular, can provide significant help for designing autonomous systems in a more efficient and reliable manner, by providing better coverage and allowing early error detection. This article discusses the general issue of V&V of autonomy software, with an emphasis towards model-based autonomy, model-checking techniques and concrete experiments at NASA.

  20. Verification and Validation of Autonomy Software at NASA

    NASA Technical Reports Server (NTRS)

    Pecheur, Charles

    2000-01-01

    Autonomous software holds the promise of new operation possibilities, easier design and development, and lower operating costs. However, as those systems close control loops and arbitrate resources on-board with specialized reasoning, the range of possible situations becomes very large and uncontrollable from the outside, making conventional scenario-based testing very inefficient. Analytic verification and validation (V&V) techniques, and model checking in particular, can provide significant help for designing autonomous systems in a more efficient and reliable manner, by providing better coverage and allowing early error detection. This article discusses the general issue of V&V of autonomy software, with an emphasis towards model-based autonomy, model-checking techniques, and concrete experiments at NASA.

  1. The Global Diffusion of Societal Verification Tools: A Quantitative Assessment of the Public’s Ability to Engage Nonproliferation Treaty Monitoring

    SciTech Connect

    Sayre, Amanda M.; Kreyling, Sean J.; West, Curtis L.

    2015-07-11

    The spread of nuclear and dual-use technologies and the need for more robust, effective and efficient nonproliferation and arms control treaties have led to an increasing need for innovative verification approaches and technologies. This need, paired with advancements in online computing, mobile devices, commercially available satellite imagery and the evolution of online social networks, has led to a resurgence of the concept of societal verification for arms control and nonproliferation treaties. In the event a country accepts its citizens’ assistance in supporting transparency, confidence-building and societal verification, the host government will need a population that is willing and able to participate. While scholarly interest in societal verification continues to grow, social scientific research on the topic is lacking. The aim of this paper is to begin the process of understanding public societal verification capabilities, extend the availability of quantitative research on societal verification and set in motion complementary research to increase the breadth and depth of knowledge on this topic. This paper presents a potential framework and outlines a research roadmap for the development of such a societal verification capability index.

  2. PBL Verification with Radiosonde and Aircraft Data

    NASA Astrophysics Data System (ADS)

    Tsidulko, M.; McQueen, J.; Dimego, G.; Ek, M.

    2008-12-01

    Boundary layer depth is an important characteristic in weather forecasting and it is a key parameter in air quality modeling, determining the extent of turbulence and dispersion for pollutants. Real-time PBL depths from the NAM (WRF/NMM) model are verified with different types of observations. PBL depth verification is incorporated into the NCEP verification system, including an ability to provide a range of statistical characteristics for the boundary layer heights. For the model, several types of boundary layer definitions are used. PBL height from the TKE scheme, the critical Ri number approach, as well as mixed layer depth are compared with observations. Observed PBL depths are determined by applying the Ri number approach to radiosonde profiles. Also, a preliminary study of using ACARS data for PBL verification is conducted.
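
    The critical-Richardson-number approach mentioned above can be sketched as follows; the critical value of 0.25 is a common but assumed choice, and the sounding arrays are synthetic:

      # Sketch of the bulk Richardson number approach: take the PBL top as the first
      # level where the bulk Ri exceeds a critical value. Ri_crit and the sounding
      # profile below are assumptions for illustration.
      import numpy as np

      G = 9.81  # m s^-2

      def pbl_height_bulk_ri(z, theta_v, u, v, ri_crit=0.25):
          """z: height AGL (m); theta_v: virtual potential temperature (K); u, v: wind (m/s)."""
          shear2 = (u - u[0]) ** 2 + (v - v[0]) ** 2
          shear2 = np.where(shear2 < 1e-6, 1e-6, shear2)            # avoid division by zero
          ri = G * z * (theta_v - theta_v[0]) / (theta_v[0] * shear2)
          above = np.where(ri > ri_crit)[0]
          return float(z[above[0]]) if above.size else float(z[-1])

      # synthetic sounding: well-mixed layer up to ~800 m, stable above
      z = np.arange(0, 3000, 50.0)
      theta_v = 300.0 + np.where(z > 800, 0.005 * (z - 800), 0.0)
      u = 2.0 + 0.002 * z
      v = np.zeros_like(z)
      print(pbl_height_bulk_ri(z, theta_v, u, v))   # ~850 m for this profile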

  3. Active alignment/contact verification system

    DOEpatents

    Greenbaum, William M.

    2000-01-01

    A system involving an active (i.e., electrical) technique for the verification of: 1) close-tolerance mechanical alignment between two components, and 2) electrical contact between mating parts through an elastomeric interface. For example, the two components may be an alumina carrier and a printed circuit board, two extremely small, high-density mating parts that require alignment within a fraction of a mil, as well as a specified interface point of engagement between the parts. The system comprises pairs of conductive structures defined in the surface layers of the alumina carrier and the printed circuit board, for example. The first pair of conductive structures relates to item (1) above and permits alignment verification between mating parts. The second pair of conductive structures relates to item (2) above and permits verification of electrical contact between mating parts.

  4. Land Ice Verification and Validation Kit

    SciTech Connect

    2015-07-15

    To address a pressing need to better understand the behavior and complex interaction of ice sheets within the global Earth system, significant development of continental-scale, dynamical ice-sheet models is underway. The associated verification and validation process of these models is being coordinated through a new, robust, python-based extensible software package, the Land Ice Verification and Validation toolkit (LIVV). This release provides robust and automated verification and a performance evaluation on LCF platforms. The performance V&V involves a comprehensive comparison of model performance relative to expected behavior on a given computing platform. LIVV operates on a set of benchmark and test data, and provides comparisons for a suite of community prioritized tests, including configuration and parameter variations, bit-for-bit evaluation, and plots of tests where differences occur.
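
    A bit-for-bit check in the spirit of the evaluation described above can be sketched as a byte-level comparison of test output against benchmark output; the dictionary-of-arrays layout is an assumption for illustration, since LIVV itself operates on ice-sheet model output and its own test data:

      # Sketch: byte-level (bit-for-bit) comparison of test output variables against
      # benchmark output. The variable-name -> array mapping is an assumed layout.
      import numpy as np

      def bit_for_bit(benchmark, test):
          """benchmark, test: mappings of variable name -> NumPy array."""
          diffs = []
          for name, ref in benchmark.items():
              if name not in test:
                  diffs.append(f"{name}: missing from test output")
              elif ref.tobytes() != test[name].tobytes():
                  diffs.append(f"{name}: bit-for-bit mismatch "
                               f"(max abs diff {np.max(np.abs(ref - test[name])):.3e})")
          return diffs

      benchmark = {"thickness": np.ones((4, 4)), "velocity": np.zeros((4, 4))}
      test      = {"thickness": np.ones((4, 4)), "velocity": np.zeros((4, 4)) + 1e-12}
      print(bit_for_bit(benchmark, test))
      # -> ['velocity: bit-for-bit mismatch (max abs diff 1.000e-12)']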

  5. Critical Surface Cleaning and Verification Alternatives

    NASA Technical Reports Server (NTRS)

    Melton, Donald M.; McCool, A. (Technical Monitor)

    2000-01-01

    As a result of federal and state requirements, historical critical cleaning and verification solvents such as Freon 113, Freon TMC, and Trichloroethylene (TCE) are either highly regulated or no longer available. Interim replacements such as HCFC 225 have been qualified, however toxicity and future phase-out regulations necessitate long term solutions. The scope of this project was to qualify a safe and environmentally compliant LOX surface verification alternative to Freon 113, TCE and HCFC 225. The main effort was focused on initiating the evaluation and qualification of HCFC 225G as an alternate LOX verification solvent. The project was scoped in FY 99/00 to perform LOX compatibility, cleaning efficiency and qualification on flight hardware.

  6. Synthesis of calculational methods for design and analysis of radiation shields for nuclear rocket systems

    NASA Technical Reports Server (NTRS)

    Capo, M. A.; Disney, R. K.; Jordan, T. A.; Soltesz, R. G.; Woodsum, H. C.

    1969-01-01

    Eight computer programs make up a nine-volume synthesis containing two design methods for nuclear rocket radiation shields. The first design method is appropriate for parametric and preliminary studies, while the second accomplishes the verification of a final nuclear rocket reactor design.

  7. North Korea’s Nuclear Weapons Development and Diplomacy

    DTIC Science & Technology

    2010-01-05

    earlier than Bosworth; he testified that North Korean Foreign Ministry official, Lee Gun, stated that a peace agreement...September 24, 2008; and Yi Chong-chin, “DPRK official at energy aid talks comments on nuclear verification issue,” Yonhap News Agency, September 19...complete denuclearization. If these 51 Choe Sang-hun, “Tensions rise on Korean peninsula,” New

  8. Near-term thermoelectric nuclear power options for SEI missions

    NASA Technical Reports Server (NTRS)

    Peterson, Jerry R.

    1992-01-01

    Three different types of thermoelectric nuclear space power systems are discussed. First, the general purpose heat source Radioisotope Thermoelectric Generator (RTG), which was qualified and flown on Galileo/Ulysses and is in development for Cassini, is discussed. Second, the modular RTG, which is undergoing life verification, is discussed. Finally, the SP-100 is discussed. The information is presented in viewgraph form.

  9. Automating Nuclear-Safety-Related SQA Procedures with Custom Applications

    SciTech Connect

    Freels, James D.

    2016-01-01

    Nuclear safety-related procedures are rigorous for good reason. Small design mistakes can quickly turn into unwanted failures. Researchers at Oak Ridge National Laboratory worked with COMSOL to define a simulation app that automates the software quality assurance (SQA) verification process and provides results in less than 24 hours.

  10. Hierarchical Representation Learning for Kinship Verification.

    PubMed

    Kohli, Naman; Vatsa, Mayank; Singh, Richa; Noore, Afzel; Majumdar, Angshul

    2017-01-01

    Kinship verification has a number of applications such as organizing large collections of images and recognizing resemblances among humans. In this paper, first, a human study is conducted to understand the capabilities of the human mind and to identify the discriminatory areas of a face that facilitate kinship cues. The visual stimuli presented to the participants determine their ability to recognize kin relationships using the whole face as well as specific facial regions. The effect of participant gender and age and kin-relation pair of the stimulus is analyzed using quantitative measures such as accuracy, discriminability index d', and perceptual information entropy. Utilizing the information obtained from the human study, a hierarchical kinship verification via representation learning (KVRL) framework is utilized to learn the representation of different face regions in an unsupervised manner. We propose a novel approach for feature representation termed as filtered contractive deep belief networks (fcDBN). The proposed feature representation encodes relational information present in images using filters and a contractive regularization penalty. A compact representation of facial images of kin is extracted as an output from the learned model, and a multi-layer neural network is utilized to verify the kin accurately. A new WVU kinship database is created, which consists of multiple images per subject to facilitate kinship verification. The results show that the proposed deep learning framework (KVRL-fcDBN) yields the state-of-the-art kinship verification accuracy on the WVU kinship database and on four existing benchmark data sets. Furthermore, kinship information is used as a soft biometric modality to boost the performance of face verification via product of likelihood ratio and support vector machine based approaches. Using the proposed KVRL-fcDBN framework, an improvement of over 20% is observed in the performance of face verification.
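
    The product-of-likelihood-ratio fusion mentioned above can be sketched generically as follows. This is not the authors' code: the Gaussian score models and parameter values are assumptions, and in practice the score densities would be fit on a development set.

    import numpy as np
    from scipy.stats import norm

    def likelihood_ratio(score, genuine, impostor):
        """LR = p(score | genuine) / p(score | impostor) under Gaussian score models."""
        return norm.pdf(score, *genuine) / max(norm.pdf(score, *impostor), 1e-12)

    def fused_score(face_score, kin_score, face_models, kin_models):
        """Product-of-likelihood-ratios fusion of face and kinship modalities."""
        return likelihood_ratio(face_score, *face_models) * likelihood_ratio(kin_score, *kin_models)

    # Hypothetical (mean, std) of genuine and impostor score distributions:
    face_models = ((0.80, 0.10), (0.40, 0.15))
    kin_models  = ((0.70, 0.12), (0.45, 0.15))
    # Accept the claimed relationship if the fused score exceeds a chosen threshold.
    print(fused_score(face_score=0.75, kin_score=0.68,
                      face_models=face_models, kin_models=kin_models))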

  11. Hierarchical Representation Learning for Kinship Verification.

    PubMed

    Kohli, Naman; Vatsa, Mayank; Singh, Richa; Noore, Afzel; Majumdar, Angshul

    2016-09-14

    Kinship verification has a number of applications such as organizing large collections of images and recognizing resemblances among humans. In this research, first, a human study is conducted to understand the capabilities of the human mind and to identify the discriminatory areas of a face that facilitate kinship cues. The visual stimuli presented to the participants determine their ability to recognize kin relationships using the whole face as well as specific facial regions. The effect of participant gender and age and kin-relation pair of the stimulus is analyzed using quantitative measures such as accuracy, discriminability index d', and perceptual information entropy. Utilizing the information obtained from the human study, a hierarchical Kinship Verification via Representation Learning (KVRL) framework is utilized to learn the representation of different face regions in an unsupervised manner. We propose a novel approach for feature representation termed as filtered contractive deep belief networks (fcDBN). The proposed feature representation encodes relational information present in images using filters and a contractive regularization penalty. A compact representation of facial images of kin is extracted as an output from the learned model, and a multi-layer neural network is utilized to verify the kin accurately. A new WVU Kinship Database is created, which consists of multiple images per subject to facilitate kinship verification. The results show that the proposed deep learning framework (KVRL-fcDBN) yields state-of-the-art kinship verification accuracy on the WVU Kinship database and on four existing benchmark datasets. Further, kinship information is used as a soft biometric modality to boost the performance of face verification via product of likelihood ratio and support vector machine based approaches. Using the proposed KVRL-fcDBN framework, an improvement of over 20% is observed in the performance of face verification.

  12. 37 CFR 380.6 - Verification of royalty payments.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... TRANSMISSIONS, NEW SUBSCRIPTION SERVICES AND THE MAKING OF EPHEMERAL REPRODUCTIONS § 380.6 Verification of... purpose of the audit. The Collective shall retain the report of the verification for a period of not...

  13. Generic Protocol for the Verification of Ballast Water Treatment Technology

    EPA Science Inventory

    In anticipation of the need to address performance verification and subsequent approval of new and innovative ballast water treatment technologies for shipboard installation, the U.S. Coast Guard and the Environmental Protection Agency's Environmental Technology Verification Progr...

  14. Jet Propulsion Laboratory Environmental Verification Processes and Test Effectiveness

    NASA Technical Reports Server (NTRS)

    Hoffman, Alan R.; Green, Nelson W.

    2006-01-01

    Viewgraphs on the JPL processes for environmental verification and testing of aerospace systems are presented. The topics include: 1) Processes: a) JPL Design Principles, b) JPL Flight Project Practices; 2) Environmental Verification; and 3) Test Effectiveness Assessment: Inflight Anomaly Trends.

  15. The formal verification of generic interpreters

    NASA Technical Reports Server (NTRS)

    Windley, P.; Levitt, K.; Cohen, G. C.

    1991-01-01

    Task assignment 3 of the design and validation of digital flight control systems suitable for fly-by-wire applications is studied. Task 3 is associated with the formal verification of embedded systems. In particular, results are presented that provide a methodological approach to microprocessor verification. A hierarchical decomposition strategy for specifying microprocessors is also presented. A theory of generic interpreters is presented that can be used to model microprocessor behavior. The generic interpreter theory abstracts away the details of instruction functionality, leaving a general model of what an interpreter does.

  16. Proton Therapy Verification with PET Imaging

    PubMed Central

    Zhu, Xuping; Fakhri, Georges El

    2013-01-01

    Proton therapy is very sensitive to uncertainties introduced during treatment planning and dose delivery. PET imaging of proton induced positron emitter distributions is the only practical approach for in vivo, in situ verification of proton therapy. This article reviews the current status of proton therapy verification with PET imaging. The different data detecting systems (in-beam, in-room and off-line PET), calculation methods for the prediction of proton induced PET activity distributions, and approaches for data evaluation are discussed. PMID:24312147

  17. Verification of Plan Models Using UPPAAL

    NASA Technical Reports Server (NTRS)

    Khatib, Lina; Muscettola, Nicola; Haveland, Klaus; Lau, Sonic (Technical Monitor)

    2001-01-01

    This paper describes work on the verification of HSTS, the planner and scheduler of the Remote Agent autonomous control system deployed in Deep Space 1 (DS1). The verification is done using UPPAAL, a real time model checking tool. We start by motivating our work in the introduction. Then we give a brief description of HSTS and UPPAAL. After that, we give a mapping of HSTS models into UPPAAL and we present samples of plan model properties one may want to verify. Finally, we conclude with a summary.

  18. Challenges in High-Assurance Runtime Verification

    NASA Technical Reports Server (NTRS)

    Goodloe, Alwyn E.

    2016-01-01

    Safety-critical systems are growing more complex and becoming increasingly autonomous. Runtime Verification (RV) has the potential to provide protections when a system cannot be assured by conventional means, but only if the RV itself can be trusted. In this paper, we proffer a number of challenges to realizing high-assurance RV and illustrate how we have addressed them in our research. We argue that high-assurance RV provides a rich target for automated verification tools in the hope of fostering closer collaboration among the communities.

  19. Electric power system test and verification program

    NASA Technical Reports Server (NTRS)

    Rylicki, Daniel S.; Robinson, Frank, Jr.

    1994-01-01

    Space Station Freedom's (SSF's) electric power system (EPS) hardware and software verification is performed at all levels of integration, from components to assembly and system level tests. Careful planning is essential to ensure the EPS is tested properly on the ground prior to launch. The results of the test performed on breadboard model hardware and analyses completed to date have been evaluated and used to plan for design qualification and flight acceptance test phases. These results and plans indicate the verification program for SSF's 75-kW EPS would have been successful and completed in time to support the scheduled first element launch.

  20. On Backward-Style Anonymity Verification

    NASA Astrophysics Data System (ADS)

    Kawabe, Yoshinobu; Mano, Ken; Sakurada, Hideki; Tsukada, Yasuyuki

    Many Internet services and protocols should guarantee anonymity; for example, an electronic voting system should guarantee to prevent the disclosure of who voted for which candidate. To prove trace anonymity, which is an extension of the formulation of anonymity by Schneider and Sidiropoulos, this paper presents an inductive method based on backward anonymous simulations. We show that the existence of an image-finite backward anonymous simulation implies trace anonymity. We also demonstrate the anonymity verification of an e-voting protocol (the FOO protocol) with our backward anonymous simulation technique. When proving the trace anonymity, this paper employs a computer-assisted verification tool based on a theorem prover.

  1. Cold fusion verification. Final report for period ending 1989

    SciTech Connect

    North, M.H.; Mastny, G.F.; Wesley, E.J.

    1991-03-01

    The objective of this work was to verify and reproduce experimental observations of Cold Nuclear Fusion (CNF), as originally reported in 1989. The method was to start with the original report and add such additional information as became available to build a set of operational electrolytic CNF cells. Verification was to be achieved by first observing cells for neutron production, and for those cells that demonstrated a nuclear effect, careful calorimetric measurements were planned. The authors concluded, after laboratory experience, reading published work, talking with others in the field, and attending conferences, that CNF probably is a chimera and will go the way of N-rays and polywater. The neutron detector used for these tests was a completely packaged unit built into a metal suitcase that afforded electrostatic shielding for the detectors and self-contained electronics. It was battery-powered, although it was on charge for most of the long tests. The sensor element consists of He detectors arranged in three independent layers in a solid moderating block. The counts from each of the three layers, as well as the sum of all the detectors, were brought out and recorded separately. The neutron measurements were made with both the neutron detector and the sample tested in a cave made of thick moderating material that surrounded the two units on the sides and bottom.
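
    As a generic illustration of the counting-statistics check such neutron measurements rely on (not the authors' analysis; the rates, times, and names below are hypothetical), one can ask whether an observed count significantly exceeds the expected detector background under Poisson statistics.

    from scipy.stats import poisson

    def excess_significance(counts, live_time_h, background_rate_per_h):
        """One-sided p-value that 'counts' arose from background alone."""
        expected = background_rate_per_h * live_time_h
        p_value = poisson.sf(counts - 1, expected)   # P(N >= counts | background only)
        return expected, p_value

    # Hypothetical 24 h run with a 5 counts/h background:
    expected, p = excess_significance(counts=160, live_time_h=24.0,
                                      background_rate_per_h=5.0)
    print(f"Expected background: {expected:.0f} counts, p-value: {p:.3g}")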

  2. Concept of Operations for Nuclear Warhead Embedded Sensors

    SciTech Connect

    Rockett, P D; Koncher, T R

    2012-05-16

    Embedded arms-control sensors provide a powerful new paradigm for managing compliance with future nuclear weapons treaties, where deployed warhead numbers will be reduced to 1000 or fewer. The CONOPS (Concept of Operations) for use with these sensors is a practical tool with which one may help define design parameters, including size, power, resolution, communications, and physical structure. How frequently must data be acquired, and must a human be present? Will such data be acquired only for stored weapons, or will it be required of deployed weapons as well? Will tactical weapons be subject to such monitoring, or will only strategic weapons apply? Which data will be most crucial? Will OSIs be a component of embedded sensor data management, or will these sensors stand alone in their data extraction processes? The problem space is massive but can be constrained by extrapolating to a reasonable future treaty regime and examining the bounded options this scenario poses. Arms control verification sensors, embedded within the warhead case or aeroshell, must provide sufficient but not excessively detailed data, confirming that the item is a nuclear warhead and that it is a particular warhead, without revealing sensitive information. Geolocation will be provided by an intermediate transceiver used to acquire the data and to forward the data to a central processing location. Past Chain-of-Custody projects have included such devices and will be primarily responsible for adding such indicators in the future. For the purposes of a treaty regime, a TLI will be verified as a nuclear warhead by knowledge of (a) the presence and mass of SNM, (b) the presence of HE, and (c) the reporting of a unique tag ID. All of these parameters can be obtained via neutron correlation measurements, Raman spectroscopy, and fiber optic grating fabrication, respectively. Data from these sensors will be pushed out monthly and acquired nearly daily, providing one of several verification layers in depth

  3. Verification of Advective Bar Elements Implemented in the Aria Thermal Response Code.

    SciTech Connect

    Mills, Brantley

    2016-01-01

    A verification effort was undertaken to evaluate the implementation of the new advective bar capability in the Aria thermal response code. Several approaches to the verification process were taken: a mesh refinement study to demonstrate solution convergence in the fluid and the solid, visually examining the mapping of the advective bar element nodes to the surrounding surfaces, and a comparison of solutions produced using the advective bars for simple geometries with solutions from commercial CFD software. The mesh refinement study has shown solution convergence for simple pipe flow in both temperature and velocity. Guidelines were provided to achieve appropriate meshes between the advective bar elements and the surrounding volume. Simulations of pipe flow using advective bar elements in Aria have been compared to simulations using the commercial CFD software ANSYS Fluent(R) and provided comparable solutions in temperature and velocity, supporting proper implementation of the new capability. Acknowledgements: A special thanks goes to Dean Dobranich for his guidance and expertise through all stages of this effort. His advice and feedback were instrumental to its completion. Thanks also goes to Sam Subia and Tolu Okusanya for helping to plan many of the verification activities performed in this document. Thank you to Sam, Justin Lamb, and Victor Brunini for their assistance in resolving issues encountered with running the advective bar element model. Finally, thanks goes to Dean, Sam, and Adam Hetzler for reviewing the document and providing very valuable comments.
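
    A mesh refinement study of the kind described above typically reports an observed order of convergence computed from solutions on successively refined grids. The sketch below is a generic illustration rather than part of the Aria verification suite; the temperature values and the refinement ratio are hypothetical.

    import math

    def observed_order(f_coarse, f_medium, f_fine, r):
        """Observed order of accuracy from three grids with constant refinement ratio r."""
        return math.log(abs(f_coarse - f_medium) / abs(f_medium - f_fine)) / math.log(r)

    # Hypothetical peak temperatures (K) from three successively refined pipe-flow meshes:
    T_coarse, T_medium, T_fine = 512.3, 505.1, 503.3
    p = observed_order(T_coarse, T_medium, T_fine, r=2.0)
    print(f"Observed order of convergence: {p:.2f}")   # ~2 for a second-order scheme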

  4. Helping Teachers Help Themselves: Professional Development That Makes a Difference

    ERIC Educational Resources Information Center

    Patton, Kevin; Parker, Melissa; Tannehill, Deborah

    2015-01-01

    For school administrators to facilitate impactful teacher professional development, a shift in thinking that goes beyond the acquisition of new skills and knowledge to helping teachers rethink their practice is required. Based on review of the professional development literature and our own continued observations of professional development, this…

  5. Helping Schools Help Children. Research Report No. 2.

    ERIC Educational Resources Information Center

    National Inst. of Mental Health (DHEW), Bethesda, MD. Center for Studies of Crime and Delinquency.

    This pamphlet briefly reports on an experimental program designed to help the underachieving student whose academic and behavioral problems keep him in trouble with school officials. The project is based on the following premises: (1) children who learn basic academic skills and appropriate behaviors will be less vulnerable to future problems; (2)…

  6. 24 CFR 5.512 - Verification of eligible immigration status.

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ... immigration status. 5.512 Section 5.512 Housing and Urban Development Office of the Secretary, Department of... Noncitizens § 5.512 Verification of eligible immigration status. (a) General. Except as described in paragraph...) Primary verification—(1) Automated verification system. Primary verification of the immigration status...

  7. 24 CFR 5.512 - Verification of eligible immigration status.

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... immigration status. 5.512 Section 5.512 Housing and Urban Development Office of the Secretary, Department of... Noncitizens § 5.512 Verification of eligible immigration status. (a) General. Except as described in paragraph...) Primary verification—(1) Automated verification system. Primary verification of the immigration status...

  8. 24 CFR 5.512 - Verification of eligible immigration status.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... immigration status. 5.512 Section 5.512 Housing and Urban Development Office of the Secretary, Department of... Noncitizens § 5.512 Verification of eligible immigration status. (a) General. Except as described in paragraph...) Primary verification—(1) Automated verification system. Primary verification of the immigration status...

  9. 24 CFR 5.512 - Verification of eligible immigration status.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... immigration status. 5.512 Section 5.512 Housing and Urban Development Office of the Secretary, Department of... Noncitizens § 5.512 Verification of eligible immigration status. (a) General. Except as described in paragraph...) Primary verification—(1) Automated verification system. Primary verification of the immigration status...

  10. 24 CFR 5.512 - Verification of eligible immigration status.

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... immigration status. 5.512 Section 5.512 Housing and Urban Development Office of the Secretary, Department of... Noncitizens § 5.512 Verification of eligible immigration status. (a) General. Except as described in paragraph...) Primary verification—(1) Automated verification system. Primary verification of the immigration status...

  11. 40 CFR 1066.240 - Torque transducer verification.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... 40 Protection of Environment 33 2014-07-01 2014-07-01 false Torque transducer verification. 1066... POLLUTION CONTROLS VEHICLE-TESTING PROCEDURES Dynamometer Specifications § 1066.240 Torque transducer verification. Verify torque-measurement systems by performing the verifications described in §§ 1066.270...

  12. 40 CFR 1065.395 - Inertial PM balance verifications.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... 40 Protection of Environment 34 2012-07-01 2012-07-01 false Inertial PM balance verifications... Inertial PM balance verifications. This section describes how to verify the performance of an inertial PM balance. (a) Independent verification. Have the balance manufacturer (or a representative approved by...

  13. 40 CFR 1065.395 - Inertial PM balance verifications.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... 40 Protection of Environment 33 2011-07-01 2011-07-01 false Inertial PM balance verifications... Inertial PM balance verifications. This section describes how to verify the performance of an inertial PM balance. (a) Independent verification. Have the balance manufacturer (or a representative approved by...

  14. 40 CFR 1065.395 - Inertial PM balance verifications.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... 40 Protection of Environment 34 2013-07-01 2013-07-01 false Inertial PM balance verifications... Inertial PM balance verifications. This section describes how to verify the performance of an inertial PM balance. (a) Independent verification. Have the balance manufacturer (or a representative approved by...

  15. 40 CFR 1065.395 - Inertial PM balance verifications.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 40 Protection of Environment 32 2010-07-01 2010-07-01 false Inertial PM balance verifications... Inertial PM balance verifications. This section describes how to verify the performance of an inertial PM balance. (a) Independent verification. Have the balance manufacturer (or a representative approved by...

  16. 40 CFR 1065.395 - Inertial PM balance verifications.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... 40 Protection of Environment 33 2014-07-01 2014-07-01 false Inertial PM balance verifications... Inertial PM balance verifications. This section describes how to verify the performance of an inertial PM balance. (a) Independent verification. Have the balance manufacturer (or a representative approved by...

  17. 15 CFR 748.13 - Delivery Verification (DV).

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... 15 Commerce and Foreign Trade 2 2011-01-01 2011-01-01 false Delivery Verification (DV). 748.13... (CLASSIFICATION, ADVISORY, AND LICENSE) AND DOCUMENTATION § 748.13 Delivery Verification (DV). (a) Scope. (1) BIS may request the licensee to obtain verifications of delivery on a selective basis. A...

  18. 15 CFR 748.13 - Delivery Verification (DV).

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 15 Commerce and Foreign Trade 2 2010-01-01 2010-01-01 false Delivery Verification (DV). 748.13... (CLASSIFICATION, ADVISORY, AND LICENSE) AND DOCUMENTATION § 748.13 Delivery Verification (DV). (a) Scope. (1) BIS may request the licensee to obtain verifications of delivery on a selective basis. A...

  19. 21 CFR 120.11 - Verification and validation.

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... 21 Food and Drugs 2 2014-04-01 2014-04-01 false Verification and validation. 120.11 Section 120.11 Food and Drugs FOOD AND DRUG ADMINISTRATION, DEPARTMENT OF HEALTH AND HUMAN SERVICES (CONTINUED) FOOD... § 120.11 Verification and validation. (a) Verification. Each processor shall verify that the...

  20. Sterilization of compounded parenteral preparations: verification of autoclaves.

    PubMed

    Rahe, Hank

    2013-01-01

    This article discusses the basic principles for verification of a sterilization process and provides a recommended approach to assure that autoclaves deliver the sterility assurance levels required for patient safety. Included is a summary of the protocol and verification (validation) results of a previously published case study involving autoclaves. To assure the sterility of compounded preparations, a verification procedure must be in place.
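
    Autoclave verification protocols of this kind commonly report an accumulated lethality (F0) computed from chamber-temperature data. The sketch below is illustrative only and is not taken from the cited article; the 121.1 °C reference temperature and 10 °C z-value are the conventional saturated-steam assumptions, and the readings are made up.

    def f0(temperatures_c, interval_min, t_ref=121.1, z=10.0):
        """Accumulated lethality: equivalent minutes at the reference temperature."""
        return sum(10 ** ((t - t_ref) / z) for t in temperatures_c) * interval_min

    # Chamber readings taken every 0.5 min during the exposure phase of a cycle:
    readings = [118.0, 120.5, 121.2, 121.4, 121.3, 121.5, 121.4, 121.2]
    print(f"F0 = {f0(readings, 0.5):.2f} equivalent minutes")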

  1. 45 CFR 95.626 - Independent Verification and Validation.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... 45 Public Welfare 1 2011-10-01 2011-10-01 false Independent Verification and Validation. 95.626... (FFP) Specific Conditions for Ffp § 95.626 Independent Verification and Validation. (a) An assessment for independent verification and validation (IV&V) analysis of a State's system development effort...

  2. Passive Tomography for Spent Fuel Verification: Analysis Framework and Instrument Design Study

    SciTech Connect

    White, Timothy A.; Svard, Staffan J.; Smith, Leon E.; Mozin, Vladimir V.; Jansson, Peter; Davour, Anna; Grape, Sophie; Trellue, H.; Deshmukh, Nikhil S.; Wittman, Richard S.; Honkamaa, Tapani; Vaccaro, Stefano; Ely, James

    2015-05-18

    The potential for gamma emission tomography (GET) to detect partial defects within a spent nuclear fuel assembly is being assessed through a collaboration of Support Programs to the International Atomic Energy Agency (IAEA). In the first phase of this study, two safeguards verification objectives have been identified. The first is the independent determination of the number of active pins that are present in the assembly, in the absence of a priori information. The second objective is to provide quantitative measures of pin-by-pin properties, e.g. activity of key isotopes or pin attributes such as cooling time and relative burnup, for the detection of anomalies and/or verification of operator-declared data. The efficacy of GET to meet these two verification objectives will be evaluated across a range of fuel types, burnups, and cooling times, and with a target interrogation time of less than 60 minutes. The evaluation of GET viability for safeguards applications is founded on a modelling and analysis framework applied to existing and emerging GET instrument designs. Monte Carlo models of different fuel types are used to produce simulated tomographer responses to large populations of “virtual” fuel assemblies. Instrument response data are processed by a variety of tomographic-reconstruction and image-processing methods, and scoring metrics specific to each of the verification objectives are defined and used to evaluate the performance of the methods. This paper will provide a description of the analysis framework and evaluation metrics, example performance-prediction results, and describe the design of a “universal” GET instrument intended to support the full range of verification scenarios envisioned by the IAEA.
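
    As a purely illustrative sketch of the reconstruction-and-scoring exercise described above (this is not the collaboration's code or instrument model), a lattice phantom of emitting pins can be forward-projected to emulate a tomographer response, reconstructed by filtered back-projection, and scored for recovered active pins; every parameter below is made up.

    import numpy as np
    from skimage.transform import radon, iradon

    def pin_phantom(size=128, pitch=16, radius=3, missing=((2, 2), (5, 5))):
        """Square lattice of emitting pins; 'missing' positions emulate removed pins."""
        img = np.zeros((size, size))
        yy, xx = np.mgrid[:size, :size]
        c = size / 2
        for i in range(1, size // pitch):
            for j in range(1, size // pitch):
                cy, cx = i * pitch, j * pitch
                outside = (cy - c) ** 2 + (cx - c) ** 2 > (c - 2 * radius) ** 2
                if (i, j) in missing or outside:
                    continue   # skip declared-missing pins and pins outside the scan circle
                img[(yy - cy) ** 2 + (xx - cx) ** 2 <= radius ** 2] = 1.0
        return img

    phantom = pin_phantom()
    theta = np.linspace(0.0, 180.0, 90, endpoint=False)   # projection angles (degrees)
    sinogram = radon(phantom, theta=theta)                 # emulated tomographer response
    recon = iradon(sinogram, theta=theta)                  # filtered back-projection

    # Naive score: fraction of true pin area recovered above a threshold.
    detected = recon > 0.5 * recon.max()
    recovered = (detected & (phantom > 0)).sum() / (phantom > 0).sum()
    print("Recovered active-pin fraction:", round(float(recovered), 3))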

  3. Spent Nuclear Fuel (SNF) Project Cask and MCO Helium Purge System Design Review Completion Report Project A.5 and A.6

    SciTech Connect

    ARD, K.E.

    2000-04-19

    This report documents the results of the design verification performed on the Cask and Multi-Canister Overpack (MCO) Helium Purge System. The helium purge system is part of the Spent Nuclear Fuel (SNF) Project Cask Loadout System (CLS) at the 100K Area. The design verification employed the "Independent Review Method" in accordance with Administrative Procedure (AP) EN-6-027-01.

  4. Nuclear astrophysics

    SciTech Connect

    Haxton, W.C.

    1992-01-01

    The problem of core-collapse supernovae is used to illustrate the many connections between nuclear astrophysics and the problems nuclear physicists study in terrestrial laboratories. Efforts to better understand the collapse and mantle ejection are also motivated by a variety of interdisciplinary issues in nuclear, particle, and astrophysics, including galactic chemical evolution, neutrino masses and mixing, and stellar cooling by the emission of new particles. The current status of theory and observations is summarized.

  5. Nuclear astrophysics

    SciTech Connect

    Haxton, W.C.

    1992-12-31

    The problem of core-collapse supernovae is used to illustrate the many connections between nuclear astrophysics and the problems nuclear physicists study in terrestrial laboratories. Efforts to better understand the collapse and mantle ejection are also motivated by a variety of interdisciplinary issues in nuclear, particle, and astrophysics, including galactic chemical evolution, neutrino masses and mixing, and stellar cooling by the emission of new particles. The current status of theory and observations is summarized.

  6. 21 CFR 123.8 - Verification.

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... 21 Food and Drugs 2 2012-04-01 2012-04-01 false Verification. 123.8 Section 123.8 Food and Drugs FOOD AND DRUG ADMINISTRATION, DEPARTMENT OF HEALTH AND HUMAN SERVICES (CONTINUED) FOOD FOR HUMAN... processor shall verify that the HACCP plan is adequate to control food safety hazards that are...

  7. 20 CFR 325.6 - Verification procedures.

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... Employees' Benefits RAILROAD RETIREMENT BOARD REGULATIONS UNDER THE RAILROAD UNEMPLOYMENT INSURANCE ACT REGISTRATION FOR RAILROAD UNEMPLOYMENT BENEFITS § 325.6 Verification procedures. The Board's procedures for adjudicating and processing applications and claims for unemployment benefits filed pursuant to this part...

  8. 20 CFR 325.6 - Verification procedures.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... Employees' Benefits RAILROAD RETIREMENT BOARD REGULATIONS UNDER THE RAILROAD UNEMPLOYMENT INSURANCE ACT REGISTRATION FOR RAILROAD UNEMPLOYMENT BENEFITS § 325.6 Verification procedures. The Board's procedures for adjudicating and processing applications and claims for unemployment benefits filed pursuant to this part...

  9. 20 CFR 325.6 - Verification procedures.

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ... Employees' Benefits RAILROAD RETIREMENT BOARD REGULATIONS UNDER THE RAILROAD UNEMPLOYMENT INSURANCE ACT REGISTRATION FOR RAILROAD UNEMPLOYMENT BENEFITS § 325.6 Verification procedures. The Board's procedures for adjudicating and processing applications and claims for unemployment benefits filed pursuant to this part...

  10. 20 CFR 325.6 - Verification procedures.

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... Employees' Benefits RAILROAD RETIREMENT BOARD REGULATIONS UNDER THE RAILROAD UNEMPLOYMENT INSURANCE ACT REGISTRATION FOR RAILROAD UNEMPLOYMENT BENEFITS § 325.6 Verification procedures. The Board's procedures for adjudicating and processing applications and claims for unemployment benefits filed pursuant to this part...

  11. 20 CFR 325.6 - Verification procedures.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... Employees' Benefits RAILROAD RETIREMENT BOARD REGULATIONS UNDER THE RAILROAD UNEMPLOYMENT INSURANCE ACT REGISTRATION FOR RAILROAD UNEMPLOYMENT BENEFITS § 325.6 Verification procedures. The Board's procedures for adjudicating and processing applications and claims for unemployment benefits filed pursuant to this part...

  12. 42 CFR 457.380 - Eligibility verification.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... 42 Public Health 4 2014-10-01 2014-10-01 false Eligibility verification. 457.380 Section 457.380 Public Health CENTERS FOR MEDICARE & MEDICAID SERVICES, DEPARTMENT OF HEALTH AND HUMAN SERVICES (CONTINUED) STATE CHILDREN'S HEALTH INSURANCE PROGRAMS (SCHIPs) ALLOTMENTS AND GRANTS TO STATES State...

  13. 42 CFR 457.380 - Eligibility verification.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... 42 Public Health 4 2012-10-01 2012-10-01 false Eligibility verification. 457.380 Section 457.380 Public Health CENTERS FOR MEDICARE & MEDICAID SERVICES, DEPARTMENT OF HEALTH AND HUMAN SERVICES (CONTINUED) STATE CHILDREN'S HEALTH INSURANCE PROGRAMS (SCHIPs) ALLOTMENTS AND GRANTS TO STATES State...

  14. 9 CFR 416.17 - Agency verification.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... 9 Animals and Animal Products 2 2011-01-01 2011-01-01 false Agency verification. 416.17 Section 416.17 Animals and Animal Products FOOD SAFETY AND INSPECTION SERVICE, DEPARTMENT OF AGRICULTURE... (d) Direct observation or testing to assess the sanitary conditions in the establishment....

  15. 9 CFR 416.17 - Agency verification.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... 9 Animals and Animal Products 2 2012-01-01 2012-01-01 false Agency verification. 416.17 Section 416.17 Animals and Animal Products FOOD SAFETY AND INSPECTION SERVICE, DEPARTMENT OF AGRICULTURE... (d) Direct observation or testing to assess the sanitary conditions in the establishment....

  16. 9 CFR 416.17 - Agency verification.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... 9 Animals and Animal Products 2 2014-01-01 2014-01-01 false Agency verification. 416.17 Section 416.17 Animals and Animal Products FOOD SAFETY AND INSPECTION SERVICE, DEPARTMENT OF AGRICULTURE... (d) Direct observation or testing to assess the sanitary conditions in the establishment....

  17. Learner Verification: A Publisher's Case Study.

    ERIC Educational Resources Information Center

    Wilson, George

    Learner verification, a process by which publishers monitor the effectiveness of their products and strive to improve their services to schools, is a practice that most companies take seriously. The quality of educational materials may be ensured in many ways: by analysis of sales, through firsthand investigation, and by employing a system of…

  18. 16 CFR 315.5 - Prescriber verification.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... Commercial Practices FEDERAL TRADE COMMISSION REGULATIONS UNDER SPECIFIC ACTS OF CONGRESS CONTACT LENS RULE § 315.5 Prescriber verification. (a) Prescription requirement. A seller may sell contact lenses only in accordance with a contact lens prescription for the patient that is: (1) Presented to the seller by...

  19. 16 CFR 315.5 - Prescriber verification.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... Commercial Practices FEDERAL TRADE COMMISSION REGULATIONS UNDER SPECIFIC ACTS OF CONGRESS CONTACT LENS RULE § 315.5 Prescriber verification. (a) Prescription requirement. A seller may sell contact lenses only in accordance with a contact lens prescription for the patient that is: (1) Presented to the seller by...

  20. 16 CFR 315.5 - Prescriber verification.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... Commercial Practices FEDERAL TRADE COMMISSION REGULATIONS UNDER SPECIFIC ACTS OF CONGRESS CONTACT LENS RULE § 315.5 Prescriber verification. (a) Prescription requirement. A seller may sell contact lenses only in accordance with a contact lens prescription for the patient that is: (1) Presented to the seller by...