Science.gov

Sample records for nuclear verification helping

  1. Nuclear disarmament verification

    SciTech Connect

    DeVolpi, A.

    1993-12-31

    Arms control treaties, unilateral actions, and cooperative activities -- reflecting the defusing of East-West tensions -- are causing nuclear weapons to be disarmed and dismantled worldwide. In order to provide for future reductions and to build confidence in the permanency of this disarmament, verification procedures and technologies would play an important role. This paper outlines arms-control objectives, treaty organization, and actions that could be undertaken. For the purposes of this Workshop on Verification, nuclear disarmament has been divided into five topical subareas: Converting nuclear-weapons production complexes, Eliminating and monitoring nuclear-weapons delivery systems, Disabling and destroying nuclear warheads, Demilitarizing or non-military utilization of special nuclear materials, and Inhibiting nuclear arms in non-nuclear-weapons states. This paper concludes with an overview of potential methods for verification.

  2. Helping nuclear power help us

    SciTech Connect

    Schecker, Jay A

    2009-01-01

    After a prolonged absence, the word 'nuclear' has returned to the lexicon of sustainable domestic energy resources. Due in no small part to its demonstrated reliability, nuclear power is poised to play a greater role in the nation's energy future, producing clean, carbon-neutral electricity and contributing even more to our energy security. To nuclear scientists, the resurgence presents an opportunity to inject new technologies into the industry to maximize the benefits that nuclear energy can provide. 'By developing new options for waste management and exploiting new materials to make key technological advances, we can significantly impact the use of nuclear energy in our future energy mix,' says Chris Stanek, a materials scientist at Los Alamos National Laboratory. Stanek approaches the big technology challenges by thinking way small, all the way down to the atoms. He and his colleagues are using cutting-edge atomic-scale simulations to address a difficult aspect of nuclear waste -- predicting its behavior far into the future. Their research is part of a broader, coordinated effort on the part of the Laboratory to use its considerable experimental, theoretical, and computational capabilities to explore advanced materials central not only to waste issues, but to nuclear fuels as well.

  3. Nuclear Data Verification and Standardization

    SciTech Connect

    Karam, Lisa R.; Arif, Muhammad; Thompson, Alan K.

    2011-10-01

    The objective of this interagency program is to provide accurate neutron interaction verification and standardization data for the U.S. Department of Energy Division of Nuclear Physics programs, which include astrophysics, radioactive beam studies, and heavy-ion reactions. The measurements made in this program are also useful to other programs that indirectly use the unique properties of the neutron for diagnostic and analytical purposes. These include homeland security, personnel health and safety, nuclear waste disposal, treaty verification, national defense, and nuclear-based energy production. The work includes the verification of reference standard cross sections and related neutron data employing the unique facilities and capabilities at NIST and other laboratories as required; leadership and participation in international intercomparisons and collaborations; and the preservation of standard reference deposits. An essential element of the program is critical evaluation of neutron interaction data standards, including international coordination. Data testing of critical data for important applications is included. The program is jointly supported by the Department of Energy and the National Institute of Standards and Technology.

  4. The monitoring and verification of nuclear weapons

    SciTech Connect

    Garwin, Richard L.

    2014-05-09

    This paper partially reviews and updates the potential for monitoring and verification of nuclear weapons, including verification of their destruction. Cooperative monitoring with templates of the gamma-ray spectrum is an important tool, dependent on the use of information barriers.

  5. Thoughts on Verification of Nuclear Disarmament

    SciTech Connect

    Dunlop, W H

    2007-09-26

    perfect, and it was expected that occasionally there might be a verification measurement slightly above 150 kt. But the accuracy was much improved over the earlier seismic measurements. In fact, some of this improvement came about because, as part of this verification protocol, the US and Soviet Union provided the yields of several past tests to improve seismic calibrations; this supplied a much-needed calibration for the seismic measurements. Since nuclear tests were to a large part R&D related, it was also accepted that occasionally there might be a test slightly above 150 kt, as the yield could not always be predicted with high accuracy in advance of the test. While one could hypothesize that the Soviets might test at some location other than their test sites, such a test would clearly be observed and would be a violation of the treaty even if it were only a small fraction of 150 kt. So the issue of clandestine tests of significance was easily covered for this particular treaty.
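
    The yield-estimation logic described above can be illustrated with a toy calibration exercise. The sketch below fits a simple magnitude-yield relation, m_b = a + b log10(Y), to a handful of announced calibration shots and then inverts it for a new event; the coefficients and data are invented for illustration and are not the values used in the actual US-Soviet protocol.

        # Hypothetical illustration: calibrate a magnitude-yield relation
        # m_b = a + b*log10(Y) with announced yields, then estimate the yield
        # of a new event.  All numbers are invented for illustration only.
        import numpy as np

        # Announced calibration shots: (body-wave magnitude, yield in kilotons)
        calib = np.array([(5.4, 20.0), (5.9, 60.0), (6.1, 110.0), (6.2, 140.0)])
        mb, yld = calib[:, 0], calib[:, 1]

        # Least-squares fit of m_b against log10(yield)
        b, a = np.polyfit(np.log10(yld), mb, 1)   # slope, intercept

        def yield_estimate(mb_obs):
            """Invert the calibrated relation to estimate yield (kt)."""
            return 10 ** ((mb_obs - a) / b)

        print(f"calibrated relation: m_b = {a:.2f} + {b:.2f} log10(Y)")
        print(f"estimated yield for m_b = 6.25: {yield_estimate(6.25):.0f} kt")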

  6. Requirements for Nuclear Test Verification

    NASA Astrophysics Data System (ADS)

    Dreicer, M.

    2001-05-01

    Verification comprises the collection and assessment of reliable, relevant information for ascertaining the degree to which our foreign partners are adhering to their international security commitments. In the past, treaty verification was largely a bilateral undertaking, and the information used to make compliance judgements was under government control. Verification data was collected in joint bilaterally-controlled conditions or at large distances. The international verification regime being developed by the Comprehensive Test Ban Treaty Preparatory Commission will be providing a vast amount of data to a large number of Member States and scientific researchers. The increasingly rapid communication of data from many global sources, including the International Monitoring System, has shifted the traditional views of how verification should be implemented. The newly formed Bureau of Verification and Compliance in the U.S. Department of State is working to develop an overall concept of what sources of information and day-to-day activities are needed to carry out its verification and compliance functions. This presentation will set out preliminary ideas of how this will be done and of what types of research and development are needed.

  7. DESIGN INFORMATION VERIFICATION FOR NUCLEAR SAFEGUARDS

    SciTech Connect

    Robert S. Bean; Richard R. M. Metcalf; Phillip C. Durst

    2009-07-01

    A critical aspect of international safeguards activities performed by the International Atomic Energy Agency (IAEA) is the verification that facility design and construction (including upgrades and modifications) do not create opportunities for nuclear proliferation. These Design Information Verification (DIV) activities require that IAEA inspectors compare current and past information about the facility to verify the operator’s declaration of proper use. The actual practice of DIV presents challenges to the inspectors due to the large amount of data generated, concerns about sensitive or proprietary data, the overall complexity of the facility, and the effort required to extract just the safeguards-relevant information. Planned and anticipated facilities will (especially in the case of reprocessing plants) be ever larger and increasingly complex, thus exacerbating the challenges. This paper reports the results of a workshop held at the Idaho National Laboratory in March 2009, which considered technologies and methods to address these challenges. The use of 3D Laser Range Finding, Outdoor Visualization System, Gamma-LIDAR, and virtual facility modeling, as well as methods to handle the facility data issues (quantity, sensitivity, and accessibility and portability for the inspector) were presented. The workshop attendees drew conclusions about the use of these techniques with respect to successfully employing them in an operating environment, using a Fuel Conditioning Facility walk-through as a baseline for discussion.

  8. A Zero Knowledge Protocol For Nuclear Warhead Verification

    SciTech Connect

    Glaser, Alexander; Goldston, Robert J.

    2014-03-14

    The verification of nuclear warheads for arms control faces a paradox: International inspectors must gain high confidence in the authenticity of submitted items while learning nothing about them. Conventional inspection systems featuring 'information barriers', designed to hide measurements stored in electronic systems, are at risk of tampering and snooping. Here we show the viability of a fundamentally new approach to nuclear warhead verification that incorporates a zero-knowledge protocol, designed such that sensitive information is never measured and so does not need to be hidden. We interrogate submitted items with energetic neutrons, making, in effect, differential measurements of neutron transmission and emission. Calculations of diversion scenarios show that a high degree of discrimination can be achieved while revealing zero information. Timely demonstration of the viability of such an approach could be critical for the next round of arms-control negotiations, which will likely require verification of individual warheads, rather than whole delivery systems.
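
    The differential-measurement idea sketched in this abstract can be illustrated numerically. In the toy model below, the host preloads each (non-electronic) detector with the complement of the counts a genuine warhead would produce, so a matching item drives every detector to the same total and the inspector sees only a flat pattern; the detector positions, count levels, and Poisson noise model are illustrative assumptions, not the authors' experimental parameters.

        # Toy model of a zero-knowledge differential measurement: the host
        # preloads each detector with N_MAX minus the counts a genuine warhead
        # would produce, so a genuine item drives every detector to the same
        # total and the inspector learns only "match" or "no match".
        # All numbers here are invented for illustration.
        import numpy as np

        rng = np.random.default_rng(0)
        N_MAX = 10_000
        template = np.array([3200, 4100, 2600, 3800, 2900])  # expected counts per detector

        def measure(item_counts):
            """Preload detectors, expose them to the item, return the totals."""
            preload = N_MAX - template
            return preload + rng.poisson(item_counts)

        genuine = template                                # item matching the template
        diverted = template.copy()
        diverted[2] -= 800                                # material removed at one position

        for name, item in [("genuine", genuine), ("diverted", diverted)]:
            totals = measure(item)
            # A flat pattern (all totals ~ N_MAX) reveals nothing about the template itself
            deviation = np.max(np.abs(totals - N_MAX)) / np.sqrt(N_MAX)
            print(f"{name}: max deviation = {deviation:.1f} sigma")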

  9. Nuclear reaction modeling, verification experiments, and applications

    SciTech Connect

    Dietrich, F.S.

    1995-10-01

    This presentation summarized the recent accomplishments and future promise of the neutron nuclear physics program at the Manuel Lujan Jr. Neutron Scattering Center (MLNSC) and the Weapons Neutron Research (WNR) facility. The unique capabilities of the spallation sources enable a broad range of experiments in weapons-related physics, basic science, nuclear technology, industrial applications, and medical physics.

  10. Physical cryptographic verification of nuclear warheads

    NASA Astrophysics Data System (ADS)

    Kemp, R. Scott; Danagoulian, Areg; Macdonald, Ruaridh R.; Vavrek, Jayson R.

    2016-08-01

    How does one prove a claim about a highly sensitive object such as a nuclear weapon without revealing information about the object? This paradox has challenged nuclear arms control for more than five decades. We present a mechanism in the form of an interactive proof system that can validate the structure and composition of an object, such as a nuclear warhead, to arbitrary precision without revealing either its structure or composition. We introduce a tomographic method that simultaneously resolves both the geometric and isotopic makeup of an object. We also introduce a method of protecting information using a provably secure cryptographic hash that does not rely on electronics or software. These techniques, when combined with a suitable protocol, constitute an interactive proof system that could reject hoax items and clear authentic warheads with excellent sensitivity in reasonably short measurement times.

  11. Physical cryptographic verification of nuclear warheads

    DOE PAGES

    Kemp, R. Scott; Danagoulian, Areg; Macdonald, Ruaridh R.; ...

    2016-07-18

    How does one prove a claim about a highly sensitive object such as a nuclear weapon without revealing information about the object? This paradox has challenged nuclear arms control for more than five decades. We present a mechanism in the form of an interactive proof system that can validate the structure and composition of an object, such as a nuclear warhead, to arbitrary precision without revealing either its structure or composition. We introduce a tomographic method that simultaneously resolves both the geometric and isotopic makeup of an object. We also introduce a method of protecting information using a provably secure cryptographic hash that does not rely on electronics or software. Finally, these techniques, when combined with a suitable protocol, constitute an interactive proof system that could reject hoax items and clear authentic warheads with excellent sensitivity in reasonably short measurement times.

  12. Physical cryptographic verification of nuclear warheads

    PubMed Central

    Kemp, R. Scott; Danagoulian, Areg; Macdonald, Ruaridh R.; Vavrek, Jayson R.

    2016-01-01

    How does one prove a claim about a highly sensitive object such as a nuclear weapon without revealing information about the object? This paradox has challenged nuclear arms control for more than five decades. We present a mechanism in the form of an interactive proof system that can validate the structure and composition of an object, such as a nuclear warhead, to arbitrary precision without revealing either its structure or composition. We introduce a tomographic method that simultaneously resolves both the geometric and isotopic makeup of an object. We also introduce a method of protecting information using a provably secure cryptographic hash that does not rely on electronics or software. These techniques, when combined with a suitable protocol, constitute an interactive proof system that could reject hoax items and clear authentic warheads with excellent sensitivity in reasonably short measurement times. PMID:27432959

  13. Physical cryptographic verification of nuclear warheads.

    PubMed

    Kemp, R Scott; Danagoulian, Areg; Macdonald, Ruaridh R; Vavrek, Jayson R

    2016-08-02

    How does one prove a claim about a highly sensitive object such as a nuclear weapon without revealing information about the object? This paradox has challenged nuclear arms control for more than five decades. We present a mechanism in the form of an interactive proof system that can validate the structure and composition of an object, such as a nuclear warhead, to arbitrary precision without revealing either its structure or composition. We introduce a tomographic method that simultaneously resolves both the geometric and isotopic makeup of an object. We also introduce a method of protecting information using a provably secure cryptographic hash that does not rely on electronics or software. These techniques, when combined with a suitable protocol, constitute an interactive proof system that could reject hoax items and clear authentic warheads with excellent sensitivity in reasonably short measurement times.

  14. Methods of Verification, Accountability and Control of Special Nuclear Material

    SciTech Connect

    Stewart, J.E.

    1999-05-03

    This session demonstrates nondestructive assay (NDA) measurement, surveillance and analysis technology required to protect, control and account (MPC and A) for special nuclear materials (SNM) in sealed containers. These measurements, observations and analyses comprise state-of-the-art, strengthened SNM safeguards systems. Staff member specialists, actively involved in research, development, training and implementation worldwide, will present six NDA verification systems and two software tools for integration and analysis of facility MPC and A data.

  15. As-Built Verification Plan Spent Nuclear Fuel Canister Storage Building MCO Handling Machine

    SciTech Connect

    SWENSON, C.E.

    2000-10-19

    This as-built verification plan outlines the methodology and responsibilities that will be implemented during the as-built field verification activity for the Canister Storage Building (CSB) MCO Handling Machine (MHM). This plan covers the electrical portion of the construction performed by Power City under contract to Mowat. The as-built verifications will be performed in accordance with Administrative Procedure AP 6-012-00, Spent Nuclear Fuel Project As-Built Verification Plan Development Process, revision I. The results of the verification walkdown will be documented in a verification walkdown completion package, approved by the Design Authority (DA), and maintained in the CSB project files.

  16. A seismic event analyzer for nuclear test ban treaty verification

    SciTech Connect

    Mason, C.L.; Johnson, R.R. (Dept. of Applied Science, Lawrence Livermore National Lab., CA); Searfus, R.M.; Lager, D.; Canales, T.

    1988-01-01

    This paper presents an expert system that interprets seismic data from Norway's regional seismic array, NORESS, for underground nuclear weapons test ban treaty verification. Three important aspects of the expert system are (1) it emulates the problem-solving behavior of the human seismic analyst using an Assumption Based Truth Maintenance System, (2) it acts as an assistant to the human analyst by automatically interpreting and presenting events for review, and (3) it enables the analyst to interactively query the system's chain of reasoning and manually perform an interpretation. The general problem of seismic treaty verification is described. The expert system is presented in terms of knowledge representation structures, the assumption-based reasoning system, user interface elements, and initial performance results. 8 refs., 10 figs., 2 tabs.

  17. A zero-knowledge protocol for nuclear warhead verification

    NASA Astrophysics Data System (ADS)

    Glaser, Alexander; Barak, Boaz; Goldston, Robert J.

    2014-06-01

    The verification of nuclear warheads for arms control involves a paradox: international inspectors will have to gain high confidence in the authenticity of submitted items while learning nothing about them. Proposed inspection systems featuring `information barriers', designed to hide measurements stored in electronic systems, are at risk of tampering and snooping. Here we show the viability of a fundamentally new approach to nuclear warhead verification that incorporates a zero-knowledge protocol, which is designed in such a way that sensitive information is never measured and so does not need to be hidden. We interrogate submitted items with energetic neutrons, making, in effect, differential measurements of both neutron transmission and emission. Calculations for scenarios in which material is diverted from a test object show that a high degree of discrimination can be achieved while revealing zero information. Our ideas for a physical zero-knowledge system could have applications beyond the context of nuclear disarmament. The proposed technique suggests a way to perform comparisons or computations on personal or confidential data without measuring the data in the first place.

  18. A zero-knowledge protocol for nuclear warhead verification.

    PubMed

    Glaser, Alexander; Barak, Boaz; Goldston, Robert J

    2014-06-26

    The verification of nuclear warheads for arms control involves a paradox: international inspectors will have to gain high confidence in the authenticity of submitted items while learning nothing about them. Proposed inspection systems featuring 'information barriers', designed to hide measurements stored in electronic systems, are at risk of tampering and snooping. Here we show the viability of a fundamentally new approach to nuclear warhead verification that incorporates a zero-knowledge protocol, which is designed in such a way that sensitive information is never measured and so does not need to be hidden. We interrogate submitted items with energetic neutrons, making, in effect, differential measurements of both neutron transmission and emission. Calculations for scenarios in which material is diverted from a test object show that a high degree of discrimination can be achieved while revealing zero information. Our ideas for a physical zero-knowledge system could have applications beyond the context of nuclear disarmament. The proposed technique suggests a way to perform comparisons or computations on personal or confidential data without measuring the data in the first place.

  19. TRANSPARENCY, VERIFICATION AND THE FUTURE OF NUCLEAR NONPROLIFERATION AND ARMS CONTROL

    SciTech Connect

    J. PILAT

    2000-11-01

    In the future, if the nuclear nonproliferation and arms control agendas are to advance, they will likely become increasingly seen as parallel undertakings with the objective of cradle-to-grave controls over nuclear warheads and/or materials. The pursuit of such an agenda was difficult enough at the outset of the nuclear age; it will be more difficult in the future with relatively widespread military and civil nuclear programs. This agenda will require both verification and transparency. To address emerging nuclear dangers, we may expect hybrid verification-transparency regimes to be seen as acceptable. Such regimes would have intrusive but much more limited verification provisions than Cold War accords, and have extensive transparency provisions designed in part to augment the verification measures, to fill in the 'gaps' of the verification regime, and the like.

  20. Nuclear Resonance Fluorescence for Material Verification in Dismantlement

    SciTech Connect

    Warren, Glen A.; Detwiler, Rebecca S.

    2011-10-01

    Nuclear resonance fluorescence (NRF) is a well-established physical process that provides an isotope-specific signature that can be exploited for isotopic detection and characterization of samples. Pacific Northwest National Laboratory has been investigating possible applications of NRF for national security. Of the investigated applications, the verification of material in the dismantlement process is the most promising. Through a combination of benchmarking measurements and radiation transport modeling, we have shown that NRF techniques with existing bremsstrahlung photon sources and a modest detection system can be used to detect highly enriched uranium in the quantities and time limits relevant to the dismantlement process. Issues such as orientation, placement and material geometry do not significantly impact the sensitivity of the technique. We have also investigated how shielding of the uranium would be observed through non-NRF processes to enable the accurate assay of the material. This paper will discuss our findings on how NRF and photon-interrogation techniques may be applied to the material verification in the dismantlement process.

  1. Supporting the President's Arms Control and Nonproliferation Agenda: Transparency and Verification for Nuclear Arms Reductions

    SciTech Connect

    Doyle, James E; Meek, Elizabeth

    2009-01-01

    The President's arms control and nonproliferation agenda is still evolving and the details of initiatives supporting it remain undefined. This means that DOE, NNSA, NA-20, NA-24 and the national laboratories can help define the agenda, and the policies and the initiatives to support it. This will require effective internal and interagency coordination. The arms control and nonproliferation agenda is broad and includes the path-breaking goal of creating conditions for the elimination of nuclear weapons. Responsibility for various elements of the agenda will be widely scattered across the interagency. Therefore an interagency mapping exercise should be performed to identify the key points of engagement within NNSA and other agencies for creating effective policy coordination mechanisms. These can include informal networks, working groups, coordinating committees, interagency task forces, etc. It will be important for NA-20 and NA-24 to get a seat at the table and a functional role in many of these coordinating bodies. The arms control and nonproliferation agenda comprises both mature and developing policy initiatives. The more mature elements such as CTBT ratification and a follow-on strategic nuclear arms treaty with Russia have defined milestones. However, recent press reports indicate that even the START follow-on strategic arms pact that is planned to be complete by the end of 2009 may take significantly longer and be more expansive in scope. The Russians called for proposals to count non-deployed as well as deployed warheads. Other elements of the agenda such as FMCT, future bilateral nuclear arms reductions following a START follow-on treaty, nuclear posture changes, preparations for an international nuclear security summit, strengthened international safeguards and multilateral verification are in much earlier stages of development. For this reason any survey of arms control capabilities within the USG should be structured to address potential needs across the

  2. Verification as a Foundation for Validation of a Nuclear Fuel Performance Code

    SciTech Connect

    J. D. Hales; S. R. Novascone; B. W. Spencer; R. L. Williamson; G. Pastore; D. M. Perez

    2014-09-01

    Complex multiphysics simulations such as nuclear fuel performance analysis are composed of many submodels used to describe specific phenomena. These phenomena include, as examples, the relationship between stress and strain, heat transfer across a gas gap, and mechanical contact. These submodels work in concert to simulate real-world events, like the behavior of a fuel rod in a reactor. If a simulation tool is able to represent real-world behavior, the tool is said to be validated. While much emphasis is rightly placed on validation, model verification may be undervalued. Verification involves showing that a model performs as intended, that it computes results consistent with its mathematical description. This paper explains the differences between verification and validation and shows how validation should be preceded by verification. Specific verification problems, including several specific to nuclear fuel analysis, are given. Validation results are also presented.
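
    As a minimal illustration of verification in the sense used above (showing that a code reproduces its mathematical description), the sketch below solves a simple boundary-value problem with finite differences, compares against the exact solution, and checks the observed order of accuracy. The problem is generic and is not drawn from the fuel performance code discussed in the paper.

        # Minimal verification exercise: a finite-difference solution of
        # -T''(x) = sin(pi x) on (0, 1) with T(0) = T(1) = 0 is compared against
        # the exact solution T(x) = sin(pi x) / pi**2, and the observed order of
        # accuracy is checked against the expected second order.
        import numpy as np

        def max_error(n):
            """Solve on n interior nodes; return the max error vs. the exact solution."""
            x = np.linspace(0.0, 1.0, n + 2)
            h = x[1] - x[0]
            A = (np.diag(np.full(n, 2.0))
                 - np.diag(np.ones(n - 1), 1)
                 - np.diag(np.ones(n - 1), -1)) / h**2
            T = np.zeros(n + 2)
            T[1:-1] = np.linalg.solve(A, np.sin(np.pi * x[1:-1]))
            exact = np.sin(np.pi * x) / np.pi**2
            return np.max(np.abs(T - exact))

        e_coarse, e_fine = max_error(20), max_error(40)
        print(f"observed order of accuracy: {np.log2(e_coarse / e_fine):.2f} (expect ~2)")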

  3. Fuzzy-logic-based safety verification framework for nuclear power plants.

    PubMed

    Rastogi, Achint; Gabbar, Hossam A

    2013-06-01

    This article presents a practical implementation of a safety verification framework for nuclear power plants (NPPs) based on fuzzy logic, in which hazard scenarios are identified in view of safety and control limits for different plant process values. Risk is estimated quantitatively and compared with safety limits in real time so that safety verification can be achieved. Fuzzy logic is used to define safety rules that map hazard conditions to the required safety protection in view of the risk estimate. Case studies from an NPP are analyzed to realize the proposed real-time safety verification framework. An automated system is developed to demonstrate the safety limit for different hazard scenarios. © 2012 Society for Risk Analysis.
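
    A minimal sketch of the kind of fuzzy mapping described is given below: a process value is fuzzified with triangular membership functions, simple rules turn the fuzzy state into a risk score, and the score is compared against a safety limit. The membership functions, rule weights, temperatures, and limit are all invented placeholders, not values from the article.

        # Toy fuzzy-logic safety check: a process value is fuzzified with
        # triangular membership functions, simple rules map the fuzzy state to
        # a risk estimate, and the result is compared against a safety limit.
        # All membership functions, rules, and limits are invented examples.
        def tri(x, a, b, c):
            """Triangular membership function peaking at b on support [a, c]."""
            if x <= a or x >= c:
                return 0.0
            return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

        def risk_estimate(coolant_temp_c):
            """Map a (hypothetical) coolant temperature to a 0-1 risk score."""
            normal  = tri(coolant_temp_c, 250.0, 290.0, 320.0)
            high    = tri(coolant_temp_c, 300.0, 330.0, 350.0)
            extreme = tri(coolant_temp_c, 340.0, 370.0, 400.0)
            # Rule weights: NORMAL -> low risk, HIGH -> medium risk, EXTREME -> high risk
            weights = normal + high + extreme
            return (0.1 * normal + 0.5 * high + 0.9 * extreme) / weights if weights else 1.0

        SAFETY_LIMIT = 0.6   # hypothetical acceptance threshold
        for temp in (295.0, 335.0, 372.0):
            risk = risk_estimate(temp)
            status = "OK" if risk <= SAFETY_LIMIT else "PROTECTIVE ACTION"
            print(f"T = {temp:5.1f} C  risk = {risk:.2f}  -> {status}")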

  4. NDEC: A NEA platform for nuclear data testing, verification and benchmarking

    NASA Astrophysics Data System (ADS)

    Díez, C. J.; Michel-Sendis, F.; Cabellos, O.; Bossant, M.; Soppera, N.

    2017-09-01

    The selection, testing, verification and benchmarking of evaluated nuclear data consists, in practice, of putting an evaluated file through a number of checking steps in which different computational codes verify that the file and the data it contains comply with different requirements. These requirements range from format compliance to good performance in application cases, while at the same time physical constraints and the agreement with experimental data are verified. At NEA, the NDEC (Nuclear Data Evaluation Cycle) platform aims at providing, in a user-friendly interface, a thorough diagnosis of the quality of a submitted evaluated nuclear data file. This diagnosis is based on the results of different computational codes and routines which carry out the mentioned verifications, tests and checks. NDEC also seeks synergies with other existing NEA tools and databases, such as JANIS, DICE or NDaST, incorporating them into its working scheme. Hence, this paper presents NDEC, its current development status and its usage in the JEFF nuclear data project.

  5. 75 FR 34439 - Defense Science Board Task Force on Nuclear Treaty Monitoring and Verification

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-06-17

    ... of the Secretary Defense Science Board Task Force on Nuclear Treaty Monitoring and Verification... Warner, USAF Military Assistant, Defense Science Board, 3140 Defense Pentagon, Room 3B888A, Washington... INFORMATION: The mission of the Defense Science Board is to advise the Secretary of Defense and the Under...

  6. Verification and Uncertainty Reduction of Amchitka Underground Nuclear Testing Models

    SciTech Connect

    Ahmed Hassan; Jenny Chapman

    2006-02-01

    The modeling of the Amchitka underground nuclear tests, conducted in 2002, is verified, and uncertainty in model input parameters, as well as in predictions, has been reduced using newly collected data obtained by the summer 2004 field expedition of CRESP. Newly collected data that pertain to the groundwater model include magnetotelluric (MT) surveys conducted on the island to determine the subsurface salinity and porosity structure, and bathymetric surveys to map the areas offshore from the Long Shot and Cannikin sites. Analysis and interpretation of the MT data yielded information on the location of the transition zone, and porosity profiles showing porosity values decaying with depth. These new data sets are used to verify the original model in terms of model parameters, model structure, and model output. In addition, by using the new data along with the existing data (chemistry and head data), the uncertainty in model input and output is decreased by conditioning on all the available data. A Markov Chain Monte Carlo (MCMC) approach is adapted for developing new input parameter distributions conditioned on prior knowledge and new data. The MCMC approach is a form of Bayesian conditioning that is constructed in such a way that it produces samples of the model parameters that eventually converge to a stationary posterior distribution. The Bayesian MCMC approach enhances probabilistic assessment. Instead of simply propagating uncertainty forward from input parameters into model predictions (i.e., the traditional Monte Carlo approach), MCMC propagates uncertainty backward from data onto parameters, and then forward from parameters into predictions. Comparisons between new data and the original model, and conditioning on all available data using the MCMC method, yield the following results and conclusions: (1) Model structure is verified at Long Shot and Cannikin where the high-resolution bathymetric data collected by CRESP
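
    The 'backward then forward' propagation described above can be sketched with a one-parameter Metropolis chain: condition the parameter on data, then push the posterior samples through a forward model. The model, prior, synthetic observations, and prediction function below are invented stand-ins for the Amchitka groundwater model.

        # Toy Metropolis-Hastings chain: condition a single model parameter
        # (e.g. an effective porosity) on noisy observations, then push the
        # posterior samples forward through a prediction function.  Model,
        # prior, data, and noise level are all invented for illustration.
        import numpy as np

        rng = np.random.default_rng(1)
        observations = np.array([0.21, 0.18, 0.24, 0.20])   # synthetic "field data"
        sigma_obs = 0.03

        def log_prior(theta):                       # broad prior on a porosity-like parameter
            return 0.0 if 0.0 < theta < 1.0 else -np.inf

        def log_likelihood(theta):                  # Gaussian misfit to the observations
            return -0.5 * np.sum(((observations - theta) / sigma_obs) ** 2)

        def predict(theta):                         # forward model for a quantity of interest
            return 50.0 * theta                     # hypothetical prediction function

        samples, theta = [], 0.5
        for _ in range(20_000):
            proposal = theta + rng.normal(scale=0.05)
            log_ratio = (log_prior(proposal) + log_likelihood(proposal)
                         - log_prior(theta) - log_likelihood(theta))
            if np.log(rng.uniform()) < log_ratio:
                theta = proposal
            samples.append(theta)

        posterior = np.array(samples[5_000:])       # discard burn-in
        predictions = predict(posterior)
        print(f"posterior mean parameter: {posterior.mean():.3f} +/- {posterior.std():.3f}")
        print(f"predicted quantity:       {predictions.mean():.2f} +/- {predictions.std():.2f}")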

  7. Crowd-Sourced Help with Emergent Knowledge for Optimized Formal Verification (CHEKOFV)

    DTIC Science & Technology

    2016-03-01

    ... software. Leveraging human pattern recognition skills, the CSFV games provide formal verification proofs that a machine analyzing the code cannot. The SRI ... effectiveness and reduce the cost to verify code. Subject terms: formal software verification, crowd-sourcing, games, cyber security, human-machine systems, abstract interpretation, and machine learning.

  8. Design Verification Report Spent Nuclear Fuel (SNF) Project Canister Storage Building (CSB)

    SciTech Connect

    PICKETT, W.W.

    2000-09-22

    The Sub-project W379, 'Spent Nuclear Fuel Canister Storage Building (CSB),' was established as part of the Spent Nuclear Fuel (SNF) Project. The primary mission of the CSB is to safely store spent nuclear fuel removed from the K Basins in dry storage until such time that it can be transferred to the national geological repository at Yucca Mountain, Nevada. This sub-project was initiated in late 1994 by a series of studies and conceptual designs. These studies determined that the partially constructed storage building, originally built as part of the Hanford Waste Vitrification Plant (HWVP) Project, could be redesigned to safely store the spent nuclear fuel. The scope of the CSB facility initially included a receiving station, a hot conditioning system, a storage vault, and a Multi-Canister Overpack (MCO) Handling Machine (MHM). Because of evolution of the project technical strategy, the hot conditioning system was deleted from the scope and MCO welding and sampling stations were added in its place. This report outlines the methods, procedures, and outputs developed by Project W379 to verify that the provided Structures, Systems, and Components (SSCs): satisfy the design requirements and acceptance criteria; perform their intended function; ensure that failure modes and hazards have been addressed in the design; and ensure that the SSCs as installed will not adversely impact other SSCs. Because this sub-project is still in the construction/start-up phase, not all verification activities have yet been performed (e.g., canister cover cap and welding fixture system verification, MCO Internal Gas Sampling equipment verification, and as-built verification). The verification activities identified in this report that are still to be performed will be added to the start-up punchlist and tracked to closure.

  9. Verification Study of Buoyancy-Driven Turbulent Nuclear Combustion

    SciTech Connect

    2010-01-01

    Buoyancy-driven turbulent nuclear combustion determines the rate of nuclear burning during the deflagration phase (i.e., the ordinary nuclear flame phase) of Type Ia supernovae, and hence the amount of nuclear energy released during this phase. It therefore determines the amount the white dwarf star expands prior to initiation of a detonation wave, and so the amount of radioactive nickel and thus the peak luminosity of the explosion. However, this key physical process is not fully understood. To better understand this process, the Flash Center has conducted an extensive series of large-scale 3D simulations of buoyancy-driven turbulent nuclear combustion for three different physical situations. This movie shows the results for some of these simulations. Credits: Science: Ray Bair, Katherine Riley, Argonne National Laboratory; Anshu Dubey, Don Lamb, Dongwook Lee, University of Chicago; Robert Fisher, University of Massachusetts at Dartmouth and Dean Townsley, University of Alabama

 Visualization: Jonathan Gallagher, University of Chicago; Randy Hudson, John Norris and Michael E. Papka, Argonne National Laboratory/University of Chicago This research used resources of the Argonne Leadership Computing Facility at Argonne National Laboratory, which is supported by the Office of Science of the U.S. Dept. of Energy under contract DE-AC02-06CH11357. This research was supported in part by the National Science Foundation through TeraGrid resources provided by the University of Chicago and Argonne National Laboratory.

  10. North Korea's nuclear weapons program: verification priorities and new challenges.

    SciTech Connect

    Moon, Duk-ho

    2003-12-01

    A comprehensive settlement of the North Korean nuclear issue may involve military, economic, political, and diplomatic components, many of which will require verification to ensure reciprocal implementation. This paper sets out potential verification methodologies that might address a wide range of objectives. The inspection requirements set by the International Atomic Energy Agency form the foundation, first as defined at the time of the Agreed Framework in 1994, and now as modified by the events since revelation of the North Korean uranium enrichment program in October 2002. In addition, refreezing the reprocessing facility and 5 MWe reactor, taking possession of possible weapons components and destroying weaponization capabilities add many new verification tasks. The paper also considers several measures for the short-term freezing of the North's nuclear weapon program during the process of negotiations, should that process be protracted. New inspection technologies and monitoring tools are applicable to North Korean facilities and may offer improved approaches over those envisioned just a few years ago. These are noted, and potential bilateral and regional verification regimes are examined.

  11. Task Force Report: Assessment of Nuclear Monitoring and Verification Technologies

    DTIC Science & Technology

    2014-01-01

    ... state actions and their potential cascading effects; the impact of advancing technologies relevant to nuclear weapons development; the ... detection programs to: conduct systems studies and engage operators early in development to improve transition of radiation detection advances to ... stability were achieved between the United States and Russia ...

  12. A gamma-ray verification system for special nuclear material

    SciTech Connect

    Lanier, R.G.; Prindle, A.L.; Friensehner, A.V.; Buckley, W.M.

    1994-07-01

    The Safeguards Technology Program at the Lawrence Livermore National Laboratory (LLNL) has developed a gamma-ray screening system for use by the Materials Management Section of the Engineering Sciences Division at LLNL for verifying the presence or absence of special nuclear material (SNM) in a sample. This system facilitates the measurements required under the '5610' series of US Department of Energy orders. MMGAM is an intelligent, menu-driven software application that runs on a personal computer and requires a precalibrated multi-channel analyzer and HPGe detector. It provides a very quick and easy-to-use means of determining the presence of SNM in a sample. After guiding the operator through a menu-driven set-up procedure, the system provides an on-screen GO/NO-GO indication after determining the system calibration status. This system represents advances over earlier used systems in the areas of ease-of-use, operator training requirements, and quality assurance. The system records the gamma radiation from a sample using a sequence of measurements involving a background measurement followed immediately by a measurement of the unknown sample. Both spectra are stored and available for analysis or output. In the current application, the presence of 235U, 238U, 239Pu, and 208Tl is indicated by extracting, from the stored spectra, four energy 'windows' preset around gamma-ray lines characteristic of the radioactive decay of these nuclides. The system is easily extendible to more complicated problems.
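
    The window-based screening logic described above reduces to comparing background-subtracted counts in preset regions of interest against decision thresholds. The skeleton below shows that logic; the window bounds, thresholds, and synthetic spectra are invented placeholders rather than the calibration of the LLNL system.

        # Skeleton of a windowed gamma screening decision: net counts in preset
        # regions of interest (background-subtracted) are compared against
        # detection thresholds for each nuclide.  Window bounds, channel ranges,
        # and thresholds below are invented placeholders, not the LLNL values.
        import numpy as np

        # Regions of interest (channel ranges) around characteristic gamma lines
        WINDOWS = {"U-235": (360, 380), "U-238": (990, 1020),   # 238U typically seen via 1001 keV of Pa-234m
                   "Pu-239": (400, 420), "Tl-208": (2590, 2630)}
        THRESHOLD_SIGMA = 3.0            # decision level in standard deviations

        def screen(sample_spectrum, background_spectrum):
            """Return a GO/NO-GO style verdict per nuclide window."""
            verdict = {}
            for nuclide, (lo, hi) in WINDOWS.items():
                gross = sample_spectrum[lo:hi].sum()
                bkg = background_spectrum[lo:hi].sum()
                net = gross - bkg
                sigma = np.sqrt(gross + bkg)          # counting uncertainty of the net counts
                verdict[nuclide] = "PRESENT" if net > THRESHOLD_SIGMA * sigma else "NOT DETECTED"
            return verdict

        # Synthetic 4096-channel spectra for demonstration only
        rng = np.random.default_rng(2)
        background = rng.poisson(5.0, 4096)
        sample = rng.poisson(5.0, 4096)
        sample[360:380] += rng.poisson(40.0, 20)      # injected U-235-like peak
        print(screen(sample, background))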

  13. DOE/LLNL verification symposium on technologies for monitoring nuclear tests related to weapons proliferation

    SciTech Connect

    Nakanishi, K.K.

    1993-02-12

    The rapidly changing world situation has raised concerns regarding the proliferation of nuclear weapons and the ability to monitor a possible clandestine nuclear testing program. To address these issues, Lawrence Livermore National Laboratory's (LLNL) Treaty Verification Program sponsored a symposium funded by the US Department of Energy's (DOE) Office of Arms Control, Division of Systems and Technology. The DOE/LLNL Symposium on Technologies for Monitoring Nuclear Tests Related to Weapons Proliferation was held at the DOE's Nevada Operations Office in Las Vegas, May 6-7, 1992. This volume is a collection of several papers presented at the symposium. Several experts in monitoring technology presented invited talks assessing the status of monitoring technology with emphasis on the deficient areas requiring more attention in the future. In addition, several speakers discussed proliferation monitoring technologies being developed by the DOE's weapons laboratories.

  14. DOE/LLNL verification symposium on technologies for monitoring nuclear tests related to weapons proliferation

    SciTech Connect

    Nakanishi, K.K.

    1993-02-12

    The rapidly changing world situation has raised concerns regarding the proliferation of nuclear weapons and the ability to monitor a possible clandestine nuclear testing program. To address these issues, Lawrence Livermore National Laboratory's (LLNL) Treaty Verification Program sponsored a symposium funded by the US Department of Energy's (DOE) Office of Arms Control, Division of Systems and Technology. The DOE/LLNL Symposium on Technologies for Monitoring Nuclear Tests Related to Weapons Proliferation was held at the DOE's Nevada Operations Office in Las Vegas, May 6-7, 1992. This volume is a collection of several papers presented at the symposium. Several experts in monitoring technology presented invited talks assessing the status of monitoring technology with emphasis on the deficient areas requiring more attention in the future. In addition, several speakers discussed proliferation monitoring technologies being developed by the DOE's weapons laboratories.

  15. Development of a test system for verification and validation of nuclear transport simulations

    SciTech Connect

    White, Morgan C; Triplett, Brian S; Anghaie, Samim

    2008-01-01

    Verification and validation of nuclear data is critical to the accuracy of both stochastic and deterministic particle transport codes. In order to effectively test a set of nuclear data, the data must be applied to a wide variety of transport problems. Performing this task in a timely, efficient manner is tedious. The nuclear data team at Los Alamos National Laboratory, in collaboration with the University of Florida, has developed a methodology to automate the process of nuclear data verification and validation (V&V). This automated V&V process can efficiently test a number of data libraries using well-defined benchmark experiments, such as those in the International Criticality Safety Benchmark Experiment Project (ICSBEP). The process is implemented through an integrated set of Python scripts. Material and geometry data are read from an existing medium or given directly by the user to generate a benchmark experiment template file. The user specifies the choice of benchmark templates, codes, and libraries to form a V&V project. The Python scripts generate input decks for multiple transport codes from the templates, run and monitor individual jobs, and parse the relevant output automatically. The output can then be used to generate reports directly or can be stored into a database for later analysis. This methodology eases the burden on the user by reducing the amount of time and effort required for obtaining and compiling calculation results. The resource savings from using this automated methodology could potentially be an enabling technology for more sophisticated data studies, such as nuclear data uncertainty quantification. Once deployed, this tool will allow the nuclear data community to more thoroughly test data libraries, leading to higher-fidelity data in the future.
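
    The automation loop described above can be sketched structurally as follows. The template token, transport-code executable, and output format in this sketch are placeholders (the abstract does not specify the actual Los Alamos tooling), and running it would require those placeholder files and codes to exist.

        # Structural sketch of an automated V&V loop: fill a benchmark template for
        # each nuclear data library, run a transport code, parse k-effective from the
        # output, and tabulate the bias against the benchmark value.  The template
        # token, executable name, and output pattern are placeholders only.
        import json, re, subprocess
        from pathlib import Path

        BENCHMARKS = {"icsbep-case-1": 1.0000}   # placeholder: benchmark name -> expected k-eff
        LIBRARIES = ["libA", "libB"]             # placeholder data library identifiers

        def run_case(benchmark, library):
            """Fill the template, run the (placeholder) transport code, return k-eff."""
            deck = Path(f"templates/{benchmark}.tpl").read_text().replace("{{LIBRARY}}", library)
            inp = Path(f"runs/{benchmark}_{library}.inp")
            inp.parent.mkdir(parents=True, exist_ok=True)
            inp.write_text(deck)
            out = subprocess.run(["transport_code", str(inp)],      # placeholder executable
                                 capture_output=True, text=True, check=True).stdout
            match = re.search(r"final k-eff\s*=\s*([0-9.]+)", out)  # assumed output format
            return float(match.group(1)) if match else float("nan")

        results = {}
        for bench, expected in BENCHMARKS.items():
            for lib in LIBRARIES:
                keff = run_case(bench, lib)
                results[f"{bench}/{lib}"] = {"keff": keff, "bias": keff - expected}

        Path("results.json").write_text(json.dumps(results, indent=2))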

  16. REPORT OF THE WORKSHOP ON NUCLEAR FACILITY DESIGN INFORMATION EXAMINATION AND VERIFICATION FOR SAFEGUARDS

    SciTech Connect

    Richard Metcalf; Robert Bean

    2009-10-01

    Executive Summary: The International Atomic Energy Agency (IAEA) implements nuclear safeguards and verifies that countries are compliant with their international nuclear safeguards agreements. One of the key provisions in the safeguards agreement is the requirement that the country provide nuclear facility design and operating information to the IAEA relevant to safeguarding the facility, and at a very early stage. This provides the opportunity for the IAEA to verify the safeguards-relevant features of the facility and to periodically ensure that those features have not changed. The national authorities (State System of Accounting for and Control of Nuclear Material - SSAC) provide the design information for all facilities within a country to the IAEA. The design information is conveyed using the IAEA's Design Information Questionnaire (DIQ) and specifies: (1) Identification of the facility's general character, purpose, capacity, and location; (2) Description of the facility's layout and nuclear material form, location, and flow; (3) Description of the features relating to nuclear material accounting, containment, and surveillance; and (4) Description of existing and proposed procedures for nuclear material accounting and control, with identification of nuclear material balance areas. The DIQ is updated as required by written addendum. IAEA safeguards inspectors examine and verify this information in design information examination (DIE) and design information verification (DIV) activities to confirm that the facility has been constructed or is being operated as declared by the facility operator and national authorities, and to develop a suitable safeguards approach. Under the Next Generation Safeguards Initiative (NGSI), the National Nuclear Security Administration's (NNSA) Office of Non-Proliferation and International Security identified the need for more effective and efficient verification of design information by the IAEA for improving international safeguards

  17. Development of a Consensus Standard for Verification and Validation of Nuclear System Thermal-Fluids Software

    SciTech Connect

    Edwin A. Harvego; Richard R. Schultz; Ryan L. Crane

    2011-12-01

    With the resurgence of nuclear power and increased interest in advanced nuclear reactors as an option to supply abundant energy without the associated greenhouse gas emissions of the more conventional fossil fuel energy sources, there is a need to establish internationally recognized standards for the verification and validation (V&V) of software used to calculate the thermal-hydraulic behavior of advanced reactor designs for both normal operation and hypothetical accident conditions. To address this need, ASME (American Society of Mechanical Engineers) Standards and Certification has established the V&V 30 Committee, under the jurisdiction of the V&V Standards Committee, to develop a consensus standard for verification and validation of software used for design and analysis of advanced reactor systems. The initial focus of this committee will be on the V&V of system analysis and computational fluid dynamics (CFD) software for nuclear applications. To limit the scope of the effort, the committee will further limit its focus to software to be used in the licensing of High-Temperature Gas-Cooled Reactors. In this framework, the Standard should conform to Nuclear Regulatory Commission (NRC) and other regulatory practices, procedures and methods for licensing of nuclear power plants as embodied in the United States (U.S.) Code of Federal Regulations and other pertinent documents such as Regulatory Guide 1.203, 'Transient and Accident Analysis Methods' and NUREG-0800, 'NRC Standard Review Plan'. In addition, the Standard should be consistent with applicable sections of ASME NQA-1-2008 'Quality Assurance Requirements for Nuclear Facility Applications (QA)'. This paper describes the general requirements for the proposed V&V 30 Standard, which include: (a) applicable NRC and other regulatory requirements for defining the operational and accident domain of a nuclear system that must be considered if the system is to be licensed, (b) the corresponding calculation domain of

  18. Design Verification Report Spent Nuclear Fuel (SNF) Project Canister Storage Building (CSB)

    SciTech Connect

    BAZINET, G.D.

    2000-11-03

    The Sub-project W379, 'Spent Nuclear Fuel Canister Storage Building (CSB),' was established as part of the Spent Nuclear Fuel (SNF) Project. The primary mission of the CSB is to safely store spent nuclear fuel removed from the K Basins in dry storage until such time that it can be transferred to the national geological repository at Yucca Mountain, Nevada. This sub-project was initiated in late 1994 by a series of studies and conceptual designs. These studies determined that the partially constructed storage building, originally built as part of the Hanford Waste Vitrification Plant (HWVP) Project, could be redesigned to safely store the spent nuclear fuel. The scope of the CSB facility initially included a receiving station, a hot conditioning system, a storage vault, and a Multi-Canister Overpack (MCO) Handling Machine (MHM). Because of evolution of the project technical strategy, the hot conditioning system was deleted from the scope and MCO welding and sampling stations were added in its place. This report outlines the methods, procedures, and outputs developed by Project W379 to verify that the provided Structures, Systems, and Components (SSCs): satisfy the design requirements and acceptance criteria; perform their intended function; ensure that failure modes and hazards have been addressed in the design; and ensure that the SSCs as installed will not adversely impact other SSCs. The original version of this document was prepared by Vista Engineering for the SNF Project. The purpose of this revision is to document completion of verification actions that were pending at the time the initial report was prepared. Verification activities for the installed and operational SSCs have been completed. Verification of future additions to the CSB related to the canister cover cap and welding fixture system and MCO Internal Gas Sampling equipment will be completed as appropriate for those components. The open items related to verification of those requirements are noted

  19. A physical zero-knowledge object-comparison system for nuclear warhead verification

    NASA Astrophysics Data System (ADS)

    Philippe, Sébastien; Goldston, Robert J.; Glaser, Alexander; D'Errico, Francesco

    2016-09-01

    Zero-knowledge proofs are mathematical cryptographic methods to demonstrate the validity of a claim while providing no further information beyond the claim itself. The possibility of using such proofs to process classified and other sensitive physical data has attracted attention, especially in the field of nuclear arms control. Here we demonstrate a non-electronic fast neutron differential radiography technique using superheated emulsion detectors that can confirm that two objects are identical without revealing their geometry or composition. Such a technique could form the basis of a verification system that could confirm the authenticity of nuclear weapons without sharing any secret design information. More broadly, by demonstrating a physical zero-knowledge proof that can compare physical properties of objects, this experiment opens the door to developing other such secure proof-systems for other applications.

  20. A physical zero-knowledge object-comparison system for nuclear warhead verification

    PubMed Central

    Philippe, Sébastien; Goldston, Robert J.; Glaser, Alexander; d'Errico, Francesco

    2016-01-01

    Zero-knowledge proofs are mathematical cryptographic methods to demonstrate the validity of a claim while providing no further information beyond the claim itself. The possibility of using such proofs to process classified and other sensitive physical data has attracted attention, especially in the field of nuclear arms control. Here we demonstrate a non-electronic fast neutron differential radiography technique using superheated emulsion detectors that can confirm that two objects are identical without revealing their geometry or composition. Such a technique could form the basis of a verification system that could confirm the authenticity of nuclear weapons without sharing any secret design information. More broadly, by demonstrating a physical zero-knowledge proof that can compare physical properties of objects, this experiment opens the door to developing other such secure proof-systems for other applications. PMID:27649477

  1. A physical zero-knowledge object-comparison system for nuclear warhead verification

    SciTech Connect

    Philippe, Sébastien; Goldston, Robert J.; Glaser, Alexander; d’Errico, Francesco

    2016-09-20

    Zero-knowledge proofs are mathematical cryptographic methods to demonstrate the validity of a claim while providing no further information beyond the claim itself. The possibility of using such proofs to process classified and other sensitive physical data has attracted attention, especially in the field of nuclear arms control. Here we demonstrate a non-electronic fast neutron differential radiography technique using superheated emulsion detectors that can confirm that two objects are identical without revealing their geometry or composition. Such a technique could form the basis of a verification system that could confirm the authenticity of nuclear weapons without sharing any secret design information. More broadly, by demonstrating a physical zero-knowledge proof that can compare physical properties of objects, this experiment opens the door to developing other such secure proof-systems for other applications.

  2. A physical zero-knowledge object-comparison system for nuclear warhead verification.

    PubMed

    Philippe, Sébastien; Goldston, Robert J; Glaser, Alexander; d'Errico, Francesco

    2016-09-20

    Zero-knowledge proofs are mathematical cryptographic methods to demonstrate the validity of a claim while providing no further information beyond the claim itself. The possibility of using such proofs to process classified and other sensitive physical data has attracted attention, especially in the field of nuclear arms control. Here we demonstrate a non-electronic fast neutron differential radiography technique using superheated emulsion detectors that can confirm that two objects are identical without revealing their geometry or composition. Such a technique could form the basis of a verification system that could confirm the authenticity of nuclear weapons without sharing any secret design information. More broadly, by demonstrating a physical zero-knowledge proof that can compare physical properties of objects, this experiment opens the door to developing other such secure proof-systems for other applications.

  3. A physical zero-knowledge object-comparison system for nuclear warhead verification

    DOE PAGES

    Philippe, Sébastien; Goldston, Robert J.; Glaser, Alexander; ...

    2016-09-20

    Zero-knowledge proofs are mathematical cryptographic methods to demonstrate the validity of a claim while providing no further information beyond the claim itself. The possibility of using such proofs to process classified and other sensitive physical data has attracted attention, especially in the field of nuclear arms control. Here we demonstrate a non-electronic fast neutron differential radiography technique using superheated emulsion detectors that can confirm that two objects are identical without revealing their geometry or composition. Such a technique could form the basis of a verification system that could confirm the authenticity of nuclear weapons without sharing any secret design information. More broadly, by demonstrating a physical zero-knowledge proof that can compare physical properties of objects, this experiment opens the door to developing other such secure proof-systems for other applications.

  4. Technology diffusion of a different nature: Applications of nuclear safeguards technology to the chemical weapons verification regime

    SciTech Connect

    Kadner, S.P.; Reisman, A.; Turpen, E.

    1996-10-01

    The following discussion focuses on the issue of arms control implementation from the standpoint of technology and technical assistance. Not only are the procedures and techniques for safeguarding nuclear materials undergoing substantial changes, but the implementation of the Chemical Weapons Convention (CWC) and the Biological Weapons Convention (BWC) will give rise to technical difficulties unprecedented in the implementation of arms control verification. Although these regimes present new challenges, an analysis of the similarities between the nuclear and chemical weapons non-proliferation verification regimes illustrates the overlap in technological solutions. Just as cost-effective and efficient technologies can solve the problems faced by the nuclear safeguards community, these same technologies offer solutions for the CWC safeguards regime. With this in mind, experts at the Organization for the Prohibition of Chemical Weapons (OPCW), who are responsible for verification implementation, need to devise a CWC verification protocol that considers the technology already available. The functional similarity of IAEA and the OPCW, in conjunction with the technical necessities of both verification regimes, should receive attention with respect to the establishment of a technical assistance program. Lastly, the advanced status of the nuclear and chemical regime vis-a-vis the biological non-proliferation regime can inform our approach to implementation of confidence building measures for biological weapons.

  5. Verification of 235U mass content in nuclear fuel plates by an absolute method

    NASA Astrophysics Data System (ADS)

    El-Gammal, W.

    2007-01-01

    Nuclear safeguards refers to a verification system by which a State can control all nuclear materials (NM) and nuclear activities under its authority. An effective and efficient safeguards system must include a system of measurements with capabilities sufficient to verify such NM. Measurements of NM using absolute methods could eliminate the dependency on NM standards, which are necessary for other relative or semi-absolute methods. In this work, an absolute method has been investigated to verify the 235U mass content in nuclear fuel plates of the Material Testing Reactor (MTR) type. The most intense gamma-ray signature, at 185.7 keV, emitted after α-decay of 235U nuclei was employed in the method. The measuring system (an HPGe spectrometer) was mathematically calibrated for efficiency using the general Monte Carlo transport code MCNP-4B. The calibration results and the measured net count rate were used to estimate the 235U mass content in fuel plates at different detector-to-fuel-plate distances. Two sets of fuel plates, containing natural and low enriched uranium, were measured at the Fuel Fabrication Facility. Average accuracies of about 2.62% and 0.3% are obtained for the estimated 235U masses of the fuel plates containing natural and low enriched uranium, respectively, with a precision of about 3%.
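
    The mass estimate described above follows the standard relation between the net peak count rate, the calculated full-energy-peak efficiency, the gamma emission probability and the specific activity of 235U. The sketch below is an illustrative calculation only, not the author's code: the count rate and efficiency are placeholder numbers, the 185.7 keV emission probability (about 0.572) and 235U specific activity (about 8.0 x 10^4 Bq/g) are nominal nuclear-data values, and attenuation and geometry corrections are ignored.

        # Illustrative sketch (not from the paper): estimate the apparent 235U mass
        # from the 185.7 keV net count rate, assuming m = R / (eps * p_gamma * a).
        SPECIFIC_ACTIVITY_U235 = 8.0e4   # Bq/g, nominal specific activity of 235U
        P_GAMMA_185KEV = 0.572           # nominal emission probability of the 185.7 keV line

        def u235_mass_grams(net_count_rate, efficiency):
            """Apparent 235U mass in grams, uncorrected for self-attenuation.

            net_count_rate -- net counts per second in the 185.7 keV peak
            efficiency     -- full-energy-peak efficiency at 185.7 keV
                              (e.g. from an MCNP efficiency calculation)
            """
            return net_count_rate / (efficiency * P_GAMMA_185KEV * SPECIFIC_ACTIVITY_U235)

        # Placeholder numbers: 12 counts/s at 0.5% efficiency gives roughly 0.05 g
        print(u235_mass_grams(12.0, 5.0e-3))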

  6. A New Approach to Nuclear Warhead Verification Using a Zero-Knowledge Protocol

    SciTech Connect

    Glaser, Alexander

    2012-05-16

    Warhead verification systems proposed to date fundamentally rely on the use of information barriers to prevent the release of classified design information. Measurements with information barriers significantly increase the complexity of inspection systems, make their certification and authentication difficult, and may reduce the overall confidence in the verifiability of future arms-control agreements. This talk presents a proof-of-concept of a new approach to nuclear warhead verification that minimizes the role of information barriers from the outset and envisions instead an inspection system that a priori avoids leakage of sensitive information using a so-called zero-knowledge protocol. The proposed inspection system is based on the template-matching approach and relies on active interrogation of a test object with 14-MeV neutrons. The viability of the method is examined with MCNP Monte Carlo neutron transport calculations modeling the experimental setup, an investigation of different diversion scenarios, and an analysis of the simulated data showing that it does not contain information about the properties of the inspected object.
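
    The core of the zero-knowledge idea in this record and in the radiography experiment above can be illustrated with a toy preload scheme: the host privately preloads each (non-electronic) detector with the complement of the counts expected from a genuine template item, so that a matching item drives every detector to the same predetermined total and the inspector sees only a flat pattern, never the template itself. The following simulation is a schematic of that concept under stated assumptions, not the authors' experimental protocol; all detector counts, thresholds and names are invented for illustration.

        # Toy illustration (assumptions only) of a preloaded zero-knowledge comparison:
        # the host preloads each detector with (MAX - template_i); a matching item then
        # brings every detector close to MAX, revealing nothing about template_i itself.
        import numpy as np

        rng = np.random.default_rng(0)
        MAX = 10_000                                   # agreed per-detector total
        template = np.array([3200, 1500, 4800, 2600])  # secret expected counts (host only)
        preload = MAX - template                       # prepared privately by the host

        def inspect(item_counts):
            """The inspector only ever sees preload + item counts, never the preload."""
            return preload + item_counts

        genuine = rng.poisson(template)                                     # item matches the template
        diverted = rng.poisson(template * np.array([1.0, 1.0, 0.6, 1.4]))   # item differs from it

        for label, item in (("genuine", genuine), ("diverted", diverted)):
            totals = inspect(item)
            # A flat pattern near MAX (within counting statistics) indicates a match.
            verdict = "pass" if np.all(np.abs(totals - MAX) < 4 * np.sqrt(MAX)) else "fail"
            print(label, totals, verdict)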

  7. The verification system of the Comprehensive Nuclear-Test-Ban Treaty

    NASA Astrophysics Data System (ADS)

    Suarez, Gerardo

    2002-11-01

    The Comprehensive Nuclear-Test-Ban Treaty was opened for signature in September 1996. To date, the treaty has been signed by 165 countries and ratified by 93; among the latter, 31 out of the 44 whose ratification is needed for the treaty to enter into force. The treaty calls for the installation and operation of a verification system to ensure compliance. The verification system is composed of the International Monitoring System (IMS), the International Data Centre (IDC), and the On Site Inspection Division (OSI). The IMS is a global network of 321 stations hosted by 90 countries. The primary network is composed of 50 seismic stations (31 seismic arrays and 19 three-component, broad-band stations), 11 hydroacoustic stations, 60 infrasound arrays, and 80 radionuclide monitoring stations measuring radioactive particulates and noble gases in the atmosphere. The radionuclide network is supported by 16 laboratories. The auxiliary network of 120 seismic stations is interrogated on request by the IDC to improve the accuracy of the locations. The data from the 321 stations and from the laboratories are transmitted to the IDC in Vienna via a dedicated Global Communication Infrastructure (GCI) based on VSAT antennas. The IDC collects and processes the data from the four technologies and produces bulletins of events. The raw data and bulletins are distributed to state signatories. Upon entry into force, an on-site inspection may be carried out if it is suspected that a nuclear explosion has taken place. Since mid-1997, when the Provisional Technical Secretariat responsible for the implementation of the verification system began its work in Vienna, over 86% of the sites have been surveyed and the final location of the stations selected. By the end of 2002 this number will reach about 90%, essentially completing this phase. To date, 131 stations have been built or upgraded, and 80 are now sending data to the IDC; 112 others are under construction or under

  8. Applications of a Fast Neutron Detector System to Verification of Special Nuclear Materials

    NASA Astrophysics Data System (ADS)

    Mayo, Douglas R.; Byrd, Roger C.; Ensslin, Norbert; Krick, Merlyn S.; Mercer, David J.; Miller, Michael C.; Prettyman, Thomas H.; Russo, Phyllis A.

    1998-04-01

    An array of boron-loaded plastic optically coupled to bismuth germanate scintillators has been developed to detect neutrons for measurement of special nuclear materials. The phoswich detection system has the advantage of a high neutron detection efficiency and short die-away time. This is achieved by mixing the moderator (plastic) and the detector (^10B) at the molecular level. Simulations indicate that the neutron capture probabilities equal or exceed those of the current thermal neutron multiplicity techniques, which have the moderator (polyethylene) and detectors (^3He gas proportional tubes) macroscopically separate. Experiments have been performed to characterize the response of these detectors and validate computer simulations. The fast neutron detection system may be applied to the quantitative assay of plutonium in high (α,n) backgrounds, with emphasis on safeguards and environmental scenarios. Additional applications of the instrument, in a non-quantitative mode, have been tested for possible verification activities involving dismantlement of nuclear weapons. A description of the detector system, simulations, and preliminary data will be presented.

  9. Help

    ERIC Educational Resources Information Center

    Tollefson, Ann

    2009-01-01

    Planning to start or expand a K-8 critical language program? Looking for support in doing so? There "may" be help at the federal level for great ideas and strong programs. While there have been various pools of federal dollars available to support world language programs for a number of years, the federal government's interest in…

  11. Stabilized, hand-held, gamma-ray verification instrument for special nuclear materials

    SciTech Connect

    Fehlau, P.E.; Wiig, G.

    1988-01-01

    For many years, Los Alamos has developed intelligent, hand-held, search instruments for use by non-specialists to search for special nuclear materials (SNM). The instruments sense SNM by detecting its emitted radiation with scintillation detectors monitored by digital alarm circuitry. Now, we have developed a new hand-held instrument that can verify the presence or absence of particular radioisotopes by analyzing gamma-ray spectra. The new instrument is similar to recent, microprocessor-based, search instruments, but has LED detector stabilization, three adjustable regions-of-interest, and additional operating programs for spectrum analysis. We call the new instrument an SNM verification instrument. Its spectrum analysis capability can verify the presence or absence of specific plutonium isotopes in containers or verify the presence of uranium and its enrichment. The instrument retains the search capability, light weight, and low-power requirement of its predecessors. Its ready portability, detector stabilization, and simple operation allow individuals with little technical training to verify the contents of SNM containers. 5 refs., 5 figs.

  12. Technology Foresight and nuclear test verification: a structured and participatory approach

    NASA Astrophysics Data System (ADS)

    Noack, Patrick; Gaya-Piqué, Luis; Haralabus, Georgios; Auer, Matthias; Jain, Amit; Grenard, Patrick

    2013-04-01

    As part of its mandate, the CTBTO's nuclear explosion monitoring programme aims to maintain its sustainability, effectiveness and its long-term relevance to the verification regime. As such, the PTS is conducting a Technology Foresight programme of activities to identify technologies, processes, concepts and ideas that may serve said purpose and become applicable within the next 20 years. Through the Technology Foresight activities (online conferences, interviews, surveys, workshops and other) we have involved the wider science community in the fields of seismology, infrasound, hydroacoustics, radionuclide technology, remote sensing and geophysical techniques. We have assembled a catalogue of over 200 items, which incorporate technologies, processes, concepts and ideas which will have direct future relevance to the IMS (International Monitoring System), IDC (International Data Centre) and OSI (On-Site Inspection) activities within the PTS. In order to render this catalogue as applicable and useful as possible for strategy and planning, we have devised a "taxonomy" based on seven categories, against which each technology is assessed through a peer-review mechanism. These categories are: 1. Focus area of the technology in question: identify whether the technology relates to (one or more of the following) improving our understanding of source and source physics; propagation modelling; data acquisition; data transport; data processing; broad modelling concepts; quality assurance and data storage. 2. Current Development Stage of the technology in question. Based on a scale from one to six, this measure is specific to PTS needs and broadly reflects Technology Readiness Levels (TRLs). 3. Impact of the technology on each of the following capabilities: detection, location, characterization, sustainment and confidence building. 4. Development cost: the anticipated monetary cost of validating a prototype (i.e. Development Stage 3) of the technology in question. 5. Time to

  13. Development of a Standard for Verification and Validation of Software Used to Calculate Nuclear System Thermal Fluids Behavior

    SciTech Connect

    Richard R. Schultz; Edwin A. Harvego; Ryan L. Crane

    2010-05-01

    With the resurgence of nuclear power and increased interest in advanced nuclear reactors as an option to supply abundant energy without the associated greenhouse gas emissions of the more conventional fossil fuel energy sources, there is a need to establish internationally recognized standards for the verification and validation (V&V) of software used to calculate the thermal-hydraulic behavior of advanced reactor designs for both normal operation and hypothetical accident conditions. To address this need, ASME (American Society of Mechanical Engineers) Standards and Certification has established the V&V 30 Committee, under the responsibility of the V&V Standards Committee, to develop a consensus Standard for verification and validation of software used for design and analysis of advanced reactor systems. The initial focus of this committee will be on the V&V of system analysis and computational fluid dynamics (CFD) software for nuclear applications. To limit the scope of the effort, the committee will further limit its focus to software to be used in the licensing of High-Temperature Gas-Cooled Reactors. In this framework, the standard should conform to Nuclear Regulatory Commission (NRC) practices, procedures and methods for licensing of nuclear power plants as embodied in the United States (U.S.) Code of Federal Regulations and other pertinent documents such as Regulatory Guide 1.203, “Transient and Accident Analysis Methods” and NUREG-0800, “NRC Standard Review Plan”. In addition, the standard should be consistent with applicable sections of ASME Standard NQA-1 (“Quality Assurance Requirements for Nuclear Facility Applications (QA)”). This paper describes the general requirements for the V&V Standard, which include: (a) the definition of the operational and accident domain of a nuclear system that must be considered if the system is to be licensed, (b) the corresponding calculational domain of the software that should encompass the nuclear operational

  14. Positive nuclear BAP1 immunostaining helps differentiate non-small cell lung carcinomas from malignant mesothelioma

    PubMed Central

    Carbone, Michele; Shimizu, David; Napolitano, Andrea; Tanji, Mika; Pass, Harvey I.; Yang, Haining; Pastorino, Sandra

    2016-01-01

    The differential diagnosis between pleural malignant mesothelioma (MM) and lung cancer is often challenging. Immunohistochemical (IHC) stains used to distinguish these malignancies include markers that are most often positive in MM and less frequently positive in carcinomas, and vice versa. However, in about 10–20% of the cases, the IHC results can be confusing and inconclusive, and novel markers are sought to increase the diagnostic accuracy. We stained 45 non-small cell lung cancer samples (32 adenocarcinomas and 13 squamous cell carcinomas) with a monoclonal antibody for BRCA1-associated protein 1 (BAP1) and also with an IHC panel we routinely use to help differentiate MM from carcinomas, which includes calretinin, Wilms Tumor 1, cytokeratin 5, podoplanin D2-40, pankeratin CAM5.2, thyroid transcription factor 1, Napsin-A, and p63. Nuclear BAP1 expression was also analyzed in 35 MM biopsies. All 45 non-small cell lung cancer biopsies stained positive for nuclear BAP1, whereas 22/35 (63%) MM biopsies lacked nuclear BAP1 staining, consistent with previous data. Lack of BAP1 nuclear staining was associated with MM (two-tailed Fisher's Exact Test, P = 5.4 × 10−11). Focal BAP1 staining was observed in a subset of samples, suggesting polyclonality. Diagnostic accuracy of other classical IHC markers was in agreement with previous studies. Our study indicated that absence of nuclear BAP1 stain helps differentiate MM from lung carcinomas. We suggest that BAP1 staining should be added to the IHC panel that is currently used to distinguish these malignancies. PMID:27447750
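
    The association reported in this record can be re-checked directly from the counts quoted in the abstract (45 of 45 carcinomas retaining nuclear BAP1 versus 13 of 35 mesotheliomas). The snippet below is an independent re-calculation with SciPy's Fisher's exact test, not part of the original study; it should return a two-tailed P value on the order of the reported 5.4 × 10−11.

        # Independent re-check of the reported association (illustrative only).
        from scipy.stats import fisher_exact

        #          nuclear BAP1 +   nuclear BAP1 -
        table = [[45,  0],   # non-small cell lung cancer: 45 of 45 positive
                 [13, 22]]   # malignant mesothelioma: 22 of 35 lacked staining

        odds_ratio, p_value = fisher_exact(table, alternative="two-sided")
        print(f"two-tailed Fisher's exact P = {p_value:.2e}")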

  15. Positive nuclear BAP1 immunostaining helps differentiate non-small cell lung carcinomas from malignant mesothelioma.

    PubMed

    Carbone, Michele; Shimizu, David; Napolitano, Andrea; Tanji, Mika; Pass, Harvey I; Yang, Haining; Pastorino, Sandra

    2016-09-13

    The differential diagnosis between pleural malignant mesothelioma (MM) and lung cancer is often challenging. Immunohistochemical (IHC) stains used to distinguish these malignancies include markers that are most often positive in MM and less frequently positive in carcinomas, and vice versa. However, in about 10-20% of the cases, the IHC results can be confusing and inconclusive, and novel markers are sought to increase the diagnostic accuracy. We stained 45 non-small cell lung cancer samples (32 adenocarcinomas and 13 squamous cell carcinomas) with a monoclonal antibody for BRCA1-associated protein 1 (BAP1) and also with an IHC panel we routinely use to help differentiate MM from carcinomas, which includes calretinin, Wilms Tumor 1, cytokeratin 5, podoplanin D2-40, pankeratin CAM5.2, thyroid transcription factor 1, Napsin-A, and p63. Nuclear BAP1 expression was also analyzed in 35 MM biopsies. All 45 non-small cell lung cancer biopsies stained positive for nuclear BAP1, whereas 22/35 (63%) MM biopsies lacked nuclear BAP1 staining, consistent with previous data. Lack of BAP1 nuclear staining was associated with MM (two-tailed Fisher's Exact Test, P = 5.4 × 10−11). Focal BAP1 staining was observed in a subset of samples, suggesting polyclonality. Diagnostic accuracy of other classical IHC markers was in agreement with previous studies. Our study indicated that absence of nuclear BAP1 stain helps differentiate MM from lung carcinomas. We suggest that BAP1 staining should be added to the IHC panel that is currently used to distinguish these malignancies.

  16. Gaseous standards preparation with the radionuclide Ar-41 for stack monitors calibration and verification in nuclear facilities.

    PubMed

    Kovar, Petr; Dryak, Pavel

    2008-01-01

    The Czech Metrology Institute performs calibration and verification of noble gas stack monitors in nuclear power plants and nuclear research facilities. Together with Kr-85 and Xe-133, the radionuclide Ar-41 is measured using HPGe detectors and its activity is determined using a gamma-ray peak at 1293 keV. The counting efficiency used in these measurements was calculated by the Monte Carlo method using the MCNP code. The radioactive gas standard is prepared by irradiation of argon in a high-pressure vessel by a Cf-252 neutron generator. The inner shape and thickness of the cylinder walls were determined by radiography. The argon volume under normal conditions is determined from the high-pressure vessel volume and by a precise gas pressure measurement. As a result, the activity concentration of Ar-41 at normal conditions is certified.
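
    The final step described above (reducing the argon in the high-pressure vessel to its volume at normal conditions and expressing the certified Ar-41 activity as an activity concentration) is essentially an ideal-gas correction. The lines below are a generic sketch of that arithmetic with invented numbers, not the Institute's procedure; real work would use the radiographically determined vessel volume and the measured pressure and temperature.

        # Generic ideal-gas sketch (not the CMI procedure): reduce a pressurised argon
        # volume to normal conditions, then express the Ar-41 activity per unit volume.
        P0, T0 = 101.325e3, 273.15      # normal conditions: pressure in Pa, temperature in K

        def volume_at_normal_conditions(v_vessel_m3, p_pa, t_k):
            """Ideal-gas estimate of the gas volume reduced to normal conditions."""
            return v_vessel_m3 * (p_pa / P0) * (T0 / t_k)

        def activity_concentration(activity_bq, v_vessel_m3, p_pa, t_k):
            """Ar-41 activity per cubic metre of gas at normal conditions (Bq/m^3)."""
            return activity_bq / volume_at_normal_conditions(v_vessel_m3, p_pa, t_k)

        # Placeholder numbers: a 2-litre vessel at 10 MPa and 295 K holding 5 kBq of Ar-41
        print(activity_concentration(5.0e3, 2.0e-3, 10.0e6, 295.0))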

  17. Implementation of neutron counting techniques at US facilities for IAEA verification of excess materials from nuclear weapons production

    SciTech Connect

    Stewart, J.E.; Krick, M.S.; Langner, D.G.; Reilly, T.D.; Theis, W.; Lemaire, R.J.; Xiao, J.

    1995-08-01

    The U.S. Nonproliferation and Export Control Policy, announced by President Clinton before the United Nations General Assembly on September 27, 1993, commits the U.S. to placing under International Atomic Energy Agency (IAEA) Safeguards excess nuclear materials no longer needed for the U.S. nuclear deterrent. As of July 1, 1995, the IAEA had completed Initial Physical Inventory Verification (IPIV) at two facilities: a storage vault in the Oak Ridge Y-12 plant containing highly enriched uranium (HEU) metal and another storage vault in the Hanford Plutonium Finishing Plant (PFP) containing plutonium oxide and plutonium-bearing residues. Another plutonium-storage vault, located at Rocky Flats, is scheduled for the IPIV in the fall of 1995. Conventional neutron coincidence counting is one of the routinely applied IAEA nondestructive assay (NDA) methods for verification of uranium and plutonium. However, at all three facilities mentioned above, neutron NDA equipment had to be modified or developed for specific facility needs such as the type and configuration of material placed under safeguards. This document describes those modifications and developments.

  18. Nuclear Energy -- Knowledge Base for Advanced Modeling and Simulation (NE-KAMS) Code Verification and Validation Data Standards and Requirements: Fluid Dynamics Version 1.0

    SciTech Connect

    Greg Weirs; Hyung Lee

    2011-09-01

    V&V and UQ are the primary means to assess the accuracy and reliability of M&S and, hence, to establish confidence in M&S. Though other industries are establishing standards and requirements for the performance of V&V and UQ, at present, the nuclear industry has not established such standards or requirements. However, the nuclear industry is beginning to recognize that such standards are needed and that the resources needed to support V&V and UQ will be very significant. In fact, no single organization has sufficient resources or expertise required to organize, conduct and maintain a comprehensive V&V and UQ program. What is needed is a systematic and standardized approach to establish and provide V&V and UQ resources at a national or even international level, with a consortium of partners from government, academia and industry. Specifically, what is needed is a structured and cost-effective knowledge base that collects, evaluates and stores verification and validation data, and shows how it can be used to perform V&V and UQ, leveraging collaboration and sharing of resources to support existing engineering and licensing procedures as well as science-based V&V and UQ processes. The Nuclear Energy Knowledge base for Advanced Modeling and Simulation (NE-KAMS) is being developed at the Idaho National Laboratory in conjunction with Bettis Laboratory, Sandia National Laboratories, Argonne National Laboratory, Utah State University and others with the objective of establishing a comprehensive and web-accessible knowledge base to provide V&V and UQ resources for M&S for nuclear reactor design, analysis and licensing. The knowledge base will serve as an important resource for technical exchange and collaboration that will enable credible and reliable computational models and simulations for application to nuclear power. NE-KAMS will serve as a valuable resource for the nuclear industry, academia, the national laboratories, the U.S. Nuclear Regulatory Commission (NRC) and

  19. Methodology, verification, and performance of the continuous-energy nuclear data sensitivity capability in MCNP6

    SciTech Connect

    Kiedrowski, B. C.; Brown, F. B.

    2013-07-01

    A continuous-energy sensitivity coefficient capability has been introduced into MCNP6. The methods for generating energy-resolved and energy-integrated sensitivity profiles are discussed. Results from the verification exercises that were performed are given, and these show that MCNP6 compares favorably with analytic solutions, direct density perturbations, and comparisons to TSUNAMI-3D and MONK. Run-time and memory requirements are assessed for typical applications, and these are shown to be reasonable with modern computing resources. (authors)
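
    For readers unfamiliar with the quantity being computed, the sensitivity coefficients discussed in this record are conventionally defined as the fractional change in the k-eigenvalue per fractional change in a nuclear-data parameter; the energy-resolved profile simply resolves this quantity in incident energy so that it integrates to the energy-integrated coefficient. The expression below is the standard textbook definition, not a formula quoted from the paper.

        % Conventional k-eigenvalue sensitivity coefficient (standard definition, not quoted from the paper)
        S_{k,\sigma_x} = \frac{\sigma_x}{k}\,\frac{\partial k}{\partial \sigma_x},
        \qquad
        S_{k,\sigma_x} = \int S_{k,\sigma_x}(E)\,\mathrm{d}E
        % where S_{k,\sigma_x}(E) is the energy-resolved sensitivity profile for cross section \sigma_x.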

  20. Human factors design, verification, and validation for two types of control room upgrades at a nuclear power plant

    SciTech Connect

    Boring, Laurids Ronald

    2014-10-01

    This paper describes the NUREG-0711 based human factors engineering (HFE) phases and associated elements required to support design, verification and validation (V&V), and implementation of a new plant process computer (PPC) and turbine control system (TCS) at a representative nuclear power plant. This paper reviews ways to take a human-system interface (HSI) specification and use it when migrating legacy PPC displays or designing displays with new functionality. These displays undergo iterative usability testing during the design phase and then undergo an integrated system validation (ISV) in a full scope control room training simulator. Following the successful demonstration of operator performance with the systems during the ISV, the new system is implemented at the plant, first in the training simulator and then in the main control room.

  1. Routine inspection effort required for verification of a nuclear material production cutoff convention

    SciTech Connect

    Fishbone, L.G.; Sanborn, J.

    1995-08-01

    Preliminary estimates of the inspection effort to verify a Nuclear Material Cutoff Convention are presented. The estimates are based on a database of about 650 facilities in a total of eight states (the five nuclear-weapons states and three "threshold" states) plus facility-specific inspection requirements. Typical figures for inspection requirements for specific facility types derive from IAEA experience, where applicable. Alternative estimates of inspection effort are used in cutoff options where full IAEA safeguards are not stipulated.

  2. Enhanced global Radionuclide Source Attribution for the Nuclear-Test-Ban Verification by means of the Adjoint Ensemble Dispersion Modeling Technique applied at the IDC/CTBTO.

    NASA Astrophysics Data System (ADS)

    Becker, A.; Wotawa, G.; de Geer, L.

    2006-05-01

    The Provisional Technical Secretariat (PTS) of the CTBTO Preparatory Commission maintains and permanently updates a source-receptor matrix (SRM) describing the global monitoring capability of a highly sensitive 80-station radionuclide (RN) network in order to verify states signatories' compliance with the Comprehensive Nuclear-Test-Ban Treaty (CTBT). This is done by means of receptor-oriented Lagrangian particle dispersion modeling (LPDM) to help determine the region from which suspicious radionuclides may originate. In doing so, the LPDM FLEXPART 5.1 is integrated backward in time based on global analysis wind fields, yielding global source-receptor sensitivity (SRS) fields stored at three-hour frequency and 1° horizontal resolution. A database of these SRS fields substantially improves the interpretation of the RN sample measurements and categorizations because it enables source hypotheses to be tested later in a pure post-processing (SRM inversion) step, which is feasible on hardware comparable to current PCs or notebooks and at any place (decentralized), provided access to the SRS fields is available. Within the CTBT environment it is important to quickly achieve decision-makers' confidence in the SRM-based backtracking products issued by the PTS in the case of the occurrence of treaty-relevant radionuclides. Therefore the PTS has set up a highly automated response system together with the Regional Specialized Meteorological Centers of the World Meteorological Organization in the field of dispersion modeling, which have committed themselves to providing the PTS with the same standard SRS fields as calculated by their systems for CTBT-relevant cases. This system was utilized twice in 2005 to perform adjoint ensemble dispersion modeling (EDM) and demonstrated the potential of EDM-based backtracking to improve the accuracy of the source location related to singular nuclear events, thus serving as the backward analogue to the
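
    The post-processing step described above, in which a source hypothesis is tested against stored SRS fields, reduces to a linear source-receptor relation: the predicted concentration at a sampling station is the sum over grid cells and time intervals of the SRS sensitivity multiplied by the hypothesised emission. The sketch below shows that relation in generic form; the array shapes, units and tolerance are assumptions for illustration and do not describe the PTS software.

        # Generic source-receptor sketch (assumed shapes and units; not the PTS system):
        # predicted concentration at a receptor = sum over (time, lat, lon) grid cells of
        # SRS sensitivity multiplied by the hypothesised release in that cell and interval.
        import numpy as np

        def predicted_concentration(srs_field, emission_hypothesis):
            """Both arrays share the shape (n_times, n_lat, n_lon); returns a scalar."""
            return float(np.sum(srs_field * emission_hypothesis))

        def hypothesis_consistent(srs_field, emission_hypothesis, measured, rel_tol=0.5):
            """Crude consistency check of one source hypothesis against one measured sample."""
            predicted = predicted_concentration(srs_field, emission_hypothesis)
            return abs(predicted - measured) <= rel_tol * measured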

  3. Use of open source information and commercial satellite imagery for nuclear nonproliferation regime compliance verification by a community of academics

    NASA Astrophysics Data System (ADS)

    Solodov, Alexander

    The proliferation of nuclear weapons is a great threat to world peace and stability. The question of strengthening the nonproliferation regime has been open for a long period of time. In 1997 the International Atomic Energy Agency (IAEA) Board of Governors (BOG) adopted the Additional Safeguards Protocol. The purpose of the protocol is to enhance the IAEA's ability to detect undeclared production of fissile materials in member states. However, the IAEA does not always have sufficient human and financial resources to accomplish this task. Developed here is a concept for making use of human and technical resources available in academia that could be used to enhance the IAEA's mission. The objective of this research was to study the feasibility of an academic community using commercially or publicly available sources of information and products for the purpose of detecting covert facilities and activities intended for the unlawful acquisition of fissile materials or production of nuclear weapons. In this study, the availability and use of commercial satellite imagery systems, commercial computer codes for satellite imagery analysis, the Comprehensive Test Ban Treaty (CTBT) verification International Monitoring System (IMS), publicly available information sources such as watchdog groups and press reports, and Customs Services information were explored. A system for integrating these data sources to form conclusions was also developed. The results proved that publicly and commercially available sources of information and data analysis can be a powerful tool in tracking violations in the international nuclear nonproliferation regime, and a framework for implementing these tools in the academic community was developed. As a result of this study, the formation of an International Nonproliferation Monitoring Academic Community (INMAC) is proposed. This would be an independent organization consisting of academics (faculty, staff and students) from both nuclear weapon states (NWS) and

  4. Development and verification of design methods for ducts in a space nuclear shield

    NASA Technical Reports Server (NTRS)

    Cerbone, R. J.; Selph, W. E.; Read, P. A.

    1972-01-01

    A practical method for computing the effectiveness of a space nuclear shield perforated by small tubing and cavities is reported. The calculations performed use solutions from a two-dimensional transport code and evaluate perturbations of that solution using last-flight estimates and other kernel integration techniques. In general, perturbations are viewed as a change in source strength of scattered radiation and a change in attenuation properties of the region.

  5. Analytical three-dimensional neutron transport benchmarks for verification of nuclear engineering codes. Final report

    SciTech Connect

    Ganapol, B.D.; Kornreich, D.E.

    1997-07-01

    Because of the requirement of accountability and quality control in the scientific world, a demand for high-quality analytical benchmark calculations has arisen in the neutron transport community. The intent of these benchmarks is to provide a numerical standard to which production neutron transport codes may be compared in order to verify proper operation. The overall investigation as modified in the second year renewal application includes the following three primary tasks. Task 1 on two-dimensional neutron transport is divided into (a) single medium searchlight problem (SLP) and (b) two-adjacent half-space SLP. Task 2 on three-dimensional neutron transport covers (a) point source in arbitrary geometry, (b) single medium SLP, and (c) two-adjacent half-space SLP. Task 3 on code verification includes deterministic and probabilistic codes. The primary aim of the proposed investigation was to provide a suite of comprehensive two- and three-dimensional analytical benchmarks for neutron transport theory applications. This objective has been achieved. The suite of benchmarks in infinite media and the three-dimensional SLP are a relatively comprehensive set of one-group benchmarks for isotropically scattering media. Because of time and resource limitations, the extensions of the benchmarks to include multi-group and anisotropic scattering are not included here. Presently, however, enormous advances in the solution for the planar Green's function in an anisotropically scattering medium have been made and will eventually be implemented in the two- and three-dimensional solutions considered under this grant. Of particular note in this work are the numerical results for the three-dimensional SLP, which have never before been presented. The results presented were made possible only because of the tremendous advances in computing power that have occurred during the past decade.

  6. Measurement and verification of positron emitter nuclei generated at each treatment site by target nuclear fragment reactions in proton therapy

    SciTech Connect

    Miyatake, Aya; Nishio, Teiji; Ogino, Takashi; Saijo, Nagahiro; Esumi, Hiroyasu; Uesaka, Mitsuru

    2010-08-15

    Purpose: The purpose of this study is to verify the characteristics of the positron emitter nuclei generated at each treatment site by proton irradiation. Methods: Proton therapy using a beam on-line PET system mounted on a rotating gantry port (BOLPs-RGp), which the authors developed, is provided at the National Cancer Center Kashiwa, Japan. BOLPs-RGp is a monitoring system that can confirm the activity distribution of the proton irradiated volume by coincident detection of a pair of annihilation gamma rays from positron emitter nuclei generated by the target nuclear fragment reactions between irradiated proton nuclei and nuclei in the human body. Activity is measured from the start of proton irradiation until 200 s after the end of the irradiation. The characteristics of the positron emitter nuclei generated in a patient's body were verified by the measurement of the activity distribution at each treatment site using BOLPs-RGp. Results: The decay curves for measured activity were able to be approximated using two or three half-life values regardless of the treatment site. The activity component with a half-life of about 2 min was important for a confirmation of the proton irradiated volume. Conclusions: In each proton treatment site, verification of the characteristics of the generated positron emitter nuclei was performed by using BOLPs-RGp. For the monitoring of the proton irradiated volume, the detection of ^15O generated in a human body was important.

  7. Plastid DNA sequencing and nuclear SNP genotyping help resolve the puzzle of central American Platanus

    PubMed Central

    De Castro, Olga; Di Maio, Antonietta; Lozada García, José Armando; Piacenti, Danilo; Vázquez-Torres, Mario; De Luca, Paolo

    2013-01-01

    Background and Aims Recent research on the history of Platanus reveals that hybridization phenomena occurred in the central American species. This study has two goals: to help resolve the evolutive puzzle of central American Platanus, and to test the potential of real-time polymerase chain reaction (PCR) for detecting ancient hybridization. Methods Sequencing of a uniparental plastid DNA marker [psbA-trnH(GUG) intergenic spacer] and qualitative and quantitative single nucleotide polymorphism (SNP) genotyping of biparental nuclear ribosomal DNA (nrDNA) markers [LEAFY intron 2 (LFY-i2) and internal transcribed spacer 2 (ITS2)] were used. Key Results Based on the SNP genotyping results, several Platanus accessions show the presence of hybridization/introgression, including some accessions of P. rzedowskii and of P. mexicana var. interior and one of P. mexicana var. mexicana from Oaxaca (= P. oaxacana). Based on haplotype analyses of the psbA-trnH spacer, five haplotypes were detected. The most common of these is present in taxa belonging to P. orientalis, P. racemosa sensu lato, some accessions of P. occidentalis sensu stricto (s.s.) from Texas, P. occidentalis var. palmeri, P. mexicana s.s. and P. rzedowskii. This is highly relevant to genetic relationships with the haplotypes present in P. occidentalis s.s. and P. mexicana var. interior. Conclusions Hybridization and introgression events between lineages ancestral to modern central and eastern North American Platanus species occurred. Plastid haplotypes and qualitative and quantitative SNP genotyping provide information critical for understanding the complex history of Mexican Platanus. Compared with the usual molecular techniques of sub-cloning, sequencing and genotyping, real-time PCR assay is a quick and sensitive technique for analysing complex evolutionary patterns. PMID:23798602

  8. Use of nuclear explosions to create gas condensate storage in the USSR. LLL Treaty Verification Program

    SciTech Connect

    Borg, I.Y.

    1982-08-23

    The Soviet Union has described industrial use of nuclear explosions to produce underground hydrocarbon storage. Two examples are in the giant Orenburg gas condensate field. There is good reason to believe that three additional cavities were created in bedded salt in the yet to be fully developed giant Astrakhan gas condensate field in the region of the lower Volga. Although contrary to usual western practice, the cavities are believed to be used to store H2S-rich, unstable gas condensate prior to processing in the main gas plants located tens of kilometers from the producing fields. Detonations at Orenburg and Astrakhan preceded plant construction. The use of nuclear explosions at several sites to create underground storage of highly corrosive liquid hydrocarbons suggests that the Soviets consider this time- and cost-effective. The possible benefits from such a plan include degasification and stabilization of the condensate before final processing, providing storage of condensate during periods of abnormally high natural gas production or during periods when condensate but not gas processing facilities are undergoing maintenance. Judging from information provided by Soviet specialists, the individual cavities have a maximum capacity on the order of 50,000 m^3.

  9. Routine inspection effort required for verification of a nuclear material production cutoff convention

    SciTech Connect

    Dougherty, D.; Fainberg, A.; Sanborn, J.; Allentuck, J.; Sun, C.

    1996-11-01

    On 27 September 1993, President Clinton proposed "... a multilateral convention prohibiting the production of highly enriched uranium or plutonium for nuclear explosives purposes or outside of international safeguards." The UN General Assembly subsequently adopted a resolution recommending negotiation of a non-discriminatory, multilateral, and internationally and effectively verifiable treaty (hereinafter referred to as "the Cutoff Convention") banning the production of fissile material for nuclear weapons. The matter is now on the agenda of the Conference on Disarmament, although not yet under negotiation. This accord would, in effect, place all fissile material (defined as highly enriched uranium and plutonium) produced after entry into force (EIF) of the accord under international safeguards. "Production" would mean separation of the material in question from radioactive fission products, as in spent fuel reprocessing, or enrichment of uranium above the 20% level, which defines highly enriched uranium (HEU). Facilities where such production could occur would be safeguarded to verify that either such production is not occurring or that all material produced at these facilities is maintained under safeguards.

  10. Indian Point Nuclear Power Station: verification analysis of County Radiological Emergency-Response Plans

    SciTech Connect

    Nagle, J.; Whitfield, R.

    1983-05-01

    This report was developed as a management tool for use by the Federal Emergency Management Agency (FEMA) Region II staff. The analysis summarized in this report was undertaken to verify the extent to which procedures, training programs, and resources set forth in the County Radiological Emergency Response Plans (CRERPs) for Orange, Putnam, and Westchester counties in New York had been realized prior to the March 9, 1983, exercise of the Indian Point Nuclear Power Station near Buchanan, New York. To this end, a telephone survey of county emergency response organizations was conducted between January 19 and February 22, 1983. This report presents the results of responses obtained from this survey of county emergency response organizations.

  11. Verification of screening level for decontamination implemented after Fukushima nuclear accident

    PubMed Central

    Ogino, Haruyuki; Ichiji, Takeshi; Hattori, Takatoshi

    2012-01-01

    The screening level for decontamination that has been applied for the surface of the human body and contaminated handled objects after the Fukushima nuclear accident was verified by assessing the doses that arise from external irradiation, ingestion, inhalation and skin contamination. The result shows that the annual effective dose that arises from handled objects contaminated with the screening level for decontamination (i.e. 100 000 counts per minute) is <1 mSv y−1, which can be considered as the intervention exemption level in accordance with the International Commission on Radiological Protection recommendations. Furthermore, the screening level is also found to protect the skin from the incidence of a deterministic effect because the absorbed dose of the skin that arises from direct deposition on the surface of the human body is calculated to be lower than the threshold of the deterministic effect assuming a practical exposure duration. PMID:22228683

  12. Broadband seismology and the detection and verification of underground nuclear explosions

    NASA Astrophysics Data System (ADS)

    Tinker, Mark Andrew

    1997-10-01

    On September 24, 1996, President Clinton signed the Comprehensive Test Ban Treaty (CTBT), which bans the testing of all nuclear weapons, thereby limiting their future development. Seismology is the primary tool used for the detection and identification of underground explosions and thus will play a key role in monitoring a CTBT. The detection and identification of low yield explosions requires seismic stations at regional distances (<1500 km). However, because the regional wavefield propagates within the extremely heterogeneous crustal waveguide, the seismic waveforms are also very complicated. Therefore, it is necessary to have a solid understanding of how the phases used in regional discriminants develop within different tectonic regimes. Thus, the development of the seismic phases Pn and Lg, which compose the seismic discriminant Pn/Lg, within the western U.S. from the Non-Proliferation Experiment is evaluated. The most fundamental discriminant is event location, as 90% of all seismic sources occur too deep within the earth to be man-made. France resumed its nuclear testing program after a four-year moratorium and conducted six tests during a five-month period starting in September of 1995. Using teleseismic data, a joint hypocenter determination algorithm was used to determine the hypocenters of these six explosions. One of the most important problems in monitoring a CTBT is the detection and location of small seismic events. Although seismic arrays have become the central tool for event detection, in the context of a global monitoring treaty, there will be some dependence on sparse regional networks of three-component broadband seismic stations to detect low yield explosions. However, the full power of the data has not been utilized, namely using phases other than P and S. Therefore, the information in the surface wavetrain is used to improve the locations of small seismic events recorded on a sparse network in Bolivia. Finally, as a discrimination example in

  13. Verification of the Cross Immunoreactivity of A60, a Mouse Monoclonal Antibody against Neuronal Nuclear Protein

    PubMed Central

    Mao, Shanping; Xiong, Guoxiang; Zhang, Lei; Dong, Huimin; Liu, Baohui; Cohen, Noam A.; Cohen, Akiva S.

    2016-01-01

    A60, the mouse monoclonal antibody against the neuronal nuclear protein (NeuN), is the most widely used neuronal marker in neuroscience research and neuropathological assays. Previous studies identified fragments of A60-immunoprecipitated protein as Synapsin I (Syn I), suggesting the antibody will demonstrate cross immunoreactivity. However, the likelihood of cross reactivity has never been verified by immunohistochemical techniques. Using our established tissue processing and immunofluorescent staining protocols, we found that A60 consistently labeled mossy fiber terminals in hippocampal area CA3. These A60-positive mossy fiber terminals could also be labeled by Syn I antibody. After treating brain slices with saponin in order to better preserve various membrane and/or vesicular proteins for immunostaining, we observed that A60 could also label additional synapses in various brain areas. Therefore, we used A60 together with a rabbit monoclonal NeuN antibody to confirm the existence of this cross reactivity. We showed that the putative band positive for A60 and Syn I could not be detected by the rabbit anti-NeuN in Western blotting. While as efficient as Millipore A60 at recognizing neuronal nuclei, the rabbit NeuN antibody demonstrated no labeling of synaptic structures in immunofluorescent staining. The present study successfully verified the cross reactivity present in immunohistochemistry, cautioning that A60 may not be the ideal biomarker to verify neuronal identity due to its cross immunoreactivity. In contrast, the rabbit monoclonal NeuN antibody used in this study may be a better candidate to substitute for A60.

  14. Potential opportunities for nano materials to help enable enhanced nuclear fuel performance

    SciTech Connect

    McClellan, Kenneth J.

    2012-06-06

    This presentation is an overview of the technical challenges for development of nuclear fuels with enhanced performance and accident tolerance. Key specific aspects of improved fuel performance are noted. Examples of existing nanonuclear projects and concepts are presented and areas of potential focus are suggested. The audience for this presentation includes representatives from: DOE-NE, other national laboratories, industry and academia. This audience is a mixture of nanotechnology experts and nuclear energy researchers and managers.

  15. Taming the SQUID: How a nuclear physics education (mostly) helped my career in applied physics

    NASA Astrophysics Data System (ADS)

    Espy, Michelle

    2013-10-01

    My degree is in experimental nuclear physics, specifically studying the interaction of pions with nuclei. But after graduation I accepted a post-doctoral research position with a team based on applications of the Superconducting Quantum Interference Device (SQUID) to the study of the human brain. Despite knowing nothing about the brain or SQUIDs to start with, I have gone on to enjoy a career in applications of the SQUID and other sensors to the detection of weak magnetic fields in a variety of problems from brain studies (magnetoencephalography) to ultra-low field nuclear magnetic resonance for detection of explosives and illicit material. In this talk I will present some background on SQUIDs and their application to the detection of ultra-weak magnetic fields of biological and non-biological origin. I will also provide a little insight into what it has been like to use a nuclear physics background to pursue other types of science.

  16. Independent Verification and Validation Of SAPHIRE 8 Software Quality Assurance Plan Project Number: N6423 U.S. Nuclear Regulatory Commission

    SciTech Connect

    Kent Norris

    2010-02-01

    This report provides an evaluation of the Software Quality Assurance Plan. The Software Quality Assurance Plan is intended to ensure that all actions necessary for the software life cycle (verification and validation activities; documentation and deliverables; project management; configuration management; nonconformance reporting and corrective action; and quality assessment and improvement) have been planned, and that a systematic pattern of actions provides adequate confidence that the software product conforms to established technical requirements and meets the contractual commitments prepared by the sponsor, the Nuclear Regulatory Commission.

  17. Ensuring Longevity: Ancient Glasses Help Predict Durability of Vitrified Nuclear Waste

    SciTech Connect

    Weaver, Jamie L.; McCloy, John S.; Ryan, Joseph V.; Kruger, Albert A.

    2016-05-01

    How does glass alter with time? For the last hundred years this has been an important question to the fields of object conservation and archeology to ensure the preservation of glass artifacts. This same question is part of the development and assessment of durable glass waste forms for the immobilization of nuclear wastes. Researchers have developed experiments ranging from simple to highly sophisticated to answer this question, and, as a result, have gained significant insight into the mechanisms that drive glass alteration. However, the gathered data have been predominantly applicable to only short-term alteration times, i.e. over the course of decades. What have remained elusive are the long-term mechanisms of glass alteration [1]. These mechanisms are of particular interest to the international nuclear waste glass community as they strive to ensure that vitrified products will be durable for thousands to tens of thousands of years. For the last thirty years this community has been working to fill this research gap by partnering with archeologists, museum curators, and geologists to identify hundred- to million-year-old glass analogues that have altered in environments representative of those expected at potential nuclear waste disposal sites. The process of identifying a waste glass relevant analogue is challenging as it requires scientists to relate data collected from short-term laboratory experiments to observations made from long-term analogues and extensive geochemical modeling.

  18. Independent Verification and Validation Of SAPHIRE 8 Software Requirements Project Number: N6423 U.S. Nuclear Regulatory Commission

    SciTech Connect

    Kent Norris

    2009-09-01

    The purpose of the Independent Verification and Validation (IV&V) role in the evaluation of the SAPHIRE requirements definition is to assess the activities that result in the specification, documentation, and review of the requirements that the software product must satisfy, including functionality, performance, design constraints, attributes and external interfaces. The IV&V team began this endeavor after the software engineering and software development of SAPHIRE were already underway. IV&V reviewed the requirements specified in the NRC Form 189s to verify that these requirements were included in SAPHIRE’s Software Verification and Validation Plan (SVVP).

  19. Nuclear data verification based on Monte Carlo simulations of the LLNL pulsed-sphere benchmark experiments (1979 & 1986) using the Mercury code

    SciTech Connect

    Descalle, M; Pruet, J

    2008-06-09

    Livermore's nuclear data group developed a new verification and validation test suite to ensure the quality of data used in application codes. This is based on models of LLNL's pulsed sphere fusion shielding benchmark experiments. Simulations were done with Mercury, a 3D particle transport Monte Carlo code using continuous-energy cross-section libraries. Results were compared to measurements of neutron leakage spectra generated by 14 MeV neutrons in 17 target assemblies (for a blank target assembly, H2O, Teflon, C, N2, Al, Si, Ti, Fe, Cu, Ta, W, Au, Pb, 232Th, 235U, 238U, and 239Pu). We also tested the fidelity of simulations for photon production associated with neutron interactions in the different materials. Gamma-ray leakage energy per neutron was obtained from a simple 1D spherical geometry assembly and compared to three codes (TART, COG, MCNP5) and several versions of the Evaluated Nuclear Data File (ENDF) and Evaluated Nuclear Data Libraries (ENDL) cross-section libraries. These tests uncovered a number of errors in photon production cross-sections, and were instrumental to the V&V of different cross-section libraries. Development of the pulsed sphere tests also uncovered the need for new Mercury capabilities. To enable simulations of neutron time-of-flight experiments the nuclear data group implemented an improved treatment of biased angular scattering in MCAPM.

  20. Proceedings of the 22nd Annual DoD/DOE Seismic Research Symposium: Planning for Verification of and Compliance with the Comprehensive Nuclear-Test-Ban Treaty (CTBT)

    SciTech Connect

    Nichols, James W., LTC

    2000-09-15

    These proceedings contain papers prepared for the 22nd Annual DoD/DOE Seismic Research Symposium: Planning for Verification of and Compliance with the Comprehensive Nuclear-Test-Ban Treaty (CTBT), held 13-15 September 2000 in New Orleans, Louisiana. These papers represent the combined research related to ground-based nuclear explosion monitoring funded by the National Nuclear Security Administration (NNSA), Defense Threat Reduction Agency (DTRA), Air Force Technical Applications Center (AFTAC), Department of Defense (DoD), US Army Space and Missile Defense Command, Defense Special Weapons Agency (DSWA), and other invited sponsors. The scientific objectives of the research are to improve the United States capability to detect, locate, and identify nuclear explosions. The purpose of the meeting is to provide the sponsoring agencies, as well as potential users, an opportunity to review research accomplished during the preceding year and to discuss areas of investigation for the coming year. For the researchers, it provides a forum for the exchange of scientific information toward achieving program goals, and an opportunity to discuss results and future plans. Paper topics include: seismic regionalization and calibration; detection and location of sources; wave propagation from source to receiver; the nature of seismic sources, including mining practices; hydroacoustic, infrasound, and radionuclide methods; on-site inspection; and data processing.

  1. Nuclear Energy Advanced Modeling and Simulation Waste Integrated Performance and Safety Codes (NEAMS Waste IPSC) verification and validation plan. version 1.

    SciTech Connect

    Bartlett, Roscoe Ainsworth; Arguello, Jose Guadalupe, Jr.; Urbina, Angel; Bouchard, Julie F.; Edwards, Harold Carter; Freeze, Geoffrey A.; Knupp, Patrick Michael; Wang, Yifeng; Schultz, Peter Andrew; Howard, Robert; McCornack, Marjorie Turner

    2011-01-01

    The objective of the U.S. Department of Energy Office of Nuclear Energy Advanced Modeling and Simulation Waste Integrated Performance and Safety Codes (NEAMS Waste IPSC) is to provide an integrated suite of computational modeling and simulation (M&S) capabilities to quantitatively assess the long-term performance of waste forms in the engineered and geologic environments of a radioactive-waste storage facility or disposal repository. To meet this objective, NEAMS Waste IPSC M&S capabilities will be applied to challenging spatial domains, temporal domains, multiphysics couplings, and multiscale couplings. A strategic verification and validation (V&V) goal is to establish evidence-based metrics for the level of confidence in M&S codes and capabilities. Because it is economically impractical to apply the maximum V&V rigor to each and every M&S capability, M&S capabilities will be ranked for their impact on the performance assessments of various components of the repository systems. Those M&S capabilities with greater impact will require a greater level of confidence and a correspondingly greater investment in V&V. This report includes five major components: (1) a background summary of the NEAMS Waste IPSC to emphasize M&S challenges; (2) the conceptual foundation for verification, validation, and confidence assessment of NEAMS Waste IPSC M&S capabilities; (3) specifications for the planned verification, validation, and confidence-assessment practices; (4) specifications for the planned evidence information management system; and (5) a path forward for the incremental implementation of this V&V plan.

  2. Environmental Detection of Clandestine Nuclear Weapon Programs

    NASA Astrophysics Data System (ADS)

    Kemp, R. Scott

    2016-06-01

    Environmental sensing of nuclear activities has the potential to detect nuclear weapon programs at early stages, deter nuclear proliferation, and help verify nuclear accords. However, no robust system of detection has been deployed to date. This can be variously attributed to high costs, technical limitations in detector technology, simple countermeasures, and uncertainty about the magnitude or behavior of potential signals. In this article, current capabilities and promising opportunities are reviewed. Systematic research in a variety of areas could improve prospects for detecting covert nuclear programs, although the potential for countermeasures suggests long-term verification of nuclear agreements will need to rely on methods other than environmental sensing.

  3. Verification and validation benchmarks.

    SciTech Connect

    Oberkampf, William Louis; Trucano, Timothy Guy

    2007-02-01

    Verification and validation (V&V) are the primary means to assess the accuracy and reliability of computational simulations. V&V methods and procedures have fundamentally improved the credibility of simulations in several high-consequence fields, such as nuclear reactor safety, underground nuclear waste storage, and nuclear weapon safety. Although the terminology is not uniform across engineering disciplines, code verification deals with assessing the reliability of the software coding, and solution verification deals with assessing the numerical accuracy of the solution to a computational model. Validation addresses the physics modeling accuracy of a computational simulation by comparing the computational results with experimental data. Code verification benchmarks and validation benchmarks have been constructed for a number of years in every field of computational simulation. However, no comprehensive guidelines have been proposed for the construction and use of V&V benchmarks. For example, the field of nuclear reactor safety has not focused on code verification benchmarks, but it has placed great emphasis on developing validation benchmarks. Many of these validation benchmarks are closely related to the operations of actual reactors at near-safety-critical conditions, as opposed to being more fundamental-physics benchmarks. This paper presents recommendations for the effective design and use of code verification benchmarks based on manufactured solutions, classical analytical solutions, and highly accurate numerical solutions. In addition, this paper presents recommendations for the design and use of validation benchmarks, highlighting the careful design of building-block experiments, the estimation of experimental measurement uncertainty for both inputs and outputs to the code, validation metrics, and the role of model calibration in validation. It is argued that the understanding of predictive capability of a computational model is built on the level of

  4. Computation and Analysis of the Global Distribution of the Radioxenon Isotope 133Xe based on Emissions from Nuclear Power Plants and Radioisotope Production Facilities and its Relevance for the Verification of the Nuclear-Test-Ban Treaty

    NASA Astrophysics Data System (ADS)

    Wotawa, Gerhard; Becker, Andreas; Kalinowski, Martin; Saey, Paul; Tuma, Matthias; Zähringer, Matthias

    2010-05-01

    Monitoring of radioactive noble gases, in particular xenon isotopes, is a crucial element of the verification of the Comprehensive Nuclear-Test-Ban Treaty (CTBT). The capability of the noble gas network, which is currently under construction, to detect signals from a nuclear explosion critically depends on the background created by other sources. Therefore, the global distribution of these isotopes based on emissions and transport patterns needs to be understood. A significant xenon background exists in the reactor regions of North America, Europe and Asia. An emission inventory of the four relevant xenon isotopes has recently been created, which specifies source terms for each power plant. A few medical radioisotope production facilities have recently been identified as the major emitters of xenon isotopes worldwide, in particular the facilities in Chalk River (Canada), Fleurus (Belgium), Pelindaba (South Africa) and Petten (Netherlands). Emissions from these sites are expected to exceed those of the other sources by orders of magnitude. In this study, emphasis is put on 133Xe, which is the most prevalent xenon isotope. First, based on the known emissions, the resulting 133Xe concentration levels at all noble gas stations of the final CTBT verification network were calculated and found to be consistent with observations. Second, it turned out that emissions from the radioisotope facilities can explain a number of observed peaks, meaning that atmospheric transport modelling is an important tool for the categorization of measurements. Third, it became evident that nuclear power plant emissions are more difficult to treat in the models, since their temporal variation is high and not generally reported. Fourth, there are indications that the assumed annual emissions may be underestimated by factors of two to ten, while the general emission patterns seem to be well understood. Finally, it became evident that 133Xe sources mainly influence the sensitivity of the
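
    As a simple illustration of why source behavior and timing matter for the 133Xe background, the sketch below estimates how much of a released 133Xe inventory survives a given atmospheric transport time. The half-life is the commonly tabulated value of roughly 5.25 days; the release activity and transport time are hypothetical.

      import math

      # Sketch: fraction of a 133Xe release remaining after a transport delay.
      # Half-life of 133Xe is approximately 5.25 days; the release activity and
      # transport time below are hypothetical illustrations.
      HALF_LIFE_DAYS = 5.25
      decay_const = math.log(2) / HALF_LIFE_DAYS          # per day

      release_bq = 1.0e13      # hypothetical release activity (Bq)
      transport_days = 7.0     # hypothetical transport time to a monitoring station

      remaining = release_bq * math.exp(-decay_const * transport_days)
      print(f"activity remaining after {transport_days:.0f} d: {remaining:.2e} Bq "
            f"({100.0 * remaining / release_bq:.1f}% of the release)")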

  5. A REPRINT of a July 1991 Report to Congress, Executive Summary of Verification of Nuclear Warhead Dismantlement and Special Nuclear Material Controls

    SciTech Connect

    Fuller, James L.

    2008-11-20

    With the renewed thinking and debate about deep reductions in nuclear weapons, including recent proposals to eliminate nuclear warheads altogether, republishing the general conclusions of the Robinson Committee Report of 1992 appears useful. The report is sometimes referred to as the 3151 Report, after Section 3151 of the National Defense Authorization Act for FY1991, from which its requirement originated. This report contains the Executive Summary only and the forwarding letters from the Committee, the President of the United States, the Secretary of Energy, and C. Paul Robinson, the head of the Advisory Committee.

  6. Specification and verification of nuclear-power-plant training-simulator response characteristics. Part II. Conclusions and recommendations

    SciTech Connect

    Haas, P M; Selby, D L; Kerlin, T W; Felkins, L

    1982-05-01

    The nuclear industry should adopt, and NRC regulatory and research actions should support, the systems approach to training as a structured framework for development and validation of personnel training systems. Potential exists for improving the ability to assess simulator fidelity. Systems Identification Technology offers a potential framework for model validation. Installation of the data collection/recording equipment required by NUREG-0696 could provide a vastly improved source of data for simulator fidelity assessment. The NRC needs to continue its post-TMI actions to involve itself more rigorously and more formally in the entire process of NPP personnel training system development; however, this involvement should be a participative one with industry. The existing simulator standards and guidelines should be reorganized to support use of the systems approach to training. The standards should require and support a holistic approach to training system development that recognizes simulators and simulator training as only parts of the complete training program, and full-scope, high-fidelity, site-specific simulators as only one useful training device. Recommendations for adapting the SAT/ISD process to the nuclear industry include: formation of an NRC/industry planning and coordination group, a program planning study to develop a programmatic plan, development of a user's guide, NRC/industry workshops to establish common terminology and practice, and a pilot study applying the adopted SAT/ISD methodology to an actual nuclear industry training program.

  7. On-line high-performance liquid chromatography-ultraviolet-nuclear magnetic resonance method of the markers of nerve agents for verification of the Chemical Weapons Convention.

    PubMed

    Mazumder, Avik; Gupta, Hemendra K; Garg, Prabhat; Jain, Rajeev; Dubey, Devendra K

    2009-07-03

    This paper details an on-flow liquid chromatography-ultraviolet-nuclear magnetic resonance (LC-UV-NMR) method for the retrospective detection and identification of alkyl alkylphosphonic acids (AAPAs) and alkylphosphonic acids (APAs), the markers of toxic nerve agents, for verification of the Chemical Weapons Convention (CWC). Initially, the LC-UV-NMR parameters were optimized for benzyl derivatives of the APAs and AAPAs. The optimized parameters include a C(18) stationary phase, a methanol:water 78:22 (v/v) mobile phase, UV detection at 268 nm, and (1)H NMR acquisition conditions. The protocol described herein allowed the detection of analytes through acquisition of high-quality NMR spectra from aqueous solutions of the APAs and AAPAs containing high concentrations of interfering background chemicals, which were removed by the preceding sample preparation. Quantification is based on the UV detector, which showed relative standard deviations (RSDs) within ±1.1%, while the lower limit of detection was up to 16 µg (absolute) for the NMR detector. Finally, the developed LC-UV-NMR method was applied to identify the APAs and AAPAs in real water samples, following solid-phase extraction and derivatization. The method is fast (total experiment time approximately 2 h), sensitive, rugged and efficient.

  8. Wind gust warning verification

    NASA Astrophysics Data System (ADS)

    Primo, Cristina

    2016-07-01

    Operational meteorological centres around the world increasingly include warnings as one of their regular forecast products. Warnings are issued to warn the public about extreme weather situations that might occur leading to damages and losses. In forecasting these extreme events, meteorological centres help their potential users in preventing the damage or losses they might suffer. However, verifying these warnings requires specific methods. This is due not only to the fact that they happen rarely, but also because a new temporal dimension is added when defining a warning, namely the time window of the forecasted event. This paper analyses the issues that might appear when dealing with warning verification. It also proposes some new verification approaches that can be applied to wind warnings. These new techniques are later applied to a real life example, the verification of wind gust warnings at the German Meteorological Centre ("Deutscher Wetterdienst"). Finally, the results obtained from the latter are discussed.

  9. Application of cryoprobe 1H nuclear magnetic resonance spectroscopy and multivariate analysis for the verification of corsican honey.

    PubMed

    Donarski, James A; Jones, Stephen A; Charlton, Adrian J

    2008-07-23

    Proton nuclear magnetic resonance spectroscopy ((1)H NMR) and multivariate analysis techniques have been used to classify honey into two groups by geographical origin. Honey from Corsica (Miel de Corse) was used as an example of a protected designation of origin product. Mathematical models were constructed to determine the feasibility of distinguishing between honey from Corsica and that from other geographical locations in Europe, using (1)H NMR spectroscopy. Honey from 10 different regions within five countries was analyzed. (1)H NMR spectra were used as input variables for projection to latent structures (PLS) followed by linear discriminant analysis (LDA) and genetic programming (GP). Models were generated using three methods, PLS-LDA, two-stage GP, and a combination of PLS and GP (PLS-GP). The PLS-GP model used variables selected by PLS for subsequent GP calculations. All models were generated using Venetian blind cross-validation. Overall classification rates for the discrimination of Corsican and non-Corsican honey of 75.8, 94.5, and 96.2% were determined using PLS-LDA, two-stage GP, and PLS-GP, respectively. The variables utilized by PLS-GP were related to their (1)H NMR chemical shifts, and this led to the identification of trigonelline in honey for the first time.
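
    The PLS-then-classify workflow described above can be sketched with standard tools. The example below uses scikit-learn with synthetic spectra standing in for the (1)H NMR data; the component count, fold scheme (plain stratified k-fold rather than Venetian blinds), and data are illustrative and are not the study's actual model.

      import numpy as np
      from sklearn.cross_decomposition import PLSRegression
      from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
      from sklearn.model_selection import cross_val_score
      from sklearn.pipeline import Pipeline

      # Sketch of PLS projection followed by LDA classification, analogous to the
      # PLS-LDA step described above. The "spectra" are synthetic random data.
      rng = np.random.default_rng(0)
      n_samples, n_points = 60, 200
      X = rng.normal(size=(n_samples, n_points))      # stand-in spectra
      y = np.repeat([0, 1], n_samples // 2)           # 0 = Corsican, 1 = other origin
      X[y == 1, :10] += 0.5                           # inject a small class difference

      model = Pipeline([
          ("pls", PLSRegression(n_components=5)),     # latent-variable projection
          ("lda", LinearDiscriminantAnalysis()),      # discriminate in score space
      ])
      scores = cross_val_score(model, X, y, cv=5)
      print(f"mean cross-validated accuracy: {scores.mean():.2f}")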

  10. Utilization of the Differential Die-Away Self-Interrogation Technique for Characterization and Verification of Spent Nuclear Fuel

    NASA Astrophysics Data System (ADS)

    Trahan, Alexis Chanel

    New nondestructive assay techniques are sought to better characterize spent nuclear fuel. One of the NDA instruments selected for possible deployment is differential die-away self-interrogation (DDSI). The proposed DDSI approach for spent fuel assembly assay utilizes primarily the spontaneous fission and (alpha, n) neutrons in the assemblies as an internal interrogating radiation source. The neutrons released in spontaneous fission or (alpha,n) reactions are thermalized in the surrounding water and induce fission in fissile isotopes, thereby creating a measurable signal from isotopes of interest that would be otherwise difficult to measure. The DDSI instrument employs neutron coincidence counting with 3He tubes and list-mode-based data acquisition to allow for production of Rossi-alpha distributions (RADs) in post-processing. The list-mode approach to data collection and subsequent construction of RADs has expanded the analytical possibilities, as will be demonstrated throughout this thesis. One of the primary advantages is that the measured signal in the form of a RAD can be analyzed in its entirety including determination of die-away times in different time domains. This capability led to the development of the early die-away method, a novel leakage multiplication determination method which is tested throughout the thesis on different sources in simulation space and fresh fuel experiments. The early die-away method is a robust, accurate, improved method of determining multiplication without the need for knowledge of the (alpha,n) source term. The DDSI technique and instrument are presented along with the many novel capabilities enabled by and discovered through RAD analysis. Among the new capabilities presented are the early die-away method, total plutonium content determination, and highly sensitive missing pin detection. Simulations of hundreds of different spent and fresh fuel assemblies were used to develop the analysis algorithms and the techniques were
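
    The die-away analysis of a Rossi-alpha distribution can be sketched as a simple exponential-plus-constant fit, R(t) = A exp(-t/tau) + B, where tau is the die-away time and B the accidentals floor. The sketch below fits synthetic data with illustrative parameter values; it is not the thesis's early die-away algorithm itself.

      import numpy as np
      from scipy.optimize import curve_fit

      # Sketch: fit a Rossi-alpha distribution (correlated counts versus time after
      # a trigger neutron) with an exponential die-away plus an accidentals floor.
      # The synthetic A, tau, and B values are illustrative only.
      def rossi_alpha(t, A, tau, B):
          return A * np.exp(-t / tau) + B

      t = np.linspace(0.0, 500.0, 101)                 # time gate (microseconds)
      true_A, true_tau, true_B = 200.0, 55.0, 20.0
      counts = rossi_alpha(t, true_A, true_tau, true_B)
      counts += np.random.default_rng(1).normal(0.0, 3.0, t.size)   # counting noise

      (A_fit, tau_fit, B_fit), _ = curve_fit(rossi_alpha, t, counts, p0=(100.0, 40.0, 10.0))
      print(f"fitted die-away time: {tau_fit:.1f} microseconds")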

  11. Utilization of the Differential Die-Away Self-Interrogation Technique for Characterization and Verification of Spent Nuclear Fuel

    SciTech Connect

    Trahan, Alexis Chanel

    2016-01-27

    New nondestructive assay techniques are sought to better characterize spent nuclear fuel. One of the NDA instruments selected for possible deployment is differential die-away self-interrogation (DDSI). The proposed DDSI approach for spent fuel assembly assay utilizes primarily the spontaneous fission and (α, n) neutrons in the assemblies as an internal interrogating radiation source. The neutrons released in spontaneous fission or (α,n) reactions are thermalized in the surrounding water and induce fission in fissile isotopes, thereby creating a measurable signal from isotopes of interest that would be otherwise difficult to measure. The DDSI instrument employs neutron coincidence counting with 3He tubes and list-mode-based data acquisition to allow for production of Rossi-alpha distributions (RADs) in post-processing. The list-mode approach to data collection and subsequent construction of RADs has expanded the analytical possibilities, as will be demonstrated throughout this thesis. One of the primary advantages is that the measured signal in the form of a RAD can be analyzed in its entirety including determination of die-away times in different time domains. This capability led to the development of the early die-away method, a novel leakage multiplication determination method which is tested throughout the thesis on different sources in simulation space and fresh fuel experiments. The early die-away method is a robust, accurate, improved method of determining multiplication without the need for knowledge of the (α,n) source term. The DDSI technique and instrument are presented along with the many novel capabilities enabled by and discovered through RAD analysis. Among the new capabilities presented are the early die-away method, total plutonium content determination, and highly sensitive missing pin detection. Simulations of hundreds of different spent and fresh fuel assemblies were used to develop the analysis algorithms and the techniques were tested on a

  12. Assessment of the utility of on-site inspection for INF treaty verification. Sanitized. Technical report

    SciTech Connect

    Baker, J.C.; Hart, D.M.; Doherty, R.T.

    1983-11-10

    This report analyzes the utility of on-site inspection (OSI) for enhancing Intermediate-Range Nuclear Force (INF) treaty verification of Soviet compliance with US-proposed collateral limits on short-range ballistic missiles (SRBMs). It outlines a detailed verification regime that relies on manned OSI teams to help verify limitations on Soviet SRBM deployments. It also assesses the OSI regime's potential impact on US Pershing deployments. Finally, the report reviews the history of American policy concerning on-site inspection and evaluates the overall utility of OSI in support of National Technical Means.

  13. CTBT integrated verification system evaluation model supplement

    SciTech Connect

    EDENBURN,MICHAEL W.; BUNTING,MARCUS; PAYNE JR.,ARTHUR C.; TROST,LAWRENCE C.

    2000-03-02

    Sandia National Laboratories has developed a computer based model called IVSEM (Integrated Verification System Evaluation Model) to estimate the performance of a nuclear detonation monitoring system. The IVSEM project was initiated in June 1994 by Sandia's Monitoring Systems and Technology Center and has been funded by the U.S. Department of Energy's Office of Nonproliferation and National Security (DOE/NN). IVSEM is a simple, "top-level" modeling tool which estimates the performance of a Comprehensive Nuclear Test Ban Treaty (CTBT) monitoring system and can help explore the impact of various sensor system concepts and technology advancements on CTBT monitoring. One of IVSEM's unique features is that it integrates results from the various CTBT sensor technologies (seismic, infrasound, radionuclide, and hydroacoustic) and allows the user to investigate synergy among the technologies. Specifically, IVSEM estimates the detection effectiveness (probability of detection), location accuracy, and identification capability of the integrated system and of each technology subsystem individually. The model attempts to accurately estimate the monitoring system's performance at medium interfaces (air-land, air-water) and for some evasive testing methods such as seismic decoupling. The original IVSEM report, CTBT Integrated Verification System Evaluation Model, SAND97-2518, described version 1.2 of IVSEM. This report describes the changes made to IVSEM version 1.2 and the addition of identification capability estimates that have been incorporated into IVSEM version 2.0.
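
    The benefit of integrating the sensor technologies is easy to illustrate: if the subsystems were treated as independent, the combined probability of detecting a given event would be one minus the product of the individual miss probabilities. The sketch below uses hypothetical per-technology values and is not IVSEM's actual algorithm.

      import math

      # Sketch: combining per-technology detection probabilities for a single event,
      # assuming independent subsystems. The probabilities are hypothetical and this
      # is not the actual IVSEM calculation.
      p_detect = {
          "seismic": 0.80,
          "infrasound": 0.40,
          "radionuclide": 0.55,
          "hydroacoustic": 0.10,
      }

      p_miss_all = math.prod(1.0 - p for p in p_detect.values())
      p_integrated = 1.0 - p_miss_all
      print(f"integrated probability of detection: {p_integrated:.3f}")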

  14. Technical challenges for dismantlement verification

    SciTech Connect

    Olinger, C.T.; Stanbro, W.D.; Johnston, R.G.; Nakhleh, C.W.; Dreicer, J.S.

    1997-11-01

    In preparation for future nuclear arms reduction treaties, including any potential successor treaties to START I and II, the authors have been examining possible methods for bilateral warhead dismantlement verification. Warhead dismantlement verification raises significant challenges in the political, legal, and technical arenas. This discussion will focus on the technical issues raised by warhead arms controls. Technical complications arise from several sources. These will be discussed under the headings of warhead authentication, chain-of-custody, dismantlement verification, non-nuclear component tracking, component monitoring, and irreversibility. The authors will discuss possible technical options to address these challenges as applied to a generic dismantlement and disposition process, in the process identifying limitations and vulnerabilities. They expect that these considerations will play a large role in any future arms reduction effort and, therefore, should be addressed in a timely fashion.

  15. Transmutation Fuel Performance Code Thermal Model Verification

    SciTech Connect

    Gregory K. Miller; Pavel G. Medvedev

    2007-09-01

    The FRAPCON fuel performance code is being modified to enable modeling of the nuclear fuels of interest to the Global Nuclear Energy Partnership (GNEP). The present report documents the effort to verify the FRAPCON thermal model. It was found that, with minor modifications, the FRAPCON thermal model temperature calculation agrees with that of the commercial software ABAQUS (Version 6.4-4). This report outlines the methodology of the verification, the code input, and the calculation results.
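
    A thermal-model verification of this kind compares the code's temperature prediction with an independent solution. As a minimal stand-in for the FRAPCON/ABAQUS comparison, the sketch below checks a small finite-volume solution of steady radial conduction in a heated cylinder against the textbook analytic profile T(r) = T_s + q'''(R^2 - r^2)/(4k); the geometry, conductivity, and heating values are illustrative only.

      import numpy as np

      # Sketch of a thermal-model verification check (not the FRAPCON/ABAQUS
      # comparison itself): a finite-volume solver for steady radial conduction in
      # a cylinder with uniform volumetric heating, compared against the analytic
      # profile T(r) = T_s + q*(R^2 - r^2)/(4k). Values are illustrative only.
      R, k, q, T_s = 0.005, 3.0, 3.0e8, 600.0       # m, W/m-K, W/m^3, K
      N = 50
      h = R / N
      r = np.linspace(0.0, R, N + 1)

      A = np.zeros((N + 1, N + 1))
      b = np.zeros(N + 1)
      A[0, 0], A[0, 1], b[0] = 1.0, -1.0, q * h**2 / (4.0 * k)   # centerline volume
      for i in range(1, N):
          rp, rm = r[i] + h / 2.0, r[i] - h / 2.0
          A[i, i - 1] = k * rm / h
          A[i, i] = -k * (rp + rm) / h
          A[i, i + 1] = k * rp / h
          b[i] = -q * r[i] * h
      A[N, N], b[N] = 1.0, T_s                                   # fixed surface temperature

      T_num = np.linalg.solve(A, b)
      T_exact = T_s + q * (R**2 - r**2) / (4.0 * k)
      print(f"max |T_numerical - T_analytic| = {np.max(np.abs(T_num - T_exact)):.2e} K")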

  16. Helping Kids Help

    ERIC Educational Resources Information Center

    Heiss, E. Renee

    2008-01-01

    Educators need to help kids help others so that they can help themselves. Volunteering does not involve competition or grades. This is one area where students don't have to worry about measuring up to the expectations of parents, teachers, and coaches. Students participate in charitable work to add another line to a college transcript or job…

  18. MELCOR Verification, Benchmarking, and Applications experience at BNL

    SciTech Connect

    Madni, I.K.

    1992-01-01

    This paper presents a summary of MELCOR Verification, Benchmarking and Applications experience at Brookhaven National Laboratory (BNL), sponsored by the US Nuclear Regulatory Commission (NRC). Under MELCOR verification over the past several years, all released versions of the code were installed on BNL's computer system, verification exercises were performed, and defect investigation reports were sent to SNL. Benchmarking calculations of integral severe fuel damage tests performed at BNL have helped to identify areas of modeling strengths and weaknesses in MELCOR; the most appropriate choices for input parameters; selection of axial nodalization for core cells and heat structures; and workarounds that extend the capabilities of MELCOR. These insights are explored in greater detail in the paper, with the help of selected results and comparisons. Full plant applications calculations at BNL have helped to evaluate the ability of MELCOR to successfully simulate various accident sequences and calculate source terms to the environment for both BWRs and PWRs. A summary of results, including timing of key events, thermal-hydraulic response, and environmental releases of fission products are presented for selected calculations, along with comparisons with Source Term Code Package (STCP) calculations of the same sequences. Differences in results are explained on the basis of modeling differences between the two codes. The results of a sensitivity calculation are also shown. The paper concludes by highlighting some insights on bottomline issues, and the contribution of the BNL program to MELCOR development, assessment, and the identification of user needs for optimum use of the code.

  19. MELCOR Verification, Benchmarking, and Applications experience at BNL

    SciTech Connect

    Madni, I.K.

    1992-12-31

    This paper presents a summary of MELCOR Verification, Benchmarking and Applications experience at Brookhaven National Laboratory (BNL), sponsored by the US Nuclear Regulatory Commission (NRC). Under MELCOR verification over the past several years, all released versions of the code were installed on BNL's computer system, verification exercises were performed, and defect investigation reports were sent to SNL. Benchmarking calculations of integral severe fuel damage tests performed at BNL have helped to identify areas of modeling strengths and weaknesses in MELCOR; the most appropriate choices for input parameters; selection of axial nodalization for core cells and heat structures; and workarounds that extend the capabilities of MELCOR. These insights are explored in greater detail in the paper, with the help of selected results and comparisons. Full plant applications calculations at BNL have helped to evaluate the ability of MELCOR to successfully simulate various accident sequences and calculate source terms to the environment for both BWRs and PWRs. A summary of results, including timing of key events, thermal-hydraulic response, and environmental releases of fission products are presented for selected calculations, along with comparisons with Source Term Code Package (STCP) calculations of the same sequences. Differences in results are explained on the basis of modeling differences between the two codes. The results of a sensitivity calculation are also shown. The paper concludes by highlighting some insights on bottomline issues, and the contribution of the BNL program to MELCOR development, assessment, and the identification of user needs for optimum use of the code.

  20. Verification and Validation Plan for the Codes LSP and ICARUS (PEGASUS)

    SciTech Connect

    RILEY,MERLE E.; BUSS,RICHARD J.; CAMPBELL,ROBERT B.; HOPKINS,MATTHEW M.; MILLER,PAUL A.; MOATS,ANNE R.; WAMPLER,WILLIAM R.

    2002-02-01

    This report documents the strategies for verification and validation of the codes LSP and ICARUS used for simulating the operation of the neutron tubes used in all modern nuclear weapons. The codes will be used to assist in the design of next generation neutron generators and help resolve manufacturing issues for current and future production of neutron devices. Customers for the software are identified, tube phenomena are identified and ranked, software quality strategies are given, and the validation plan is set forth.

  1. Independent Verification and Validation Of SAPHIRE 8 Software Design and Interface Design Project Number: N6423 U.S. Nuclear Regulatory Commission

    SciTech Connect

    Kent Norris

    2010-03-01

    The purpose of the Independent Verification and Validation (IV&V) role in the evaluation of the SAPHIRE software design and interface design is to assess the activities that result in the development, documentation, and review of a software design that meets the requirements defined in the software requirements documentation. The IV&V team began this endeavor after the software engineering and software development of SAPHIRE were already in production. IV&V reviewed the requirements specified in the NRC Form 189s to verify that these requirements were included in SAPHIRE's Software Verification and Validation Plan (SVVP) design specification.

  2. Making Sure Helping Helps.

    ERIC Educational Resources Information Center

    Gartner, Audrey; Riessman, Frank

    1993-01-01

    Benefits to the helper are important to consider in a national-service program, along with the benefits to the recipient. Some suggestions are offered to ensure reciprocity in community service. Democratizing help giving, that is making it available to the widest possible audience, could help remove some of the pitfalls associated with help…

  4. Answers to if the Lead Aprons are Really Helpful in Nuclear Medicine from the Perspective of Spectroscopy.

    PubMed

    He, X; Zhao, R; Rong, L; Yao, K; Chen, S; Wei, B

    2016-09-09

    Wearing lead X-ray-protective aprons is routine in nuclear medicine departments in parts of China. However, staff are often perplexed by questions such as whether it is imperative to wear aprons when injecting radioactive drugs, how much of the radiation dose can be shielded, and whether the apron will instead produce secondary radiation. To answer these questions, a semiconductor detector was employed to record gamma- and X-ray spectra with and without a lead apron or lead sheet. We then estimated the shielding ratio of the lead apron for different photons and compared it with data measured in hospitals. In general, the two results coincided well. The spectral results showed that detrimental secondary X-ray irradiation rises when the energy of the gamma rays exceeds the K absorption edge of lead (88 keV). Moreover, the aprons are not very effective for the 364 keV gamma rays emitted by (131)I or the 511 keV photons associated with positron-emitting nuclides. This work is purely a physical measurement in the laboratory. To the best of our knowledge, this is the first quantitative study of the level of gamma-ray protection offered by medical lead aprons, and the importance of spectroscopic measurements is discussed in this paper.
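
    The energy dependence discussed above can be illustrated with a simple narrow-beam attenuation estimate, T = exp(-mu t), for a lead layer. The attenuation coefficients and apron thickness below are rough illustrative placeholders (standard tabulations should be consulted for real work), and buildup and secondary radiation, which the paper highlights, are not modeled.

      import math

      # Sketch: narrow-beam transmission through a lead layer, T = exp(-mu * t).
      # The linear attenuation coefficients are rough illustrative values; buildup
      # and secondary radiation are not modeled here.
      apron_thickness_cm = 0.05        # ~0.5 mm lead equivalent (assumed)
      mu_per_cm = {                    # approximate linear attenuation in lead
          "Tc-99m, 140 keV": 27.0,
          "I-131, 364 keV": 3.2,
          "annihilation, 511 keV": 1.8,
      }

      for photon, mu in mu_per_cm.items():
          transmitted = math.exp(-mu * apron_thickness_cm)
          print(f"{photon}: about {100.0 * (1.0 - transmitted):.0f}% attenuated")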

  5. Independent Verification and Validation Of SAPHIRE 8 Volume 3 Users' Guide Project Number: N6423 U.S. Nuclear Regulatory Commission

    SciTech Connect

    Kent Norris

    2010-03-01

    The purpose of the Independent Verification and Validation (IV&V) role in the evaluation of the SAPHIRE 8 Volume 3 Users’ Guide is to assess the user documentation for its completeness, correctness, and consistency with respect to requirements for user interface and for any functionality that can be invoked by the user. The IV&V team began this endeavor after the software engineering and software development of SAPHIRE had already been in production.

  6. Helping Kids Help Themselves.

    ERIC Educational Resources Information Center

    Good, E. Perry

    This book explains how many of the behaviors that adults use to "help" kids are, at best, ineffective and, at worst, destructive to the adults' relationships with children. Adults traditionally believe that external cues prompt correct behavior--the premise of stimulus-response psychology. However, the ideas discussed here revolve around the…

  7. Swarm Verification

    NASA Technical Reports Server (NTRS)

    Holzmann, Gerard J.; Joshi, Rajeev; Groce, Alex

    2008-01-01

    Reportedly, supercomputer designer Seymour Cray once said that he would sooner use two strong oxen to plow a field than a thousand chickens. Although this is undoubtedly wise when it comes to plowing a field, it is not so clear for other types of tasks. Model checking problems are of the proverbial "needle in a haystack" type. Such problems can often be parallelized easily. Alas, none of the usual divide-and-conquer methods can be used to parallelize the working of a model checker. Given that it has become easier than ever to gain access to large numbers of computers to perform even routine tasks, it is becoming more and more attractive to find alternate ways to use these resources to speed up model checking tasks. This paper describes one such method, called swarm verification.

  8. Independent Verification and Validation Of SAPHIRE 8 Software Configuration Management Plan Project Number: N6423 U.S. Nuclear Regulatory Commission

    SciTech Connect

    Kent Norris

    2010-02-01

    The purpose of the Independent Verification and Validation (IV&V) role in the evaluation of SAPHIRE configuration management is to assess the activities that result in identifying and defining the baselines associated with the SAPHIRE software product; controlling changes to baselines and the release of baselines throughout the life cycle; recording and reporting the status of baselines and the proposed and actual changes to the baselines; and verifying the correctness and completeness of baselines. The IV&V team began this endeavor after the software engineering and software development of SAPHIRE were already in production.

  9. Independent Verification and Validation Of SAPHIRE 8 Software Acceptance Test Plan Project Number: N6423 U.S. Nuclear Regulatory Commission

    SciTech Connect

    Kent Norris

    2010-03-01

    The purpose of the Independent Verification and Validation (IV&V) role in the evaluation of the SAPHIRE 8 Software Acceptance Test Plan is to assess the approach to be taken for intended testing activities. The plan typically identifies the items to be tested, the requirements being tested, the testing to be performed, test schedules, personnel requirements, reporting requirements, evaluation criteria, and any risks requiring contingency planning. The IV&V team began this endeavor after the software engineering and software development of SAPHIRE had already been in production.

  10. Monitoring and verification R&D

    SciTech Connect

    Pilat, Joseph F; Budlong - Sylvester, Kory W; Fearey, Bryan L

    2011-01-01

    The 2010 Nuclear Posture Review (NPR) report outlined the Administration's approach to promoting the agenda put forward by President Obama in Prague on April 5, 2009. The NPR calls for a national monitoring and verification R&D program to meet future challenges arising from the Administration's nonproliferation, arms control and disarmament agenda. Verification of a follow-on to New START could have to address warheads and possibly components along with delivery capabilities. Deeper cuts and disarmament would need to address all of these elements along with nuclear weapon testing, nuclear material and weapon production facilities, virtual capabilities from old weapon and existing energy programs and undeclared capabilities. We only know how to address some elements of these challenges today, and the requirements may be more rigorous in the context of deeper cuts as well as disarmament. Moreover, there is a critical need for multiple options to sensitive problems and to address other challenges. There will be other verification challenges in a world of deeper cuts and disarmament, some of which we are already facing. At some point, if the reductions process is progressing, uncertainties about past nuclear materials and weapons production will have to be addressed. IAEA safeguards will need to continue to evolve to meet current and future challenges, and to take advantage of new technologies and approaches. Transparency/verification of nuclear and dual-use exports will also have to be addressed, and there will be a need to make nonproliferation measures more watertight and transparent. In this context, and recognizing we will face all of these challenges even if disarmament is not achieved, this paper will explore possible agreements and arrangements; verification challenges; gaps in monitoring and verification technologies and approaches; and the R&D required to address these gaps and other monitoring and verification challenges.

  11. SYNCHROTRON RADIATION, FREE ELECTRON LASER, APPLICATION OF NUCLEAR TECHNOLOGY, ETC.: Experimental verification of therapeutic doses for the superficially-placed tumor radiotherapy with heavy ions at HIRFL

    NASA Astrophysics Data System (ADS)

    Liu, Xin-Guo; Li, Qiang; Wu, Qing-Feng; Tao, Jia-Jun; Jin, Xiao-Dong

    2009-02-01

    Up to now, clinical trials of heavy-ion radiotherapy for superficially placed tumors have been carried out six times, and over 60 selected patients have been treated with 80-100 MeV/u carbon ions supplied by the Heavy Ion Research Facility in Lanzhou (HIRFL) at the Institute of Modern Physics, Chinese Academy of Sciences, since November 2006. A passive irradiation system and a dose optimization method for radiotherapy with carbon-ion beams have been developed. Experimental verification of longitudinal therapeutic dose distributions was conducted under conditions simulating patient treatment in the therapy terminal at HIRFL. The measured depth-dose distributions basically coincide with the expected ones. These results indicate that the irradiation system and the dose optimization method are effective for the ongoing carbon-ion radiotherapy of shallow-seated tumors at HIRFL.

  12. Implementation and verification of nuclear interactions in a Monte-Carlo code for the Procom-ProGam proton therapy planning system

    NASA Astrophysics Data System (ADS)

    Kostyuchenko, V. I.; Makarova, A. S.; Ryazantsev, O. B.; Samarin, S. I.; Uglov, A. S.

    2014-06-01

    A great breakthrough in proton therapy has occurred in the new century: several tens of dedicated centers are now operating throughout the world, and their number increases every year. An important component of proton therapy is the treatment planning system. To make calculations faster, these systems usually use analytical methods whose reliability and accuracy do not allow the advantages of this treatment method to be exploited to the full extent. Predictions by the Monte Carlo (MC) method are a "gold standard" for the verification of calculations with these systems. At the Institute of Experimental and Theoretical Physics (ITEP), one of the oldest proton therapy centers in the world, an MC code is an integral part of the treatment planning system. This code, called IThMC, was developed by scientists from RFNC-VNIITF (Snezhinsk) under ISTC Project 3563.

  13. CTBT Integrated Verification System Evaluation Model

    SciTech Connect

    Edenburn, M.W.; Bunting, M.L.; Payne, A.C. Jr.

    1997-10-01

    Sandia National Laboratories has developed a computer based model called IVSEM (Integrated Verification System Evaluation Model) to estimate the performance of a nuclear detonation monitoring system. The IVSEM project was initiated in June 1994, by Sandia's Monitoring Systems and Technology Center and has been funded by the US Department of Energy's Office of Nonproliferation and National Security (DOE/NN). IVSEM is a simple, top-level, modeling tool which estimates the performance of a Comprehensive Nuclear Test Ban Treaty (CTBT) monitoring system and can help explore the impact of various sensor system concepts and technology advancements on CTBT monitoring. One of IVSEM's unique features is that it integrates results from the various CTBT sensor technologies (seismic, infrasound, radionuclide, and hydroacoustic) and allows the user to investigate synergy among the technologies. Specifically, IVSEM estimates the detection effectiveness (probability of detection) and location accuracy of the integrated system and of each technology subsystem individually. The model attempts to accurately estimate the monitoring system's performance at medium interfaces (air-land, air-water) and for some evasive testing methods such as seismic decoupling. This report describes version 1.2 of IVSEM.

  14. Verification Games: Crowd-Sourced Formal Verification

    DTIC Science & Technology

    2016-03-01

    Formal Verification the verification tools developed by the Programming Languages and Software Engineering group were improved. A series of games... software makes it imperative to find more effective and efficient mechanisms for improving software reliability. Formal verification is an important part...of this effort, since it is the only way to be certain that a given piece of software is free of (certain types of) errors. To date, formal

  15. Generic Verification Protocol for Verification of Online Turbidimeters

    EPA Science Inventory

    This protocol provides generic procedures for implementing a verification test for the performance of online turbidimeters. The verification tests described in this document will be conducted under the Environmental Technology Verification (ETV) Program. Verification tests will...

  16. LANL measurements verification acceptance criteria

    SciTech Connect

    Chavez, D. M.

    2001-01-01

    The possibility of SNM diversion/theft is a major concern to organizations charged with control of Special Nuclear Material (SNM). Verification measurements are used to aid in the detection of SNM losses. The acceptance/rejection criteria for verification measurements depend on the facility-specific processes, the knowledge of the measured item, and the measurement technique applied. This paper discusses some of the LANL measurement control steps and criteria applied for the acceptance of a verification measurement. The process involves interaction among the facility operations personnel, the subject matter experts for a specific instrument/technique, the process knowledge of the matrix of the measured item, and the measurement-specific precision and accuracy values. This introduction to a site-specific application of measurement verification acceptance criteria helps safeguards personnel, material custodians, and SNM measurement professionals understand the acceptance/rejection process for measurements and its contribution to the detection of SNM diversion.
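
    A generic form of such an acceptance test, not the LANL facility-specific criteria, compares the verification measurement with the declared or accepted value and rejects it when the difference exceeds a chosen multiple of the combined standard uncertainty. The sketch below uses hypothetical numbers.

      import math

      # Generic sketch of a verification-measurement acceptance test (not the
      # facility-specific LANL criteria): reject when the measured value differs
      # from the accepted/declared value by more than k combined standard
      # deviations. All numbers below are hypothetical.
      def accept(measured_g, declared_g, sigma_meas_g, sigma_decl_g, k=3.0):
          sigma_combined = math.hypot(sigma_meas_g, sigma_decl_g)
          return abs(measured_g - declared_g) <= k * sigma_combined

      print(accept(498.2, 500.0, sigma_meas_g=1.5, sigma_decl_g=0.5))   # True: within 3 sigma
      print(accept(470.0, 500.0, sigma_meas_g=1.5, sigma_decl_g=0.5))   # False: flag for follow-up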

  17. 10 CFR 60.47 - Facility information and verification.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... 10 Energy 2 2011-01-01 2011-01-01 false Facility information and verification. 60.47 Section 60.47 Energy NUCLEAR REGULATORY COMMISSION (CONTINUED) DISPOSAL OF HIGH-LEVEL RADIOACTIVE WASTES IN GEOLOGIC REPOSITORIES Licenses US/IAEA Safeguards Agreement § 60.47 Facility information and verification. (a)...

  18. 10 CFR 60.47 - Facility information and verification.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... 10 Energy 2 2012-01-01 2012-01-01 false Facility information and verification. 60.47 Section 60.47 Energy NUCLEAR REGULATORY COMMISSION (CONTINUED) DISPOSAL OF HIGH-LEVEL RADIOACTIVE WASTES IN GEOLOGIC REPOSITORIES Licenses US/IAEA Safeguards Agreement § 60.47 Facility information and verification. (a)...

  19. 10 CFR 60.47 - Facility information and verification.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 10 Energy 2 2010-01-01 2010-01-01 false Facility information and verification. 60.47 Section 60.47 Energy NUCLEAR REGULATORY COMMISSION (CONTINUED) DISPOSAL OF HIGH-LEVEL RADIOACTIVE WASTES IN GEOLOGIC REPOSITORIES Licenses US/IAEA Safeguards Agreement § 60.47 Facility information and verification. (a)...

  20. 10 CFR 60.47 - Facility information and verification.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... 10 Energy 2 2013-01-01 2013-01-01 false Facility information and verification. 60.47 Section 60.47 Energy NUCLEAR REGULATORY COMMISSION (CONTINUED) DISPOSAL OF HIGH-LEVEL RADIOACTIVE WASTES IN GEOLOGIC REPOSITORIES Licenses US/IAEA Safeguards Agreement § 60.47 Facility information and verification. (a)...

  1. 10 CFR 60.47 - Facility information and verification.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... 10 Energy 2 2014-01-01 2014-01-01 false Facility information and verification. 60.47 Section 60.47 Energy NUCLEAR REGULATORY COMMISSION (CONTINUED) DISPOSAL OF HIGH-LEVEL RADIOACTIVE WASTES IN GEOLOGIC REPOSITORIES Licenses US/IAEA Safeguards Agreement § 60.47 Facility information and verification. (a)...

  2. The nuclear freeze controversy

    SciTech Connect

    Payne, K.B.; Gray, C.S.

    1984-01-01

    This book presents papers on nuclear arms control. Topics considered include the background and rationale behind the nuclear freeze proposal, nuclear deterrence, national defense, arms races, arms buildup, warfare, the moral aspects of nuclear deterrence, treaty verification, the federal budget, the economy, a historical perspective on Soviet policy toward the freeze, the other side of the Soviet peace offensive, and making sense of the nuclear freeze debate.

  3. Nuclear Scans

    MedlinePlus

    Nuclear scans use radioactive substances to see structures and functions inside your body. They use a special ... images. Most scans take 20 to 45 minutes. Nuclear scans can help doctors diagnose many conditions, including ...

  4. Sandia technology. Volume 13, number 2 Special issue : verification of arms control treaties.

    SciTech Connect

    Not Available

    1989-03-01

    Nuclear deterrence, a cornerstone of US national security policy, has helped prevent global conflict for over 40 years. The DOE and DoD share responsibility for this vital part of national security. The US will continue to rely on nuclear deterrence for the foreseeable future. In the late 1950s, Sandia developed satellite-borne nuclear burst detection systems to support the treaty banning atmospheric nuclear tests. This activity has continued to expand and diversify. When the Non-Proliferation Treaty was ratified in 1970, we began to develop technologies to protect nuclear materials from falling into unauthorized hands. This program grew and now includes systems for monitoring the movement and storage of nuclear materials, detecting tampering, and transmitting sensitive data securely. In the late 1970s, negotiations to further limit underground nuclear testing were being actively pursued. In less than 18 months, we fielded the National Seismic Station, an unattended observatory for in-country monitoring of nuclear tests. In the mid-1980s, arms-control interest shifted to facility monitoring and on-site inspection. Our Technical On-site Inspection Facility is the national test bed for perimeter and portal monitoring technology and the prototype for the inspection portal that was recently installed in the USSR under the Intermediate-Range Nuclear Forces accord. The articles in this special issue of Sandia Technology describe some of our current contributions to verification technology. This work supports the US policy to seek realistic arms control agreements while maintaining our national security.

  5. Cold fusion verification

    NASA Astrophysics Data System (ADS)

    North, M. H.; Mastny, G. F.; Wesley, E. J.

    1991-03-01

    The objective of this work was to verify and reproduce experimental observations of Cold Nuclear Fusion (CNF), as originally reported in 1989. The method was to start with the original report and add such additional information as became available to build a set of operational electrolytic CNF cells. Verification was to be achieved by first observing cells for neutron production; for those cells that demonstrated a nuclear effect, careful calorimetric measurements were planned. The authors concluded, after laboratory experience, reading published work, talking with others in the field, and attending conferences, that CNF probably is a chimera and will go the way of N-rays and polywater. The neutron detector used for these tests was a completely packaged unit built into a metal suitcase that afforded electrostatic shielding for the detectors and self-contained electronics. It was battery-powered, although it was on charge for most of the long tests. The sensor element consists of He detectors arranged in three independent layers in a solid moderating block. The count from each of the three layers, as well as the sum of all the detectors, was brought out and recorded separately. The neutron measurements were made with both the neutron detector and the sample tested in a cave made of thick moderating material that surrounded the two units on the sides and bottom.

  6. Columbus pressurized module verification

    NASA Technical Reports Server (NTRS)

    Messidoro, Piero; Comandatore, Emanuele

    1986-01-01

    The baseline verification approach of the COLUMBUS Pressurized Module was defined during the A and B1 project phases. Peculiarities of the verification program are the testing requirements derived from the permanent manned presence in space. The model philosophy and the test program have been developed in line with the overall verification concept. Such critical areas as meteoroid protections, heat pipe radiators and module seals are identified and tested. Verification problem areas are identified and recommendations for the next development are proposed.

  7. Verification of VLSI designs

    NASA Technical Reports Server (NTRS)

    Windley, P. J.

    1991-01-01

    In this paper we explore the specification and verification of VLSI designs. The paper focuses on abstract specification and verification of functionality using mathematical logic, as opposed to low-level boolean equivalence verification such as that done using BDDs and model checking. Specification and verification, sometimes called formal methods, is one tool for increasing computer dependability in the face of an exponentially increasing testing effort.

  8. Verification of Adaptive Systems

    SciTech Connect

    Pullum, Laura L; Cui, Xiaohui; Vassev, Emil; Hinchey, Mike; Rouff, Christopher; Buskens, Richard

    2012-01-01

    Adaptive systems are critical for future space and other unmanned and intelligent systems. Verification of these systems is also critical for their use in systems with potential harm to human life or with large financial investments. Due to their nondeterministic nature and extremely large state space, current methods for verification of software systems are not adequate to provide a high level of assurance for them. The combination of stabilization science, high performance computing simulations, compositional verification and traditional verification techniques, plus operational monitors, provides a complete approach to verification and deployment of adaptive systems that has not been used before. This paper gives an overview of this approach.

  9. Optimal Imaging for Treaty Verification

    SciTech Connect

    Brubaker, Erik; Hilton, Nathan R.; Johnson, William; Marleau, Peter; Kupinski, Matthew; MacGahan, Christopher Jonathan

    2014-09-01

    Future arms control treaty verification regimes may use radiation imaging measurements to confirm and track nuclear warheads or other treaty accountable items (TAIs). This project leverages advanced inference methods developed for medical and adaptive imaging to improve task performance in arms control applications. Additionally, we seek a method to acquire and analyze imaging data of declared TAIs without creating an image of those objects or otherwise storing or revealing any classified information. Such a method would avoid the use of classified-information barriers (IB).

  10. Site Specific Verification Guidelines.

    SciTech Connect

    Harding, Steve; Gordon, Frederick M.; Kennedy, Mike

    1992-05-01

    The Bonneville Power Administration (BPA) and the Northwest region have moved from energy surplus to a time when demand for energy is likely to exceed available supplies. The Northwest Power Planning Council is calling for a major push to acquire new resources. To meet anticipated loads in the next decade, BPA and the region must more than double the rate at which we acquire conservation resources. BPA hopes to achieve some of this doubling through programs independently designed and implemented by utilities and other parties without intensive BPA involvement. BPA will accept proposals for programs using performance-based payments, in which BPA bases its reimbursement to the sponsor on measured energy savings rather than program costs. To receive payment for conservation projects developed under performance-based programs, utilities and other project developers must propose verification plans to measure the amount of energy savings. BPA has traditionally used analysis of billing histories, before and after measure installation, adjusted by a comparison group of non-participating customers, to measure conservation savings. This approach does not work well for all conservation projects. For large or unusual facilities the comparison-group approach is not reliable due to the absence of enough comparable non-participants to allow appropriate statistical analysis. For these facilities, which include large commercial and institutional buildings, industrial projects, and complex combinations of building types served by a single utility meter, savings must be verified on a site-specific basis. These guidelines were written to help proposers understand what Bonneville considers the important issues in site-specific verification of conservation performance. The document also provides a toolbox of methods with guidance on their application and use. 15 refs.

  11. Verification Challenges at Low Numbers

    SciTech Connect

    Benz, Jacob M.; Booker, Paul M.; McDonald, Benjamin S.

    2013-06-01

    Many papers have dealt with the political difficulties and ramifications of deep nuclear arms reductions and the issues of "Going to Zero". Political issues include extended deterrence, conventional weapons, ballistic missile defense, and regional and geo-political security issues. At each step on the road to low numbers, the verification required to ensure compliance of all parties will increase significantly. Looking beyond New START, the next step will likely include warhead limits in the neighborhood of 1000. Further reductions will include stepping stones at 1000 warheads, hundreds of warheads, and then tens of warheads before final elimination of the last few remaining warheads and weapons could be considered. This paper will focus on these three threshold reduction levels: 1000 warheads, hundreds of warheads, and tens of warheads. For each, the issues and challenges will be discussed, potential solutions will be identified, and the verification technologies and chain-of-custody measures that address these solutions will be surveyed. It is important to note that many of the issues that need to be addressed have no current solution. In these cases, the paper will explore new or novel technologies that could be applied. These technologies will draw from the research and development that is ongoing throughout the national laboratory complex, and will look at technologies utilized in other areas of industry for their application to arms control verification.

  12. Fuel Retrieval System Design Verification Report

    SciTech Connect

    GROTH, B.D.

    2000-04-11

    The Fuel Retrieval Subproject was established as part of the Spent Nuclear Fuel Project (SNF Project) to retrieve and repackage the SNF located in the K Basins. The Fuel Retrieval System (FRS) construction work is complete in the KW Basin, and start-up testing is underway. Design modifications and construction planning are also underway for the KE Basin. An independent review of the design verification process as applied to the K Basin projects was initiated in support of preparation for the SNF Project operational readiness review (ORR). A Design Verification Status Questionnaire (Table 1) is included, which addresses Corrective Action SNF-EG-MA-EG-20000060, Item No. 9 (Miller 2000).

  13. Simulation verification techniques study

    NASA Technical Reports Server (NTRS)

    Schoonmaker, P. B.; Wenglinski, T. H.

    1975-01-01

    Results are summarized of the simulation verification techniques study which consisted of two tasks: to develop techniques for simulator hardware checkout and to develop techniques for simulation performance verification (validation). The hardware verification task involved definition of simulation hardware (hardware units and integrated simulator configurations), survey of current hardware self-test techniques, and definition of hardware and software techniques for checkout of simulator subsystems. The performance verification task included definition of simulation performance parameters (and critical performance parameters), definition of methods for establishing standards of performance (sources of reference data or validation), and definition of methods for validating performance. Both major tasks included definition of verification software and assessment of verification data base impact. An annotated bibliography of all documents generated during this study is provided.

  14. 7.0T nuclear magnetic resonance evaluation of the amyloid beta (1-40) animal model of Alzheimer's disease: comparison of cytology verification.

    PubMed

    Zhang, Lei; Dong, Shuai; Zhao, Guixiang; Ma, Yu

    2014-02-15

    3.0T magnetic resonance spectroscopic imaging is a commonly used method in research on brain function in Alzheimer's disease. However, the role of 7.0T high-field magnetic resonance spectroscopic imaging in the study of brain function in Alzheimer's disease remains unclear. In this study, 7.0T magnetic resonance spectroscopy showed that, in the hippocampus of Alzheimer's disease rats, the N-acetylaspartate peak was reduced and the creatine and choline peaks were elevated. This finding was further supported by hematoxylin-eosin staining, which showed a loss of hippocampal neurons and an increase in glial cells. Moreover, electron microscopy showed neuronal shrinkage and mitochondrial rupture, and scanning electron microscopy revealed hippocampal synaptic vesicles that were smaller and fewer in number, with incomplete synaptic structure. Overall, the results revealed that 7.0T high-field nuclear magnetic resonance spectroscopy detected the lesions and functional changes in hippocampal neurons of Alzheimer's disease rats in vivo, raising the possibility of assessing the success rate and grading of the amyloid beta (1-40) animal model of Alzheimer's disease.

  15. Experimental verification of proton beam monitoring in a human body by use of activity image of positron-emitting nuclei generated by nuclear fragmentation reaction.

    PubMed

    Nishio, Teiji; Miyatake, Aya; Inoue, Kazumasa; Gomi-Miyagishi, Tomoko; Kohno, Ryosuke; Kameoka, Satoru; Nakagawa, Keiichi; Ogino, Takashi

    2008-01-01

    Proton therapy is a form of radiotherapy that enables concentration of dose on a tumor by use of a scanned or modulated Bragg peak. Therefore, it is very important to evaluate the proton-irradiated volume accurately. The proton-irradiated volume can be confirmed with a PET apparatus by detecting the pair-annihilation gamma rays from positron-emitting nuclei generated by nuclear fragmentation reactions of the incident protons on target nuclei. The activity of the positron-emitting nuclei generated in a patient was measured with a PET-CT apparatus after proton beam irradiation. Activity measurements were performed in patients with tumors of the brain, head and neck, liver, lungs, and sacrum. The 3-D PET image overlaid on the CT image showed visual correspondence with the irradiation area of the proton beam. Moreover, the PET-CT images showed differences in activity strength among the irradiation sites. The activity values obtained from measurement and from calculation based on the reaction cross sections were compared, and it was confirmed that the intensity and the distribution of the activity changed with the start time of PET imaging after proton beam irradiation. This clinical information about the positron-emitting nuclei will be important for promoting proton treatment with higher accuracy in the future.
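
    The dependence on PET start time noted above follows from the standard activation-decay relation: a positron emitter produced at a constant rate R during an irradiation of length t_irr and then left to decay for a delay t_d has activity A = R (1 - exp(-lambda t_irr)) exp(-lambda t_d). The sketch below uses the roughly 20.4 min half-life of 11C, one of the positron emitters produced by proton-induced fragmentation in tissue, with a hypothetical production rate and timing.

      import math

      # Sketch: activity of a positron emitter (here 11C, half-life ~20.4 min)
      # produced at a constant rate during irradiation, evaluated after a delay
      # before PET imaging starts. The production rate and times are hypothetical.
      half_life_s = 20.4 * 60.0
      lam = math.log(2) / half_life_s
      production_rate = 1.0e5      # hypothetical production rate (nuclei per second)
      t_irr = 120.0                # irradiation time (s)

      for t_delay in (0.0, 300.0, 600.0):          # delay before PET imaging (s)
          activity = production_rate * (1.0 - math.exp(-lam * t_irr)) * math.exp(-lam * t_delay)
          print(f"delay {t_delay:4.0f} s -> activity {activity:8.1f} Bq")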

  16. Software verification and testing

    NASA Technical Reports Server (NTRS)

    1985-01-01

    General procedures for software verification and validation are provided as a guide for managers, programmers, and analysts involved in software development. The verification and validation procedures described are based primarily on testing techniques. Testing refers to the execution of all or part of a software system for the purpose of detecting errors. Planning, execution, and analysis of tests are outlined in this document. Code reading and static analysis techniques for software verification are also described.

  17. Verification of Scientific Simulations via Hypothesis-Driven Comparative and Quantitative Visualization

    SciTech Connect

    Ahrens, James P; Heitmann, Katrin; Petersen, Mark R; Woodring, Jonathan; Williams, Sean; Fasel, Patricia; Ahrens, Christine; Hsu, Chung-Hsing; Geveci, Berk

    2010-11-01

    This article presents a visualization-assisted process that verifies scientific-simulation codes. Code verification is necessary because scientists require accurate predictions to interpret data confidently. This verification process integrates iterative hypothesis verification with comparative, feature, and quantitative visualization. Following this process can help identify differences in cosmological and oceanographic simulations.

  18. The Challenge for Arms Control Verification in the Post-New START World

    SciTech Connect

    Wuest, C R

    2012-05-24

    Nuclear weapon arms control treaty verification is a key aspect of any agreement between signatories to establish that the terms and conditions spelled out in the treaty are being met. Historically, arms control negotiations have focused more on the rules and protocols for reducing the numbers of warheads and delivery systems - sometimes resorting to complex and arcane procedures for counting forces - in an attempt to address perceived or real imbalances in a nation's strategic posture that could lead to instability. Verification procedures are generally defined in arms control treaties and supporting documents and tend to focus on technical means and measures designed to ensure that a country is following the terms of the treaty and that it is not liable to engage in deception or outright cheating in an attempt to circumvent the spirit and the letter of the agreement. As the Obama Administration implements the articles, terms, and conditions of the recently ratified and entered-into-force New START treaty, there are already efforts within and outside of government to move well below the specified New START levels of 1550 warheads, 700 deployed strategic delivery vehicles, and 800 deployed and nondeployed strategic launchers (Inter-Continental Ballistic Missile (ICBM) silos, Submarine-Launched Ballistic Missile (SLBM) tubes on submarines, and bombers). A number of articles and opinion pieces have appeared that advocate for significantly deeper cuts in the U.S. nuclear stockpile, with some suggesting that unilateral reductions on the part of the U.S. would help coax Russia and others to follow our lead. Papers and studies prepared for the U.S. Department of Defense and at the U.S. Air War College have also been published, suggesting that nuclear forces totaling no more than about 300 warheads would be sufficient to meet U.S. national security and deterrence needs. (Davis 2011, Schaub and Forsyth 2010) Recent articles by James M. Acton and others suggest that the

  19. Standardized verification of fuel cycle modeling

    DOE PAGES

    Feng, B.; Dixon, B.; Sunny, E.; ...

    2016-04-05

    A nuclear fuel cycle systems modeling and code-to-code comparison effort was coordinated across multiple national laboratories to verify the tools needed to perform fuel cycle analyses of the transition from a once-through nuclear fuel cycle to a sustainable potential future fuel cycle. For this verification study, a simplified example transition scenario was developed to serve as a test case for the four systems codes involved (DYMOND, VISION, ORION, and MARKAL), each used by a different laboratory participant. In addition, all participants produced spreadsheet solutions for the test case to check all the mass flows and reactor/facility profiles on a year-by-year basis throughout the simulation period. The test case specifications describe a transition from the current US fleet of light water reactors to a future fleet of sodium-cooled fast reactors that continuously recycle transuranic elements as fuel. After several initial coordinated modeling and calculation attempts, it was revealed that most of the differences in code results were not due to different code algorithms or calculation approaches, but due to different interpretations of the input specifications among the analysts. Therefore, the specifications for the test case itself were iteratively updated to remove ambiguity and to help calibrate interpretations. In addition, a few corrections and modifications were made to the codes as well, which led to excellent agreement between all codes and spreadsheets for this test case. Although no fuel cycle transition analysis codes matched the spreadsheet results exactly, all remaining differences in the results were due to fundamental differences in code structure and/or were thoroughly explained. As a result, the specifications and example results are provided so that they can be used to verify additional codes in the future for such fuel cycle transition scenarios.
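
    A minimal sketch of the kind of year-by-year check used in such a comparison, with hypothetical mass flows and an assumed tolerance (neither taken from the study):

        # Hypothetical year-by-year mass flows (tonnes) reported by a systems code
        # and by the reference spreadsheet solution; values are illustrative only.
        code_result = [12.0, 12.4, 13.1, 14.0, 15.2]
        spreadsheet = [12.0, 12.5, 13.0, 14.1, 15.2]
        tolerance = 0.05  # assumed 5% relative-difference acceptance criterion

        for year, (c, s) in enumerate(zip(code_result, spreadsheet), start=1):
            rel_diff = abs(c - s) / s
            flag = "agrees" if rel_diff <= tolerance else "disagrees"
            print(f"year {year}: code={c:5.1f}  spreadsheet={s:5.1f}  "
                  f"rel. diff.={rel_diff:.3f}  {flag}")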

  20. Standardized verification of fuel cycle modeling

    SciTech Connect

    Feng, B.; Dixon, B.; Sunny, E.; Cuadra, A.; Jacobson, J.; Brown, N. R.; Powers, J.; Worrall, A.; Passerini, S.; Gregg, R.

    2016-04-05

    A nuclear fuel cycle systems modeling and code-to-code comparison effort was coordinated across multiple national laboratories to verify the tools needed to perform fuel cycle analyses of the transition from a once-through nuclear fuel cycle to a sustainable potential future fuel cycle. For this verification study, a simplified example transition scenario was developed to serve as a test case for the four systems codes involved (DYMOND, VISION, ORION, and MARKAL), each used by a different laboratory participant. In addition, all participants produced spreadsheet solutions for the test case to check all the mass flows and reactor/facility profiles on a year-by-year basis throughout the simulation period. The test case specifications describe a transition from the current US fleet of light water reactors to a future fleet of sodium-cooled fast reactors that continuously recycle transuranic elements as fuel. After several initial coordinated modeling and calculation attempts, it was revealed that most of the differences in code results were not due to different code algorithms or calculation approaches, but due to different interpretations of the input specifications among the analysts. Therefore, the specifications for the test case itself were iteratively updated to remove ambiguity and to help calibrate interpretations. In addition, a few corrections and modifications were made to the codes as well, which led to excellent agreement between all codes and spreadsheets for this test case. Although no fuel cycle transition analysis codes matched the spreadsheet results exactly, all remaining differences in the results were due to fundamental differences in code structure and/or were thoroughly explained. As a result, the specifications and example results are provided so that they can be used to verify additional codes in the future for such fuel cycle transition scenarios.

  1. The Concept of Conversion Factors and Reference Crops for the Prediction of 137Cs Root Uptake: Field Verification in Post-Chernobyl Landscape, 30 Years after Nuclear Accident

    NASA Astrophysics Data System (ADS)

    Komissarova, Olga; Paramonova, Tatiana

    2017-04-01

    One of the notable lessons from nuclear accidents is the opportunity to characterize the general features of 137Cs root uptake by agricultural crops, in order to predict radionuclide accumulation in plants and its further distribution via food chains. Transfer factors (TFs) (the ratio of 137Cs activities in vegetation and in soil) have become the basis for such assessments when the characteristics of the radioactive contamination, the soil properties, and the phylogenetic features of the plant taxa that are important for root uptake are known. For simplicity, the concept of a conversion factor (CF) was adopted by the IAEA (2006) to obtain an unknown TF value from the TF of a reference crop cultivated on the same soil. Cereals were selected as the reference group of agricultural crops. Taking the TF for cereals as 1, the CFs for tubers and fodder legumes are 4, for grasses 4.5, for leafy vegetables 9, etc. To verify the TF and corresponding CF values under the environmental conditions of a post-Chernobyl agricultural landscape, a study was conducted in the area of the Plavsky radioactive hotspot (Tula region, Russia). Now, 30 years after the Chernobyl accident (the first half-life period of 137Cs), arable chernozems of the territory are still polluted at the level of 126-282 kBq/m2. The main crops of the field rotation -- wheat and barley (cereals), potatoes (tubers), soybean (leguminous), amaranth (non-leafy vegetables), rape ("other crops") -- as well as a galega-bromegrass mixture (cultivated grasses) and pasture grasses of semi-natural dry and wet meadows were studied. Accumulation parameters of 137Cs in aboveground biomass, belowground biomass, and edible parts of the plants were examined separately. The experimentally obtained 137Cs TFs for cereals are 0.24-0.32 for total biomass, 0.07-0.14 for aerial parts, 0.54-0.64 for roots, and 0.01-0.02 for grain. Thus, (i) 137Cs transfer into the grain of wheat and barley is insignificant, and (ii) corresponding TF values in both crops
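
    As a worked illustration of the TF/CF bookkeeping described above (the activity concentrations below are hypothetical, not the field data): a TF is the ratio of the 137Cs activity concentration in the plant to that in the soil, and the conversion factors scale the reference-cereal TF to other crop groups.

        # Hypothetical 137Cs activity concentrations (Bq/kg dry mass); illustrative only.
        soil_activity = 400.0    # arable soil
        grain_activity = 6.0     # cereal grain grown on that soil

        tf_cereal = grain_activity / soil_activity  # TF of the reference crop

        # Conversion factors relative to cereals (cereals = 1 by convention),
        # as listed in the abstract above.
        conversion_factors = {"cereals": 1.0, "tubers": 4.0, "fodder legumes": 4.0,
                              "grasses": 4.5, "leafy vegetables": 9.0}

        for crop, cf in conversion_factors.items():
            # Predicted TF for a crop grown on the same soil.
            print(f"{crop:16s}: predicted TF = {cf * tf_cereal:.3f}")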

  2. Search Help

    EPA Pesticide Factsheets

    Guidance and search help resource listing examples of common queries that can be used in the Google Search Appliance search request, including examples of special characters, or query term separators, that Google Search Appliance recognizes.

  3. Verification Of Tooling For Robotic Welding

    NASA Technical Reports Server (NTRS)

    Osterloh, Mark R.; Sliwinski, Karen E.; Anderson, Ronald R.

    1991-01-01

    Computer simulations, robotic inspections, and visual inspections are performed to detect discrepancies. The method for verification of tooling for robotic welding involves a combination of computer simulations and visual inspections. The verification process ensures the accuracy of the mathematical model representing the tooling in the off-line programming system that numerically simulates operation of the robotic welding system. The process helps prevent damaging collisions between welding equipment and workpiece, ensures that tooling is positioned and oriented properly with respect to the workpiece, and/or determines whether tooling must be modified or adjusted to achieve the foregoing objectives.

  4. Regression Verification Using Impact Summaries

    NASA Technical Reports Server (NTRS)

    Backes, John; Person, Suzette J.; Rungta, Neha; Thachuk, Oksana

    2013-01-01

    Regression verification techniques are used to prove equivalence of syntactically similar programs. Checking equivalence of large programs, however, can be computationally expensive. Existing regression verification techniques rely on abstraction and decomposition techniques to reduce the computational effort of checking equivalence of the entire program. These techniques are sound but not complete. In this work, we propose a novel approach to improve scalability of regression verification by classifying the program behaviors generated during symbolic execution as either impacted or unimpacted. Our technique uses a combination of static analysis and symbolic execution to generate summaries of impacted program behaviors. The impact summaries are then checked for equivalence using an off-the-shelf decision procedure. We prove that our approach is both sound and complete for sequential programs, with respect to the depth bound of symbolic execution. Our evaluation on a set of sequential C artifacts shows that reducing the size of the summaries can help reduce the cost of software equivalence checking. Various reduction, abstraction, and compositional techniques have been developed to help scale software verification techniques to industrial-sized systems. Although such techniques have greatly increased the size and complexity of systems that can be checked, analysis of large software systems remains costly. Regression analysis techniques, e.g., regression testing [16], regression model checking [22], and regression verification [19], restrict the scope of the analysis by leveraging the differences between program versions. These techniques are based on the idea that if code is checked early in development, then subsequent versions can be checked against a prior (checked) version, leveraging the results of the previous analysis to reduce analysis cost of the current version. Regression verification addresses the problem of proving equivalence of closely related program
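
    A minimal sketch of discharging an equivalence check with an off-the-shelf decision procedure (here the Z3 SMT solver through its Python bindings); the two fragments are hypothetical and unrelated to the artifacts studied in the paper:

        from z3 import Int, Solver, unsat

        x = Int("x")

        # Two versions of the same computation, encoded as symbolic expressions.
        old_version = x * 2 + x   # original form: 3x
        new_version = x * 3       # refactored form

        # The versions are equivalent iff no input makes their outputs differ.
        solver = Solver()
        solver.add(old_version != new_version)
        print("equivalent" if solver.check() == unsat else "not equivalent")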

  5. National Center for Nuclear Security: The Nuclear Forensics Project (F2012)

    SciTech Connect

    Klingensmith, A. L.

    2012-03-21

    These presentation visuals introduce the National Center for Nuclear Security. Its chartered mission is to enhance the Nation’s verification and detection capabilities in support of nuclear arms control and nonproliferation through R&D activities at the NNSS. It has three focus areas: Treaty Verification Technologies, Nonproliferation Technologies, and Technical Nuclear Forensics. The objectives of nuclear forensics are to reduce uncertainty in the nuclear forensics process and improve the scientific defensibility of nuclear forensics conclusions when applied to near-surface nuclear detonations. Research is in four key areas: nuclear physics, debris collection and analysis, prompt diagnostics, and radiochemistry.

  6. Systems Approach to Arms Control Verification

    SciTech Connect

    Allen, K; Neimeyer, I; Listner, C; Stein, G; Chen, C; Dreicer, M

    2015-05-15

    Using the decades of experience of developing concepts and technologies for verifying bilateral and multilateral arms control agreements, a broad conceptual systems approach is being developed that takes into account varying levels of information and risk. The IAEA has already demonstrated the applicability of a systems approach by implementing safeguards at the State level, with acquisition path analysis as the key element. In order to test whether such an approach could also be implemented for arms control verification, an exercise was conducted in November 2014 at the JRC ITU Ispra. Based on the scenario of a hypothetical treaty between two model nuclear weapons states aimed at capping their nuclear arsenals at existing levels, the goal of this exercise was to explore how to use acquisition path analysis in an arms control context. Our contribution will present the scenario, objectives and results of this exercise, and attempt to define future workshops aimed at further developing verification measures that will deter or detect treaty violations.

  7. Verification of the databases EXFOR and ENDF

    NASA Astrophysics Data System (ADS)

    Berton, Gottfried; Damart, Guillaume; Cabellos, Oscar; Beauzamy, Bernard; Soppera, Nicolas; Bossant, Manuel

    2017-09-01

    The objective of this work is the verification of large experimental (EXFOR) and evaluated nuclear reaction databases (JEFF, ENDF, JENDL, TENDL…). The work is applied to neutron reactions in the EXFOR data, including threshold reactions, isomeric transitions, angular distributions, and data in the resonance region for both isotopes and natural elements. Finally, a comparison of the resonance integrals compiled in the EXFOR database with those derived from the evaluated libraries is also performed.
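
    One of the quantities compared here, the resonance integral, can be computed from a pointwise cross section as I = ∫ σ(E) dE/E above the cadmium cutoff (about 0.5 eV). A minimal numerical sketch on hypothetical pointwise data:

        import numpy as np

        # Hypothetical pointwise capture cross-section data (energy in eV, sigma in
        # barns); an evaluated library would supply far denser, real data.
        energy = np.array([0.5, 1.0, 5.0, 2.0e1, 1.0e2, 1.0e3, 1.0e4, 1.0e5])
        sigma = np.array([9.0, 6.5, 3.0, 1.5, 0.6, 0.2, 0.05, 0.01])

        # Resonance integral: I = integral of sigma(E)/E dE above ~0.5 eV.
        resonance_integral = np.trapz(sigma / energy, energy)
        print(f"resonance integral ~ {resonance_integral:.2f} barns")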

  8. Packaged low-level waste verification system

    SciTech Connect

    Tuite, K.T.; Winberg, M.; Flores, A.Y.; Killian, E.W.; McIsaac, C.V.

    1996-08-01

    Currently, states and low-level radioactive waste (LLW) disposal site operators have no method of independently verifying the radionuclide content of packaged LLW that arrives at disposal sites. At this time, disposal sites rely on LLW generator shipping manifests and accompanying records to ensure that LLW received meets the waste acceptance criteria. An independent verification system would provide a method of checking generator LLW characterization methods and help ensure that LLW disposed of at disposal facilities meets requirements. The Mobile Low-Level Waste Verification System (MLLWVS) provides the equipment, software, and methods to enable the independent verification of LLW shipping records and to ensure that disposal site waste acceptance criteria are being met. The MLLWVS was developed under a cost-share subcontract between WMG, Inc., and Lockheed Martin Idaho Technologies through the Department of Energy's National Low-Level Waste Management Program at the Idaho National Engineering Laboratory (INEL).

  9. The Microbiology of Subsurface, Salt-Based Nuclear Waste Repositories: Using Microbial Ecology, Bioenergetics, and Projected Conditions to Help Predict Microbial Effects on Repository Performance

    SciTech Connect

    Swanson, Juliet S.; Cherkouk, Andrea; Arnold, Thuro; Meleshyn, Artur; Reed, Donald T.

    2016-11-17

    This report summarizes the potential role of microorganisms in salt-based nuclear waste repositories using available information on the microbial ecology of hypersaline environments, the bioenergetics of survival under high ionic strength conditions, and “repository microbiology” related studies. In areas where microbial activity is in question, there may be a need to shift the research focus toward feasibility studies rather than studies that generate actual input for performance assessments. In areas where activity is not necessary to affect performance (e.g., biocolloid transport), repository-relevant data should be generated. Both approaches will lend a realistic perspective to a safety case/performance scenario that will most likely underscore the conservative value of that case.

  10. RISKIND verification and benchmark comparisons

    SciTech Connect

    Biwer, B.M.; Arnish, J.J.; Chen, S.Y.; Kamboj, S.

    1997-08-01

    This report presents verification calculations and benchmark comparisons for RISKIND, a computer code designed to estimate potential radiological consequences and health risks to individuals and the population from exposures associated with the transportation of spent nuclear fuel and other radioactive materials. Spreadsheet calculations were performed to verify the proper operation of the major options and calculational steps in RISKIND. The program is unique in that it combines a variety of well-established models into a comprehensive treatment for assessing risks from the transportation of radioactive materials. Benchmark comparisons with other validated codes that incorporate similar models were also performed. For instance, the external gamma and neutron dose rate curves for a shipping package estimated by RISKIND were compared with those estimated by using the RADTRAN 4 code and NUREG-0170 methodology. Atmospheric dispersion of released material and the resulting dose estimates were compared with results from the GENII and CAP88-PC codes. Verification results have shown the program to be performing its intended function correctly. The benchmark results indicate that the predictions made by RISKIND are within acceptable limits when compared with predictions from similar existing models.

  11. Seismic Surveillance. Nuclear Test Ban Verification

    DTIC Science & Technology

    1990-02-26

  12. Seismic Surveillance - Nuclear Test Ban Verification

    DTIC Science & Technology

    1992-03-27

  13. Voltage verification unit

    DOEpatents

    Martin, Edward J [Virginia Beach, VA

    2008-01-15

    A voltage verification unit and method for determining the absence of potentially dangerous potentials within a power supply enclosure without Mode 2 work is disclosed. With this device and method, a qualified worker follows a relatively simple protocol that involves a function test (hot, cold, hot) of the voltage verification unit before Lock Out/Tag Out and, once the Lock Out/Tag Out is completed, testing or "trying" by simply reading a display on the voltage verification unit, all without exposure of the operator to the interior of the voltage supply enclosure. According to a preferred embodiment, the voltage verification unit includes test leads to allow diagnostics with other meters, without the necessity of accessing potentially dangerous bus bars or the like.

  14. Visual inspection for CTBT verification

    SciTech Connect

    Hawkins, W.; Wohletz, K.

    1997-03-01

    On-site visual inspection will play an essential role in future Comprehensive Test Ban Treaty (CTBT) verification. Although seismic and remote sensing techniques are the best understood and most developed methods for detection of evasive testing of nuclear weapons, visual inspection can greatly augment the certainty and detail of understanding provided by these more traditional methods. Not only can visual inspection offer "ground truth" in cases of suspected nuclear testing, but it also can provide accurate source location and testing media properties necessary for detailed analysis of seismic records. For testing in violation of the CTBT, an offending party may attempt to conceal the test, which most likely will be achieved by underground burial. While such concealment may not prevent seismic detection, evidence of test deployment, location, and yield can be disguised. In this light, if a suspicious event is detected by seismic or other remote methods, visual inspection of the event area is necessary to document any evidence that might support a claim of nuclear testing and provide data needed to further interpret seismic records and guide further investigations. However, the methods for visual inspection are neither widely known nor widely appreciated, and experience is presently limited. Visual inspection can be achieved by simple, non-intrusive means, primarily geological in nature, and it is the purpose of this report to describe the considerations, procedures, and equipment required to field such an inspection.

  15. Modular Machine Code Verification

    DTIC Science & Technology

    2007-05-01

    One approach to assembly-level program verification is to design a type system for assembly language, partly inspired by the Typed Intermediate Language (TIL). The work described here is designed to support direct verification of assembly programs with non-trivial properties that are not expressible in traditional types.

  16. Turbulence Modeling Verification and Validation

    NASA Technical Reports Server (NTRS)

    Rumsey, Christopher L.

    2014-01-01

    Computational fluid dynamics (CFD) software that solves the Reynolds-averaged Navier-Stokes (RANS) equations has been in routine use for more than a quarter of a century. It is currently employed not only for basic research in fluid dynamics, but also for the analysis and design processes in many industries worldwide, including aerospace, automotive, power generation, chemical manufacturing, polymer processing, and petroleum exploration. A key feature of RANS CFD is the turbulence model. Because the RANS equations are unclosed, a model is necessary to describe the effects of the turbulence on the mean flow, through the Reynolds stress terms. The turbulence model is one of the largest sources of uncertainty in RANS CFD, and most models are known to be flawed in one way or another. Alternative methods such as direct numerical simulations (DNS) and large eddy simulations (LES) rely less on modeling and hence include more physics than RANS. In DNS all turbulent scales are resolved, and in LES the large scales are resolved and the effects of the smallest turbulence scales are modeled. However, both DNS and LES are too expensive for most routine industrial usage on today's computers. Hybrid RANS-LES, which blends RANS near walls with LES away from walls, helps to moderate the cost while still retaining some of the scale-resolving capability of LES, but for some applications it can still be too expensive. Even considering its associated uncertainties, RANS turbulence modeling has proved to be very useful for a wide variety of applications. For example, in the aerospace field, many RANS models are considered to be reliable for computing attached flows. However, existing turbulence models are known to be inaccurate for many flows involving separation. Research has been ongoing for decades in an attempt to improve turbulence models for separated and other nonequilibrium flows. When developing or improving turbulence models, both verification and validation are important
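
    For reference, the closure mentioned here is most often supplied through the Boussinesq eddy-viscosity hypothesis, which linear RANS models use to relate the unknown Reynolds stresses to the mean strain rate (written below in standard tensor notation):

        % Boussinesq eddy-viscosity hypothesis used by linear RANS turbulence models
        -\overline{u_i' u_j'} = \nu_t \left( \frac{\partial U_i}{\partial x_j}
            + \frac{\partial U_j}{\partial x_i} \right) - \frac{2}{3}\, k\, \delta_{ij},
        \qquad k = \tfrac{1}{2}\, \overline{u_k' u_k'}

    The turbulence model's job is then to supply the eddy viscosity, for example from transport equations for the turbulence kinetic energy and a second scale-determining variable.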

  17. Helping Children Help Themselves. Revised.

    ERIC Educational Resources Information Center

    Alberta Dept. of Agriculture, Edmonton.

    Youth leaders and parents can use this activity oriented publication to help children six to twelve years of age become more independent by acquiring daily living skills. The publication consists of five units, each of which contains an introduction, learning activities, and lists of resource materials. Age-ability levels are suggested for…

  19. Help Us to Help Ourselves

    ERIC Educational Resources Information Center

    Stanistreet, Paul

    2010-01-01

    Local authorities have a strong tradition of supporting communities to help themselves, and this is nowhere better illustrated than in the learning they commission and deliver through the Adult Safeguarded Learning budget. The budget was set up to protect at least a minimum of provision for adult liberal education, family learning and learning for…

  1. Verification of Autonomous Systems for Space Applications

    NASA Technical Reports Server (NTRS)

    Brat, G.; Denney, E.; Giannakopoulou, D.; Frank, J.; Jonsson, A.

    2006-01-01

    Autonomous software, especially if it is model-based, can play an important role in future space applications. For example, it can help streamline ground operations, assist in autonomous rendezvous and docking operations, or even help recover from problems (e.g., planners can be used to explore the space of recovery actions for a power subsystem and implement a solution without, or with minimal, human intervention). In general, the exploration capabilities of model-based systems give them great flexibility. Unfortunately, this also makes them unpredictable to our human eyes, both in terms of their execution and their verification. Traditional verification techniques are inadequate for these systems since they are mostly based on testing, which implies a very limited exploration of their behavioral space. In our work, we explore how advanced V&V techniques, such as static analysis, model checking, and compositional verification, can be used to gain trust in model-based systems. We also describe how synthesis can be used in the context of system reconfiguration and in the context of verification.

  2. Natural Analogues - One Way to Help Build Public Confidence in the Predicted Performance of a Mined Geologic Repository for Nuclear Waste

    SciTech Connect

    Stuckless, J. S.

    2002-02-26

    The general public needs to have a way to judge the predicted long-term performance of the potential high-level nuclear waste repository at Yucca Mountain. The applicability and reliability of mathematical models used to make this prediction are neither easily understood nor accepted by the public. Natural analogues can provide the average person with a tool to assess the predicted performance and other scientific conclusions. For example, hydrologists with the Yucca Mountain Project have predicted that most of the water moving through the unsaturated zone at Yucca Mountain, Nevada will move through the host rock and around tunnels. Thus, seepage into tunnels is predicted to be a small percentage of available infiltration. This hypothesis can be tested experimentally and with some quantitative analogues. It can also be tested qualitatively using a variety of analogues such as (1) well-preserved Paleolithic to Neolithic paintings in caves and rock shelters, (2) biological remains preserved in caves and rock shelters, and (3) artifacts and paintings preserved in man-made underground openings. These examples can be found in materials that are generally available to the non-scientific public and can demonstrate the surprising degree of preservation of fragile and easily destroyed materials for very long periods of time within the unsaturated zone.

  3. Does better taxon sampling help? A new phylogenetic hypothesis for Sepsidae (Diptera: Cyclorrhapha) based on 50 new taxa and the same old mitochondrial and nuclear markers.

    PubMed

    Zhao, Lei; Annie, Ang Shi Hui; Amrita, Srivathsan; Yi, Su Kathy Feng; Rudolf, Meier

    2013-10-01

    We here present a phylogenetic hypothesis for Sepsidae (Diptera: Cyclorrhapha), a group of schizophoran flies with ca. 320 described species that is widely used in sexual selection research. The hypothesis is based on five nuclear and five mitochondrial markers totaling 8813 bp for ca. 30% of the diversity (105 sepsid taxa) and - depending on analysis - six or nine outgroup species. Maximum parsimony (MP), maximum likelihood (ML), and Bayesian inferences (BI) yield overall congruent, well-resolved, and supported trees that are largely unaffected by three different ways to partition the data in BI and ML analyses. However, there are also five areas of uncertainty that affect suprageneric relationships where different analyses yield alternate topologies and MP and ML trees have significant conflict according to Shimodaira-Hasegawa tests. Two of these were already affected by conflict in a previous analysis that was based on the same genes and a subset of 69 species. The remaining three involve newly added taxa or genera whose relationships were previously resolved with low support. We thus find that the denser taxon sample in the present analysis does not reduce the topological conflict that had been identified previously. The present study nevertheless presents a significant contribution to the understanding of sepsid relationships in that 50 additional taxa from 18 genera are added to the Tree-of-Life of Sepsidae and that the placement of most taxa is well supported and robust to different tree reconstruction techniques. Copyright © 2013 Elsevier Inc. All rights reserved.

  4. Multi-canister overpack project -- verification and validation, MCNP 4A

    SciTech Connect

    Goldmann, L.H.

    1997-11-10

    This supporting document contains the software verification and validation (V and V) package used for Phase 2 design of the Spent Nuclear Fuel Multi-Canister Overpack. V and V packages for both ANSYS and MCNP are included. Description of Verification Run(s): This software requires that it be compiled specifically for the machine it is to be used on. Therefore, to facilitate the verification process, the software automatically runs 25 sample problems to ensure proper installation and compilation. Once the runs are completed, the software checks for verification by performing a file comparison of the new output files against the old output files. Any differences between any of the files will cause a verification error. Due to the manner in which the verification is completed, a verification error does not necessarily indicate a problem; rather, it indicates that a closer look at the output files is needed to determine the cause of the error.

  5. Verification and Validation of Digitally Upgraded Control Rooms

    SciTech Connect

    Boring, Ronald; Lau, Nathan

    2015-09-01

    As nuclear power plants undertake main control room modernization, a challenge is the lack of a clearly defined human factors process to follow. Verification and validation (V&V) as applied in the nuclear power community has tended to involve efforts such as integrated system validation, which comes at the tail end of the design stage. To fill in guidance gaps and create a step-by-step process for control room modernization, we have developed the Guideline for Operational Nuclear Usability and Knowledge Elicitation (GONUKE). This approach builds on best practices in the software industry, which prescribe an iterative user-centered approach featuring multiple cycles of design and evaluation. Nuclear regulatory guidance for control room design emphasizes summative evaluation—which occurs after the design is complete. In the GONUKE approach, evaluation is also performed at the formative stage of design—early in the design cycle using mockups and prototypes for evaluation. The evaluation may involve expert review (e.g., software heuristic evaluation at the formative stage and design verification against human factors standards like NUREG-0700 at the summative stage). The evaluation may also involve user testing (e.g., usability testing at the formative stage and integrated system validation at the summative stage). An additional, often overlooked component of evaluation is knowledge elicitation, which captures operator insights into the system. In this report we outline these evaluation types across design phases that support the overall modernization process. The objective is to provide industry-suitable guidance for steps to be taken in support of the design and evaluation of a new human-machine interface (HMI) in the control room. We suggest the value of early-stage V&V and highlight how this early-stage V&V can help improve the design process for control room modernization. We argue that there is a need to overcome two shortcomings of V&V in current practice

  6. Centrifugal Tensioned Metastable Fluid Detectors for Trace Radiation Sources: Experimental Verification and Military Employment

    DTIC Science & Technology

    2016-06-01

    Centrifugal tensioned metastable fluid detectors enable the detection of fast neutrons or alpha particles that are telltale signs of nuclear material, while remaining blind to gamma radiation.

  7. Low-Volatility Agent Permeation (LVAP) Verification and Validation Report

    DTIC Science & Technology

    2015-05-01

    This verification and validation work was performed through research at the U.S. Army Edgewood Chemical Biological Center (ECBC; Aberdeen Proving Ground, MD), with support from the Joint Program Executive Office for Chemical and Biological Defense (JPEO-CBD; Aberdeen Proving Ground, MD). Stakeholders, including representatives from ECBC, provided comments to help make this verification testing successful.

  8. Explaining Verification Conditions

    NASA Technical Reports Server (NTRS)

    Deney, Ewen; Fischer, Bernd

    2006-01-01

    The Hoare approach to program verification relies on the construction and discharge of verification conditions (VCs) but offers no support to trace, analyze, and understand the VCs themselves. We describe a systematic extension of the Hoare rules by labels so that the calculus itself can be used to build up explanations of the VCs. The labels are maintained through the different processing steps and rendered as natural language explanations. The explanations can easily be customized and can capture different aspects of the VCs; here, we focus on their structure and purpose. The approach is fully declarative and the generated explanations are based only on an analysis of the labels rather than directly on the logical meaning of the underlying VCs or their proofs. Keywords: program verification, Hoare calculus, traceability.
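
    As a small, generic illustration of what a verification condition is (not an example from the paper): for the Hoare triple {x >= 0} x := x + 1 {x > 0}, the assignment rule reduces the proof obligation to the implication

        % VC produced by the assignment rule: precondition implies the substituted postcondition
        x \ge 0 \;\Rightarrow\; x + 1 > 0

    which a prover must discharge; the labels proposed in the paper would record where each side of such an implication came from, so that the generated explanation can state its structure and purpose in natural language.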

  9. Voice verification upgrade

    NASA Astrophysics Data System (ADS)

    Davis, R. L.; Sinnamon, J. T.; Cox, D. L.

    1982-06-01

    This contract had two major objectives. The first was to build, test, and deliver to the government an entry control system using speaker verification (voice authentication) as the mechanism for verifying the user's claimed identity. This system included a physical mantrap, with an integral weight scale to prevent more than one user from gaining access with one verification (tailgating). The speaker verification part of the entry control system contained all the updates and embellishments to the algorithm that was developed earlier for the BISS (Base and Installation Security System) system under contract with the Electronic Systems Division of the USAF. These updates were tested prior to and during the contract on an operational system used at Texas Instruments in Dallas, Texas, for controlling entry to the Corporate Information Center (CIC).

  10. Requirement Assurance: A Verification Process

    NASA Technical Reports Server (NTRS)

    Alexander, Michael G.

    2011-01-01

    Requirement Assurance is an act of requirement verification which assures the stakeholder or customer that a product requirement has produced its "as realized product" and has been verified with conclusive evidence. Product requirement verification answers the question, "did the product meet the stated specification, performance, or design documentation?". In order to ensure the system was built correctly, the practicing system engineer must verify each product requirement using verification methods of inspection, analysis, demonstration, or test. The products of these methods are the "verification artifacts" or "closure artifacts" which are the objective evidence needed to prove the product requirements meet the verification success criteria. Institutional direction is given to the System Engineer in NPR 7123.1A NASA Systems Engineering Processes and Requirements with regards to the requirement verification process. In response, the verification methodology offered in this report meets both the institutional process and requirement verification best practices.

  11. Voice Verification Upgrade.

    DTIC Science & Technology

    1982-06-01

    The objectives were to develop speaker verification techniques for use over degraded communication channels, specifically telephone lines. A test of BISS-type speaker verification technology was performed on a degraded channel, and compensation techniques were then developed.

  12. General Environmental Verification Specification

    NASA Technical Reports Server (NTRS)

    Milne, J. Scott, Jr.; Kaufman, Daniel S.

    2003-01-01

    The NASA Goddard Space Flight Center's General Environmental Verification Specification (GEVS) for STS and ELV Payloads, Subsystems, and Components is currently being revised based on lessons learned from GSFC engineering and flight assurance. The GEVS has been used by Goddard flight projects for the past 17 years as a baseline from which to tailor their environmental test programs. A summary of the requirements and updates is presented along with the rationale behind the changes. The major test areas covered by the GEVS include mechanical, thermal, and EMC, as well as more general requirements for planning and tracking of the verification programs.

  13. Design verification and validation plan for the cold vacuum drying facility

    SciTech Connect

    NISHIKAWA, L.D.

    1999-06-03

    The Cold Vacuum Drying Facility (CVDF) provides the required process systems, supporting equipment, and facilities needed for drying spent nuclear fuel removed from the K Basins. This document presents both the completed and planned design verification and validation activities.

  14. Conceptual design. Final report: TFE Verification Program

    SciTech Connect

    Not Available

    1994-03-01

    This report documents the TFE Conceptual Design, which provided the design guidance for the TFE Verification Program. The primary goals of this design effort were: (1) establish the conceptual design of an in-core thermionic reactor for a 2 MW(e) space nuclear power system with a 7-year operating lifetime; (2) demonstrate scalability of the above concept over the output power range of 500 kW(e) to 5 MW(e); and (3) define the TFE which is the basis for the 2 MW(e) reactor design. This TFE specification provided the basis for the test program. These primary goals were achieved. The technical approach taken in the conceptual design effort is discussed in Section 2, and the results are discussed in Section 3. The remainder of this introduction draws a perspective on the role that this conceptual design task played in the TFE Verification Program.

  15. Helping individuals to help themselves.

    PubMed

    Costain, Lyndel; Croker, Helen

    2005-02-01

    Obesity is a serious and increasing health issue. Approximately two-thirds of adults in the UK are now overweight or obese. Recent public health reports firmly reinforce the importance of engaging individuals to look after their health, including their weight. They also spell out the need for individuals to be supported more actively, on many levels, to enable this 'engagement'. Meanwhile, national surveys indicate that approximately two-thirds of adults are concerned about weight control, with one-third actively trying to lose weight. This finding is hardly surprising considering current weight statistics, plus the plethora of popular diets on offer. Weight-loss methods include diet clubs, diet books, exercise, meal replacements, advice from healthcare professionals and following a self-styled diet. Obesity is a multi-factorial problem, and losing weight and, in particular, maintaining weight loss is difficult and often elusive. It is argued that the modern obesogenic or 'toxic' environment has essentially taken body-weight control from an instinctive 'survival' process to one that needs sustained cognitive and skill-based control. The evidence suggests that health professionals can help individuals achieve longer-term weight control by supporting them in making sustainable lifestyle changes using a range of behavioural techniques. These techniques include: assessing readiness to change; self-monitoring; realistic goal setting; dietary change; increased physical activity; stimulus control; cognitive restructuring; relapse management; establishing ongoing support. Consistently working in a client-centred way is also being increasingly advocated and incorporated into practice to help motivate and encourage, rather than hinder, the individual's progress.

  16. Proceedings of a conference on nuclear war: The search for solutions

    SciTech Connect

    Perry, T.L.; DeMille, D.

    1985-01-01

    This book presents the proceedings of a conference on the problem of nuclear war. Topics include civil defense, nuclear winter, the psychological consequences of nuclear war, arms control, and verification.

  17. ENVIRONMENTAL TECHNOLOGY VERIFICATION PROGRAM

    EPA Science Inventory

    This presentation will be given at the EPA Science Forum 2005 in Washington, DC. The Environmental Technology Verification Program (ETV) was initiated in 1995 to speed implementation of new and innovative commercial-ready environmental technologies by providing objective, 3rd pa...

  19. Computer Graphics Verification

    NASA Technical Reports Server (NTRS)

    1992-01-01

    Video processing creates technical animation sequences using studio-quality equipment to realistically represent fluid flow over space shuttle surfaces, helicopter rotors, and turbine blades. Computer systems Co-op, Tim Weatherford, performing computer graphics verification. Part of Co-op brochure.

  20. FPGA Verification Accelerator (FVAX)

    NASA Technical Reports Server (NTRS)

    Oh, Jane; Burke, Gary

    2008-01-01

    Is verification acceleration possible? Increasing the visibility of the internal nodes of the FPGA results in much faster debug time, and forcing internal signals directly allows a problem condition to be set up very quickly. Is this all? No; this is part of a comprehensive effort to improve the JPL FPGA design and V&V process.

  1. Exomars Mission Verification Approach

    NASA Astrophysics Data System (ADS)

    Cassi, Carlo; Gilardi, Franco; Bethge, Boris

    According to the long-term cooperation plan established by ESA and NASA in June 2009, the ExoMars project now consists of two missions. The first mission will be launched in 2016 under ESA lead, with the objectives to demonstrate the European capability to safely land a surface package on Mars, to perform Mars atmosphere investigation, and to provide communication capability for present and future ESA/NASA missions. For this mission ESA provides a spacecraft composite, made up of an "Entry Descent & Landing Demonstrator Module (EDM)" and a Mars Orbiter Module (OM); NASA provides the launch vehicle and the scientific instruments located on the Orbiter for Mars atmosphere characterisation. A second mission, with its launch foreseen in 2018, is led by NASA, who provides the spacecraft and launcher, the EDL system, and a rover. ESA contributes the ExoMars Rover Module (RM) to provide surface mobility. It includes a drill system allowing drilling down to 2 meters, collecting samples, and investigating them for signs of past and present life with exobiological experiments, as well as investigating the Mars water/geochemical environment. In this scenario Thales Alenia Space Italia, as ESA prime industrial contractor, is in charge of the design, manufacturing, integration and verification of the ESA ExoMars modules, i.e.: the Spacecraft Composite (OM + EDM) for the 2016 mission, the RM for the 2018 mission, and the Rover Operations Control Centre, which will be located at Altec-Turin (Italy). The verification process of the above products is quite complex and will include some peculiarities with limited or no heritage in Europe. Furthermore, the verification approach has to be optimised to allow full verification despite significant schedule and budget constraints. The paper presents the verification philosophy tailored for the ExoMars mission in line with the above considerations, starting from the model philosophy, showing the verification activities flow and the sharing of tests

  2. 10 CFR 63.47 - Facility information and verification.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 10 Energy 2 2010-01-01 2010-01-01 false Facility information and verification. 63.47 Section 63.47 Energy NUCLEAR REGULATORY COMMISSION (CONTINUED) DISPOSAL OF HIGH-LEVEL RADIOACTIVE WASTES IN A GEOLOGIC REPOSITORY AT YUCCA MOUNTAIN, NEVADA Licenses Us/iaea Safeguards Agreement § 63.47 Facility information and...

  3. 10 CFR 63.47 - Facility information and verification.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... 10 Energy 2 2013-01-01 2013-01-01 false Facility information and verification. 63.47 Section 63.47 Energy NUCLEAR REGULATORY COMMISSION (CONTINUED) DISPOSAL OF HIGH-LEVEL RADIOACTIVE WASTES IN A GEOLOGIC REPOSITORY AT YUCCA MOUNTAIN, NEVADA Licenses Us/iaea Safeguards Agreement § 63.47 Facility information and...

  4. 10 CFR 63.47 - Facility information and verification.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... 10 Energy 2 2012-01-01 2012-01-01 false Facility information and verification. 63.47 Section 63.47 Energy NUCLEAR REGULATORY COMMISSION (CONTINUED) DISPOSAL OF HIGH-LEVEL RADIOACTIVE WASTES IN A GEOLOGIC REPOSITORY AT YUCCA MOUNTAIN, NEVADA Licenses Us/iaea Safeguards Agreement § 63.47 Facility information and...

  5. 10 CFR 63.47 - Facility information and verification.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... 10 Energy 2 2014-01-01 2014-01-01 false Facility information and verification. 63.47 Section 63.47 Energy NUCLEAR REGULATORY COMMISSION (CONTINUED) DISPOSAL OF HIGH-LEVEL RADIOACTIVE WASTES IN A GEOLOGIC REPOSITORY AT YUCCA MOUNTAIN, NEVADA Licenses Us/iaea Safeguards Agreement § 63.47 Facility information and...

  6. 10 CFR 63.47 - Facility information and verification.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... 10 Energy 2 2011-01-01 2011-01-01 false Facility information and verification. 63.47 Section 63.47 Energy NUCLEAR REGULATORY COMMISSION (CONTINUED) DISPOSAL OF HIGH-LEVEL RADIOACTIVE WASTES IN A GEOLOGIC REPOSITORY AT YUCCA MOUNTAIN, NEVADA Licenses Us/iaea Safeguards Agreement § 63.47 Facility information and...

  7. Multibody modeling and verification

    NASA Technical Reports Server (NTRS)

    Wiens, Gloria J.

    1989-01-01

    A summary of a ten week project on flexible multibody modeling, verification and control is presented. Emphasis was on the need for experimental verification. A literature survey was conducted for gathering information on the existence of experimental work related to flexible multibody systems. The first portion of the assigned task encompassed the modeling aspects of flexible multibodies that can undergo large angular displacements. Research in the area of modeling aspects were also surveyed, with special attention given to the component mode approach. Resulting from this is a research plan on various modeling aspects to be investigated over the next year. The relationship between the large angular displacements, boundary conditions, mode selection, and system modes is of particular interest. The other portion of the assigned task was the generation of a test plan for experimental verification of analytical and/or computer analysis techniques used for flexible multibody systems. Based on current and expected frequency ranges of flexible multibody systems to be used in space applications, an initial test article was selected and designed. A preliminary TREETOPS computer analysis was run to ensure frequency content in the low frequency range, 0.1 to 50 Hz. The initial specifications of experimental measurement and instrumentation components were also generated. Resulting from this effort is the initial multi-phase plan for a Ground Test Facility of Flexible Multibody Systems for Modeling Verification and Control. The plan focusses on the Multibody Modeling and Verification (MMV) Laboratory. General requirements of the Unobtrusive Sensor and Effector (USE) and the Robot Enhancement (RE) laboratories were considered during the laboratory development.

  8. Improved Verification for Aerospace Systems

    NASA Technical Reports Server (NTRS)

    Powell, Mark A.

    2008-01-01

    Aerospace systems are subject to many stringent performance requirements to be verified with low risk. This report investigates verification planning using conditional approaches vice the standard classical statistical methods, and usage of historical surrogate data for requirement validation and in verification planning. The example used in this report to illustrate the results of these investigations is a proposed mission assurance requirement with the concomitant maximum acceptable verification risk for the NASA Constellation Program Orion Launch Abort System (LAS). This report demonstrates the following improvements: 1) verification planning using conditional approaches vice classical statistical methods results in plans that are more achievable and feasible; 2) historical surrogate data can be used to bound validation of performance requirements; and, 3) incorporation of historical surrogate data in verification planning using conditional approaches produces even less costly and more reasonable verification plans. The procedures presented in this report may produce similar improvements and cost savings in verification for any stringent performance requirement for an aerospace system.
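
    A minimal sketch of the conditional (Bayesian) flavor of planning discussed here, with a Beta prior standing in for historical surrogate data and a binomial verification campaign; every number below is a hypothetical illustration, not a Constellation or LAS value:

        from scipy.stats import beta

        # Prior from historical surrogate data: roughly 45 successes and 5 failures.
        prior_successes, prior_failures = 45.0, 5.0

        # Planned verification campaign and its (assumed) outcome.
        tests, failures = 20, 0

        post_a = prior_successes + (tests - failures)
        post_b = prior_failures + failures

        requirement = 0.90  # assumed required probability of mission success
        # Posterior probability that the true success probability exceeds the requirement.
        assurance = beta.sf(requirement, post_a, post_b)
        print(f"P(reliability > {requirement}) = {assurance:.3f}")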

  9. Optical secure image verification system based on ghost imaging

    NASA Astrophysics Data System (ADS)

    Wu, Jingjing; Haobogedewude, Buyinggaridi; Liu, Zhengjun; Liu, Shutian

    2017-09-01

    Ghost imaging can perform Fourier-space filtering by tailoring the configuration. We propose a novel optical secure image verification system based on this theory, with the help of phase-matched filtering. In the verification process, the system key and the ID card, which contain the information of the correct image and the information to be verified, are put in the reference and the test paths, respectively. We demonstrate that the ghost imaging configuration can perform an incoherent correlation between the system key and the ID card. A correct verification manifests itself as a correlation peak in the ghost image. The primary image and the image to be verified are encrypted and encoded into pure phase masks beforehand for security. Multi-image secure verification can also be implemented in the proposed system.
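
    A minimal sketch of the verification decision described (a pronounced correlation peak indicates a match), using an ordinary digital cross-correlation as a stand-in for the optical incoherent correlation and hypothetical 1-D signals:

        import numpy as np

        rng = np.random.default_rng(0)
        key = rng.standard_normal(256)                     # stands in for the system key
        id_correct = key + 0.1 * rng.standard_normal(256)  # ID card holding the correct image
        id_wrong = rng.standard_normal(256)                 # ID card holding a wrong image

        def correlation_peak(a, b):
            """Peak of the normalized cross-correlation of two zero-mean signals."""
            a = (a - a.mean()) / a.std()
            b = (b - b.mean()) / b.std()
            return np.correlate(a, b, mode="full").max() / len(a)

        print("correct ID:", round(correlation_peak(key, id_correct), 2))  # close to 1 -> verified
        print("wrong ID:  ", round(correlation_peak(key, id_wrong), 2))    # much lower -> rejected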

  10. National Center for Nuclear Security - NCNS

    SciTech Connect

    2014-11-12

    As the United States embarks on a new era of nuclear arms control, the tools for treaty verification must be accurate and reliable, and must work at stand-off distances. The National Center for Nuclear Security, or NCNS, at the Nevada National Security Site, is poised to become the proving ground for these technologies. The center is a unique test bed for non-proliferation and arms control treaty verification technologies. The NNSS is an ideal location for these kinds of activities because of its multiple environments; its cadre of experienced nuclear personnel, and the artifacts of atmospheric and underground nuclear weapons explosions. The NCNS will provide future treaty negotiators with solid data on verification and inspection regimes and a realistic environment in which future treaty verification specialists can be trained. Work on warhead monitoring at the NCNS will also support future arms reduction treaties.

  11. National Center for Nuclear Security - NCNS

    ScienceCinema

    None

    2016-07-12

    As the United States embarks on a new era of nuclear arms control, the tools for treaty verification must be accurate and reliable, and must work at stand-off distances. The National Center for Nuclear Security, or NCNS, at the Nevada National Security Site, is poised to become the proving ground for these technologies. The center is a unique test bed for non-proliferation and arms control treaty verification technologies. The NNSS is an ideal location for these kinds of activities because of its multiple environments; its cadre of experienced nuclear personnel, and the artifacts of atmospheric and underground nuclear weapons explosions. The NCNS will provide future treaty negotiators with solid data on verification and inspection regimes and a realistic environment in which future treaty verification specialists can be trained. Work on warhead monitoring at the NCNS will also support future arms reduction treaties.

  12. Teaching "The Nuclear Predicament."

    ERIC Educational Resources Information Center

    Carman, Philip; Kneeshaw, Stephen

    1987-01-01

    Contends that courses on nuclear war must help students examine the political, social, religious, philosophical, economic, and moral assumptions which characterized the dilemma of nuclear armament/disarmament. Describes the upper level undergraduate course taught by the authors. (JDH)

  13. Teaching "The Nuclear Predicament."

    ERIC Educational Resources Information Center

    Carman, Philip; Kneeshaw, Stephen

    1987-01-01

    Contends that courses on nuclear war must help students examine the political, social, religious, philosophical, economic, and moral assumptions which characterized the dilemma of nuclear armament/disarmament. Describes the upper level undergraduate course taught by the authors. (JDH)

  14. Plan for a laser weapon verification research program

    SciTech Connect

    Karr, T.J.

    1990-03-01

    Currently there is great interest in the question of how, or even whether, a treaty limiting the development and deployment of laser weapons could be verified. The concept of cooperative laser weapon verification is that each party would place monitoring stations near the other party's declared or suspect laser weapon facilities. The monitoring stations would measure the "primary laser observables" such as power or energy, either directly or by collecting laser radiation scattered from the air or the target, and would verify that the laser is operated within treaty limits. This concept is modeled along the lines of the seismic network recently activated in the USSR as a joint project of the United States Geological Survey and the Soviet Academy of Sciences. The seismic data, gathered cooperatively, can be used by each party as it wishes, including to support verification of future nuclear test ban treaties. For laser weapon verification the monitoring stations are envisioned as ground-based, and would verify treaty limitations on ground-based laser anti-satellite (ASAT) weapons and on the ground-based development of other laser weapons. They would also contribute to verification of limitations on air-, sea- and space-based laser weapons, and the technology developed for cooperative verification could also be used in national technical means of verification. 2 figs., 4 tabs.

  15. RESRAD-BUILD verification.

    SciTech Connect

    Kamboj, S.; Yu, C.; Biwer, B. M.; Klett, T.

    2002-01-31

    The results generated by the RESRAD-BUILD code (version 3.0) were verified with hand or spreadsheet calculations using equations given in the RESRAD-BUILD manual for different pathways. For verification purposes, different radionuclides--H-3, C-14, Na-22, Al-26, Cl-36, Mn-54, Co-60, Au-195, Ra-226, Ra-228, Th-228, and U-238--were chosen to test all pathways and models. Tritium, Ra-226, and Th-228 were chosen because of the special tritium and radon models in the RESRAD-BUILD code. Other radionuclides were selected to represent a spectrum of radiation types and energies. Verification of the RESRAD-BUILD code was conducted with an initial check of all the input parameters for correctness against their original source documents. Verification of the calculations was performed external to the RESRAD-BUILD code with Microsoft Excel to verify all the major portions of the code. In some cases, RESRAD-BUILD results were compared with those of external codes, such as MCNP (Monte Carlo N-particle) and RESRAD. The verification was conducted on a step-by-step basis and used different test cases as templates. The following types of calculations were investigated: (1) source injection rate, (2) air concentration in the room, (3) air particulate deposition, (4) radon pathway model, (5) tritium model for volume source, (6) external exposure model, (7) different pathway doses, and (8) time dependence of dose. Some minor errors were identified in version 3.0; these errors have been corrected in later versions of the code. Some possible improvements in the code were also identified.
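
    The external, spreadsheet-style checking described above can be illustrated generically; the single-compartment ventilation balance below is a textbook example chosen for brevity, not the actual RESRAD-BUILD air-quality model, and all parameter values are assumed.

      # Verify a computed result against an independent calculation of the same quantity:
      # indoor air concentration from a constant source with ventilation,
      # dC/dt = S/V - lam_v*C, evaluated analytically and by numerical integration.
      import numpy as np
      from scipy.integrate import solve_ivp

      S     = 1.0e3    # assumed source injection rate (Bq/h)
      lam_v = 2.0      # assumed air exchange rate (1/h)
      V     = 50.0     # assumed room volume (m^3)
      t_end = 5.0      # hours

      def c_analytic(t):
          # independent "hand" calculation: closed-form solution
          return S / (lam_v * V) * (1.0 - np.exp(-lam_v * t))

      sol = solve_ivp(lambda t, c: S / V - lam_v * c, (0.0, t_end), [0.0],
                      t_eval=np.linspace(0.0, t_end, 11), rtol=1e-8, atol=1e-10)

      rel_err = np.max(np.abs(sol.y[0][1:] - c_analytic(sol.t[1:])) / c_analytic(sol.t[1:]))
      print(f"maximum relative discrepancy: {rel_err:.2e}")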

  16. Robust verification analysis

    SciTech Connect

    Rider, William; Witkowski, Walt; Kamm, James R.; Wildey, Tim

    2016-02-15

    We introduce a new methodology for inferring the accuracy of computational simulations through the practice of solution verification. We demonstrate this methodology on examples from computational heat transfer, fluid dynamics and radiation transport. Our methodology is suited to both well- and ill-behaved sequences of simulations. Our approach to the analysis of these sequences of simulations incorporates expert judgment into the process directly via a flexible optimization framework, and the application of robust statistics. The expert judgment is systematically applied as constraints to the analysis, and together with the robust statistics guards against over-emphasis on anomalous analysis results. We have named our methodology Robust Verification. Our methodology is based on utilizing multiple constrained optimization problems to solve the verification model in a manner that varies the analysis' underlying assumptions. Constraints applied in the analysis can include expert judgment regarding convergence rates (bounds and expectations) as well as bounding values for physical quantities (e.g., positivity of energy or density). This approach then produces a number of error models, which are then analyzed through robust statistical techniques (median instead of mean statistics). This provides self-contained, data and expert informed error estimation including uncertainties for both the solution itself and order of convergence. Our method produces high quality results for the well-behaved cases relatively consistent with existing practice. The methodology can also produce reliable results for ill-behaved circumstances predicated on appropriate expert judgment. We demonstrate the method and compare the results with standard approaches used for both code and solution verification on well-behaved and ill-behaved simulations.
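
    A minimal sketch of this idea, assuming a simple error ansatz E(h) = C*h^p (the data values, rate bounds, and fitting details are illustrative, not the authors' implementation): the ansatz is fitted to several subsets of a mesh-refinement sequence with the convergence rate constrained to an expert-judged range, and median statistics summarize the resulting family of error models.

      # Robust-verification flavor: constrained fits over subsets, summarized by medians.
      import itertools
      import numpy as np
      from scipy.optimize import least_squares

      h = np.array([0.08, 0.04, 0.02, 0.01])            # mesh sizes
      f = np.array([1.0450, 1.0115, 1.0028, 1.0007])    # computed results (assumed)

      P_BOUNDS = (0.5, 3.0)     # expert-judged bounds on the convergence rate p

      def fit(idx):
          # fit (f_exact, C, p) to the subset idx, with p constrained to P_BOUNDS
          hi, fi = h[list(idx)], f[list(idx)]
          res = least_squares(lambda x: x[0] + x[1] * hi**x[2] - fi,
                              x0=[fi[-1], 1.0, 2.0],
                              bounds=([-np.inf, -np.inf, P_BOUNDS[0]],
                                      [np.inf, np.inf, P_BOUNDS[1]]))
          return res.x

      fits = np.array([fit(idx) for idx in itertools.combinations(range(len(h)), 3)])
      print("median extrapolated value  :", np.median(fits[:, 0]))
      print("median convergence order   :", np.median(fits[:, 2]))
      print("median error on finest grid:", np.median(np.abs(fits[:, 0] - f[-1])))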

  17. Robust verification analysis

    NASA Astrophysics Data System (ADS)

    Rider, William; Witkowski, Walt; Kamm, James R.; Wildey, Tim

    2016-02-01

    We introduce a new methodology for inferring the accuracy of computational simulations through the practice of solution verification. We demonstrate this methodology on examples from computational heat transfer, fluid dynamics and radiation transport. Our methodology is suited to both well- and ill-behaved sequences of simulations. Our approach to the analysis of these sequences of simulations incorporates expert judgment into the process directly via a flexible optimization framework, and the application of robust statistics. The expert judgment is systematically applied as constraints to the analysis, and together with the robust statistics guards against over-emphasis on anomalous analysis results. We have named our methodology Robust Verification. Our methodology is based on utilizing multiple constrained optimization problems to solve the verification model in a manner that varies the analysis' underlying assumptions. Constraints applied in the analysis can include expert judgment regarding convergence rates (bounds and expectations) as well as bounding values for physical quantities (e.g., positivity of energy or density). This approach then produces a number of error models, which are then analyzed through robust statistical techniques (median instead of mean statistics). This provides self-contained, data and expert informed error estimation including uncertainties for both the solution itself and order of convergence. Our method produces high quality results for the well-behaved cases relatively consistent with existing practice. The methodology can also produce reliable results for ill-behaved circumstances predicated on appropriate expert judgment. We demonstrate the method and compare the results with standard approaches used for both code and solution verification on well-behaved and ill-behaved simulations.

  18. Microcode Verification Project.

    DTIC Science & Technology

    1980-05-01

    Microcode Verification Project. University of Southern California, Stephen D. ... in the production, testing, and maintenance of Air Force software. This effort was undertaken in response to that goal. The objective of the effort was ... rather than hard wiring, is a recent development in computer technology. Hardware diagnostics do not fulfill testing requirements for these computers.

  19. TFE Verification Program

    SciTech Connect

    Not Available

    1990-03-01

    The objective of the semiannual progress report is to summarize the technical results obtained during the latest reporting period. The information presented herein will include evaluated test data, design evaluations, the results of analyses and the significance of results. The program objective is to demonstrate the technology readiness of a TFE suitable for use as the basic element in a thermionic reactor with electric power output in the 0.5 to 5.0 MW(e) range, and a full-power life of 7 years. The TFE Verification Program builds directly on the technology and data base developed in the 1960s and 1970s in an AEC/NASA program, and in the SP-100 program conducted in 1983, 1984 and 1985. In the SP-100 program, the attractive features of thermionic power conversion technology were recognized, but concern was expressed over the lack of fast reactor irradiation data. The TFE Verification Program addresses this concern. The general logic and strategy of the program to achieve its objectives is shown on Fig. 1-1. Five prior programs form the basis for the TFE Verification Program: (1) the AEC/NASA program of the 1960s and early 1970s; (2) the SP-100 concept development program; (3) the SP-100 thermionic technology program; (4) the thermionic irradiations program in TRIGA in FY-86; and (5) the Thermionic Technology Program in 1986 and 1987. 18 refs., 64 figs., 43 tabs.

  20. National and International Security Applications of Cryogenic Detectors—Mostly Nuclear Safeguards

    NASA Astrophysics Data System (ADS)

    Rabin, Michael W.

    2009-12-01

    As with science, so with security—in both arenas, the extraordinary sensitivity of cryogenic sensors enables high-confidence detection and high-precision measurement even of the faintest signals. Science applications are more mature, but several national and international security applications have been identified where cryogenic detectors have high potential payoff. International safeguards and nuclear forensics are areas needing new technology and methods to boost speed, sensitivity, precision and accuracy. Successfully applied, improved nuclear materials analysis will help constrain nuclear materials diversion pathways and contribute to treaty verification. Cryogenic microcalorimeter detectors for X-ray, gamma-ray, neutron, and alpha-particle spectrometry are under development with these aims in mind. In each case the unsurpassed energy resolution of microcalorimeters reveals previously invisible spectral features of nuclear materials. Preliminary results of quantitative analysis indicate substantial improvements are still possible, but significant work will be required to fully understand the ultimate performance limits.

  1. International and national security applications of cryogenic detectors - mostly nuclear safeguards

    SciTech Connect

    Rabin, Michael W

    2009-01-01

    As with science, so with security - in both arenas, the extraordinary sensitivity of cryogenic sensors enables high-confidence detection and high-precision measurement even of the faintest signals. Science applications are more mature, but several national and international security applications have been identified where cryogenic detectors have high potential payoff. International safeguards and nuclear forensics are areas needing new technology and methods to boost speed, sensitivity, precision and accuracy. Successfully applied, improved nuclear materials analysis will help constrain nuclear materials diversion pathways and contribute to treaty verification. Cryogenic microcalorimeter detectors for X-ray, gamma ray, neutron, and alpha particle spectrometry are under development with these aims in mind. In each case the unsurpassed energy resolution of microcalorimeters reveals previously invisible spectral features of nuclear materials. Preliminary results of quantitative analysis indicate substantial improvements are still possible, but significant work will be required to fully understand the ultimate performance limits.

  2. A Cherenkov viewing device for used-fuel verification

    NASA Astrophysics Data System (ADS)

    Attas, E. M.; Chen, J. D.; Young, G. J.

    1990-12-01

    A Cherenkov viewing device (CVD) has been developed to help verify declared inventories of used nuclear fuel stored in water bays. The device detects and amplifies the faint ultraviolet Cherenkov glow from the water surrounding the fuel, producing a real-time visible image on a phosphor screen. Quartz optics, a UV-pass filter and a microchannel-plate image-intensifier tube serve to form the image, which can be photographed or viewed directly through an eyepiece. Normal fuel bay lighting does not interfere with the Cherenkov light image. The CVD has been successfully used to detect anomalous PWR, BWR and CANDU (CANada Deuterium Uranium: registered trademark) fuel assemblies in the presence of normal-burnup assemblies stored in used-fuel bays. The latest version of the CVD, known as Mark IV, is being used by inspectors from the International Atomic Energy Agency for verification of light-water power-reactor fuel. Its design and operation are described, together with plans for further enhancements of the instrumentation.

  3. Verification of excess defense material

    SciTech Connect

    Fearey, B.L.; Pilat, J.F.; Eccleston, G.W.; Nicholas, N.J.; Tape, J.W.

    1997-12-01

    The international community in the post-Cold War period has expressed an interest in the International Atomic Energy Agency (IAEA) using its expertise in support of the arms control and disarmament process in unprecedented ways. The pledges of the US and Russian presidents to place excess defense materials under some type of international inspections raises the prospect of using IAEA safeguards approaches for monitoring excess materials, which include both classified and unclassified materials. Although the IAEA has suggested the need to address inspections of both types of materials, the most troublesome and potentially difficult problems involve approaches to the inspection of classified materials. The key issue for placing classified nuclear components and materials under IAEA safeguards is the conflict between these traditional IAEA materials accounting procedures and the US classification laws and nonproliferation policy designed to prevent the disclosure of critical weapon-design information. Possible verification approaches to classified excess defense materials could be based on item accountancy, attributes measurements, and containment and surveillance. Such approaches are not wholly new; in fact, they are quite well established for certain unclassified materials. Such concepts may be applicable to classified items, but the precise approaches have yet to be identified, fully tested, or evaluated for technical and political feasibility, or for their possible acceptability in an international inspection regime. Substantial work remains in these areas. This paper examines many of the challenges presented by international inspections of classified materials.

  4. International safeguards: Accounting for nuclear materials

    SciTech Connect

    Fishbone, L.G.

    1988-09-28

    Nuclear safeguards applied by the International Atomic Energy Agency (IAEA) are one element of the "non-proliferation regime", the collection of measures whose aim is to forestall the spread of nuclear weapons to countries that do not already possess them. Safeguards verifications provide evidence that nuclear materials in peaceful use for nuclear-power production are properly accounted for. Though carried out in cooperation with nuclear facility operators, the verifications can provide assurance because they are designed with the capability to detect diversion, should it occur. Traditional safeguards verification measures conducted by inspectors of the IAEA include book auditing; counting and identifying containers of nuclear material; measuring nuclear material; photographic and video surveillance; and sealing. Novel approaches to achieve greater efficiency and effectiveness in safeguards verifications are under investigation as the number and complexity of nuclear facilities grow. These include the zone approach, which entails carrying out verifications for groups of facilities collectively, and the randomization approach, which entails carrying out entire inspection visits some fraction of the time on a random basis. Both approaches show promise in particular situations, but, like traditional measures, must be tested to ensure their practical utility. These approaches are covered in this report. 15 refs., 16 figs., 3 tabs.

  5. Quantum money with classical verification

    NASA Astrophysics Data System (ADS)

    Gavinsky, Dmitry

    2014-12-01

    We propose and construct a quantum money scheme that allows verification through classical communication with a bank. This is the first demonstration that a secure quantum money scheme exists that does not require quantum communication for coin verification. Our scheme is secure against adaptive adversaries - this property is not directly related to the possibility of classical verification, nevertheless none of the earlier quantum money constructions is known to possess it.

  6. Quantum money with classical verification

    SciTech Connect

    Gavinsky, Dmitry

    2014-12-04

    We propose and construct a quantum money scheme that allows verification through classical communication with a bank. This is the first demonstration that a secure quantum money scheme exists that does not require quantum communication for coin verification. Our scheme is secure against adaptive adversaries - this property is not directly related to the possibility of classical verification, nevertheless none of the earlier quantum money constructions is known to possess it.

  7. Freeze verification: time for a fresh approach

    SciTech Connect

    Paine, C.

    1983-01-01

    The administration's claim that some elements of a comprehensive nuclear freeze are unverifiable does not specify the nature of those elements and whether they represent a real threat to national security if we trusted the USSR to comply. The author contends that clandestine development of new weapons will have little strategic effect since both sides already have total destructive power. The risks of noncompliance are largely political and less than the risks of continued arms buildup. Since the USSR would also want the US to be bound by freeze terms, deterrence would come from mutual benefit. Hardliners argue that cheating is easier in a closed society; that our democracy would tend to relax and the USSR would move ahead with its plans for world domination. The author argues that, over time, a freeze would diminish Soviet confidence in its nuclear war fighting capabilities and that adequate verification is possible with monitoring and warning arrangements. (DCK)

  8. Safeguards for spent fuels: Verification problems

    SciTech Connect

    Pillay, K.K.S.; Picard, R.R.

    1991-01-01

    The accumulation of large quantities of spent nuclear fuels world-wide is a serious problem for international safeguards. A number of International Atomic Energy Agency (IAEA) member states, including the US, consider spent fuel to be a material form for which safeguards cannot be terminated, even after permanent disposal in a geologic repository. Because safeguards requirements for spent fuels are different from those of conventional bulk-handling and item-accounting facilities, there is room for innovation to design a unique safeguards regime for spent fuels that satisfies the goals of the nuclear nonproliferation treaty at a reasonable cost to both the facility and the IAEA. Various strategies being pursued for long-term management of spent fuels are examined with a realistic example to illustrate the problems of verifying safeguards under the present regime. Verification of a safeguards regime for spent fuels requires a mix of standard safeguards approaches, such as quantitative verification and use of seals, with other measures that are unique to spent fuels. 17 refs.

  9. Component testing for dynamic model verification

    NASA Technical Reports Server (NTRS)

    Hasselman, T. K.; Chrostowski, J. D.

    1984-01-01

    Dynamic model verification is the process whereby an analytical model of a dynamic system is compared with experimental data, adjusted if necessary to bring it into agreement with the data, and then qualified for future use in predicting system response in a different dynamic environment. There are various ways to conduct model verification. The approach taken here employs Bayesian statistical parameter estimation. Unlike curve fitting, whose objective is to minimize the difference between some analytical function and a given quantity of test data (or curve), Bayesian estimation attempts also to minimize the difference between the parameter values of that function (the model) and their initial estimates, in a least squares sense. The objectives of dynamic model verification, therefore, are to produce a model which: (1) is in agreement with test data; (2) will assist in the interpretation of test data; (3) can be used to help verify a design; (4) will reliably predict performance; and (5) in the case of space structures, will facilitate dynamic control.
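
    A hedged sketch of the Bayesian estimation step described above follows; the two-degree-of-freedom spring-mass model and all numbers are invented for illustration and are not from the report. Model parameters are adjusted to reproduce test frequencies while a second penalty keeps them close to their prior estimates, which is the weighted least-squares form of the estimation described in the abstract.

      # Bayesian (regularized least-squares) model updating on a toy 2-DOF system.
      import numpy as np
      from scipy.optimize import minimize

      f_test = np.array([3.9, 13.6])         # "measured" natural frequencies (Hz), assumed
      theta0 = np.array([1.0e4, 2.5e4])      # prior stiffness estimates (N/m), assumed
      m      = np.array([10.0, 5.0])         # masses (kg), assumed known

      W_data  = np.diag([1.0 / 0.05**2] * 2)     # inverse variance of the test data
      W_prior = np.diag([1.0 / 2.0e3**2] * 2)    # inverse variance of the prior estimates

      def frequencies(theta):
          k1, k2 = theta
          K = np.array([[k1 + k2, -k2], [-k2, k2]])
          w2 = np.sort(np.linalg.eigvals(np.linalg.solve(np.diag(m), K)).real)
          return np.sqrt(w2) / (2.0 * np.pi)

      def objective(theta):
          r_data  = frequencies(theta) - f_test      # disagreement with test data
          r_prior = theta - theta0                   # deviation from prior estimates
          return r_data @ W_data @ r_data + r_prior @ W_prior @ r_prior

      theta_hat = minimize(objective, theta0, method="Nelder-Mead").x
      print("updated stiffnesses:", theta_hat)
      print("model frequencies  :", frequencies(theta_hat), "vs test", f_test)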

  10. Production readiness verification testing

    NASA Technical Reports Server (NTRS)

    James, A. M.; Bohon, H. L.

    1980-01-01

    A Production Readiness Verification Testing (PRVT) program has been established to determine if structures fabricated from advanced composites can be committed on a production basis to commercial airline service. The program utilizes subcomponents which reflect the variabilities in structure that can realistically be expected from current production and quality control technology to estimate the production qualities, variation in static strength, and durability of advanced composite structures. The results of the static tests and a durability assessment after one year of continuous load/environment testing of twenty two duplicates of each of two structural components (a segment of the front spar and cover of a vertical stabilizer box structure) are discussed.

  11. Verification of LHS distributions.

    SciTech Connect

    Swiler, Laura Painton

    2006-04-01

    This document provides verification test results for normal, lognormal, and uniform distributions that are used in Sandia's Latin Hypercube Sampling (LHS) software. The purpose of this testing is to verify that the sample values being generated in LHS are distributed according to the desired distribution types. The testing of distribution correctness is done by examining summary statistics, graphical comparisons using quantile-quantile plots, and formal statistical tests such as the chi-square test, the Kolmogorov-Smirnov test, and the Anderson-Darling test. The overall results from the testing indicate that the generation of normal, lognormal, and uniform distributions in LHS is acceptable.
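
    The checks described above can be reproduced in outline with standard tools; the sample below is generated with SciPy's generic Latin hypercube sampler rather than Sandia's LHS code, and the distribution parameters are arbitrary.

      # Distribution verification for a Latin hypercube sample of a normal variable:
      # summary statistics plus Kolmogorov-Smirnov and Anderson-Darling tests.
      import numpy as np
      from scipy.stats import qmc, norm, kstest, anderson

      mu, sigma, n = 10.0, 2.0, 1000

      u = qmc.LatinHypercube(d=1, seed=42).random(n).ravel()   # stratified uniform(0,1) sample
      x = norm.ppf(u, loc=mu, scale=sigma)                     # transform to the target normal

      print("sample mean / std:", x.mean(), x.std(ddof=1))
      print("KS test          :", kstest(x, "norm", args=(mu, sigma)))
      print("Anderson-Darling :", anderson((x - mu) / sigma, dist="norm"))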

  12. Verification and validation of RADMODL Version 1.0

    SciTech Connect

    Kimball, K.D.

    1993-03-01

    RADMODL is a system of linked computer codes designed to calculate the radiation environment following an accident in which nuclear materials are released. The RADMODL code and the corresponding Verification and Validation (V&V) calculations (Appendix A) were developed for Westinghouse Savannah River Company (WSRC) by EGS Corporation (EGS). Each module of RADMODL is an independent code and was verified separately. The full system was validated by comparing the output of the various modules with the corresponding output of a previously verified version of the modules. The results of the verification and validation tests show that RADMODL correctly calculates the transport of radionuclides and radiation doses. As a result of this verification and validation effort, RADMODL Version 1.0 is certified for use in calculating the radiation environment following an accident.

  13. Bibliography for Verification and Validation in Computational Simulations

    SciTech Connect

    Oberkampf, W.L.

    1998-10-01

    A bibliography has been compiled dealing with the verification and validation of computational simulations. The references listed in this bibliography are concentrated in the field of computational fluid dynamics (CFD). However, references from the following fields are also included: operations research, heat transfer, solid dynamics, software quality assurance, software accreditation, military systems, and nuclear reactor safety. This bibliography, containing 221 references, is not meant to be comprehensive. It was compiled during the last ten years in response to the author's interest and research in the methodology for verification and validation. The emphasis in the bibliography is in the following areas: philosophy of science underpinnings, development of terminology and methodology, high accuracy solutions for CFD verification, experimental datasets for CFD validation, and the statistical quantification of model validation. This bibliography should provide a starting point for individual researchers in many fields of computational simulation in science and engineering.

  14. Constitutional and legal implications of arms control verification technologies

    SciTech Connect

    Tanzman, E.A.; Haffenden, R.

    1992-09-01

    United States law can both help and hinder the use of instrumentation as a component of arms control verification in this country. It can foster the general use of sophisticated verification technologies, where such devices are consistent with the value attached to privacy by the Fourth Amendment to the United States Constitution. On the other hand, law can hinder reliance on devices that cross this constitutional line, or where such technology itself threatens health, safety, or environment as such threats are defined in federal statutes. The purpose of this conference paper is to explain some of the lessons that have been learned about the relationship between law and verification technologies in the hope that law can help more than hinder. This paper has three parts. In order to start with a common understanding, Part 1 will briefly describe the hierarchy of treaties, the Constitution, federal statutes, and state and local laws. Part 2 will discuss how the specific constitutional requirement that the government respect the right of privacy in all of its endeavors may affect the use of verification technologies. Part 3 will explain the environmental law constraints on verification technology as exemplified by the system of on-site sampling embodied in the current Rolling Text of the Draft Chemical Weapons Convention.

  15. Exploring the Possible Use of Information Barriers for future Biological Weapons Verification Regimes

    SciTech Connect

    Luke, S J

    2011-12-20

    This report describes a path forward for implementing information barriers in a future generic biological arms-control verification regime. Information barriers have become a staple of discussion in the area of arms control verification approaches for nuclear weapons and components. Information barriers, when used with a measurement system, allow for the determination that an item has sensitive characteristics without releasing any of the sensitive information. Over the last 15 years the United States (with the Russian Federation) has led on the development of information barriers in the area of the verification of nuclear weapons and nuclear components. The work of the US and the Russian Federation has prompted other states (e.g., UK and Norway) to consider the merits of information barriers for possible verification regimes. In the context of a biological weapons control verification regime, the dual-use nature of the biotechnology will require protection of sensitive information while allowing for the verification of treaty commitments. A major question that has arisen is whether - in a biological weapons verification regime - the presence or absence of a weapon pathogen can be determined without revealing any information about possible sensitive or proprietary information contained in the genetic materials being declared under a verification regime. This study indicates that a verification regime could be constructed using a small number of pathogens that spans the range of known biological weapons agents. Since the number of possible pathogens is small it is possible and prudent to treat these pathogens as analogies to attributes in a nuclear verification regime. This study has determined that there may be some information that needs to be protected in a biological weapons control verification regime. To protect this information, the study concludes that the Lawrence Livermore Microbial Detection Array may be a suitable technology for the detection of these pathogens.

  16. Deductive Verification of Cryptographic Software

    NASA Technical Reports Server (NTRS)

    Almeida, Jose Barcelar; Barbosa, Manuel; Pinto, Jorge Sousa; Vieira, Barbara

    2009-01-01

    We report on the application of an off-the-shelf verification platform to the RC4 stream cipher cryptographic software implementation (as available in the openSSL library), and introduce a deductive verification technique based on self-composition for proving the absence of error propagation.
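
    The absence-of-error-propagation property has a natural relational reading via self-composition: two runs of the decryption on ciphertexts that differ in one byte must yield plaintexts that differ only in that byte. The paper discharges such obligations deductively; the sketch below (my own, using a toy RC4 in Python rather than the verified C implementation) merely illustrates the property as a runtime check, not a proof.

      # Self-composition as a runtime check on a toy RC4 stream cipher.
      import os

      def rc4_keystream(key: bytes, n: int) -> bytes:
          S = list(range(256))
          j = 0
          for i in range(256):                              # key-scheduling algorithm
              j = (j + S[i] + key[i % len(key)]) % 256
              S[i], S[j] = S[j], S[i]
          out, i, j = [], 0, 0
          for _ in range(n):                                # pseudo-random generation
              i = (i + 1) % 256
              j = (j + S[i]) % 256
              S[i], S[j] = S[j], S[i]
              out.append(S[(S[i] + S[j]) % 256])
          return bytes(out)

      def decrypt(key: bytes, ct: bytes) -> bytes:
          ks = rc4_keystream(key, len(ct))
          return bytes(c ^ k for c, k in zip(ct, ks))

      key = os.urandom(16)
      ct1 = os.urandom(64)
      ct2 = bytearray(ct1)
      ct2[10] ^= 0xFF                                       # corrupt exactly one ciphertext byte

      pt1, pt2 = decrypt(key, ct1), decrypt(key, bytes(ct2))
      diff = [i for i in range(len(ct1)) if pt1[i] != pt2[i]]
      assert diff == [10], "error propagated beyond the corrupted byte"
      print("plaintexts differ only at byte:", diff)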

  17. Software Verification and Validation Procedure

    SciTech Connect

    Olund, Thomas S.

    2008-09-15

    This Software Verification and Validation procedure provides the action steps for the Tank Waste Information Network System (TWINS) testing process. The primary objective of the testing process is to provide assurance that the software functions as intended, and meets the requirements specified by the client. Verification and validation establish the primary basis for TWINS software product acceptance.

  18. HDL to verification logic translator

    NASA Astrophysics Data System (ADS)

    Gambles, J. W.; Windley, P. J.

    The increasingly higher number of transistors possible in VLSI circuits compounds the difficulty in ensuring correct designs. As the number of possible test cases required to exhaustively simulate a circuit design explodes, a better method is required to confirm the absence of design faults. Formal verification methods provide a way to prove, using logic, that a circuit structure correctly implements its specification. Before verification is accepted by VLSI design engineers, the stand-alone verification tools that are in use in the research community must be integrated with the CAD tools used by the designers. One problem facing the acceptance of formal verification into circuit design methodology is that the structural circuit descriptions used by the designers are not appropriate for verification work and those required for verification lack some of the features needed for design. We offer a solution to this dilemma: an automatic translation from the designers' HDL models into definitions for the higher-order logic (HOL) verification system. The translated definitions become the low level basis of circuit verification which in turn increases the designer's confidence in the correctness of higher level behavioral models.

  19. HDL to verification logic translator

    NASA Technical Reports Server (NTRS)

    Gambles, J. W.; Windley, P. J.

    1992-01-01

    The increasingly higher number of transistors possible in VLSI circuits compounds the difficulty in ensuring correct designs. As the number of possible test cases required to exhaustively simulate a circuit design explodes, a better method is required to confirm the absence of design faults. Formal verification methods provide a way to prove, using logic, that a circuit structure correctly implements its specification. Before verification is accepted by VLSI design engineers, the stand-alone verification tools that are in use in the research community must be integrated with the CAD tools used by the designers. One problem facing the acceptance of formal verification into circuit design methodology is that the structural circuit descriptions used by the designers are not appropriate for verification work and those required for verification lack some of the features needed for design. We offer a solution to this dilemma: an automatic translation from the designers' HDL models into definitions for the higher-order logic (HOL) verification system. The translated definitions become the low level basis of circuit verification which in turn increases the designer's confidence in the correctness of higher level behavioral models.

  20. TFE Verification Program

    SciTech Connect

    Not Available

    1993-05-01

    The objective of the semiannual progress report is to summarize the technical results obtained during the latest reporting period. The information presented herein will include evaluated test data, design evaluations, the results of analyses and the significance of results. The program objective is to demonstrate the technology readiness of a TFE (thermionic fuel element) suitable for use as the basic element in a thermionic reactor with electric power output in the 0.5 to 5.0 MW(e) range, and a full-power life of 7 years. The TFE Verification Program builds directly on the technology and data base developed in the 1960s and early 1970s in an AEC/NASA program, and in the SP-100 program conducted in 1983, 1984 and 1985. In the SP-100 program, the attractive features of thermionic power conversion technology were recognized but concern was expressed over the lack of fast reactor irradiation data. The TFE Verification Program addresses this concern.

  1. Search for sanity: The politics of nuclear weapons and disarmament

    SciTech Connect

    Joseph, P.; Rosenblum, S.

    1984-01-01

    This book examines the political aspects of nuclear weapons and arms control. Topics considered include nuclear deterrence, military strategy, the military-industrial complex, the nuclear balance, first strike, nuclear errors and accidents, treaty verification, survival, the economic impact of military spending, Western European peace movements, peace movements in Eastern Europe, the cold war, nuclear diplomacy, moral aspects, the defense budget, national security, foreign policy, proliferation, and nuclear disarmament.

  2. Neptunium flow-sheet verification at reprocessing plants

    SciTech Connect

    Rance, P.; Chesnay, B.; Killeen, T.; Murray, M.; Nikkinen, M.; Petoe, A.; Plumb, J.; Saukkonen, H.

    2007-07-01

    Due to their fissile nature, neptunium and americium have at least a theoretical potential application as nuclear explosives and their proliferation potential was considered by the IAEA in studies in the late 1990's. This work was motivated by an increased awareness of the proliferation potential of americium and neptunium and a number of emerging projects in peaceful nuclear programmes which could result in an increase in the available quantities of these minor actinides. The studies culminated in proposals for various voluntary measures including the reporting of international transfers of separated americium and neptunium, declarations concerning the amount of separated neptunium and americium held by states and the application of flow-sheet verification to ensure that facilities capable of separating americium or neptunium are operated in a manner consistent with that declared. This paper discusses the issue of neptunium flowsheet verification in reprocessing plants. The proliferation potential of neptunium is first briefly discussed and then the chemistry of neptunium relevant to reprocessing plants described with a view to indicating a number of issues relevant to the verification of neptunium flow-sheets. Finally, the scope of verification activities is discussed including analysis of process and engineering design information, plant monitoring and sampling and the potential application of containment and surveillance measures. (authors)

  3. Extremely accurate sequential verification of RELAP5-3D

    SciTech Connect

    Mesina, George L.; Aumiller, David L.; Buschman, Francis X.

    2015-11-19

    Large computer programs like RELAP5-3D solve complex systems of governing, closure and special process equations to model the underlying physics of nuclear power plants. Further, these programs incorporate many other features for physics, input, output, data management, user-interaction, and post-processing. For software quality assurance, the code must be verified and validated before being released to users. For RELAP5-3D, verification and validation are restricted to nuclear power plant applications. Verification means ensuring that the program is built right by checking that it meets its design specifications, comparing coding to algorithms and equations and comparing calculations against analytical solutions and the method of manufactured solutions. Sequential verification performs these comparisons initially, but thereafter only compares code calculations between consecutive code versions to demonstrate that no unintended changes have been introduced. Recently, an automated, highly accurate sequential verification method has been developed for RELAP5-3D. The method also provides tests to confirm that no unintended consequences result from code development in the following code capabilities: repeating a timestep advancement, continuing a run from a restart file, multiple cases in a single code execution, and modes of coupled/uncoupled operation. In conclusion, mathematical analyses of the adequacy of the checks used in the comparisons are provided.
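
    A schematic of the comparison step in sequential verification is given below; this is not the RELAP5-3D tooling, and the variable names, values, and tolerances are assumptions. Outputs from consecutive code versions are compared quantity by quantity against a tight tolerance so that any unintended change is flagged even when both versions look physically plausible.

      # Compare named output quantities from two code versions within a tight tolerance.
      import numpy as np

      def compare_versions(baseline, candidate, rtol=1e-12, atol=1e-14):
          # return the quantities whose maximum relative difference exceeds tolerance
          failures = []
          for name, ref in baseline.items():
              new = candidate[name]
              if not np.allclose(new, ref, rtol=rtol, atol=atol):
                  denom = np.maximum(np.abs(ref), atol)
                  failures.append((name, float(np.max(np.abs(new - ref) / denom))))
          return failures

      # assumed example data: a few output quantities from consecutive versions
      v_old = {"pressure": np.array([7.1e6, 7.0e6, 6.9e6]), "void": np.array([0.00, 0.05, 0.12])}
      v_new = {"pressure": np.array([7.1e6, 7.0e6, 6.9e6]), "void": np.array([0.00, 0.05, 0.12000001])}

      print(compare_versions(v_old, v_new))     # flags the unintended drift in "void"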

  4. Extremely accurate sequential verification of RELAP5-3D

    DOE PAGES

    Mesina, George L.; Aumiller, David L.; Buschman, Francis X.

    2015-11-19

    Large computer programs like RELAP5-3D solve complex systems of governing, closure and special process equations to model the underlying physics of nuclear power plants. Further, these programs incorporate many other features for physics, input, output, data management, user-interaction, and post-processing. For software quality assurance, the code must be verified and validated before being released to users. For RELAP5-3D, verification and validation are restricted to nuclear power plant applications. Verification means ensuring that the program is built right by checking that it meets its design specifications, comparing coding to algorithms and equations and comparing calculations against analytical solutions and the method of manufactured solutions. Sequential verification performs these comparisons initially, but thereafter only compares code calculations between consecutive code versions to demonstrate that no unintended changes have been introduced. Recently, an automated, highly accurate sequential verification method has been developed for RELAP5-3D. The method also provides tests to confirm that no unintended consequences result from code development in the following code capabilities: repeating a timestep advancement, continuing a run from a restart file, multiple cases in a single code execution, and modes of coupled/uncoupled operation. In conclusion, mathematical analyses of the adequacy of the checks used in the comparisons are provided.

  5. JPRS Report, Nuclear Developments

    DTIC Science & Technology

    2016-03-24

    [Table-of-contents fragments] TURKEY: Joint Argentine Firm To Sell Nuclear Reactors [ANATOLIA]; Civil Defense Against Iraqi ...; the Reactor and Nuclear Materials Control Law, the Radiation Hazard Prevention Law, and the Law on the Promotion of Development and Peaceful Applica...; THAILAND ...

  6. VEG-01: Veggie Hardware Verification Testing

    NASA Technical Reports Server (NTRS)

    Massa, Gioia; Newsham, Gary; Hummerick, Mary; Morrow, Robert; Wheeler, Raymond

    2013-01-01

    The Veggie plant/vegetable production system is scheduled to fly on ISS at the end of 2013. Since much of the technology associated with Veggie has not been previously tested in microgravity, a hardware validation flight was initiated. This test will allow data to be collected about Veggie hardware functionality on ISS, allow crew interactions to be vetted for future improvements, validate the ability of the hardware to grow and sustain plants, and collect data that will be helpful to future Veggie investigators as they develop their payloads. Additionally, food safety data on the lettuce plants grown will be collected to help support the development of a pathway for the crew to safely consume produce grown on orbit. Significant background research has been performed on the Veggie plant growth system, with early tests focusing on the development of the rooting pillow concept, and the selection of fertilizer, rooting medium and plant species. More recent testing has been conducted to integrate the pillow concept into the Veggie hardware and to ensure that adequate water is provided throughout the growth cycle. Seed sanitation protocols have been established for flight, and hardware sanitation between experiments has been studied. Methods for shipping and storage of rooting pillows and the development of crew procedures and crew training videos for plant activities on-orbit have been established. Science verification testing was conducted and lettuce plants were successfully grown in prototype Veggie hardware; microbial samples were taken; and plants were harvested, frozen, stored, and later analyzed for microbial growth, nutrients, and ATP levels. An additional verification test, prior to the final payload verification testing, is desired to demonstrate similar growth in the flight hardware and also to test a second set of pillows containing zinnia seeds. Issues with root mat water supply are being resolved, with final testing and flight scheduled for later in 2013.

  7. Shift Verification and Validation

    SciTech Connect

    Pandya, Tara M.; Evans, Thomas M.; Davidson, Gregory G; Johnson, Seth R.; Godfrey, Andrew T.

    2016-09-07

    This documentation outlines the verification and validation of Shift for the Consortium for Advanced Simulation of Light Water Reactors (CASL). Five main types of problems were used for validation: small criticality benchmark problems; full-core reactor benchmarks for light water reactors; fixed-source coupled neutron-photon dosimetry benchmarks; depletion/burnup benchmarks; and full-core reactor performance benchmarks. We compared Shift results to measured data and other simulated Monte Carlo radiation transport code results, and found very good agreement in a variety of comparison measures. These include prediction of critical eigenvalue, radial and axial pin power distributions, rod worth, leakage spectra, and nuclide inventories over a burn cycle. Based on this validation of Shift, we are confident in Shift to provide reference results for CASL benchmarking.

  8. Online fingerprint verification.

    PubMed

    Upendra, K; Singh, S; Kumar, V; Verma, H K

    2007-01-01

    As organizations search for more secure authentication methods for user access, e-commerce, and other security applications, biometrics is gaining increasing attention. With an increasing emphasis on the emerging automatic personal identification applications, fingerprint-based identification is becoming more popular. The most widely used fingerprint representation is the minutiae-based representation. The main drawback with this representation is that it does not utilize a significant component of the rich discriminatory information available in the fingerprints. Local ridge structures cannot be completely characterized by minutiae. Also, it is difficult to quickly match two fingerprint images containing different numbers of unregistered minutiae points. In this study, a filter-bank-based representation, which eliminates these weaknesses, is implemented and the overall performance of the developed system is tested. The results have shown that this system can be used effectively for secure online verification applications.
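
    A rough sketch of a filter-bank representation of this kind follows; the filter parameters, the synthetic ridge images, and the decision threshold are illustrative and are not those of the system tested in the study. The image is filtered with Gabor filters at several orientations, the mean filter energies form a fixed-length feature vector, and verification reduces to a distance test between feature vectors.

      # Gabor filter-bank features and a simple distance-based verification decision.
      import numpy as np
      from scipy.signal import fftconvolve

      def gabor_kernel(freq, theta, sigma=4.0, size=21):
          half = size // 2
          y, x = np.mgrid[-half:half + 1, -half:half + 1]
          xr = x * np.cos(theta) + y * np.sin(theta)
          return np.exp(-(x**2 + y**2) / (2 * sigma**2)) * np.cos(2 * np.pi * freq * xr)

      def features(img, freq=0.1, n_orient=8):
          # mean filter energy per orientation: a crude fixed-length feature vector
          return np.array([np.abs(fftconvolve(img, gabor_kernel(freq, np.pi * k / n_orient),
                                              mode="same")).mean()
                           for k in range(n_orient)])

      def verify(enrolled, probe, threshold=0.5):
          fa, fb = features(enrolled), features(probe)
          score = np.linalg.norm(fa - fb) / np.linalg.norm(fa)
          return score < threshold, round(float(score), 3)

      def ridge_image(theta, freq=0.1, size=128):
          # synthetic stand-in for a fingerprint: parallel ridges at orientation theta
          y, x = np.mgrid[0:size, 0:size]
          return np.cos(2 * np.pi * freq * (x * np.cos(theta) + y * np.sin(theta)))

      rng = np.random.default_rng(1)
      enrolled    = ridge_image(np.deg2rad(30))
      probe_same  = enrolled + 0.05 * rng.standard_normal(enrolled.shape)
      probe_other = ridge_image(np.deg2rad(120))            # different ridge flow

      print("genuine attempt :", verify(enrolled, probe_same))    # small distance, accept
      print("impostor attempt:", verify(enrolled, probe_other))   # large distance, reject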

  9. Verification of VENTSAR

    SciTech Connect

    Simpkins, A.A.

    1995-01-01

    The VENTSAR code is an upgraded and improved version of the VENTX code, which estimates concentrations on or near a building from a release at a nearby location. The code calculates the concentrations either for a given meteorological exceedance probability or for a given stability and wind speed combination. A single building can be modeled which lies in the path of the plume, or a penthouse can be added to the top of the building. Plume rise may also be considered. Release types can be either chemical or radioactive. Downwind concentrations are determined at user-specified incremental distances. This verification report was prepared to demonstrate that VENTSAR is properly executing all algorithms and transferring data. Hand calculations were also performed to ensure proper application of methodologies.

  10. Experimental verification of Santilli`s clean, subnuclear, hadronic energy

    SciTech Connect

    Tsagas, N.F.; Mystakidis, A.; Bakos, G.

    1996-02-01

    The structure of the nucleus and its constituents still presents a challenge to both theoretical and experimental physicists. This paper deals mainly with an experimental attempt at the verification of the new theory for neutron structure and its stimulated decay recently proposed by R.M. Santilli, which would imply a new, clean, subnuclear energy. The experiment is carried out by the Laboratory of Nuclear Technology at the University of Thrace, Xanthi, Greece.

  11. 40 CFR 1065.390 - PM balance verifications and weighing process verification.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... 40 Protection of Environment 34 2012-07-01 2012-07-01 false PM balance verifications and weighing... § 1065.390 PM balance verifications and weighing process verification. (a) Scope and frequency. This section describes three verifications. (1) Independent verification of PM balance performance within 370...

  12. 40 CFR 1065.390 - PM balance verifications and weighing process verification.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... 40 Protection of Environment 33 2011-07-01 2011-07-01 false PM balance verifications and weighing... § 1065.390 PM balance verifications and weighing process verification. (a) Scope and frequency. This section describes three verifications. (1) Independent verification of PM balance performance within 370...

  13. 40 CFR 1065.390 - PM balance verifications and weighing process verification.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... 40 Protection of Environment 33 2014-07-01 2014-07-01 false PM balance verifications and weighing... § 1065.390 PM balance verifications and weighing process verification. (a) Scope and frequency. This section describes three verifications. (1) Independent verification of PM balance performance within 370...

  14. 40 CFR 1065.390 - PM balance verifications and weighing process verification.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 40 Protection of Environment 32 2010-07-01 2010-07-01 false PM balance verifications and weighing... § 1065.390 PM balance verifications and weighing process verification. (a) Scope and frequency. This section describes three verifications. (1) Independent verification of PM balance performance within 370...

  15. 40 CFR 1065.390 - PM balance verifications and weighing process verification.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... 40 Protection of Environment 34 2013-07-01 2013-07-01 false PM balance verifications and weighing... § 1065.390 PM balance verifications and weighing process verification. (a) Scope and frequency. This section describes three verifications. (1) Independent verification of PM balance performance within 370...

  16. Acoustic techniques in nuclear safeguards

    SciTech Connect

    Olinger, C.T.; Sinha, D.N.

    1995-07-01

    Acoustic techniques can be employed to address many questions relevant to current nuclear technology needs. These include establishing and monitoring intrinsic tags and seals, locating holdup in areas where conventional radiation-based measurements have limited capability, process monitoring, monitoring containers for corrosion or changes in pressure, and facility design verification. These acoustics applications are in their infancy with respect to safeguards and nuclear material management, but proof-of-principle has been demonstrated in many of the areas listed.

  17. Parent Toolkit: Homework Help. Helpful Tips.

    ERIC Educational Resources Information Center

    All Kinds of Minds, 2006

    2006-01-01

    This check list contains tips for parents to help students reinforce and build upon what children learn at school: (1) Set a consistent time each day for doing homework; (2) Encourage children to make a homework checklist; (3) Provide assistance to help get started on a task; (4) Help children make a list of all needed materials before starting…

  18. EURATOM safeguards efforts in the development of spent fuel verification methods by non-destructive assay

    SciTech Connect

    Matloch, L.; Vaccaro, S.; Couland, M.; De Baere, P.; Schwalbach, P.

    2015-07-01

    The back end of the nuclear fuel cycle continues to develop. The European Commission, particularly the Nuclear Safeguards Directorate of the Directorate General for Energy, implements Euratom safeguards and needs to adapt to this situation. The verification methods for spent nuclear fuel, which EURATOM inspectors can use, require continuous improvement. Whereas the Euratom on-site laboratories provide accurate verification results for fuel undergoing reprocessing, the situation is different for spent fuel which is destined for final storage. In particular, new needs arise from the increasing number of cask loadings for interim dry storage and the advanced plans for the construction of encapsulation plants and geological repositories. Various scenarios present verification challenges. In this context, EURATOM Safeguards, often in cooperation with other stakeholders, is committed to further improvement of NDA methods for spent fuel verification. In this effort EURATOM plays various roles, ranging from definition of inspection needs to direct participation in development of measurement systems, including support of research in the framework of international agreements and via the EC Support Program to the IAEA. This paper presents recent progress in selected NDA methods. These methods have been conceived to satisfy different spent fuel verification needs, ranging from attribute testing to pin-level partial defect verification. (authors)

  19. Verification of Space Weather Forecasts using Terrestrial Weather Approaches

    NASA Astrophysics Data System (ADS)

    Henley, E.; Murray, S.; Pope, E.; Stephenson, D.; Sharpe, M.; Bingham, S.; Jackson, D.

    2015-12-01

    The Met Office Space Weather Operations Centre (MOSWOC) provides a range of 24/7 operational space weather forecasts, alerts, and warnings, which provide valuable information on space weather that can degrade electricity grids, radio communications, and satellite electronics. Forecasts issued include arrival times of coronal mass ejections (CMEs), and probabilistic forecasts for flares, geomagnetic storm indices, and energetic particle fluxes and fluences. These forecasts are produced twice daily using a combination of output from models such as Enlil, near-real-time observations, and forecaster experience. Verification of forecasts is crucial for users, researchers, and forecasters to understand the strengths and limitations of forecasters, and to assess forecaster added value. To this end, the Met Office (in collaboration with Exeter University) has been adapting verification techniques from terrestrial weather, and has been working closely with the International Space Environment Service (ISES) to standardise verification procedures. We will present the results of part of this work, analysing forecast and observed CME arrival times, assessing skill using 2x2 contingency tables. These MOSWOC forecasts can be objectively compared to those produced by the NASA Community Coordinated Modelling Center - a useful benchmark. This approach cannot be taken for the other forecasts, as they are probabilistic and categorical (e.g., geomagnetic storm forecasts give probabilities of exceeding levels from minor to extreme). We will present appropriate verification techniques being developed to address these forecasts, such as rank probability skill score, and comparing forecasts against climatology and persistence benchmarks. As part of this, we will outline the use of discrete time Markov chains to assess and improve the performance of our geomagnetic storm forecasts. We will also discuss work to adapt a terrestrial verification visualisation system to space weather, to help
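
    For the categorical forecasts mentioned above, the 2x2 contingency-table measures carry over directly from terrestrial weather. The short example below uses invented counts to compute the probability of detection, false alarm ratio, and Heidke skill score for yes/no CME-arrival forecasts.

      # 2x2 contingency-table verification with made-up counts.
      hits, misses, false_alarms, correct_negatives = 18, 7, 9, 41
      n = hits + misses + false_alarms + correct_negatives

      pod = hits / (hits + misses)                  # probability of detection
      far = false_alarms / (hits + false_alarms)    # false alarm ratio

      # Heidke skill score: fraction correct relative to the chance expectation
      expected_correct = ((hits + misses) * (hits + false_alarms)
                          + (misses + correct_negatives) * (false_alarms + correct_negatives)) / n
      hss = (hits + correct_negatives - expected_correct) / (n - expected_correct)

      print(f"POD = {pod:.2f}, FAR = {far:.2f}, HSS = {hss:.2f}")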

  20. Comments for A Conference on Verification in the 21st Century

    SciTech Connect

    Doyle, James E.

    2012-06-12

    The author offers 5 points for the discussion of Verification and Technology: (1) Experience with the implementation of arms limitation and arms reduction agreements confirms that technology alone has never been relied upon to provide effective verification. (2) The historical practice of verification of arms control treaties between Cold War rivals may constrain the cooperative and innovative use of technology for transparency, verification and confidence building in the future. (3) An area that has been identified by many, including the US State Department and NNSA, as being rich for exploration for potential uses of technology for transparency and verification is information and communications technology (ICT). This includes social media, crowd-sourcing, the internet of things, and the concept of societal verification, but there are issues. (4) On the issue of the extent to which verification technologies are keeping pace with the demands of future protocols and agreements, I think the more direct question is "are they effective in supporting the objectives of the treaty or agreement?" In this regard it is important to acknowledge that there is a verification grand challenge at our doorstep. That is "how does one verify limitations on nuclear warheads in national stockpiles?" (5) Finally, while recognizing the daunting political and security challenges of such an approach, multilateral engagement and cooperation at the conceptual and technical levels provides benefits for addressing future verification challenges.

  1. Pairwise Identity Verification via Linear Concentrative Metric Learning.

    PubMed

    Zheng, Lilei; Duffner, Stefan; Idrissi, Khalid; Garcia, Christophe; Baskurt, Atilla

    2016-12-16

    This paper presents a study of metric learning systems on pairwise identity verification, including pairwise face verification and pairwise speaker verification, respectively. These problems are challenging because the individuals in training and testing are mutually exclusive, and also due to the probable setting of limited training data. For such pairwise verification problems, we present a general framework of metric learning systems and employ the stochastic gradient descent algorithm as the optimization solution. We have studied both similarity metric learning and distance metric learning systems, of either a linear or shallow nonlinear model under both restricted and unrestricted training settings. Extensive experiments demonstrate that with limited training pairs, learning a linear system on similar pairs only is preferable due to its simplicity and superiority, i.e., it generally achieves competitive performance on both the labeled faces in the wild face dataset and the NIST speaker dataset. It is also found that a pretrained deep nonlinear model helps to improve the face verification results significantly.
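
    The following is a minimal sketch of learning a linear similarity metric on labelled pairs with stochastic gradient descent, in the spirit of the systems studied above; the bilinear form, hinge margin, learning rate, and toy data are illustrative assumptions and not the authors' exact formulation.

        import numpy as np

        def train_linear_similarity(pairs, labels, dim, lr=0.01, margin=1.0, epochs=20, seed=0):
            """SGD training of a bilinear similarity s(x, y) = x^T M y on labelled pairs.
            labels: +1 for same identity, -1 for different identity."""
            rng = np.random.default_rng(seed)
            M = np.eye(dim)                       # start from the plain inner product
            for _ in range(epochs):
                for i in rng.permutation(len(pairs)):
                    x, y = pairs[i]
                    s = x @ M @ y
                    if labels[i] * s < margin:    # hinge loss is active: update M
                        M += lr * labels[i] * np.outer(x, y)
            return M

        # Hypothetical toy data: two "identities" in a 5-dimensional feature space
        rng = np.random.default_rng(1)
        a, b = rng.normal(size=5), rng.normal(size=5)
        pairs = [(a + 0.1 * rng.normal(size=5), a + 0.1 * rng.normal(size=5)),   # same identity
                 (a + 0.1 * rng.normal(size=5), b + 0.1 * rng.normal(size=5))]   # different identities
        labels = [+1, -1]
        M = train_linear_similarity(pairs, labels, dim=5)
        print([float(x @ M @ y) for x, y in pairs])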

  2. Verification and validation of control system software

    SciTech Connect

    Munro, J.K. Jr.; Kisner, R.A.; Bhadtt, S.C.

    1991-01-01

    The following guidelines are proposed for verification and validation (V&V) of nuclear power plant control system software: (a) use risk management to decide what and how much V&V is needed; (b) classify each software application using a scheme that reflects what type and how much V&V is needed; (c) maintain a set of reference documents with current information about each application; (d) use Program Inspection as the initial basic verification method; and (e) establish a deficiencies log for each software application. The following additional practices are strongly recommended: (a) use a computer-based configuration management system to track all aspects of development and maintenance; (b) establish reference baselines of the software, associated reference documents, and development tools at regular intervals during development; (c) use object-oriented design and programming to promote greater software reliability and reuse; (d) provide a copy of the software development environment as part of the package of deliverables; and (e) initiate an effort to use formal methods for preparation of Technical Specifications. The paper provides background information and reasons for the guidelines and recommendations. 3 figs., 3 tabs.

  3. Generic interpreters and microprocessor verification

    NASA Technical Reports Server (NTRS)

    Windley, Phillip J.

    1990-01-01

    The following topics are covered in viewgraph form: (1) generic interpreters; (2) Viper microprocessors; (3) microprocessor verification; (4) determining correctness; (5) hierarchical decomposition; (6) interpreter theory; (7) AVM-1; (8) phase-level specification; and future work.

  4. RELAP-7 Software Verification and Validation Plan

    SciTech Connect

    Smith, Curtis L.; Choi, Yong-Joon; Zou, Ling

    2014-09-25

    This INL plan comprehensively describes the software for RELAP-7 and documents the software, interface, and software design requirements for the application. The plan also describes the testing-based software verification and validation (SV&V) process—a set of specially designed software models used to test RELAP-7. The RELAP-7 (Reactor Excursion and Leak Analysis Program) code is a nuclear reactor system safety analysis code being developed at Idaho National Laboratory (INL). The code is based on the INL’s modern scientific software development framework – MOOSE (Multi-Physics Object-Oriented Simulation Environment). The overall design goal of RELAP-7 is to take advantage of the previous thirty years of advancements in computer architecture, software design, numerical integration methods, and physical models. The end result will be a reactor systems analysis capability that retains and improves upon RELAP5’s capability and extends the analysis capability for all reactor system simulation scenarios.

  5. U.S. verification method disputed

    NASA Astrophysics Data System (ADS)

    Maggs, William Ward

    Milo Nordyke, senior scientist at Lawrence Livermore National Laboratory in Livermore, Calif., testified October 6 at a Senate Foreign Affairs Committee hearing on Soviet test ban noncompliance and the recently concluded Joint Verification Experiment. He said that the government's method for on-site test monitoring is intrusive, expensive, and could limit some U.S. weapon design programs. In addition, Gregory Van der Vink of the congressional Office of Technology Assessment presented new evidence that White House charges that the Soviet Union has not complied with the current 150 kiloton test limit are probably without basis. Also testifying were Paul Robinson, U.S. negotiator for the Nuclear Testing Talks; Peter Sharfman, program manager for International Security and Commerce at OTA; and physicist David Hafemeister of California Polytechnic State University, San Luis Obispo.

  6. The CSMS (Configurable Seismic Monitoring System) Poorboy deployment: Seismic recording in Pinedale, Wyoming, of the Bullion NTS (Nevada Test Site) nuclear test under the verification provisions of the new TTBT protocol

    SciTech Connect

    Harben, P.E.; Rock, D.W.; Carlson, R.C.

    1990-07-10

    The Configurable Seismic Monitoring System (CSMS), developed at the Lawrence Livermore National Laboratory (LLNL) was deployed in a 13-m deep vault on the AFTAC facility at Pinedale, Wyoming to record the Bullion nuclear test. The purpose of the exercise was to meet all provisions of the new TTBT protocol on in-country seismic recording at a Designated Seismic Station (DSS). The CSMS successfully recorded the Bullion event consistent with and meeting all requirements in the new treaty protocol. In addition, desirable seismic system features not specified in the treaty protocol were determined; treaty protocol ambiguities were identified, and useful background noise recordings at the Pinedale site were obtained. 10 figs.

  7. Woodward Effect Experimental Verifications

    NASA Astrophysics Data System (ADS)

    March, Paul

    2004-02-01

    The work of J. F. Woodward (1990; 1996a; 1996b; 1998; 2002a; 2002b; 2004) on the existence of ``mass fluctuations'' and their use in exotic propulsion schemes was examined for possible application in improving space flight propulsion and power generation. Woodward examined Einstein's General Relativity Theory (GRT) and assumed that if the strong Machian interpretation of GRT as well as gravitational/inertia-like Wheeler-Feynman radiation reaction forces hold, then when an elementary particle is accelerated through a potential gradient, its rest mass should fluctuate around its mean value during its acceleration. Woodward also used GRT to clarify the precise experimental conditions necessary for observing and exploiting these mass fluctuations or ``Woodward effect'' (W-E). Later, in collaboration with his ex-graduate student T. Mahood, they also pushed the experimental verification boundaries of these proposals. If these purported mass fluctuations occur as Woodward claims, and his assumption that gravity and inertia are both byproducts of the same GRT based phenomenon per Mach's Principle is correct, then many innovative applications such as propellantless propulsion and gravitational exotic matter generators may be feasible. This paper examines the reality of mass fluctuations and the feasibility of using the W-E to design propellantless propulsion devices in the near to mid-term future. The latest experimental results, utilizing MHD-like force rectification systems, will also be presented.

  8. Reliable Entanglement Verification

    NASA Astrophysics Data System (ADS)

    Arrazola, Juan; Gittsovich, Oleg; Donohue, John; Lavoie, Jonathan; Resch, Kevin; Lütkenhaus, Norbert

    2013-05-01

    Entanglement plays a central role in quantum protocols. It is therefore important to be able to verify the presence of entanglement in physical systems from experimental data. In the evaluation of these data, the proper treatment of statistical effects requires special attention, as one can never claim to have verified the presence of entanglement with certainty. Recently, increased attention has been paid to the development of proper frameworks to pose and to answer these types of questions. In this work, we apply recent results by Christandl and Renner on reliable quantum state tomography to construct a reliable entanglement verification procedure based on the concept of confidence regions. The statements made do not require the specification of a prior distribution nor the assumption of an independent and identically distributed (i.i.d.) source of states. Moreover, we develop efficient numerical tools that are necessary to employ this approach in practice, rendering the procedure ready to be employed in current experiments. We demonstrate this fact by analyzing the data of an experiment where photonic entangled two-photon states were generated and whose entanglement is verified with the use of an accessible nonlinear witness.

  9. Help Seeking and Receiving.

    ERIC Educational Resources Information Center

    Nadler, Arie

    Although social psychology has always had an interest in helping behavior, only recently has the full complexity of helping relations begun to be researched. Help seeking and receiving in the educational setting raise many issues regarding the use and effectiveness of the help itself. Central to all helping relations is the seeking/receiving…

  10. Studies in Seismic Verification

    DTIC Science & Technology

    1992-05-01

    features in the Earth. G(ω) includes source region effects such as free surface reflections, geometrical spreading which may be frequency dependent...pressure function at the elastic radius. They used a pressure function based on free-field observations of several underground nuclear explosions...show an increase in 10 and 30 Hz spectral amplitude by a factor of about 5 above the free surface effect. Therefore we expect the Anza spectral

  11. Helping Parents Help Their Children Toward Literacy.

    ERIC Educational Resources Information Center

    Nichols, G. Jeane

    A practicum was designed to help parents of kindergartners in a low income area help their children develop literacy. The primary goal was to secure the active involvement of parents in their children's learning experiences. Other goals included improving kindergarten teachers' communication skills and expanding their strategies for reaching out…

  12. Modality Switching in a Property Verification Task: An ERP Study of What Happens When Candles Flicker after High Heels Click.

    PubMed

    Collins, Jennifer; Pecher, Diane; Zeelenberg, René; Coulson, Seana

    2011-01-01

    The perceptual modalities associated with property words, such as flicker or click, have previously been demonstrated to affect subsequent property verification judgments (Pecher et al., 2003). Known as the conceptual modality switch effect, this finding supports the claim that brain systems for perception and action help subserve the representation of concepts. The present study addressed the cognitive and neural substrate of this effect by recording event-related potentials (ERPs) as participants performed a property verification task with visual or auditory properties in key trials. We found that for visual property verifications, modality switching was associated with an increased amplitude N400. For auditory verifications, switching led to a larger late positive complex. Observed ERP effects of modality switching suggest property words access perceptual brain systems. Moreover, the timing and pattern of the effects suggest perceptual systems impact the decision-making stage in the verification of auditory properties, and the semantic stage in the verification of visual properties.

  13. Modality Switching in a Property Verification Task: An ERP Study of What Happens When Candles Flicker after High Heels Click

    PubMed Central

    Collins, Jennifer; Pecher, Diane; Zeelenberg, René; Coulson, Seana

    2011-01-01

    The perceptual modalities associated with property words, such as flicker or click, have previously been demonstrated to affect subsequent property verification judgments (Pecher et al., 2003). Known as the conceptual modality switch effect, this finding supports the claim that brain systems for perception and action help subserve the representation of concepts. The present study addressed the cognitive and neural substrate of this effect by recording event-related potentials (ERPs) as participants performed a property verification task with visual or auditory properties in key trials. We found that for visual property verifications, modality switching was associated with an increased amplitude N400. For auditory verifications, switching led to a larger late positive complex. Observed ERP effects of modality switching suggest property words access perceptual brain systems. Moreover, the timing and pattern of the effects suggest perceptual systems impact the decision-making stage in the verification of auditory properties, and the semantic stage in the verification of visual properties. PMID:21713128

  14. SPR Hydrostatic Column Model Verification and Validation.

    SciTech Connect

    Bettin, Giorgia; Lord, David; Rudeen, David Keith

    2015-10-01

    A Hydrostatic Column Model (HCM) was developed to help differentiate between normal "tight" well behavior and small-leak behavior under nitrogen for testing the pressure integrity of crude oil storage wells at the U.S. Strategic Petroleum Reserve. This effort was motivated by steady, yet distinct, pressure behavior of a series of Big Hill caverns that have been placed under nitrogen for extended periods of time. This report describes the HCM model, its functional requirements, the model structure and the verification and validation process. Different modes of operation are also described, which illustrate how the software can be used to model extended nitrogen monitoring and Mechanical Integrity Tests by predicting wellhead pressures along with nitrogen interface movements. Model verification has shown that the program runs correctly and is implemented as intended. The cavern BH101 long term nitrogen test was used to validate the model, which showed very good agreement with measured data. This supports the claim that the model is, in fact, capturing the relevant physical phenomena and can be used to make accurate predictions of both wellhead pressure and interface movements.
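
    As a rough illustration of the hydrostatic balance underlying such a column model, the sketch below estimates a wellhead pressure as the cavern pressure minus the hydrostatic head of the fluid layers above it; the densities, layer thicknesses, and cavern pressure are hypothetical and this is not the SPR HCM code itself.

        # Sketch of a static wellhead-pressure estimate for a layered well column:
        # cavern pressure minus the hydrostatic head of the fluid layers above it.
        G = 9.81  # m/s^2

        def wellhead_pressure(cavern_pressure_pa, layers):
            """layers: list of (density kg/m^3, vertical thickness m), top to bottom."""
            head = sum(rho * G * h for rho, h in layers)
            return cavern_pressure_pa - head

        # Hypothetical column: nitrogen gas cap, crude oil, brine (values illustrative only)
        layers = [(180.0, 300.0),    # pressurized nitrogen
                  (850.0, 400.0),    # crude oil
                  (1200.0, 100.0)]   # brine
        print(wellhead_pressure(12.0e6, layers) / 1e6, "MPa at the wellhead")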

  15. Verifying a nuclear weapon`s response to radiation environments

    SciTech Connect

    Dean, F.F.; Barrett, W.H.

    1998-05-01

    The process described in the paper is being applied as part of the design verification of a replacement component designed for a nuclear weapon currently in the active stockpile. This process is an adaptation of the process successfully used in nuclear weapon development programs. The verification process concentrates on evaluating system response to radiation environments, verifying system performance during and after exposure to radiation environments, and assessing system survivability.

  16. Space station data management system - A common GSE test interface for systems testing and verification

    NASA Technical Reports Server (NTRS)

    Martinez, Pedro A.; Dunn, Kevin W.

    1987-01-01

    This paper examines the fundamental problems and goals associated with test, verification, and flight-certification of man-rated distributed data systems. First, a summary of the characteristics of modern computer systems that affect the testing process is provided. Then, verification requirements are expressed in terms of an overall test philosophy for distributed computer systems. This test philosophy stems from previous experience that was gained with centralized systems (Apollo and the Space Shuttle), and deals directly with the new problems that verification of distributed systems may present. Finally, a description of potential hardware and software tools to help solve these problems is provided.

  17. Hard and Soft Safety Verifications

    NASA Technical Reports Server (NTRS)

    Wetherholt, Jon; Anderson, Brenda

    2012-01-01

    The purpose of this paper is to examine the differences between and the effects of hard and soft safety verifications. Initially, the terminology should be defined and clarified. A hard safety verification is a datum which demonstrates how a safety control is enacted. An example of this is relief valve testing. A soft safety verification is something which is usually described as nice to have, but it is not necessary to prove safe operation. An example of a soft verification is the loss of the Solid Rocket Booster (SRB) casings from Shuttle flight, STS-4. When the main parachutes failed, the casings impacted the water and sank. In the nose cap of the SRBs, video cameras recorded the release of the parachutes to determine safe operation and to provide information for potential anomaly resolution. Generally, examination of the casings and nozzles contributed to understanding of the newly developed boosters and their operation. Safety verification of SRB operation was demonstrated by examination for erosion or wear of the casings and nozzle. Loss of the SRBs and associated data did not delay the launch of the next Shuttle flight.

  18. Isocenter verification for linac-based stereotactic radiation therapy: review of principles and techniques.

    PubMed

    Rowshanfarzad, Pejman; Sabet, Mahsheed; O'Connor, Daryl J; Greer, Peter B

    2011-11-15

    There have been several manual, semi-automatic and fully-automatic methods proposed for verification of the position of mechanical isocenter as part of comprehensive quality assurance programs required for linear accelerator-based stereotactic radiosurgery/radiotherapy (SRS/SRT) treatments. In this paper, a systematic review has been carried out to discuss the present methods for isocenter verification and compare their characteristics, to help physicists in making a decision on selection of their quality assurance routine.

  19. How Formal Dynamic Verification Tools Facilitate Novel Concurrency Visualizations

    NASA Astrophysics Data System (ADS)

    Aananthakrishnan, Sriram; Delisi, Michael; Vakkalanka, Sarvani; Vo, Anh; Gopalakrishnan, Ganesh; Kirby, Robert M.; Thakur, Rajeev

    With the exploding scale of concurrency, presenting valuable pieces of information collected by formal verification tools intuitively and graphically can greatly enhance concurrent system debugging. Traditional MPI program debuggers present trace views of MPI program executions. Such views are redundant, often containing equivalent traces that permute independent MPI calls. In our ISP formal dynamic verifier for MPI programs, we present a collection of alternate views made possible by the use of formal dynamic verification. Some of ISP’s views help pinpoint errors, some facilitate discerning errors by eliminating redundancy, while others help understand the program better by displaying concurrent event orderings that must be respected by all MPI implementations, in the form of completes-before graphs. In this paper, we describe ISP’s graphical user interface (GUI) capabilities in all these areas, which are currently supported by a portable Java based GUI, a Microsoft Visual Studio GUI, and an Eclipse based GUI whose development is in progress.

  20. Safeguards Guidance Document for Designers of Commercial Nuclear Facilities: International Nuclear Safeguards Requirements and Practices For Uranium Enrichment Plants

    SciTech Connect

    Robert Bean; Casey Durst

    2009-10-01

    This report is the second in a series of guidelines on international safeguards requirements and practices, prepared expressly for the designers of nuclear facilities. The first document in this series is the description of generic international nuclear safeguards requirements pertaining to all types of facilities. These requirements should be understood and considered at the earliest stages of facility design as part of a new process called “Safeguards-by-Design.” This will help eliminate the costly retrofit of facilities that has occurred in the past to accommodate nuclear safeguards verification activities. The following summarizes the requirements for international nuclear safeguards implementation at enrichment plants, prepared under the Safeguards by Design project, and funded by the U.S. Department of Energy (DOE) National Nuclear Security Administration (NNSA), Office of NA-243. The purpose of this is to provide designers of nuclear facilities around the world with a simplified set of design requirements and the most common practices for meeting them. The foundation for these requirements is the international safeguards agreement between the country and the International Atomic Energy Agency (IAEA), pursuant to the Treaty on the Non-proliferation of Nuclear Weapons (NPT). Relevant safeguards requirements are also cited from the Safeguards Criteria for inspecting enrichment plants, found in the IAEA Safeguards Manual, Part SMC-8. IAEA definitions and terms are based on the IAEA Safeguards Glossary, published in 2002. The most current specification for safeguards measurement accuracy is found in the IAEA document STR-327, “International Target Values 2000 for Measurement Uncertainties in Safeguarding Nuclear Materials,” published in 2001. For this guide to be easier for the designer to use, the requirements have been restated in plainer language per expert interpretation using the source documents noted. The safeguards agreement is fundamentally a

  1. Cold Fusion Verification.

    DTIC Science & Technology

    1991-03-01

    published work, talking with others in the field, and attending conferences, that CNF probably is chimera and will go the way of N-rays and polywater. To date, no one, including Pons and Fleischmann, has been able to construct a so-called CNF electrochemical cell that...Cold Nuclear Fusion (CNF), as originally reported in 1989. The conclusion is that CNF probably is chimera and will go the way of N-rays and polywater

  2. How Nasa's Independent Verification and Validation (IVandV) Program Builds Reliability into a Space Mission Software System (SMSS)

    NASA Technical Reports Server (NTRS)

    Fisher, Marcus S.; Northey, Jeffrey; Stanton, William

    2014-01-01

    The purpose of this presentation is to outline how the NASA Independent Verification and Validation (IV&V) Program helps to build reliability into the Space Mission Software Systems (SMSSs) that its customers develop.

  3. Monte Carlo verification of IMRT treatment plans on grid.

    PubMed

    Gómez, Andrés; Fernández Sánchez, Carlos; Mouriño Gallego, José Carlos; López Cacheiro, Javier; González Castaño, Francisco J; Rodríguez-Silva, Daniel; Domínguez Carrera, Lorena; González Martínez, David; Pena García, Javier; Gómez Rodríguez, Faustino; González Castaño, Diego; Pombar Cameán, Miguel

    2007-01-01

    The eIMRT project is producing new remote computational tools for helping radiotherapists to plan and deliver treatments. The first available tool will be the IMRT treatment verification using Monte Carlo, which is a computational expensive problem that can be executed remotely on a GRID. In this paper, the current implementation of this process using GRID and SOA technologies is presented, describing the remote execution environment and the client.

  4. Atmospheric discharge and dispersion of radionuclides during the Fukushima Dai-ichi Nuclear Power Plant accident. Part II: verification of the source term and analysis of regional-scale atmospheric dispersion.

    PubMed

    Terada, Hiroaki; Katata, Genki; Chino, Masamichi; Nagai, Haruyasu

    2012-10-01

    Regional-scale atmospheric dispersion simulations were carried out to verify the source term of (131)I and (137)Cs estimated in our previous studies, and to analyze the atmospheric dispersion and surface deposition during the Fukushima Dai-ichi Nuclear Power Plant accident. The accuracy of the source term was evaluated by comparing the simulation results with measurements of daily and monthly surface depositions (fallout) over land in eastern Japan from March 12 to April 30, 2011. The source term was refined using observed air concentrations of radionuclides for periods when there were significant discrepancies between the calculated and measured daily surface deposition, and when environmental monitoring data, which had not been used in our previous studies, were now available. The daily surface deposition using the refined source term was predicted mostly to within a factor of 10, and without any apparent bias. Considering the errors in the model prediction, the estimated source term is reasonably accurate during the period when the plume flowed over land in Japan. The analysis of regional-scale atmospheric dispersion and deposition suggests that the present distribution of a large amount of (137)Cs deposition in eastern Japan was produced primarily by four events that occurred on March 12, 15-16, 20, and 21-23. The ratio of wet deposition to the total varied widely depending on the influence by the particular event. Copyright © 2012 Elsevier Ltd. All rights reserved.
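
    A minimal sketch of the "within a factor of 10" style of comparison quoted above, applied to paired predicted and observed deposition values, is shown below; the numbers are hypothetical and the metric definitions (fraction within a factor of n, geometric-mean bias) are common choices rather than the authors' exact ones.

        import numpy as np

        def factor_of_n(predicted, observed, n=10.0):
            """Fraction of predictions within a factor of n of the observations,
            plus the geometric-mean bias (both arrays must be positive)."""
            predicted, observed = np.asarray(predicted, float), np.asarray(observed, float)
            ratio = predicted / observed
            within = np.mean((ratio <= n) & (ratio >= 1.0 / n))
            gm_bias = np.exp(np.mean(np.log(ratio)))
            return within, gm_bias

        # Hypothetical daily deposition values (arbitrary units)
        pred = [1.2e3, 4.0e2, 9.0e1, 2.5e4, 3.0e2]
        obs = [1.0e3, 6.0e2, 7.0e2, 1.8e4, 2.0e2]
        print(factor_of_n(pred, obs))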

  5. Structural verification for GAS experiments

    NASA Technical Reports Server (NTRS)

    Peden, Mark Daniel

    1992-01-01

    The purpose of this paper is to assist the Get Away Special (GAS) experimenter in conducting a thorough structural verification of its experiment structural configuration, thus expediting the structural review/approval process and the safety process in general. Material selection for structural subsystems will be covered with an emphasis on fasteners (GSFC fastener integrity requirements) and primary support structures (Stress Corrosion Cracking requirements and National Space Transportation System (NSTS) requirements). Different approaches to structural verifications (tests and analyses) will be outlined especially those stemming from lessons learned on load and fundamental frequency verification. In addition, fracture control will be covered for those payloads that utilize a door assembly or modify the containment provided by the standard GAS Experiment Mounting Plate (EMP). Structural hazard assessment and the preparation of structural hazard reports will be reviewed to form a summation of structural safety issues for inclusion in the safety data package.

  6. Formal verification of mathematical software

    NASA Technical Reports Server (NTRS)

    Sutherland, D.

    1984-01-01

    Methods are investigated for formally specifying and verifying the correctness of mathematical software (software which uses floating point numbers and arithmetic). Previous work in the field was reviewed. A new model of floating point arithmetic called the asymptotic paradigm was developed and formalized. Two different conceptual approaches to program verification, the classical Verification Condition approach and the more recently developed Programming Logic approach, were adapted to use the asymptotic paradigm. These approaches were then used to verify several programs; the programs chosen were simplified versions of actual mathematical software.

  7. Environmental Technology Verification Program Materials ...

    EPA Pesticide Factsheets

    The protocol provides generic procedures for implementing a verification test for the performance of in situ chemical oxidation (ISCO), focused specifically to expand the application of ISCO at manufactured gas plants with polyaromatic hydrocarbon (PAH) contamination (MGP/PAH) and at active gas station sites.

  8. CHEMICAL INDUCTION MIXER VERIFICATION - ENVIRONMENTAL TECHNOLOGY VERIFICATION PROGRAM

    EPA Science Inventory

    The Wet-Weather Flow Technologies Pilot of the Environmental Technology Verification (ETV) Program, which is supported by the U.S. Environmental Protection Agency and facilitated by NSF International, has recently evaluated the performance of chemical induction mixers used for di...

  9. CHEMICAL INDUCTION MIXER VERIFICATION - ENVIRONMENTAL TECHNOLOGY VERIFICATION PROGRAM

    EPA Science Inventory

    The Wet-Weather Flow Technologies Pilot of the Environmental Technology Verification (ETV) Program, which is supported by the U.S. Environmental Protection Agency and facilitated by NSF International, has recently evaluated the performance of chemical induction mixers used for di...

  10. 16 CFR 315.5 - Prescriber verification.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... § 315.5 Prescriber verification. (a) Prescription requirement. A seller may sell contact lenses only in accordance with a contact lens prescription for the patient that is: (1) Presented to the seller by the... for verification. When seeking verification of a contact lens prescription, a seller shall provide the...

  11. Working Memory Mechanism in Proportional Quantifier Verification

    ERIC Educational Resources Information Center

    Zajenkowski, Marcin; Szymanik, Jakub; Garraffa, Maria

    2014-01-01

    The paper explores the cognitive mechanisms involved in the verification of sentences with proportional quantifiers (e.g. "More than half of the dots are blue"). The first study shows that the verification of proportional sentences is more demanding than the verification of sentences such as: "There are seven blue and eight yellow…

  12. 78 FR 58492 - Generator Verification Reliability Standards

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-09-24

    ... Energy Regulatory Commission 18 CFR Part 40 Generator Verification Reliability Standards AGENCY: Federal... Organization: MOD-025-2 (Verification and Data Reporting of Generator Real and Reactive Power Capability and Synchronous Condenser Reactive Power Capability), MOD- 026-1 (Verification of Models and Data for Generator...

  13. Working Memory Mechanism in Proportional Quantifier Verification

    ERIC Educational Resources Information Center

    Zajenkowski, Marcin; Szymanik, Jakub; Garraffa, Maria

    2014-01-01

    The paper explores the cognitive mechanisms involved in the verification of sentences with proportional quantifiers (e.g. "More than half of the dots are blue"). The first study shows that the verification of proportional sentences is more demanding than the verification of sentences such as: "There are seven blue and eight yellow…

  14. Automated Installation Verification of COMSOL via LiveLink for MATLAB

    SciTech Connect

    Crowell, Michael W

    2015-01-01

    Verifying that a local software installation performs as the developer intends is a potentially time-consuming but necessary step for nuclear safety-related codes. Automating this process not only saves time, but can increase reliability and scope of verification compared to ‘hand’ comparisons. While COMSOL does not include automatic installation verification as many commercial codes do, it does provide tools such as LiveLink™ for MATLAB® and the COMSOL API for use with Java® through which the user can automate the process. Here we present a successful automated verification example of a local COMSOL 5.0 installation for nuclear safety-related calculations at the Oak Ridge National Laboratory’s High Flux Isotope Reactor (HFIR).
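
    The study drives COMSOL through LiveLink for MATLAB; the sketch below is only a generic, hypothetical illustration (in Python, not the COMSOL API) of the comparison step in automated installation verification: run each verification case and check its result against a stored reference value within a tolerance.

        # Generic sketch of automated installation verification: run each test case,
        # compare the result against a stored reference value within a tolerance,
        # and report pass/fail. The solver call is a stand-in for the real model run.

        def run_case(case):
            # Placeholder for invoking the installed code on one verification model
            return case["solver"](**case["inputs"])

        def verify_installation(cases, rel_tol=1e-6):
            report = []
            for case in cases:
                result = run_case(case)
                err = abs(result - case["reference"]) / abs(case["reference"])
                report.append((case["name"], result, err, err <= rel_tol))
            return report

        # Hypothetical verification case: a closed-form conduction answer as the "solver"
        cases = [{"name": "slab_conduction",
                  "solver": lambda k, L, dT: k * dT / L,   # heat flux, W/m^2
                  "inputs": {"k": 40.0, "L": 0.02, "dT": 25.0},
                  "reference": 50000.0}]
        for name, value, err, ok in verify_installation(cases):
            print(f"{name}: value={value:.6g} rel_err={err:.2e} {'PASS' if ok else 'FAIL'}")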

  15. Dispelling myths about verification of sea-launched cruise missiles

    SciTech Connect

    Lewis, G.N.; Ride, S.K.; Townsend, J.S.

    1989-11-10

    It is widely believed that an arms control limit on nuclear-armed sea-launched cruise missiles would be nearly impossible to verify. Among the reasons usually given are: these weapons are small, built in nondistinctive industrial facilities, deployed on a variety of ships and submarines, and difficult to distinguish from their conventionally armed counterparts. In this article, it is argued that the covert production and deployment of nuclear-armed sea-launched cruise missiles would not be so straightforward. A specific arms control proposal is described, namely a total ban on nuclear-armed sea-launched cruise missiles. This proposal is used to illustrate how an effective verification scheme might be constructed. 9 refs., 6 figs.

  16. Keeping Nuclear Materials Secure

    SciTech Connect

    2016-10-19

    For 50 years, Los Alamos National Laboratory has been helping to keep nuclear materials secure. We do this by developing instruments and training inspectors that are deployed to other countries to make sure materials such as uranium are being used for peaceful purposes and not diverted for use in weapons. These measures are called “nuclear safeguards,” and they help make the world a safer place.

  17. Advanced nuclear propulsion concepts

    SciTech Connect

    Howe, S.D.

    1994-12-31

    A preliminary analysis has been carried out for two potential advanced nuclear propulsion systems: a contained pulsed nuclear propulsion engine and an antiproton initiated ICF system. The results of these studies indicate that both concepts have a high potential to help enable manned planetary exploration but require substantial development.

  18. Image Hashes as Templates for Verification

    SciTech Connect

    Janik, Tadeusz; Jarman, Kenneth D.; Robinson, Sean M.; Seifert, Allen; McDonald, Benjamin S.; White, Timothy A.

    2012-07-17

    Imaging systems can provide measurements that confidently assess characteristics of nuclear weapons and dismantled weapon components, and such assessment will be needed in future verification for arms control. Yet imaging is often viewed as too intrusive, raising concern about the ability to protect sensitive information. In particular, the prospect of using image-based templates for verifying the presence or absence of a warhead, or of the declared configuration of fissile material in storage, may be rejected out-of-hand as being too vulnerable to violation of information barrier (IB) principles. Development of a rigorous approach for generating and comparing reduced-information templates from images, and assessing the security, sensitivity, and robustness of verification using such templates, are needed to address these concerns. We discuss our efforts to develop such a rigorous approach based on a combination of image-feature extraction and encryption-utilizing hash functions to confirm proffered declarations, providing strong classified data security while maintaining high confidence for verification. The proposed work is focused on developing secure, robust, tamper-sensitive and automatic techniques that may enable the comparison of non-sensitive hashed image data outside an IB. It is rooted in research on so-called perceptual hash functions for image comparison, at the interface of signal/image processing, pattern recognition, cryptography, and information theory. Such perceptual or robust image hashing—which, strictly speaking, is not truly cryptographic hashing—has extensive application in content authentication and information retrieval, database search, and security assurance. Applying and extending the principles of perceptual hashing to imaging for arms control, we propose techniques that are sensitive to altering, forging and tampering of the imaged object yet robust and tolerant to content-preserving image distortions and noise. Ensuring that the
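
    As a hedged, greatly simplified illustration of perceptual image hashing, the sketch below computes an average hash (block-mean downsampling followed by thresholding at the mean) and compares two hashes by Hamming distance; it is not the authors' hash construction, and the synthetic images are placeholders.

        import numpy as np

        def average_hash(image, hash_size=8):
            """Very simple perceptual hash: downsample by block averaging, then
            threshold at the mean. Returns a flat boolean array of hash_size**2 bits."""
            img = np.asarray(image, dtype=float)
            h, w = img.shape
            bh, bw = h // hash_size, w // hash_size
            small = img[:bh * hash_size, :bw * hash_size]
            small = small.reshape(hash_size, bh, hash_size, bw).mean(axis=(1, 3))
            return (small > small.mean()).ravel()

        def hamming_distance(h1, h2):
            return int(np.count_nonzero(h1 != h2))

        # Hypothetical image and a slightly noisy copy of it
        rng = np.random.default_rng(0)
        base = rng.normal(size=(128, 128)).cumsum(axis=0).cumsum(axis=1)  # smooth-ish image
        noisy = base + 0.05 * base.std() * rng.normal(size=base.shape)
        print(hamming_distance(average_hash(base), average_hash(noisy)))  # small distance expected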

  19. Kleene Algebra and Bytecode Verification

    DTIC Science & Technology

    2016-04-27

    Languages, ACM SIGPLAN/SIGACT, 1998, pp. 149–160. [2] Coglio, A., Simple verification technique for complex Java bytecode subroutines, Concurrency and...of Programming Languages (POPL’73), ACM , 1973, pp. 194–206. [6] Kot, L. and D. Kozen, Second-order abstract interpretation via Kleene algebra

  20. Automated verification system user's guide

    NASA Technical Reports Server (NTRS)

    Hoffman, R. H.

    1972-01-01

    Descriptions of the operational requirements for all of the programs of the Automated Verification System (AVS) are provided. The AVS programs are: (1) FORTRAN code analysis and instrumentation program (QAMOD); (2) Test Effectiveness Evaluation Program (QAPROC); (3) Transfer Control Variable Tracking Program (QATRAK); (4) Program Anatomy Table Generator (TABGEN); and (5) Network Path Analysis Program (RAMBLE).

  1. Verification Challenges at Low Numbers

    SciTech Connect

    Benz, Jacob M.; Booker, Paul M.; McDonald, Benjamin S.

    2013-07-16

    This paper will explore the difficulties of deep reductions by examining the technical verification challenges. At each step on the road to low numbers, the verification required to ensure compliance of all parties will increase significantly. Looking beyond New START, the next step will likely include warhead limits in the neighborhood of 1000 (Pifer 2010). Further reductions will include stepping stones at 100’s of warheads, and then 10’s of warheads, before final elimination of the last few remaining warheads and weapons could be considered. This paper will focus on these three threshold reduction levels: 1000, 100’s, and 10’s. For each, the issues and challenges will be discussed, potential solutions will be identified, and the verification technologies and chain of custody measures that address these solutions will be surveyed. It is important to note that many of the issues that need to be addressed have no current solution. In these cases, the paper will explore new or novel technologies that could be applied. These technologies will draw from the research and development that is ongoing throughout the national lab complex, and will look at technologies utilized in other areas of industry for their application to arms control verification.

  2. Cleanup Verification Package for the 118-C-1, 105-C Solid Waste Burial Ground

    SciTech Connect

    M. J. Appel and J. M. Capron

    2007-07-25

    This cleanup verification package documents completion of remedial action for the 118-C-1, 105-C Solid Waste Burial Ground. This waste site was the primary burial ground for general wastes from the operation of the 105-C Reactor and received process tubes, aluminum fuel spacers, control rods, reactor hardware, spent nuclear fuel and soft wastes.

  3. Helping for Change

    ERIC Educational Resources Information Center

    Neuringer, Allen; Oleson, Kathryn C.

    2010-01-01

    In "Helping for Change," Allen Neuringer and Kathryn Oleson describe another strategy that individuals can use to achieve their green goals. You might ask, "How can helping someone else help me change when I'm in the habit of not fulfilling my own promises?" The authors answer that question by explaining how the social reinforcement in a helping…

  4. Help! It's Hair Loss!

    MedlinePlus

    ... a better look at what's going on to help decide what to do next. For a fungal ...

  5. Help with Hives

    MedlinePlus

    ... about what happened. The doctor can try to help figure out what might be causing your hives, ...

  6. Concepts of Model Verification and Validation

    SciTech Connect

    B.H. Thacker; S.W. Doebling; F.M. Hemez; M.C. Anderson; J.E. Pepin; E.A. Rodriguez

    2004-10-30

    Model verification and validation (V&V) is an enabling methodology for the development of computational models that can be used to make engineering predictions with quantified confidence. Model V&V procedures are needed by government and industry to reduce the time, cost, and risk associated with full-scale testing of products, materials, and weapon systems. Quantifying the confidence and predictive accuracy of model calculations provides the decision-maker with the information necessary for making high-consequence decisions. The development of guidelines and procedures for conducting a model V&V program are currently being defined by a broad spectrum of researchers. This report reviews the concepts involved in such a program. Model V&V is a current topic of great interest to both government and industry. In response to a ban on the production of new strategic weapons and nuclear testing, the Department of Energy (DOE) initiated the Science-Based Stockpile Stewardship Program (SSP). An objective of the SSP is to maintain a high level of confidence in the safety, reliability, and performance of the existing nuclear weapons stockpile in the absence of nuclear testing. This objective has challenged the national laboratories to develop high-confidence tools and methods that can be used to provide credible models needed for stockpile certification via numerical simulation. There has been a significant increase in activity recently to define V&V methods and procedures. The U.S. Department of Defense (DoD) Modeling and Simulation Office (DMSO) is working to develop fundamental concepts and terminology for V&V applied to high-level systems such as ballistic missile defense and battle management simulations. The American Society of Mechanical Engineers (ASME) has recently formed a Standards Committee for the development of V&V procedures for computational solid mechanics models. The Defense Nuclear Facilities Safety Board (DNFSB) has been a proponent of model V&V for all

  7. Radia2: A new tool for radiotherapy verification

    NASA Astrophysics Data System (ADS)

    Ovejero, M. C.; Vega-Leal, A. Pérez; Cortés-Giraldo, M. A.; Abou-Haidar, Z.; Bocci, A.; Gallardo, M. I.; Espino, J. M.; Álvarez, M. A. G.; Quesada, J. M.; Arráns, R.

    2013-06-01

    Radiotherapy is nowadays a proven technique in cancer treatments. Within the evolution of radiotherapy treatments towards more complex techniques, the need of new dosimetric methods for treatment verifications has appeared. In order to reach an improved dosimetric method, a collaboration was started to transfer knowledge from nuclear reaction instrumentation to medical applications, involving several departments from the University of Seville, Centro Nacional de Aceleradores (CNA), the Hospital Universitario Virgen Macarena and the company Inabensa. The first prototype, patent pending [2], gave very promising results. Currently, a critical review is being carried out to create an improved system.

  8. A comprehensive ban on nuclear testing.

    PubMed

    Neild, R; Ruina, J P

    1972-01-14

    Our foregoing analysis of the role of a comprehensive test ban leads us to the following conclusions. 1) A CTB by itself will have little direct effect on the arms race between the superpowers. It would not hinder their nuclear arms production and deployment nor would it necessarily present a significant obstacle to the development of new nuclear weapons systems, despite limiting the development of new nuclear warhead designs. It can hardly make a dent in the destructive capability of the superpowers or in their ability to step up the pace of the arms race. 2) The chief merits of a CTB reside in the political sphere. It would help promote detente and could help to escalate interest in arms control agreements of broader scope. But in neither of these effects would it be as significant as a successful SALT (strategic arms limitation talks) agreement. The CTB also lingers as a piece of unfinished business since the signing of the LTB in 1963. The question can be and has been raised, "If the superpowers are serious about arms control, why have they not accepted the CTB, which is simple in concept and in form and is also free of serious military risks?" Such doubts about the sincerity of the superpowers' willingness to limit their own arms development will persist as long as there is no CTB. Substantial agreement at SALT would lessen some of this effect too, but would not eliminate it completely. 3) Recent progress in seismic identification has been impressive, and other means of obtaining technical intelligence about nuclear testing have probably also improved greatly. In addition, research on the technical means of on-site inspection has demonstrated its limited effectiveness. Therefore, the role of on-site inspections as an added deterrent to cheating on a CTB has diminished substantially. This is not to say that detection and identification of all nuclear tests is possible now, or ever, but only that on-site inspection would add very little to the other technical

  9. Help Design Software Project

    DTIC Science & Technology

    1988-11-30

    design principles in their online help. * Help Evaluation System to assist developers and end users in diagnosing the strengths and weaknesses of any...basically points out the important features of the screen to more specific principles and other examples of those principles, and finally to detailed

  10. Gender verification testing in sport.

    PubMed

    Ferris, E A

    1992-07-01

    Gender verification testing in sport, first introduced in 1966 by the International Amateur Athletic Federation (IAAF) in response to fears that males with a physical advantage in terms of muscle mass and strength were cheating by masquerading as females in women's competition, has led to unfair disqualifications of women athletes and untold psychological harm. The discredited sex chromatin test, which identifies only the sex chromosome component of gender and is therefore misleading, was abandoned in 1991 by the IAAF in favour of medical checks for all athletes, women and men, which preclude the need for gender testing. But, women athletes will still be tested at the Olympic Games at Albertville and Barcelona using polymerase chain reaction (PCR) to amplify DNA sequences on the Y chromosome which identifies genetic sex only. Gender verification testing may in time be abolished when the sporting community are fully cognizant of its scientific and ethical implications.

  11. Wavelet Features Based Fingerprint Verification

    NASA Astrophysics Data System (ADS)

    Bagadi, Shweta U.; Thalange, Asha V.; Jain, Giridhar P.

    2010-11-01

    In this work, we present an automatic fingerprint identification system based on Level 3 features. Systems based only on minutiae features do not perform well for poor quality images. In practice, we often encounter extremely dry, wet fingerprint images with cuts, warts, etc. Due to such fingerprints, minutiae based systems show poor performance for real time authentication applications. To alleviate the problem of poor quality fingerprints, and to improve overall performance of the system, this paper proposes fingerprint verification based on wavelet statistical features & co-occurrence matrix features. The features include mean, standard deviation, energy, entropy, contrast, local homogeneity, cluster shade, cluster prominence, and information measure of correlation. In this method, matching can be done between the input image and the stored template without exhaustive search using the extracted feature. The wavelet transform based approach is better than the existing minutiae based method and it takes less response time, and hence is suitable for on-line verification, with high accuracy.
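
    A minimal sketch of wavelet statistical features is given below: a single-level 2-D Haar decomposition followed by mean, standard deviation, energy, and histogram entropy per subband. The co-occurrence-matrix features mentioned in the abstract are omitted, and the patch data, bin count, and feature choices are illustrative assumptions rather than the authors' exact configuration.

        import numpy as np

        def haar2d(img):
            """One level of a 2-D Haar wavelet transform (approximation + 3 detail subbands)."""
            img = np.asarray(img, float)
            img = img[:img.shape[0] // 2 * 2, :img.shape[1] // 2 * 2]   # force even dimensions
            a = (img[0::2, :] + img[1::2, :]) / 2.0
            d = (img[0::2, :] - img[1::2, :]) / 2.0
            LL = (a[:, 0::2] + a[:, 1::2]) / 2.0
            LH = (a[:, 0::2] - a[:, 1::2]) / 2.0
            HL = (d[:, 0::2] + d[:, 1::2]) / 2.0
            HH = (d[:, 0::2] - d[:, 1::2]) / 2.0
            return LL, LH, HL, HH

        def subband_features(band, bins=32):
            """Mean, standard deviation, energy and (histogram) entropy of one subband."""
            hist, _ = np.histogram(band, bins=bins)
            p = hist / hist.sum()
            p = p[p > 0]
            entropy = -np.sum(p * np.log2(p))
            return [band.mean(), band.std(), np.mean(band ** 2), entropy]

        # Hypothetical fingerprint patch
        rng = np.random.default_rng(2)
        patch = rng.random((64, 64))
        features = np.concatenate([subband_features(b) for b in haar2d(patch)])
        print(features.shape)   # 4 subbands x 4 features = 16-element feature vector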

  12. NEXT Thruster Component Verification Testing

    NASA Technical Reports Server (NTRS)

    Pinero, Luis R.; Sovey, James S.

    2007-01-01

    Component testing is a critical part of thruster life validation activities under NASA's Evolutionary Xenon Thruster (NEXT) project. The high voltage propellant isolators were selected for design verification testing. Even though they are based on a heritage design, design changes were made because the isolators will be operated under different environmental conditions including temperature, voltage, and pressure. The life test of two NEXT isolators was therefore initiated and has accumulated more than 10,000 hr of operation. Measurements to date indicate only a negligibly small increase in leakage current. The cathode heaters were also selected for verification testing. The technology to fabricate these heaters, developed for the International Space Station plasma contactor hollow cathode assembly, was transferred to Aerojet for the fabrication of the NEXT prototype model ion thrusters. Testing the contractor-fabricated heaters is necessary to validate fabrication processes for high reliability heaters. This paper documents the status of the propellant isolator and cathode heater tests.

  13. Land surface Verification Toolkit (LVT)

    NASA Technical Reports Server (NTRS)

    Kumar, Sujay V.

    2017-01-01

    LVT is a framework developed to provide an automated, consolidated environment for systematic land surface model evaluation. It includes support for a range of in-situ, remote-sensing and other model and reanalysis products, and supports the analysis of outputs from various LIS subsystems, including LIS-DA, LIS-OPT, and LIS-UE. Note: The Land Information System Verification Toolkit (LVT) is a NASA software tool designed to enable the evaluation, analysis and comparison of outputs generated by the Land Information System (LIS). The LVT software is released under the terms and conditions of the NASA Open Source Agreement (NOSA) Version 1.1 or later.

  14. Ontology Matching with Semantic Verification

    PubMed Central

    Jean-Mary, Yves R.; Shironoshita, E. Patrick; Kabuka, Mansur R.

    2009-01-01

    ASMOV (Automated Semantic Matching of Ontologies with Verification) is a novel algorithm that uses lexical and structural characteristics of two ontologies to iteratively calculate a similarity measure between them, derives an alignment, and then verifies it to ensure that it does not contain semantic inconsistencies. In this paper, we describe the ASMOV algorithm, and then present experimental results that measure its accuracy using the OAEI 2008 tests, and that evaluate its use with two different thesauri: WordNet, and the Unified Medical Language System (UMLS). These results show the increased accuracy obtained by combining lexical, structural and extensional matchers with semantic verification, and demonstrate the advantage of using a domain-specific thesaurus for the alignment of specialized ontologies. PMID:20186256
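
    The sketch below is a toy illustration of iteratively combining lexical and structural similarity for ontology matching; it uses a generic string-similarity measure and a parent-similarity term with an assumed weighting, and is not the ASMOV algorithm itself (the semantic verification step is omitted).

        from difflib import SequenceMatcher

        def lexical_sim(a, b):
            return SequenceMatcher(None, a.lower(), b.lower()).ratio()

        def match_ontologies(concepts1, concepts2, parents1, parents2, iters=5, w_lex=0.6):
            """Iteratively blend lexical similarity with a structural term that
            rewards pairs whose parent classes are also similar. Returns a similarity dict."""
            sim = {(c1, c2): lexical_sim(c1, c2) for c1 in concepts1 for c2 in concepts2}
            for _ in range(iters):
                new_sim = {}
                for c1 in concepts1:
                    for c2 in concepts2:
                        p1, p2 = parents1.get(c1), parents2.get(c2)
                        structural = sim.get((p1, p2), 0.0) if p1 and p2 else 0.0
                        new_sim[(c1, c2)] = w_lex * lexical_sim(c1, c2) + (1 - w_lex) * structural
                sim = new_sim
            return sim

        # Hypothetical toy ontologies (class -> parent class)
        parents1 = {"MalignantTumor": "Tumor", "Tumor": None}
        parents2 = {"MalignantNeoplasm": "Neoplasm", "Neoplasm": None}
        sim = match_ontologies(list(parents1), list(parents2), parents1, parents2)
        best = max(sim, key=sim.get)
        print(best, round(sim[best], 3))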

  15. Verification and transparency in future arms control

    SciTech Connect

    Pilat, J.F.

    1996-09-01

    Verification's importance has changed dramatically over time, although it always has been in the forefront of arms control. The goals and measures of verification and the criteria for success have changed with the times as well, reflecting such factors as the centrality of the prospective agreement to East-West relations during the Cold War, the state of relations between the United States and the Soviet Union, and the technologies available for monitoring. Verification's role may be declining in the post-Cold War period. The prospects for such a development will depend, first and foremost, on the high costs of traditional arms control, especially those associated with requirements for verification. Moreover, the growing interest in informal, or non-negotiated arms control does not allow for verification provisions by the very nature of these arrangements. Multilateral agreements are also becoming more prominent and argue against highly effective verification measures, in part because of fears of promoting proliferation by opening sensitive facilities to inspectors from potential proliferant states. As a result, it is likely that transparency and confidence-building measures will achieve greater prominence, both as supplements to and substitutes for traditional verification. Such measures are not panaceas and do not offer all that we came to expect from verification during the Cold War. But they may be the best possible means to deal with current problems of arms reductions and restraints at acceptable levels of expenditure.

  16. Verification and validation guidelines for high integrity systems. Volume 1

    SciTech Connect

    Hecht, H.; Hecht, M.; Dinsmore, G.; Hecht, S.; Tang, D.

    1995-03-01

    High integrity systems include all protective (safety and mitigation) systems for nuclear power plants, and also systems for which comparable reliability requirements exist in other fields, such as in the process industries, in air traffic control, and in patient monitoring and other medical systems. Verification aims at determining that each stage in the software development completely and correctly implements requirements that were established in a preceding phase, while validation determines that the overall performance of a computer system completely and correctly meets system requirements. Volume I of the report reviews existing classifications for high integrity systems and for the types of errors that may be encountered, and makes recommendations for verification and validation procedures, based on assumptions about the environment in which these procedures will be conducted. The final chapter of Volume I deals with a framework for standards in this field. Volume II contains appendices dealing with specific methodologies for system classification, for dependability evaluation, and for two software tools that can automate otherwise very labor intensive verification and validation activities.

  17. Structural System Identification Technology Verification

    DTIC Science & Technology

    1981-11-01

    USAAVRADCOM-TR-81-D-28, Structural System Identification Technology Verification, N. Giansante, A. Berman, W. O. Flannelly, E. ...release; distribution unlimited. Prepared for the Applied Technology Laboratory, U.S. Army Research and Technology Laboratories (AVRADCOM), Fort Eustis, Va. 23604. Applied Technology Laboratory position statement: The Applied Technology Laboratory has been involved in the development of the Struc

  18. Crowd-Sourced Program Verification

    DTIC Science & Technology

    2012-12-01

    ...investigation, the contractor constructed a prototype of a crowd-sourced verification system that takes as input a given program and produces as output a

  19. Formal verification of AI software

    NASA Technical Reports Server (NTRS)

    Rushby, John; Whitehurst, R. Alan

    1989-01-01

    The application of formal verification techniques to Artificial Intelligence (AI) software, particularly expert systems, is investigated. Constraint satisfaction and model inversion are identified as two formal specification paradigms for different classes of expert systems. A formal definition of consistency is developed, and the notion of approximate semantics is introduced. Examples are given of how these ideas can be applied in both declarative and imperative forms.

  20. Earthquake Forecasting, Validation and Verification

    NASA Astrophysics Data System (ADS)

    Rundle, J.; Holliday, J.; Turcotte, D.; Donnellan, A.; Tiampo, K.; Klein, B.

    2009-05-01

    Techniques for earthquake forecasting are in development using both seismicity data mining methods, as well as numerical simulations. The former rely on the development of methods to recognize patterns in data, while the latter rely on the use of dynamical models that attempt to faithfully replicate the actual fault systems. Testing such forecasts is necessary not only to determine forecast quality, but also to improve forecasts. A large number of techniques to validate and verify forecasts have been developed for weather and financial applications. Many of these have been elaborated in public locations, including, for example, the URL as listed below. Typically, the goal is to test for forecast resolution, reliability and sharpness. A good forecast is characterized by consistency, quality and value. Most, if not all of these forecast verification procedures can be readily applied to earthquake forecasts as well. In this talk, we discuss both methods of forecasting, as well as validation and verification using a number of these standard methods. We show how these test methods might be useful for both fault-based forecasting, a group of forecast methods that includes the WGCEP and simulator-based renewal models, and grid-based forecasting, which includes the Relative Intensity, Pattern Informatics, and smoothed seismicity methods. We find that applying these standard methods of forecast verification is straightforward. Judgments about the quality of a given forecast method can often depend on the test applied, as well as on the preconceptions and biases of the persons conducting the tests.
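
    As a hedged example of the standard probabilistic forecast verification discussed above, the sketch below computes the Brier score and the Brier skill score against a climatological (base-rate) reference; the forecast probabilities and outcomes are hypothetical and not taken from any of the forecast methods named in the abstract.

        import numpy as np

        def brier_score(prob_forecasts, outcomes):
            """Brier score for probabilistic forecasts of a binary event (lower is better)."""
            p, o = np.asarray(prob_forecasts, float), np.asarray(outcomes, float)
            return np.mean((p - o) ** 2)

        def brier_skill_score(prob_forecasts, outcomes):
            """Skill relative to a climatological (base-rate) reference forecast."""
            o = np.asarray(outcomes, float)
            bs = brier_score(prob_forecasts, outcomes)
            bs_ref = brier_score(np.full_like(o, o.mean()), outcomes)
            return 1.0 - bs / bs_ref

        # Hypothetical forecast probabilities of "event occurs in the next interval"
        p = [0.1, 0.7, 0.2, 0.9, 0.4, 0.05, 0.6]
        obs = [0, 1, 0, 1, 1, 0, 0]
        print(brier_score(p, obs), brier_skill_score(p, obs))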

  1. History of Nuclear India

    NASA Astrophysics Data System (ADS)

    Chaturvedi, Ram

    2000-04-01

    India emerged as a free and democratic country in 1947, and entered the nuclear age in 1948 by establishing the Atomic Energy Commission (AEC), with Homi Bhabha as the chairman. Later on, the Department of Atomic Energy (DAE) was created under the Office of the Prime Minister, Jawaharlal Nehru. Initially the AEC and DAE received international cooperation, and by 1963 India had two research reactors and four nuclear power reactors. In spite of the humiliating defeat by China in the 1962 border war and China's nuclear testing in 1964, India continued to adhere to the peaceful uses of nuclear energy. On May 18, 1974, India performed a 15 kt Peaceful Nuclear Explosion (PNE). The western powers considered it nuclear weapons proliferation and cut off all financial and technical help, even for the production of nuclear power. However, India used existing infrastructure to build nuclear power reactors and exploded both fission and fusion devices on May 11 and 13, 1998. The international community viewed the latter activity as a serious roadblock for the Non-Proliferation Treaty and the Comprehensive Test Ban Treaty, both deemed essential to stop the spread of nuclear weapons. India considers these treaties to favor nuclear states and is prepared to sign if genuine nuclear disarmament is included as an integral part of these treaties.

  2. Nuclear Lunar Logistics Study

    NASA Technical Reports Server (NTRS)

    1963-01-01

    This document has been prepared to incorporate all presentation aid material, together with some explanatory text, used during an oral briefing on the Nuclear Lunar Logistics System given at the George C. Marshall Space Flight Center, National Aeronautics and Space Administration, on 18 July 1963. The briefing and this document are intended to present the general status of the NERVA (Nuclear Engine for Rocket Vehicle Application) nuclear rocket development, the characteristics of certain operational NERVA-class engines, and appropriate technical and schedule information. Some of the information presented herein is preliminary in nature and will be subject to further verification, checking and analysis during the remainder of the study program. In addition, more detailed information will be prepared in many areas for inclusion in a final summary report. This work has been performed by REON, a division of Aerojet-General Corporation under Subcontract 74-10039 from the Lockheed Missiles and Space Company. The presentation and this document have been prepared in partial fulfillment of the provisions of the subcontract. From the inception of the NERVA program in July 1961, the stated emphasis has centered around the demonstration of the ability of a nuclear rocket to perform safely and reliably in the space environment, with the understanding that the assignment of a mission (or missions) would place undue emphasis on performance and operational flexibility. However, all were aware that the ultimate justification for the development program must lie in the application of the nuclear propulsion system to the national space objectives.

  3. Verification Test Suite for Physics Simulation Codes

    SciTech Connect

    Brock, J S; Kamm, J R; Rider, W J; Brandon, S; Woodward, C; Knupp, P; Trucano, T G

    2006-12-21

    The DOE/NNSA Advanced Simulation & Computing (ASC) Program directs the development, demonstration and deployment of physics simulation codes. The defensible utilization of these codes for high-consequence decisions requires rigorous verification and validation of the simulation software. The physics and engineering codes used at Los Alamos National Laboratory (LANL), Lawrence Livermore National Laboratory (LLNL), and Sandia National Laboratories (SNL) are arguably among the most complex utilized in computational science. Verification represents an important aspect of the development, assessment and application of simulation software for physics and engineering. The purpose of this note is to formally document the existing tri-laboratory suite of verification problems used by LANL, LLNL, and SNL, i.e., the Tri-Lab Verification Test Suite. Verification is often referred to as ensuring that "the [discrete] equations are solved [numerically] correctly". More precisely, verification develops evidence of mathematical consistency between continuum partial differential equations (PDEs) and their discrete analogues, and provides an approach by which to estimate discretization errors. There are two variants of verification: (1) code verification, which compares simulation results to known analytical solutions, and (2) calculation verification, which estimates convergence rates and discretization errors without knowledge of a known solution. Together, these verification analyses support defensible verification and validation (V&V) of physics and engineering codes that are used to simulate complex problems that do not possess analytical solutions. Discretization errors (e.g., spatial and temporal errors) are embedded in the numerical solutions of the PDEs that model the relevant governing equations. Quantifying discretization errors, which comprise only a portion of the total numerical simulation error, is possible through code and calculation verification. Code verification
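
    The calculation-verification idea described above can be illustrated with a short, self-contained sketch: estimate the observed order of accuracy from errors on successively refined grids. The grid refinement ratio, error values, and function name below are hypothetical and are not taken from the Tri-Lab suite; they only show the shape of the calculation.

    ```python
    import math

    def observed_order(e_coarse, e_fine, refinement_ratio=2.0):
        """Estimate the observed order of accuracy from errors on two grids.

        e_coarse, e_fine : discretization errors (e.g., norms of the difference
        from an exact or manufactured solution) on grids whose spacing differs
        by `refinement_ratio`.
        """
        return math.log(e_coarse / e_fine) / math.log(refinement_ratio)

    # Hypothetical errors from a manufactured-solution study on grids h, h/2, h/4.
    errors = [4.0e-2, 1.1e-2, 2.9e-3]
    for coarse, fine in zip(errors, errors[1:]):
        print(f"observed order ~ {observed_order(coarse, fine):.2f}")
    # Values near 2 would support a formally second-order discretization.
    ```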

  4. Handi Helps, 1984.

    ERIC Educational Resources Information Center

    Handi Helps, 1984

    1984-01-01

    The eight issues of Handi Helps presented in this document focus on specific issues of concern to the disabled, parents, and those working with the disabled. The two-page handi help fact sheets focus on the following topics: child abuse, leukemia, arthritis, Tourette Syndrome, hemophilia, the puppet program "Meet the New Kids on the…

  5. Handi Helps, 1985

    ERIC Educational Resources Information Center

    Handi Helps, 1985

    1985-01-01

    The six issues of Handi Helps presented here focus on specific issues of concern to the disabled, parents, and those working with the disabled. The two-page handi help fact sheets focus on the following topics: child sexual abuse prevention, asthma, scoliosis, the role of the occupational therapist, kidnapping, and muscular dystrophy. Each handi…

  6. Help! It's Hair Loss!

    MedlinePlus

    ... Help! It's Hair Loss! ... part above the skin, is dead. (That's why it doesn't hurt to get a haircut!) This ...

  7. Handi Helps, 1984.

    ERIC Educational Resources Information Center

    Handi Helps, 1984

    1984-01-01

    The eight issues of Handi Helps presented in this document focus on specific issues of concern to the disabled, parents, and those working with the disabled. The two-page handi help fact sheets focus on the following topics: child abuse, leukemia, arthritis, Tourette Syndrome, hemophilia, the puppet program "Meet the New Kids on the…

  8. Home Is for Helping.

    ERIC Educational Resources Information Center

    Des Moines Public Schools, IA.

    This booklet for parents offers ideas for utilizing everyday situations in the home to help children improve in school, primarily in reading and mathematics skills. General suggestions are given for helping children to do their best by talking to them, reading to them, listening to them, praising them, watching television with them, keeping them…

  9. Helping America's Youth

    ERIC Educational Resources Information Center

    Bush, Laura

    2005-01-01

    As First Lady of the United States, Laura Bush is leading the Helping America's Youth initiative of the federal government. She articulates the goal of enlisting public and volunteer resources to foster healthy growth by early intervention and mentoring of youngsters at risk. Helping America's Youth will benefit children and teenagers by…

  10. Handi Helps, 1985

    ERIC Educational Resources Information Center

    Handi Helps, 1985

    1985-01-01

    The six issues of Handi Helps presented here focus on specific issues of concern to the disabled, parents, and those working with the disabled. The two-page handi help fact sheets focus on the following topics: child sexual abuse prevention, asthma, scoliosis, the role of the occupational therapist, kidnapping, and muscular dystrophy. Each handi…

  11. Helping Our Children.

    ERIC Educational Resources Information Center

    Polk, Sophie

    1987-01-01

    Describes the Ikaiyurluki Mikelnguut (Helping Our Children) project in the Yukon Kuskokwim Delta of Alaska where trained natural helpers are helping Yup'ik Eskimo villagers to cope with crisis situations--notably teenage suicide and drug and alcohol abuse. (Author/BB)

  12. Helping America's Youth

    ERIC Educational Resources Information Center

    Bush, Laura

    2005-01-01

    As First Lady of the United States, Laura Bush is leading the Helping America's Youth initiative of the federal government. She articulates the goal of enlisting public and volunteer resources to foster healthy growth by early intervention and mentoring of youngsters at risk. Helping America's Youth will benefit children and teenagers by…

  13. Gender verification in competitive sports.

    PubMed

    Simpson, J L; Ljungqvist, A; de la Chapelle, A; Ferguson-Smith, M A; Genel, M; Carlson, A S; Ehrhardt, A A; Ferris, E

    1993-11-01

    The possibility that men might masquerade as women and be unfair competitors in women's sports is accepted as outrageous by athletes and the public alike. Since the 1930s, media reports have fuelled claims that individuals who once competed as female athletes subsequently appeared to be men. In most of these cases there was probably ambiguity of the external genitalia, possibly as a result of male pseudohermaphroditism. Nonetheless, beginning at the Rome Olympic Games in 1960, the International Amateur Athletics Federation (IAAF) began establishing rules of eligibility for women athletes. Initially, physical examination was used as a method for gender verification, but this plan was widely resented. Thus, sex chromatin testing (buccal smear) was introduced at the Mexico City Olympic Games in 1968. The principle was that genetic females (46,XX) show a single X-chromatic mass, whereas males (46,XY) do not. Unfortunately, sex chromatin analysis fell out of common diagnostic use by geneticists shortly after the International Olympic Committee (IOC) began its implementation for gender verification. The lack of laboratories routinely performing the test aggravated the problem of errors in interpretation by inexperienced workers, yielding false-positive and false-negative results. However, an even greater problem is that there exist phenotypic females with male sex chromatin patterns (e.g. androgen insensitivity, XY gonadal dysgenesis). These individuals have no athletic advantage as a result of their congenital abnormality and reasonably should not be excluded from competition. That is, only the chromosomal (genetic) sex is analysed by sex chromatin testing, not the anatomical or psychosocial status. For all the above reasons sex chromatin testing unfairly excludes many athletes. Although the IOC offered follow-up physical examinations that could have restored eligibility for those 'failing' sex chromatin tests, most affected athletes seemed to prefer to 'retire'. All

  14. Nuclear power and nuclear weapons

    SciTech Connect

    Vaughen, V.C.A.

    1983-01-01

    The proliferation of nuclear weapons and the expanded use of nuclear energy for the production of electricity and other peaceful uses are compared. The difference in technologies associated with nuclear weapons and nuclear power plants are described.

  15. PERFORMANCE VERIFICATION OF STORMWATER TREATMENT DEVICES UNDER EPA'S ENVIRONMENTAL TECHNOLOGY VERIFICATION PROGRAM

    EPA Science Inventory

    The Environmental Technology Verification (ETV) Program was created to facilitate the deployment of innovative or improved environmental technologies through performance verification and dissemination of information. The program's goal is to further environmental protection by a...

  16. PERFORMANCE VERIFICATION OF ANIMAL WATER TREATMENT TECHNOLOGIES THROUGH EPA'S ENVIRONMENTAL TECHNOLOGY VERIFICATION PROGRAM

    EPA Science Inventory

    The U.S. Environmental Protection Agency created the Environmental Technology Verification Program (ETV) to further environmental protection by accelerating the commercialization of new and innovative technology through independent performance verification and dissemination of in...

  17. PERFORMANCE VERIFICATION OF STORMWATER TREATMENT DEVICES UNDER EPA'S ENVIRONMENTAL TECHNOLOGY VERIFICATION PROGRAM

    EPA Science Inventory

    The Environmental Technology Verification (ETV) Program was created to facilitate the deployment of innovative or improved environmental technologies through performance verification and dissemination of information. The program's goal is to further environmental protection by a...

  18. ENVIRONMENTAL TECHNOLOGY VERIFICATION COATINGS AND COATING EQUIPMENT PROGRAM (ETV CCEP): LIQUID COATINGS--GENERIC VERIFICATION PROTOCOL

    EPA Science Inventory

    This report is a generic verification protocol or GVP which provides standards for testing liquid coatings for their enviornmental impacts under the Environmental Technology Verification program. It provides generic guidelines for product specific testing and quality assurance p...

  19. ENVIRONMENTAL TECHNOLOGY VERIFICATION: GENERIC VERIFICATION PROTOCOL FOR BIOLOGICAL AND AEROSOL TESTING OF GENERAL VENTILATION AIR CLEANERS

    EPA Science Inventory

    The U.S. Environmental Protection Agency established the Environmental Technology Verification Program to accelerate the development and commercialization of improved environmental technology through third party verification and reporting of product performance. Research Triangl...

  20. PERFORMANCE VERIFICATION OF ANIMAL WATER TREATMENT TECHNOLOGIES THROUGH EPA'S ENVIRONMENTAL TECHNOLOGY VERIFICATION PROGRAM

    EPA Science Inventory

    The U.S. Environmental Protection Agency created the Environmental Technology Verification Program (ETV) to further environmental protection by accelerating the commercialization of new and innovative technology through independent performance verification and dissemination of in...

  1. Using tools for verification, documentation and testing

    NASA Technical Reports Server (NTRS)

    Osterweil, L. J.

    1978-01-01

    Methodologies for four of the major approaches to program upgrading are discussed, namely dynamic testing, symbolic execution, formal verification, and static analysis. The different patterns of strengths, weaknesses, and applications of these approaches are shown. It is demonstrated that these patterns are in many ways complementary, offering the hope that they can be coordinated and unified into a single comprehensive program testing and verification system capable of performing a diverse and useful variety of error detection, verification, and documentation functions.

  2. Magnetic cleanliness verification approach on tethered satellite

    NASA Technical Reports Server (NTRS)

    Messidoro, Piero; Braghin, Massimo; Grande, Maurizio

    1990-01-01

    Magnetic cleanliness testing was performed on the Tethered Satellite as the last step of an articulated verification campaign aimed at demonstrating the capability of the satellite to support its TEMAG (TEthered MAgnetometer) experiment. Tests at unit level and analytical predictions/correlations using a dedicated mathematical model (GANEW program) are also part of the verification activities. Details of the tests are presented, and the results of the verification are described together with recommendations for later programs.

  3. Nuclear rights - nuclear wrongs

    SciTech Connect

    Paul, E.F.; Miller, F.D.; Paul, J.; Ahrens, J.

    1986-01-01

    This book contains 11 selections. The titles are: Three Ways to Kill Innocent Bystanders: Some Conundrums Concerning the Morality of War; The International Defense of Liberty; Two Concepts of Deterrence; Nuclear Deterrence and Arms Control; Ethical Issues for the 1980s; The Moral Status of Nuclear Deterrent Threats; Optimal Deterrence; Morality and Paradoxical Deterrence; Immoral Risks: A Deontological Critique of Nuclear Deterrence; No War Without Dictatorship, No Peace Without Democracy: Foreign Policy as Domestic Politics; Marxism-Leninism and its Strategic Implications for the United States; Tocqueville War.

  4. Regional Seismic Arrays and Nuclear Test Ban Verification

    DTIC Science & Technology

    1990-12-01

    Applications to Static Displacements in Long Valley Caldera, California and Yellowstone Caldera, Wyoming. W. R. Walter and J. N. Brune: The Spectra of ... the upper mantle beneath the Silent Canyon caldera in southern Nevada, which Spence (1974) explained by extrusion of volatile magmatic components ... Analogous structures have also been observed in the late Oligocene Questa caldera in north-central New Mexico (Lipman, 1983), and the late

  5. Uncertainty Quantification for Safety Verification Applications in Nuclear Power Plants

    NASA Astrophysics Data System (ADS)

    Boafo, Emmanuel

    There is an increasing interest in computational reactor safety analysis to systematically replace the conservative calculations by best estimate calculations augmented by quantitative uncertainty analysis methods. This has been necessitated by recent regulatory requirements that have permitted the use of such methods in reactor safety analysis. Stochastic uncertainty quantification methods have shown great promise, as they are better suited to capture the complexities in real engineering problems. This study proposes a framework for performing uncertainty quantification based on the stochastic approach, which can be applied to enhance safety analysis. (Abstract shortened by ProQuest.).
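
    A minimal sketch of the stochastic (sampling-based) uncertainty quantification idea described in the abstract is shown below. The input distributions and the toy response function are assumptions made purely for illustration; a real safety analysis would propagate uncertain inputs through a best-estimate simulation code instead.

    ```python
    import random
    import statistics

    def response(power_factor, heat_transfer_coeff):
        """Toy stand-in for a best-estimate simulation output (e.g., a peak temperature, K)."""
        return 600.0 + 250.0 * power_factor / heat_transfer_coeff

    random.seed(1)
    samples = []
    for _ in range(10_000):
        # Hypothetical uncertain inputs with assumed distributions.
        pf = random.gauss(1.00, 0.05)     # normalized power
        htc = random.uniform(0.9, 1.1)    # normalized heat-transfer coefficient
        samples.append(response(pf, htc))

    samples.sort()
    mean = statistics.fmean(samples)
    p95 = samples[int(0.95 * len(samples))]
    print(f"mean = {mean:.1f} K, 95th percentile = {p95:.1f} K")
    ```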

  6. Applying Human-performance Models to Designing and Evaluating Nuclear Power Plants: Review Guidance and Technical Basis

    SciTech Connect

    O'Hara, J.M.

    2009-11-30

    Human performance models (HPMs) are simulations of human behavior with which we can predict human performance. Designers use them to support their human factors engineering (HFE) programs for a wide range of complex systems, including commercial nuclear power plants. Applicants to U.S. Nuclear Regulatory Commission (NRC) can use HPMs for design certifications, operating licenses, and license amendments. In the context of nuclear-plant safety, it is important to assure that HPMs are verified and validated, and their usage is consistent with their intended purpose. Using HPMs improperly may generate misleading or incorrect information, entailing safety concerns. The objective of this research was to develop guidance to support the NRC staff's reviews of an applicant's use of HPMs in an HFE program. The guidance is divided into three topical areas: (1) HPM Verification, (2) HPM Validation, and (3) User Interface Verification. Following this guidance will help ensure the benefits of HPMs are achieved in a technically sound, defensible manner. During the course of developing this guidance, I identified several issues that could not be addressed; they also are discussed.

  7. Scope and verification of a Fissile Material (Cutoff) Treaty

    SciTech Connect

    Hippel, Frank N. von

    2014-05-09

    A Fissile Material Cutoff Treaty (FMCT) would ban the production of fissile material - in practice highly-enriched uranium and separated plutonium - for weapons. It has been supported by strong majorities in the United Nations. After it comes into force, newly produced fissile materials could only be produced under international - most likely International Atomic Energy Agency - monitoring. Many non-weapon states argue that the treaty should also place under safeguards pre-existing stocks of fissile material in civilian use or declared excess for weapons so as to make nuclear-weapons reductions irreversible. This paper discusses the scope of the FMCT, the ability to detect clandestine production and verification challenges in the nuclear-weapons states.

  8. Scope and verification of a Fissile Material (Cutoff) Treaty

    NASA Astrophysics Data System (ADS)

    von Hippel, Frank N.

    2014-05-01

    A Fissile Material Cutoff Treaty (FMCT) would ban the production of fissile material - in practice highly-enriched uranium and separated plutonium - for weapons. It has been supported by strong majorities in the United Nations. After it comes into force, newly produced fissile materials could only be produced under international - most likely International Atomic Energy Agency - monitoring. Many non-weapon states argue that the treaty should also place under safeguards pre-existing stocks of fissile material in civilian use or declared excess for weapons so as to make nuclear-weapons reductions irreversible. This paper discusses the scope of the FMCT, the ability to detect clandestine production and verification challenges in the nuclear-weapons states.

  9. Nuclear Fabrication Consortium

    SciTech Connect

    Levesque, Stephen

    2013-04-05

    This report summarizes the activities undertaken by EWI while under contract from the Department of Energy (DOE) Office of Nuclear Energy (NE) for the management and operation of the Nuclear Fabrication Consortium (NFC). The NFC was established by EWI to independently develop, evaluate, and deploy fabrication approaches and data that support the re-establishment of the U.S. nuclear industry: ensuring that the supply chain will be competitive on a global stage, enabling more cost-effective and reliable nuclear power in a carbon constrained environment. The NFC provided a forum for member original equipment manufacturers (OEM), fabricators, manufacturers, and materials suppliers to effectively engage with each other and rebuild the capacity of this supply chain by: identifying and removing impediments to the implementation of new construction and fabrication techniques and approaches for nuclear equipment, including system components and nuclear plants; providing and facilitating detailed scientific-based studies on new approaches and technologies that will have positive impacts on the cost of building nuclear plants; analyzing and disseminating information about future nuclear fabrication technologies and how they could impact the North American and international nuclear marketplace; facilitating dialog and initiating alignment among fabricators, owners, trade associations, and government agencies; supporting industry in helping to create a larger qualified nuclear supplier network; acting as an unbiased technology resource to evaluate, develop, and demonstrate new manufacturing technologies; creating welder and inspector training programs to help enable the necessary workforce for the upcoming construction work; and serving as a focal point for technology, policy, and politically interested parties to share ideas and concepts associated with fabrication across the nuclear industry. The report presents the objectives and summaries of the Nuclear Fabrication Consortium

  10. Input apparatus for dynamic signature verification systems

    DOEpatents

    EerNisse, Errol P.; Land, Cecil E.; Snelling, Jay B.

    1978-01-01

    The disclosure relates to signature verification input apparatus comprising a writing instrument and platen containing piezoelectric transducers which generate signals in response to writing pressures.

  11. The SeaHorn Verification Framework

    NASA Technical Reports Server (NTRS)

    Gurfinkel, Arie; Kahsai, Temesghen; Komuravelli, Anvesh; Navas, Jorge A.

    2015-01-01

    In this paper, we present SeaHorn, a software verification framework. The key distinguishing feature of SeaHorn is its modular design that separates the concerns of the syntax of the programming language, its operational semantics, and the verification semantics. SeaHorn encompasses several novelties: it (a) encodes verification conditions using an efficient yet precise inter-procedural technique, (b) provides flexibility in the verification semantics to allow different levels of precision, (c) leverages the state-of-the-art in software model checking and abstract interpretation for verification, and (d) uses Horn-clauses as an intermediate language to represent verification conditions which simplifies interfacing with multiple verification tools based on Horn-clauses. SeaHorn provides users with a powerful verification tool and researchers with an extensible and customizable framework for experimenting with new software verification techniques. The effectiveness and scalability of SeaHorn are demonstrated by an extensive experimental evaluation using benchmarks from SV-COMP 2015 and real avionics code.

  12. Automated verification of flight software. User's manual

    NASA Technical Reports Server (NTRS)

    Saib, S. H.

    1982-01-01

    AVFS (Automated Verification of Flight Software), a collection of tools for analyzing source programs written in FORTRAN and AED, is documented. The quality and the reliability of flight software are improved by: (1) indented listings of source programs, (2) static analysis to detect inconsistencies in the use of variables and parameters, (3) automated documentation, (4) instrumentation of source code, (5) retesting guidance, (6) analysis of assertions, (7) symbolic execution, (8) generation of verification conditions, and (9) simplification of verification conditions. Use of AVFS in the verification of flight software is described.

  13. Acceptance sampling methods for sample results verification

    SciTech Connect

    Jesse, C.A.

    1993-06-01

    This report proposes a statistical sampling method for use during the sample results verification portion of the validation of data packages. In particular, this method was derived specifically for the validation of data packages for metals target analyte analysis performed under United States Environmental Protection Agency Contract Laboratory Program protocols, where sample results verification can be quite time consuming. The purpose of such a statistical method is to provide options in addition to the "all or nothing" options that currently exist for sample results verification. The proposed method allows the amount of data validated during the sample results verification process to be based on a balance between risks and the cost of inspection.
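
    The statistical-sampling idea can be sketched with a single-stage attribute acceptance plan: inspect n of the N reported results and accept the package if at most c discrepancies are found. The plan parameters and the probability calculation below are generic textbook quantities, not the specific method derived in the report.

    ```python
    from math import comb

    def accept_probability(lot_size, defective, sample_size, acceptance_number):
        """Probability that a lot with `defective` bad results passes a plan that
        inspects `sample_size` items and accepts on <= `acceptance_number` findings
        (hypergeometric sampling without replacement)."""
        good = lot_size - defective
        total = comb(lot_size, sample_size)
        return sum(
            comb(defective, k) * comb(good, sample_size - k)
            for k in range(0, acceptance_number + 1)
            if k <= defective and sample_size - k <= good
        ) / total

    # Hypothetical plan: 100 reported results, verify 20, accept if no discrepancies are found.
    for bad in (1, 5, 10):
        p = accept_probability(100, bad, 20, 0)
        print(f"{bad:2d} discrepant results -> P(accept) = {p:.2f}")
    ```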

  14. Hydrologic data-verification management program plan

    USGS Publications Warehouse

    Alexander, C.W.

    1982-01-01

    Data verification refers to the performance of quality control on hydrologic data that have been retrieved from the field and are being prepared for dissemination to water-data users. Water-data users now have access to computerized data files containing unpublished, unverified hydrologic data. Therefore, it is necessary to develop techniques and systems whereby the computer can perform some data-verification functions before the data are stored in user-accessible files. Computerized data-verification routines can be developed for this purpose. A single, unified concept describing a master data-verification program using multiple special-purpose subroutines and a screen file containing verification criteria can probably be adapted to any type and size of computer-processing system. Some traditional manual-verification procedures can be adapted for computerized verification, but new procedures can also be developed that would take advantage of the powerful statistical tools and data-handling procedures available to the computer. Prototype data-verification systems should be developed for all three data-processing environments as soon as possible. The WATSTORE system probably affords the greatest opportunity for long-range research and testing of new verification subroutines. (USGS)
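
    One simple form of the computerized screening described above compares each incoming value against station-specific verification criteria held in a "screen file". The field names, station identifier, and limits below are hypothetical, intended only to show the shape of such a check.

    ```python
    # Hypothetical screen file: per-station verification criteria.
    SCREEN_FILE = {
        "station_0421": {"min": 0.0, "max": 1500.0, "max_step": 300.0},  # discharge, ft^3/s
    }

    def screen(station, values):
        """Flag values outside absolute limits or jumping more than max_step between readings."""
        crit = SCREEN_FILE[station]
        flags = []
        for i, v in enumerate(values):
            bad_range = not (crit["min"] <= v <= crit["max"])
            bad_step = i > 0 and abs(v - values[i - 1]) > crit["max_step"]
            if bad_range or bad_step:
                flags.append((i, v))
        return flags

    print(screen("station_0421", [120.0, 135.0, 980.0, 140.0, -5.0]))
    ```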

  15. 40 CFR 1065.550 - Gas analyzer range verification and drift verification.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... Cycles § 1065.550 Gas analyzer range verification and drift verification. (a) Range verification. If an... with a CLD and the removed water is corrected based on measured CO2, CO, THC, and NOX concentrations...-specific emissions over the entire duty cycle for drift. For each constituent to be verified, both sets...

  16. Neutron spectrometry for UF6 enrichment verification in storage cylinders

    DOE PAGES

    Mengesha, Wondwosen; Kiff, Scott D.

    2015-01-29

    Verification of declared UF6 enrichment and mass in storage cylinders is of great interest in nuclear material nonproliferation. Nondestructive assay (NDA) techniques are commonly used for safeguards inspections to ensure accountancy of declared nuclear materials. Common NDA techniques used include gamma-ray spectrometry and both passive and active neutron measurements. In the present study, neutron spectrometry was investigated for verification of UF6 enrichment in 30B storage cylinders based on an unattended and passive measurement approach. MCNP5 and Geant4 simulated neutron spectra, for selected UF6 enrichments and filling profiles, were used in the investigation. The simulated neutron spectra were analyzed using principal component analysis (PCA). The PCA technique is a well-established technique and has a wide area of application including feature analysis, outlier detection, and gamma-ray spectral analysis. Results obtained demonstrate that neutron spectrometry supported by spectral feature analysis has potential for assaying UF6 enrichment in storage cylinders. Thus the results from the present study also showed that difficulties associated with the UF6 filling profile and observed in other unattended passive neutron measurements can possibly be overcome using the approach presented.
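
    A minimal sketch of the spectral feature analysis described above, using NumPy: simulated neutron spectra (rows) are mean-centered and projected onto their leading principal components via the singular value decomposition. The spectra below are random placeholders, not MCNP5 or Geant4 output.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    # Placeholder "spectra": 12 measurements x 64 energy bins (stand-ins for simulated spectra).
    spectra = rng.random((12, 64))

    # PCA via SVD of the mean-centered data matrix.
    centered = spectra - spectra.mean(axis=0)
    U, s, Vt = np.linalg.svd(centered, full_matrices=False)
    scores = centered @ Vt[:2].T          # projection onto the first two principal components
    explained = (s**2) / np.sum(s**2)

    print("variance explained by PC1, PC2:", np.round(explained[:2], 3))
    print("first spectrum's scores:", np.round(scores[0], 3))
    # In an enrichment-verification study, clustering of scores by enrichment level
    # is what would indicate a usable spectral signature.
    ```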

  17. Sea-launched cruise missiles: Technology, missions, & arms control verification

    NASA Astrophysics Data System (ADS)

    Scribner, Richard A.

    1988-12-01

    Sea-launched cruise missiles (SLCMs) present some particularly striking problems for both national security and arms control. These small, dual-purpose, difficult-to-detect weapons present some formidable challenges for verification in any scheme that attempts to limit rather than eliminate them. Conventionally armed SLCMs offer to the navies of both superpowers important offensive and defensive capabilities. Nuclear armed, long-range, land-attack SLCMs, on the other hand, seem to pose destabilizing threats and otherwise have questionable value, despite strong US support for extensive deployment of them. If these weapons are not constrained, their deployment could circumvent gains which might be made in agreements directly reducing strategic nuclear weapons. This paper reviews the technology and planned deployments of SLCMs, the verification schemes which have been discussed and are being investigated to try to deal with the problem, and examines the proposed need for and possible uses of SLCMs. It presents an overview of the problem technically, militarily, and politically.

  18. Ionoacoustics: A new direct method for range verification

    NASA Astrophysics Data System (ADS)

    Parodi, Katia; Assmann, Walter

    2015-05-01

    The superior ballistic properties of ion beams may offer improved tumor-dose conformality and unprecedented sparing of organs at risk in comparison to other radiation modalities in external radiotherapy. However, these advantages come at the expense of increased sensitivity to uncertainties in the actual treatment delivery, resulting from inaccuracies of patient positioning, physiological motion and uncertainties in the knowledge of the ion range in living tissue. In particular, the dosimetric selectivity of ion beams depends on the longitudinal location of the Bragg peak, making in vivo knowledge of the actual beam range the greatest challenge to full clinical exploitation of ion therapy. Nowadays, in vivo range verification techniques, which are already, or close to, being investigated in the clinical practice, rely on the detection of the secondary annihilation photons or prompt gammas resulting from nuclear interaction of the primary ion beam with the irradiated tissue. Despite the initial promising results, these methods rely on a correlation between nuclear and electromagnetic processes that is not straightforward, and they typically require massive and costly instrumentation. On the contrary, the long-known, yet only recently revisited, process of "ionoacoustics", which is generated by local tissue heating especially at the Bragg peak, may offer a more direct approach to in vivo range verification, as reviewed here.

  19. Help for Mental Illnesses

    MedlinePlus

    ... If you or someone you know has a mental illness, there are ways to get help. Use these ... Support Alliance, Mental Health America, National Alliance on Mental Illness ... University or medical school-affiliated programs may offer ...

  20. Can Reading Help?

    ERIC Educational Resources Information Center

    Crowe, Chris

    2003-01-01

    Ponders the effect of September 11th on teenagers. Proposes that reading books can help teenagers sort out complicated issues. Recommends young adult novels that offer hope for overcoming tragedy. Lists 50 short story collections worth reading. (PM)

  1. Grandparents Can Help

    ERIC Educational Resources Information Center

    Pieper, Elizabeth

    1976-01-01

    Although grandparents may have difficulty in accepting their handicapped grandchild due to such factors as the notion of "bad blood," they can be helpful to parents by drawing from their experience to give new perspectives to complex problems. (SB)

  2. Can Reading Help?

    ERIC Educational Resources Information Center

    Crowe, Chris

    2003-01-01

    Ponders the effect of September 11th on teenagers. Proposes that reading books can help teenagers sort out complicated issues. Recommends young adult novels that offer hope for overcoming tragedy. Lists 50 short story collections worth reading. (PM)

  3. Hooked on Helping

    ERIC Educational Resources Information Center

    Longhurst, James; McCord, Joan

    2014-01-01

    In this article, teens presenting at a symposium on peer-helping programs describe how caring for others fosters personal growth and builds positive group cultures. Their individual thoughts and opinions are expressed.

  4. Helping Parents Say No.

    ERIC Educational Resources Information Center

    Duel, Debra K.

    1988-01-01

    Provides some activities that are designed to help students understand some of the reasons why parents sometimes refuse to let their children have pets. Includes mathematics and writing lessons, a student checklist, and a set of tips for parents. (TW)

  5. Help! Where to Look.

    ERIC Educational Resources Information Center

    Kneer, Marian E.

    1984-01-01

    Developing and maintaining effective physical education programs requires that teachers continually update their knowledge and skills. Books and journals, conferences, professional organizations, and consultants provide information to help teachers develop effective programs. (DF)

  6. Hooked on Helping

    ERIC Educational Resources Information Center

    Longhurst, James; McCord, Joan

    2014-01-01

    In this article, teens presenting at a symposium on peer-helping programs describe how caring for others fosters personal growth and builds positive group cultures. Their individual thoughts and opinions are expressed.

  7. Petition Preparation Help

    EPA Pesticide Factsheets

    Preparing a part 75 Petition provides useful information and answers to common questions that will help the designated representative for a unit subject to part 75 prepare and submit a complete petition under §75.66.

  8. Helping Teens Cope.

    ERIC Educational Resources Information Center

    Jones, Jami I.

    2003-01-01

    Considers the role of school library media specialists in helping teens cope with developmental and emotional challenges. Discusses resiliency research, and opportunities to develop programs and services especially for middle school and high school at-risk teens. (LRW)

  9. Container Verification Using Optically Stimulated Luminescence

    SciTech Connect

    Tanner, Jennifer E.; Miller, Steven D.; Conrady, Matthew M.; Simmons, Kevin L.; Tinker, Michael R.

    2008-10-01

    Containment verification is a high priority for safeguards containment and surveillance. Nuclear material containers, safeguards equipment cabinets, camera housings, and detector cable conduit are all vulnerable to tampering. Even with a high security seal on a lid or door, custom-built hinges and interfaces, and special colors and types of finishes, the surfaces of enclosures can be tampered with and any penetrations repaired and covered over. With today’s technology, these repairs would not be detected during a simple visual inspection. Several suggested solutions have been to develop complicated networks of wires, fiber-optic cables, lasers or other sensors that line the inside of a container and alarm when the network is disturbed. This results in an active system with real time evidence of tampering but is probably not practical for most safeguards applications. A more practical solution would be to use a passive approach where an additional security feature was added to surfaces which would consist of a special coating or paint applied to the container or enclosure. One type of coating would incorporate optically stimulated luminescent (OSL) material. OSL materials are phosphors that luminesce in proportion to the ionizing radiation dose when stimulated with the appropriate optical wavelengths. The OSL fluoresces at a very specific wavelength when illuminated at another, very specific wavelength. The presence of the pre-irradiated OSL material in the coating is confirmed using a device that interrogates the surface of the enclosure using the appropriate optical wavelength and then reads the resulting luminescence. The presence of the OSL indicates that the integrity of the surface is intact. The coating itself could be transparent which would allow the appearance of the container to remain unchanged or the OSL material could be incorporated into certain paints or epoxies used on various types of containers. The coating could be applied during manufacturing

  10. Verification of heterogeneous multi-agent system using MCMAS

    NASA Astrophysics Data System (ADS)

    Choi, Jiyoung; Kim, Seungkeun; Tsourdos, Antonios

    2015-03-01

    The focus of the paper is how to model autonomous behaviours of heterogeneous multi-agent systems such that it can be verified that they will always operate within predefined mission requirements and constraints. This is done by using formal methods, with an abstraction of the behaviours for modelling and model checking for their verification. Three case studies are presented to verify the decision-making behaviours of a heterogeneous multi-agent system using a convoy mission scenario. The multi-agent system in the case studies has been extended by gradually increasing the number of agents and the function complexity. For automatic verification, the model checker for multi-agent systems (MCMAS) is adopted due to its novel capability to accommodate the multi-agent system, and it successfully verifies the targeting behaviours of the team-level autonomous systems. The verification results retrospectively help improve the design of the decision-making algorithms by considering additional agents and behaviours during three steps of scenario modification. Consequently, the last scenario deals with the system composed of a ground control system, two unmanned aerial vehicles, and four unmanned ground vehicles with fault-tolerant and communication relay capabilities.

  11. Hailstorms over Switzerland: Verification of Crowd-sourced Data

    NASA Astrophysics Data System (ADS)

    Noti, Pascal-Andreas; Martynov, Andrey; Hering, Alessandro; Martius, Olivia

    2016-04-01

    The reports of smartphone users witnessing hailstorms can be used as a source of independent, ground-based observation data on ground-reaching hailstorms with high temporal and spatial resolution. The presented work focuses on the verification of crowd-sourced data collected over Switzerland with the help of a smartphone application recently developed by MeteoSwiss. The precise location, the time of hail precipitation, and the hailstone size are included in the crowd-sourced data and are assessed on the basis of the weather radar data of MeteoSwiss. Two radar-based hail detection algorithms in use at MeteoSwiss, POH (Probability of Hail) and MESHS (Maximum Expected Severe Hail Size), are confronted with the crowd-sourced data. The available data and the investigation period span June to August 2015. Filter criteria have been applied in order to remove false reports from the crowd-sourced data. Neighborhood methods have been introduced to reduce the uncertainties which result from spatial and temporal biases. The crowd-sourced and radar data are converted into binary sequences according to previously set thresholds, allowing a categorical verification to be used. Verification scores (e.g. hit rate) are then calculated from a 2x2 contingency table. The hail reporting activity and the patterns corresponding to "hail" and "no hail" reports sent from smartphones have been analyzed. The relationship between the reported hailstone sizes and both radar-based hail detection algorithms has been investigated.
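
    The categorical verification step described above can be sketched as follows: the crowd-sourced and radar-based binary sequences are cross-tabulated into a 2x2 contingency table, from which scores such as the hit rate and false alarm ratio follow. The example sequences are invented; the thresholds and filters of the actual study are not reproduced here.

    ```python
    def contingency_scores(observed, forecast):
        """observed / forecast are sequences of 0/1 ("no hail" / "hail")."""
        hits = sum(1 for o, f in zip(observed, forecast) if o == 1 and f == 1)
        misses = sum(1 for o, f in zip(observed, forecast) if o == 1 and f == 0)
        false_alarms = sum(1 for o, f in zip(observed, forecast) if o == 0 and f == 1)
        hit_rate = hits / (hits + misses) if hits + misses else float("nan")
        far = false_alarms / (hits + false_alarms) if hits + false_alarms else float("nan")
        return hit_rate, far

    # Hypothetical binary sequences: crowd-sourced reports (observed) vs. radar algorithm (forecast).
    crowd = [1, 1, 0, 0, 1, 0, 1, 0]
    radar = [1, 0, 0, 1, 1, 0, 1, 0]
    hr, far = contingency_scores(crowd, radar)
    print(f"hit rate = {hr:.2f}, false alarm ratio = {far:.2f}")
    ```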

  12. Why do verification and validation?

    SciTech Connect

    Hu, Kenneth T.; Paez, Thomas L.

    2016-02-19

    In this discussion paper, we explore different ways to assess the value of verification and validation (V&V) of engineering models. We first present a literature review on the value of V&V and then use value chains and decision trees to show how value can be assessed from a decision maker's perspective. In this context, the value is what the decision maker is willing to pay for V&V analysis with the understanding that the V&V results are uncertain. As a result, the 2014 Sandia V&V Challenge Workshop is used to illustrate these ideas.

  13. Approaches to wind resource verification

    NASA Technical Reports Server (NTRS)

    Barchet, W. R.

    1982-01-01

    Verification of the regional wind energy resource assessments produced by the Pacific Northwest Laboratory addresses the question: Is the magnitude of the resource given in the assessments truly representative of the area of interest? Approaches using qualitative indicators of wind speed (tree deformation, eolian features), old and new data of opportunity not at sites specifically chosen for their exposure to the wind, and data by design from locations specifically selected to be good wind sites are described. Data requirements and evaluation procedures for verifying the resource are discussed.

  14. Structural dynamics verification facility study

    NASA Technical Reports Server (NTRS)

    Kiraly, L. J.; Hirchbein, M. S.; Mcaleese, J. M.; Fleming, D. P.

    1981-01-01

    The need for a structural dynamics verification facility to support structures programs was studied. Most of the industry operated facilities are used for highly focused research, component development, and problem solving, and are not used for the generic understanding of the coupled dynamic response of major engine subsystems. Capabilities for the proposed facility include: the ability to both excite and measure coupled structural dynamic response of elastic blades on elastic shafting, the mechanical simulation of various dynamical loadings representative of those seen in operating engines, and the measurement of engine dynamic deflections and interface forces caused by alternative engine mounting configurations and compliances.

  15. Helpful hints to painless payload processing

    NASA Technical Reports Server (NTRS)

    Terhune, Terry; Carson, Maggie

    1995-01-01

    The helpful hints herein describe, from a system perspective, the functional flow of hardware and software. The flow will begin at the experiment development stage and continue through build-up, test, verification, delivery, launch, and deintegration of the experiment. An effort will be made to identify those interfaces and transfer functions of processing that can be improved upon in the new world of 'Faster, Better, and Cheaper.' The documentation necessary to ensure configuration and processing requirements satisfaction will also be discussed. Hints and suggestions for improvements to enhance each phase of the flow will be derived from extensive experience and documented lessons learned. Charts will be utilized to define the functional flow, and a list of 'lessons learned' will be presented to show applicability. In conclusion, specific improvements for several areas of hardware processing, procedure development, and quality assurance that are generic to all Small Payloads will be identified.

  16. U.S. EPA Environmental Technology Verification Program, the Founder of the ETV Concept

    EPA Science Inventory

    The U.S. EPA Environmental Technology Verification (ETV) Program develops test protocols and verifies the performance of innovative technologies that have the potential to improve protection of human health and the environment. The program was created in 1995 to help accelerate t...

  17. 78 FR 45729 - Foreign Supplier Verification Programs for Importers of Food for Humans and Animals

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-07-29

    ...The Food and Drug Administration (FDA) is proposing to adopt regulations on foreign supplier verification programs (FSVPs) for importers of food for humans and animals. The proposed regulations would require importers to help ensure that food imported into the United States is produced in compliance with processes and procedures, including reasonably appropriate risk-based preventive controls, ...

  18. 80 FR 48955 - Pipeline Safety: Public Workshop on Hazardous Liquid Integrity Verification Process

    Federal Register 2010, 2011, 2012, 2013, 2014

    2015-08-14

    ... Verification Process. AGENCY: Pipeline and Hazardous Materials Safety Administration, DOT. ACTION: Notice of ... Process for gas transmission pipelines to help address several mandates in the Pipeline Safety, Regulatory ... [FR Doc No: 2015-20065] DEPARTMENT OF TRANSPORTATION, Pipeline and Hazardous Materials ...

  19. U.S. EPA Environmental Technology Verification Program, the Founder of the ETV Concept

    EPA Science Inventory

    The U.S. EPA Environmental Technology Verification (ETV) Program develops test protocols and verifies the performance of innovative technologies that have the potential to improve protection of human health and the environment. The program was created in 1995 to help accelerate t...

  20. GENERIC VERIFICATION PROTOCOL FOR THE VERIFICATION OF PESTICIDE SPRAY DRIFT REDUCTION TECHNOLOGIES FOR ROW AND FIELD CROPS

    EPA Science Inventory

    This ETV program generic verification protocol was prepared and reviewed for the Verification of Pesticide Drift Reduction Technologies project. The protocol provides a detailed methodology for conducting and reporting results from a verification test of pesticide drift reductio...

  1. Double patterning from design enablement to verification

    NASA Astrophysics Data System (ADS)

    Abercrombie, David; Lacour, Pat; El-Sewefy, Omar; Volkov, Alex; Levine, Evgueni; Arb, Kellen; Reid, Chris; Li, Qiao; Ghosh, Pradiptya

    2011-11-01

    Litho-etch-litho-etch (LELE) is the double patterning (DP) technology of choice for 20 nm contact, via, and lower metal layers. We discuss the unique design and process characteristics of LELE DP, the challenges they present, and various solutions.
    ∘ We examine DP design methodologies, current DP conflict feedback mechanisms, and how they can help designers identify and resolve conflicts.
    ∘ In place and route (P&R), the placement engine must now be aware of the assumptions made during IP cell design, and use placement directives provided by the library designer. We examine the new effects DP introduces in detail routing, discuss how multiple choices of LELE and the cut allowances can lead to different solutions, and describe new capabilities required by detail routers and P&R engines.
    ∘ We discuss why LELE DP cuts and overlaps are critical to optical process correction (OPC), and how a hybrid mechanism of rule- and model-based overlap generation can provide a fast and effective solution.
    ∘ With two litho-etch steps, mask misalignment and image rounding are now verification considerations. We present enhancements to the OPCVerify engine that check for pinching and bridging in the presence of DP overlay errors and acute angles.

  2. Groundwater flow code verification "benchmarking" activity (COVE-2A): Analysis of participants' work

    SciTech Connect

    Dykhuizen, R.C.; Barnard, R.W.

    1992-02-01

    The Nuclear Waste Repository Technology Department at Sandia National Laboratories (SNL) is investigating the suitability of Yucca Mountain as a potential site for underground burial of nuclear wastes. One element of the investigations is to assess the potential long-term effects of groundwater flow on the integrity of a potential repository. A number of computer codes are being used to model groundwater flow through geologic media in which the potential repository would be located. These codes compute numerical solutions for problems that are usually analytically intractable. Consequently, independent confirmation of the correctness of the solution is often not possible. Code verification is a process that permits the determination of the numerical accuracy of codes by comparing the results of several numerical solutions for the same problem. The international nuclear waste research community uses benchmarking for intercomparisons that partially satisfy the Nuclear Regulatory Commission (NRC) definition of code verification. This report presents the results from the COVE-2A (Code Verification) project, which is a subset of the COVE project.

  3. IMPROVING AIR QUALITY THROUGH ENVIRONMENTAL TECHNOLOGY VERIFICATIONS

    EPA Science Inventory

    The U.S. Environmental Protection Agency (EPA) began the Environmental Technology Verification (ETV) Program in 1995 as a means of working with the private sector to establish a market-based verification process available to all environmental technologies. Under EPA's Office of R...

  4. IMPROVING AIR QUALITY THROUGH ENVIRONMENTAL TECHNOLOGY VERIFICATIONS

    EPA Science Inventory

    The U.S. Environmental Protection Agency (EPA) began the Environmental Technology Verification (ETV) Program in 1995 as a means of working with the private sector to establish a market-based verification process available to all environmental technologies. Under EPA's Office of R...

  5. Gender Verification of Female Olympic Athletes.

    ERIC Educational Resources Information Center

    Dickinson, Barry D.; Genel, Myron; Robinowitz, Carolyn B.; Turner, Patricia L.; Woods, Gary L.

    2002-01-01

    Gender verification of female athletes has long been criticized by geneticists, endocrinologists, and others in the medical community. Recently, the International Olympic Committee's Athletic Commission called for discontinuation of mandatory laboratory-based gender verification of female athletes. This article discusses normal sexual…

  6. 47 CFR 2.902 - Verification.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... 47 Telecommunication 1 2013-10-01 2013-10-01 false Verification. 2.902 Section 2.902 Telecommunication FEDERAL COMMUNICATIONS COMMISSION GENERAL FREQUENCY ALLOCATIONS AND RADIO TREATY MATTERS; GENERAL RULES AND REGULATIONS Equipment Authorization Procedures General Provisions § 2.902 Verification....

  7. 47 CFR 2.902 - Verification.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... 47 Telecommunication 1 2012-10-01 2012-10-01 false Verification. 2.902 Section 2.902 Telecommunication FEDERAL COMMUNICATIONS COMMISSION GENERAL FREQUENCY ALLOCATIONS AND RADIO TREATY MATTERS; GENERAL RULES AND REGULATIONS Equipment Authorization Procedures General Provisions § 2.902 Verification....

  8. 47 CFR 2.902 - Verification.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... 47 Telecommunication 1 2014-10-01 2014-10-01 false Verification. 2.902 Section 2.902 Telecommunication FEDERAL COMMUNICATIONS COMMISSION GENERAL FREQUENCY ALLOCATIONS AND RADIO TREATY MATTERS; GENERAL RULES AND REGULATIONS Equipment Authorization Procedures General Provisions § 2.902 Verification....

  9. 47 CFR 2.902 - Verification.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 47 Telecommunication 1 2010-10-01 2010-10-01 false Verification. 2.902 Section 2.902 Telecommunication FEDERAL COMMUNICATIONS COMMISSION GENERAL FREQUENCY ALLOCATIONS AND RADIO TREATY MATTERS; GENERAL RULES AND REGULATIONS Equipment Authorization Procedures General Provisions § 2.902 Verification....

  10. 47 CFR 2.902 - Verification.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... 47 Telecommunication 1 2011-10-01 2011-10-01 false Verification. 2.902 Section 2.902 Telecommunication FEDERAL COMMUNICATIONS COMMISSION GENERAL FREQUENCY ALLOCATIONS AND RADIO TREATY MATTERS; GENERAL RULES AND REGULATIONS Equipment Authorization Procedures General Provisions § 2.902 Verification....

  11. 14 CFR 460.17 - Verification program.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... a flight. Verification must include flight testing. ... 14 Aeronautics and Space 4 2010-01-01 2010-01-01 false Verification program. 460.17 Section 460.17... TRANSPORTATION LICENSING HUMAN SPACE FLIGHT REQUIREMENTS Launch and Reentry with Crew § 460.17...

  12. ENVIRONMENTAL TECHNOLOGY VERIFICATION FOR INDOOR AIR PRODUCTS

    EPA Science Inventory

    The paper discusses environmental technology verification (ETV) for indoor air products. RTI is developing the framework for a verification testing program for indoor air products, as part of EPA's ETV program. RTI is establishing test protocols for products that fit into three...

  13. Fingerprint verification prediction model in hand dermatitis.

    PubMed

    Lee, Chew K; Chang, Choong C; Johor, Asmah; Othman, Puwira; Baba, Roshidah

    2015-07-01

    Hand dermatitis-associated fingerprint changes are a significant problem and affect fingerprint verification processes. This study was done to develop a clinically useful prediction model for fingerprint verification in patients with hand dermatitis. A case-control study involving 100 patients with hand dermatitis was conducted. All patients verified their thumbprints against their identity card. Registered fingerprints were randomized into a model derivation and a model validation group. The predictive model was derived using multiple logistic regression. Validation was done using the goodness-of-fit test. The fingerprint verification prediction model consists of a major criterion (fingerprint dystrophy area of ≥ 25%) and two minor criteria (long horizontal lines and long vertical lines). The presence of the major criterion predicts that verification will almost always fail, while the presence of both minor criteria and the presence of one minor criterion predict high and low risk of fingerprint verification failure, respectively. When none of the criteria are met, the fingerprint almost always passes the verification. The area under the receiver operating characteristic curve was 0.937, and the goodness-of-fit test showed agreement between the observed and expected number (P = 0.26). The derived fingerprint verification failure prediction model is validated and highly discriminatory in predicting risk of fingerprint verification failure in patients with hand dermatitis. © 2014 The International Society of Dermatology.
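
    The decision rule reported in the abstract can be written down directly; the sketch below encodes it as a small function. The risk labels paraphrase the published criteria, and the parameter names are illustrative rather than taken from the paper's scoring sheet.

    ```python
    def verification_failure_risk(dystrophy_area_pct, long_horizontal_lines, long_vertical_lines):
        """Apply the reported criteria: one major criterion (dystrophy area >= 25%)
        and two minor criteria (long horizontal lines, long vertical lines)."""
        if dystrophy_area_pct >= 25.0:
            return "almost always fails verification"   # major criterion met
        minors = int(bool(long_horizontal_lines)) + int(bool(long_vertical_lines))
        if minors == 2:
            return "high risk of verification failure"
        if minors == 1:
            return "low risk of verification failure"
        return "almost always passes verification"

    print(verification_failure_risk(30.0, False, False))
    print(verification_failure_risk(10.0, True, True))
    print(verification_failure_risk(5.0, False, False))
    ```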

  14. ENVIRONMENTAL TECHNOLOGY VERIFICATION FOR INDOOR AIR PRODUCTS

    EPA Science Inventory

    The paper discusses environmental technology verification (ETV) for indoor air products. RTI is developing the framework for a verification testing program for indoor air products, as part of EPA's ETV program. RTI is establishing test protocols for products that fit into three...

  15. 40 CFR 1066.220 - Linearity verification.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... 40 Protection of Environment 34 2013-07-01 2013-07-01 false Linearity verification. 1066.220 Section 1066.220 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) AIR POLLUTION CONTROLS VEHICLE-TESTING PROCEDURES Dynamometer Specifications § 1066.220 Linearity verification. (a) Scope...

  16. 18 CFR 158.5 - Verification.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... 18 Conservation of Power and Water Resources 1 2011-04-01 2011-04-01 false Verification. 158.5 Section 158.5 Conservation of Power and Water Resources FEDERAL ENERGY REGULATORY COMMISSION, DEPARTMENT....5 Verification. The facts stated in the memorandum must be sworn to by persons having...

  17. 18 CFR 158.5 - Verification.

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... 18 Conservation of Power and Water Resources 1 2012-04-01 2012-04-01 false Verification. 158.5 Section 158.5 Conservation of Power and Water Resources FEDERAL ENERGY REGULATORY COMMISSION, DEPARTMENT....5 Verification. The facts stated in the memorandum must be sworn to by persons having...

  18. 18 CFR 286.107 - Verification.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... 18 Conservation of Power and Water Resources 1 2011-04-01 2011-04-01 false Verification. 286.107 Section 286.107 Conservation of Power and Water Resources FEDERAL ENERGY REGULATORY COMMISSION, DEPARTMENT... Contested Audit Findings and Proposed Remedies § 286.107 Verification. The facts stated in the...

  19. 18 CFR 41.5 - Verification.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... 18 Conservation of Power and Water Resources 1 2011-04-01 2011-04-01 false Verification. 41.5 Section 41.5 Conservation of Power and Water Resources FEDERAL ENERGY REGULATORY COMMISSION, DEPARTMENT OF....5 Verification. The facts stated in the memorandum must be sworn to by persons having...

  20. 18 CFR 349.5 - Verification.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... 18 Conservation of Power and Water Resources 1 2011-04-01 2011-04-01 false Verification. 349.5 Section 349.5 Conservation of Power and Water Resources FEDERAL ENERGY REGULATORY COMMISSION, DEPARTMENT... PROPOSED REMEDIES § 349.5 Verification. The facts stated in the memorandum must be sworn to by...

  1. 18 CFR 286.107 - Verification.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... 18 Conservation of Power and Water Resources 1 2010-04-01 2010-04-01 false Verification. 286.107 Section 286.107 Conservation of Power and Water Resources FEDERAL ENERGY REGULATORY COMMISSION, DEPARTMENT... Contested Audit Findings and Proposed Remedies § 286.107 Verification. The facts stated in the...

  2. 18 CFR 41.5 - Verification.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... 18 Conservation of Power and Water Resources 1 2010-04-01 2010-04-01 false Verification. 41.5 Section 41.5 Conservation of Power and Water Resources FEDERAL ENERGY REGULATORY COMMISSION, DEPARTMENT OF....5 Verification. The facts stated in the memorandum must be sworn to by persons having...

  3. 18 CFR 41.5 - Verification.

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... 18 Conservation of Power and Water Resources 1 2012-04-01 2012-04-01 false Verification. 41.5 Section 41.5 Conservation of Power and Water Resources FEDERAL ENERGY REGULATORY COMMISSION, DEPARTMENT OF....5 Verification. The facts stated in the memorandum must be sworn to by persons having...

  4. 18 CFR 349.5 - Verification.

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... 18 Conservation of Power and Water Resources 1 2012-04-01 2012-04-01 false Verification. 349.5 Section 349.5 Conservation of Power and Water Resources FEDERAL ENERGY REGULATORY COMMISSION, DEPARTMENT... PROPOSED REMEDIES § 349.5 Verification. The facts stated in the memorandum must be sworn to by...

  5. 18 CFR 158.5 - Verification.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... 18 Conservation of Power and Water Resources 1 2010-04-01 2010-04-01 false Verification. 158.5 Section 158.5 Conservation of Power and Water Resources FEDERAL ENERGY REGULATORY COMMISSION, DEPARTMENT....5 Verification. The facts stated in the memorandum must be sworn to by persons having...

  6. 18 CFR 349.5 - Verification.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... 18 Conservation of Power and Water Resources 1 2010-04-01 2010-04-01 false Verification. 349.5 Section 349.5 Conservation of Power and Water Resources FEDERAL ENERGY REGULATORY COMMISSION, DEPARTMENT... PROPOSED REMEDIES § 349.5 Verification. The facts stated in the memorandum must be sworn to by...

  7. 14 CFR 460.17 - Verification program.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... TRANSPORTATION LICENSING HUMAN SPACE FLIGHT REQUIREMENTS Launch and Reentry with Crew § 460.17 Verification... software in an operational flight environment before allowing any space flight participant on board during... 14 Aeronautics and Space 4 2012-01-01 2012-01-01 false Verification program. 460.17 Section...

  8. 14 CFR 460.17 - Verification program.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... TRANSPORTATION LICENSING HUMAN SPACE FLIGHT REQUIREMENTS Launch and Reentry with Crew § 460.17 Verification... software in an operational flight environment before allowing any space flight participant on board during... 14 Aeronautics and Space 4 2014-01-01 2014-01-01 false Verification program. 460.17 Section...

  9. 14 CFR 460.17 - Verification program.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... TRANSPORTATION LICENSING HUMAN SPACE FLIGHT REQUIREMENTS Launch and Reentry with Crew § 460.17 Verification... software in an operational flight environment before allowing any space flight participant on board during... 14 Aeronautics and Space 4 2013-01-01 2013-01-01 false Verification program. 460.17 Section...

  10. Identity Verification, Control, and Aggression in Marriage

    ERIC Educational Resources Information Center

    Stets, Jan E.; Burke, Peter J.

    2005-01-01

    In this research we study the identity verification process and its effects in marriage. Drawing on identity control theory, we hypothesize that a lack of verification in the spouse identity (1) threatens stable self-meanings and interaction patterns between spouses, and (2) challenges a (nonverified) spouse's perception of control over the…

  11. CFE verification: The decision to inspect

    SciTech Connect

    Allentuck, J.

    1990-01-01

    Verification of compliance with the provisions of the treaty on Conventional Forces-Europe (CFE) is subject to inspection quotas of various kinds. Thus the decision to carry out a specific inspection or verification activity must be prudently made. This decision process is outlined, and means for conserving quotas are suggested. 4 refs., 1 fig.

  12. 24 CFR 4001.112 - Income verification.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... 24 Housing and Urban Development 5 2010-04-01 2010-04-01 false Income verification. 4001.112... Requirements and Underwriting Procedures § 4001.112 Income verification. The mortgagee shall use FHA's procedures to verify the mortgagor's income and shall comply with the following additional requirements:...

  13. 24 CFR 4001.112 - Income verification.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... 24 Housing and Urban Development 5 2011-04-01 2011-04-01 false Income verification. 4001.112... Requirements and Underwriting Procedures § 4001.112 Income verification. The mortgagee shall use FHA's procedures to verify the mortgagor's income and shall comply with the following additional requirements:...

  14. 18 CFR 34.8 - Verification.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... 18 Conservation of Power and Water Resources 1 2010-04-01 2010-04-01 false Verification. 34.8 Section 34.8 Conservation of Power and Water Resources FEDERAL ENERGY REGULATORY COMMISSION, DEPARTMENT OF... SECURITIES OR THE ASSUMPTION OF LIABILITIES § 34.8 Verification. Link to an amendment published at 70 FR...

  15. ENVIRONMENTAL TECHNOLOGY VERIFICATION AND INDOOR AIR

    EPA Science Inventory

    The paper discusses environmental technology verification and indoor air. RTI has responsibility for a pilot program for indoor air products as part of the U.S. EPA's Environmental Technology Verification (ETV) program. The program objective is to further the development of sel...

  16. Guidelines for qualifying cleaning and verification materials

    NASA Technical Reports Server (NTRS)

    Webb, D.

    1995-01-01

    This document is intended to provide guidance in identifying technical issues which must be addressed in a comprehensive qualification plan for materials used in cleaning and cleanliness verification processes. Information presented herein is intended to facilitate development of a definitive checklist that should address all pertinent materials issues when down-selecting cleaning/verification media.

  18. Video-based fingerprint verification.

    PubMed

    Qin, Wei; Yin, Yilong; Liu, Lili

    2013-09-04

    Conventional fingerprint verification systems use only static information. In this paper, fingerprint videos, which contain dynamic information, are utilized for verification. Fingerprint videos are acquired by the same capture device that acquires conventional fingerprint images, and the user experience of providing a fingerprint video is the same as that of providing a single impression. After preprocessing and aligning processes, "inside similarity" and "outside similarity" are defined and calculated to take advantage of both dynamic and static information contained in fingerprint videos. Match scores between two matching fingerprint videos are then calculated by combining the two kinds of similarity. Experimental results show that the proposed video-based method leads to a relative reduction of 60 percent in the equal error rate (EER) in comparison to the conventional single impression-based method. We also analyze the time complexity of our method when different combinations of strategies are used. Our method still outperforms the conventional method, even if both methods have the same time complexity. Finally, experimental results demonstrate that the proposed video-based method can lead to better accuracy than the multiple impressions fusion method, and the proposed method has a much lower false acceptance rate (FAR) when the false rejection rate (FRR) is quite low.
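
    The fusion step can be illustrated with a minimal sketch. The paper's actual definitions of "inside similarity" and "outside similarity" and its fusion rule are not reproduced here; the weighted sum, weight, and acceptance threshold below are assumptions.

      # Illustrative fusion of two similarity scores into a match score.
      # The weight and acceptance threshold are assumptions, not the paper's values.
      def match_score(inside_similarity, outside_similarity, w_inside=0.5):
          return w_inside * inside_similarity + (1.0 - w_inside) * outside_similarity

      def accept(inside_similarity, outside_similarity, threshold=0.6):
          return match_score(inside_similarity, outside_similarity) >= threshold

      print(accept(0.72, 0.65))   # True under the assumed weight and threshold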

  19. Quantitative reactive modeling and verification.

    PubMed

    Henzinger, Thomas A

    Formal verification aims to improve the quality of software by detecting errors before they do harm. At the basis of formal verification is the logical notion of correctness, which purports to capture whether or not a program behaves as desired. We suggest that the boolean partition of software into correct and incorrect programs falls short of the practical need to assess the behavior of software in a more nuanced fashion against multiple criteria. We therefore propose to introduce quantitative fitness measures for programs, specifically for measuring the function, performance, and robustness of reactive programs such as concurrent processes. This article describes the goals of the ERC Advanced Investigator Project QUAREM. The project aims to build and evaluate a theory of quantitative fitness measures for reactive models. Such a theory must strive to obtain quantitative generalizations of the paradigms that have been success stories in qualitative reactive modeling, such as compositionality, property-preserving abstraction and abstraction refinement, model checking, and synthesis. The theory will be evaluated not only in the context of software and hardware engineering, but also in the context of systems biology. In particular, we will use the quantitative reactive models and fitness measures developed in this project for testing hypotheses about the mechanisms behind data from biological experiments.

  20. Video-Based Fingerprint Verification

    PubMed Central

    Qin, Wei; Yin, Yilong; Liu, Lili

    2013-01-01

    Conventional fingerprint verification systems use only static information. In this paper, fingerprint videos, which contain dynamic information, are utilized for verification. Fingerprint videos are acquired by the same capture device that acquires conventional fingerprint images, and the user experience of providing a fingerprint video is the same as that of providing a single impression. After preprocessing and aligning processes, “inside similarity” and “outside similarity” are defined and calculated to take advantage of both dynamic and static information contained in fingerprint videos. Match scores between two matching fingerprint videos are then calculated by combining the two kinds of similarity. Experimental results show that the proposed video-based method leads to a relative reduction of 60 percent in the equal error rate (EER) in comparison to the conventional single impression-based method. We also analyze the time complexity of our method when different combinations of strategies are used. Our method still outperforms the conventional method, even if both methods have the same time complexity. Finally, experimental results demonstrate that the proposed video-based method can lead to better accuracy than the multiple impressions fusion method, and the proposed method has a much lower false acceptance rate (FAR) when the false rejection rate (FRR) is quite low. PMID:24008283

  1. Runtime Verification with State Estimation

    NASA Technical Reports Server (NTRS)

    Stoller, Scott D.; Bartocci, Ezio; Seyster, Justin; Grosu, Radu; Havelund, Klaus; Smolka, Scott A.; Zadok, Erez

    2011-01-01

    We introduce the concept of Runtime Verification with State Estimation and show how this concept can be applied to estimate the probability that a temporal property is satisfied by a run of a program when monitoring overhead is reduced by sampling. In such situations, there may be gaps in the observed program executions, thus making accurate estimation challenging. To deal with the effects of sampling on runtime verification, we view event sequences as observation sequences of a Hidden Markov Model (HMM), use an HMM model of the monitored program to "fill in" sampling-induced gaps in observation sequences, and extend the classic forward algorithm for HMM state estimation (which determines the probability of a state sequence, given an observation sequence) to compute the probability that the property is satisfied by an execution of the program. To validate our approach, we present a case study based on the mission software for a Mars rover. The results of our case study demonstrate high prediction accuracy for the probabilities computed by our algorithm. They also show that our technique is much more accurate than simply evaluating the temporal property on the given observation sequences, ignoring the gaps.
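
    The classic forward algorithm that the abstract builds on is compact enough to sketch. The extension to gapped observation sequences and property probabilities is the paper's contribution and is not reproduced here; the toy model parameters below are assumptions.

      import numpy as np

      # Standard HMM forward algorithm: probability of an observation sequence.
      def forward(pi, A, B, obs):
          """pi: initial state probs (N,), A: transitions (N,N),
          B: emissions (N,M), obs: list of observation symbol indices."""
          alpha = pi * B[:, obs[0]]                 # alpha_1(i) = pi_i * b_i(o_1)
          for o in obs[1:]:
              alpha = (alpha @ A) * B[:, o]         # alpha_t = (alpha_{t-1} A) .* b(o_t)
          return alpha.sum()

      # Toy two-state model; the numbers are illustrative, not from the case study.
      pi = np.array([0.6, 0.4])
      A  = np.array([[0.7, 0.3], [0.2, 0.8]])
      B  = np.array([[0.9, 0.1], [0.3, 0.7]])
      print(forward(pi, A, B, [0, 1, 0]))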

  2. Cognitive Bias in Systems Verification

    NASA Technical Reports Server (NTRS)

    Larson, Steve

    2012-01-01

    Working definition of cognitive bias: Patterns by which information is sought and interpreted that can lead to systematic errors in decisions. Cognitive bias is used in diverse fields: Economics, Politics, Intelligence, Marketing, to name a few. Attempts to ground cognitive science in physical characteristics of the cognitive apparatus exceed our knowledge. Studies based on correlations; strict cause and effect is difficult to pinpoint. Effects cited in the paper and discussed here have been replicated many times over, and appear sound. Many biases have been described, but it is still unclear whether they are all distinct. There may only be a handful of fundamental biases, which manifest in various ways. Bias can affect system verification in many ways. Overconfidence -> Questionable decisions to deploy. Availability -> Inability to conceive critical tests. Representativeness -> Overinterpretation of results. Positive Test Strategies -> Confirmation bias. Debiasing at individual level very difficult. The potential effect of bias on the verification process can be managed, but not eliminated. Worth considering at key points in the process.

  3. Modular verification of concurrent systems

    SciTech Connect

    Sobel, A.E.K.

    1986-01-01

    During the last ten years, a number of authors have proposed verification techniques that allow one to prove properties of individual processes by using global assumptions about the behavior of the remaining processes in the distributed program. As a result, one must justify these global assumptions before drawing any conclusions regarding the correctness of the entire program. This justification is often the most difficult part of the proof and presents a serious obstacle to hierarchical program development. This thesis develops a new approach to the verification of concurrent systems. The approach is modular and supports compositional development of programs since the proofs of each individual process of a program are completely isolated from all others. The generality of this approach is illustrated by applying it to a representative set of contemporary concurrent programming languages, namely: CSP, ADA, Distributed Processes, and a shared variable language. In addition, it is also shown how the approach may be used to deal with a number of other constructs that have been proposed for inclusion in concurrent languages: FORK and JOIN primitives, nested monitor calls, path expressions, atomic transactions, and asynchronous message passing. These results support the argument that the approach is universal and can be used to design proof systems for any concurrent language.

  4. Nuclear Medicine.

    ERIC Educational Resources Information Center

    Badawi, Ramsey D.

    2001-01-01

    Describes the use of nuclear medicine techniques in diagnosis and therapy. Describes instrumentation in diagnostic nuclear medicine and predicts future trends in nuclear medicine imaging technology. (Author/MM)

  5. Nuclear ventriculography

    MedlinePlus

    ... ventriculography (RNV); Multiple gate acquisition scan (MUGA); Nuclear cardiology; Cardiomyopathy - nuclear ventriculography ... 56. Udelson JE, Dilsizian V, Bonow RO. Nuclear cardiology. In: Bonow RO, Mann DL, Zipes DP, Libby ...

  7. Helping patients stop smoking.

    PubMed

    Ferentz, K S; Valente, C M

    1994-01-01

    As more patients seek treatment for nicotine addiction, physicians must become adept at counseling patients on how to quit. Several simple behavioral modification techniques are available to help patients stop smoking, and these techniques can be incorporated into any busy practice. Any patient encounter can be used to inform patients of the dangers of smoking and to tell them to quit. Patients can be offered nicotine replacement therapy, although the long-term benefit is still unknown. Helping patients to quit is a rewarding process.

  8. Hierarchical Design and Verification for VLSI

    NASA Technical Reports Server (NTRS)

    Shostak, R. E.; Elliott, W. D.; Levitt, K. N.

    1983-01-01

    The specification and verification work is described in detail, and some of the problems and issues to be resolved in their application to Very Large Scale Integration (VLSI) systems are examined. The hierarchical design methodologies enable a system architect or design team to decompose a complex design into a formal hierarchy of levels of abstraction. The first step in program verification is tree formation. The next step after tree formation is the generation from the trees of the verification conditions themselves. The approach taken here is similar in spirit to the corresponding step in program verification but requires modeling of the semantics of circuit elements rather than program statements. The last step is that of proving the verification conditions using a mechanical theorem-prover.

  9. Space transportation system payload interface verification

    NASA Technical Reports Server (NTRS)

    Everline, R. T.

    1977-01-01

    The paper considers STS payload-interface verification requirements and the capability provided by STS to support verification. The intent is to standardize as many interfaces as possible, not only through the design, development, test and evaluation (DDT and E) phase of the major payload carriers but also into the operational phase. The verification process is discussed in terms of its various elements, such as the Space Shuttle DDT and E (including the orbital flight test program) and the major payload carriers DDT and E (including the first flights). Five tools derived from the Space Shuttle DDT and E are available to support the verification process: mathematical (structural and thermal) models, the Shuttle Avionics Integration Laboratory, the Shuttle Manipulator Development Facility, and interface-verification equipment (cargo-integration test equipment).

  10. Nuclear data for nuclear transmutation

    SciTech Connect

    Harada, Hideo

    2009-05-04

    The current status of nuclear data for the study of nuclear transmutation of radioactive wastes is reviewed, focusing mainly on neutron capture reactions. It is stressed that highest-precision frontier research in nuclear data measurements is key to satisfying the target accuracies requested of the nuclear data needed to realize nuclear transmutation.

  11. Nuclear data for nuclear transmutation

    NASA Astrophysics Data System (ADS)

    Harada, Hideo

    2009-05-01

    The current status of nuclear data for the study of nuclear transmutation of radioactive wastes is reviewed, focusing mainly on neutron capture reactions. It is stressed that highest-precision frontier research in nuclear data measurements is key to satisfying the target accuracies requested of the nuclear data needed to realize nuclear transmutation.

  12. Nuclear Cryogenic Propulsion Stage

    NASA Technical Reports Server (NTRS)

    Houts, Michael G.; Borowski, S. K.; George, J. A.; Kim, T.; Emrich, W. J.; Hickman, R. R.; Broadway, J. W.; Gerrish, H. P.; Adams, R. B.

    2012-01-01

    The fundamental capability of Nuclear Thermal Propulsion (NTP) is game changing for space exploration. A first generation Nuclear Cryogenic Propulsion Stage (NCPS) based on NTP could provide high thrust at a specific impulse above 900 s, roughly double that of state of the art chemical engines. Characteristics of fission and NTP indicate that useful first generation systems will provide a foundation for future systems with extremely high performance. The role of the NCPS in the development of advanced nuclear propulsion systems could be analogous to the role of the DC-3 in the development of advanced aviation. Progress made under the NCPS project could help enable both advanced NTP and advanced NEP.
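
    The practical effect of roughly doubling specific impulse can be seen with the rocket equation. The sketch below is illustrative arithmetic, not from the abstract; the 4 km/s delta-v figure is an assumption chosen only to show the trend.

      import math

      # Tsiolkovsky rocket equation: propellant needed for a fixed delta-v
      # at chemical-class Isp (~450 s) versus NTP-class Isp (~900 s).
      g0 = 9.80665            # m/s^2
      delta_v = 4000.0        # m/s, assumed representative burn

      for isp in (450.0, 900.0):
          mass_ratio = math.exp(delta_v / (g0 * isp))       # m_initial / m_final
          prop_fraction = 1.0 - 1.0 / mass_ratio
          print(f"Isp {isp:.0f} s: mass ratio {mass_ratio:.2f}, "
                f"propellant fraction {prop_fraction:.0%}")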

  13. JPRS Report Nuclear Developments

    DTIC Science & Technology

    2007-11-02

    Contents include: "Questions Remain Open in Mol Nuclear Scandal" [Michel Balthasart; Brussels LE VIF/L'EXPRESS, 25-31 Mar 88]; FRANCE: "Nuclear Fuel Cycle Industry at End of 1987" [M. Barron; Paris REVUE GENERALE NUCLEAIRE, Jan-Feb 88]; TURKEY: "Argentina To Help Acquire 'Nuclear Technology'" [Atilla Atakan].

  14. What Helps Us Learn?

    ERIC Educational Resources Information Center

    Educational Leadership, 2010

    2010-01-01

    This article presents comments of high school students at the Howard Gardner School in Alexandria, Virginia, who were asked, What should teachers know about students to help them learn? Twelve high school students from the Howard Gardner School in Alexandria, Virginia, describe how their best teachers get to know them and thus were more able to…

  15. Stretching: Does It Help?

    ERIC Educational Resources Information Center

    Vardiman, Phillip; Carrand, David; Gallagher, Philip M.

    2010-01-01

    Stretching prior to activity is universally accepted as an important way to improve performance and help prevent injury. Likewise, limited flexibility has been shown to decrease functional ability and predispose a person to injuries. Although this is commonly accepted, appropriate stretching for children and adolescents involved with sports and…

  17. Helping You Age Well

    MedlinePlus

    ... to keep family relationships and friendships over time. Exercise can also help prevent depression or lift your mood. Stay active and involved in life. Talk to your physician if you are feeling depressed. Teeth & ... Lungs: Regular aerobic exercise keeps lung capacity up. Smoking leads to chronic ...

  19. Ayudele! [Help Him!].

    ERIC Educational Resources Information Center

    Spencer, Maria Gutierrez, Comp.; Almance, Sofia, Comp.

    Written in Spanish and English, the booklet briefly discusses what parents can do to help their child learn at school. The booklet briefly notes the importance of getting enough sleep; eating breakfast; praising the child; developing the five senses; visiting the doctor; having a home and garden; talking, listening, and reading to the child;…

  20. Helping Students Avoid Plagiarism.

    ERIC Educational Resources Information Center

    Wilhoit, Stephen

    1994-01-01

    Discusses how and why college students commit plagiarism, suggesting techniques that instructors can use to help students avoid plagiarism. Instructors should define and discuss plagiarism thoroughly; discuss hypothetical cases; review the conventions of quoting and documenting material; require multiple drafts of essays; and offer responses…

  1. Helping Families Cope.

    ERIC Educational Resources Information Center

    Goodman, Carol R.

    The paper presents observations of families having adult members with learning disabilities and describes a residential program to facilitate the transition to independent living of lower functioning learning disabled young adults. The program, called Independence Center, involves placing participants in apartments with roommates and helping them…

  2. Helping Teachers Communicate

    ERIC Educational Resources Information Center

    Kise, Jane; Russell, Beth; Shumate, Carol

    2008-01-01

    Personality type theory describes normal differences in how people are energized, take in information, make decisions, and approach work and life--all key elements in how people teach and learn. Understanding one another's personality type preferences helps teachers share their instructional strategies and classroom information. Type theory…

  3. Helping Teachers Develop

    ERIC Educational Resources Information Center

    Bubb, Sara

    2005-01-01

    It is fashionable to say that teaching can be the most rewarding profession there is, and it can be. Most teachers can give examples of the pleasure of helping a child grow in knowledge and understanding, and achieve their potential. But what about the teacher? They shouldn't be excluded from the benefits of lifelong learning because of their…

  4. Helping Adults to Spell.

    ERIC Educational Resources Information Center

    Moorhouse, Catherine

    This book presents a range of strategies for adult literacy tutors and offers a wealth of practical advice on teaching spelling within the context of writing. Chapters 1-3 offer basic information on talking with the student about spelling, finding out how the student spells and helping the student to see himself/herself as a "good" speller, and…

  6. Help for Stressed Students

    ERIC Educational Resources Information Center

    Pope, Denise Clarke; Simon, Richard

    2005-01-01

    The authors argue that increased focus and pressure for high academic achievement, particularly among more highly-motivated and successful students, may have serious negative consequences. They present a number of strategies designed to help reduce both causes and consequences associated with academic stress and improve students' mental and…

  7. Helping Perceptually Handicapped Children

    ERIC Educational Resources Information Center

    Rose, Helen S.

    1974-01-01

    Five children diagnosed as having a perceptual problem as revealed by the Bender Visual Motor Gestalt Test received special tutoring to help develop their visual discrimination abilities. The six-week program for teaching the concept of shapes employed kinesthetic, visual, tactile, and verbal processes. (CS)

  8. A Helping Hand.

    ERIC Educational Resources Information Center

    Renner, Jason M.

    2000-01-01

    Discusses how designing a hand washing-friendly environment can help to reduce the spread of germs in school restrooms. Use of electronic faucets, surface risk management, traffic flow, and user-friendly hand washing systems that are convenient and maximally hygienic are examined. (GR)

  9. Helping, Manipulation, and Magic

    ERIC Educational Resources Information Center

    Frey, Louise A.; Edinburg, Golda M.

    1978-01-01

    The thesis of this article is that an understanding of the primitive origins of the helping process in myth, magic, and ritual may prevent social workers from engaging in practices that negate their clients' ability to work out their own solutions to problems. (Author)

  10. Self-Help Experiences

    ERIC Educational Resources Information Center

    Woody, Robert H.

    1973-01-01

    The author believes that there is a distinct need for professionals to become competent in providing materials for self-help lay efforts. Colleges and universities must provide for the facilitation of personal growth through self administered procedures by either a clinical approach (in counseling centers) or a didactic one (in classes as, for…

  11. Help Teens Manage Diabetes

    MedlinePlus

    ... Grey, dean of the Yale University School of Nursing, developed and tested a program called Coping Skills Training (CST) as a part of routine diabetes ... is to improve diabetic teens' coping and communication skills, healthy ... sugar levels. "Nursing research is about helping people deal with the ...

  15. Guidelines for the verification and validation of expert system software and conventional software: Evaluation of knowledge base certification methods. Volume 4

    SciTech Connect

    Miller, L.A.; Hayes, J.E.; Mirsky, S.M.

    1995-03-01

    This report presents the results of the Knowledge Base Certification activity of the expert systems verification and validation (V&V) guideline development project which is jointly funded by the US Nuclear Regulatory Commission and the Electric Power Research Institute. The ultimate objective is the formulation of guidelines for the V&V of expert systems for use in nuclear power applications. This activity is concerned with the development and testing of various methods for assuring the quality of knowledge bases. The testing procedure used was that of a behavioral experiment, the first known such evaluation of any type of V&V activity. The value of such experimentation is its capability to provide empirical evidence for -- or against -- the effectiveness of plausible methods in helping people find problems in knowledge bases. The three-day experiment included 20 participants from three nuclear utilities, the Nuclear Regulatory Commission's Technical Training Center, the University of Maryland, EG&G Idaho, and SAIC. The study used two real nuclear expert systems: a boiling water reactor emergency operating procedures tracking system and a pressurized water reactor safety assessment system. Ten participants were assigned to each of the expert systems. All participants were trained in and then used a sequence of four different V&V methods selected as being the best and most appropriate for study on the basis of prior evaluation activities. These methods either involved the analysis and tracing of requirements to elements in the knowledge base (requirements grouping and requirements tracing) or else involved direct inspection of the knowledge base for various kinds of errors. Half of the subjects within each system group used the best manual variant of the V&V methods (the control group), while the other half were supported by the results of applying real or simulated automated tools to the knowledge bases (the experimental group).

  16. Gender verification of female athletes.

    PubMed

    Elsas, L J; Ljungqvist, A; Ferguson-Smith, M A; Simpson, J L; Genel, M; Carlson, A S; Ferris, E; de la Chapelle, A; Ehrhardt, A A

    2000-01-01

    The International Olympic Committee (IOC) officially mandated gender verification for female athletes beginning in 1968 and continuing through 1998. The rationale was to prevent masquerading males and women with "unfair, male-like" physical advantage from competing in female-only events. Visual observation and gynecological examination had been tried on a trial basis for two years at some competitions leading up to the 1968 Olympic Games, but these invasive and demeaning processes were jettisoned in favor of laboratory-based genetic tests. Sex chromatin and more recently DNA analyses for Y-specific male material were then required of all female athletes immediately preceding IOC-sanctioned sporting events, and many other international and national competitions following the IOC model. On-site gender verification has since been found to be highly discriminatory, and the cause of emotional trauma and social stigmatization for many females with problems of intersex who have been screened out from competition. Despite compelling evidence for the lack of scientific merit for chromosome-based screening for gender, as well as its functional and ethical inconsistencies, the IOC persisted in its policy for 30 years. The coauthors of this manuscript have worked with some success to rescind this policy through educating athletes and sports governors regarding the psychological and physical nature of sexual differentiation, and the inequities of genetic sex testing. In 1990, the International Amateur Athletics Federation (IAAF) called for abandonment of required genetic screening of women athletes, and by 1992 had adopted a fairer, medically justifiable model for preventing only male "impostors" in international track and field. At the recent recommendation of the IOC Athletes Commission, the Executive Board of the IOC has finally recognized the medical and functional inconsistencies and undue costs of chromosome-based methods. In 1999, the IOC ratified the abandonment of on

  17. Pediatric Readiness and Facility Verification.

    PubMed

    Remick, Katherine; Kaji, Amy H; Olson, Lenora; Ely, Michael; Schmuhl, Patricia; McGrath, Nancy; Edgerton, Elizabeth; Gausche-Hill, Marianne

    2016-03-01

    We perform a needs assessment of pediatric readiness, using a novel scoring system in California emergency departments (EDs), and determine the effect of pediatric verification processes on pediatric readiness. ED nurse managers from all 335 acute care hospital EDs in California were sent a 60-question Web-based assessment. A weighted pediatric readiness score (WPRS), using a 100-point scale, and gap analysis were calculated for each participating ED. Nurse managers from 90% (300/335) of EDs completed the Web-based assessment, including 51 pediatric verified EDs, 67 designated trauma centers, and 31 EDs assessed for pediatric capabilities. Most pediatric visits (87%) occurred in nonchildren's hospitals. The overall median WPRS was 69 (interquartile range [IQR] 57.7, 85.9). Pediatric verified EDs had a higher WPRS (89.6; IQR 84.1, 94.1) compared with nonverified EDs (65.5; IQR 55.5, 76.3) and EDs assessed for pediatric capabilities (70.7; IQR 57.4, 88.9). When verification status and ED volume were controlled for, trauma center designation was not predictive of an increase in the WPRS. Forty-three percent of EDs reported the presence of a quality improvement plan that included pediatric elements, and 53% reported a pediatric emergency care coordinator. When coordinator and quality improvement plan were controlled for, the presence of at least 1 pediatric emergency care coordinator was associated with a higher WPRS (85; IQR 75, 93.1) versus EDs without a coordinator (58; IQR 50.1, 66.9), and the presence of a quality improvement plan was associated with a higher WPRS (88; IQR 76.7, 95) compared with that of hospitals without a plan (62; IQR 51.2, 68.7). Of pediatric verified EDs, 92% had a quality improvement plan for pediatric emergency care and 96% had a pediatric emergency care coordinator. We report on the first comprehensive statewide assessment of "pediatric readiness" in EDs according to the 2009 "Guidelines for Care of Children in the Emergency Department

  18. Scenarios for exercising technical approaches to verified nuclear reductions

    SciTech Connect

    Doyle, James

    2010-01-01

    Presidents Obama and Medvedev in April 2009 committed to a continuing process of step-by-step nuclear arms reductions beyond the new START treaty that was signed April 8, 2010 and to the eventual goal of a world free of nuclear weapons. In addition, the US Nuclear Posture Review released April 6, 2010 commits the US to initiate a comprehensive national research and development program to support continued progress toward a world free of nuclear weapons, including expanded work on verification technologies and the development of transparency measures. It is impossible to predict the specific directions that US-RU nuclear arms reductions will take over the next 5-10 years. Additional bilateral treaties could be reached requiring effective verification as indicated by statements made by the Obama administration. There could also be transparency agreements or other initiatives (unilateral, bilateral or multilateral) that require monitoring with a standard of verification lower than formal arms control, but still needing to establish confidence to domestic, bilateral and multilateral audiences that declared actions are implemented. The US Nuclear Posture Review and other statements give some indication of the kinds of actions and declarations that may need to be confirmed in a bilateral or multilateral setting. Several new elements of the nuclear arsenals could be directly limited. For example, it is likely that both strategic and nonstrategic nuclear warheads (deployed and in storage), warhead components, and aggregate stocks of such items could be accountable under a future treaty or transparency agreement. In addition, new initiatives or agreements may require the verified dismantlement of a certain number of nuclear warheads over a specified time period. Eventually procedures for confirming the elimination of nuclear warheads, components and fissile materials from military stocks will need to be established. This paper is intended to provide useful background information

  19. Helping children talk.

    PubMed

    Day, L

    1998-01-01

    Many children and young people living in London are affected by HIV. Most such children come from families from sub-Saharan Africa. Some HIV-positive parents have died, some are ill, and some are well. Some older children know that their parents are infected with HIV, but most children are unaware. To help these children understand their situations, children with a parent or parents who have died or are very sick are invited to 6 half-days of storytelling and play, led by a family counselor and someone who uses drama. Trained volunteers come from local AIDS organizations. The sessions vary depending upon what the children choose to discuss. The adults' role is to help the children begin to reflect upon their feelings in a way which is easy for them to express. Sessions usually begin with the creation of a story using a toy animal, after which children subsequently act out the imaginary family in different ways.

  20. Helping pregnant teenagers.

    PubMed

    Bluestein, D; Starling, M E

    1994-08-01

    Teenagers who are pregnant face many difficult issues, and counseling by physicians can be an important source of help. We suggest guidelines for this counseling, beginning with a review of the scope and consequences of adolescent pregnancy. Communication strategies should be aimed at building rapport with techniques such as maintaining confidentiality, avoiding judgmental stances, and gearing communication to cognitive maturity. Techniques for exploring family relationships are useful because these relationships are key influences on subsequent decisions and behaviors. We discuss topics related to abortion and childbearing, such as safety, facilitation of balanced decision making, the use of prenatal care, and the formulation of long-term plans. Physicians who can effectively discuss these topics can help pregnant teenagers make informed decisions and improve their prospects for the future.

  1. MFTF sensor verification computer program

    SciTech Connect

    Chow, H.K.

    1984-11-09

    The design, requirements document and implementation of the MFE Sensor Verification System were accomplished by the Measurement Engineering Section (MES), a group which provides instrumentation for the MFTF magnet diagnostics. The sensors, installed on and around the magnets and solenoids, housed in a vacuum chamber, will supply information about the temperature, strain, pressure, liquid helium level and magnet voltage to the facility operator for evaluation. As the sensors are installed, records must be maintained as to their initial resistance values. Also, as the work progresses, monthly checks will be made to ensure continued sensor health. Finally, after the MFTF-B demonstration, yearly checks will be performed as well as checks of sensors as problems develop. The software to acquire and store the data was written by Harry Chow, Computations Department. The acquired data will be transferred to the MFE data base computer system.
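
    The periodic health check described can be illustrated with a short sketch; the sensor names, baseline values, and 5% drift limit are assumptions, not values from the MFTF records.

      # Illustrative monthly sensor health check against recorded initial values.
      # Sensor names, baselines, and the 5% drift limit are assumptions.
      initial_ohms = {"TC-014": 108.3, "SG-112": 350.1, "LHE-03": 47.9}

      def monthly_check(readings, drift_limit_pct=5.0):
          for sensor, value in readings.items():
              baseline = initial_ohms[sensor]
              drift = abs(value - baseline) / baseline * 100.0
              status = "OK" if drift <= drift_limit_pct else "INVESTIGATE"
              print(f"{sensor}: {value:.1f} ohm (baseline {baseline:.1f}), "
                    f"drift {drift:.1f}% -> {status}")

      monthly_check({"TC-014": 109.0, "SG-112": 402.5, "LHE-03": 48.1})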

  2. KAT-7 Science Verification Highlights

    NASA Astrophysics Data System (ADS)

    Lucero, Danielle M.; Carignan, Claude; KAT-7 Science Data; Processing Team, KAT-7 Science Commissioning Team

    2015-01-01

    KAT-7 is a pathfinder of the Square Kilometer Array precursor MeerKAT, which is under construction. Its short baselines and low system temperature make it sensitive to large scale, low surface brightness emission. This makes it an ideal instrument to use in searches for faint extended radio emission and low surface density extraplanar gas. We present an update on the progress of several such ongoing KAT-7 science verification projects. These include a large scale radio continuum and polarization survey of the Galactic Center, deep HI observations (100+ hours) of nearby disk galaxies (e.g. NGC253 and NGC3109), and targeted searches for HI tidal tails in galaxy groups (e.g. IC1459). A brief status update for MeerKAT will also be presented if time permits.

  3. Verification of NASA Emergent Systems

    NASA Technical Reports Server (NTRS)

    Rouff, Christopher; Vanderbilt, Amy K. C. S.; Truszkowski, Walt; Rash, James; Hinchey, Mike

    2004-01-01

    NASA is studying advanced technologies for a future robotic exploration mission to the asteroid belt. This mission, the prospective ANTS (Autonomous Nano Technology Swarm) mission, will comprise 1,000 autonomous robotic agents designed to cooperate in asteroid exploration. The emergent properties of swarm-type missions make them powerful, but at the same time make them more difficult to design and to assure that the proper behaviors will emerge. We are currently investigating formal methods and techniques for verification and validation of future swarm-based missions. The advantage of using formal methods is their ability to mathematically assure the behavior of a swarm, emergent or otherwise. The ANTS mission is being used as an example and case study of swarm-based missions with which to experiment and test current formal methods on intelligent swarms. Using the ANTS mission, we have evaluated multiple formal methods to determine their effectiveness in modeling and assuring swarm behavior.

  4. Verification of FANTASTIC integrated code

    NASA Technical Reports Server (NTRS)

    Chauhan, Rajinder Singh

    1987-01-01

    FANTASTIC is an acronym for Failure Analysis Nonlinear Thermal and Structural Integrated Code. This program was developed by Failure Analysis Associates, Palo Alto, Calif., for MSFC to improve the accuracy of solid rocket motor nozzle analysis. FANTASTIC has three modules: FACT - thermochemical analysis; FAHT - heat transfer analysis; and FAST - structural analysis. All modules have keywords for data input. Work is in progress for the verification of the FAHT module, which is done by using data for various problems with known solutions as inputs to the FAHT module. The information obtained is used to identify problem areas of the code and passed on to the developer for debugging purposes. Failure Analysis Associates have revised the first version of the FANTASTIC code and a new improved version has been released to the Thermal Systems Branch.

  5. Retail applications of signature verification

    NASA Astrophysics Data System (ADS)

    Zimmerman, Thomas G.; Russell, Gregory F.; Heilper, Andre; Smith, Barton A.; Hu, Jianying; Markman, Dmitry; Graham, Jon E.; Drews, Clemens

    2004-08-01

    The dramatic rise in identity theft, the ever pressing need to provide convenience in checkout services to attract and retain loyal customers, and the growing use of multi-function signature capture devices in the retail sector provide favorable conditions for the deployment of dynamic signature verification (DSV) in retail settings. We report on the development of a DSV system to meet the needs of the retail sector. We currently have a database of approximately 10,000 signatures collected from 600 subjects and forgers. Previous work at IBM on DSV has been merged and extended to achieve robust performance on pen position data available from commercial point of sale hardware, achieving equal error rates on skilled forgeries and authentic signatures of 1.5% to 4%.
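
    The equal error rate quoted above is the operating point where false acceptances and false rejections balance. The sketch below shows a common way to estimate it from score distributions; the scores are random placeholders, not data from the reported system.

      import numpy as np

      # Estimate the equal error rate (EER) from genuine and forgery scores.
      def eer(genuine, forgery):
          best_gap, best_eer = 1.0, None
          for t in np.sort(np.concatenate([genuine, forgery])):
              frr = np.mean(genuine < t)        # genuine signatures rejected
              far = np.mean(forgery >= t)       # forgeries accepted
              if abs(far - frr) < best_gap:
                  best_gap, best_eer = abs(far - frr), (far + frr) / 2.0
          return best_eer

      rng = np.random.default_rng(0)
      genuine = rng.normal(0.8, 0.1, 1000)      # placeholder score distributions
      forgery = rng.normal(0.5, 0.1, 1000)
      print(f"EER ~ {eer(genuine, forgery):.1%}")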

  6. Requirements, Verification, and Compliance (RVC) Database Tool

    NASA Technical Reports Server (NTRS)

    Rainwater, Neil E., II; McDuffee, Patrick B.; Thomas, L. Dale

    2001-01-01

    This paper describes the development, design, and implementation of the Requirements, Verification, and Compliance (RVC) database used on the International Space Welding Experiment (ISWE) project managed at Marshall Space Flight Center. The RVC is a systems engineer's tool for automating and managing the following information: requirements; requirements traceability; verification requirements; verification planning; verification success criteria; and compliance status. This information, normally contained within documents (e.g., specifications, plans), is maintained in an electronic database that allows the project team members to access, query, and status the requirements, verification, and compliance information from their individual desktop computers. Using commercial-off-the-shelf (COTS) database software that contains networking capabilities, the RVC was developed not only with cost savings in mind but primarily for the purpose of providing a more efficient and effective automated method of maintaining and distributing the systems engineering information. In addition, the RVC approach provides the systems engineer the capability to develop and tailor various reports containing the requirements, verification, and compliance information that meets the needs of the project team members. The automated approach of the RVC for capturing and distributing the information improves the productivity of the systems engineer by allowing that person to concentrate more on the job of developing good requirements and verification programs and not on the effort of being a "document developer".
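
    A minimal sketch of the kind of linked records such a tool manages is shown below; the table and field names are assumptions for illustration, not the actual ISWE RVC schema.

      import sqlite3

      # Illustrative requirements/verification/compliance schema (assumed names).
      con = sqlite3.connect(":memory:")
      con.executescript("""
      CREATE TABLE requirement (
          req_id    TEXT PRIMARY KEY,
          text      TEXT NOT NULL,
          parent_id TEXT REFERENCES requirement(req_id)      -- traceability
      );
      CREATE TABLE verification (
          ver_id           TEXT PRIMARY KEY,
          req_id           TEXT NOT NULL REFERENCES requirement(req_id),
          method           TEXT,                             -- test, analysis, inspection, demo
          success_criteria TEXT,
          status           TEXT DEFAULT 'open'               -- compliance status
      );
      """)
      con.execute("INSERT INTO requirement VALUES ('R-001', 'Example requirement text', NULL)")
      con.execute("INSERT INTO verification VALUES ('V-001', 'R-001', 'test', 'Example criteria', 'open')")
      for row in con.execute("SELECT r.req_id, v.method, v.status "
                             "FROM requirement r JOIN verification v USING (req_id)"):
          print(row)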

  7. Helping Iraqis Rebuild Iraq

    DTIC Science & Technology

    2003-09-01

    For example, the task force hired several former plant workers to fix the water pumps and generators to a water treatment plant in the town of Abu...Charlie Company, 223d Engineer Battalion (Task Force Knight), Mississippi National Guard, was able to put his civilian water treatment plant expertise...to use and help negotiate the purchase of parts that brought the water treatment plant to full operation. In addition, personnel from the 14th

  8. Delayed Gamma-ray Spectroscopy for Non-Destructive Assay of Nuclear Materials

    SciTech Connect

    Mozin, Vladimir; Ludewigt, Bernhard; Campbell, Luke; Favalli, Andrea; Hunt, Alan

    2014-10-09

    This project addresses the need for improved non-destructive assay techniques for quantifying the actinide composition of spent nuclear fuel and for the independent verification of declared quantities of special nuclear materials at key stages of the fuel cycle. High-energy delayed gamma-ray spectroscopy following neutron irradiation is a potential technique for directly assaying spent fuel assemblies and achieving the safeguards goal of quantifying nuclear material inventories for spent fuel handling, interim storage, reprocessing facilities, repository sites, and final disposal. Other potential applications include determination of MOX fuel composition, characterization of nuclear waste packages, and challenges in homeland security and arms control verification.

  9. A hardware-software system for the automation of verification and calibration of oil metering units secondary equipment

    NASA Astrophysics Data System (ADS)

    Boyarnikov, A. V.; Boyarnikova, L. V.; Kozhushko, A. A.; Sekachev, A. F.

    2017-08-01

    In the article, the process of verification (calibration) of the secondary equipment of oil metering units is considered. The purpose of the work is to increase the reliability and reduce the complexity of this process by developing a hardware-software system that provides automated verification and calibration. The hardware part of the complex switches the measuring channels of the controller under verification and the reference channels of the calibrator in accordance with the specified algorithm. The developed software controls the switching of channels, sets values on the calibrator, reads the measured data from the controller, calculates errors, and compiles protocols. The system can be used for checking the controllers of the secondary equipment of oil metering units in automatic verification mode (with an open communication protocol) or in semi-automatic mode (without it). A distinctive feature of the approach is a universal signal switch operating under software control, which can be configured for various verification (calibration) methods and thus covers the entire range of controllers of metering unit secondary equipment. Automatic verification with the hardware-software system shortens verification time by a factor of 5-10 and increases measurement reliability by excluding the influence of the human factor.
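
    The error calculation that such a complex automates can be sketched briefly; the reference points, units, and the 0.25% tolerance below are assumptions for illustration.

      # Compare controller readings against calibrator reference values.
      # Points, units (mA), and the 0.25% tolerance are assumed for the sketch.
      def check_channel(points, tolerance_pct=0.25):
          for ref, reading in points:
              error_pct = abs(reading - ref) / ref * 100.0
              verdict = "PASS" if error_pct <= tolerance_pct else "FAIL"
              print(f"ref {ref:7.3f}  read {reading:7.3f}  "
                    f"error {error_pct:.3f}%  {verdict}")

      check_channel([(4.000, 4.004), (12.000, 11.985), (20.000, 20.070)])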

  10. 37 CFR 382.6 - Verification of royalty payments.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... payments. (a) General. This section prescribes general rules pertaining to the verification of the payment... requesting the verification procedure shall retain the report of the verification for a period of three years... independent auditor, shall serve as an acceptable verification procedure for all interested parties. (f) Costs...

  11. Argonne nuclear pioneer: Leonard Koch

    SciTech Connect

    Koch, Leonard

    2012-01-01

    Leonard Koch joined Argonne National Laboratory in 1948. He helped design and build Experimental Breeder Reactor-1 (EBR-1), the first reactor to generate useable amounts of electricity from nuclear energy.

  12. Formal specification and verification of Ada software

    NASA Technical Reports Server (NTRS)

    Hird, Geoffrey R.

    1991-01-01

    The use of formal methods in software development achieves levels of quality assurance unobtainable by other means. The Larch approach to specification is described, and the specification of avionics software designed to implement the logic of a flight control system is given as an example. Penelope, an Ada verification environment, is described. The Penelope user inputs mathematical definitions, Larch-style specifications and Ada code and performs machine-assisted proofs that the code obeys its specifications. As an example, the verification of a binary search function is considered. Emphasis is given to techniques assisting the reuse of a verification effort on modified code.

  13. Enhanced Cancelable Biometrics for Online Signature Verification

    NASA Astrophysics Data System (ADS)

    Muramatsu, Daigo; Inuma, Manabu; Shikata, Junji; Otsuka, Akira

    Cancelable approaches for biometric person authentication have been studied to protect enrolled biometric data, and several algorithms have been proposed. One drawback of cancelable approaches is that their performance is inferior to that of non-cancelable approaches. In this paper, we propose a scheme to improve the performance of a cancelable approach for online signature verification. Our scheme generates two cancelable datasets from one raw dataset and uses them for verification. Preliminary experiments were performed using a distance-based online signature verification algorithm. The experimental results show that our proposed scheme is promising.
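
    The abstract states that two cancelable datasets are generated from one raw dataset and used in a distance-based verifier. The sketch below illustrates that idea with a key-driven random projection as the cancelable transform; the transform, keys, and threshold are assumptions for illustration, not the algorithm proposed in the paper.

      import numpy as np

      rng = np.random.default_rng(0)

      def cancelable_transform(features, key):
          """Project features with a key-derived random matrix (illustrative transform only)."""
          proj = np.random.default_rng(key).normal(size=(features.shape[-1], features.shape[-1]))
          return features @ proj

      def verify(enrolled, query, keys, threshold):
          """Accept if the mean distance over the two cancelable datasets is small enough."""
          distances = [np.linalg.norm(cancelable_transform(enrolled, k) - cancelable_transform(query, k))
                       for k in keys]
          return float(np.mean(distances)) < threshold

      enrolled = rng.normal(size=16)                       # stand-in for an enrolled signature feature vector
      genuine = enrolled + rng.normal(scale=0.05, size=16)  # small within-writer variation
      forgery = rng.normal(size=16)                         # unrelated attempt
      print(verify(enrolled, genuine, keys=[11, 42], threshold=2.0))   # expected True
      print(verify(enrolled, forgery, keys=[11, 42], threshold=2.0))   # expected False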

  14. Liquefied Natural Gas (LNG) dispenser verification device

    NASA Astrophysics Data System (ADS)

    Xiong, Maotao; Yang, Jie-bin; Zhao, Pu-jun; Yu, Bo; Deng, Wan-quan

    2013-01-01

    The construction, working principle, and calibration status of LNG (Liquefied Natural Gas) dispensers in China are introduced. Because of the shortcomings of the weighing method for calibrating LNG dispensers, an LNG dispenser verification device has been developed. The device is based on the master meter method and verifies LNG dispensers in the field. Experimental results indicate that the device has stable performance, a high accuracy level, and a flexible construction, reaching an internationally advanced level. The verification device is expected to promote the development of the LNG dispenser industry in China and to improve the technical level of LNG dispenser manufacturing.
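
    In the master meter method described above, the dispenser's indication is compared in the field against a reference (master) meter. A minimal sketch of that comparison follows; the fill data and tolerance are hypothetical.

      def dispenser_error_pct(dispenser_kg, master_meter_kg):
          """Relative indication error of the dispenser against the master meter, in percent."""
          return 100.0 * (dispenser_kg - master_meter_kg) / master_meter_kg

      # Hypothetical field test: three fills compared against the master meter reading.
      fills = [(50.12, 50.00), (100.30, 100.05), (150.10, 149.90)]
      errors = [dispenser_error_pct(d, m) for d, m in fills]
      max_permissible_error = 1.0   # assumed tolerance, in percent
      print(all(abs(e) <= max_permissible_error for e in errors), errors)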

  15. Certainty in Stockpile Computing: Recommending a Verification and Validation Program for Scientific Software

    SciTech Connect

    Lee, J.R.

    1998-11-01

    As computing assumes a more central role in managing the nuclear stockpile, the consequences of an erroneous computer simulation could be severe. Computational failures are common in other endeavors and have caused project failures, significant economic loss, and loss of life. This report examines the causes of software failure and proposes steps to mitigate them. A formal verification and validation program for scientific software is recommended and described.

  16. Integrated Verification Experiment data collected as part of the Los Alamos National Laboratory's Source Region Program

    SciTech Connect

    Fitzgerald, T.J.; Carlos, R.C.; Argo, P.E.

    1993-01-21

    As part of the integrated verification experiment (IVE), we deployed a network of hf ionospheric sounders to detect the effects of acoustic waves generated by surface ground motion following underground nuclear tests at the Nevada Test Site. The network sampled up to four geographic locations in the ionosphere from almost directly overhead of the surface ground zero out to a horizontal range of 60 km. We present sample results for four of the IVEs: Misty Echo, Texarkana, Mineral Quarry, and Bexar.

  17. DOE handbook: Integrated safety management systems (ISMS) verification team leader's handbook

    SciTech Connect

    1999-06-01

    The primary purpose of this handbook is to provide guidance to the ISMS verification Team Leader and the verification team in conducting ISMS verifications. The handbook describes methods and approaches for the review of the ISMS documentation (Phase I) and ISMS implementation (Phase II) and provides information useful to the Team Leader in preparing the review plan, selecting and training the team, coordinating the conduct of the verification, and documenting the results. The process and techniques described are based on the results of several pilot ISMS verifications that have been conducted across the DOE complex. A secondary purpose of this handbook is to provide information useful in developing DOE personnel to conduct these reviews. Specifically, this handbook describes methods and approaches to: (1) Develop the scope of the Phase 1 and Phase 2 review processes to be consistent with the history, hazards, and complexity of the site, facility, or activity; (2) Develop procedures for the conduct of the Phase 1 review, validating that the ISMS documentation satisfies the DEAR clause as amplified in DOE Policies 450.4, 450.5, 450.6 and associated guidance and that DOE can effectively execute responsibilities as described in the Functions, Responsibilities, and Authorities Manual (FRAM); (3) Develop procedures for the conduct of the Phase 2 review, validating that the description approved by the Approval Authority, following or concurrent with the Phase 1 review, has been implemented; and (4) Describe a methodology by which the DOE ISMS verification teams will be advised, trained, and/or mentored to conduct subsequent ISMS verifications. The handbook provides proven methods and approaches for verifying that commitments related to the DEAR, the FRAM, and associated amplifying guidance are in place and implemented in nuclear and high risk facilities. This handbook also contains useful guidance to line managers when preparing for a review of ISMS for radiological

  18. Thermal hydraulic feasibility assessment for the Spent Nuclear Fuel Project

    SciTech Connect

    Heard, F.J.; Cramer, E.R.; Beaver, T.R.; Thurgood, M.J.

    1996-01-01

    A series of scoping analyses have been completed investigating the thermal-hydraulic performance and feasibility of the Spent Nuclear Fuel Project (SNFP) Integrated Process Strategy (IPS). The SNFP was established to develop engineered solutions for the expedited removal, stabilization, and storage of spent nuclear fuel from the K Basins at the U.S. Department of Energy's Hanford Site in Richland, Washington. The subject efforts focused on independently investigating, quantifying, and establishing the governing heat production and removal mechanisms for each of the IPS operations and configurations, obtaining preliminary results for comparison with and verification of other analyses, and providing technology-based recommendations for consideration and incorporation into the design bases for the SNFP. The goal was to develop a series of thermal-hydraulic models that could respond to all process and safety-related issues that may arise pertaining to the SNFP. A series of sensitivity analyses were also performed to help identify those parameters that have the greatest impact on energy transfer and, hence, temperature control. It is anticipated that the subject thermal-hydraulic models will form the basis for a series of advanced and more detailed models that will more accurately reflect the thermal performance of the IPS and alleviate the necessity for some of the more conservative assumptions and oversimplifications, as well as form the basis for the final process and safety analyses.

  19. Design for Verification: Enabling Verification of High Dependability Software-Intensive Systems

    NASA Technical Reports Server (NTRS)

    Mehlitz, Peter C.; Penix, John; Markosian, Lawrence Z.; Koga, Dennis (Technical Monitor)

    2003-01-01

    Strategies to achieve confidence that high-dependability applications are correctly implemented include testing and automated verification. Testing deals mainly with a limited number of expected execution paths. Verification usually attempts to deal with a larger number of possible execution paths. While the impact of architecture design on testing is well known, its impact on most verification methods is not as well understood. The Design for Verification approach considers verification from the application development perspective, in which system architecture is designed explicitly according to the application's key properties. The D4V-hypothesis is that the same general architecture and design principles that lead to good modularity, extensibility and complexity/functionality ratio can be adapted to overcome some of the constraints on verification tools, such as the production of hand-crafted models and the limits on dynamic and static analysis caused by state space explosion.

  20. Nuclear waste management

    NASA Astrophysics Data System (ADS)

    Chikalla, T. D.; Powell, J. A.

    1981-09-01

    Reports and summaries are presented for the following: high-level waste process development; alternative waste forms; TMI zeolite vitrification demonstration program; nuclear waste materials characterization center; TRU waste immobilization; TRU waste decontamination; krypton implantation; thermal outgassing; iodine-129 fixation; NWVP off-gas analysis; monitoring and physical characterization of unsaturated zone transport; well-logging instrumentation development; verification instrument development; mobility of organic complexes of radionuclides in soils; handbook of methods to decrease the generation of low-level waste; waste management system studies; waste management safety studies; assessment of effectiveness of geologic isolation systems; waste/rock interactions technology program; high-level waste form preparation; development of backfill materials; development of structural engineered barriers; disposal charge analysis; and analysis of spent fuel policy implementation.

  1. Nuclear weapons testing

    SciTech Connect

    Heylin, M.

    1988-02-15

    The author examines the history of efforts to ban, or at least constrain, nuclear tests. The issue has been marked by shifts in attitude by the superpowers in recent times. The Reagan Administration sees a comprehensive test ban only as a very long-term goal for the U.S. The Soviets, on the other hand, have been pushing extremely hard lately for a ban on all testing. The author discusses the pros and cons of such a ban by examining the arguments of the U.S. Department of Energy, Nobel Laureate Glenn T. Seaborg, and Associate Director for Defense Systems at Lawrence Livermore National Laboratory George H. Miller. Other issues that are discussed include verification, joint testing, and reliability. He concludes with a discussion of the future of the ban.

  2. Information Center Help Desk

    DTIC Science & Technology

    1991-09-01

    US Army Information Systems Engineering Command, Fort Huachuca, AZ 85613-5300; U.S. Army Institute for Research in Management... performs the functions of an IC servicing 15 other ICs within its command. It does not service end users at all. This IC develops regulations, policies... entry fields; most commands are function-key driven. There is no context-sensitive help. CA-Netman/MRM Pro uses 'Action Requests' and 'Memo Files'...

  3. Mutual help in SETIs

    NASA Astrophysics Data System (ADS)

    Melia, F.; Frisch, D. H.

    1985-06-01

    Techniques to establish communication between earth and extraterrestrial intelligent beings are examined analytically, emphasizing that the success of searches for extraterrestrial intelligence (SETIs) depends on the selection by both sender and receiver of one of a few mutually helpful SETI strategies. An equation for estimating the probability that an SETI will result in the recognition of an ETI signal is developed, and numerical results for various SETI strategies are presented in tables. A minimum approach employing 10 40-m 20-kW dish antennas for a 30-yr SETI in a 2500-light-year disk is proposed.

  4. Nuclear exoticism

    NASA Astrophysics Data System (ADS)

    Penionzhkevich, Yu. E.

    2016-07-01

    Extreme states of nuclear matter (those featuring high spins, large deformations, high density and temperature, or a large excess of neutrons and protons) play an important role in studying fundamental properties of nuclei and are helpful in solving the problem of constructing the equation of state for nuclear matter. The synthesis of neutron-rich nuclei near the nucleon drip lines and the investigation of their properties permit drawing conclusions about the positions of these boundaries and deducing information about unusual states of such nuclei and about their decays. At the present time, experimental investigations along these lines can only be performed through the cooperation of leading research centers that possess powerful heavy-ion accelerators, such as the Large Hadron Collider (LHC) at CERN and the heavy-ion cyclotrons at the Joint Institute for Nuclear Research (JINR, Dubna), where the respective experiments are being conducted by physicists from about 20 JINR member countries. The present article gives a survey of the most recent results in the realm of super neutron-rich nuclei. Implications of the change in the structure of such nuclei near the nucleon drip lines are discussed. Information is presented on the results obtained by measuring the masses (binding energies) of exotic nuclei, their nucleon-distribution radii (neutron halo) and momentum distributions, and their deformations and quantum properties. It is shown that the properties of nuclei lying near the stability boundaries differ strongly from the properties of other nuclei. The problem of the stability of nuclei associated with the magic numbers 20 and 28 is discussed, along with the effect of new magic numbers.

  5. Nuclear exoticism

    SciTech Connect

    Penionzhkevich, Yu. E.

    2016-07-15

    Extreme states of nuclear matter (those featuring high spins, large deformations, high density and temperature, or a large excess of neutrons and protons) play an important role in studying fundamental properties of nuclei and are helpful in solving the problem of constructing the equation of state for nuclear matter. The synthesis of neutron-rich nuclei near the nucleon drip lines and the investigation of their properties permit drawing conclusions about the positions of these boundaries and deducing information about unusual states of such nuclei and about their decays. At the present time, experimental investigations along these lines can only be performed through the cooperation of leading research centers that possess powerful heavy-ion accelerators, such as the Large Hadron Collider (LHC) at CERN and the heavy-ion cyclotrons at the Joint Institute for Nuclear Research (JINR, Dubna), where the respective experiments are being conducted by physicists from about 20 JINR member countries. The present article gives a survey of the most recent results in the realm of super neutron-rich nuclei. Implications of the change in the structure of such nuclei near the nucleon drip lines are discussed. Information is presented on the results obtained by measuring the masses (binding energies) of exotic nuclei, their nucleon-distribution radii (neutron halo) and momentum distributions, and their deformations and quantum properties. It is shown that the properties of nuclei lying near the stability boundaries differ strongly from the properties of other nuclei. The problem of the stability of nuclei associated with the magic numbers 20 and 28 is discussed, along with the effect of new magic numbers.

  6. Space Station automated systems testing/verification and the Galileo Orbiter fault protection design/verification

    NASA Technical Reports Server (NTRS)

    Landano, M. R.; Easter, R. W.

    1984-01-01

    Aspects of Space Station automated systems testing and verification are discussed, taking into account several program requirements. It is found that these requirements lead to a number of issues and uncertainties that require study and resolution during the Space Station definition phase. Most, if not all, of the considered uncertainties have implications for the overall testing and verification strategy adopted by the Space Station Program. A description is given of the Galileo Orbiter fault protection design/verification approach. Attention is given to a mission description, an Orbiter description, the design approach and process, the fault protection design verification approach/process, and problems of 'stress' testing.

  7. Speaker verification based on the fusion of speech acoustics and inverted articulatory signals

    PubMed Central

    Li, Ming; Kim, Jangwon; Lammert, Adam; Ghosh, Prasanta Kumar; Ramanarayanan, Vikram; Narayanan, Shrikanth

    2016-01-01

    We propose a practical, feature-level and score-level fusion approach by combining acoustic and estimated articulatory information for both text independent and text dependent speaker verification. From a practical point of view, we study how to improve speaker verification performance by combining dynamic articulatory information with the conventional acoustic features. On text independent speaker verification, we find that concatenating articulatory features obtained from measured speech production data with conventional Mel-frequency cepstral coefficients (MFCCs) improves the performance dramatically. However, since directly measuring articulatory data is not feasible in many real world applications, we also experiment with estimated articulatory features obtained through acoustic-to-articulatory inversion. We explore both feature level and score level fusion methods and find that the overall system performance is significantly enhanced even with estimated articulatory features. Such a performance boost could be due to the inter-speaker variation information embedded in the estimated articulatory features. Since the dynamics of articulation contain important information, we included inverted articulatory trajectories in text dependent speaker verification. We demonstrate that the articulatory constraints introduced by inverted articulatory features help to reject wrong password trials and improve the performance after score level fusion. We evaluate the proposed methods on the X-ray Microbeam database and the RSR 2015 database, respectively, for the aforementioned two tasks. Experimental results show that we achieve more than 15% relative equal error rate reduction for both speaker verification tasks. PMID:28496292
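
    The abstract reports score-level fusion of the acoustic and inverted-articulatory subsystems. The sketch below shows one conventional way such a fusion can be computed (a weighted sum followed by a threshold decision); the weight, scores, and threshold are illustrative assumptions, not the paper's actual fusion rule.

      def fuse_scores(acoustic_score, articulatory_score, weight=0.7):
          """Weighted-sum score-level fusion of two verification subsystems."""
          return weight * acoustic_score + (1.0 - weight) * articulatory_score

      def decide(fused_score, threshold=0.5):
          return "accept" if fused_score >= threshold else "reject"

      # Illustrative trial: the MFCC-based system and the inverted-articulatory system
      # each produce a similarity score in [0, 1] for the same verification trial.
      print(decide(fuse_scores(acoustic_score=0.62, articulatory_score=0.48)))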

  8. Verification of operating software for cooperative monitoring applications

    SciTech Connect

    Tolk, K.M.; Rembold, R.K.

    1997-08-01

    Monitoring agencies often use computer-based equipment to control instruments and to collect data at sites that are being monitored under international safeguards or other cooperative monitoring agreements. In order for this data to be used as an independent verification of data supplied by the host at the facility, the software used must be trusted by the monitoring agency. The monitoring party must be sure that the software has not been altered to give results that could lead to erroneous conclusions about nuclear materials inventories or other operating conditions at the site. The host might also want to verify that the software being used is the software that has been previously inspected, in order to be assured that only data allowed under the agreement is being collected. A method to provide this verification using keyed hash functions is described, along with how the proposed method overcomes possible vulnerabilities in methods currently in use, such as loading the software from trusted disks. The use of public key data authentication for this purpose is also discussed.
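
    The abstract proposes keyed hash functions so a monitoring party can confirm that fielded software matches a previously inspected copy. A minimal sketch using an HMAC follows; the key handling and image contents are hypothetical.

      import hmac, hashlib

      def keyed_hash(image_bytes, key):
          """Compute an HMAC-SHA256 over a software image using the inspector's secret key."""
          return hmac.new(key, image_bytes, hashlib.sha256).hexdigest()

      # Hypothetical usage: the reference digest is computed when the software is inspected,
      # then re-computed in the field to confirm the deployed image is unchanged.
      key = b"inspector-secret-key"                         # protected key material in practice
      inspected_image = b"\x7fELF...monitoring software image..."
      reference_digest = keyed_hash(inspected_image, key)
      deployed_image = inspected_image                      # read back from the fielded system
      print("software unchanged:",
            hmac.compare_digest(reference_digest, keyed_hash(deployed_image, key)))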

  9. TRU waste certification and TRUPACT-2 payload verification

    SciTech Connect

    Hunter, E.K. (Waste Isolation Pilot Plant Project Office); Johnson, J.E. (Waste Isolation Div.)

    1990-01-01

    The Waste Isolation Pilot Plant (WIPP) established a policy that requires each waste shipper to verify that all waste shipments meet the requirements of the Waste Acceptance Criteria (WAC) prior to being shipped. This verification provides assurance that transuranic (TRU) wastes meet the criteria while still retained in a facility where discrepancies can be immediately corrected. Each Department of Energy (DOE) TRU waste facility planning to ship waste to the Waste Isolation Pilot Plant (WIPP) is required to develop and implement a specific program including Quality Assurance (QA) provisions to verify that waste is in full compliance with WIPP's WAC. This program is audited by a composite DOE and contractor audit team prior to granting the facility permission to certify waste. During interaction with the Nuclear Regulatory Commission (NRC) on payload verification for shipping in TRUPACT-II, a similar system was established by DOE. The TRUPACT-II Safety Analysis Report (SAR) contains the technical requirements and physical and chemical limits that payloads must meet (like the WAC). All shippers must plan and implement a payload control program including independent QA provisions. A similar composite audit team will conduct preshipment audits, frequent subsequent audits, and operations inspections to verify that all TRU waste shipments in TRUPACT-II meet the requirements of the Certificate of Compliance issued by the NRC which invokes the SAR requirements. 1 fig.

  10. The PASCAL-HDM Verification System

    NASA Technical Reports Server (NTRS)

    1983-01-01

    The PASCAL-HDM verification system is described. This system supports the mechanical generation of verification conditions from PASCAL programs and HDM-SPECIAL specifications using the Floyd-Hoare axiomatic method. Tools are provided to parse programs and specifications, check their static semantics, generate verification conditions from Hoare rules, and translate the verification conditions appropriately for proof using the Shostak Theorem Prover. The differences between standard PASCAL and the language handled by this system are explained; these consist mostly of restrictions to the standard language definition, the only extensions or modifications being the addition of specifications to the code and the requirement that references to a function of no arguments have empty parentheses.

  11. HDM/PASCAL Verification System User's Manual

    NASA Technical Reports Server (NTRS)

    Hare, D.

    1983-01-01

    The HDM/Pascal verification system is a tool for proving the correctness of programs written in PASCAL and specified in the Hierarchical Development Methodology (HDM). This document assumes an understanding of PASCAL, HDM, program verification, and the STP system. The steps toward verification which this tool provides are parsing programs and specifications, checking the static semantics, and generating verification conditions. Some support functions are provided such as maintaining a data base, status management, and editing. The system runs under the TOPS-20 and TENEX operating systems and is written in INTERLISP. However, no knowledge is assumed of these operating systems or of INTERLISP. The system requires three executable files, HDMVCG, PARSE, and STP. Optionally, the editor EMACS should be on the system in order for the editor to work. The file HDMVCG is invoked to run the system. The files PARSE and STP are used as lower forks to perform the functions of parsing and proving.

  12. ENVIRONMENTAL TECHNOLOGY VERIFICATION (ETV) PROGRAM: STORMWATER TECHNOLOGIES

    EPA Science Inventory

    The U.S. Environmental Protection Agency (EPA) Environmental Technology Verification (ETV) program evaluates the performance of innovative air, water, pollution prevention and monitoring technologies that have the potential to improve human health and the environment. This techn...

  13. ENVIRONMENTAL TECHNOLOGY VERIFICATION (ETV) PROGRAM: FUEL CELLS

    EPA Science Inventory

    The U.S. Environmental Protection Agency (EPA) Environmental Technology Verification (ETV) Program evaluates the performance of innovative air, water, pollution prevention and monitoring technologies that have the potential to improve human health and the environment. This techno...

  14. The PASCAL-HDM Verification System

    NASA Technical Reports Server (NTRS)

    1983-01-01

    The PASCAL-HDM verification system is described. This system supports the mechanical generation of verification conditions from PASCAL programs and HDM-SPECIAL specifications using the Floyd-Hoare axiomatic method. Tools are provided to parse programs and specifications, check their static semantics, generate verification conditions from Hoare rules, and translate the verification conditions appropriately for proof using the Shostak Theorem Prover. The differences between standard PASCAL and the language handled by this system are explained; these consist mostly of restrictions to the standard language definition, the only extensions or modifications being the addition of specifications to the code and the requirement that references to a function of no arguments have empty parentheses.

  15. ENVIRONMENTAL TECHNOLOGY VERIFICATION (ETV) PROGRAM: FUEL CELLS

    EPA Science Inventory

    The U.S. Environmental Protection Agency (EPA) Environmental Technology Verification (ETV) Program evaluates the performance of innovative air, water, pollution prevention and monitoring technologies that have the potential to improve human health and the environment. This techno...

  16. ENVIRONMENTAL TECHNOLOGY VERIFICATION (ETV) PROGRAM: STORMWATER TECHNOLOGIES

    EPA Science Inventory

    The U.S. Environmental Protection Agency (EPA) Environmental Technology Verification (ETV) program evaluates the performance of innovative air, water, pollution prevention and monitoring technologies that have the potential to improve human health and the environment. This techn...

  17. Environmental Technology Verification Program (ETV) Policy Compendium

    EPA Science Inventory

    The Policy Compendium summarizes operational decisions made to date by participants in the U.S. Environmental Protection Agency's (EPA's) Environmental Technology Verification Program (ETV) to encourage consistency among the ETV centers. The policies contained herein evolved fro...

  18. U.S. Environmental Technology Verification Program

    EPA Science Inventory

    Overview of the U.S. Environmental Technology Verification Program (ETV), the ETV Greenhouse Gas Technology Center, and energy-related ETV projects. Presented at the Department of Energy's National Renewable Laboratory in Boulder, Colorado on June 23, 2008.

  19. HDM/PASCAL Verification System User's Manual

    NASA Technical Reports Server (NTRS)

    Hare, D.

    1983-01-01

    The HDM/Pascal verification system is a tool for proving the correctness of programs written in PASCAL and specified in the Hierarchical Development Methodology (HDM). This document assumes an understanding of PASCAL, HDM, program verification, and the STP system. The steps toward verification which this tool provides are parsing programs and specifications, checking the static semantics, and generating verification conditions. Some support functions are provided such as maintaining a data base, status management, and editing. The system runs under the TOPS-20 and TENEX operating systems and is written in INTERLISP. However, no knowledge is assumed of these operating systems or of INTERLISP. The system requires three executable files, HDMVCG, PARSE, and STP. Optionally, the editor EMACS should be on the system in order for the editor to work. The file HDMVCG is invoked to run the system. The files PARSE and STP are used as lower forks to perform the functions of parsing and proving.

  20. Engineering drawing field verification program. Revision 3

    SciTech Connect

    Ulk, P.F.

    1994-10-12

    Safe, efficient operation of waste tank farm facilities is dependent in part upon the availability of accurate, up-to-date plant drawings. Accurate plant drawings are also required in support of facility upgrades and future engineering remediation projects. This supporting document establishes the procedure for performing a visual field verification of engineering drawings, specifies the degree of visual observation to be performed, and describes how the results are documented. A copy of the drawing attesting to the degree of visual observation will be paginated into the released Engineering Change Notice (ECN) documenting the field verification, for future retrieval and reference. All waste tank farm essential and support drawings within the scope of this program will be converted from manual to computer aided drafting (CAD) drawings. A permanent reference to the field verification status will be placed along the right border of the CAD-converted drawing, referencing the revision level at which the visual verification was performed and documented.

  1. Environmental Technology Verification Program (ETV) Policy Compendium

    EPA Science Inventory

    The Policy Compendium summarizes operational decisions made to date by participants in the U.S. Environmental Protection Agency's (EPA's) Environmental Technology Verification Program (ETV) to encourage consistency among the ETV centers. The policies contained herein evolved fro...

  2. 9 CFR 417.8 - Agency verification.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... ANALYSIS AND CRITICAL CONTROL POINT (HACCP) SYSTEMS § 417.8 Agency verification. FSIS will verify the... deviation occurs; (d) Reviewing the critical limits; (e) Reviewing other records pertaining to the...

  3. MAMA Software Features: Quantification Verification Documentation-1

    SciTech Connect

    Ruggiero, Christy E.; Porter, Reid B.

    2014-05-21

    This document reviews the verification of the basic shape quantification attributes in the MAMA software against hand calculations in order to show that the calculations are implemented mathematically correctly and give the expected quantification results.
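
    The verification described above compares the software's shape quantification attributes against hand calculations. As an illustration of that pattern only (the record does not list MAMA's specific attributes or formulas), the sketch below checks a shoelace-formula area against a hand-calculated value.

      def polygon_area(vertices):
          """Shoelace formula for the area of a simple polygon (illustrative shape attribute)."""
          n = len(vertices)
          total = 0.0
          for i in range(n):
              x1, y1 = vertices[i]
              x2, y2 = vertices[(i + 1) % n]
              total += x1 * y2 - x2 * y1
          return abs(total) / 2.0

      # Hand calculation: a 3 x 4 rectangle has area 12.
      rect = [(0, 0), (3, 0), (3, 4), (0, 4)]
      assert abs(polygon_area(rect) - 12.0) < 1e-12
      print("area matches hand calculation:", polygon_area(rect))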

  4. Please Help Your Union

    NASA Astrophysics Data System (ADS)

    Killeen, Tim

    2006-03-01

    The continuing success of AGU relies entirely on the volunteer work of members. A major contribution to these efforts comes from the over 40 committees that plan, oversee, and have operational roles in our meetings, publications, finances, elections, awards, education, public information, and public affairs activities. The names of committees are provided in the accompanying text box; their current membership and descriptions can be found on the Web at the AGU site. One of the most important and challenging tasks of the incoming AGU President is to reestablish these committees by appointing hundreds of volunteers. I now solicit your help in staffing these committees. Ideally, participation in these important committees will reflect the overall membership and perspectives of AGU members, so please do consider volunteering yourself. Of course, nominations of others would also be very welcome. I am particularly interested in making sure that the gender balance, age, and geographic representation are appropriate and reflect our changing demographics. Any suggestions you might have will be more helpful if accompanied by a few sentences of background information relevant to the particular committee.

  5. Advanced Nuclear Measurements - Sensitivity Analysis Emerging Safeguards, Problems and Proliferation Risk

    SciTech Connect

    Dreicer, J.S.

    1999-07-15

    During the past year this component of the Advanced Nuclear Measurements LDRD-DR has focused on emerging safeguards problems and proliferation risk by investigating problems in two domains. The first is related to the analysis, quantification, and characterization of existing inventories of fissile materials, in particular, the minor actinides (MA) formed in the commercial fuel cycle. Understanding material forms and quantities helps identify and define future measurement problems, instrument requirements, and assists in prioritizing safeguards technology development. The second problem (dissertation research) has focused on the development of a theoretical foundation for sensor array anomaly detection. Remote and unattended monitoring or verification of safeguards activities is becoming a necessity due to domestic and international budgetary constraints. However, the ability to assess the trustworthiness of a sensor array has not been investigated. This research is developing an anomaly detection methodology to assess the sensor array.

  6. Ground-based visual inspection for CTBT verification

    SciTech Connect

    Hawkins, W.; Wohletz, K.

    1997-11-01

    Ground-based visual inspection will play an essential role in On-Site Inspection (OSI) for Comprehensive Test Ban Treaty (CTBT) verification. Although seismic and remote sensing techniques are the best understood and most developed methods for detection of evasive testing of nuclear weapons, visual inspection will greatly augment the certainty and detail of understanding provided by these more traditional methods. Not only can ground-based visual inspection offer effective documentation in cases of suspected nuclear testing, but it also can provide accurate source location and testing media properties necessary for detailed analysis of seismic records. For testing in violation of the CTBT, an offending state may attempt to conceal the test, which most likely will be achieved by underground burial. While such concealment may not prevent seismic detection, evidence of test deployment, location, and yield can be disguised. In this light, if a suspicious event is detected by seismic or other remote methods, visual inspection of the event area is necessary to document any evidence that might support a claim of nuclear testing and provide data needed to further interpret seismic records and guide further investigations. However, the methods for visual inspection are not widely known nor appreciated, and experience is presently limited. Visual inspection can be achieved by simple, non-intrusive means, primarily geological in nature, and it is the purpose of this report to describe the considerations, procedures, and equipment required to field such an inspection. The inspections will be carried out by inspectors from members of the CTBT Organization.

  7. Verification of Minimum Detectable Activity for Radiological Threat Source Search

    NASA Astrophysics Data System (ADS)

    Gardiner, Hannah; Myjak, Mitchell; Baciak, James; Detwiler, Rebecca; Seifert, Carolyn

    2015-10-01

    The Department of Homeland Security's Domestic Nuclear Detection Office is working to develop advanced technologies that will improve the ability to detect, localize, and identify radiological and nuclear sources from airborne platforms. The Airborne Radiological Enhanced-sensor System (ARES) program is developing advanced data fusion algorithms for analyzing data from a helicopter-mounted radiation detector. This detector platform provides a rapid, wide-area assessment of radiological conditions at ground level. The NSCRAD (Nuisance-rejection Spectral Comparison Ratios for Anomaly Detection) algorithm was developed to distinguish low-count sources of interest from benign naturally occurring radiation and irrelevant nuisance sources. It uses a number of broad, overlapping regions of interest to statistically compare each newly measured spectrum with the current estimate for the background to identify anomalies. We recently developed a method to estimate the minimum detectable activity (MDA) of NSCRAD in real time. We present this method here and report on the MDA verification using both laboratory measurements and simulated injects on measured backgrounds at or near the detection limits. This work is supported by the US Department of Homeland Security, Domestic Nuclear Detection Office, under competitively awarded contract/IAA HSHQDC-12-X-00376. This support does not constitute an express or implied endorsement on the part of the Gov't.
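
    The abstract describes statistically comparing region-of-interest counts in each measured spectrum against a background estimate. The sketch below is a heavily simplified illustration of that idea using Poisson counting statistics; it is not the NSCRAD metric or the real-time MDA method reported in the record.

      import numpy as np

      def roi_anomaly_scores(spectrum, background, rois):
          """For each region of interest, return the number of standard deviations by which
          the measured counts exceed the background estimate, assuming Poisson statistics
          (a simplification for illustration only)."""
          scores = []
          for lo, hi in rois:
              s = spectrum[lo:hi].sum()
              b = background[lo:hi].sum()
              scores.append((s - b) / np.sqrt(s + b))
          return np.array(scores)

      rng = np.random.default_rng(1)
      background = rng.poisson(lam=50, size=1024).astype(float)
      measured = rng.poisson(lam=50, size=1024).astype(float)
      measured[300:320] += 40                                   # injected low-count source
      rois = [(0, 256), (256, 512), (512, 768), (768, 1024)]
      print(roi_anomaly_scores(measured, background, rois))     # second ROI stands out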

  8. Dynamic testing for shuttle design verification

    NASA Technical Reports Server (NTRS)

    Green, C. E.; Leadbetter, S. A.; Rheinfurth, M. H.

    1972-01-01

    Space shuttle design verification requires dynamic data from full scale structural component and assembly tests. Wind tunnel and other scaled model tests are also required early in the development program to support the analytical models used in design verification. Presented is a design philosophy based on mathematical modeling of the structural system strongly supported by a comprehensive test program; some of the types of required tests are outlined.

  9. Modular Typestate Verification of Aliased Objects

    DTIC Science & Technology

    2007-03-01

    objects that change their state from one to another. Fugue [11] is the only existing typestate-based object-oriented protocol verification system that we...usage and implementation of such expressive typestate protocols. Our verification approach is highly inspired by Fugue [11]. We extend state invariants...packing, and frames to work in our context. We improve support for inheritance in comparison to Fugue by decoupling states of frames and reducing

  10. Certifiable Specification and Verification of C Programs

    NASA Astrophysics Data System (ADS)

    Lüth, Christoph; Walter, Dennis

    A novel approach to the specification and verification of C programs through an annotation language that is a mixture between JML and the language of Isabelle/HOL is proposed. This yields three benefits: specifications are concise and close to the underlying mathematical model; existing Isabelle theories can be reused; and the leap of faith from specification language to encoding in a logic is small. This is of particular relevance for software certification, and verification in application areas such as robotics.

  11. Identity verification in quantum key distribution

    NASA Astrophysics Data System (ADS)

    Zeng, Guihua; Zhang, Weiping

    2000-02-01

    The security of previous quantum key distribution protocols, which is guaranteed by the laws of quantum physics, presupposes that the communicating users are legitimate. In practice, however, impersonation of the legitimate communicators by eavesdroppers is unavoidable. In this paper, we propose a quantum key verification scheme that can simultaneously distribute the quantum secret key and verify the communicators' identity. Investigation shows that the proposed identity verification scheme is secure.

  12. Handbook: Design of automated redundancy verification

    NASA Technical Reports Server (NTRS)

    Ford, F. A.; Hasslinger, T. W.; Moreno, F. J.

    1971-01-01

    The use of the handbook is discussed and the design progress is reviewed. A description of the problem is presented, and examples are given to illustrate the necessity for redundancy verification, along with the types of situations to which it is typically applied. Reusable space vehicles, such as the space shuttle, are recognized as being significant in the development of the automated redundancy verification problem.

  13. The NPARC Alliance Verification and Validation Archive

    NASA Technical Reports Server (NTRS)

    Slater, John W.; Dudek, Julianne C.; Tatum, Kenneth E.

    2000-01-01

    The NPARC Alliance (National Project for Applications oriented Research in CFD) maintains a publicly-available, web-based verification and validation archive as part of the development and support of the WIND CFD code. The verification and validation methods used for the cases attempt to follow the policies and guidelines of the ASME and AIAA. The emphasis is on air-breathing propulsion flow fields with Mach numbers ranging from low-subsonic to hypersonic.

  14. A verification library for multibody simulation software

    NASA Technical Reports Server (NTRS)

    Kim, Sung-Soo; Haug, Edward J.; Frisch, Harold P.

    1989-01-01

    A multibody dynamics verification library that maintains and manages test and validation data is proposed, based on RRC robot arm and CASE backhoe validation and on a comparative study of DADS, DISCOS, and CONTOPS, which are existing public-domain and commercial multibody dynamics simulation programs. Using simple representative problems, simulation results from each program are cross-checked, and the validation results are presented. Functionalities of the verification library are defined in order to automate the validation procedure.

  15. High-Resolution Fast-Neutron Spectrometry for Arms Control and Treaty Verification

    SciTech Connect

    David L. Chichester; James T. Johnson; Edward H. Seabury

    2012-07-01

    Many nondestructive nuclear analysis techniques have been developed to support the measurement needs of arms control and treaty verification, including gross photon and neutron counting, low- and high-resolution gamma spectrometry, time-correlated neutron measurements, and photon and neutron imaging. One notable measurement technique that has not been extensively studied to date for these applications is high-resolution fast-neutron spectrometry (HRFNS). Applied for arms control and treaty verification, HRFNS has the potential to serve as a complementary measurement approach to these other techniques by providing a means to either qualitatively or quantitatively determine the composition and thickness of non-nuclear materials surrounding neutron-emitting materials. The technique uses the normally-occurring neutrons present in arms control and treaty verification objects of interest as an internal source of neutrons for performing active-interrogation transmission measurements. Most low-Z nuclei of interest for arms control and treaty verification, including 9Be, 12C, 14N, and 16O, possess fast-neutron resonance features in their absorption cross sections in the 0.5- to 5-MeV energy range. Measuring the selective removal of source neutrons over this energy range, assuming for example a fission-spectrum starting distribution, may be used to estimate the stoichiometric composition of intervening materials between the neutron source and detector. At a simpler level, determination of the emitted fast-neutron spectrum may be used for fingerprinting 'known' assemblies for later use in template-matching tests. As with photon spectrometry, automated analysis of fast-neutron spectra may be performed to support decision making and reporting systems protected behind information barriers. This paper will report recent work at Idaho National Laboratory to explore the feasibility of using HRFNS for arms control and treaty verification applications, including simulations and
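
    The measurement concept above relies on the selective removal of source neutrons by resonance features in the absorption cross sections of intervening low-Z materials. The sketch below evaluates the simple exponential transmission law for one energy bin; the areal density and cross-section values are illustrative assumptions, not data from the record.

      import math

      def transmitted_fraction(areal_density_per_barn, cross_section_barns):
          """Fraction of neutrons at one energy that pass through an intervening layer,
          using the exponential attenuation law T(E) = exp(-N * sigma(E))."""
          return math.exp(-areal_density_per_barn * cross_section_barns)

      # Illustrative numbers only: compare transmission off and on a fast-neutron resonance.
      N = 0.05                                                 # atoms per barn (areal density), assumed
      print(transmitted_fraction(N, cross_section_barns=1.5))  # off-resonance
      print(transmitted_fraction(N, cross_section_barns=4.0))  # on-resonance dip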

  16. Verification study of an emerging fire suppression system

    SciTech Connect

    Cournoyer, Michael E.; Waked, R. Ryan; Granzow, Howard N.; Gubernatis, David C.

    2016-01-01

    Self-contained fire extinguishers are a robust, reliable, and minimally invasive means of fire suppression for gloveboxes. Moreover, plutonium gloveboxes present harsh environmental conditions for polymer materials; these include radiation damage and chemical exposure, both of which tend to degrade the lifetime of engineered polymer components. Several studies have been conducted to determine the robustness of self-contained fire extinguishers in plutonium gloveboxes; before deployment in a nuclear facility, verification tests must be performed. These tests include activation and mass loss calorimeter tests. In addition, compatibility issues with the chemical components of the self-contained fire extinguishers need to be addressed. Our study presents activation and mass loss calorimeter test results. After extensive studies, no critical areas of concern have been identified for the plutonium glovebox application of Fire Foe™, except for glovebox operations that use large quantities of bulk plutonium or uranium metal, such as metal casting and pyro-chemistry operations.

  17. Verification study of an emerging fire suppression system

    DOE PAGES

    Cournoyer, Michael E.; Waked, R. Ryan; Granzow, Howard N.; ...

    2016-01-01

    Self-contained fire extinguishers are a robust, reliable, and minimally invasive means of fire suppression for gloveboxes. Moreover, plutonium gloveboxes present harsh environmental conditions for polymer materials; these include radiation damage and chemical exposure, both of which tend to degrade the lifetime of engineered polymer components. Several studies have been conducted to determine the robustness of self-contained fire extinguishers in plutonium gloveboxes; before deployment in a nuclear facility, verification tests must be performed. These tests include activation and mass loss calorimeter tests. In addition, compatibility issues with the chemical components of the self-contained fire extinguishers need to be addressed. Our study presents activation and mass loss calorimeter test results. After extensive studies, no critical areas of concern have been identified for the plutonium glovebox application of Fire Foe™, except for glovebox operations that use large quantities of bulk plutonium or uranium metal, such as metal casting and pyro-chemistry operations.

  18. National Verification System of National Meteorological Center , China

    NASA Astrophysics Data System (ADS)

    Zhang, Jinyan; Wei, Qing; Qi, Dan

    2016-04-01

    The Product Quality Verification Division for official weather forecasting of China was founded in April 2011. It is affiliated with the Forecast System Laboratory (FSL), National Meteorological Center (NMC), China. There are three employees in this department; I am one of them and am in charge of the Product Quality Verification Division in NMC, China. After five years of construction, an integrated real-time National Verification System of NMC, China has been established. At present, its primary roles are: 1) to verify the official weather forecasting quality of NMC, China; 2) to verify the official city weather forecasting quality of the Provincial Meteorological Bureaus; and 3) to evaluate forecasting quality for each forecaster in NMC, China. To verify the official weather forecasting quality of NMC, China, we have developed: • Grid QPF verification module (including upscaling) • Grid temperature, humidity and wind forecast verification module • Severe convective weather forecast verification module • Typhoon forecast verification module • Disaster forecast verification module • Disaster warning verification module • Medium and extended period forecast verification module • Objective elements forecast verification module • Ensemble precipitation probabilistic forecast verification module. To verify the official city weather forecasting quality of the Provincial Meteorological Bureaus, we have developed: • City elements forecast verification module • Public heavy rain forecast verification module • City air quality forecast verification module. To evaluate forecasting quality for each forecaster in NMC, China, we have developed: • Off-duty forecaster QPF practice evaluation module • QPF evaluation module for forecasters • Severe convective weather forecast evaluation module • Typhoon track forecast evaluation module for forecasters • Disaster warning evaluation module for forecasters • Medium and extended period forecast evaluation module. The further

  19. Software engineering and automatic continuous verification of scientific software

    NASA Astrophysics Data System (ADS)

    Piggott, M. D.; Hill, J.; Farrell, P. E.; Kramer, S. C.; Wilson, C. R.; Ham, D.; Gorman, G. J.; Bond, T.

    2011-12-01

    Software engineering of scientific code is challenging for a number of reasons, including pressure to publish and a lack of awareness of the pitfalls of software engineering by scientists. The Applied Modelling and Computation Group at Imperial College is a diverse group of researchers that employ best-practice software engineering methods whilst developing open source scientific software. Our main code is Fluidity, a multi-purpose computational fluid dynamics (CFD) code that can be used for a wide range of scientific applications from earth-scale mantle convection, through basin-scale ocean dynamics, to laboratory-scale classic CFD problems, and is coupled to a number of other codes including nuclear radiation and solid modelling. Our software development infrastructure consists of a number of free tools that could be employed by any group that develops scientific code and has been developed over a number of years with many lessons learnt. A single code base is developed by over 30 people, for which we use Bazaar for revision control, making good use of its strong branching and merging capabilities. Using features of Canonical's Launchpad platform, such as code review, blueprints for designing features, and bug reporting, gives the group, partners, and other Fluidity users an easy-to-use platform on which to collaborate and allows the induction of new members of the group into an environment where software development forms a central part of their work. The code repository is coupled to an automated test and verification system which performs over 20,000 tests, including unit tests, short regression tests, code verification, and large parallel tests. Included in these tests are build tests on HPC systems, including local and UK National HPC services. The testing of code in this manner leads to a continuous verification process, not a discrete event performed once development has ceased. Much of the code verification is done via the "gold standard" of comparisons to analytical
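
    The automated test system described above includes code-verification tests that compare numerical output against analytical "gold standard" solutions. The short sketch below shows the general shape of such a test; the toy problem and tolerance are chosen purely for illustration.

      import math

      def integrate_exp_decay(rate, t_end, dt):
          """Forward-Euler solution of dy/dt = -rate * y with y(0) = 1 (toy numerical code)."""
          y, t = 1.0, 0.0
          while t < t_end:
              y += dt * (-rate * y)
              t += dt
          return y

      def test_against_analytical():
          """Verification/regression test: compare the numerical result to exp(-rate * t)."""
          rate, t_end = 2.0, 1.0
          numerical = integrate_exp_decay(rate, t_end, dt=1e-4)
          analytical = math.exp(-rate * t_end)
          assert abs(numerical - analytical) < 1e-3, (numerical, analytical)

      test_against_analytical()
      print("verification test passed")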

  20. Electromagnetic Signature Technique as a Promising Tool to Verify Nuclear Weapons Storage and Dismantlement under a Nuclear Arms Control Regime

    SciTech Connect

    Bunch, Kyle J.; Williams, Laura S.; Jones, Anthony M.; Ramuhalli, Pradeep

    2012-08-01

    The 2010 ratification of the New START Treaty has been widely regarded as a noteworthy national security achievement for both the Obama administration and the Medvedev-Putin regime, but deeper cuts are envisioned under future arms control regimes. Future verification needs will include monitoring the storage of warhead components and fissile materials and verifying dismantlement of warheads, pits, secondaries, and other materials. From both the diplomatic and technical perspectives, verification under future arms control regimes will pose new challenges. Since acceptable verification technology must protect sensitive design information and attributes, non-nuclear non-sensitive signatures may provide a significant verification tool without the use of additional information barriers. The use of electromagnetic signatures to monitor nuclear material storage containers is a promising technology with the potential to fulfill these challenging requirements. Research performed at Pacific Northwest National Laboratory (PNNL) has demonstrated that low frequency electromagnetic signatures of sealed metallic containers can be used to confirm the presence of specific components on a “yes/no” basis without revealing classified information. Arms control inspectors might use this technique to verify the presence or absence of monitored items, including both nuclear and non-nuclear materials. Although additional research is needed to study signature aspects such as uniqueness and investigate container-specific scenarios, the technique potentially offers a rapid and cost-effective tool to verify reduction and dismantlement of U.S. and Russian nuclear weapons.