Science.gov

Sample records for nuclear verification helping

  1. Nuclear disarmament verification

    SciTech Connect

    DeVolpi, A.

    1993-12-31

    Arms control treaties, unilateral actions, and cooperative activities -- reflecting the defusing of East-West tensions -- are causing nuclear weapons to be disarmed and dismantled worldwide. In order to provide for future reductions and to build confidence in the permanency of this disarmament, verification procedures and technologies would play an important role. This paper outlines arms-control objectives, treaty organization, and actions that could be undertaken. For the purposes of this Workshop on Verification, nuclear disarmament has been divided into five topical subareas: Converting nuclear-weapons production complexes, Eliminating and monitoring nuclear-weapons delivery systems, Disabling and destroying nuclear warheads, Demilitarizing or non-military utilization of special nuclear materials, and Inhibiting nuclear arms in non-nuclear-weapons states. This paper concludes with an overview of potential methods for verification.

  2. Helping nuclear power help us

    SciTech Connect

    Schecker, Jay A

    2009-01-01

    After a prolonged absence, the word 'nuclear' has returned to the lexicon of sustainable domestic energy resources. Due in no small part to its demonstrated reliability, nuclear power is poised to play a greater role in the nation's energy future, producing clean, carbon-neutral electricity and contributing even more to our energy security. To nuclear scientists, the resurgence presents an opportunity to inject new technologies into the industry to maximize the benefits that nuclear energy can provide. 'By developing new options for waste management and exploiting new materials to make key technological advances, we can significantly impact the use of nuclear energy in our future energy mix,' says Chris Stanek, a materials scientist at Los Alamos National Laboratory. Stanek approaches the big technology challenges by thinking way small, all the way down to the atoms. He and his colleagues are using cutting-edge atomic-scale simulations to address a difficult aspect of nuclear waste -- predicting its behavior far into the future. Their research is part of a broader, coordinated effort on the part of the Laboratory to use its considerable experimental, theoretical, and computational capabilities to explore advanced materials central not only to waste issues, but to nuclear fuels as well.

  3. Nuclear Data Verification and Standardization

    SciTech Connect

    Karam, Lisa R.; Arif, Muhammad; Thompson, Alan K.

    2011-10-01

    The objective of this interagency program is to provide accurate neutron interaction verification and standardization data for the U.S. Department of Energy Division of Nuclear Physics programs, which include astrophysics, radioactive beam studies, and heavy-ion reactions. The measurements made in this program are also useful to other programs that indirectly use the unique properties of the neutron for diagnostic and analytical purposes, including homeland security, personnel health and safety, nuclear waste disposal, treaty verification, national defense, and nuclear-based energy production. The work includes the verification of reference standard cross sections and related neutron data employing the unique facilities and capabilities at NIST and other laboratories as required; leadership and participation in international intercomparisons and collaborations; and the preservation of standard reference deposits. An essential element of the program is the critical evaluation of neutron interaction data standards, including international coordination. Data testing of critical data for important applications is included. The program is jointly supported by the Department of Energy and the National Institute of Standards and Technology.

  4. The monitoring and verification of nuclear weapons

    SciTech Connect

    Garwin, Richard L.

    2014-05-09

    This paper partially reviews and updates the potential for monitoring and verification of nuclear weapons, including verification of their destruction. Cooperative monitoring with templates of the gamma-ray spectrum is an important tool, dependent on the use of information barriers.
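    The template idea can be sketched as a comparison routine that outputs only a pass/fail attribute. The spectra, statistic, and threshold below are invented for illustration; a real system would run this comparison behind an information barrier so the raw spectrum is never exposed.

```python
import math

def template_match(measured, template, sigma=1.0, threshold=2.0):
    """Compare a measured gamma spectrum to a stored template and
    return only a pass/fail attribute, never the raw channel data.
    Illustrative statistic: RMS channel deviation against a threshold."""
    assert len(measured) == len(template)
    rms = math.sqrt(sum((m - t) ** 2 for m, t in zip(measured, template))
                    / len(template))
    return rms / sigma <= threshold

template  = [100, 250, 80, 40, 10]   # hypothetical reference counts per channel
authentic = [101, 248, 82, 39, 11]   # same item, small statistical noise
spoof     = [100, 250, 10, 40, 10]   # one emission line missing

print(template_match(authentic, template))  # True: matches the template
print(template_match(spoof, template))      # False: flagged as non-matching
```

    The inspector sees only the boolean result, which is the point of the information barrier.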

  5. Thoughts on Verification of Nuclear Disarmament

    SciTech Connect

    Dunlop, W H

    2007-09-26

    perfect and it was expected that occasionally there might be a verification measurement that was slightly above 150 kt. But the accuracy was much improved over the earlier seismic measurements. In fact, some of this improvement came about because, as part of this verification protocol, the US and Soviet Union provided the yields of several past tests to improve seismic calibrations. This provided a much-needed calibration for the seismic measurements. It was also accepted that, since nuclear tests were to a large part R&D related, there might occasionally be a test slightly above 150 kt, as the yield could not always be predicted with high accuracy in advance of the test. While one could hypothesize that the Soviets might test at some location other than their test sites, a test of even a small fraction of 150 kt would clearly be observed and would be a violation of the treaty. So the issue of clandestine tests of significance was easily covered for this particular treaty.
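    The calibration point can be made concrete with the standard magnitude-yield form mb = a + b*log10(Y). The coefficients below are illustrative only; real values are region- and path-dependent, which is exactly why the exchanged yields of past tests were so valuable for calibration.

```python
import math

# Illustrative magnitude-yield relation mb = A + B*log10(Y).
# The coefficients are hypothetical; in practice they depend on
# the test site geology and the seismic propagation path.
A, B = 4.45, 0.75

def mb_from_yield(y_kt):
    """Body-wave magnitude predicted for a yield Y in kilotons."""
    return A + B * math.log10(y_kt)

def yield_from_mb(mb):
    """Invert mb = A + B*log10(Y) to estimate yield in kilotons."""
    return 10 ** ((mb - A) / B)

mb_150 = mb_from_yield(150.0)
# Even a modest +-0.1 magnitude-unit calibration error maps to a
# wide yield band around the 150 kt threshold:
low, high = yield_from_mb(mb_150 - 0.1), yield_from_mb(mb_150 + 0.1)
print(round(low), round(high))  # roughly 110 to 204 kt
```

    This spread is why a measurement "slightly above 150 kt" could still be consistent with treaty compliance.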

  6. DESIGN INFORMATION VERIFICATION FOR NUCLEAR SAFEGUARDS

    SciTech Connect

    Robert S. Bean; Richard R. M. Metcalf; Phillip C. Durst

    2009-07-01

    A critical aspect of international safeguards activities performed by the International Atomic Energy Agency (IAEA) is the verification that facility design and construction (including upgrades and modifications) do not create opportunities for nuclear proliferation. These Design Information Verification activities require that IAEA inspectors compare current and past information about the facility to verify the operator’s declaration of proper use. The actual practice of DIV presents challenges to the inspectors due to the large amount of data generated, concerns about sensitive or proprietary data, the overall complexity of the facility, and the effort required to extract just the safeguards relevant information. Planned and anticipated facilities will (especially in the case of reprocessing plants) be ever larger and increasingly complex, thus exacerbating the challenges. This paper reports the results of a workshop held at the Idaho National Laboratory in March 2009, which considered technologies and methods to address these challenges. The use of 3D Laser Range Finding, Outdoor Visualization System, Gamma-LIDAR, and virtual facility modeling, as well as methods to handle the facility data issues (quantity, sensitivity, and accessibility and portability for the inspector) were presented. The workshop attendees drew conclusions about the use of these techniques with respect to successfully employing them in an operating environment, using a Fuel Conditioning Facility walk-through as a baseline for discussion.

  7. A Zero Knowledge Protocol For Nuclear Warhead Verification

    SciTech Connect

    Glaser, Alexander; Goldston, Robert J.

    2014-03-14

    The verification of nuclear warheads for arms control faces a paradox: international inspectors must gain high confidence in the authenticity of submitted items while learning nothing about them. Conventional inspection systems featuring 'information barriers', designed to hide measurements stored in electronic systems, are at risk of tampering and snooping. Here we show the viability of a fundamentally new approach to nuclear warhead verification that incorporates a zero-knowledge protocol, designed such that sensitive information is never measured and so does not need to be hidden. We interrogate submitted items with energetic neutrons, making, in effect, differential measurements of neutron transmission and emission. Calculations of diversion scenarios show that a high degree of discrimination can be achieved while revealing zero information. Timely demonstration of the viability of such an approach could be critical for the next round of arms-control negotiations, which will likely require verification of individual warheads rather than whole delivery systems.
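    A minimal sketch of the preload trick behind such a zero-knowledge differential measurement, with invented detector counts (a real implementation would use non-electronic detectors and true neutron statistics): the host preloads each detector with the complement of the expected counts, so a genuine item yields the same flat total everywhere and the inspector learns nothing about the underlying pattern.

```python
N = 1000  # agreed target total per detector position

# Hypothetical neutron-transmission counts for the true warhead design
# (kept secret from the inspector) and for a hoax with material diverted:
true_pattern = [620, 480, 510, 700]
hoax_pattern = [620, 480, 900, 700]

# Prepared by the host in advance, out of the inspector's sight:
preload = [N - c for c in true_pattern]

def inspect(item_pattern):
    """The inspector sees only per-detector totals (preload + measurement),
    never the measurement itself."""
    return [p + c for p, c in zip(preload, item_pattern)]

print(inspect(true_pattern))  # flat [1000, 1000, 1000, 1000]: reveals nothing
print(inspect(hoax_pattern))  # the deviated position flags the hoax
```

    An authentic item produces a featureless result by construction, so there is no sensitive information to protect.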

  8. Nuclear reaction modeling, verification experiments, and applications

    SciTech Connect

    Dietrich, F.S.

    1995-10-01

    This presentation summarized the recent accomplishments and future promise of the neutron nuclear physics program at the Manuel Lujan Jr. Neutron Scattering Center (MLNSC) and the Weapons Neutron Research (WNR) facility. The unique capabilities of the spallation sources enable a broad range of experiments in weapons-related physics, basic science, nuclear technology, industrial applications, and medical physics.

  9. Physical cryptographic verification of nuclear warheads.

    PubMed

    Kemp, R Scott; Danagoulian, Areg; Macdonald, Ruaridh R; Vavrek, Jayson R

    2016-08-01

    How does one prove a claim about a highly sensitive object such as a nuclear weapon without revealing information about the object? This paradox has challenged nuclear arms control for more than five decades. We present a mechanism in the form of an interactive proof system that can validate the structure and composition of an object, such as a nuclear warhead, to arbitrary precision without revealing either its structure or composition. We introduce a tomographic method that simultaneously resolves both the geometric and isotopic makeup of an object. We also introduce a method of protecting information using a provably secure cryptographic hash that does not rely on electronics or software. These techniques, when combined with a suitable protocol, constitute an interactive proof system that could reject hoax items and clear authentic warheads with excellent sensitivity in reasonably short measurement times.
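    The claimed sensitivity of such an interactive proof system rests on a standard amplification property: if a hoax survives a single challenge round with probability p, it survives n independent rounds with probability p^n, which vanishes exponentially. A sketch with an assumed (hypothetical) per-round false-accept probability of 0.5:

```python
def hoax_pass_probability(p_single, rounds):
    """Probability that a hoax item survives all independent
    challenge rounds of an interactive proof."""
    return p_single ** rounds

# Assumed per-round false-accept probability of 0.5 (the inspector's
# challenge choice is hidden from the prover), repeated over n rounds:
for n in (1, 10, 20):
    print(n, hoax_pass_probability(0.5, n))
```

    Twenty rounds already drive the false-accept probability below one in a million, which is the sense in which hoax rejection can be made arbitrarily strong in "reasonably short measurement times."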

  11. Physical cryptographic verification of nuclear warheads

    NASA Astrophysics Data System (ADS)

    Kemp, R. Scott; Danagoulian, Areg; Macdonald, Ruaridh R.; Vavrek, Jayson R.

    2016-08-01

    How does one prove a claim about a highly sensitive object such as a nuclear weapon without revealing information about the object? This paradox has challenged nuclear arms control for more than five decades. We present a mechanism in the form of an interactive proof system that can validate the structure and composition of an object, such as a nuclear warhead, to arbitrary precision without revealing either its structure or composition. We introduce a tomographic method that simultaneously resolves both the geometric and isotopic makeup of an object. We also introduce a method of protecting information using a provably secure cryptographic hash that does not rely on electronics or software. These techniques, when combined with a suitable protocol, constitute an interactive proof system that could reject hoax items and clear authentic warheads with excellent sensitivity in reasonably short measurement times.

  12. Physical cryptographic verification of nuclear warheads

    PubMed Central

    Kemp, R. Scott; Danagoulian, Areg; Macdonald, Ruaridh R.; Vavrek, Jayson R.

    2016-01-01

    How does one prove a claim about a highly sensitive object such as a nuclear weapon without revealing information about the object? This paradox has challenged nuclear arms control for more than five decades. We present a mechanism in the form of an interactive proof system that can validate the structure and composition of an object, such as a nuclear warhead, to arbitrary precision without revealing either its structure or composition. We introduce a tomographic method that simultaneously resolves both the geometric and isotopic makeup of an object. We also introduce a method of protecting information using a provably secure cryptographic hash that does not rely on electronics or software. These techniques, when combined with a suitable protocol, constitute an interactive proof system that could reject hoax items and clear authentic warheads with excellent sensitivity in reasonably short measurement times. PMID:27432959

  13. A seismic event analyzer for nuclear test ban treaty verification

    SciTech Connect

    Mason, C.L.; Johnson, R.R. (Dept. of Applied Science, Lawrence Livermore National Lab., CA); Searfus, R.M.; Lager, D.; Canales, T.

    1988-01-01

    This paper presents an expert system that interprets seismic data from Norway's regional seismic array, NORESS, for underground nuclear weapons test ban treaty verification. Three important aspects of the expert system are: (1) it emulates the problem-solving behavior of the human seismic analyst using an Assumption-Based Truth Maintenance System; (2) it acts as an assistant to the human analyst by automatically interpreting and presenting events for review; and (3) it enables the analyst to interactively query the system's chain of reasoning and manually perform an interpretation. The general problem of seismic treaty verification is described. The expert system is presented in terms of its knowledge representation structures, assumption-based reasoning system, user interface elements, and initial performance results. 8 refs., 10 figs., 2 tabs.

  14. Nuclear Proliferation Using Laser Isotope Separation -- Verification Options

    SciTech Connect

    Erickson, S A

    2001-10-15

    Two levels of nonproliferation verification exist. Signatories of the basic agreements under the Nuclear Non-proliferation Treaty (NPT) agree to open their nuclear sites to inspection by the IAEA. A more detailed and intrusive level was developed following the determination that Iraq had begun a nuclear weapons development program that was not detected by the original level of verification methods. This level, referred to as 93+2 and detailed in model protocol INFCIRC/540, allows the IAEA to do environmental monitoring of non-declared facilities that are suspected of containing proliferation activity, and possibly further inspections, as well as allowing more detailed inspections of declared sites. 56 countries have signed a Strengthened Safeguards Systems Additional Protocol as of 16 July 2001. These additional inspections can be done on the instigation of the IAEA itself, or after requests by other parties to the NPT, based on information that they have collected. Since information able to cause suspicion of proliferation could arrive at any country, it is important that countries have procedures in place that will assist them in making decisions related to these inspections. Furthermore, IAEA inspection resources are limited, and therefore care needs to be taken to make best use of these resources. Most of the nonproliferation verification inspections may be concentrated on establishing that diversion of nuclear materials is not occurring, but some fraction will be related to determining if undeclared sites have nuclear materials production taking place within them. Of these, most suspicions will likely be related to the major existing technologies for uranium enrichment and reprocessing for plutonium extraction, as it would seem most likely that nations attempting proliferation would use tested means of producing nuclear materials. However, as technology continues to advance and new methods of enrichment and reprocessing are developed, inspection

  15. A zero-knowledge protocol for nuclear warhead verification

    NASA Astrophysics Data System (ADS)

    Glaser, Alexander; Barak, Boaz; Goldston, Robert J.

    2014-06-01

    The verification of nuclear warheads for arms control involves a paradox: international inspectors will have to gain high confidence in the authenticity of submitted items while learning nothing about them. Proposed inspection systems featuring `information barriers', designed to hide measurements stored in electronic systems, are at risk of tampering and snooping. Here we show the viability of a fundamentally new approach to nuclear warhead verification that incorporates a zero-knowledge protocol, which is designed in such a way that sensitive information is never measured and so does not need to be hidden. We interrogate submitted items with energetic neutrons, making, in effect, differential measurements of both neutron transmission and emission. Calculations for scenarios in which material is diverted from a test object show that a high degree of discrimination can be achieved while revealing zero information. Our ideas for a physical zero-knowledge system could have applications beyond the context of nuclear disarmament. The proposed technique suggests a way to perform comparisons or computations on personal or confidential data without measuring the data in the first place.

  16. A zero-knowledge protocol for nuclear warhead verification.

    PubMed

    Glaser, Alexander; Barak, Boaz; Goldston, Robert J

    2014-06-26

    The verification of nuclear warheads for arms control involves a paradox: international inspectors will have to gain high confidence in the authenticity of submitted items while learning nothing about them. Proposed inspection systems featuring 'information barriers', designed to hide measurements stored in electronic systems, are at risk of tampering and snooping. Here we show the viability of a fundamentally new approach to nuclear warhead verification that incorporates a zero-knowledge protocol, which is designed in such a way that sensitive information is never measured and so does not need to be hidden. We interrogate submitted items with energetic neutrons, making, in effect, differential measurements of both neutron transmission and emission. Calculations for scenarios in which material is diverted from a test object show that a high degree of discrimination can be achieved while revealing zero information. Our ideas for a physical zero-knowledge system could have applications beyond the context of nuclear disarmament. The proposed technique suggests a way to perform comparisons or computations on personal or confidential data without measuring the data in the first place.

  17. Mobile Pit verification system design based on passive special nuclear material verification in weapons storage facilities

    SciTech Connect

    Paul, J. N.; Chin, M. R.; Sjoden, G. E.

    2013-07-01

    A mobile 'drive-by' passive radiation detection system to be applied in special nuclear materials (SNM) storage facilities for validation and compliance purposes has been designed through the use of computational modeling and new radiation detection methods. This project was the result of work over a 1-year period to create optimal design specifications, including the creation of 3D models using both Monte Carlo and deterministic codes to characterize the gamma and neutron leakage out of each surface of SNM-bearing canisters. Results were compared and agreement was demonstrated between both models. Container leakages were then used with transport theory to determine the expected reaction rates in the detectors when placed at varying distances from the canister. A 'typical' background signature was incorporated to determine the minimum signatures versus the probability of detection, in order to evaluate moving-source protocols with collimation. This established the criteria for verification of source presence and time gating at a given vehicle speed. New methods for the passive detection of SNM were employed and shown to give reliable identification of age and material for highly enriched uranium (HEU) and weapons-grade plutonium (WGPu). The finalized 'Mobile Pit Verification System' (MPVS) design demonstrated that a 'drive-by' detection system, collimated and operating at nominally 2 mph, is capable of rapidly verifying each and every weapon pit stored in regularly spaced, shelved storage containers, using completely passive gamma and neutron signatures for HEU and WGPu. This system is ready for real evaluation to demonstrate passive total material accountability in storage facilities. (authors)
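    The trade-off between vehicle speed and detectability can be sketched with simple counting statistics. All rates and geometry below are hypothetical, and the 3-sigma criterion is a conventional stand-in for the paper's actual detection protocol:

```python
import math

def detectable(source_rate, background_rate, aperture_m, speed_mps, k=3.0):
    """Decide whether a moving collimated detector sees a source above
    background: dwell time = aperture width / vehicle speed, and the
    source is 'detected' if net counts exceed k standard deviations
    of the background counts collected in the same dwell time."""
    t = aperture_m / speed_mps
    net_counts = source_rate * t
    sigma_bkg = math.sqrt(background_rate * t)
    return net_counts > k * sigma_bkg

# Hypothetical numbers: 100 cps net from a storage container, 200 cps
# background, 0.5 m collimated field of view; 2 mph is about 0.9 m/s.
print(detectable(100, 200, 0.5, 0.9))    # slow drive-by pass
print(detectable(100, 200, 0.5, 20.0))   # highway speed: too little dwell time
```

    The quadratic penalty (net counts fall linearly with speed while the background sigma falls only as the square root) is what makes the nominal 2 mph speed a design parameter rather than a convenience.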

  18. 75 FR 34439 - Defense Science Board Task Force on Nuclear Treaty Monitoring and Verification

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-06-17

    ... of the Secretary Defense Science Board Task Force on Nuclear Treaty Monitoring and Verification... Science Board Task Force on Nuclear Treaty Monitoring and Verification will meet in closed session on July... might be implemented. The task force's findings and recommendations, pursuant to 41 CFR...

  19. TRANSPARENCY, VERIFICATION AND THE FUTURE OF NUCLEAR NONPROLIFERATION AND ARMS CONTROL

    SciTech Connect

    J. PILAT

    2000-11-01

    In the future, if the nuclear nonproliferation and arms control agendas are to advance, they will likely become increasingly seen as parallel undertakings with the objective of cradle-to-grave controls over nuclear warheads and/or materials. The pursuit of such an agenda was difficult enough at the outset of the nuclear age; it will be more difficult in the future with relatively widespread military and civil nuclear programs. This agenda will require both verification and transparency. To address emerging nuclear dangers, we may expect hybrid verification-transparency regimes to be seen as acceptable. Such regimes would have intrusive but much more limited verification provisions than Cold War accords, and would have extensive transparency provisions designed in part to augment the verification measures, to fill in the 'gaps' of the verification regime, and the like.

  20. Supporting the President's Arms Control and Nonproliferation Agenda: Transparency and Verification for Nuclear Arms Reductions

    SciTech Connect

    Doyle, James E; Meek, Elizabeth

    2009-01-01

    The President's arms control and nonproliferation agenda is still evolving and the details of initiatives supporting it remain undefined. This means that DOE, NNSA, NA-20, NA-24 and the national laboratories can help define the agenda, and the policies and the initiatives to support it. This will require effective internal and interagency coordination. The arms control and nonproliferation agenda is broad and includes the path-breaking goal of creating conditions for the elimination of nuclear weapons. Responsibility for various elements of the agenda will be widely scattered across the interagency. Therefore an interagency mapping exercise should be performed to identify the key points of engagement within NNSA and other agencies for creating effective policy coordination mechanisms. These can include informal networks, working groups, coordinating committees, interagency task forces, etc. It will be important for NA-20 and NA-24 to get a seat at the table and a functional role in many of these coordinating bodies. The arms control and nonproliferation agenda comprises both mature and developing policy initiatives. The more mature elements such as CTBT ratification and a follow-on strategic nuclear arms treaty with Russia have defined milestones. However, recent press reports indicate that even the START follow-on strategic arms pact that is planned to be complete by the end of 2009 may take significantly longer and be more expansive in scope. The Russians called for proposals to count non-deployed as well as deployed warheads. Other elements of the agenda such as FMCT, future bilateral nuclear arms reductions following a START follow-on treaty, nuclear posture changes, preparations for an international nuclear security summit, strengthened international safeguards and multilateral verification are in much earlier stages of development. 
For this reason any survey of arms control capabilities within the USG should be structured to address potential needs across the

  1. Fuzzy-logic-based safety verification framework for nuclear power plants.

    PubMed

    Rastogi, Achint; Gabbar, Hossam A

    2013-06-01

    This article presents a practical implementation of a safety verification framework for nuclear power plants (NPPs) based on fuzzy logic, in which hazard scenarios are identified in view of safety and control limits on different plant process values. Risk is estimated quantitatively and compared with safety limits in real time so that safety verification can be achieved. Fuzzy logic is used to define safety rules that map hazard conditions to the required safety protection in view of the risk estimate. Case studies from an NPP are analyzed to realize the proposed real-time safety verification framework. An automated system is developed to demonstrate the safety limit for different hazard scenarios.
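    A minimal sketch of the fuzzy mapping the article describes, with invented membership functions, rule centers, and temperature limits (the paper's actual rules are plant-specific): triangular memberships grade a process value into hazard levels, and a centroid-style defuzzification produces a single risk estimate.

```python
def tri(x, a, b, c):
    """Triangular membership function rising from a, peaking at b, falling to c."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x < b else (c - x) / (c - b)

def risk_estimate(temp_c):
    """Illustrative fuzzy rules: normal temperature -> low risk,
    elevated -> medium risk, near the safety limit -> high risk.
    Defuzzify by a weighted average of rule output centers."""
    mu = {
        "low":    tri(temp_c, -1, 280, 310),
        "medium": tri(temp_c, 290, 320, 350),
        "high":   tri(temp_c, 330, 360, 1000),
    }
    centers = {"low": 0.1, "medium": 0.5, "high": 0.9}
    num = sum(mu[k] * centers[k] for k in mu)
    den = sum(mu.values()) or 1.0
    return num / den

print(round(risk_estimate(285), 2))  # well inside the safe band: low risk
print(round(risk_estimate(345), 2))  # approaching the safety limit: high risk
```

    Because memberships overlap, the risk estimate varies smoothly with the process value instead of jumping at a hard threshold, which is the practical appeal of the fuzzy formulation for real-time monitoring.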

  2. INFCIRC/153 as the basis for verification of a special nuclear materials production cutoff convention

    SciTech Connect

    Parsick, R.; Sanborn, J.

    1994-08-01

    Although safeguards practice under the Nuclear Non-Proliferation Treaty (NPT) has evolved considerably over the lifetime of the agreement, INFCIRC/153 remains the defining document of NPT verification. It is the only available document that might be adopted as an element of a special nuclear materials production cutoff convention to define an 'NPT-like' verification regime. The clearly stated objective of INFCIRC/153 safeguards is the ability to 'detect diversion,' which is achieved by verifying the state's nuclear material accounting system. Although the way in which the verification objectives of a cutoff convention will be formulated is not yet known, it is clear that there will not be an exact fit between the two. The mismatch becomes greater to the extent that material in declared facilities is 'grandfathered' under cutoff. It also increases as a cutoff regime focuses verification on the physical process of production (in operating facilities) or on facility operational status or capability (at non-operating facilities) rather than on the nuclear material itself. This paper compares the technical objectives which may be assigned to the IAEA under a cutoff convention with those described in INFCIRC/153. It also deals with the means and limitations of verification activities which are closely linked to these objectives. The broader political and legal objectives associated with the convention are not considered.
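    The 'detect diversion' objective rests on closing a material balance over each accounting period. A minimal sketch of that test, with hypothetical inventory figures and a conventional 3-sigma alarm threshold standing in for the statistical tests actually used:

```python
def muf(begin_inv, receipts, shipments, end_inv):
    """Material Unaccounted For over a balance period (kg):
    MUF = beginning inventory + receipts - shipments - ending inventory."""
    return begin_inv + receipts - shipments - end_inv

def diversion_alarm(muf_kg, sigma_muf_kg, k=3.0):
    """Raise an alarm if MUF exceeds k times its measurement uncertainty."""
    return abs(muf_kg) > k * sigma_muf_kg

# Hypothetical balance period for a declared facility:
m = muf(begin_inv=102.0, receipts=40.0, shipments=35.0, end_inv=104.5)
print(m)                        # 2.5 kg unaccounted for
print(diversion_alarm(m, 0.5))  # True: well beyond the 3-sigma band
```

    The mismatch the abstract describes follows directly: a cutoff regime verifying production status rather than inventories has no such balance to close, so this test cannot simply be carried over.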

  3. Verification and Uncertainty Reduction of Amchitka Underground Nuclear Testing Models

    SciTech Connect

    Ahmed Hassan; Jenny Chapman

    2006-02-01

    The model of the Amchitka underground nuclear tests developed in 2002 is verified, and uncertainty in model input parameters, as well as in predictions, has been reduced using data obtained by the summer 2004 field expedition of CRESP. Newly collected data that pertain to the groundwater model include magnetotelluric (MT) surveys conducted on the island to determine the subsurface salinity and porosity structure, and bathymetric surveys to produce bathymetric maps of the areas offshore from the Long Shot and Cannikin sites. Analysis and interpretation of the MT data yielded information on the location of the transition zone, and porosity profiles showing porosity values decaying with depth. These new data sets are used to verify the original model in terms of model parameters, model structure, and model output. In addition, by using the new data along with the existing chemistry and head data, the uncertainty in model input and output is decreased by conditioning on all the available data. A Markov Chain Monte Carlo (MCMC) approach is adopted for developing new input parameter distributions conditioned on prior knowledge and new data. The MCMC approach is a form of Bayesian conditioning constructed in such a way that it produces samples of the model parameters that eventually converge to a stationary posterior distribution. The Bayesian MCMC approach enhances probabilistic assessment: instead of simply propagating uncertainty forward from input parameters into model predictions (the traditional Monte Carlo approach), MCMC propagates uncertainty backward from data onto parameters, and then forward from parameters into predictions.
Comparisons between new data and the original model, and conditioning on all available data using MCMC method, yield the following results and conclusions: (1) Model structure is verified at Long Shot and Cannikin where the high-resolution bathymetric data collected by CRESP
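    The MCMC conditioning step can be illustrated with a minimal Metropolis sampler. The prior, likelihood, and porosity numbers below are invented for illustration; the study's actual parameter set and data model are far richer.

```python
import math, random

def metropolis(log_post, x0, steps=20000, scale=0.1, seed=1):
    """Minimal Metropolis sampler: Gaussian random-walk proposals,
    accepted with probability min(1, post(x')/post(x))."""
    rng = random.Random(seed)
    x, lp, samples = x0, log_post(x0), []
    for _ in range(steps):
        xp = x + rng.gauss(0.0, scale)
        lpp = log_post(xp)
        if math.log(rng.random()) < lpp - lp:
            x, lp = xp, lpp
        samples.append(x)
    return samples

# Hypothetical setup: uniform prior on porosity in [0, 0.4], Gaussian
# likelihood from an observed value of 0.12 +- 0.02 (made-up numbers).
def log_post(phi):
    if not 0.0 <= phi <= 0.4:
        return float("-inf")
    return -0.5 * ((phi - 0.12) / 0.02) ** 2

chain = metropolis(log_post, x0=0.2)
burned = chain[5000:]  # discard burn-in
print(round(sum(burned) / len(burned), 2))  # posterior mean near 0.12
```

    This is the "backward from data onto parameters" direction: the chain starts at the prior guess of 0.2 but settles around the data-supported value, and the resulting samples can then be pushed forward through the groundwater model to give data-conditioned predictions.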

  4. Design Verification Report Spent Nuclear Fuel (SNF) Project Canister Storage Building (CSB)

    SciTech Connect

    PICKETT, W.W.

    2000-09-22

    The Sub-project W379, ''Spent Nuclear Fuel Canister Storage Building (CSB),'' was established as part of the Spent Nuclear Fuel (SNF) Project. The primary mission of the CSB is to safely store spent nuclear fuel removed from the K Basins in dry storage until it can be transferred to the national geological repository at Yucca Mountain, Nevada. This sub-project was initiated in late 1994 by a series of studies and conceptual designs. These studies determined that the partially constructed storage building, originally built as part of the Hanford Waste Vitrification Plant (HWVP) Project, could be redesigned to safely store the spent nuclear fuel. The scope of the CSB facility initially included a receiving station, a hot conditioning system, a storage vault, and a Multi-Canister Overpack (MCO) Handling Machine (MHM). Because of the evolution of the project technical strategy, the hot conditioning system was deleted from the scope and MCO welding and sampling stations were added in its place. This report outlines the methods, procedures, and outputs developed by Project W379 to verify that the provided Structures, Systems, and Components (SSCs) satisfy the design requirements and acceptance criteria; perform their intended function; ensure that failure modes and hazards have been addressed in the design; and ensure that the SSCs as installed will not adversely impact other SSCs. Because this sub-project is still in the construction/start-up phase, not all verification activities have yet been performed (e.g., canister cover cap and welding fixture system verification, MCO internal gas sampling equipment verification, and as-built verification). The verification activities identified in this report that are still to be performed will be added to the start-up punchlist and tracked to closure.

  5. Verification Study of Buoyancy-Driven Turbulent Nuclear Combustion

    SciTech Connect

    2010-01-01

    Buoyancy-driven turbulent nuclear combustion determines the rate of nuclear burning during the deflagration phase (i.e., the ordinary nuclear flame phase) of Type Ia supernovae, and hence the amount of nuclear energy released during this phase. It therefore determines how much the white dwarf star expands prior to initiation of a detonation wave, and so the amount of radioactive nickel produced and thus the peak luminosity of the explosion. However, this key physical process is not fully understood. To better understand it, the Flash Center has conducted an extensive series of large-scale 3D simulations of buoyancy-driven turbulent nuclear combustion for three different physical situations. This movie shows the results for some of these simulations. Credits: Science: Ray Bair, Katherine Riley, Argonne National Laboratory; Anshu Dubey, Don Lamb, Dongwook Lee, University of Chicago; Robert Fisher, University of Massachusetts at Dartmouth; and Dean Townsley, University of Alabama

 Visualization: Jonathan Gallagher, University of Chicago; Randy Hudson, John Norris and Michael E. Papka, Argonne National Laboratory/University of Chicago This research used resources of the Argonne Leadership Computing Facility at Argonne National Laboratory, which is supported by the Office of Science of the U.S. Dept. of Energy under contract DE-AC02-06CH11357. This research was supported in part by the National Science Foundation through TeraGrid resources provided by the University of Chicago and Argonne National Laboratory.

  6. North Korea's nuclear weapons program: verification priorities and new challenges.

    SciTech Connect

    Moon, Duk-ho

    2003-12-01

    A comprehensive settlement of the North Korean nuclear issue may involve military, economic, political, and diplomatic components, many of which will require verification to ensure reciprocal implementation. This paper sets out potential verification methodologies that might address a wide range of objectives. The inspection requirements set by the International Atomic Energy Agency form the foundation, first as defined at the time of the Agreed Framework in 1994, and now as modified by the events since revelation of the North Korean uranium enrichment program in October 2002. In addition, refreezing the reprocessing facility and 5 MWe reactor, taking possession of possible weapons components and destroying weaponization capabilities add many new verification tasks. The paper also considers several measures for the short-term freezing of the North's nuclear weapon program during the process of negotiations, should that process be protracted. New inspection technologies and monitoring tools are applicable to North Korean facilities and may offer improved approaches over those envisioned just a few years ago. These are noted, and potential bilateral and regional verification regimes are examined.

  7. A gamma-ray verification system for special nuclear material

    SciTech Connect

    Lanier, R.G.; Prindle, A.L.; Friensehner, A.V.; Buckley, W.M.

    1994-07-01

    The Safeguards Technology Program at the Lawrence Livermore National Laboratory (LLNL) has developed a gamma-ray screening system for use by the Materials Management Section of the Engineering Sciences Division at LLNL for verifying the presence or absence of special nuclear material (SNM) in a sample. This system facilitates the measurements required under the "5610" series of US Department of Energy orders. MMGAM is an intelligent, menu-driven software application that runs on a personal computer and requires a precalibrated multi-channel analyzer and HPGe detector. It provides a very quick and easy-to-use means of determining the presence of SNM in a sample. After guiding the operator through a menu-driven set-up procedure, the system provides an on-screen GO/NO-GO indication after determining the system calibration status. This system represents advances over earlier systems in the areas of ease of use, operator training requirements, and quality assurance. The system records the gamma radiation from a sample using a sequence of measurements involving a background measurement followed immediately by a measurement of the unknown sample. Both spectra are stored and available for analysis or output. In the current application, the presence of the 235U, 238U, 239Pu, and 208Tl isotopes is indicated by extracting, from the stored spectra, four energy "windows" preset around gamma-ray lines characteristic of the radioactive decay of these nuclides. The system is easily extendible to more complicated problems.
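The energy-window logic described above can be sketched in a few lines: subtract the background spectrum from the sample spectrum inside each preset region of interest and flag any isotope whose net counts exceed a threshold. The channel ranges, thresholds, and spectra below are invented for illustration; they are not the MMGAM calibration values.

```python
def net_counts(spectrum, background, window):
    """Net counts in a region of interest after background subtraction."""
    lo, hi = window
    return sum(spectrum[lo:hi]) - sum(background[lo:hi])

def screen(spectrum, background, windows, threshold):
    """Map isotope name -> net counts, for every window over threshold."""
    return {
        iso: n
        for iso, w in windows.items()
        if (n := net_counts(spectrum, background, w)) > threshold
    }

# Simulated 1024-channel spectra with a peak near the 235U 185.7 keV line.
background = [5] * 1024
sample = [5] * 1024
for ch in range(180, 190):
    sample[ch] += 200  # injected peak counts

windows = {"U-235": (175, 195), "Pu-239": (405, 425)}
detected = screen(sample, background, windows, threshold=100)
```

A real instrument would additionally verify the calibration status before reporting, as the abstract notes; this sketch covers only the window-extraction step.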

  8. Help

    ERIC Educational Resources Information Center

    Tollefson, Ann

    2009-01-01

    Planning to start or expand a K-8 critical language program? Looking for support in doing so? There "may" be help at the federal level for great ideas and strong programs. While there have been various pools of federal dollars available to support world language programs for a number of years, the federal government's interest in assuring strong…

  9. Development of a test system for verification and validation of nuclear transport simulations

    SciTech Connect

    White, Morgan C; Triplett, Brian S; Anghaie, Samim

    2008-01-01

    Verification and validation of nuclear data is critical to the accuracy of both stochastic and deterministic particle transport codes. In order to effectively test a set of nuclear data, the data must be applied to a wide variety of transport problems. Performing this task in a timely, efficient manner is tedious. The nuclear data team at Los Alamos National Laboratory, in collaboration with the University of Florida, has developed a methodology to automate the process of nuclear data verification and validation (V&V). This automated V&V process can efficiently test a number of data libraries using well-defined benchmark experiments, such as those in the International Criticality Safety Benchmark Experiment Project (ICSBEP). The process is implemented through an integrated set of Python scripts. Material and geometry data are read from an existing medium or given directly by the user to generate a benchmark experiment template file. The user specifies the choice of benchmark templates, codes, and libraries to form a V&V project. The Python scripts generate input decks for multiple transport codes from the templates, run and monitor individual jobs, and parse the relevant output automatically. The output can then be used to generate reports directly or can be stored into a database for later analysis. This methodology eases the burden on the user by reducing the amount of time and effort required for obtaining and compiling calculation results. The resource savings by using this automated methodology could potentially be an enabling technology for more sophisticated data studies, such as nuclear data uncertainty quantification. Once deployed, this tool will allow the nuclear data community to more thoroughly test data libraries leading to higher fidelity data in the future.
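The template-fill and output-parse steps of such a pipeline can be illustrated with a minimal sketch: render one benchmark case from a template, then pull a figure of merit (k-eff) out of the code's output text. The template syntax, deck fields, and output format here are invented assumptions; the real scripts target MCNP-style decks and ICSBEP benchmark specifications.

```python
import re
from string import Template

# Hypothetical input-deck template for one benchmark case.
DECK_TEMPLATE = Template(
    "title $name\nmaterial $material\ngeometry sphere r=$radius cm\n"
)

def generate_deck(name, material, radius_cm):
    """Render one benchmark input deck from the template."""
    return DECK_TEMPLATE.substitute(name=name, material=material, radius=radius_cm)

def parse_keff(output_text):
    """Extract k-eff and its uncertainty from a (hypothetical) output format."""
    m = re.search(r"final k-eff\s*=\s*([\d.]+)\s*\+/-\s*([\d.]+)", output_text)
    if m is None:
        raise ValueError("k-eff not found in output")
    return float(m.group(1)), float(m.group(2))

deck = generate_deck("heu-met-fast-001", "heu-metal", 8.74)
simulated_output = "... final k-eff = 0.99876 +/- 0.00045 ..."
keff, sigma = parse_keff(simulated_output)
```

The parsed values would then feed a report generator or a results database, as the abstract describes.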

  10. Design Verification Report Spent Nuclear Fuel (SNF) Project Canister Storage Building (CSB)

    SciTech Connect

    BAZINET, G.D.

    2000-11-03

    The Sub-project W379, ''Spent Nuclear Fuel Canister Storage Building (CSB),'' was established as part of the Spent Nuclear Fuel (SNF) Project. The primary mission of the CSB is to safely store spent nuclear fuel removed from the K Basins in dry storage until such time that it can be transferred to the national geological repository at Yucca Mountain, Nevada. This sub-project was initiated in late 1994 by a series of studies and conceptual designs. These studies determined that the partially constructed storage building, originally built as part of the Hanford Waste Vitrification Plant (HWVP) Project, could be redesigned to safely store the spent nuclear fuel. The scope of the CSB facility initially included a receiving station, a hot conditioning system, a storage vault, and a Multi-Canister Overpack (MCO) Handling Machine (MHM). Because of the evolution of the project's technical strategy, the hot conditioning system was deleted from the scope and MCO welding and sampling stations were added in its place. This report outlines the methods, procedures, and outputs developed by Project W379 to verify that the provided Structures, Systems, and Components (SSCs): satisfy the design requirements and acceptance criteria; perform their intended function; ensure that failure modes and hazards have been addressed in the design; and ensure that the SSCs as installed will not adversely impact other SSCs. The original version of this document was prepared by Vista Engineering for the SNF Project. The purpose of this revision is to document completion of verification actions that were pending at the time the initial report was prepared. Verification activities for the installed and operational SSCs have been completed. Verification of future additions to the CSB related to the canister cover cap and welding fixture system and MCO Internal Gas Sampling equipment will be completed as appropriate for those components. The open items related to verification of those requirements are noted.

  11. REPORT OF THE WORKSHOP ON NUCLEAR FACILITY DESIGN INFORMATION EXAMINATION AND VERIFICATION FOR SAFEGUARDS

    SciTech Connect

    Richard Metcalf; Robert Bean

    2009-10-01

    Executive Summary The International Atomic Energy Agency (IAEA) implements nuclear safeguards and verifies that countries are compliant with their international nuclear safeguards agreements. One of the key provisions in the safeguards agreement is the requirement that the country provide nuclear facility design and operating information to the IAEA relevant to safeguarding the facility, and at a very early stage. This provides the opportunity for the IAEA to verify the safeguards-relevant features of the facility and to periodically ensure that those features have not changed. The national authorities (State System of Accounting for and Control of Nuclear Material - SSAC) provide the design information for all facilities within a country to the IAEA. The design information is conveyed using the IAEA’s Design Information Questionnaire (DIQ) and specifies: (1) Identification of the facility’s general character, purpose, capacity, and location; (2) Description of the facility’s layout and nuclear material form, location, and flow; (3) Description of the features relating to nuclear material accounting, containment, and surveillance; and (4) Description of existing and proposed procedures for nuclear material accounting and control, with identification of nuclear material balance areas. The DIQ is updated as required by written addendum. IAEA safeguards inspectors examine and verify this information in design information examination (DIE) and design information verification (DIV) activities to confirm that the facility has been constructed or is being operated as declared by the facility operator and national authorities, and to develop a suitable safeguards approach. Under the Next Generation Safeguards Initiative (NGSI), the National Nuclear Security Administration's (NNSA) Office of Non-Proliferation and International Security identified the need for more effective and efficient verification of design information by the IAEA for improving international safeguards

  12. A physical zero-knowledge object-comparison system for nuclear warhead verification

    PubMed Central

    Philippe, Sébastien; Goldston, Robert J.; Glaser, Alexander; d'Errico, Francesco

    2016-01-01

    Zero-knowledge proofs are mathematical cryptographic methods to demonstrate the validity of a claim while providing no further information beyond the claim itself. The possibility of using such proofs to process classified and other sensitive physical data has attracted attention, especially in the field of nuclear arms control. Here we demonstrate a non-electronic fast neutron differential radiography technique using superheated emulsion detectors that can confirm that two objects are identical without revealing their geometry or composition. Such a technique could form the basis of a verification system that could confirm the authenticity of nuclear weapons without sharing any secret design information. More broadly, by demonstrating a physical zero-knowledge proof that can compare physical properties of objects, this experiment opens the door to developing other such secure proof-systems for other applications. PMID:27649477
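The preload idea at the heart of such physical zero-knowledge comparisons can be illustrated numerically: each detector position is preloaded with the complement of the reference object's expected counts, so a genuine item always yields the same flat total and the raw radiograph is never revealed, while any diversion shows up as a deviation. The counts and detector capacity below are toy numbers invented for illustration, not data from the experiment.

```python
CAPACITY = 1000  # maximum count a detector position records (toy value)

def preload(reference_counts):
    """Preload each position with the complement of the expected counts."""
    return [CAPACITY - c for c in reference_counts]

def inspect(item_counts, preloaded):
    """Total recorded per position: preload plus counts from the item."""
    return [p + c for p, c in zip(preloaded, item_counts)]

reference = [120, 340, 560, 230]   # template measured from a trusted item
genuine = list(reference)          # candidate identical to the template
hoax = [120, 340, 100, 230]        # material diverted at position 2

pre = preload(reference)
genuine_result = inspect(genuine, pre)   # flat -> passes, reveals nothing
hoax_result = inspect(hoax, pre)         # deviation exposes the diversion
```

The flat pass result carries no information about the template itself, which is the zero-knowledge property; the physical system realizes the same arithmetic with preirradiated superheated emulsion detectors.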

  13. A physical zero-knowledge object-comparison system for nuclear warhead verification

    NASA Astrophysics Data System (ADS)

    Philippe, Sébastien; Goldston, Robert J.; Glaser, Alexander; D'Errico, Francesco

    2016-09-01

    Zero-knowledge proofs are mathematical cryptographic methods to demonstrate the validity of a claim while providing no further information beyond the claim itself. The possibility of using such proofs to process classified and other sensitive physical data has attracted attention, especially in the field of nuclear arms control. Here we demonstrate a non-electronic fast neutron differential radiography technique using superheated emulsion detectors that can confirm that two objects are identical without revealing their geometry or composition. Such a technique could form the basis of a verification system that could confirm the authenticity of nuclear weapons without sharing any secret design information. More broadly, by demonstrating a physical zero-knowledge proof that can compare physical properties of objects, this experiment opens the door to developing other such secure proof-systems for other applications.

  14. Verification of nuclear fuel plates by a developed non-destructive assay method

    NASA Astrophysics Data System (ADS)

    El-Gammal, W.; El-Nagdy, M.; Rizk, M.; Shawky, S.; Samei, M. A.

    2005-11-01

    Nuclear material (NM) verification is a principal objective of NM accounting and control. In this work a new relative non-destructive assay technique has been developed to verify the uranium mass content in nuclear fuel. The technique uses a planar high-resolution germanium gamma-ray spectrometer in combination with the MCNP-4B Monte Carlo transport code. A standard NM sample was used to simulate the assayed NM and to determine the average intrinsic full-energy-peak efficiency of the detector for the assayed configuration. The developed technique was found to be capable of verifying the operator declarations with an average accuracy of about 2.8% within a precision of better than 4%.
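The arithmetic of such a relative assay can be sketched simply: a standard of known 235U mass fixes the combined efficiency and emission terms, and the unknown mass follows from the ratio of net count rates. The function and all numbers below are illustrative assumptions, not values from the paper.

```python
def relative_assay(mass_std_g, rate_std_cps, rate_unknown_cps):
    """Unknown 235U mass from the count-rate ratio against a standard,
    assuming identical geometry, enrichment, and attenuation."""
    return mass_std_g * (rate_unknown_cps / rate_std_cps)

estimated = relative_assay(mass_std_g=10.0, rate_std_cps=250.0,
                           rate_unknown_cps=262.5)

# Compare against a (hypothetical) operator declaration.
declared = 10.3
deviation_pct = 100.0 * (estimated - declared) / declared
```

In practice the Monte Carlo model corrects for any geometry or attenuation differences between standard and unknown, which the simple ratio above assumes away.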

  15. A physical zero-knowledge object-comparison system for nuclear warhead verification.

    PubMed

    Philippe, Sébastien; Goldston, Robert J; Glaser, Alexander; d'Errico, Francesco

    2016-09-20

    Zero-knowledge proofs are mathematical cryptographic methods to demonstrate the validity of a claim while providing no further information beyond the claim itself. The possibility of using such proofs to process classified and other sensitive physical data has attracted attention, especially in the field of nuclear arms control. Here we demonstrate a non-electronic fast neutron differential radiography technique using superheated emulsion detectors that can confirm that two objects are identical without revealing their geometry or composition. Such a technique could form the basis of a verification system that could confirm the authenticity of nuclear weapons without sharing any secret design information. More broadly, by demonstrating a physical zero-knowledge proof that can compare physical properties of objects, this experiment opens the door to developing other such secure proof-systems for other applications.

  16. A physical zero-knowledge object-comparison system for nuclear warhead verification.

    PubMed

    Philippe, Sébastien; Goldston, Robert J; Glaser, Alexander; d'Errico, Francesco

    2016-01-01

    Zero-knowledge proofs are mathematical cryptographic methods to demonstrate the validity of a claim while providing no further information beyond the claim itself. The possibility of using such proofs to process classified and other sensitive physical data has attracted attention, especially in the field of nuclear arms control. Here we demonstrate a non-electronic fast neutron differential radiography technique using superheated emulsion detectors that can confirm that two objects are identical without revealing their geometry or composition. Such a technique could form the basis of a verification system that could confirm the authenticity of nuclear weapons without sharing any secret design information. More broadly, by demonstrating a physical zero-knowledge proof that can compare physical properties of objects, this experiment opens the door to developing other such secure proof-systems for other applications. PMID:27649477

  17. Verification of 235U mass content in nuclear fuel plates by an absolute method

    NASA Astrophysics Data System (ADS)

    El-Gammal, W.

    2007-01-01

    Nuclear Safeguards refers to the verification system by which a State controls all nuclear materials (NM) and nuclear activities under its authority. An effective and efficient Safeguards System must include a system of measurements with capabilities sufficient to verify such NM. Measurements of NM using absolute methods could eliminate the dependency on NM standards, which are necessary for other relative or semi-absolute methods. In this work, an absolute method has been investigated to verify the 235U mass content in nuclear fuel plates of the Material Testing Reactor (MTR) type. The most intense gamma-ray signature at 185.7 keV, emitted after α-decay of the 235U nuclei, was employed in the method. The measuring system (an HPGe spectrometer) was mathematically calibrated for efficiency using the general Monte Carlo transport code MCNP-4B. The calibration results and the measured net count rate were used to estimate the 235U mass content in fuel plates at different detector-to-fuel-plate distances. Two sets of fuel plates, containing natural and low-enriched uranium, were measured at the Fuel Fabrication Facility. Average accuracies for the estimated 235U masses of about 2.62% and 0.3% are obtained for the fuel plates containing natural and low-enriched uranium, respectively, with a precision of about 3%.
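The absolute relation can be written down directly: the 235U mass follows from the net 185.7 keV peak count rate divided by the Monte Carlo-computed full-energy-peak efficiency and the gamma emission rate per gram. In the sketch below, the efficiency and count rate are invented; the specific activity and branching ratio are standard nuclear data (rounded).

```python
# Standard nuclear data for 235U (rounded values).
SPECIFIC_ACTIVITY = 8.0e4   # Bq per gram of 235U (T1/2 ~ 7.04e8 years)
BRANCHING_185 = 0.572       # emission probability of the 185.7 keV gamma

def u235_mass_g(net_rate_cps, fep_efficiency):
    """235U mass from the net 185.7 keV count rate and the full-energy-peak
    efficiency; no standard sample is needed, only nuclear data."""
    gammas_per_second_per_gram = SPECIFIC_ACTIVITY * BRANCHING_185
    return net_rate_cps / (fep_efficiency * gammas_per_second_per_gram)

# Hypothetical measurement: 45.76 net counts/s at an efficiency of 1e-3.
mass = u235_mass_g(net_rate_cps=45.76, fep_efficiency=1.0e-3)
```

This independence from physical NM standards is exactly the advantage the abstract claims for absolute methods.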

  18. A New Approach to Nuclear Warhead Verification Using a Zero-Knowledge Protocol

    SciTech Connect

    Glaser, Alexander

    2012-05-16

    Warhead verification systems proposed to date fundamentally rely on the use of information barriers to prevent the release of classified design information. Measurements with information barriers significantly increase the complexity of inspection systems, make their certification and authentication difficult, and may reduce the overall confidence in the verifiability of future arms-control agreements. This talk presents a proof-of-concept of a new approach to nuclear warhead verification that minimizes the role of information barriers from the outset and envisions instead an inspection system that a priori avoids leakage of sensitive information using a so-called zero-knowledge protocol. The proposed inspection system is based on the template-matching approach and relies on active interrogation of a test object with 14-MeV neutrons. The viability of the method is examined with MCNP Monte Carlo neutron transport calculations modeling the experimental setup, an investigation of different diversion scenarios, and an analysis of the simulated data showing that it does not contain information about the properties of the inspected object.

  19. Summary of the proceedings of the Defense Nuclear Agency Conference on Arms Control and Verification Technology (ACT), 1--4 June 1992

    SciTech Connect

    Van Keuren, J.

    1992-12-01

    The first Defense Nuclear Agency Conference on Arms Control and Verification Technology provided an international forum for over 200 individuals from the arms control verification technology and national security communities to discuss the future of arms control verification and technology developments. Papers were presented in the following sessions: Future Arms Control Initiatives; Interface between Intelligence and Arms Control; Lessons Learned; Proliferation in a Changing World; Technologies -- Roles and Applications; and Economics of Arms Control. Plenary sessions were held for general presentations on the future role of verification technology and on negotiating and implementing verification measures. The conference papers will be published separately.

  20. Stabilized, hand-held, gamma-ray verification instrument for special nuclear materials

    SciTech Connect

    Fehlau, P.E.; Wiig, G.

    1988-01-01

    For many years, Los Alamos has developed intelligent, hand-held, search instruments for use by non-specialists to search for special nuclear materials (SNM). The instruments sense SNM by detecting its emitted radiation with scintillation detectors monitored by digital alarm circuitry. Now, we have developed a new hand-held instrument that can verify the presence or absence of particular radioisotopes by analyzing gamma-ray spectra. The new instrument is similar to recent, microprocessor-based, search instruments, but has LED detector stabilization, three adjustable regions-of-interest, and additional operating programs for spectrum analysis. We call the new instrument an SNM verification instrument. Its spectrum analysis capability can verify the presence or absence of specific plutonium isotopes in containers or verify the presence of uranium and its enrichment. The instrument retains the search capability, light weight, and low-power requirement of its predecessors. Its ready portability, detector stabilization, and simple operation allow individuals with little technical training to verify the contents of SNM containers. 5 refs., 5 figs.

  1. Technology Foresight and nuclear test verification: a structured and participatory approach

    NASA Astrophysics Data System (ADS)

    Noack, Patrick; Gaya-Piqué, Luis; Haralabus, Georgios; Auer, Matthias; Jain, Amit; Grenard, Patrick

    2013-04-01

    As part of its mandate, the CTBTO's nuclear explosion monitoring programme aims to maintain its sustainability, effectiveness and its long-term relevance to the verification regime. As such, the PTS is conducting a Technology Foresight programme of activities to identify technologies, processes, concepts and ideas that may serve said purpose and become applicable within the next 20 years. Through the Technology Foresight activities (online conferences, interviews, surveys, workshops and others) we have involved the wider science community in the fields of seismology, infrasound, hydroacoustics, radionuclide technology, remote sensing and geophysical techniques. We have assembled a catalogue of over 200 items, which incorporate technologies, processes, concepts and ideas which will have direct future relevance to the IMS (International Monitoring System), IDC (International Data Centre) and OSI (On-Site Inspection) activities within the PTS. In order to render this catalogue as applicable and useful as possible for strategy and planning, we have devised a "taxonomy" based on seven categories, against which each technology is assessed through a peer-review mechanism. These categories are: 1. Focus area of the technology in question: identify whether the technology relates to (one or more of the following) improving our understanding of source and source physics; propagation modelling; data acquisition; data transport; data processing; broad modelling concepts; quality assurance and data storage. 2. Current Development Stage of the technology in question. Based on a scale from one to six, this measure is specific to PTS needs and broadly reflects Technology Readiness Levels (TRLs). 3. Impact of the technology on each of the following capabilities: detection, location, characterization, sustainment and confidence building. 4. Development cost: the anticipated monetary cost of validating a prototype (i.e. Development Stage 3) of the technology in question. 5. Time to

  2. Development of a Standard for Verification and Validation of Software Used to Calculate Nuclear System Thermal Fluids Behavior

    SciTech Connect

    Richard R. Schultz; Edwin A. Harvego; Ryan L. Crane

    2010-05-01

    With the resurgence of nuclear power and increased interest in advanced nuclear reactors as an option to supply abundant energy without the associated greenhouse gas emissions of the more conventional fossil fuel energy sources, there is a need to establish internationally recognized standards for the verification and validation (V&V) of software used to calculate the thermal-hydraulic behavior of advanced reactor designs for both normal operation and hypothetical accident conditions. To address this need, ASME (American Society of Mechanical Engineers) Standards and Certification has established the V&V 30 Committee, under the responsibility of the V&V Standards Committee, to develop a consensus Standard for verification and validation of software used for design and analysis of advanced reactor systems. The initial focus of this committee will be on the V&V of system analysis and computational fluid dynamics (CFD) software for nuclear applications. To limit the scope of the effort, the committee will further limit its focus to software to be used in the licensing of High-Temperature Gas-Cooled Reactors. In this framework, the standard should conform to Nuclear Regulatory Commission (NRC) practices, procedures and methods for licensing of nuclear power plants as embodied in the United States (U.S.) Code of Federal Regulations and other pertinent documents such as Regulatory Guide 1.203, “Transient and Accident Analysis Methods” and NUREG-0800, “NRC Standard Review Plan”. In addition, the standard should be consistent with applicable sections of ASME Standard NQA-1 (“Quality Assurance Requirements for Nuclear Facility Applications (QA)”). This paper describes the general requirements for the V&V Standard, which includes; (a) the definition of the operational and accident domain of a nuclear system that must be considered if the system is to licensed, (b) the corresponding calculational domain of the software that should encompass the nuclear operational

  3. Assay of scrap plutonium oxide by thermal neutron multiplicity counting for IAEA verification of excess materials from nuclear weapons production

    SciTech Connect

    Stewart, J.E.; Krick, M.S.; Xiao, J.; LeMaire, R.J.; Fotin, V.; McRae, L.; Scott, D.; Westsik, G.

    1996-09-01

    The US Nonproliferation and Export Control Policy commits the US to placing under International Atomic Energy Agency (IAEA) safeguards excess nuclear materials no longer needed for the US nuclear deterrent. As of January 1, 1996, the IAEA had completed Initial Physical Inventory Verification (IPIV) at the Oak Ridge Y-12 plant, the Hanford Plutonium Finishing Plant, and a plutonium storage vault at Rocky Flats. Two IPIVs were performed at Hanford. This paper reports the results of thermal neutron multiplicity assay of plutonium residues during the second IPIV at Hanford. Using the Three Ring Multiplicity Counter (3RMC), measurements were performed on 69 individual cans of plutonium residues, each containing approximately 1 kg of material. Of the 69 items, 67 passed the IAEA acceptance criteria and two were selected for destructive analysis.

  4. Implementation of neutron counting techniques at US facilities for IAEA verification of excess materials from nuclear weapons production

    SciTech Connect

    Stewart, J.E.; Krick, M.S.; Langner, D.G.; Reilly, T.D.; Theis, W.; Lemaire, R.J.; Xiao, J.

    1995-08-01

    The U.S. Nonproliferation and Export Control Policy, announced by President Clinton before the United Nations General Assembly on September 27, 1993, commits the U.S. to placing under International Atomic Energy Agency (IAEA) Safeguards excess nuclear materials no longer needed for the U.S. nuclear deterrent. As of July 1, 1995, the IAEA had completed Initial Physical Inventory Verification (IPIV) at two facilities: a storage vault in the Oak Ridge Y-12 plant containing highly enriched uranium (HEU) metal and another storage vault in the Hanford Plutonium Finishing Plant (PFP) containing plutonium oxide and plutonium-bearing residues. Another plutonium-storage vault, located at Rocky Flats, is scheduled for the IPIV in the fall of 1995. Conventional neutron coincidence counting is one of the routinely applied IAEA nondestructive assay (NDA) methods for verification of uranium and plutonium. However, at all three facilities mentioned above, neutron NDA equipment had to be modified or developed for specific facility needs such as the type and configuration of material placed under safeguards. This document describes those modifications and developments.

  5. Nuclear Energy -- Knowledge Base for Advanced Modeling and Simulation (NE-KAMS) Code Verification and Validation Data Standards and Requirements: Fluid Dynamics Version 1.0

    SciTech Connect

    Greg Weirs; Hyung Lee

    2011-09-01

    V&V and UQ are the primary means to assess the accuracy and reliability of M&S and, hence, to establish confidence in M&S. Though other industries are establishing standards and requirements for the performance of V&V and UQ, at present, the nuclear industry has not established such standards or requirements. However, the nuclear industry is beginning to recognize that such standards are needed and that the resources needed to support V&V and UQ will be very significant. In fact, no single organization has sufficient resources or expertise required to organize, conduct and maintain a comprehensive V&V and UQ program. What is needed is a systematic and standardized approach to establish and provide V&V and UQ resources at a national or even international level, with a consortium of partners from government, academia and industry. Specifically, what is needed is a structured and cost-effective knowledge base that collects, evaluates and stores verification and validation data, and shows how it can be used to perform V&V and UQ, leveraging collaboration and sharing of resources to support existing engineering and licensing procedures as well as science-based V&V and UQ processes. The Nuclear Energy Knowledge base for Advanced Modeling and Simulation (NE-KAMS) is being developed at the Idaho National Laboratory in conjunction with Bettis Laboratory, Sandia National Laboratories, Argonne National Laboratory, Utah State University and others with the objective of establishing a comprehensive and web-accessible knowledge base to provide V&V and UQ resources for M&S for nuclear reactor design, analysis and licensing. The knowledge base will serve as an important resource for technical exchange and collaboration that will enable credible and reliable computational models and simulations for application to nuclear power. NE-KAMS will serve as a valuable resource for the nuclear industry, academia, the national laboratories, the U.S. Nuclear Regulatory Commission (NRC) and

  6. Methodology, verification, and performance of the continuous-energy nuclear data sensitivity capability in MCNP6

    SciTech Connect

    Kiedrowski, B. C.; Brown, F. B.

    2013-07-01

    A continuous-energy sensitivity coefficient capability has been introduced into MCNP6. The methods for generating energy-resolved and energy-integrated sensitivity profiles are discussed. Results from the verification exercises that were performed are given, and these show that MCNP6 compares favorably with analytic solutions, direct density perturbations, and comparisons to TSUNAMI-3D and MONK. Run-time and memory requirements are assessed for typical applications, and these are shown to be reasonable with modern computing resources. (authors)
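The direct-perturbation check mentioned above can be illustrated with a toy model: define the sensitivity coefficient S = (dk/k)/(dσ/σ), estimate it by a central-difference density perturbation, and compare with the analytic value. The response model k(σ) = σ/(σ + a) below is invented purely for illustration; real verification compares MCNP6 against analytic benchmarks and codes such as TSUNAMI-3D.

```python
def k_of_sigma(sigma, a=2.0):
    """Toy response: a multiplication factor depending on a cross section."""
    return sigma / (sigma + a)

def sensitivity(sigma, a=2.0, rel_step=1e-4):
    """Central-difference estimate of S = (dk/k) / (dsigma/sigma)."""
    d = sigma * rel_step
    dk = k_of_sigma(sigma + d, a) - k_of_sigma(sigma - d, a)
    return (dk / (2.0 * d)) * (sigma / k_of_sigma(sigma, a))

# Analytic sensitivity for this model is a / (sigma + a) = 0.5 at sigma = 2.
s_numeric = sensitivity(2.0)
```

Agreement between the perturbation estimate and the analytic value is exactly the kind of consistency check the verification exercises perform at much larger scale.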

  7. Human factors design, verification, and validation for two types of control room upgrades at a nuclear power plant

    SciTech Connect

    Boring, Laurids Ronald

    2014-10-01

    This paper describes the NUREG-0711 based human factors engineering (HFE) phases and associated elements required to support design, verification and validation (V&V), and implementation of a new plant process computer (PPC) and turbine control system (TCS) at a representative nuclear power plant. This paper reviews ways to take a human-system interface (HSI) specification and use it when migrating legacy PPC displays or designing displays with new functionality. These displays undergo iterative usability testing during the design phase and then undergo an integrated system validation (ISV) in a full scope control room training simulator. Following the successful demonstration of operator performance with the systems during the ISV, the new system is implemented at the plant, first in the training simulator and then in the main control room.

  8. Simulated Verification of Fuel Element Inventory in a Small Reactor Core Using the Nuclear Materials Identification System (NMIS)

    SciTech Connect

    Grogan, Brandon R; Mihalczo, John T

    2009-01-01

    The Intergovernmental Panel on Climate Change projects that by 2050 the world energy demand may double. Although the primary focus for new nuclear power plants in industrialized nations is on large plants in the 1000-1600 MWe range, there is an increasing demand for small and medium reactors (SMRs). About half of the innovative SMR concepts are for small (<300 MWe) reactors with a 5-30 year life without on-site refueling, also known as battery-type reactors. These reactors are particularly attractive to countries with small power grids and for non-electrical purposes such as heating, hydrogen production, and seawater desalination; traditionally, they have been used in a nautical propulsion role. A battery-type reactor is designed as a permanently sealed unit to prevent the diversion of the uranium in the core by the user. After initial fabrication, however, it will be necessary to verify that the newly fabricated reactor core contains the quantity of uranium that initially entered the fuel fabrication plant. In most instances, traditional inspection techniques can be used to perform this verification, but in certain situations the core design will be considered sensitive, and non-intrusive verification techniques must be utilized. The Nuclear Materials Identification System (NMIS) with imaging uses active interrogation and a fast time-correlation processor to characterize fissile material. The MCNP-PoliMi computer code was used to simulate NMIS measurements of a small, sealed reactor core. Because most battery-type reactor designs are still in the early design phase, a more traditional design based on a Russian icebreaker core was used in the simulations. These simulations show how the radiography capabilities of the NMIS could be used to detect the diversion of fissile material by detecting void areas in the assembled core where fuel elements have been removed.

  9. Plastid DNA sequencing and nuclear SNP genotyping help resolve the puzzle of central American Platanus

    PubMed Central

    De Castro, Olga; Di Maio, Antonietta; Lozada García, José Armando; Piacenti, Danilo; Vázquez-Torres, Mario; De Luca, Paolo

    2013-01-01

    Background and Aims Recent research on the history of Platanus reveals that hybridization phenomena occurred in the central American species. This study has two goals: to help resolve the evolutionary puzzle of central American Platanus, and to test the potential of real-time polymerase chain reaction (PCR) for detecting ancient hybridization. Methods Sequencing of a uniparental plastid DNA marker [psbA-trnH(GUG) intergenic spacer] and qualitative and quantitative single nucleotide polymorphism (SNP) genotyping of biparental nuclear ribosomal DNA (nrDNA) markers [LEAFY intron 2 (LFY-i2) and internal transcribed spacer 2 (ITS2)] were used. Key Results Based on the SNP genotyping results, several Platanus accessions show the presence of hybridization/introgression, including some accessions of P. rzedowskii and of P. mexicana var. interior and one of P. mexicana var. mexicana from Oaxaca (= P. oaxacana). Based on haplotype analyses of the psbA-trnH spacer, five haplotypes were detected. The most common of these is present in taxa belonging to P. orientalis, P. racemosa sensu lato, some accessions of P. occidentalis sensu stricto (s.s.) from Texas, P. occidentalis var. palmeri, P. mexicana s.s. and P. rzedowskii. This is highly relevant to the genetic relationships with the haplotypes present in P. occidentalis s.s. and P. mexicana var. interior. Conclusions Hybridization and introgression events between lineages ancestral to modern central and eastern North American Platanus species occurred. Plastid haplotypes and qualitative and quantitative SNP genotyping provide information critical for understanding the complex history of Mexican Platanus. Compared with the usual molecular techniques of sub-cloning, sequencing and genotyping, real-time PCR assay is a quick and sensitive technique for analysing complex evolutionary patterns. PMID:23798602

  10. Use of open source information and commercial satellite imagery for nuclear nonproliferation regime compliance verification by a community of academics

    NASA Astrophysics Data System (ADS)

    Solodov, Alexander

    The proliferation of nuclear weapons is a great threat to world peace and stability. The question of strengthening the nonproliferation regime has been open for a long period of time. In 1997 the International Atomic Energy Agency (IAEA) Board of Governors (BOG) adopted the Additional Safeguards Protocol. The purpose of the protocol is to enhance the IAEA's ability to detect undeclared production of fissile materials in member states. However, the IAEA does not always have sufficient human and financial resources to accomplish this task. Developed here is a concept for making use of human and technical resources available in academia that could be used to enhance the IAEA's mission. The objective of this research was to study the feasibility of an academic community using commercially or publicly available sources of information and products for the purpose of detecting covert facilities and activities intended for the unlawful acquisition of fissile materials or production of nuclear weapons. In this study, the availability and use of commercial satellite imagery systems, commercial computer codes for satellite imagery analysis, the Comprehensive Test Ban Treaty (CTBT) verification International Monitoring System (IMS), publicly available information sources such as watchdog groups and press reports, and Customs Services information were explored. A system for integrating these data sources to form conclusions was also developed. The results proved that publicly and commercially available sources of information and data analysis can be a powerful tool in tracking violations of the international nuclear nonproliferation regime, and a framework for implementing these tools in the academic community was developed. As a result of this study, the formation of an International Nonproliferation Monitoring Academic Community (INMAC) is proposed. This would be an independent organization consisting of academics (faculty, staff and students) from both nuclear weapon states (NWS) and

  11. Analytical three-dimensional neutron transport benchmarks for verification of nuclear engineering codes. Final report

    SciTech Connect

    Ganapol, B.D.; Kornreich, D.E.

    1997-07-01

    Because of the requirement of accountability and quality control in the scientific world, a demand for high-quality analytical benchmark calculations has arisen in the neutron transport community. The intent of these benchmarks is to provide a numerical standard to which production neutron transport codes may be compared in order to verify proper operation. The overall investigation as modified in the second year renewal application includes the following three primary tasks. Task 1 on two-dimensional neutron transport is divided into (a) single medium searchlight problem (SLP) and (b) two-adjacent half-space SLP. Task 2 on three-dimensional neutron transport covers (a) point source in arbitrary geometry, (b) single medium SLP, and (c) two-adjacent half-space SLP. Task 3 on code verification includes deterministic and probabilistic codes. The primary aim of the proposed investigation was to provide a suite of comprehensive two- and three-dimensional analytical benchmarks for neutron transport theory applications. This objective has been achieved. The suite of benchmarks in infinite media and the three-dimensional SLP are a relatively comprehensive set of one-group benchmarks for isotropically scattering media. Because of time and resource limitations, the extensions of the benchmarks to include multi-group and anisotropic scattering are not included here. Presently, however, enormous advances in the solution for the planar Green's function in an anisotropically scattering medium have been made and will eventually be implemented in the two- and three-dimensional solutions considered under this grant. Of particular note in this work are the numerical results for the three-dimensional SLP, which have never before been presented. The results presented were made possible only because of the tremendous advances in computing power that have occurred during the past decade.
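One of the simplest members of this benchmark family, the survival probability of a neutron travelling a distance r in a purely absorbing medium, shows how an analytic solution serves as a numerical standard for a stochastic code. The sketch below is illustrative only (it is not part of the paper's benchmark suite, and the cross-section value is assumed): a tiny Monte Carlo estimate is checked against the exact exponential attenuation.

```python
# Minimal analytic-vs-Monte-Carlo transport benchmark:
# probability of no collision over distance r is exp(-Sigma_t * r).
import math
import random

random.seed(1)
sigma_t, r = 0.5, 2.0   # assumed total cross section (1/cm) and distance (cm)

# Analytic survival probability
p_analytic = math.exp(-sigma_t * r)

# Monte Carlo: sample free paths s ~ Exponential(sigma_t), count s > r
n = 200_000
survived = sum(1 for _ in range(n) if random.expovariate(sigma_t) > r)
p_mc = survived / n

print(f"analytic exp(-Sigma*r) = {p_analytic:.4f}, Monte Carlo = {p_mc:.4f}")
```

The searchlight-problem benchmarks in the report play the same role for full two- and three-dimensional transport, where no such closed form exists without substantial analysis.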

  12. Development and verification of design methods for ducts in a space nuclear shield

    NASA Technical Reports Server (NTRS)

    Cerbone, R. J.; Selph, W. E.; Read, P. A.

    1972-01-01

    A practical method for computing the effectiveness of a space nuclear shield perforated by small tubing and cavities is reported. The calculations performed use solutions from a two-dimensional transport code and evaluate perturbations of that solution using last-flight estimates and other kernel-integration techniques. In general, perturbations are viewed as a change in the source strength of scattered radiation and a change in the attenuation properties of the region.

  13. Development of an automated platform for the verification, testing, processing and benchmarking of Evaluated Nuclear Data at the NEA Data Bank. Status of the NDEC system

    NASA Astrophysics Data System (ADS)

    Michel-Sendis, F.; Díez, C. J.; Cabellos, O.

    2016-03-01

    Modern nuclear data Quality Assurance (QA) is, in practice, a multistage process that aims at establishing a thorough assessment of the validity of the physical information contained in an evaluated nuclear data file as compared to our best knowledge of available experimental data and theoretical models. It should systematically address the performance of the evaluated file against available pertinent integral experiments, with proper and prior verification that the information encoded in the evaluation is accurately processed and reconstructed for the application conditions. The aim of the NDEC (Nuclear Data Evaluation Cycle) platform currently being developed by the Data Bank is to provide a correct and automated handling of these diverse QA steps in order to facilitate the expert human assessment of evaluated nuclear data files, both by the evaluators and by the end users of nuclear data.

  14. Routine inspection effort required for verification of a nuclear material production cutoff convention

    SciTech Connect

    Dougherty, D.; Fainberg, A.; Sanborn, J.; Allentuck, J.; Sun, C.

    1996-11-01

    On 27 September 1993, President Clinton proposed "... a multilateral convention prohibiting the production of highly enriched uranium or plutonium for nuclear explosives purposes or outside of international safeguards." The UN General Assembly subsequently adopted a resolution recommending negotiation of a non-discriminatory, multilateral, and internationally and effectively verifiable treaty (hereinafter referred to as "the Cutoff Convention") banning the production of fissile material for nuclear weapons. The matter is now on the agenda of the Conference on Disarmament, although not yet under negotiation. This accord would, in effect, place all fissile material (defined as highly enriched uranium and plutonium) produced after entry into force (EIF) of the accord under international safeguards. "Production" would mean separation of the material in question from radioactive fission products, as in spent fuel reprocessing, or enrichment of uranium above the 20% level, which defines highly enriched uranium (HEU). Facilities where such production could occur would be safeguarded to verify that either such production is not occurring or that all material produced at these facilities is maintained under safeguards.

  15. Potential opportunities for nano materials to help enable enhanced nuclear fuel performance

    SciTech Connect

    McClellan, Kenneth J.

    2012-06-06

    This presentation is an overview of the technical challenges for development of nuclear fuels with enhanced performance and accident tolerance. Key specific aspects of improved fuel performance are noted. Examples of existing nanonuclear projects and concepts are presented and areas of potential focus are suggested. The audience for this presentation includes representatives from: DOE-NE, other national laboratories, industry and academia. This audience is a mixture of nanotechnology experts and nuclear energy researchers and managers.

  16. Verification of screening level for decontamination implemented after Fukushima nuclear accident

    PubMed Central

    Ogino, Haruyuki; Ichiji, Takeshi; Hattori, Takatoshi

    2012-01-01

    The screening level for decontamination that has been applied for the surface of the human body and contaminated handled objects after the Fukushima nuclear accident was verified by assessing the doses that arise from external irradiation, ingestion, inhalation and skin contamination. The result shows that the annual effective dose that arises from handled objects contaminated with the screening level for decontamination (i.e. 100 000 counts per minute) is <1 mSv y−1, which can be considered as the intervention exemption level in accordance with the International Commission on Radiological Protection recommendations. Furthermore, the screening level is also found to protect the skin from the incidence of a deterministic effect because the absorbed dose of the skin that arises from direct deposition on the surface of the human body is calculated to be lower than the threshold of the deterministic effect assuming a practical exposure duration. PMID:22228683

  17. Indian Point Nuclear Power Station: verification analysis of County Radiological Emergency-Response Plans

    SciTech Connect

    Nagle, J.; Whitfield, R.

    1983-05-01

    This report was developed as a management tool for use by the Federal Emergency Management Agency (FEMA) Region II staff. The analysis summarized in this report was undertaken to verify the extent to which procedures, training programs, and resources set forth in the County Radiological Emergency Response Plans (CRERPs) for Orange, Putnam, and Westchester counties in New York had been realized prior to the March 9, 1983, exercise of the Indian Point Nuclear Power Station near Buchanan, New York. To this end, a telephone survey of county emergency response organizations was conducted between January 19 and February 22, 1983. This report presents the results of responses obtained from this survey of county emergency response organizations.

  18. Routine inspection effort required for verification of a nuclear material production cutoff convention

    SciTech Connect

    Fishbone, L.G.; Sanborn, J.

    1994-12-01

    Preliminary estimates of the inspection effort to verify a Nuclear Material Cutoff Convention are presented. The estimates are based on (1) a database of about 650 facilities in a total of eight states, i.e., the five nuclear-weapons states and three "threshold" states; (2) typical figures for inspection requirements for specific facility types derived from IAEA experience, where applicable; and (3) alternative estimates of inspection effort in cutoff options where full IAEA safeguards are not stipulated. Considerable uncertainty must be attached to the effort estimates. About 50-60% of the effort for each option is attributable to 16 large-scale reprocessing plants assumed to be in operation in the eight states; it is likely that some of these will be shut down by the time the convention enters into force. Another important question, involving about one third of the overall effort, is whether Euratom inspections in France and the U.K. could obviate the need for full-scale IAEA inspections at these facilities. Finally, the database does not yet contain many small-scale and military-related facilities. The results are therefore not presented as predictions but as the consequences of alternative assumptions. Despite the preliminary nature of the estimates, it is clear that a broad application of NPT-like safeguards to the eight states would require dramatic increases in the IAEA's safeguards budget. It is also clear that the major component of the increased inspection effort would occur at large reprocessing plants (and associated plutonium facilities). Therefore, significantly bounding the increased effort requires a limitation on the inspection effort at these facility types.

  19. Taming the SQUID: How a nuclear physics education (mostly) helped my career in applied physics

    NASA Astrophysics Data System (ADS)

    Espy, Michelle

    2013-10-01

    My degree is in experimental nuclear physics, specifically studying the interaction of pions with nuclei. But after graduation I accepted a post-doctoral research position with a team based on applications of the Superconducting Quantum Interference Device (SQUID) to the study of the human brain. Despite knowing nothing about the brain or SQUIDs to start with, I have gone on to enjoy a career in applications of the SQUID and other sensors to the detection of weak magnetic fields in a variety of problems from brain studies (magnetoencephalography) to ultra-low field nuclear magnetic resonance for detection of explosives and illicit material. In this talk I will present some background on SQUIDs and their application to the detection of ultra-weak magnetic fields of biological and non-biological origin. I will also provide a little insight into what it has been like to use a nuclear physics background to pursue other types of science.

  20. Verification of the Cross Immunoreactivity of A60, a Mouse Monoclonal Antibody against Neuronal Nuclear Protein

    PubMed Central

    Mao, Shanping; Xiong, Guoxiang; Zhang, Lei; Dong, Huimin; Liu, Baohui; Cohen, Noam A.; Cohen, Akiva S.

    2016-01-01

    A60, the mouse monoclonal antibody against the neuronal nuclear protein (NeuN), is the most widely used neuronal marker in neuroscience research and neuropathological assays. Previous studies identified fragments of A60-immunoprecipitated protein as Synapsin I (Syn I), suggesting the antibody will demonstrate cross immunoreactivity. However, the likelihood of cross reactivity has never been verified by immunohistochemical techniques. Using our established tissue processing and immunofluorescent staining protocols, we found that A60 consistently labeled mossy fiber terminals in hippocampal area CA3. These A60-positive mossy fiber terminals could also be labeled by Syn I antibody. After treating brain slices with saponin in order to better preserve various membrane and/or vesicular proteins for immunostaining, we observed that A60 could also label additional synapses in various brain areas. Therefore, we used A60 together with a rabbit monoclonal NeuN antibody to confirm the existence of this cross reactivity. We showed that the putative band positive for A60 and Syn I could not be detected by the rabbit anti-NeuN in Western blotting. Although as efficient as Millipore A60 at recognizing neuronal nuclei, the rabbit NeuN antibody demonstrated no labeling of synaptic structures in immunofluorescent staining. The present study successfully verified the cross reactivity present in immunohistochemistry, cautioning that A60 may not be the ideal biomarker to verify neuronal identity due to its cross immunoreactivity. In contrast, the rabbit monoclonal NeuN antibody used in this study may be a better candidate to substitute for A60. PMID:27242450

  1. Independent Verification and Validation Of SAPHIRE 8 Software Quality Assurance Plan Project Number: N6423 U.S. Nuclear Regulatory Commission

    SciTech Connect

    Kent Norris

    2010-02-01

    This report provides an evaluation of the Software Quality Assurance Plan. The plan is intended to ensure that all actions necessary for the software life cycle (verification and validation activities; documentation and deliverables; project management; configuration management, nonconformance reporting, and corrective action; and quality assessment and improvement) have been planned, and that a systematic pattern of actions provides adequate confidence that the software product conforms to established technical requirements and meets the contractual commitments prepared by the sponsor, the Nuclear Regulatory Commission.

  2. Independent Verification and Validation Of SAPHIRE 8 Software Quality Assurance Plan Project Number: N6423 U.S. Nuclear Regulatory Commission

    SciTech Connect

    Kent Norris

    2010-03-01

    This report provides an evaluation of the Software Quality Assurance Plan. The plan is intended to ensure that all actions necessary for the software life cycle (verification and validation activities; documentation and deliverables; project management; configuration management, nonconformance reporting, and corrective action; and quality assessment and improvement) have been planned, and that a systematic pattern of actions provides adequate confidence that the software product conforms to established technical requirements and meets the contractual commitments prepared by the sponsor, the Nuclear Regulatory Commission.

  3. Independent Verification and Validation Of SAPHIRE 8 Software Requirements Project Number: N6423 U.S. Nuclear Regulatory Commission

    SciTech Connect

    Kent Norris

    2010-03-01

    The purpose of the Independent Verification and Validation (IV&V) role in the evaluation of the SAPHIRE requirements definition is to assess the activities that result in the specification, documentation, and review of the requirements that the software product must satisfy, including functionality, performance, design constraints, attributes and external interfaces. The IV&V team began this endeavor after the software engineering and software development of SAPHIRE had already been in production. IV&V reviewed the requirements specified in the NRC Form 189s to verify these requirements were included in SAPHIRE’s Software Verification and Validation Plan (SVVP).

  4. Independent Verification and Validation Of SAPHIRE 8 Software Requirements Project Number: N6423 U.S. Nuclear Regulatory Commission

    SciTech Connect

    Kent Norris

    2009-09-01

    The purpose of the Independent Verification and Validation (IV&V) role in the evaluation of the SAPHIRE requirements definition is to assess the activities that result in the specification, documentation, and review of the requirements that the software product must satisfy, including functionality, performance, design constraints, attributes and external interfaces. The IV&V team began this endeavor after the software engineering and software development of SAPHIRE had already been in production. IV&V reviewed the requirements specified in the NRC Form 189s to verify these requirements were included in SAPHIRE’s Software Verification and Validation Plan (SVVP).

  5. Independent Verification and Validation Of SAPHIRE 8 System Test Plan Project Number: N6423 U.S. Nuclear Regulatory Commission

    SciTech Connect

    Kent Norris

    2010-02-01

    The purpose of the Independent Verification and Validation (IV&V) role in the evaluation of the SAPHIRE System Test Plan is to assess the approach to be taken for intended testing activities associated with the SAPHIRE software product. The IV&V team began this endeavor after the software engineering and software development of SAPHIRE had already been in production.

  6. Verification and validation benchmarks.

    SciTech Connect

    Oberkampf, William Louis; Trucano, Timothy Guy

    2007-02-01

    Verification and validation (V&V) are the primary means to assess the accuracy and reliability of computational simulations. V&V methods and procedures have fundamentally improved the credibility of simulations in several high-consequence fields, such as nuclear reactor safety, underground nuclear waste storage, and nuclear weapon safety. Although the terminology is not uniform across engineering disciplines, code verification deals with assessing the reliability of the software coding, and solution verification deals with assessing the numerical accuracy of the solution to a computational model. Validation addresses the physics modeling accuracy of a computational simulation by comparing the computational results with experimental data. Code verification benchmarks and validation benchmarks have been constructed for a number of years in every field of computational simulation. However, no comprehensive guidelines have been proposed for the construction and use of V&V benchmarks. For example, the field of nuclear reactor safety has not focused on code verification benchmarks, but it has placed great emphasis on developing validation benchmarks. Many of these validation benchmarks are closely related to the operations of actual reactors at near-safety-critical conditions, as opposed to being more fundamental-physics benchmarks. This paper presents recommendations for the effective design and use of code verification benchmarks based on manufactured solutions, classical analytical solutions, and highly accurate numerical solutions. In addition, this paper presents recommendations for the design and use of validation benchmarks, highlighting the careful design of building-block experiments, the estimation of experimental measurement uncertainty for both inputs and outputs to the code, validation metrics, and the role of model calibration in validation. It is argued that the understanding of predictive capability of a computational model is built on the level of
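The manufactured-solutions approach recommended above for code verification benchmarks can be sketched in a few lines: choose an exact solution, derive the forcing term it implies, solve numerically on successively finer grids, and confirm the discretization's observed order of accuracy. The example below is illustrative (it is not one of the paper's benchmarks): a second-order finite-difference solve of -u'' = f on [0,1] with the manufactured solution u(x) = sin(pi x).

```python
# Code verification via the method of manufactured solutions:
# manufactured u(x) = sin(pi*x) implies f(x) = pi^2 * sin(pi*x) for -u'' = f,
# with u(0) = u(1) = 0.  Confirm second-order convergence of the scheme.
import math

def max_error(n):
    """Solve -u'' = f with n intervals; return max nodal error vs. exact u."""
    h = 1.0 / n
    x = [i * h for i in range(n + 1)]
    f = [math.pi**2 * math.sin(math.pi * xi) for xi in x]
    # Thomas algorithm for the tridiagonal system -u[i-1]+2u[i]-u[i+1] = h^2 f[i]
    a, b, c = -1.0, 2.0, -1.0
    cp = [0.0] * (n + 1)
    dp = [0.0] * (n + 1)
    for i in range(1, n):
        m = b - (a * cp[i - 1] if i > 1 else 0.0)
        cp[i] = c / m
        dp[i] = (h * h * f[i] - (a * dp[i - 1] if i > 1 else 0.0)) / m
    u = [0.0] * (n + 1)
    for i in range(n - 1, 0, -1):
        u[i] = dp[i] - cp[i] * u[i + 1]
    return max(abs(u[i] - math.sin(math.pi * x[i])) for i in range(n + 1))

e1, e2 = max_error(32), max_error(64)
order = math.log(e1 / e2) / math.log(2.0)
print(f"observed order of accuracy: {order:.2f}")
```

Halving h should reduce the error by a factor of about four, so the observed order lands near 2; a mismatch between observed and formal order is exactly the kind of coding error such benchmarks are designed to expose.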

  7. Proceedings of the 22nd Annual DoD/DOE Seismic Research Symposium: Planning for Verification of and Compliance with the Comprehensive Nuclear-Test-Ban Treaty (CTBT)

    SciTech Connect

    Nichols, James W., LTC

    2000-09-15

    These proceedings contain papers prepared for the 22nd Annual DoD/DOE Seismic Research Symposium: Planning for Verification of and Compliance with the Comprehensive Nuclear-Test-Ban Treaty (CTBT), held 13-15 September 2000 in New Orleans, Louisiana. These papers represent the combined research related to ground-based nuclear explosion monitoring funded by the National Nuclear Security Administration (NNSA), Defense Threat Reduction Agency (DTRA), Air Force Technical Applications Center (AFTAC), Department of Defense (DoD), US Army Space and Missile Defense Command, Defense Special Weapons Agency (DSWA), and other invited sponsors. The scientific objectives of the research are to improve the United States capability to detect, locate, and identify nuclear explosions. The purpose of the meeting is to provide the sponsoring agencies, as well as potential users, an opportunity to review research accomplished during the preceding year and to discuss areas of investigation for the coming year. For the researchers, it provides a forum for the exchange of scientific information toward achieving program goals, and an opportunity to discuss results and future plans. Paper topics include: seismic regionalization and calibration; detection and location of sources; wave propagation from source to receiver; the nature of seismic sources, including mining practices; hydroacoustic, infrasound, and radionuclide methods; on-site inspection; and data processing.

  8. Nuclear Energy Advanced Modeling and Simulation Waste Integrated Performance and Safety Codes (NEAMS Waste IPSC) verification and validation plan. version 1.

    SciTech Connect

    Bartlett, Roscoe Ainsworth; Arguello, Jose Guadalupe, Jr.; Urbina, Angel; Bouchard, Julie F.; Edwards, Harold Carter; Freeze, Geoffrey A.; Knupp, Patrick Michael; Wang, Yifeng; Schultz, Peter Andrew; Howard, Robert; McCornack, Marjorie Turner

    2011-01-01

    The objective of the U.S. Department of Energy Office of Nuclear Energy Advanced Modeling and Simulation Waste Integrated Performance and Safety Codes (NEAMS Waste IPSC) is to provide an integrated suite of computational modeling and simulation (M&S) capabilities to quantitatively assess the long-term performance of waste forms in the engineered and geologic environments of a radioactive-waste storage facility or disposal repository. To meet this objective, NEAMS Waste IPSC M&S capabilities will be applied to challenging spatial domains, temporal domains, multiphysics couplings, and multiscale couplings. A strategic verification and validation (V&V) goal is to establish evidence-based metrics for the level of confidence in M&S codes and capabilities. Because it is economically impractical to apply the maximum V&V rigor to each and every M&S capability, M&S capabilities will be ranked for their impact on the performance assessments of various components of the repository systems. Those M&S capabilities with greater impact will require a greater level of confidence and a correspondingly greater investment in V&V. This report includes five major components: (1) a background summary of the NEAMS Waste IPSC to emphasize M&S challenges; (2) the conceptual foundation for verification, validation, and confidence assessment of NEAMS Waste IPSC M&S capabilities; (3) specifications for the planned verification, validation, and confidence-assessment practices; (4) specifications for the planned evidence information management system; and (5) a path forward for the incremental implementation of this V&V plan.

  9. Wind gust warning verification

    NASA Astrophysics Data System (ADS)

    Primo, Cristina

    2016-07-01

    Operational meteorological centres around the world increasingly include warnings among their regular forecast products. Warnings are issued to alert the public to extreme weather situations that might occur and lead to damage and losses; by forecasting these extreme events, meteorological centres help their users prevent the damage or losses they might otherwise suffer. Verifying these warnings, however, requires specific methods, not only because the events happen rarely, but also because a new temporal dimension is added when defining a warning, namely the time window of the forecasted event. This paper analyses the issues that can arise in warning verification and proposes some new verification approaches that can be applied to wind warnings. These techniques are then applied to a real-life example, the verification of wind gust warnings at the German Meteorological Centre ("Deutscher Wetterdienst"), and the results obtained are discussed.
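Warning verification of the kind discussed here conventionally starts from a 2x2 contingency table of issued warnings against observed events. A minimal sketch with invented counts (the DWD study's actual scores are not reproduced here) shows the standard scores:

```python
# Standard contingency-table scores for rare-event warning verification.
# Counts are invented for illustration.
hits, misses, false_alarms, correct_neg = 42, 8, 25, 925

pod = hits / (hits + misses)                 # probability of detection
far = false_alarms / (hits + false_alarms)   # false-alarm ratio
csi = hits / (hits + misses + false_alarms)  # critical success index

print(f"POD = {pod:.2f}, FAR = {far:.2f}, CSI = {csi:.2f}")
```

The time-window dimension raised in the abstract complicates exactly the counting step above: whether an event inside, before, or slightly after the warned window counts as a hit must be defined before any of these scores are meaningful.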

  10. Helping Kids Help

    ERIC Educational Resources Information Center

    Heiss, E. Renee

    2008-01-01

    Educators need to help kids help others so that they can help themselves. Volunteering does not involve competition or grades. This is one area where students don't have to worry about measuring up to the expectations of parents, teachers, and coaches. Students participate in charitable work to add another line to a college transcript or job…

  11. Helping Users Help Themselves.

    ERIC Educational Resources Information Center

    O'Malley, Claire E.

    This discussion of the design of user-initiated help systems in computers focuses on the information that users actively seek to help them with their tasks, with emphasis on how to help users ask the questions that will bridge the gap between the initial internal (mental) form of the query and their information need as expressed by the system.…

  12. Independent Verification and Validation Of SAPHIRE 8 Software Project Plan Project Number: N6423 U.S. Nuclear Regulatory Commission

    SciTech Connect

    Carl Wharton

    2009-10-01

    This report provides an evaluation of the Project Plan. The Project Plan is intended to provide the high-level direction that documents the required software activities to meet the contractual commitments prepared by the sponsor, the Nuclear Regulatory Commission.

  13. Independent Verification and Validation Of SAPHIRE 8 Software Project Plan Project Number: N6423 U.S. Nuclear Regulatory Commission

    SciTech Connect

    Carl Wharton; Kent Norris

    2009-12-01

    This report provides an evaluation of the Project Plan. The Project Plan is intended to provide the high-level direction that documents the required software activities to meet the contractual commitments prepared by the sponsor, the Nuclear Regulatory Commission.

  14. Independent Verification and Validation Of SAPHIRE 8 Software Project Plan Project Number: N6423 U.S. Nuclear Regulatory Commission

    SciTech Connect

    Carl Wharton; Kent Norris

    2010-03-01

    This report provides an evaluation of the Project Plan. The Project Plan is intended to provide the high-level direction that documents the required software activities to meet the contractual commitments prepared by the sponsor, the Nuclear Regulatory Commission.

  15. Computation and Analysis of the Global Distribution of the Radioxenon Isotope 133Xe based on Emissions from Nuclear Power Plants and Radioisotope Production Facilities and its Relevance for the Verification of the Nuclear-Test-Ban Treaty

    NASA Astrophysics Data System (ADS)

    Wotawa, Gerhard; Becker, Andreas; Kalinowski, Martin; Saey, Paul; Tuma, Matthias; Zähringer, Matthias

    2010-05-01

    Monitoring of radioactive noble gases, in particular xenon isotopes, is a crucial element of the verification of the Comprehensive Nuclear-Test-Ban Treaty (CTBT). The capability of the noble gas network, which is currently under construction, to detect signals from a nuclear explosion critically depends on the background created by other sources. Therefore, the global distribution of these isotopes based on emissions and transport patterns needs to be understood. A significant xenon background exists in the reactor regions of North America, Europe and Asia. An emission inventory of the four relevant xenon isotopes has recently been created, which specifies source terms for each power plant. A few medical radioisotope production facilities have recently been identified as the major emitters of xenon isotopes worldwide, in particular the facilities in Chalk River (Canada), Fleurus (Belgium), Pelindaba (South Africa) and Petten (Netherlands). Emissions from these sites are expected to exceed those of the other sources by orders of magnitude. In this study, emphasis is put on 133Xe, which is the most prevalent xenon isotope. First, based on the known emissions, the resulting 133Xe concentration levels at all noble gas stations of the final CTBT verification network were calculated and found to be consistent with observations. Second, it turned out that emissions from the radioisotope facilities can explain a number of observed peaks, meaning that atmospheric transport modelling is an important tool for the categorization of measurements. Third, it became evident that nuclear power plant emissions are more difficult to treat in the models, since their temporal variation is high and not generally reported. Fourth, there are indications that the assumed annual emissions may be underestimated by factors of two to ten, while the general emission patterns seem to be well understood. Finally, it became evident that 133Xe sources mainly influence the sensitivity of the

  16. The Indefinite Extension of the Nuclear Non-Proliferation Treaty: A Hindrance or Help to Future Arms Control

    NASA Astrophysics Data System (ADS)

    Pella, Peter J.

    1996-05-01

    The indefinite and "unconditional" extension of the Nuclear Non-Proliferation Treaty (NPT) was achieved almost one year ago today. This outcome was a major foreign policy goal of the Clinton Administration. Some critics of the NPT's indefinite extension claim that nuclear weapons states parties to the NPT have now legitimized their possession of nuclear weapons for all time and that there is no incentive for future nuclear arms control and disarmament measures. How the indefinite extension of the NPT has affected the nuclear arms control landscape, and what the prospects are for future disarmament measures, will be discussed.

  17. Environmental Detection of Clandestine Nuclear Weapon Programs

    NASA Astrophysics Data System (ADS)

    Kemp, R. Scott

    2016-06-01

    Environmental sensing of nuclear activities has the potential to detect nuclear weapon programs at early stages, deter nuclear proliferation, and help verify nuclear accords. However, no robust system of detection has been deployed to date. This can be variously attributed to high costs, technical limitations in detector technology, simple countermeasures, and uncertainty about the magnitude or behavior of potential signals. In this article, current capabilities and promising opportunities are reviewed. Systematic research in a variety of areas could improve prospects for detecting covert nuclear programs, although the potential for countermeasures suggests long-term verification of nuclear agreements will need to rely on methods other than environmental sensing.

  18. Independent Verification and Validation Of SAPHIRE 8 Risk Management Project Number: N6423 U.S. Nuclear Regulatory Commission

    SciTech Connect

    Kent Norris

    2009-11-01

    This report provides an evaluation of the risk management. Risk management is intended to ensure a methodology for conducting risk management planning, identification, analysis, responses, and monitoring and control activities associated with the SAPHIRE project work, and to meet the contractual commitments prepared by the sponsor, the Nuclear Regulatory Commission.

  19. Is flow verification necessary?

    SciTech Connect

    Beetle, T.M.

    1986-01-01

    Safeguards test statistics are used in an attempt to detect diversion of special nuclear material. Under assumptions concerning possible manipulation (falsification) of safeguards accounting data, the effects on the statistics due to diversion and data manipulation are described algebraically. A comprehensive set of statistics that is capable of detecting any diversion of material is defined in terms of the algebraic properties of the effects. When the assumptions exclude collusion between persons in two material balance areas, then three sets of accounting statistics are shown to be comprehensive. Two of the sets contain widely known accountancy statistics. One of them does not require physical flow verification - comparisons of operator and inspector data for receipts and shipments. The third set contains a single statistic which does not require physical flow verification. In addition to not requiring technically difficult and expensive flow verification, this single statistic has several advantages over other comprehensive sets of statistics. This algebraic approach as an alternative to flow verification for safeguards accountancy is discussed in this paper.
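    The accountancy statistics discussed above build on simple material-balance quantities. A minimal sketch of one such quantity, the material balance (often called MUF, material unaccounted for), with illustrative numbers and an assumed 3-sigma significance test; this is a generic textbook statistic, not the comprehensive statistic sets defined in the paper:

```python
# Sketch of a basic materials-accountancy statistic. All quantities and the
# alarm threshold below are illustrative assumptions (units: kg).
def muf(beginning_inventory, receipts, shipments, ending_inventory):
    """Book inventory minus measured physical inventory."""
    return beginning_inventory + receipts - shipments - ending_inventory

m = muf(beginning_inventory=100.0, receipts=25.0,
        shipments=20.0, ending_inventory=104.2)
sigma_muf = 0.5                  # assumed measurement-error std. dev.
alarm = abs(m) > 3 * sigma_muf   # simple 3-sigma significance test
print(round(m, 3), alarm)        # → 0.8 False
```

    The paper's point is that a carefully chosen set of such statistics can be made comprehensive against diversion and data falsification without requiring operator/inspector flow comparisons.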

  20. Specification and verification of nuclear-power-plant training-simulator response characteristics. Part II. Conclusions and recommendations

    SciTech Connect

    Haas, P M; Selby, D L; Kerlin, T W; Felkins, L

    1982-05-01

    The nuclear industry should adopt, and NRC regulatory and research actions should support, the systems approach to training as a structured framework for development and validation of personnel training systems. Potential exists for improving the ability to assess simulator fidelity. Systems Identification Technology offers a potential framework for model validation. Installation of the data collection/recording equipment required by NUREG-0696 could provide a vastly improved source of data for simulator fidelity assessment. The NRC needs to continue its post-TMI actions to involve itself more rigorously and more formally in the entire process of NPP personnel training system development. However, this involvement should be a participative one with industry. The existing simulator standards and guidelines should be reorganized to support the use of the systems approach to training. The standards should require and support a holistic approach to training system development that recognizes simulators and simulator training as only parts of the complete training program, and full-scope, high-fidelity, site-specific simulators as only one useful training device. Some recommendations for adapting the SAT/ISD process to the nuclear industry are: the formation of an NRC/industry planning/coordination group, a program planning study to develop a programmatic plan, development of a user's guide, NRC/industry workshops to establish common terminology and practice, and a pilot study applying the adopted SAT/ISD methodology to an actual nuclear industry training program.

  1. Geometric verification

    NASA Technical Reports Server (NTRS)

    Grebowsky, G. J.

    1982-01-01

    Present LANDSAT data formats are reviewed to clarify how the geodetic location and registration capabilities were defined for P-tape products and RBV data. Since there is only one geometric model used in the master data processor, geometric location accuracy of P-tape products depends on the absolute accuracy of the model and registration accuracy is determined by the stability of the model. Due primarily to inaccuracies in data provided by the LANDSAT attitude management system, desired accuracies are obtained only by using ground control points and a correlation process. The verification of system performance with regards to geodetic location requires the capability to determine pixel positions of map points in a P-tape array. Verification of registration performance requires the capability to determine pixel positions of common points (not necessarily map points) in 2 or more P-tape arrays for a given world reference system scene. Techniques for registration verification can be more varied and automated since map data are not required. The verification of LACIE extractions is used as an example.
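    Registration verification as described above amounts to comparing pixel positions of common points across P-tape arrays. A hedged sketch of an RMS registration-error computation over matched control points; the point pairs are invented for illustration and are not LANDSAT data:

```python
# Sketch: registration verification via the RMS of Euclidean offsets between
# matched (line, sample) pixel positions found in two image arrays.
import math

def rms_registration_error(points_a, points_b):
    """RMS offset (in pixels) between matched point pairs."""
    sq = [(ax - bx) ** 2 + (ay - by) ** 2
          for (ax, ay), (bx, by) in zip(points_a, points_b)]
    return math.sqrt(sum(sq) / len(sq))

scene1 = [(100.0, 200.0), (350.0, 80.0), (512.0, 640.0)]  # illustrative
scene2 = [(100.4, 200.3), (350.0, 80.5), (511.7, 640.4)]
print(round(rms_registration_error(scene1, scene2), 3))  # → 0.5
```

    Geodetic-location verification works the same way, except that one set of positions comes from map points, which is why it is harder to automate than registration verification.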

  2. On-line high-performance liquid chromatography-ultraviolet-nuclear magnetic resonance method of the markers of nerve agents for verification of the Chemical Weapons Convention.

    PubMed

    Mazumder, Avik; Gupta, Hemendra K; Garg, Prabhat; Jain, Rajeev; Dubey, Devendra K

    2009-07-01

    This paper details an on-flow liquid chromatography-ultraviolet-nuclear magnetic resonance (LC-UV-NMR) method for the retrospective detection and identification of alkyl alkylphosphonic acids (AAPAs) and alkylphosphonic acids (APAs), the markers of the toxic nerve agents, for verification of the Chemical Weapons Convention (CWC). Initially, the LC-UV-NMR parameters were optimized for benzyl derivatives of the APAs and AAPAs. The optimized parameters include a C18 stationary phase, a methanol:water 78:22 (v/v) mobile phase, UV detection at 268 nm, and the ¹H NMR acquisition conditions. The protocol described herein allowed the detection of analytes through acquisition of high-quality NMR spectra from aqueous solutions of the APAs and AAPAs containing high concentrations of interfering background chemicals, which were removed by a preceding sample-preparation step. Quantification precision is governed by the UV detector, which showed relative standard deviations (RSDs) within ±1.1%, while the NMR detector gave a lower limit of detection of up to 16 µg (absolute). Finally, the developed LC-UV-NMR method was applied to identify the APAs and AAPAs in real water samples, following solid-phase extraction and derivatization. The method is fast (total experiment time approximately 2 h), sensitive, rugged and efficient.
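    The quantification figure of merit cited here, the relative standard deviation, is computed from replicate detector responses. A small sketch with illustrative peak-area values that are not data from the paper:

```python
# Sketch: relative standard deviation (RSD, %) of replicate UV peak areas.
# The replicate values below are illustrative assumptions.
import statistics

def rsd_percent(values):
    """Sample standard deviation as a percentage of the mean."""
    return 100.0 * statistics.stdev(values) / statistics.mean(values)

areas = [1.002, 0.998, 1.005, 0.995, 1.000]  # replicate injections (a.u.)
print(round(rsd_percent(areas), 2))  # → 0.38
```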

  3. A REPRINT of a July 1991 Report to Congress, Executive Summary of Verification of Nuclear Warhead Dismantlement and Special Nuclear Material Controls

    SciTech Connect

    Fuller, James L.

    2008-11-20

    With the renewed thinking and debate about deep reductions in nuclear weapons, including recent proposals to eliminate nuclear warheads altogether, republishing the general conclusions of the Robinson Committee Report of 1992 appears useful. The report is sometimes referred to as the 3151 Report, after Section 3151 of the National Defense Authorization Act for FY1991, from which its requirement originated. This report contains the Executive Summary only, together with the forwarding letters from the Committee, the President of the United States, the Secretary of Energy, and C. Paul Robinson, the head of the Advisory Committee.

  4. CTBT integrated verification system evaluation model supplement

    SciTech Connect

    EDENBURN,MICHAEL W.; BUNTING,MARCUS; PAYNE JR.,ARTHUR C.; TROST,LAWRENCE C.

    2000-03-02

    Sandia National Laboratories has developed a computer based model called IVSEM (Integrated Verification System Evaluation Model) to estimate the performance of a nuclear detonation monitoring system. The IVSEM project was initiated in June 1994, by Sandia's Monitoring Systems and Technology Center and has been funded by the U.S. Department of Energy's Office of Nonproliferation and National Security (DOE/NN). IVSEM is a simple, "top-level," modeling tool which estimates the performance of a Comprehensive Nuclear Test Ban Treaty (CTBT) monitoring system and can help explore the impact of various sensor system concepts and technology advancements on CTBT monitoring. One of IVSEM's unique features is that it integrates results from the various CTBT sensor technologies (seismic, infrasound, radionuclide, and hydroacoustic) and allows the user to investigate synergy among the technologies. Specifically, IVSEM estimates the detection effectiveness (probability of detection), location accuracy, and identification capability of the integrated system and of each technology subsystem individually. The model attempts to accurately estimate the monitoring system's performance at medium interfaces (air-land, air-water) and for some evasive testing methods such as seismic decoupling. The original IVSEM report, CTBT Integrated Verification System Evaluation Model, SAND97-2518, described version 1.2 of IVSEM. This report describes the changes made to IVSEM version 1.2 and the addition of identification capability estimates that have been incorporated into IVSEM version 2.0.
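    One simple way a top-level model like IVSEM could combine per-technology results is to treat the sensing channels as independent, so the integrated probability of detection is one minus the product of the per-channel miss probabilities. The sketch below assumes independence and uses invented probabilities; it illustrates the idea of integrating sensor technologies, not IVSEM's actual model:

```python
# Sketch: integrated probability of detection across independent channels.
# Channel probabilities are illustrative assumptions, not IVSEM outputs.
def integrated_pod(per_technology_pod):
    p_miss = 1.0
    for p in per_technology_pod:
        p_miss *= (1.0 - p)  # event is missed only if every channel misses
    return 1.0 - p_miss

# Hypothetical values for seismic, infrasound, radionuclide, hydroacoustic:
p = integrated_pod([0.90, 0.40, 0.60, 0.20])
print(round(p, 4))  # → 0.9808
```

    Even channels that are individually weak raise the integrated detection probability, which is the synergy among technologies that the model lets users investigate.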

  5. MELCOR Verification, Benchmarking, and Applications experience at BNL

    SciTech Connect

    Madni, I.K.

    1992-12-31

    This paper presents a summary of MELCOR Verification, Benchmarking and Applications experience at Brookhaven National Laboratory (BNL), sponsored by the US Nuclear Regulatory Commission (NRC). Under MELCOR verification over the past several years, all released versions of the code were installed on BNL's computer system, verification exercises were performed, and defect investigation reports were sent to SNL. Benchmarking calculations of integral severe fuel damage tests performed at BNL have helped to identify areas of modeling strengths and weaknesses in MELCOR; the most appropriate choices for input parameters; selection of axial nodalization for core cells and heat structures; and workarounds that extend the capabilities of MELCOR. These insights are explored in greater detail in the paper, with the help of selected results and comparisons. Full plant applications calculations at BNL have helped to evaluate the ability of MELCOR to successfully simulate various accident sequences and calculate source terms to the environment for both BWRs and PWRs. A summary of results, including timing of key events, thermal-hydraulic response, and environmental releases of fission products are presented for selected calculations, along with comparisons with Source Term Code Package (STCP) calculations of the same sequences. Differences in results are explained on the basis of modeling differences between the two codes. The results of a sensitivity calculation are also shown. The paper concludes by highlighting some insights on bottomline issues, and the contribution of the BNL program to MELCOR development, assessment, and the identification of user needs for optimum use of the code.

  7. Swarm Verification

    NASA Technical Reports Server (NTRS)

    Holzmann, Gerard J.; Joshi, Rajeev; Groce, Alex

    2008-01-01

    Reportedly, supercomputer designer Seymour Cray once said that he would sooner use two strong oxen to plow a field than a thousand chickens. Although this is undoubtedly wise when it comes to plowing a field, it is not so clear for other types of tasks. Model checking problems are of the proverbial "needle in a haystack" type. Such problems can often be parallelized easily. Alas, none of the usual divide and conquer methods can be used to parallelize the working of a model checker. Given that it has become easier than ever to gain access to large numbers of computers to perform even routine tasks, it is becoming more and more attractive to find alternate ways to use these resources to speed up model checking tasks. This paper describes one such method, called swarm verification.
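    The swarm idea can be illustrated with a toy example: instead of one exhaustive search, launch many small, differently-seeded, depth-bounded random searches and report success if any of them reaches an error state. The state space and goal predicate below are invented for illustration and stand in for diversified model-checker runs:

```python
# Toy illustration of swarm-style search diversification. The state space,
# neighbor relation, and "error state" window are invented assumptions.
import random

def random_search(seed, start, goal, neighbors, max_steps=200):
    """One small, depth-bounded, randomly diversified search."""
    rng = random.Random(seed)   # a different seed diversifies each member
    state = start
    for _ in range(max_steps):
        if goal(state):
            return True
        state = rng.choice(neighbors(state))
    return goal(state)

# States are integers mod 1000; each state can step +1, +2, or +3.
neighbors = lambda s: [(s + 1) % 1000, (s + 2) % 1000, (s + 3) % 1000]
# "Error states" 17..19: a monotone walk with steps <= 3 cannot jump over
# this window, so every swarm member eventually lands in it.
goal = lambda s: 17 <= s <= 19

found = any(random_search(seed, 0, goal, neighbors) for seed in range(10))
print(found)  # → True
```

    In the actual technique, diversification comes from varying search strategies, hash functions, and memory bounds across many independent verifier processes rather than from random walks.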

  8. Independent Verification and Validation Of SAPHIRE 8 Software Design and Interface Design Project Number: N6423 U.S. Nuclear Regulatory Commission

    SciTech Connect

    Kent Norris

    2010-03-01

    The purpose of the Independent Verification and Validation (IV&V) role in the evaluation of the SAPHIRE software design and interface design is to assess the activities that result in the development, documentation, and review of a software design that meets the requirements defined in the software requirements documentation. The IV&V team began this endeavor after the software engineering and software development of SAPHIRE had already been in production. IV&V reviewed the requirements specified in the NRC Form 189s to verify these requirements were included in SAPHIRE’s Software Verification and Validation Plan (SVVP) design specification.

  9. Independent Verification and Validation Of SAPHIRE 8 Software Design and Interface Design Project Number: N6423 U.S. Nuclear Regulatory Commission

    SciTech Connect

    Kent Norris

    2009-10-01

    The purpose of the Independent Verification and Validation (IV&V) role in the evaluation of the SAPHIRE software design and interface design is to assess the activities that result in the development, documentation, and review of a software design that meets the requirements defined in the software requirements documentation. The IV&V team began this endeavor after the software engineering and software development of SAPHIRE had already been in production. IV&V reviewed the requirements specified in the NRC Form 189s to verify these requirements were included in SAPHIRE’s Software Verification and Validation Plan (SVVP) design specification.

  10. Monitoring and verification R&D

    SciTech Connect

    Pilat, Joseph F; Budlong - Sylvester, Kory W; Fearey, Bryan L

    2011-01-01

    The 2010 Nuclear Posture Review (NPR) report outlined the Administration's approach to promoting the agenda put forward by President Obama in Prague on April 5, 2009. The NPR calls for a national monitoring and verification R&D program to meet future challenges arising from the Administration's nonproliferation, arms control and disarmament agenda. Verification of a follow-on to New START could have to address warheads and possibly components along with delivery capabilities. Deeper cuts and disarmament would need to address all of these elements along with nuclear weapon testing, nuclear material and weapon production facilities, virtual capabilities from old weapon and existing energy programs, and undeclared capabilities. We only know how to address some elements of these challenges today, and the requirements may be more rigorous in the context of deeper cuts as well as disarmament. Moreover, there is a critical need for multiple options for sensitive problems and for ways to address other challenges. There will be other verification challenges in a world of deeper cuts and disarmament, some of which we are already facing. At some point, if the reductions process is progressing, uncertainties about past nuclear materials and weapons production will have to be addressed. IAEA safeguards will need to continue to evolve to meet current and future challenges, and to take advantage of new technologies and approaches. Transparency/verification of nuclear and dual-use exports will also have to be addressed, and there will be a need to make nonproliferation measures more watertight and transparent. In this context, and recognizing we will face all of these challenges even if disarmament is not achieved, this paper will explore possible agreements and arrangements; verification challenges; gaps in monitoring and verification technologies and approaches; and the R&D required to address these gaps and other monitoring and verification challenges.

  11. Generic Verification Protocol for Verification of Online Turbidimeters

    EPA Science Inventory

    This protocol provides generic procedures for implementing a verification test for the performance of online turbidimeters. The verification tests described in this document will be conducted under the Environmental Technology Verification (ETV) Program. Verification tests will...

  12. Independent Verification and Validation Of SAPHIRE 8 Volume 3 Users' Guide Project Number: N6423 U.S. Nuclear Regulatory Commission

    SciTech Connect

    Kent Norris

    2010-03-01

    The purpose of the Independent Verification and Validation (IV&V) role in the evaluation of the SAPHIRE 8 Volume 3 Users’ Guide is to assess the user documentation for its completeness, correctness, and consistency with respect to requirements for user interface and for any functionality that can be invoked by the user. The IV&V team began this endeavor after the software engineering and software development of SAPHIRE had already been in production.

  13. CTBT Integrated Verification System Evaluation Model

    SciTech Connect

    Edenburn, M.W.; Bunting, M.L.; Payne, A.C. Jr.

    1997-10-01

    Sandia National Laboratories has developed a computer based model called IVSEM (Integrated Verification System Evaluation Model) to estimate the performance of a nuclear detonation monitoring system. The IVSEM project was initiated in June 1994, by Sandia's Monitoring Systems and Technology Center and has been funded by the US Department of Energy's Office of Nonproliferation and National Security (DOE/NN). IVSEM is a simple, top-level, modeling tool which estimates the performance of a Comprehensive Nuclear Test Ban Treaty (CTBT) monitoring system and can help explore the impact of various sensor system concepts and technology advancements on CTBT monitoring. One of IVSEM's unique features is that it integrates results from the various CTBT sensor technologies (seismic, infrasound, radionuclide, and hydroacoustic) and allows the user to investigate synergy among the technologies. Specifically, IVSEM estimates the detection effectiveness (probability of detection) and location accuracy of the integrated system and of each technology subsystem individually. The model attempts to accurately estimate the monitoring system's performance at medium interfaces (air-land, air-water) and for some evasive testing methods such as seismic decoupling. This report describes version 1.2 of IVSEM.

  14. Columbus pressurized module verification

    NASA Technical Reports Server (NTRS)

    Messidoro, Piero; Comandatore, Emanuele

    1986-01-01

    The baseline verification approach of the COLUMBUS Pressurized Module was defined during the A and B1 project phases. Peculiarities of the verification program are the testing requirements derived from the permanent manned presence in space. The model philosophy and the test program have been developed in line with the overall verification concept. Such critical areas as meteoroid protections, heat pipe radiators and module seals are identified and tested. Verification problem areas are identified and recommendations for the next development are proposed.

  15. Verification of VLSI designs

    NASA Technical Reports Server (NTRS)

    Windley, P. J.

    1991-01-01

    In this paper we explore the specification and verification of VLSI designs. The paper focuses on abstract specification and verification of functionality using mathematical logic, as opposed to low-level boolean equivalence verification such as that done using BDDs and model checking. Specification and verification, sometimes called formal methods, are one tool for increasing computer dependability in the face of an exponentially increasing testing effort.

  16. Verification of Adaptive Systems

    SciTech Connect

    Pullum, Laura L; Cui, Xiaohui; Vassev, Emil; Hinchey, Mike; Rouff, Christopher; Buskens, Richard

    2012-01-01

    Adaptive systems are critical for future space and other unmanned and intelligent systems. Verification of these systems is also critical for their use in systems with potential harm to human life or with large financial investments. Due to their nondeterministic nature and extremely large state space, current methods for verification of software systems are not adequate to provide a high level of assurance for them. The combination of stabilization science, high performance computing simulations, compositional verification and traditional verification techniques, plus operational monitors, provides a complete approach to verification and deployment of adaptive systems that has not been used before. This paper gives an overview of this approach.

  17. 10 CFR 60.47 - Facility information and verification.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 10 Energy 2 2010-01-01 2010-01-01 false Facility information and verification. 60.47 Section 60.47 Energy NUCLEAR REGULATORY COMMISSION (CONTINUED) DISPOSAL OF HIGH-LEVEL RADIOACTIVE WASTES IN GEOLOGIC REPOSITORIES Licenses Us/iaea Safeguards Agreement § 60.47 Facility information and verification. (a)...

  18. 10 CFR 60.47 - Facility information and verification.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... 10 Energy 2 2012-01-01 2012-01-01 false Facility information and verification. 60.47 Section 60.47 Energy NUCLEAR REGULATORY COMMISSION (CONTINUED) DISPOSAL OF HIGH-LEVEL RADIOACTIVE WASTES IN GEOLOGIC REPOSITORIES Licenses Us/iaea Safeguards Agreement § 60.47 Facility information and verification. (a)...

  19. 10 CFR 60.47 - Facility information and verification.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... 10 Energy 2 2014-01-01 2014-01-01 false Facility information and verification. 60.47 Section 60.47 Energy NUCLEAR REGULATORY COMMISSION (CONTINUED) DISPOSAL OF HIGH-LEVEL RADIOACTIVE WASTES IN GEOLOGIC REPOSITORIES Licenses Us/iaea Safeguards Agreement § 60.47 Facility information and verification. (a)...

  20. 10 CFR 60.47 - Facility information and verification.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... 10 Energy 2 2011-01-01 2011-01-01 false Facility information and verification. 60.47 Section 60.47 Energy NUCLEAR REGULATORY COMMISSION (CONTINUED) DISPOSAL OF HIGH-LEVEL RADIOACTIVE WASTES IN GEOLOGIC REPOSITORIES Licenses Us/iaea Safeguards Agreement § 60.47 Facility information and verification. (a)...

  1. 10 CFR 60.47 - Facility information and verification.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... 10 Energy 2 2013-01-01 2013-01-01 false Facility information and verification. 60.47 Section 60.47 Energy NUCLEAR REGULATORY COMMISSION (CONTINUED) DISPOSAL OF HIGH-LEVEL RADIOACTIVE WASTES IN GEOLOGIC REPOSITORIES Licenses Us/iaea Safeguards Agreement § 60.47 Facility information and verification. (a)...

  2. Independent Verification and Validation Of SAPHIRE 8 Software Acceptance Test Plan Project Number: N6423 U.S. Nuclear Regulatory Commission

    SciTech Connect

    Kent Norris

    2010-03-01

    The purpose of the Independent Verification and Validation (IV&V) role in the evaluation of the SAPHIRE 8 Software Acceptance Test Plan is to assess the approach to be taken for intended testing activities. The plan typically identifies the items to be tested, the requirements being tested, the testing to be performed, test schedules, personnel requirements, reporting requirements, evaluation criteria, and any risks requiring contingency planning. The IV&V team began this endeavor after the software engineering and software development of SAPHIRE had already been in production.

  3. Independent Verification and Validation Of SAPHIRE 8 Software Configuration Management Plan Project Number: N6423 U.S. Nuclear Regulatory Commission

    SciTech Connect

    Kent Norris

    2009-10-01

    The purpose of the Independent Verification and Validation (IV&V) role in the evaluation of the SAPHIRE configuration management is to assess the activities that result in identifying and defining the baselines associated with the SAPHIRE software product; controlling the changes to baselines and release of baselines throughout the life cycle; recording and reporting the status of baselines and the proposed and actual changes to the baselines; and verifying the correctness and completeness of baselines. The IV&V team began this endeavor after the software engineering and software development of SAPHIRE had already been in production.

  4. Independent Verification and Validation Of SAPHIRE 8 Software Configuration Management Plan Project Number: N6423 U.S. Nuclear Regulatory Commission

    SciTech Connect

    Kent Norris

    2010-02-01

    The purpose of the Independent Verification and Validation (IV&V) role in the evaluation of the SAPHIRE configuration management is to assess the activities that result in identifying and defining the baselines associated with the SAPHIRE software product; controlling the changes to baselines and release of baselines throughout the life cycle; recording and reporting the status of baselines and the proposed and actual changes to the baselines; and verifying the correctness and completeness of baselines. The IV&V team began this endeavor after the software engineering and software development of SAPHIRE had already been in production.

  5. Programmable RET Mask Layout Verification

    NASA Astrophysics Data System (ADS)

    Beale, Daniel F.; Mayhew, Jeffrey P.; Rieger, Michael L.; Tang, Zongwu

    2002-12-01

    Emerging resolution enhancement techniques (RET) and OPC are dramatically increasing the complexity of mask layouts and, in turn, mask verification. Mask shapes needed to achieve required results on the wafer diverge significantly from corresponding shapes in the physical design, and in some cases a single chip layer may be decomposed into two masks used in multiple exposures. The mask verification challenge is to certify that a RET-synthesized mask layout will produce an acceptable facsimile of the design intent expressed in the design layout. Furthermore, tradeoffs among mask complexity, design intent, targeted process latitude, and other factors play a growing role in controlling rising mask costs. All of these considerations must in turn be incorporated into the mask layout verification strategy needed for data prep sign-off. In this paper we describe a technique for assessing the lithographic quality of mask layouts for diverse RET methods while effectively accommodating various manufacturing objectives and specifications. It leverages the familiar DRC paradigm for identifying errors and producing DRC-like error shapes in its output layout. It integrates a unique concept of "check figures" - layer-based geometries that dictate where and how simulations of shapes on the wafer are to be compared to the original desired layout. We will show how this provides a highly programmable environment that makes it possible to engage in "compound" check strategies that vary based on design intent and adaptive simulation with multiple checks. Verification may be applied at the "go/no go" level or can be used to build a body of data for quantitative analysis of lithographic behavior at multiple process conditions or for specific user-defined critical features. In addition, we will outline automated methods that guide the selection of input parameters controlling specific verification strategies.

  6. Game theory and decision support system for use in security reviews of nuclear material tracking and accountancy systems

    SciTech Connect

    Goutal, P.; Werkoff, F.; Le Manchec, K.; Preston, N.; Roche, F.

    1995-12-31

    Tracking and accountancy arrangements help guarantee the security of nuclear materials. Verifications consisting of comparisons between physical identifications or measurements on one hand and material accountancy on the other hand are carried out, in order to detect any unexpected absence of nuclear material. This paper studies two different aspects of the problem of the efficiency of these verifications. First, a decision support system for use in security reviews of nuclear material accountancy systems is presented. Its purpose is firstly to represent a facility and the associated verifications, tracking and accountancy operations and secondly, to calculate the detection delay in the case of an absence of nuclear material. Next, in order to minimize the detection delay for a limited, fixed number of physical identifications, a two-person, zero-sum game with incomplete information is described. The first results obtained from this analysis indicate shorter detection times than those given by games with complete information.
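    The two-person zero-sum game described above has a simple closed form in the 2x2 case, which is enough to illustrate the idea of minimizing worst-case detection delay with a mixed inspection strategy. The sketch below assumes complete information (the paper's incomplete-information variant is more involved), and the delay matrix values are illustrative, not taken from the paper.

```python
def inspector_strategy(d):
    """Optimal mixed strategy for the inspector (row player, minimizing
    expected detection delay) in a 2x2 zero-sum inspection game without
    a saddle point.

    d[i][j] = detection delay when the inspector verifies area i while the
    diverter removes material from area j.
    """
    (d11, d12), (d21, d22) = d
    denom = d11 - d12 - d21 + d22
    if denom == 0:
        raise ValueError("degenerate game; no unique mixed strategy")
    p = (d22 - d21) / denom          # probability of verifying area 1
    value = p * d11 + (1 - p) * d21  # expected delay (game value)
    return p, value
```

    For example, with delays [[1, 5], [4, 2]] the inspector should verify area 1 one third of the time, holding the expected detection delay to 3 regardless of where the diverter acts.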

  7. Simulation verification techniques study

    NASA Technical Reports Server (NTRS)

    Schoonmaker, P. B.; Wenglinski, T. H.

    1975-01-01

    Results are summarized of the simulation verification techniques study which consisted of two tasks: to develop techniques for simulator hardware checkout and to develop techniques for simulation performance verification (validation). The hardware verification task involved definition of simulation hardware (hardware units and integrated simulator configurations), survey of current hardware self-test techniques, and definition of hardware and software techniques for checkout of simulator subsystems. The performance verification task included definition of simulation performance parameters (and critical performance parameters), definition of methods for establishing standards of performance (sources of reference data for validation), and definition of methods for validating performance. Both major tasks included definition of verification software and assessment of verification data base impact. An annotated bibliography of all documents generated during this study is provided.

  8. Software verification and testing

    NASA Technical Reports Server (NTRS)

    1985-01-01

    General procedures for software verification and validation are provided as a guide for managers, programmers, and analysts involved in software development. The verification and validation procedures described are based primarily on testing techniques. Testing refers to the execution of all or part of a software system for the purpose of detecting errors. Planning, execution, and analysis of tests are outlined in this document. Code reading and static analysis techniques for software verification are also described.

  9. Sandia technology. Volume 13, number 2 Special issue : verification of arms control treaties.

    SciTech Connect

    Not Available

    1989-03-01

    Nuclear deterrence, a cornerstone of US national security policy, has helped prevent global conflict for over 40 years. The DOE and DoD share responsibility for this vital part of national security. The US will continue to rely on nuclear deterrence for the foreseeable future. In the late 1950s, Sandia developed satellite-borne nuclear burst detection systems to support the treaty banning atmospheric nuclear tests. This activity has continued to expand and diversify. When the Non-Proliferation Treaty was ratified in 1970, we began to develop technologies to protect nuclear materials from falling into unauthorized hands. This program grew and now includes systems for monitoring the movement and storage of nuclear materials, detecting tampering, and transmitting sensitive data securely. In the late 1970s, negotiations to further limit underground nuclear testing were being actively pursued. In less than 18 months, we fielded the National Seismic Station, an unattended observatory for in-country monitoring of nuclear tests. In the mid-1980s, arms-control interest shifted to facility monitoring and on-site inspection. Our Technical On-site Inspection Facility is the national test bed for perimeter and portal monitoring technology and the prototype for the inspection portal that was recently installed in the USSR under the Intermediate-Range Nuclear Forces accord. The articles in this special issue of Sandia Technology describe some of our current contributions to verification technology. This work supports the US policy to seek realistic arms control agreements while maintaining our national security.

  10. Verification Challenges at Low Numbers

    SciTech Connect

    Benz, Jacob M.; Booker, Paul M.; McDonald, Benjamin S.

    2013-06-01

    Many papers have dealt with the political difficulties and ramifications of deep nuclear arms reductions, and the issues of “Going to Zero”. Political issues include extended deterrence, conventional weapons, ballistic missile defense, and regional and geo-political security issues. At each step on the road to low numbers, the verification required to ensure compliance of all parties will increase significantly. Looking post New START, the next step will likely include warhead limits in the neighborhood of 1000. Further reductions will include stepping stones at 1000 warheads, 100’s of warheads, and then 10’s of warheads before final elimination could be considered of the last few remaining warheads and weapons. This paper will focus on these three threshold reduction levels: 1000, 100’s, and 10’s. For each, the issues and challenges will be discussed, potential solutions will be identified, and the verification technologies and chain of custody measures that address these solutions will be surveyed. It is important to note that many of the issues that need to be addressed have no current solution. In these cases, the paper will explore new or novel technologies that could be applied. These technologies will draw from the research and development that is ongoing throughout the national laboratory complex, and will look at technologies utilized in other areas of industry for their application to arms control verification.

  11. Burnup verification tests with the FORK measurement system-implementation for burnup credit

    SciTech Connect

    Ewing, R.I.

    1994-08-01

    Verification measurements may be used to help ensure nuclear criticality safety when burnup credit is applied to spent fuel transport and storage systems. The FORK system measures the passive neutron and gamma-ray emission from spent fuel assemblies while in the storage pool. It was designed at Los Alamos National Laboratory for the International Atomic Energy Agency safeguards program and is well suited to verify burnup and cooling time records at commercial Pressurized Water Reactor (PWR) sites. This report deals with the application of the FORK system to burnup credit operations.

  12. Helping Children Help Themselves. Revised.

    ERIC Educational Resources Information Center

    Alberta Dept. of Agriculture, Edmonton.

    Youth leaders and parents can use this activity oriented publication to help children six to twelve years of age become more independent by acquiring daily living skills. The publication consists of five units, each of which contains an introduction, learning activities, and lists of resource materials. Age-ability levels are suggested for…

  13. Standardized verification of fuel cycle modeling

    DOE PAGESBeta

    Feng, B.; Dixon, B.; Sunny, E.; Cuadra, A.; Jacobson, J.; Brown, N. R.; Powers, J.; Worrall, A.; Passerini, S.; Gregg, R.

    2016-04-05

    A nuclear fuel cycle systems modeling and code-to-code comparison effort was coordinated across multiple national laboratories to verify the tools needed to perform fuel cycle analyses of the transition from a once-through nuclear fuel cycle to a sustainable potential future fuel cycle. For this verification study, a simplified example transition scenario was developed to serve as a test case for the four systems codes involved (DYMOND, VISION, ORION, and MARKAL), each used by a different laboratory participant. In addition, all participants produced spreadsheet solutions for the test case to check all the mass flows and reactor/facility profiles on a year-by-year basis throughout the simulation period. The test case specifications describe a transition from the current US fleet of light water reactors to a future fleet of sodium-cooled fast reactors that continuously recycle transuranic elements as fuel. After several initial coordinated modeling and calculation attempts, it was revealed that most of the differences in code results were not due to different code algorithms or calculation approaches, but due to different interpretations of the input specifications among the analysts. Therefore, the specifications for the test case itself were iteratively updated to remove ambiguity and to help calibrate interpretations. In addition, a few corrections and modifications were made to the codes as well, which led to excellent agreement between all codes and spreadsheets for this test case. Although no fuel cycle transition analysis codes matched the spreadsheet results exactly, all remaining differences in the results were due to fundamental differences in code structure and/or were thoroughly explained. As a result, the specifications and example results are provided so that they can be used to verify additional codes in the future for such fuel cycle transition scenarios.
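    The year-by-year spreadsheet checks described above amount to flagging years where two codes disagree on a mass flow beyond some tolerance. A minimal sketch of such a check follows; the function name, the dict-based input format, and the 1% tolerance are all illustrative assumptions, not details from the study.

```python
def compare_mass_flows(code_a, code_b, rel_tol=0.01):
    """Return (year, a, b) tuples where two fuel cycle codes disagree on
    a mass flow by more than rel_tol (fractional difference).

    Inputs are dicts mapping year -> mass flow (e.g., tonnes of heavy metal).
    """
    mismatches = []
    for year in sorted(set(code_a) | set(code_b)):
        a, b = code_a.get(year, 0.0), code_b.get(year, 0.0)
        ref = max(abs(a), abs(b))          # scale by the larger value
        if ref > 0 and abs(a - b) / ref > rel_tol:
            mismatches.append((year, a, b))
    return mismatches
```

    Running every reported flow through a check like this, year by year, is what surfaced the input-interpretation differences the abstract describes.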

  14. Fuel Retrieval System (FRS) Design Verification

    SciTech Connect

    YANOCHKO, R.M.

    2000-01-27

    This document was prepared as part of an independent review to explain design verification activities already completed, and to define the remaining design verification actions for the Fuel Retrieval System. The Fuel Retrieval Subproject was established as part of the Spent Nuclear Fuel Project (SNF Project) to retrieve and repackage the SNF located in the K Basins. The Fuel Retrieval System (FRS) construction work is complete in the KW Basin, and start-up testing is underway. Design modifications and construction planning are also underway for the KE Basin. An independent review of the design verification process as applied to the K Basin projects was initiated in support of preparation for the SNF Project operational readiness review (ORR).

  15. Heavy water physical verification in power plants

    SciTech Connect

    Morsy, S.; Schuricht, V.; Beetle, T.; Szabo, E.

    1986-01-01

    This paper is a report on the Agency experience in verifying heavy water inventories in power plants. The safeguards objectives and goals for such activities are defined in the paper. The heavy water is stratified according to the flow within the power plant, including upgraders. A safeguards scheme based on a combination of records auditing, comparing records and reports, and physical verification has been developed. This scheme has elevated the status of heavy water safeguards to a level comparable to nuclear material safeguards in bulk facilities. It leads to attribute and variable verification of the heavy water inventory in the different system components and in the store. The verification methods include volume and weight determination, sampling and analysis, non-destructive assay (NDA), and criticality check. The analysis of the different measurement methods and their limits of accuracy are discussed in the paper.

  16. Regression Verification Using Impact Summaries

    NASA Technical Reports Server (NTRS)

    Backes, John; Person, Suzette J.; Rungta, Neha; Thachuk, Oksana

    2013-01-01

    Regression verification techniques are used to prove equivalence of syntactically similar programs. Checking equivalence of large programs, however, can be computationally expensive. Existing regression verification techniques rely on abstraction and decomposition techniques to reduce the computational effort of checking equivalence of the entire program. These techniques are sound but not complete. In this work, we propose a novel approach to improve scalability of regression verification by classifying the program behaviors generated during symbolic execution as either impacted or unimpacted. Our technique uses a combination of static analysis and symbolic execution to generate summaries of impacted program behaviors. The impact summaries are then checked for equivalence using an off-the-shelf decision procedure. We prove that our approach is both sound and complete for sequential programs, with respect to the depth bound of symbolic execution. Our evaluation on a set of sequential C artifacts shows that reducing the size of the summaries can help reduce the cost of software equivalence checking. Various reduction, abstraction, and compositional techniques have been developed to help scale software verification techniques to industrial-sized systems. Although such techniques have greatly increased the size and complexity of systems that can be checked, analysis of large software systems remains costly. Regression analysis techniques, e.g., regression testing [16], regression model checking [22], and regression verification [19], restrict the scope of the analysis by leveraging the differences between program versions. These techniques are based on the idea that if code is checked early in development, then subsequent versions can be checked against a prior (checked) version, leveraging the results of the previous analysis to reduce analysis cost of the current version.
Regression verification addresses the problem of proving equivalence of closely related program
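    The impacted/unimpacted classification above can be illustrated with a toy example. In the paper the impact summary comes from static analysis plus symbolic execution; here it is hand-written for a trivial diff, and all function names are illustrative.

```python
def old_version(x):
    if x < 0:
        return -x        # branch changed by the refactoring below
    return x * 2         # untouched branch

def new_version(x):
    if x < 0:
        return abs(x)    # refactored; behavior intended to be unchanged
    return x * 2         # identical to old_version: unimpacted

def impacted(x):
    # Hand-written impact summary for this toy diff: only the x < 0
    # branch differs syntactically between the two versions.
    return x < 0

def check_equivalence(inputs):
    """Check old/new agreement only on impacted behaviors; unimpacted
    behaviors are equivalent by construction (identical code)."""
    return all(old_version(x) == new_version(x) for x in inputs if impacted(x))
```

    Restricting the equivalence check to the impacted behaviors is what lets the approach scale: the unimpacted portion of the program never has to be re-examined.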

  17. The Challenge for Arms Control Verification in the Post-New START World

    SciTech Connect

    Wuest, C R

    2012-05-24

    Nuclear weapon arms control treaty verification is a key aspect of any agreement between signatories to establish that the terms and conditions spelled out in the treaty are being met. Historically, arms control negotiations have focused more on the rules and protocols for reducing the numbers of warheads and delivery systems - sometimes resorting to complex and arcane procedures for counting forces - in an attempt to address perceived or real imbalances in a nation's strategic posture that could lead to instability. Verification procedures are generally defined in arms control treaties and supporting documents and tend to focus on technical means and measures designed to ensure that a country is following the terms of the treaty and that it is not liable to engage in deception or outright cheating in an attempt to circumvent the spirit and the letter of the agreement. As the Obama Administration implements the articles, terms, and conditions of the recently ratified and entered-into-force New START treaty, there are already efforts within and outside of government to move well below the specified New START levels of 1550 warheads, 700 deployed strategic delivery vehicles, and 800 deployed and nondeployed strategic launchers (Inter-Continental Ballistic Missile (ICBM) silos, Submarine-Launched Ballistic Missile (SLBM) tubes on submarines, and bombers). A number of articles and opinion pieces have appeared that advocate for significantly deeper cuts in the U.S. nuclear stockpile, with some suggesting that unilateral reductions on the part of the U.S. would help coax Russia and others to follow our lead. Papers and studies prepared for the U.S. Department of Defense and at the U.S. Air War College have also been published, suggesting that nuclear forces totaling no more than about 300 warheads would be sufficient to meet U.S. national security and deterrence needs. (Davis 2011, Schaub and Forsyth 2010) Recent articles by James M. Acton and others suggest that the

  18. Verification and arms control

    SciTech Connect

    Potter, W.C.

    1985-01-01

    Recent years have witnessed an increased stress upon the verification of arms control agreements, both as a technical problem and as a political issue. As one contribution here points out, the middle ground has shrunk between those who are persuaded that the Soviets are ''cheating'' and those who are willing to take some verification risks for the sake of achieving arms control. One angle, according to a Lawrence Livermore physicist who served as a member of the delegation to the various test-ban treaty negotiations, is the limited effectiveness of on-site inspection as compared to other means of verification.

  19. Packaged low-level waste verification system

    SciTech Connect

    Tuite, K.T.; Winberg, M.; Flores, A.Y.; Killian, E.W.; McIsaac, C.V.

    1996-08-01

    Currently, states and low-level radioactive waste (LLW) disposal site operators have no method of independently verifying the radionuclide content of packaged LLW that arrive at disposal sites for disposal. At this time, disposal sites rely on LLW generator shipping manifests and accompanying records to ensure that LLW received meets the waste acceptance criteria. An independent verification system would provide a method of checking generator LLW characterization methods and help ensure that LLW disposed of at disposal facilities meets requirements. The Mobile Low-Level Waste Verification System (MLLWVS) provides the equipment, software, and methods to enable the independent verification of LLW shipping records to ensure that disposal site waste acceptance criteria are being met. The MLLWVS system was developed under a cost share subcontract between WMG, Inc., and Lockheed Martin Idaho Technologies through the Department of Energy`s National Low-Level Waste Management Program at the Idaho National Engineering Laboratory (INEL).

  20. Verification of RADTRAN

    SciTech Connect

    Kanipe, F.L.; Neuhauser, K.S.

    1995-12-31

    This document presents details of the verification process of the RADTRAN computer code which was established for the calculation of risk estimates for radioactive materials transportation by highway, rail, air, and waterborne modes.

  1. Voltage verification unit

    DOEpatents

    Martin, Edward J.

    2008-01-15

    A voltage verification unit and method for determining the absence of potentially dangerous potentials within a power supply enclosure without Mode 2 work is disclosed. With this device and method, a qualified worker follows a relatively simple protocol that involves a function test (hot, cold, hot) of the voltage verification unit before Lock Out/Tag Out and, once the Lock Out/Tag Out is completed, testing or "trying" by simply reading a display on the voltage verification unit, without exposure of the operator to the interior of the voltage supply enclosure. According to a preferred embodiment, the voltage verification unit includes test leads to allow diagnostics with other meters, without the necessity of accessing potentially dangerous bus bars or the like.

  2. 10 CFR 72.79 - Facility information and verification.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... Energy NUCLEAR REGULATORY COMMISSION (CONTINUED) LICENSING REQUIREMENTS FOR THE INDEPENDENT STORAGE OF SPENT NUCLEAR FUEL, HIGH-LEVEL RADIOACTIVE WASTE, AND REACTOR-RELATED GREATER THAN CLASS C WASTE Records... this chapter on DOC/NRC Form AP-1 and associated forms; and (c) Shall permit verification thereof...

  3. 10 CFR 72.79 - Facility information and verification.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... Energy NUCLEAR REGULATORY COMMISSION (CONTINUED) LICENSING REQUIREMENTS FOR THE INDEPENDENT STORAGE OF SPENT NUCLEAR FUEL, HIGH-LEVEL RADIOACTIVE WASTE, AND REACTOR-RELATED GREATER THAN CLASS C WASTE Records... this chapter on DOC/NRC Form AP-1 and associated forms; and (c) Shall permit verification thereof...

  4. 10 CFR 72.79 - Facility information and verification.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... Energy NUCLEAR REGULATORY COMMISSION (CONTINUED) LICENSING REQUIREMENTS FOR THE INDEPENDENT STORAGE OF SPENT NUCLEAR FUEL, HIGH-LEVEL RADIOACTIVE WASTE, AND REACTOR-RELATED GREATER THAN CLASS C WASTE Records... this chapter on DOC/NRC Form AP-1 and associated forms; and (c) Shall permit verification thereof...

  5. Systems Approach to Arms Control Verification

    SciTech Connect

    Allen, K; Neimeyer, I; Listner, C; Stein, G; Chen, C; Dreicer, M

    2015-05-15

    Using the decades of experience of developing concepts and technologies for verifying bilateral and multilateral arms control agreements, a broad conceptual systems approach is being developed that takes into account varying levels of information and risk. The IAEA has already demonstrated the applicability of a systems approach by implementing safeguards at the State level, with acquisition path analysis as the key element. In order to test whether such an approach could also be implemented for arms control verification, an exercise was conducted in November 2014 at the JRC ITU Ispra. Based on the scenario of a hypothetical treaty between two model nuclear weapons states aimed at capping their nuclear arsenals at existing levels, the goal of this exercise was to explore how to use acquisition path analysis in an arms control context. Our contribution will present the scenario, objectives and results of this exercise, and attempt to define future workshops aimed at further developing verification measures that will deter or detect treaty violations.

  6. Turbulence Modeling Verification and Validation

    NASA Technical Reports Server (NTRS)

    Rumsey, Christopher L.

    2014-01-01

    Computational fluid dynamics (CFD) software that solves the Reynolds-averaged Navier-Stokes (RANS) equations has been in routine use for more than a quarter of a century. It is currently employed not only for basic research in fluid dynamics, but also for the analysis and design processes in many industries worldwide, including aerospace, automotive, power generation, chemical manufacturing, polymer processing, and petroleum exploration. A key feature of RANS CFD is the turbulence model. Because the RANS equations are unclosed, a model is necessary to describe the effects of the turbulence on the mean flow, through the Reynolds stress terms. The turbulence model is one of the largest sources of uncertainty in RANS CFD, and most models are known to be flawed in one way or another. Alternative methods such as direct numerical simulations (DNS) and large eddy simulations (LES) rely less on modeling and hence include more physics than RANS. In DNS all turbulent scales are resolved, and in LES the large scales are resolved and the effects of the smallest turbulence scales are modeled. However, both DNS and LES are too expensive for most routine industrial usage on today's computers. Hybrid RANS-LES, which blends RANS near walls with LES away from walls, helps to moderate the cost while still retaining some of the scale-resolving capability of LES, but for some applications it can still be too expensive. Even considering its associated uncertainties, RANS turbulence modeling has proved to be very useful for a wide variety of applications. For example, in the aerospace field, many RANS models are considered to be reliable for computing attached flows. However, existing turbulence models are known to be inaccurate for many flows involving separation. Research has been ongoing for decades in an attempt to improve turbulence models for separated and other nonequilibrium flows. When developing or improving turbulence models, both verification and validation are important

  7. Verification of Autonomous Systems for Space Applications

    NASA Technical Reports Server (NTRS)

    Brat, G.; Denney, E.; Giannakopoulou, D.; Frank, J.; Jonsson, A.

    2006-01-01

    Autonomous software, especially if it is model-based, can play an important role in future space applications. For example, it can help streamline ground operations, assist in autonomous rendezvous and docking operations, or even help recover from problems (e.g., planners can be used to explore the space of recovery actions for a power subsystem and implement a solution without (or with minimal) human intervention). In general, the exploration capabilities of model-based systems give them great flexibility. Unfortunately, it also makes them unpredictable to our human eyes, both in terms of their execution and their verification. The traditional verification techniques are inadequate for these systems since they are mostly based on testing, which implies a very limited exploration of their behavioral space. In our work, we explore how advanced V&V techniques, such as static analysis, model checking, and compositional verification, can be used to gain trust in model-based systems. We also describe how synthesis can be used in the context of system reconfiguration and in the context of verification.

  8. RISKIND verification and benchmark comparisons

    SciTech Connect

    Biwer, B.M.; Arnish, J.J.; Chen, S.Y.; Kamboj, S.

    1997-08-01

    This report presents verification calculations and benchmark comparisons for RISKIND, a computer code designed to estimate potential radiological consequences and health risks to individuals and the population from exposures associated with the transportation of spent nuclear fuel and other radioactive materials. Spreadsheet calculations were performed to verify the proper operation of the major options and calculational steps in RISKIND. The program is unique in that it combines a variety of well-established models into a comprehensive treatment for assessing risks from the transportation of radioactive materials. Benchmark comparisons with other validated codes that incorporate similar models were also performed. For instance, the external gamma and neutron dose rate curves for a shipping package estimated by RISKIND were compared with those estimated by using the RADTRAN 4 code and NUREG-0170 methodology. Atmospheric dispersion of released material and the resulting dose estimates were compared with results from the GENII and CAP88-PC codes. Verification results have shown the program to be performing its intended function correctly. The benchmark results indicate that the predictions made by RISKIND are within acceptable limits when compared with predictions from similar existing models.

  9. 7.0T nuclear magnetic resonance evaluation of the amyloid beta (1-40) animal model of Alzheimer's disease: comparison of cytology verification.

    PubMed

    Zhang, Lei; Dong, Shuai; Zhao, Guixiang; Ma, Yu

    2014-02-15

    3.0T magnetic resonance spectroscopic imaging is a commonly used method in the research of brain function in Alzheimer's disease. However, the role of 7.0T high-field magnetic resonance spectroscopic imaging in brain function of Alzheimer's disease remains unclear. In this study, 7.0T magnetic resonance spectroscopy showed that in the hippocampus of Alzheimer's disease rats, the N-acetylaspartate wave crest was reduced, and the creatine and choline wave crest was elevated. This finding was further supported by hematoxylin-eosin staining, which showed a loss of hippocampal neurons and more glial cells. Moreover, electron microscopy showed neuronal shrinkage and mitochondrial rupture, and scanning electron microscopy revealed small hippocampal synaptic vesicles, incomplete synaptic structures, and reduced vesicle numbers. Overall, the results revealed that 7.0T high-field nuclear magnetic resonance spectroscopy detected the lesions and functional changes in hippocampal neurons of Alzheimer's disease rats in vivo, allowing the possibility for assessing the success rate and grading of the amyloid beta (1-40) animal model of Alzheimer's disease.

  10. Experimental verification of proton beam monitoring in a human body by use of activity image of positron-emitting nuclei generated by nuclear fragmentation reaction.

    PubMed

    Nishio, Teiji; Miyatake, Aya; Inoue, Kazumasa; Gomi-Miyagishi, Tomoko; Kohno, Ryosuke; Kameoka, Satoru; Nakagawa, Keiichi; Ogino, Takashi

    2008-01-01

    Proton therapy is a form of radiotherapy that enables concentration of dose on a tumor by use of a scanned or modulated Bragg peak. Therefore, it is very important to evaluate the proton-irradiated volume accurately. The proton-irradiated volume can be confirmed by detection of pair-annihilation gamma rays from positron-emitting nuclei generated by the nuclear fragmentation reaction of the incident protons on target nuclei using a PET apparatus. The activity of the positron-emitting nuclei generated in a patient was measured with a PET-CT apparatus after proton beam irradiation of the patient. Activity measurement was performed in patients with tumors of the brain, head and neck, liver, lungs, and sacrum. The 3-D PET image obtained on the CT image showed the visual correspondence with the irradiation area of the proton beam. Moreover, it was confirmed that there were differences in the strength of activity from the PET-CT images obtained at each irradiation site. The values of activity obtained from both measurement and calculation based on the reaction cross section were compared, and it was confirmed that the intensity and the distribution of the activity changed with the start time of the PET imaging after proton beam irradiation. The clinical use of this information about the positron-emitting nuclei will be important for promoting proton treatment with higher accuracy in the future.

  11. Visual inspection for CTBT verification

    SciTech Connect

    Hawkins, W.; Wohletz, K.

    1997-03-01

    On-site visual inspection will play an essential role in future Comprehensive Test Ban Treaty (CTBT) verification. Although seismic and remote sensing techniques are the best understood and most developed methods for detection of evasive testing of nuclear weapons, visual inspection can greatly augment the certainty and detail of understanding provided by these more traditional methods. Not only can visual inspection offer "ground truth" in cases of suspected nuclear testing, but it can also provide accurate source location and testing-media properties necessary for detailed analysis of seismic records. For testing in violation of the CTBT, an offending party may attempt to conceal the test, most likely by underground burial. While such concealment may not prevent seismic detection, evidence of test deployment, location, and yield can be disguised. In this light, if a suspicious event is detected by seismic or other remote methods, visual inspection of the event area is necessary to document any evidence that might support a claim of nuclear testing and to provide data needed to further interpret seismic records and guide further investigations. However, the methods for visual inspection are neither widely known nor appreciated, and experience is presently limited. Visual inspection can be achieved by simple, non-intrusive means, primarily geological in nature, and it is the purpose of this report to describe the considerations, procedures, and equipment required to field such an inspection.

  12. Explaining Verification Conditions

    NASA Technical Reports Server (NTRS)

    Denney, Ewen; Fischer, Bernd

    2006-01-01

    The Hoare approach to program verification relies on the construction and discharge of verification conditions (VCs) but offers no support to trace, analyze, and understand the VCs themselves. We describe a systematic extension of the Hoare rules by labels so that the calculus itself can be used to build up explanations of the VCs. The labels are maintained through the different processing steps and rendered as natural language explanations. The explanations can easily be customized and can capture different aspects of the VCs; here, we focus on their structure and purpose. The approach is fully declarative and the generated explanations are based only on an analysis of the labels rather than directly on the logical meaning of the underlying VCs or their proofs. Keywords: program verification, Hoare calculus, traceability.
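
    The idea of threading labels through the calculus can be illustrated with a toy weakest-precondition generator; the mini-language, label texts, and naive string substitution below are hypothetical, not the authors' system.

```python
# Minimal sketch of a labeled VC generator: each rule records a label
# explaining why the condition arose, and the labels (not the formulas)
# drive the natural-language explanation.

def wp(stmt, post):
    """Weakest precondition over a tiny language; returns (formula, labels).
    Substitution is a naive string replace, adequate only for this demo."""
    kind = stmt[0]
    if kind == "assign":              # ("assign", var, expr)
        _, var, expr = stmt
        return (post.replace(var, f"({expr})"),
                [f"substituted {expr} for {var} (assignment rule)"])
    if kind == "seq":                 # ("seq", s1, s2)
        mid, labels2 = wp(stmt[2], post)
        pre, labels1 = wp(stmt[1], mid)
        return pre, labels1 + labels2
    raise ValueError(f"unknown statement kind: {kind}")

pre, why = wp(("seq", ("assign", "x", "x + 1"), ("assign", "y", "x * 2")),
              "y > 0")
print(pre)   # ((x + 1) * 2) > 0
print(why)   # one explanatory label per applied rule
```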

  13. Multi-canister overpack project -- verification and validation, MCNP 4A

    SciTech Connect

    Goldmann, L.H.

    1997-11-10

    This supporting document contains the software verification and validation (V and V) package used for Phase 2 design of the Spent Nuclear Fuel Multi-Canister Overpack. V and V packages for both ANSYS and MCNP are included. Description of Verification Run(s): This software must be compiled specifically for the machine it is to be used on. Therefore, to facilitate the verification process, the software automatically runs 25 sample problems to ensure proper installation and compilation. Once the runs are completed, the software checks for verification by performing a file comparison of the new output files against the old output files. Any difference between any of the files will cause a verification error. Because of the manner in which the verification is completed, a verification error does not necessarily indicate a problem; it indicates that a closer look at the output files is needed to determine the cause of the error.
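
    The file-comparison step can be sketched as follows. This is an illustrative reconstruction, not the comparison code shipped with MCNP, and the keyword list of lines that legitimately vary between installations is an assumption.

```python
import difflib

def verification_diff(new_text, old_text, volatile=("date", "time", "version")):
    """Compare two output listings while skipping lines that legitimately
    differ between installations; a raw diff would flag those lines as
    'verification errors' even though the physics results agree."""
    def keep(line):
        return not any(token in line.lower() for token in volatile)
    new_lines = [l for l in new_text.splitlines() if keep(l)]
    old_lines = [l for l in old_text.splitlines() if keep(l)]
    return list(difflib.unified_diff(old_lines, new_lines, lineterm=""))

old = "run date: 1997-11-10\nkeff = 1.0002\n"
new = "run date: 2024-01-01\nkeff = 1.0002\n"
print(verification_diff(new, old))  # [] -> only volatile lines differed
```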

  14. Verification of Ceramic Structures

    NASA Astrophysics Data System (ADS)

    Behar-Lafenetre, Stephanie; Cornillon, Laurence; Rancurel, Michael; De Graaf, Dennis; Hartmann, Peter; Coe, Graham; Laine, Benoit

    2012-07-01

    In the framework of the “Mechanical Design and Verification Methodologies for Ceramic Structures” contract [1] awarded by ESA, Thales Alenia Space has investigated literature and practices in affiliated industries to propose a methodological guideline for verification of ceramic spacecraft and instrument structures. It has been written in order to be applicable to most types of ceramic or glass-ceramic materials - typically Cesic®, HBCesic®, Silicon Nitride, Silicon Carbide and ZERODUR®. The proposed guideline describes the activities to be performed at material level in order to cover all the specific aspects of ceramics (Weibull distribution, brittle behaviour, sub-critical crack growth). Elementary tests and their post-processing methods are described, and recommendations for optimization of the test plan are given in order to have a consistent database. The application of this method is shown in an example in a dedicated article [7]. Then the verification activities to be performed at system level are described. This includes classical verification activities based on the relevant standard (ECSS Verification [4]), plus specific analytical, testing and inspection features. The analysis methodology takes into account the specific behaviour of ceramic materials, especially the statistical distribution of failures (Weibull) and the method to transfer it from elementary data to a full-scale structure. The demonstration of the efficiency of this method is described in a dedicated article [8]. The verification is completed by classical full-scale testing activities. Indications about proof testing, cases of use, and implementation are given, and specific inspection and protection measures are described. These additional activities are necessary to ensure the required reliability. The aim of the guideline is to describe how to reach the same reliability level as for structures made of more classical materials (metals, composites).
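
    The transfer of the Weibull failure statistics from coupon data to a full-scale structure rests on weakest-link volume scaling. A minimal sketch, with illustrative numbers that are not taken from the guideline:

```python
import math

def failure_probability(stress, sigma0, m, volume_ratio=1.0):
    """Weakest-link Weibull failure probability.
    sigma0: characteristic strength of the reference (coupon) volume,
    m: Weibull modulus, volume_ratio: V_structure / V_coupon."""
    return 1.0 - math.exp(-volume_ratio * (stress / sigma0) ** m)

# Illustrative values (not from the guideline): Weibull modulus m = 10,
# coupon characteristic strength 300 MPa, applied stress 180 MPa.
p_coupon = failure_probability(180.0, 300.0, 10)
p_struct = failure_probability(180.0, 300.0, 10, volume_ratio=100.0)
# The larger stressed volume fails more often at the same stress level,
# which is why coupon data alone cannot qualify a full-scale structure.
print(round(p_coupon, 5), round(p_struct, 4))
```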

  15. Requirement Assurance: A Verification Process

    NASA Technical Reports Server (NTRS)

    Alexander, Michael G.

    2011-01-01

    Requirement Assurance is an act of requirement verification that assures the stakeholder or customer that a product requirement has produced its "as realized product" and has been verified with conclusive evidence. Product requirement verification answers the question, "did the product meet the stated specification, performance, or design documentation?" In order to ensure the system was built correctly, the practicing system engineer must verify each product requirement using the verification methods of inspection, analysis, demonstration, or test. The products of these methods are the "verification artifacts" or "closure artifacts", which are the objective evidence needed to prove the product requirements meet the verification success criteria. Institutional direction is given to the system engineer in NPR 7123.1A, NASA Systems Engineering Processes and Requirements, with regard to the requirement verification process. In response, the verification methodology offered in this report meets both the institutional process and requirement-verification best practices.

  16. Verification and Validation of Digitally Upgraded Control Rooms

    SciTech Connect

    Boring, Ronald; Lau, Nathan

    2015-09-01

    As nuclear power plants undertake main control room modernization, a challenge is the lack of a clearly defined human factors process to follow. Verification and validation (V&V) as applied in the nuclear power community has tended to involve efforts such as integrated system validation, which comes at the tail end of the design stage. To fill in guidance gaps and create a step-by-step process for control room modernization, we have developed the Guideline for Operational Nuclear Usability and Knowledge Elicitation (GONUKE). This approach builds on best practices in the software industry, which prescribe an iterative user-centered approach featuring multiple cycles of design and evaluation. Nuclear regulatory guidance for control room design emphasizes summative evaluation—which occurs after the design is complete. In the GONUKE approach, evaluation is also performed at the formative stage of design—early in the design cycle using mockups and prototypes for evaluation. The evaluation may involve expert review (e.g., software heuristic evaluation at the formative stage and design verification against human factors standards like NUREG-0700 at the summative stage). The evaluation may also involve user testing (e.g., usability testing at the formative stage and integrated system validation at the summative stage). An additional, often overlooked component of evaluation is knowledge elicitation, which captures operator insights into the system. In this report we outline these evaluation types across design phases that support the overall modernization process. The objective is to provide industry-suitable guidance for steps to be taken in support of the design and evaluation of a new human-machine interface (HMI) in the control room. We suggest the value of early-stage V&V and highlight how this early-stage V&V can help improve the design process for control room modernization. We argue that there is a need to overcome two shortcomings of V&V in current practice

  17. Nuclear Scans

    MedlinePlus

    Nuclear scans use radioactive substances to see structures and functions inside your body. They use a special ... images. Most scans take 20 to 45 minutes. Nuclear scans can help doctors diagnose many conditions, including ...

  18. ENVIRONMENTAL TECHNOLOGY VERIFICATION PROGRAM

    EPA Science Inventory

    This presentation will be given at the EPA Science Forum 2005 in Washington, DC. The Environmental Technology Verification Program (ETV) was initiated in 1995 to speed implementation of new and innovative commercial-ready environmental technologies by providing objective, 3rd pa...

  19. FPGA Verification Accelerator (FVAX)

    NASA Technical Reports Server (NTRS)

    Oh, Jane; Burke, Gary

    2008-01-01

    Is verification acceleration possible? Increasing the visibility of the internal nodes of the FPGA results in much faster debug time, and forcing internal signals directly allows a problem condition to be set up very quickly. Is this all? No, this is part of a comprehensive effort to improve the JPL FPGA design and V&V process.

  20. Telescope performance verification

    NASA Astrophysics Data System (ADS)

    Swart, Gerhard P.; Buckley, David A. H.

    2004-09-01

    While Systems Engineering appears to be widely applied on the very large telescopes, it is lacking in the development of many of the medium and small telescopes currently in progress. The latter projects rely heavily on the experience of the project team, verbal requirements and conjecture based on the successes and failures of other telescopes. Furthermore, it is considered an unaffordable luxury to "close-the-loop" by carefully analysing and documenting the requirements and then verifying the telescope's compliance with them. In this paper the authors contend that a Systems Engineering approach is a keystone in the development of any telescope and that verification of the telescope's performance is not only an important management tool but also forms the basis upon which successful telescope operation can be built. The development of the Southern African Large Telescope (SALT) has followed such an approach and is now in the verification phase of its development. Parts of the SALT verification process will be discussed in some detail to illustrate the suitability of this approach, including oversight by the telescope shareholders, recording of requirements and results, design verification and performance testing. Initial test results will be presented where appropriate.

  1. Multibody modeling and verification

    NASA Technical Reports Server (NTRS)

    Wiens, Gloria J.

    1989-01-01

    A summary of a ten-week project on flexible multibody modeling, verification, and control is presented. Emphasis was on the need for experimental verification. A literature survey was conducted to gather information on the existence of experimental work related to flexible multibody systems. The first portion of the assigned task encompassed the modeling aspects of flexible multibodies that can undergo large angular displacements. Research in the area of modeling aspects was also surveyed, with special attention given to the component mode approach. Resulting from this is a research plan on various modeling aspects to be investigated over the next year. The relationship between the large angular displacements, boundary conditions, mode selection, and system modes is of particular interest. The other portion of the assigned task was the generation of a test plan for experimental verification of analytical and/or computer analysis techniques used for flexible multibody systems. Based on current and expected frequency ranges of flexible multibody systems to be used in space applications, an initial test article was selected and designed. A preliminary TREETOPS computer analysis was run to ensure frequency content in the low frequency range, 0.1 to 50 Hz. The initial specifications of experimental measurement and instrumentation components were also generated. Resulting from this effort is the initial multi-phase plan for a Ground Test Facility of Flexible Multibody Systems for Modeling Verification and Control. The plan focusses on the Multibody Modeling and Verification (MMV) Laboratory. General requirements of the Unobtrusive Sensor and Effector (USE) and the Robot Enhancement (RE) laboratories were considered during the laboratory development.

  2. Conceptual design. Final report: TFE Verification Program

    SciTech Connect

    Not Available

    1994-03-01

    This report documents the TFE Conceptual Design, which provided the design guidance for the TFE Verification Program. The primary goals of this design effort were: (1) establish the conceptual design of an in-core thermionic reactor for a 2 MW(e) space nuclear power system with a 7-year operating lifetime; (2) demonstrate scalability of the above concept over the output power range of 500 kW(e) to 5 MW(e); and (3) define the TFE which is the basis for the 2 MW(e) reactor design. This TFE specification provided the basis for the test program. These primary goals were achieved. The technical approach taken in the conceptual design effort is discussed in Section 2, and the results are discussed in Section 3. The remainder of this introduction draws a perspective on the role that this conceptual design task played in the TFE Verification Program.

  3. Natural Analogues - One Way to Help Build Public Confidence in the Predicted Performance of a Mined Geologic Repository for Nuclear Waste

    SciTech Connect

    Stuckless, J. S.

    2002-02-26

    The general public needs to have a way to judge the predicted long-term performance of the potential high-level nuclear waste repository at Yucca Mountain. The applicability and reliability of mathematical models used to make this prediction are neither easily understood nor accepted by the public. Natural analogues can provide the average person with a tool to assess the predicted performance and other scientific conclusions. For example, hydrologists with the Yucca Mountain Project have predicted that most of the water moving through the unsaturated zone at Yucca Mountain, Nevada will move through the host rock and around tunnels. Thus, seepage into tunnels is predicted to be a small percentage of available infiltration. This hypothesis can be tested experimentally and with some quantitative analogues. It can also be tested qualitatively using a variety of analogues such as (1) well-preserved Paleolithic to Neolithic paintings in caves and rock shelters, (2) biological remains preserved in caves and rock shelters, and (3) artifacts and paintings preserved in man-made underground openings. These examples can be found in materials that are generally available to the non-scientific public and can demonstrate the surprising degree of preservation of fragile and easily destroyed materials for very long periods of time within the unsaturated zone.

  4. Improved Verification for Aerospace Systems

    NASA Technical Reports Server (NTRS)

    Powell, Mark A.

    2008-01-01

    Aerospace systems are subject to many stringent performance requirements to be verified with low risk. This report investigates verification planning using conditional approaches vice the standard classical statistical methods, and usage of historical surrogate data for requirement validation and in verification planning. The example used in this report to illustrate the results of these investigations is a proposed mission assurance requirement with the concomitant maximum acceptable verification risk for the NASA Constellation Program Orion Launch Abort System (LAS). This report demonstrates the following improvements: 1) verification planning using conditional approaches vice classical statistical methods results in plans that are more achievable and feasible; 2) historical surrogate data can be used to bound validation of performance requirements; and, 3) incorporation of historical surrogate data in verification planning using conditional approaches produces even less costly and more reasonable verification plans. The procedures presented in this report may produce similar improvements and cost savings in verification for any stringent performance requirement for an aerospace system.

  5. Design verification and validation plan for the cold vacuum drying facility

    SciTech Connect

    NISHIKAWA, L.D.

    1999-06-03

    The Cold Vacuum Drying Facility (CVDF) provides the required process systems, supporting equipment, and facilities needed for drying spent nuclear fuel removed from the K Basins. This document presents both the completed and planned design verification and validation activities.

  6. TFE Verification Program

    SciTech Connect

    Not Available

    1990-03-01

    The objective of this semiannual progress report is to summarize the technical results obtained during the latest reporting period. The information presented herein includes evaluated test data, design evaluations, the results of analyses, and the significance of results. The program objective is to demonstrate the technology readiness of a TFE suitable for use as the basic element in a thermionic reactor with electric power output in the 0.5 to 5.0 MW(e) range and a full-power life of 7 years. The TFE Verification Program builds directly on the technology and data base developed in the 1960s and 1970s in an AEC/NASA program, and in the SP-100 program conducted in 1983, 1984, and 1985. In the SP-100 program, the thermionic concept was found attractive, but concern was expressed over the lack of fast-reactor irradiation data. The TFE Verification Program addresses this concern. The general logic and strategy of the program to achieve its objectives is shown in Fig. 1-1. Five prior programs form the basis for the TFE Verification Program: (1) the AEC/NASA program of the 1960s and early 1970s; (2) the SP-100 concept development program; (3) the SP-100 thermionic technology program; (4) the thermionic irradiations program in TRIGA in FY-86; and (5) the Thermionic Technology Program in 1986 and 1987. 18 refs., 64 figs., 43 tabs.

  7. 77 FR 28572 - Notice of Submission for OMB Review; Federal Student Aid; Loan Verification Certificate for...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-05-15

    ... Notice of Submission for OMB Review; Federal Student Aid; Loan Verification Certificate for Special Direct Consolidation Loans SUMMARY: This Loan Verification Certificate (LVC) will serve as the means by... the White House in an October 25, 2011 fact sheet titled ``Help Americans Manage Student Loan...

  8. 10 CFR 72.79 - Facility information and verification.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... 10 Energy 2 2011-01-01 2011-01-01 false Facility information and verification. 72.79 Section 72.79 Energy NUCLEAR REGULATORY COMMISSION (CONTINUED) LICENSING REQUIREMENTS FOR THE INDEPENDENT STORAGE OF... the International Atomic Energy Agency (IAEA) and take other action as necessary to implement the...

  9. 10 CFR 63.47 - Facility information and verification.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 10 Energy 2 2010-01-01 2010-01-01 false Facility information and verification. 63.47 Section 63.47 Energy NUCLEAR REGULATORY COMMISSION (CONTINUED) DISPOSAL OF HIGH-LEVEL RADIOACTIVE WASTES IN A GEOLOGIC REPOSITORY AT YUCCA MOUNTAIN, NEVADA Licenses Us/iaea Safeguards Agreement § 63.47 Facility information...

  10. 10 CFR 63.47 - Facility information and verification.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... 10 Energy 2 2013-01-01 2013-01-01 false Facility information and verification. 63.47 Section 63.47 Energy NUCLEAR REGULATORY COMMISSION (CONTINUED) DISPOSAL OF HIGH-LEVEL RADIOACTIVE WASTES IN A GEOLOGIC REPOSITORY AT YUCCA MOUNTAIN, NEVADA Licenses Us/iaea Safeguards Agreement § 63.47 Facility information...

  11. 10 CFR 63.47 - Facility information and verification.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... 10 Energy 2 2012-01-01 2012-01-01 false Facility information and verification. 63.47 Section 63.47 Energy NUCLEAR REGULATORY COMMISSION (CONTINUED) DISPOSAL OF HIGH-LEVEL RADIOACTIVE WASTES IN A GEOLOGIC REPOSITORY AT YUCCA MOUNTAIN, NEVADA Licenses Us/iaea Safeguards Agreement § 63.47 Facility information...

  12. 10 CFR 63.47 - Facility information and verification.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... 10 Energy 2 2011-01-01 2011-01-01 false Facility information and verification. 63.47 Section 63.47 Energy NUCLEAR REGULATORY COMMISSION (CONTINUED) DISPOSAL OF HIGH-LEVEL RADIOACTIVE WASTES IN A GEOLOGIC REPOSITORY AT YUCCA MOUNTAIN, NEVADA Licenses Us/iaea Safeguards Agreement § 63.47 Facility information...

  13. 10 CFR 63.47 - Facility information and verification.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... 10 Energy 2 2014-01-01 2014-01-01 false Facility information and verification. 63.47 Section 63.47 Energy NUCLEAR REGULATORY COMMISSION (CONTINUED) DISPOSAL OF HIGH-LEVEL RADIOACTIVE WASTES IN A GEOLOGIC REPOSITORY AT YUCCA MOUNTAIN, NEVADA Licenses Us/iaea Safeguards Agreement § 63.47 Facility information...

  14. 10 CFR 72.79 - Facility information and verification.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 10 Energy 2 2010-01-01 2010-01-01 false Facility information and verification. 72.79 Section 72.79 Energy NUCLEAR REGULATORY COMMISSION (CONTINUED) LICENSING REQUIREMENTS FOR THE INDEPENDENT STORAGE OF... the International Atomic Energy Agency (IAEA) and take other action as necessary to implement the...

  15. Continuous verification using multimodal biometrics.

    PubMed

    Sim, Terence; Zhang, Sheng; Janakiraman, Rajkumar; Kumar, Sandeep

    2007-04-01

    Conventional verification systems, such as those controlling access to a secure room, do not usually require the user to reauthenticate himself for continued access to the protected resource. This may not be sufficient for high-security environments in which the protected resource needs to be continuously monitored for unauthorized use. In such cases, continuous verification is needed. In this paper, we present the theory, architecture, implementation, and performance of a multimodal biometrics verification system that continuously verifies the presence of a logged-in user. Two modalities are currently used--face and fingerprint--but our theory can be readily extended to include more modalities. We show that continuous verification imposes additional requirements on multimodal fusion when compared to conventional verification systems. We also argue that the usual performance metrics of false accept and false reject rates are insufficient yardsticks for continuous verification and propose new metrics against which we benchmark our system. PMID:17299225
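
    The core idea, that evidence of the user's presence must decay between observations, can be sketched in a toy model. This is an illustration of the concept only, not the paper's fusion algorithm; the half-life parameter and max-fusion rule are assumptions made for the sketch.

```python
import math

def fused_trust(face_score, finger_score, dt_face, dt_finger, half_life=30.0):
    """Toy continuous-verification fusion: each modality's match score
    decays with the time (seconds) since its last observation, so stale
    evidence counts less; the best decayed score is the current trust
    that the logged-in user is still present."""
    def decay(score, dt):
        return score * math.exp(-math.log(2) * dt / half_life)
    return max(decay(face_score, dt_face), decay(finger_score, dt_finger))

# A fresh fingerprint observation dominates a stale face observation.
print(round(fused_trust(0.9, 0.8, dt_face=60.0, dt_finger=5.0), 3))
```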

  16. Using color for face verification

    NASA Astrophysics Data System (ADS)

    Leszczynski, Mariusz

    2009-06-01

    This paper presents research on the importance of color information in a face verification system. The four most popular color spaces were used: RGB, YIQ, YCbCr, and luminance, compared using four types of discriminant classifiers. Experiments conducted on facial databases with complex backgrounds, different poses, and lighting conditions show that color information can improve verification accuracy compared to the traditionally used luminance information. To achieve the best performance we recommend using multi-frame verification encoded in the YIQ color space.
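
    The RGB-to-YIQ conversion referred to above is the standard NTSC transform; a minimal sketch:

```python
def rgb_to_yiq(r, g, b):
    """Standard NTSC RGB -> YIQ conversion (components in [0, 1]).
    Y carries luminance; I and Q carry the chrominance information
    that pure-luminance verification discards."""
    y = 0.299 * r + 0.587 * g + 0.114 * b
    i = 0.596 * r - 0.274 * g - 0.322 * b
    q = 0.211 * r - 0.523 * g + 0.312 * b
    return y, i, q

# Pure white: full luminance, near-zero chrominance.
print(rgb_to_yiq(1.0, 1.0, 1.0))
```

The same coefficients appear in Python's standard `colorsys.rgb_to_yiq`, so the hand-rolled version here is mainly to make the matrix explicit.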

  17. Quantum money with classical verification

    SciTech Connect

    Gavinsky, Dmitry

    2014-12-04

    We propose and construct a quantum money scheme that allows verification through classical communication with a bank. This is the first demonstration that a secure quantum money scheme exists that does not require quantum communication for coin verification. Our scheme is secure against adaptive adversaries; this property is not directly related to the possibility of classical verification, but none of the earlier quantum money constructions is known to possess it.

  18. Parent Tookit: Homework Help. Helpful Tips.

    ERIC Educational Resources Information Center

    All Kinds of Minds, 2006

    2006-01-01

    This check list contains tips for parents to help students reinforce and build upon what children learn at school: (1) Set a consistent time each day for doing homework; (2) Encourage children to make a homework checklist; (3) Provide assistance to help get started on a task; (4) Help children make a list of all needed materials before starting…

  19. A Cherenkov viewing device for used-fuel verification

    NASA Astrophysics Data System (ADS)

    Attas, E. M.; Chen, J. D.; Young, G. J.

    1990-12-01

    A Cherenkov viewing device (CVD) has been developed to help verify declared inventories of used nuclear fuel stored in water bays. The device detects and amplifies the faint ultraviolet Cherenkov glow from the water surrounding the fuel, producing a real-time visible image on a phosphor screen. Quartz optics, a UV-pass filter and a microchannel-plate image-intensifier tube serve to form the image, which can be photographed or viewed directly through an eyepiece. Normal fuel bay lighting does not interfere with the Cherenkov light image. The CVD has been successfully used to detect anomalous PWR, BWR and CANDU (CANada Deuterium Uranium: registered trademark) fuel assemblies in the presence of normal-burnup assemblies stored in used-fuel bays. The latest version of the CVD, known as Mark IV, is being used by inspectors from the International Atomic Energy Agency for verification of light-water power-reactor fuel. Its design and operation are described, together with plans for further enhancements of the instrumentation.

  20. Verification of LHS distributions.

    SciTech Connect

    Swiler, Laura Painton

    2006-04-01

    This document provides verification test results for the normal, lognormal, and uniform distributions that are used in Sandia's Latin Hypercube Sampling (LHS) software. The purpose of this testing is to verify that the sample values being generated in LHS are distributed according to the desired distribution types. The testing of distribution correctness is done by examining summary statistics, graphical comparisons using quantile-quantile plots, and formal statistical tests such as the Chi-square test, the Kolmogorov-Smirnov test, and the Anderson-Darling test. The overall results from the testing indicate that the generation of normal, lognormal, and uniform distributions in LHS is acceptable.
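
    The kind of check described can be illustrated with a one-sample Kolmogorov-Smirnov statistic computed from scratch; this is a sketch of the test family used, not the LHS test harness itself.

```python
import math
import random

def ks_statistic(samples, cdf):
    """One-sample Kolmogorov-Smirnov statistic: the largest distance
    between the empirical CDF of the samples and the hypothesized CDF."""
    xs = sorted(samples)
    n = len(xs)
    return max(max((i + 1) / n - cdf(x), cdf(x) - i / n)
               for i, x in enumerate(xs))

def normal_cdf(x):
    # CDF of the standard normal distribution via the error function
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

random.seed(0)
samples = [random.gauss(0.0, 1.0) for _ in range(2000)]
d = ks_statistic(samples, normal_cdf)
print(round(d, 4))  # a small statistic is consistent with N(0, 1)
```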

  1. Constitutional and legal implications of arms control verification technologies

    SciTech Connect

    Tanzman, E.A.; Haffenden, R.

    1992-09-01

    United States law can both help and hinder the use of instrumentation as a component of arms control verification in this country. It can foster the general use of sophisticated verification technologies, where such devices are consistent with the value attached to privacy by the Fourth Amendment to the United States Constitution. On the other hand, law can hinder reliance on devices that cross this constitutional line, or where such technology itself threatens health, safety, or environment as such threats are defined in federal statutes. The purpose of this conference paper is to explain some of the lessons that have been learned about the relationship between law and verification technologies, in the hope that law can help more than hinder. This paper has three parts. In order to start with a common understanding, Part 1 will briefly describe the hierarchy of treaties, the Constitution, federal statutes, and state and local laws. Part 2 will discuss how the specific constitutional requirement that the government respect the right of privacy in all of its endeavors may affect the use of verification technologies. Part 3 will explain the environmental law constraints on verification technology as exemplified by the system of on-site sampling embodied in the current Rolling Text of the Draft Chemical Weapons Convention.

  2. National Center for Nuclear Security: The Nuclear Forensics Project (F2012)

    SciTech Connect

    Klingensmith, A. L.

    2012-03-21

    These presentation visuals introduce the National Center for Nuclear Security. Its chartered mission is to enhance the Nation's verification and detection capabilities in support of nuclear arms control and nonproliferation through R&D activities at the NNSS. It has three focus areas: Treaty Verification Technologies, Nonproliferation Technologies, and Technical Nuclear Forensics. The objectives of nuclear forensics are to reduce uncertainty in the nuclear forensics process and improve the scientific defensibility of nuclear forensics conclusions when applied to near-surface nuclear detonations. Research is in four key areas: nuclear physics, debris collection and analysis, prompt diagnostics, and radiochemistry.

  3. Software Verification and Validation Procedure

    SciTech Connect

    Olund, Thomas S.

    2008-09-15

    This Software Verification and Validation procedure provides the action steps for the Tank Waste Information Network System (TWINS) testing process. The primary objective of the testing process is to provide assurance that the software functions as intended, and meets the requirements specified by the client. Verification and validation establish the primary basis for TWINS software product acceptance.

  4. Deductive Verification of Cryptographic Software

    NASA Technical Reports Server (NTRS)

    Almeida, Jose Barcelar; Barbosa, Manuel; Pinto, Jorge Sousa; Vieira, Barbara

    2009-01-01

    We report on the application of an off-the-shelf verification platform to the RC4 stream cipher cryptographic software implementation (as available in the openSSL library), and introduce a deductive verification technique based on self-composition for proving the absence of error propagation.

  5. HDL to verification logic translator

    NASA Technical Reports Server (NTRS)

    Gambles, J. W.; Windley, P. J.

    1992-01-01

    The increasingly higher number of transistors possible in VLSI circuits compounds the difficulty of ensuring correct designs. As the number of test cases required to exhaustively simulate a circuit design explodes, a better method is required to confirm the absence of design faults. Formal verification methods provide a way to prove, using logic, that a circuit structure correctly implements its specification. Before verification is accepted by VLSI design engineers, the stand-alone verification tools in use in the research community must be integrated with the CAD tools used by the designers. One problem facing the acceptance of formal verification into circuit design methodology is that the structural circuit descriptions used by the designers are not appropriate for verification work, and those required for verification lack some of the features needed for design. We offer a solution to this dilemma: an automatic translation from the designers' HDL models into definitions for the higher-order logic (HOL) verification system. The translated definitions become the low-level basis of circuit verification, which in turn increases the designers' confidence in the correctness of higher-level behavioral models.

  6. Nuclear materials accounting, helping the facility operator

    SciTech Connect

    Barnes, J.W.; Thomas, K.E.

    1986-01-01

    A modern materials control and accounting (MC&A) system can provide major benefits to production personnel. It can enhance understanding of process system performance, localize and reconcile material losses, and identify instruments that are out of calibration or malfunctioning. Examples of these MC&A system applications are given. We show how operations personnel can use an MC&A system to their advantage rather than letting the MC&A system take advantage of them.
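
    The basic balance such an MC&A system maintains can be sketched as follows; all quantities and uncertainties are invented. The inventory difference (ID) is beginning inventory plus additions minus removals minus ending inventory, and an ID outside the measurement-uncertainty band flags a loss, a bad measurement, or an out-of-calibration instrument.

```python
import math

def inventory_difference(begin, additions, removals, end):
    """ID = beginning inventory + additions - removals - ending inventory."""
    return begin + sum(additions) - sum(removals) - end

def significant(id_value, sigmas):
    """Compare |ID| to twice the combined standard uncertainty
    (root sum of squares of the individual measurement sigmas)."""
    limit = 2.0 * math.sqrt(sum(s * s for s in sigmas))
    return abs(id_value) > limit

id_kg = inventory_difference(100.0, [25.0], [30.0], 94.7)
print(round(id_kg, 2))                            # 0.3
print(significant(id_kg, [0.5, 0.2, 0.3, 0.5]))   # False: within uncertainty
```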

  7. Freeze verification: time for a fresh approach

    SciTech Connect

    Paine, C.

    1983-01-01

    The administration's claim that some elements of a comprehensive nuclear freeze are unverifiable does not specify the nature of those elements and whether they represent a real threat to national security if we trusted the USSR to comply. The author contends that clandestine development of new weapons will have little strategic effect since both sides already have total destructive power. The risks of noncompliance are largely political and less than the risks of continued arms buildup. Since the USSR would also want the US to be bound by freeze terms, deterrence would come from mutual benefit. Hardliners argue that cheating is easier in a closed society; that our democracy would tend to relax and the USSR would move ahead with its plans for world domination. The author argues that, over time, a freeze would diminish Soviet confidence in its nuclear war fighting capabilities and that adequate verification is possible with monitoring and warning arrangements. (DCK)

  8. Verification and validation of RADMODL Version 1.0

    SciTech Connect

    Kimball, K.D.

    1993-03-01

    RADMODL is a system of linked computer codes designed to calculate the radiation environment following an accident in which nuclear materials are released. The RADMODL code and the corresponding Verification and Validation (V&V) calculations (Appendix A), were developed for Westinghouse Savannah River Company (WSRC) by EGS Corporation (EGS). Each module of RADMODL is an independent code and was verified separately. The full system was validated by comparing the output of the various modules with the corresponding output of a previously verified version of the modules. The results of the verification and validation tests show that RADMODL correctly calculates the transport of radionuclides and radiation doses. As a result of this verification and validation effort, RADMODL Version 1.0 is certified for use in calculating the radiation environment following an accident.

  9. Bibliography for Verification and Validation in Computational Simulations

    SciTech Connect

    Oberkampf, W.L.

    1998-10-01

    A bibliography has been compiled dealing with the verification and validation of computational simulations. The references listed in this bibliography are concentrated in the field of computational fluid dynamics (CFD). However, references from the following fields are also included: operations research, heat transfer, solid dynamics, software quality assurance, software accreditation, military systems, and nuclear reactor safety. This bibliography, containing 221 references, is not meant to be comprehensive. It was compiled during the last ten years in response to the author's interest and research in the methodology for verification and validation. The emphasis in the bibliography is in the following areas: philosophy of science underpinnings, development of terminology and methodology, high accuracy solutions for CFD verification, experimental datasets for CFD validation, and the statistical quantification of model validation. This bibliography should provide a starting point for individual researchers in many fields of computational simulation in science and engineering.

  10. VEG-01: Veggie Hardware Verification Testing

    NASA Technical Reports Server (NTRS)

    Massa, Gioia; Newsham, Gary; Hummerick, Mary; Morrow, Robert; Wheeler, Raymond

    2013-01-01

    The Veggie plant/vegetable production system is scheduled to fly on ISS at the end of2013. Since much of the technology associated with Veggie has not been previously tested in microgravity, a hardware validation flight was initiated. This test will allow data to be collected about Veggie hardware functionality on ISS, allow crew interactions to be vetted for future improvements, validate the ability of the hardware to grow and sustain plants, and collect data that will be helpful to future Veggie investigators as they develop their payloads. Additionally, food safety data on the lettuce plants grown will be collected to help support the development of a pathway for the crew to safely consume produce grown on orbit. Significant background research has been performed on the Veggie plant growth system, with early tests focusing on the development of the rooting pillow concept, and the selection of fertilizer, rooting medium and plant species. More recent testing has been conducted to integrate the pillow concept into the Veggie hardware and to ensure that adequate water is provided throughout the growth cycle. Seed sanitation protocols have been established for flight, and hardware sanitation between experiments has been studied. Methods for shipping and storage of rooting pillows and the development of crew procedures and crew training videos for plant activities on-orbit have been established. Science verification testing was conducted and lettuce plants were successfully grown in prototype Veggie hardware, microbial samples were taken, plant were harvested, frozen, stored and later analyzed for microbial growth, nutrients, and A TP levels. An additional verification test, prior to the final payload verification testing, is desired to demonstrate similar growth in the flight hardware and also to test a second set of pillows containing zinnia seeds. Issues with root mat water supply are being resolved, with final testing and flight scheduled for later in 2013.

  11. Verification of VENTSAR

    SciTech Connect

    Simpkins, A.A.

    1995-01-01

    The VENTSAR code is an upgraded and improved version of the VENTX code, which estimates concentrations on or near a building from a release at a nearby location. The code calculates the concentrations either for a given meteorological exceedance probability or for a given stability and wind speed combination. A single building can be modeled which lies in the path of the plume, or a penthouse can be added to the top of the building. Plume rise may also be considered. Release types can be either chemical or radioactive. Downwind concentrations are determined at user-specified incremental distances. This verification report was prepared to demonstrate that VENTSAR is properly executing all algorithms and transferring data. Hand calculations were also performed to ensure proper application of methodologies.
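
    A hand calculation of the kind used to verify such codes can be sketched with the textbook Gaussian-plume formula for centerline, ground-level concentration from an elevated release. This is the standard formula, not VENTSAR's building-wake methodology, and all numbers are invented:

```python
import math

def centerline_concentration(Q, u, sigma_y, sigma_z, H):
    """C = Q / (pi * u * sigma_y * sigma_z) * exp(-H^2 / (2 * sigma_z^2)).
    Q: release rate, u: wind speed, sigma_y/sigma_z: dispersion
    coefficients at the downwind distance, H: effective release height."""
    return Q / (math.pi * u * sigma_y * sigma_z) * math.exp(-H**2 / (2 * sigma_z**2))

# Sanity check of the kind used in hand verification: a ground release
# (H = 0) must exceed an elevated one with all other parameters equal.
c_ground = centerline_concentration(1.0, 2.0, 10.0, 5.0, 0.0)
c_stack = centerline_concentration(1.0, 2.0, 10.0, 5.0, 20.0)
print(c_ground > c_stack)  # True
```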

  12. Online fingerprint verification.

    PubMed

    Upendra, K; Singh, S; Kumar, V; Verma, H K

    2007-01-01

    As organizations search for more secure authentication methods for user access, e-commerce, and other security applications, biometrics is gaining increasing attention. With an increasing emphasis on emerging automatic personal identification applications, fingerprint-based identification is becoming more popular. The most widely used fingerprint representation is the minutiae-based representation. The main drawback of this representation is that it does not utilize a significant component of the rich discriminatory information available in fingerprints: local ridge structures cannot be completely characterized by minutiae. It is also difficult to quickly match two fingerprint images containing different numbers of unregistered minutiae points. In this study a filter-bank-based representation, which eliminates these weaknesses, is implemented, and the overall performance of the developed system is tested. The results show that this system can be used effectively for secure online verification applications. PMID:17365425

  13. Exploring the Possible Use of Information Barriers for future Biological Weapons Verification Regimes

    SciTech Connect

    Luke, S J

    2011-12-20

    This report describes a path forward for implementing information barriers in a future generic biological arms-control verification regime. Information barriers have become a staple of discussion in the area of arms-control verification approaches for nuclear weapons and components. An information barrier, used with a measurement system, allows the determination that an item has sensitive characteristics without releasing any of the sensitive information. Over the last 15 years the United States (with the Russian Federation) has led the development of information barriers in the area of the verification of nuclear weapons and nuclear components. The work of the US and the Russian Federation has prompted other states (e.g., the UK and Norway) to consider the merits of information barriers for possible verification regimes. In the context of a biological weapons control verification regime, the dual-use nature of biotechnology will require protection of sensitive information while allowing for the verification of treaty commitments. A major question that has arisen is whether, in a biological weapons verification regime, the presence or absence of a weapon pathogen can be determined without revealing any information about possible sensitive or proprietary information contained in the genetic materials being declared under a verification regime. This study indicates that a verification regime could be constructed using a small number of pathogens that spans the range of known biological weapons agents. Since the number of possible pathogens is small, it is possible and prudent to treat these pathogens as analogies to attributes in a nuclear verification regime. This study has determined that there may be some information that needs to be protected in a biological weapons control verification regime. To protect this information, the study concludes that the Lawrence Livermore Microbial Detection Array may be a suitable technology for the detection of the
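
    The information-barrier concept itself is simple to sketch: the measurement system sees the sensitive raw data, but only an unclassified attribute result crosses the barrier to the inspector. The thresholds and values below are invented stand-ins (in the nuclear case an attribute might be "fissile mass above a threshold"; in the biological analogy, "declared pathogen present"):

```python
def attribute_result(raw_measurement, threshold):
    """Inside the barrier: full-fidelity comparison on sensitive data."""
    return raw_measurement >= threshold

def inspector_display(raw_measurement, threshold):
    """Outside the barrier: only a pass/fail light, never the raw value."""
    return "PASS" if attribute_result(raw_measurement, threshold) else "FAIL"

print(inspector_display(5.2, 4.0))  # PASS
print(inspector_display(1.1, 4.0))  # FAIL
```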

  14. 40 CFR 1065.390 - PM balance verifications and weighing process verification.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... 40 Protection of Environment 34 2012-07-01 2012-07-01 false PM balance verifications and weighing... § 1065.390 PM balance verifications and weighing process verification. (a) Scope and frequency. This section describes three verifications. (1) Independent verification of PM balance performance within...

  15. 40 CFR 1065.390 - PM balance verifications and weighing process verification.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... 40 Protection of Environment 33 2014-07-01 2014-07-01 false PM balance verifications and weighing... § 1065.390 PM balance verifications and weighing process verification. (a) Scope and frequency. This section describes three verifications. (1) Independent verification of PM balance performance within...

  16. 40 CFR 1065.390 - PM balance verifications and weighing process verification.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... 40 Protection of Environment 34 2013-07-01 2013-07-01 false PM balance verifications and weighing... § 1065.390 PM balance verifications and weighing process verification. (a) Scope and frequency. This section describes three verifications. (1) Independent verification of PM balance performance within...

  17. 40 CFR 1065.390 - PM balance verifications and weighing process verification.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 40 Protection of Environment 32 2010-07-01 2010-07-01 false PM balance verifications and weighing... § 1065.390 PM balance verifications and weighing process verification. (a) Scope and frequency. This section describes three verifications. (1) Independent verification of PM balance performance within...

  18. 40 CFR 1065.390 - PM balance verifications and weighing process verification.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... 40 Protection of Environment 33 2011-07-01 2011-07-01 false PM balance verifications and weighing... § 1065.390 PM balance verifications and weighing process verification. (a) Scope and frequency. This section describes three verifications. (1) Independent verification of PM balance performance within...

  19. Verification of Space Weather Forecasts using Terrestrial Weather Approaches

    NASA Astrophysics Data System (ADS)

    Henley, E.; Murray, S.; Pope, E.; Stephenson, D.; Sharpe, M.; Bingham, S.; Jackson, D.

    2015-12-01

    The Met Office Space Weather Operations Centre (MOSWOC) provides a range of 24/7 operational space weather forecasts, alerts, and warnings, which provide valuable information on space weather that can degrade electricity grids, radio communications, and satellite electronics. Forecasts issued include arrival times of coronal mass ejections (CMEs), and probabilistic forecasts for flares, geomagnetic storm indices, and energetic particle fluxes and fluences. These forecasts are produced twice daily using a combination of output from models such as Enlil, near-real-time observations, and forecaster experience. Verification of forecasts is crucial for users, researchers, and forecasters to understand the strengths and limitations of forecasts, and to assess forecaster added value. To this end, the Met Office (in collaboration with Exeter University) has been adapting verification techniques from terrestrial weather, and has been working closely with the International Space Environment Service (ISES) to standardise verification procedures. We will present the results of part of this work, analysing forecast and observed CME arrival times, assessing skill using 2x2 contingency tables. These MOSWOC forecasts can be objectively compared to those produced by the NASA Community Coordinated Modelling Center - a useful benchmark. This approach cannot be taken for the other forecasts, as they are probabilistic and categorical (e.g., geomagnetic storm forecasts give probabilities of exceeding levels from minor to extreme). We will present appropriate verification techniques being developed to address these forecasts, such as the rank probability skill score, and comparing forecasts against climatology and persistence benchmarks. As part of this, we will outline the use of discrete-time Markov chains to assess and improve the performance of our geomagnetic storm forecasts.
We will also discuss work to adapt a terrestrial verification visualisation system to space weather, to help
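
    The 2x2 contingency-table approach mentioned above can be sketched directly; the counts below are invented, and the scores (probability of detection, false alarm ratio, Heidke skill score) are the standard ones derived from such a table:

```python
def contingency_scores(hits, misses, false_alarms, correct_negatives):
    """Scores from a 2x2 (forecast yes/no vs. observed yes/no) table."""
    n = hits + misses + false_alarms + correct_negatives
    pod = hits / (hits + misses)                # probability of detection
    far = false_alarms / (hits + false_alarms)  # false alarm ratio
    # Heidke skill score: correct forecasts relative to random chance.
    expected = ((hits + misses) * (hits + false_alarms)
                + (correct_negatives + misses)
                * (correct_negatives + false_alarms)) / n
    hss = (hits + correct_negatives - expected) / (n - expected)
    return pod, far, hss

pod, far, hss = contingency_scores(hits=20, misses=5, false_alarms=10,
                                   correct_negatives=65)
print(round(pod, 3), round(far, 3), round(hss, 3))
```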

  20. Extremely accurate sequential verification of RELAP5-3D

    DOE PAGESBeta

    Mesina, George L.; Aumiller, David L.; Buschman, Francis X.

    2015-11-19

    Large computer programs like RELAP5-3D solve complex systems of governing, closure, and special process equations to model the underlying physics of nuclear power plants. Further, these programs incorporate many other features for physics, input, output, data management, user interaction, and post-processing. For software quality assurance, the code must be verified and validated before being released to users. For RELAP5-3D, verification and validation are restricted to nuclear power plant applications. Verification means ensuring that the program is built right by checking that it meets its design specifications, comparing coding to algorithms and equations, and comparing calculations against analytical solutions and the method of manufactured solutions. Sequential verification performs these comparisons initially, but thereafter only compares code calculations between consecutive code versions to demonstrate that no unintended changes have been introduced. Recently, an automated, highly accurate sequential verification method has been developed for RELAP5-3D. The method also provides tests confirming that no unintended consequences result from code development in the following code capabilities: repeating a timestep advancement, continuing a run from a restart file, running multiple cases in a single code execution, and modes of coupled/uncoupled operation. In conclusion, mathematical analyses of the adequacy of the checks used in the comparisons are provided.
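
    The sequential-verification comparison described above can be sketched as follows; the variable names, values, and zero-tolerance comparison are illustrative, not the RELAP5-3D implementation:

```python
def compare_runs(baseline, candidate, tol=0.0):
    """Compare a candidate code version's results against the previous
    version's archived results; return (variable, step) divergences."""
    diffs = []
    for var, base_series in baseline.items():
        cand_series = candidate[var]
        for step, (b, c) in enumerate(zip(base_series, cand_series)):
            if abs(b - c) > tol:
                diffs.append((var, step))
    return diffs

v1 = {"pressure": [15.0, 15.2, 15.1], "void_fraction": [0.0, 0.01, 0.02]}
v2_ok = {"pressure": [15.0, 15.2, 15.1], "void_fraction": [0.0, 0.01, 0.02]}
v2_bad = {"pressure": [15.0, 15.2, 15.3], "void_fraction": [0.0, 0.01, 0.02]}

print(compare_runs(v1, v2_ok))   # [] -> no unintended changes
print(compare_runs(v1, v2_bad))  # [('pressure', 2)]
```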

  2. Neptunium flow-sheet verification at reprocessing plants

    SciTech Connect

    Rance, P.; Chesnay, B.; Killeen, T.; Murray, M.; Nikkinen, M.; Petoe, A.; Plumb, J.; Saukkonen, H.

    2007-07-01

    Due to their fissile nature, neptunium and americium have at least a theoretical potential application as nuclear explosives, and their proliferation potential was considered by the IAEA in studies in the late 1990s. This work was motivated by an increased awareness of the proliferation potential of americium and neptunium and a number of emerging projects in peaceful nuclear programmes which could result in an increase in the available quantities of these minor actinides. The studies culminated in proposals for various voluntary measures, including the reporting of international transfers of separated americium and neptunium, declarations concerning the amount of separated neptunium and americium held by states, and the application of flow-sheet verification to ensure that facilities capable of separating americium or neptunium are operated in a manner consistent with that declared. This paper discusses the issue of neptunium flow-sheet verification in reprocessing plants. The proliferation potential of neptunium is first briefly discussed, and then the chemistry of neptunium relevant to reprocessing plants is described with a view to indicating a number of issues relevant to the verification of neptunium flow-sheets. Finally, the scope of verification activities is discussed, including analysis of process and engineering design information, plant monitoring and sampling, and the potential application of containment and surveillance measures. (authors)

  3. Proceedings of the array signal processing symposium: Treaty Verification Program

    SciTech Connect

    Harris, D.B.

    1988-02-01

    A common theme underlying the research these groups conduct is the use of propagating waves to detect, locate, image, or otherwise identify features of the environment significant to their applications. The applications considered in this symposium are verification of nuclear test ban treaties, non-destructive evaluation (NDE) of manufactured components, and sonar and electromagnetic target acquisition and tracking. These proceedings cover just the first two topics. In these applications, arrays of sensors are used to detect propagating waves and to measure the characteristics that permit interpretation. The reason for using sensor arrays, which are inherently more expensive than single-sensor systems, is twofold. First, by combining the signals from multiple sensors, it is usually possible to suppress unwanted noise, which permits detection and analysis of weaker signals. Second, in complicated situations in which many waves are present, arrays make it possible to separate the waves and to measure their individual characteristics (direction, velocity, etc.). Other systems (such as three-component sensors in the seismic application) can perform these functions to some extent, but none are as effective and versatile as arrays. The objectives of test ban treaty verification are to detect, locate, and identify underground nuclear explosions, and to discriminate them from earthquakes and conventional chemical explosions. Two physical modes of treaty verification are considered: monitoring with arrays of seismic stations (solid-earth propagation), and monitoring with arrays of acoustic (infrasound) stations (atmospheric propagation). The majority of the presentations represented in these proceedings address various aspects of the seismic verification problem.

  4. Biometric verification with correlation filters.

    PubMed

    Vijaya Kumar, B V K; Savvides, Marios; Xie, Chunyan; Venkataramani, Krithika; Thornton, Jason; Mahalanobis, Abhijit

    2004-01-10

    Using biometrics for subject verification can significantly improve security over that of approaches based on passwords and personal identification numbers, both of which people tend to lose or forget. In biometric verification the system tries to match an input biometric (such as a fingerprint, face image, or iris image) to a stored biometric template. Thus correlation filter techniques are attractive candidates for the matching precision needed in biometric verification. In particular, advanced correlation filters, such as synthetic discriminant function filters, can offer very good matching performance in the presence of variability in these biometric images (e.g., facial expressions, illumination changes, etc.). We investigate the performance of advanced correlation filters for face, fingerprint, and iris biometric verification. PMID:14735958
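
    The core of correlation-filter matching can be sketched in one dimension: correlate an input signal against a stored template and accept when the normalized correlation peak is high. Real systems build synthetic discriminant function filters in the frequency domain over 2-D images; the signals and threshold below are invented:

```python
import math

def norm_corr_peak(template, signal):
    """Maximum normalized cross-correlation of template against signal."""
    nt = math.sqrt(sum(t * t for t in template))
    best = 0.0
    for off in range(len(signal) - len(template) + 1):
        win = signal[off:off + len(template)]
        nw = math.sqrt(sum(w * w for w in win))
        if nw == 0:
            continue
        score = sum(t * w for t, w in zip(template, win)) / (nt * nw)
        best = max(best, score)
    return best

template = [0.0, 1.0, 2.0, 1.0, 0.0]
genuine = [0.1, 0.0, 1.1, 2.0, 0.9, 0.0, 0.1]   # template plus small noise
impostor = [2.0, 0.0, 2.0, 0.0, 2.0, 0.0, 2.0]

THRESH = 0.95
print(norm_corr_peak(template, genuine) > THRESH)   # True: accept
print(norm_corr_peak(template, impostor) > THRESH)  # False: reject
```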

  5. Generic interpreters and microprocessor verification

    NASA Technical Reports Server (NTRS)

    Windley, Phillip J.

    1990-01-01

    The following topics are covered in viewgraph form: (1) generic interpreters; (2) Viper microprocessors; (3) microprocessor verification; (4) determining correctness; (5) hierarchical decomposition; (6) interpreter theory; (7) AVM-1; (8) phase-level specification; and future work.

  6. TPS verification with UUT simulation

    NASA Astrophysics Data System (ADS)

    Wang, Guohua; Meng, Xiaofeng; Zhao, Ruixian

    2006-11-01

    TPS's (Test Program Set) verification or first article acceptance test commonly depends on fault insertion experiment on UUT (Unit Under Test). However the failure modes injected on UUT is limited and it is almost infeasible when the UUT is in development or in a distributed state. To resolve this problem, a TPS verification method based on UUT interface signal simulation is putting forward. The interoperability between ATS (automatic test system) and UUT simulation platform is very important to realize automatic TPS verification. After analyzing the ATS software architecture, the approach to realize interpretability between ATS software and UUT simulation platform is proposed. And then the UUT simulation platform software architecture is proposed based on the ATS software architecture. The hardware composition and software architecture of the UUT simulation is described in details. The UUT simulation platform has been implemented in avionics equipment TPS development, debug and verification.

  7. Reliable Entanglement Verification

    NASA Astrophysics Data System (ADS)

    Arrazola, Juan; Gittsovich, Oleg; Donohue, John; Lavoie, Jonathan; Resch, Kevin; Lütkenhaus, Norbert

    2013-05-01

    Entanglement plays a central role in quantum protocols. It is therefore important to be able to verify the presence of entanglement in physical systems from experimental data. In the evaluation of these data, the proper treatment of statistical effects requires special attention, as one can never claim to have verified the presence of entanglement with certainty. Recently, increased attention has been paid to the development of proper frameworks to pose and to answer these types of questions. In this work, we apply recent results by Christandl and Renner on reliable quantum state tomography to construct a reliable entanglement verification procedure based on the concept of confidence regions. The statements made do not require the specification of a prior distribution nor the assumption of an independent and identically distributed (i.i.d.) source of states. Moreover, we develop efficient numerical tools that are necessary to employ this approach in practice, rendering the procedure ready to be employed in current experiments. We demonstrate this fact by analyzing data from an experiment in which entangled two-photon states were generated and their entanglement verified with the use of an accessible nonlinear witness.

  8. Proceedings of a conference on nuclear war: The search for solutions

    SciTech Connect

    Perry, T.L.; DeMille, D.

    1985-01-01

    This book presents the proceedings of a conference on the problem of nuclear war. Topics include civil defense, nuclear winter, the psychological consequences of nuclear war, arms control, and verification.

  9. Verification and validation of control system software

    SciTech Connect

    Munro, J.K. Jr.; Kisner, R.A.; Bhadtt, S.C.

    1991-01-01

    The following guidelines are proposed for verification and validation (V&V) of nuclear power plant control system software: (a) use risk management to decide what and how much V&V is needed; (b) classify each software application using a scheme that reflects what type and how much V&V is needed; (c) maintain a set of reference documents with current information about each application; (d) use Program Inspection as the initial basic verification method; and (e) establish a deficiencies log for each software application. The following additional practices are strongly recommended: (a) use a computer-based configuration management system to track all aspects of development and maintenance; (b) establish reference baselines of the software, associated reference documents, and development tools at regular intervals during development; (c) use object-oriented design and programming to promote greater software reliability and reuse; (d) provide a copy of the software development environment as part of the package of deliverables; and (e) initiate an effort to use formal methods for preparation of Technical Specifications. The paper provides background information and reasons for the guidelines and recommendations. 3 figs., 3 tabs.

  10. Comments for A Conference on Verification in the 21st Century

    SciTech Connect

    Doyle, James E.

    2012-06-12

    The author offers five points for the discussion of verification and technology: (1) Experience with the implementation of arms limitation and arms reduction agreements confirms that technology alone has never been relied upon to provide effective verification. (2) The historical practice of verification of arms control treaties between Cold War rivals may constrain the cooperative and innovative use of technology for transparency, verification, and confidence building in the future. (3) An area that has been identified by many, including the US State Department and NNSA, as being rich for exploration for potential uses of technology for transparency and verification is information and communications technology (ICT). This includes social media, crowd-sourcing, the internet of things, and the concept of societal verification, but there are issues. (4) On the issue of the extent to which verification technologies are keeping pace with the demands of future protocols and agreements, I think the more direct question is ''are they effective in supporting the objectives of the treaty or agreement?'' In this regard it is important to acknowledge that there is a verification grand challenge at our doorstep: ''how does one verify limitations on nuclear warheads in national stockpiles?'' (5) Finally, while recognizing the daunting political and security challenges of such an approach, multilateral engagement and cooperation at the conceptual and technical levels provide benefits for addressing future verification challenges.

  11. Help! It's Hair Loss!

    MedlinePlus

    The part of a hair above the skin is dead. (That's why it doesn't hurt to get a haircut!)

  12. Computer Generated Inputs for NMIS Processor Verification

    SciTech Connect

    J. A. Mullens; J. E. Breeding; J. A. McEvers; R. W. Wysor; L. G. Chiang; J. R. Lenarduzzi; J. T. Mihalczo; J. K. Mattingly

    2001-06-29

    Proper operation of the Nuclear Materials Identification System (NMIS) processor can be verified using computer-generated inputs [BIST (Built-In Self-Test)] at the digital inputs. Preselected sequences of input pulses to all channels, with known correlation functions, are compared to the output of the processor. These types of verifications have been utilized in NMIS-type correlation processors at the Oak Ridge National Laboratory since 1984. The use of this test confirmed a malfunction in an NMIS processor at the All-Russian Scientific Research Institute of Experimental Physics (VNIIEF) in 1998. The NMIS processor boards were returned to the U.S. for repair and subsequently used in NMIS passive and active measurements with Pu at VNIIEF in 1999.
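
    The BIST principle can be sketched as follows: feed a preselected pulse sequence with a correlation function known by construction into the processor and compare the computed correlation with the expected result. The pattern below is invented; the real NMIS processor correlates timed detector pulses across multiple channels:

```python
def autocorrelation(bits, max_lag):
    """Circular autocorrelation (coincidence counts) of a 0/1 sequence."""
    n = len(bits)
    return [sum(bits[i] & bits[(i + lag) % n] for i in range(n))
            for lag in range(max_lag)]

# A pulse every 4th clock tick: correlation peaks at lags 0, 4, 8, ...
test_pattern = [1, 0, 0, 0] * 8          # 32 samples, 8 pulses
expected = [8, 0, 0, 0, 8, 0, 0, 0]      # known by construction

measured = autocorrelation(test_pattern, 8)
print(measured == expected)  # True: processor passes the self-test
```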

  13. RELAP-7 Software Verification and Validation Plan

    SciTech Connect

    Smith, Curtis L.; Choi, Yong-Joon; Zou, Ling

    2014-09-25

    This INL plan comprehensively describes the software for RELAP-7 and documents the software, interface, and software design requirements for the application. The plan also describes the testing-based software verification and validation (SV&V) process—a set of specially designed software models used to test RELAP-7. The RELAP-7 (Reactor Excursion and Leak Analysis Program) code is a nuclear reactor system safety analysis code being developed at Idaho National Laboratory (INL). The code is based on the INL’s modern scientific software development framework – MOOSE (Multi-Physics Object-Oriented Simulation Environment). The overall design goal of RELAP-7 is to take advantage of the previous thirty years of advancements in computer architecture, software design, numerical integration methods, and physical models. The end result will be a reactor systems analysis capability that retains and improves upon RELAP5’s capability and extends the analysis capability for all reactor system simulation scenarios.

  14. SPR Hydrostatic Column Model Verification and Validation.

    SciTech Connect

    Bettin, Giorgia; Lord, David; Rudeen, David Keith

    2015-10-01

    A Hydrostatic Column Model (HCM) was developed to help differentiate between normal "tight" well behavior and small-leak behavior under nitrogen when testing the pressure integrity of crude oil storage wells at the U.S. Strategic Petroleum Reserve. This effort was motivated by steady, yet distinct, pressure behavior of a series of Big Hill caverns that have been placed under nitrogen for extended periods of time. This report describes the HCM model, its functional requirements, the model structure, and the verification and validation process. Different modes of operation are also described, which illustrate how the software can be used to model extended nitrogen monitoring and Mechanical Integrity Tests by predicting wellhead pressures along with nitrogen interface movements. Model verification has shown that the program runs correctly and is implemented as intended. The cavern BH101 long-term nitrogen test was used to validate the model, which showed very good agreement with measured data. This supports the claim that the model is, in fact, capturing the relevant physical phenomena and can be used to make accurate predictions of both wellhead pressure and interface movements.
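
    The hydrostatic balance at the heart of such a model can be sketched as follows: wellhead pressure equals the pressure at a known datum depth minus the weight of the fluid columns stacked above it. The densities, heights, and datum pressure below are invented, and the actual HCM adds leak terms, temperature effects, and interface tracking on top of this balance:

```python
G = 9.81  # gravitational acceleration, m/s^2

def wellhead_pressure(p_datum_pa, columns):
    """P_wellhead = P_datum - g * sum(rho_i * h_i).
    columns: list of (density kg/m^3, height m) from the datum up to
    the surface."""
    return p_datum_pa - G * sum(rho * h for rho, h in columns)

# Invented stack: brine, then crude oil, then nitrogen near the wellhead.
columns = [(1200.0, 300.0), (850.0, 250.0), (180.0, 50.0)]
p_datum = 7.0e6  # Pa at 600 m datum depth

print(round(wellhead_pressure(p_datum, columns)))  # 1295485 (~1.3 MPa)
```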

  15. Multichip reticle approach for OPC model verification

    NASA Astrophysics Data System (ADS)

    Taravade, Kunal N.; Belova, Nadya; Jost, Andrew M.; Callan, Neal P.

    2003-12-01

The complexity of current semiconductor technology due to shrinking feature sizes demands ever more engineering effort and expense to deliver the final product to customers. One of the largest expenses in the entire budget is reticle manufacturing. With the need to perform mask correction to account for optical proximity effects at the wafer level, reticle expenses have become even more critical. For 0.13um technology one cannot avoid an optical proximity correction (OPC) procedure for modifying original designs to comply with design rules as required by Front End (FE) and Back End (BE) processes. Once an OPC model is generated, one needs to confirm and verify it with additional test reticles for every critical layer of the technology. Such a verification procedure would include the most critical layers (two FE layers and four BE layers for the 0.13 technology node). This allows us to evaluate model performance under real production conditions encountered on customer designs. At LSI we have developed and verified the low volume reticle (LVR) approach for verification of different OPC models. The proposed approach allows performing die-to-die reticle defect inspection in addition to checking the printed image on the wafer. It helps finalize litho and etch process parameters. Processing wafers with overlaying masks for two consecutive BE layers (via and metal2 masks) allowed us to evaluate the robustness of OPC models for a wafer stack against both reticle- and wafer-induced misalignments.

  16. Hard and Soft Safety Verifications

    NASA Technical Reports Server (NTRS)

    Wetherholt, Jon; Anderson, Brenda

    2012-01-01

The purpose of this paper is to examine the differences between and the effects of hard and soft safety verifications. Initially, the terminology should be defined and clarified. A hard safety verification is a datum that demonstrates how a safety control is enacted. An example of this is relief valve testing. A soft safety verification is something usually described as "nice to have" but not necessary to prove safe operation. An example of a soft verification is the loss of the Solid Rocket Booster (SRB) casings from Shuttle flight STS-4. When the main parachutes failed, the casings impacted the water and sank. In the nose cap of the SRBs, video cameras recorded the release of the parachutes to determine safe operation and to provide information for potential anomaly resolution. Generally, examination of the casings and nozzles contributed to understanding of the newly developed boosters and their operation. Safety verification of SRB operation was demonstrated by examination of the casings and nozzle for erosion or wear. Loss of the SRBs and associated data did not delay the launch of the next Shuttle flight.

  17. Face verification with balanced thresholds.

    PubMed

    Yan, Shuicheng; Xu, Dong; Tang, Xiaoou

    2007-01-01

    The process of face verification is guided by a pre-learned global threshold, which, however, is often inconsistent with class-specific optimal thresholds. It is, hence, beneficial to pursue a balance of the class-specific thresholds in the model-learning stage. In this paper, we present a new dimensionality reduction algorithm tailored to the verification task that ensures threshold balance. This is achieved by the following aspects. First, feasibility is guaranteed by employing an affine transformation matrix, instead of the conventional projection matrix, for dimensionality reduction, and, hence, we call the proposed algorithm threshold balanced transformation (TBT). Then, the affine transformation matrix, constrained as the product of an orthogonal matrix and a diagonal matrix, is optimized to improve the threshold balance and classification capability in an iterative manner. Unlike most algorithms for face verification which are directly transplanted from face identification literature, TBT is specifically designed for face verification and clarifies the intrinsic distinction between these two tasks. Experiments on three benchmark face databases demonstrate that TBT significantly outperforms the state-of-the-art subspace techniques for face verification.
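The threshold imbalance the paper targets can be seen with a toy calculation. The scores and the simple midpoint rule below are synthetic illustrations of the problem, not the TBT algorithm itself.

```python
# Illustration of the threshold-imbalance problem: the distance threshold
# that best separates one subject's genuine and impostor scores can differ
# sharply from another's, so a single global threshold is suboptimal.
# All scores are synthetic.

def midpoint_threshold(genuine, impostor):
    """A simple class-specific threshold: midway between the worst
    genuine distance and the best impostor distance."""
    return (max(genuine) + min(impostor)) / 2.0

# Per-subject distance scores (smaller = more similar); made up.
subject_a = {"genuine": [0.10, 0.15, 0.20], "impostor": [0.60, 0.70]}
subject_b = {"genuine": [0.30, 0.40, 0.45], "impostor": [0.50, 0.55]}

t_a = midpoint_threshold(**subject_a)   # 0.40
t_b = midpoint_threshold(**subject_b)   # 0.475
# A global threshold of 0.40 sits right on subject A's boundary yet
# would reject some of subject B's genuine matches (0.45 > 0.40).
```

Balancing these class-specific thresholds during dimensionality reduction, rather than after the fact, is the motivation for learning a transformation like TBT.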

  18. International and national security applications of cryogenic detectors - mostly nuclear safeguards

    SciTech Connect

    Rabin, Michael W

    2009-01-01

As with science, so with security - in both arenas, the extraordinary sensitivity of cryogenic sensors enables high-confidence detection and high-precision measurement even of the faintest signals. Science applications are more mature, but several national and international security applications have been identified where cryogenic detectors have high potential payoff. International safeguards and nuclear forensics are areas needing new technology and methods to boost speed, sensitivity, precision and accuracy. Successfully applied, improved nuclear materials analysis will help constrain nuclear materials diversion pathways and contribute to treaty verification. Cryogenic microcalorimeter detectors for X-ray, gamma ray, neutron, and alpha particle spectrometry are under development with these aims in mind. In each case the unsurpassed energy resolution of microcalorimeters reveals previously invisible spectral features of nuclear materials. Preliminary results of quantitative analysis indicate substantial improvements are still possible, but significant work will be required to fully understand the ultimate performance limits.

  19. How NASA's Independent Verification and Validation (IV&V) Program Builds Reliability into a Space Mission Software System (SMSS)

    NASA Technical Reports Server (NTRS)

    Fisher, Marcus S.; Northey, Jeffrey; Stanton, William

    2014-01-01

    The purpose of this presentation is to outline how the NASA Independent Verification and Validation (IV&V) Program helps to build reliability into the Space Mission Software Systems (SMSSs) that its customers develop.

  20. National Center for Nuclear Security - NCNS

    ScienceCinema

    None

    2016-07-12

As the United States embarks on a new era of nuclear arms control, the tools for treaty verification must be accurate and reliable, and must work at stand-off distances. The National Center for Nuclear Security, or NCNS, at the Nevada National Security Site, is poised to become the proving ground for these technologies. The center is a unique test bed for non-proliferation and arms control treaty verification technologies. The NNSS is an ideal location for these kinds of activities because of its multiple environments, its cadre of experienced nuclear personnel, and the artifacts of atmospheric and underground nuclear weapons explosions. The NCNS will provide future treaty negotiators with solid data on verification and inspection regimes and a realistic environment in which future treaty verification specialists can be trained. Work on warhead monitoring at the NCNS will also support future arms reduction treaties.

  1. National Center for Nuclear Security - NCNS

    SciTech Connect

    2014-11-12

As the United States embarks on a new era of nuclear arms control, the tools for treaty verification must be accurate and reliable, and must work at stand-off distances. The National Center for Nuclear Security, or NCNS, at the Nevada National Security Site, is poised to become the proving ground for these technologies. The center is a unique test bed for non-proliferation and arms control treaty verification technologies. The NNSS is an ideal location for these kinds of activities because of its multiple environments, its cadre of experienced nuclear personnel, and the artifacts of atmospheric and underground nuclear weapons explosions. The NCNS will provide future treaty negotiators with solid data on verification and inspection regimes and a realistic environment in which future treaty verification specialists can be trained. Work on warhead monitoring at the NCNS will also support future arms reduction treaties.

  2. Structural verification for GAS experiments

    NASA Technical Reports Server (NTRS)

    Peden, Mark Daniel

    1992-01-01

The purpose of this paper is to assist the Get Away Special (GAS) experimenter in conducting a thorough structural verification of the experiment's structural configuration, thus expediting the structural review/approval process and the safety process in general. Material selection for structural subsystems will be covered with an emphasis on fasteners (GSFC fastener integrity requirements) and primary support structures (Stress Corrosion Cracking requirements and National Space Transportation System (NSTS) requirements). Different approaches to structural verification (tests and analyses) will be outlined, especially those stemming from lessons learned on load and fundamental frequency verification. In addition, fracture control will be covered for those payloads that utilize a door assembly or modify the containment provided by the standard GAS Experiment Mounting Plate (EMP). Structural hazard assessment and the preparation of structural hazard reports will be reviewed to form a summation of structural safety issues for inclusion in the safety data package.

  3. AREST-CT V1.0 software verification

    SciTech Connect

    Chen, Y.; Engel, D.W.; McGrail, B.P.; Lessor, K.S.

    1995-07-01

The Analyzer for Radionuclide Source-Term with Chemical Transport (AREST-CT) is a scientific computer code designed for performance assessments of engineered barrier system (EBS) concepts for the underground storage of nuclear waste, including high-level, intermediate, and low-level wastes. The AREST-CT code has features for analyzing the degradation of and release of radionuclides from the waste form, chemical reactions that depend on time and space, and transport of the waste and other products through the EBS. This document provides a description of the verification testing that has been performed on the initial version of AREST-CT (V1.0). Software verification is the process of confirming that the models and algorithms have been correctly implemented into a computer code. Software verification for V1.0 consisted of testing the individual modules (unit tests) and a test of the fully-coupled model (integration testing). The integration test was done by comparing the results from AREST-CT with the results from the reactive transport code CIRF.A. The test problem consisted of a 1-D analysis of the release, transport, and precipitation of {sup 99}Tc in an idealized LLW disposal system. All verification tests showed that AREST-CT works properly and in accordance with design specifications.

  4. Formal verification of mathematical software

    NASA Technical Reports Server (NTRS)

    Sutherland, D.

    1984-01-01

    Methods are investigated for formally specifying and verifying the correctness of mathematical software (software which uses floating point numbers and arithmetic). Previous work in the field was reviewed. A new model of floating point arithmetic called the asymptotic paradigm was developed and formalized. Two different conceptual approaches to program verification, the classical Verification Condition approach and the more recently developed Programming Logic approach, were adapted to use the asymptotic paradigm. These approaches were then used to verify several programs; the programs chosen were simplified versions of actual mathematical software.

  5. CHEMICAL INDUCTION MIXER VERIFICATION - ENVIRONMENTAL TECHNOLOGY VERIFICATION PROGRAM

    EPA Science Inventory

    The Wet-Weather Flow Technologies Pilot of the Environmental Technology Verification (ETV) Program, which is supported by the U.S. Environmental Protection Agency and facilitated by NSF International, has recently evaluated the performance of chemical induction mixers used for di...

  6. Handi Helps, 1984.

    ERIC Educational Resources Information Center

    Handi Helps, 1984

    1984-01-01

    The eight issues of Handi Helps presented in this document focus on specific issues of concern to the disabled, parents, and those working with the disabled. The two-page handi help fact sheets focus on the following topics: child abuse, leukemia, arthritis, Tourette Syndrome, hemophilia, the puppet program "Meet the New Kids on the Block" and dog…

  7. The Help Desk.

    ERIC Educational Resources Information Center

    Klein, Regina; And Others

    1988-01-01

    The first of three articles describes the results of a survey that examined characteristics and responsibilities of help-desk personnel at major database and online services. The second provides guidelines to using such customer services, and the third lists help-desk numbers for online databases and systems. (CLB)

  8. Handi Helps, 1985

    ERIC Educational Resources Information Center

    Handi Helps, 1985

    1985-01-01

    The six issues of Handi Helps presented here focus on specific issues of concern to the disabled, parents, and those working with the disabled. The two-page handi help fact sheets focus on the following topics: child sexual abuse prevention, asthma, scoliosis, the role of the occupational therapist, kidnapping, and muscular dystrophy. Each handi…

  9. Helping Children Understand Divorce.

    ERIC Educational Resources Information Center

    Allers, Robert D.

    1980-01-01

    Children of divorced parents may bring many problems along when they come to school. Teachers can recognize these troubles and help children learn to handle them. They may be able to help children better understand their feelings about their parents' divorce. (CJ)

  10. A verification system of RMAP protocol controller

    NASA Astrophysics Data System (ADS)

    Khanov, V. Kh; Shakhmatov, A. V.; Chekmarev, S. A.

    2015-01-01

The functional verification problem of IP blocks of the RMAP protocol controller is considered. The application of a verification method using fully-functional models of the processor and the internal bus of a system-on-chip is justified. Principles of construction of a verification system based on the given approach are proposed. The practical results of creating a verification system for the RMAP protocol controller IP block are presented.

  11. Working Memory Mechanism in Proportional Quantifier Verification

    ERIC Educational Resources Information Center

    Zajenkowski, Marcin; Szymanik, Jakub; Garraffa, Maria

    2014-01-01

    The paper explores the cognitive mechanisms involved in the verification of sentences with proportional quantifiers (e.g. "More than half of the dots are blue"). The first study shows that the verification of proportional sentences is more demanding than the verification of sentences such as: "There are seven blue and eight yellow…
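As a literal computation, the verification itself is trivial, which is why such studies attribute the measured difficulty to working-memory demands rather than algorithmic cost. A sketch, with a made-up scene of dots:

```python
# A literal verifier for "More than half of the dots are blue".
# The scene below is invented for illustration.

def more_than_half(items, predicate):
    hits = sum(1 for x in items if predicate(x))
    return hits * 2 > len(items)  # strict majority

dots = ["blue", "blue", "yellow", "blue", "yellow"]
result = more_than_half(dots, lambda c: c == "blue")  # True: 3 of 5
```

The machine needs only a running count; the human verifier must track and compare two growing tallies, which is the proposed source of the extra cognitive load.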

  12. 78 FR 58492 - Generator Verification Reliability Standards

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-09-24

    ... Energy Regulatory Commission 18 CFR Part 40 Generator Verification Reliability Standards AGENCY: Federal... Organization: MOD-025-2 (Verification and Data Reporting of Generator Real and Reactive Power Capability and Synchronous Condenser Reactive Power Capability), MOD- 026-1 (Verification of Models and Data for...

  13. The science verification of FLAMES

    NASA Astrophysics Data System (ADS)

    Primas, Francesca

    2003-06-01

After a new VLT instrument has been commissioned and thoroughly tested, a series of scientific and technical checkups are scheduled in order to test the front-to-end operations chain before the official start of regular operations. Technically speaking, these are the so-called Dry Runs, part of which are usually devoted to the Science Verification (SV for short) of that specific instrument. A Science Verification programme includes a set of typical scientific observations with the aim of verifying and demonstrating to the community the capabilities of a new instrument in the operational framework of the VLT Paranal Observatory. Though manifold, its goals can be summarised in two main points: from the scientific point of view, by demonstrating the scientific potential of the new instrument, these observations will provide ESO users with first science-grade data, thus fostering an early scientific return. From the technical point of view, by testing the whole operational system (from the preparation of the observations to their execution and analysis), it will provide important feedback to the Instrument Operation Teams (both in Paranal and in Garching), to the Instrument Division, and to the Data Flow groups. More details about the concept(s) behind a Science Verification can be found in the “Science Verification Policy and Procedures” document (available at http://www.eso.org/science/vltsv/).

  14. International safeguards: Accounting for nuclear materials

    SciTech Connect

    Fishbone, L.G.

    1988-09-28

Nuclear safeguards applied by the International Atomic Energy Agency (IAEA) are one element of the "non-proliferation regime", the collection of measures whose aim is to forestall the spread of nuclear weapons to countries that do not already possess them. Safeguards verifications provide evidence that nuclear materials in peaceful use for nuclear-power production are properly accounted for. Though carried out in cooperation with nuclear facility operators, the verifications can provide assurance because they are designed with the capability to detect diversion, should it occur. Traditional safeguards verification measures conducted by inspectors of the IAEA include book auditing; counting and identifying containers of nuclear material; measuring nuclear material; photographic and video surveillance; and sealing. Novel approaches to achieve greater efficiency and effectiveness in safeguards verifications are under investigation as the number and complexity of nuclear facilities grow. These include the zone approach, which entails carrying out verifications for groups of facilities collectively, and the randomization approach, which entails carrying out entire inspection visits some fraction of the time on a random basis. Both approaches show promise in particular situations but, like traditional measures, must be tested to ensure their practical utility. These approaches are covered in this report. 15 refs., 16 figs., 3 tabs.

  15. Helping Parents Say No.

    ERIC Educational Resources Information Center

    Duel, Debra K.

    1988-01-01

    Provides some activities that are designed to help students understand some of the reasons why parents sometimes refuse to let their children have pets. Includes mathematics and writing lessons, a student checklist, and a set of tips for parents. (TW)

  16. Can Reading Help?

    ERIC Educational Resources Information Center

    Crowe, Chris

    2003-01-01

    Ponders the effect of September 11th on teenagers. Proposes that reading books can help teenagers sort out complicated issues. Recommends young adult novels that offer hope for overcoming tragedy. Lists 50 short story collections worth reading. (PM)

  17. Hooked on Helping

    ERIC Educational Resources Information Center

    Longhurst, James; McCord, Joan

    2014-01-01

    In this article, teens presenting at a symposium on peer-helping programs describe how caring for others fosters personal growth and builds positive group cultures. Their individual thoughts and opinions are expressed.

  18. Helping Friends and Family

    MedlinePlus

... take them up on it! When the adjustment process gets stuck: There are times ... are some ways to help move the adjustment process along: Speak honestly and frankly about your feelings. ...

  19. Help Teens Manage Diabetes

    MedlinePlus

    ... Training (CST) as a part of routine diabetes management. Its aim is to improve diabetic teens' coping and communication skills, healthy behaviors, and conflict resolution. The CST training helps diabetic teens to ...

  20. Helping Teens Cope.

    ERIC Educational Resources Information Center

    Jones, Jami I.

    2003-01-01

    Considers the role of school library media specialists in helping teens cope with developmental and emotional challenges. Discusses resiliency research, and opportunities to develop programs and services especially for middle school and high school at-risk teens. (LRW)

  1. Teaching "The Nuclear Predicament."

    ERIC Educational Resources Information Center

    Carman, Philip; Kneeshaw, Stephen

    1987-01-01

    Contends that courses on nuclear war must help students examine the political, social, religious, philosophical, economic, and moral assumptions which characterized the dilemma of nuclear armament/disarmament. Describes the upper level undergraduate course taught by the authors. (JDH)

  2. Initial performance of the advanced inventory verification sample system (AVIS)

    SciTech Connect

    Marlow, Johnna B; Swinhoe, Martyn T; Menlove, Howard O; Rael, Carlos D

    2009-01-01

This paper describes the requirements, design and initial performance of the Advanced Inventory Verification Sample System (AVIS), a non-destructive assay (NDA) system to measure small samples of bulk mixed uranium-plutonium oxide (MOX) materials (powders and pellets). The AVIS design has evolved from previously developed conceptual physics and engineering designs for the Inventory Sample Verification System (INVS), a safeguards system for non-destructive assay of small samples. The AVIS is an integrated gamma-neutron system. Jointly designed by the Nuclear Material Control Center (NMCC) and the Los Alamos National Laboratory (LANL), AVIS is intended to meet a performance specification of a total measurement uncertainty of less than 0.5% in the neutron ({sup 240}Pu{sub effective}) measurement. This will allow the AVIS to replace destructive chemical analysis for many samples, with concomitant cost, exposure and waste generation savings for the facility. Data taken to date confirming the performance of the AVIS are presented.

  3. Image Hashes as Templates for Verification

    SciTech Connect

    Janik, Tadeusz; Jarman, Kenneth D.; Robinson, Sean M.; Seifert, Allen; McDonald, Benjamin S.; White, Timothy A.

    2012-07-17

Imaging systems can provide measurements that confidently assess characteristics of nuclear weapons and dismantled weapon components, and such assessment will be needed in future verification for arms control. Yet imaging is often viewed as too intrusive, raising concern about the ability to protect sensitive information. In particular, the prospect of using image-based templates for verifying the presence or absence of a warhead, or of the declared configuration of fissile material in storage, may be rejected out-of-hand as being too vulnerable to violation of information barrier (IB) principles. Development of a rigorous approach for generating and comparing reduced-information templates from images, and assessing the security, sensitivity, and robustness of verification using such templates, is needed to address these concerns. We discuss our efforts to develop such a rigorous approach based on a combination of image-feature extraction and encryption, utilizing hash functions to confirm proffered declarations, providing strong classified data security while maintaining high confidence for verification. The proposed work is focused on developing secure, robust, tamper-sensitive and automatic techniques that may enable the comparison of non-sensitive hashed image data outside an IB. It is rooted in research on so-called perceptual hash functions for image comparison, at the interface of signal/image processing, pattern recognition, cryptography, and information theory. Such perceptual or robust image hashing—which, strictly speaking, is not truly cryptographic hashing—has extensive application in content authentication and information retrieval, database search, and security assurance. Applying and extending the principles of perceptual hashing to imaging for arms control, we propose techniques that are sensitive to altering, forging and tampering of the imaged object yet robust and tolerant to content-preserving image distortions and noise. Ensuring that the
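A toy "average hash" illustrates the robust-hashing idea the abstract builds on: reduce an image to a coarse grid, threshold each cell against the mean, and compare hashes by Hamming distance. The 2x2 "images" and thresholding rule below are deliberately simplistic stand-ins for the feature extraction a real system would use.

```python
# Toy perceptual (average) hash: tolerant to mild brightness noise,
# sensitive to rearranged content. Pixel grids are synthetic.

def average_hash(gray):
    """gray: 2-D list of brightness values; returns a bit string."""
    flat = [v for row in gray for v in row]
    mean = sum(flat) / len(flat)
    return "".join("1" if v > mean else "0" for v in flat)

def hamming(h1, h2):
    return sum(a != b for a, b in zip(h1, h2))

img = [[200, 40], [60, 210]]
noisy = [[190, 55], [70, 205]]       # content-preserving distortion
tampered = [[40, 200], [60, 210]]    # top-row content swapped

d_noise = hamming(average_hash(img), average_hash(noisy))      # 0
d_tamper = hamming(average_hash(img), average_hash(tampered))  # 2
```

Only the bit strings would cross the information barrier: a declaration is confirmed when the Hamming distance to the stored template falls below a tolerance, without revealing the underlying image.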

  4. The CSMS (Configurable Seismic Monitoring System) Poorboy deployment: Seismic recording in Pinedale, Wyoming, of the Bullion NTS (Nevada Test Site) nuclear test under the verification provisions of the new TTBT protocol

    SciTech Connect

    Harben, P.E.; Rock, D.W.; Carlson, R.C.

    1990-07-10

    The Configurable Seismic Monitoring System (CSMS), developed at the Lawrence Livermore National Laboratory (LLNL) was deployed in a 13-m deep vault on the AFTAC facility at Pinedale, Wyoming to record the Bullion nuclear test. The purpose of the exercise was to meet all provisions of the new TTBT protocol on in-country seismic recording at a Designated Seismic Station (DSS). The CSMS successfully recorded the Bullion event consistent with and meeting all requirements in the new treaty protocol. In addition, desirable seismic system features not specified in the treaty protocol were determined; treaty protocol ambiguities were identified, and useful background noise recordings at the Pinedale site were obtained. 10 figs.

  5. Automated Installation Verification of COMSOL via LiveLink for MATLAB

    SciTech Connect

    Crowell, Michael W

    2015-01-01

    Verifying that a local software installation performs as the developer intends is a potentially time-consuming but necessary step for nuclear safety related codes. Automating this process not only saves time, but can increase reliability and scope of verification compared to ‘hand’ comparisons. While COMSOL does not include automatic installation verification as many commercial codes do, it does provide tools such as LiveLink™ for MATLAB® and the COMSOL API for use with Java® through which the user can automate the process. Here we present a successful automated verification example of a local COMSOL 5.0 installation for nuclear safety related calculations at the Oak Ridge National Laboratory’s High Flux Isotope Reactor (HFIR).
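The general shape of such an automated installation-verification harness can be sketched as follows. The solver call, case names, benchmark values, and tolerances are all hypothetical stand-ins, not the actual COMSOL LiveLink for MATLAB API.

```python
# Generic pattern for automated installation verification: run stored
# reference problems on the local installation and compare each result
# to a benchmark within a tolerance. Everything below is illustrative.

BENCHMARKS = {
    "steady_conduction_peak_K": (412.7, 0.5),    # (expected, abs tol)
    "channel_flow_dp_Pa": (1.38e4, 50.0),
}

def run_reference_case(name):
    # Placeholder for invoking the installed solver on a stored model.
    return {"steady_conduction_peak_K": 412.9,
            "channel_flow_dp_Pa": 1.382e4}[name]

def verify_installation():
    failures = []
    for case, (expected, tol) in BENCHMARKS.items():
        got = run_reference_case(case)
        if abs(got - expected) > tol:
            failures.append((case, expected, got))
    return failures  # empty list means the installation passed

failures = verify_installation()
```

Scripting this loop is what turns a one-off "hand" comparison into a repeatable check that can cover many reference cases every time the installation changes.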

  6. Concepts of Model Verification and Validation

    SciTech Connect

Thacker, B.H.; Doebling, S.W.; Hemez, F.M.; Anderson, M.C.; Pepin, J.E.; Rodriguez, E.A.

    2004-10-30

    Model verification and validation (V&V) is an enabling methodology for the development of computational models that can be used to make engineering predictions with quantified confidence. Model V&V procedures are needed by government and industry to reduce the time, cost, and risk associated with full-scale testing of products, materials, and weapon systems. Quantifying the confidence and predictive accuracy of model calculations provides the decision-maker with the information necessary for making high-consequence decisions. The development of guidelines and procedures for conducting a model V&V program are currently being defined by a broad spectrum of researchers. This report reviews the concepts involved in such a program. Model V&V is a current topic of great interest to both government and industry. In response to a ban on the production of new strategic weapons and nuclear testing, the Department of Energy (DOE) initiated the Science-Based Stockpile Stewardship Program (SSP). An objective of the SSP is to maintain a high level of confidence in the safety, reliability, and performance of the existing nuclear weapons stockpile in the absence of nuclear testing. This objective has challenged the national laboratories to develop high-confidence tools and methods that can be used to provide credible models needed for stockpile certification via numerical simulation. There has been a significant increase in activity recently to define V&V methods and procedures. The U.S. Department of Defense (DoD) Modeling and Simulation Office (DMSO) is working to develop fundamental concepts and terminology for V&V applied to high-level systems such as ballistic missile defense and battle management simulations. The American Society of Mechanical Engineers (ASME) has recently formed a Standards Committee for the development of V&V procedures for computational solid mechanics models. The Defense Nuclear Facilities Safety Board (DNFSB) has been a proponent of model V&V for all

  7. Experimental verification of quantum computation

    NASA Astrophysics Data System (ADS)

    Barz, Stefanie; Fitzsimons, Joseph F.; Kashefi, Elham; Walther, Philip

    2013-11-01

    Quantum computers are expected to offer substantial speed-ups over their classical counterparts and to solve problems intractable for classical computers. Beyond such practical significance, the concept of quantum computation opens up fundamental questions, among them the issue of whether quantum computations can be certified by entities that are inherently unable to compute the results themselves. Here we present the first experimental verification of quantum computation. We show, in theory and experiment, how a verifier with minimal quantum resources can test a significantly more powerful quantum computer. The new verification protocol introduced here uses the framework of blind quantum computing and is independent of the experimental quantum-computation platform used. In our scheme, the verifier is required only to generate single qubits and transmit them to the quantum computer. We experimentally demonstrate this protocol using four photonic qubits and show how the verifier can test the computer's ability to perform quantum computation.

  8. Ontology Matching with Semantic Verification

    PubMed Central

    Jean-Mary, Yves R.; Shironoshita, E. Patrick; Kabuka, Mansur R.

    2009-01-01

    ASMOV (Automated Semantic Matching of Ontologies with Verification) is a novel algorithm that uses lexical and structural characteristics of two ontologies to iteratively calculate a similarity measure between them, derives an alignment, and then verifies it to ensure that it does not contain semantic inconsistencies. In this paper, we describe the ASMOV algorithm, and then present experimental results that measure its accuracy using the OAEI 2008 tests, and that evaluate its use with two different thesauri: WordNet, and the Unified Medical Language System (UMLS). These results show the increased accuracy obtained by combining lexical, structural and extensional matchers with semantic verification, and demonstrate the advantage of using a domain-specific thesaurus for the alignment of specialized ontologies. PMID:20186256

  9. Ontology Matching with Semantic Verification.

    PubMed

    Jean-Mary, Yves R; Shironoshita, E Patrick; Kabuka, Mansur R

    2009-09-01

    ASMOV (Automated Semantic Matching of Ontologies with Verification) is a novel algorithm that uses lexical and structural characteristics of two ontologies to iteratively calculate a similarity measure between them, derives an alignment, and then verifies it to ensure that it does not contain semantic inconsistencies. In this paper, we describe the ASMOV algorithm, and then present experimental results that measure its accuracy using the OAEI 2008 tests, and that evaluate its use with two different thesauri: WordNet, and the Unified Medical Language System (UMLS). These results show the increased accuracy obtained by combining lexical, structural and extensional matchers with semantic verification, and demonstrate the advantage of using a domain-specific thesaurus for the alignment of specialized ontologies.
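A heavily simplified stand-in for the lexical seed step of an ontology matcher like ASMOV might look as follows; the ontologies, tokenization, and Jaccard scoring are illustrative only (ASMOV itself also combines structural and extensional evidence with semantic verification).

```python
# Toy lexical matcher: score candidate concept pairs by token overlap
# of their labels and keep the best non-zero pairing. Concept names
# are invented for illustration.

def token_jaccard(a, b):
    ta, tb = set(a.lower().split("_")), set(b.lower().split("_"))
    return len(ta & tb) / len(ta | tb)

def best_match(concept, candidates):
    best = max(candidates, key=lambda c: token_jaccard(concept, c))
    return best if token_jaccard(concept, best) > 0 else None

onto1 = ["blood_pressure", "heart_rate"]
onto2 = ["arterial_blood_pressure", "pulse"]

alignment = {c: best_match(c, onto2) for c in onto1}
# "blood_pressure" pairs with "arterial_blood_pressure";
# "heart_rate" finds no lexical evidence and stays unmatched.
```

In a full matcher, such lexical scores would be iteratively refined with structural similarity and then checked for semantic inconsistencies before the alignment is accepted; a domain thesaurus (e.g. UMLS) would catch synonym pairs like "heart_rate" and "pulse" that plain token overlap misses.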

  10. Formal verification of AI software

    NASA Technical Reports Server (NTRS)

    Rushby, John; Whitehurst, R. Alan

    1989-01-01

    The application of formal verification techniques to Artificial Intelligence (AI) software, particularly expert systems, is investigated. Constraint satisfaction and model inversion are identified as two formal specification paradigms for different classes of expert systems. A formal definition of consistency is developed, and the notion of approximate semantics is introduced. Examples are given of how these ideas can be applied in both declarative and imperative forms.

  11. Helpful hints to painless payload processing

    NASA Technical Reports Server (NTRS)

    Terhune, Terry; Carson, Maggie

    1995-01-01

    The helpful hints herein describe, from a system perspective, the functional flow of hardware and software. The flow will begin at the experiment development stage and continue through build-up, test, verification, delivery, launch and deintegration of the experiment. An effort will be made to identify those interfaces and transfer functions of processing that can be improved upon in the new world of 'Faster, Better, and Cheaper.' The documentation necessary to ensure configuration and processing requirements satisfaction will also be discussed. Hints and suggestions for improvements to enhance each phase of the flow will be derived from extensive experience and documented lessons learned. Charts will be utilized to define the functional flow and a list of 'lessons learned' will be addressed to show applicability. In conclusion, specific improvements for several areas of hardware processing, procedure development and quality assurance, that are generic to all Small Payloads, will be identified.

  12. Verification and transparency in future arms control

    SciTech Connect

    Pilat, J.F.

    1996-09-01

    Verification's importance has changed dramatically over time, although it always has been in the forefront of arms control. The goals and measures of verification and the criteria for success have changed with the times as well, reflecting such factors as the centrality of the prospective agreement to East-West relations during the Cold War, the state of relations between the United States and the Soviet Union, and the technologies available for monitoring. Verification's role may be declining in the post-Cold War period. The prospects for such a development will depend, first and foremost, on the high costs of traditional arms control, especially those associated with requirements for verification. Moreover, the growing interest in informal, or non-negotiated, arms control does not allow for verification provisions by the very nature of these arrangements. Multilateral agreements are also becoming more prominent and argue against highly effective verification measures, in part because of fears of promoting proliferation by opening sensitive facilities to inspectors from potential proliferant states. As a result, it is likely that transparency and confidence-building measures will achieve greater prominence, both as supplements to and substitutes for traditional verification. Such measures are not panaceas and do not offer all that we came to expect from verification during the Cold War. But they may be the best possible means to deal with current problems of arms reductions and restraints at acceptable levels of expenditure.

  13. Stretching: Does It Help?

    ERIC Educational Resources Information Center

    Vardiman, Phillip; Carrand, David; Gallagher, Philip M.

    2010-01-01

    Stretching prior to activity is universally accepted as an important way to improve performance and help prevent injury. Likewise, limited flexibility has been shown to decrease functional ability and predispose a person to injuries. Although this is commonly accepted, appropriate stretching for children and adolescents involved with sports and…

  14. Helping Perceptually Handicapped Children

    ERIC Educational Resources Information Center

    Rose, Helen S.

    1974-01-01

    Five children diagnosed as having a perceptual problem as revealed by the Bender Visual Motor Gestalt Test received special tutoring to help develop their visual discrimination abilities. The six-week program for teaching the concept of shapes employed kinesthetic, visual, tactile, and verbal processes. (CS)

  15. With a Little Help.

    ERIC Educational Resources Information Center

    Cunningham, Richard

    1997-01-01

    Describes a volunteer tutoring program coordinated by associates of the Exxon Corporation to help middle and high school students with math and science homework. Enumerates the successes of the tutoring program and highlights other outreach activities of the company in Baton Rouge. Stresses that the future of high-technology companies depends on…

  16. Help for Stressed Students

    ERIC Educational Resources Information Center

    Pope, Denise Clarke; Simon, Richard

    2005-01-01

    The authors argue that increased focus and pressure for high academic achievement, particularly among more highly-motivated and successful students, may have serious negative consequences. They present a number of strategies designed to help reduce both causes and consequences associated with academic stress and improve students' mental and…

  17. Helping Teachers Communicate

    ERIC Educational Resources Information Center

    Kise, Jane; Russell, Beth; Shumate, Carol

    2008-01-01

    Personality type theory describes normal differences in how people are energized, take in information, make decisions, and approach work and life--all key elements in how people teach and learn. Understanding one another's personality type preferences helps teachers share their instructional strategies and classroom information. Type theory…

  18. A Helping Hand.

    ERIC Educational Resources Information Center

    Renner, Jason M.

    2000-01-01

    Discusses how designing a hand washing-friendly environment can help to reduce the spread of germs in school restrooms. Use of electronic faucets, surface risk management, traffic flow, and user-friendly hand washing systems that are convenient and maximally hygienic are examined. (GR)

  19. What Helps Us Learn?

    ERIC Educational Resources Information Center

    Educational Leadership, 2010

    2010-01-01

    This article presents comments of high school students at the Howard Gardner School in Alexandria, Virginia, who were asked, What should teachers know about students to help them learn? Twelve high school students from the Howard Gardner School in Alexandria, Virginia, describe how their best teachers get to know them and thus were more able to…

  20. Ayudele! [Help Him!].

    ERIC Educational Resources Information Center

    Spencer, Maria Gutierrez, Comp.; Almance, Sofia, Comp.

    Written in Spanish and English, the booklet briefly discusses what parents can do to help their child learn at school. The booklet briefly notes the importance of getting enough sleep; eating breakfast; praising the child; developing the five senses; visiting the doctor; having a home and garden; talking, listening, and reading to the child;…

  1. Cleanup Verification Package for the 118-C-1, 105-C Solid Waste Burial Ground

    SciTech Connect

    M. J. Appel and J. M. Capron

    2007-07-25

    This cleanup verification package documents completion of remedial action for the 118-C-1, 105-C Solid Waste Burial Ground. This waste site was the primary burial ground for general wastes from the operation of the 105-C Reactor and received process tubes, aluminum fuel spacers, control rods, reactor hardware, spent nuclear fuel and soft wastes.

  2. 77 FR 50723 - Verification, Validation, Reviews, and Audits for Digital Computer Software Used in Safety...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-08-22

    ... COMMISSION Verification, Validation, Reviews, and Audits for Digital Computer Software Used in Safety Systems... Audits for Digital Computer Software used in Safety Systems of Nuclear Power Plants.'' The DG-1210 is... access publicly available documents online in the NRC Library at...

  3. Gender verification in competitive sports.

    PubMed

    Simpson, J L; Ljungqvist, A; de la Chapelle, A; Ferguson-Smith, M A; Genel, M; Carlson, A S; Ehrhardt, A A; Ferris, E

    1993-11-01

    The possibility that men might masquerade as women and be unfair competitors in women's sports is accepted as outrageous by athletes and the public alike. Since the 1930s, media reports have fuelled claims that individuals who once competed as female athletes subsequently appeared to be men. In most of these cases there was probably ambiguity of the external genitalia, possibly as a result of male pseudohermaphroditism. Nonetheless, beginning at the Rome Olympic Games in 1960, the International Amateur Athletics Federation (IAAF) began establishing rules of eligibility for women athletes. Initially, physical examination was used as a method for gender verification, but this plan was widely resented. Thus, sex chromatin testing (buccal smear) was introduced at the Mexico City Olympic Games in 1968. The principle was that genetic females (46,XX) show a single X-chromatic mass, whereas males (46,XY) do not. Unfortunately, sex chromatin analysis fell out of common diagnostic use by geneticists shortly after the International Olympic Committee (IOC) began its implementation for gender verification. The lack of laboratories routinely performing the test aggravated the problem of errors in interpretation by inexperienced workers, yielding false-positive and false-negative results. However, an even greater problem is that there exist phenotypic females with male sex chromatin patterns (e.g. androgen insensitivity, XY gonadal dysgenesis). These individuals have no athletic advantage as a result of their congenital abnormality and reasonably should not be excluded from competition. That is, only the chromosomal (genetic) sex is analysed by sex chromatin testing, not the anatomical or psychosocial status. For all the above reasons sex chromatin testing unfairly excludes many athletes. Although the IOC offered follow-up physical examinations that could have restored eligibility for those 'failing' sex chromatin tests, most affected athletes seemed to prefer to 'retire'. All

  4. Ninety-four cases of encapsulated follicular variant of papillary thyroid carcinoma: A name change to Noninvasive Follicular Thyroid Neoplasm with Papillary-like Nuclear Features would help prevent overtreatment.

    PubMed

    Thompson, Lester D. R.

    2016-07-01

    Encapsulated follicular variant of papillary thyroid carcinoma is a common thyroid gland cancer, with a highly indolent behavior. Recently, reclassification as a non-malignant neoplasm has been proposed. There is no comprehensive, community hospital based longitudinal evaluation of encapsulated follicular variant of papillary thyroid carcinoma. Ninety-four cases of encapsulated follicular variant of papillary thyroid carcinoma were identified in a review of all thyroid gland surgeries performed in 2002 within the Southern California Permanente Medical Group. All histology slides were reviewed and follow-up obtained. Seventy-five women and nineteen men, aged 20-80 years (mean 45.6 years), had a single (n=61), multiple (same lobe; n=20), or bilateral (n=13) tumor(s), ranging in size from 0.7 to 9.5 cm in diameter (mean 3.3 cm). Histologically, all cases demonstrated a well-formed tumor capsule, with capsular and/or lymphovascular invasion in 17 and no invasion in 77 cases. Lymph node metastases were not identified. The tumors had a follicular architecture, without necrosis or >3 mitoses/10 high-power fields (HPFs). Classical papillary thyroid carcinoma nuclear features were seen in at least three HPFs per 3 mm of tumor diameter, including enlarged, elongated, crowded, and overlapping nuclei, irregular nuclear contours, nuclear grooves, and nuclear chromatin clearing. Lobectomy alone (n=41), thyroidectomy alone (n=34), or completion thyroidectomy (n=19) was the initial treatment combined with post-op radioablative iodine in 25 patients. All patients were without evidence of disease after a median follow-up of 11.8 years. Encapsulated follicular variant of papillary thyroid carcinoma showed benign behavior, supporting conservative surgery alone and reclassification of these tumors to Noninvasive Follicular Thyroid Neoplasm with Papillary-like Nuclear Features (NIFTP).

  5. Search for sanity: The politics of nuclear weapons and disarmament

    SciTech Connect

    Joseph, P.; Rosenblum, S.

    1984-01-01

    This book examines the political aspects of nuclear weapons and arms control. Topics considered include nuclear deterrence, military strategy, the military-industrial complex, the nuclear balance, first strike, nuclear errors and accidents, treaty verification, survival, the economic impact of military spending, Western European peace movements, peace movements in Eastern Europe, the cold war, nuclear diplomacy, moral aspects, the defense budget, national security, foreign policy, proliferation, and nuclear disarmament.

  6. 40 CFR 1065.920 - PEMS calibrations and verifications.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... 40 Protection of Environment 34 2013-07-01 2013-07-01 false PEMS calibrations and verifications....920 PEMS calibrations and verifications. (a) Subsystem calibrations and verifications. Use all the applicable calibrations and verifications in subpart D of this part, including the linearity verifications...

  7. 40 CFR 1065.920 - PEMS calibrations and verifications.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... 40 Protection of Environment 34 2012-07-01 2012-07-01 false PEMS calibrations and verifications....920 PEMS calibrations and verifications. (a) Subsystem calibrations and verifications. Use all the applicable calibrations and verifications in subpart D of this part, including the linearity verifications...

  8. PERFORMANCE VERIFICATION OF STORMWATER TREATMENT DEVICES UNDER EPA'S ENVIRONMENTAL TECHNOLOGY VERIFICATION PROGRAM

    EPA Science Inventory

    The Environmental Technology Verification (ETV) Program was created to facilitate the deployment of innovative or improved environmental technologies through performance verification and dissemination of information. The program's goal is to further environmental protection by a...

  9. PERFORMANCE VERIFICATION OF ANIMAL WATER TREATMENT TECHNOLOGIES THROUGH EPA'S ENVIRONMENTAL TECHNOLOGY VERIFICATION PROGRAM

    EPA Science Inventory

    The U.S. Environmental Protection Agency created the Environmental Technology Verification Program (ETV) to further environmental protection by accelerating the commercialization of new and innovative technology through independent performance verification and dissemination of in...

  10. Magnetic cleanliness verification approach on tethered satellite

    NASA Technical Reports Server (NTRS)

    Messidoro, Piero; Braghin, Massimo; Grande, Maurizio

    1990-01-01

    Magnetic cleanliness testing was performed on the Tethered Satellite as the last step of an articulated verification campaign aimed at demonstrating the capability of the satellite to support its TEMAG (TEthered MAgnetometer) experiment. Tests at unit level and analytical predictions/correlations using a dedicated mathematical model (GANEW program) are also part of the verification activities. Details of the tests are presented, and the results of the verification are described together with recommendations for later programs.

  11. IAEA verification experiment at the Portsmouth Gaseous Diffusion Plant

    SciTech Connect

    Gordon, D.M.; Subudhi, M.; Calvert, O.L.; Bonner, T.N.; Adams, J.G.; Cherry, R.C.; Whiting, N.E.

    1998-08-01

    In April 1996, the United States (US) added the Portsmouth Gaseous Diffusion Plant to the list of facilities eligible for the application of International Atomic Energy Agency (IAEA) safeguards. At that time, the US proposed that the IAEA carry out a Verification Experiment at the plant with respect to the downblending of about 13 metric tons of highly enriched uranium (HEU) in the form of UF₆. This material is part of the 226 metric tons of fissile material that President Clinton has declared to be excess to US national-security needs and which will be permanently withdrawn from the US nuclear stockpile. In September 1997, the IAEA agreed to carry out this experiment, and during the first three weeks of December 1997, the IAEA verified the design information concerning the downblending process. The plant has been subject to short-notice random inspections since December 17, 1997. This paper provides an overview of the Verification Experiment, the monitoring technologies used in the verification approach, and some of the experience gained to date.

  12. Verification and validation guidelines for high integrity systems. Volume 1

    SciTech Connect

    Hecht, H.; Hecht, M.; Dinsmore, G.; Hecht, S.; Tang, D.

    1995-03-01

    High integrity systems include all protective (safety and mitigation) systems for nuclear power plants, and also systems for which comparable reliability requirements exist in other fields, such as in the process industries, in air traffic control, and in patient monitoring and other medical systems. Verification aims at determining that each stage in the software development completely and correctly implements requirements that were established in a preceding phase, while validation determines that the overall performance of a computer system completely and correctly meets system requirements. Volume I of the report reviews existing classifications for high integrity systems and for the types of errors that may be encountered, and makes recommendations for verification and validation procedures, based on assumptions about the environment in which these procedures will be conducted. The final chapter of Volume I deals with a framework for standards in this field. Volume II contains appendices dealing with specific methodologies for system classification, for dependability evaluation, and for two software tools that can automate otherwise very labor intensive verification and validation activities.

  13. Hydrologic data-verification management program plan

    USGS Publications Warehouse

    Alexander, C.W.

    1982-01-01

    Data verification refers to the performance of quality control on hydrologic data that have been retrieved from the field and are being prepared for dissemination to water-data users. Water-data users now have access to computerized data files containing unpublished, unverified hydrologic data. Therefore, it is necessary to develop techniques and systems whereby the computer can perform some data-verification functions before the data are stored in user-accessible files. Computerized data-verification routines can be developed for this purpose. A single, unified concept describing a master data-verification program using multiple special-purpose subroutines, and a screen file containing verification criteria, can probably be adapted to any type and size of computer-processing system. Some traditional manual-verification procedures can be adapted for computerized verification, but new procedures can also be developed that would take advantage of the powerful statistical tools and data-handling procedures available to the computer. Prototype data-verification systems should be developed for all three data-processing environments as soon as possible. The WATSTORE system probably affords the greatest opportunity for long-range research and testing of new verification subroutines. (USGS)
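
    The screen-file idea described in the abstract can be sketched as follows. The parameter names and plausibility limits below are invented for illustration; they are not taken from the USGS plan.

```python
# Hedged sketch of a computerized data-verification routine of the kind the
# report envisions: a master routine applies per-parameter screening criteria
# (a "screen file") before records enter user-accessible files.

SCREEN_FILE = {
    # parameter: (minimum plausible value, maximum plausible value)
    "water_temp_c": (-0.5, 40.0),
    "discharge_cfs": (0.0, 500000.0),
    "ph": (0.0, 14.0),
}

def verify_record(record):
    """Return a list of flags for values that fail their screening criteria."""
    flags = []
    for param, value in record.items():
        if param not in SCREEN_FILE:
            flags.append(f"{param}: no screening criterion on file")
            continue
        lo, hi = SCREEN_FILE[param]
        if not (lo <= value <= hi):
            flags.append(f"{param}: {value} outside [{lo}, {hi}]")
    return flags

record = {"water_temp_c": 18.2, "ph": 15.3}
print(verify_record(record))  # flags the physically impossible pH reading
```

    A production system would add statistical screens (e.g. departures from historical ranges at the same gage) alongside these fixed limits, as the abstract suggests.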

  14. Input apparatus for dynamic signature verification systems

    DOEpatents

    EerNisse, Errol P.; Land, Cecil E.; Snelling, Jay B.

    1978-01-01

    The disclosure relates to signature verification input apparatus comprising a writing instrument and platen containing piezoelectric transducers which generate signals in response to writing pressures.

  15. Guide to good practices for independent verification

    SciTech Connect

    1998-12-01

    This Guide to Good Practices is written to enhance understanding of, and provide direction for, Independent Verification, Chapter X of Department of Energy (DOE) Order 5480.19, Conduct of Operations Requirements for DOE Facilities. The practices in this guide should be considered when planning or reviewing independent verification activities. Contractors are advised to adopt procedures that meet the intent of DOE Order 5480.19. Independent Verification is an element of an effective Conduct of Operations program. The complexity and array of activities performed in DOE facilities dictate the necessity for coordinated independent verification activities to promote safe and efficient operations.

  16. The SeaHorn Verification Framework

    NASA Technical Reports Server (NTRS)

    Gurfinkel, Arie; Kahsai, Temesghen; Komuravelli, Anvesh; Navas, Jorge A.

    2015-01-01

    In this paper, we present SeaHorn, a software verification framework. The key distinguishing feature of SeaHorn is its modular design that separates the concerns of the syntax of the programming language, its operational semantics, and the verification semantics. SeaHorn encompasses several novelties: it (a) encodes verification conditions using an efficient yet precise inter-procedural technique, (b) provides flexibility in the verification semantics to allow different levels of precision, (c) leverages the state-of-the-art in software model checking and abstract interpretation for verification, and (d) uses Horn-clauses as an intermediate language to represent verification conditions which simplifies interfacing with multiple verification tools based on Horn-clauses. SeaHorn provides users with a powerful verification tool and researchers with an extensible and customizable framework for experimenting with new software verification techniques. The effectiveness and scalability of SeaHorn are demonstrated by an extensive experimental evaluation using benchmarks from SV-COMP 2015 and real avionics code.

  17. 40 CFR 1065.550 - Gas analyzer range verification and drift verification.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... Cycles § 1065.550 Gas analyzer range verification and drift verification. (a) Range verification. If an... with a CLD and the removed water is corrected based on measured CO2, CO, THC, and NOX concentrations...-specific emissions over the entire duty cycle for drift. For each constituent to be verified, both sets...

  18. Measurement techniques for the verification of excess weapons materials

    SciTech Connect

    Tape, J.W.; Eccleston, G.W.; Yates, M.A.

    1998-12-01

    The end of the superpower arms race has resulted in an unprecedented reduction in stockpiles of deployed nuclear weapons. Numerous proposals have been put forward and actions have been taken to ensure the irreversibility of nuclear arms reductions, including unilateral initiatives such as those made by President Clinton in September 1993 to place fissile materials no longer needed for a deterrent under international inspection, and bilateral and multilateral measures currently being negotiated. For the technologist, there is a unique opportunity to develop the technical means to monitor nuclear materials that have been declared excess to nuclear weapons programs, to provide confidence that reductions are taking place and that the released materials are not being used again for nuclear explosive programs. However, because of the sensitive nature of these materials, a fundamental conflict exists between the desire to know that the bulk materials or weapon components in fact represent evidence of warhead reductions, and treaty commitments and national laws that require the protection of weapons design information. This conflict presents a unique challenge to technologists. The flow of excess weapons materials, from deployed warheads through storage, disassembly, component storage, conversion to bulk forms, and disposition, will be described in general terms. Measurement approaches based on the detection of passive or induced radiation will be discussed along with the requirement to protect sensitive information from release to unauthorized parties. Possible uses of measurement methods to assist in the verification of arms reductions will be described. The concept of measuring attributes of items rather than quantitative mass-based inventory verification will be discussed along with associated information-barrier concepts required to protect sensitive information.

  19. Conformance Verification of Privacy Policies

    NASA Astrophysics Data System (ADS)

    Fu, Xiang

    Web applications are both the consumers and providers of information. To increase customer confidence, many websites choose to publish their privacy protection policies. However, policy conformance is often neglected. We propose a logic based framework for formally specifying and reasoning about the implementation of privacy protection by a web application. A first order extension of computation tree logic is used to specify a policy. A verification paradigm, built upon a static control/data flow analysis, is presented to verify if a policy is satisfied.
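
    The paper verifies policies with a first-order extension of computation tree logic over a control/data-flow abstraction. This toy version, with an invented flow graph and node names, shows one core question such an analysis asks: can data collected at a source reach a sink the policy forbids?

```python
# Hedged sketch: reachability over a data-flow abstraction of a hypothetical
# web application. This illustrates the flow-analysis idea only; the actual
# framework in the paper uses a first-order CTL extension, not plain BFS.

from collections import deque

def reaches(flow_graph, source, forbidden_sinks):
    """Breadth-first search: True if any forbidden sink is reachable."""
    seen, queue = {source}, deque([source])
    while queue:
        node = queue.popleft()
        if node in forbidden_sinks:
            return True
        for nxt in flow_graph.get(node, []):
            if nxt not in seen:
                seen.add(nxt)
                queue.append(nxt)
    return False

# Data-flow edges of a hypothetical web application.
flow = {
    "email_form": ["session_store"],
    "session_store": ["order_service", "analytics_export"],
    "order_service": [],
}

# Policy: collected e-mail addresses must never flow to third-party exports.
violates_policy = reaches(flow, "email_form", {"analytics_export"})
print(violates_policy)
```

    Temporal logic adds expressiveness beyond reachability (for example, "data is deleted on every path after account closure"), which is why the paper uses CTL rather than a plain graph search.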

  20. Structural dynamics verification facility study

    NASA Technical Reports Server (NTRS)

    Kiraly, L. J.; Hirchbein, M. S.; Mcaleese, J. M.; Fleming, D. P.

    1981-01-01

    The need for a structural dynamics verification facility to support structures programs was studied. Most of the industry operated facilities are used for highly focused research, component development, and problem solving, and are not used for the generic understanding of the coupled dynamic response of major engine subsystems. Capabilities for the proposed facility include: the ability to both excite and measure coupled structural dynamic response of elastic blades on elastic shafting, the mechanical simulation of various dynamical loadings representative of those seen in operating engines, and the measurement of engine dynamic deflections and interface forces caused by alternative engine mounting configurations and compliances.

  1. Approaches to wind resource verification

    NASA Technical Reports Server (NTRS)

    Barchet, W. R.

    1982-01-01

    Verification of the regional wind energy resource assessments produced by the Pacific Northwest Laboratory addresses the question: Is the magnitude of the resource given in the assessments truly representative of the area of interest? Approaches using qualitative indicators of wind speed (tree deformation, eolian features), old and new data of opportunity not at sites specifically chosen for their exposure to the wind, and data by design from locations specifically selected to be good wind sites are described. Data requirements and evaluation procedures for verifying the resource are discussed.

  2. Why do verification and validation?

    DOE PAGESBeta

    Hu, Kenneth T.; Paez, Thomas L.

    2016-02-19

    In this discussion paper, we explore different ways to assess the value of verification and validation (V&V) of engineering models. We first present a literature review on the value of V&V and then use value chains and decision trees to show how value can be assessed from a decision maker's perspective. In this context, the value is what the decision maker is willing to pay for V&V analysis, with the understanding that the V&V results are uncertain. Finally, the 2014 Sandia V&V Challenge Workshop is used to illustrate these ideas.
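
    The decision-tree framing of that willingness to pay can be made concrete with a toy calculation. All probabilities and payoffs below are invented, and the perfect-information assumption only bounds the value of V&V from above.

```python
# Hedged sketch of a decision-tree view of the value of V&V: compare the
# expected payoff of deploying a model blind against deploying only after a
# (here idealized, perfectly informative) V&V assessment. Numbers are invented.

def expected_value(outcomes):
    """Expected payoff of a gamble given (probability, payoff) pairs."""
    return sum(p * v for p, v in outcomes)

# Without V&V: deploy the model blind.
ev_without = expected_value([(0.7, 100.0),    # model adequate: payoff 100
                             (0.3, -200.0)])  # model inadequate: loss 200

# With a perfect V&V assessment: deploy only when the model is adequate,
# otherwise decline (payoff 0).
ev_with = expected_value([(0.7, 100.0),
                          (0.3, 0.0)])

# Upper bound on what the decision maker should pay for the V&V analysis.
value_of_vv = ev_with - ev_without
print(ev_without, ev_with, value_of_vv)
```

    With imperfect V&V results the same tree gains chance nodes for false positives and negatives, which lowers the computed value; that uncertainty is exactly what the paper emphasizes.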

  3. Hailstorms over Switzerland: Verification of Crowd-sourced Data

    NASA Astrophysics Data System (ADS)

    Noti, Pascal-Andreas; Martynov, Andrey; Hering, Alessandro; Martius, Olivia

    2016-04-01

    The reports of smartphone users witnessing hailstorms can be used as a source of independent, ground-based observation data on ground-reaching hailstorms with high temporal and spatial resolution. The presented work focuses on the verification of crowd-sourced data collected over Switzerland with the help of a smartphone application recently developed by MeteoSwiss. The precise location and time of hail precipitation and the hailstone size are included in the crowd-sourced data and assessed on the basis of the weather radar data of MeteoSwiss. Two radar-based hail detection algorithms in use at MeteoSwiss, POH (Probability of Hail) and MESHS (Maximum Expected Severe Hail Size), are compared against the crowd-sourced data. The available data and the investigation period span June to August 2015. Filter criteria have been applied to remove false reports from the crowd-sourced data. Neighborhood methods have been introduced to reduce the uncertainties that result from spatial and temporal biases. The crowd-sourced and radar data are converted into binary sequences according to previously set thresholds, allowing for a categorical verification. Verification scores (e.g. hit rate) are then calculated from a 2x2 contingency table. The hail-reporting activity and the patterns corresponding to "hail" and "no hail" reports sent from smartphones have been analyzed. The relationship between the reported hailstone sizes and both radar-based hail detection algorithms has been investigated.
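
    The categorical scores mentioned in the abstract follow directly from the 2x2 contingency table. The binary sequences below are invented, but the hit-rate and false-alarm-ratio formulas are the standard definitions.

```python
# Hedged sketch of the categorical verification described above: crowd-sourced
# and radar-derived hail occurrences are reduced to binary sequences, tallied
# into a 2x2 contingency table, and summarized by standard scores.

def contingency(observed, forecast):
    """Tally hits, misses, false alarms, and correct negatives."""
    hits = sum(o and f for o, f in zip(observed, forecast))
    misses = sum(o and not f for o, f in zip(observed, forecast))
    false_alarms = sum((not o) and f for o, f in zip(observed, forecast))
    correct_neg = sum((not o) and (not f) for o, f in zip(observed, forecast))
    return hits, misses, false_alarms, correct_neg

def hit_rate(hits, misses):
    """Fraction of observed events that were detected."""
    return hits / (hits + misses)

def false_alarm_ratio(hits, false_alarms):
    """Fraction of detections that had no matching observation."""
    return false_alarms / (hits + false_alarms)

# Toy binary sequences: 1 = hail reported/detected in a grid cell and window.
crowd = [1, 1, 0, 0, 1, 0, 1, 0]  # smartphone reports ("observed")
radar = [1, 0, 0, 1, 1, 0, 1, 0]  # POH/MESHS exceedance ("forecast")

h, m, fa, cn = contingency(crowd, radar)
print(h, m, fa, cn, hit_rate(h, m), false_alarm_ratio(h, fa))
```

    The neighborhood methods mentioned in the abstract change only how the binary sequences are built (matching within a spatial/temporal window), not how the scores are computed from the table.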

  4. Verification of heterogeneous multi-agent system using MCMAS

    NASA Astrophysics Data System (ADS)

    Choi, Jiyoung; Kim, Seungkeun; Tsourdos, Antonios

    2015-03-01

    The focus of the paper is how to model the autonomous behaviours of heterogeneous multi-agent systems so that it can be verified that they will always operate within predefined mission requirements and constraints. This is done by using formal methods, with an abstraction of the behaviours for modelling and model checking for their verification. Three case studies are presented to verify the decision-making behaviours of a heterogeneous multi-agent system using a convoy mission scenario. The multi-agent system is extended across the case studies by gradually increasing the number of agents and the function complexity. For automatic verification, the model checker for multi-agent systems (MCMAS) is adopted due to its novel capability to accommodate the multi-agent system, and it successfully verifies the targeted behaviours of the team-level autonomous systems. The verification results retrospectively help improve the design of the decision-making algorithms by considering additional agents and behaviours during the three steps of scenario modification. Consequently, the last scenario deals with a system composed of a ground control system, two unmanned aerial vehicles, and four unmanned ground vehicles with fault-tolerant and communication-relay capabilities.

  5. Definition of ground test for Large Space Structure (LSS) control verification, appendix G

    NASA Technical Reports Server (NTRS)

    1984-01-01

    A Large Space Structure (LSS) ground test facility was developed to help verify LSS passive and active control theories. The facility also performs: (1) subsystem and component testing; (2) remote sensing and control; (3) parameter estimation and model verification; and (4) evolutionary modeling and control. The program is examined as it stands, and the first experiment to be performed in the laboratory is described.

  6. U.S. EPA Environmental Technology Verification Program, the Founder of the ETV Concept

    EPA Science Inventory

    The U.S. EPA Environmental Technology Verification (ETV) Program develops test protocols and verifies the performance of innovative technologies that have the potential to improve protection of human health and the environment. The program was created in 1995 to help accelerate t...

  7. Helping Your Child through Early Adolescence -- Helping Your Child Series

    MedlinePlus

    ... CHILD'S ACADEMIC SUCCESS Helping Your Child Through Early Adolescence -- Helping Your Child Series PDF (1 MB)

  8. A practical experience with independent verification and validation

    NASA Technical Reports Server (NTRS)

    Page, Gerald; Mcgarry, Frank E.; Card, David N.

    1985-01-01

    One approach to reducing software cost and increasing reliability is the use of an independent verification and validation (IV & V) methodology. The Software Engineering Laboratory (SEL) applied the IV & V methodology to two medium-size flight dynamics software development projects. Then, to measure the effectiveness of the IV & V approach, the SEL compared these two projects with two similar past projects, using measures such as productivity, reliability, and maintainability. Results indicated that the use of the IV & V methodology neither helped the overall process nor improved the product in these cases.

  9. Please Help Your Union

    NASA Astrophysics Data System (ADS)

    Killeen, Tim

    2006-03-01

    The continuing success of AGU relies entirely on the volunteer work of members. A major contribution to these efforts comes from the over 40 committees that plan, oversee, and have operational roles in our meetings, publications, finances, elections, awards, education, public information, and public affairs activities. The names of committees are provided in the accompanying text box; their current membership and descriptions can be found on the Web at the AGU site. One of the most important and challenging tasks of the incoming AGU President is to reestablish these committees by appointing hundreds of volunteers. I now solicit your help in staffing these committees. Ideally, participation in these important committees will reflect the overall membership and perspectives of AGU members, so please do consider volunteering yourself. Of course, nominations of others would also be very welcome. I am particularly interested in making sure that the gender balance, age, and geographic representation are appropriate and reflect our changing demographics. Any suggestions you might have will be more helpful if accompanied by a few sentences of background information relevant to the particular committee.

  10. Container Verification Using Optically Stimulated Luminescence

    SciTech Connect

    Tanner, Jennifer E.; Miller, Steven D.; Conrady, Matthew M.; Simmons, Kevin L.; Tinker, Michael R.

    2008-10-01

    Containment verification is a high priority for safeguards containment and surveillance. Nuclear material containers, safeguards equipment cabinets, camera housings, and detector cable conduit are all vulnerable to tampering. Even with a high security seal on a lid or door, custom-built hinges and interfaces, and special colors and types of finishes, the surfaces of enclosures can be tampered with and any penetrations repaired and covered over. With today’s technology, these repairs would not be detected during a simple visual inspection. Several suggested solutions have been to develop complicated networks of wires, fiber-optic cables, lasers or other sensors that line the inside of a container and alarm when the network is disturbed. This results in an active system with real time evidence of tampering but is probably not practical for most safeguards applications. A more practical solution would be to use a passive approach where an additional security feature was added to surfaces which would consist of a special coating or paint applied to the container or enclosure. One type of coating would incorporate optically stimulated luminescent (OSL) material. OSL materials are phosphors that luminesce in proportion to the ionizing radiation dose when stimulated with the appropriate optical wavelengths. The OSL fluoresces at a very specific wavelength when illuminated at another, very specific wavelength. The presence of the pre-irradiated OSL material in the coating is confirmed using a device that interrogates the surface of the enclosure using the appropriate optical wavelength and then reads the resulting luminescence. The presence of the OSL indicates that the integrity of the surface is intact. The coating itself could be transparent which would allow the appearance of the container to remain unchanged or the OSL material could be incorporated into certain paints or epoxies used on various types of containers. The coating could be applied during manufacturing

  11. Acoustic techniques in nuclear safeguards

    SciTech Connect

    Olinger, C.T.; Sinha, D.N.

    1995-07-01

    Acoustic techniques can be employed to address many questions relevant to current nuclear technology needs. These include establishing and monitoring intrinsic tags and seals, locating holdup in areas where conventional radiation-based measurements have limited capability, process monitoring, monitoring containers for corrosion or changes in pressure, and facility design verification. These acoustics applications are in their infancy with respect to safeguards and nuclear material management, but proof-of-principle has been demonstrated in many of the areas listed.

  12. Cognitive Bias in Systems Verification

    NASA Technical Reports Server (NTRS)

    Larson, Steve

    2012-01-01

    Working definition of cognitive bias: patterns by which information is sought and interpreted that can lead to systematic errors in decisions. Cognitive bias is studied in diverse fields: economics, politics, intelligence, and marketing, to name a few. Attempts to ground cognitive science in physical characteristics of the cognitive apparatus exceed our knowledge. Studies are based on correlations; strict cause and effect is difficult to pinpoint. Effects cited in the paper and discussed here have been replicated many times over and appear sound. Many biases have been described, but it is still unclear whether they are all distinct; there may be only a handful of fundamental biases, which manifest in various ways. Bias can affect system verification in many ways: overconfidence -> questionable decisions to deploy; availability -> inability to conceive critical tests; representativeness -> overinterpretation of results; positive test strategies -> confirmation bias. Debiasing at the individual level is very difficult. The potential effect of bias on the verification process can be managed, but not eliminated, and is worth considering at key points in the process.

  13. Runtime Verification with State Estimation

    NASA Technical Reports Server (NTRS)

    Stoller, Scott D.; Bartocci, Ezio; Seyster, Justin; Grosu, Radu; Havelund, Klaus; Smolka, Scott A.; Zadok, Erez

    2011-01-01

    We introduce the concept of Runtime Verification with State Estimation and show how this concept can be applied to estimate the probability that a temporal property is satisfied by a run of a program when monitoring overhead is reduced by sampling. In such situations, there may be gaps in the observed program executions, thus making accurate estimation challenging. To deal with the effects of sampling on runtime verification, we view event sequences as observation sequences of a Hidden Markov Model (HMM), use an HMM model of the monitored program to "fill in" sampling-induced gaps in observation sequences, and extend the classic forward algorithm for HMM state estimation (which determines the probability of a state sequence, given an observation sequence) to compute the probability that the property is satisfied by an execution of the program. To validate our approach, we present a case study based on the mission software for a Mars rover. The results of our case study demonstrate high prediction accuracy for the probabilities computed by our algorithm. They also show that our technique is much more accurate than simply evaluating the temporal property on the given observation sequences, ignoring the gaps.
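    The classic forward algorithm that this entry extends can be stated compactly. The two-state HMM below (transition matrix `A`, emission matrix `B`, initial distribution `pi`) uses made-up numbers purely to show the recurrence, not any model from the paper:

    ```python
    import numpy as np

    # Illustrative 2-state HMM; all probabilities are invented for the sketch.
    A = np.array([[0.9, 0.1],    # state-transition probabilities
                  [0.2, 0.8]])
    B = np.array([[0.7, 0.3],    # emission probabilities: P(obs | state)
                  [0.1, 0.9]])
    pi = np.array([0.5, 0.5])    # initial state distribution

    def forward(obs):
        """Forward algorithm: P(observation sequence) under the HMM."""
        alpha = pi * B[:, obs[0]]           # initialize with first observation
        for o in obs[1:]:
            alpha = (alpha @ A) * B[:, o]   # propagate, then weight by emission
        return alpha.sum()

    print(forward([0, 0, 1]))
    ```

    The paper's extension additionally tracks, alongside each `alpha`, the probability that the monitored temporal property holds given the states consistent with the gapped observations.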

  14. A Verification Method for MASOES.

    PubMed

    Perozo, N; Aguilar Perozo, J; Terán, O; Molina, H

    2013-02-01

    MASOES is a 3-agent architecture for designing and modeling self-organizing and emergent systems. This architecture describes the elements, relationships, and mechanisms, both at the individual and the collective levels, that favor the analysis of the self-organizing and emergent phenomenon without mathematically modeling the system. In this paper, a method is proposed for verifying MASOES from the point of view of design in order to study the self-organizing and emergent behaviors of the modeled systems. The verification criteria are set according to what is proposed in MASOES for modeling self-organizing and emergent systems and the principles of the wisdom-of-crowds paradigm and fuzzy cognitive map (FCM) theory. The verification method for MASOES has been implemented in a tool called FCM Designer and has been tested by modeling a community of free software developers that works in the bazaar style, as well as a Wikipedia community, in order to study their behavior and determine their self-organizing and emergent capacities.

  15. Video-based fingerprint verification.

    PubMed

    Qin, Wei; Yin, Yilong; Liu, Lili

    2013-09-04

    Conventional fingerprint verification systems use only static information. In this paper, fingerprint videos, which contain dynamic information, are utilized for verification. Fingerprint videos are acquired by the same capture device that acquires conventional fingerprint images, and the user experience of providing a fingerprint video is the same as that of providing a single impression. After preprocessing and aligning processes, "inside similarity" and "outside similarity" are defined and calculated to take advantage of both dynamic and static information contained in fingerprint videos. Match scores between two matching fingerprint videos are then calculated by combining the two kinds of similarity. Experimental results show that the proposed video-based method leads to a relative reduction of 60 percent in the equal error rate (EER) in comparison to the conventional single impression-based method. We also analyze the time complexity of our method when different combinations of strategies are used. Our method still outperforms the conventional method, even if both methods have the same time complexity. Finally, experimental results demonstrate that the proposed video-based method can lead to better accuracy than the multiple impressions fusion method, and the proposed method has a much lower false acceptance rate (FAR) when the false rejection rate (FRR) is quite low.

  16. Video-Based Fingerprint Verification

    PubMed Central

    Qin, Wei; Yin, Yilong; Liu, Lili

    2013-01-01

    Conventional fingerprint verification systems use only static information. In this paper, fingerprint videos, which contain dynamic information, are utilized for verification. Fingerprint videos are acquired by the same capture device that acquires conventional fingerprint images, and the user experience of providing a fingerprint video is the same as that of providing a single impression. After preprocessing and aligning processes, “inside similarity” and “outside similarity” are defined and calculated to take advantage of both dynamic and static information contained in fingerprint videos. Match scores between two matching fingerprint videos are then calculated by combining the two kinds of similarity. Experimental results show that the proposed video-based method leads to a relative reduction of 60 percent in the equal error rate (EER) in comparison to the conventional single impression-based method. We also analyze the time complexity of our method when different combinations of strategies are used. Our method still outperforms the conventional method, even if both methods have the same time complexity. Finally, experimental results demonstrate that the proposed video-based method can lead to better accuracy than the multiple impressions fusion method, and the proposed method has a much lower false acceptance rate (FAR) when the false rejection rate (FRR) is quite low. PMID:24008283
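    The entries above combine an "inside similarity" and an "outside similarity" into one match score. The convex-combination fusion rule and the weight below are assumptions for illustration, not the authors' exact combination method:

    ```python
    # Hypothetical fusion of the two similarities defined in the paper
    # ("inside similarity" and "outside similarity") into one match score.
    # The weighted-sum rule and the weight w are illustrative assumptions.

    def match_score(inside_sim, outside_sim, w=0.6):
        """Convex combination of two similarity scores in [0, 1]."""
        return w * inside_sim + (1.0 - w) * outside_sim

    # A verification decision then thresholds the fused score.
    genuine = match_score(0.92, 0.85)    # same finger, two videos
    impostor = match_score(0.30, 0.42)   # different fingers
    THRESHOLD = 0.5
    print(genuine >= THRESHOLD, impostor >= THRESHOLD)  # → True False
    ```

    Sweeping `THRESHOLD` over the score range is what traces out the FAR/FRR trade-off and the equal error rate the abstract reports.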

  17. GENERIC VERIFICATION PROTOCOL FOR THE VERIFICATION OF PESTICIDE SPRAY DRIFT REDUCTION TECHNOLOGIES FOR ROW AND FIELD CROPS

    EPA Science Inventory

    This ETV program generic verification protocol was prepared and reviewed for the Verification of Pesticide Drift Reduction Technologies project. The protocol provides a detailed methodology for conducting and reporting results from a verification test of pesticide drift reductio...

  18. Students' Verification Strategies for Combinatorial Problems

    ERIC Educational Resources Information Center

    Mashiach Eizenberg, Michal; Zaslavsky, Orit

    2004-01-01

    We focus on a major difficulty in solving combinatorial problems, namely, on the verification of a solution. Our study aimed at identifying undergraduate students' tendencies to verify their solutions, and the verification strategies that they employ when solving these problems. In addition, an attempt was made to evaluate the level of efficiency…

  19. IMPROVING AIR QUALITY THROUGH ENVIRONMENTAL TECHNOLOGY VERIFICATIONS

    EPA Science Inventory

    The U.S. Environmental Protection Agency (EPA) began the Environmental Technology Verification (ETV) Program in 1995 as a means of working with the private sector to establish a market-based verification process available to all environmental technologies. Under EPA's Office of R...

  20. 14 CFR 460.17 - Verification program.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... TRANSPORTATION LICENSING HUMAN SPACE FLIGHT REQUIREMENTS Launch and Reentry with Crew § 460.17 Verification... software in an operational flight environment before allowing any space flight participant on board during... 14 Aeronautics and Space 4 2014-01-01 2014-01-01 false Verification program. 460.17 Section...

  1. ENVIRONMENTAL TECHNOLOGY VERIFICATION FOR INDOOR AIR PRODUCTS

    EPA Science Inventory

    The paper discusses environmental technology verification (ETV) for indoor air products. RTI is developing the framework for a verification testing program for indoor air products, as part of EPA's ETV program. RTI is establishing test protocols for products that fit into three...

  2. 24 CFR 4001.112 - Income verification.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... 24 Housing and Urban Development 5 2010-04-01 2010-04-01 false Income verification. 4001.112... Requirements and Underwriting Procedures § 4001.112 Income verification. The mortgagee shall use FHA's procedures to verify the mortgagor's income and shall comply with the following additional requirements:...

  3. ENVIRONMENTAL TECHNOLOGY VERIFICATION AND INDOOR AIR

    EPA Science Inventory

    The paper discusses environmental technology verification and indoor air. RTI has responsibility for a pilot program for indoor air products as part of the U.S. EPA's Environmental Technology Verification (ETV) program. The program objective is to further the development of sel...

  4. 29 CFR 1903.19 - Abatement verification.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... 29 Labor 5 2014-07-01 2014-07-01 false Abatement verification. 1903.19 Section 1903.19 Labor Regulations Relating to Labor (Continued) OCCUPATIONAL SAFETY AND HEALTH ADMINISTRATION, DEPARTMENT OF LABOR INSPECTIONS, CITATIONS AND PROPOSED PENALTIES § 1903.19 Abatement verification. Purpose. OSHA's inspections are intended to result in...

  5. Gender Verification of Female Olympic Athletes.

    ERIC Educational Resources Information Center

    Dickinson, Barry D.; Genel, Myron; Robinowitz, Carolyn B.; Turner, Patricia L.; Woods, Gary L.

    2002-01-01

    Gender verification of female athletes has long been criticized by geneticists, endocrinologists, and others in the medical community. Recently, the International Olympic Committee's Athletic Commission called for discontinuation of mandatory laboratory-based gender verification of female athletes. This article discusses normal sexual…

  6. Guidelines for qualifying cleaning and verification materials

    NASA Technical Reports Server (NTRS)

    Webb, D.

    1995-01-01

    This document is intended to provide guidance in identifying technical issues which must be addressed in a comprehensive qualification plan for materials used in cleaning and cleanliness verification processes. Information presented herein is intended to facilitate development of a definitive checklist that should address all pertinent materials issues when down selecting a cleaning/verification media.

  7. CFE verification: The decision to inspect

    SciTech Connect

    Allentuck, J.

    1990-01-01

    Verification of compliance with the provisions of the treaty on Conventional Forces-Europe (CFE) is subject to inspection quotas of various kinds. Thus the decision to carry out a specific inspection or verification activity must be prudently made. This decision process is outlined, and means for "conserving quotas" are suggested. 4 refs., 1 fig.

  8. Scope and verification of a Fissile Material (Cutoff) Treaty

    SciTech Connect

    Hippel, Frank N. von

    2014-05-09

    A Fissile Material Cutoff Treaty (FMCT) would ban the production of fissile material - in practice highly-enriched uranium and separated plutonium - for weapons. It has been supported by strong majorities in the United Nations. After it comes into force, newly produced fissile materials could only be produced under international - most likely International Atomic Energy Agency - monitoring. Many non-weapon states argue that the treaty should also place under safeguards pre-existing stocks of fissile material in civilian use or declared excess for weapons so as to make nuclear-weapons reductions irreversible. This paper discusses the scope of the FMCT, the ability to detect clandestine production and verification challenges in the nuclear-weapons states.

  9. Scope and verification of a Fissile Material (Cutoff) Treaty

    NASA Astrophysics Data System (ADS)

    von Hippel, Frank N.

    2014-05-01

    A Fissile Material Cutoff Treaty (FMCT) would ban the production of fissile material - in practice highly-enriched uranium and separated plutonium - for weapons. It has been supported by strong majorities in the United Nations. After it comes into force, newly produced fissile materials could only be produced under international - most likely International Atomic Energy Agency - monitoring. Many non-weapon states argue that the treaty should also place under safeguards pre-existing stocks of fissile material in civilian use or declared excess for weapons so as to make nuclear-weapons reductions irreversible. This paper discusses the scope of the FMCT, the ability to detect clandestine production and verification challenges in the nuclear-weapons states.

  10. New method of verifying optical flat flatness

    NASA Astrophysics Data System (ADS)

    Sun, Hao; Li, Xueyuan; Han, Sen; Zhu, Jianrong; Guo, Zhenglai; Fu, Yuegang

    2014-11-01

    Optical flats are commonly used in optical testing instruments, and flatness is their most important form-error parameter. As a measurement standard, the optical flat flatness (OFF) index needs good precision. Current measurement in China depends heavily on artificial visual interpretation, characterizing flatness through discrete points. The efficiency and accuracy of this method cannot meet the demands of industrial development. To improve testing efficiency and measurement accuracy, it is necessary to develop an optical flat verification system that can obtain full surface information rapidly and efficiently while complying with current national metrological verification procedures. This paper reviews the current optical flat verification method and solves the problems of previous tests by using a new method and its supporting software. Final results show that the new system improves verification efficiency and accuracy, by comparison with the JJG 28-2000 metrological verification procedure.

  11. Working memory mechanism in proportional quantifier verification.

    PubMed

    Zajenkowski, Marcin; Szymanik, Jakub; Garraffa, Maria

    2014-12-01

    The paper explores the cognitive mechanisms involved in the verification of sentences with proportional quantifiers (e.g., "More than half of the dots are blue"). The first study shows that the verification of proportional sentences is more demanding than the verification of sentences such as "There are seven blue and eight yellow dots". The second study reveals that both types of sentences are correlated with memory storage; however, only proportional sentences are associated with cognitive control. This result suggests that the cognitive mechanism underlying the verification of proportional quantifiers is crucially related to the integration process, in which an individual has to compare in memory the cardinalities of two sets. In the third study we find that the numerical distance between the two cardinalities that must be compared significantly influences verification time and accuracy. The results of our studies are discussed in the broader context of processing complex sentences. PMID:24374596

  12. Space transportation system payload interface verification

    NASA Technical Reports Server (NTRS)

    Everline, R. T.

    1977-01-01

    The paper considers STS payload-interface verification requirements and the capability provided by STS to support verification. The intent is to standardize as many interfaces as possible, not only through the design, development, test and evaluation (DDT and E) phase of the major payload carriers but also into the operational phase. The verification process is discussed in terms of its various elements, such as the Space Shuttle DDT and E (including the orbital flight test program) and the major payload carriers DDT and E (including the first flights). Five tools derived from the Space Shuttle DDT and E are available to support the verification process: mathematical (structural and thermal) models, the Shuttle Avionics Integration Laboratory, the Shuttle Manipulator Development Facility, and interface-verification equipment (cargo-integration test equipment).

  13. Neutron spectrometry for UF6 enrichment verification in storage cylinders

    DOE PAGES Beta

    Mengesha, Wondwosen; Kiff, Scott D.

    2015-01-29

    Verification of declared UF6 enrichment and mass in storage cylinders is of great interest in nuclear material nonproliferation. Nondestructive assay (NDA) techniques are commonly used for safeguards inspections to ensure accountancy of declared nuclear materials. Common NDA techniques used include gamma-ray spectrometry and both passive and active neutron measurements. In the present study, neutron spectrometry was investigated for verification of UF6 enrichment in 30B storage cylinders based on an unattended and passive measurement approach. MCNP5 and Geant4 simulated neutron spectra, for selected UF6 enrichments and filling profiles, were used in the investigation. The simulated neutron spectra were analyzed using principal component analysis (PCA). The PCA technique is a well-established technique and has a wide area of application including feature analysis, outlier detection, and gamma-ray spectral analysis. Results obtained demonstrate that neutron spectrometry supported by spectral feature analysis has potential for assaying UF6 enrichment in storage cylinders. The results from the present study also showed that difficulties associated with the UF6 filling profile and observed in other unattended passive neutron measurements can possibly be overcome using the approach presented.
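    The spectral feature analysis step (PCA of simulated spectra) can be sketched as follows. The synthetic "spectra" are Gaussian peaks whose position shifts with an enrichment-like parameter; they merely stand in for the MCNP5/Geant4 output, and every number here is illustrative:

    ```python
    import numpy as np

    # Illustrative PCA of a family of simulated spectra whose shape varies
    # with a single parameter (an enrichment proxy). Nothing here is from
    # the study; the spectra are synthetic.
    rng = np.random.default_rng(0)
    energy = np.linspace(0.0, 10.0, 64)          # arbitrary energy grid

    def spectrum(shift):
        return np.exp(-0.5 * (energy - 3.0 - shift) ** 2)

    shifts = np.linspace(0.0, 2.0, 20)           # 20 simulated "enrichments"
    X = np.array([spectrum(s) + 0.01 * rng.standard_normal(energy.size)
                  for s in shifts])

    # PCA via SVD of the mean-centered data matrix.
    Xc = X - X.mean(axis=0)
    U, S, Vt = np.linalg.svd(Xc, full_matrices=False)
    scores = Xc @ Vt[:2].T                       # projection onto first 2 PCs
    explained = S**2 / (S**2).sum()              # variance fraction per PC

    print(scores.shape, explained[0] > 0.5)
    ```

    The point of the projection is that the low-dimensional scores track the parameter of interest, so an unknown measured spectrum can be placed against the simulated family.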

  14. Ionoacoustics: A new direct method for range verification

    NASA Astrophysics Data System (ADS)

    Parodi, Katia; Assmann, Walter

    2015-05-01

    The superior ballistic properties of ion beams may offer improved tumor-dose conformality and unprecedented sparing of organs at risk in comparison to other radiation modalities in external radiotherapy. However, these advantages come at the expense of increased sensitivity to uncertainties in the actual treatment delivery, resulting from inaccuracies of patient positioning, physiological motion and uncertainties in the knowledge of the ion range in living tissue. In particular, the dosimetric selectivity of ion beams depends on the longitudinal location of the Bragg peak, making in vivo knowledge of the actual beam range the greatest challenge to full clinical exploitation of ion therapy. Nowadays, in vivo range verification techniques, which are already, or close to, being investigated in clinical practice, rely on the detection of the secondary annihilation photons or prompt gammas resulting from nuclear interaction of the primary ion beam with the irradiated tissue. Despite the initial promising results, these methods rely on a not straightforward correlation between nuclear and electromagnetic processes, and typically require massive and costly instrumentation. On the contrary, the long-known, yet only recently revisited, process of "ionoacoustics", which is generated by local tissue heating especially at the Bragg peak, may offer a more direct approach to in vivo range verification, as reviewed here.

  15. Nuclear astrophysics

    NASA Astrophysics Data System (ADS)

    Arnould, M.; Takahashi, K.

    1999-03-01

    Nuclear astrophysics is that branch of astrophysics which helps understanding of the Universe, or at least some of its many faces, through the knowledge of the microcosm of the atomic nucleus. It attempts to find as many nuclear physics imprints as possible in the macrocosm, and to decipher what those messages are telling us about the varied constituent objects in the Universe at present and in the past. In the last decades much advance has been made in nuclear astrophysics thanks to the sometimes spectacular progress made in the modelling of the structure and evolution of the stars, in the quality and diversity of the astronomical observations, as well as in the experimental and theoretical understanding of the atomic nucleus and of its spontaneous or induced transformations. Developments in other subfields of physics and chemistry have also contributed to that advance. Notwithstanding the accomplishment, many long-standing problems remain to be solved, and the theoretical understanding of a large variety of observational facts needs to be put on safer grounds. In addition, new questions are continuously emerging, and new facts endangering old ideas. This review shows that astrophysics has been, and still is, highly demanding to nuclear physics in both its experimental and theoretical components. On top of the fact that large varieties of nuclei have to be dealt with, these nuclei are immersed in highly unusual environments which may have a significant impact on their static properties, the diversity of their transmutation modes, and on the probabilities of these modes. In order to have a chance of solving some of the problems nuclear astrophysics is facing, the astrophysicists and nuclear physicists are obviously bound to put their competence in common, and have sometimes to benefit from the help of other fields of physics, like particle physics, plasma physics or solid-state physics. 
Given the highly varied and complex aspects, we pick here some specific nuclear

  16. Verifying a nuclear weapon's response to radiation environments

    SciTech Connect

    Dean, F.F.; Barrett, W.H.

    1998-05-01

    The process described in the paper is being applied as part of the design verification of a replacement component designed for a nuclear weapon currently in the active stockpile. This process is an adaptation of the process successfully used in nuclear weapon development programs. The verification process concentrates on evaluating system response to radiation environments, verifying system performance during and after exposure to radiation environments, and assessing system survivability.

  17. Verification of NASA Emergent Systems

    NASA Technical Reports Server (NTRS)

    Rouff, Christopher; Vanderbilt, Amy K. C. S.; Truszkowski, Walt; Rash, James; Hinchey, Mike

    2004-01-01

    NASA is studying advanced technologies for a future robotic exploration mission to the asteroid belt. This mission, the prospective ANTS (Autonomous Nano Technology Swarm) mission, will comprise 1,000 autonomous robotic agents designed to cooperate in asteroid exploration. The emergent properties of swarm-type missions make them powerful, but at the same time more difficult to design and to assure that the proper behaviors will emerge. We are currently investigating formal methods and techniques for verification and validation of future swarm-based missions. The advantage of using formal methods is their ability to mathematically assure the behavior of a swarm, emergent or otherwise. The ANTS mission is being used as an example and case study of swarm-based missions with which to experiment and to test current formal methods on intelligent swarms. Using the ANTS mission, we have evaluated multiple formal methods to determine their effectiveness in modeling and assuring swarm behavior.

  18. MCFC power plant system verification

    SciTech Connect

    Farooque, M.; Bernard, R.; Doyon, J.; Paetsch, L.; Patel, P.; Skok, A.; Yuh, C.

    1993-11-01

    In pursuit of commercialization, efforts are underway to: (1) advance the technology base by enhancing performance and demonstrating endurance, (2) scale up the stack to full area and height, (3) acquire stack manufacturing capability and experience, (4) establish capability and gain experience for power plant system testing of the full-height carbonate fuel cell stack, and (5) define the power plant design and develop critical subsystem components. All the major project objectives have already been attained. Over the last year, significant progress has been achieved in establishing the full-height stack design, gaining stack manufacturing and system-integrated testing experience, and verifying the major equipment design in power plant system tests. In this paper, recent progress on stack scaleup, demonstration testing, BOP verification, and stack endurance is presented.

  19. Retail applications of signature verification

    NASA Astrophysics Data System (ADS)

    Zimmerman, Thomas G.; Russell, Gregory F.; Heilper, Andre; Smith, Barton A.; Hu, Jianying; Markman, Dmitry; Graham, Jon E.; Drews, Clemens

    2004-08-01

    The dramatic rise in identity theft, the ever-pressing need to provide convenient checkout services to attract and retain loyal customers, and the growing use of multi-function signature capture devices in the retail sector provide favorable conditions for the deployment of dynamic signature verification (DSV) in retail settings. We report on the development of a DSV system to meet the needs of the retail sector. We currently have a database of approximately 10,000 signatures collected from 600 subjects and forgers. Previous work at IBM on DSV has been merged and extended to achieve robust performance on pen position data available from commercial point-of-sale hardware, achieving equal error rates on skilled forgeries and authentic signatures of 1.5% to 4%.

  20. Requirements, Verification, and Compliance (RVC) Database Tool

    NASA Technical Reports Server (NTRS)

    Rainwater, Neil E., II; McDuffee, Patrick B.; Thomas, L. Dale

    2001-01-01

    This paper describes the development, design, and implementation of the Requirements, Verification, and Compliance (RVC) database used on the International Space Welding Experiment (ISWE) project managed at Marshall Space Flight Center. The RVC is a systems engineer's tool for automating and managing the following information: requirements; requirements traceability; verification requirements; verification planning; verification success criteria; and compliance status. This information normally contained within documents (e.g. specifications, plans) is contained in an electronic database that allows the project team members to access, query, and status the requirements, verification, and compliance information from their individual desktop computers. Using commercial-off-the-shelf (COTS) database software that contains networking capabilities, the RVC was developed not only with cost savings in mind but primarily for the purpose of providing a more efficient and effective automated method of maintaining and distributing the systems engineering information. In addition, the RVC approach provides the systems engineer the capability to develop and tailor various reports containing the requirements, verification, and compliance information that meets the needs of the project team members. The automated approach of the RVC for capturing and distributing the information improves the productivity of the systems engineer by allowing that person to concentrate more on the job of developing good requirements and verification programs and not on the effort of being a "document developer".

  1. Safeguards Guidance Document for Designers of Commercial Nuclear Facilities: International Nuclear Safeguards Requirements and Practices For Uranium Enrichment Plants

    SciTech Connect

    Robert Bean; Casey Durst

    2009-10-01

    This report is the second in a series of guidelines on international safeguards requirements and practices, prepared expressly for the designers of nuclear facilities. The first document in this series is the description of generic international nuclear safeguards requirements pertaining to all types of facilities. These requirements should be understood and considered at the earliest stages of facility design as part of a new process called “Safeguards-by-Design.” This will help eliminate the costly retrofit of facilities that has occurred in the past to accommodate nuclear safeguards verification activities. The following summarizes the requirements for international nuclear safeguards implementation at enrichment plants, prepared under the Safeguards-by-Design project and funded by the U.S. Department of Energy (DOE) National Nuclear Security Administration (NNSA), Office of NA-243. The purpose of this guide is to provide designers of nuclear facilities around the world with a simplified set of design requirements and the most common practices for meeting them. The foundation for these requirements is the international safeguards agreement between the country and the International Atomic Energy Agency (IAEA), pursuant to the Treaty on the Non-proliferation of Nuclear Weapons (NPT). Relevant safeguards requirements are also cited from the Safeguards Criteria for inspecting enrichment plants, found in the IAEA Safeguards Manual, Part SMC-8. IAEA definitions and terms are based on the IAEA Safeguards Glossary, published in 2002. The most current specification for safeguards measurement accuracy is found in the IAEA document STR-327, “International Target Values 2000 for Measurement Uncertainties in Safeguarding Nuclear Materials,” published in 2001. To make this guide easier for the designer to use, the requirements have been restated in plainer language per expert interpretation of the source documents noted. The safeguards agreement is fundamentally a

  2. Why do ineffective treatments seem helpful? A brief review

    PubMed Central

    Hartman, Steve E

    2009-01-01

    After any therapy, when symptoms improve, healthcare providers (and patients) are tempted to award credit to treatment. Over time, a particular treatment can seem so undeniably helpful that scientific verification of efficacy is judged an inconvenient waste of time and resources. Unfortunately, practitioners' accumulated, day-to-day, informal impressions of diagnostic reliability and clinical efficacy are of limited value. To help clarify why even treatments entirely lacking in direct effect can seem helpful, I will explain why real signs and symptoms often improve, independent of treatment. Then, I will detail quirks of human perception, interpretation, and memory that often make symptoms seem improved, when they are not. I conclude that healthcare will grow to full potential only when judgments of clinical efficacy routinely are based in properly scientific, placebo-controlled, outcome analysis. PMID:19822008

  3. Enhanced Cancelable Biometrics for Online Signature Verification

    NASA Astrophysics Data System (ADS)

    Muramatsu, Daigo; Inuma, Manabu; Shikata, Junji; Otsuka, Akira

    Cancelable approaches for biometric person authentication have been studied to protect enrolled biometric data, and several algorithms have been proposed. One drawback of cancelable approaches is that the performance is inferior to that of non-cancelable approaches. In this paper, we propose a scheme to improve the performance of a cancelable approach for online signature verification. Our scheme generates two cancelable dataset from one raw dataset and uses them for verification. Preliminary experiments were performed using a distance-based online signature verification algorithm. The experimental results show that our proposed scheme is promising.

  4. Liquefied Natural Gas (LNG) dispenser verification device

    NASA Astrophysics Data System (ADS)

    Xiong, Maotao; Yang, Jie-bin; Zhao, Pu-jun; Yu, Bo; Deng, Wan-quan

    2013-01-01

    The composition, working principle, and calibration status of LNG (Liquefied Natural Gas) dispensers in China are introduced. To address the shortcomings of the weighing method for calibrating LNG dispensers, an LNG dispenser verification device has been developed. The device is based on the master-meter method and verifies LNG dispensers in the field. Experimental results indicate that the device has stable performance, a high accuracy level and a flexible construction, reaching an internationally advanced level. The verification device will therefore promote the development of the LNG dispenser industry in China and improve the technical level of LNG dispenser manufacture.

  5. We Want to Help!

    NASA Astrophysics Data System (ADS)

    Trombino, D. F.

    1997-05-01

    The D.M.S.O. is the only full-time optical solar observatory in the Sunshine State. Its instruments are made available on a NO CHARGE basis to deserving Central Florida amateur astronomers, undergraduate students at nearby Stetson University, and other local colleges. We are privately owned and completely independent of federal funding. Our research is supported through small individual and corporate donations (mainly equipment) and the voluntary manpower of trained amateur solar observers. The D.M.S.O. is an affiliate of the Museum of Art & Sciences, Daytona Beach, and maintains ties with the Department of Physics and Computer Sciences at Stetson University in DeLand. Daily solar observations are made in white-light, H-alpha and calcium II K-line wavelengths using University-grade Day-Star filters in conjunction with a long-focus 15 cm refractor and two 12 cm refractors. Particular attention is given to the morphology of sunspots, plage, flares, prominences and other features. Our results are reported to the Solar Sections of the B.A.A. (England), A.L.P.O. (U.S.A.) and SONNE (Germany). Our East Coast location enables us to record photospheric and chromospheric activity well in advance of West Coast observatories: an obvious advantage. Numerous lakes at our site provide us with exceptional seeing -- at times approaching one arc/sec. Future plans call for high-resolution CCD/video solar patrol monitoring, simultaneously in three wavelengths at 15-second intervals. This and other projects, including establishing a web page, will be undertaken in cooperation with the Computer Sciences Institute at Stetson University. We do not have all the answers, but we may have the solution to your problems. We welcome the opportunity to discuss your needs and our future cooperative goals. We need each other and we want to help! Please leave your card and feel free to contact me at the above address. Our E-mail address is NLTsolar@aol.com, if you prefer.

  6. Help Helps, but Only so Much: Research on Help Seeking with Intelligent Tutoring Systems

    ERIC Educational Resources Information Center

    Aleven, Vincent; Roll, Ido; McLaren, Bruce M.; Koedinger, Kenneth R.

    2016-01-01

    Help seeking is an important process in self-regulated learning (SRL). It may influence learning with intelligent tutoring systems (ITSs), because many ITSs provide help, often at the student's request. The Help Tutor was a tutor agent that gave in-context, real-time feedback on students' help-seeking behavior, as they were learning with an ITS.…

  7. Groundwater flow code verification "benchmarking" activity (COVE-2A): Analysis of participants' work

    SciTech Connect

    Dykhuizen, R.C.; Barnard, R.W.

    1992-02-01

    The Nuclear Waste Repository Technology Department at Sandia National Laboratories (SNL) is investigating the suitability of Yucca Mountain as a potential site for underground burial of nuclear wastes. One element of the investigations is to assess the potential long-term effects of groundwater flow on the integrity of a potential repository. A number of computer codes are being used to model groundwater flow through geologic media in which the potential repository would be located. These codes compute numerical solutions for problems that are usually analytically intractable. Consequently, independent confirmation of the correctness of the solution is often not possible. Code verification is a process that permits the determination of the numerical accuracy of codes by comparing the results of several numerical solutions for the same problem. The international nuclear waste research community uses benchmarking for intercomparisons that partially satisfy the Nuclear Regulatory Commission (NRC) definition of code verification. This report presents the results from the COVE-2A (Code Verification) project, which is a subset of the COVE project.

  8. Tritium as an indicator of venues for nuclear tests.

    PubMed

    Lyakhova, O N; Lukashenko, S N; Mulgin, S I; Zhdanov, S V

    2013-10-01

    Currently, due to the Treaty on the Non-proliferation of Nuclear Weapons, accurate verification of nuclear explosion venues is a highly topical issue. This paper proposes a new verification method using tritium as an indicator. Detailed studies of the tritium content in the air were carried out at the locations of underground nuclear tests - the "Balapan" and "Degelen" testing sites of the Semipalatinsk Test Site. The paper presents data on the levels and distribution of tritium in the air where tunnels and boreholes are located - explosion epicentres, wellheads and tunnel portals - as well as in estuarine areas of the venues for underground nuclear explosions (UNE). PMID:23639690

  10. MAMA Software Features: Quantification Verification Documentation-1

    SciTech Connect

    Ruggiero, Christy E.; Porter, Reid B.

    2014-05-21

    This document reviews the verification of the basic shape quantification attributes in the MAMA software against hand calculations in order to show that the calculations are implemented mathematically correctly and give the expected quantification results.

  11. Engineering drawing field verification program. Revision 3

    SciTech Connect

    Ulk, P.F.

    1994-10-12

    Safe, efficient operation of waste tank farm facilities depends in part upon the availability of accurate, up-to-date plant drawings. Accurate plant drawings are also required in support of facility upgrades and future engineering remediation projects. This supporting document establishes the procedure for performing a visual field verification of engineering drawings, the degree of visual observation to be performed, and the documentation of the results. A copy of the drawing attesting to the degree of visual observation will be paginated into the released Engineering Change Notice (ECN) documenting the field verification, for future retrieval and reference. All waste tank farm essential and support drawings within the scope of this program will be converted from manual to computer-aided drafting (CAD) drawings. A permanent reference to the field verification status will be placed along the right border of the CAD-converted drawing, citing the revision level at which the visual verification was performed and documented.

  12. ENVIRONMENTAL TECHNOLOGY VERIFICATION (ETV) PROGRAM: FUEL CELLS

    EPA Science Inventory

    The U.S. Environmental Protection Agency (EPA) Environmental Technology Verification (ETV) Program evaluates the performance of innovative air, water, pollution prevention and monitoring technologies that have the potential to improve human health and the environment. This techno...

  13. ENVIRONMENTAL TECHNOLOGY VERIFICATION (ETV) PROGRAM: STORMWATER TECHNOLOGIES

    EPA Science Inventory

    The U.S. Environmental Protection Agency (EPA) Environmental Technology Verification (ETV) program evaluates the performance of innovative air, water, pollution prevention and monitoring technologies that have the potential to improve human health and the environment. This techn...

  14. 9 CFR 417.8 - Agency verification.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... ANALYSIS AND CRITICAL CONTROL POINT (HACCP) SYSTEMS § 417.8 Agency verification. FSIS will verify the... plan or system; (f) Direct observation or measurement at a CCP; (g) Sample collection and analysis...

  15. 9 CFR 417.8 - Agency verification.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... ANALYSIS AND CRITICAL CONTROL POINT (HACCP) SYSTEMS § 417.8 Agency verification. FSIS will verify the... plan or system; (f) Direct observation or measurement at a CCP; (g) Sample collection and analysis...

  16. PERFORMANCE VERIFICATION OF WATER SECURITY - RELATED TECHNOLOGIES

    EPA Science Inventory

    The Environmental Technology Verification (ETV) Program's Advanced Monitoring Systems (AMS) Center has been charged by EPA to verify the performance of commercially available monitoring technologies for air, water, soil. Four categories of water security technologies (most of whi...

  17. Environmental Technology Verification Program (ETV) Policy Compendium

    EPA Science Inventory

    The Policy Compendium summarizes operational decisions made to date by participants in the U.S. Environmental Protection Agency's (EPA's) Environmental Technology Verification Program (ETV) to encourage consistency among the ETV centers. The policies contained herein evolved fro...

  18. U.S. Environmental Technology Verification Program

    EPA Science Inventory

    Overview of the U.S. Environmental Technology Verification Program (ETV), the ETV Greenhouse Gas Technology Center, and energy-related ETV projects. Presented at the Department of Energy's National Renewable Laboratory in Boulder, Colorado on June 23, 2008.

  19. A verification library for multibody simulation software

    NASA Technical Reports Server (NTRS)

    Kim, Sung-Soo; Haug, Edward J.; Frisch, Harold P.

    1989-01-01

    A multibody dynamics verification library that maintains and manages test and validation data is proposed, based on RRC robot arm and CASE backhoe validation and a comparative study of DADS, DISCOS, and CONTOPS, which are existing public-domain and commercial multibody dynamics simulation programs. Using simple representative problems, simulation results from each program are cross-checked, and the validation results are presented. Functionalities of the verification library are defined in order to automate the validation procedure.

  20. The NPARC Alliance Verification and Validation Archive

    NASA Technical Reports Server (NTRS)

    Slater, John W.; Dudek, Julianne C.; Tatum, Kenneth E.

    2000-01-01

    The NPARC Alliance (National Project for Applications oriented Research in CFD) maintains a publicly-available, web-based verification and validation archive as part of the development and support of the WIND CFD code. The verification and validation methods used for the cases attempt to follow the policies and guidelines of the ASME and AIAA. The emphasis is on air-breathing propulsion flow fields with Mach numbers ranging from low-subsonic to hypersonic.

  1. Does Marijuana Help Treat Glaucoma?

    MedlinePlus


  2. DOE handbook: Integrated safety management systems (ISMS) verification team leader's handbook

    SciTech Connect

    1999-06-01

    The primary purpose of this handbook is to provide guidance to the ISMS verification Team Leader and the verification team in conducting ISMS verifications. The handbook describes methods and approaches for the review of the ISMS documentation (Phase I) and ISMS implementation (Phase II) and provides information useful to the Team Leader in preparing the review plan, selecting and training the team, coordinating the conduct of the verification, and documenting the results. The process and techniques described are based on the results of several pilot ISMS verifications that have been conducted across the DOE complex. A secondary purpose of this handbook is to provide information useful in developing DOE personnel to conduct these reviews. Specifically, this handbook describes methods and approaches to: (1) Develop the scope of the Phase 1 and Phase 2 review processes to be consistent with the history, hazards, and complexity of the site, facility, or activity; (2) Develop procedures for the conduct of the Phase 1 review, validating that the ISMS documentation satisfies the DEAR clause as amplified in DOE Policies 450.4, 450.5, 450.6 and associated guidance and that DOE can effectively execute responsibilities as described in the Functions, Responsibilities, and Authorities Manual (FRAM); (3) Develop procedures for the conduct of the Phase 2 review, validating that the description approved by the Approval Authority, following or concurrent with the Phase 1 review, has been implemented; and (4) Describe a methodology by which the DOE ISMS verification teams will be advised, trained, and/or mentored to conduct subsequent ISMS verifications. The handbook provides proven methods and approaches for verifying that commitments related to the DEAR, the FRAM, and associated amplifying guidance are in place and implemented in nuclear and high risk facilities. This handbook also contains useful guidance to line managers when preparing for a review of ISMS for radiological

  3. Development of advanced seal verification

    NASA Technical Reports Server (NTRS)

    Workman, Gary L.; Kosten, Susan E.; Abushagur, Mustafa A.

    1992-01-01

    The purpose of this research is to develop a technique to monitor and ensure seal integrity with a sensor that has no active elements to burn out during a long-duration activity, such as a leakage test or, especially, a mission in space. The original concept proposed is that by implementing fiber optic sensors, changes in the integrity of a seal can be monitored in real time, and at no time should the optical fiber sensor fail. The electrical components which provide optical excitation and detection through the fiber are not part of the seal; hence, if these electrical components fail, they can be easily changed without breaking the seal. The optical connections required for the concept to work do present a functional problem to work out. The utility of the optical fiber sensor for seal monitoring should be general enough that the degradation of a seal can be determined before catastrophic failure occurs and appropriate action taken. Two parallel efforts were performed in determining the feasibility of using optical fiber sensors for seal verification. In one study, interferometric measurements of the mechanical response of the optical fiber sensors to seal integrity were studied. In a second study, the optical fiber was implemented in a typical vacuum chamber and feasibility studies on microbend experiments in the vacuum chamber were performed. Also, an attempt was made to quantify the amount of pressure actually being applied to the optical fiber using finite element analysis software by Algor.

  4. Learning Assumptions for Compositional Verification

    NASA Technical Reports Server (NTRS)

    Cobleigh, Jamieson M.; Giannakopoulou, Dimitra; Pasareanu, Corina; Clancy, Daniel (Technical Monitor)

    2002-01-01

    Compositional verification is a promising approach to addressing the state explosion problem associated with model checking. One compositional technique advocates proving properties of a system by checking properties of its components in an assume-guarantee style. However, the application of this technique is difficult because it involves non-trivial human input. This paper presents a novel framework for performing assume-guarantee reasoning in an incremental and fully automated fashion. To check a component against a property, our approach generates assumptions that the environment needs to satisfy for the property to hold. These assumptions are then discharged on the rest of the system. Assumptions are computed by a learning algorithm. They are initially approximate, but become gradually more precise by means of counterexamples obtained by model checking the component and its environment, alternately. This iterative process may at any stage conclude that the property is either true or false in the system. We have implemented our approach in the LTSA tool and applied it to the analysis of a NASA system.

  5. Biometric verification in dynamic writing

    NASA Astrophysics Data System (ADS)

    George, Susan E.

    2002-03-01

    Pen-tablet devices capable of capturing the dynamics of writing record temporal and pressure information as well as the spatial pattern. This paper explores biometric verification based upon the dynamics of writing, where writers are distinguished not on the basis of what they write (i.e., the signature), but of how they write. We have collected samples of dynamic writing from 38 Chinese writers. Each writer was asked to provide 10 copies of a paragraph of text and the same number of signature samples. From the data we have extracted stroke-based primitives from the sentence data, utilizing pen-up/down information and heuristic rules about the shape of the character. The x, y and pressure values of each primitive were interpolated into an even temporal range based upon a 20 msec sampling rate. We applied the Daubechies 1 wavelet transform to the x signal, y signal and pressure signal, using the coefficients as inputs to a multi-layer perceptron trained with back-propagation on the sentence data. We found a sensitivity of 0.977 and specificity of 0.990 recognizing writers based on test primitives extracted from sentence data, and measures of 0.916 and 0.961 respectively from test primitives extracted from signature data.
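    The Daubechies 1 wavelet is the Haar wavelet: one decomposition level takes pairwise averages (the approximation coefficients) and pairwise differences (the detail coefficients), each scaled by 1/√2. A minimal sketch of that single step, using invented pen-pressure samples rather than the paper's actual primitives:

```python
import math

def haar_dwt(signal):
    """One level of the Daubechies-1 (Haar) wavelet transform:
    pairwise averages (approximation) and differences (detail), scaled by 1/sqrt(2)."""
    s = math.sqrt(2)
    approx = [(a + b) / s for a, b in zip(signal[0::2], signal[1::2])]
    detail = [(a - b) / s for a, b in zip(signal[0::2], signal[1::2])]
    return approx, detail

# Hypothetical pen-pressure samples, already resampled to an even temporal grid
pressure = [4.0, 6.0, 10.0, 12.0]
approx, detail = haar_dwt(pressure)
```

    In the paper's setup, coefficients like these (for x, y, and pressure) would be concatenated into the feature vector fed to the multi-layer perceptron; the feature layout here is an assumption for illustration.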

  6. National Verification System of National Meteorological Center , China

    NASA Astrophysics Data System (ADS)

    Zhang, Jinyan; Wei, Qing; Qi, Dan

    2016-04-01

    The Product Quality Verification Division for official weather forecasting of China was founded in April 2011. It is affiliated with the Forecast System Laboratory (FSL), National Meteorological Center (NMC), China. There are three employees in this department; I am one of them, and I am in charge of the Product Quality Verification Division in NMC, China. After five years of construction, an integrated real-time National Verification System of NMC, China has been established. At present, its primary roles include: 1) to verify official weather forecasting quality of NMC, China; 2) to verify the official city weather forecasting quality of the Provincial Meteorological Bureaus; 3) to evaluate forecasting quality for each forecaster in NMC, China. To verify official weather forecasting quality of NMC, China, we have developed: • Grid QPF verification module (including upscale) • Grid temperature, humidity and wind forecast verification module • Severe convective weather forecast verification module • Typhoon forecast verification module • Disaster forecast verification module • Disaster warning verification module • Medium and extended period forecast verification module • Objective elements forecast verification module • Ensemble precipitation probabilistic forecast verification module. To verify the official city weather forecasting quality of the Provincial Meteorological Bureaus, we have developed: • City elements forecast verification module • Public heavy rain forecast verification module • City air quality forecast verification module. To evaluate forecasting quality for each forecaster in NMC, China, we have developed: • Off-duty forecaster QPF practice evaluation module • QPF evaluation module for forecasters • Severe convective weather forecast evaluation module • Typhoon track forecast evaluation module for forecasters • Disaster warning evaluation module for forecasters • Medium and extended period forecast evaluation module. The further

  7. Verification of operating software for cooperative monitoring applications

    SciTech Connect

    Tolk, K.M.; Rembold, R.K.

    1997-08-01

    Monitoring agencies often use computer-based equipment to control instruments and to collect data at sites that are being monitored under international safeguards or other cooperative monitoring agreements. In order for these data to serve as an independent verification of data supplied by the host at the facility, the software used must be trusted by the monitoring agency. The monitoring party must be sure that the software has not been altered to give results that could lead to erroneous conclusions about nuclear materials inventories or other operating conditions at the site. The host might also want to verify that the software being used is the software that has been previously inspected, in order to be assured that only data allowed under the agreement is being collected. A method to provide this verification using keyed hash functions is described, along with how the proposed method overcomes possible vulnerabilities in methods currently in use, such as loading the software from trusted disks. The use of public key data authentication for this purpose is also discussed.
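    A keyed hash binds a digest to a secret key: a host who can read the deployed software image but does not hold the key cannot substitute altered code that still produces a matching digest. A minimal sketch of the idea using Python's standard hmac module; the key and image bytes are hypothetical placeholders, not the paper's actual scheme:

```python
import hmac
import hashlib

def fingerprint(data: bytes, key: bytes) -> str:
    """Keyed hash (HMAC-SHA256) of a software image; without the key,
    a matching digest for altered code cannot be forged."""
    return hmac.new(key, data, hashlib.sha256).hexdigest()

key = b"shared-inspection-key"        # held by the monitoring agency (hypothetical)
image = b"monitoring software v1.0"   # bytes of the installed program (hypothetical)

reference = fingerprint(image, key)   # recorded at the initial inspection
tampered = image + b" + patch"

print(hmac.compare_digest(reference, fingerprint(image, key)))     # True
print(hmac.compare_digest(reference, fingerprint(tampered, key)))  # False
```

    compare_digest is used instead of == so the comparison takes constant time, which avoids leaking digest prefixes through timing.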

  8. Guidelines for the verification and validation of expert system software and conventional software: Evaluation of knowledge base certification methods. Volume 4

    SciTech Connect

    Miller, L.A.; Hayes, J.E.; Mirsky, S.M.

    1995-03-01

    This report presents the results of the Knowledge Base Certification activity of the expert systems verification and validation (V&V) guideline development project, which is jointly funded by the US Nuclear Regulatory Commission and the Electric Power Research Institute. The ultimate objective is the formulation of guidelines for the V&V of expert systems for use in nuclear power applications. This activity is concerned with the development and testing of various methods for assuring the quality of knowledge bases. The testing procedure used was that of behavioral experiment, the first known such evaluation of any type of V&V activity. The value of such experimentation is its capability to provide empirical evidence for -- or against -- the effectiveness of plausible methods in helping people find problems in knowledge bases. The three-day experiment included 20 participants from three nuclear utilities, the Nuclear Regulatory Commission's Technical Training Center, the University of Maryland, EG&G Idaho, and SAIC. The study used two real nuclear expert systems: a boiling water reactor emergency operating procedures tracking system and a pressurized water reactor safety assessment system. Ten participants were assigned to each of the expert systems. All participants were trained in and then used a sequence of four different V&V methods selected as being the best and most appropriate for study on the basis of prior evaluation activities. These methods either involved the analysis and tracing of requirements to elements in the knowledge base (requirements grouping and requirements tracing) or else involved direct inspection of the knowledge base for various kinds of errors. Half of the subjects within each system group used the best manual variant of the V&V methods (the control group), while the other half were supported by the results of applying real or simulated automated tools to the knowledge bases (the experimental group).

  9. Certainty in Stockpile Computing: Recommending a Verification and Validation Program for Scientific Software

    SciTech Connect

    Lee, J.R.

    1998-11-01

    As computing assumes a more central role in managing the nuclear stockpile, the consequences of an erroneous computer simulation could be severe. Computational failures are common in other endeavors and have caused project failures, significant economic loss, and loss of life. This report examines the causes of software failure and proposes steps to mitigate them. A formal verification and validation program for scientific software is recommended and described.

  10. Integrated Verification Experiment data collected as part of the Los Alamos National Laboratory's Source Region Program

    SciTech Connect

    Fitzgerald, T.J.; Carlos, R.C.; Argo, P.E.

    1993-01-21

    As part of the integrated verification experiment (IVE), we deployed a network of HF ionospheric sounders to detect the effects of acoustic waves generated by surface ground motion following underground nuclear tests at the Nevada Test Site. The network sampled up to four geographic locations in the ionosphere, from almost directly overhead of the surface ground zero out to a horizontal range of 60 km. We present sample results for four of the IVEs: Misty Echo, Texarkana, Mineral Quarry, and Bexar.

  11. Software engineering and automatic continuous verification of scientific software

    NASA Astrophysics Data System (ADS)

    Piggott, M. D.; Hill, J.; Farrell, P. E.; Kramer, S. C.; Wilson, C. R.; Ham, D.; Gorman, G. J.; Bond, T.

    2011-12-01

    Software engineering of scientific code is challenging for a number of reasons, including pressure to publish and a lack of awareness among scientists of the pitfalls of software engineering. The Applied Modelling and Computation Group at Imperial College is a diverse group of researchers that employs best-practice software engineering methods whilst developing open source scientific software. Our main code is Fluidity - a multi-purpose computational fluid dynamics (CFD) code that can be used for a wide range of scientific applications, from earth-scale mantle convection, through basin-scale ocean dynamics, to laboratory-scale classic CFD problems, and is coupled to a number of other codes, including nuclear radiation and solid modelling. Our software development infrastructure consists of a number of free tools that could be employed by any group that develops scientific code, and it has been developed over a number of years with many lessons learnt. A single code base is developed by over 30 people, for which we use Bazaar for revision control, making good use of its strong branching and merging capabilities. Using features of Canonical's Launchpad platform, such as code review, blueprints for designing features, and bug reporting, gives the group, partners and other Fluidity users an easy-to-use platform to collaborate, and allows the induction of new members of the group into an environment where software development forms a central part of their work. The code repository is coupled to an automated test and verification system which performs over 20,000 tests, including unit tests, short regression tests, code verification and large parallel tests. Included in these tests are build tests on HPC systems, including local and UK national HPC services. Testing the code in this manner leads to a continuous verification process, not a discrete event performed once development has ceased.
Much of the code verification is done via the "gold standard" of comparisons to analytical
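
    A minimal sketch of the kind of analytical "gold standard" test described above. The toy model here (explicit Euler on dy/dt = -k*y) and the tolerance are invented for illustration; Fluidity's own test harness is far larger, but each of its verification tests follows this same shape: run the numerical model, compare against the analytical solution, assert within tolerance.

    ```python
    import math

    def integrate_decay(y0, k, t_end, dt):
        """Explicit-Euler integration of dy/dt = -k*y, a stand-in for a
        model run driven by an automated test harness."""
        y, t = y0, 0.0
        while t < t_end - 1e-12:
            y += dt * (-k * y)
            t += dt
        return y

    def test_against_analytical():
        """'Gold standard' check: the numerical result must match the
        analytical solution exp(-k*t) within a step-size-appropriate tolerance."""
        y_num = integrate_decay(1.0, 2.0, 1.0, 1e-4)
        y_exact = math.exp(-2.0)
        assert abs(y_num - y_exact) < 1e-3
    ```

    Run under a continuous-integration system on every commit, such tests turn verification into an ongoing process rather than a one-off event.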

  12. Verification of Functional Fault Models and the Use of Resource Efficient Verification Tools

    NASA Technical Reports Server (NTRS)

    Bis, Rachael; Maul, William A.

    2015-01-01

    Functional fault models (FFMs) are a directed graph representation of the failure effect propagation paths within a system's physical architecture and are used to support development and real-time diagnostics of complex systems. Verification of these models is required to confirm that the FFMs are correctly built and accurately represent the underlying physical system. However, a manual, comprehensive verification process applied to the FFMs was found to be error prone due to the intensive and customized process necessary to verify each individual component model and to require a burdensome level of resources. To address this problem, automated verification tools have been developed and utilized to mitigate these key pitfalls. This paper discusses the verification of the FFMs and presents the tools that were developed to make the verification process more efficient and effective.
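
    One automated check of the kind described, sketched on a hypothetical failure-effect propagation graph (all node names and edges below are invented): verify that every modeled failure mode propagates along some directed path to a monitored node, a property that is tedious and error prone to confirm by hand.

    ```python
    from collections import deque

    # Hypothetical FFM: edges point from a failure mode to the
    # downstream effects it can produce.
    FFM_EDGES = {
        "pump_seal_leak": ["low_line_pressure"],
        "low_line_pressure": ["flow_sensor_low", "valve_chatter"],
        "valve_chatter": [],
        "flow_sensor_low": [],
    }
    MONITORED = {"flow_sensor_low"}  # nodes observable by diagnostics

    def reaches_monitor(graph, start, monitored):
        """Breadth-first search: does this failure mode propagate to any
        monitored node? An automated stand-in for one manual FFM check."""
        seen, queue = {start}, deque([start])
        while queue:
            node = queue.popleft()
            if node in monitored:
                return True
            for nxt in graph.get(node, []):
                if nxt not in seen:
                    seen.add(nxt)
                    queue.append(nxt)
        return False

    # Failure modes whose effects never reach a monitored node.
    unobservable = [n for n in FFM_EDGES
                    if not reaches_monitor(FFM_EDGES, n, MONITORED)]
    ```

    On this toy graph the check flags "valve_chatter" as undetectable, exactly the class of modeling gap an automated verification tool can surface systematically.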

  13. Verification study of an emerging fire suppression system

    DOE PAGES Beta

    Cournoyer, Michael E.; Waked, R. Ryan; Granzow, Howard N.; Gubernatis, David C.

    2016-01-01

    Self-contained fire extinguishers are a robust, reliable and minimally invasive means of fire suppression for gloveboxes. Moreover, plutonium gloveboxes present harsh environmental conditions for polymer materials; these include radiation damage and chemical exposure, both of which tend to degrade the lifetime of engineered polymer components. Several studies have been conducted to determine the robustness of self-contained fire extinguishers in plutonium gloveboxes; before such extinguishers can be deployed in a nuclear facility, verification tests must be performed. These tests include activation and mass loss calorimeter tests. In addition, compatibility issues with chemical components of the self-contained fire extinguishers need to be addressed. Our study presents activation and mass loss calorimeter test results. After extensive studies, no critical areas of concern have been identified for the plutonium glovebox application of Fire Foe™, except for glovebox operations that use large quantities of bulk plutonium or uranium metal, such as metal casting and pyro-chemistry operations.

  15. Verification of Internal Dose Calculations.

    NASA Astrophysics Data System (ADS)

    Aissi, Abdelmadjid

    The MIRD internal dose calculations have been in use for more than 15 years, but their accuracy has always been questionable. There have been attempts to verify these calculations; however, these attempts had various shortcomings which kept the question of verification of the MIRD data still unanswered. The purpose of this research was to develop techniques and methods to verify the MIRD calculations in a more systematic and scientific manner. The research consisted of improving a volumetric dosimeter, developing molding techniques, and adapting the Monte Carlo computer code ALGAM to the experimental conditions and vice versa. The organic dosimetric system contained TLD-100 powder and could be shaped to represent human organs. The dosimeter possessed excellent characteristics for the measurement of internal absorbed doses, even in the case of the lungs. The molding techniques are inexpensive and were used in the fabrication of dosimetric and radioactive source organs. The adaptation of the computer program provided useful theoretical data with which the experimental measurements were compared. The experimental data and the theoretical calculations were compared for 6 source organ-7 target organ configurations. The results of the comparison indicated agreement between measured and calculated absorbed doses, when taking into consideration the average uncertainty (16%) of the measurements and the average coefficient of variation (10%) of the Monte Carlo calculations. However, analysis of the data also indicated that the Monte Carlo method might overestimate the internal absorbed doses. Even if the overestimate exists, it can at least be said that the use of the MIRD method in internal dosimetry was shown to lead to no unnecessary exposure to radiation that could be caused by underestimating the absorbed dose. The experimental and the theoretical data were also used to test the validity of the Reciprocity Theorem for heterogeneous

  16. Ozone Monitoring Instrument geolocation verification

    NASA Astrophysics Data System (ADS)

    Kroon, M.; Dobber, M. R.; Dirksen, R.; Veefkind, J. P.; van den Oord, G. H. J.; Levelt, P. F.

    2008-08-01

    Verification of the geolocation assigned to individual ground pixels as measured by the Ozone Monitoring Instrument (OMI) aboard the NASA EOS-Aura satellite was performed by comparing geophysical Earth surface details as observed in OMI false color images with the high-resolution continental outline vector map as provided by the Interactive Data Language (IDL) software tool from ITT Visual Information Solutions. The OMI false color images are generated from the OMI visible channel by integration over 20-nm-wide spectral bands of the Earth radiance intensity around 484 nm, 420 nm, and 360 nm wavelength per ground pixel. Proportional to the integrated intensity, we assign color values composed of CRT standard red, green, and blue to the OMI ground pixels. Earth surface details studied are mostly high-contrast coast lines where arid land or desert meets deep blue ocean. The IDL high-resolution vector map is based on the 1993 CIA World Database II Map with a 1-km accuracy. Our results indicate that the average OMI geolocation offset over the years 2005-2006 is 0.79 km in latitude and 0.29 km in longitude, with a standard deviation of 1.64 km in latitude and 2.04 km in longitude, respectively. Relative to the OMI nadir pixel size, one obtains mean displacements of ˜6.1% in latitude and ˜1.2% in longitude, with standard deviations of 12.6% and 7.9%, respectively. We conclude that the geolocation assigned to individual OMI ground pixels is sufficiently accurate to support scientific studies of atmospheric features as observed in OMI level 2 satellite data products, such as air quality issues on urban scales or volcanic eruptions and their plumes, which occur on spatial scales comparable to or smaller than OMI nadir pixels.
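
    The offset statistics quoted above reduce to a simple computation: mean and standard deviation of the per-scene offsets, then normalization by the nadir pixel size. A sketch with invented per-scene offsets (only the ~13 km OMI nadir pixel scale is taken from context):

    ```python
    import statistics

    def offset_summary(offsets_km, pixel_km):
        """Mean and sample standard deviation of geolocation offsets,
        plus the mean expressed as a percentage of nadir pixel size."""
        mean = statistics.fmean(offsets_km)
        sd = statistics.stdev(offsets_km)
        return mean, sd, 100.0 * mean / pixel_km

    # Hypothetical per-scene latitude offsets (km); OMI nadir pixel ~13 km.
    lat_mean, lat_sd, lat_pct = offset_summary([0.5, 1.2, -0.3, 1.8, 0.75], 13.0)
    ```

    Applied to the real multi-year offset series, the same reduction yields the kilometer and percentage figures reported in the abstract.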

  17. Automated verification of system configuration

    NASA Astrophysics Data System (ADS)

    Andrews, W. H., Jr.; Baker, S. P.; Blalock, A. V.

    1991-05-01

    Errors in field wiring can result in significant correction costs (if the errors are discovered prior to use), in erroneous or unusable data (if the errors are not discovered in time), or in serious accidents (if the errors corrupt critical data). Detailed field wiring checkout and rework are tedious and expensive, but they are essential steps in the quality assurance process for large, complex instrumentation and control systems. A recent Oak Ridge National Laboratory (ORNL) development, the CONFiguration IDEntification System (CONFIDES), automates verification of field wiring. In CONFIDES, an identifier module is installed on or integrated into each component (e.g., sensor, actuator, cable, distribution panel) to be verified. Interrogator modules, controlled by a personal computer (PC), are installed at the connections of the field wiring to the inputs of the data acquisition and control system (DACS). Interrogator modules poll the components connected to each channel of the DACS and can determine the path taken by each channel's signal to or from the end device for that channel. The system will provide not only the identification (ID) code for the cables and patch panels in the path to a particular sensor or actuator but for individual cable conductors as well. One version of the system uses existing signal wires for communications between CONFIDES modules. Another, more powerful version requires a single dedicated conductor in each cable. Both versions can operate with or without instrument power applied, and neither interferes with the normal operation of the DACS. Identifier modules can provide a variety of information including status and calibration data.

  18. Applying Human-performance Models to Designing and Evaluating Nuclear Power Plants: Review Guidance and Technical Basis

    SciTech Connect

    O'Hara, J.M.

    2009-11-30

    Human performance models (HPMs) are simulations of human behavior with which we can predict human performance. Designers use them to support their human factors engineering (HFE) programs for a wide range of complex systems, including commercial nuclear power plants. Applicants to U.S. Nuclear Regulatory Commission (NRC) can use HPMs for design certifications, operating licenses, and license amendments. In the context of nuclear-plant safety, it is important to assure that HPMs are verified and validated, and their usage is consistent with their intended purpose. Using HPMs improperly may generate misleading or incorrect information, entailing safety concerns. The objective of this research was to develop guidance to support the NRC staff's reviews of an applicant's use of HPMs in an HFE program. The guidance is divided into three topical areas: (1) HPM Verification, (2) HPM Validation, and (3) User Interface Verification. Following this guidance will help ensure the benefits of HPMs are achieved in a technically sound, defensible manner. During the course of developing this guidance, I identified several issues that could not be addressed; they also are discussed.

  19. Transmutation in ADS and needs for nuclear data, with an introduction to the n-TOF at CERN

    SciTech Connect

    Gonzalez, E.; Embid, M.; Fernandez, R.; Garcia, J.; Villamarin, D.

    1999-11-16

    Transmutation can help with the nuclear waste problem by seriously reducing the lifetime and amount of the most dangerous isotopes (reductions in radiotoxicity, heat, packing volume and neutron multiplication). ADS are one of the best technologies for nuclear waste transmutation at large scale. Although enough information is available to prepare conceptual designs and make assessments of their performance, a large R and D campaign is required to obtain the precision data needed to optimize the detailed engineering design and refine the calculations of expected waste reduction under the different transmutation strategies being proposed. In particular, a large R and D effort is required in nuclear physics, where fundamental differential measurements and integral verification experiments are required. In this sense, the PS213 n-TOF at the CERN PS (in Switzerland) will become one of the largest installations for performing the fundamental differential measurements, and a wide international collaboration has been set up to perform the cross section measuring campaign. Similarly, MUSE and several other experiments taking place or in preparation in Europe, the USA and Japan will provide the integral verification.

  20. Data Storage Accounting and Verification at LHC experiments

    NASA Astrophysics Data System (ADS)

    Huang, C.-H.; Lanciotti, E.; Magini, N.; Ratnikova, N.; Sanchez-Hernandez, A.; Serfon, C.; Wildish, T.; Zhang, X.

    2012-12-01

    All major experiments at the Large Hadron Collider (LHC) need to measure real storage usage at the Grid sites. This information is equally important for resource management, planning, and operations. To verify the consistency of central catalogs, experiments are asking sites to provide a full list of the files they have on storage, including size, checksum, and other file attributes. Such storage dumps, provided at regular intervals, give a realistic view of the storage resource usage by the experiments. Regular monitoring of the space usage and data verification serve as additional internal checks of the system integrity and performance. Both the importance and the complexity of these tasks increase with the constant growth of the total data volumes during the active data taking period at the LHC. The use of common solutions helps to reduce the maintenance costs, both at the large Tier1 facilities supporting multiple virtual organizations and at the small sites that often lack manpower. We discuss requirements and solutions to the common tasks of data storage accounting and verification, and present experiment-specific strategies and implementations used within the LHC experiments according to their computing models.
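
    The core of the storage verification task described above is a three-way reconciliation between the central catalog and a site's storage dump: files the catalog expects but the site lacks, "dark" data present only at the site, and files whose size or checksum disagrees. A minimal sketch (file names and attribute tuples are invented; real systems compare much richer metadata):

    ```python
    def compare_storage(catalog, site_dump):
        """Cross-check a central catalog against a site storage dump.
        Both are mappings of logical file name -> (size, checksum).
        Returns (missing at site, dark data on site only, mismatched)."""
        missing = sorted(set(catalog) - set(site_dump))
        dark = sorted(set(site_dump) - set(catalog))
        corrupt = sorted(f for f in set(catalog) & set(site_dump)
                         if catalog[f] != site_dump[f])
        return missing, dark, corrupt

    catalog = {"/store/a": (100, "ab12"), "/store/b": (200, "cd34"),
               "/store/c": (300, "ef56")}
    dump = {"/store/a": (100, "ab12"), "/store/c": (300, "ffff"),
            "/store/x": (50, "9999")}
    missing, dark, corrupt = compare_storage(catalog, dump)
    ```

    Run against periodic storage dumps, each of the three output lists drives a different operational action: re-transfer, cleanup, or re-checksum and invalidation.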

  1. High-Resolution Fast-Neutron Spectrometry for Arms Control and Treaty Verification

    SciTech Connect

    David L. Chichester; James T. Johnson; Edward H. Seabury

    2012-07-01

    Many nondestructive nuclear analysis techniques have been developed to support the measurement needs of arms control and treaty verification, including gross photon and neutron counting, low- and high-resolution gamma spectrometry, time-correlated neutron measurements, and photon and neutron imaging. One notable measurement technique that has not been extensively studied to date for these applications is high-resolution fast-neutron spectrometry (HRFNS). Applied to arms control and treaty verification, HRFNS has the potential to serve as a complementary measurement approach to these other techniques by providing a means to either qualitatively or quantitatively determine the composition and thickness of non-nuclear materials surrounding neutron-emitting materials. The technique uses the normally occurring neutrons present in arms control and treaty verification objects of interest as an internal source of neutrons for performing active-interrogation transmission measurements. Most low-Z nuclei of interest for arms control and treaty verification, including 9Be, 12C, 14N, and 16O, possess fast-neutron resonance features in their absorption cross sections in the 0.5- to 5-MeV energy range. Measuring the selective removal of source neutrons over this energy range, assuming for example a fission-spectrum starting distribution, may be used to estimate the stoichiometric composition of intervening materials between the neutron source and detector. At a simpler level, determination of the emitted fast-neutron spectrum may be used for fingerprinting 'known' assemblies for later use in template-matching tests. As with photon spectrometry, automated analysis of fast-neutron spectra may be performed to support decision making and reporting systems protected behind information barriers. This paper will report recent work at Idaho National Laboratory to explore the feasibility of using HRFNS for arms control and treaty verification applications, including simulations and
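
    The transmission measurement at the heart of HRFNS follows the standard narrow-beam attenuation law T(E) = exp(-n*t*sigma(E)): at energies where a nuclide's cross section dips through a resonance minimum, more source neutrons survive. A sketch with invented numbers (the areal density and the two cross-section values below are illustrative, not evaluated nuclear data):

    ```python
    import math

    def transmission(areal_density_per_b_cm, sigma_barns):
        """Narrow-beam neutron transmission T = exp(-n*t*sigma), with the
        areal density n*t in atoms per barn-cm and sigma in barns."""
        return math.exp(-areal_density_per_b_cm * sigma_barns)

    # Illustrative only: a low-Z slab sampled at two neutron energies,
    # one off-resonance and one near a cross-section minimum (assumed values).
    nt = 0.08                        # atoms / (barn*cm), hypothetical slab
    t_off = transmission(nt, 2.5)    # sigma ~2.5 b off-resonance (assumed)
    t_on = transmission(nt, 1.0)     # sigma ~1.0 b near a minimum (assumed)
    ```

    Comparing measured transmission across many such energy points against tabulated cross sections is what lets the method infer the stoichiometry of the intervening material.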

  2. Ground-based visual inspection for CTBT verification

    SciTech Connect

    Hawkins, W.; Wohletz, K.

    1997-11-01

    Ground-based visual inspection will play an essential role in On-Site Inspection (OSI) for Comprehensive Test Ban Treaty (CTBT) verification. Although seismic and remote sensing techniques are the best understood and most developed methods for detection of evasive testing of nuclear weapons, visual inspection will greatly augment the certainty and detail of understanding provided by these more traditional methods. Not only can ground-based visual inspection offer effective documentation in cases of suspected nuclear testing, but it also can provide accurate source location and testing media properties necessary for detailed analysis of seismic records. For testing in violation of the CTBT, an offending state may attempt to conceal the test, which most likely will be achieved by underground burial. While such concealment may not prevent seismic detection, evidence of test deployment, location, and yield can be disguised. In this light, if a suspicious event is detected by seismic or other remote methods, visual inspection of the event area is necessary to document any evidence that might support a claim of nuclear testing and provide data needed to further interpret seismic records and guide further investigations. However, the methods for visual inspection are not widely known nor appreciated, and experience is presently limited. Visual inspection can be achieved by simple, non-intrusive means, primarily geological in nature, and it is the purpose of this report to describe the considerations, procedures, and equipment required to field such an inspection. The inspections will be carried out by inspectors from members of the CTBT Organization.

  3. Verification of Minimum Detectable Activity for Radiological Threat Source Search

    NASA Astrophysics Data System (ADS)

    Gardiner, Hannah; Myjak, Mitchell; Baciak, James; Detwiler, Rebecca; Seifert, Carolyn

    2015-10-01

    The Department of Homeland Security's Domestic Nuclear Detection Office is working to develop advanced technologies that will improve the ability to detect, localize, and identify radiological and nuclear sources from airborne platforms. The Airborne Radiological Enhanced-sensor System (ARES) program is developing advanced data fusion algorithms for analyzing data from a helicopter-mounted radiation detector. This detector platform provides a rapid, wide-area assessment of radiological conditions at ground level. The NSCRAD (Nuisance-rejection Spectral Comparison Ratios for Anomaly Detection) algorithm was developed to distinguish low-count sources of interest from benign naturally occurring radiation and irrelevant nuisance sources. It uses a number of broad, overlapping regions of interest to statistically compare each newly measured spectrum with the current estimate for the background to identify anomalies. We recently developed a method to estimate the minimum detectable activity (MDA) of NSCRAD in real time. We present this method here and report on the MDA verification using both laboratory measurements and simulated injects on measured backgrounds at or near the detection limits. This work is supported by the US Department of Homeland Security, Domestic Nuclear Detection Office, under competitively awarded contract/IAA HSHQDC-12-X-00376. This support does not constitute an express or implied endorsement on the part of the Gov't.
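
    The anomaly test underlying this kind of algorithm can be illustrated with a toy stand-in (this is not the published NSCRAD statistic, just the generic shape of the comparison): flag a region of interest whose measured counts exceed the running background estimate by more than k Poisson standard deviations.

    ```python
    import math

    def roi_anomaly(counts, background, k=3.0):
        """Flag a region of interest whose counts exceed the background
        estimate by more than k Poisson sigmas. Purely illustrative."""
        sigma = math.sqrt(background) if background > 0 else 1.0
        return (counts - background) / sigma > k

    # With a background estimate of 100 counts in an ROI, 140 counts is
    # an anomaly at k=3 (a 4-sigma excess); 125 counts (2.5 sigma) is not.
    hit = roi_anomaly(140, 100.0)
    quiet = roi_anomaly(125, 100.0)
    ```

    The minimum detectable activity is then, conceptually, the smallest source strength whose expected ROI excess reliably clears this threshold given the measured background.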

  4. To Help Substance Abusers, We Must First Help Ourselves.

    ERIC Educational Resources Information Center

    Educational Leadership, 1988

    1988-01-01

    Administrator recounts experience growing up in alcoholic home, hoping to inspire other school professionals helping young people with substance abuse problems. Although helping others seems natural for adult children of alcoholics, certain unconsciously held attitudes and behaviors can impede school prevention and recovery programs. Organizations…

  5. The Chandra HelpDesk

    NASA Astrophysics Data System (ADS)

    Galle, Elizabeth C.

    2008-03-01

    The Chandra X-ray Center (CXC) HelpDesk has answered hundreds of user questions over the course of the Chandra mission, ranging from basic syntax errors to advanced analysis questions. This talk gives an introduction to the HelpDesk system and staff, presents a sample of recent HelpDesk requests, and discusses how user-submitted questions improve the software and documentation.

  6. Verification and Validation Methodology of Real-Time Adaptive Neural Networks for Aerospace Applications

    NASA Technical Reports Server (NTRS)

    Gupta, Pramod; Loparo, Kenneth; Mackall, Dale; Schumann, Johann; Soares, Fola

    2004-01-01

    Recent research has shown that adaptive neural based control systems are very effective in restoring stability and control of an aircraft in the presence of damage or failures. The application of an adaptive neural network within a flight-critical control system requires a thorough and proven process to ensure safe and proper flight operation. Unique testing tools have been developed as part of a process to perform verification and validation (V&V) of the real-time adaptive neural networks used in recent adaptive flight control systems, to evaluate the performance of the online-trained neural networks. The tools will help in certification from the FAA and in the successful deployment of neural network based adaptive controllers in safety-critical applications. The process to perform verification and validation is evaluated against a typical neural adaptive controller and the results are discussed.

  7. Hybrid Deep Learning for Face Verification.

    PubMed

    Sun, Yi; Wang, Xiaogang; Tang, Xiaoou

    2016-10-01

    This paper proposes a hybrid convolutional network (ConvNet)-Restricted Boltzmann Machine (RBM) model for face verification. A key contribution of this work is to learn high-level relational visual features with rich identity similarity information. The deep ConvNets in our model start by extracting local relational visual features from two face images in comparison, which are further processed through multiple layers to extract high-level and global relational features. To keep enough discriminative information, we use the last hidden layer neuron activations of the ConvNet as features for face verification instead of those of the output layer. To characterize face similarities from different aspects, we concatenate the features extracted from different face region pairs by different deep ConvNets. The resulting high-dimensional relational features are classified by an RBM for face verification. After pre-training each ConvNet and the RBM separately, the entire hybrid network is jointly optimized to further improve the accuracy. Various aspects of the ConvNet structures, relational features, and face verification classifiers are investigated. Our model achieves the state-of-the-art face verification performance on the challenging LFW dataset under both the unrestricted protocol and the setting when outside data is allowed to be used for training. PMID:26660699

  8. Verification against perturbed analyses and observations

    NASA Astrophysics Data System (ADS)

    Bowler, N. E.; Cullen, M. J. P.; Piccolo, C.

    2015-07-01

    It has long been known that verification of a forecast against the sequence of analyses used to produce those forecasts can underestimate the magnitude of forecast errors. Here we show that under certain conditions the verification of a short-range forecast against a perturbed analysis coming from an ensemble data assimilation scheme can give the same root-mean-square error as verification against the truth. This means that a perturbed analysis can be used as a reliable proxy for the truth. However, the conditions required for this result to hold are rather restrictive: the analysis must be optimal, the ensemble spread must be equal to the error in the mean, the ensemble size must be large and the forecast being verified must be the background forecast used in the data assimilation. Although these criteria are unlikely to be met exactly, it becomes clear that for most cases verification against a perturbed analysis gives better results than verification against an unperturbed analysis. We demonstrate the application of these results in an idealised model framework and in a numerical weather prediction context. In deriving this result we recall that an optimal (Kalman) analysis is one for which the analysis increments are uncorrelated with the analysis errors.
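
    The headline result can be checked numerically in a toy scalar setting that satisfies the stated conditions: the analysis is an optimal (Kalman) combination, the perturbation is drawn with the analysis-error variance, and the verified forecast is the background used by the assimilation. This is a Monte Carlo illustration under those assumptions, not the paper's derivation; all variances below are invented.

    ```python
    import math
    import random

    def rmse(xs, ys):
        """Root-mean-square difference between two equal-length sequences."""
        return math.sqrt(sum((x - y) ** 2 for x, y in zip(xs, ys)) / len(xs))

    random.seed(1)
    sb2, so2 = 1.0, 1.0            # background / observation error variances
    K = sb2 / (sb2 + so2)          # Kalman gain
    sa2 = (1.0 - K) * sb2          # analysis error variance
    truth, background, perturbed = [], [], []
    for _ in range(100_000):
        t = 0.0                                    # truth
        b = t + random.gauss(0.0, math.sqrt(sb2))  # background forecast
        y = t + random.gauss(0.0, math.sqrt(so2))  # observation
        a = b + K * (y - b)                        # optimal analysis
        truth.append(t)
        background.append(b)
        perturbed.append(a + random.gauss(0.0, math.sqrt(sa2)))

    rmse_truth = rmse(background, truth)      # close to sqrt(sb2) = 1.0
    rmse_pert = rmse(background, perturbed)   # close to the same value
    ```

    Analytically, var(b - a') = K^2(sb2 + so2) + sa2 = sb2, which equals var(b - t), so the two RMSE estimates agree up to sampling noise.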

  9. Complementary technologies for verification of excess plutonium

    SciTech Connect

    Langner, D.G.; Nicholas, N.J.; Ensslin, N.; Fearey, B.L.; Mitchell, D.J.; Marlow, K.W.; Luke, S.J.; Gosnell, T.B.

    1998-12-31

    Three complementary measurement technologies have been identified as candidates for use in the verification of excess plutonium of weapons origin. These technologies: high-resolution gamma-ray spectroscopy, neutron multiplicity counting, and low-resolution gamma-ray spectroscopy, are mature, robust technologies. The high-resolution gamma-ray system, Pu-600, uses the 630--670 keV region of the emitted gamma-ray spectrum to determine the ratio of {sup 240}Pu to {sup 239}Pu. It is useful in verifying the presence of plutonium and the presence of weapons-grade plutonium. Neutron multiplicity counting is well suited for verifying that the plutonium is of a safeguardable quantity and is weapons-quality material, as opposed to residue or waste. In addition, multiplicity counting can independently verify the presence of plutonium by virtue of a measured neutron self-multiplication and can detect the presence of non-plutonium neutron sources. The low-resolution gamma-ray spectroscopic technique is a template method that can provide continuity of knowledge that an item that enters a verification regime remains under the regime. In the initial verification of an item, multiple regions of the measured low-resolution spectrum form a unique, gamma-radiation-based template for the item that can be used for comparison in subsequent verifications. In this paper the authors discuss these technologies as they relate to the different attributes that could be used in a verification regime.
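
    The low-resolution template method amounts to storing region-of-interest count fractions at initial verification and comparing later measurements against them. A toy sketch (the ROI counts and the chi-square-like distance below are invented for illustration; a fielded regime would use a tuned statistic and thresholds):

    ```python
    def template_distance(measured, template):
        """Chi-square-like distance between ROI count fractions of a
        measured low-resolution spectrum and the item's stored template."""
        m_tot, t_tot = sum(measured), sum(template)
        return sum((m / m_tot - t / t_tot) ** 2 / (t / t_tot)
                   for m, t in zip(measured, template))

    template = [500, 1200, 800, 300]      # ROI counts at initial verification
    same_item = [510, 1180, 790, 320]     # later re-verification, same item
    other_item = [900, 700, 800, 400]     # a different item
    d_same = template_distance(same_item, template)
    d_other = template_distance(other_item, template)
    ```

    A small distance supports continuity of knowledge for the item; a large one flags it for further investigation.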

  11. Verification of COSMO model over Poland

    NASA Astrophysics Data System (ADS)

    Linkowska, Joanna; Mazur, Andrzej; Wyszogrodzki, Andrzej

    2014-05-01

    The Polish National Weather Service and Institute of Meteorology and Water Management - National Research Institute (IMWM-NRI, Warsaw, Poland) joined the Consortium for Small-Scale Modeling (COSMO) in 2002. Thanks to cooperation within the consortium, the meteorological model COSMO is run operationally at IMWM-NRI at both 2.8 km and 7 km horizontal resolutions. In research mode, data assimilation tests have been carried out using a 6-hourly cycle nudging scheme. We would like to present verification results for the COSMO model, comparing model-generated surface temperature, wind and rainfall rates with SYNOP measurements. In addition, verification results of vertical profiles for chosen variables will also be analyzed and presented. The verification is divided into the following areas: i) assessing the impact of data assimilation on the quality of 2.8 km resolution model forecasts by switching data assimilation on and off, ii) spatio-temporal verification of model results at 7 km resolution, and iii) conditional verification of selected parameters against chosen meteorological condition(s).

  12. MACCS2 development and verification efforts

    SciTech Connect

    Young, M.; Chanin, D.

    1997-03-01

    MACCS2 represents a major enhancement of the capabilities of its predecessor MACCS, the MELCOR Accident Consequence Code System. MACCS, released in 1987, was developed to estimate the potential impacts to the surrounding public of severe accidents at nuclear power plants. The principal phenomena considered in MACCS/MACCS2 are atmospheric transport and deposition under time-variant meteorology, short-term and long-term mitigative actions and exposure pathways, deterministic and stochastic health effects, and economic costs. MACCS2 was developed as a general-purpose analytical tool applicable to diverse reactor and nonreactor facilities. The MACCS2 package includes three primary enhancements: (1) a more flexible emergency response model, (2) an expanded library of radionuclides, and (3) a semidynamic food-chain model. In addition, errors that had been identified in MACCS version 1.5.11.1 were corrected, including an error that prevented the code from providing intermediate-phase results. The MACCS2 version 1.10 beta test was released to the beta-test group in May 1995. In addition, the University of New Mexico (UNM) has completed an independent verification study of the code package. Since the beta-test release of MACCS2 version 1.10, a number of minor errors have been identified and corrected, and a number of enhancements have been added to the code package. The code enhancements added since the beta-test release of version 1.10 include: (1) an option to allow the user to input the σy and σz plume expansion parameters in a table-lookup form for incremental downwind distances, (2) an option to define different initial dimensions for up to four segments of a release, (3) an enhancement to the COMIDA2 food-chain model preprocessor to allow the user to supply externally calculated tables of tritium food-chain dose per unit deposition on farmland to support analyses of tritium releases, and (4) the capability to calculate direction-dependent doses.
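
    The table-lookup option for plume expansion parameters reduces to piecewise-linear interpolation over user-supplied (downwind distance, sigma) pairs. A sketch in that spirit; the distances and sigma values below are invented, not MACCS2 defaults:

    ```python
    def lookup_sigma(table, x):
        """Piecewise-linear lookup of a plume-expansion parameter from a
        {downwind distance: sigma} table, clamped at the table ends."""
        xs = sorted(table)
        if x <= xs[0]:
            return table[xs[0]]
        for lo, hi in zip(xs, xs[1:]):
            if x <= hi:
                frac = (x - lo) / (hi - lo)
                return table[lo] + frac * (table[hi] - table[lo])
        return table[xs[-1]]

    # Hypothetical sigma_y values (m) at incremental downwind distances (m).
    sigma_y = {100: 8.0, 1000: 70.0, 10000: 500.0}
    ```

    Letting users supply such tables decouples the dispersion parametrization from the code, which is what makes the option useful for site-specific analyses.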

  13. Verification and Validation of Autonomy Software at NASA

    NASA Technical Reports Server (NTRS)

    Pecheur, Charles

    2000-01-01

    Autonomous software holds the promise of new operation possibilities, easier design and development, and lower operating costs. However, as those systems close control loops and arbitrate resources on-board with specialized reasoning, the range of possible situations becomes very large and uncontrollable from the outside, making conventional scenario-based testing very inefficient. Analytic verification and validation (V&V) techniques, and model checking in particular, can provide significant help for designing autonomous systems in a more efficient and reliable manner, by providing better coverage and allowing early error detection. This article discusses the general issue of V&V of autonomy software, with an emphasis on model-based autonomy, model-checking techniques, and concrete experiments at NASA.

  14. Efficient Verification of Holograms Using Mobile Augmented Reality.

    PubMed

    Hartl, Andreas Daniel; Arth, Clemens; Grubert, Jens; Schmalstieg, Dieter

    2016-07-01

    Paper documents such as passports, visas, and banknotes are frequently checked by inspection of security elements. In particular, optically variable devices such as holograms are important but difficult to inspect. Augmented Reality can provide all relevant information on standard mobile devices. However, hologram verification on mobiles still takes a long time and provides lower accuracy than inspection by humans using appropriate reference information. We aim to address these drawbacks with automatic matching combined with a special parametrization of an efficient, goal-oriented user interface that supports constrained navigation. We first evaluate a series of similarity measures for matching hologram patches to provide a sound basis for automatic decisions. Then a re-parametrized user interface is proposed based on observations of typical user behavior during document capture. These measures help reduce capture time to approximately 15 s, with better decisions on the evaluated samples than untrained users can achieve. PMID:26561461
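
    As one concrete instance of the kind of patch similarity measure such a pipeline evaluates (the paper compares several; zero-mean normalized cross-correlation is only a common illustrative choice, and the patch data below are synthetic):

```python
# Zero-mean normalized cross-correlation (NCC) between two grayscale
# patches given as flat lists of pixel intensities. NCC is 1 for a
# perfect match, -1 for an inverted match, and is invariant to uniform
# brightness offsets. Illustrative only; not the paper's code.
import math

def ncc(a, b):
    n = len(a)
    ma = sum(a) / n
    mb = sum(b) / n
    num = sum((x - ma) * (y - mb) for x, y in zip(a, b))
    den = math.sqrt(sum((x - ma) ** 2 for x in a) * sum((y - mb) ** 2 for y in b))
    return num / den if den else 0.0

ref = [10, 20, 30, 40]
cap = [12, 22, 32, 42]          # same pattern, uniformly brighter
print(round(ncc(ref, cap), 3))  # 1.0: NCC ignores the brightness offset
```

    Brightness invariance matters here because captured hologram patches vary strongly with illumination, so raw pixel differences would be a poor basis for automatic decisions.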

  15. Verification and Implementation of Operations Safety Controls for Flight Missions

    NASA Technical Reports Server (NTRS)

    Smalls, James R.; Jones, Cheryl L.; Carrier, Alicia S.

    2010-01-01

    Among the many engineering disciplines (reliability, supportability, quality assurance, human factors, risk management, and others), safety is an extremely important specialty within NASA, where any consequence involving loss of crew is considered a catastrophic event. Safety is not difficult to achieve when properly integrated at the beginning of each space systems project and the start of mission planning. The key is to ensure proper handling of safety verification throughout each flight/mission phase. Today, Safety and Mission Assurance (S&MA) operations engineers continue to conduct these flight product reviews across all open flight products. These reviews help ensure that each mission is accomplished with safety requirements and controls heavily embedded in applicable flight products. Most importantly, the S&MA operations engineers are required to look for important design and operations controls so that safety is strictly adhered to and reflected in the final flight product.

  16. Battery Technology Life Verification Test Manual Revision 1

    SciTech Connect

    Jon P. Christophersen

    2012-12-01

    The purpose of this Technology Life Verification Test (TLVT) Manual is to help guide developers in their effort to successfully commercialize advanced energy storage devices such as battery and ultracapacitor technologies. The experimental design and data analysis discussed herein are focused on automotive applications based on the United States Advanced Battery Consortium (USABC) electric vehicle, hybrid electric vehicle, and plug-in hybrid electric vehicle (EV, HEV, and PHEV, respectively) performance targets. However, the methodology can be applied equally well to other applications. This manual supersedes the February 2005 version of the TLVT Manual (Reference 1). It includes criteria for statistically based life test matrix designs as well as requirements for test data analysis and reporting. Calendar life modeling and estimation techniques, including a user's guide to the corresponding software tool, are now provided in the Battery Life Estimator (BLE) Manual (Reference 2).

  17. Data verification in the residue laboratory.

    PubMed

    Ault, J A; Cassidy, P S; Crawford, C J; Jablonski, J E; Kenyon, R G

    1994-12-01

    Residue analysis frequently presents a challenge to the quality assurance (QA) auditor due to the sheer volume of data to be audited. In the face of multiple boxes of raw data, some process must be defined that assures the scientist and the QA auditor of the quality and integrity of the data. A program that ensures complete and appropriate verification of data before it reaches the Quality Assurance Unit (QAU) is presented. The "Guidelines for Peer Review of Data" were formulated by the Residue Analysis Business Center at Ricerca, Inc. to accommodate efficient use of review time and to resolve any uncertainties about what constitutes acceptable data. The core of this program centers on five elements: study initiation (definitional) meetings, calculations, verification, approval, and the use of a verification checklist.

  18. Critical Surface Cleaning and Verification Alternatives

    NASA Technical Reports Server (NTRS)

    Melton, Donald M.; McCool, A. (Technical Monitor)

    2000-01-01

    As a result of federal and state requirements, historical critical cleaning and verification solvents such as Freon 113, Freon TMC, and trichloroethylene (TCE) are either highly regulated or no longer available. Interim replacements such as HCFC 225 have been qualified; however, toxicity and future phase-out regulations necessitate long-term solutions. The scope of this project was to qualify a safe and environmentally compliant LOX surface verification alternative to Freon 113, TCE, and HCFC 225. The main effort was focused on initiating the evaluation and qualification of HCFC 225G as an alternate LOX verification solvent. The project was scoped in FY 99/00 to perform LOX compatibility, cleaning efficiency, and qualification on flight hardware.

  19. Active alignment/contact verification system

    DOEpatents

    Greenbaum, William M.

    2000-01-01

    A system involving an active (i.e., electrical) technique for the verification of: (1) close-tolerance mechanical alignment between two components, and (2) electrical contact between mating parts through an elastomeric interface. For example, the two components may be an alumina carrier and a printed circuit board: two mating parts that are extremely small, high-density, require alignment within a fraction of a mil, and have a specified interface point of engagement. The system comprises pairs of conductive structures defined in the surface layers of the alumina carrier and the printed circuit board, for example. The first pair of conductive structures relates to item (1) above and permits alignment verification between mating parts. The second pair of conductive structures relates to item (2) above and permits verification of electrical contact between mating parts.

  20. Land Ice Verification and Validation Kit

    SciTech Connect

    2015-07-15

    To address a pressing need to better understand the behavior and complex interaction of ice sheets within the global Earth system, significant development of continental-scale, dynamical ice-sheet models is underway. The associated verification and validation process of these models is being coordinated through a new, robust, python-based extensible software package, the Land Ice Verification and Validation toolkit (LIVV). This release provides robust and automated verification and a performance evaluation on LCF platforms. The performance V&V involves a comprehensive comparison of model performance relative to expected behavior on a given computing platform. LIVV operates on a set of benchmark and test data, and provides comparisons for a suite of community prioritized tests, including configuration and parameter variations, bit-4-bit evaluation, and plots of tests where differences occur.
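
    LIVV's bit-4-bit evaluation passes two model outputs only if every value matches exactly at the bit level, which catches even last-digit drift that a tolerance-based comparison would accept. An illustrative sketch of such a check (the data are synthetic; this is not LIVV code):

```python
# Bit-for-bit comparison of two float arrays: compare the raw byte
# patterns of each pair of values and report every index that differs.
# Illustrative sketch only.
import struct

def bit4bit(ref, test):
    """Return a list of (index, ref_value, test_value) bitwise mismatches."""
    diffs = []
    for i, (r, t) in enumerate(zip(ref, test)):
        if struct.pack('<d', r) != struct.pack('<d', t):
            diffs.append((i, r, t))
    return diffs

ref  = [0.1, 0.2, 0.3]
test = [0.1, 0.2, 0.1 + 0.2]   # 0.1 + 0.2 differs from 0.3 at the bit level
print(bit4bit(ref, test))
```

    Comparing byte patterns rather than using `==` also treats NaN values consistently, since NaN compares unequal to itself under ordinary float comparison.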

  2. Helping a Learning Group Mature.

    ERIC Educational Resources Information Center

    Parsons, Jerry

    The ultimate goal of a learning group is to help learners achieve their goals and objectives and to help them learn to live in a rapidly changing and evolving society. Within the context of social change, the author examines how internal dynamics can be used to aid a teacher in developing an effective learning group. Drawing from psychology,…

  3. GHG MITIGATION TECHNOLOGY PERFORMANCE EVALUATIONS UNDERWAY AT THE GHG TECHNOLOGY VERIFICATION CENTER

    EPA Science Inventory

    The paper outlines the verification approach and activities of the Greenhouse Gas (GHG) Technology Verification Center, one of 12 independent verification entities operating under the U.S. EPA-sponsored Environmental Technology Verification (ETV) program. (NOTE: The ETV program...

  4. Resource Letter PSNAC-1: Physics and society: Nuclear arms control

    NASA Astrophysics Data System (ADS)

    Glaser, Alexander; Mian, Zia

    2008-01-01

    This Resource Letter provides a guide to the literature on nuclear arms control for the nonspecialist. Journal articles and books are cited for the following topics: nuclear weapons, fissile materials, nonproliferation, missiles and missile defenses, verification, disarmament, and the role of scientists in arms control.

  5. Electric power system test and verification program

    NASA Technical Reports Server (NTRS)

    Rylicki, Daniel S.; Robinson, Frank, Jr.

    1994-01-01

    Space Station Freedom's (SSF's) electric power system (EPS) hardware and software verification is performed at all levels of integration, from components to assembly and system level tests. Careful planning is essential to ensure the EPS is tested properly on the ground prior to launch. The results of the test performed on breadboard model hardware and analyses completed to date have been evaluated and used to plan for design qualification and flight acceptance test phases. These results and plans indicate the verification program for SSF's 75-kW EPS would have been successful and completed in time to support the scheduled first element launch.

  6. Challenges in High-Assurance Runtime Verification

    NASA Technical Reports Server (NTRS)

    Goodloe, Alwyn E.

    2016-01-01

    Safety-critical systems are growing more complex and becoming increasingly autonomous. Runtime Verification (RV) has the potential to provide protections when a system cannot be assured by conventional means, but only if the RV itself can be trusted. In this paper, we proffer a number of challenges to realizing high-assurance RV and illustrate how we have addressed them in our research. We argue that high-assurance RV provides a rich target for automated verification tools in hope of fostering closer collaboration among the communities.

  7. Verification of Plan Models Using UPPAAL

    NASA Technical Reports Server (NTRS)

    Khatib, Lina; Muscettola, Nicola; Haveland, Klaus; Lau, Sonic (Technical Monitor)

    2001-01-01

    This paper describes work on the verification of HSTS, the planner and scheduler of the Remote Agent autonomous control system deployed in Deep Space 1 (DS1). The verification is done using UPPAAL, a real time model checking tool. We start by motivating our work in the introduction. Then we give a brief description of HSTS and UPPAAL. After that, we give a mapping of HSTS models into UPPAAL and we present samples of plan model properties one may want to verify. Finally, we conclude with a summary.

  8. The formal verification of generic interpreters

    NASA Technical Reports Server (NTRS)

    Windley, P.; Levitt, K.; Cohen, G. C.

    1991-01-01

    Task assignment 3 of the design and validation of digital flight control systems suitable for fly-by-wire applications is studied. Task 3 is concerned with formal verification of embedded systems. In particular, results are presented that provide a methodological approach to microprocessor verification. A hierarchical decomposition strategy for specifying microprocessors is also presented. A theory of generic interpreters is presented that can be used to model microprocessor behavior. The generic interpreter theory abstracts away the details of instruction functionality, leaving a general model of what an interpreter does.

  9. Generic Protocol for the Verification of Ballast Water Treatment Technology

    EPA Science Inventory

    In anticipation of the need to address performance verification and subsequent approval of new and innovative ballast water treatment technologies for shipboard installation, the U.S Coast Guard and the Environmental Protection Agency‘s Environmental Technology Verification Progr...

  10. Collective helping and bystander effects in coevolving helping networks

    NASA Astrophysics Data System (ADS)

    Jo, Hang-Hyun; Lee, Hyun Keun; Park, Hyunggyu

    2010-06-01

    We study collective helping behavior and bystander effects in a coevolving helping network model. A node of the network represents an agent who renders or receives help, and a link represents a friendly relation between agents. A helping trial of an agent depends on relations with other involved agents, and its result (success or failure) updates the relation between the helper and the recipient. We study the network link dynamics and its steady states analytically and numerically. The full phase diagram is presented with various kinds of active and inactive phases, and the nature of the phase transitions is explored. We find various interesting bystander effects, consistent with field study results, for which the underlying mechanism is proposed.
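
    The feedback loop described above (trial outcome updating the helper-recipient relation) can be caricatured in a few lines: a trial succeeds with probability set by the current relation strength, and success reinforces the link while failure erodes it. The update rule and parameter values below are illustrative placeholders, not the paper's exact dynamics:

```python
# Schematic single step of a coevolving helping link: the trial succeeds
# with probability equal to the current relation strength w, and the
# outcome feeds back on w. Rule and parameters are illustrative only.
import random

def helping_step(w, delta=0.1, rng=random):
    """One trial on a link of strength w in [0, 1]; returns (new_w, success)."""
    success = rng.random() < w          # stronger relation -> likelier help
    w += delta if success else -delta   # success reinforces, failure erodes
    return min(1.0, max(0.0, w)), success

rng = random.Random(42)
w = 0.5
for _ in range(100):
    w, _ = helping_step(w, rng=rng)
print(0.0 <= w <= 1.0)  # True: relation strength stays bounded
```

    Iterating such updates over all links of a network is what produces the active and inactive phases the paper maps out.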

  11. Earth Science Enterprise Scientific Data Purchase Project: Verification and Validation

    NASA Technical Reports Server (NTRS)

    Jenner, Jeff; Policelli, Fritz; Fletcher, Rosea; Holecamp, Kara; Owen, Carolyn; Nicholson, Lamar; Dartez, Deanna

    2000-01-01

    This paper presents viewgraphs on the Earth Science Enterprise Scientific Data Purchase Project's verification and validation process. The topics include: (1) What is Verification and Validation? (2) Why Verification and Validation? (3) Background; (4) ESE Data Purchase Validation Process; (5) Data Validation System and Ingest Queue; (6) Shipment Verification; (7) Tracking and Metrics; (8) Validation of Contract Specifications; (9) Earth Watch Data Validation; (10) Validation of Vertical Accuracy; and (11) Results of Vertical Accuracy Assessment.

  12. NEUTRON MULTIPLICITY AND ACTIVE WELL NEUTRON COINCIDENCE VERIFICATION MEASUREMENTS PERFORMED FOR MARCH 2009 SEMI-ANNUAL DOE INVENTORY

    SciTech Connect

    Dewberry, R.; Ayers, J.; Tietze, F.; Klapper, K.

    2010-02-05

    The Analytical Development (AD) Section field nuclear measurement group performed six 'best available technique' verification measurements to satisfy a DOE requirement instituted for the March 2009 semi-annual inventory. The requirement of reference 1 yielded the need for the SRNL Research Operations Department Material Control & Accountability (MC&A) group to measure the Pu content of five items and the highly enriched uranium (HEU) content of two. No 14Q-qualified measurement equipment was available to satisfy the requirement. The AD field nuclear group has routinely performed the required confirmatory measurements for the semi-annual inventories for fifteen years using sodium iodide and high-purity germanium (HpGe) γ-ray pulse height analysis nondestructive assay (NDA) instruments. With appropriate γ-ray acquisition modeling, the HpGe spectrometers can be used to perform verification-type quantitative assay for Pu isotopics and HEU content. The AD nuclear NDA group is widely experienced with this type of measurement and reports content for these species in requested process control, MC&A booking, and holdup measurement assays Site-wide. However, none of the AD HpGe γ-ray spectrometers have been 14Q-qualified, and the requirement of reference 1 specifically excluded a γ-ray PHA measurement from those it would accept for the required verification measurements. The requirement of reference 1 was a new requirement for which the Savannah River National Laboratory (SRNL) Research Operations Department (ROD) MC&A group was unprepared. The criteria for exemption from verification were: (1) isotope content below 50 grams; (2) intrinsically tamper-indicating or TID-sealed items which contain a Category IV quantity of material; (3) assembled components; and (4) laboratory samples. Therefore all SRNL Material Balance Area (MBA) items with greater than 50 grams total Pu or greater than 50 grams HEU were subject to a verification measurement. The pass

  13. 40 CFR 1065.395 - Inertial PM balance verifications.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 40 Protection of Environment 32 2010-07-01 2010-07-01 false Inertial PM balance verifications... Inertial PM balance verifications. This section describes how to verify the performance of an inertial PM balance. (a) Independent verification. Have the balance manufacturer (or a representative approved by...

  14. 40 CFR 1065.395 - Inertial PM balance verifications.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... 40 Protection of Environment 33 2014-07-01 2014-07-01 false Inertial PM balance verifications... Inertial PM balance verifications. This section describes how to verify the performance of an inertial PM balance. (a) Independent verification. Have the balance manufacturer (or a representative approved by...

  15. 40 CFR 1065.395 - Inertial PM balance verifications.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... 40 Protection of Environment 33 2011-07-01 2011-07-01 false Inertial PM balance verifications... Inertial PM balance verifications. This section describes how to verify the performance of an inertial PM balance. (a) Independent verification. Have the balance manufacturer (or a representative approved by...

  16. 40 CFR 1065.395 - Inertial PM balance verifications.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... 40 Protection of Environment 34 2013-07-01 2013-07-01 false Inertial PM balance verifications... Inertial PM balance verifications. This section describes how to verify the performance of an inertial PM balance. (a) Independent verification. Have the balance manufacturer (or a representative approved by...

  17. 40 CFR 1065.395 - Inertial PM balance verifications.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... 40 Protection of Environment 34 2012-07-01 2012-07-01 false Inertial PM balance verifications... Inertial PM balance verifications. This section describes how to verify the performance of an inertial PM balance. (a) Independent verification. Have the balance manufacturer (or a representative approved by...

  18. 45 CFR 1626.6 - Verification of citizenship.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... 45 Public Welfare 4 2014-10-01 2014-10-01 false Verification of citizenship. 1626.6 Section 1626.6... ON LEGAL ASSISTANCE TO ALIENS § 1626.6 Verification of citizenship. (a) A recipient shall require all... require verification of citizenship. A recipient shall not consider factors such as a person's...

  19. 24 CFR 5.512 - Verification of eligible immigration status.

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ... immigration status. 5.512 Section 5.512 Housing and Urban Development Office of the Secretary, Department of... Noncitizens § 5.512 Verification of eligible immigration status. (a) General. Except as described in paragraph...) Primary verification—(1) Automated verification system. Primary verification of the immigration status...

  20. 24 CFR 5.512 - Verification of eligible immigration status.

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... immigration status. 5.512 Section 5.512 Housing and Urban Development Office of the Secretary, Department of... Noncitizens § 5.512 Verification of eligible immigration status. (a) General. Except as described in paragraph...) Primary verification—(1) Automated verification system. Primary verification of the immigration status...

  1. 24 CFR 5.512 - Verification of eligible immigration status.

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... immigration status. 5.512 Section 5.512 Housing and Urban Development Office of the Secretary, Department of... Noncitizens § 5.512 Verification of eligible immigration status. (a) General. Except as described in paragraph...) Primary verification—(1) Automated verification system. Primary verification of the immigration status...

  2. 24 CFR 5.512 - Verification of eligible immigration status.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... immigration status. 5.512 Section 5.512 Housing and Urban Development Office of the Secretary, Department of... Noncitizens § 5.512 Verification of eligible immigration status. (a) General. Except as described in paragraph...) Primary verification—(1) Automated verification system. Primary verification of the immigration status...

  3. 24 CFR 5.512 - Verification of eligible immigration status.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... immigration status. 5.512 Section 5.512 Housing and Urban Development Office of the Secretary, Department of... Noncitizens § 5.512 Verification of eligible immigration status. (a) General. Except as described in paragraph...) Primary verification—(1) Automated verification system. Primary verification of the immigration status...

  4. 8 CFR 343b.5 - Verification of naturalization.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... 8 Aliens and Nationality 1 2012-01-01 2012-01-01 false Verification of naturalization. 343b.5... CERTIFICATE OF NATURALIZATION FOR RECOGNITION BY A FOREIGN STATE § 343b.5 Verification of naturalization. The application shall not be granted without first obtaining verification of the applicant's naturalization....

  5. 8 CFR 343b.5 - Verification of naturalization.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... 8 Aliens and Nationality 1 2011-01-01 2011-01-01 false Verification of naturalization. 343b.5... CERTIFICATE OF NATURALIZATION FOR RECOGNITION BY A FOREIGN STATE § 343b.5 Verification of naturalization. The application shall not be granted without first obtaining verification of the applicant's naturalization....

  6. 8 CFR 343b.5 - Verification of naturalization.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... 8 Aliens and Nationality 1 2014-01-01 2014-01-01 false Verification of naturalization. 343b.5... CERTIFICATE OF NATURALIZATION FOR RECOGNITION BY A FOREIGN STATE § 343b.5 Verification of naturalization. The application shall not be granted without first obtaining verification of the applicant's naturalization....

  7. 8 CFR 343b.5 - Verification of naturalization.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... 8 Aliens and Nationality 1 2013-01-01 2013-01-01 false Verification of naturalization. 343b.5... CERTIFICATE OF NATURALIZATION FOR RECOGNITION BY A FOREIGN STATE § 343b.5 Verification of naturalization. The application shall not be granted without first obtaining verification of the applicant's naturalization....

  8. 8 CFR 343b.5 - Verification of naturalization.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 8 Aliens and Nationality 1 2010-01-01 2010-01-01 false Verification of naturalization. 343b.5... CERTIFICATE OF NATURALIZATION FOR RECOGNITION BY A FOREIGN STATE § 343b.5 Verification of naturalization. The application shall not be granted without first obtaining verification of the applicant's naturalization....

  9. 37 CFR 261.7 - Verification of royalty payments.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    .... This section prescribes general rules pertaining to the verification by any Copyright Owner or... section shall apply to situations where a Copyright Owner or a Performer and a Designated Agent have agreed as to proper verification methods. (b) Frequency of verification. A Copyright Owner or a...

  10. 40 CFR 1065.920 - PEMS calibrations and verifications.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... test conditions. As provided in 40 CFR 1068.5, we will deem your system to not meet the requirements of... 40 Protection of Environment 33 2014-07-01 2014-07-01 false PEMS calibrations and verifications....920 PEMS calibrations and verifications. (a) Subsystem calibrations and verifications. Use all...

  11. Verification of Advective Bar Elements Implemented in the Aria Thermal Response Code.

    SciTech Connect

    Mills, Brantley

    2016-01-01

    A verification effort was undertaken to evaluate the implementation of the new advective bar capability in the Aria thermal response code. Several approaches to the verification process were taken: a mesh refinement study to demonstrate solution convergence in the fluid and the solid, visually examining the mapping of the advective bar element nodes to the surrounding surfaces, and a comparison of solutions produced using the advective bars for simple geometries with solutions from commercial CFD software. The mesh refinement study has shown solution convergence for simple pipe flow in both temperature and velocity. Guidelines were provided to achieve appropriate meshes between the advective bar elements and the surrounding volume. Simulations of pipe flow using advective bar elements in Aria have been compared to simulations using the commercial CFD software ANSYS Fluent (r) and provided comparable solutions in temperature and velocity, supporting proper implementation of the new capability. Acknowledgements: A special thanks goes to Dean Dobranich for his guidance and expertise through all stages of this effort. His advice and feedback were instrumental to its completion. Thanks also goes to Sam Subia and Tolu Okusanya for helping to plan many of the verification activities performed in this document. Thank you to Sam, Justin Lamb, and Victor Brunini for their assistance in resolving issues encountered with running the advective bar element model. Finally, thanks goes to Dean, Sam, and Adam Hetzler for reviewing the document and providing very valuable comments.
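
    A mesh refinement study of the kind mentioned above is commonly quantified by estimating the observed order of accuracy from solutions on three systematically refined meshes. A sketch of that calculation with synthetic values (not Aria output):

```python
# Richardson-style estimate of the observed order of accuracy p from
# solutions on three meshes refined by a constant ratio r:
#   p = ln((f_coarse - f_medium) / (f_medium - f_fine)) / ln(r)
# The temperatures below are synthetic, constructed to converge with
# second-order error ~ C*h^2.
import math

def observed_order(f_coarse, f_medium, f_fine, r=2.0):
    return math.log((f_coarse - f_medium) / (f_medium - f_fine)) / math.log(r)

exact = 300.0
f_c, f_m, f_f = exact + 4.0, exact + 1.0, exact + 0.25  # errors at h, h/2, h/4
print(round(observed_order(f_c, f_m, f_f), 3))  # 2.0 -> second-order convergence
```

    An observed p close to the scheme's formal order is the usual pass criterion for this kind of convergence verification.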

  12. Technology Helps Students Learn Grammar.

    ERIC Educational Resources Information Center

    Bowen, Candace Perkins

    1999-01-01

    Describes several recent approaches on college campuses that use technology (including both Web sites and CD-ROM virtual environments) to help journalism students learn grammar. Notes successes and problems. (SR)

  13. Going Local to Find Help

    MedlinePlus

    MedlinePlus cover story: Traumatic Brain Injury. From the MedlinePlus page on Traumatic Brain Injury, you can use Go Local to find specific ...

  14. The Global Diffusion of Societal Verification Tools: A Quantitative Assessment of the Public’s Ability to Engage Nonproliferation Treaty Monitoring

    SciTech Connect

    Sayre, Amanda M.; Kreyling, Sean J.; West, Curtis L.

    2015-07-11

    The spread of nuclear and dual-use technologies and the need for more robust, effective, and efficient nonproliferation and arms control treaties have led to an increasing need for innovative verification approaches and technologies. This need, paired with advancements in online computing, mobile devices, commercially available satellite imagery, and the evolution of online social networks, has led to a resurgence of the concept of societal verification for arms control and nonproliferation treaties. In the event a country accepts its citizens' assistance in supporting transparency, confidence-building, and societal verification, the host government will need a population that is willing and able to participate. While scholarly interest in societal verification continues to grow, social scientific research on the topic is lacking. The aim of this paper is to begin the process of understanding public societal verification capabilities, extend the availability of quantitative research on societal verification, and set in motion complementary research to increase the breadth and depth of knowledge on this topic. This paper presents a potential framework and outlines a research roadmap for the development of such a societal verification capability index.

  15. Gamma electron vertex imaging and application to beam range verification in proton therapy

    SciTech Connect

    Hyeong Kim, Chan; Hyung Park, Jin; Seo, Hee; Rim Lee, Han

    2012-02-15

    Purpose: This paper describes a new gamma-ray imaging method, "gamma electron vertex imaging (GEVI)", which can be used for precise beam range verification in proton therapy. Methods: In GEVI imaging, the high-energy gammas from a source or from nuclear interactions are first converted, by Compton scattering, to electrons, which are subsequently traced by hodoscopes to determine the location of the gamma source or the vertices of the nuclear interactions. The performance of GEVI imaging for use in beam range verification was evaluated by Monte Carlo simulations employing geant4 equipped with the QGSP_BIC_HP physics package. Results: Our simulation results show that GEVI imaging can determine the proton beam range very accurately, within 2-3 mm of error, even without any sophisticated analysis. The results were obtained under simplified conditions of monoenergetic pencil beams stopped in a homogeneous phantom; on the basis of these results, submillimeter accuracy in proton beam range measurement is expected to be achievable. Conclusions: If future experimental work confirms the simulated results presented in this paper, GEVI imaging is expected to have great potential to increase the accuracy of proton beam range verification in a patient, resulting in significant improvement of treatment effectiveness by enabling tight conformation of the radiation dose to the tumor volume, as well as improved patient safety.
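
    The vertex determination from traced electrons can be illustrated geometrically: each track back-projects to a 3D line, and a vertex estimate for a pair of tracks is the midpoint of their common perpendicular (the shortest segment joining the two lines). A sketch with synthetic track data; real GEVI reconstruction involves far more than this:

```python
# Closest-approach vertex estimate for two 3D lines p + t*d (points p,
# directions d given as 3-element lists). Synthetic geometry only.
def sub(a, b): return [x - y for x, y in zip(a, b)]
def dot(a, b): return sum(x * y for x, y in zip(a, b))

def closest_point(p1, d1, p2, d2):
    """Midpoint of the common perpendicular of lines p1+t*d1 and p2+s*d2."""
    w = sub(p1, p2)
    a, b, c = dot(d1, d1), dot(d1, d2), dot(d2, d2)
    d, e = dot(d1, w), dot(d2, w)
    denom = a * c - b * b                 # zero only for parallel lines
    t = (b * e - c * d) / denom
    s = (a * e - b * d) / denom
    q1 = [p + t * v for p, v in zip(p1, d1)]  # closest point on line 1
    q2 = [p + s * v for p, v in zip(p2, d2)]  # closest point on line 2
    return [(x + y) / 2 for x, y in zip(q1, q2)]

# Two back-projected tracks that intersect at (1, 1, 0):
print(closest_point([0, 0, 0], [1, 1, 0], [2, 0, 0], [-1, 1, 0]))
```

    Accumulating many such pairwise estimates along the beam axis is one way a vertex density profile, and hence a range estimate, could be built up.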

  16. Nuclear Medicine

    MedlinePlus

    What is nuclear medicine? Nuclear medicine is a medical specialty that uses ...

  17. 42 CFR 457.380 - Eligibility verification.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... 42 Public Health 4 2012-10-01 2012-10-01 false Eligibility verification. 457.380 Section 457.380 Public Health CENTERS FOR MEDICARE & MEDICAID SERVICES, DEPARTMENT OF HEALTH AND HUMAN SERVICES... this chapter, the State must obtain the information through that service. (h) Interaction with...

  18. 42 CFR 457.380 - Eligibility verification.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... 42 Public Health 4 2014-10-01 2014-10-01 false Eligibility verification. 457.380 Section 457.380 Public Health CENTERS FOR MEDICARE & MEDICAID SERVICES, DEPARTMENT OF HEALTH AND HUMAN SERVICES...) Interaction with program integrity requirements. Nothing in this section should be construed as limiting...

  19. 9 CFR 416.17 - Agency verification.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... 9 Animals and Animal Products 2 2014-01-01 2014-01-01 false Agency verification. 416.17 Section 416.17 Animals and Animal Products FOOD SAFETY AND INSPECTION SERVICE, DEPARTMENT OF AGRICULTURE... (d) Direct observation or testing to assess the sanitary conditions in the establishment....

  20. 9 CFR 416.17 - Agency verification.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... 9 Animals and Animal Products 2 2012-01-01 2012-01-01 false Agency verification. 416.17 Section 416.17 Animals and Animal Products FOOD SAFETY AND INSPECTION SERVICE, DEPARTMENT OF AGRICULTURE... (d) Direct observation or testing to assess the sanitary conditions in the establishment....

  1. 9 CFR 416.17 - Agency verification.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... 9 Animals and Animal Products 2 2011-01-01 2011-01-01 false Agency verification. 416.17 Section 416.17 Animals and Animal Products FOOD SAFETY AND INSPECTION SERVICE, DEPARTMENT OF AGRICULTURE... (d) Direct observation or testing to assess the sanitary conditions in the establishment....

  2. Synthesis, verification, and optimization of systolic arrays

    SciTech Connect

    Rajopadhye, S.V.

    1986-01-01

    This dissertation addresses the issue of providing a sound theoretical basis for three important issues relating to systolic arrays, namely synthesis, verification, and optimization. Prior research has concentrated on analysis of the dependency structure of the computation, and there have been numerous approaches to map this dependency structure onto a locally interconnected network. This study pursues a similar approach, but with a major generalization of the class of problems analyzed. In earlier research, it was essential that the dependencies were expressible as constant vectors (from a point in the domain to the points that it depended on); here they are permitted to be arbitrary linear functions of the point. Theory for synthesizing systolic architectures from such generalized specifications is developed. Also, a systematic (mechanizable) approach to the synthesis of systolic architectures that have control signals is presented. In the areas of verification and optimization, a rigorous mathematical framework is presented that permits reasoning about the behavior of systolic arrays as functions on streams of data. Using this approach, the verification of such architectures reduces to the problem of verification of functional programs.

  3. The ALMA Commissioning and Science Verification Team

    NASA Astrophysics Data System (ADS)

    Hales, A.; Sheth, K.; Wilson, T. L.

    2010-04-01

    The goal of Commissioning is to take ALMA from the stage reached at the end of AIV, that is, a system that functions at an engineering level, to an instrument that meets the science/astronomy requirements. Science Verification is the quantitative confirmation that the data produced by the instrument are valid and have the required characteristics in terms of sensitivity, image quality, and accuracy.

  4. The Assembly, Integration, and Verification (AIV) team

    NASA Astrophysics Data System (ADS)

    2009-06-01

    Assembly, Integration, and Verification (AIV) is the process by which the software and hardware deliveries from the distributed ALMA partners (North America, South America, Europe, and East Asia) are assembled and integrated into a working system, and the initial technical capabilities are tested to ensure that they will meet the observatory's exacting requirements for science.

  5. An Interactive System for Graduation Verification.

    ERIC Educational Resources Information Center

    Wang, Y.; Dasarathy, B.

    1981-01-01

    This description of a computerized graduation verification system developed and implemented at the University of South Carolina at Columbia discusses the "table-driven" feature of the programs and details the implementation of the system, including examples of the Extended Backus Naur Form (EBNF) notation used to represent the system language…

  6. ENVIRONMENTAL TECHNOLOGY VERIFICATION OF BAGHOUSE FILTRATION PRODUCTS

    EPA Science Inventory

    The Environmental Technology Verification Program (ETV) was started by EPA in 1995 to generate independent credible data on the performance of innovative technologies that have potential to improve protection of public health and the environment. ETV does not approve or certify p...

  7. 20 CFR 325.6 - Verification procedures.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... Employees' Benefits RAILROAD RETIREMENT BOARD REGULATIONS UNDER THE RAILROAD UNEMPLOYMENT INSURANCE ACT REGISTRATION FOR RAILROAD UNEMPLOYMENT BENEFITS § 325.6 Verification procedures. The Board's procedures for adjudicating and processing applications and claims for unemployment benefits filed pursuant to this part...

  8. 20 CFR 325.6 - Verification procedures.

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... Employees' Benefits RAILROAD RETIREMENT BOARD REGULATIONS UNDER THE RAILROAD UNEMPLOYMENT INSURANCE ACT REGISTRATION FOR RAILROAD UNEMPLOYMENT BENEFITS § 325.6 Verification procedures. The Board's procedures for adjudicating and processing applications and claims for unemployment benefits filed pursuant to this part...

  9. 20 CFR 325.6 - Verification procedures.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... Employees' Benefits RAILROAD RETIREMENT BOARD REGULATIONS UNDER THE RAILROAD UNEMPLOYMENT INSURANCE ACT REGISTRATION FOR RAILROAD UNEMPLOYMENT BENEFITS § 325.6 Verification procedures. The Board's procedures for adjudicating and processing applications and claims for unemployment benefits filed pursuant to this part...

  10. 20 CFR 325.6 - Verification procedures.

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ... Employees' Benefits RAILROAD RETIREMENT BOARD REGULATIONS UNDER THE RAILROAD UNEMPLOYMENT INSURANCE ACT REGISTRATION FOR RAILROAD UNEMPLOYMENT BENEFITS § 325.6 Verification procedures. The Board's procedures for adjudicating and processing applications and claims for unemployment benefits filed pursuant to this part...

  11. 20 CFR 325.6 - Verification procedures.

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... Employees' Benefits RAILROAD RETIREMENT BOARD REGULATIONS UNDER THE RAILROAD UNEMPLOYMENT INSURANCE ACT REGISTRATION FOR RAILROAD UNEMPLOYMENT BENEFITS § 325.6 Verification procedures. The Board's procedures for adjudicating and processing applications and claims for unemployment benefits filed pursuant to this part...

  12. 21 CFR 123.8 - Verification.

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... 21 Food and Drugs 2 2012-04-01 2012-04-01 false Verification. 123.8 Section 123.8 Food and Drugs FOOD AND DRUG ADMINISTRATION, DEPARTMENT OF HEALTH AND HUMAN SERVICES (CONTINUED) FOOD FOR HUMAN... processor shall verify that the HACCP plan is adequate to control food safety hazards that are...

  13. 21 CFR 123.8 - Verification.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... 21 Food and Drugs 2 2011-04-01 2011-04-01 false Verification. 123.8 Section 123.8 Food and Drugs FOOD AND DRUG ADMINISTRATION, DEPARTMENT OF HEALTH AND HUMAN SERVICES (CONTINUED) FOOD FOR HUMAN... processor shall verify that the HACCP plan is adequate to control food safety hazards that are...

  14. Needs Assessment Project: FY 82 Verification Study.

    ERIC Educational Resources Information Center

    Shively, Joe E.; O'Donnell, Phyllis

    As part of a continuing assessment of educational needs in a seven-state region, researchers conducted a verification study to check the validity of educational needs first identified in fiscal year (FY) 1980. The seven states comprise Alabama, Kentucky, Ohio, Pennsylvania, Tennessee, Virginia, and West Virginia. This report describes assessment…

  15. 16 CFR 315.5 - Prescriber verification.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... Commercial Practices FEDERAL TRADE COMMISSION REGULATIONS UNDER SPECIFIC ACTS OF CONGRESS CONTACT LENS RULE § 315.5 Prescriber verification. (a) Prescription requirement. A seller may sell contact lenses only in accordance with a contact lens prescription for the patient that is: (1) Presented to the seller by...

  16. 16 CFR 315.5 - Prescriber verification.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... Commercial Practices FEDERAL TRADE COMMISSION REGULATIONS UNDER SPECIFIC ACTS OF CONGRESS CONTACT LENS RULE § 315.5 Prescriber verification. (a) Prescription requirement. A seller may sell contact lenses only in accordance with a contact lens prescription for the patient that is: (1) Presented to the seller by...

  17. Distilling the Verification Process for Prognostics Algorithms

    NASA Technical Reports Server (NTRS)

    Roychoudhury, Indranil; Saxena, Abhinav; Celaya, Jose R.; Goebel, Kai

    2013-01-01

    The goal of prognostics and health management (PHM) systems is to ensure system safety, and reduce downtime and maintenance costs. It is important that a PHM system is verified and validated before it can be successfully deployed. Prognostics algorithms are integral parts of PHM systems. This paper investigates a systematic process of verification of such prognostics algorithms. To this end, first, this paper distinguishes between technology maturation and product development. Then, the paper describes the verification process for a prognostics algorithm as it moves up to higher maturity levels. This process is shown to be an iterative process where verification activities are interleaved with validation activities at each maturation level. In this work, we adopt the concept of technology readiness levels (TRLs) to represent the different maturity levels of a prognostics algorithm. It is shown that at each TRL, the verification of a prognostics algorithm depends on verifying the different components of the algorithm according to the requirements laid out by the PHM system that adopts this prognostics algorithm. Finally, using simplified examples, the systematic process for verifying a prognostics algorithm is demonstrated as the prognostics algorithm moves up TRLs.

  18. 14 CFR 460.17 - Verification program.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... software in an operational flight environment before allowing any space flight participant on board during... 14 Aeronautics and Space 4 2013-01-01 2013-01-01 false Verification program. 460.17 Section 460.17 Aeronautics and Space COMMERCIAL SPACE TRANSPORTATION, FEDERAL AVIATION ADMINISTRATION, DEPARTMENT...

  19. 14 CFR 460.17 - Verification program.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... software in an operational flight environment before allowing any space flight participant on board during... 14 Aeronautics and Space 4 2010-01-01 2010-01-01 false Verification program. 460.17 Section 460.17 Aeronautics and Space COMMERCIAL SPACE TRANSPORTATION, FEDERAL AVIATION ADMINISTRATION, DEPARTMENT...

  20. Environmental Technology Verification (ETV) Quality Program (Poster)

    EPA Science Inventory

    This is a poster created for the ETV Quality Program. The EPA Environmental Technology Verification Program (ETV) develops test protocols and verifies the performance of innovative technologies that have the potential to improve protection of human health and the environment. The...

  1. Hardware verification at Computational Logic, Inc.

    NASA Technical Reports Server (NTRS)

    Brock, Bishop C.; Hunt, Warren A., Jr.

    1990-01-01

    The following topics are covered in viewgraph form: (1) hardware verification; (2) Boyer-Moore logic; (3) core RISC; (4) the FM8502 fabrication, implementation specification, and pinout; (5) hardware description language; (6) arithmetic logic generator; (7) near term expected results; (8) present trends; (9) future directions; (10) collaborations and technology transfer; and (11) technology enablers.

  2. 24 CFR 257.112 - Mortgagee verifications.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... URBAN DEVELOPMENT MORTGAGE AND LOAN INSURANCE PROGRAMS UNDER NATIONAL HOUSING ACT AND OTHER AUTHORITIES... income. (b) Mortgage fraud verification. The mortgagor shall provide a certification to the mortgagee that the mortgagor has not been convicted under federal or state law for fraud during the...

  3. 24 CFR 257.112 - Mortgagee verifications.

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... URBAN DEVELOPMENT MORTGAGE AND LOAN INSURANCE PROGRAMS UNDER NATIONAL HOUSING ACT AND OTHER AUTHORITIES... income. (b) Mortgage fraud verification. The mortgagor shall provide a certification to the mortgagee that the mortgagor has not been convicted under federal or state law for fraud during the...

  4. Gender, Legitimation, and Identity Verification in Groups

    ERIC Educational Resources Information Center

    Burke, Peter J.; Stets, Jan E.; Cerven, Christine

    2007-01-01

    Drawing upon identity theory, expectation states theory, and legitimation theory, we examine how the task leader identity in task-oriented groups is more likely to be verified for persons with high status characteristics. We hypothesize that identity verification will be accomplished more readily for male group members and legitimated task leaders…

  5. Environmental Technology Verification Program Fact Sheet

    EPA Science Inventory

    This is a Fact Sheet for the ETV Program. The EPA Environmental Technology Verification Program (ETV) develops test protocols and verifies the performance of innovative technologies that have the potential to improve protection of human health and the environment. The program ...

  6. 18 CFR 12.13 - Verification form.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... 18 Conservation of Power and Water Resources 1 2011-04-01 2011-04-01 false Verification form. 12.13 Section 12.13 Conservation of Power and Water Resources FEDERAL ENERGY REGULATORY COMMISSION, DEPARTMENT OF ENERGY REGULATIONS UNDER THE FEDERAL POWER ACT SAFETY OF WATER POWER PROJECTS AND PROJECT...

  7. 18 CFR 12.13 - Verification form.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... 18 Conservation of Power and Water Resources 1 2010-04-01 2010-04-01 false Verification form. 12.13 Section 12.13 Conservation of Power and Water Resources FEDERAL ENERGY REGULATORY COMMISSION, DEPARTMENT OF ENERGY REGULATIONS UNDER THE FEDERAL POWER ACT SAFETY OF WATER POWER PROJECTS AND PROJECT...

  8. 18 CFR 12.13 - Verification form.

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... 18 Conservation of Power and Water Resources 1 2014-04-01 2014-04-01 false Verification form. 12.13 Section 12.13 Conservation of Power and Water Resources FEDERAL ENERGY REGULATORY COMMISSION, DEPARTMENT OF ENERGY REGULATIONS UNDER THE FEDERAL POWER ACT SAFETY OF WATER POWER PROJECTS AND PROJECT...

  9. 18 CFR 12.13 - Verification form.

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ... 18 Conservation of Power and Water Resources 1 2013-04-01 2013-04-01 false Verification form. 12.13 Section 12.13 Conservation of Power and Water Resources FEDERAL ENERGY REGULATORY COMMISSION, DEPARTMENT OF ENERGY REGULATIONS UNDER THE FEDERAL POWER ACT SAFETY OF WATER POWER PROJECTS AND PROJECT...

  10. 18 CFR 12.13 - Verification form.

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... 18 Conservation of Power and Water Resources 1 2012-04-01 2012-04-01 false Verification form. 12.13 Section 12.13 Conservation of Power and Water Resources FEDERAL ENERGY REGULATORY COMMISSION, DEPARTMENT OF ENERGY REGULATIONS UNDER THE FEDERAL POWER ACT SAFETY OF WATER POWER PROJECTS AND PROJECT...

  11. Racing to help: racial bias in high emergency helping situations.

    PubMed

    Kunstman, Jonathan W; Plant, E Ashby

    2008-12-01

    The present work explored the influence of emergency severity on racial bias in helping behavior. Three studies placed participants in staged emergencies and measured differences in the speed and quantity of help offered to Black and White victims. Consistent with predictions, as the level of emergency increased, the speed and quality of help White participants offered to Black victims relative to White victims decreased. In line with the authors' predictions based on an integration of aversive racism theory and the arousal: cost-reward perspective on prosocial behavior, severe emergencies with Black victims elicited high levels of aversion from White helpers, and these high levels of aversion were directly related to the slower help offered to Black victims but not to White victims (Study 1). In addition, the bias was related to White individuals' interpretation of the emergency as less severe and themselves as less responsible to help Black victims rather than White victims (Studies 2 and 3). Study 3 also illustrated that emergency racial bias is unique to White individuals' responses to Black victims and not evinced by Black helpers.

  12. Proton range verification through prompt gamma-ray spectroscopy

    NASA Astrophysics Data System (ADS)

    Verburg, Joost M.; Seco, Joao

    2014-12-01

    We present an experimental study of a novel method to verify the range of proton therapy beams. Differential cross sections were measured for 15 prompt gamma-ray lines from proton-nuclear interactions with 12C and 16O at proton energies up to 150 MeV. These cross sections were used to model discrete prompt gamma-ray emissions along proton pencil-beams. By fitting detected prompt gamma-ray counts to these models, we simultaneously determined the beam range and the oxygen and carbon concentration of the irradiated matter. The performance of the method was assessed in two phantoms with different elemental concentrations, using a small scale prototype detector. Based on five pencil-beams with different ranges delivering 5 × 108 protons and without prior knowledge of the elemental composition at the measurement point, the absolute range was determined with a standard deviation of 1.0-1.4 mm. Relative range shifts at the same dose level were detected with a standard deviation of 0.3-0.5 mm. The determined oxygen and carbon concentrations also agreed well with the actual values. These results show that quantitative prompt gamma-ray measurements enable knowledge of nuclear reaction cross sections to be used for precise proton range verification in the presence of tissue with an unknown composition.

  13. Proton range verification through prompt gamma-ray spectroscopy.

    PubMed

    Verburg, Joost M; Seco, Joao

    2014-12-01

    We present an experimental study of a novel method to verify the range of proton therapy beams. Differential cross sections were measured for 15 prompt gamma-ray lines from proton-nuclear interactions with (12)C and (16)O at proton energies up to 150 MeV. These cross sections were used to model discrete prompt gamma-ray emissions along proton pencil-beams. By fitting detected prompt gamma-ray counts to these models, we simultaneously determined the beam range and the oxygen and carbon concentration of the irradiated matter. The performance of the method was assessed in two phantoms with different elemental concentrations, using a small scale prototype detector. Based on five pencil-beams with different ranges delivering 5 × 10(8) protons and without prior knowledge of the elemental composition at the measurement point, the absolute range was determined with a standard deviation of 1.0-1.4 mm. Relative range shifts at the same dose level were detected with a standard deviation of 0.3-0.5 mm. The determined oxygen and carbon concentrations also agreed well with the actual values. These results show that quantitative prompt gamma-ray measurements enable knowledge of nuclear reaction cross sections to be used for precise proton range verification in the presence of tissue with an unknown composition.
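    The fitting step described in the two records above can be illustrated with a toy model: detected prompt-gamma counts along the beam axis are compared against a modelled emission profile shifted to a candidate range, and the candidate minimizing the squared residual is taken as the range estimate. The profile shape, names, and numbers below are illustrative assumptions, not the authors' implementation.

```python
# Toy range determination by model fitting, in the spirit of the
# prompt gamma-ray method: grid-search the candidate range whose
# modelled profile best matches the measured counts (least squares).

def model_profile(z, range_mm):
    """Simplified prompt-gamma emission model: roughly flat along the
    beam path, falling linearly to zero over the last 5 mm of range."""
    if z < range_mm - 5.0:
        return 1.0
    if z < range_mm:
        return (range_mm - z) / 5.0
    return 0.0


def fit_range(positions, counts, candidates):
    """Return the candidate range minimizing the sum of squared
    residuals between measured counts and the modelled profile."""
    best, best_sse = None, float("inf")
    for r in candidates:
        sse = sum((c - model_profile(z, r)) ** 2
                  for z, c in zip(positions, counts))
        if sse < best_sse:
            best, best_sse = r, sse
    return best


# Synthetic noiseless "measurement" generated with a true range of 150 mm:
zs = [float(z) for z in range(0, 200, 2)]
measured = [model_profile(z, 150.0) for z in zs]
cands = [140.0 + 0.5 * i for i in range(41)]  # 140-160 mm, 0.5 mm steps
estimated = fit_range(zs, measured, cands)
```

    In the actual method the "profile" is built from measured nuclear reaction cross sections for each gamma line, and the fit simultaneously recovers the oxygen and carbon concentrations; the sketch keeps only the shift-and-compare idea.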

  14. The AdaptiV Approach to Verification of Adaptive Systems

    SciTech Connect

    Rouff, Christopher; Buskens, Richard; Pullum, Laura L; Cui, Xiaohui; Hinchey, Mike

    2012-01-01

    Adaptive systems are critical for future space and other unmanned and intelligent systems. Verification of these systems is also critical for their use in systems with potential harm to human life or with large financial investments. Due to their nondeterministic nature and extremely large state space, current methods for verification of software systems are not adequate to provide a high level of assurance. The combination of stabilization science, high performance computing simulations, compositional verification and traditional verification techniques, plus operational monitors, provides a complete approach to verification and deployment of adaptive systems that has not been used before. This paper gives an overview of this approach.

  15. On the role of code comparisons in verification and validation.

    SciTech Connect

    Oberkampf, William Louis; Trucano, Timothy Guy; Pilch, Martin M.

    2003-08-01

    This report presents a perspective on the role of code comparison activities in verification and validation. We formally define the act of code comparison as the Code Comparison Principle (CCP) and investigate its application in both verification and validation. One of our primary conclusions is that the use of code comparisons for validation is improper and dangerous. We also conclude that while code comparisons may be argued to provide a beneficial component in code verification activities, there are higher quality code verification tasks that should take precedence. Finally, we provide a process for application of the CCP that we believe is minimal for achieving benefit in verification processes.

  16. Passive Tomography for Spent Fuel Verification: Analysis Framework and Instrument Design Study

    SciTech Connect

    White, Timothy A.; Svard, Staffan J.; Smith, Leon E.; Mozin, Vladimir V.; Jansson, Peter; Davour, Anna; Grape, Sophie; Trellue, H.; Deshmukh, Nikhil S.; Wittman, Richard S.; Honkamaa, Tapani; Vaccaro, Stefano; Ely, James

    2015-05-18

    The potential for gamma emission tomography (GET) to detect partial defects within a spent nuclear fuel assembly is being assessed through a collaboration of Support Programs to the International Atomic Energy Agency (IAEA). In the first phase of this study, two safeguards verification objectives have been identified. The first is the independent determination of the number of active pins that are present in the assembly, in the absence of a priori information. The second objective is to provide quantitative measures of pin-by-pin properties, e.g. activity of key isotopes or pin attributes such as cooling time and relative burnup, for the detection of anomalies and/or verification of operator-declared data. The efficacy of GET to meet these two verification objectives will be evaluated across a range of fuel types, burnups, and cooling times, and with a target interrogation time of less than 60 minutes. The evaluation of GET viability for safeguards applications is founded on a modelling and analysis framework applied to existing and emerging GET instrument designs. Monte Carlo models of different fuel types are used to produce simulated tomographer responses to large populations of “virtual” fuel assemblies. Instrument response data are processed by a variety of tomographic-reconstruction and image-processing methods, and scoring metrics specific to each of the verification objectives are defined and used to evaluate the performance of the methods. This paper will provide a description of the analysis framework and evaluation metrics, example performance-prediction results, and describe the design of a “universal” GET instrument intended to support the full range of verification scenarios envisioned by the IAEA.

  17. Nuclear Lunar Logistics Study

    NASA Technical Reports Server (NTRS)

    1963-01-01

    This document has been prepared to incorporate all presentation aid material, together with some explanatory text, used during an oral briefing on the Nuclear Lunar Logistics System given at the George C. Marshall Space Flight Center, National Aeronautics and Space Administration, on 18 July 1963. The briefing and this document are intended to present the general status of the NERVA (Nuclear Engine for Rocket Vehicle Application) nuclear rocket development, the characteristics of certain operational NERVA-class engines, and appropriate technical and schedule information. Some of the information presented herein is preliminary in nature and will be subject to further verification, checking and analysis during the remainder of the study program. In addition, more detailed information will be prepared in many areas for inclusion in a final summary report. This work has been performed by REON, a division of Aerojet-General Corporation under Subcontract 74-10039 from the Lockheed Missiles and Space Company. The presentation and this document have been prepared in partial fulfillment of the provisions of the subcontract. From the inception of the NERVA program in July 1961, the stated emphasis has centered around the demonstration of the ability of a nuclear rocket to perform safely and reliably in the space environment, with the understanding that the assignment of a mission (or missions) would place undue emphasis on performance and operational flexibility. However, all were aware that the ultimate justification for the development program must lie in the application of the nuclear propulsion system to the national space objectives.

  18. Helping Teachers Help Themselves: Professional Development That Makes a Difference

    ERIC Educational Resources Information Center

    Patton, Kevin; Parker, Melissa; Tannehill, Deborah

    2015-01-01

    For school administrators to facilitate impactful teacher professional development, a shift in thinking that goes beyond the acquisition of new skills and knowledge to helping teachers rethink their practice is required. Based on review of the professional development literature and our own continued observations of professional development, this…

  19. Helping Schools Help Children. Research Report No. 2.

    ERIC Educational Resources Information Center

    National Inst. of Mental Health (DHEW), Bethesda, MD. Center for Studies of Crime and Delinquency.

    This pamphlet briefly reports on an experimental program designed to help the underachieving student whose academic and behavioral problems keep him in trouble with school officials. The project is based on the following premises: (1) children who learn basic academic skills and appropriate behaviors will be less vulnerable to future problems; (2)…

  20. Using the full scale 3D solid anthropometric model in radiation oncology positioning and verification.

    PubMed

    Sun, Shuh-Ping; Wu, Ching-Jung

    2004-01-01

    This paper describes the full-scale solid 3D Anthropometric Model used in the positioning and verification process for radiation treatment planning of the skull of cancer patients in radiotherapy. To obtain a full-scale, solid 3D Anthropometric Model, data are first collected through computed tomography and optical scanning. Through surface reconstruction, a model is made of the patient's skull, after which rapid prototyping and rapid tooling are applied to acquire a 1:1 solid model; it can thus replace the patient for tumor positioning and verification in radiotherapy. The 3D Anthropometric Models not only provide a clear picture of the external appearance but also allow insight into the internal structure of organic bodies, which is of great advantage in radiotherapy. During radiotherapy planning, the 3D Anthropometric Model can be used to simulate all kinds of situations on the simulator and the linear accelerator without the patient needing to be present, so that the medical physicist or dosimetrist will be able to design a precise treatment plan tailored to the patient. The 3D Anthropometric Model production system can effectively help solve problems related to radiotherapy positioning and verification, helping both radiotherapists and cancer patients. We expect that the application of the 3D Anthropometric Model can reduce the time that needs to be spent on pretreatment procedures and enhance the quality of health care for cancer patients.

  1. Verification of monitor unit calculations for non-IMRT clinical radiotherapy: Report of AAPM Task Group 114

    SciTech Connect

    Stern, Robin L.; Heaton, Robert; Fraser, Martin W.; and others

    2011-01-15

    The requirement of an independent verification of the monitor units (MU) or time calculated to deliver the prescribed dose to a patient has been a mainstay of radiation oncology quality assurance. The need for and value of such a verification was obvious when calculations were performed by hand using look-up tables, and the verification was achieved by a second person independently repeating the calculation. However, in a modern clinic using CT/MR/PET simulation, computerized 3D treatment planning, heterogeneity corrections, and complex calculation algorithms such as convolution/superposition and Monte Carlo, the purpose of and methodology for the MU verification have come into question. In addition, since the verification is often performed using a simpler geometrical model and calculation algorithm than the primary calculation, exact or almost exact agreement between the two can no longer be expected. Guidelines are needed to help the physicist set clinically reasonable action levels for agreement. This report addresses the following charges of the task group: (1) To re-evaluate the purpose and methods of the "independent second check" for monitor unit calculations for non-IMRT radiation treatment in light of the complexities of modern-day treatment planning. (2) To present recommendations on how to perform verification of monitor unit calculations in a modern clinic. (3) To provide recommendations on establishing action levels for agreement between primary calculations and verification, and to provide guidance in addressing discrepancies outside the action levels. These recommendations are to be used as guidelines only and shall not be interpreted as requirements.

  2. Verification of monitor unit calculations for non-IMRT clinical radiotherapy: report of AAPM Task Group 114.

    PubMed

    Stern, Robin L; Heaton, Robert; Fraser, Martin W; Goddu, S Murty; Kirby, Thomas H; Lam, Kwok Leung; Molineu, Andrea; Zhu, Timothy C

    2011-01-01

    The requirement of an independent verification of the monitor units (MU) or time calculated to deliver the prescribed dose to a patient has been a mainstay of radiation oncology quality assurance. The need for and value of such a verification was obvious when calculations were performed by hand using look-up tables, and the verification was achieved by a second person independently repeating the calculation. However, in a modern clinic using CT/MR/PET simulation, computerized 3D treatment planning, heterogeneity corrections, and complex calculation algorithms such as convolution/superposition and Monte Carlo, the purpose of and methodology for the MU verification have come into question. In addition, since the verification is often performed using a simpler geometrical model and calculation algorithm than the primary calculation, exact or almost exact agreement between the two can no longer be expected. Guidelines are needed to help the physicist set clinically reasonable action levels for agreement. This report addresses the following charges of the task group: (1) To re-evaluate the purpose and methods of the "independent second check" for monitor unit calculations for non-IMRT radiation treatment in light of the complexities of modern-day treatment planning. (2) To present recommendations on how to perform verification of monitor unit calculations in a modern clinic. (3) To provide recommendations on establishing action levels for agreement between primary calculations and verification, and to provide guidance in addressing discrepancies outside the action levels. These recommendations are to be used as guidelines only and shall not be interpreted as requirements.
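    The action-level logic the two task group records above discuss can be sketched in a few lines: compute the percent discrepancy between the primary and verification MU values and flag it against a clinic-chosen tolerance. The 5% default below is an illustrative placeholder, not a TG-114 recommendation for any particular treatment technique.

```python
# Sketch of an "independent second check" comparison: a verification
# MU value is compared with the primary treatment-planning-system value
# and flagged when the disagreement exceeds a clinic-set action level.

def mu_discrepancy(primary_mu, verification_mu):
    """Percent difference, referenced to the primary calculation."""
    return 100.0 * (verification_mu - primary_mu) / primary_mu


def within_action_level(primary_mu, verification_mu, action_level_pct=5.0):
    """True if the independent check agrees within the action level."""
    return abs(mu_discrepancy(primary_mu, verification_mu)) <= action_level_pct


# Example: a field planned at 200 MU and verified at 206 MU is a 3%
# discrepancy, inside a 5% action level but outside a 2% one.
ok_at_5 = within_action_level(200.0, 206.0)
ok_at_2 = within_action_level(200.0, 206.0, action_level_pct=2.0)
```

    In practice the action level is chosen per technique and geometry, since a simpler verification algorithm is expected to disagree more in heterogeneous or off-axis conditions.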

  3. History of Nuclear India

    NASA Astrophysics Data System (ADS)

    Chaturvedi, Ram

    2000-04-01

India emerged as a free and democratic country in 1947, and entered the nuclear age in 1948 by establishing the Atomic Energy Commission (AEC), with Homi Bhabha as the chairman. Later, the Department of Atomic Energy (DAE) was created under the Office of Prime Minister Jawahar Lal Nehru. Initially the AEC and DAE received international cooperation, and by 1963 India had two research reactors and four nuclear power reactors. In spite of its humiliating defeat in the 1962 border war with China and China's nuclear testing in 1964, India continued to adhere to the peaceful uses of nuclear energy. On May 18, 1974, India performed a 15 kt Peaceful Nuclear Explosion (PNE). The western powers considered it nuclear weapons proliferation and cut off all financial and technical help, even for the production of nuclear power. However, India used its existing infrastructure to build nuclear power reactors and exploded both fission and fusion devices on May 11 and 13, 1998. The international community viewed the latter tests as a serious roadblock to the Non-Proliferation Treaty and the Comprehensive Test Ban Treaty, both deemed essential to stopping the spread of nuclear weapons. India considers these treaties to favor the nuclear-weapon states and is prepared to sign if genuine nuclear disarmament is included as an integral part of them.

  4. Crew Exploration Vehicle (CEV) Potable Water System Verification Description

    NASA Technical Reports Server (NTRS)

    Peterson, Laurie; DeVera, Jean; Vega, Leticia; Adam, Nik; Steele, John; Gazda, Daniel; Roberts, Michael

    2009-01-01

The Crew Exploration Vehicle (CEV), also known as Orion, will ferry a crew of up to six astronauts to the International Space Station (ISS), or a crew of up to four astronauts to the moon. The first launch of CEV is scheduled for approximately 2014. A stored water system on the CEV will supply the crew with potable water for various purposes: drinking and food rehydration, hygiene, medical needs, sublimation, and various contingency situations. The current baseline biocide for the stored water system is ionic silver, similar in composition to the biocide used to maintain quality of the water transferred from the Orbiter to the ISS and stored in Contingency Water Containers (CWCs). In the CEV water system, the ionic silver biocide is expected to be depleted from solution due to ionic silver plating onto the surfaces of the materials within the CEV water system, thus negating its effectiveness as a biocide. Since the biocide depletion is expected to occur within a short amount of time after loading the water into the CEV water tanks at the Kennedy Space Center (KSC), an additional microbial control is a 0.1 micron point-of-use filter that will be used at the outlet of the Potable Water Dispenser (PWD). Because this may be the first time NASA is considering a stored water system for long-term missions that does not maintain a residual biocide, a team of experts in materials compatibility, biofilms and point-of-use filters, surface treatment and coatings, and biocides has been created to pinpoint concerns and perform testing to help alleviate those concerns related to the CEV water system. Results from the test plans laid out in the paper presented to SAE last year (Crew Exploration Vehicle (CEV) Potable Water System Verification Coordination, 2008-01-2083) will be detailed in this paper. Additionally, recommendations for the CEV verification will be described for risk mitigation in meeting the physicochemical and microbiological requirements on the CEV PWS.

  5. Advanced Nuclear Measurements - Sensitivity Analysis Emerging Safeguards, Problems and Proliferation Risk

    SciTech Connect

    Dreicer, J.S.

    1999-07-15

    During the past year this component of the Advanced Nuclear Measurements LDRD-DR has focused on emerging safeguards problems and proliferation risk by investigating problems in two domains. The first is related to the analysis, quantification, and characterization of existing inventories of fissile materials, in particular, the minor actinides (MA) formed in the commercial fuel cycle. Understanding material forms and quantities helps identify and define future measurement problems, instrument requirements, and assists in prioritizing safeguards technology development. The second problem (dissertation research) has focused on the development of a theoretical foundation for sensor array anomaly detection. Remote and unattended monitoring or verification of safeguards activities is becoming a necessity due to domestic and international budgetary constraints. However, the ability to assess the trustworthiness of a sensor array has not been investigated. This research is developing an anomaly detection methodology to assess the sensor array.

  6. Motivational Maturity and Helping Behavior

    ERIC Educational Resources Information Center

    Haymes, Michael; Green, Logan

    1977-01-01

    Maturity in conative development (type of motivation included in Maslow's needs hierarchy) was found to be predictive of helping behavior in middle class white male college students. The effects of safety and esteem needs were compared, and the acceptance of responsibility was also investigated. (GDC)

  7. Help for Finding Missing Children.

    ERIC Educational Resources Information Center

    McCormick, Kathleen

    1984-01-01

    Efforts to locate missing children have expanded from a federal law allowing for entry of information into an F.B.I. computer system to companion bills before Congress for establishing a national missing child clearinghouse and a Justice Department center to help in conducting searches. Private organizations are also involved. (KS)

  8. Competitive helping in online giving.

    PubMed

    Raihani, Nichola J; Smith, Sarah

    2015-05-01

    Unconditional generosity in humans is a puzzle. One possibility is that individuals benefit from being seen as generous if there is competition for access to partners and if generosity is a costly-and therefore reliable-signal of partner quality [1-3]. The "competitive helping" hypothesis predicts that people will compete to be the most generous, particularly in the presence of attractive potential partners [1]. However, this key prediction has not been directly tested. Using data from online fundraising pages, we demonstrate competitive helping in the real world. Donations to fundraising pages are public and made sequentially. Donors can therefore respond to the behavior of previous donors, creating a potential generosity tournament. Our test of the competitive helping hypothesis focuses on the response to large, visible donations. We show that male donors show significantly stronger responses (by donating more) when they are donating to an attractive female fundraiser and responding to a large donation made by another male donor. The responses for this condition are around four times greater than when males give to less-attractive female (or male) fundraisers or when they respond to a large donation made by a female donor. Unlike males, females do not compete in donations when giving to attractive male fundraisers. These data suggest that males use competitive helping displays in the presence of attractive females and suggest a role for sexual selection in explaining unconditional generosity.

  9. Helping Schools Be More Sustainable

    ERIC Educational Resources Information Center

    Peters, Henricus

    2009-01-01

    How do children relate to and interact with their natural and built environments? What opportunities in science are schools and other groups offering children to help their environmental encounters to be a positive and healthy--even enjoyable--experience? These are questions raised by the concept of "environmental education"--the approach using…

  10. Helping Children with Reading Disability.

    ERIC Educational Resources Information Center

    Edgington, Ruth; And Others

    Intended for parents helping their children with reading disabilities, the book describes specific activities in eight areas. The eight areas include general suggestions for the study period, hand and eye coordination activities, phonics training, ear training, reading, relaxation activities, muscle memory, writing, and spelling. Thirteen…

  11. Grief: Helping Young Children Cope

    ERIC Educational Resources Information Center

    Wood, Frances B.

    2008-01-01

    In their role as caregivers supporting the children they teach, it is important for teachers to understand the grieving process and recognize symptoms of grief. The author explains Elisabeth Kubler-Ross's five stages of grief and offers 10 classroom strategies to help young children cope with their feelings.

  12. Enlisting Parents' Help with Mathematics.

    ERIC Educational Resources Information Center

    Kahn, Ann P.

    1989-01-01

    A new national PTA kit, "Math Matters; Kids Are Counting on You," can help all parents make a difference in their children's education. Suggested home activities include doubling cookie recipes, surveying and graphing family ice cream flavor preferences, filling in football "stat" charts, and other tasks easily performed on a calculator. (MLH)

  13. Helping Your Child to Read.

    ERIC Educational Resources Information Center

    Foust, Betty Jean

    This booklet provides suggestions for parents in helping their children to learn how to read. The first section provides 34 suggestions and activities for parents to use with preschool children, such as reciting nursery rhymes, reading aloud, respecting the child's mood, and playing listening games. The second section offers 25 suggestions and…

  14. Helping Young Children Overcome Shyness.

    ERIC Educational Resources Information Center

    Malouff, John

    This paper examines shyness--its causes and its impact on children--and presents several strategies based on social learning theory for parents and teachers to help young children overcome shyness. The paper also describes a personal application of these strategies on a young girl. The strategies presented for parents and teachers are: (1) tell…

  15. Getting the Help We Need

    ERIC Educational Resources Information Center

    Haertel, Edward

    2013-01-01

    In validating uses of testing, it is helpful to distinguish those that rely directly on the information provided by scores or score distributions ("direct" uses and consequences) versus those that instead capitalize on the motivational effects of testing, or use testing and test reporting to shape public opinion ("indirect" uses and consequences).…

  16. Helping Students Cope with Death.

    ERIC Educational Resources Information Center

    Rodabough, Tillman

    1980-01-01

    Classroom teachers need to understand the broad differences that exist between a child's perception of death and that of an adult and should be prepared to confront and cope with the effects of death and grief upon students. Children's perceptions of death and ways in which the teacher can help the child with his grief are described. (JN)

  17. Helping Students Design an Education

    ERIC Educational Resources Information Center

    Guertin, Elizabeth

    2015-01-01

    Most students embrace the incorporation of diverse disciplines into their educational plan when advisers and faculty members help them understand the added value of these courses in achieving their unique goals and how adding breadth can enhance their education and preparation. Individual advising is key to students gaining this perspective on…

  18. Will You Help Me Lead?

    ERIC Educational Resources Information Center

    Wade, Carolann; Ferriter, Bill

    2007-01-01

    Board certified teachers Wade and Ferriter each describe how established teacher leaders helped them break through hesitancy about accepting leadership roles early in their careers. Ferriter was interested in working on education advocacy and school reform, but found only traditional leadership roles like mentoring and running school committees in…

  19. Scholarship can help ideas flourish.

    PubMed

    Pearce, Lynne

    2016-03-01

    Scholarships from the Florence Nightingale Foundation are providing nurses with the financial means to put innovative ideas into practice. Nurses from all four countries of the UK can apply for leadership, travel and research scholarships to support their career development and help improve patient care. PMID:26959448

  20. Here's How You Can Help

    ERIC Educational Resources Information Center

    Afterschool Alliance, 2007

    2007-01-01

    This "Afterschool Action Kit" contains tips on what the community can do to support afterschool programs. The kit is a useful tool for parents, community members or practitioners, and gives advice on finding or starting a quality program, identifying program needs and what resources to tap for help. The kit notes that Americans agree that…

  1. Helping Children Cope with Disaster.

    ERIC Educational Resources Information Center

    Federal Emergency Management Agency, Washington, DC.

    Noting that the most assistance adults can provide to a child during a disaster is to be calm, honest, and caring, this brochure provides suggestions for helping children cope with natural and other disasters. The brochure details how children's typical reactions vary with their age, describes how families can prepare for disasters, and suggests…

  2. Technology to Help Struggling Students

    ERIC Educational Resources Information Center

    Silver-Pacuilla, Heidi; Fleischman, Steve

    2006-01-01

    Many technology features that were originally developed to help people with specific sensory impairments are now widely in use. Research is beginning to show the benefits of giving all students access to these capabilities. As such, educators should not hesitate to integrate technology features into instruction for students who struggle with…

  3. Some Ways of Helping Underachievers.

    ERIC Educational Resources Information Center

    Willings, David; Greenwood, Bill

    1990-01-01

    A program of intervention called therapeutic tutoring to help underachievers is described. Intervention centers around students' loci of control, through a process of identifying areas in which students feel empowered and relating academic experiences to these areas. Academic exercises based on Monopoly, cricket, rugby, soap operas, field hockey,…

  4. HELP: Healthy Early Literacy Program

    ERIC Educational Resources Information Center

    Rader, Laura A.

    2008-01-01

    A daily intensive supplemental reading and writing program was developed to assist students who were: 1. identified with a language disability and 2. identified as at-risk for reading failure in an urban elementary school. The purpose of the program was to help these students understand and develop the connection between oral and written language…

  5. Helping SBH Pupils with Handwriting.

    ERIC Educational Resources Information Center

    Anderson, Elizabeth

    1979-01-01

    The article describes two cases of children (7- and 8 1/2-years-old) with spina bifida and hydrocephalus who participated in a research project to discover whether such children could make significant improvements in writing given appropriate help, and to produce an advisory booklet for teachers. (SBH)

  6. Ah!Help: A generalized on-line help facility

    NASA Technical Reports Server (NTRS)

    Yu, Wong Nai; Mantooth, Charmiane; Soulahakil, Alex

    1986-01-01

The idea behind the help facility discussed is relatively simple. It is made unique by the fact that it is written in Ada and uses aspects of the language which make information retrieval rapid and simple. Specifically, the DIRECT_IO facility allows for random access into the help files. The advantages of random access over sequential access are therefore discussed. The mere fact that the program is written in Ada implies a saving in terms of lines of code. This introduces the possibility of eventually adapting the program to run at the microcomputer level, a major consideration. Additionally, since the program uses only standard Ada generics, it is portable to other systems. This is another aspect which must always be taken into consideration in writing any software package in the modern-day world of computer programming.
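The contrast between random and sequential access can be illustrated outside Ada. The Python sketch below mimics DIRECT_IO-style access: with fixed-length records, any help topic is reached by a single seek rather than a sequential scan. The record size and topic strings are invented for illustration.

```python
import io

RECORD_SIZE = 64  # fixed-length records, as DIRECT_IO-style random access assumes

def read_help_record(f, index: int) -> str:
    """Seek directly to record `index` instead of scanning the file
    sequentially, then strip the padding used to fix the record length."""
    f.seek(index * RECORD_SIZE)
    return f.read(RECORD_SIZE).rstrip()

# Build an in-memory "help file" of fixed-length records (hypothetical topics).
topics = ["COPY: duplicate a file", "MOVE: relocate a file", "HELP: this text"]
buf = io.StringIO("".join(t.ljust(RECORD_SIZE) for t in topics))

print(read_help_record(buf, 1))  # jumps straight to the second record
```

With sequential access, retrieving topic *n* would require reading and discarding the *n* records before it; the seek makes lookup cost independent of file size.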

  7. FPIA helps expand contraceptive services.

    PubMed

    Groot, H

    1984-01-01

Since the beginning in 1971 of the Planned Parenthood Federation of America's international program, Family Planning International Assistance (FPIA), US$54 million has been contributed in direct financial support for the operation of over 300 family planning programs in 51 countries; over 3000 institutions in 115 countries have been supplied with family planning commodities, including over 600 million condoms, 120 cycles of oral contraceptives, and 4 million IUDs; and about 1 million contraceptive clients were served by FPIA-funded projects in 1982 alone. Since 1971, however, the world's population has increased from 3.7 billion to around 4.7 billion people. About 85 million people are added to the world each year. There is consensus that without organized family planning programs, today's world population would be even higher. FPIA measures its progress in terms of expanding the availability of contraceptive services in developing countries. FPIA-supported projects have helped make services available in areas previously lacking them, and have helped involve a wide variety of organizations, such as women's groups, youth organizations, and Red Cross Societies, in family planning services. A prime concern of FPIA, which has limited resources, is what happens to projects once FPIA support is terminated. FPIA has been paying attention to local income generation to help projects become more self-supporting and to increase staff members' management skills. The more successful income-generating schemes appear to be directly related to family planning: selling contraceptives and locally produced educational materials, and charging fees for family planning and related medical services and tuition for training courses. FPIA-funded projects use management by objectives (MBO) to help improve management skills. MBO helps grantees improve their ability to set objectives, plan, monitor, report, and do day-to-day project management.

  8. Formal verification of human-automation interaction

    NASA Technical Reports Server (NTRS)

    Degani, Asaf; Heymann, Michael

    2002-01-01

    This paper discusses a formal and rigorous approach to the analysis of operator interaction with machines. It addresses the acute problem of detecting design errors in human-machine interaction and focuses on verifying the correctness of the interaction in complex and automated control systems. The paper describes a systematic methodology for evaluating whether the interface provides the necessary information about the machine to enable the operator to perform a specified task successfully and unambiguously. It also addresses the adequacy of information provided to the user via training material (e.g., user manual) about the machine's behavior. The essentials of the methodology, which can be automated and applied to the verification of large systems, are illustrated by several examples and through a case study of pilot interaction with an autopilot aboard a modern commercial aircraft. The expected application of this methodology is an augmentation and enhancement, by formal verification, of human-automation interfaces.
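The adequacy check the authors describe can be approximated as a determinism test on the abstracted machine: if two internal states that the interface displays identically react differently, as seen by the user, to the same action, the interface is ambiguous. The Python sketch below is a simplified illustration of that idea, not the authors' formal method; the autopilot modes and display labels are hypothetical.

```python
def interface_adequate(machine_trans, abstraction):
    """Check that the user's abstracted model is deterministic: two machine
    states the interface displays identically must not move to differently
    displayed states under the same user action."""
    seen = {}
    for (state, action), nxt in machine_trans.items():
        key = (abstraction[state], action)      # what the user sees + does
        target = abstraction[nxt]               # what the user sees next
        if seen.setdefault(key, target) != target:
            return False  # same display, same action -> different outcomes
    return True

# Hypothetical autopilot: two internal arm sub-modes shown as one "ARMED".
trans = {("armed_low", "engage"): "hold",
         ("armed_high", "engage"): "climb"}
display = {"armed_low": "ARMED", "armed_high": "ARMED",
           "hold": "HOLD", "climb": "CLIMB"}
print(interface_adequate(trans, display))  # ambiguous interface
```

In this toy case the check fails: the pilot cannot predict from the "ARMED" display whether "engage" will produce HOLD or CLIMB, which is exactly the class of error the methodology is meant to detect.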

  9. Sensor-fusion-based biometric identity verification

    SciTech Connect

    Carlson, J.J.; Bouchard, A.M.; Osbourn, G.C.; Martinez, R.F.; Bartholomew, J.W.; Jordan, J.B.; Flachs, G.M.; Bao, Z.; Zhu, L.

    1998-02-01

Future generation automated human biometric identification and verification will require multiple features/sensors together with internal and external information sources to achieve high performance, accuracy, and reliability in uncontrolled environments. The primary objective of the proposed research is to develop a theoretical and practical basis for identifying and verifying people using standoff biometric features that can be obtained with minimal inconvenience during the verification process. The basic problem involves selecting sensors and discovering features that provide sufficient information to reliably verify a person's identity under the uncertainties caused by measurement errors and tactics of uncooperative subjects. A system was developed for discovering hand, face, ear, and voice features and fusing them to verify the identity of people. The system obtains its robustness and reliability by fusing many coarse and easily measured features into a near minimal probability of error decision algorithm.
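A common way to fuse many coarse scores into a low-error decision is a naive-Bayes sum of per-feature log-likelihood ratios. The sketch below assumes Gaussian score models per feature; the hand/face/voice scores and distribution parameters are invented for illustration and are not from the cited system.

```python
import math

def gaussian_loglik(x, mean, std):
    """Log-density of a Gaussian at x."""
    return -0.5 * ((x - mean) / std) ** 2 - math.log(std * math.sqrt(2 * math.pi))

def fused_decision(scores, genuine_params, imposter_params, threshold=0.0):
    """Sum per-feature log-likelihood ratios (naive-Bayes fusion) and
    accept if the total favors the genuine hypothesis."""
    llr = sum(gaussian_loglik(s, *g) - gaussian_loglik(s, *i)
              for s, g, i in zip(scores, genuine_params, imposter_params))
    return llr > threshold

# Hypothetical (mean, std) score models for three features.
genuine = [(0.9, 0.05), (0.85, 0.1), (0.8, 0.1)]
imposter = [(0.4, 0.15), (0.35, 0.15), (0.3, 0.2)]
print(fused_decision([0.88, 0.8, 0.75], genuine, imposter))
```

Each individual feature may be coarse, but summing their log-likelihood ratios drives the combined error probability down, which is the effect the abstract describes.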

  10. Collaborative Localization and Location Verification in WSNs

    PubMed Central

    Miao, Chunyu; Dai, Guoyong; Ying, Kezhen; Chen, Qingzhang

    2015-01-01

    Localization is one of the most important technologies in wireless sensor networks. A lightweight distributed node localization scheme is proposed by considering the limited computational capacity of WSNs. The proposed scheme introduces the virtual force model to determine the location by incremental refinement. Aiming at solving the drifting problem and malicious anchor problem, a location verification algorithm based on the virtual force mode is presented. In addition, an anchor promotion algorithm using the localization reliability model is proposed to re-locate the drifted nodes. Extended simulation experiments indicate that the localization algorithm has relatively high precision and the location verification algorithm has relatively high accuracy. The communication overhead of these algorithms is relative low, and the whole set of reliable localization methods is practical as well as comprehensive. PMID:25954948
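The virtual force model can be sketched as incremental refinement: each anchor exerts a "force" proportional to the mismatch between measured and estimated distance, and the node estimate moves along the net force. The step size, iteration count, and anchor layout below are illustrative assumptions, not the paper's parameters.

```python
def refine_position(pos, anchors, measured_dists, step=0.1, iters=200):
    """Move the node estimate along the sum of 'virtual forces': each anchor
    pulls or pushes the estimate so its distance matches the measurement."""
    x, y = pos
    for _ in range(iters):
        fx = fy = 0.0
        for (ax, ay), d in zip(anchors, measured_dists):
            dx, dy = x - ax, y - ay
            dist = (dx * dx + dy * dy) ** 0.5 or 1e-9
            err = dist - d                 # positive: too far -> pull inward
            fx -= step * err * dx / dist
            fy -= step * err * dy / dist
        x, y = x + fx, y + fy
    return x, y

anchors = [(0, 0), (10, 0), (0, 10)]
true_pos = (4.0, 3.0)
dists = [((true_pos[0] - ax) ** 2 + (true_pos[1] - ay) ** 2) ** 0.5
         for ax, ay in anchors]
est = refine_position((1.0, 1.0), anchors, dists)
print(est)  # should approach (4, 3)
```

The paper's location-verification step could then reuse the same force model in reverse: a node whose reported position produces large residual forces against the anchor measurements is flagged as drifted or malicious.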

  11. Test load verification through strain data analysis

    NASA Technical Reports Server (NTRS)

    Verderaime, V.; Harrington, F.

    1995-01-01

A traditional binding acceptance criterion for polycrystalline structures is experimental verification of the ultimate factor of safety. At fracture, the induced strain is inelastic and about an order of magnitude greater than the design maximum expected operational limit. In this extreme strained condition, the structure may rotate and displace under the applied verification load, unknowingly distorting the load transfer into the static test article. Testing may then erroneously accept a submarginal design or reject a reliable one. A technique was developed to identify, monitor, and assess the load-transmission error from two back-to-back surface strain measurements. The technique is programmed for expediency and convenience. Though the method was developed to support affordable aerostructures, it is also applicable to most high-performance air and surface transportation structural systems.
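The key observation behind such back-to-back strain monitoring is that two opposing surface strains decompose into a membrane component (their average) and a bending component (their half-difference); unexpected bending growth in a nominally membrane-loaded member flags distorted load transfer. A minimal sketch, with an assumed, purely illustrative bending-ratio threshold:

```python
def membrane_bending(strain_front: float, strain_back: float):
    """Decompose two back-to-back surface strains into membrane (average)
    and bending (half-difference) components."""
    membrane = 0.5 * (strain_front + strain_back)
    bending = 0.5 * (strain_front - strain_back)
    return membrane, bending

def load_transfer_ok(strain_front, strain_back, max_bending_ratio=0.1):
    # The 10% ratio is an illustrative assumption: a large bending-to-
    # membrane ratio in a nominally membrane-loaded member suggests the
    # applied test load is being transmitted with distortion.
    m, b = membrane_bending(strain_front, strain_back)
    return abs(b) <= max_bending_ratio * abs(m)

print(load_transfer_ok(1050e-6, 950e-6))  # 5% bending content: acceptable
```

Monitoring this decomposition as the verification load ramps up lets the test engineer catch load-path distortion before it invalidates the acceptance result.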

  12. Verification tests for contaminant transport codes

    SciTech Connect

    Rowe, R.K.; Nadarajah, P.

    1996-12-31

The importance of verifying contaminant transport codes and the techniques that may be used in this verification process are discussed. Commonly used contaminant transport codes are characterized as belonging to one of several types or classes of solution, such as analytic, finite layer, boundary element, finite difference and finite element. Both the level of approximation and the solution methodology should be verified for each contaminant transport code. One powerful method that may be used in contaminant transport code verification is cross-checking (benchmarking) with other codes. This technique is used to check the results of codes from one solution class with the results of codes from another solution class. In this paper cross-checking is performed for three classes of solution: analytic, finite layer, and finite element.
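Cross-checking reduces to comparing solutions from two codes point by point against an agreement tolerance. A minimal sketch; the concentration profiles and 5% tolerance below are invented for illustration:

```python
def cross_check(profile_a, profile_b, rel_tol=0.05):
    """Benchmark two transport-code solutions point by point: report the
    worst relative discrepancy and whether it is within tolerance."""
    worst = max(abs(a - b) / max(abs(a), abs(b), 1e-12)
                for a, b in zip(profile_a, profile_b))
    return worst, worst <= rel_tol

# Hypothetical concentration profiles from an analytic and a
# finite-element code at matching depths.
analytic = [1.00, 0.82, 0.61, 0.40, 0.22]
fem      = [1.00, 0.81, 0.62, 0.41, 0.21]
worst, ok = cross_check(analytic, fem)
print(round(worst, 3), ok)
```

Because the two codes come from different solution classes, agreement within tolerance gives evidence that neither implementation error nor the level of approximation dominates the result.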

  13. Packaged low-level waste verification system

    SciTech Connect

    Tuite, K.; Winberg, M.R.; McIsaac, C.V.

    1995-12-31

The Department of Energy through the National Low-Level Waste Management Program and WMG Inc. have entered into a joint development effort to design, build, and demonstrate the Packaged Low-Level Waste Verification System. Currently, states and low-level radioactive waste disposal site operators have no method to independently verify the radionuclide content of packaged low-level waste that arrives at disposal sites for disposition. At this time, the disposal site relies on the low-level waste generator shipping manifests and accompanying records to ensure that low-level waste received meets the site's waste acceptance criteria. The subject invention provides the equipment, software, and methods to enable the independent verification of low-level waste shipping records to ensure that the site's waste acceptance criteria are being met. The objective of the prototype system is to demonstrate a mobile system capable of independently verifying the content of packaged low-level waste.

  14. Thermal hydraulic feasibility assessment for the Spent Nuclear Fuel Project

    SciTech Connect

    Heard, F.J.; Cramer, E.R.; Beaver, T.R.; Thurgood, M.J.

    1996-01-01

A series of scoping analyses have been completed investigating the thermal-hydraulic performance and feasibility of the Spent Nuclear Fuel Project (SNFP) Integrated Process Strategy (IPS). The SNFP was established to develop engineered solutions for the expedited removal, stabilization, and storage of spent nuclear fuel from the K Basins at the U.S. Department of Energy's Hanford Site in Richland, Washington. The subject efforts focused on independently investigating, quantifying, and establishing the governing heat production and removal mechanisms for each of the IPS operations and configurations, obtaining preliminary results for comparison with and verification of other analyses, and providing technology-based recommendations for consideration and incorporation into the design bases for the SNFP. The goal was to develop a series of thermal-hydraulic models that could respond to all process and safety-related issues that may arise pertaining to the SNFP. A series of sensitivity analyses were also performed to help identify those parameters that have the greatest impact on energy transfer and hence, temperature control. It is anticipated that the subject thermal-hydraulic models will form the basis for a series of advanced and more detailed models that will more accurately reflect the thermal performance of the IPS and alleviate the necessity for some of the more conservative assumptions and oversimplifications, as well as form the basis for the final process and safety analyses.

  15. Nuclear Fabrication Consortium

    SciTech Connect

    Levesque, Stephen

    2013-04-05

This report summarizes the activities undertaken by EWI while under contract from the Department of Energy (DOE) Office of Nuclear Energy (NE) for the management and operation of the Nuclear Fabrication Consortium (NFC). The NFC was established by EWI to independently develop, evaluate, and deploy fabrication approaches and data that support the re-establishment of the U.S. nuclear industry: ensuring that the supply chain will be competitive on a global stage, enabling more cost-effective and reliable nuclear power in a carbon constrained environment. The NFC provided a forum for member original equipment manufacturers (OEMs), fabricators, manufacturers, and materials suppliers to effectively engage with each other and rebuild the capacity of this supply chain by: identifying and removing impediments to the implementation of new construction and fabrication techniques and approaches for nuclear equipment, including system components and nuclear plants; providing and facilitating detailed science-based studies on new approaches and technologies that will have positive impacts on the cost of building nuclear plants; analyzing and disseminating information about future nuclear fabrication technologies and how they could impact the North American and international nuclear marketplace; facilitating dialog and initiating alignment among fabricators, owners, trade associations, and government agencies; supporting industry in creating a larger qualified nuclear supplier network; acting as an unbiased technology resource to evaluate, develop, and demonstrate new manufacturing technologies; creating welder and inspector training programs to help provide the necessary workforce for the upcoming construction work; and serving as a focal point for technology, policy, and politically interested parties to share ideas and concepts associated with fabrication across the nuclear industry. The report describes the objectives and summarizes the activities of the Nuclear Fabrication Consortium.

  16. Survey of Existing Tools for Formal Verification.

    SciTech Connect

Punnoose, Ratish J.; Armstrong, Robert C.; Wong, Matthew H.; Mayo, Jackson

    2014-12-01

    Formal methods have come into wide use because of their effectiveness in verifying "safety and security" requirements of digital systems; a set of requirements for which testing is mostly ineffective. Formal methods are routinely used in the design and verification of high-consequence digital systems in industry. This report outlines our work in assessing the capabilities of commercial and open source formal tools and the ways in which they can be leveraged in digital design workflows.

  17. Critical Surface Cleaning and Verification Alternatives

    NASA Technical Reports Server (NTRS)

    Melton, Donald M.; McCool, A. (Technical Monitor)

    2000-01-01

    A viewgraph presentation outlines the goal, scope, and background of the cleaning solvent HCFC 225G. HCFC 225G is compared to other cleaning solvents such as Freon 113 and HFE 7100. Test results of hardware submersion in HCFC 225G and Freon 113 are shown. Project accomplishments, average cleaning efficiency, and hardware qualification are discussed. Results show HCFC 225G is an excellent cleaning and verification solvent for industrial contaminants.

  18. Analyzing personalized policies for online biometric verification.

    PubMed

    Sadhwani, Apaar; Yang, Yan; Wein, Lawrence M

    2014-01-01

    Motivated by India's nationwide biometric program for social inclusion, we analyze verification (i.e., one-to-one matching) in the case where we possess similarity scores for 10 fingerprints and two irises between a resident's biometric images at enrollment and his biometric images during his first verification. At subsequent verifications, we allow individualized strategies based on these 12 scores: we acquire a subset of the 12 images, get new scores for this subset that quantify the similarity to the corresponding enrollment images, and use the likelihood ratio (i.e., the likelihood of observing these scores if the resident is genuine divided by the corresponding likelihood if the resident is an imposter) to decide whether a resident is genuine or an imposter. We also consider two-stage policies, where additional images are acquired in a second stage if the first-stage results are inconclusive. Using performance data from India's program, we develop a new probabilistic model for the joint distribution of the 12 similarity scores and find near-optimal individualized strategies that minimize the false reject rate (FRR) subject to constraints on the false accept rate (FAR) and mean verification delay for each resident. Our individualized policies achieve the same FRR as a policy that acquires (and optimally fuses) 12 biometrics for each resident, which represents a five (four, respectively) log reduction in FRR relative to fingerprint (iris, respectively) policies previously proposed for India's biometric program. The mean delay is [Formula: see text] sec for our proposed policy, compared to 30 sec for a policy that acquires one fingerprint and 107 sec for a policy that acquires all 12 biometrics. This policy acquires iris scans from 32-41% of residents (depending on the FAR) and acquires an average of 1.3 fingerprints per resident.
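The decision rule described, a likelihood-ratio threshold with a second acquisition stage when the first result is inconclusive, can be sketched as below. The thresholds and accept/reject bands are illustrative assumptions, not values fitted to India's program data.

```python
def verify_two_stage(stage1_llr, acquire_more, accept_t=2.0, reject_t=-2.0):
    """Two-stage policy: decide from the first-stage log-likelihood ratio
    if it is conclusive; otherwise acquire additional biometrics and
    re-decide on the fused total."""
    if stage1_llr >= accept_t:
        return "genuine"
    if stage1_llr <= reject_t:
        return "imposter"
    # Inconclusive band: fuse in additional scores (e.g., an iris scan)
    # and apply a single final threshold. Thresholds here are illustrative.
    total = stage1_llr + acquire_more()
    return "genuine" if total >= 0.0 else "imposter"

print(verify_two_stage(3.1, lambda: 0.0))   # conclusive first stage
print(verify_two_stage(0.5, lambda: 1.8))   # second stage resolves it
```

The paper's delay savings come from exactly this structure: most residents are resolved by the cheap first stage, and the costlier acquisitions (iris scans for 32-41% of residents) are spent only on the inconclusive band.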

  19. Dose Verification in IMRT and VMAT

    SciTech Connect

    Feygelman, Vladimir; Nelms, Benjamin E.

    2011-05-05

    This is a review of current IMRT dosimetric verification methods, with emphasis on solid-state dosimeters. Different types of IMRT treatments and the associated quality assurance challenges are described. The prevailing techniques for quantifying dosimetric agreement, and their weaknesses in terms of clinical relevance, are discussed. A variety of empirical, semi-empirical, and purely calculational methods are summarized from a clinical practice point of view. A number of available commercial devices and emerging technologies are described.

  20. Verification of the SLC wake potentials

    SciTech Connect

    Bane, K.; Weiland, T.

    1983-01-01

    The accurate knowledge of the monopole, dipole, and quadrupole wake potentials is essential for SLC. These wake potentials were previously computed by the modal method. The time domain code TBCI allows independent verification of these results. This comparison shows that the two methods agree to within 10% for bunch lengths down to 1 mm. TBCI results also indicate that rounding the irises gives at least a 10% reduction in the wake potentials.

  1. Component Verification and Certification in NASA Missions

    NASA Technical Reports Server (NTRS)

    Giannakopoulou, Dimitra; Penix, John; Norvig, Peter (Technical Monitor)

    2001-01-01

    Software development for NASA missions is a particularly challenging task. Missions are extremely ambitious scientifically, have very strict time frames, and must be accomplished with a maximum degree of reliability. Verification technologies must therefore be pushed far beyond their current capabilities. Moreover, reuse and adaptation of software architectures and components must be incorporated in software development within and across missions. This paper discusses NASA applications that we are currently investigating from these perspectives.

  2. Cleaning verification by air/water impingement

    NASA Technical Reports Server (NTRS)

    Jones, Lisa L.; Littlefield, Maria D.; Melton, Gregory S.; Caimi, Raoul E. B.; Thaxton, Eric A.

    1995-01-01

    This paper will discuss how the Kennedy Space Center intends to perform precision cleaning verification by Air/Water Impingement in lieu of chlorofluorocarbon-113 gravimetric nonvolatile residue analysis (NVR). Test results will be given that demonstrate the effectiveness of the Air/Water system. A brief discussion of the Total Carbon method via the use of a high temperature combustion analyzer will also be given. The necessary equipment for impingement will be shown along with other possible applications of this technology.

  3. Systematic physical verification with topological patterns

    NASA Astrophysics Data System (ADS)

    Dai, Vito; Lai, Ya-Chieh; Gennari, Frank; Teoh, Edward; Capodieci, Luigi

    2014-03-01

    Design rule checks (DRC) are the industry workhorse for constraining design to ensure both physical and electrical manufacturability. Where DRCs fail to fully capture the concept of manufacturability, pattern-based approaches, such as DRC Plus, fill the gap using a library of patterns to capture and identify problematic 2D configurations. Today, both a DRC deck and a pattern matching deck may be found in advanced node process development kits. Major electronic design automation (EDA) vendors offer both DRC and pattern matching solutions for physical verification; in fact, both are frequently integrated into the same physical verification tool. In physical verification, DRCs represent dimensional constraints relating directly to process limitations. On the other hand, patterns represent the 2D placement of surrounding geometries that can introduce systematic process effects. It is possible to combine both DRCs and patterns in a single topological pattern representation. A topological pattern has two separate components: a bitmap representing the placement and alignment of polygon edges, and a vector of dimensional constraints. The topological pattern is unique and unambiguous; there is no code to write, and no two different ways to represent the same physical structure. Furthermore, markers aligned to the pattern can be generated to designate specific layout optimizations for improving manufacturability. In this paper, we describe how to do systematic physical verification with just topological patterns. Common mappings between traditional design rules and topological pattern rules are presented. We describe techniques that can be used during the development of a topological rule deck such as: taking constraints defined on one rule, and systematically projecting it onto other related rules; systematically separating a single rule into two or more rules, when the single rule is not sufficient to capture manufacturability constraints; creating test layout which
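    The two-part representation described above — an edge-placement bitmap plus a vector of dimensional constraints — can be sketched as a small data structure. This is a hypothetical illustration, not any vendor's API; the field names and nanometer units are assumptions.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class TopologicalPattern:
    """Toy topological pattern:
    - topology: tuple of bit rows encoding polygon-edge placement/alignment
    - constraints: tuple of (name, min_nm, max_nm) dimensional bounds
    Frozen, so instances are hashable and the "unique and unambiguous"
    representation can serve directly as a lookup key.
    """
    topology: tuple
    constraints: tuple

    def matches(self, clip_topology, measured):
        """A layout clip matches when its topology bitmap is identical
        and every measured dimension falls within the pattern's bounds."""
        if clip_topology != self.topology:
            return False
        return all(lo <= measured[name] <= hi for name, lo, hi in self.constraints)
```

    Under this representation, a traditional DRC corresponds to the constraint vector alone, while the bitmap captures the 2D context that DRC Plus-style pattern matching targets.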

  4. Verification tests of durable TPS concepts

    NASA Technical Reports Server (NTRS)

    Shideler, J. L.; Webb, G. L.; Pittman, C. M.

    1984-01-01

    Titanium multiwall, superalloy honeycomb, and Advanced Carbon-Carbon (ACC) multipost Thermal Protection System (TPS) concepts are being developed to provide durable protection for surfaces of future space transportation systems. Verification tests, including thermal, vibration, acoustic, water absorption, lightning strike, and aerothermal tests, are described. Preliminary results indicate that the three TPS concepts are viable up to surface temperatures in excess of 2300°F.

  5. Dynamic analysis for shuttle design verification

    NASA Technical Reports Server (NTRS)

    Fralich, R. W.; Green, C. E.; Rheinfurth, M. H.

    1972-01-01

    Two approaches used for determining the modes and frequencies of space shuttle structures are discussed. The first method, direct numerical analysis, involves finite-element mathematical modeling of the space shuttle structure in order to use computer programs for dynamic structural analysis. The second method utilizes modal-coupling techniques: experimental verification is performed by vibrating only spacecraft components, and the modes and frequencies of the complete vehicle are deduced from the results obtained in the component tests.

  6. Verification of a simplified method to evaluate the capacities of template-type platforms

    SciTech Connect

    Bea, R.G.; Mortazavi, M.M.; Loch, K.J.; Young, P.L.

    1995-12-01

    This paper summarizes development of simplified procedures to evaluate storm loadings imposed on template-type platforms and to evaluate the ultimate limit state lateral loading capacities of such platforms. Verification of these procedures has been accomplished by comparing results from the simplified analyses with results from three-dimensional, linear and nonlinear analyses of a variety of template-type platforms. Good agreement between results from the two types of analyses has been developed for the evaluations of both loadings and capacities. The verification platforms have included four-leg well protector and quarters structures and eight-leg drilling and production Gulf of Mexico structures that employed a variety of types of bracing patterns and joints. Several of these structures were subjected to intense storm loadings during hurricanes Andrew, Carmen, and Frederic. Within the population of verification platforms are several that failed or were very near failure. The simplified loading and capacity analyses are able to replicate the observed performance of these platforms. Realistic simulation of the brace joints and foundation capacity characteristics is a critical aspect of these analyses. There is a reasonable degree of verification of the simplified methods with the observed performance of platforms in the field during intense hurricane storm loadings. These methods can be used to help screen platforms that are being evaluated for extended service. In addition, the results from these analyses can be used to help verify results from complex analytical models that are intended to determine the ultimate limit state loading capacities of platforms. Lastly, and perhaps most importantly, this approach can be used in the preliminary design of new platforms.

  7. Signal verification can promote reliable signalling

    PubMed Central

    Broom, Mark; Ruxton, Graeme D.; Schaefer, H. Martin

    2013-01-01

    The central question in communication theory is whether communication is reliable, and if so, which mechanisms select for reliability. The primary approach in the past has been to attribute reliability to strategic costs associated with signalling as predicted by the handicap principle. Yet, reliability can arise through other mechanisms, such as signal verification; but the theoretical understanding of such mechanisms has received relatively little attention. Here, we model whether verification can lead to reliability in repeated interactions that typically characterize mutualisms. Specifically, we model whether fruit consumers that discriminate among poor- and good-quality fruits within a population can select for reliable fruit signals. In our model, plants either signal or they do not; costs associated with signalling are fixed and independent of plant quality. We find parameter combinations where discriminating fruit consumers can select for signal reliability by abandoning unprofitable plants more quickly. This self-serving behaviour imposes costs upon plants as a by-product, rendering it unprofitable for unrewarding plants to signal. Thus, strategic costs to signalling are not a prerequisite for reliable communication. We expect verification to more generally explain signal reliability in repeated consumer–resource interactions that typify mutualisms but also in antagonistic interactions such as mimicry and aposematism. PMID:24068354

  8. Tags and seals for arms control verification

    SciTech Connect

    DeVolpi, A.

    1990-09-18

    Tags and seals have long been recognized as important tools in arms control. The trend in control of armaments is to limit militarily significant equipment that is capable of being verified through direct and cooperative means, chiefly on-site inspection or monitoring. Although this paper will focus on the CFE treaty, the role of tags and seals for other treaties will also be addressed. Published technology and concepts will be reviewed, based on open sources. Arms control verification tags are defined as unique identifiers designed to be tamper-revealing; in that respect, seals are similar, being used as indicators of unauthorized access. Tamper-revealing tags might be considered as single-point markers, seals as two-point couplings, and nets as volume containment. The functions of an arms control tag can be considered to be two-fold: to provide field verification of the identity of a treaty-limited item (TLI), and to have a means of authentication of the tag and its tamper-revealing features. Authentication could take place in the field or be completed elsewhere. For CFE, the goal of tags and seals can be to reduce the overall cost of the entire verification system.

  9. Verification in referral-based crowdsourcing.

    PubMed

    Naroditskiy, Victor; Rahwan, Iyad; Cebrian, Manuel; Jennings, Nicholas R

    2012-01-01

    Online social networks offer unprecedented potential for rallying a large number of people to accomplish a given task. Here we focus on information gathering tasks where rare information is sought through "referral-based crowdsourcing": the information request is propagated recursively through invitations among members of a social network. Whereas previous work analyzed incentives for the referral process in a setting with only correct reports, misreporting is known to be both pervasive in crowdsourcing applications, and difficult/costly to filter out. A motivating example for our work is the DARPA Red Balloon Challenge where the level of misreporting was very high. In order to undertake a formal study of verification, we introduce a model where agents can exert costly effort to perform verification and false reports can be penalized. This is the first model of verification and it provides many directions for future research, which we point out. Our main theoretical result is the compensation scheme that minimizes the cost of retrieving the correct answer. Notably, this optimal compensation scheme coincides with the winning strategy of the Red Balloon Challenge.

  10. Formal verification of an avionics microprocessor

    NASA Technical Reports Server (NTRS)

    Srivas, Mandayam K.; Miller, Steven P.

    1995-01-01

    Formal specification combined with mechanical verification is a promising approach for achieving the extremely high levels of assurance required of safety-critical digital systems. However, many questions remain regarding their use in practice: Can these techniques scale up to industrial systems, where are they likely to be useful, and how should industry go about incorporating them into practice? This report discusses a project undertaken to answer some of these questions, the formal verification of the AAMP5 microprocessor. This project consisted of formally specifying in the PVS language a Rockwell proprietary microprocessor at both the instruction-set and register-transfer levels and using the PVS theorem prover to show that the microcode correctly implemented the instruction-level specification for a representative subset of instructions. Notable aspects of this project include the use of a formal specification language by practicing hardware and software engineers, the integration of traditional inspections with formal specifications, and the use of a mechanical theorem prover to verify a portion of a commercial, pipelined microprocessor that was not explicitly designed for formal verification.

  11. Trajectory Based Behavior Analysis for User Verification

    NASA Astrophysics Data System (ADS)

    Pao, Hsing-Kuo; Lin, Hong-Yi; Chen, Kuan-Ta; Fadlil, Junaidillah

    Many of our activities on computers require a verification step for authorized access. The goal of verification is to tell apart the true account owner from intruders. We propose a general approach to user verification based on user trajectory inputs. The approach is labor-free for users and is likely to resist copying or simulation by non-authorized users or even automatic programs like bots. Our study focuses on finding the hidden patterns embedded in the trajectories produced by account users. We employ a Markov chain model with Gaussian distributions in its transitions to describe the behavior in the trajectory. To distinguish between two trajectories, we propose a novel dissimilarity measure combined with a manifold-learnt tuning for capturing the pairwise relationship. Based on the pairwise relationship, we plug in any effective classification or clustering method for the detection of unauthorized access. The method can also be applied to the task of recognition: predicting the trajectory type without a pre-defined identity. Given a trajectory input, the results show that the proposed method can accurately verify the user identity, or suggest who owns the trajectory if the input identity is not provided.
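    The scoring idea can be sketched with a drastically simplified model: consecutive displacements treated as i.i.d. Gaussian steps with a per-user mean and spread, rather than the full Markov chain with Gaussian transitions used in the paper. All names, the model, and the threshold below are hypothetical.

```python
import math

def trajectory_log_likelihood(traj, mu, sigma):
    """Log-likelihood of a 2-D trajectory (list of (x, y) points) under
    a toy step model: displacements are i.i.d. Gaussian with per-user
    mean step mu = (mx, my) and common spread sigma."""
    ll = 0.0
    for (x0, y0), (x1, y1) in zip(traj, traj[1:]):
        for d, m in ((x1 - x0, mu[0]), (y1 - y0, mu[1])):
            ll += -0.5 * math.log(2 * math.pi * sigma ** 2) \
                  - (d - m) ** 2 / (2 * sigma ** 2)
    return ll

def verify_user(traj, owner_model, threshold):
    """Accept when the trajectory is sufficiently likely under the
    claimed owner's model (averaged per step, so longer trajectories
    are not penalized)."""
    mu, sigma = owner_model
    steps = max(len(traj) - 1, 1)
    return trajectory_log_likelihood(traj, mu, sigma) / steps >= threshold
```

    The paper instead compares trajectories pairwise via a dissimilarity measure and feeds that into a classifier; the sketch only shows how a generative step model assigns a verification score to a trajectory.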

  12. On code verification of RANS solvers

    NASA Astrophysics Data System (ADS)

    Eça, L.; Klaij, C. M.; Vaz, G.; Hoekstra, M.; Pereira, F. S.

    2016-04-01

    This article discusses Code Verification of Reynolds-Averaged Navier-Stokes (RANS) solvers that rely on face-based finite volume discretizations for volumes of arbitrary shape. The study includes test cases with known analytical solutions (generated with the method of manufactured solutions) corresponding to laminar and turbulent flow, with the latter using eddy-viscosity turbulence models. The procedure to perform Code Verification based on grid refinement studies is discussed and the requirements for its correct application are illustrated in a simple one-dimensional problem. It is shown that geometrically similar grids are recommended for proper Code Verification: with such grids the data exhibit no scatter, making least-squares fits unnecessary. Results show that it may be advantageous to determine the extrapolated error at cell size/time step zero instead of assuming that it is zero, especially when it is hard to determine the asymptotic order of grid convergence. In the RANS examples, several features of the ReFRESCO solver are checked, including the effects of the available turbulence models on the convergence properties of the code. It is shown that it is necessary to account for non-orthogonality effects in the discretization of the diffusion terms, and that the turbulence quantities transport equations can deteriorate the order of grid convergence of mean flow quantities.
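    The grid-refinement machinery referred to above rests on the assumption that the discretization error behaves as e ≈ C·h^p in the asymptotic range. A minimal sketch of the two standard calculations, with illustrative names (this is not the ReFRESCO code):

```python
import math

def observed_order(e_fine, e_coarse, r):
    """Observed order of accuracy from discretization errors on two
    geometrically similar grids with spacings h and r*h (r > 1), when
    the exact solution is known, e.g. via manufactured solutions:
    e ~ C*h^p  =>  p = log(e_coarse / e_fine) / log(r)."""
    return math.log(e_coarse / e_fine) / math.log(r)

def richardson(f_fine, f_coarse, p, r):
    """Richardson extrapolation to zero cell size from two solution
    values, assuming asymptotic convergence at order p; this is how
    the extrapolated error can be determined rather than assumed zero."""
    return f_fine + (f_fine - f_coarse) / (r ** p - 1)
```

    For example, errors of 0.01 and 0.04 on grids refined by a factor of 2 indicate second-order convergence.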

  13. Verification of micro-beam irradiation

    NASA Astrophysics Data System (ADS)

    Li, Qiongge; Juang, Titania; Beth, Rachel; Chang, Sha; Oldham, Mark

    2015-01-01

    Micro-beam Radiation Therapy (MRT) is an experimental radiation therapy with provocative experimental data indicating potential for improved efficacy in some diseases. Here we demonstrated a comprehensive micro-beam verification method utilizing high-resolution (50 μm) PRESAGE/Micro-Optical-CT 3D dosimetry. A small PRESAGE cylindrical dosimeter was irradiated by a novel compact Carbon-Nano-Tube (CNT) field-emission-based MRT system. Percentage Depth Dose (PDD), Peak-to-Valley Dose Ratio (PVDR), and beam width (FWHM) data were obtained and analyzed from a three-strip irradiation experiment. A fast dose drop-off with depth, a beam width preserved with depth (the average FWHM across the three beams remains constant at 405.3 μm, σ = 13.2 μm, between depths of 3.0 and 14.0 mm), and a high PVDR (increasing with depth from 6.3 at 3.0 mm to 8.6 at 14.0 mm) were found during this verification process. Operating procedures such as precise dosimeter mounting, robust mechanical motion (especially rotation), and stray-light artifact management were optimized and developed to achieve a more accurate dosimetric verification method.
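    The two profile metrics reported above have simple operational definitions, sketched below for a sampled 1-D lateral dose profile. This is a toy version (a single-peak FWHM by linear interpolation, and PVDR from explicit peak and valley doses), not the dosimetry software actually used in the study.

```python
def pvdr(peak_dose, valley_dose):
    """Peak-to-valley dose ratio: dose at a micro-beam peak divided by
    the dose in the valley between adjacent beams."""
    return peak_dose / valley_dose

def fwhm(xs, ys):
    """Full width at half maximum of a single-peak profile, using
    linear interpolation at the two half-maximum crossings."""
    half = max(ys) / 2.0
    i_peak = ys.index(max(ys))

    def crossing(indices):
        # Walk away from the peak until the profile drops below
        # half-maximum, then interpolate the crossing position.
        prev = indices[0]
        for i in indices[1:]:
            if ys[i] < half:
                x0, y0, x1, y1 = xs[i], ys[i], xs[prev], ys[prev]
                return x0 + (half - y0) * (x1 - x0) / (y1 - y0)
            prev = i
        return xs[indices[-1]]  # never crossed: edge of the profile

    left = crossing(list(range(i_peak, -1, -1)))
    right = crossing(list(range(i_peak, len(xs))))
    return right - left
```

    In practice each metric would be computed per beam and per depth slice of the reconstructed 3D dose volume, and the peak/valley doses averaged over regions rather than taken at single samples.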

  14. Automating engineering verification in ALMA subsystems

    NASA Astrophysics Data System (ADS)

    Ortiz, José; Castillo, Jorge

    2014-08-01

    The Atacama Large Millimeter/submillimeter Array is an interferometer comprising 66 individual high-precision antennas located at over 5000 meters altitude in northern Chile. Several complex electronic subsystems need to be meticulously tested at different stages of an antenna's commissioning, both independently and when integrated together. The first subsystem integration takes place at the Operations Support Facilities (OSF), at an altitude of 3000 meters. The second occurs at the high-altitude Array Operations Site (AOS), where combined performance with the Central Local Oscillator (CLO) and Correlator is also assessed. In addition, several other events require complete or partial verification of compliance with instrument specifications, such as parts replacement, calibration, relocation within the AOS, preventive maintenance, and troubleshooting due to poor performance in scientific observations. Restricted engineering time allocation and the constant pressure to minimize downtime in a 24/7 astronomical observatory impose the need to complete (and report) the aforementioned verifications in the least possible time. Array-wide disturbances, such as global power interruptions and the subsequent recovery, add the challenge of executing this checkout on multiple antenna elements at once. This paper presents the outcome of automating engineering verification setup, execution, notification, and reporting in ALMA, and how these efforts have resulted in a dramatic reduction of both the time and the operator training required. Signal Path Connectivity (SPC) checkout is introduced as a notable case of such automation.

  15. SMAP Verification and Validation Project - Final Report

    NASA Technical Reports Server (NTRS)

    Murry, Michael

    2012-01-01

    In 2007, the National Research Council (NRC) released the Decadal Survey of Earth science. For the coming decade, the survey identified 15 new space missions of significant scientific and application value for the National Aeronautics and Space Administration (NASA) to undertake. One of these missions was the Soil Moisture Active Passive (SMAP) mission, which NASA assigned to the Jet Propulsion Laboratory (JPL) in 2008. The goal of SMAP is to provide global, high-resolution mapping of soil moisture and its freeze/thaw states. The SMAP project recently passed its Critical Design Review and is proceeding with its fabrication and testing phase. Verification and Validation (V&V) is widely recognized as a critical component of systems engineering and is vital to the success of any space mission. V&V is a process used to check that a system meets its design requirements and specifications in order to fulfill its intended purpose. Verification often refers to the question "Have we built the system right?" whereas Validation asks "Have we built the right system?" Currently the SMAP V&V team is verifying design requirements through inspection, demonstration, analysis, or testing. An example of the SMAP V&V process is the verification of the antenna pointing accuracy with mathematical models, since it is not possible to provide the appropriate micro-gravity environment for testing the antenna on Earth before launch.

  16. Verification in Referral-Based Crowdsourcing

    PubMed Central

    Naroditskiy, Victor; Rahwan, Iyad; Cebrian, Manuel; Jennings, Nicholas R.

    2012-01-01

    Online social networks offer unprecedented potential for rallying a large number of people to accomplish a given task. Here we focus on information gathering tasks where rare information is sought through “referral-based crowdsourcing”: the information request is propagated recursively through invitations among members of a social network. Whereas previous work analyzed incentives for the referral process in a setting with only correct reports, misreporting is known to be both pervasive in crowdsourcing applications, and difficult/costly to filter out. A motivating example for our work is the DARPA Red Balloon Challenge where the level of misreporting was very high. In order to undertake a formal study of verification, we introduce a model where agents can exert costly effort to perform verification and false reports can be penalized. This is the first model of verification and it provides many directions for future research, which we point out. Our main theoretical result is the compensation scheme that minimizes the cost of retrieving the correct answer. Notably, this optimal compensation scheme coincides with the winning strategy of the Red Balloon Challenge. PMID:23071530

  17. Signal verification can promote reliable signalling.

    PubMed

    Broom, Mark; Ruxton, Graeme D; Schaefer, H Martin

    2013-11-22

    The central question in communication theory is whether communication is reliable, and if so, which mechanisms select for reliability. The primary approach in the past has been to attribute reliability to strategic costs associated with signalling as predicted by the handicap principle. Yet, reliability can arise through other mechanisms, such as signal verification; but the theoretical understanding of such mechanisms has received relatively little attention. Here, we model whether verification can lead to reliability in repeated interactions that typically characterize mutualisms. Specifically, we model whether fruit consumers that discriminate among poor- and good-quality fruits within a population can select for reliable fruit signals. In our model, plants either signal or they do not; costs associated with signalling are fixed and independent of plant quality. We find parameter combinations where discriminating fruit consumers can select for signal reliability by abandoning unprofitable plants more quickly. This self-serving behaviour imposes costs upon plants as a by-product, rendering it unprofitable for unrewarding plants to signal. Thus, strategic costs to signalling are not a prerequisite for reliable communication. We expect verification to more generally explain signal reliability in repeated consumer-resource interactions that typify mutualisms but also in antagonistic interactions such as mimicry and aposematism.

  18. Scenarios for exercising technical approaches to verified nuclear reductions

    SciTech Connect

    Doyle, James

    2010-01-01

    Presidents Obama and Medvedev in April 2009 committed to a continuing process of step-by-step nuclear arms reductions beyond the new START treaty, which was signed April 8, 2010, and to the eventual goal of a world free of nuclear weapons. In addition, the US Nuclear Posture Review released April 6, 2010 commits the US to initiate a comprehensive national research and development program to support continued progress toward a world free of nuclear weapons, including expanded work on verification technologies and the development of transparency measures. It is impossible to predict the specific directions that US-RU nuclear arms reductions will take over the next 5-10 years. Additional bilateral treaties could be reached requiring effective verification, as indicated by statements made by the Obama administration. There could also be transparency agreements or other initiatives (unilateral, bilateral, or multilateral) that require monitoring to a standard of verification lower than formal arms control, but that still need to establish confidence among domestic, bilateral, and multilateral audiences that declared actions are implemented. The US Nuclear Posture Review and other statements give some indication of the kinds of actions and declarations that may need to be confirmed in a bilateral or multilateral setting. Several new elements of the nuclear arsenals could be directly limited. For example, it is likely that both strategic and nonstrategic nuclear warheads (deployed and in storage), warhead components, and aggregate stocks of such items could be accountable under a future treaty or transparency agreement. In addition, new initiatives or agreements may require the verified dismantlement of a certain number of nuclear warheads over a specified time period. Eventually, procedures for confirming the elimination of nuclear warheads, components, and fissile materials from military stocks will need to be established. This paper is intended to provide useful background information

  19. Help for health decision challenges.

    PubMed

    Doty, L

    1997-01-01

    Medical care is becoming more technically challenging and community-based. The majority of patients and family health gatekeepers (the family member who regulates health care services for the family unit) are female, while the majority of physicians are male. Therefore, differences in female versus male methods of decision making add to the difficulty in making health choices. The female patient and family health gatekeeper may need new knowledge, skills and time to help them deal with difficult medical choices. They may benefit from a multidisciplinary, unbiased group of experts in the form of a Community Healthcare Committee. Trained to be responsible for the general health of the community, the primary care practitioner is ideal to take a leadership role in developing such a committee. A Community Healthcare Committee that understands different methods of health decision making could serve as a resource by providing community health education and private case reviews intended to help individuals with health care decisions. PMID:9379165

  20. New Help for the Handicapped

    NASA Technical Reports Server (NTRS)

    1984-01-01

    L & M Electronics, Inc.'s telemetry system is used to measure the degree and location of abnormal muscle activity. This telemetry was originally used to monitor astronauts' vital functions. Leg sensors send wireless signals to a computer, which develops pictures of gait patterns. The system records, measures, and analyzes muscle activity in the limbs and spine. The computer-developed pictures of gait patterns help physicians determine the potential of corrective surgery, evaluate various types of braces, or decide whether physical therapy may improve motor function.