Science.gov

Sample records for distributed location verification

  1. Collaborative Localization and Location Verification in WSNs

    PubMed Central

    Miao, Chunyu; Dai, Guoyong; Ying, Kezhen; Chen, Qingzhang

    2015-01-01

    Localization is one of the most important technologies in wireless sensor networks. A lightweight distributed node localization scheme is proposed by considering the limited computational capacity of WSNs. The proposed scheme introduces the virtual force model to determine the location by incremental refinement. Aiming at solving the drifting problem and the malicious anchor problem, a location verification algorithm based on the virtual force model is presented. In addition, an anchor promotion algorithm using the localization reliability model is proposed to re-locate the drifted nodes. Extended simulation experiments indicate that the localization algorithm has relatively high precision and the location verification algorithm has relatively high accuracy. The communication overhead of these algorithms is relatively low, and the whole set of reliable localization methods is practical as well as comprehensive. PMID:25954948
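
    To illustrate the virtual-force idea, the sketch below (not the authors' implementation; the force law, step size, and anchor data are assumed for illustration) nudges a node's position estimate toward consistency with ranged distances to anchors, in the incremental-refinement style the abstract describes.

```python
import numpy as np

def virtual_force_step(node_xy, anchors_xy, measured_d, step=0.1):
    """One incremental-refinement step: each anchor exerts a 'virtual force'
    proportional to the mismatch between measured and estimated distance."""
    force = np.zeros(2)
    for a_xy, d_meas in zip(anchors_xy, measured_d):
        diff = a_xy - node_xy
        d_est = np.linalg.norm(diff)
        if d_est > 1e-9:
            # Pull toward the anchor if too far away, push away if too close.
            force += (d_est - d_meas) * diff / d_est
    return node_xy + step * force

# Usage: refine an initial guess until the net force becomes negligible.
anchors = np.array([[0.0, 0.0], [10.0, 0.0], [0.0, 10.0]])
dists = np.array([7.07, 7.07, 7.07])  # (noisy) range measurements
xy = np.array([1.0, 1.0])
for _ in range(200):
    xy = virtual_force_step(xy, anchors, dists)
print(xy)  # converges near (5, 5)
```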

  2. Verification of LHS distributions.

    SciTech Connect

    Swiler, Laura Painton

    2006-04-01

    This document provides verification test results for normal, lognormal, and uniform distributions that are used in Sandia's Latin Hypercube Sampling (LHS) software. The purpose of this testing is to verify that the sample values being generated in LHS are distributed according to the desired distribution types. The testing of distribution correctness is done by examining summary statistics, graphical comparisons using quantile-quantile plots, and formal statistical tests such as the chi-square test, the Kolmogorov-Smirnov test, and the Anderson-Darling test. The overall results from the testing indicate that the generation of normal, lognormal, and uniform distributions in LHS is acceptable.
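
    The same style of check can be reproduced with SciPy (this uses SciPy's generic Latin hypercube sampler, not Sandia's LHS code; the target distribution and sample size are arbitrary): draw a stratified sample, transform it through the inverse CDF, and apply a Kolmogorov-Smirnov test against the intended distribution.

```python
from scipy import stats
from scipy.stats import qmc

# Draw a Latin hypercube sample and map it onto a normal distribution.
sampler = qmc.LatinHypercube(d=1, seed=42)
u = sampler.random(n=1000).ravel()          # stratified uniforms on (0, 1)
x = stats.norm.ppf(u, loc=10.0, scale=2.0)  # inverse-CDF transform

# Kolmogorov-Smirnov test against the target distribution.
ks_stat, p_value = stats.kstest(x, 'norm', args=(10.0, 2.0))
print(f"KS statistic = {ks_stat:.4f}, p-value = {p_value:.3f}")

# Summary statistics as a quick sanity check.
print(f"mean = {x.mean():.3f} (target 10), std = {x.std(ddof=1):.3f} (target 2)")
```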

  3. An Efficient Location Verification Scheme for Static Wireless Sensor Networks

    PubMed Central

    Kim, In-hwan; Kim, Bo-sung; Song, JooSeok

    2017-01-01

    In wireless sensor networks (WSNs), the accuracy of location information is vital to support many interesting applications. Unfortunately, sensors have difficulty in estimating their location when malicious sensors attack the location estimation process. Even though secure localization schemes have been proposed to protect the location estimation process from attacks, they cannot eliminate wrong location estimates in some situations. Location verification can resolve such situations or serve as a second line of defense. The problem with most location verification schemes is the explicit involvement of many sensors in the verification process and requirements such as special hardware, a dedicated verifier, or a trusted third party, which cause more communication and computation overhead. In this paper, we propose an efficient location verification scheme for static WSNs called mutually-shared region-based location verification (MSRLV), which reduces those overheads by utilizing the implicit involvement of sensors and eliminating several requirements. In order to achieve this, we use the mutually-shared region between location claimant and verifier for the location verification. The analysis shows that MSRLV reduces communication overhead by 77% and computation overhead by 92% on average, compared with other location verification schemes, for a single sensor verification. In addition, simulation results for the verification of the whole network show that MSRLV can detect over 90% of malicious sensors when sensors in the network have five or more neighbors. PMID:28125007
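
    As a geometric intuition for the mutually-shared region (a sketch only; MSRLV's actual message exchange, thresholds, and region construction are not reproduced here, and all coordinates are invented), the region is the lens where the claimant's claimed radio disk overlaps the verifier's disk, and a claim can be sanity-checked against neighbors known to lie in that lens:

```python
import math

def in_shared_region(point, claimed_xy, verifier_xy, radio_range):
    """True if `point` lies in the lens where the disk around the claimed
    position and the disk around the verifier overlap."""
    return (math.dist(point, claimed_xy) <= radio_range and
            math.dist(point, verifier_xy) <= radio_range)

def plausible_claim(claimed_xy, verifier_xy, shared_neighbors, radio_range):
    """A location claim is plausible only if every neighbor known to be
    shared actually falls inside the mutually-shared region of the claim."""
    return all(in_shared_region(n, claimed_xy, verifier_xy, radio_range)
               for n in shared_neighbors)

# Usage: a verifier at (0, 0) checks a claim of (5, 0) with radio range 10.
print(plausible_claim((5.0, 0.0), (0.0, 0.0),
                      shared_neighbors=[(3.0, 2.0), (7.0, -1.0)],
                      radio_range=10.0))  # True
```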

  4. An Efficient Location Verification Scheme for Static Wireless Sensor Networks.

    PubMed

    Kim, In-Hwan; Kim, Bo-Sung; Song, JooSeok

    2017-01-24

    In wireless sensor networks (WSNs), the accuracy of location information is vital to support many interesting applications. Unfortunately, sensors have difficulty in estimating their location when malicious sensors attack the location estimation process. Even though secure localization schemes have been proposed to protect the location estimation process from attacks, they cannot eliminate wrong location estimates in some situations. Location verification can resolve such situations or serve as a second line of defense. The problem with most location verification schemes is the explicit involvement of many sensors in the verification process and requirements such as special hardware, a dedicated verifier, or a trusted third party, which cause more communication and computation overhead. In this paper, we propose an efficient location verification scheme for static WSNs called mutually-shared region-based location verification (MSRLV), which reduces those overheads by utilizing the implicit involvement of sensors and eliminating several requirements. In order to achieve this, we use the mutually-shared region between location claimant and verifier for the location verification. The analysis shows that MSRLV reduces communication overhead by 77% and computation overhead by 92% on average, compared with other location verification schemes, for a single sensor verification. In addition, simulation results for the verification of the whole network show that MSRLV can detect over 90% of malicious sensors when sensors in the network have five or more neighbors.

  5. Protecting Privacy and Securing the Gathering of Location Proofs - The Secure Location Verification Proof Gathering Protocol

    NASA Astrophysics Data System (ADS)

    Graham, Michelle; Gray, David

    As wireless networks become increasingly ubiquitous, the demand for a method of locating a device has increased dramatically. Location Based Services are now commonplace but there are few methods of verifying or guaranteeing a location provided by a user without some specialised hardware, especially in larger scale networks. We propose a system for the verification of location claims, using proof gathered from neighbouring devices. In this paper we introduce a protocol to protect this proof gathering process, protecting the privacy of all involved parties and securing it from intruders and malicious claiming devices. We present the protocol in stages, extending the security of this protocol to allow for flexibility within its application. The Secure Location Verification Proof Gathering Protocol (SLVPGP) has been designed to function within the area of Vehicular Networks, although its application could be extended to any device with wireless & cryptographic capabilities.

  6. 37 CFR 384.7 - Verification of royalty distributions.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... distributions. 384.7 Section 384.7 Patents, Trademarks, and Copyrights COPYRIGHT ROYALTY BOARD, LIBRARY OF... BUSINESS ESTABLISHMENT SERVICES § 384.7 Verification of royalty distributions. (a) General. This section prescribes procedures by which any Copyright Owner may verify the royalty distributions made by...

  7. Field verification of a nondestructive damage location algorithm

    SciTech Connect

    Farrar, C.R.; Stubbs, N.

    1996-12-31

    Over the past 25 years, the use of modal parameters for detecting damage has received considerable attention from the civil engineering community. The basic idea is that changes in the structure's properties, primarily stiffness, will alter the dynamic properties of the structure such as frequencies and mode shapes, and properties derived from these quantities such as modal-based flexibility. In this paper, a method for nondestructive damage location in bridges, as determined by changes in the modal properties, is described. The damage detection algorithm is applied to pre- and post-damage modal properties measured on a bridge. Results of the analysis indicate that the method accurately locates the damage. Subjects relating to practical implementation of this damage identification algorithm that need further study are discussed.

  8. A Verification System for Distributed Objects with Asynchronous Method Calls

    NASA Astrophysics Data System (ADS)

    Ahrendt, Wolfgang; Dylla, Maximilian

    We present a verification system for Creol, an object-oriented modeling language for concurrent distributed applications. The system is an instance of KeY, a framework for object-oriented software verification, which has so far been applied foremost to sequential Java. Building on KeY's characteristic concepts, like dynamic logic, sequent calculus, explicit substitutions, and the taclet rule language, the system presented in this paper addresses functional correctness of Creol models featuring local cooperative thread parallelism and global communication via asynchronous method calls. The calculus heavily operates on communication histories which describe the interfaces of Creol units. Two example scenarios demonstrate the usage of the system.

  9. Modeling and Verification of Distributed Generation and Voltage Regulation Equipment for Unbalanced Distribution Power Systems; Annual Subcontract Report, June 2007

    SciTech Connect

    Davis, M. W.; Broadwater, R.; Hambrick, J.

    2007-07-01

    This report summarizes the development of models for distributed generation and distribution circuit voltage regulation equipment for unbalanced power systems and their verification through actual field measurements.

  10. GENERIC VERIFICATION PROTOCOL: DISTRIBUTED GENERATION AND COMBINED HEAT AND POWER FIELD TESTING PROTOCOL

    EPA Science Inventory

    This report is a generic verification protocol by which EPA’s Environmental Technology Verification program tests newly developed equipment for distributed generation of electric power, usually micro-turbine generators and internal combustion engine generators. The protocol will ...

  11. DOE-EPRI distributed wind Turbine Verification Program (TVP III)

    SciTech Connect

    McGowin, C.; DeMeo, E.; Calvert, S.

    1997-12-31

    In 1992, the Electric Power Research Institute (EPRI) and the U.S. Department of Energy (DOE) initiated the Utility Wind Turbine Verification Program (TVP). The goal of the program is to evaluate prototype advanced wind turbines at several sites developed by U.S. electric utility companies. Two 6-MW wind projects have been installed under the TVP program by Central and South West Services in Fort Davis, Texas and Green Mountain Power Corporation in Searsburg, Vermont. In early 1997, DOE and EPRI selected five more utility projects to evaluate distributed wind generation using smaller "clusters" of wind turbines connected directly to the electricity distribution system. This paper presents an overview of the objectives, scope, and status of the EPRI-DOE TVP program and the existing and planned TVP projects.

  12. Design and Verification of a Distributed Communication Protocol

    NASA Technical Reports Server (NTRS)

    Munoz, Cesar A.; Goodloe, Alwyn E.

    2009-01-01

    The safety of remotely operated vehicles depends on the correctness of the distributed protocol that facilitates the communication between the vehicle and the operator. A failure in this communication can result in catastrophic loss of the vehicle. To complicate matters, the communication system may be required to satisfy several, possibly conflicting, requirements. The design of protocols is typically an informal process based on successive iterations of a prototype implementation. Yet distributed protocols are notoriously difficult to get correct using such informal techniques. We present a formal specification of the design of a distributed protocol intended for use in a remotely operated vehicle, which is built from the composition of several simpler protocols. We demonstrate proof strategies that allow us to prove properties of each component protocol individually while ensuring that the property is preserved in the composition forming the entire system. Given that designs are likely to evolve as additional requirements emerge, we show how we have automated most of the repetitive proof steps to enable verification of rapidly changing designs.

  13. Location Management in Distributed Mobile Environments

    DTIC Science & Technology

    1994-09-01

    ... additional messages need to be sent for this purpose. The update in fp time is done to avoid purging of the forwarding-pointer data at the MSSs. ... the average search-update cost for LU-JU is less than or equal to 1. Thus, the aggregate cost of LU-JU is lower than LU-PC. LU-PC performs better ... computing. Location management consists of location updates, searches and search-updates. An update occurs when a mobile host changes location. A search ...

  14. Distributed Engine Control Empirical/Analytical Verification Tools

    NASA Technical Reports Server (NTRS)

    DeCastro, Jonathan; Hettler, Eric; Yedavalli, Rama; Mitra, Sayan

    2013-01-01

    NASA's vision for an intelligent engine will be realized with the development of a truly distributed control system featuring highly reliable, modular, and dependable components capable of both surviving the harsh engine operating environment and decentralized functionality. A set of control system verification tools was developed and applied to a C-MAPSS40K engine model, and metrics were established to assess the stability and performance of these control systems on the same platform. A software tool was developed that allows designers to easily assemble a distributed control system in software and immediately assess the overall impacts of the system on the target (simulated) platform, allowing control system designers to converge rapidly on acceptable architectures with consideration to all required hardware elements. The software developed in this program will be installed on a distributed hardware-in-the-loop (DHIL) simulation tool to assist NASA and the Distributed Engine Control Working Group (DECWG) in integrating DCS (distributed engine control systems) components onto existing and next-generation engines. The distributed engine control simulator blockset for MATLAB/Simulink and hardware simulator provides the capability to simulate virtual subcomponents, as well as swap actual subcomponents for hardware-in-the-loop (HIL) analysis. Subcomponents can be the communication network, smart sensor or actuator nodes, or a centralized control system. The distributed engine control blockset for MATLAB/Simulink is a software development tool. The software includes an engine simulation, a communication network simulation, control algorithms, and analysis algorithms set up in a modular environment for rapid simulation of different network architectures; the hardware consists of an embedded device running parts of the CMAPSS engine simulator and controlled through Simulink. The distributed engine control simulation, evaluation, and analysis technology provides unique

  15. Specification and Verification of Secure Concurrent and Distributed Software Systems

    DTIC Science & Technology

    1992-02-01

    Contents excerpts: theorem proving support systems; algebraic specification and verification of concurrency in OBJ (overview of the approach); final algebra specifications methodology; structure of the generic SRM specification; basic support for algebraic specification; EEDM, wide range of support for specification and verification including software engineering support.

  16. Reconstructing Spatial Distributions from Anonymized Locations

    SciTech Connect

    Horey, James L; Forrest, Stephanie; Groat, Michael

    2012-01-01

    Devices such as mobile phones, tablets, and sensors are often equipped with GPS that accurately report a person's location. Combined with wireless communication, these devices enable a wide range of new social tools and applications. These same qualities, however, leave location-aware applications vulnerable to privacy violations. This paper introduces the Negative Quad Tree, a privacy protection method for location aware applications. The method is broadly applicable to applications that use spatial density information, such as social applications that measure the popularity of social venues. The method employs a simple anonymization algorithm running on mobile devices, and a more complex reconstruction algorithm on a central server. This strategy is well suited to low-powered mobile devices. The paper analyzes the accuracy of the reconstruction method in a variety of simulated and real-world settings and demonstrates that the method is accurate enough to be used in many real-world scenarios.
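
    For context, the spatial-density quantity such methods work with can be illustrated by plain quadtree binning of location reports (a sketch only; the Negative Quad Tree's actual anonymization and server-side reconstruction algorithms are more involved, and the coordinates below are invented):

```python
from collections import Counter

def quad_key(x, y, depth, xmin=0.0, ymin=0.0, xmax=1.0, ymax=1.0):
    """Map a point in the unit square to its quadtree cell key at `depth`."""
    key = []
    for _ in range(depth):
        xmid, ymid = (xmin + xmax) / 2, (ymin + ymax) / 2
        qx, qy = int(x >= xmid), int(y >= ymid)
        key.append(2 * qy + qx)
        xmin, xmax = (xmid, xmax) if qx else (xmin, xmid)
        ymin, ymax = (ymid, ymax) if qy else (ymin, ymid)
    return tuple(key)

# Spatial density: count location reports per quadtree cell.
reports = [(0.12, 0.80), (0.15, 0.83), (0.70, 0.20), (0.72, 0.25)]
density = Counter(quad_key(x, y, depth=2) for x, y in reports)
print(density)
```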

  17. 37 CFR 380.7 - Verification of royalty distributions.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... TRANSMISSIONS, NEW SUBSCRIPTION SERVICES AND THE MAKING OF EPHEMERAL REPRODUCTIONS § 380.7 Verification of... records maintained by third parties for the purpose of the audit. The Copyright Owner or...

  18. Radionuclide Inventory Distribution Project Data Evaluation and Verification White Paper

    SciTech Connect

    NSTec Environmental Restoration

    2010-05-17

    Testing of nuclear explosives caused widespread contamination of surface soils on the Nevada Test Site (NTS). Atmospheric tests produced the majority of this contamination. The Radionuclide Inventory and Distribution Program (RIDP) was developed to determine distribution and total inventory of radionuclides in surface soils at the NTS to evaluate areas that may present long-term health hazards. The RIDP achieved this objective with aerial radiological surveys, soil sample results, and in situ gamma spectroscopy. This white paper presents the justification to support the use of RIDP data as a guide for future evaluation and to support closure of Soils Sub-Project sites under the purview of the Federal Facility Agreement and Consent Order. Use of the RIDP data as part of the Data Quality Objective process is expected to provide considerable cost savings and accelerate site closures. The following steps were completed: - Summarize the RIDP data set and evaluate the quality of the data. - Determine the current uses of the RIDP data and cautions associated with its use. - Provide recommendations for enhancing data use through field verification or other methods. The data quality is sufficient to utilize RIDP data during the planning process for site investigation and closure. Project planning activities may include estimating 25-millirem per industrial access year dose rate boundaries, optimizing characterization efforts, projecting final end states, and planning remedial actions. In addition, RIDP data may be used to identify specific radionuclide distributions, and augment other non-radionuclide dose rate data. Finally, the RIDP data can be used to estimate internal and external dose rates.

  19. The Role of Experience in Location Estimation: Target Distributions Shift Location Memory Biases

    ERIC Educational Resources Information Center

    Lipinski, John; Simmering, Vanessa R.; Johnson, Jeffrey S.; Spencer, John P.

    2010-01-01

    Research based on the Category Adjustment model concluded that the spatial distribution of target locations does not influence location estimation responses [Huttenlocher, J., Hedges, L., Corrigan, B., & Crawford, L. E. (2004). Spatial categories and the estimation of location. "Cognition, 93", 75-97]. This conflicts with earlier results showing…

  20. Protection of Location Privacy Based on Distributed Collaborative Recommendations

    PubMed Central

    Wang, Peng; Yang, Jing; Zhang, Jian-Pei

    2016-01-01

    In the existing centralized location services architecture, the server is easily attacked and becomes a communication bottleneck, which can lead to the disclosure of users' locations. To address this, we present a new distributed collaborative recommendation strategy based on a distributed system. In this strategy, each node establishes a profile of its own location information. When a request for location services appears, the user can obtain the corresponding location services according to the recommendation of the neighboring users' location information profiles. If no suitable recommended location service results are obtained, then the user can send a service request to the server using a k-anonymous data set constructed from the centroid position of the neighbors. In this strategy, we designed a new model of distributed collaborative recommendation location service based on the users' location information profiles and used generalization and encryption to ensure the safety of the users' location information privacy. Finally, we used a real location data set for theoretical and experimental analysis. The results show that the proposed strategy can reduce the frequency of access to the location server, provide better location services, and better protect the user's location privacy. PMID:27649308
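
    A minimal sketch of the k-anonymous fallback described above (illustrative only; the paper additionally applies generalization and encryption, and the neighbor selection here is just random sampling): the requester hides its true position by sending the centroid of a cohort of k positions.

```python
import random

def k_anonymous_request(user_xy, neighbor_xys, k):
    """Form a k-anonymous set from the user plus k-1 neighbors and
    report the cohort centroid instead of the true position."""
    cohort = [user_xy] + random.sample(neighbor_xys, k - 1)
    cx = sum(p[0] for p in cohort) / k
    cy = sum(p[1] for p in cohort) / k
    return (cx, cy)

# Usage: the true position (2.1, 3.1) is hidden among 4 neighbors (k = 5).
random.seed(0)
neighbors = [(2.0, 3.0), (2.5, 2.0), (1.0, 4.0), (3.0, 3.5), (2.2, 2.8)]
print(k_anonymous_request((2.1, 3.1), neighbors, k=5))
```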

  1. Protection of Location Privacy Based on Distributed Collaborative Recommendations.

    PubMed

    Wang, Peng; Yang, Jing; Zhang, Jian-Pei

    2016-01-01

    In the existing centralized location services architecture, the server is easily attacked and becomes a communication bottleneck, which can lead to the disclosure of users' locations. To address this, we present a new distributed collaborative recommendation strategy based on a distributed system. In this strategy, each node establishes a profile of its own location information. When a request for location services appears, the user can obtain the corresponding location services according to the recommendation of the neighboring users' location information profiles. If no suitable recommended location service results are obtained, then the user can send a service request to the server using a k-anonymous data set constructed from the centroid position of the neighbors. In this strategy, we designed a new model of distributed collaborative recommendation location service based on the users' location information profiles and used generalization and encryption to ensure the safety of the users' location information privacy. Finally, we used a real location data set for theoretical and experimental analysis. The results show that the proposed strategy can reduce the frequency of access to the location server, provide better location services, and better protect the user's location privacy.

  2. Modeling and verification of distributed systems with labeled predicate transition nets

    NASA Astrophysics Data System (ADS)

    Lloret, Jean-Christophe

    Two main steps in the design of distributed systems are modeling and verification. Petri nets and CCS are two basic formal models. CCS is a modular language supporting compositional verification. Conversely, Petri net theory requires an accurate description of parallelism and focuses on global verification of properties. A structuring technique based on CCS concepts is introduced for predicate/transition nets. It consists of a high-level Petri net that permits the expression of communication with value passing. In particular, a Petri net composition operator, which can be interpreted as a multi-rendezvous between communicating systems, is defined. The multi-rendezvous allows abstract modeling with small state graphs. The developed formalism is highly convenient for refining abstract models toward less abstract levels. Based on this work, a software tool supporting distributed system design and verification was developed. The advantage of this approach is shown in many research and industrial applications.

  3. Automated fault location and diagnosis on electric power distribution feeders

    SciTech Connect

    Zhu, J.; Lubkeman, D.L.; Girgis, A.A.

    1997-04-01

    This paper presents new techniques for locating and diagnosing faults on electric power distribution feeders. The proposed fault location and diagnosis scheme is capable of accurately identifying the location of a fault upon its occurrence, based on the integration of information available from disturbance recording devices with knowledge contained in a distribution feeder database. The developed fault location and diagnosis system can also be applied to the investigation of temporary faults that may not result in a blown fuse. The proposed fault location algorithm is based on the steady-state analysis of the faulted distribution network. To deal with the uncertainties inherent in the system modeling and the phasor estimation, the fault location algorithm has been adapted to estimate fault regions based on probabilistic modeling and analysis. Since the distribution feeder is a radial network, multiple possibilities of fault locations could be computed with measurements available only at the substation. To identify the actual fault location, a fault diagnosis algorithm has been developed to prune down and rank the possible fault locations by integrating the available pieces of evidence. Testing of the developed fault location and diagnosis system using field data has demonstrated its potential for practical use.
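
    The single-ended, steady-state idea behind such schemes can be shown with the classic reactance method (a simplified sketch; the paper's algorithm adds probabilistic modeling, a feeder database, and evidence-based ranking of multiple candidate locations, and the phasor values below are invented):

```python
def reactance_to_fault(v_phasor, i_phasor, x_per_km):
    """Classic single-ended reactance method: the imaginary part of the
    apparent impedance seen at the substation scales with fault distance."""
    z_apparent = v_phasor / i_phasor
    return z_apparent.imag / x_per_km  # estimated distance in km

# Usage: assumed voltage/current phasors measured at the substation during
# the fault, and an assumed per-km line reactance.
v = 4200 * complex(0.94, 0.34)   # volts (magnitude and angle)
i = 850 * complex(0.55, -0.83)   # amps
print(f"fault at ~{reactance_to_fault(v, i, x_per_km=0.35):.1f} km")
```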

  4. Privacy-Preserving Location-Based Query Using Location Indexes and Parallel Searching in Distributed Networks

    PubMed Central

    Liu, Lei; Zhao, Jing

    2014-01-01

    An efficient location-based query algorithm that protects user privacy in distributed networks is presented. The algorithm uses the users' location indexes and multiple parallel threads to quickly search for and select candidate anonymous sets with more users and more uniformly distributed location information, accelerating the temporal-spatial anonymization operations, and it allows users to configure custom privacy-preserving location query requests. Simulation results show that the proposed algorithm can offer location query services to more users simultaneously, improve the performance of the anonymity server, and satisfy users' anonymous location requests. PMID:24790579
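
    A rough sketch of the parallel candidate-set search (assumptions: a toy grid index mapping cells to user IDs, threads via Python's ThreadPoolExecutor, and "largest set" standing in for the paper's uniformity criterion):

```python
from concurrent.futures import ThreadPoolExecutor

def candidate_sets(location_index, cell, k):
    """Return candidate anonymous sets of at least k users for one index cell."""
    users = location_index.get(cell, [])
    return [users] if len(users) >= k else []

def parallel_search(location_index, cells, k, workers=4):
    """Scan index cells in parallel threads and keep the best candidate set
    (here simply the largest; a uniformity metric could replace len)."""
    with ThreadPoolExecutor(max_workers=workers) as pool:
        results = pool.map(lambda c: candidate_sets(location_index, c, k), cells)
    candidates = [s for r in results for s in r]
    return max(candidates, key=len) if candidates else None

# Usage: a toy grid index mapping cells to user IDs.
index = {(0, 0): ["u1", "u2"], (0, 1): ["u3", "u4", "u5", "u6"], (1, 0): ["u7"]}
print(parallel_search(index, list(index), k=3))  # ['u3', 'u4', 'u5', 'u6']
```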

  5. Privacy-preserving location-based query using location indexes and parallel searching in distributed networks.

    PubMed

    Zhong, Cheng; Liu, Lei; Zhao, Jing

    2014-01-01

    An efficient location-based query algorithm that protects user privacy in distributed networks is presented. The algorithm uses the users' location indexes and multiple parallel threads to quickly search for and select candidate anonymous sets with more users and more uniformly distributed location information, accelerating the temporal-spatial anonymization operations, and it allows users to configure custom privacy-preserving location query requests. Simulation results show that the proposed algorithm can offer location query services to more users simultaneously, improve the performance of the anonymity server, and satisfy users' anonymous location requests.

  6. The application of fuzzy neural network in distribution center location

    NASA Astrophysics Data System (ADS)

    Li, Yongpan; Liu, Yong

    2013-03-01

    In this paper, a fuzzy neural network model for logistics distribution center location is established: the fuzzy method is applied to the input values of a BP (backpropagation) algorithm, and the experts' evaluation values are taken as the expected output. Network learning is then used to obtain an optimized selection and, furthermore, a more accurate evaluation of the candidate location programs.

  7. Ground vibration tests of a high fidelity truss for verification of on orbit damage location techniques

    NASA Technical Reports Server (NTRS)

    Kashangaki, Thomas A. L.

    1992-01-01

    This paper describes a series of modal tests that were performed on a cantilevered truss structure. The goal of the tests was to assemble a large database of high quality modal test data for use in verification of proposed methods for on orbit model verification and damage detection in flexible truss structures. A description of the hardware is provided along with details of the experimental setup and procedures for 16 damage cases. Results from selected cases are presented and discussed. Differences between ground vibration testing and on orbit modal testing are also described.

  8. Event-Based Specification and Verification of Distributed Systems.

    DTIC Science & Technology

    1982-01-01

    ... RTI5(A, B); RT21(A, B); End behavior End system. A distributed design [HOA78] to generate prime numbers using the "sieve of Eratosthenes" ... Theorem 3.7. The distributed "sieve of Eratosthenes" is a correct prime number generator. Proof: by the sequence of theorems 3.7.1 to 3.7.6 ...

  9. A distributed approach to verification and validation of electronic structure simulation data using ESTEST

    NASA Astrophysics Data System (ADS)

    Yuan, Gary; Gygi, François

    2012-08-01

    We present a Verification and Validation (V&V) approach for electronic structure computations based on a network of distributed servers running the ESTEST (Electronic Structure TEST) software. This network-based infrastructure enables remote verification, validation, comparison and sharing of electronic structure data obtained with different simulation codes. The implementation and configuration of the distributed framework is described. ESTEST features are enhanced by server communication and data sharing, minimizing the duplication of effort by separate research groups. We discuss challenges that arise from the use of a distributed network of ESTEST servers and outline possible solutions. A community web portal called ESTEST Discovery is introduced for the purpose of facilitating the collection and annotation of contents from multiple ESTEST servers. We describe examples of use of the framework using two currently running servers at the University of California Davis and at the Centre Européen de Calcul Atomique et Moléculaire (CECAM).

  10. Abstractions for Fault-Tolerant Distributed System Verification

    NASA Technical Reports Server (NTRS)

    Pike, Lee S.; Maddalon, Jeffrey M.; Miner, Paul S.; Geser, Alfons

    2004-01-01

    Four kinds of abstraction for the design and analysis of fault tolerant distributed systems are discussed. These abstractions concern system messages, faults, fault masking voting, and communication. The abstractions are formalized in higher order logic, and are intended to facilitate specifying and verifying such systems in higher order theorem provers.

  11. Logistics distribution centers location problem and algorithm under fuzzy environment

    NASA Astrophysics Data System (ADS)

    Yang, Lixing; Ji, Xiaoyu; Gao, Ziyou; Li, Keping

    2007-11-01

    The distribution centers location problem is concerned with how to select distribution centers from the potential set so that the total relevant cost is minimized. This paper mainly investigates this problem under a fuzzy environment. Consequently, a chance-constrained programming model for the problem is designed and some properties of the model are investigated. A tabu search algorithm, a genetic algorithm, and a fuzzy simulation algorithm are integrated to seek the approximate best solution of the model. A numerical example is also given to show the application of the algorithm.
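
    In deterministic form, the underlying selection problem looks like the sketch below (illustrative only; the paper works under fuzzy demand with a chance-constrained model and uses tabu search, a genetic algorithm, and fuzzy simulation rather than the exhaustive search shown, and all costs and coordinates are made up):

```python
import itertools, math, random

def total_cost(centers, customers, open_cost):
    """Transport cost (distance to the nearest open center) plus opening costs."""
    transport = sum(min(math.dist(c, f) for f in centers) for c in customers)
    return transport + open_cost * len(centers)

def best_subset(potential, customers, n_open, open_cost):
    """Exhaustive search over subsets, standing in for the paper's
    tabu-search / genetic-algorithm / fuzzy-simulation hybrid."""
    return min(itertools.combinations(potential, n_open),
               key=lambda s: total_cost(s, customers, open_cost))

random.seed(1)
customers = [(random.random() * 10, random.random() * 10) for _ in range(30)]
potential = [(2, 2), (8, 2), (5, 5), (2, 8), (8, 8)]
print(best_subset(potential, customers, n_open=2, open_cost=5.0))
```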

  12. Design and verification of distributed logic controllers with application of Petri nets

    NASA Astrophysics Data System (ADS)

    Wiśniewski, Remigiusz; Grobelna, Iwona; Grobelny, Michał; Wiśniewska, Monika

    2015-12-01

    The paper deals with the design and verification of distributed logic controllers. The control system is initially modelled with Petri nets and formally verified against structural and behavioral properties with the application of temporal logic and the model-checking technique. After that, it is decomposed into separate sequential automata that work concurrently. Each of them is re-verified, and if the validation is successful, the system can finally be implemented.
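
    For readers unfamiliar with the formalism, the token-game semantics that model checking explores can be sketched in a few lines (a toy example, not the authors' tool; the place and transition names are invented):

```python
def enabled(marking, pre):
    """A transition is enabled if every input place holds enough tokens."""
    return all(marking.get(p, 0) >= n for p, n in pre.items())

def fire(marking, pre, post):
    """Fire one transition: consume tokens from inputs, produce on outputs."""
    m = dict(marking)
    for p, n in pre.items():
        m[p] -= n
    for p, n in post.items():
        m[p] = m.get(p, 0) + n
    return m

# Toy net: a controller acquiring and releasing a shared lock (the kind of
# mutual-exclusion behavior a model checker would verify exhaustively).
transitions = {
    "acquire_A": ({"idle_A": 1, "lock": 1}, {"busy_A": 1}),
    "release_A": ({"busy_A": 1}, {"idle_A": 1, "lock": 1}),
}
m0 = {"idle_A": 1, "lock": 1}
pre, post = transitions["acquire_A"]
if enabled(m0, pre):
    print(fire(m0, pre, post))  # {'idle_A': 0, 'lock': 0, 'busy_A': 1}
```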

  13. Design and verification of distributed logic controllers with application of Petri nets

    SciTech Connect

    Wiśniewski, Remigiusz; Grobelna, Iwona; Grobelny, Michał; Wiśniewska, Monika

    2015-12-31

    The paper deals with the design and verification of distributed logic controllers. The control system is initially modelled with Petri nets and formally verified against structural and behavioral properties with the application of temporal logic and the model-checking technique. After that, it is decomposed into separate sequential automata that work concurrently. Each of them is re-verified, and if the validation is successful, the system can finally be implemented.

  14. Towards the Formal Verification of a Distributed Real-Time Automotive System

    NASA Technical Reports Server (NTRS)

    Endres, Erik; Mueller, Christian; Shadrin, Andrey; Tverdyshev, Sergey

    2010-01-01

    We present the status of a project which aims at building and formally, pervasively verifying a distributed automotive system. The target system is a gate-level model which consists of several interconnected electronic control units with independent clocks. This model is verified against the specification as seen by a system programmer. The automotive system is implemented on several FPGA boards. The pervasive verification is carried out using a combination of interactive theorem proving (Isabelle/HOL) and model checking (LTL).

  15. The verification of lightning location accuracy in Finland deduced from lightning strikes to trees

    NASA Astrophysics Data System (ADS)

    Mäkelä, Antti; Mäkelä, Jakke; Haapalainen, Jussi; Porjo, Niko

    2016-05-01

    We present a new method to determine the ground truth and accuracy of lightning location systems (LLS), using natural lightning strikes to trees. Observations of strikes to trees are being collected with a Web-based survey tool at the Finnish Meteorological Institute. Since Finnish thunderstorms tend to have a low average flash rate, it is often possible to identify from the LLS data unambiguously the stroke that caused damage to a given tree. The coordinates of the tree are then the ground truth for that stroke. The technique has clear advantages over other methods used to determine the ground truth. Instrumented towers and rocket launches measure upward-propagating lightning. Video and audio records, even with triangulation, are rarely capable of high accuracy. We present data for 36 quality-controlled tree strikes in the years 2007-2008. We show that the average inaccuracy of the lightning location network for that period was 600 m. In addition, we show that the 50% confidence ellipse calculated by the lightning location network and used operationally for describing the location accuracy is physically meaningful: half of all the strikes were located within the uncertainty ellipse of the nearest recorded stroke. Using tree strike data thus allows not only the accuracy of the LLS to be estimated but also the reliability of the uncertainty ellipse. To our knowledge, this method has not been attempted before for natural lightning.
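
    The uncertainty-ellipse check at the heart of this validation can be sketched as follows (a geometric illustration under assumed inputs; the LLS's ellipse parameterization and the study's quality-control steps are not reproduced):

```python
import math

def inside_error_ellipse(tree_xy, stroke_xy, semi_major, semi_minor, angle_deg):
    """Check whether the ground-truth tree position falls inside the
    confidence ellipse reported for the located stroke (meters)."""
    dx, dy = tree_xy[0] - stroke_xy[0], tree_xy[1] - stroke_xy[1]
    t = math.radians(angle_deg)
    # Rotate the offset into the ellipse's principal-axis frame.
    u = dx * math.cos(t) + dy * math.sin(t)
    v = -dx * math.sin(t) + dy * math.cos(t)
    return (u / semi_major) ** 2 + (v / semi_minor) ** 2 <= 1.0

# Usage: stroke located 400 m east of the tree, 700 m x 300 m ellipse at 30 deg.
print(inside_error_ellipse((0.0, 0.0), (400.0, 0.0), 700.0, 300.0, 30.0))  # True
```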

  16. Solute location in a nanoconfined liquid depends on charge distribution

    NASA Astrophysics Data System (ADS)

    Harvey, Jacob A.; Thompson, Ward H.

    2015-07-01

    Nanostructured materials that can confine liquids have attracted increasing attention for their diverse properties and potential applications. Yet, significant gaps remain in our fundamental understanding of such nanoconfined liquids. Using replica exchange molecular dynamics simulations of a nanoscale, hydroxyl-terminated silica pore system, we determine how the locations explored by a coumarin 153 (C153) solute in ethanol depend on its charge distribution, which can be changed through a charge transfer electronic excitation. The solute position change is driven by the internal energy, which favors C153 at the pore surface compared to the pore interior, but less so for the more polar, excited-state molecule. This is attributed to more favorable non-specific solvation of the large dipole moment excited-state C153 by ethanol at the expense of hydrogen-bonding with the pore. It is shown that a change in molecule location resulting from shifts in the charge distribution is a general result, though how the solute position changes will depend upon the specific system. This has important implications for interpreting measurements and designing applications of mesoporous materials.

  17. Solute location in a nanoconfined liquid depends on charge distribution

    SciTech Connect

    Harvey, Jacob A.; Thompson, Ward H.

    2015-07-28

    Nanostructured materials that can confine liquids have attracted increasing attention for their diverse properties and potential applications. Yet, significant gaps remain in our fundamental understanding of such nanoconfined liquids. Using replica exchange molecular dynamics simulations of a nanoscale, hydroxyl-terminated silica pore system, we determine how the locations explored by a coumarin 153 (C153) solute in ethanol depend on its charge distribution, which can be changed through a charge transfer electronic excitation. The solute position change is driven by the internal energy, which favors C153 at the pore surface compared to the pore interior, but less so for the more polar, excited-state molecule. This is attributed to more favorable non-specific solvation of the large dipole moment excited-state C153 by ethanol at the expense of hydrogen-bonding with the pore. It is shown that a change in molecule location resulting from shifts in the charge distribution is a general result, though how the solute position changes will depend upon the specific system. This has important implications for interpreting measurements and designing applications of mesoporous materials.

  18. Mass and galaxy distributions of four massive galaxy clusters from Dark Energy Survey Science Verification data

    SciTech Connect

    Melchior, P.; Suchyta, E.; Huff, E.; Hirsch, M.; Kacprzak, T.; Rykoff, E.; Gruen, D.; Armstrong, R.; Bacon, D.; Bechtol, K.; Bernstein, G. M.; Bridle, S.; Clampitt, J.; Honscheid, K.; Jain, B.; Jouvel, S.; Krause, E.; Lin, H.; MacCrann, N.; Patton, K.; Plazas, A.; Rowe, B.; Vikram, V.; Wilcox, H.; Young, J.; Zuntz, J.; Abbott, T.; Abdalla, F. B.; Allam, S. S.; Banerji, M.; Bernstein, J. P.; Bernstein, R. A.; Bertin, E.; Buckley-Geer, E.; Burke, D. L.; Castander, F. J.; da Costa, L. N.; Cunha, C. E.; Depoy, D. L.; Desai, S.; Diehl, H. T.; Doel, P.; Estrada, J.; Evrard, A. E.; Neto, A. F.; Fernandez, E.; Finley, D. A.; Flaugher, B.; Frieman, J. A.; Gaztanaga, E.; Gerdes, D.; Gruendl, R. A.; Gutierrez, G. R.; Jarvis, M.; Karliner, I.; Kent, S.; Kuehn, K.; Kuropatkin, N.; Lahav, O.; Maia, M. A. G.; Makler, M.; Marriner, J.; Marshall, J. L.; Merritt, K. W.; Miller, C. J.; Miquel, R.; Mohr, J.; Neilsen, E.; Nichol, R. C.; Nord, B. D.; Reil, K.; Roe, N. A.; Roodman, A.; Sako, M.; Sanchez, E.; Santiago, B. X.; Schindler, R.; Schubnell, M.; Sevilla-Noarbe, I.; Sheldon, E.; Smith, C.; Soares-Santos, M.; Swanson, M. E. C.; Sypniewski, A. J.; Tarle, G.; Thaler, J.; Thomas, D.; Tucker, D. L.; Walker, A.; Wechsler, R.; Weller, J.; Wester, W.

    2015-03-31

    We measure the weak-lensing masses and galaxy distributions of four massive galaxy clusters observed during the Science Verification phase of the Dark Energy Survey. This pathfinder study is meant to 1) validate the DECam imager for the task of measuring weak-lensing shapes, and 2) utilize DECam's large field of view to map out the clusters and their environments over 90 arcmin. We conduct a series of rigorous tests on astrometry, photometry, image quality, PSF modeling, and shear measurement accuracy to single out flaws in the data and also to identify the optimal data processing steps and parameters. We find Science Verification data from DECam to be suitable for the lensing analysis described in this paper. The PSF is generally well-behaved, but the modeling is rendered difficult by a flux-dependent PSF width and ellipticity. We employ photometric redshifts to distinguish between foreground and background galaxies, and a red-sequence cluster finder to provide cluster richness estimates and cluster-galaxy distributions. By fitting NFW profiles to the clusters in this study, we determine weak-lensing masses that are in agreement with previous work. For Abell 3261, we provide the first estimates of redshift, weak-lensing mass, and richness. In addition, the cluster-galaxy distributions indicate the presence of filamentary structures attached to 1E 0657-56 and RXC J2248.7-4431, stretching out as far as 1 degree (approximately 20 Mpc), showcasing the potential of DECam and DES for detailed studies of degree-scale features on the sky.
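
    As a pointer to the profile being fitted, the NFW form and a least-squares fit to mock data are sketched below (the actual analysis fits NFW models to weak-lensing shear rather than to a density profile, so this only illustrates the functional form; all data are synthetic):

```python
import numpy as np
from scipy.optimize import curve_fit

def nfw_density(r, rho0, r_s):
    """Navarro-Frenk-White profile: rho(r) = rho0 / ((r/r_s)(1 + r/r_s)^2)."""
    x = r / r_s
    return rho0 / (x * (1.0 + x) ** 2)

# Usage: fit mock cluster measurements (radii in Mpc, arbitrary density units).
r = np.linspace(0.1, 3.0, 25)
rng = np.random.default_rng(7)
obs = nfw_density(r, 5.0, 0.4) * (1 + 0.05 * rng.standard_normal(r.size))
(rho0_fit, rs_fit), _ = curve_fit(nfw_density, r, obs, p0=[1.0, 1.0])
print(f"rho0 = {rho0_fit:.2f}, r_s = {rs_fit:.2f} Mpc")
```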

  19. Mass and galaxy distributions of four massive galaxy clusters from Dark Energy Survey Science Verification data

    SciTech Connect

    Melchior, P.; Suchyta, E.; Huff, E.; Hirsch, M.; Kacprzak, T.; Rykoff, E.; Gruen, D.; Armstrong, R.; Bacon, D.; Bechtol, K.; Bernstein, G. M.; Bridle, S.; Clampitt, J.; Honscheid, K.; Jain, B.; Jouvel, S.; Krause, E.; Lin, H.; MacCrann, N.; Patton, K.; Plazas, A.; Rowe, B.; Vikram, V.; Wilcox, H.; Young, J.; Zuntz, J.; Abbott, T.; Abdalla, F. B.; Allam, S. S.; Banerji, M.; Bernstein, J. P.; Bernstein, R. A.; Bertin, E.; Buckley-Geer, E.; Burke, D. L.; Castander, F. J.; da Costa, L. N.; Cunha, C. E.; Depoy, D. L.; Desai, S.; Diehl, H. T.; Doel, P.; Estrada, J.; Evrard, A. E.; Neto, A. F.; Fernandez, E.; Finley, D. A.; Flaugher, B.; Frieman, J. A.; Gaztanaga, E.; Gerdes, D.; Gruendl, R. A.; Gutierrez, G. R.; Jarvis, M.; Karliner, I.; Kent, S.; Kuehn, K.; Kuropatkin, N.; Lahav, O.; Maia, M. A. G.; Makler, M.; Marriner, J.; Marshall, J. L.; Merritt, K. W.; Miller, C. J.; Miquel, R.; Mohr, J.; Neilsen, E.; Nichol, R. C.; Nord, B. D.; Reil, K.; Roe, N. A.; Roodman, A.; Sako, M.; Sanchez, E.; Santiago, B. X.; Schindler, R.; Schubnell, M.; Sevilla-Noarbe, I.; Sheldon, E.; Smith, C.; Soares-Santos, M.; Swanson, M. E. C.; Sypniewski, A. J.; Tarle, G.; Thaler, J.; Thomas, D.; Tucker, D. L.; Walker, A.; Wechsler, R.; Weller, J.; Wester, W.

    2015-03-31

    We measure the weak lensing masses and galaxy distributions of four massive galaxy clusters observed during the Science Verification phase of the Dark Energy Survey (DES). This pathfinder study is meant to (1) validate the Dark Energy Camera (DECam) imager for the task of measuring weak lensing shapes, and (2) utilize DECam's large field of view to map out the clusters and their environments over 90 arcmin. We conduct a series of rigorous tests on astrometry, photometry, image quality, point spread function (PSF) modelling, and shear measurement accuracy to single out flaws in the data and also to identify the optimal data processing steps and parameters. We find Science Verification data from DECam to be suitable for the lensing analysis described in this paper. The PSF is generally well behaved, but the modelling is rendered difficult by a flux-dependent PSF width and ellipticity. We employ photometric redshifts to distinguish between foreground and background galaxies, and a red-sequence cluster finder to provide cluster richness estimates and cluster-galaxy distributions. By fitting Navarro-Frenk-White profiles to the clusters in this study, we determine weak lensing masses that are in agreement with previous work. For Abell 3261, we provide the first estimates of redshift, weak lensing mass, and richness. In addition, the cluster-galaxy distributions indicate the presence of filamentary structures attached to 1E 0657-56 and RXC J2248.7-4431, stretching out as far as 1 degree (approximately 20 Mpc), showcasing the potential of DECam and DES for detailed studies of degree-scale features on the sky.

  20. Mass and galaxy distributions of four massive galaxy clusters from Dark Energy Survey Science Verification data

    DOE PAGES

    Melchior, P.; Suchyta, E.; Huff, E.; ...

    2015-03-31

    We measure the weak-lensing masses and galaxy distributions of four massive galaxy clusters observed during the Science Verification phase of the Dark Energy Survey. This pathfinder study is meant to 1) validate the DECam imager for the task of measuring weak-lensing shapes, and 2) utilize DECam's large field of view to map out the clusters and their environments over 90 arcmin. We conduct a series of rigorous tests on astrometry, photometry, image quality, PSF modelling, and shear measurement accuracy to single out flaws in the data and also to identify the optimal data processing steps and parameters. We find Science Verification data from DECam to be suitable for the lensing analysis described in this paper. The PSF is generally well-behaved, but the modelling is rendered difficult by a flux-dependent PSF width and ellipticity. We employ photometric redshifts to distinguish between foreground and background galaxies, and a red-sequence cluster finder to provide cluster richness estimates and cluster-galaxy distributions. By fitting NFW profiles to the clusters in this study, we determine weak-lensing masses that are in agreement with previous work. For Abell 3261, we provide the first estimates of redshift, weak-lensing mass, and richness. Additionally, the cluster-galaxy distributions indicate the presence of filamentary structures attached to 1E 0657-56 and RXC J2248.7-4431, stretching out as far as 1 degree (approximately 20 Mpc), showcasing the potential of DECam and DES for detailed studies of degree-scale features on the sky.

  1. Mass and galaxy distributions of four massive galaxy clusters from Dark Energy Survey Science Verification data

    SciTech Connect

    Melchior, P.; Suchyta, E.; Huff, E.; Hirsch, M.; Kacprzak, T.; Rykoff, E.; Gruen, D.; Armstrong, R.; Bacon, D.; Bechtol, K.; Bernstein, G. M.; Bridle, S.; Clampitt, J.; Honscheid, K.; Jain, B.; Jouvel, S.; Krause, E.; Lin, H.; MacCrann, N.; Patton, K.; Plazas, A.; Rowe, B.; Vikram, V.; Wilcox, H.; Young, J.; Zuntz, J.; Abbott, T.; Abdalla, F. B.; Allam, S. S.; Banerji, M.; Bernstein, J. P.; Bernstein, R. A.; Bertin, E.; Buckley-Geer, E.; Burke, D. L.; Castander, F. J.; da Costa, L. N.; Cunha, C. E.; Depoy, D. L.; Desai, S.; Diehl, H. T.; Doel, P.; Estrada, J.; Evrard, A. E.; Neto, A. F.; Fernandez, E.; Finley, D. A.; Flaugher, B.; Frieman, J. A.; Gaztanaga, E.; Gerdes, D.; Gruendl, R. A.; Gutierrez, G. R.; Jarvis, M.; Karliner, I.; Kent, S.; Kuehn, K.; Kuropatkin, N.; Lahav, O.; Maia, M. A. G.; Makler, M.; Marriner, J.; Marshall, J. L.; Merritt, K. W.; Miller, C. J.; Miquel, R.; Mohr, J.; Neilsen, E.; Nichol, R. C.; Nord, B. D.; Reil, K.; Roe, N. A.; Roodman, A.; Sako, M.; Sanchez, E.; Santiago, B. X.; Schindler, R.; Schubnell, M.; Sevilla-Noarbe, I.; Sheldon, E.; Smith, C.; Soares-Santos, M.; Swanson, M. E. C.; Sypniewski, A. J.; Tarle, G.; Thaler, J.; Thomas, D.; Tucker, D. L.; Walker, A.; Wechsler, R.; Weller, J.; Wester, W.

    2015-03-31

    We measure the weak-lensing masses and galaxy distributions of four massive galaxy clusters observed during the Science Verification phase of the Dark Energy Survey. This pathfinder study is meant to 1) validate the DECam imager for the task of measuring weak-lensing shapes, and 2) utilize DECam's large field of view to map out the clusters and their environments over 90 arcmin. We conduct a series of rigorous tests on astrometry, photometry, image quality, PSF modelling, and shear measurement accuracy to single out flaws in the data and also to identify the optimal data processing steps and parameters. We find Science Verification data from DECam to be suitable for the lensing analysis described in this paper. The PSF is generally well-behaved, but the modelling is rendered difficult by a flux-dependent PSF width and ellipticity. We employ photometric redshifts to distinguish between foreground and background galaxies, and a red-sequence cluster finder to provide cluster richness estimates and cluster-galaxy distributions. By fitting NFW profiles to the clusters in this study, we determine weak-lensing masses that are in agreement with previous work. For Abell 3261, we provide the first estimates of redshift, weak-lensing mass, and richness. Additionally, the cluster-galaxy distributions indicate the presence of filamentary structures attached to 1E 0657-56 and RXC J2248.7-4431, stretching out as far as 1 degree (approximately 20 Mpc), showcasing the potential of DECam and DES for detailed studies of degree-scale features on the sky.

  2. Verification of spatial and temporal pressure distributions in segmented solid rocket motors

    NASA Technical Reports Server (NTRS)

    Salita, Mark

    1989-01-01

    A wide variety of analytical tools are in use today to predict the history and spatial distributions of pressure in the combustion chambers of solid rocket motors (SRMs). Experimental and analytical methods are presented here that allow the verification of many of these predictions. These methods are applied to the redesigned space shuttle booster (RSRM). Girth strain-gage data is compared to the predictions of various one-dimensional quasisteady analyses in order to verify the axial drop in motor static pressure during ignition transients as well as quasisteady motor operation. The results of previous modeling of radial flows in the bore, slots, and around grain overhangs are supported by approximate analytical and empirical techniques presented here. The predictions of circumferential flows induced by inhibitor asymmetries, nozzle vectoring, and propellant slump are compared to each other and to subscale cold air and water tunnel measurements to ascertain their validity.

  3. Dosimetric verification of stereotactic radiosurgery/stereotactic radiotherapy dose distributions using Gafchromic EBT3

    SciTech Connect

    Cusumano, Davide; Fumagalli, Maria L.; Marchetti, Marcello; Fariselli, Laura; De Martin, Elena

    2015-10-01

    The aim of this study is to examine the feasibility of using the new Gafchromic EBT3 film in a high-dose stereotactic radiosurgery and radiotherapy quality assurance procedure. Owing to the reduced dimensions of the involved lesions, the feasibility of scanning plan verification films on the scanner plate area with the best uniformity rather than using a correction mask was evaluated. For this purpose, signal value dispersion and reproducibility of film scans were investigated. Uniformity was then quantified in the selected area and was found to be within 1.5% for doses up to 8 Gy. A high-dose threshold level for analyses using this procedure was established by evaluating the sensitivity of the irradiated films. Sensitivity was found to be of the order of centiGray for doses up to 6.2 Gy and decreasing for higher doses. The obtained results were used to implement a procedure comparing dose distributions delivered with a CyberKnife system to planned ones. The procedure was validated through single-beam irradiation on a Gafchromic film. The agreement between dose distributions was then evaluated for 13 patients (brain lesions, 5 Gy/die prescription isodose ~80%) using gamma analysis. Results obtained using gamma test criteria of 5%/1 mm show a pass rate of 94.3%. The calculation of gamma frequency parameters for EBT3 films was shown to depend strongly on the subtraction of unexposed film pixel values from irradiated ones. In the framework of the described dosimetric procedure, EBT3 films proved to be effective in the verification of high doses delivered to lesions with complex shapes and adjacent to organs at risk.

  4. Software for Statistical Analysis of Weibull Distributions with Application to Gear Fatigue Data: User Manual with Verification

    NASA Technical Reports Server (NTRS)

    Krantz, Timothy L.

    2002-01-01

    The Weibull distribution has been widely adopted for the statistical description and inference of fatigue data. This document provides user instructions, examples, and verification for software to analyze gear fatigue test data. The software was developed presuming the data are adequately modeled using a two-parameter Weibull distribution. The calculations are based on likelihood methods, and the approach taken is valid for data that include type I censoring. The software was verified by reproducing results published by others.
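
    A likelihood-based fit of the kind described, handling type I (right) censoring, might look like this sketch (not the NASA software itself; the life values and starting guesses are invented):

```python
import numpy as np
from scipy.optimize import minimize

def neg_log_likelihood(params, t, observed):
    """Two-parameter Weibull log-likelihood with type I censoring: failures
    contribute the density, censored runouts the survival function."""
    shape, scale = params
    if shape <= 0 or scale <= 0:
        return np.inf
    z = t / scale
    log_pdf = np.log(shape / scale) + (shape - 1) * np.log(z) - z ** shape
    log_sf = -z ** shape  # log of the Weibull survival function
    return -np.sum(np.where(observed, log_pdf, log_sf))

# Usage: gear fatigue lives in cycles; runouts at 1e7 cycles are censored.
lives = np.array([3.1e6, 4.5e6, 5.2e6, 6.8e6, 1.0e7, 1.0e7])
failed = np.array([True, True, True, True, False, False])
fit = minimize(neg_log_likelihood, x0=[2.0, 6e6], args=(lives, failed),
               method="Nelder-Mead")
print(f"shape = {fit.x[0]:.2f}, scale = {fit.x[1]:.3g} cycles")
```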

  5. Software for Statistical Analysis of Weibull Distributions with Application to Gear Fatigue Data: User Manual with Verification

    NASA Technical Reports Server (NTRS)

    Krantz, Timothy L.

    2002-01-01

    The Weibull distribution has been widely adopted for the statistical description and inference of fatigue data. This document provides user instructions, examples, and verification for software to analyze gear fatigue test data. The software was developed presuming the data are adequately modeled using a two-parameter Weibull distribution. The calculations are based on likelihood methods, and the approach taken is valid for data that include type I censoring. The software was verified by reproducing results published by others.

  6. Distribution and Location of Genetic Effects for Dairy Traits

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Genetic effects for many dairy traits and for total economic merit are fairly evenly distributed across all chromosomes. A high-density scan using 38,416 SNP markers for 5,285 bulls confirmed two previously-known major genes on Bos taurus autosomes (BTA) 6 and 14 but revealed few other large effects...

  7. Location cuing and response time distributions in visual attention.

    PubMed

    Gottlob, Lawrence R

    2004-11-01

    The allocation of visual attention was investigated in two experiments. In Experiment 1 (n = 24), a peripheral cue was presented, and in Experiment 2 (n = 24), a central cue was used. In both experiments, cue validity was 90%, and the task was four-choice target identification. Response time distributions were collected for valid trials over five cue-target stimulus onset asynchronies (SOAs), and ex-Gaussian parameters were extracted. In both experiments, only the mean of the Gaussian component decreased as a function of cue-target SOA, which implied a strict time axis translation of the distributions. The results were consistent with sequential sampling models featuring a variable delay in the onset of information uptake.
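
    To make the ex-Gaussian decomposition concrete, SciPy's exponnorm distribution (whose shape parameter K equals tau/sigma) can recover the three parameters from simulated response times. This is a generic sketch, not the study's fitting procedure, and the parameter values are arbitrary:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
# Simulated response times: a Gaussian stage (mu, sigma) plus an
# exponential stage (tau), the standard ex-Gaussian composition.
mu, sigma, tau = 400.0, 40.0, 120.0
rts = rng.normal(mu, sigma, 2000) + rng.exponential(tau, 2000)

# Maximum-likelihood fit; exponnorm's loc/scale map to mu/sigma.
K, loc, scale = stats.exponnorm.fit(rts)
print(f"mu = {loc:.1f}, sigma = {scale:.1f}, tau = {K * scale:.1f}")
```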

  8. SLR data screening; location of peak of data distribution

    NASA Technical Reports Server (NTRS)

    Sinclair, Andrew T.

    1993-01-01

    At the 5th Laser Ranging Instrumentation Workshop held at Herstmonceux in 1984, consideration was given to the formation of on-site normal points by laser stations, and an algorithm was formulated. The algorithm included a recommendation that an iterated 3.0 x rms rejection criterion should be used to screen the data, and that arithmetic means should be formed within the normal point bins of the retained data. From Sept. 1990 onwards, this algorithm and screening criterion have been brought into effect by various laser stations for forming on-site normal points, and small variants of the algorithm are used by most analysis centers for forming normal points from full-rate data, although the data screening criterion they use ranges from about 2.5 to 3.0 x rms. At the CSTG Satellite Laser Ranging (SLR) Subcommission, a working group was set up in Mar. 1991 to review the recommended screening procedure. This paper has been influenced by the discussions of this working group, although the views expressed are primarily those of this author. The main thrust of this paper is that, particularly for single photon systems, a more important issue than data screening is the determination of the peak of a data distribution and hence the bias of the peak relative to the mean. Several methods of determining the peak are discussed.
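
    The iterated 3.0 x rms screening and a simple peak estimate can be sketched as follows (the histogram-mode peak finder is one crude option among the several methods the paper discusses, and the simulated skewed residuals are arbitrary):

```python
import numpy as np

def iterated_rms_clip(residuals, n_sigma=3.0, max_iter=20):
    """Iterated n x rms rejection as in the normal-point algorithm:
    recompute the rms of retained residuals until no point is rejected."""
    kept = np.asarray(residuals, dtype=float)
    for _ in range(max_iter):
        mean, rms = kept.mean(), kept.std()
        retained = kept[np.abs(kept - mean) <= n_sigma * rms]
        if len(retained) == len(kept):
            break
        kept = retained
    return kept

def peak_of_distribution(residuals, bins=50):
    """Crude peak estimate: midpoint of the fullest histogram bin."""
    counts, edges = np.histogram(residuals, bins=bins)
    i = np.argmax(counts)
    return 0.5 * (edges[i] + edges[i + 1])

# Usage: skewed residuals plus outliers; peak sits below the mean.
rng = np.random.default_rng(3)
data = np.concatenate([rng.gamma(2.0, 5.0, 5000), rng.uniform(0, 300, 50)])
kept = iterated_rms_clip(data)
print(f"mean = {kept.mean():.1f}, peak = {peak_of_distribution(kept):.1f}")
```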

  9. Pretreatment verification of IMRT absolute dose distributions using a commercial a-Si EPID

    SciTech Connect

    Talamonti, C.; Casati, M.; Bucciolini, M.

    2006-11-15

    A commercial amorphous silicon electronic portal imaging device (EPID) has been studied to investigate its potential in the field of pretreatment verifications of step and shoot, intensity modulated radiation therapy (IMRT), 6 MV photon beams. The EPID was calibrated to measure absolute exit dose in a water-equivalent phantom at patient level, following an experimental approach which does not require sophisticated calculation algorithms. The procedure presented was specifically intended to replace the time-consuming in-phantom film dosimetry. The dosimetric response was characterized on the central axis in terms of stability, linearity, and pulse repetition frequency dependence. The a-Si EPID demonstrated a good linearity with dose (within 2% from 1 monitor unit), which represents a prerequisite for its application in IMRT. A series of measurements, in which phantom thickness, air gap between the phantom and the EPID, field size, and position of measurement of dose in the phantom (entrance or exit) varied, was performed to find the optimal calibration conditions, for which the field size dependence is minimized. In these conditions (20 cm phantom thickness, 56 cm air gap, exit dose measured at the isocenter), the introduction of a filter for the low-energy scattered radiation allowed us to define a universal calibration factor, independent of field size. The off-axis extension of the dose calibration was performed by applying a radial correction for the beam profile, which is distorted by the standard flood field calibration of the device. The acquisition of IMRT fields required purpose-built software and a specific procedure. This method was applied to measure the dose distributions for 15 clinical IMRT fields. The agreement between the dose distributions, quantified by the gamma index, was found, on average, in 97.6% and 98.3% of the analyzed points for EPID versus TPS and for EPID versus film, respectively, thus suggesting a great…
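
    The gamma-index agreement quoted above combines a dose-difference and a distance-to-agreement criterion. A brute-force 1-D toy version might look like the following; real evaluations are 2-D or 3-D, and the profiles below are synthetic.

      # 1-D gamma index with 3%/3 mm criteria, global dose normalization.
      import numpy as np

      def gamma_1d(x, dose_eval, dose_ref, dta_mm=3.0, dd_frac=0.03):
          # Return the gamma value at each reference point.
          d_max = dose_ref.max()
          gammas = np.empty_like(dose_ref)
          for i, (xi, dri) in enumerate(zip(x, dose_ref)):
              dist2 = ((x - xi) / dta_mm) ** 2
              dd2 = ((dose_eval - dri) / (dd_frac * d_max)) ** 2
              gammas[i] = np.sqrt(np.min(dist2 + dd2))
          return gammas

      x = np.linspace(-50, 50, 201)                        # position in mm
      ref = np.exp(-x**2 / (2 * 20.0**2))                  # reference (TPS) profile
      meas = 1.02 * np.exp(-(x - 1.0)**2 / (2 * 20.0**2))  # shifted, scaled profile
      g = gamma_1d(x, meas, ref)
      print(f"pass rate (gamma <= 1): {100 * np.mean(g <= 1):.1f}%")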

  10. Study On Burst Location Technology under Steady-state in Water Distribution System

    NASA Astrophysics Data System (ADS)

    Liu, Xianpin; Li, Shuping; Wang, Shaowei; He, Fang; He, Zhixun; Cao, Guodong

    2010-11-01

    Based on the characteristics of the hydraulic information observed under burst conditions in a water distribution system, the correlation between monitoring values and burst location is obtained by mathematical fitting, so that the position of a burst can be located promptly. The method makes effective use of the SCADA information in the water distribution system to actively locate the burst position. It offers a new approach to burst location in water distribution systems that shortens the burst duration and reduces the impact on the urban water supply, economic losses, and the waste of water resources.

  11. Redshift Distributions of Galaxies in the DES Science Verification Shear Catalogue and Implications for Weak Lensing

    SciTech Connect

    Bonnett, C.

    2015-07-21

    We present photometric redshift estimates for galaxies used in the weak lensing analysis of the Dark Energy Survey Science Verification (DES SV) data. Four model- or machine learning-based photometric redshift methods (annz2, bpz calibrated against BCC-Ufig simulations, skynet, and tpz) are analysed. For training, calibration, and testing of these methods, we also construct a catalogue of spectroscopically confirmed galaxies matched against DES SV data. The performance of the methods is evaluated against the matched spectroscopic catalogue, focusing on metrics relevant for weak lensing analyses, with additional validation against COSMOS photo-zs. From the galaxies in the DES SV shear catalogue, which have mean redshift 0.72 ± 0.01 over the range 0.3 < z < 1.3, we construct three tomographic bins with means of z = {0.45, 0.67, 1.00}. These bins each have systematic uncertainties δz ≲ 0.05 in the mean of the fiducial skynet photo-z n(z). We propagate the errors in the redshift distributions through to their impact on cosmological parameters estimated with cosmic shear, and find that they cause shifts in the value of σ8 of approximately 3%. This shift is within the one sigma statistical errors on σ8 for the DES SV shear catalogue. Further study of the potential impact of systematic differences on the critical surface density, Σcrit, found levels of bias safely less than the statistical power of the DES SV data. We recommend a final Gaussian prior for the photo-z bias in the mean of n(z) of width 0.05 for each of the three tomographic bins, and show that this is a sufficient bias model for the corresponding cosmology analysis.

  12. Operational flood-forecasting in the Piemonte region - development and verification of a fully distributed physically-oriented hydrological model

    NASA Astrophysics Data System (ADS)

    Rabuffetti, D.; Ravazzani, G.; Barbero, S.; Mancini, M.

    2009-03-01

    A hydrological model for real-time flood forecasting for Civil Protection services requires reliability and rapidity. At present, computational capabilities easily meet the rapidity needs even when a fully distributed hydrological model is adopted for a large river catchment such as the Upper Po river basin closed at Ponte Becca (nearly 40 000 km²). This approach allows simulating the whole domain and obtaining the responses of large as well as of medium and small sized sub-catchments. The FEST-WB hydrological model (Mancini, 1990; Montaldo et al., 2007; Rabuffetti et al., 2008) is implemented. The calibration and verification activities are based on more than 100 flood events that occurred along the main tributaries of the Po river in the period 2000-2003. More than 300 meteorological stations are used to obtain the forcing fields; 10 cross sections with continuous and reliable discharge time series are used for calibration, while verification is performed on about 40 monitored cross sections. Furthermore, meteorological forecasting models are used to force the hydrological model with Quantitative Precipitation Forecasts (QPFs) over a 36 h horizon in "operational setting" experiments. Particular care is devoted to understanding how the QPF affects the accuracy of the Quantitative Discharge Forecasts (QDFs) and to assessing the impact of QDF uncertainty on the reliability of the warning system. Results are presented both in terms of QDFs and of warning issues, highlighting the importance of an "operational based" verification approach.

  13. Locational Marginal Pricing in the Campus Power System at the Power Distribution Level

    SciTech Connect

    Hao, Jun; Gu, Yi; Zhang, Yingchen; Zhang, Jun Jason; Gao, David Wenzhong

    2016-11-14

    In the development of the smart grid at the distribution level, the realization of real-time nodal pricing is one of the key challenges. The research work in this paper implements and studies the methodology of locational marginal pricing at the distribution level based on a real-world distribution power system. The pricing mechanism utilizes optimal power flow to calculate the corresponding distribution nodal prices. Both Direct Current Optimal Power Flow and Alternating Current Optimal Power Flow are utilized to calculate and analyze the nodal prices. The University of Denver campus power grid is used as the power distribution system test bed to demonstrate the pricing methodology.
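
    Nodal prices of this kind fall out of an OPF as the duals of the per-bus power balance constraints. A toy two-bus DC-OPF, with made-up costs and limits rather than the Denver campus data, shows the mechanics:

      # Nodal prices as duals of bus balances in a two-bus DC-OPF (illustrative).
      import numpy as np
      from scipy.optimize import linprog

      # variables: x = [g1, g2, f]  (generation at buses 1 and 2, line flow 1->2)
      cost = np.array([10.0, 30.0, 0.0])          # $/MWh offers, flow is free
      A_eq = np.array([[1.0, 0.0, -1.0],          # bus 1 balance: g1 - f = load1
                       [0.0, 1.0,  1.0]])         # bus 2 balance: g2 + f = load2
      b_eq = np.array([0.0, 100.0])               # all 100 MW of load at bus 2
      bounds = [(0, None), (0, None), (-60, 60)]  # 60 MW line limit

      res = linprog(cost, A_eq=A_eq, b_eq=b_eq, bounds=bounds, method="highs")
      print("dispatch g1, g2, f:", res.x)
      # With the line congested, the bus-balance duals separate into distinct
      # locational prices (10 at bus 1, 30 at bus 2, up to solver sign convention).
      print("nodal prices:", res.eqlin.marginals)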

  14. A Distribution-class Locational Marginal Price (DLMP) Index for Enhanced Distribution Systems

    NASA Astrophysics Data System (ADS)

    Akinbode, Oluwaseyi Wemimo

    The smart grid initiative is the impetus behind changes that are expected to culminate into an enhanced distribution system with the communication and control infrastructure to support advanced distribution system applications and resources such as distributed generation, energy storage systems, and price responsive loads. This research proposes a distribution-class analog of the transmission LMP (DLMP) as an enabler of the advanced applications of the enhanced distribution system. The DLMP is envisioned as a control signal that can incentivize distribution system resources to behave optimally in a manner that benefits economic efficiency and system reliability and that can optimally couple the transmission and the distribution systems. The DLMP is calculated from a two-stage optimization problem; a transmission system OPF and a distribution system OPF. An iterative framework that ensures accurate representation of the distribution system's price sensitive resources for the transmission system problem and vice versa is developed and its convergence problem is discussed. As part of the DLMP calculation framework, a DCOPF formulation that endogenously captures the effect of real power losses is discussed. The formulation uses piecewise linear functions to approximate losses. This thesis explores, with theoretical proofs, the breakdown of the loss approximation technique when non-positive DLMPs/LMPs occur and discusses a mixed integer linear programming formulation that corrects the breakdown. The DLMP is numerically illustrated in traditional and enhanced distribution systems and its superiority to contemporary pricing mechanisms is demonstrated using price responsive loads. Results show that the impact of the inaccuracy of contemporary pricing schemes becomes significant as flexible resources increase. At high elasticity, aggregate load consumption deviated from the optimal consumption by up to about 45 percent when using a flat or time-of-use rate. Individual load

  15. A location-routing-inventory model for designing multisource distribution networks

    NASA Astrophysics Data System (ADS)

    Ahmadi-Javid, Amir; Seddighi, Amir Hossein

    2012-06-01

    This article studies a ternary-integration problem that incorporates location, inventory and routing decisions in designing a multisource distribution network. The objective of the problem is to minimize the total cost of location, routing and inventory. A mixed-integer programming formulation is first presented, and then a three-phase heuristic is developed to solve large-sized instances of the problem. The numerical study indicates that the proposed heuristic is both effective and efficient.

  16. Geographic location, network patterns and population distribution of rural settlements in Greece

    NASA Astrophysics Data System (ADS)

    Asimakopoulos, Avraam; Mogios, Emmanuel; Xenikos, Dimitrios G.

    2016-10-01

    Our work addresses the problem of how social networks are embedded in space, by studying the spread of human population over complex geomorphological terrain. We focus on villages or small cities up to a few thousand inhabitants located in mountainous areas in Greece. This terrain presents a familiar tree-like structure of valleys and land plateaus. Cities are found more often at lower altitudes and exhibit preference on south orientation. Furthermore, the population generally avoids flat land plateaus and river beds, preferring locations slightly uphill, away from the plateau edge. Despite the location diversity regarding geomorphological parameters, we find certain quantitative norms when we examine location and population distributions relative to the (man-made) transportation network. In particular, settlements at radial distance ℓ away from road network junctions have the same mean altitude, practically independent of ℓ ranging from a few meters to 10 km. Similarly, the distribution of the settlement population at any given ℓ is the same for all ℓ. Finally, the cumulative distribution of the number of rural cities n(ℓ) is fitted to the Weibull distribution, suggesting that human decisions for creating settlements could be paralleled to mechanisms typically attributed to this particular statistical distribution.

  17. Optimization of pressure gauge locations for water distribution systems using entropy theory.

    PubMed

    Yoo, Do Guen; Chang, Dong Eil; Jun, Hwandon; Kim, Joong Hoon

    2012-12-01

    It is essential to select the optimal pressure gauge location for effective management and maintenance of water distribution systems. This study proposes an objective, quantified standard for selecting the optimal pressure gauge location by defining, using entropy theory, the pressure change at other nodes that results from a demand change at a specific node. Two demand-change cases are considered: one in which demand at all nodes is at peak load (applying a peak factor), and one in which demand varies according to a normal distribution whose mean is the base demand. The actual pressure change pattern is determined by using the emitter function of EPANET to reflect the pressure that changes in practice at each node. The optimal pressure gauge location is determined by prioritizing the node that processes the largest amount of information it gives to (giving entropy) and receives from (receiving entropy) the whole system according to the entropy standard. The suggested model is applied to one virtual and one real pipe network, and the optimal pressure gauge location combination is calculated by implementing a sensitivity analysis based on the study results. These analysis results support two conclusions. First, the installation priority of pressure gauges in water distribution networks can be determined with a more objective standard through entropy theory. Second, the model can be used as an efficient decision-making guide for gauge installation in water distribution systems.
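
    One loose reading of the giving/receiving entropy ranking can be sketched as follows; the pressure-response matrix here is random, whereas the paper derives it from EPANET emitter simulations, so both the matrix and the scoring rule are assumptions:

      # Rank nodes for gauge placement by giving + receiving Shannon entropy.
      import numpy as np

      def shannon_entropy(p, axis):
          p = p / p.sum(axis=axis, keepdims=True)    # normalize to probabilities
          with np.errstate(divide="ignore", invalid="ignore"):
              terms = np.where(p > 0, -p * np.log2(p), 0.0)
          return terms.sum(axis=axis)

      rng = np.random.default_rng(3)
      dp = np.abs(rng.normal(size=(20, 20)))    # |pressure change|, node x node
                                                # row i: responses to demand change at i

      giving = shannon_entropy(dp, axis=1)      # information a node sends out
      receiving = shannon_entropy(dp, axis=0)   # information a node takes in
      rank = np.argsort(-(giving + receiving))
      print("gauge priority (best first):", rank[:5])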

  18. A method to optimize sampling locations for measuring indoor air distributions

    NASA Astrophysics Data System (ADS)

    Huang, Yan; Shen, Xiong; Li, Jianmin; Li, Bingye; Duan, Ran; Lin, Chao-Hsin; Liu, Junjie; Chen, Qingyan

    2015-02-01

    Indoor air distributions, such as the distributions of air temperature, air velocity, and contaminant concentrations, are very important to occupants' health and comfort in enclosed spaces. When point data are collected and interpolated to form field distributions, the sampling locations (the locations of the point sensors) have a significant effect on the time invested, labor costs, and accuracy of the field interpolation. This investigation compared two different methods for determining sampling locations: the grid method and the gradient-based method. The two methods were applied to obtain point air parameter data in an office room and in a section of an economy-class aircraft cabin. The point data obtained were then interpolated to form field distributions by the ordinary Kriging method. Our error analysis shows that the gradient-based sampling method has a 32.6% smaller interpolation error than the grid sampling method. We derived the relationship between the interpolation error and the sampling size (the number of sampling points). According to this relationship, the sampling size has an optimal value, and the maximum sampling size can be determined from the sensor and system errors. This study recommends the gradient-based sampling method for measuring indoor air distributions.
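
    A minimal sketch of gradient-based placement, assuming a coarse pre-scan of the field is available and that sensors are simply assigned to the steepest locations; the synthetic field and the selection rule are assumptions:

      # Place a fixed sensor budget where a 2-D field changes fastest.
      import numpy as np

      ny, nx = 40, 60
      y, x = np.mgrid[0:ny, 0:nx]
      field = 22 + 3 * np.exp(-((x - 15)**2 + (y - 10)**2) / 60.0)  # warm plume

      gy, gx = np.gradient(field)
      grad_mag = np.hypot(gx, gy)

      n_sensors = 25
      flat = np.argsort(-grad_mag.ravel())[:n_sensors]   # steepest cells first
      rows, cols = np.unravel_index(flat, field.shape)
      print(list(zip(rows[:5], cols[:5])))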

  19. Spatially distributed energy balance snowmelt modelling in a mountainous river basin: estimation of meteorological inputs and verification of model results

    NASA Astrophysics Data System (ADS)

    Garen, David C.; Marks, Danny

    2005-12-01

    A spatially distributed energy balance snowmelt model has been applied to a 2150 km² drainage basin in the Boise River, ID, USA, to simulate the accumulation and melt of the snowpack for the years 1998-2000. The simulation was run at a 3 h time step and a spatial resolution of 250 m. Spatial field time series of meteorological input data were obtained using various spatial interpolation and simulation methods. The variables include precipitation, air temperature, dew point temperature, wind speed, and solar and thermal radiation. The goal was to use readily available data and relatively straightforward, yet physically meaningful, methods to develop the spatial fields. With these meteorological fields as input, the simulated fields of snow water equivalent, snow depth, and snow covered area reproduce observations very well. The simulated snowmelt fields are also used as input to a spatially distributed hydrologic model to estimate streamflow. This gives an additional verification of the snowmelt modelling results as well as provides a linkage of the two models to generate hydrographs for water management information. This project is a demonstration of spatially distributed energy balance snowmelt modelling in a large mountainous catchment using data from existing meteorological networks. This capability then suggests the potential for developing new spatial hydrologic informational products and the possibility of improving the accuracy of the prediction of hydrologic processes for water and natural resources management.

  20. Role of origin and release location in pre-spawning distribution and movements of anadromous alewife

    USGS Publications Warehouse

    Frank, Holly J.; Mather, M. E.; Smith, Joseph M.; Muth, Robert M.; Finn, John T.

    2011-01-01

    Capturing adult anadromous fish that are ready to spawn from a self-sustaining population and transferring them into a depleted system is a common fisheries enhancement tool. The behaviour of these transplanted fish, however, has not been fully evaluated. The movements of stocked and native anadromous alewife, Alosa pseudoharengus (Wilson), were monitored in the Ipswich River, Massachusetts, USA, to provide a scientific basis for this management tool. Radiotelemetry was used to examine the effect of origin (native or stocked) and release location (upstream or downstream) on distribution and movement during the spawning migration. Native fish remained in the river longer than stocked fish regardless of release location. Release location and origin influenced where fish spent time and how they moved. The spatial mosaic of available habitats and the entire trajectory of freshwater movements should be considered to effectively restore spawners that traverse tens of kilometres within coastal rivers.

  1. Biomechanical Assessment of Rucksack Shoulder Strap Attachment Location: Effect on Load Distribution to the Torso

    DTIC Science & Technology

    2001-05-01

    Queen's University, Kingston, Ontario, Canada K7L 3N6. Summary: The objective of this study was to conduct biomechanical testing of pack component...

  2. Tomotherapy dose distribution verification using MAGIC-f polymer gel dosimetry

    SciTech Connect

    Pavoni, J. F.; Pike, T. L.; Snow, J.; DeWerd, L.; Baffa, O.

    2012-05-15

    Purpose: This paper presents the application of MAGIC-f gel in a three-dimensional dose distribution measurement and its ability to accurately measure the dose distribution from a tomotherapy unit. Methods: A prostate intensity-modulated radiation therapy (IMRT) irradiation was simulated in the gel phantom, and the treatment was delivered by a TomoTherapy unit. The dose distribution was evaluated from the R2 distribution measured by magnetic resonance imaging. Results: A high similarity was found by overlapping the isodoses of the dose distribution measured with the gel and that expected by the treatment planning system (TPS). Another analysis compared the relative absorbed dose profiles in the measured and expected dose distributions extracted along indicated lines of the volume, and the results were also in agreement. The gamma index analysis was also applied to the data, and a high pass rate was achieved (88.4% for analysis using 3%/3 mm and 96.5% using 4%/4 mm). The full three-dimensional analysis compared the dose-volume histograms measured for the planning volumes with those expected by the treatment planning system; the overlap of the curves showed these results to be in good agreement as well. Conclusions: These results show that MAGIC-f gel is promising for three-dimensional dose distribution measurements.

  3. Improvement on post-OPC verification efficiency for contact/via coverage check by final CD biasing of metal lines and considering their location on the metal layout

    NASA Astrophysics Data System (ADS)

    Kim, Youngmi; Choi, Jae-Young; Choi, Kwangseon; Choi, Jung-Hoe; Lee, Sooryong

    2011-04-01

    As IC design complexity keeps increasing, it is more and more difficult to ensure correct pattern transfer after optical proximity correction (OPC), due to the continuous reduction of layout dimensions and the lithographic limit set by the k1 factor. To guarantee imaging fidelity, resolution enhancement technologies (RET) such as off-axis illumination (OAI), different types of phase shift masks, and OPC techniques have been developed. For model-based OPC, post-OPC verification solutions that cross-check the contour image against the target layout continue to be developed: methods for generating contours and matching them to the target structure, and methods for filtering and sorting patterns to eliminate false errors and duplicates. Detecting only real errors while excluding false errors is the most important requirement for an accurate and fast verification process, saving not only review time and engineering resources but also wafer process time. In the typical case of post-OPC verification for metal-contact/via coverage (CC) checks, the verification solution reports a huge number of errors due to borderless design, so it is too difficult to review and correct all of them. This can cause OPC engineers to miss real defects and, at the least, may delay time to market. In this paper, we studied methods for increasing the efficiency of post-OPC verification, especially for CC checks. For metal layers, the final CD after the etch process shows various CD biases depending on the distance to neighboring patterns, so it is more reasonable to consider the final metal shape when confirming the contact/via coverage. Through the optimization of the biasing rule for different pitches and shapes of metal lines, we obtained more accurate and efficient verification results and decreased the review time needed to find real errors. In summary, we suggest increasing the efficiency of the OPC verification process by applying a simple biasing rule to the metal layout instead of an etch model.

  4. Sensor Location Problem Optimization for Traffic Network with Different Spatial Distributions of Traffic Information.

    PubMed

    Bao, Xu; Li, Haijian; Qin, Lingqiao; Xu, Dongwei; Ran, Bin; Rong, Jian

    2016-10-27

    To obtain adequate traffic information, the density of traffic sensors should be sufficiently high to cover the entire transportation network. However, deploying sensors densely over the entire network may not be realistic for practical applications due to the budgetary constraints of traffic management agencies. This paper describes several possible spatial distributions of traffic information credibility and proposes corresponding sensor information credibility functions to describe these spatial distribution properties. A maximum benefit model and its simplified form are proposed to solve the traffic sensor location problem. The relationships between the benefit and the number of sensors are formulated for the different sensor information credibility functions. Next, the models are extended and analytic results are derived. For each case, the maximum benefit and the optimal number and spacing of sensors are obtained, and analytic formulations of the optimal sensor locations are derived as well. Finally, a numerical example is presented to verify the validity and applicability of the proposed models for solving a network sensor location problem. The results show that the optimal number of sensors for segments with different model parameters in an entire freeway network can be calculated. It can also be concluded that the optimal sensor spacing is independent of end restrictions but depends on the values of the model parameters that represent the physical conditions of the sensors and roads.

  5. Sensor Location Problem Optimization for Traffic Network with Different Spatial Distributions of Traffic Information

    PubMed Central

    Bao, Xu; Li, Haijian; Qin, Lingqiao; Xu, Dongwei; Ran, Bin; Rong, Jian

    2016-01-01

    To obtain adequate traffic information, the density of traffic sensors should be sufficiently high to cover the entire transportation network. However, deploying sensors densely over the entire network may not be realistic for practical applications due to the budgetary constraints of traffic management agencies. This paper describes several possible spatial distributions of traffic information credibility and proposes corresponding sensor information credibility functions to describe these spatial distribution properties. A maximum benefit model and its simplified form are proposed to solve the traffic sensor location problem. The relationships between the benefit and the number of sensors are formulated for the different sensor information credibility functions. Next, the models are extended and analytic results are derived. For each case, the maximum benefit and the optimal number and spacing of sensors are obtained, and analytic formulations of the optimal sensor locations are derived as well. Finally, a numerical example is presented to verify the validity and applicability of the proposed models for solving a network sensor location problem. The results show that the optimal number of sensors for segments with different model parameters in an entire freeway network can be calculated. It can also be concluded that the optimal sensor spacing is independent of end restrictions but depends on the values of the model parameters that represent the physical conditions of the sensors and roads. PMID:27801794
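
    A toy version of the maximum-benefit trade-off, assuming one of the credibility shapes the paper considers (exponential decay with distance) and invented decay-rate and cost figures:

      # Optimal sensor count/spacing under exponentially decaying credibility.
      import numpy as np

      L = 100.0          # freeway length, km
      beta = 0.5         # credibility decay rate, 1/km (assumed)
      c_sensor = 1.2     # cost per sensor, in benefit units (assumed)

      def net_benefit(n):
          s = L / n                                  # uniform spacing
          # each sensor covers s/2 on both sides: integral of exp(-beta * d)
          covered = n * 2 * (1 - np.exp(-beta * s / 2)) / beta
          return covered - c_sensor * n

      ns = np.arange(1, 201)
      best = ns[np.argmax([net_benefit(n) for n in ns])]
      print(f"optimal sensor count: {best}, spacing: {L/best:.2f} km")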

  6. Experimental Verification of Modeled Thermal Distribution Produced by a Piston Source in Physiotherapy Ultrasound

    PubMed Central

    Lopez-Haro, S. A.; Leija, L.

    2016-01-01

    Objectives. To present a quantitative comparison of thermal patterns produced by the piston-in-a-baffle approach with those generated by a physiotherapy ultrasonic device and to show the dependency among thermal patterns and acoustic intensity distributions. Methods. The finite element (FE) method was used to model an ideal acoustic field and the produced thermal pattern to be compared with the experimental acoustic and temperature distributions produced by a real ultrasonic applicator. A thermal model using the measured acoustic profile as input is also presented for comparison. Temperature measurements were carried out with thermocouples inserted in muscle phantom. The insertion place of thermocouples was monitored with ultrasound imaging. Results. Modeled and measured thermal profiles were compared within the first 10 cm of depth. The ideal acoustic field did not adequately represent the measured field having different temperature profiles (errors 10% to 20%). Experimental field was concentrated near the transducer producing a region with higher temperatures, while the modeled ideal temperature was linearly distributed along the depth. The error was reduced to 7% when introducing the measured acoustic field as the input variable in the FE temperature modeling. Conclusions. Temperature distributions are strongly related to the acoustic field distributions. PMID:27999801

  7. Locations of Sampling Stations for Water Quality Monitoring in Water Distribution Networks.

    PubMed

    Rathi, Shweta; Gupta, Rajesh

    2014-04-01

    Water quality must be monitored at salient locations in water distribution networks (WDNs) to assure the safe quality of the water supplied to consumers. Such monitoring stations (MSs) provide warning against accidental contamination. Various objectives, such as demand coverage, time to detection, volume of water contaminated before detection, extent of contamination, expected population affected prior to detection, and detection likelihood, have been considered independently or jointly in determining the optimal number and location of MSs in WDNs. "Demand coverage", defined as the percentage of network demand monitored by a particular monitoring station, is a simple measure for locating MSs. Several methods based on the formulation of a coverage matrix using a pre-specified coverage criterion, together with optimization, have been suggested. The coverage criterion is defined as the minimum percentage of the total flow received at a monitoring station that must have passed through an upstream node for that node to be counted as covered by the station. The number of monitoring stations increases with the value of the coverage criterion, which makes the design of monitoring stations subjective. A simple methodology is proposed herein that iteratively selects MSs in priority order to achieve a targeted demand coverage, as sketched below. The proposed methodology provided the same number and location of MSs for an illustrative network as an optimization method did. Further, the proposed method is simple and avoids the subjectivity that could arise from the choice of coverage criterion. The application of the methodology is also shown on a WDN of the Dharampeth zone (Nagpur city WDN in Maharashtra, India) having 285 nodes and 367 pipes.
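
    The priority-wise selection can be read as a greedy set-cover loop over demand. A minimal sketch with placeholder demands and coverage sets (in the paper these come from flow-path analysis of the network):

      # Greedily add the station covering the most yet-uncovered demand.
      def select_stations(covers, demand, target_fraction=0.9):
          # covers[s] = set of nodes monitored by candidate station s
          total = sum(demand.values())
          chosen, covered = [], set()
          while sum(demand[n] for n in covered) < target_fraction * total:
              best = max(covers,
                         key=lambda s: sum(demand[n] for n in covers[s] - covered))
              if not covers[best] - covered:
                  break                      # no station adds new coverage
              chosen.append(best)
              covered |= covers[best]
          return chosen

      demand = {"A": 40, "B": 25, "C": 20, "D": 15}
      covers = {1: {"A", "B"}, 2: {"B", "C"}, 3: {"C", "D"}, 4: {"D"}}
      print(select_stations(covers, demand))   # -> [1, 3]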

  8. Verification, Validation, and Accreditation Challenges of Distributed Simulation for Space Exploration Technology

    NASA Technical Reports Server (NTRS)

    Thomas, Danny; Hartway, Bobby; Hale, Joe

    2006-01-01

    Throughout its rich history, NASA has invested heavily in sophisticated simulation capabilities. These capabilities reside in NASA facilities across the country - and with partners around the world. NASA's Exploration Systems Mission Directorate (ESMD) has the opportunity to leverage these considerable investments to resolve technical questions relating to its missions. The distributed nature of the assets, both in terms of geography and organization, presents challenges to their combined and coordinated use, but precedents of geographically distributed real-time simulations exist. This paper will show how technological advances in simulation can be employed to address the issues associated with netting NASA simulation assets.

  9. Distributed fiber sensing system with wide frequency response and accurate location

    NASA Astrophysics Data System (ADS)

    Shi, Yi; Feng, Hao; Zeng, Zhoumo

    2016-02-01

    A distributed fiber sensing system merging a Mach-Zehnder interferometer and a phase-sensitive optical time domain reflectometer (Φ-OTDR) is demonstrated for vibration measurement, which requires wide frequency response and accurate location. Two narrow-linewidth lasers with slightly different wavelengths are used to constitute the interferometer and the reflectometer, respectively. A narrow-band fiber Bragg grating separates the two wavelengths. In addition, heterodyne detection is applied to maintain the signal-to-noise ratio of the locating signal. Experimental results show that the novel system has a wide frequency response from 1 Hz to 50 MHz, limited by the sample rate of the data acquisition card, and a spatial resolution of 20 m, corresponding to the 200 ns pulse width, along a 2.5 km fiber link.
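
    The quoted 20 m spatial resolution follows from the probe pulse width. A quick check of the arithmetic, assuming a group index of about 1.5 for silica fibre:

      # OTDR spatial resolution: dz = c * tau / (2 * n_group)
      c = 3.0e8            # vacuum speed of light, m/s
      n_group = 1.5        # group index of silica fibre (assumed)
      tau = 200e-9         # probe pulse width, s
      dz = c * tau / (2 * n_group)
      print(f"spatial resolution: {dz:.0f} m")   # -> 20 m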

  10. Locating of the earth fault of the single-phase at the tree distribution network

    NASA Astrophysics Data System (ADS)

    Guo, Li-Ping

    2013-07-01

    This paper proposes a method that combines C-type traveling-wave ranging with analysis and selection of the line-mode component to locate single-phase earth faults. A high-amplitude, narrow pulse is injected at the head of the line, and the arrival times of the returning waveforms are detected. By comparing the waveforms recorded under normal and fault conditions, the arrival time of the fault reflection is obtained, which determines the fault distance. Because the fault excites traveling-wave oscillations, comparing the oscillation durations of the branch lines identifies the faulted line: the line with the longest duration of vibration is the faulted one. Theoretical analysis, Matlab simulation, and analysis of selected data confirm the correctness of the method and demonstrate that it is practical for fault location in distribution networks.
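
    The ranging step reduces to round-trip timing. A minimal sketch with illustrative numbers (the wave speed and the measured delay are assumptions):

      # C-type traveling-wave ranging: distance from round-trip delay.
      v = 2.9e8             # traveling-wave speed in the line, m/s (assumed)
      t_inject = 0.0        # pulse injection time, s
      t_reflect = 6.2e-6    # arrival of the fault reflection, s (from comparing
                            # the normal and faulted waveforms)
      distance = v * (t_reflect - t_inject) / 2   # round trip -> one way
      print(f"fault at {distance/1000:.2f} km from the injection point")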

  11. Gas Chromatographic Verification of a Mathematical Model: Product Distribution Following Methanolysis Reactions.

    ERIC Educational Resources Information Center

    Lam, R. B.; And Others

    1983-01-01

    Investigated application of binomial statistics to equilibrium distribution of ester systems by employing gas chromatography to verify the mathematical model used. Discusses model development and experimental techniques, indicating the model enables a straightforward extension to symmetrical polyfunctional esters and presents a mathematical basis…
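
    Reading the abstract's binomial model as Binomial(2, p) for a symmetric difunctional ester, with p the assumed per-group extent of methanolysis at equilibrium, the product fractions follow in a few lines:

      # Binomial product distribution for a symmetric diester (illustrative).
      from math import comb

      p = 0.4   # assumed probability that a given ester group has exchanged
      fractions = {k: comb(2, k) * p**k * (1 - p)**(2 - k) for k in range(3)}
      print(fractions)   # 0: unreacted, 1: mixed ester, 2: fully exchanged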

  12. Specification/Verification of Temporal Properties for Distributed Systems: Issues and Approaches. Volume 1

    DTIC Science & Technology

    1990-02-01


  13. Verification of 3D Dose Distributions of a Beta-Emitting Radionuclide Using PRESAGE® Dosimeters

    NASA Astrophysics Data System (ADS)

    Crowder, Mandi; Grant, Ryan; Ibbott, Geoff; Wendt, Richard

    2010-11-01

    Liquid brachytherapy involves the direct administration of a beta-emitting radioactive solution into the selected tissue. The solution does not migrate from the injection point, and the limited range of beta particles produces a three-dimensional dose distribution. We simulated distributions using beta-dose kernels and validated those estimates by irradiating PRESAGE® polyurethane dosimeters, which measure three-dimensional dose distributions through a change in optical density that is proportional to dose. The dosimeters were injected with the beta-emitting radionuclide yttrium-90, exposed for 5.75 days, imaged with optical tomography, and analyzed with radiotherapy software. Dosimeters irradiated with an electron beam to 2 or 3 Gy were used for calibration. The shapes and dose distributions in the PRESAGE® dosimeters were consistent with the predicted dose kernels. Our experiments have laid the groundwork for future application to individualized patient therapy by ultimately designing a treatment plan that conforms to the shape of any appropriate tumor.

  14. Nanofibre distribution in composites manufactured with epoxy reinforced with nanofibrillated cellulose: model prediction and verification

    NASA Astrophysics Data System (ADS)

    Aitomäki, Yvonne; Westin, Mikael; Korpimäki, Jani; Oksman, Kristiina

    2016-07-01

    In this study a model based on simple scattering is developed and used to predict the distribution of nanofibrillated cellulose in composites manufactured by resin transfer moulding (RTM) where the resin contains nanofibres. The model is a Monte Carlo based simulation where nanofibres are randomly chosen from probability density functions for length, diameter and orientation. Their movements are then tracked as they advance through a random arrangement of fibres in defined fibre bundles. The results of the model show that the fabric filters the nanofibres within the first 20 µm unless clear inter-bundle channels are available. The volume fraction of the fabric fibres, flow velocity and size of nanofibre influence this to some extent. To verify the model, an epoxy with 0.5 wt.% Kraft Birch nanofibres was made through a solvent exchange route and stained with a colouring agent. This was infused into a glass fibre fabric using an RTM process. The experimental results confirmed the filtering of the nanofibres by the fibre bundles and their penetration in the fabric via the inter-bundle channels. Hence, the model is a useful tool for visualising the distribution of the nanofibres in composites in this manufacturing process.
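
    The Monte Carlo filtering picture can be caricatured as below; the per-step capture law standing in for the paper's scattering model, and all the numbers, are assumptions:

      # Monte Carlo sketch: nanofibres advancing through a fabric with a
      # size-dependent per-step capture probability.
      import numpy as np

      rng = np.random.default_rng(4)
      n_fibres, step_um, depth_um = 10000, 1.0, 100.0
      vf = 0.5                                   # fabric fibre volume fraction

      length = rng.lognormal(mean=0.0, sigma=0.5, size=n_fibres)      # um
      p_capture = np.clip(0.05 * vf * length, 1e-3, 1.0)  # bigger stops sooner

      steps = rng.geometric(p_capture)           # step at which each is captured
      depth = np.minimum(steps * step_um, depth_um)
      print(f"median penetration: {np.median(depth):.1f} um")
      print(f"fraction stopped in first 20 um: {np.mean(depth <= 20):.2f}")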

  15. Experimental verification of a model describing the intensity distribution from a single mode optical fiber

    SciTech Connect

    Moro, Erik A; Puckett, Anthony D; Todd, Michael D

    2011-01-24

    The intensity distribution of a transmission from a single mode optical fiber is often approximated using a Gaussian-shaped curve. While this approximation is useful for some applications such as fiber alignment, it does not accurately describe transmission behavior off the axis of propagation. In this paper, another model is presented, which describes the intensity distribution of the transmission from a single mode optical fiber. A simple experimental setup is used to verify the model's accuracy, and agreement between model and experiment is established both on and off the axis of propagation. Displacement sensor designs based on the extrinsic optical lever architecture are presented. The behavior of the transmission off the axis of propagation dictates the performance of sensor architectures where large lateral offsets (25-1500 µm) exist between transmitting and receiving fibers. The practical implications of modeling accuracy over this lateral offset region are discussed as they relate to the development of high-performance intensity modulated optical displacement sensors. In particular, the sensitivity, linearity, resolution, and displacement range of a sensor are functions of the relative positioning of the sensor's transmitting and receiving fibers. Sensor architectures with high combinations of sensitivity and displacement range are discussed. It is concluded that the utility of the accurate model is in its predictive capability and that this research could lead to an improved methodology for high-performance sensor design.
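
    For context, the Gaussian-beam approximation that the paper improves upon can be written down directly; the mode-field radius and wavelength below are assumptions, and the paper's point is that this form degrades at the large lateral offsets where such sensors operate:

      # Gaussian-beam irradiance vs radial offset at a fixed axial distance.
      import numpy as np

      def gaussian_intensity(r_um, z_um, w0_um=5.0, wavelength_um=1.55):
          # Irradiance (arbitrary units) at axial distance z, radial offset r.
          zr = np.pi * w0_um**2 / wavelength_um        # Rayleigh range
          w = w0_um * np.sqrt(1 + (z_um / zr)**2)      # beam radius at z
          return (w0_um / w)**2 * np.exp(-2 * r_um**2 / w**2)

      z = 500.0
      for r in (0.0, 100.0, 500.0, 1500.0):
          print(f"r = {r:6.0f} um -> I = {gaussian_intensity(r, z):.3e}")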

  16. Estimation of distributional parameters for censored trace level water quality data. 2. Verification and applications

    USGS Publications Warehouse

    Helsel, D.R.; Gilliom, R.J.

    1986-01-01

    Estimates of distributional parameters (mean, standard deviation, median, interquartile range) are often desired for data sets containing censored observations. Eight methods for estimating these parameters have been evaluated by R. J. Gilliom and D. R. Helsel (this issue) using Monte Carlo simulations. To verify those findings, the same methods are now applied to actual water quality data. The best method (lowest root-mean-squared error (rmse)) over all parameters, sample sizes, and censoring levels is log probability regression (LR), the method found best in the Monte Carlo simulations. Best methods for estimating moment or percentile parameters separately are also identical to the simulations. Reliability of these estimates can be expressed as confidence intervals using rmse and bias values taken from the simulation results. Finally, a new simulation study shows that best methods for estimating uncensored sample statistics from censored data sets are identical to those for estimating population parameters.
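
    A minimal sketch of log-probability regression on a censored sample: fit a line to the logs of the uncensored observations versus their normal scores, then read the censored values off the fitted line before computing summary statistics. The Weibull plotting positions are an assumed detail.

      # Log-probability regression (LR) for left-censored data (illustrative).
      import numpy as np
      from scipy import stats

      def lr_estimates(values, censored):
          # values: all observations; censored: True where value is a detection limit
          n = len(values)
          order = np.argsort(values)
          pp = np.arange(1, n + 1) / (n + 1)            # Weibull plotting positions
          z = stats.norm.ppf(pp)                        # normal scores
          detected = ~censored[order]
          slope, intercept, *_ = stats.linregress(z[detected],
                                                  np.log(values[order][detected]))
          filled = np.exp(intercept + slope * z)        # values on the fitted line
          out = np.where(detected, values[order], filled)  # keep real detections
          return out.mean(), out.std()

      vals = np.array([0.5, 0.5, 0.8, 1.1, 1.9, 2.5, 3.2, 4.8])  # 0.5 = det. limit
      cens = np.array([True, True, False, False, False, False, False, False])
      print(lr_estimates(vals, cens))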

  17. Location, Location, Location!

    ERIC Educational Resources Information Center

    Ramsdell, Kristin

    2004-01-01

    Of prime importance in real estate, location is also a key element in the appeal of romances. Popular geographic settings and historical periods sell, unpopular ones do not--not always with a logical explanation, as the author discovered when she conducted a survey on this topic last year. (Why, for example, are the French Revolution and the…

  18. Atomic Scale Verification of Oxide-Ion Vacancy Distribution near a Single Grain Boundary in YSZ

    PubMed Central

    An, Jihwan; Park, Joong Sun; Koh, Ai Leen; Lee, Hark B.; Jung, Hee Joon; Schoonman, Joop; Sinclair, Robert; Gür, Turgut M.; Prinz, Fritz B.

    2013-01-01

    This study presents atomic scale characterization of grain boundary defect structure in a functional oxide with implications for a wide range of electrochemical and electronic behavior. Indeed, grain boundary engineering can alter transport and kinetic properties by several orders of magnitude. Here we report experimental observation and determination of oxide-ion vacancy concentration near the Σ13 (510)/[001] symmetric tilt grain-boundary of YSZ bicrystal using aberration-corrected TEM operated under negative spherical aberration coefficient imaging condition. We show significant oxygen deficiency due to segregation of oxide-ion vacancies near the grain-boundary core with half-width < 0.6 nm. Electron energy loss spectroscopy measurements with scanning TEM indicated increased oxide-ion vacancy concentration at the grain boundary core. Oxide-ion density distribution near a grain boundary simulated by molecular dynamics corroborated well with experimental results. Such column-by-column quantification of defect concentration in functional materials can provide new insights that may lead to engineered grain boundaries designed for specific functionalities. PMID:24042150

  19. Verification of the efficiency of chemical disinfection and sanitation measures in in-building distribution systems.

    PubMed

    Lenz, J; Linke, S; Gemein, S; Exner, M; Gebel, J

    2010-06-01

    Previous investigations of biofilms generated in a silicone tube model have shown that the number of colony forming units (CFU) can reach 10^7/cm^2 and the total cell count (TCC) of microorganisms can be up to 10^8 cells/cm^2. The present study focuses on the situation in in-building distribution systems. Different chemical disinfectants were tested for their efficacy against drinking water biofilms in silicone tubes: free chlorine (electrochemically activated), chlorine dioxide, hydrogen peroxide (H2O2), silver, and fruit acids. Because manufacturers' instructions for the use of their disinfectants differ widely, three variations of the silicone tube model were developed to simulate practical use conditions: first, continuous treatment; second, intermittent treatment; and third, external disinfection treatment with monitoring for possible biofilm formation using the Hygiene-Monitor. Working experience showed that it is important to know how to handle the individual disinfectants. Every active ingredient has its own optimal application in terms of concentration, exposure time, and physical parameters such as pH, temperature, or redox potential. When used correctly, all products tested were able to reduce the CFU to a value below the detection limit. Most of the active ingredients could not significantly reduce the TCC/cm^2, which means that viable microorganisms may still be present in the system. Thus the question arises: what happened to these cells? In some cases, SEM images of the biofilm matrix after a successful disinfection still showed biofilm residues. According to these results, no general correlation between CFU/cm^2, TCC/cm^2, and the visualized biofilm matrix on the silicone tube surface (SEM) could be demonstrated after treatment with disinfectants.

  20. Commercial milk distribution profiles and production locations. Hanford Environmental Dose Reconstruction Project

    SciTech Connect

    Deonigi, D.E.; Anderson, D.M.; Wilfert, G.L.

    1993-12-01

    The Hanford Environmental Dose Reconstruction (HEDR) Project was established to estimate radiation doses that people could have received from nuclear operations at the Hanford Site since 1944. For this period iodine-131 is the most important offsite contributor to radiation doses from Hanford operations. Consumption of milk from cows that ate vegetation contaminated by iodine-131 is the dominant radiation pathway for individuals who drank milk. Information has been developed on commercial milk cow locations and commercial milk distribution during 1945 and 1951. The year 1945 was selected because during 1945 the largest amount of iodine-131 was released from Hanford facilities in a calendar year; therefore, 1945 was the year in which an individual was likely to have received the highest dose. The year 1951 was selected to provide data for comparing the changes that occurred in commercial milk flows (i.e., sources, processing locations, and market areas) between World War II and the post-war period. To estimate the doses people could have received from this milk flow, it is necessary to estimate the amount of milk people consumed, the source of the milk, the specific feeding regime used for milk cows, and the amount of iodine-131 contamination deposited on feed.

  1. Commercial milk distribution profiles and production locations. Hanford Environmental Dose Reconstruction Project

    SciTech Connect

    Deonigi, D.E.; Anderson, D.M.; Wilfert, G.L.

    1994-04-01

    The Hanford Environmental Dose Reconstruction (HEDR) Project was established to estimate radiation doses that people could have received from nuclear operations at the Hanford Site since 1944. For this period iodine-131 is the most important offsite contributor to radiation doses from Hanford operations. Consumption of milk from cows that ate vegetation contaminated by iodine-131 is the dominant radiation pathway for individuals who drank milk (Napier 1992). Information has been developed on commercial milk cow locations and commercial milk distribution during 1945 and 1951. The year 1945 was selected because during 1945 the largest amount of iodine-131 was released from Hanford facilities in a calendar year (Heeb 1993); therefore, 1945 was the year in which an individual was likely to have received the highest dose. The year 1951 was selected to provide data for comparing the changes that occurred in commercial milk flows (i.e., sources, processing locations, and market areas) between World War II and the post-war period. To estimate the doses people could have received from this milk flow, it is necessary to estimate the amount of milk people consumed, the source of the milk, the specific feeding regime used for milk cows, and the amount of iodine-131 contamination deposited on feed.

  2. Redshift distributions of galaxies in the Dark Energy Survey Science Verification shear catalogue and implications for weak lensing

    NASA Astrophysics Data System (ADS)

    Bonnett, C.; Troxel, M. A.; Hartley, W.; Amara, A.; Leistedt, B.; Becker, M. R.; Bernstein, G. M.; Bridle, S. L.; Bruderer, C.; Busha, M. T.; Carrasco Kind, M.; Childress, M. J.; Castander, F. J.; Chang, C.; Crocce, M.; Davis, T. M.; Eifler, T. F.; Frieman, J.; Gangkofner, C.; Gaztanaga, E.; Glazebrook, K.; Gruen, D.; Kacprzak, T.; King, A.; Kwan, J.; Lahav, O.; Lewis, G.; Lidman, C.; Lin, H.; MacCrann, N.; Miquel, R.; O'Neill, C. R.; Palmese, A.; Peiris, H. V.; Refregier, A.; Rozo, E.; Rykoff, E. S.; Sadeh, I.; Sánchez, C.; Sheldon, E.; Uddin, S.; Wechsler, R. H.; Zuntz, J.; Abbott, T.; Abdalla, F. B.; Allam, S.; Armstrong, R.; Banerji, M.; Bauer, A. H.; Benoit-Lévy, A.; Bertin, E.; Brooks, D.; Buckley-Geer, E.; Burke, D. L.; Capozzi, D.; Carnero Rosell, A.; Carretero, J.; Cunha, C. E.; D'Andrea, C. B.; da Costa, L. N.; DePoy, D. L.; Desai, S.; Diehl, H. T.; Dietrich, J. P.; Doel, P.; Fausti Neto, A.; Fernandez, E.; Flaugher, B.; Fosalba, P.; Gerdes, D. W.; Gruendl, R. A.; Honscheid, K.; Jain, B.; James, D. J.; Jarvis, M.; Kim, A. G.; Kuehn, K.; Kuropatkin, N.; Li, T. S.; Lima, M.; Maia, M. A. G.; March, M.; Marshall, J. L.; Martini, P.; Melchior, P.; Miller, C. J.; Neilsen, E.; Nichol, R. C.; Nord, B.; Ogando, R.; Plazas, A. A.; Reil, K.; Romer, A. K.; Roodman, A.; Sako, M.; Sanchez, E.; Santiago, B.; Smith, R. C.; Soares-Santos, M.; Sobreira, F.; Suchyta, E.; Swanson, M. E. C.; Tarle, G.; Thaler, J.; Thomas, D.; Vikram, V.; Walker, A. R.; Dark Energy Survey Collaboration

    2016-08-01

    We present photometric redshift estimates for galaxies used in the weak lensing analysis of the Dark Energy Survey Science Verification (DES SV) data. Four model- or machine learning-based photometric redshift methods—annz2, bpz calibrated against BCC-Ufig simulations, skynet, and tpz—are analyzed. For training, calibration, and testing of these methods, we construct a catalogue of spectroscopically confirmed galaxies matched against DES SV data. The performance of the methods is evaluated against the matched spectroscopic catalogue, focusing on metrics relevant for weak lensing analyses, with additional validation against COSMOS photo-z's. From the galaxies in the DES SV shear catalogue, which have mean redshift 0.72 ± 0.01 over the range 0.3 < z < 1.3, we construct three tomographic bins with means of z = {0.45, 0.67, 1.00}. These bins each have systematic uncertainties δz ≲ 0.05 in the mean of the fiducial skynet photo-z n(z). We propagate the errors in the redshift distributions through to their impact on cosmological parameters estimated with cosmic shear, and find that they cause shifts in the value of σ8 of approximately 3%. This shift is within the one sigma statistical errors on σ8 for the DES SV shear catalogue. We further study the potential impact of systematic differences on the critical surface density, Σcrit, finding levels of bias safely less than the statistical power of DES SV data. We recommend a final Gaussian prior for the photo-z bias in the mean of n(z) of width 0.05 for each of the three tomographic bins, and show that this is a sufficient bias model for the corresponding cosmology analysis.

  3. Redshift distributions of galaxies in the Dark Energy Survey Science Verification shear catalogue and implications for weak lensing

    SciTech Connect

    Bonnett, C.; Troxel, M. A.; Hartley, W.; Amara, A.; Leistedt, B.; Becker, M. R.; Bernstein, G. M.; Bridle, S. L.; Bruderer, C.; Busha, M. T.; Carrasco Kind, M.; Childress, M. J.; Castander, F. J.; Chang, C.; Crocce, M.; Davis, T. M.; Eifler, T. F.; Frieman, J.; Gangkofner, C.; Gaztanaga, E.; Glazebrook, K.; Gruen, D.; Kacprzak, T.; King, A.; Kwan, J.; Lahav, O.; Lewis, G.; Lidman, C.; Lin, H.; MacCrann, N.; Miquel, R.; O’Neill, C. R.; Palmese, A.; Peiris, H. V.; Refregier, A.; Rozo, E.; Rykoff, E. S.; Sadeh, I.; Sánchez, C.; Sheldon, E.; Uddin, S.; Wechsler, R. H.; Zuntz, J.; Abbott, T.; Abdalla, F. B.; Allam, S.; Armstrong, R.; Banerji, M.; Bauer, A. H.; Benoit-Lévy, A.; Bertin, E.; Brooks, D.; Buckley-Geer, E.; Burke, D. L.; Capozzi, D.; Carnero Rosell, A.; Carretero, J.; Cunha, C. E.; D’Andrea, C. B.; da Costa, L. N.; DePoy, D. L.; Desai, S.; Diehl, H. T.; Dietrich, J. P.; Doel, P.; Fausti Neto, A.; Fernandez, E.; Flaugher, B.; Fosalba, P.; Gerdes, D. W.; Gruendl, R. A.; Honscheid, K.; Jain, B.; James, D. J.; Jarvis, M.; Kim, A. G.; Kuehn, K.; Kuropatkin, N.; Li, T. S.; Lima, M.; Maia, M. A. G.; March, M.; Marshall, J. L.; Martini, P.; Melchior, P.; Miller, C. J.; Neilsen, E.; Nichol, R. C.; Nord, B.; Ogando, R.; Plazas, A. A.; Reil, K.; Romer, A. K.; Roodman, A.; Sako, M.; Sanchez, E.; Santiago, B.; Smith, R. C.; Soares-Santos, M.; Sobreira, F.; Suchyta, E.; Swanson, M. E. C.; Tarle, G.; Thaler, J.; Thomas, D.; Vikram, V.; Walker, A. R.

    2016-08-30

    Here we present photometric redshift estimates for galaxies used in the weak lensing analysis of the Dark Energy Survey Science Verification (DES SV) data. Four model- or machine learning-based photometric redshift methods—annz2, bpz calibrated against BCC-Ufig simulations, skynet, and tpz—are analyzed. For training, calibration, and testing of these methods, we construct a catalogue of spectroscopically confirmed galaxies matched against DES SV data. The performance of the methods is evaluated against the matched spectroscopic catalogue, focusing on metrics relevant for weak lensing analyses, with additional validation against COSMOS photo-z's. From the galaxies in the DES SV shear catalogue, which have mean redshift 0.72 ± 0.01 over the range 0.3 < z < 1.3, we construct three tomographic bins with means of z = {0.45, 0.67, 1.00}. These bins each have systematic uncertainties δz ≲ 0.05 in the mean of the fiducial skynet photo-z n(z). We propagate the errors in the redshift distributions through to their impact on cosmological parameters estimated with cosmic shear, and find that they cause shifts in the value of σ8 of approximately 3%. This shift is within the one sigma statistical errors on σ8 for the DES SV shear catalogue. We further study the potential impact of systematic differences on the critical surface density, Σcrit, finding levels of bias safely less than the statistical power of DES SV data. In conclusion, we recommend a final Gaussian prior for the photo-z bias in the mean of n(z) of width 0.05 for each of the three tomographic bins, and show that this is a sufficient bias model for the corresponding cosmology analysis.

  4. Distributed solar photovoltaic array location and extent dataset for remote sensing object identification

    PubMed Central

    Bradbury, Kyle; Saboo, Raghav; L. Johnson, Timothy; Malof, Jordan M.; Devarajan, Arjun; Zhang, Wuming; M. Collins, Leslie; G. Newell, Richard

    2016-01-01

    Earth-observing remote sensing data, including aerial photography and satellite imagery, offer a snapshot of the world from which we can learn about the state of natural resources and the built environment. The components of energy systems that are visible from above can be automatically assessed with these remote sensing data when processed with machine learning methods. Here, we focus on the information gap in distributed solar photovoltaic (PV) arrays, of which there is limited public data on solar PV deployments at small geographic scales. We created a dataset of solar PV arrays to initiate and develop the process of automatically identifying solar PV locations using remote sensing imagery. This dataset contains the geospatial coordinates and border vertices for over 19,000 solar panels across 601 high-resolution images from four cities in California. Dataset applications include training object detection and other machine learning algorithms that use remote sensing imagery, developing specific algorithms for predictive detection of distributed PV systems, estimating installed PV capacity, and analysis of the socioeconomic correlates of PV deployment. PMID:27922592

  5. Locating illicit connections in storm water sewers using fiber-optic distributed temperature sensing.

    PubMed

    Hoes, O A C; Schilperoort, R P S; Luxemburg, W M J; Clemens, F H L R; van de Giesen, N C

    2009-12-01

    A technique using distributed temperature sensing (DTS) has been developed to find illicit household sewage connections to storm water systems in the Netherlands. DTS allows for the accurate measurement of temperature along a fiber-optic cable, with high spatial (2 m) and temporal (30 s) resolution. We inserted a fiber-optic cable of 1300 m in two storm water drains. At certain locations, significant temperature differences with an intermittent character were measured, indicating inflow of water that was not storm water. In all cases, we found that foul water from households or companies entered the storm water system through an illicit sewage connection. The method of using temperature differences to detect illicit connections in storm water networks is discussed, and the technique of using fiber-optic cables for distributed temperature sensing is explained in detail. The DTS method is a reliable, inexpensive and practically feasible method for detecting illicit connections to storm water systems that does not require access to private property.
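
    A sketch of how such intermittent warm inflows might be flagged in a DTS record; the thresholds and the synthetic trace are assumptions:

      # Flag cable positions that repeatedly run warm against the local background.
      import numpy as np

      rng = np.random.default_rng(5)
      n_pos, n_times = 650, 120                # 2 m bins along 1300 m, 30 s steps
      trace = rng.normal(12.0, 0.1, size=(n_times, n_pos))   # cool storm water
      trace[40:60, 310] += 8.0                 # warm intermittent inflow at ~620 m

      background = np.median(trace, axis=0)
      excursions = (trace - background) > 3.0  # K above local background
      hits = excursions.sum(axis=0)
      suspect = np.flatnonzero(hits >= 10)     # repeatedly warm positions
      print("suspect positions (m):", suspect * 2)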

  6. [Spatial correlation of active mounds locative distribution of Solenopsis invicta Buren polygyne populations].

    PubMed

    Lu, Yong-yue; Li, Ning-dong; Liang, Guang-wen; Zeng, Ling

    2007-01-01

    Using geostatistical methods, this paper studied the spatial distribution patterns of the active mounds of Solenopsis invicta Buren polygyne populations in Wuchuan and Shenzhen, and built spherical models of the semivariance as a function of the separation distance between mounds. The semivariograms were computed along the east-west and south-north directions and were clearly positively correlated with the separation distances, revealing that the active mounds in the surveyed areas were spatially dependent. The ranges of the 5 spherical models constructed for 5 sampling plots in Wuchuan were 9.1 m, 7.6 m, 23.5 m, 7.5 m and 14.5 m, respectively, with an average of 12.4 m; mounds separated by less than the range were significantly spatially correlated. There was also randomness in the spatial distribution of active mounds, with randomicity indices (nugget/sill) of 0.7034, 0.9247, 0.4398, 1.1196 and 0.4624, respectively. In Shenzhen, the relationships between separation distances and semivariances were described by 7 spherical models, with ranges of 14.5 m, 11.2 m, 10.8 m, 17.6 m, 11.3 m, 9.9 m and 12.8 m, respectively, averaging 12.6 m.
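
    The geostatistics here rest on the empirical semivariogram and a fitted spherical model (nugget, sill, range). A compact sketch with synthetic mound data; the nugget/sill ratio is the randomicity index used in the abstract:

      # Empirical semivariogram plus a spherical model (illustrative data).
      import numpy as np

      def empirical_semivariogram(xy, values, bins):
          d = np.linalg.norm(xy[:, None, :] - xy[None, :, :], axis=-1)
          g = 0.5 * (values[:, None] - values[None, :])**2
          iu = np.triu_indices(len(xy), k=1)
          idx = np.digitize(d[iu], bins)
          return np.array([g[iu][idx == b].mean() for b in range(1, len(bins))])

      def spherical(h, nugget, sill, rng_):
          h = np.asarray(h, dtype=float)
          inside = nugget + (sill - nugget) * (1.5 * h / rng_ - 0.5 * (h / rng_)**3)
          return np.where(h < rng_, inside, sill)

      rng = np.random.default_rng(6)
      xy = rng.uniform(0, 50, size=(200, 2))                 # mound coordinates, m
      vals = np.sin(xy[:, 0] / 8) + rng.normal(0, 0.3, 200)  # spatially structured
      bins = np.arange(0, 30, 3.0)
      print(empirical_semivariogram(xy, vals, bins))
      print(spherical(bins[1:], nugget=0.05, sill=0.6, rng_=12.4))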

  7. Distributed solar photovoltaic array location and extent dataset for remote sensing object identification

    NASA Astrophysics Data System (ADS)

    Bradbury, Kyle; Saboo, Raghav; Johnson, Timothy L.; Malof, Jordan M.; Devarajan, Arjun; Zhang, Wuming; Collins, Leslie M.; Newell, Richard G.

    2016-12-01

    Earth-observing remote sensing data, including aerial photography and satellite imagery, offer a snapshot of the world from which we can learn about the state of natural resources and the built environment. The components of energy systems that are visible from above can be automatically assessed with these remote sensing data when processed with machine learning methods. Here, we focus on the information gap in distributed solar photovoltaic (PV) arrays, for which there is limited public data on deployments at small geographic scales. We created a dataset of solar PV arrays to initiate and develop the process of automatically identifying solar PV locations using remote sensing imagery. This dataset contains the geospatial coordinates and border vertices for over 19,000 solar panels across 601 high-resolution images from four cities in California. Dataset applications include training object detection and other machine learning algorithms that use remote sensing imagery, developing specific algorithms for predictive detection of distributed PV systems, estimating installed PV capacity, and analysis of the socioeconomic correlates of PV deployment.

  8. Distributed solar photovoltaic array location and extent dataset for remote sensing object identification.

    PubMed

    Bradbury, Kyle; Saboo, Raghav; Johnson, Timothy L; Malof, Jordan M; Devarajan, Arjun; Zhang, Wuming; Collins, Leslie M; Newell, Richard G

    2016-12-06

    Earth-observing remote sensing data, including aerial photography and satellite imagery, offer a snapshot of the world from which we can learn about the state of natural resources and the built environment. The components of energy systems that are visible from above can be automatically assessed with these remote sensing data when processed with machine learning methods. Here, we focus on the information gap in distributed solar photovoltaic (PV) arrays, for which there is limited public data on deployments at small geographic scales. We created a dataset of solar PV arrays to initiate and develop the process of automatically identifying solar PV locations using remote sensing imagery. This dataset contains the geospatial coordinates and border vertices for over 19,000 solar panels across 601 high-resolution images from four cities in California. Dataset applications include training object detection and other machine learning algorithms that use remote sensing imagery, developing specific algorithms for predictive detection of distributed PV systems, estimating installed PV capacity, and analysis of the socioeconomic correlates of PV deployment.
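
    As a hedged illustration of one listed application, estimating installed capacity from the dataset's border vertices: the panel area follows from the shoelace formula, and an assumed module power density (not part of the dataset description) converts area to a rough capacity figure. The vertex values below are made up:

    ```python
    import numpy as np

    def polygon_area_m2(vertices):
        """Shoelace formula for the area of a panel border polygon.
        `vertices` is a list of (x, y) coordinates in metres, i.e. already
        projected from the image's geospatial coordinates."""
        x, y = np.asarray(vertices, dtype=float).T
        return 0.5 * abs(np.dot(x, np.roll(y, 1)) - np.dot(y, np.roll(x, 1)))

    # Hypothetical 2 m x 5 m array footprint; ~150 W/m2 is a rough module
    # power density, so this is only an order-of-magnitude estimate.
    verts = [(0, 0), (5, 0), (5, 2), (0, 2)]
    area = polygon_area_m2(verts)
    print(f"{area:.1f} m2, about {area * 0.150:.2f} kW at an assumed 150 W/m2")
    ```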

  9. Relation Between Sprite Distribution and Source Locations of VHF Pulses Derived From JEM- GLIMS Measurements

    NASA Astrophysics Data System (ADS)

    Sato, Mitsuteru; Mihara, Masahiro; Ushio, Tomoo; Morimoto, Takeshi; Kikuchi, Hiroshi; Adachi, Toru; Suzuki, Makoto; Yamazaki, Atsushi; Takahashi, Yukihiro

    2015-04-01

    JEM-GLIMS has been conducting comprehensive nadir observations of lightning and TLEs using optical instruments and electromagnetic wave receivers since November 2012. Between November 20, 2012 and November 30, 2014, JEM-GLIMS detected 5,048 lightning events. Of these, 567 events were TLEs, mostly elves. To identify sprite occurrences from the transient optical flash data, the following analysis steps are necessary: (1) subtraction of the appropriately scaled wideband camera data from the narrowband camera data; (2) calculation of the intensity ratio between different spectrophotometer channels; and (3) estimation of the polarity and CMC of the parent CG discharges using ground-based ELF measurement data. A synthetic comparison of these results confirmed that JEM-GLIMS succeeded in detecting sprite events. The VHF receiver (VITF) onboard JEM-GLIMS uses two patch-type antennas separated by a 1.6-m interval and can detect VHF pulses emitted by lightning discharges in the 70-100 MHz frequency range. Using both an interferometric technique and a group-delay technique, we can estimate the source locations of VHF pulses excited by lightning discharges. In an event detected at 06:41:15.68565 UT on June 12, 2014 over central North America, the sprite was displaced horizontally by 20 km from the peak location of the parent lightning emission. In this event, a total of 180 VHF pulses were simultaneously detected by VITF. Detailed analysis of these VHF pulses showed that the majority of the source locations were near the area of the dim lightning emission, which may imply that the VHF pulses were associated with the in-cloud lightning current. At the presentation, we will show a detailed comparison between the spatiotemporal characteristics of the sprite emission and the source locations of VHF pulses excited by the parent lightning
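
    For the two-antenna geometry described (a 1.6 m baseline), a one-axis sketch of how an inter-antenna delay maps to an angle of arrival; actual interferometric location uses phase differences plus the group-delay technique, so this is only the core relation, with an illustrative delay value:

    ```python
    import numpy as np

    C = 299_792_458.0  # speed of light, m/s

    def arrival_angle_deg(delay_s, baseline_m=1.6):
        """Angle of arrival (from the baseline normal, one axis only) for a
        plane wave reaching two antennas `baseline_m` apart with a measured
        inter-antenna delay `delay_s`."""
        s = np.clip(C * delay_s / baseline_m, -1.0, 1.0)
        return np.degrees(np.arcsin(s))

    # A group delay of 2 ns corresponds to an arrival angle of about 22 deg
    print(arrival_angle_deg(2e-9))
    ```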

  10. Distribution of persistent organohalogen compounds in pine needles from selected locations in Kentucky and Georgia, USA.

    PubMed

    Loganathan, Bommanna G; Kumar, Kurunthachalam Senthil; Seaford, Kosta D; Sajwan, Kenneth S; Hanari, Nobuyasu; Yamashita, Nobuyoshi

    2008-04-01

    Epicuticular wax of pine needles accumulates organic pollutants from the atmosphere, and pine needle samples have been used for monitoring both local and regional distributions of semivolatile organic air pollutants. One-year-old pine needles collected from residential and industrial locations in western Kentucky and from the vicinity of Linden Chemicals and Plastics, a Superfund Site at Brunswick, Georgia, were analyzed for polychlorinated biphenyls (PCBs), major chlorinated pesticides, and polychlorinated naphthalenes (PCNs). Total PCB concentrations in pine needles from Kentucky ranged from 5.2 to 12 ng/g dry weight (dw). These sites were comparatively less polluted than those from the Superfund Site, where total PCB concentrations in pine needles were in the range of 15-34 ng/g dw. Total chlorinated pesticide concentrations in pine needles from Kentucky ranged from 3.5 to 10 ng/g dw. A similar range of chlorinated pesticide concentrations (7.3-12 ng/g dw) was also found in pine needle samples from the Superfund Site. Total PCN concentrations in pine needles from Kentucky ranged from 76 to 150 pg/g dw. At the Superfund Site, total PCN concentrations ranged from 610 to 38,000 pg/g dw. When the toxic equivalencies (TEQs) of PCBs in pine needles were compared, the Kentucky values (0.03-0.11 pg/g dw) were relatively lower than the TEQs at the Superfund Site (0.24-0.48 pg/g dw). The TEQs of PCNs from Kentucky (0.004-0.067 pg/g dw) were much lower than the TEQs from locations near the Superfund Site (0.30-19 pg/g dw). The results revealed that pine needles are excellent, passive, nondestructive bioindicators for monitoring and evaluating PCBs, chlorinated pesticides, and PCNs.
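
    A sketch of the toxic equivalency arithmetic behind such TEQ comparisons: TEQ is the TEF-weighted sum of congener concentrations. The concentrations below are hypothetical, and the TEF values are WHO-style factors used for illustration, not the study's data:

    ```python
    # Toxic equivalency: TEQ = sum(concentration_i * TEF_i) over congeners.
    # Concentrations in pg/g dw (made up); TEFs are WHO-style illustrative values.
    sample = {"PCB-126": 0.05, "PCB-169": 0.6, "PCB-118": 12.0}
    tef = {"PCB-126": 0.1, "PCB-169": 0.03, "PCB-118": 0.00003}

    teq = sum(conc * tef[congener] for congener, conc in sample.items())
    print(f"TEQ = {teq:.4f} pg/g dw")
    ```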

  11. Genomic distribution of AFLP markers relative to gene locations for different eukaryotic species

    PubMed Central

    2013-01-01

    Background Amplified fragment length polymorphism (AFLP) markers are frequently used for a wide range of studies, such as genome-wide mapping, population genetic diversity estimation, hybridization and introgression studies, phylogenetic analyses, and detection of signatures of selection. An important issue to be addressed for some of these fields is the distribution of the markers across the genome, particularly in relation to gene sequences. Results Using in silico restriction fragment analysis of the genomes of nine eukaryotic species, we characterise the distribution of AFLP fragments across the genome and, particularly, in relation to gene locations. First, we identify the physical position of markers across the chromosomes of all species. An observed accumulation of fragments around (peri)centromeric regions in some species is produced by repeated sequences, and this accumulation disappears when AFLP bands rather than fragments are considered. Second, we calculate the percentage of AFLP markers positioned within gene sequences. For the typical EcoRI/MseI enzyme pair, this ranges between 28 and 87% and is usually larger than that expected by chance because of the higher GC content of gene sequences relative to intergenic ones. In agreement with this, the use of enzyme pairs with GC-rich restriction sites substantially increases the above percentages. For example, using the enzyme system SacI/HpaII, 86% of AFLP markers are located within gene sequences in A. thaliana, and 100% of markers in Plasmodium falciparum. We further find that for a typical trait controlled by 50 genes of average size, if 1000 AFLPs are used in a study, the number of those within 1 kb distance from any of the genes would be only about 1–2, and only about 50% of the genes would have markers within that distance. Conclusions The high coverage of AFLP markers across the genomes and the high proportion of markers within or close to gene sequences make them suitable for genome scans and
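
    A minimal sketch of in silico restriction fragment analysis for the EcoRI/MseI pair, keeping only fragments flanked by one rare and one frequent cut, the ones AFLP preferentially amplifies; cut-site offsets within the recognition sequences are ignored for brevity, and the demo sequence is made up:

    ```python
    import re

    def aflp_fragments(seq, rare="GAATTC", frequent="TTAA"):
        """Cut `seq` at EcoRI (rare) and MseI (frequent) recognition sites
        and return lengths of fragments flanked by one rare and one frequent
        cut. Cut positions are simplified to the start of each site."""
        cuts = sorted(
            [(m.start(), "R") for m in re.finditer(rare, seq)]
            + [(m.start(), "F") for m in re.finditer(frequent, seq)]
        )
        frags = []
        for (p1, e1), (p2, e2) in zip(cuts, cuts[1:]):
            if {e1, e2} == {"R", "F"}:  # one EcoRI end, one MseI end
                frags.append(p2 - p1)
        return frags

    demo = "AAGAATTCAAAATTAAGGGGAATTCTTTTTAAACGAATTC"
    print(aflp_fragments(demo))
    ```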

  12. Influence of Interhemispheric Asymmetry in Volcanic Forcing on ITCZ Location and Oxygen Isotope Distribution

    NASA Astrophysics Data System (ADS)

    Colose, C.; LeGrande, A. N.; Vuille, M. F.

    2014-12-01

    Volcanic eruptions are a dominant source of natural forced variability during the Common Era. Although transient, eruptions strongly cool the planet through the liberation of sulfur gases that enter the stratosphere (converting to sulfate aerosol) and scatter sunlight. In particular, such events produced the largest-amplitude radiative forcings perturbing the terrestrial climate during the Last Millennium. Previous studies have highlighted the global climate impact of large volcanic events, including the role of the latitude and time of year of a given eruption. Here, we focus on the influence of hemispheric asymmetry in Aerosol Optical Depth (AOD) and its projection onto the tropical hydrologic cycle. This is assessed using a suite of simulations from a fully coupled isotope-enabled General Circulation Model (NASA GISS ModelE2-R) run from 850-2005 CE. This study builds upon prior work demonstrating the role of inter-hemispheric forcing gradients in Intertropical Convergence Zone (ITCZ) location. In addition to unveiling the physical mechanisms that alter tropical hydroclimate, we highlight the anticipated tropical oxygen isotope distribution following large eruptions. Thus, through the vehicle of an isotope-enabled model, we formulate a potentially falsifiable prediction for how volcanic forcing may manifest itself in high-resolution proxies across the tropics.

  13. Optimal Sensor Placement for Leak Location in Water Distribution Networks Using Genetic Algorithms

    PubMed Central

    Casillas, Myrna V.; Puig, Vicenç; Garza-Castañón, Luis E.; Rosich, Albert

    2013-01-01

    This paper proposes a new sensor placement approach for leak location in water distribution networks (WDNs). The sensor placement problem is formulated as an integer optimization problem. The optimization criterion consists in minimizing the number of non-isolable leaks according to the isolability criteria introduced. Because of the large size and non-linear integer nature of the resulting optimization problem, genetic algorithms (GAs) are used as the solution approach. The obtained results are compared with a semi-exhaustive search method with higher computational effort, proving that the GA allows one to find near-optimal solutions with less computational load. Moreover, three ways of increasing the robustness of the GA-based sensor placement method have been proposed, using a time horizon analysis, a distance-based scoring and considering different leak sizes. A great advantage of the proposed methodology is that it does not depend on the isolation method chosen by the user, as long as it is based on leak sensitivity analysis. Experiments in two networks allow us to evaluate the performance of the proposed approach. PMID:24193099
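
    A toy version of the GA formulation, assuming a hypothetical binary leak-sensitivity matrix: the fitness counts leak pairs that remain isolable (i.e. have distinguishable sensor signatures), the complement of the paper's minimisation criterion. Population size, mutation scheme and all data are illustrative assumptions:

    ```python
    import random

    random.seed(1)
    N_NODES, N_LEAKS, K, POP, GENS = 20, 30, 3, 40, 60

    # Hypothetical sensitivity matrix: sig[l][n] = True if leak l is seen by
    # a sensor at node n (in practice this comes from a hydraulic model).
    sig = [[random.random() < 0.3 for n in range(N_NODES)] for l in range(N_LEAKS)]

    def isolable_pairs(sensors):
        """Count leak pairs whose signatures differ on the chosen sensors."""
        signatures = [tuple(sig[l][n] for n in sensors) for l in range(N_LEAKS)]
        return sum(signatures[i] != signatures[j]
                   for i in range(N_LEAKS) for j in range(i + 1, N_LEAKS))

    def mutate(ind):
        out = list(ind)
        out[random.randrange(K)] = random.randrange(N_NODES)
        return tuple(sorted(set(out))) if len(set(out)) == K else ind

    pop = [tuple(sorted(random.sample(range(N_NODES), K))) for _ in range(POP)]
    for _ in range(GENS):
        pop.sort(key=isolable_pairs, reverse=True)
        parents = pop[:POP // 2]                      # truncation selection
        pop = parents + [mutate(random.choice(parents)) for _ in parents]

    best = max(pop, key=isolable_pairs)
    print(best, isolable_pairs(best))
    ```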

  14. Lexical distributional cues, but not situational cues, are readily used to learn abstract locative verb-structure associations.

    PubMed

    Twomey, Katherine E; Chang, Franklin; Ambridge, Ben

    2016-08-01

    Children must learn the structural biases of locative verbs in order to avoid making overgeneralisation errors (e.g., *I filled water into the glass). It is thought that they use linguistic and situational information to learn verb classes that encode structural biases. In addition to situational cues, we examined whether children and adults could use the lexical distribution of nouns in the post-verbal noun phrase of transitive utterances to assign novel verbs to locative classes. In Experiment 1, children and adults used lexical distributional cues to assign verb classes, but were unable to use situational cues appropriately. In Experiment 2, adults generalised distributionally-learned classes to novel verb arguments, demonstrating that distributional information can cue abstract verb classes. Taken together, these studies show that human language learners can use a lexical distributional mechanism that is similar to that used by computational linguistic systems that use large unlabelled corpora to learn verb meaning.

  15. Spatial distribution of soil water repellency in a grassland located in Lithuania

    NASA Astrophysics Data System (ADS)

    Pereira, Paulo; Novara, Agata

    2014-05-01

    Soil water repellency (SWR) is recognized to be very heterogeneous in time and space, and depends on soil type, climate, land use, vegetation and season (Doerr et al., 2002). It prevents or reduces water infiltration, with important impacts on soil hydrology, influencing the mobilization and transport of substances into the soil profile. The reduced infiltration increases surface runoff and soil erosion. SWR also reduces seed emergence and plant growth due to the reduced amount of water in the root zone. Positive aspects of SWR are an increase in soil aggregate stability, organic carbon sequestration and reduced water evaporation (Mataix-Solera and Doerr, 2004; Diehl, 2013). SWR depends on the soil aggregate size. In fire-affected areas it was found that SWR was more persistent in small aggregates (Mataix-Solera and Doerr, 2004; Jordan et al., 2011). However, little information is available about the spatial distribution of SWR according to soil aggregate size. The aim of this work is to study the spatial distribution of SWR in fine earth (<2 mm) and in different aggregate sizes: 2-1 mm, 1-0.5 mm, 0.5-0.25 mm and <0.25 mm. The studied area is located near Vilnius (Lithuania) at 54° 42' N, 25° 08' E, 158 masl. A 400 m2 plot (20 x 20 m, with 5 m spacing between sampling points) was established, and 25 soil samples were collected from the topsoil (0-5 cm) and taken to the laboratory. Prior to SWR assessment, the samples were air dried. The persistence of SWR was analysed according to the Water Drop Penetration Time method, which involves placing three drops of distilled water onto the soil surface and recording the time in seconds (s) required for complete penetration of each drop (Wessel, 1988). The data did not follow a Gaussian distribution, so they were log-transformed to meet normality requirements. Spatial interpolations were carried out using Ordinary Kriging. The results showed that SWR in fine earth was on average 2.88 s (Coefficient of variation % (CV%)=44.62), 2
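
    A sketch of the interpolation step under the stated workflow (log-transform, then Ordinary Kriging with a spherical variogram), assuming the PyKrige package is available; the WDPT values are made up, placed on the study's 5 m sampling grid:

    ```python
    import numpy as np
    from pykrige.ok import OrdinaryKriging  # assumes the PyKrige package

    # 25 sampling points on a 20 m x 20 m plot, 5 m spacing (as in the study);
    # WDPT times (s) below are simulated for illustration only.
    xs, ys = np.meshgrid(np.arange(0, 25, 5), np.arange(0, 25, 5))
    x, y = xs.ravel().astype(float), ys.ravel().astype(float)
    wdpt = np.random.lognormal(mean=1.0, sigma=0.5, size=25)

    # Log-transform to approximate normality, then krige on a fine grid
    ok = OrdinaryKriging(x, y, np.log(wdpt), variogram_model="spherical")
    gridx = gridy = np.arange(0.0, 20.5, 0.5)
    z_log, var = ok.execute("grid", gridx, gridy)
    z = np.exp(z_log)  # back-transform (ignores the lognormal bias correction)
    print(z.shape)
    ```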

  16. Responses of European precipitation distributions and regimes to different blocking locations

    NASA Astrophysics Data System (ADS)

    Sousa, Pedro M.; Trigo, Ricardo M.; Barriopedro, David; Soares, Pedro M. M.; Ramos, Alexandre M.; Liberato, Margarida L. R.

    2017-02-01

    In this work we performed an analysis of the impacts of blocking episodes on seasonal and annual European precipitation and the associated physical mechanisms. Distinct domains were considered in detail, taking into account different blocking center positions spanning the region between the Atlantic and western Russia. Significant positive precipitation anomalies are found for the southernmost areas, while generalized negative anomalies (up to 75% in some areas) occur in large areas of central and northern Europe. This dipole of anomalies is reversed when compared to that observed during episodes of strong zonal flow conditions. We illustrate that the location of the maximum precipitation anomalies follows quite well the longitudinal positioning of the blocking centers, and we discuss regional and seasonal differences in the precipitation responses. To better understand the precipitation anomalies, we explore the blocking influence on cyclonic activity. The results indicate a split of the storm-tracks north and south of blocking systems, leading to an almost complete reduction of cyclonic centers in northern and central Europe and increases in southern areas, where cyclone frequency doubles during blocking episodes. However, the underlying processes conducive to the precipitation anomalies are distinct between northern and southern European regions, with a significant role of atmospheric instability in southern Europe, and moisture availability as the major driver at higher latitudes. This distinction is consistent with the characteristic patterns of latent heat release from the ocean associated with blocked and strong zonal flow patterns. We also analyzed changes in the full range of the precipitation distribution of several regional sectors during blocked and zonal days. Results show that precipitation reductions in the areas under direct blocking influence are driven by a substantial drop in the frequency of moderate rainfall classes. Contrarily, southwards of

  17. Exploring the Estimation of Examinee Locations Using Multidimensional Latent Trait Models under Different Distributional Assumptions

    ERIC Educational Resources Information Center

    Jang, Hyesuk

    2014-01-01

    This study aims to evaluate a multidimensional latent trait model to determine how well the model works in various empirical contexts. Contrary to the assumption of these latent trait models that the traits are normally distributed, situations in which the latent trait is not shaped with a normal distribution may occur (Sass et al, 2008; Woods…

  18. A robust confidence interval for location for symmetric, long-tailed distributions.

    PubMed

    Gross, A M

    1973-07-01

    A procedure called the wave-interval is presented for obtaining a 95% confidence interval for the center (mean, median) of a symmetric distribution that is not only highly efficient when the data have a Normal distribution but also performs well when some or all of the data come from a long-tailed distribution such as the Cauchy. Use of the wave-interval greatly reduces the risk of asserting much less than one's data will support. The only table required is the usual t-table. The wave-interval procedure is definitely recommended for samples of ten or more, and appears satisfactory for samples of nine or eight.

  19. Where exactly am I? Self-location judgements distribute between head and torso.

    PubMed

    Alsmith, Adrian J T; Longo, Matthew R

    2014-02-01

    I am clearly located where my body is located. But is there one particular place inside my body where I am? Recent results have provided apparently contradictory findings about this question. Here, we addressed this issue using a more direct approach than has been used in previous studies. Using a simple pointing task, we asked participants to point directly at themselves, either by manual manipulation of the pointer whilst blindfolded or by visually discerning when the pointer was in the correct position. Self-location judgements in haptic and visual modalities were highly similar, and were clearly modulated by the starting location of the pointer. Participants most frequently chose to point to one of two likely regions, the upper face or the upper torso, according to which they reached first. These results suggest that while the experienced self is not spread out homogeneously across the entire body, nor is it localised in any single point. Rather, two distinct regions, the upper face and upper torso, appear to be judged as where "I" am.

  20. On intra-supply chain system with an improved distribution plan, multiple sales locations and quality assurance.

    PubMed

    Chiu, Singa Wang; Huang, Chao-Chih; Chiang, Kuo-Wei; Wu, Mei-Fang

    2015-01-01

    Transnational companies, operating in extremely competitive global markets, always seek to lower various operating costs, such as inventory holding costs in their intra-supply chain system. This paper incorporates a cost-reducing product distribution policy into an intra-supply chain system with multiple sales locations and quality assurance studied by [Chiu et al., Expert Syst Appl, 40:2669-2676, (2013)]. Under the proposed cost-reducing distribution policy, an added initial delivery of end items is distributed to multiple sales locations to meet their demand during the production unit's uptime and rework time. After rework, when the remaining production lot goes through quality assurance, n fixed-quantity installments of finished items are then transported to the sales locations at fixed time intervals. Mathematical modeling and optimization techniques are used to derive closed-form optimal operating policies for the proposed system. Furthermore, the study demonstrates significant savings in stock holding costs for both the production unit and the sales locations. The alternative of outsourcing the product delivery task to an external distributor is analyzed to assist managerial decision-making on potential outsourcing issues and to facilitate further reductions in operating costs.

  1. The hemodynamic effects of the LVAD outflow cannula location on the thrombi distribution in the aorta: A primary numerical study.

    PubMed

    Zhang, Yage; Gao, Bin; Yu, Chang

    2016-09-01

    Although a growing number of patients undergo LVAD implantation for heart failure treatment, thrombi remain a devastating complication for patients with an LVAD. The LVAD outflow cannula location and the thrombi generation sources were hypothesized to affect the thrombi distribution in the aorta. To test this hypothesis, numerical studies were conducted using computational fluid dynamics (CFD). Two anastomotic configurations, in which the LVAD outflow cannula is anastomosed to the anterior or the lateral ascending aortic wall (named anterior and lateral configurations, respectively), were designed. Particles whose sizes are the same as those of thrombi are released at the LVAD outflow cannula and at the aortic valve (named thrombiP and thrombiL, respectively) to calculate the distribution of thrombi. The simulation results demonstrate that the thrombi distribution in the aorta is significantly affected by the LVAD outflow cannula location. In the anterior configuration, the probability of thrombi entering the three branches is 23.60%, while in the lateral configuration it is 36.68%. Similarly, in the anterior configuration, the probabilities of thrombi entering the brachiocephalic artery, left common carotid artery and left subclavian artery are 8.51%, 9.64% and 5.45%, respectively, while in the lateral configuration they are 11.39%, 3.09% and 22.20%, respectively. Moreover, the origins of thrombi affect their distributions in the aorta. In the anterior configuration, thrombiP have a lower probability of entering the three branches than thrombiL (12% vs. 25%). In contrast, in the lateral configuration, thrombiP have a higher probability of entering the three branches than thrombiL (47% vs. 35%). In brief, the LVAD outflow cannula location significantly affects the distribution of thrombi in the aorta. Thus, in clinical practice, the selection of the LVAD outflow location and the risk of thrombi formed in the left ventricle should be paid more

  2. Assessing cadmium distribution applying neutron radiography in moss trophical levels, located in Szarvasko, Hungary.

    PubMed

    Varga, János; Korösi, Ferenc; Balaskó, Márton; Naár, Zoltán

    2004-10-01

    The measuring station of the 10 MW VVR-SM research reactor at the Budapest Neutron Centre (Hungary) was used to perform dynamic neutron radiography (DNR), to the best of our knowledge for the first time, in a Tortella tortuosa biotope. In the study, two trophic levels, the moss and the spider Thomisidae sp. juv., were examined. Cadmium penetration routes, distribution and accumulation zones were visualized in the leafy gametophyte life cycle of Tortella tortuosa and in the organs of the spider.

  3. Distributed fiber optic sensor employing phase generate carrier for disturbance detection and location

    NASA Astrophysics Data System (ADS)

    Xu, Haiyan; Wu, Hongyan; Zhang, Xuewu; Zhang, Zhuo; Li, Min

    2015-05-01

    The distributed optic-fiber sensor is a new type of system that can be used for monitoring and inspection over long distances and under strong-EMI conditions. A method of external modulation with a phase modulator is proposed in this paper to improve the accuracy of disturbance positioning in a distributed optic-fiber sensor. We construct a distributed disturbance-detection system based on a Michelson interferometer, with a phase modulator attached to the sensing fiber in front of the Faraday rotation mirror (FRM), to shift the signal produced by the interference of the two light beams reflected by the FRM to a high frequency, while other signals remain at low frequency. Through a high-pass filter and a phase-retrieval circuit, a signal proportional to the external disturbance is acquired. The accuracy of disturbance positioning with this signal can be largely improved. The method is quite simple and easy to implement. Theoretical analysis and experimental results show that this method can effectively improve the positioning accuracy.
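
    A simplified numerical sketch of phase-generated-carrier arctangent demodulation, the general technique that an external phase modulator enables; the carrier frequency, modulation depth and filter settings are illustrative assumptions, not the paper's parameters:

    ```python
    import numpy as np
    from scipy.signal import butter, filtfilt
    from scipy.special import j1, jv

    fs, fc = 100_000, 10_000                  # sample rate, carrier frequency (Hz)
    t = np.arange(0, 0.05, 1 / fs)
    phi = 2.0 * np.sin(2 * np.pi * 40 * t)    # simulated disturbance phase (rad)
    C = 2.6                                   # phase-modulation depth (rad)
    interferogram = np.cos(C * np.cos(2 * np.pi * fc * t) + phi)

    def lowpass(x, cutoff=2_000.0):
        b, a = butter(4, cutoff / (fs / 2))
        return filtfilt(b, a, x)

    # Mixing with the carrier and its second harmonic and low-passing leaves
    # terms proportional to J1(C)*sin(phi) and J2(C)*cos(phi); their ratio
    # recovers the disturbance phase via the arctangent.
    s1 = lowpass(interferogram * np.cos(2 * np.pi * fc * t))
    s2 = lowpass(interferogram * np.cos(2 * np.pi * 2 * fc * t))
    phi_hat = np.unwrap(np.arctan2(-s1 / j1(C), -s2 / jv(2, C)))
    print(np.corrcoef(phi, phi_hat - phi_hat.mean())[0, 1])  # close to 1
    ```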

  4. Syringe filtration methods for examining dissolved and colloidal trace element distributions in remote field locations

    NASA Technical Reports Server (NTRS)

    Shiller, Alan M.

    2003-01-01

    It is well-established that sampling and sample processing can easily introduce contamination into dissolved trace element samples if precautions are not taken. However, work in remote locations sometimes precludes bringing bulky clean-lab equipment into the field and likewise may make timely transport of samples to the lab for processing impossible. Straightforward syringe filtration methods are described here for collecting small quantities (15 mL) of 0.45- and 0.02-microm filtered river water in an uncontaminated manner. These filtration methods take advantage of recent advances in analytical capabilities that require only small amounts of water for analysis of a suite of dissolved trace elements. Filter clogging and solute rejection artifacts appear to be minimal, although some adsorption of metals and organics does affect the first approximately 10 mL of water passing through the filters. Overall the methods are clean, easy to use, and provide reproducible representations of the dissolved and colloidal fractions of trace elements in river waters. Furthermore, sample processing materials can be prepared well in advance in a clean lab and transported cleanly and compactly to the field. Application of these methods is illustrated with data from remote locations in the Rocky Mountains and along the Yukon River. Evidence from field flow fractionation suggests that the 0.02-microm filters may provide a practical cutoff to distinguish metals associated with small inorganic and organic complexes from those associated with silicate and oxide colloids.

  5. MPL-Net Measurements of Aerosol and Cloud Vertical Distributions at Co-Located AERONET Sites

    NASA Technical Reports Server (NTRS)

    Welton, Ellsworth J.; Campbell, James R.; Berkoff, Timothy A.; Spinhirne, James D.; Tsay, Si-Chee; Holben, Brent; Starr, David OC. (Technical Monitor)

    2002-01-01

    In the early 1990s, the first small, eye-safe, and autonomous lidar system was developed, the Micropulse Lidar (MPL). The MPL acquires signal profiles of backscattered laser light from aerosols and clouds. The signals are analyzed to yield multiple layer heights, optical depths of each layer, average extinction-to-backscatter ratios for each layer, and profiles of extinction in each layer. In 2000, several MPL sites were organized into a coordinated network, called MPL-Net, by the Cloud and Aerosol Lidar Group at NASA Goddard Space Flight Center (GSFC) using funding provided by the NASA Earth Observing System. In addition to the funding provided by NASA EOS, the NASA CERES Ground Validation Group supplied four MPL systems to the project, and the NASA TOMS group contributed their MPL for work at GSFC. The Atmospheric Radiation Measurement Program (ARM) also agreed to make their data available to the MPL-Net project for processing. In addition to the initial NASA- and ARM-operated sites, several other independent research groups have also expressed interest in joining the network using their own instruments. Finally, a limited amount of EOS funding was set aside to participate in various field experiments each year. The NASA Sensor Intercomparison and Merger for Biological and Interdisciplinary Oceanic Studies (SIMBIOS) project also provides funds to deploy their MPL during ocean research cruises. All together, the MPL-Net project has participated in four major field experiments since 2000. Most MPL-Net sites and field experiment locations are also co-located with sunphotometers in the NASA Aerosol Robotic Network (AERONET). Therefore, at these locations data are collected on both aerosol and cloud vertical structure as well as column optical depth and sky radiance. Real-time data products are now available from most MPL-Net sites. Our real-time products are generated at times of AERONET aerosol optical depth (AOD) measurements. The AERONET AOD is used as input to our

  6. Redshift distributions of galaxies in the Dark Energy Survey Science Verification shear catalogue and implications for weak lensing

    DOE PAGES

    Bonnett, C.; Troxel, M. A.; Hartley, W.; ...

    2016-08-30

    Here we present photometric redshift estimates for galaxies used in the weak lensing analysis of the Dark Energy Survey Science Verification (DES SV) data. Four model- or machine-learning-based photometric redshift methods (annz2, bpz calibrated against BCC-Ufig simulations, skynet, and tpz) are analyzed. For training, calibration, and testing of these methods, we construct a catalogue of spectroscopically confirmed galaxies matched against DES SV data. The performance of the methods is evaluated against the matched spectroscopic catalogue, focusing on metrics relevant for weak lensing analyses, with additional validation against COSMOS photo-zs. The galaxies in the DES SV shear catalogue have mean redshift 0.72±0.01 over the range 0.3 < z < 1.3, and propagating the differences between the photo-z methods through the lensing analysis yields shifts in the inferred σ8 of approximately 3%. This shift is within the one sigma statistical errors on σ8 for the DES SV shear catalogue. We further study the potential impact of systematic differences on the critical surface density, Σcrit, finding levels of bias safely less than the statistical power of the DES SV data. In conclusion, we recommend a final Gaussian prior for the photo-z bias in the mean of n(z) of width 0.05 for each of the three tomographic bins, and show that this is a sufficient bias model for the corresponding cosmology analysis.

  7. Distribution of Brazilian dermatologists according to geographic location, population and HDI of municipalities: an ecological study*

    PubMed Central

    Schmitt, Juliano Vilaverde; Miot, Hélio Amante

    2014-01-01

    This study investigated the geographic distribution of dermatologists in Brazilian municipalities in relation to the population, the regions of the country and the human development index. We conducted an ecological study based on data from the 2010 census, the 2010 human development index, and the records of the Brazilian Society of Dermatology. 5565 municipalities and 6718 dermatologists were surveyed. Only 504 (9.1%) municipalities had dermatologists, and these accounted for 56.2% of the Brazilian population. The smallest population size and lowest HDI value that best discriminated municipalities without dermatologists were found to be 28,000 and 0.71, respectively. The average density of dermatologists in cities was 1/23,000 inhabitants, and variations were independently associated with the HDI, the population of the municipalities and the region of the country. PMID:25387516

  8. A mitochondrial location for haemoglobins--dynamic distribution in ageing and Parkinson's disease.

    PubMed

    Shephard, Freya; Greville-Heygate, Oliver; Marsh, Oliver; Anderson, Susan; Chakrabarti, Lisa

    2014-01-01

    Haemoglobins are iron-containing proteins that transport oxygen in the blood of most vertebrates. The mitochondrion is the cellular organelle which consumes oxygen in order to synthesise ATP. Mitochondrial dysfunction is implicated in neurodegeneration and ageing. We find that α and β haemoglobin (Hba and Hbb) proteins are altered in their distribution in mitochondrial fractions from degenerating brain. We demonstrate that both Hba and Hbb are co-localised with the mitochondrion in mammalian brain. The precise localisation of the Hbs is within the inner membrane space and associated with the inner mitochondrial membrane. Relative mitochondrial-to-cytoplasmic ratios of Hba and Hbb show changing distributions of these proteins during the process of neurodegeneration in the pcd(5j) mouse brain. A significant difference in Hba and Hbb content in the mitochondrial fraction is seen at 31 days after birth; this corresponds to the stage when dynamic neuronal loss is measured to be greatest in the Purkinje Cell Degeneration mouse. We also report changes in mitochondrial Hba and Hbb levels in ageing brain and muscle. Significant differences in mitochondrial Hba and Hbb can be seen when comparing aged brain to muscle, suggesting tissue-specific functions of these proteins in the mitochondrion. In muscle there are significant differences between Hba levels in old and young mitochondria. To understand whether the changes detected in mitochondrial Hbs are of clinical significance, we examined Parkinson's disease brain; immunohistochemistry studies suggest that cell bodies in the substantia nigra accumulate mitochondrial Hb. However, western blotting of mitochondrial fractions from PD and control brains indicates significantly less Hb in PD brain mitochondria. One explanation could be a specific loss of cells containing mitochondria loaded with Hb proteins. Our study opens the door to an examination of the role of Hb function, within the context of the mitochondrion

  9. Impacts to the chest of PMHSs - Influence of impact location and load distribution on chest response.

    PubMed

    Holmqvist, Kristian; Svensson, Mats Y; Davidsson, Johan; Gutsche, Andreas; Tomasch, Ernst; Darok, Mario; Ravnik, Dean

    2016-02-01

    The chest response of the human body has been studied for several load conditions, but is not well known in the case of steering wheel rim-to-chest impact in heavy goods vehicle frontal collisions. The aim of this study was to determine the response of the human chest in a set of simulated steering wheel impacts. PMHS tests were carried out and analysed. The steering wheel load pattern was represented by a rigid pendulum with a straight bar-shaped front. A crash test dummy chest calibration pendulum was utilised for comparison. In this study, a set of rigid bar impacts were directed at various heights of the chest, spanning approximately 120 mm around the fourth intercostal space. The impact energy was set below a level estimated to cause rib fracture. The analysis evaluated the effects of impactor shape and impact height on the compression and viscous-criterion chest injury responses. The results showed that the bar impacts consistently produced smaller scaled chest compressions than the hub; the Middle bar responses were around 90% of the hub responses. A superior bar impact produced less chest compression; the average response was 86% of the Middle bar response. For inferior bar impacts, the chest compression response was 116% of the chest compression in the middle. The damping properties of the chest caused the compression in high-speed bar impacts to decrease to 88% of that in low-speed impacts. From the analysis it could be concluded that the bar impact shape produces lower chest criteria responses compared to the hub. Further, the bar responses depend on the impact location on the chest. Inertial and viscous effects of the upper body affect the responses. The results can be used to assess the responses of human substitutes such as anthropomorphic test devices and finite element human body models, which will benefit the development process of heavy goods vehicle safety systems.
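
    The compression and viscous criteria referred to above have standard definitions, C(t) = D(t)/chest depth and VC(t) = V(t)·C(t); a sketch with an assumed initial chest depth and a synthetic deflection pulse (both made up for illustration):

    ```python
    import numpy as np

    def chest_criteria(deflection_m, fs_hz, chest_depth_m=0.23):
        """Peak compression criterion C = D(t)/chest_depth and peak viscous
        criterion VC = V(t) * C(t) from a chest deflection trace sampled at
        fs_hz. chest_depth_m is an assumed initial chest depth."""
        d = np.asarray(deflection_m, dtype=float)
        c = d / chest_depth_m                      # dimensionless compression
        v = np.gradient(d) * fs_hz                 # deflection rate, m/s
        return c.max(), (v * c).max()

    # Synthetic half-sine deflection pulse: 50 mm peak over 30 ms at 10 kHz
    fs = 10_000
    t = np.arange(0, 0.03, 1 / fs)
    d = 0.05 * np.sin(np.pi * t / 0.03)
    print(chest_criteria(d, fs))
    ```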

  10. FIBWR: a steady-state core flow distribution code for boiling water reactors code verification and qualification report. Final report

    SciTech Connect

    Ansari, A.F.; Gay, R.R.; Gitnick, B.J.

    1981-07-01

    A steady-state core flow distribution code (FIBWR) is described. The ability of the recommended models to predict various pressure drop components and void distribution is shown by comparison to the experimental data. Application of the FIBWR code to the Vermont Yankee Nuclear Power Station is shown by comparison to the plant measured data.

  11. Drop size distributions and related properties of fog for five locations measured from aircraft

    NASA Technical Reports Server (NTRS)

    Zak, J. Allen

    1994-01-01

    Fog drop size distributions were collected from aircraft as part of the Synthetic Vision Technology Demonstration Program. Three west coast marine advection fogs, one frontal fog, and a radiation fog were sampled from the top of the cloud to the bottom as the aircraft descended on a 3-degree glideslope. Drop size, concentration, and altitude are shown in three-dimensional plots for each 10-meter altitude interval from 1-minute samples. Also shown are the median volume radius and liquid water content. Advection fogs contained the largest drops, with median volume radii of 5-8 micrometers, although the drop sizes in the radiation fog were also large just above the runway surface. Liquid water content increased with height, and the total number of drops generally increased with time. Multimodal variations in number density and particle size were noted in most samples, with a peak concentration of small drops (2-5 micrometers) at low altitudes, a mid-altitude peak of drops of 5-11 micrometers, and a high-altitude peak of the larger drops (11-15 micrometers and above). These observations corroborate previous results on gross fog properties, although there is considerable variation with time and altitude even within the same type of fog.

  12. Locating ethics in data science: responsibility and accountability in global and distributed knowledge production systems

    PubMed Central

    2016-01-01

    The distributed and global nature of data science creates challenges for evaluating the quality, import and potential impact of the data and knowledge claims being produced. This has significant consequences for the management and oversight of responsibilities and accountabilities in data science. In particular, it makes it difficult to determine who is responsible for what output, and how such responsibilities relate to each other; what ‘participation’ means and which accountabilities it involves, with regard to data ownership, donation and sharing as well as data analysis, re-use and authorship; and whether the trust placed on automated tools for data mining and interpretation is warranted (especially as data processing strategies and tools are often developed separately from the situations of data use where ethical concerns typically emerge). To address these challenges, this paper advocates a participative, reflexive management of data practices. Regulatory structures should encourage data scientists to examine the historical lineages and ethical implications of their work at regular intervals. They should also foster awareness of the multitude of skills and perspectives involved in data science, highlighting how each perspective is partial and in need of confrontation with others. This approach has the potential to improve not only the ethical oversight for data science initiatives, but also the quality and reliability of research outputs. This article is part of the themed issue ‘The ethical impact of data science’. PMID:28336799

  13. Locating ethics in data science: responsibility and accountability in global and distributed knowledge production systems.

    PubMed

    Leonelli, Sabina

    2016-12-28

    The distributed and global nature of data science creates challenges for evaluating the quality, import and potential impact of the data and knowledge claims being produced. This has significant consequences for the management and oversight of responsibilities and accountabilities in data science. In particular, it makes it difficult to determine who is responsible for what output, and how such responsibilities relate to each other; what 'participation' means and which accountabilities it involves, with regard to data ownership, donation and sharing as well as data analysis, re-use and authorship; and whether the trust placed on automated tools for data mining and interpretation is warranted (especially as data processing strategies and tools are often developed separately from the situations of data use where ethical concerns typically emerge). To address these challenges, this paper advocates a participative, reflexive management of data practices. Regulatory structures should encourage data scientists to examine the historical lineages and ethical implications of their work at regular intervals. They should also foster awareness of the multitude of skills and perspectives involved in data science, highlighting how each perspective is partial and in need of confrontation with others. This approach has the potential to improve not only the ethical oversight for data science initiatives, but also the quality and reliability of research outputs. This article is part of the themed issue 'The ethical impact of data science'.

  14. Exomars Mission Verification Approach

    NASA Astrophysics Data System (ADS)

    Cassi, Carlo; Gilardi, Franco; Bethge, Boris

    According to the long-term cooperation plan established by ESA and NASA in June 2009, the ExoMars project now consists of two missions: A first mission will be launched in 2016 under ESA lead, with the objectives to demonstrate the European capability to safely land a surface package on Mars, to perform Mars atmosphere investigation, and to provide communication capability for present and future ESA/NASA missions. For this mission ESA provides a spacecraft composite, made up of an "Entry Descent & Landing Demonstrator Module (EDM)" and a Mars Orbiter Module (OM); NASA provides the launch vehicle and the scientific instruments located on the Orbiter for Mars atmosphere characterisation. A second mission, with its launch foreseen in 2018, is led by NASA, who provides the spacecraft and launcher, the EDL system, and a rover. ESA contributes the ExoMars Rover Module (RM) to provide surface mobility. It includes a drill system allowing drilling down to 2 meters, collecting samples and investigating them for signs of past and present life with exobiological experiments, and investigating the Mars water/geochemical environment. In this scenario Thales Alenia Space Italia, as ESA prime industrial contractor, is in charge of the design, manufacturing, integration and verification of the ESA ExoMars modules, i.e.: the Spacecraft Composite (OM + EDM) for the 2016 mission, the RM for the 2018 mission and the Rover Operations Control Centre, which will be located at Altec-Turin (Italy). The verification process of the above products is quite complex and includes some peculiarities with limited or no heritage in Europe. Furthermore, the verification approach has to be optimised to allow full verification despite significant schedule and budget constraints. The paper presents the verification philosophy tailored for the ExoMars mission in line with the above considerations, starting from the model philosophy, showing the verification activities flow and the sharing of tests

  15. Estimation of hydrothermal deposits location from magnetization distribution and magnetic properties in the North Fiji Basin

    NASA Astrophysics Data System (ADS)

    Choi, S.; Kim, C.; Park, C.; Kim, H.

    2013-12-01

    The North Fiji Basin is one of the youngest back-arc basins in the southwest Pacific (opening from 12 Ma ago). We performed marine magnetic and bathymetry surveys in the North Fiji Basin in April 2012 to search for submarine hydrothermal deposits. Magnetic and bathymetry datasets were acquired using a Multi-Beam Echo Sounder EM120 (Kongsberg Co.) and an Overhauser Proton Magnetometer SeaSPY (Marine Magnetics Co.). The data were processed to obtain detailed seabed topography, the magnetic anomaly, the reduction to the pole (RTP), the analytic signal and the magnetization. The study areas comprise two areas on the Central Spreading Ridge (CSR), KF-1 (longitude: 173.5 ~ 173.7 and latitude: -16.2 ~ -16.5) and KF-3 (longitude: 173.4 ~ 173.6 and latitude: -18.7 ~ -19.1), and one area at the Triple Junction (TJ), KF-2 (longitude: 173.7 ~ 174 and latitude: -16.8 ~ -17.2). The seabed topography of KF-1 shows a thin horst between two grabens trending NW-SE. The magnetic properties of KF-1 show high magnetic anomalies in the central part and a magnetic lineament trending E-W. In the magnetization distribution of KF-1, the low-magnetization zone matches well with a strong analytic signal in the northeastern part. The KF-2 area contains the TJ; its seabed topography is Y-shaped, with a high feature at the center of the TJ. The magnetic properties of KF-2 show high magnetic anomalies at the center of the N-S spreading ridge and in the northwestern part. In the magnetization distribution of KF-2, the low-magnetization zone matches well with a strong analytic signal in the northeastern part. The seabed topography of KF-3 presents a flat, high, dome-like structure at the central axis, with some seamounts scattered around the axis. The magnetic properties of KF-3 show high magnetic anomalies at the center of the N-S spreading ridge. In the magnetization of KF-3, the low-magnetization zone does not match the strong analytic signal in this area. The difference of KF-3

  16. Distribution of incident rainfall through vegetation in a watershed located in southern Spain

    NASA Astrophysics Data System (ADS)

    Moreno Perez, Maria Fatima; Roldan Cañas, Jose; Perez Arellano, Rafael; Cienfuegos, Ignacio

    2013-04-01

    The rainfall interception by the vegetation canopy is one of the main factors affecting soil moisture and runoff, because a large proportion of the rain returns to the atmosphere as evaporation. This can amount to evaporation losses of between 20 and 40% of the rain, so it should be taken into account in basin water balances, especially in arid and semi-arid regions with scanty rainfall. The purpose of this study was to determine the distribution of rainwater through the canopy of the trees and shrubs present in the watershed of "The Cabril" (Cordoba, Spain). The incident precipitation, throughfall and cortical (stem) flow were quantified for two agricultural years, 2010/11 and 2011/12, in the predominant vegetation, rockrose (Cistus ladanifer) and pine trees (Pinus pinea), in order to determine the volume of precipitation intercepted and the influence of rainfall intensity and duration on interception. In total, 1134.4 mm of rain were collected in 102 storms. 31.4% was intercepted and evaporated into the atmosphere in the pines, and 19% in the rockrose. Cortical flow represented 0.3% in the pines and 17.7% in the rockrose, and throughfall represented 68.3% in the pines and 63.3% in the rockrose. Although numerical differences exist between the vegetation covers, the results indicate significant correlations of throughfall, cortical flow and interception with precipitation in both pine and rockrose. The amount of water needed to saturate the tops of the pines varied between 1.6 and 9.5 mm; in the rockrose the variation was 1.8 to 3.9 mm, depending on the intensity of rainfall. The interception reached its highest values with less intense rainfall, decreasing considerably as rainfall duration and intensity increase. Precipitation events exceeding 20 mm cause a greater increase of moisture beneath the pine canopy than outside, while the opposite occurs for events of less than 20 mm. This can be explained by the very high interception during small events.
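
    The reported percentages are consistent with the usual water-balance residual, interception I = P - throughfall - stemflow; a quick check using the abstract's two-year figures:

    ```python
    # Interception as the water-balance residual: I = P - TF - SF.
    # Percentages are the two-year figures reported in the abstract.
    P = 1134.4  # gross rainfall, mm

    for cover, tf_pct, sf_pct in [("pine", 68.3, 0.3), ("rockrose", 63.3, 17.7)]:
        tf, sf = P * tf_pct / 100, P * sf_pct / 100
        interception = P - tf - sf
        print(f"{cover}: interception = {interception:.1f} mm "
              f"({100 * interception / P:.1f}% of gross rainfall)")
    ```

    The residuals come out at 31.4% for pine and 19.0% for rockrose, matching the interception fractions quoted above.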

  17. Screening for the Location of RNA using the Chloride Ion Distribution in Simulations of Virus Capsids.

    PubMed

    Larsson, Daniel S D; van der Spoel, David

    2012-07-10

    The complete structure of the genomic material inside a virus capsid remains elusive, although a limited amount of symmetric nucleic acid can be resolved in the crystal structure of 17 icosahedral viruses. The negatively charged sugar-phosphate backbone of RNA and DNA as well as the large positive charge of the interior surface of the virus capsids suggest that electrostatic complementarity is an important factor in the packaging of the genomes in these viruses. To test how much packing information is encoded by the electrostatic and steric envelope of the capsid interior, we performed extensive all-atom molecular dynamics (MD) simulations of virus capsids with explicit water molecules and solvent ions. The model systems were two small plant viruses in which significant amounts of RNA has been observed by X-ray crystallography: satellite tobacco mosaic virus (STMV, 62% RNA visible) and satellite tobacco necrosis virus (STNV, 34% RNA visible). Simulations of half-capsids of these viruses with no RNA present revealed that the binding sites of RNA correlated well with regions populated by chloride ions, suggesting that it is possible to screen for the binding sites of nucleic acids by determining the equilibrium distribution of negative ions. By including the crystallographically resolved RNA in addition to ions, we predicted the localization of the unresolved RNA in the viruses. Both viruses showed a hot-spot for RNA binding at the 5-fold symmetry axis. The MD simulations were compared to predictions of the chloride density based on nonlinear Poisson-Boltzmann equation (PBE) calculations with mobile ions. Although the predictions are superficially similar, the PBE calculations overestimate the ion concentration close to the capsid surface and underestimate it far away, mainly because protein dynamics is not taken into account. Density maps from chloride screening can be used to aid in building atomic models of packaged virus genomes. Knowledge of the principles of

  18. What influences national and foreign physicians’ geographic distribution? An analysis of medical doctors’ residence location in Portugal

    PubMed Central

    2012-01-01

    Background The debate over physicians’ geographical distribution has attracted the attention of the economic and public health literature over the last forty years. Nonetheless, it is still unclear to date what influences physicians’ location, and whether foreign physicians contribute to filling the geographical gaps left by national doctors in any given country. The present research sets out to investigate the current distribution of national and international physicians in Portugal, with the objective of understanding its determinants and providing an evidence base for policy-makers to identify policies to influence it. Methods A cross-sectional study of physicians currently registered in Portugal was conducted to describe the population and explore the association of physician residence patterns with relevant personal and municipality characteristics. Data from the Portuguese Medical Council on physicians’ residence and characteristics were analysed, as well as data from the National Institute of Statistics on municipalities’ population, living standards and health care network. Descriptive statistics, chi-square tests, negative binomial and logistic regression modelling were applied to determine: (a) municipality characteristics predicting Portuguese and international physicians’ geographical distribution; and (b) doctors’ characteristics that could increase the odds of residing outside the country’s metropolitan areas. Results There were 39,473 physicians in Portugal in 2008, 51.1% of whom were male, and 40.2% between 41 and 55 years of age. They were predominantly Portuguese (90.5%), with Spanish, Brazilian and African nationalities also represented. Population, Population’s Purchasing Power, Nurses per capita and Municipality Development Index (MDI) were the municipality characteristics displaying the strongest association with national physicians’ location. For foreign physicians, the MDI was not statistically significant, while municipalities

  19. Verification of patient-specific dose distributions in proton therapy using a commercial two-dimensional ion chamber array

    SciTech Connect

    Arjomandy, Bijan; Sahoo, Narayan; Ciangaru, George; Zhu, Ronald; Song Xiaofei; Gillin, Michael

    2010-11-15

    Purpose: The purpose of this study was to determine whether a two-dimensional (2D) ion chamber array detector quickly and accurately measures patient-specific dose distributions in treatment with passively scattered and spot scanning proton beams. Methods: The 2D ion chamber array detector MatriXX was used to measure the dose distributions in a plastic water phantom from passively scattered and spot scanning proton beam fields planned for patient treatment. Planar dose distributions were measured using MatriXX, and the distributions were compared to those calculated using a treatment-planning system. The dose distributions generated by the treatment-planning system and a film dosimetry system were similarly compared. Results: For passively scattered proton beams, the gamma index for the dose-distribution comparison for treatment fields for three patients with prostate cancer and for one patient with lung cancer was less than 1.0 for 99% and 100% of pixels for a 3% dose tolerance and 3 mm distance-to-agreement, respectively. For spot scanning beams, the mean (± standard deviation) percentages of pixels with gamma indices meeting the passing criteria were 97.1%±1.4% and 98.8%±1.4% for MatriXX and film dosimetry, respectively, for 20 fields used to treat patients with prostate cancer. Conclusions: Unlike film dosimetry, MatriXX provides not only 2D dose-distribution information but also absolute dosimetry in fractions of minutes with acceptable accuracy. The results of this study indicate that MatriXX can be used to verify patient-field-specific dose distributions in proton therapy.
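
    A brute-force sketch of the 2D global gamma analysis used for such comparisons (3% dose tolerance, 3 mm distance-to-agreement, with a low-dose threshold); commercial implementations are heavily optimised, so this is only the textbook definition made executable on synthetic data:

    ```python
    import numpy as np

    def gamma_pass_rate(measured, predicted, spacing_mm=1.0,
                        dose_tol=0.03, dta_mm=3.0, threshold=0.2):
        """Brute-force 2D global gamma analysis (e.g. 3%/3 mm).
        Returns the fraction of above-threshold points with gamma <= 1."""
        dmax = predicted.max()
        ny, nx = measured.shape
        yy, xx = np.mgrid[0:ny, 0:nx] * spacing_mm
        passed, total = 0, 0
        for iy in range(ny):
            for ix in range(nx):
                if measured[iy, ix] < threshold * dmax:
                    continue  # skip points below the low-dose threshold
                dist2 = (yy - iy * spacing_mm) ** 2 + (xx - ix * spacing_mm) ** 2
                dose2 = ((predicted - measured[iy, ix]) / (dose_tol * dmax)) ** 2
                gamma = np.sqrt(dist2 / dta_mm ** 2 + dose2).min()
                passed += gamma <= 1.0
                total += 1
        return passed / total

    meas = np.random.uniform(0.5, 2.0, (40, 40))
    pred = meas * (1 + np.random.normal(0, 0.01, meas.shape))  # ~1% noise
    print(f"{100 * gamma_pass_rate(meas, pred):.1f}% passing")
    ```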

  20. Dip distribution of Oita-Kumamoto Tectonic Line located in central Kyushu, Japan, estimated by eigenvectors of gravity gradient tensor

    NASA Astrophysics Data System (ADS)

    Kusumoto, Shigekazu

    2016-09-01

    We estimated the dip distribution of the Oita-Kumamoto Tectonic Line, located in central Kyushu, Japan, using the dip of the maximum eigenvector of the gravity gradient tensor. A series of earthquakes in Kumamoto and Oita beginning on 14 April 2016 occurred along this tectonic line, the largest of which was M = 7.3. Because a gravity gradiometry survey has not been conducted in the study area, we calculated the gravity gradient tensor from the Bouguer gravity anomaly and employed it in the analysis. The general dip of the Oita-Kumamoto Tectonic Line was found to be about 65° and tends to be higher towards its eastern end. In addition, we estimated the dip around the largest earthquake to be about 60° from the gravity gradient tensor. This result agrees with the dip of the earthquake source fault obtained by Global Navigation Satellite System data analysis.
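
    A sketch of the tensor step: the dip follows from the eigenvector belonging to the largest eigenvalue of the symmetric gravity gradient tensor. The tensor values below are made up, with the trace kept near zero as required for a source-free potential field:

    ```python
    import numpy as np

    def max_eigenvector_dip_deg(T):
        """Dip (degrees below horizontal) of the eigenvector belonging to the
        largest eigenvalue of a symmetric 3x3 gravity gradient tensor
        T = [[Txx, Txy, Txz], [Txy, Tyy, Tyz], [Txz, Tyz, Tzz]]."""
        vals, vecs = np.linalg.eigh(T)        # eigh: symmetric-matrix solver
        v = vecs[:, np.argmax(vals)]          # eigenvector of max eigenvalue
        horiz = np.hypot(v[0], v[1])
        return np.degrees(np.arctan2(abs(v[2]), horiz))

    # Illustrative tensor (Eotvos units), not survey data
    T = np.array([[ 10.0,  2.0, 30.0],
                  [  2.0,  5.0,  8.0],
                  [ 30.0,  8.0, -15.0]])
    print(f"dip of maximum eigenvector: {max_eigenvector_dip_deg(T):.1f} deg")
    ```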

  1. Do as I say, not as I do: a lexical distributional account of English locative verb class acquisition.

    PubMed

    Twomey, Katherine E; Chang, Franklin; Ambridge, Ben

    2014-09-01

    Children overgeneralise verbs to ungrammatical structures early in acquisition, but retreat from these overgeneralisations as they learn semantic verb classes. In a large corpus of English locative utterances (e.g., the woman sprayed water onto the wall/wall with water), we found structural biases which changed over development and which could explain overgeneralisation behaviour. Children and adults had similar verb classes and a correspondence analysis suggested that lexical distributional regularities in the adult input could help to explain the acquisition of these classes. A connectionist model provided an explicit account of how structural biases could be learned over development and how these biases could be reduced by learning verb classes from distributional regularities.

  2. SU-E-T-798: Verification of 3DVH Dose Distribution Before Clinical Implementation for Patient-Specific IMRT QA

    SciTech Connect

    McFadden, D

    2015-06-15

    Purpose: In recent years patient-specific IMRT QA has transitioned from film and chamber measurements to beam-by-beam 2D array measurements. 3DVH takes this transition a step further by estimating the 3D dose distribution delivered, using 2D per-beam diode array measurements. In this study, the 3D dose distribution generated by 3DVH is compared to film and chamber measurements. In addition, the accuracy of ROI volumes and error detection is investigated. Methods: Composite film and ion chamber measurements in a solid water phantom were performed for 9 IMRT PINNACLE patient plans covering 4 treatment sites. The film and chamber measurements were compared to the dose distribution predicted by 3DVH using MAPCHECK2 per-beam measurements. The absolute point dose at the central axis (CAX) was extracted from the 3DVH and PINNACLE dose distributions and compared by taking the ratio of measured to predicted doses. The dose distribution measured with film was compared to the corresponding plane (axial, sagittal, coronal) extracted from the dose distributions predicted by 3DVH and PINNACLE using a 2D gamma analysis. Gamma analysis was performed with 2% dose, 2 mm DTA, 20% threshold, and global normalization. In addition, the percent difference between 3DVH and PINNACLE ROI volumes was calculated. Results: The average ratio of the measured point dose vs the 3DVH predicted dose was 1.017 (σ=0.011). The average gamma passing rate for measured vs 3DVH dose distributions was 95.1% (σ=2.53%). The average percent difference of 3DVH vs PINNACLE ROI volume was 2.29% (σ=2.5%). Conclusion: The dose distributions predicted by 3DVH using MAPCHECK2 measurements agree with the distributions that would have been obtained using film and chamber. The ROI volumes used in 3DVH are not an exact match to those in PINNACLE; the effect requires more investigation. The accuracy of error detection by 3DVH is currently being investigated.

  3. SU-D-BRF-02: In Situ Verification of Radiation Therapy Dose Distributions From High-Energy X-Rays Using PET Imaging

    SciTech Connect

    Zhang, Q; Kai, L; Wang, X; Hua, B; Chui, L; Wang, Q; Ma, C

    2014-06-01

    Purpose: To study the possibility of in situ verification of radiation therapy dose distributions using PET imaging, based on the activity distribution of ¹¹C and ¹⁵O produced via photonuclear reactions in patients irradiated by 45 MV x-rays. Methods: The method is based on photonuclear reactions in ¹²C and ¹⁶O, the most abundant elemental constituents of body tissues, irradiated by bremsstrahlung photons with energies up to 45 MeV, resulting primarily in ¹¹C and ¹⁵O, which are positron-emitting nuclei. The induced positron activity distributions were obtained with a PET scanner in the same room as a LA45 accelerator (Top Grade Medical, Beijing, China). The experiments were performed with a brain phantom using realistic treatment plans. The phantom was scanned at 20 min and 2-5 min after irradiation for ¹¹C and ¹⁵O, respectively. The interval between the two scans was 20 minutes. The activity distributions of ¹¹C and ¹⁵O within the irradiated volume can be separated from each other because the half-lives are 20 min and 2 min for ¹¹C and ¹⁵O, respectively. Three x-ray energies were used: 10 MV, 25 MV and 45 MV. The radiation dose ranged from 1.0 Gy to 10.0 Gy per treatment. Results: It was confirmed that no activity was detected at 10 MV beam energy, which is far below the energy threshold for photonuclear reactions. At 25 MV, activity distribution images were observed on PET, but a much higher radiation dose was needed to obtain good image quality. For 45 MV photon beams, good-quality activation images were obtained with a 2-3 Gy radiation dose, which is the typical daily dose for radiation therapy. Conclusion: The activity distributions of ¹⁵O and ¹¹C can be used to derive the dose distribution of 45 MV x-rays at the regular daily dose level. This method can potentially be used to verify in situ dose distributions of patients treated on the LA45 accelerator.
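
    Separating the two activities by half-life, as described, amounts to solving a two-component exponential decay system from scans at two times. A minimal sketch, with hypothetical count values:

        import numpy as np

        # Half-lives (min) of the two positron emitters named in the abstract.
        lam_C11 = np.log(2) / 20.4   # 11C
        lam_O15 = np.log(2) / 2.04   # 15O

        def separate(t1, m1, t2, m2):
            """Solve m(t) = A_C*exp(-lam_C*t) + A_O*exp(-lam_O*t) for the two
            initial activities, given measurements at two scan times (min)."""
            M = np.array([[np.exp(-lam_C11 * t1), np.exp(-lam_O15 * t1)],
                          [np.exp(-lam_C11 * t2), np.exp(-lam_O15 * t2)]])
            return np.linalg.solve(M, np.array([m1, m2]))

        # Hypothetical voxel counts from scans at 3 min and 23 min after irradiation.
        A_C, A_O = separate(3.0, 900.0, 23.0, 280.0)
        print(f"11C component: {A_C:.0f}, 15O component: {A_O:.0f} (arbitrary units)")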

  4. Levels and spatial distribution of airborne chemical elements in a heavy industrial area located in the north of Spain.

    PubMed

    Lage, J; Almeida, S M; Reis, M A; Chaves, P C; Ribeiro, T; Garcia, S; Faria, J P; Fernández, B G; Wolterbeek, H T

    2014-01-01

    The adverse health effects of airborne particles have been subjected to intense investigation in recent years; however, more studies on the chemical characterization of particles from pollution emissions are needed to (1) identify emission sources, (2) better understand the relative toxicity of particles, and (3) pinpoint more targeted emission control strategies and regulations. The main objective of this study was to assess the levels and spatial distribution of airborne chemical elements in a heavy industrial area located in the north of Spain. Instrumental and biomonitoring techniques were integrated and analytical methods for k0 instrumental neutron activation analysis and particle-induced x-ray emission were used to determine element content in aerosol filters and lichens. Results indicated that in general local industry contributed to the emissions of As, Sb, Cu, V, and Ni, which are associated with combustion processes. In addition, the steelwork emitted significant quantities of Fe and Mn and the cement factory was associated with Ca emissions. The spatial distribution of Zn and Al also indicated an important contribution of two industries located outside the studied area.

  5. Experimental verification of improved depth-dose distribution using hyper-thermal neutron incidence in neutron capture therapy.

    PubMed

    Sakurai, Y; Kobayashi, T

    2001-01-01

    We have proposed the utilization of 'hyper-thermal neutrons' for neutron capture therapy (NCT) with a view to improving the dose distribution in the human body. In order to verify the improved depth-dose distribution due to hyper-thermal neutron incidence, two experiments were carried out using a test-type hyper-thermal neutron generator at a thermal neutron irradiation field of the Kyoto University Reactor (KUR), which is actually utilized for NCT clinical irradiation. From the free-in-air experiment on the spectrum-shift characteristics, it was confirmed that hyper-thermal neutrons of approximately 860 K at maximum could be obtained with the generator. From the phantom experiment, the improvement effect and the controllability of the depth-dose distribution were confirmed. For example, it was found that the relative neutron depth-dose distribution improved by about 1 cm with 860 K hyper-thermal neutron incidence, compared to normal thermal neutron incidence.

  6. TESTING AND VERIFICATION OF REAL-TIME WATER QUALITY MONITORING SENSORS IN A DISTRIBUTION SYSTEM AGAINST INTRODUCED CONTAMINATION

    EPA Science Inventory

    Drinking water distribution systems reach the majority of American homes, business and civic areas, and are therefore an attractive target for terrorist attack via direct contamination, or backflow events. Instrumental monitoring of such systems may be used to signal the prese...

  7. A Novel Method to Incorporate the Spatial Location of the Lung Dose Distribution into Predictive Radiation Pneumonitis Modeling

    SciTech Connect

    Vinogradskiy, Yevgeniy; Tucker, Susan L.; Liao, Zhongxing; Martel, Mary K.

    2012-03-15

    Purpose: Studies have proposed that patients who receive radiation therapy to the base of the lung are more susceptible to radiation pneumonitis than patients who receive therapy to the apex of the lung. The primary purpose of the present study was to develop a novel method to incorporate the lung dose spatial information into a predictive radiation pneumonitis model. A secondary goal was to apply the method to a 547 lung cancer patient database to determine whether including the spatial information could improve the fit of our model. Methods and Materials: The three-dimensional dose distribution of each patient was mapped onto one common coordinate system. The boundaries of the coordinate system were defined by the extreme points of each individual patient lung. Once all dose distributions were mapped onto the common coordinate system, the spatial information was incorporated into a Lyman-Kutcher-Burman predictive radiation pneumonitis model. Specifically, the lung dose voxels were weighted using a user-defined spatial weighting matrix. We investigated spatial weighting matrices that linearly scaled each dose voxel according to the following orientations: superior-inferior, anterior-posterior, medial-lateral, left-right, and radial. The model parameters were fit to our patient cohort with the endpoint of severe radiation pneumonitis. The spatial dose model was compared against a conventional dose-volume model to determine whether adding a spatial component improved the fit of the model. Results: Of the 547 patients analyzed, 111 (20.3%) experienced severe radiation pneumonitis. Adding in a spatial parameter did not significantly increase the accuracy of the model for any of the weighting schemes. Conclusions: A novel method was developed to investigate the relationship between the location of the deposited lung dose and pneumonitis rate. The method was applied to a patient database, and we found that for our patient cohort, the spatial location does not influence
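
    A minimal sketch of the approach described: voxel doses are scaled by a user-defined spatial weight (here a linear superior-inferior ramp) before the Lyman-Kutcher-Burman reduction to a normal-tissue complication probability. All parameter values and data are illustrative, not the fitted values from the 547-patient cohort.

        import numpy as np
        from scipy.stats import norm

        def lkb_ntcp_weighted(dose, z_norm, n=1.0, m=0.4, td50=30.0, slope=1.0):
            """LKB NTCP with voxel doses scaled by a linear superior-inferior
            weight before the generalized-EUD reduction (illustrative values)."""
            w = 1.0 + slope * (z_norm - 0.5)      # z_norm: 0 = superior, 1 = inferior
            geud = (np.mean((w * dose) ** (1.0 / n))) ** n
            return norm.cdf((geud - td50) / (m * td50))

        # Hypothetical lung dose voxels (Gy) and normalized axial positions.
        rng = np.random.default_rng(1)
        dose = rng.uniform(0, 60, 10_000)
        z = rng.uniform(0, 1, 10_000)
        print(f"NTCP = {lkb_ntcp_weighted(dose, z):.3f}")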

  8. Implementation of a novel double-side technique for partial discharge detection and location in covered conductor overhead distribution networks

    NASA Astrophysics Data System (ADS)

    He, Weisheng; Li, Hongjie; Liang, Deliang; Sun, Haojie; Yang, Chenbo; Wei, Jinqu; Yuan, Zhijian

    2015-12-01

    Partial discharge (PD) detection has proven to be one of the most acceptable techniques for on-line condition monitoring and predictive maintenance of power apparatus. A powerful tool for detecting PD in covered-conductor (CC) lines is urgently needed to improve the asset management of CC overhead distribution lines. In this paper, an appropriate, portable and simple system designed to detect PD activity in CC lines and ultimately pinpoint the PD source is developed and tested. The system is based on a novel double-side synchronised PD measurement technique driven by pulse injection. Emphasis is placed on the proposed PD-location mechanism and hardware structure, with descriptions of the pulse-injection process, detection device, synchronisation principle and PD-location algorithm. The system is simulated using ATP-EMTP, and the simulation results are found to be consistent with the actual layout. For further validation, the capability of the system is tested in a high-voltage laboratory experiment using a 10-kV CC line with cross-linked polyethylene insulation.

  9. Phase Velocity and Full-Waveform Analysis of Co-located Distributed Acoustic Sensing (DAS) Channels and Geophone Sensor

    NASA Astrophysics Data System (ADS)

    Parker, L.; Mellors, R. J.; Thurber, C. H.; Wang, H. F.; Zeng, X.

    2015-12-01

    A 762-meter Distributed Acoustic Sensing (DAS) array with a channel spacing of one meter was deployed at the Garner Valley Downhole Array in Southern California. The array was approximately rectangular with dimensions of 180 meters by 80 meters. The array also included two subdiagonals within the rectangle along which three-component geophones were co-located. Several active sources were deployed, including a 45-kN, swept-frequency, shear-mass shaker, which produced strong Rayleigh waves across the array. Both DAS and geophone traces were filtered in 2-Hz steps between 4 and 20 Hz to obtain phase velocities as a function of frequency from fitting the moveout of travel times over distances of 35 meters or longer. As an alternative to this traditional means of finding phase velocity, it is theoretically possible to find the Rayleigh-wave phase velocity at each point of co-location as the ratio of DAS and geophone responses, because DAS is sensitive to ground strain and geophones are sensitive to ground velocity, after suitable corrections for instrument response (Mikumo & Aki, 1964). The concept was tested in WPP, a seismic wave propagation program, by first validating and then using a 3D synthetic, full-waveform seismic model to simulate the effect of increased levels of noise and uncertainty as data go from ideal to more realistic. The results obtained from this study provide a better understanding of the DAS response and its potential for being combined with traditional seismometers for obtaining phase velocity at a single location. This analysis is part of the PoroTomo project (Poroelastic Tomography by Adjoint Inverse Modeling of Data from Seismology, Geodesy, and Hydrology, http://geoscience.wisc.edu/feigl/porotomo).
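
    The ratio method described above rests on a plane-wave relation: DAS measures axial strain, a geophone measures particle velocity, and strain equals velocity divided by phase velocity. The sketch below, with made-up numbers, recovers an assumed velocity from the spectral amplitude ratio at the shaker frequency.

        import numpy as np

        def phase_velocity_from_ratio(strain, velocity, fs, f0):
            """Estimate c(f0) from the spectral amplitude ratio of a co-located
            geophone (velocity) and DAS channel (strain), using v = c * strain."""
            freqs = np.fft.rfftfreq(len(strain), d=1.0 / fs)
            k = np.argmin(np.abs(freqs - f0))
            S = np.abs(np.fft.rfft(strain))[k]
            V = np.abs(np.fft.rfft(velocity))[k]
            return V / S

        # Synthetic check: a 10 Hz wave with c = 500 m/s.
        fs, f0, c = 1000.0, 10.0, 500.0
        t = np.arange(0, 2, 1 / fs)
        vel = np.sin(2 * np.pi * f0 * t)      # particle velocity (m/s)
        strain = vel / c                       # plane-wave relation
        print(f"recovered c ~ {phase_velocity_from_ratio(strain, vel, fs, f0):.0f} m/s")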

  10. KAT-7 Science Verification: Using H I Observations of NGC 3109 to Understand its Kinematics and Mass Distribution

    NASA Astrophysics Data System (ADS)

    Carignan, C.; Frank, B. S.; Hess, K. M.; Lucero, D. M.; Randriamampandry, T. H.; Goedhart, S.; Passmoor, S. S.

    2013-09-01

    H I observations of the Magellanic-type spiral NGC 3109, obtained with the seven dish Karoo Array Telescope (KAT-7), are used to analyze its mass distribution. Our results are compared to those obtained using Very Large Array (VLA) data. KAT-7 is a pathfinder of the Square Kilometer Array precursor MeerKAT, which is under construction. The short baselines and low system temperature of the telescope make it sensitive to large-scale, low surface brightness emission. The new observations with KAT-7 allow the measurement of the rotation curve (RC) of NGC 3109 out to 32', doubling the angular extent of existing measurements. A total H I mass of 4.6 × 10⁸ M⊙ is derived, 40% more than what is detected by the VLA observations. The observationally motivated pseudo-isothermal dark matter (DM) halo model can reproduce the observed RC very well, but the cosmologically motivated Navarro-Frenk-White DM model gives a much poorer fit to the data. While having a more accurate gas distribution has reduced the discrepancy between the observed RC and the MOdified Newtonian Dynamics (MOND) models, this is done at the expense of having to use unrealistic mass-to-light ratios for the stellar disk and/or very large values for the MOND universal constant a₀. Different distances or H I contents cannot reconcile MOND with the observed kinematics, in view of the small errors on these two quantities. As with many slowly rotating gas-rich galaxies studied recently, the present result for NGC 3109 continues to pose a serious challenge to the MOND theory.
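
    As an illustration of the halo-model fitting the abstract describes, here is a minimal least-squares fit of a pseudo-isothermal sphere rotation curve. The data points are hypothetical, not the KAT-7 measurements, and the NFW comparison is omitted.

        import numpy as np
        from scipy.optimize import curve_fit

        def v_iso(r, v_inf, r_c):
            """Rotation speed of a pseudo-isothermal halo:
            V(r) = V_inf * sqrt(1 - (Rc/r) * arctan(r/Rc))."""
            return v_inf * np.sqrt(1.0 - (r_c / r) * np.arctan(r / r_c))

        # Hypothetical rotation-curve points (radius in kpc, speed in km/s).
        r = np.array([1, 2, 3, 4, 5, 6, 7, 8], float)
        v = np.array([20, 35, 45, 52, 56, 59, 61, 62], float)
        p, _ = curve_fit(v_iso, r, v, p0=[70.0, 2.0])
        print(f"V_inf ~ {p[0]:.1f} km/s, Rc ~ {p[1]:.2f} kpc")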

  11. KAT-7 Science Verification: Using HI Observations of NGC 3109 to Understand its Kinematics and Mass Distribution

    NASA Astrophysics Data System (ADS)

    Lucero, Danielle M.; Carignan, C.; Hess, K. M.; Frank, B. S.; Randriamampandry, T. H.; Goedhart, S.; Passmoor, S. S.

    2014-01-01

    HI observations of the Magellanic-type spiral NGC 3109, obtained with the seven dish Karoo Array Telescope (KAT-7), are used to analyze its mass distribution. Our results are compared to those obtained using Very Large Array (VLA) data. KAT-7 is a pathfinder of the Square Kilometer Array precursor MeerKAT, which is under construction. The short baselines and low system temperature of the telescope make it sensitive to large-scale, low surface brightness emission. The new observations with KAT-7 allow the measurement of the rotation curve (RC) of NGC 3109 out to 32', doubling the angular extent of existing measurements. A total HI mass of 4.6 × 10⁸ M⊙ is derived, 40% more than what is detected by the VLA observations. The observationally motivated pseudo-isothermal dark matter halo model can reproduce the observed RC very well, but the cosmologically motivated Navarro-Frenk-White DM model gives a much poorer fit to the data. While having a more accurate gas distribution has reduced the discrepancy between the observed RC and the MOdified Newtonian Dynamics (MOND) models, this is done at the expense of having to use unrealistic mass-to-light ratios for the stellar disk and/or very large values for the MOND universal constant a₀. Different distances or HI contents cannot reconcile MOND with the observed kinematics, in view of the small errors on these two quantities. As with many slowly rotating gas-rich galaxies studied recently, the present result for NGC 3109 continues to pose a serious challenge to the MOND theory.

  12. KAT-7 SCIENCE VERIFICATION: USING H I OBSERVATIONS OF NGC 3109 TO UNDERSTAND ITS KINEMATICS AND MASS DISTRIBUTION

    SciTech Connect

    Carignan, C.; Frank, B. S.; Hess, K. M.; Lucero, D. M.; Randriamampandry, T. H.; Goedhart, S.; Passmoor, S. S.

    2013-09-15

    H I observations of the Magellanic-type spiral NGC 3109, obtained with the seven dish Karoo Array Telescope (KAT-7), are used to analyze its mass distribution. Our results are compared to those obtained using Very Large Array (VLA) data. KAT-7 is a pathfinder of the Square Kilometer Array precursor MeerKAT, which is under construction. The short baselines and low system temperature of the telescope make it sensitive to large-scale, low surface brightness emission. The new observations with KAT-7 allow the measurement of the rotation curve (RC) of NGC 3109 out to 32', doubling the angular extent of existing measurements. A total H I mass of 4.6 × 10⁸ M⊙ is derived, 40% more than what is detected by the VLA observations. The observationally motivated pseudo-isothermal dark matter (DM) halo model can reproduce the observed RC very well, but the cosmologically motivated Navarro-Frenk-White DM model gives a much poorer fit to the data. While having a more accurate gas distribution has reduced the discrepancy between the observed RC and the MOdified Newtonian Dynamics (MOND) models, this is done at the expense of having to use unrealistic mass-to-light ratios for the stellar disk and/or very large values for the MOND universal constant a₀. Different distances or H I contents cannot reconcile MOND with the observed kinematics, in view of the small errors on these two quantities. As with many slowly rotating gas-rich galaxies studied recently, the present result for NGC 3109 continues to pose a serious challenge to the MOND theory.

  13. Verification Games: Crowd-Sourced Formal Verification

    DTIC Science & Technology

    2016-03-01

    Under the Crowd-Sourced Formal Verification program, the verification tools developed by the Programming Languages and Software Engineering group were improved. A series of games were developed by the Center for Game Science: Pipe Jam, Traffic Jam, Flow Jam and Paradox. Verification tools and games were integrated to verify...

  14. Frequency Distribution of Second Solid Cancer Locations in Relation to the Irradiated Volume Among 115 Patients Treated for Childhood Cancer

    SciTech Connect

    Diallo, Ibrahima Haddy, Nadia; Adjadj, Elisabeth; Samand, Akhtar; Quiniou, Eric; Chavaudra, Jean; Alziar, Iannis; Perret, Nathalie; Guerin, Sylvie; Lefkopoulos, Dimitri; Vathaire, Florent de

    2009-07-01

    Purpose: To provide better estimates of the frequency distribution of second malignant neoplasm (SMN) sites in relation to previous irradiated volumes, and better estimates of the doses delivered to these sites during radiotherapy (RT) of the first malignant neoplasm (FMN). Methods and Materials: The study focused on 115 patients who developed a solid SMN among a cohort of 4581 individuals. The homemade software package Dos_EG was used to estimate the radiation doses delivered to SMN sites during RT of the FMN. Three-dimensional geometry was used to evaluate the distances between the irradiated volume, for RT delivered to each FMN, and the site of the subsequent SMN. Results: The spatial distribution of SMN relative to the irradiated volumes in our cohort was as follows: 12% in the central area of the irradiated volume, which corresponds to the planning target volume (PTV), 66% in the beam-bordering region (i.e., the area surrounding the PTV), and 22% in regions located more than 5 cm from the irradiated volume. At the SMN site, all dose levels ranging from almost zero to >75 Gy were represented. A peak SMN frequency of approximately 31% was identified in volumes that received <2.5 Gy. Conclusion: A greater volume of tissues receives low or intermediate doses in regions bordering the irradiated volume with modern multiple-beam RT arrangements. These results should be considered for risk-benefit evaluations of RT.

  15. Verification of Anderson superexchange in MnO via magnetic pair distribution function analysis and ab initio theory

    DOE PAGES

    Benjamin A. Frandsen; Brunelli, Michela; Page, Katharine; ...

    2016-05-11

    Here, we present a temperature-dependent atomic and magnetic pair distribution function (PDF) analysis of neutron total scattering measurements of antiferromagnetic MnO, an archetypal strongly correlated transition-metal oxide. The known antiferromagnetic ground-state structure fits the low-temperature data closely with refined parameters that agree with conventional techniques, confirming the reliability of the newly developed magnetic PDF method. The measurements performed in the paramagnetic phase reveal significant short-range magnetic correlations on a ~1 nm length scale that differ substantially from the low-temperature long-range spin arrangement. Ab initio calculations using a self-interaction-corrected local spin density approximation of density functional theory predict magnetic interactions dominated by Anderson superexchange and reproduce the measured short-range magnetic correlations to a high degree of accuracy. Further calculations simulating an additional contribution from a direct exchange interaction show much worse agreement with the data. The Anderson superexchange model for MnO is thus verified by experiment and confirmed by ab initio theory.

  16. Verification of Anderson Superexchange in MnO via Magnetic Pair Distribution Function Analysis and ab initio Theory

    NASA Astrophysics Data System (ADS)

    Frandsen, Benjamin A.; Brunelli, Michela; Page, Katharine; Uemura, Yasutomo J.; Staunton, Julie B.; Billinge, Simon J. L.

    2016-05-01

    We present a temperature-dependent atomic and magnetic pair distribution function (PDF) analysis of neutron total scattering measurements of antiferromagnetic MnO, an archetypal strongly correlated transition-metal oxide. The known antiferromagnetic ground-state structure fits the low-temperature data closely with refined parameters that agree with conventional techniques, confirming the reliability of the newly developed magnetic PDF method. The measurements performed in the paramagnetic phase reveal significant short-range magnetic correlations on a ~1 nm length scale that differ substantially from the low-temperature long-range spin arrangement. Ab initio calculations using a self-interaction-corrected local spin density approximation of density functional theory predict magnetic interactions dominated by Anderson superexchange and reproduce the measured short-range magnetic correlations to a high degree of accuracy. Further calculations simulating an additional contribution from a direct exchange interaction show much worse agreement with the data. The Anderson superexchange model for MnO is thus verified by experimentation and confirmed by ab initio theory.

  17. Modified Kolmogorov-Smirnov, Anderson-Darling, and Cramer-Von Mises Tests for the Pareto Distribution with Unknown Location and Scale Parameters.

    DTIC Science & Technology

    1985-12-01

    The Pareto distribution is widely used in statistical analysis. It is named after Vilfredo Pareto (1848-1923), a Swiss professor of economics who conducted the first extensive studies of income distribution. This thesis (AFIT/GSO/MA/85D-6), by James E. Porter III, Captain, USAF, presented to the Faculty of the School of Engineering of the Air Force Institute of Technology and approved for public release, develops modified Kolmogorov-Smirnov, Anderson-Darling, and Cramer-von Mises tests for the Pareto distribution with unknown location and scale parameters.
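
    The reason modified critical values are needed is that estimating location and scale from the same sample invalidates the standard Kolmogorov-Smirnov tables. A parametric bootstrap, sketched below as a modern stand-in for the thesis's tabulated approach, rebuilds the null distribution of the statistic by refitting on each resample.

        import numpy as np
        from scipy import stats

        def ks_pareto_bootstrap(x, n_boot=2000, seed=0):
            """KS test for a Pareto fit with parameters estimated from the data;
            the null distribution of D is rebuilt by parametric bootstrap."""
            rng = np.random.default_rng(seed)
            b, loc, scale = stats.pareto.fit(x)
            d_obs = stats.kstest(x, "pareto", args=(b, loc, scale)).statistic
            d_boot = np.empty(n_boot)
            for i in range(n_boot):
                xb = stats.pareto.rvs(b, loc, scale, size=len(x), random_state=rng)
                bb, lb, sb = stats.pareto.fit(xb)   # refit on each resample
                d_boot[i] = stats.kstest(xb, "pareto", args=(bb, lb, sb)).statistic
            return d_obs, np.mean(d_boot >= d_obs)  # statistic, bootstrap p-value

        x = stats.pareto.rvs(3.0, loc=0.0, scale=2.0, size=100, random_state=1)
        d, p = ks_pareto_bootstrap(x, n_boot=500)
        print(f"D = {d:.3f}, bootstrap p ~ {p:.2f}")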

  18. Swarm Verification

    NASA Technical Reports Server (NTRS)

    Holzmann, Gerard J.; Joshi, Rajeev; Groce, Alex

    2008-01-01

    Reportedly, supercomputer designer Seymour Cray once said that he would sooner use two strong oxen to plow a field than a thousand chickens. Although this is undoubtedly wise when it comes to plowing a field, it is not so clear for other types of tasks. Model checking problems are of the proverbial "needle in a haystack" type. Such problems can often be parallelized easily. Alas, none of the usual divide-and-conquer methods can be used to parallelize the working of a model checker. Given that it has become easier than ever to gain access to large numbers of computers to perform even routine tasks, it is becoming more and more attractive to find alternative ways to use these resources to speed up model checking tasks. This paper describes one such method, called swarm verification.
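
    The swarm idea, many small diversified searches run in parallel instead of one monolithic search, can be illustrated with a toy sketch. This is not the authors' Spin-based implementation: each worker below runs a depth-bounded DFS with its own random successor ordering over a stand-in state graph.

        import random
        from multiprocessing import Pool

        def successors(state):
            # Toy transition relation standing in for a model's next-state function.
            x, y = state
            return [((x + 1) % 50, y), (x, (y + 1) % 50), ((x * 2 + 1) % 50, y)]

        def random_dfs(seed, start=(0, 0), bad=(13, 37), max_steps=20_000):
            """One swarm member: a step-bounded DFS with randomized successor
            order, so each member explores a different slice of the state space."""
            rng = random.Random(seed)
            stack, seen, steps = [start], {start}, 0
            while stack and steps < max_steps:
                s = stack.pop()
                steps += 1
                if s == bad:
                    return seed, True          # counterexample found
                succ = successors(s)
                rng.shuffle(succ)              # diversification between members
                for t in succ:
                    if t not in seen:
                        seen.add(t)
                        stack.append(t)
            return seed, False

        if __name__ == "__main__":
            with Pool(4) as pool:              # the "swarm": independent searches
                for seed, hit in pool.map(random_dfs, range(8)):
                    print(f"search {seed}: {'reached bad state' if hit else 'no hit'}")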

  19. Physician Location Selection and Distribution. A Bibliography of Relevant Articles, Reports and Data Sources. Health Manpower Policy Discussion Paper Series No. D3.

    ERIC Educational Resources Information Center

    Crane, Stephen C.; Reynolds, Juanita

    This bibliography provides background material on two general issues of how physicians are distributed geographically and how physicians choose a practice location. The report is divided into five major categories of information: overview summary of annotated articles, reference key to location decision factors, reference key to public policy…

  20. Atmospheric aerosols size distribution properties in winter and pre-monsoon over western Indian Thar Desert location

    NASA Astrophysics Data System (ADS)

    Panwar, Chhagan; Vyas, B. M.

    2016-05-01

    The first experimental results over the Indian Thar Desert region concerning the height-integrated aerosol size distribution function for particle sizes between 0.09 and 2 µm, such as the aerosol columnar size distribution (CSD), effective radius (Reff), integrated content of total aerosols (Nt), and columnar content of accumulation and coarse size aerosol particles (Na, size < 0.5 µm; Nc, size between 0.5 and 2 µm), are described, specifically during winter (a period of stable weather conditions and intense anthropogenic pollution activity) and the pre-monsoon (a period of intense dust storms of natural mineral aerosols and unstable atmospheric weather conditions) at Jaisalmer (26.90°N, 69.90°E, 220 m above sea level), located in the central Thar Desert vicinity of western India. The CSD and the other derived aerosol size parameters are retrieved from the average spectral characteristics of the Aerosol Optical Thickness (AOT) over the UV to infrared wavelength spectrum, measured with a Multi-Wavelength solar Radiometer (MWR). The CSD is, in general, bimodal in character rather than uniformly distributed or following a power-law distribution. The observed primary peaks in the CSD plots are around 10¹³ m⁻² μm⁻¹ in the radius range 0.09-0.20 µm during both seasons. In the winter months, secondary peaks with relatively lower CSD values of 10¹⁰ to 10¹¹ m⁻² μm⁻¹ occur within a lower radius range of 0.4 to 0.6 µm. In contrast, in the dust-dominated hot season, a stronger secondary maximum of about 10¹² m⁻² μm⁻¹ is found for bigger particles in the range 0.6 to 1.0 µm, clearly demonstrating the higher loading of larger aerosols in the summer months relative to the lower loading of smaller particles (0.4 to 0.6 µm) in the cold months. Several other interesting features of the changing nature of monthly spectral AOT

  1. Sub-micron particle number size distributions characteristics at an urban location, Kanpur, in the Indo-Gangetic Plain

    NASA Astrophysics Data System (ADS)

    Kanawade, V. P.; Tripathi, S. N.; Bhattu, Deepika; Shamjad, P. M.

    2014-10-01

    We present long-term measurements of sub-micron particle number size distributions (PNSDs) conducted at an urban location, Kanpur, in India, from September 2007 to July 2011. The mean Aitken mode (NAIT), accumulation mode (NACCU), total particle (NTOT), and black carbon (BC) mass concentrations were 12.4 × 10³ cm⁻³, 18.9 × 10³ cm⁻³, 31.9 × 10³ cm⁻³, and 7.96 μg m⁻³, respectively, within the range observed at other urban locations worldwide but much higher than those reported at urban sites in developed nations. The total particle volume concentration appears to be dominated mainly by accumulation mode particles, except during the monsoon months, perhaps due to efficient wet deposition of accumulation mode particles by precipitation. At Kanpur, the diurnal variation of particle number concentrations was very distinct, with maxima during morning and late evening hours and minima during the afternoon hours. This behavior could be attributed to large primary emissions of aerosol particles and the temporal evolution of the planetary boundary layer. A distinct seasonal variation in total particle number and BC mass concentrations was observed, with a maximum in winter and a minimum during the rainy season; however, the Aitken mode particles did not show a clear seasonal fluctuation. The ratio of Aitken to accumulation mode particles, NAIT/NACCU, varied from 0.1 to 14.2, with a maximum during the April to September months, probably suggesting the importance of new particle formation processes and subsequent particle growth. This finding suggests that dedicated long-term measurements of PNSDs (from a few nanometers to one micron) are required to systematically characterize new particle formation over the Indian subcontinent, which has been largely unstudied so far. Contrarily, the low NAIT/NACCU during post-monsoon and winter indicated the dominance of biomass/biofuel burning aerosol emissions at this site.

  2. NW Indian Ocean crustal thickness, micro-continent distribution and ocean-continent transition location from satellite gravity inversion

    NASA Astrophysics Data System (ADS)

    Kusznir, N. J.; Tymms, V.

    2009-04-01

    Satellite gravity anomaly inversion incorporating a lithosphere thermal gravity anomaly correction has been used to determine Moho depth, crustal thickness and lithosphere thinning factor for the NW Indian Ocean and to map ocean-continent transition location (OCT) and micro-continent distribution. Input data is satellite gravity (Sandwell & Smith 1997) and digital bathymetry (Gebco 2003). Crustal thicknesses predicted by gravity inversion under the Seychelles and Mascarenes are in excess of 30 km and form a single micro-continent extending southwards towards Mauritius. Thick crust (> 25 km) offshore SW India is predicted to extend oceanwards under the Lacadive and Maldive Islands and southwards under the Chagos Archipelago. Superposition of illuminated satellite gravity data onto crustal thickness maps from gravity inversion clearly shows pre-separation conjugacy of the thick crust underlying the Chagos and Mascarene Islands. Maps of crustal thickness from gravity inversion show a pronounced discontinuity in crustal thickness between Mauritius-Reunion and the Mascarene Basin which is of Late Cretaceous age and pre-dates recent plume volcanism. Gravity inversion to determine Moho depth and crustal thickness variation is carried out in the 3D spectral domain and incorporates a lithosphere thermal gravity anomaly correction for both oceanic and continental margin lithosphere (Chappell & Kusznir 2008). Failure to incorporate a lithosphere thermal gravity anomaly correction gives a substantial over-estimate of crustal thickness predicted by gravity inversion. The lithosphere thermal model used to predict the lithosphere thermal gravity anomaly correction may be conditioned using magnetic isochron data to provide the age of oceanic lithosphere (Mueller et al. 1997). The resulting crustal thickness determination and the location of the OCT are sensitive to errors in the magnetic isochron data. An alternative method of inverting satellite gravity to give crustal thickness

  3. Simple Syringe Filtration Methods for Reliably Examining Dissolved and Colloidal Trace Element Distributions in Remote Field Locations

    NASA Astrophysics Data System (ADS)

    Shiller, A. M.

    2002-12-01

    Methods for obtaining reliable dissolved trace element samples frequently utilize clean labs, portable laminar flow benches, or other equipment not readily transportable to remote locations. In some cases unfiltered samples can be obtained in a remote location and transported back to a lab for filtration. However, this may not always be possible or desirable. Additionally, methods for obtaining information on colloidal composition are likewise frequently too cumbersome for remote locations as well as being time-consuming. For that reason I have examined clean methods for collecting samples filtered through 0.45 and 0.02 micron syringe filters. With this methodology, only small samples are collected (typically 15 mL). However, with the introduction of the latest generation of ICP-MS's and microflow nebulizers, sample requirements for elemental analysis are much lower than just a few years ago. Thus, a determination of a suite of first row transition elements is frequently readily obtainable with samples of less than 1 mL. To examine the "traditional" (<0.45 micron) dissolved phase, 25 mm diameter polypropylene syringe filters and all polyethylene/polypropylene syringes are utilized. Filters are pre-cleaned in the lab using 40 mL of approx. 1 M HCl followed by a clean water rinse. Syringes are pre-cleaned by leaching with hot 1 M HCl followed by a clean water rinse. Sample kits are packed in polyethylene bags for transport to the field. Results are similar to results obtained using 0.4 micron polycarbonate screen filters, though concentrations may differ somewhat depending on the extent of sample pre-rinsing of the filter. Using this method, a multi-year time series of dissolved metals in a remote Rocky Mountain stream has been obtained. To examine the effect of colloidal material on dissolved metal concentrations, 0.02 micron alumina syringe filters have been utilized. Other workers have previously used these filters for examining colloidal Fe distributions in lake

  4. Generic Verification Protocol for Verification of Online Turbidimeters

    EPA Science Inventory

    This protocol provides generic procedures for implementing a verification test for the performance of online turbidimeters. The verification tests described in this document will be conducted under the Environmental Technology Verification (ETV) Program. Verification tests will...

  5. Verification of VLSI designs

    NASA Technical Reports Server (NTRS)

    Windley, P. J.

    1991-01-01

    In this paper we explore the specification and verification of VLSI designs. The paper focuses on abstract specification and verification of functionality using mathematical logic as opposed to low-level boolean equivalence verification such as that done using BDD's and Model Checking. Specification and verification, sometimes called formal methods, is one tool for increasing computer dependability in the face of an exponentially increasing testing effort.

  6. Columbus pressurized module verification

    NASA Technical Reports Server (NTRS)

    Messidoro, Piero; Comandatore, Emanuele

    1986-01-01

    The baseline verification approach of the COLUMBUS Pressurized Module was defined during the A and B1 project phases. A peculiarity of the verification program is the set of testing requirements derived from the permanent manned presence in space. The model philosophy and the test program have been developed in line with the overall verification concept. Critical areas such as meteoroid protection, heat-pipe radiators and module seals are identified and tested. Verification problem areas are identified and recommendations for the next development phase are proposed.

  7. Verification of VENTSAR

    SciTech Connect

    Simpkins, A.A.

    1995-01-01

    The VENTSAR code is an upgraded and improved version of the VENTX code, which estimates concentrations on or near a building from a release at a nearby location. The code calculates the concentrations either for a given meteorological exceedance probability or for a given stability and wind speed combination. A single building can be modeled which lies in the path of the plume, or a penthouse can be added to the top of the building. Plume rise may also be considered. Release types can be either chemical or radioactive. Downwind concentrations are determined at user-specified incremental distances. This verification report was prepared to demonstrate that VENTSAR is properly executing all algorithms and transferring data. Hand calculations were also performed to ensure proper application of methodologies.
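
    The hand calculations mentioned are typically of the textbook Gaussian-plume form. The sketch below shows such a ground-level centerline estimate; this is a generic formula for illustration, not VENTSAR's building-wake algorithm, and all release parameters are hypothetical.

        import numpy as np

        def plume_centerline(Q, u, sigma_y, sigma_z, H=0.0):
            """Ground-level centerline concentration for a continuous point
            release (textbook Gaussian plume with ground reflection):
            C = Q / (pi * u * sigma_y * sigma_z) * exp(-H^2 / (2 sigma_z^2))."""
            return (Q / (np.pi * u * sigma_y * sigma_z)) * np.exp(-H**2 / (2 * sigma_z**2))

        # Hypothetical release: 1 g/s, 3 m/s wind, dispersion sigmas at ~100 m downwind.
        c = plume_centerline(Q=1.0, u=3.0, sigma_y=8.0, sigma_z=4.6)
        print(f"centerline concentration ~ {c:.2e} g/m^3")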

  8. Candida parapsilosis (sensu lato) isolated from hospitals located in the Southeast of Brazil: Species distribution, antifungal susceptibility and virulence attributes.

    PubMed

    Ziccardi, Mariangela; Souza, Lucieri O P; Gandra, Rafael M; Galdino, Anna Clara M; Baptista, Andréa R S; Nunes, Ana Paula F; Ribeiro, Mariceli A; Branquinha, Marta H; Santos, André L S

    2015-12-01

    Candida parapsilosis (sensu lato), which represents a fungal complex composed of three genetically related species - Candida parapsilosis sensu stricto, Candida orthopsilosis and Candida metapsilosis, has emerged as an important yeast causing fungemia worldwide. The goal of the present work was to assess the prevalence, antifungal susceptibility and production of virulence traits in 53 clinical isolates previously identified as C. parapsilosis (sensu lato) obtained from hospitals located in the Southeast of Brazil. Species forming this fungal complex are physiologically/morphologically indistinguishable; however, polymerase chain reaction followed by restriction fragment length polymorphism of FKS1 gene has solved the identification inaccuracy, revealing that 43 (81.1%) isolates were identified as C. parapsilosis sensu stricto and 10 (18.9%) as C. orthopsilosis. No C. metapsilosis was found. The geographic distribution of these Candida species was uniform among the studied Brazilian States (São Paulo, Rio de Janeiro and Espírito Santo). All C. orthopsilosis and almost all C. parapsilosis sensu stricto (95.3%) isolates were susceptible to amphotericin B, fluconazole, itraconazole, voriconazole and caspofungin. Nevertheless, one C. parapsilosis sensu stricto isolate was resistant to fluconazole and another one was resistant to caspofungin. C. parapsilosis sensu stricto isolates exhibited higher MIC mean values to amphotericin B, fluconazole and caspofungin than those of C. orthopsilosis, while C. orthopsilosis isolates displayed higher MIC mean to itraconazole compared to C. parapsilosis sensu stricto. Identical MIC mean values to voriconazole were measured for these Candida species. All the isolates of both species were able to form biofilm on polystyrene surface. Impressively, biofilm-growing cells of C. parapsilosis sensu stricto and C. orthopsilosis exhibited a considerable resistance to all antifungal agents tested. Pseudohyphae were observed in 67.4% and 80

  9. Shift Verification and Validation

    SciTech Connect

    Pandya, Tara M.; Evans, Thomas M.; Davidson, Gregory G; Johnson, Seth R.; Godfrey, Andrew T.

    2016-09-07

    This documentation outlines the verification and validation of Shift for the Consortium for Advanced Simulation of Light Water Reactors (CASL). Five main types of problems were used for validation: small criticality benchmark problems; full-core reactor benchmarks for light water reactors; fixed-source coupled neutron-photon dosimetry benchmarks; depletion/burnup benchmarks; and full-core reactor performance benchmarks. We compared Shift results to measured data and other simulated Monte Carlo radiation transport code results, and found very good agreement in a variety of comparison measures. These include prediction of critical eigenvalue, radial and axial pin power distributions, rod worth, leakage spectra, and nuclide inventories over a burn cycle. Based on this validation of Shift, we are confident in Shift to provide reference results for CASL benchmarking.

  10. Structural System Identification Technology Verification

    DTIC Science & Technology

    1981-11-01

    USAAVRADCOM-TR-81-D-28, AD-A109181. Structural System Identification Technology Verification, by N. Giansante, A. Berman, W. O. Flannelly, et al. Approved for public release; distribution unlimited. Prepared for the Applied Technology Laboratory, U.S. Army Research and Technology Laboratories (AVRADCOM), Fort Eustis, Va. 23604. Applied Technology Laboratory position statement: The Applied Technology Laboratory has been involved in the development of the Struc...

  11. Environmental Technology Verification Report - Electric Power and Heat Production Using Renewable Biogas at Patterson Farms

    EPA Science Inventory

    The U.S. EPA operates the Environmental Technology Verification program to facilitate the deployment of innovative technologies through performance verification and information dissemination. A technology area of interest is distributed electrical power generation, particularly w...

  12. Light dose verification for pleural PDT.

    PubMed

    Sandell, Julia L; Liang, Xing; Zhu, Timothy

    2012-02-13

    The ability to deliver a uniform light dose in photodynamic therapy (PDT) is critical to treatment efficacy. The current protocol in pleural PDT uses 7 isotropic detectors placed at discrete locations within the pleural cavity to monitor light dose throughout treatment. While effort is made to place the detectors uniformly through the cavity, the measurements do not provide an overall uniform measurement of delivered dose. A real-time infrared (IR) tracking camera is in development to better deliver and monitor a more uniform light distribution during treatment. It has been shown previously that there is good agreement between fluence calculated using IR tracking data and isotropic detector measurements for direct-light phantom experiments. This work presents the results of an extensive phantom study using variable, patient-like geometries and optical properties (both absorption and scattering). Position data are collected with the IR navigation system while light distribution measurements are made concurrently using the aforementioned isotropic detectors. These measurements are compared to fluence calculations made using data from the IR navigation system to verify that our light distribution model is correct and applicable in patient-like settings. The verification of this treatment-planning technique is an important step in bringing real-time fluence monitoring into the clinic for more effective treatment.

  13. Distribution of Foraminifera in the Core Samples of Kollidam and Marakanam Mangrove Locations, Tamil Nadu, Southeast Coast of India

    NASA Astrophysics Data System (ADS)

    Nowshath, M.

    2013-05-01

    In order to study the distribution of Foraminifera in the subsurface sediments of the mangrove environment, two core samples were collected with the help of a PVC corer: i) near the boating house at Pitchavaram, from the Kollidam estuary (C1), and ii) from the backwaters of Marakanam (C2). A total of 25 samples were obtained from the two cores and subjected to standard micropaleontological and sedimentological analyses for the evaluation of different sediment characteristics. Core sample C1 (Pitchavaram) yielded only foraminifera, whereas for core C2 (Marakanam) only the down-core distribution of foraminifera is discussed. The widely utilized classification proposed by Loeblich and Tappan (1987) has been followed in the present study for foraminiferal taxonomy, and accordingly 23 foraminiferal species belonging to 18 genera, 10 families, 8 superfamilies and 4 suborders have been reported and illustrated. The foraminiferal species recorded are characteristic of shallow inner-shelf to marginal-marine settings and tropical in nature. Sedimentological parameters such as CaCO3, organic matter and the sand-silt-clay ratio were estimated, and their down-core distribution is discussed. An attempt has been made to evaluate the favourable substrate for foraminiferal population abundance in the present area of study. From the overall distribution of foraminifera in the samples from the Kollidam estuary (Pitchavaram area) and the Marakanam estuary, it is observed that silty sand and sandy silt, respectively, are the more accommodative substrates for foraminiferal populations. The distribution of foraminifera in the core samples indicates that the sediments were deposited under normal oxygenated environmental conditions.

  14. Software verification and testing

    NASA Technical Reports Server (NTRS)

    1985-01-01

    General procedures for software verification and validation are provided as a guide for managers, programmers, and analysts involved in software development. The verification and validation procedures described are based primarily on testing techniques. Testing refers to the execution of all or part of a software system for the purpose of detecting errors. Planning, execution, and analysis of tests are outlined in this document. Code reading and static analysis techniques for software verification are also described.

  15. Reliable Entanglement Verification

    NASA Astrophysics Data System (ADS)

    Arrazola, Juan; Gittsovich, Oleg; Donohue, John; Lavoie, Jonathan; Resch, Kevin; Lütkenhaus, Norbert

    2013-05-01

    Entanglement plays a central role in quantum protocols. It is therefore important to be able to verify the presence of entanglement in physical systems from experimental data. In the evaluation of these data, the proper treatment of statistical effects requires special attention, as one can never claim to have verified the presence of entanglement with certainty. Recently, increased attention has been paid to the development of proper frameworks to pose and to answer these types of questions. In this work, we apply recent results by Christandl and Renner on reliable quantum state tomography to construct a reliable entanglement verification procedure based on the concept of confidence regions. The statements made do not require the specification of a prior distribution nor the assumption of an independent and identically distributed (i.i.d.) source of states. Moreover, we develop efficient numerical tools that are necessary to employ this approach in practice, rendering the procedure ready to be employed in current experiments. We demonstrate this fact by analyzing the data of an experiment where photonic entangled two-photon states were generated and whose entanglement is verified with the use of an accessible nonlinear witness.

  16. Location and distribution of a receptor for the 987P pilus of Escherichia coli in small intestines.

    PubMed

    Dean, E A; Isaacson, R E

    1985-02-01

    Frozen sections of rabbit or pig small intestines were stained with fluorescein-labeled antibody specific for the 987P receptor isolated from adult rabbit small intestines. The 987P receptor was present along the entire villous surface and in goblet cells in adult rabbits, but only in goblet cells in infant rabbits. In adult rabbits, the receptor was distributed equally in the jejunum and the ileum. Material antigenically similar to the rabbit 987P receptor was demonstrated in goblet cells in neonatal piglet ileum.

  17. Earthquake Forecasting, Validation and Verification

    NASA Astrophysics Data System (ADS)

    Rundle, J.; Holliday, J.; Turcotte, D.; Donnellan, A.; Tiampo, K.; Klein, B.

    2009-05-01

    Techniques for earthquake forecasting are in development using both seismicity data mining methods, as well as numerical simulations. The former rely on the development of methods to recognize patterns in data, while the latter rely on the use of dynamical models that attempt to faithfully replicate the actual fault systems. Testing such forecasts is necessary not only to determine forecast quality, but also to improve forecasts. A large number of techniques to validate and verify forecasts have been developed for weather and financial applications. Many of these have been elaborated in public locations, including, for example, the URL as listed below. Typically, the goal is to test for forecast resolution, reliability and sharpness. A good forecast is characterized by consistency, quality and value. Most, if not all of these forecast verification procedures can be readily applied to earthquake forecasts as well. In this talk, we discuss both methods of forecasting, as well as validation and verification using a number of these standard methods. We show how these test methods might be useful for both fault-based forecasting, a group of forecast methods that includes the WGCEP and simulator-based renewal models, and grid-based forecasting, which includes the Relative Intensity, Pattern Informatics, and smoothed seismicity methods. We find that applying these standard methods of forecast verification is straightforward. Judgments about the quality of a given forecast method can often depend on the test applied, as well as on the preconceptions and biases of the persons conducting the tests.
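
    Standard forecast-verification measures of the kind referred to here are straightforward to compute. The sketch below evaluates a Brier score and a simple reliability table for synthetic probabilistic forecasts; these particular metrics are common examples from the weather-verification literature, not ones the abstract names.

        import numpy as np

        def brier_and_reliability(p, o, bins=10):
            """Brier score and a reliability table for probabilistic forecasts p
            against binary outcomes o (1 = event occurred)."""
            p, o = np.asarray(p, float), np.asarray(o, float)
            bs = np.mean((p - o) ** 2)
            edges = np.linspace(0, 1, bins + 1)
            table = []
            for lo, hi in zip(edges[:-1], edges[1:]):
                m = (p >= lo) & (p < hi)
                if m.any():
                    table.append((lo, hi, p[m].mean(), o[m].mean(), int(m.sum())))
            return bs, table

        rng = np.random.default_rng(2)
        p = rng.uniform(0, 1, 1000)                      # hypothetical forecasts
        o = (rng.uniform(0, 1, 1000) < p).astype(int)    # outcomes consistent with p
        bs, table = brier_and_reliability(p, o)
        print(f"Brier score = {bs:.3f}")
        for lo, hi, fbar, obar, n in table:
            print(f"[{lo:.1f},{hi:.1f}): forecast {fbar:.2f} vs observed {obar:.2f} (n={n})")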

  18. Proton Therapy Verification with PET Imaging

    PubMed Central

    Zhu, Xuping; Fakhri, Georges El

    2013-01-01

    Proton therapy is very sensitive to uncertainties introduced during treatment planning and dose delivery. PET imaging of proton induced positron emitter distributions is the only practical approach for in vivo, in situ verification of proton therapy. This article reviews the current status of proton therapy verification with PET imaging. The different data detecting systems (in-beam, in-room and off-line PET), calculation methods for the prediction of proton induced PET activity distributions, and approaches for data evaluation are discussed. PMID:24312147

  19. 10 CFR 72.79 - Facility information and verification.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... DOC/NRC Form AP-A and associated forms; (b) Shall submit location information described in § 75.11 of this chapter on DOC/NRC Form AP-1 and associated forms; and (c) Shall permit verification thereof...

  20. 10 CFR 72.79 - Facility information and verification.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... DOC/NRC Form AP-A and associated forms; (b) Shall submit location information described in § 75.11 of this chapter on DOC/NRC Form AP-1 and associated forms; and (c) Shall permit verification thereof...

  1. 10 CFR 72.79 - Facility information and verification.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... DOC/NRC Form AP-A and associated forms; (b) Shall submit location information described in § 75.11 of this chapter on DOC/NRC Form AP-1 and associated forms; and (c) Shall permit verification thereof...

  2. 10 CFR 72.79 - Facility information and verification.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... DOC/NRC Form AP-A and associated forms; (b) Shall submit location information described in § 75.11 of this chapter on DOC/NRC Form AP-1 and associated forms; and (c) Shall permit verification thereof...

  3. 10 CFR 72.79 - Facility information and verification.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... DOC/NRC Form AP-A and associated forms; (b) Shall submit location information described in § 75.11 of this chapter on DOC/NRC Form AP-1 and associated forms; and (c) Shall permit verification thereof...

  4. Approaches to wind resource verification

    NASA Technical Reports Server (NTRS)

    Barchet, W. R.

    1982-01-01

    Verification of the regional wind energy resource assessments produced by the Pacific Northwest Laboratory addresses the question: Is the magnitude of the resource given in the assessments truly representative of the area of interest? Approaches using qualitative indicators of wind speed (tree deformation, eolian features), old and new data of opportunity not at sites specifically chosen for their exposure to the wind, and data by design from locations specifically selected to be good wind sites are described. Data requirements and evaluation procedures for verifying the resource are discussed.

  5. Lunar Pickup Ions Observed by ARTEMIS: Spatial and Temporal Distribution and Constraints on Species and Source Locations

    NASA Technical Reports Server (NTRS)

    Halekas, Jasper S.; Poppe, A. R.; Delory, G. T.; Sarantos, M.; Farrell, W. M.; Angelopoulos, V.; McFadden, J. P.

    2012-01-01

    ARTEMIS observes pickup ions around the Moon, at distances of up to 20,000 km from the surface. The observed ions form a plume with a narrow spatial and angular extent, generally seen in a single energy/angle bin of the ESA instrument. Though ARTEMIS has no mass resolution capability, we can utilize the analytically describable characteristics of pickup ion trajectories to constrain the possible ion masses that can reach the spacecraft at the observation location in the correct energy/angle bin. We find that most of the observations are consistent with a mass range of approx. 20-45 amu, with a smaller fraction consistent with higher masses, and very few consistent with masses below 15 amu. With the assumption that the highest fluxes of pickup ions come from near the surface, the observations favor mass ranges of approx. 20-24 and approx. 36-40 amu. Although many of the observations have properties consistent with a surface or near-surface release of ions, some do not, suggesting that at least some of the observed ions have an exospheric source. Of all the proposed sources for ions and neutrals about the Moon, the pickup ion flux measured by ARTEMIS correlates best with the solar wind proton flux, indicating that sputtering plays a key role in either directly producing ions from the surface, or producing neutrals that subsequently become ionized.
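
    The analytically describable trajectories referred to are cycloids: an ion born at rest in the crossed solar-wind electric and magnetic fields both drifts and gyrates, so its energy at a given distance depends on its mass. The sketch below, with hypothetical field values rather than the ARTEMIS-era conditions, shows how arrival energy at 20,000 km constrains candidate masses.

        import numpy as np

        def cycloid(mass_amu, B_nT=5.0, v_sw=400e3, t_max=600.0, dt=0.05):
            """Analytic cycloid of an ion born at rest: B along z, solar wind
            along x, so the convection E field drives an E x B drift of v_sw.
            Returns positions (km) and kinetic energies (eV) along the path."""
            q, amu = 1.602e-19, 1.661e-27
            m = mass_amu * amu
            w = q * (B_nT * 1e-9) / m                 # gyrofrequency (rad/s)
            t = np.arange(0.0, t_max, dt)
            x = v_sw / w * (w * t - np.sin(w * t))    # drift plus gyration
            y = v_sw / w * (1.0 - np.cos(w * t))
            vx = v_sw * (1.0 - np.cos(w * t))
            vy = v_sw * np.sin(w * t)
            E = 0.5 * m * (vx**2 + vy**2) / q          # joules/charge -> eV
            return x / 1e3, y / 1e3, E

        # Which candidate masses reach 20,000 km, and with what energy?
        for m_amu in (16, 23, 40):
            x, y, E = cycloid(m_amu)
            r = np.hypot(x, y)
            k = np.argmin(np.abs(r - 20_000.0))
            print(f"{m_amu} amu: E ~ {E[k]/1e3:.1f} keV at {r[k]:.0f} km from source")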

  6. Investigation of Reflectance Distribution and Trend for the Double Ray Located in the Northwest of Tycho Crater

    NASA Astrophysics Data System (ADS)

    Yi, Eung Seok; Kim, Kyeong Ja; Choi, Yi Re; Kim, Yong Ha; Lee, Sung Soon; Lee, Seung Ryeol

    2015-06-01

    Analysis of lunar samples returned by the US Apollo missions revealed that the lunar highlands consist of anorthosite, plagioclase, pyroxene, and olivine; the lunar maria are composed of materials such as basalt and ilmenite. More recently, the remote sensing approach has reduced the time required to investigate the entire lunar surface, compared to the approach of returning samples; moreover, remote sensing has made it possible to determine the existence of specific minerals and to examine wide areas. In this paper, the reflectance distribution and its trend were investigated for the example of the double ray stretching in parallel lines from the Tycho crater to the third quadrant of Mare Nubium. Basic research and background information for the investigation of lunar surface characteristics are also presented. For this research, instruments aboard the SELenological and ENgineering Explorer (SELENE), a Japanese lunar probe, were used, including the Multiband Imager (MI) of the Lunar Imager / Spectrometer (LISM) suite. The data from these instruments were processed with the image editing and analysis tool ENVI (Exelis Visual Information Solutions).

  7. Perception of drinking water in the Quebec City region (Canada): the influence of water quality and consumer location in the distribution system.

    PubMed

    Turgeon, Steve; Rodriguez, Manuel J; Thériault, Marius; Levallois, Patrick

    2004-04-01

    The purpose of every water utility is to provide consumers with drinking water that is aesthetically acceptable and presents no risk to public health. Several studies have been carried out to analyze people's perception of and attitude toward the drinking water coming from their water distribution systems. The goal of the present study is to investigate the influence of water quality and the geographic location of consumers within a distribution system on consumer perception of tap water. The study is based on data obtained from two surveys carried out in municipalities of the Quebec City area (Canada). Three perception variables were used to study consumer perception: general satisfaction, taste satisfaction and risk perception. Data analysis based on logistic regression indicates that water quality variations and geographic location in the distribution system have a significant impact on consumer perception. This impact appears to be strongly associated with residual chlorine levels. The study also confirms the influence of the socio-economic characteristics of consumers on their perception of drinking water quality.
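
    A perception model of this kind relates a binary survey response to water quality and location covariates through logistic regression. The sketch below shows the mechanics on synthetic data; the covariates (residual chlorine, distance from the treatment plant) and all numbers are assumptions for illustration, not the study's survey data.

      # Minimal sketch of a logistic-regression perception model (synthetic data).
      import numpy as np
      from sklearn.linear_model import LogisticRegression

      rng = np.random.default_rng(0)
      n = 500
      chlorine = rng.uniform(0.1, 1.2, n)      # residual chlorine (mg/L), assumed
      distance = rng.uniform(0.5, 15.0, n)     # km from treatment plant, assumed

      # Synthetic "dissatisfied with taste" outcome: more likely at high chlorine.
      logit = -2.0 + 2.5 * chlorine - 0.05 * distance
      y = rng.random(n) < 1.0 / (1.0 + np.exp(-logit))

      X = np.column_stack([chlorine, distance])
      model = LogisticRegression().fit(X, y)

      # Odds ratios per unit increase of each covariate.
      print(dict(zip(["chlorine", "distance"], np.exp(model.coef_[0]))))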

  8. A statistical study of the spatial distribution of Co-operative UK Twin Located Auroral Sounding System (CUTLASS) backscatter power during EISCAT heater beam-sweeping experiments

    NASA Astrophysics Data System (ADS)

    Shergill, H.; Robinson, T. R.; Dhillon, R. S.; Lester, M.; Milan, S. E.; Yeoman, T. K.

    2010-05-01

    High-power electromagnetic waves can excite a variety of plasma instabilities in Earth's ionosphere. These lead to the growth of plasma waves and plasma density irregularities within the heated volume, including patches of small-scale field-aligned electron density irregularities. This paper reports a statistical study of intensity distributions in patches of these irregularities excited by the European Incoherent Scatter (EISCAT) heater during beam-sweeping experiments. The irregularities were detected by the Co-operative UK Twin Located Auroral Sounding System (CUTLASS) coherent scatter radar located in Finland. During these experiments the heater beam direction is steadily changed from northward to southward pointing. Comparisons are made between statistical parameters of CUTLASS backscatter power distributions and modeled heater beam power distributions provided by the EZNEC version 4 software. In general, good agreement between the statistical parameters and the modeled beam is observed, clearly indicating the direct causal connection between the heater beam and the irregularities, despite the sometimes seemingly unpredictable nature of unaveraged results. The results also give compelling evidence in support of the upper hybrid theory of irregularity excitation.

  9. Fuel Retrieval System Design Verification Report

    SciTech Connect

    GROTH, B.D.

    2000-04-11

    The Fuel Retrieval Subproject was established as part of the Spent Nuclear Fuel Project (SNF Project) to retrieve and repackage the SNF located in the K Basins. The Fuel Retrieval System (FRS) construction work is complete in the KW Basin, and start-up testing is underway. Design modifications and construction planning are also underway for the KE Basin. An independent review of the design verification process as applied to the K Basin projects was initiated in support of preparation for the SNF Project operational readiness review (ORR). A Design Verification Status Questionnaire, Table 1, is included which addresses Corrective Action SNF-EG-MA-EG-20000060, Item No.9 (Miller 2000).

  10. Computation and Analysis of the Global Distribution of the Radioxenon Isotope 133Xe based on Emissions from Nuclear Power Plants and Radioisotope Production Facilities and its Relevance for the Verification of the Nuclear-Test-Ban Treaty

    NASA Astrophysics Data System (ADS)

    Wotawa, Gerhard; Becker, Andreas; Kalinowski, Martin; Saey, Paul; Tuma, Matthias; Zähringer, Matthias

    2010-05-01

    Monitoring of radioactive noble gases, in particular xenon isotopes, is a crucial element of the verification of the Comprehensive Nuclear-Test-Ban Treaty (CTBT). The capability of the noble gas network, which is currently under construction, to detect signals from a nuclear explosion critically depends on the background created by other sources. Therefore, the global distribution of these isotopes based on emissions and transport patterns needs to be understood. A significant xenon background exists in the reactor regions of North America, Europe and Asia. An emission inventory of the four relevant xenon isotopes has recently been created, which specifies source terms for each power plant. As the major emitters of xenon isotopes worldwide, a few medical radioisotope production facilities have been recently identified, in particular the facilities in Chalk River (Canada), Fleurus (Belgium), Pelindaba (South Africa) and Petten (Netherlands). Emissions from these sites are expected to exceed those of the other sources by orders of magnitude. In this study, emphasis is put on 133Xe, which is the most prevalent xenon isotope. First, based on the emissions known, the resulting 133Xe concentration levels at all noble gas stations of the final CTBT verification network were calculated and found to be consistent with observations. Second, it turned out that emissions from the radioisotope facilities can explain a number of observed peaks, meaning that atmospheric transport modelling is an important tool for the categorization of measurements. Third, it became evident that Nuclear Power Plant emissions are more difficult to treat in the models, since their temporal variation is high and not generally reported. Fourth, there are indications that the assumed annual emissions may be underestimated by factors of two to ten, while the general emission patterns seem to be well understood. Finally, it became evident that 133Xe sources mainly influence the sensitivity of the
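
    One reason atmospheric transport modelling can categorize such measurements is that 133Xe decays appreciably over typical long-range transport times: its half-life is about 5.25 days, so the source-to-station travel time leaves a clear imprint on the measured concentration. A minimal decay-correction sketch follows; the 10-day transport time is an assumed example value.

      # Decay factor for Xe-133 (half-life ~5.25 d) over a given transport time.
      import math

      T_HALF_DAYS = 5.25  # Xe-133 half-life

      def decay_factor(transport_days):
          """Fraction of Xe-133 activity surviving after transport_days."""
          return math.exp(-math.log(2.0) * transport_days / T_HALF_DAYS)

      # Example: a plume taking 10 days to reach a noble gas station (assumed)
      print(f"surviving fraction after 10 d: {decay_factor(10.0):.2f}")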

  11. How do wetland type and location affect their hydrological services? - A distributed hydrological modelling study of the contribution of isolated and riparian wetlands

    NASA Astrophysics Data System (ADS)

    Fossey, Maxime; Rousseau, Alain N.; Savary, Stéphane; Royer, Alain

    2015-04-01

    Wetlands play a significant role on the hydrological cycle, reducing peak flows through water storage functions and sustaining low flows through slow release of water. However, their impacts on water resource availability and flood control are mainly driven by wetland types and locations within a watershed. So, despite the general agreement about these major hydrological functions, little is known about their spatial and typological influences. Consequently, assessing the quantitative impact of wetlands on hydrological regimes has become a relevant issue for both the scientific community and the decision-maker community. To investigate the hydrologic response at the watershed scale, mathematical modelling has been a well-accepted framework. Specific isolated and riparian wetland modules were implemented in the PHYSITEL/HYDROTEL distributed hydrological modelling platform to assess the impact of the spatial distribution of isolated and riparian wetlands on the stream flows of the Becancour River watershed, Quebec, Canada. More specifically, the focus was on assessing whether stream flow parameters, including peak flow and low flow, were related to: (i) geographic location of wetlands, (ii) typology of wetlands, and (iii) season of the year. Preliminary results suggest that isolated and riparian wetlands have individual space- and time-dependent impacts on the hydrologic response of the study watershed and provide relevant information for the design of wetland protection and restoration programs.

  12. Voltage verification unit

    DOEpatents

    Martin, Edward J.

    2008-01-15

    A voltage verification unit and method for determining the absence of potentially dangerous potentials within a power supply enclosure without Mode 2 work is disclosed. With this device and method, a qualified worker follows a relatively simple protocol: a function test (hot, cold, hot) of the voltage verification unit before Lock Out/Tag Out and, once the Lock Out/Tag Out is completed, testing or "trying" by simply reading a display on the voltage verification unit, all without exposure of the operator to the interior of the voltage supply enclosure. According to a preferred embodiment, the voltage verification unit includes test leads to allow diagnostics with other meters, without the necessity of accessing potentially dangerous bus bars or the like.

  13. SU-E-J-58: Dosimetric Verification of Metal Artifact Effects: Comparison of Dose Distributions Affected by Patient Teeth and Implants

    SciTech Connect

    Lee, M; Kang, S; Lee, S; Suh, T; Lee, J; Park, J; Park, H; Lee, B

    2014-06-01

    Purpose: Implant-supported dentures seem particularly appropriate for patients who have become edentulous, and cancer patients are no exception. As the number of people of all ages with dental implants increases, critical dosimetric verification of metal artifact effects is required for more accurate head and neck radiation therapy. The purpose of this study is to verify the theoretical analysis of the metal (streak and dark) artifacts and to evaluate the dosimetric effects caused by dental implants in CT images, using a humanoid phantom with patient teeth and implants inserted. Methods: The phantom comprises a cylinder shaped to simulate the anatomical structures of a human head and neck. By applying various clinical cases, a phantom closely resembling a human was made. The developed phantom supports two configurations: (i) closed mouth and (ii) opened mouth. RapidArc plans of 4 cases were created in the Eclipse planning system. A total dose of 2000 cGy in 10 fractions was prescribed to the whole planning target volume (PTV) using 6 MV photon beams. The Acuros XB (AXB) advanced dose calculation algorithm, the Analytical Anisotropic Algorithm (AAA), and the progressive resolution optimizer were used in dose optimization and calculation. Results: In both the closed- and opened-mouth phantoms, because dark artifacts formed extensively around the metal implants, dose variation was relatively higher than for streak artifacts. When the PTV was delineated on the dark regions or large streak artifact regions, a maximum dose error of 7.8% and an average difference of 3.2% were observed. The averaged minimum dose to the PTV predicted by AAA was about 5.6% higher, and OAR doses were also 5.2% higher, compared with AXB. Conclusion: The results of this study showed that AXB dose calculation involving high-density materials is more accurate than AAA calculation, and AXB was superior to AAA in dose predictions beyond the dark artifact/air cavity portion when compared against the measurements.

  14. [Spatial Distribution of Type 2 Diabetes Mellitus in Berlin: Application of a Geographically Weighted Regression Analysis to Identify Location-Specific Risk Groups].

    PubMed

    Kauhl, Boris; Pieper, Jonas; Schweikart, Jürgen; Keste, Andrea; Moskwyn, Marita

    2017-02-16

    Understanding which population groups in which locations are at higher risk for type 2 diabetes mellitus (T2DM) allows efficient and cost-effective interventions that target those risk populations in the specific locations where they are most needed. The goal of this study was to analyze the spatial distribution of T2DM and to identify location-specific, population-based risk factors using global and local spatial regression models. To display the spatial heterogeneity of T2DM, bivariate kernel density estimation was applied. An ordinary least squares (OLS) regression model was applied to identify population-based risk factors of T2DM. A geographically weighted regression (GWR) model was then constructed to analyze the spatially varying association between the identified risk factors and T2DM. T2DM is especially concentrated in the east and on the outskirts of Berlin. The OLS model identified the proportions of persons aged 80 and older, persons without a migration background, long-term unemployment, and households with children as socio-demographic risk groups, along with a negative association with single-parent households. The results of the GWR model point to important local variations in the strength of association between the identified risk factors and T2DM. The risk factors for T2DM depend largely on the socio-demographic composition of the neighborhoods in Berlin, highlighting that a one-size-fits-all approach is not appropriate for the prevention of T2DM. Future prevention strategies should be tailored to target location-specific risk groups.
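
    In a GWR, a separate weighted least-squares fit is performed at every location, with observation weights that decay with distance from that location (commonly a Gaussian kernel, w = exp(-0.5·(d/b)²)). The sketch below illustrates the mechanics on synthetic data; the single covariate, the bandwidth, and the coordinates are assumptions, not the study's calibrated model.

      # Sketch of geographically weighted regression (GWR) with a Gaussian kernel.
      import numpy as np

      rng = np.random.default_rng(1)
      n = 200
      coords = rng.uniform(0, 10, size=(n, 2))          # site coordinates
      x = rng.normal(size=n)                            # one covariate (assumed)
      beta_true = 0.5 + 0.2 * coords[:, 0]              # spatially varying effect
      y = beta_true * x + rng.normal(scale=0.3, size=n)

      X = np.column_stack([np.ones(n), x])              # intercept + covariate

      def gwr_at(u, bandwidth=2.0):
          """Local WLS estimate at location u: (X'WX)^-1 X'Wy, Gaussian weights."""
          d = np.linalg.norm(coords - u, axis=1)
          w = np.exp(-0.5 * (d / bandwidth) ** 2)
          XtW = X.T * w
          return np.linalg.solve(XtW @ X, XtW @ y)

      # Local coefficient estimates at two ends of the study area
      print(gwr_at(np.array([1.0, 5.0])), gwr_at(np.array([9.0, 5.0])))

    The two printed coefficient vectors differ because the simulated effect varies across space, which is exactly the kind of local variation a GWR is designed to expose.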

  15. ENVIRONMENTAL TECHNOLOGY VERIFICATION--FUELCELL ENERGY, INC.: DFC 300A MOLTEN CARBONATE FUEL CELL COMBINED HEAT AND POWER SYSTEM

    EPA Science Inventory

    The U.S. EPA operates the Environmental Technology Verification program to facilitate the deployment of innovative technologies through performance verification and information dissemination. A technology area of interest is distributed electrical power generation, particularly w...

  16. Aerosol mass size distribution and black carbon over a high altitude location in Western Trans-Himalayas: Impact of a dust episode

    NASA Astrophysics Data System (ADS)

    Kompalli, Sobhan Kumar; Krishna Moorthy, K.; Suresh Babu, S.; Manoj, M. R.

    2014-12-01

    The information on aerosol properties from remote locations provides insights into the background and natural conditions against which anthropogenic impacts can be compared. Measurements of the near-surface aerosol mass size distribution at a high altitude remote site help us to understand natural processes, such as the association between Aeolian and fluvial processes, that have a direct bearing on the mass concentrations, especially in the larger size ranges. In the present study, the total mass concentration and mass size distribution of near-surface aerosols, measured using a 10-channel Quartz Crystal Microbalance (QCM) Impactor at a high altitude location, Hanle (32.78°N, 78.95°E, 4520 m asl), in the western Trans-Himalayas, have been used to characterize the composite aerosols. The impact of a highly localized, short-duration dust storm episode on the mass size distribution has also been examined. In general, though the total mass concentration (Mt) remained very low (∼0.75 ± 0.61 μg m-3), interestingly, coarse mode (super-micron) aerosols contributed almost 72 ± 6% of the total aerosol mass loading near the surface. The mass size distribution showed 3 modes: a fine particle mode (∼0.2 μm), an accumulation mode at ∼0.5 μm, and a coarse mode at ∼3 μm. During the localized short-duration dust storm episode, Mt reached as high as ∼13.5 μg m-3, with coarse mode aerosols contributing nearly 90% of it. The mass size distribution changed significantly, with a broad coarse mode such that the accumulation mode became inconspicuous. Concurrent twin-wavelength aethalometer measurements of aerosol black carbon (BC) showed an increase in the wavelength index of absorption from the normal value of ∼1 to 1.5, signifying enhanced absorption at the short wavelength (380 nm) by the dust.
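
    The "wavelength index of absorption" quoted above is the absorption Ångström exponent, obtained from the ratio of absorption coefficients at the instrument's two wavelengths: α = -ln(b₁/b₂)/ln(λ₁/λ₂). A short sketch follows; the 880 nm second channel and the absorption values (chosen so that α ≈ 1.5) are assumptions for illustration.

      # Absorption Angstrom exponent from two-wavelength aethalometer data.
      import math

      def absorption_angstrom_exponent(b1, b2, lam1_nm=380.0, lam2_nm=880.0):
          """alpha = -ln(b1/b2) / ln(lam1/lam2); ~1 for BC, >1 with dust."""
          return -math.log(b1 / b2) / math.log(lam1_nm / lam2_nm)

      # Assumed absorption coefficients (Mm^-1) at 380 and 880 nm
      print(f"alpha = {absorption_angstrom_exponent(12.0, 3.4):.2f}")   # ~1.50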

  17. Modular verification of concurrent systems

    SciTech Connect

    Sobel, A.E.K.

    1986-01-01

    During the last ten years, a number of authors have proposed verification techniques that allow one to prove properties of individual processes by using global assumptions about the behavior of the remaining processes in the distributed program. As a result, one must justify these global assumptions before drawing any conclusions regarding the correctness of the entire program. This justification is often the most difficult part of the proof and presents a serious obstacle to hierarchical program development. This thesis develops a new approach to the verification of concurrent systems. The approach is modular and supports compositional development of programs, since the proofs of each individual process of a program are completely isolated from all others. The generality of this approach is illustrated by applying it to a representative set of contemporary concurrent programming languages, namely CSP, ADA, Distributed Processes, and a shared variable language. In addition, it is also shown how the approach may be used to deal with a number of other constructs that have been proposed for inclusion in concurrent languages: FORK and JOIN primitives, nested monitor calls, path expressions, atomic transactions, and asynchronous message passing. These results support the argument that the approach is universal and can be used to design proof systems for any concurrent language.

  18. Location Privacy

    NASA Astrophysics Data System (ADS)

    Meng, Xiaofeng; Chen, Jidong

    With the rapid development of sensors and wireless mobile devices, it is easy to access mobile users' location information anytime and anywhere. On one hand, location-based services (LBS) are becoming more and more valuable and important. On the other hand, location privacy issues raised by such applications have also gained attention. However, due to the specific nature of location information, traditional privacy-preserving techniques for data publishing cannot be used. In this chapter, we introduce location privacy, analyze the challenges of preserving it, and survey existing work, including system architecture, location anonymity and query processing.

  19. Influence of pH, layer charge location and crystal thickness distribution on U(VI) sorption onto heterogeneous dioctahedral smectite.

    PubMed

    Guimarães, Vanessa; Rodríguez-Castellón, Enrique; Algarra, Manuel; Rocha, Fernando; Bobos, Iuliu

    2016-11-05

    The UO2(2+) adsorption on smectite (samples BA1, PS2 and PS3) with a heterogeneous structure was investigated at pH 4 (I = 0.02 M) and pH 6 (I = 0.2 M) in batch experiments, with the aim of evaluating the influence of pH, layer charge location and crystal thickness distribution. The mean crystal thickness of the smectite crystallites used in the sorption experiments ranges from 4.8 nm (sample PS2) to 5.1 nm (sample PS3) and 7.4 nm (sample BA1). Smaller crystallites have a higher total surface area and sorption capacity, and octahedral charge location favors a higher sorption capacity. The Freundlich, Langmuir and SIPS sorption isotherms were used to model the sorption experiments. The surface complexation and cation exchange reactions were modeled using the PHREEQC code to describe the UO2(2+) sorption on smectite. The amount of UO2(2+) adsorbed on the smectite samples decreased significantly at pH 6 and higher ionic strength, where the sorption mechanism was restricted to the edge sites of the smectite. Two binding energy components at 380.8 ± 0.3 and 382.2 ± 0.3 eV, assigned to hydrated UO2(2+) adsorbed by cation exchange and by inner-sphere complexation on the external sites at pH 4, were identified after deconvolution of the U4f7/2 peak by X-ray photoelectron spectroscopy. In addition, two new binding energy components at 380.3 ± 0.3 and 381.8 ± 0.3 eV, assigned to AlOUO2(+) and SiOUO2(+) surface species, were observed at pH 6.
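
    The three isotherms named above have standard closed forms: Freundlich q = K_F·C^(1/n), Langmuir q = q_max·K·C/(1 + K·C), and SIPS (Langmuir-Freundlich) q = q_max·(K·C)^n/(1 + (K·C)^n). The sketch below fits the Langmuir form to synthetic equilibrium data; the parameter values and units are assumptions for illustration, not the fitted constants from this study.

      # Sketch: fitting a Langmuir isotherm to synthetic sorption data.
      import numpy as np
      from scipy.optimize import curve_fit

      def langmuir(c, q_max, k):
          return q_max * k * c / (1.0 + k * c)

      # Synthetic equilibrium data (assumed units: mmol/L vs mmol/g)
      c = np.linspace(0.05, 2.0, 12)
      rng = np.random.default_rng(2)
      q = langmuir(c, 0.8, 3.0) + rng.normal(scale=0.01, size=c.size)

      (q_max_fit, k_fit), _ = curve_fit(langmuir, c, q, p0=[1.0, 1.0])
      print(f"q_max = {q_max_fit:.2f}, K = {k_fit:.2f}")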

  20. Verification of statistical method CORN for modeling of microfuel in the case of high grain concentration

    SciTech Connect

    Chukbar, B. K.

    2015-12-15

    Two methods of modeling a double-heterogeneity fuel are studied: the deterministic positioning and the statistical method CORN of the MCU software package. The effect of distribution of microfuel in a pebble bed on the calculation results is studied. The results of verification of the statistical method CORN for the cases of the microfuel concentration up to 170 cm⁻³ in a pebble bed are presented. The admissibility of homogenization of the microfuel coating with the graphite matrix is studied. The dependence of the reactivity on the relative location of fuel and graphite spheres in a pebble bed is found.

  1. Verification of statistical method CORN for modeling of microfuel in the case of high grain concentration

    NASA Astrophysics Data System (ADS)

    Chukbar, B. K.

    2015-12-01

    Two methods of modeling a double-heterogeneity fuel are studied: the deterministic positioning and the statistical method CORN of the MCU software package. The effect of distribution of microfuel in a pebble bed on the calculation results is studied. The results of verification of the statistical method CORN for the cases of the microfuel concentration up to 170 cm-3 in a pebble bed are presented. The admissibility of homogenization of the microfuel coating with the graphite matrix is studied. The dependence of the reactivity on the relative location of fuel and graphite spheres in a pebble bed is found.

  2. Explaining Verification Conditions

    NASA Technical Reports Server (NTRS)

    Denney, Ewen; Fischer, Bernd

    2006-01-01

    The Hoare approach to program verification relies on the construction and discharge of verification conditions (VCs) but offers no support to trace, analyze, and understand the VCs themselves. We describe a systematic extension of the Hoare rules by labels so that the calculus itself can be used to build up explanations of the VCs. The labels are maintained through the different processing steps and rendered as natural language explanations. The explanations can easily be customized and can capture different aspects of the VCs; here, we focus on their structure and purpose. The approach is fully declarative and the generated explanations are based only on an analysis of the labels rather than directly on the logical meaning of the underlying VCs or their proofs. Keywords: program verification, Hoare calculus, traceability.
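
    As a concrete reminder of what a VC is (our own toy example, not one from the paper), pushing the postcondition backwards through a two-statement program with the Hoare assignment rule leaves a single implication to discharge:

      \[
        \{x \ge 0\}\ \ x := x + 1;\ \ y := 2x\ \ \{y \ge 2\}
        \quad\leadsto\quad
        \text{VC:}\ \ x \ge 0 \;\Rightarrow\; 2(x+1) \ge 2
      \]

    In the spirit of the labeling approach described above, a label attached to each rule application would let the generated explanation report where the antecedent (the precondition) and the consequent (the postcondition after substituting through both assignments) come from.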

  3. Nuclear disarmament verification

    SciTech Connect

    DeVolpi, A.

    1993-12-31

    Arms control treaties, unilateral actions, and cooperative activities -- reflecting the defusing of East-West tensions -- are causing nuclear weapons to be disarmed and dismantled worldwide. In order to provide for future reductions and to build confidence in the permanency of this disarmament, verification procedures and technologies would play an important role. This paper outlines arms-control objectives, treaty organization, and actions that could be undertaken. For the purposes of this Workshop on Verification, nuclear disarmament has been divided into five topical subareas: Converting nuclear-weapons production complexes, Eliminating and monitoring nuclear-weapons delivery systems, Disabling and destroying nuclear warheads, Demilitarizing or non-military utilization of special nuclear materials, and Inhibiting nuclear arms in non-nuclear-weapons states. This paper concludes with an overview of potential methods for verification.

  4. Voice verification upgrade

    NASA Astrophysics Data System (ADS)

    Davis, R. L.; Sinnamon, J. T.; Cox, D. L.

    1982-06-01

    This contract had two major objectives. The first was to build, test, and deliver to the government an entry control system using speaker verification (voice authentication) as the mechanism for verifying the user's claimed identity. This system included a physical mantrap, with an integral weight scale to prevent more than one user from gaining access with one verification (tailgating). The speaker verification part of the entry control system contained all the updates and embellishments to the algorithm developed earlier for the BISS (Base and Installation Security System) system under contract with the Electronic Systems Division of the USAF. These updates were tested prior to and during the contract on an operational system used at Texas Instruments in Dallas, Texas, for controlling entry to the Corporate Information Center (CIC).

  5. Turbulence Modeling Verification and Validation

    NASA Technical Reports Server (NTRS)

    Rumsey, Christopher L.

    2014-01-01

    steps in the process. Verification insures that the CFD code is solving the equations as intended (no errors in the implementation). This is typically done either through the method of manufactured solutions (MMS) or through careful step-by-step comparisons with other verified codes. After the verification step is concluded, validation is performed to document the ability of the turbulence model to represent different types of flow physics. Validation can involve a large number of test case comparisons with experiments, theory, or DNS. Organized workshops have proved to be valuable resources for the turbulence modeling community in its pursuit of turbulence modeling verification and validation. Workshop contributors using different CFD codes run the same cases, often according to strict guidelines, and compare results. Through these comparisons, it is often possible to (1) identify codes that have likely implementation errors, and (2) gain insight into the capabilities and shortcomings of different turbulence models to predict the flow physics associated with particular types of flows. These are valuable lessons because they help bring consistency to CFD codes by encouraging the correction of faulty programming and facilitating the adoption of better models. They also sometimes point to specific areas needed for improvement in the models. In this paper, several recent workshops are summarized primarily from the point of view of turbulence modeling verification and validation. Furthermore, the NASA Langley Turbulence Modeling Resource website is described. The purpose of this site is to provide a central location where RANS turbulence models are documented, and test cases, grids, and data are provided. The goal of this paper is to provide an abbreviated survey of turbulence modeling verification and validation efforts, summarize some of the outcomes, and give some ideas for future endeavors in this area.

  6. Verification and validation benchmarks.

    SciTech Connect

    Oberkampf, William Louis; Trucano, Timothy Guy

    2007-02-01

    Verification and validation (V&V) are the primary means to assess the accuracy and reliability of computational simulations. V&V methods and procedures have fundamentally improved the credibility of simulations in several high-consequence fields, such as nuclear reactor safety, underground nuclear waste storage, and nuclear weapon safety. Although the terminology is not uniform across engineering disciplines, code verification deals with assessing the reliability of the software coding, and solution verification deals with assessing the numerical accuracy of the solution to a computational model. Validation addresses the physics modeling accuracy of a computational simulation by comparing the computational results with experimental data. Code verification benchmarks and validation benchmarks have been constructed for a number of years in every field of computational simulation. However, no comprehensive guidelines have been proposed for the construction and use of V&V benchmarks. For example, the field of nuclear reactor safety has not focused on code verification benchmarks, but it has placed great emphasis on developing validation benchmarks. Many of these validation benchmarks are closely related to the operations of actual reactors at near-safety-critical conditions, as opposed to being more fundamental-physics benchmarks. This paper presents recommendations for the effective design and use of code verification benchmarks based on manufactured solutions, classical analytical solutions, and highly accurate numerical solutions. In addition, this paper presents recommendations for the design and use of validation benchmarks, highlighting the careful design of building-block experiments, the estimation of experimental measurement uncertainty for both inputs and outputs to the code, validation metrics, and the role of model calibration in validation. It is argued that the understanding of predictive capability of a computational model is built on the level of
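
    To make the manufactured-solutions idea concrete, the following sketch (our own, not from the paper) verifies a second-order finite-difference solver for -u'' = f on [0, 1]: the solution u(x) = sin(πx) is manufactured, the source f = π²·sin(πx) follows analytically, and the observed order of accuracy is computed from the errors on two grids.

      # Method of manufactured solutions: verify a 1-D Poisson solver -u'' = f
      # with manufactured solution u(x) = sin(pi x) on [0, 1], u(0) = u(1) = 0.
      import numpy as np

      def solve_poisson(n):
          h = 1.0 / n
          x = np.linspace(0.0, 1.0, n + 1)[1:-1]          # interior nodes
          f = np.pi**2 * np.sin(np.pi * x)                # manufactured source
          A = (np.diag(np.full(n - 1, 2.0))
               - np.diag(np.ones(n - 2), 1)
               - np.diag(np.ones(n - 2), -1)) / h**2      # central differences
          u = np.linalg.solve(A, f)
          err = np.sqrt(h * np.sum((u - np.sin(np.pi * x)) ** 2))  # discrete L2
          return h, err

      (h1, e1), (h2, e2) = solve_poisson(32), solve_poisson(64)
      print(f"observed order p = {np.log(e1 / e2) / np.log(h1 / h2):.2f}")  # ~2

    An observed order matching the scheme's formal order (here 2) is the code-verification evidence such a benchmark is meant to produce.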

  7. Requirement Assurance: A Verification Process

    NASA Technical Reports Server (NTRS)

    Alexander, Michael G.

    2011-01-01

    Requirement Assurance is an act of requirement verification which assures the stakeholder or customer that a product requirement has produced its "as realized product" and has been verified with conclusive evidence. Product requirement verification answers the question, "did the product meet the stated specification, performance, or design documentation?". In order to ensure the system was built correctly, the practicing system engineer must verify each product requirement using verification methods of inspection, analysis, demonstration, or test. The products of these methods are the "verification artifacts" or "closure artifacts" which are the objective evidence needed to prove the product requirements meet the verification success criteria. Institutional direction is given to the System Engineer in NPR 7123.1A NASA Systems Engineering Processes and Requirements with regards to the requirement verification process. In response, the verification methodology offered in this report meets both the institutional process and requirement verification best practices.

  8. General Environmental Verification Specification

    NASA Technical Reports Server (NTRS)

    Milne, J. Scott, Jr.; Kaufman, Daniel S.

    2003-01-01

    The NASA Goddard Space Flight Center's General Environmental Verification Specification (GEVS) for STS and ELV Payloads, Subsystems, and Components is currently being revised based on lessons learned from GSFC engineering and flight assurance. The GEVS has been used by Goddard flight projects for the past 17 years as a baseline from which to tailor their environmental test programs. A summary of the requirements and updates are presented along with the rationale behind the changes. The major test areas covered by the GEVS include mechanical, thermal, and EMC, as well as more general requirements for planning, tracking of the verification programs.

  9. Voice Verification Upgrade.

    DTIC Science & Technology

    1982-06-01

    to develop speaker verification techniques for use over degraded communication channels -- specifically telephone lines. A test of BISS-type speaker verification technology was performed on a degraded channel, and compensation techniques were then developed.

  10. Interactions of microbial biofilms with toxic trace metals; 2: Prediction and verification of an integrated computer model of lead (II) distribution in the presence of microbial activity

    SciTech Connect

    Hsieh, K.M.; Murgel, G.A.; Lion, L.W.; Shuler, M.L. )

    1994-06-20

    The interfacial interactions of a toxic trace metal, Pb, with a surface modified by a marine film-forming bacterium, Pseudomonas atlantica, were predicted by a structured biofilm model used in conjunction with a chemical speciation model. The validity of the integrated model was tested for batch and continuous operations. Dynamic responses of the biophase due to transient lead concentration increases were also simulated. The reasonable predictions achieved by the model demonstrate its utility in describing trace metal distributions in complex systems where the adsorption properties of inorganic surfaces are modified by adherent bacteria and bacterial production of extracellular polymers.

  11. Alu and L1 sequence distributions in Xq24-q28 and their comparative utility in YAC contig assembly and verification

    SciTech Connect

    Porta, G.; Zucchi, I.; Schlessinger, D.; Hillier, L.; Green, P.; Nowotny, V.; D'Urso, M.

    1993-05-01

    The contents of Alu- and L1-containing TaqI restriction fragments were assessed by Southern blot analyses across YAC contigs already assembled by other means and localized within Xq24-q28. Fingerprinting patterns of YACs in contigs were concordant. Using software based on that of M. V. Olson et al. to analyze digitized data on fragment sizes, fingerprinting itself could establish matches among about 40% of a test group of 435 YACs. At 100-kb resolution, both repetitive elements were found throughout the region, with no apparent enrichment of Alu or L1 in DNA of G compared to that found in R bands. However, consistent with a random overall distribution, delimited regions of up to 100 kb contained clusters of repetitive elements. The local concentrations may help to account for the reported differential hybridization of Alu and L1 probes to segments of metaphase chromosomes. 40 refs., 6 figs., 2 tabs.

  12. Computer Graphics Verification

    NASA Technical Reports Server (NTRS)

    1992-01-01

    Video processing creates technical animation sequences using studio quality equipment to realistically represent fluid flow over space shuttle surfaces, helicopter rotors, and turbine blades. Computer systems co-op Tim Weatherford is shown performing computer graphics verification, as part of a Co-op brochure.

  13. ENVIRONMENTAL TECHNOLOGY VERIFICATION PROGRAM

    EPA Science Inventory

    This presentation will be given at the EPA Science Forum 2005 in Washington, DC. The Environmental Technology Verification Program (ETV) was initiated in 1995 to speed implementation of new and innovative commercial-ready environmental technologies by providing objective, 3rd pa...

  14. FPGA Verification Accelerator (FVAX)

    NASA Technical Reports Server (NTRS)

    Oh, Jane; Burke, Gary

    2008-01-01

    Is verification acceleration possible? Increasing the visibility of the internal nodes of the FPGA results in much faster debug time, and forcing internal signals directly allows a problem condition to be set up very quickly. Is this all? No; this is part of a comprehensive effort to improve the JPL FPGA design and V&V process.

  15. Verification of Anderson superexchange in MnO via magnetic pair distribution function analysis and ab initio theory

    SciTech Connect

    Benjamin A. Frandsen; Brunelli, Michela; Page, Katharine; Uemura, Yasutomo J.; Staunton, Julie B.; Billinge, Simon J. L.

    2016-05-11

    Here, we present a temperature-dependent atomic and magnetic pair distribution function (PDF) analysis of neutron total scattering measurements of antiferromagnetic MnO, an archetypal strongly correlated transition-metal oxide. The known antiferromagnetic ground-state structure fits the low-temperature data closely with refined parameters that agree with conventional techniques, confirming the reliability of the newly developed magnetic PDF method. The measurements performed in the paramagnetic phase reveal significant short-range magnetic correlations on a ~1 nm length scale that differ substantially from the low-temperature long-range spin arrangement. Ab initio calculations using a self-interaction-corrected local spin density approximation of density functional theory predict magnetic interactions dominated by Anderson superexchange and reproduce the measured short-range magnetic correlations to a high degree of accuracy. Further calculations simulating an additional contribution from a direct exchange interaction show much worse agreement with the data. Furthermore, the Anderson superexchange model for MnO is thus verified by experimentation and confirmed by ab initio theory.

  16. Spectroscopic verification of zinc absorption and distribution in the desert plant Prosopis juliflora-velutina (velvet mesquite) treated with ZnO nanoparticles

    PubMed Central

    Hernandez-Viezcas, J.A.; Castillo-Michel, H.; Servin, A.D.; Peralta-Videa, J.R.; Gardea-Torresdey, J.L.

    2012-01-01

    The impact of metal nanoparticles (NPs) on biological systems, especially plants, is still not well understood. The aim of this research was to determine the effects of zinc oxide (ZnO) NPs in velvet mesquite (Prosopis juliflora-velutina). Mesquite seedlings were grown for 15 days in hydroponics with ZnO NPs (10 nm) at concentrations varying from 500 to 4000 mg L−1. Zinc concentrations in roots, stems and leaves were determined by inductively coupled plasma optical emission spectroscopy (ICP-OES). Plant stress was examined by the specific activity of catalase (CAT) and ascorbate peroxidase (APOX); while the biotransformation of ZnO NPs and Zn distribution in tissues was determined by X-ray absorption spectroscopy (XAS) and micro X-ray fluorescence (μXRF), respectively. ICP-OES results showed that Zn concentrations in tissues (2102 ± 87, 1135 ± 56, and 628 ± 130 mg kg−1 d wt in roots, stems, and leaves, respectively) were found at 2000 mg ZnO NPs L−1. Stress tests showed that ZnO NPs increased CAT in roots, stems, and leaves, while APOX increased only in stems and leaves. XANES spectra demonstrated that ZnO NPs were not present in mesquite tissues, while Zn was found as Zn(II), resembling the spectra of Zn(NO3)2. The μXRF analysis confirmed the presence of Zn in the vascular system of roots and leaves in ZnO NP treated plants. PMID:22820414

  17. Improved Verification for Aerospace Systems

    NASA Technical Reports Server (NTRS)

    Powell, Mark A.

    2008-01-01

    Aerospace systems are subject to many stringent performance requirements to be verified with low risk. This report investigates verification planning using conditional approaches vice the standard classical statistical methods, and usage of historical surrogate data for requirement validation and in verification planning. The example used in this report to illustrate the results of these investigations is a proposed mission assurance requirement with the concomitant maximum acceptable verification risk for the NASA Constellation Program Orion Launch Abort System (LAS). This report demonstrates the following improvements: 1) verification planning using conditional approaches vice classical statistical methods results in plans that are more achievable and feasible; 2) historical surrogate data can be used to bound validation of performance requirements; and, 3) incorporation of historical surrogate data in verification planning using conditional approaches produces even less costly and more reasonable verification plans. The procedures presented in this report may produce similar improvements and cost savings in verification for any stringent performance requirement for an aerospace system.

  18. NON-INVASIVE DETERMINATION OF THE LOCATION AND DISTRIBUTION OF FREE-PHASE DENSE NONAQUEOUS PHASE LIQUIDS (DNAPL) BY SEISMIC REFLECTION TECHNIQUES

    SciTech Connect

    Michael G. Waddell; William J. Domoracki; Tom J. Temples; Jerome Eyer

    2001-05-01

    The Earth Sciences and Resources Institute, University of South Carolina is conducting a 14 month proof of concept study to determine the location and distribution of subsurface Dense Nonaqueous Phase Liquid (DNAPL) carbon tetrachloride (CCl4) contamination at the 216-Z-9 crib, 200 West area, Department of Energy (DOE) Hanford Site, Washington by use of two-dimensional high resolution seismic reflection surveys and borehole geophysical data. The study makes use of recent advances in seismic reflection amplitude versus offset (AVO) technology to directly detect the presence of subsurface DNAPL. The proposed techniques are a noninvasive means of site characterization and direct free-phase DNAPL detection. This report covers the results of Task 3 and the change of scope of Tasks 4-6. Task 1 contains site evaluation and seismic modeling studies. The site evaluation consists of identifying and collecting preexisting geological and geophysical information regarding subsurface structure and the presence and quantity of DNAPL. The seismic modeling studies were undertaken to determine the likelihood that an AVO response exists and its probable manifestation. Task 2 is the design and acquisition of 2-D seismic reflection data designed to image areas of probable high concentration of DNAPL. Task 3 is the processing and interpretation of the 2-D data. Tasks 4, 5, and 6 covered the design, acquisition, processing, and interpretation of a three-dimensional (3-D) seismic survey at the Z-9 crib area in the 200 West Area of the Hanford Site.

  19. Motor activity (exploration) and formation of home bases in mice (C57BL/6) influenced by visual and tactile cues: modification of movement distribution, distance, location, and speed.

    PubMed

    Clark, Benjamin J; Hamilton, Derek A; Whishaw, Ian Q

    2006-04-15

    The motor activity of mice in tests of "exploration" is organized. Mice establish home bases, operationally defined as places where they spend long periods of time, near physical objects and nesting material from which they make excursions. This organization raises the question of the extent to which mouse motoric activity is modulated by innate predispositions versus environmental influences. Here the influence of contextual cues (visual and tactile) on the motor activity of C57BL/6 mice was examined: (1) on an open field that had no walls, a partial wall, or a complete wall, (2) in the presence of distinct visual cues, room cues, or in the absence of visual cues (infrared light), and (3) in the presence of configurations of visual and tactile cues. Mice were generally less active in the presence of salient cues and formed home bases near those cues. In addition, movement speed, path distribution, and the number and length of stops were modulated by contextual cues. With repeated tests, mice favored tactile cues over visual cues as their home base locations. Although responses to cues were robust over test days, conditioning to context was generally weak. That the exploratory behavior of mice is affected by experience and context provides insights into performance variability and may prove useful in investigating the genetic and neural influences on mouse behavior.

  20. Pyroclastic Eruptions in a Mars Climate Model: The Effects of Grain Size, Plume Height, Density, Geographical Location, and Season on Ash Distribution

    NASA Astrophysics Data System (ADS)

    Kerber, L. A.; Head, J. W.; Madeleine, J.; Wilson, L.; Forget, F.

    2010-12-01

    Pyroclastic volcanism has played a major role in the geologic history of the planet Mars. In addition to several highland patera features interpreted to be composed of pyroclastic material, there are a number of vast, fine-grained, friable deposits which may have a volcanic origin. The physical processes involved in the explosive eruption of magma, including the nucleation of bubbles, the fragmentation of magma, the incorporation of atmospheric gases, the formation of a buoyant plume, and the fall-out of individual pyroclasts, have been modeled extensively for martian conditions [Wilson, L., J.W. Head (2007), Explosive volcanic eruptions on Mars: Tephra and accretionary lapilli formation, dispersal and recognition in the geologic record, J. Volcanol. Geotherm. Res. 163, 83-97]. We have further developed and expanded this original model in order to take into account differing temperature, pressure, and wind regimes found at different altitudes, at different geographic locations, and during different martian seasons. Using a well-established Mars global circulation model [LMD-GCM, Forget, F., F. Hourdin, R. Fournier, C. Hourdin, O. Talagrand (1999), Improved general circulation models of the martian atmosphere from the surface to above 80 km, J. Geophys. Res. 104, 24,155-24,176] we are able to link the volcanic eruption model of Wilson and Head (2007) to the spatially and temporally dynamic GCM temperature, pressure, and wind profiles to create three-dimensional maps of expected ash deposition on the surface. Here we present results exploring the effects of grain-size distribution, plume height, density of ash, latitude, season, and atmospheric pressure on the areal extent and shape of the resulting ash distribution. Our results show that grain-size distribution and plume height most strongly affect the distance traveled by the pyroclasts from the vent, while latitude and season can have a large effect on the direction in which the pyroclasts travel and the final shape

  1. Microcode Verification Project.

    DTIC Science & Technology

    1980-05-01

    Microcode Verification Project, University of Southern California, Stephen D. ... in the production, testing, and maintenance of Air Force software. This effort was undertaken in response to that goal. The objective of the effort was ... rather than hard wiring, is a recent development in computer technology. Hardware diagnostics do not fulfill testing requirements for these computers.

  2. Robust verification analysis

    NASA Astrophysics Data System (ADS)

    Rider, William; Witkowski, Walt; Kamm, James R.; Wildey, Tim

    2016-02-01

    We introduce a new methodology for inferring the accuracy of computational simulations through the practice of solution verification. We demonstrate this methodology on examples from computational heat transfer, fluid dynamics and radiation transport. Our methodology is suited to both well- and ill-behaved sequences of simulations. Our approach to the analysis of these sequences of simulations incorporates expert judgment into the process directly via a flexible optimization framework, and the application of robust statistics. The expert judgment is systematically applied as constraints to the analysis, and together with the robust statistics guards against over-emphasis on anomalous analysis results. We have named our methodology Robust Verification. Our methodology is based on utilizing multiple constrained optimization problems to solve the verification model in a manner that varies the analysis' underlying assumptions. Constraints applied in the analysis can include expert judgment regarding convergence rates (bounds and expectations) as well as bounding values for physical quantities (e.g., positivity of energy or density). This approach then produces a number of error models, which are then analyzed through robust statistical techniques (median instead of mean statistics). This provides self-contained, data and expert informed error estimation including uncertainties for both the solution itself and order of convergence. Our method produces high quality results for the well-behaved cases relatively consistent with existing practice. The methodology can also produce reliable results for ill-behaved circumstances predicated on appropriate expert judgment. We demonstrate the method and compare the results with standard approaches used for both code and solution verification on well-behaved and ill-behaved simulations.
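
    A minimal illustration of the median-based ingredient of this approach (a sketch under our own simplifying assumptions, not the authors' full constrained-optimization framework): pairwise order-of-convergence estimates p = ln(e_i/e_j)/ln(h_i/h_j) are formed from a sequence of grids and summarized by their median, optionally clipped to expert-judgment bounds.

      # Sketch: robust (median-based) estimate of observed convergence order
      # from grid spacings h and error norms e, with expert-judgment bounds.
      import itertools
      import numpy as np

      h = np.array([0.1, 0.05, 0.025, 0.0125])
      e = np.array([4.1e-3, 1.1e-3, 2.4e-4, 9.0e-5])   # assumed, mildly noisy

      pairs = list(itertools.combinations(range(len(h)), 2))
      p = np.array([np.log(e[i] / e[j]) / np.log(h[i] / h[j]) for i, j in pairs])

      p_med = np.median(p)               # robust against anomalous pairs
      p_med = np.clip(p_med, 1.0, 3.0)   # expert bounds on the order (assumed)
      print(f"pairwise estimates: {np.round(p, 2)}, robust order: {p_med:.2f}")

    The median resists the pull of the anomalous pair estimates that a least-squares fit would average in, which is the point of using robust statistics here.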

  3. Robust verification analysis

    SciTech Connect

    Rider, William; Witkowski, Walt; Kamm, James R.; Wildey, Tim

    2016-02-15

    We introduce a new methodology for inferring the accuracy of computational simulations through the practice of solution verification. We demonstrate this methodology on examples from computational heat transfer, fluid dynamics and radiation transport. Our methodology is suited to both well- and ill-behaved sequences of simulations. Our approach to the analysis of these sequences of simulations incorporates expert judgment into the process directly via a flexible optimization framework, and the application of robust statistics. The expert judgment is systematically applied as constraints to the analysis, and together with the robust statistics guards against over-emphasis on anomalous analysis results. We have named our methodology Robust Verification. Our methodology is based on utilizing multiple constrained optimization problems to solve the verification model in a manner that varies the analysis' underlying assumptions. Constraints applied in the analysis can include expert judgment regarding convergence rates (bounds and expectations) as well as bounding values for physical quantities (e.g., positivity of energy or density). This approach then produces a number of error models, which are then analyzed through robust statistical techniques (median instead of mean statistics). This provides self-contained, data and expert informed error estimation including uncertainties for both the solution itself and order of convergence. Our method produces high quality results for the well-behaved cases relatively consistent with existing practice. The methodology can also produce reliable results for ill-behaved circumstances predicated on appropriate expert judgment. We demonstrate the method and compare the results with standard approaches used for both code and solution verification on well-behaved and ill-behaved simulations.

  4. RESRAD-BUILD verification.

    SciTech Connect

    Kamboj, S.; Yu, C.; Biwer, B. M.; Klett, T.

    2002-01-31

    The results generated by the RESRAD-BUILD code (version 3.0) were verified with hand or spreadsheet calculations using equations given in the RESRAD-BUILD manual for different pathways. For verification purposes, different radionuclides--H-3, C-14, Na-22, Al-26, Cl-36, Mn-54, Co-60, Au-195, Ra-226, Ra-228, Th-228, and U-238--were chosen to test all pathways and models. Tritium, Ra-226, and Th-228 were chosen because of the special tritium and radon models in the RESRAD-BUILD code. Other radionuclides were selected to represent a spectrum of radiation types and energies. Verification of the RESRAD-BUILD code was conducted with an initial check of all the input parameters for correctness against their original source documents. Verification of the calculations was performed external to the RESRAD-BUILD code with Microsoft® Excel to verify all the major portions of the code. In some cases, RESRAD-BUILD results were compared with those of external codes, such as MCNP (Monte Carlo N-particle) and RESRAD. The verification was conducted on a step-by-step basis and used different test cases as templates. The following types of calculations were investigated: (1) source injection rate, (2) air concentration in the room, (3) air particulate deposition, (4) radon pathway model, (5) tritium model for volume source, (6) external exposure model, (7) different pathway doses, and (8) time dependence of dose. Some minor errors were identified in version 3.0; these errors have been corrected in later versions of the code. Some possible improvements in the code were also identified.

  5. Visual inspection for CTBT verification

    SciTech Connect

    Hawkins, W.; Wohletz, K.

    1997-03-01

    On-site visual inspection will play an essential role in future Comprehensive Test Ban Treaty (CTBT) verification. Although seismic and remote sensing techniques are the best understood and most developed methods for detection of evasive testing of nuclear weapons, visual inspection can greatly augment the certainty and detail of understanding provided by these more traditional methods. Not only can visual inspection offer "ground truth" in cases of suspected nuclear testing, but it also can provide accurate source location and testing media properties necessary for detailed analysis of seismic records. For testing in violation of the CTBT, an offending party may attempt to conceal the test, which most likely will be achieved by underground burial. While such concealment may not prevent seismic detection, evidence of test deployment, location, and yield can be disguised. In this light, if a suspicious event is detected by seismic or other remote methods, visual inspection of the event area is necessary to document any evidence that might support a claim of nuclear testing and provide data needed to further interpret seismic records and guide further investigations. However, the methods for visual inspection are not widely known nor appreciated, and experience is presently limited. Visual inspection can be achieved by simple, non-intrusive means, primarily geological in nature, and it is the purpose of this report to describe the considerations, procedures, and equipment required to field such an inspection.

  6. Assessment of total and organic vanadium levels and their bioaccumulation in edible sea cucumbers: tissues distribution, inter-species-specific, locational differences and seasonal variations.

    PubMed

    Liu, Yanjun; Zhou, Qingxin; Xu, Jie; Xue, Yong; Liu, Xiaofang; Wang, Jingfeng; Xue, Changhu

    2016-02-01

    The objective of this study is to investigate the levels, inter-species differences, locational differences and seasonal variations of vanadium in sea cucumbers, and to further validate several potential factors controlling the distribution of metals in sea cucumbers. Vanadium levels were evaluated in samples of edible sea cucumbers and were demonstrated to differ across seasons, species and sampling sites. High vanadium concentrations were measured in the sea cucumbers, and all of the vanadium detected was in an organic form. Mean vanadium concentrations were considerably higher in the blood (sea cucumber) than in the other studied tissues. The highest concentration of vanadium (2.56 μg g(-1)), as well as a higher proportion of organic vanadium (85.5 %), was observed in the Holothuria scabra samples compared with all other samples. Vanadium levels in Apostichopus japonicus from Bohai Bay and the Yellow Sea show marked seasonal variations. Average values of 1.09 μg g(-1) of total vanadium and 0.79 μg g(-1) of organic vanadium were obtained in various species of sea cucumbers. Significant positive correlations between vanadium in the seawater and organic vanadium in the sea cucumber (r = 81.67 %, p = 0.00), as well as between vanadium in the sediment and organic vanadium in the sea cucumber (r = 77.98 %, p = 0.00), were observed. Vanadium concentrations depend on the season (salinity, temperature), species, sampling site and seawater environment (seawater, sediment). Given the adverse toxicological effects of inorganic vanadium and its positive role in controlling the development of diabetes in humans, a regular monitoring programme for vanadium content in edible sea cucumbers can be recommended.

  7. Verification of ICESat-2/ATLAS Science Receiver Algorithm Onboard Databases

    NASA Astrophysics Data System (ADS)

    Carabajal, C. C.; Saba, J. L.; Leigh, H. W.; Magruder, L. A.; Urban, T. J.; Mcgarry, J.; Schutz, B. E.

    2013-12-01

    NASA's ICESat-2 mission will fly the Advanced Topographic Laser Altimetry System (ATLAS) instrument on a 3-year mission scheduled to launch in 2016. ATLAS is a single-photon detection system transmitting at 532 nm with a laser repetition rate of 10 kHz and a 6-spot pattern on the Earth's surface. A set of onboard Receiver Algorithms will perform signal processing to reduce the data rate and data volume to acceptable levels. These algorithms distinguish surface echoes from the background noise, limit the daily data volume, and allow the instrument to telemeter only a small vertical region about the signal. For this purpose, three onboard databases are used: a Surface Reference Map (SRM), a Digital Elevation Model (DEM), and Digital Relief Maps (DRMs). The DEM provides minimum and maximum heights that limit the signal search region of the onboard algorithms, including a margin for errors in the source databases and onboard geolocation. Since the surface echoes will be correlated while noise will be randomly distributed, the signal location is found by histogramming the received event times and identifying the histogram bins with statistically significant counts. Once the signal location has been established, the onboard DRMs are used to determine the vertical width of the telemetry band about the signal. The University of Texas Center for Space Research (UT-CSR) is developing the ICESat-2 onboard databases, which are currently being tested using preliminary versions and equivalent representations of elevation ranges and relief more recently developed at Goddard Space Flight Center (GSFC). Global and regional elevation models have been assessed in terms of their accuracy using ICESat geodetic control, and have been used to develop equivalent representations of the onboard databases for testing against the UT-CSR databases, with special emphasis on the ice sheet regions. A series of verification checks have been implemented, including
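
    The histogram-based signal finding described above can be illustrated with a short sketch (ours; the bin width, event rates, and Poisson significance threshold are assumptions rather than the flight algorithm's tuned parameters): photon event times are binned, the expected background count per bin is estimated, and bins whose counts are improbably large under a Poisson background are flagged as surface signal.

      # Sketch: flag histogram bins whose photon counts are improbably large
      # under a Poisson background -- a stand-in for the onboard signal finder.
      import numpy as np
      from scipy.stats import poisson

      rng = np.random.default_rng(3)
      noise = rng.uniform(0.0, 1000.0, 5000)      # background event times (us)
      echoes = rng.normal(500.0, 3.0, 400)        # surface echoes near 500 us
      times = np.concatenate([noise, echoes])

      bins = np.arange(0.0, 1001.0, 10.0)         # 10 us bins (assumed width)
      counts, _ = np.histogram(times, bins)

      lam = np.median(counts)                     # background mean per bin
      p_vals = poisson.sf(counts - 1, lam)        # P(N >= count | background)
      signal_bins = np.flatnonzero(p_vals < 1e-6)
      print("signal bin start times (us):", bins[signal_bins])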

  8. Quantum money with classical verification

    SciTech Connect

    Gavinsky, Dmitry

    2014-12-04

    We propose and construct a quantum money scheme that allows verification through classical communication with a bank. This is the first demonstration that a secure quantum money scheme exists that does not require quantum communication for coin verification. Our scheme is secure against adaptive adversaries; this property is not directly related to the possibility of classical verification, but none of the earlier quantum money constructions is known to possess it.

  9. Quantum money with classical verification

    NASA Astrophysics Data System (ADS)

    Gavinsky, Dmitry

    2014-12-01

    We propose and construct a quantum money scheme that allows verification through classical communication with a bank. This is the first demonstration that a secure quantum money scheme exists that does not require quantum communication for coin verification. Our scheme is secure against adaptive adversaries; this property is not directly related to the possibility of classical verification, but none of the earlier quantum money constructions is known to possess it.

  10. Hardware verification of distributed/adaptive control

    NASA Technical Reports Server (NTRS)

    Eldred, D. B.; Schaechter, D. B.

    1983-01-01

    Adaptive control techniques are studied for their future application to the control of large space structures, where uncertain or changing parameters may destabilize standard control system designs. The approach used is to examine an extended Kalman filter estimator, in which the state vector is augmented with the unknown parameters. The associated Riccati equation is linearized about the case of exact knowledge of the parameters. By assuming that parameter variations occur slowly, the filter complexity is further reduced. Simulations on a two degree-of-freedom oscillator demonstrate the parameter-tracking capability of the filter, and an implementation on the JPL Flexible Beam Facility using an incorrect model shows the adaptive filter/optimal control to be stable where a standard Kalman filter/optimal control design is unstable.
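
    The augmented-state idea is compact enough to sketch. Below, a one degree-of-freedom oscillator's state is extended with its unknown stiffness k, which an extended Kalman filter then tracks alongside position and velocity. This is a generic illustration of the technique, not the JPL filter; the Euler discretization and the noise parameters q and r are assumptions.

```python
import numpy as np

def ekf_step(x, P, z, dt, q, r):
    """One predict/update cycle of an extended Kalman filter whose state
    [position, velocity, stiffness k] augments the oscillator state with
    an unknown parameter, assumed to vary slowly."""
    pos, vel, k = x
    # Nonlinear process model: Euler integration of x'' = -k * x.
    x_pred = np.array([pos + dt * vel,
                       vel - dt * k * pos,
                       k])                       # parameter held nearly constant
    # Jacobian of the process model with respect to the augmented state.
    F = np.array([[1.0,      dt,   0.0],
                  [-dt * k,  1.0, -dt * pos],
                  [0.0,      0.0,  1.0]])
    P_pred = F @ P @ F.T + q * np.eye(3)

    # Measurement: position only.
    H = np.array([[1.0, 0.0, 0.0]])
    y = z - H @ x_pred                           # innovation
    S = H @ P_pred @ H.T + r                     # innovation covariance
    K = P_pred @ H.T / S                         # Kalman gain
    x_new = x_pred + (K * y).ravel()
    P_new = (np.eye(3) - K @ H) @ P_pred
    return x_new, P_new
```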

  11. Systematic analysis of biological and physical limitations of proton beam range verification with offline PET/CT scans

    NASA Astrophysics Data System (ADS)

    Knopf, A.; Parodi, K.; Bortfeld, T.; Shih, H. A.; Paganetti, H.

    2009-07-01

    The clinical use of offline positron emission tomography/computed tomography (PET/CT) scans for proton range verification is currently under investigation at the Massachusetts General Hospital (MGH). Validation is achieved by comparing measured activity distributions, acquired in patients after receiving one fraction of proton irradiation, with corresponding Monte Carlo (MC) simulated distributions. Deviations between measured and simulated activity distributions can either reflect errors during the treatment chain from planning to delivery, or they can be caused by various inherent challenges of the offline PET/CT verification method. We performed a systematic analysis to assess the impact of the following aspects on the feasibility and accuracy of the offline PET/CT method: (1) biological washout processes, (2) patient motion, (3) Hounsfield unit (HU) based tissue classification for the simulation of the activity distributions and (4) tumor site specific aspects. It was found that the spatial reproducibility of the measured activity distributions is within 1 mm. However, the feasibility of range verification is restricted to a limited number of positions and tumor sites. Washout effects introduce discrepancies between the measured and simulated ranges of about 4 mm at positions where the proton beam stops in soft tissue. Motion causes spatial deviations of up to 3 cm between measured and simulated activity distributions in abdominopelvic tumor cases. In these latter cases, the MC simulated activity distributions were found to be limited to about 35% accuracy in absolute values and about 2 mm in spatial accuracy, depending on the accuracy of the conversion of HUs into the physical and biological parameters of the irradiated tissue. In addition, for further specific tumor locations, the beam arrangement, the limited accuracy of rigid co-registration and organ movements can prevent the success of PET/CT range verification. All the addressed factors explain why the proton beam range can
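
    One simple way to quantify range agreement between a measured and a simulated activity profile is to compare the depths of their distal falloff positions, e.g. at the 50% level. The sketch below is a simplified stand-in for the paper's analysis; the function and parameter names are ours.

```python
import numpy as np

def distal_falloff_depth(depth, activity, level=0.5):
    """Depth at which a 1D activity profile falls to `level` of its maximum
    on the distal side, found by linear interpolation between samples."""
    a = activity / activity.max()
    i_max = int(np.argmax(a))
    # First index past the maximum where the profile drops below the level.
    below = np.flatnonzero(a[i_max:] < level)
    if below.size == 0:
        raise ValueError("profile never falls below the requested level")
    j = i_max + below[0]
    d0, d1 = depth[j - 1], depth[j]
    a0, a1 = a[j - 1], a[j]
    return d0 + (a0 - level) * (d1 - d0) / (a0 - a1)

# Range difference between measured and MC-simulated profiles along depth z:
# shift = distal_falloff_depth(z, measured) - distal_falloff_depth(z, simulated)
```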

  12. Production readiness verification testing

    NASA Technical Reports Server (NTRS)

    James, A. M.; Bohon, H. L.

    1980-01-01

    A Production Readiness Verification Testing (PRVT) program has been established to determine whether structures fabricated from advanced composites can be committed on a production basis to commercial airline service. The program utilizes subcomponents which reflect the variabilities in structure that can realistically be expected from current production and quality control technology to estimate the production qualities, variation in static strength, and durability of advanced composite structures. The results of the static tests, and a durability assessment after one year of continuous load/environment testing of twenty-two duplicates of each of two structural components (a segment of the front spar and cover of a vertical stabilizer box structure), are discussed.

  13. Automating engineering verification in ALMA subsystems

    NASA Astrophysics Data System (ADS)

    Ortiz, José; Castillo, Jorge

    2014-08-01

    The Atacama Large Millimeter/submillimeter Array is an interferometer comprising 66 individual high-precision antennas located at over 5000 meters altitude in the north of Chile. Several complex electronic subsystems need to be meticulously tested at different stages of an antenna commissioning, both independently and when integrated together. First subsystem integration takes place at the Operations Support Facilities (OSF), at an altitude of 3000 meters. Second integration occurs at the high-altitude Array Operations Site (AOS), where combined performance with the Central Local Oscillator (CLO) and the Correlator is also assessed. In addition, there are several other events requiring complete or partial verification of instrument specifications compliance, such as parts replacements, calibration, relocation within the AOS, preventive maintenance and troubleshooting due to poor performance in scientific observations. Restricted engineering time allocation and the constant pressure of minimizing downtime in a 24/7 astronomical observatory impose the need to complete (and report) the aforementioned verifications in the least possible time. Array-wide disturbances, such as global power interruptions and the following recovery, generate the added challenge of executing this checkout on multiple antenna elements at once. This paper presents the outcome of the automation of engineering verification setup, execution, notification and reporting in ALMA, and how these efforts have resulted in a dramatic reduction of both the time and the operator training required. Signal Path Connectivity (SPC) checkout is introduced as a notable case of such automation.

  14. Software Verification and Validation Procedure

    SciTech Connect

    Olund, Thomas S.

    2008-09-15

    This Software Verification and Validation procedure provides the action steps for the Tank Waste Information Network System (TWINS) testing process. The primary objective of the testing process is to provide assurance that the software functions as intended, and meets the requirements specified by the client. Verification and validation establish the primary basis for TWINS software product acceptance.

  15. HDL to verification logic translator

    NASA Technical Reports Server (NTRS)

    Gambles, J. W.; Windley, P. J.

    1992-01-01

    The increasingly higher number of transistors possible in VLSI circuits compounds the difficulty of ensuring correct designs. As the number of test cases required to exhaustively simulate a circuit design explodes, a better method is required to confirm the absence of design faults. Formal verification methods provide a way to prove, using logic, that a circuit structure correctly implements its specification. Before verification is accepted by VLSI design engineers, the stand-alone verification tools in use in the research community must be integrated with the CAD tools used by the designers. One problem facing the acceptance of formal verification into circuit design methodology is that the structural circuit descriptions used by the designers are not appropriate for verification work, and those required for verification lack some of the features needed for design. We offer a solution to this dilemma: an automatic translation from the designers' HDL models into definitions for the higher-order logic (HOL) verification system. The translated definitions become the low-level basis of circuit verification, which in turn increases the designer's confidence in the correctness of higher-level behavioral models.

  16. Deductive Verification of Cryptographic Software

    NASA Technical Reports Server (NTRS)

    Almeida, Jose Barcelar; Barbosa, Manuel; Pinto, Jorge Sousa; Vieira, Barbara

    2009-01-01

    We report on the application of an off-the-shelf verification platform to the RC4 stream cipher cryptographic software implementation (as available in the OpenSSL library), and introduce a deductive verification technique based on self-composition for proving the absence of error propagation.
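
    Self-composition reduces a two-run property, such as the absence of error propagation, to a property of a single composed program: the code is paired with a renamed copy of itself and the two runs are compared. The sketch below conveys the idea in a testing style over enumerated input pairs; the paper instead discharges the corresponding proof obligation deductively, and all names here are illustrative.

```python
from itertools import product

def self_composition_check(program, inputs, vary):
    """Check a 2-safety property by running `program` on pairs of inputs
    that agree everywhere except on the `vary` field: if every such pair
    yields identical outputs, a fault injected into that field does not
    propagate to the output."""
    for base, alt in product(inputs, repeat=2):
        # Consider only pairs that differ at most in the varied field.
        if all(base[k] == alt[k] for k in base if k != vary):
            if program(base) != program(alt):
                return False   # the varied field influenced the output
    return True
```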

  17. TFE Verification Program

    SciTech Connect

    Not Available

    1993-05-01

    The objective of the semiannual progress report is to summarize the technical results obtained during the latest reporting period. The information presented herein will include evaluated test data, design evaluations, the results of analyses and the significance of results. The program objective is to demonstrate the technology readiness of a TFE (thermionic fuel element) suitable for use as the basic element in a thermionic reactor with electric power output in the 0.5 to 5.0 MW(e) range, and a full-power life of 7 years. The TFE Verification Program builds directly on the technology and data base developed in the 1960s and early 1970s in an AEC/NASA program, and in the SP-100 program conducted in 1983, 1984 and 1985. In the SP-100 program, the attractive features of thermionic power conversion technology were recognized but concern was expressed over the lack of fast reactor irradiation data. The TFE Verification Program addresses this concern.

  18. Cleanup Verification Package for the 118-F-6 Burial Ground

    SciTech Connect

    H. M. Sulloway

    2008-10-02

    This cleanup verification package documents completion of remedial action for the 118-F-6 Burial Ground located in the 100-FR-2 Operable Unit of the 100-F Area on the Hanford Site. The trenches received waste from the 100-F Experimental Animal Farm, including animal manure, animal carcasses, laboratory waste, plastic, cardboard, metal, and concrete debris as well as a railroad tank car.

  19. Approaches to wind-resource verification

    SciTech Connect

    Barchet, W.R.

    1981-07-01

    Verification of the regional wind energy resource assessments produced by the Pacific Northwest Laboratory addresses the question: is the magnitude of the resource given in the assessments truly representative of the area of interest? Approaches using qualitative indicators of wind speed (tree deformation, eolian features), old and new data of opportunity not at sites specifically chosen for their exposure to the wind, and data by design from locations specifically selected to be good wind sites are described. Data requirements and evaluation procedures for verifying the resource are discussed.

  20. Cancelable face verification using optical encryption and authentication.

    PubMed

    Taheri, Motahareh; Mozaffari, Saeed; Keshavarzi, Parviz

    2015-10-01

    In a cancelable biometric system, each instance of enrollment is distorted by a transform function, and the output should not be retransformable to the original data. This paper presents a new cancelable face verification system in the encrypted domain. Encrypted facial images are generated by a double random phase encoding (DRPE) algorithm using two keys (RPM1 and RPM2). To make the system noninvertible, a photon counting (PC) method is utilized, which requires a photon distribution mask (PDM) for information reduction. Verification of sparse images that are not recognizable by direct visual inspection is performed by an unconstrained minimum average correlation energy filter. In the proposed method, the encryption keys (RPM1, RPM2, and PDM) are used on the sender side, and the receiver needs only encrypted images and correlation filters. In this manner, the system preserves privacy even if the correlation filters are obtained by an adversary. Performance of the PC-DRPE verification system is evaluated under illumination variation, pose changes, and facial expression. Experimental results show that utilizing encrypted images not only strengthens security but also enhances verification performance. This improvement can be attributed to the fact that, in the proposed system, the face verification problem is converted to a key verification task.
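
    The two encryption-side ingredients named above, double random phase encoding and photon counting, both have textbook forms that can be sketched briefly. The snippet below is a generic illustration under simple assumptions (uniform random phase masks, a multinomial photon allocation), not the authors' exact pipeline, and it omits the correlation-filter verification stage.

```python
import numpy as np

def drpe_encrypt(img, rpm1, rpm2):
    """Double random phase encoding: apply one random phase mask in the
    spatial domain and a second in the Fourier domain, then invert.
    rpm1/rpm2 hold uniform random phases in [0, 1)."""
    field = img * np.exp(2j * np.pi * rpm1)
    spectrum = np.fft.fft2(field) * np.exp(2j * np.pi * rpm2)
    return np.fft.ifft2(spectrum)

def photon_count(intensity, n_photons, rng):
    """Photon-counting information reduction: keep a sparse image by
    allocating a limited photon budget in proportion to pixel intensity."""
    p = intensity / intensity.sum()
    return rng.multinomial(n_photons, p.ravel()).reshape(intensity.shape)

rng = np.random.default_rng(1)
img = rng.random((64, 64))                       # stand-in facial image
rpm1, rpm2 = rng.random((64, 64)), rng.random((64, 64))
encrypted = drpe_encrypt(img, rpm1, rpm2)
sparse = photon_count(np.abs(encrypted) ** 2, n_photons=2000, rng=rng)
```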

  1. Trajectory Based Behavior Analysis for User Verification

    NASA Astrophysics Data System (ADS)

    Pao, Hsing-Kuo; Lin, Hong-Yi; Chen, Kuan-Ta; Fadlil, Junaidillah

    Many of our activities on computers require a verification step for authorized access. The goal of verification is to tell apart the true account owner from intruders. We propose a general approach for user verification based on user trajectory inputs. The approach is labor-free for users and is likely to avoid possible copying or simulation by other non-authorized users, or even by automatic programs like bots. Our study focuses on finding the hidden patterns embedded in the trajectories produced by account users. We employ a Markov chain model with Gaussian distributions in its transitions to describe the behavior in the trajectory. To distinguish between two trajectories, we propose a novel dissimilarity measure combined with a manifold-learned tuning for capturing the pairwise relationship. Based on the pairwise relationship, we can plug in any effective classification or clustering method for the detection of unauthorized access. The method can also be applied to the task of recognition, predicting the trajectory type without a pre-defined identity. Given a trajectory input, the results show that the proposed method can accurately verify the user identity, or suggest who owns the trajectory if the input identity is not provided.
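
    A stripped-down version of the generative idea, a single Gaussian over trajectory steps, can be sketched as follows. It omits the paper's Markov state structure, dissimilarity measure, and manifold-based tuning, and only illustrates how step statistics can score whether a trajectory plausibly belongs to the enrolled user.

```python
import numpy as np

def fit_transitions(trajectory):
    """Fit a Gaussian model to the step vectors of a 2D trajectory
    (an (n, 2) array of positions)."""
    steps = np.diff(trajectory, axis=0)
    mu = steps.mean(axis=0)
    cov = np.cov(steps.T) + 1e-9 * np.eye(2)   # regularize for stability
    return mu, cov

def avg_log_likelihood(trajectory, mu, cov):
    """Average log-density of a trajectory's steps under the fitted model;
    low values suggest the input was not produced by the enrolled user."""
    steps = np.diff(trajectory, axis=0)
    d = steps - mu
    inv = np.linalg.inv(cov)
    quad = np.einsum('ij,jk,ik->i', d, inv, d)   # per-step Mahalanobis^2
    logdet = np.log(np.linalg.det(2 * np.pi * cov))
    return float(np.mean(-0.5 * (quad + logdet)))
```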

  2. Space station data management system - A common GSE test interface for systems testing and verification

    NASA Technical Reports Server (NTRS)

    Martinez, Pedro A.; Dunn, Kevin W.

    1987-01-01

    This paper examines the fundamental problems and goals associated with test, verification, and flight-certification of man-rated distributed data systems. First, a summary of the characteristics of modern computer systems that affect the testing process is provided. Then, verification requirements are expressed in terms of an overall test philosophy for distributed computer systems. This test philosophy stems from previous experience that was gained with centralized systems (Apollo and the Space Shuttle), and deals directly with the new problems that verification of distributed systems may present. Finally, a description of potential hardware and software tools to help solve these problems is provided.

  3. Distributions.

    ERIC Educational Resources Information Center

    Bowers, Wayne A.

    This monograph was written for the Conference of the New Instructional Materials in Physics, held at the University of Washington in summer, 1965. It is intended for students who have had an introductory college physics course. It seeks to provide an introduction to the idea of distributions in general, and to some aspects of the subject in…

  4. INNOVATIVE TECHNOLOGY VERIFICATION REPORT XRF ...

    EPA Pesticide Factsheets

    The Elvatech, Ltd. ElvaX (ElvaX) x-ray fluorescence (XRF) analyzer, distributed in the United States by Xcalibur XRF Services (Xcalibur), was demonstrated under the U.S. Environmental Protection Agency (EPA) Superfund Innovative Technology Evaluation (SITE) Program. The field portion of the demonstration was conducted in January 2005 at the Kennedy Athletic, Recreational and Social Park (KARS) at Kennedy Space Center on Merritt Island, Florida. The demonstration was designed to collect reliable performance and cost data for the ElvaX analyzer and seven other commercially available XRF instruments for measuring trace elements in soil and sediment. The performance and cost data were evaluated to document the relative performance of each XRF instrument. This innovative technology verification report describes the objectives and the results of that evaluation and serves to verify the performance and cost of the ElvaX analyzer. Separate reports have been prepared for the other XRF instruments that were evaluated as part of the demonstration. The objectives of the evaluation included determining each XRF instrument's accuracy, precision, sample throughput, and tendency for matrix effects. To fulfill these objectives, the field demonstration incorporated the analysis of 326 prepared samples of soil and sediment that contained 13 target elements. The prepared samples included blends of environmental samples from nine different sample collection sites as well as s

  5. Online fingerprint verification.

    PubMed

    Upendra, K; Singh, S; Kumar, V; Verma, H K

    2007-01-01

    As organizations search for more secure authentication methods for user access, e-commerce, and other security applications, biometrics is gaining increasing attention. With an increasing emphasis on emerging automatic personal identification applications, fingerprint-based identification is becoming more popular. The most widely used fingerprint representation is the minutiae-based representation. The main drawback of this representation is that it does not utilize a significant component of the rich discriminatory information available in fingerprints. Local ridge structures cannot be completely characterized by minutiae. Also, it is difficult to quickly match two fingerprint images containing different numbers of unregistered minutiae points. In this study, a filter-bank-based representation, which eliminates these weaknesses, is implemented and the overall performance of the developed system is tested. The results have shown that this system can be used effectively for secure online verification applications.
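
    A filter-bank representation of the general kind described can be approximated with a small bank of oriented Gabor filters whose response energies form a fixed-length feature vector; matching then compares vectors rather than minutiae point sets. The sketch below is a simplified illustration (FingerCode-style systems additionally tessellate the image around a reference point), and all parameter values are assumptions.

```python
import numpy as np

def gabor_filter_bank(image, freq=0.1, n_orientations=8, sigma=4.0):
    """Filter an image with a bank of oriented Gabor filters and return the
    per-orientation response energies as a feature vector, capturing local
    ridge structure that minutiae alone miss."""
    rows, cols = image.shape
    y, x = np.mgrid[-rows // 2:rows // 2, -cols // 2:cols // 2]
    spectrum = np.fft.fft2(image)
    features = []
    for k in range(n_orientations):
        theta = k * np.pi / n_orientations
        xr = x * np.cos(theta) + y * np.sin(theta)
        yr = -x * np.sin(theta) + y * np.cos(theta)
        # Gaussian envelope modulated by a cosine along the rotated axis.
        kernel = np.exp(-(xr**2 + yr**2) / (2 * sigma**2)) \
                 * np.cos(2 * np.pi * freq * xr)
        # Circular convolution via the frequency domain; record the energy.
        response = np.fft.ifft2(spectrum * np.fft.fft2(np.fft.ifftshift(kernel)))
        features.append(float(np.mean(np.abs(response) ** 2)))
    return np.array(features)   # matching can use e.g. Euclidean distance
```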

  6. 40 CFR 1065.390 - PM balance verifications and weighing process verification.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... 40 Protection of Environment 34 2012-07-01 2012-07-01 false PM balance verifications and weighing... § 1065.390 PM balance verifications and weighing process verification. (a) Scope and frequency. This section describes three verifications. (1) Independent verification of PM balance performance within...

  7. 40 CFR 1065.390 - PM balance verifications and weighing process verification.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... 40 Protection of Environment 33 2011-07-01 2011-07-01 false PM balance verifications and weighing... § 1065.390 PM balance verifications and weighing process verification. (a) Scope and frequency. This section describes three verifications. (1) Independent verification of PM balance performance within...

  8. 40 CFR 1065.390 - PM balance verifications and weighing process verification.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 40 Protection of Environment 32 2010-07-01 2010-07-01 false PM balance verifications and weighing... § 1065.390 PM balance verifications and weighing process verification. (a) Scope and frequency. This section describes three verifications. (1) Independent verification of PM balance performance within...

  9. 40 CFR 1065.390 - PM balance verifications and weighing process verification.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... 40 Protection of Environment 33 2014-07-01 2014-07-01 false PM balance verifications and weighing... § 1065.390 PM balance verifications and weighing process verification. (a) Scope and frequency. This section describes three verifications. (1) Independent verification of PM balance performance within...

  10. 40 CFR 1065.390 - PM balance verifications and weighing process verification.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... 40 Protection of Environment 34 2013-07-01 2013-07-01 false PM balance verifications and weighing... § 1065.390 PM balance verifications and weighing process verification. (a) Scope and frequency. This section describes three verifications. (1) Independent verification of PM balance performance within...

  11. An Adaptive Tabu Search Heuristic for the Location Routing Pickup and Delivery Problem with Time Windows with a Theater Distribution Application

    DTIC Science & Technology

    2006-08-01

    theater distribution problem and find excellent solutions. This research utilizes advanced tabu search techniques, including reactive tabu search, Within Cycle Swap (WCS) and Complete Route Insert (CRI) move neighborhoods, a fractional factorial experimental design, and an Excel/VBA-based LPDPTW problem generator.

  12. Generic interpreters and microprocessor verification

    NASA Technical Reports Server (NTRS)

    Windley, Phillip J.

    1990-01-01

    The following topics are covered in viewgraph form: (1) generic interpreters; (2) Viper microprocessors; (3) microprocessor verification; (4) determining correctness; (5) hierarchical decomposition; (6) interpreter theory; (7) AVM-1; (8) phase-level specification; and future work.

  13. 28 CFR 74.6 - Location of eligible persons.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... 28 Judicial Administration 2 2013-07-01 2013-07-01 false Location of eligible persons. 74.6 Section 74.6 Judicial Administration DEPARTMENT OF JUSTICE (CONTINUED) CIVIL LIBERTIES ACT REDRESS PROVISION Verification of Eligibility § 74.6 Location of eligible persons. The Office shall compare...

  14. Developing sub-domain verification methods based on GIS tools

    NASA Astrophysics Data System (ADS)

    Smith, J. A.; Foley, T. A.; Raby, J. W.

    2014-12-01

    The meteorological community makes extensive use of the Model Evaluation Tools (MET) developed by the National Center for Atmospheric Research for numerical weather prediction model verification through grid-to-point, grid-to-grid and object-based domain-level analyses. MET Grid-Stat has been used to perform grid-to-grid neighborhood verification to account for the uncertainty inherent in high-resolution forecasting, and the MET Method for Object-based Diagnostic Evaluation (MODE) has been used to develop techniques for object-based spatial verification of high-resolution forecast grids for continuous meteorological variables. High-resolution modeling requires more focused spatial and temporal verification over parts of the domain. With a Geographical Information System (GIS), researchers can now consider terrain type/slope and land use effects and other spatial and temporal variables as explanatory metrics in model assessments. GIS techniques, when coupled with high-resolution point and gridded observation sets, allow location-based approaches that permit discovery of the spatial and temporal scales at which models do not sufficiently resolve the desired phenomena. In this paper we discuss our initial GIS approach to verifying WRF-ARW with a one-kilometer horizontal resolution inner domain centered over Southern California. Southern California contains a mixture of urban, suburban, agricultural and mountainous terrain types, along with a rich array of observational data with which to illustrate our ability to conduct sub-domain verification.

  15. Verification and Validation for Flight-Critical Systems (VVFCS)

    NASA Technical Reports Server (NTRS)

    Graves, Sharon S.; Jacobsen, Robert A.

    2010-01-01

    On March 31, 2009, a Request for Information (RFI) was issued by NASA's Aviation Safety Program to gather input on the subject of Verification and Validation (V&V) of Flight-Critical Systems. The responses were provided to NASA on or before April 24, 2009. The RFI asked for comments in three topic areas: Modeling and Validation of New Concepts for Vehicles and Operations; Verification of Complex Integrated and Distributed Systems; and Software Safety Assurance. There were a total of 34 responses to the RFI, representing a cross-section of academia (26%), small and large industry (47%) and government agencies (27%).

  16. Distribution of polychlorinated biphenyls and organochlorine pesticides in human breast milk from various locations in Tunisia: Levels of contamination, influencing factors, and infant risk assessment

    SciTech Connect

    Ennaceur, S. Gandoura, N.; Driss, M.R.

    2008-09-15

    The concentrations of dichlorodiphenyltrichloroethane and its metabolites (DDTs), hexachlorobenzene (HCB), hexachlorocyclohexane isomers (HCHs), dieldrin, and 20 polychlorinated biphenyls (PCBs) were determined in 237 human breast milk samples collected from 12 locations in Tunisia. Gas chromatography with electron capture detection (GC-ECD) was used to identify and quantify residue levels of organochlorine compounds (OCs) on a lipid basis. The predominant OCs in human breast milk were PCBs, p,p'-DDE, p,p'-DDT, HCHs, and HCB. Concentrations of DDTs in human breast milk from rural areas were significantly higher than those from urban locations (p<0.05). With regard to PCBs, we observed the predominance of mid-chlorinated congeners due to the presence of PCBs with high K(ow), such as PCBs 153, 138, and 180. Positive correlations were found between concentrations of OCs in human breast milk and the age of the mothers and number of parities, suggesting the influence of such factors on OC burdens in lactating mothers. The comparison of the daily intakes of PCBs, DDTs, HCHs, and HCB by infants through human breast milk with guidelines proposed by WHO and Health Canada shows that some individuals accumulated OCs in breast milk at levels close to or higher than these guidelines.

  17. Soyasapogenol A and B distribution in soybean (Glycine max L. Merr.) in relation to seed physiology, genetic variability, and growing location.

    PubMed

    Rupasinghe, H P Vasantha; Jackson, Chung-Ja C; Poysa, Vaino; Di Berardo, Christina; Bewley, J Derek; Jenkinson, Jonathan

    2003-09-24

    An efficient analytical method utilizing high-performance liquid chromatography (HPLC) with an evaporative light scattering detector (ELSD) was developed to isolate and quantify the two major soyasaponin aglycones or precursors in soybeans, the triterpenes soyasapogenol A and B. Soaking of seeds in water for up to 15 h did not change the content of soyasapogenols. Seed germination had no influence on soyasapogenol A content but increased the accumulation of soyasapogenol B. Soyasapogenols were mainly concentrated in the axis of the seeds as compared with the cotyledons and seed coat. In the seedling, the root (radicle) contained the highest concentration of soyasapogenol A, while the plumule had the greatest amounts of soyasapogenol B. In 10 advanced food-grade soybean cultivars grown at four locations in Ontario, the total soyasapogenol content was 2 +/- 0.3 mg/g. Soyasapogenol B content (1.5 +/- 0.27 mg/g) was 2.5-4.5-fold higher than soyasapogenol A content (0.49 +/- 0.1 mg/g). Significant variation in soyasapogenol content was observed among cultivars and growing locations. There was no significant correlation between the content of soyasapogenols and that of the total isoflavone aglycones.

  18. Thoughts on Verification of Nuclear Disarmament

    SciTech Connect

    Dunlop, W H

    2007-09-26

    perfect and it was expected that occasionally there might be a verification measurement that was slightly above 150 kt. But the accuracy was much improved over the earlier seismic measurements. In fact, some of this improvement came about because, as part of this verification protocol, the US and Soviet Union provided the yields of several past tests to improve seismic calibrations. This provided a much-needed calibration for the seismic measurements. It was also accepted that, since nuclear tests were to a large extent R&D related, occasionally there might be a test that was slightly above 150 kt, as the yield could not always be predicted with high accuracy in advance of the test. While one could hypothesize that the Soviets could conduct a test at a location other than their test sites, even a test at a small fraction of 150 kt would clearly be observed and would be a violation of the treaty. So the issue of clandestine tests of significance was easily covered for this particular treaty.

  19. Organics Verification Study for Sinclair and Dyes Inlets, Washington

    SciTech Connect

    Kohn, Nancy P.; Brandenberger, Jill M.; Niewolny, Laurie A.; Johnston, Robert K.

    2006-09-28

    Sinclair and Dyes Inlets near Bremerton, Washington, are on the State of Washington 1998 303(d) list of impaired waters because of fecal coliform contamination in marine water, metals in sediment and fish tissue, and organics in sediment and fish tissue. Because significant cleanup and source control activities have been conducted in the inlets since the data supporting the 1998 303(d) listings were collected, two verification studies were performed to address the 303(d) segments that were listed for metal and organic contaminants in marine sediment. The Metals Verification Study (MVS) was conducted in 2003; the final report, Metals Verification Study for Sinclair and Dyes Inlets, Washington, was published in March 2004 (Kohn et al. 2004). This report describes the Organics Verification Study that was conducted in 2005. The study approach was similar to the MVS in that many surface sediment samples were screened for the major classes of organic contaminants, and then the screening results and other available data were used to select a subset of samples for quantitative chemical analysis. Because the MVS was designed to obtain representative data on concentrations of contaminants in surface sediment throughout Sinclair Inlet, Dyes Inlet, Port Orchard Passage, and Rich Passage, aliquots of the 160 MVS sediment samples were used in the analysis for the Organics Verification Study. However, unlike metals screening methods, organics screening methods are not specific to individual organic compounds, and are not available for some target organics. Therefore, only the quantitative analytical results were used in the organics verification evaluation. The results of the Organics Verification Study showed that sediment quality outside of Sinclair Inlet is unlikely to be impaired because of organic contaminants. Similar to the results for metals, in Sinclair Inlet, the distribution of residual organic contaminants is generally limited to nearshore areas already within the

  20. Experiments for locating damaged truss members in a truss structure

    NASA Technical Reports Server (NTRS)

    Mcgowan, Paul E.; Smith, Suzanne W.; Javeed, Mehzad

    1991-01-01

    Locating damaged truss members in large space structures will involve a combination of sensing and diagnostic techniques. Methods developed for damage location require experimental verification prior to on-orbit applications. To this end, a series of experiments for locating damaged members using a generic, ten bay truss structure were conducted. A 'damaged' member is a member which has been removed entirely. Previously developed identification methods are used in conjunction with the experimental data to locate damage. Preliminary results to date are included, and indicate that mode selection and sensor location are important issues for location performance. A number of experimental data sets representing various damage configurations were compiled using the ten bay truss. The experimental data and the corresponding finite element analysis models are available to researchers for verification of various methods of structure identification and damage location.

  1. Evaluation of 3D pre-treatment verification for volumetric modulated arc therapy plan in head region

    NASA Astrophysics Data System (ADS)

    Ruangchan, S.; Oonsiri, S.; Suriyapee, S.

    2016-03-01

    The development of pre-treatment QA tools contributes to three-dimensional (3D) dose verification using calculation software together with the measured planar dose distribution. This research aims to evaluate the Sun Nuclear 3DVH software against thermoluminescent dosimeter (TLD) measurements. Two VMAT patient plans (2.5 arcs) of 6 MV photons with different PTV locations were transferred to the Rando phantom images. The PTV of the first plan was located in a homogeneous area; that of the second was not. For the treatment planning process, the Rando phantom images were employed in optimization and calculation, with contouring of the PTV, brain stem, lens and TLD positions. The verification plans were created and transferred to the ArcCHECK for measurement, and the 3D dose was calculated using the 3DVH software. The ranges of the percent dose differences in both the PTV and organs at risk (OAR) between TLD and the 3DVH software were -2.09 to 3.87% for the first plan and -1.39 to 6.88% for the second plan. The mean percent dose differences for the PTV were 1.62% and 3.93% for the first and second plans, respectively. In conclusion, the 3DVH software results show good agreement with TLD when the tumor is located in a homogeneous area.

  2. Woodward Effect Experimental Verifications

    NASA Astrophysics Data System (ADS)

    March, Paul

    2004-02-01

    The work of J. F. Woodward (1990; 1996a; 1996b; 1998; 2002a; 2002b; 2004) on the existence of "mass fluctuations" and their use in exotic propulsion schemes was examined for possible application in improving space flight propulsion and power generation. Woodward examined Einstein's General Relativity Theory (GRT) and assumed that if the strong Machian interpretation of GRT, as well as gravitational/inertial Wheeler-Feynman radiation reaction forces, hold, then when an elementary particle is accelerated through a potential gradient, its rest mass should fluctuate around its mean value during its acceleration. Woodward also used GRT to clarify the precise experimental conditions necessary for observing and exploiting these mass fluctuations, or the "Woodward effect" (W-E). Later, in collaboration with his former graduate student T. Mahood, he also pushed the experimental verification boundaries of these proposals. If these purported mass fluctuations occur as Woodward claims, and his assumption that gravity and inertia are both byproducts of the same GRT-based phenomenon per Mach's Principle is correct, then many innovative applications such as propellantless propulsion and gravitational exotic matter generators may be feasible. This paper examines the reality of mass fluctuations and the feasibility of using the W-E to design propellantless propulsion devices in the near- to mid-term future. The latest experimental results, utilizing MHD-like force rectification systems, will also be presented.

  3. Cold fusion verification

    NASA Astrophysics Data System (ADS)

    North, M. H.; Mastny, G. F.; Wesley, E. J.

    1991-03-01

    The objective of this work was to verify and reproduce experimental observations of Cold Nuclear Fusion (CNF), as originally reported in 1989. The method was to start with the original report and add such additional information as became available to build a set of operational electrolytic CNF cells. Verification was to be achieved by first observing cells for neutron production; for those cells that demonstrated a nuclear effect, careful calorimetric measurements were planned. The authors concluded, after laboratory experience, reading published work, talking with others in the field, and attending conferences, that CNF is probably a chimera and will go the way of N-rays and polywater. The neutron detector used for these tests was a completely packaged unit built into a metal suitcase that afforded electrostatic shielding for the detectors and self-contained electronics. It was battery-powered, although it was on charge for most of the long tests. The sensor element consists of He detectors arranged in three independent layers in a solid moderating block. The count from each of the three layers, as well as the sum of all the detectors, was brought out and recorded separately. The neutron measurements were made with both the neutron detector and the sample tested in a cave made of thick moderating material that surrounded the two units on the sides and bottom.

  4. Expose : procedure and results of the joint experiment verification tests

    NASA Astrophysics Data System (ADS)

    Panitz, C.; Rettberg, P.; Horneck, G.; Rabbow, E.; Baglioni, P.

    The International Space Station will carry the EXPOSE facility, accommodated at the universal workplace URM-D located outside the Russian Service Module. The launch will be effected in 2005, and the facility is planned to stay in space for 1.5 years. The tray-like structure will accommodate 2 chemical and 6 biological PI-experiments or experiment systems of the ROSE (Response of Organisms to Space Environment) consortium. EXPOSE will support long-term in situ studies of microbes in artificial meteorites, as well as of microbial communities from special ecological niches, such as endolithic and evaporitic ecosystems. The experiment pockets, either vented or sealed, will be covered by an optical filter system to control the intensity and spectral range of solar UV irradiation. Control of sun exposure will be achieved by the use of individual shutters. To test the compatibility of the different biological systems and their adaptation to the opportunities and constraints of space conditions, an extensive ground support program has been developed. The procedure and first results of these joint Experiment Verification Tests (EVT) will be presented. The tests are essential for the success of the EXPOSE mission and have been performed in parallel with the development and construction of the final hardware design of the facility. The results of the mission will contribute to the understanding of organic chemistry processes in space, biological adaptation strategies to extreme conditions, e.g. on the early Earth and Mars, and the distribution of life beyond its planet of origin.

  5. Distribution and abundance of zooplankton at selected locations on the Savannah River and from tributaries of the Savannah River Plant: December 1984--August 1985

    SciTech Connect

    Chimney, M.J.; Cody, W.R.

    1986-11-01

    Spatial and temporal differences in the abundance and composition of the zooplankton community occurred at Savannah River and SRP creek/swamp sampling locations. Stations are grouped into four categories based on differences in community structure: Savannah River; thermally influenced stations on Four Mile Creek and Pen Branch; closed-canopy stations in the Steel Creek system; and open-canopy Steel Creek stations together with non-thermally influenced stations on Pen Branch and Beaver Dam Creek. Differences among stations were only weakly related to water temperature, dissolved oxygen concentration, conductivity or pH at the time of collection. None of these parameters appeared to be limiting. Rather, past thermal history and habitat structure seemed to be the important controlling factors. 66 refs.

  6. Distribution and mobility of lead (Pb), copper (Cu), zinc (Zn), and antimony (Sb) from ammunition residues on shooting ranges for small arms located on mires.

    PubMed

    Mariussen, Espen; Johnsen, Ida Vaa; Strømseng, Arnljot Einride

    2017-03-06

    An environmental survey was performed on shooting ranges for small arms located on minerotrophic mires. The highest mean concentrations of Pb (13 g/kg), Cu (5.2 g/kg), Zn (1.1 g/kg), and Sb (0.83 g/kg) in the top soil were from a range located on a poor minerotrophic and acidic mire. This range also had the highest concentrations of Pb, Cu, Zn, and Sb in discharge water (0.18 mg/L Pb, 0.42 mg/L Cu, 0.63 mg/L Zn, and 65 μg/L Sb) and subsurface soil water (2.5 mg/L Pb, 0.9 mg/L Cu, 1.6 mg/L Zn, and 0.15 mg/L Sb). No clear differences in the discharge of ammunition residues between the mires were observed based on the characteristics of the mires. In surface water with high pH (pH ~7), there was a trend toward high concentrations of Sb and lower relative concentrations of Cu and Pb. The relatively low concentrations of ammunition residues in both the soil and the soil water 20 cm below the top soil indicate limited vertical migration in the soil. Channels in the mires, made by plant roots, or soil layers of less decomposed material may increase the rate of transport of contaminated surface water into deeper soil layers and groundwater. A large portion of both the Cu and Sb was associated with the oxidizable components in the peat, which may imply that these elements form inner-sphere complexes with organic matter. The largest portions of Pb and Zn were associated with the exchangeable and pH-sensitive components in the peat, which may imply that these elements form outer-sphere complexes with the peat.

  7. Towards an in-situ measurement of wave velocity in buried plastic water distribution pipes for the purposes of leak location

    NASA Astrophysics Data System (ADS)

    Almeida, Fabrício C. L.; Brennan, Michael J.; Joseph, Phillip F.; Dray, Simon; Whitfield, Stuart; Paschoalini, Amarildo T.

    2015-12-01

    Water companies are under constant pressure to ensure that water leakage is kept to a minimum. Leak noise correlators are often used to help find and locate leaks. These devices correlate acoustic or vibration signals from sensors which are placed on either side of the location of a suspected leak. The peak in the cross-correlation function of the measured signals gives the time difference between the arrival times of the leak noise at the sensors. To convert the time delay into a distance, the speed at which the leak noise propagates along the pipe (the wave-speed) needs to be known. Often, this is estimated from historical wave-speed data measured on other pipes at various times and under various conditions, or it is estimated from tables which are calculated using simple formulae. Usually, the wave-speed is not measured directly at the time of the correlation measurement and is therefore potentially a source of significant error in the localisation of the leak. In this paper, a new method of measuring the wave-speed in-situ in the presence of a leak, which is robust and simple, is explored. Experiments were conducted on a bespoke large-scale buried pipe test-rig, in which a leak was also induced in the pipe between the measurement positions to simulate a condition that is likely to occur in practice. It is shown that even in conditions where the signal-to-noise ratio is very poor, the wave-speed estimate calculated using the new method is less than 5% different from the best estimate of 387 m s(-1).
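
    The correlator arithmetic described above is compact enough to sketch directly: the cross-correlation peak gives the arrival-time difference, and the wave-speed converts that difference into a position along the pipe. The snippet assumes equal-length, pre-filtered signals and the standard two-sensor geometry; it illustrates the classical correlator, not the paper's in-situ wave-speed measurement method.

```python
import numpy as np

def locate_leak(x1, x2, fs, wave_speed, sensor_distance):
    """Estimate the leak position from two equal-length signals measured on
    either side of a suspected leak. Returns the distance from sensor 1 (m)
    and the estimated arrival-time difference (s)."""
    n = len(x1)
    # Cross-correlation peak gives the lag of x1 relative to x2 in samples.
    xc = np.correlate(x1 - x1.mean(), x2 - x2.mean(), mode='full')
    lag = np.argmax(xc) - (n - 1)
    tau = lag / fs                  # tau = t1 - t2; positive when the noise
                                    # reached sensor 2 first
    # Two-sensor geometry: d1 + d2 = D and d1 - d2 = c * tau.
    d1 = 0.5 * (sensor_distance + wave_speed * tau)
    return d1, tau

# e.g. d1, tau = locate_leak(x1, x2, fs=5000.0, wave_speed=387.0, sensor_distance=30.0)
```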

  8. Method and system for determining depth distribution of radiation-emitting material located in a source medium and radiation detector system for use therein

    DOEpatents

    Benke, Roland R.; Kearfott, Kimberlee J.; McGregor, Douglas S.

    2003-03-04

    A method, system and a radiation detector system for use therein are provided for determining the depth distribution of radiation-emitting material distributed in a source medium, such as a contaminated field, without the need to take samples, such as extensive soil samples, to determine the depth distribution. The system includes a portable detector assembly with an x-ray or gamma-ray detector having a detector axis for detecting the emitted radiation. The radiation may be naturally-emitted by the material, such as gamma-ray-emitting radionuclides, or emitted when the material is struck by other radiation. The assembly also includes a hollow collimator in which the detector is positioned. The collimator causes the emitted radiation to bend toward the detector as rays parallel to the detector axis of the detector. The collimator may be a hollow cylinder positioned so that its central axis is perpendicular to the upper surface of the large area source when positioned thereon. The collimator allows the detector to angularly sample the emitted radiation over many ranges of polar angles. This is done by forming the collimator as a single adjustable collimator or a set of collimator pieces having various possible configurations when connected together. In any one configuration, the collimator allows the detector to detect only the radiation emitted from a selected range of polar angles measured from the detector axis. Adjustment of the collimator or the detector therein enables the detector to detect radiation emitted from a different range of polar angles. The system further includes a signal processor for processing the signals from the detector wherein signals obtained from different ranges of polar angles are processed together to obtain a reconstruction of the radiation-emitting material as a function of depth, assuming, but not limited to, a spatially-uniform depth distribution of the material within each layer. The detector system includes detectors having

  9. A Modified Cramer-von Mises and Anderson-Darling Test for the Weibull Distribution with Unknown Location and Scale Parameters.

    DTIC Science & Technology

    1981-12-01

    (List-of-figures fragments recovered from the report: plotting positions versus A2 or W2 statistics; a gamma distribution with shape = 2; a beta distribution with p = 1, q = 1; and shape versus W2 critical values at level .20 for n = 20, 25, and 30.) The formula to calculate W2 is given by Eq (4). Letting x(1), x(2), ..., x(n) be the n order statistics and letting U(i) = F(x(i)), the cumulative distribution
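
    For reference, the basic (unmodified) Cramer-von Mises statistic implied by the passage can be computed in a few lines; the report's contribution, modified critical values for the Weibull case with estimated location and scale parameters, is tabulated there rather than reproduced here.

```python
import numpy as np

def cramer_von_mises_w2(sample, cdf):
    """Standard Cramer-von Mises W2 statistic: with U(i) = F(x(i)) the
    fitted CDF evaluated at the ordered sample, compare U(i) against the
    uniform plotting positions (2i - 1) / (2n)."""
    u = np.sort(cdf(np.asarray(sample)))
    n = len(u)
    i = np.arange(1, n + 1)
    return 1.0 / (12 * n) + np.sum((u - (2 * i - 1) / (2 * n)) ** 2)

# Example against a fitted Weibull CDF (shape c, location loc, scale s):
# from scipy.stats import weibull_min
# w2 = cramer_von_mises_w2(data, weibull_min(c, loc=loc, scale=s).cdf)
```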

  10. NON-INVASIVE DETERMINATION OF THE LOCATION AND DISTRIBUTION OF FREE-PHASE DENSE NONAQUEOUS PHASE LIQUIDS (DNAPL) BY SEISMIC REFLECTION TECHNIQUES

    SciTech Connect

    Michael G. Waddell; William J. Domoracki; Tom J. Temples

    2001-12-01

    This annual technical progress report covers part of Task 4 (site evaluation), Task 5 (2D seismic design, acquisition, and processing), and Task 6 (2D seismic reflection, interpretation, and AVO analysis) on DOE contract number DE-AR26-98FT40369. The project had planned one additional deployment to a site other than the Savannah River Site (SRS) or the DOE Hanford Site. After the SUBCON midyear review in Albuquerque, NM, it was decided that two additional deployments would be performed. The first deployment is to test the feasibility of using non-invasive seismic reflection and AVO analysis as a monitoring tool to assist in determining the effectiveness of Dynamic Underground Stripping (DUS) in the removal of DNAPL. The second deployment is to the Department of Defense (DOD) Charleston Naval Weapons Station Solid Waste Management Unit 12 (SWMU-12), Charleston, SC, to further test the technique to detect high concentrations of DNAPL. The Charleston Naval Weapons Station SWMU-12 site was selected in consultation with National Energy Technology Laboratory (NETL) and DOD Naval Facilities Engineering Command Southern Division (NAVFAC) personnel. Based upon a review of existing data, and due to the shallow target depth, the project team collected three Vertical Seismic Profiles (VSPs) and an experimental P-wave seismic reflection line. After preliminary analysis of the VSP data and the experimental reflection line data, it was decided to proceed with Task 5 and Task 6. Three high-resolution P-wave reflection profiles were collected with two objectives: (1) design the reflection survey to image a target depth of 20 feet below land surface to assist in determining the geologic controls on the DNAPL plume geometry, and (2) apply AVO analysis to the seismic data to locate the zone of high concentration of DNAPL. Based upon the results of the data processing and interpretation of the seismic data, the project team was able to map the channel that is controlling the DNAPL plume

  11. Polarization-multiplexed plasmonic phase generation with distributed nanoslits.

    PubMed

    Lee, Seung-Yeol; Kim, Kyuho; Lee, Gun-Yeal; Lee, Byoungho

    2015-06-15

    Methods for multiplexing surface plasmon polaritons (SPPs) have been attracting much attention due to their potential for plasmonic integrated systems, plasmonic holography, and optical tweezing. Here, using closely spaced distributed nanoslits, we propose a method for generating polarization-multiplexed SPP phase profiles which can be applied to implementing general SPP phase distributions. Two independent types of SPP phase generation mechanisms, polarization-independent and polarization-reversible ones, are combined to generate fully arbitrary phase profiles for each optical handedness. As a simple verification of the proposed scheme, we experimentally demonstrate that the location of the plasmonic focus can be arbitrarily designed and switched by a change of optical handedness.

  12. Verification of Advances in a Coupled Snow-runoff Modeling Framework for Operational Streamflow Forecasts

    NASA Astrophysics Data System (ADS)

    Barik, M. G.; Hogue, T. S.; Franz, K. J.; He, M.

    2011-12-01

    The National Oceanic and Atmospheric Administration's (NOAA's) River Forecast Centers (RFCs) issue hydrologic forecasts related to flood events, reservoir operations for water supply, streamflow regulation, and recreation on the nation's streams and rivers. The RFCs use the National Weather Service River Forecast System (NWSRFS) for streamflow forecasting, which relies on a coupled snow model (i.e. SNOW17) and rainfall-runoff model (i.e. SAC-SMA) in snow-dominated regions of the US. Errors arise in various steps of the forecasting system from input data, model structure, model parameters, and initial states. The goal of the current study is to undertake verification of potential improvements in the SNOW17-SAC-SMA modeling framework developed for operational streamflow forecasts. We undertake verification for a range of parameter sets (i.e. RFC, DREAM (Differential Evolution Adaptive Metropolis)) as well as a data assimilation (DA) framework developed for the coupled models. Verification is also undertaken for various initial conditions to observe the influence of variability in initial conditions on the forecast. The study basin is the North Fork American River Basin (NFARB), located on the western side of the Sierra Nevada Mountains in northern California. Hindcasts are verified using both deterministic (i.e. Nash-Sutcliffe efficiency, root mean square error, and joint distribution) and probabilistic (i.e. reliability diagram, discrimination diagram, containing ratio, and quantile plots) statistics. Our presentation includes a comparison of the performance of the different optimized parameters and the DA framework, as well as an assessment of the impact associated with the initial conditions used for streamflow forecasts for the NFARB.

  13. Hard and Soft Safety Verifications

    NASA Technical Reports Server (NTRS)

    Wetherholt, Jon; Anderson, Brenda

    2012-01-01

    The purpose of this paper is to examine the differences between, and the effects of, hard and soft safety verifications. Initially, the terminology should be defined and clarified. A hard safety verification is a datum which demonstrates how a safety control is enacted. An example of this is relief valve testing. A soft safety verification is something which is usually described as nice to have, but which is not necessary to prove safe operation. An example of a soft verification is the loss of the Solid Rocket Booster (SRB) casings from Shuttle flight STS-4. When the main parachutes failed, the casings impacted the water and sank. In the nose cap of the SRBs, video cameras recorded the release of the parachutes to determine safe operation and to provide information for potential anomaly resolution. Generally, examination of the casings and nozzles contributed to understanding of the newly developed boosters and their operation. Safety verification of SRB operation was demonstrated by examination of the casings and nozzle for erosion or wear. Loss of the SRBs and associated data did not delay the launch of the next Shuttle flight.

  14. CTBT integrated verification system evaluation model supplement

    SciTech Connect

    EDENBURN,MICHAEL W.; BUNTING,MARCUS; PAYNE JR.,ARTHUR C.; TROST,LAWRENCE C.

    2000-03-02

    Sandia National Laboratories has developed a computer-based model called IVSEM (Integrated Verification System Evaluation Model) to estimate the performance of a nuclear detonation monitoring system. The IVSEM project was initiated in June 1994 by Sandia's Monitoring Systems and Technology Center and has been funded by the U.S. Department of Energy's Office of Nonproliferation and National Security (DOE/NN). IVSEM is a simple, "top-level" modeling tool which estimates the performance of a Comprehensive Nuclear Test Ban Treaty (CTBT) monitoring system and can help explore the impact of various sensor system concepts and technology advancements on CTBT monitoring. One of IVSEM's unique features is that it integrates results from the various CTBT sensor technologies (seismic, infrasound, radionuclide, and hydroacoustic) and allows the user to investigate synergy among the technologies. Specifically, IVSEM estimates the detection effectiveness (probability of detection), location accuracy, and identification capability of the integrated system and of each technology subsystem individually. The model attempts to accurately estimate the monitoring system's performance at medium interfaces (air-land, air-water) and for some evasive testing methods such as seismic decoupling. The original IVSEM report, CTBT Integrated Verification System Evaluation Model, SAND97-2518, described version 1.2 of IVSEM. This report describes the changes made to IVSEM version 1.2 and the addition of identification capability estimates that have been incorporated into IVSEM version 2.0.
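
    At its most schematic, integrating detection estimates across technology subsystems can be illustrated with an independence assumption: an event escapes detection only if every subsystem misses it. The toy calculation below is our illustration of the kind of combination a top-level model performs, not IVSEM's actual algorithm, and the probabilities are made up.

```python
def combined_detection_probability(p_subsystems):
    """Probability that at least one technology subsystem detects an event,
    assuming the subsystems detect independently."""
    p_miss = 1.0
    for p in p_subsystems:
        p_miss *= (1.0 - p)   # the event is missed only if every subsystem misses
    return 1.0 - p_miss

# e.g. illustrative seismic, infrasound, radionuclide, hydroacoustic estimates:
p_total = combined_detection_probability([0.7, 0.4, 0.5, 0.3])
```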

  15. Spatial Evaluation and Verification of Earthquake Simulators

    NASA Astrophysics Data System (ADS)

    Wilson, John Max; Yoder, Mark R.; Rundle, John B.; Turcotte, Donald L.; Schultz, Kasey W.

    2016-09-01

    In this paper, we address the problem of verifying earthquake simulators with observed data. Earthquake simulators are a class of computational simulations which attempt to mirror the topological complexity of fault systems on which earthquakes occur. In addition, the physics of friction and elastic interactions between fault elements are included in these simulations. Simulation parameters are adjusted so that natural earthquake sequences are matched in their scaling properties. Physically based earthquake simulators can generate many thousands of years of simulated seismicity, allowing for a robust capture of the statistical properties of large, damaging earthquakes that have long recurrence time scales. Verification of simulations against currently observed earthquake seismicity is necessary, and following past simulator and forecast model verification methods, we address the challenges of applying spatial forecast verification to simulators; namely, that simulator outputs are confined to the modeled faults, while observed earthquake epicenters often occur off of known faults. We present two methods for addressing this discrepancy: a simplistic approach whereby observed earthquakes are shifted to the nearest fault element, and a smoothing method based on the power laws of the epidemic-type aftershock (ETAS) model, which distributes the seismicity of each simulated earthquake over the entire test region at a rate that decays with epicentral distance. To test these methods, a receiver operating characteristic plot was produced by comparing the rate maps to observed m>6.0 earthquakes in California since 1980. We found that the nearest-neighbor mapping produced poor forecasts, while the ETAS power-law method produced rate maps that agreed reasonably well with observations.
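
    The smoothing method can be sketched as a kernel that spreads each simulated epicenter's contribution over the whole test region with a power-law decay in epicentral distance. The kernel form and the parameters r0 and q below are illustrative choices, not the paper's calibrated ETAS values.

```python
import numpy as np

def power_law_rate_map(epicenters, grid_x, grid_y, r0=5.0, q=1.5):
    """Build a normalized seismicity rate map by spreading each simulated
    epicenter over the grid with a power-law decay in distance (km)."""
    gx, gy = np.meshgrid(grid_x, grid_y)
    rate = np.zeros_like(gx)
    for ex, ey in epicenters:
        r = np.hypot(gx - ex, gy - ey)
        rate += (1.0 + r / r0) ** (-q)   # decaying contribution per event
    return rate / rate.sum()             # normalize to a forecast density
```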

  16. Structural verification for GAS experiments

    NASA Technical Reports Server (NTRS)

    Peden, Mark Daniel

    1992-01-01

    The purpose of this paper is to assist the Get Away Special (GAS) experimenter in conducting a thorough structural verification of its experiment's structural configuration, thus expediting the structural review/approval process and the safety process in general. Material selection for structural subsystems will be covered with an emphasis on fasteners (GSFC fastener integrity requirements) and primary support structures (Stress Corrosion Cracking requirements and National Space Transportation System (NSTS) requirements). Different approaches to structural verification (tests and analyses) will be outlined, especially those stemming from lessons learned on load and fundamental frequency verification. In addition, fracture control will be covered for those payloads that utilize a door assembly or modify the containment provided by the standard GAS Experiment Mounting Plate (EMP). Structural hazard assessment and the preparation of structural hazard reports will be reviewed to form a summation of structural safety issues for inclusion in the safety data package.

  17. Technical challenges for dismantlement verification

    SciTech Connect

    Olinger, C.T.; Stanbro, W.D.; Johnston, R.G.; Nakhleh, C.W.; Dreicer, J.S.

    1997-11-01

    In preparation for future nuclear arms reduction treaties, including any potential successor treaties to START I and II, the authors have been examining possible methods for bilateral warhead dismantlement verification. Warhead dismantlement verification raises significant challenges in the political, legal, and technical arenas. This discussion will focus on the technical issues raised by warhead arms control. Technical complications arise from several sources. These will be discussed under the headings of warhead authentication, chain-of-custody, dismantlement verification, non-nuclear component tracking, component monitoring, and irreversibility. The authors will discuss possible technical options to address these challenges as applied to a generic dismantlement and disposition process, in the process identifying limitations and vulnerabilities. They expect that these considerations will play a large role in any future arms reduction effort and, therefore, should be addressed in a timely fashion.

  18. Evaluation of Mesoscale Model Phenomenological Verification Techniques

    NASA Technical Reports Server (NTRS)

    Lambert, Winifred

    2006-01-01

    Forecasters at the Spaceflight Meteorology Group, 45th Weather Squadron, and National Weather Service in Melbourne, FL use mesoscale numerical weather prediction model output in creating their operational forecasts. These models aid in forecasting weather phenomena that could compromise the safety of launch, landing, and daily ground operations, and must produce reasonable weather forecasts in order for their output to be useful in operations. Considering the importance of model forecasts to operations, their accuracy in forecasting critical weather phenomena must be verified to determine their usefulness. The currently-used traditional verification techniques involve an objective point-by-point comparison of model output and observations valid at the same time and location. The resulting statistics can unfairly penalize high-resolution models that make realistic forecasts of a given phenomenon but are offset from the observations by small time and/or space increments. Manual subjective verification can provide a more valid representation of model performance, but is time-consuming and prone to personal biases. An objective technique that verifies specific meteorological phenomena, much in the way a human would in a subjective evaluation, would likely produce a more realistic assessment of model performance. Such techniques are being developed in the research community. The Applied Meteorology Unit (AMU) was tasked to conduct a literature search to identify phenomenological verification techniques being developed, determine if any are ready to use operationally, and outline the steps needed to implement any operationally-ready techniques into the Advanced Weather Information Processing System (AWIPS). The AMU conducted a search of all literature on the topic of phenomenological-based mesoscale model verification techniques and found 10 different techniques in various stages of development. Six of the techniques were developed to verify precipitation forecasts, one

  19. NON-INVASIVE DETERMINATION OF THE LOCATION AND DISTRIBUTION OF FREE-PHASE DENSE NONAQUEOUS PHASE LIQUIDS (DNAPL) BY SEISMIC REFLECTION TECHNIQUES

    SciTech Connect

    Michael G. Waddell; William J. Domoracki; Tom J. Temples

    2001-05-01

    This semi-annual technical progress report covers Task 4 site evaluation, Task 5 seismic reflection design and acquisition, and Task 6 seismic reflection processing and interpretation on DOE contract number DE-AR26-98FT40369. The project had planned one additional deployment to a site other than the Savannah River Site (SRS) or DOE Hanford. During this reporting period the project had an ASME peer review. The findings and recommendations of the review panel, as well as the project team's responses to comments, are in Appendix A. After the SUBCON midyear review in Albuquerque, NM and the peer review, it was decided that two additional deployments would be performed. The first deployment is to test the feasibility of using non-invasive seismic reflection and AVO analysis as monitoring to assist in determining the effectiveness of Dynamic Underground Stripping (DUS) in the removal of DNAPL. Under the rescope of the project, Task 4 would be performed at the Charleston Navy Weapons Station, Charleston, SC and not at the Dynamic Underground Stripping (DUS) project at SRS. The project team had already completed Task 4 at the M-area seepage basin, only a few hundred yards away from the DUS site. Because the geology is the same, Task 4 was not necessary. However, a Vertical Seismic Profile (VSP) was conducted in one well to calibrate the geology to the seismic data. The first deployment to the DUS site (Tasks 5 and 6) has been completed. Once the steam has been turned off, these tasks will be performed again to compare the results to the pre-steam data. The results from the first deployment to the DUS site indicated a seismic amplitude anomaly at the location and depths of the known high concentrations of DNAPL. The deployment to another site with different geologic conditions was supposed to occur during this reporting period. The first site selected was DOE Paducah, Kentucky. After almost eight months of negotiation, site access was denied, requiring the selection of another site

  20. Formal verification of mathematical software

    NASA Technical Reports Server (NTRS)

    Sutherland, D.

    1984-01-01

    Methods are investigated for formally specifying and verifying the correctness of mathematical software (software which uses floating point numbers and arithmetic). Previous work in the field was reviewed. A new model of floating point arithmetic called the asymptotic paradigm was developed and formalized. Two different conceptual approaches to program verification, the classical Verification Condition approach and the more recently developed Programming Logic approach, were adapted to use the asymptotic paradigm. These approaches were then used to verify several programs; the programs chosen were simplified versions of actual mathematical software.

  1. Non-damaging, portable radiography: Applications in arms control verification

    SciTech Connect

    Morris, R.A.; Butterfield, K.B.; Apt, K.E.

    1992-08-01

    The state-of-the-technology necessary to perform portable radiography in support of arms control verification is evaluated. Specific requirements, such as accurate measurements of the location of features in a treaty-limited object and the detection of deeply imbedded features, are defined in three scenarios. Sources, detectors, portability, mensuration, and safety are discussed in relation to the scenarios. Examples are given of typical radiographic systems that would be capable of addressing the inspection problems associated with the three scenarios.

  2. CHEMICAL INDUCTION MIXER VERIFICATION - ENVIRONMENTAL TECHNOLOGY VERIFICATION PROGRAM

    EPA Science Inventory

    The Wet-Weather Flow Technologies Pilot of the Environmental Technology Verification (ETV) Program, which is supported by the U.S. Environmental Protection Agency and facilitated by NSF International, has recently evaluated the performance of chemical induction mixers used for di...

  3. Heterogeneity of nervous system mitochondria: location, location, location!

    PubMed

    Dubinsky, Janet M

    2009-08-01

    Mitochondrial impairments have been associated with many neurological disorders, from inborn errors of metabolism or genetic disorders to age and environmentally linked diseases of aging (DiMauro S., Schon E.A. 2008. Mitochondrial disorders in the nervous system. Annu. Rev. Neurosci. 31, 91-123.). In these disorders, specific nervous system components or brain regions appear to be initially more susceptible to the triggering event or pathological process. Such regional variation in susceptibility to multiple types of stressors raises the possibility that inherent differences in mitochondrial function may mediate some aspect of pathogenesis. Regional differences in the distribution or number of mitochondria, mitochondrial enzyme activities, enzyme expression levels, mitochondrial genes or availability of necessary metabolites become attractive explanations for selective vulnerability of a nervous system structure. While regionally selective mitochondrial vulnerability has been documented, regional variations in other cellular and tissue characteristics may also contribute to metabolic impairment. Such environmental variables include high tonic firing rates, neurotransmitter phenotype, location of mitochondria within a neuron, or the varied tissue perfusion pressure of different cerebral arterial branches. These contextual variables exert regionally distinct regulatory influences on mitochondria to tune their energy production to local demands. Thus to understand variations in mitochondrial functioning and consequent selective vulnerability to injury, the organelle must be placed within the context of its cellular, functional, developmental and neuroanatomical environment.

  4. Working Memory Mechanism in Proportional Quantifier Verification

    ERIC Educational Resources Information Center

    Zajenkowski, Marcin; Szymanik, Jakub; Garraffa, Maria

    2014-01-01

    The paper explores the cognitive mechanisms involved in the verification of sentences with proportional quantifiers (e.g. "More than half of the dots are blue"). The first study shows that the verification of proportional sentences is more demanding than the verification of sentences such as: "There are seven blue and eight yellow…

  5. Toward Regional Fossil Fuel CO2 Emissions Verification Using WRF-CHEM

    NASA Astrophysics Data System (ADS)

    Delle Monache, L.; Kosović, B.; Cameron-Smith, P.; Bergmann, D.; Grant, K.; Guilderson, T.

    2008-12-01

    As efforts to reduce emissions of greenhouse gases take shape, it is becoming obvious that an essential component of a viable solution will involve emission verification. While detailed inventories of greenhouse gas sources will represent an important component of the solution, additional verification methodologies will be necessary to reduce uncertainties in emission estimates, especially for distributed sources and CO2 offsets. We developed tools for solving the inverse dispersion problem for distributed emissions of greenhouse gases. For that purpose we combine a probabilistic inverse methodology based on Bayesian inversion with stochastic sampling and the weather forecasting and air quality model WRF-CHEM. We demonstrate estimation of CO2 emissions associated with fossil fuel burning in California over two one-week periods in 2006. We use WRF-CHEM in tracer simulation mode to solve the forward dispersion problem for emissions over eleven air basins. We first use a direct inversion approach to determine optimal locations for a limited number of CO2-C14 isotope sensors. We then use Bayesian inference with stochastic sampling to determine probability distributions for emissions from California air basins. Moreover, we vary the number of sensors and the frequency of measurements to study their effect on the accuracy and uncertainty level of the emission estimation. Finally, to take into account uncertainties associated with forward modeling, we combine Bayesian inference and stochastic sampling with ensemble modeling. The ensemble is created by running WRF-CHEM with different initial and boundary conditions as well as different boundary layer and surface model options. This work was performed under the auspices of the U.S. Department of Energy by Lawrence Livermore National Laboratory (LLNL) under Contract DE-AC52-07NA27344 (LLNL-ABS-406901-DRAFT). The project 07-ERD-064 was funded by the Laboratory Directed Research and Development Program at LLNL.
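
    The abstract outlines Bayesian inference with stochastic sampling over a forward transport model; the toy sketch below illustrates that structure with a random stand-in transport matrix and a Metropolis sampler (all names and noise levels are assumptions of this illustration, not the study's configuration):

        import numpy as np

        rng = np.random.default_rng(0)
        n_basins, n_obs = 4, 20
        H = rng.uniform(0.0, 1.0, (n_obs, n_basins))  # stand-in for tracer runs
        q_true = np.array([3.0, 1.0, 2.0, 0.5])       # "true" basin emissions
        y = H @ q_true + rng.normal(0.0, 0.1, n_obs)  # synthetic sensor data

        def log_post(q, sigma=0.1):
            if np.any(q < 0):          # nonnegative-emission prior
                return -np.inf
            r = y - H @ q
            return -0.5 * np.sum(r ** 2) / sigma ** 2

        q = np.ones(n_basins)
        samples = []
        for _ in range(20000):         # Metropolis random walk
            prop = q + rng.normal(0.0, 0.05, n_basins)
            if np.log(rng.random()) < log_post(prop) - log_post(q):
                q = prop
            samples.append(q.copy())
        posterior = np.array(samples[5000:])   # discard burn-in
        print(posterior.mean(axis=0))          # should land near q_true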

  6. Forecast Validation and Verification for Earthquakes, Weather and Finance

    NASA Astrophysics Data System (ADS)

    Rundle, J. B.; Holliday, J. R.; Turcotte, D. L.; Donnellan, A.; Tiampo, K.

    2009-04-01

    Techniques for earthquake forecasting are in development using both seismicity data mining methods, as well as numerical simulations. Testing such forecasts is necessary not only to determine forecast quality, but also to carry out forecast improvement. A large number of techniques to validate and verify forecasts have been developed for weather and financial applications. Many of these have been elaborated in public locations, including http://www.bom.gov.au/bmrc/wefor/staff/eee/verif/verif_web_page.html. Typically, the goal is to test for forecast resolution, reliability and sharpness. A good forecast is characterized by consistency, quality and value. Most, if not all of these forecast verification procedures can be readily applied to earthquake forecasts as well. In this talk, we discuss a number of these methods, and show how they might be useful for both fault-based forecasting, a group that includes the WGCEP and simulator-based renewal models, and grid-based forecasting, which includes the Relative Intensity, Pattern Informatics, and smoothed seismicity methods. We find that applying these standard methods of forecast verification is straightforward, and we conclude that judgments about the quality of a given forecast method can often depend on the test applied.
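
    Many of the catalogued weather-style scores are easy to state in code; as one example, here is a sketch of the Brier score with Murphy's reliability/resolution/uncertainty decomposition for binned probability forecasts (bin count and data are illustrative):

        import numpy as np

        def brier_decomposition(p, o, n_bins=10):
            """p: forecast probabilities in [0, 1]; o: binary outcomes.
            Returns (BS, reliability, resolution, uncertainty); with
            within-bin averaging, BS ~= REL - RES + UNC."""
            p, o = np.asarray(p, float), np.asarray(o, float)
            bs, obar = np.mean((p - o) ** 2), o.mean()
            bins = np.clip((p * n_bins).astype(int), 0, n_bins - 1)
            rel = res = 0.0
            for k in range(n_bins):
                m = bins == k
                if m.any():
                    rel += m.sum() * (p[m].mean() - o[m].mean()) ** 2
                    res += m.sum() * (o[m].mean() - obar) ** 2
            n = len(p)
            return bs, rel / n, res / n, obar * (1.0 - obar)

        p = [0.1, 0.8, 0.3, 0.9, 0.2, 0.7]
        o = [0, 1, 0, 1, 1, 1]
        print(brier_decomposition(p, o, n_bins=5))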

  7. Automated verification system user's guide

    NASA Technical Reports Server (NTRS)

    Hoffman, R. H.

    1972-01-01

    Descriptions of the operational requirements for all of the programs of the Automated Verification System (AVS) are provided. The AVS programs are: (1) FORTRAN code analysis and instrumentation program (QAMOD); (2) Test Effectiveness Evaluation Program (QAPROC); (3) Transfer Control Variable Tracking Program (QATRAK); (4) Program Anatomy Table Generator (TABGEN); and (5) Network Path Analysis Program (RAMBLE).

  8. Verification Challenges at Low Numbers

    SciTech Connect

    Benz, Jacob M.; Booker, Paul M.; McDonald, Benjamin S.

    2013-07-16

    This paper will explore the difficulties of deep reductions by examining the technical verification challenges. At each step on the road to low numbers, the verification required to ensure compliance of all parties will increase significantly. Looking post-New START, the next step will likely include warhead limits in the neighborhood of 1000 (Pifer 2010). Further reductions will include stepping stones at 100’s of warheads, and then 10’s of warheads, before final elimination of the last few remaining warheads and weapons could be considered. This paper will focus on these three threshold reduction levels: 1000, 100’s, and 10’s. For each, the issues and challenges will be discussed, potential solutions will be identified, and the verification technologies and chain-of-custody measures that address these solutions will be surveyed. It is important to note that many of the issues that need to be addressed have no current solution. In these cases, the paper will explore new or novel technologies that could be applied. These technologies will draw from the research and development that is ongoing throughout the national lab complex, and will look at technologies utilized in other areas of industry for their application to arms control verification.

  9. Chip connectivity verification program

    NASA Technical Reports Server (NTRS)

    Riley, Josh (Inventor); Patterson, George (Inventor)

    1999-01-01

    A method for testing electrical connectivity between conductive structures on a chip that is preferably layered with conductive and nonconductive layers. The method includes determining the layer on which each structure is located and defining the perimeter of each structure. Conductive layer connections between each of the layers are determined, and, for each structure, the points of intersection between the perimeter of that structure and the perimeter of each other structure on the chip are also determined. Finally, electrical connections between the structures are determined using the points of intersection and the conductive layer connections.
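
    Once perimeter intersections and inter-layer connections are known, the patented procedure reduces to a graph-connectivity computation; a minimal sketch (structure IDs, layers, and link lists are hypothetical) using union-find:

        class UnionFind:
            def __init__(self, n):
                self.parent = list(range(n))
            def find(self, a):
                while self.parent[a] != a:
                    self.parent[a] = self.parent[self.parent[a]]  # path halving
                    a = self.parent[a]
                return a
            def union(self, a, b):
                self.parent[self.find(a)] = self.find(b)

        # structure id -> layer; perimeter intersections; inter-layer contacts
        layer = {0: "metal1", 1: "metal1", 2: "metal2", 3: "metal2"}
        intersections = [(0, 1), (2, 3)]
        layer_links = [(1, 2)]          # e.g., a via joining the two layers

        uf = UnionFind(len(layer))
        for a, b in intersections:
            if layer[a] == layer[b]:    # touching on one conductive layer
                uf.union(a, b)
        for a, b in layer_links:
            uf.union(a, b)
        print(uf.find(0) == uf.find(3)) # True: 0-1 touch, 1-2 via, 2-3 touch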

  10. A DICOM-RT-based toolbox for the evaluation and verification of radiotherapy plans

    NASA Astrophysics Data System (ADS)

    Spezi, E.; Lewis, D. G.; Smith, C. W.

    2002-12-01

    The verification of radiotherapy plans is an essential step in the treatment planning process. This is especially important for highly conformal and IMRT plans which produce non-intuitive fluence maps and complex 3D dose distributions. In this work we present a DICOM (Digital Imaging and Communication in Medicine) based toolbox, developed for the evaluation and the verification of radiotherapy treatment plans. The toolbox offers the possibility of importing treatment plans generated with different calculation algorithms and/or different optimization engines and evaluating dose distributions on an independent platform. Furthermore the radiotherapy set-up can be exported to the BEAM Monte Carlo code system for dose verification. This can be done by simulating the irradiation of the patient CT dataset or the irradiation of a software-generated water phantom. We show the application of some of the functions implemented in this toolbox for the evaluation and verification of an IMRT treatment of the head and neck region.

  11. A DICOM-RT-based toolbox for the evaluation and verification of radiotherapy plans.

    PubMed

    Spezi, E; Lewis, D G; Smith, C W

    2002-12-07

    The verification of radiotherapy plans is an essential step in the treatment planning process. This is especially important for highly conformal and IMRT plans which produce non-intuitive fluence maps and complex 3D dose distributions. In this work we present a DICOM (Digital Imaging and Communication in Medicine) based toolbox, developed for the evaluation and the verification of radiotherapy treatment plans. The toolbox offers the possibility of importing treatment plans generated with different calculation algorithms and/or different optimization engines and evaluating dose distributions on an independent platform. Furthermore the radiotherapy set-up can be exported to the BEAM Monte Carlo code system for dose verification. This can be done by simulating the irradiation of the patient CT dataset or the irradiation of a software-generated water phantom. We show the application of some of the functions implemented in this toolbox for the evaluation and verification of an IMRT treatment of the head and neck region.

  12. 3D VMAT Verification Based on Monte Carlo Log File Simulation with Experimental Feedback from Film Dosimetry

    PubMed Central

    Barbeiro, A. R.; Ureba, A.; Baeza, J. A.; Linares, R.; Perucha, M.; Jiménez-Ortega, E.; Velázquez, S.; Mateos, J. C.

    2016-01-01

    A model based on a specific phantom, called QuAArC, has been designed for the evaluation of planning and verification systems of complex radiotherapy treatments, such as volumetric modulated arc therapy (VMAT). This model uses the high accuracy provided by the Monte Carlo (MC) simulation of log files and allows experimental feedback from the high spatial resolution of films hosted in QuAArC. This cylindrical phantom was specifically designed to host films rolled at different radial distances, able to capture both the entrance fluence and the 3D dose distribution. Ionization chamber measurements are also included in the feedback process for absolute dose considerations. In this way, automated MC simulation of treatment log files is implemented to calculate the actual delivery geometries, while the monitor units are experimentally adjusted to reconstruct the dose-volume histogram (DVH) on the patient CT. Prostate and head-and-neck clinical cases, previously planned with the Monaco and Pinnacle treatment planning systems and verified with two different commercial systems (Delta4 and COMPASS), were selected in order to test the operational feasibility of the proposed model. The proper operation of the feedback procedure was proved through the high agreement achieved between reconstructed dose distributions and the film measurements (global gamma passing rates > 90% for the 2%/2 mm criteria). The necessary discretization level of the log file for dose calculation and the potential mismatch between calculated control points and the detection grid in the verification process were discussed. Besides the effect of the dose calculation accuracy of the analytic algorithm implemented in treatment planning systems for a dynamic technique, the importance of the detection density level and its location in the VMAT-specific phantom for obtaining a more reliable DVH on the patient CT was also discussed. The proposed model also showed enough robustness and efficiency to be considered as a pre

  13. 3D VMAT Verification Based on Monte Carlo Log File Simulation with Experimental Feedback from Film Dosimetry.

    PubMed

    Barbeiro, A R; Ureba, A; Baeza, J A; Linares, R; Perucha, M; Jiménez-Ortega, E; Velázquez, S; Mateos, J C; Leal, A

    2016-01-01

    A model based on a specific phantom, called QuAArC, has been designed for the evaluation of planning and verification systems of complex radiotherapy treatments, such as volumetric modulated arc therapy (VMAT). This model uses the high accuracy provided by the Monte Carlo (MC) simulation of log files and allows experimental feedback from the high spatial resolution of films hosted in QuAArC. This cylindrical phantom was specifically designed to host films rolled at different radial distances, able to capture both the entrance fluence and the 3D dose distribution. Ionization chamber measurements are also included in the feedback process for absolute dose considerations. In this way, automated MC simulation of treatment log files is implemented to calculate the actual delivery geometries, while the monitor units are experimentally adjusted to reconstruct the dose-volume histogram (DVH) on the patient CT. Prostate and head-and-neck clinical cases, previously planned with the Monaco and Pinnacle treatment planning systems and verified with two different commercial systems (Delta4 and COMPASS), were selected in order to test the operational feasibility of the proposed model. The proper operation of the feedback procedure was proved through the high agreement achieved between reconstructed dose distributions and the film measurements (global gamma passing rates > 90% for the 2%/2 mm criteria). The necessary discretization level of the log file for dose calculation and the potential mismatch between calculated control points and the detection grid in the verification process were discussed. Besides the effect of the dose calculation accuracy of the analytic algorithm implemented in treatment planning systems for a dynamic technique, the importance of the detection density level and its location in the VMAT-specific phantom for obtaining a more reliable DVH on the patient CT was also discussed. The proposed model also showed enough robustness and efficiency to be considered as a pre
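
    The gamma passing rates quoted above come from the standard gamma-index test; a simplified 1-D global-gamma sketch (the clinical analysis runs on 2-D film planes or 3-D grids, and the profiles here are synthetic) is:

        import numpy as np

        def gamma_1d(dose_eval, dose_ref, x, dose_tol=0.02, dist_tol=2.0):
            """Per-point gamma of an evaluated vs. reference profile sampled
            at positions x (mm), with global 2%/2 mm style criteria."""
            d_norm = dose_tol * dose_ref.max()    # global dose criterion
            gam = np.empty_like(dose_ref)
            for i, (xr, dr) in enumerate(zip(x, dose_ref)):
                g2 = ((x - xr) / dist_tol) ** 2 + ((dose_eval - dr) / d_norm) ** 2
                gam[i] = np.sqrt(g2.min())
            return gam

        x = np.linspace(0.0, 100.0, 201)
        ref = np.exp(-((x - 50.0) / 15.0) ** 2)          # synthetic profile
        ev = 1.01 * np.exp(-((x - 50.5) / 15.0) ** 2)    # shifted, rescaled
        g = gamma_1d(ev, ref, x)
        print(f"passing rate: {100.0 * np.mean(g <= 1.0):.1f}%")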

  14. Design, Implementation, and Verification of the Reliable Multicast Protocol. Thesis

    NASA Technical Reports Server (NTRS)

    Montgomery, Todd L.

    1995-01-01

    This document describes the Reliable Multicast Protocol (RMP) design, first implementation, and formal verification. RMP provides a totally ordered, reliable, atomic multicast service on top of an unreliable multicast datagram service. RMP is fully and symmetrically distributed so that no site bears an undue portion of the communications load. RMP provides a wide range of guarantees, from unreliable delivery to totally ordered delivery, to K-resilient, majority resilient, and totally resilient atomic delivery. These guarantees are selectable on a per-message basis. RMP provides many communication options, including virtual synchrony, a publisher/subscriber model of message delivery, a client/server model of delivery, mutually exclusive handlers for messages, and mutually exclusive locks. It has been commonly believed that total ordering of messages can only be achieved at great performance expense. RMP discounts this belief. The first implementation of RMP has been shown to provide high throughput performance on Local Area Networks (LAN). For two or more destinations on a single LAN, RMP provides higher throughput than any other protocol that does not use multicast or broadcast technology. The design, implementation, and verification activities of RMP have occurred concurrently, which has allowed the verification to maintain a high fidelity among the design, implementation, and verification models. The restrictions of implementation have influenced the design earlier than in normal sequential approaches. The protocol as a whole has matured more smoothly through the inclusion of several different perspectives into the product development.

  15. LOCATING MONITORING STATIONS IN WATER DISTRIBUTION SYSTEMS

    EPA Science Inventory

    Water undergoes changes in quality between the time it leaves the treatment plant and the time it reaches the customer's tap, making it important to select monitoring stations that will adequately monitor these changes. But because there is no uniform schedule or framework for ...

  16. Verification of Loop Diagnostics

    NASA Technical Reports Server (NTRS)

    Winebarger, A.; Lionello, R.; Mok, Y.; Linker, J.; Mikic, Z.

    2014-01-01

    Many different techniques have been used to characterize the plasma in the solar corona: density-sensitive spectral line ratios are used to infer the density, the evolution of coronal structures in different passbands is used to infer the temperature evolution, and the simultaneous intensities measured in multiple passbands are used to determine the emission measure. All these analysis techniques assume that the intensity of the structures can be isolated through background subtraction. In this paper, we use simulated observations from a 3D hydrodynamic simulation of a coronal active region to verify these diagnostics. The density and temperature from the simulation are used to generate images in several passbands and spectral lines. We identify loop structures in the simulated images and calculate the loop background. We then determine the density, temperature and emission measure distribution as a function of time from the observations and compare with the true temperature and density of the loop. We find that the overall characteristics of the temperature, density, and emission measure are recovered by the analysis methods, but the details of the true temperature and density are not. For instance, the emission measure curves calculated from the simulated observations are much broader than the true emission measure distribution, though the average temperature evolution is similar. These differences are due, in part, to inadequate background subtraction, but also indicate a limitation of the analysis methods.
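
    The emission measure distribution referred to above is, per line of sight, EM(T) = sum of n_e^2 dl binned in temperature; a minimal sketch of that bookkeeping on simulated cells (cell values are illustrative, not from the simulation described) is:

        import numpy as np

        def em_distribution(n_e, T, dl, logT_edges):
            """n_e [cm^-3], T [K], dl [cm] per cell along the line of sight;
            returns EM [cm^-5] per log10(T) bin."""
            em = np.zeros(len(logT_edges) - 1)
            idx = np.digitize(np.log10(T), logT_edges) - 1
            for i, k in enumerate(idx):
                if 0 <= k < len(em):
                    em[k] += n_e[i] ** 2 * dl[i]
            return em

        n_e = np.full(100, 1.0e9)                 # uniform-density loop
        T = np.linspace(8.0e5, 2.5e6, 100)        # temperature gradient
        dl = np.full(100, 1.0e8)                  # 1000 km cells
        print(em_distribution(n_e, T, dl, np.arange(5.8, 6.6, 0.1)))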

  17. Verification of the karst flow model under laboratory controlled conditions

    NASA Astrophysics Data System (ADS)

    Gotovac, Hrvoje; Andric, Ivo; Malenica, Luka; Srzic, Veljko

    2016-04-01

    Karst aquifers are very important groundwater resources around the world, as well as in the coastal part of Croatia. They consist of an extremely complex structure defined by a slow, laminar porous medium and small fissures, and usually fast, turbulent conduits/karst channels. Apart from simple lumped hydrological models that ignore the high karst heterogeneity, full hydraulic (distributive) models have been developed exclusively with conventional finite element and finite volume methods, considering the complete karst heterogeneity structure, which improves our understanding of complex processes in karst. Groundwater flow modeling in complex karst aquifers is faced with many difficulties, such as a lack of heterogeneity knowledge (especially of conduits), resolution of different spatial/temporal scales, connectivity between matrix and conduits, setting of appropriate boundary conditions, and many others. A particular problem of karst flow modeling is the verification of distributive models under real aquifer conditions, due to the lack of the above-mentioned information. Therefore, we show here the possibility of verifying karst flow models under laboratory controlled conditions. The special 3-D karst flow model (5.6*2.6*2 m) consists of a concrete construction, a rainfall platform, 74 piezometers, 2 reservoirs and other supply equipment. The model is filled with fine sand (3-D porous matrix) and drainage plastic pipes (1-D conduits). This model provides knowledge of the full heterogeneity structure, including the position of the different sand layers as well as the conduit locations and geometry. Moreover, we know the geometry of the conduit perforations, which enables analysis of the interaction between matrix and conduits. In addition, the pressure and precipitation distributions and the discharge flow rates from both phases can be measured very accurately. These possibilities are not available at real sites, which makes this model much more useful for karst flow modeling. Many experiments were performed under different controlled conditions such as different

  18. Realistic weather simulations and forecast verification with COSMO-EULAG

    NASA Astrophysics Data System (ADS)

    Wójcik, Damian; Piotrowski, Zbigniew; Rosa, Bogdan; Ziemiański, Michał

    2015-04-01

    Research conducted at the Polish Institute of Meteorology and Water Management, National Research Institute, in collaboration with the Consortium for Small Scale Modeling (COSMO) resulted in the development of a new prototype model, COSMO-EULAG. The dynamical core of the new model is based on the anelastic set of equations and numerics adopted from the EULAG model. The core is coupled, to first-order accuracy, to the COSMO physical parameterizations involving turbulence, friction, radiation, moist processes and surface fluxes. The tool is capable of computing weather forecasts in mountainous areas for horizontal resolutions ranging from 2.2 km to 0.1 km and with slopes reaching 82 degrees of inclination. Employing EULAG makes it possible to profit from its desirable conservative properties and numerical robustness, confirmed in a number of benchmark tests and widely documented in the scientific literature. In this study we show a realistic case study of Alpine summer convection simulated by COSMO-EULAG. It compares the convection-permitting realization of the flow using a 2.2 km horizontal grid size, typical for contemporary very-high-resolution regional NWP forecasts, with a realization of LES type using a grid size of 100 m. The study presents a comparison of the flow, cloud and precipitation structure together with the reference results of a standard compressible COSMO Runge-Kutta model forecast at 2.2 km horizontal resolution. The case study results are supplemented by COSMO-EULAG forecast verification results for the Alpine domain at 2.2 km horizontal resolution. Wind, temperature, cloud, humidity and precipitation scores are presented. The verification period covers one summer month (June 2013) and one autumn month (November 2013). Verification is based on data collected by a network of approximately 200 stations (surface data verification) and 6 stations (upper-air verification) located in the Alps and their vicinity.

  19. Subsurface barrier verification technologies, informal report

    SciTech Connect

    Heiser, J.H.

    1994-06-01

    One of the more promising remediation options available to the DOE waste management community is subsurface barriers. Some of the uses of subsurface barriers include surrounding and/or containing buried waste, as secondary confinement of underground storage tanks, to direct or contain subsurface contaminant plumes and to restrict remediation methods, such as vacuum extraction, to a limited area. To be most effective the barriers should be continuous and depending on use, have few or no breaches. A breach may be formed through numerous pathways including: discontinuous grout application, from joints between panels and from cracking due to grout curing or wet-dry cycling. The ability to verify barrier integrity is valuable to the DOE, EPA, and commercial sector and will be required to gain full public acceptance of subsurface barriers as either primary or secondary confinement at waste sites. It is recognized that no suitable method exists for the verification of an emplaced barrier's integrity. The large size and deep placement of subsurface barriers makes detection of leaks challenging. This becomes magnified if the permissible leakage from the site is low. Detection of small cracks (fractions of an inch) at depths of 100 feet or more has not been possible using existing surface geophysical techniques. Compounding the problem of locating flaws in a barrier is the fact that no placement technology can guarantee the completeness or integrity of the emplaced barrier. This report summarizes several commonly used or promising technologies that have been or may be applied to in-situ barrier continuity verification.

  20. Wavelet Features Based Fingerprint Verification

    NASA Astrophysics Data System (ADS)

    Bagadi, Shweta U.; Thalange, Asha V.; Jain, Giridhar P.

    2010-11-01

    In this work, we present an automatic fingerprint identification system based on Level 3 features. Systems based only on minutiae features do not perform well for poor-quality images. In practice, we often encounter extremely dry or wet fingerprint images with cuts, warts, etc. Due to such fingerprints, minutiae-based systems show poor performance in real-time authentication applications. To alleviate the problem of poor-quality fingerprints, and to improve the overall performance of the system, this paper proposes fingerprint verification based on wavelet statistical features and co-occurrence matrix features. The features include mean, standard deviation, energy, entropy, contrast, local homogeneity, cluster shade, cluster prominence, and information measure of correlation. In this method, matching can be done between the input image and the stored template without exhaustive search using the extracted features. The wavelet-transform-based approach performs better than the existing minutiae-based method, takes less response time, and is hence suitable for on-line verification with high accuracy.
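
    The wavelet-statistics half of the proposed feature set can be sketched with PyWavelets (assumed available; the paper's full pipeline adds the co-occurrence-matrix measures and a matching threshold tuned on real fingerprint images):

        import numpy as np
        import pywt  # PyWavelets

        def wavelet_features(img):
            """Single-level 2-D DWT, then mean/std/energy/entropy per subband."""
            cA, (cH, cV, cD) = pywt.dwt2(img.astype(float), "db2")
            feats = []
            for band in (cA, cH, cV, cD):
                e = band ** 2
                p = e / (e.sum() + 1e-12)          # normalized energy distribution
                entropy = -np.sum(p * np.log2(p + 1e-12))
                feats += [band.mean(), band.std(), e.sum(), entropy]
            return np.array(feats)

        rng = np.random.default_rng(1)
        template = wavelet_features(rng.random((128, 128)))  # enrolled print
        probe = wavelet_features(rng.random((128, 128)))     # query print
        distance = np.linalg.norm(template - probe)  # compare against a threshold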

  1. NEXT Thruster Component Verification Testing

    NASA Technical Reports Server (NTRS)

    Pinero, Luis R.; Sovey, James S.

    2007-01-01

    Component testing is a critical part of thruster life validation activities under NASA's Evolutionary Xenon Thruster (NEXT) project. The high-voltage propellant isolators were selected for design verification testing. Even though they are based on a heritage design, design changes were made because the isolators will be operated under different environmental conditions, including temperature, voltage, and pressure. The life test of two NEXT isolators was therefore initiated and has accumulated more than 10,000 hr of operation. Measurements to date indicate only a negligibly small increase in leakage current. The cathode heaters were also selected for verification testing. The technology to fabricate these heaters, developed for the International Space Station plasma contactor hollow cathode assembly, was transferred to Aerojet for the fabrication of the NEXT prototype model ion thrusters. Testing the contractor-fabricated heaters is necessary to validate fabrication processes for high-reliability heaters. This paper documents the status of the propellant isolator and cathode heater tests.

  2. Ontology Matching with Semantic Verification

    PubMed Central

    Jean-Mary, Yves R.; Shironoshita, E. Patrick; Kabuka, Mansur R.

    2009-01-01

    ASMOV (Automated Semantic Matching of Ontologies with Verification) is a novel algorithm that uses lexical and structural characteristics of two ontologies to iteratively calculate a similarity measure between them, derives an alignment, and then verifies it to ensure that it does not contain semantic inconsistencies. In this paper, we describe the ASMOV algorithm, and then present experimental results that measure its accuracy using the OAEI 2008 tests, and that evaluate its use with two different thesauri: WordNet, and the Unified Medical Language System (UMLS). These results show the increased accuracy obtained by combining lexical, structural and extensional matchers with semantic verification, and demonstrate the advantage of using a domain-specific thesaurus for the alignment of specialized ontologies. PMID:20186256

  3. Verification and transparency in future arms control

    SciTech Connect

    Pilat, J.F.

    1996-09-01

    Verification's importance has changed dramatically over time, although it always has been in the forefront of arms control. The goals and measures of verification and the criteria for success have changed with the times as well, reflecting such factors as the centrality of the prospective agreement to East-West relations during the Cold War, the state of relations between the United States and the Soviet Union, and the technologies available for monitoring. Verification's role may be declining in the post-Cold War period. The prospects for such a development will depend, first and foremost, on the high costs of traditional arms control, especially those associated with requirements for verification. Moreover, the growing interest in informal, or non-negotiated, arms control does not allow for verification provisions by the very nature of these arrangements. Multilateral agreements are also becoming more prominent and argue against highly effective verification measures, in part because of fears of promoting proliferation by opening sensitive facilities to inspectors from potential proliferant states. As a result, it is likely that transparency and confidence-building measures will achieve greater prominence, both as supplements to and substitutes for traditional verification. Such measures are not panaceas and do not offer all that we came to expect from verification during the Cold War. But they may be the best possible means to deal with current problems of arms reductions and restraints at acceptable levels of expenditure.

  4. Crowd-Sourced Program Verification

    DTIC Science & Technology

    2012-12-01

    … the contractor constructed a prototype of a crowd-sourced verification system that takes as input a given program and produces as output a

  5. Formal verification of AI software

    NASA Technical Reports Server (NTRS)

    Rushby, John; Whitehurst, R. Alan

    1989-01-01

    The application of formal verification techniques to Artificial Intelligence (AI) software, particularly expert systems, is investigated. Constraint satisfaction and model inversion are identified as two formal specification paradigms for different classes of expert systems. A formal definition of consistency is developed, and the notion of approximate semantics is introduced. Examples are given of how these ideas can be applied in both declarative and imperative forms.

  6. Nuclear Data Verification and Standardization

    SciTech Connect

    Karam, Lisa R.; Arif, Muhammad; Thompson, Alan K.

    2011-10-01

    The objective of this interagency program is to provide accurate neutron interaction verification and standardization data for the U.S. Department of Energy Division of Nuclear Physics programs, which include astrophysics, radioactive beam studies, and heavy-ion reactions. The measurements made in this program are also useful to other programs that indirectly use the unique properties of the neutron for diagnostic and analytical purposes. These include homeland security, personnel health and safety, nuclear waste disposal, treaty verification, national defense, and nuclear-based energy production. The work includes the verification of reference standard cross sections and related neutron data employing the unique facilities and capabilities at NIST and other laboratories as required; leadership and participation in international intercomparisons and collaborations; and the preservation of standard reference deposits. An essential element of the program is the critical evaluation of neutron interaction data standards, including international coordination. Data testing of critical data for important applications is included. The program is jointly supported by the Department of Energy and the National Institute of Standards and Technology.

  7. Regression Verification Using Impact Summaries

    NASA Technical Reports Server (NTRS)

    Backes, John; Person, Suzette J.; Rungta, Neha; Thachuk, Oksana

    2013-01-01

    Regression verification techniques are used to prove equivalence of syntactically similar programs. Checking equivalence of large programs, however, can be computationally expensive. Existing regression verification techniques rely on abstraction and decomposition techniques to reduce the computational effort of checking equivalence of the entire program. These techniques are sound but not complete. In this work, we propose a novel approach to improve scalability of regression verification by classifying the program behaviors generated during symbolic execution as either impacted or unimpacted. Our technique uses a combination of static analysis and symbolic execution to generate summaries of impacted program behaviors. The impact summaries are then checked for equivalence using an off-the-shelf decision procedure. We prove that our approach is both sound and complete for sequential programs, with respect to the depth bound of symbolic execution. Our evaluation on a set of sequential C artifacts shows that reducing the size of the summaries can help reduce the cost of software equivalence checking. Various reduction, abstraction, and compositional techniques have been developed to help scale software verification techniques to industrial-sized systems. Although such techniques have greatly increased the size and complexity of systems that can be checked, analysis of large software systems remains costly. Regression analysis techniques, e.g., regression testing [16], regression model checking [22], and regression verification [19], restrict the scope of the analysis by leveraging the differences between program versions. These techniques are based on the idea that if code is checked early in development, then subsequent versions can be checked against a prior (checked) version, leveraging the results of the previous analysis to reduce analysis cost of the current version. Regression verification addresses the problem of proving equivalence of closely related program
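
    The equivalence check at the core of regression verification can be illustrated with an off-the-shelf SMT solver (a toy example using the z3-solver Python bindings, not the paper's impact-summary machinery):

        from z3 import Int, Solver, unsat  # pip install z3-solver

        x = Int("x")
        v1 = x * 2 + 4        # version 1: 2x + 4
        v2 = (x + 2) * 2      # version 2: refactored form

        s = Solver()
        s.add(v1 != v2)       # ask for an input that distinguishes the versions
        if s.check() == unsat:
            print("versions are equivalent for all inputs")
        else:
            print("counterexample:", s.model())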

  8. Verification Challenges at Low Numbers

    SciTech Connect

    Benz, Jacob M.; Booker, Paul M.; McDonald, Benjamin S.

    2013-06-01

    Many papers have dealt with the political difficulties and ramifications of deep nuclear arms reductions, and the issues of “Going to Zero”. Political issues include extended deterrence, conventional weapons, ballistic missile defense, and regional and geo-political security issues. At each step on the road to low numbers, the verification required to ensure compliance of all parties will increase significantly. Looking post-New START, the next step will likely include warhead limits in the neighborhood of 1000. Further reductions will include stepping stones at 1000 warheads, 100’s of warheads, and then 10’s of warheads, before final elimination of the last few remaining warheads and weapons could be considered. This paper will focus on these three threshold reduction levels: 1000, 100’s, and 10’s. For each, the issues and challenges will be discussed, potential solutions will be identified, and the verification technologies and chain-of-custody measures that address these solutions will be surveyed. It is important to note that many of the issues that need to be addressed have no current solution. In these cases, the paper will explore new or novel technologies that could be applied. These technologies will draw from the research and development that is ongoing throughout the national laboratory complex, and will look at technologies utilized in other areas of industry for their application to arms control verification.

  9. PERFORMANCE VERIFICATION OF ANIMAL WATER TREATMENT TECHNOLOGIES THROUGH EPA'S ENVIRONMENTAL TECHNOLOGY VERIFICATION PROGRAM

    EPA Science Inventory

    The U.S. Environmental Protection Agency created the Environmental Technology Verification Program (ETV) to further environmental protection by accelerating the commercialization of new and innovative technology through independent performance verification and dissemination of in...

  10. PERFORMANCE VERIFICATION OF STORMWATER TREATMENT DEVICES UNDER EPA'S ENVIRONMENTAL TECHNOLOGY VERIFICATION PROGRAM

    EPA Science Inventory

    The Environmental Technology Verification (ETV) Program was created to facilitate the deployment of innovative or improved environmental technologies through performance verification and dissemination of information. The program's goal is to further environmental protection by a...

  11. Compendium of Arms Control Verification Proposals.

    DTIC Science & Technology

    1982-03-01

    Contents include chapters on zonal on-site inspection, control posts (Chapter D), and records monitoring (Chapter E), each describing in general the significant features of the verification method concerned. Chapters A to C deal with verification by direct on-site inspection (i.e., increasing as confidence develops), and chapter D with control or observation posts. Chapter E deals with verification by examination of

  12. Magnetic cleanliness verification approach on tethered satellite

    NASA Technical Reports Server (NTRS)

    Messidoro, Piero; Braghin, Massimo; Grande, Maurizio

    1990-01-01

    Magnetic cleanliness testing was performed on the Tethered Satellite as the last step of an articulated verification campaign aimed at demonstrating the capability of the satellite to support its TEMAG (TEthered MAgnetometer) experiment. Tests at unit level and analytical predictions/correlations using a dedicated mathematical model (GANEW program) are also part of the verification activities. Details of the tests are presented, and the results of the verification are described together with recommendations for later programs.

  13. Using tools for verification, documentation and testing

    NASA Technical Reports Server (NTRS)

    Osterweil, L. J.

    1978-01-01

    Methodologies for four of the major approaches to program upgrading are discussed, namely dynamic testing, symbolic execution, formal verification and static analysis. The different patterns of strengths, weaknesses and applications of these approaches are shown. It is demonstrated that these patterns are in many ways complementary, offering the hope that they can be coordinated and unified into a single comprehensive program testing and verification system capable of performing a diverse and useful variety of error detection, verification and documentation functions.

  14. Location-dependent RF geotags for positioning and security

    NASA Astrophysics Data System (ADS)

    Qiu, Di; Lynch, Robert; Yang, Chun

    2011-06-01

    Geo-security service, which refers to the authorization of persons or facilities based on their distinctive location information, is an application of the fields of positioning, navigation and timing (PNT). Location features from radio navigation signals are mapped into a precise verification tag, or geotag, used to block or allow a certain action or access. A device that integrates a location sensor and the geotag generation algorithm is tamper-resistant; that is, one cannot spoof the device to bypass the location validation. This paper develops a theoretical framework for geotag-based positioning and security systems, and evaluates the system performance analytically and experimentally using Loran signals as a case study.
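
    As a sketch of the geotag idea (the cell size, salt, and function names are assumptions of this illustration, not the paper's algorithm), one can quantize location so that small sensor noise maps to the same cell and then hash the cell with a device secret, so raw coordinates are never stored:

        import hashlib
        import struct

        def geotag(lat, lon, salt: bytes, cell_deg=0.001):
            """Quantize to ~100 m cells; identical cells yield identical tags."""
            qlat, qlon = round(lat / cell_deg), round(lon / cell_deg)
            return hashlib.sha256(salt + struct.pack(">qq", qlat, qlon)).hexdigest()

        enrolled = geotag(41.0890, -1.0312, b"device-secret")
        claimed = geotag(41.0894, -1.0309, b"device-secret")  # nearby fix
        print(enrolled == claimed)  # True: both fixes fall in the same cell

    Note that naive quantization fails for fixes that straddle a cell boundary; practical schemes use error-tolerant encodings (e.g., fuzzy extractors) for exactly this reason.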

  15. METHOD OF LOCATING GROUNDS

    DOEpatents

    Macleish, K.G.

    1958-02-11

    This patent presents a method for locating a ground in a d-c circuit having a number of parallel branches connected across a d-c source or generator. The complete method comprises the steps of locating the ground with reference to the midpoint of the parallel branches by connecting a potentiometer across the terminals of the circuit and connecting the slider of the potentiometer to ground through a current-indicating instrument, adjusting the slider to the right or left of the midpoint so as to cause the instrument to indicate zero, connecting the terminal of the network which is farthest from the ground as thus indicated by the potentiometer to ground through a condenser, impressing a ripple voltage on the circuit, and then measuring the ripple voltage at the midpoint of each parallel branch to find the branch with the lowest value of ripple voltage, and then measuring the distribution of the ripple voltage along this branch to determine the point at which the ripple voltage drops off to zero or substantially zero due to the existence of a ground. The invention has particular application where a circuit ground is present which will disappear if the normal circuit voltage is removed.

  16. Automated verification of flight software. User's manual

    NASA Technical Reports Server (NTRS)

    Saib, S. H.

    1982-01-01

    AVFS (Automated Verification of Flight Software), a collection of tools for analyzing source programs written in FORTRAN and AED, is documented. The quality and the reliability of flight software are improved by: (1) indented listings of source programs, (2) static analysis to detect inconsistencies in the use of variables and parameters, (3) automated documentation, (4) instrumentation of source code, (5) retesting guidance, (6) analysis of assertions, (7) symbolic execution, (8) generation of verification conditions, and (9) simplification of verification conditions. Use of AVFS in the verification of flight software is described.

  17. The SeaHorn Verification Framework

    NASA Technical Reports Server (NTRS)

    Gurfinkel, Arie; Kahsai, Temesghen; Komuravelli, Anvesh; Navas, Jorge A.

    2015-01-01

    In this paper, we present SeaHorn, a software verification framework. The key distinguishing feature of SeaHorn is its modular design that separates the concerns of the syntax of the programming language, its operational semantics, and the verification semantics. SeaHorn encompasses several novelties: it (a) encodes verification conditions using an efficient yet precise inter-procedural technique, (b) provides flexibility in the verification semantics to allow different levels of precision, (c) leverages the state-of-the-art in software model checking and abstract interpretation for verification, and (d) uses Horn-clauses as an intermediate language to represent verification conditions which simplifies interfacing with multiple verification tools based on Horn-clauses. SeaHorn provides users with a powerful verification tool and researchers with an extensible and customizable framework for experimenting with new software verification techniques. The effectiveness and scalability of SeaHorn are demonstrated by an extensive experimental evaluation using benchmarks from SV-COMP 2015 and real avionics code.

  18. Input apparatus for dynamic signature verification systems

    DOEpatents

    EerNisse, Errol P.; Land, Cecil E.; Snelling, Jay B.

    1978-01-01

    The disclosure relates to signature verification input apparatus comprising a writing instrument and platen containing piezoelectric transducers which generate signals in response to writing pressures.

  19. Hydrologic data-verification management program plan

    USGS Publications Warehouse

    Alexander, C.W.

    1982-01-01

    Data verification refers to the performance of quality control on hydrologic data that have been retrieved from the field and are being prepared for dissemination to water-data users. Water-data users now have access to computerized data files containing unpublished, unverified hydrologic data. Therefore, it is necessary to develop techniques and systems whereby the computer can perform some data-verification functions before the data are stored in user-accessible files. Computerized data-verification routines can be developed for this purpose. A single, unified concept describing a master data-verification program using multiple special-purpose subroutines, and a screen file containing verification criteria, can probably be adapted to any type and size of computer-processing system. Some traditional manual-verification procedures can be adapted for computerized verification, but new procedures can also be developed that would take advantage of the powerful statistical tools and data-handling procedures available to the computer. Prototype data-verification systems should be developed for all three data-processing environments as soon as possible. The WATSTORE system probably affords the greatest opportunity for long-range research and testing of new verification subroutines. (USGS)
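
    A minimal sketch of such a computerized screen-file check (station names and criteria are invented for illustration) might flag out-of-range values and implausible step changes before data reach user-accessible files:

        SCREEN_FILE = {
            "station_A": {"min": 0.0, "max": 500.0, "max_step": 50.0},  # discharge, cfs
        }

        def verify_series(station, values):
            c = SCREEN_FILE[station]
            flags, prev = [], None
            for i, v in enumerate(values):
                if not (c["min"] <= v <= c["max"]):
                    flags.append((i, v, "out of range"))
                elif prev is not None and abs(v - prev) > c["max_step"]:
                    flags.append((i, v, "excessive step change"))
                prev = v
            return flags

        print(verify_series("station_A", [10.0, 12.0, 11.5, 180.0, 11.0]))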

  20. Verification of regional climates of GISS GCM. Part 2: Summer

    NASA Technical Reports Server (NTRS)

    Druyan, Leonard M.; Rind, David

    1989-01-01

    Verification is made of the synoptic fields, sea-level pressure, precipitation rate, 200mb zonal wind and the surface resultant wind generated by two versions of the Goddard Institute for Space Studies (GISS) climate model. The models differ regarding the horizontal resolution of the computation grids and the specification of the sea-surface temperatures. Maps of the regional distributions of seasonal means of the model fields are shown alongside maps that show the observed distributions. Comparisons of the model results with observations are discussed and also summarized in tables according to geographic region.

  1. 40 CFR 1065.550 - Gas analyzer range verification and drift verification.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... Cycles § 1065.550 Gas analyzer range verification and drift verification. (a) Range verification. If an... with a CLD and the removed water is corrected based on measured CO2, CO, THC, and NOX concentrations...-specific emissions over the entire duty cycle for drift. For each constituent to be verified, both sets...

  2. Location-assured, multifactor authentication on smartphones via LTE communication

    NASA Astrophysics Data System (ADS)

    Kuseler, Torben; Lami, Ihsan A.; Al-Assam, Hisham

    2013-05-01

    With the added security provided by LTE, geographical location has become an important factor for authentication to enhance the security of remote client authentication during mCommerce applications using Smartphones. A tight combination of geographical location with classic authentication factors like PINs/biometrics in a real-time, remote verification scheme over the LTE layer connection assures the authenticator of the client itself (via PIN/biometric) as well as the client's current location, thus establishing the important aspects of "who", "when", and "where" of the authentication attempt without eavesdropping or man-in-the-middle attacks. To securely integrate location as an authentication factor into the remote authentication scheme, the client's location must be verified independently, i.e. the authenticator should not rely solely on the location determined on and reported by the client's Smartphone. The latest wireless data communication technology for mobile phones (4G LTE, Long-Term Evolution), recently being rolled out in various networks, can be employed to meet this requirement for independent location verification. LTE's Control Plane LBS provisions, when integrated with user-based authentication and an independent source of localisation factors, ensure secure, efficient, continuous location tracking of the Smartphone. This verification can be performed during normal operation of the LTE-based communication between the client and the network operator, enabling the authenticator to verify the client's claimed location more securely and accurately. Trials and experiments show that such an implementation is viable for today's Smartphone-based banking via LTE communication.
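
    A server-side sketch of the "who/when/where" check follows (the thresholds, the network-fix source, and the PIN digest scheme are assumptions of this illustration; a production system would use a salted KDF for the PIN and the operator's LBS interface for the independent network fix):

        import hashlib, hmac, math, time

        def distance_km(p, q):
            # equirectangular approximation; adequate at city scale
            kx = 111.32 * math.cos(math.radians((p[0] + q[0]) / 2.0))
            return math.hypot((p[0] - q[0]) * 110.57, (p[1] - q[1]) * kx)

        def authenticate(pin, pin_digest, client_fix, network_fix, ts,
                         max_sep_km=1.0, max_age_s=30.0):
            who = hmac.compare_digest(
                hashlib.sha256(pin.encode()).hexdigest(), pin_digest)   # "who"
            where = distance_km(client_fix, network_fix) <= max_sep_km  # "where"
            when = abs(time.time() - ts) <= max_age_s                   # "when"
            return who and where and when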

  3. Variability in the Galactic globular cluster M15. Science verification phase of T80Cam/JAST80@OAJ

    NASA Astrophysics Data System (ADS)

    Vázquez Ramió, H.; Varela, J.; Cristóbal-Hornillos, D.; Muniesa, D.; Civera, T.; Hernández-Fuertes, J.; Ederoclite, A.; Blanco Siffert, B.; Chies Santos, A.; San Roman, I.; Lamadrid, J. L.; Iglesias Marzoa, R.; Díaz-Martín, M. C.; Kanaan, A.; Carvano, J.; Cortesi, A.; Ribeiro, T.; Reis, R.; Coelho, P.; Castillo, J.; López, A.; López San Juan, C.; Cenarro, A. J.; Marín-Franch, A.; Yanes, A.; Moles, M.

    2017-03-01

    In the framework of the Science Verification Phase of T80Cam of the 83cm Javalambre Auxiliary Survey Telescope (JAST80) located at the Observatorio Astrofísico de Javalambre (OAJ), Teruel, Spain, a program was proposed to study the variability of RR Lyrae stars, as well as other variable sources, belonging to the Galactic globular cluster M15. The observations were carried out at different epochs (almost a dozen nights over a ~4-month period) using the complete set of 12 filters, centered at the optical spectral range, that are being devoted to the execution of the ongoing Javalambre Photometric Local Universe Survey (J-PLUS). One of the main goals is the characterization of the variability of the spectral energy distribution of RR Lyrae stars along their pulsation. This will be used to define methods to detect these types of variables in J-PLUS. Preliminary results are presented here.

  4. LOCATING LEAKS WITH ACOUSTIC TECHNOLOGY

    EPA Science Inventory

    Many water distribution systems in this country are almost 100 years old. About 26 percent of piping in these systems is made of unlined cast iron or steel and is in poor condition. Many methods that locate leaks in these pipes are time-consuming, costly, disruptive to operations...

  5. Structural dynamics verification facility study

    NASA Technical Reports Server (NTRS)

    Kiraly, L. J.; Hirchbein, M. S.; Mcaleese, J. M.; Fleming, D. P.

    1981-01-01

    The need for a structural dynamics verification facility to support structures programs was studied. Most of the industry operated facilities are used for highly focused research, component development, and problem solving, and are not used for the generic understanding of the coupled dynamic response of major engine subsystems. Capabilities for the proposed facility include: the ability to both excite and measure coupled structural dynamic response of elastic blades on elastic shafting, the mechanical simulation of various dynamical loadings representative of those seen in operating engines, and the measurement of engine dynamic deflections and interface forces caused by alternative engine mounting configurations and compliances.

  6. Optimal Imaging for Treaty Verification

    SciTech Connect

    Brubaker, Erik; Hilton, Nathan R.; Johnson, William; Marleau, Peter; Kupinski, Matthew; MacGahan, Christopher Jonathan

    2014-09-01

    Future arms control treaty verification regimes may use radiation imaging measurements to confirm and track nuclear warheads or other treaty accountable items (TAIs). This project leverages advanced inference methods developed for medical and adaptive imaging to improve task performance in arms control applications. Additionally, we seek a method to acquire and analyze imaging data of declared TAIs without creating an image of those objects or otherwise storing or revealing any classified information. Such a method would avoid the use of classified-information barriers (IB).

  7. Why do verification and validation?

    DOE PAGES

    Hu, Kenneth T.; Paez, Thomas L.

    2016-02-19

    In this discussion paper, we explore different ways to assess the value of verification and validation (V&V) of engineering models. We first present a literature review on the value of V&V and then use value chains and decision trees to show how value can be assessed from a decision maker's perspective. In this context, the value is what the decision maker is willing to pay for V&V analysis with the understanding that the V&V results are uncertain. The 2014 Sandia V&V Challenge Workshop is then used to illustrate these ideas.

  8. Enrichment Assay Methods Development for the Integrated Cylinder Verification System

    SciTech Connect

    Smith, Leon E.; Misner, Alex C.; Hatchell, Brian K.; Curtis, Michael M.

    2009-10-22

    International Atomic Energy Agency (IAEA) inspectors currently perform periodic inspections at uranium enrichment plants to verify UF6 cylinder enrichment declarations. Measurements are typically performed with handheld high-resolution sensors on a sampling of cylinders taken to be representative of the facility's entire product-cylinder inventory. Pacific Northwest National Laboratory (PNNL) is developing a concept to automate the verification of enrichment plant cylinders to enable 100 percent product-cylinder verification and, potentially, mass-balance calculations on the facility as a whole (by also measuring feed and tails cylinders). The Integrated Cylinder Verification System (ICVS) could be located at key measurement points to positively identify each cylinder, measure its mass and enrichment, store the collected data in a secure database, and maintain continuity of knowledge on measured cylinders until IAEA inspector arrival. The three main objectives of this FY09 project are summarized here and described in more detail in the report: (1) Develop a preliminary design for a prototype NDA system, (2) Refine PNNL's MCNP models of the NDA system, and (3) Procure and test key pulse-processing components. Progress against these tasks to date, and next steps, are discussed.

  9. Hailstorms over Switzerland: Verification of Crowd-sourced Data

    NASA Astrophysics Data System (ADS)

    Noti, Pascal-Andreas; Martynov, Andrey; Hering, Alessandro; Martius, Olivia

    2016-04-01

    The reports of smartphone users witnessing hailstorms can be used as a source of independent, ground-based observation data on ground-reaching hailstorms with high temporal and spatial resolution. The presented work focuses on the verification of crowd-sourced data collected over Switzerland with the help of a smartphone application recently developed by MeteoSwiss. The precise location and time of hail precipitation and the hailstone size are included in the crowd-sourced data, which are assessed on the basis of the weather radar data of MeteoSwiss. Two radar-based hail detection algorithms in use at MeteoSwiss, POH (Probability of Hail) and MESHS (Maximum Expected Severe Hail Size), are compared with the crowd-sourced data. The available data and the investigation period span June to August 2015. Filter criteria have been applied to remove false reports from the crowd-sourced data. Neighborhood methods have been introduced to reduce the uncertainties that result from spatial and temporal biases. The crowd-sourced and radar data are converted into binary sequences according to previously set thresholds, allowing a categorical verification. Verification scores (e.g. hit rate) are then calculated from a 2x2 contingency table. The hail reporting activity and the patterns corresponding to "hail" and "no hail" reports sent from smartphones have been analyzed. The relationships between the reported hailstone sizes and both radar-based hail detection algorithms have also been investigated.
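
    As a worked illustration of this categorical verification, the sketch below cross-tabulates two binary sequences into a 2x2 contingency table and computes the hit rate and false alarm ratio; the sequences are invented, not MeteoSwiss data.

    def contingency(radar, reports):
        """Cross-tabulate radar detections against crowd-sourced reports."""
        hits = sum(r and o for r, o in zip(radar, reports))                    # a
        false_alarms = sum(r and not o for r, o in zip(radar, reports))        # b
        misses = sum((not r) and o for r, o in zip(radar, reports))            # c
        correct_negs = sum((not r) and not o for r, o in zip(radar, reports))  # d
        return hits, false_alarms, misses, correct_negs

    radar   = [1, 1, 0, 1, 0, 0, 1, 0]   # radar algorithm: hail yes/no per cell
    reports = [1, 0, 0, 1, 1, 0, 1, 0]   # crowd-sourced:   hail yes/no per cell

    a, b, c, d = contingency(radar, reports)
    hit_rate = a / (a + c)               # fraction of observed hail that was detected
    false_alarm_ratio = b / (a + b)      # fraction of detections with no report
    print(f"hit rate = {hit_rate:.2f}, FAR = {false_alarm_ratio:.2f}")  # 0.75, 0.25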

  10. Locative Inversion in Cantonese.

    ERIC Educational Resources Information Center

    Mok, Sui-Sang

    This study investigates the phenomenon of "Locative Inversion" in Cantonese. The term "Locative Inversion" indicates a syntactic process in Cantonese whereby the locative phrase (LP) appears at the sentence-initial position and its logical subject occurs postverbally. It is demonstrated that this Locative Inversion is a…

  11. Location, Location, Location: Development of Spatiotemporal Sequence Learning in Infancy

    ERIC Educational Resources Information Center

    Kirkham, Natasha Z.; Slemmer, Jonathan A.; Richardson, Daniel C.; Johnson, Scott P.

    2007-01-01

    We investigated infants' sensitivity to spatiotemporal structure. In Experiment 1, circles appeared in a statistically defined spatial pattern. At test, 11-month-olds, but not 8-month-olds, looked longer at a novel spatial sequence. Experiment 2 presented different color/shape stimuli, but only the location sequence was violated during test;…

  12. CFE verification: The decision to inspect

    SciTech Connect

    Allentuck, J.

    1990-01-01

    Verification of compliance with the provisions of the treaty on Conventional Forces-Europe (CFE) is subject to inspection quotas of various kinds. Thus the decision to carry out a specific inspection or verification activity must be prudently made. This decision process is outlined, and means for conserving "quotas" are suggested. 4 refs., 1 fig.

  13. Identity Verification, Control, and Aggression in Marriage

    ERIC Educational Resources Information Center

    Stets, Jan E.; Burke, Peter J.

    2005-01-01

    In this research we study the identity verification process and its effects in marriage. Drawing on identity control theory, we hypothesize that a lack of verification in the spouse identity (1) threatens stable self-meanings and interaction patterns between spouses, and (2) challenges a (nonverified) spouse's perception of control over the…

  14. 14 CFR 460.17 - Verification program.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... TRANSPORTATION LICENSING HUMAN SPACE FLIGHT REQUIREMENTS Launch and Reentry with Crew § 460.17 Verification... software in an operational flight environment before allowing any space flight participant on board during... 14 Aeronautics and Space 4 2014-01-01 2014-01-01 false Verification program. 460.17 Section...

  15. 14 CFR 460.17 - Verification program.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... TRANSPORTATION LICENSING HUMAN SPACE FLIGHT REQUIREMENTS Launch and Reentry with Crew § 460.17 Verification... software in an operational flight environment before allowing any space flight participant on board during... 14 Aeronautics and Space 4 2013-01-01 2013-01-01 false Verification program. 460.17 Section...

  16. 14 CFR 460.17 - Verification program.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... TRANSPORTATION LICENSING HUMAN SPACE FLIGHT REQUIREMENTS Launch and Reentry with Crew § 460.17 Verification... software in an operational flight environment before allowing any space flight participant on board during... 14 Aeronautics and Space 4 2012-01-01 2012-01-01 false Verification program. 460.17 Section...

  17. 24 CFR 4001.112 - Income verification.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... 24 Housing and Urban Development 5 2010-04-01 2010-04-01 false Income verification. 4001.112... Requirements and Underwriting Procedures § 4001.112 Income verification. The mortgagee shall use FHA's procedures to verify the mortgagor's income and shall comply with the following additional requirements:...

  18. 24 CFR 4001.112 - Income verification.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... 24 Housing and Urban Development 5 2011-04-01 2011-04-01 false Income verification. 4001.112... Requirements and Underwriting Procedures § 4001.112 Income verification. The mortgagee shall use FHA's procedures to verify the mortgagor's income and shall comply with the following additional requirements:...

  19. IMPROVING AIR QUALITY THROUGH ENVIRONMENTAL TECHNOLOGY VERIFICATIONS

    EPA Science Inventory

    The U.S. Environmental Protection Agency (EPA) began the Environmental Technology Verification (ETV) Program in 1995 as a means of working with the private sector to establish a market-based verification process available to all environmental technologies. Under EPA's Office of R...

  20. The monitoring and verification of nuclear weapons

    SciTech Connect

    Garwin, Richard L.

    2014-05-09

    This paper partially reviews and updates the potential for monitoring and verification of nuclear weapons, including verification of their destruction. Cooperative monitoring with templates of the gamma-ray spectrum is an important tool, dependent on the use of information barriers.

  1. 18 CFR 158.5 - Verification.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... 18 Conservation of Power and Water Resources 1 2011-04-01 2011-04-01 false Verification. 158.5 Section 158.5 Conservation of Power and Water Resources FEDERAL ENERGY REGULATORY COMMISSION, DEPARTMENT....5 Verification. The facts stated in the memorandum must be sworn to by persons having...

  2. 18 CFR 158.5 - Verification.

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... 18 Conservation of Power and Water Resources 1 2012-04-01 2012-04-01 false Verification. 158.5 Section 158.5 Conservation of Power and Water Resources FEDERAL ENERGY REGULATORY COMMISSION, DEPARTMENT....5 Verification. The facts stated in the memorandum must be sworn to by persons having...

  3. 18 CFR 286.107 - Verification.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... 18 Conservation of Power and Water Resources 1 2011-04-01 2011-04-01 false Verification. 286.107 Section 286.107 Conservation of Power and Water Resources FEDERAL ENERGY REGULATORY COMMISSION, DEPARTMENT... Contested Audit Findings and Proposed Remedies § 286.107 Verification. The facts stated in the...

  4. 18 CFR 41.5 - Verification.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... 18 Conservation of Power and Water Resources 1 2011-04-01 2011-04-01 false Verification. 41.5 Section 41.5 Conservation of Power and Water Resources FEDERAL ENERGY REGULATORY COMMISSION, DEPARTMENT OF....5 Verification. The facts stated in the memorandum must be sworn to by persons having...

  5. 18 CFR 349.5 - Verification.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... 18 Conservation of Power and Water Resources 1 2011-04-01 2011-04-01 false Verification. 349.5 Section 349.5 Conservation of Power and Water Resources FEDERAL ENERGY REGULATORY COMMISSION, DEPARTMENT... PROPOSED REMEDIES § 349.5 Verification. The facts stated in the memorandum must be sworn to by...

  6. 18 CFR 286.107 - Verification.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... 18 Conservation of Power and Water Resources 1 2010-04-01 2010-04-01 false Verification. 286.107 Section 286.107 Conservation of Power and Water Resources FEDERAL ENERGY REGULATORY COMMISSION, DEPARTMENT... Contested Audit Findings and Proposed Remedies § 286.107 Verification. The facts stated in the...

  7. 18 CFR 41.5 - Verification.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... 18 Conservation of Power and Water Resources 1 2010-04-01 2010-04-01 false Verification. 41.5 Section 41.5 Conservation of Power and Water Resources FEDERAL ENERGY REGULATORY COMMISSION, DEPARTMENT OF....5 Verification. The facts stated in the memorandum must be sworn to by persons having...

  8. 18 CFR 41.5 - Verification.

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... 18 Conservation of Power and Water Resources 1 2012-04-01 2012-04-01 false Verification. 41.5 Section 41.5 Conservation of Power and Water Resources FEDERAL ENERGY REGULATORY COMMISSION, DEPARTMENT OF....5 Verification. The facts stated in the memorandum must be sworn to by persons having...

  9. 18 CFR 349.5 - Verification.

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... 18 Conservation of Power and Water Resources 1 2012-04-01 2012-04-01 false Verification. 349.5 Section 349.5 Conservation of Power and Water Resources FEDERAL ENERGY REGULATORY COMMISSION, DEPARTMENT... PROPOSED REMEDIES § 349.5 Verification. The facts stated in the memorandum must be sworn to by...

  10. 18 CFR 158.5 - Verification.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... 18 Conservation of Power and Water Resources 1 2010-04-01 2010-04-01 false Verification. 158.5 Section 158.5 Conservation of Power and Water Resources FEDERAL ENERGY REGULATORY COMMISSION, DEPARTMENT....5 Verification. The facts stated in the memorandum must be sworn to by persons having...

  11. 18 CFR 349.5 - Verification.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... 18 Conservation of Power and Water Resources 1 2010-04-01 2010-04-01 false Verification. 349.5 Section 349.5 Conservation of Power and Water Resources FEDERAL ENERGY REGULATORY COMMISSION, DEPARTMENT... PROPOSED REMEDIES § 349.5 Verification. The facts stated in the memorandum must be sworn to by...

  12. 47 CFR 2.902 - Verification.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... 47 Telecommunication 1 2013-10-01 2013-10-01 false Verification. 2.902 Section 2.902 Telecommunication FEDERAL COMMUNICATIONS COMMISSION GENERAL FREQUENCY ALLOCATIONS AND RADIO TREATY MATTERS; GENERAL RULES AND REGULATIONS Equipment Authorization Procedures General Provisions § 2.902 Verification....

  13. 47 CFR 2.902 - Verification.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... 47 Telecommunication 1 2012-10-01 2012-10-01 false Verification. 2.902 Section 2.902 Telecommunication FEDERAL COMMUNICATIONS COMMISSION GENERAL FREQUENCY ALLOCATIONS AND RADIO TREATY MATTERS; GENERAL RULES AND REGULATIONS Equipment Authorization Procedures General Provisions § 2.902 Verification....

  14. 47 CFR 2.902 - Verification.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... 47 Telecommunication 1 2014-10-01 2014-10-01 false Verification. 2.902 Section 2.902 Telecommunication FEDERAL COMMUNICATIONS COMMISSION GENERAL FREQUENCY ALLOCATIONS AND RADIO TREATY MATTERS; GENERAL RULES AND REGULATIONS Equipment Authorization Procedures General Provisions § 2.902 Verification....

  15. 47 CFR 2.902 - Verification.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 47 Telecommunication 1 2010-10-01 2010-10-01 false Verification. 2.902 Section 2.902 Telecommunication FEDERAL COMMUNICATIONS COMMISSION GENERAL FREQUENCY ALLOCATIONS AND RADIO TREATY MATTERS; GENERAL RULES AND REGULATIONS Equipment Authorization Procedures General Provisions § 2.902 Verification....

  16. 47 CFR 2.902 - Verification.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... 47 Telecommunication 1 2011-10-01 2011-10-01 false Verification. 2.902 Section 2.902 Telecommunication FEDERAL COMMUNICATIONS COMMISSION GENERAL FREQUENCY ALLOCATIONS AND RADIO TREATY MATTERS; GENERAL RULES AND REGULATIONS Equipment Authorization Procedures General Provisions § 2.902 Verification....

  17. ENVIRONMENTAL TECHNOLOGY VERIFICATION FOR INDOOR AIR PRODUCTS

    EPA Science Inventory

    The paper discusses environmental technology verification (ETV) for indoor air products. RTI is developing the framework for a verification testing program for indoor air products, as part of EPA's ETV program. RTI is establishing test protocols for products that fit into three...

  18. Guidelines for qualifying cleaning and verification materials

    NASA Technical Reports Server (NTRS)

    Webb, D.

    1995-01-01

    This document is intended to provide guidance in identifying technical issues which must be addressed in a comprehensive qualification plan for materials used in cleaning and cleanliness verification processes. Information presented herein is intended to facilitate development of a definitive checklist that should address all pertinent materials issues when down selecting a cleaning/verification media.

  19. Gender Verification of Female Olympic Athletes.

    ERIC Educational Resources Information Center

    Dickinson, Barry D.; Genel, Myron; Robinowitz, Carolyn B.; Turner, Patricia L.; Woods, Gary L.

    2002-01-01

    Gender verification of female athletes has long been criticized by geneticists, endocrinologists, and others in the medical community. Recently, the International Olympic Committee's Athletic Commission called for discontinuation of mandatory laboratory-based gender verification of female athletes. This article discusses normal sexual…

  20. 14 CFR 460.17 - Verification program.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... a flight. Verification must include flight testing. ... 14 Aeronautics and Space 4 2010-01-01 2010-01-01 false Verification program. 460.17 Section 460.17... TRANSPORTATION LICENSING HUMAN SPACE FLIGHT REQUIREMENTS Launch and Reentry with Crew § 460.17...

  1. ENVIRONMENTAL TECHNOLOGY VERIFICATION AND INDOOR AIR

    EPA Science Inventory

    The paper discusses environmental technology verification and indoor air. RTI has responsibility for a pilot program for indoor air products as part of the U.S. EPA's Environmental Technology Verification (ETV) program. The program objective is to further the development of sel...

  2. Cognitive Bias in Systems Verification

    NASA Technical Reports Server (NTRS)

    Larson, Steve

    2012-01-01

    Working definition of cognitive bias: Patterns by which information is sought and interpreted that can lead to systematic errors in decisions. Cognitive bias is used in diverse fields: Economics, Politics, Intelligence, Marketing, to name a few. Attempts to ground cognitive science in physical characteristics of the cognitive apparatus exceed our knowledge. Studies are based on correlations; strict cause and effect is difficult to pinpoint. Effects cited in the paper and discussed here have been replicated many times over, and appear sound. Many biases have been described, but it is still unclear whether they are all distinct. There may only be a handful of fundamental biases, which manifest in various ways. Bias can affect system verification in many ways. Overconfidence -> Questionable decisions to deploy. Availability -> Inability to conceive critical tests. Representativeness -> Overinterpretation of results. Positive Test Strategies -> Confirmation bias. Debiasing at the individual level is very difficult. The potential effect of bias on the verification process can be managed, but not eliminated. Worth considering at key points in the process.

  3. Video-based fingerprint verification.

    PubMed

    Qin, Wei; Yin, Yilong; Liu, Lili

    2013-09-04

    Conventional fingerprint verification systems use only static information. In this paper, fingerprint videos, which contain dynamic information, are utilized for verification. Fingerprint videos are acquired by the same capture device that acquires conventional fingerprint images, and the user experience of providing a fingerprint video is the same as that of providing a single impression. After preprocessing and aligning processes, "inside similarity" and "outside similarity" are defined and calculated to take advantage of both dynamic and static information contained in fingerprint videos. Match scores between two matching fingerprint videos are then calculated by combining the two kinds of similarity. Experimental results show that the proposed video-based method leads to a relative reduction of 60 percent in the equal error rate (EER) in comparison to the conventional single impression-based method. We also analyze the time complexity of our method when different combinations of strategies are used. Our method still outperforms the conventional method, even if both methods have the same time complexity. Finally, experimental results demonstrate that the proposed video-based method can lead to better accuracy than the multiple impressions fusion method, and the proposed method has a much lower false acceptance rate (FAR) when the false rejection rate (FRR) is quite low.
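
    The record does not give the paper's exact rule for combining the two similarities, so the sketch below uses a simple weighted sum as an illustrative stand-in; the weight and threshold are assumptions.

    def match_score(inside_sim, outside_sim, w_inside=0.5):
        """Fuse dynamic ('inside') and static ('outside') similarities in [0, 1]."""
        return w_inside * inside_sim + (1.0 - w_inside) * outside_sim

    # Accept the claimed identity when the fused score clears a threshold
    # tuned on a validation set (0.7 here is arbitrary).
    accepted = match_score(0.82, 0.74) >= 0.7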

  4. Quantitative reactive modeling and verification.

    PubMed

    Henzinger, Thomas A

    Formal verification aims to improve the quality of software by detecting errors before they do harm. At the basis of formal verification is the logical notion of correctness, which purports to capture whether or not a program behaves as desired. We suggest that the boolean partition of software into correct and incorrect programs falls short of the practical need to assess the behavior of software in a more nuanced fashion against multiple criteria. We therefore propose to introduce quantitative fitness measures for programs, specifically for measuring the function, performance, and robustness of reactive programs such as concurrent processes. This article describes the goals of the ERC Advanced Investigator Project QUAREM. The project aims to build and evaluate a theory of quantitative fitness measures for reactive models. Such a theory must strive to obtain quantitative generalizations of the paradigms that have been success stories in qualitative reactive modeling, such as compositionality, property-preserving abstraction and abstraction refinement, model checking, and synthesis. The theory will be evaluated not only in the context of software and hardware engineering, but also in the context of systems biology. In particular, we will use the quantitative reactive models and fitness measures developed in this project for testing hypotheses about the mechanisms behind data from biological experiments.

  5. Criteria for monitoring a chemical arms treaty: Implications for the verification regime. Report No. 13

    SciTech Connect

    Mullen, M.F.; Apt, K.E.; Stanbro, W.D.

    1991-12-01

    The multinational Chemical Weapons Convention (CWC) being negotiated at the Conference on Disarmament in Geneva is viewed by many as an effective way to rid the world of the threat of chemical weapons. Parties could, however, legitimately engage in certain CW-related activities in industry, agriculture, research, medicine, and law enforcement. Treaty verification requirements related to declared activities include: confirming destruction of declared CW stockpiles and production facilities; monitoring legitimate, treaty-allowed activities, such as production of certain industrial chemicals; and, detecting proscribed activities within the declared locations of treaty signatories, e.g., the illegal production of CW agents at a declared industrial facility or the diversion or substitution of declared CW stockpile items. Verification requirements related to undeclared activities or locations include investigating possible clandestine CW stocks and production capability not originally declared by signatories; detecting clandestine, proscribed activities at facilities or sites that are not declared and hence not subject to routine inspection; and, investigating allegations of belligerent use of CW. We discuss here a possible set of criteria for assessing the effectiveness of CWC verification (and certain aspects of the bilateral CW reduction agreement). Although the criteria are applicable to the full range of verification requirements, the discussion emphasizes verification of declared activities and sites.

  7. Test/QA Plan for Verification of Semi-Continuous Ambient Air Monitoring Systems - Second Round

    EPA Science Inventory

    Test/QA Plan for Verification of Semi-Continuous Ambient Air Monitoring Systems - Second Round. Changes reflect performance of second round of testing at new location and with various changes to personnel. Additional changes reflect general improvements to the Version 1 test/QA...

  8. Environmental Technology Verification Report for Applikon MARGA Semi-Continuous Ambient Air Monitoring System

    EPA Science Inventory

    The verification test was conducted over a period of 30 days (October 1 to October 31, 2008) and involved the continuous operation of duplicate semi-continuous monitoring technologies at the Burdens Creek Air Monitoring Site, an existing ambient-air monitoring station located near...

  9. 47 CFR 64.1120 - Verification of orders for telecommunications service.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... not be owned, managed, controlled, or directed by the carrier or the carrier's marketing agent; must... carrier's marketing agent; and must operate in a location physically separate from the carrier or the carrier's marketing agent. (i) Methods of third party verification. Automated third party...

  10. Cleanup Verification Package for the 118-F-1 Burial Ground

    SciTech Connect

    E. J. Farris and H. M. Sulloway

    2008-01-10

    This cleanup verification package documents completion of remedial action for the 118-F-1 Burial Ground on the Hanford Site. This burial ground is a combination of two locations formerly called Minor Construction Burial Ground No. 2 and Solid Waste Burial Ground No. 2. This waste site received radioactive equipment and other miscellaneous waste from 105-F Reactor operations, including dummy elements and irradiated process tubing; gun barrel tips, steel sleeves, and metal chips removed from the reactor; filter boxes containing reactor graphite chips; and miscellaneous construction solid waste.

  11. Stennis Space Center Verification & Validation Capabilities

    NASA Technical Reports Server (NTRS)

    Pagnutti, Mary; Ryan, Robert E.; Holekamp, Kara; O'Neal, Duane; Knowlton, Kelly; Ross, Kenton; Blonski, Slawomir

    2007-01-01

    Scientists within NASA's Applied Research & Technology Project Office (formerly the Applied Sciences Directorate) have developed a well-characterized remote sensing Verification & Validation (V&V) site at the John C. Stennis Space Center (SSC). This site enables the in-flight characterization of satellite and airborne high spatial resolution remote sensing systems and their products. The smaller scale of the newer high resolution remote sensing systems allows scientists to characterize geometric, spatial, and radiometric data properties using a single V&V site. The targets and techniques used to characterize data from these newer systems can differ significantly from the techniques used to characterize data from the earlier, coarser spatial resolution systems. Scientists have used the SSC V&V site to characterize thermal infrared systems. Enhancements are being considered to characterize active lidar systems. SSC employs geodetic targets, edge targets, radiometric tarps, atmospheric monitoring equipment, and thermal calibration ponds to characterize remote sensing data products. Similar techniques are used to characterize moderate spatial resolution sensing systems at selected nearby locations. The SSC Instrument Validation Lab is a key component of the V&V capability and is used to calibrate field instrumentation and to provide National Institute of Standards and Technology traceability. This poster presents a description of the SSC characterization capabilities and examples of calibration data.

  12. CTBT Integrated Verification System Evaluation Model

    SciTech Connect

    Edenburn, M.W.; Bunting, M.L.; Payne, A.C. Jr.

    1997-10-01

    Sandia National Laboratories has developed a computer based model called IVSEM (Integrated Verification System Evaluation Model) to estimate the performance of a nuclear detonation monitoring system. The IVSEM project was initiated in June 1994, by Sandia's Monitoring Systems and Technology Center and has been funded by the US Department of Energy's Office of Nonproliferation and National Security (DOE/NN). IVSEM is a simple, top-level, modeling tool which estimates the performance of a Comprehensive Nuclear Test Ban Treaty (CTBT) monitoring system and can help explore the impact of various sensor system concepts and technology advancements on CTBT monitoring. One of IVSEM's unique features is that it integrates results from the various CTBT sensor technologies (seismic, infrasound, radionuclide, and hydroacoustic) and allows the user to investigate synergy among the technologies. Specifically, IVSEM estimates the detection effectiveness (probability of detection) and location accuracy of the integrated system and of each technology subsystem individually. The model attempts to accurately estimate the monitoring system's performance at medium interfaces (air-land, air-water) and for some evasive testing methods such as seismic decoupling. This report describes version 1.2 of IVSEM.
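
    One piece of such an integration can be illustrated with a short calculation: under an independence assumption (which IVSEM itself may or may not make), the integrated probability of detection is the complement of every subsystem missing. The subsystem probabilities below are invented for illustration.

    # Hypothetical per-subsystem detection probabilities for one event scenario.
    subsystem_pd = {"seismic": 0.80, "infrasound": 0.35,
                    "radionuclide": 0.50, "hydroacoustic": 0.10}

    # Assuming independence, the event is missed only if every subsystem misses it.
    p_miss_all = 1.0
    for p in subsystem_pd.values():
        p_miss_all *= (1.0 - p)

    integrated_pd = 1.0 - p_miss_all
    print(f"integrated P(detect) = {integrated_pd:.3f}")  # -> 0.941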

  13. Space transportation system payload interface verification

    NASA Technical Reports Server (NTRS)

    Everline, R. T.

    1977-01-01

    The paper considers STS payload-interface verification requirements and the capability provided by STS to support verification. The intent is to standardize as many interfaces as possible, not only through the design, development, test and evaluation (DDT and E) phase of the major payload carriers but also into the operational phase. The verification process is discussed in terms of its various elements, such as the Space Shuttle DDT and E (including the orbital flight test program) and the major payload carriers DDT and E (including the first flights). Five tools derived from the Space Shuttle DDT and E are available to support the verification process: mathematical (structural and thermal) models, the Shuttle Avionics Integration Laboratory, the Shuttle Manipulator Development Facility, and interface-verification equipment (cargo-integration test equipment).

  14. Hierarchical Design and Verification for VLSI

    NASA Technical Reports Server (NTRS)

    Shostak, R. E.; Elliott, W. D.; Levitt, K. N.

    1983-01-01

    The specification and verification work is described in detail, and some of the problems and issues to be resolved in their application to Very Large Scale Integration (VLSI) systems are examined. The hierarchical design methodologies enable a system architect or design team to decompose a complex design into a formal hierarchy of levels of abstraction. The first step in program verification is tree formation. The next step after tree formation is the generation of the verification conditions themselves from the trees. The approach taken here is similar in spirit to the corresponding step in program verification but requires modeling of the semantics of circuit elements rather than program statements. The last step is that of proving the verification conditions using a mechanical theorem-prover.

  15. Indoor location estimation using radio beacons

    NASA Astrophysics Data System (ADS)

    Ahmad, Uzair; Lee, Young-Koo; Lee, Sungyoug; Park, Chongkug

    2007-12-01

    We present a simple location estimation method for developing radio beacon based location systems in indoor environments. It employs an online learning approach for building large-scale location systems collaboratively in a short time. The salient features of our method are low memory requirements and simple computations, which make it suitable both for distributed location-aware applications based on the client-server model and for privacy-sensitive applications residing on stand-alone devices.
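
    The record does not spell out the estimation step, so the sketch below shows one common lightweight approach consistent with the stated low-memory, simple-computation goal: an RSSI-weighted centroid over known beacon positions. The coordinates, signal strengths, and weighting rule are illustrative assumptions, not the authors' algorithm.

    def weighted_centroid(beacons):
        """beacons: list of (x, y, rssi_dbm); a stronger signal gets a larger weight."""
        weights = [10 ** (rssi / 20.0) for _, _, rssi in beacons]  # dBm -> linear scale
        total = sum(weights)
        x = sum(w * b[0] for w, b in zip(weights, beacons)) / total
        y = sum(w * b[1] for w, b in zip(weights, beacons)) / total
        return x, y

    # Three beacons at known positions; the estimate is pulled toward the strongest.
    print(weighted_centroid([(0.0, 0.0, -40.0), (10.0, 0.0, -60.0), (0.0, 10.0, -70.0)]))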

  16. Integrating Fingerprint Verification into the Smart Card-Based Healthcare Information System

    NASA Astrophysics Data System (ADS)

    Moon, Daesung; Chung, Yongwha; Pan, Sung Bum; Park, Jin-Won

    2009-12-01

    As VLSI technology has been improved, a smart card employing 32-bit processors has been released, and more personal information such as medical, financial data can be stored in the card. Thus, it becomes important to protect personal information stored in the card. Verification of the card holder's identity using a fingerprint has advantages over the present practices of Personal Identification Numbers (PINs) and passwords. However, the computational workload of fingerprint verification is much heavier than that of the typical PIN-based solution. In this paper, we consider three strategies to implement fingerprint verification in a smart card environment and how to distribute the modules of fingerprint verification between the smart card and the card reader. We first evaluate the number of instructions of each step of a typical fingerprint verification algorithm, and estimate the execution time of several cryptographic algorithms to guarantee the security/privacy of the fingerprint data transmitted in the smart card with the client-server environment. Based on the evaluation results, we analyze each scenario with respect to the security level and the real-time execution requirements in order to implement fingerprint verification in the smart card with the client-server environment.

  17. Sleeping at work: not all about location, location, location.

    PubMed

    Jay, Sarah M; Aisbett, Brad; Sprajcer, Madeline; Ferguson, Sally A

    2015-02-01

    Working arrangements in industries that use non-standard hours sometimes necessitate an 'onsite' workforce where workers sleep in accommodation within or adjacent to the workplace. Of particular relevance to these workers is the widely held (and largely anecdotal) assumption that sleep at home is better than sleep away, particularly when away for work. This narrative review explores the idea that sleep outcomes in these unique work situations are the product of an interaction between numerous factors including timing and duration of breaks, commute length, sleeping environment (noise, movement, vibration, light), circadian phase, demographic factors and familiarity with the sleep location. Based on the data presented in this review, it is our contention that the location of sleep, whilst important, is secondary to other factors such as the timing and duration of sleep periods. We suggest that future research should include measures that allow conceptualisation of other critical factors such as familiarity with the sleeping environment.

  18. First Images from VLT Science Verification Programme

    NASA Astrophysics Data System (ADS)

    1998-09-01

    morning of September 1 when the telescope was returned to the Commissioning Team that has since continued its work. The FORS instrument is now being installed and the first images from this facility are expected shortly. Observational circumstances: During the two-week SV period, a total of 154 hours were available for astronomical observations. Of these, 95 hours (62%) were used to collect scientific data, including calibrations, e.g. flat-fielding and photometric standard star observations. 15 hours (10%) were spent to solve minor technical problems, while another 44 hours (29%) were lost due to adverse meteorological conditions (clouds or wind exceeding 15 m/sec). The amount of telescope technical downtime is very small at this moment of the UT1 commissioning. This fact provides an impressive indication of high technical reliability that has been achieved and which will be further consolidated during the next months. The meteorological conditions that were encountered at Paranal during this period were unfortunately below average, when compared to data from the same calendar period in earlier years. There was an excess of bad seeing and fewer good seeing periods than normal; see, however, ESO PR Photo 35c/98 with 0.26 arcsec image quality. Nevertheless, the measured image quality on the acquired frames was often better than the seeing measured outside the enclosure by the Paranal seeing monitor. Part of this very positive effect is due to "active field stabilization", now performed during all observations by rapid motion (10 - 70 times per second) of the 1.1-m secondary mirror of beryllium (M2), compensating for the "twinkling" of stars. Science Verification data soon to be released: A great amount of valuable data was collected during the SV programme. The available programme time was distributed as follows: Hubble Deep Field - South [HDF-S; NICMOS and STIS Fields] (37.1 hrs); Lensed QSOs (3.2 hrs); High-z Clusters (6.2 hrs); Host Galaxies of Gamma-Ray Bursters (2

  19. Leaf sequencing and dosimetric verification in intensity-modulated radiotherapy

    NASA Astrophysics Data System (ADS)

    Agazaryan, Nzhde

    Although sophisticated means to calculate and deliver intensity modulated radiotherapy (IMRT) have been developed by many groups, methods to verify the delivery, as well as definitions of acceptability of a treatment in terms of these measurements, are the most problematic at this stage of advancement of IMRT. Present intensity modulated radiotherapy systems fail to account for many dosimetric characteristics of the delivery system. In this dissertation, a dosimetrically based leaf sequencing algorithm is developed and implemented for multileaf collimated intensity modulated radiotherapy. The dosimetric considerations are investigated and are shown to significantly improve the outcome in terms of agreement between desired and delivered radiation dose distributions. Subsequently, a system for determining the desirability of a produced intensity modulated radiotherapy plan in terms of the deliverability of calculated profiles with the use of a multileaf collimator is developed. Three deliverability scoring indices are defined to evaluate the deliverability of the profiles. The Gradient Index (GI) is a measure of the complexity of the profile in terms of gradients. The Baseline Index (BI) is the fraction of the profile that is planned to get lower than the minimum level of transmission radiation. The Cumulative Monitor Unit Index (CMUI) is the ratio of the cumulative monitor units (CMU) required for obtaining the desired profile to an average dose level in the profile. The dosimetric investigations of the deliverability scoring indices are presented, showing a clear correlation between scoring indices and dosimetric accuracy. Finally, materials and methods are developed for verification of intensity modulated radiotherapy. Dosimetric verification starts from investigations of the developed leaf sequencing algorithm, then extends to dosimetric verification in terms of deliverability, and lastly, dosimetric verification of complete clinical IMRT plans is performed.
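
    The three indices can be sketched directly from their verbal definitions; the exact formulas are in the dissertation, so the GI below (mean absolute inter-bixel gradient) in particular is a plausible stand-in, and the profile and monitor-unit values are invented.

    def deliverability_indices(profile, transmission_min, cumulative_mu):
        """Compute illustrative GI, BI, and CMUI for one intensity profile."""
        n = len(profile)
        # Gradient Index: average absolute difference between neighbouring bixels.
        gi = sum(abs(profile[i + 1] - profile[i]) for i in range(n - 1)) / (n - 1)
        # Baseline Index: fraction of the profile below the minimum transmission level.
        bi = sum(1 for v in profile if v < transmission_min) / n
        # Cumulative Monitor Unit Index: CMU per unit of average planned intensity.
        cmui = cumulative_mu / (sum(profile) / n)
        return gi, bi, cmui

    print(deliverability_indices([0.1, 0.8, 0.9, 0.2, 0.05],
                                 transmission_min=0.08, cumulative_mu=120.0))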

  20. Technology for bolus verification in proton therapy

    NASA Astrophysics Data System (ADS)

    Shipulin, K. N.; Mytsin, G. V.; Agapov, A. V.

    2015-01-01

    To ensure the conformal depth-dose distribution of a proton beam within a target volume, complex shaped range shifters (so-called boluses), which account for the heterogeneous structure of patient tissue and organs in the beam path, were calculated and manufactured. The precise manufacturing of proton compensators used for patient treatment is a vital step in quality assurance in proton therapy. In this work a software-hardware complex that verifies the quality and precision of bolus manufacturing at the Medico-Technical Complex (MTC) was developed. The complex consisted of a positioning system with two photoelectric sensors. We evaluated 20 boluses used in the proton therapy of five patients. A total of 2562 experimental points were measured, of which only two had values that differed from the calculated value by more than 0.5 mm. The other data points displayed a deviation within ±0.5 mm of the calculated value. The technology for bolus verification developed in this work can be used for the high precision testing of the geometrical parameters of proton compensators in radiotherapy.
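
    The acceptance logic implied by these numbers is simple enough to state as code: compare each measured point with its calculated value against the ±0.5 mm tolerance. The thickness values below are invented for illustration.

    def check_bolus(measured_mm, calculated_mm, tol_mm=0.5):
        """Return indices of points whose deviation exceeds the tolerance."""
        return [i for i, (m, c) in enumerate(zip(measured_mm, calculated_mm))
                if abs(m - c) > tol_mm]

    out_of_tol = check_bolus([12.1, 30.4, 45.9], [12.0, 30.0, 46.6])
    print(out_of_tol)  # -> [2]: deviation of 0.7 mm exceeds the 0.5 mm tolerance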

  1. In vivo proton range verification: a review

    NASA Astrophysics Data System (ADS)

    Knopf, Antje-Christin; Lomax, Antony

    2013-08-01

    Protons are an interesting modality for radiotherapy because of their well defined range and favourable depth dose characteristics. On the other hand, these same characteristics lead to added uncertainties in their delivery. This is particularly the case at the distal end of proton dose distributions, where the dose gradient can be extremely steep. In practice, however, this gradient is rarely used to spare critical normal tissues, due to worries about its exact position in the patient. Reasons for this uncertainty are inaccuracies and non-uniqueness of the calibration from CT Hounsfield units to proton stopping powers, imaging artefacts (e.g. due to metal implants) and anatomical changes of the patient during treatment. In order to improve the precision of proton therapy, therefore, it would be extremely desirable to verify proton range in vivo, either prior to, during, or after therapy. In this review, we describe and compare the state-of-the-art in vivo proton range verification methods currently being proposed, developed or clinically implemented.

  2. KAT-7 Science Verification Highlights

    NASA Astrophysics Data System (ADS)

    Lucero, Danielle M.; Carignan, Claude; KAT-7 Science Data; Processing Team, KAT-7 Science Commissioning Team

    2015-01-01

    KAT-7 is a pathfinder of the Square Kilometer Array precursor MeerKAT, which is under construction. Its short baselines and low system temperature make it sensitive to large scale, low surface brightness emission. This makes it an ideal instrument to use in searches for faint extended radio emission and low surface density extraplanar gas. We present an update on the progress of several such ongoing KAT-7 science verification projects. These include a large scale radio continuum and polarization survey of the Galactic Center, deep HI observations (100+ hours) of nearby disk galaxies (e.g. NGC253 and NGC3109), and targeted searches for HI tidal tails in galaxy groups (e.g. IC1459). A brief status update for MeerKAT will also be presented if time permits.

  3. MFTF sensor verification computer program

    SciTech Connect

    Chow, H.K.

    1984-11-09

    The design, requirements document and implementation of the MFE Sensor Verification System were accomplished by the Measurement Engineering Section (MES), a group which provides instrumentation for the MFTF magnet diagnostics. The sensors, installed on and around the magnets and solenoids housed in a vacuum chamber, will supply information about the temperature, strain, pressure, liquid helium level and magnet voltage to the facility operator for evaluation. As the sensors are installed, records must be maintained as to their initial resistance values. Also, as the work progresses, monthly checks will be made to ensure continued sensor health. Finally, after the MFTF-B demonstration, yearly checks will be performed, as well as checks of sensors as problems develop. The software to acquire and store the data was written by Harry Chow, Computations Department. The acquired data will be transferred to the MFE data base computer system.

  4. Verification of NASA Emergent Systems

    NASA Technical Reports Server (NTRS)

    Rouff, Christopher; Vanderbilt, Amy K. C. S.; Truszkowski, Walt; Rash, James; Hinchey, Mike

    2004-01-01

    NASA is studying advanced technologies for a future robotic exploration mission to the asteroid belt. This mission, the prospective ANTS (Autonomous Nano Technology Swarm) mission, will comprise 1,000 autonomous robotic agents designed to cooperate in asteroid exploration. The emergent properties of swarm-type missions make them powerful, but at the same time more difficult to design and to assure that the proper behaviors will emerge. We are currently investigating formal methods and techniques for verification and validation of future swarm-based missions. The advantage of using formal methods is their ability to mathematically assure the behavior of a swarm, emergent or otherwise. The ANTS mission is being used as an example and case study of swarm-based missions with which to experiment and test current formal methods for intelligent swarms. Using the ANTS mission, we have evaluated multiple formal methods to determine their effectiveness in modeling and assuring swarm behavior.

  5. Retail applications of signature verification

    NASA Astrophysics Data System (ADS)

    Zimmerman, Thomas G.; Russell, Gregory F.; Heilper, Andre; Smith, Barton A.; Hu, Jianying; Markman, Dmitry; Graham, Jon E.; Drews, Clemens

    2004-08-01

    The dramatic rise in identity theft, the ever-pressing need to provide convenience in checkout services to attract and retain loyal customers, and the growing use of multi-function signature capture devices in the retail sector provide favorable conditions for the deployment of dynamic signature verification (DSV) in retail settings. We report on the development of a DSV system to meet the needs of the retail sector. We currently have a database of approximately 10,000 signatures collected from 600 subjects and forgers. Previous work at IBM on DSV has been merged and extended to achieve robust performance on pen position data available from commercial point-of-sale hardware, achieving equal error rates on skilled forgeries and authentic signatures of 1.5% to 4%.

  6. Verification of FANTASTIC integrated code

    NASA Technical Reports Server (NTRS)

    Chauhan, Rajinder Singh

    1987-01-01

    FANTASTIC is an acronym for Failure Analysis Nonlinear Thermal and Structural Integrated Code. This program was developed by Failure Analysis Associates, Palo Alto, Calif., for MSFC to improve the accuracy of solid rocket motor nozzle analysis. FANTASTIC has three modules: FACT - thermochemical analysis; FAHT - heat transfer analysis; and FAST - structural analysis. All modules have keywords for data input. Work is in progress for the verification of the FAHT module, which is done by using data for various problems with known solutions as inputs to the FAHT module. The information obtained is used to identify problem areas of the code and passed on to the developer for debugging purposes. Failure Analysis Associates have revised the first version of the FANTASTIC code and a new improved version has been released to the Thermal Systems Branch.

  7. Location, Location, Location: Where Do Location-Based Services Fit into Your Institution's Social Media Mix?

    ERIC Educational Resources Information Center

    Nekritz, Tim

    2011-01-01

    Foursquare is a location-based social networking service that allows users to share their location with friends. Some college administrators have been thinking about whether and how to take the leap into location-based services, which are also known as geosocial networking services. These platforms, which often incorporate gaming elements like…

  8. Locatives in Kpelle.

    ERIC Educational Resources Information Center

    Kuha, Mai

    This paper examines the differences between locative expressions in Kpelle and English, based on the dialect of one native speaker of Kpelle. It discusses the crucial role of the reference object in defining the meaning of locatives in Kpelle, in contrast to English, where the characteristics of the object to be located are less important. An…

  9. Enhanced Cancelable Biometrics for Online Signature Verification

    NASA Astrophysics Data System (ADS)

    Muramatsu, Daigo; Inuma, Manabu; Shikata, Junji; Otsuka, Akira

    Cancelable approaches for biometric person authentication have been studied to protect enrolled biometric data, and several algorithms have been proposed. One drawback of cancelable approaches is that their performance is inferior to that of non-cancelable approaches. In this paper, we propose a scheme to improve the performance of a cancelable approach for online signature verification. Our scheme generates two cancelable datasets from one raw dataset and uses both for verification. Preliminary experiments were performed using a distance-based online signature verification algorithm. The experimental results show that our proposed scheme is promising.
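
    A minimal sketch of this two-template idea, assuming a random-projection transform (the paper's actual transform may differ): each user stores two revocable projections of the raw features, and verification fuses the distances against both. All parameters and values are illustrative.

    import random

    def projection(seed, dim_in, dim_out):
        """User-specific, revocable random projection matrix (re-issue by changing the seed)."""
        rng = random.Random(seed)
        return [[rng.gauss(0, 1) for _ in range(dim_in)] for _ in range(dim_out)]

    def transform(features, matrix):
        return [sum(w * f for w, f in zip(row, features)) for row in matrix]

    def dist(u, v):
        return sum((a - b) ** 2 for a, b in zip(u, v)) ** 0.5

    def verify(probe, enrolled_a, enrolled_b, mat_a, mat_b, threshold):
        """Accept when the average distance over both cancelable templates is small."""
        da = dist(transform(probe, mat_a), enrolled_a)
        db = dist(transform(probe, mat_b), enrolled_b)
        return (da + db) / 2.0 < threshold

    # Enrollment: derive two cancelable templates from one raw feature vector.
    mat_a = projection(seed=11, dim_in=4, dim_out=3)
    mat_b = projection(seed=42, dim_in=4, dim_out=3)
    raw = [0.2, 1.4, -0.7, 0.9]
    enrolled_a, enrolled_b = transform(raw, mat_a), transform(raw, mat_b)
    print(verify(raw, enrolled_a, enrolled_b, mat_a, mat_b, threshold=0.1))  # -> True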

  10. Formal specification and verification of Ada software

    NASA Technical Reports Server (NTRS)

    Hird, Geoffrey R.

    1991-01-01

    The use of formal methods in software development achieves levels of quality assurance unobtainable by other means. The Larch approach to specification is described, and the specification of avionics software designed to implement the logic of a flight control system is given as an example. Penelope, an Ada-verification environment, is described. The Penelope user inputs mathematical definitions, Larch-style specifications and Ada code and performs machine-assisted proofs that the code obeys its specifications. As an example, the verification of a binary search function is considered. Emphasis is given to techniques assisting the reuse of a verification effort on modified code.

  11. Liquefied Natural Gas (LNG) dispenser verification device

    NASA Astrophysics Data System (ADS)

    Xiong, Maotao; Yang, Jie-bin; Zhao, Pu-jun; Yu, Bo; Deng, Wan-quan

    2013-01-01

    The working principle and calibration status of LNG (Liquefied Natural Gas) dispensers in China are introduced. Because of the shortcomings of the weighing method for calibrating LNG dispensers, an LNG dispenser verification device has been developed. The device is based on the master meter method and verifies LNG dispensers in the field. Experimental results indicate that the device has steady performance, a high accuracy level and a flexible construction, reaching an internationally advanced level. The LNG dispenser verification device will thus promote the development of the LNG dispenser industry in China and improve the technical level of LNG dispenser manufacturing.
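
    The master meter comparison at the heart of such a device reduces to a relative-error check, sketched below; the quantities and the 1.0 % tolerance are illustrative assumptions, not values from the paper.

    def dispenser_error(indicated_kg, master_kg):
        """Relative error of the dispenser reading against the master meter, in percent."""
        return (indicated_kg - master_kg) / master_kg * 100.0

    # Example verification point: dispenser shows 50.30 kg, master meter shows 50.00 kg.
    error = dispenser_error(indicated_kg=50.30, master_kg=50.00)
    print(f"error = {error:+.2f} %, pass = {abs(error) <= 1.0}")  # +0.60 %, pass = True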

  12. Design for Verification: Enabling Verification of High Dependability Software-Intensive Systems

    NASA Technical Reports Server (NTRS)

    Mehlitz, Peter C.; Penix, John; Markosian, Lawrence Z.; Koga, Dennis (Technical Monitor)

    2003-01-01

    Strategies to achieve confidence that high-dependability applications are correctly implemented include testing and automated verification. Testing deals mainly with a limited number of expected execution paths. Verification usually attempts to deal with a larger number of possible execution paths. While the impact of architecture design on testing is well known, its impact on most verification methods is not as well understood. The Design for Verification approach considers verification from the application development perspective, in which system architecture is designed explicitly according to the application's key properties. The D4V-hypothesis is that the same general architecture and design principles that lead to good modularity, extensibility and complexity/functionality ratio can be adapted to overcome some of the constraints on verification tools, such as the production of hand-crafted models and the limits on dynamic and static analysis caused by state space explosion.

  13. Inclusion type radiochromic gel dosimeter for three-dimensional dose verification

    NASA Astrophysics Data System (ADS)

    Usui, Shuji; Yoshioka, Munenori; Hayashi, Shin-ichiro; Tominaga, Takahiro

    2015-01-01

    For the verification of 3D dose distributions in modern radiation therapy, a new inclusion type radiochromic gel detector has been developed. In this gel, a hydrophobic leuco dye (leucomalachite green: LMG) was dissolved in water as an inclusion complex with highly branched cyclic dextrin. The radiation induced radical oxidation property of the LMG gel with various sensitizers was investigated. As a result, the optical dose responses were enhanced by the addition of bromoacetic acid and manganese (II) chloride. Unfavorable auto-oxidation of the gel was reduced when it was stored at 4°C.

  14. Potential Crash Location (PCL) Model

    DTIC Science & Technology

    2014-02-05

    Approved for public release; distribution is unlimited (AD number ADB383242). Subject terms include the PCL model and the Sensis model. The report describes an approach to defining the outer limits and also discusses two different approaches to modeling rotorcraft UAS crash locations. (Report series: NAWCADPAX/TR.)

  15. An integrated user-oriented laboratory for verification of digital flight control systems: Features and capabilities

    NASA Technical Reports Server (NTRS)

    Defeo, P.; Doane, D.; Saito, J.

    1982-01-01

    A Digital Flight Control Systems Verification Laboratory (DFCSVL) has been established at NASA Ames Research Center. This report describes the major elements of the laboratory, the research activities that can be supported in the area of verification and validation of digital flight control systems (DFCS), and the operating scenarios within which these activities can be carried out. The DFCSVL consists of a palletized dual-dual flight-control system linked to a dedicated PDP-11/60 processor. Major software support programs are hosted in a remotely located UNIVAC 1100 accessible from the PDP-11/60 through a modem link. Important features of the DFCSVL include extensive hardware and software fault insertion capabilities, a real-time closed loop environment to exercise the DFCS, an integrated set of software verification tools, and a user-oriented interface to all the resources and capabilities.

  16. Space Station automated systems testing/verification and the Galileo Orbiter fault protection design/verification

    NASA Technical Reports Server (NTRS)

    Landano, M. R.; Easter, R. W.

    1984-01-01

    Aspects of Space Station automated systems testing and verification are discussed, taking into account several program requirements. It is found that these requirements lead to a number of issues of uncertainties which require study and resolution during the Space Station definition phase. Most, if not all, of the considered uncertainties have implications for the overall testing and verification strategy adopted by the Space Station Program. A description is given of the Galileo Orbiter fault protection design/verification approach. Attention is given to a mission description, an Orbiter description, the design approach and process, the fault protection design verification approach/process, and problems of 'stress' testing.

  17. ENVIRONMENTAL TECHNOLOGY VERIFICATION (ETV) PROGRAM: STORMWATER TECHNOLOGIES

    EPA Science Inventory

    The U.S. Environmental Protection Agency (EPA) Environmental Technology Verification (ETV) program evaluates the performance of innovative air, water, pollution prevention and monitoring technologies that have the potential to improve human health and the environment. This techn...

  18. ENVIRONMENTAL TECHNOLOGY VERIFICATION (ETV) PROGRAM: FUEL CELLS

    EPA Science Inventory

    The U.S. Environmental Protection Agency (EPA) Environmental Technology Verification (ETV) Program evaluates the performance of innovative air, water, pollution prevention and monitoring technologies that have the potential to improve human health and the environment. This techno...

  19. 78 FR 58492 - Generator Verification Reliability Standards

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-09-24

    ... Energy Regulatory Commission 18 CFR Part 40 Generator Verification Reliability Standards AGENCY: Federal... approve the following Reliability Standards that were submitted to the Commission for approval by the North American Electric Reliability Corporation, the Commission-certified Electric...

  20. Environmental Technology Verification Program (ETV) Policy Compendium

    EPA Science Inventory

    The Policy Compendium summarizes operational decisions made to date by participants in the U.S. Environmental Protection Agency's (EPA's) Environmental Technology Verification Program (ETV) to encourage consistency among the ETV centers. The policies contained herein evolved fro...

  1. HDM/PASCAL Verification System User's Manual

    NASA Technical Reports Server (NTRS)

    Hare, D.

    1983-01-01

    The HDM/Pascal verification system is a tool for proving the correctness of programs written in PASCAL and specified in the Hierarchical Development Methodology (HDM). This document assumes an understanding of PASCAL, HDM, program verification, and the STP system. The steps toward verification which this tool provides are parsing programs and specifications, checking the static semantics, and generating verification conditions. Some support functions are provided, such as maintaining a database, status management, and editing. The system runs under the TOPS-20 and TENEX operating systems and is written in INTERLISP; however, no knowledge of these operating systems or of INTERLISP is assumed. The system requires three executable files: HDMVCG, PARSE, and STP. The editor EMACS should optionally be present on the system for the editing functions to work. The file HDMVCG is invoked to run the system. The files PARSE and STP are used as lower forks to perform the functions of parsing and proving.

  2. Engineering drawing field verification program. Revision 3

    SciTech Connect

    Ulk, P.F.

    1994-10-12

    Safe, efficient operation of waste tank farm facilities is dependent in part upon the availability of accurate, up-to-date plant drawings. Accurate plant drawings are also required in support of facility upgrades and future engineering remediation projects. This supporting document establishes the procedure for performing a visual field verification of engineering drawings, the degree of visual observation being performed, and the documentation of the results. A copy of the drawing attesting to the degree of visual observation will be paginated into the released Engineering Change Notice (ECN) documenting the field verification, for future retrieval and reference. All waste tank farm essential and support drawings within the scope of this program will be converted from manual to computer-aided drafting (CAD) drawings. A permanent reference to the field verification status will be placed along the right border of the CAD-converted drawing, referencing the revision level at which the visual verification was performed and documented.

  3. The PASCAL-HDM Verification System

    NASA Technical Reports Server (NTRS)

    1983-01-01

    The PASCAL-HDM verification system is described. This system supports the mechanical generation of verification conditions from PASCAL programs and HDM-SPECIAL specifications using the Floyd-Hoare axiomatic method. Tools are provided to parse programs and specifications, check their static semantics, generate verification conditions from Hoare rules, and translate the verification conditions appropriately for proof using the Shostak Theorem Prover. The differences between standard PASCAL and the language handled by this system are explained. These consist mostly of restrictions to the standard language definition; the only extensions or modifications are the addition of specifications to the code and the requirement that references to a function of no arguments carry empty parentheses.
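
    For context, a standard worked instance of the Floyd-Hoare machinery (a textbook example, not drawn from the report): the assignment axiom and the verification condition it generates.

        % Assignment axiom of Floyd-Hoare logic:
        \[
          \{\, P[x \mapsto E] \,\}\;\; x := E \;\;\{\, P \,\}
        \]
        % Worked instance: to establish the postcondition x > 0 after the
        % statement x := x + 1, given the entry assertion x >= 0, the
        % generated verification condition is
        \[
          x \ge 0 \;\Longrightarrow\; (x + 1) > 0 ,
        \]
        % which a theorem prover (in this system, Shostak's) discharges.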

  4. MAMA Software Features: Quantification Verification Documentation-1

    SciTech Connect

    Ruggiero, Christy E.; Porter, Reid B.

    2014-05-21

    This document reviews the verification of the basic shape quantification attributes in the MAMA software against hand calculations in order to show that the calculations are implemented mathematically correctly and give the expected quantification results.

  5. Verification of the Calore thermal analysis code.

    SciTech Connect

    Dowding, Kevin J.; Blackwell, Bennie Francis

    2004-07-01

    Calore is the ASC code developed to model steady and transient thermal diffusion with chemistry and dynamic enclosure radiation. An integral part of the software development process is code verification, which addresses the question, 'Are we correctly solving the model equations?' This process aids the developers in that it identifies potential software bugs and gives the thermal analyst confidence that a properly prepared input will produce satisfactory output. Grid refinement studies have been performed on problems for which we have analytical solutions. In this talk, the code verification process is overviewed and recent results are presented. Recent verification studies have focused on transient nonlinear heat conduction and on verifying algorithms associated with (tied) contact and adaptive mesh refinement. In addition, an approach to measuring the coverage of the verification test suite relative to intended code applications is discussed.
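
    The measurement behind such grid refinement studies can be sketched as follows (an illustration of the generic technique, not Calore's actual test harness): estimate the observed order of accuracy from errors against an analytical solution on successively refined grids.

        import math

        # Observed order of accuracy p from discretization errors e_coarse
        # and e_fine on grids refined by a factor r:
        #   p = log(e_coarse / e_fine) / log(r)
        def observed_order(e_coarse, e_fine, r=2.0):
            return math.log(e_coarse / e_fine) / math.log(r)

        # Errors against an exact solution on grids h, h/2, h/4 (made-up values):
        errors = [4.0e-3, 1.01e-3, 2.6e-4]
        for e1, e2 in zip(errors, errors[1:]):
            print(f"observed order ~ {observed_order(e1, e2):.2f}")  # ~2 here

    If the observed order matches the scheme's formal order, the implementation is solving the model equations correctly on that problem.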

  6. 9 CFR 417.8 - Agency verification.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... ANALYSIS AND CRITICAL CONTROL POINT (HACCP) SYSTEMS § 417.8 Agency verification. FSIS will verify the... deviation occurs; (d) Reviewing the critical limits; (e) Reviewing other records pertaining to the...

  7. Mine locations: Kazakhstan

    SciTech Connect

    Perry, Bradley A

    2008-01-01

    Upon accepting this internship at Los Alamos National Laboratory, I was excited but a bit nervous, because I was placed into a field I knew nothing about and one that did not draw on my mechanical engineering background. However, I stayed positive and realized that experience and education can come in many forms and that this would be a once-in-a-lifetime opportunity. The EES-II Division (which stands for Earth and Environmental Sciences, Geophysics division) concentrates on several topics, including Nuclear Treaty Verification Seismology. This work is extremely important for monitoring countries that have nuclear capability and making sure they follow the rules of the international Comprehensive Nuclear-Test-Ban Treaty. Seismology is only one aspect of this monitoring, and EES-II works diligently with many other groups at Los Alamos and across the world.

  8. Verification of thermal analysis codes for modeling solid rocket nozzles

    NASA Technical Reports Server (NTRS)

    Keyhani, M.

    1993-01-01

    One of the objectives of the Solid Propulsion Integrity Program (SPIP) at Marshall Space Flight Center (MSFC) is development of thermal analysis codes capable of accurately predicting the temperature field, pore pressure field, and surface recession experienced by decomposing polymers which are used as thermal barriers in solid rocket nozzles. The objective of this study is to provide a means for verification of thermal analysis codes developed for modeling of flow and heat transfer in solid rocket nozzles. In order to meet the stated objective, a test facility was designed and constructed for measurement of the transient temperature field in a sample composite subjected to a constant heat flux boundary condition. The heating was provided via a steel thin-foil with a thickness of 0.025 mm. The designed electrical circuit can provide a heating rate of 1800 W. The heater was sandwiched between two identical samples, thus ensuring equal power distribution between them. The samples were fitted with Type K thermocouples, and the exact locations of the thermocouples were determined via X-rays. The experiments were modeled via a one-dimensional code (UT1D) as a conduction and phase change heat transfer process. Since the pyrolysis gas flow was in the direction normal to the heat flow, the numerical model could not account for the convection cooling effect of the pyrolysis gas flow. Therefore, the predicted values in the decomposition zone are considered to be an upper estimate of the temperature. From the analysis of the experimental and the numerical results the following are concluded: (1) The virgin and char specific heat data for FM 5055 as reported by SoRI cannot be used to obtain any reasonable agreement between the measured temperatures and the predictions. However, use of the virgin and char specific heat data given in the Acurex report produced good agreement for most of the measured temperatures. (2) A constant heat flux heating process can produce a much higher

  9. SU-E-J-138: On the Ion Beam Range and Dose Verification in Hadron Therapy Using Sound Waves

    SciTech Connect

    Fourkal, E; Veltchev, I; Gayou, O; Nahirnyak, V

    2015-06-15

    Purpose: Accurate range verification is of great importance to fully exploit the potential benefits of ion beam therapies. Current research efforts on this topic include the use of PET imaging of induced activity, detection of emerging prompt gamma rays or secondary particles. It has also been suggested recently to detect the ultrasound waves emitted through the ion energy absorption process. The energy absorbed in a medium is dissipated as heat, followed by thermal expansion that leads to generation of acoustic waves. By using an array of ultrasound transducers the precise spatial location of the Bragg peak can be obtained. The shape and intensity of the emitted ultrasound pulse depend on several variables including the absorbed energy and the pulse length. The main objective of this work is to understand how the ultrasound wave amplitude and shape depend on the initial ion energy and intensity. This would help guide future experiments in ionoacoustic imaging. Methods: The absorbed energy density for protons and carbon ions of different energy and field sizes were obtained using Fluka Monte Carlo code. Subsequently, the system of coupled equations for temperature and pressure is solved for different ion pulse intensities and lengths to obtain the pressure wave shape, amplitude and spectral distribution. Results: The proposed calculations show that the excited pressure wave amplitude is proportional to the absorbed energy density and for longer ion pulses inversely proportional to the ion pulse duration. It is also shown that the resulting ionoacoustic pressure distribution depends on both ion pulse duration and time between the pulses. Conclusion: The Bragg peak localization using ionoacoustic signal may eventually lead to the development of an alternative imaging method with sub-millimeter resolution. It may also open a way for in-vivo dose verification from the measured acoustic signal.
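
    For context, the coupled temperature-pressure system referred to above reduces, in the standard photoacoustics formulation (a textbook relation, not reproduced from the abstract), to a thermoacoustic wave equation driven by the heating rate:

        \[
          \frac{\partial^2 p}{\partial t^2} - c^2\,\nabla^2 p
            \;=\; \Gamma\,\frac{\partial H}{\partial t},
          \qquad
          \Gamma = \frac{\beta c^2}{C_p},
        \]

    where H(r,t) is the absorbed energy density per unit time, c the speed of sound, β the thermal expansion coefficient, C_p the specific heat, and Γ the Grüneisen parameter. This form is consistent with the reported scaling: the peak pressure grows with the absorbed energy density, while for ion pulses longer than the acoustic transit time the amplitude falls roughly as the inverse of the pulse duration.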

  10. The NPARC Alliance Verification and Validation Archive

    NASA Technical Reports Server (NTRS)

    Slater, John W.; Dudek, Julianne C.; Tatum, Kenneth E.

    2000-01-01

    The NPARC Alliance (National Project for Applications oriented Research in CFD) maintains a publicly-available, web-based verification and validation archive as part of the development and support of the WIND CFD code. The verification and validation methods used for the cases attempt to follow the policies and guidelines of the ASME and AIAA. The emphasis is on air-breathing propulsion flow fields with Mach numbers ranging from low-subsonic to hypersonic.

  11. Transmutation Fuel Performance Code Thermal Model Verification

    SciTech Connect

    Gregory K. Miller; Pavel G. Medvedev

    2007-09-01

    The FRAPCON fuel performance code is being modified to model the performance of the nuclear fuels of interest to the Global Nuclear Energy Partnership (GNEP). The present report documents the effort to verify the FRAPCON thermal model. It was found that, with minor modifications, the FRAPCON thermal model's temperature calculation agrees with that of the commercial software ABAQUS (Version 6.4-4). This report outlines the methodology of the verification, the code input, and the calculation results.

  12. Dynamic testing for shuttle design verification

    NASA Technical Reports Server (NTRS)

    Green, C. E.; Leadbetter, S. A.; Rheinfurth, M. H.

    1972-01-01

    Space shuttle design verification requires dynamic data from full scale structural component and assembly tests. Wind tunnel and other scaled model tests are also required early in the development program to support the analytical models used in design verification. Presented is a design philosophy based on mathematical modeling of the structural system strongly supported by a comprehensive test program; some of the types of required tests are outlined.

  13. A verification library for multibody simulation software

    NASA Technical Reports Server (NTRS)

    Kim, Sung-Soo; Haug, Edward J.; Frisch, Harold P.

    1989-01-01

    A multibody dynamics verification library that maintains and manages test and validation data is proposed, based on RRC robot arm and CASE backhoe validation and on a comparative study of DADS, DISCOS, and CONTOPS, which are existing public-domain and commercial multibody dynamic simulation programs. Using simple representative problems, simulation results from each program are cross-checked, and the validation results are presented. Functionalities of the verification library are defined in order to automate the validation procedure.

  14. Cleanliness verification process at Martin Marietta Astronautics

    NASA Astrophysics Data System (ADS)

    King, Elizabeth A.; Giordano, Thomas J.

    1994-06-01

    The Montreal Protocol and the 1990 Clean Air Act Amendments mandate that CFC-113, other chlorinated fluorocarbons (CFCs), and 1,1,1-trichloroethane (TCA) be banned from production after December 31, 1995. In response to increasing pressures, the Air Force has formulated policy that prohibits purchase of these solvents for Air Force use after April 1, 1994. In response to the Air Force policy, Martin Marietta Astronautics is in the process of eliminating all CFCs and TCA from use at the Engineering Propulsion Laboratory (EPL), located on Air Force property PJKS. Gross and precision cleaning operations are currently performed on spacecraft components at EPL. The final step of the operation is a rinse with a solvent, typically CFC-113. This solvent is then analyzed for nonvolatile residue (NVR), particle count, and total filterable solids (TFS) to determine the cleanliness of the parts. The CFC-113 used in this process must be replaced in response to the above policies. Martin Marietta Astronautics, under contract to the Air Force, is currently evaluating and testing alternatives for a cleanliness verification solvent. Completion of testing is scheduled for May 1994. Evaluation of the alternative solvents follows a three-step approach. The first step is initial testing of solvents identified through literature searches and analysis. The second step is detailed testing of the top candidates from the initial test phase. The final step is implementation and validation of the chosen alternative(s). Testing will include contaminant removal, nonvolatile residue, material compatibility, and propellant compatibility. Typical materials and contaminants will be tested with a wide range of solvents. Final results of the three steps will be presented, as well as the implementation plan for solvent replacement.

  15. Reversible micromachining locator

    DOEpatents

    Salzer, Leander J.; Foreman, Larry R.

    1999-01-01

    This invention provides a device which includes a locator, a kinematic mount positioned on a conventional tooling machine, a part carrier disposed on the locator and a retainer ring. The locator has disposed therein a plurality of steel balls, placed in an equidistant position circumferentially around the locator. The kinematic mount includes a plurality of magnets which are in registry with the steel balls on the locator. In operation, a blank part to be machined is placed between a surface of a locator and the retainer ring (fitting within the part carrier). When the locator (with a blank part to be machined) is coupled to the kinematic mount, the part is thus exposed for the desired machining process. Because the locator is removably attachable to the kinematic mount, it can easily be removed from the mount, reversed, and reinserted onto the mount for additional machining. Further, the locator can likewise be removed from the mount and placed onto another tooling machine having a properly aligned kinematic mount. Because of the unique design and use of magnetic forces of the present invention, positioning errors of less than 0.25 micrometer for each machining process can be achieved.

  16. Reversible micromachining locator

    DOEpatents

    Salzer, L.J.; Foreman, L.R.

    1999-08-31

    This invention provides a device which includes a locator, a kinematic mount positioned on a conventional tooling machine, a part carrier disposed on the locator and a retainer ring. The locator has disposed therein a plurality of steel balls, placed in an equidistant position circumferentially around the locator. The kinematic mount includes a plurality of magnets which are in registry with the steel balls on the locator. In operation, a blank part to be machined is placed between a surface of a locator and the retainer ring (fitting within the part carrier). When the locator (with a blank part to be machined) is coupled to the kinematic mount, the part is thus exposed for the desired machining process. Because the locator is removably attachable to the kinematic mount, it can easily be removed from the mount, reversed, and reinserted onto the mount for additional machining. Further, the locator can likewise be removed from the mount and placed onto another tooling machine having a properly aligned kinematic mount. Because of the unique design and use of magnetic forces of the present invention, positioning errors of less than 0.25 micrometer for each machining process can be achieved. 7 figs.

  17. National Verification System of National Meteorological Center , China

    NASA Astrophysics Data System (ADS)

    Zhang, Jinyan; Wei, Qing; Qi, Dan

    2016-04-01

    Product Quality Verification Division for official weather forecasting of China was founded in April 2011. It is affiliated with the Forecast System Laboratory (FSL), National Meteorological Center (NMC), China. There are three employees in this department; I am one of them and am in charge of the Product Quality Verification Division in NMC, China. After five years of construction, an integrated real-time National Verification System of NMC, China has been established. At present, its primary roles are: 1) to verify the official weather forecasting quality of NMC, China; 2) to verify the official city weather forecasting quality of the Provincial Meteorological Bureaus; and 3) to evaluate the forecasting quality of each forecaster in NMC, China. To verify the official weather forecasting quality of NMC, China, we have developed:
    • Grid QPF verification module (including upscaling)
    • Grid temperature, humidity and wind forecast verification module
    • Severe convective weather forecast verification module
    • Typhoon forecast verification module
    • Disaster forecast verification module
    • Disaster warning verification module
    • Medium and extended period forecast verification module
    • Objective elements forecast verification module
    • Ensemble precipitation probabilistic forecast verification module
    To verify the official city weather forecasting quality of the Provincial Meteorological Bureaus, we have developed:
    • City elements forecast verification module
    • Public heavy rain forecast verification module
    • City air quality forecast verification module
    To evaluate the forecasting quality of each forecaster in NMC, China, we have developed:
    • Off-duty forecaster QPF practice evaluation module
    • QPF evaluation module for forecasters
    • Severe convective weather forecast evaluation module
    • Typhoon track forecast evaluation module for forecasters
    • Disaster warning evaluation module for forecasters
    • Medium and extended period forecast evaluation module
    The further

  18. INNOVATIVE TECHNOLOGY VERIFICATION REPORT XRF ...

    EPA Pesticide Factsheets

    The Niton XLt 700 Series (XLt) XRF Services x-ray fluorescence (XRF) analyzer was demonstrated under the U.S. Environmental Protection Agency (EPA) Superfund Innovative Technology Evaluation (SITE) Program. The field portion of the demonstration was conducted in January 2005 at the Kennedy Athletic, Recreational and Social Park (KARS) at Kennedy Space Center on Merritt Island, Florida. The demonstration was designed to collect reliable performance and cost data for the XLt analyzer and seven other commercially available XRF instruments for measuring trace elements in soil and sediment. The performance and cost data were evaluated to document the relative performance of each XRF instrument. This innovative technology verification report describes the objectives and the results of that evaluation and serves to verify the performance and cost of the XLt analyzer. Separate reports have been prepared for the other XRF instruments that were evaluated as part of the demonstration. The objectives of the evaluation included determining each XRF instrument’s accuracy, precision, sample throughput, and tendency for matrix effects. To fulfill these objectives, the field demonstration incorporated the analysis of 326 prepared samples of soil and sediment that contained 13 target elements. The prepared samples included blends of environmental samples from nine different sample collection sites as well as spiked samples with certified element concentrations. Accuracy

  19. INNOVATIVE TECHNOLOGY VERIFICATION REPORT XRF ...

    EPA Pesticide Factsheets

    The Oxford ED2000 x-ray fluorescence (XRF) analyzer was demonstrated under the U.S. Environmental Protection Agency (EPA) Superfund Innovative Technology Evaluation (SITE) Program. The field portion of the demonstration was conducted in January 2005 at the Kennedy Athletic, Recreational and Social Park (KARS) at Kennedy Space Center on Merritt Island, Florida. The demonstration was designed to collect reliable performance and cost data for the ED2000 analyzer and seven other commercially available XRF instruments for measuring trace elements in soil and sediment. The performance and cost data were evaluated to document the relative performance of each XRF instrument. This innovative technology verification report describes the objectives and the results of that evaluation and serves to verify the performance and cost of the ED2000 analyzer. Separate reports have been prepared for the other XRF instruments that were evaluated as part of the demonstration. The objectives of the evaluation included determining each XRF instrument’s accuracy, precision, sample throughput, and tendency for matrix effects. To fulfill these objectives, the field demonstration incorporated the analysis of 326 prepared samples of soil and sediment that contained 13 target elements. The prepared samples included blends of environmental samples from nine different sample collection sites as well as spiked samples with certified element concentrations. Accuracy was assessed by com

  20. INNOVATIVE TECHNOLOGY VERIFICATION REPORT XRF ...

    EPA Pesticide Factsheets

    The Rigaku ZSX Mini II (ZSX Mini II) XRF Services x-ray fluorescence (XRF) analyzer was demonstrated under the U.S. Environmental Protection Agency (EPA) Superfund Innovative Technology Evaluation (SITE) Program. The field portion of the demonstration was conducted in January 2005 at the Kennedy Athletic, Recreational and Social Park (KARS) at Kennedy Space Center on Merritt Island, Florida. The demonstration was designed to collect reliable performance and cost data for the ZSX Mini II analyzer and seven other commercially available XRF instruments for measuring trace elements in soil and sediment. The performance and cost data were evaluated to document the relative performance of each XRF instrument. This innovative technology verification report describes the objectives and the results of that evaluation and serves to verify the performance and cost of the ZSX Mini II analyzer. Separate reports have been prepared for the other XRF instruments that were evaluated as part of the demonstration. The objectives of the evaluation included determining each XRF instrument’s accuracy, precision, sample throughput, and tendency for matrix effects. To fulfill these objectives, the field demonstration incorporated the analysis of 326 prepared samples of soil and sediment that contained 13 target elements. The prepared samples included blends of environmental samples from nine different sample collection sites as well as spiked samples with certified element con

  1. INNOVATIVE TECHNOLOGY VERIFICATION REPORT XRF ...

    EPA Pesticide Factsheets

    The Innov-X XT400 Series (XT400) x-ray fluorescence (XRF) analyzer was demonstrated under the U.S. Environmental Protection Agency (EPA) Superfund Innovative Technology Evaluation (SITE) Program. The field portion of the demonstration was conducted in January 2005 at the Kennedy Athletic, Recreational and Social Park (KARS) at Kennedy Space Center on Merritt Island, Florida. The demonstration was designed to collect reliable performance and cost data for the XT400 analyzer and seven other commercially available XRF instruments for measuring trace elements in soil and sediment. The performance and cost data were evaluated to document the relative performance of each XRF instrument. This innovative technology verification report describes the objectives and the results of that evaluation and serves to verify the performance and cost of the XT400 analyzer. Separate reports have been prepared for the other XRF instruments that were evaluated as part of the demonstration. The objectives of the evaluation included determining each XRF instrument’s accuracy, precision, sample throughput, and tendency for matrix effects. To fulfill these objectives, the field demonstration incorporated the analysis of 326 prepared samples of soil and sediment that contained 13 target elements. The prepared samples included blends of environmental samples from nine different sample collection sites as well as spiked samples with certified element concentrations. Accuracy was as

  2. INNOVATIVE TECHNOLOGY VERIFICATION REPORT XRF ...

    EPA Pesticide Factsheets

    The Rontec PicoTAX x-ray fluorescence (XRF) analyzer was demonstrated under the U.S. Environmental Protection Agency (EPA) Superfund Innovative Technology Evaluation (SITE) Program. The field portion of the demonstration was conducted in January 2005 at the Kennedy Athletic, Recreational and Social Park (KARS) at Kennedy Space Center on Merritt Island, Florida. The demonstration was designed to collect reliable performance and cost data for the PicoTAX analyzer and seven other commercially available XRF instruments for measuring trace elements in soil and sediment. The performance and cost data were evaluated to document the relative performance of each XRF instrument. This innovative technology verification report describes the objectives and the results of that evaluation and serves to verify the performance and cost of the PicoTAX analyzer. Separate reports have been prepared for the other XRF instruments that were evaluated as part of the demonstration. The objectives of the evaluation included determining each XRF instrument’s accuracy, precision, sample throughput, and tendency for matrix effects. To fulfill these objectives, the field demonstration incorporated the analysis of 326 prepared samples of soil and sediment that contained 13 target elements. The prepared samples included blends of environmental samples from nine different sample collection sites as well as spiked samples with certified element concentrations. Accuracy was assessed by c

  3. Learning Assumptions for Compositional Verification

    NASA Technical Reports Server (NTRS)

    Cobleigh, Jamieson M.; Giannakopoulou, Dimitra; Pasareanu, Corina; Clancy, Daniel (Technical Monitor)

    2002-01-01

    Compositional verification is a promising approach to addressing the state explosion problem associated with model checking. One compositional technique advocates proving properties of a system by checking properties of its components in an assume-guarantee style. However, the application of this technique is difficult because it involves non-trivial human input. This paper presents a novel framework for performing assume-guarantee reasoning in an incremental and fully automated fashion. To check a component against a property, our approach generates assumptions that the environment needs to satisfy for the property to hold. These assumptions are then discharged on the rest of the system. Assumptions are computed by a learning algorithm. They are initially approximate, but become gradually more precise by means of counterexamples obtained by model checking the component and its environment, alternately. This iterative process may at any stage conclude that the property is either true or false in the system. We have implemented our approach in the LTSA tool and applied it to the analysis of a NASA system.
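
    The assume-guarantee rule that this framework automates can be stated compactly (the standard formulation; the automatic learning of the assumption A is the paper's contribution):

        \[
          \frac{\langle A \rangle\, M_1\, \langle P \rangle
                \qquad
                \langle \mathit{true} \rangle\, M_2\, \langle A \rangle}
               {\langle \mathit{true} \rangle\, M_1 \parallel M_2\, \langle P \rangle}
        \]

    Here ⟨A⟩ M ⟨P⟩ asserts that whenever component M operates in an environment satisfying assumption A, property P holds. The learning algorithm refines A from model-checking counterexamples until both premises hold (the property is true of the system) or a counterexample is confirmed to be real (the property is false).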

  4. ALMA Band 5 Science Verification

    NASA Astrophysics Data System (ADS)

    Humphreys, L.; Biggs, A.; Immer, K.; Laing, R.; Liu, H. B.; Marconi, G.; Mroczkowski, T.; Testi, L.; Yagoubov, P.

    2017-03-01

    ALMA Band 5 (163–211 GHz) was recently commissioned and Science Verification (SV) observations were obtained in the latter half of 2016. A primary scientific focus of this band is the H2O line at 183.3 GHz, which can be observed around 15% of the time when the precipitable water vapour is sufficiently low (< 0.5 mm). Many more lines are covered in Band 5 and can be observed for over 70% of the time on Chajnantor, requiring similar restrictions to those for ALMA Bands 4 and 6. Examples include the H2¹⁸O line at 203 GHz, some of the bright (3–2) lines of singly and doubly deuterated forms of formaldehyde, the (2–1) lines of HCO+, HCN, HNC, N2H+ and several of their isotopologues. A young star-forming region near the centre of the Milky Way, an evolved star also in our Galaxy, and a nearby ultraluminous infrared galaxy (ULIRG) were observed as part of the SV process and the data are briefly described. The reduced data, along with imaged data products, are now public and demonstrate the power of ALMA for high-resolution studies of H2O and other molecules in a variety of astronomical targets.

  5. 46 CFR 193.60-10 - Location.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 46 Shipping 7 2010-10-01 2010-10-01 false Location. 193.60-10 Section 193.60-10 Shipping COAST GUARD, DEPARTMENT OF HOMELAND SECURITY (CONTINUED) OCEANOGRAPHIC RESEARCH VESSELS FIRE PROTECTION EQUIPMENT Fire Axes § 193.60-10 Location. (a) Fire axes shall be distributed throughout the spaces...

  6. Model based correction of placement error in EBL and its verification

    NASA Astrophysics Data System (ADS)

    Babin, Sergey; Borisov, Sergey; Militsin, Vladimir; Komagata, Tadashi; Wakatsuki, Tetsuro

    2016-05-01

    In maskmaking, the main source of error contributing to placement error is charging. DISPLACE software corrects the placement error for any layout based on a physical model. The charge of a photomask and multiple discharge mechanisms are simulated to find the charge distribution over the mask. The beam deflection is calculated for each location on the mask, creating data for the placement correction. The software considers the mask layout, EBL system setup, resist, and writing order, as well as other factors such as fogging and proximity effects correction. The output of the software is the data for placement correction. One important step is the calibration of the physical model; a test layout on a single calibration mask was used for this purpose. The extracted model parameters were used to verify the correction. As an ultimate test of the correction, verification used a sophisticated layout very different from the calibration mask. The placement correction results were predicted by DISPLACE, and good correlation between the measured and predicted values of the correction confirmed the high accuracy of the charging placement error correction.
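
    The correction principle can be caricatured with a toy electrostatic model; everything below (the lumped coupling constant, the 1/r² falloff, the patch geometry) is an invented illustration, not DISPLACE's calibrated physical model.

        import numpy as np

        # Toy illustration of charging-induced placement error: previously
        # written (charged) patches deflect the beam at each new write
        # location, and the correction is the negated predicted deflection.
        K = 5.0e3  # assumed lumped coupling constant, for illustration only

        def deflection(write_xy, patches):
            """Predicted in-plane beam deflection (nm) at write_xy (um)."""
            d = np.zeros(2)
            for (px, py, q) in patches:                 # (position um, charge)
                r = np.array(write_xy) - np.array([px, py])
                dist2 = r @ r
                d += K * q * r / (dist2 ** 1.5 + 1e-9)  # ~1/r^2 falloff
            return d

        patches = [(0.0, 0.0, 1.0), (50.0, 10.0, 0.5)]  # already-written areas
        target = (30.0, 20.0)
        correction = -deflection(target, patches)       # pre-distort placement
        print("apply placement correction (nm):", correction)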

  7. Reversible micromachining locator

    SciTech Connect

    Salzer, Leander J.; Foreman, Larry R.

    2002-01-01

    A locator with a part support is used to hold a part onto the kinematic mount of a tooling machine so that the part can be held in or replaced in exactly the same position relative to the cutting tool for machining different surfaces of the part or for performing different machining operations on the same or different surfaces of the part. The locator has disposed therein a plurality of steel balls placed at equidistant positions around the planar surface of the locator and the kinematic mount has a plurality of magnets which alternate with grooves which accommodate the portions of the steel balls projecting from the locator. The part support holds the part to be machined securely in place in the locator. The locator can be easily detached from the kinematic mount, turned over, and replaced onto the same kinematic mount or another kinematic mount on another tooling machine without removing the part to be machined from the locator so that there is no need to touch or reposition the part within the locator, thereby assuring exact replication of the position of the part in relation to the cutting tool on the tooling machine for each machining operation on the part.

  8. Automatic vehicle location system

    NASA Technical Reports Server (NTRS)

    Hansen, G. R., Jr. (Inventor)

    1973-01-01

    An automatic vehicle detection system is disclosed, in which each vehicle whose location is to be detected carries active means which interact with passive elements at each location to be identified. The passive elements comprise a plurality of passive loops arranged in a sequence along the travel direction. Each of the loops is tuned to a chosen frequency, so that the sequence of frequencies defines the location code. As the vehicle passes over each loop in the sequence, only signals at the frequency of the loop being passed over are coupled from a vehicle transmitter to a vehicle receiver. The frequencies of the received signals in the receiver produce outputs which together represent the code of the traversed location. The location code may also be defined by a painted pattern which reflects light to a vehicle-carried detector, whose output is used to derive the code defined by the pattern.
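
    A toy decoder for the scheme just described; the candidate frequencies and the base-4 digit mapping are invented for the example, not taken from the patent.

        # Toy illustration of the location coding: each passive loop resonates
        # at one of a small set of frequencies, and the ordered sequence of
        # frequencies detected as the vehicle passes over the loops spells out
        # a location code. Frequencies and digit values are assumptions.
        FREQ_TO_DIGIT = {100_000: 0, 110_000: 1, 120_000: 2, 130_000: 3}  # Hz

        def decode_location(detected_freqs):
            """Map the ordered sequence of loop frequencies to a location code."""
            code = 0
            for f in detected_freqs:
                code = code * len(FREQ_TO_DIGIT) + FREQ_TO_DIGIT[f]  # base 4
            return code

        # Vehicle receiver heard loops tuned to 110, 130, 100 kHz, in order:
        print(decode_location([110_000, 130_000, 100_000]))  # -> 28 (1*16+3*4+0)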

  9. RFI and SCRIMP Model Development and Verification

    NASA Technical Reports Server (NTRS)

    Loos, Alfred C.; Sayre, Jay

    2000-01-01

    Vacuum-Assisted Resin Transfer Molding (VARTM) processes are becoming promising technologies in the manufacturing of primary composite structures in the aircraft industry as well as infrastructure. A great deal of work still needs to be done to reduce the costly trial-and-error methods of VARTM processing that are currently in practice today. A computer simulation model of the VARTM process would provide a cost-effective tool in the manufacturing of composites utilizing this technique. Therefore, the objective of this research was to modify an existing three-dimensional, Resin Film Infusion (RFI)/Resin Transfer Molding (RTM) model to include VARTM simulation capabilities and to verify this model with the fabrication of aircraft structural composites. An additional objective was to use the VARTM model as a process analysis tool, enabling the user to configure the best process for manufacturing quality composites. Experimental verification of the model was performed by processing several flat composite panels. The parameters verified included flow front patterns and infiltration times. The flow front patterns were determined to be qualitatively accurate, while the simulated infiltration times overpredicted experimental times by 8 to 10%. Capillary and gravitational forces were incorporated into the existing RFI/RTM model in order to simulate VARTM processing physics more accurately. The theoretical capillary pressure showed the capability to reduce the simulated infiltration times by as much as 6%. Gravity, on the other hand, was found to be negligible in all cases. Finally, the VARTM model was used as a process analysis tool. This enabled the user to determine such important process constraints as the location and type of injection ports and the permeability and location of the high-permeability media. A process for a three-stiffener composite panel was proposed. This configuration evolved from the variation of the process

  10. Numerical Weather Predictions Evaluation Using Spatial Verification Methods

    NASA Astrophysics Data System (ADS)

    Tegoulias, I.; Pytharoulis, I.; Kotsopoulos, S.; Kartsios, S.; Bampzelis, D.; Karacostas, T.

    2014-12-01

    In recent years, high-resolution numerical weather prediction simulations have been used to examine meteorological events with increased convective activity. Traditional verification methods do not provide the desired level of information to evaluate those high-resolution simulations. To address those limitations, new spatial verification methods have been proposed. In the present study an attempt is made to estimate the ability of the WRF model (WRF-ARW ver3.5.1) to reproduce selected days with high convective activity during the year 2010 using those feature-based verification methods. Three model domains, covering Europe, the Mediterranean Sea and northern Africa (d01), the wider area of Greece (d02) and central Greece - Thessaly region (d03) are used at horizontal grid-spacings of 15km, 5km and 1km respectively. By alternating microphysics (Ferrier, WSM6, Goddard), boundary layer (YSU, MYJ) and cumulus convection (Kain-Fritsch, BMJ) schemes, a set of twelve model setups is obtained. The results of those simulations are evaluated against data obtained using a C-Band (5cm) radar located at the centre of the innermost domain. Spatial characteristics are well captured, but with a variable time lag between simulation results and radar data. Acknowledgements: This research is co-financed by the European Union (European Regional Development Fund) and Greek national funds, through the action "COOPERATION 2011: Partnerships of Production and Research Institutions in Focused Research and Technology Sectors" (contract number 11SYN_8_1088 - DAPHNE) in the framework of the operational programme "Competitiveness and Entrepreneurship" and Regions in Transition (OPC II, NSRF 2007-2013).
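
    One widely used neighborhood-based score of the kind referred to above is the Fractions Skill Score (FSS). A compact sketch under assumed inputs (forecast and observed precipitation fields already interpolated to a common grid) might look like:

        import numpy as np
        from scipy.ndimage import uniform_filter

        # Illustrative Fractions Skill Score (FSS), a common spatial
        # verification metric; not the specific method used in the study.
        def fss(forecast, observed, threshold, window):
            """FSS of exceedance fractions in a (window x window) neighborhood."""
            f = (forecast >= threshold).astype(float)
            o = (observed >= threshold).astype(float)
            f_frac = uniform_filter(f, size=window, mode="constant")
            o_frac = uniform_filter(o, size=window, mode="constant")
            mse = np.mean((f_frac - o_frac) ** 2)
            mse_ref = np.mean(f_frac ** 2) + np.mean(o_frac ** 2)
            return 1.0 - mse / mse_ref if mse_ref > 0 else np.nan

        rng = np.random.default_rng(0)  # stand-in fields for the example
        fcst, obs = rng.gamma(2.0, 2.0, (200, 200)), rng.gamma(2.0, 2.0, (200, 200))
        print(f"FSS (5 mm, 11-px window): {fss(fcst, obs, 5.0, 11):.3f}")

    Because the score compares neighborhood fractions rather than point values, it rewards forecasts that place convection approximately correctly, exactly the tolerance that traditional gridpoint scores lack.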

  11. Verification and Uncertainty Reduction of Amchitka Underground Nuclear Testing Models

    SciTech Connect

    Ahmed Hassan; Jenny Chapman

    2006-02-01

    The modeling of the Amchitka underground nuclear tests, conducted in 2002, is verified, and uncertainty in model input parameters, as well as in predictions, has been reduced using newly collected data obtained by the summer 2004 CRESP field expedition. Newly collected data that pertain to the groundwater model include magnetotelluric (MT) surveys conducted on the island to determine the subsurface salinity and porosity structure, and bathymetric surveys to produce bathymetric maps of the areas offshore from the Long Shot and Cannikin sites. Analysis and interpretation of the MT data yielded information on the location of the transition zone, and porosity profiles showing porosity values decaying with depth. These new data sets are used to verify the original model in terms of model parameters, model structure, and model output. In addition, by using the new data along with the existing chemistry and head data, the uncertainty in model input and output is decreased by conditioning on all the available data. A Markov chain Monte Carlo (MCMC) approach is adopted for developing new input parameter distributions conditioned on prior knowledge and new data. The MCMC approach is a form of Bayesian conditioning constructed in such a way that it produces samples of the model parameters that eventually converge to a stationary posterior distribution. The Bayesian MCMC approach enhances probabilistic assessment: instead of simply propagating uncertainty forward from input parameters into model predictions (the traditional Monte Carlo approach), MCMC propagates uncertainty backward from data onto parameters, and then forward from parameters into predictions. Comparisons between the new data and the original model, and conditioning on all available data using the MCMC method, yield the following results and conclusions: (1) Model structure is verified at Long Shot and Cannikin where the high-resolution bathymetric data collected by CRESP
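
    The conditioning step can be illustrated with a generic random-walk Metropolis sampler, a sketch of the MCMC approach in general, not the Amchitka groundwater model; the toy model, data, and priors below are stand-ins.

        import numpy as np

        # Generic random-walk Metropolis sampler (Bayesian MCMC conditioning):
        # samples eventually converge to the stationary posterior distribution.
        def log_posterior(theta, data):
            log_prior = -0.5 * np.sum((theta / 10.0) ** 2)      # weak Gaussian prior
            pred = theta[0] + theta[1] * data["x"]              # toy linear "model"
            log_like = -0.5 * np.sum(((data["y"] - pred) / 0.5) ** 2)
            return log_prior + log_like

        def metropolis(data, n_steps=5000, step=0.1, seed=0):
            rng = np.random.default_rng(seed)
            theta = np.zeros(2)
            lp = log_posterior(theta, data)
            samples = []
            for _ in range(n_steps):
                prop = theta + step * rng.standard_normal(theta.shape)
                lp_prop = log_posterior(prop, data)
                if np.log(rng.uniform()) < lp_prop - lp:        # accept/reject
                    theta, lp = prop, lp_prop
                samples.append(theta.copy())
            return np.array(samples)

        x = np.linspace(0, 1, 20)
        chain = metropolis({"x": x, "y": 1.0 + 2.0 * x})
        print("posterior means:", chain[1000:].mean(axis=0))    # burn-in discarded

    Conditioning on data in this way narrows the parameter distributions relative to the priors, which is precisely how the study reduces uncertainty in both inputs and predictions.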

  12. Alternate calibration method of radiochromic EBT3 film for quality assurance verification of clinical radiotherapy treatments

    NASA Astrophysics Data System (ADS)

    Park, Soah; Kang, Sei-Kwon; Cheong, Kwang-Ho; Hwang, Taejin; Yoon, Jai-Woong; Koo, Taeryool; Han, Tae Jin; Kim, Haeyoung; Lee, Me Yeon; Bae, Hoonsik; Kim, Kyoung Ju

    2016-07-01

    EBT3 film is utilized as a dosimetric quality assurance tool for the verification of clinical radiotherapy treatments. In this work, we suggest a percentage-depth-dose (PDD) calibration method that can calibrate several EBT3 film pieces together at different dose levels, because photon beams deliver different dose levels at different depths along the beam axis. We investigated the feasibility of the film PDD calibration method based on PDD data and compared the results with those from the traditional film calibration method. Photon beams at 6 MV were delivered to EBT3 film pieces for both calibration methods. For the PDD-based calibration, the film pieces were placed on solid phantoms at the depth of maximum dose (dmax) and at depths of 3, 5, 8, 12, 17, and 22 cm, and a photon beam was delivered twice, at 100 cGy and 400 cGy, to extend the calibration dose range under the same conditions. To maintain consistency, fourteen film pieces were irradiated at doses ranging from approximately 30 to 400 cGy for both film calibrations. The film pieces were located at the center position on the scan bed of an Epson 1680 flatbed scanner in the parallel direction. Intensity-modulated radiation therapy (IMRT) plans were created, and their dose distributions were delivered to the film. The dose distributions for the traditional method and those for the PDD-based calibration method were evaluated using a gamma analysis. The PDD dose values measured at the depths of interest with a CC13 ion chamber and those obtained with an FC65-G Farmer chamber were very similar. With an objective test criterion of 1% dose agreement at 1 mm, the passing rates for the four cases of the three IMRT plans were essentially identical. The traditional and the PDD-based calibrations provided similar plan verification results. We also describe another alternative for calibrating EBT3 films, i.e., a PDD-based calibration method that provides an easy and time-saving approach
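
    The arithmetic behind the PDD-based calibration can be sketched as follows; the PDD values below are invented placeholders, not the paper's measured 6-MV beam data.

        # Sketch of the PDD-based film calibration: one delivery yields a
        # known dose at every depth via D(d) = D_dmax * PDD(d) / 100, so a
        # stack of film pieces at different depths is calibrated at once.
        PDD_PERCENT = {"dmax": 100.0, 3: 93.0, 5: 86.0, 8: 75.0,
                       12: 63.0, 17: 50.0, 22: 40.0}  # placeholder values

        def calibration_doses(dose_at_dmax_cGy):
            """Dose (cGy) received by the film piece at each depth."""
            return {d: dose_at_dmax_cGy * p / 100.0
                    for d, p in PDD_PERCENT.items()}

        # Two deliveries (100 and 400 cGy at dmax) span roughly 40-400 cGy
        # across seven depths, instead of one irradiation per dose level:
        for d_max_dose in (100.0, 400.0):
            print(d_max_dose, calibration_doses(d_max_dose))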

  13. Verification and Validation of Kinetic Codes

    NASA Astrophysics Data System (ADS)

    Christlieb, Andrew

    2014-10-01

    We review the last three workshops held on Validation and Verification of Kinetic Codes. The goal of the workshops was to highlight the need to develop benchmark test problems beyond traditional ones such as Landau damping and the two-stream instability. Those traditional problems provide only a limited understanding of how a code might perform and mask key issues in more complicated situations. Developing new test problems highlights the strengths and weaknesses of both mesh- and particle-based codes. One outcome is that designing test problems that clearly deliver a path forward for developing improved methods is complicated by the need to create a completely self-consistent model. For example, two cases proposed by the authors as simple tests turned out to be ill-defined. The first case is the modeling of sheath formation in a 1D 1V collisionless plasma. We found that losses to the wall lead to discontinuous distribution functions, a challenge for high-order mesh-based solvers. The semi-infinite case was problematic because the far-field boundary condition poses difficulty in computing on a finite domain. Our second case was the flow of a collisionless electron beam in a pipe. Here, numerical diffusion is a key property being tested; however, two-stream instability at the beam edges introduces other issues in terms of finding convergent solutions. For mesh-based codes, before particle trapping takes place, the methods find themselves outside of the asymptotic regime. Another conclusion we draw from this exercise is that including collisional models in benchmark test problems for mesh-based plasma simulation tools is an important step in providing robust test problems for mesh-based kinetic solvers. In collaboration with Yaman Guclu, David Seal, and John Verboncoeur, Michigan State University.

  14. Verification of Functional Fault Models and the Use of Resource Efficient Verification Tools

    NASA Technical Reports Server (NTRS)

    Bis, Rachael; Maul, William A.

    2015-01-01

    Functional fault models (FFMs) are a directed graph representation of the failure effect propagation paths within a system's physical architecture and are used to support development and real-time diagnostics of complex systems. Verification of these models is required to confirm that the FFMs are correctly built and accurately represent the underlying physical system. However, a manual, comprehensive verification process applied to the FFMs was found to be error prone due to the intensive and customized process necessary to verify each individual component model and to require a burdensome level of resources. To address this problem, automated verification tools have been developed and utilized to mitigate these key pitfalls. This paper discusses the verification of the FFMs and presents the tools that were developed to make the verification process more efficient and effective.

  15. Monitoring and verification R&D

    SciTech Connect

    Pilat, Joseph F; Budlong - Sylvester, Kory W; Fearey, Bryan L

    2011-01-01

    The 2010 Nuclear Posture Review (NPR) report outlined the Administration's approach to promoting the agenda put forward by President Obama in Prague on April 5, 2009. The NPR calls for a national monitoring and verification R&D program to meet future challenges arising from the Administration's nonproliferation, arms control and disarmament agenda. Verification of a follow-on to New START could have to address warheads and possibly components along with delivery capabilities. Deeper cuts and disarmament would need to address all of these elements along with nuclear weapon testing, nuclear material and weapon production facilities, virtual capabilities from old weapon and existing energy programs and undeclared capabilities. We only know how to address some elements of these challenges today, and the requirements may be more rigorous in the context of deeper cuts as well as disarmament. Moreover, there is a critical need for multiple options to sensitive problems and to address other challenges. There will be other verification challenges in a world of deeper cuts and disarmament, some of which we are already facing. At some point, if the reductions process is progressing, uncertainties about past nuclear materials and weapons production will have to be addressed. IAEA safeguards will need to continue to evolve to meet current and future challenges, and to take advantage of new technologies and approaches. Transparency/verification of nuclear and dual-use exports will also have to be addressed, and there will be a need to make nonproliferation measures more watertight and transparent. In this context, and recognizing we will face all of these challenges even if disarmament is not achieved, this paper will explore possible agreements and arrangements; verification challenges; gaps in monitoring and verification technologies and approaches; and the R&D required to address these gaps and other monitoring and verification challenges.

  16. Experimental verification of interactions between randomly distributed fine magnetic particles

    NASA Astrophysics Data System (ADS)

    Taketomi, Susamu; Shull, Robert D.

    2003-10-01

    We experimentally examined whether or not a magnetic fluid (MF) is really superparamagnetic by comparing the initial magnetic susceptibilities of the mother MFs with those of their highly diluted solutions (more than 1000 times diluted), in which the dipole-dipole interaction between the particles was negligible. We used three mother MFs, SA 1, SB 1, and SC 1, and their highly diluted solutions, SA 2, SB 2, and SC 2, respectively. The particles' dispersibility was best in SA 1 and poorest in SC 1. From the static field experiment, it was found that the mutual interaction between the particles in SB 1 and SC 1 made clusters of particles with magnetically closed flux circuits even at zero field, while no interaction was detected in SA 1. The initial complex magnetic susceptibility, χ˜, as a function of temperature, T, under an AC field revealed that the complex susceptibilities of both samples SA 1 and SA 2 showed peaks as a function of T. However, their χ˜ vs. T curves were not similar, leading to the conclusion that the sample SA 1 was not superparamagnetic. Instead, SA 1 was a magnetic spin-glass induced by the weak interaction between the particle spins. The existence of the spin-glass state was also confirmed by the Vogel-Fulcher law dependence of the AC-susceptibility peak temperature, Tp, on the frequency of the AC field.

  17. 37 CFR 382.16 - Verification of royalty distributions.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... CONGRESS RATES AND TERMS FOR STATUTORY LICENSES RATES AND TERMS FOR DIGITAL TRANSMISSIONS OF SOUND... SATELLITE DIGITAL AUDIO RADIO SERVICES Preexisting Satellite Digital Audio Radio Services §...

  18. 37 CFR 382.16 - Verification of royalty distributions.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... CONGRESS RATES AND TERMS FOR STATUTORY LICENSES RATES AND TERMS FOR DIGITAL TRANSMISSIONS OF SOUND... SATELLITE DIGITAL AUDIO RADIO SERVICES Preexisting Satellite Digital Audio Radio Services §...

  19. 37 CFR 382.16 - Verification of royalty distributions.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... CONGRESS RATES AND TERMS FOR STATUTORY LICENSES RATES AND TERMS FOR DIGITAL TRANSMISSIONS OF SOUND... SATELLITE DIGITAL AUDIO RADIO SERVICES Preexisting Satellite Digital Audio Radio Services §...

  20. 37 CFR 382.16 - Verification of royalty distributions.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... CONGRESS RATES AND TERMS FOR STATUTORY LICENSES RATES AND TERMS FOR DIGITAL TRANSMISSIONS OF SOUND... SATELLITE DIGITAL AUDIO RADIO SERVICES Preexisting Satellite Digital Audio Radio Services §...

  1. 37 CFR 382.16 - Verification of royalty distributions.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... CONGRESS RATES AND TERMS FOR STATUTORY LICENSES RATES AND TERMS FOR DIGITAL TRANSMISSIONS OF SOUND... SATELLITE DIGITAL AUDIO RADIO SERVICES Preexisting Satellite Digital Audio Radio Services §...

  2. 37 CFR 384.7 - Verification of royalty distributions.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... shall use commercially reasonable efforts to obtain or to provide access to any relevant books and... ordinary course of business according to generally accepted auditing standards by an independent...

  3. 37 CFR 380.7 - Verification of royalty distributions.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... to any relevant books and records maintained by third parties for the purpose of the audit. The... underlying paperwork, which was performed in the ordinary course of business according to generally...

  4. 37 CFR 380.16 - Verification of royalty distributions.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... shall use commercially reasonable efforts to obtain or to provide access to any relevant books and... was performed in the ordinary course of business according to generally accepted auditing standards...

  5. 37 CFR 380.26 - Verification of royalty distributions.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... to any relevant books and records maintained by third parties for the purpose of the audit. The... underlying paperwork, which was performed in the ordinary course of business according to generally...

  6. Sensors Locate Radio Interference

    NASA Technical Reports Server (NTRS)

    2009-01-01

    After receiving a NASA Small Business Innovation Research (SBIR) contract from Kennedy Space Center, Soneticom Inc., based in West Melbourne, Florida, created algorithms for time difference of arrival and radio interferometry, which it used in its Lynx Location System (LLS) to locate electromagnetic interference that can disrupt radio communications. Soneticom is collaborating with the Federal Aviation Administration (FAA) to install and test the LLS at its field test center in New Jersey in preparation for deploying the LLS at commercial airports. The software collects data from each sensor in order to compute the location of the interfering emitter.
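
    To make the time-difference-of-arrival (TDOA) principle concrete, the sketch below solves the generic multilateration problem by Gauss-Newton iteration on range-difference residuals. It is a minimal illustration with an invented sensor geometry, not Soneticom's proprietary algorithm.

        import numpy as np

        C = 299_792_458.0  # propagation speed (speed of light), m/s

        def locate_tdoa(sensors, tdoas, x0, iters=50):
            """Estimate emitter position from TDOAs (seconds) measured against
            sensors[0], by Gauss-Newton on the range-difference residuals."""
            x = np.asarray(x0, dtype=float)
            for _ in range(iters):
                d = np.linalg.norm(sensors - x, axis=1)          # range to each sensor
                r = (d[1:] - d[0]) / C - tdoas                   # residuals, seconds
                J = ((x - sensors[1:]) / d[1:, None] - (x - sensors[0]) / d[0]) / C
                step, *_ = np.linalg.lstsq(J, -r, rcond=None)    # solve J step = -r
                x += step
                if np.linalg.norm(step) < 1e-6:
                    break
            return x

        # Invented geometry: four sensors at the corners of a 500 m square.
        sensors = np.array([[0.0, 0.0], [500.0, 0.0], [0.0, 500.0], [500.0, 500.0]])
        true_x = np.array([180.0, 320.0])
        d = np.linalg.norm(sensors - true_x, axis=1)
        tdoas = (d[1:] - d[0]) / C                               # simulated measurements
        print(locate_tdoa(sensors, tdoas, x0=[250.0, 250.0]))    # ~ [180. 320.]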

  7. Protection of autonomous microgrids using agent-based distributed communication

    SciTech Connect

    Cintuglu, Mehmet H.; Ma, Tan; Mohammed, Osama A.

    2016-04-06

    This study presents a real-time implementation of autonomous microgrid protection using agent-based distributed communication. Protection of an autonomous microgrid requires special considerations compared to large scale distribution networks due to the presence of power converters and relatively low inertia. In this work, we introduce a practical overcurrent and a frequency selectivity method to overcome conventional limitations. The proposed overcurrent scheme defines a selectivity mechanism considering the remedial action scheme (RAS) of the microgrid after a fault instant based on feeder characteristics and the location of the intelligent electronic devices (IEDs). A synchrophasor-based online frequency selectivity approach is proposed to avoid pulse loading effects in low inertia microgrids. Experimental results are presented for verification of the proposed schemes using a laboratory-based microgrid. The setup was composed of actual generation units and IEDs using IEC 61850 protocol. The experimental results were in excellent agreement with the proposed protection scheme.
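
    As a concrete illustration of frequency selectivity in a low-inertia microgrid, the toy check below trips only on a sustained rate-of-change-of-frequency (ROCOF) excursion, so that a short pulse-load dip is ignored. The thresholds, window, and logic are invented for illustration and are not the authors' published scheme.

        def sustained_rocof_trip(freqs_hz, dt_s, rocof_limit=1.0, hold_samples=5):
            """Trip only if |df/dt| exceeds rocof_limit (Hz/s) for hold_samples
            consecutive synchrophasor reports; brief pulse-load dips are ignored."""
            run = 0
            for i in range(1, len(freqs_hz)):
                rocof = abs(freqs_hz[i] - freqs_hz[i - 1]) / dt_s
                run = run + 1 if rocof > rocof_limit else 0
                if run >= hold_samples:
                    return True   # sustained excursion: treat as a genuine fault
            return False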

  8. Monitoring/Verification using DMS: TATP Example

    SciTech Connect

    Stephan Weeks; Kevin Kyle

    2008-03-01

    Field-rugged and field-programmable differential mobility spectrometry (DMS) networks provide highly selective, universal monitoring of vapors and aerosols at detectable levels from persons or areas involved with illicit chemical/biological/explosives (CBE) production. CBE sensor motes used in conjunction with automated fast gas chromatography with DMS detection (GC/DMS) verification instrumentation integrated into situational operations management systems can be readily deployed and optimized for changing application scenarios. The feasibility of developing selective DMS motes for a 'smart dust' sampling approach with guided, highly selective, fast GC/DMS verification analysis is a compelling approach to minimize or prevent the use of explosives or chemical and biological weapons in terrorist activities. Two peroxide-based liquid explosives, triacetone triperoxide (TATP) and hexamethylene triperoxide diamine (HMTD), are synthesized from common chemicals such as hydrogen peroxide, acetone, sulfuric acid, ammonia, and citric acid (Figure 1). Recipes can be readily found on the Internet by anyone seeking to generate sufficient quantities of these highly explosive chemicals to cause considerable collateral damage. Detection of TATP and HMTD by advanced sensing systems can provide the early warning necessary to prevent terror plots from coming to fruition. DMS is currently one of the foremost emerging technologies for the separation and detection of gas-phase chemical species. This is due to trace-level detection limits, high selectivity, and small size. DMS separates and identifies ions at ambient pressures by utilizing the non-linear dependence of an ion's mobility on the radio frequency (rf) electric field strength. GC is widely considered to be one of the leading analytical methods for the separation of chemical species in complex mixtures. Advances in the technique have led to the development of low-thermal-mass fast GC columns. These columns are capable of
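
    The "non-linear dependence of an ion's mobility on the radio frequency (rf) electric field strength" mentioned above is commonly parameterized with the so-called alpha function (standard differential-mobility notation, given here for context rather than taken from this abstract):

        K(E/N) = K(0)\left[ 1 + \alpha(E/N) \right], \qquad
        \alpha(E/N) = \alpha_2 (E/N)^2 + \alpha_4 (E/N)^4 + \cdots

    where E is the electric field, N the gas number density, and K(0) the low-field mobility; DMS separates ions by the difference between their high-field and low-field mobilities over the asymmetric rf waveform.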

  9. Nasogastric tube placement and verification in children: review of the current literature.

    PubMed

    Irving, Sharon Y; Lyman, Beth; Northington, LaDonna; Bartlett, Jacqueline A; Kemper, Carol

    2014-06-01

    Placement of a nasogastric enteral access device (NG-EAD), often referred to as a nasogastric tube, is common practice and largely in the domain of nursing care. Most often an NG-EAD is placed at the bedside without radiographic assistance. Correct initial placement and ongoing location verification are the primary challenges surrounding NG-EAD use and have implications for patient safety. Although considered an innocuous procedure, placement of an NG-EAD carries risk of serious and potentially lethal complications. Despite acknowledgment that an abdominal radiograph is the gold standard, other methods of verifying placement location are widely used and have success rates from 80% to 85%. The long-standing challenges surrounding bedside placement of NG-EADs and a practice alert issued by the Child Health Patient Safety Organization on this issue were the stimuli for the conception of The New Opportunities for Verification of Enteral Tube Location Project sponsored by the American Society for Parenteral and Enteral Nutrition. Its mission is to identify and promote best practices with the potential of technology development that will enable accurate determination of NG-EAD placement for both the inpatient and outpatient pediatric populations. This article presents the challenges of bedside NG-EAD placement and ongoing location verification in children through an overview of the current state of the science. It is important for all health care professionals to be knowledgeable about the current literature, to be vigilant for possible complications, and to avoid complacency with NG-EAD placement and ongoing verification of tube location.

  10. The choice of practice location

    PubMed Central

    Butler, J. R.; Knight, Rose

    1975-01-01

    A ten per cent sample survey of all general practitioners in England and Wales in 1969-70 included two questions about the choice of practice location. The most common reasons given were the absence of any real alternatives (in the immediate post-war period), the influence of family or friends, the existence of medical contacts in the area, and favourable points about the practice itself. In considering possible future moves, general practitioners would pay closest attention to the educational facilities of an area, its rural or coastal location, its social and cultural amenities, and the practice conditions. The conclusion is drawn that financial incentives are unlikely to contribute much towards a more equal distribution of general-practitioner manpower. More thought should be given to recruitment to the medical profession in under-doctored areas through the development of the highest professional standards and facilities in such places. PMID:1195223

  11. Experimental verification of multipartite entanglement in quantum networks

    NASA Astrophysics Data System (ADS)

    McCutcheon, W.; Pappa, A.; Bell, B. A.; McMillan, A.; Chailloux, A.; Lawson, T.; Mafu, M.; Markham, D.; Diamanti, E.; Kerenidis, I.; Rarity, J. G.; Tame, M. S.

    2016-11-01

    Multipartite entangled states are a fundamental resource for a wide range of quantum information processing tasks. In particular, in quantum networks, it is essential for the parties involved to be able to verify if entanglement is present before they carry out a given distributed task. Here we design and experimentally demonstrate a protocol that allows any party in a network to check if a source is distributing a genuinely multipartite entangled state, even in the presence of untrusted parties. The protocol remains secure against dishonest behaviour of the source and other parties, including the use of system imperfections to their advantage. We demonstrate the verification protocol in a three- and four-party setting using polarization-entangled photons, highlighting its potential for realistic photonic quantum communication and networking applications.

  12. Experimental verification of multipartite entanglement in quantum networks

    PubMed Central

    McCutcheon, W.; Pappa, A.; Bell, B. A.; McMillan, A.; Chailloux, A.; Lawson, T.; Mafu, M.; Markham, D.; Diamanti, E.; Kerenidis, I.; Rarity, J. G.; Tame, M. S.

    2016-01-01

    Multipartite entangled states are a fundamental resource for a wide range of quantum information processing tasks. In particular, in quantum networks, it is essential for the parties involved to be able to verify if entanglement is present before they carry out a given distributed task. Here we design and experimentally demonstrate a protocol that allows any party in a network to check if a source is distributing a genuinely multipartite entangled state, even in the presence of untrusted parties. The protocol remains secure against dishonest behaviour of the source and other parties, including the use of system imperfections to their advantage. We demonstrate the verification protocol in a three- and four-party setting using polarization-entangled photons, highlighting its potential for realistic photonic quantum communication and networking applications. PMID:27827361

  13. Experimental verification of multipartite entanglement in quantum networks.

    PubMed

    McCutcheon, W; Pappa, A; Bell, B A; McMillan, A; Chailloux, A; Lawson, T; Mafu, M; Markham, D; Diamanti, E; Kerenidis, I; Rarity, J G; Tame, M S

    2016-11-09

    Multipartite entangled states are a fundamental resource for a wide range of quantum information processing tasks. In particular, in quantum networks, it is essential for the parties involved to be able to verify if entanglement is present before they carry out a given distributed task. Here we design and experimentally demonstrate a protocol that allows any party in a network to check if a source is distributing a genuinely multipartite entangled state, even in the presence of untrusted parties. The protocol remains secure against dishonest behaviour of the source and other parties, including the use of system imperfections to their advantage. We demonstrate the verification protocol in a three- and four-party setting using polarization-entangled photons, highlighting its potential for realistic photonic quantum communication and networking applications.
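
    The three records above index the same study. As background on what such a verification certifies, a textbook witness for genuine multipartite entanglement of an n-party GHZ-type source is (standard material, not the paper's specific protocol):

        |\mathrm{GHZ}_n\rangle = \frac{1}{\sqrt{2}}\left( |0\rangle^{\otimes n} + |1\rangle^{\otimes n} \right), \qquad
        W = \tfrac{1}{2}\,\mathbb{1} - |\mathrm{GHZ}_n\rangle\langle\mathrm{GHZ}_n|

    so that Tr(Wρ) < 0, i.e., fidelity with the GHZ state above 1/2, certifies genuine multipartite entanglement; the protocol in the paper additionally remains secure when the source and some parties are untrusted.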

  14. Clinical application of in vivo treatment delivery verification based on PET/CT imaging of positron activity induced at high energy photon therapy

    NASA Astrophysics Data System (ADS)

    Janek Strååt, Sara; Andreassen, Björn; Jonsson, Cathrine; Noz, Marilyn E.; Maguire, Gerald Q., Jr.; Näfstadius, Peder; Näslund, Ingemar; Schoenahl, Frederic; Brahme, Anders

    2013-08-01

    The purpose of this study was to investigate in vivo verification of radiation treatment with high energy photon beams using PET/CT to image the induced positron activity. The measurements of the positron activation induced in a preoperative rectal cancer patient and a prostate cancer patient following 50 MV photon treatments are presented. A total dose of 5 and 8 Gy, respectively, were delivered to the tumors. Imaging was performed with a 64-slice PET/CT scanner for 30 min, starting 7 min after the end of the treatment. The CT volume from the PET/CT and the treatment planning CT were coregistered by matching anatomical reference points in the patient. The treatment delivery was imaged in vivo based on the distribution of the induced positron emitters produced by photonuclear reactions in tissue mapped on to the associated dose distribution of the treatment plan. The results showed that spatial distribution of induced activity in both patients agreed well with the delivered beam portals of the treatment plans in the entrance subcutaneous fat regions but less so in blood and oxygen rich soft tissues. For the preoperative rectal cancer patient however, a 2 ± (0.5) cm misalignment was observed in the cranial-caudal direction of the patient between the induced activity distribution and treatment plan, indicating a beam patient setup error. No misalignment of this kind was seen in the prostate cancer patient. However, due to a fast patient setup error in the PET/CT scanner a slight mis-position of the patient in the PET/CT was observed in all three planes, resulting in a deformed activity distribution compared to the treatment plan. The present study indicates that the induced positron emitters by high energy photon beams can be measured quite accurately using PET imaging of subcutaneous fat to allow portal verification of the delivered treatment beams. Measurement of the induced activity in the patient 7 min after receiving 5 Gy involved count rates which were about

  15. Using polygons to recognize and locate partially occluded objects.

    PubMed

    Koch, M W; Kashyap, R L

    1987-04-01

    We present computer vision algorithms that recognize and locate partially occluded objects. The scene may contain unknown objects that may touch or overlap, giving rise to partial occlusion. The algorithms revolve around a generate-test paradigm. The paradigm iteratively generates and tests hypotheses for compatibility with the scene until it identifies all the scene objects. Polygon representations of the object's boundary guide the hypothesis generation scheme. Choosing the polygon representation turns out to have powerful consequences in all phases of hypothesis generation and verification. Special vertices of the polygon called "corners" help detect and locate the model in the scene. Polygon moment calculations lead to estimates of the dissimilarity between scene and model corners, and determine the model corner location in the scene. An association graph represents the matches and compatibility constraints. Extraction of the largest set of mutually compatible matches from the association graph forms a model hypothesis. Using a coordinate transform that maps the model onto the scene, the hypothesis gives the proposed model's location and orientation. Hypothesis verification requires checking for region consistency. The union of two polygons and other polygon operations combine to measure the consistency of the hypothesis with the scene. Experimental results give examples of all phases of recognizing and locating the objects.
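
    The "largest set of mutually compatible matches" step above is a maximum-clique computation on the association graph. The sketch below uses the generic Bron-Kerbosch enumeration on an invented toy graph; it illustrates the idea only and is not the authors' implementation.

        def bron_kerbosch(R, P, X, adj, cliques):
            """Enumerate all maximal cliques of a graph given as adjacency sets."""
            if not P and not X:
                cliques.append(R)
                return
            for v in list(P):
                bron_kerbosch(R | {v}, P & adj[v], X & adj[v], adj, cliques)
                P = P - {v}
                X = X | {v}

        # Invented association graph: nodes are candidate (scene corner, model
        # corner) matches; edges join pairwise-compatible matches.
        adj = {0: {1, 2}, 1: {0, 2}, 2: {0, 1, 3}, 3: {2}}
        cliques = []
        bron_kerbosch(set(), set(adj), set(), adj, cliques)
        hypothesis = max(cliques, key=len)   # largest mutually compatible set
        print(hypothesis)                    # {0, 1, 2} forms the model hypothesis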

  16. RF model of the distribution system as a communication channel, phase 2. Volume 2: Task reports

    NASA Technical Reports Server (NTRS)

    Rustay, R. C.; Gajjar, J. T.; Rankin, R. W.; Wentz, R. C.; Wooding, R.

    1982-01-01

    Based on the established feasibility of predicting, via a model, the propagation of Power Line Frequency on radial type distribution feeders, verification studies comparing model predictions against measurements were undertaken using more complicated feeder circuits and situations. Detailed accounts of the major tasks are presented. These include: (1) verification of model; (2) extension, implementation, and verification of perturbation theory; (3) parameter sensitivity; (4) transformer modeling; and (5) compensation of power distribution systems for enhancement of power line carrier communication reliability.

  17. Electronic apex locators.

    PubMed

    Gordon, M P J; Chandler, N P

    2004-07-01

    Prior to root canal treatment at least one undistorted radiograph is required to assess canal morphology. The apical extent of instrumentation and the final root filling have a role in treatment success, and are primarily determined radiographically. Electronic apex locators reduce the number of radiographs required and assist where radiographic methods create difficulty. They may also indicate cases where the apical foramen is some distance from the radiographic apex. Other roles include the detection of root canal perforation. A review of the literature focussed first on the subject of electronic apex location. A second review used the names of apex location devices. From the combined searches, 113 pertinent articles in English were found. This paper reviews the development, action, use and types of electronic apex locators.

  18. Smart Location Mapping

    EPA Pesticide Factsheets

    The Smart Location Database, Access to Jobs and Workers via Transit, and National Walkability Index tools can help assess indicators related to the built environment, transit accessibility, and walkability.

  19. Uranium Location Database Compilation

    EPA Pesticide Factsheets

    EPA has compiled mine location information from federal, state, and Tribal agencies into a single database as part of its investigation into the potential environmental hazards of wastes from abandoned uranium mines in the western United States.

  20. INF verification: a guide for the perplexed

    SciTech Connect

    Mendelsohn, J.

    1987-09-01

    The administration has dug itself some deep holes on the verification issue. It will have to conclude an arms control treaty without having resolved earlier (but highly questionable) compliance issues on which it has placed great emphasis. It will probably have to abandon its more sweeping (and unnecessary) on-site inspection (OSI) proposals because of adverse security and political implications for the United States and its allies. And, finally, it will probably have to present to the Congress an INF treaty that will provide for a considerably less-stringent (but nonetheless adequate) verification regime that it had originally demanded. It is difficult to dispel the impression that, when the likelihood of concluding an INF treaty seemed remote, the administration indulged its penchant for intrusive and non-negotiable verification measures. As the possibility of, and eagerness for, a treaty increased, and as the Soviet Union shifted its policy from one of the resistance to OSI to one of indicating that on-site verification involved reciprocal obligations, the administration was forced to scale back its OSI rhetoric. This re-evaluation of OSI by the administration does not make the INF treaty any less verifiable; from the outset the Reagan administration was asking for a far more extensive verification package than was necessary, practicable, acceptable, or negotiable.

  1. Neighborhood Repulsed Metric Learning for Kinship Verification.

    PubMed

    Lu, Jiwen; Zhou, Xiuzhuang; Tan, Yap-Pen; Shang, Yuanyuan; Zhou, Jie

    2013-07-16

    Kinship verification from facial images is an interesting and challenging problem in computer vision, and there are very limited attempts to tackle this problem in the literature. In this paper, we propose a new neighborhood repulsed metric learning (NRML) method for kinship verification. Motivated by the fact that interclass samples (without kinship relations) with higher similarity usually lie in a neighborhood and are more easily misclassified than those with lower similarity, we aim to learn a distance metric under which the intraclass samples (with kinship relations) are pulled as close as possible and interclass samples lying in a neighborhood are repulsed and pushed away as far as possible, simultaneously, such that more discriminative information can be exploited for verification. To make better use of multiple feature descriptors to extract complementary information, we further propose a multiview NRML (MNRML) method to seek a common distance metric to perform multiple feature fusion to improve the kinship verification performance. Experimental results are presented to demonstrate the efficacy of our proposed methods. Lastly, we also test human ability in kinship verification from facial images and our experimental results show that our methods are comparable to that of human observers.

  2. Neighborhood repulsed metric learning for kinship verification.

    PubMed

    Lu, Jiwen; Zhou, Xiuzhuang; Tan, Yap-Pen; Shang, Yuanyuan; Zhou, Jie

    2014-02-01

    Kinship verification from facial images is an interesting and challenging problem in computer vision, and there are very limited attempts to tackle this problem in the literature. In this paper, we propose a new neighborhood repulsed metric learning (NRML) method for kinship verification. Motivated by the fact that interclass samples (without a kinship relation) with higher similarity usually lie in a neighborhood and are more easily misclassified than those with lower similarity, we aim to learn a distance metric under which the intraclass samples (with a kinship relation) are pulled as close as possible and interclass samples lying in a neighborhood are repulsed and pushed away as far as possible, simultaneously, such that more discriminative information can be exploited for verification. To make better use of multiple feature descriptors to extract complementary information, we further propose a multiview NRML (MNRML) method to seek a common distance metric to perform multiple feature fusion to improve the kinship verification performance. Experimental results are presented to demonstrate the efficacy of our proposed methods. Finally, we also test human ability in kinship verification from facial images and our experimental results show that our methods are comparable to that of human observers.
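
    The pull/push objective described in these two records can be caricatured in a few lines: decrease the Mahalanobis distance d_M(x, y) = (x - y)^T M (x - y) over kin pairs while increasing it over neighboring non-kin pairs, keeping M positive semidefinite. The sketch below is a toy gradient version under that reading of the abstract; the published NRML uses an eigen-decomposition formulation rather than this loop.

        import numpy as np

        def nrml_toy(kin_pairs, nonkin_pairs, dim, lr=0.01, epochs=100):
            """Toy metric learning in the spirit of NRML: pull kin pairs together,
            push neighboring non-kin pairs apart, then project M back onto the
            PSD cone so it remains a valid metric. Illustrative only."""
            M = np.eye(dim)
            for _ in range(epochs):
                G = np.zeros((dim, dim))
                for x, y in kin_pairs:        # intraclass: shrink distances
                    d = x - y
                    G += np.outer(d, d)
                for x, y in nonkin_pairs:     # interclass neighbors: grow them
                    d = x - y
                    G -= np.outer(d, d)
                M -= lr * G
                w, V = np.linalg.eigh(M)      # PSD projection: clip eigenvalues
                M = (V * np.clip(w, 0.0, None)) @ V.T
            return M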

  3. Hybrid Deep Learning for Face Verification.

    PubMed

    Sun, Yi; Wang, Xiaogang; Tang, Xiaoou

    2016-10-01

    This paper proposes a hybrid convolutional network (ConvNet)-Restricted Boltzmann Machine (RBM) model for face verification. A key contribution of this work is to learn high-level relational visual features with rich identity similarity information. The deep ConvNets in our model start by extracting local relational visual features from two face images in comparison, which are further processed through multiple layers to extract high-level and global relational features. To keep enough discriminative information, we use the last hidden layer neuron activations of the ConvNet as features for face verification instead of those of the output layer. To characterize face similarities from different aspects, we concatenate the features extracted from different face region pairs by different deep ConvNets. The resulting high-dimensional relational features are classified by an RBM for face verification. After pre-training each ConvNet and the RBM separately, the entire hybrid network is jointly optimized to further improve the accuracy. Various aspects of the ConvNet structures, relational features, and face verification classifiers are investigated. Our model achieves the state-of-the-art face verification performance on the challenging LFW dataset under both the unrestricted protocol and the setting when outside data is allowed to be used for training.

  4. Ionoacoustics: A new direct method for range verification

    NASA Astrophysics Data System (ADS)

    Parodi, Katia; Assmann, Walter

    2015-05-01

    The superior ballistic properties of ion beams may offer improved tumor-dose conformality and unprecedented sparing of organs at risk in comparison to other radiation modalities in external radiotherapy. However, these advantages come at the expense of increased sensitivity to uncertainties in the actual treatment delivery, resulting from inaccuracies of patient positioning, physiological motion and uncertainties in the knowledge of the ion range in living tissue. In particular, the dosimetric selectivity of ion beams depends on the longitudinal location of the Bragg peak, making in vivo knowledge of the actual beam range the greatest challenge to full clinical exploitation of ion therapy. Nowadays, in vivo range verification techniques, which are already, or close to, being investigated in the clinical practice, rely on the detection of the secondary annihilation photons or prompt gammas, resulting from nuclear interaction of the primary ion beam with the irradiated tissue. Despite the initial promising results, these methods utilize a not straightforward correlation between nuclear and electromagnetic processes, and typically require massive and costly instrumentation. On the contrary, the long-term known, yet only recently revisited process of "ionoacoustics", which is generated by local tissue heating especially at the Bragg peak, may offer a more direct approach to in vivo range verification, as reviewed here.

  5. Lunar Impact Flash Locations

    NASA Technical Reports Server (NTRS)

    Moser, D. E.; Suggs, R. M.; Kupferschmidt, L.; Feldman, J.

    2015-01-01

    A bright impact flash detected by the NASA Lunar Impact Monitoring Program in March 2013 brought into focus the importance of determining the impact flash location. A process for locating the impact flash, and presumably its associated crater, was developed using commercially available software tools. The process was successfully applied to the March 2013 impact flash and put into production on an additional 300 impact flashes. The goal today: provide a description of the geolocation technique developed.

  6. GHG MITIGATION TECHNOLOGY PERFORMANCE EVALUATIONS UNDERWAY AT THE GHG TECHNOLOGY VERIFICATION CENTER

    EPA Science Inventory

    The paper outlines the verification approach and activities of the Greenhouse Gas (GHG) Technology Verification Center, one of 12 independent verification entities operating under the U.S. EPA-sponsored Environmental Technology Verification (ETV) program. (NOTE: The ETV program...

  7. Implications of Non-Systematic Observations for Verification of Forecasts of Aviation Weather Variables

    NASA Astrophysics Data System (ADS)

    Brown, B. G.; Young, G. S.; Fowler, T. L.

    2001-12-01

    Over the last several years, efforts have been undertaken to develop improved automated forecasts of weather phenomena that have large impacts on aviation, including turbulence and in-flight icing conditions. Verification of these forecasts - which has played a major role in their development - is difficult due to the nature of the limited observations available for these evaluations; in particular, voice reports by pilots (PIREPs). These reports, which are provided inconsistently by pilots, currently are the best observations of turbulence and in-flight icing conditions available. However, their sampling characteristics make PIREPs a difficult dataset to use for these evaluations. In particular, PIREPs have temporal and spatial biases (e.g., they are more frequent during daylight hours, and they occur most frequently along flight routes and in the vicinity of major airports, where aircraft are concentrated), and they are subjective. Most importantly, the observations are non-systematic. That is, observations are not consistently reported at the same location and time. This characteristic of the reports has numerous implications for the verification of forecasts of these phenomena. In particular, it is inappropriate to estimate certain common verification statistics that normally are of interest in forecast evaluations. For example, estimates of the false alarm ratio and critical success index are incorrect, due to the unrepresentativeness of the observations. Analytical explanations for this result have been developed, and the magnitudes of the errors associated with estimating these statistics have been estimated through Monte Carlo simulations. In addition, several approaches have been developed to compensate for these characteristics of PIREPs in verification studies, including methods for estimating confidence intervals for the verification statistics, which take into account their sampling variability. These approaches also have implications for verification
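
    For context, the statistics named above are defined from the standard 2x2 contingency table of hits a, false alarms b, misses c, and correct nulls d (standard forecast-verification notation):

        \mathrm{POD} = \frac{a}{a + c}, \qquad
        \mathrm{FAR} = \frac{b}{a + b}, \qquad
        \mathrm{CSI} = \frac{a}{a + b + c}

    Because pilots rarely file "null" reports where nothing occurred, the false-alarm count b is not sampled representatively by PIREPs, which is why estimates of FAR and CSI are unreliable in this setting while detection-rate statistics such as POD remain more usable.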

  8. Theoretical study of closed-loop recycling liquid-liquid chromatography and experimental verification of the theory.

    PubMed

    Kostanyan, Artak E; Erastov, Andrey A

    2016-09-02

    The non-ideal recycling equilibrium-cell model including the effects of extra-column dispersion is used to simulate and analyze closed-loop recycling counter-current chromatography (CLR CCC). Previously, the operating scheme with the detector located before the column was considered. In this study, analysis of the process is carried out for a more realistic and practical scheme with the detector located immediately after the column. Peak equation for individual cycles and equations describing the transport of single peaks and complex chromatograms inside the recycling closed-loop, as well as equations for the resolution between single solute peaks of the neighboring cycles, for the resolution of peaks in the recycling chromatogram and for the resolution between the chromatograms of the neighboring cycles are presented. It is shown that, unlike conventional chromatography, increasing of the extra-column volume (the recycling line length) may allow a better separation of the components in CLR chromatography. For the experimental verification of the theory, aspirin, caffeine, coumarin and the solvent system hexane/ethyl acetate/ethanol/water (1:1:1:1) were used. Comparison of experimental and simulated processes of recycling and distribution of the solutes in the closed-loop demonstrated a good agreement between theory and experiment.
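
    The resolution quantities discussed are conventionally built on the standard chromatographic definition (given here for orientation; the paper's cycle-indexed expressions extend it to the recycling loop):

        R_s = \frac{2\,(t_{R,2} - t_{R,1})}{w_1 + w_2}

    where t_{R,1} and t_{R,2} are the retention times of adjacent peaks and w_1, w_2 their baseline widths. Recycling improves R_s by lengthening the effective column, and the abstract notes that, unlike in conventional chromatography, enlarging the extra-column (recycling line) volume may itself improve separation.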

  9. Ionospheric Modeling: Development, Verification and Validation

    DTIC Science & Technology

    2007-08-15

    Investigation of the Reliability of the ESIR Ionogram Autoscaling Method (Expert System for Ionogram Reduction), ESIR.book.pdf, Dec 06
    Quality Figures and Error Bars for Autoscaled Vertical Incidence Ionograms: Background and User Documentation for QualScan V2007.2, AFRL_QualScan.book.pdf, Feb ...
    Distribution of Ionosonde Locations, USU_old_new.book.pdf, Jul 07
    Validation of QualScan when applied to Ionograms scaled by ARTIST 5, A5V.book.pdf, Jul 07

  10. Dust storm events over Delhi: verification of dust AOD forecasts with satellite and surface observations

    NASA Astrophysics Data System (ADS)

    Singh, Aditi; Iyengar, Gopal R.; George, John P.

    2016-05-01

    The Thar Desert, located in the northwestern part of India, is considered one of the major dust sources. Dust storms originating in the Thar Desert during the pre-monsoon season affect large parts of the Indo-Gangetic Plains. High dust loading causes deterioration of the ambient air quality and degradation in visibility. The present study focuses on the identification of dust events and the verification of forecasts of dust events over Delhi and the western part of the IG Plains during the pre-monsoon season of 2015. Three dust events were identified over Delhi during the study period. For all the selected days, Terra-MODIS AODs at 550 nm are found to be close to 1.0, while AURA-OMI AI shows high values. Dust AOD forecasts from the NCMRWF Unified Model (NCUM) for the three selected dust events are verified against satellite (MODIS) and ground-based observations (AERONET). Comparison of observed AODs at 550 nm from MODIS with NCUM-predicted AODs reveals that NCUM is able to predict the spatial and temporal distribution of dust AOD in these cases. Good correlation (~0.67) is obtained between the NCUM-predicted dust AODs and location-specific observations available from AERONET. The model under-predicted the AODs compared to the AERONET observations, mainly because the model accounts only for dust and does not consider anthropogenic activities. The results of the present study emphasize the requirement of a more realistic representation of local dust emission in the model, of both natural and anthropogenic origin, to improve the forecast of dust from NCUM during dust events.

  11. Dust forecast over North Africa: verification with satellite and ground based observations

    NASA Astrophysics Data System (ADS)

    Singh, Aditi; Kumar, Sumit; George, John P.

    2016-05-01

    Arid regions of North Africa are considered among the major dust sources. The present study focuses on the forecast of aerosol optical depth (AOD) of dust over different regions of North Africa. The NCMRWF Unified Model (NCUM) produces dust AOD forecasts at different wavelengths with lead times up to 240 hr, based on 00 UTC initial conditions. Model forecasts of dust AOD at 550 nm, up to 72 hr ahead and based on different initial conditions, are verified against satellite and ground-based observations of total AOD during May-June 2014, under the assumption that all aerosol types other than dust are negligible. Location-specific and geographical distributions of the dust AOD forecasts are verified against Aerosol Robotic Network (AERONET) station observations of total and coarse-mode AOD. Moderate Resolution Imaging Spectroradiometer (MODIS) dark-target and deep-blue merged level 3 total aerosol optical depth (AOD) at 550 nm and Cloud-Aerosol Lidar with Orthogonal Polarization (CALIOP) retrieved dust AOD at 532 nm are also used for verification. CALIOP dust AOD was obtained by vertical integration of the aerosol extinction coefficient at 532 nm from the level 2 aerosol profile products. It is found that at all the selected AERONET stations the trend in dust AODs is well predicted by NCUM up to three days in advance. Good correlation, with consistently low bias (~ +/-0.06) and RMSE (~ 0.2) values, is found between model forecasts and point measurements of AERONET, except over one location, Cinzana (Mali). Model forecasts consistently overestimated the dust AOD compared to CALIOP dust AOD, with a bias of 0.25 and RMSE of 0.40.
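
    The bias, RMSE, and correlation scores quoted in these dust-verification records are the standard continuous measures for forecasts F_i against observations O_i:

        \mathrm{bias} = \frac{1}{N}\sum_{i=1}^{N} (F_i - O_i), \qquad
        \mathrm{RMSE} = \sqrt{\frac{1}{N}\sum_{i=1}^{N} (F_i - O_i)^2}

    together with the Pearson correlation coefficient between the forecast and observation series.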

  12. PBL Verification with Radiosonde and Aircraft Data

    NASA Astrophysics Data System (ADS)

    Tsidulko, M.; McQueen, J.; Dimego, G.; Ek, M.

    2008-12-01

    Boundary layer depth is an important characteristic in weather forecasting, and it is a key parameter in air quality modeling, determining the extent of turbulence and dispersion for pollutants. Real-time PBL depths from the NAM (WRF/NMM) model are verified with different types of observations. PBL depth verification is incorporated into the NCEP verification system, including an ability to provide a range of statistical characteristics for the boundary layer heights. For the model, several types of boundary layer definitions are used: PBL height from the TKE scheme and from the critical Ri number approach, as well as mixed layer depth, are compared with observations. Observed PBL depths are determined by applying the Ri number approach to radiosonde profiles. A preliminary study of using ACARS data for PBL verification is also conducted.
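
    The Ri number approach referred to above conventionally uses the bulk Richardson number profile (standard boundary-layer notation, supplied here for context):

        Ri_b(z) = \frac{g\,\left[\theta_v(z) - \theta_{v,s}\right](z - z_s)}
                       {\theta_{v,s}\left[u(z)^2 + v(z)^2\right]}

    where \theta_v is the virtual potential temperature, the subscript s denotes the surface level, and u, v are the wind components; the PBL depth is taken as the lowest height at which Ri_b crosses a critical value, commonly near 0.25.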

  13. Active alignment/contact verification system

    DOEpatents

    Greenbaum, William M.

    2000-01-01

    A system involving an active (i.e., electrical) technique for the verification of: 1) close-tolerance mechanical alignment between two components, and 2) electrical contact between mating parts through an elastomeric interface. For example, the two components may be an alumina carrier and a printed circuit board, two mating parts that are extremely small, high-density parts and require alignment within a fraction of a mil, as well as a specified interface point of engagement between the parts. The system comprises pairs of conductive structures defined in the surface layers of the alumina carrier and the printed circuit board, for example. The first pair of conductive structures relates to item (1) above and permits alignment verification between mating parts. The second pair of conductive structures relates to item (2) above and permits verification of electrical contact between mating parts.

  14. Packaged low-level waste verification system

    SciTech Connect

    Tuite, K.T.; Winberg, M.; Flores, A.Y.; Killian, E.W.; McIsaac, C.V.

    1996-08-01

    Currently, states and low-level radioactive waste (LLW) disposal site operators have no method of independently verifying the radionuclide content of packaged LLW that arrives at disposal sites for disposal. At this time, disposal sites rely on LLW generator shipping manifests and accompanying records to ensure that LLW received meets the waste acceptance criteria. An independent verification system would provide a method of checking generator LLW characterization methods and help ensure that LLW disposed of at disposal facilities meets requirements. The Mobile Low-Level Waste Verification System (MLLWVS) provides the equipment, software, and methods to enable the independent verification of LLW shipping records to ensure that disposal site waste acceptance criteria are being met. The MLLWVS system was developed under a cost-share subcontract between WMG, Inc., and Lockheed Martin Idaho Technologies through the Department of Energy's National Low-Level Waste Management Program at the Idaho National Engineering Laboratory (INEL).

  15. Land Ice Verification and Validation Kit

    SciTech Connect

    2015-07-15

    To address a pressing need to better understand the behavior and complex interaction of ice sheets within the global Earth system, significant development of continental-scale, dynamical ice-sheet models is underway. The associated verification and validation process of these models is being coordinated through a new, robust, python-based extensible software package, the Land Ice Verification and Validation toolkit (LIVV). This release provides robust and automated verification and a performance evaluation on LCF platforms. The performance V&V involves a comprehensive comparison of model performance relative to expected behavior on a given computing platform. LIVV operates on a set of benchmark and test data, and provides comparisons for a suite of community prioritized tests, including configuration and parameter variations, bit-4-bit evaluation, and plots of tests where differences occur.

  16. Critical Surface Cleaning and Verification Alternatives

    NASA Technical Reports Server (NTRS)

    Melton, Donald M.; McCool, A. (Technical Monitor)

    2000-01-01

    As a result of federal and state requirements, historical critical cleaning and verification solvents such as Freon 113, Freon TMC, and Trichloroethylene (TCE) are either highly regulated or no longer available. Interim replacements such as HCFC 225 have been qualified; however, toxicity and future phase-out regulations necessitate long-term solutions. The scope of this project was to qualify a safe and environmentally compliant LOX surface verification alternative to Freon 113, TCE and HCFC 225. The main effort was focused on initiating the evaluation and qualification of HCFC 225G as an alternate LOX verification solvent. The project was scoped in FY 99/00 to perform LOX compatibility, cleaning efficiency and qualification on flight hardware.

  17. ENVIRONMENTAL TECHNOLOGY VERIFICATION: JOINT (NSF-EPA) VERIFICATION STATEMENT AND REPORT FOR THE REDUCTION OF NITROGEN IN DOMESTIC WASTEWATER FROM INDIVIDUAL HOMES, BIO-MICROBICS, INC., MODEL RETROFAST ®0.375

    EPA Science Inventory

    Verification testing of the Bio-Microbics RetroFAST® 0.375 System to determine the reduction of nitrogen in residential wastewater was conducted over a twelve-month period at the Mamquam Wastewater Technology Test Facility, located at the Mamquam Wastewater Treatment Plant. The R...

  18. ENVIRONMENTAL TECHNOLOGY VERIFICATION: JOINT (NSF-EPA) VERIFICATION STATEMENT AND REPORT FOR THE REDUCTION OF NITROGEN IN DOMESTIC WASTEWATER FROM INDIVIDUAL HOMES, F. R. MAHONEY & ASSOC., AMPHIDROME SYSTEM FOR SINGLE FAMILY HOMES - 02/05/WQPC-SWP

    EPA Science Inventory

    Verification testing of the F.R. Mahoney Amphidrome System was conducted over a twelve month period at the Massachusetts Alternative Septic System Test Center (MASSTC) located at the Otis Air National Guard Base in Bourne, MA. Sanitary Sewerage from the base residential housing w...

  19. ENVIRONMENTAL TECHNOLOGY VERIFICATION: JOINT (NSF-EPA) VERIFICATION STATEMENT AND REPORT FOR THE REDUCTION OF NITROGEN IN DOMESTIC WASTEWATER FROM INDIVIDUAL HOMES, SEPTITECH, INC. MODEL 400 SYSTEM - 02/04/WQPC-SWP

    EPA Science Inventory

    Verification testing of the SeptiTech Model 400 System was conducted over a twelve month period at the Massachusetts Alternative Septic System Test Center (MASSTC) located at the Otis Air National Guard Base in Bourne, MA. Sanitary Sewerage from the base residential housing was u...

  20. ENVIRONMENTAL TECHNOLOGY VERIFICATION: JOINT (NSF-EPA) VERIFICATION STATEMENT AND REPORT FOR THE REDUCTION OF NITROGEN IN DOMESTIC WASTEWATER FROM INDIVIDUAL HOMES, AQUAPOINT, INC. BIOCLERE MODEL 16/12 - 02/02/WQPC-SWP

    EPA Science Inventory

    Verification testing of the Aquapoint, Inc. (AQP) BioclereTM Model 16/12 was conducted over a thirteen month period at the Massachusetts Alternative Septic System Test Center (MASSTC), located at Otis Air National Guard Base in Bourne, Massachusetts. Sanitary sewerage from the ba...

  1. ENVIRONMENTAL TECHNOLOGY VERIFICATION: JOINT (NSF-EPA) VERIFICATION STATEMENT AND REPORT FOR THE REDUCTION OF NITROGEN IN DOMESTIC WASTEWATER FROM INDIVIDUAL RESIDENTIAL HOMES, WATERLOO BIOFILTER® MODEL 4-BEDROOM (NSF 02/03/WQPC-SWP)

    EPA Science Inventory

    Verification testing of the Waterloo Biofilter Systems (WBS), Inc. Waterloo Biofilter® Model 4-Bedroom system was conducted over a thirteen month period at the Massachusetts Alternative Septic System Test Center (MASSTC) located at Otis Air National Guard Base in Bourne, Mas...

  2. A tracking and verification system implemented in a clinical environment for partial HIPAA compliance

    NASA Astrophysics Data System (ADS)

    Guo, Bing; Documet, Jorge; Liu, Brent; King, Nelson; Shrestha, Rasu; Wang, Kevin; Huang, H. K.; Grant, Edward G.

    2006-03-01

    The paper describes the methodology for the clinical design and implementation of a Location Tracking and Verification System (LTVS) that has distinct benefits for the Imaging Department at the Healthcare Consultation Center II (HCCII), an outpatient imaging facility located on the USC Health Science Campus. A novel system for tracking and verification of patients and staff in a clinical environment using wireless and facial biometric technology to monitor and automatically identify patients and staff was developed in order to streamline patient workflow, protect against erroneous examinations and create a security zone to prevent and audit unauthorized access to patient healthcare data under the HIPAA mandate. This paper describes the system design and integration methodology based on initial clinical workflow studies within a clinical environment. An outpatient center was chosen as an initial first step for the development and implementation of this system.

  3. Hierarchical Representation Learning for Kinship Verification.

    PubMed

    Kohli, Naman; Vatsa, Mayank; Singh, Richa; Noore, Afzel; Majumdar, Angshul

    2017-01-01

    Kinship verification has a number of applications such as organizing large collections of images and recognizing resemblances among humans. In this paper, first, a human study is conducted to understand the capabilities of human mind and to identify the discriminatory areas of a face that facilitate kinship-cues. The visual stimuli presented to the participants determine their ability to recognize kin relationship using the whole face as well as specific facial regions. The effect of participant gender and age and kin-relation pair of the stimulus is analyzed using quantitative measures such as accuracy, discriminability index d' , and perceptual information entropy. Utilizing the information obtained from the human study, a hierarchical kinship verification via representation learning (KVRL) framework is utilized to learn the representation of different face regions in an unsupervised manner. We propose a novel approach for feature representation termed as filtered contractive deep belief networks (fcDBN). The proposed feature representation encodes relational information present in images using filters and contractive regularization penalty. A compact representation of facial images of kin is extracted as an output from the learned model and a multi-layer neural network is utilized to verify the kin accurately. A new WVU kinship database is created, which consists of multiple images per subject to facilitate kinship verification. The results show that the proposed deep learning framework (KVRL-fcDBN) yields the state-of-the-art kinship verification accuracy on the WVU kinship database and on four existing benchmark data sets. Furthermore, kinship information is used as a soft biometric modality to boost the performance of face verification via product of likelihood ratio and support vector machine based approaches. Using the proposed KVRL-fcDBN framework, an improvement of over 20% is observed in the performance of face verification.

  4. Hierarchical Representation Learning for Kinship Verification.

    PubMed

    Kohli, Naman; Vatsa, Mayank; Singh, Richa; Noore, Afzel; Majumdar, Angshul

    2016-09-14

    Kinship verification has a number of applications such as organizing large collections of images and recognizing resemblances among humans. In this research, first, a human study is conducted to understand the capabilities of the human mind and to identify the discriminatory areas of a face that facilitate kinship-cues. The visual stimuli presented to the participants determine their ability to recognize kin relationship using the whole face as well as specific facial regions. The effect of participant gender and age and kin-relation pair of the stimulus is analyzed using quantitative measures such as accuracy, discriminability index d', and perceptual information entropy. Utilizing the information obtained from the human study, a hierarchical Kinship Verification via Representation Learning (KVRL) framework is utilized to learn the representation of different face regions in an unsupervised manner. We propose a novel approach for feature representation termed as filtered contractive deep belief networks (fcDBN). The proposed feature representation encodes relational information present in images using filters and contractive regularization penalty. A compact representation of facial images of kin is extracted as an output from the learned model and a multi-layer neural network is utilized to verify the kin accurately. A new WVU Kinship Database is created, which consists of multiple images per subject to facilitate kinship verification. The results show that the proposed deep learning framework (KVRL-fcDBN) yields state-of-the-art kinship verification accuracy on the WVU Kinship database and on four existing benchmark datasets. Further, kinship information is used as a soft biometric modality to boost the performance of face verification via product of likelihood ratio and support vector machine based approaches. Using the proposed KVRL-fcDBN framework, an improvement of over 20% is observed in the performance of face verification.

  5. 37 CFR 380.6 - Verification of royalty payments.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... TRANSMISSIONS, NEW SUBSCRIPTION SERVICES AND THE MAKING OF EPHEMERAL REPRODUCTIONS § 380.6 Verification of... purpose of the audit. The Collective shall retain the report of the verification for a period of not...

  6. Generic Protocol for the Verification of Ballast Water Treatment Technology

    EPA Science Inventory

    In anticipation of the need to address performance verification and subsequent approval of new and innovative ballast water treatment technologies for shipboard installation, the U.S Coast Guard and the Environmental Protection Agency‘s Environmental Technology Verification Progr...

  7. Jet Propulsion Laboratory Environmental Verification Processes and Test Effectiveness

    NASA Technical Reports Server (NTRS)

    Hoffman, Alan R.; Green, Nelson W.

    2006-01-01

    Viewgraphs on the JPL processes for environmental verification and testing of aerospace systems are presented. The topics include: 1) Processes: a) JPL Design Principles b) JPL Flight Project Practices; 2) Environmental Verification; and 3) Test Effectiveness Assessment: Inflight Anomaly Trends.

  8. Dose Verification of Stereotactic Radiosurgery Treatment for Trigeminal Neuralgia with Presage 3D Dosimetry System

    NASA Astrophysics Data System (ADS)

    Wang, Z.; Thomas, A.; Newton, J.; Ibbott, G.; Deasy, J.; Oldham, M.

    2010-11-01

    Achieving adequate verification and quality-assurance (QA) for radiosurgery treatment of trigeminal-neuralgia (TGN) is particularly challenging because of the combination of very small fields, very high doses, and complex irradiation geometries (multiple gantry and couch combinations). TGN treatments have extreme requirements for dosimetry tools and QA techniques, to ensure adequate verification. In this work we evaluate the potential of the Presage/Optical-CT dosimetry system as a tool for the verification of TGN distributions in high resolution and in 3D. A TGN treatment was planned and delivered to a Presage 3D dosimeter positioned inside the Radiological Physics Center (RPC) head and neck IMRT credentialing phantom. A 6-arc treatment plan was created using the iPlan system, and a maximum dose of 80 Gy was delivered with a Varian Trilogy machine. The delivered dose to Presage was determined by optical-CT scanning using the Duke Large field-of-view Optical-CT Scanner (DLOS) in 3D, with isotropic resolution of 0.7 mm³. DLOS scanning and reconstruction took about 20 minutes. 3D dose comparisons were made with the planning system. Good agreement was observed between the planned and measured 3D dose distributions, and this work provides strong support for the viability of Presage/Optical-CT as a highly useful new approach for verification of this complex technique.

  9. Ultra-wideband Location Authentication for Item Tracking

    SciTech Connect

    Rowe, Nathan C; Kuhn, Michael J; Stinson, Brad J; Holland, Stephen A

    2012-01-01

    International safeguards is increasingly utilizing unattended and remote monitoring methods to improve inspector efficiency and the timeliness of diversion detection. Item identification and tracking has been proposed as one unattended remote monitoring method, and a number of radio-frequency (RF) technologies have been proposed. When utilizing location information for verification purposes, strong assurance of the authenticity of the reported location is required, but most commercial RF systems are vulnerable to a variety of spoofing and relay attacks. ORNL has developed a distance bounding method that uses ultra-wideband technology to provide strong assurance of item location. This distance bounding approach can be coupled with strong symmetric key authentication methods to provide a fully authenticable tracking system that is resistant to both spoofing and relay attacks. This paper will discuss the overall problems associated with RF tracking including the common spoofing and relay attack scenarios, the ORNL distance bounding approach for authenticating location, and the potential applications for this technology.
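
    The physical basis of distance bounding is that a radio reply cannot arrive sooner than the speed of light allows, so a timed challenge-response yields an upper bound on the prover's range that a relay attack can lengthen but never shorten. A minimal sketch of that bound (invented parameters; not ORNL's implementation):

        C = 299_792_458.0  # speed of light, m/s

        def range_upper_bound(t_round_s, t_reply_s):
            """Everything in the round trip beyond the prover's fixed reply
            time must be time of flight, bounding the prover's distance."""
            return C * (t_round_s - t_reply_s) / 2.0

        def location_claim_ok(claimed_m, t_round_s, t_reply_s, margin_m=1.0):
            """Accept a claimed range only if consistent with the bound."""
            return claimed_m <= range_upper_bound(t_round_s, t_reply_s) + margin_m

        # Example: a 66.7 ns round trip with zero reply time bounds the
        # prover's range to about 10 m.
        print(range_upper_bound(66.7e-9, 0.0))   # ~10.0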

  10. The formal verification of generic interpreters

    NASA Technical Reports Server (NTRS)

    Windley, P.; Levitt, K.; Cohen, G. C.

    1991-01-01

    The task assignment 3 of the design and validation of digital flight control systems suitable for fly-by-wire applications is studied. Task 3 is associated with formal verification of embedded systems. In particular, results are presented that provide a methodological approach to microprocessor verification. A hierarchical decomposition strategy for specifying microprocessors is also presented. A theory of generic interpreters is presented that can be used to model microprocessor behavior. The generic interpreter theory abstracts away the details of instruction functionality, leaving a general model of what an interpreter does.

  11. Verification of Plan Models Using UPPAAL

    NASA Technical Reports Server (NTRS)

    Khatib, Lina; Muscettola, Nicola; Haveland, Klaus; Lau, Sonic (Technical Monitor)

    2001-01-01

    This paper describes work on the verification of HSTS, the planner and scheduler of the Remote Agent autonomous control system deployed in Deep Space 1 (DS1). The verification is done using UPPAAL, a real time model checking tool. We start by motivating our work in the introduction. Then we give a brief description of HSTS and UPPAAL. After that, we give a mapping of HSTS models into UPPAAL and we present samples of plan model properties one may want to verify. Finally, we conclude with a summary.

  12. Challenges in High-Assurance Runtime Verification

    NASA Technical Reports Server (NTRS)

    Goodloe, Alwyn E.

    2016-01-01

    Safety-critical systems are growing more complex and becoming increasingly autonomous. Runtime Verification (RV) has the potential to provide protections when a system cannot be assured by conventional means, but only if the RV itself can be trusted. In this paper, we proffer a number of challenges to realizing high-assurance RV and illustrate how we have addressed them in our research. We argue that high-assurance RV provides a rich target for automated verification tools in hope of fostering closer collaboration among the communities.

  13. Verification Of Tooling For Robotic Welding

    NASA Technical Reports Server (NTRS)

    Osterloh, Mark R.; Sliwinski, Karen E.; Anderson, Ronald R.

    1991-01-01

    Computer simulations, robotic inspections, and visual inspections performed to detect discrepancies. Method for verification of tooling for robotic welding involves combination of computer simulations and visual inspections. Verification process ensures accuracy of mathematical model representing tooling in off-line programming system that numerically simulates operation of robotic welding system. Process helps prevent damaging collisions between welding equipment and workpiece, ensures tooling positioned and oriented properly with respect to workpiece, and/or determines whether tooling to be modified or adjusted to achieve foregoing objectives.

  14. Electric power system test and verification program

    NASA Technical Reports Server (NTRS)

    Rylicki, Daniel S.; Robinson, Frank, Jr.

    1994-01-01

    Space Station Freedom's (SSF's) electric power system (EPS) hardware and software verification is performed at all levels of integration, from components to assembly and system level tests. Careful planning is essential to ensure the EPS is tested properly on the ground prior to launch. The results of the test performed on breadboard model hardware and analyses completed to date have been evaluated and used to plan for design qualification and flight acceptance test phases. These results and plans indicate the verification program for SSF's 75-kW EPS would have been successful and completed in time to support the scheduled first element launch.

  15. On Backward-Style Anonymity Verification

    NASA Astrophysics Data System (ADS)

    Kawabe, Yoshinobu; Mano, Ken; Sakurada, Hideki; Tsukada, Yasuyuki

    Many Internet services and protocols should guarantee anonymity; for example, an electronic voting system should guarantee to prevent the disclosure of who voted for which candidate. To prove trace anonymity, which is an extension of the formulation of anonymity by Schneider and Sidiropoulos, this paper presents an inductive method based on backward anonymous simulations. We show that the existence of an image-finite backward anonymous simulation implies trace anonymity. We also demonstrate the anonymity verification of an e-voting protocol (the FOO protocol) with our backward anonymous simulation technique. When proving the trace anonymity, this paper employs a computer-assisted verification tool based on a theorem prover.

  16. Source Identification and Location Techniques

    NASA Technical Reports Server (NTRS)

    Weir, Donald; Bridges, James; Agboola, Femi; Dougherty, Robert

    2001-01-01

    Mr. Weir presented source location results obtained from an engine test as part of the Engine Validation of Noise Reduction Concepts program. Two types of microphone arrays were used in this program to determine the jet noise source distribution for the exhaust from a 4.3 bypass ratio turbofan engine. One was a linear array of 16 microphones located on a 25 ft. sideline and the other was a 103 microphone 3-D "cage" array in the near field of the jet. Data were obtained from a baseline nozzle and from numerous nozzle configurations using chevrons and/or tabs to reduce the jet noise. Mr. Weir presented data from two configurations: the baseline nozzle and a nozzle configuration with chevrons on both the core and bypass nozzles. This chevron configuration had achieved a jet noise reduction of 4 EPNdB in small scale tests conducted at the Glenn Research Center. IR imaging showed that the chevrons produced significant improvements in mixing and greatly reduced the length of the jet potential core. Comparison of source location data from the 1-D phased array showed a shift of the noise sources towards the nozzle and clear reductions of the sources due to the noise reduction devices. Data from the 3-D array showed a single source at a frequency of 125 Hz, located several diameters downstream from the nozzle exit. At 250 and 400 Hz, multiple sources, periodically spaced, appeared to exist downstream of the nozzle. The trend of source location moving toward the nozzle exit with increasing frequency was also observed. The 3-D array data also showed a reduction in source strength with the addition of chevrons. The overall trend of source location with frequency was compared for the two arrays and with classical experience. Similar trends were observed. Although overall trends with frequency and addition of suppression devices were consistent between the data from the 1-D and the 3-D arrays, a comparison of the details of the inferred source locations did show differences. A

  17. Earthquake location in island arcs

    USGS Publications Warehouse

    Engdahl, E.R.; Dewey, J.W.; Fujita, K.

    1982-01-01

    A comprehensive data set of selected teleseismic P-wave arrivals and local-network P- and S-wave arrivals from large earthquakes occurring at all depths within a small section of the central Aleutians is used to examine the general problem of earthquake location in island arcs. Reference hypocenters for this special data set are determined for shallow earthquakes from local-network data and for deep earthquakes from combined local and teleseismic data by joint inversion for structure and location. The high-velocity lithospheric slab beneath the central Aleutians may displace hypocenters that are located using spherically symmetric Earth models; the amount of displacement depends on the position of the earthquakes with respect to the slab and on whether local or teleseismic data are used to locate the earthquakes. Hypocenters for trench and intermediate-depth events appear to be minimally biased by the effects of slab structure on rays to teleseismic stations. However, locations of intermediate-depth events based on only local data are systematically displaced southwards, the magnitude of the displacement being proportional to depth. Shallow-focus events along the main thrust zone, although well located using only local-network data, are severely shifted northwards and deeper, with displacements as large as 50 km, by slab effects on teleseismic travel times. Hypocenters determined by a method that utilizes seismic ray tracing through a three-dimensional velocity model of the subduction zone, derived by thermal modeling, are compared to results obtained by the method of joint hypocenter determination (JHD), which formally assumes a laterally homogeneous velocity model over the source region and treats all raypath anomalies as constant station corrections to the travel-time curve. The ray-tracing method has the theoretical advantage that it accounts for variations in travel-time anomalies within a group of events distributed over a sizable region of a dipping, high-velocity slab.
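
    For contrast with the slab-corrected methods discussed above, the classical baseline is Geiger-style iterative least squares in a laterally homogeneous model, whose bias the study quantifies. A minimal sketch under that assumption (uniform P-wave speed, hypothetical names):

```python
import numpy as np

V = 6.0  # assumed uniform P-wave speed, km/s (homogeneous model)

def locate(stations, t_obs, x0, n_iter=10):
    """Geiger-style iterative least squares for a hypocenter.

    stations : (n, 3) station coordinates, km
    t_obs    : (n,) observed P arrival times, s
    x0       : initial guess (x, y, z, t0)
    """
    x = np.asarray(x0, dtype=float)
    for _ in range(n_iter):
        d = np.linalg.norm(stations - x[:3], axis=1)  # source-station ranges
        t_pred = x[3] + d / V
        r = t_obs - t_pred                            # travel-time residuals
        # Jacobian of predicted times w.r.t. (x, y, z, t0)
        G = np.hstack([-(stations - x[:3]) / (V * d[:, None]),
                       np.ones((len(d), 1))])
        dx, *_ = np.linalg.lstsq(G, r, rcond=None)
        x += dx                                       # Newton/Gauss step
    return x  # (x, y, z, origin time)
```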

  18. Interim Letter Report - Verification Survey of 19 Grids in the Lester Flat Area, David Witherspoon Inc. 1630 Site Knoxville, Tennessee

    SciTech Connect

    P.C. Weaver

    2008-10-17

    Perform verification surveys of the 19 available grids located in the Lester Flat Area at the David Witherspoon Site. The survey grids included E11, E12, E13, F11, F12, F13, F14, F15, G15, G16, G17, H16, H17, H18, X16, X17, X18, K16, and J16.

  19. Confederation Verification, Validation, and Accreditation Master Plan (CVVAMP) - Confederation 1994 Integrated Test Plan. Enclosure 2.

    DTIC Science & Technology

    1994-04-01

    ... while still maintaining altitude Y. 4) Fly the mission outside the lateral limits of the CBS playbox while still maintaining altitude Y. 5) BINGO ... BINGO RESA mission. TEST VERIFICATION: CBS: 2) Ensure that missions are not ghosted in CBS until they enter the lateral limits of the CBS playbox. 3) ... on subsequent game cycles. AWSIM: 1) Verify that the RESA mission is ghosted at the correct location in AWSIM. 2) Verify that the ghosted RESA mission ...

  20. Integrated Verification Experiment data collected as part of the Los Alamos National Laboratory's Source Region Program

    SciTech Connect

    Fitzgerald, T.J.; Carlos, R.C.; Argo, P.E.

    1993-01-21

    As part of the Integrated Verification Experiment (IVE), we deployed a network of HF ionospheric sounders to detect the effects of acoustic waves generated by surface ground motion following underground nuclear tests at the Nevada Test Site. The network sampled up to four geographic locations in the ionosphere, from almost directly overhead of the surface ground zero out to a horizontal range of 60 km. We present sample results for four of the IVEs: Misty Echo, Texarkana, Mineral Quarry, and Bexar.

  1. A Verification of MCNP6 FMESH Tally Capabilities

    SciTech Connect

    Swift, Alicia L.; McKigney, Edward A.; Schirato, Richard C.; Robinson, Alex Philip; Temple, Brian Allen

    2015-02-10

    This work serves to verify the MCNP6 FMESH capability through comparison to two types of data. FMESH tallies, binned in time, were generated on an ideal detector face for neutrons undergoing a single scatter in a graphite target. For verification, FMESH results were compared to analytic calculations of the nonrelativistic TOF for elastic and inelastic single neutron scatters (TOF, for the purposes of this paper, is the time for a neutron to travel from its scatter location in the graphite target to the detector face). FMESH tally results were also compared to F4 tally results; the F4 tally is an MCNP tally that calculates fluence in the same way as the FMESH tally. The FMESH tally results agree well with the analytic results and the F4 tally; hence, it is believed that, for simple geometries, MCNP6 FMESH tallies represent the physics of neutron scattering very well.
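
    The analytic check described here is straightforward to reproduce: convert the scattered neutron's energy to a speed nonrelativistically and divide the scatter-to-detector distance by it. A small sketch under those assumptions (standard two-body elastic kinematics; the numerical values are purely illustrative, not the paper's test case):

```python
import math

C = 2.99792458e8   # speed of light, m/s
MN_C2 = 939.565    # neutron rest energy, MeV

def neutron_tof(energy_mev, distance_m):
    """Nonrelativistic TOF from scatter point to detector: t = L / v,
    with v = c * sqrt(2E / (m_n c^2))."""
    v = C * math.sqrt(2.0 * energy_mev / MN_C2)
    return distance_m / v

def elastic_scatter_energy(e_in_mev, cos_theta_cm, a=12):
    """Outgoing energy after elastic scatter off a nucleus of mass number A
    (two-body kinematics, center-of-mass scattering cosine)."""
    return e_in_mev * (a * a + 2.0 * a * cos_theta_cm + 1.0) / (a + 1.0) ** 2

# Example: 2.5 MeV neutron scattering off carbon at 90 degrees (CM),
# with the detector face 1 m from the scatter point.
e_out = elastic_scatter_energy(2.5, 0.0)
print(neutron_tof(e_out, 1.0) * 1e9, "ns")
```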

  2. Methods for identification and verification using vacuum XRF system

    NASA Technical Reports Server (NTRS)

    Schramm, Fred (Inventor); Kaiser, Bruce (Inventor)

    2005-01-01

    Apparatus and methods in which one or more elemental taggants intrinsically located in an object are detected by X-ray fluorescence analysis under vacuum conditions to identify or verify the object's elemental content for elements with lower atomic numbers. Because they rely on X-ray fluorescence analysis, the apparatus and methods of the invention are simple and easy to use, and they provide non-line-of-sight detection for establishing the origin of objects, their point of manufacture, authenticity, verification, security, and the presence of impurities. The invention is extremely advantageous because it provides the capability to measure lower-atomic-number elements in the field with a portable instrument.

  3. RFI emitter location techniques

    NASA Technical Reports Server (NTRS)

    Rao, B. L. J.

    1973-01-01

    The possibility is discussed of using Doppler techniques to determine the location of ground-based emitters causing radio-frequency interference with low-orbiting satellites. An error analysis indicates that it is possible to find the emitter location within an error range of 2 n.mi. The parameters that determine the required satellite receiver characteristics are discussed briefly, along with the non-real-time signal processing that may be used in obtaining the Doppler curve. Finally, the required characteristics of the satellite antenna are analyzed.
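
    The underlying estimator can be sketched simply: predict the Doppler curve the satellite would observe for a candidate emitter position, and pick the candidate whose curve best fits the measurements. A minimal least-squares grid search under those assumptions, with all names illustrative:

```python
import numpy as np

C = 2.99792458e8  # speed of light, m/s

def doppler_curve(sat_pos, sat_vel, emitter, f0):
    """Predicted Doppler shift along a satellite pass.

    sat_pos : (n, 3) satellite positions, m
    sat_vel : (n, 3) satellite velocities, m/s
    emitter : (3,) candidate emitter location, m
    f0      : emitter carrier frequency, Hz
    """
    los = sat_pos - emitter
    rng = np.linalg.norm(los, axis=1)
    range_rate = np.sum(sat_vel * los, axis=1) / rng  # d|r|/dt
    return -f0 * range_rate / C                       # shift > 0 when closing

def locate_emitter(sat_pos, sat_vel, f_meas, f0, candidates):
    """Grid search: choose the candidate whose predicted Doppler curve
    best matches the measured curve in the least-squares sense."""
    costs = [np.sum((f_meas - doppler_curve(sat_pos, sat_vel, c, f0)) ** 2)
             for c in candidates]
    return candidates[int(np.argmin(costs))]
```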

  4. Marine cable location system

    SciTech Connect

    Zachariadis, R.G.

    1984-05-01

    An acoustic positioning system locates a marine cable at an exploration site, such cable employing a plurality of hydrophones at spaced-apart positions along the cable. A marine vessel measures water depth to the cable as the vessel passes over the cable and interrogates the hydrophones with sonar pulses along a slant range as the vessel travels in a parallel and horizontally offset path to the cable. The location of the hydrophones is determined from the recordings of water depth and slant range.
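
    The geometry reduces to a right triangle: the measured slant range is the hypotenuse and the water depth one leg, so the horizontal offset of each hydrophone from the vessel track follows directly. A minimal sketch:

```python
import math

def hydrophone_offset(slant_range_m, depth_m):
    """Horizontal offset from the vessel track to a hydrophone, from the
    sonar slant range and measured water depth (Pythagorean geometry)."""
    return math.sqrt(slant_range_m ** 2 - depth_m ** 2)

# Example: 250 m slant range with 200 m water depth -> 150 m offset
print(hydrophone_offset(250.0, 200.0))
```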

  5. Weather model verification using Sodankylä mast measurements

    NASA Astrophysics Data System (ADS)

    Kangas, Markku; Rontu, Laura; Fortelius, Carl; Aurela, Mika; Poikonen, Antti

    2016-04-01

    Sodankylä, in the heart of the Arctic Research Centre of the Finnish Meteorological Institute (FMI ARC) in northern Finland, is an ideal site for atmospheric and environmental research in the boreal and sub-Arctic zone. With temperatures ranging from -50 to +30 °C, it provides a challenging testing ground for numerical weather prediction (NWP) models as well as weather forecasting in general. An extensive set of measurements has been carried out in Sodankylä for more than 100 years. In 2000, a 48-m micrometeorological mast was erected in the area. In this article, the use of Sodankylä mast measurements in NWP model verification is described. Starting in 2000 with the NWP model HIRLAM and Sodankylä measurements, the verification system has now been expanded to include comparisons between 12 NWP models and seven measurement masts distributed across Europe. A case study comparing forecasted and observed radiation fluxes is also presented. It was found that three different radiation schemes, applicable in the NWP model HARMONIE-AROME, produced somewhat different downwelling longwave radiation fluxes during cloudy days, which, however, did not change the overall cold bias of the predicted screen-level temperature.
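
    Mast-based point verification of this kind usually reduces to computing standard scores (bias, RMSE, MAE) between the forecast and the observed series at matched times. A minimal sketch with hypothetical values; this is not FMI's verification code:

```python
import numpy as np

def verify(forecast, observed):
    """Standard point-verification scores against mast observations."""
    err = np.asarray(forecast, dtype=float) - np.asarray(observed, dtype=float)
    return {
        "bias": err.mean(),                  # mean error (cold bias if < 0)
        "rmse": np.sqrt((err ** 2).mean()),  # root-mean-square error
        "mae": np.abs(err).mean(),           # mean absolute error
    }

# Example with hypothetical screen-level temperatures (degrees C)
print(verify([-12.1, -10.4, -8.9], [-11.0, -10.0, -9.5]))
```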

  6. Monte Carlo dose verification for intensity-modulated arc therapy

    NASA Astrophysics Data System (ADS)

    Li, X. Allen; Ma, Lijun; Naqvi, Shahid; Shih, Rompin; Yu, Cedric

    2001-09-01

    Intensity-modulated arc therapy (IMAT), a technique which combines beam rotation and dynamic multileaf collimation, has been implemented in our clinic. Dosimetric errors can be created by the inability of the planning system to accurately account for the effects of tissue inhomogeneities and physical characteristics of the multileaf collimator (MLC). The objective of this study is to explore the use of Monte Carlo (MC) simulation for IMAT dose verification. The BEAM/DOSXYZ Monte Carlo system was implemented to perform dose verification for the IMAT treatment. The implementation includes the simulation of the linac head/MLC (Elekta SL20), the conversion of patient CT images and beam arrangement for 3D dose calculation, the calculation of gantry rotation and leaf motion by a series of static beams and the development of software to automate the entire MC process. The MC calculations were verified by measurements for conventional beam settings. The agreement was within 2%. The IMAT dose distributions generated by a commercial forward planning system (RenderPlan, Elekta) were compared with those calculated by the MC package. For the cases studied, discrepancies of over 10% were found between the MC and the RenderPlan dose calculations. These discrepancies were due in part to the inaccurate dose calculation of the RenderPlan system. The computation time for the IMAT MC calculation was in the range of 20-80 min on 15 Pentium-III computers. The MC method was also useful in verifying the beam apertures used in the IMAT treatments.
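
    A simple way to quantify such discrepancies is a voxel-wise dose-difference pass rate, normalized to the maximum Monte Carlo dose and restricted to clinically relevant voxels. The sketch below illustrates that idea only; it is not the comparison metric used in the study (a clinical comparison would more likely use a full gamma analysis):

```python
import numpy as np

def dose_pass_rate(dose_mc, dose_tps, tol=0.02, low_dose_cut=0.1):
    """Fraction of voxels where the planning-system dose agrees with the
    Monte Carlo dose within a relative tolerance (global normalization),
    ignoring voxels below a low-dose threshold."""
    dose_mc = np.asarray(dose_mc, dtype=float)
    dose_tps = np.asarray(dose_tps, dtype=float)
    mask = dose_mc > low_dose_cut * dose_mc.max()   # clinically relevant voxels
    rel_diff = np.abs(dose_tps[mask] - dose_mc[mask]) / dose_mc.max()
    return float(np.mean(rel_diff <= tol))          # pass fraction in [0, 1]
```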

  7. Non-Invertible Transforms for Image-Based Verification

    SciTech Connect

    White, Timothy A.; Robinson, Sean M.; Jarman, Kenneth D.; Miller, Erin A.; Seifert, Allen; McDonald, Benjamin S.; Pitts, W. Karl; Misner, Alex C.

    2011-07-20

    Imaging may play a unique role in verifying the presence and distribution of warhead components in warhead counting and dismantlement settings where image information content can distinguish among shapes, forms, and material composition of items. However, a major issue with imaging is the high level of intrusiveness, and in particular, the possible need to store sensitive comparison images in the inspection system that would violate information barrier (IB) principles. Reducing images via transformations or feature extraction can produce image features (e.g. attributes) for verification, but with enough prior information about structure the reduced information itself may be sufficient to deduce sensitive details of the original image. Further reducing resolution of the transformed image information is an option, but too much reduction destroys the quality of the attribute. We study the possibility of a one-way transform that allows storage of non-sensitive reference information and analysis to enable comparison of transformed images within IB constraints. In particular, we consider the degree to which images can be reconstructed from image intensity histograms depending on the number of pixel intensity bins and the degree of frequency data quantization, as well as assumed knowledge of configuration of objects in the images. We also explore the concept of a 'perceptual hash' as a class of transforms that may enable verification with provable non-invertibility, leading to an effective one-way transform that preserves the nature of the image feature data without revealing sufficient information to reconstruct the original image.
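
    As a concrete illustration of a transform in this spirit, an average hash discards nearly all spatial detail while still supporting similarity comparison; inverting it to the original image is not possible because each bit records only whether a block mean exceeds the global mean. This is a minimal sketch of the generic technique, not the transform studied in the paper:

```python
import numpy as np

def average_hash(image, hash_size=8):
    """Average hash (aHash): block-mean downsample to hash_size x hash_size,
    threshold at the global mean, and pack the bits into bytes."""
    img = np.asarray(image, dtype=float)
    h, w = img.shape
    small = img[:h - h % hash_size, :w - w % hash_size]
    small = small.reshape(hash_size, small.shape[0] // hash_size,
                          hash_size, small.shape[1] // hash_size).mean(axis=(1, 3))
    bits = (small > small.mean()).astype(np.uint8)
    return np.packbits(bits.ravel())  # 64 bits -> 8 bytes for hash_size=8

def hamming(h1, h2):
    """Bit distance between two hashes (small = perceptually similar)."""
    return int(np.unpackbits(np.bitwise_xor(h1, h2)).sum())
```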

  8. 24 CFR 5.512 - Verification of eligible immigration status.

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ... immigration status. 5.512 Section 5.512 Housing and Urban Development Office of the Secretary, Department of... Noncitizens § 5.512 Verification of eligible immigration status. (a) General. Except as described in paragraph...) Primary verification—(1) Automated verification system. Primary verification of the immigration status...

  9. 24 CFR 5.512 - Verification of eligible immigration status.

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... immigration status. 5.512 Section 5.512 Housing and Urban Development Office of the Secretary, Department of... Noncitizens § 5.512 Verification of eligible immigration status. (a) General. Except as described in paragraph...) Primary verification—(1) Automated verification system. Primary verification of the immigration status...

  10. 24 CFR 5.512 - Verification of eligible immigration status.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... immigration status. 5.512 Section 5.512 Housing and Urban Development Office of the Secretary, Department of... Noncitizens § 5.512 Verification of eligible immigration status. (a) General. Except as described in paragraph...) Primary verification—(1) Automated verification system. Primary verification of the immigration status...

  11. 24 CFR 5.512 - Verification of eligible immigration status.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... immigration status. 5.512 Section 5.512 Housing and Urban Development Office of the Secretary, Department of... Noncitizens § 5.512 Verification of eligible immigration status. (a) General. Except as described in paragraph...) Primary verification—(1) Automated verification system. Primary verification of the immigration status...

  12. 24 CFR 5.512 - Verification of eligible immigration status.

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... immigration status. 5.512 Section 5.512 Housing and Urban Development Office of the Secretary, Department of... Noncitizens § 5.512 Verification of eligible immigration status. (a) General. Except as described in paragraph...) Primary verification—(1) Automated verification system. Primary verification of the immigration status...

  13. 40 CFR 1066.240 - Torque transducer verification.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... 40 Protection of Environment 33 2014-07-01 2014-07-01 false Torque transducer verification. 1066... POLLUTION CONTROLS VEHICLE-TESTING PROCEDURES Dynamometer Specifications § 1066.240 Torque transducer verification. Verify torque-measurement systems by performing the verifications described in §§ 1066.270...

  14. 40 CFR 1065.395 - Inertial PM balance verifications.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... 40 Protection of Environment 34 2012-07-01 2012-07-01 false Inertial PM balance verifications... Inertial PM balance verifications. This section describes how to verify the performance of an inertial PM balance. (a) Independent verification. Have the balance manufacturer (or a representative approved by...

  15. 40 CFR 1065.395 - Inertial PM balance verifications.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... 40 Protection of Environment 33 2011-07-01 2011-07-01 false Inertial PM balance verifications... Inertial PM balance verifications. This section describes how to verify the performance of an inertial PM balance. (a) Independent verification. Have the balance manufacturer (or a representative approved by...

  16. 40 CFR 1065.395 - Inertial PM balance verifications.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... 40 Protection of Environment 34 2013-07-01 2013-07-01 false Inertial PM balance verifications... Inertial PM balance verifications. This section describes how to verify the performance of an inertial PM balance. (a) Independent verification. Have the balance manufacturer (or a representative approved by...

  17. 40 CFR 1065.395 - Inertial PM balance verifications.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 40 Protection of Environment 32 2010-07-01 2010-07-01 false Inertial PM balance verifications... Inertial PM balance verifications. This section describes how to verify the performance of an inertial PM balance. (a) Independent verification. Have the balance manufacturer (or a representative approved by...

  18. 40 CFR 1065.395 - Inertial PM balance verifications.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... 40 Protection of Environment 33 2014-07-01 2014-07-01 false Inertial PM balance verifications... Inertial PM balance verifications. This section describes how to verify the performance of an inertial PM balance. (a) Independent verification. Have the balance manufacturer (or a representative approved by...

  19. 15 CFR 748.13 - Delivery Verification (DV).

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... 15 Commerce and Foreign Trade 2 2011-01-01 2011-01-01 false Delivery Verification (DV). 748.13... (CLASSIFICATION, ADVISORY, AND LICENSE) AND DOCUMENTATION § 748.13 Delivery Verification (DV). (a) Scope. (1) BIS may request the licensee to obtain verifications of delivery on a selective basis. A...

  20. 15 CFR 748.13 - Delivery Verification (DV).

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 15 Commerce and Foreign Trade 2 2010-01-01 2010-01-01 false Delivery Verification (DV). 748.13... (CLASSIFICATION, ADVISORY, AND LICENSE) AND DOCUMENTATION § 748.13 Delivery Verification (DV). (a) Scope. (1) BIS may request the licensee to obtain verifications of delivery on a selective basis. A...