Science.gov

Sample records for distributed location verification

  1. Collaborative Localization and Location Verification in WSNs

    PubMed Central

    Miao, Chunyu; Dai, Guoyong; Ying, Kezhen; Chen, Qingzhang

    2015-01-01

    Localization is one of the most important technologies in wireless sensor networks. A lightweight distributed node localization scheme is proposed that accounts for the limited computational capacity of WSNs. The proposed scheme introduces the virtual force model to determine the location by incremental refinement. Aiming at solving the drifting problem and the malicious anchor problem, a location verification algorithm based on the virtual force model is presented. In addition, an anchor promotion algorithm using the localization reliability model is proposed to re-locate the drifted nodes. Extended simulation experiments indicate that the localization algorithm has relatively high precision and the location verification algorithm has relatively high accuracy. The communication overhead of these algorithms is relatively low, and the whole set of reliable localization methods is practical as well as comprehensive. PMID:25954948
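
    As an illustration of the incremental-refinement idea, the following is a minimal sketch of a virtual-force position update (hypothetical force law and parameters; the paper's actual model is not reproduced here): each anchor pulls or pushes the node estimate until the estimated distances match the measured ranges.

    ```python
    import numpy as np

    def virtual_force_localize(anchors, ranges, x0, step=0.1, iters=500):
        """Refine a node position estimate with a simple virtual-force model.

        anchors : (n, 2) array of anchor coordinates
        ranges  : (n,) array of measured node-anchor distances
        x0      : (2,) initial position guess
        Each anchor exerts a 'spring' force proportional to the mismatch
        between its measured range and the current estimated distance.
        """
        x = np.asarray(x0, dtype=float)
        for _ in range(iters):
            diff = anchors - x                      # vectors from node to anchors
            dist = np.linalg.norm(diff, axis=1)
            dist = np.where(dist == 0, 1e-9, dist)  # avoid division by zero
            # positive mismatch pulls the node toward the anchor, negative pushes away
            force = ((dist - ranges) / dist)[:, None] * diff
            x += step * force.mean(axis=0)
        return x

    anchors = np.array([[0.0, 0.0], [10.0, 0.0], [0.0, 10.0]])
    true_pos = np.array([3.0, 4.0])
    ranges = np.linalg.norm(anchors - true_pos, axis=1)
    print(virtual_force_localize(anchors, ranges, x0=[5.0, 5.0]))
    ```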

  2. Verification of LHS distributions.

    SciTech Connect

    Swiler, Laura Painton

    2006-04-01

    This document provides verification test results for normal, lognormal, and uniform distributions that are used in Sandia's Latin Hypercube Sampling (LHS) software. The purpose of this testing is to verify that the sample values being generated in LHS are distributed according to the desired distribution types. The testing of distribution correctness is done by examining summary statistics, graphical comparisons using quantile-quantile plots, and formal statistical tests such as the Chi-square test, the Kolmogorov-Smirnov test, and the Anderson-Darling test. The overall results from the testing indicate that the generation of normal, lognormal, and uniform distributions in LHS is acceptable.
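
    The kind of distribution check described can be sketched with scipy; here the "LHS" sample is a stand-in drawn with numpy rather than Sandia's LHS output, and the lognormal parameterization is an assumption for the example.

    ```python
    import numpy as np
    from scipy import stats

    # Stand-in for an LHS sample; in the report the values come from the LHS code.
    rng = np.random.default_rng(0)
    sample = rng.lognormal(mean=1.0, sigma=0.5, size=1000)

    # Kolmogorov-Smirnov test against the intended lognormal distribution.
    # scipy's lognorm uses shape=sigma and scale=exp(mean).
    d_stat, p_value = stats.kstest(sample, "lognorm", args=(0.5, 0, np.exp(1.0)))
    print(f"KS statistic = {d_stat:.4f}, p-value = {p_value:.4f}")

    # Anderson-Darling test: scipy supports a limited set of distributions,
    # so test the log of the sample for normality instead.
    result = stats.anderson(np.log(sample), dist="norm")
    print(result.statistic, result.critical_values)
    ```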

  3. Protecting Privacy and Securing the Gathering of Location Proofs - The Secure Location Verification Proof Gathering Protocol

    NASA Astrophysics Data System (ADS)

    Graham, Michelle; Gray, David

    As wireless networks become increasingly ubiquitous, the demand for a method of locating a device has increased dramatically. Location Based Services are now commonplace, but there are few methods of verifying or guaranteeing a location provided by a user without specialised hardware, especially in larger scale networks. We propose a system for the verification of location claims, using proof gathered from neighbouring devices. In this paper we introduce a protocol to protect this proof-gathering process, protecting the privacy of all involved parties and securing it from intruders and malicious claiming devices. We present the protocol in stages, extending its security to allow for flexibility within its application. The Secure Location Verification Proof Gathering Protocol (SLVPGP) has been designed to function within the area of Vehicular Networks, although its application could be extended to any device with wireless & cryptographic capabilities.
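
    A toy sketch of the neighbour-endorsement idea (not the SLVPGP message format, which additionally protects the parties' identities): a neighbour vouches for a claim by MACing it with a key it shares with the verifier. All names and the key scheme here are hypothetical.

    ```python
    import hmac, hashlib, json, time

    def endorse_claim(neighbor_id: str, key: bytes, claim: dict) -> dict:
        """A neighbour endorses a device's location claim by MACing it with
        a key shared with the verifier (hypothetical scheme, not the SLVPGP
        wire format)."""
        payload = json.dumps(claim, sort_keys=True).encode()
        tag = hmac.new(key, payload, hashlib.sha256).hexdigest()
        return {"neighbor": neighbor_id, "claim": claim, "proof": tag}

    def verify_proof(key: bytes, proof: dict) -> bool:
        payload = json.dumps(proof["claim"], sort_keys=True).encode()
        expected = hmac.new(key, payload, hashlib.sha256).hexdigest()
        return hmac.compare_digest(expected, proof["proof"])

    claim = {"device": "car-42", "lat": 53.38, "lon": -6.59, "t": int(time.time())}
    shared_key = b"neighbor-verifier-shared-key"
    proof = endorse_claim("car-17", shared_key, claim)
    print(verify_proof(shared_key, proof))  # True
    ```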

  4. 37 CFR 384.7 - Verification of royalty distributions.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... BUSINESS ESTABLISHMENT SERVICES § 384.7 Verification of royalty distributions. (a) General. This section prescribes procedures by which any Copyright Owner may verify the royalty distributions made by the... Copyright Owner and the Collective have agreed as to proper verification methods. (b) Frequency...

  5. 37 CFR 384.7 - Verification of royalty distributions.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... BUSINESS ESTABLISHMENT SERVICES § 384.7 Verification of royalty distributions. (a) General. This section prescribes procedures by which any Copyright Owner may verify the royalty distributions made by the... Copyright Owner and the Collective have agreed as to proper verification methods. (b) Frequency...

  6. 37 CFR 380.16 - Verification of royalty distributions.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... Owner or Performer may verify the royalty distributions made by the Collective; provided, however, that nothing contained in this section shall apply to situations where a Copyright Owner or Performer and the Collective have agreed as to proper verification methods. (b) Frequency of verification. A Copyright Owner...

  7. 37 CFR 380.7 - Verification of royalty distributions.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... royalty distributions. (a) General. This section prescribes procedures by which any Copyright Owner or... contained in this section shall apply to situations where a Copyright Owner or Performer and the Collective have agreed as to proper verification methods. (b) Frequency of verification. A Copyright Owner...

  8. 37 CFR 380.7 - Verification of royalty distributions.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... procedures by which any Copyright Owner or Performer may verify the royalty distributions made by the... Copyright Owner or Performer and the Collective have agreed as to proper verification methods. (b) Frequency of verification. A Copyright Owner or Performer may conduct a single audit of the Collective...

  9. 37 CFR 380.26 - Verification of royalty distributions.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... which any Copyright Owner or Performer may verify the royalty distributions made by the Collective; provided, however, that nothing contained in this section shall apply to situations where a Copyright Owner... verification. A Copyright Owner or Performer may conduct a single audit of the Collective upon...

  10. Distributed Avionics and Software Verification for the Constellation Program

    NASA Technical Reports Server (NTRS)

    Hood, Laura E.; Adams, James E.

    2008-01-01

    This viewgraph presentation reviews the planned verification of the avionics and software being developed for the Constellation program. The Constellation Distributed System Integration Laboratory (DSIL) will consist of multiple System Integration Labs (SILs), Simulators, Emulators, Testbeds, and Control Centers interacting with each other over a broadband network to provide virtual test systems for multiple test scenarios.

  11. The Error Distribution of BATSE GRB Location

    NASA Technical Reports Server (NTRS)

    Briggs, Michael S.; Pendleton, Geoffrey N.; Kippen, R. Marc; Brainerd, J. J.; Hurley, Kevin; Connaughton, Valerie; Meegan, Charles A.

    1998-01-01

    We develop empirical probability models for BATSE GRB location errors by a Bayesian analysis of the separations between BATSE GRB locations and locations obtained with the InterPlanetary Network (IPN). Models are compared and their parameters estimated using 394 GRBs with single IPN annuli and 20 GRBs with intersecting IPN annuli. Most of the analysis is for the 4B (rev) BATSE catalog; earlier catalogs are also analyzed. The simplest model that provides a good representation of the error distribution has 78% of the locations in a 'core' term with a systematic error of 1.85 degrees and the remainder in an extended tail with a systematic error of 5.36 degrees, implying a 68% confidence region of 2.3 degrees for bursts with negligible statistical errors. There is some evidence for a more complicated model in which the error distribution depends on the BATSE datatype that was used to obtain the location. Bright bursts are typically located using the CONT datatype, and according to the more complicated model, the 68% confidence region for CONT-located bursts with negligible statistical errors is 2.0 degrees.
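
    The quoted 68% radius can be reproduced approximately by treating each term as a two-dimensional Gaussian in angular offset whose quoted systematic error is its own 68% containment radius (an assumption made for this sketch; the paper's exact functional form may differ) and solving the mixture containment for 0.68.

    ```python
    import numpy as np
    from scipy.optimize import brentq

    # Core/tail mixture from the 4B (rev) analysis: 78% core, 22% tail.
    W_CORE, R68_CORE, R68_TAIL = 0.78, 1.85, 5.36  # degrees

    # Assumption for this sketch: each quoted systematic error is the 68%
    # containment radius of a 2-D Gaussian component in angular offset.
    SIG_SCALE = np.sqrt(2.0 * np.log(1.0 / 0.32))  # 68% radius ~ 1.509 sigma

    def contained(r: float) -> float:
        """Probability that a location error falls within radius r (degrees)."""
        frac = 0.0
        for w, r68 in ((W_CORE, R68_CORE), (1.0 - W_CORE, R68_TAIL)):
            sigma = r68 / SIG_SCALE
            frac += w * (1.0 - np.exp(-r**2 / (2.0 * sigma**2)))
        return frac

    r68 = brentq(lambda r: contained(r) - 0.68, 0.1, 20.0)
    print(f"68% confidence radius ~ {r68:.2f} deg")  # close to the quoted 2.3 deg
    ```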

  12. Modeling and Verification of Distributed Generation and Voltage Regulation Equipment for Unbalanced Distribution Power Systems; Annual Subcontract Report, June 2007

    SciTech Connect

    Davis, M. W.; Broadwater, R.; Hambrick, J.

    2007-07-01

    This report summarizes the development of models for distributed generation and distribution circuit voltage regulation equipment for unbalanced power systems and their verification through actual field measurements.

  13. GENERIC VERIFICATION PROTOCOL: DISTRIBUTED GENERATION AND COMBINED HEAT AND POWER FIELD TESTING PROTOCOL

    EPA Science Inventory

    This report is a generic verification protocol by which EPA’s Environmental Technology Verification program tests newly developed equipment for distributed generation of electric power, usually micro-turbine generators and internal combustion engine generators. The protocol will ...

  14. Mobile agent location in distributed environments

    NASA Astrophysics Data System (ADS)

    Fountoukis, S. G.; Argyropoulos, I. P.

    2012-12-01

    An agent is a small program acting on behalf of a user or an application which plays the role of a user. Artificial intelligence can be encapsulated in agents so that they are capable of both behaving autonomously and showing an elementary decision ability regarding movement and some specific actions. Therefore they are often called autonomous mobile agents. In a distributed system, they can move themselves from one processing node to another through the interconnecting network infrastructure. Their purpose is to collect useful information and to carry it back to their user. Also, agents are used to start, monitor and stop processes running on the individual interconnected processing nodes of computer cluster systems. An agent has a unique id to discriminate itself from other agents, and a current position. The position can be expressed as the address of the processing node which currently hosts the agent. Very often, it is necessary for a user, a processing node or another agent to know the current position of an agent in a distributed system. Several procedures and algorithms have been proposed for locating the positions of mobile agents. The most basic of all employs a fixed computing node, which acts as an agent position repository, receiving messages from all the moving agents and keeping records of their current positions. The fixed node responds to position queries and informs users, other nodes and other agents about the position of an agent. Herein, a model is proposed that considers pairs and triples of agents instead of single ones. A location method, which is investigated in this paper, attempts to exploit this model.
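
    The basic fixed-repository scheme described above can be sketched in a few lines (hypothetical class and method names):

    ```python
    import threading

    class AgentLocationRegistry:
        """Fixed-node position repository for mobile agents (a minimal
        sketch of the basic scheme described above, not a full protocol)."""

        def __init__(self):
            self._positions = {}            # agent id -> hosting node address
            self._lock = threading.Lock()   # registry is updated concurrently

        def report_move(self, agent_id: str, node_address: str) -> None:
            """Called by an agent after migrating to a new processing node."""
            with self._lock:
                self._positions[agent_id] = node_address

        def locate(self, agent_id: str) -> str | None:
            """Answer a position query from a user, node, or another agent."""
            with self._lock:
                return self._positions.get(agent_id)

    registry = AgentLocationRegistry()
    registry.report_move("agent-7", "node-03.cluster.local")
    print(registry.locate("agent-7"))  # node-03.cluster.local
    ```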

  15. DOE-EPRI distributed wind Turbine Verification Program (TVP III)

    SciTech Connect

    McGowin, C.; DeMeo, E.; Calvert, S.

    1997-12-31

    In 1992, the Electric Power Research Institute (EPRI) and the U.S. Department of Energy (DOE) initiated the Utility Wind Turbine Verification Program (TVP). The goal of the program is to evaluate prototype advanced wind turbines at several sites developed by U.S. electric utility companies. Two six MW wind projects have been installed under the TVP program by Central and South West Services in Fort Davis, Texas and Green Mountain Power Corporation in Searsburg, Vermont. In early 1997, DOE and EPRI selected five more utility projects to evaluate distributed wind generation using smaller "clusters" of wind turbines connected directly to the electricity distribution system. This paper presents an overview of the objectives, scope, and status of the EPRI-DOE TVP program and the existing and planned TVP projects.

  16. Verification and translation of distributed computing system software design

    SciTech Connect

    Chen, J.N.

    1987-01-01

    A methodology for generating a distributed computing system application program from a design specification based on modified Petri nets is presented. There are four major stages in this methodology: (1) building a structured graphics specification model, (2) verifying abstract data types and detecting deadlock in the model, (3) defining communication among individual processes within the model, and (4) translating the symbolic representation into a program in a specified high-level target language. In this dissertation, Ada is used as the specified high-level target language. The structured graphics promote intelligibility because hierarchical decomposition into functional modules is encouraged and the behavior of each process can be easily extracted from the net as a separate view of the system. The formal method described in this dissertation uses a symbolic representation of the design specification of distributed computing systems. This symbolic representation is then translated into an equivalent Ada program structure, especially with the features of concurrency and synchronization. Artificial intelligence techniques are employed to verify properties and to detect deadlock in a distributed computing system environment. For verification, the axioms of abstract data types are translated into PROLOG clauses and inquiries are tested to prove the correctness of the abstract data types.

  17. Design and Verification of a Distributed Communication Protocol

    NASA Technical Reports Server (NTRS)

    Munoz, Cesar A.; Goodloe, Alwyn E.

    2009-01-01

    The safety of remotely operated vehicles depends on the correctness of the distributed protocol that facilitates the communication between the vehicle and the operator. A failure in this communication can result in catastrophic loss of the vehicle. To complicate matters, the communication system may be required to satisfy several, possibly conflicting, requirements. The design of protocols is typically an informal process based on successive iterations of a prototype implementation. Yet distributed protocols are notoriously difficult to get correct using such informal techniques. We present a formal specification of the design of a distributed protocol intended for use in a remotely operated vehicle, which is built from the composition of several simpler protocols. We demonstrate proof strategies that allow us to prove properties of each component protocol individually while ensuring that the property is preserved in the composition forming the entire system. Given that designs are likely to evolve as additional requirements emerge, we show how we have automated most of the repetitive proof steps to enable verification of rapidly changing designs.

  18. Reconstructing Spatial Distributions from Anonymized Locations

    SciTech Connect

    Horey, James L; Forrest, Stephanie; Groat, Michael

    2012-01-01

    Devices such as mobile phones, tablets, and sensors are often equipped with GPS that accurately report a person's location. Combined with wireless communication, these devices enable a wide range of new social tools and applications. These same qualities, however, leave location-aware applications vulnerable to privacy violations. This paper introduces the Negative Quad Tree, a privacy protection method for location aware applications. The method is broadly applicable to applications that use spatial density information, such as social applications that measure the popularity of social venues. The method employs a simple anonymization algorithm running on mobile devices, and a more complex reconstruction algorithm on a central server. This strategy is well suited to low-powered mobile devices. The paper analyzes the accuracy of the reconstruction method in a variety of simulated and real-world settings and demonstrates that the method is accurate enough to be used in many real-world scenarios.

  19. Distributed Engine Control Empirical/Analytical Verification Tools

    NASA Technical Reports Server (NTRS)

    DeCastro, Jonathan; Hettler, Eric; Yedavalli, Rama; Mitra, Sayan

    2013-01-01

    NASA's vision for an intelligent engine will be realized with the development of a truly distributed control system featuring highly reliable, modular, and dependable components capable of both surviving the harsh engine operating environment and decentralized functionality. A set of control system verification tools was developed and applied to a C-MAPSS40K engine model, and metrics were established to assess the stability and performance of these control systems on the same platform. A software tool was developed that allows designers to assemble easily a distributed control system in software and immediately assess the overall impacts of the system on the target (simulated) platform, allowing control system designers to converge rapidly on acceptable architectures with consideration to all required hardware elements. The software developed in this program will be installed on a distributed hardware-in-the-loop (DHIL) simulation tool to assist NASA and the Distributed Engine Control Working Group (DECWG) in integrating DCS (distributed engine control systems) components onto existing and next-generation engines. The distributed engine control simulator blockset for MATLAB/Simulink and hardware simulator provides the capability to simulate virtual subcomponents, as well as swap actual subcomponents for hardware-in-the-loop (HIL) analysis. Subcomponents can be the communication network, smart sensor or actuator nodes, or a centralized control system. The distributed engine control blockset for MATLAB/Simulink is a software development tool. The software includes an engine simulation, a communication network simulation, control algorithms, and analysis algorithms set up in a modular environment for rapid simulation of different network architectures; the hardware consists of an embedded device running parts of the CMAPSS engine simulator and controlled through Simulink. The distributed engine control simulation, evaluation, and analysis technology provides unique

  20. Radionuclide Inventory Distribution Project Data Evaluation and Verification White Paper

    SciTech Connect

    NSTec Environmental Restoration

    2010-05-17

    Testing of nuclear explosives caused widespread contamination of surface soils on the Nevada Test Site (NTS). Atmospheric tests produced the majority of this contamination. The Radionuclide Inventory and Distribution Program (RIDP) was developed to determine distribution and total inventory of radionuclides in surface soils at the NTS to evaluate areas that may present long-term health hazards. The RIDP achieved this objective with aerial radiological surveys, soil sample results, and in situ gamma spectroscopy. This white paper presents the justification to support the use of RIDP data as a guide for future evaluation and to support closure of Soils Sub-Project sites under the purview of the Federal Facility Agreement and Consent Order. Use of the RIDP data as part of the Data Quality Objective process is expected to provide considerable cost savings and accelerate site closures. The following steps were completed: - Summarize the RIDP data set and evaluate the quality of the data. - Determine the current uses of the RIDP data and cautions associated with its use. - Provide recommendations for enhancing data use through field verification or other methods. The data quality is sufficient to utilize RIDP data during the planning process for site investigation and closure. Project planning activities may include estimating 25-millirem per industrial access year dose rate boundaries, optimizing characterization efforts, projecting final end states, and planning remedial actions. In addition, RIDP data may be used to identify specific radionuclide distributions and augment other non-radionuclide dose rate data. Finally, the RIDP data can be used to estimate internal and external dose rates.

  1. PERFORMANCE VERIFICATION OF CONTINUOUS MULTI-PARAMETER WATER MONITORS FOR DISTRIBUTION SYSTEMS

    EPA Science Inventory

    The Environmental Technology Verification (ETV) Program's Advanced Monitoring Systems (AMS) Center has been charged by EPA to verify the performance of commercially available monitoring technologies for air, water, and soil. Multi-parameter water monitors for distribution systems we...

  2. Protection of Location Privacy Based on Distributed Collaborative Recommendations

    PubMed Central

    Wang, Peng; Yang, Jing; Zhang, Jian-Pei

    2016-01-01

    In the existing centralized location services system structure, the server is easily attacked and becomes a communication bottleneck, which can lead to the disclosure of users' locations. To address this, we present a new distributed collaborative recommendation strategy based on a distributed system. In this strategy, each node builds a profile of its own location information. When a request for location services arises, the user can obtain the corresponding location services according to the recommendations in neighboring users' location information profiles. If no suitable recommended location service results are obtained, the user can send a service request to the server using a k-anonymous data set constructed around the centroid position of the neighbors. In this strategy, we designed a new model of distributed collaborative recommendation location service based on the users' location information profiles and used generalization and encryption to ensure the safety of the users' location information privacy. Finally, we used a real location data set for theoretical and experimental analysis. The results show that the proposed strategy reduces the frequency of access to the location server, provides better location services, and better protects the users' location privacy. PMID:27649308
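
    The centroid-based k-anonymous fallback can be sketched as follows (hypothetical function names; the profile exchange, generalization, and encryption steps of the strategy are omitted):

    ```python
    import numpy as np

    def k_anonymous_request(user_pos, neighbor_positions, k=5):
        """Build a k-anonymous location-service request (a sketch of the
        centroid idea described above): the server only ever sees the
        centroid of the user plus k-1 nearby neighbours."""
        neighbors = np.asarray(neighbor_positions)
        # pick the k-1 nearest neighbours to form the anonymity set
        dists = np.linalg.norm(neighbors - user_pos, axis=1)
        anonymity_set = neighbors[np.argsort(dists)[: k - 1]]
        cloak = np.vstack([anonymity_set, user_pos])
        return cloak.mean(axis=0)

    user = np.array([40.0, 116.3])
    others = np.random.default_rng(1).normal(loc=user, scale=0.01, size=(20, 2))
    print(k_anonymous_request(user, others, k=5))
    ```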

  3. Protection of Location Privacy Based on Distributed Collaborative Recommendations.

    PubMed

    Wang, Peng; Yang, Jing; Zhang, Jian-Pei

    2016-01-01

    In the existing centralized location services system structure, the server is easily attacked and becomes a communication bottleneck, which can lead to the disclosure of users' locations. To address this, we present a new distributed collaborative recommendation strategy based on a distributed system. In this strategy, each node builds a profile of its own location information. When a request for location services arises, the user can obtain the corresponding location services according to the recommendations in neighboring users' location information profiles. If no suitable recommended location service results are obtained, the user can send a service request to the server using a k-anonymous data set constructed around the centroid position of the neighbors. In this strategy, we designed a new model of distributed collaborative recommendation location service based on the users' location information profiles and used generalization and encryption to ensure the safety of the users' location information privacy. Finally, we used a real location data set for theoretical and experimental analysis. The results show that the proposed strategy reduces the frequency of access to the location server, provides better location services, and better protects the users' location privacy. PMID:27649308

  4. Fault Location Methods for Ungrounded Distribution Systems Using Local Measurements

    NASA Astrophysics Data System (ADS)

    Xiu, Wanjing; Liao, Yuan

    2013-08-01

    This article presents novel fault location algorithms for ungrounded distribution systems. The proposed methods are capable of locating faults using voltage and current measurements obtained at the local substation. Two types of fault location algorithms, using line-to-neutral and line-to-line measurements, are presented. The network structure and parameters are assumed to be known. The network structure needs to be updated based on information obtained from the utility telemetry system. With the help of the bus impedance matrix, local voltage changes due to the fault can be expressed as a function of the fault currents. Since the bus impedance matrix contains information about the fault location, superimposed voltages at the local substation can be expressed as a function of fault location, through which the fault location can be solved. Simulation studies have been carried out based on a sample distribution power system. The evaluation study shows that very accurate fault location estimates are obtained from both types of methods.
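
    The bus-impedance idea can be sketched numerically: each candidate fault bus implies a fault current through its Zbus column, and the candidate whose column best explains the measured superimposed voltages is the estimated location. For illustration this sketch assumes superimposed voltages at two monitored buses; the paper instead over-determines the fault with three-phase measurements at the local substation only.

    ```python
    import numpy as np

    # Toy 4-bus bus-impedance matrix (per unit); built from network data in practice.
    Z = np.array([[0.30, 0.20, 0.15, 0.10],
                  [0.20, 0.35, 0.18, 0.14],
                  [0.15, 0.18, 0.40, 0.20],
                  [0.10, 0.14, 0.20, 0.45]])

    # Simulate a fault at bus 2: superimposed voltages dV = -Zbus[:, f] * If.
    true_bus, I_fault = 2, 1.5 - 0.5j
    dV = -Z[:, true_bus] * I_fault

    monitored = [0, 1]       # hypothetical monitored buses for this sketch
    meas = dV[monitored]

    def locate_fault(meas, Z, monitored):
        """Scan candidate buses; return the one minimizing the residual."""
        best_bus, best_res = None, np.inf
        for f in range(Z.shape[0]):
            col = -Z[monitored, f]
            # least-squares fault current for this candidate location
            If = (col.conj() @ meas) / (col.conj() @ col)
            residual = np.linalg.norm(col * If - meas)
            if residual < best_res:
                best_bus, best_res = f, residual
        return best_bus

    print(locate_fault(meas, Z, monitored))  # 2
    ```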

  5. Locational distribution of gene functional classes in Arabidopsis thaliana

    PubMed Central

    Riley, Michael C; Clare, Amanda; King, Ross D

    2007-01-01

    Background We are interested in understanding the locational distribution of genes and their functions in genomes, as this distribution has both functional and evolutionary significance. Gene locational distribution is known to be affected by various evolutionary processes, with tandem duplication thought to be the main process producing clustering of homologous sequences. Recent research has found clustering of protein structural families in the human genome, even when genes identified as tandem duplicates have been removed from the data. However, this previous research was hindered as they were unable to analyse small sample sizes. This is a challenge for bioinformatics as more specific functional classes have fewer examples and conventional statistical analyses of these small data sets often produces unsatisfactory results. Results We have developed a novel bioinformatics method based on Monte Carlo methods and Greenwood's spacing statistic for the computational analysis of the distribution of individual functional classes of genes (from GO). We used this to make the first comprehensive statistical analysis of the relationship between gene functional class and location on a genome. Analysis of the distribution of all genes except tandem duplicates on the five chromosomes of A. thaliana reveals that the distribution on chromosomes I, II, IV and V is clustered at P = 0.001. Many functional classes are clustered, with the degree of clustering within an individual class generally consistent across all five chromosomes. A novel and surprising result was that the locational distribution of some functional classes were significantly more evenly spaced than would be expected by chance. Conclusion Analysis of the A. thaliana genome reveals evidence of unexplained order in the locational distribution of genes. The same general analysis method can be applied to any genome, and indeed any sequential data involving classes. PMID:17397552
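
    A minimal sketch of the Monte Carlo use of Greenwood's spacing statistic (hypothetical gene positions; the paper's GO-class handling and tandem-duplicate filtering are omitted):

    ```python
    import numpy as np

    def greenwood(positions, length):
        """Greenwood's spacing statistic: the sum of squared normalized
        spacings along the chromosome, including the two end gaps."""
        x = np.sort(np.asarray(positions)) / length
        spacings = np.diff(np.concatenate(([0.0], x, [1.0])))
        return np.sum(spacings**2)

    def mc_p_values(positions, length, n_sim=10000, seed=0):
        """Two one-sided Monte Carlo p-values: clustered (large G) and
        more evenly spaced than chance (small G)."""
        rng = np.random.default_rng(seed)
        g_obs = greenwood(positions, length)
        null = np.array([greenwood(rng.uniform(0, length, len(positions)), length)
                         for _ in range(n_sim)])
        return (null >= g_obs).mean(), (null <= g_obs).mean()

    genes = [0.5, 0.6, 0.8, 5.1, 5.3, 5.4, 9.9, 10.2]  # Mb, hypothetical
    p_clustered, p_even = mc_p_values(genes, length=30.0)
    print(p_clustered, p_even)
    ```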

  6. Measurement verification of dose distributions in pulsed-dose rate brachytherapy in breast cancer

    PubMed Central

    Mantaj, Patrycja; Zwierzchowski, Grzegorz

    2013-01-01

    Aim The aim of the study was to verify the dose distribution optimisation method in pulsed brachytherapy. Background Pulsed-dose rate brachytherapy is an important method of breast tumour treatment using standard brachytherapy equipment. The appropriate dose distribution around an implant is an important issue in treatment planning. Advanced computer systems for treatment planning are equipped with algorithms optimising dose distribution. Materials and methods A wax-paraffin phantom was constructed and seven applicators were placed within it. Two treatment plans (non-optimised, optimised) were prepared. The reference points were located at a distance of 5 mm from the applicators' axis. Thermoluminescent detectors were placed in the phantom at 35 suitably chosen reference points. Results The dosimetry verification was carried out at 35 reference points for the plans before and after optimisation. The percentage difference for the plan without optimisation ranged from −8.5% to 1.4% and after optimisation from −8.3% to 0.01%. In 16 reference points, the calculated percentage difference was negative (from −8.5% to 1.3% for the plan without optimisation and from −8.3% to 0.8% for the optimised plan). In the remaining 19 points, the percentage difference ranged from 9.1% to 1.4% for the plan without optimisation and from 7.5% to 0.01% for the optimised plan. No statistically significant differences were found between calculated doses and doses measured at the reference points for either the non-optimised or the optimised treatment plans. Conclusions No statistically significant differences were found in dose values at the reference points between doses calculated by the treatment planning system and those measured by TLDs. This proves the consistency between the measurements and the calculations. PMID:24416545

  7. The Error Distribution of BATSE Gamma-Ray Burst Locations

    NASA Technical Reports Server (NTRS)

    Briggs, Michael S.; Pendleton, Geoffrey N.; Kippen, R. Marc; Brainerd, J. J.; Hurley, Kevin; Connaughton, Valerie; Meegan, Charles A.

    1999-01-01

    Empirical probability models for BATSE gamma-ray burst (GRB) location errors are developed via a Bayesian analysis of the separations between BATSE GRB locations and locations obtained with the Interplanetary Network (IPN). Models are compared and their parameters estimated using 392 GRBs with single IPN annuli and 19 GRBs with intersecting IPN annuli. Most of the analysis is for the 4Br BATSE catalog; earlier catalogs are also analyzed. The simplest model that provides a good representation of the error distribution has 78% of the probability in a "core" term with a systematic error of 1.85 deg and the remainder in an extended tail with a systematic error of 5.1 deg, which implies a 68% confidence radius of 2.2 deg for bursts with negligible statistical uncertainties. There is evidence for a more complicated model in which the error distribution depends on the BATSE data type that was used to obtain the location. Bright bursts are typically located using the CONT data type, and according to the more complicated model, the 68% confidence radius for CONT-located bursts with negligible statistical uncertainties is 2.0 deg.

  8. Statistical distributions of nucleosomes: nonrandom locations by a stochastic mechanism.

    PubMed Central

    Kornberg, R D; Stryer, L

    1988-01-01

    Expressions are derived for distributions of nucleosomes in chromatin. Nucleosomes are placed on DNA at the densities found in bulk chromatin, and their locations are allowed to vary at random. No further assumptions are required to simulate the periodic patterns of digestion obtained with various nucleases. The introduction of a boundary constraint, due for example to sequence-specific protein binding, results in an array of regularly spaced nucleosomes at nonrandom locations, similar to the arrays reported for some genes and other chromosomal regions. PMID:3399412

  9. 37 CFR 382.16 - Verification of royalty distributions.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... distributions. 382.16 Section 382.16 Patents, Trademarks, and Copyrights COPYRIGHT ROYALTY BOARD, LIBRARY OF CONGRESS RATES AND TERMS FOR STATUTORY LICENSES RATES AND TERMS FOR DIGITAL TRANSMISSIONS OF SOUND... SATELLITE DIGITAL AUDIO RADIO SERVICES Preexisting Satellite Digital Audio Radio Services §...

  10. 37 CFR 382.16 - Verification of royalty distributions.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... distributions. 382.16 Section 382.16 Patents, Trademarks, and Copyrights COPYRIGHT ROYALTY BOARD, LIBRARY OF CONGRESS RATES AND TERMS FOR STATUTORY LICENSES RATES AND TERMS FOR DIGITAL TRANSMISSIONS OF SOUND... SATELLITE DIGITAL AUDIO RADIO SERVICES Preexisting Satellite Digital Audio Radio Services §...

  11. 37 CFR 382.16 - Verification of royalty distributions.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... distributions. 382.16 Section 382.16 Patents, Trademarks, and Copyrights COPYRIGHT ROYALTY BOARD, LIBRARY OF CONGRESS RATES AND TERMS FOR STATUTORY LICENSES RATES AND TERMS FOR DIGITAL TRANSMISSIONS OF SOUND... SATELLITE DIGITAL AUDIO RADIO SERVICES Preexisting Satellite Digital Audio Radio Services §...

  12. 37 CFR 382.16 - Verification of royalty distributions.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... distributions. 382.16 Section 382.16 Patents, Trademarks, and Copyrights COPYRIGHT ROYALTY BOARD, LIBRARY OF CONGRESS RATES AND TERMS FOR STATUTORY LICENSES RATES AND TERMS FOR DIGITAL TRANSMISSIONS OF SOUND... SATELLITE DIGITAL AUDIO RADIO SERVICES Preexisting Satellite Digital Audio Radio Services §...

  13. 37 CFR 382.16 - Verification of royalty distributions.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... distributions. 382.16 Section 382.16 Patents, Trademarks, and Copyrights COPYRIGHT ROYALTY BOARD, LIBRARY OF CONGRESS RATES AND TERMS FOR STATUTORY LICENSES RATES AND TERMS FOR DIGITAL TRANSMISSIONS OF SOUND... SATELLITE DIGITAL AUDIO RADIO SERVICES Preexisting Satellite Digital Audio Radio Services §...

  14. The verification of lightning location accuracy in Finland deduced from lightning strikes to trees

    NASA Astrophysics Data System (ADS)

    Mäkelä, Antti; Mäkelä, Jakke; Haapalainen, Jussi; Porjo, Niko

    2016-05-01

    We present a new method to determine the ground truth and accuracy of lightning location systems (LLS), using natural lightning strikes to trees. Observations of strikes to trees are being collected with a Web-based survey tool at the Finnish Meteorological Institute. Since Finnish thunderstorms tend to have a low average flash rate, it is often possible to identify unambiguously from the LLS data the stroke that caused damage to a given tree. The coordinates of the tree are then the ground truth for that stroke. The technique has clear advantages over other methods used to determine the ground truth. Instrumented towers and rocket launches measure upward-propagating lightning. Video and audio records, even with triangulation, are rarely capable of high accuracy. We present data for 36 quality-controlled tree strikes in the years 2007-2008. We show that the average inaccuracy of the lightning location network for that period was 600 m. In addition, we show that the 50% confidence ellipse calculated by the lightning location network and used operationally for describing the location accuracy is physically meaningful: half of all the strikes were located within the uncertainty ellipse of the nearest recorded stroke. Using tree strike data thus allows not only the accuracy of the LLS to be estimated but also the reliability of the uncertainty ellipse. To our knowledge, this method has not been attempted before for natural lightning.
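
    The ellipse-coverage check can be sketched as a point-in-ellipse test in a local planar approximation (hypothetical coordinates and ellipse parameters, in metres relative to the recorded stroke):

    ```python
    import numpy as np

    def inside_ellipse(point, center, semi_major, semi_minor, angle_deg):
        """Is `point` inside the confidence ellipse reported by the LLS?
        (Planar approximation in metres; adequate at ~km scales.)"""
        c, s = np.cos(np.radians(angle_deg)), np.sin(np.radians(angle_deg))
        dx, dy = np.asarray(point) - np.asarray(center)
        u = c * dx + s * dy    # offset along the major axis
        v = -s * dx + c * dy   # offset along the minor axis
        return (u / semi_major) ** 2 + (v / semi_minor) ** 2 <= 1.0

    # Hypothetical tree strikes: (tree position, stroke location, 50% ellipse).
    strikes = [
        ((120.0, -340.0), (0.0, 0.0), 800.0, 400.0, 30.0),
        ((950.0, 100.0),  (0.0, 0.0), 600.0, 300.0, 75.0),
    ]
    hits = [inside_ellipse(tree, ctr, a, b, ang) for tree, ctr, a, b, ang in strikes]
    print(f"coverage of the 50% ellipse: {np.mean(hits):.2f}")
    ```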

  15. Towards the Formal Verification of a Distributed Real-Time Automotive System

    NASA Technical Reports Server (NTRS)

    Endres, Erik; Mueller, Christian; Shadrin, Andrey; Tverdyshev, Sergey

    2010-01-01

    We present the status of a project which aims at building, and formally and pervasively verifying, a distributed automotive system. The target system is a gate-level model which consists of several interconnected electronic control units with independent clocks. This model is verified against the specification as seen by a system programmer. The automotive system is implemented on several FPGA boards. The pervasive verification is carried out using a combination of interactive theorem proving (Isabelle/HOL) and model checking (LTL).

  16. Design and verification of distributed logic controllers with application of Petri nets

    SciTech Connect

    Wiśniewski, Remigiusz; Grobelna, Iwona; Grobelny, Michał; Wiśniewska, Monika

    2015-12-31

    The paper deals with the design and verification of distributed logic controllers. The control system is initially modelled with Petri nets and formally verified against structural and behavioral properties with the application of temporal logic and the model checking technique. After that, it is decomposed into separate sequential automata that work concurrently. Each of them is re-verified and, if the validation is successful, the system can finally be implemented.

  17. Design and verification of distributed logic controllers with application of Petri nets

    NASA Astrophysics Data System (ADS)

    Wiśniewski, Remigiusz; Grobelna, Iwona; Grobelny, Michał; Wiśniewska, Monika

    2015-12-01

    The paper deals with the design and verification of distributed logic controllers. The control system is initially modelled with Petri nets and formally verified against structural and behavioral properties with the application of temporal logic and the model checking technique. After that, it is decomposed into separate sequential automata that work concurrently. Each of them is re-verified and, if the validation is successful, the system can finally be implemented.

  18. Solute location in a nanoconfined liquid depends on charge distribution

    NASA Astrophysics Data System (ADS)

    Harvey, Jacob A.; Thompson, Ward H.

    2015-07-01

    Nanostructured materials that can confine liquids have attracted increasing attention for their diverse properties and potential applications. Yet, significant gaps remain in our fundamental understanding of such nanoconfined liquids. Using replica exchange molecular dynamics simulations of a nanoscale, hydroxyl-terminated silica pore system, we determine how the locations explored by a coumarin 153 (C153) solute in ethanol depend on its charge distribution, which can be changed through a charge transfer electronic excitation. The solute position change is driven by the internal energy, which favors C153 at the pore surface compared to the pore interior, but less so for the more polar, excited-state molecule. This is attributed to more favorable non-specific solvation of the large dipole moment excited-state C153 by ethanol at the expense of hydrogen-bonding with the pore. It is shown that a change in molecule location resulting from shifts in the charge distribution is a general result, though how the solute position changes will depend upon the specific system. This has important implications for interpreting measurements and designing applications of mesoporous materials.

  19. Solute location in a nanoconfined liquid depends on charge distribution

    SciTech Connect

    Harvey, Jacob A.; Thompson, Ward H.

    2015-07-28

    Nanostructured materials that can confine liquids have attracted increasing attention for their diverse properties and potential applications. Yet, significant gaps remain in our fundamental understanding of such nanoconfined liquids. Using replica exchange molecular dynamics simulations of a nanoscale, hydroxyl-terminated silica pore system, we determine how the locations explored by a coumarin 153 (C153) solute in ethanol depend on its charge distribution, which can be changed through a charge transfer electronic excitation. The solute position change is driven by the internal energy, which favors C153 at the pore surface compared to the pore interior, but less so for the more polar, excited-state molecule. This is attributed to more favorable non-specific solvation of the large dipole moment excited-state C153 by ethanol at the expense of hydrogen-bonding with the pore. It is shown that a change in molecule location resulting from shifts in the charge distribution is a general result, though how the solute position changes will depend upon the specific system. This has important implications for interpreting measurements and designing applications of mesoporous materials.

  20. Verification of secure distributed systems in higher order logic: A modular approach using generic components

    SciTech Connect

    Alves-Foss, J.; Levitt, K.

    1991-01-01

    In this paper we present a generalization of McCullough's restrictiveness model as the basis for proving security properties about distributed system designs. We mechanize this generalization and an event-based model of computer systems in the HOL (Higher Order Logic) system to prove the composability of the model and several other properties about the model. We then develop a set of generalized classes of system components and show for which families of user views they satisfy the model. Using these classes we develop a collection of general system components that are instantiations of one of these classes and show that the instantiations also satisfy the security property. We then conclude with a sample distributed secure system, based on the Rushby and Randell distributed system design and designed using our collection of components, and show how our mechanized verification system can be used to verify such designs. 16 refs., 20 figs.

  1. Evaluation of gafchromic EBT film for intensity modulated radiation therapy dose distribution verification

    PubMed Central

    Sankar, A.; Kurup, P. G. Goplakrishna; Murali, V.; Ayyangar, Komanduri M.; Nehru, R. Mothilal; Velmurugan, J.

    2006-01-01

    This work was undertaken with the intention of investigating the possibility of clinical use of a commercially available self-developing radiochromic film – Gafchromic EBT film – for IMRT dose verification. The dose response curves were generated for the films using a VXR-16 film scanner. The results obtained with EBT films were compared with the results of Kodak EDR2 films. It was found that the EBT film has a linear response in the dose range of 0 to 600 cGy. The dose-related characteristics of the EBT film, like post-irradiation color growth with time, film uniformity and the effect of scanning orientation, were studied. There is up to an 8.6% increase in the color density between 2 and 40 h after irradiation. There was a considerable variation, up to 8.5%, in the film uniformity over its sensitive region. The quantitative difference between calculated and measured dose distributions was analyzed using the Gamma index with a tolerance of 3% dose difference and 3 mm distance to agreement. EDR2 films showed good and consistent results with the calculated dose distribution, whereas the results obtained using EBT were inconsistent. The variation in film uniformity limits the use of EBT film for conventional large-field IMRT verification. For IMRT of smaller field size (4.5 × 4.5 cm), the results obtained with EBT were comparable with the results of EDR2 films. PMID:21206669
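
    The gamma analysis used for the comparison can be sketched in one dimension (a brute-force version of the 3%/3 mm test; clinical film analyses work on 2-D dose grids):

    ```python
    import numpy as np

    def gamma_index(dose_ref, dose_eval, spacing, dd=0.03, dta=3.0):
        """Gamma analysis of a 1-D dose profile with a 3%/3 mm criterion.

        dose_ref, dose_eval : doses on the same regular grid
        spacing             : grid spacing in mm
        """
        n = len(dose_ref)
        x = np.arange(n) * spacing
        d_norm = dd * dose_ref.max()   # global dose-difference criterion
        gammas = np.empty(n)
        for i in range(n):
            dist2 = ((x - x[i]) / dta) ** 2
            dose2 = ((dose_eval - dose_ref[i]) / d_norm) ** 2
            gammas[i] = np.sqrt((dist2 + dose2).min())
        return gammas

    x = np.linspace(-30, 30, 121)                 # 0.5 mm spacing
    ref = np.exp(-x**2 / 200.0)                   # calculated profile
    meas = 1.02 * np.exp(-(x - 0.4)**2 / 200.0)   # measured: shifted and scaled
    g = gamma_index(ref, meas, spacing=0.5)
    print(f"pass rate (gamma <= 1): {100 * (g <= 1).mean():.1f}%")
    ```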

  2. Mass and galaxy distributions of four massive galaxy clusters from Dark Energy Survey Science Verification data

    DOE PAGES

    Melchior, P.; Suchyta, E.; Huff, E.; Hirsch, M.; Kacprzak, T.; Rykoff, E.; Gruen, D.; Armstrong, R.; Bacon, D.; Bechtol, K.; et al

    2015-03-31

    We measure the weak-lensing masses and galaxy distributions of four massive galaxy clusters observed during the Science Verification phase of the Dark Energy Survey. This pathfinder study is meant to 1) validate the DECam imager for the task of measuring weak-lensing shapes, and 2) utilize DECam's large field of view to map out the clusters and their environments over 90 arcmin. We conduct a series of rigorous tests on astrometry, photometry, image quality, PSF modelling, and shear measurement accuracy to single out flaws in the data and also to identify the optimal data processing steps and parameters. We find Science Verification data from DECam to be suitable for the lensing analysis described in this paper. The PSF is generally well-behaved, but the modelling is rendered difficult by a flux-dependent PSF width and ellipticity. We employ photometric redshifts to distinguish between foreground and background galaxies, and a red-sequence cluster finder to provide cluster richness estimates and cluster-galaxy distributions. By fitting NFW profiles to the clusters in this study, we determine weak-lensing masses that are in agreement with previous work. For Abell 3261, we provide the first estimates of redshift, weak-lensing mass, and richness. Additionally, the cluster-galaxy distributions indicate the presence of filamentary structures attached to 1E 0657-56 and RXC J2248.7-4431, stretching out as far as 1 degree (approximately 20 Mpc), showcasing the potential of DECam and DES for detailed studies of degree-scale features on the sky.

  3. Mass and galaxy distributions of four massive galaxy clusters from Dark Energy Survey Science Verification data

    SciTech Connect

    Melchior, P.; et al.

    2015-05-21

    We measure the weak-lensing masses and galaxy distributions of four massive galaxy clusters observed during the Science Verification phase of the Dark Energy Survey. This pathfinder study is meant to 1) validate the DECam imager for the task of measuring weak-lensing shapes, and 2) utilize DECam's large field of view to map out the clusters and their environments over 90 arcmin. We conduct a series of rigorous tests on astrometry, photometry, image quality, PSF modeling, and shear measurement accuracy to single out flaws in the data and also to identify the optimal data processing steps and parameters. We find Science Verification data from DECam to be suitable for the lensing analysis described in this paper. The PSF is generally well-behaved, but the modeling is rendered difficult by a flux-dependent PSF width and ellipticity. We employ photometric redshifts to distinguish between foreground and background galaxies, and a red-sequence cluster finder to provide cluster richness estimates and cluster-galaxy distributions. By fitting NFW profiles to the clusters in this study, we determine weak-lensing masses that are in agreement with previous work. For Abell 3261, we provide the first estimates of redshift, weak-lensing mass, and richness. In addition, the cluster-galaxy distributions indicate the presence of filamentary structures attached to 1E 0657-56 and RXC J2248.7-4431, stretching out as far as 1 degree (approximately 20 Mpc), showcasing the potential of DECam and DES for detailed studies of degree-scale features on the sky.

  4. Mass and galaxy distributions of four massive galaxy clusters from Dark Energy Survey Science Verification data

    SciTech Connect

    Melchior, P.; Suchyta, E.; Huff, E.; Hirsch, M.; Kacprzak, T.; Rykoff, E.; Gruen, D.; Armstrong, R.; Bacon, D.; Bechtol, K.; Bernstein, G. M.; Bridle, S.; Clampitt, J.; Honscheid, K.; Jain, B.; Jouvel, S.; Krause, E.; Lin, H.; MacCrann, N.; Patton, K.; Plazas, A.; Rowe, B.; Vikram, V.; Wilcox, H.; Young, J.; Zuntz, J.; Abbott, T.; Abdalla, F. B.; Allam, S. S.; Banerji, M.; Bernstein, J. P.; Bernstein, R. A.; Bertin, E.; Buckley-Geer, E.; Burke, D. L.; Castander, F. J.; da Costa, L. N.; Cunha, C. E.; Depoy, D. L.; Desai, S.; Diehl, H. T.; Doel, P.; Estrada, J.; Evrard, A. E.; Neto, A. F.; Fernandez, E.; Finley, D. A.; Flaugher, B.; Frieman, J. A.; Gaztanaga, E.; Gerdes, D.; Gruendl, R. A.; Gutierrez, G. R.; Jarvis, M.; Karliner, I.; Kent, S.; Kuehn, K.; Kuropatkin, N.; Lahav, O.; Maia, M. A. G.; Makler, M.; Marriner, J.; Marshall, J. L.; Merritt, K. W.; Miller, C. J.; Miquel, R.; Mohr, J.; Neilsen, E.; Nichol, R. C.; Nord, B. D.; Reil, K.; Roe, N. A.; Roodman, A.; Sako, M.; Sanchez, E.; Santiago, B. X.; Schindler, R.; Schubnell, M.; Sevilla-Noarbe, I.; Sheldon, E.; Smith, C.; Soares-Santos, M.; Swanson, M. E. C.; Sypniewski, A. J.; Tarle, G.; Thaler, J.; Thomas, D.; Tucker, D. L.; Walker, A.; Wechsler, R.; Weller, J.; Wester, W.

    2015-03-31

    We measure the weak-lensing masses and galaxy distributions of four massive galaxy clusters observed during the Science Verification phase of the Dark Energy Survey. This pathfinder study is meant to 1) validate the DECam imager for the task of measuring weak-lensing shapes, and 2) utilize DECam's large field of view to map out the clusters and their environments over 90 arcmin. We conduct a series of rigorous tests on astrometry, photometry, image quality, PSF modelling, and shear measurement accuracy to single out flaws in the data and also to identify the optimal data processing steps and parameters. We find Science Verification data from DECam to be suitable for the lensing analysis described in this paper. The PSF is generally well-behaved, but the modelling is rendered difficult by a flux-dependent PSF width and ellipticity. We employ photometric redshifts to distinguish between foreground and background galaxies, and a red-sequence cluster finder to provide cluster richness estimates and cluster-galaxy distributions. By fitting NFW profiles to the clusters in this study, we determine weak-lensing masses that are in agreement with previous work. For Abell 3261, we provide the first estimates of redshift, weak-lensing mass, and richness. Additionally, the cluster-galaxy distributions indicate the presence of filamentary structures attached to 1E 0657-56 and RXC J2248.7-4431, stretching out as far as 1 degree (approximately 20 Mpc), showcasing the potential of DECam and DES for detailed studies of degree-scale features on the sky.

  5. Dosimetric verification of stereotactic radiosurgery/stereotactic radiotherapy dose distributions using Gafchromic EBT3

    SciTech Connect

    Cusumano, Davide; Fumagalli, Maria L.; Marchetti, Marcello; Fariselli, Laura; De Martin, Elena

    2015-10-01

    Aim of this study is to examine the feasibility of using the new Gafchromic EBT3 film in a high-dose stereotactic radiosurgery and radiotherapy quality assurance procedure. Owing to the reduced dimensions of the involved lesions, the feasibility of scanning plan verification films on the scanner plate area with the best uniformity rather than using a correction mask was evaluated. For this purpose, signal values dispersion and reproducibility of film scans were investigated. Uniformity was then quantified in the selected area and was found to be within 1.5% for doses up to 8 Gy. A high-dose threshold level for analyses using this procedure was established evaluating the sensitivity of the irradiated films. Sensitivity was found to be of the order of centiGray for doses up to 6.2 Gy and decreasing for higher doses. The obtained results were used to implement a procedure comparing dose distributions delivered with a CyberKnife system to planned ones. The procedure was validated through single beam irradiation on a Gafchromic film. The agreement between dose distributions was then evaluated for 13 patients (brain lesions, 5 Gy/die prescription isodose ~80%) using gamma analysis. Results obtained using Gamma test criteria of 5%/1 mm show a pass rate of 94.3%. Gamma frequency parameters calculation for EBT3 films showed to strongly depend on subtraction of unexposed film pixel values from irradiated ones. In the framework of the described dosimetric procedure, EBT3 films proved to be effective in the verification of high doses delivered to lesions with complex shapes and adjacent to organs at risk.

  6. Distribution and Location of Genetic Effects for Dairy Traits

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Genetic effects for many dairy traits and for total economic merit are fairly evenly distributed across all chromosomes. A high-density scan using 38,416 SNP markers for 5,285 bulls confirmed two previously-known major genes on Bos taurus autosomes (BTA) 6 and 14 but revealed few other large effects...

  7. Software for Statistical Analysis of Weibull Distributions with Application to Gear Fatigue Data: User Manual with Verification

    NASA Technical Reports Server (NTRS)

    Kranz, Timothy L.

    2002-01-01

    The Weibull distribution has been widely adopted for the statistical description and inference of fatigue data. This document provides user instructions, examples, and verification for software to analyze gear fatigue test data. The software was developed presuming the data are adequately modeled using a two-parameter Weibull distribution. The calculations are based on likelihood methods, and the approach taken is valid for data that include type I censoring. The software was verified by reproducing results published by others.
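
    A minimal sketch of the likelihood approach described (two-parameter Weibull, type I censoring), using scipy for the optimization; the data here are simulated stand-ins, not gear fatigue results:

    ```python
    import numpy as np
    from scipy.optimize import minimize

    def weibull_mle(times, failed):
        """Two-parameter Weibull fit by maximum likelihood with type I
        (time-terminated) censoring.

        times  : test durations (failure time, or suspension time if censored)
        failed : boolean array, True where a failure was observed
        """
        t = np.asarray(times, float)
        f = np.asarray(failed, bool)

        def neg_log_like(params):
            log_beta, log_eta = params          # optimize in log space, keep > 0
            beta, eta = np.exp(log_beta), np.exp(log_eta)
            z = t / eta
            # failures contribute the pdf, suspensions the survival function
            ll = np.sum(np.log(beta / eta) + (beta - 1) * np.log(z[f]) - z[f] ** beta)
            ll -= np.sum(z[~f] ** beta)
            return -ll

        res = minimize(neg_log_like, x0=[0.0, np.log(t.mean())], method="Nelder-Mead")
        return np.exp(res.x)  # (shape beta, scale eta)

    rng = np.random.default_rng(2)
    life = rng.weibull(2.5, size=30) * 100.0   # true beta=2.5, eta=100
    censor_at = 120.0                          # type I test-suspension time
    observed = np.minimum(life, censor_at)
    beta, eta = weibull_mle(observed, life <= censor_at)
    print(f"beta ~ {beta:.2f}, eta ~ {eta:.1f}")
    ```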

  8. Software for Statistical Analysis of Weibull Distributions with Application to Gear Fatigue Data: User Manual with Verification

    NASA Technical Reports Server (NTRS)

    Krantz, Timothy L.

    2002-01-01

    The Weibull distribution has been widely adopted for the statistical description and inference of fatigue data. This document provides user instructions, examples, and verification for software to analyze gear fatigue test data. The software was developed presuming the data are adequately modeled using a two-parameter Weibull distribution. The calculations are based on likelihood methods, and the approach taken is valid for data that include type 1 censoring. The software was verified by reproducing results published by others.

  9. SLR data screening; location of peak of data distribution

    NASA Technical Reports Server (NTRS)

    Sinclair, Andrew T.

    1993-01-01

    At the 5th Laser Ranging Instrumentation Workshop held at Herstmonceux in 1984, consideration was given to the formation of on-site normal points by laser stations, and an algorithm was formulated. The algorithm included a recommendation that an iterated 3.0 x rms rejection criterion should be used to screen the data, and that arithmetic means should be formed within the normal point bins of the retained data. From Sept. 1990 onwards, this algorithm and screening criterion have been brought into effect by various laser stations for forming on-site normal points, and small variants of the algorithm are used by most analysis centers for forming normal points from full-rate data, although the data screening criterion they use ranges from about 2.5 to 3.0 x rms. At the CSTG Satellite Laser Ranging (SLR) Subcommission, a working group was set up in Mar. 1991 to review the recommended screening procedure. This paper has been influenced by the discussions of this working group, although the views expressed are primarily those of this author. The main thrust of this paper is that, particularly for single photon systems, a more important issue than data screening is the determination of the peak of a data distribution and hence the determination of the bias of the peak from the mean. Several methods of determining the peak are discussed.
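
    The iterated 3.0 x rms screening and the peak-versus-mean comparison can be sketched as follows (simulated skewed residuals stand in for single-photon range data; a careful analysis would smooth or fit the histogram):

    ```python
    import numpy as np

    def iterated_rms_screen(residuals, k=3.0):
        """Iterated k-sigma rejection used in on-site normal-point formation:
        repeatedly drop residuals more than k * rms from the mean."""
        r = np.asarray(residuals, float)
        while True:
            mean, rms = r.mean(), r.std()
            keep = np.abs(r - mean) <= k * rms
            if keep.all():
                return r
            r = r[keep]

    def peak_of_distribution(residuals, bins=100):
        """Estimate the peak (mode) of the residual distribution from a
        histogram; the peak-minus-mean offset is the skew bias at issue."""
        counts, edges = np.histogram(residuals, bins=bins)
        i = counts.argmax()
        return 0.5 * (edges[i] + edges[i + 1])

    # Skewed stand-in for single-photon range residuals (ps): Gaussian + tail.
    rng = np.random.default_rng(3)
    resid = np.concatenate([rng.normal(0, 30, 9000), rng.exponential(150, 1000)])
    kept = iterated_rms_screen(resid)
    print(f"mean = {kept.mean():.1f} ps, peak = {peak_of_distribution(kept):.1f} ps")
    ```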

  10. Study On Burst Location Technology under Steady-state in Water Distribution System

    NASA Astrophysics Data System (ADS)

    Liu, Xianpin; Li, Shuping; Wang, Shaowei; He, Fang; He, Zhixun; Cao, Guodong

    2010-11-01

    According to the characteristics of hydraulic information in a water distribution system under burst conditions, the correlation between monitoring values and the burst location is obtained by mathematical fitting, and the burst position is located in time. This method can effectively make use of the SCADA information in a water distribution system to actively locate a burst position. It is a new approach to burst location in water distribution systems that shortens the burst duration and reduces the impact on urban water supply, economic losses, and the waste of water resources.

  11. Redshift Distributions of Galaxies in the DES Science Verification Shear Catalogue and Implications for Weak Lensing

    SciTech Connect

    Bonnett, C.

    2015-07-21

    We present photometric redshift estimates for galaxies used in the weak lensing analysis of the Dark Energy Survey Science Verification (DES SV) data. Four model- or machine learning-based photometric redshift methods (annz2, bpz calibrated against BCC-Ufig simulations, skynet, and tpz) are analysed. For training, calibration, and testing of these methods, we also construct a catalogue of spectroscopically confirmed galaxies matched against DES SV data. The performance of the methods is evaluated against the matched spectroscopic catalogue, focusing on metrics relevant for weak lensing analyses, with additional validation against COSMOS photo-zs. From the galaxies in the DES SV shear catalogue, which have mean redshift 0.72 ± 0.01 over the range 0.3 < z < 1.3, we construct three tomographic bins with means of z = {0.45, 0.67, 1.00}. These bins each have systematic uncertainties δz ≲ 0.05 in the mean of the fiducial skynet photo-z n(z). We propagate the errors in the redshift distributions through to their impact on cosmological parameters estimated with cosmic shear, and find that they cause shifts in the value of σ8 of approximately 3%. This shift is within the one sigma statistical errors on σ8 for the DES SV shear catalogue. We also found that systematic differences in the critical surface density, Σcrit, contained levels of bias safely less than the statistical power of the DES SV data. We recommend a final Gaussian prior of width 0.05 for the photo-z bias in the mean of n(z) for each of the three tomographic bins, and show that this is a sufficient bias model for the corresponding cosmology analysis.

  12. Geometric verification

    NASA Technical Reports Server (NTRS)

    Grebowsky, G. J.

    1982-01-01

    Present LANDSAT data formats are reviewed to clarify how the geodetic location and registration capabilities were defined for P-tape products and RBV data. Since there is only one geometric model used in the master data processor, geometric location accuracy of P-tape products depends on the absolute accuracy of the model, and registration accuracy is determined by the stability of the model. Due primarily to inaccuracies in data provided by the LANDSAT attitude management system, desired accuracies are obtained only by using ground control points and a correlation process. The verification of system performance with regard to geodetic location requires the capability to determine pixel positions of map points in a P-tape array. Verification of registration performance requires the capability to determine pixel positions of common points (not necessarily map points) in two or more P-tape arrays for a given world reference system scene. Techniques for registration verification can be more varied and automated since map data are not required. The verification of LACIE extractions is used as an example.

  13. A Distribution-class Locational Marginal Price (DLMP) Index for Enhanced Distribution Systems

    NASA Astrophysics Data System (ADS)

    Akinbode, Oluwaseyi Wemimo

    The smart grid initiative is the impetus behind changes that are expected to culminate in an enhanced distribution system with the communication and control infrastructure to support advanced distribution system applications and resources such as distributed generation, energy storage systems, and price responsive loads. This research proposes a distribution-class analog of the transmission LMP (DLMP) as an enabler of the advanced applications of the enhanced distribution system. The DLMP is envisioned as a control signal that can incentivize distribution system resources to behave optimally in a manner that benefits economic efficiency and system reliability and that can optimally couple the transmission and the distribution systems. The DLMP is calculated from a two-stage optimization problem: a transmission system OPF and a distribution system OPF. An iterative framework that ensures accurate representation of the distribution system's price sensitive resources for the transmission system problem, and vice versa, is developed and its convergence is discussed. As part of the DLMP calculation framework, a DCOPF formulation that endogenously captures the effect of real power losses is discussed. The formulation uses piecewise linear functions to approximate losses. This thesis explores, with theoretical proofs, the breakdown of the loss approximation technique when non-positive DLMPs/LMPs occur and discusses a mixed integer linear programming formulation that corrects the breakdown. The DLMP is numerically illustrated in traditional and enhanced distribution systems and its superiority to contemporary pricing mechanisms is demonstrated using price responsive loads. Results show that the impact of the inaccuracy of contemporary pricing schemes becomes significant as flexible resources increase. At high elasticity, aggregate load consumption deviated from the optimal consumption by up to about 45 percent when using a flat or time-of-use rate. Individual load

  14. Geographic location, network patterns and population distribution of rural settlements in Greece

    NASA Astrophysics Data System (ADS)

    Asimakopoulos, Avraam; Mogios, Emmanuel; Xenikos, Dimitrios G.

    2016-10-01

    Our work addresses the problem of how social networks are embedded in space, by studying the spread of human population over complex geomorphological terrain. We focus on villages or small cities of up to a few thousand inhabitants located in mountainous areas in Greece. This terrain presents a familiar tree-like structure of valleys and land plateaus. Cities are found more often at lower altitudes and exhibit a preference for south orientation. Furthermore, the population generally avoids flat land plateaus and river beds, preferring locations slightly uphill, away from the plateau edge. Despite the location diversity regarding geomorphological parameters, we find certain quantitative norms when we examine location and population distributions relative to the (man-made) transportation network. In particular, settlements at radial distance ℓ away from road network junctions have the same mean altitude, practically independent of ℓ ranging from a few meters to 10 km. Similarly, the distribution of the settlement population at any given ℓ is the same for all ℓ. Finally, the cumulative distribution of the number of rural cities n(ℓ) is fitted to the Weibull distribution, suggesting that human decisions for creating settlements could be paralleled to mechanisms typically attributed to this particular statistical distribution.
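
    A minimal sketch of the final step, assuming settlement distances to the nearest road junction are available: fit the empirical cumulative distribution of distances to a Weibull CDF with scipy; the data here are synthetic placeholders.

      import numpy as np
      from scipy.optimize import curve_fit

      def weibull_cdf(l, k, lam):
          # Weibull cumulative distribution with shape k and scale lam.
          return 1.0 - np.exp(-(l / lam) ** k)

      # Synthetic stand-in for settlement distances (km) to road junctions.
      distances = np.sort(np.random.weibull(1.5, 500) * 3.0)
      empirical = np.arange(1, distances.size + 1) / distances.size

      (k, lam), _ = curve_fit(weibull_cdf, distances, empirical, p0=(1.0, 1.0))
      print(f"Weibull fit: shape k = {k:.2f}, scale = {lam:.2f} km")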

  15. A novel multi-human location method for distributed binary pyroelectric infrared sensor tracking system: Region partition using PNN and bearing-crossing location

    NASA Astrophysics Data System (ADS)

    Yang, Bo; Li, Xiaoshan; Luo, Jing

    2015-01-01

    This paper proposes a novel multi-human location method for a distributed binary pyroelectric infrared sensor tracking system, based on region partition using a probabilistic neural network and bearing-crossing location. The detection space of the system is divided into many sub-regions and encoded uniformly. The human region is located by an integrated neural network classifier, developed from probabilistic neural network ensembles and the Bagging algorithm. The location of a human target is obtained by first determining a coarse location with this classifier and then a fine location using our previous bearing-crossing location method. Simulation and experimental results have shown that the human region can be judged rapidly and that false detection points in multi-human location can be eliminated effectively. Compared with the bearing-crossing location method alone, the novel method significantly improves the locating and tracking accuracy of multiple human targets in an infrared sensor tracking system.
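
    The sketch below illustrates the coarse-to-fine idea under stated assumptions: a Bagging ensemble (sklearn's BaggingClassifier standing in for the paper's probabilistic neural network ensemble) maps a binary PIR detection pattern to a sub-region code, after which the fine bearing-crossing fix would be applied; the training data are synthetic.

      import numpy as np
      from sklearn.ensemble import BaggingClassifier
      from sklearn.neighbors import KNeighborsClassifier

      # Synthetic training data: binary PIR detection patterns -> sub-regions.
      X = np.random.randint(0, 2, size=(1000, 16))  # 16 binary sensor outputs
      y = np.random.randint(0, 25, size=1000)       # 5 x 5 grid of sub-regions

      # Bagging ensemble standing in for the paper's PNN ensemble classifier.
      coarse = BaggingClassifier(KNeighborsClassifier(), n_estimators=10).fit(X, y)

      def coarse_region(pattern):
          # Coarse step: map one detection pattern to a sub-region code; the
          # fine bearing-crossing fix would then be restricted to this region.
          return coarse.predict(np.asarray(pattern).reshape(1, -1))[0]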

  16. Development of Micro Discharge Locator for Distribution Line using Analogue Signal Processing

    NASA Astrophysics Data System (ADS)

    Kumazawa, Takao; Oka, Fujio

    Micro discharges (MDs) such as spark or partial discharges on distribution lines, which occur through the degradation of insulators, insulated wires, bushings, etc., may cause television interference or ground faults. A technique for locating MDs using differences in the arrival time of electromagnetic pulses radiated from the MDs has been investigated recently. However, the technique requires a large and expensive apparatus, such as a digital storage oscilloscope able to record the received pulse signals very fast. We investigated a new technique to estimate the direction of arrival (DOA) of the electromagnetic pulses using analogue signal processing, and produced a prototype MD locator. In order to evaluate the DOA estimation error, we performed several experiments locating spark discharges of about 50 nC/pulse on a test distribution line using the MD locator. The average estimation error was about 5 degrees, and the azimuth error was several times larger than the elevation error in most cases. This is attributed to the azimuth resolving power being lower than that of elevation, owing to the configuration of the receiving antennas. We also tried to locate MDs on real distribution lines, and confirmed that there was no significant influence of reflected or carrier waves on the DOA estimation.

  17. Optimization of pressure gauge locations for water distribution systems using entropy theory.

    PubMed

    Yoo, Do Guen; Chang, Dong Eil; Jun, Hwandon; Kim, Joong Hoon

    2012-12-01

    It is essential to select the optimal pressure gauge location for effective management and maintenance of water distribution systems. This study proposes an objective and quantified standard for selecting the optimal pressure gauge location by defining the pressure change at other nodes as a result of demand change at a specific node using entropy theory. Two cases are considered in terms of demand change: one in which demand at all nodes shows peak load by using a peak factor, and one comprising demand changes drawn from a normal distribution whose average is the base demand. The actual pressure change pattern is determined by using the emitter function of EPANET to reflect the pressure that changes in practice at each node. The optimal pressure gauge location is determined by prioritizing the node that exchanges the largest amount of information with the whole system, as measured by the entropy it gives (giving entropy) and receives (receiving entropy). The suggested model is applied to one virtual and one real pipe network, and the optimal pressure gauge location combination is calculated by implementing a sensitivity analysis based on the study results. These analysis results support the following two conclusions. Firstly, the installation priority of pressure gauges in water distribution networks can be determined with a more objective standard through entropy theory. Secondly, the model can be used as an efficient decision-making guide for gauge installation in water distribution systems.
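
    A hedged sketch of the entropy ranking, assuming a matrix dP of simulated pressure changes is available (dP[i, j] being the change at node j for a demand change at node i); the discretization into histogram bins is an illustrative choice, not the paper's exact procedure.

      import numpy as np

      def shannon_entropy(counts):
          # Shannon entropy of a histogram, in bits.
          p = counts[counts > 0] / counts.sum()
          return -np.sum(p * np.log2(p))

      def rank_gauge_nodes(dP, bins=10):
          # dP[i, j]: pressure change at node j for a demand change at node i.
          # Giving entropy from row i, receiving entropy from column i.
          n = dP.shape[0]
          scores = np.empty(n)
          for i in range(n):
              give, _ = np.histogram(dP[i, :], bins=bins)
              recv, _ = np.histogram(dP[:, i], bins=bins)
              scores[i] = shannon_entropy(give) + shannon_entropy(recv)
          return np.argsort(scores)[::-1]  # highest-entropy nodes first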

  18. A method to optimize sampling locations for measuring indoor air distributions

    NASA Astrophysics Data System (ADS)

    Huang, Yan; Shen, Xiong; Li, Jianmin; Li, Bingye; Duan, Ran; Lin, Chao-Hsin; Liu, Junjie; Chen, Qingyan

    2015-02-01

    Indoor air distributions, such as the distributions of air temperature, air velocity, and contaminant concentrations, are very important to occupants' health and comfort in enclosed spaces. When point data are collected for interpolation to form field distributions, the sampling locations (the locations of the point sensors) have a significant effect on the time invested, labor costs, and accuracy of the field interpolation. This investigation compared two different methods of determining sampling locations: the grid method and the gradient-based method. The two methods were applied to obtain point air parameter data in an office room and in a section of an economy-class aircraft cabin. The point data obtained were then interpolated to form field distributions by the ordinary Kriging method. Our error analysis shows that the gradient-based sampling method has a 32.6% smaller interpolation error than the grid sampling method. We derived the functional relationship between interpolation error and sampling size (the number of sampling points); according to this function, the sampling size has an optimal value, and the maximum useful sampling size can be determined from the sensor and system errors. This study recommends the gradient-based sampling method for measuring indoor air distributions.
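
    For the interpolation step, a minimal sketch using the pykrige package's ordinary Kriging: score a candidate set of sampling locations by kriging its point values onto a grid and comparing with a reference field (e.g., from CFD); the arrays and variogram model choice are placeholders.

      import numpy as np
      from pykrige.ok import OrdinaryKriging

      def interpolation_rmse(xs, ys, vals, truth, gridx, gridy):
          # Krige the sampled points onto the grid, then score the sampling
          # layout against a reference field of shape (len(gridy), len(gridx)).
          ok = OrdinaryKriging(xs, ys, vals, variogram_model="spherical")
          field, _ = ok.execute("grid", gridx, gridy)
          return float(np.sqrt(np.mean((field - truth) ** 2)))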

  19. Optimal Capacity and Location Assessment of Natural Gas Fired Distributed Generation in Residential Areas

    NASA Astrophysics Data System (ADS)

    Khalil, Sarah My

    With the ever increasing use of natural gas to generate electricity, installed natural gas fired microturbines are found in residential areas to generate electricity locally. This research work discusses a generalized methodology for assessing the optimal capacity and locations for installing natural gas fired microturbines in a residential distribution network. The overall objective is to place microturbines to minimize the power loss occurring in the electrical distribution network, in such a way that the electric feeder does not need any upgrading. The IEEE 123 Node Test Feeder is selected as the test bed for validating the developed methodology. Three-phase unbalanced electric power flow is run in OpenDSS through a COM server, and the gas distribution network is analyzed using GASWorkS. A continual sensitivity analysis methodology is developed to select multiple DG locations, and an annual simulation is run to minimize annual average losses. The proposed placement of microturbines must be feasible in the gas distribution network and should not require gas pipeline reinforcement. The corresponding gas distribution network is developed in the GASWorkS software, and nodal pressures of the gas system are checked for various cases to investigate whether the existing gas distribution network can accommodate the penetration of the selected microturbines. The results indicate the optimal locations suitable for placing microturbines and the capacity that can be accommodated by the system, based on the consideration of overall minimum annual average losses as well as the guarantee of nodal pressure provided by the gas distribution network. The proposed method is generalized and can be used for any IEEE test feeder or an actual residential distribution network.

  20. Improved location algorithm for multiple intrusions in distributed Sagnac fiber sensing system.

    PubMed

    Wang, He; Sun, Qizhen; Li, Xiaolei; Wo, Jianghai; Shum, Perry Ping; Liu, Deming

    2014-04-01

    An improved algorithm named "twice-FFT" for multi-point intrusion location in a distributed Sagnac sensing system is proposed and demonstrated. To find the null frequencies more accurately and efficiently, a second FFT is applied to the frequency spectrum of the phase signal caused by an intrusion. After Gaussian fitting and searching for the peak response frequency in the twice-FFT curve, the intrusion position can be calculated stably. Meanwhile, the twice-FFT algorithm solves the problem of multi-point intrusion location. In experiments with the twice-FFT algorithm, a location error of less than 100 m for a single intrusion is achieved at any position along the total length of 41 km, and the ability to locate two or three intrusions occurring simultaneously is also demonstrated. PMID:24718133
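
    A rough sketch of the twice-FFT idea as described: take the magnitude spectrum of the phase signal, apply a second FFT to reveal the null-frequency periodicity, and refine the peak by Gaussian fitting; the mapping from the fitted peak to a distance depends on the system geometry and is not reproduced here.

      import numpy as np
      from scipy.optimize import curve_fit

      def gaussian(x, a, mu, sigma):
          return a * np.exp(-((x - mu) ** 2) / (2.0 * sigma ** 2))

      def twice_fft_peak(phase_signal):
          # First FFT: magnitude spectrum, which carries periodic nulls whose
          # spacing encodes the intrusion position along the loop.
          spec = np.abs(np.fft.rfft(phase_signal))
          # Second FFT: the spectrum of the spectrum turns that periodicity
          # into a peak, refined here by a Gaussian fit around the maximum.
          spec2 = np.abs(np.fft.rfft(spec - spec.mean()))
          k = int(np.argmax(spec2[1:]) + 1)          # skip the DC bin
          lo, hi = max(1, k - 3), min(len(spec2), k + 4)
          x = np.arange(lo, hi)
          (_, mu, _), _ = curve_fit(gaussian, x, spec2[lo:hi],
                                    p0=(spec2[k], k, 1.0))
          return mu  # converting this peak to metres depends on the geometry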

  1. Acoustic Emission Source Location Using a Distributed Feedback Fiber Laser Rosette

    PubMed Central

    Huang, Wenzhu; Zhang, Wentao; Li, Fang

    2013-01-01

    This paper proposes an approach for acoustic emission (AE) source localization in a large marble stone using distributed feedback (DFB) fiber lasers. The aim of this study is to detect damage in structures such as those found in civil applications. The directional sensitivity of the DFB fiber laser is investigated by calculating a location coefficient using digital signal analysis. In this analysis, autocorrelation is used to extract the location coefficient from a periodic AE signal, and the wavelet packet energy is calculated to obtain the location coefficient of a burst AE source. Normalization is applied to eliminate the influence of the distance and intensity of the AE source. A new location algorithm based on the location coefficient is then presented and tested to determine the location of an AE source using a Delta (Δ) DFB fiber laser rosette configuration. The advantages of the proposed algorithm over traditional methods based on fiber Bragg gratings (FBGs) include higher strain resolution for AE detection and the ability to account for two different types of AE source in location. PMID:24141266

  2. Acoustic emission source location using a distributed feedback fiber laser rosette.

    PubMed

    Huang, Wenzhu; Zhang, Wentao; Li, Fang

    2013-01-01

    This paper proposes an approach for acoustic emission (AE) source localization in a large marble stone using distributed feedback (DFB) fiber lasers. The aim of this study is to detect damage in structures such as those found in civil applications. The directional sensitivity of the DFB fiber laser is investigated by calculating a location coefficient using digital signal analysis. In this analysis, autocorrelation is used to extract the location coefficient from a periodic AE signal, and the wavelet packet energy is calculated to obtain the location coefficient of a burst AE source. Normalization is applied to eliminate the influence of the distance and intensity of the AE source. A new location algorithm based on the location coefficient is then presented and tested to determine the location of an AE source using a Delta (Δ) DFB fiber laser rosette configuration. The advantages of the proposed algorithm over traditional methods based on fiber Bragg gratings (FBGs) include higher strain resolution for AE detection and the ability to account for two different types of AE source in location. PMID:24141266
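
    A hedged sketch of the burst-AE location coefficient, using the pywt package for the wavelet packet energy and normalizing across sensors to remove source distance/intensity effects; the wavelet choice and decomposition level are illustrative assumptions.

      import numpy as np
      import pywt

      def wavelet_packet_energy(signal, wavelet="db4", level=3):
          # Total energy over the terminal nodes of a wavelet packet tree.
          wp = pywt.WaveletPacket(data=signal, wavelet=wavelet,
                                  mode="symmetric", maxlevel=level)
          return sum(float(np.sum(np.square(node.data)))
                     for node in wp.get_level(level, order="natural"))

      def location_coefficients(signals):
          # Normalize each sensor's energy by the sum over all sensors to
          # remove the common influence of source distance and intensity.
          energies = np.array([wavelet_packet_energy(s) for s in signals])
          return energies / energies.sum()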

  3. Role of origin and release location in pre-spawning distribution and movements of anadromous alewife

    USGS Publications Warehouse

    Frank, Holly J.; Mather, M. E.; Smith, Joseph M.; Muth, Robert M.; Finn, John T.

    2011-01-01

    Capturing adult anadromous fish that are ready to spawn from a self-sustaining population and transferring them into a depleted system is a common fisheries enhancement tool. The behaviour of these transplanted fish, however, has not been fully evaluated. The movements of stocked and native anadromous alewife, Alosa pseudoharengus (Wilson), were monitored in the Ipswich River, Massachusetts, USA, to provide a scientific basis for this management tool. Radiotelemetry was used to examine the effect of origin (native or stocked) and release location (upstream or downstream) on distribution and movement during the spawning migration. Native fish remained in the river longer than stocked fish regardless of release location. Release location and origin influenced where fish spent time and how they moved. The spatial mosaic of available habitats and the entire trajectory of freshwater movements should be considered to effectively restore spawners that traverse tens of kilometres within coastal rivers.

  4. Location of lightning stroke on OPGW by use of distributed optical fiber sensor

    NASA Astrophysics Data System (ADS)

    Lu, Lidong; Liang, Yun; Li, Binglin; Guo, Jinghong; Zhang, Hao; Zhang, Xuping

    2014-12-01

    A new method based on a distributed optical fiber sensor (DOFS) to locate the position of a lightning stroke on the optical fiber ground wire (OPGW) is proposed and experimentally demonstrated. In the method, the lightning stroke is treated as a heat release process at the stroke position, so Brillouin optical time domain reflectometry (BOTDR) with a spatial resolution of 2 m is used as the distributed temperature sensor. To simulate the lightning stroke, an electric anode with a high pulsed current and a negative electrode (the OPGW) are used to form a lightning impulse system with a duration of 200 ms. In the experiment, lightning strokes with quantities of electric discharge of 100 and 200 coulombs were generated, and the DOFS sensitively captured the temperature change at the stroke position during the transient discharge. Experimental results show that the DOFS is a feasible instrument for locating lightning strokes on the OPGW, with excellent potential for the maintenance of electric power transmission lines. Additionally, as the extent of a lightning stroke is usually within 10 cm while the spatial resolution of a typical DOFS is 1 m or more, the temperature characteristics in such a small area cannot be accurately represented by a DOFS with a large spatial resolution. Therefore, for further application of distributed optical fiber temperature sensors such as BOTDR and ROTDR to lightning stroke location on OPGW, it is important to improve the spatial resolution.
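
    A minimal sketch of the locating step, assuming BOTDR temperature traces sampled along the fiber before and during the event are available: the stroke position is taken as the sample with the largest significant temperature rise; the significance threshold is an illustrative choice.

      import numpy as np

      def locate_hot_spot(during, baseline, dz, threshold=3.0):
          # during/baseline: temperature traces along the fiber (same grid);
          # dz: spatial sampling interval in metres.
          rise = during - baseline
          idx = int(np.argmax(rise))
          if rise[idx] < threshold * np.std(rise):  # crude significance test
              return None                           # no clear hot spot
          return idx * dz                           # stroke position (m)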

  5. Estimation of Distributed Fermat-Point Location for Wireless Sensor Networking

    PubMed Central

    Huang, Po-Hsian; Chen, Jiann-Liang; Larosa, Yanuarius Teofilus; Chiang, Tsui-Lien

    2011-01-01

    This work presents a localization scheme for use in wireless sensor networks (WSNs) based on a proposed connectivity-based RF localization strategy called the distributed Fermat-point location estimation algorithm (DFPLE). DFPLE estimates location from the triangle formed by the intersections of three neighboring beacon nodes. The Fermat point is determined as the point minimizing the total distance to the three vertices of the triangle. The estimated location area is then refined using the Fermat point to achieve minimum error in estimating sensor node locations. DFPLE solves the problems of large errors and poor performance encountered by localization schemes that are based on a bounding box algorithm. Performance analysis of a 200-node development environment reveals that, when the number of sensor nodes is below 150, the mean error decreases rapidly as the node density increases, and when the number of sensor nodes exceeds 170, the mean error remains below 1% as the node density increases. Second, when the number of beacon nodes is less than 60, normal nodes lack sufficient beacon nodes to enable their locations to be estimated. However, the mean error changes only slightly as the number of beacon nodes increases above 60. Simulation results revealed that the proposed algorithm for estimating sensor positions is more accurate than existing algorithms, and improves upon conventional bounding box strategies. PMID:22163851

  6. Estimation of distributed Fermat-point location for wireless sensor networking.

    PubMed

    Huang, Po-Hsian; Chen, Jiann-Liang; Larosa, Yanuarius Teofilus; Chiang, Tsui-Lien

    2011-01-01

    This work presents a localization scheme for use in wireless sensor networks (WSNs) based on a proposed connectivity-based RF localization strategy called the distributed Fermat-point location estimation algorithm (DFPLE). DFPLE estimates location from the triangle formed by the intersections of three neighboring beacon nodes. The Fermat point is determined as the point minimizing the total distance to the three vertices of the triangle. The estimated location area is then refined using the Fermat point to achieve minimum error in estimating sensor node locations. DFPLE solves the problems of large errors and poor performance encountered by localization schemes that are based on a bounding box algorithm. Performance analysis of a 200-node development environment reveals that, when the number of sensor nodes is below 150, the mean error decreases rapidly as the node density increases, and when the number of sensor nodes exceeds 170, the mean error remains below 1% as the node density increases. Second, when the number of beacon nodes is less than 60, normal nodes lack sufficient beacon nodes to enable their locations to be estimated. However, the mean error changes only slightly as the number of beacon nodes increases above 60. Simulation results revealed that the proposed algorithm for estimating sensor positions is more accurate than existing algorithms, and improves upon conventional bounding box strategies.
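
    Since the Fermat point of a triangle (all angles below 120°) is the point minimizing the summed distance to the three vertices, it can be computed with the standard Weiszfeld iteration, sketched below; the beacon-intersection coordinates are hypothetical.

      import numpy as np

      def fermat_point(vertices, tol=1e-9, max_iter=200):
          # Weiszfeld iteration: converges to the point minimizing the summed
          # distance to the vertices (the Fermat point when all angles < 120°).
          pts = np.asarray(vertices, dtype=float)  # shape (3, 2)
          x = pts.mean(axis=0)                     # start at the centroid
          for _ in range(max_iter):
              d = np.linalg.norm(pts - x, axis=1)
              if np.any(d < tol):                  # landed on a vertex
                  break
              w = 1.0 / d
              x_new = (pts * w[:, None]).sum(axis=0) / w.sum()
              if np.linalg.norm(x_new - x) < tol:
                  return x_new
              x = x_new
          return x

      # Hypothetical beacon-intersection triangle:
      print(fermat_point([(0.0, 0.0), (4.0, 0.0), (2.0, 3.0)]))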

  7. Tomotherapy dose distribution verification using MAGIC-f polymer gel dosimetry

    SciTech Connect

    Pavoni, J. F.; Pike, T. L.; Snow, J.; DeWerd, L.; Baffa, O.

    2012-05-15

    Purpose: This paper presents the application of MAGIC-f gel to a three-dimensional dose distribution measurement and its ability to accurately measure the dose distribution from a tomotherapy unit. Methods: A prostate intensity-modulated radiation therapy (IMRT) irradiation was simulated in the gel phantom and the treatment was delivered by a TomoTherapy unit. The dose distribution was evaluated through the R2 distribution measured by magnetic resonance imaging. Results: High similarity was found by overlapping the isodoses of the dose distribution measured with the gel and that expected by the treatment planning system (TPS). A further analysis compared the relative absorbed dose profiles extracted along indicated lines of the volume in the measured and expected dose distributions, and the results were also in agreement. A gamma index analysis was also applied to the data, and a high pass rate was achieved (88.4% for analysis using 3%/3 mm and 96.5% using 4%/4 mm). The real three-dimensional analysis compared the dose-volume histograms measured for the planning volumes with those expected by the treatment planning, and the results were again in good agreement, as shown by the overlap of the curves. Conclusions: These results show that the MAGIC-f gel is promising for three-dimensional dose distribution measurements.
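
    A hedged one-dimensional sketch of the gamma index test quoted above (clinical analyses are 2D/3D): for each reference point, the minimum combined dose-difference/distance-to-agreement metric is searched over the evaluated profile, and the pass rate is the fraction of points with gamma ≤ 1.

      import numpy as np

      def gamma_index_1d(x, dose_ref, dose_eval, dd=0.03, dta=3.0):
          # x: common coordinates (mm); dd: dose criterion as a fraction of
          # the maximum reference dose; dta: distance criterion (mm).
          d_norm = dd * dose_ref.max()
          gamma = np.empty_like(dose_ref, dtype=float)
          for i, (xi, di) in enumerate(zip(x, dose_ref)):
              metric = ((x - xi) / dta) ** 2 + ((dose_eval - di) / d_norm) ** 2
              gamma[i] = np.sqrt(metric.min())
          return gamma

      # Pass rate for a 3%/3 mm test: fraction of points with gamma <= 1.
      # pass_rate = np.mean(gamma_index_1d(x, planned, measured) <= 1.0)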

  8. Approaches to verification of two-dimensional water quality models

    SciTech Connect

    Butkus, S.R. . Water Quality Dept.)

    1990-11-01

    The verification of a water quality model is the procedure most needed by decision makers evaluating model predictions, but it is often inadequate or not done at all. The results of a properly conducted verification provide decision makers with an estimate of the uncertainty associated with model predictions. Several statistical tests are available for quantifying the performance of a model. Six methods of verification were evaluated using an application of the BETTER two-dimensional water quality model for Chickamauga reservoir. Model predictions for ten state variables were compared to observed conditions from 1989. Spatial distributions of the verification measures showed that the model predictions were generally adequate, except at a few specific locations in the reservoir. The most useful statistic was the mean standard error of the residuals. Quantifiable measures of model performance should be calculated during calibration and verification of future applications of the BETTER model. 25 refs., 5 figs., 7 tabs.

  9. Experimental verification of reconstructed absorbers embedded in scattering media by optical power ratio distribution.

    PubMed

    Yamaoki, Toshihiko; Hamada, Hiroaki; Matoba, Osamu

    2016-09-01

    An experimental investigation showing the effectiveness of extracting absorber information in a scattering medium by taking the output power ratio distribution is presented. In the experiment, two metallic wires sandwiched between three homogeneous scattering media are used as absorbers in transmission geometry. The output power ratio distributions can extract the influence of the absorbers to enhance the optical signal. The peak positions of the output power ratio distributions agree with the results suggested by numerical simulation. From the reconstructed tomography of the scattering media, we have confirmed that a tomographic image reconstructed from 41×21 output power ratio distributions using continuous-wave light successfully distinguishes the two wires. PMID:27607261

  10. Distributed fiber sensing system with wide frequency response and accurate location

    NASA Astrophysics Data System (ADS)

    Shi, Yi; Feng, Hao; Zeng, Zhoumo

    2016-02-01

    A distributed fiber sensing system merging a Mach-Zehnder interferometer and a phase-sensitive optical time domain reflectometer (Φ-OTDR) is demonstrated for vibration measurement, which requires wide frequency response and accurate location. Two narrow-linewidth lasers with slightly different wavelengths are used to constitute the interferometer and reflectometer, respectively. A narrowband fiber Bragg grating is responsible for separating the two wavelengths. In addition, heterodyne detection is applied to maintain the signal-to-noise ratio of the locating signal. Experimental results show that the system has a wide frequency response from 1 Hz to 50 MHz, limited by the sample rate of the data acquisition card, and a spatial resolution of 20 m, corresponding to a 200 ns pulse width, along a 2.5 km fiber link.

  11. FDTD verification of deep-set brain tumor hyperthermia using a spherical microwave source distribution

    SciTech Connect

    Dunn, D.; Rappaport, C.M.; Terzuoli, A.J. Jr.

    1996-10-01

    Although the use of noninvasive microwave hyperthermia to treat cancer is problematic in many human body structures, careful selection of the source electric field distribution around the entire surface of the head can generate a tightly focused global power density maximum at the deepest point within the brain. An analytic prediction of the optimum volume field distribution in a layered concentric head model based on summing spherical harmonic modes is derived and presented. This ideal distribution is then verified using a three-dimensional finite difference time domain (FDTD) simulation with a discretized, MRI-based head model excited by the spherical source. The numerical computation gives a dissipated power pattern very similar to the analytic prediction. This study demonstrates that microwave hyperthermia can theoretically be a feasible cancer treatment modality for tumors in the head, providing a well-resolved hot spot at depth without overheating any other healthy tissue.

  12. Verification, Validation, and Accreditation Challenges of Distributed Simulation for Space Exploration Technology

    NASA Technical Reports Server (NTRS)

    Thomas, Danny; Hartway, Bobby; Hale, Joe

    2006-01-01

    Throughout its rich history, NASA has invested heavily in sophisticated simulation capabilities. These capabilities reside in NASA facilities across the country - and with partners around the world. NASA's Exploration Systems Mission Directorate (ESMD) has the opportunity to leverage these considerable investments to resolve technical questions relating to its missions. The distributed nature of the assets, both in terms of geography and organization, presents challenges to their combined and coordinated use, but precedents of geographically distributed real-time simulations exist. This paper will show how technological advances in simulation can be employed to address the issues associated with netting NASA simulation assets.

  13. Humidity distribution affected by freely exposed water surfaces: Simulations and experimental verification

    NASA Astrophysics Data System (ADS)

    Hygum, M. A.; Popok, V. N.

    2014-07-01

    Accurate models of the water vapor flux at a water-air interface are required in various scientific, reliability, and civil engineering applications. Here, a study of the humidity distribution in a container with air and freely exposed water is presented. A model predicting the spatial distribution and time evolution of relative humidity, based on statistical rate theory and computational fluid dynamics, is developed. In our approach we use short-term steady-state steps to simulate the slowly evolving evaporation in the system. Experiments demonstrate reasonably good agreement with the computer modeling and allow one to distinguish the most important parameters of the model.

  14. Gas Chromatographic Verification of a Mathematical Model: Product Distribution Following Methanolysis Reactions.

    ERIC Educational Resources Information Center

    Lam, R. B.; And Others

    1983-01-01

    Investigated application of binomial statistics to equilibrium distribution of ester systems by employing gas chromatography to verify the mathematical model used. Discusses model development and experimental techniques, indicating the model enables a straightforward extension to symmetrical polyfunctional esters and presents a mathematical basis…
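
    A minimal sketch of the binomial model, assuming each of the n equivalent acyl sites of a symmetrical ester exchanges independently with probability p at equilibrium; scipy's binomial distribution gives the expected product fractions.

      from scipy.stats import binom

      def product_fractions(n_sites, p):
          # Expected fraction of molecules with k of n equivalent acyl sites
          # exchanged, if each site exchanges independently with probability p.
          return {k: binom.pmf(k, n_sites, p) for k in range(n_sites + 1)}

      # Example: a symmetrical diester with 60% exchange per site.
      for k, frac in product_fractions(2, 0.6).items():
          print(f"{k} site(s) exchanged: {frac:.3f}")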

  15. Gene Location and DNA Density Determine Transcription Factor Distributions in E. coli

    NASA Astrophysics Data System (ADS)

    Kuhlman, Thomas; Cox, Edward

    2013-03-01

    The diffusion coefficient of the prototypical transcription factor LacI within living Escherichia coli has been measured directly by in vivo tracking to be D = 0.4 μm2/s. At this rate, simple models of diffusion lead to the expectation that LacI and other proteins will rapidly homogenize throughout the cell. We have tested this expectation of spatial homogeneity by single molecule visualization of LacI molecules non-specifically bound to DNA in fixed cells. Contrary to expectation, we find that the distribution depends on the spatial location of its encoding gene. We demonstrate that the spatial distribution of LacI is also determined by the local state of DNA compaction, and that E. coli can dynamically redistribute proteins by modifying the state of its nucleoid. Finally, we show that LacI inhomogeneity increases the strength with which targets located proximally to the LacI gene are regulated. We propose a model for intranucleoid diffusion which can reconcile these results with previous measurements of LacI diffusion. This work was supported by the National Institutes of Health [GM078591, GM071508] and the Howard Hughes Medical Institute [52005884]. TEK is supported by an NIH Ruth Kirschstein NRSA Fellowship [F32GM090568-01A1].

  16. Location, Location, Location!

    ERIC Educational Resources Information Center

    Ramsdell, Kristin

    2004-01-01

    Of prime importance in real estate, location is also a key element in the appeal of romances. Popular geographic settings and historical periods sell, unpopular ones do not--not always with a logical explanation, as the author discovered when she conducted a survey on this topic last year. (Why, for example, are the French Revolution and the…

  17. Current and potential distribution on a coated pipeline with holidays. Part 1 - Model and experimental verification

    SciTech Connect

    Kennelley, K.J.; Bone, L. ); Orazem, M.E. )

    1993-03-01

    A two-dimensional model was developed to predict the current and potential distribution of an underground coated pipe in high-resistivity soil under an impressed current, parallel anode, cathodic protection (CP) system. The model was designed to study the effect of discrete holidays of various sizes on coated pipe without having to assume that holidays simply reduce the efficiency of the protective coating. Full-scale experimental tests were conducted with a 0.508 m (20 in.) diameter pipe coated with 460 μm (18 mils) of fusion-bonded epoxy. The performance of a parallel anode CP system was measured in the presence and absence of a discrete longitudinal coating defect (2.4% of the pipe circumference) that exposed bare steel. All tests were conducted in 108,000 ohm-cm water. Good agreement was obtained between experimental results and modeling predictions. The results show that there can be a significant difference in the performance of a cathodic protection system when localized defects exist in the coating as compared to uniformly distributed holidays. This difference is expected to be most pronounced in high-resistivity soils with close anode-to-pipe spacing.

  18. Nanofibre distribution in composites manufactured with epoxy reinforced with nanofibrillated cellulose: model prediction and verification

    NASA Astrophysics Data System (ADS)

    Aitomäki, Yvonne; Westin, Mikael; Korpimäki, Jani; Oksman, Kristiina

    2016-07-01

    In this study a model based on simple scattering is developed and used to predict the distribution of nanofibrillated cellulose in composites manufactured by resin transfer moulding (RTM) where the resin contains nanofibres. The model is a Monte Carlo based simulation where nanofibres are randomly chosen from probability density functions for length, diameter and orientation. Their movements are then tracked as they advance through a random arrangement of fibres in defined fibre bundles. The results of the model show that the fabric filters the nanofibres within the first 20 µm unless clear inter-bundle channels are available. The volume fraction of the fabric fibres, flow velocity and size of nanofibre influence this to some extent. To verify the model, an epoxy with 0.5 wt.% Kraft Birch nanofibres was made through a solvent exchange route and stained with a colouring agent. This was infused into a glass fibre fabric using an RTM process. The experimental results confirmed the filtering of the nanofibres by the fibre bundles and their penetration in the fabric via the inter-bundle channels. Hence, the model is a useful tool for visualising the distribution of the nanofibres in composites in this manufacturing process.

  19. Experimental verification of a model describing the intensity distribution from a single mode optical fiber

    SciTech Connect

    Moro, Erik A; Puckett, Anthony D; Todd, Michael D

    2011-01-24

    The intensity distribution of a transmission from a single mode optical fiber is often approximated using a Gaussian-shaped curve. While this approximation is useful for some applications such as fiber alignment, it does not accurately describe transmission behavior off the axis of propagation. In this paper, another model is presented, which describes the intensity distribution of the transmission from a single mode optical fiber. A simple experimental setup is used to verify the model's accuracy, and agreement between model and experiment is established both on and off the axis of propagation. Displacement sensor designs based on the extrinsic optical lever architecture are presented. The behavior of the transmission off the axis of propagation dictates the performance of sensor architectures in which large lateral offsets (25-1500 μm) exist between transmitting and receiving fibers. The practical implications of modeling accuracy over this lateral offset region are discussed as they relate to the development of high-performance intensity-modulated optical displacement sensors. In particular, the sensitivity, linearity, resolution, and displacement range of a sensor are functions of the relative positioning of the sensor's transmitting and receiving fibers. Sensor architectures with high combinations of sensitivity and displacement range are discussed. It is concluded that the utility of the accurate model lies in its predictive capability and that this research could lead to an improved methodology for high-performance sensor design.
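
    For contrast with the paper's model (which is not reproduced in the abstract), the commonly used Gaussian approximation can be sketched with the standard Gaussian-beam formulas; the mode-field radius and wavelength below are illustrative values for a telecom single mode fiber.

      import numpy as np

      def gaussian_beam_intensity(r, z, w0=5.2e-6, wavelength=1.55e-6):
          # Relative intensity at axial distance z and lateral offset r from
          # the fiber end face, for mode-field radius w0 (illustrative values).
          zr = np.pi * w0 ** 2 / wavelength        # Rayleigh range
          w = w0 * np.sqrt(1.0 + (z / zr) ** 2)    # beam radius at z
          return (w0 / w) ** 2 * np.exp(-2.0 * r ** 2 / w ** 2)

      # Relative intensity 100 um from the fiber at a 50 um lateral offset:
      print(gaussian_beam_intensity(r=50e-6, z=100e-6))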

  20. Estimation of distributional parameters for censored trace level water quality data. 2. Verification and applications

    USGS Publications Warehouse

    Helsel, D.R.; Gilliom, R.J.

    1986-01-01

    Estimates of distributional parameters (mean, standard deviation, median, interquartile range) are often desired for data sets containing censored observations. Eight methods for estimating these parameters have been evaluated by R. J. Gilliom and D. R. Helsel (this issue) using Monte Carlo simulations. To verify those findings, the same methods are now applied to actual water quality data. The best method (lowest root-mean-squared error (rmse)) over all parameters, sample sizes, and censoring levels is log probability regression (LR), the method found best in the Monte Carlo simulations. Best methods for estimating moment or percentile parameters separately are also identical to the simulations. Reliability of these estimates can be expressed as confidence intervals using rmse and bias values taken from the simulation results. Finally, a new simulation study shows that best methods for estimating uncensored sample statistics from censored data sets are identical to those for estimating population parameters.
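
    A hedged sketch of the log probability regression (LR) idea for left-censored data: regress the logs of the detected values on normal scores of their plotting positions, impute the censored observations from the fitted line, and compute moment statistics from the completed sample; the plotting-position and distribution choices are illustrative.

      import numpy as np
      from scipy import stats

      def ros_estimates(detects, n_censored):
          # Assumes one detection limit lying below all detected values, so
          # the censored observations occupy the lowest plotting positions.
          data = np.sort(np.asarray(detects, dtype=float))
          n = data.size + n_censored
          pp = (np.arange(1, n + 1) - 0.375) / (n + 0.25)   # Blom positions
          z = stats.norm.ppf(pp)
          slope, intercept, *_ = stats.linregress(z[n_censored:], np.log(data))
          imputed = np.exp(intercept + slope * z[:n_censored])
          full = np.concatenate([imputed, data])
          return full.mean(), full.std(ddof=1)

      mean, sd = ros_estimates(detects=[0.8, 1.2, 2.5, 4.0, 9.1], n_censored=3)
      print(f"estimated mean = {mean:.2f}, sd = {sd:.2f}")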

  1. Geostatistical modeling of the spatial distribution of soil dioxin in the vicinity of an incinerator. 2. Verification and calibration study.

    PubMed

    Goovaerts, Pierre; Trinh, Hoa T; Demond, Avery H; Towey, Timothy; Chang, Shu-Chi; Gwinn, Danielle; Hong, Biling; Franzblau, Alfred; Garabrant, David; Gillespie, Brenda W; Lepkowski, James; Adriaens, Peter

    2008-05-15

    A key component in any investigation of cause-effect relationships between point source pollution, such as an incinerator, and human health is the availability of measurements and/or accurate models of exposure at the same scale or geography as the health data. Geostatistics allows one to simulate the spatial distribution of pollutant concentrations over various spatial supports while incorporating both field data and predictions of deterministic dispersion models. This methodology was used in a companion paper to identify the census blocks that have a high probability of exceeding a given level of dioxin TEQ (toxic equivalents) around an incinerator in Midland, MI. This geostatistical model, along with population data, provided guidance for the collection of 51 new soil data, which permits the verification of the geostatistical predictions, and calibration of the model. Each new soil measurement was compared to the set of 100 TEQ values simulated at the closest grid node. The correlation between the measured concentration and the averaged simulated value is moderate (0.44), and the actual concentrations are clearly overestimated in the vicinity of the plant property line. Nevertheless, probability intervals computed from simulated TEQ values provide an accurate model of uncertainty: the proportion of observations that fall within these intervals exceeds what is expected from the model. Simulation-based probability intervals are also narrower than the intervals derived from the global histogram of the data, which demonstrates the greater precision of the geostatistical approach. Log-normal ordinary kriging provided fairly similar estimation results for the small and well-sampled area used in this validation study; however, the model of uncertainty was not always accurate. The regression analysis and geostatistical simulation were then conducted using the combined set of 53 original and 51 new soil samples, leading to an updated model for the spatial distribution of

  2. Location, Identification, and Size Distribution of Depleted Uranium Grains in Reservoir Sediments

    NASA Astrophysics Data System (ADS)

    Lo, D.; Fleischer, R. L.; Albert, E. A.; Arnason, J. G.

    2006-05-01

    The location, isotopic composition, and size distribution of uranium-rich grains in sediment layers can be identified by analysis of etched particle tracks. Samples are pressed against track detectors, irradiated with thermal neutrons, and the detectors are chemically etched to reveal fission tracks. The total track abundance from the sample is a measure of the U-235 content; hence, if the bulk uranium (mostly U-238) has been measured, the two sets of results give the depletion or enrichment of the uranium. Each uranium-rich particle produces a sunburst of tracks where the number of tracks is proportional to the size of the particle. From 1958 to 1984, National Lead Industries processed depleted uranium (DU) at its plant in Colonie, NY (just west of Albany). Radioactive materials, principally DU, that were emitted from its exhaust stacks have been found 40 km away (Dietz, 1981). We have studied a sediment core taken by Arnason and Fletcher (2003, 2004) from a small body of water, the Patroon Reservoir, which is 1 km east-southeast of the National Lead plant. Examination of portions of that core demonstrates the usefulness of induced nuclear tracks (1) to locate microscopic high-uranium grains for further mineralogical study; (2) to determine the size distribution of uranium grains; and (3) to help analyze the average isotopic depletion of the uranium when total U concentrations are known. We infer that the size of DU particles in the sediment was controlled by both atmospheric transport from stack to reservoir and fluvial transport within the reservoir.

  3. Quantitative Verification of Dynamic Wedge Dose Distribution Using a 2D Ionization Chamber Array.

    PubMed

    Sahnoun, Tarek; Farhat, Leila; Mtibaa, Anis; Besbes, Mounir; Daoud, Jamel

    2015-10-01

    The accuracy of two calculation algorithms of the Eclipse 8.9 treatment planning system (TPS)--the anisotropic analytic algorithm (AAA) and pencil-beam convolution (PBC)--in modeling the enhanced dynamic wedge (EDW) was investigated. Measurements were carried out for 6 and 18 MV photon beams using a 2D ionization chamber array. Accuracy of the TPS was evaluated using a gamma index analysis with the following acceptance criteria for dose difference (DD) and distance to agreement (DTA): 3%/3 mm and 2%/2 mm. The TPS models the dose distribution accurately except for the 20×20 cm^2 field size with 60° and 45° wedge angles using PBC at 6 MV photon energy. For these latter fields, the pass rate was less than 90% and the mean value of gamma more than 0.5 at the 3%/3 mm acceptance criterion. In addition, an accuracy level of 2%/2 mm was achieved using AAA, with better agreement at 18 MV photon energy.

  4. Atomic scale verification of oxide-ion vacancy distribution near a single grain boundary in YSZ.

    PubMed

    An, Jihwan; Park, Joong Sun; Koh, Ai Leen; Lee, Hark B; Jung, Hee Joon; Schoonman, Joop; Sinclair, Robert; Gür, Turgut M; Prinz, Fritz B

    2013-01-01

    This study presents atomic scale characterization of grain boundary defect structure in a functional oxide with implications for a wide range of electrochemical and electronic behavior. Indeed, grain boundary engineering can alter transport and kinetic properties by several orders of magnitude. Here we report experimental observation and determination of oxide-ion vacancy concentration near the Σ13 (510)/[001] symmetric tilt grain-boundary of YSZ bicrystal using aberration-corrected TEM operated under negative spherical aberration coefficient imaging condition. We show significant oxygen deficiency due to segregation of oxide-ion vacancies near the grain-boundary core with half-width < 0.6 nm. Electron energy loss spectroscopy measurements with scanning TEM indicated increased oxide-ion vacancy concentration at the grain boundary core. Oxide-ion density distribution near a grain boundary simulated by molecular dynamics corroborated well with experimental results. Such column-by-column quantification of defect concentration in functional materials can provide new insights that may lead to engineered grain boundaries designed for specific functionalities.

  5. Atomic Scale Verification of Oxide-Ion Vacancy Distribution near a Single Grain Boundary in YSZ

    PubMed Central

    An, Jihwan; Park, Joong Sun; Koh, Ai Leen; Lee, Hark B.; Jung, Hee Joon; Schoonman, Joop; Sinclair, Robert; Gür, Turgut M.; Prinz, Fritz B.

    2013-01-01

    This study presents atomic scale characterization of grain boundary defect structure in a functional oxide with implications for a wide range of electrochemical and electronic behavior. Indeed, grain boundary engineering can alter transport and kinetic properties by several orders of magnitude. Here we report experimental observation and determination of oxide-ion vacancy concentration near the Σ13 (510)/[001] symmetric tilt grain-boundary of YSZ bicrystal using aberration-corrected TEM operated under negative spherical aberration coefficient imaging condition. We show significant oxygen deficiency due to segregation of oxide-ion vacancies near the grain-boundary core with half-width < 0.6 nm. Electron energy loss spectroscopy measurements with scanning TEM indicated increased oxide-ion vacancy concentration at the grain boundary core. Oxide-ion density distribution near a grain boundary simulated by molecular dynamics corroborated well with experimental results. Such column-by-column quantification of defect concentration in functional materials can provide new insights that may lead to engineered grain boundaries designed for specific functionalities. PMID:24042150

  6. Commercial milk distribution profiles and production locations. Hanford Environmental Dose Reconstruction Project

    SciTech Connect

    Deonigi, D.E.; Anderson, D.M.; Wilfert, G.L.

    1994-04-01

    The Hanford Environmental Dose Reconstruction (HEDR) Project was established to estimate radiation doses that people could have received from nuclear operations at the Hanford Site since 1944. For this period iodine-131 is the most important offsite contributor to radiation doses from Hanford operations. Consumption of milk from cows that ate vegetation contaminated by iodine-131 is the dominant radiation pathway for individuals who drank milk (Napier 1992). Information has been developed on commercial milk cow locations and commercial milk distribution during 1945 and 1951. The year 1945 was selected because during 1945 the largest amount of iodine-131 was released from Hanford facilities in a calendar year (Heeb 1993); therefore, 1945 was the year in which an individual was likely to have received the highest dose. The year 1951 was selected to provide data for comparing the changes that occurred in commercial milk flows (i.e., sources, processing locations, and market areas) between World War II and the post-war period. To estimate the doses people could have received from this milk flow, it is necessary to estimate the amount of milk people consumed, the source of the milk, the specific feeding regime used for milk cows, and the amount of iodine-131 contamination deposited on feed.

  7. Commercial milk distribution profiles and production locations. Hanford Environmental Dose Reconstruction Project

    SciTech Connect

    Deonigi, D.E.; Anderson, D.M.; Wilfert, G.L.

    1993-12-01

    The Hanford Environmental Dose Reconstruction (HEDR) Project was established to estimate radiation doses that people could have received from nuclear operations at the Hanford Site since 1944. For this period iodine-131 is the most important offsite contributor to radiation doses from Hanford operations. Consumption of milk from cows that ate vegetation contaminated by iodine-131 is the dominant radiation pathway for individuals who drank milk. Information has been developed on commercial milk cow locations and commercial milk distribution during 1945 and 1951. The year 1945 was selected because during 1945 the largest amount of iodine-131 was released from Hanford facilities in a calendar year; therefore, 1945 was the year in which an individual was likely to have received the highest dose. The year 1951 was selected to provide data for comparing the changes that occurred in commercial milk flows (i.e., sources, processing locations, and market areas) between World War II and the post-war period. To estimate the doses people could have received from this milk flow, it is necessary to estimate the amount of milk people consumed, the source of the milk, the specific feeding regime used for milk cows, and the amount of iodine-131 contamination deposited on feed.

  8. Verification of the efficiency of chemical disinfection and sanitation measures in in-building distribution systems.

    PubMed

    Lenz, J; Linke, S; Gemein, S; Exner, M; Gebel, J

    2010-06-01

    Previous investigations of biofilms generated in a silicone tube model have shown that the number of colony forming units (CFU) can reach 10^7/cm^2 and the total cell count (TCC) of microorganisms can be up to 10^8 cells/cm^2. The present study focuses on the situation in in-building distribution systems. Different chemical disinfectants were tested for their efficacy on drinking water biofilms in silicone tubes: free chlorine (electrochemically activated), chlorine dioxide, hydrogen peroxide (H2O2), silver, and fruit acids. With regard to the widely differing manufacturers' instructions for the use of their disinfectants, three different variations of the silicone tube model were developed to simulate practical use conditions: first, continuous treatment; second, intermittent treatment; and third, external disinfection treatment, with monitoring for possible biofilm formation with the Hygiene-Monitor. Working experience showed that it is important to know how to handle the individual disinfectants. Every active ingredient has its own optimal application with respect to concentration, exposure time, and physical parameters such as pH, temperature, or redox potential. When used correctly, all products tested were able to reduce the CFU to a value below the detection limit. Most of the active ingredients could not significantly reduce the TCC/cm^2, which means that viable microorganisms may still be present in the system. Thus the question arises: what happened to these cells? In some cases SEM images of the biofilm matrix after a successful disinfection still showed biofilm residues. According to these results, no general correlation between CFU/cm^2, TCC/cm^2 and the visualized biofilm matrix on the silicone tube surface (SEM) could be demonstrated after treatment with disinfectants.

  9. [Spatial correlation of active mounds locative distribution of Solenopsis invicta Buren polygyne populations].

    PubMed

    Lu, Yong-yue; Li, Ning-dong; Liang, Guang-wen; Zeng, Ling

    2007-01-01

    By using a geostatistical method, this paper studied the spatial distribution patterns of the active mounds of Solenopsis invicta Buren polygyne populations in Wuchuan and Shenzhen, and built spherical models of the interval distances and semivariances of the mounds. The semivariograms were described along the two directions of east-west and south-north and were clearly positively correlated with the interval distances, revealing that the active mounds in the study area were spatially dependent. The ranges of the 5 spherical models constructed for 5 sampling plots in Wuchuan were 9.1 m, 7.6 m, 23.5 m, 7.5 m and 14.5 m, respectively, with an average of 12.4 m. The mounds of any two plots within this range were significantly correlated. There was randomness in the spatial distribution of active mounds, with randomness indices (nugget/sill) of 0.7034, 0.9247, 0.4398, 1.1196 and 0.4624, respectively. In Shenzhen, the relationships between the interval distances and semivariances were described by 7 spherical models, and the ranges were 14.5 m, 11.2 m, 10.8 m, 17.6 m, 11.3 m, 9.9 m and 12.8 m, respectively, with an average of 12.6 m.
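
    The spherical semivariogram model referred to above has a standard closed form, rising from the nugget to the sill at the range a; a minimal sketch, with the randomness index computed as the nugget/sill ratio (parameter values hypothetical):

      import numpy as np

      def spherical_variogram(h, nugget, sill, a):
          # Semivariance rises from the nugget to the sill at the range a;
          # beyond a, samples are treated as spatially uncorrelated.
          h = np.asarray(h, dtype=float)
          inside = nugget + (sill - nugget) * (1.5 * h / a - 0.5 * (h / a) ** 3)
          return np.where(h <= a, inside, sill)

      # Randomness index used in the abstract: nugget / sill (values here
      # are hypothetical, not the paper's fits).
      nugget, sill, a = 0.35, 0.50, 12.4
      print("randomness index:", nugget / sill)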

  10. Aerosol number size distributions over a coastal semi urban location: Seasonal changes and ultrafine particle bursts.

    PubMed

    Babu, S Suresh; Kompalli, Sobhan Kumar; Moorthy, K Krishna

    2016-09-01

    Number-size distribution is one of the important microphysical properties of atmospheric aerosols that influence aerosol life cycle, aerosol-radiation interaction as well as aerosol-cloud interactions. Making use of one-yearlong measurements of aerosol particle number-size distributions (PNSD) over a broad size spectrum (~15-15,000nm) from a tropical coastal semi-urban location-Trivandrum (Thiruvananthapuram), the size characteristics, their seasonality and response to mesoscale and synoptic scale meteorology are examined. While the accumulation mode contributed mostly to the annual mean concentration, ultrafine particles (having diameter <100nm) contributed as much as 45% to the total concentration, and thus constitute a strong reservoir, that would add to the larger particles through size transformation. The size distributions were, in general, bimodal with well-defined modes in the accumulation and coarse regimes, with mode diameters lying in the range 141 to 167nm and 1150 to 1760nm respectively, in different seasons. Despite the contribution of the coarse sized particles to the total number concentration being meager, they contributed significantly to the surface area and volume, especially during transport of marine air mass highlighting the role of synoptic air mass changes. Significant diurnal variation occurred in the number concentrations, geometric mean diameters, which is mostly attributed to the dynamics of the local coastal atmospheric boundary layer and the effect of mesoscale land/sea breeze circulation. Bursts of ultrafine particles (UFP) occurred quite frequently, apparently during periods of land-sea breeze transitions, caused by the strong mixing of precursor-rich urban air mass with the cleaner marine air mass; the resulting turbulence along with boundary layer dynamics aiding the nucleation. These ex-situ particles were observed at the surface due to the transport associated with boundary layer dynamics. The particle growth rates from

  11. Aerosol number size distributions over a coastal semi urban location: Seasonal changes and ultrafine particle bursts.

    PubMed

    Babu, S Suresh; Kompalli, Sobhan Kumar; Moorthy, K Krishna

    2016-09-01

    Number-size distribution is one of the important microphysical properties of atmospheric aerosols that influence aerosol life cycle, aerosol-radiation interaction as well as aerosol-cloud interactions. Making use of one-yearlong measurements of aerosol particle number-size distributions (PNSD) over a broad size spectrum (~15-15,000nm) from a tropical coastal semi-urban location-Trivandrum (Thiruvananthapuram), the size characteristics, their seasonality and response to mesoscale and synoptic scale meteorology are examined. While the accumulation mode contributed mostly to the annual mean concentration, ultrafine particles (having diameter <100nm) contributed as much as 45% to the total concentration, and thus constitute a strong reservoir, that would add to the larger particles through size transformation. The size distributions were, in general, bimodal with well-defined modes in the accumulation and coarse regimes, with mode diameters lying in the range 141 to 167nm and 1150 to 1760nm respectively, in different seasons. Despite the contribution of the coarse sized particles to the total number concentration being meager, they contributed significantly to the surface area and volume, especially during transport of marine air mass highlighting the role of synoptic air mass changes. Significant diurnal variation occurred in the number concentrations, geometric mean diameters, which is mostly attributed to the dynamics of the local coastal atmospheric boundary layer and the effect of mesoscale land/sea breeze circulation. Bursts of ultrafine particles (UFP) occurred quite frequently, apparently during periods of land-sea breeze transitions, caused by the strong mixing of precursor-rich urban air mass with the cleaner marine air mass; the resulting turbulence along with boundary layer dynamics aiding the nucleation. These ex-situ particles were observed at the surface due to the transport associated with boundary layer dynamics. The particle growth rates from

  12. Relation Between Sprite Distribution and Source Locations of VHF Pulses Derived From JEM- GLIMS Measurements

    NASA Astrophysics Data System (ADS)

    Sato, Mitsuteru; Mihara, Masahiro; Ushio, Tomoo; Morimoto, Takeshi; Kikuchi, Hiroshi; Adachi, Toru; Suzuki, Makoto; Yamazaki, Atsushi; Takahashi, Yukihiro

    2015-04-01

    JEM-GLIMS has been conducting comprehensive nadir observations of lightning and TLEs using optical instruments and electromagnetic wave receivers since November 2012. Between November 20, 2012 and November 30, 2014, JEM-GLIMS succeeded in detecting 5,048 lightning events. A total of 567 of these 5,048 lightning events were TLEs, mostly elves. To identify sprite occurrences in the transient optical flash data, it is necessary to perform the following data analysis: (1) subtraction of the appropriately scaled wideband camera data from the narrowband camera data; (2) calculation of the intensity ratio between different spectrophotometer channels; and (3) estimation of the polarity and CMC of the parent CG discharges using ground-based ELF measurement data. From a synthetic comparison of these results, it is confirmed that JEM-GLIMS succeeded in detecting sprite events. The VHF receiver (VITF) onboard JEM-GLIMS uses two patch-type antennas separated by a 1.6-m interval and can detect VHF pulses emitted by lightning discharges in the 70-100 MHz frequency range. Using both an interferometric technique and a group-delay technique, we can estimate the source locations of VHF pulses excited by lightning discharges. In the event detected at 06:41:15.68565 UT on June 12, 2014 over central North America, the sprite was displaced horizontally by 20 km from the peak location of the parent lightning emission. In this event, a total of 180 VHF pulses were simultaneously detected by VITF. From detailed analysis of these VHF pulse data, we find that the majority of the source locations lay near the area of dim lightning emission, which may imply that the VHF pulses were associated with the in-cloud lightning current. At the presentation, we will show a detailed comparison between the spatiotemporal characteristics of the sprite emission and the source locations of VHF pulses excited by the parent lightning
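    The interferometric step reduces to converting the phase difference measured between the two VITF antennas into an angle of arrival. A minimal plane-wave sketch (the 1.6-m baseline and 70-100 MHz band are from the abstract; the example frequency and phase are made-up inputs):

```python
import numpy as np

C = 299_792_458.0     # speed of light (m/s)
BASELINE = 1.6        # VITF antenna separation (m)

def arrival_angle_deg(phase_diff_rad, freq_hz, d=BASELINE):
    """Angle of arrival from an interferometric phase difference,
    assuming a plane wave and an unambiguous phase measurement."""
    lam = C / freq_hz
    s = phase_diff_rad * lam / (2 * np.pi * d)
    return np.degrees(np.arcsin(np.clip(s, -1.0, 1.0)))

print(arrival_angle_deg(np.pi / 2, 80e6))   # ~35.8 deg off boresight
```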

  13. Redshift distributions of galaxies in the Dark Energy Survey Science Verification shear catalogue and implications for weak lensing

    NASA Astrophysics Data System (ADS)

    Bonnett, C.; Troxel, M. A.; Hartley, W.; Amara, A.; Leistedt, B.; Becker, M. R.; Bernstein, G. M.; Bridle, S. L.; Bruderer, C.; Busha, M. T.; Carrasco Kind, M.; Childress, M. J.; Castander, F. J.; Chang, C.; Crocce, M.; Davis, T. M.; Eifler, T. F.; Frieman, J.; Gangkofner, C.; Gaztanaga, E.; Glazebrook, K.; Gruen, D.; Kacprzak, T.; King, A.; Kwan, J.; Lahav, O.; Lewis, G.; Lidman, C.; Lin, H.; MacCrann, N.; Miquel, R.; O'Neill, C. R.; Palmese, A.; Peiris, H. V.; Refregier, A.; Rozo, E.; Rykoff, E. S.; Sadeh, I.; Sánchez, C.; Sheldon, E.; Uddin, S.; Wechsler, R. H.; Zuntz, J.; Abbott, T.; Abdalla, F. B.; Allam, S.; Armstrong, R.; Banerji, M.; Bauer, A. H.; Benoit-Lévy, A.; Bertin, E.; Brooks, D.; Buckley-Geer, E.; Burke, D. L.; Capozzi, D.; Carnero Rosell, A.; Carretero, J.; Cunha, C. E.; D'Andrea, C. B.; da Costa, L. N.; DePoy, D. L.; Desai, S.; Diehl, H. T.; Dietrich, J. P.; Doel, P.; Fausti Neto, A.; Fernandez, E.; Flaugher, B.; Fosalba, P.; Gerdes, D. W.; Gruendl, R. A.; Honscheid, K.; Jain, B.; James, D. J.; Jarvis, M.; Kim, A. G.; Kuehn, K.; Kuropatkin, N.; Li, T. S.; Lima, M.; Maia, M. A. G.; March, M.; Marshall, J. L.; Martini, P.; Melchior, P.; Miller, C. J.; Neilsen, E.; Nichol, R. C.; Nord, B.; Ogando, R.; Plazas, A. A.; Reil, K.; Romer, A. K.; Roodman, A.; Sako, M.; Sanchez, E.; Santiago, B.; Smith, R. C.; Soares-Santos, M.; Sobreira, F.; Suchyta, E.; Swanson, M. E. C.; Tarle, G.; Thaler, J.; Thomas, D.; Vikram, V.; Walker, A. R.; Dark Energy Survey Collaboration

    2016-08-01

    We present photometric redshift estimates for galaxies used in the weak lensing analysis of the Dark Energy Survey Science Verification (DES SV) data. Four model- or machine-learning-based photometric redshift methods—annz2, bpz calibrated against BCC-Ufig simulations, skynet, and tpz—are analyzed. For training, calibration, and testing of these methods, we construct a catalogue of spectroscopically confirmed galaxies matched against DES SV data. The performance of the methods is evaluated against the matched spectroscopic catalogue, focusing on metrics relevant for weak lensing analyses, with additional validation against COSMOS photo-z's. From the galaxies in the DES SV shear catalogue, which have mean redshift 0.72 ± 0.01 over the range 0.3 < z < 1.3, we construct three tomographic bins. We propagate the uncertainty in the redshift distributions through to their impact on cosmological parameters estimated with cosmic shear, and find that they cause shifts in the value of σ8 of approximately 3%. This shift is within the one-sigma statistical errors on σ8 for the DES SV shear catalogue. We further study the potential impact of systematic differences on the critical surface density, Σcrit, finding levels of bias safely less than the statistical power of the DES SV data. We recommend a final Gaussian prior for the photo-z bias in the mean of n(z) of width 0.05 for each of the three tomographic bins, and show that this is a sufficient bias model for the corresponding cosmology analysis.
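    As a toy illustration of the central metric, the bias in the mean of n(z) per tomographic bin is what the recommended 0.05-wide Gaussian prior is meant to bracket. The bin edges and photo-z scatter below are invented for illustration and are not the DES SV values:

```python
import numpy as np

def mean_z_bias(z_photo, z_spec, bin_edges):
    """Per-bin bias in the mean of n(z): <z_photo> - <z_spec>,
    with galaxies assigned to bins by their photo-z."""
    return np.array([z_photo[m].mean() - z_spec[m].mean()
                     for lo, hi in zip(bin_edges[:-1], bin_edges[1:])
                     for m in [(z_photo >= lo) & (z_photo < hi)]])

rng = np.random.default_rng(0)
z_spec = rng.uniform(0.3, 1.3, 10_000)
z_photo = z_spec + rng.normal(0.0, 0.1, z_spec.size)   # toy Gaussian scatter
print(mean_z_bias(z_photo, z_spec, [0.3, 0.55, 0.85, 1.3]))
```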

  14. System performance and performance enhancement relative to element position location errors for distributed linear antenna arrays

    NASA Astrophysics Data System (ADS)

    Adrian, Andrew

    For the most part, antenna phased arrays have traditionally been composed of antenna elements that are very carefully and precisely placed in periodic grid structures. Additionally, the relative positions of the elements to each other are typically mechanically fixed as well as possible. There is never an assumption that the relative positions of the elements are a function of time or of some random process. In fact, every array design is typically analyzed for the element position tolerances necessary to meet performance requirements such as directivity, beamwidth, sidelobe level, and beam scanning capability. Consider instead an antenna array composed of several radiating elements in which the position of each element is not rigidly and mechanically fixed as in a traditional array. This is not to say that the element placement structure is ignored or irrelevant, but each element is not always in its relative, desired location. Relative element positioning would be analogous to a flock of birds in flight or a swarm of insects: they tend to maintain a near-fixed position within the group, but not always. In the antenna array analog, it would be desirable to maintain a fixed formation, but due to other random processes it is not always possible to maintain perfect formation. This type of antenna array is referred to as a distributed antenna array. A distributed antenna array's inability to maintain perfect formation causes degradations in the array factor pattern. Directivity, beamwidth, sidelobe level, and beam pointing error are all adversely affected by element relative position error. This impact is studied as a function of element relative position error for linear antenna arrays. The study is performed over several nominal array element spacings, from lambda to lambda, several sidelobe levels (20 to 50 dB), and multiple array illumination tapers. Knowing the variation in performance, work is also performed to utilize a minimum
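    The degradations described can be reproduced with the standard array-factor formula by perturbing the element positions. A minimal sketch (the element count, taper, and rms error level are illustrative assumptions, not the dissertation's cases):

```python
import numpy as np

def array_factor_db(x_m, weights, theta_deg, lam=1.0):
    """Normalized array factor (dB) of a linear array with elements at x_m."""
    k = 2 * np.pi / lam
    steer = np.exp(1j * k * np.outer(np.sin(np.radians(theta_deg)), x_m))
    af = np.abs(steer @ weights)
    return 20 * np.log10(af / af.max())

n, lam = 32, 1.0
x_nominal = np.arange(n) * lam / 2                    # half-wavelength spacing
w = np.hamming(n)                                     # low-sidelobe taper
theta = np.linspace(-90.0, 90.0, 2001)

rng = np.random.default_rng(1)
x_perturbed = x_nominal + rng.normal(0.0, 0.05 * lam, n)  # rms position error
af_ideal = array_factor_db(x_nominal, w, theta, lam)
af_pert = array_factor_db(x_perturbed, w, theta, lam)     # raised sidelobes
```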

  15. Genomic distribution of AFLP markers relative to gene locations for different eukaryotic species

    PubMed Central

    2013-01-01

    Background Amplified fragment length polymorphism (AFLP) markers are frequently used for a wide range of studies, such as genome-wide mapping, population genetic diversity estimation, hybridization and introgression studies, phylogenetic analyses, and detection of signatures of selection. An important issue to be addressed for some of these fields is the distribution of the markers across the genome, particularly in relation to gene sequences. Results Using in-silico restriction fragment analysis of the genomes of nine eukaryotic species we characterise the distribution of AFLP fragments across the genome and, particularly, in relation to gene locations. First, we identify the physical position of markers across the chromosomes of all species. An observed accumulation of fragments around (peri)centromeric regions in some species is produced by repeated sequences, and this accumulation disappears when AFLP bands rather than fragments are considered. Second, we calculate the percentage of AFLP markers positioned within gene sequences. For the typical EcoRI/MseI enzyme pair, this ranges between 28 and 87% and is usually larger than that expected by chance because of the higher GC content of gene sequences relative to intergenic ones. In agreement with this, the use of enzyme pairs with GC-rich restriction sites substantially increases the above percentages. For example, using the enzyme system SacI/HpaII, 86% of AFLP markers are located within gene sequences in A. thaliana, and 100% of markers in Plasmodium falciparum. We further find that for a typical trait controlled by 50 genes of average size, if 1000 AFLPs are used in a study, the number of those within 1 kb distance from any of the genes would be only about 1–2, and only about 50% of the genes would have markers within that distance. Conclusions The high coverage of AFLP markers across the genomes and the high proportion of markers within or close to gene sequences make them suitable for genome scans and
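    The in-silico digestion itself is plain string processing. A minimal sketch for the EcoRI/MseI pair (cut offsets follow the standard G^AATTC and T^TAA recognition sites; real AFLP scoring keeps only fragments with one EcoRI and one MseI end, which this sketch does not filter):

```python
import re

ECORI, MSEI = "GAATTC", "TTAA"   # recognition sequences

def double_digest(seq):
    """Fragments from a complete EcoRI + MseI double digest of seq."""
    cuts = sorted({0, len(seq)}
                  | {m.start() + 1 for m in re.finditer(ECORI, seq)}  # G^AATTC
                  | {m.start() + 1 for m in re.finditer(MSEI, seq)})  # T^TAA
    return [seq[a:b] for a, b in zip(cuts[:-1], cuts[1:])]

print(double_digest("CCGAATTCAGGTTAACGGAATTCTT"))
# ['CCG', 'AATTCAGGT', 'TAACGG', 'AATTCTT']
```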

  16. Impact detection, location, and characterization using spatially weighted distributed fiber optic sensors

    NASA Astrophysics Data System (ADS)

    Spillman, William B., Jr.; Huston, Dryver R.

    1996-11-01

    The ability to detect, localize and characterize impacts in real time is of critical importance for the safe operation of aircraft, spacecraft and other vehicles, particularly in light of the increasing use of high performance composite materials with unconventional and often catastrophic failure modes. Although a number of systems based on fiber optic sensors have been proposed or demonstrated, they have generally proved not to be useful due to difficulty of implementation, limited accuracy or high cost. In this paper, we present the results of an investigation using two spatially weighted distributed fiber optic sensors to detect, localize and characterize impacts along an extended linear region. By having the sensors co-located with one having sensitivity to impacts ranging from low to high along its length while the other sensor has sensitivity ranging from high to low along the same path, impacts can be localized and their magnitudes determined using a very simple algorithm. A theoretical description of the techniques is given and compared with experimental results.
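    The "very simple algorithm" follows directly from the complementary weighting: if the two co-located sensors ramp linearly in opposite directions, the ratio of their outputs encodes position and their sum encodes magnitude. A sketch under that idealized linear-weighting assumption:

```python
def locate_impact(s1, s2, length):
    """Impact position and magnitude from two co-located sensors with
    opposite linear sensitivity profiles: s1 = m*x/L, s2 = m*(1 - x/L)."""
    total = s1 + s2                 # magnitude m
    if total <= 0:
        raise ValueError("no impact signal")
    return length * s1 / total, total

x, m = locate_impact(0.9, 2.1, length=3.0)   # -> x = 0.9 m, magnitude 3.0
```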

  17. Optimal sensor placement for leak location in water distribution networks using genetic algorithms.

    PubMed

    Casillas, Myrna V; Puig, Vicenç; Garza-Castañón, Luis E; Rosich, Albert

    2013-11-04

    This paper proposes a new sensor placement approach for leak location in water distribution networks (WDNs). The sensor placement problem is formulated as an integer optimization problem. The optimization criterion consists in minimizing the number of non-isolable leaks according to the isolability criteria introduced. Because of the large size and non-linear integer nature of the resulting optimization problem, genetic algorithms (GAs) are used as the solution approach. The obtained results are compared with a semi-exhaustive search method with higher computational effort, proving that GA allows one to find near-optimal solutions with less computational load. Moreover, three ways of increasing the robustness of the GA-based sensor placement method have been proposed using a time horizon analysis, a distance-based scoring and considering different leaks sizes. A great advantage of the proposed methodology is that it does not depend on the isolation method chosen by the user, as long as it is based on leak sensitivity analysis. Experiments in two networks allow us to evaluate the performance of the proposed approach.
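    A stripped-down illustration of the GA formulation: an individual is a candidate sensor set, and the fitness to minimize counts leak pairs whose residual signatures on those sensors are identical (hence non-isolable). The binary sensitivity matrix below is random toy data, and crossover is omitted for brevity:

```python
import numpy as np

rng = np.random.default_rng(0)
N_NODES, N_SENSORS = 12, 3
S = rng.random((40, N_NODES)) > 0.5          # toy leak-sensitivity signatures

def non_isolable(sensors):
    """Leak pairs sharing an identical signature on the chosen sensors."""
    _, counts = np.unique(S[:, np.sort(sensors)], axis=0, return_counts=True)
    return int(sum(c * (c - 1) // 2 for c in counts))

def mutate(ind):
    child = ind.copy()
    free = np.setdiff1d(np.arange(N_NODES), child)
    child[rng.integers(N_SENSORS)] = rng.choice(free)  # swap one sensor
    return child

population = [rng.choice(N_NODES, N_SENSORS, replace=False) for _ in range(30)]
for _ in range(100):                          # mutation-only evolutionary loop
    population.sort(key=non_isolable)
    population = population[:15] + [mutate(p) for p in population[:15]]

best = min(population, key=non_isolable)
print(best, non_isolable(best))
```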

  18. Optimal Sensor Placement for Leak Location in Water Distribution Networks Using Genetic Algorithms

    PubMed Central

    Casillas, Myrna V.; Puig, Vicenç; Garza-Castañón, Luis E.; Rosich, Albert

    2013-01-01

    This paper proposes a new sensor placement approach for leak location in water distribution networks (WDNs). The sensor placement problem is formulated as an integer optimization problem. The optimization criterion consists in minimizing the number of non-isolable leaks according to the isolability criteria introduced. Because of the large size and non-linear integer nature of the resulting optimization problem, genetic algorithms (GAs) are used as the solution approach. The obtained results are compared with a semi-exhaustive search method with higher computational effort, proving that GA allows one to find near-optimal solutions with less computational load. Moreover, three ways of increasing the robustness of the GA-based sensor placement method have been proposed using a time horizon analysis, a distance-based scoring and considering different leaks sizes. A great advantage of the proposed methodology is that it does not depend on the isolation method chosen by the user, as long as it is based on leak sensitivity analysis. Experiments in two networks allow us to evaluate the performance of the proposed approach. PMID:24193099

  19. Distribution and Neighborhood Correlates of Sober Living House Locations in Los Angeles.

    PubMed

    Mericle, Amy A; Karriker-Jaffe, Katherine J; Gupta, Shalika; Sheridan, David M; Polcin, Doug L

    2016-09-01

    Sober living houses (SLHs) are alcohol and drug-free living environments for individuals in recovery. The goal of this study was to map the distribution of SLHs in Los Angeles (LA) County, California (N = 260) and examine neighborhood correlates of SLH density. Locations of SLHs were geocoded and linked to tract-level Census data as well as to publicly available information on alcohol outlets and recovery resources. Neighborhoods with SLHs differed from neighborhoods without them on measures of socioeconomic disadvantage and accessibility of recovery resources. In multivariate, spatially lagged hurdle models stratified by monthly fees charged (less than $1400/month vs. $1400/month or greater), minority composition, and accessibility of treatment were associated with the presence of affordable SLHs. Accessibility of treatment was also associated with the number of affordable SLHs in those neighborhoods. Higher median housing value and accessibility of treatment were associated with whether a neighborhood had high-cost SLHs, and lower population density was associated with the number of high-cost SLHs in those neighborhoods. Neighborhood factors are associated with the availability of SLHs, and research is needed to better understand how these factors affect resident outcomes, as well as how SLHs may affect neighborhoods over time. PMID:27628590

  20. Distribution of Impact Locations and Velocities of Earth Meteorites on the Moon

    NASA Astrophysics Data System (ADS)

    Armstrong, John C.

    2010-12-01

    Following the analytical work of Armstrong et al. (Icarus 160:183-196, 2002), we detail an expanded N-body calculation of the direct transfer of terrestrial material to the Moon during a giant impact. By simulating 1.4 million particles over a range of launch velocities and ejecta angles, we have derived a map of the impact velocities, impact angles, and probable impact sites on the Moon over the last 4 billion years. The maps indicate that the impacts with the highest vertical impact speeds are concentrated on the leading edge, with lower velocity/higher-angle impacts more numerous on the Moon's trailing edge. While this enhanced simulation indicates the estimated globally averaged direct transfer fraction reported in Armstrong et al. (Icarus 160:183-196, 2002) is overestimated by a factor of 3-6, local concentrations can reach or exceed the previously published estimate. The most favorable location for large quantities of low velocity terrestrial material is 50° W, 85° S, with 8.4 times more impacts per square kilometer than the lunar surface average. This translates to 300-500 kg km(-2), compared to 200 kg km(-2) from the previous estimate. The maps also indicate a significant amount of material impacting elsewhere in the polar regions, especially near the South Pole-Aitken basin, a likely target for sample return in the near future. The magnitudes of the impact speeds cluster near 3 km/s, but there is a bimodal distribution in impact angles, leading to 43% of impacts with very low (<1 km/s) vertical impact speeds. This, combined with the enhanced surface density of meteorites in specific regions, increases the likelihood of weakly shocked terrestrial material being identified and recovered on the Moon.

  1. Lexical distributional cues, but not situational cues, are readily used to learn abstract locative verb-structure associations.

    PubMed

    Twomey, Katherine E; Chang, Franklin; Ambridge, Ben

    2016-08-01

    Children must learn the structural biases of locative verbs in order to avoid making overgeneralisation errors (e.g., *I filled water into the glass). It is thought that they use linguistic and situational information to learn verb classes that encode structural biases. In addition to situational cues, we examined whether children and adults could use the lexical distribution of nouns in the post-verbal noun phrase of transitive utterances to assign novel verbs to locative classes. In Experiment 1, children and adults used lexical distributional cues to assign verb classes, but were unable to use situational cues appropriately. In Experiment 2, adults generalised distributionally-learned classes to novel verb arguments, demonstrating that distributional information can cue abstract verb classes. Taken together, these studies show that human language learners can use a lexical distributional mechanism that is similar to that used by computational linguistic systems that use large unlabelled corpora to learn verb meaning. PMID:27183399

  2. Spatial distribution of soil water repellency in a grassland located in Lithuania

    NASA Astrophysics Data System (ADS)

    Pereira, Paulo; Novara, Agata

    2014-05-01

    Soil water repellency (SWR) is recognized to be very heterogeneous in time and space and depends on soil type, climate, land use, vegetation and season (Doerr et al., 2002). It prevents or reduces water infiltration, with important impacts on soil hydrology, influencing the mobilization and transport of substances into the soil profile. The reduced infiltration increases surface runoff and soil erosion. SWR also reduces seed emergence and plant growth due to the reduced amount of water in the root zone. Positive aspects of SWR are the increase of soil aggregate stability, organic carbon sequestration and reduction of water evaporation (Mataix-Solera and Doerr, 2004; Diehl, 2013). SWR depends on the soil aggregate size. In fire-affected areas it was found that SWR was more persistent in small-size aggregates (Mataix-Solera and Doerr, 2004; Jordan et al., 2011). However, little information is available about the spatial distribution of SWR according to soil aggregate size. The aim of this work is to study the spatial distribution of SWR in fine earth (<2 mm) and different aggregate sizes: 2-1 mm, 1-0.5 mm, 0.5-0.25 mm and <0.25 mm. The studied area is located near Vilnius (Lithuania) at 54° 42' N, 25° 08' E, 158 masl. A plot of 400 m2 (20 x 20 m, with 5 m spacing between sampling points) was established and 25 soil samples were collected from the topsoil (0-5 cm) and taken to the laboratory. Prior to SWR assessment, the samples were air dried. The persistence of SWR was analysed according to the Water Drop Penetration Time method, which involves placing three drops of distilled water onto the soil surface and registering the time in seconds (s) required for complete penetration of the drop (Wessel, 1988). The data did not respect a Gaussian distribution, so they were log-transformed to meet normality requirements. Spatial interpolations were carried out using Ordinary Kriging. The results showed that SWR in fine earth was on average 2.88 s (coefficient of variation (CV%) = 44.62), 2
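    Ordinary kriging of the log-transformed WDPT values comes down to solving one small linear system per prediction point. A minimal sketch (the spherical-variogram parameters are placeholders, not values fitted to the plot):

```python
import numpy as np

def ok_predict(xy, z, x0, sill=1.0, a=10.0, nugget=0.0):
    """Ordinary kriging prediction at x0 with a spherical variogram."""
    def gamma(h):
        h = np.asarray(h, float)
        g = nugget + (sill - nugget) * (1.5 * h / a - 0.5 * (h / a) ** 3)
        return np.where(h >= a, sill, np.where(h == 0, 0.0, g))
    n = len(xy)
    d = np.linalg.norm(xy[:, None] - xy[None, :], axis=-1)
    A = np.ones((n + 1, n + 1)); A[:n, :n] = gamma(d); A[-1, -1] = 0.0
    b = np.ones(n + 1); b[:n] = gamma(np.linalg.norm(xy - x0, axis=1))
    lam = np.linalg.solve(A, b)[:n]            # kriging weights
    return lam @ z

xy = np.array([(i * 5.0, j * 5.0) for i in range(5) for j in range(5)])  # 20x20 m
z = np.log(np.random.default_rng(1).uniform(1.0, 6.0, 25))  # toy log-WDPT (s)
print(ok_predict(xy, z, np.array([7.5, 12.5])))
```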

  3. Responses of European precipitation distributions and regimes to different blocking locations

    NASA Astrophysics Data System (ADS)

    Sousa, Pedro M.; Trigo, Ricardo M.; Barriopedro, David; Soares, Pedro M. M.; Ramos, Alexandre M.; Liberato, Margarida L. R.

    2016-04-01

    In this work we performed an analysis on the impacts of blocking episodes on seasonal and annual European precipitation and the associated physical mechanisms. Distinct domains were considered in detail taking into account different blocking center positions spanning between the Atlantic and western Russia. Significant positive precipitation anomalies are found for southernmost areas while generalized negative anomalies (up to 75 % in some areas) occur in large areas of central and northern Europe. This dipole of anomalies is reversed when compared to that observed during episodes of strong zonal flow conditions. We illustrate that the location of the maximum precipitation anomalies follows quite well the longitudinal positioning of the blocking centers and discuss regional and seasonal differences in the precipitation responses. To better understand the precipitation anomalies, we explore the blocking influence on cyclonic activity. The results indicate a split of the storm-tracks north and south of blocking systems, leading to an almost complete reduction of cyclonic centers in northern and central Europe and increases in southern areas, where cyclone frequency doubles during blocking episodes. However, the underlying processes conducive to the precipitation anomalies are distinct between northern and southern European regions, with a significant role of atmospheric instability in southern Europe, and moisture availability as the major driver at higher latitudes. This distinctive underlying process is coherent with the characteristic patterns of latent heat release from the ocean associated with blocked and strong zonal flow patterns. We also analyzed changes in the full range of the precipitation distribution of several regional sectors during blocked and zonal days. Results show that precipitation reductions in the areas under direct blocking influence are driven by a substantial drop in the frequency of moderate rainfall classes. Contrarily, southwards of

  4. Where exactly am I? Self-location judgements distribute between head and torso.

    PubMed

    Alsmith, Adrian J T; Longo, Matthew R

    2014-02-01

    I am clearly located where my body is located. But is there one particular place inside my body where I am? Recent results have provided apparently contradictory findings about this question. Here, we addressed this issue using a more direct approach than has been used in previous studies. Using a simple pointing task, we asked participants to point directly at themselves, either by manual manipulation of the pointer whilst blindfolded or by visually discerning when the pointer was in the correct position. Self-location judgements in haptic and visual modalities were highly similar, and were clearly modulated by the starting location of the pointer. Participants most frequently chose to point to one of two likely regions, the upper face or the upper torso, according to which they reached first. These results suggest that while the experienced self is not spread out homogeneously across the entire body, nor is it localised in any single point. Rather, two distinct regions, the upper face and upper torso, appear to be judged as where "I" am. PMID:24457520

  5. On intra-supply chain system with an improved distribution plan, multiple sales locations and quality assurance.

    PubMed

    Chiu, Singa Wang; Huang, Chao-Chih; Chiang, Kuo-Wei; Wu, Mei-Fang

    2015-01-01

    Transnational companies, operating in extremely competitive global markets, always seek to lower different operating costs, such as inventory holding costs in their intra-supply chain system. This paper incorporates a cost-reducing product distribution policy into an intra-supply chain system with multiple sales locations and quality assurance studied by [Chiu et al., Expert Syst Appl, 40:2669-2676, (2013)]. Under the proposed cost-reducing distribution policy, an added initial delivery of end items is distributed to multiple sales locations to meet their demand during the production unit's uptime and rework time. After rework, when the remaining production lot goes through quality assurance, n fixed-quantity installments of finished items are then transported to sales locations at a fixed time interval. Mathematical modeling and optimization techniques are used to derive closed-form optimal operating policies for the proposed system. Furthermore, the study demonstrates significant savings in stock holding costs for both the production unit and sales locations. The alternative of outsourcing the product delivery task to an external distributor is analyzed to assist managerial decision making in potential outsourcing issues, in order to facilitate further reduction in operating costs. PMID:26576330

  6. On intra-supply chain system with an improved distribution plan, multiple sales locations and quality assurance.

    PubMed

    Chiu, Singa Wang; Huang, Chao-Chih; Chiang, Kuo-Wei; Wu, Mei-Fang

    2015-01-01

    Transnational companies, operating in extremely competitive global markets, always seek to lower different operating costs, such as inventory holding costs in their intra-supply chain system. This paper incorporates a cost-reducing product distribution policy into an intra-supply chain system with multiple sales locations and quality assurance studied by [Chiu et al., Expert Syst Appl, 40:2669-2676, (2013)]. Under the proposed cost-reducing distribution policy, an added initial delivery of end items is distributed to multiple sales locations to meet their demand during the production unit's uptime and rework time. After rework, when the remaining production lot goes through quality assurance, n fixed-quantity installments of finished items are then transported to sales locations at a fixed time interval. Mathematical modeling and optimization techniques are used to derive closed-form optimal operating policies for the proposed system. Furthermore, the study demonstrates significant savings in stock holding costs for both the production unit and sales locations. The alternative of outsourcing the product delivery task to an external distributor is analyzed to assist managerial decision making in potential outsourcing issues, in order to facilitate further reduction in operating costs.

  7. Distributed optical fiber sensor for spatial location of polarization mode coupling

    NASA Astrophysics Data System (ADS)

    Cokgor, Ilkan; Handerek, Vincent A.; Rogers, Alan J.

    1993-03-01

    Transverse stress applied to a highly birefringent fiber at an arbitrary angle (other than 0 or 90 degrees) to the fiber birefringence axes causes rotation of the birefringence axes and changes the beat length of the fiber in that section. If one of the polarization modes is excited at the input, coupling of light from one mode to the other will be observed at a stress point. The presentation describes a method for determining the locations of discrete mode coupling points spaced along a polarization-maintaining fiber using a pump-probe architecture based on the optical Kerr effect. Probe light experiences coupling at different stress locations. Counterpropagating strong pump light also experiences coupling while inducing additional birefringence, and changing the polarization state of the probe at the output. This system may be made temperature independent by introducing a phase tracking/triggering system. The advantages and limitations of this technique are described.

  8. Effects of cluster location and cluster distribution on performance on the traveling salesman problem.

    PubMed

    MacGregor, James N

    2015-10-01

    Research on human performance in solving traveling salesman problems typically uses point sets as stimuli, and most models have proposed a processing stage at which stimulus dots are clustered. However, few empirical studies have investigated the effects of clustering on performance. In one recent study, researchers compared the effects of clustered, random, and regular stimuli, and concluded that clustering facilitates performance (Dry, Preiss, & Wagemans, 2012). Another study suggested that these results may have been influenced by the location rather than the degree of clustering (MacGregor, 2013). Two experiments are reported that mark an attempt to disentangle these factors. The first experiment tested several combinations of degree of clustering and cluster location, and revealed mixed evidence that clustering influences performance. In a second experiment, both factors were varied independently, showing that they interact. The results are discussed in terms of the importance of clustering effects, in particular, and perceptual factors, in general, during performance of the traveling salesman problem.

  9. The hemodynamic effects of the LVAD outflow cannula location on the thrombi distribution in the aorta: A primary numerical study.

    PubMed

    Zhang, Yage; Gao, Bin; Yu, Chang

    2016-09-01

    Although a growing number of patients undergo LVAD implantation for heart failure treatment, thrombi remain a devastating complication for patients supported with an LVAD. The LVAD outflow cannula location and the thrombi generation sources were hypothesized to affect the thrombi distribution in the aorta. To test this hypothesis, numerical studies were conducted using computational fluid dynamics (CFD) theory. Two anastomotic configurations, in which the LVAD outflow cannula is anastomosed to the anterior or the lateral ascending aortic wall (named the anterior and lateral configurations, respectively), were designed. Particles whose sizes are the same as those of thrombi were released at the LVAD outflow cannula and at the aortic valve (named thrombiP and thrombiL, respectively) to calculate the distribution of thrombi. The simulation results demonstrate that the thrombi distribution in the aorta is significantly affected by the LVAD outflow cannula location. In the anterior configuration, the probability of thrombi entering the three branches is 23.60%, while in the lateral configuration it is 36.68%. Similarly, in the anterior configuration, the probabilities of thrombi entering the brachiocephalic artery, left common carotid artery, and left subclavian artery are 8.51%, 9.64%, and 5.45%, respectively, while in the lateral configuration they are 11.39%, 3.09%, and 22.20%, respectively. Moreover, the origins of thrombi affect their distributions in the aorta. In the anterior configuration, thrombiP have a lower probability of entering the three branches than thrombiL (12% vs. 25%). In contrast, in the lateral configuration, thrombiP have a higher probability of entering the three branches than thrombiL (47% vs. 35%). In brief, the LVAD outflow cannula location significantly affects the distribution of thrombi in the aorta. Thus, in clinical practice, the selection of the LVAD outflow location and the risk of thrombi formed in the left ventricle should be paid more

  10. Condensation of earthquake location distributions: Optimal spatial information encoding and application to multifractal analysis of south Californian seismicity.

    PubMed

    Kamer, Yavor; Ouillon, Guy; Sornette, Didier; Wössner, Jochen

    2015-08-01

    We present the "condensation" method that exploits the heterogeneity of the probability distribution functions (PDFs) of event locations to improve the spatial information content of seismic catalogs. As its name indicates, the condensation method reduces the size of seismic catalogs while improving the access to the spatial information content of seismic catalogs. The PDFs of events are first ranked by decreasing location errors and then successively condensed onto better located and lower variance event PDFs. The obtained condensed catalog differs from the initial catalog by attributing different weights to each event, the set of weights providing an optimal spatial representation with respect to the spatially varying location capability of the seismic network. Synthetic tests on fractal distributions perturbed with realistic location errors show that condensation improves the spatial information content of the original catalog, which is quantified by the likelihood gain per event. Applied to Southern California seismicity, the new condensed catalog highlights major mapped fault traces and reveals possible additional structures while reducing the catalog length by ∼25%. The condensation method allows us to account for location error information within a point-based spatial analysis. We demonstrate this by comparing the multifractal properties of the condensed catalog locations with those of the original catalog. We evidence different spatial scaling regimes characterized by distinct multifractal spectra and separated by transition scales. We interpret the upper scale as agreeing with the thickness of the brittle crust, while the lower scale (2.5 km) might depend on the relocation procedure. Accounting for these new results, the epidemic type aftershock model formulation suggests that, contrary to previous studies, large earthquakes dominate the earthquake triggering process. This implies that the limited capability of detecting small magnitude events cannot be used
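    A toy, point-location version of the condensation idea (the paper condenses full event PDFs ranked by location error; here each event is a point with a scalar error, and its whole weight is transferred to the best-located neighbour inside its error radius, if one exists):

```python
import numpy as np

def condense(xy, sigma, k=1.0):
    """Visit events from worst to best located; hand each event's weight
    to the best-located neighbour within k*sigma of it, if any."""
    w = np.ones(len(xy))
    for i in np.argsort(-sigma):                     # largest errors first
        d = np.linalg.norm(xy - xy[i], axis=1)
        better = (sigma < sigma[i]) & (d < k * sigma[i])
        if better.any():
            j = np.argmin(np.where(better, sigma, np.inf))
            w[j] += w[i]
            w[i] = 0.0
    return w

rng = np.random.default_rng(2)
xy = rng.random((200, 2)) * 50.0                     # epicentres (km)
sigma = rng.uniform(0.5, 5.0, 200)                   # location errors (km)
w = condense(xy, sigma)
print(f"{(w > 0).sum()} of {len(xy)} events carry all the weight")
```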

  11. Distributed fiber optic sensor employing phase-generated carrier for disturbance detection and location

    NASA Astrophysics Data System (ADS)

    Xu, Haiyan; Wu, Hongyan; Zhang, Xuewu; Zhang, Zhuo; Li, Min

    2015-05-01

    The distributed optical fiber sensor is a new type of system that can be used for monitoring and inspection over long distances and in strong-EMI conditions. A method of external modulation with a phase modulator is proposed in this paper to improve the positioning accuracy of the disturbance in a distributed optical fiber sensor. We construct a distributed disturbance detection system based on a Michelson interferometer, with a phase modulator attached to the sensing fiber in front of the Faraday rotation mirror (FRM), to shift the signal produced by interference of the two lights reflected by the FRM to a high frequency, while other signals remain at low frequency. Through a high-pass filter and a phase-retrieval circuit, a signal proportional to the external disturbance is acquired. The accuracy of disturbance positioning with this signal can be largely improved. The method is quite simple and easy to implement. Theoretical analysis and experimental results show that this method can effectively improve the positioning accuracy.
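    A minimal signal-chain sketch of the demodulation described: the interference term rides on a high-frequency carrier, a high-pass filter isolates it, and the phase is then retrieved (here via a Hilbert transform; the sample rate, carrier frequency, and toy disturbance are all assumptions, not the experimental values):

```python
import numpy as np
from scipy.signal import butter, filtfilt, hilbert

fs, fc = 100_000.0, 20_000.0                    # sample rate, carrier (Hz)
t = np.arange(0, 0.01, 1 / fs)
disturbance = 0.8 * np.sin(2 * np.pi * 300 * t)           # toy phase signal
interference = np.cos(2 * np.pi * fc * t + disturbance)   # modulated carrier

b, a = butter(4, 5_000.0 / (fs / 2), btype="highpass")
carrier_band = filtfilt(b, a, interference)     # reject low-frequency signals

phase = np.unwrap(np.angle(hilbert(carrier_band)))        # phase retrieval
recovered = phase - 2 * np.pi * fc * t          # remove the carrier ramp
```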

  12. Power approximation for the van Elteren test based on location-scale family of distributions.

    PubMed

    Zhao, Yan D; Qu, Yongming; Rahardja, Dewi

    2006-01-01

    The van Elteren test, as a type of stratified Wilcoxon-Mann-Whitney test for comparing two treatments accounting for stratum effects, has been used to replace the analysis of variance when the normality assumption was seriously violated. The sample size estimation methods for the van Elteren test have been proposed and evaluated previously. However, in designing an active-comparator trial where a sample of responses from the new treatment is available but the patient response data to the comparator are limited to summary statistics, the existing methods are either inapplicable or poorly behaved. In this paper we develop a new method for active-comparator trials assuming the responses from both treatments are from the same location-scale family. Theories and simulations have shown that the new method performs well when the location-scale assumption holds and works reasonably when the assumption does not hold. Thus, the new method is preferred when computing sample sizes for the van Elteren test in active-comparator trials.
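    For reference, the van Elteren statistic is the stratum-wise Wilcoxon rank sum weighted by 1/(m_h + n_h + 1). A minimal implementation under the no-ties simplification (tie corrections omitted):

```python
import numpy as np
from scipy.stats import rankdata, norm

def van_elteren(strata):
    """Van Elteren test. `strata` is a list of (treatment, control)
    sample pairs, one pair per stratum. Returns (z, two-sided p)."""
    t = e = v = 0.0
    for trt, ctl in strata:
        m, n = len(trt), len(ctl)
        ranks = rankdata(np.concatenate([trt, ctl]))
        w = 1.0 / (m + n + 1)               # van Elteren weight
        t += w * ranks[:m].sum()            # weighted treatment rank sum
        e += m / 2.0                        # w * E[rank sum] under H0
        v += m * n / (12.0 * (m + n + 1))   # w^2 * Var[rank sum], no ties
    z = (t - e) / np.sqrt(v)
    return z, 2 * norm.sf(abs(z))
```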

  13. Power approximation for the van Elteren test based on location-scale family of distributions.

    PubMed

    Zhao, Yan D; Qu, Yongming; Rahardja, Dewi

    2006-01-01

    The van Elteren test, as a type of stratified Wilcoxon-Mann-Whitney test for comparing two treatments accounting for stratum effects, has been used to replace the analysis of variance when the normality assumption was seriously violated. The sample size estimation methods for the van Elteren test have been proposed and evaluated previously. However, in designing an active-comparator trial where a sample of responses from the new treatment is available but the patient response data to the comparator are limited to summary statistics, the existing methods are either inapplicable or poorly behaved. In this paper we develop a new method for active-comparator trials assuming the responses from both treatments are from the same location-scale family. Theories and simulations have shown that the new method performs well when the location-scale assumption holds and works reasonably when the assumption does not hold. Thus, the new method is preferred when computing sample sizes for the van Elteren test in active-comparator trials. PMID:17146980

  14. Syringe filtration methods for examining dissolved and colloidal trace element distributions in remote field locations

    NASA Technical Reports Server (NTRS)

    Shiller, Alan M.

    2003-01-01

    It is well-established that sampling and sample processing can easily introduce contamination into dissolved trace element samples if precautions are not taken. However, work in remote locations sometimes precludes bringing bulky clean lab equipment into the field and likewise may make timely transport of samples to the lab for processing impossible. Straightforward syringe filtration methods are described here for collecting small quantities (15 mL) of 0.45- and 0.02-microm filtered river water in an uncontaminated manner. These filtration methods take advantage of recent advances in analytical capabilities that require only small amounts of water for analysis of a suite of dissolved trace elements. Filter clogging and solute rejection artifacts appear to be minimal, although some adsorption of metals and organics does affect the first approximately 10 mL of water passing through the filters. Overall the methods are clean, easy to use, and provide reproducible representations of the dissolved and colloidal fractions of trace elements in river waters. Furthermore, sample processing materials can be prepared well in advance in a clean lab and transported cleanly and compactly to the field. Application of these methods is illustrated with data from remote locations in the Rocky Mountains and along the Yukon River. Evidence from field flow fractionation suggests that the 0.02-microm filters may provide a practical cutoff to distinguish metals associated with small inorganic and organic complexes from those associated with silicate and oxide colloids.

  15. Verification of Ceramic Structures

    NASA Astrophysics Data System (ADS)

    Behar-Lafenetre, Stephanie; Cornillon, Laurence; Rancurel, Michael; De Graaf, Dennis; Hartmann, Peter; Coe, Graham; Laine, Benoit

    2012-07-01

    In the framework of the “Mechanical Design and Verification Methodologies for Ceramic Structures” contract [1] awarded by ESA, Thales Alenia Space has investigated literature and practices in affiliated industries to propose a methodological guideline for verification of ceramic spacecraft and instrument structures. It has been written in order to be applicable to most types of ceramic or glass-ceramic materials - typically Cesic®, HBCesic®, Silicon Nitride, Silicon Carbide and ZERODUR®. The proposed guideline describes the activities to be performed at material level in order to cover all the specific aspects of ceramics (Weibull distribution, brittle behaviour, sub-critical crack growth). Elementary tests and their post-processing methods are described, and recommendations for optimization of the test plan are given in order to have a consistent database. The application of this method is shown on an example in a dedicated article [7]. Then the verification activities to be performed at system level are described. This includes classical verification activities based on relevant standard (ECSS Verification [4]), plus specific analytical, testing and inspection features. The analysis methodology takes into account the specific behaviour of ceramic materials, especially the statistical distribution of failures (Weibull) and the method to transfer it from elementary data to a full-scale structure. The demonstration of the efficiency of this method is described in a dedicated article [8]. The verification is completed by classical full-scale testing activities. Indications about proof testing, case of use and implementation are given and specific inspection and protection measures are described. These additional activities are necessary to ensure the required reliability. The aim of the guideline is to describe how to reach the same reliability level as for structures made of more classical materials (metals, composites).
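    The statistical transfer from elementary data to a full-scale structure rests on the Weibull volume-scaling relation Pf = 1 - exp(-(V/V0)(sigma/sigma0)^m). A minimal sketch with illustrative coupon-test numbers (not material data from the guideline):

```python
import numpy as np

def failure_probability(stress, volume, m, sigma0, v0=1.0):
    """Two-parameter Weibull failure probability with volume scaling:
    Pf = 1 - exp(-(V/V0) * (stress/sigma0)^m)."""
    return 1.0 - np.exp(-(volume / v0) * (stress / sigma0) ** m)

# assumed Weibull modulus and characteristic strength (MPa) for V0 = 1 cm^3
m, sigma0 = 10.0, 300.0
print(failure_probability(150.0, 50.0, m, sigma0))  # 50 cm^3 part at 150 MPa -> ~4.8%
```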

  16. MPL-Net Measurements of Aerosol and Cloud Vertical Distributions at Co-Located AERONET Sites

    NASA Technical Reports Server (NTRS)

    Welton, Ellsworth J.; Campbell, James R.; Berkoff, Timothy A.; Spinhirne, James D.; Tsay, Si-Chee; Holben, Brent; Starr, David OC. (Technical Monitor)

    2002-01-01

    In the early 1990s, the first small, eye-safe, and autonomous lidar system was developed, the Micropulse Lidar (MPL). The MPL acquires signal profiles of backscattered laser light from aerosols and clouds. The signals are analyzed to yield multiple layer heights, optical depths of each layer, average extinction-to-backscatter ratios for each layer, and profiles of extinction in each layer. In 2000, several MPL sites were organized into a coordinated network, called MPL-Net, by the Cloud and Aerosol Lidar Group at NASA Goddard Space Flight Center (GSFC) using funding provided by the NASA Earth Observing System. In addition to the funding provided by NASA EOS, the NASA CERES Ground Validation Group supplied four MPL systems to the project, and the NASA TOMS group contributed their MPL for work at GSFC. The Atmospheric Radiation Measurement Program (ARM) also agreed to make their data available to the MPL-Net project for processing. In addition to the initial NASA and ARM operated sites, several other independent research groups have also expressed interest in joining the network using their own instruments. Finally, a limited amount of EOS funding was set aside to participate in various field experiments each year. The NASA Sensor Intercomparison and Merger for Biological and Interdisciplinary Oceanic Studies (SIMBIOS) project also provides funds to deploy their MPL during ocean research cruises. Altogether, the MPL-Net project has participated in four major field experiments since 2000. Most MPL-Net sites and field experiment locations are also co-located with sunphotometers in the NASA Aerosol Robotic Network (AERONET). Therefore, at these locations data are collected on both aerosol and cloud vertical structure as well as column optical depth and sky radiance. Real-time data products are now available from most MPL-Net sites. Our real-time products are generated at times of AERONET aerosol optical depth (AOD) measurements. The AERONET AOD is used as input to our

  17. Distribution of deciduous stands in villages located in coniferous forest landscapes in Sweden.

    PubMed

    Mikusiński, Grzegorz; Angelstam, Per; Sporrong, Ulf

    2003-12-01

    Termination of fire, along with active removal of deciduous trees in favor of conifers and anthropogenic transformation of productive forest into agricultural land, has transformed northern European coniferous forests and reduced their deciduous component. Locally, however, in the villages, deciduous trees and stands were maintained, and have more recently regenerated on abandoned agricultural land. We hypothesize that the present distribution of the deciduous component is related to the village in-field/out-field zonation in different regions, which emerges from physical conditions and recent economic development expressed as land-use change. We analyzed the spatial distribution of deciduous stands in in-field and out-field zones of villages in 6 boreal/hemiboreal Swedish regions (Norrbotten, Angermanland, Jämtland, Dalarna, Bergslagen, Småland). In each region 6 individual quadrates 5 x 5 km centered on village areas were selected. We found significant regional differences in the deciduous component (DEC) in different village zones. At the scale of villages Angermanland had the highest mean proportion of DEC (17%) and Jämtland the lowest (2%). However, the amounts of the DEC varied systematically in in-field and out-field zones. DEC was highest in the in-field in the south (Småland), but generally low further north. By contrast, the amount of DEC in the out-field was highest in the north. The relative amount of DEC in the forest edge peaked in landscapes with the strongest decline in active agriculture (Angermanland, Dalarna, Bergslagen). Because former and present local villages are vital for biodiversity linked to the deciduous component, our results indicate a need for integrated management of deciduous forest within entire landscapes. This study shows that simplified satellite data are useful for estimating the spatial distribution of deciduous trees and stands at the landscape scale. However, for detailed studies better thematic resolution is

  18. Optimal Location through Distributed Algorithm to Avoid Energy Hole in Mobile Sink WSNs

    PubMed Central

    Qing-hua, Li; Wei-hua, Gui; Zhi-gang, Chen

    2014-01-01

    In a multihop data-collection sensor network, nodes near the sink must relay data for remote nodes and thus have a much faster energy dissipation rate and suffer premature death. This phenomenon causes an energy hole near the sink, seriously damaging network performance. In this paper, we first compute the energy consumption of each node when the sink is placed at an arbitrary point in the network through theoretical analysis; we then propose an online distributed algorithm that adaptively adjusts the sink position based on the actual energy consumption of each node to attain the actual maximum lifetime. Theoretical analysis and experimental results show that the proposed algorithms significantly improve the lifetime of the wireless sensor network, lowering the network's residual energy at the time of network death by more than 30%. Moreover, the cost of moving the sink is relatively small. PMID:24895668
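    A toy of the adaptive idea: track each node's cumulative consumption and nudge the sink toward the most-drained node every round. This single-hop sketch (d^2 radio model, step size, and round count are all assumptions) is far simpler than the paper's multihop analysis:

```python
import numpy as np

rng = np.random.default_rng(3)
nodes = rng.random((60, 2)) * 100.0          # sensor positions (m)
energy = np.zeros(len(nodes))                # cumulative consumption
sink = np.array([50.0, 50.0])

for _ in range(500):                         # data-collection rounds
    d = np.linalg.norm(nodes - sink, axis=1)
    energy += 1e-4 * d ** 2                  # toy radio model: E ~ d^2
    hottest = nodes[np.argmax(energy)]       # most-drained node
    sink += 0.05 * (hottest - sink)          # nudge the sink toward it

print(sink, energy.max())                    # lower max => longer lifetime
```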

  19. Arsenic distribution in soils and rye plants of a cropland located in an abandoned mining area.

    PubMed

    Álvarez-Ayuso, Esther; Abad-Valle, Patricia; Murciego, Ascensión; Villar-Alonso, Pedro

    2016-01-15

    A mining-impacted cropland was studied in order to assess its As pollution level and the derived environmental and health risks. Profile soil samples (0-50 cm) and rye plant samples were collected at different distances (0-150 m) from the nearby mine dump and analyzed for their As content and distribution. These cropland soils were sandy, acidic and poor in organic matter and Fe/Al oxides. The soil total As concentrations (38-177 mg kg(-1)) and, especially, the soil soluble As concentrations (0.48-4.1 mg kg(-1)) importantly exceeded their safe limits for agricultural use of soils. Moreover, the soil As contents more prone to be mobilized could rise up to 25-69% of total As levels as determined using (NH4)2SO4, NH4H2PO4 and (NH4)2C2O4·H2O as sequential extractants. Arsenic in rye plants was primarily distributed in roots (3.4-18.8 mg kg(-1)), with restricted translocation to shoots (TF=0.05-0.26) and grains (TF=<0.02-0.14). The mechanism for this excluder behavior should be likely related to arsenate reduction to arsenite in roots, followed by its complexation with thiols, as suggested by the high arsenite level in rye roots (up to 95% of the total As content) and the negative correlation between thiol concentrations in rye roots and As concentrations in rye shoots (|R|=0.770; p<0.01). Accordingly, in spite of the high mobile and mobilizable As contents in soils, As concentrations in rye above-ground tissues comply with the European regulation on undesirable substances in animal feed. Likewise, rye grain As concentrations were below its maximum tolerable concentration in cereals established by international legislation. PMID:26519583

  20. Arsenic distribution in soils and rye plants of a cropland located in an abandoned mining area.

    PubMed

    Álvarez-Ayuso, Esther; Abad-Valle, Patricia; Murciego, Ascensión; Villar-Alonso, Pedro

    2016-01-15

    A mining-impacted cropland was studied in order to assess its As pollution level and the derived environmental and health risks. Profile soil samples (0-50 cm) and rye plant samples were collected at different distances (0-150 m) from the nearby mine dump and analyzed for their As content and distribution. These cropland soils were sandy, acidic and poor in organic matter and Fe/Al oxides. The soil total As concentrations (38-177 mg kg(-1)) and, especially, the soil soluble As concentrations (0.48-4.1 mg kg(-1)) importantly exceeded their safe limits for agricultural use of soils. Moreover, the soil As contents more prone to be mobilized could rise up to 25-69% of total As levels as determined using (NH4)2SO4, NH4H2PO4 and (NH4)2C2O4·H2O as sequential extractants. Arsenic in rye plants was primarily distributed in roots (3.4-18.8 mg kg(-1)), with restricted translocation to shoots (TF=0.05-0.26) and grains (TF=<0.02-0.14). The mechanism for this excluder behavior should be likely related to arsenate reduction to arsenite in roots, followed by its complexation with thiols, as suggested by the high arsenite level in rye roots (up to 95% of the total As content) and the negative correlation between thiol concentrations in rye roots and As concentrations in rye shoots (|R|=0.770; p<0.01). Accordingly, in spite of the high mobile and mobilizable As contents in soils, As concentrations in rye above-ground tissues comply with the European regulation on undesirable substances in animal feed. Likewise, rye grain As concentrations were below its maximum tolerable concentration in cereals established by international legislation.

  1. A mitochondrial location for haemoglobins--dynamic distribution in ageing and Parkinson's disease.

    PubMed

    Shephard, Freya; Greville-Heygate, Oliver; Marsh, Oliver; Anderson, Susan; Chakrabarti, Lisa

    2014-01-01

    Haemoglobins are iron-containing proteins that transport oxygen in the blood of most vertebrates. The mitochondrion is the cellular organelle which consumes oxygen in order to synthesise ATP. Mitochondrial dysfunction is implicated in neurodegeneration and ageing. We find that α and β haemoglobin (Hba and Hbb) proteins are altered in their distribution in mitochondrial fractions from degenerating brain. We demonstrate that both Hba and Hbb are co-localised with the mitochondrion in mammalian brain. The precise localisation of the Hbs is within the inner membrane space and associated with the inner mitochondrial membrane. Relative mitochondrial to cytoplasmic ratios of Hba and Hbb show changing distributions of these proteins during the process of neurodegeneration in the pcd(5j) mouse brain. A significant difference in Hba and Hbb content in the mitochondrial fraction is seen at 31 days after birth; this corresponds to the stage when dynamic neuronal loss is measured to be greatest in the Purkinje Cell Degeneration mouse. We also report changes in mitochondrial Hba and Hbb levels in ageing brain and muscle. Significant differences in mitochondrial Hba and Hbb can be seen when comparing aged brain to muscle, suggesting tissue-specific functions of these proteins in the mitochondrion. In muscle there are significant differences between Hba levels in old and young mitochondria. To understand whether the changes detected in mitochondrial Hbs are of clinical significance, we examined Parkinson's disease brain; immunohistochemistry studies suggest that cell bodies in the substantia nigra accumulate mitochondrial Hb. However, western blotting of mitochondrial fractions from PD and control brains indicates significantly less Hb in PD brain mitochondria. One explanation could be a specific loss of cells containing mitochondria loaded with Hb proteins. Our study opens the door to an examination of the role of Hb function, within the context of the mitochondrion

  2. Impacts to the chest of PMHSs - Influence of impact location and load distribution on chest response.

    PubMed

    Holmqvist, Kristian; Svensson, Mats Y; Davidsson, Johan; Gutsche, Andreas; Tomasch, Ernst; Darok, Mario; Ravnik, Dean

    2016-02-01

    The chest response of the human body has been studied for several load conditions, but is not well known in the case of steering wheel rim-to-chest impact in heavy goods vehicle frontal collisions. The aim of this study was to determine the response of the human chest in a set of simulated steering wheel impacts. PMHS tests were carried out and analysed. The steering wheel load pattern was represented by a rigid pendulum with a straight bar-shaped front. A crash test dummy chest calibration pendulum was utilised for comparison. In this study, a set of rigid bar impacts were directed at various heights of the chest, spanning approximately 120mm around the fourth intercostal space. The impact energy was set below a level estimated to cause rib fracture. The analysed results consist of responses, evaluated with respect to differences in the impacting shape and impact heights on compression and viscous criteria chest injury responses. The results showed that the bar impacts consistently produced lesser scaled chest compressions than the hub; the Middle bar responses were around 90% of the hub responses. A superior bar impact provided lesser chest compression; the average response was 86% of the Middle bar response. For inferior bar impacts, the chest compression response was 116% of the chest compression in the middle. The damping properties of the chest caused the compression to decrease in the high speed bar impacts to 88% of that in low speed impacts. From the analysis it could be concluded that the bar impact shape provides lower chest criteria responses compared to the hub. Further, the bar responses are dependent on the impact location of the chest. Inertial and viscous effects of the upper body affect the responses. The results can be used to assess the responses of human substitutes such as anthropomorphic test devices and finite element human body models, which will benefit the development process of heavy goods vehicle safety systems.

  3. Impacts to the chest of PMHSs - Influence of impact location and load distribution on chest response.

    PubMed

    Holmqvist, Kristian; Svensson, Mats Y; Davidsson, Johan; Gutsche, Andreas; Tomasch, Ernst; Darok, Mario; Ravnik, Dean

    2016-02-01

    The chest response of the human body has been studied for several load conditions, but is not well known in the case of steering wheel rim-to-chest impact in heavy goods vehicle frontal collisions. The aim of this study was to determine the response of the human chest in a set of simulated steering wheel impacts. PMHS tests were carried out and analysed. The steering wheel load pattern was represented by a rigid pendulum with a straight bar-shaped front. A crash test dummy chest calibration pendulum was utilised for comparison. In this study, a set of rigid bar impacts were directed at various heights of the chest, spanning approximately 120mm around the fourth intercostal space. The impact energy was set below a level estimated to cause rib fracture. The analysed results consist of responses, evaluated with respect to differences in the impacting shape and impact heights on compression and viscous criteria chest injury responses. The results showed that the bar impacts consistently produced lesser scaled chest compressions than the hub; the Middle bar responses were around 90% of the hub responses. A superior bar impact provided lesser chest compression; the average response was 86% of the Middle bar response. For inferior bar impacts, the chest compression response was 116% of the chest compression in the middle. The damping properties of the chest caused the compression to decrease in the high speed bar impacts to 88% of that in low speed impacts. From the analysis it could be concluded that the bar impact shape provides lower chest criteria responses compared to the hub. Further, the bar responses are dependent on the impact location of the chest. Inertial and viscous effects of the upper body affect the responses. The results can be used to assess the responses of human substitutes such as anthropomorphic test devices and finite element human body models, which will benefit the development process of heavy goods vehicle safety systems. PMID:26687541

  4. Drop size distributions and related properties of fog for five locations measured from aircraft

    NASA Technical Reports Server (NTRS)

    Zak, J. Allen

    1994-01-01

    Fog drop size distributions were collected from aircraft as part of the Synthetic Vision Technology Demonstration Program. Three west coast marine advection fogs, one frontal fog, and a radiation fog were sampled from the top of the cloud to the bottom as the aircraft descended on a 3-degree glideslope. Drop size versus altitude versus concentration are shown in three dimensional plots for each 10-meter altitude interval from 1-minute samples. Also shown are median volume radius and liquid water content. Advection fogs contained the largest drops with median volume radius of 5-8 micrometers, although the drop sizes in the radiation fog were also large just above the runway surface. Liquid water content increased with height, and the total number of drops generally increased with time. Multimodal variations in number density and particle size were noted in most samples where there was a peak concentration of small drops (2-5 micrometers) at low altitudes, midaltitude peak of drops 5-11 micrometers, and high-altitude peak of the larger drops (11-15 micrometers and above). These observations are compared with others and corroborate previous results in fog gross properties, although there is considerable variation with time and altitude even in the same type of fog.

  5. Estimation of hydrothermal deposits location from magnetization distribution and magnetic properties in the North Fiji Basin

    NASA Astrophysics Data System (ADS)

    Choi, S.; Kim, C.; Park, C.; Kim, H.

    2013-12-01

    The North Fiji Basin is one of the youngest back-arc basins in the southwest Pacific (opening from about 12 Ma). We performed a marine magnetic and bathymetry survey in the North Fiji Basin in April 2012 to search for submarine hydrothermal deposits. Magnetic and bathymetry datasets were acquired using a Multi-Beam Echo Sounder EM120 (Kongsberg Co.) and an Overhauser proton magnetometer SeaSPY (Marine Magnetics Co.). The data were processed to obtain detailed seabed topography, the magnetic anomaly, the reduction to the pole (RTP), the analytic signal and the magnetization. The study covered two areas on the Central Spreading Ridge (CSR), KF-1 (longitude 173.5-173.7, latitude -16.2 to -16.5) and KF-3 (longitude 173.4-173.6, latitude -18.7 to -19.1), and one area at the Triple Junction (TJ), KF-2 (longitude 173.7-174, latitude -16.8 to -17.2). The seabed topography of KF-1 shows a thin horst between two grabens trending NW-SE. The magnetic properties of KF-1 show high magnetic anomalies in the central part and a magnetic lineament structure trending E-W. In the magnetization distribution of KF-1, a low magnetization zone matches well with a strong analytic signal in the northeastern part. KF-2 contains the TJ; its seabed topography is Y-shaped, with a high feature at the center of the TJ. The magnetic properties of KF-2 show high magnetic anomalies over the N-S spreading ridge center and the northwestern part, and in the magnetization distribution a low magnetization zone again matches a strong analytic signal in the northeastern part. The seabed topography of KF-3 presents a flat, elevated dome-like structure at the center axis, with some seamounts scattered around the axis. The magnetic properties of KF-3 show high magnetic anomalies in the N-S spreading ridge center. In the magnetization of KF-3, the low magnetization zone does not match the strong analytic signal in this area. The difference of KF-3

  6. Reliability Improvements in Design of Water Distribution Networks Recognizing Valve Location

    NASA Astrophysics Data System (ADS)

    Bouchart, F.; Goulter, I.

    1991-12-01

    Water distribution networks can fail either by the actual demand at one or more nodes exceeding the design demands, or by a pipe between two nodes failing. The implications of each type of failure can be assessed by the shortfall in supply caused by a failure event together with the probability of occurrence of the event, and can be represented by the expected volume of deficit. Converting the implications of the two failure types into these commensurate units permits them to be added directly to give a single consistent measure of reliability. The assessment of shortfall for the pipe failure mode is derived from the observation that when a pipe breaks, a section of pipe must be isolated by valves to permit the repair to be made. Isolating the pipe also isolates the customers who withdraw water from that section of pipe. Thus the shortfall in supply in this case is based on the amount of supply (number of customers) cut off by isolating the pipe for repair. The measure extends previous reliability parameters by recognizing that in reality demand occurs along links rather than being concentrated solely at nodes at the ends of the links, which is the normal assumption in both simulation and optimization models. If the reliability of the network is found to be unsatisfactory, it can be improved in two ways. One is to increase the design demand at nodes so that the probability of the actual demand exceeding the design value is reduced. The other is to add more valves so that the length of pipe which has to be isolated in order to repair a break is reduced, thereby reducing the number of customers who must have their supply cut off during a repair. The use of the two methods to determine and, if necessary, improve reliability is demonstrated by their application to an example network.
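    The commensurate-units idea reduces to a single expectation: sum probability times shortfall over both failure modes. A minimal sketch follows; every probability and shortfall volume below is a hypothetical illustration, not data from the paper's example network.

    ```python
    # Expected volume of supply deficit, combining both failure modes.
    # All probabilities and shortfall volumes below are hypothetical.

    demand_exceedance_events = [
        # (probability actual demand exceeds design demand, shortfall in m^3)
        (0.02, 150.0),   # node A
        (0.01, 300.0),   # node B
    ]

    pipe_failure_events = [
        # (probability of pipe break, m^3 of demand isolated during the repair)
        (0.005, 800.0),  # link 1: few valves -> long isolated section
        (0.005, 200.0),  # link 2: extra valves -> short isolated section
    ]

    def expected_deficit(events):
        """Expected volume of deficit: sum of probability * shortfall."""
        return sum(p * shortfall for p, shortfall in events)

    total = (expected_deficit(demand_exceedance_events)
             + expected_deficit(pipe_failure_events))
    print(f"Expected volume of deficit: {total:.1f} m^3")
    ```

    Adding valves shrinks the isolated section on a break (smaller shortfall per event), while raising design demands shrinks the exceedance probabilities; both show up directly as a lower total in this single measure.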

  7. Using Distributed Temperature Sensing to Locate and Quantify Thermal Refugia: Insights Into Radiative & Hydrologic Processes

    NASA Astrophysics Data System (ADS)

    Bond, R. M.; Stubblefield, A. P.

    2012-12-01

    Stream temperature plays a critical role in determining the overall structure and function of stream ecosystems. Aquatic fauna are particularly vulnerable to projected increases in the magnitude and duration of elevated stream temperatures from global climate change. Northern California cold water salmon and trout fisheries have been declared thermally impacted by the California State Water Resources Control Board. This study employed Distributed Temperature Sensing (DTS) to detect stream heating and cooling at one meter resolution along a one kilometer section of the North Fork of the Salmon River, a tributary of the Klamath River, northern California, USA. The Salmon River has an extensive legacy of hydraulic gold mining tailings, which have been reworked into large gravel bars, creating shallow wide runs, possibly filling in pools and disrupting riparian vegetation recruitment. Eight days of temperature data were collected at 15 minute intervals during July 2012. Three remote weather stations were deployed during the study period. The main objectives of this research were: one, to quantify thermal inputs that create and maintain thermal refugia for cold water fishes; two, to investigate the role of riparian and topographic shading in buffering peak summer temperatures; and three, to create and validate a physically based stream heating model to predict the effects of riparian management, drought, and climate change on stream temperature. DTS was used to spatially identify cold water seeps and quantify their contribution to the stream's thermal regime. Along the one kilometer reach, hyporheic flow was identified using DTS. The spring water was between 16 and 18°C, while the peak mainstem temperature above the spring reached a maximum of 23°C. The study found a diel heating cycle of 5°C, with a Maximum Weekly Average Temperature (MWAT) of over 22°C, exceeding the salmon and trout protective temperature standards set by USEPA Region 10. Twenty intensive fish counts over five days were
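    MWAT is conventionally computed as the maximum of the seven-day running mean of daily average temperatures. A minimal sketch under that assumed definition, with synthetic 15-minute data standing in for the DTS record:

    ```python
    import numpy as np
    import pandas as pd

    # Synthetic 15-minute stream temperatures over three weeks (hypothetical data):
    # a diel cycle plus a slow seasonal warming trend.
    idx = pd.date_range("2012-07-01", periods=4 * 24 * 21, freq="15min")
    minutes_of_day = idx.hour * 60 + idx.minute
    temp = (17.0
            + 2.5 * np.sin(2 * np.pi * minutes_of_day / 1440)   # diel cycle
            + 0.05 * np.arange(len(idx)) / 96)                  # slow warming
    series = pd.Series(temp, index=idx)

    daily_mean = series.resample("D").mean()           # daily average temperature
    mwat = daily_mean.rolling(window=7).mean().max()   # max of 7-day running mean
    print(f"MWAT = {mwat:.2f} degC")
    ```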

  8. What influences national and foreign physicians’ geographic distribution? An analysis of medical doctors’ residence location in Portugal

    PubMed Central

    2012-01-01

    Background The debate over physicians’ geographical distribution has attracted the attention of the economic and public health literature over the last forty years. Nonetheless, it remains unclear what influences physicians’ location, and whether foreign physicians contribute to filling the geographical gaps left by national doctors in any given country. The present research sets out to investigate the current distribution of national and international physicians in Portugal, with the objective of understanding its determinants and providing an evidence base for policy-makers to identify policies to influence it. Methods A cross-sectional study of physicians currently registered in Portugal was conducted to describe the population and explore the association of physician residence patterns with relevant personal and municipality characteristics. Data from the Portuguese Medical Council on physicians’ residence and characteristics were analysed, as well as data from the National Institute of Statistics on municipalities’ population, living standards and health care network. Descriptive statistics, chi-square tests, negative binomial and logistic regression modelling were applied to determine: (a) municipality characteristics predicting Portuguese and international physicians’ geographical distribution, and (b) doctors’ characteristics that could increase the odds of residing outside the country’s metropolitan areas. Results There were 39,473 physicians in Portugal in 2008, 51.1% of whom were male, and 40.2% between 41 and 55 years of age. They were predominantly Portuguese (90.5%), with Spanish, Brazilian and African nationalities also represented. Population, Population’s Purchasing Power, Nurses per capita and Municipality Development Index (MDI) were the municipality characteristics displaying the strongest association with national physicians’ location. For foreign physicians, the MDI was not statistically significant, while municipalities
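    The modelling step names two standard model families. Below is a minimal statsmodels sketch of the count model only, with entirely synthetic data standing in for the municipality covariates; all variable names and values are hypothetical, not the study's data.

    ```python
    import numpy as np
    import pandas as pd
    import statsmodels.api as sm

    rng = np.random.default_rng(0)
    n = 308  # number of municipalities (hypothetical count)
    df = pd.DataFrame({
        "population": rng.lognormal(10, 1, n),   # residents
        "purchasing": rng.normal(100, 20, n),    # purchasing-power index
        "nurses_pc":  rng.normal(4, 1, n),       # nurses per 1000 residents
    })
    # Fake physician counts, loosely tied to population so the fit is non-trivial
    df["physicians"] = rng.poisson(lam=np.maximum(df["population"] / 2e4, 0.1))

    # Negative binomial GLM: physician count per municipality vs. covariates
    X = sm.add_constant(df[["population", "purchasing", "nurses_pc"]])
    nb = sm.GLM(df["physicians"], X, family=sm.families.NegativeBinomial()).fit()
    print(nb.summary().tables[1])
    ```

    The logistic model of the study (odds of a physician residing outside metropolitan areas) would use physician-level rows with sm.Logit in the same way.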

  9. Dip distribution of Oita-Kumamoto Tectonic Line located in central Kyushu, Japan, estimated by eigenvectors of gravity gradient tensor

    NASA Astrophysics Data System (ADS)

    Kusumoto, Shigekazu

    2016-09-01

    We estimated the dip distribution of the Oita-Kumamoto Tectonic Line, located in central Kyushu, Japan, using the dip of the maximum eigenvector of the gravity gradient tensor. A series of earthquakes in Kumamoto and Oita beginning on 14 April 2016 occurred along this tectonic line, the largest of which was M = 7.3. Because a gravity gradiometry survey has not been conducted in the study area, we calculated the gravity gradient tensor from the Bouguer gravity anomaly and employed it in the analysis. The general dip of the Oita-Kumamoto Tectonic Line was found to be about 65° and tends to increase towards its eastern end. In addition, we estimated the dip around the largest earthquake to be about 60° from the gravity gradient tensor. This result agrees with the dip of the earthquake source fault obtained by Global Navigation Satellite System data analysis.
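    The dip extraction itself is a small linear-algebra step: diagonalize the symmetric gravity gradient tensor and take the angle of its maximum eigenvector from the horizontal. A minimal sketch; the tensor values are hypothetical, not derived from the Bouguer anomaly of the study area.

    ```python
    import numpy as np

    def max_eigenvector_dip(gamma):
        """Dip (degrees from horizontal) of the maximum eigenvector of a
        symmetric 3x3 gravity gradient tensor (consistent units, e.g. Eotvos)."""
        eigvals, eigvecs = np.linalg.eigh(gamma)   # eigenvalues in ascending order
        v = eigvecs[:, -1]                         # eigenvector of max eigenvalue
        horiz = np.hypot(v[0], v[1])               # horizontal component magnitude
        return np.degrees(np.arctan2(abs(v[2]), horiz))

    # Hypothetical tensor (symmetric and trace-free, as for a potential field)
    gamma = np.array([[10.0,  2.0, 30.0],
                      [ 2.0, -4.0,  5.0],
                      [30.0,  5.0, -6.0]])
    print(f"dip of max eigenvector: {max_eigenvector_dip(gamma):.1f} deg")
    ```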

  10. Experimental Verification of Application of Looped System and Centralized Voltage Control in a Distribution System with Renewable Energy Sources

    NASA Astrophysics Data System (ADS)

    Hanai, Yuji; Hayashi, Yasuhiro; Matsuki, Junya

    Line voltage control in a distribution network is one of the most important issues for the penetration of Renewable Energy Sources (RES). A loop distribution network configuration is an effective way to resolve the voltage and distribution-loss issues associated with RES penetration. In this paper, the authors propose a voltage control method for a loop distribution network based on tap-change control of the LRT and active/reactive power control of the RES. The tap-change control of the LRT carries the major share of the proposed voltage control, while the active/reactive power control of the RES supports it when deviation beyond the upper or lower voltage limit is otherwise unavoidable. The proposed method adopts a SCADA system based on data measured by IT switches, which are sectionalizing switches with sensors installed in the distribution feeders. To check the validity of the proposed voltage control method, experimental simulations were carried out using the distribution system analog simulator “ANSWER”. In the simulations, the voltage maintenance capability under normal and emergency conditions is evaluated.

  11. Levels and spatial distribution of airborne chemical elements in a heavy industrial area located in the north of Spain.

    PubMed

    Lage, J; Almeida, S M; Reis, M A; Chaves, P C; Ribeiro, T; Garcia, S; Faria, J P; Fernández, B G; Wolterbeek, H T

    2014-01-01

    The adverse health effects of airborne particles have been subjected to intense investigation in recent years; however, more studies on the chemical characterization of particles from pollution emissions are needed to (1) identify emission sources, (2) better understand the relative toxicity of particles, and (3) pinpoint more targeted emission control strategies and regulations. The main objective of this study was to assess the levels and spatial distribution of airborne chemical elements in a heavy industrial area located in the north of Spain. Instrumental and biomonitoring techniques were integrated and analytical methods for k0 instrumental neutron activation analysis and particle-induced x-ray emission were used to determine element content in aerosol filters and lichens. Results indicated that in general local industry contributed to the emissions of As, Sb, Cu, V, and Ni, which are associated with combustion processes. In addition, the steelwork emitted significant quantities of Fe and Mn and the cement factory was associated with Ca emissions. The spatial distribution of Zn and Al also indicated an important contribution of two industries located outside the studied area. PMID:25072718

  12. SU-D-BRF-02: In Situ Verification of Radiation Therapy Dose Distributions From High-Energy X-Rays Using PET Imaging

    SciTech Connect

    Zhang, Q; Kai, L; Wang, X; Hua, B; Chui, L; Wang, Q; Ma, C

    2014-06-01

    Purpose: To study the possibility of in situ verification of radiation therapy dose distributions using PET imaging, based on the activity distributions of ¹¹C and ¹⁵O produced via photonuclear reactions in patients irradiated by 45 MV x-rays. Methods: The method is based on photonuclear reactions in ¹²C and ¹⁶O, the most abundant elemental constituents of body tissues, irradiated by bremsstrahlung photons with energies up to 45 MeV, which produce primarily ¹¹C and ¹⁵O, both positron-emitting nuclei. The induced positron activity distributions were obtained with a PET scanner in the same room as an LA45 accelerator (Top Grade Medical, Beijing, China). The experiments were performed with a brain phantom using realistic treatment plans. The phantom was scanned at 20 min and at 2-5 min after irradiation for ¹¹C and ¹⁵O, respectively, with a 20-minute interval between the two scans. The activity distributions of ¹¹C and ¹⁵O within the irradiated volume can be separated from each other because their half-lives differ: 20 min for ¹¹C and 2 min for ¹⁵O. Three x-ray energies were used: 10 MV, 25 MV and 45 MV. The radiation dose ranged from 1.0 Gy to 10.0 Gy per treatment. Results: It was confirmed that no activity was detected at 10 MV beam energy, which is far below the energy threshold for photonuclear reactions. At 25 MV, activity distribution images were observed on PET, but a much higher radiation dose was needed to obtain good image quality. For 45 MV photon beams, good quality activation images were obtained with a 2-3 Gy radiation dose, which is the typical daily dose for radiation therapy. Conclusion: The activity distributions of ¹⁵O and ¹¹C could be used to derive the dose distribution of 45 MV x-rays at the regular daily dose level. This method can potentially be used to verify in situ dose distributions of patients treated on the LA45 accelerator.
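    Because the two isotopes' half-lives differ by a factor of ten, two scans at different delays after irradiation let the two activity components be unmixed voxel-by-voxel as a 2×2 linear system. A minimal sketch, assuming the nominal 20 min and 2 min half-lives; the activities and scan times are hypothetical.

    ```python
    import numpy as np

    HALF_LIFE = {"C11": 20.0 * 60, "O15": 2.0 * 60}    # seconds (nominal values)
    LAM = {k: np.log(2) / v for k, v in HALF_LIFE.items()}

    def separate(a1, a2, t1, t2):
        """Initial activities (A_C11, A_O15) at end of irradiation, given total
        activities a1, a2 measured at times t1, t2 [s] after irradiation."""
        M = np.array([[np.exp(-LAM["C11"] * t1), np.exp(-LAM["O15"] * t1)],
                      [np.exp(-LAM["C11"] * t2), np.exp(-LAM["O15"] * t2)]])
        return np.linalg.solve(M, np.array([a1, a2]))

    # Hypothetical voxel: scans at 3 min and 23 min after irradiation
    a_c11, a_o15 = separate(a1=900.0, a2=420.0, t1=3 * 60, t2=23 * 60)
    print(f"A(11C) = {a_c11:.0f} Bq, A(15O) = {a_o15:.0f} Bq")
    ```

    At the late scan essentially all ¹⁵O has decayed, so the system is well conditioned: the second equation pins down the ¹¹C component, and the early scan then yields the ¹⁵O component.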

  13. TESTING AND VERIFICATION OF REAL-TIME WATER QUALITY MONITORING SENSORS IN A DISTRIBUTION SYSTEM AGAINST INTRODUCED CONTAMINATION

    EPA Science Inventory

    Drinking water distribution systems reach the majority of American homes, business and civic areas, and are therefore an attractive target for terrorist attack via direct contamination, or backflow events. Instrumental monitoring of such systems may be used to signal the prese...

  14. Experimental verification of improved depth-dose distribution using hyper-thermal neutron incidence in neutron capture therapy

    NASA Astrophysics Data System (ADS)

    Sakurai, Yoshinori; Kobayashi, Tooru

    2001-01-01

    We have proposed the utilization of 'hyper-thermal neutrons' for neutron capture therapy (NCT) from the viewpoint of improving the dose distribution in the human body. In order to verify the improved depth-dose distribution due to hyper-thermal neutron incidence, two experiments were carried out using a test-type hyper-thermal neutron generator at a thermal neutron irradiation field of the Kyoto University Reactor (KUR), which is actually utilized for NCT clinical irradiation. From the free-in-air experiment on the spectrum-shift characteristics, it was confirmed that hyper-thermal neutrons of approximately 860 K at maximum could be obtained by the generator. From the phantom experiment, the improvement effect on and the controllability of the depth-dose distribution were confirmed. For example, the relative neutron depth-dose distribution was improved by about 1 cm with the 860 K hyper-thermal neutron incidence, compared to normal thermal neutron incidence.

  15. Experimental verification of improved depth-dose distribution using hyper-thermal neutron incidence in neutron capture therapy.

    PubMed

    Sakurai, Y; Kobayashi, T

    2001-01-01

    We have proposed the utilization of 'hyper-thermal neutrons' for neutron capture therapy (NCT) from the viewpoint of improving the dose distribution in the human body. In order to verify the improved depth-dose distribution due to hyper-thermal neutron incidence, two experiments were carried out using a test-type hyper-thermal neutron generator at a thermal neutron irradiation field of the Kyoto University Reactor (KUR), which is actually utilized for NCT clinical irradiation. From the free-in-air experiment on the spectrum-shift characteristics, it was confirmed that hyper-thermal neutrons of approximately 860 K at maximum could be obtained by the generator. From the phantom experiment, the improvement effect on and the controllability of the depth-dose distribution were confirmed. For example, the relative neutron depth-dose distribution was improved by about 1 cm with the 860 K hyper-thermal neutron incidence, compared to normal thermal neutron incidence.

  16. Global plasma simulation of charge state distribution inside a 2.45 GHz ECR plasma with experimental verification

    NASA Astrophysics Data System (ADS)

    Bodendorfer, M.; Wurz, P.; Hohl, M.

    2010-08-01

    For the first time, the charge state distribution inside the MEsskammer für FlugzeitInStrumente und Time-Of-Flight (MEFISTO) electron cyclotron resonance (ECR) plasma and in the extracted ion beam was successfully simulated. A self-consistent ECR plasma ionization model (Hohl M 2002 MEFISTO II: Design, setup, characterization and operation of an improved calibration facility for solar plasma instrumentation PhD Thesis University of Bern) was further developed, recomputing the ion confinement time for every ion species and at every time step based on the actual plasma potential, rather than using a prescribed constant ion confinement time. The simulation starts from a user-defined set of initial conditions and advances the problem in time with an adaptive-step-length fourth-order Runge-Kutta (RK4) solver, computing particle densities from ionization rates, recombination rates, ion confinement times and the plasma potential. At the end of the simulation, a steady-state ion charge state distribution is reached, which is in excellent agreement with the measured ion beam charge state distribution of the MEFISTO ion source for Ar1+ to Ar5+ and in good agreement for Ar6+.
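    To make the structure of such a model concrete, here is a compact sketch of a stepwise-ionization rate balance advanced to steady state with RK4. All rate coefficients, the confinement time and the neutral source term are hypothetical placeholders; the actual model additionally recomputes confinement times from the plasma potential at every step.

    ```python
    import numpy as np

    NE = 1e17                                            # electron density [m^-3], illustrative
    S = np.array([3e-15, 1e-15, 3e-16, 8e-17, 2e-17])    # rate coeffs i -> i+1 [m^3/s], hypothetical
    TAU = 2e-3                                           # ion confinement time [s], hypothetical constant
    SOURCE = 1e19                                        # neutral gas feed [m^-3 s^-1], hypothetical

    def dndt(n):
        """Stepwise-ionization balance for charge states 0 (neutral) .. 5 (5+)."""
        dn = np.zeros_like(n)
        ion = NE * S * n[:-1]          # ionization fluxes between neighbouring states
        dn[:-1] -= ion                 # losses from state i
        dn[1:] += ion                  # gains into state i+1
        dn[0] += SOURCE                # neutral gas feed
        dn[1:] -= n[1:] / TAU          # ion losses to extraction/walls
        return dn

    def rk4_step(n, dt):
        k1 = dndt(n)
        k2 = dndt(n + 0.5 * dt * k1)
        k3 = dndt(n + 0.5 * dt * k2)
        k4 = dndt(n + dt * k3)
        return n + dt / 6.0 * (k1 + 2 * k2 + 2 * k3 + k4)

    n = np.zeros(6)
    n[0] = 1e18                        # start with neutral gas only
    for _ in range(200_000):           # 1 us steps out to 0.2 s, near steady state
        n = rk4_step(n, 1e-6)
    print("steady-state charge state distribution:", n[1:] / n[1:].sum())
    ```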

  17. Development and verification of an analytical algorithm to predict absorbed dose distributions in ocular proton therapy using Monte Carlo simulations.

    PubMed

    Koch, Nicholas C; Newhauser, Wayne D

    2010-02-01

    Proton beam radiotherapy is an effective and non-invasive treatment for uveal melanoma. Recent research efforts have focused on improving the dosimetric accuracy of treatment planning and overcoming the present limitation of relative analytical dose calculations. Monte Carlo algorithms have been shown to accurately predict dose per monitor unit (D/MU) values, but this has yet to be shown for analytical algorithms dedicated to ocular proton therapy, which are typically less computationally expensive than Monte Carlo algorithms. The objective of this study was to determine if an analytical method could predict absolute dose distributions and D/MU values for a variety of treatment fields like those used in ocular proton therapy. To accomplish this objective, we used a previously validated Monte Carlo model of an ocular nozzle to develop an analytical algorithm to predict three-dimensional distributions of D/MU values from pristine Bragg peaks and therapeutically useful spread-out Bragg peaks (SOBPs). Results demonstrated generally good agreement between the analytical and Monte Carlo absolute dose calculations. While agreement in the proximal region decreased for beams with less penetrating Bragg peaks compared with the open-beam condition, the difference was shown to be largely attributable to edge-scattered protons. A method for including this effect in any future analytical algorithm was proposed. Comparisons of D/MU values showed typical agreement to within 0.5%. We conclude that analytical algorithms can be employed to accurately predict absolute proton dose distributions delivered by an ocular nozzle.

  18. TPS verification with UUT simulation

    NASA Astrophysics Data System (ADS)

    Wang, Guohua; Meng, Xiaofeng; Zhao, Ruixian

    2006-11-01

    Verification of a TPS (Test Program Set), or first-article acceptance testing, commonly depends on fault-insertion experiments on the UUT (Unit Under Test). However, the failure modes that can be injected on a UUT are limited, and fault insertion is almost infeasible when the UUT is in development or in a distributed state. To resolve this problem, a TPS verification method based on UUT interface-signal simulation is put forward. Interoperability between the ATS (automatic test system) and the UUT simulation platform is very important for realizing automatic TPS verification. After analyzing the ATS software architecture, an approach to realize interoperability between the ATS software and the UUT simulation platform is proposed, and the UUT simulation platform software architecture is then derived from the ATS software architecture. The hardware composition and software architecture of the UUT simulation platform are described in detail. The UUT simulation platform has been implemented in avionics equipment TPS development, debugging and verification.

  19. Radar prediction of absolute rain fade distributions for earth-satellite paths and general methods for extrapolation of fade statistics to other locations

    NASA Technical Reports Server (NTRS)

    Goldhirsh, J.

    1982-01-01

    The first absolute rain fade distribution method described establishes absolute fade statistics at a given site by means of a sampled radar data base. The second method extrapolates absolute fade statistics from one location to another, given simultaneously measured fade and rain rate statistics at the former. Both methods employ similar conditional fade statistic concepts and long term rain rate distributions. Probability deviations in the 2-19% range, with an 11% average, were obtained upon comparison of measured and predicted levels at given attenuations. The extrapolation of fade distributions to other locations at 28 GHz showed very good agreement with measured data at three sites located in the continental temperate region.
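    The extrapolation logic combines conditional fade statistics with the long-term rain-rate distribution of the target site: P(A ≥ a) = Σ_R P(A ≥ a | R) · P(R). A minimal sketch; the rain-rate probabilities and conditional exceedance curves below are hypothetical placeholders, not the paper's radar-derived tables.

    ```python
    import numpy as np

    # Long-term probability of rain-rate classes [mm/h] at the target site (hypothetical)
    rain_rate_prob = {5: 0.020, 10: 0.008, 25: 0.002, 50: 0.0005}

    # Conditional probability that attenuation exceeds `a` dB given each rain-rate
    # class, as measured at the reference site (hypothetical exceedance curves)
    p_fade_given_rate = {
        5:  lambda a: np.exp(-a / 2.0),
        10: lambda a: np.exp(-a / 4.0),
        25: lambda a: np.exp(-a / 8.0),
        50: lambda a: np.exp(-a / 12.0),
    }

    def p_fade_exceeded(a_db):
        """Absolute probability that the path attenuation exceeds a_db."""
        return sum(p_r * p_fade_given_rate[r](a_db)
                   for r, p_r in rain_rate_prob.items())

    for a in (3, 6, 10):
        print(f"P(A >= {a} dB) = {p_fade_exceeded(a):.2e}")
    ```

    Extrapolation to another site amounts to swapping in that site's long-term rain-rate probabilities while keeping the conditional fade statistics fixed.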

  20. Implementation of a novel double-side technique for partial discharge detection and location in covered conductor overhead distribution networks

    NASA Astrophysics Data System (ADS)

    He, Weisheng; Li, Hongjie; Liang, Deliang; Sun, Haojie; Yang, Chenbo; Wei, Jinqu; Yuan, Zhijian

    2015-12-01

    Partial discharge (PD) detection has proven to be one of the most widely accepted techniques for on-line condition monitoring and predictive maintenance of power apparatus. A powerful tool for detecting PD in covered-conductor (CC) lines is urgently needed to improve the asset management of CC overhead distribution lines. In this paper, an appropriate, portable and simple system designed to detect PD activity in CC lines and ultimately pinpoint the PD source is developed and tested. The system is based on a novel double-side synchronised PD measurement technique driven by pulse injection. Emphasis is placed on the proposed PD-location mechanism and hardware structure, with descriptions of the pulse-injection process, detection device, synchronisation principle and PD-location algorithm. The system is simulated using ATP-EMTP, and the simulated results are found to be consistent with the actual simulation layout. For further validation, the capability of the system is tested in a high-voltage laboratory experiment using a 10-kV CC line with cross-linked polyethylene insulation.
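    Double-side (two-ended) location rests on a simple time-difference-of-arrival relation: with synchronized detectors at both ends of a line of length L, a PD pulse born at distance x from end A satisfies Δt = t_A − t_B = (2x − L)/v, so x = (L + vΔt)/2, with v the pulse propagation speed. A minimal sketch under assumed values; the line length, propagation speed and arrival times are hypothetical.

    ```python
    # Double-side (two-ended) PD location on a covered-conductor line.
    # All values are hypothetical illustrations.

    L = 2_000.0   # line length [m]
    V = 1.8e8     # pulse propagation speed [m/s], illustrative for a CC line

    def locate_pd(t_a, t_b):
        """Distance of the PD source from end A, given synchronized arrival
        times t_a, t_b [s] at ends A and B."""
        dt = t_a - t_b
        return 0.5 * (L + V * dt)

    # Pulse source 600 m from A: arrives at A after 600/V s, at B after 1400/V s
    t_a, t_b = 600.0 / V, 1400.0 / V
    print(f"estimated PD location: {locate_pd(t_a, t_b):.1f} m from end A")
    ```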

  1. Phase Velocity and Full-Waveform Analysis of Co-located Distributed Acoustic Sensing (DAS) Channels and Geophone Sensor

    NASA Astrophysics Data System (ADS)

    Parker, L.; Mellors, R. J.; Thurber, C. H.; Wang, H. F.; Zeng, X.

    2015-12-01

    A 762-meter Distributed Acoustic Sensing (DAS) array with a channel spacing of one meter was deployed at the Garner Valley Downhole Array in Southern California. The array was approximately rectangular with dimensions of 180 meters by 80 meters. The array also included two subdiagonals within the rectangle along which three-component geophones were co-located. Several active sources were deployed, including a 45-kN, swept-frequency, shear-mass shaker, which produced strong Rayleigh waves across the array. Both DAS and geophone traces were filtered in 2-Hz steps between 4 and 20 Hz to obtain phase velocities as a function of frequency from fitting the moveout of travel times over distances of 35 meters or longer. As an alternative to this traditional means of finding phase velocity, it is theoretically possible to find the Rayleigh-wave phase velocity at each point of co-location as the ratio of DAS and geophone responses, because DAS is sensitive to ground strain and geophones are sensitive to ground velocity, after suitable corrections for instrument response (Mikumo & Aki, 1964). The concept was tested in WPP, a seismic wave propagation program, by first validating and then using a 3D synthetic, full-waveform seismic model to simulate the effect of increased levels of noise and uncertainty as data go from ideal to more realistic. The results obtained from this study provide a better understanding of the DAS response and its potential for being combined with traditional seismometers for obtaining phase velocity at a single location. This analysis is part of the PoroTomo project (Poroelastic Tomography by Adjoint Inverse Modeling of Data from Seismology, Geodesy, and Hydrology, http://geoscience.wisc.edu/feigl/porotomo).
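    The ratio idea can be stated compactly: for a plane wave of phase velocity c, the axial strain recorded by DAS is ε = −v/c, where v is the particle velocity recorded by the co-located geophone, so c ≈ |V(f)|/|E(f)| in the frequency domain after instrument-response corrections. A minimal sketch with a synthetic plane wave; all signals, names and parameters are hypothetical.

    ```python
    import numpy as np

    def phase_velocity_from_ratio(das_strain, geophone_vel, fs, f_lo=4.0, f_hi=20.0):
        """Estimate phase velocity c(f) ~ |V(f)| / |E(f)| from co-located DAS
        strain and geophone particle-velocity records (plane-wave assumption)."""
        n = len(das_strain)
        freqs = np.fft.rfftfreq(n, d=1.0 / fs)
        E = np.fft.rfft(das_strain)     # strain spectrum
        V = np.fft.rfft(geophone_vel)   # particle-velocity spectrum
        band = (freqs >= f_lo) & (freqs <= f_hi)
        return freqs[band], np.abs(V[band]) / np.abs(E[band])

    # Synthetic 10 Hz plane wave with c = 500 m/s: strain = -v / c
    fs, c_true = 1000.0, 500.0
    t = np.arange(0, 2.0, 1.0 / fs)
    v = np.sin(2 * np.pi * 10.0 * t)
    eps = -v / c_true
    f, c_est = phase_velocity_from_ratio(eps, v, fs)
    print(f"c at 10 Hz ~ {c_est[np.argmin(abs(f - 10.0))]:.0f} m/s")
    ```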

  2. KAT-7 SCIENCE VERIFICATION: USING H I OBSERVATIONS OF NGC 3109 TO UNDERSTAND ITS KINEMATICS AND MASS DISTRIBUTION

    SciTech Connect

    Carignan, C.; Frank, B. S.; Hess, K. M.; Lucero, D. M.; Randriamampandry, T. H.; Goedhart, S.; Passmoor, S. S.

    2013-09-15

    H I observations of the Magellanic-type spiral NGC 3109, obtained with the seven dish Karoo Array Telescope (KAT-7), are used to analyze its mass distribution. Our results are compared to those obtained using Very Large Array (VLA) data. KAT-7 is a pathfinder of the Square Kilometer Array precursor MeerKAT, which is under construction. The short baselines and low system temperature of the telescope make it sensitive to large-scale, low surface brightness emission. The new observations with KAT-7 allow the measurement of the rotation curve (RC) of NGC 3109 out to 32', doubling the angular extent of existing measurements. A total H I mass of 4.6 × 10⁸ M☉ is derived, 40% more than what is detected by the VLA observations. The observationally motivated pseudo-isothermal dark matter (DM) halo model can reproduce the observed RC very well, but the cosmologically motivated Navarro-Frenk-White DM model gives a much poorer fit to the data. While having a more accurate gas distribution has reduced the discrepancy between the observed RC and the MOdified Newtonian Dynamics (MOND) models, this is done at the expense of having to use unrealistic mass-to-light ratios for the stellar disk and/or very large values for the MOND universal constant a₀. Different distances or H I contents cannot reconcile MOND with the observed kinematics, in view of the small errors on these two quantities. As with many slowly rotating gas-rich galaxies studied recently, the present result for NGC 3109 continues to pose a serious challenge to the MOND theory.
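    Both halo models named above have closed-form circular-velocity curves: v²(r) = 4πGρ₀r_c²[1 − (r_c/r)arctan(r/r_c)] for the pseudo-isothermal halo, and v²(r) = (4πGρ₀r_s³/r)[ln(1 + r/r_s) − (r/r_s)/(1 + r/r_s)] for the NFW halo. A minimal sketch for evaluating them; the parameter values are hypothetical, not fits to NGC 3109.

    ```python
    import numpy as np

    G = 4.30091e-6   # gravitational constant in kpc (km/s)^2 / Msun

    def v_iso(r, rho0, rc):
        """Pseudo-isothermal halo circular velocity [km/s];
        r, rc in kpc, central density rho0 in Msun/kpc^3."""
        return np.sqrt(4 * np.pi * G * rho0 * rc**2
                       * (1 - (rc / r) * np.arctan(r / rc)))

    def v_nfw(r, rho0, rs):
        """NFW halo circular velocity [km/s]; r, rs in kpc, rho0 in Msun/kpc^3."""
        x = r / rs
        return np.sqrt(4 * np.pi * G * rho0 * rs**3
                       * (np.log(1 + x) - x / (1 + x)) / r)

    r = np.linspace(0.5, 30, 60)   # radii [kpc]
    # Hypothetical parameters, chosen only to illustrate the two curve shapes
    print("iso:", np.round(v_iso(r, rho0=5e7, rc=2.0)[:3], 1), "km/s ...")
    print("nfw:", np.round(v_nfw(r, rho0=1e7, rs=10.0)[:3], 1), "km/s ...")
    ```

    The qualitative difference driving the fit result is visible in these forms: the pseudo-isothermal curve rises nearly linearly through a constant-density core, while the NFW cusp forces a steeper inner rise than slowly rotating dwarfs like NGC 3109 typically show.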

  3. Verification of Anderson Superexchange in MnO via Magnetic Pair Distribution Function Analysis and ab initio Theory.

    PubMed

    Frandsen, Benjamin A; Brunelli, Michela; Page, Katharine; Uemura, Yasutomo J; Staunton, Julie B; Billinge, Simon J L

    2016-05-13

    We present a temperature-dependent atomic and magnetic pair distribution function (PDF) analysis of neutron total scattering measurements of antiferromagnetic MnO, an archetypal strongly correlated transition-metal oxide. The known antiferromagnetic ground-state structure fits the low-temperature data closely with refined parameters that agree with conventional techniques, confirming the reliability of the newly developed magnetic PDF method. The measurements performed in the paramagnetic phase reveal significant short-range magnetic correlations on a ∼1  nm length scale that differ substantially from the low-temperature long-range spin arrangement. Ab initio calculations using a self-interaction-corrected local spin density approximation of density functional theory predict magnetic interactions dominated by Anderson superexchange and reproduce the measured short-range magnetic correlations to a high degree of accuracy. Further calculations simulating an additional contribution from a direct exchange interaction show much worse agreement with the data. The Anderson superexchange model for MnO is thus verified by experimentation and confirmed by ab initio theory. PMID:27232042

  4. Verification of Anderson Superexchange in MnO via Magnetic Pair Distribution Function Analysis and ab initio Theory

    NASA Astrophysics Data System (ADS)

    Frandsen, Benjamin A.; Brunelli, Michela; Page, Katharine; Uemura, Yasutomo J.; Staunton, Julie B.; Billinge, Simon J. L.

    2016-05-01

    We present a temperature-dependent atomic and magnetic pair distribution function (PDF) analysis of neutron total scattering measurements of antiferromagnetic MnO, an archetypal strongly correlated transition-metal oxide. The known antiferromagnetic ground-state structure fits the low-temperature data closely with refined parameters that agree with conventional techniques, confirming the reliability of the newly developed magnetic PDF method. The measurements performed in the paramagnetic phase reveal significant short-range magnetic correlations on a ˜1 nm length scale that differ substantially from the low-temperature long-range spin arrangement. Ab initio calculations using a self-interaction-corrected local spin density approximation of density functional theory predict magnetic interactions dominated by Anderson superexchange and reproduce the measured short-range magnetic correlations to a high degree of accuracy. Further calculations simulating an additional contribution from a direct exchange interaction show much worse agreement with the data. The Anderson superexchange model for MnO is thus verified by experimentation and confirmed by ab initio theory.

  5. Verification of Anderson superexchange in MnO via magnetic pair distribution function analysis and ab initio theory

    DOE PAGES

    Benjamin A. Frandsen; Brunelli, Michela; Page, Katharine; Uemura, Yasutomo J.; Staunton, Julie B.; Billinge, Simon J. L.

    2016-05-11

    Here, we present a temperature-dependent atomic and magnetic pair distribution function (PDF) analysis of neutron total scattering measurements of antiferromagnetic MnO, an archetypal strongly correlated transition-metal oxide. The known antiferromagnetic ground-state structure fits the low-temperature data closely with refined parameters that agree with conventional techniques, confirming the reliability of the newly developed magnetic PDF method. The measurements performed in the paramagnetic phase reveal significant short-range magnetic correlations on a ~1 nm length scale that differ substantially from the low-temperature long-range spin arrangement. Ab initio calculations using a self-interaction-corrected local spin density approximation of density functional theory predict magnetic interactions dominated by Anderson superexchange and reproduce the measured short-range magnetic correlations to a high degree of accuracy. Further calculations simulating an additional contribution from a direct exchange interaction show much worse agreement with the data. The Anderson superexchange model for MnO is thus verified by experimentation and confirmed by ab initio theory.

  6. Atmospheric aerosols size distribution properties in winter and pre-monsoon over western Indian Thar Desert location

    NASA Astrophysics Data System (ADS)

    Panwar, Chhagan; Vyas, B. M.

    2016-05-01

    The first experimental results over the Indian Thar Desert region concerning the height-integrated aerosol size distribution for particle radii between 0.09 and 2 µm are described: the aerosol columnar size distribution (CSD), effective radius (Reff), integrated content of total aerosols (Nt), and columnar contents of accumulation-size (Na, radius < 0.5 µm) and coarse-size (Nc, radius 0.5-2 µm) particles, specifically during winter (a stable-weather period of intense anthropogenic pollution activity) and the pre-monsoon (a period of intense natural mineral-dust storms and unstable atmospheric weather) at Jaisalmer (26.90°N, 69.90°E, 220 m above sea level (asl)), located in the central Thar Desert of western India. The CSD and the other derived aerosol size parameters are retrieved from the average spectral characteristics of the Aerosol Optical Thickness (AOT), measured from the UV to the infrared with a Multi-Wavelength solar Radiometer (MWR). The CSD is, in general, bimodal in character, rather than uniformly distributed or following a power law. The observed primary peaks in the CSD plots are around 10¹³ m⁻² µm⁻¹ in the radius range 0.09-0.20 µm during both seasons. In the winter months, secondary peaks with lower CSD values of 10¹⁰ to 10¹¹ m⁻² µm⁻¹ occur in the radius range 0.4-0.6 µm. In contrast, in the dust-dominated hot season, a dominant secondary maximum with a higher CSD of about 10¹² m⁻² µm⁻¹ is found for larger particles in the range 0.6-1.0 µm, clearly demonstrating a higher loading of larger aerosol particles in the summer months relative to the lower loading of smaller particles (0.4-0.6 µm) in the cold months. Several other interesting features of the changing nature of the monthly spectral AOT
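    The derived quantities above are moments of the columnar size distribution n_c(r) = dN/dr: Nt = ∫ n_c dr, Na and Nc are the same integral split at r = 0.5 µm, and Reff = ∫ r³n_c dr / ∫ r²n_c dr. A minimal sketch, using a hypothetical bimodal lognormal CSD in place of a real MWR retrieval.

    ```python
    import numpy as np

    def lognormal_mode(r, n0, rm, s):
        """One lognormal mode of a columnar size distribution dN/dr (hypothetical)."""
        return (n0 / (r * np.log(s) * np.sqrt(2 * np.pi))
                * np.exp(-np.log(r / rm) ** 2 / (2 * np.log(s) ** 2)))

    r = np.linspace(0.09, 2.0, 2000)   # radius grid [um], matching the stated range
    # Hypothetical bimodal CSD: accumulation mode near 0.15 um, coarse mode near 0.8 um
    n = lognormal_mode(r, 1e13, 0.15, 1.6) + lognormal_mode(r, 1e11, 0.8, 1.5)

    nt = np.trapz(n, r)                                   # total columnar content
    na = np.trapz(n[r < 0.5], r[r < 0.5])                 # accumulation (< 0.5 um)
    nc = nt - na                                          # coarse (0.5 - 2 um)
    reff = np.trapz(r**3 * n, r) / np.trapz(r**2 * n, r)  # effective radius [um]
    print(f"Nt={nt:.3e}, Na={na:.3e}, Nc={nc:.3e}, Reff={reff:.2f} um")
    ```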

  7. Physician Location Selection and Distribution. A Bibliography of Relevant Articles, Reports and Data Sources. Health Manpower Policy Discussion Paper Series No. D3.

    ERIC Educational Resources Information Center

    Crane, Stephen C.; Reynolds, Juanita

    This bibliography provides background material on two general issues of how physicians are distributed geographically and how physicians choose a practice location. The report is divided into five major categories of information: overview summary of annotated articles, reference key to location decision factors, reference key to public policy…

  8. Swarm Verification

    NASA Technical Reports Server (NTRS)

    Holzmann, Gerard J.; Joshi, Rajeev; Groce, Alex

    2008-01-01

    Reportedly, supercomputer designer Seymour Cray once said that he would sooner use two strong oxen to plow a field than a thousand chickens. Although this is undoubtedly wise when it comes to plowing a field, it is not so clear for other types of tasks. Model checking problems are of the proverbial "search for the needle in a haystack" type. Such problems can often be parallelized easily. Alas, none of the usual divide and conquer methods can be used to parallelize the working of a model checker. Given that it has become easier than ever to gain access to large numbers of computers to perform even routine tasks, it is becoming more and more attractive to find alternate ways to use these resources to speed up model checking tasks. This paper describes one such method, called swarm verification.

  9. Simple Syringe Filtration Methods for Reliably Examining Dissolved and Colloidal Trace Element Distributions in Remote Field Locations

    NASA Astrophysics Data System (ADS)

    Shiller, A. M.

    2002-12-01

    Methods for obtaining reliable dissolved trace element samples frequently utilize clean labs, portable laminar flow benches, or other equipment not readily transportable to remote locations. In some cases unfiltered samples can be obtained in a remote location and transported back to a lab for filtration. However, this may not always be possible or desirable. Additionally, methods for obtaining information on colloidal composition are likewise frequently too cumbersome for remote locations as well as being time-consuming. For that reason I have examined clean methods for collecting samples filtered through 0.45 and 0.02 micron syringe filters. With this methodology, only small samples are collected (typically 15 mL). However, with the introduction of the latest generation of ICP-MS's and microflow nebulizers, sample requirements for elemental analysis are much lower than just a few years ago. Thus, a determination of a suite of first row transition elements is frequently readily obtainable with samples of less than 1 mL. To examine the "traditional" (<0.45 micron) dissolved phase, 25 mm diameter polypropylene syringe filters and all polyethylene/polypropylene syringes are utilized. Filters are pre-cleaned in the lab using 40 mL of approx. 1 M HCl followed by a clean water rinse. Syringes are pre-cleaned by leaching with hot 1 M HCl followed by a clean water rinse. Sample kits are packed in polyethylene bags for transport to the field. Results are similar to results obtained using 0.4 micron polycarbonate screen filters, though concentrations may differ somewhat depending on the extent of sample pre-rinsing of the filter. Using this method, a multi-year time series of dissolved metals in a remote Rocky Mountain stream has been obtained. To examine the effect of colloidal material on dissolved metal concentrations, 0.02 micron alumina syringe filters have been utilized. Other workers have previously used these filters for examining colloidal Fe distributions in lake

  10. The prevalence and distribution of gastrointestinal parasites of stray and refuge dogs in four locations in India.

    PubMed

    Traub, Rebecca J; Pednekar, Riddhi P; Cuttell, Leigh; Porter, Ronald B; Abd Megat Rani, Puteri Azaziah; Gatne, Mukulesh L

    2014-09-15

    A gastrointestinal parasite survey of 411 stray and refuge dogs sampled from four geographically and climatically distinct locations in India revealed these animals to represent a significant source of environmental contamination for parasites that pose a zoonotic risk to the public. Hookworms were the most commonly identified parasite in dogs in Sikkim (71.3%), Mumbai (48.8%) and Delhi (39.1%). In Ladakh, which experiences harsh extremes of climate, a competitive advantage was observed for parasites such as Sarcocystis spp. (44.2%), Taenia hydatigena (30.3%) and Echinococcus granulosus (2.3%) that utilise intermediate hosts for the completion of their life cycle. PCR identified Ancylostoma ceylanicum and Ancylostoma caninum to occur sympatrically, either as single or mixed infections, in Sikkim (Northeast) and Mumbai (West). In Delhi, A. caninum was the only species identified in dogs, probably owing to its ability to evade unfavourable climatic conditions by undergoing arrested development in host tissue. The expansion of the known distribution of A. ceylanicum to the west, as far as Mumbai, justifies the renewed interest in this emerging zoonosis and advocates for its surveillance in future human parasite surveys. Of interest was the absence of Trichuris vulpis in dogs, in support of previous canine surveys in India. This study advocates the continuation of birth control programmes in stray dogs, which will undoubtedly have spill-over effects in reducing the levels of environmental contamination with parasite stages. In particular, owners of pet animals exposed to these environments must be extra vigilant in ensuring their animals are regularly dewormed and in maintaining strict standards of household and personal hygiene.

  11. The prevalence and distribution of gastrointestinal parasites of stray and refuge dogs in four locations in India.

    PubMed

    Traub, Rebecca J; Pednekar, Riddhi P; Cuttell, Leigh; Porter, Ronald B; Abd Megat Rani, Puteri Azaziah; Gatne, Mukulesh L

    2014-09-15

    A gastrointestinal parasite survey of 411 stray and refuge dogs sampled from four geographically and climatically distinct locations in India revealed these animals to represent a significant source of environmental contamination for parasites that pose a zoonotic risk to the public. Hookworms were the most commonly identified parasite in dogs in Sikkim (71.3%), Mumbai (48.8%) and Delhi (39.1%). In Ladakh, which experiences harsh extremes of climate, a competitive advantage was observed for parasites such as Sarcocystis spp. (44.2%), Taenia hydatigena (30.3%) and Echinococcus granulosus (2.3%) that utilise intermediate hosts for the completion of their life cycle. PCR identified Ancylostoma ceylanicum and Ancylostoma caninum to occur sympatrically, either as single or mixed infections, in Sikkim (Northeast) and Mumbai (West). In Delhi, A. caninum was the only species identified in dogs, probably owing to its ability to evade unfavourable climatic conditions by undergoing arrested development in host tissue. The expansion of the known distribution of A. ceylanicum to the west, as far as Mumbai, justifies the renewed interest in this emerging zoonosis and advocates for its surveillance in future human parasite surveys. Of interest was the absence of Trichuris vulpis in dogs, in support of previous canine surveys in India. This study advocates the continuation of birth control programmes in stray dogs, which will undoubtedly have spill-over effects in reducing the levels of environmental contamination with parasite stages. In particular, owners of pet animals exposed to these environments must be extra vigilant in ensuring their animals are regularly dewormed and in maintaining strict standards of household and personal hygiene. PMID:25139393

  12. Generic Verification Protocol for Verification of Online Turbidimeters

    EPA Science Inventory

    This protocol provides generic procedures for implementing a verification test for the performance of online turbidimeters. The verification tests described in this document will be conducted under the Environmental Technology Verification (ETV) Program. Verification tests will...

  13. Candida parapsilosis (sensu lato) isolated from hospitals located in the Southeast of Brazil: Species distribution, antifungal susceptibility and virulence attributes.

    PubMed

    Ziccardi, Mariangela; Souza, Lucieri O P; Gandra, Rafael M; Galdino, Anna Clara M; Baptista, Andréa R S; Nunes, Ana Paula F; Ribeiro, Mariceli A; Branquinha, Marta H; Santos, André L S

    2015-12-01

    Candida parapsilosis (sensu lato), which represents a fungal complex composed of three genetically related species - Candida parapsilosis sensu stricto, Candida orthopsilosis and Candida metapsilosis - has emerged as an important yeast causing fungemia worldwide. The goal of the present work was to assess the prevalence, antifungal susceptibility and production of virulence traits in 53 clinical isolates previously identified as C. parapsilosis (sensu lato) obtained from hospitals located in the Southeast of Brazil. The species forming this fungal complex are physiologically/morphologically indistinguishable; however, polymerase chain reaction followed by restriction fragment length polymorphism of the FKS1 gene resolved this identification problem, revealing that 43 (81.1%) isolates were C. parapsilosis sensu stricto and 10 (18.9%) were C. orthopsilosis. No C. metapsilosis was found. The geographic distribution of these Candida species was uniform among the studied Brazilian states (São Paulo, Rio de Janeiro and Espírito Santo). All C. orthopsilosis and almost all C. parapsilosis sensu stricto (95.3%) isolates were susceptible to amphotericin B, fluconazole, itraconazole, voriconazole and caspofungin. Nevertheless, one C. parapsilosis sensu stricto isolate was resistant to fluconazole and another was resistant to caspofungin. C. parapsilosis sensu stricto isolates exhibited higher MIC mean values for amphotericin B, fluconazole and caspofungin than C. orthopsilosis, while C. orthopsilosis isolates displayed a higher MIC mean for itraconazole than C. parapsilosis sensu stricto. Identical MIC mean values for voriconazole were measured for the two species. All isolates of both species were able to form biofilm on a polystyrene surface. Impressively, biofilm-growing cells of C. parapsilosis sensu stricto and C. orthopsilosis exhibited considerable resistance to all antifungal agents tested. Pseudohyphae were observed in 67.4% and 80

  14. Columbus pressurized module verification

    NASA Technical Reports Server (NTRS)

    Messidoro, Piero; Comandatore, Emanuele

    1986-01-01

    The baseline verification approach of the COLUMBUS Pressurized Module was defined during the A and B1 project phases. Peculiarities of the verification program are the testing requirements derived from the permanent manned presence in space. The model philosophy and the test program have been developed in line with the overall verification concept. Such critical areas as meteoroid protections, heat pipe radiators and module seals are identified and tested. Verification problem areas are identified and recommendations for the next development are proposed.

  15. Verification of VLSI designs

    NASA Technical Reports Server (NTRS)

    Windley, P. J.

    1991-01-01

    In this paper we explore the specification and verification of VLSI designs. The paper focuses on abstract specification and verification of functionality using mathematical logic, as opposed to low-level boolean equivalence verification such as that done using BDDs and model checking. Specification and verification, sometimes called formal methods, is one tool for increasing computer dependability in the face of an exponentially increasing testing effort.

  16. Verification of VENTSAR

    SciTech Connect

    Simpkins, A.A.

    1995-01-01

    The VENTSAR code is an upgraded and improved version of the VENTX code, which estimates concentrations on or near a building from a release at a nearby location. The code calculates the concentrations either for a given meteorological exceedance probability or for a given stability and wind speed combination. A single building can be modeled which lies in the path of the plume, or a penthouse can be added to the top of the building. Plume rise may also be considered. Release types can be either chemical or radioactive. Downwind concentrations are determined at user-specified incremental distances. This verification report was prepared to demonstrate that VENTSAR is properly executing all algorithms and transferring data. Hand calculations were also performed to ensure proper application of methodologies.

  17. Requirements, Verification, and Compliance (RVC) Database Tool

    NASA Technical Reports Server (NTRS)

    Rainwater, Neil E., II; McDuffee, Patrick B.; Thomas, L. Dale

    2001-01-01

    This paper describes the development, design, and implementation of the Requirements, Verification, and Compliance (RVC) database used on the International Space Welding Experiment (ISWE) project managed at Marshall Space Flight Center. The RVC is a systems engineer's tool for automating and managing the following information: requirements; requirements traceability; verification requirements; verification planning; verification success criteria; and compliance status. This information normally contained within documents (e.g. specifications, plans) is contained in an electronic database that allows the project team members to access, query, and status the requirements, verification, and compliance information from their individual desktop computers. Using commercial-off-the-shelf (COTS) database software that contains networking capabilities, the RVC was developed not only with cost savings in mind but primarily for the purpose of providing a more efficient and effective automated method of maintaining and distributing the systems engineering information. In addition, the RVC approach provides the systems engineer the capability to develop and tailor various reports containing the requirements, verification, and compliance information that meets the needs of the project team members. The automated approach of the RVC for capturing and distributing the information improves the productivity of the systems engineer by allowing that person to concentrate more on the job of developing good requirements and verification programs and not on the effort of being a "document developer".

  18. Verification of Adaptive Systems

    SciTech Connect

    Pullum, Laura L; Cui, Xiaohui; Vassev, Emil; Hinchey, Mike; Rouff, Christopher; Buskens, Richard

    2012-01-01

    Adaptive systems are critical for future space and other unmanned and intelligent systems. Verification of these systems is also critical for their use in systems with potential harm to human life or with large financial investments. Due to their nondeterministic nature and extremely large state space, current methods for verification of software systems are not adequate to provide a high level of assurance for them. The combination of stabilization science, high performance computing simulations, compositional verification and traditional verification techniques, plus operational monitors, provides a complete approach to verification and deployment of adaptive systems that has not been used before. This paper gives an overview of this approach.

  19. Distribution of somatostatin in the frog brain, Rana catesbiana, in relation to location of catecholamine-containing neuron system.

    PubMed

    Inagaki, S; Shiosaka, S; Takatsuki, K; Sakanaka, M; Takagi, H; Senba, E; Matsuzaki, T; Tohyama, M

    1981-10-10

    The distribution of somatostatin (SRIF)-immunoreactive structures in the central nervous system of the bullfrog (both with and without colchicine treatment) was studied using the indirect immunofluorescence technique of Coons and co-workers (Coons, '58). SRIF-containing cells were observed in more than ten areas, including the spinal cord. These SRIF-positive cells showed a segmental distribution, in that SRIF-positive neurons were identified in various areas at various brain levels. An extensive network of SRIF-positive fibers was found in most parts of the central nervous system. The distribution of the catecholamine (CA)-containing neuron system in the frog brain is also presented in this study. The possible interactions between the SRIF and CA neuron systems are briefly discussed. PMID:6116726

  20. The motor unit potential distribution over the skin surface and its use in estimating the motor unit location.

    PubMed

    Roeleveld, K; Stegeman, D F; Vingerhoets, H M; Van Oosterom, A

    1997-12-01

    The amplitude of a surface electromyogram is dependent on the number of active motor units, their size and the relative position of the recording electrode. It is not possible to interpret the surface electromyogram quantitatively without disentangling these different aspects. In this study the decline of different components of the motor unit potential with increasing radial distance from the motor unit is quantified. Fifty-two motor units in the biceps brachii muscle were studied using 36-channel surface electromyography combined with intramuscular scanning electromyography. Scanning electromyography was used to locate precisely the motor unit. The dependence of the surface motor unit potential magnitude on the radial distance between the motor unit and the recording electrodes can be described fairly well by an inverse power function. The steepness of this function depends on the chosen motor unit potential parameter and the interelectrode distance, but also varies between motor units. The change of the negative peak amplitude of the motor unit potential over the skin surface can be used to give a fairly accurate estimate of the location of the motor unit under the skin surface. We found that for all practical purposes the depth of a motor unit in the biceps brachii muscle can be estimated as 20% of the distance over the skin surface where motor unit potentials can be recorded with higher amplitudes than 50% of the maximal amplitude recorded at the skin surface caused by activity of the same motor unit.
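    The stated rule of thumb is directly computable: estimate the motor unit depth as 20% of the distance over the skin surface where the motor unit potential amplitude exceeds 50% of its surface maximum. A minimal sketch, with a hypothetical amplitude profile standing in for the 36-channel grid.

    ```python
    import numpy as np

    def estimate_mu_depth(positions_mm, amplitudes):
        """Depth estimate: 20% of the surface extent over which the motor unit
        potential amplitude exceeds half of its maximum (rule from the study)."""
        above = amplitudes >= 0.5 * amplitudes.max()
        extent = positions_mm[above].max() - positions_mm[above].min()
        return 0.2 * extent

    # Hypothetical surface profile: electrodes every 5 mm across the biceps
    x = np.arange(0, 100, 5.0)                 # electrode positions [mm]
    amp = np.exp(-((x - 50.0) / 15.0) ** 2)    # bell-shaped negative-peak amplitudes
    print(f"estimated motor unit depth: {estimate_mu_depth(x, amp):.1f} mm")
    ```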

  1. Simulation verification techniques study

    NASA Technical Reports Server (NTRS)

    Schoonmaker, P. B.; Wenglinski, T. H.

    1975-01-01

    Results are summarized of the simulation verification techniques study which consisted of two tasks: to develop techniques for simulator hardware checkout and to develop techniques for simulation performance verification (validation). The hardware verification task involved definition of simulation hardware (hardware units and integrated simulator configurations), survey of current hardware self-test techniques, and definition of hardware and software techniques for checkout of simulator subsystems. The performance verification task included definition of simulation performance parameters (and critical performance parameters), definition of methods for establishing standards of performance (sources of reference data or validation), and definition of methods for validating performance. Both major tasks included definition of verification software and assessment of verification data base impact. An annotated bibliography of all documents generated during this study is provided.

  2. Conversion and Distribution of Cobalamin in Euglena gracilis z, with Special Reference to Its Location and Probable Function within Chloroplasts

    PubMed Central

    Isegawa, Yuji; Nakano, Yoshihisa; Kitaoka, Shozaburo

    1984-01-01

    Cobalamin is essentially required for growth by Euglena gracilis and shown to be converted to coenzyme forms promptly after feeding cyanocobalamin. Concentrations of coenzymes, methylcobalamin, and 5′-deoxyadenosylcobalamin, reached about 1 femtomole/106 cells 2 hours after feeding cyanocobalamin to cobalamin-limited cells. Cobalamins all were bound to proteins in Euglena cells and located in subcellular fractions of chloroplasts, mitochondria, microsomes, and cytosol. Incorporated cobalamin into chloroplasts was localized in thylakoids. Methylcobalamin existed in chloroplasts, mitochondria, and cytosol, while 5′-deoxyadenosylcobalamin was in mitochondria and the cytosol, 2 h after feeding cyanocobalamin to Euglena cells. Quantitative alterations of methylcobalamin and 5′-deoxyadenosylcobalamin in chloroplasts suggest their important functions as coenzymes in this organelle. The occurrence of functional cobalamins in chloroplasts has not been reported in other photosynthetic eukaryotes. PMID:16663929

  3. Environmental Technology Verification Report - Electric Power and Heat Production Using Renewable Biogas at Patterson Farms

    EPA Science Inventory

    The U.S. EPA operates the Environmental Technology Verification program to facilitate the deployment of innovative technologies through performance verification and information dissemination. A technology area of interest is distributed electrical power generation, particularly w...

  4. Software verification and testing

    NASA Technical Reports Server (NTRS)

    1985-01-01

    General procedures for software verification and validation are provided as a guide for managers, programmers, and analysts involved in software development. The verification and validation procedures described are based primarily on testing techniques. Testing refers to the execution of all or part of a software system for the purpose of detecting errors. Planning, execution, and analysis of tests are outlined in this document. Code reading and static analysis techniques for software verification are also described.

  5. Location and distribution of virus antigen in the central nervous system of mice persistently infected with Theiler's virus.

    PubMed

    Sethi, P; Lipton, H L

    1983-02-01

    The present study has shown that virus can be readily detected by immunofluorescent staining in the central nervous system (CNS) of SJL mice persistently infected with Theiler's murine encephalomyelitis virus (TMEV). Considering the low CNS virus content, large amounts of virus antigen were found in the white matter, the site of demyelinating lesions. Virus antigen was detected in all animals killed after post-infection (PI) Day 21, a time which can be considered as the beginning of the persistent phase of this infection, and the appearance of virus antigen in white matter corresponded closely in time with the onset of demyelination. The pathogenesis of this persistent infection can now be reasonably well reconstructed from the temporal observations made in this study. It would appear that between the second and third week PI, virus replication largely shifts from neurons in spinal cord gray matter to other cell types located in white matter. While a lower-grade persistent infection (in terms of the relative number of cells containing virus antigen) is established and maintained in cells in the gray matter and inflammatory and leptomeningeal infiltrates, cells in white matter appear to be mainly responsible for perpetuating the infection. Why these cells should supplant neurons as the most susceptible host cell during the chronic phase of the infection is discussed.

  6. Lunar Pickup Ions Observed by ARTEMIS: Spatial and Temporal Distribution and Constraints on Species and Source Locations

    NASA Technical Reports Server (NTRS)

    Halekas, Jasper S.; Poppe, A. R.; Delory, G. T.; Sarantos, M.; Farrell, W. M.; Angelopoulos, V.; McFadden, J. P.

    2012-01-01

    ARTEMIS observes pickup ions around the Moon, at distances of up to 20,000 km from the surface. The observed ions form a plume with a narrow spatial and angular extent, generally seen in a single energy/angle bin of the ESA instrument. Though ARTEMIS has no mass resolution capability, we can utilize the analytically describable characteristics of pickup ion trajectories to constrain the possible ion masses that can reach the spacecraft at the observation location in the correct energy/angle bin. We find that most of the observations are consistent with a mass range of approx. 20-45 amu, with a smaller fraction consistent with higher masses, and very few consistent with masses below 15 amu. With the assumption that the highest fluxes of pickup ions come from near the surface, the observations favor mass ranges of approx. 20-24 and approx. 36-40 amu. Although many of the observations have properties consistent with a surface or near-surface release of ions, some do not, suggesting that at least some of the observed ions have an exospheric source. Of all the proposed sources for ions and neutrals about the Moon, the pickup ion flux measured by ARTEMIS correlates best with the solar wind proton flux, indicating that sputtering plays a key role in either directly producing ions from the surface, or producing neutrals that subsequently become ionized.
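
    For background, the textbook kinematics of a pickup ion born at rest in a solar wind of speed v_sw with perpendicular magnetic field B (standard relations, not taken from the paper) show why the reachable energy at a fixed observation geometry depends on the ion mass m:

    ```latex
    % Cycloidal motion of an ion created at rest; standard pickup-ion relations
    \begin{gathered}
    \Omega = \frac{qB}{m}, \qquad
    x(t) = \frac{v_{sw}}{\Omega}\bigl(\Omega t - \sin\Omega t\bigr), \qquad
    y(t) = \frac{v_{sw}}{\Omega}\bigl(1 - \cos\Omega t\bigr),\\
    E_{\max} = \tfrac{1}{2}\,m\,(2 v_{sw})^{2} = 2\,m\,v_{sw}^{2}.
    \end{gathered}
    ```

    Heavier ions trace larger cycloids (the spatial scale v_sw/Omega grows with m) and reach higher energies, which is what lets a measured energy/angle bin at a known distance from the source bracket the mass.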

  7. Reliable Entanglement Verification

    NASA Astrophysics Data System (ADS)

    Arrazola, Juan; Gittsovich, Oleg; Donohue, John; Lavoie, Jonathan; Resch, Kevin; Lütkenhaus, Norbert

    2013-05-01

    Entanglement plays a central role in quantum protocols. It is therefore important to be able to verify the presence of entanglement in physical systems from experimental data. In the evaluation of these data, the proper treatment of statistical effects requires special attention, as one can never claim to have verified the presence of entanglement with certainty. Recently, increased attention has been paid to the development of proper frameworks to pose and to answer these types of questions. In this work, we apply recent results by Christandl and Renner on reliable quantum state tomography to construct a reliable entanglement verification procedure based on the concept of confidence regions. The statements made do not require the specification of a prior distribution nor the assumption of an independent and identically distributed (i.i.d.) source of states. Moreover, we develop efficient numerical tools that are necessary to employ this approach in practice, rendering the procedure ready to be employed in current experiments. We demonstrate this fact by analyzing the data of an experiment where photonic entangled two-photon states were generated and whose entanglement is verified with the use of an accessible nonlinear witness.
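
    For intuition only, a simpler finite-statistics statement than the one used in the paper: if entanglement is certified by a witness W with expectation ⟨W⟩ < 0 and each measurement outcome lies in [a, b], Hoeffding's inequality bounds the error probability after n rounds. Unlike the Christandl-Renner approach adopted above, this bound does assume an i.i.d. source:

    ```latex
    % Probability of wrongly certifying entanglement from the empirical mean
    \Pr\bigl[\,\overline{W}_{n} - \langle W\rangle \le -t\,\bigr]
      \;\le\; \exp\!\left(-\frac{2 n t^{2}}{(b-a)^{2}}\right)
    ```

    Observing \overline{W}_n \le -t then certifies ⟨W⟩ < 0, and hence entanglement, except with probability at most exp(-2nt²/(b-a)²).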

  8. Investigation of Reflectance Distribution and Trend for the Double Ray Located in the Northwest of Tycho Crater

    NASA Astrophysics Data System (ADS)

    Yi, Eung Seok; Kim, Kyeong Ja; Choi, Yi Re; Kim, Yong Ha; Lee, Sung Soon; Lee, Seung Ryeol

    2015-06-01

    Analysis of lunar samples returned by the US Apollo missions revealed that the lunar highlands consist of anorthosite, plagioclase, pyroxene, and olivine; the lunar maria are composed of materials such as basalt and ilmenite. More recently, the remote sensing approach has reduced the time required to investigate the entire lunar surface, compared to the approach of returning samples. Moreover, remote sensing has also made it possible to determine the existence of specific minerals and to examine wide areas. In this paper, an investigation of the reflectance distribution and its trend was performed, and the results were applied to the example of the double ray stretched in parallel lines from the Tycho crater to the third quadrant of Mare Nubium. Basic research and background information for the investigation of lunar surface characteristics are also presented. For this research, data from instruments aboard the SELenological and ENgineering Explorer (SELENE), a Japanese lunar probe, were used, in particular the Multiband Imager (MI) of the Lunar Imager / Spectrometer (LISM) suite. The data were processed and analyzed with Exelis Visual Information Solutions (ENVI), an image editing and analysis tool.

  9. The German Environmental Survey for Children (GerES IV): reference values and distributions for time-location patterns of German children.

    PubMed

    Conrad, André; Seiwert, Margarete; Hünken, Andreas; Quarcoo, David; Schlaud, Martin; Groneberg, David

    2013-01-01

    Children's time-location patterns are important determinants of environmental exposure and other health-relevant factors. Building on data of the German Environmental Survey for Children (GerES IV), our study aimed at deriving reference values and distributions for time-location patterns of 3-14-year-old German children. We also investigated if GerES IV data are appropriate for evaluating associations with children's health determinants by linking them to data of the National Health Interview and Examination Survey for Children and Adolescents (KiGGS). Parents reported on the time their children usually spend at home, in other indoor environments, and outdoors. This information was characterized by statistical parameters, which were also calculated for different strata concerning socio-demography and the residential environment. Consequently, group differences were evaluated by t-tests and univariate ANOVA. Reference distributions were fitted to the time-location data by a Maximum Likelihood approach to make them also useable in probabilistic exposure modeling. Finally, associations between data on the children's physical activity as well as body weight and their outdoor time were investigated by bivariate correlation analysis and cross tabulation. On daily average, German children spend 15 h and 31 min at home, 4 h and 46 min in other indoor environments, and 3 h and 43 min outdoors. Time spent at home and outdoors decreases with age while time spent in other indoor environments increases. Differences in time-location patterns were also observed for the socio-economic status (SES) and immigration status. E.g., children with a high SES spend 24 min less outdoors than low SES children. Immigrants spend on daily average 20 min more at home and 15 min less outdoors than non-immigrant children. Outdoor time was associated with parameters of the residential environment like the building development. Children living in 1- or 2-family houses spend more time outdoors than
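
    A minimal sketch of the kind of Maximum Likelihood fit mentioned above, with hypothetical outdoor times; the lognormal family is an assumption for illustration, not necessarily the one used in GerES IV:

    ```python
    import numpy as np
    from scipy import stats

    # Hypothetical daily outdoor times (hours) for one stratum of children.
    rng = np.random.default_rng(0)
    outdoor_h = rng.lognormal(mean=np.log(3.5), sigma=0.4, size=500)

    # Maximum Likelihood fit of a lognormal reference distribution
    # (floc=0 pins the location parameter, a common choice for durations).
    shape, loc, scale = stats.lognorm.fit(outdoor_h, floc=0)
    print(f"median ~ {scale:.2f} h, sigma ~ {shape:.2f}")

    # The fitted distribution can then be sampled in probabilistic
    # exposure modeling, as the abstract describes.
    ```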

  11. Approaches to wind resource verification

    NASA Technical Reports Server (NTRS)

    Barchet, W. R.

    1982-01-01

    Verification of the regional wind energy resource assessments produced by the Pacific Northwest Laboratory addresses the question: Is the magnitude of the resource given in the assessments truly representative of the area of interest? Approaches using qualitative indicators of wind speed (tree deformation, eolian features), old and new data of opportunity not at sites specifically chosen for their exposure to the wind, and data by design from locations specifically selected to be good wind sites are described. Data requirements and evaluation procedures for verifying the resource are discussed.

  12. Resource distribution in mental health services: changes in geographic location and use of personnel in Norwegian mental health services 1979-1994.

    PubMed

    Pedersen, Per Bernhard; Lilleeng, Solfrid

    2000-03-01

    BACKGROUND: During the last decades, a central aim of Norwegian health policy has been to achieve a more equal geographical distribution of services. Of special interest is the 1980 financial reform. Central government reimbursements for the treatment of in-patients were replaced by a block grant to each county, based on indicators of relative "need". AIMS OF THE STUDY: The aim of this paper is to assess whether the distribution of specialized mental health services did take the course suggested by the proponents of the reform (i.e. a more equal distribution), or the opposite (i.e. a more unequal distribution) as claimed by the opponents. METHODS: Man year per capita ratios were used as indicators for the distribution of mental health services by county. Ratios were estimated for "all personnel", and for MDs and psychologists separately. Man years were assigned to counties by location of services (i.e. in which county the services were produced), and by residence of users (i.e. in which county the services were consumed). Indicators of geographic variation were estimated using the standard deviation (STD) as a measure of absolute variation, and the coefficient of variation (CV) and the Gini index as indicators of relative variation. Indicators were estimated for 1979, 1984, 1989 and 1994, based on data for all specialized adult mental health services in the country. Changes in distributions over the period were tested, using Levene's test of homogeneity. RESULTS: Relative variations in the distribution of personnel by location of services were substantially reduced over the period, the CV being reduced by more than 50% for all groups. Variations in the personnel ratios by residence of users were smaller at the start of the period, and the reductions were also smaller. Still, relative variations were reduced by 20-35, 40 and 60% approximately for "all personnel", MDs and psychologists respectively. In spite of a major increase in the supply of MDs and psychologists
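
    A minimal sketch of the two relative-variation indicators named above, computed over hypothetical county personnel ratios (the Gini index follows the standard mean-absolute-difference definition):

    ```python
    import numpy as np

    # Hypothetical man-years per 10,000 population for a set of counties.
    ratios = np.array([4.1, 5.3, 2.8, 6.7, 3.9, 5.0, 2.2, 4.6])

    # Coefficient of variation: standard deviation relative to the mean.
    cv = ratios.std(ddof=1) / ratios.mean()

    # Gini index: mean absolute difference over all pairs, scaled by 2*mean.
    diffs = np.abs(ratios[:, None] - ratios[None, :])
    gini = diffs.mean() / (2 * ratios.mean())

    print(f"CV = {cv:.3f}, Gini = {gini:.3f}")
    ```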

  13. Verification and arms control

    SciTech Connect

    Potter, W.C.

    1985-01-01

    Recent years have witnessed an increased stress upon the verification of arms control agreements, both as a technical problem and as a political issue. As one contribution here points out, the middle ground has shrunk between those who are persuaded that the Soviets are ''cheating'' and those who are willing to take some verification risks for the sake of achieving arms control. One angle, according to a Lawrence Livermore physicist who served as a member of the delegation to the various test-ban treaty negotiations, is the limited effectiveness of on-site inspection as compared to other means of verification.

  14. Computation and Analysis of the Global Distribution of the Radioxenon Isotope 133Xe based on Emissions from Nuclear Power Plants and Radioisotope Production Facilities and its Relevance for the Verification of the Nuclear-Test-Ban Treaty

    NASA Astrophysics Data System (ADS)

    Wotawa, Gerhard; Becker, Andreas; Kalinowski, Martin; Saey, Paul; Tuma, Matthias; Zähringer, Matthias

    2010-05-01

    Monitoring of radioactive noble gases, in particular xenon isotopes, is a crucial element of the verification of the Comprehensive Nuclear-Test-Ban Treaty (CTBT). The capability of the noble gas network, which is currently under construction, to detect signals from a nuclear explosion critically depends on the background created by other sources. Therefore, the global distribution of these isotopes based on emissions and transport patterns needs to be understood. A significant xenon background exists in the reactor regions of North America, Europe and Asia. An emission inventory of the four relevant xenon isotopes has recently been created, which specifies source terms for each power plant. As the major emitters of xenon isotopes worldwide, a few medical radioisotope production facilities have been recently identified, in particular the facilities in Chalk River (Canada), Fleurus (Belgium), Pelindaba (South Africa) and Petten (Netherlands). Emissions from these sites are expected to exceed those of the other sources by orders of magnitude. In this study, emphasis is put on 133Xe, which is the most prevalent xenon isotope. First, based on the emissions known, the resulting 133Xe concentration levels at all noble gas stations of the final CTBT verification network were calculated and found to be consistent with observations. Second, it turned out that emissions from the radioisotope facilities can explain a number of observed peaks, meaning that atmospheric transport modelling is an important tool for the categorization of measurements. Third, it became evident that Nuclear Power Plant emissions are more difficult to treat in the models, since their temporal variation is high and not generally reported. Fourth, there are indications that the assumed annual emissions may be underestimated by factors of two to ten, while the general emission patterns seem to be well understood. Finally, it became evident that 133Xe sources mainly influence the sensitivity of the

  15. SU-E-J-58: Dosimetric Verification of Metal Artifact Effects: Comparison of Dose Distributions Affected by Patient Teeth and Implants

    SciTech Connect

    Lee, M; Kang, S; Lee, S; Suh, T; Lee, J; Park, J; Park, H; Lee, B

    2014-06-01

    Purpose: Implant-supported dentures seem particularly appropriate for the predicament of becoming edentulous, and cancer patients are no exception. As the number of people having dental implants has increased across different ages, critical dosimetric verification of metal artifact effects is required for more accurate head and neck radiation therapy. The purpose of this study is to verify the theoretical analysis of the metal (streak and dark) artifacts, and to evaluate the dosimetric effect caused by dental implants in CT images, using a humanoid phantom with patient teeth and implants inserted. Methods: The phantom comprises a cylinder shaped to simulate the anatomical structures of a human head and neck. By incorporating various clinical cases, the phantom was made to closely resemble a human. The developed phantom can be configured in two classes: (i) closed mouth and (ii) opened mouth. RapidArc plans of 4 cases were created in the Eclipse planning system. A total dose of 2000 cGy in 10 fractions was prescribed to the whole planning target volume (PTV) using 6 MV photon beams. The Acuros XB (AXB) advanced dose calculation algorithm, the Analytical Anisotropic Algorithm (AAA), and the progressive resolution optimizer were used in dose optimization and calculation. Results: In both the closed- and opened-mouth phantoms, because dark artifacts formed extensively around the metal implants, dose variation was relatively higher than that for streak artifacts. When the PTV was delineated on the dark regions or large streak artifact regions, a maximum dose error of 7.8% and an average difference of 3.2% were observed. The average minimum dose to the PTV predicted by AAA was about 5.6% higher, and OAR doses were also 5.2% higher, compared to AXB. Conclusion: The results of this study showed that AXB dose calculation involving high-density materials is more accurate than AAA calculation, and AXB was superior to AAA in dose predictions beyond the dark artifact/air cavity portion when compared against the measurements.

  16. Verification of RADTRAN

    SciTech Connect

    Kanipe, F.L.; Neuhauser, K.S.

    1995-12-31

    This document presents details of the verification process of the RADTRAN computer code which was established for the calculation of risk estimates for radioactive materials transportation by highway, rail, air, and waterborne modes.

  17. Voltage verification unit

    DOEpatents

    Martin, Edward J.

    2008-01-15

    A voltage verification unit and method for determining the absence of potentially dangerous potentials within a power supply enclosure without Mode 2 work is disclosed. With this device and method, a qualified worker follows a relatively simple protocol that involves a function test (hot, cold, hot) of the voltage verification unit before Lock Out/Tag Out and, once the Lock Out/Tag Out is completed, testing or "trying" by simply reading a display on the voltage verification unit, without exposure of the operator to the interior of the voltage supply enclosure. According to a preferred embodiment, the voltage verification unit includes test leads to allow diagnostics with other meters, without the necessity of accessing potentially dangerous bus bars or the like.

  18. The Assembly, Integration, and Verification (AIV) team

    NASA Astrophysics Data System (ADS)

    2009-06-01

    Assembly, Integration, and Verification (AIV) is the process by which the software and hardware deliveries from the distributed ALMA partners (North America, South America, Europe, and East Asia) are assembled and integrated into a working system, and the initial technical capabilities are tested to ensure that they will meet the observatory's exacting requirements for science.

  19. Fuel Retrieval System (FRS) Design Verification

    SciTech Connect

    YANOCHKO, R.M.

    2000-01-27

    This document was prepared as part of an independent review to explain design verification activities already completed, and to define the remaining design verification actions for the Fuel Retrieval System. The Fuel Retrieval Subproject was established as part of the Spent Nuclear Fuel Project (SNF Project) to retrieve and repackage the SNF located in the K Basins. The Fuel Retrieval System (FRS) construction work is complete in the KW Basin, and start-up testing is underway. Design modifications and construction planning are also underway for the KE Basin. An independent review of the design verification process as applied to the K Basin projects was initiated in support of preparation for the SNF Project operational readiness review (ORR).

  20. Influence of pH, layer charge location and crystal thickness distribution on U(VI) sorption onto heterogeneous dioctahedral smectite.

    PubMed

    Guimarães, Vanessa; Rodríguez-Castellón, Enrique; Algarra, Manuel; Rocha, Fernando; Bobos, Iuliu

    2016-11-01

    The UO2(2+) adsorption on smectite (samples BA1, PS2 and PS3) with a heterogeneous structure was investigated at pH 4 (I=0.02 M) and pH 6 (I=0.2 M) in batch experiments, with the aim of evaluating the influence of pH, layer charge location and crystal thickness distribution. The mean crystal thickness of the smectite crystallites used in the sorption experiments ranges from 4.8 nm (sample PS2) to 5.1 nm (sample PS3) and 7.4 nm (sample BA1). Smaller crystallites have a higher total surface area and sorption capacity, and octahedral charge location favors higher sorption capacity. The Freundlich, Langmuir and SIPS sorption isotherms were used to model the sorption experiments. The surface complexation and cation exchange reactions were modeled using the PHREEQC code to describe the UO2(2+) sorption on smectite. The amount of UO2(2+) adsorbed on the smectite samples decreased significantly at pH 6 and higher ionic strength, where the sorption mechanism was restricted to the edge sites of smectite. Two binding energy components at 380.8±0.3 and 382.2±0.3 eV, assigned to hydrated UO2(2+) adsorbed by cation exchange and by inner-sphere complexation on the external sites at pH 4, were identified after deconvolution of the U4f7/2 peak by X-ray photoelectron spectroscopy. In addition, two new binding energy components at 380.3±0.3 and 381.8±0.3 eV, assigned to AlOUO2(+) and SiOUO2(+) surface species, were observed at pH 6.
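
    For reference, the three isotherms named above have these standard forms (conventional parameter symbols, not values from the study), with q_e the sorbed amount and C_e the equilibrium concentration:

    ```latex
    \begin{aligned}
    \text{Freundlich:} \quad & q_{e} = K_{F}\, C_{e}^{1/n} \\
    \text{Langmuir:}   \quad & q_{e} = \frac{q_{\max} K_{L} C_{e}}{1 + K_{L} C_{e}} \\
    \text{SIPS:}       \quad & q_{e} = \frac{q_{\max} \left(K_{S} C_{e}\right)^{1/n}}{1 + \left(K_{S} C_{e}\right)^{1/n}}
    \end{aligned}
    ```

    The SIPS form reduces to Langmuir for n = 1 and approaches Freundlich at low C_e, which is why it is often used for heterogeneous sorbents such as these smectites.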

  1. ENVIRONMENTAL TECHNOLOGY VERIFICATION--FUELCELL ENERGY, INC.: DFC 300A MOLTEN CARBONATE FUEL CELL COMBINED HEAT AND POWER SYSTEM

    EPA Science Inventory

    The U.S. EPA operates the Environmental Technology Verification program to facilitate the deployment of innovative technologies through performance verification and information dissemination. A technology area of interest is distributed electrical power generation, particularly w...

  2. Nuclear disarmament verification

    SciTech Connect

    DeVolpi, A.

    1993-12-31

    Arms control treaties, unilateral actions, and cooperative activities -- reflecting the defusing of East-West tensions -- are causing nuclear weapons to be disarmed and dismantled worldwide. In order to provide for future reductions and to build confidence in the permanency of this disarmament, verification procedures and technologies would play an important role. This paper outlines arms-control objectives, treaty organization, and actions that could be undertaken. For the purposes of this Workshop on Verification, nuclear disarmament has been divided into five topical subareas: Converting nuclear-weapons production complexes, Eliminating and monitoring nuclear-weapons delivery systems, Disabling and destroying nuclear warheads, Demilitarizing or non-military utilization of special nuclear materials, and Inhibiting nuclear arms in non-nuclear-weapons states. This paper concludes with an overview of potential methods for verification.

  3. Explaining Verification Conditions

    NASA Technical Reports Server (NTRS)

    Denney, Ewen; Fischer, Bernd

    2006-01-01

    The Hoare approach to program verification relies on the construction and discharge of verification conditions (VCs) but offers no support to trace, analyze, and understand the VCs themselves. We describe a systematic extension of the Hoare rules by labels so that the calculus itself can be used to build up explanations of the VCs. The labels are maintained through the different processing steps and rendered as natural language explanations. The explanations can easily be customized and can capture different aspects of the VCs; here, we focus on their structure and purpose. The approach is fully declarative and the generated explanations are based only on an analysis of the labels rather than directly on the logical meaning of the underlying VCs or their proofs. Keywords: program verification, Hoare calculus, traceability.
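
    A standard textbook illustration (not an example from the paper) of the object being explained: pushing a postcondition backwards through an assignment leaves a first-order verification condition to be discharged:

    ```latex
    \{\, x \ge 0 \,\}\;\; x := x + 1 \;\;\{\, x > 0 \,\}
    \qquad\leadsto\qquad
    \text{VC:}\;\; \forall x.\; x \ge 0 \Rightarrow x + 1 > 0
    ```

    In the labeling approach described above, such an obligation would carry labels tracing each part back to the precondition, the assignment rule, and the postcondition that produced it.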

  4. Wind gust warning verification

    NASA Astrophysics Data System (ADS)

    Primo, Cristina

    2016-07-01

    Operational meteorological centres around the world increasingly include warnings as one of their regular forecast products. Warnings are issued to warn the public about extreme weather situations that might occur leading to damages and losses. In forecasting these extreme events, meteorological centres help their potential users in preventing the damage or losses they might suffer. However, verifying these warnings requires specific methods. This is due not only to the fact that they happen rarely, but also because a new temporal dimension is added when defining a warning, namely the time window of the forecasted event. This paper analyses the issues that might appear when dealing with warning verification. It also proposes some new verification approaches that can be applied to wind warnings. These new techniques are later applied to a real life example, the verification of wind gust warnings at the German Meteorological Centre ("Deutscher Wetterdienst"). Finally, the results obtained from the latter are discussed.
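
    A minimal sketch of standard categorical scores used in warning verification, computed from a 2x2 contingency table with hypothetical counts (the time-window matching of warnings to events, which the paper identifies as the hard part, is not shown):

    ```python
    # Standard categorical scores for warning verification.
    hits, misses, false_alarms = 42, 8, 15        # hypothetical counts

    pod = hits / (hits + misses)                  # probability of detection
    far = false_alarms / (hits + false_alarms)    # false alarm ratio
    csi = hits / (hits + misses + false_alarms)   # critical success index

    print(f"POD={pod:.2f}, FAR={far:.2f}, CSI={csi:.2f}")
    ```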

  5. Interactions of microbial biofilms with toxic trace metals; 2: Prediction and verification of an integrated computer model of lead (II) distribution in the presence of microbial activity

    SciTech Connect

    Hsieh, K.M.; Murgel, G.A.; Lion, L.W.; Shuler, M.L. )

    1994-06-20

    The interfacial interactions of a toxic trace metal, Pb, with a surface modified by a marine film-forming bacterium, Pseudomonas atlantica, were predicted by a structured biofilm model used in conjunction with a chemical speciation model. The validity of the integrated model was tested for batch and continuous operations. Dynamic responses of the biophase due to transient lead concentration increases were also simulated. The reasonable predictions achieved by the model demonstrate its utility in describing trace metal distributions in complex systems where the adsorption properties of inorganic surfaces are modified by adherent bacteria and bacterial production of extracellular polymers.

  6. Secure optical verification using dual phase-only correlation

    NASA Astrophysics Data System (ADS)

    Liu, Wei; Zhang, Yan; Xie, Zhenwei; Liu, Zhengjun; Liu, Shutian

    2015-02-01

    We introduce a security-enhanced optical verification system using dual phase-only correlation based on a novel correlation algorithm. By employing a nonlinear encoding, the inherent locks of the verification system are obtained as real-valued random distributions, and the identity keys assigned to authorized users are designed as pure phases. The verification process is implemented as a two-step correlation, so only authorized identity keys can output the discriminating auto-correlation and cross-correlation signals that satisfy the preset threshold values. Compared with traditional phase-only-correlation-based verification systems, a higher security level against counterfeiting and collisions is obtained, which is demonstrated by cryptanalysis using known attacks, such as the known-plaintext attack and the chosen-plaintext attack. Optical experiments as well as the necessary numerical simulations are carried out to support the proposed verification method.
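
    For reference, a minimal single-step phase-only correlation in NumPy (the scheme above adds nonlinear encoding and a second correlation stage, which are not reproduced here):

    ```python
    import numpy as np

    def phase_only_correlation(a: np.ndarray, b: np.ndarray) -> np.ndarray:
        """Classic phase-only correlation of two equal-size 2-D arrays."""
        cross = np.fft.fft2(a) * np.conj(np.fft.fft2(b))
        cross /= np.abs(cross) + 1e-12      # keep phase, discard magnitude
        return np.fft.ifft2(cross).real

    # Toy check: a key correlated with itself yields a sharp unit peak.
    rng = np.random.default_rng(1)
    key = rng.random((64, 64))
    peak = phase_only_correlation(key, key).max()
    print(f"auto-correlation peak: {peak:.3f}")   # ~1.0 for a matching key
    ```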

  7. Verification and validation benchmarks.

    SciTech Connect

    Oberkampf, William Louis; Trucano, Timothy Guy

    2007-02-01

    Verification and validation (V&V) are the primary means to assess the accuracy and reliability of computational simulations. V&V methods and procedures have fundamentally improved the credibility of simulations in several high-consequence fields, such as nuclear reactor safety, underground nuclear waste storage, and nuclear weapon safety. Although the terminology is not uniform across engineering disciplines, code verification deals with assessing the reliability of the software coding, and solution verification deals with assessing the numerical accuracy of the solution to a computational model. Validation addresses the physics modeling accuracy of a computational simulation by comparing the computational results with experimental data. Code verification benchmarks and validation benchmarks have been constructed for a number of years in every field of computational simulation. However, no comprehensive guidelines have been proposed for the construction and use of V&V benchmarks. For example, the field of nuclear reactor safety has not focused on code verification benchmarks, but it has placed great emphasis on developing validation benchmarks. Many of these validation benchmarks are closely related to the operations of actual reactors at near-safety-critical conditions, as opposed to being more fundamental-physics benchmarks. This paper presents recommendations for the effective design and use of code verification benchmarks based on manufactured solutions, classical analytical solutions, and highly accurate numerical solutions. In addition, this paper presents recommendations for the design and use of validation benchmarks, highlighting the careful design of building-block experiments, the estimation of experimental measurement uncertainty for both inputs and outputs to the code, validation metrics, and the role of model calibration in validation. It is argued that the understanding of predictive capability of a computational model is built on the level of
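
    A minimal sketch of the manufactured-solutions idea recommended above for code verification: choose an exact solution, derive its forcing term analytically, and confirm that the discrete solver converges to it at the expected order (a 1-D Poisson toy problem, standing in for a production code):

    ```python
    import numpy as np

    # MMS for -u'' = f on [0, 1] with u(0) = u(1) = 0.
    # Manufactured solution u(x) = sin(pi x) implies f(x) = pi^2 sin(pi x).
    def solve_poisson(n: int) -> float:
        """Second-order finite differences; returns max error vs manufactured u."""
        x = np.linspace(0.0, 1.0, n + 1)
        h = 1.0 / n
        f = np.pi**2 * np.sin(np.pi * x[1:-1])
        A = (np.diag(2.0 * np.ones(n - 1))
             - np.diag(np.ones(n - 2), 1)
             - np.diag(np.ones(n - 2), -1)) / h**2
        u = np.linalg.solve(A, f)
        return np.abs(u - np.sin(np.pi * x[1:-1])).max()

    # The error should fall ~4x per mesh doubling (second-order convergence).
    for n in (16, 32, 64):
        print(n, f"{solve_poisson(n):.2e}")
    ```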

  8. Turbulence Modeling Verification and Validation

    NASA Technical Reports Server (NTRS)

    Rumsey, Christopher L.

    2014-01-01

    steps in the process. Verification insures that the CFD code is solving the equations as intended (no errors in the implementation). This is typically done either through the method of manufactured solutions (MMS) or through careful step-by-step comparisons with other verified codes. After the verification step is concluded, validation is performed to document the ability of the turbulence model to represent different types of flow physics. Validation can involve a large number of test case comparisons with experiments, theory, or DNS. Organized workshops have proved to be valuable resources for the turbulence modeling community in its pursuit of turbulence modeling verification and validation. Workshop contributors using different CFD codes run the same cases, often according to strict guidelines, and compare results. Through these comparisons, it is often possible to (1) identify codes that have likely implementation errors, and (2) gain insight into the capabilities and shortcomings of different turbulence models to predict the flow physics associated with particular types of flows. These are valuable lessons because they help bring consistency to CFD codes by encouraging the correction of faulty programming and facilitating the adoption of better models. They also sometimes point to specific areas needed for improvement in the models. In this paper, several recent workshops are summarized primarily from the point of view of turbulence modeling verification and validation. Furthermore, the NASA Langley Turbulence Modeling Resource website is described. The purpose of this site is to provide a central location where RANS turbulence models are documented, and test cases, grids, and data are provided. The goal of this paper is to provide an abbreviated survey of turbulence modeling verification and validation efforts, summarize some of the outcomes, and give some ideas for future endeavors in this area.

  9. Requirement Assurance: A Verification Process

    NASA Technical Reports Server (NTRS)

    Alexander, Michael G.

    2011-01-01

    Requirement Assurance is an act of requirement verification which assures the stakeholder or customer that a product requirement has produced its "as realized product" and has been verified with conclusive evidence. Product requirement verification answers the question, "did the product meet the stated specification, performance, or design documentation?". In order to ensure the system was built correctly, the practicing system engineer must verify each product requirement using verification methods of inspection, analysis, demonstration, or test. The products of these methods are the "verification artifacts" or "closure artifacts" which are the objective evidence needed to prove the product requirements meet the verification success criteria. Institutional direction is given to the System Engineer in NPR 7123.1A NASA Systems Engineering Processes and Requirements with regards to the requirement verification process. In response, the verification methodology offered in this report meets both the institutional process and requirement verification best practices.

  10. Verification of statistical method CORN for modeling of microfuel in the case of high grain concentration

    SciTech Connect

    Chukbar, B. K.

    2015-12-15

    Two methods of modeling a double-heterogeneity fuel are studied: the deterministic positioning and the statistical method CORN of the MCU software package. The effect of distribution of microfuel in a pebble bed on the calculation results is studied. The results of verification of the statistical method CORN for the cases of the microfuel concentration up to 170 cm(-3) in a pebble bed are presented. The admissibility of homogenization of the microfuel coating with the graphite matrix is studied. The dependence of the reactivity on the relative location of fuel and graphite spheres in a pebble bed is found.

  11. Verification of statistical method CORN for modeling of microfuel in the case of high grain concentration

    NASA Astrophysics Data System (ADS)

    Chukbar, B. K.

    2015-12-01

    Two methods of modeling a double-heterogeneity fuel are studied: the deterministic positioning and the statistical method CORN of the MCU software package. The effect of distribution of microfuel in a pebble bed on the calculation results is studied. The results of verification of the statistical method CORN for the cases of the microfuel concentration up to 170 cm-3 in a pebble bed are presented. The admissibility of homogenization of the microfuel coating with the graphite matrix is studied. The dependence of the reactivity on the relative location of fuel and graphite spheres in a pebble bed is found.

  12. Alu and L1 sequence distributions in Xq24-q28 and their comparative utility in YAC contig assembly and verification

    SciTech Connect

    Porta, G.; Zucchi, I.; Schlessinger, D.; Hillier, L.; Green, P.; Nowotny, V.; D'Urso, M.

    1993-05-01

    The contents of Alu- and L1-containing TaqI restriction fragments were assessed by Southern blot analyses across YAC contigs already assembled by other means and localized within Xq24-q28. Fingerprinting patterns of YACs in contigs were concordant. Using software based on that of M. V. Olson et al. to analyze digitized data on fragment sizes, fingerprinting itself could establish matches among about 40% of a test group of 435 YACs. At 100-kb resolution, both repetitive elements were found throughout the region, with no apparent enrichment of Alu or L1 in DNA of G compared to that found in R bands. However, consistent with a random overall distribution, delimited regions of up to 100 kb contained clusters of repetitive elements. The local concentrations may help to account for the reported differential hybridization of Alu and L1 probes to segments of metaphase chromosomes. 40 refs., 6 figs., 2 tabs.

  13. NON-INVASIVE DETERMINATION OF THE LOCATION AND DISTRIBUTION OF FREE-PHASE DENSE NONAQUEOUS PHASE LIQUIDS (DNAPL) BY SEISMIC REFLECTION TECHNIQUES

    SciTech Connect

    Michael G. Waddell; William J. Domoracki; Tom J. Temples; Jerome Eyer

    2001-05-01

    The Earth Sciences and Resources Institute, University of South Carolina is conducting a 14-month proof-of-concept study to determine the location and distribution of subsurface Dense Nonaqueous Phase Liquid (DNAPL) carbon tetrachloride (CCl4) contamination at the 216-Z-9 crib, 200 West area, Department of Energy (DOE) Hanford Site, Washington, by use of two-dimensional high-resolution seismic reflection surveys and borehole geophysical data. The study makes use of recent advances in seismic reflection amplitude versus offset (AVO) technology to directly detect the presence of subsurface DNAPL. The techniques proposed are a noninvasive means of site characterization and direct free-phase DNAPL detection. This report covers the results of Task 3 and the change of scope of Tasks 4-6. Task 1 comprises site evaluation and seismic modeling studies. The site evaluation consists of identifying and collecting preexisting geological and geophysical information regarding subsurface structure and the presence and quantity of DNAPL. The seismic modeling studies were undertaken to determine the likelihood that an AVO response exists and its probable manifestation. Task 2 is the design and acquisition of 2-D seismic reflection data designed to image areas of probable high concentration of DNAPL. Task 3 is the processing and interpretation of the 2-D data. Tasks 4, 5, and 6 were the design, acquisition, processing, and interpretation of a three-dimensional (3D) seismic survey at the Z-9 crib area in the 200 West area, Hanford.

  14. Pyroclastic Eruptions in a Mars Climate Model: The Effects of Grain Size, Plume Height, Density, Geographical Location, and Season on Ash Distribution

    NASA Astrophysics Data System (ADS)

    Kerber, L. A.; Head, J. W.; Madeleine, J.; Wilson, L.; Forget, F.

    2010-12-01

    Pyroclastic volcanism has played a major role in the geologic history of the planet Mars. In addition to several highland patera features interpreted to be composed of pyroclastic material, there are a number of vast, fine-grained, friable deposits which may have a volcanic origin. The physical processes involved in the explosive eruption of magma, including the nucleation of bubbles, the fragmentation of magma, the incorporation of atmospheric gases, the formation of a buoyant plume, and the fall-out of individual pyroclasts has been modeled extensively for martian conditions [Wilson, L., J.W. Head (2007), Explosive volcanic eruptions on Mars: Tephra and accretionary lapilli formation, dispersal and recognition in the geologic record, J. Volcanol. Geotherm. Res. 163, 83-97]. We have further developed and expanded this original model in order to take into account differing temperature, pressure, and wind regimes found at different altitudes, at different geographic locations, and during different martian seasons. Using a well-established Mars global circulation model [LMD-GCM, Forget, F., F. Hourdin, R. Fournier, C. Hourdin, O. Talagrand (1999), Improved general circulation models of the martian atmosphere from the surface to above 80 km, J. Geophys. Res. 104, 24,155-24,176] we are able to link the volcanic eruption model of Wilson and Head (2007) to the spatially and temporally dynamic GCM temperature, pressure, and wind profiles to create three-dimensional maps of expected ash deposition on the surface. Here we present results exploring the effects of grain-size distribution, plume height, density of ash, latitude, season, and atmospheric pressure on the areal extent and shape of the resulting ash distribution. Our results show that grain-size distribution and plume height most strongly affect the distance traveled by the pyroclasts from the vent, while latitude and season can have a large effect on the direction in which the pyroclasts travel and the final shape

  15. ENVIRONMENTAL TECHNOLOGY VERIFICATION PROGRAM

    EPA Science Inventory

    This presentation will be given at the EPA Science Forum 2005 in Washington, DC. The Environmental Technology Verification Program (ETV) was initiated in 1995 to speed implementation of new and innovative commercial-ready environmental technologies by providing objective, 3rd pa...

  16. FPGA Verification Accelerator (FVAX)

    NASA Technical Reports Server (NTRS)

    Oh, Jane; Burke, Gary

    2008-01-01

    Is verification acceleration possible? Increasing the visibility of the internal nodes of the FPGA results in much faster debug time, and forcing internal signals directly allows a problem condition to be set up very quickly. Is this all? No: this is part of a comprehensive effort to improve the JPL FPGA design and V&V process.

  17. Is flow verification necessary

    SciTech Connect

    Beetle, T.M.

    1986-01-01

    Safeguards test statistics are used in an attempt to detect diversion of special nuclear material. Under assumptions concerning possible manipulation (falsification) of safeguards accounting data, the effects on the statistics due to diversion and data manipulation are described algebraically. A comprehensive set of statistics that is capable of detecting any diversion of material is defined in terms of the algebraic properties of the effects. When the assumptions exclude collusion between persons in two material balance areas, then three sets of accounting statistics are shown to be comprehensive. Two of the sets contain widely known accountancy statistics. One of them does not require physical flow verification - comparisons of operator and inspector data for receipts and shipments. The third set contains a single statistic which does not require physical flow verification. In addition to not requiring technically difficult and expensive flow verification, this single statistic has several advantages over other comprehensive sets of statistics. This algebraic approach as an alternative to flow verification for safeguards accountancy is discussed in this paper.

  18. Telescope performance verification

    NASA Astrophysics Data System (ADS)

    Swart, Gerhard P.; Buckley, David A. H.

    2004-09-01

    While Systems Engineering appears to be widely applied on the very large telescopes, it is lacking in the development of many of the medium and small telescopes currently in progress. The latter projects rely heavily on the experience of the project team, verbal requirements and conjecture based on the successes and failures of other telescopes. Furthermore, it is considered an unaffordable luxury to "close-the-loop" by carefully analysing and documenting the requirements and then verifying the telescope's compliance with them. In this paper the authors contend that a Systems Engineering approach is a keystone in the development of any telescope and that verification of the telescope's performance is not only an important management tool but also forms the basis upon which successful telescope operation can be built. The development of the Southern African Large Telescope (SALT) has followed such an approach and is now in the verification phase of its development. Parts of the SALT verification process will be discussed in some detail to illustrate the suitability of this approach, including oversight by the telescope shareholders, recording of requirements and results, design verification and performance testing. Initial test results will be presented where appropriate.

  19. Multibody modeling and verification

    NASA Technical Reports Server (NTRS)

    Wiens, Gloria J.

    1989-01-01

    A summary of a ten week project on flexible multibody modeling, verification and control is presented. Emphasis was on the need for experimental verification. A literature survey was conducted for gathering information on the existence of experimental work related to flexible multibody systems. The first portion of the assigned task encompassed the modeling aspects of flexible multibodies that can undergo large angular displacements. Research in the area of modeling aspects were also surveyed, with special attention given to the component mode approach. Resulting from this is a research plan on various modeling aspects to be investigated over the next year. The relationship between the large angular displacements, boundary conditions, mode selection, and system modes is of particular interest. The other portion of the assigned task was the generation of a test plan for experimental verification of analytical and/or computer analysis techniques used for flexible multibody systems. Based on current and expected frequency ranges of flexible multibody systems to be used in space applications, an initial test article was selected and designed. A preliminary TREETOPS computer analysis was run to ensure frequency content in the low frequency range, 0.1 to 50 Hz. The initial specifications of experimental measurement and instrumentation components were also generated. Resulting from this effort is the initial multi-phase plan for a Ground Test Facility of Flexible Multibody Systems for Modeling Verification and Control. The plan focuses on the Multibody Modeling and Verification (MMV) Laboratory. General requirements of the Unobtrusive Sensor and Effector (USE) and the Robot Enhancement (RE) laboratories were considered during the laboratory development.

  20. Improved Verification for Aerospace Systems

    NASA Technical Reports Server (NTRS)

    Powell, Mark A.

    2008-01-01

    Aerospace systems are subject to many stringent performance requirements to be verified with low risk. This report investigates verification planning using conditional approaches vice the standard classical statistical methods, and usage of historical surrogate data for requirement validation and in verification planning. The example used in this report to illustrate the results of these investigations is a proposed mission assurance requirement with the concomitant maximum acceptable verification risk for the NASA Constellation Program Orion Launch Abort System (LAS). This report demonstrates the following improvements: 1) verification planning using conditional approaches vice classical statistical methods results in plans that are more achievable and feasible; 2) historical surrogate data can be used to bound validation of performance requirements; and, 3) incorporation of historical surrogate data in verification planning using conditional approaches produces even less costly and more reasonable verification plans. The procedures presented in this report may produce similar improvements and cost savings in verification for any stringent performance requirement for an aerospace system.
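
    A minimal sketch of one conditional (Bayesian) alternative to a classical demonstration test, under assumptions added here for illustration (a Beta-Binomial model with a uniform prior; the report's actual formulation may differ):

    ```python
    from scipy import stats

    # Posterior for reliability after a hypothetical zero-failure campaign.
    n_trials, n_failures = 30, 0
    posterior = stats.beta(1 + n_trials - n_failures, 1 + n_failures)

    requirement = 0.90                 # hypothetical reliability requirement
    p_meets = 1.0 - posterior.cdf(requirement)
    print(f"P(reliability > {requirement}) = {p_meets:.3f}")
    ```

    Conditioning on the observed trials in this way yields a direct probability statement about the requirement, which is one reason such plans can be easier to size than classical confidence demonstrations.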

  1. Ada(R) Test and Verification System (ATVS)

    NASA Technical Reports Server (NTRS)

    Strelich, Tom

    1986-01-01

    The Ada Test and Verification System (ATVS) functional description and high level design are completed and summarized. The ATVS will provide a comprehensive set of test and verification capabilities specifically addressing the features of the Ada language, support for embedded system development, distributed environments, and advanced user interface capabilities. Its design emphasis was on effective software development environment integration and flexibility to ensure its long-term use in the Ada software development community.

  2. Assessment of total and organic vanadium levels and their bioaccumulation in edible sea cucumbers: tissues distribution, inter-species-specific, locational differences and seasonal variations.

    PubMed

    Liu, Yanjun; Zhou, Qingxin; Xu, Jie; Xue, Yong; Liu, Xiaofang; Wang, Jingfeng; Xue, Changhu

    2016-02-01

    The objective of this study is to investigate the levels, inter-species differences, locational differences and seasonal variations of vanadium in sea cucumbers, and to further validate several potential factors controlling the distribution of metals in sea cucumbers. Vanadium levels were evaluated in samples of edible sea cucumbers and were demonstrated to exhibit differences across seasons, species and sampling sites. High vanadium concentrations were measured in the sea cucumbers, and all of the vanadium detected was in an organic form. Mean vanadium concentrations were considerably higher in the blood (sea cucumber) than in the other studied tissues. The highest concentration of vanadium (2.56 μg g(-1)), as well as a higher proportion of organic vanadium (85.5 %), was observed in the Holothuria scabra samples compared with all other samples. Vanadium levels in Apostichopus japonicus from Bohai Bay and the Yellow Sea show marked seasonal variations. Average values of 1.09 μg g(-1) of total vanadium and 0.79 μg g(-1) of organic vanadium were obtained in various species of sea cucumbers. Significant positive correlations between vanadium in the seawater and V org in the sea cucumber (r = 81.67 %, p = 0.00), as well as between vanadium in the sediment and V org in the sea cucumber (r = 77.98 %, p = 0.00), were observed. Vanadium concentrations depend on the season (salinity, temperature), species, sampling site and seawater environment (seawater, sediment). Given the adverse toxicological effects of inorganic vanadium and its positive role in controlling the development of diabetes in humans, a regular monitoring programme of vanadium content in edible sea cucumbers can be recommended.

  3. TFE Verification Program

    SciTech Connect

    Not Available

    1990-03-01

    The objective of the semiannual progress report is to summarize the technical results obtained during the latest reporting period. The information presented herein includes evaluated test data, design evaluations, the results of analyses and the significance of the results. The program objective is to demonstrate the technology readiness of a TFE suitable for use as the basic element in a thermionic reactor with electric power output in the 0.5 to 5.0 MW(e) range, and a full-power life of 7 years. The TFE Verification Program builds directly on the technology and data base developed in the 1960s and 1970s in an AEC/NASA program, and in the SP-100 program conducted in 1983, 1984 and 1985. In the SP-100 program, the thermionic concept was judged attractive, but concern was expressed over the lack of fast reactor irradiation data. The TFE Verification Program addresses this concern. The general logic and strategy of the program to achieve its objectives is shown in Fig. 1-1. Five prior programs form the basis for the TFE Verification Program: (1) the AEC/NASA program of the 1960s and early 1970s; (2) the SP-100 concept development program; (3) the SP-100 thermionic technology program; (4) the thermionic irradiations program in TRIGA in FY-86; and (5) the Thermionic Technology Program in 1986 and 1987. 18 refs., 64 figs., 43 tabs.

  4. Visual inspection for CTBT verification

    SciTech Connect

    Hawkins, W.; Wohletz, K.

    1997-03-01

    On-site visual inspection will play an essential role in future Comprehensive Test Ban Treaty (CTBT) verification. Although seismic and remote sensing techniques are the best understood and most developed methods for detection of evasive testing of nuclear weapons, visual inspection can greatly augment the certainty and detail of understanding provided by these more traditional methods. Not only can visual inspection offer "ground truth" in cases of suspected nuclear testing, but it also can provide accurate source location and testing media properties necessary for detailed analysis of seismic records. For testing in violation of the CTBT, an offending party may attempt to conceal the test, which most likely will be achieved by underground burial. While such concealment may not prevent seismic detection, evidence of test deployment, location, and yield can be disguised. In this light, if a suspicious event is detected by seismic or other remote methods, visual inspection of the event area is necessary to document any evidence that might support a claim of nuclear testing and provide data needed to further interpret seismic records and guide further investigations. However, the methods for visual inspection are not widely known nor appreciated, and experience is presently limited. Visual inspection can be achieved by simple, non-intrusive means, primarily geological in nature, and it is the purpose of this report to describe the considerations, procedures, and equipment required to field such an inspection.

  5. Calibration or verification? A balanced approach for science.

    USGS Publications Warehouse

    Myers, C.T.; Kennedy, D.M.

    1997-01-01

    The calibration of balances is routinely performed both in the laboratory and the field. This process is required to accurately determine the weight of an object or chemical. The frequency of calibration and verification of balances is mandated by their use and location. Tolerance limits for balances could not be located in any standard procedure manuals. A survey was conducted to address the issues of calibration and verification frequency and to discuss the significance of defining tolerance limits for balances. Finally, for the benefit of laboratories unfamiliar with such procedures, we provide a working model based on our laboratory, the Upper Mississippi Science Center (UMSC), in La Crosse, Wisconsin.

  6. Continuous verification using multimodal biometrics.

    PubMed

    Sim, Terence; Zhang, Sheng; Janakiraman, Rajkumar; Kumar, Sandeep

    2007-04-01

    Conventional verification systems, such as those controlling access to a secure room, do not usually require the user to reauthenticate himself for continued access to the protected resource. This may not be sufficient for high-security environments in which the protected resource needs to be continuously monitored for unauthorized use. In such cases, continuous verification is needed. In this paper, we present the theory, architecture, implementation, and performance of a multimodal biometrics verification system that continuously verifies the presence of a logged-in user. Two modalities are currently used--face and fingerprint--but our theory can be readily extended to include more modalities. We show that continuous verification imposes additional requirements on multimodal fusion when compared to conventional verification systems. We also argue that the usual performance metrics of false accept and false reject rates are insufficient yardsticks for continuous verification and propose new metrics against which we benchmark our system. PMID:17299225

  7. Verification of ICESat-2/ATLAS Science Receiver Algorithm Onboard Databases

    NASA Astrophysics Data System (ADS)

    Carabajal, C. C.; Saba, J. L.; Leigh, H. W.; Magruder, L. A.; Urban, T. J.; Mcgarry, J.; Schutz, B. E.

    2013-12-01

    NASA's ICESat-2 mission will fly the Advanced Topographic Laser Altimetry System (ATLAS) instrument on a 3-year mission scheduled to launch in 2016. ATLAS is a single-photon detection system transmitting at 532 nm with a laser repetition rate of 10 kHz and a six-spot pattern on the Earth's surface. A set of onboard Receiver Algorithms will perform signal processing to reduce the data rate and data volume to acceptable levels. These algorithms distinguish surface echoes from the background noise, limit the daily data volume, and allow the instrument to telemeter only a small vertical region about the signal. For this purpose, three onboard databases are used: a Surface Reference Map (SRM), a Digital Elevation Model (DEM), and Digital Relief Maps (DRMs). The DEM provides minimum and maximum heights that limit the signal search region of the onboard algorithms, including a margin for errors in the source databases and onboard geolocation. Since the surface echoes will be correlated while noise will be randomly distributed, the signal location is found by histogramming the received event times and identifying the histogram bins with statistically significant counts. Once the signal location has been established, the DRMs will be used to determine the vertical width of the telemetry band about the signal. The University of Texas Center for Space Research (UT-CSR) is developing the ICESat-2 onboard databases, which are currently being tested using preliminary versions and equivalent representations of elevation ranges and relief more recently developed at Goddard Space Flight Center (GSFC). Global and regional elevation models have been assessed in terms of their accuracy using ICESat geodetic control, and have been used to develop equivalent representations of the onboard databases for testing against the UT-CSR databases, with special emphasis on the ice sheet regions. A series of verification checks have been implemented, including
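
    To make the histogramming step concrete, the following is a minimal sketch (not the flight algorithm; the function name, bin width, and Poisson-style significance threshold are illustrative assumptions):

```python
import numpy as np

def find_signal_band(event_times, bin_width, z_thresh=5.0):
    """Histogram photon event times and flag bins whose counts are
    statistically significant above a roughly uniform noise background."""
    edges = np.arange(event_times.min(), event_times.max() + bin_width, bin_width)
    counts, edges = np.histogram(event_times, bins=edges)
    bg = np.median(counts)                        # background estimate (assumption)
    sig = counts > bg + z_thresh * np.sqrt(max(bg, 1.0))
    if not sig.any():
        return None                               # no surface echo found
    # The telemetry band spans the extreme significant bins.
    return edges[:-1][sig].min(), edges[1:][sig].max()
```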

  8. Using color for face verification

    NASA Astrophysics Data System (ADS)

    Leszczynski, Mariusz

    2009-06-01

    This paper presents research on the importance of color information in a face verification system. The four most popular color spaces were used: RGB, YIQ, YCbCr, and luminance, compared using four types of discriminant classifiers. Experiments conducted on facial databases with complex backgrounds, different poses, and varying lighting conditions show that color information can improve verification accuracy compared to the traditionally used luminance information. To achieve the best performance we recommend multi-frame verification encoded in the YIQ color space.
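
    The YIQ space recommended above is a fixed linear transform of RGB (the standard NTSC matrix); a minimal sketch, assuming NumPy and images scaled to [0, 1]:

```python
import numpy as np

# Standard NTSC RGB -> YIQ matrix: Y carries luminance, I and Q chrominance.
RGB2YIQ = np.array([[0.299,  0.587,  0.114],
                    [0.596, -0.274, -0.322],
                    [0.211, -0.523,  0.312]])

def rgb_to_yiq(img):
    """Convert an (H, W, 3) RGB array to YIQ, pixel by pixel."""
    return img @ RGB2YIQ.T
```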

  9. Quantum money with classical verification

    SciTech Connect

    Gavinsky, Dmitry

    2014-12-04

    We propose and construct a quantum money scheme that allows verification through classical communication with a bank. This is the first demonstration that a secure quantum money scheme exists that does not require quantum communication for coin verification. Our scheme is secure against adaptive adversaries; this property is not directly related to the possibility of classical verification, but none of the earlier quantum money constructions is known to possess it.

  10. Manta rays in the Marquesas Islands: first records of Manta birostris in French Polynesia and most easterly location of Manta alfredi in the Pacific Ocean, with notes on their distribution.

    PubMed

    Mourier, J

    2012-11-01

    Based on direct observations of free-ranging specimens, the giant manta ray Manta birostris is reported from the Marquesas Islands, the first sighting in French Polynesia. Sightings of its sister species, the reef manta ray Manta alfredi, are also reported at the most easterly location in the Pacific Ocean. Preliminary individual identification as well as notes on their distribution are also reported.

  11. Requirements of Operational Verification of the NWSRFS-ESP Forecasts

    NASA Astrophysics Data System (ADS)

    Imam, B.; Werner, K.; Hartmann, H.; Sorooshian, S.; Pritchard, E.

    2006-12-01

    This work addresses operational verification of ensemble forecasts produced with the National Weather Service River Forecast System (NWSRFS). We focus on short-range (1-15 day) ensemble forecasts and investigate the utility of both simple "single forecast" graphical approaches and analytical, distribution-based measures and their associated diagrams. The presentation also addresses the role of both observations and the historical simulations used to initialize hindcasts (retrospective forecasts) for diagnostic verification studies in operational procedures.

  12. Automating engineering verification in ALMA subsystems

    NASA Astrophysics Data System (ADS)

    Ortiz, José; Castillo, Jorge

    2014-08-01

    The Atacama Large Millimeter/submillimeter Array is an interferometer comprising 66 individual high-precision antennas located at over 5000 meters altitude in the north of Chile. Several complex electronic subsystems need to be meticulously tested at different stages of an antenna commissioning, both independently and when integrated together. The first subsystem integration takes place at the Operations Support Facilities (OSF), at an altitude of 3000 meters. A second integration occurs at the high-altitude Array Operations Site (AOS), where combined performance with the Central Local Oscillator (CLO) and the Correlator is also assessed. In addition, several other events require complete or partial verification of compliance with instrument specifications, such as parts replacement, calibration, relocation within the AOS, preventive maintenance, and troubleshooting due to poor performance in scientific observations. Restricted engineering time allocation and the constant pressure to minimize downtime in a 24/7 astronomical observatory impose the need to complete (and report) the aforementioned verifications in the least possible time. Array-wide disturbances, such as global power interruptions and the subsequent recovery, add the challenge of executing this checkout on multiple antenna elements at once. This paper presents the outcome of automating the setup, execution, notification, and reporting of engineering verification in ALMA, and how these efforts have resulted in a dramatic reduction of both the time and the operator training required. Signal Path Connectivity (SPC) checkout is introduced as a notable case of such automation.

  13. Software Verification and Validation Procedure

    SciTech Connect

    Olund, Thomas S.

    2008-09-15

    This Software Verification and Validation procedure provides the action steps for the Tank Waste Information Network System (TWINS) testing process. The primary objective of the testing process is to provide assurance that the software functions as intended, and meets the requirements specified by the client. Verification and validation establish the primary basis for TWINS software product acceptance.

  14. Deductive Verification of Cryptographic Software

    NASA Technical Reports Server (NTRS)

    Almeida, Jose Barcelar; Barbosa, Manuel; Pinto, Jorge Sousa; Vieira, Barbara

    2009-01-01

    We report on the application of an off-the-shelf verification platform to the RC4 stream cipher cryptographic software implementation (as available in the openSSL library), and introduce a deductive verification technique based on self-composition for proving the absence of error propagation.

  15. HDL to verification logic translator

    NASA Technical Reports Server (NTRS)

    Gambles, J. W.; Windley, P. J.

    1992-01-01

    The ever-increasing number of transistors in VLSI circuits compounds the difficulty of ensuring correct designs. As the number of test cases required to exhaustively simulate a circuit design explodes, a better method is required to confirm the absence of design faults. Formal verification methods provide a way to prove, using logic, that a circuit structure correctly implements its specification. Before verification is accepted by VLSI design engineers, the stand-alone verification tools in use in the research community must be integrated with the CAD tools used by designers. One problem facing the acceptance of formal verification into circuit design methodology is that the structural circuit descriptions used by designers are not appropriate for verification work, and those required for verification lack some of the features needed for design. We offer a solution to this dilemma: an automatic translation from the designers' HDL models into definitions for the higher-order logic (HOL) verification system. The translated definitions become the low-level basis of circuit verification, which in turn increases the designer's confidence in the correctness of higher-level behavioral models.

  16. In Vivo Proton Beam Range Verification Using Spine MRI Changes

    SciTech Connect

    Gensheimer, Michael F.; Yock, Torunn I.; Liebsch, Norbert J.; Sharp, Gregory C.; Paganetti, Harald; Madan, Neel; Grant, P. Ellen; Bortfeld, Thomas

    2010-09-01

    Purpose: In proton therapy, uncertainty in the location of the distal dose edge can lead to cautious treatment plans that reduce the dosimetric advantage of protons. After radiation exposure, vertebral bone marrow undergoes fatty replacement that is visible on magnetic resonance imaging (MRI). This presents an exciting opportunity to observe radiation dose distribution in vivo. We used quantitative spine MRI changes to precisely detect the distal dose edge in proton radiation patients. Methods and Materials: We registered follow-up T1-weighted MRI images to planning computed tomography scans from 10 patients who received proton spine irradiation. A radiation dose-MRI signal intensity curve was created using the lateral beam penumbra in the sacrum. This curve was then used to measure range errors in the lumbar spine. Results: In the lateral penumbra, there was an increase in signal intensity with higher dose throughout the full range of 0-37.5 Gy (RBE). In the distal fall-off region, the beam sometimes appeared to penetrate farther than planned. The mean overshoot in 10 patients was 1.9 mm (95% confidence interval, 0.8-3.1 mm), on the order of the uncertainties inherent to our range verification method. Conclusions: We have demonstrated in vivo proton range verification using posttreatment spine MRI changes. Our analysis suggests the presence of a systematic overshoot of a few millimeters in some proton spine treatments, but the range error does not exceed the uncertainty incorporated into the treatment planning margin. It may be possible to extend our technique to MRI sequences that show early bone marrow changes, enabling adaptive treatment modification.
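
    The core of the method above is a calibration curve relating dose to MRI signal change, built in the lateral penumbra and then inverted along the beam axis to find the distal edge. A minimal sketch of that inversion (assuming NumPy; the function, the monotonic calibration arrays, and the choice of edge-dose level are illustrative assumptions, not the authors' code):

```python
import numpy as np

def distal_edge_depth(depths, signal, cal_signal, cal_dose, edge_dose):
    """Map an MRI signal profile along the beam axis to dose via a
    monotonic calibration curve (np.interp requires cal_signal ascending),
    then return the first depth where dose falls below the edge level."""
    dose = np.interp(signal, cal_signal, cal_dose)
    below = np.nonzero(dose < edge_dose)[0]
    return float(depths[below[0]]) if below.size else None

# Overshoot estimate: distal_edge_depth(...) minus the planned distal-edge depth.
```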

  17. Cleanup Verification Package for the 118-F-6 Burial Ground

    SciTech Connect

    H. M. Sulloway

    2008-10-02

    This cleanup verification package documents completion of remedial action for the 118-F-6 Burial Ground located in the 100-FR-2 Operable Unit of the 100-F Area on the Hanford Site. The trenches received waste from the 100-F Experimental Animal Farm, including animal manure, animal carcasses, laboratory waste, plastic, cardboard, metal, and concrete debris as well as a railroad tank car.

  18. Cancelable face verification using optical encryption and authentication.

    PubMed

    Taheri, Motahareh; Mozaffari, Saeed; Keshavarzi, Parviz

    2015-10-01

    In a cancelable biometric system, each instance of enrollment is distorted by a transform function, and the output should not be retransformable into the original data. This paper presents a new cancelable face verification system in the encrypted domain. Encrypted facial images are generated by a double random phase encoding (DRPE) algorithm using two keys (RPM1 and RPM2). To make the system noninvertible, a photon counting (PC) method is utilized, which requires a photon distribution mask (PDM) for information reduction. Verification of sparse images that are not recognizable by direct visual inspection is performed by an unconstrained minimum average correlation energy filter. In the proposed method, the encryption keys (RPM1, RPM2, and PDM) are used on the sender side, and the receiver needs only encrypted images and correlation filters. In this manner, the system preserves privacy even if the correlation filters are obtained by an adversary. Performance of the PC-DRPE verification system is evaluated under illumination variation, pose changes, and facial expression. Experimental results show that utilizing encrypted images not only addresses security concerns but also enhances verification performance. This improvement can be attributed to the fact that, in the proposed system, the face verification problem is converted into a key verification task. PMID:26479930
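
    The DRPE step itself is compact: one random phase mask applied in the spatial domain and a second in the Fourier domain. A minimal sketch (assuming NumPy; the photon-counting reduction is omitted, and seeds stand in for the keys RPM1/RPM2):

```python
import numpy as np

def drpe_encrypt(img, seed1, seed2):
    """Double random phase encoding: multiply by a random phase mask in the
    spatial domain, transform, multiply by a second mask in the Fourier
    domain, and inverse-transform. The result is complex and noise-like."""
    rng1, rng2 = np.random.default_rng(seed1), np.random.default_rng(seed2)
    rpm1 = np.exp(2j * np.pi * rng1.random(img.shape))   # key 1 (RPM1)
    rpm2 = np.exp(2j * np.pi * rng2.random(img.shape))   # key 2 (RPM2)
    return np.fft.ifft2(np.fft.fft2(img * rpm1) * rpm2)
```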

  19. Trajectory Based Behavior Analysis for User Verification

    NASA Astrophysics Data System (ADS)

    Pao, Hsing-Kuo; Lin, Hong-Yi; Chen, Kuan-Ta; Fadlil, Junaidillah

    Many of our activities on computers require a verification step for authorized access. The goal of verification is to distinguish the true account owner from intruders. We propose a general approach to user verification based on the user's trajectory inputs. The approach is labor-free for users and is likely to resist copying or simulation by unauthorized users or even automatic programs such as bots. Our study focuses on finding the hidden patterns embedded in the trajectories produced by account users. We employ a Markov chain model with Gaussian transitions to describe the behavior in a trajectory. To distinguish between two trajectories, we propose a novel dissimilarity measure combined with manifold-learned tuning to capture the pairwise relationship. Based on the pairwise relationship, we can plug in any effective classification or clustering method for the detection of unauthorized access. The method can also be applied to the task of recognition, predicting the trajectory type without a pre-defined identity. Given a trajectory input, the results show that the proposed method can accurately verify the user identity, or suggest who owns the trajectory if the input identity is not provided.
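
    As a rough illustration of a Markov model with Gaussian transitions over trajectories (not the paper's exact formulation; the step-vector parameterization and names are assumptions):

```python
import numpy as np

def fit_transitions(traj):
    """Fit a Gaussian to the step vectors of an (N, 2) trajectory,
    i.e., a first-order Markov chain with Gaussian transitions."""
    steps = np.diff(traj, axis=0)
    mu = steps.mean(axis=0)
    cov = np.cov(steps.T) + 1e-6 * np.eye(2)   # regularize for stability
    return mu, cov

def avg_log_likelihood(traj, mu, cov):
    """Average per-step log-likelihood; low values flag trajectories
    unlikely to have been produced by the modeled user."""
    d = np.diff(traj, axis=0) - mu
    quad = np.einsum('ij,jk,ik->i', d, np.linalg.inv(cov), d)
    return float(np.mean(-0.5 * (quad + np.log(np.linalg.det(cov))
                                 + 2.0 * np.log(2.0 * np.pi))))
```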

  20. Introduction of a die-to-database verification tool for the entire printed geometry of a die: geometry verification system NGR2100 for DFM

    NASA Astrophysics Data System (ADS)

    Kitamura, Tadashi; Kubota, Kazufumi; Hasebe, Toshiaki; Sakai, Futoshi; Nakazawa, Shinichi; Vohra, Neeti; Yamamoto, Masahiro; Inoue, Masahiro

    2005-05-01

    The Geometry Verification System NGR2100 enables verification of an entire die, on a resist or an after-etch wafer, by comparing images of the die with the corresponding target CAD data. The system detects systematic defects through variable criteria settings for allowable deformation quantities and obtains a CD distribution diagram. The systematic-defect results can then be used for root-cause analysis. The CD distribution diagram supports stepper aberration analysis, process window extraction, macro-loading effect analysis, FEM measurement, and trend analysis more efficiently. Consequently, the Geometry Verification System NGR2100 will contribute to quicker turnaround time (TAT) for DFM in design, lithography, and mask production.

  2. Online fingerprint verification.

    PubMed

    Upendra, K; Singh, S; Kumar, V; Verma, H K

    2007-01-01

    As organizations search for more secure authentication methods for user access, e-commerce, and other security applications, biometrics is gaining increasing attention. With an increasing emphasis on emerging automatic personal identification applications, fingerprint-based identification is becoming more popular. The most widely used fingerprint representation is the minutiae-based representation. The main drawback of this representation is that it does not utilize a significant component of the rich discriminatory information available in fingerprints. Local ridge structures cannot be completely characterized by minutiae. Also, it is difficult to quickly match two fingerprint images containing different numbers of unregistered minutiae points. In this study a filter-bank-based representation, which eliminates these weaknesses, is implemented and the overall performance of the developed system is tested. The results have shown that this system can be used effectively for secure online verification applications. PMID:17365425

  3. Analyzing personalized policies for online biometric verification.

    PubMed

    Sadhwani, Apaar; Yang, Yan; Wein, Lawrence M

    2014-01-01

    Motivated by India's nationwide biometric program for social inclusion, we analyze verification (i.e., one-to-one matching) in the case where we possess similarity scores for 10 fingerprints and two irises between a resident's biometric images at enrollment and his biometric images during his first verification. At subsequent verifications, we allow individualized strategies based on these 12 scores: we acquire a subset of the 12 images, get new scores for this subset that quantify the similarity to the corresponding enrollment images, and use the likelihood ratio (i.e., the likelihood of observing these scores if the resident is genuine divided by the corresponding likelihood if the resident is an imposter) to decide whether a resident is genuine or an imposter. We also consider two-stage policies, where additional images are acquired in a second stage if the first-stage results are inconclusive. Using performance data from India's program, we develop a new probabilistic model for the joint distribution of the 12 similarity scores and find near-optimal individualized strategies that minimize the false reject rate (FRR) subject to constraints on the false accept rate (FAR) and mean verification delay for each resident. Our individualized policies achieve the same FRR as a policy that acquires (and optimally fuses) 12 biometrics for each resident, which represents a five (four, respectively) log reduction in FRR relative to fingerprint (iris, respectively) policies previously proposed for India's biometric program. The mean delay is [Formula: see text] sec for our proposed policy, compared to 30 sec for a policy that acquires one fingerprint and 107 sec for a policy that acquires all 12 biometrics. This policy acquires iris scans from 32-41% of residents (depending on the FAR) and acquires an average of 1.3 fingerprints per resident.
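
    The decision rule described above is a classical likelihood-ratio test over the acquired similarity scores. A minimal sketch (assuming SciPy, and, purely for illustration, independent Gaussian score models per modality rather than the paper's joint 12-score model):

```python
import numpy as np
from scipy.stats import norm

def likelihood_ratio(scores, genuine_params, imposter_params):
    """Product over modalities of P(score | genuine) / P(score | imposter),
    with each score distribution modeled as a Gaussian (mean, std)."""
    lr = 1.0
    for s, (mg, sg), (mi, si) in zip(scores, genuine_params, imposter_params):
        lr *= norm.pdf(s, mg, sg) / norm.pdf(s, mi, si)
    return lr

def accept(scores, genuine_params, imposter_params, threshold):
    """Accept the identity claim when the ratio clears a threshold tuned
    to the target false accept rate."""
    return likelihood_ratio(scores, genuine_params, imposter_params) >= threshold
```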

  4. Surfactants in the sea-surface microlayer and sub-surface water at estuarine locations: Their concentration, distribution, enrichment, and relation to physicochemical characteristics.

    PubMed

    Huang, Yun-Jie; Brimblecombe, Peter; Lee, Chon-Lin; Latif, Mohd Talib

    2015-08-15

    Samples of sea-surface microlayer (SML) and sub-surface water (SSW) were collected from two areas, Kaohsiung City (Taiwan) and the southwest coast of Peninsular Malaysia, to study the influence of the SML on enrichment and distribution and to compare the SML with the SSW. Anionic surfactants (MBAS) predominated in this study and were significantly higher in Kaohsiung than in Malaysia. Industrial areas in Kaohsiung, enriched with high loads of anthropogenic sources, accounted for the higher surfactant amounts and posed greater environmental risks than sites in Malaysia, where pollutants were associated with agricultural activities. The dissolved organic carbon (DOC), MBAS, and cationic surfactant (DBAS) concentrations in the SML correlated with those in the SSW, reflecting exchanges between the SML and SSW in Kaohsiung. The relationships between surfactants and the physicochemical parameters indicated that DOC and saltwater dilution might affect the distributions of MBAS and DBAS in Kaohsiung. In Malaysia, DOC might be an important factor controlling DBAS. PMID:26093815

  6. 40 CFR 1065.390 - PM balance verifications and weighing process verification.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... 40 Protection of Environment 34 2012-07-01 2012-07-01 false PM balance verifications and weighing... § 1065.390 PM balance verifications and weighing process verification. (a) Scope and frequency. This section describes three verifications. (1) Independent verification of PM balance performance within...

  7. 40 CFR 1065.390 - PM balance verifications and weighing process verification.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... 40 Protection of Environment 33 2014-07-01 2014-07-01 false PM balance verifications and weighing... § 1065.390 PM balance verifications and weighing process verification. (a) Scope and frequency. This section describes three verifications. (1) Independent verification of PM balance performance within...

  8. 40 CFR 1065.390 - PM balance verifications and weighing process verification.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... 40 Protection of Environment 34 2013-07-01 2013-07-01 false PM balance verifications and weighing... § 1065.390 PM balance verifications and weighing process verification. (a) Scope and frequency. This section describes three verifications. (1) Independent verification of PM balance performance within...

  9. 40 CFR 1065.390 - PM balance verifications and weighing process verification.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 40 Protection of Environment 32 2010-07-01 2010-07-01 false PM balance verifications and weighing... § 1065.390 PM balance verifications and weighing process verification. (a) Scope and frequency. This section describes three verifications. (1) Independent verification of PM balance performance within...

  10. 40 CFR 1065.390 - PM balance verifications and weighing process verification.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... 40 Protection of Environment 33 2011-07-01 2011-07-01 false PM balance verifications and weighing... § 1065.390 PM balance verifications and weighing process verification. (a) Scope and frequency. This section describes three verifications. (1) Independent verification of PM balance performance within...

  11. Distribution of polychlorinated biphenyls and organochlorine pesticides in human breast milk from various locations in Tunisia: Levels of contamination, influencing factors, and infant risk assessment

    SciTech Connect

    Ennaceur, S. Gandoura, N.; Driss, M.R.

    2008-09-15

    The concentrations of dichlorodiphenyltrichloroethane and its metabolites (DDTs), hexachlorobenzene (HCB), hexachlorocyclohexane isomers (HCHs), dieldrin, and 20 polychlorinated biphenyls (PCBs) were determined in 237 human breast milk samples collected from 12 locations in Tunisia. Gas chromatography with electron capture detection (GC-ECD) was used to identify and quantify residue levels of organochlorine compounds (OCs) on a lipid basis. The predominant OCs in human breast milk were PCBs, p,p'-DDE, p,p'-DDT, HCHs, and HCB. Concentrations of DDTs in human breast milk from rural areas were significantly higher than those from urban locations (p<0.05). With regard to PCBs, we observed a predominance of mid-chlorinated congeners due to the presence of PCBs with high Kow, such as PCB 153, 138, and 180. Positive correlations were found between concentrations of OCs in human breast milk and the age of the mothers and number of parities, suggesting the influence of such factors on OC burdens in lactating mothers. The comparison of the daily intakes of PCBs, DDTs, HCHs, and HCB by infants through human breast milk with the guidelines proposed by WHO and Health Canada shows that some individuals accumulated OCs in breast milk at levels close to or higher than these guidelines.
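
    The infant-intake comparison above rests on a standard estimated-daily-intake calculation; a minimal sketch (the function name and the example figures are illustrative assumptions, not values from the study):

```python
def estimated_daily_intake(c_lipid_ng_per_g, lipid_fraction, milk_g_per_day, bw_kg):
    """Estimated daily intake (ng per kg body weight per day) of an
    organochlorine from breast milk, given its lipid-normalized
    concentration, the milk lipid fraction, daily milk consumption,
    and infant body weight."""
    return c_lipid_ng_per_g * lipid_fraction * milk_g_per_day / bw_kg

# Example with assumed values: 500 ng/g lipid, 3.5% lipid,
# 700 g milk/day, 5 kg infant -> 2450 ng/kg bw/day.
print(estimated_daily_intake(500.0, 0.035, 700.0, 5.0))
```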

  12. Biometric verification with correlation filters.

    PubMed

    Vijaya Kumar, B V K; Savvides, Marios; Xie, Chunyan; Venkataramani, Krithika; Thornton, Jason; Mahalanobis, Abhijit

    2004-01-10

    Using biometrics for subject verification can significantly improve security over that of approaches based on passwords and personal identification numbers, both of which people tend to lose or forget. In biometric verification the system tries to match an input biometric (such as a fingerprint, face image, or iris image) to a stored biometric template. Correlation filter techniques are therefore attractive candidates for providing the matching precision needed in biometric verification. In particular, advanced correlation filters, such as synthetic discriminant function filters, can offer very good matching performance in the presence of variability in these biometric images (e.g., facial expressions, illumination changes, etc.). We investigate the performance of advanced correlation filters for face, fingerprint, and iris biometric verification. PMID:14735958
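
    Correlation-filter matching typically scores the correlation plane by its peak-to-sidelobe ratio (PSR); a minimal sketch (assuming NumPy; the exclusion-window size is an illustrative choice):

```python
import numpy as np

def correlate(image, template):
    """Frequency-domain cross-correlation of an image with a filter template."""
    F = np.fft.fft2(image)
    H = np.fft.fft2(template, s=image.shape)
    return np.abs(np.fft.ifft2(F * np.conj(H)))

def psr(plane, exclude=5):
    """Peak-to-sidelobe ratio: a sharp, isolated peak (high PSR) indicates
    an authentic match; impostors yield flat correlation planes."""
    r, c = np.unravel_index(plane.argmax(), plane.shape)
    mask = np.ones(plane.shape, dtype=bool)
    mask[max(0, r - exclude):r + exclude + 1,
         max(0, c - exclude):c + exclude + 1] = False
    sidelobe = plane[mask]
    return (plane[r, c] - sidelobe.mean()) / sidelobe.std()
```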

  13. Generic interpreters and microprocessor verification

    NASA Technical Reports Server (NTRS)

    Windley, Phillip J.

    1990-01-01

    The following topics are covered in viewgraph form: (1) generic interpreters; (2) Viper microprocessors; (3) microprocessor verification; (4) determining correctness; (5) hierarchical decomposition; (6) interpreter theory; (7) AVM-1; (8) phase-level specification; and future work.

  14. 28 CFR 74.6 - Location of eligible persons.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... 28 Judicial Administration 2 2013-07-01 2013-07-01 false Location of eligible persons. 74.6 Section 74.6 Judicial Administration DEPARTMENT OF JUSTICE (CONTINUED) CIVIL LIBERTIES ACT REDRESS PROVISION Verification of Eligibility § 74.6 Location of eligible persons. The Office shall compare...

  15. 28 CFR 74.6 - Location of eligible persons.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... 28 Judicial Administration 2 2011-07-01 2011-07-01 false Location of eligible persons. 74.6 Section 74.6 Judicial Administration DEPARTMENT OF JUSTICE (CONTINUED) CIVIL LIBERTIES ACT REDRESS PROVISION Verification of Eligibility § 74.6 Location of eligible persons. The Office shall compare...

  16. 28 CFR 74.6 - Location of eligible persons.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 28 Judicial Administration 2 2010-07-01 2010-07-01 false Location of eligible persons. 74.6 Section 74.6 Judicial Administration DEPARTMENT OF JUSTICE (CONTINUED) CIVIL LIBERTIES ACT REDRESS PROVISION Verification of Eligibility § 74.6 Location of eligible persons. The Office shall compare...

  17. Method and system for determining depth distribution of radiation-emitting material located in a source medium and radiation detector system for use therein

    DOEpatents

    Benke, Roland R.; Kearfott, Kimberlee J.; McGregor, Douglas S.

    2003-03-04

    A method, system and a radiation detector system for use therein are provided for determining the depth distribution of radiation-emitting material distributed in a source medium, such as a contaminated field, without the need to take samples, such as extensive soil samples, to determine the depth distribution. The system includes a portable detector assembly with an x-ray or gamma-ray detector having a detector axis for detecting the emitted radiation. The radiation may be naturally emitted by the material, such as by gamma-ray-emitting radionuclides, or emitted when the material is struck by other radiation. The assembly also includes a hollow collimator in which the detector is positioned. The collimator admits to the detector only the emitted radiation arriving as rays parallel to the detector axis. The collimator may be a hollow cylinder positioned so that its central axis is perpendicular to the upper surface of the large-area source when positioned thereon. The collimator allows the detector to angularly sample the emitted radiation over many ranges of polar angles. This is done by forming the collimator as a single adjustable collimator or as a set of collimator pieces with various possible configurations when connected together. In any one configuration, the collimator allows the detector to detect only the radiation emitted from a selected range of polar angles measured from the detector axis. Adjustment of the collimator or the detector therein enables the detector to detect radiation emitted from a different range of polar angles. The system further includes a signal processor for processing the signals from the detector, wherein signals obtained from different ranges of polar angles are processed together to obtain a reconstruction of the radiation-emitting material as a function of depth, assuming, but not limited to, a spatially-uniform depth distribution of the material within each layer. The detector system includes detectors having
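
    The depth reconstruction described in the last full sentence amounts to inverting a linear system relating per-layer activities to counts in each polar-angle range. A minimal sketch under that reading (assuming SciPy; the response matrix would come from detector/collimator modeling, which the abstract does not specify):

```python
import numpy as np
from scipy.optimize import nnls

def reconstruct_depth_profile(response_matrix, counts):
    """Solve counts ~= response_matrix @ activity for a nonnegative
    activity-per-layer profile, where each row of the response matrix
    gives one polar-angle range's sensitivity to each depth layer."""
    activity, residual = nnls(response_matrix, counts)
    return activity
```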

  18. Towards an in-situ measurement of wave velocity in buried plastic water distribution pipes for the purposes of leak location

    NASA Astrophysics Data System (ADS)

    Almeida, Fabrício C. L.; Brennan, Michael J.; Joseph, Phillip F.; Dray, Simon; Whitfield, Stuart; Paschoalini, Amarildo T.

    2015-12-01

    Water companies are under constant pressure to ensure that water leakage is kept to a minimum. Leak noise correlators are often used to help find and locate leaks. These devices correlate acoustic or vibration signals from sensors placed either side of the location of a suspected leak. The peak in the cross-correlation function of the measured signals gives the difference between the arrival times of the leak noise at the sensors. To convert the time delay into a distance, the speed at which the leak noise propagates along the pipe (the wave-speed) needs to be known. Often, this is estimated from historical wave-speed data measured on other pipes at various times and under various conditions, or from tables calculated using simple formulae. Usually, the wave-speed is not measured directly at the time of the correlation measurement and is therefore potentially a source of significant error in the localisation of the leak. In this paper, a new method of measuring the wave-speed in situ in the presence of a leak, which is robust and simple, is explored. Experiments were conducted on a bespoke large-scale buried-pipe test rig, in which a leak was also induced in the pipe between the measurement positions to simulate a condition likely to occur in practice. It is shown that even in conditions where the signal-to-noise ratio is very poor, the wave-speed estimate calculated using the new method is less than 5% different from the best estimate of 387 m/s.
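
    The correlator geometry reduces to a single formula: for sensors a distance D apart and wave-speed c, an arrival-time difference dt places the leak at d1 = (D - c*dt)/2 from sensor 1. A minimal sketch (assuming NumPy; the lag sign convention should be checked against the actual measurement setup):

```python
import numpy as np

def locate_leak(x1, x2, fs, spacing, wave_speed):
    """Estimate leak position from two equal-length vibration records
    bracketing the leak: the cross-correlation peak gives the arrival-time
    difference, which the pipe geometry converts to a distance."""
    n = len(x1)
    corr = np.correlate(x1 - x1.mean(), x2 - x2.mean(), mode='full')
    lag = corr.argmax() - (n - 1)              # peak lag in samples
    dt = -lag / fs                             # t2 - t1 under numpy's convention
    return (spacing - wave_speed * dt) / 2.0   # distance of leak from sensor 1
```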

  19. [Spatial distribution and temporal variation in composition of black fly species (Diptera: Simuliidae) in a small watershed located in the Northern of Paraná State, Brazil].

    PubMed

    Santos, Rachel B Dos; Lopes, José; Santos, Karen B Dos

    2010-01-01

    In this work, a survey of simuliid species and a study of their spatial distribution in four streams of a small watershed situated in Londrina, Paraná State, were carried out from January to October 2007. Changes in the species composition of the breeding sites were also checked over the sampling months. Seventeen black fly species were found, with Simulium botulibranchium Lutz, Simulium travassosi d'Andretta & d'Andretta, Simulium anamariae Vulcano, Simulium brachycladum Lutz & Pinto and Simulium metallicum s. l. Bellardi representing new records for Paraná State. The Canonical Correspondence Analysis showed that the environmental variables most correlated with the species distribution among sample sites were water conductivity and those linked to the physical dimensions of the breeding sites, such as width, depth and water velocity. The matrix of faunistic similarity among collecting dates was negatively correlated with the sampling time-interval matrix for three of the water bodies studied, showing the existence of temporal changes in species composition. According to Multiple Regression Analysis, the temporal abundance variation of Simulium perflavum Roubaud, Simulium inaequale Paterson & Shannon and Simulium lutzianum s. l. Pinto was not linked to air temperature, photoperiod or rainfall, suggesting the influence of other factors, probably those directly associated with specific breeding site conditions. The results indicate that differences in physical and chemical characteristics among water bodies may affect the taxonomic composition of simuliids in this watershed. PMID:20498969

  1. NON-INVASIVE DETERMINATION OF THE LOCATION AND DISTRIBUTION OF FREE-PHASE DENSE NONAQUEOUS PHASE LIQUIDS (DNAPL) BY SEISMIC REFLECTION TECHNIQUES

    SciTech Connect

    Michael G. Waddell; William J. Domoracki; Tom J. Temples

    2001-12-01

    This annual technical progress report covers part of Task 4 (site evaluation), Task 5 (2D seismic design, acquisition, and processing), and Task 6 (2D seismic reflection, interpretation, and AVO analysis) on DOE contract number DE-AR26-98FT40369. The project had planned one additional deployment to a site other than the Savannah River Site (SRS) or the DOE Hanford Site. After the SUBCON midyear review in Albuquerque, NM, it was decided that two additional deployments would be performed. The first deployment is to test the feasibility of using non-invasive seismic reflection and AVO analysis as a monitoring tool to assist in determining the effectiveness of Dynamic Underground Stripping (DUS) in the removal of DNAPL. The second deployment is to the Department of Defense (DOD) Charleston Naval Weapons Station Solid Waste Management Unit 12 (SWMU-12), Charleston, SC, to further test the technique's ability to detect high concentrations of DNAPL. The Charleston Naval Weapons Station SWMU-12 site was selected in consultation with National Energy Technology Laboratory (NETL) and DOD Naval Facilities Engineering Command Southern Division (NAVFAC) personnel. Based upon the review of existing data and the shallow target depth, the project team collected three Vertical Seismic Profiles (VSPs) and an experimental P-wave seismic reflection line. After preliminary analysis of the VSP data and the experimental reflection line data, it was decided to proceed with Task 5 and Task 6. Three high-resolution P-wave reflection profiles were collected with two objectives: (1) design the reflection survey to image a target depth of 20 feet below land surface to assist in determining the geologic controls on the DNAPL plume geometry, and (2) apply AVO analysis to the seismic data to locate the zone of high DNAPL concentration. Based upon the results of the data processing and interpretation of the seismic data, the project team was able to map the channel that is controlling the DNAPL plume

  2. Verification and Validation for Flight-Critical Systems (VVFCS)

    NASA Technical Reports Server (NTRS)

    Graves, Sharon S.; Jacobsen, Robert A.

    2010-01-01

    On March 31, 2009 a Request for Information (RFI) was issued by NASA's Aviation Safety Program to gather input on the subject of Verification and Validation (V&V) of Flight-Critical Systems. The responses were provided to NASA on or before April 24, 2009. The RFI asked for comments in three topic areas: Modeling and Validation of New Concepts for Vehicles and Operations; Verification of Complex Integrated and Distributed Systems; and Software Safety Assurance. There were a total of 34 responses to the RFI, representing a cross-section of academia (26%), small and large industry (47%), and government agencies (27%).

  3. Thoughts on Verification of Nuclear Disarmament

    SciTech Connect

    Dunlop, W H

    2007-09-26

    perfect and it was expected that occasionally there might be a verification measurement that was slightly above 150 kt. But the accuracy was much improved over the earlier seismic measurements. In fact, some of this improvement came about because, as part of this verification protocol, the US and Soviet Union provided the yields of several past tests to improve seismic calibrations. This provided a much-needed calibration for the seismic measurements. It was also accepted that, since nuclear tests were largely R&D-related, occasionally there might be a test slightly above 150 kt, as one could not always predict the yield with high accuracy in advance of the test. While one could hypothesize that the Soviets might test at some location other than their test sites, a test of even a small fraction of 150 kt would clearly be observed and would be a violation of the treaty. So the issue of clandestine tests of significance was easily covered for this particular treaty.

  4. Organics Verification Study for Sinclair and Dyes Inlets, Washington

    SciTech Connect

    Kohn, Nancy P.; Brandenberger, Jill M.; Niewolny, Laurie A.; Johnston, Robert K.

    2006-09-28

    Sinclair and Dyes Inlets near Bremerton, Washington, are on the State of Washington 1998 303(d) list of impaired waters because of fecal coliform contamination in marine water, metals in sediment and fish tissue, and organics in sediment and fish tissue. Because significant cleanup and source control activities have been conducted in the inlets since the data supporting the 1998 303(d) listings were collected, two verification studies were performed to address the 303(d) segments that were listed for metal and organic contaminants in marine sediment. The Metals Verification Study (MVS) was conducted in 2003; the final report, Metals Verification Study for Sinclair and Dyes Inlets, Washington, was published in March 2004 (Kohn et al. 2004). This report describes the Organics Verification Study that was conducted in 2005. The study approach was similar to the MVS in that many surface sediment samples were screened for the major classes of organic contaminants, and then the screening results and other available data were used to select a subset of samples for quantitative chemical analysis. Because the MVS was designed to obtain representative data on concentrations of contaminants in surface sediment throughout Sinclair Inlet, Dyes Inlet, Port Orchard Passage, and Rich Passage, aliquots of the 160 MVS sediment samples were used in the analysis for the Organics Verification Study. However, unlike metals screening methods, organics screening methods are not specific to individual organic compounds, and are not available for some target organics. Therefore, only the quantitative analytical results were used in the organics verification evaluation. The results of the Organics Verification Study showed that sediment quality outside of Sinclair Inlet is unlikely to be impaired because of organic contaminants. Similar to the results for metals, in Sinclair Inlet, the distribution of residual organic contaminants is generally limited to nearshore areas already within the

  5. Expose : procedure and results of the joint experiment verification tests

    NASA Astrophysics Data System (ADS)

    Panitz, C.; Rettberg, P.; Horneck, G.; Rabbow, E.; Baglioni, P.

    The International Space Station will carry the EXPOSE facility accommodated at the universal workplace URM-D, located outside the Russian Service Module. The launch will be effected in 2005, and the facility is planned to stay in space for 1.5 years. The tray-like structure will accommodate 2 chemical and 6 biological PI-experiments or experiment systems of the ROSE (Response of Organisms to Space Environment) consortium. EXPOSE will support long-term in situ studies of microbes in artificial meteorites, as well as of microbial communities from special ecological niches, such as endolithic and evaporitic ecosystems. The experiment pockets, either vented or sealed, will be covered by an optical filter system to control the intensity and spectral range of solar UV irradiation. Control of sun exposure will be achieved by the use of individual shutters. To test the compatibility of the different biological systems and their adaptation to the opportunities and constraints of space conditions, a thorough ground-support program has been developed. The procedure and first results of these joint Experiment Verification Tests (EVTs) will be presented. These tests are essential for the success of the EXPOSE mission and have been carried out in parallel with the development and construction of the final hardware design of the facility. The results of the mission will contribute to the understanding of organic chemistry processes in space, of biological adaptation strategies to extreme conditions, e.g., on the early Earth and Mars, and of the distribution of life beyond its planet of origin.

  6. Using multi-scale distribution and movement effects along a montane highway to identify optimal crossing locations for a large-bodied mammal community

    PubMed Central

    Römer, Heinrich; Germain, Ryan R.

    2013-01-01

    Roads are a major cause of habitat fragmentation that can negatively affect many mammal populations. Mitigation measures such as crossing structures are a proposed method to reduce the negative effects of roads on wildlife, but the best methods for determining where such structures should be implemented, and how their effects might differ between species in mammal communities, are largely unknown. We investigated the effects of a major highway through south-eastern British Columbia, Canada on several mammal species to determine how the highway may act as a barrier to animal movement, and how species may differ in their crossing-area preferences. We collected track data for eight mammal species across two winters, along both the highway and pre-marked transects, and used a multi-scale modeling approach to determine the scale at which habitat characteristics best predicted preferred crossing sites for each species. We found evidence for a severe barrier effect on all investigated species. Freely available remotely sensed habitat landscape data were better than more costly, manually digitized microhabitat maps in supporting models that identified preferred crossing sites; however, models using both types of data were better still. Further, in 6 of 8 cases models that incorporated multiple spatial scales were better at predicting preferred crossing sites than models utilizing any single scale. While each species differed in the landscape variables associated with preferred or avoided crossing sites, we used a multi-model inference approach to identify locations along the highway where crossing structures may benefit all of the species considered. By specifically incorporating both highway and off-highway data and predictions, we were able to show that landscape context plays an important role in maximizing the efficiency of mitigation measures. Our results further highlight the need for mitigation measures along major highways to improve connectivity between mammal

  7. Automated verification of system configuration

    NASA Astrophysics Data System (ADS)

    Andrews, W. H., Jr.; Baker, S. P.; Blalock, A. V.

    1991-05-01

    Errors in field wiring can result in significant correction costs (if the errors are discovered prior to use), in erroneous or unusable data (if the errors are not discovered in time), or in serious accidents (if the errors corrupt critical data). Detailed field wiring checkout and rework are tedious and expensive, but they are essential steps in the quality assurance process for large, complex instrumentation and control systems. A recent Oak Ridge National Laboratory (ORNL) development, the CONFiguration IDEntification System (CONFIDES), automates verification of field wiring. In CONFIDES, an identifier module is installed on or integrated into each component (e.g., sensor, actuator, cable, distribution panel) to be verified. Interrogator modules, controlled by a personal computer (PC), are installed at the connections of the field wiring to the inputs of the data acquisition and control system (DACS). Interrogator modules poll the components connected to each channel of the DACS and can determine the path taken by each channel's signal to or from the end device for that channel. The system will provide not only the identification (ID) codes for the cables and patch panels in the path to a particular sensor or actuator but individual cable conductor IDs as well. One version of the system uses existing signal wires for communications between CONFIDES modules. Another, more powerful version requires a single dedicated conductor in each cable. Both versions can operate with or without instrument power applied, and neither interferes with the normal operation of the DACS. Identifier modules can provide a variety of information including status and calibration data.

  8. Evaluation of 3D pre-treatment verification for volumetric modulated arc therapy plan in head region

    NASA Astrophysics Data System (ADS)

    Ruangchan, S.; Oonsiri, S.; Suriyapee, S.

    2016-03-01

    Developments in pre-treatment QA tools enable three-dimensional (3D) dose verification by combining calculation software with a measured planar dose distribution. This research aims to evaluate the Sun Nuclear 3DVH software against thermoluminescent dosimeter (TLD) measurements. Two VMAT patient plans (2.5 arcs) of 6 MV photons with different PTV locations were transferred to Rando phantom images. The PTV of the first plan was located in a homogeneous region; that of the second plan was not. For the treatment planning process, the Rando phantom images were employed in optimization and calculation, with the PTV, brain stem, lens, and TLD positions contoured. Verification plans were created, transferred to the ArcCHECK for measurement, and used to calculate the 3D dose with the 3DVH software. The percent dose differences in both the PTV and organs at risk (OAR) between TLD and the 3DVH software ranged from -2.09 to 3.87% for the first plan and from -1.39 to 6.88% for the second. The mean percent dose differences for the PTV were 1.62% and 3.93% for the first and second plans, respectively. In conclusion, the 3DVH software results show good agreement with TLD when the tumor is located in a homogeneous area.
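
    The comparison metric here is a simple point-dose percent difference taken relative to the TLD measurement; a minimal sketch (names are illustrative):

```python
import numpy as np

def percent_dose_diff(d_calc, d_meas):
    """Percent difference between calculated (3DVH) and measured (TLD)
    point doses, relative to the measurement."""
    d_calc = np.asarray(d_calc, dtype=float)
    d_meas = np.asarray(d_meas, dtype=float)
    return 100.0 * (d_calc - d_meas) / d_meas

# e.g., percent_dose_diff(2.05, 2.00) -> 2.5 (%)
```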

  9. Differences in distribution of esterase between cell fractions of rat liver homogenates prepared in various media. Relevance to the lysosomal location of the enzyme in the intact cell

    PubMed Central

    Barrow, Patience C.; Holt, S. J.

    1971-01-01

    The distribution of esterase in subcellular fractions of rat liver homogenates was compared with that of the lysosomal enzyme acid phosphatase and the microsomal enzyme glucose 6-phosphatase. Most of the esterase from sucrose homogenate sediments with glucose 6-phosphatase and about 8% is recovered in the supernatant. However, up to 53% of the esterase can be washed from microtome sections of unfixed liver, in which less cellular damage would be expected than that caused by homogenization. About 40% of both esterase and acid phosphatase are recovered in the soluble fraction after homogenization in aqueous glycerol or in a two-phase system (Arcton 113–0.25m-sucrose), although glucose 6-phosphatase is still recovered in the microsomal fraction of such homogenates. The esterase of the microsomal fraction prepared from a sucrose homogenate is much more readily released by treatment with 0.26% deoxycholate than are other constituents of this fraction. The release of esterase from the microsomal fraction by the detergent and its concomitant release with acid phosphatase after homogenization in glycerol or the two-phase system suggests that a greater proportion of esterase may be present in lysosomes of the intact cell than is indicated by the results of standard fractionation procedures. PMID:4335692

  10. NON-INVASIVE DETERMINATION OF THE LOCATION AND DISTRIBUTION OF FREE-PHASE DENSE NONAQUEOUS PHASE LIQUIDS (DNAPL) BY SEISMIC REFLECTION TECHNIQUES

    SciTech Connect

    Michael G. Waddell; William J. Domoracki; Tom J. Temples

    2001-05-01

    This semi-annual technical progress report covers Task 4 (site evaluation), Task 5 (seismic reflection design and acquisition), and Task 6 (seismic reflection processing and interpretation) on DOE contract number DE-AR26-98FT40369. The project had planned one additional deployment to a site other than the Savannah River Site (SRS) or DOE Hanford. During this reporting period the project had an ASME peer review. The findings and recommendations of the review panel, as well as the project team's responses to comments, are in Appendix A. After the SUBCON midyear review in Albuquerque, NM, and the peer review, it was decided that two additional deployments would be performed. The first deployment is to test the feasibility of using non-invasive seismic reflection and AVO analysis as a monitoring tool to assist in determining the effectiveness of Dynamic Underground Stripping (DUS) in the removal of DNAPL. Under the rescope of the project, Task 4 would be performed at the Charleston Navy Weapons Station, Charleston, SC, and not at the Dynamic Underground Stripping (DUS) project at SRS. The project team had already completed Task 4 at the M-area seepage basin, only a few hundred yards away from the DUS site. Because the geology is the same, repeating Task 4 was not necessary. However, a Vertical Seismic Profile (VSP) was conducted in one well to calibrate the geology to the seismic data. The first deployment to the DUS site (Tasks 5 and 6) has been completed. Once the steam has been turned off, these tasks will be performed again to compare the results with the pre-steam data. The results from the first deployment to the DUS site indicated a seismic amplitude anomaly at the location and depths of the known high concentrations of DNAPL. The deployment to another site with different geologic conditions was supposed to occur during this reporting period. The first site selected was DOE Paducah, Kentucky. After almost eight months of negotiation, site access was denied, requiring the selection of another site.

  11. Proceedings of the array signal processing symposium: Treaty Verification Program

    SciTech Connect

    Harris, D.B.

    1988-02-01

    A common theme underlying the research these groups conduct is the use of propagating waves to detect, locate, image, or otherwise identify features of the environment significant to their applications. The applications considered in this symposium are verification of nuclear test ban treaties, non-destructive evaluation (NDE) of manufactured components, and sonar and electromagnetic target acquisition and tracking. These proceedings cover just the first two topics. In these applications, arrays of sensors are used to detect propagating waves and to measure the characteristics that permit interpretation. The reason for using sensor arrays, which are inherently more expensive than single-sensor systems, is twofold. By combining the signals from multiple sensors, it is usually possible to suppress unwanted noise, which permits detection and analysis of weaker signals. Secondly, in complicated situations in which many waves are present, arrays make it possible to separate the waves and to measure their individual characteristics (direction, velocity, etc.). Other systems (such as three-component sensors in the seismic application) can perform these functions to some extent, but none are so effective and versatile as arrays. The objectives of test ban treaty verification are to detect, locate, and identify underground nuclear explosions, and to discriminate them from earthquakes and conventional chemical explosions. Two physical modes of treaty verification are considered: monitoring with arrays of seismic stations (solid earth propagation), and monitoring with arrays of acoustic (infrasound) stations (atmospheric propagation). The majority of the presentations in these proceedings address various aspects of the seismic verification problem.
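
    The noise-suppression argument is the basis of delay-and-sum beamforming: signals aligned for an assumed arrival direction add coherently while uncorrelated noise averages down (roughly a sqrt(M) SNR gain for M sensors). A minimal sketch (assuming NumPy; integer-sample steering for simplicity):

```python
import numpy as np

def delay_and_sum(signals, delays, fs):
    """Steer an (M, N) array of sensor records by the per-sensor delays
    (seconds) implied by an assumed arrival direction, then average."""
    out = np.zeros(signals.shape[1])
    for record, tau in zip(signals, delays):
        out += np.roll(record, -int(round(tau * fs)))  # align to a common origin
    return out / len(signals)
```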

  12. Hard and Soft Safety Verifications

    NASA Technical Reports Server (NTRS)

    Wetherholt, Jon; Anderson, Brenda

    2012-01-01

    The purpose of this paper is to examine the differences between, and the effects of, hard and soft safety verifications. Initially, the terminology should be defined and clarified. A hard safety verification is a datum that demonstrates how a safety control is enacted. An example of this is relief valve testing. A soft safety verification is something usually described as nice to have, but not necessary to prove safe operation. An example of a soft verification is the loss of the Solid Rocket Booster (SRB) casings from Shuttle flight STS-4. When the main parachutes failed, the casings impacted the water and sank. In the nose cap of the SRBs, video cameras recorded the release of the parachutes to determine safe operation and to provide information for potential anomaly resolution. Generally, examination of the casings and nozzles contributed to understanding of the newly developed boosters and their operation. Safety verification of SRB operation was demonstrated by examination of the casings and nozzle for erosion or wear. The loss of the SRBs and the associated data did not delay the launch of the next Shuttle flight.

  13. Face verification with balanced thresholds.

    PubMed

    Yan, Shuicheng; Xu, Dong; Tang, Xiaoou

    2007-01-01

    The process of face verification is guided by a pre-learned global threshold, which, however, is often inconsistent with class-specific optimal thresholds. It is, hence, beneficial to pursue a balance of the class-specific thresholds in the model-learning stage. In this paper, we present a new dimensionality reduction algorithm tailored to the verification task that ensures threshold balance. This is achieved as follows. First, feasibility is guaranteed by employing an affine transformation matrix, instead of the conventional projection matrix, for dimensionality reduction, and, hence, we call the proposed algorithm threshold balanced transformation (TBT). Then, the affine transformation matrix, constrained as the product of an orthogonal matrix and a diagonal matrix, is optimized to improve the threshold balance and classification capability in an iterative manner. Unlike most algorithms for face verification which are directly transplanted from the face identification literature, TBT is specifically designed for face verification and clarifies the intrinsic distinction between these two tasks. Experiments on three benchmark face databases demonstrate that TBT significantly outperforms the state-of-the-art subspace techniques for face verification.

  14. CTBT integrated verification system evaluation model supplement

    SciTech Connect

    EDENBURN,MICHAEL W.; BUNTING,MARCUS; PAYNE JR.,ARTHUR C.; TROST,LAWRENCE C.

    2000-03-02

    Sandia National Laboratories has developed a computer based model called IVSEM (Integrated Verification System Evaluation Model) to estimate the performance of a nuclear detonation monitoring system. The IVSEM project was initiated in June 1994, by Sandia's Monitoring Systems and Technology Center and has been funded by the U.S. Department of Energy's Office of Nonproliferation and National Security (DOE/NN). IVSEM is a simple, "top-level," modeling tool which estimates the performance of a Comprehensive Nuclear Test Ban Treaty (CTBT) monitoring system and can help explore the impact of various sensor system concepts and technology advancements on CTBT monitoring. One of IVSEM's unique features is that it integrates results from the various CTBT sensor technologies (seismic, infrasound, radionuclide, and hydroacoustic) and allows the user to investigate synergy among the technologies. Specifically, IVSEM estimates the detection effectiveness (probability of detection), location accuracy, and identification capability of the integrated system and of each technology subsystem individually. The model attempts to accurately estimate the monitoring system's performance at medium interfaces (air-land, air-water) and for some evasive testing methods such as seismic decoupling. The original IVSEM report, CTBT Integrated Verification System Evaluation Model, SAND97-2518, described version 1.2 of IVSEM. This report describes the changes made to IVSEM version 1.2 and the addition of identification capability estimates that have been incorporated into IVSEM version 2.0.

  15. Verification of Advances in a Coupled Snow-runoff Modeling Framework for Operational Streamflow Forecasts

    NASA Astrophysics Data System (ADS)

    Barik, M. G.; Hogue, T. S.; Franz, K. J.; He, M.

    2011-12-01

    The National Oceanic and Atmospheric Administration's (NOAA's) River Forecast Centers (RFCs) issue hydrologic forecasts related to flood events, reservoir operations for water supply, streamflow regulation, and recreation on the nation's streams and rivers. The RFCs use the National Weather Service River Forecast System (NWSRFS) for streamflow forecasting, which relies on a coupled snow model (i.e. SNOW17) and rainfall-runoff model (i.e. SAC-SMA) in snow-dominated regions of the US. Errors arise in various steps of the forecasting system from input data, model structure, model parameters, and initial states. The goal of the current study is to undertake verification of potential improvements in the SNOW17-SAC-SMA modeling framework developed for operational streamflow forecasts. We undertake verification for a range of parameter sets (i.e. RFC, DREAM (Differential Evolution Adaptive Metropolis)) as well as a data assimilation (DA) framework developed for the coupled models. Verification is also undertaken for various initial conditions to observe the influence of variability in initial conditions on the forecast. The study basin is the North Fork American River Basin (NFARB) located on the western side of the Sierra Nevada Mountains in northern California. Hindcasts are verified using both deterministic (i.e. Nash-Sutcliffe efficiency, root mean square error, and joint distribution) and probabilistic (i.e. reliability diagram, discrimination diagram, containing ratio, and quantile plots) statistics. Our presentation includes comparison of the performance of different optimized parameters and the DA framework as well as assessment of the impact associated with the initial conditions used for streamflow forecasts for the NFARB.

  16. Spatial Evaluation and Verification of Earthquake Simulators

    NASA Astrophysics Data System (ADS)

    Wilson, John Max; Yoder, Mark R.; Rundle, John B.; Turcotte, Donald L.; Schultz, Kasey W.

    2016-09-01

    In this paper, we address the problem of verifying earthquake simulators with observed data. Earthquake simulators are a class of computational simulations which attempt to mirror the topological complexity of fault systems on which earthquakes occur. In addition, the physics of friction and elastic interactions between fault elements are included in these simulations. Simulation parameters are adjusted so that natural earthquake sequences are matched in their scaling properties. Physically based earthquake simulators can generate many thousands of years of simulated seismicity, allowing for a robust capture of the statistical properties of large, damaging earthquakes that have long recurrence time scales. Verification of simulations against current observed earthquake seismicity is necessary, and following past simulator and forecast model verification methods, we address the challenges of applying spatial forecast verification to simulators; namely, that simulator outputs are confined to the modeled faults, while observed earthquake epicenters often occur off of known faults. We present two methods for addressing this discrepancy: a simplistic approach whereby observed earthquakes are shifted to the nearest fault element, and a smoothing method based on the power laws of the epidemic-type aftershock (ETAS) model, which distributes the seismicity of each simulated earthquake over the entire test region at a rate that decays with epicentral distance. To test these methods, a receiver operating characteristic plot was produced by comparing the rate maps to observed m>6.0 earthquakes in California since 1980. We found that the nearest-neighbor mapping produced poor forecasts, while the ETAS power-law method produced rate maps that agreed reasonably well with observations.
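
    To make the smoothing method concrete, the sketch below spreads each simulated epicenter's unit rate over a grid with a power-law decay in epicentral distance, in the spirit of the ETAS-based approach described above. It is a minimal illustration, not the authors' code; the kernel form, the exponent q, the length scale d0, and the toy grid are all assumptions.

    ```python
    import numpy as np

    def power_law_rate_map(event_xy, grid_xy, d0=5.0, q=1.5):
        """Spread each simulated epicenter over the whole test region,
        with its contribution decaying as a power law of distance.

        event_xy : (n_events, 2) epicenter coordinates (km)
        grid_xy  : (n_cells, 2) grid-cell center coordinates (km)
        d0, q    : assumed smoothing scale (km) and decay exponent
        """
        rates = np.zeros(len(grid_xy))
        for ex, ey in event_xy:
            d = np.hypot(grid_xy[:, 0] - ex, grid_xy[:, 1] - ey)
            w = (1.0 + d / d0) ** (-q)      # ETAS-like spatial kernel
            rates += w / w.sum()            # each event contributes unit rate
        return rates

    # Toy usage: 3 simulated events on a 50 km x 50 km region, 1 km cells
    xs, ys = np.meshgrid(np.arange(50), np.arange(50))
    grid = np.column_stack([xs.ravel(), ys.ravel()])
    events = np.array([[10.0, 12.0], [25.0, 30.0], [40.0, 8.0]])
    rate_map = power_law_rate_map(events, grid)
    ```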

  17. Structural verification for GAS experiments

    NASA Technical Reports Server (NTRS)

    Peden, Mark Daniel

    1992-01-01

    The purpose of this paper is to assist the Get Away Special (GAS) experimenter in conducting a thorough structural verification of its experiment structural configuration, thus expediting the structural review/approval process and the safety process in general. Material selection for structural subsystems will be covered with an emphasis on fasteners (GSFC fastener integrity requirements) and primary support structures (Stress Corrosion Cracking requirements and National Space Transportation System (NSTS) requirements). Different approaches to structural verifications (tests and analyses) will be outlined especially those stemming from lessons learned on load and fundamental frequency verification. In addition, fracture control will be covered for those payloads that utilize a door assembly or modify the containment provided by the standard GAS Experiment Mounting Plate (EMP). Structural hazard assessment and the preparation of structural hazard reports will be reviewed to form a summation of structural safety issues for inclusion in the safety data package.

  18. Evaluation of Mesoscale Model Phenomenological Verification Techniques

    NASA Technical Reports Server (NTRS)

    Lambert, Winifred

    2006-01-01

    Forecasters at the Spaceflight Meteorology Group, 45th Weather Squadron, and National Weather Service in Melbourne, FL use mesoscale numerical weather prediction model output in creating their operational forecasts. These models aid in forecasting weather phenomena that could compromise the safety of launch, landing, and daily ground operations and must produce reasonable weather forecasts in order for their output to be useful in operations. Considering the importance of model forecasts to operations, their accuracy in forecasting critical weather phenomena must be verified to determine their usefulness. The currently-used traditional verification techniques involve an objective point-by-point comparison of model output and observations valid at the same time and location. The resulting statistics can unfairly penalize high-resolution models that make realistic forecasts of certain phenomena but are offset from the observations in small time and/or space increments. Manual subjective verification can provide a more valid representation of model performance, but is time-consuming and prone to personal biases. An objective technique that verifies specific meteorological phenomena, much in the way a human would in a subjective evaluation, would likely produce a more realistic assessment of model performance. Such techniques are being developed in the research community. The Applied Meteorology Unit (AMU) was tasked to conduct a literature search to identify phenomenological verification techniques being developed, determine if any are ready to use operationally, and outline the steps needed to implement any operationally-ready techniques into the Advanced Weather Information Processing System (AWIPS). The AMU conducted a search of all literature on the topic of phenomenological-based mesoscale model verification techniques and found 10 different techniques in various stages of development. Six of the techniques were developed to verify precipitation forecasts, one

  19. Formal verification of mathematical software

    NASA Technical Reports Server (NTRS)

    Sutherland, D.

    1984-01-01

    Methods are investigated for formally specifying and verifying the correctness of mathematical software (software which uses floating point numbers and arithmetic). Previous work in the field was reviewed. A new model of floating point arithmetic called the asymptotic paradigm was developed and formalized. Two different conceptual approaches to program verification, the classical Verification Condition approach and the more recently developed Programming Logic approach, were adapted to use the asymptotic paradigm. These approaches were then used to verify several programs; the programs chosen were simplified versions of actual mathematical software.

  20. CHEMICAL INDUCTION MIXER VERIFICATION - ENVIRONMENTAL TECHNOLOGY VERIFICATION PROGRAM

    EPA Science Inventory

    The Wet-Weather Flow Technologies Pilot of the Environmental Technology Verification (ETV) Program, which is supported by the U.S. Environmental Protection Agency and facilitated by NSF International, has recently evaluated the performance of chemical induction mixers used for di...

  1. A verification system of RMAP protocol controller

    NASA Astrophysics Data System (ADS)

    Khanov, V. Kh; Shakhmatov, A. V.; Chekmarev, S. A.

    2015-01-01

    The functional verification problem for IP blocks of an RMAP protocol controller is considered. The application of a verification method using fully-functional models of the processor and the internal bus of a system-on-chip is justified. Principles of construction of a verification system based on the given approach are proposed. Practical results of creating a verification system for an RMAP protocol controller IP block are presented.

  2. Bayesian ROC curve estimation under verification bias.

    PubMed

    Gu, Jiezhun; Ghosal, Subhashis; Kleiner, David E

    2014-12-20

    Receiver operating characteristic (ROC) curves have been widely used in medical science for their ability to measure the accuracy of diagnostic tests against the gold standard. However, in complicated medical practice, a gold standard test can be invasive and expensive, and its result may not always be available for all the subjects under study. Thus, a gold standard test is implemented only when it is necessary and possible. This leads to the so-called 'verification bias', meaning that subjects with verified disease status (also called label) are not selected in a completely random fashion. In this paper, we propose a new Bayesian approach for estimating an ROC curve based on continuous data following the popular semiparametric binormal model in the presence of verification bias. By using a rank-based likelihood, and following Gibbs sampling techniques, we compute the posterior distribution of the binormal parameters, intercept and slope, as well as the area under the curve, by imputing the missing labels within Markov chain Monte Carlo iterations. Consistency of the resulting posterior under mild conditions is also established. We compare the new method with other comparable methods and conclude that our estimator performs well in terms of accuracy. PMID:25269427
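
    For reference, the binormal model referred to above has closed forms for the ROC curve and its area in terms of the intercept a and slope b: ROC(t) = Phi(a + b * Phi^{-1}(t)) and AUC = Phi(a / sqrt(1 + b^2)). The sketch below evaluates these, e.g. at posterior means of a and b; it illustrates only the binormal parameterization, not the paper's Gibbs sampler, and the parameter values are assumptions.

    ```python
    import numpy as np
    from scipy.stats import norm

    def binormal_roc(a, b, t):
        """ROC curve of the binormal model: ROC(t) = Phi(a + b * Phi^{-1}(t)),
        where t is the false positive fraction."""
        return norm.cdf(a + b * norm.ppf(t))

    def binormal_auc(a, b):
        """Area under the binormal ROC curve: Phi(a / sqrt(1 + b^2))."""
        return norm.cdf(a / np.sqrt(1.0 + b ** 2))

    # Toy usage with assumed parameter values (e.g., posterior means)
    a, b = 1.2, 0.9
    fpf = np.linspace(0.001, 0.999, 99)
    tpf = binormal_roc(a, b, fpf)
    print(f"AUC = {binormal_auc(a, b):.3f}")
    ```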

  3. Yeast flavohemoglobin, a nitric oxide oxidoreductase, is located in both the cytosol and the mitochondrial matrix: effects of respiration, anoxia, and the mitochondrial genome on its intracellular level and distribution.

    PubMed

    Cassanova, Nina; O'Brien, Kristin M; Stahl, Brett T; McClure, Travis; Poyton, Robert O

    2005-03-01

    Yeast flavohemoglobin, YHb, encoded by the nuclear gene YHB1, has been implicated in both the oxidative and nitrosative stress responses in Saccharomyces cerevisiae. Previous studies have shown that the expression of YHB1 is optimal under normoxic or hyperoxic conditions, yet respiring yeast cells have low levels of reduced YHb pigment as detected by carbon monoxide (CO) photolysis difference spectroscopy of glucose-reduced cells. Here, we have addressed this apparent discrepancy by determining the intracellular location of the YHb protein and analyzing the relationships between respiration, YHb level, and intracellular location. We have found that although intact respiration-proficient cells lack a YHb CO spectral signature, cell extracts from these cells have both a YHb CO spectral signature and nitric oxide (NO) consuming activity. This suggests either that YHb cannot be reduced in vivo or that YHb heme is maintained in an oxidized state in respiring cells. By using an anti-YHb antibody and CO difference spectroscopy and by measuring NO consumption, we have found that YHb localizes to two distinct intracellular compartments in respiring cells, the mitochondrial matrix and the cytosol. Moreover, we have found that the distribution of YHb between these two compartments is affected by the presence or absence of oxygen and by the mitochondrial genome. The findings suggest that YHb functions in oxidative stress indirectly by consuming NO, which inhibits mitochondrial respiration and leads to enhanced production of reactive oxygen species, and that cells can regulate intracellular distribution of YHb in accordance with this function.

  4. Non-damaging, portable radiography: Applications in arms control verification

    SciTech Connect

    Morris, R.A.; Butterfield, K.B.; Apt, K.E.

    1992-08-01

    The state-of-the-technology necessary to perform portable radiography in support of arms control verification is evaluated. Specific requirements, such as accurate measurements of the location of features in a treaty-limited object and the detection of deeply imbedded features, are defined in three scenarios. Sources, detectors, portability, mensuration, and safety are discussed in relation to the scenarios. Examples are given of typical radiographic systems that would be capable of addressing the inspection problems associated with the three scenarios.

  5. Working Memory Mechanism in Proportional Quantifier Verification

    ERIC Educational Resources Information Center

    Zajenkowski, Marcin; Szymanik, Jakub; Garraffa, Maria

    2014-01-01

    The paper explores the cognitive mechanisms involved in the verification of sentences with proportional quantifiers (e.g. "More than half of the dots are blue"). The first study shows that the verification of proportional sentences is more demanding than the verification of sentences such as: "There are seven blue and eight yellow…

  6. 78 FR 58492 - Generator Verification Reliability Standards

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-09-24

    ... Energy Regulatory Commission 18 CFR Part 40 Generator Verification Reliability Standards AGENCY: Federal... Organization: MOD-025-2 (Verification and Data Reporting of Generator Real and Reactive Power Capability and Synchronous Condenser Reactive Power Capability), MOD- 026-1 (Verification of Models and Data for...

  7. The science verification of FLAMES

    NASA Astrophysics Data System (ADS)

    Primas, Francesca

    2003-06-01

    After a new VLT instrument has been commissioned and thoroughly tested, a series of scientific and technical checkups are scheduled in order to test the front-to-end operations chain before the official start of regular operations. Technically speaking, these are the so-called Dry Runs, part of which are usually devoted to the Science Verification (SV for short) of that specific instrument. A Science Verification programme includes a set of typical scientific observations with the aim of verifying and demonstrating to the community the capabilities of a new instrument in the operational framework of the VLT Paranal Observatory. Though manifold, its goals can be summarised in two main points: from the scientific point of view, by demonstrating the scientific potential of the new instrument, these observations will provide ESO users with first science-grade data, thus fostering an early scientific return. From the technical point of view, by testing the whole operational system (from the preparation of the observations to their execution and analysis), it will provide important feedback to the Instrument Operation Teams (both in Paranal and in Garching), to the Instrument Division, and to the Data Flow groups. More details about the concept(s) behind a Science Verification can be found in the “Science Verification Policy and Procedures” document (available at http://www.eso.org/science/vltsv/).

  8. Toward Regional Fossil Fuel CO2 Emissions Verification Using WRF-CHEM

    NASA Astrophysics Data System (ADS)

    Delle Monache, L.; Kosović, B.; Cameron-Smith, P.; Bergmann, D.; Grant, K.; Guilderson, T.

    2008-12-01

    As efforts to reduce emissions of greenhouse gases take shape, it is becoming obvious that an essential component of a viable solution will involve emission verification. While detailed inventories of greenhouse gas sources will represent an important component of the solution, additional verification methodologies will be necessary to reduce uncertainties in emission estimates, especially for distributed sources and CO2 offsets. We developed tools for solving the inverse dispersion problem for distributed emissions of greenhouse gases. For that purpose we combine a probabilistic inverse methodology based on Bayesian inversion with stochastic sampling and the weather forecasting and air quality model WRF-CHEM. We demonstrate estimation of CO2 emissions associated with fossil fuel burning in California over two one-week periods in 2006. We use WRF-CHEM in tracer simulation mode to solve the forward dispersion problem for emissions over eleven air basins. We first use a direct inversion approach to determine optimal locations for a limited number of CO2-C14 isotope sensors. We then use Bayesian inference with stochastic sampling to determine probability distributions for emissions from California air basins. Moreover, we vary the number of sensors and the frequency of measurements to study their effect on the accuracy and uncertainty level of the emission estimates. Finally, to take into account uncertainties associated with forward modeling, we combine Bayesian inference and stochastic sampling with ensemble modeling. The ensemble is created by running WRF-CHEM with different initial and boundary conditions as well as different boundary layer and surface model options. This work performed under the auspices of the U.S. Department of Energy by Lawrence Livermore National Laboratory (LLNL) under Contract DE-AC52-07NA27344 (LLNL-ABS-406901-DRAFT). The project 07-ERD-064 was funded by the Laboratory Directed Research and Development Program at LLNL.
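
    The Bayesian-inversion-with-stochastic-sampling step can be sketched compactly: given a forward operator mapping basin emissions to sensor concentrations, a random-walk Metropolis sampler yields a posterior over the emission vector. The toy below uses a made-up linear footprint matrix and Gaussian noise in place of WRF-CHEM tracer runs; all values are illustrative assumptions, not numbers from the study.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    # Toy linear forward model: sensor concentrations = H @ emissions + noise.
    # In the study the forward problem is solved with WRF-CHEM tracer runs;
    # here H is an assumed 4-sensor x 3-basin footprint matrix.
    H = rng.uniform(0.1, 1.0, size=(4, 3))
    true_e = np.array([2.0, 5.0, 1.0])
    sigma = 0.1
    obs = H @ true_e + rng.normal(0.0, sigma, size=4)

    def log_posterior(e):
        if np.any(e < 0):                      # flat prior on e >= 0
            return -np.inf
        resid = obs - H @ e
        return -0.5 * np.sum(resid ** 2) / sigma ** 2

    # Random-walk Metropolis sampling of the emission vector
    e = np.ones(3)
    samples = []
    for _ in range(20000):
        prop = e + rng.normal(0.0, 0.05, size=3)
        if np.log(rng.uniform()) < log_posterior(prop) - log_posterior(e):
            e = prop
        samples.append(e)
    samples = np.array(samples[5000:])         # discard burn-in
    print("posterior mean:", samples.mean(axis=0))
    ```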

  9. LOCATING MONITORING STATIONS IN WATER DISTRIBUTION SYSTEMS

    EPA Science Inventory

    Water undergoes changes in quality between the time it leaves the treatment plant and the time it reaches the customer's tap, making it important to select monitoring stations that will adequately monitor these changes. But because there is no uniform schedule or framework for ...

  10. Programmable RET Mask Layout Verification

    NASA Astrophysics Data System (ADS)

    Beale, Daniel F.; Mayhew, Jeffrey P.; Rieger, Michael L.; Tang, Zongwu

    2002-12-01

    Emerging resolution enhancement techniques (RET) and OPC are dramatically increasing the complexity of mask layouts and, in turn, mask verification. Mask shapes needed to achieve required results on the wafer diverge significantly from corresponding shapes in the physical design, and in some cases a single chip layer may be decomposed into two masks used in multiple exposures. The mask verification challenge is to certify that a RET-synthesized mask layout will produce an acceptable facsimile of the design intent expressed in the design layout. Furthermore, tradeoffs between mask complexity, design intent, targeted process latitude, and other factors are playing a growing role in helping to control rising mask costs. All of these considerations must in turn be incorporated into the mask layout verification strategy needed for data prep sign-off. In this paper we describe a technique for assessing the lithographic quality of mask layouts for diverse RET methods while effectively accommodating various manufacturing objectives and specifications. It leverages the familiar DRC paradigm for identifying errors and producing DRC-like error shapes in its output layout. It integrates a unique concept of "check figures" - layer-based geometries that dictate where and how simulations of shapes on the wafer are to be compared to the original desired layout. We will show how this provides a highly programmable environment that makes it possible to engage in "compound" check strategies that vary based on design intent and adaptive simulation with multiple checks. Verification may be applied at the "go/no go" level or can be used to build a body of data for quantitative analysis of lithographic behavior at multiple process conditions or for specific user-defined critical features. In addition, we will outline automated methods that guide the selection of input parameters controlling specific verification strategies.

  11. Verification of surface temperature forecast in southern Italy

    NASA Astrophysics Data System (ADS)

    Federico, S.; Avolio, E.; Pasqualoni, L.; Bellecci, C.

    2009-09-01

    Operational gridded temperature forecasts have been issued for Calabria since January 2007 at CRATI Scrl in cooperation with ISAC-CNR. The forecast is based on the output of the RAMS model at 6 km horizontal resolution and is issued for the following 4 days. Forecast quality and skill are determined relative to the Regional Meteorological Network, which consists of more than 60 thermometers distributed rather uniformly over the region. The measurements available are daily minimum, mean and maximum temperatures, and verification refers to these parameters. Cumulative statistics are used to reduce the dimensionality of the forecast verification. In particular, BIAS, RMSE (Root Mean Square Error) and MAE (Mean Absolute Error) are shown for each of the 4 forecast days. Skills are also presented as a function of the season. The orographic complexity of the country is clearly reflected in the cumulative scores. The worst statistics are realized across northwest Calabria, where the resolution of the model is not enough to resolve the steep orographic gradient of the "Catena Costiera". The best scores are attained for the gentle terrain of the "Marchesato" on the east side of the peninsula. Statistics show the tendency of the model to over-predict maximum temperatures and to under-predict minimum temperatures. This tendency increases with forecast time and shows a model drift toward overestimating the diurnal cycle with forecast time. Murphy and Winkler (1987) summarize many of the limitations of the traditional cumulative accuracy measures. Alternatively, a distribution-oriented approach can be followed that uses the joint distribution of the forecast and observed values. The large dimensionality of the joint distribution approach is a significant drawback as a result of the large number of combinations of forecasts and observations. Reduction in the dimensionality requires defining specific applications and verification goals. To reduce dimensionality we present joint distributions to assess general forecast
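
    The cumulative scores named above follow directly from paired forecasts and observations; a minimal sketch, assuming matched arrays of daily values (station handling and stratification by forecast day omitted):

    ```python
    import numpy as np

    def verification_scores(forecast, observed):
        """Cumulative accuracy measures for paired forecast/observation arrays:
        BIAS (mean error), RMSE, and MAE."""
        err = np.asarray(forecast) - np.asarray(observed)
        return {
            "BIAS": err.mean(),
            "RMSE": np.sqrt((err ** 2).mean()),
            "MAE": np.abs(err).mean(),
        }

    # Toy usage: day-1 maximum temperature forecasts vs. station observations
    fcst = [29.1, 31.4, 27.8, 30.2]
    obs = [28.0, 30.5, 28.3, 29.0]
    print(verification_scores(fcst, obs))
    ```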

  12. Bayesian Estimation of Combined Accuracy for Tests with Verification Bias

    PubMed Central

    Broemeling, Lyle D.

    2011-01-01

    This presentation will emphasize the estimation of the combined accuracy of two or more tests when verification bias is present. Verification bias occurs when some of the subjects are not evaluated with the gold standard. The approach is Bayesian, where the estimation of test accuracy is based on the posterior distribution of the relevant parameter. The accuracy of two combined binary tests is estimated employing either the “believe the positive” or the “believe the negative” rule; the true and false positive fractions for each rule are then computed for the two tests. In order to perform the analysis, the missing at random assumption is imposed, and an interesting example is provided by estimating the combined accuracy of CT and MRI to diagnose lung cancer. The Bayesian approach is extended to two ordinal tests when verification bias is present, and the accuracy of the combined tests is based on the ROC area of the risk function. An example involving mammography with two readers with extreme verification bias illustrates the estimation of the combined test accuracy for ordinal tests. PMID:26859487
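
    To make the two combination rules concrete: under “believe the positive” the combined test is positive if either test is, and under “believe the negative” only if both are. The sketch below computes the combined true and false positive fractions assuming conditional independence of the tests given disease status, an assumption made purely for illustration and not taken from the paper.

    ```python
    def combine_bp(tpf1, fpf1, tpf2, fpf2):
        """'Believe the positive': combined test is positive if either is.
        Assumes the two tests are conditionally independent given disease."""
        tpf = 1 - (1 - tpf1) * (1 - tpf2)
        fpf = 1 - (1 - fpf1) * (1 - fpf2)
        return tpf, fpf

    def combine_bn(tpf1, fpf1, tpf2, fpf2):
        """'Believe the negative': combined test is positive only if both are."""
        return tpf1 * tpf2, fpf1 * fpf2

    # Toy usage with assumed per-test accuracies (e.g., two imaging tests)
    print(combine_bp(0.85, 0.10, 0.80, 0.15))  # higher sensitivity
    print(combine_bn(0.85, 0.10, 0.80, 0.15))  # higher specificity
    ```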

  13. Design, Implementation, and Verification of the Reliable Multicast Protocol. Thesis

    NASA Technical Reports Server (NTRS)

    Montgomery, Todd L.

    1995-01-01

    This document describes the Reliable Multicast Protocol (RMP) design, first implementation, and formal verification. RMP provides a totally ordered, reliable, atomic multicast service on top of an unreliable multicast datagram service. RMP is fully and symmetrically distributed so that no site bears an undue portion of the communications load. RMP provides a wide range of guarantees, from unreliable delivery to totally ordered delivery, to K-resilient, majority resilient, and totally resilient atomic delivery. These guarantees are selectable on a per message basis. RMP provides many communication options, including virtual synchrony, a publisher/subscriber model of message delivery, a client/server model of delivery, mutually exclusive handlers for messages, and mutually exclusive locks. It has been commonly believed that total ordering of messages can only be achieved at great performance expense. RMP discounts this. The first implementation of RMP has been shown to provide high throughput performance on Local Area Networks (LAN). For two or more destinations on a single LAN, RMP provides higher throughput than any other protocol that does not use multicast or broadcast technology. The design, implementation, and verification activities of RMP have occurred concurrently. This has allowed the verification to maintain a high fidelity between the design model, implementation model, and verification model. The restrictions of implementation have influenced the design earlier than in normal sequential approaches. The protocol as a whole has matured more smoothly through the inclusion of several different perspectives into the product development.

  14. Verification of a numerical simulation technique for natural convection

    SciTech Connect

    Gadgil, A.; Bauman, F.; Altmayer, E.; Kammerud, R.C.

    1983-03-01

    The present paper describes a verification of CONVEC2 for single-zone geometries by comparison with the results of two natural convection experiments performed in small-scale rectangular enclosures. These experiments were selected because of the high Rayleigh numbers obtained and the small heat loss through the insulated surfaces. Comparisons are presented for (1) heat transfer rates, (2) fluid temperature profiles, and (3) surface heat flux distributions.

  15. Verification of the karst flow model under laboratory controlled conditions

    NASA Astrophysics Data System (ADS)

    Gotovac, Hrvoje; Andric, Ivo; Malenica, Luka; Srzic, Veljko

    2016-04-01

    Karst aquifers are very important groundwater resources around the world, as well as in the coastal part of Croatia. They consist of an extremely complex structure defined by slow, laminar flow in the porous medium and small fissures, and usually fast, turbulent flow in conduits/karst channels. Apart from simple lumped hydrological models that ignore the high karst heterogeneity, full hydraulic (distributive) models have been developed, exclusively with conventional finite element and finite volume methods, that consider the complete karst heterogeneity structure and improve our understanding of the complex processes in karst. Groundwater flow modeling in complex karst aquifers is faced with many difficulties, such as a lack of heterogeneity knowledge (especially for conduits), resolution of different spatial/temporal scales, connectivity between matrix and conduits, setting of appropriate boundary conditions, and many others. A particular problem of karst flow modeling is the verification of distributive models under real aquifer conditions due to the lack of the above-mentioned information. Therefore, we show here the possibility of verifying karst flow models under laboratory controlled conditions. The special 3-D karst flow model (5.6*2.6*2 m) consists of a concrete construction, a rainfall platform, 74 piezometers, 2 reservoirs and other supply equipment. The model is filled with fine sand (the 3-D porous matrix) and plastic drainage pipes (the 1-D conduits). This model provides knowledge of the full heterogeneity structure, including the position of the different sand layers as well as conduit location and geometry. Moreover, the geometry of the conduit perforations is known, which enables analysis of the interaction between matrix and conduits. In addition, pressure and precipitation distributions and discharge flow rates from both phases can be measured very accurately. These possibilities are not available at real sites, which makes this model much more useful for karst flow modeling. Many experiments were performed under different controlled conditions such as different

  16. Experimental verification of quantum computation

    NASA Astrophysics Data System (ADS)

    Barz, Stefanie; Fitzsimons, Joseph F.; Kashefi, Elham; Walther, Philip

    2013-11-01

    Quantum computers are expected to offer substantial speed-ups over their classical counterparts and to solve problems intractable for classical computers. Beyond such practical significance, the concept of quantum computation opens up fundamental questions, among them the issue of whether quantum computations can be certified by entities that are inherently unable to compute the results themselves. Here we present the first experimental verification of quantum computation. We show, in theory and experiment, how a verifier with minimal quantum resources can test a significantly more powerful quantum computer. The new verification protocol introduced here uses the framework of blind quantum computing and is independent of the experimental quantum-computation platform used. In our scheme, the verifier is required only to generate single qubits and transmit them to the quantum computer. We experimentally demonstrate this protocol using four photonic qubits and show how the verifier can test the computer's ability to perform quantum computation.

  17. Ontology Matching with Semantic Verification

    PubMed Central

    Jean-Mary, Yves R.; Shironoshita, E. Patrick; Kabuka, Mansur R.

    2009-01-01

    ASMOV (Automated Semantic Matching of Ontologies with Verification) is a novel algorithm that uses lexical and structural characteristics of two ontologies to iteratively calculate a similarity measure between them, derives an alignment, and then verifies it to ensure that it does not contain semantic inconsistencies. In this paper, we describe the ASMOV algorithm, and then present experimental results that measure its accuracy using the OAEI 2008 tests, and that evaluate its use with two different thesauri: WordNet, and the Unified Medical Language System (UMLS). These results show the increased accuracy obtained by combining lexical, structural and extensional matchers with semantic verification, and demonstrate the advantage of using a domain-specific thesaurus for the alignment of specialized ontologies. PMID:20186256

  18. Ontology Matching with Semantic Verification.

    PubMed

    Jean-Mary, Yves R; Shironoshita, E Patrick; Kabuka, Mansur R

    2009-09-01

    ASMOV (Automated Semantic Matching of Ontologies with Verification) is a novel algorithm that uses lexical and structural characteristics of two ontologies to iteratively calculate a similarity measure between them, derives an alignment, and then verifies it to ensure that it does not contain semantic inconsistencies. In this paper, we describe the ASMOV algorithm, and then present experimental results that measure its accuracy using the OAEI 2008 tests, and that evaluate its use with two different thesauri: WordNet, and the Unified Medical Language System (UMLS). These results show the increased accuracy obtained by combining lexical, structural and extensional matchers with semantic verification, and demonstrate the advantage of using a domain-specific thesaurus for the alignment of specialized ontologies.

  19. Realistic weather simulations and forecast verification with COSMO-EULAG

    NASA Astrophysics Data System (ADS)

    Wójcik, Damian; Piotrowski, Zbigniew; Rosa, Bogdan; Ziemiański, Michał

    2015-04-01

    Research conducted at the Polish Institute of Meteorology and Water Management - National Research Institute, in collaboration with the Consortium for Small Scale Modeling (COSMO), resulted in the development of a new prototype model, COSMO-EULAG. The dynamical core of the new model is based on the anelastic set of equations and numerics adopted from the EULAG model. The core is coupled, with the 1st degree of accuracy, to the COSMO physical parameterizations involving turbulence, friction, radiation, moist processes and surface fluxes. The tool is capable of computing weather forecasts in mountainous areas at horizontal resolutions ranging from 2.2 km to 0.1 km and with slopes reaching 82 degrees of inclination. Employing EULAG allows one to profit from its desirable conservative properties and numerical robustness, confirmed in a number of benchmark tests and widely documented in the scientific literature. In this study we show a realistic case study of Alpine summer convection simulated by COSMO-EULAG. It compares the convection-permitting realization of the flow using a 2.2 km horizontal grid size, typical for contemporary very-high-resolution regional NWP forecasts, with an LES-type realization using a grid size of 100 m. The study presents a comparison of flow, cloud and precipitation structure together with the reference results of a standard compressible COSMO Runge-Kutta model forecast at 2.2 km horizontal resolution. The case study results are supplemented by COSMO-EULAG forecast verification results for the Alpine domain at 2.2 km horizontal resolution. Wind, temperature, cloud, humidity and precipitation scores are presented. The verification period covers one summer month (June 2013) and one autumn month (November 2013). Verification is based on data collected by a network of approximately 200 stations (surface data verification) and 6 stations (upper-air verification) located in the Alps and their vicinity.

  20. Formal verification of AI software

    NASA Technical Reports Server (NTRS)

    Rushby, John; Whitehurst, R. Alan

    1989-01-01

    The application of formal verification techniques to Artificial Intelligence (AI) software, particularly expert systems, is investigated. Constraint satisfaction and model inversion are identified as two formal specification paradigms for different classes of expert systems. A formal definition of consistency is developed, and the notion of approximate semantics is introduced. Examples are given of how these ideas can be applied in both declarative and imperative forms.

  1. [Rare location of arachnoid cysts. Extratemporal cysts].

    PubMed

    Martinez-Perez, Rafael; Hinojosa, José; Pascual, Beatriz; Panaderos, Teresa; Welter, Diego; Muñoz, María J

    2016-01-01

    The therapeutic management of arachnoid cysts depends largely on their location. Almost 50% of arachnoid cysts are located in the temporal fossa-Sylvian fissure, whereas the other half are distributed across various, sometimes exceptional, locations. Under the heading of infrequent-location arachnoid cysts, we describe those cysts, composed of two sheets of arachnoid membrane, that are not located in the temporal fossa and are primary or congenital.

  2. METHOD OF LOCATING GROUNDS

    DOEpatents

    Macleish, K.G.

    1958-02-11

    This patent presents a method for locating a ground in a d-c circuit having a number of parallel branches connected across a d-c source or generator. The complete method comprises the steps of locating the ground with reference to the midpoint of the parallel branches by connecting a potentiometer across the terminals of the circuit and connecting the slider of the potentiometer to ground through a current indicating instrument, adjusting the slider to right or left of the midpoint so as to cause the instrument to indicate zero, connecting the terminal of the network which is farthest from the ground as thus indicated by the potentiometer to ground through a condenser, impressing a ripple voltage on the circuit, and then measuring the ripple voltage at the midpoint of each parallel branch to find the branch with the lowest value of ripple voltage, and then measuring the distribution of the ripple voltage along this branch to determine the point at which the ripple voltage drops off to zero or substantially zero due to the existence of a ground. The invention has particular application where a circuit ground is present which will disappear if the normal circuit voltage is removed.

  3. Verification and transparency in future arms control

    SciTech Connect

    Pilat, J.F.

    1996-09-01

    Verification's importance has changed dramatically over time, although it always has been in the forefront of arms control. The goals and measures of verification and the criteria for success have changed with the times as well, reflecting such factors as the centrality of the prospective agreement to East-West relations during the Cold War, the state of relations between the United States and the Soviet Union, and the technologies available for monitoring. Verification's role may be declining in the post-Cold War period. The prospects for such a development will depend, first and foremost, on the high costs of traditional arms control, especially those associated with requirements for verification. Moreover, the growing interest in informal, or non-negotiated arms control does not allow for verification provisions by the very nature of these arrangements. Multilateral agreements are also becoming more prominent and argue against highly effective verification measures, in part because of fears of promoting proliferation by opening sensitive facilities to inspectors from potential proliferant states. As a result, it is likely that transparency and confidence-building measures will achieve greater prominence, both as supplements to and substitutes for traditional verification. Such measures are not panaceas and do not offer all that we came to expect from verification during the Cold War. But they may be the best possible means to deal with current problems of arms reductions and restraints at acceptable levels of expenditure.

  4. Nuclear Data Verification and Standardization

    SciTech Connect

    Karam, Lisa R.; Arif, Muhammad; Thompson, Alan K.

    2011-10-01

    The objective of this interagency program is to provide accurate neutron interaction verification and standardization data for the U.S. Department of Energy Division of Nuclear Physics programs, which include astrophysics, radioactive beam studies, and heavy-ion reactions. The measurements made in this program are also useful to other programs that indirectly use the unique properties of the neutron for diagnostic and analytical purposes. These include homeland security, personnel health and safety, nuclear waste disposal, treaty verification, national defense, and nuclear based energy production. The work includes the verification of reference standard cross sections and related neutron data employing the unique facilities and capabilities at NIST and other laboratories as required; leadership and participation in international intercomparisons and collaborations; and the preservation of standard reference deposits. An essential element of the program is critical evaluation of neutron interaction data standards, including international coordination. Data testing of critical data for important applications is included. The program is jointly supported by the Department of Energy and the National Institute of Standards and Technology.

  5. Regression Verification Using Impact Summaries

    NASA Technical Reports Server (NTRS)

    Backes, John; Person, Suzette J.; Rungta, Neha; Thachuk, Oksana

    2013-01-01

    Regression verification techniques are used to prove equivalence of syntactically similar programs. Checking equivalence of large programs, however, can be computationally expensive. Existing regression verification techniques rely on abstraction and decomposition techniques to reduce the computational effort of checking equivalence of the entire program. These techniques are sound but not complete. In this work, we propose a novel approach to improve scalability of regression verification by classifying the program behaviors generated during symbolic execution as either impacted or unimpacted. Our technique uses a combination of static analysis and symbolic execution to generate summaries of impacted program behaviors. The impact summaries are then checked for equivalence using an off-the-shelf decision procedure. We prove that our approach is both sound and complete for sequential programs, with respect to the depth bound of symbolic execution. Our evaluation on a set of sequential C artifacts shows that reducing the size of the summaries can help reduce the cost of software equivalence checking. Various reduction, abstraction, and compositional techniques have been developed to help scale software verification techniques to industrial-sized systems. Although such techniques have greatly increased the size and complexity of systems that can be checked, analysis of large software systems remains costly. Regression analysis techniques, e.g., regression testing [16], regression model checking [22], and regression verification [19], restrict the scope of the analysis by leveraging the differences between program versions. These techniques are based on the idea that if code is checked early in development, then subsequent versions can be checked against a prior (checked) version, leveraging the results of the previous analysis to reduce analysis cost of the current version. Regression verification addresses the problem of proving equivalence of closely related program
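
    As a small illustration of the underlying equivalence-checking idea (not of the impact-summary technique itself), the sketch below asks an off-the-shelf decision procedure whether two versions of a computation can ever disagree; unsatisfiability of the disagreement query proves equivalence. It assumes the z3-solver Python package.

    ```python
    from z3 import BitVec, Solver, unsat

    x = BitVec("x", 32)

    # Two syntactically different versions of the same computation:
    v1 = x * 2          # original program
    v2 = x + x          # "refactored" program

    s = Solver()
    s.add(v1 != v2)     # search for an input on which the versions disagree

    if s.check() == unsat:
        print("versions are equivalent")   # no disagreeing input exists
    else:
        print("counterexample:", s.model())
    ```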

  6. Verification Challenges at Low Numbers

    SciTech Connect

    Benz, Jacob M.; Booker, Paul M.; McDonald, Benjamin S.

    2013-06-01

    Many papers have dealt with the political difficulties and ramifications of deep nuclear arms reductions, and the issues of “Going to Zero”. Political issues include extended deterrence, conventional weapons, ballistic missile defense, and regional and geo-political security issues. At each step on the road to low numbers, the verification required to ensure compliance of all parties will increase significantly. Looking past New START, the next step will likely include warhead limits in the neighborhood of 1000. Further reductions will include stepping stones at 1000 warheads, 100s of warheads, and then 10s of warheads before final elimination could be considered of the last few remaining warheads and weapons. This paper will focus on these three threshold reduction levels: 1000, 100s, and 10s of warheads. For each, the issues and challenges will be discussed, potential solutions will be identified, and the verification technologies and chain of custody measures that address these solutions will be surveyed. It is important to note that many of the issues that need to be addressed have no current solution. In these cases, the paper will explore new or novel technologies that could be applied. These technologies will draw from the research and development that is ongoing throughout the national laboratory complex, and will look at technologies utilized in other areas of industry for their application to arms control verification.

  7. Gender verification in competitive sports.

    PubMed

    Simpson, J L; Ljungqvist, A; de la Chapelle, A; Ferguson-Smith, M A; Genel, M; Carlson, A S; Ehrhardt, A A; Ferris, E

    1993-11-01

    The possibility that men might masquerade as women and be unfair competitors in women's sports is accepted as outrageous by athletes and the public alike. Since the 1930s, media reports have fuelled claims that individuals who once competed as female athletes subsequently appeared to be men. In most of these cases there was probably ambiguity of the external genitalia, possibly as a result of male pseudohermaphroditism. Nonetheless, beginning at the Rome Olympic Games in 1960, the International Amateur Athletics Federation (IAAF) began establishing rules of eligibility for women athletes. Initially, physical examination was used as a method for gender verification, but this plan was widely resented. Thus, sex chromatin testing (buccal smear) was introduced at the Mexico City Olympic Games in 1968. The principle was that genetic females (46,XX) show a single X-chromatic mass, whereas males (46,XY) do not. Unfortunately, sex chromatin analysis fell out of common diagnostic use by geneticists shortly after the International Olympic Committee (IOC) began its implementation for gender verification. The lack of laboratories routinely performing the test aggravated the problem of errors in interpretation by inexperienced workers, yielding false-positive and false-negative results. However, an even greater problem is that there exist phenotypic females with male sex chromatin patterns (e.g. androgen insensitivity, XY gonadal dysgenesis). These individuals have no athletic advantage as a result of their congenital abnormality and reasonably should not be excluded from competition. That is, only the chromosomal (genetic) sex is analysed by sex chromatin testing, not the anatomical or psychosocial status. For all the above reasons sex chromatin testing unfairly excludes many athletes. Although the IOC offered follow-up physical examinations that could have restored eligibility for those 'failing' sex chromatin tests, most affected athletes seemed to prefer to 'retire'. All

  8. LOCATING LEAKS WITH ACOUSTIC TECHNOLOGY

    EPA Science Inventory

    Many water distribution systems in this country are almost 100 years old. About 26 percent of piping in these systems is made of unlined cast iron or steel and is in poor condition. Many methods that locate leaks in these pipes are time-consuming, costly, disruptive to operations...

  9. 40 CFR 1065.920 - PEMS calibrations and verifications.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... 40 Protection of Environment 34 2013-07-01 2013-07-01 false PEMS calibrations and verifications....920 PEMS calibrations and verifications. (a) Subsystem calibrations and verifications. Use all the applicable calibrations and verifications in subpart D of this part, including the linearity verifications...

  10. 40 CFR 1065.920 - PEMS calibrations and verifications.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... 40 Protection of Environment 34 2012-07-01 2012-07-01 false PEMS calibrations and verifications....920 PEMS calibrations and verifications. (a) Subsystem calibrations and verifications. Use all the applicable calibrations and verifications in subpart D of this part, including the linearity verifications...

  11. PERFORMANCE VERIFICATION OF STORMWATER TREATMENT DEVICES UNDER EPA'S ENVIRONMENTAL TECHNOLOGY VERIFICATION PROGRAM

    EPA Science Inventory

    The Environmental Technology Verification (ETV) Program was created to facilitate the deployment of innovative or improved environmental technologies through performance verification and dissemination of information. The program's goal is to further environmental protection by a...

  12. PERFORMANCE VERIFICATION OF ANIMAL WATER TREATMENT TECHNOLOGIES THROUGH EPA'S ENVIRONMENTAL TECHNOLOGY VERIFICATION PROGRAM

    EPA Science Inventory

    The U.S. Environmental Protection Agency created the Environmental Technology Verification Program (ETV) to further environmental protection by accelerating the commercialization of new and innovative technology through independent performance verification and dissemination of in...

  13. Magnetic cleanliness verification approach on tethered satellite

    NASA Technical Reports Server (NTRS)

    Messidoro, Piero; Braghin, Massimo; Grande, Maurizio

    1990-01-01

    Magnetic cleanliness testing was performed on the Tethered Satellite as the last step of an articulated verification campaign aimed at demonstrating the capability of the satellite to support its TEMAG (TEthered MAgnetometer) experiment. Tests at unit level and analytical predictions/correlations using a dedicated mathematical model (GANEW program) are also part of the verification activities. Details of the tests are presented, and the results of the verification are described together with recommendations for later programs.

  14. Location-assured, multifactor authentication on smartphones via LTE communication

    NASA Astrophysics Data System (ADS)

    Kuseler, Torben; Lami, Ihsan A.; Al-Assam, Hisham

    2013-05-01

    With the added security provided by LTE, geographical location has become an important factor for authentication to enhance the security of remote client authentication during mCommerce applications using Smartphones. Tight combination of geographical location with classic authentication factors like PINs/Biometrics in a real-time, remote verification scheme over the LTE layer connection assures the authenticator about the client itself (via PIN/biometric) as well as the client's current location, thus defining the important aspects of "who", "when", and "where" of the authentication attempt without eavesdropping or man-in-the-middle attacks. To securely integrate location as an authentication factor into the remote authentication scheme, the client's location must be verified independently, i.e. the authenticator should not solely rely on the location determined on and reported by the client's Smartphone. The latest wireless data communication technology for mobile phones (4G LTE, Long-Term Evolution), recently being rolled out in various networks, can be employed to meet this requirement of independent location verification. LTE's Control Plane LBS provisions, when integrated with user-based authentication and an independent source of localisation factors, ensure secure, efficient, continuous location tracking of the Smartphone. This verification can be performed during normal operation of the LTE-based communication between client and network operator, resulting in the authenticator being able to verify the client's claimed location more securely and accurately. Trials and experiments show that such an implementation is viable for today's Smartphone-based banking via LTE communication.
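
    A server-side check combining the two factors might look like the sketch below: the PIN is verified against a stored salted hash, and the claimed position is accepted only if it agrees, within a tolerance, with an independently obtained network-derived position. This is a minimal illustration of the idea only; the tolerance, hashing scheme, and distance test are assumptions, not the authors' protocol.

    ```python
    import hashlib
    import math

    def haversine_km(lat1, lon1, lat2, lon2):
        """Great-circle distance between two (lat, lon) points in km."""
        p1, p2 = math.radians(lat1), math.radians(lat2)
        dl = math.radians(lon2 - lon1)
        a = (math.sin((p2 - p1) / 2) ** 2
             + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2)
        return 6371.0 * 2 * math.asin(math.sqrt(a))

    def authenticate(pin, salt, stored_hash, claimed_pos, network_pos, tol_km=1.0):
        """Accept only if the PIN hash matches AND the claimed location agrees
        with the independently determined (network-side) location."""
        pin_ok = hashlib.sha256((salt + pin).encode()).hexdigest() == stored_hash
        loc_ok = haversine_km(*claimed_pos, *network_pos) <= tol_km
        return pin_ok and loc_ok

    # Toy usage with made-up credentials and positions
    salt = "s0"
    stored = hashlib.sha256((salt + "4711").encode()).hexdigest()
    print(authenticate("4711", salt, stored, (52.520, 13.400), (52.521, 13.401)))
    ```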

  15. Hydrologic data-verification management program plan

    USGS Publications Warehouse

    Alexander, C.W.

    1982-01-01

    Data verification refers to the performance of quality control on hydrologic data that have been retrieved from the field and are being prepared for dissemination to water-data users. Water-data users now have access to computerized data files containing unpublished, unverified hydrologic data. Therefore, it is necessary to develop techniques and systems whereby the computer can perform some data-verification functions before the data are stored in user-accessible files. Computerized data-verification routines can be developed for this purpose. A single, unified concept describing a master data-verification program, using multiple special-purpose subroutines and a screen file containing verification criteria, can probably be adapted to any type and size of computer-processing system. Some traditional manual-verification procedures can be adapted for computerized verification, but new procedures can also be developed that would take advantage of the powerful statistical tools and data-handling procedures available to the computer. Prototype data-verification systems should be developed for all three data-processing environments as soon as possible. The WATSTORE system probably affords the greatest opportunity for long-range research and testing of new verification subroutines. (USGS)
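
    A criteria-driven verification routine of the kind described can be sketched very simply: each record is screened against per-station limits from a criteria ("screen") file before entering user-accessible files. The field names and limit values below are illustrative assumptions.

    ```python
    # Minimal sketch of a criteria-driven data-verification screen.
    # The per-station limits ("screen file") and record layout are assumed.
    SCREEN = {
        "08166000": {"min_stage_ft": 0.0, "max_stage_ft": 30.0, "max_step_ft": 5.0},
    }

    def verify(station, readings):
        """Flag readings that violate range or rate-of-change criteria."""
        c = SCREEN[station]
        flags = []
        for i, value in enumerate(readings):
            if not c["min_stage_ft"] <= value <= c["max_stage_ft"]:
                flags.append((i, value, "out of range"))
            elif i > 0 and abs(value - readings[i - 1]) > c["max_step_ft"]:
                flags.append((i, value, "excessive change"))
        return flags

    print(verify("08166000", [2.1, 2.3, 9.0, 2.4, -1.0]))
    ```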

  16. Input apparatus for dynamic signature verification systems

    DOEpatents

    EerNisse, Errol P.; Land, Cecil E.; Snelling, Jay B.

    1978-01-01

    The disclosure relates to signature verification input apparatus comprising a writing instrument and platen containing piezoelectric transducers which generate signals in response to writing pressures.

  17. Guide to good practices for independent verification

    SciTech Connect

    1998-12-01

    This Guide to Good Practices is written to enhance understanding of, and provide direction for, Independent Verification, Chapter X of Department of Energy (DOE) Order 5480.19, Conduct of Operations Requirements for DOE Facilities. The practices in this guide should be considered when planning or reviewing independent verification activities. Contractors are advised to adopt procedures that meet the intent of DOE Order 5480.19. Independent Verification is an element of an effective Conduct of Operations program. The complexity and array of activities performed in DOE facilities dictate the necessity for coordinated independent verification activities to promote safe and efficient operations.

  18. The SeaHorn Verification Framework

    NASA Technical Reports Server (NTRS)

    Gurfinkel, Arie; Kahsai, Temesghen; Komuravelli, Anvesh; Navas, Jorge A.

    2015-01-01

    In this paper, we present SeaHorn, a software verification framework. The key distinguishing feature of SeaHorn is its modular design that separates the concerns of the syntax of the programming language, its operational semantics, and the verification semantics. SeaHorn encompasses several novelties: it (a) encodes verification conditions using an efficient yet precise inter-procedural technique, (b) provides flexibility in the verification semantics to allow different levels of precision, (c) leverages the state-of-the-art in software model checking and abstract interpretation for verification, and (d) uses Horn-clauses as an intermediate language to represent verification conditions which simplifies interfacing with multiple verification tools based on Horn-clauses. SeaHorn provides users with a powerful verification tool and researchers with an extensible and customizable framework for experimenting with new software verification techniques. The effectiveness and scalability of SeaHorn are demonstrated by an extensive experimental evaluation using benchmarks from SV-COMP 2015 and real avionics code.

  19. Location, Location, Location: Development of Spatiotemporal Sequence Learning in Infancy

    ERIC Educational Resources Information Center

    Kirkham, Natasha Z.; Slemmer, Jonathan A.; Richardson, Daniel C.; Johnson, Scott P.

    2007-01-01

    We investigated infants' sensitivity to spatiotemporal structure. In Experiment 1, circles appeared in a statistically defined spatial pattern. At test, 11-month-olds, but not 8-month-olds, looked longer at a novel spatial sequence. Experiment 2 presented different color/shape stimuli, but only the location sequence was violated during test;…

  20. Verification of regional climates of GISS GCM. Part 2: Summer

    NASA Technical Reports Server (NTRS)

    Druyan, Leonard M.; Rind, David

    1989-01-01

    Verification is made of the synoptic fields, sea-level pressure, precipitation rate, 200mb zonal wind and the surface resultant wind generated by two versions of the Goddard Institute for Space Studies (GISS) climate model. The models differ regarding the horizontal resolution of the computation grids and the specification of the sea-surface temperatures. Maps of the regional distributions of seasonal means of the model fields are shown alongside maps that show the observed distributions. Comparisons of the model results with observations are discussed and also summarized in tables according to geographic region.
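
    The regional summaries described above reduce gridded seasonal-mean fields to a few numbers per region. A minimal sketch of such a reduction, using synthetic NumPy arrays and a hypothetical region mask (not the GISS grids or regions):

```python
# Regional summary of a model-vs-observation comparison: seasonal-mean
# fields on a lat/lon grid reduced to per-region bias and RMSE.
# Synthetic data and a single hypothetical region mask for illustration.
import numpy as np

rng = np.random.default_rng(0)
model = rng.normal(1012.0, 4.0, size=(46, 72))   # e.g. seasonal-mean SLP, hPa
obs = model + rng.normal(0.5, 1.5, size=model.shape)

region = np.zeros(model.shape, dtype=bool)
region[10:20, 30:45] = True                      # hypothetical region mask

diff = (model - obs)[region]
bias = diff.mean()
rmse = np.sqrt((diff ** 2).mean())
print(f"regional bias: {bias:+.2f} hPa, RMSE: {rmse:.2f} hPa")
```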

  1. Verification of regional climates of GISS GCM. Part 1: Winter

    NASA Technical Reports Server (NTRS)

    Druyan, Leonard M.; Rind, David

    1988-01-01

    Verification is made of the synoptic fields, sea level pressure, precipitation rate, 200 mb zonal wind and the surface resultant wind, generated by two versions of the GISS climate model. The models differ regarding the horizontal resolution of the computational grids and the specification of the sea surface temperatures. Maps of the regional distributions of seasonal variations of the model fields are shown alongside maps showing the observed distributions. Comparisons of the model results with observations are discussed, and also summarized in tables according to geographic regions.

  2. ENVIRONMENTAL TECHNOLOGY VERIFICATION: Reduction of Nitrogen in Domestic Wastewater from Individual Residential Homes. BioConcepts, Inc. ReCip® RTS-500 System

    EPA Science Inventory

    Verification testing of the ReCip® RTS-500 System was conducted over a 12-month period at the Massachusetts Alternative Septic System Test Center (MASSTC) located on Otis Air National Guard Base in Bourne, Massachusetts. A nine-week startup period preceded the verification test t...

  3. 40 CFR 1065.550 - Gas analyzer range verification and drift verification.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... Cycles § 1065.550 Gas analyzer range verification and drift verification. (a) Range verification. If an... with a CLD and the removed water is corrected based on measured CO2, CO, THC, and NOX concentrations...-specific emissions over the entire duty cycle for drift. For each constituent to be verified, both sets...

  4. Verification of Public Weather Forecasts Available via the Media.

    NASA Astrophysics Data System (ADS)

    Brooks, Harold E.; Witt, Arthur; Eilts, Michael D.

    1997-10-01

    The question of who is the "best" forecaster in a particular media market is one that the public frequently asks. The authors have collected approximately one year's forecasts from the National Weather Service and major media presentations for Oklahoma City. Diagnostic verification procedures indicate that the question of best does not have a clear answer. All of the forecast sources have strengths and weaknesses, and it is possible that a user could take information from a variety of sources to come up with a forecast that has more value than any one individual source provides. The analysis provides numerous examples of the utility of a distributions-oriented approach to verification while also providing insight into the problems the public faces in evaluating the array of forecasts presented to them.
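
    A distributions-oriented approach examines the full joint distribution of forecasts and observations rather than a single summary score. The following sketch illustrates the idea on categorical data; the forecast and observation values are invented for illustration.

```python
# Distributions-oriented verification in miniature: tabulate the joint
# distribution of categorical forecasts and observations, from which
# calibration can be read off. Data are invented for illustration.
from collections import Counter

forecasts    = ["rain", "rain", "dry", "dry", "rain", "dry", "dry", "rain"]
observations = ["rain", "dry", "dry", "dry", "rain", "rain", "dry", "rain"]

n = len(forecasts)
joint = Counter(zip(forecasts, observations))
for (f, o), count in sorted(joint.items()):
    print(f"p(forecast={f}, observed={o}) = {count / n:.3f}")

# Calibration: how often it actually rained when rain was forecast.
obs_given_rain = [o for f, o in zip(forecasts, observations) if f == "rain"]
print("p(observed=rain | forecast=rain) =",
      obs_given_rain.count("rain") / len(obs_given_rain))
```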

  5. Conformance Verification of Privacy Policies

    NASA Astrophysics Data System (ADS)

    Fu, Xiang

    Web applications are both the consumers and providers of information. To increase customer confidence, many websites choose to publish their privacy protection policies. However, policy conformance is often neglected. We propose a logic based framework for formally specifying and reasoning about the implementation of privacy protection by a web application. A first order extension of computation tree logic is used to specify a policy. A verification paradigm, built upon a static control/data flow analysis, is presented to verify if a policy is satisfied.

  6. Structural dynamics verification facility study

    NASA Technical Reports Server (NTRS)

    Kiraly, L. J.; Hirchbein, M. S.; Mcaleese, J. M.; Fleming, D. P.

    1981-01-01

    The need for a structural dynamics verification facility to support structures programs was studied. Most of the industry-operated facilities are used for highly focused research, component development, and problem solving, and are not used for the generic understanding of the coupled dynamic response of major engine subsystems. Capabilities for the proposed facility include: the ability to both excite and measure coupled structural dynamic response of elastic blades on elastic shafting, the mechanical simulation of various dynamical loadings representative of those seen in operating engines, and the measurement of engine dynamic deflections and interface forces caused by alternative engine mounting configurations and compliances.

  7. Why do verification and validation?

    DOE PAGES

    Hu, Kenneth T.; Paez, Thomas L.

    2016-02-19

    In this discussion paper, we explore different ways to assess the value of verification and validation (V&V) of engineering models. We first present a literature review on the value of V&V and then use value chains and decision trees to show how value can be assessed from a decision maker's perspective. In this context, the value is what the decision maker is willing to pay for V&V analysis, with the understanding that the V&V results are uncertain. The 2014 Sandia V&V Challenge Workshop is then used to illustrate these ideas.
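
    The decision-tree framing admits a simple back-of-the-envelope form: the value of V&V is the increase in expected payoff that the V&V evidence makes possible. The sketch below works through one hypothetical case; all probabilities and payoffs are invented, not taken from the workshop.

```python
# Toy decision-tree arithmetic for the value of V&V. All probabilities
# and payoffs are hypothetical, not taken from the workshop.
p_flawed = 0.2               # chance the design is flawed
loss_failure = -1_000_000    # deploying a flawed design
gain_success = 200_000       # deploying a sound design
cost_redesign = -50_000      # catching the flaw before deployment
p_detect = 0.9               # chance V&V detects a flaw, given one exists

# Without V&V, flaws go undetected and the design is deployed regardless.
ev_without = p_flawed * loss_failure + (1 - p_flawed) * gain_success

# With V&V, a detected flaw triggers a redesign instead of a failure.
ev_with = (p_flawed * (p_detect * cost_redesign + (1 - p_detect) * loss_failure)
           + (1 - p_flawed) * gain_success)

print(f"expected value without V&V: {ev_without:>10,.0f}")
print(f"expected value with V&V:    {ev_with:>10,.0f}")
print(f"maximum rational spend on V&V: {ev_with - ev_without:,.0f}")
```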

  8. RISKIND verification and benchmark comparisons

    SciTech Connect

    Biwer, B.M.; Arnish, J.J.; Chen, S.Y.; Kamboj, S.

    1997-08-01

    This report presents verification calculations and benchmark comparisons for RISKIND, a computer code designed to estimate potential radiological consequences and health risks to individuals and the population from exposures associated with the transportation of spent nuclear fuel and other radioactive materials. Spreadsheet calculations were performed to verify the proper operation of the major options and calculational steps in RISKIND. The program is unique in that it combines a variety of well-established models into a comprehensive treatment for assessing risks from the transportation of radioactive materials. Benchmark comparisons with other validated codes that incorporate similar models were also performed. For instance, the external gamma and neutron dose rate curves for a shipping package estimated by RISKIND were compared with those estimated by using the RADTRAN 4 code and NUREG-0170 methodology. Atmospheric dispersion of released material and the resulting dose estimates were compared with results from the GENII and CAP88-PC codes. Verification results have shown the program to be performing its intended function correctly. The benchmark results indicate that the predictions made by RISKIND are within acceptable limits when compared with predictions from similar existing models.

  9. Cognitive Bias in Systems Verification

    NASA Technical Reports Server (NTRS)

    Larson, Steve

    2012-01-01

    Working definition of cognitive bias: patterns by which information is sought and interpreted that can lead to systematic errors in decisions. Cognitive bias is used in diverse fields: economics, politics, intelligence, and marketing, to name a few. Attempts to ground cognitive science in physical characteristics of the cognitive apparatus exceed our knowledge. Studies are based on correlations; strict cause and effect is difficult to pinpoint. The effects cited in the paper and discussed here have been replicated many times over and appear sound. Many biases have been described, but it is still unclear whether they are all distinct; there may be only a handful of fundamental biases, which manifest in various ways. Bias can affect system verification in many ways: overconfidence leads to questionable decisions to deploy; availability leads to an inability to conceive critical tests; representativeness leads to overinterpretation of results; and positive test strategies produce confirmation bias. Debiasing at the individual level is very difficult. The potential effect of bias on the verification process can be managed, but not eliminated. It is worth considering at key points in the process.

  10. Runtime Verification with State Estimation

    NASA Technical Reports Server (NTRS)

    Stoller, Scott D.; Bartocci, Ezio; Seyster, Justin; Grosu, Radu; Havelund, Klaus; Smolka, Scott A.; Zadok, Erez

    2011-01-01

    We introduce the concept of Runtime Verification with State Estimation and show how this concept can be applied to estimate the probability that a temporal property is satisfied by a run of a program when monitoring overhead is reduced by sampling. In such situations, there may be gaps in the observed program executions, thus making accurate estimation challenging. To deal with the effects of sampling on runtime verification, we view event sequences as observation sequences of a Hidden Markov Model (HMM), use an HMM model of the monitored program to "fill in" sampling-induced gaps in observation sequences, and extend the classic forward algorithm for HMM state estimation (which determines the probability of a state sequence, given an observation sequence) to compute the probability that the property is satisfied by an execution of the program. To validate our approach, we present a case study based on the mission software for a Mars rover. The results of our case study demonstrate high prediction accuracy for the probabilities computed by our algorithm. They also show that our technique is much more accurate than simply evaluating the temporal property on the given observation sequences, ignoring the gaps.
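
    The mechanics can be illustrated compactly. In the sketch below, a two-state HMM belief is advanced by the transition model alone wherever sampling left a gap, and conditioned on the emission model where an event was observed. The model parameters are synthetic, and this simplified filter stands in for the paper's extended forward algorithm.

```python
# Forward-algorithm state estimation with sampling gaps: at observed steps
# the belief is propagated and conditioned on the emission; at gapped steps
# (None) only the transition model advances it. Synthetic 2-state HMM.
import numpy as np

A = np.array([[0.9, 0.1],     # transition probabilities
              [0.2, 0.8]])
B = np.array([[0.7, 0.3],     # emission probabilities P(obs | state)
              [0.1, 0.9]])
pi = np.array([0.5, 0.5])     # state distribution before the first step

obs = [0, None, None, 1, 1]   # None marks a sampling-induced gap

belief = pi.copy()
for o in obs:
    belief = belief @ A                # propagate one step
    if o is not None:
        belief = belief * B[:, o]      # condition on the observation
    belief = belief / belief.sum()     # normalize

print("estimated state distribution:", belief)
```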

  11. A Verification Method for MASOES.

    PubMed

    Perozo, N; Aguilar Perozo, J; Terán, O; Molina, H

    2013-02-01

    MASOES is a multiagent architecture for designing and modeling self-organizing and emergent systems. This architecture describes the elements, relationships, and mechanisms, both at the individual and the collective levels, that favor the analysis of the self-organizing and emergent phenomenon without mathematically modeling the system. In this paper, a method is proposed for verifying MASOES from the point of view of design, in order to study the self-organizing and emergent behaviors of the modeled systems. The verification criteria are set according to what MASOES proposes for modeling self-organizing and emergent systems, the principles of the wisdom-of-crowds paradigm, and fuzzy cognitive map (FCM) theory. The verification method for MASOES has been implemented in a tool called FCM Designer and has been tested by modeling a community of free software developers that works in the bazaar style, as well as a Wikipedia community, in order to study their behavior and determine their self-organizing and emergent capacities.

  12. Video-based fingerprint verification.

    PubMed

    Qin, Wei; Yin, Yilong; Liu, Lili

    2013-09-04

    Conventional fingerprint verification systems use only static information. In this paper, fingerprint videos, which contain dynamic information, are utilized for verification. Fingerprint videos are acquired by the same capture device that acquires conventional fingerprint images, and the user experience of providing a fingerprint video is the same as that of providing a single impression. After preprocessing and aligning processes, "inside similarity" and "outside similarity" are defined and calculated to take advantage of both dynamic and static information contained in fingerprint videos. Match scores between two matching fingerprint videos are then calculated by combining the two kinds of similarity. Experimental results show that the proposed video-based method leads to a relative reduction of 60 percent in the equal error rate (EER) in comparison to the conventional single impression-based method. We also analyze the time complexity of our method when different combinations of strategies are used. Our method still outperforms the conventional method, even if both methods have the same time complexity. Finally, experimental results demonstrate that the proposed video-based method can lead to better accuracy than the multiple impressions fusion method, and the proposed method has a much lower false acceptance rate (FAR) when the false rejection rate (FRR) is quite low.

  13. Video-Based Fingerprint Verification

    PubMed Central

    Qin, Wei; Yin, Yilong; Liu, Lili

    2013-01-01

    Conventional fingerprint verification systems use only static information. In this paper, fingerprint videos, which contain dynamic information, are utilized for verification. Fingerprint videos are acquired by the same capture device that acquires conventional fingerprint images, and the user experience of providing a fingerprint video is the same as that of providing a single impression. After preprocessing and aligning processes, “inside similarity” and “outside similarity” are defined and calculated to take advantage of both dynamic and static information contained in fingerprint videos. Match scores between two matching fingerprint videos are then calculated by combining the two kinds of similarity. Experimental results show that the proposed video-based method leads to a relative reduction of 60 percent in the equal error rate (EER) in comparison to the conventional single impression-based method. We also analyze the time complexity of our method when different combinations of strategies are used. Our method still outperforms the conventional method, even if both methods have the same time complexity. Finally, experimental results demonstrate that the proposed video-based method can lead to better accuracy than the multiple impressions fusion method, and the proposed method has a much lower false acceptance rate (FAR) when the false rejection rate (FRR) is quite low. PMID:24008283

  14. GENERIC VERIFICATION PROTOCOL FOR THE VERIFICATION OF PESTICIDE SPRAY DRIFT REDUCTION TECHNOLOGIES FOR ROW AND FIELD CROPS

    EPA Science Inventory

    This ETV program generic verification protocol was prepared and reviewed for the Verification of Pesticide Drift Reduction Technologies project. The protocol provides a detailed methodology for conducting and reporting results from a verification test of pesticide drift reductio...

  15. Hailstorms over Switzerland: Verification of Crowd-sourced Data

    NASA Astrophysics Data System (ADS)

    Noti, Pascal-Andreas; Martynov, Andrey; Hering, Alessandro; Martius, Olivia

    2016-04-01

    The reports of smartphone users witnessing hailstorms can be used as a source of independent, ground-based observation data on ground-reaching hailstorms with high temporal and spatial resolution. The presented work focuses on the verification of crowd-sourced data collected over Switzerland with the help of a smartphone application recently developed by MeteoSwiss. The precise location and time of hail precipitation and the hailstone size are included in the crowd-sourced data, assessed on the basis of the weather radar data of MeteoSwiss. Two radar-based hail detection algorithms in use at MeteoSwiss, POH (Probability of Hail) and MESHS (Maximum Expected Severe Hail Size), are compared against the crowd-sourced data. The available data cover the investigation period from June to August 2015. Filter criteria have been applied in order to remove false reports from the crowd-sourced data. Neighborhood methods have been introduced to reduce the uncertainties which result from spatial and temporal biases. The crowd-sourced and radar data are converted into binary sequences according to previously set thresholds, allowing for a categorical verification. Verification scores (e.g. hit rate) are then calculated from a 2x2 contingency table. The hail reporting activity and the patterns corresponding to "hail" and "no hail" reports, sent from smartphones, have been analyzed. The relationship between the reported hailstone sizes and the two radar-based hail detection algorithms has been investigated.
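
    The categorical step described above has a direct implementation: binarize both data sources, fill the 2x2 contingency table, and derive scores. A minimal sketch with invented paired samples (not the MeteoSwiss data):

```python
# Categorical verification: binarized radar detections scored against
# binarized crowd reports via a 2x2 contingency table. The paired samples
# are invented, not the MeteoSwiss data.
def contingency(detections, reports):
    hits = sum(d and r for d, r in zip(detections, reports))
    misses = sum((not d) and r for d, r in zip(detections, reports))
    false_alarms = sum(d and (not r) for d, r in zip(detections, reports))
    correct_negatives = sum((not d) and (not r) for d, r in zip(detections, reports))
    return hits, misses, false_alarms, correct_negatives

radar_hail = [True, True, False, True, False, False, True, False]
crowd_hail = [True, False, False, True, False, True, True, False]

h, m, fa, cn = contingency(radar_hail, crowd_hail)
print("hit rate (POD):    ", h / (h + m))
print("false alarm ratio: ", fa / (h + fa))
```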

  16. Enrichment Assay Methods Development for the Integrated Cylinder Verification System

    SciTech Connect

    Smith, Leon E.; Misner, Alex C.; Hatchell, Brian K.; Curtis, Michael M.

    2009-10-22

    International Atomic Energy Agency (IAEA) inspectors currently perform periodic inspections at uranium enrichment plants to verify UF6 cylinder enrichment declarations. Measurements are typically performed with handheld high-resolution sensors on a sampling of cylinders taken to be representative of the facility's entire product-cylinder inventory. Pacific Northwest National Laboratory (PNNL) is developing a concept to automate the verification of enrichment plant cylinders to enable 100 percent product-cylinder verification and potentially, mass-balance calculations on the facility as a whole (by also measuring feed and tails cylinders). The Integrated Cylinder Verification System (ICVS) could be located at key measurement points to positively identify each cylinder, measure its mass and enrichment, store the collected data in a secure database, and maintain continuity of knowledge on measured cylinders until IAEA inspector arrival. The three main objectives of this FY09 project are summarized here and described in more detail in the report: (1) Develop a preliminary design for a prototype NDA system, (2) Refine PNNL's MCNP models of the NDA system, and (3) Procure and test key pulse-processing components. Progress against these tasks to date, and next steps, are discussed.

  17. High Resolution Verification of Precipitation Using Non-gts Data

    NASA Astrophysics Data System (ADS)

    Milelli, M.; Oberto, E.; Pelosini, R.

    In this work the non-hydrostatic Lokal Modell (LM, a Limited Area Model used in the framework of the COSMO Consortium between Germany, Switzerland, Italy, Greece and Poland) has been compared to the ECMWF hydrostatic model and to the observed values of precipitation over the Piedmont region, located in the north-western part of Italy. There, a very dense non-GTS network of raingauges is available and makes high-resolution verification possible. In particular, the objectives of the work are the evaluation of the change in the LM orography parametrization (14/12/2001), the determination of a possible seasonal trend and of a possible diurnal cycle, and the comparison of performance between the LM00 and LM12 runs. Standard schemes of precipitation verification have been used, such as contingency tables for different precipitation thresholds, from which it is possible to create statistical indices like the BIAS, the False Alarm Rate (FAR), the Threat Score (TS) and the Hit Rain Rate (HRR). The observed and forecasted values have been averaged and then compared over each of the 10 sub-domains which have been created according to meteo-hydrological criteria. Moreover, a kind of climatological verification has been carried out by comparing the total amount of rain (mm) predicted and observed over the whole region during the years 2000 and 2001.

  18. Students' Verification Strategies for Combinatorial Problems

    ERIC Educational Resources Information Center

    Mashiach Eizenberg, Michal; Zaslavsky, Orit

    2004-01-01

    We focus on a major difficulty in solving combinatorial problems, namely, on the verification of a solution. Our study aimed at identifying undergraduate students' tendencies to verify their solutions, and the verification strategies that they employ when solving these problems. In addition, an attempt was made to evaluate the level of efficiency…

  19. IMPROVING AIR QUALITY THROUGH ENVIRONMENTAL TECHNOLOGY VERIFICATIONS

    EPA Science Inventory

    The U.S. Environmental Protection Agency (EPA) began the Environmental Technology Verification (ETV) Program in 1995 as a means of working with the private sector to establish a market-based verification process available to all environmental technologies. Under EPA's Office of R...

  20. 14 CFR 460.17 - Verification program.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... TRANSPORTATION LICENSING HUMAN SPACE FLIGHT REQUIREMENTS Launch and Reentry with Crew § 460.17 Verification... software in an operational flight environment before allowing any space flight participant on board during... 14 Aeronautics and Space 4 2014-01-01 2014-01-01 false Verification program. 460.17 Section...

  1. ENVIRONMENTAL TECHNOLOGY VERIFICATION FOR INDOOR AIR PRODUCTS

    EPA Science Inventory

    The paper discusses environmental technology verification (ETV) for indoor air products. RTI is developing the framework for a verification testing program for indoor air products, as part of EPA's ETV program. RTI is establishing test protocols for products that fit into three...

  2. 24 CFR 4001.112 - Income verification.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... 24 Housing and Urban Development 5 2010-04-01 2010-04-01 false Income verification. 4001.112... Requirements and Underwriting Procedures § 4001.112 Income verification. The mortgagee shall use FHA's procedures to verify the mortgagor's income and shall comply with the following additional requirements:...

  3. ENVIRONMENTAL TECHNOLOGY VERIFICATION AND INDOOR AIR

    EPA Science Inventory

    The paper discusses environmental technology verification and indoor air. RTI has responsibility for a pilot program for indoor air products as part of the U.S. EPA's Environmental Technology Verification (ETV) program. The program objective is to further the development of sel...

  4. The monitoring and verification of nuclear weapons

    SciTech Connect

    Garwin, Richard L.

    2014-05-09

    This paper partially reviews and updates the potential for monitoring and verification of nuclear weapons, including verification of their destruction. Cooperative monitoring with templates of the gamma-ray spectrum is an important tool, dependent on the use of information barriers.

  5. 29 CFR 1903.19 - Abatement verification.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... 29 Labor 5 2014-07-01 2014-07-01 false Abatement verification. 1903.19 Section 1903.19 Labor Regulations Relating to Labor (Continued) OCCUPATIONAL SAFETY AND HEALTH ADMINISTRATION, DEPARTMENT OF LABOR INSPECTIONS, CITATIONS AND PROPOSED PENALTIES § 1903.19 Abatement verification. Purpose. OSHA's inspections are intended to result in...

  6. Gender Verification of Female Olympic Athletes.

    ERIC Educational Resources Information Center

    Dickinson, Barry D.; Genel, Myron; Robinowitz, Carolyn B.; Turner, Patricia L.; Woods, Gary L.

    2002-01-01

    Gender verification of female athletes has long been criticized by geneticists, endocrinologists, and others in the medical community. Recently, the International Olympic Committee's Athletic Commission called for discontinuation of mandatory laboratory-based gender verification of female athletes. This article discusses normal sexual…

  7. Guidelines for qualifying cleaning and verification materials

    NASA Technical Reports Server (NTRS)

    Webb, D.

    1995-01-01

    This document is intended to provide guidance in identifying technical issues which must be addressed in a comprehensive qualification plan for materials used in cleaning and cleanliness verification processes. Information presented herein is intended to facilitate development of a definitive checklist that should address all pertinent materials issues when down-selecting a cleaning/verification medium.

  8. CFE verification: The decision to inspect

    SciTech Connect

    Allentuck, J.

    1990-01-01

    Verification of compliance with the provisions of the treaty on Conventional Forces-Europe (CFE) is subject to inspection quotas of various kinds. Thus the decision to carry out a specific inspection or verification activity must be prudently made. This decision process is outlined, and means for "conserving quotas" are suggested. 4 refs., 1 fig.

  9. New method of verificating optical flat flatness

    NASA Astrophysics Data System (ADS)

    Sun, Hao; Li, Xueyuan; Han, Sen; Zhu, Jianrong; Guo, Zhenglai; Fu, Yuegang

    2014-11-01

    Optical flats are commonly used in optical testing instruments, and flatness is their most important form-error parameter. As a measurement criterion, the optical flat flatness (OFF) index needs good precision. Current measurement practice in China depends heavily on artificial visual interpretation, characterizing flatness through discrete points. The efficiency and accuracy of this method cannot meet the demands of industrial development. To improve testing efficiency and measurement accuracy, it is necessary to develop an optical flat verification system that can obtain full surface information rapidly and efficiently while complying with current national metrological verification procedures. This paper reviews the current optical flat verification method and solves the problems of previous tests by using a new method and its supporting software. Final results show that the new system improves verification efficiency and accuracy, as compared with the method of the JJG 28-2000 metrological verification procedures.
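
    One standard way a full-surface system reduces a measured height map to flatness numbers is to remove the least-squares best-fit plane and report the peak-to-valley and RMS residuals. The sketch below illustrates this on a synthetic surface; it is a generic flatness evaluation, not the JJG 28-2000 procedure itself.

```python
# Generic flatness evaluation: subtract the least-squares best-fit plane
# from a full-aperture height map, then report peak-to-valley (PV) and RMS
# residuals. Synthetic surface for illustration.
import numpy as np

ny, nx = 128, 128
y, x = np.mgrid[0:ny, 0:nx]
# Synthetic measured surface: tilt plus a shallow bump, heights in microns.
surface = 0.02 * x + 0.01 * y + 5e-4 * np.hypot(x - 64, y - 64)

# Least-squares plane fit: h ~ a*x + b*y + c.
A = np.column_stack([x.ravel(), y.ravel(), np.ones(x.size)])
coef, *_ = np.linalg.lstsq(A, surface.ravel(), rcond=None)
residual = surface - (A @ coef).reshape(surface.shape)

print(f"PV flatness:  {residual.max() - residual.min():.4f} um")
print(f"RMS flatness: {residual.std():.4f} um")
```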

  10. Working memory mechanism in proportional quantifier verification.

    PubMed

    Zajenkowski, Marcin; Szymanik, Jakub; Garraffa, Maria

    2014-12-01

    The paper explores the cognitive mechanisms involved in the verification of sentences with proportional quantifiers (e.g., "More than half of the dots are blue"). The first study shows that the verification of proportional sentences is more demanding than the verification of sentences such as "There are seven blue and eight yellow dots". The second study reveals that both types of sentences are correlated with memory storage; however, only proportional sentences are associated with cognitive control. This result suggests that the cognitive mechanism underlying the verification of proportional quantifiers is crucially related to the integration process, in which an individual has to compare in memory the cardinalities of two sets. In the third study we find that the numerical distance between the two cardinalities that must be compared significantly influences verification time and accuracy. The results of our studies are discussed in the broader context of processing complex sentences. PMID:24374596

  11. Space transportation system payload interface verification

    NASA Technical Reports Server (NTRS)

    Everline, R. T.

    1977-01-01

    The paper considers STS payload-interface verification requirements and the capability provided by STS to support verification. The intent is to standardize as many interfaces as possible, not only through the design, development, test and evaluation (DDT and E) phase of the major payload carriers but also into the operational phase. The verification process is discussed in terms of its various elements, such as the Space Shuttle DDT and E (including the orbital flight test program) and the major payload carriers DDT and E (including the first flights). Five tools derived from the Space Shuttle DDT and E are available to support the verification process: mathematical (structural and thermal) models, the Shuttle Avionics Integration Laboratory, the Shuttle Manipulator Development Facility, and interface-verification equipment (cargo-integration test equipment).

  12. Planck Telescope: optical design and verification

    NASA Astrophysics Data System (ADS)

    Martin, Philippe; Riti, Jean-Bernard; de Chambure, Daniel

    2004-06-01

    The cornerstone mission of the European Space Agency (ESA) scientific program, Herschel/Planck, is currently in the design and manufacturing phase (phase C/D). The Planck satellite will be launched in 2007, together with Herschel. Located around the L2 Lagrange point, Planck aims at obtaining very accurate images of the Cosmic Microwave Background fluctuations. Working up to high frequency (857 GHz, i.e. 350 μm wavelength), Planck is expected to give sharper images than the recently launched WMAP satellite. The Planck Telescope is an off-axis (unobscured) Gregorian antenna, with a 1.5 m diameter pupil, a small F-number (~1) and a large FOV (+/-5° circular), allowing a large number of detectors (bolometers) to be placed in the focal plane. This paper presents the optical design, performance, and verification concept of the Planck telescope. The custom-made sequential Hartmann system is described. Working at 10.6 μm, it will directly measure the wavefront of the telescope in a cryogenic environment, i.e. at operational conditions. This will be a major milestone in the spacecraft development.

  13. CTBT Integrated Verification System Evaluation Model

    SciTech Connect

    Edenburn, M.W.; Bunting, M.L.; Payne, A.C. Jr.

    1997-10-01

    Sandia National Laboratories has developed a computer-based model called IVSEM (Integrated Verification System Evaluation Model) to estimate the performance of a nuclear detonation monitoring system. The IVSEM project was initiated in June 1994 by Sandia's Monitoring Systems and Technology Center and has been funded by the US Department of Energy's Office of Nonproliferation and National Security (DOE/NN). IVSEM is a simple, top-level, modeling tool which estimates the performance of a Comprehensive Nuclear Test Ban Treaty (CTBT) monitoring system and can help explore the impact of various sensor system concepts and technology advancements on CTBT monitoring. One of IVSEM's unique features is that it integrates results from the various CTBT sensor technologies (seismic, infrasound, radionuclide, and hydroacoustic) and allows the user to investigate synergy among the technologies. Specifically, IVSEM estimates the detection effectiveness (probability of detection) and location accuracy of the integrated system and of each technology subsystem individually. The model attempts to accurately estimate the monitoring system's performance at medium interfaces (air-land, air-water) and for some evasive testing methods such as seismic decoupling. This report describes version 1.2 of IVSEM.

  14. Stennis Space Center Verification & Validation Capabilities

    NASA Technical Reports Server (NTRS)

    Pagnutti, Mary; Ryan, Robert E.; Holekamp, Kara; O'Neal, Duane; Knowlton, Kelly; Ross, Kenton; Blonski, Slawomir

    2007-01-01

    Scientists within NASA's Applied Research & Technology Project Office (formerly the Applied Sciences Directorate) have developed a well-characterized remote sensing Verification & Validation (V&V) site at the John C. Stennis Space Center (SSC). This site enables the in-flight characterization of satellite and airborne high spatial resolution remote sensing systems and their products. The smaller scale of the newer high resolution remote sensing systems allows scientists to characterize geometric, spatial, and radiometric data properties using a single V&V site. The targets and techniques used to characterize data from these newer systems can differ significantly from the techniques used to characterize data from the earlier, coarser spatial resolution systems. Scientists have used the SSC V&V site to characterize thermal infrared systems. Enhancements are being considered to characterize active lidar systems. SSC employs geodetic targets, edge targets, radiometric tarps, atmospheric monitoring equipment, and thermal calibration ponds to characterize remote sensing data products. Similar techniques are used to characterize moderate spatial resolution sensing systems at selected nearby locations. The SSC Instrument Validation Lab is a key component of the V&V capability and is used to calibrate field instrumentation and to provide National Institute of Standards and Technology traceability. This poster presents a description of the SSC characterization capabilities and examples of calibration data.

  15. Locating Continuing Education Programs.

    ERIC Educational Resources Information Center

    Mason, Robert C.

    1986-01-01

    Emphasizes program location as an important component of the marketing plan for continuing education. Also discusses relations among program location and quality, costs, supportive services, and economies of scale. (CH)

  16. Cable-fault locator

    NASA Technical Reports Server (NTRS)

    Cason, R. L.; Mcstay, J. J.; Heymann, A. P., Sr.

    1979-01-01

    Inexpensive system automatically indicates location of short-circuited section of power cable. Monitor does not require that cable be disconnected from its power source or that test signals be applied. Instead, ground-current sensors are installed in manholes or at other selected locations along cable run. When fault occurs, sensors transmit information about fault location to control center. Repair crew can be sent to location and cable can be returned to service with minimum of downtime.

  17. Cleanup Verification Package for the 118-F-1 Burial Ground

    SciTech Connect

    E. J. Farris and H. M. Sulloway

    2008-01-10

    This cleanup verification package documents completion of remedial action for the 118-F-1 Burial Ground on the Hanford Site. This burial ground is a combination of two locations formerly called Minor Construction Burial Ground No. 2 and Solid Waste Burial Ground No. 2. This waste site received radioactive equipment and other miscellaneous waste from 105-F Reactor operations, including dummy elements and irradiated process tubing; gun barrel tips, steel sleeves, and metal chips removed from the reactor; filter boxes containing reactor graphite chips; and miscellaneous construction solid waste.

  18. Location, Location, Location: Where Do Location-Based Services Fit into Your Institution's Social Media Mix?

    ERIC Educational Resources Information Center

    Nekritz, Tim

    2011-01-01

    Foursquare is a location-based social networking service that allows users to share their location with friends. Some college administrators have been thinking about whether and how to take the leap into location-based services, which are also known as geosocial networking services. These platforms, which often incorporate gaming elements like…

  19. Criteria for monitoring a chemical arms treaty: Implications for the verification regime. Report No. 13

    SciTech Connect

    Mullen, M.F.; Apt, K.E.; Stanbro, W.D.

    1991-12-01

    The multinational Chemical Weapons Convention (CWC) being negotiated at the Conference on Disarmament in Geneva is viewed by many as an effective way to rid the world of the threat of chemical weapons. Parties could, however, legitimately engage in certain CW-related activities in industry, agriculture, research, medicine, and law enforcement. Treaty verification requirements related to declared activities include: confirming destruction of declared CW stockpiles and production facilities; monitoring legitimate, treaty-allowed activities, such as production of certain industrial chemicals; and, detecting proscribed activities within the declared locations of treaty signatories, e.g., the illegal production of CW agents at a declared industrial facility or the diversion or substitution of declared CW stockpile items. Verification requirements related to undeclared activities or locations include investigating possible clandestine CW stocks and production capability not originally declared by signatories; detecting clandestine, proscribed activities at facilities or sites that are not declared and hence not subject to routine inspection; and, investigating allegations of belligerent use of CW. We discuss here a possible set of criteria for assessing the effectiveness of CWC verification (and certain aspects of the bilateral CW reduction agreement). Although the criteria are applicable to the full range of verification requirements, the discussion emphasizes verification of declared activities and sites.

  20. Criteria for monitoring a chemical arms treaty: Implications for the verification regime

    SciTech Connect

    Mullen, M.F.; Apt, K.E.; Stanbro, W.D.

    1991-12-01

    The multinational Chemical Weapons Convention (CWC) being negotiated at the Conference on Disarmament in Geneva is viewed by many as an effective way to rid the world of the threat of chemical weapons. Parties could, however, legitimately engage in certain CW-related activities in industry, agriculture, research, medicine, and law enforcement. Treaty verification requirements related to declared activities include: confirming destruction of declared CW stockpiles and production facilities; monitoring legitimate, treaty-allowed activities, such as production of certain industrial chemicals; and, detecting proscribed activities within the declared locations of treaty signatories, e.g., the illegal production of CW agents at a declared industrial facility or the diversion or substitution of declared CW stockpile items. Verification requirements related to undeclared activities or locations include investigating possible clandestine CW stocks and production capability not originally declared by signatories; detecting clandestine, proscribed activities at facilities or sites that are not declared and hence not subject to routine inspection; and, investigating allegations of belligerent use of CW. We discuss here a possible set of criteria for assessing the effectiveness of CWC verification (and certain aspects of the bilateral CW reduction agreement). Although the criteria are applicable to the full range of verification requirements, the discussion emphasizes verification of declared activities and sites.

  1. A hybrid framework for verification of satellite precipitation products

    NASA Astrophysics Data System (ADS)

    Li, J.; Hsu, K.; AghaKouchak, A.; Sorooshian, S.

    2011-12-01

    Advances in satellite technology have led to the development of many remote-sensing algorithms to estimate precipitation at quasi-global scales. A number of satellite precipitation products are provided at high spatial and temporal resolutions that are suitable for short-term hydrologic applications. Several coordinated validation activities have been established to evaluate the accuracy of satellite precipitation. Traditional verification measures summarize pixel-to-pixel differences between observation and estimates. Object-based verification methods, however, extend pixel based validation to address errors related to spatial patterns and storm structure, such as the shape, volume, and distribution of precipitation rain-objects. In this investigation, a 2D watershed segmentation technique is used to identify rain storm objects and is further adopted in a hybrid verification framework to diagnose the storm-scale rainfall objects from both satellite-based precipitation estimates and ground observations (radar estimates). Five key scores are identified in the objective-based verification framework, including false alarm ratio, missing ratio, maximum of total interest, equal weight and weighted summation of total interest. These scores indicate the performance of satellite estimates with features extracted from the segmented storm objects. The proposed object-based verification framework was used to evaluate PERSIANN, PERSIANN-CCS, CMORPH, 3B42RT against NOAA stage IV MPE multi-sensor composite rain analysis. All estimates are evaluated at 0.25°x0.25° daily-scale in summer 2008 over the continental United States (CONUS). The five final scores for each precipitation product are compared with the median of maximum interest (MMI) of the Method for Object-Based Diagnostic Evaluation (MODE). The results show PERSIANN and CMORPH outperform 3B42RT and PERSIANN-CCS. Different satellite products presented distinct features of precipitation. For example, the sizes of
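
    The object-identification step can be sketched with standard tools. The example below, assuming scikit-image and SciPy, thresholds a synthetic rain field, seeds markers at local maxima, and applies 2D watershed segmentation to extract rain objects; it is simplified relative to the paper's framework.

```python
# Rain-object identification with 2D watershed segmentation, the step used
# to extract storm objects before scoring. Assumes scikit-image and SciPy;
# synthetic rain field, simplified relative to the paper's framework.
import numpy as np
from scipy import ndimage
from skimage.feature import peak_local_max
from skimage.segmentation import watershed

rng = np.random.default_rng(1)
rain = ndimage.gaussian_filter(rng.random((100, 100)), sigma=6) * 20.0  # mm/day

wet = rain > 5.0                                  # rain / no-rain mask
peaks = peak_local_max(rain, min_distance=10, labels=wet.astype(int))
markers = np.zeros(rain.shape, dtype=int)
markers[tuple(peaks.T)] = np.arange(1, len(peaks) + 1)

objects = watershed(-rain, markers, mask=wet)     # flood downhill from maxima
for label in range(1, objects.max() + 1):
    member = objects == label
    print(f"object {label}: area={member.sum()} px, "
          f"rain volume={rain[member].sum():.1f}")
```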

  2. 47 CFR 64.1120 - Verification of orders for telecommunications service.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... mobile radio services (CMRS) providers shall be excluded from the verification requirements of this part... not be owned, managed, controlled, or directed by the carrier or the carrier's marketing agent; must... carrier's marketing agent; and must operate in a location physically separate from the carrier or...

  3. 47 CFR 64.1120 - Verification of orders for telecommunications service.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... mobile radio services (CMRS) providers shall be excluded from the verification requirements of this part... not be owned, managed, controlled, or directed by the carrier or the carrier's marketing agent; must... carrier's marketing agent; and must operate in a location physically separate from the carrier or...

  4. Integrating Fingerprint Verification into the Smart Card-Based Healthcare Information System

    NASA Astrophysics Data System (ADS)

    Moon, Daesung; Chung, Yongwha; Pan, Sung Bum; Park, Jin-Won

    2009-12-01

    As VLSI technology has improved, smart cards employing 32-bit processors have been released, and more personal information, such as medical and financial data, can be stored in the card. Thus, it becomes important to protect the personal information stored in the card. Verification of the card holder's identity using a fingerprint has advantages over the present practices of Personal Identification Numbers (PINs) and passwords. However, the computational workload of fingerprint verification is much heavier than that of the typical PIN-based solution. In this paper, we consider three strategies for implementing fingerprint verification in a smart card environment and how to distribute the modules of fingerprint verification between the smart card and the card reader. We first evaluate the number of instructions in each step of a typical fingerprint verification algorithm, and estimate the execution time of several cryptographic algorithms needed to guarantee the security/privacy of the fingerprint data transmitted in the smart card with the client-server environment. Based on the evaluation results, we analyze each scenario with respect to the security level and the real-time execution requirements in order to implement fingerprint verification in the smart card with the client-server environment.

  5. First Images from VLT Science Verification Programme

    NASA Astrophysics Data System (ADS)

    1998-09-01

    morning of September 1 when the telescope was returned to the Commissioning Team that has since continued its work. The FORS instrument is now being installed and the first images from this facility are expected shortly. Observational circumstances During the two-week SV period, a total of 154 hours were available for astronomical observations. Of these, 95 hours (62%) were used to collect scientific data, including calibrations, e.g. flat-fielding and photometric standard star observations. 15 hours (10%) were spent to solve minor technical problems, while another 44 hours (29%) were lost due to adverse meteorological conditions (clouds or wind exceeding 15 m/sec). The amount of telescope technical downtime is very small at this moment of the UT1 commissioning. This fact provides an impressive indication of high technical reliability that has been achieved and which will be further consolidated during the next months. The meteorological conditions that were encountered at Paranal during this period were unfortunately below average, when compared to data from the same calendar period in earlier years. There was an excess of bad seeing and fewer good seeing periods than normal; see, however, ESO PR Photo 35c/98 with 0.26 arcsec image quality. Nevertheless, the measured image quality on the acquired frames was often better than the seeing measured outside the enclosure by the Paranal seeing monitor. Part of this very positive effect is due to "active field stabilization" , now performed during all observations by rapid motion (10 - 70 times per second) of the 1.1-m secondary mirror of beryllium (M2) and compensating for the "twinkling" of stars. Science Verification data soon to be released A great amount of valuable data was collected during the SV programme. The available programme time was distributed as follows: Hubble Deep Field - South [HDF-S; NICMOS and STIS Fields] (37.1 hrs); Lensed QSOs (3.2 hrs); High-z Clusters (6.2 hrs); Host Galaxies of Gamma-Ray Bursters (2

  6. Verification of NASA Emergent Systems

    NASA Technical Reports Server (NTRS)

    Rouff, Christopher; Vanderbilt, Amy K. C. S.; Truszkowski, Walt; Rash, James; Hinchey, Mike

    2004-01-01

    NASA is studying advanced technologies for a future robotic exploration mission to the asteroid belt. This mission, the prospective ANTS (Autonomous Nano Technology Swarm) mission, will comprise 1,000 autonomous robotic agents designed to cooperate in asteroid exploration. The emergent properties of swarm-type missions make them powerful, but at the same time more difficult to design and to assure that the proper behaviors will emerge. We are currently investigating formal methods and techniques for verification and validation of future swarm-based missions. The advantage of using formal methods is their ability to mathematically assure the behavior of a swarm, emergent or otherwise. The ANTS mission is being used as an example and case study of swarm-based missions on which to experiment with and test current formal methods for intelligent swarms. Using the ANTS mission, we have evaluated multiple formal methods to determine their effectiveness in modeling and assuring swarm behavior.

  7. MCFC power plant system verification

    SciTech Connect

    Farooque, M.; Bernard, R.; Doyon, J.; Paetsch, L.; Patel, P.; Skok, A.; Yuh, C.

    1993-11-01

    In pursuit of commercialization, efforts are underway to: (1) advance the technology base by enhancing performance and demonstrating endurance, (2) scale up the stack to full area and height, (3) acquire stack manufacturing capability and experience, (4) establish capability and gain experience in power plant system testing of the full-height carbonate fuel cell stack, and (5) define the power plant design and develop critical subsystem components. All the major project objectives have already been attained. Over the last year, significant progress has been achieved in establishing the full-height stack design, gaining stack manufacturing and system-integrated testing experience, and verifying the major equipment design in power plant system tests. In this paper, recent progress on stack scaleup, demonstration testing, BOP verification, and stack endurance is presented.

  8. Retail applications of signature verification

    NASA Astrophysics Data System (ADS)

    Zimmerman, Thomas G.; Russell, Gregory F.; Heilper, Andre; Smith, Barton A.; Hu, Jianying; Markman, Dmitry; Graham, Jon E.; Drews, Clemens

    2004-08-01

    The dramatic rise in identity theft, the ever-pressing need to provide convenient checkout services to attract and retain loyal customers, and the growing use of multi-function signature capture devices in the retail sector provide favorable conditions for the deployment of dynamic signature verification (DSV) in retail settings. We report on the development of a DSV system to meet the needs of the retail sector. We currently have a database of approximately 10,000 signatures collected from 600 subjects and forgers. Previous work at IBM on DSV has been merged and extended to achieve robust performance on pen position data available from commercial point-of-sale hardware, achieving equal error rates on skilled forgeries and authentic signatures of 1.5% to 4%.

  9. Muscle glycogen and cell function--Location, location, location.

    PubMed

    Ørtenblad, N; Nielsen, J

    2015-12-01

    The importance of glycogen, as a fuel during exercise, is a fundamental concept in exercise physiology. The use of electron microscopy has revealed that glycogen is not evenly distributed in skeletal muscle fibers, but rather localized in distinct pools. In this review, we present the available evidence regarding the subcellular localization of glycogen in skeletal muscle and discuss this from the perspective of skeletal muscle fiber function. The distribution of glycogen in the defined pools within the skeletal muscle varies depending on exercise intensity, fiber phenotype, training status, and immobilization. Furthermore, these defined pools may serve specific functions in the cell. Specifically, reduced levels of these pools of glycogen are associated with reduced SR Ca(2+) release, muscle relaxation rate, and membrane excitability. Collectively, the available literature strongly demonstrates that the subcellular localization of glycogen has to be considered to fully understand the role of glycogen metabolism and signaling in skeletal muscle function. Here, we propose that the effect of low muscle glycogen on excitation-contraction coupling may serve as a built-in mechanism, which links the energetic state of the muscle fiber to energy utilization.

  10. In vivo proton range verification: a review

    NASA Astrophysics Data System (ADS)

    Knopf, Antje-Christin; Lomax, Antony

    2013-08-01

    Protons are an interesting modality for radiotherapy because of their well defined range and favourable depth dose characteristics. On the other hand, these same characteristics lead to added uncertainties in their delivery. This is particularly the case at the distal end of proton dose distributions, where the dose gradient can be extremely steep. In practice, however, this gradient is rarely used to spare critical normal tissues, due to worries about its exact position in the patient. Reasons for this uncertainty are inaccuracies and non-uniqueness of the calibration from CT Hounsfield units to proton stopping powers, imaging artefacts (e.g. due to metal implants) and anatomical changes of the patient during treatment. In order to improve the precision of proton therapy, therefore, it would be extremely desirable to verify proton range in vivo, either prior to, during, or after therapy. In this review, we describe and compare the state-of-the-art in vivo proton range verification methods currently being proposed, developed or clinically implemented.

  11. Shipper/receiver difference verification of spent fuel by use of PDET

    SciTech Connect

    Ham, Y. S.; Sitaraman, S.

    2011-07-01

    Spent fuel storage pools in most countries are rapidly approaching their design limits with the discharge of over 10,000 metric tons of heavy metal from global reactors. Countries like UK, France or Japan have adopted a closed fuel cycle by reprocessing spent fuel and recycling MOX fuel while many other countries opted for above ground interim dry storage for their spent fuel management strategy. Some countries like Finland and Sweden are already well on the way to setting up a conditioning plant and a deep geological repository for spent fuel. For all these situations, shipments of spent fuel are needed and the number of these shipments is expected to increase significantly. Although shipper/receiver difference (SRD) verification measurements are needed by IAEA when the recipient facility receives spent fuel, these are not being practiced to the level that IAEA has desired due to lack of a credible measurement methodology and instrument that can reliably perform these measurements to verify non-diversion of spent fuel during shipment and confirm facility operator declarations on the spent fuel. In this paper, we describe a new safeguards method and an associated instrument, Partial Defect Tester (PDET), which can detect pin diversion from Pressurized Water Reactor (PWR) Spent Fuel Assemblies in an in-situ condition. The PDET uses multiple tiny neutron and gamma detectors in the form of a cluster and a simple, yet highly precise, gravity-driven system to obtain underwater radiation measurements inside a Pressurized Water Reactor (PWR) spent fuel assembly. The method takes advantage of the PWR fuel design which contains multiple guide tubes which can be accessed from the top. The data obtained in such a manner can provide spatial distribution of neutron and gamma flux within a spent fuel assembly. Our simulation study as well as validation measurements indicated that the ratio of the gamma signal to the thermal neutron signal at each detector location normalized to

  12. Mine locations: Kazakhstan

    SciTech Connect

    Perry, Bradley A

    2008-01-01

    Upon accepting this internship at Los Alamos National Laboratory, I was excited but a bit nervous, because I was placed into a field I knew nothing about, one that did not draw on my mechanical engineering background. However, I stayed positive and realized that experience and education can come in many forms and that this would be a once-in-a-lifetime opportunity. The EES-II Division (which stands for Earth and Environmental Sciences, Geophysics division) concentrates on several topics, including Nuclear Treaty Verification Seismology. The study of this is extremely important in order to monitor countries that have nuclear capability and to make sure they follow the rules of the international comprehensive nuclear test ban treaty. Seismology is only one aspect of this monitoring, and EES-II works diligently with many other groups here at Los Alamos and across the world.

  13. Enhanced Cancelable Biometrics for Online Signature Verification

    NASA Astrophysics Data System (ADS)

    Muramatsu, Daigo; Inuma, Manabu; Shikata, Junji; Otsuka, Akira

    Cancelable approaches for biometric person authentication have been studied to protect enrolled biometric data, and several algorithms have been proposed. One drawback of cancelable approaches is that the performance is inferior to that of non-cancelable approaches. In this paper, we propose a scheme to improve the performance of a cancelable approach for online signature verification. Our scheme generates two cancelable dataset from one raw dataset and uses them for verification. Preliminary experiments were performed using a distance-based online signature verification algorithm. The experimental results show that our proposed scheme is promising.
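
    A common construction in this literature, sketched below, generates cancelable templates by projecting the raw feature vector with a key-seeded random matrix: reissuing the key revokes a compromised template, and two keys yield two unlinkable templates from one raw dataset, analogous to the two-dataset scheme above. This is a generic illustration, not the authors' algorithm; all parameters are invented.

```python
# Cancelable templates via a key-seeded random projection, a common
# construction in this literature (not the authors' exact scheme).
import numpy as np

def cancelable_template(features, key, out_dim=16):
    rng = np.random.default_rng(key)            # key-seeded projection matrix
    P = rng.standard_normal((out_dim, features.size))
    return P @ features

def verify(probe, enrolled, threshold=2.0):
    return np.linalg.norm(probe - enrolled) < threshold

raw = np.random.default_rng(7).standard_normal(64)  # stand-in signature features
t1 = cancelable_template(raw, key=1111)             # enrolled template
t2 = cancelable_template(raw, key=2222)             # reissued/second template

noisy = raw + 0.01 * np.random.default_rng(8).standard_normal(64)
probe = cancelable_template(noisy, key=1111)
print("genuine probe accepted:", verify(probe, t1))
print("two templates unlinkable:", not verify(t1, t2))
```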

  14. Liquefied Natural Gas (LNG) dispenser verification device

    NASA Astrophysics Data System (ADS)

    Xiong, Maotao; Yang, Jie-bin; Zhao, Pu-jun; Yu, Bo; Deng, Wan-quan

    2013-01-01

    The working principle and calibration status of LNG (Liquefied Natural Gas) dispensers in China are introduced. Because of the shortcomings of the weighing method for calibrating LNG dispensers, an LNG dispenser verification device has been developed. The device uses the master meter method to verify LNG dispensers in the field. Experimental results indicate that the device has stable performance, a high accuracy level, and a flexible construction, reaching an internationally advanced level. The verification device is expected to promote the development of the LNG dispenser industry in China and to improve the technical level of LNG dispenser manufacturing.

  15. Reversible micromachining locator

    DOEpatents

    Salzer, Leander J.; Foreman, Larry R.

    1999-01-01

    This invention provides a device which includes a locator, a kinematic mount positioned on a conventional tooling machine, a part carrier disposed on the locator and a retainer ring. The locator has disposed therein a plurality of steel balls, placed in an equidistant position circumferentially around the locator. The kinematic mount includes a plurality of magnets which are in registry with the steel balls on the locator. In operation, a blank part to be machined is placed between a surface of a locator and the retainer ring (fitting within the part carrier). When the locator (with a blank part to be machined) is coupled to the kinematic mount, the part is thus exposed for the desired machining process. Because the locator is removably attachable to the kinematic mount, it can easily be removed from the mount, reversed, and reinserted onto the mount for additional machining. Further, the locator can likewise be removed from the mount and placed onto another tooling machine having a properly aligned kinematic mount. Because of the unique design and use of magnetic forces of the present invention, positioning errors of less than 0.25 micrometer for each machining process can be achieved.

  16. Reversible micromachining locator

    DOEpatents

    Salzer, L.J.; Foreman, L.R.

    1999-08-31

    This invention provides a device which includes a locator, a kinematic mount positioned on a conventional tooling machine, a part carrier disposed on the locator and a retainer ring. The locator has disposed therein a plurality of steel balls, placed in an equidistant position circumferentially around the locator. The kinematic mount includes a plurality of magnets which are in registry with the steel balls on the locator. In operation, a blank part to be machined is placed between a surface of a locator and the retainer ring (fitting within the part carrier). When the locator (with a blank part to be machined) is coupled to the kinematic mount, the part is thus exposed for the desired machining process. Because the locator is removably attachable to the kinematic mount, it can easily be removed from the mount, reversed, and reinserted onto the mount for additional machining. Further, the locator can likewise be removed from the mount and placed onto another tooling machine having a properly aligned kinematic mount. Because of the unique design and use of magnetic forces of the present invention, positioning errors of less than 0.25 micrometer for each machining process can be achieved. 7 figs.

  17. Development of array-type prompt gamma measurement system for in vivo range verification in proton therapy

    SciTech Connect

    Min, Chul Hee; Lee, Han Rim; Kim, Chan Hyeong; Lee, Se Byeong

    2012-04-15

    Purpose: In vivo range verification is one of the most important parts of proton therapy to fully utilize its benefit of delivering a high radiation dose to the tumor while sparing normal tissue with the so-called Bragg peak. Currently, however, no range verification method is used in clinics. The purpose of the present study is to optimize and evaluate the configuration of an array-type prompt gamma measurement system for determining the distal dose edge for in vivo range verification in proton therapy. Methods: To effectively measure the prompt gammas against the background gammas, Monte Carlo simulations with the MCNPX code were employed to optimize the configuration of the measurement system, and the Monte Carlo method was also used to understand the effect of the background gammas, mainly neutron capture gammas, on the measured gamma distribution. To reduce the effect of the background gammas, an optimized energy window of 4-10 MeV was employed in measuring the prompt gammas. A parameterized source was used to maximize computation speed in the optimization study. A simplified test measurement system, using only one detector moving from one measurement location to the next, was constructed and applied to therapeutic proton beams of 80-220 MeV. For accurate determination of the distal dose edge, a sigmoidal curve-fitting method was applied to the measured prompt gamma distributions; the location of the half-value between the maximum and minimum of the fitted curve was then taken as the distal dose edge and compared with the beam range assessed from the proton dose distribution. Results: The parameterized source term employed in the optimization process improved the calculation speed by up to ~300 times. The optimization study indicates that an array-type measurement system with 3, 2, 2, and 150 mm for scintillator thickness, slit width, septal thickness, and slit length, respectively, can effectively measure the prompt gamma
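
    The half-value edge-finding step is easy to sketch. The fragment below (a minimal sketch; the study's exact fit function and starting values are not given in this record) fits a falling sigmoid to a measured prompt gamma profile and returns the depth of the half-value between the two plateaus:

        import numpy as np
        from scipy.optimize import curve_fit

        def sigmoid(z, top, bottom, z50, slope):
            # Falling sigmoid: 'top' proximal to the distal edge, 'bottom' beyond
            # it; z50 is by construction the depth of the half-value between them.
            return bottom + (top - bottom) / (1.0 + np.exp((z - z50) / slope))

        def distal_edge(z, counts):
            # Initial guess for z50: steepest falling point of the profile.
            z0 = z[np.argmin(np.gradient(counts))]
            p0 = [counts.max(), counts.min(), z0, 1.0]
            popt, _ = curve_fit(sigmoid, z, counts, p0=p0)
            return popt[2]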

  18. Object locating system

    DOEpatents

    Novak, James L.; Petterson, Ben

    1998-06-09

    A sensing system locates an object by sensing the object's effect on electric fields. The object's effect on the mutual capacitance of electrode pairs varies according to the distance between the object and the electrodes. A single electrode pair can sense the distance from the object to the electrodes. Multiple electrode pairs can more precisely locate the object in one or more dimensions.
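
    As an illustration only: assuming a simple inverse-distance model for the capacitance change (the record does not specify the sensor's actual response curve), each electrode pair yields a range estimate, and several pairs can be intersected by least squares:

        import numpy as np
        from scipy.optimize import least_squares

        def locate(pair_positions, delta_C, k=1.0):
            # Assumed model: capacitance change scales as k/d, so each pair
            # gives a distance estimate d_i = k / delta_C_i.
            d = k / np.asarray(delta_C)
            def residuals(p):
                return np.linalg.norm(pair_positions - p, axis=1) - d
            return least_squares(residuals, x0=pair_positions.mean(axis=0)).x

        pairs = np.array([[0.0, 0.0], [1.0, 0.0], [0.5, 1.0]])
        print(locate(pairs, delta_C=[2.0, 2.0, 1.1]))  # 2-D position estimate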

  19. Reversible micromachining locator

    DOEpatents

    Salzer, Leander J.; Foreman, Larry R.

    2002-01-01

    A locator with a part support is used to hold a part onto the kinematic mount of a tooling machine so that the part can be held in or replaced in exactly the same position relative to the cutting tool for machining different surfaces of the part or for performing different machining operations on the same or different surfaces of the part. The locator has disposed therein a plurality of steel balls placed at equidistant positions around the planar surface of the locator and the kinematic mount has a plurality of magnets which alternate with grooves which accommodate the portions of the steel balls projecting from the locator. The part support holds the part to be machined securely in place in the locator. The locator can be easily detached from the kinematic mount, turned over, and replaced onto the same kinematic mount or another kinematic mount on another tooling machine without removing the part to be machined from the locator so that there is no need to touch or reposition the part within the locator, thereby assuring exact replication of the position of the part in relation to the cutting tool on the tooling machine for each machining operation on the part.

  20. MAMA Software Features: Quantification Verification Documentation-1

    SciTech Connect

    Ruggiero, Christy E.; Porter, Reid B.

    2014-05-21

    This document reviews the verification of the basic shape quantification attributes in the MAMA software against hand calculations in order to show that the calculations are implemented mathematically correctly and give the expected quantification results.

  1. Engineering drawing field verification program. Revision 3

    SciTech Connect

    Ulk, P.F.

    1994-10-12

    Safe, efficient operation of waste tank farm facilities is dependent in part upon the availability of accurate, up-to-date plant drawings. Accurate plant drawings are also required in support of facility upgrades and future engineering remediation projects. This supporting document establishes the procedure for performing a visual field verification of engineering drawings, the degree of visual observation to be performed, and the documentation of the results. A copy of the drawing attesting to the degree of visual observation will be paginated into the released Engineering Change Notice (ECN) documenting the field verification, for future retrieval and reference. All waste tank farm essential and support drawings within the scope of this program will be converted from manual to computer-aided drafting (CAD) drawings. A permanent reference to the field verification status will be placed along the right border of the CAD-converted drawing, referencing the revision level at which the visual verification was performed and documented.

  2. ENVIRONMENTAL TECHNOLOGY VERIFICATION (ETV) PROGRAM: FUEL CELLS

    EPA Science Inventory

    The U.S. Environmental Protection Agency (EPA) Environmental Technology Verification (ETV) Program evaluates the performance of innovative air, water, pollution prevention and monitoring technologies that have the potential to improve human health and the environment. This techno...

  3. ENVIRONMENTAL TECHNOLOGY VERIFICATION (ETV) PROGRAM: STORMWATER TECHNOLOGIES

    EPA Science Inventory

    The U.S. Environmental Protection Agency (EPA) Environmental Technology Verification (ETV) program evaluates the performance of innovative air, water, pollution prevention and monitoring technologies that have the potential to improve human health and the environment. This techn...

  4. 9 CFR 417.8 - Agency verification.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... ANALYSIS AND CRITICAL CONTROL POINT (HACCP) SYSTEMS § 417.8 Agency verification. FSIS will verify the... plan or system; (f) Direct observation or measurement at a CCP; (g) Sample collection and analysis...

  5. 9 CFR 417.8 - Agency verification.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... ANALYSIS AND CRITICAL CONTROL POINT (HACCP) SYSTEMS § 417.8 Agency verification. FSIS will verify the... plan or system; (f) Direct observation or measurement at a CCP; (g) Sample collection and analysis...

  6. PERFORMANCE VERIFICATION OF WATER SECURITY - RELATED TECHNOLOGIES

    EPA Science Inventory

    The Environmental Technology Verification (ETV) Program's Advanced Monitoring Systems (AMS) Center has been charged by EPA to verify the performance of commercially available monitoring technologies for air, water, soil. Four categories of water security technologies (most of whi...

  7. Environmental Technology Verification Program (ETV) Policy Compendium

    EPA Science Inventory

    The Policy Compendium summarizes operational decisions made to date by participants in the U.S. Environmental Protection Agency's (EPA's) Environmental Technology Verification Program (ETV) to encourage consistency among the ETV centers. The policies contained herein evolved fro...

  8. U.S. Environmental Technology Verification Program

    EPA Science Inventory

    Overview of the U.S. Environmental Technology Verification Program (ETV), the ETV Greenhouse Gas Technology Center, and energy-related ETV projects. Presented at the Department of Energy's National Renewable Energy Laboratory in Boulder, Colorado on June 23, 2008.

  9. An integrated user-oriented laboratory for verification of digital flight control systems: Features and capabilities

    NASA Technical Reports Server (NTRS)

    Defeo, P.; Doane, D.; Saito, J.

    1982-01-01

    A Digital Flight Control Systems Verification Laboratory (DFCSVL) has been established at NASA Ames Research Center. This report describes the major elements of the laboratory, the research activities that can be supported in the area of verification and validation of digital flight control systems (DFCS), and the operating scenarios within which these activities can be carried out. The DFCSVL consists of a palletized dual-dual flight-control system linked to a dedicated PDP-11/60 processor. Major software support programs are hosted in a remotely located UNIVAC 1100 accessible from the PDP-11/60 through a modem link. Important features of the DFCSVL include extensive hardware and software fault insertion capabilities, a real-time closed loop environment to exercise the DFCS, an integrated set of software verification tools, and a user-oriented interface to all the resources and capabilities.

  10. A verification library for multibody simulation software

    NASA Technical Reports Server (NTRS)

    Kim, Sung-Soo; Haug, Edward J.; Frisch, Harold P.

    1989-01-01

    A multibody dynamics verification library that maintains and manages test and validation data is proposed, based on RRC robot arm and CASE backhoe validation and a comparative study of DADS, DISCOS, and CONTOPS, which are existing public domain and commercial multibody dynamic simulation programs. Using simple representative problems, simulation results from each program are cross-checked, and the validation results are presented. Functionalities of the verification library are defined in order to automate the validation procedure.

  11. The NPARC Alliance Verification and Validation Archive

    NASA Technical Reports Server (NTRS)

    Slater, John W.; Dudek, Julianne C.; Tatum, Kenneth E.

    2000-01-01

    The NPARC Alliance (National Project for Applications oriented Research in CFD) maintains a publicly-available, web-based verification and validation archive as part of the development and support of the WIND CFD code. The verification and validation methods used for the cases attempt to follow the policies and guidelines of the ASME and AIAA. The emphasis is on air-breathing propulsion flow fields with Mach numbers ranging from low-subsonic to hypersonic.

  12. 46 CFR 193.60-10 - Location.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 46 Shipping 7 2010-10-01 Location. 193.60-10 Section 193.60-10 Shipping COAST GUARD, DEPARTMENT OF HOMELAND SECURITY (CONTINUED) OCEANOGRAPHIC RESEARCH VESSELS FIRE PROTECTION EQUIPMENT Fire Axes § 193.60-10 Location. (a) Fire axes shall be distributed throughout the spaces...

  13. Verification of thermal analysis codes for modeling solid rocket nozzles

    NASA Technical Reports Server (NTRS)

    Keyhani, M.

    1993-01-01

    One of the objectives of the Solid Propulsion Integrity Program (SPIP) at Marshall Space Flight Center (MSFC) is the development of thermal analysis codes capable of accurately predicting the temperature field, pore pressure field, and surface recession experienced by decomposing polymers which are used as thermal barriers in solid rocket nozzles. The objective of this study is to provide a means for verification of thermal analysis codes developed for modeling flow and heat transfer in solid rocket nozzles. To meet the stated objective, a test facility was designed and constructed for measurement of the transient temperature field in a sample composite subjected to a constant heat flux boundary condition. The heating was provided via a thin steel foil with a thickness of 0.025 mm. The designed electrical circuit can provide a heating rate of 1800 W. The heater was sandwiched between two identical samples, thus ensuring equal power distribution between them. The samples were fitted with Type K thermocouples, and the exact locations of the thermocouples were determined via X-rays. The experiments were modeled via a one-dimensional code (UT1D) as a conduction and phase change heat transfer process. Since the pyrolysis gas flow was in the direction normal to the heat flow, the numerical model could not account for the convective cooling effect of the pyrolysis gas flow. Therefore, the predicted values in the decomposition zone are considered to be an upper estimate of the temperature. From analysis of the experimental and numerical results the following are concluded: (1) The virgin and char specific heat data for FM 5055 as reported by SoRI cannot be used to obtain any reasonable agreement between the measured temperatures and the predictions; however, use of the virgin and char specific heat data given in the Acurex report produced good agreement for most of the measured temperatures. (2) Constant heat flux heating process can produce a much higher
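
    A minimal sketch of the class of model UT1D represents, explicit 1D transient conduction with a constant-flux front face (material values are placeholders, not FM 5055 properties, and decomposition/phase change is omitted):

        import numpy as np

        k, rho, cp = 0.8, 1450.0, 1200.0        # W/m-K, kg/m^3, J/kg-K (assumed)
        q_flux = 5.0e4                          # W/m^2 at the heated face
        nx, dx = 101, 0.25e-3
        alpha = k / (rho * cp)
        dt = 0.4 * dx**2 / alpha                # stable explicit time step
        T = np.full(nx, 300.0)                  # initial temperature, K

        for _ in range(20000):
            Tn = T.copy()
            T[1:-1] = Tn[1:-1] + alpha * dt / dx**2 * (Tn[2:] - 2*Tn[1:-1] + Tn[:-2])
            T[0] = T[1] + q_flux * dx / k       # constant-flux (Neumann) boundary
            T[-1] = T[-2]                       # insulated back face
        print(f"front-face temperature after {20000*dt:.0f} s: {T[0]:.1f} K")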

  14. Cleanliness verification process at Martin Marietta Astronautics

    NASA Astrophysics Data System (ADS)

    King, Elizabeth A.; Giordano, Thomas J.

    1994-06-01

    The Montreal Protocol and the 1990 Clean Air Act Amendments mandate that CFC-113, other chlorofluorocarbons (CFCs) and 1,1,1-Trichloroethane (TCA) be banned from production after December 31, 1995. In response to increasing pressures, the Air Force has formulated policy that prohibits purchase of these solvents for Air Force use after April 1, 1994. In response to the Air Force policy, Martin Marietta Astronautics is in the process of eliminating all CFCs and TCA from use at the Engineering Propulsion Laboratory (EPL), located on Air Force property PJKS. Gross and precision cleaning operations are currently performed on spacecraft components at EPL. The final step of the operation is a rinse with a solvent, typically CFC-113. This solvent is then analyzed for nonvolatile residue (NVR), particle count and total filterable solids (TFS) to determine the cleanliness of the parts. The CFC-113 used in this process must be replaced in response to the above policies. Martin Marietta Astronautics, under contract to the Air Force, is currently evaluating and testing alternatives for a cleanliness verification solvent. Completion of testing is scheduled for May 1994. Evaluation of the alternative solvents follows a three-step approach. The first step is initial testing of solvents selected from literature searches and analysis. The second step is detailed testing of the top candidates from the initial test phase. The final step is implementation and validation of the chosen alternative(s). Testing will include contaminant removal, nonvolatile residue, material compatibility and propellant compatibility. Typical materials and contaminants will be tested with a wide range of solvents. Final results of the three steps will be presented as well as the implementation plan for solvent replacement.

  15. SU-E-J-138: On the Ion Beam Range and Dose Verification in Hadron Therapy Using Sound Waves

    SciTech Connect

    Fourkal, E; Veltchev, I; Gayou, O; Nahirnyak, V

    2015-06-15

    Purpose: Accurate range verification is of great importance to fully exploit the potential benefits of ion beam therapies. Current research efforts on this topic include the use of PET imaging of induced activity, detection of emerging prompt gamma rays or secondary particles. It has also been suggested recently to detect the ultrasound waves emitted through the ion energy absorption process. The energy absorbed in a medium is dissipated as heat, followed by thermal expansion that leads to generation of acoustic waves. By using an array of ultrasound transducers the precise spatial location of the Bragg peak can be obtained. The shape and intensity of the emitted ultrasound pulse depend on several variables including the absorbed energy and the pulse length. The main objective of this work is to understand how the ultrasound wave amplitude and shape depend on the initial ion energy and intensity. This would help guide future experiments in ionoacoustic imaging. Methods: The absorbed energy density for protons and carbon ions of different energy and field sizes were obtained using Fluka Monte Carlo code. Subsequently, the system of coupled equations for temperature and pressure is solved for different ion pulse intensities and lengths to obtain the pressure wave shape, amplitude and spectral distribution. Results: The proposed calculations show that the excited pressure wave amplitude is proportional to the absorbed energy density and for longer ion pulses inversely proportional to the ion pulse duration. It is also shown that the resulting ionoacoustic pressure distribution depends on both ion pulse duration and time between the pulses. Conclusion: The Bragg peak localization using ionoacoustic signal may eventually lead to the development of an alternative imaging method with sub-millimeter resolution. It may also open a way for in-vivo dose verification from the measured acoustic signal.

  16. Development of advanced seal verification

    NASA Technical Reports Server (NTRS)

    Workman, Gary L.; Kosten, Susan E.; Abushagur, Mustafa A.

    1992-01-01

    The purpose of this research is to develop a technique to monitor and ensure seal integrity with a sensor that has no active elements to burn out during a long-duration activity, such as a leakage test or especially during a mission in space. The original concept proposed is that by implementing fiber optic sensors, changes in the integrity of a seal can be monitored in real time, and at no time should the optical fiber sensor fail. The electrical components which provide optical excitation and detection through the fiber are not part of the seal; hence, if these electrical components fail, they can be easily changed without breaking the seal. The optical connections required for the concept to work do present a functional problem to work out. The utility of the optical fiber sensor for seal monitoring should be general enough that the degradation of a seal can be determined before catastrophic failure occurs and appropriate action taken. Two parallel efforts were performed in determining the feasibility of using optical fiber sensors for seal verification. One study investigated interferometric measurement of the mechanical response of the optical fiber sensors to seal integrity. In a second study, the optical fiber was implemented in a typical vacuum chamber, and feasibility studies on microbend experiments in the vacuum chamber were performed. Also, an attempt was made to quantify the amount of pressure actually being applied to the optical fiber using finite element analysis software by Algor.

  17. Learning Assumptions for Compositional Verification

    NASA Technical Reports Server (NTRS)

    Cobleigh, Jamieson M.; Giannakopoulou, Dimitra; Pasareanu, Corina; Clancy, Daniel (Technical Monitor)

    2002-01-01

    Compositional verification is a promising approach to addressing the state explosion problem associated with model checking. One compositional technique advocates proving properties of a system by checking properties of its components in an assume-guarantee style. However, the application of this technique is difficult because it involves non-trivial human input. This paper presents a novel framework for performing assume-guarantee reasoning in an incremental and fully automated fashion. To check a component against a property, our approach generates assumptions that the environment needs to satisfy for the property to hold. These assumptions are then discharged on the rest of the system. Assumptions are computed by a learning algorithm. They are initially approximate, but become gradually more precise by means of counterexamples obtained by model checking the component and its environment, alternately. This iterative process may at any stage conclude that the property is either true or false in the system. We have implemented our approach in the LTSA tool and applied it to the analysis of a NASA system.

  18. Biometric verification in dynamic writing

    NASA Astrophysics Data System (ADS)

    George, Susan E.

    2002-03-01

    Pen-tablet devices capable of capturing the dynamics of writing record temporal and pressure information as well as the spatial pattern. This paper explores biometric verification based upon the dynamics of writing, where writers are distinguished not on the basis of what they write (i.e., the signature), but how they write. We have collected samples of dynamic writing from 38 Chinese writers. Each writer was asked to provide 10 copies of a paragraph of text and the same number of signature samples. From the data we have extracted stroke-based primitives from the sentence data utilizing pen-up/down information and heuristic rules about the shape of the character. The x, y and pressure values of each primitive were interpolated onto an even temporal range based upon a 20 msec sampling rate. We applied the Daubechies 1 wavelet transform to the x signal, y signal and pressure signal, using the coefficients as inputs to a multi-layer perceptron trained with back-propagation on the sentence data. We found a sensitivity of 0.977 and specificity of 0.990 when recognizing writers based on test primitives extracted from sentence data, and measures of 0.916 and 0.961, respectively, from test primitives extracted from signature data.
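
    A compact sketch of this feature pipeline, assuming a single-level Daubechies-1 transform (the record does not state the decomposition depth), using PyWavelets and scikit-learn:

        import numpy as np
        import pywt
        from sklearn.neural_network import MLPClassifier

        def primitive_features(x, y, p, n=64):
            # Resample one stroke primitive onto an even time base, then use the
            # db1 wavelet coefficients of x, y and pressure as the feature vector.
            t = np.linspace(0.0, 1.0, len(x))
            ti = np.linspace(0.0, 1.0, n)
            feats = []
            for sig in (x, y, p):
                approx, detail = pywt.dwt(np.interp(ti, t, sig), 'db1')
                feats.extend(approx)
                feats.extend(detail)
            return np.array(feats)

        # X: one feature row per primitive; labels: writer identity per primitive.
        clf = MLPClassifier(hidden_layer_sizes=(64,), max_iter=2000)
        # clf.fit(X_train, labels_train); clf.score(X_test, labels_test)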

  19. Sensors Locate Radio Interference

    NASA Technical Reports Server (NTRS)

    2009-01-01

    After receiving a NASA Small Business Innovation Research (SBIR) contract from Kennedy Space Center, Soneticom Inc., based in West Melbourne, Florida, created algorithms for time difference of arrival and radio interferometry, which it used in its Lynx Location System (LLS) to locate electromagnetic interference that can disrupt radio communications. Soneticom is collaborating with the Federal Aviation Administration (FAA) to install and test the LLS at its field test center in New Jersey in preparation for deploying the LLS at commercial airports. The software collects data from each sensor in order to compute the location of the interfering emitter.
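
    A minimal sketch of time-difference-of-arrival emitter location (generic least-squares hyperbolic positioning, not Soneticom's proprietary algorithm):

        import numpy as np
        from scipy.optimize import least_squares

        C = 299792458.0  # propagation speed of the radio signal, m/s

        def tdoa_locate(sensors, tdoas, x0):
            # tdoas[i]: arrival-time difference of sensor i+1 relative to sensor 0.
            def residuals(p):
                r = np.linalg.norm(sensors - p, axis=1)
                return (r[1:] - r[0]) - C * np.asarray(tdoas)
            return least_squares(residuals, x0).x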

  20. National Verification System of National Meteorological Center , China

    NASA Astrophysics Data System (ADS)

    Zhang, Jinyan; Wei, Qing; Qi, Dan

    2016-04-01

    Product Quality Verification Division for official weather forecasting of China was founded in April 2011. It is affiliated with the Forecast System Laboratory (FSL), National Meteorological Center (NMC), China. There are three employees in this department; I am one of them, and I am in charge of the Product Quality Verification Division in NMC, China. After five years of construction, an integrated real-time National Verification System of NMC, China has been established. At present, its primary roles include: 1) to verify official weather forecasting quality of NMC, China; 2) to verify the official city weather forecasting quality of the Provincial Meteorological Bureaus; 3) to evaluate forecasting quality for each forecaster in NMC, China. To verify official weather forecasting quality of NMC, China, we have developed:
    • Grid QPF verification module (including upscaling)
    • Grid temperature, humidity and wind forecast verification module
    • Severe convective weather forecast verification module
    • Typhoon forecast verification module
    • Disaster forecast verification module
    • Disaster warning verification module
    • Medium and extended period forecast verification module
    • Objective elements forecast verification module
    • Ensemble precipitation probabilistic forecast verification module
    To verify the official city weather forecasting quality of the Provincial Meteorological Bureaus, we have developed:
    • City elements forecast verification module
    • Public heavy rain forecast verification module
    • City air quality forecast verification module
    To evaluate forecasting quality for each forecaster in NMC, China, we have developed:
    • Off-duty forecaster QPF practice evaluation module
    • QPF evaluation module for forecasters
    • Severe convective weather forecast evaluation module
    • Typhoon track forecast evaluation module for forecasters
    • Disaster warning evaluation module for forecasters
    • Medium and extended period forecast evaluation module
    The further

  1. Object locating system

    DOEpatents

    Novak, J.L.; Petterson, B.

    1998-06-09

    A sensing system locates an object by sensing the object's effect on electric fields. The object's effect on the mutual capacitance of electrode pairs varies according to the distance between the object and the electrodes. A single electrode pair can sense the distance from the object to the electrodes. Multiple electrode pairs can more precisely locate the object in one or more dimensions. 12 figs.

  2. Lunar Impact Flash Locations

    NASA Technical Reports Server (NTRS)

    Moser, D. E.; Suggs, R. M.; Kupferschmidt, L.; Feldman, J.

    2015-01-01

    A bright impact flash detected by the NASA Lunar Impact Monitoring Program in March 2013 brought into focus the importance of determining the impact flash location. A process for locating the impact flash, and presumably its associated crater, was developed using commercially available software tools. The process was successfully applied to the March 2013 impact flash and put into production on an additional 300 impact flashes. The goal today: provide a description of the geolocation technique developed.

  3. RFI and SCRIMP Model Development and Verification

    NASA Technical Reports Server (NTRS)

    Loos, Alfred C.; Sayre, Jay

    2000-01-01

    Vacuum-Assisted Resin Transfer Molding (VARTM) processes are becoming promising technologies in the manufacturing of primary composite structures in the aircraft industry as well as infrastructure. A great deal of work still needs to be done to reduce the costly trial-and-error methods of VARTM processing that are currently in practice. A computer simulation model of the VARTM process would provide a cost-effective tool in the manufacturing of composites utilizing this technique. Therefore, the objective of this research was to modify an existing three-dimensional Resin Film Infusion (RFI)/Resin Transfer Molding (RTM) model to include VARTM simulation capabilities and to verify this model with the fabrication of aircraft structural composites. An additional objective was to use the VARTM model as a process analysis tool, enabling the user to configure the best process for manufacturing quality composites. Experimental verification of the model was performed by processing several flat composite panels. The parameters verified included flow front patterns and infiltration times. The flow front patterns were determined to be qualitatively accurate, while the simulated infiltration times overpredicted experimental times by 8 to 10%. Capillary and gravitational forces were incorporated into the existing RFI/RTM model in order to simulate VARTM processing physics more accurately. The theoretical capillary pressure showed the capability to reduce the simulated infiltration times by as much as 6%. Gravity, on the other hand, was found to be negligible for all cases. Finally, the VARTM model was used as a process analysis tool. This enabled the user to determine such important process constraints as the location and type of injection ports and the permeability and location of the high-permeability media. A process for a three-stiffener composite panel was proposed. This configuration evolved from the variation of the process

  4. Model based correction of placement error in EBL and its verification

    NASA Astrophysics Data System (ADS)

    Babin, Sergey; Borisov, Sergey; Militsin, Vladimir; Komagata, Tadashi; Wakatsuki, Tetsuro

    2016-05-01

    In maskmaking, the main source of placement error is charging. DISPLACE software corrects the placement error for any layout based on a physical model. The charge of a photomask and multiple discharge mechanisms are simulated to find the charge distribution over the mask. The beam deflection is calculated for each location on the mask, creating data for the placement correction. The software considers the mask layout, EBL system setup, resist, and writing order, as well as other factors such as fogging and proximity effect correction. The output of the software is the data for placement correction. One important step is the calibration of the physical model; a test layout on a single calibration mask was used for this purpose. The extracted model parameters were used to verify the correction. As an ultimate test of the correction, a sophisticated layout, very different from the calibration mask, was used for verification. The placement correction results were predicted by DISPLACE, and a good correlation of the measured and predicted values of the correction confirmed the high accuracy of the charging placement error correction.

  5. Experimental validation of a commercial 3D dose verification system for intensity-modulated arc therapies

    NASA Astrophysics Data System (ADS)

    Boggula, Ramesh; Lorenz, Friedlieb; Mueller, Lutz; Birkner, Mattias; Wertz, Hansjoerg; Stieler, Florian; Steil, Volker; Lohr, Frank; Wenz, Frederik

    2010-10-01

    We validate the dosimetric performance of COMPASS®, a novel 3D quality assurance system for verification of volumetric-modulated arc therapy (VMAT) treatment plans that can correlate the delivered dose to the patient's anatomy, taking into account the tissue inhomogeneity. The accuracy of treatment delivery was assessed by the COMPASS® for 12 VMAT plans, and the resulting assessments were evaluated using an ionization chamber and film measurements. Dose-volume relationships were evaluated by the COMPASS® for three additional treatment plans and these were used to verify the accuracy of treatment planning dose calculations. The results matched well between COMPASS® and measurements for the ionization chamber (<=3%) and film (73-99% for gamma(3%/3 mm) < 1 and 98-100% for gamma(5%/5 mm) < 1) for the phantom plans. Differences in dose-volume statistics for the average dose to the PTV were within 2.5% for three treatment plans. For the structures located in the low-dose region, a maximum difference of <9% was observed. In its current implementation, the system could measure the delivered dose with sufficient accuracy and could project the 3D dose distribution directly on the patient's anatomy. Slight deviations were found for large open fields. These could be minimized by improving the COMPASS® in-built beam model.
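
    For reference, a brute-force 1D form of the gamma analysis used in such comparisons (a generic sketch, not the COMPASS® implementation; the 2D/3D case extends the same search over more axes):

        import numpy as np

        def gamma_1d(x_ref, d_ref, x_eval, d_eval, dta_mm=3.0, dd_pct=3.0):
            # Dose-difference criterion taken relative to the reference maximum.
            dd = dd_pct / 100.0 * d_ref.max()
            gamma = np.empty_like(d_ref)
            for i, (xr, dr) in enumerate(zip(x_ref, d_ref)):
                term = ((x_eval - xr) / dta_mm)**2 + ((d_eval - dr) / dd)**2
                gamma[i] = np.sqrt(term.min())
            return gamma  # pass rate: np.mean(gamma <= 1.0)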

  6. Verification and Validation of Kinetic Codes

    NASA Astrophysics Data System (ADS)

    Christlieb, Andrew

    2014-10-01

    We review the last three workshops held on Validation and Verification of Kinetic Codes. The goal of the workshops was to highlight the need to develop benchmark test problems beyond traditional ones such as Landau damping and the two-stream instability. Those traditional problems provide only a limited understanding of how a code might perform and mask key issues in more complicated situations. Developing new test problems highlights the strengths and weaknesses of both mesh- and particle-based codes. One outcome is that designing test problems that clearly deliver a path forward for developing improved methods is complicated by the need to create a completely self-consistent model. For example, two test cases proposed by the authors as simple test cases turned out to be ill defined. The first case is the modeling of sheath formation in a 1D 1V collisionless plasma. We found that losses to the wall lead to discontinuous distribution functions, a challenge for high-order mesh-based solvers. The semi-infinite case was problematic because the far-field boundary condition poses difficulty in computing on a finite domain. Our second case was flow of a collisionless electron beam in a pipe. Here, numerical diffusion is a key problem being tested; however, two-stream instability at the beam edges introduces other issues in terms of finding convergent solutions. Before particle trapping takes place, mesh-based methods find themselves outside of the asymptotic regime. Another conclusion we draw from this exercise is that including collisional models in benchmark test problems for mesh-based plasma simulation tools is an important step in providing robust test problems for mesh-based kinetic solvers. In collaboration with Yaman Guclu, David Seal, and John Verboncoeur, Michigan State University.

  7. A Probabilistic Mass Estimation Algorithm for a Novel 7- Channel Capacitive Sample Verification Sensor

    NASA Technical Reports Server (NTRS)

    Wolf, Michael

    2012-01-01

    A document describes an algorithm created to estimate the mass placed on a sample verification sensor (SVS) designed for lunar or planetary robotic sample return missions. A novel SVS measures the capacitance between a rigid bottom plate and an elastic top membrane in seven locations. As additional sample material (soil and/or small rocks) is placed on the top membrane, the deformation of the membrane increases the capacitance. The mass estimation algorithm addresses both the calibration of each SVS channel, and also addresses how to combine the capacitances read from each of the seven channels into a single mass estimate. The probabilistic approach combines the channels according to the variance observed during the training phase, and provides not only the mass estimate, but also a value for the certainty of the estimate. SVS capacitance data is collected for known masses under a wide variety of possible loading scenarios, though in all cases, the distribution of sample within the canister is expected to be approximately uniform. A capacitance-vs-mass curve is fitted to this data, and is subsequently used to determine the mass estimate for the single channel's capacitance reading during the measurement phase. This results in seven different mass estimates, one for each SVS channel. Moreover, the variance of the calibration data is used to place a Gaussian probability distribution function (pdf) around this mass estimate. To blend these seven estimates, the seven pdfs are combined into a single Gaussian distribution function, providing the final mean and variance of the estimate. This blending technique essentially takes the final estimate as an average of the estimates of the seven channels, weighted by the inverse of the channel's variance.
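
    The blending step described here is standard inverse-variance weighting of Gaussian estimates; a minimal sketch with hypothetical numbers:

        import numpy as np

        def fuse_channels(means, variances):
            # Weight each channel's mass estimate by the inverse of its
            # calibration variance; the fused variance quantifies certainty.
            mu = np.asarray(means)
            w = 1.0 / np.asarray(variances)
            return np.sum(w * mu) / np.sum(w), 1.0 / np.sum(w)

        mass, var = fuse_channels(means=[10.2, 9.8, 10.5, 10.1, 9.9, 10.3, 10.0],
                                  variances=[0.4, 0.3, 0.9, 0.5, 0.2, 0.7, 0.4])
        print(f"mass = {mass:.2f} +/- {np.sqrt(var):.2f}")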

  8. Verification and Uncertainty Reduction of Amchitka Underground Nuclear Testing Models

    SciTech Connect

    Ahmed Hassan; Jenny Chapman

    2006-02-01

    The modeling of the Amchitka underground nuclear tests, conducted in 2002, has been verified, and uncertainty in model input parameters, as well as predictions, has been reduced using newly collected data obtained by the summer 2004 field expedition of CRESP. Newly collected data that pertain to the groundwater model include magnetotelluric (MT) surveys conducted on the island to determine the subsurface salinity and porosity structure, and bathymetric surveys to determine bathymetric maps of the areas offshore from the Long Shot and Cannikin sites. Analysis and interpretation of the MT data yielded information on the location of the transition zone, and porosity profiles showing porosity values decaying with depth. These new data sets are used to verify the original model in terms of model parameters, model structure, and model output. In addition, by using the new data along with the existing data (chemistry and head data), the uncertainty in model input and output is decreased by conditioning on all the available data. A Markov Chain Monte Carlo (MCMC) approach is adapted for developing new input parameter distributions conditioned on prior knowledge and new data. The MCMC approach is a form of Bayesian conditioning that is constructed in such a way that it produces samples of the model parameters that eventually converge to a stationary posterior distribution. The Bayesian MCMC approach enhances probabilistic assessment. Instead of simply propagating uncertainty forward from input parameters into model predictions (i.e., the traditional Monte Carlo approach), MCMC propagates uncertainty backward from data onto parameters, and then forward from parameters into predictions. Comparisons between the new data and the original model, and conditioning on all available data using the MCMC method, yield the following results and conclusions: (1) Model structure is verified at Long Shot and Cannikin where the high-resolution bathymetric data collected by CRESP
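
    A minimal random-walk Metropolis sketch of the MCMC idea (generic, not the study's actual parameter model):

        import numpy as np

        def metropolis(log_posterior, theta0, n_samples, step=0.1, seed=0):
            # Draws converge to the posterior of the parameter given the data.
            rng = np.random.default_rng(seed)
            theta, logp = theta0, log_posterior(theta0)
            samples = []
            for _ in range(n_samples):
                prop = theta + step * rng.standard_normal()
                logp_prop = log_posterior(prop)
                if np.log(rng.random()) < logp_prop - logp:  # accept/reject
                    theta, logp = prop, logp_prop
                samples.append(theta)
            return np.array(samples)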

  9. Monitoring and verification R&D

    SciTech Connect

    Pilat, Joseph F; Budlong - Sylvester, Kory W; Fearey, Bryan L

    2011-01-01

    The 2010 Nuclear Posture Review (NPR) report outlined the Administration's approach to promoting the agenda put forward by President Obama in Prague on April 5, 2009. The NPR calls for a national monitoring and verification R&D program to meet future challenges arising from the Administration's nonproliferation, arms control and disarmament agenda. Verification of a follow-on to New START could have to address warheads and possibly components along with delivery capabilities. Deeper cuts and disarmament would need to address all of these elements along with nuclear weapon testing, nuclear material and weapon production facilities, virtual capabilities from old weapon and existing energy programs and undeclared capabilities. We only know how to address some elements of these challenges today, and the requirements may be more rigorous in the context of deeper cuts as well as disarmament. Moreover, there is a critical need for multiple options to sensitive problems and to address other challenges. There will be other verification challenges in a world of deeper cuts and disarmament, some of which we are already facing. At some point, if the reductions process is progressing, uncertainties about past nuclear materials and weapons production will have to be addressed. IAEA safeguards will need to continue to evolve to meet current and future challenges, and to take advantage of new technologies and approaches. Transparency/verification of nuclear and dual-use exports will also have to be addressed, and there will be a need to make nonproliferation measures more watertight and transparent. In this context, and recognizing we will face all of these challenges even if disarmament is not achieved, this paper will explore possible agreements and arrangements; verification challenges; gaps in monitoring and verification technologies and approaches; and the R&D required to address these gaps and other monitoring and verification challenges.

  10. Verification of Functional Fault Models and the Use of Resource Efficient Verification Tools

    NASA Technical Reports Server (NTRS)

    Bis, Rachael; Maul, William A.

    2015-01-01

    Functional fault models (FFMs) are a directed graph representation of the failure effect propagation paths within a system's physical architecture and are used to support development and real-time diagnostics of complex systems. Verification of these models is required to confirm that the FFMs are correctly built and accurately represent the underlying physical system. However, a manual, comprehensive verification process applied to the FFMs was found to be error prone due to the intensive and customized process necessary to verify each individual component model and to require a burdensome level of resources. To address this problem, automated verification tools have been developed and utilized to mitigate these key pitfalls. This paper discusses the verification of the FFMs and presents the tools that were developed to make the verification process more efficient and effective.

  11. Numerical Weather Predictions Evaluation Using Spatial Verification Methods

    NASA Astrophysics Data System (ADS)

    Tegoulias, I.; Pytharoulis, I.; Kotsopoulos, S.; Kartsios, S.; Bampzelis, D.; Karacostas, T.

    2014-12-01

    During recent years, high-resolution numerical weather prediction simulations have been used to examine meteorological events with increased convective activity. Traditional verification methods do not provide the desired level of information to evaluate those high-resolution simulations. To address those limitations, new spatial verification methods have been proposed. In the present study an attempt is made to estimate the ability of the WRF model (WRF-ARW ver3.5.1) to reproduce selected days with high convective activity during the year 2010 using those feature-based verification methods. Three model domains, covering Europe, the Mediterranean Sea and northern Africa (d01), the wider area of Greece (d02) and central Greece - Thessaly region (d03), are used at horizontal grid-spacings of 15 km, 5 km and 1 km respectively. By alternating microphysics (Ferrier, WSM6, Goddard), boundary layer (YSU, MYJ) and cumulus convection (Kain-Fritsch, BMJ) schemes, a set of twelve model setups is obtained. The results of those simulations are evaluated against data obtained using a C-Band (5 cm) radar located at the centre of the innermost domain. Spatial characteristics are well captured but with a variable time lag between simulation results and radar data. Acknowledgements: This research is co-financed by the European Union (European Regional Development Fund) and Greek national funds, through the action "COOPERATION 2011: Partnerships of Production and Research Institutions in Focused Research and Technology Sectors" (contract number 11SYN_8_1088 - DAPHNE) in the framework of the operational programme "Competitiveness and Entrepreneurship" and Regions in Transition (OPC II, NSRF 2007-2013).

  12. Verification of Internal Dose Calculations.

    NASA Astrophysics Data System (ADS)

    Aissi, Abdelmadjid

    The MIRD internal dose calculations have been in use for more than 15 years, but their accuracy has always been questionable. There have been attempts to verify these calculations; however, these attempts had various shortcomings which left the question of verification of the MIRD data unanswered. The purpose of this research was to develop techniques and methods to verify the MIRD calculations in a more systematic and scientific manner. The research consisted of improving a volumetric dosimeter, developing molding techniques, and adapting the Monte Carlo computer code ALGAM to the experimental conditions and vice versa. The organic dosimetric system contained TLD-100 powder and could be shaped to represent human organs. The dosimeter possessed excellent characteristics for the measurement of internal absorbed doses, even in the case of the lungs. The molding techniques are inexpensive and were used in the fabrication of dosimetric and radioactive source organs. The adaptation of the computer program provided useful theoretical data with which the experimental measurements were compared. The experimental data and the theoretical calculations were compared for 6 source organ-7 target organ configurations. The results of the comparison indicated agreement between measured and calculated absorbed doses, when taking into consideration the average uncertainty (16%) of the measurements and the average coefficient of variation (10%) of the Monte Carlo calculations. However, analysis of the data also gave an indication that the Monte Carlo method might overestimate the internal absorbed doses. Even if the overestimate exists, it can at least be said that use of the MIRD method in internal dosimetry leads to no unnecessary radiation exposure caused by underestimating the absorbed dose. The experimental and theoretical data were also used to test the validity of the Reciprocity Theorem for heterogeneous

  13. Ozone Monitoring Instrument geolocation verification

    NASA Astrophysics Data System (ADS)

    Kroon, M.; Dobber, M. R.; Dirksen, R.; Veefkind, J. P.; van den Oord, G. H. J.; Levelt, P. F.

    2008-08-01

    Verification of the geolocation assigned to individual ground pixels as measured by the Ozone Monitoring Instrument (OMI) aboard the NASA EOS-Aura satellite was performed by comparing geophysical Earth surface details as observed in OMI false color images with the high-resolution continental outline vector map as provided by the Interactive Data Language (IDL) software tool from ITT Visual Information Solutions. The OMI false color images are generated from the OMI visible channel by integration over 20-nm-wide spectral bands of the Earth radiance intensity around 484 nm, 420 nm, and 360 nm wavelength per ground pixel. Proportional to the integrated intensity, we assign color values composed of CRT standard red, green, and blue to the OMI ground pixels. Earth surface details studied are mostly high-contrast coast lines where arid land or desert meets deep blue ocean. The IDL high-resolution vector map is based on the 1993 CIA World Database II Map with a 1-km accuracy. Our results indicate that the average OMI geolocation offset over the years 2005-2006 is 0.79 km in latitude and 0.29 km in longitude, with a standard deviation of 1.64 km in latitude and 2.04 km in longitude, respectively. Relative to the OMI nadir pixel size, one obtains mean displacements of ˜6.1% in latitude and ˜1.2% in longitude, with standard deviations of 12.6% and 7.9%, respectively. We conclude that the geolocation assigned to individual OMI ground pixels is sufficiently accurate to support scientific studies of atmospheric features as observed in OMI level 2 satellite data products, such as air quality issues on urban scales or volcanic eruptions and its plumes, that occur on spatial scales comparable to or smaller than OMI nadir pixels.

  14. Alternate calibration method of radiochromic EBT3 film for quality assurance verification of clinical radiotherapy treatments

    NASA Astrophysics Data System (ADS)

    Park, Soah; Kang, Sei-Kwon; Cheong, Kwang-Ho; Hwang, Taejin; Yoon, Jai-Woong; Koo, Taeryool; Han, Tae Jin; Kim, Haeyoung; Lee, Me Yeon; Bae, Hoonsik; Kim, Kyoung Ju

    2016-07-01

    EBT3 film is utilized as a dosimetry quality assurance tool for the verification of clinical radiotherapy treatments. In this work, we suggest a percentage-depth-dose (PDD) calibration method that can calibrate several EBT3 film pieces together at different dose levels, because photon beams deliver different dose levels at different depths along the beam axis. We investigated the feasibility of the film PDD calibration method based on PDD data and compared the results with those from the traditional film calibration method. Photon beams at 6 MV were delivered to EBT3 film pieces for both calibration methods. For the PDD-based calibration, the film pieces were placed on solid phantoms at the depth of maximum dose (dmax) and at depths of 3, 5, 8, 12, 17, and 22 cm, and a photon beam was delivered twice, at 100 cGy and 400 cGy, to extend the calibration dose range under the same conditions. Fourteen film pieces, to maintain their consistency, were irradiated at doses ranging from approximately 30 to 400 cGy for both film calibrations. The film pieces were located at the center of the scan bed of an Epson 1680 flatbed scanner in the parallel direction. Intensity-modulated radiation therapy (IMRT) plans were created, and their dose distributions were delivered to the film. The dose distributions for the traditional method and those for the PDD-based calibration method were evaluated using a gamma analysis. The PDD dose values measured at the depths of interest using a CC13 ion chamber and an FC65-G Farmer chamber were very similar. With the objective test criterion of 1% dose agreement at 1 mm, the passing rates for the four cases of the three IMRT plans were essentially identical. The traditional and the PDD-based calibrations provided similar plan verification results. We also describe another alternative for calibrating EBT3 films, i.e., a PDD-based calibration method that provides an easy and time-saving approach

  15. Monitoring/Verification using DMS: TATP Example

    SciTech Connect

    Stephan Weeks; Kevin Kyle

    2008-03-01

    Field-rugged and field-programmable differential mobility spectrometry (DMS) networks provide highly selective, universal monitoring of vapors and aerosols at detectable levels from persons or areas involved with illicit chemical/biological/explosives (CBE) production. CBE sensor motes used in conjunction with automated fast gas chromatography with DMS detection (GC/DMS) verification instrumentation integrated into situational operations management systems can be readily deployed and optimized for changing application scenarios. The feasibility of developing selective DMS motes for a 'smart dust' sampling approach with guided, highly selective, fast GC/DMS verification analysis is a compelling approach to minimize or prevent the use of explosives or chemical and biological weapons in terrorist activities. Two peroxide-based liquid explosives, triacetone triperoxide (TATP) and hexamethylene triperoxide diamine (HMTD), are synthesized from common chemicals such as hydrogen peroxide, acetone, sulfuric acid, ammonia, and citric acid (Figure 1). Recipes can be readily found on the Internet by anyone seeking to generate sufficient quantities of these highly explosive chemicals to cause considerable collateral damage. Detection of TATP and HMTD by advanced sensing systems can provide the early warning necessary to prevent terror plots from coming to fruition. DMS is currently one of the foremost emerging technologies for the separation and detection of gas-phase chemical species. This is due to trace-level detection limits, high selectivity, and small size. DMS separates and identifies ions at ambient pressures by utilizing the non-linear dependence of an ion's mobility on the radio frequency (rf) electric field strength. GC is widely considered to be one of the leading analytical methods for the separation of chemical species in complex mixtures. Advances in the technique have led to the development of low-thermal-mass fast GC columns. These columns are capable of

  16. Geostar - Navigation location system

    NASA Astrophysics Data System (ADS)

    Keyser, Donald A.

    The author describes the Radiodetermination Satellite Service (RDSS). The initial phase of the RDSS provides for a unique service enabling central offices and headquarters to obtain position-location information and receive short digital messages from mobile user terminals throughout the contiguous United States, southern Canada, and northern Mexico. The system employs a spread-spectrum, CDMA modulation technique allowing multiple customers to use the system simultaneously, without preassigned coordination with fellow users. Position location is currently determined by employing an existing radio determination receiver, such as Loran-C, GPS, or Transit, in the mobile user terminal. In the early 1990s position location will be determined at a central earth station by time-differential ranging of the user terminals via two or more geostationary satellites. A brief overview of the RDSS system architecture is presented with emphasis on the user terminal and its diverse applications.

  17. Hybrid Deep Learning for Face Verification.

    PubMed

    Sun, Yi; Wang, Xiaogang; Tang, Xiaoou

    2016-10-01

    This paper proposes a hybrid convolutional network (ConvNet)-Restricted Boltzmann Machine (RBM) model for face verification. A key contribution of this work is to learn high-level relational visual features with rich identity similarity information. The deep ConvNets in our model start by extracting local relational visual features from two face images in comparison, which are further processed through multiple layers to extract high-level and global relational features. To keep enough discriminative information, we use the last hidden layer neuron activations of the ConvNet as features for face verification instead of those of the output layer. To characterize face similarities from different aspects, we concatenate the features extracted from different face region pairs by different deep ConvNets. The resulting high-dimensional relational features are classified by an RBM for face verification. After pre-training each ConvNet and the RBM separately, the entire hybrid network is jointly optimized to further improve the accuracy. Various aspects of the ConvNet structures, relational features, and face verification classifiers are investigated. Our model achieves the state-of-the-art face verification performance on the challenging LFW dataset under both the unrestricted protocol and the setting when outside data is allowed to be used for training. PMID:26660699

  18. Verification against perturbed analyses and observations

    NASA Astrophysics Data System (ADS)

    Bowler, N. E.; Cullen, M. J. P.; Piccolo, C.

    2015-07-01

    It has long been known that verification of a forecast against the sequence of analyses used to produce those forecasts can under-estimate the magnitude of forecast errors. Here we show that under certain conditions the verification of a short-range forecast against a perturbed analysis coming from an ensemble data assimilation scheme can give the same root-mean-square error as verification against the truth. This means that a perturbed analysis can be used as a reliable proxy for the truth. However, the conditions required for this result to hold are rather restrictive: the analysis must be optimal, the ensemble spread must be equal to the error in the mean, the ensemble size must be large and the forecast being verified must be the background forecast used in the data assimilation. Although these criteria are unlikely to be met exactly, it becomes clear that for most cases verification against a perturbed analysis gives better results than verification against an unperturbed analysis. We demonstrate the application of these results in an idealised model framework and a numerical weather prediction context. In deriving this result we recall that an optimal (Kalman) analysis is one for which the analysis increments are uncorrelated with the analysis errors.
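
    The stated result is easy to reproduce in a toy Monte Carlo experiment that satisfies the listed conditions (optimal analysis, perturbation spread equal to the analysis error, forecast equal to the assimilation background); the numbers below are arbitrary:

        import numpy as np

        rng = np.random.default_rng(1)
        n, sig_f, sig_o = 10**6, 1.0, 0.8
        truth = np.zeros(n)
        f = truth + sig_f * rng.standard_normal(n)   # background forecast
        y = truth + sig_o * rng.standard_normal(n)   # observations
        K = sig_f**2 / (sig_f**2 + sig_o**2)         # optimal (Kalman) gain
        a = f + K * (y - f)                          # optimal analysis
        sig_a = np.sqrt(1.0 - K) * sig_f             # analysis-error std
        p = a + sig_a * rng.standard_normal(n)       # perturbed analysis

        rmse = lambda e: np.sqrt(np.mean(e**2))
        # RMSE against the analysis is too small; against the perturbed
        # analysis it recovers the RMSE against truth.
        print(rmse(f - truth), rmse(f - a), rmse(f - p))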

  19. Complementary technologies for verification of excess plutonium

    SciTech Connect

    Langner, , D.G.; Nicholas, N.J.; Ensslin, N.; Fearey, B.L.; Mitchell, D.J.; Marlow, K.W.; Luke, S.J.; Gosnell, T.B.

    1998-12-31

    Three complementary measurement technologies have been identified as candidates for use in the verification of excess plutonium of weapons origin. These technologies (high-resolution gamma-ray spectroscopy, neutron multiplicity counting, and low-resolution gamma-ray spectroscopy) are mature, robust technologies. The high-resolution gamma-ray system, Pu-600, uses the 630--670 keV region of the emitted gamma-ray spectrum to determine the ratio of {sup 240}Pu to {sup 239}Pu. It is useful in verifying the presence of plutonium and the presence of weapons-grade plutonium. Neutron multiplicity counting is well suited for verifying that the plutonium is of a safeguardable quantity and is weapons-quality material, as opposed to residue or waste. In addition, multiplicity counting can independently verify the presence of plutonium by virtue of a measured neutron self-multiplication and can detect the presence of non-plutonium neutron sources. The low-resolution gamma-ray spectroscopic technique is a template method that can provide continuity of knowledge that an item that enters a verification regime remains under the regime. In the initial verification of an item, multiple regions of the measured low-resolution spectrum form a unique, gamma-radiation-based template for the item that can be used for comparison in subsequent verifications. In this paper the authors discuss these technologies as they relate to the different attributes that could be used in a verification regime.

  1. Verification of COSMO model over Poland

    NASA Astrophysics Data System (ADS)

    Linkowska, Joanna; Mazur, Andrzej; Wyszogrodzki, Andrzej

    2014-05-01

    The Polish National Weather Service and Institute of Meteorology and Water Management - National Research Institute (IMWM-NRI, Warsaw, Poland) joined the Consortium for Small-Scale Modeling (COSMO) in 2002. Thanks to cooperation within the consortium, the meteorological model COSMO is run operationally at IMWM-NRI at both 2.8 km and 7 km horizontal resolutions. In research mode, data assimilation tests have been carried out using a 6-hourly cycle nudging scheme. We would like to present verification results of the COSMO model, comparing model-generated surface temperature, wind and rainfall rates with SYNOP measurements. In addition, verification results of vertical profiles for chosen variables will also be analyzed and presented. The verification is divided into the following areas: i) assessing the impact of data assimilation on the quality of 2.8 km resolution model forecasts by switching data assimilation on and off, ii) spatio-temporal verification of model results at 7 km resolution, and iii) conditional verification of selected parameters against chosen meteorological condition(s).

  2. Marine cable location system

    SciTech Connect

    Zachariadis, R.G.

    1984-05-01

    An acoustic positioning system locates a marine cable at an exploration site, such cable employing a plurality of hydrophones at spaced-apart positions along the cable. A marine vessel measures water depth to the cable as the vessel passes over the cable and interrogates the hydrophones with sonar pulses along a slant range as the vessel travels in a parallel and horizontally offset path to the cable. The location of the hydrophones is determined from the recordings of water depth and slant range.
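
    The geometry implied here reduces to a right triangle per hydrophone: the slant range is the hypotenuse and the water depth one leg, so the horizontal offset follows directly. A minimal sketch (Python; the function name and the straight-ray assumption are ours):

        import math

        def horizontal_offset(slant_range: float, depth: float) -> float:
            """Horizontal distance from the vessel track to a hydrophone,
            assuming a straight acoustic path (no ray bending)."""
            if slant_range < depth:
                raise ValueError("slant range cannot be shorter than the depth")
            return math.sqrt(slant_range ** 2 - depth ** 2)

        print(horizontal_offset(500.0, 300.0))  # 400.0, a 3-4-5 triangle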

  3. Earthquake location in island arcs

    USGS Publications Warehouse

    Engdahl, E.R.; Dewey, J.W.; Fujita, K.

    1982-01-01

    A comprehensive data set of selected teleseismic P-wave arrivals and local-network P- and S-wave arrivals from large earthquakes occurring at all depths within a small section of the central Aleutians is used to examine the general problem of earthquake location in island arcs. Reference hypocenters for this special data set are determined for shallow earthquakes from local-network data and for deep earthquakes from combined local and teleseismic data by joint inversion for structure and location. The high-velocity lithospheric slab beneath the central Aleutians may displace hypocenters that are located using spherically symmetric Earth models; the amount of displacement depends on the position of the earthquakes with respect to the slab and on whether local or teleseismic data are used to locate the earthquakes. Hypocenters for trench and intermediate-depth events appear to be minimally biased by the effects of slab structure on rays to teleseismic stations. However, locations of intermediate-depth events based on only local data are systematically displaced southwards, the magnitude of the displacement being proportional to depth. Shallow-focus events along the main thrust zone, although well located using only local-network data, are severely shifted northwards and deeper, with displacements as large as 50 km, by slab effects on teleseismic travel times. Hypocenters determined by a method that utilizes seismic ray tracing through a three-dimensional velocity model of the subduction zone, derived by thermal modeling, are compared to results obtained by the method of joint hypocenter determination (JHD) that formally assumes a laterally homogeneous velocity model over the source region and treats all raypath anomalies as constant station corrections to the travel-time curve. The ray-tracing method has the theoretical advantage that it accounts for variations in travel-time anomalies within a group of events distributed over a sizable region of a dipping, high

  4. Gated Treatment Delivery Verification With On-Line Megavoltage Fluoroscopy

    SciTech Connect

    Tai An; Christensen, James D.; Gore, Elizabeth; Khamene, Ali; Boettger, Thomas; Li, X. Allen

    2010-04-15

    Purpose: To develop and clinically demonstrate the use of on-line real-time megavoltage (MV) fluoroscopy for gated treatment delivery verification. Methods and Materials: Megavoltage fluoroscopy (MVF) image sequences were acquired using a flat panel equipped for MV cone-beam CT in synchrony with the respiratory signal obtained from the Anzai gating device. The MVF images can be obtained immediately before or during gated treatment delivery. A prototype software tool (named RTReg4D) was developed to register MVF images with phase-sequenced digitally reconstructed radiograph images generated from the treatment planning system based on four-dimensional CT. The image registration can be used to reposition the patient before or during treatment delivery. To demonstrate the reliability and clinical usefulness, the system was first tested using a thoracic phantom and then prospectively in actual patient treatments under an institutional review board-approved protocol. Results: The quality of the MVF images for lung tumors is adequate for image registration with phase-sequenced digitally reconstructed radiographs. The MVF was found to be useful for monitoring inter- and intrafractional variations of tumor positions. With the planning target volume contour displayed on the MVF images, the system can verify whether the moving target stays within the planning target volume margin during gated delivery. Conclusions: The use of MVF images was found to be clinically effective in detecting discrepancies in tumor location before and during respiration-gated treatment delivery. The tools and process developed can be useful for gated treatment delivery verification.

  5. Ionoacoustics: A new direct method for range verification

    NASA Astrophysics Data System (ADS)

    Parodi, Katia; Assmann, Walter

    2015-05-01

    The superior ballistic properties of ion beams may offer improved tumor-dose conformality and unprecedented sparing of organs at risk in comparison to other radiation modalities in external radiotherapy. However, these advantages come at the expense of increased sensitivity to uncertainties in the actual treatment delivery, resulting from inaccuracies of patient positioning, physiological motion and uncertainties in the knowledge of the ion range in living tissue. In particular, the dosimetric selectivity of ion beams depends on the longitudinal location of the Bragg peak, making in vivo knowledge of the actual beam range the greatest challenge to full clinical exploitation of ion therapy. Nowadays, in vivo range verification techniques that are already being investigated in clinical practice, or are close to it, rely on the detection of the secondary annihilation photons or prompt gammas resulting from nuclear interactions of the primary ion beam with the irradiated tissue. Despite initial promising results, these methods exploit a correlation between nuclear and electromagnetic processes that is not straightforward, and they typically require massive and costly instrumentation. By contrast, the long-known yet only recently revisited process of "ionoacoustics", which is generated by local tissue heating especially at the Bragg peak, may offer a more direct approach to in vivo range verification, as reviewed here.

  6. Clinical application of in vivo treatment delivery verification based on PET/CT imaging of positron activity induced at high energy photon therapy

    NASA Astrophysics Data System (ADS)

    Janek Strååt, Sara; Andreassen, Björn; Jonsson, Cathrine; Noz, Marilyn E.; Maguire, Gerald Q., Jr.; Näfstadius, Peder; Näslund, Ingemar; Schoenahl, Frederic; Brahme, Anders

    2013-08-01

    The purpose of this study was to investigate in vivo verification of radiation treatment with high energy photon beams using PET/CT to image the induced positron activity. The measurements of the positron activation induced in a preoperative rectal cancer patient and a prostate cancer patient following 50 MV photon treatments are presented. A total dose of 5 and 8 Gy, respectively, were delivered to the tumors. Imaging was performed with a 64-slice PET/CT scanner for 30 min, starting 7 min after the end of the treatment. The CT volume from the PET/CT and the treatment planning CT were coregistered by matching anatomical reference points in the patient. The treatment delivery was imaged in vivo based on the distribution of the induced positron emitters produced by photonuclear reactions in tissue mapped on to the associated dose distribution of the treatment plan. The results showed that spatial distribution of induced activity in both patients agreed well with the delivered beam portals of the treatment plans in the entrance subcutaneous fat regions but less so in blood and oxygen rich soft tissues. For the preoperative rectal cancer patient however, a 2 ± (0.5) cm misalignment was observed in the cranial-caudal direction of the patient between the induced activity distribution and treatment plan, indicating a beam patient setup error. No misalignment of this kind was seen in the prostate cancer patient. However, due to a fast patient setup error in the PET/CT scanner a slight mis-position of the patient in the PET/CT was observed in all three planes, resulting in a deformed activity distribution compared to the treatment plan. The present study indicates that the induced positron emitters by high energy photon beams can be measured quite accurately using PET imaging of subcutaneous fat to allow portal verification of the delivered treatment beams. Measurement of the induced activity in the patient 7 min after receiving 5 Gy involved count rates which were about

  7. Location of Geothermal Resources

    SciTech Connect

    2004-07-01

    Geothermal resources, which utilize the heat of the earth, are located throughout the planet's crust. Those closer to the surface are most commonly used because geothermal drilling costs are currently prohibitive below depths of between 10,000 and 15,000 feet.

  8. Birefringent Stress Location Sensor

    NASA Astrophysics Data System (ADS)

    Franks, R. B.; Torruellas, W.; Youngquist, R. C.

    1986-08-01

    A new type of stress location sensor is discussed in which the FMCW technique is used to detect the difference in propagation time between two optical paths in an optical fiber due to stress-induced modal coupling. Two versions of the system are included, and experimental results are presented for each system.

  9. LOCATING AREAS OF CONCERN

    EPA Science Inventory

    A simple method to locate changes in vegetation cover, which can be used to identify areas under stress. The method only requires inexpensive NDVI data. The use of remotely sensed data is far more cost-effective than field studies and can be performed more quickly. Local knowledg...

  10. Particle impact location detector

    NASA Technical Reports Server (NTRS)

    Auer, S. O.

    1974-01-01

    Detector includes delay lines connected to each detector surface strip. When several particles strike different strips simultaneously, pulses generated by each strip are time delayed by certain intervals. Delay time for each strip is known. By observing time delay in pulse, it is possible to locate strip that is struck by particle.
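
    Because each strip contributes a known, fixed delay, the struck strip can be recovered by dividing the observed extra delay by the per-strip delay. A minimal sketch under that uniform-delay assumption (names are illustrative, not from the original):

        def struck_strip(observed_delay_s: float, per_strip_delay_s: float) -> int:
            """Index of the strip struck by a particle, inferred from the extra
            delay the pulse accumulated in the delay line before readout."""
            return round(observed_delay_s / per_strip_delay_s)

        print(struck_strip(3.1e-7, 1.0e-7))  # -> strip 3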

  11. RF model of the distribution system as a communication channel, phase 2. Volume 2: Task reports

    NASA Technical Reports Server (NTRS)

    Rustay, R. C.; Gajjar, J. T.; Rankin, R. W.; Wentz, R. C.; Wooding, R.

    1982-01-01

    Based on the established feasibility of predicting, via a model, the propagation of power line carrier signals on radial-type distribution feeders, verification studies comparing model predictions against measurements were undertaken using more complicated feeder circuits and situations. Detailed accounts of the major tasks are presented. These include: (1) verification of the model; (2) extension, implementation, and verification of perturbation theory; (3) parameter sensitivity; (4) transformer modeling; and (5) compensation of power distribution systems for enhancement of power line carrier communication reliability.

  12. Data verification in the residue laboratory.

    PubMed

    Ault, J A; Cassidy, P S; Crawford, C J; Jablonski, J E; Kenyon, R G

    1994-12-01

    Residue analysis frequently presents a challenge to the quality assurance (QA) auditor due to the sheer volume of data to be audited. In the face of multiple boxes of raw data, some process must be defined that assures the scientist and the QA auditor of the quality and integrity of the data. A program is presented that ensures complete and appropriate verification of data before it reaches the Quality Assurance Unit (QAU). The "Guidelines for Peer Review of Data" were formulated by the Residue Analysis Business Center at Ricerca, Inc. to accommodate efficient use of review time and to resolve any uncertainties concerning what constitutes acceptable data. The core of this program centers around five elements: study initiation (definitional) meetings, calculations, verification, approval, and the use of a verification checklist.

  13. Heavy water physical verification in power plants

    SciTech Connect

    Morsy, S.; Schuricht, V.; Beetle, T.; Szabo, E.

    1986-01-01

    This paper is a report on the Agency's experience in verifying heavy water inventories in power plants. The safeguards objectives and goals for such activities are defined in the paper. The heavy water is stratified according to the flow within the power plant, including upgraders. A safeguards scheme based on a combination of records auditing, comparing records and reports, and physical verification has been developed. This scheme has elevated the status of heavy water safeguards to a level comparable to nuclear material safeguards in bulk facilities. It leads to attribute and variable verification of the heavy water inventory in the different system components and in the store. The verification methods include volume and weight determination, sampling and analysis, non-destructive assay (NDA), and criticality check. The different measurement methods and their limits of accuracy are analyzed and discussed in the paper.

  14. Critical Surface Cleaning and Verification Alternatives

    NASA Technical Reports Server (NTRS)

    Melton, Donald M.; McCool, A. (Technical Monitor)

    2000-01-01

    As a result of federal and state requirements, historical critical cleaning and verification solvents such as Freon 113, Freon TMC, and Trichloroethylene (TCE) are either highly regulated or no longer available. Interim replacements such as HCFC 225 have been qualified; however, toxicity and future phase-out regulations necessitate long-term solutions. The scope of this project was to qualify a safe and environmentally compliant LOX surface verification alternative to Freon 113, TCE and HCFC 225. The main effort was focused on initiating the evaluation and qualification of HCFC 225G as an alternate LOX verification solvent. The project was scoped in FY 99/00 to perform LOX compatibility, cleaning efficiency and qualification on flight hardware.

  15. Active alignment/contact verification system

    DOEpatents

    Greenbaum, William M.

    2000-01-01

    A system involving an active (i.e., electrical) technique for the verification of: 1) close-tolerance mechanical alignment between two components, and 2) electrical contact between mating parts through an elastomeric interface. For example, the two components may be an alumina carrier and a printed circuit board, two mating parts that are extremely small, high-density parts requiring alignment within a fraction of a mil, as well as a specified interface point of engagement between the parts. The system comprises pairs of conductive structures defined in the surface layers of the alumina carrier and the printed circuit board, for example. The first pair of conductive structures relates to item (1) above and permits alignment verification between mating parts. The second pair of conductive structures relates to item (2) above and permits verification of electrical contact between mating parts.

  16. Packaged low-level waste verification system

    SciTech Connect

    Tuite, K.T.; Winberg, M.; Flores, A.Y.; Killian, E.W.; McIsaac, C.V.

    1996-08-01

    Currently, states and low-level radioactive waste (LLW) disposal site operators have no method of independently verifying the radionuclide content of packaged LLW that arrives at disposal sites for disposal. At this time, disposal sites rely on LLW generator shipping manifests and accompanying records to ensure that LLW received meets the waste acceptance criteria. An independent verification system would provide a method of checking generator LLW characterization methods and help ensure that LLW disposed of at disposal facilities meets requirements. The Mobile Low-Level Waste Verification System (MLLWVS) provides the equipment, software, and methods to enable the independent verification of LLW shipping records to ensure that disposal site waste acceptance criteria are being met. The MLLWVS system was developed under a cost-share subcontract between WMG, Inc., and Lockheed Martin Idaho Technologies through the Department of Energy's National Low-Level Waste Management Program at the Idaho National Engineering Laboratory (INEL).

  18. Land Ice Verification and Validation Kit

    SciTech Connect

    2015-07-15

    To address a pressing need to better understand the behavior and complex interaction of ice sheets within the global Earth system, significant development of continental-scale, dynamical ice-sheet models is underway. The associated verification and validation process of these models is being coordinated through a new, robust, python-based extensible software package, the Land Ice Verification and Validation toolkit (LIVV). This release provides robust and automated verification and a performance evaluation on LCF platforms. The performance V&V involves a comprehensive comparison of model performance relative to expected behavior on a given computing platform. LIVV operates on a set of benchmark and test data, and provides comparisons for a suite of community prioritized tests, including configuration and parameter variations, bit-4-bit evaluation, and plots of tests where differences occur.

  19. GHG MITIGATION TECHNOLOGY PERFORMANCE EVALUATIONS UNDERWAY AT THE GHG TECHNOLOGY VERIFICATION CENTER

    EPA Science Inventory

    The paper outlines the verification approach and activities of the Greenhouse Gas (GHG) Technology Verification Center, one of 12 independent verification entities operating under the U.S. EPA-sponsored Environmental Technology Verification (ETV) program. (NOTE: The ETV program...

  20. Dust storm events over Delhi: verification of dust AOD forecasts with satellite and surface observations

    NASA Astrophysics Data System (ADS)

    Singh, Aditi; Iyengar, Gopal R.; George, John P.

    2016-05-01

    The Thar Desert, located in the northwestern part of India, is considered one of the major dust sources. Dust storms originate in the Thar Desert during the pre-monsoon season and affect large parts of the Indo-Gangetic (IG) Plains. High dust loading causes deterioration of the ambient air quality and degradation in visibility. The present study focuses on the identification of dust events and the verification of dust-event forecasts over Delhi and the western part of the IG Plains during the pre-monsoon season of 2015. Three dust events were identified over Delhi during the study period. For all the selected days, Terra-MODIS AODs at 550 nm are found to be close to 1.0, while AURA-OMI AI shows high values. Dust AOD forecasts from the NCMRWF Unified Model (NCUM) for the three selected dust events are verified against satellite (MODIS) and ground-based observations (AERONET). Comparison of observed AODs at 550 nm from MODIS with NCUM-predicted AODs reveals that NCUM is able to predict the spatial and temporal distribution of dust AOD in these cases. Good correlation (~0.67) is obtained between the NCUM-predicted dust AODs and location-specific observations available from AERONET. The model under-predicted the AODs compared with the AERONET observations, mainly because it accounts only for dust and considers no anthropogenic activities. The results of the present study emphasize the need for a more realistic representation of local dust emission in the model, both natural and anthropogenic, to improve the forecast of dust from NCUM during dust events.

  1. Dust forecast over North Africa: verification with satellite and ground based observations

    NASA Astrophysics Data System (ADS)

    Singh, Aditi; Kumar, Sumit; George, John P.

    2016-05-01

    Arid regions of North Africa are considered among the major dust sources. The present study focuses on forecasts of dust aerosol optical depth (AOD) over different regions of North Africa. The NCMRWF Unified Model (NCUM) produces dust AOD forecasts at different wavelengths with lead times up to 240 h, based on 00 UTC initial conditions. Model forecasts of dust AOD at 550 nm, up to 72 h ahead and based on different initial conditions, are verified against satellite and ground-based observations of total AOD during May-June 2014, under the assumption that aerosols other than dust are negligible. Location-specific and geographical distributions of the dust AOD forecasts are verified against Aerosol Robotic Network (AERONET) station observations of total and coarse-mode AOD. Moderate Resolution Imaging Spectroradiometer (MODIS) dark-target and deep-blue merged level 3 total aerosol optical depth (AOD) at 550 nm and Cloud-Aerosol Lidar with Orthogonal Polarization (CALIOP) retrieved dust AOD at 532 nm are also used for verification. CALIOP dust AOD was obtained by vertical integration of the aerosol extinction coefficient at 532 nm from the level 2 aerosol profile products. It is found that at all the selected AERONET stations the trend in dust AODs is well predicted by NCUM up to three days in advance. Good correlation, with consistently low bias (~ +/-0.06) and RMSE (~0.2) values, is found between model forecasts and point measurements of AERONET, except over one location, Cinzana (Mali). Model forecasts consistently overestimated the dust AOD compared with CALIOP dust AOD, with a bias of 0.25 and RMSE of 0.40.
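
    The scores quoted in these two dust studies (bias, RMSE, correlation) are standard pointwise verification statistics. A minimal sketch of how they would be computed from collocated forecast/observation pairs (Python with NumPy; not code from the studies):

        import numpy as np

        def verification_stats(forecast, observed):
            """Bias, RMSE and Pearson correlation for collocated AOD pairs."""
            f = np.asarray(forecast, dtype=float)
            o = np.asarray(observed, dtype=float)
            bias = float(np.mean(f - o))
            rmse = float(np.sqrt(np.mean((f - o) ** 2)))
            corr = float(np.corrcoef(f, o)[0, 1])
            return bias, rmse, corr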

  3. Theoretical study of closed-loop recycling liquid-liquid chromatography and experimental verification of the theory.

    PubMed

    Kostanyan, Artak E; Erastov, Andrey A

    2016-09-01

    The non-ideal recycling equilibrium-cell model including the effects of extra-column dispersion is used to simulate and analyze closed-loop recycling counter-current chromatography (CLR CCC). Previously, the operating scheme with the detector located before the column was considered. In this study, analysis of the process is carried out for a more realistic and practical scheme with the detector located immediately after the column. Peak equation for individual cycles and equations describing the transport of single peaks and complex chromatograms inside the recycling closed-loop, as well as equations for the resolution between single solute peaks of the neighboring cycles, for the resolution of peaks in the recycling chromatogram and for the resolution between the chromatograms of the neighboring cycles are presented. It is shown that, unlike conventional chromatography, increasing of the extra-column volume (the recycling line length) may allow a better separation of the components in CLR chromatography. For the experimental verification of the theory, aspirin, caffeine, coumarin and the solvent system hexane/ethyl acetate/ethanol/water (1:1:1:1) were used. Comparison of experimental and simulated processes of recycling and distribution of the solutes in the closed-loop demonstrated a good agreement between theory and experiment. PMID:27492599
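
    The cycle-resolution equations in the paper build on the classical two-peak resolution; as a baseline illustration, the textbook formula (not the paper's cycle-specific expressions) is:

        def peak_resolution(t1: float, t2: float, w1: float, w2: float) -> float:
            """Classical resolution Rs = 2*(t2 - t1)/(w1 + w2) between two peaks
            with retention times t1 < t2 and base widths w1, w2; the paper applies
            analogous expressions to peaks of neighboring cycles."""
            return 2.0 * (t2 - t1) / (w1 + w2)

        print(peak_resolution(10.0, 12.0, 1.0, 1.5))  # 1.6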

  4. Electric power system test and verification program

    NASA Technical Reports Server (NTRS)

    Rylicki, Daniel S.; Robinson, Frank, Jr.

    1994-01-01

    Space Station Freedom's (SSF's) electric power system (EPS) hardware and software verification is performed at all levels of integration, from components to assembly and system level tests. Careful planning is essential to ensure the EPS is tested properly on the ground prior to launch. The results of the test performed on breadboard model hardware and analyses completed to date have been evaluated and used to plan for design qualification and flight acceptance test phases. These results and plans indicate the verification program for SSF's 75-kW EPS would have been successful and completed in time to support the scheduled first element launch.

  5. Challenges in High-Assurance Runtime Verification

    NASA Technical Reports Server (NTRS)

    Goodloe, Alwyn E.

    2016-01-01

    Safety-critical systems are growing more complex and becoming increasingly autonomous. Runtime Verification (RV) has the potential to provide protections when a system cannot be assured by conventional means, but only if the RV itself can be trusted. In this paper, we proffer a number of challenges to realizing high-assurance RV and illustrate how we have addressed them in our research. We argue that high-assurance RV provides a rich target for automated verification tools in hope of fostering closer collaboration among the communities.

  6. Verification of Plan Models Using UPPAAL

    NASA Technical Reports Server (NTRS)

    Khatib, Lina; Muscettola, Nicola; Haveland, Klaus; Lau, Sonic (Technical Monitor)

    2001-01-01

    This paper describes work on the verification of HSTS, the planner and scheduler of the Remote Agent autonomous control system deployed in Deep Space 1 (DS1). The verification is done using UPPAAL, a real time model checking tool. We start by motivating our work in the introduction. Then we give a brief description of HSTS and UPPAAL. After that, we give a mapping of HSTS models into UPPAAL and we present samples of plan model properties one may want to verify. Finally, we conclude with a summary.

  7. The formal verification of generic interpreters

    NASA Technical Reports Server (NTRS)

    Windley, P.; Levitt, K.; Cohen, G. C.

    1991-01-01

    Task assignment 3 of the design and validation of digital flight control systems suitable for fly-by-wire applications is studied. Task 3 is associated with the formal verification of embedded systems. In particular, results are presented that provide a methodological approach to microprocessor verification. A hierarchical decomposition strategy for specifying microprocessors is also presented. A theory of generic interpreters is presented that can be used to model microprocessor behavior. The generic interpreter theory abstracts away the details of instruction functionality, leaving a general model of what an interpreter does.

  8. Generic Protocol for the Verification of Ballast Water Treatment Technology

    EPA Science Inventory

    In anticipation of the need to address performance verification and subsequent approval of new and innovative ballast water treatment technologies for shipboard installation, the U.S. Coast Guard and the Environmental Protection Agency's Environmental Technology Verification Progr...

  9. A tracking and verification system implemented in a clinical environment for partial HIPAA compliance

    NASA Astrophysics Data System (ADS)

    Guo, Bing; Documet, Jorge; Liu, Brent; King, Nelson; Shrestha, Rasu; Wang, Kevin; Huang, H. K.; Grant, Edward G.

    2006-03-01

    The paper describes the methodology for the clinical design and implementation of a Location Tracking and Verification System (LTVS) that has distinct benefits for the Imaging Department at the Healthcare Consultation Center II (HCCII), an outpatient imaging facility located on the USC Health Science Campus. A novel system was developed that uses wireless and facial biometric technology to monitor and automatically identify patients and staff in a clinical environment, in order to streamline patient workflow, protect against erroneous examinations, and create a security zone to prevent and audit unauthorized access to patient healthcare data under the HIPAA mandate. This paper describes the system design and integration methodology based on initial clinical workflow studies within a clinical environment. An outpatient center was chosen as an initial first step for the development and implementation of this system.

  10. Dose Verification of Stereotactic Radiosurgery Treatment for Trigeminal Neuralgia with Presage 3D Dosimetry System

    NASA Astrophysics Data System (ADS)

    Wang, Z.; Thomas, A.; Newton, J.; Ibbott, G.; Deasy, J.; Oldham, M.

    2010-11-01

    Achieving adequate verification and quality-assurance (QA) for radiosurgery treatment of trigeminal-neuralgia (TGN) is particularly challenging because of the combination of very small fields, very high doses, and complex irradiation geometries (multiple gantry and couch combinations). TGN treatments have extreme requirements for dosimetry tools and QA techniques, to ensure adequate verification. In this work we evaluate the potential of the Presage/Optical-CT dosimetry system as a tool for the verification of TGN distributions in high resolution and in 3D. A TGN treatment was planned and delivered to a Presage 3D dosimeter positioned inside the Radiological-Physics-Center (RPC) head and neck IMRT credentialing phantom. A 6-arc treatment plan was created using the iPlan system, and a maximum dose of 80 Gy was delivered with a Varian Trilogy machine. The delivered dose to Presage was determined by optical-CT scanning using the Duke Large field-of-view Optical-CT Scanner (DLOS) in 3D, with isotropic resolution of 0.7 mm³. DLOS scanning and reconstruction took about 20 minutes. 3D dose comparisons were made with the planning system. Good agreement was observed between the planned and measured 3D dose distributions, and this work provides strong support for the viability of Presage/Optical-CT as a highly useful new approach for verification of this complex technique.

  11. ENVIRONMENTAL TECHNOLOGY VERIFICATION: JOINT (NSF-EPA) VERIFICATION STATEMENT AND REPORT FOR THE REDUCTION OF NITROGEN IN DOMESTIC WASTEWATER FROM INDIVIDUAL HOMES, F. R. MAHONEY & ASSOC., AMPHIDROME SYSTEM FOR SINGLE FAMILY HOMES - 02/05/WQPC-SWP

    EPA Science Inventory

    Verification testing of the F.R. Mahoney Amphidrome System was conducted over a twelve-month period at the Massachusetts Alternative Septic System Test Center (MASSTC) located at the Otis Air National Guard Base in Bourne, MA. Sanitary Sewerage from the base residential housing w...

  12. ENVIRONMENTAL TECHNOLOGY VERIFICATION: JOINT (NSF-EPA) VERIFICATION STATEMENT AND REPORT FOR THE REDUCTION OF NITROGEN IN DOMESTIC WASTEWATER FROM INDIVIDUAL HOMES, BIO-MICROBICS, INC., MODEL RETROFAST ®0.375

    EPA Science Inventory

    Verification testing of the Bio-Microbics RetroFAST® 0.375 System to determine the reduction of nitrogen in residential wastewater was conducted over a twelve-month period at the Mamquam Wastewater Technology Test Facility, located at the Mamquam Wastewater Treatment Plant. The R...

  13. ENVIRONMENTAL TECHNOLOGY VERIFICATION: JOINT (NSF-EPA) VERIFICATION STATEMENT AND REPORT FOR THE REDUCTION OF NITROGEN IN DOMESTIC WASTEWATER FROM INDIVIDUAL HOMES, AQUAPOINT, INC. BIOCLERE MODEL 16/12 - 02/02/WQPC-SWP

    EPA Science Inventory

    Verification testing of the Aquapoint, Inc. (AQP) Bioclere™ Model 16/12 was conducted over a thirteen-month period at the Massachusetts Alternative Septic System Test Center (MASSTC), located at Otis Air National Guard Base in Bourne, Massachusetts. Sanitary sewerage from the ba...

  14. ENVIRONMENTAL TECHNOLOGY VERIFICATION: JOINT (NSF-EPA) VERIFICATION STATEMENT AND REPORT FOR THE REDUCTION OF NITROGEN IN DOMESTIC WASTEWATER FROM INDIVIDUAL RESIDENTIAL HOMES, WATERLOO BIOFILTER® MODEL 4-BEDROOM (NSF 02/03/WQPC-SWP)

    EPA Science Inventory

    Verification testing of the Waterloo Biofilter Systems (WBS), Inc. Waterloo Biofilter® Model 4-Bedroom system was conducted over a thirteen-month period at the Massachusetts Alternative Septic System Test Center (MASSTC) located at Otis Air National Guard Base in Bourne, Mas...

  15. ENVIRONMENTAL TECHNOLOGY VERIFICATION: JOINT (NSF-EPA) VERIFICATION STATEMENT AND REPORT FOR THE REDUCTION OF NITROGEN IN DOMESTIC WASTEWATER FROM INDIVIDUAL HOMES, SEPTITECH, INC. MODEL 400 SYSTEM - 02/04/WQPC-SWP

    EPA Science Inventory

    Verification testing of the SeptiTech Model 400 System was conducted over a twelve-month period at the Massachusetts Alternative Septic System Test Center (MASSTC) located at the Otis Air National Guard Base in Bourne, MA. Sanitary Sewerage from the base residential housing was u...

  16. Interferometric locating system

    NASA Technical Reports Server (NTRS)

    Macdoran, P. F. (Inventor)

    1980-01-01

    A system is described for determining the position of a vehicle or other target that emits radio waves and which is of the type that senses the difference in time of arrival at spaced ground stations of signals from the vehicle to locate the vehicle on a set of intersecting hyperbolas. A network of four ground stations detects the radio emissions from the vehicle and by means of cross correlation derives the relative signal delay at the ground stations from which the vehicle position is deduced. Because the signal detection is by cross correlation, no knowledge of the emission is needed, which makes even unintentional radio noise emissions usable as a locator beacon. By positioning one of the four ground stations at an elevation significantly above the plane of the other three stations, a three dimensional fix on the vehicle is possible.
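
    The core signal-processing step described here, estimating the relative delay of an unknown emission at two stations by cross-correlation, can be sketched as follows (Python with NumPy; simplified to identical sampling and no Doppler shift):

        import numpy as np

        def relative_delay(sig_a, sig_b, sample_rate_hz):
            """Arrival-time difference (s) of one emission at two stations from
            the cross-correlation peak; positive means the signal reached
            station A later than station B."""
            a = np.asarray(sig_a, dtype=float) - np.mean(sig_a)
            b = np.asarray(sig_b, dtype=float) - np.mean(sig_b)
            xc = np.correlate(a, b, mode="full")
            lag_samples = int(np.argmax(xc)) - (len(b) - 1)
            return lag_samples / sample_rate_hz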

  17. Dipole Well Location

    1998-08-03

    The problem here is to model the three-dimensional response of an electromagnetic logging tool to a practical situation which is often encountered in oil and gas exploration. The DWELL code provides the electromagnetic fields on the axis of a borehole due to either an electric or a magnetic dipole located on the same axis. The borehole is cylindrical, and is located within a stratified formation in which the bedding planes are not horizontal. The angle between the normal to the bedding planes and the axis of the borehole may assume any value; in other words, the borehole axis may be tilted with respect to the bedding planes. Additionally, all of the formation layers may have invasive zones of drilling mud. The operating frequency of the source dipole(s) extends from a few Hertz to hundreds of Megahertz.

  18. Electric current locator

    DOEpatents

    King, Paul E.; Woodside, Charles Rigel

    2012-02-07

    The disclosure herein provides an apparatus for location of a quantity of current vectors in an electrical device, where the current vector has a known direction and a known relative magnitude to an input current supplied to the electrical device. Mathematical constants used in Biot-Savart superposition equations are determined for the electrical device, the orientation of the apparatus, and the relative magnitude of the current vector and the input current, and the apparatus utilizes magnetic field sensors oriented to a sensing plane to provide the current vector location based on the solution of the Biot-Savart superposition equations. Descriptions of the required orientations between the apparatus and the electrical device are disclosed, and various methods of determining the mathematical constants are presented.
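
    As a one-dimensional illustration of the underlying physics (not the patent's multi-sensor superposition solve), the Biot-Savart field of a long straight conductor can be inverted for the sensor-to-conductor distance:

        import math

        MU0 = 4.0e-7 * math.pi  # vacuum permeability (T*m/A)

        def distance_to_conductor(b_field_t: float, current_a: float) -> float:
            """Invert B = mu0*I/(2*pi*r) for r, the distance from a magnetic
            field sensor to a long straight current-carrying conductor."""
            return MU0 * current_a / (2.0 * math.pi * b_field_t)

        print(distance_to_conductor(2.0e-5, 10.0))  # 0.1 m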

  19. Underwater hydrophone location survey

    NASA Technical Reports Server (NTRS)

    Cecil, Jack B.

    1993-01-01

    The Atlantic Undersea Test and Evaluation Center (AUTEC) is a U.S. Navy test range located on Andros Island, Bahamas, and a Division of the Naval Undersea Warfare Center (NUWC), Newport, RI. The Headquarters of AUTEC is located at a facility in West Palm Beach, FL. AUTEC's primary mission is to provide the U.S. Navy with a deep-water test and evaluation facility for making underwater acoustic measurements, testing and calibrating sonars, and providing accurate underwater, surface, and in-air tracking data on surface ships, submarines, aircraft, and weapon systems. Many of these programs are in support of Antisubmarine Warfare (ASW), undersea research and development programs, and Fleet assessment and operational readiness trials. Most tests conducted at AUTEC require precise underwater tracking (plus or minus 3 yards) of multiple acoustic signals emitted with the correct waveshape and repetition criteria from either a surface craft or underwater vehicle.

  20. Earth Science Enterprise Scientific Data Purchase Project: Verification and Validation

    NASA Technical Reports Server (NTRS)

    Jenner, Jeff; Policelli, Fritz; Fletcher, Rosea; Holecamp, Kara; Owen, Carolyn; Nicholson, Lamar; Dartez, Deanna

    2000-01-01

    This paper presents viewgraphs on the Earth Science Enterprise Scientific Data Purchase Project's verification and validation process. The topics include: 1) What is Verification and Validation? 2) Why Verification and Validation? 3) Background; 4) ESE Data Purchase Validation Process; 5) Data Validation System and Ingest Queue; 6) Shipment Verification; 7) Tracking and Metrics; 8) Validation of Contract Specifications; 9) Earth Watch Data Validation; 10) Validation of Vertical Accuracy; and 11) Results of Vertical Accuracy Assessment.

  1. Optimal Facility-Location

    PubMed Central

    Goldman, A. J.

    2006-01-01

    Dr. Christoph Witzgall, the honoree of this Symposium, can count among his many contributions to applied mathematics and mathematical operations research a body of widely-recognized work on the optimal location of facilities. The present paper offers to non-specialists a sketch of that field and its evolution, with emphasis on areas most closely related to Witzgall’s research at NBS/NIST. PMID:27274920
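
    A canonical problem in this field is the 1-median (minisum) location: place one facility so as to minimize the sum of Euclidean distances to given sites. Weiszfeld's classical iteration solves it, sketched here in Python purely as a generic illustration, not as a rendering of Witzgall's specific results:

        import numpy as np

        def weiszfeld(sites, max_iters=200, tol=1e-9):
            """Geometric median of an (n, d) array of site coordinates via
            Weiszfeld's iteratively reweighted averaging."""
            pts = np.asarray(sites, dtype=float)
            x = pts.mean(axis=0)  # start from the centroid
            for _ in range(max_iters):
                d = np.maximum(np.linalg.norm(pts - x, axis=1), tol)
                w = 1.0 / d
                x_new = (pts * w[:, None]).sum(axis=0) / w.sum()
                if np.linalg.norm(x_new - x) < tol:
                    break
                x = x_new
            return x

        print(weiszfeld([[0, 0], [2, 0], [1, 3]]))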

  2. Ammonia Leak Locator Study

    NASA Technical Reports Server (NTRS)

    Dodge, Franklin T.; Wuest, Martin P.; Deffenbaugh, Danny M.

    1995-01-01

    The thermal control system of International Space Station Alpha will use liquid ammonia as the heat exchange fluid. It is expected that small leaks (of the order perhaps of one pound of ammonia per day) may develop in the lines transporting the ammonia to the various facilities as well as in the heat exchange equipment. Such leaks must be detected and located before the supply of ammonia becomes critically low. For that reason, NASA-JSC has a program underway to evaluate instruments that can detect and locate ultra-small concentrations of ammonia in a high vacuum environment. To be useful, the instrument must be portable and small enough that an astronaut can easily handle it during extravehicular activity. An additional complication in the design of the instrument is that the environment immediately surrounding ISSA will contain small concentrations of many other gases from venting of onboard experiments as well as from other kinds of leaks. These other vapors include water, cabin air, CO2, CO, argon, N2, and ethylene glycol. Altogether, this local environment might have a pressure of the order of 10^-7 to 10^-6 torr. Southwest Research Institute (SwRI) was contracted by NASA-JSC to provide support to NASA-JSC and its prime contractors in evaluating ammonia-location instruments and to make a preliminary trade study of the advantages and limitations of potential instruments. The present effort builds upon an earlier SwRI study to evaluate ammonia leak detection instruments [Jolly and Deffenbaugh]. The objectives of the present effort include: (1) Estimate the characteristics of representative ammonia leaks; (2) Evaluate the baseline instrument in the light of the estimated ammonia leak characteristics; (3) Propose alternative instrument concepts; and (4) Conduct a trade study of the proposed alternative concepts and recommend promising instruments. The baseline leak-location instrument selected by NASA-JSC was an ion gauge.

  3. Magnetic Location Indicator

    NASA Technical Reports Server (NTRS)

    Stegman, Thomas W.

    1992-01-01

    Ferrofluidic device indicates point of highest magnetic-flux density in workspace. Consists of bubble of ferrofluid in immiscible liquid carrier in clear plastic case. Used in flat block or tube. Axes of centering circle on flat-block version used to mark location of maximum flux density when bubble in circle. Device used to find point on wall corresponding to known point on opposite side of wall.

  4. Coso MT Site Locations

    SciTech Connect

    Doug Blankenship

    2011-05-04

    This data includes the locations of the MT data collected in and around the Coso Geothermal field that covered the West Flank area. These are the data that the 3D MT models were created from that were discussed in Phase 1 of the West Flank FORGE project. The projected coordinate system is NAD 1927 State Plane California IV FIPS 0404 and the Projection is Lambert Conformal Conic. Units are in feet.
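
    For users who need these site locations in geographic coordinates, the stated projection appears to correspond to EPSG:26744 (NAD27 / California zone IV, US survey feet); that code is our assumption and should be verified before use. A hedged conversion sketch with pyproj:

        from pyproj import Transformer

        # Assumed: NAD 1927 State Plane California IV FIPS 0404, Lambert Conformal
        # Conic, units of feet, is catalogued as EPSG:26744 (verify before use).
        to_wgs84 = Transformer.from_crs("EPSG:26744", "EPSG:4326", always_xy=True)

        def site_to_lonlat(easting_ft: float, northing_ft: float):
            """Convert a Coso MT site coordinate (feet) to WGS84 lon/lat."""
            return to_wgs84.transform(easting_ft, northing_ft)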

  5. Methods for identification and verification using vacuum XRF system

    NASA Technical Reports Server (NTRS)

    Schramm, Fred (Inventor); Kaiser, Bruce (Inventor)

    2005-01-01

    Apparatus and methods in which one or more elemental taggants that are intrinsically located in an object are detected by x-ray fluorescence analysis under vacuum conditions to identify or verify the object's elemental content for elements with lower atomic numbers. By using x-ray fluorescence analysis, the apparatus and methods of the invention are simple and easy to use, as well as provide detection by a non line-of-sight method to establish the origin of objects, as well as their point of manufacture, authenticity, verification, security, and the presence of impurities. The invention is extremely advantageous because it provides the capability to measure lower atomic number elements in the field with a portable instrument.

  6. A Verification of MCNP6 FMESH Tally Capabilities

    SciTech Connect

    Swift, Alicia L.; McKigney, Edward A.; Schirato, Richard C.; Robinson, Alex Philip; Temple, Brian Allen

    2015-02-10

    This work serves to verify the MCNP6 FMESH capability through comparison to two types of data. FMESH tallies, binned in time, were generated on an ideal detector face for neutrons undergoing a single scatter in a graphite target. For verification, FMESH results were compared to analytic calculations of the nonrelativistic TOF for elastic and inelastic single neutron scatters (TOF for the purposes of this paper is the time for a neutron to travel from its scatter location in the graphite target to the detector face). FMESH tally results were also compared to F4 tally results, an MCNP tally that calculates fluence in the same way as the FMESH tally. The FMESH tally results agree well with the analytic results and the F4 tally; hence, it is believed that, for simple geometries, MCNP6 FMESH tallies represent the physics of neutron scattering very well.
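
    The analytic check described here is the nonrelativistic time of flight, t = d / v with v = sqrt(2E/m). A minimal sketch (Python; constants and names are ours):

        import math

        NEUTRON_MASS_KG = 1.67492749e-27
        EV_TO_JOULE = 1.60217663e-19

        def neutron_tof_s(energy_ev: float, path_m: float) -> float:
            """Nonrelativistic time of flight of a neutron with the given kinetic
            energy over a straight scatter-to-detector path."""
            speed = math.sqrt(2.0 * energy_ev * EV_TO_JOULE / NEUTRON_MASS_KG)
            return path_m / speed

        print(neutron_tof_s(1.0e6, 1.0))  # ~7.2e-8 s for a 1 MeV neutron over 1 m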

  7. Weather model verification using Sodankylä mast measurements

    NASA Astrophysics Data System (ADS)

    Kangas, Markku; Rontu, Laura; Fortelius, Carl; Aurela, Mika; Poikonen, Antti

    2016-04-01

    Sodankylä, in the heart of Arctic Research Centre of the Finnish Meteorological Institute (FMI ARC) in northern Finland, is an ideal site for atmospheric and environmental research in the boreal and sub-Arctic zone. With temperatures ranging from -50 to +30 °C, it provides a challenging testing ground for numerical weather forecasting (NWP) models as well as weather forecasting in general. An extensive set of measurements has been carried out in Sodankylä for more than 100 years. In 2000, a 48 m-high micrometeorological mast was erected in the area. In this article, the use of Sodankylä mast measurements in NWP model verification is described. Starting in 2000, with the NWP model HIRLAM and Sodankylä measurements, the verification system has now been expanded to include comparisons between 12 NWP models and seven measurement masts, distributed across Europe. A case study, comparing forecasted and observed radiation fluxes, is also presented. It was found that three different radiation schemes, applicable in NWP model HARMONIE-AROME, produced somewhat different downwelling longwave radiation fluxes during cloudy days, which however did not change the overall cold bias of the predicted screen-level temperature.

  8. Integrated Verification Experiment data collected as part of the Los Alamos National Laboratory's Source Region Program

    SciTech Connect

    Fitzgerald, T.J.; Carlos, R.C.; Argo, P.E.

    1993-01-21

    As part of the integrated verification experiment (IVE), we deployed a network of HF ionospheric sounders to detect the effects of acoustic waves generated by surface ground motion following underground nuclear tests at the Nevada Test Site. The network sampled up to four geographic locations in the ionosphere from almost directly overhead of the surface ground zero out to a horizontal range of 60 km. We present sample results for four of the IVEs: Misty Echo, Texarkana, Mineral Quarry, and Bexar.

  9. 40 CFR 1065.395 - Inertial PM balance verifications.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 40 Protection of Environment 32 2010-07-01 2010-07-01 false Inertial PM balance verifications... Inertial PM balance verifications. This section describes how to verify the performance of an inertial PM balance. (a) Independent verification. Have the balance manufacturer (or a representative approved by...

  10. 40 CFR 1065.395 - Inertial PM balance verifications.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... 40 Protection of Environment 33 2014-07-01 2014-07-01 false Inertial PM balance verifications... Inertial PM balance verifications. This section describes how to verify the performance of an inertial PM balance. (a) Independent verification. Have the balance manufacturer (or a representative approved by...

  11. 40 CFR 1065.395 - Inertial PM balance verifications.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... 40 Protection of Environment 33 2011-07-01 2011-07-01 false Inertial PM balance verifications... Inertial PM balance verifications. This section describes how to verify the performance of an inertial PM balance. (a) Independent verification. Have the balance manufacturer (or a representative approved by...

  12. 40 CFR 1065.395 - Inertial PM balance verifications.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... 40 Protection of Environment 34 2013-07-01 2013-07-01 false Inertial PM balance verifications... Inertial PM balance verifications. This section describes how to verify the performance of an inertial PM balance. (a) Independent verification. Have the balance manufacturer (or a representative approved by...

  13. 40 CFR 1065.395 - Inertial PM balance verifications.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... 40 Protection of Environment 34 2012-07-01 2012-07-01 false Inertial PM balance verifications... Inertial PM balance verifications. This section describes how to verify the performance of an inertial PM balance. (a) Independent verification. Have the balance manufacturer (or a representative approved by...

  14. 45 CFR 1626.6 - Verification of citizenship.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... 45 Public Welfare 4 2014-10-01 2014-10-01 false Verification of citizenship. 1626.6 Section 1626.6... ON LEGAL ASSISTANCE TO ALIENS § 1626.6 Verification of citizenship. (a) A recipient shall require all... require verification of citizenship. A recipient shall not consider factors such as a person's...

  15. 24 CFR 5.512 - Verification of eligible immigration status.

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ... immigration status. 5.512 Section 5.512 Housing and Urban Development Office of the Secretary, Department of... Noncitizens § 5.512 Verification of eligible immigration status. (a) General. Except as described in paragraph...) Primary verification—(1) Automated verification system. Primary verification of the immigration status...

  16. 24 CFR 5.512 - Verification of eligible immigration status.

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... immigration status. 5.512 Section 5.512 Housing and Urban Development Office of the Secretary, Department of... Noncitizens § 5.512 Verification of eligible immigration status. (a) General. Except as described in paragraph...) Primary verification—(1) Automated verification system. Primary verification of the immigration status...

  17. 24 CFR 5.512 - Verification of eligible immigration status.

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... immigration status. 5.512 Section 5.512 Housing and Urban Development Office of the Secretary, Department of... Noncitizens § 5.512 Verification of eligible immigration status. (a) General. Except as described in paragraph...) Primary verification—(1) Automated verification system. Primary verification of the immigration status...

  18. 24 CFR 5.512 - Verification of eligible immigration status.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... immigration status. 5.512 Section 5.512 Housing and Urban Development Office of the Secretary, Department of... Noncitizens § 5.512 Verification of eligible immigration status. (a) General. Except as described in paragraph...) Primary verification—(1) Automated verification system. Primary verification of the immigration status...

  19. 24 CFR 5.512 - Verification of eligible immigration status.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... immigration status. 5.512 Section 5.512 Housing and Urban Development Office of the Secretary, Department of... Noncitizens § 5.512 Verification of eligible immigration status. (a) General. Except as described in paragraph...) Primary verification—(1) Automated verification system. Primary verification of the immigration status...

  20. 8 CFR 343b.5 - Verification of naturalization.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... 8 Aliens and Nationality 1 2012-01-01 2012-01-01 false Verification of naturalization. 343b.5... CERTIFICATE OF NATURALIZATION FOR RECOGNITION BY A FOREIGN STATE § 343b.5 Verification of naturalization. The application shall not be granted without first obtaining verification of the applicant's naturalization....

  1. 8 CFR 343b.5 - Verification of naturalization.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... 8 Aliens and Nationality 1 2011-01-01 2011-01-01 false Verification of naturalization. 343b.5... CERTIFICATE OF NATURALIZATION FOR RECOGNITION BY A FOREIGN STATE § 343b.5 Verification of naturalization. The application shall not be granted without first obtaining verification of the applicant's naturalization....

  2. 8 CFR 343b.5 - Verification of naturalization.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... 8 Aliens and Nationality 1 2014-01-01 2014-01-01 false Verification of naturalization. 343b.5... CERTIFICATE OF NATURALIZATION FOR RECOGNITION BY A FOREIGN STATE § 343b.5 Verification of naturalization. The application shall not be granted without first obtaining verification of the applicant's naturalization....

  3. 8 CFR 343b.5 - Verification of naturalization.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... 8 Aliens and Nationality 1 2013-01-01 2013-01-01 false Verification of naturalization. 343b.5... CERTIFICATE OF NATURALIZATION FOR RECOGNITION BY A FOREIGN STATE § 343b.5 Verification of naturalization. The application shall not be granted without first obtaining verification of the applicant's naturalization....

  4. 8 CFR 343b.5 - Verification of naturalization.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 8 Aliens and Nationality 1 2010-01-01 2010-01-01 false Verification of naturalization. 343b.5... CERTIFICATE OF NATURALIZATION FOR RECOGNITION BY A FOREIGN STATE § 343b.5 Verification of naturalization. The application shall not be granted without first obtaining verification of the applicant's naturalization....

  5. 37 CFR 261.7 - Verification of royalty payments.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    .... This section prescribes general rules pertaining to the verification by any Copyright Owner or... section shall apply to situations where a Copyright Owner or a Performer and a Designated Agent have agreed as to proper verification methods. (b) Frequency of verification. A Copyright Owner or a...

  6. 40 CFR 1065.920 - PEMS calibrations and verifications.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... test conditions. As provided in 40 CFR 1068.5, we will deem your system to not meet the requirements of... 40 Protection of Environment 33 2014-07-01 2014-07-01 false PEMS calibrations and verifications....920 PEMS calibrations and verifications. (a) Subsystem calibrations and verifications. Use all...

  7. 40 CFR 98.3 - What are the general monitoring, reporting, recordkeeping and verification requirements of this...

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... emissions from a cogeneration unit located at the facility. (vi) When applying paragraph (c)(4)(i) of this...) Verification. To verify the completeness and accuracy of reported GHG emissions, the Administrator may review... other credible evidence, in conjunction with a comprehensive review of the GHG reports and...

  8. Interim Letter Report - Verification Survey of 19 Grids in the Lester Flat Area, David Witherspoon Inc. 1630 Site Knoxville, Tennessee

    SciTech Connect

    P.C. Weaver

    2008-10-17

    Perform verification surveys of 19 available grids located in the Lester Flat Area at the David Witherspoon Site. The survey grids included E11, E12, E13, F11, F12, F13, F14, F15, G15, G16, G17, H16, H17, H18, X16, X17, X18, K16, and J16.

  9. A radar-based verification of precipitation forecast for local convective storms

    NASA Astrophysics Data System (ADS)

    Rezacova, Daniela; Sokol, Zbynek; Pesice, Petr

    2007-02-01

    Local flash flood storms with a rapid hydrological response are a real challenge for quantitative precipitation forecasting (QPF). It is relevant to assess the spatial domains to which QPF approaches are applicable. In this paper an attempt is made to evaluate the forecasting capability of a high-resolution numerical weather prediction (NWP) model by means of area-related QPF verification. The results presented concern two local convective events, which occurred in the Czech Republic (CR) on 13 and 15 July 2002 and caused local flash floods. We used the LM COSMO model (Lokal-Modell of the COSMO consortium) adapted to a horizontal resolution of 2.8 km over a model domain covering the CR. The 18 h forecast of convective precipitation was verified using radar rainfall totals adjusted to the measured rain gauge data. The grid-point root mean square error (RMSE) was calculated over a square around each grid point under the assumption that rainfall values were randomly distributed within the square. The forecast accuracy was characterized by the mean RMSE over the whole verification domain. We show how both the RMSE field and the mean RMSE depend on the square size. The importance of a suitable merger of the radar and rain gauge datasets is demonstrated by comparing the verification results obtained with and without the gauge adjustment. The application of the verification procedure reveals uncertainties in the precipitation forecasts. The model was also integrated with initial conditions shifted by 0.5°; the four verifications, corresponding to shifts in the four directions, show differences in the resulting QPF that depend on the size of the verification area and on the direction of the shift.
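
    To make the area-related verification concrete, the following is a minimal sketch in Python/NumPy (not the authors' code; the array layout, the half_width parameter, and the function names are illustrative assumptions) of the grid-point RMSE over a square neighbourhood and the domain-mean RMSE:

        import numpy as np

        def neighborhood_rmse(forecast, observed, half_width):
            """Grid-point RMSE over a (2*half_width+1)^2 square around each
            point, comparing forecast and observed rainfall within the square
            (illustrative; boundary points are left as NaN)."""
            ny, nx = forecast.shape
            rmse = np.full((ny, nx), np.nan)
            for j in range(half_width, ny - half_width):
                for i in range(half_width, nx - half_width):
                    f = forecast[j - half_width:j + half_width + 1,
                                 i - half_width:i + half_width + 1]
                    o = observed[j - half_width:j + half_width + 1,
                                 i - half_width:i + half_width + 1]
                    rmse[j, i] = np.sqrt(np.mean((f - o) ** 2))
            return rmse

        def mean_rmse(forecast, observed, half_width):
            """Mean RMSE over the verification domain, used to characterize
            overall forecast accuracy for a given square size."""
            return np.nanmean(neighborhood_rmse(forecast, observed, half_width))

    Re-running mean_rmse for increasing half_width values reproduces the kind of dependence on square size that the paper examines.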

  10. 42 CFR 457.380 - Eligibility verification.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... 42 Public Health 4 2012-10-01 2012-10-01 false Eligibility verification. 457.380 Section 457.380 Public Health CENTERS FOR MEDICARE & MEDICAID SERVICES, DEPARTMENT OF HEALTH AND HUMAN SERVICES... this chapter, the State must obtain the information through that service. (h) Interaction with...

  11. 42 CFR 457.380 - Eligibility verification.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... 42 Public Health 4 2014-10-01 2014-10-01 false Eligibility verification. 457.380 Section 457.380 Public Health CENTERS FOR MEDICARE & MEDICAID SERVICES, DEPARTMENT OF HEALTH AND HUMAN SERVICES...) Interaction with program integrity requirements. Nothing in this section should be construed as limiting...

  12. 9 CFR 416.17 - Agency verification.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... 9 Animals and Animal Products 2 2014-01-01 2014-01-01 false Agency verification. 416.17 Section 416.17 Animals and Animal Products FOOD SAFETY AND INSPECTION SERVICE, DEPARTMENT OF AGRICULTURE... (d) Direct observation or testing to assess the sanitary conditions in the establishment....

  13. 9 CFR 416.17 - Agency verification.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... 9 Animals and Animal Products 2 2012-01-01 2012-01-01 false Agency verification. 416.17 Section 416.17 Animals and Animal Products FOOD SAFETY AND INSPECTION SERVICE, DEPARTMENT OF AGRICULTURE... (d) Direct observation or testing to assess the sanitary conditions in the establishment....

  14. 9 CFR 416.17 - Agency verification.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... 9 Animals and Animal Products 2 2011-01-01 2011-01-01 false Agency verification. 416.17 Section 416.17 Animals and Animal Products FOOD SAFETY AND INSPECTION SERVICE, DEPARTMENT OF AGRICULTURE... (d) Direct observation or testing to assess the sanitary conditions in the establishment....

  15. Synthesis, verification, and optimization of systolic arrays

    SciTech Connect

    Rajopadhye, S.V.

    1986-01-01

    This dissertation addresses the issue of providing a sound theoretical basis for three important issues relating to systolic arrays, namely synthesis, verification, and optimization. Former research has concentrated on analyzing the dependency structure of the computation, and there have been numerous approaches to mapping this dependency structure onto a locally interconnected network. This study pursues a similar approach, but with a major generalization of the class of problems analyzed. In earlier research it was essential that the dependencies were expressible as constant vectors (from a point in the domain to the points that it depended on); here they are permitted to be arbitrary linear functions of the point. Theory is developed for synthesizing systolic architectures from such generalized specifications. A systematic (mechanizable) approach to the synthesis of systolic architectures that have control signals is also presented. In the areas of verification and optimization, a rigorous mathematical framework is presented that permits reasoning about the behavior of systolic arrays as functions on streams of data. Using this approach, the verification of such architectures reduces to the verification of functional programs.
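
    To make the generalization concrete, the contrast between the two dependency classes can be written as follows (the notation is ours, not the dissertation's):

        % Earlier work: uniform dependencies; each d_i is a constant vector,
        % so the value at point p depends on fixed offsets of p.
        v(p) \;\leftarrow\; f\bigl(v(p - d_1), \ldots, v(p - d_k)\bigr)

        % This work: affine dependencies; each A_i is a linear map and b_i
        % an offset, so the dependence may vary across the domain.
        v(p) \;\leftarrow\; f\bigl(v(A_1 p + b_1), \ldots, v(A_k p + b_k)\bigr)

    A constant dependency vector is recovered as the special case A_i = I, b_i = -d_i.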

  16. Verification of Autonomous Systems for Space Applications

    NASA Technical Reports Server (NTRS)

    Brat, G.; Denney, E.; Giannakopoulou, D.; Frank, J.; Jonsson, A.

    2006-01-01

    Autonomous software, especially if it is model-based, can play an important role in future space applications. For example, it can help streamline ground operations, assist in autonomous rendezvous and docking operations, or even help recover from problems (e.g., planners can be used to explore the space of recovery actions for a power subsystem and implement a solution without, or with minimal, human intervention). In general, the exploration capabilities of model-based systems give them great flexibility. Unfortunately, this also makes them unpredictable to our human eyes, both in terms of their execution and their verification. Traditional verification techniques are inadequate for these systems since they are mostly based on testing, which implies a very limited exploration of the behavioral space. In our work, we explore how advanced V&V techniques, such as static analysis, model checking, and compositional verification, can be used to gain trust in model-based systems. We also describe how synthesis can be used in the context of system reconfiguration and in the context of verification.
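
    As a minimal illustration of the exploration behaviour described above, the following Python sketch searches a space of recovery actions breadth-first; the state and action callbacks are hypothetical placeholders, not any actual flight-software API:

        from collections import deque

        def plan_recovery(initial_state, goal_test, actions, apply_action):
            """Breadth-first exploration of the recovery-action space.
            Returns the shortest action sequence reaching a goal state,
            or None if the reachable space is exhausted. States must be
            hashable so that revisits can be pruned."""
            frontier = deque([(initial_state, [])])
            visited = {initial_state}
            while frontier:
                state, path = frontier.popleft()
                if goal_test(state):
                    return path
                for action in actions(state):
                    nxt = apply_action(state, action)
                    if nxt not in visited:
                        visited.add(nxt)
                        frontier.append((nxt, path + [action]))
            return None

    Testing samples only a few paths through such a space, whereas model checking in effect performs this kind of exhaustive exploration over all reachable states, which is why it suits model-based systems better.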

  17. The ALMA Commissioning and Science Verification Team

    NASA Astrophysics Data System (ADS)

    Hales, A.; Sheth, K.; Wilson, T. L.

    2010-04-01

    The goal of Commissioning is to take ALMA from the stage reached at the end of AIV, that is, a system that functions at an engineering level, to an instrument that meets the science/astronomy requirements. Science Verification is the quantitative confirmation that the data produced by the instrument are valid and have the required characteristics in terms of sensitivity, image quality, and accuracy.

  18. An Interactive System for Graduation Verification.

    ERIC Educational Resources Information Center

    Wang, Y.; Dasarathy, B.

    1981-01-01

    This description of a computerized graduation verification system developed and implemented at the University of South Carolina at Columbia discusses the "table-driven" feature of the programs and details the implementation of the system, including examples of the Extended Backus Naur Form (EBNF) notation used to represent the system language…

  19. ENVIRONMENTAL TECHNOLOGY VERIFICATION OF BAGHOUSE FILTRATION PRODUCTS

    EPA Science Inventory

    The Environmental Technology Verification Program (ETV) was started by EPA in 1995 to generate independent credible data on the performance of innovative technologies that have potential to improve protection of public health and the environment. ETV does not approve or certify p...

  20. 20 CFR 325.6 - Verification procedures.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... Employees' Benefits RAILROAD RETIREMENT BOARD REGULATIONS UNDER THE RAILROAD UNEMPLOYMENT INSURANCE ACT REGISTRATION FOR RAILROAD UNEMPLOYMENT BENEFITS § 325.6 Verification procedures. The Board's procedures for adjudicating and processing applications and claims for unemployment benefits filed pursuant to this part...

  1. 20 CFR 325.6 - Verification procedures.

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... Employees' Benefits RAILROAD RETIREMENT BOARD REGULATIONS UNDER THE RAILROAD UNEMPLOYMENT INSURANCE ACT REGISTRATION FOR RAILROAD UNEMPLOYMENT BENEFITS § 325.6 Verification procedures. The Board's procedures for adjudicating and processing applications and claims for unemployment benefits filed pursuant to this part...

  2. 20 CFR 325.6 - Verification procedures.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... Employees' Benefits RAILROAD RETIREMENT BOARD REGULATIONS UNDER THE RAILROAD UNEMPLOYMENT INSURANCE ACT REGISTRATION FOR RAILROAD UNEMPLOYMENT BENEFITS § 325.6 Verification procedures. The Board's procedures for adjudicating and processing applications and claims for unemployment benefits filed pursuant to this part...

  3. 20 CFR 325.6 - Verification procedures.

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ... Employees' Benefits RAILROAD RETIREMENT BOARD REGULATIONS UNDER THE RAILROAD UNEMPLOYMENT INSURANCE ACT REGISTRATION FOR RAILROAD UNEMPLOYMENT BENEFITS § 325.6 Verification procedures. The Board's procedures for adjudicating and processing applications and claims for unemployment benefits filed pursuant to this part...

  4. 20 CFR 325.6 - Verification procedures.

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... Employees' Benefits RAILROAD RETIREMENT BOARD REGULATIONS UNDER THE RAILROAD UNEMPLOYMENT INSURANCE ACT REGISTRATION FOR RAILROAD UNEMPLOYMENT BENEFITS § 325.6 Verification procedures. The Board's procedures for adjudicating and processing applications and claims for unemployment benefits filed pursuant to this part...

  5. 21 CFR 123.8 - Verification.

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... 21 Food and Drugs 2 2012-04-01 2012-04-01 false Verification. 123.8 Section 123.8 Food and Drugs FOOD AND DRUG ADMINISTRATION, DEPARTMENT OF HEALTH AND HUMAN SERVICES (CONTINUED) FOOD FOR HUMAN... processor shall verify that the HACCP plan is adequate to control food safety hazards that are...

  6. 21 CFR 123.8 - Verification.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... 21 Food and Drugs 2 2011-04-01 2011-04-01 false Verification. 123.8 Section 123.8 Food and Drugs FOOD AND DRUG ADMINISTRATION, DEPARTMENT OF HEALTH AND HUMAN SERVICES (CONTINUED) FOOD FOR HUMAN... processor shall verify that the HACCP plan is adequate to control food safety hazards that are...

  7. Needs Assessment Project: FY 82 Verification Study.

    ERIC Educational Resources Information Center

    Shively, Joe E.; O'Donnell, Phyllis

    As part of a continuing assessment of educational needs in a seven-state region, researchers conducted a verification study to check the validity of educational needs first identified in fiscal year (FY) 1980. The seven states comprise Alabama, Kentucky, Ohio, Pennsylvania, Tennessee, Virginia, and West Virginia. This report describes assessment…

  8. 16 CFR 315.5 - Prescriber verification.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... Commercial Practices FEDERAL TRADE COMMISSION REGULATIONS UNDER SPECIFIC ACTS OF CONGRESS CONTACT LENS RULE § 315.5 Prescriber verification. (a) Prescription requirement. A seller may sell contact lenses only in accordance with a contact lens prescription for the patient that is: (1) Presented to the seller by...

  9. 16 CFR 315.5 - Prescriber verification.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... Commercial Practices FEDERAL TRADE COMMISSION REGULATIONS UNDER SPECIFIC ACTS OF CONGRESS CONTACT LENS RULE § 315.5 Prescriber verification. (a) Prescription requirement. A seller may sell contact lenses only in accordance with a contact lens prescription for the patient that is: (1) Presented to the seller by...

  10. Distilling the Verification Process for Prognostics Algorithms

    NASA Technical Reports Server (NTRS)

    Roychoudhury, Indranil; Saxena, Abhinav; Celaya, Jose R.; Goebel, Kai

    2013-01-01

    The goal of prognostics and health management (PHM) systems is to ensure system safety and reduce downtime and maintenance costs. It is important that a PHM system be verified and validated before it can be successfully deployed. Prognostics algorithms are integral parts of PHM systems. This paper investigates a systematic process for the verification of such prognostics algorithms. To this end, the paper first distinguishes between technology maturation and product development, and then describes the verification process for a prognostics algorithm as it moves up to higher maturity levels. This is shown to be an iterative process in which verification activities are interleaved with validation activities at each maturation level. In this work, we adopt the concept of technology readiness levels (TRLs) to represent the different maturity levels of a prognostics algorithm. At each TRL, verifying a prognostics algorithm depends on verifying its different components according to the requirements laid out by the PHM system that adopts it. Finally, using simplified examples, the systematic verification process is demonstrated as the prognostics algorithm moves up TRLs.

  11. 14 CFR 460.17 - Verification program.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... software in an operational flight environment before allowing any space flight participant on board during... 14 Aeronautics and Space 4 2013-01-01 2013-01-01 false Verification program. 460.17 Section 460.17 Aeronautics and Space COMMERCIAL SPACE TRANSPORTATION, FEDERAL AVIATION ADMINISTRATION, DEPARTMENT...

  12. 14 CFR 460.17 - Verification program.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... software in an operational flight environment before allowing any space flight participant on board during... 14 Aeronautics and Space 4 2010-01-01 2010-01-01 false Verification program. 460.17 Section 460.17 Aeronautics and Space COMMERCIAL SPACE TRANSPORTATION, FEDERAL AVIATION ADMINISTRATION, DEPARTMENT...

  13. Environmental Technology Verification (ETV) Quality Program (Poster)

    EPA Science Inventory

    This is a poster created for the ETV Quality Program. The EPA Environmental Technology Verification Program (ETV) develops test protocols and verifies the performance of innovative technologies that have the potential to improve protection of human health and the environment. The...

  14. Hardware verification at Computational Logic, Inc.

    NASA Technical Reports Server (NTRS)

    Brock, Bishop C.; Hunt, Warren A., Jr.

    1990-01-01

    The following topics are covered in viewgraph form: (1) hardware verification; (2) Boyer-Moore logic; (3) core RISC; (4) the FM8502 fabrication, implementation specification, and pinout; (5) hardware description language; (6) arithmetic logic generator; (7) near term expected results; (8) present trends; (9) future directions; (10) collaborations and technology transfer; and (11) technology enablers.

  15. 24 CFR 257.112 - Mortgagee verifications.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... URBAN DEVELOPMENT MORTGAGE AND LOAN INSURANCE PROGRAMS UNDER NATIONAL HOUSING ACT AND OTHER AUTHORITIES... income. (b) Mortgage fraud verification. The mortgagor shall provide a certification to the mortgagee that the mortgagor has not been convicted under federal or state law for fraud during the...

  16. 24 CFR 257.112 - Mortgagee verifications.

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... URBAN DEVELOPMENT MORTGAGE AND LOAN INSURANCE PROGRAMS UNDER NATIONAL HOUSING ACT AND OTHER AUTHORITIES... income. (b) Mortgage fraud verification. The mortgagor shall provide a certification to the mortgagee that the mortgagor has not been convicted under federal or state law for fraud during the...

  17. Gender, Legitimation, and Identity Verification in Groups

    ERIC Educational Resources Information Center

    Burke, Peter J.; Stets, Jan E.; Cerven, Christine

    2007-01-01

    Drawing upon identity theory, expectation states theory, and legitimation theory, we examine how the task leader identity in task-oriented groups is more likely to be verified for persons with high status characteristics. We hypothesize that identity verification will be accomplished more readily for male group members and legitimated task leaders…

  18. Environmental Technology Verification Program Fact Sheet

    EPA Science Inventory

    This is a Fact Sheet for the ETV Program. The EPA Environmental Technology Verification Program (ETV) develops test protocols and verifies the performance of innovative technologies that have the potential to improve protection of human health and the environment. The program ...

  19. 18 CFR 12.13 - Verification form.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... 18 Conservation of Power and Water Resources 1 2011-04-01 2011-04-01 false Verification form. 12.13 Section 12.13 Conservation of Power and Water Resources FEDERAL ENERGY REGULATORY COMMISSION, DEPARTMENT OF ENERGY REGULATIONS UNDER THE FEDERAL POWER ACT SAFETY OF WATER POWER PROJECTS AND PROJECT...

  20. 18 CFR 12.13 - Verification form.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... 18 Conservation of Power and Water Resources 1 2010-04-01 2010-04-01 false Verification form. 12.13 Section 12.13 Conservation of Power and Water Resources FEDERAL ENERGY REGULATORY COMMISSION, DEPARTMENT OF ENERGY REGULATIONS UNDER THE FEDERAL POWER ACT SAFETY OF WATER POWER PROJECTS AND PROJECT...