Sample records for distributed location verification

  1. Influence of the Redundant Verification and the Non-Redundant Verification on the Hydraulic Tomography

    NASA Astrophysics Data System (ADS)

    Wei, T. B.; Chen, Y. L.; Lin, H. R.; Huang, S. Y.; Yeh, T. C. J.; Wen, J. C.

    2016-12-01

    In groundwater studies, hydraulic tomography (HT) based on field-site pumping tests has been widely used to inversely estimate the heterogeneous spatial distribution of aquifer hydraulic properties, since most field-site aquifers exhibit spatially heterogeneous hydrogeological parameters. Among the HT approaches proposed, Huang et al. [2011] applied a non-redundant verification analysis, in which the pumping well locations are changed while the observation well locations are held fixed, to both the inverse and forward analyses, demonstrating the feasibility of estimating the heterogeneous hydraulic property distribution of a field-site aquifer with a steady-state model. The existing literature, however, addresses only this steady-state, non-redundant case; the other combinations, pumping wells fixed or changed with observation wells fixed (redundant verification) or with observation wells changed (non-redundant verification), and their influence on HT have not yet been explored. In this study, both the redundant and the non-redundant verification methods were applied to the forward analysis to examine their influence on HT under transient conditions. The methods were then applied to an actual case at the NYUST campus site, demonstrating the effectiveness of HT and confirming the feasibility of the inverse and forward analyses. Keywords: Hydraulic Tomography, Redundant Verification, Heterogeneous, Inverse, Forward

  2. Collaborative Localization and Location Verification in WSNs

    PubMed Central

    Miao, Chunyu; Dai, Guoyong; Ying, Kezhen; Chen, Qingzhang

    2015-01-01

    Localization is one of the most important technologies in wireless sensor networks. A lightweight distributed node localization scheme is proposed by considering the limited computational capacity of WSNs. The proposed scheme introduces the virtual force model to determine the location by incremental refinement. Aiming at solving the drifting problem and the malicious anchor problem, a location verification algorithm based on the virtual force model is presented. In addition, an anchor promotion algorithm using the localization reliability model is proposed to re-locate the drifted nodes. Extended simulation experiments indicate that the localization algorithm has relatively high precision and the location verification algorithm has relatively high accuracy. The communication overhead of these algorithms is relatively low, and the whole set of reliable localization methods is practical as well as comprehensive. PMID:25954948
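
    As a concrete illustration of the virtual-force idea, the following minimal Python sketch refines a position estimate by treating each range measurement to an anchor as a spring force; the step size, force model, and convergence test are assumptions for illustration, not the authors' exact formulation.

      # Virtual-force position refinement (assumed model): each ranging
      # measurement to an anchor exerts a spring-like force proportional to
      # the range error, and the estimate is refined incrementally until the
      # net force is negligible.
      import math

      def refine_position(est, anchors, step=0.1, iters=200, tol=1e-4):
          """est: (x, y) initial guess; anchors: list of ((x, y), measured_range)."""
          x, y = est
          for _ in range(iters):
              fx = fy = 0.0
              for (ax, ay), r_meas in anchors:
                  dx, dy = x - ax, y - ay
                  d = math.hypot(dx, dy) or 1e-9   # avoid division by zero
                  err = d - r_meas                 # positive: currently too far away
                  fx -= err * dx / d               # pull/push along the anchor line
                  fy -= err * dy / d
              x += step * fx
              y += step * fy
              if math.hypot(fx, fy) < tol:         # forces balanced: converged
                  break
          return x, y

      # Example: true node at (3, 4); three anchors with ideal range readings.
      anchors = [((0.0, 0.0), 5.0),
                 ((10.0, 0.0), math.hypot(7.0, 4.0)),
                 ((0.0, 10.0), math.hypot(3.0, 6.0))]
      print(refine_position((5.0, 5.0), anchors))  # converges near (3, 4)

    This is simply gradient descent on the summed squared range errors, which is why a drifted or maliciously reported anchor shows up as an unbalanced residual force, the kind of quantity a verification step can test.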

  3. Tritium as an indicator of venues for nuclear tests.

    PubMed

    Lyakhova, O N; Lukashenko, S N; Mulgin, S I; Zhdanov, S V

    2013-10-01

    Currently, due to the Treaty on the Non-Proliferation of Nuclear Weapons, accurate verification of nuclear explosion venues is a highly topical issue. This paper proposes a new verification method that uses tritium as an indicator. Detailed studies of the tritium content in the air were carried out at the locations of underground nuclear tests, the "Balapan" and "Degelen" testing grounds at the Semipalatinsk Test Site. The paper presents data on the levels and distribution of tritium in the air where tunnels and boreholes are located - explosion epicentres, wellheads and tunnel portals - as well as in estuarine areas of the venues for the underground nuclear explosions (UNE). Copyright © 2013 Elsevier Ltd. All rights reserved.

  4. An Efficient Location Verification Scheme for Static Wireless Sensor Networks.

    PubMed

    Kim, In-Hwan; Kim, Bo-Sung; Song, JooSeok

    2017-01-24

    In wireless sensor networks (WSNs), the accuracy of location information is vital to support many interesting applications. Unfortunately, sensors have difficulty in estimating their location when malicious sensors attack the location estimation process. Even though secure localization schemes have been proposed to protect the location estimation process from attacks, they are not enough to eliminate wrong location estimations in some situations. Location verification can be the solution to these situations, or serve as a second line of defense. The problem with most location verification schemes is the explicit involvement of many sensors in the verification process and requirements such as special hardware, a dedicated verifier, or a trusted third party, which cause more communication and computation overhead. In this paper, we propose an efficient location verification scheme for static WSNs called mutually-shared region-based location verification (MSRLV), which reduces those overheads by utilizing the implicit involvement of sensors and eliminating several requirements. In order to achieve this, we use the mutually-shared region between the location claimant and the verifier for the location verification. The analysis shows that MSRLV reduces communication overhead by 77% and computation overhead by 92% on average, when compared with other location verification schemes, in a single sensor verification. In addition, simulation results for the verification of the whole network show that MSRLV can detect malicious sensors by over 90% when sensors in the network have five or more neighbors.
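
    The core geometric test can be illustrated with a short Python sketch (a simplification under an assumed common radio range; the actual MSRLV message exchange and its implicit involvement of neighbors are not modeled here): a claim is plausible only if it lies in the region the claimant provably shares with the verifier and with every neighbor that also hears it.

      # Shared-region plausibility test (illustrative simplification): any node
      # that can communicate with the claimant knows the claimant lies within
      # radio range r of itself, so the claimed location must fall inside the
      # intersection of all such disks.
      import math

      RADIO_RANGE = 30.0  # assumed common communication radius, in meters

      def in_range(p, q, r=RADIO_RANGE):
          return math.dist(p, q) <= r

      def verify_claim(claimed_loc, verifier_loc, neighbor_locs):
          """Accept only if the claim is consistent with every link that
          demonstrably exists (verifier and neighbors all hear the claimant)."""
          if not in_range(claimed_loc, verifier_loc):
              return False              # claim lies outside the verifier's disk
          return all(in_range(claimed_loc, n) for n in neighbor_locs)

      # A node really near (10, 10) claiming (100, 100) fails the check:
      print(verify_claim((100.0, 100.0), (0.0, 0.0), [(15.0, 5.0)]))  # False
      print(verify_claim((10.0, 10.0), (0.0, 0.0), [(15.0, 5.0)]))    # True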

  6. Full-chip level MEEF analysis using model based lithography verification

    NASA Astrophysics Data System (ADS)

    Kim, Juhwan; Wang, Lantian; Zhang, Daniel; Tang, Zongwu

    2005-11-01

    MEEF (Mask Error Enhancement Factor) has become a critical factor in CD uniformity control since the optical lithography process moved into the sub-resolution era. Many studies have quantified the impact of mask CD (Critical Dimension) errors on wafer CD errors [1-2]. However, the benefits of those studies were restricted to small pattern areas of the full-chip data due to long simulation times. As fast turnaround time can be achieved for complicated verifications on very large data sets by linearly scalable distributed processing technology, model-based lithography verification becomes feasible for various types of applications, such as post-mask-synthesis data sign-off for mask tape-out in production and lithography process development with full-chip data [3-5]. In this study, we introduce two useful methodologies for full-chip level verification of the mask error impact on the wafer lithography patterning process. One methodology is to check the MEEF distribution, in addition to the CD distribution, through the process window, which can be used for RET/OPC optimization at the R&D stage. The other is to check mask error sensitivity on potential pinch and bridge hotspots through lithography process variation, where the outputs can be passed on to mask CD metrology to add CD measurements at those hotspot locations. Two different OPC data sets were compared using the two methodologies in this study.

  7. A Secure Framework for Location Verification in Pervasive Computing

    NASA Astrophysics Data System (ADS)

    Liu, Dawei; Lee, Moon-Chuen; Wu, Dan

    The relatively new pervasive computing paradigm has changed the way people use computing devices. For example, a person can use a mobile device to obtain its location information at any time and anywhere. There are several security issues concerning whether this information is reliable in a pervasive environment. For example, a malicious user may disable the localization system by broadcasting a forged location, and may impersonate other users by eavesdropping on their locations. In this paper, we address the verification of location information in a secure manner. We first present the design challenges for location verification, and then propose a two-layer framework, VerPer, for secure location verification in a pervasive computing environment. Real-world GPS-based wireless sensor network experiments confirm the effectiveness of the proposed framework.

  8. Verification of statistical method CORN for modeling of microfuel in the case of high grain concentration

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chukbar, B. K., E-mail: bchukbar@mail.ru

    Two methods of modeling a double-heterogeneity fuel are studied: the deterministic positioning and the statistical method CORN of the MCU software package. The effect of the distribution of microfuel in a pebble bed on the calculation results is studied. The results of verification of the statistical method CORN for microfuel concentrations up to 170 cm⁻³ in a pebble bed are presented. The admissibility of homogenization of the microfuel coating with the graphite matrix is studied. The dependence of the reactivity on the relative location of fuel and graphite spheres in a pebble bed is found.

  9. Probabilistic Elastic Part Model: A Pose-Invariant Representation for Real-World Face Verification.

    PubMed

    Li, Haoxiang; Hua, Gang

    2018-04-01

    Pose variation remains a major challenge for real-world face recognition. We approach this problem through a probabilistic elastic part model. We extract local descriptors (e.g., LBP or SIFT) from densely sampled multi-scale image patches. By augmenting each descriptor with its location, a Gaussian mixture model (GMM) is trained to capture the spatial-appearance distribution of the face parts of all face images in the training corpus, namely the probabilistic elastic part (PEP) model. Each mixture component of the GMM is confined to be a spherical Gaussian to balance the influence of the appearance and the location terms, which naturally defines a part. Given one or multiple face images of the same subject, the PEP model builds its PEP representation by sequentially concatenating descriptors identified by each Gaussian component in a maximum likelihood sense. We further propose a joint Bayesian adaptation algorithm to adapt the universally trained GMM to better model the pose variations between the target pair of faces/face tracks, which consistently improves face verification accuracy. Our experiments show that we achieve state-of-the-art face verification accuracy with the proposed representations on the Labeled Faces in the Wild (LFW) dataset, the YouTube video face database, and the CMU MultiPIE dataset.
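
    The representation-building step lends itself to a short sketch in Python (a simplification that assumes an already-trained spherical GMM; descriptor extraction, GMM training, and the joint Bayesian adaptation are omitted).

      # Building a PEP-style representation: for each spherical GMM component,
      # keep the location-augmented descriptor that maximizes that component's
      # likelihood, then concatenate the picks.
      import numpy as np

      def pep_representation(descriptors, means, variances):
          """descriptors: (N, D) location-augmented descriptors of one image.
          means: (K, D) component means; variances: (K,) spherical variances."""
          picks = []
          for mu, var in zip(means, variances):
              # Spherical-Gaussian log-likelihood up to a per-component constant;
              # the constant is shared by all descriptors, so argmax is unaffected.
              ll = -np.sum((descriptors - mu) ** 2, axis=1) / (2.0 * var)
              picks.append(descriptors[np.argmax(ll)])
          return np.concatenate(picks)      # length K * D representation

      rng = np.random.default_rng(0)
      desc = rng.normal(size=(500, 130))    # e.g., 128-d SIFT + 2-d location
      means = rng.normal(size=(8, 130))     # stand-in for a trained 8-part model
      variances = np.ones(8)
      print(pep_representation(desc, means, variances).shape)   # (1040,)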

  10. Polarization-multiplexed plasmonic phase generation with distributed nanoslits.

    PubMed

    Lee, Seung-Yeol; Kim, Kyuho; Lee, Gun-Yeal; Lee, Byoungho

    2015-06-15

    Methods for multiplexing surface plasmon polaritons (SPPs) have been attracting much attention due to their potential for plasmonic integrated systems, plasmonic holography, and optical tweezing. Here, using closely spaced distributed nanoslits, we propose a method for generating polarization-multiplexed SPP phase profiles which can be applied to implement general SPP phase distributions. Two independent types of SPP phase generation mechanisms - polarization-independent and polarization-reversible ones - are combined to generate fully arbitrary phase profiles for each optical handedness. As a simple verification of the proposed scheme, we experimentally demonstrate that the location of a plasmonic focus can be arbitrarily designed, and switched by changing the optical handedness.

  11. GENERIC VERIFICATION PROTOCOL: DISTRIBUTED GENERATION AND COMBINED HEAT AND POWER FIELD TESTING PROTOCOL

    EPA Science Inventory

    This report is a generic verification protocol by which EPA’s Environmental Technology Verification program tests newly developed equipment for distributed generation of electric power, usually micro-turbine generators and internal combustion engine generators. The protocol will ...

  12. Advanced Distributed Measurements and Data Processing at the Vibro-Acoustic Test Facility, GRC Space Power Facility, Sandusky, Ohio - an Architecture and an Example

    NASA Technical Reports Server (NTRS)

    Hill, Gerald M.; Evans, Richard K.

    2009-01-01

    A large-scale, distributed, high-speed data acquisition system (HSDAS) is currently being installed at the Space Power Facility (SPF) at NASA Glenn Research Center's Plum Brook Station in Sandusky, OH. This installation is being done as part of a facility construction project to add Vibro-acoustic Test Capabilities (VTC) to the current thermal-vacuum testing capability of SPF in support of the Orion Project's requirement for Space Environments Testing (SET). The HSDAS architecture is a modular design, which utilizes fully remotely managed components, enabling the system to support multiple test locations with a wide range of measurement types and a very large system channel count. The architecture of the system is presented along with details on system scalability and measurement verification. In addition, the ability of the system to automate many of its processes, such as measurement verification and measurement system analysis, is also discussed.

  13. Protecting Privacy and Securing the Gathering of Location Proofs - The Secure Location Verification Proof Gathering Protocol

    NASA Astrophysics Data System (ADS)

    Graham, Michelle; Gray, David

    As wireless networks become increasingly ubiquitous, the demand for a method of locating a device has increased dramatically. Location Based Services are now commonplace but there are few methods of verifying or guaranteeing a location provided by a user without some specialised hardware, especially in larger scale networks. We propose a system for the verification of location claims, using proof gathered from neighbouring devices. In this paper we introduce a protocol to protect this proof gathering process, protecting the privacy of all involved parties and securing it from intruders and malicious claiming devices. We present the protocol in stages, extending the security of this protocol to allow for flexibility within its application. The Secure Location Verification Proof Gathering Protocol (SLVPGP) has been designed to function within the area of Vehicular Networks, although its application could be extended to any device with wireless & cryptographic capabilities.
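
    To make the proof-gathering idea concrete, here is a minimal Python sketch (an illustration only; the SLVPGP's actual message formats, privacy protections, and staged extensions are not reproduced here): each neighbouring device endorses a location claim with a keyed MAC, and the verifier accepts the claim only when enough endorsements check out.

      # Illustrative location-proof gathering: neighbours endorse a claim by
      # MACing it with keys they share with the verifier; the verifier then
      # recomputes each MAC and requires a quorum of valid endorsements.
      import hmac, hashlib, json

      def endorse(neighbor_id, key, claim):
          """A neighbour that heard the claimant endorses (claimant, loc, time)."""
          msg = json.dumps(claim, sort_keys=True).encode()
          return neighbor_id, hmac.new(key, msg, hashlib.sha256).hexdigest()

      def verify(claim, proofs, keys, quorum=2):
          msg = json.dumps(claim, sort_keys=True).encode()
          valid = sum(1 for nid, tag in proofs
                      if nid in keys and hmac.compare_digest(
                          tag, hmac.new(keys[nid], msg, hashlib.sha256).hexdigest()))
          return valid >= quorum

      keys = {"n1": b"key-n1", "n2": b"key-n2"}    # verifier-neighbour shared keys
      claim = {"claimant": "car42", "loc": [51.9, -8.5], "t": 1700000000}
      proofs = [endorse("n1", keys["n1"], claim), endorse("n2", keys["n2"], claim)]
      print(verify(claim, proofs, keys))           # True with a two-proof quorum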

  14. Software Tools for Formal Specification and Verification of Distributed Real-Time Systems.

    DTIC Science & Technology

    1997-09-30

    ...set of software tools for specification and verification of distributed real-time systems using formal methods. The task of this SBIR Phase II effort...to be used by designers of real-time systems for early detection of errors. The mathematical complexity of formal specification and verification has...

  15. Space station data management system - A common GSE test interface for systems testing and verification

    NASA Technical Reports Server (NTRS)

    Martinez, Pedro A.; Dunn, Kevin W.

    1987-01-01

    This paper examines the fundamental problems and goals associated with test, verification, and flight-certification of man-rated distributed data systems. First, a summary of the characteristics of modern computer systems that affect the testing process is provided. Then, verification requirements are expressed in terms of an overall test philosophy for distributed computer systems. This test philosophy stems from previous experience that was gained with centralized systems (Apollo and the Space Shuttle), and deals directly with the new problems that verification of distributed systems may present. Finally, a description of potential hardware and software tools to help solve these problems is provided.

  16. Verification of operational solar flare forecast: Case of Regional Warning Center Japan

    NASA Astrophysics Data System (ADS)

    Kubo, Yûki; Den, Mitsue; Ishii, Mamoru

    2017-08-01

    In this article, we discuss a verification study of an operational solar flare forecast in the Regional Warning Center (RWC) Japan. The RWC Japan has been issuing four-categorical deterministic solar flare forecasts for a long time. In this forecast verification study, we used solar flare forecast data accumulated over 16 years (from 2000 to 2015). We compiled the forecast data together with solar flare data obtained with the Geostationary Operational Environmental Satellites (GOES). Using the compiled data sets, we estimated some conventional scalar verification measures with 95% confidence intervals. We also estimated a multi-categorical scalar verification measure. These scalar verification measures were compared with those obtained by the persistence method and recurrence method. As solar activity varied during the 16 years, we also applied verification analyses to four subsets of forecast-observation pair data with different solar activity levels. We cannot conclude definitely that there are significant performance differences between the forecasts of RWC Japan and the persistence method, although a slightly significant difference is found for some event definitions. We propose to use a scalar verification measure to assess the judgment skill of the operational solar flare forecast. Finally, we propose a verification strategy for deterministic operational solar flare forecasting. For dichotomous forecast, a set of proposed verification measures is a frequency bias for bias, proportion correct and critical success index for accuracy, probability of detection for discrimination, false alarm ratio for reliability, Peirce skill score for forecast skill, and symmetric extremal dependence index for association. For multi-categorical forecast, we propose a set of verification measures as marginal distributions of forecast and observation for bias, proportion correct for accuracy, correlation coefficient and joint probability distribution for association, the likelihood distribution for discrimination, the calibration distribution for reliability and resolution, and the Gandin-Murphy-Gerrity score and judgment skill score for skill.
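
    The dichotomous measures proposed above all derive from a 2x2 contingency table; a small Python sketch with the standard textbook definitions (and made-up counts) makes the list concrete.

      # Dichotomous verification measures from a 2x2 contingency table
      # (a = hits, b = false alarms, c = misses, d = correct rejections).
      import math

      def dichotomous_scores(a, b, c, d):
          n = a + b + c + d
          H = a / (a + c)                    # hit rate (probability of detection)
          F = b / (b + d)                    # false alarm rate
          return {
              "frequency_bias": (a + b) / (a + c),          # bias
              "proportion_correct": (a + d) / n,            # accuracy
              "critical_success_index": a / (a + b + c),    # accuracy
              "probability_of_detection": H,                # discrimination
              "false_alarm_ratio": b / (a + b),             # reliability
              "peirce_skill_score": H - F,                  # forecast skill
              # Symmetric extremal dependence index (association):
              "sedi": (math.log(F) - math.log(H)
                       - math.log(1 - F) + math.log(1 - H))
                      / (math.log(F) + math.log(H)
                         + math.log(1 - F) + math.log(1 - H)),
          }

      # Toy example: daily yes/no flare forecasts collapsed into one table.
      print(dichotomous_scores(a=120, b=80, c=60, d=5580))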

  17. A Comparative Study of Two Azimuth Based Non Standard Location Methods

    DTIC Science & Technology

    2017-03-23

    Rongsong Jih, U.S. Department of State / Arms Control, Verification, and Compliance Bureau, 2201 C Street, NW, Washington, DC...The so-called "Yin Zhong Xian" ("引中线" in Chinese, literally "drawing the middle line") algorithm, hereafter the YZX method, is an Oriental version of the IPB-based procedure. It...

  18. RF model of the distribution system as a communication channel, phase 2. Volume 2: Task reports

    NASA Technical Reports Server (NTRS)

    Rustay, R. C.; Gajjar, J. T.; Rankin, R. W.; Wentz, R. C.; Wooding, R.

    1982-01-01

    Based on the established feasibility of predicting, via a model, the propagation of Power Line Frequency on radial type distribution feeders, verification studies comparing model predictions against measurements were undertaken using more complicated feeder circuits and situations. Detailed accounts of the major tasks are presented. These include: (1) verification of model; (2) extension, implementation, and verification of perturbation theory; (3) parameter sensitivity; (4) transformer modeling; and (5) compensation of power distribution systems for enhancement of power line carrier communication reliability.

  19. SU-D-BRC-03: Development and Validation of an Online 2D Dose Verification System for Daily Patient Plan Delivery Accuracy Check

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zhao, J; Hu, W; Xing, Y

    Purpose: All plan verification systems for particle therapy are designed to do plan verification before treatment. However, the actual dose distributions during patient treatment are not known. This study develops an online 2D dose verification tool to check the daily dose delivery accuracy. Methods: A Siemens particle treatment system with a modulated scanning spot beam is used in our center. In order to do online dose verification, we made a program to reconstruct the delivered 2D dose distributions based on the daily treatment log files and depth dose distributions. In the log files we can get the focus size, position and particle number for each spot. A gamma analysis is used to compare the reconstructed dose distributions with the dose distributions from the TPS to assess the daily dose delivery accuracy. To verify the dose reconstruction algorithm, we compared the reconstructed dose distributions to dose distributions measured using a PTW 729XDR ion chamber matrix for 13 real patient plans. Then we analyzed 100 treatment beams (58 carbon and 42 proton) for prostate, lung, ACC, NPC and chordoma patients. Results: For algorithm verification, the gamma passing rate was 97.95% for the 3%/3mm and 92.36% for the 2%/2mm criteria. For patient treatment analysis, the results were 97.7%±1.1% and 91.7%±2.5% for carbon and 89.9%±4.8% and 79.7%±7.7% for proton using 3%/3mm and 2%/2mm criteria, respectively. The reason for the lower passing rate for the proton beam is that the focus size deviations were larger than for the carbon beam. The average focus size deviations were −14.27% and −6.73% for proton and −5.26% and −0.93% for carbon in the x and y directions, respectively. Conclusion: The verification software meets our requirements to check for daily dose delivery discrepancies. Such tools can enhance the current treatment plan and delivery verification processes and improve the safety of clinical treatments.
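
    The comparison step is a standard gamma analysis, which a brute-force 2D Python sketch can illustrate (grid spacing, criteria, low-dose threshold, and data here are assumptions; clinical implementations interpolate and optimize far more carefully).

      # Brute-force global 2D gamma analysis: a reference point passes if some
      # evaluated point satisfies the combined dose-difference (dd) and
      # distance-to-agreement (dta) criterion, i.e. min gamma <= 1.
      import numpy as np

      def gamma_passing_rate(dose_eval, dose_ref, spacing_mm,
                             dd=0.03, dta_mm=3.0, threshold=0.10):
          """dose_eval, dose_ref: 2D arrays on the same grid; dd is a fraction
          of the global max dose; threshold masks out low-dose points."""
          ny, nx = dose_ref.shape
          yy, xx = np.mgrid[0:ny, 0:nx]
          dmax = dose_ref.max()
          mask = dose_ref >= threshold * dmax
          passed = total = 0
          for i, j in zip(*np.nonzero(mask)):
              dist2 = ((yy - i) ** 2 + (xx - j) ** 2) * spacing_mm ** 2
              ddiff2 = (dose_eval - dose_ref[i, j]) ** 2
              gamma2 = dist2 / dta_mm ** 2 + ddiff2 / (dd * dmax) ** 2
              passed += gamma2.min() <= 1.0
              total += 1
          return passed / total

      rng = np.random.default_rng(1)
      ref = rng.random((40, 40)) + 1.0                    # synthetic dose plane
      ev = ref + rng.normal(scale=0.01, size=ref.shape)   # small perturbation
      print(gamma_passing_rate(ev, ref, spacing_mm=2.0))  # close to 1.0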

  20. Deformable structure registration of bladder through surface mapping.

    PubMed

    Xiong, Li; Viswanathan, Akila; Stewart, Alexandra J; Haker, Steven; Tempany, Clare M; Chin, Lee M; Cormack, Robert A

    2006-06-01

    Cumulative dose distributions in fractionated radiation therapy depict the dose to normal tissues and therefore may permit an estimation of the risk of normal tissue complications. However, calculation of these distributions is highly challenging because of interfractional changes in the geometry of patient anatomy. This work presents an algorithm for deformable structure registration of the bladder and the verification of the accuracy of the algorithm using phantom and patient data. In this algorithm, the registration process involves conformal mapping of genus zero surfaces using finite element analysis, and guided by three control landmarks. The registration produces a correspondence between fractions of the triangular meshes used to describe the bladder surface. For validation of the algorithm, two types of balloons were inflated gradually to three times their original size, and several computerized tomography (CT) scans were taken during the process. The registration algorithm yielded a local accuracy of 4 mm along the balloon surface. The algorithm was then applied to CT data of patients receiving fractionated high-dose-rate brachytherapy to the vaginal cuff, with the vaginal cylinder in situ. The patients' bladder filling status was intentionally different for each fraction. The three required control landmark points were identified for the bladder based on anatomy. Out of an Institutional Review Board (IRB) approved study of 20 patients, 3 had radiographically identifiable points near the bladder surface that were used for verification of the accuracy of the registration. The verification point as seen in each fraction was compared with its predicted location based on affine as well as deformable registration. Despite the variation in bladder shape and volume, the deformable registration was accurate to 5 mm, consistently outperforming the affine registration. We conclude that the structure registration algorithm presented works with reasonable accuracy and provides a means of calculating cumulative dose distributions.

  2. Experimental verification of distributed piezoelectric actuators for use in precision space structures

    NASA Technical Reports Server (NTRS)

    Crawley, E. F.; De Luis, J.

    1986-01-01

    An analytic model for structures with distributed piezoelectric actuators is experimentally verified for the cases of both surface-bonded and embedded actuators. A technique for selecting the locations of such piezoelectric actuators has been developed; it indicates that segmented actuators are always more effective than continuous ones, since the output of each can be individually controlled. Manufacturing techniques for bonding or embedding segmented piezoelectric actuators have also been developed which allow independent electrical contact to be made with each actuator. Static tests have been conducted to determine how the elastic properties of the composite are affected by the presence of an embedded actuator, for the case of glass/epoxy laminates.

  3. Atmospheric transport modelling in support of CTBT verification—overview and basic concepts

    NASA Astrophysics Data System (ADS)

    Wotawa, Gerhard; De Geer, Lars-Erik; Denier, Philippe; Kalinowski, Martin; Toivonen, Harri; D'Amours, Real; Desiato, Franco; Issartel, Jean-Pierre; Langer, Matthias; Seibert, Petra; Frank, Andreas; Sloan, Craig; Yamazawa, Hiromi

    Under the provisions of the Comprehensive Nuclear-Test-Ban Treaty (CTBT), a global monitoring system comprising different verification technologies is currently being set up. The network will include 80 radionuclide (RN) stations distributed all over the globe that measure treaty-relevant radioactive species. While the seismic subsystem cannot distinguish between chemical and nuclear explosions, RN monitoring would provide the "smoking gun" of a possible treaty violation. Atmospheric transport modelling (ATM) will be an integral part of CTBT verification, since it provides a geo-temporal location capability for the RN technology. In this paper, the basic concept for the future ATM software system to be installed at the International Data Centre is laid out. The system is based on the operational computation of multi-dimensional source-receptor sensitivity fields for all RN samples by means of adjoint tracer transport modelling. While the source-receptor matrix methodology has already been applied in the past, the system that we suggest will be unique and unprecedented, since it is global, real-time and aims at uncovering source scenarios that are compatible with measurements. Furthermore, it has to deal with source dilution ratios that are by orders of magnitude larger than in typical transport model applications. This new verification software will need continuous scientific attention, and may well provide a prototype system for future applications in areas of environmental monitoring, emergency response and verification of other international agreements and treaties.
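
    The source-receptor matrix idea can be illustrated with a small Python sketch (not the IDC software; the dimensions, dilution factors, and nonnegative least-squares screening are assumptions for illustration): adjoint runs give a matrix mapping candidate unit releases to sampled concentrations, and compatible source scenarios are those that reproduce the measurements.

      # Screening source scenarios against measurements with a source-receptor
      # sensitivity matrix M: M[i, j] maps a unit release in candidate source
      # cell j to the concentration seen in sample i.
      import numpy as np
      from scipy.optimize import nnls

      rng = np.random.default_rng(2)
      n_samples, n_cells = 60, 40
      M = rng.random((n_samples, n_cells)) * 1e-18   # very large dilution ratios
      true_release = np.zeros(n_cells)
      true_release[17] = 1e15                        # Bq released in cell 17
      c = M @ true_release                           # simulated measurements

      release_est, residual = nnls(M, c)             # nonnegative least squares
      print(int(np.argmax(release_est)), residual)   # cell 17, near-zero residual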

  4. Mineral mapping in the Maherabad area, eastern Iran, using the HyMap remote sensing data

    NASA Astrophysics Data System (ADS)

    Molan, Yusuf Eshqi; Refahi, Davood; Tarashti, Ali Hoseinmardi

    2014-04-01

    This study applies matched filtering to HyMap airborne hyperspectral data to obtain a distribution map of alteration minerals in the Maherabad area and uses virtual verification to verify the results. This paper also introduces a "moving threshold", which seeks an appropriate threshold value for converting the gray-scale images produced by mapping methods into target and background pixels. The Maherabad area, located in the eastern part of the Lut block, is a Cu-Au porphyry system in which quartz-sericite-pyrite, argillic and propylitic alteration are most common. A minimum noise fraction transform coupled with a pixel purity index was applied to the HyMap images to extract the endmembers of the alteration minerals, including kaolinite, montmorillonite, sericite (muscovite/illite), calcite, chlorite, epidote, and goethite. Since there was no access to a portable spectrometer or lab spectral measurements for verification of the remote sensing imagery results, virtual verification was carried out using the USGS spectral library and showed an agreement of 83.19%. The comparison between the results of the matched filtering and X-ray diffraction (XRD) analyses also showed an agreement of 56.13%.
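
    The mapping-plus-thresholding pipeline can be sketched in Python (illustrative only; since the paper's "moving threshold" procedure is not detailed here, Otsu's between-class-variance criterion stands in for it, and the spectra are synthetic).

      # Per-pixel matched filtering for a target spectrum, followed by an
      # automatic score cutoff that splits target from background pixels.
      import numpy as np

      def matched_filter_scores(cube, target):
          """cube: (n_pixels, n_bands) spectra; target: (n_bands,) endmember."""
          mu = cube.mean(axis=0)
          cov = np.cov(cube, rowvar=False)
          w = np.linalg.solve(cov, target - mu)         # Sigma^{-1} (t - mu)
          return (cube - mu) @ w / ((target - mu) @ w)  # unit response at target

      def otsu_threshold(scores, bins=256):
          hist, edges = np.histogram(scores, bins=bins)
          p = hist / hist.sum()
          w0 = np.cumsum(p)                  # class probability below threshold
          centers = (edges[:-1] + edges[1:]) / 2
          m = np.cumsum(p * centers)         # cumulative class means
          mt = m[-1]
          between = (mt * w0 - m) ** 2 / (w0 * (1 - w0) + 1e-12)
          return centers[np.argmax(between)]

      rng = np.random.default_rng(3)
      bands = 8
      background = rng.normal(size=(2000, bands))
      target = np.full(bands, 3.0)
      cube = np.vstack([background, target + 0.1 * rng.normal(size=(50, bands))])
      s = matched_filter_scores(cube, target)
      print((s > otsu_threshold(s)).sum())   # roughly the 50 implanted targets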

  5. Proton therapy treatment monitoring with the DoPET system: activity range, positron emitters evaluation and comparison with Monte Carlo predictions

    NASA Astrophysics Data System (ADS)

    Muraro, S.; Battistoni, G.; Belcari, N.; Bisogni, M. G.; Camarlinghi, N.; Cristoforetti, L.; Del Guerra, A.; Ferrari, A.; Fracchiolla, F.; Morrocchi, M.; Righetto, R.; Sala, P.; Schwarz, M.; Sportelli, G.; Topi, A.; Rosso, V.

    2017-12-01

    Ion beam irradiations can deliver conformal dose distributions minimizing damage to healthy tissues thanks to their characteristic dose profiles. Nevertheless, the location of the Bragg peak can be affected by different sources of range uncertainties: a critical issue is the treatment verification. During the treatment delivery, nuclear interactions between the ions and the irradiated tissues generate β+ emitters: the detection of this activity signal can be used to perform the treatment monitoring if an expected activity distribution is available for comparison. Monte Carlo (MC) codes are widely used in the particle therapy community to evaluate the radiation transport and interaction with matter. In this work, FLUKA MC code was used to simulate the experimental conditions of irradiations performed at the Proton Therapy Center in Trento (IT). Several mono-energetic pencil beams were delivered on phantoms mimicking human tissues. The activity signals were acquired with a PET system (DoPET) based on two planar heads, and designed to be installed along the beam line to acquire data also during the irradiation. Different acquisitions are analyzed and compared with the MC predictions, with a special focus on validating the PET detectors response for activity range verification.

  6. Proton Therapy Verification with PET Imaging

    PubMed Central

    Zhu, Xuping; Fakhri, Georges El

    2013-01-01

    Proton therapy is very sensitive to uncertainties introduced during treatment planning and dose delivery. PET imaging of proton induced positron emitter distributions is the only practical approach for in vivo, in situ verification of proton therapy. This article reviews the current status of proton therapy verification with PET imaging. The different data detecting systems (in-beam, in-room and off-line PET), calculation methods for the prediction of proton induced PET activity distributions, and approaches for data evaluation are discussed. PMID:24312147

  7. ENVIRONMENTAL TECHNOLOGY VERIFICATION: Reduction of Nitrogen in Domestic Wastewater from Individual Residential Homes. BioConcepts, Inc. ReCip® RTS-500 System

    EPA Science Inventory

    Verification testing of the ReCip® RTS-500 System was conducted over a 12-month period at the Massachusetts Alternative Septic System Test Center (MASSTC) located on Otis Air National Guard Base in Bourne, Massachusetts. A nine-week startup period preceded the verification test t...

  8. Results of the Verification of the Statistical Distribution Model of Microseismicity Emission Characteristics

    NASA Astrophysics Data System (ADS)

    Cianciara, Aleksander

    2016-09-01

    The paper presents the results of research aimed at verifying the hypothesis that the Weibull distribution is an appropriate statistical model of microseismic emission characteristics, namely the energy of phenomena and the inter-event time. The emission under consideration is understood to be induced by natural rock mass fracturing. Because the recorded emission contains noise, it is subjected to appropriate filtering. The study was conducted using statistical verification of the null hypothesis that the Weibull distribution fits the empirical cumulative distribution function. As the model describing the cumulative distribution function is given in analytical form, its verification may be performed using the Kolmogorov-Smirnov goodness-of-fit test. This matters because interpretations by means of probabilistic methods, such as those based on hazard analysis or on maximum value statistics, do not use the measurement data directly but rather their statistical distributions, and therefore require that the correct distribution model be specified.
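
    The fit-and-test procedure translates directly into a few lines of SciPy (the synthetic inter-event times and the zero location parameter are assumptions for illustration).

      # Fit a two-parameter Weibull model to inter-event times and test the
      # fit with the Kolmogorov-Smirnov goodness-of-fit statistic.
      import numpy as np
      from scipy import stats

      rng = np.random.default_rng(4)
      inter_event_s = stats.weibull_min.rvs(c=0.9, scale=120.0, size=400,
                                            random_state=rng)  # synthetic data

      # Fit shape (c) and scale with the location pinned at zero, as is
      # usual for waiting-time data.
      c_hat, loc, scale_hat = stats.weibull_min.fit(inter_event_s, floc=0)

      # Null hypothesis: the sample follows the fitted Weibull distribution.
      ks = stats.kstest(inter_event_s, 'weibull_min',
                        args=(c_hat, loc, scale_hat))
      print(c_hat, scale_hat, ks.pvalue)   # large p-value: no reason to reject

    Strictly, when the parameters are estimated from the same sample, the plain Kolmogorov-Smirnov p-value is optimistic; the test is exact only when the analytical model is fully specified in advance, which is the setting the abstract describes.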

  9. Environmental Technology Verification Report - Electric Power and Heat Production Using Renewable Biogas at Patterson Farms

    EPA Science Inventory

    The U.S. EPA operates the Environmental Technology Verification program to facilitate the deployment of innovative technologies through performance verification and information dissemination. A technology area of interest is distributed electrical power generation, particularly w...

  10. ENVIRONMENTAL TECHNOLOGY VERIFICATION--FUELCELL ENERGY, INC.: DFC 300A MOLTEN CARBONATE FUEL CELL COMBINED HEAT AND POWER SYSTEM

    EPA Science Inventory

    The U.S. EPA operates the Environmental Technology Verification program to facilitate the deployment of innovative technologies through performance verification and information dissemination. A technology area of interest is distributed electrical power generation, particularly w...

  11. ENVIRONMENTAL TECHNOLOGY VERIFICATION: JOINT (NSF-EPA) VERIFICATION STATEMENT AND REPORT FOR THE REDUCTION OF NITROGEN IN DOMESTIC WASTEWATER FROM INDIVIDUAL RESIDENTIAL HOMES, WATERLOO BIOFILTER® MODEL 4-BEDROOM (NSF 02/03/WQPC-SWP)

    EPA Science Inventory

    Verification testing of the Waterloo Biofilter Systems (WBS), Inc. Waterloo Biofilter® Model 4-Bedroom system was conducted over a thirteen month period at the Massachusetts Alternative Septic System Test Center (MASSTC) located at Otis Air National Guard Base in Bourne, Mas...

  12. ENVIRONMENTAL TECHNOLOGY VERIFICATION: JOINT (NSF-EPA) VERIFICATION STATEMENT AND REPORT FOR THE REDUCTION OF NITROGEN IN DOMESTIC WASTEWATER FROM INDIVIDUAL HOMES, SEPTITECH, INC. MODEL 400 SYSTEM - 02/04/WQPC-SWP

    EPA Science Inventory

    Verification testing of the SeptiTech Model 400 System was conducted over a twelve month period at the Massachusetts Alternative Septic System Test Center (MASSTC) located at the Otis Air National Guard Base in Bourne, MA. Sanitary sewerage from the base residential housing was u...

  13. ENVIRONMENTAL TECHNOLOGY VERIFICATION: JOINT (NSF-EPA) VERIFICATION STATEMENT AND REPORT FOR THE REDUCTION OF NITROGEN IN DOMESTIC WASTEWATER FROM INDIVIDUAL HOMES, BIO-MICROBICS, INC., MODEL RETROFAST® 0.375

    EPA Science Inventory

    Verification testing of the Bio-Microbics RetroFAST® 0.375 System to determine the reduction of nitrogen in residential wastewater was conducted over a twelve-month period at the Mamquam Wastewater Technology Test Facility, located at the Mamquam Wastewater Treatment Plant. The R...

  14. ENVIRONMENTAL TECHNOLOGY VERIFICATION: JOINT (NSF-EPA) VERIFICATION STATEMENT AND REPORT FOR THE REDUCTION OF NITROGEN IN DOMESTIC WASTEWATER FROM INDIVIDUAL HOMES, AQUAPOINT, INC. BIOCLERE MODEL 16/12 - 02/02/WQPC-SWP

    EPA Science Inventory

    Verification testing of the Aquapoint, Inc. (AQP) Bioclere™ Model 16/12 was conducted over a thirteen month period at the Massachusetts Alternative Septic System Test Center (MASSTC), located at Otis Air National Guard Base in Bourne, Massachusetts. Sanitary sewerage from the ba...

  15. Theoretical study of closed-loop recycling liquid-liquid chromatography and experimental verification of the theory.

    PubMed

    Kostanyan, Artak E; Erastov, Andrey A

    2016-09-02

    The non-ideal recycling equilibrium-cell model including the effects of extra-column dispersion is used to simulate and analyze closed-loop recycling counter-current chromatography (CLR CCC). Previously, the operating scheme with the detector located before the column was considered. In this study, analysis of the process is carried out for a more realistic and practical scheme with the detector located immediately after the column. Peak equation for individual cycles and equations describing the transport of single peaks and complex chromatograms inside the recycling closed-loop, as well as equations for the resolution between single solute peaks of the neighboring cycles, for the resolution of peaks in the recycling chromatogram and for the resolution between the chromatograms of the neighboring cycles are presented. It is shown that, unlike conventional chromatography, increasing of the extra-column volume (the recycling line length) may allow a better separation of the components in CLR chromatography. For the experimental verification of the theory, aspirin, caffeine, coumarin and the solvent system hexane/ethyl acetate/ethanol/water (1:1:1:1) were used. Comparison of experimental and simulated processes of recycling and distribution of the solutes in the closed-loop demonstrated a good agreement between theory and experiment. Copyright © 2016 Elsevier B.V. All rights reserved.

  16. Security Verification of Secure MANET Routing Protocols

    DTIC Science & Technology

    2012-03-22

    Security Verification of Secure MANET Routing Protocols. Thesis by Matthew F. Steele, Captain, USAF (AFIT/GCS/ENG/12-03), presented to the Faculty, Department of Electrical and Computer Engineering, Department of the Air Force. Distribution unlimited.

  17. Requirements, Verification, and Compliance (RVC) Database Tool

    NASA Technical Reports Server (NTRS)

    Rainwater, Neil E., II; McDuffee, Patrick B.; Thomas, L. Dale

    2001-01-01

    This paper describes the development, design, and implementation of the Requirements, Verification, and Compliance (RVC) database used on the International Space Welding Experiment (ISWE) project managed at Marshall Space Flight Center. The RVC is a systems engineer's tool for automating and managing the following information: requirements; requirements traceability; verification requirements; verification planning; verification success criteria; and compliance status. This information normally contained within documents (e.g. specifications, plans) is contained in an electronic database that allows the project team members to access, query, and status the requirements, verification, and compliance information from their individual desktop computers. Using commercial-off-the-shelf (COTS) database software that contains networking capabilities, the RVC was developed not only with cost savings in mind but primarily for the purpose of providing a more efficient and effective automated method of maintaining and distributing the systems engineering information. In addition, the RVC approach provides the systems engineer the capability to develop and tailor various reports containing the requirements, verification, and compliance information that meets the needs of the project team members. The automated approach of the RVC for capturing and distributing the information improves the productivity of the systems engineer by allowing that person to concentrate more on the job of developing good requirements and verification programs and not on the effort of being a "document developer".

  18. ENVIRONMENTAL TECHNOLOGY VERIFICATION: JOINT (NSF-EPA) VERIFICATION STATEMENT AND REPORT FOR THE REDUCTION OF NITROGEN IN DOMESTIC WASTEWATER FROM INDIVIDUAL HOMES, F. R. MAHONEY & ASSOC., AMPHIDROME SYSTEM FOR SINGLE FAMILY HOMES - 02/05/WQPC-SWP

    EPA Science Inventory

    Verification testing of the F.R. Mahoney Amphidrome System was conducted over a twelve month period at the Massachusetts Alternative Septic System Test Center (MASSTC) located at the Otis Air National Guard Base in Bourne, MA. Sanitary sewerage from the base residential housing w...

  19. CTBT on-site inspections

    NASA Astrophysics Data System (ADS)

    Zucca, J. J.

    2014-05-01

    On-site inspection (OSI) is a critical part of the verification regime for the Comprehensive Nuclear-Test-Ban Treaty (CTBT). The OSI verification regime provides for international inspectors to make a suite of measurements and observations on site at the location of an event of interest. The other critical component of the verification regime is the International Monitoring System (IMS), which is a globally distributed network of monitoring stations. The IMS along with technical monitoring data from CTBT member countries, as appropriate, will be used to trigger an OSI. After the decision is made to carry out an OSI, it is important for the inspectors to deploy to the field site rapidly to be able to detect short-lived phenomena such as the aftershocks that may be observable after an underground nuclear explosion. The inspectors will be on site from weeks to months and will be working with many tens of tons of equipment. Parts of the OSI regime will be tested in a field exercise in the country of Jordan late in 2014. The build-up of the OSI regime has been proceeding steadily since the CTBT was signed in 1996 and is on track to becoming a deterrent to someone considering conducting a nuclear explosion in violation of the Treaty.

  20. Protection of autonomous microgrids using agent-based distributed communication

    DOE PAGES

    Cintuglu, Mehmet H.; Ma, Tan; Mohammed, Osama A.

    2016-04-06

    This study presents a real-time implementation of autonomous microgrid protection using agent-based distributed communication. Protection of an autonomous microgrid requires special considerations compared to large scale distribution networks due to the presence of power converters and relatively low inertia. In this work, we introduce a practical overcurrent and a frequency selectivity method to overcome conventional limitations. The proposed overcurrent scheme defines a selectivity mechanism considering the remedial action scheme (RAS) of the microgrid after a fault instant based on feeder characteristics and the location of the intelligent electronic devices (IEDs). A synchrophasor-based online frequency selectivity approach is proposed to avoid pulse loading effects in low inertia microgrids. Experimental results are presented for verification of the proposed schemes using a laboratory based microgrid. The setup was composed of actual generation units and IEDs using the IEC 61850 protocol. The experimental results were in excellent agreement with the proposed protection scheme.

  2. Comprehensive Understanding for Vegetated Scene Radiance Relationships

    NASA Technical Reports Server (NTRS)

    Kimes, D. S.; Deering, D. W.

    1984-01-01

    Directional reflectance distributions spanning the entire exitance hemisphere were measured in two field studies: one using a Mark III 3-band radiometer and one using the rapid-scanning bidirectional field instrument called PARABOLA. Surfaces measured included corn, soybeans, bare soils, grass lawn, orchard grass, alfalfa, cotton row crops, plowed field, annual grassland, stipa grass, hard wheat, salt plain shrubland, and irrigated wheat. Analysis of the field data showed unique reflectance distributions ranging from bare soil to complete vegetation canopies, and physical mechanisms causing these trends were proposed. A 3-D model was developed that is unique in that it predicts: (1) the directional spectral reflectance factors as a function of the sensor's azimuth and zenith angles and the sensor's position above the canopy; (2) the spectral absorption as a function of location within the scene; and (3) the directional spectral radiance as a function of the sensor's location within the scene. Initial verification of the model as applied to a soybean row crop showed that the simulated directional data corresponded relatively well in gross trends to the measured data.

  3. A preliminary study on the use of FX-Glycine gel and an in-house optical cone beam CT readout for IMRT and RapidArc verification

    NASA Astrophysics Data System (ADS)

    Ravindran, Paul B.; Ebenezer, Suman Babu S.; Winfred, Michael Raj; Amalan, S.

    2017-05-01

    The radiochromic FX gel with optical CT readout has been investigated by several authors and has shown promising results for 3D dosimetry. One of the applications of gel dosimeters is 3D dose verification for IMRT and RapidArc quality assurance. Though polymer gel has been used successfully for clinical dose verification, the use of FX gel for clinical dose verification with optical cone beam CT needs further validation. In this work, we used FX gel and an in-house optical readout system for gamma analysis between measured dose distributions and treatment planning system (TPS) calculated dose distributions for a few test cases.

  4. Software Tools for Formal Specification and Verification of Distributed Real-Time Systems

    DTIC Science & Technology

    1994-07-29

    The goals of Phase 1 are to design in detail a toolkit environment based on formal methods for the specification and verification of distributed real-time systems and to evaluate the design. The evaluation of the design includes investigation of both the capability and potential usefulness of the toolkit environment and the feasibility of its implementation.

  5. Software for Statistical Analysis of Weibull Distributions with Application to Gear Fatigue Data: User Manual with Verification

    NASA Technical Reports Server (NTRS)

    Krantz, Timothy L.

    2002-01-01

    The Weibull distribution has been widely adopted for the statistical description and inference of fatigue data. This document provides user instructions, examples, and verification for software to analyze gear fatigue test data. The software was developed presuming the data are adequately modeled using a two-parameter Weibull distribution. The calculations are based on likelihood methods, and the approach taken is valid for data that include type I censoring. The software was verified by reproducing results published by others.
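
    A hedged sketch of the likelihood machinery such a tool rests on (not the NASA software itself; the data, parameterization, and optimizer are illustrative): failures contribute the Weibull density, unfailed suspensions contribute the survival function, and the negative log-likelihood is minimized numerically.

      # Two-parameter Weibull maximum-likelihood fit with type I censoring.
      import numpy as np
      from scipy.optimize import minimize

      def neg_log_likelihood(params, t, failed):
          """params: (shape k, scale lam); t: cycles; failed: boolean array."""
          k, lam = params
          if k <= 0 or lam <= 0:
              return np.inf
          z = t / lam
          log_f = np.log(k / lam) + (k - 1) * np.log(z) - z ** k   # failures
          log_S = -(z ** k)                                        # suspensions
          return -(log_f[failed].sum() + log_S[~failed].sum())

      rng = np.random.default_rng(5)
      life = rng.weibull(2.5, size=30) * 1e6   # true shape 2.5, scale 1e6 cycles
      censor = 1.2e6                           # test suspended at 1.2e6 cycles
      failed = life < censor
      t = np.minimum(life, censor)

      res = minimize(neg_log_likelihood, x0=(1.0, t.mean()), args=(t, failed),
                     method='Nelder-Mead')
      print(res.x)                             # near (2.5, 1e6) in expectation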

  7. Verification of Advances in a Coupled Snow-runoff Modeling Framework for Operational Streamflow Forecasts

    NASA Astrophysics Data System (ADS)

    Barik, M. G.; Hogue, T. S.; Franz, K. J.; He, M.

    2011-12-01

    The National Oceanic and Atmospheric Administration's (NOAA's) River Forecast Centers (RFCs) issue hydrologic forecasts related to flood events, reservoir operations for water supply, streamflow regulation, and recreation on the nation's streams and rivers. The RFCs use the National Weather Service River Forecast System (NWSRFS) for streamflow forecasting which relies on a coupled snow model (i.e. SNOW17) and rainfall-runoff model (i.e. SAC-SMA) in snow-dominated regions of the US. Errors arise in various steps of the forecasting system from input data, model structure, model parameters, and initial states. The goal of the current study is to undertake verification of potential improvements in the SNOW17-SAC-SMA modeling framework developed for operational streamflow forecasts. We undertake verification for a range of parameters sets (i.e. RFC, DREAM (Differential Evolution Adaptive Metropolis)) as well as a data assimilation (DA) framework developed for the coupled models. Verification is also undertaken for various initial conditions to observe the influence of variability in initial conditions on the forecast. The study basin is the North Fork America River Basin (NFARB) located on the western side of the Sierra Nevada Mountains in northern California. Hindcasts are verified using both deterministic (i.e. Nash Sutcliffe efficiency, root mean square error, and joint distribution) and probabilistic (i.e. reliability diagram, discrimination diagram, containing ratio, and Quantile plots) statistics. Our presentation includes comparison of the performance of different optimized parameters and the DA framework as well as assessment of the impact associated with the initial conditions used for streamflow forecasts for the NFARB.
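
    The deterministic measures named above are simple to compute; a minimal Python sketch with made-up hindcast numbers follows (the probabilistic diagnostics need forecast ensembles and are omitted).

      # Deterministic hindcast verification: Nash-Sutcliffe efficiency and RMSE.
      import numpy as np

      def nash_sutcliffe(sim, obs):
          sim, obs = np.asarray(sim, float), np.asarray(obs, float)
          return 1.0 - np.sum((sim - obs) ** 2) / np.sum((obs - obs.mean()) ** 2)

      def rmse(sim, obs):
          sim, obs = np.asarray(sim, float), np.asarray(obs, float)
          return float(np.sqrt(np.mean((sim - obs) ** 2)))

      obs = np.array([12.0, 30.0, 55.0, 41.0, 22.0, 15.0])   # observed flow, m3/s
      sim = np.array([10.0, 28.0, 60.0, 38.0, 25.0, 14.0])   # hindcast flow
      print(nash_sutcliffe(sim, obs), rmse(sim, obs))        # NSE of 1 is perfect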

  8. Test/QA Plan for Verification of Semi-Continuous Ambient Air Monitoring Systems - Second Round

    EPA Science Inventory

    Test/QA Plan for Verification of Semi-Continuous Ambient Air Monitoring Systems - Second Round. Changes reflect performance of second round of testing at new location and with various changes to personnel. Additional changes reflect general improvements to the Version 1 test/QA...

  9. Environmental Technology Verification Report for Applikon MARGA Semi-Continuous Ambient Air Monitoring System

    EPA Science Inventory

    The verification test was conducted over a period of 30 days (October 1 to October 31, 2008) and involved the continuous operation of duplicate semi-continuous monitoring technologies at the Burdens Creek Air Monitoring Site, an existing ambient-air monitoring station located near...

  10. A Verification System for Distributed Objects with Asynchronous Method Calls

    NASA Astrophysics Data System (ADS)

    Ahrendt, Wolfgang; Dylla, Maximilian

    We present a verification system for Creol, an object-oriented modeling language for concurrent distributed applications. The system is an instance of KeY, a framework for object-oriented software verification, which has so far been applied foremost to sequential Java. Building on KeY's characteristic concepts, such as dynamic logic, sequent calculus, explicit substitutions, and the taclet rule language, the system presented in this paper addresses functional correctness of Creol models featuring local cooperative thread parallelism and global communication via asynchronous method calls. The calculus operates heavily on communication histories, which describe the interfaces of Creol units. Two example scenarios demonstrate the usage of the system.

  11. RF model of the distribution system as a communication channel, phase 2. Volume 1: Summary Report

    NASA Technical Reports Server (NTRS)

    Rustay, R. C.; Gajjar, J. T.; Rankin, R. W.; Wentz, R. C.; Wooding, R.

    1982-01-01

    The design, implementation, and verification of a computerized model for predicting the steady-state sinusoidal response of radial (tree-configured) distribution feeders was undertaken. That work demonstrated the feasibility and validity of the model based on verification measurements made on a limited-size portion of an actual live feeder. On that basis, a follow-on effort was conducted concerned with (1) extending the verification to a greater variety of situations and network sizes, (2) extending the model capabilities to reverse-direction propagation, (3) investigating parameter sensitivities, (4) improving transformer models, and (5) investigating procedures/fixes for ameliorating propagation trouble spots. Results are summarized.
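    For readers unfamiliar with this class of model, the steady-state sinusoidal response of a feeder path is commonly computed by cascading two-port ABCD matrices for the line sections. The sketch below is a generic illustration under that assumption; the carrier frequency, line parameters, and load are invented values, not the report's actual model.

```python
import numpy as np

def line_abcd(z_per_km, y_per_km, length_km):
    """ABCD matrix of a feeder section (nominal-pi lumped approximation)."""
    Z = z_per_km * length_km          # total series impedance
    Y = y_per_km * length_km          # total shunt admittance
    return np.array([[1 + Z * Y / 2, Z],
                     [Y * (1 + Z * Y / 4), 1 + Z * Y / 2]])

# Illustrative carrier-frequency parameters for two cascaded sections
f = 10e3                              # assumed 10 kHz carrier
w = 2 * np.pi * f
z = 0.3 + 1j * w * 1.2e-3             # ohm/km (R + jwL), assumed values
y = 1j * w * 9e-9                     # S/km (jwC), assumed value
net = line_abcd(z, y, 5.0) @ line_abcd(z, y, 3.0)

# Receiving-end/sending-end voltage ratio for a matched load Zl:
# Vs = A*Vr + B*Ir with Ir = Vr/Zl, so Vr/Vs = 1 / (A + B/Zl)
Zl = 300.0
A, B = net[0]
print(f"|Vr/Vs| = {abs(1.0 / (A + B / Zl)):.3f}")
```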

  12. Clinical application of in vivo treatment delivery verification based on PET/CT imaging of positron activity induced at high energy photon therapy

    NASA Astrophysics Data System (ADS)

    Janek Strååt, Sara; Andreassen, Björn; Jonsson, Cathrine; Noz, Marilyn E.; Maguire, Gerald Q., Jr.; Näfstadius, Peder; Näslund, Ingemar; Schoenahl, Frederic; Brahme, Anders

    2013-08-01

    The purpose of this study was to investigate in vivo verification of radiation treatment with high energy photon beams, using PET/CT to image the induced positron activity. Measurements of the positron activation induced in a preoperative rectal cancer patient and a prostate cancer patient following 50 MV photon treatments are presented. Total doses of 5 and 8 Gy, respectively, were delivered to the tumors. Imaging was performed with a 64-slice PET/CT scanner for 30 min, starting 7 min after the end of the treatment. The CT volume from the PET/CT and the treatment planning CT were coregistered by matching anatomical reference points in the patient. The treatment delivery was imaged in vivo from the distribution of the induced positron emitters produced by photonuclear reactions in tissue, mapped onto the associated dose distribution of the treatment plan. The results showed that the spatial distribution of induced activity in both patients agreed well with the delivered beam portals of the treatment plans in the entrance subcutaneous fat regions, but less so in blood- and oxygen-rich soft tissues. For the preoperative rectal cancer patient, however, a 2 ± 0.5 cm misalignment was observed in the cranial-caudal direction between the induced activity distribution and the treatment plan, indicating a patient setup error. No misalignment of this kind was seen in the prostate cancer patient. However, due to a rushed patient setup in the PET/CT scanner, a slight mispositioning of the patient was observed in all three planes, resulting in a deformed activity distribution compared to the treatment plan. The present study indicates that the positron emitters induced by high energy photon beams can be measured quite accurately using PET imaging of subcutaneous fat, allowing portal verification of the delivered treatment beams. Measurement of the induced activity in the patient 7 min after delivery of 5 Gy involved count rates about 20 times lower than those of a patient undergoing a standard 18F-FDG examination. When using a combination of short-lived nuclides such as 15O (half-life: 2 min) and 11C (half-life: 20 min) at low activity, clinical reconstruction protocols are not optimal. It might therefore be desirable to further optimize reconstruction parameters as well as to address hardware improvements when realizing in vivo treatment verification with PET/CT in the future. A significant improvement with regard to 15O imaging could also be expected by locating the PET/CT unit close to the radiation treatment room.
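    The timing argument can be made concrete from the half-lives quoted above. A small sketch of the activity fraction remaining when imaging starts 7 min after treatment (the half-lives are from the text; everything else is illustrative):

```python
HALF_LIFE_MIN = {"O-15": 2.0, "C-11": 20.0}   # half-lives from the text

def remaining_fraction(nuclide, t_min):
    """Fraction of initial activity left after t_min minutes of decay."""
    return 0.5 ** (t_min / HALF_LIFE_MIN[nuclide])

t_start = 7.0   # imaging began 7 min after the end of treatment
for nuc in HALF_LIFE_MIN:
    print(f"{nuc}: {remaining_fraction(nuc, t_start):.1%} of activity remains")
# O-15: ~8.8% remains; C-11: ~78.5% remains. Most of the short-lived 15O
# signal decays before imaging begins, which is why a PET/CT unit close to
# the treatment room would markedly improve 15O imaging.
```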

  13. Mass-balance modelling of Ak-Shyirak massif Glaciers, Inner Tian Shan

    NASA Astrophysics Data System (ADS)

    Rets, Ekaterina; Barandun, Martina; Belozerov, Egor; Petrakov, Dmitry; Shpuntova, Alena

    2017-04-01

    Tian Shan is the water tower of Central Asia. Rapid and accelerating glacier downwasting is typical for this region. The study sites, Sary-Tor Glacier and Glacier No. 354, are located in the Ak-Shyirak massif, in the Naryn headwaters. Sary-Tor was chosen as representative of Ak-Shyirak (Ushnurtsev, 1991; Oledeneniye Tian Shanya, 1995) for direct mass-balance measurements in 1985-1991. Glacier No. 354 was the object of direct mass-balance measurements for 2011-2016. The distributed energy-balance model A-Melt (Rets et al., 2010) was used to reconstruct the mass balance of the glaciers for 2003-2015. Verification of the modeling results showed good reproduction of direct melt measurements at ablation stakes and of mass loss according to the geodetic method. Modeling results for Glacier No. 354 were also compared to a different modeling approach: distributed accumulation and temperature-index melt (Kronenberg et al., 2016).
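    The comparison approach mentioned last, temperature-index melt, reduces to a degree-day relation. A minimal sketch follows; the degree-day factor and temperatures are assumed values, not the calibrated parameters of either study.

```python
import numpy as np

def temperature_index_melt(t_air, ddf=4.5, t_crit=0.0):
    """Daily melt (mm w.e.) from a classical degree-day model:
    melt = DDF * max(T - T_crit, 0), DDF in mm w.e. / (degC * day)."""
    return ddf * np.maximum(t_air - t_crit, 0.0)

daily_t = np.array([-3.2, 0.5, 2.1, 4.8, 6.0, 1.3])  # illustrative degC
melt = temperature_index_melt(daily_t)
print(f"total melt = {melt.sum():.1f} mm w.e.")
```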

  14. Evaluation of streamflow forecast for the National Water Model of U.S. National Weather Service

    NASA Astrophysics Data System (ADS)

    Rafieeinasab, A.; McCreight, J. L.; Dugger, A. L.; Gochis, D.; Karsten, L. R.; Zhang, Y.; Cosgrove, B.; Liu, Y.

    2016-12-01

    The National Water Model (NWM), an implementation of the community WRF-Hydro modeling system, is an operational hydrologic forecasting model for the contiguous United States. The model forecasts distributed hydrologic states and fluxes, including soil moisture, snowpack, ET, and ponded water. In particular, the NWM provides streamflow forecasts at more than 2.7 million river reaches for three forecast ranges: short (15 hr), medium (10 days), and long (30 days). In this study, we verify short- and medium-range streamflow forecasts in the context of the verification of their respective quantitative precipitation forecasts/forcing (QPF), the High Resolution Rapid Refresh (HRRR) and the Global Forecast System (GFS). The streamflow evaluation is performed for the summer of 2016 at more than 6,000 USGS gauges. Both individual forecasts and forecast lead times are examined. Selected case studies of extreme events aim to provide insight into the quality of the NWM streamflow forecasts. A goal of this comparison is to address how much streamflow bias originates from precipitation forcing bias. To this end, precipitation verification is performed over the contributing areas above (and between) assimilated USGS gauge locations. Precipitation verification is based on the aggregated, blended StageIV/StageII data as the "reference truth". We summarize the skill of the streamflow forecasts, their skill relative to the QPF, and make recommendations for improving NWM forecast skill.
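    A minimal sketch of the kind of forcing-bias check described, computing the percent bias of basin-aggregated QPF against the Stage IV/Stage II reference; all values are illustrative.

```python
import numpy as np

def percent_bias(forecast, reference):
    """Percent bias of accumulated forecast precipitation vs. reference."""
    return 100.0 * (forecast.sum() - reference.sum()) / reference.sum()

# Illustrative basin-mean accumulations (mm) over one contributing area
stage_iv = np.array([5.1, 0.0, 12.4, 3.3, 8.0])   # "reference truth"
qpf = np.array([6.0, 0.2, 10.1, 4.1, 9.5])        # e.g. HRRR forcing
print(f"QPF percent bias = {percent_bias(qpf, stage_iv):+.1f}%")
```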

  15. Two years experience with quality assurance protocol for patient related Rapid Arc treatment plan verification using a two dimensional ionization chamber array

    PubMed Central

    2011-01-01

    Purpose: To verify the dose distribution and number of monitor units (MU) for dynamic treatment techniques like volumetric modulated single arc radiation therapy (Rapid Arc), each patient treatment plan has to be verified prior to the first treatment. The purpose of this study was to develop a patient-related treatment plan verification protocol using a two-dimensional ionization chamber array (MatriXX, IBA, Schwarzenbruck, Germany). Method: Measurements were done to determine the dependence between the response of the 2D ionization chamber array, beam direction, and field size. The reproducibility of the measurements was also checked. For the patient-related verifications, the original patient Rapid Arc treatment plan was projected onto a CT dataset of the MatriXX and the dose distribution was calculated. After irradiation of the Rapid Arc verification plans, measured and calculated 2D dose distributions were compared using the gamma evaluation method implemented in the measuring software OmniPro (version 1.5, IBA, Schwarzenbruck, Germany). Results: The dependence between the response of the 2D ionization chamber array, field size, and beam direction showed a passing rate of 99% for field sizes between 7 cm × 7 cm and 24 cm × 24 cm for single-arc measurements; for smaller and larger fields the passing rate was below 99%. Reproducibility passing rates were between 99% and 100%. The accuracy of the whole process, including the uncertainties of the measuring system, treatment planning system, linear accelerator, and isocentric laser system in the treatment room, was acceptable for treatment plan verification using gamma criteria of 3% and 3 mm (2D global gamma index). Conclusion: It was possible to verify the 2D dose distribution and MU of Rapid Arc treatment plans using the MatriXX, and its use for Rapid Arc treatment plan verification in clinical routine is reasonable. With a passing-rate threshold of 99%, the verification protocol is able to detect clinically significant errors. PMID:21342509
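    For reference, the gamma evaluation used above combines a dose-difference criterion with a distance-to-agreement (DTA) criterion. The brute-force 2D global gamma sketch below is a simplified stand-in for illustration, not the OmniPro implementation; grid spacing, search window, and the test doses are assumed.

```python
import numpy as np

def gamma_pass_rate(ref, meas, spacing_mm, dd=0.03, dta_mm=3.0, search_mm=9.0):
    """Brute-force 2D global gamma analysis.
    dd: dose criterion as a fraction of the reference maximum (global);
    dta_mm: distance-to-agreement criterion."""
    norm = dd * ref.max()                       # global dose normalization
    r = int(np.ceil(search_mm / spacing_mm))    # search window (pixels)
    ny, nx = ref.shape
    gam2 = np.full(ref.shape, np.inf)
    for j in range(ny):
        for i in range(nx):
            for dj in range(-r, r + 1):
                for di in range(-r, r + 1):
                    jj, ii = j + dj, i + di
                    if not (0 <= jj < ny and 0 <= ii < nx):
                        continue
                    dist2 = (dj * spacing_mm) ** 2 + (di * spacing_mm) ** 2
                    dose2 = (meas[jj, ii] - ref[j, i]) ** 2
                    gam2[j, i] = min(gam2[j, i],
                                     dist2 / dta_mm**2 + dose2 / norm**2)
    return 100.0 * np.mean(np.sqrt(gam2) <= 1.0)

# Example: a 1-pixel (here 1 mm) shift of a synthetic dose plane
y, x = np.mgrid[0:40, 0:40]
ref = np.exp(-((x - 20) ** 2 + (y - 20) ** 2) / 120.0)
meas = np.roll(ref, 1, axis=1)
print(f"pass rate = {gamma_pass_rate(ref, meas, spacing_mm=1.0):.1f}%")
```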

  16. Large-Scale Cryogen Systems and Test Facilities

    NASA Technical Reports Server (NTRS)

    Johnson, R. G.; Sass, J. P.; Hatfield, W. H.

    2007-01-01

    NASA has completed initial construction and verification testing of the Integrated Systems Test Facility (ISTF) Cryogenic Testbed. The ISTF is located at Complex 20 at Cape Canaveral Air Force Station, Florida. The remote and secure location is ideally suited for the following functions: (1) development testing of advanced cryogenic component technologies, (2) development testing of concepts and processes for entire ground support systems designed for servicing large launch vehicles, and (3) commercial sector testing of cryogenic- and energy-related products and systems. The ISTF Cryogenic Testbed consists of modular fluid distribution piping and storage tanks for liquid oxygen/nitrogen (56,000 gal) and liquid hydrogen (66,000 gal). Storage tanks for liquid methane (41,000 gal) and Rocket Propellant 1 (37,000 gal) are also specified for the facility. A state-of-the-art blast proof test command and control center provides capability for remote operation, video surveillance, and data recording for all test areas.

  17. Design and verification of distributed logic controllers with application of Petri nets

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wiśniewski, Remigiusz; Grobelna, Iwona; Grobelny, Michał

    2015-12-31

    The paper deals with the design and verification of distributed logic controllers. The control system is initially modelled with Petri nets and formally verified against structural and behavioral properties with the application of temporal logic and the model checking technique. After that, it is decomposed into separate sequential automata that work concurrently. Each of them is re-verified, and if the validation is successful, the system can finally be implemented.

  18. Design and verification of distributed logic controllers with application of Petri nets

    NASA Astrophysics Data System (ADS)

    Wiśniewski, Remigiusz; Grobelna, Iwona; Grobelny, Michał; Wiśniewska, Monika

    2015-12-01

    The paper deals with the design and verification of distributed logic controllers. The control system is initially modelled with Petri nets and formally verified against structural and behavioral properties with the application of temporal logic and the model checking technique. After that, it is decomposed into separate sequential automata that work concurrently. Each of them is re-verified, and if the validation is successful, the system can finally be implemented.
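    A place/transition Petri net of the kind used for the initial model can be captured in a few lines. The sketch below, with a made-up two-controller example sharing a lock place, shows token-based enabling and firing; it is an illustration of the formalism, not the paper's tool chain.

```python
# Minimal place/transition Petri net: a transition is enabled when every
# input place holds a token; firing moves tokens from inputs to outputs.
class PetriNet:
    def __init__(self, marking):
        self.marking = dict(marking)          # place -> token count
        self.transitions = {}                 # name -> (inputs, outputs)

    def add_transition(self, name, inputs, outputs):
        self.transitions[name] = (inputs, outputs)

    def enabled(self, name):
        inputs, _ = self.transitions[name]
        return all(self.marking.get(p, 0) >= 1 for p in inputs)

    def fire(self, name):
        assert self.enabled(name), f"{name} is not enabled"
        inputs, outputs = self.transitions[name]
        for p in inputs:
            self.marking[p] -= 1
        for p in outputs:
            self.marking[p] = self.marking.get(p, 0) + 1

# Two controllers synchronizing on a shared resource place "lock"
net = PetriNet({"idle_A": 1, "idle_B": 1, "lock": 1})
net.add_transition("start_A", ["idle_A", "lock"], ["busy_A"])
net.add_transition("start_B", ["idle_B", "lock"], ["busy_B"])
net.fire("start_A")
print(net.enabled("start_B"))   # False: the lock token is taken
```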

  19. VeriClick: an efficient tool for table format verification

    NASA Astrophysics Data System (ADS)

    Nagy, George; Tamhankar, Mangesh

    2012-01-01

    The essential layout attributes of a visual table can be defined by the location of four critical grid cells. Although these critical cells can often be located by automated analysis, some means of human interaction is necessary for correcting residual errors. VeriClick is a macro-enabled spreadsheet interface that provides ground-truthing, confirmation, correction, and verification functions for CSV tables. All user actions are logged. Experimental results of seven subjects on one hundred tables suggest that VeriClick can provide a ten- to twenty-fold speedup over performing the same functions with standard spreadsheet editing commands.

  20. A tracking and verification system implemented in a clinical environment for partial HIPAA compliance

    NASA Astrophysics Data System (ADS)

    Guo, Bing; Documet, Jorge; Liu, Brent; King, Nelson; Shrestha, Rasu; Wang, Kevin; Huang, H. K.; Grant, Edward G.

    2006-03-01

    The paper describes the methodology for the clinical design and implementation of a Location Tracking and Verification System (LTVS) that has distinct benefits for the Imaging Department at the Healthcare Consultation Center II (HCCII), an outpatient imaging facility located on the USC Health Science Campus. A novel system for tracking and verification in a clinical environment, using wireless and facial biometric technologies to monitor and automatically identify patients and staff, was developed in order to streamline patient workflow, protect against erroneous examinations, and create a security zone to prevent and audit unauthorized access to patient healthcare data under the HIPAA mandate. This paper describes the system design and integration methodology based on initial clinical workflow studies within a clinical environment. An outpatient center was chosen as a first step for the development and implementation of this system.

  1. Convective Weather Forecast Accuracy Analysis at Center and Sector Levels

    NASA Technical Reports Server (NTRS)

    Wang, Yao; Sridhar, Banavar

    2010-01-01

    This paper presents a detailed convective forecast accuracy analysis at the center and sector levels. The study aims to provide more meaningful forecast verification measures to the aviation community, as well as to obtain useful information leading to improvements in weather translation capacity models. In general, the vast majority of forecast verification efforts over past decades have computed traditional standard verification measure scores over forecast and observation data mapped onto grids. These verification measures, based on binary classification, have been applied in quality assurance of weather forecast products at the national level for many years. Our research focuses on forecasts at the center and sector levels. We first calculate the standard forecast verification measure scores for en-route air traffic centers and sectors, and then conduct the forecast validation analysis and related verification measures for weather intensities and locations at the center and sector levels. An approach to improve the prediction of sector weather coverage from multiple sector forecasts is then developed. The severe weather intensity assessment was carried out using the correlations between forecast and actual observed airspace weather coverage. The forecast accuracy in horizontal location was assessed by examining the forecast errors. The improvement in prediction of weather coverage was determined by the correlation between actual sector weather coverage and the prediction. The analysis used observed and forecasted Convective Weather Avoidance Model (CWAM) data collected from June to September 2007. CWAM zero-minute forecast data with aircraft avoidance probabilities of 60% and 80% are used as the actual weather observation. All forecast measurements are based on 30-minute, 60-minute, 90-minute, and 120-minute forecasts with the same avoidance probabilities. The forecast accuracy analysis for lead times under one hour showed that the errors in intensity and location for center forecasts are relatively low. For example, 1-hour forecast intensity and horizontal location errors for ZDC center were about 0.12 and 0.13. However, the correlation between the sector 1-hour forecast and actual weather coverage was weak: for sector ZDC32, about 32% of the total variation of observed weather intensity was unexplained by the forecast; the sector horizontal location error was about 0.10. The paper also introduces an approach to estimate the sector three-dimensional actual weather coverage using multiple sector forecasts, which turned out to produce better predictions. Using a Multiple Linear Regression (MLR) model for this approach, the correlations between actual observations and the multiple-sector forecast model prediction improved by several percent at the 95% confidence level in comparison with single-sector forecasts.
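    The multiple-sector MLR idea can be illustrated with ordinary least squares. The sketch below uses synthetic coverage data and three predictor sectors; it is not the CWAM dataset or the paper's fitted model.

```python
import numpy as np

# Illustrative: predict actual sector weather coverage from the forecasts
# of several (here, three) sectors via multiple linear regression.
rng = np.random.default_rng(0)
forecasts = rng.random((200, 3))                      # 3 sector forecasts
truth = forecasts @ np.array([0.5, 0.3, 0.1]) + 0.05 * rng.standard_normal(200)

X = np.column_stack([np.ones(len(forecasts)), forecasts])   # add intercept
coef, *_ = np.linalg.lstsq(X, truth, rcond=None)
pred = X @ coef
corr = np.corrcoef(truth, pred)[0, 1]
print(f"coefficients = {coef.round(3)}, r = {corr:.3f}")
```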

  2. Towards an improved ensemble precipitation forecast: A probabilistic post-processing approach

    NASA Astrophysics Data System (ADS)

    Khajehei, Sepideh; Moradkhani, Hamid

    2017-03-01

    Recently, ensemble post-processing (EPP) has become a commonly used approach for reducing the uncertainty in forcing data and hence in hydrologic simulation. The procedure was introduced to build ensemble precipitation forecasts based on the statistical relationship between observations and forecasts. More specifically, the approach relies on a transfer function that is developed from a bivariate joint distribution between the observations and the simulations in the historical period; the transfer function is then used to post-process the forecast. In this study, we propose a Bayesian EPP approach based on copula functions (COP-EPP) to improve the reliability of the precipitation ensemble forecast. Evaluation of the copula-based method is carried out by comparing the performance of the generated ensemble precipitation with the output of an existing procedure, i.e. the mixed-type meta-Gaussian distribution. Monthly precipitation from the Climate Forecast System Reanalysis (CFS) and gridded observations from the Parameter-Elevation Relationships on Independent Slopes Model (PRISM) have been employed to generate the post-processed ensemble precipitation. Deterministic and probabilistic verification frameworks are utilized to evaluate the outputs of the proposed technique. The distribution of seasonal precipitation for the ensemble generated by the copula-based technique is compared to the observations and raw forecasts for three sub-basins located in the Western United States. Results show that both techniques are successful in producing reliable and unbiased ensemble forecasts; however, COP-EPP demonstrates considerable improvement in both deterministic and probabilistic verification, in particular in characterizing extreme events in wet seasons.
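    A minimal Gaussian-copula sketch of the conditional-sampling idea behind this kind of post-processing follows; the data are synthetic and the paper's full Bayesian COP-EPP formulation is more involved than this illustration.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
obs = rng.gamma(2.0, 30.0, 500)                 # historical precipitation
fcst = 0.8 * obs + rng.gamma(2.0, 8.0, 500)     # correlated raw forecast

def to_normal_scores(x):
    """Empirical CDF in (0,1), then map to standard-normal scores."""
    ranks = stats.rankdata(x) / (len(x) + 1.0)
    return stats.norm.ppf(ranks)

z_obs, z_fc = to_normal_scores(obs), to_normal_scores(fcst)
rho = np.corrcoef(z_obs, z_fc)[0, 1]            # Gaussian copula correlation

def conditional_ensemble(new_fcst, n=50):
    """Draw an obs-space ensemble conditional on a new forecast value."""
    u = (np.searchsorted(np.sort(fcst), new_fcst) + 0.5) / (len(fcst) + 1)
    z = stats.norm.ppf(u)
    z_draws = rng.normal(rho * z, np.sqrt(1 - rho**2), n)
    return np.quantile(obs, stats.norm.cdf(z_draws))   # back to obs space

print(conditional_ensemble(60.0)[:5].round(1))
```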

  3. Oil flow at the scroll compressor discharge: visualization and CFD simulation

    NASA Astrophysics Data System (ADS)

    Xu, Jiu; Hrnjak, Pega

    2017-08-01

    Oil is important to the compressor but has adverse side effects on refrigeration system performance. Discharge valves located in the compressor plenum are the gateway through which oil leaves the compressor and circulates in the system. The space in between, the compressor discharge plenum, has the potential to separate the oil mist and reduce the oil circulation ratio (OCR) in the system. In order to provide information for building an incorporated separation feature for the oil flow near the compressor discharge, a video processing method is used to quantify oil droplet movement and distribution. In addition, a CFD discrete phase model provides a numerical approach to studying the oil flow inside the compressor plenum. Oil droplet size distributions are obtained from both visualization and simulation, and the results show good agreement. The mass balance and spatial distribution are also discussed and compared with experimental results. The verification shows that the discrete phase model has the potential to simulate the oil droplet flow inside the compressor.

  4. Audio-visual imposture

    NASA Astrophysics Data System (ADS)

    Karam, Walid; Mokbel, Chafic; Greige, Hanna; Chollet, Gerard

    2006-05-01

    A GMM-based audio-visual speaker verification system is described, and an Active Appearance Model with a linear speaker transformation system is used to evaluate the robustness of the verification. An Active Appearance Model (AAM) is used to automatically locate and track a speaker's face in a video recording. A Gaussian Mixture Model (GMM) based classifier (BECARS) is used for face verification. GMM training and testing is accomplished on DCT-based features extracted from the detected faces. On the audio side, speech features are extracted and used for speaker verification with the GMM-based classifier. Fusion of both audio and video modalities for audio-visual speaker verification is compared with face verification and speaker verification systems alone. To improve the robustness of the multimodal biometric identity verification system, an audio-visual imposture system is envisioned. It consists of an automatic voice transformation technique that an impostor may use to assume the identity of an authorized client. Features of the transformed voice are then combined with the corresponding appearance features and fed into the GMM-based system BECARS for training. An attempt is made to increase the acceptance rate of the impostor and to analyze the robustness of the verification system. Experiments are being conducted on the BANCA database, with a prospect of experimenting on the newly developed PDAtabase created within the scope of the SecurePhone project.
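    GMM-based verification typically scores a trial by the log-likelihood ratio between a client model and a background ("world") model. The sketch below is a generic illustration of that scoring scheme with synthetic features; BECARS itself is not reproduced here, and the feature dimensions, component counts, and threshold are assumptions.

```python
import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(2)
world_feats = rng.normal(0.0, 1.0, (2000, 12))        # e.g. DCT features
client_feats = rng.normal(0.4, 0.9, (300, 12))        # client enrollment

world = GaussianMixture(n_components=8, random_state=0).fit(world_feats)
client = GaussianMixture(n_components=8, random_state=0).fit(client_feats)

def llr_score(test_feats):
    """Mean per-frame log-likelihood ratio: client vs. world model."""
    return np.mean(client.score_samples(test_feats)
                   - world.score_samples(test_feats))

trial = rng.normal(0.4, 0.9, (50, 12))                # a genuine trial
print("accept" if llr_score(trial) > 0.0 else "reject")
```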

  5. Verification and Validation of Monte Carlo n-Particle Code 6 (MCNP6) with Neutron Protection Factor Measurements of an Iron Box

    DTIC Science & Technology

    2014-03-27

    Verification and Validation of Monte Carlo N-Particle Code 6 (MCNP6) with Neutron Protection Factor Measurements of an Iron Box. Thesis, presented to the Faculty, Department of Engineering... Distribution Statement A: approved for public release; distribution unlimited. (AFIT-ENP-14-M-05)

  6. Ada(R) Test and Verification System (ATVS)

    NASA Technical Reports Server (NTRS)

    Strelich, Tom

    1986-01-01

    The Ada Test and Verification System (ATVS) functional description and high level design are completed and summarized. The ATVS will provide a comprehensive set of test and verification capabilities specifically addressing the features of the Ada language, support for embedded system development, distributed environments, and advanced user interface capabilities. Its design emphasis was on effective software development environment integration and flexibility to ensure its long-term use in the Ada software development community.

  7. Verification of Radar Vehicle Detection Equipment

    DOT National Transportation Integrated Search

    1999-03-01

    Currently, inductive loops are used to count traffic at the 52 permanent sites located in South Dakota. Because they are located within the pavement, the loops are susceptible to being destroyed during maintenance projects. When they are destroyed, i...

  8. Pharmacy Automation in Navy Medicine: A Study of Naval Medical Center San Diego

    DTIC Science & Technology

    2015-09-01

    The robotic delivery system dispenses the medication, caps the vial, and affixes the label; the completed prescription is then placed on the conveyor belt for routing to pharmacist verification. The system performs all steps, including transportation, up to pharmacist verification via the conveyor belt. Manual fills are located along the conveyor system. (Figure 3: Robotic Delivery System Installed at Naval...)

  9. Towards the Formal Verification of a Distributed Real-Time Automotive System

    NASA Technical Reports Server (NTRS)

    Endres, Erik; Mueller, Christian; Shadrin, Andrey; Tverdyshev, Sergey

    2010-01-01

    We present the status of a project which aims at building, and formally and pervasively verifying, a distributed automotive system. The target system is a gate-level model which consists of several interconnected electronic control units with independent clocks. This model is verified against the specification as seen by a system programmer. The automotive system is implemented on several FPGA boards. The pervasive verification is carried out using a combination of interactive theorem proving (Isabelle/HOL) and model checking (LTL).

  10. Space shuttle engineering and operations support. Avionics system engineering

    NASA Technical Reports Server (NTRS)

    Broome, P. A.; Neubaur, R. J.; Welsh, R. T.

    1976-01-01

    The shuttle avionics integration laboratory (SAIL) requirements for supporting the Spacelab/orbiter avionics verification process are defined. The principal topics are a Spacelab avionics hardware assessment, test operations center/electronic systems test laboratory (TOC/ESL) data processing requirements definition, SAIL (Building 16) payload accommodations study, and projected funding and test scheduling. Because of the complex nature of the Spacelab/orbiter computer systems, the PCM data link, and the high rate digital data system hardware/software relationships, early avionics interface verification is required. The SAIL is a prime candidate test location to accomplish this early avionics verification.

  11. Ground vibration tests of a high fidelity truss for verification of on orbit damage location techniques

    NASA Technical Reports Server (NTRS)

    Kashangaki, Thomas A. L.

    1992-01-01

    This paper describes a series of modal tests that were performed on a cantilevered truss structure. The goal of the tests was to assemble a large database of high quality modal test data for use in verification of proposed methods for on orbit model verification and damage detection in flexible truss structures. A description of the hardware is provided along with details of the experimental setup and procedures for 16 damage cases. Results from selected cases are presented and discussed. Differences between ground vibration testing and on orbit modal testing are also described.

  12. Development and Verification of Sputtered Thin-Film Nickel-Titanium (NiTi) Shape Memory Alloy (SMA)

    DTIC Science & Technology

    2015-08-01

    Development and Verification of Sputtered Thin-Film Nickel-Titanium (NiTi) Shape Memory Alloy (SMA), by Cory R Knick and Christopher J Morris. Approved for public release; distribution unlimited.

  13. Cleanup Verification Package for the 118-F-6 Burial Ground

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    H. M. Sulloway

    2008-10-02

    This cleanup verification package documents completion of remedial action for the 118-F-6 Burial Ground located in the 100-FR-2 Operable Unit of the 100-F Area on the Hanford Site. The trenches received waste from the 100-F Experimental Animal Farm, including animal manure, animal carcasses, laboratory waste, plastic, cardboard, metal, and concrete debris as well as a railroad tank car.

  14. In vivo dose verification method in catheter based high dose rate brachytherapy.

    PubMed

    Jaselskė, Evelina; Adlienė, Diana; Rudžianskas, Viktoras; Urbonavičius, Benas Gabrielis; Inčiūra, Arturas

    2017-12-01

    In vivo dosimetry is a powerful tool for dose verification in radiotherapy. Its application in high dose rate (HDR) brachytherapy is usually limited to the estimation of gross errors, due to the inability of the dosimetry system/method to record non-uniform dose distributions in steep dose gradient fields close to the radioactive source. In vivo dose verification in interstitial catheter-based HDR brachytherapy is crucial, since the treatment is performed by inserting the radioactive source at certain positions within catheters that are pre-implanted into the tumour. We propose an in vivo dose verification method for this type of brachytherapy treatment based on the comparison between experimentally measured and theoretical dose values calculated at well-defined locations corresponding to dosemeter positions in the catheter. Dose measurements were performed using TLD 100-H rods (6 mm long, 1 mm diameter) inserted in certain sequences into an additionally pre-implanted dosimetry catheter. Dosemeter positioning in the catheter was adjusted using reconstructed CT scans of the patient with pre-implanted catheters. Doses to three head-and-neck cancer patients and one breast cancer patient were measured during several randomly selected treatment fractions. It was found that the average experimental dose error varied from 4.02% to 12.93% during independent in vivo dosimetry control measurements for the selected head-and-neck cancer patients, and from 7.17% to 8.63% for the breast cancer patient. The average experimental dose error was below the AAPM-recommended margin of 20% and did not exceed the measurement uncertainty of 17.87% estimated for this type of dosemeter. A tendency toward a slightly increasing average dose error was observed in each subsequent treatment fraction of the same patient. This was linked to changes in the theoretically estimated dosemeter positions due to possible patient organ movement between treatment fractions, since catheter reconstruction was performed for the first treatment fraction only. These findings indicate potential for further reduction of the average dose error in catheter-based brachytherapy by at least 2-3% if catheter locations are adjusted before each subsequent treatment fraction; however, this requires more detailed investigation. Copyright © 2017 Associazione Italiana di Fisica Medica. Published by Elsevier Ltd. All rights reserved.

  15. Integrity Verification for Multiple Data Copies in Cloud Storage Based on Spatiotemporal Chaos

    NASA Astrophysics Data System (ADS)

    Long, Min; Li, You; Peng, Fei

    Aiming to strike a balance between the security, efficiency, and availability of data verification in cloud storage, a novel integrity verification scheme based on spatiotemporal chaos is proposed for multiple data copies. Spatiotemporal chaos is implemented for the node calculation of the binary tree, and the location of the data in the cloud is verified. Meanwhile, dynamic operations can be performed on the data. Furthermore, blind information is used to prevent a third-party auditor (TPA) from leaking the users' private data in a public auditing process. Performance analysis and discussion indicate that the scheme is secure and efficient, and that it supports dynamic operations and the integrity verification of multiple copies of data. It has great potential to be implemented in cloud storage services.
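    The binary-tree verification structure is analogous to a hash tree. The sketch below uses SHA-256 for the node calculation where the paper substitutes a spatiotemporal chaos function; it illustrates the tree build and proof check only, not the paper's scheme.

```python
import hashlib

def h(data: bytes) -> bytes:
    # Node function: SHA-256 here; the paper uses spatiotemporal chaos.
    return hashlib.sha256(data).digest()

def build_levels(blocks):
    """Build all tree levels bottom-up from the data blocks."""
    level = [h(b) for b in blocks]
    levels = [level]
    while len(level) > 1:
        if len(level) % 2:                 # duplicate last node if odd
            level = level + [level[-1]]
        level = [h(level[i] + level[i + 1]) for i in range(0, len(level), 2)]
        levels.append(level)
    return levels

blocks = [b"copy-1", b"copy-2", b"copy-3", b"copy-4"]
levels = build_levels(blocks)
root = levels[-1][0]

# Verify block 2 against the root using its authentication path
idx, node = 2, h(blocks[2])
for level in levels[:-1]:
    sibling = level[idx ^ 1]
    node = h(node + sibling) if idx % 2 == 0 else h(sibling + node)
    idx //= 2
print("integrity OK:", node == root)
```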

  16. KSOS Secure Unix Verification Plan (Kernelized Secure Operating System).

    DTIC Science & Technology

    1980-12-01

    ...shall be handled as proprietary information until 5 April 1978; after that time, the Government may distribute the document as it sees fit. UNIX and PWB... WDL-TR7809, KSOS Verification Plan, Section I (Introduction): "The purpose... funding, additional tools may be available by the time they are needed for KSOS verification. We intend to use the best available technology in...

  17. Verification Image of The Veins on The Back Palm with Modified Local Line Binary Pattern (MLLBP) and Histogram

    NASA Astrophysics Data System (ADS)

    Prijono, Agus; Darmawan Hangkawidjaja, Aan; Ratnadewi; Saleh Ahmar, Ansari

    2018-01-01

    Verification methods used today, such as fingerprints, signatures, and personal identification numbers (PINs) in banking systems, identity cards, and attendance systems, are easily copied and forged. This makes such systems insecure and vulnerable to access by unauthorized persons. In this research, a verification system is implemented using images of the blood vessels on the back of the palm; this pattern is more difficult to imitate because it lies inside the human body, and it is therefore safer to use. The blood vessel pattern on the back of the human hand is unique; even twins have different vessel images. Moreover, the blood vessel image does not depend on a person's age, so it can be used over the long term, except in cases of accident or disease. Because the vein pattern is unique, it can be used to recognize a person. In this paper, we use a modified method for recognizing a person from the blood vessel image, namely the Modified Local Line Binary Pattern (MLLBP). Matching of the extracted blood vessel image features is performed using the Hamming distance. Verification was tested by calculating the percentage of correct acceptances of the same person; a rejection error occurs when a person is not matched by the system against his or her own data. For 10 persons, with 15 images each compared against 5 enrolled vein images per person, a success rate of 80.67% was obtained. A second test verified pairs of images from different persons, i.e. forgeries; verification is correct when the system rejects the forged image. For the ten different persons the forgeries were rejected, with a result of 94%.
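    The matching stage described above reduces to a normalized Hamming distance between binary feature codes. A sketch with synthetic codes follows; the code length, bit-error rate, and decision threshold are assumed, not taken from the paper.

```python
import numpy as np

def hamming_distance(a, b):
    """Fraction of differing bits between two binary codes."""
    return np.count_nonzero(a != b) / a.size

rng = np.random.default_rng(3)
template = rng.integers(0, 2, 1024)                 # enrolled vein code
genuine = template.copy()
genuine[rng.choice(1024, 90, replace=False)] ^= 1   # ~9% of bits differ
impostor = rng.integers(0, 2, 1024)                 # unrelated code

THRESHOLD = 0.25   # assumed decision threshold
for name, probe in [("genuine", genuine), ("impostor", impostor)]:
    d = hamming_distance(template, probe)
    print(f"{name}: d = {d:.3f} -> {'verified' if d < THRESHOLD else 'rejected'}")
```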

  18. A Probabilistic Mass Estimation Algorithm for a Novel 7- Channel Capacitive Sample Verification Sensor

    NASA Technical Reports Server (NTRS)

    Wolf, Michael

    2012-01-01

    A document describes an algorithm created to estimate the mass placed on a sample verification sensor (SVS) designed for lunar or planetary robotic sample return missions. A novel SVS measures the capacitance between a rigid bottom plate and an elastic top membrane in seven locations. As additional sample material (soil and/or small rocks) is placed on the top membrane, the deformation of the membrane increases the capacitance. The mass estimation algorithm addresses both the calibration of each SVS channel and how to combine the capacitances read from the seven channels into a single mass estimate. The probabilistic approach combines the channels according to the variance observed during the training phase, and provides not only the mass estimate but also a value for the certainty of the estimate. SVS capacitance data are collected for known masses under a wide variety of possible loading scenarios, though in all cases the distribution of sample within the canister is expected to be approximately uniform. A capacitance-vs-mass curve is fitted to these data and is subsequently used to determine the mass estimate from a single channel's capacitance reading during the measurement phase. This results in seven different mass estimates, one for each SVS channel. Moreover, the variance of the calibration data is used to place a Gaussian probability density function (pdf) around each mass estimate. To blend these seven estimates, the seven pdfs are combined into a single Gaussian distribution, providing the final mean and variance of the estimate. This blending technique essentially takes the final estimate as an average of the estimates of the seven channels, weighted by the inverse of each channel's variance.
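    The blending step is an inverse-variance-weighted fusion of Gaussian estimates. A minimal sketch with illustrative channel readings (the masses and variances below are invented):

```python
import numpy as np

def fuse(means, variances):
    """Combine independent Gaussian estimates by inverse-variance weighting.
    Returns the fused mean and the variance of the fused estimate."""
    w = 1.0 / np.asarray(variances)
    mean = np.sum(w * means) / np.sum(w)
    var = 1.0 / np.sum(w)
    return mean, var

# Seven per-channel mass estimates (g) and their calibration variances
means = np.array([101.2, 98.7, 100.5, 102.0, 99.1, 100.9, 97.8])
variances = np.array([4.0, 9.0, 2.5, 6.0, 3.0, 5.5, 8.0])
m, v = fuse(means, variances)
print(f"mass = {m:.1f} g, sigma = {np.sqrt(v):.2f} g")
```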

  19. The use of radiochromic EBT2 film for the quality assurance and dosimetric verification of 3D conformal radiotherapy using Microtek ScanMaker 9800XL flatbed scanner

    PubMed Central

    Sim, GS; Ng, KH

    2013-01-01

    Radiochromic and radiographic films are widely used for radiation dosimetry due to the advantages of high spatial resolution and two-dimensional dose measurement. Different types of scanners, including various models of flatbed scanners, have been used as part of the dosimetry readout procedure. This paper focuses on the characterization of the EBT2 film response in combination with a Microtek ScanMaker 9800XL scanner and its subsequent use in the dosimetric verification of a 3D conformal radiotherapy treatment. The film reproducibility and scanner uniformity of the Microtek ScanMaker 9800XL were studied. A three-field 3D conformal radiotherapy treatment was planned on an anthropomorphic phantom and EBT2 film measurements were carried out to verify the treatment. The interfilm reproducibility was found to be 0.25%. Over a period of three months, the films darkened by 1%. The scanner reproducibility was ±2%, and a nonuniformity of ±1.9% was found along the direction perpendicular to the scan direction. EBT2 measurements showed an underdose of 6.2% in the high-dose region compared to the TPS-predicted dose. This may be due to the inability of the treatment planning system to predict the correct dose distribution in the presence of tissue inhomogeneities, together with the uncertainty of the scanner reproducibility and uniformity. The use of EBT2 film in conjunction with the axial CT image of the anthropomorphic phantom allows the evaluation of the anatomical location of dose discrepancies between the EBT2-measured and TPS-predicted dose distributions. PACS number: 87.55.Qr PMID:23835383

  20. Simulating the Distribution of Individual Livestock Farms and Their Populations in the United States: An Example Using Domestic Swine (Sus scrofa domesticus) Farms

    PubMed Central

    Garza, Sarah J.; Miller, Ryan S.

    2015-01-01

    Livestock distribution in the United States (U.S.) can only be mapped at a county-level or worse resolution. We developed a spatial microsimulation model called the Farm Location and Agricultural Production Simulator (FLAPS) that simulated the distribution and populations of individual livestock farms throughout the conterminous U.S. Using domestic pigs (Sus scrofa domesticus) as an example species, we customized iterative proportional-fitting algorithms for the hierarchical structure of the U.S. Census of Agriculture and imputed unpublished state- or county-level livestock population totals that were redacted to ensure confidentiality. We used a weighted sampling design to collect data on the presence and absence of farms and used them to develop a national-scale distribution model that predicted the distribution of individual farms at a 100 m resolution. We implemented microsimulation algorithms that simulated the populations and locations of individual farms using output from our imputed Census of Agriculture dataset and distribution model. Approximately 19% of county-level pig population totals were unpublished in the 2012 Census of Agriculture and needed to be imputed. Using aerial photography, we confirmed the presence or absence of livestock farms at 10,238 locations and found livestock farms were correlated with open areas, cropland, and roads, and also areas with cooler temperatures and gentler topography. The distribution of swine farms was highly variable, but cross-validation of our distribution model produced an area under the receiver-operating characteristics curve value of 0.78, which indicated good predictive performance. Verification analyses showed FLAPS accurately imputed and simulated Census of Agriculture data based on absolute percent difference values of < 0.01% at the state-to-national scale, 3.26% for the county-to-state scale, and 0.03% for the individual farm-to-county scale. Our output data have many applications for risk management of agricultural systems including epidemiological studies, food safety, biosecurity issues, emergency-response planning, and conflicts between livestock and other natural resources. PMID:26571497

  1. Simulating the Distribution of Individual Livestock Farms and Their Populations in the United States: An Example Using Domestic Swine (Sus scrofa domesticus) Farms.

    PubMed

    Burdett, Christopher L; Kraus, Brian R; Garza, Sarah J; Miller, Ryan S; Bjork, Kathe E

    2015-01-01

    Livestock distribution in the United States (U.S.) can only be mapped at a county-level or worse resolution. We developed a spatial microsimulation model called the Farm Location and Agricultural Production Simulator (FLAPS) that simulated the distribution and populations of individual livestock farms throughout the conterminous U.S. Using domestic pigs (Sus scrofa domesticus) as an example species, we customized iterative proportional-fitting algorithms for the hierarchical structure of the U.S. Census of Agriculture and imputed unpublished state- or county-level livestock population totals that were redacted to ensure confidentiality. We used a weighted sampling design to collect data on the presence and absence of farms and used them to develop a national-scale distribution model that predicted the distribution of individual farms at a 100 m resolution. We implemented microsimulation algorithms that simulated the populations and locations of individual farms using output from our imputed Census of Agriculture dataset and distribution model. Approximately 19% of county-level pig population totals were unpublished in the 2012 Census of Agriculture and needed to be imputed. Using aerial photography, we confirmed the presence or absence of livestock farms at 10,238 locations and found livestock farms were correlated with open areas, cropland, and roads, and also areas with cooler temperatures and gentler topography. The distribution of swine farms was highly variable, but cross-validation of our distribution model produced an area under the receiver-operating characteristics curve value of 0.78, which indicated good predictive performance. Verification analyses showed FLAPS accurately imputed and simulated Census of Agriculture data based on absolute percent difference values of < 0.01% at the state-to-national scale, 3.26% for the county-to-state scale, and 0.03% for the individual farm-to-county scale. Our output data have many applications for risk management of agricultural systems including epidemiological studies, food safety, biosecurity issues, emergency-response planning, and conflicts between livestock and other natural resources.
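    Iterative proportional fitting, the core of the imputation step named in both records above, can be shown on a toy two-margin table; FLAPS fits the deeper hierarchy of the Census of Agriculture, and all totals below are invented.

```python
import numpy as np

def ipf(seed, row_totals, col_totals, iters=100, tol=1e-9):
    """Adjust a seed table until its row and column sums match the
    published margins (classical iterative proportional fitting)."""
    table = seed.astype(float).copy()
    for _ in range(iters):
        table *= (row_totals / table.sum(axis=1))[:, None]   # fit rows
        table *= col_totals / table.sum(axis=0)              # fit columns
        if np.allclose(table.sum(axis=1), row_totals, atol=tol):
            break
    return table

seed = np.ones((3, 4))                     # uninformative starting table
rows = np.array([120.0, 300.0, 80.0])      # e.g. county population totals
cols = np.array([50.0, 150.0, 200.0, 100.0])  # e.g. farm-size classes
print(ipf(seed, rows, cols).round(1))
```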

  2. TH-AB-202-02: Real-Time Verification and Error Detection for MLC Tracking Deliveries Using An Electronic Portal Imaging Device

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zwan, B J; Central Coast Cancer Centre, Gosford, NSW; Colvill, E

    2016-06-15

    Purpose: The added complexity of real-time adaptive multi-leaf collimator (MLC) tracking increases the likelihood of undetected MLC delivery errors. In this work we develop and test a system for real-time delivery verification and error detection for MLC tracking radiotherapy using an electronic portal imaging device (EPID). Methods: The delivery verification system relies on acquisition and real-time analysis of transit EPID image frames acquired at 8.41 fps. In-house software was developed to extract the MLC positions from each image frame. Three comparison metrics were used to verify the MLC positions in real-time: (1) field size, (2) field location, and (3) field shape. The delivery verification system was tested for 8 VMAT MLC tracking deliveries (4 prostate and 4 lung) where real patient target motion was reproduced using a Hexamotion motion stage and a Calypso system. Sensitivity and detection delay were quantified for various types of MLC and system errors. Results: For both the prostate and lung test deliveries the MLC-defined field size was measured with an accuracy of 1.25 cm² (1 SD). The field location was measured with an accuracy of 0.6 mm and 0.8 mm (1 SD) for lung and prostate respectively. Field location errors (i.e. tracking in the wrong direction) with a magnitude of 3 mm were detected within 0.4 s of occurrence in the X direction and 0.8 s in the Y direction. Systematic MLC gap errors were detected as small as 3 mm. The method was not found to be sensitive to random MLC errors or individual MLC calibration errors up to 5 mm. Conclusion: EPID imaging may be used for independent real-time verification of MLC trajectories during MLC tracking deliveries. Thresholds have been determined for error detection and the system has been shown to be sensitive to a range of delivery errors.

  3. Evaluation of the 29-km Eta Model. Part 1; Objective Verification at Three Selected Stations

    NASA Technical Reports Server (NTRS)

    Nutter, Paul A.; Manobianco, John; Merceret, Francis J. (Technical Monitor)

    1998-01-01

    This paper describes an objective verification of the National Centers for Environmental Prediction (NCEP) 29-km eta model from May 1996 through January 1998. The evaluation was designed to assess the model's surface and upper-air point forecast accuracy at three selected locations during separate warm (May - August) and cool (October - January) season periods. In order to enhance sample sizes available for statistical calculations, the objective verification includes two consecutive warm and cool season periods. Systematic model deficiencies comprise the larger portion of the total error in most of the surface forecast variables that were evaluated. The error characteristics for both surface and upper-air forecasts vary widely by parameter, season, and station location. At upper levels, a few characteristic biases are identified. Overall however, the upper-level errors are more nonsystematic in nature and could be explained partly by observational measurement uncertainty. With a few exceptions, the upper-air results also indicate that 24-h model error growth is not statistically significant. In February and August 1997, NCEP implemented upgrades to the eta model's physical parameterizations that were designed to change some of the model's error characteristics near the surface. The results shown in this paper indicate that these upgrades led to identifiable and statistically significant changes in forecast accuracy for selected surface parameters. While some of the changes were expected, others were not consistent with the intent of the model updates and further emphasize the need for ongoing sensitivity studies and localized statistical verification efforts. Objective verification of point forecasts is a stringent measure of model performance, but when used alone, is not enough to quantify the overall value that model guidance may add to the forecast process. Therefore, results from a subjective verification of the meso-eta model over the Florida peninsula are discussed in the companion paper by Manobianco and Nutter. Overall verification results presented here and in part two should establish a reasonable benchmark from which model users and developers may pursue the ongoing eta model verification strategies in the future.

  4. Current status of 3D EPID-based in vivo dosimetry in The Netherlands Cancer Institute

    NASA Astrophysics Data System (ADS)

    Mijnheer, B.; Olaciregui-Ruiz, I.; Rozendaal, R.; Spreeuw, H.; van Herk, M.; Mans, A.

    2015-01-01

    3D in vivo dose verification using a-Si EPIDs is performed routinely in our institution for almost all RT treatments. The EPID-based 3D dose distribution is reconstructed using a back-projection algorithm and compared with the planned dose distribution using 3D gamma evaluation. Dose-reconstruction and gamma-evaluation software runs automatically, and deviations outside the alert criteria are immediately available and investigated, in combination with inspection of cone-beam CT scans. The implementation of our 3D EPID-based in vivo dosimetry approach was able to replace pre-treatment verification for more than 90% of the patient treatments. Clinically relevant deviations could be detected for approximately 1 out of 300 patient treatments (IMRT and VMAT). Most of these errors were patient-related anatomical changes or deviations from the routine clinical procedure, and would not have been detected by pre-treatment verification. Moreover, 3D EPID-based in vivo dose verification is a fast and accurate tool to assure the safe delivery of RT treatments. It provides more clinically useful information and is less time consuming than pre-treatment verification measurements. Automated 3D in vivo dosimetry is therefore a prerequisite for large-scale implementation of patient-specific quality assurance of RT treatments.

  5. Verification of road databases using multiple road models

    NASA Astrophysics Data System (ADS)

    Ziems, Marcel; Rottensteiner, Franz; Heipke, Christian

    2017-08-01

    In this paper a new approach for automatic road database verification based on remote sensing images is presented. In contrast to existing methods, the applicability of the new approach is not restricted to specific road types, context areas, or geographic regions. This is achieved by combining several state-of-the-art road detection and road verification approaches that work well under different circumstances. Each one serves as an independent module representing a unique road model and a specific processing strategy. All modules provide independent solutions for the verification problem of each road object stored in the database in the form of two probability distributions: the first for the state of a database object (correct or incorrect), and the second for the state of the underlying road model (applicable or not applicable). In accordance with the Dempster-Shafer theory, both distributions are mapped to a new state space comprising the classes correct, incorrect, and unknown. Statistical reasoning is applied to obtain the optimal state of a road object. A comparison with state-of-the-art road detection approaches using benchmark datasets shows that in general the proposed approach provides results with greater completeness. Additional experiments reveal that, based on the proposed method, a highly reliable semi-automatic approach for road database verification can be designed.
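    Dempster's rule of combination over the state space {correct, incorrect, unknown} can be sketched directly, with "unknown" playing the role of the full frame; the masses below are illustrative, not actual module outputs.

```python
def combine(m1, m2):
    """Dempster's rule for two mass assignments over the focal elements
    {correct}, {incorrect}, and the full frame ("unknown")."""
    states = ("correct", "incorrect")
    conflict = (m1["correct"] * m2["incorrect"]
                + m1["incorrect"] * m2["correct"])
    k = 1.0 - conflict                      # normalization constant
    m = {}
    for s in states:
        # {s} results from {s}x{s}, {s}x{unknown}, {unknown}x{s}
        m[s] = (m1[s] * m2[s] + m1[s] * m2["unknown"]
                + m1["unknown"] * m2[s]) / k
    m["unknown"] = m1["unknown"] * m2["unknown"] / k
    return m

module_a = {"correct": 0.6, "incorrect": 0.1, "unknown": 0.3}
module_b = {"correct": 0.5, "incorrect": 0.2, "unknown": 0.3}
print(combine(module_a, module_b))
```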

  6. Dose distribution verification for GYN brachytherapy using EBT Gafchromic film and TG-43 calculation.

    PubMed

    Gholami, Somayeh; Mirzaei, Hamid Reza; Jabbary Arfaee, Ali; Jaberi, Ramin; Nedaie, Hassan Ali; Rabi Mahdavi, Seied; Rajab Bolookat, Eftekhar; Meigooni, Ali S

    2016-01-01

    Verification of dose distributions for gynecological (GYN) brachytherapy implants using EBT Gafchromic film. One major challenge in brachytherapy is to verify the accuracy of dose distributions calculated by a treatment planning system. A new phantom was designed and fabricated using 90 slabs of 18 cm × 16 cm × 0.2 cm Perspex to accommodate a tandem-and-ovoid assembly, which is normally used for GYN brachytherapy treatment. This phantom design allows the use of EBT Gafchromic films for dosimetric verification of GYN implants with a cobalt-60 HDR system or an LDR Cs-137 system. Gafchromic films were exposed using a plan designed to deliver 1.5 Gy at a distance of 0.5 cm from the lateral surface of the ovoids, using a pair of ovoids of the kind used to treat the vaginal cuff. For a quantitative analysis of the results for both the LDR and HDR systems, the measured dose values at several points of interest were compared with the calculated data from a commercially available treatment planning system. This planning system utilized the TG-43 formalism and parameters for calculation of dose distributions around a brachytherapy implant. The results of these investigations indicated that the differences between the calculated and measured data at the different points ranged from 2.4% to 3.8% for the LDR Cs-137 and HDR Co-60 systems, respectively. The EBT Gafchromic films combined with the newly designed phantom can be utilized for verification of the dose distributions around different GYN implants treated with either LDR or HDR brachytherapy procedures.
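    For context, the 1D TG-43 point-source formalism referenced above has the form D(r) = S_K · Λ · (r0/r)² · g(r) · φ_an(r) with r0 = 1 cm. The sketch below uses placeholder fits for g(r) and φ_an(r) and an invented source strength; it is not published consensus data for any specific source.

```python
R0 = 1.0  # TG-43 reference distance, cm

def radial_dose(r):
    """Placeholder radial dose function g(r) (illustrative linear fit)."""
    return 1.0 - 0.005 * (r - R0)

def anisotropy(r):
    """Placeholder 1D anisotropy factor phi_an(r)."""
    return 0.98

def dose_rate(r_cm, sk=40000.0, lam=1.1):
    """Dose rate (cGy/h) at r_cm for air-kerma strength sk (U) and
    dose-rate constant lam (cGy h^-1 U^-1); both values are assumed."""
    return sk * lam * (R0 / r_cm) ** 2 * radial_dose(r_cm) * anisotropy(r_cm)

for r in (0.5, 1.0, 2.0, 5.0):
    print(f"r = {r:>3} cm: {dose_rate(r):8.1f} cGy/h")
```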

  7. ENVIRONMENTAL TECHNOLOGY VERIFICATION: TEST REPORT OF MOBILE SOURCE EMISSION CONTROL DEVICES--PUREM NORTH AMERICA LLC, PMF GREENTEC 1004205.00.0 DIESEL PARTICULATE FILTER

    EPA Science Inventory

    The U.S. EPA has created the Environmental Technology Verification (ETV) program to provide high quality, peer reviewed data on technology performance to those involved in the design, distribution, financing, permitting, purchase, and use of environmental technologies. The Air Po...

  8. Resolution verification targets for airborne and spaceborne imaging systems at the Stennis Space Center

    NASA Astrophysics Data System (ADS)

    McKellip, Rodney; Yuan, Ding; Graham, William; Holland, Donald E.; Stone, David; Walser, William E.; Mao, Chengye

    1997-06-01

    The number of available spaceborne and airborne systems will dramatically increase over the next few years. A common systematic approach toward verification of these systems will become important for comparing the systems' operational performance. The Commercial Remote Sensing Program at the John C. Stennis Space Center (SSC) in Mississippi has developed design requirements for a remote sensing verification target range to provide a means to evaluate spatial, spectral, and radiometric performance of optical digital remote sensing systems. The verification target range consists of spatial, spectral, and radiometric targets painted on a 150- by 150-meter concrete pad located at SSC. The design criteria for this target range are based upon work over a smaller, prototypical target range at SSC during 1996. This paper outlines the purpose and design of the verification target range based upon an understanding of the systems to be evaluated as well as data analysis results from the prototypical target range.

  9. Verification of Ceramic Structures

    NASA Astrophysics Data System (ADS)

    Behar-Lafenetre, Stephanie; Cornillon, Laurence; Rancurel, Michael; De Graaf, Dennis; Hartmann, Peter; Coe, Graham; Laine, Benoit

    2012-07-01

    In the framework of the “Mechanical Design and Verification Methodologies for Ceramic Structures” contract [1] awarded by ESA, Thales Alenia Space has investigated literature and practices in affiliated industries to propose a methodological guideline for verification of ceramic spacecraft and instrument structures. It has been written in order to be applicable to most types of ceramic or glass-ceramic materials - typically Cesic®, HBCesic®, Silicon Nitride, Silicon Carbide and ZERODUR®. The proposed guideline describes the activities to be performed at material level in order to cover all the specific aspects of ceramics (Weibull distribution, brittle behaviour, sub-critical crack growth). Elementary tests and their post-processing methods are described, and recommendations for optimization of the test plan are given in order to have a consistent database. The application of this method is shown on an example in a dedicated article [7]. Then the verification activities to be performed at system level are described. This includes classical verification activities based on the relevant standard (ECSS Verification [4]), plus specific analytical, testing and inspection features. The analysis methodology takes into account the specific behaviour of ceramic materials, especially the statistical distribution of failures (Weibull) and the method to transfer it from elementary data to a full-scale structure. The demonstration of the efficiency of this method is described in a dedicated article [8]. The verification is completed by classical full-scale testing activities. Indications about proof testing, cases of use, and implementation are given, and specific inspection and protection measures are described. These additional activities are necessary to ensure the required reliability. The aim of the guideline is to describe how to reach the same reliability level as for structures made of more classical materials (metals, composites).

  10. National Center for Nuclear Security - NCNS

    ScienceCinema

    None

    2018-01-16

    As the United States embarks on a new era of nuclear arms control, the tools for treaty verification must be accurate and reliable, and must work at stand-off distances. The National Center for Nuclear Security, or NCNS, at the Nevada National Security Site, is poised to become the proving ground for these technologies. The center is a unique test bed for non-proliferation and arms control treaty verification technologies. The NNSS is an ideal location for these kinds of activities because of its multiple environments; its cadre of experienced nuclear personnel, and the artifacts of atmospheric and underground nuclear weapons explosions. The NCNS will provide future treaty negotiators with solid data on verification and inspection regimes and a realistic environment in which future treaty verification specialists can be trained. Work on warhead monitoring at the NCNS will also support future arms reduction treaties.

  11. The Mailbox Computer System for the IAEA verification experiment on HEU downblending at the Portsmouth Gaseous Diffusion Plant

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Aronson, A.L.; Gordon, D.M.

    In April 1996, the United States (US) added the Portsmouth Gaseous Diffusion Plant to the list of facilities eligible for the application of International Atomic Energy Agency (IAEA) safeguards. At that time, the US proposed that the IAEA carry out a "verification experiment" at the plant with respect to downblending of about 13 metric tons of highly enriched uranium (HEU) in the form of uranium hexafluoride (UF6). During the period December 1997 through July 1998, the IAEA carried out the requested verification experiment. The verification approach used for this experiment included, among other measures, the entry of process-operational data by the facility operator on a near-real-time basis into a "mailbox" computer located within a tamper-indicating enclosure sealed by the IAEA.

  12. Observing atmospheric water in storms with the Nimbus 7 scanning multichannel microwave radiometer

    NASA Technical Reports Server (NTRS)

    Katsaros, K. B.; Lewis, R. M.

    1984-01-01

    Employing data on integrated atmospheric water vapor, total cloud liquid water and rain rate obtainable from the Nimbus 7 Scanning Multichannel Microwave Radiometer (SMMR), we study the frontal structure of several mid-latitude cyclones over the North Pacific Ocean as they approach the West Coast of North America in the winter of 1979. The fronts, analyzed with all available independent data, are consistently located at the leading edge of the strongest gradient in integrated water vapor. The cloud liquid water content, which unfortunately has received very little in situ verification, has patterns which are consistent with the structure seen in visible and infrared imagery. The rain distribution is also a good indicator of frontal location and rain amounts are generally within a factor of two of what is observed with rain gauges on the coast. Furthermore, the onset of rain on the coast can often be accurately forecast by simple advection of the SMMR observed rain areas.

  13. Mesoscale and synoptic scale features of North Pacific weather systems observed with the scanning multichannel microwave radiometer on Nimbus 7

    NASA Technical Reports Server (NTRS)

    Katsaros, K. B.; Lewis, R. M.

    1986-01-01

    Employing data on integrated atmospheric water vapor, total cloud liquid water and rain rate obtainable from the Nimbus 7 Scanning Multichannel Microwave Radiometer (SMMR), we examine the frontal structure of several mid-latitude cyclones over the North Pacific Ocean as they approach the West Coast of North America in the winter of 1979. The fronts, analyzed with all available independent data, are consistently located at the leading edge of the strongest gradient in integrated water vapor. The cloud liquid water content, which unfortunately has received very little in situ verification, has patterns which are consistent with the structure seen in visible and infrared imagery. The rain distribution is also a good indicator of frontal location and rain amounts are generally within a factor of two of what is observed with rain gauges on the coast. Furthermore, the onset of rain on the coast can often be accurately forecast by simple advection of the SMMR observed rain areas.

  14. Interim Letter Report - Verification Survey of 19 Grids in the Lester Flat Area, David Witherspoon Inc. 1630 Site Knoxville, Tennessee

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    P.C. Weaver

    2008-10-17

    Perform verification surveys of 19 available grids located in the Lester Flat Area at the David Witherspoon Site. The survey grids included E11, E12, E13, F11, F12, F13, F14, F15, G15, G16, G17, H16, H17, H18, X16, X17, X18, K16, and J16.

  15. Dust storm events over Delhi: verification of dust AOD forecasts with satellite and surface observations

    NASA Astrophysics Data System (ADS)

    Singh, Aditi; Iyengar, Gopal R.; George, John P.

    2016-05-01

    The Thar Desert, located in the northwestern part of India, is considered one of the major dust sources. Dust storms originating in the Thar Desert during the pre-monsoon season affect a large part of the Indo-Gangetic Plains. High dust loading causes deterioration of ambient air quality and degradation in visibility. The present study focuses on the identification of dust events and verification of forecasts of dust events over Delhi and the western part of the IG Plains during the pre-monsoon season of 2015. Three dust events were identified over Delhi during the study period. For all the selected days, Terra-MODIS AODs at 550 nm are found to be close to 1.0, while AURA-OMI AI shows high values. Dust AOD forecasts from the NCMRWF Unified Model (NCUM) for the three selected dust events are verified against satellite (MODIS) and ground-based observations (AERONET). Comparison of observed AODs at 550 nm from MODIS with NCUM-predicted AODs reveals that NCUM is able to predict the spatial and temporal distribution of dust AOD in these cases. Good correlation (~0.67) is obtained between the NCUM-predicted dust AODs and location-specific observations available from AERONET. The model under-predicted the AODs as compared to the AERONET observations, mainly because the model accounts only for dust and no anthropogenic activities are considered. The results of the present study emphasize the requirement of a more realistic representation of local dust emission in the model, of both natural and anthropogenic origin, to improve the forecast of dust from NCUM during dust events.

  16. TU-C-BRE-11: 3D EPID-Based in Vivo Dosimetry: A Major Step Forward Towards Optimal Quality and Safety in Radiation Oncology Practice

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mijnheer, B; Mans, A; Olaciregui-Ruiz, I

    Purpose: To develop a 3D in vivo dosimetry method that is able to substitute pre-treatment verification in an efficient way, and to terminate treatment delivery if the online measured 3D dose distribution deviates too much from the predicted dose distribution. Methods: A back-projection algorithm has been further developed and implemented to enable automatic 3D in vivo dose verification of IMRT/VMAT treatments using a-Si EPIDs. New software tools were clinically introduced to allow automated image acquisition, to periodically inspect the record-and-verify database, and to automatically run the EPID dosimetry software. The comparison of the EPID-reconstructed and planned dose distribution is done offline to raise alerts automatically and to schedule actions when deviations are detected. Furthermore, a software package for online dose reconstruction was also developed. The RMS of the difference between the cumulative planned and reconstructed 3D dose distributions was used for triggering a halt of a linac. Results: The implementation of fully automated 3D EPID-based in vivo dosimetry was able to replace pre-treatment verification for more than 90% of the patient treatments. The process has been fully automated and integrated in our clinical workflow where over 3,500 IMRT/VMAT treatments are verified each year. By optimizing the dose reconstruction algorithm and the I/O performance, the delivered 3D dose distribution is verified in less than 200 ms per portal image, which includes the comparison between the reconstructed and planned dose distribution. In this way it was possible to generate a trigger that can stop the irradiation at less than 20 cGy after introducing large delivery errors. Conclusion: The automatic offline solution facilitated the large scale clinical implementation of 3D EPID-based in vivo dose verification of IMRT/VMAT treatments; the online approach has been successfully tested for various severe delivery errors.

  17. Experimental measurement-device-independent verification of quantum steering

    NASA Astrophysics Data System (ADS)

    Kocsis, Sacha; Hall, Michael J. W.; Bennet, Adam J.; Saunders, Dylan J.; Pryde, Geoff J.

    2015-01-01

    Bell non-locality between distant quantum systems—that is, joint correlations which violate a Bell inequality—can be verified without trusting the measurement devices used, nor those performing the measurements. This leads to unconditionally secure protocols for quantum information tasks such as cryptographic key distribution. However, complete verification of Bell non-locality requires high detection efficiencies, and is not robust to typical transmission losses over long distances. In contrast, quantum or Einstein-Podolsky-Rosen steering, a weaker form of quantum correlation, can be verified for arbitrarily low detection efficiencies and high losses. The cost is that current steering-verification protocols require complete trust in one of the measurement devices and its operator, allowing only one-sided secure key distribution. Here we present measurement-device-independent steering protocols that remove this need for trust, even when Bell non-locality is not present. We experimentally demonstrate this principle for singlet states and states that do not violate a Bell inequality.

  18. Experimental measurement-device-independent verification of quantum steering.

    PubMed

    Kocsis, Sacha; Hall, Michael J W; Bennet, Adam J; Saunders, Dylan J; Pryde, Geoff J

    2015-01-07

    Bell non-locality between distant quantum systems--that is, joint correlations which violate a Bell inequality--can be verified without trusting the measurement devices used, nor those performing the measurements. This leads to unconditionally secure protocols for quantum information tasks such as cryptographic key distribution. However, complete verification of Bell non-locality requires high detection efficiencies, and is not robust to typical transmission losses over long distances. In contrast, quantum or Einstein-Podolsky-Rosen steering, a weaker form of quantum correlation, can be verified for arbitrarily low detection efficiencies and high losses. The cost is that current steering-verification protocols require complete trust in one of the measurement devices and its operator, allowing only one-sided secure key distribution. Here we present measurement-device-independent steering protocols that remove this need for trust, even when Bell non-locality is not present. We experimentally demonstrate this principle for singlet states and states that do not violate a Bell inequality.

  19. SU-F-T-229: A Novel Method for EPID-Based In-Vivo Exit Dose Verification for Intensity Modulated Radiotherapy

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wu, Z; Wang, J; Peng, J

    Purpose: Electronic portal imaging devices (EPIDs) can be used to acquire a two-dimensional exit dose distribution during treatment delivery, thus allowing in-vivo verification of the dose delivery through a comparison of measured portal images to predicted portal dose images (PDIs). The aim of this study was to present a novel method to easily and accurately predict PDIs, and to establish an EPID-based in-vivo dose verification method for IMRT treatments. Methods: We developed a model to determine the predicted portal dose at the plane of the EPID detector location. The Varian EPID (aS1000) is positioned at 150 cm source-to-detector distance (SDD) and can be used to acquire the in-vivo exit dose using the Portal Dosimetry (PD) function. Our model uses an equivalent water thickness to represent the buildup plate of the EPID. The exit dose at an extended SDD plane, with the patient CT data in the beam, can be calculated as the predicted PDI in the treatment planning system (TPS). After that, the PDI was converted to the fluence at an SDD of 150 cm using the inverse square law coded in MATLAB. Five head-and-neck and prostate IMRT patient plans containing 32 fields were investigated to evaluate the feasibility of this new method. The measured EPID image was compared with the PDI using gamma analysis. Results: The average results for the cumulative dose comparison were 81.9% and 91.6% for 3%, 3 mm and 4%, 4 mm gamma criteria, respectively. The results indicate that the patient transit dosimetry prediction algorithm compares well with EPID-measured PD doses for the test situations. Conclusion: Our new method can be used as an easy and feasible tool for online EPID-based in-vivo dose delivery verification for IMRT treatments. It can be implemented for fast detection of obvious treatment delivery errors for individual field and patient quality assurance.
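
    The conversion step described above, rescaling a TPS-predicted dose plane at an extended calculation plane back to the EPID plane via the inverse square law, can be sketched in a few lines. The study used MATLAB; the Python sketch below is only an illustration of the same idea, and the SDD values and dose plane are hypothetical.

        import numpy as np

        def rescale_to_sdd(dose_plane, sdd_calc, sdd_epid=150.0):
            """Scale a predicted portal dose plane computed at sdd_calc (cm)
            to the EPID plane at sdd_epid (cm) using the inverse square law:
            dose scales with 1/r^2 along the beam axis."""
            return dose_plane * (sdd_calc / sdd_epid) ** 2

        # Example: a 5x5 dose plane (cGy) predicted at 160 cm, rescaled to 150 cm.
        pdi_160 = np.full((5, 5), 100.0)
        pdi_150 = rescale_to_sdd(pdi_160, sdd_calc=160.0)
        print(pdi_150[0, 0])  # 100 * (160/150)^2, approx. 113.8 cGy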

  20. Litmus tests for verification of feeding tube location in infants: evaluation of their clinical use.

    PubMed

    Nyqvist, Kerstin Hedberg; Sorell, Annette; Ewald, Uwe

    2005-04-01

    To examine the clinical use of litmus paper tests for the assessment of aspirates in infants. In connection with establishing a programme for home care of infants requiring tube feeding, with parents as the infants' carers, nurses identified the need for a research-based method for verification of feeding tube position as a complement to other methods. In adult care the litmus paper test is commonly used when visual inspection is not sufficient for assessment of aspirates obtained from feeding tubes. Observational study. Nurses performed litmus tests for verification of feeding tube location in a convenience sample of 60 infants born at a gestational age (GA) of 24-42 weeks. Presence/absence and volumes of aspirates were recorded, as well as positive/negative litmus test reactions. Analyses of the association between test results and the infants' GA and postmenstrual and postnatal age at the time of the tests were conducted. Data were obtained from 2970 tube feeds. Aspirates were present on 1840 occasions (62%). A higher proportion of infants with absence of aspirates were born at a GA below 32 weeks. A positive reaction occurred in 97% of the tests, in volumes between 0.01 and 22 ml. Birth at a GA below 32 weeks and respiratory problems were associated with negative tests. The high ratio of positive litmus reactions at all maturational levels supports the bedside use of pH analysis of gastric aspirates for verification of feeding tube location. Application of pH indicator paper is recommended as a complementary method for assessment of aspirates from feeding tubes.

  1. Verification survey report of the south waste tank farm training/test tower and hazardous waste storage lockers at the West Valley demonstration project, West Valley, New York

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Weaver, Phyllis C.

    A team from ORAU's Independent Environmental Assessment and Verification Program performed verification survey activities on the South Test Tower and four Hazardous Waste Storage Lockers. Scan data collected by ORAU determined that both the alpha and alpha-plus-beta activity was representative of radiological background conditions. The count rate distribution showed no outliers that would be indicative of alpha or alpha-plus-beta count rates in excess of background. It is the opinion of ORAU that independent verification data collected support the site's conclusions that the South Tower and Lockers sufficiently meet the site criteria for release to recycle and reuse.

  2. An integrated user-oriented laboratory for verification of digital flight control systems: Features and capabilities

    NASA Technical Reports Server (NTRS)

    Defeo, P.; Doane, D.; Saito, J.

    1982-01-01

    A Digital Flight Control Systems Verification Laboratory (DFCSVL) has been established at NASA Ames Research Center. This report describes the major elements of the laboratory, the research activities that can be supported in the area of verification and validation of digital flight control systems (DFCS), and the operating scenarios within which these activities can be carried out. The DFCSVL consists of a palletized dual-dual flight-control system linked to a dedicated PDP-11/60 processor. Major software support programs are hosted in a remotely located UNIVAC 1100 accessible from the PDP-11/60 through a modem link. Important features of the DFCSVL include extensive hardware and software fault insertion capabilities, a real-time closed loop environment to exercise the DFCS, an integrated set of software verification tools, and a user-oriented interface to all the resources and capabilities.

  3. Location-assured, multifactor authentication on smartphones via LTE communication

    NASA Astrophysics Data System (ADS)

    Kuseler, Torben; Lami, Ihsan A.; Al-Assam, Hisham

    2013-05-01

    With the added security provided by LTE, geographical location has become an important factor for authentication to enhance the security of remote client authentication during mCommerce applications using Smartphones. Tight combination of geographical location with classic authentication factors like PINs/biometrics in a real-time, remote verification scheme over the LTE layer connection assures the authenticator about the client itself (via PIN/biometric) as well as the client's current location, thus defining the important aspects of "who", "when", and "where" of the authentication attempt without eavesdropping or man-in-the-middle attacks. To securely integrate location as an authentication factor into the remote authentication scheme, the client's location must be verified independently, i.e. the authenticator should not rely solely on the location determined on and reported by the client's Smartphone. The latest wireless data communication technology for mobile phones (4G LTE, Long-Term Evolution), recently being rolled out in various networks, can be employed to meet this requirement for independent location verification. LTE's Control Plane LBS provisions, when integrated with user-based authentication and independent sources of localisation factors, ensure secure, efficient, continuous location tracking of the Smartphone. This tracking can be performed during normal operation of the LTE-based communication between client and network operator, enabling the authenticator to verify the client's claimed location more securely and accurately. Trials and experiments show that such an algorithm implementation is viable for today's Smartphone-based banking via LTE communication.

  4. Anatomy-corresponding method of IMRT verification.

    PubMed

    Winiecki, Janusz; Zurawski, Zbigniew; Drzewiecka, Barbara; Slosarek, Krzysztof

    2010-01-01

    During the execution of dMLC plans, an undesired but frequent effect occurs: the dose locally accumulated in tissue can differ significantly from the expected dose. Conventional dosimetric QA procedures give only a partial picture of the quality of IMRT treatment, because their solely quantitative outcomes usually correspond more to the total area of the detector than to the actually irradiated volume. The aim of this investigation was to develop a procedure for the verification of dynamic plans that can visualize potential anomalies of the dose distribution and specify exactly which tissue they refer to. The paper presents a method developed and clinically examined in our department. It is based on the Gamma Evaluation concept and allows accurate localization of deviations between predicted and acquired dose distributions, registered by portal as well as film dosimetry. All calculations were performed with the in-house software GammaEval; γ-images (2-dimensional distributions of γ-values) and γ-histograms were created as quantitative outcomes of the verification. Over 150 maps of dose distribution have been analyzed, and a cross-examination of the gamma images with DRRs was performed. It seems that complex monitoring of treatment would be possible owing to the images obtained as a cross-examination of γ-images and corresponding DRRs.
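
    For reference, the γ-value underlying such γ-images combines a dose-difference criterion with a distance-to-agreement criterion. The following is a minimal brute-force Python sketch of a global 2D gamma evaluation (not the GammaEval software itself); the grids, spacing and 3%/3 mm criteria are illustrative.

        import numpy as np

        def gamma_index(ref, eval_, spacing_mm, dose_crit=0.03, dist_crit_mm=3.0):
            """Brute-force global gamma of two 2D dose maps on the same grid.
            dose_crit is a fraction of the reference maximum (global norm)."""
            dd = dose_crit * ref.max()
            ny, nx = ref.shape
            yy, xx = np.mgrid[0:ny, 0:nx]
            gamma = np.empty_like(ref, dtype=float)
            for iy in range(ny):
                for ix in range(nx):
                    # squared distance and dose difference to every eval point
                    r2 = ((yy - iy) ** 2 + (xx - ix) ** 2) * spacing_mm ** 2
                    d2 = (eval_ - ref[iy, ix]) ** 2
                    gamma[iy, ix] = np.sqrt(np.min(r2 / dist_crit_mm ** 2 + d2 / dd ** 2))
            return gamma

        ref = np.random.default_rng(1).random((20, 20)) * 200.0  # cGy, synthetic
        ev = ref * 1.02                       # a 2% global scaling error
        g = gamma_index(ref, ev, spacing_mm=1.0)
        print((g <= 1.0).mean())              # gamma passing rate

    Production implementations restrict the spatial search to a neighbourhood of each point and interpolate the evaluated grid; the exhaustive search here is only meant to make the definition explicit.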

  5. Integrating Fingerprint Verification into the Smart Card-Based Healthcare Information System

    NASA Astrophysics Data System (ADS)

    Moon, Daesung; Chung, Yongwha; Pan, Sung Bum; Park, Jin-Won

    2009-12-01

    As VLSI technology has been improved, a smart card employing 32-bit processors has been released, and more personal information such as medical, financial data can be stored in the card. Thus, it becomes important to protect personal information stored in the card. Verification of the card holder's identity using a fingerprint has advantages over the present practices of Personal Identification Numbers (PINs) and passwords. However, the computational workload of fingerprint verification is much heavier than that of the typical PIN-based solution. In this paper, we consider three strategies to implement fingerprint verification in a smart card environment and how to distribute the modules of fingerprint verification between the smart card and the card reader. We first evaluate the number of instructions of each step of a typical fingerprint verification algorithm, and estimate the execution time of several cryptographic algorithms to guarantee the security/privacy of the fingerprint data transmitted in the smart card with the client-server environment. Based on the evaluation results, we analyze each scenario with respect to the security level and the real-time execution requirements in order to implement fingerprint verification in the smart card with the client-server environment.
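
    The execution-time estimates discussed above reduce to dividing step-by-step instruction counts by processor throughput for each placement of a module. A toy Python sketch follows; the instruction counts and MIPS ratings are hypothetical placeholders, not measurements from the paper.

        def execution_time_ms(instruction_count, mips):
            """Rough execution-time estimate: instructions / (MIPS * 1e6), in ms."""
            return instruction_count / (mips * 1e6) * 1e3

        # Hypothetical split of a fingerprint-verification pipeline between a
        # 32-bit smart card (~50 MIPS) and the card reader (~1000 MIPS):
        steps = {"feature extraction": 300e6, "alignment": 120e6, "matching": 30e6}
        for name, instr in steps.items():
            print(f"{name:18s} card: {execution_time_ms(instr, 50):7.1f} ms  "
                  f"reader: {execution_time_ms(instr, 1000):6.1f} ms")

    Comparing the two columns per step is what drives the choice of scenario: heavy steps favour the reader, but moving them off the card adds the cost of encrypting the fingerprint data in transit.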

  6. Verification and Validation for Flight-Critical Systems (VVFCS)

    NASA Technical Reports Server (NTRS)

    Graves, Sharon S.; Jacobsen, Robert A.

    2010-01-01

    On March 31, 2009 a Request for Information (RFI) was issued by NASA's Aviation Safety Program to gather input on the subject of Verification and Validation (V & V) of Flight-Critical Systems. The responses were provided to NASA on or before April 24, 2009. The RFI asked for comments in three topic areas: Modeling and Validation of New Concepts for Vehicles and Operations; Verification of Complex Integrated and Distributed Systems; and Software Safety Assurance. There were a total of 34 responses to the RFI, representing a cross-section of academic (26%), small & large industry (47%) and government agency (27%).

  7. Multi-ball and one-ball geolocation and location verification

    NASA Astrophysics Data System (ADS)

    Nelson, D. J.; Townsend, J. L.

    2017-05-01

    We present analysis methods that may be used to geolocate emitters using one or more moving receivers. While some of the methods we present may apply to a broader class of signals, our primary interest is locating and tracking ships from short pulsed transmissions, such as the maritime Automatic Identification System (AIS). The AIS signal is difficult to process and track since the pulse duration is only 25 milliseconds, and the pulses may only be transmitted every six to ten seconds. Several fundamental problems are addressed, including demodulation of AIS/GMSK signals, verification of the emitter location, accurate frequency and delay estimation, and identification of pulse trains from the same emitter. In particular, we present several new correlation methods, including cross-cross correlation, which greatly improves correlation accuracy over conventional methods, and cross-TDOA and cross-FDOA functions, which make it possible to estimate time and frequency delay without the need to compute a two-dimensional cross-ambiguity surface. By isolating pulses from the same emitter and accurately tracking the received signal frequency, we are able to accurately estimate the emitter location from the received Doppler characteristics.
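
    Estimating an emitter location from received Doppler characteristics, as described above, can be illustrated with a simple grid-search sketch: predict the Doppler history for each candidate location and keep the best least-squares match. This is not the authors' correlation method, only a hedged Python illustration with hypothetical receiver geometry (AIS transmits near 162 MHz).

        import numpy as np

        C = 3.0e8  # speed of light, m/s

        def doppler_profile(emitter, rx_positions, rx_velocities, f0):
            """Predicted received frequency over time for a stationary emitter,
            given receiver positions/velocities as N x 3 arrays."""
            los = emitter - rx_positions
            los_unit = los / np.linalg.norm(los, axis=1, keepdims=True)
            v_closing = np.sum(rx_velocities * los_unit, axis=1)  # m/s toward emitter
            return f0 * (1.0 + v_closing / C)

        def locate_by_doppler(freq_meas, candidates, rx_positions, rx_velocities, f0):
            """Pick the candidate whose predicted Doppler history best matches
            the measured frequencies (least squares)."""
            costs = [np.sum((doppler_profile(c, rx_positions, rx_velocities, f0)
                             - freq_meas) ** 2) for c in candidates]
            return candidates[int(np.argmin(costs))]

        # Hypothetical: one receiver flying east at 200 m/s, five samples.
        t = np.arange(5.0)
        rx_pos = np.stack([200.0 * t, np.zeros(5), np.full(5, 3000.0)], axis=1)
        rx_vel = np.tile([200.0, 0.0, 0.0], (5, 1))
        truth = np.array([5000.0, 2000.0, 0.0])
        f_meas = doppler_profile(truth, rx_pos, rx_vel, f0=162.0e6)
        cands = [truth, np.array([4000.0, 2500.0, 0.0])]
        print(locate_by_doppler(f_meas, cands, rx_pos, rx_vel, 162.0e6))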

  8. Dosimetric verification for intensity-modulated arc therapy plans by use of 2D diode array, radiochromic film and radiosensitive polymer gel.

    PubMed

    Hayashi, Naoki; Malmin, Ryan L; Watanabe, Yoichi

    2014-05-01

    Several tools are used for the dosimetric verification of intensity-modulated arc therapy (IMAT) treatment delivery. However, limited information is available for composite on-line evaluation of these tools. The purpose of this study was to evaluate the dosimetric verification of IMAT treatment plans using a 2D diode array detector (2D array), radiochromic film (RCF) and radiosensitive polymer gel dosimeter (RPGD). The specific verification plans were created for IMAT for two prostate cancer patients by use of the clinical treatment plans. Accordingly, the IMAT deliveries were performed with the 2D array on a gantry-mounting device, RCF in a cylindrical acrylic phantom, and the RPGD in two cylindrical phantoms. After the irradiation, the planar dose distributions from the 2D array and the RCFs, and the 3D dose distributions from the RPGD measurements were compared with the calculated dose distributions using the gamma analysis method (3% dose difference and 3-mm distance-to-agreement criterion), dose-dependent dose difference diagrams, dose difference histograms, and isodose distributions. The gamma passing rates of 2D array, RCFs and RPGD for one patient were 99.5%, 96.5% and 93.7%, respectively; the corresponding values for the second patient were 97.5%, 92.6% and 92.9%. Mean percentage differences between the RPGD measured and calculated doses in 3D volumes containing PTVs were -0.29 ± 7.1% and 0.97 ± 7.6% for the two patients, respectively. In conclusion, IMAT prostate plans can be delivered with high accuracy, although the 3D measurements indicated less satisfactory agreement with the treatment plans, mainly due to the dosimetric inaccuracy in low-dose regions of the RPGD measurements.

  9. Verification of ICESat-2/ATLAS Science Receiver Algorithm Onboard Databases

    NASA Astrophysics Data System (ADS)

    Carabajal, C. C.; Saba, J. L.; Leigh, H. W.; Magruder, L. A.; Urban, T. J.; Mcgarry, J.; Schutz, B. E.

    2013-12-01

    NASA's ICESat-2 mission will fly the Advanced Topographic Laser Altimetry System (ATLAS) instrument on a 3-year mission scheduled to launch in 2016. ATLAS is a single-photon detection system transmitting at 532 nm with a laser repetition rate of 10 kHz and a six-spot pattern on the Earth's surface. A set of onboard Receiver Algorithms will perform signal processing to reduce the data rate and data volume to acceptable levels. These algorithms distinguish surface echoes from the background noise, limit the daily data volume, and allow the instrument to telemeter only a small vertical region about the signal. For this purpose, three onboard databases are used: a Surface Reference Map (SRM), a Digital Elevation Model (DEM), and Digital Relief Maps (DRMs). The DEM provides minimum and maximum heights that limit the signal search region of the onboard algorithms, including a margin for errors in the source databases and onboard geolocation. Since the surface echoes will be correlated while noise will be randomly distributed, the signal location is found by histogramming the received event times and identifying the histogram bins with statistically significant counts. Once the signal location has been established, the onboard DRMs will be used to determine the vertical width of the telemetry band about the signal. The University of Texas Center for Space Research (UT-CSR) is developing the ICESat-2 onboard databases, which are currently being tested using preliminary versions and equivalent representations of elevation ranges and relief more recently developed at Goddard Space Flight Center (GSFC). Global and regional elevation models have been assessed in terms of their accuracy using ICESat geodetic control, and have been used to develop equivalent representations of the onboard databases for testing against the UT-CSR databases, with special emphasis on the ice sheet regions. A series of verification checks have been implemented, including comparisons against ICESat altimetry for selected regions with tall vegetation and high relief. The extensive verification effort by the Receiver Algorithm team at GSFC is aimed at assuring that the onboard databases are sufficiently accurate. We will present the results of those assessments and verification tests, along with measures taken to implement modifications to the databases to optimize their use by the receiver algorithms. Companion presentations by McGarry et al. and Leigh et al. describe the details of the ATLAS onboard receiver algorithms and database development, respectively.
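
    The signal-finding step described above, histogramming photon event times and flagging statistically significant bins against random background, can be sketched compactly. The Python sketch below assumes Poisson background noise and uses synthetic event data; it illustrates the idea, not the flight algorithm.

        import numpy as np

        def find_signal_bin(event_times_ns, bin_width_ns, n_sigma=5.0):
            """Histogram photon event times and flag bins whose counts are
            statistically significant, assuming Poisson background noise."""
            edges = np.arange(event_times_ns.min(),
                              event_times_ns.max() + bin_width_ns, bin_width_ns)
            counts, _ = np.histogram(event_times_ns, bins=edges)
            background = np.median(counts)  # robust noise-level estimate
            threshold = background + n_sigma * np.sqrt(max(background, 1.0))
            hits = np.where(counts > threshold)[0]
            return edges, counts, hits

        rng = np.random.default_rng(0)
        noise = rng.uniform(0, 1e5, 5000)            # uniform background photons
        signal = rng.normal(42_000, 50, 400)         # concentrated surface return
        events = np.concatenate([noise, signal])
        edges, counts, hits = find_signal_bin(events, bin_width_ns=500)
        print(edges[hits])                           # bins flagged as signal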

  10. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Fakir, H.; Gaede, S.; Mulligan, M.

    Purpose: To design a versatile, nonhomogeneous insert for the dose verification phantom ArcCHECK™ (Sun Nuclear Corp., FL) and to demonstrate its usefulness for the verification of dose distributions in inhomogeneous media. As an example, we demonstrate it can be used clinically for routine quality assurance of two volumetric modulated arc therapy (VMAT) systems for lung stereotactic body radiation therapy (SBRT): SmartArc® (Pinnacle³, Philips Radiation Oncology Systems, Fitchburg, WI) and RapidArc® (Eclipse™, Varian Medical Systems, Palo Alto, CA). Methods: The cylindrical detector array ArcCHECK™ has a retractable homogeneous acrylic insert. In this work, we designed and manufactured a customized heterogeneous insert with densities that simulate soft tissue, lung, bone, and air. The insert offers several possible heterogeneity configurations and multiple locations for point dose measurements. SmartArc® and RapidArc® plans for lung SBRT were generated and copied to ArcCHECK™ for each inhomogeneity configuration. Dose delivery was done on a Varian 2100 ix linac. The evaluation of dose distributions was based on gamma analysis of the diode measurements and point dose measurements at different positions near the inhomogeneities. Results: The insert was successfully manufactured and tested with different measurements of VMAT plans. Dose distributions measured with the homogeneous insert showed gamma passing rates similar to our clinical results (~99%) for both treatment-planning systems. Using nonhomogeneous inserts decreased the passing rates by up to 3.6% in the examples studied. Overall, SmartArc® plans showed better gamma passing rates for nonhomogeneous measurements. The discrepancy between calculated and measured point doses increased up to 6.5% for the nonhomogeneous insert, depending on the inhomogeneity configuration and measurement location. SmartArc® and RapidArc® plans had similar plan quality, but RapidArc® plans had significantly higher monitor units (up to 70%). Conclusions: A versatile, nonhomogeneous insert was developed for ArcCHECK™ for an easy and quick evaluation of dose calculations in nonhomogeneous media and for comparison of different treatment planning systems. The device was tested with SmartArc® and RapidArc® plans for lung SBRT, showing the uncertainties of dose calculations with inhomogeneities. The new insert combines the convenience of the ArcCHECK™ with the possibility of assessing dose distributions in inhomogeneous media.

  11. Device independence for two-party cryptography and position verification with memoryless devices

    NASA Astrophysics Data System (ADS)

    Ribeiro, Jérémy; Thinh, Le Phuc; Kaniewski, Jedrzej; Helsen, Jonas; Wehner, Stephanie

    2018-06-01

    Quantum communication has demonstrated its usefulness for quantum cryptography far beyond quantum key distribution. One domain is two-party cryptography, whose goal is to allow two parties who may not trust each other to solve joint tasks. Another interesting application is position-based cryptography, whose goal is to use the geographical location of an entity as its only identifying credential. Unfortunately, security of these protocols is not possible against an all-powerful adversary. However, if we impose some realistic physical constraints on the adversary, there exist protocols for which security can be proven, but these have so far relied on knowledge of the quantum operations performed during the protocols. In this work we improve the device-independent security proofs of Kaniewski and Wehner [New J. Phys. 18, 055004 (2016), 10.1088/1367-2630/18/5/055004] for two-party cryptography (with memoryless devices) and we add a security proof for device-independent position verification (also with memoryless devices) under different physical constraints on the adversary. We assess the quality of the devices by observing a Bell violation and, as for Kaniewski and Wehner [New J. Phys. 18, 055004 (2016), 10.1088/1367-2630/18/5/055004], security can be attained for any violation of the Clauser-Horne-Shimony-Holt inequality.
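
    As a reminder of the quantity being violated, the Clauser-Horne-Shimony-Holt (CHSH) value combines four correlators and is bounded by 2 classically and by 2√2 quantum mechanically. A small illustrative Python computation follows, using ideal noiseless correlators E(θa, θb) = cos(θa - θb) at the standard CHSH angles; this is a sketch of the inequality itself, not the paper's analysis.

        import math

        def chsh_value(E_ab, E_abp, E_apb, E_apbp):
            """CHSH combination S = E(a,b) + E(a,b') + E(a',b) - E(a',b')."""
            return E_ab + E_abp + E_apb - E_apbp

        # Ideal correlators at angles a=0, a'=pi/2, b=pi/4, b'=-pi/4
        # (hypothetical noiseless data):
        S = chsh_value(math.cos(-math.pi / 4),      # E(a, b)
                       math.cos(math.pi / 4),       # E(a, b')
                       math.cos(math.pi / 4),       # E(a', b)
                       math.cos(3 * math.pi / 4))   # E(a', b')
        print(S)  # approx. 2.828 > 2: the classical bound is violated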

  12. RF model of the distribution system as a communication channel, phase 2. Volume 3: Appendices

    NASA Technical Reports Server (NTRS)

    Rustay, R. C.; Gajjar, J. T.; Rankin, R. W.; Wentz, R. C.; Wooding, R.

    1982-01-01

    Program documentation concerning the design, implementation, and verification of a computerized model for predicting the steady-state sinusoidal response of radial configured distribution feeders is presented in these appendices.

  13. A calibration method for patient specific IMRT QA using a single therapy verification film

    PubMed Central

    Shukla, Arvind Kumar; Oinam, Arun S.; Kumar, Sanjeev; Sandhu, I.S.; Sharma, S.C.

    2013-01-01

    Aim: The aim of the present study is to develop and verify a single-film calibration procedure for use in intensity-modulated radiation therapy (IMRT) quality assurance. Background: Radiographic films have been regularly used in routine commissioning of treatment modalities and verification of treatment planning systems (TPS). Radiation dosimetry based on radiographic film can provide absolute two-dimensional dose distributions and is well suited to IMRT quality assurance; a single therapy verification film provides a quick and reliable method for IMRT verification. Materials and methods: A single extended dose range (EDR2) film was used to generate the sensitometric curve of film optical density versus radiation dose. The EDR2 film was exposed with nine 6 cm × 6 cm fields of a 6 MV photon beam from a medical linear accelerator at 5 cm depth in a solid water phantom. The nine regions of the single film were exposed with radiation doses ranging from 10 to 362 cGy. The actual dose measurements inside the field regions were performed using a 0.6 cm3 ionization chamber. The exposed film was processed after irradiation and scanned using a VIDAR film scanner, and the optical density was noted for each region. Ten IMRT plans of head and neck carcinoma were used for verification using a dynamic IMRT technique, and evaluated against the TPS-calculated dose distribution using the gamma index method. Results: A sensitometric curve was generated using a single film exposed in nine field regions to enable quantitative dose verification of IMRT treatments. The radiation scatter factor was observed to decrease exponentially with increasing distance from the centre of each field region. The IMRT plans verified using the calibration curve were evaluated with the gamma index method and found to be within the acceptance criteria. Conclusion: The single-film method proved to be superior to the traditional calibration method and provides fast daily film calibration for highly accurate IMRT verification. PMID:24416558
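
    A sensitometric curve of this kind is typically a smooth monotonic fit of dose against net optical density, which can then be inverted to convert scanned OD maps to 2D dose maps. Below is a minimal Python sketch using a third-order polynomial fit; the nine OD/dose pairs are hypothetical values chosen only to span the 10-362 cGy range used above.

        import numpy as np

        # Hypothetical nine-point calibration: delivered dose (cGy) vs. net OD.
        dose_cgy = np.array([10, 25, 50, 90, 140, 200, 260, 310, 362], dtype=float)
        net_od   = np.array([0.08, 0.18, 0.33, 0.55, 0.78, 1.02, 1.22, 1.36, 1.50])

        # Third-order polynomial sensitometric curve mapping OD -> dose.
        coeffs = np.polyfit(net_od, dose_cgy, deg=3)
        od_to_dose = np.poly1d(coeffs)

        print(od_to_dose(0.65))  # interpolated dose for a measured net OD of 0.65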

  14. Post-OPC verification using a full-chip pattern-based simulation verification method

    NASA Astrophysics Data System (ADS)

    Hung, Chi-Yuan; Wang, Ching-Heng; Ma, Cliff; Zhang, Gary

    2005-11-01

    In this paper, we evaluated and investigated techniques for performing fast full-chip post-OPC verification using a commercial product platform. A number of databases from several technology nodes, i.e. 0.13um, 0.11um and 90nm, are used in the investigation. Although it has been proven that our OPC technology is robust in general, the variety of tape-outs with complicated design styles and technologies makes it difficult to develop a "complete or bullet-proof" OPC algorithm that would cover every possible layout pattern. In the evaluation, among dozens of databases, some OPC databases were found by model-based post-OPC checking to contain errors, which could cost significantly in manufacturing - reticle, wafer process, and more importantly production delay. From such full-chip OPC database verification, we have learned that optimizing OPC models and recipes on a limited set of test chip designs may not provide sufficient coverage across the range of designs to be produced in the process, and fatal errors (such as pinch or bridge) or poor CD distribution and process-sensitive patterns may still occur. As a result, more than one reticle tape-out cycle is not uncommon to prove models and recipes that approach the center of process for a range of designs. So, we describe a full-chip pattern-based simulation verification flow that serves both OPC model and recipe development as well as post-OPC verification after production release of the OPC. Lastly, we discuss the differentiation of the new pattern-based and conventional edge-based verification tools and summarize the advantages of our new tool and methodology: (1) accuracy: superior inspection algorithms, down to 1 nm accuracy with the new "pattern-based" approach; (2) high-speed performance: pattern-centric algorithms to give the best full-chip inspection efficiency; (3) powerful analysis capability: flexible error distribution, grouping, interactive viewing and hierarchical pattern extraction to narrow down to unique patterns/cells.

  15. 37 CFR 270.5 - Designated collection and distribution organizations for records of use of sound recordings under...

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ..., Statements of Account, Auditor's Reports, and other verification information filed in the Copyright Office... Royalties and statements of account under compulsory license for making/distributing phonorecords of nondramatic musical works (§ 201.19)...

  16. Verification on spray simulation of a pintle injector for liquid rocket engine

    NASA Astrophysics Data System (ADS)

    Son, Min; Yu, Kijeong; Radhakrishnan, Kanmaniraja; Shin, Bongchul; Koo, Jaye

    2016-02-01

    The pintle injector used in liquid rocket engines is a newly re-attracting injection system known for its wide throttling capability with high efficiency. The pintle injector has many variations with complex inner structures due to its moving parts. In order to study the rotating flow near the injector tip, which was observed in a cold-flow experiment using water and air, a numerical simulation was adopted, and a verification of the numerical model was then conducted. For the verification process, three types of experimental data, including velocity distributions of gas flows, spray angles and liquid distribution, were compared with simulated results. The numerical simulation was performed using a commercial simulation program with the Eulerian multiphase model and axisymmetric two-dimensional grids. The maximum and minimum gas velocities were within the acceptable range of agreement; however, the spray angles showed up to 25% error as the momentum ratios were increased. The spray density distributions were quantitatively measured and showed good agreement. As a result of this study, it was concluded that the simulation method was properly constructed to study the specific flow characteristics of the pintle injector despite the limitations of two-dimensional and coarse grids.

  17. Exomars Mission Verification Approach

    NASA Astrophysics Data System (ADS)

    Cassi, Carlo; Gilardi, Franco; Bethge, Boris

    According to the long-term cooperation plan established by ESA and NASA in June 2009, the ExoMars project now consists of two missions: a first mission will be launched in 2016 under ESA lead, with the objectives to demonstrate the European capability to safely land a surface package on Mars, to perform Mars atmosphere investigation, and to provide communication capability for present and future ESA/NASA missions. For this mission ESA provides a spacecraft composite, made up of an "Entry Descent & Landing Demonstrator Module (EDM)" and a Mars Orbiter Module (OM); NASA provides the launch vehicle and the scientific instruments located on the Orbiter for Mars atmosphere characterisation. A second mission, with its launch foreseen in 2018, is led by NASA, who provides the spacecraft and launcher, the EDL system, and a rover. ESA contributes the ExoMars Rover Module (RM) to provide surface mobility. It includes a drill system allowing drilling down to 2 meters, collecting samples and investigating them for signs of past and present life with exobiological experiments, and investigating the Mars water/geochemical environment. In this scenario Thales Alenia Space Italia, as ESA prime industrial contractor, is in charge of the design, manufacturing, integration and verification of the ESA ExoMars modules, i.e.: the Spacecraft Composite (OM + EDM) for the 2016 mission, the RM for the 2018 mission and the Rover Operations Control Centre, which will be located at Altec-Turin (Italy). The verification process for the above products is quite complex and includes some peculiarities with limited or no heritage in Europe. Furthermore, the verification approach has to be optimised to allow full verification despite significant schedule and budget constraints. The paper presents the verification philosophy tailored for the ExoMars mission in line with the above considerations, starting from the model philosophy, showing the flow of verification activities and the sharing of tests between the different levels (system, modules, subsystems, etc.), and giving an overview of the main tests defined at spacecraft level. The paper is mainly focused on the verification aspects of the EDL Demonstrator Module and the Rover Module, for which an intense testing activity without previous heritage in Europe is foreseen. In particular, the Descent Module has to survive the Mars atmospheric entry and landing, and its surface platform has to stay operational for 8 sols on the Martian surface, transmitting scientific data to the Orbiter. The Rover Module has to perform a 180-sol mission in the Mars surface environment. These operative conditions cannot be verified only by analysis; consequently a test campaign is defined, including mechanical tests to simulate the entry loads, thermal tests in Mars environment and the simulation of Rover operations on a 'Mars-like' terrain. Finally, the paper presents an overview of the documentation flow defined to ensure the correct translation of the mission requirements into verification activities (test, analysis, review of design) until the final verification close-out of the above requirements with the final verification reports.

  18. Estimating predictive hydrological uncertainty by dressing deterministic and ensemble forecasts; a comparison, with application to Meuse and Rhine

    NASA Astrophysics Data System (ADS)

    Verkade, J. S.; Brown, J. D.; Davids, F.; Reggiani, P.; Weerts, A. H.

    2017-12-01

    Two statistical post-processing approaches for estimation of predictive hydrological uncertainty are compared: (i) 'dressing' of a deterministic forecast by adding a single, combined estimate of both hydrological and meteorological uncertainty and (ii) 'dressing' of an ensemble streamflow forecast by adding an estimate of hydrological uncertainty to each individual streamflow ensemble member. Both approaches aim to produce an estimate of the 'total uncertainty' that captures both the meteorological and hydrological uncertainties. They differ in the degree to which they make use of statistical post-processing techniques. In the 'lumped' approach, both sources of uncertainty are lumped by post-processing deterministic forecasts using their verifying observations. In the 'source-specific' approach, the meteorological uncertainties are estimated by an ensemble of weather forecasts. These ensemble members are routed through a hydrological model and a realization of the probability distribution of hydrological uncertainties (only) is then added to each ensemble member to arrive at an estimate of the total uncertainty. The techniques are applied to one location in the Meuse basin and three locations in the Rhine basin. Resulting forecasts are assessed for their reliability and sharpness, as well as compared in terms of multiple verification scores including the relative mean error, Brier Skill Score, Mean Continuous Ranked Probability Skill Score, Relative Operating Characteristic Score and Relative Economic Value. The dressed deterministic forecasts are generally more reliable than the dressed ensemble forecasts, but the latter are sharper. On balance, however, they show similar quality across a range of verification metrics, with the dressed ensembles coming out slightly better. Some additional analyses are suggested. Notably, these include statistical post-processing of the meteorological forecasts in order to increase their reliability, thus increasing the reliability of the streamflow forecasts produced with ensemble meteorological forcings.
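
    The 'source-specific' dressing described above, adding a sampled hydrological-error realization to each routed ensemble member, together with one of the verification scores (here the sample-based CRPS), can be sketched briefly. The Python sketch below uses hypothetical streamflow values and a Gaussian error climatology purely for illustration.

        import numpy as np

        def crps_ensemble(members, obs):
            """Sample-based CRPS for one forecast: E|X - y| - 0.5 E|X - X'|."""
            members = np.asarray(members, dtype=float)
            term1 = np.mean(np.abs(members - obs))
            term2 = 0.5 * np.mean(np.abs(members[:, None] - members[None, :]))
            return term1 - term2

        def dress_ensemble(raw_members, hydro_errors, rng):
            """Source-specific dressing: add one sampled hydrological-error
            realization to each streamflow ensemble member."""
            draws = rng.choice(hydro_errors, size=len(raw_members))
            return np.asarray(raw_members) + draws

        rng = np.random.default_rng(0)
        raw = [95.0, 102.0, 99.0, 110.0]            # routed met. ensemble, m3/s
        errors = rng.normal(0.0, 5.0, size=1000)    # hydrological error climatology
        dressed = dress_ensemble(raw, errors, rng)
        print(crps_ensemble(dressed, obs=104.0))    # lower CRPS is better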

  19. Direct and full-scale experimental verifications towards ground-satellite quantum key distribution

    NASA Astrophysics Data System (ADS)

    Wang, Jian-Yu; Yang, Bin; Liao, Sheng-Kai; Zhang, Liang; Shen, Qi; Hu, Xiao-Fang; Wu, Jin-Cai; Yang, Shi-Ji; Jiang, Hao; Tang, Yan-Lin; Zhong, Bo; Liang, Hao; Liu, Wei-Yue; Hu, Yi-Hua; Huang, Yong-Mei; Qi, Bo; Ren, Ji-Gang; Pan, Ge-Sheng; Yin, Juan; Jia, Jian-Jun; Chen, Yu-Ao; Chen, Kai; Peng, Cheng-Zhi; Pan, Jian-Wei

    2013-05-01

    Quantum key distribution (QKD) provides the only intrinsically unconditionally secure method for communication based on the principles of quantum mechanics. Compared with fibre-based demonstrations, free-space links could provide the most appealing solution for communication over much larger distances. Despite significant efforts, all realizations to date rely on stationary sites. Experimental verifications are therefore extremely crucial for applications to a typical low Earth orbit satellite. To achieve direct and full-scale verifications of our set-up, we have carried out three independent experiments with a decoy-state QKD system. The system is operated on a moving platform (using a turntable), on a floating platform (using a hot-air balloon), and with a high-loss channel to demonstrate performance under conditions of rapid motion, attitude change, vibration, random movement of satellites, and a high-loss regime. The experiments address wide ranges of all leading parameters relevant to low Earth orbit satellites. Our results pave the way towards ground-satellite QKD and a global quantum communication network.

  20. Thermal Pollution Mathematical Model. Volume 4: Verification of Three-Dimensional Rigid-Lid Model at Lake Keowee. [environmental impact of thermal discharges from power plants]

    NASA Technical Reports Server (NTRS)

    Lee, S. S.; Sengupta, S.; Nwadike, E. V.; Sinha, S. K.

    1980-01-01

    The rigid-lid model was developed to predict three-dimensional temperature and velocity distributions in lakes. This model was verified at various sites (Lake Belews, Biscayne Bay, etc.), and the verification at Lake Keowee was the last of this series of verification runs. The verification at Lake Keowee included the following: (1) selecting the domain of interest and grid systems, and comparing the preliminary results with archival data; (2) obtaining actual ground truth and infrared scanner data both for summer and winter; and (3) using the model to predict the measured data for the above periods and comparing the predicted results with the actual data. The model results compared well with measured data. Thus, the model can be used as an effective predictive tool for future sites.

  1. Identifying Rhodamine Dye Plume Sources in Near-Shore Oceanic Environments by Integration of Chemical and Visual Sensors

    PubMed Central

    Tian, Yu; Kang, Xiaodong; Li, Yunyi; Li, Wei; Zhang, Aiqun; Yu, Jiangchen; Li, Yiping

    2013-01-01

    This article presents a strategy for identifying the source location of a chemical plume in near-shore oceanic environments where the plume develops under the influence of turbulence, tides and waves. This strategy includes two modules, source declaration (or identification) and source verification, embedded in a subsumption architecture. Algorithms for source identification are derived from moth-inspired plume tracing strategies based on a chemical sensor. The in-water test missions, conducted in November 2002 at San Clemente Island (California, USA), in June 2003 at Duck (North Carolina, USA), and in October 2010 at Dalian Bay (China), successfully identified the source locations after autonomous underwater vehicles tracked the rhodamine dye plumes, with significant meander, over 100 meters. The objective of the verification module is to verify the declared plume source using a visual sensor. Because images taken in near-shore oceanic environments are very vague and colors in the images are not well defined, we adopt a fuzzy color extractor to segment the color components and recognize the chemical plume and its source by measuring color similarity. The source verification module was tested with images taken during the CPT missions. PMID:23507823

  2. MR Imaging Based Treatment Planning for Radiotherapy of Prostate Cancer

    DTIC Science & Technology

    2007-02-01

    developed practical methods for heterogeneity correction for MRI-based dose calculations (Chen et al 2007). 6) We will use existing Monte Carlo ... Monte Carlo verification of IMRT dose distributions from a commercial treatment planning optimization system, Phys. Med. Biol., 45:2483-95 (2000) Ma ... accuracy and consistency for MR-based IMRT treatment planning for prostate cancer. A short paper entitled "Monte Carlo dose verification of MR image based

  3. Verification of the databases EXFOR and ENDF

    NASA Astrophysics Data System (ADS)

    Berton, Gottfried; Damart, Guillaume; Cabellos, Oscar; Beauzamy, Bernard; Soppera, Nicolas; Bossant, Manuel

    2017-09-01

    The objective of this work is the verification of large experimental (EXFOR) and evaluated nuclear reaction databases (JEFF, ENDF, JENDL, TENDL, ...). The work is applied to neutron reactions in EXFOR data, including threshold reactions, isomeric transitions, angular distributions and data in the resonance region of both isotopes and natural elements. Finally, a comparison of the resonance integrals compiled in the EXFOR database with those derived from the evaluated libraries is also performed.
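
    The resonance integral comparison mentioned above rests on a simple quantity: RI = ∫ σ(E)/E dE above the cadmium cut-off (0.5 eV). A small Python sketch follows, using an artificial pure 1/v cross section so the numerical result can be checked against the closed form (RI ≈ 0.45 σ0 for a 1/v absorber); real comparisons would use pointwise EXFOR/ENDF data instead.

        import numpy as np

        def resonance_integral(energy_ev, sigma_barn, e_min=0.5):
            """Numerical resonance integral RI = integral of sigma(E)/E dE
            above the cadmium cut-off, from tabulated pointwise data."""
            mask = energy_ev >= e_min
            e, s = energy_ev[mask], sigma_barn[mask]
            f = s / e
            # trapezoidal rule, written out to avoid version-specific helpers
            return float(np.sum(0.5 * (f[1:] + f[:-1]) * np.diff(e)))

        # Illustrative 1/v cross section: sigma = 10 barn at 0.0253 eV.
        e = np.logspace(np.log10(0.5), 5, 2000)   # 0.5 eV .. 100 keV
        sigma = 10.0 * np.sqrt(0.0253 / e)
        print(resonance_integral(e, sigma))       # approx. 4.5 barn, as expected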

  4. Verification of NWP Cloud Properties using A-Train Satellite Observations

    NASA Astrophysics Data System (ADS)

    Kucera, P. A.; Weeks, C.; Wolff, C.; Bullock, R.; Brown, B.

    2011-12-01

    Recently, the NCAR Model Evaluation Tools (MET) has been enhanced to incorporate satellite observations for the verification of Numerical Weather Prediction (NWP) cloud products. We have developed tools that match fields spatially (both in the vertical and horizontal dimensions) to compare NWP products with satellite observations. These matched fields provide diagnostic evaluation of cloud macro attributes such as vertical distribution of clouds, cloud top height, and the spatial and seasonal distribution of cloud fields. For this research study, we have focused on using CloudSat, CALIPSO, and MODIS observations to evaluate cloud fields for a variety of NWP fields and derived products. We have selected cases ranging from large, mid-latitude synoptic systems to well-organized tropical cyclones. For each case, we matched the observed cloud field with gridded model and/or derived product fields. CloudSat and CALIPSO observations and model fields were matched and compared in the vertical along the orbit track. MODIS data and model fields were matched and compared in the horizontal. We then use MET to compute the verification statistics to quantify the performance of the models in representing the cloud fields. In this presentation we will give a summary of our comparison and show verification results for both synoptic and tropical cyclone cases.

  5. Commander Lousma works with EEVT experiment and cryogenic tube on aft middeck

    NASA Image and Video Library

    1982-03-31

    Commander Jack Lousma works with Electrophoresis Equipment Verification Test (EEVT) electrophoresis unit, cryogenic freezer and tube, and stowage locker equipment located on crew compartment middeck aft bulkhead.

  6. Verification of forecast ensembles in complex terrain including observation uncertainty

    NASA Astrophysics Data System (ADS)

    Dorninger, Manfred; Kloiber, Simon

    2017-04-01

    Traditionally, verification means verifying a forecast (ensemble) against the truth represented by observations. The observation errors are quite often neglected, with the argument that they are small compared to the forecast error. In this study, part of the MesoVICT (Mesoscale Verification Inter-comparison over Complex Terrain) project, it will be shown that observation errors have to be taken into account for verification purposes. The observation uncertainty is estimated from the VERA (Vienna Enhanced Resolution Analysis) and represented via two analysis ensembles, which are compared to the forecast ensemble. Throughout the study, results from COSMO-LEPS provided by Arpae-SIMC Emilia-Romagna are used as the forecast ensemble. The time period covers the MesoVICT core case from 20-22 June 2007. In a first step, all ensembles are investigated concerning their distribution. Several tests were executed (Kolmogorov-Smirnov test, Finkelstein-Schafer test, chi-square test, etc.), none of which identified an exact mathematical distribution. The main focus is therefore on non-parametric statistics (e.g. kernel density estimation, boxplots, etc.) and on the deviation between "forced" normally distributed data and the kernel density estimates. In a next step, the observational deviations due to the analysis ensembles are analysed. In a first approach, scores are calculated multiple times, with each single member of the analysis ensemble regarded in turn as the "true" observation. The results are presented as boxplots for the different scores and parameters. Additionally, the bootstrapping method is applied to the ensembles. These possible approaches to incorporating observational uncertainty into the computation of statistics will be discussed in the talk.
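
    The two resampling ideas mentioned above, recomputing a score once per analysis-ensemble member and bootstrapping, share the same skeleton: perturb the inputs, recompute the score, and inspect the resulting score distribution. Below is a minimal Python sketch of the bootstrap variant for an arbitrary score; the forecast/observation pairs are hypothetical.

        import numpy as np

        def bootstrap_score(forecast, obs, score_fn, n_boot=1000, seed=0):
            """Bootstrap a verification score by resampling forecast/observation
            pairs with replacement; returns the score distribution."""
            rng = np.random.default_rng(seed)
            n = len(obs)
            scores = np.empty(n_boot)
            for b in range(n_boot):
                idx = rng.integers(0, n, size=n)
                scores[b] = score_fn(forecast[idx], obs[idx])
            return scores

        rmse = lambda f, o: np.sqrt(np.mean((f - o) ** 2))
        f = np.array([2.1, 1.8, 3.0, 2.5, 1.2, 2.9])
        o = np.array([2.0, 2.0, 2.7, 2.6, 1.0, 3.1])
        dist = bootstrap_score(f, o, rmse)
        print(np.percentile(dist, [2.5, 50, 97.5]))  # interval for the score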

  7. 25 CFR 39.208 - How are ISEP funds distributed?

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... distributed? (a) On July 1, schools will receive 80 percent of their funds as determined in § 39.207. (b) On December 1, the balance will be distributed to all schools after verification of the school count and any... Indians BUREAU OF INDIAN AFFAIRS, DEPARTMENT OF THE INTERIOR EDUCATION THE INDIAN SCHOOL EQUALIZATION...

  8. 25 CFR 39.208 - How are ISEP funds distributed?

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... distributed? (a) On July 1, schools will receive 80 percent of their funds as determined in § 39.207. (b) On December 1, the balance will be distributed to all schools after verification of the school count and any... Indians BUREAU OF INDIAN AFFAIRS, DEPARTMENT OF THE INTERIOR EDUCATION THE INDIAN SCHOOL EQUALIZATION...

  9. 25 CFR 39.208 - How are ISEP funds distributed?

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ... distributed? (a) On July 1, schools will receive 80 percent of their funds as determined in § 39.207. (b) On December 1, the balance will be distributed to all schools after verification of the school count and any... Indians BUREAU OF INDIAN AFFAIRS, DEPARTMENT OF THE INTERIOR EDUCATION THE INDIAN SCHOOL EQUALIZATION...

  10. 25 CFR 39.208 - How are ISEP funds distributed?

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... distributed? (a) On July 1, schools will receive 80 percent of their funds as determined in § 39.207. (b) On December 1, the balance will be distributed to all schools after verification of the school count and any... Indians BUREAU OF INDIAN AFFAIRS, DEPARTMENT OF THE INTERIOR EDUCATION THE INDIAN SCHOOL EQUALIZATION...

  11. 25 CFR 39.208 - How are ISEP funds distributed?

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... distributed? (a) On July 1, schools will receive 80 percent of their funds as determined in § 39.207. (b) On December 1, the balance will be distributed to all schools after verification of the school count and any... Indians BUREAU OF INDIAN AFFAIRS, DEPARTMENT OF THE INTERIOR EDUCATION THE INDIAN SCHOOL EQUALIZATION...

  12. SU-E-T-67: A Quality Assurance Procedure for VMAT Delivery Technique with Multiple Verification Metric Using TG-119 Protocol

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Katsuta, Y; Kadoya, N; Shimizu, E

    2015-06-15

    Purpose: Successful VMAT delivery requires precise modulation of dose rate, gantry rotation and multi-leaf collimator (MLC) shapes. The purpose of this research is to construct a routine QA protocol focused on the VMAT delivery technique and to obtain a baseline covering dose error, fluence distribution and mechanical accuracy during VMAT. Methods: The mock prostate and head-and-neck (HN) cases supplied by AAPM were used in this study. VMAT plans were generated in the Monaco TPS according to the TG-119 protocol, using 6 MV and 10 MV photon beams for each case. Phantom-based dose measurement, fluence measurement and log-file analysis were performed. The dose measurement used a 0.6 cc ion chamber located at the isocenter. The fluence distributions were acquired with the MapCHECK2 mounted in the MapPHAN. The trajectory log files, which record the inner 20 leaf pairs and gantry angle positions at 0.25 s intervals, were exported to in-house MATLAB software that determined their RMS values. Results: The dose difference is expressed as the ratio of the difference between measured and planned doses: 0.91% for 6 MV and 0.67% for 10 MV. In turn, the fluence distribution pass rates using gamma criteria of 2%/2 mm with a 50% minimum dose threshold were 98.8% for 6 MV and 97.5% for 10 MV, respectively. The RMS values of the MLC positions for 6 MV and 10 MV were 0.32 mm and 0.37 mm; those of the gantry angle were 0.33 and 0.31 degrees. Conclusion: In this study, a QA protocol to assess VMAT delivery accuracy was constructed, and the results acquired serve as a baseline for VMAT delivery performance verification.
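
    A minimal sketch of the log-file part of such a check, assuming hypothetical arrays of planned and recorded leaf and gantry positions already parsed from the trajectory logs; the RMS error is simply the root of the mean squared deviation per axis.

        import numpy as np

        def rms_error(planned, recorded):
            """Root-mean-square deviation between planned and recorded positions."""
            return np.sqrt(np.mean((np.asarray(planned) - np.asarray(recorded)) ** 2))

        rng = np.random.default_rng(1)
        n_samples = 4000                                            # e.g. 0.25 s sampling over a delivery
        planned_mlc = rng.uniform(-50, 50, size=(20, n_samples))    # hypothetical 20 leaf pairs, mm
        recorded_mlc = planned_mlc + rng.normal(0, 0.3, planned_mlc.shape)
        planned_gantry = np.linspace(180, -180, n_samples)          # degrees
        recorded_gantry = planned_gantry + rng.normal(0, 0.3, n_samples)

        print(f"MLC RMS: {rms_error(planned_mlc, recorded_mlc):.2f} mm")
        print(f"gantry RMS: {rms_error(planned_gantry, recorded_gantry):.2f} deg")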

  13. SU-F-P-21: Study of Dosimetry Accuracy of Small Passively Scattered Proton Beam Fields

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Li, Y; Gautam, A; Kerr, M

    2016-06-15

    Purpose: To study the accuracy of the dose distribution of very small irregular fields of passively scattered proton beams calculated by the analytical pencil beam model of the Eclipse treatment planning system (TPS). Methods: An irregular field with a narrow region (width < 1 cm), used for the treatment of a small volume adjacent to a previously treated area, was chosen for this investigation. Point doses at different locations inside the field were measured with a small volume ion chamber (A26, Standard Imaging). 2-D dose distributions were measured using a 2-D ion chamber array (MatriXX, IBA). All the measurements were done in a plastic water phantom. The measured dose distributions were compared with the verification plan dose calculated in a water-like phantom for the patient treatment field without the use of the compensator. Results: Point doses measured with the ion chamber in the narrowest section of the field were found to differ by as much as 10% from the Eclipse-calculated dose at some of the points. The 2-D dose distribution measured with the MatriXX, which was validated by comparison with limited film measurements, agreed reasonably well with the TPS-calculated dose distribution at the proximal 95%, center of the spread-out Bragg peak and distal 90% depths, with more than 92% of the pixels passing the 2%/2 mm dose/distance agreement. Conclusion: The dose calculated by the pencil beam model of the Eclipse TPS for narrow irregular fields may not be accurate to within 5% at some locations of the field, especially at points close to the field edge, due to limitations of the dose calculation model. The overall accuracy of the calculated 2-D dose distribution was found to be acceptable at the 2%/2 mm dose/distance agreement criterion.
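
    A brute-force sketch of the 2%/2 mm gamma evaluation referred to above, on hypothetical 2D dose grids with identical spacing; production tools interpolate the reference dose, but the nearest-point search below illustrates the metric itself.

        import numpy as np

        def gamma_pass_rate(measured, calculated, spacing_mm, dose_tol=0.02,
                            dist_tol_mm=2.0, threshold=0.5):
            """Fraction of evaluated points with gamma <= 1 (global dose normalisation)."""
            d_max = calculated.max()
            ys, xs = np.indices(calculated.shape)
            coords = np.stack([ys, xs], axis=-1) * spacing_mm
            passed, total = 0, 0
            for iy, ix in np.ndindex(measured.shape):
                if measured[iy, ix] < threshold * d_max:
                    continue                          # skip points below the dose threshold
                dist2 = np.sum((coords - coords[iy, ix]) ** 2, axis=-1)
                ddose2 = ((calculated - measured[iy, ix]) / (dose_tol * d_max)) ** 2
                gamma2 = dist2 / dist_tol_mm ** 2 + ddose2
                total += 1
                passed += gamma2.min() <= 1.0
            return passed / total

        rng = np.random.default_rng(2)
        calc = np.clip(rng.normal(1.0, 0.02, (60, 60)), 0, None)   # hypothetical dose grids
        meas = calc + rng.normal(0, 0.01, calc.shape)
        print(f"gamma pass rate: {100 * gamma_pass_rate(meas, calc, spacing_mm=1.0):.1f}%")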

  14. A study comparison of two system model performance in estimated lifted index over Indonesia.

    NASA Astrophysics Data System (ADS)

    Lestari, Juliana Tri; Wandala, Agie

    2018-05-01

    The lifted index (LI) is one of the atmospheric stability indices used for thunderstorm forecasting, and numerical weather prediction (NWP) models are essential for accurate weather forecasts today. This study compared two NWP models, the Weather Research and Forecasting (WRF) model and the Global Forecast System (GFS) model, in estimating LI at 20 locations over Indonesia and verified the results against observations. A Taylor diagram was used to compare model skill in terms of standard deviation, correlation coefficient and root mean square error (RMSE). The study used data at 00.00 UTC and 12.00 UTC from mid-March to mid-April 2017. From the sample of LI distributions, both models tend to overestimate the LI value in almost all regions of Indonesia, while the WRF model captures the observed LI distribution pattern better than the GFS model does. The verification shows that both the WRF and GFS models have a weak relationship with observations, except at Eltari meteorological station, where the correlation coefficient reaches almost 0.6 with a low RMSE value; overall, the WRF model performs better than the GFS model. This study suggests that LI estimated by the WRF model can support thunderstorm forecasting over Indonesia in the future, although the insufficient relation between model output and observations at certain locations needs further investigation.
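
    A minimal sketch of the three statistics a Taylor diagram summarises, for hypothetical model and observation series at one station.

        import numpy as np

        def taylor_stats(model, obs):
            """Standard deviations, correlation and centred RMSE of model vs. observations."""
            model, obs = np.asarray(model, float), np.asarray(obs, float)
            std_m, std_o = model.std(), obs.std()
            corr = np.corrcoef(model, obs)[0, 1]
            # Centred RMSE follows from the law of cosines on the Taylor diagram
            crmse = np.sqrt(std_m**2 + std_o**2 - 2 * std_m * std_o * corr)
            return std_m, std_o, corr, crmse

        rng = np.random.default_rng(3)
        obs = rng.normal(-2.0, 1.5, 60)             # hypothetical observed LI series
        wrf = obs + rng.normal(0.5, 1.0, 60)        # hypothetical WRF estimates (biased high)
        print("std_model %.2f  std_obs %.2f  corr %.2f  cRMSE %.2f" % taylor_stats(wrf, obs))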

  15. Integrated Verification Experiment data collected as part of the Los Alamos National Laboratory's Source Region Program

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Fitzgerald, T.J.; Carlos, R.C.; Argo, P.E.

    As part of the integrated verification experiment (IVE), we deployed a network of HF ionospheric sounders to detect the effects of acoustic waves generated by surface ground motion following underground nuclear tests at the Nevada Test Site. The network sampled up to four geographic locations in the ionosphere from almost directly overhead of the surface ground zero out to a horizontal range of 60 km. We present sample results for four of the IVEs: Misty Echo, Texarkana, Mineral Quarry, and Bexar.

  16. Cleanup Verification Package for the 118-F-1 Burial Ground

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    E. J. Farris and H. M. Sulloway

    2008-01-10

    This cleanup verification package documents completion of remedial action for the 118-F-1 Burial Ground on the Hanford Site. This burial ground is a combination of two locations formerly called Minor Construction Burial Ground No. 2 and Solid Waste Burial Ground No. 2. This waste site received radioactive equipment and other miscellaneous waste from 105-F Reactor operations, including dummy elements and irradiated process tubing; gun barrel tips, steel sleeves, and metal chips removed from the reactor; filter boxes containing reactor graphite chips; and miscellaneous construction solid waste.

  17. Crustal Seismic Attenuation in Germany Measured with Acoustic Radiative Transfer Theory

    NASA Astrophysics Data System (ADS)

    Gaebler, Peter J.; Eulenfeld, Tom; Wegler, Ulrich

    2017-04-01

    This work is carried out in the context of the Comprehensive Nuclear-Test-Ban Treaty (CTBT). As part of this treaty a verification regime was introduced to detect, locate and characterize nuclear explosion tests. Seismology can provide essential information in the form of broadband waveform recordings for the identification and verification of these critical events. A profound knowledge of the Earth's subsurface between source and receiver is required for a detailed description of the seismic wave field. In addition to subsurface parameters such as seismic velocity or anisotropy, information about the seismic attenuation of the medium is required. The goal of this study is the creation of a comprehensive model of crustal seismic attenuation in Germany and adjacent areas. Over 20 years of earthquake data from the German Central Seismological Observatory data archive are used to estimate the spatially dependent distribution of seismic intrinsic and scattering attenuation of S-waves for frequencies between 0.5 and 20 Hz. The attenuation models are estimated by fitting synthetic seismogram envelopes calculated with acoustic radiative transfer theory to observed seismogram envelopes. This theory describes the propagation of seismic S-energy under the assumption of multiple isotropic scattering; the crustal structure of the scattering medium is represented by a half-space model. We present preliminary results of the spatial distribution of intrinsic attenuation, represented by the absorption path length, as well as of scattering attenuation in terms of the mean free path, and compare the outcomes to results from previous studies. Furthermore, catalog magnitudes are compared to moment magnitudes estimated during the inversion process. Additionally, site amplification factors of the stations are presented.

  18. Monte Carlo patient study on the comparison of prompt gamma and PET imaging for range verification in proton therapy

    NASA Astrophysics Data System (ADS)

    Moteabbed, M.; España, S.; Paganetti, H.

    2011-02-01

    The purpose of this work was to compare the clinical adaptation of prompt gamma (PG) imaging and positron emission tomography (PET) as independent tools for non-invasive proton beam range verification and treatment validation. The PG range correlation and its differences with PET have been modeled for the first time in a highly heterogeneous tissue environment, using different field sizes and configurations. Four patients with different tumor locations (head and neck, prostate, spine and abdomen) were chosen to compare the site-specific behaviors of the PG and PET images, using both passive scattered and pencil beam fields. Accurate reconstruction of dose, PG and PET distributions was achieved by using the planning computed tomography (CT) image in a validated GEANT4-based Monte Carlo code capable of modeling the treatment nozzle and patient anatomy in detail. The physical and biological washout phenomenon and decay half-lives for PET activity for the most abundant isotopes such as 11C, 15O, 13N, 30P and 38K were taken into account in the data analysis. The attenuation of the gamma signal after traversing the patient geometry and respective detection efficiencies were estimated for both methods to ensure proper comparison. The projected dose, PG and PET profiles along many lines in the beam direction were analyzed to investigate the correlation consistency across the beam width. For all subjects, the PG method showed on average approximately 10 times higher gamma production rates than the PET method before, and 60 to 80 times higher production after including the washout correction and acquisition time delay. This rate strongly depended on tissue density and elemental composition. For broad passive scattered fields, it was demonstrated that large differences exist between PG and PET signal falloff positions and the correlation with the dose distribution for different lines in the beam direction. These variations also depended on the treatment site and the particular subject. Thus, similar to PET, direct range verification with PG in passive scattering is not easily viable. However, upon development of an optimized 3D PG detector, indirect range verification by comparing measured and simulated PG distributions (currently being explored for the PET method) would be more beneficial because it can avoid the inherent biological challenges of the PET imaging. The improved correlation of PG and PET with dose when using pencil beams was evident. PG imaging was found to be potentially advantageous especially for small tumors in the presence of high tissue heterogeneities. Including the effects of detector acceptance and efficiency may hold PET superior in terms of the amplitude of the detected signal (depending on the future development of PG detection technology), but the ability to perform online measurements and avoid signal disintegration (due to washout) with PG are important factors that can outweigh the benefits of higher detection sensitivity.
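
    A minimal sketch of one way to compare signal falloff positions along a beam-line profile, assuming hypothetical depth profiles; the falloff is located here as the distal depth where the signal drops to 50% of its maximum.

        import numpy as np

        def distal_falloff_depth(depth_mm, profile, level=0.5):
            """Distal depth (mm) where the profile falls to `level` of its maximum."""
            profile = np.asarray(profile, float)
            target = level * profile.max()
            distal = np.arange(profile.argmax(), profile.size)    # search beyond the peak
            below = distal[profile[distal] <= target][0]
            # Linear interpolation between the straddling samples
            d0, d1 = depth_mm[below - 1], depth_mm[below]
            p0, p1 = profile[below - 1], profile[below]
            return d0 + (p0 - target) / (p0 - p1) * (d1 - d0)

        depth = np.linspace(0, 150, 301)                          # hypothetical grid, mm
        pg = np.exp(-((depth - 90) / 25) ** 2)                    # hypothetical PG depth profile
        pet = np.exp(-((depth - 86) / 25) ** 2)                   # hypothetical PET depth profile
        shift = distal_falloff_depth(depth, pg) - distal_falloff_depth(depth, pet)
        print(f"PG-PET falloff difference: {shift:.1f} mm")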

  19. Monte Carlo patient study on the comparison of prompt gamma and PET imaging for range verification in proton therapy.

    PubMed

    Moteabbed, M; España, S; Paganetti, H

    2011-02-21

    The purpose of this work was to compare the clinical adaptation of prompt gamma (PG) imaging and positron emission tomography (PET) as independent tools for non-invasive proton beam range verification and treatment validation. The PG range correlation and its differences with PET have been modeled for the first time in a highly heterogeneous tissue environment, using different field sizes and configurations. Four patients with different tumor locations (head and neck, prostate, spine and abdomen) were chosen to compare the site-specific behaviors of the PG and PET images, using both passive scattered and pencil beam fields. Accurate reconstruction of dose, PG and PET distributions was achieved by using the planning computed tomography (CT) image in a validated GEANT4-based Monte Carlo code capable of modeling the treatment nozzle and patient anatomy in detail. The physical and biological washout phenomenon and decay half-lives for PET activity for the most abundant isotopes such as (11)C, (15)O, (13)N, (30)P and (38)K were taken into account in the data analysis. The attenuation of the gamma signal after traversing the patient geometry and respective detection efficiencies were estimated for both methods to ensure proper comparison. The projected dose, PG and PET profiles along many lines in the beam direction were analyzed to investigate the correlation consistency across the beam width. For all subjects, the PG method showed on average approximately 10 times higher gamma production rates than the PET method before, and 60 to 80 times higher production after including the washout correction and acquisition time delay. This rate strongly depended on tissue density and elemental composition. For broad passive scattered fields, it was demonstrated that large differences exist between PG and PET signal falloff positions and the correlation with the dose distribution for different lines in the beam direction. These variations also depended on the treatment site and the particular subject. Thus, similar to PET, direct range verification with PG in passive scattering is not easily viable. However, upon development of an optimized 3D PG detector, indirect range verification by comparing measured and simulated PG distributions (currently being explored for the PET method) would be more beneficial because it can avoid the inherent biological challenges of the PET imaging. The improved correlation of PG and PET with dose when using pencil beams was evident. PG imaging was found to be potentially advantageous especially for small tumors in the presence of high tissue heterogeneities. Including the effects of detector acceptance and efficiency may hold PET superior in terms of the amplitude of the detected signal (depending on the future development of PG detection technology), but the ability to perform online measurements and avoid signal disintegration (due to washout) with PG are important factors that can outweigh the benefits of higher detection sensitivity.

  20. Automation and uncertainty analysis of a method for in-vivo range verification in particle therapy.

    PubMed

    Frey, K; Unholtz, D; Bauer, J; Debus, J; Min, C H; Bortfeld, T; Paganetti, H; Parodi, K

    2014-10-07

    We introduce the automation of the range difference calculation deduced from particle-irradiation induced β(+)-activity distributions with the so-called most-likely-shift approach, and evaluate its reliability via the monitoring of algorithm- and patient-specific uncertainty factors. The calculation of the range deviation is based on the minimization of the absolute profile differences in the distal part of two activity depth profiles shifted against each other. Depending on the workflow of positron emission tomography (PET)-based range verification, the two profiles under evaluation can correspond to measured and simulated distributions, or only measured data from different treatment sessions. In comparison to previous work, the proposed approach includes an automated identification of the distal region of interest for each pair of PET depth profiles and under consideration of the planned dose distribution, resulting in the optimal shift distance. Moreover, it introduces an estimate of uncertainty associated to the identified shift, which is then used as weighting factor to 'red flag' problematic large range differences. Furthermore, additional patient-specific uncertainty factors are calculated using available computed tomography (CT) data to support the range analysis. The performance of the new method for in-vivo treatment verification in the clinical routine is investigated with in-room PET images for proton therapy as well as with offline PET images for proton and carbon ion therapy. The comparison between measured PET activity distributions and predictions obtained by Monte Carlo simulations or measurements from previous treatment fractions is performed. For this purpose, a total of 15 patient datasets were analyzed, which were acquired at Massachusetts General Hospital and Heidelberg Ion-Beam Therapy Center with in-room PET and offline PET/CT scanners, respectively. Calculated range differences between the compared activity distributions are reported in a 2D map in beam-eye-view. In comparison to previously proposed approaches, the new most-likely-shift method shows more robust results for assessing in-vivo the range from strongly varying PET distributions caused by differing patient geometry, ion beam species, beam delivery techniques, PET imaging concepts and counting statistics. The additional visualization of the uncertainties and the dedicated weighting strategy contribute to the understanding of the reliability of observed range differences and the complexity in the prediction of activity distributions. The proposed method promises to offer a feasible technique for clinical routine of PET-based range verification.
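
    A minimal sketch of the shift search described above, on hypothetical activity depth profiles: the reported range difference is the shift that minimises the summed absolute difference over a distal window.

        import numpy as np

        def most_likely_shift(reference, measured, window, max_shift=30):
            """Shift (in samples) of `measured` that best matches `reference` distally."""
            lo, hi = window                                  # distal region of interest
            best, best_cost = 0, np.inf
            for s in range(-max_shift, max_shift + 1):
                shifted = np.roll(measured, s)
                cost = np.abs(shifted[lo:hi] - reference[lo:hi]).sum()
                if cost < best_cost:
                    best, best_cost = s, cost
            return best

        depth = np.linspace(0, 150, 301)                     # hypothetical 0.5 mm grid
        simulated = 1 / (1 + np.exp((depth - 100) / 3))      # hypothetical predicted profile
        measured = 1 / (1 + np.exp((depth - 97) / 3))        # hypothetical measured profile
        shift = most_likely_shift(simulated, measured, window=(170, 260))
        print(f"range difference: {shift * 0.5:.1f} mm")     # samples -> mm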

  1. Automation and uncertainty analysis of a method for in-vivo range verification in particle therapy

    NASA Astrophysics Data System (ADS)

    Frey, K.; Unholtz, D.; Bauer, J.; Debus, J.; Min, C. H.; Bortfeld, T.; Paganetti, H.; Parodi, K.

    2014-10-01

    We introduce the automation of the range difference calculation deduced from particle-irradiation induced β+-activity distributions with the so-called most-likely-shift approach, and evaluate its reliability via the monitoring of algorithm- and patient-specific uncertainty factors. The calculation of the range deviation is based on the minimization of the absolute profile differences in the distal part of two activity depth profiles shifted against each other. Depending on the workflow of positron emission tomography (PET)-based range verification, the two profiles under evaluation can correspond to measured and simulated distributions, or only measured data from different treatment sessions. In comparison to previous work, the proposed approach includes an automated identification of the distal region of interest for each pair of PET depth profiles and under consideration of the planned dose distribution, resulting in the optimal shift distance. Moreover, it introduces an estimate of uncertainty associated to the identified shift, which is then used as weighting factor to ‘red flag’ problematic large range differences. Furthermore, additional patient-specific uncertainty factors are calculated using available computed tomography (CT) data to support the range analysis. The performance of the new method for in-vivo treatment verification in the clinical routine is investigated with in-room PET images for proton therapy as well as with offline PET images for proton and carbon ion therapy. The comparison between measured PET activity distributions and predictions obtained by Monte Carlo simulations or measurements from previous treatment fractions is performed. For this purpose, a total of 15 patient datasets were analyzed, which were acquired at Massachusetts General Hospital and Heidelberg Ion-Beam Therapy Center with in-room PET and offline PET/CT scanners, respectively. Calculated range differences between the compared activity distributions are reported in a 2D map in beam-eye-view. In comparison to previously proposed approaches, the new most-likely-shift method shows more robust results for assessing in-vivo the range from strongly varying PET distributions caused by differing patient geometry, ion beam species, beam delivery techniques, PET imaging concepts and counting statistics. The additional visualization of the uncertainties and the dedicated weighting strategy contribute to the understanding of the reliability of observed range differences and the complexity in the prediction of activity distributions. The proposed method promises to offer a feasible technique for clinical routine of PET-based range verification.

  2. Verification of Faulty Message Passing Systems with Continuous State Space in PVS

    NASA Technical Reports Server (NTRS)

    Pilotto, Concetta; White, Jerome

    2010-01-01

    We present a library of Prototype Verification System (PVS) meta-theories that verifies a class of distributed systems in which agent communication is through message passing. The theoretical work consists of iterative schemes for solving systems of linear equations, such as message-passing extensions of the Gauss and Gauss-Seidel methods. We briefly review that work and discuss the challenges in formally verifying it.
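
    A minimal sequential sketch of the Gauss-Seidel iteration that the verified schemes extend, on an assumed small diagonally dominant system; in the message-passing variants each agent would own one row and exchange updated components instead of sharing memory.

        import numpy as np

        def gauss_seidel(A, b, iterations=50):
            """Solve Ax = b by in-place sweeps; converges for diagonally dominant A."""
            A, b = np.asarray(A, float), np.asarray(b, float)
            x = np.zeros_like(b)
            for _ in range(iterations):
                for i in range(len(b)):
                    sigma = A[i] @ x - A[i, i] * x[i]     # contributions of the other unknowns
                    x[i] = (b[i] - sigma) / A[i, i]
            return x

        A = [[4.0, -1.0, 0.0],
             [-1.0, 4.0, -1.0],
             [0.0, -1.0, 4.0]]
        b = [15.0, 10.0, 10.0]
        print(gauss_seidel(A, b))        # approx. [4.9107, 4.6429, 3.6607]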

  3. SU-F-T-494: A Multi-Institutional Study of Independent Dose Verification Using Golden Beam Data

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Itano, M; Yamazaki, T; Tachibana, R

    Purpose: In general, the beam data of an individual linac is measured for an independent dose verification software program, and the verification is performed as a secondary check. In this study, independent dose verification using golden beam data was compared to that using the individual linac's beam data. Methods: Six institutions participated, and three different beam data sets were prepared. One was individually measured data (Original Beam Data, OBD). The others were generated from all measurements for the same linac model (Model-GBD) and for all linac models (All-GBD). The three different beam data sets were registered in the independent verification software program at each institution. Subsequently, patient plans for eight sites (brain, head and neck, lung, esophagus, breast, abdomen, pelvis and bone) were analyzed using the verification program to compare doses calculated using the three different beam data sets. Results: 1116 plans were collected from the six institutions. Compared to the OBD-based calculation, the variation using the Model-GBD-based calculation was 0.0 ± 0.3% and using the All-GBD 0.0 ± 0.6%. The maximum variations were 1.2% and 2.3%, respectively. In the plans with variation over 1%, the reference points were located away from the central axis, with or without a physical wedge. Conclusion: The confidence limit (2SD) using the Model-GBD was within 0.6% and using the All-GBD within 1.2%. Thus, the use of golden beam data may be feasible for independent verification. In addition, verification using golden beam data provides quality assurance of planning from the viewpoint of an audit. This research is partially supported by the Japan Agency for Medical Research and Development (AMED).

  4. 21 CFR 607.37 - Inspection of establishment registrations and blood product listings.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... offices for firms within the geographical area of such district office. Upon request and receipt of a self-addressed stamped envelope, verification of registration number, or location of registered establishment...

  5. Comparison of individual and composite field analysis using array detector for Intensity Modulated Radiotherapy dose verification.

    PubMed

    Saminathan, Sathiyan; Chandraraj, Varatharaj; Sridhar, C H; Manickam, Ravikumar

    2012-01-01

    To compare the measured and calculated individual and composite field planar dose distributions of Intensity Modulated Radiotherapy plans. The measurements were performed on a Clinac DHX linear accelerator with 6 MV photons using the Matrixx device and a solid water phantom. Twenty brain tumor patients were selected for this study. The IMRT plans were generated for all patients using the Eclipse treatment planning system. A verification plan was produced for every original plan using a CT scan of the Matrixx embedded in the phantom, and every verification field was measured by the Matrixx. The TPS-calculated and measured dose distributions were compared for individual and composite fields, and the percentage of gamma pixel match for the dose distribution patterns was evaluated using gamma histograms. For individual fields, the gamma pixel match was 95-98% for 41 fields (39%) and above 98% for 59 fields (61%). For composite fields, the percentage of gamma pixel match was 95-98% for 5 patients and above 98% for 12 patients; three patients showed a gamma pixel match of less than 95%. The comparison of percentage gamma pixel match for individual and composite fields showed more than 2.5% variation for 6 patients and more than 1% variation for 4 patients, while the remaining 10 patients showed less than 1% variation. The individual and composite field measurements showed good agreement with the TPS-calculated dose distributions for the studied patients. Since measurement and data analysis for individual fields is a time-consuming process, composite field analysis may be sufficient for smaller-field dose distribution analysis with array detectors.

  6. 37 CFR 382.16 - Verification of royalty distributions.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... distributions. 382.16 Section 382.16 Patents, Trademarks, and Copyrights COPYRIGHT ROYALTY BOARD, LIBRARY OF CONGRESS RATES AND TERMS FOR STATUTORY LICENSES RATES AND TERMS FOR DIGITAL TRANSMISSIONS OF SOUND... SATELLITE DIGITAL AUDIO RADIO SERVICES Preexisting Satellite Digital Audio Radio Services § 382.16...

  7. 37 CFR 382.16 - Verification of royalty distributions.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... distributions. 382.16 Section 382.16 Patents, Trademarks, and Copyrights COPYRIGHT ROYALTY BOARD, LIBRARY OF CONGRESS RATES AND TERMS FOR STATUTORY LICENSES RATES AND TERMS FOR DIGITAL TRANSMISSIONS OF SOUND... SATELLITE DIGITAL AUDIO RADIO SERVICES Preexisting Satellite Digital Audio Radio Services § 382.16...

  8. 37 CFR 382.16 - Verification of royalty distributions.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... distributions. 382.16 Section 382.16 Patents, Trademarks, and Copyrights COPYRIGHT ROYALTY BOARD, LIBRARY OF CONGRESS RATES AND TERMS FOR STATUTORY LICENSES RATES AND TERMS FOR DIGITAL TRANSMISSIONS OF SOUND... SATELLITE DIGITAL AUDIO RADIO SERVICES Preexisting Satellite Digital Audio Radio Services § 382.16...

  9. 37 CFR 382.16 - Verification of royalty distributions.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... distributions. 382.16 Section 382.16 Patents, Trademarks, and Copyrights COPYRIGHT ROYALTY BOARD, LIBRARY OF CONGRESS RATES AND TERMS FOR STATUTORY LICENSES RATES AND TERMS FOR DIGITAL TRANSMISSIONS OF SOUND... SATELLITE DIGITAL AUDIO RADIO SERVICES Preexisting Satellite Digital Audio Radio Services § 382.16...

  10. 37 CFR 382.16 - Verification of royalty distributions.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... distributions. 382.16 Section 382.16 Patents, Trademarks, and Copyrights COPYRIGHT ROYALTY BOARD, LIBRARY OF CONGRESS RATES AND TERMS FOR STATUTORY LICENSES RATES AND TERMS FOR DIGITAL TRANSMISSIONS OF SOUND... SATELLITE DIGITAL AUDIO RADIO SERVICES Preexisting Satellite Digital Audio Radio Services § 382.16...

  11. Sparse distributed memory prototype: Principles of operation

    NASA Technical Reports Server (NTRS)

    Flynn, Michael J.; Kanerva, Pentti; Ahanin, Bahram; Bhadkamkar, Neal; Flaherty, Paul; Hickey, Philip

    1988-01-01

    Sparse distributed memory is a generalized random access memory (RAM) for long binary words. Such words can be written into and read from the memory, and they can be used to address the memory. The main attribute of the memory is sensitivity to similarity, meaning that a word can be read back not only by giving the original write address but also by giving one close to it as measured by the Hamming distance between addresses. Large memories of this kind are expected to have wide use in speech and scene analysis, in signal detection and verification, and in adaptive control of automated equipment. The memory can be realized as a simple, massively parallel computer. Digital technology has reached a point where building large memories is becoming practical. The research is aimed at resolving major design issues that have to be faced in building the memories. The design of a prototype memory with 256-bit addresses and from 8K to 128K locations for 256-bit words is described. A key aspect of the design is extensive use of dynamic RAM and other standard components.
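
    A minimal NumPy sketch of the sparse distributed memory principle, with hypothetical small dimensions: hard locations within a Hamming radius of the address are activated for both writes (counter updates) and reads (majority vote).

        import numpy as np

        rng = np.random.default_rng(4)
        N, M, RADIUS = 256, 2000, 112          # word length, hard locations, Hamming radius

        addresses = rng.integers(0, 2, size=(M, N), dtype=np.int8)   # fixed random hard locations
        counters = np.zeros((M, N), dtype=np.int32)

        def activated(addr):
            """Hard locations whose address lies within the Hamming radius of `addr`."""
            return np.count_nonzero(addresses != addr, axis=1) <= RADIUS

        def write(addr, word):
            sel = activated(addr)
            counters[sel] += np.where(word == 1, 1, -1)   # up-count 1-bits, down-count 0-bits

        def read(addr):
            sel = activated(addr)
            return (counters[sel].sum(axis=0) > 0).astype(np.int8)   # majority vote per bit

        word = rng.integers(0, 2, size=N, dtype=np.int8)
        write(word, word)                                  # autoassociative store
        noisy = word.copy(); noisy[:20] ^= 1               # flip 20 bits of the address
        print("recovered:", np.array_equal(read(noisy), word))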

  12. Design and Verification of a Distributed Communication Protocol

    NASA Technical Reports Server (NTRS)

    Munoz, Cesar A.; Goodloe, Alwyn E.

    2009-01-01

    The safety of remotely operated vehicles depends on the correctness of the distributed protocol that facilitates the communication between the vehicle and the operator. A failure in this communication can result in catastrophic loss of the vehicle. To complicate matters, the communication system may be required to satisfy several, possibly conflicting, requirements. The design of protocols is typically an informal process based on successive iterations of a prototype implementation. Yet distributed protocols are notoriously difficult to get correct using such informal techniques. We present a formal specification of the design of a distributed protocol intended for use in a remotely operated vehicle, which is built from the composition of several simpler protocols. We demonstrate proof strategies that allow us to prove properties of each component protocol individually while ensuring that the property is preserved in the composition forming the entire system. Given that designs are likely to evolve as additional requirements emerge, we show how we have automated most of the repetitive proof steps to enable verification of rapidly changing designs.

  13. SU-E-T-586: Optimal Determination of Tolerance Level for Radiation Dose Delivery Verification in An in Vivo Dosimetry System

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chen, Y; Souri, S; Gill, G

    Purpose: To statistically determine the optimal tolerance level for verification of the delivered dose against the planned dose in an in vivo dosimetry system in radiotherapy. Methods: The LANDAUER MicroSTARii dosimetry system with screened nanoDots (optically stimulated luminescence dosimeters) was used for in vivo dose measurements. Ideally, the measured dose should match the planned dose and fall within a normal distribution. Any deviation from the normal distribution may be deemed a mismatch, and therefore a potential sign of dose misadministration. Randomly mis-positioned nanoDots can yield a continuum background distribution. The percentage difference of the measured dose from its corresponding planned dose (ΔD) can be used to analyze combined data sets for different patients. A model of a Gaussian plus a flat function was used to fit the ΔD distribution. Results: A total of 434 nanoDot measurements for breast cancer patients were collected over a period of three months. The fit yields a Gaussian mean of 2.9% and a standard deviation (SD) of 5.3%. The observed shift of the mean from zero is attributed to machine output bias and the calibration of the dosimetry system. A pass interval of −2SD to +2SD was applied, and a mismatch background was estimated to be 4.8%. With such a tolerance level, one can expect that 99.99% of patients should pass the verification and at most 0.011% might have a potential dose misadministration that would not be detected after 3 repeated measurements. After implementation, a number of newly started breast cancer patients were monitored, and the measured pass rate was consistent with the model prediction. Conclusion: It is feasible to implement an optimal tolerance level that maintains a low probability of undetected dose misadministration while keeping a relatively high pass rate in radiotherapy delivery verification.
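
    A minimal sketch of the model fit described above, on a hypothetical sample of dose differences: a Gaussian component for matched measurements plus a flat component for randomly mis-positioned dosimeters.

        import numpy as np
        from scipy.optimize import curve_fit

        def gauss_plus_flat(x, amp, mu, sigma, background):
            """Gaussian peak on a flat continuum, as a binned-count model."""
            return amp * np.exp(-0.5 * ((x - mu) / sigma) ** 2) + background

        rng = np.random.default_rng(5)
        # Hypothetical data: 95% matched readings around +2.9%, 5% uniform mismatch background
        delta_d = np.concatenate([rng.normal(2.9, 5.3, 412), rng.uniform(-30, 30, 22)])
        counts, edges = np.histogram(delta_d, bins=40, range=(-30, 30))
        centers = 0.5 * (edges[:-1] + edges[1:])

        popt, _ = curve_fit(gauss_plus_flat, centers, counts, p0=[30, 0, 5, 1])
        amp, mu, sigma, background = popt
        print(f"mean {mu:.1f}%, SD {sigma:.1f}%, pass interval [{mu-2*sigma:.1f}, {mu+2*sigma:.1f}]%")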

  14. Bayesian Estimation of Combined Accuracy for Tests with Verification Bias

    PubMed Central

    Broemeling, Lyle D.

    2011-01-01

    This presentation will emphasize the estimation of the combined accuracy of two or more tests when verification bias is present. Verification bias occurs when some of the subjects are not verified by the gold standard. The approach is Bayesian, where the estimation of test accuracy is based on the posterior distribution of the relevant parameter. The accuracy of two combined binary tests is estimated employing either the "believe the positive" or the "believe the negative" rule, then the true and false positive fractions for each rule are computed for the two tests. In order to perform the analysis, the missing at random assumption is imposed, and an interesting example is provided by estimating the combined accuracy of CT and MRI to diagnose lung cancer. The Bayesian approach is extended to two ordinal tests when verification bias is present, and the accuracy of the combined tests is based on the ROC area of the risk function. An example involving mammography with two readers with extreme verification bias illustrates the estimation of the combined test accuracy for ordinal tests. PMID:26859487
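
    A minimal sketch of the two combination rules named above, for hypothetical per-test sensitivities and specificities of two conditionally independent binary tests (the Bayesian posterior estimation itself is beyond a few lines).

        def believe_the_positive(se1, sp1, se2, sp2):
            """Combined test is positive if either component test is positive."""
            sensitivity = 1 - (1 - se1) * (1 - se2)
            specificity = sp1 * sp2
            return sensitivity, specificity

        def believe_the_negative(se1, sp1, se2, sp2):
            """Combined test is positive only if both component tests are positive."""
            sensitivity = se1 * se2
            specificity = 1 - (1 - sp1) * (1 - sp2)
            return sensitivity, specificity

        # Hypothetical accuracies for CT and MRI
        se_ct, sp_ct, se_mri, sp_mri = 0.85, 0.80, 0.90, 0.75
        print("BP rule: se=%.3f sp=%.3f" % believe_the_positive(se_ct, sp_ct, se_mri, sp_mri))
        print("BN rule: se=%.3f sp=%.3f" % believe_the_negative(se_ct, sp_ct, se_mri, sp_mri))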

  15. Very fast road database verification using textured 3D city models obtained from airborne imagery

    NASA Astrophysics Data System (ADS)

    Bulatov, Dimitri; Ziems, Marcel; Rottensteiner, Franz; Pohl, Melanie

    2014-10-01

    Road databases are known to be an important part of any geodata infrastructure, e.g. as the basis for urban planning or emergency services. Updating road databases for crisis events must be performed quickly and with the highest possible degree of automation. We present a semi-automatic algorithm for road verification using textured 3D city models, starting from aerial or even UAV images. This algorithm contains two processes, which exchange input and output but basically run independently from each other: textured urban terrain reconstruction and road verification. The first process performs a dense photogrammetric reconstruction of the 3D geometry of the scene using depth maps. The second process is our core procedure, since it contains various methods for road verification. Each method represents a unique road model and a specific strategy, and thus is able to deal with a specific type of road. Each method is designed to provide two probability distributions, the first describing the state of a road object (correct, incorrect) and the second describing the state of its underlying road model (applicable, not applicable). Based on the Dempster-Shafer theory, both distributions are mapped to a single distribution that refers to three states: correct, incorrect, and unknown. With respect to the interaction of both processes, the normalized elevation map and the digital orthophoto generated during 3D reconstruction are the necessary input - together with initial road database entries - for the road verification process. If the entries of the database are too obsolete or not available at all, sensor data evaluation enables classification of the road pixels of the elevation map, followed by road map extraction by means of vectorization and filtering of geometrically and topologically inconsistent objects. Depending on time constraints and the availability of a geo-database for buildings, the urban terrain reconstruction procedure outputs semantic models of buildings, trees, and ground. Buildings and ground are textured by means of the available images. This facilitates orientation in the model and the interactive verification of the road objects that were initially classified as unknown. The three main modules of the texturing algorithm are: pose estimation (if the videos are not geo-referenced), occlusion analysis, and texture synthesis.
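
    A minimal sketch of the Dempster-Shafer step described above, with hypothetical mass assignments: evidence is expressed as masses on the frame {correct, incorrect}, the ignorance mass on the full set playing the role of "unknown", and Dempster's rule combines the road-state evidence with the model-applicability evidence.

        def dempster_combine(m1, m2):
            """Dempster's rule on the frame {C, I}; keys: 'C', 'I', 'CI' (ignorance)."""
            def inter(a, b):
                s = set(a) & set(b)
                return ''.join(sorted(s)) if s else None
            combined = {'C': 0.0, 'I': 0.0, 'CI': 0.0}
            conflict = 0.0
            for a, wa in m1.items():
                for b, wb in m2.items():
                    key = inter(a, b)
                    if key is None:
                        conflict += wa * wb       # contradictory evidence
                    else:
                        combined[key] += wa * wb
            return {k: v / (1 - conflict) for k, v in combined.items()}

        # Hypothetical masses: road-state evidence and model-applicability evidence
        road_state = {'C': 0.6, 'I': 0.1, 'CI': 0.3}
        model_applicable = {'C': 0.5, 'I': 0.0, 'CI': 0.5}
        print(dempster_combine(road_state, model_applicable))
        # the mass remaining on 'CI' is reported as the state "unknown"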

  16. Technical experiences of implementing a wireless tracking and facial biometric verification system for a clinical environment

    NASA Astrophysics Data System (ADS)

    Liu, Brent; Lee, Jasper; Documet, Jorge; Guo, Bing; King, Nelson; Huang, H. K.

    2006-03-01

    By implementing a tracking and verification system, clinical facilities can effectively monitor workflow and heighten information security in today's growing demand towards digital imaging informatics. This paper presents the technical design and implementation experiences encountered during the development of a Location Tracking and Verification System (LTVS) for a clinical environment. LTVS integrates facial biometrics with wireless tracking so that administrators can manage and monitor patient and staff through a web-based application. Implementation challenges fall into three main areas: 1) Development and Integration, 2) Calibration and Optimization of Wi-Fi Tracking System, and 3) Clinical Implementation. An initial prototype LTVS has been implemented within USC's Healthcare Consultation Center II Outpatient Facility, which currently has a fully digital imaging department environment with integrated HIS/RIS/PACS/VR (Voice Recognition).

  17. A Single-Boundary Accumulator Model of Response Times in an Addition Verification Task

    PubMed Central

    Faulkenberry, Thomas J.

    2017-01-01

    Current theories of mathematical cognition offer competing accounts of the interplay between encoding and calculation in mental arithmetic. Additive models propose that manipulations of problem format do not interact with the cognitive processes used in calculation. Alternatively, interactive models suppose that format manipulations have a direct effect on calculation processes. In the present study, we tested these competing models by fitting participants' RT distributions in an arithmetic verification task with a single-boundary accumulator model (the shifted Wald distribution). We found that in addition to providing a more complete description of RT distributions, the accumulator model afforded a potentially more sensitive test of format effects. Specifically, we found that format affected drift rate, which implies that problem format has a direct impact on calculation processes. These data give further support for an interactive model of mental arithmetic. PMID:28769853
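
    A minimal sketch of the shifted Wald density and a maximum-likelihood fit on hypothetical response times; the density function below is the standard single-boundary accumulator form with drift, boundary and shift parameters.

        import numpy as np
        from scipy.optimize import minimize

        def shifted_wald_pdf(t, gamma, alpha, theta):
            """Wald (inverse Gaussian) density with drift gamma, boundary alpha, shift theta."""
            x = np.asarray(t, float) - theta
            pdf = np.zeros_like(x)
            ok = x > 0
            pdf[ok] = (alpha / np.sqrt(2 * np.pi * x[ok] ** 3)
                       * np.exp(-(alpha - gamma * x[ok]) ** 2 / (2 * x[ok])))
            return pdf

        def neg_log_lik(params, rts):
            gamma, alpha, theta = params
            if gamma <= 0 or alpha <= 0 or theta < 0 or theta >= rts.min():
                return np.inf
            return -np.sum(np.log(shifted_wald_pdf(rts, gamma, alpha, theta) + 1e-300))

        rng = np.random.default_rng(6)
        rts = 0.35 + rng.wald(mean=0.5, scale=2.0, size=400)   # hypothetical verification RTs (s)
        fit = minimize(neg_log_lik, x0=[2.0, 1.0, 0.2], args=(rts,), method='Nelder-Mead')
        print("drift %.2f, boundary %.2f, shift %.2f s" % tuple(fit.x))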

  18. Plasma Model V&V of Collisionless Electrostatic Shock

    NASA Astrophysics Data System (ADS)

    Martin, Robert; Le, Hai; Bilyeu, David; Gildea, Stephen

    2014-10-01

    A simple 1D electrostatic collisionless shock was selected as an initial validation and verification test case for a new plasma modeling framework under development at the Air Force Research Laboratory's In-Space Propulsion branch (AFRL/RQRS). Cross verification between PIC, Vlasov, and Fluid plasma models within the framework along with expected theoretical results will be shown. The non-equilibrium velocity distributions (VDF) captured by PIC and Vlasov will be compared to each other and the assumed VDF of the fluid model at selected points. Validation against experimental data from the University of California, Los Angeles double-plasma device will also be presented along with current work in progress at AFRL/RQRS towards reproducing the experimental results using higher fidelity diagnostics to help elucidate differences between model results and between the models and original experiment. DISTRIBUTION A: Approved for public release; unlimited distribution; PA (Public Affairs) Clearance Number 14332.

  19. MESA: Message-Based System Analysis Using Runtime Verification

    NASA Technical Reports Server (NTRS)

    Shafiei, Nastaran; Tkachuk, Oksana; Mehlitz, Peter

    2017-01-01

    In this paper, we present a novel approach and framework for runtime verification of large, safety-critical messaging systems. This work was motivated by verifying the System Wide Information Management (SWIM) project of the Federal Aviation Administration (FAA). SWIM provides live air traffic, site and weather data streams for the whole National Airspace System (NAS), which can easily amount to several hundred messages per second. Such safety-critical systems cannot be instrumented; therefore, verification and monitoring have to happen non-intrusively, by connecting to a variety of network interfaces. Due to the large number of potential properties to check, the verification framework needs to support efficient formulation of properties with a suitable Domain Specific Language (DSL). Our approach is to utilize a distributed system that is geared towards connectivity and scalability and to interface it at the message queue level to a powerful verification engine. We implemented our approach in the tool called MESA: Message-Based System Analysis, which leverages the open source projects RACE (Runtime for Airspace Concept Evaluation) and TraceContract. RACE is a platform for instantiating and running highly concurrent and distributed systems and enables connectivity to SWIM and scalability. TraceContract is a runtime verification tool that allows for checking traces against properties specified in a powerful DSL. We applied our approach to verify a SWIM service against several requirements. We found errors such as duplicate and out-of-order messages.
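
    A minimal sketch, independent of the MESA/TraceContract DSLs, of the two property checks mentioned above: flagging duplicate and out-of-order messages in a stream of hypothetical (id, sequence-number) events.

        def check_stream(messages):
            """Yield violations for duplicate ids and decreasing sequence numbers."""
            seen_ids = set()
            last_seq = None
            for msg_id, seq in messages:
                if msg_id in seen_ids:
                    yield ('duplicate', msg_id)
                seen_ids.add(msg_id)
                if last_seq is not None and seq < last_seq:
                    yield ('out-of-order', msg_id)
                last_seq = seq

        # Hypothetical message trace: (message id, sequence number)
        trace = [('a', 1), ('b', 2), ('b', 3), ('c', 2), ('d', 4)]
        for violation in check_stream(trace):
            print(violation)   # ('duplicate', 'b') and ('out-of-order', 'c')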

  20. Virtual Distances Methodology as Verification Technique for AACMMs with a Capacitive Sensor Based Indexed Metrology Platform

    PubMed Central

    Acero, Raquel; Santolaria, Jorge; Brau, Agustin; Pueo, Marcos

    2016-01-01

    This paper presents a new verification procedure for articulated arm coordinate measuring machines (AACMMs) together with a capacitive sensor-based indexed metrology platform (IMP) based on the generation of virtual reference distances. The novelty of this procedure lies in the possibility of creating virtual points, virtual gauges and virtual distances through the indexed metrology platform’s mathematical model, taking as a reference the measurements of a ball bar gauge located in a fixed position of the instrument’s working volume. The measurements are carried out with the AACMM assembled on the IMP from the six rotating positions of the platform. In this way, an unlimited number and variety of reference distances can be created without the need for a physical gauge, thereby optimizing the testing time, the number of gauge positions and the space needed in the calibration and verification procedures. Four evaluation methods are presented to assess the volumetric performance of the AACMM. The results obtained proved the suitability of the virtual distances methodology as an alternative procedure for verification of AACMMs using the indexed metrology platform. PMID:27869722

  1. Virtual Distances Methodology as Verification Technique for AACMMs with a Capacitive Sensor Based Indexed Metrology Platform.

    PubMed

    Acero, Raquel; Santolaria, Jorge; Brau, Agustin; Pueo, Marcos

    2016-11-18

    This paper presents a new verification procedure for articulated arm coordinate measuring machines (AACMMs) together with a capacitive sensor-based indexed metrology platform (IMP) based on the generation of virtual reference distances. The novelty of this procedure lies in the possibility of creating virtual points, virtual gauges and virtual distances through the indexed metrology platform's mathematical model, taking as a reference the measurements of a ball bar gauge located in a fixed position of the instrument's working volume. The measurements are carried out with the AACMM assembled on the IMP from the six rotating positions of the platform. In this way, an unlimited number and variety of reference distances can be created without the need for a physical gauge, thereby optimizing the testing time, the number of gauge positions and the space needed in the calibration and verification procedures. Four evaluation methods are presented to assess the volumetric performance of the AACMM. The results obtained proved the suitability of the virtual distances methodology as an alternative procedure for verification of AACMMs using the indexed metrology platform.

  2. Characterization and clinical evaluation of a novel 2D detector array for conventional and flattening filter free (FFF) IMRT pre-treatment verification.

    PubMed

    Sekar, Yuvaraj; Thoelking, Johannes; Eckl, Miriam; Kalichava, Irakli; Sihono, Dwi Seno Kuncoro; Lohr, Frank; Wenz, Frederik; Wertz, Hansjoerg

    2018-04-01

    The novel MatriXX FFF (IBA Dosimetry, Germany) detector is a new 2D ionization chamber detector array designed for patient specific IMRT-plan verification including flattening-filter-free (FFF) beams. This study provides a detailed analysis of the characterization and clinical evaluation of the new detector array. The verification of the MatriXX FFF was subdivided into (i) physical dosimetric tests including dose linearity, dose rate dependency and output factor measurements and (ii) patient specific IMRT pre-treatment plan verifications. The MatriXX FFF measurements were compared to the calculated dose distribution of a commissioned treatment planning system by gamma index and dose difference evaluations for 18 IMRT-sequences. All IMRT-sequences were measured with original gantry angles and with collapsing all beams to 0° gantry angle to exclude the influence of the detector's angle dependency. The MatriXX FFF was found to be linear and dose rate independent for all investigated modalities (deviations ≤0.6%). Furthermore, the output measurements of the MatriXX FFF were in very good agreement to reference measurements (deviations ≤1.8%). For the clinical evaluation an average pixel passing rate for γ (3%,3mm) of (98.5±1.5)% was achieved when applying a gantry angle correction. Also, with collapsing all beams to 0° gantry angle an excellent agreement to the calculated dose distribution was observed (γ (3%,3mm) =(99.1±1.1)%). The MatriXX FFF fulfills all physical requirements in terms of dosimetric accuracy. Furthermore, the evaluation of the IMRT-plan measurements showed that the detector particularly together with the gantry angle correction is a reliable device for IMRT-plan verification including FFF. Copyright © 2017. Published by Elsevier GmbH.

  3. Assessment of Potential Location of High Arsenic Contamination Using Fuzzy Overlay and Spatial Anisotropy Approach in Iron Mine Surrounding Area

    PubMed Central

    Wirojanagud, Wanpen; Srisatit, Thares

    2014-01-01

    A fuzzy overlay approach on three raster maps, covering land slope, soil type, and distance to stream, can be used to identify the locations with the highest potential for arsenic contamination in soils. Verification of high arsenic contamination was made by collecting samples, analysing their arsenic content, and interpolating the surface with a spatial anisotropic method. A total of 51 soil samples were collected at the potential contamination locations identified by the fuzzy overlay approach. At each location, soil samples were taken at depths of 0.00-1.00 m below ground level. The surface interpolated from the analysed arsenic content using the spatial anisotropic method verified the potential arsenic contamination locations obtained from the fuzzy overlay outputs; the outputs of the spatial surface anisotropic method and the fuzzy overlay mapping conformed significantly in space. Three contaminated areas with arsenic concentrations of 7.19 ± 2.86, 6.60 ± 3.04, and 4.90 ± 2.67 mg/kg exceeded the arsenic content of 3.9 mg/kg, the maximum concentration level (MCL) for agricultural soils as designated by the Office of the National Environment Board of Thailand. It is concluded that fuzzy overlay mapping can be employed to identify potential contamination areas, with verification by the surface anisotropic approach including intensive sampling and analysis of the substances of interest. PMID:25110751
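
    A minimal sketch of a fuzzy overlay on three rasters, with hypothetical membership functions: each layer is rescaled to [0, 1] and the layers are combined here with the fuzzy gamma operator.

        import numpy as np

        def fuzzy_gamma(layers, gamma=0.9):
            """Fuzzy gamma overlay: compromise between fuzzy product and fuzzy sum."""
            stack = np.stack(layers)
            product = stack.prod(axis=0)
            fuzzy_sum = 1 - (1 - stack).prod(axis=0)
            return fuzzy_sum ** gamma * product ** (1 - gamma)

        rng = np.random.default_rng(7)
        shape = (50, 50)
        slope = rng.uniform(0, 30, shape)                 # hypothetical slope raster (degrees)
        dist_stream = rng.uniform(0, 2000, shape)         # hypothetical distance to stream (m)
        soil_score = rng.uniform(0, 1, shape)             # hypothetical soil-type suitability

        # Memberships: flatter terrain and proximity to streams score higher
        m_slope = 1 - np.clip(slope / 30, 0, 1)
        m_dist = 1 - np.clip(dist_stream / 2000, 0, 1)
        potential = fuzzy_gamma([m_slope, m_dist, soil_score])
        print("candidate cells:", np.count_nonzero(potential > 0.8))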

  4. Integrated Verification Experiment data collected as part of the Los Alamos National Laboratory's Source Region Program. Appendix D: Ionospheric measurements for IVEs

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Fitzgerald, T.J.; Carlos, R.C.; Argo, P.E.

    As part of the integrated verification experiment (IVE), we deployed a network of HF ionospheric sounders to detect the effects of acoustic waves generated by surface ground motion following underground nuclear tests at the Nevada Test Site. The network sampled up to four geographic locations in the ionosphere from almost directly overhead of the surface ground zero out to a horizontal range of 60 km. We present sample results for four of the IVEs: Misty Echo, Texarkana, Mineral Quarry, and Bexar.

  5. CAPTIONALS: A computer aided testing environment for the verification and validation of communication protocols

    NASA Technical Reports Server (NTRS)

    Feng, C.; Sun, X.; Shen, Y. N.; Lombardi, Fabrizio

    1992-01-01

    This paper covers verification and protocol validation for distributed computer and communication systems using a computer-aided testing approach. Validation and verification make up the so-called process of conformance testing. Protocol applications which pass conformance testing are then checked to see whether they can operate together; this is referred to as interoperability testing. A new comprehensive approach to protocol testing is presented which addresses: (1) modeling for inter-layer representation for compatibility between conformance and interoperability testing; (2) computational improvement to current testing methods by using the proposed model, inclusive of the formulation of new qualitative and quantitative measures and time-dependent behavior; and (3) analysis and evaluation of protocol behavior for interactive testing without extensive simulation.

  6. Formal design and verification of a reliable computing platform for real-time control (phase 3 results)

    NASA Technical Reports Server (NTRS)

    Butler, Ricky W.; Divito, Ben L.; Holloway, C. Michael

    1994-01-01

    In this paper the design and formal verification of the lower levels of the Reliable Computing Platform (RCP), a fault-tolerant computing system for digital flight control applications, are presented. The RCP uses NMR-style redundancy to mask faults and internal majority voting to flush the effects of transient faults. Two new layers of the RCP hierarchy are introduced: the Minimal Voting refinement (DA_minv) of the Distributed Asynchronous (DA) model and the Local Executive (LE) Model. Both the DA_minv model and the LE model are specified formally and have been verified using the Ehdm verification system. All specifications and proofs are available electronically via the Internet using anonymous FTP or World Wide Web (WWW) access.

  7. Optimum structural design based on reliability and proof-load testing

    NASA Technical Reports Server (NTRS)

    Shinozuka, M.; Yang, J. N.

    1969-01-01

    A proof-load test eliminates structures with strength less than the proof load and thus improves the reliability value used in analysis. It truncates the distribution function of strength at the proof load, thereby alleviating the need to verify a fitted distribution function at the lower tail portion, where data are usually nonexistent.
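
    A minimal sketch of the truncation effect described above, assuming a hypothetical normal strength distribution: surviving the proof load renormalises the strength CDF above the proof level, so the failure probability at loads near the lower tail drops accordingly.

        import numpy as np
        from scipy.stats import norm

        def truncated_cdf(x, dist, proof_load):
            """CDF of strength conditional on surviving the proof-load test."""
            x = np.asarray(x, float)
            tail = dist.sf(proof_load)                     # survival probability at the proof load
            cdf = (dist.cdf(x) - dist.cdf(proof_load)) / tail
            return np.clip(cdf, 0.0, None)                 # zero below the proof load

        strength = norm(loc=100.0, scale=10.0)             # hypothetical strength distribution
        proof = 90.0
        for load in (85.0, 95.0):
            print(f"P(strength <= {load}): raw {strength.cdf(load):.4f}, "
                  f"after proof test {truncated_cdf(load, strength, proof):.4f}")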

  8. Abstractions for Fault-Tolerant Distributed System Verification

    NASA Technical Reports Server (NTRS)

    Pike, Lee S.; Maddalon, Jeffrey M.; Miner, Paul S.; Geser, Alfons

    2004-01-01

    Four kinds of abstraction for the design and analysis of fault tolerant distributed systems are discussed. These abstractions concern system messages, faults, fault masking voting, and communication. The abstractions are formalized in higher order logic, and are intended to facilitate specifying and verifying such systems in higher order theorem provers.

  9. TESTING AND VERIFICATION OF REAL-TIME WATER QUALITY MONITORING SENSORS IN A DISTRIBUTION SYSTEM AGAINST INTRODUCED CONTAMINATION

    EPA Science Inventory

    Drinking water distribution systems reach the majority of American homes, business and civic areas, and are therefore an attractive target for terrorist attack via direct contamination, or backflow events. Instrumental monitoring of such systems may be used to signal the prese...

  10. From MetroII to Metronomy, Designing Contract-based Function-Architecture Co-simulation Framework for Timing Verification of Cyber-Physical Systems

    DTIC Science & Technology

    2015-03-13

    A. Lee. "A Programming Model for Time-Synchronized Distributed Real-Time Systems". In: Proceedings of the Real Time and Embedded Technology and Applications Symposium. 2007, pp. 259-268. ... From MetroII to Metronomy, Designing Contract-based Function-Architecture Co-simulation Framework for Timing Verification of Cyber-Physical Systems ...

  11. Non-Lethal Weapons Effectiveness Assessment Development and Verification Study (Etude d’evaluation, de developpement et de verification de l’efficacite des armes non letales)

    DTIC Science & Technology

    2009-10-01

    will guarantee a solid base for the future. The content of this publication has been reproduced directly from material supplied by RTO or the ... intensity threat involving a local population wanting to break into the camp to steal material and food supplies; and • A higher intensity threat ... combatant evacuation operations, distribute emergency supplies, and evacuate/relocate refugees and displaced persons. Specified NLW-relevant tasks are

  12. Assessing the Potential of Societal Verification by Means of New Media

    DTIC Science & Technology

    2014-01-01

    the Defense Advanced Research Projects Agency (DARPA) "Red Balloon Challenge" in 2009 by finding 10 tethered weather balloons scattered across the ... Institute of Technology (MIT) managed to locate 10 weather balloons tethered at undisclosed locations across the continental United States in less than ... suited for complex problem solving, and the 2009 Defense Advanced Research Projects Agency's (DARPA) "Red Balloon Challenge" has already demonstrated

  13. A Real-Time Rover Executive based On Model-Based Reactive Planning

    NASA Technical Reports Server (NTRS)

    Bias, M. Bernardine; Lemai, Solange; Muscettola, Nicola; Korsmeyer, David (Technical Monitor)

    2003-01-01

    This paper reports on the experimental verification of the ability of IDEA (Intelligent Distributed Execution Architecture) to operate effectively at multiple levels of abstraction in an autonomous control system. The basic hypothesis of IDEA is that a large control system can be structured as a collection of interacting control agents, each organized around the same fundamental structure. Two IDEA agents, a system-level agent and a mission-level agent, were designed and implemented to autonomously control the K9 rover in real time. The system is evaluated in a scenario where the rover must acquire images from a specified set of locations. The IDEA agents are responsible for enabling the rover to achieve its goals while monitoring the execution and safety of the rover and recovering from dangerous states when necessary. Experiments carried out both in simulation and on the physical rover produced highly promising results.

  14. A virtual fluoroscopy system to verify seed positioning accuracy during prostate permanent seed implants.

    PubMed

    Sarkar, V; Gutierrez, A N; Stathakis, S; Swanson, G P; Papanikolaou, N

    2009-01-01

    The purpose of this project was to develop a software platform to produce a virtual fluoroscopic image as an aid for permanent prostate seed implants. Seed location information from a pre-plan was extracted and used as input to in-house developed software to produce a virtual fluoroscopic image. In order to account for differences in patient positioning on the day of treatment, the user was given the ability to make changes to the virtual image. The system has been shown to work as expected for all test cases. The system allows for quick (on average less than 10 sec) generation of a virtual fluoroscopic image of the planned seed pattern. The image can be used as a verification tool to aid the physician in evaluating how close the implant is to the planned distribution throughout the procedure and enable remedial action should a large deviation be observed.
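
    The geometric core of such a system, projecting planned 3D seed coordinates through a point X-ray source onto an imager plane, can be sketched in a few lines. The following Python fragment is an illustrative reconstruction, not the authors' software; the source-to-axis and source-to-imager distances are hypothetical defaults.

    ```python
    import numpy as np

    def project_seeds(seeds_mm, sad_mm=1000.0, sid_mm=1500.0):
        """Project 3D seed positions onto a virtual fluoroscopy imager.

        seeds_mm: (N, 3) array of seed coordinates (x, y, z) in mm, with the
        isocenter at the origin and the X-ray source on the +z axis at
        z = sad_mm (point-source geometry). The imager lies sid_mm from the
        source, perpendicular to the beam axis. Returns (N, 2) imager
        coordinates in mm.
        """
        seeds = np.asarray(seeds_mm, dtype=float)
        depth = sad_mm - seeds[:, 2]   # source-to-seed distance along the axis
        mag = sid_mm / depth           # perspective magnification per seed
        return seeds[:, :2] * mag[:, None]

    # Example: three seeds near the isocenter plane.
    print(project_seeds([[10.0, -5.0, 0.0], [12.0, 0.0, 15.0], [-8.0, 4.0, -10.0]]))
    ```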

  15. Conceptual Modeling of a Quantum Key Distribution Simulation Framework Using the Discrete Event System Specification

    DTIC Science & Technology

    2014-09-18

    and full-scale experimental verifications towards ground-satellite quantum key distribution. Nat. Photonics [volume and page details garbled in the source]. ...Conceptual Modeling of a Quantum Key Distribution Simulation Framework Using the Discrete Event System Specification. Dissertation by Jeffrey D. Morris, presented to the Faculty, Department of Systems

  16. Dosimetric verification for primary focal hypermetabolism of nasopharyngeal carcinoma patients treated with dynamic intensity-modulated radiation therapy.

    PubMed

    Xin, Yong; Wang, Jia-Yang; Li, Liang; Tang, Tian-You; Liu, Gui-Hong; Wang, Jian-She; Xu, Yu-Mei; Chen, Yong; Zhang, Long-Zhen

    2012-01-01

    To assess the feasibility of (18F)FDG PET/CT-guided dynamic intensity-modulated radiation therapy (IMRT) for nasopharyngeal carcinoma patients, dosimetric verification was performed before treatment. Eleven patients with stage III~IVA nasopharyngeal carcinoma were treated with functional image-guided IMRT, with absolute and relative dosimetric verification performed on a Varian 23EX linear accelerator using an ionization chamber, the 2DICA of the I'mRT Matrixx, and an IBA detachable phantom. Contour delineation and treatment planning were based on different imaging techniques (CT and (18F)FDG PET/CT). The dose distributions of the various regions were realized by SMART. The absolute mean error in the region of interest was 2.39%±0.66 using a 0.6 cc ionization chamber. Using the DTA method, the average relative dose agreement within our protocol (3%, 3 mm) was 87.64% at 300 MU/min across all fields. Dosimetric verification before IMRT is obligatory and necessary. The ionization chamber and the 2DICA of the I'mRT Matrixx were effective dosimetric verification tools for primary focal hypermetabolism in functional image-guided dynamic IMRT for nasopharyngeal carcinoma. Our preliminary evidence indicates that functional image-guided dynamic IMRT is feasible.

  17. Full-Scale Incineration System Demonstration Verification Test Burns at the Naval Battalion Construction Center, Gulfport, Mississippi. Volume 3. Treatability Tests. Part 2

    DTIC Science & Technology

    1991-07-01

    1525 City: Idaho Falls State: ID Zip: 83413 Telephone Number: (216) 65-1763 4. Facilities Location: Number & Street: Naval Construction Battalion...ed into the POTW: (a) Pollutants which create a fire or explosion hazard in the POTW; (b) Pollutants which will cause corrosive structural damage to...Halon located in the laboratory (1) 15-lb CO2 located in the trailer 482 / 4.3.8 Maximum Hypothetical Accident (Explosion): The maximum hypothetical

  18. Mass and galaxy distributions of four massive galaxy clusters from Dark Energy Survey Science Verification data

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Melchior, P.; Suchyta, E.; Huff, E.

    2015-03-31

    We measure the weak-lensing masses and galaxy distributions of four massive galaxy clusters observed during the Science Verification phase of the Dark Energy Survey. This pathfinder study is meant to 1) validate the DECam imager for the task of measuring weak-lensing shapes, and 2) utilize DECam's large field of view to map out the clusters and their environments over 90 arcmin. We conduct a series of rigorous tests on astrometry, photometry, image quality, PSF modeling, and shear measurement accuracy to single out flaws in the data and also to identify the optimal data processing steps and parameters. We find Science Verification data from DECam to be suitable for the lensing analysis described in this paper. The PSF is generally well-behaved, but the modeling is rendered difficult by a flux-dependent PSF width and ellipticity. We employ photometric redshifts to distinguish between foreground and background galaxies, and a red-sequence cluster finder to provide cluster richness estimates and cluster-galaxy distributions. By fitting NFW profiles to the clusters in this study, we determine weak-lensing masses that are in agreement with previous work. For Abell 3261, we provide the first estimates of redshift, weak-lensing mass, and richness. In addition, the cluster-galaxy distributions indicate the presence of filamentary structures attached to 1E 0657-56 and RXC J2248.7-4431, stretching out as far as 1 degree (approximately 20 Mpc), showcasing the potential of DECam and DES for detailed studies of degree-scale features on the sky.

  19. Mass and galaxy distributions of four massive galaxy clusters from Dark Energy Survey Science Verification data

    DOE PAGES

    Melchior, P.; Suchyta, E.; Huff, E.; ...

    2015-03-31

    We measure the weak-lensing masses and galaxy distributions of four massive galaxy clusters observed during the Science Verification phase of the Dark Energy Survey. This pathfinder study is meant to 1) validate the DECam imager for the task of measuring weak-lensing shapes, and 2) utilize DECam's large field of view to map out the clusters and their environments over 90 arcmin. We conduct a series of rigorous tests on astrometry, photometry, image quality, PSF modelling, and shear measurement accuracy to single out flaws in the data and also to identify the optimal data processing steps and parameters. We find Science Verification data from DECam to be suitable for the lensing analysis described in this paper. The PSF is generally well-behaved, but the modelling is rendered difficult by a flux-dependent PSF width and ellipticity. We employ photometric redshifts to distinguish between foreground and background galaxies, and a red-sequence cluster finder to provide cluster richness estimates and cluster-galaxy distributions. By fitting NFW profiles to the clusters in this study, we determine weak-lensing masses that are in agreement with previous work. For Abell 3261, we provide the first estimates of redshift, weak-lensing mass, and richness. Additionally, the cluster-galaxy distributions indicate the presence of filamentary structures attached to 1E 0657-56 and RXC J2248.7-4431, stretching out as far as 1 degree (approximately 20 Mpc), showcasing the potential of DECam and DES for detailed studies of degree-scale features on the sky.

  20. Experimental verification of multipartite entanglement in quantum networks

    PubMed Central

    McCutcheon, W.; Pappa, A.; Bell, B. A.; McMillan, A.; Chailloux, A.; Lawson, T.; Mafu, M.; Markham, D.; Diamanti, E.; Kerenidis, I.; Rarity, J. G.; Tame, M. S.

    2016-01-01

    Multipartite entangled states are a fundamental resource for a wide range of quantum information processing tasks. In particular, in quantum networks, it is essential for the parties involved to be able to verify if entanglement is present before they carry out a given distributed task. Here we design and experimentally demonstrate a protocol that allows any party in a network to check if a source is distributing a genuinely multipartite entangled state, even in the presence of untrusted parties. The protocol remains secure against dishonest behaviour of the source and other parties, including the use of system imperfections to their advantage. We demonstrate the verification protocol in a three- and four-party setting using polarization-entangled photons, highlighting its potential for realistic photonic quantum communication and networking applications. PMID:27827361

  1. Droplet size distributions of adjuvant-amended sprays from an air-assisted five-port PWM nozzle

    USDA-ARS?s Scientific Manuscript database

    Verification of droplet size distributions is essential for the development of real-time variable-rate sprayers that synchronize spray outputs with canopy structures. Droplet sizes from a custom-designed, air-assisted, five-port nozzle coupled with a pulse-width-modulated (PWM) solenoid valve were m...

  2. 242A Distributed Control System Year 2000 Acceptance Test Report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    TEATS, M.C.

    1999-08-31

    This report documents acceptance test results for the 242-A Evaporator distributed control system upgrade to D/3 version 9.0-2 for year 2000 compliance. This report documents the test results obtained by acceptance testing as directed by procedure HNF-2695. This verification procedure will document the initial testing and evaluation of the potential 242-A Distributed Control System (DCS) operating difficulties across the year 2000 boundary and the calendar adjustments needed for the leap year. Baseline system performance data will be recorded using current, as-is operating system software. Data will also be collected for operating system software that has been modified to correct year 2000 problems. This verification procedure is intended to be generic such that it may be performed on any D/3 (trademark of GSE Process Solutions, Inc.) distributed control system that runs with the VMS (trademark of Digital Equipment Corporation) operating system. This test may be run on simulation or production systems depending upon facility status. On production systems, DCS outages will occur nine times throughout performance of the test. These outages are expected to last about 10 minutes each.

  3. Method for secure electronic voting system: face recognition based approach

    NASA Astrophysics Data System (ADS)

    Alim, M. Affan; Baig, Misbah M.; Mehboob, Shahzain; Naseem, Imran

    2017-06-01

    In this paper, we propose a framework for a low-cost secure electronic voting system based on face recognition. Essentially, Local Binary Patterns (LBP) are used for face feature characterization in texture format, followed by a chi-square distribution used for image classification. Two parallel systems, based on smart phone and web applications, are developed for the face learning and verification modules. The proposed system has two-tier security, using a person ID followed by face verification, and a class-specific threshold is used to control the security level of face verification. Our system is evaluated on three standard databases and one real home-based database and achieves satisfactory recognition accuracies. Consequently, our proposed system provides a secure, hassle-free voting system that is less intrusive compared with other biometrics.
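
    A minimal sketch of the verification pipeline described above, assuming grayscale face images and the uniform LBP implementation from scikit-image, might look as follows. The histogram settings and the decision threshold are hypothetical placeholders, not values from the paper.

    ```python
    import numpy as np
    from skimage.feature import local_binary_pattern

    def lbp_histogram(gray_image, n_points=8, radius=1):
        """Normalized uniform-LBP texture histogram of a grayscale face image."""
        lbp = local_binary_pattern(gray_image, n_points, radius, method="uniform")
        n_bins = n_points + 2  # uniform patterns plus one non-uniform bin
        hist, _ = np.histogram(lbp, bins=n_bins, range=(0, n_bins), density=True)
        return hist

    def chi_square_distance(h1, h2, eps=1e-10):
        """Chi-square distance between two normalized histograms."""
        return 0.5 * np.sum((h1 - h2) ** 2 / (h1 + h2 + eps))

    def verify(enrolled_hist, probe_image, threshold=0.05):
        """Accept the probe if its LBP texture matches the enrolled template.

        The threshold stands in for the class-specific threshold described
        above; it would be tuned per enrolled user to set the security level.
        """
        return chi_square_distance(enrolled_hist, lbp_histogram(probe_image)) < threshold
    ```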

  4. 21 CFR 1271.37 - Will establishment registrations and HCT/P listings be available for inspection, and how do I...

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... the geographical area of such district office. Upon request and receipt of a self-addressed stamped envelope, verification of a registration number or the location of a registered establishment will be...

  5. WITHDRAWN: Beam position alignment and its verification for therapeutic ion beams from synchrotron

    NASA Astrophysics Data System (ADS)

    Saraya, Y.; Takeshita, E.; Furukawa, T.; Hara, Y.; Mizushima, K.; Saotome, N.; Tansho, R.; Shirai, T.; Noda, K.

    2017-09-01

    This article has been withdrawn at the request of the authors. The Publisher apologizes for any inconvenience this may cause. The full Elsevier Policy on Article Withdrawal can be found at (http://www.elsevier.com/locate/withdrawalpolicy)

  6. 47 CFR 73.683 - Field strength contours and presumptive determination of field strength at individual locations.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... their qualifications in writing to the American Radio Relay League (ARRL). Individuals may also... telephone at 860-594-0200, or email at [email protected] (f) A satellite carrier is exempt from the verification...

  7. 40 CFR 141.401 - Sanitary surveys for ground water systems.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ...: (1) Source, (2) Treatment, (3) Distribution system, (4) Finished water storage, (5) Pumps, pump facilities, and controls, (6) Monitoring, reporting, and data verification, (7) System management and...

  8. 40 CFR 141.401 - Sanitary surveys for ground water systems.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ...: (1) Source, (2) Treatment, (3) Distribution system, (4) Finished water storage, (5) Pumps, pump facilities, and controls, (6) Monitoring, reporting, and data verification, (7) System management and...

  9. A novel approach to EPID-based 3D volumetric dosimetry for IMRT and VMAT QA

    NASA Astrophysics Data System (ADS)

    Alhazmi, Abdulaziz; Gianoli, Chiara; Neppl, Sebastian; Martins, Juliana; Veloza, Stella; Podesta, Mark; Verhaegen, Frank; Reiner, Michael; Belka, Claus; Parodi, Katia

    2018-06-01

    Intensity modulated radiation therapy (IMRT) and volumetric modulated arc therapy (VMAT) are relatively complex treatment delivery techniques and require quality assurance (QA) procedures. Pre-treatment dosimetric verification represents a fundamental QA procedure in daily clinical routine in radiation therapy. The purpose of this study is to develop an EPID-based approach to reconstruct a 3D dose distribution as imparted to a virtual cylindrical water phantom, to be used for plan-specific pre-treatment dosimetric verification of IMRT and VMAT plans. For each depth, the planar 2D dose distributions acquired in air were back-projected and convolved by depth-specific scatter and attenuation kernels. The kernels were obtained by making use of scatter and attenuation models to iteratively estimate the parameters from a set of reference measurements. The derived parameters served as a look-up table for the reconstruction of arbitrary measurements. The summation of the reconstructed 3D dose distributions resulted in the integrated 3D dose distribution of the treatment delivery. The accuracy of the proposed approach was validated on clinical IMRT and VMAT plans by means of gamma evaluation, comparing the reconstructed 3D dose distributions with Octavius measurements. The comparison was carried out using (3%, 3 mm) criteria, scoring 99% and 96% passing rates for IMRT and VMAT, respectively. An accuracy comparable to that of the commercial device for 3D volumetric dosimetry was demonstrated. In addition, five IMRT and five VMAT plans were validated against the 3D dose calculation performed by the TPS in a water phantom using the same passing-rate criteria. The median passing rate within the ten treatment plans was 97.3%, and the lowest was 95%. Moreover, the reconstructed 3D distribution is obtained without predictions relying on forward dose calculation and without an external phantom or dosimetric devices. Thus, the approach provides a fully automated, fast and easy QA procedure for plan-specific pre-treatment dosimetric verification.
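
    The (3%, 3 mm) gamma comparison used to score these passing rates can be illustrated with a brute-force 2D implementation. This is a generic sketch of the gamma method under stated assumptions (uniform grid, global normalization to the maximum planned dose, simple low-dose cutoff), not the authors' code.

    ```python
    import numpy as np

    def gamma_passing_rate(planned, measured, spacing_mm,
                           dose_tol=0.03, dist_tol_mm=3.0, cutoff=0.10):
        """Global 2D gamma analysis; returns the fraction of points with gamma <= 1.

        planned, measured: 2D dose arrays on the same uniform grid with
        pixel spacing spacing_mm.
        """
        dose_norm = dose_tol * planned.max()            # global dose criterion
        reach = int(np.ceil(dist_tol_mm / spacing_mm))  # search radius in pixels
        gamma = np.full(planned.shape, np.inf)
        for dy in range(-reach, reach + 1):
            for dx in range(-reach, reach + 1):
                dist2 = (dy * spacing_mm) ** 2 + (dx * spacing_mm) ** 2
                if dist2 > dist_tol_mm ** 2:
                    continue
                # np.roll wraps at the borders; a full implementation would
                # treat the edge region explicitly.
                shifted = np.roll(np.roll(measured, dy, axis=0), dx, axis=1)
                dd2 = ((shifted - planned) / dose_norm) ** 2
                gamma = np.minimum(gamma, np.sqrt(dd2 + dist2 / dist_tol_mm ** 2))
        evaluated = planned > cutoff * planned.max()    # ignore the low-dose region
        return float(np.mean(gamma[evaluated] <= 1.0))
    ```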

  10. STS-96 crew takes part in payload Interface Verification Test

    NASA Technical Reports Server (NTRS)

    1999-01-01

    At the SPACEHAB Facility, STS-96 Mission Specialist Ellen Ochoa and Commander Kent Rominger pause during a payload Interface Verification Test (IVT) for their upcoming mission to the International Space Station. Other crew members at KSC for the IVT are Pilot Rick Husband and Mission Specialists Tamara Jernigan, Dan Barry, Julie Payette and Valery Tokarev of Russia. Mission STS-96 carries the SPACEHAB Logistics Double Module, which will have equipment to further outfit the International Space Station service module and equipment that can be off-loaded from the early U.S. assembly flights. It carries internal logistics and resupply cargo for station outfitting, plus an external Russian cargo crane to be mounted to the exterior of the Russian station segment and used to perform space walking maintenance activities. The double module stowage provides capacity of up to 10,000 lbs. with the ability to accommodate powered payloads, four external rooftop stowage locations, four double-rack locations (two powered), up to 61 bulkhead-mounted middeck locker locations, and floor storage for large unique items and Soft Stowage. STS-96 is targeted to launch May 20 about 9:32 a.m. EDT.

  11. STS-96 crew takes part in payload Interface Verification Test

    NASA Technical Reports Server (NTRS)

    1999-01-01

    In the SPACEHAB Facility, STS-96 Mission Specialist Julie Payette closes a container, part of the equipment to be carried on the SPACEHAB and mission STS-96. She and other crew members Commander Kent Rominger, Pilot Rick Husband, and Mission Specialists Ellen Ochoa, Tamara Jernigan, Dan Barry and Valery Tokarev of Russia are at KSC for a payload Interface Verification Test for the upcoming mission to the International Space Station. Mission STS-96 carries the SPACEHAB Logistics Double Module, which has equipment to further outfit the International Space Station service module and equipment that can be off-loaded from the early U.S. assembly flights. The SPACEHAB carries internal logistics and resupply cargo for station outfitting, plus an external Russian cargo crane to be mounted to the exterior of the Russian station segment and used to perform space walking maintenance activities. The double module stowage provides capacity of up to 10,000 lbs. with the ability to accommodate powered payloads, four external rooftop stowage locations, four double-rack locations (two powered), up to 61 bulkhead-mounted middeck locker locations, and floor storage for large unique items and Soft Stowage. STS-96 is targeted to launch May 20 about 9:32 a.m.

  12. STS-96 crew takes part in payload Interface Verification Test

    NASA Technical Reports Server (NTRS)

    1999-01-01

    Posing on the platform next to the SPACEHAB Logistics Double Module in the SPACEHAB Facility are the STS-96 crew (from left) Mission Specialists Dan Barry, Tamara Jernigan, Valery Tokarev of Russia, and Julie Payette; Pilot Rick Husband; Mission Specialist Ellen Ochoa; and Commander Kent Rominger. The crew is at KSC for a payload Interface Verification Test for their upcoming mission to the International Space Station. Mission STS-96 carries the SPACEHAB Logistics Double Module, which will have equipment to further outfit the International Space Station service module and equipment that can be off-loaded from the early U.S. assembly flights. It carries internal logistics and resupply cargo for station outfitting, plus an external Russian cargo crane to be mounted to the exterior of the Russian station segment and used to perform space walking maintenance activities. The double module stowage provides capacity of up to 10,000 lbs. with the ability to accommodate powered payloads, four external rooftop stowage locations, four double-rack locations (two powered), up to 61 bulkhead-mounted middeck locker locations, and floor storage for large unique items and Soft Stowage. STS-96 is targeted to launch May 20 about 9:32 a.m.

  13. STS-96 crew takes part in payload Interface Verification Test

    NASA Technical Reports Server (NTRS)

    1999-01-01

    At the SPACEHAB Facility, STS-96 Mission Specialist Ellen Ochoa and Commander Kent Rominger smile for the camera during a payload Interface Verification Test (IVT) for their upcoming mission to the International Space Station. Other crew members at KSC for the IVT are Pilot Rick Husband and Mission Specialists Tamara Jernigan, Dan Barry, Julie Payette and Valery Tokarev of Russia. Mission STS-96 carries the SPACEHAB Logistics Double Module, which will have equipment to further outfit the International Space Station service module and equipment that can be off-loaded from the early U.S. assembly flights. It carries internal logistics and resupply cargo for station outfitting, plus an external Russian cargo crane to be mounted to the exterior of the Russian station segment and used to perform space walking maintenance activities. The double module stowage provides capacity of up to 10,000 lbs. with the ability to accommodate powered payloads, four external rooftop stowage locations, four double-rack locations (two powered), up to 61 bulkhead-mounted middeck locker locations, and floor storage for large unique items and Soft Stowage. STS-96 is targeted to launch May 20 about 9:32 a.m. EDT.

  14. STS-96 crew takes part in payload Interface Verification Test

    NASA Technical Reports Server (NTRS)

    1999-01-01

    During a payload Interface Verification Test (IVT) for the upcoming mission to the International Space Station, Chris Jaskolka of Boeing points out a piece of equipment in the SPACEHAB module to STS-96 Commander Kent Rominger, Mission Specialist Ellen Ochoa and Pilot Rick Husband. Other crew members visiting KSC for the IVT are Mission Specialists Tamara Jernigan, Dan Barry, Julie Payette and Valery Tokarev of Russia. Mission STS-96 carries the SPACEHAB Logistics Double Module, which will have equipment to further outfit the International Space Station service module and equipment that can be off-loaded from the early U.S. assembly flights. It carries internal logistics and resupply cargo for station outfitting, plus an external Russian cargo crane to be mounted to the exterior of the Russian station segment and used to perform space walking maintenance activities. The double module stowage provides capacity of up to 10,000 lbs. with the ability to accommodate powered payloads, four external rooftop stowage locations, four double-rack locations (two powered), up to 61 bulkhead-mounted middeck locker locations, and floor storage for large unique items and Soft Stowage. STS-96 is targeted to launch May 20 about 9:32 a.m. EDT.

  15. STS-96 crew takes part in payload Interface Verification Test

    NASA Technical Reports Server (NTRS)

    1999-01-01

    In the SPACEHAB Facility, STS-96 Mission Specialists Dan Barry and Tamara Jernigan discuss procedures during a payload Interface Verification Test (IVT) for their upcoming mission to the International Space Station. Other STS-96 crew members at KSC for the IVT are Commander Kent Rominger, Pilot Rick Husband and Mission Specialists Ellen Ochoa, Julie Payette and Valery Tokarev of Russia. Mission STS-96 carries the SPACEHAB Logistics Double Module, which will have equipment to further outfit the International Space Station service module and equipment that can be off-loaded from the early U.S. assembly flights. It carries internal logistics and resupply cargo for station outfitting, plus an external Russian cargo crane to be mounted to the exterior of the Russian station segment and used to perform space walking maintenance activities. The double module stowage provides capacity of up to 10,000 lbs. with the ability to accommodate powered payloads, four external rooftop stowage locations, four double-rack locations (two powered), up to 61 bulkhead-mounted middeck locker locations, and floor storage for large unique items and Soft Stowage. STS-96 is targeted to launch May 20 about 9:32 a.m.

  16. STS-96 crew takes part in payload Interface Verification Test

    NASA Technical Reports Server (NTRS)

    1999-01-01

    In the SPACEHAB Facility, James Behling, with Boeing, talks about equipment for mission STS-96 during a payload Interface Verification Test (IVT). Watching are (from left) Mission Specialists Ellen Ochoa, Julie Payette and Dan Barry, and Pilot Rick Husband. Other STS-96 crew members at KSC for the IVT are Commander Kent Rominger and Mission Specialists Tamara Jernigan and Valery Tokarev of Russia. Mission STS-96 carries the SPACEHAB Logistics Double Module, which will have equipment to further outfit the International Space Station service module and equipment that can be off-loaded from the early U.S. assembly flights. It carries internal logistics and resupply cargo for station outfitting, plus an external Russian cargo crane to be mounted to the exterior of the Russian station segment and used to perform space walking maintenance activities. The double module stowage provides capacity of up to 10,000 lbs. with the ability to accommodate powered payloads, four external rooftop stowage locations, four double-rack locations (two powered), up to 61 bulkhead-mounted middeck locker locations, and floor storage for large unique items and Soft Stowage. STS-96 is targeted to launch May 20 about 9:32 a.m.

  17. STS-96 crew takes part in payload Interface Verification Test

    NASA Technical Reports Server (NTRS)

    1999-01-01

    During a payload Interface Verification Test (IVT) for their upcoming mission to the International Space Station, STS-96 Mission Specialists Julie Payette, Dan Barry, and Valery Tokarev of Russia, look at a Sequential Shunt Unit in the SPACEHAB Facility. Other crew members at KSC for the IVT are Commander Kent Rominger, Pilot Rick Husband, and Mission Specialists Ellen Ochoa and Tamara Jernigan. Mission STS-96 carries the SPACEHAB Logistics Double Module, which will have equipment to further outfit the International Space Station service module and equipment that can be off-loaded from the early U.S. assembly flights. It carries internal logistics and resupply cargo for station outfitting, plus an external Russian cargo crane to be mounted to the exterior of the Russian station segment and used to perform space walking maintenance activities. The double module stowage provides capacity of up to 10,000 lbs. with the ability to accommodate powered payloads, four external rooftop stowage locations, four double-rack locations (two powered), up to 61 bulkhead-mounted middeck locker locations, and floor storage for large unique items and Soft Stowage. STS-96 is targeted to launch May 20 about 9:32 a.m. EDT.

  18. STS-96 crew takes part in payload Interface Verification Test

    NASA Technical Reports Server (NTRS)

    1999-01-01

    In the SPACEHAB Facility for a payload Interface Verification Test (IVT) for their upcoming mission to the International Space Station are (left to right) Mission Specialists Valery Tokarev, Julie Payette (holding a lithium hydroxide canister) and Dan Barry. Other crew members at KSC for the IVT are Commander Kent Rominger, Pilot Rick Husband and Mission Specialists Ellen Ochoa and Tamara Jernigan. Mission STS-96 carries the SPACEHAB Logistics Double Module, which has equipment to further outfit the International Space Station service module and equipment that can be off-loaded from the early U.S. assembly flights. The SPACEHAB carries internal logistics and resupply cargo for station outfitting, plus an external Russian cargo crane to be mounted to the exterior of the Russian station segment and used to perform space walking maintenance activities. The double module stowage provides capacity of up to 10,000 lbs. with the ability to accommodate powered payloads, four external rooftop stowage locations, four double-rack locations (two powered), up to 61 bulkhead-mounted middeck locker locations, and floor storage for large unique items and Soft Stowage. STS-96 is targeted to launch May 20 about 9:32 a.m.

  19. STS-96 crew takes part in payload Interface Verification Test

    NASA Technical Reports Server (NTRS)

    1999-01-01

    In the SPACEHAB Facility, the STS-96 crew looks over equipment during a payload Interface Verification Test for the upcoming mission to the International Space Station. From left are Commander Kent Rominger, Mission Specialists Tamara Jernigan and Valery Tokarev of Russia, Pilot Rick Husband, and Mission Specialists Ellen Ochoa and Julie Payette (backs to the camera). They are listening to Chris Jaskolka of Boeing talk about the equipment. Mission STS-96 carries the SPACEHAB Logistics Double Module, which will have equipment to further outfit the International Space Station service module and equipment that can be off-loaded from the early U.S. assembly flights. It carries internal logistics and resupply cargo for station outfitting, plus an external Russian cargo crane to be mounted to the exterior of the Russian station segment and used to perform space walking maintenance activities. The double module stowage provides capacity of up to 10,000 lbs. with the ability to accommodate powered payloads, four external rooftop stowage locations, four double-rack locations (two powered), up to 61 bulkhead-mounted middeck locker locations, and floor storage for large unique items and Soft Stowage. STS-96 is targeted to launch May 20 about 9:32 a.m. EDT.

  20. Advanced Plant Habitat Test Harvest

    NASA Image and Video Library

    2017-08-24

    Arabidopsis thaliana plants are seen inside the growth chamber of the Advanced Plant Habitat (APH) Flight Unit No. 1 prior to harvest of half the plants. The harvest is part of an ongoing verification test of the APH unit, which is located inside the International Space Station Environmental Simulator in NASA Kennedy Space Center's Space Station Processing Facility. The APH undergoing testing at Kennedy is identical to one on the station and uses red, green and broad-spectrum white LED lights to grow plants in an environmentally controlled chamber. The seeds grown during the verification test will be grown on the station to help scientists understand how these plants adapt to spaceflight.

  1. Using crypts as iris minutiae

    NASA Astrophysics Data System (ADS)

    Shen, Feng; Flynn, Patrick J.

    2013-05-01

    Iris recognition is one of the most reliable biometric technologies for identity recognition and verification, but it has not been used in a forensic context because the representation and matching of iris features are not straightforward for traditional iris recognition techniques. In this paper we concentrate on the iris crypt as a visible feature used to represent the characteristics of irises in a similar way to fingerprint minutiae. The matching of crypts is based on their appearances and locations. The number of matching crypt pairs found between two irises can be used for identity verification and the convenience of manual inspection makes iris crypts a potential candidate for forensic applications.
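
    The location component of such crypt matching can be sketched as a greedy nearest-neighbor pairing of crypt centroids. Appearance matching is omitted here, and the distance tolerance is a hypothetical tuning parameter rather than a value from the paper.

    ```python
    import numpy as np

    def count_matching_crypts(crypts_a, crypts_b, max_dist=0.05):
        """Greedily pair crypts from two irises by normalized (x, y) location.

        crypts_a, crypts_b: (N, 2) and (M, 2) arrays of crypt centroids in a
        normalized iris coordinate system. Returns the number of matched
        pairs; verification would compare this count against a threshold.
        """
        a = np.asarray(crypts_a, dtype=float)
        b = np.asarray(crypts_b, dtype=float)
        unused = list(range(len(b)))
        matches = 0
        for p in a:
            if not unused:
                break
            d = np.linalg.norm(b[unused] - p, axis=1)  # distances to unmatched crypts
            j = int(np.argmin(d))
            if d[j] <= max_dist:
                matches += 1
                unused.pop(j)                          # each crypt matches at most once
        return matches
    ```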

  2. System engineering of the Atacama Large Millimeter/submillimeter Array

    NASA Astrophysics Data System (ADS)

    Bhatia, Ravinder; Marti, Javier; Sugimoto, Masahiro; Sramek, Richard; Miccolis, Maurizio; Morita, Koh-Ichiro; Arancibia, Demián.; Araya, Andrea; Asayama, Shin'ichiro; Barkats, Denis; Brito, Rodrigo; Brundage, William; Grammer, Wes; Haupt, Christoph; Kurlandczyk, Herve; Mizuno, Norikazu; Napier, Peter; Pizarro, Eduardo; Saini, Kamaljeet; Stahlman, Gretchen; Verzichelli, Gianluca; Whyborn, Nick; Yagoubov, Pavel

    2012-09-01

    The Atacama Large Millimeter/submillimeter Array (ALMA) will be composed of 66 high precision antennas located at 5000 meters altitude in northern Chile. This paper presents the methodology, tools and processes adopted to system-engineer a project of high technical complexity by system engineering teams that are remotely located and from different cultures, in accordance with a demanding schedule and within tight financial constraints. The technical and organizational complexity of ALMA requires a disciplined approach to the definition, implementation and verification of the ALMA requirements. During the development phase, System Engineering chairs all technical reviews and facilitates the resolution of technical conflicts. We have developed analysis tools to analyze the system performance, incorporating key parameters that contribute to the ultimate performance and are modeled using best estimates and/or measured values obtained during test campaigns. Strict tracking and control of the technical budgets ensures that the different parts of the system can operate together as a whole within ALMA boundary conditions. System Engineering is responsible for acceptance of the thousands of hardware items delivered to Chile, and also supports the software acceptance process. In addition, System Engineering leads the troubleshooting efforts during testing phases of the construction project. Finally, the team is conducting system-level verification and diagnostics activities to assess the overall performance of the observatory. This paper also shares lessons learned from these system engineering and verification approaches.

  3. Computer Simulations to Study Diffraction Effects of Stacking Faults in Beta-SiC: II. Experimental Verification. 2; Experimental Verification

    NASA Technical Reports Server (NTRS)

    Pujar, Vijay V.; Cawley, James D.; Levine, S. (Technical Monitor)

    2000-01-01

    Earlier results from computer simulation studies suggest a correlation between the spatial distribution of stacking errors in the Beta-SiC structure and features observed in X-ray diffraction patterns of the material. Reported here are experimental results obtained from two types of nominally Beta-SiC specimens, which yield distinct XRD data. These samples were analyzed using high-resolution transmission electron microscopy (HRTEM) and the stacking error distribution was directly determined. The HRTEM results compare well to those deduced by matching the XRD data with simulated spectra, confirming the hypothesis that the XRD data are indicative not only of the presence and density of stacking errors, but can also yield information regarding their distribution. In addition, the stacking error population in both specimens is related to their synthesis conditions, and this relation appears similar to the one developed by others to explain the formation of the corresponding polytypes.

  4. Quantifying Uncertainty of Wind Power Production Through an Analog Ensemble

    NASA Astrophysics Data System (ADS)

    Shahriari, M.; Cervone, G.

    2016-12-01

    The Analog Ensemble (AnEn) method is used to generate probabilistic weather forecasts that quantify the uncertainty in power estimates at hypothetical wind farm locations. The data are from the NREL Eastern Wind Dataset, which includes more than 1,300 modeled wind farms. The AnEn model uses a two-dimensional grid to estimate the probability distribution of wind speed (the predictand) given the values of predictor variables such as temperature, pressure, geopotential height, and the U- and V-components of wind. The meteorological data are taken from the NCEP GFS, which is available at 0.25-degree grid resolution. The methodology first divides the data into two classes: a training period and a verification period. The AnEn selects a point in the verification period and searches for the best-matching estimates (analogs) in the training period. The predictand values at those analogs form the ensemble prediction for the point in the verification period. The model provides a grid of wind speed values and the uncertainty (probability index) associated with each estimate. Each wind farm is associated with a probability index, which quantifies the degree of difficulty of estimating wind power. Further, the uncertainty in estimation is related to other factors such as topography, land cover and wind resources. This is achieved by using a GIS system to compute the correlation between the probability index and geographical characteristics. This study has significant applications for investors in the renewable energy sector, especially wind farm developers. A lower level of uncertainty facilitates the process of submitting bids into day-ahead and real-time electricity markets. Thus, building wind farms in regions with lower levels of uncertainty will reduce real-time operational risks and create a hedge against volatile real-time prices. Further, the links between wind estimate uncertainty and factors such as topography and wind resources provide wind farm developers with valuable information regarding wind farm siting.
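
    Stripped to its essentials at a single location, the analog search works as sketched below. Operational AnEn implementations match predictors over a short time window with tuned weights; this toy version, with hypothetical inputs, uses a plain Euclidean distance on standardized predictors.

    ```python
    import numpy as np

    def analog_ensemble(train_predictors, train_wind, query, n_analogs=20):
        """Toy analog-ensemble estimate of wind speed at one grid point.

        train_predictors: (T, P) past predictor vectors (e.g. temperature,
        pressure, geopotential height, U and V wind), standardized.
        train_wind: (T,) observed wind speed (the predictand).
        query: (P,) predictor vector for the forecast time.
        Returns the predictand values at the n_analogs closest matches; their
        spread quantifies the estimation uncertainty (the probability index).
        """
        dist = np.linalg.norm(train_predictors - query, axis=1)
        best = np.argsort(dist)[:n_analogs]
        return train_wind[best]

    # ensemble = analog_ensemble(X_train, y_train, x_now)
    # uncertainty = ensemble.std()   # spread as an uncertainty measure
    ```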

  5. A Quality Assurance Method that Utilizes 3D Dosimetry and Facilitates Clinical Interpretation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Oldham, Mark, E-mail: mark.oldham@duke.edu; Thomas, Andrew; O'Daniel, Jennifer

    2012-10-01

    Purpose: To demonstrate a new three-dimensional (3D) quality assurance (QA) method that provides comprehensive dosimetry verification and facilitates evaluation of the clinical significance of QA data acquired in a phantom. Also to apply the method to investigate the dosimetric efficacy of base-of-skull (BOS) intensity-modulated radiotherapy (IMRT) treatment. Methods and Materials: Two types of IMRT QA verification plans were created for 6 patients who received BOS IMRT. The first plan enabled conventional 2D planar IMRT QA using the Varian portal dosimetry system. The second plan enabled 3D verification using an anthropomorphic head phantom. In the latter, the 3D dose distribution was measured using the DLOS/Presage dosimetry system (DLOS = Duke Large-field-of-view Optical-CT System, Presage Heuris Pharma, Skillman, NJ), which yielded isotropic 2-mm data throughout the treated volume. In a novel step, measured 3D dose distributions were transformed back to the patient's CT to enable calculation of dose-volume histograms (DVH) and dose overlays. Measured and planned patient DVHs were compared to investigate clinical significance. Results: Close agreement between measured and calculated dose distributions was observed for all 6 cases. For gamma criteria of 3%, 2 mm, the mean passing rate for portal dosimetry was 96.8% (range, 92.0%-98.9%), compared to 94.9% (range, 90.1%-98.9%) for 3D. There was no clear correlation between 2D and 3D passing rates. Planned and measured dose distributions were evaluated on the patient's anatomy, using DVH and dose overlays. Minor deviations were detected, and the clinical significance of these are presented and discussed. Conclusions: Two advantages accrue to the methods presented here. First, treatment accuracy is evaluated throughout the whole treated volume, yielding comprehensive verification. Second, the clinical significance of any deviations can be assessed through the generation of DVH curves and dose overlays on the patient's anatomy. The latter step represents an important development that advances the clinical relevance of complex treatment QA.
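
    The DVH step, turning a measured 3D dose grid plus a structure mask from the patient's CT into a cumulative dose-volume curve, is straightforward to sketch. The following is a generic illustration of that calculation, not the DLOS/Presage software.

    ```python
    import numpy as np

    def cumulative_dvh(dose, mask, n_bins=200):
        """Cumulative dose-volume histogram for one structure.

        dose: 3D array of reconstructed dose values (Gy).
        mask: boolean 3D array selecting the structure's voxels (assumed
        non-empty). Returns (dose_axis, volume_fraction), where
        volume_fraction[i] is the fraction of the structure receiving at
        least dose_axis[i].
        """
        d = dose[mask]
        dose_axis = np.linspace(0.0, d.max(), n_bins)
        volume_fraction = np.array([(d >= t).mean() for t in dose_axis])
        return dose_axis, volume_fraction
    ```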

  6. Enrichment Assay Methods Development for the Integrated Cylinder Verification System

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Smith, Leon E.; Misner, Alex C.; Hatchell, Brian K.

    2009-10-22

    International Atomic Energy Agency (IAEA) inspectors currently perform periodic inspections at uranium enrichment plants to verify UF6 cylinder enrichment declarations. Measurements are typically performed with handheld high-resolution sensors on a sampling of cylinders taken to be representative of the facility's entire product-cylinder inventory. Pacific Northwest National Laboratory (PNNL) is developing a concept to automate the verification of enrichment plant cylinders to enable 100 percent product-cylinder verification and, potentially, mass-balance calculations on the facility as a whole (by also measuring feed and tails cylinders). The Integrated Cylinder Verification System (ICVS) could be located at key measurement points to positively identify each cylinder, measure its mass and enrichment, store the collected data in a secure database, and maintain continuity of knowledge on measured cylinders until IAEA inspector arrival. The three main objectives of this FY09 project are summarized here and described in more detail in the report: (1) Develop a preliminary design for a prototype NDA system, (2) Refine PNNL's MCNP models of the NDA system, and (3) Procure and test key pulse-processing components. Progress against these tasks to date, and next steps, are discussed.

  7. ETV REPORT: REMOVAL OF ARSENIC IN DRINKING WATER - PALL CORPORATION MICROZA MICROFILTRATION SYSTEM

    EPA Science Inventory

    Verification testing of the Pall Corporation Microza Microfiltration System for arsenic removal was conducted at the Oakland County Drain Commissioner (OCDC) Plum Creek Development well station located in Oakland County, Michigan, from August 19 through October 8, 2004. The sourc...

  8. Proceedings of the IDA Workshop on Formal Specification and Verification of Ada (Trade Name) (1st) Held in Alexandria, Virginia on 18-20 March 1985.

    DTIC Science & Technology

    1985-12-01

    on the third day. ADA VERIFICATION WORKSHOP, MARCH 18-20, 1985, LIST OF PARTICIPANTS: Bernard Abrams ABRAMS@ADA20 Grumman Aerospace Corporation Mail...20301-3081 (202) 694-0211 Mark R. Cornwell CORNWELL@NRL-CSS Code 7590 Naval Research Lab Washington, D.C. 20375 (202) 767-3365 Jeff Facemire FACEMIRE...accompanied by descriptions of their purpose in English, to LUCKHAM@SAIL for annotation. DISTRIBUTION LIST FOR M-146: Bernard Abrams ABRAMS@USC-ECLB

  9. The use of positron emission tomography in pion radiotherapy.

    PubMed

    Goodman, G B; Lam, G K; Harrison, R W; Bergstrom, M; Martin, W R; Pate, B D

    1986-10-01

    The radioactive debris produced by pion radiotherapy can be imaged by the technique of Positron Emission Tomography (PET) as a method of non-invasive in situ verification of the pion treatment. This paper presents the first visualization of the pion stopping distribution within a tumor in a human brain using PET. Together with the tissue functional information provided by the standard PET scans using radiopharmaceuticals, the combination of pion with PET technique can provide a much better form of radiotherapy than the use of conventional radiation in both treatment planning and verification.

  10. ESTEST: A Framework for the Verification and Validation of Electronic Structure Codes

    NASA Astrophysics Data System (ADS)

    Yuan, Gary; Gygi, Francois

    2011-03-01

    ESTEST is a verification and validation (V&V) framework for electronic structure codes that supports Qbox, Quantum Espresso, ABINIT and the Exciting Code, with support planned for many more. We discuss various approaches to the electronic structure V&V problem implemented in ESTEST, related to parsing, formats, data management, search, comparison and analyses. Additionally, an early experiment in the distribution of V&V ESTEST servers among the electronic structure community will be presented. Supported by NSF-OCI 0749217 and DOE FC02-06ER25777.

  11. A High-Level Language for Modeling Algorithms and Their Properties

    NASA Astrophysics Data System (ADS)

    Akhtar, Sabina; Merz, Stephan; Quinson, Martin

    Designers of concurrent and distributed algorithms usually express them using pseudo-code. In contrast, most verification techniques are based on more mathematically oriented formalisms such as state transition systems. This conceptual gap contributes to hindering the use of formal verification techniques. Leslie Lamport introduced PlusCal, a high-level algorithmic language that has the "look and feel" of pseudo-code, but is equipped with a precise semantics and includes a high-level expression language based on set theory. PlusCal models can be compiled to TLA+ and verified using the model checker tlc.
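
    The transition-system side of this gap is easy to make concrete. The sketch below is a generic breadth-first invariant checker over an explicit state space, written in Python; it illustrates in miniature what a model checker such as tlc does, not how tlc itself is implemented.

    ```python
    from collections import deque

    def check_invariant(initial_states, next_states, invariant):
        """Breadth-first search over an explicit state transition system.

        initial_states: iterable of hashable initial states.
        next_states: function mapping a state to its successor states.
        invariant: predicate that must hold in every reachable state.
        Returns None if the invariant holds, else a counterexample trace.
        """
        seen = set(initial_states)
        queue = deque((s, [s]) for s in seen)
        while queue:
            state, trace = queue.popleft()
            if not invariant(state):
                return trace                    # counterexample found
            for succ in next_states(state):
                if succ not in seen:
                    seen.add(succ)
                    queue.append((succ, trace + [succ]))
        return None

    # Example: two processes each increment a shared counter once;
    # the invariant "counter <= 2" holds in every reachable state.
    def successors(state):
        pc1, pc2, counter = state
        out = []
        if pc1 == 0:
            out.append((1, pc2, counter + 1))
        if pc2 == 0:
            out.append((pc1, 1, counter + 1))
        return out

    print(check_invariant([(0, 0, 0)], successors, lambda s: s[2] <= 2))
    ```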

  12. Probabilistic Requirements (Partial) Verification Methods Best Practices Improvement. Variables Acceptance Sampling Calculators: Empirical Testing. Volume 2

    NASA Technical Reports Server (NTRS)

    Johnson, Kenneth L.; White, K. Preston, Jr.

    2012-01-01

    The NASA Engineering and Safety Center was requested to improve on the Best Practices document produced for the NESC assessment, Verification of Probabilistic Requirements for the Constellation Program, by giving a recommended procedure for using acceptance sampling by variables techniques as an alternative to the potentially resource-intensive acceptance sampling by attributes method given in the document. In this paper, the results of empirical tests intended to assess the accuracy of acceptance sampling plan calculators implemented for six variable distributions are presented.
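
    The kind of empirical testing described can be reproduced in outline by Monte Carlo simulation: draw lots from a known distribution and estimate how often a variables sampling plan accepts them. The sketch below uses the standard one-sided k-method for a normal distribution with unknown standard deviation; the plan parameters n and k are hypothetical, not values from the assessment.

    ```python
    import numpy as np

    rng = np.random.default_rng(1)

    def acceptance_probability(mu, sigma, usl, n=30, k=1.8, trials=100_000):
        """Monte Carlo estimate of P(accept) for a variables sampling plan.

        A sample of size n is accepted when (USL - xbar) / s >= k, the
        one-sided k-method with unknown standard deviation. Comparing this
        estimate with the plan's nominal operating characteristic curve is
        one way to test an acceptance sampling calculator empirically.
        """
        samples = rng.normal(mu, sigma, size=(trials, n))
        xbar = samples.mean(axis=1)
        s = samples.std(axis=1, ddof=1)
        return float(np.mean((usl - xbar) / s >= k))

    for mu in (0.0, 0.5, 1.0):   # increasing fraction nonconforming
        print(mu, acceptance_probability(mu, sigma=1.0, usl=3.0))
    ```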

  13. 40 CFR 82.68 - Verification and public notice requirements.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ..., any person who sells or distributes any mold release agents containing a class II substance as a... federal law to sell mold release agents containing hydrochlorofluorocarbons as propellants to anyone...

  14. The Hawaiian Electric Companies | Energy Systems Integration Facility |

    Science.gov Websites

    ...farm in Maui, Hawaii. Verification of Voltage Regulation Operating Strategies: NREL has studied how the Hawaiian Electric Companies can best manage voltage regulation functions from distributed technologies.

  15. The JPSS Ground Project Algorithm Verification, Test and Evaluation System

    NASA Astrophysics Data System (ADS)

    Vicente, G. A.; Jain, P.; Chander, G.; Nguyen, V. T.; Dixon, V.

    2016-12-01

    The Government Resource for Algorithm Verification, Independent Test, and Evaluation (GRAVITE) is an operational system that provides services to the Suomi National Polar-orbiting Partnership (S-NPP) Mission. It is also a unique environment for Calibration/Validation (Cal/Val) and Data Quality Assessment (DQA) of the Joint Polar Satellite System (JPSS) mission data products. GRAVITE provides fast and direct access to the data and products created by the Interface Data Processing Segment (IDPS), the NASA/NOAA operational system that converts Raw Data Records (RDRs) generated by sensors on the S-NPP into calibrated, geo-located Sensor Data Records (SDRs) and generates Mission Unique Products (MUPs). It also facilitates algorithm investigation, integration, checkout and tuning, instrument and product calibration and data quality support, monitoring, and data/product distribution. GRAVITE is the portal for the latest S-NPP and JPSS baselined Processing Coefficient Tables (PCTs) and Look-Up Tables (LUTs) and hosts a number of DQA offline tools that take advantage of its proximity to the near-real-time data flows. It also contains a set of automated and ad hoc Cal/Val tools used for algorithm analysis and updates, including an instance of the IDPS called the GRAVITE Algorithm Development Area (G-ADA), which has the latest installation of the IDPS algorithms running on identical software and hardware platforms. Two other important GRAVITE components are the Investigator-led Processing System (IPS) and the Investigator Computing Facility (ICF). The IPS is a dedicated environment where authorized users run automated scripts called Product Generation Executables (PGEs) to support Cal/Val and data quality assurance offline. This data-rich and data-driven service holds its own distribution system and allows operators to retrieve science data products. The ICF is a workspace where users can share computing applications and resources and have full access to libraries and science and sensor quality analysis tools. In this presentation we will describe the GRAVITE systems and subsystems, architecture, technical specifications, capabilities and resources, distributed data and products, and the latest advances to support JPSS science algorithm implementation, validation and testing.

  16. ALMA sub-mm maser and dust distribution of VY Canis Majoris

    NASA Astrophysics Data System (ADS)

    Richards, A. M. S.; Impellizzeri, C. M. V.; Humphreys, E. M.; Vlahakis, C.; Vlemmings, W.; Baudry, A.; De Beck, E.; Decin, L.; Etoka, S.; Gray, M. D.; Harper, G. M.; Hunter, T. R.; Kervella, P.; Kerschbaum, F.; McDonald, I.; Melnick, G.; Muller, S.; Neufeld, D.; O'Gorman, E.; Parfenov, S. Yu.; Peck, A. B.; Shinnaga, H.; Sobolev, A. M.; Testi, L.; Uscanga, L.; Wootten, A.; Yates, J. A.; Zijlstra, A.

    2014-12-01

    Aims: Cool, evolved stars have copious, enriched winds. Observations have so far not fully constrained models for the shaping and acceleration of these winds. We need to understand the dynamics better, from the pulsating stellar surface to ~10 stellar radii, where radiation pressure on dust is fully effective. Asymmetric nebulae around some red supergiants imply the action of additional forces. Methods: We retrieved ALMA Science Verification data providing images of sub-mm line and continuum emission from VY CMa. This enables us to locate water masers with milli-arcsec accuracy and to resolve the dusty continuum. Results: The 658, 321, and 325 GHz masers lie in irregular, thick shells at increasing distances from the centre of expansion. For the first time this is confirmed as the stellar position, coinciding with a compact peak offset to the NW of the brightest continuum emission. The maser shells overlap but avoid each other on scales of up to 10 au. Their distribution is broadly consistent with excitation models but the conditions and kinematics are complicated by wind collisions, clumping, and asymmetries. Appendices are available in electronic form at http://www.aanda.org

  17. Sparse distributed memory: Principles and operation

    NASA Technical Reports Server (NTRS)

    Flynn, M. J.; Kanerva, P.; Bhadkamkar, N.

    1989-01-01

    Sparse distributed memory is a generalized random access memory (RAM) for long (1000 bit) binary words. Such words can be written into and read from the memory, and they can also be used to address the memory. The main attribute of the memory is sensitivity to similarity, meaning that a word can be read back not only by giving the original write address but also by giving one close to it, as measured by the Hamming distance between addresses. Large memories of this kind are expected to have wide use in speech recognition and scene analysis, in signal detection and verification, and in adaptive control of automated equipment; in general, in dealing with real-world information in real time. The memory can be realized as a simple, massively parallel computer. Digital technology has reached a point where building large memories is becoming practical. Major design issues faced in building such memories were resolved. The design of a prototype memory with 256-bit addresses and from 8K to 128K locations for 256-bit words is described. A key aspect of the design is extensive use of dynamic RAM and other standard components.
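
    A minimal autoassociative version of the memory can be written directly from this description. The sizes and the activation radius below are illustrative choices in the spirit of the 256-bit design, not the prototype's actual parameters.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    class SparseDistributedMemory:
        """Minimal Kanerva-style sparse distributed memory for binary words."""

        def __init__(self, n_bits=256, n_locations=2000, radius=111):
            # Fixed random addresses of the hard locations.
            self.addresses = rng.integers(0, 2, (n_locations, n_bits))
            self.counters = np.zeros((n_locations, n_bits), dtype=int)
            self.radius = radius   # Hamming-distance activation radius

        def _active(self, address):
            dist = np.sum(self.addresses != address, axis=1)  # Hamming distances
            return dist <= self.radius

        def write(self, address, word):
            # Add +1/-1 votes for each bit at every activated location.
            self.counters[self._active(address)] += 2 * np.asarray(word) - 1

        def read(self, address):
            # Sum the votes over activated locations and threshold at zero.
            return (self.counters[self._active(address)].sum(axis=0) > 0).astype(int)

    # A stored word is typically recalled exactly from a noisy cue.
    mem = SparseDistributedMemory()
    word = rng.integers(0, 2, 256)
    mem.write(word, word)              # autoassociative write
    cue = word.copy()
    cue[:20] ^= 1                      # corrupt 20 of 256 bits
    print(np.sum(mem.read(cue) != word), "bits differ after recall")
    ```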

  18. Comprehensive security framework for the communication and storage of medical images

    NASA Astrophysics Data System (ADS)

    Slik, David; Montour, Mike; Altman, Tym

    2003-05-01

    Confidentiality, integrity verification and access control of medical imagery and associated metadata is critical for the successful deployment of integrated healthcare networks that extend beyond the department level. As medical imagery continues to become widely accessed across multiple administrative domains and geographically distributed locations, image data should be able to travel and be stored on untrusted infrastructure, including public networks and server equipment operated by external entities. Given these challenges associated with protecting large-scale distributed networks, measures must be taken to protect patient identifiable information while guarding against tampering, denial of service attacks, and providing robust audit mechanisms. The proposed framework outlines a series of security practices for the protection of medical images, incorporating Transport Layer Security (TLS), public and secret key cryptography, certificate management and a token based trusted computing base. It outlines measures that can be utilized to protect information stored within databases, online and nearline storage, and during transport over trusted and untrusted networks. In addition, it provides a framework for ensuring end-to-end integrity of image data from acquisition to viewing, and presents a potential solution to the challenges associated with access control across multiple administrative domains and institution user bases.

  19. Expose : procedure and results of the joint experiment verification tests

    NASA Astrophysics Data System (ADS)

    Panitz, C.; Rettberg, P.; Horneck, G.; Rabbow, E.; Baglioni, P.

    The International Space Station will carry the EXPOSE facility accommodated at the universal workplace URM-D located outside the Russian Service Module. The launch will be effected in 2005 and it is planned to stay in space for 1.5 years. The tray-like structure will accommodate 2 chemical and 6 biological PI-experiments or experiment systems of the ROSE (Response of Organisms to Space Environment) consortium. EXPOSE will support long-term in situ studies of microbes in artificial meteorites, as well as of microbial communities from special ecological niches, such as endolithic and evaporitic ecosystems. The either vented or sealed experiment pockets will be covered by an optical filter system to control the intensity and spectral range of solar UV irradiation. Control of sun exposure will be achieved by the use of individual shutters. To test the compatibility of the different biological systems and their adaptation to the opportunities and constraints of space conditions, a profound ground support program has been developed. The procedure and first results of these joint Experiment Verification Tests (EVT) will be presented. The results will be essential for the success of the EXPOSE mission and have been obtained in parallel with the development and construction of the final hardware design of the facility. The results of the mission will contribute to the understanding of the organic chemistry processes in space, the biological adaptation strategies to extreme conditions, e.g. on early Earth and Mars, and the distribution of life beyond its planet of origin.

  20. A Preliminary Experimental Examination of Worldview Verification, Perceived Racism, and Stress Reactivity in African Americans

    PubMed Central

    Lucas, Todd; Lumley, Mark A.; Flack, John M.; Wegner, Rhiana; Pierce, Jennifer; Goetz, Stefan

    2016-01-01

    Objective: According to worldview verification theory, inconsistencies between lived experiences and worldviews are psychologically threatening. These inconsistencies may be key determinants of stress processes that influence cardiovascular health disparities. This preliminary examination considers how experiencing injustice can affect perceived racism and biological stress reactivity among African Americans. Guided by worldview verification theory, it was hypothesized that responses to receiving an unfair outcome would be moderated by the fairness of the accompanying decision process, and that this effect would further depend on the consistency of the decision process with preexisting justice beliefs. Method: A sample of 118 healthy African American adults completed baseline measures of justice beliefs, followed by a laboratory-based social-evaluative stressor task. Two randomized fairness manipulations were implemented during the task: participants were given either high or low levels of distributive (outcome) and procedural (decision process) justice. Glucocorticoid (cortisol) and inflammatory (C-reactive protein) biological responses were measured in oral fluids, and attributions of racism were also measured. Results: The hypothesized 3-way interaction was generally obtained. Among African Americans with a strong belief in justice, perceived racism, cortisol and C-reactive protein responses to low distributive justice were higher when procedural justice was low. Among African Americans with a weak belief in justice, however, these responses were higher when a low level of distributive justice was coupled with high procedural justice. Conclusions: Biological and psychological processes that contribute to cardiovascular health disparities are affected by consistency between individual-level and contextual justice factors. PMID:27018728

  1. New developments in EPID-based 3D dosimetry in The Netherlands Cancer Institute

    NASA Astrophysics Data System (ADS)

    Mijnheer, B.; Rozendaal, R.; Olaciregui-Ruiz, I.; González, P.; van Oers, R.; Mans, A.

    2017-05-01

    EPID-based offline 3D in vivo dosimetry is performed routinely at The Netherlands Cancer Institute for almost all RT treatments. The 3D dose distribution is reconstructed using the EPID primary dose in combination with a back-projection algorithm and compared with the planned dose distribution. Recently the method was adapted for real-time dose verification, performing 3D dose verification in less than 300 ms, which is faster than the current portal frame acquisition rate. This creates the possibility of halting the linac in case of large delivery errors. Furthermore, a new method for pre-treatment QA was developed in which the EPID primary dose behind a phantom or patient is predicted using the CT data of that phantom or patient in combination with in-air EPID measurements. This virtual EPID primary transit dose is then used to reconstruct the 3D dose distribution within the phantom or patient geometry using the same dose engine as applied offline. In order to assess the relevance of our clinically applied alert criteria, we investigated the sensitivity of our EPID-based 3D dose verification system to detect delivery errors in VMAT treatments. This was done through simulation by modifying patient treatment plans, as well as experimentally by performing EPID measurements during the irradiation of an Alderson phantom, both after deliberately introducing errors during VMAT delivery. In this presentation these new developments will be elucidated.
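    Comparisons of reconstructed and planned dose distributions of this kind are typically quantified with a gamma analysis. A minimal brute-force sketch of a 2D gamma computation follows; the abstract does not state the institute's criteria, so a global 3%/3 mm test with a 30% dose cutoff is assumed here purely for illustration:

```python
import numpy as np

def gamma_index_2d(ref, evl, spacing, dd=0.03, dta=3.0, cutoff=0.3):
    """Brute-force 2D gamma analysis (global 3%/3 mm by default).
    ref, evl : 2D dose arrays on the same grid; spacing : grid spacing in mm;
    dd : dose criterion as fraction of max reference dose; dta : distance
    criterion in mm; cutoff : evaluate only points above this dose fraction."""
    dmax = ref.max()
    search = int(np.ceil(3 * dta / spacing))        # search window in pixels
    ny, nx = ref.shape
    gamma = np.full_like(ref, np.nan, dtype=float)
    for iy in range(ny):
        for ix in range(nx):
            if ref[iy, ix] < cutoff * dmax:
                continue
            best = np.inf
            for jy in range(max(0, iy - search), min(ny, iy + search + 1)):
                for jx in range(max(0, ix - search), min(nx, ix + search + 1)):
                    dist2 = ((jy - iy) ** 2 + (jx - ix) ** 2) * spacing ** 2
                    dose2 = ((evl[jy, jx] - ref[iy, ix]) / (dd * dmax)) ** 2
                    best = min(best, dose2 + dist2 / dta ** 2)
            gamma[iy, ix] = np.sqrt(best)
    return gamma

if __name__ == "__main__":
    ref = np.zeros((40, 40)); ref[10:30, 10:30] = 2.0   # toy planned dose
    evl = ref.copy(); evl[12:28, 12:28] *= 1.02          # 2% hot region
    g = gamma_index_2d(ref, evl, spacing=1.0)
    print("pass rate:", np.nanmean(g <= 1))              # fraction with gamma <= 1
```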

  2. Verification of the karst flow model under laboratory controlled conditions

    NASA Astrophysics Data System (ADS)

    Gotovac, Hrvoje; Andric, Ivo; Malenica, Luka; Srzic, Veljko

    2016-04-01

    Karst aquifers are very important groundwater resources around the world, as well as in the coastal part of Croatia. They have an extremely complex structure, combining a slow, laminar porous medium and small fissures with usually fast, turbulent conduits/karst channels. Apart from simple lumped hydrological models that ignore the high heterogeneity of karst, full hydraulic (distributed) models have been developed, exclusively with conventional finite element and finite volume methods, that consider the complete karst heterogeneity structure and improve our understanding of complex processes in karst. Groundwater flow modeling in complex karst aquifers faces many difficulties, such as a lack of knowledge of the heterogeneity (especially the conduits), the resolution of different spatial/temporal scales, the connectivity between matrix and conduits, the setting of appropriate boundary conditions, and many others. A particular problem of karst flow modeling is the verification of distributed models under real aquifer conditions, due to the lack of the above-mentioned information. Therefore, we show here the possibility of verifying karst flow models under laboratory controlled conditions. The special 3-D karst flow model (5.6 x 2.6 x 2 m) consists of a concrete construction, a rainfall platform, 74 piezometers, 2 reservoirs and other supply equipment. The model is filled with fine sand (the 3-D porous matrix) and drainage plastic pipes (the 1-D conduits). This model provides knowledge of the full heterogeneity structure, including the position of the different sand layers as well as the location and geometry of the conduits. Moreover, the geometry of the conduit perforations is known, which enables analysis of the interaction between matrix and conduits. In addition, the pressure and precipitation distributions and the discharge flow rates from both phases can be measured very accurately. These possibilities are not available at real sites, which makes this model much more useful for karst flow modeling. Many experiments were performed under different controlled conditions, such as different levels in the left and right reservoirs (boundary conditions), different flow regimes in the conduits, flow with and without precipitation, free and pressurized discharge from the conduits, and the influence of the epikarst (top layer) on the recession period. The experimental results are verified with a conventional karst flow model (such as MODFLOW-CFP), showing that hydraulic (distributed) models can describe the complex behavior of karst flow processes if a substantial amount of input data is known from site investigations and monitoring. These results enable us to develop more advanced karst flow models that will improve the understanding and analysis of complex flow processes in real karst aquifers.
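    The matrix-conduit coupling at the heart of such distributed models is commonly written as a first-order exchange driven by the head difference between the two domains. A minimal 1-D sketch of that generic dual-domain idea follows (hypothetical parameters; this is not the calibrated laboratory model or MODFLOW-CFP itself):

```python
import numpy as np

# Minimal 1-D sketch: porous matrix coupled to a conduit through a
# first-order head-difference exchange term (hypothetical parameters).
nx, dx, dt = 50, 0.1, 0.4      # cells, cell size (m), time step (s)
K, S = 1e-4, 0.01              # matrix conductivity (m/s) and storativity
alpha = 1e-5                   # matrix-conduit exchange coefficient (1/s)
h_m = np.full(nx, 1.0)         # matrix heads (m)
h_c = 0.5                      # conduit head, held fixed for simplicity (m)

for _ in range(5000):
    lap = (h_m[2:] - 2 * h_m[1:-1] + h_m[:-2]) / dx ** 2
    # S dh/dt = K d2h/dx2 + alpha (h_c - h_m): diffusion plus linear exchange
    h_m[1:-1] += dt / S * (K * lap + alpha * (h_c - h_m[1:-1]))
    h_m[0], h_m[-1] = 1.0, 0.8  # reservoir levels as fixed-head boundaries

print(h_m.round(3))             # steady profile bent toward the conduit head
```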

  3. Research on intelligent power distribution system for spacecraft

    NASA Astrophysics Data System (ADS)

    Xia, Xiaodong; Wu, Jianju

    2017-10-01

    The power distribution system (PDS) realizes the power distribution and management for the electrical loads of the whole spacecraft; it is directly related to the success or failure of the mission and hence is an important part of the spacecraft. In order to improve the reliability and the degree of intelligence of the PDS, and considering the function and composition of spacecraft power distribution systems, this paper systematically expounds the design principles and methods of an intelligent power distribution system based on solid-state power controllers (SSPCs), and additionally provides analysis and verification of the test data.

  4. An analysis of random projection for changeable and privacy-preserving biometric verification.

    PubMed

    Wang, Yongjin; Plataniotis, Konstantinos N

    2010-10-01

    Changeability and privacy protection are important factors for widespread deployment of biometrics-based verification systems. This paper presents a systematic analysis of a random-projection (RP)-based method for addressing these problems. The employed method transforms biometric data using a random matrix with each entry an independent and identically distributed Gaussian random variable. The similarity- and privacy-preserving properties, as well as the changeability of the biometric information in the transformed domain, are analyzed in detail. Specifically, RP on both high-dimensional image vectors and dimensionality-reduced feature vectors is discussed and compared. A vector translation method is proposed to improve the changeability of the generated templates. The feasibility of the introduced solution is well supported by detailed theoretical analyses. Extensive experimentation on a face-based biometric verification problem shows the effectiveness of the proposed method.
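    A minimal sketch of the core transform described above follows. The seed-as-key role and the normalization shown are illustrative choices, not the paper's exact parameterization; the optional translation vector stands in for the paper's changeability enhancement:

```python
import numpy as np

def make_projection(d_out, d_in, seed):
    """Gaussian random projection matrix; the seed plays the role of a
    revocable, user-specific key (reissue the seed to cancel a template)."""
    rng = np.random.default_rng(seed)
    return rng.normal(0.0, 1.0 / np.sqrt(d_out), size=(d_out, d_in))

def protect(x, R, t=None):
    """Project a biometric feature vector; t is an optional translation
    vector to improve changeability (our parameterization is an assumption)."""
    y = R @ x
    if t is not None:
        y = y + t
    return y

# Pairwise distances are approximately preserved (Johnson-Lindenstrauss),
# which is why matching still works in the transformed domain:
rng = np.random.default_rng(1)
x1, x2 = rng.normal(size=1024), rng.normal(size=1024)
R = make_projection(128, 1024, seed=42)
print(np.linalg.norm(x1 - x2), np.linalg.norm(protect(x1, R) - protect(x2, R)))
```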

  5. A New Integrated Threshold Selection Methodology for Spatial Forecast Verification of Extreme Events

    NASA Astrophysics Data System (ADS)

    Kholodovsky, V.

    2017-12-01

    Extreme weather and climate events such as heavy precipitation, heat waves and strong winds can cause extensive damage to society in terms of human lives and financial losses. As the climate changes, it is important to understand how extreme weather events may change as a result. Climate and statistical models are often used independently to model those phenomena. To better assess the performance of climate models, a variety of spatial forecast verification methods have been developed. However, the spatial verification metrics that are widely used for comparing mean states in most cases lack an adequate theoretical justification for benchmarking extreme weather events. We propose a new integrated threshold selection methodology for spatial forecast verification of extreme events that couples existing pattern recognition indices with high threshold choices. This integrated approach has three main steps: 1) dimension reduction; 2) geometric domain mapping; and 3) threshold clustering. We apply this approach to an observed precipitation dataset over CONUS. The results are evaluated by displaying the threshold distribution seasonally, monthly and annually. The method offers the user the flexibility of selecting a high threshold that is linked to desired geometrical properties. The proposed high-threshold methodology could either complement existing spatial verification methods, where threshold selection is arbitrary, or be directly applicable in extreme value theory.

  6. Computer simulation of storm runoff for three watersheds in Albuquerque, New Mexico

    USGS Publications Warehouse

    Knutilla, R.L.; Veenhuis, J.E.

    1994-01-01

    Rainfall-runoff data from three watersheds were selected for calibration and verification of the U.S. Geological Survey's Distributed Routing Rainfall-Runoff Model. The watersheds chosen are residentially developed. The conceptually based model uses an optimization process that adjusts selected parameters to achieve the best fit between measured and simulated runoff volumes and peak discharges. Three of these optimization parameters represent soil-moisture conditions, three represent infiltration, and one accounts for effective impervious area. Each watershed modeled was divided into overland-flow segments and channel segments. The overland-flow segments were further subdivided to reflect pervious and impervious areas. Each overland-flow and channel segment was assigned representative values of area, slope, percentage of imperviousness, and roughness coefficients. Rainfall-runoff data for each watershed were separated into two sets for use in calibration and verification. For model calibration, seven input parameters were optimized to attain a best fit of the data. For model verification, parameter values were set using values from model calibration. The standard error of estimate for calibration of runoff volumes ranged from 19 to 34 percent, and for peak discharge calibration ranged from 27 to 44 percent. The standard error of estimate for verification of runoff volumes ranged from 26 to 31 percent, and for peak discharge verification ranged from 31 to 43 percent.
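    The reported percent standard errors can be reproduced, given measured and simulated series, with a log-space error measure. A small sketch follows; the report's exact standard-error formula is an assumption (a common USGS-style log definition is used), and the peak values below are purely hypothetical:

```python
import numpy as np

def percent_standard_error(measured, simulated):
    """Standard error of estimate in percent, computed in log space; one
    common USGS convention (assumption: the report may define it differently)."""
    r = np.log(np.asarray(simulated, float) / np.asarray(measured, float))
    return 100.0 * np.sqrt(np.mean(r ** 2))

peaks_obs = [12.0, 30.5, 8.2, 19.9]   # hypothetical measured peak discharges
peaks_sim = [10.8, 34.0, 9.1, 17.5]   # hypothetical simulated peaks
print(percent_standard_error(peaks_obs, peaks_sim))
```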

  7. ICSH guidelines for the verification and performance of automated cell counters for body fluids.

    PubMed

    Bourner, G; De la Salle, B; George, T; Tabe, Y; Baum, H; Culp, N; Keng, T B

    2014-12-01

    One of the many challenges facing laboratories is the verification of their automated Complete Blood Count cell counters for the enumeration of body fluids. These analyzers offer improved accuracy, precision, and efficiency in performing the enumeration of cells compared with manual methods. A patterns of practice survey was distributed to laboratories that participate in proficiency testing in Ontario, Canada, the United States, the United Kingdom, and Japan to determine the number of laboratories that are testing body fluids on automated analyzers and the performance specifications that were performed. Based on the results of this questionnaire, an International Working Group for the Verification and Performance of Automated Cell Counters for Body Fluids was formed by the International Council for Standardization in Hematology (ICSH) to prepare a set of guidelines to help laboratories plan and execute the verification of their automated cell counters to provide accurate and reliable results for automated body fluid counts. These guidelines were discussed at the ICSH General Assemblies and reviewed by an international panel of experts to achieve further consensus. © 2014 John Wiley & Sons Ltd.

  8. Proceedings of the Sixth NASA Langley Formal Methods (LFM) Workshop

    NASA Technical Reports Server (NTRS)

    Rozier, Kristin Yvonne (Editor)

    2008-01-01

    Today's verification techniques are hard-pressed to scale with the ever-increasing complexity of safety critical systems. Within the field of aeronautics alone, we find the need for verification of algorithms for separation assurance, air traffic control, auto-pilot, Unmanned Aerial Vehicles (UAVs), adaptive avionics, automated decision authority, and much more. Recent advances in formal methods have made verifying more of these problems realistic. Thus we need to continually re-assess what we can solve now and identify the next barriers to overcome. Only through an exchange of ideas between theoreticians and practitioners from academia to industry can we extend formal methods for the verification of ever more challenging problem domains. This volume contains the extended abstracts of the talks presented at LFM 2008: The Sixth NASA Langley Formal Methods Workshop held on April 30 - May 2, 2008 in Newport News, Virginia, USA. The topics of interest that were listed in the call for abstracts were: advances in formal verification techniques; formal models of distributed computing; planning and scheduling; automated air traffic management; fault tolerance; hybrid systems/hybrid automata; embedded systems; safety critical applications; safety cases; accident/safety analysis.

  9. Verification of Meteorological and Oceanographic Ensemble Forecasts in the U.S. Navy

    NASA Astrophysics Data System (ADS)

    Klotz, S.; Hansen, J.; Pauley, P.; Sestak, M.; Wittmann, P.; Skupniewicz, C.; Nelson, G.

    2013-12-01

    The Navy Ensemble Forecast Verification System (NEFVS) has been promoted recently to operational status at the U.S. Navy's Fleet Numerical Meteorology and Oceanography Center (FNMOC). NEFVS processes FNMOC and National Centers for Environmental Prediction (NCEP) meteorological and ocean wave ensemble forecasts, gridded forecast analyses, and innovation (observational) data output by FNMOC's data assimilation system. The NEFVS framework consists of statistical analysis routines, a variety of pre- and post-processing scripts to manage data and plot verification metrics, and a master script to control application workflow. NEFVS computes metrics that include forecast bias, mean-squared error, conditional error, continuous ranked probability score, and Brier score. The system also generates reliability and Receiver Operating Characteristic diagrams. In this presentation we describe the operational framework of NEFVS and show examples of verification products computed from ensemble forecasts, meteorological observations, and forecast analyses. The construction and deployment of NEFVS addresses important operational and scientific requirements within Navy Meteorology and Oceanography. These include computational capabilities for assessing the reliability and accuracy of meteorological and ocean wave forecasts in an operational environment, for quantifying effects of changes and potential improvements to the Navy's forecast models, and for comparing the skill of forecasts from different forecast systems. NEFVS also supports the Navy's collaboration with the U.S. Air Force, NCEP, and Environment Canada in the North American Ensemble Forecast System (NAEFS) project and with the Air Force and the National Oceanic and Atmospheric Administration (NOAA) in the National Unified Operational Prediction Capability (NUOPC) program. This program is tasked with eliminating unnecessary duplication within the three agencies, accelerating the transition of new technology, such as multi-model ensemble forecasting, to U.S. Department of Defense use, and creating a superior U.S. global meteorological and oceanographic prediction capability. Forecast verification is an important component of NAEFS and NUOPC.
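    A few of the listed metrics are simple enough to sketch directly. The following illustrative computation is not NEFVS code; it uses synthetic data and omits CRPS and the reliability diagrams, showing only forecast bias, root-mean-squared error, and the Brier score for an ensemble:

```python
import numpy as np

def ensemble_metrics(ens, obs, event_thresh):
    """ens: (n_members, n_cases) forecasts; obs: (n_cases,) verifying values."""
    mean_fc = ens.mean(axis=0)
    bias = float(np.mean(mean_fc - obs))                  # mean forecast error
    rmse = float(np.sqrt(np.mean((mean_fc - obs) ** 2)))  # rooted MSE
    p_event = np.mean(ens > event_thresh, axis=0)         # ensemble event probability
    o_event = (obs > event_thresh).astype(float)          # observed occurrence
    brier = float(np.mean((p_event - o_event) ** 2))      # Brier score
    return bias, rmse, brier

rng = np.random.default_rng(0)
obs = rng.normal(10, 3, size=500)
ens = obs + rng.normal(0.5, 2.0, size=(20, 500))          # biased, noisy ensemble
print(ensemble_metrics(ens, obs, event_thresh=12.0))
```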

  10. Monte Carlo verification of radiotherapy treatments with CloudMC.

    PubMed

    Miras, Hector; Jiménez, Rubén; Perales, Álvaro; Terrón, José Antonio; Bertolet, Alejandro; Ortiz, Antonio; Macías, José

    2018-06-27

    A new implementation has been made on CloudMC, a cloud-based platform presented in a previous work, in order to provide services for radiotherapy treatment verification by means of Monte Carlo in a fast, easy and economical way. A description of the architecture of the application and of the new developments implemented is presented, together with the results of the tests carried out to validate its performance. CloudMC has been developed on the Microsoft Azure cloud. It is based on a map/reduce implementation that distributes the Monte Carlo calculations over a dynamic cluster of virtual machines in order to reduce calculation time. CloudMC has been updated with new methods to read and process the information related to radiotherapy treatment verification: the CT image set, treatment plan, structures and dose distribution files in DICOM format. Tests were designed to determine, for the different tasks, the most suitable type of virtual machine among those available in Azure. Finally, the performance of Monte Carlo verification in CloudMC is studied through three real cases that involve different treatment techniques, linac models and Monte Carlo codes. Considering computational and economic factors, D1_v2 and G1 virtual machines were selected as the default types for the Worker Roles and the Reducer Role, respectively. Calculation times up to 33 min and costs of 16 € were achieved for the verification cases presented when a statistical uncertainty below 2% (2σ) was required. The costs were reduced to 3-6 € when the uncertainty requirement was relaxed to 4%. Advantages such as high computational power, scalability, easy access and a pay-per-usage model make Monte Carlo cloud-based solutions, like the one presented in this work, an important step forward in solving the long-standing problem of truly introducing Monte Carlo algorithms into the daily routine of the radiotherapy planning process.
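    The map/reduce pattern described here exploits the fact that independent Monte Carlo histories can be simulated in parallel and their tallies summed afterwards. A toy sketch of that splitting follows (purely illustrative; this is not CloudMC's code and the "dose model" is a placeholder):

```python
import numpy as np
from concurrent.futures import ProcessPoolExecutor

def simulate_batch(args):
    """Toy stand-in for one worker's Monte Carlo dose tally (not CloudMC code)."""
    n_histories, seed = args
    rng = np.random.default_rng(seed)
    dose = np.zeros(64)
    voxels = rng.integers(0, 64, size=n_histories)  # random deposition sites
    np.add.at(dose, voxels, rng.exponential(1.0, size=n_histories))
    return dose

def map_reduce_mc(total_histories, n_workers=4):
    # "Map": split the histories over independent workers with distinct seeds.
    batches = [(total_histories // n_workers, 1000 + i) for i in range(n_workers)]
    with ProcessPoolExecutor(max_workers=n_workers) as ex:
        tallies = list(ex.map(simulate_batch, batches))
    # "Reduce": independent tallies simply add; the statistical uncertainty
    # scales as 1/sqrt(total_histories).
    return np.sum(tallies, axis=0)

if __name__ == "__main__":
    dose = map_reduce_mc(400_000)
    print(dose[:8].round(1))
```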

  11. SU-E-T-04: 3D Dose Based Patient Compensator QA Procedure for Proton Radiotherapy

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zou, W; Reyhan, M; Zhang, M

    2015-06-15

    Purpose: In proton double-scattering radiotherapy, compensators are the essential patient-specific devices that contour the distal dose distribution to the tumor target. Traditional compensator QA is limited to checking the drilled surface profiles against the plan. In our work, a compensator QA process was established that assesses the entire compensator, including its internal structure, for patient 3D dose verification. Methods: The fabricated patient compensators were CT scanned. Through mathematical image processing and geometric transformations, the CT images of the proton compensator were combined with the patient simulation CT images into a new series of CT images, in which the imaged compensator is placed at the planned location along the corresponding beam line. The new CT images were input into the Eclipse treatment planning system. The original plan was calculated on the combined CT image series without the plan compensator. The newly computed patient 3D dose from the combined patient-compensator images was verified against the original plan dose. Test plans included compensators with defects intentionally created inside the fabricated compensators. Results: The calculated 3D dose with the combined compensator and patient CT images reflects the impact of the fabricated compensator on the patient. For the test cases in which no defects were created, the dose distributions were in agreement between our method and the corresponding original plans. For the compensator with defects, a purposely changed material and a purposely created internal defect were successfully detected, which is not possible with traditional methods that check only the compensator profiles. Conclusion: We present here a 3D dose verification process to qualify the fabricated proton double-scattering compensator. This process assesses the patient 3D impact of the fabricated compensator's surface profile as well as of changes in the compensator's internal material and structure. This research receives funding support from CURA Medical Technologies.

  12. Feasibility study on dosimetry verification of volumetric-modulated arc therapy-based total marrow irradiation.

    PubMed

    Liang, Yun; Kim, Gwe-Ya; Pawlicki, Todd; Mundt, Arno J; Mell, Loren K

    2013-03-04

    The purpose of this study was to develop dosimetry verification procedures for volumetric-modulated arc therapy (VMAT)-based total marrow irradiation (TMI). VMAT-based TMI plans were generated for three patients: one child and two adults. The planning target volume (PTV) was defined as the bony skeleton, from head to mid-femur, with a 3 mm margin. A plan strategy similar to published studies was adopted. The PTV was divided into head and neck, chest, and pelvic regions, with separate plans, each composed of 2-3 arcs/fields. Multiple isocenters were evenly distributed along the patient's axial direction. The focus of this study is to establish a dosimetry quality assurance procedure involving both two-dimensional (2D) and three-dimensional (3D) volumetric verification, which is desirable for a large PTV treated with multiple isocenters. The 2D dose verification was performed with film for gamma evaluation, and the absolute point dose was measured with an ion chamber, with attention to hot/cold spots at the junctions between neighboring plans. The 3D volumetric dose verification used commercial dose reconstruction software to reconstruct dose from electronic portal imaging device (EPID) images. The gamma evaluation criteria in both the 2D and 3D verification were 5% absolute point dose difference and 3 mm distance to agreement. With film dosimetry, the overall average gamma passing rate was 98.2% and the absolute dose difference was 3.9% in junction areas among the test patients; with volumetric portal dosimetry, the corresponding numbers were 90.7% and 2.4%. A dosimetry verification procedure involving both 2D and 3D was developed for VMAT-based TMI. The initial results are encouraging and warrant further investigation in clinical trials.

  13. Holographic particle size extraction by using Wigner-Ville distribution

    NASA Astrophysics Data System (ADS)

    Chuamchaitrakool, Porntip; Widjaja, Joewono; Yoshimura, Hiroyuki

    2014-06-01

    A new method for measuring object size from in-line holograms by using the Wigner-Ville distribution (WVD) is proposed. The proposed method has advantages over conventional numerical reconstruction in that it is free from any iterative process and can extract the object size and position with only a single computation of the WVD. Experimental verification of the proposed method is presented.
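    The abstract does not reproduce the transform itself; for reference, the standard Wigner-Ville distribution of a signal x(t) is

$$ W_x(t,\omega) = \int_{-\infty}^{\infty} x\!\left(t+\frac{\tau}{2}\right)\, x^{*}\!\left(t-\frac{\tau}{2}\right)\, e^{-j\omega\tau}\, d\tau . $$

    Because the Fresnel diffraction pattern of a small particle behaves locally like a linear chirp, the WVD concentrates its energy along a straight line in the time-frequency plane; the slope of that line encodes the recording distance, and the fringe extent relates to the particle size, which is plausibly how a single WVD computation yields both quantities here.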

  14. Sediment Acoustics: Wideband Model, Reflection Loss and Ambient Noise Inversion

    DTIC Science & Technology

    2010-01-01

    [This DTIC record is extraction residue from a report documentation page; only fragments are recoverable.] Long-term goal: physically sound models of acoustic interaction with the ocean floor, including penetration, reflection and scattering, in support of MCM and ASW needs. Objectives: (1) consolidation of the BIC08 model of sediment acoustics, its verification in a variety of sediment types, parameter reduction and…

  15. Collisionless Electrostatic Shock Modeling and Simulation

    DTIC Science & Technology

    2016-10-21

    [This DTIC record is residue from presentation slides (PA#16490); only fragments are recoverable.] Dissipation controls the shock wave train: in under-damped shocks, dissipation is weak and ripples persist (high density); in over-damped shocks, strong dissipation damps the ripples (low density). Model verification compared the simulated evolution of the first ripple wavelength with a linearized solution.

  16. Testing a Model of Participant Retention in Longitudinal Substance Abuse Research

    ERIC Educational Resources Information Center

    Gilmore, Devin; Kuperminc, Gabriel P.

    2014-01-01

    Longitudinal substance abuse research has often been compromised by high rates of attrition, thought to be the result of the lifestyle that often accompanies addiction. Several studies have used strategies including collection of locator information at the baseline assessment, verification of the information, and interim contacts prior to…

  17. 47 CFR 64.1120 - Verification of orders for telecommunications service.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... data (e.g., the subscriber's date of birth or social security number). The independent third party must not be owned, managed, controlled, or directed by the carrier or the carrier's marketing agent; must... carrier's marketing agent; and must operate in a location physically separate from the carrier or the...

  18. Solar heating system design package for a single-family residence at William O'Brien State Park, Minnesota

    NASA Technical Reports Server (NTRS)

    1977-01-01

    The plans, specifications, cost trade studies, and verification status of a prototype solar heating and hot water system for the Minnesota Department of Natural Resources' single-family dwelling located at O'Brien State Park, 30 miles east of Minneapolis, Minnesota, are presented.

  19. Advanced Plant Habitat Test Harvest

    NASA Image and Video Library

    2017-08-24

    John "JC" Carver, a payload integration engineer with NASA Kennedy Space Center's Test and Operations Support Contract, harvests half the Arabidopsis thaliana plants inside the growth chamber of the Advanced Plant Habitat (APH) Flight Unit No. 1. The harvest is part of an ongoing verification test of the APH unit, which is located inside the International Space Station Environmental Simulator in Kennedy's Space Station Processing Facility. The APH undergoing testing at Kennedy is identical to one on the station and uses red, green and broad-spectrum white LED lights to grow plants in an environmentally controlled chamber. The seeds grown during the verification test will be grown on the station to help scientists understand how these plants adapt to spaceflight.

  20. Methods for identification and verification using vacuum XRF system

    NASA Technical Reports Server (NTRS)

    Kaiser, Bruce (Inventor); Schramm, Fred (Inventor)

    2005-01-01

    Apparatus and methods in which one or more elemental taggants that are intrinsically located in an object are detected by x-ray fluorescence analysis under vacuum conditions to identify or verify the object's elemental content for elements with lower atomic numbers. By using x-ray fluorescence analysis, the apparatus and methods of the invention are simple and easy to use, and provide detection by a non-line-of-sight method to establish the origin of objects, their point of manufacture, authenticity, verification, security, and the presence of impurities. The invention is extremely advantageous because it provides the capability to measure lower atomic number elements in the field with a portable instrument.

  1. Improvement on post-OPC verification efficiency for contact/via coverage check by final CD biasing of metal lines and considering their location on the metal layout

    NASA Astrophysics Data System (ADS)

    Kim, Youngmi; Choi, Jae-Young; Choi, Kwangseon; Choi, Jung-Hoe; Lee, Sooryong

    2011-04-01

    As IC design complexity keeps increasing, it is more and more difficult to ensure correct pattern transfer after optical proximity correction (OPC), due to the continuous reduction of layout dimensions and the lithographic limits of the k1 factor. To guarantee imaging fidelity, resolution enhancement technologies (RET) such as off-axis illumination (OAI), different types of phase shift masks, and OPC techniques have been developed. For model-based OPC, post-OPC verification solutions continue to be developed to cross-check the simulated contour image against the target layout: methods for generating contours and matching them to the target structure, and methods for filtering and sorting patterns to eliminate false errors and duplicate patterns. Detecting only real errors while excluding false ones is the most important requirement for an accurate and fast verification process, saving not only review time and engineering resources but also overall wafer process time. In the general case of post-OPC verification for metal-contact/via coverage (CC) checks, the verification solution outputs a huge number of errors due to the borderless design, so it is too difficult to review and correct all of them. This can cause OPC engineers to miss real defects and, at the least, delays time to market. In this paper, we studied a method for increasing the efficiency of post-OPC verification, especially for CC checks. For metal layers, the final CD after the etch process shows a varying bias that depends on the distance to neighboring patterns, so it is more reasonable to consider the final metal shape when confirming contact/via coverage. Through the optimization of the biasing rule for different pitches and shapes of metal lines, we could obtain more accurate and efficient verification results and decrease the review time needed to find real errors. We present a suggestion for increasing the efficiency of the OPC verification process by applying a simple biasing rule to the metal layout instead of applying a full etch model.
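    The kind of rule the abstract describes can be pictured as a small spacing-to-bias lookup applied to each metal line before the coverage check. A hypothetical sketch follows; the actual bias values and spacing bins are not given in the abstract and are invented here for illustration:

```python
# Hypothetical spacing-dependent final-CD bias table (nm); the paper's
# actual rule values and binning are not stated in the abstract.
BIAS_RULES = [(0.0, 60.0, -4.0), (60.0, 120.0, -2.0), (120.0, float("inf"), 0.0)]

def biased_width(width_nm, spacing_nm):
    """Apply the final-CD bias for a metal line, given its drawn width and the
    spacing to its nearest neighbour, before the contact/via coverage check."""
    for lo, hi, bias in BIAS_RULES:
        if lo <= spacing_nm < hi:
            return width_nm + bias
    return width_nm

print(biased_width(50.0, 45.0))   # dense line: biased to 46.0 nm
print(biased_width(50.0, 200.0))  # isolated line: unchanged
```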

  2. Quantum supremacy in constant-time measurement-based computation: A unified architecture for sampling and verification

    NASA Astrophysics Data System (ADS)

    Miller, Jacob; Sanders, Stephen; Miyake, Akimasa

    2017-12-01

    While quantum speed-up in solving certain decision problems by a fault-tolerant universal quantum computer has been promised, a timely research interest includes how far one can reduce the resource requirement to demonstrate a provable advantage in quantum devices without demanding quantum error correction, which is crucial for prolonging the coherence time of qubits. We propose a model device made of locally interacting multiple qubits, designed such that simultaneous single-qubit measurements on it can output probability distributions whose average-case sampling is classically intractable, under similar assumptions as the sampling of noninteracting bosons and instantaneous quantum circuits. Notably, in contrast to these previous unitary-based realizations, our measurement-based implementation has two distinctive features. (i) Our implementation involves no adaptation of measurement bases, leading output probability distributions to be generated in constant time, independent of the system size. Thus, it could be implemented in principle without quantum error correction. (ii) Verifying the classical intractability of our sampling is done by changing the Pauli measurement bases only at certain output qubits. Our usage of random commuting quantum circuits in place of computationally universal circuits allows a unique unification of sampling and verification, so they require the same physical resource requirements in contrast to the more demanding verification protocols seen elsewhere in the literature.

  3. First International Conference on Ada (R) Programming Language Applications for the NASA Space Station, volume 1

    NASA Technical Reports Server (NTRS)

    Bown, Rodney L. (Editor)

    1986-01-01

    Topics discussed include: test and verification; environment issues; distributed Ada issues; life cycle issues; Ada in Europe; management/training issues; common Ada interface set; and run time issues.

  4. Verification of spatial and temporal pressure distributions in segmented solid rocket motors

    NASA Technical Reports Server (NTRS)

    Salita, Mark

    1989-01-01

    A wide variety of analytical tools are in use today to predict the history and spatial distributions of pressure in the combustion chambers of solid rocket motors (SRMs). Experimental and analytical methods are presented here that allow the verification of many of these predictions. These methods are applied to the redesigned space shuttle booster (RSRM). Girth strain-gage data is compared to the predictions of various one-dimensional quasisteady analyses in order to verify the axial drop in motor static pressure during ignition transients as well as quasisteady motor operation. The results of previous modeling of radial flows in the bore, slots, and around grain overhangs are supported by approximate analytical and empirical techniques presented here. The predictions of circumferential flows induced by inhibitor asymmetries, nozzle vectoring, and propellant slump are compared to each other and to subscale cold air and water tunnel measurements to ascertain their validity.

  5. STS-96 crew takes part in payload Interface Verification Test

    NASA Technical Reports Server (NTRS)

    1999-01-01

    In the SPACEHAB Facility, the STS-96 crew looks at equipment as part of a payload Interface Verification Test (IVT) for their upcoming mission to the International Space Station. From left are Mission Specialist Ellen Ochoa (behind the opened storage cover), Commander Kent Rominger, Pilot Rick Husband (holding a lithium hydroxide canister) and Mission Specialists Dan Barry, Valery Tokarev of Russia and Julie Payette. In the background is TTI interpreter Valentina Maydell. The other crew member at KSC for the IVT is Mission Specialist Tamara Jernigan. Mission STS-96 carries the SPACEHAB Logistics Double Module, which has equipment to further outfit the International Space Station service module and equipment that can be off-loaded from the early U.S. assembly flights. The SPACEHAB carries internal logistics and resupply cargo for station outfitting, plus an external Russian cargo crane to be mounted to the exterior of the Russian station segment and used to perform space walking maintenance activities. The double module stowage provides capacity of up to 10,000 lbs. with the ability to accommodate powered payloads, four external rooftop stowage locations, four double-rack locations (two powered), up to 61 bulkhead-mounted middeck locker locations, and floor storage for large unique items and Soft Stowage. STS-96 is targeted to launch May 20 about 9:32 a.m.

  6. STS-96 crew takes part in payload Interface Verification Test

    NASA Technical Reports Server (NTRS)

    1999-01-01

    In the SPACEHAB Facility, STS-96 crew members look over equipment during a payload Interface Verification Test (IVT) for their upcoming mission to the International Space Station. From left are Khristal Parker, with Boeing; Mission Specialist Dan Barry, Pilot Rick Husband, Mission Specialist Tamara Jernigan, and at the far right, Mission Specialist Julie Payette. An unidentified worker is in the background. Also at KSC for the IVT are Commander Kent Rominger and Mission Specialists Ellen Ochoa and Valery Tokarev of Russia. Mission STS-96 carries the SPACEHAB Logistics Double Module, which will have equipment to further outfit the International Space Station service module and equipment that can be off-loaded from the early U.S. assembly flights. It carries internal logistics and resupply cargo for station outfitting, plus an external Russian cargo crane to be mounted to the exterior of the Russian station segment and used to perform space walking maintenance activities. The double module stowage provides capacity of up to 10,000 lbs. with the ability to accommodate powered payloads, four external rooftop stowage locations, four double-rack locations (two powered), up to 61 bulkhead-mounted middeck locker locations, and floor storage for large unique items and Soft Stowage. STS-96 is targeted to launch May 20 about 9:32 a.m.

  7. STS-96 crew takes part in payload Interface Verification Test

    NASA Technical Reports Server (NTRS)

    1999-01-01

    In the SPACEHAB Facility, (left to right) STS-96 Pilot Rick Husband and Mission Specialists Julie Payette and Ellen Ochoa work the straps on the Sequential Shunt Unit (SSU) in front of them. The STS-96 crew is at KSC for a payload Interface Verification Test (IVT) for its upcoming mission to the International Space Station. Other crew members at KSC for the IVT are Commander Kent Rominger and Mission Specialists Tamara Jernigan, Dan Barry and Valery Tokarev of Russia. The SSU is part of the cargo on Mission STS-96, which carries the SPACEHAB Logistics Double Module, with equipment to further outfit the International Space Station service module and equipment that can be off-loaded from the early U.S. assembly flights. The SPACEHAB carries internal logistics and resupply cargo for station outfitting, plus an external Russian cargo crane to be mounted to the exterior of the Russian station segment and used to perform space walking maintenance activities. The double module stowage provides capacity of up to 10,000 lbs. with the ability to accommodate powered payloads, four external rooftop stowage locations, four double-rack locations (two powered), up to 61 bulkhead-mounted middeck locker locations, and floor storage for large unique items and Soft Stowage. STS-96 is targeted to launch May 20 about 9:32 a.m.

  8. STS-96 crew takes part in payload Interface Verification Test

    NASA Technical Reports Server (NTRS)

    1999-01-01

    In the SPACEHAB Facility, STS-96 Mission Specialist Valery Tokarev of Russia (left) and Commander Kent Rominger (second from right) listen to Lynn Ashby (far right), with JSC, talking about the SPACEHAB equipment in front of them during a payload Interface Verification Test (IVT). In the background behind Tokarev is TTI interpreter Valentina Maydell. Other STS-96 crew members at KSC for the IVT are Pilot Rick Husband and Mission Specialists Dan Barry, Ellen Ochoa, Tamara Jernigan and Julie Payette. Mission STS-96 carries the SPACEHAB Logistics Double Module, which will have equipment to further outfit the International Space Station service module and equipment that can be off-loaded from the early U.S. assembly flights. It carries internal logistics and resupply cargo for station outfitting, plus an external Russian cargo crane to be mounted to the exterior of the Russian station segment and used to perform space walking maintenance activities. The double module stowage provides capacity of up to 10,000 lbs. with the ability to accommodate powered payloads, four external rooftop stowage locations, four double-rack locations (two powered), up to 61 bulkhead-mounted middeck locker locations, and floor storage for large unique items and Soft Stowage. STS-96 is targeted to launch May 20 about 9:32 a.m.

  9. STS-96 crew takes part in payload Interface Verification Test

    NASA Technical Reports Server (NTRS)

    1999-01-01

    During a payload Interface Verification Test (IVT) in the SPACEHAB Facility, STS-96 Mission Specialist Valery Tokarev of Russia (second from left) and Commander Kent Rominger learn about the Sequential Shunt Unit (SSU) in front of them from Lynn Ashby (far right), with Johnson Space Center. At the far left looking on is TTI interpreter Valentina Maydell. Other crew members at KSC for the IVT are Pilot Rick Husband and Mission Specialists Ellen Ochoa, Tamara Jernigan, Dan Barry and Julie Payette. The SSU is part of the cargo on Mission STS-96, which carries the SPACEHAB Logistics Double Module, with equipment to further outfit the International Space Station service module and equipment that can be off-loaded from the early U.S. assembly flights. The SPACEHAB carries internal logistics and resupply cargo for station outfitting, plus an external Russian cargo crane to be mounted to the exterior of the Russian station segment and used to perform space walking maintenance activities. The double module stowage provides capacity of up to 10,000 lbs. with the ability to accommodate powered payloads, four external rooftop stowage locations, four double-rack locations (two powered), up to 61 bulkhead-mounted middeck locker locations, and floor storage for large unique items and Soft Stowage. STS-96 is targeted to launch May 20 about 9:32 a.m.

  10. STS-96 crew takes part in payload Interface Verification Test

    NASA Technical Reports Server (NTRS)

    1999-01-01

    In the SPACEHAB Facility, STS-96 Mission Specialist Valery Tokarev (in foreground) of the Russian Space Agency closes a container, part of the equipment that will be in the SPACEHAB module on mission STS-96. Behind Tokarev are Pilot Rick Husband (left) and Mission Specialist Dan Barry (right). Other crew members at KSC for a payload Interface Verification Test for the upcoming mission to the International Space Station are Commander Kent Rominger and Mission Specialists Ellen Ochoa, Tamara Jernigan and Julie Payette. Mission STS-96 carries the SPACEHAB Logistics Double Module, which has equipment to further outfit the International Space Station service module and equipment that can be off-loaded from the early U.S. assembly flights. The SPACEHAB carries internal logistics and resupply cargo for station outfitting, plus an external Russian cargo crane to be mounted to the exterior of the Russian station segment and used to perform space walking maintenance activities. The double module stowage provides capacity of up to 10,000 lbs. with the ability to accommodate powered payloads, four external rooftop stowage locations, four double-rack locations (two powered), up to 61 bulkhead-mounted middeck locker locations, and floor storage for large unique items and Soft Stowage. STS-96 is targeted to launch May 20 about 9:32 a.m.

  11. STS-96 crew takes part in payload Interface Verification Test

    NASA Technical Reports Server (NTRS)

    1999-01-01

    During a payload Interface Verification Test (IVT) in the SPACEHAB Facility, STS-96 Mission Specialist Tamara Jernigan checks over instructions while Mission Specialist Dan Barry looks up from the Sequential Shunt Unit (SSU) in front of him to other equipment Lynn Ashby (right), with Johnson Space Center, is pointing at. Other crew members at KSC for the IVT are Commander Kent Rominger, Pilot Rick Husband, and Mission Specialists Ellen Ochoa, Julie Payette and Valery Tokarev of Russia. The SSU is part of the cargo on Mission STS-96, which carries the SPACEHAB Logistics Double Module, with equipment to further outfit the International Space Station service module and equipment that can be off-loaded from the early U.S. assembly flights. The SPACEHAB carries internal logistics and resupply cargo for station outfitting, plus an external Russian cargo crane to be mounted to the exterior of the Russian station segment and used to perform space walking maintenance activities. The double module stowage provides capacity of up to 10,000 lbs. with the ability to accommodate powered payloads, four external rooftop stowage locations, four double-rack locations (two powered), up to 61 bulkhead-mounted middeck locker locations, and floor storage for large unique items and Soft Stowage. STS-96 is targeted to launch May 20 about 9:32 a.m.

  12. STS-96 crew takes part in payload Interface Verification Test

    NASA Technical Reports Server (NTRS)

    1999-01-01

    During a payload Interface Verification Test (IVT) in the SPACEHAB Facility, STS-96 Pilot Rick Husband and Mission Specialist Ellen Ochoa (on the left) and Mission Specialist Julie Payette (on the far right) listen to Khristal Parker (second from right), with Boeing, explain about the equipment in front of them. Other crew members at KSC for the IVT are Commander Kent Rominger and Mission Specialists Tamara Jernigan, Dan Barry and Valery Tokarev of Russia. The SSU is part of the cargo on Mission STS-96, which carries the SPACEHAB Logistics Double Module, with equipment to further outfit the International Space Station service module and equipment that can be off-loaded from the early U.S. assembly flights. The SPACEHAB carries internal logistics and resupply cargo for station outfitting, plus an external Russian cargo crane to be mounted to the exterior of the Russian station segment and used to perform space walking maintenance activities. The double module stowage provides capacity of up to 10,000 lbs. with the ability to accommodate powered payloads, four external rooftop stowage locations, four double-rack locations (two powered), up to 61 bulkhead-mounted middeck locker locations, and floor storage for large unique items and Soft Stowage. STS-96 is targeted to launch May 20 about 9:32 a.m.

  13. STS-96 crew takes part in payload Interface Verification Test

    NASA Technical Reports Server (NTRS)

    1999-01-01

    In the SPACEHAB Facility for a payload Interface Verification Test (IVT) for their upcoming mission to the International Space Station are (kneeling) STS-96 Mission Specialists Julie Payette and Ellen Ochoa, Pilot Rick Husband, and (standing at right) Mission Specialist Dan Barry. At the left is James Behling, with Boeing, explaining some of the equipment that will be on board STS-96. Other STS-96 crew members at KSC for the IVT are Commander Kent Rominger and Mission Specialists Tamara Jernigan and Valery Tokarev of Russia. Mission STS-96 carries the SPACEHAB Logistics Double Module, which will have equipment to further outfit the International Space Station service module and equipment that can be off-loaded from the early U.S. assembly flights. It carries internal logistics and resupply cargo for station outfitting, plus an external Russian cargo crane to be mounted to the exterior of the Russian station segment and used to perform space walking maintenance activities. The double module stowage provides capacity of up to 10,000 lbs. with the ability to accommodate powered payloads, four external rooftop stowage locations, four double-rack locations (two powered), up to 61 bulkhead-mounted middeck locker locations, and floor storage for large unique items and Soft Stowage. STS-96 is targeted to launch May 20 about 9:32 a.m.

  14. Detection, Location, and Characterization of Hydroacoustic Signals Using Seafloor Cable Networks Offshore Japan

    NASA Astrophysics Data System (ADS)

    Suyehiro, K.; Sugioka, H.; Watanabe, T.

    2008-12-01

    The hydroacoustic monitoring component of the International Monitoring System (IMS) for CTBT (Comprehensive Nuclear-Test-Ban Treaty) verification utilizes hydrophone stations (6) and seismic stations (5, called T-phase stations) for worldwide detection. Conspicuous signals of natural origin include those from earthquakes, volcanic eruptions, or whale calls. Among artificial sources are non-nuclear explosions and airgun shots. It is important for the IMS to detect and locate hydroacoustic events with sufficient accuracy and to correctly characterize the signals and identify the source. As a number of seafloor cable networks are operated offshore of the Japanese islands, mostly facing the Pacific Ocean, for monitoring regional seismicity, the data from these stations (pressure and seismic sensors) may be utilized to increase the capability of the IMS. We use these data to compare selected event parameters with those obtained by the IMS. In particular, there have been several unconventional acoustic signals in the western Pacific, also captured by IMS hydrophones across the Pacific, in the period from 2007 to the present. These anomalous examples, as well as dynamite shots used for seismic crustal structure studies and other natural sources, will be presented in order to help improve the IMS verification capabilities for the detection, location and characterization of anomalous signals.

  15. Preventing illegal tobacco and alcohol sales to minors through electronic age-verification devices: a field effectiveness study.

    PubMed

    Krevor, Brad; Capitman, John A; Oblak, Leslie; Cannon, Joanna B; Ruwe, Mathilda

    2003-01-01

    Efforts to prohibit the sales of tobacco and alcohol products to minors are widespread. Electronic Age Verification (EAV) devices are one possible means to improve compliance with sales-to-minors laws. The purpose of this study was to evaluate the implementation and effectiveness of EAV devices in terms of the frequency and accuracy of age verification, as well as to examine the impact of EAVs on the retail environment. Two study locations were selected: Tallahassee, Florida and Iowa City, Iowa. Retail stores were invited to participate in the study, producing a self-selected experimental group. Stores that did not elect to test the EAVs comprised the comparison group. The data sources included: 1) mystery shopper inspections: two pre- and five post-EAV-installation mystery shopper inspections of tobacco and alcohol retailers; 2) retail clerk and manager interviews; and 3) customer interviews. The study found that installing EAV devices with minimal training and encouragement did not increase age verification and underage sales refusal. Surveyed clerks reported positive experiences using the electronic ID readers, and customers reported almost no discomfort about being asked to swipe their IDs. Observations from this study support the need for a more comprehensive system for responsible retailing.

  16. A feasibility study of treatment verification using EPID cine images for hypofractionated lung radiotherapy

    NASA Astrophysics Data System (ADS)

    Tang, Xiaoli; Lin, Tong; Jiang, Steve

    2009-09-01

    We propose a novel approach for potential online treatment verification using cine EPID (electronic portal imaging device) images for hypofractionated lung radiotherapy based on a machine learning algorithm. Hypofractionated radiotherapy requires high precision. It is essential to effectively monitor the target to ensure that the tumor is within the beam aperture. We modeled the treatment verification problem as a two-class classification problem and applied an artificial neural network (ANN) to classify the cine EPID images acquired during the treatment into corresponding classes—with the tumor inside or outside of the beam aperture. Training samples were generated for the ANN using digitally reconstructed radiographs (DRRs) with artificially added shifts in the tumor location—to simulate cine EPID images with different tumor locations. Principal component analysis (PCA) was used to reduce the dimensionality of the training samples and cine EPID images acquired during the treatment. The proposed treatment verification algorithm was tested on five hypofractionated lung patients in a retrospective fashion. On average, our proposed algorithm achieved a 98.0% classification accuracy, a 97.6% recall rate and a 99.7% precision rate. This work was first presented at the Seventh International Conference on Machine Learning and Applications, San Diego, CA, USA, 11-13 December 2008.
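    A compact sketch of the described pipeline follows, using synthetic stand-in images; the network size, PCA dimensionality, and shift threshold below are illustrative assumptions, not the paper's settings:

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.neural_network import MLPClassifier
from sklearn.pipeline import make_pipeline

rng = np.random.default_rng(0)

def fake_image(shift):
    """Stand-in for a DRR/cine EPID frame: a bright disc ('tumor') shifted by
    `shift` pixels in a 32x32 field; real training data would be shifted DRRs."""
    yy, xx = np.mgrid[:32, :32]
    img = np.exp(-(((yy - 16) ** 2 + (xx - 16 - shift) ** 2) / 20.0))
    return (img + 0.05 * rng.normal(size=img.shape)).ravel()

# Label 1 = tumor inside the aperture (small shift), 0 = outside (large shift).
shifts = rng.uniform(-12, 12, size=400)
X = np.array([fake_image(s) for s in shifts])
y = (np.abs(shifts) < 6).astype(int)

# PCA for dimensionality reduction, then a small ANN, as in the abstract.
clf = make_pipeline(PCA(n_components=10),
                    MLPClassifier(hidden_layer_sizes=(32,), max_iter=2000,
                                  random_state=0))
clf.fit(X, y)
print("training accuracy:", clf.score(X, y))
```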

  17. Middleware Trade Study for NASA Domain

    NASA Technical Reports Server (NTRS)

    Bowman, Dan

    2007-01-01

    This presentation gives the preliminary results of a trade study designed to assess three distributed simulation middleware technologies for support of the NASA Constellation Distributed Space Exploration Simulation (DSES) project and the Test and Verification Distributed System Integration Laboratory (DSIL). The technologies are: the High Level Architecture (HLA), the Test and Training Enabling Architecture (TENA), and an XML-based variant of Distributed Interactive Simulation (DIS-XML) coupled with the Extensible Messaging and Presence Protocol (XMPP). According to the criteria and weights determined in this study, HLA scores better than the other two for DSES as well as the DSIL.

  18. NASA Constellation Distributed Simulation Middleware Trade Study

    NASA Technical Reports Server (NTRS)

    Hasan, David; Bowman, James D.; Fisher, Nancy; Cutts, Dannie; Cures, Edwin Z.

    2008-01-01

    This paper presents the results of a trade study designed to assess three distributed simulation middleware technologies for support of the NASA Constellation Distributed Space Exploration Simulation (DSES) project and Test and Verification Distributed System Integration Laboratory (DSIL). The technologies are the High Level Architecture (HLA), the Test and Training Enabling Architecture (TENA), and an XML-based variant of Distributed Interactive Simulation (DIS-XML) coupled with the Extensible Messaging and Presence Protocol (XMPP). According to the criteria and weights determined in this study, HLA scores better than the other two for DSES as well as the DSIL.

  19. Comparison of Kodak EDR2 and Gafchromic EBT film for intensity-modulated radiation therapy dose distribution verification.

    PubMed

    Sankar, A; Ayyangar, Komanduri M; Nehru, R Mothilal; Kurup, P G Gopalakrishna; Murali, V; Enke, Charles A; Velmurugan, J

    2006-01-01

    The quantitative dose validation of intensity-modulated radiation therapy (IMRT) plans requires 2-dimensional (2D) high-resolution dosimetry systems with a uniform response over the sensitive region. The present work deals with the clinical use of a commercially available self-developing radiochromic film, Gafchromic EBT, for IMRT dose verification. Dose response curves were generated for the films using a VXR-16 film scanner. The results obtained with EBT films were compared with the results of Kodak extended dose range 2 (EDR2) films. The EBT film had a linear response in the dose range of 0 to 600 cGy. The dose-related characteristics of the EBT film, such as post-irradiation color growth with time, film uniformity, and the effect of scanning orientation, were studied. There was up to an 8.6% increase in color density between 2 and 40 hours after irradiation. There was a considerable variation, up to 8.5%, in film uniformity over the sensitive region. The quantitative differences between calculated and measured dose distributions were analyzed using the DTA and gamma index with a tolerance of 3% dose difference and 3 mm distance to agreement. The EDR2 films showed results consistent with the calculated dose distributions, whereas the results obtained with EBT were inconsistent. The variation in film uniformity limits the use of EBT film for conventional large-field IMRT verification. For IMRT with smaller field sizes (4.5 x 4.5 cm), the results obtained with EBT were comparable to the results of EDR2 films.

  20. A preliminary experimental examination of worldview verification, perceived racism, and stress reactivity in African Americans.

    PubMed

    Lucas, Todd; Lumley, Mark A; Flack, John M; Wegner, Rhiana; Pierce, Jennifer; Goetz, Stefan

    2016-04-01

    According to worldview verification theory, inconsistencies between lived experiences and worldviews are psychologically threatening. These inconsistencies may be key determinants of stress processes that influence cardiovascular health disparities. This preliminary examination considers how experiencing injustice can affect perceived racism and biological stress reactivity among African Americans. Guided by worldview verification theory, it was hypothesized that responses to receiving an unfair outcome would be moderated by fairness of the accompanying decision process, and that this effect would further depend on the consistency of the decision process with preexisting justice beliefs. A sample of 118 healthy African American adults completed baseline measures of justice beliefs, followed by a laboratory-based social-evaluative stressor task. Two randomized fairness manipulations were implemented during the task: participants were given either high or low levels of distributive (outcome) and procedural (decision process) justice. Glucocorticoid (cortisol) and inflammatory (C-reactive protein) biological responses were measured in oral fluids, and attributions of racism were also measured. The hypothesized 3-way interaction was generally obtained. Among African Americans with a strong belief in justice, perceived racism, cortisol, and C-reactive protein responses to low distributive justice were higher when procedural justice was low. Among African Americans with a weak belief in justice however, these responses were higher when a low level of distributive justice was coupled with high procedural justice. Biological and psychological processes that contribute to cardiovascular health disparities are affected by consistency between individual-level and contextual justice factors. (PsycINFO Database Record (c) 2016 APA, all rights reserved).

  1. A feasibility study on bedside upper airway ultrasonography compared to waveform capnography for verifying endotracheal tube location after intubation

    PubMed Central

    2013-01-01

    Background: In emergency settings, verification of endotracheal tube (ETT) location is important for critically ill patients, as unrecognised oesophageal intubation can be disastrous. Many methods are used for verification of the endotracheal tube location; none are ideal. Quantitative waveform capnography is considered the standard of care for this purpose but is not always available and is expensive. Therefore, this feasibility study was conducted to compare a cheaper alternative, bedside upper airway ultrasonography, with waveform capnography for verification of endotracheal tube location after intubation. Methods: This was a prospective, single-centre, observational study conducted at the HRPB, Ipoh. It included patients who were intubated in the emergency department from 28 March 2012 to 17 August 2012. A waiver of consent had been obtained from the Medical Research Ethics Committee. Bedside upper airway ultrasonography was performed after intubation and compared to waveform capnography. Specificity, sensitivity, positive and negative predictive values and likelihood ratios were calculated. Results: A sample of 107 patients was analysed, and 6 (5.6%) had oesophageal intubations. The overall accuracy of bedside upper airway ultrasonography was 98.1% (95% confidence interval (CI) 93.0% to 100.0%). The kappa value (Κ) was 0.85, indicating very good agreement between bedside upper airway ultrasonography and waveform capnography; thus, bedside upper airway ultrasonography is in concordance with waveform capnography. The sensitivity, specificity, positive predictive value and negative predictive value of bedside upper airway ultrasonography were 98.0% (95% CI 93.0% to 99.8%), 100% (95% CI 54.1% to 100.0%), 100% (95% CI 96.3% to 100.0%) and 75.0% (95% CI 34.9% to 96.8%), respectively. The likelihood ratio of a positive test is infinite and the likelihood ratio of a negative test is 0.0198 (95% CI 0.005 to 0.0781). The mean confirmation time by ultrasound was 16.4 s. No adverse effects were recorded. Conclusions: Our study shows that ultrasonography can replace waveform capnography in confirming ETT placement in centres without capnography. This can reduce the incidence of unrecognised oesophageal intubation and prevent morbidity and mortality. Trial registration: National Medical Research Register NMRR11100810230. PMID:23826756
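    The reported summary statistics follow from a standard 2x2 contingency table. As a check, the counts can be reconstructed from the abstract's totals (101 tracheal and 6 oesophageal intubations, with 2 tracheal tubes missed by ultrasound); note that this reconstruction is an inference, not reported raw data:

```python
def diagnostic_stats(tp, fp, fn, tn):
    """Standard 2x2-table statistics, including Cohen's kappa."""
    n = tp + fp + fn + tn
    sens = tp / (tp + fn)                      # 99/101 = 0.980
    spec = tn / (tn + fp)                      # 6/6   = 1.000
    ppv = tp / (tp + fp)                       # 99/99 = 1.000
    npv = tn / (tn + fn)                       # 6/8   = 0.750
    po = (tp + tn) / n                         # observed agreement
    pe = ((tp + fn) * (tp + fp) + (fp + tn) * (fn + tn)) / n ** 2  # chance
    kappa = (po - pe) / (1 - pe)               # ~0.85, matching the abstract
    return sens, spec, ppv, npv, kappa

# Counts reconstructed from the abstract's totals (an assumption, as noted):
print(diagnostic_stats(tp=99, fp=0, fn=2, tn=6))
```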

  2. The physical model for research of behavior of grouting mixtures

    NASA Astrophysics Data System (ADS)

    Hajovsky, Radovan; Pies, Martin; Lossmann, Jaroslav

    2016-06-01

    The paper describes a physical model designed to verify the behavior of grouting mixtures applied below the groundwater level. The model was set up to determine how a grouting mixture propagates in a given environment. The extent of grouting in this environment was inferred from humidity and temperature measurements taken by combined sensors located in preinstalled special measurement probes around the grouting needle. Humidity was measured by a combined capacitive sensor (DTH-1010); temperature was measured by an NTC thermistor. The humidity sensors recorded the time at which the grouting mixture reached each sensor location, and the NTC thermistors recorded temperature changes over time from the start of injection. Together these data were used to develop a 3D map showing the distribution of the grouting mixture through the environment. The measurement was accomplished by a purpose-built primary measurement module capable of connecting 4 humidity and temperature sensors. This module also converts the physical signals into unified analogue signals, which are brought to the analogue input terminals of a WinPAC-8441 programmable automation controller (PAC). This controller performs the measurement itself, as well as archiving and visualization of all data. A detailed description of the complete measurement system and its evaluation in the form of 3D animations and graphs is given in the full paper.
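
    For context, NTC thermistor readings of the kind described are commonly converted to temperature with the simplified Beta (B-parameter) equation; the sketch below is illustrative only, with an assumed B value and reference resistance (the record does not specify the thermistor parameters):

```python
import math

def ntc_temperature_c(r_ohm, r0_ohm=10_000.0, t0_c=25.0, beta=3950.0):
    """Convert an NTC thermistor resistance to temperature using the
    B-parameter equation: 1/T = 1/T0 + (1/B) * ln(R/R0).
    r0_ohm, t0_c and beta are assumed datasheet values, not from the paper."""
    t0_k = t0_c + 273.15
    inv_t = 1.0 / t0_k + math.log(r_ohm / r0_ohm) / beta
    return 1.0 / inv_t - 273.15

print(ntc_temperature_c(10_000.0))  # 25.0 degC at the reference point
print(ntc_temperature_c(5_000.0))   # warmer: resistance falls as T rises
```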

  3. The Fracture Project

    DTIC Science & Technology

    2017-09-01

    [Only report-documentation-page fragments of this record survive extraction. Recoverable details: the report was cleared for public release by the 88th ABW, Wright-Patterson AFB Public Affairs Office and is available to the general public; sponsor/monitor's report number AFRL-RI-RS-TR-2017-178; distribution statement "Approved for Public Release; Distribution Unlimited"; subject terms include Formal Verification, Red Team, and High Assurance Cyber Military Systems.]

  4. SU-F-T-364: Monte Carlo-Dose Verification of Volumetric Modulated Arc Therapy Plans Using AAPM TG-119 Test Patterns

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Onizuka, R; Araki, F; Ohno, T

    2016-06-15

    Purpose: To investigate Monte Carlo (MC)-based dose verification for VMAT plans produced by a treatment planning system (TPS). Methods: The AAPM TG-119 test structure set was used for VMAT plans in Pinnacle3 (convolution/superposition), using a Synergy radiation head with a 6 MV beam and the Agility MLC. The Synergy was simulated with the EGSnrc/BEAMnrc code, and VMAT dose distributions were calculated with the EGSnrc/DOSXYZnrc code under the same irradiation conditions as the TPS. VMAT dose distributions of the TPS and MC were compared with those of EBT3 film by 2-D gamma analysis with ±3%/3 mm criteria and a threshold of 30% of the prescribed dose. VMAT dose distributions of the TPS and MC were also compared by DVHs and 3-D gamma analysis with ±3%/3 mm criteria and a threshold of 10%, and 3-D passing rates for PTVs and OARs were analyzed. Results: TPS dose distributions differed from those of film, especially for Head & neck. The dose difference between the TPS and film results from the calculation accuracy for complex MLC motion, such as the tongue-and-groove effect. In contrast, MC dose distributions were in good agreement with those of film. This is because MC can fully model the MLC configuration and accurately reproduce the MLC motion between control points in VMAT plans. D95 of the PTV for Prostate, Head & neck, C-shaped, and Multi Target was 97.2%, 98.1%, 101.6%, and 99.7% for the TPS and 95.7%, 96.0%, 100.6%, and 99.1% for MC, respectively. Similarly, 3-D gamma passing rates of each PTV for TPS vs. MC were 100%, 89.5%, 99.7%, and 100%, respectively. 3-D passing rates of the TPS were reduced for complex VMAT fields like Head & neck because the MLCs are not modeled completely in the TPS. Conclusion: MC-calculated VMAT dose distributions are useful for 3-D dose verification of VMAT plans produced by a TPS.
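
    For reference, the gamma criterion used here combines a dose-difference tolerance with a distance-to-agreement (DTA) tolerance. Below is a minimal brute-force sketch of a global 2-D gamma pass rate; it assumes both dose maps share one grid and skips the sub-grid interpolation that clinical QA software performs, so it is illustrative only:

```python
import numpy as np

def gamma_pass_rate(ref, ev, spacing_mm, dd=0.03, dta_mm=3.0, threshold=0.30):
    """Global 2-D gamma analysis (brute force over a local search window).
    ref, ev: reference and evaluated dose arrays on the same grid.
    dd: dose-difference criterion as a fraction of the max reference dose.
    Points below threshold * max(ref) are excluded, as in the record."""
    norm = dd * ref.max()
    search = int(np.ceil(2 * dta_mm / spacing_mm))  # +/- 2x DTA window
    ny, nx = ref.shape
    passes = []
    for iy in range(ny):
        for ix in range(nx):
            if ref[iy, ix] < threshold * ref.max():
                continue
            y0, y1 = max(0, iy - search), min(ny, iy + search + 1)
            x0, x1 = max(0, ix - search), min(nx, ix + search + 1)
            yy, xx = np.mgrid[y0:y1, x0:x1]
            dist2 = ((yy - iy) ** 2 + (xx - ix) ** 2) * spacing_mm ** 2
            dose2 = (ev[y0:y1, x0:x1] - ref[iy, ix]) ** 2
            gamma2 = dist2 / dta_mm ** 2 + dose2 / norm ** 2
            passes.append(np.sqrt(gamma2.min()) <= 1.0)
    return float(np.mean(passes))
```

    Called as, e.g., gamma_pass_rate(film_dose, mc_dose, spacing_mm=1.0) (names hypothetical), it returns the fraction of above-threshold points with γ ≤ 1.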

  5. From Tornadoes to Earthquakes: Forecast Verification for Binary Events Applied to the 1999 Chi-Chi, Taiwan, Earthquake

    NASA Astrophysics Data System (ADS)

    Chen, C.; Rundle, J. B.; Holliday, J. R.; Nanjo, K.; Turcotte, D. L.; Li, S.; Tiampo, K. F.

    2005-12-01

    Forecast verification procedures for statistical events with binary outcomes typically rely on the use of contingency tables and Relative Operating Characteristic (ROC) diagrams. Originally developed for the statistical evaluation of tornado forecasts on a county-by-county basis, these methods can be adapted to the evaluation of competing earthquake forecasts. Here we apply these methods retrospectively to two forecasts for the m = 7.3 1999 Chi-Chi, Taiwan, earthquake. These forecasts are based on a method, Pattern Informatics (PI), that locates likely sites for future large earthquakes based on large changes in the activity of the smallest earthquakes. A competing null hypothesis, Relative Intensity (RI), is based on the idea that future large earthquake locations are correlated with sites having the greatest frequency of small earthquakes. We show that for Taiwan, the PI forecast method is superior to the RI forecast null hypothesis. Inspection of the two maps indicates that their forecast locations are indeed quite different. Our results confirm an earlier result suggesting that the earthquake preparation process for events such as the Chi-Chi earthquake involves anomalous changes in activation or quiescence, and that signatures of these processes can be detected in precursory seismicity data. Furthermore, we find that our methods can accurately forecast the locations of aftershocks from precursory seismicity changes alone, implying that the main shock together with its aftershocks represent a single manifestation of the formation of a high-stress region nucleating prior to the main shock.
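
    As a minimal illustration of the contingency-table machinery (with made-up counts, not the Taiwan data), the hit rate and false alarm rate that define one point on a ROC diagram can be computed as:

```python
# Binary forecast verification from a 2x2 contingency table.
# Counts are illustrative only.
hits, misses = 8, 2                   # events: forecast / not forecast
false_alarms, correct_negs = 15, 175  # non-events: forecast / not forecast

hit_rate = hits / (hits + misses)                                # a.k.a. POD
false_alarm_rate = false_alarms / (false_alarms + correct_negs)  # a.k.a. POFD
print(f"hit rate={hit_rate:.2f}, false alarm rate={false_alarm_rate:.3f}")
# Sweeping the forecast threshold traces out the ROC curve; a skilful
# forecast lies above the hit_rate == false_alarm_rate diagonal.
```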

  6. Non-arbitrage in financial markets: A Bayesian approach for verification

    NASA Astrophysics Data System (ADS)

    Cerezetti, F. V.; Stern, Julio Michael

    2012-10-01

    The concept of non-arbitrage plays an essential role in finance theory. Under certain regularity conditions, the Fundamental Theorem of Asset Pricing states that, in non-arbitrage markets, prices of financial instruments are martingale processes. In this theoretical framework, analysis of the statistical distributions of financial assets can assist in understanding how participants behave in the markets and whether that behaviour may or may not engender arbitrage conditions. Assuming an underlying Variance Gamma statistical model, this study aims to test, using the FBST - Full Bayesian Significance Test, whether there is a relevant price difference between essentially the same financial asset traded at two distinct locations. Specifically, we investigate and compare the behavior of call options on the BOVESPA Index traded at (a) the Equities Segment and (b) the Derivatives Segment of BM&FBovespa. Our results seem to point to significant statistical differences. To what extent this evidence is actually the expression of perennial arbitrage opportunities is still an open question.
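
    A Variance Gamma process can be simulated as Brownian motion evaluated at a gamma-distributed random time; the sketch below (with arbitrary parameter values, not those of the study) illustrates the statistical model assumed:

```python
import numpy as np

def variance_gamma_path(n_steps, dt, theta=0.1, sigma=0.2, nu=0.3, seed=0):
    """Simulate VG increments X = theta*G + sigma*W(G), where the gamma
    time increments G have mean dt and variance nu*dt. Parameters are
    arbitrary illustration values."""
    rng = np.random.default_rng(seed)
    g = rng.gamma(shape=dt / nu, scale=nu, size=n_steps)  # random time steps
    w = rng.normal(0.0, 1.0, size=n_steps)
    increments = theta * g + sigma * np.sqrt(g) * w
    return np.cumsum(increments)

path = variance_gamma_path(1_000, dt=1 / 252)  # one "trading year" of steps
```

    The gamma subordinator fattens the tails relative to Brownian motion, which is why the VG model is a common choice for option-price data of the kind analysed here.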

  7. 77 FR 24988 - Manufacturer of Controlled Substances; Notice of Registration; Johnson Matthey Pharma Services

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-04-26

    ... distributed to the company's customers. No comments or objections have been received. DEA has considered the... physical security systems, verification of the company's compliance with state and local laws, and a review...

  8. Online 3D EPID-based dose verification: Proof of concept.

    PubMed

    Spreeuw, Hanno; Rozendaal, Roel; Olaciregui-Ruiz, Igor; González, Patrick; Mans, Anton; Mijnheer, Ben; van Herk, Marcel

    2016-07-01

    Delivery errors during radiotherapy may lead to medical harm and reduced life expectancy for patients. Such serious incidents can be avoided by performing dose verification online, i.e., while the patient is being irradiated, creating the possibility of halting the linac in case of a large overdosage or underdosage. The offline EPID-based 3D in vivo dosimetry system clinically employed at our institute is in principle suited for online treatment verification, provided the system is able to complete 3D dose reconstruction and verification within 420 ms, the present acquisition time of a single EPID frame. It is the aim of this study to show that our EPID-based dosimetry system can be made fast enough to achieve online 3D in vivo dose verification. The current dose verification system was sped up in two ways. First, a new software package was developed to perform, ahead of time, all computations that do not depend on portal image acquisition, thus removing the need to do these calculations in real time. Second, the 3D dose reconstruction algorithm was sped up via a new, multithreaded implementation. Dose verification was implemented by comparing planned with reconstructed 3D dose distributions delivered to two regions in a patient: the target volume and the nontarget volume receiving at least 10 cGy. In both volumes, the mean dose is compared, while in the nontarget volume, the near-maximum dose (D2) is compared as well. The real-time dosimetry system was tested by irradiating an anthropomorphic phantom with three VMAT plans: a 6 MV head-and-neck treatment plan, a 10 MV rectum treatment plan, and a 10 MV prostate treatment plan. In all plans, two types of serious delivery errors were introduced. The functionality of automatically halting the linac was also implemented and tested. The precomputation time was ∼180 s per treatment arc, depending on gantry angle resolution. The complete processing of a single portal frame, including dose verification, took 266 ± 11 ms on a dual octocore Intel Xeon E5-2630 CPU running at 2.40 GHz. The introduced delivery errors were detected after 5-10 s of irradiation. A prototype online 3D dose verification tool using portal imaging has been developed and successfully tested for two different kinds of gross delivery errors. Thus, online 3D dose verification has been technologically achieved.
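
    The verification quantities described (mean dose, and near-maximum dose D2, i.e. the dose received by the hottest 2% of the volume) are simple functionals of the reconstructed 3D dose. A minimal sketch with synthetic arrays (all names hypothetical, not the authors' software):

```python
import numpy as np

def dose_statistics(dose, mask):
    """Mean dose and near-maximum dose D2 (98th percentile) within a
    binary region mask."""
    voxels = dose[mask]
    return voxels.mean(), np.percentile(voxels, 98.0)

# Synthetic example: planned vs reconstructed dose on a 3D grid.
rng = np.random.default_rng(1)
planned = rng.uniform(0.0, 2.0, size=(64, 64, 32))
reconstructed = planned + rng.normal(0.0, 0.02, size=planned.shape)
target = planned > 1.5                   # stand-in for the target volume
nontarget = (planned >= 0.10) & ~target  # ">= 10 cGy" region, per the record

for name, region in [("target", target), ("non-target", nontarget)]:
    m_p, d2_p = dose_statistics(planned, region)
    m_r, d2_r = dose_statistics(reconstructed, region)
    print(f"{name}: mean {m_p:.3f} vs {m_r:.3f}, D2 {d2_p:.3f} vs {d2_r:.3f}")
```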

  9. Biometric Identification Verification Technology Status and Feasibility Study

    DTIC Science & Technology

    1994-09-01

    [OCR-damaged report documentation page; only fragments of this record survive. Recoverable details: Contract No. DNA 001-93-C-0137; approved for public release, distribution unlimited. The abstract fragment frames the problem as one of positive identification and access control, traditionally accomplished by posting a guard at an entry point.]

  10. Implementation of the Short-Term Ensemble Prediction System (STEPS) in Belgium and verification of case studies

    NASA Astrophysics Data System (ADS)

    Foresti, Loris; Reyniers, Maarten; Delobbe, Laurent

    2014-05-01

    The Short-Term Ensemble Prediction System (STEPS) is a probabilistic precipitation nowcasting scheme developed at the Australian Bureau of Meteorology in collaboration with the UK Met Office. In order to account for the multiscaling nature of rainfall structures, the radar field is decomposed into an 8-level multiplicative cascade using a Fast Fourier Transform. The cascade is advected using the velocity field estimated with optical flow and evolves stochastically according to a hierarchy of auto-regressive processes. This reproduces the empirical observation that small scales evolve faster in time than large scales. The uncertainty in radar rainfall measurement and the unknown future development of the velocity field are also accounted for by stochastic modelling, in order to reflect their typical spatial and temporal variability. Recently, a 4-year national research program, PLURISK ("forecasting and management of extreme rainfall induced risks in the urban environment"), has been initiated by the University of Leuven, the Royal Meteorological Institute (RMI) of Belgium and 3 other partners. The project deals with the nowcasting of rainfall and subsequent urban inundations, as well as socio-economic risk quantification, communication, warning and prevention. At the urban scale it is widely recognized that the uncertainty of hydrological and hydraulic models is largely driven by the uncertainty of the input rainfall estimates and forecasts. In support of the PLURISK project, the RMI aims at integrating STEPS into the current operational deterministic precipitation nowcasting system INCA-BE (Integrated Nowcasting through Comprehensive Analysis). This contribution will illustrate examples of STEPS ensemble and probabilistic nowcasts for a few selected case studies of stratiform and convective rain in Belgium. The paper focuses on the development of STEPS products for potential hydrological users and a preliminary verification of the nowcasts, especially to analyze the spatial distribution of forecast errors. The analysis of nowcast biases reveals the locations where convective initiation, rainfall growth and decay processes significantly reduce the forecast accuracy, but also points out the need for improving the radar-based quantitative precipitation estimation product that is used both to generate and verify the nowcasts. The collection of fields of verification statistics is implemented using an online update strategy, which potentially enables the system to learn from forecast errors as the archive of nowcasts grows. The study of the spatial or temporal distribution of nowcast errors is a key step to convey to the users an overall estimation of the nowcast accuracy and to drive future model developments.
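
    The scale-dependent temporal evolution can be sketched as a hierarchy of autoregressive processes, one per cascade level, in which finer levels are assigned shorter decorrelation times. This is a toy illustration of the idea, not the STEPS implementation, and the decorrelation times are invented:

```python
import numpy as np

rng = np.random.default_rng(42)
n_levels, n_steps = 8, 500
# Assumed decorrelation times (in time steps): long for large scales,
# short for small scales.
tau = np.array([60, 40, 25, 15, 9, 5, 3, 2], dtype=float)
phi = np.exp(-1.0 / tau)  # AR(1) coefficient per cascade level

levels = np.zeros((n_levels, n_steps))
for t in range(1, n_steps):
    noise = rng.normal(size=n_levels)
    # Unit-variance AR(1) update: finer levels "forget" faster.
    levels[:, t] = phi * levels[:, t - 1] + np.sqrt(1 - phi**2) * noise

# Sample lag-1 autocorrelation recovered per level is close to phi,
# i.e. the smallest scales (last rows) decorrelate fastest.
for k in range(n_levels):
    r1 = np.corrcoef(levels[k, :-1], levels[k, 1:])[0, 1]
    print(f"level {k}: phi={phi[k]:.2f}, sample lag-1 r={r1:.2f}")
```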

  11. 77 FR 8325 - Petition for Waiver of Compliance

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-02-14

    ... defective locations for post-test verification. Verifiers will then be sent out with field instruments to... III of its nonstop continuous rail test pilot project beginning April 1, 2012, for a period of up to 1... nonstop continuous rail test, CSX will not perform parallel or redundant start/stop rail testing on track...

  12. Shuttle passenger couch. [design and performance of engineering model

    NASA Technical Reports Server (NTRS)

    Rosener, A. A.; Stephenson, M. L.

    1974-01-01

    Conceptual design and fabrication of a full-scale shuttle passenger couch engineering model are reported. The model was utilized to verify anthropometric dimensions, reach dimensions, ingress/egress, couch operation, storage space, restraint locations, and crew acceptability. These data were then incorporated into the design of the passenger couch verification model that underwent performance tests.

  13. ETV Report:Siemens Model H-4XE-HO Open Channel UV System

    EPA Science Inventory

    Verification testing of the Siemens Barrier Sunlight H-4XE-HO UV System was completed at the UV Validation and Research Center of New York (UV Center), located in Johnstown, NY. The H-4XE System utilizes 16 high-output, low-pressure lamps oriented horizontally and parallel to the...

  14. ETV Report: Siemens Model V-40R-A150 Open Channel UV System

    EPA Science Inventory

    Verification testing of the Siemens Barrier Sunlight V-40R-A150 UV System was completed at the UV Validation and Research Center of New York (UV Center), located in Johnstown, NY. The V-40R System supplied by Siemens utilizes 40 high-output, low-pressure amalgam lamps, oriented ...

  15. 77 FR 67777 - National Oil and Hazardous Substance Pollution Contingency Plan; National Priorities List...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-11-14

    ... subsequent soil samples showed levels of metals at or below generic residential criteria or background values... 1994- 1996 and additional sampling between 1998 and 2007. Area A--Site Entrance: Soil boring samples... verification samples. Additional soil samples were collected from the same location as the previous collection...

  16. ETV REPORT: REMOVAL OF ARSENIC IN DRINKING WATER ORCA WATER TECHNOLOGIES KEMLOOP 1000 COAGULATION AND FILTRATION WATER TREATMENT SYSTEM

    EPA Science Inventory

    Verification testing of the ORCA Water Technologies KemLoop 1000 Coagulation and Filtration Water Treatment System for arsenic removal was conducted at the St. Louis Center located in Washtenaw County, Michigan, from March 23 through April 6, 2005. The source water was groundwate...

  17. Trajectory Based Behavior Analysis for User Verification

    NASA Astrophysics Data System (ADS)

    Pao, Hsing-Kuo; Lin, Hong-Yi; Chen, Kuan-Ta; Fadlil, Junaidillah

    Many of our activities on computers require a verification step for authorized access. The goal of verification is to tell apart the true account owner from intruders. We propose a general approach for user verification based on user trajectory inputs. The approach is labor-free for users and is likely to resist copying or simulation by non-authorized users or even automatic programs such as bots. Our study focuses on finding the hidden patterns embedded in the trajectories produced by account users. We employ a Markov chain model with Gaussian distributions in its transitions to describe the behavior in the trajectory. To distinguish between two trajectories, we propose a novel dissimilarity measure combined with a manifold-learnt tuning for capturing the pairwise relationship. Based on the pairwise relationship, we can plug in any effective classification or clustering method for the detection of unauthorized access. The method can also be applied to the task of recognition, predicting the trajectory type without a pre-defined identity. Given a trajectory input, the results show that the proposed method can accurately verify the user identity, or suggest who owns the trajectory if the input identity is not provided.
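
    As a minimal sketch of the kind of model described, a trajectory can be scored by the log-likelihood of its displacement steps under a Gaussian model fitted to a user's enrolment trajectories. This is a deliberate one-state simplification of the paper's Markov-chain approach; all names are hypothetical:

```python
import numpy as np

def fit_step_model(trajectories):
    """Fit a single Gaussian to the 2-D displacement steps of a user's
    enrolment trajectories (a one-state stand-in for the Markov chain)."""
    steps = np.vstack([np.diff(t, axis=0) for t in trajectories])
    return steps.mean(axis=0), np.cov(steps.T)

def log_likelihood(trajectory, mean, cov):
    """Average Gaussian log-likelihood of the trajectory's steps."""
    steps = np.diff(trajectory, axis=0)
    inv, logdet = np.linalg.inv(cov), np.linalg.slogdet(cov)[1]
    d = steps - mean
    quad = np.einsum("ij,jk,ik->i", d, inv, d)
    return float(np.mean(-0.5 * (quad + logdet + 2 * np.log(2 * np.pi))))

# Verification: accept if the claimed user's model explains the probe
# trajectory better than an enrolment-derived threshold (not shown).
rng = np.random.default_rng(0)
enrol = [np.cumsum(rng.normal(0, 1, size=(200, 2)), axis=0) for _ in range(5)]
mean, cov = fit_step_model(enrol)
probe = np.cumsum(rng.normal(0, 1, size=(200, 2)), axis=0)
print(log_likelihood(probe, mean, cov))
```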

  18. Advanced Plant Habitat Test Harvest

    NASA Image and Video Library

    2017-08-24

    John "JC" Carver, a payload integration engineer with NASA Kennedy Space Center's Test and Operations Support Contract, opens the door to the growth chamber of the Advanced Plant Habitat (APH) Flight Unit No. 1 for a test harvest of half of the Arabidopsis thaliana plants growing within. The harvest is part of an ongoing verification test of the APH unit, which is located inside the International Space Station Environmental Simulator in Kennedy's Space Station Processing Facility. The APH undergoing testing at Kennedy is identical to one on the station and uses red, green and broad-spectrum white LED lights to grow plants in an environmentally controlled chamber. The seeds grown during the verification test will be grown on the station to help scientists understand how these plants adapt to spaceflight.

  19. Advanced Plant Habitat Test Harvest

    NASA Image and Video Library

    2017-08-24

    John "JC" Carver, a payload integration engineer with NASA Kennedy Space Center's Test and Operations Support Contract, places Arabidopsis thaliana plants harvested from the Advanced Plant Habitat (APH) Flight Unit No. 1 into a Mini ColdBag that quickly freezes the plants. The harvest is part of an ongoing verification test of the APH unit, which is located inside the International Space Station Environmental Simulator in Kennedy's Space Station Processing Facility. The APH undergoing testing at Kennedy is identical to one on the station and uses red, green and broad-spectrum white LED lights to grow plants in an environmentally controlled chamber. The seeds grown during the verification test will be grown on the station to help scientists understand how these plants adapt to spaceflight.

  20. Advanced Plant Habitat Test Harvest

    NASA Image and Video Library

    2017-08-24

    John "JC" Carver, a payload integration engineer with NASA Kennedy Space Center's Test and Operations Support Contract, places Arabidopsis thaliana plants harvested from the Advanced Plant Habitat (APH) Flight Unit No. 1 into an Ultra-low Freezer chilled to -150 degrees Celsius. The harvest is part of an ongoing verification test of the APH unit, which is located inside the International Space Station Environmental Simulator in Kennedy's Space Station Processing Facility. The APH undergoing testing at Kennedy is identical to one on the station and uses red, green and broad-spectrum white LED lights to grow plants in an environmentally controlled chamber. The seeds grown during the verification test will be grown on the station to help scientists understand how these plants adapt to spaceflight.

  1. Multi-particle inspection using associated particle sources

    DOEpatents

    Bingham, Philip R.; Mihalczo, John T.; Mullens, James A.; McConchie, Seth M.; Hausladen, Paul A.

    2016-02-16

    Disclosed herein are representative embodiments of methods, apparatus, and systems for performing combined neutron and gamma ray radiography. For example, one exemplary system comprises: a neutron source; a set of alpha particle detectors configured to detect alpha particles associated with neutrons generated by the neutron source; neutron detectors positioned to detect at least some of the neutrons generated by the neutron source; a gamma ray source; a set of verification gamma ray detectors configured to detect verification gamma rays associated with gamma rays generated by the gamma ray source; a set of gamma ray detectors configured to detect gamma rays generated by the gamma ray source; and an interrogation region located between the neutron source, the gamma ray source, the neutron detectors, and the gamma ray detectors.

  2. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Otani, Y; Sumida, I; Yagi, M

    Purpose: Brachytherapy involves multiple manual procedures which are prone to human error, especially during the process of connecting the treatment device to the applicator, when considerable attention is required. In this study, we propose a new connection verification device concept. Methods: The system is composed of a ring magnet (anisotropic ferrite; Magfine Inc.), a Hall device (A1324LUA-T; Allegro MicroSystems Phil. Inc.) and an in-house check cable made from magnetic material (Figure 1). The magnetic field distribution is affected by the check cable position, and any magnetic field variation is detected by the Hall device. The system sampling frequency is 20 Hz, and the average of 4 signals was used as the Hall device value to reduce noise. Results: The Hall device output varies depending on the location of the check cable. The resolution of the check cable position was 5 mm within a region about 10 mm from the Hall device, and 10 mm beyond that distance. In our test, the sensitivity of the Hall device decreased with increasing distance between the Hall device and the check cable. Conclusion: We demonstrated a new concept for connection verification in brachytherapy. This system has the possibility of detecting an incorrect connection. Moreover, the system is capable of self-optimization, such as determining the number of Hall devices and the magnet strength. Acknowledgement: This work was supported by JSPS Core-to-Core program Number 23003 and KAKENHI Grant Number 26860401.

  3. Forecast Verification: Identification of small changes in weather forecasting skill

    NASA Astrophysics Data System (ADS)

    Weatherhead, E. C.; Jensen, T. L.

    2017-12-01

    Global and regional weather forecasts have improved over the past seven decades, most often because of small, incremental improvements. The identification and verification of forecast improvement due to proposed small changes in forecasting can be expensive and, if not carried out efficiently, can slow progress in forecasting development. This presentation will look at the skill of commonly used verification techniques and show how the ability to detect improvements can depend on the magnitude of the improvement, the number of runs used to test the improvement, the location on the Earth, and the statistical techniques used. For continuous variables, such as temperature, wind and humidity, the skill of a forecast can be directly compared using a pair-wise statistical test that accommodates the natural autocorrelation and magnitude of variability. For discrete variables, such as tornado outbreaks or icing events, the challenge is to reduce the false alarm rate while improving the rate of correctly identifying the discrete event. For both continuous and discrete verification results, proper statistical approaches can reduce the number of runs needed to identify a small improvement in forecasting skill. Verification within the Next Generation Global Prediction System is an important component of the many small decisions needed to make state-of-the-art improvements to weather forecasting capabilities. The comparison of multiple skill scores with often conflicting results requires not only appropriate testing, but also scientific judgment to assure that the choices are appropriate for improvements in today's forecasting capabilities while allowing for improvements that will come in the future.
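
    A common way to make a pair-wise test respect autocorrelation is to deflate the sample size to an effective value before the t-test. The record does not specify the exact test, so the following is a minimal sketch under an AR(1) assumption:

```python
import numpy as np
from scipy import stats

def paired_skill_test(err_a, err_b):
    """Paired t-test on forecast error differences, with the sample size
    deflated by the lag-1 autocorrelation (AR(1) effective-n correction:
    n_eff = n * (1 - r1) / (1 + r1))."""
    d = np.asarray(err_a) - np.asarray(err_b)
    n = d.size
    r1 = np.corrcoef(d[:-1], d[1:])[0, 1]
    n_eff = n * (1 - r1) / (1 + r1)
    t = d.mean() / (d.std(ddof=1) / np.sqrt(n_eff))
    p = 2 * stats.t.sf(abs(t), df=n_eff - 1)
    return t, p

rng = np.random.default_rng(3)
base = rng.normal(1.0, 0.3, 500)                    # synthetic daily errors
improved = base - 0.02 + rng.normal(0, 0.05, 500)   # a small improvement
print(paired_skill_test(base, improved))
```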

  4. eBiometrics: an enhanced multi-biometrics authentication technique for real-time remote applications on mobile devices

    NASA Astrophysics Data System (ADS)

    Kuseler, Torben; Lami, Ihsan; Jassim, Sabah; Sellahewa, Harin

    2010-04-01

    The use of mobile communication devices with advanced sensors is growing rapidly. These sensors enable functions such as image capture, location applications, and biometric authentication such as fingerprint verification and face and handwritten-signature recognition. Such ubiquitous devices are essential tools in today's global economic activities, enabling anywhere-anytime financial and business transactions. Cryptographic functions and biometric-based authentication can enhance the security and confidentiality of mobile transactions. Biometric template security techniques are key factors for successful real-time identity verification solutions, but they are vulnerable to determined attacks by both fraudulent software and hardware. The EU-funded SecurePhone project has designed and implemented a multimodal biometric user authentication system on a prototype mobile communication device. However, various implementations of this project have resulted in long verification times or reduced accuracy and/or security. This paper proposes the use of built-in self-test techniques to ensure no tampering has taken place in the verification process prior to performing the actual biometric authentication. These techniques utilise the user's personal identification number as a seed to generate a unique signature, which is then used to test the integrity of the verification process. The study also proposes the use of a combination of biometric modalities to provide application-specific authentication in a secure environment, thus achieving an optimum security level with effective processing time, i.e. ensuring that the necessary authentication steps and algorithms running on the mobile device's application processor cannot be undermined or modified by an imposter to gain unauthorized access to the secure system.
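
    One way to realise a PIN-seeded integrity signature of the kind described is to key an HMAC with material derived from the PIN and check the verification code image before running the biometric match. This is a hedged illustration of the concept, not the SecurePhone design:

```python
import hashlib
import hmac

def derive_key(pin: str, salt: bytes) -> bytes:
    """Derive a signature key from the user's PIN (PBKDF2; parameters are
    illustrative, not from the paper)."""
    return hashlib.pbkdf2_hmac("sha256", pin.encode(), salt, 100_000)

def sign(blob: bytes, key: bytes) -> bytes:
    return hmac.new(key, blob, hashlib.sha256).digest()

def verify_before_auth(blob: bytes, key: bytes, expected: bytes) -> bool:
    """Built-in self-test: refuse to run biometric matching if the
    verification routine has been tampered with."""
    return hmac.compare_digest(sign(blob, key), expected)

key = derive_key("1234", salt=b"device-unique-salt")      # values hypothetical
code_image = b"...bytes of the verification routine..."
reference_sig = sign(code_image, key)                     # stored at enrolment
assert verify_before_auth(code_image, key, reference_sig)
```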

  5. Calibrated Multiple Event Relocations of the Central and Eastern United States

    NASA Astrophysics Data System (ADS)

    Yeck, W. L.; Benz, H.; McNamara, D. E.; Bergman, E.; Herrmann, R. B.; Myers, S. C.

    2015-12-01

    Earthquake locations are a first-order observable which form the basis of a wide range of seismic analyses. Currently, the ANSS catalog primarily contains published single-event earthquake locations that rely on assumed 1D velocity models. Increasing the accuracy of cataloged earthquake hypocenter locations and origin times and constraining their associated errors can improve our understanding of Earth structure and have a fundamental impact on subsequent seismic studies. Multiple-event relocation algorithms often increase the precision of relative earthquake hypocenters but are hindered by their limited ability to provide realistic location uncertainties for individual earthquakes. Recently, a Bayesian approach to the multiple event relocation problem has proven to have many benefits, including the ability to: (1) handle large data sets; (2) easily incorporate a priori hypocenter information; (3) model phase assignment errors; and (4) correct for errors in the assumed travel time model. In this study we employ Bayesloc [Myers et al., 2007, 2009] to relocate earthquakes in the Central and Eastern United States from 1964 to present. We relocate ~11,000 earthquakes with a dataset of ~439,000 arrival time observations. Our dataset includes arrival-time observations from the ANSS catalog supplemented with arrival-time data from the Reviewed ISC Bulletin (prior to 1981), targeted local studies, and arrival-time data from the TA Array. One significant benefit of the Bayesloc algorithm is its ability to incorporate a priori constraints on the probability distributions of specific earthquake location parameters. To constrain the inversion, we use high-quality calibrated earthquake locations from local studies, including studies from: Raton Basin, Colorado; Mineral, Virginia; Guy, Arkansas; Cheneville, Quebec; Oklahoma; and Mt. Carmel, Illinois. We also add depth constraints for 232 earthquakes from regional moment tensors. Finally, we add constraints from four historic (1964-1973) ground truth events from a verification database. We (1) evaluate our ability to improve our location estimates, (2) use improved locations to evaluate Earth structure in seismically active regions, and (3) examine improvements to the estimated locations of historic large-magnitude earthquakes.

  6. Vector wind profile gust model

    NASA Technical Reports Server (NTRS)

    Adelfang, S. I.

    1981-01-01

    To enable development of a vector wind gust model suitable for orbital flight test operations and trade studies, hypotheses concerning the distributions of gust component variables were verified. Methods are presented for verifying the hypotheses that observed gust variables, including gust component magnitude, gust length, u range, and L range, are gamma distributed. The observed gust modulus is shown to be drawn from a bivariate gamma distribution that can be approximated with a Weibull distribution, and the zonal and meridional gust components are bivariate gamma distributed. An analytical method for testing for bivariate gamma distributed variables is presented. Two distributions for gust modulus are described, and the results of extensive hypothesis testing of one of the distributions are presented. The validity of the gamma distribution for representation of gust component variables is established.
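
    Testing whether an observed variable is gamma distributed can be sketched with a maximum-likelihood fit followed by a goodness-of-fit test. The following is a minimal illustration with synthetic data, not the report's analytical method:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(7)
gusts = rng.gamma(shape=2.5, scale=1.8, size=400)  # synthetic gust magnitudes

# Fit a gamma distribution (location fixed at zero) and test the fit.
shape, loc, scale = stats.gamma.fit(gusts, floc=0.0)
ks_stat, p_value = stats.kstest(gusts, "gamma", args=(shape, loc, scale))
print(f"shape={shape:.2f}, scale={scale:.2f}, KS p={p_value:.3f}")
# A large p-value means the gamma hypothesis is not rejected. (Strictly,
# fitting the parameters first biases the KS test; a parametric bootstrap
# or Lilliefors-type correction would be more rigorous.)
```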

  7. Experimental Verification Of The Osculating Cones Method For Two Waverider Forebodies At Mach 4 and 6

    NASA Technical Reports Server (NTRS)

    Miller, Rolf W.; Argrow, Brian M.; Center, Kenneth B.; Brauckmann, Gregory J.; Rhode, Matthew N.

    1998-01-01

    The NASA Langley Research Center Unitary Plan Wind Tunnel and the 20-Inch Mach 6 Tunnel were used to test two osculating cones waverider models. The Mach-4 and Mach-6 shapes were generated using the interactive design tool WIPAR. WIPAR performance predictions are compared to the experimental results. Vapor screen results for the Mach-4 model at the on-design Mach number provide visual verification that the shock is attached along the entire leading edge, within the limits of observation. WIPAR predictions of pressure distributions and aerodynamic coefficients show general agreement with the corresponding experimental values.

  8. Model-Driven Test Generation of Distributed Systems

    NASA Technical Reports Server (NTRS)

    Easwaran, Arvind; Hall, Brendan; Schweiker, Kevin

    2012-01-01

    This report describes a novel test generation technique for distributed systems. Utilizing formal models and formal verification tools, specifically the Symbolic Analysis Laboratory (SAL) tool-suite from SRI, we present techniques to generate concurrent test vectors for distributed systems. These are initially explored within an informal test validation context and later extended to achieve full MC/DC coverage of the TTEthernet protocol operating within a system-centric context.

  9. Comparison of DSMC Reaction Models with QCT Reaction Rates for Nitrogen

    DTIC Science & Technology

    2016-07-17

    [Only briefing/front-matter fragments of this record survive extraction. Recoverable details: the U.S. Government is joint author of the work; Distribution A, Approved for Public Release, Distribution Unlimited, PA #16299. The introduction notes that comparison with measurements is the final goal, with validation via model verification and parameter adjustment, and that four DSMC chemistry models are compared, including total collision energy (TCE), quantum kinetic (QK), and vibration-dissociation favoring; the remainder is truncated.]

  10. Dosimetric characterization and output verification for conical brachytherapy surface applicators. Part I. Electronic brachytherapy source

    PubMed Central

    Fulkerson, Regina K.; Micka, John A.; DeWerd, Larry A.

    2014-01-01

    Purpose: Historically, treatment of malignant surface lesions has been achieved with linear accelerator based electron beams or superficial x-ray beams. Recent developments in the field of brachytherapy now allow for the treatment of surface lesions with specialized conical applicators placed directly on the lesion. Applicators are available for use with high dose rate (HDR) 192Ir sources, as well as electronic brachytherapy sources. Part I of this paper will discuss the applicators used with electronic brachytherapy sources; Part II will discuss those used with HDR 192Ir sources. Although the use of these applicators has gained in popularity, the dosimetric characteristics including depth dose and surface dose distributions have not been independently verified. Additionally, there is no recognized method of output verification for quality assurance procedures with applicators like these. Existing dosimetry protocols available from the AAPM bookend the cross-over characteristics of a traditional brachytherapy source (as described by Task Group 43) being implemented as a low-energy superficial x-ray beam (as described by Task Group 61) as observed with the surface applicators of interest. Methods: This work aims to create a cohesive method of output verification that can be used to determine the dose at the treatment surface as part of a quality assurance/commissioning process for surface applicators used with HDR electronic brachytherapy sources (Part I) and 192Ir sources (Part II). Air-kerma rate measurements for the electronic brachytherapy sources were completed with an Attix Free-Air Chamber, as well as several models of small-volume ionization chambers to obtain an air-kerma rate at the treatment surface for each applicator. Correction factors were calculated using MCNP5 and EGSnrc Monte Carlo codes in order to determine an applicator-specific absorbed dose to water at the treatment surface from the measured air-kerma rate. Additionally, relative dose measurements of the surface dose distributions and characteristic depth dose curves were completed in-phantom. Results: Theoretical dose distributions and depth dose curves were generated for each applicator and agreed well with the measured values. A method of output verification was created that allows users to determine the applicator-specific dose to water at the treatment surface based on a measured air-kerma rate. Conclusions: The novel output verification methods described in this work will reduce uncertainties in dose delivery for treatments with these kinds of surface applicators, ultimately improving patient care. PMID:24506635

  11. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lau, A; Chen, Y; Ahmad, S

    Purpose: Proton therapy exhibits several advantages over photon therapy due to the depth-dose distributions of proton interactions within the target material. However, uncertainties associated with the proton beam range in the patient limit the advantage of proton therapy applications. To quantify beam range, positron-emitting nuclei (PEN) and prompt gamma (PG) techniques have been developed. These techniques use de-excitation photons to describe the location of the beam in the patient. To develop a detector system for implementing the PG technique for range verification applications in proton therapy, we studied the yields, energy and angular distributions of the secondary particles emitted from a PMMA phantom. Methods: Proton pencil beams of various energies incident onto a PMMA phantom with dimensions of 5 x 5 x 50 cm3 were used for simulation with the Geant4 toolkit using the standard electromagnetic packages as well as the packages based on the binary-cascade nuclear model. The emitted secondary particles were analyzed. Results: For 160 MeV incident protons, the yields of secondary neutrons and photons per 100 incident protons were ~6 and ~15, respectively. The secondary photon energy spectrum showed several energy peaks in the range between 0 and 10 MeV. The energy peaks located between 4 and 6 MeV were attributed to direct proton interactions with 12C (~4.4 MeV) and 16O (~6 MeV), respectively. Most of the escaping secondary neutrons were found to have energies between 10 and 100 MeV. Isotropic emission was found for lower-energy neutrons (<10 MeV) and for photons of all energies, while higher-energy neutrons were emitted predominantly in the forward direction. The yields of emitted photons and neutrons increased with increasing incident proton energy. Conclusions: A detector system is currently being developed incorporating the yields, energy and angular distributions of secondary particles from proton interactions obtained from this study.

  12. Evaluation of geotechnical monitoring data from the ESF North Ramp Starter Tunnel, April 1994 to June 1995. Revision 1

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    NONE

    1995-11-01

    This report presents the results of instrumentation measurements and observations made during construction of the North Ramp Starter Tunnel (NRST) of the Exploratory Studies Facility (ESF). The information in this report was developed as part of the Design Verification Study, Section 8.3.1.15.1.8 of the Yucca Mountain Site Characterization Plan (DOE 1988). The ESF is being constructed by the US Department of Energy (DOE) to evaluate the feasibility of locating a potential high-level nuclear waste repository on lands within and adjacent to the Nevada Test Site (NTS), Nye County, Nevada. The Design Verification Studies are performed to collect information during construction of the ESF that will be useful for design and construction of the potential repository. Four experiments make up the Design Verification Study: Evaluation of Mining Methods, Monitoring Drift Stability, Monitoring of Ground Support Systems, and The Air Quality and Ventilation Experiment. This report describes Sandia National Laboratories' (SNL) efforts in the first three of these experiments in the NRST.

  13. Probabilistic verification of cloud fraction from three different products with CALIPSO

    NASA Astrophysics Data System (ADS)

    Jung, B. J.; Descombes, G.; Snyder, C.

    2017-12-01

    In this study, we present how Cloud-Aerosol Lidar and Infrared Pathfinder Satellite Observation (CALIPSO) can be used for probabilistic verification of cloud fraction, and apply this probabilistic approach to three cloud fraction products: a) The Air Force Weather (AFW) World Wide Merged Cloud Analysis (WWMCA), b) Satellite Cloud Observations and Radiative Property retrieval Systems (SatCORPS) from NASA Langley Research Center, and c) Multi-sensor Advection Diffusion nowCast (MADCast) from NCAR. Although they differ in their details, both WWMCA and SatCORPS retrieve cloud fraction from satellite observations, mainly of infrared radiances. MADCast utilizes in addition a short-range forecast of cloud fraction (provided by the Model for Prediction Across Scales, assuming cloud fraction is advected as a tracer) and a column-by-column particle filter implemented within the Gridpoint Statistical Interpolation (GSI) data-assimilation system. The probabilistic verification considers the retrieved or analyzed cloud fractions as predicting the probability of cloud at any location within a grid cell and the 5-km vertical feature mask (VFM) from CALIPSO level-2 products as a point observation of cloud.
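
    Treating each grid cell's analyzed cloud fraction as a probability of cloud and the CALIPSO vertical feature mask as a binary point observation leads naturally to probabilistic scores such as the Brier score. A minimal sketch with synthetic values (names hypothetical, not from the study):

```python
import numpy as np

def brier_score(p_cloud, observed):
    """Mean squared difference between the forecast probability of cloud
    and the binary (0/1) CALIPSO observation."""
    p = np.asarray(p_cloud, dtype=float)
    o = np.asarray(observed, dtype=float)
    return float(np.mean((p - o) ** 2))

# Cloud fraction from a gridded product, matched to CALIPSO footprints.
p = np.array([0.9, 0.2, 0.6, 0.0, 1.0, 0.4])
o = np.array([1, 0, 1, 0, 1, 0])
print(brier_score(p, o))                 # 0 is perfect
ref = brier_score(np.full_like(p, o.mean()), o)  # climatology reference
print(1 - brier_score(p, o) / ref)       # Brier skill score vs climatology
```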

  14. Towards real-time VMAT verification using a prototype, high-speed CMOS active pixel sensor.

    PubMed

    Zin, Hafiz M; Harris, Emma J; Osmond, John P F; Allinson, Nigel M; Evans, Philip M

    2013-05-21

    This work investigates the feasibility of using a prototype complementary metal oxide semiconductor active pixel sensor (CMOS APS) for real-time verification of volumetric modulated arc therapy (VMAT) treatment. The prototype CMOS APS used region-of-interest readout on the chip to allow fast imaging at up to 403.6 frames per second (f/s). The sensor was made larger (5.4 cm × 5.4 cm) using recent advances in photolithographic technique but retains fast imaging speed through the sensor's regional readout. There is a paradigm shift in radiotherapy treatment verification with the advent of advanced treatment techniques such as VMAT. This work has demonstrated that the APS can track multi-leaf collimator (MLC) leaves moving at 18 mm/s with an automatic edge-tracking algorithm at an accuracy better than 1.0 mm, even at the fastest imaging speed. The measured fluence distribution for an example VMAT delivery sampled at 50.4 f/s was shown to agree well with the planned fluence distribution, with an average gamma pass rate of 96% at 3%/3 mm. The MLC leaf motion and linac pulse rate variation delivered throughout the VMAT treatment can also be measured. The results demonstrate the potential of CMOS APS technology as a real-time radiotherapy dosimeter for the delivery of complex treatments such as VMAT.

  15. A back-projection algorithm in the presence of an extra attenuating medium: towards EPID dosimetry for the MR-Linac

    NASA Astrophysics Data System (ADS)

    Torres-Xirau, I.; Olaciregui-Ruiz, I.; Rozendaal, R. A.; González, P.; Mijnheer, B. J.; Sonke, J.-J.; van der Heide, U. A.; Mans, A.

    2017-08-01

    In external beam radiotherapy, electronic portal imaging devices (EPIDs) are frequently used for pre-treatment and for in vivo dose verification. Currently, various MR-guided radiotherapy systems are being developed and clinically implemented. Independent dosimetric verification is highly desirable. For this purpose we adapted our EPID-based dose verification system for use with the MR-Linac combination developed by Elekta in cooperation with UMC Utrecht and Philips. In this study we extended our back-projection method to cope with the presence of an extra attenuating medium between the patient and the EPID. Experiments were performed at a conventional linac, using an aluminum mock-up of the MRI scanner housing between the phantom and the EPID. For a 10 cm square field, the attenuation by the mock-up was 72%, while 16% of the remaining EPID signal resulted from scattered radiation. 58 IMRT fields were delivered to a 20 cm slab phantom with and without the mock-up. EPID-reconstructed dose distributions were compared to planned dose distributions using the γ-evaluation method (global, 3%, 3 mm). With our adapted back-projection algorithm the averaged γ_mean was 0.27 ± 0.06, while with the conventional algorithm it was 0.28 ± 0.06. Dose profiles of several square fields reconstructed with our adapted algorithm showed excellent agreement when compared to the TPS.

  16. SU-E-T-762: Toward Volume-Based Independent Dose Verification as Secondary Check

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Tachibana, H; Tachibana, R

    2015-06-15

    Purpose: Lung SBRT planning has shifted to a volume prescription technique. However, point dose agreement is still verified using independent dose verification as the secondary check. Volume dose verification is more affected by the inhomogeneity correction than the point dose verification currently used as the check. A feasibility study of volume dose verification was conducted for lung SBRT plans. Methods: Six SBRT plans were collected at our institute. Two dose distributions, with and without inhomogeneity correction, were generated using Adaptive Convolve (AC) in Pinnacle3. Simple MU Analysis (SMU, Triangle Product, Ishikawa, JP) was used as the independent dose verification software program, in which a modified Clarkson-based algorithm was implemented and the radiological path length was computed from the CT images independently of the treatment planning system. The agreement in point dose and mean dose between the AC with/without the correction and the SMU was assessed. Results: In the point dose evaluation for the center of the GTV, the difference showed a systematic shift (4.5% ± 1.9%) in comparison with the AC with the inhomogeneity correction; on the other hand, there was good agreement of 0.2 ± 0.9% between the SMU and the AC without the correction. In the volume evaluation, there were significant differences in mean dose not only for the PTV (14.2 ± 5.1%) but also for the GTV (8.0 ± 5.1%) compared to the AC with the correction. Without the correction, the SMU showed good agreement for the GTV (1.5 ± 0.9%) as well as the PTV (0.9% ± 1.0%). Conclusion: Volume evaluation as a secondary check may be possible in homogeneous regions. However, volumes including inhomogeneous media would produce larger discrepancies. The dose calculation algorithm for independent verification needs to be modified to take the inhomogeneity correction into account.

  17. SU-E-T-398: Feasibility of Automated Tools for Robustness Evaluation of Advanced Photon and Proton Techniques in Oropharyngeal Cancer

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Liu, H; Liang, X; Kalbasi, A

    2014-06-01

    Purpose: Advanced radiotherapy (RT) techniques such as proton pencil beam scanning (PBS) and photon-based volumetric modulated arc therapy (VMAT) have dosimetric advantages in the treatment of head and neck malignancies. However, anatomic or alignment changes during treatment may limit the robustness of PBS and VMAT plans. We assess the feasibility of automated deformable registration tools for robustness evaluation in adaptive PBS and VMAT RT of oropharyngeal cancer (OPC). Methods: We treated 10 patients with bilateral OPC with advanced RT techniques and obtained verification CT scans with physician-reviewed target and OAR contours. We generated 3 advanced RT plans for each patient: a proton PBS plan using 2 posterior oblique fields (2F), a proton PBS plan using an additional third low-anterior field (3F), and a photon VMAT plan using 2 arcs (Arc). For each of the planning techniques, we forward-calculated the initial (Ini) plans on the verification scans to create verification (V) plans. We extracted DVH indicators based on physician-generated contours for 2 target and 14 OAR structures to investigate the feasibility of two automated tools (contour propagation (CP) and dose deformation (DD)) as surrogates for routine clinical plan robustness evaluation. For each verification scan, we compared DVH indicators of V, CP and DD plans in a head-to-head fashion using Student's t-test. Results: We performed 39 verification scans; each patient underwent 3 to 6 verification scans. We found no differences in doses to target or OAR structures between V and CP, V and DD, and CP and DD plans across all patients (p > 0.05). Conclusions: The automated robustness evaluation tools, CP and DD, accurately predicted the dose distributions of verification (V) plans using physician-generated contours. These tools may be further developed as a potential robustness screening tool in the workflow for adaptive treatment of OPC using advanced RT techniques, reducing the need for physician-generated contours.

  18. TPSPET - A TPS-based approach for in vivo dose verification with PET in proton therapy.

    PubMed

    Frey, K; Bauer, J; Unholtz, D; Kurz, C; Krämer, M; Bortfeld, T; Parodi, K

    2014-01-06

    Since the interest in ion irradiation for tumour therapy has significantly increased over the last few decades, intensive investigations are being performed to improve the accuracy of this form of patient treatment. One major goal is the development of methods for in vivo dose verification. In proton therapy, a PET (positron emission tomography)-based approach measuring the irradiation-induced tissue activation inside the patient has already been clinically implemented. The acquired PET images can be compared to an expectation, derived under the assumption of a correct treatment application, to validate the particle range and the lateral field position in vivo. In the context of this work, TPSPET is introduced as a new approach to predict proton-irradiation-induced three-dimensional positron emitter distributions by means of the same algorithms as the clinical treatment planning system (TPS). In order to perform the additional activity calculations, reaction-channel-dependent input positron emitter depth distributions are necessary; these are determined by applying a modified filtering approach to the TPS reference depth dose profiles in water. This paper presents the implementation of TPSPET on the basis of the research treatment planning software "treatment planning for particles". The results are validated in phantom and patient studies against Monte Carlo simulations, and compared to β+-emitter distributions obtained from a slightly modified version of the originally proposed one-dimensional filtering approach applied to three-dimensional dose distributions. In contrast to previously introduced methods, TPSPET provides a faster implementation, the results show no sensitivity to lateral field extension, and the predicted β+-emitter densities are fully consistent with the planned treatment dose, as they are calculated by the same pencil beam algorithms. These findings suggest a large potential for the application of TPSPET for in vivo dose verification in the daily clinical routine.

  19. 38 CFR 74.1 - What definitions are important for VetBiz Vendor Information Pages (VIP) Verification Program?

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... (NAICS) Manual published by the U.S. Office of Management and Budget. Principal place of business means... working hours and where top management's current business records are kept. If the office from which management is directed and where the current business records are kept are in different locations, CVE will...

  20. 8 CFR 103.21 - Access by individuals to records maintained about them.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... should clearly identify the record sought by the name and any other personal identifiers for the individual (such as the alien file number or Social Security Account Number), date and place of birth, and type of file in which the record is believed to be located. (b) Verification of identity. The following...

  1. Hailstorms over Switzerland: Verification of Crowd-sourced Data

    NASA Astrophysics Data System (ADS)

    Noti, Pascal-Andreas; Martynov, Andrey; Hering, Alessandro; Martius, Olivia

    2016-04-01

    Reports from smartphone users witnessing hailstorms can be used as a source of independent, ground-based observation data on ground-reaching hailstorms with high temporal and spatial resolution. The presented work focuses on the verification of crowd-sourced data collected over Switzerland with the help of a smartphone application recently developed by MeteoSwiss. The crowd-sourced data include the precise location and time of hail precipitation and the hailstone size, and are assessed on the basis of the weather radar data of MeteoSwiss. Two radar-based hail detection algorithms in use at MeteoSwiss, POH (Probability of Hail) and MESHS (Maximum Expected Severe Hail Size), are compared against the crowd-sourced data. The available data and investigation period span June to August 2015. Filter criteria were applied in order to remove false reports from the crowd-sourced data. Neighborhood methods were introduced to reduce the uncertainties that result from spatial and temporal biases. The crowd-sourced and radar data are converted into binary sequences according to previously set thresholds, allowing a categorical verification. Verification scores (e.g. hit rate) are then calculated from a 2x2 contingency table. The hail reporting activity and the patterns corresponding to "hail" and "no hail" reports sent from smartphones have been analyzed, and the relationship between the reported hailstone sizes and both radar-based hail detection algorithms has been investigated.

  2. Assessment of test methods for evaluating effectiveness of cleaning flexible endoscopes.

    PubMed

    Washburn, Rebecca E; Pietsch, Jennifer J

    2018-06-01

    Strict adherence to each step of reprocessing is imperative to removing potentially infectious agents. Multiple methods for verifying proper reprocessing exist; however, each presents challenges and limitations, and best practice within the industry has not been established. Our goal was to evaluate endoscope cleaning verification tests, with particular interest in the evaluation of the manual cleaning step. The results of the cleaning verification tests were compared with microbial culturing to see if a positive cleaning verification test would be predictive of microbial growth. This study was conducted at 2 high-volume endoscopy units within a multisite health care system. Each of the 90 endoscopes was tested for adenosine triphosphate, protein, microbial growth via agar plate, and rapid gram-negative culture via assay. The endoscopes were tested in 3 locations: the instrument channel, the control knob, and the elevator mechanism. The analysis showed a substantial level of agreement between protein detection post-manual cleaning and protein detection post-high-level disinfection at the control head for scopes sampled sequentially. This study suggests that if protein is detected post-manual cleaning, there is a significant likelihood that protein will also be detected post-high-level disinfection. It also suggests that a cleaning verification test is not predictive of microbial growth. Copyright © 2018 Association for Professionals in Infection Control and Epidemiology, Inc. Published by Elsevier Inc. All rights reserved.

  3. Ionoacoustic characterization of the proton Bragg peak with submillimeter accuracy

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Assmann, W., E-mail: walter.assmann@lmu.de; Reinhardt, S.; Lehrack, S.

    2015-02-15

    Purpose: Range verification in ion beam therapy relies to date on nuclear imaging techniques, which require complex and costly detector systems. A different approach is the detection of thermoacoustic signals that are generated due to localized energy loss of ion beams in tissue (ionoacoustics). The aim of this work was to study experimentally the achievable position resolution of ionoacoustics under idealized conditions using high-frequency ultrasonic transducers and a specifically selected probing beam. Methods: A water phantom was irradiated by a pulsed 20 MeV proton beam with varying pulse intensity and length. The acoustic signal of single proton pulses was measured by different PZT-based ultrasound detectors (3.5 and 10 MHz central frequencies). The proton dose distribution in water was calculated by Geant4 and used as input for simulation of the generated acoustic wave with the MATLAB toolbox k-Wave. Results: In measurements from this study, a clear signal of the Bragg peak was observed for an energy deposition as low as 10^12 eV. The signal amplitude showed a linear increase with particle number per pulse and thus dose. Bragg peak position measurements were reproducible within ±30 μm and agreed with Geant4 simulations to better than 100 μm. The ionoacoustic signal pattern allowed for a detailed analysis of the Bragg peak and could be well reproduced by k-Wave simulations. Conclusions: The authors have studied the ionoacoustic signal of the Bragg peak in experiments using a 20 MeV proton beam with its correspondingly localized energy deposition, demonstrating submillimeter position resolution and providing deep insight into the correlation between the acoustic signal and the Bragg peak shape. These results, together with earlier experiments and new simulations (including the results in this study) at higher energies, suggest ionoacoustics as a technique for range verification in particle therapy at locations where the tumor can be localized by ultrasound imaging. This acoustic range verification approach could offer the possibility of combining anatomical ultrasound and Bragg peak imaging, but further studies are required for translation of these findings to clinical application.

  4. Dosimetric verification of stereotactic radiosurgery/stereotactic radiotherapy dose distributions using Gafchromic EBT3.

    PubMed

    Cusumano, Davide; Fumagalli, Maria L; Marchetti, Marcello; Fariselli, Laura; De Martin, Elena

    2015-01-01

    Aim of this study is to examine the feasibility of using the new Gafchromic EBT3 film in a high-dose stereotactic radiosurgery and radiotherapy quality assurance procedure. Owing to the reduced dimensions of the involved lesions, the feasibility of scanning plan verification films on the scanner plate area with the best uniformity rather than using a correction mask was evaluated. For this purpose, signal value dispersion and reproducibility of film scans were investigated. Uniformity was then quantified in the selected area and was found to be within 1.5% for doses up to 8 Gy. A high-dose threshold level for analyses using this procedure was established by evaluating the sensitivity of the irradiated films. Sensitivity was found to be of the order of centigray for doses up to 6.2 Gy, decreasing for higher doses. The obtained results were used to implement a procedure comparing dose distributions delivered with a CyberKnife system to planned ones. The procedure was validated through single-beam irradiation on a Gafchromic film. The agreement between dose distributions was then evaluated for 13 patients (brain lesions, 5 Gy/die, prescription isodose ~80%) using gamma analysis. Results obtained using gamma test criteria of 5%/1 mm show a pass rate of 94.3%. Gamma frequency parameter calculation for EBT3 films was shown to depend strongly on subtraction of unexposed film pixel values from irradiated ones. In the framework of the described dosimetric procedure, EBT3 films proved to be effective in the verification of high doses delivered to lesions with complex shapes and adjacent to organs at risk. Copyright © 2015 American Association of Medical Dosimetrists. Published by Elsevier Inc. All rights reserved.
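
    The gamma test referenced here scores each reference point by searching the evaluated distribution for the best combined dose-difference and distance-to-agreement match; a point passes when gamma <= 1. A brute-force sketch of a global 2D gamma evaluation with the study's 5%/1 mm criteria, on synthetic dose grids rather than the study's data:

```python
import numpy as np

def gamma_index(ref, evalu, spacing, dose_tol=0.05, dist_tol=1.0):
    """Brute-force global 2D gamma (dose_tol as fraction of max, dist_tol in mm)."""
    ny, nx = ref.shape
    dd = dose_tol * ref.max()                      # global dose criterion
    ys, xs = np.meshgrid(np.arange(ny), np.arange(nx), indexing="ij")
    gamma = np.empty_like(ref, dtype=float)
    for iy in range(ny):
        for ix in range(nx):
            dist2 = ((ys - iy) ** 2 + (xs - ix) ** 2) * spacing ** 2
            ddiff2 = (evalu - ref[iy, ix]) ** 2
            gamma[iy, ix] = np.sqrt((dist2 / dist_tol ** 2 + ddiff2 / dd ** 2).min())
    return gamma

# Hypothetical planned vs film-measured dose on a 0.5 mm grid
rng = np.random.default_rng(0)
planned = 8 * np.exp(-((np.indices((40, 40)) - 20) ** 2).sum(0) / 200)
measured = planned * (1 + 0.02 * rng.standard_normal(planned.shape))
g = gamma_index(planned, measured, spacing=0.5, dose_tol=0.05, dist_tol=1.0)
print(f"pass rate = {100 * (g <= 1).mean():.1f}%")
```

    The double loop is quadratic in the number of pixels; production tools prune the search to a neighborhood of each point, but the acceptance criterion is the same.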

  5. Dosimetric verification of stereotactic radiosurgery/stereotactic radiotherapy dose distributions using Gafchromic EBT3

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cusumano, Davide, E-mail: davide.cusumano@unimi.it; Fumagalli, Maria L.; Marchetti, Marcello

    2015-10-01

    Aim of this study is to examine the feasibility of using the new Gafchromic EBT3 film in a high-dose stereotactic radiosurgery and radiotherapy quality assurance procedure. Owing to the reduced dimensions of the involved lesions, the feasibility of scanning plan verification films on the scanner plate area with the best uniformity rather than using a correction mask was evaluated. For this purpose, signal value dispersion and reproducibility of film scans were investigated. Uniformity was then quantified in the selected area and was found to be within 1.5% for doses up to 8 Gy. A high-dose threshold level for analyses using this procedure was established by evaluating the sensitivity of the irradiated films. Sensitivity was found to be of the order of centigray for doses up to 6.2 Gy, decreasing for higher doses. The obtained results were used to implement a procedure comparing dose distributions delivered with a CyberKnife system to planned ones. The procedure was validated through single-beam irradiation on a Gafchromic film. The agreement between dose distributions was then evaluated for 13 patients (brain lesions, 5 Gy/die, prescription isodose ~80%) using gamma analysis. Results obtained using gamma test criteria of 5%/1 mm show a pass rate of 94.3%. Gamma frequency parameter calculation for EBT3 films was shown to depend strongly on subtraction of unexposed film pixel values from irradiated ones. In the framework of the described dosimetric procedure, EBT3 films proved to be effective in the verification of high doses delivered to lesions with complex shapes and adjacent to organs at risk.

  6. STS-96 crew takes part in payload Interface Verification Test

    NASA Technical Reports Server (NTRS)

    1999-01-01

    In the SPACEHAB Facility, (from left) STS-96 Mission Specialist Julie Payette, Pilot Rick Husband and Mission Specialist Ellen Ochoa learn about the Sequential Shunt Unit (SSU) in front of them from Lynn Ashby (far right), with Johnson Space Center. The STS-96 crew is at KSC for a payload Interface Verification Test (IVT) for their upcoming mission to the International Space Station. Other crew members at KSC for the IVT are Commander Kent Rominger and Mission Specialists Tamara Jernigan, Dan Barry and Valery Tokarev of Russia. The SSU is part of the cargo on Mission STS-96, which carries the SPACEHAB Logistics Double Module, with equipment to further outfit the International Space Station service module and equipment that can be off-loaded from the early U.S. assembly flights. The SPACEHAB carries internal logistics and resupply cargo for station outfitting, plus an external Russian cargo crane to be mounted to the exterior of the Russian station segment and used to perform space-walking maintenance activities. The double module stowage provides capacity of up to 10,000 lbs. with the ability to accommodate powered payloads, four external rooftop stowage locations, four double-rack locations (two powered), up to 61 bulkhead-mounted middeck locker locations, and floor storage for large unique items and Soft Stowage. STS-96 is targeted to launch May 20 about 9:32 a.m.

  7. EXAMINING THE ROLE AND RESEARCH CHALLENGES OF SOCIAL MEDIA AS A TOOL FOR NONPROLIFERATION AND ARMS CONTROL TREATY VERIFICATION

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Henry, Michael J.; Cramer, Nicholas O.; Benz, Jacob M.

    Traditional arms control treaty verification activities typically involve a combination of technical measurements via physical and chemical sensors, state declarations, political agreements, and on-site inspections involving international subject matter experts. However, the ubiquity of the internet, and the electronic sharing of data that it enables, has made available a wealth of open source information with the potential to benefit verification efforts. Open source information is already being used by organizations such as the International Atomic Energy Agency to support the verification of state-declared information, prepare inspectors for in-field activities, and to maintain situational awareness. The recent explosion in social media use has opened new doors to exploring the attitudes, moods, and activities around a given topic. Social media platforms, such as Twitter, Facebook, and YouTube, offer an opportunity for individuals, as well as institutions, to participate in a global conversation at minimal cost. Social media data can also provide a more data-rich environment, with text data being augmented with images, videos, and location data. The research described in this paper investigates the utility of applying social media signatures as potential arms control and nonproliferation treaty verification tools and technologies, as determined through a series of case studies. The treaty relevant events that these case studies touch upon include detection of undeclared facilities or activities, determination of unknown events recorded by the International Monitoring System (IMS), and the global media response to the occurrence of an Indian missile launch. The case studies examine how social media can be used to fill an information gap and provide additional confidence to a verification activity. The case studies represent, either directly or through a proxy, instances where social media information may be available that could potentially augment the evaluation of an event. The goal of this paper is to instigate a discussion within the verification community as to where and how social media can be effectively utilized to complement and enhance traditional treaty verification efforts. In addition, this paper seeks to identify areas of future research and development necessary to adapt social media analytic tools and techniques, and to form the seed for social media analytics to aid and inform arms control and nonproliferation policymakers and analysts. While social media analysis (as well as open source analysis as a whole) will not ever be able to replace traditional arms control verification measures, they do supply unique signatures that can augment existing analysis.

  8. Design, Implementation, and Verification of the Reliable Multicast Protocol. Thesis

    NASA Technical Reports Server (NTRS)

    Montgomery, Todd L.

    1995-01-01

    This document describes the Reliable Multicast Protocol (RMP) design, first implementation, and formal verification. RMP provides a totally ordered, reliable, atomic multicast service on top of an unreliable multicast datagram service. RMP is fully and symmetrically distributed so that no site bears an undue portion of the communications load. RMP provides a wide range of guarantees, from unreliable delivery to totally ordered delivery, to K-resilient, majority resilient, and totally resilient atomic delivery. These guarantees are selectable on a per message basis. RMP provides many communication options, including virtual synchrony, a publisher/subscriber model of message delivery, a client/server model of delivery, mutually exclusive handlers for messages, and mutually exclusive locks. It has been commonly believed that total ordering of messages can only be achieved at great performance expense. RMP discounts this belief. The first implementation of RMP has been shown to provide high throughput performance on Local Area Networks (LANs). For two or more destinations on a single LAN, RMP provides higher throughput than any other protocol that does not use multicast or broadcast technology. The design, implementation, and verification activities of RMP have occurred concurrently. This has allowed the verification to maintain a high fidelity between the design model, implementation model, and verification model. The restrictions of implementation have influenced the design earlier than in normal sequential approaches. The protocol as a whole has matured more smoothly through the inclusion of several different perspectives into the product development.

  9. Thermal System Verification and Model Validation for NASA's Cryogenic Passively Cooled James Webb Space Telescope

    NASA Technical Reports Server (NTRS)

    Cleveland, Paul E.; Parrish, Keith A.

    2005-01-01

    A thorough and unique thermal verification and model validation plan has been developed for NASA's James Webb Space Telescope. The JWST observatory consists of a large deployed aperture optical telescope passively cooled to below 50 Kelvin, along with a suite of several instruments passively and actively cooled to below 37 Kelvin and 7 Kelvin, respectively. Passive cooling to these extremely low temperatures is made feasible by the use of a large deployed high efficiency sunshield and an orbit location at the L2 Lagrange point. Another enabling feature is the scale or size of the observatory that allows for large radiator sizes that are compatible with the expected power dissipation of the instruments and large format Mercury Cadmium Telluride (HgCdTe) detector arrays. This passive cooling concept is simple, reliable, and mission enabling when compared to the alternatives of mechanical coolers and stored cryogens. However, these same large scale observatory features, which make passive cooling viable, also prevent the typical flight configuration fully-deployed thermal balance test that is the keystone of most space missions' thermal verification plans. JWST is simply too large in its deployed configuration to be properly thermal balance tested in the facilities that currently exist. This reality, when combined with a mission thermal concept with little to no flight heritage, has necessitated a unique and alternative approach to thermal system verification and model validation. This paper describes the thermal verification and model validation plan that has been developed for JWST. The plan relies on judicious use of cryogenic and thermal design margin, a completely independent thermal modeling cross check utilizing different analysis teams and software packages, and finally, a comprehensive set of thermal tests that occur at different levels of JWST assembly. After a brief description of the JWST mission and thermal architecture, a detailed description of the three aspects of the thermal verification and model validation plan is presented.

  10. Evolutionary Story of a Satellite DNA from Phodopus sungorus (Rodentia, Cricetidae)

    PubMed Central

    Paço, Ana; Adega, Filomena; Meštrović, Nevenka; Plohl, Miroslav; Chaves, Raquel

    2014-01-01

    With the goal of contributing to the understanding of satellite DNA evolution and its genomic involvement, in this work the first satellite DNA (PSUcentSat) from Phodopus sungorus (Cricetidae) was isolated and characterized. Physical mapping of this sequence in P. sungorus showed large PSUcentSat arrays located at the heterochromatic (peri)centromeric region of five autosomal pairs and the Y-chromosome. The presence of orthologous PSUcentSat sequences in the genomes of other Cricetidae and Muridae rodents was also verified; these, however, presented an interspersed chromosomal distribution. This distribution pattern suggests a PSUcentSat-scattered location in an ancestor of the Muridae/Cricetidae families, which afterwards, in the descendant genome of P. sungorus, assumed a localization restricted to few chromosomes in the (peri)centromeric region. We believe that after the divergence of the studied species, PSUcentSat was most probably highly amplified in the (peri)centromeric region of some chromosome pairs of this hamster by recombinational mechanisms. The bouquet chromosome configuration (prophase I) possibly plays an important role in this selective amplification, providing physical proximity of centromeric regions between chromosomes of similar size and/or morphology. This seems particularly evident for the acrocentric chromosomes of P. sungorus (including the Y-chromosome), all presenting large PSUcentSat arrays at the (peri)centromeric region. The conservation of this sequence in the studied genomes and its (peri)centromeric amplification in P. sungorus strongly suggests functional significance, with this satellite family possibly performing different functions in the different genomes. The verification of PSUcentSat transcriptional activity in normal proliferative cells suggests that its transcription is not stage-limited, as described for some other satellites. PMID:25336681

  11. Department of National Defence's use of thermography for facilities maintenance

    NASA Astrophysics Data System (ADS)

    Kittson, John E.

    1990-03-01

    Since the late seventies, DND, through the Director General Works, has been actively encouraging the use of thermography as an efficient and effective technique for supporting preventive maintenance, quality assurance, and energy conservation programs at Canadian Forces Bases (CFBs). This paper will provide an overview of DND's experiences in the utilization of thermography for facilities maintenance applications. The following are milestones of DND's use of thermography: (a) purchase of infrared equipment: in 1976/77 DND purchased five AGA 750 Infrared Thermovision Systems, which were distributed to commands; in 1980/81/82 six AGA 110s, five AGA TPT80s, two AGA 782s, and one AGA 720 were acquired; finally, DND also purchased seven AGEMA 870 systems during 1987/88. (b) first and second interdepartmental building thermography courses: in 1978 and 1980 DND hosted two building thermography courses that were conducted by Public Works Canada. (c) CE thermographer specialist training courses: DND developed a training standard in 1983 for Construction Engineering (CE) Thermographer qualification, which included all CE applications of thermography; the first annual in-house training course was conducted at CFB Borden, Ontario, in 1984, and these are now being conducted at the CFB Chilliwack Detachment in Vernon, British Columbia. Of paramount importance for successfully developing DND appreciation for thermography was providing familiarization training to CE staff at commands and bases. These three-day presentations emphasized motivational factors, conducting thermographic surveys, and utilizing infrared data on roofs, electrical/mechanical systems, heating plants, steam distribution, and building enclosures. These factors consisted mainly of the following objectives: (a) preventive maintenance, by locating deficiencies to be repaired; (b) quality assurance, by verification of workmanship, materials, and design; and (c) energy conservation, by locating heat loss areas.

  12. A representative survey of indoor radon in the sixteen regions in Mexico City.

    PubMed

    Espinosa, G; Gammage, R B

    2003-01-01

    Mexico City, also called Federal District, covers an area of 1504 km(2), and has more than 8 million inhabitants. It is located more than 2200 m above sea level in a zone of high seismic activity, and founded on an ancient lake. At present it is one of the most crowded and contaminated cities in the world, with thermal inversions. Chemical contaminants and aerosol particles in the environmental air are high most of the year. Due to these geological, environmental and socioeconomic conditions, Federal District presents very peculiar characteristics, which are important for understanding the distribution and measurements of indoor radon concentration. In this work the results of 3 years (1998-2000) of measurements of indoor radon levels in the Federal District are presented. For the detector distribution and measurements, the actual political administrative divisions of the Federal District, consisting of 16 very well defined zones, were used. Nuclear track detection methodology was selected for the measurements, using a passive closed-end-cup device with CR-39 (Lantrack) polycarbonate as the detection material and one-step chemical etching, following a very well established protocol developed at the Instituto de Física, UNAM. Calibration was carried out at the Oak Ridge National Laboratory, and verification at the Instituto de Física chamber. The results show that the arithmetical mean values of the indoor radon concentration for each region of the Federal District follow a non-homogeneous distribution.

  13. All-digital radar architecture

    NASA Astrophysics Data System (ADS)

    Molchanov, Pavlo A.

    2014-10-01

    An all-digital radar architecture requires the elimination of the mechanical scan system. A phased antenna array is necessarily large because the array elements must be co-located to very precise dimensions, and it needs a high-accuracy phase-processing system to aggregate and distribute T/R module data to/from the antenna elements. Even a phased array cannot provide a wide field of view. A new, nature-inspired, all-digital radar architecture is proposed. The fly's eye consists of multiple angularly spaced sensors, giving the fly simultaneously the wide-area visual coverage it needs to detect and avoid the threats around it. A fly-eye radar antenna array consists of multiple directional antennas loosely distributed along the perimeter of a ground vehicle or aircraft and coupled with receiving/transmitting front-end modules connected by a digital interface to a central processor. A non-steering antenna array allows creating an all-digital radar with an extremely flexible architecture. The fly-eye radar architecture provides wide possibilities for digital modulation and different waveform generation. Simultaneous correlation and integration of thousands of signals per second from each point of the surveillance area allows not only detection of low-level signals (low-profile targets), but also helps to recognize and classify signals (targets) by using diverse signals, polarization modulation, and intelligent processing. The proposed all-digital radar architecture with a distributed directional antenna array can provide a 3D space vector to a jammer by verifying the direction of arrival of signal sources and, as a result, jam/spoof protection not only for radar systems, but also for communication systems and any navigation constellation system, for both encrypted and unencrypted signals, and for an unlimited number of closely positioned jammers.

  14. Electromagnetic head-and-neck hyperthermia applicator: experimental phantom verification and FDTD model.

    PubMed

    Paulides, Margarethus M; Bakker, Jurriaan F; van Rhoon, Gerard C

    2007-06-01

    To experimentally verify the feasibility of focused heating in the neck region by an array of two rings of six electromagnetic antennas. We also measured the dynamic specific absorption rate (SAR) steering possibilities of this setup and compared these SAR patterns to simulations. Using a specially constructed laboratory prototype head-and-neck applicator, including a neck-mimicking cylindrical muscle phantom, we performed SAR measurements by electric field, Schottky-diode sheet measurements and, using the power-pulse technique, by fiberoptic thermometry and infrared thermography. Using phase steering, we also steered the SAR distribution in radial and axial directions. All measured distributions were compared with the predictions by a finite-difference time-domain-based electromagnetic simulator. A central 50% iso-SAR focus of 35 +/- 3 mm in diameter and about 100 +/- 15 mm in length was obtained for all investigated settings. Furthermore, this SAR focus could be steered toward the desired location in the radial and axial directions with an accuracy of approximately 5 mm. The SAR distributions as measured by all three experimental methods were well predicted by the simulations. The results of our study have shown that focused heating in the neck is feasible and that this focus can be effectively steered in the radial and axial directions. For quality assurance measurements, we believe that the Schottky-diode sheet provides the best compromise among effort, speed, and accuracy, although a more specific and improved design is warranted.

  15. Verification of combined thermal-hydraulic and heat conduction analysis code FLOWNET/TRUMP

    NASA Astrophysics Data System (ADS)

    Maruyama, Soh; Fujimoto, Nozomu; Kiso, Yoshihiro; Murakami, Tomoyuki; Sudo, Yukio

    1988-09-01

    This report presents the verification results of the combined thermal-hydraulic and heat conduction analysis code FLOWNET/TRUMP, which has been utilized for the core thermal-hydraulic design, especially for the analysis of flow distribution among fuel block coolant channels, the determination of thermal boundary conditions for fuel block stress analysis, and the estimation of fuel temperature in the case of a fuel block coolant channel blockage accident, in the design of the High Temperature Engineering Test Reactor (HTTR), which the Japan Atomic Energy Research Institute has been planning to construct in order to establish basic technologies for future advanced very high temperature gas-cooled reactors and to serve as an irradiation test reactor for promotion of innovative high temperature new frontier technologies. The verification of the code was done through comparison between the analytical results and experimental results of the Helium Engineering Demonstration Loop Multi-channel Test Section (HENDEL T1-M) with simulated fuel rods and fuel blocks.

  16. In vivo verification of proton beam path by using post-treatment PET/CT imaging.

    PubMed

    Hsi, Wen C; Indelicato, Daniel J; Vargas, Carlos; Duvvuri, Srividya; Li, Zuofeng; Palta, Jatinder

    2009-09-01

    The purpose of this study is to establish the in vivo verification of proton beam path by using proton-activated positron emission distributions. A total of 50 PET/CT imaging studies were performed on ten prostate cancer patients immediately after daily proton therapy treatment through a single lateral portal. The PET/CT and planning CT were registered by matching the pelvic bones, and the beam path of delivered protons was defined in vivo by the positron emission distribution seen only within the pelvic bones, referred to as the PET-defined beam path. Because of the patient position correction at each fraction, the marker-defined beam path, determined by the centroid of implanted markers seen in the posttreatment (post-Tx) CT, is used for the planned beam path. The angular variation and discordance between the PET- and marker-defined paths were derived to investigate the intrafraction prostate motion. For studies with large discordance, the relative location between the centroid and pelvic bones seen in the post-Tx CT was examined. The PET/CT studies are categorized to distinguish prostate motion that occurred before or after beam delivery. The post-PET CT was acquired after PET imaging to investigate prostate motion due to physiological changes during the extended PET acquisition. The less than 2° of angular variation indicates that the patient roll was minimal within the immobilization device. Thirty of the 50 studies with small discordance, referred to as good cases, show a consistent alignment between the field edges and the positron emission distributions from the entrance to the distal edge. For those good cases, average displacements are 0.6 and 1.3 mm along the anterior-posterior (D(AP)) and superior-inferior (D(SI)) directions, respectively, with 1.6 mm standard deviations in both directions. For the remaining 20 studies demonstrating a large discordance (more than 6 mm in either D(AP) or D(SI)), 13 studies, referred to as motion-after-Tx cases, also show large misalignment between the field edge and the positron emission distribution in lipomatous tissues around the prostate. These motion-after-Tx cases correspond to patients with large changes in volume of rectal gas between the post-Tx and the post-PET CTs. The standard deviations for D(AP) and D(SI) are 5.0 and 3.0 mm, respectively, for these motion-after-Tx cases. The final seven studies, referred to as position-error cases, which had a large discordance but no misalignment, were found to have deviations of 4.6 and 3.6 mm in D(AP) and D(SI), respectively. The position-error cases correspond to a large discrepancy in the relative location between the centroid and pelvic bones seen in the post-Tx CT and recorded x-ray radiographs. Systematic analyses of proton-activated positron emission distributions provide patient-specific information on prostate motion (σ_M) and patient position variability (σ_p) during daily proton beam delivery. The less than 2 mm of displacement variations in the good cases indicates that the population-based values of σ_p and σ_M used in margin algorithms for treatment planning at the authors' institution are valid for the majority of cases. However, a small fraction of PET/CT studies (approximately 14%) with ~4 mm displacement variations may require different margins. Such data are useful in establishing patient-specific planning target volume margins.
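
    The case classification above reduces to simple thresholding of per-study displacements; a toy sketch using the study's 6 mm discordance cutoff (the displacement values are hypothetical, not the paper's data):

```python
import numpy as np

# Hypothetical per-fraction displacements between PET- and marker-defined
# beam paths [mm]
d_ap = np.array([0.4, 1.1, -0.8, 7.2, 0.9, -6.5, 1.6])
d_si = np.array([1.0, -0.6, 1.8, 3.1, 0.2, 7.4, -1.2])

# Large discordance: more than 6 mm in either direction
large = (np.abs(d_ap) > 6) | (np.abs(d_si) > 6)
good_ap, good_si = d_ap[~large], d_si[~large]
print(f"good cases: {np.sum(~large)}, "
      f"D_AP = {good_ap.mean():.1f} +/- {good_ap.std(ddof=1):.1f} mm, "
      f"D_SI = {good_si.mean():.1f} +/- {good_si.std(ddof=1):.1f} mm")
```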

  17. Online 3D EPID-based dose verification: Proof of concept

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Spreeuw, Hanno; Rozendaal, Roel, E-mail: r.rozenda

    Purpose: Delivery errors during radiotherapy may lead to medical harm and reduced life expectancy for patients. Such serious incidents can be avoided by performing dose verification online, i.e., while the patient is being irradiated, creating the possibility of halting the linac in case of a large overdosage or underdosage. The offline EPID-based 3D in vivo dosimetry system clinically employed at our institute is in principle suited for online treatment verification, provided the system is able to complete 3D dose reconstruction and verification within 420 ms, the present acquisition time of a single EPID frame. It is the aim of this study to show that our EPID-based dosimetry system can be made fast enough to achieve online 3D in vivo dose verification. Methods: The current dose verification system was sped up in two ways. First, a new software package was developed to perform all computations that are not dependent on portal image acquisition separately, thus removing the need for doing these calculations in real time. Second, the 3D dose reconstruction algorithm was sped up via a new, multithreaded implementation. Dose verification was implemented by comparing planned with reconstructed 3D dose distributions delivered to two regions in a patient: the target volume and the nontarget volume receiving at least 10 cGy. In both volumes, the mean dose is compared, while in the nontarget volume, the near-maximum dose (D2) is compared as well. The real-time dosimetry system was tested by irradiating an anthropomorphic phantom with three VMAT plans: a 6 MV head-and-neck treatment plan, a 10 MV rectum treatment plan, and a 10 MV prostate treatment plan. In all plans, two types of serious delivery errors were introduced. The functionality of automatically halting the linac was also implemented and tested. Results: The precomputation time per treatment was ∼180 s/treatment arc, depending on gantry angle resolution. The complete processing of a single portal frame, including dose verification, took 266 ± 11 ms on a dual octocore Intel Xeon E5-2630 CPU running at 2.40 GHz. The introduced delivery errors were detected after 5–10 s irradiation time. Conclusions: A prototype online 3D dose verification tool using portal imaging has been developed and successfully tested for two different kinds of gross delivery errors. Thus, online 3D dose verification has been technologically achieved.
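
    The two volume metrics named above are straightforward to compute from dose arrays: the mean dose in each region, and D2, the near-maximum dose, conventionally the dose exceeded by 2% of the volume, i.e. the 98th percentile. A minimal sketch on synthetic samples (not the study's data):

```python
import numpy as np

# Hypothetical planned vs reconstructed dose samples [cGy] in the
# nontarget volume receiving at least 10 cGy
rng = np.random.default_rng(1)
planned_ntv = rng.normal(50, 10, 5000).clip(10)
recon_ntv = planned_ntv * 1.03            # 3% systematic overdosage

d2_plan = np.percentile(planned_ntv, 98)  # near-maximum dose D2
d2_recon = np.percentile(recon_ntv, 98)
print(f"mean dose diff: {recon_ntv.mean() - planned_ntv.mean():.1f} cGy")
print(f"D2 diff: {d2_recon - d2_plan:.1f} cGy")
```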

  18. Initial study and verification of a distributed fiber optic corrosion monitoring system for transportation structures.

    DOT National Transportation Integrated Search

    2012-07-01

    For this study, a novel optical fiber sensing system was developed and tested for the monitoring of corrosion in : transportation systems. The optical fiber sensing system consists of a reference long period fiber gratings (LPFG) sensor : for corrosi...

  19. Investigation of Trends in Aerosol Direct Radiative Effects over North America Using a Coupled Meteorology-Chemistry Model

    EPA Science Inventory

    A comprehensive investigation of the processes regulating tropospheric aerosol distributions, their optical properties, and their radiative effects in conjunction with verification of their simulated radiative effects for past conditions relative to measurements is needed in orde...

  20. Southern California Edison Grid Integration Evaluation: Cooperative Research and Development Final Report, CRADA Number CRD-10-376

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mather, Barry

    2015-07-09

    The objective of this project is to use field verification to improve DOE’s ability to model and understand the impacts of, as well as develop solutions for, high penetration PV deployments in electrical utility distribution systems. The Participant will work with NREL to assess the existing distribution system at SCE facilities and assess adding additional PV systems into the electric power system.

  1. Autonomy Community of Interest (COI) Test and Evaluation, Verification and Validation (TEVV) Working Group: Technology Investment Strategy 2015-2018

    DTIC Science & Technology

    2015-05-01

    In the past decade, unmanned systems have significantly impacted warfare... environments at a speed and scale beyond manned capability. However, current unmanned systems operate with minimal autonomy. To meet warfighter needs and

  2. A New On-Line Diagnosis Protocol for the SPIDER Family of Byzantine Fault Tolerant Architectures

    NASA Technical Reports Server (NTRS)

    Geser, Alfons; Miner, Paul S.

    2004-01-01

    This paper presents the formal verification of a new protocol for online distributed diagnosis for the SPIDER family of architectures. An instance of the Scalable Processor-Independent Design for Electromagnetic Resilience (SPIDER) architecture consists of a collection of processing elements communicating over a Reliable Optical Bus (ROBUS). The ROBUS is a specialized fault-tolerant device that guarantees Interactive Consistency, Distributed Diagnosis (Group Membership), and Synchronization in the presence of a bounded number of physical faults. Formal verification of the original SPIDER diagnosis protocol provided a detailed understanding that led to the discovery of a significantly more efficient protocol. The original protocol was adapted from the formally verified protocol used in the MAFT architecture. It required O(N) message exchanges per defendant to correctly diagnose failures in a system with N nodes. The new protocol achieves the same diagnostic fidelity, but only requires O(1) exchanges per defendant. This paper presents this new diagnosis protocol and a formal proof of its correctness using PVS.

  3. Structural Deterministic Safety Factors Selection Criteria and Verification

    NASA Technical Reports Server (NTRS)

    Verderaime, V.

    1992-01-01

    Though current deterministic safety factors are arbitrarily and unaccountably specified, their ratios are rooted in resistive and applied stress probability distributions. This study approached the deterministic method from a probabilistic concept, leading to a more systematic and coherent philosophy and criterion for designing more uniform and reliable high-performance structures. The deterministic method was noted to consist of three safety factors: a standard deviation multiplier of the applied stress distribution; a K-factor for the A- or B-basis material ultimate stress; and the conventional safety factor to ensure that the applied stress does not operate in the inelastic zone of metallic materials. The conventional safety factor is specifically defined as the ratio of ultimate-to-yield stresses. A deterministic safety index of the combined safety factors was derived, from which the corresponding reliability proved that the deterministic method is not reliability sensitive. The bases for selecting safety factors are presented and verification requirements are discussed. The suggested deterministic approach is applicable to all NASA, DOD, and commercial high-performance structures under static stresses.
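
    Under the usual normal-distribution assumption for resistive (R) and applied (S) stress, a safety index and its implied reliability follow directly from the combined factors; a minimal sketch with hypothetical values (not the report's numbers):

```python
from scipy.stats import norm

# Applied stress distribution [MPa] (hypothetical)
mu_s, sigma_s = 300.0, 30.0
k_applied = 2.0        # standard-deviation multiplier on applied stress
sf_ultimate = 1.2      # conventional ultimate-to-yield safety factor

# Resistive stress implied by stacking the factors (assumed scatter)
mu_r = (mu_s + k_applied * sigma_s) * sf_ultimate
sigma_r = 40.0

# Safety index beta and the reliability it implies
beta = (mu_r - mu_s) / (sigma_r**2 + sigma_s**2) ** 0.5
print(f"beta = {beta:.2f}, reliability = {norm.cdf(beta):.5f}")
```

    Because the same factor stack can yield different betas for different scatter ratios, the reliability implied by a fixed deterministic factor is not constant, which is the sense in which the deterministic method is not reliability sensitive.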

  4. Advanced Plant Habitat Test Harvest

    NASA Image and Video Library

    2017-08-24

    John "JC" Carver, a payload integration engineer with NASA Kennedy Space Center's Test and Operations Support Contract, uses a FluorPen to measure the chlorophyll fluorescence of Arabidopsis thaliana plants inside the growth chamber of the Advanced Plant Habitat (APH) Flight Unit No. 1. Half the plants were then harvested. The harvest is part of an ongoing verification test of the APH unit, which is located inside the International Space Station Environmental Simulator in Kennedy's Space Station Processing Facility. The APH undergoing testing at Kennedy is identical to one on the station and uses red, green and broad-spectrum white LED lights to grow plants in an environmentally controlled chamber. The seeds grown during the verification test will be grown on the station to help scientists understand how these plants adapt to spaceflight.

  5. A comparison of two prompt gamma imaging techniques with collimator-based cameras for range verification in proton therapy

    NASA Astrophysics Data System (ADS)

    Lin, Hsin-Hon; Chang, Hao-Ting; Chao, Tsi-Chian; Chuang, Keh-Shih

    2017-08-01

    In vivo range verification plays an important role in proton therapy to fully utilize the benefits of the Bragg peak (BP) for delivering a high radiation dose to the tumor while sparing normal tissue. For accurately locating the position of the BP, cameras equipped with collimators (multi-slit and knife-edge) to image prompt gammas (PG) emitted along the proton tracks in the patient have been proposed for range verification. The aim of this work is to compare the performance of the multi-slit collimator and the knife-edge collimator for non-invasive proton beam range verification. PG imaging was simulated by a validated GATE/GEANT4 Monte Carlo code to model the spot-scanning proton therapy and a cylindrical PMMA phantom in detail. For each spot, 10^8 protons were simulated. To investigate the correlation between the acquired PG profile and the proton range, the falloff regions of the PG profiles were fitted with a 3-line-segment curve function as the range estimate. Factors that may influence the accuracy of range detection, including the energy window setting, proton energy, phantom size, and phantom shift, were studied. Results indicated that both collimator systems achieve reasonable accuracy and good response to the phantom shift. The accuracy of the range predicted by the multi-slit collimator system is less affected by the proton energy, while the knife-edge collimator system can achieve higher detection efficiency, leading to a smaller deviation in range prediction. We conclude that both collimator systems have potential for accurate range monitoring in proton therapy. It is noted that neutron contamination has a marked impact on the range prediction of the two systems, especially the multi-slit system. Therefore, a neutron reduction technique is needed to improve the accuracy of range verification in proton therapy.
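
    The 3-line-segment fit of the falloff can be prototyped as a plateau / linear-falloff / baseline model; a sketch on a synthetic PG depth profile (assumed shape and noise, not the simulated data, and only one of several possible parameterizations):

```python
import numpy as np
from scipy.optimize import curve_fit

def three_segment(z, z1, width, y_hi, y_lo):
    """Plateau y_hi up to z1, linear falloff over `width`, baseline y_lo after."""
    z2 = z1 + abs(width) + 1e-9
    return np.interp(z, [z1, z2], [y_hi, y_lo])

# Hypothetical PG depth profile (arbitrary units) with a falloff near 120 mm
z = np.linspace(0, 200, 201)
rng = np.random.default_rng(2)
profile = three_segment(z, 110, 20, 1.0, 0.1) + 0.02 * rng.standard_normal(z.size)

popt, _ = curve_fit(three_segment, z, profile, p0=[100, 30, 1.0, 0.1])
print(f"range estimate (falloff midpoint): {popt[0] + abs(popt[1]) / 2:.1f} mm")
```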

  6. Certification of NIST Room Temperature Low-Energy and High-Energy Charpy Verification Specimens

    PubMed Central

    Lucon, Enrico; McCowan, Chris N.; Santoyo, Ray L.

    2015-01-01

    The possibility for NIST to certify Charpy reference specimens for testing at room temperature (21 °C ± 1 °C) instead of −40 °C was investigated by performing 130 room-temperature tests from five low-energy and four high-energy lots of steel on the three master Charpy machines located in Boulder, CO. The statistical analyses performed show that in most cases the variability of results (i.e., the experimental scatter) is reduced when testing at room temperature. For eight out of the nine lots considered, the observed variability was lower at 21 °C than at −40 °C. The results of this study will allow NIST to satisfy requests for room-temperature Charpy verification specimens that have been received from customers for several years: testing at 21 °C removes from the verification process the operator’s skill in transferring the specimen in a timely fashion from the cooling bath to the impact position, and puts the focus back on the machine performance. For NIST, it also reduces the time and cost for certifying new verification lots. For one of the low-energy lots tested with a C-shaped hammer, we experienced two specimens jamming, which yielded unusually high values of absorbed energy. For both specimens, the signs of jamming were clearly visible. For all the low-energy lots investigated, jamming is slightly more likely to occur at 21 °C than at −40 °C, since at room temperature low-energy samples tend to remain in the test area after impact rather than exiting in the opposite direction of the pendulum swing. In the evaluation of a verification set, any jammed specimen should be removed from the analyses. PMID:26958453

  7. Certification of NIST Room Temperature Low-Energy and High-Energy Charpy Verification Specimens.

    PubMed

    Lucon, Enrico; McCowan, Chris N; Santoyo, Ray L

    2015-01-01

    The possibility for NIST to certify Charpy reference specimens for testing at room temperature (21 °C ± 1 °C) instead of -40 °C was investigated by performing 130 room-temperature tests from five low-energy and four high-energy lots of steel on the three master Charpy machines located in Boulder, CO. The statistical analyses performed show that in most cases the variability of results (i.e., the experimental scatter) is reduced when testing at room temperature. For eight out of the nine lots considered, the observed variability was lower at 21 °C than at -40 °C. The results of this study will allow NIST to satisfy requests for room-temperature Charpy verification specimens that have been received from customers for several years: testing at 21 °C removes from the verification process the operator's skill in transferring the specimen in a timely fashion from the cooling bath to the impact position, and puts the focus back on the machine performance. For NIST, it also reduces the time and cost for certifying new verification lots. For one of the low-energy lots tested with a C-shaped hammer, we experienced two specimens jamming, which yielded unusually high values of absorbed energy. For both specimens, the signs of jamming were clearly visible. For all the low-energy lots investigated, jamming is slightly more likely to occur at 21 °C than at -40 °C, since at room temperature low-energy samples tend to remain in the test area after impact rather than exiting in the opposite direction of the pendulum swing. In the evaluation of a verification set, any jammed specimen should be removed from the analyses.
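
    The variability comparison underlying both versions of this record is a two-sample dispersion comparison between the two test temperatures; the abstracts do not name the test used, but an F ratio of sample variances is one simple option, sketched here with hypothetical absorbed energies rather than NIST's certification data:

```python
import numpy as np
from scipy.stats import f

# Hypothetical absorbed-energy results [J] for one lot at the two temperatures
e_room = np.array([16.1, 15.8, 16.4, 15.9, 16.2, 16.0, 15.7, 16.3])  # 21 C
e_cold = np.array([15.5, 16.9, 15.2, 16.6, 16.1, 15.0, 16.8, 15.6])  # -40 C

ratio = e_cold.var(ddof=1) / e_room.var(ddof=1)    # F statistic
p = f.sf(ratio, len(e_cold) - 1, len(e_room) - 1)  # one-sided p-value
print(f"F = {ratio:.2f}, p = {p:.3f}  (F > 1: smaller scatter at 21 C)")
```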

  8. Polymer gel dosimeters for pretreatment radiotherapy verification using the three-dimensional gamma evaluation and pass rate maps.

    PubMed

    Hsieh, Ling-Ling; Shieh, Jiunn-I; Wei, Li-Ju; Wang, Yi-Chun; Cheng, Kai-Yuan; Shih, Cheng-Ting

    2017-05-01

    Polymer gel dosimeters (PGDs) have been widely studied for use in the pretreatment verification of clinical radiation therapy. However, the readability of PGDs in three-dimensional (3D) dosimetry remains unclear. In this study, pretreatment verifications of clinical radiation therapy were performed using an N-isopropyl-acrylamide (NIPAM) PGD, and the results were used to evaluate the performance of the NIPAM PGD for 3D dose measurement. A gel phantom was used to measure the dose distribution of a clinical case of intensity-modulated radiation therapy. Magnetic resonance imaging scans were performed for dose readouts. The measured dose volumes were compared with the planned dose volume. The relative volume histograms showed that relative volumes with a negative percent dose difference decreased as time elapsed. Furthermore, the histograms revealed few changes after 24 h postirradiation. For the 3%/3 mm and 2%/2 mm criteria, the pass rates of the 12- and 24-h dose volumes were higher than 95%, respectively. This study thus concludes that the pass rate map can be used to evaluate the dose-temporal readability of PGDs and that the NIPAM PGD can be used for clinical pretreatment verifications. Copyright © 2017 Associazione Italiana di Fisica Medica. Published by Elsevier Ltd. All rights reserved.

  9. Using Colored Stochastic Petri Net (CS-PN) software for protocol specification, validation, and evaluation

    NASA Technical Reports Server (NTRS)

    Zenie, Alexandre; Luguern, Jean-Pierre

    1987-01-01

    The specification, verification, validation, and evaluation steps that make up the CS-PN software are outlined. The colored stochastic Petri net software is applied to a Wound/Wait protocol decomposable into two principal modules: a request, or couple (transaction, granule), treatment module and a wound treatment module. Each module is specified, verified, validated, and then evaluated separately, to deduce a verification, validation, and evaluation of the complete protocol. The colored stochastic Petri net tool is shown to be a natural extension of the stochastic tool, adapted to distributed systems and protocols, because the color conveniently takes into account the numerous sites, transactions, granules, and messages.
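
    As a minimal illustration of the Petri-net formalism such tools build on (a generic token-game sketch with hypothetical place and transition names, not the CS-PN tool or the actual Wound/Wait model):

```python
# Each transition maps input places to token counts consumed and
# output places to token counts produced.
net = {
    "grant":   ({"request": 1, "granule_free": 1}, {"granted": 1}),
    "release": ({"granted": 1}, {"granule_free": 1}),
}
marking = {"request": 2, "granule_free": 1, "granted": 0}

def enabled(t):
    """A transition is enabled when every input place holds enough tokens."""
    return all(marking[p] >= n for p, n in net[t][0].items())

def fire(t):
    """Firing consumes input tokens and produces output tokens."""
    assert enabled(t), f"{t} not enabled"
    for p, n in net[t][0].items():
        marking[p] -= n
    for p, n in net[t][1].items():
        marking[p] += n

fire("grant"); fire("release"); fire("grant")
print(marking)   # reachable marking after three firings
```

    Verification explores the markings reachable by such firings; the colored, stochastic extension attaches data (sites, transactions, granules) to tokens and rates to transitions.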

  10. Non-invasive mapping of bilateral motor speech areas using navigated transcranial magnetic stimulation and functional magnetic resonance imaging.

    PubMed

    Könönen, Mervi; Tamsi, Niko; Säisänen, Laura; Kemppainen, Samuli; Määttä, Sara; Julkunen, Petro; Jutila, Leena; Äikiä, Marja; Kälviäinen, Reetta; Niskanen, Eini; Vanninen, Ritva; Karjalainen, Pasi; Mervaala, Esa

    2015-06-15

    Navigated transcranial magnetic stimulation (nTMS) is a modern precise method to activate and study cortical functions noninvasively. We hypothesized that a combination of nTMS and functional magnetic resonance imaging (fMRI) could clarify the localization of functional areas involved with motor control and production of speech. Navigated repetitive TMS (rTMS) with short bursts was used to map speech areas on both hemispheres by inducing speech disruption during number recitation tasks in healthy volunteers. Two experienced video reviewers, blinded to the stimulated area, graded each trial offline according to possible speech disruption. The locations of speech-disrupting nTMS trials were overlaid with fMRI activations of a word generation task. Speech disruptions were produced on both hemispheres by nTMS, though there were more disruptive stimulation sites on the left hemisphere. The grade of the disruptions varied from subjective sensation to mild objectively recognizable disruption up to total speech arrest. The distribution of locations in which speech disruptions could be elicited varied among individuals. On the left hemisphere the locations of disturbing rTMS bursts with reviewers' verification followed the areas of fMRI activation. A similar pattern was not observed on the right hemisphere. The reviewer-verified speech disruptions induced by nTMS provided clinically relevant information, and fMRI might further explain the function of the cortical area. nTMS and fMRI complement each other, and their combination should be advocated when assessing individual localization of the speech network. Copyright © 2015 Elsevier B.V. All rights reserved.

  11. Environmental Technology Verification Report: Climate Energy freewatt™ Micro-Combined Heat and Power System

    EPA Science Inventory

    The EPA GHG Center collaborated with the New York State Energy Research and Development Authority (NYSERDA) to evaluate the performance of the Climate Energy freewatt Micro-Combined Heat and Power System. The system is a reciprocating internal combustion (IC) engine distributed e...

  12. 7 CFR 62.210 - Denial, suspension, or cancellation of service.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 7 Agriculture 3 2010-01-01 2010-01-01 false Denial, suspension, or cancellation of service. 62.210...) Quality Systems Verification Programs Definitions Service § 62.210 Denial, suspension, or cancellation of...) Accurately represent the eligibility of agricultural products or services distributed under an approved...

  13. 7 CFR 62.210 - Denial, suspension, or cancellation of service.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... 7 Agriculture 3 2011-01-01 2011-01-01 false Denial, suspension, or cancellation of service. 62.210...) Quality Systems Verification Programs Definitions Service § 62.210 Denial, suspension, or cancellation of...) Accurately represent the eligibility of agricultural products or services distributed under an approved...

  14. Digital Video of Live-Scan Fingerprint Data

    National Institute of Standards and Technology Data Gateway

    NIST Digital Video of Live-Scan Fingerprint Data (PC database for purchase)   NIST Special Database 24 contains MPEG-2 (Moving Picture Experts Group) compressed digital video of live-scan fingerprint data. The database is being distributed for use in developing and testing of fingerprint verification systems.

  15. 8 CFR 274a.2 - Verification of identity and employment authorization.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... card issued by the Commonwealth of the Northern Mariana Islands. (2) [Reserved] (vi) Special rules for... production was made. If Forms I-9 are kept at another location, the person or entity must inform the officer... present the Forms I-9, any officer listed in 8 CFR 287.4 may compel production of the Forms I-9 and any...

  16. 40 CFR 1066.225 - Roll runout and diameter verification procedure.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... section. (2) Measure roll diameter using a Pi Tape®. Orient the Pi Tape® to the marker line at the desired measurement location with the Pi Tape® hook pointed outward. Temporarily secure the Pi Tape® to the roll near the hook end with adhesive tape. Slowly turn the roll, wrapping the Pi Tape® around the roll surface...

  17. 40 CFR 1066.225 - Roll runout and diameter verification procedure.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ...) Measure roll diameter using a Pi Tape®. Orient the Pi Tape® to the marker line at the desired measurement location with the Pi Tape® hook pointed outward. Temporarily secure the Pi Tape® to the roll near the hook end with adhesive tape. Slowly turn the roll, wrapping the Pi Tape® around the roll surface. Ensure...

  18. 40 CFR 1066.225 - Roll runout and diameter verification procedure.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... section. (2) Measure roll diameter using a Pi Tape®. Orient the Pi Tape® to the marker line at the desired measurement location with the Pi Tape® hook pointed outward. Temporarily secure the Pi Tape® to the roll near the hook end with adhesive tape. Slowly turn the roll, wrapping the Pi Tape® around the roll surface...

  19. Payload specialist station study. Part 2: CEI specifications (part 1). [space shuttles

    NASA Technical Reports Server (NTRS)

    1976-01-01

    The performance, design, and verification specifications are established for the multifunction display system (MFDS) to be located at the payload station in the shuttle orbiter aft flight deck. The system provides the display units (with video, alphanumeric, and graphics capabilities), the associated electronic units, and the keyboards in support of the payload-dedicated controls and displays concept.

  20. Crack Detection with Lamb Wave Wavenumber Analysis

    NASA Technical Reports Server (NTRS)

    Tian, Zhenhua; Leckey, Cara; Rogge, Matt; Yu, Lingyu

    2013-01-01

    In this work, we present our study of Lamb wave crack detection using wavenumber analysis. The aim is to demonstrate the application of wavenumber analysis to 3D Lamb wave data to enable damage detection. The 3D wavefields (including vx, vy and vz components) in time-space domain contain a wealth of information regarding the propagating waves in a damaged plate. For crack detection, three wavenumber analysis techniques are used: (i) two dimensional Fourier transform (2D-FT) which can transform the time-space wavefield into frequency-wavenumber representation while losing the spatial information; (ii) short space 2D-FT which can obtain the frequency-wavenumber spectra at various spatial locations, resulting in a space-frequency-wavenumber representation; (iii) local wavenumber analysis which can provide the distribution of the effective wavenumbers at different locations. All of these concepts are demonstrated through a numerical simulation example of an aluminum plate with a crack. The 3D elastodynamic finite integration technique (EFIT) was used to obtain the 3D wavefields, of which the vz (out-of-plane) wave component is compared with the experimental measurement obtained from a scanning laser Doppler vibrometer (SLDV) for verification purposes. The experimental and simulated results are found to be in close agreement. The application of wavenumber analysis on 3D EFIT simulation data shows the effectiveness of the analysis for crack detection. Keywords: Lamb wave, crack detection, wavenumber analysis, EFIT modeling
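
    Technique (i), the 2D Fourier transform of a time-space wavefield into its frequency-wavenumber representation, is compact to prototype; a sketch on a synthetic single-mode wavefield (all parameters are assumed, illustrative values):

```python
import numpy as np

fs, dx = 1.0e6, 1.0e-3            # 1 MHz sampling, 1 mm scan pitch (assumed)
t = np.arange(1024) / fs
x = np.arange(256) * dx
f0, k0 = 100e3, 400.0             # 100 kHz tone, wavenumber 400 rad/m (assumed)

# Synthetic propagating wave v(t, x); a real dataset would be the SLDV scan
field = np.sin(2 * np.pi * f0 * t[:, None] - k0 * x[None, :])

# 2D FFT: time axis -> frequency, space axis -> wavenumber
spec = np.fft.fftshift(np.abs(np.fft.fft2(field)))
freqs = np.fft.fftshift(np.fft.fftfreq(len(t), 1 / fs))
wavenums = np.fft.fftshift(np.fft.fftfreq(len(x), dx)) * 2 * np.pi
i, j = np.unravel_index(spec.argmax(), spec.shape)
print(f"peak at f = {abs(freqs[i])/1e3:.0f} kHz, k = {abs(wavenums[j]):.0f} rad/m")
```

    The recovered peak falls on the nearest frequency-wavenumber bin; the short-space and local variants in (ii) and (iii) trade some wavenumber resolution for the spatial localization needed to see the crack.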

  1. Ground deformation monitoring using RADARSAT-2 DInSAR-MSBAS at the Aquistore CO2 storage site in Saskatchewan (Canada)

    NASA Astrophysics Data System (ADS)

    Czarnogorska, M.; Samsonov, S.; White, D.

    2014-11-01

    The research objectives of the Aquistore CO2 storage project are to design, adapt, and test non-seismic monitoring methods for measurement and verification of CO2 storage, and to integrate data to determine subsurface fluid distributions, pressure changes, and associated surface deformation. The Aquistore site is located near Estevan in southern Saskatchewan, Canada, on the south flank of the Souris River, west of the Boundary Dam Power Station and the historical Estevan coal mine. The targeted CO2 injection zones are within the Winnipeg and Deadwood formations located at >3000 m depth. An array of monitoring techniques was employed in the study area, including advanced satellite Differential Interferometric Synthetic Aperture Radar (DInSAR) with established corner reflectors, GPS, tiltmeter, and piezometer stations. We used airborne LIDAR data for topographic phase estimation and DInSAR product geocoding. Ground deformation maps have been calculated using the Multidimensional Small Baseline Subset (MSBAS) methodology from 134 RADARSAT-2 images, from five different beams, acquired between 2012-06-12 and 2014-07-06. We computed and interpreted nine time series for selected locations. MSBAS results indicate slow ground deformation of up to 1 cm/year, not related to CO2 injection but caused by various natural and anthropogenic processes.

  2. Building a Library for Microelectronics Verification with Topological Constraints

    DTIC Science & Technology

    2017-03-01

    Table 5: frequency distributions for the genus of logically equivalent circuit... Figure 1 shows that switching signal pairs produces logically equivalent topologies of the 1-bit full adder cell with three values of the genus (g = 3 [...], 4, 5, 6). Figure 1: frequency distribution for logically equivalent circuit topologies of the 1-bit full adder cell (2048) in Table 1(e

  3. Exploring the e-cigarette e-commerce marketplace: Identifying Internet e-cigarette marketing characteristics and regulatory gaps.

    PubMed

    Mackey, Tim K; Miner, Angela; Cuomo, Raphael E

    2015-11-01

    The electronic cigarette (e-cigarette) market is maturing into a billion-dollar industry. Expansion includes new channels of access not sufficiently assessed, including Internet sales of e-cigarettes. This study identifies unique e-cigarette Internet vendor characteristics, including geographic location, promotional strategies, use of social networking, presence/absence of age verification, and consumer warning representation. We performed structured Internet search engine queries and used inclusion/exclusion criteria to identify e-cigarette vendors. We then conducted content analysis of characteristics of interest. Our examination yielded 57 e-cigarette Internet vendors including 54.4% (n=31) that sold exclusively online. The vast majority of websites (96.5%, n=55) were located in the U.S. Vendors used a variety of sales promotion strategies to market e-cigarettes, including 70.2% (n=40) that used more than one social network service (SNS) and 42.1% (n=24) that used more than one promotional sales strategy. Most vendors (68.4%, n=39) displayed one or more health warnings on their website, but often displayed them in smaller font or in their terms and conditions. Additionally, 35.1% (n=20) of vendors did not have any detectable age verification process. E-cigarette Internet vendors are actively engaged in various promotional activities to increase the appeal and presence of their products online. In the absence of FDA regulations specific to the Internet, the e-cigarette e-commerce marketplace is likely to grow. This digital environment poses unique challenges requiring targeted policy-making, including robust online age verification, monitoring of SNS marketing, and greater scrutiny of certain forms of promotional practices. Copyright © 2015 Elsevier Ireland Ltd. All rights reserved.

  4. Identification and verification of critical performance dimensions. Phase 1 of the systematic process redesign of drug distribution.

    PubMed

    Colen, Hadewig B; Neef, Cees; Schuring, Roel W

    2003-06-01

    Worldwide, patient safety has become a major social policy problem for healthcare organisations. As in other organisations, the patients in our hospital also suffer from an inadequate distribution process, as becomes clear from incident reports involving medication errors. Medisch Spectrum Twente is a top primary-care, clinical, teaching hospital. The hospital pharmacy takes care of 1070 internal beds and 1120 beds in an affiliated psychiatric hospital and nursing homes. In the beginning of 1999, our pharmacy group started a large interdisciplinary research project to develop a safe, effective and efficient drug distribution system by using systematic process redesign. The process redesign includes both organisational and technological components. This article describes the identification and verification of critical performance dimensions for the design of drug distribution processes in hospitals (phase 1 of the systematic process redesign of drug distribution). Based on reported errors and related causes, we suggested six generic performance domains. To assess the role of the performance dimensions, we used three approaches: flowcharts, interviews with stakeholders, and review of the existing performance using time studies and medication error studies. We were able to set targets for costs, quality of information, responsiveness, employee satisfaction, and degree of innovation. We still have to establish which drug distribution system represents the best and most cost-effective way of preventing medication errors. We intend to develop an evaluation model, using the critical performance dimensions as a starting point. This model can be used as a simulation template to compare different drug distribution concepts in order to define the differences in quality and cost-effectiveness.

  5. Experimental quantum verification in the presence of temporally correlated noise

    NASA Astrophysics Data System (ADS)

    Mavadia, S.; Edmunds, C. L.; Hempel, C.; Ball, H.; Roy, F.; Stace, T. M.; Biercuk, M. J.

    2018-02-01

    Growth in the capabilities of quantum information hardware mandates access to techniques for performance verification that function under realistic laboratory conditions. Here we experimentally characterise the impact of common temporally correlated noise processes on both randomised benchmarking (RB) and gate-set tomography (GST). Our analysis highlights the role of sequence structure in enhancing or suppressing the sensitivity of quantum verification protocols to either slowly or rapidly varying noise, which we treat in the limiting cases of quasi-DC miscalibration and white noise power spectra. We perform experiments with a single trapped 171Yb+ ion-qubit and inject engineered noise (∝σ̂_z) to probe protocol performance. Experiments on RB validate predictions that measured fidelities over sequences are described by a gamma distribution, varying between approximately Gaussian for rapidly varying noise and a broad, highly skewed distribution for slowly varying noise. Similarly, we find a strong gate-set dependence of default experimental GST procedures in the presence of correlated errors, leading to significant deviations between estimated and calculated diamond distances in the presence of correlated σ̂_z errors. Numerical simulations demonstrate that expansion of the gate set to include negative rotations can suppress these discrepancies and increase reported diamond distances by orders of magnitude for the same error processes. Similar effects do not occur for correlated σ̂_x or σ̂_y errors or depolarising noise processes, highlighting the critical interplay of the selected gate set and the gauge optimisation process on the meaning of the reported diamond norm in correlated noise environments.
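
    A rough toy Monte Carlo of the claimed statistics (not the paper's trapped-ion experiment; the sequence length, noise scale, and small-angle infidelity approximation below are assumptions): coherently accumulating quasi-DC errors yield a broad, skewed infidelity distribution, while white noise accumulates incoherently and gives a narrower, more Gaussian one.

    ```python
    # Toy model: per-gate error rotations with random directions; quasi-DC noise
    # fixes the amplitude per sequence, white noise redraws it every gate.
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(0)
    n_seq, J, sigma = 5000, 100, 0.01    # sequences, gates per sequence, noise scale

    def infidelities(dc: bool) -> np.ndarray:
        r = np.empty(n_seq)
        for i in range(n_seq):
            u = rng.normal(size=(J, 3))
            u /= np.linalg.norm(u, axis=1, keepdims=True)   # random error directions
            a = rng.normal(0, sigma) * np.ones(J) if dc else rng.normal(0, sigma, J)
            v = (a[:, None] * u).sum(axis=0)                # net error rotation vector
            r[i] = v @ v / 4.0                              # small-angle infidelity
        return r

    for label, dc in [("white", False), ("quasi-DC", True)]:
        r = infidelities(dc)
        shape, _, _ = stats.gamma.fit(r, floc=0)
        print(f"{label:8s}: skew={stats.skew(r):5.2f}, fitted gamma shape={shape:.2f}")
    ```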

  6. A novel design of the high-precision magnetic locator with three-dimension measurement capability applying dynamically sensing mechanism

    NASA Astrophysics Data System (ADS)

    Huang, Wen-Nan; Chen, Po-Shen; Chen, Mu-Ping; Teng, Ching-Cheng

    2006-09-01

    A novel design of a magnetic locator for obtaining high-precision measurement information on a variety of buried metal pipes is presented in this paper. The concept of a dynamically sensing mechanism, including vibrating and moving devices, proposed herein is a simple and effective way to improve the precision of three-dimensional location sensing for underground utilities. Based on the basic magnetic principles of Lenz's law and Faraday's law, the functions of amplifying the sensed magnetic signals, as well as distinguishing them by simple filtering algorithms embedded in the processing programs, are achieved even when relatively strong noise exists. The verification results of these integrated designs demonstrate the effectiveness of both precise locating of the buried utility and accurate measurement of its depth.
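
    The "amplify and distinguish" step described above can be illustrated with a generic lock-in (synchronous-detection) sketch; the excitation frequency, sample rate, and noise level are assumed values, not the authors' hardware parameters.

    ```python
    # Lock-in sketch: recover the amplitude of a sensed signal at a known
    # excitation frequency buried in noise roughly 25x stronger than the signal.
    import numpy as np

    fs, f0, T = 50_000.0, 1_000.0, 2.0      # sample rate, excitation freq, duration
    t = np.arange(0, T, 1 / fs)
    rng = np.random.default_rng(1)

    amplitude = 2.0e-3                       # "true" induced-voltage amplitude, V
    noisy = amplitude * np.sin(2 * np.pi * f0 * t + 0.3) \
            + rng.normal(0, 50e-3, t.size)   # broadband sensor noise

    # Multiply by quadrature references, then low-pass by plain averaging.
    i = (noisy * np.sin(2 * np.pi * f0 * t)).mean()
    q = (noisy * np.cos(2 * np.pi * f0 * t)).mean()
    est = 2 * np.hypot(i, q)                 # demodulated amplitude estimate

    print(f"true {amplitude*1e3:.2f} mV, recovered {est*1e3:.2f} mV")
    ```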

  7. Approximate Synchrony: An Abstraction for Distributed Almost Synchronous Systems

    DTIC Science & Technology

    2015-05-29

    finding bugs. Verification of the TSCH Protocol. Time Synchronized Channel Hopping (TSCH) is a Medium Access Control scheme that enables low power...allotted by the schedule and remain in sleep mode otherwise. In the absence of precise time-synchronization, the time-slots across nodes would not be
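
    A back-of-envelope check of the synchronization issue sketched above (all numbers are assumptions; actual TSCH guard times and resynchronization policies vary by deployment):

    ```python
    # Does a receiver's guard time cover worst-case relative clock drift between
    # periodic re-synchronizations in a TSCH-like slotted MAC?
    def slots_misalign(drift_ppm: float, resync_interval_s: float,
                       guard_time_us: float) -> bool:
        """True if the accumulated relative drift exceeds the guard time."""
        worst_case_skew_us = 2 * drift_ppm * resync_interval_s   # both clocks drift
        return worst_case_skew_us > guard_time_us

    # +/-10 ppm crystals, 1000 us guard time:
    print(slots_misalign(10, 30, 1000))   # False: 600 us of skew fits in the guard
    print(slots_misalign(10, 60, 1000))   # True: 1200 us of skew exceeds it
    ```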

  8. 39 CFR 501.16 - PC postage payment methodology.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... DISTRIBUTE POSTAGE EVIDENCING SYSTEMS § 501.16 PC postage payment methodology. (a) The PC Postage customer is... issues a refund to a customer for any unused postage in a Postage Evidencing System. After verification... Service approval to continue to operate PC Postage systems, the provider must submit to a periodic audit...

  9. Determinants of Standard Errors of MLEs in Confirmatory Factor Analysis

    ERIC Educational Resources Information Center

    Yuan, Ke-Hai; Cheng, Ying; Zhang, Wei

    2010-01-01

    This paper studies changes of standard errors (SE) of the normal-distribution-based maximum likelihood estimates (MLE) for confirmatory factor models as model parameters vary. Using logical analysis, simplified formulas and numerical verification, monotonic relationships between SEs and factor loadings as well as unique variances are found.…

  10. A comparative verification of high resolution precipitation forecasts using model output statistics

    NASA Astrophysics Data System (ADS)

    van der Plas, Emiel; Schmeits, Maurice; Hooijman, Nicolien; Kok, Kees

    2017-04-01

    Verification of localized events such as precipitation has become even more challenging with the advent of high-resolution meso-scale numerical weather prediction (NWP). The realism of a forecast suggests that it should compare well against precipitation radar imagery with similar resolution, both spatially and temporally. Spatial verification methods solve some of the representativity issues that point verification gives rise to. In this study a verification strategy based on model output statistics is applied that aims to address both double-penalty and resolution effects that are inherent to comparisons of NWP models with different resolutions. Using predictors based on spatial precipitation patterns around a set of stations, an extended logistic regression (ELR) equation is deduced, leading to a probability forecast distribution of precipitation for each NWP model, analysis and lead time. The ELR equations are derived for predictands based on areal calibrated radar precipitation and SYNOP observations. The aim is to extract maximum information from a series of precipitation forecasts, like a trained forecaster would. The method is applied to the non-hydrostatic model Harmonie (2.5 km resolution), Hirlam (11 km resolution) and the ECMWF model (16 km resolution), overall yielding similar Brier skill scores for the 3 post-processed models, but larger differences for individual lead times. In addition, the Fractions Skill Score is computed using the 3 deterministic forecasts, showing somewhat better skill for the Harmonie model. In other words, despite the realism of Harmonie precipitation forecasts, they only perform similarly or somewhat better than precipitation forecasts from the 2 lower-resolution models, at least in the Netherlands.
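
    A minimal sketch of the extended logistic regression idea, in which a function of the precipitation threshold enters as a predictor so that a single equation yields a full probability distribution over thresholds; the single toy predictor and the √q transform below are assumptions standing in for the study's spatial-pattern predictors.

    ```python
    # ELR sketch: fit P(precip >= q) = sigmoid(a + b*x + c*sqrt(q)) on pooled
    # (case, threshold) pairs by gradient ascent on the Bernoulli log-likelihood.
    import numpy as np

    rng = np.random.default_rng(2)
    n = 4000
    x = rng.gamma(2.0, 2.0, n)                     # toy forecast predictor
    y = rng.gamma(2.0, x / 2.0)                    # synthetic observed precip (mean = x)
    thresholds = np.array([0.1, 1.0, 5.0, 10.0])   # mm
    k = thresholds.size

    X = np.column_stack([np.ones(n * k),
                         np.repeat(x, k),
                         np.sqrt(np.tile(thresholds, n))])
    z = (np.repeat(y, k) >= np.tile(thresholds, n)).astype(float)

    w = np.zeros(3)
    for _ in range(5000):
        p = 1 / (1 + np.exp(-X @ w))
        w += 0.05 * X.T @ (z - p) / z.size         # ascend the log-likelihood

    p = 1 / (1 + np.exp(-X @ w))
    print("coefficients:", np.round(w, 3),
          " pooled Brier score:", round(float(np.mean((p - z) ** 2)), 4))
    ```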

  11. DeepMitosis: Mitosis detection via deep detection, verification and segmentation networks.

    PubMed

    Li, Chao; Wang, Xinggang; Liu, Wenyu; Latecki, Longin Jan

    2018-04-01

    Mitotic count is a critical predictor of tumor aggressiveness in breast cancer diagnosis. Nowadays mitosis counting is mainly performed by pathologists manually, which is extremely arduous and time-consuming. In this paper, we propose an accurate method for detecting mitotic cells in histopathological slides using a novel multi-stage deep learning framework. Our method consists of a deep segmentation network for generating a mitosis region when only a weak label is given (i.e., only the centroid pixel of the mitosis is annotated), an elaborately designed deep detection network for localizing mitosis using contextual region information, and a deep verification network for improving detection accuracy by removing false positives. We validate the proposed deep learning method on two widely used Mitosis Detection in Breast Cancer Histological Images (MITOSIS) datasets. Experimental results show that we can achieve the highest F-score on the MITOSIS dataset from the ICPR 2012 grand challenge merely using the deep detection network. For the ICPR 2014 MITOSIS dataset, which only provides the centroid location of mitosis, we employ the segmentation model to estimate the bounding box annotation for training the deep detection network. We also apply the verification model to eliminate some false positives produced by the detection model. By fusing the scores of the detection and verification models, we achieve state-of-the-art results. Moreover, our method is very fast with GPU computing, which makes it feasible for clinical practice. Copyright © 2018 Elsevier B.V. All rights reserved.
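
    The detect-then-verify structure can be sketched as below; the `detect` and `verify` callables and the weighted score fusion are placeholders, not the authors' networks or their exact fusion rule.

    ```python
    # Structural sketch of a two-stage mitosis pipeline: a detector proposes
    # scored boxes, a verification net re-scores each crop, and the two scores
    # are fused (weighted average here, as an assumed stand-in) before thresholding.
    from dataclasses import dataclass
    from typing import Callable, List, Tuple

    Box = Tuple[int, int, int, int]          # x0, y0, x1, y1

    @dataclass
    class Candidate:
        box: Box
        det_score: float
        ver_score: float = 0.0

    def mitosis_pipeline(image,
                         detect: Callable[[object], List[Tuple[Box, float]]],
                         verify: Callable[[object, Box], float],
                         w: float = 0.5,
                         threshold: float = 0.5) -> List[Candidate]:
        candidates = [Candidate(box, s) for box, s in detect(image)]
        for c in candidates:
            c.ver_score = verify(image, c.box)   # second stage filters false positives
        return [c for c in candidates
                if w * c.det_score + (1 - w) * c.ver_score >= threshold]
    ```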

  12. 4D ML reconstruction as a tool for volumetric PET-based treatment verification in ion beam radiotherapy

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    De Bernardi, E., E-mail: elisabetta.debernardi@unimib.it; Ricotti, R.; Riboldi, M.

    2016-02-15

    Purpose: An innovative strategy to improve the sensitivity of positron emission tomography (PET)-based treatment verification in ion beam radiotherapy is proposed. Methods: Low counting statistics PET images acquired during or shortly after the treatment (Measured PET) and a Monte Carlo estimate of the same PET images derived from the treatment plan (Expected PET) are considered as two frames of a 4D dataset. A 4D maximum likelihood reconstruction strategy was adapted to iteratively estimate the annihilation events distribution in a reference frame and the deformation motion fields that map it to the Expected PET and Measured PET frames. The outputs generated by the proposed strategy are as follows: (1) an estimate of the Measured PET with an image quality comparable to the Expected PET and (2) an estimate of the motion field mapping Expected PET to Measured PET. The details of the algorithm are presented and the strategy is preliminarily tested on analytically simulated datasets. Results: The algorithm demonstrates (1) robustness against noise, even in the worst conditions, where 1.5 × 10⁴ true coincidences and a random fraction of 73% are simulated; (2) a proper sensitivity to different kinds and grades of mismatch ranging between 1 and 10 mm; (3) robustness against bias due to incorrect washout modeling in the Monte Carlo simulation up to 1/3 of the original signal amplitude; and (4) an ability to describe the mismatch even in the presence of complex annihilation distributions such as those induced by two perpendicular superimposed ion fields. Conclusions: The promising results obtained in this work suggest the applicability of the method as a quantification tool for PET-based treatment verification in ion beam radiotherapy. An extensive assessment of the proposed strategy on real treatment verification data is planned.
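
    For orientation, the multiplicative update at the heart of maximum-likelihood PET reconstruction can be sketched in its static form (the paper's 4D variant additionally estimates the deformation fields between the Expected and Measured PET frames):

    ```python
    # Static MLEM on a toy 1D problem: x <- x * A^T(y / Ax) / A^T 1.
    import numpy as np

    rng = np.random.default_rng(3)
    n_pix, n_det = 32, 64
    A = rng.random((n_det, n_pix))            # toy system matrix
    x_true = np.zeros(n_pix); x_true[10:14] = 50.0
    y = rng.poisson(A @ x_true)               # noisy measured counts

    x = np.ones(n_pix)                        # uniform initial activity estimate
    sens = A.sum(axis=0)                      # per-pixel sensitivity, A^T 1
    for _ in range(100):
        ratio = y / np.maximum(A @ x, 1e-12)  # guard against division by zero
        x *= (A.T @ ratio) / sens             # multiplicative MLEM update

    print("estimated peak at pixel", int(np.argmax(x)), "(true support: 10-13)")
    ```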

  13. Evaluation of Kodak EDR2 film for dose verification of intensity modulated radiation therapy delivered by a static multileaf collimator.

    PubMed

    Zhu, X R; Jursinic, P A; Grimm, D F; Lopez, F; Rownd, J J; Gillin, M T

    2002-08-01

    A new type of radiographic film, Kodak EDR2 film, was evaluated for dose verification of intensity modulated radiation therapy (IMRT) delivered by a static multileaf collimator (SMLC). A sensitometric curve of EDR2 film irradiated by a 6 MV x-ray beam was compared with that of Kodak X-OMAT V (XV) film. The effects of field size, depth and dose rate on the sensitometric curve were also studied. It is found that EDR2 film is much less sensitive than XV film. In high-energy x-ray beams, the double hit process is the dominant mechanism that renders the grains on EDR2 films developable. As a result, in the dose range that is commonly used for film dosimetry for IMRT and conventional external beam therapy, the sensitometric curves of EDR2 films cannot be approximated as a linear function, OD = c * D. Within experimental uncertainty, the film sensitivity does not depend on the dose rate (50 vs 300 MU/min) or dose per pulse (from 1.0 x 10(-4) to 4.21 x 10(-4) Gy/pulse). Field sizes and depths (up to field size of 10 x 10 cm2 and depth = 10 cm) have little effect on the sensitometric curves. Percent depth doses (PDDs) for both 6 and 23 MV x rays were measured with both EDR2 and XV films and compared with ion chamber data. Film data are within 2.5% of the ion chamber results. Dose profiles measured with EDR2 film are consistent with those measured with an ion chamber. Examples of measured IMRT isodose distributions versus calculated isodoses are presented. We have used EDR2 films for verification of all IMRT patients treated by SMLC in our clinic. In most cases, with EDR2 film, actual clinical daily fraction doses can be used for verification of composite isodose distributions of SMLC-based IMRT.
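
    The "double hit" behaviour noted above implies a sensitometric curve that is sub-linear at low dose and saturates at high dose; one common two-hit target-theory form, fitted here to synthetic data (all numbers illustrative), is OD(D) = OD_max·(1 − (1 + aD)·e^(−aD)):

    ```python
    # Fit a two-hit sensitometric model to synthetic film data with scipy.
    import numpy as np
    from scipy.optimize import curve_fit

    def od_double_hit(dose, od_max, a):
        # Poisson probability of >= 2 hits per grain, scaled to the saturation OD
        return od_max * (1.0 - (1.0 + a * dose) * np.exp(-a * dose))

    dose = np.linspace(0, 400, 25)                               # cGy
    rng = np.random.default_rng(4)
    od = od_double_hit(dose, 3.2, 0.004) + rng.normal(0, 0.01, dose.size)

    (od_max, a), _ = curve_fit(od_double_hit, dose, od, p0=[3.0, 0.005])
    print(f"fitted OD_max = {od_max:.2f}, a = {a:.4f} per cGy")
    ```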

  14. Evaluating aspects of online medication safety in long-term follow-up of 136 Internet pharmacies: illegal rogue online pharmacies flourish and are long-lived.

    PubMed

    Fittler, Andras; Bősze, Gergely; Botz, Lajos

    2013-09-10

    A growing number of online pharmacies have been established worldwide. Among them are numerous illegal websites selling medicine without valid medical prescriptions or distributing substandard or counterfeit drugs. Only a limited number of studies have been published on Internet pharmacies with regard to patient safety, professionalism, long-term follow-up, and pharmaceutical legitimacy verification. In this study, we selected, evaluated, and followed 136 Internet pharmacy websites aiming to identify indicators of professional online pharmacy service and online medication safety. An Internet search was performed by simulating the needs of potential customers of online pharmacies. A total of 136 Internet pharmacy websites were assessed and followed for four years. According to the LegitScript database, relevant characteristics such as longevity, time of continuous operation, geographical location, displayed contact information, prescription requirement, medical information exchange, and pharmaceutical legitimacy verification were recorded and evaluated. The number of active Internet pharmacy websites decreased; 23 of 136 (16.9%) online pharmacies ceased operating within 12 months and only 67 monitored websites (49.3%) were accessible at the end of the four-year observation period. However, not all operated continuously, as about one-fifth (31/136) of all observed online pharmacy websites were inaccessible provisionally. Thus, only 56 (41.2%) Internet-based pharmacies were continuously operational. Thirty-one of the 136 online pharmacies (22.8%) had not provided any contact details, while only 59 (43.4%) displayed all necessary contact information on the website. We found that the declared physical location claims did not correspond to the area of domain registration (according to IP address) for most websites. Although the majority (120/136, 88.2%) of the examined Internet pharmacies distributed various prescription-only medicines, only 9 (6.6%) requested prior medical prescriptions before purchase. Medical information exchange was generally ineffective as 52 sites (38.2%) did not require any medical information from patients. The product information about the medicines was generally (126/136, 92.6%) not displayed adequately, and the contents of the patient information leaflet were incomplete in most cases (104/136, 76.5%). Numerous online operators (60/136, 44.1%) were defined as rogue Internet pharmacies, but no legitimate Internet-based pharmacies were among them. One site (0.7%) was yet unverified, 23 (16.9%) were unapproved, while the remaining (52/136, 38.2%) websites were not available in the LegitScript database. Contrary to our prior assumptions, prescription or medical information requirement, or the indication of contact information on the website, does not seem to correlate with "rogue pharmacy" status using the LegitScript online pharmacy verification standards. Instead, long-term continuous operation strongly correlated (P<.001) with explicit illegal activity. Most Internet pharmacies in our study sample were illegal sites within the definition of "rogue" Internet pharmacy. These websites violate professional, legal, and ethical standards and endanger patient safety. This work shows evidence that online pharmacies that act illegally appear to have greater longevity than others, presumably because there is no compelling reason for frequent change in order to survive. 
We also found that one in five websites revived (closed down and reopened again within four years) and no-prescription sites with limited medicine and patient information are flourishing.

  15. Evaluating Aspects of Online Medication Safety in Long-Term Follow-Up of 136 Internet Pharmacies: Illegal Rogue Online Pharmacies Flourish and Are Long-Lived

    PubMed Central

    2013-01-01

    Background A growing number of online pharmacies have been established worldwide. Among them are numerous illegal websites selling medicine without valid medical prescriptions or distributing substandard or counterfeit drugs. Only a limited number of studies have been published on Internet pharmacies with regard to patient safety, professionalism, long-term follow-up, and pharmaceutical legitimacy verification. Objective In this study, we selected, evaluated, and followed 136 Internet pharmacy websites aiming to identify indicators of professional online pharmacy service and online medication safety. Methods An Internet search was performed by simulating the needs of potential customers of online pharmacies. A total of 136 Internet pharmacy websites were assessed and followed for four years. According to the LegitScript database, relevant characteristics such as longevity, time of continuous operation, geographical location, displayed contact information, prescription requirement, medical information exchange, and pharmaceutical legitimacy verification were recorded and evaluated. Results The number of active Internet pharmacy websites decreased; 23 of 136 (16.9%) online pharmacies ceased operating within 12 months and only 67 monitored websites (49.3%) were accessible at the end of the four-year observation period. However, not all operated continuously, as about one-fifth (31/136) of all observed online pharmacy websites were inaccessible provisionally. Thus, only 56 (41.2%) Internet-based pharmacies were continuously operational. Thirty-one of the 136 online pharmacies (22.8%) had not provided any contact details, while only 59 (43.4%) displayed all necessary contact information on the website. We found that the declared physical location claims did not correspond to the area of domain registration (according to IP address) for most websites. Although the majority (120/136, 88.2%) of the examined Internet pharmacies distributed various prescription-only medicines, only 9 (6.6%) requested prior medical prescriptions before purchase. Medical information exchange was generally ineffective as 52 sites (38.2%) did not require any medical information from patients. The product information about the medicines was generally (126/136, 92.6%) not displayed adequately, and the contents of the patient information leaflet were incomplete in most cases (104/136, 76.5%). Numerous online operators (60/136, 44.1%) were defined as rogue Internet pharmacies, but no legitimate Internet-based pharmacies were among them. One site (0.7%) was yet unverified, 23 (16.9%) were unapproved, while the remaining (52/136, 38.2%) websites were not available in the LegitScript database. Contrary to our prior assumptions, prescription or medical information requirement, or the indication of contact information on the website, does not seem to correlate with “rogue pharmacy” status using the LegitScript online pharmacy verification standards. Instead, long-term continuous operation strongly correlated (P<.001) with explicit illegal activity. Conclusions Most Internet pharmacies in our study sample were illegal sites within the definition of “rogue” Internet pharmacy. These websites violate professional, legal, and ethical standards and endanger patient safety. This work shows evidence that online pharmacies that act illegally appear to have greater longevity than others, presumably because there is no compelling reason for frequent change in order to survive. 
We also found that one in five websites revived (closed down and reopened again within four years) and no-prescription sites with limited medicine and patient information are flourishing. PMID:24021777

  16. Development of a database for the verification of trans-ionospheric remote sensing systems

    NASA Astrophysics Data System (ADS)

    Leitinger, R.

    2005-08-01

    Remote sensing systems need verification by means of in-situ data or by means of model data. In the case of ionospheric occultation inversion, ionosphere tomography and other imaging methods based on satellite-to-ground or satellite-to-satellite electron content, the availability of in-situ data with adequate spatial and temporal co-location is a very rare case indeed. Therefore the method of choice for verification is to produce artificial electron content data with realistic properties, subject these data to the inversion/retrieval method, compare the results with model data, and apply a suitable type of “goodness of fit” classification. Inter-comparison of inversion/retrieval methods should be done with sets of artificial electron contents in a “blind” (or even “double blind”) way. The setup of a relevant database for the COST 271 Action is described. One part of the database will be made available to everyone interested in testing inversion/retrieval methods. The artificial electron content data are calculated by means of large-scale models that are “modulated” in a realistic way to include smaller-scale and dynamic structures, like troughs and traveling ionospheric disturbances.
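
    A minimal sketch of the "large-scale model plus modulation" recipe (parameters illustrative, not values from the COST 271 database):

    ```python
    # Smooth background TEC along a latitude profile, modulated by a traveling
    # ionospheric disturbance (TID) to add realistic smaller-scale structure.
    import numpy as np

    lat = np.linspace(30.0, 70.0, 200)           # degrees
    t = 600.0                                    # seconds into the scenario

    background = 40.0 * np.cos(np.radians(lat - 45.0)) ** 2 + 5.0   # TECU
    wavelength_km, period_s, amp = 300.0, 1800.0, 0.05
    x_km = (lat - lat[0]) * 111.0                # rough km per degree of latitude

    tid = 1.0 + amp * np.sin(2 * np.pi * (x_km / wavelength_km - t / period_s))
    tec = background * tid                       # modulated artificial electron content
    print(f"TEC range: {tec.min():.1f}-{tec.max():.1f} TECU")
    ```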

  17. Thermoacoustic range verification using a clinical ultrasound array provides perfectly co-registered overlay of the Bragg peak onto an ultrasound image

    NASA Astrophysics Data System (ADS)

    Patch, S. K.; Kireeff Covo, M.; Jackson, A.; Qadadha, Y. M.; Campbell, K. S.; Albright, R. A.; Bloemhard, P.; Donoghue, A. P.; Siero, C. R.; Gimpel, T. L.; Small, S. M.; Ninemire, B. F.; Johnson, M. B.; Phair, L.

    2016-08-01

    The potential of particle therapy due to focused dose deposition in the Bragg peak has not yet been fully realized due to inaccuracies in range verification. The purpose of this work was to correlate the Bragg peak location with target structure, by overlaying the location of the Bragg peak onto a standard ultrasound image. Pulsed delivery of 50 MeV protons was accomplished by a fast chopper installed between the ion source and the cyclotron inflector. The chopper limited the train of bunches so that 2 Gy were delivered in 2 μs. The ion pulse generated thermoacoustic pulses that were detected by a cardiac ultrasound array, which also produced a grayscale ultrasound image. A filtered backprojection algorithm focused the received signal to the Bragg peak location with perfect co-registration to the ultrasound images. Data were collected in a room-temperature water bath and in a gelatin phantom with a cavity designed to mimic the intestine, in which gas pockets can displace the Bragg peak. Phantom experiments performed with the cavity both empty and filled with olive oil confirmed that displacement of the Bragg peak due to anatomical change could be detected. Thermoacoustic range measurements in the waterbath agreed with Monte Carlo simulation within 1.2 mm. In the phantom, thermoacoustic range estimates and first-order range estimates from CT images agreed to within 1.5 mm.
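
    The focusing step can be illustrated with a toy delay-and-sum backprojection on a linear array (the array geometry, pulse shape, and grid are assumptions; the experiment used a cardiac ultrasound array with filtered backprojection):

    ```python
    # Synthesize thermoacoustic arrivals at a 64-element array from a point
    # source, then backproject each trace at the time-of-flight onto a grid.
    import numpy as np

    c, fs = 1500.0, 10e6                         # sound speed (m/s), sample rate (Hz)
    elems = np.column_stack([np.linspace(-0.02, 0.02, 64), np.zeros(64)])
    src = np.array([0.004, 0.035])               # "Bragg peak" position, m

    t = np.arange(2048) / fs
    def pulse(tau):                              # short Gaussian pulse arriving at tau
        return np.exp(-(((t - tau) / 0.2e-6) ** 2))

    traces = np.array([pulse(np.linalg.norm(src - e) / c) for e in elems])

    xs = np.linspace(-0.01, 0.01, 81); ys = np.linspace(0.02, 0.05, 121)
    img = np.zeros((ys.size, xs.size))
    for i, y in enumerate(ys):
        for j, x in enumerate(xs):
            tof = np.linalg.norm(elems - np.array([x, y]), axis=1) / c
            idx = np.clip((tof * fs).astype(int), 0, t.size - 1)
            img[i, j] = abs(traces[np.arange(64), idx].sum())

    iy, jx = np.unravel_index(img.argmax(), img.shape)
    print(f"recovered source ~ ({xs[jx]*1e3:.2f}, {ys[iy]*1e3:.2f}) mm; true (4.00, 35.00) mm")
    ```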

  18. The effect of incidence angle on the overall three-dimensional aerodynamic performance of a classical annular airfoil cascade

    NASA Technical Reports Server (NTRS)

    Bergsten, D. E.; Fleeter, S.

    1983-01-01

    To be of quantitative value to the designer and analyst, it is necessary to experimentally verify the flow modeling and the numerics inherent in calculation codes being developed to predict the three dimensional flow through turbomachine blade rows. This experimental verification requires that predicted flow fields be correlated with three dimensional data obtained in experiments which model the fundamental phenomena existing in the flow passages of modern turbomachines. The Purdue Annular Cascade Facility was designed specifically to provide these required three dimensional data. The overall three dimensional aerodynamic performance of an instrumented classical airfoil cascade was determined over a range of incidence angle values. This was accomplished utilizing a fully automated exit flow data acquisition and analysis system. The mean wake data, acquired at two downstream axial locations, were analyzed to determine the effect of incidence angle, the three dimensionality of the cascade exit flow field, and the similarity of the wake profiles. The hub, mean, and tip chordwise airfoil surface static pressure distributions determined at each incidence angle are correlated with predictions from the MERIDL and TSONIC computer codes.

  19. Computational Modeling of Microstructural-Evolution in AISI 1005 Steel During Gas Metal Arc Butt Welding

    NASA Astrophysics Data System (ADS)

    Grujicic, M.; Ramaswami, S.; Snipes, J. S.; Yavari, R.; Arakere, A.; Yen, C.-F.; Cheeseman, B. A.

    2013-05-01

    A fully coupled (two-way), transient, thermal-mechanical finite-element procedure is developed to model conventional gas metal arc welding (GMAW) butt-joining process. Two-way thermal-mechanical coupling is achieved by making the mechanical material model of the workpiece and the weld temperature-dependent and by allowing the potential work of plastic deformation resulting from large thermal gradients to be dissipated in the form of heat. To account for the heat losses from the weld into the surroundings, heat transfer effects associated with natural convection and radiation to the environment and thermal-heat conduction to the adjacent workpiece material are considered. The procedure is next combined with the basic physical-metallurgy concepts and principles and applied to a prototypical (plain) low-carbon steel (AISI 1005) to predict the distribution of various crystalline phases within the as-welded material microstructure in different fusion zone and heat-affected zone locations, under given GMAW-process parameters. The results obtained are compared with available open-literature experimental data to provide validation/verification for the proposed GMAW modeling effort.

  20. Design and implementation of a head-and-neck phantom for system audit and verification of intensity-modulated radiation therapy.

    PubMed

    Webster, Gareth J; Hardy, Mark J; Rowbottom, Carl G; Mackay, Ranald I

    2008-04-16

    The head and neck is a challenging anatomic site for intensity-modulated radiation therapy (IMRT), requiring thorough testing of planning and treatment delivery systems. Ideally, the phantoms used should be anatomically realistic, have radiologic properties identical to those of the tissues concerned, and allow for the use of a variety of devices to verify dose and dose distribution in any target or normal-tissue structure. A phantom that approaches the foregoing characteristics has been designed and built; its specific purpose is verification for IMRT treatments in the head-and-neck region. This semi-anatomic phantom, HANK, is constructed of Perspex (Imperial Chemical Industries, London, U.K.) and provides for the insertion of heterogeneities simulating air cavities in a range of fixed positions. Chamber inserts are manufactured to incorporate either a standard thimble ionization chamber (0.125 cm3: PTW, Freiburg, Germany) or a smaller PinPoint chamber (0.015 cm3: PTW), and measurements can be made with either chamber in a range of positions throughout the phantom. Coronal films can also be acquired within the phantom, and additional solid blocks of Perspex allow for transverse films to be acquired within the head region. Initial studies using simple conventional head-and-neck plans established the reproducibility of the phantom and the measurement devices to within the setup uncertainty of +/- 0.5 mm. Subsequent verification of 9 clinical head-and-neck IMRT plans demonstrated the efficacy of the phantom in making a range of patient-specific dose measurements in regions of dosimetric and clinical interest. Agreement between measured values and those predicted by the Pinnacle3 treatment planning system (Philips Medical Systems, Andover, MA) was found to be generally good, with a mean error on the calculated dose to each point of +0.2% (range: -4.3% to +2.2%; n = 9) for the primary planning target volume (PTV), -0.1% (range: -1.5% to +2.0%; n = 8) for the nodal PTV, and +0.0% (range: -1.8% to +4.3%, n = 9) for the spinal cord. The suitability of the phantom for measuring combined dose distributions using radiographic film was also evaluated. The phantom has proved to be a valuable tool in the development and implementation of clinical head-and-neck IMRT, allowing for accurate verification of absolute dose and dose distributions in regions of clinical and dosimetric interest.

  1. The Automation of Nowcast Model Assessment Processes

    DTIC Science & Technology

    2016-09-01

    that will automate real-time WRE-N model simulations, collect and quality-control check weather observations for assimilation and verification, and...domains centered near White Sands Missile Range, New Mexico, where the Meteorological Sensor Array (MSA) will be located. The MSA will provide...observations and performing quality-control checks for the pre-forecast data assimilation period. 2. Run the WRE-N model to generate model forecast data

  2. Compensation of distributed delays in integrated communication and control systems

    NASA Technical Reports Server (NTRS)

    Ray, Asok; Luck, Rogelio

    1991-01-01

    The concept, analysis, implementation, and verification of a method for compensating delays that are distributed between the sensors, controller, and actuators within a control loop are discussed. With the objective of mitigating the detrimental effects of these network induced delays, a predictor-controller algorithm was formulated and analyzed. Robustness of the delay compensation algorithm was investigated relative to parametric uncertainties in plant modeling. The delay compensator was experimentally verified on an IEEE 802.4 network testbed for velocity control of a DC servomotor.
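
    A minimal sketch of the predictor idea, assuming a known discrete-time plant model and a known total loop delay of d steps (a Smith-predictor-like caricature, not the paper's exact algorithm):

    ```python
    # Propagate the model through the stored control history so the control law
    # acts on a delay-free state estimate.
    import numpy as np

    def predict_ahead(A, B, x_meas, u_hist):
        """Predict x(k+d) from x(k) and the d controls u(k), ..., u(k+d-1)."""
        x = x_meas.copy()
        for u in u_hist:                 # apply the model once per delay step
            x = A @ x + B @ u
        return x

    # Toy DC-servo-like model (position, velocity) with 3 steps of loop delay.
    A = np.array([[1.0, 0.01], [0.0, 0.98]])
    B = np.array([[0.0], [0.01]])
    x = np.array([0.1, -0.2])
    u_hist = [np.array([1.0]), np.array([0.8]), np.array([0.5])]
    print("predicted state after the delay:", predict_ahead(A, B, x, u_hist))
    ```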

  3. Multiplicity Counting

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Geist, William H.

    2015-12-01

    This set of slides begins by giving background and a review of neutron counting; three attributes of a verification item are discussed: the 240Pu effective mass; α, the ratio of (α,n) neutrons to spontaneous fission neutrons; and leakage multiplication. It then takes up neutron detector systems – theory & concepts (coincidence counting, moderation, die-away time); detector systems – some important details (deadtime, corrections); an introduction to multiplicity counting; multiplicity electronics and example distributions; singles, doubles, and triples from measured multiplicity distributions; and the point model: multiplicity mathematics.
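
    The singles, doubles, and triples mentioned in the slides are proportional to the first three reduced factorial moments of the measured multiplicity distribution; a minimal sketch (histogram values invented):

    ```python
    # Reduced factorial moments of a neutron multiplicity histogram.
    from math import comb

    def reduced_factorial_moments(hist):
        """hist[n] = frequency of events with n detected neutrons."""
        total = sum(hist)
        p = [h / total for h in hist]
        singles = sum(comb(n, 1) * pn for n, pn in enumerate(p))
        doubles = sum(comb(n, 2) * pn for n, pn in enumerate(p))
        triples = sum(comb(n, 3) * pn for n, pn in enumerate(p))
        return singles, doubles, triples

    hist = [120, 260, 230, 130, 50, 12, 3]    # events with n = 0..6 counts
    print(tuple(round(m, 3) for m in reduced_factorial_moments(hist)))
    ```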

  4. Magnetic and gravity anomalies in the Americas

    NASA Technical Reports Server (NTRS)

    Braile, L. W.; Hinze, W. J.; Vonfrese, R. R. B. (Principal Investigator)

    1981-01-01

    The cleaning and magnetic tape storage of spherical Earth processing programs are reported. These programs include: NVERTSM which inverts total or vector magnetic anomaly data on a distribution of point dipoles in spherical coordinates; SMFLD which utilizes output from NVERTSM to compute total or vector magnetic anomaly fields for a distribution of point dipoles in spherical coordinates; NVERTG; and GFLD. Abstracts are presented for papers dealing with the mapping and modeling of magnetic and gravity anomalies, and with the verification of crustal components in satellite data.

  5. Source position verification and dosimetry in HDR brachytherapy using an EPID.

    PubMed

    Smith, R L; Taylor, M L; McDermott, L N; Haworth, A; Millar, J L; Franich, R D

    2013-11-01

    Accurate treatment delivery in high dose rate (HDR) brachytherapy requires correct source dwell positions and dwell times to be administered relative to each other and to the surrounding anatomy. Treatment delivery inaccuracies predominantly occur for two reasons: (i) anatomical movement or (ii) as a result of human errors that are usually related to incorrect implementation of the planned treatment. Electronic portal imaging devices (EPIDs) were originally developed for patient position verification in external beam radiotherapy and their application has been extended to provide dosimetric information. The authors have characterized the response of an EPID for use with an (192)Ir brachytherapy source to demonstrate its use as a verification device, providing both source position and dosimetric information. Characterization of the EPID response using an (192)Ir brachytherapy source included investigations of reproducibility, linearity with dose rate, photon energy dependence, and charge build-up effects associated with exposure time and image acquisition time. Source position resolution in three dimensions was determined. To illustrate treatment verification, a simple treatment plan was delivered to a phantom and the measured EPID dose distribution compared with the planned dose. The mean absolute source position error in the plane parallel to the EPID, for dwells measured at 50, 100, and 150 mm source to detector distances (SDD), was determined to be 0.26 mm. The resolution of the z coordinate (perpendicular distance from detector plane) is SDD dependent with 95% confidence intervals of ± 0.1, ± 0.5, and ± 2.0 mm at SDDs of 50, 100, and 150 mm, respectively. The response of the EPID is highly linear to dose rate. The EPID exhibits an over-response to low energy incident photons and this nonlinearity is incorporated into the dose calibration procedure. A distance (spectral) dependent dose rate calibration procedure has been developed. The difference between measured and planned dose is less than 2% for 98.0% of pixels in a two-dimensional plane at an SDD of 100 mm. Our application of EPID dosimetry to HDR brachytherapy provides a quality assurance measure of the geometrical distribution of the delivered dose as well as the source positions, which is not possible with any current HDR brachytherapy verification system.
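
    The in-plane dwell-position measurement can be illustrated as a signal-weighted centroid on a toy EPID frame (the image model and numbers are assumptions; the z estimate in the paper additionally uses the SDD-dependent spread of the signal):

    ```python
    # Signal-weighted centroid of a simulated EPID dose-rate image.
    import numpy as np

    rng = np.random.default_rng(5)
    ny, nx, pitch = 256, 256, 0.4                  # pixels and pixel pitch (mm)
    yy, xx = np.mgrid[0:ny, 0:nx]
    true = (140.3, 90.7)                           # projected source position, pixels
    img = np.exp(-((yy - true[0]) ** 2 + (xx - true[1]) ** 2) / (2 * 12.0 ** 2))
    img += rng.normal(0, 0.02, img.shape)          # detector noise

    w = np.clip(img - 0.1 * img.max(), 0, None)    # suppress background before weighting
    cy, cx = (w * yy).sum() / w.sum(), (w * xx).sum() / w.sum()
    print(f"centroid ({cy:.2f}, {cx:.2f}) px = ({cy*pitch:.2f}, {cx*pitch:.2f}) mm")
    ```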

  6. Heterogeneity of activated carbons in adsorption of phenols from aqueous solutions—Comparison of experimental isotherm data and simulation predictions

    NASA Astrophysics Data System (ADS)

    Podkościelny, P.; Nieszporek, K.

    2007-01-01

    Surface heterogeneity of activated carbons is usually characterized by adsorption energy distribution (AED) functions, which can be estimated from experimental adsorption isotherms by inverting an integral equation. The experimental data of phenol adsorption from aqueous solution on activated carbons prepared from polyacrylonitrile (PAN) and polyethylene terephthalate (PET) have been taken from the literature. AED functions for phenol adsorption, generated by application of the regularization method, were verified. The Grand Canonical Monte Carlo (GCMC) simulation technique was used as the verification tool. The definitive stage of verification was a comparison of the experimental adsorption data with those obtained from GCMC simulations. The information necessary for performing the simulations was provided by the parameters of the AED functions calculated by the regularization method.
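
    The inversion step can be sketched with a Langmuir local isotherm and a non-negative, Tikhonov-regularized least-squares solve (a generic stand-in: the study's regularization method and parameter choices are not reproduced here):

    ```python
    # Recover an adsorption-energy distribution f(E) from synthetic coverage data
    # theta(c) = sum_j K(c, E_j) f_j, with K a Langmuir kernel.
    import numpy as np
    from scipy.optimize import nnls

    R, T = 8.314, 298.0
    E = np.linspace(10e3, 40e3, 60)               # adsorption energies, J/mol
    c = np.logspace(-5, 1, 30)                    # bulk concentrations (arb. units)

    K0 = 1e-7
    Kc = K0 * c[:, None] * np.exp(E[None, :] / (R * T))
    kernel = Kc / (1.0 + Kc)                      # Langmuir local isotherm

    f_true = np.exp(-((E - 22e3) / 3e3) ** 2)
    f_true /= f_true.sum()                        # site fractions on the energy grid
    theta = kernel @ f_true
    theta += np.random.default_rng(6).normal(0, 0.002, theta.size)

    lam = 1e-3                                    # Tikhonov strength (chosen by eye)
    A = np.vstack([kernel, np.sqrt(lam) * np.eye(E.size)])
    b = np.concatenate([theta, np.zeros(E.size)])
    f_est, _ = nnls(A, b)
    print(f"recovered AED peak at {E[f_est.argmax()]/1e3:.1f} kJ/mol (true 22.0)")
    ```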

  7. Shift Verification and Validation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Pandya, Tara M.; Evans, Thomas M.; Davidson, Gregory G

    2016-09-07

    This documentation outlines the verification and validation of Shift for the Consortium for Advanced Simulation of Light Water Reactors (CASL). Five main types of problems were used for validation: small criticality benchmark problems; full-core reactor benchmarks for light water reactors; fixed-source coupled neutron-photon dosimetry benchmarks; depletion/burnup benchmarks; and full-core reactor performance benchmarks. We compared Shift results to measured data and other simulated Monte Carlo radiation transport code results, and found very good agreement in a variety of comparison measures. These include prediction of critical eigenvalue, radial and axial pin power distributions, rod worth, leakage spectra, and nuclide inventories over a burn cycle. Based on this validation of Shift, we are confident in Shift to provide reference results for CASL benchmarking.

  8. Formal Verification of a Conflict Resolution and Recovery Algorithm

    NASA Technical Reports Server (NTRS)

    Maddalon, Jeffrey; Butler, Ricky; Geser, Alfons; Munoz, Cesar

    2004-01-01

    New air traffic management concepts distribute the duty of traffic separation among system participants. As a consequence, these concepts depend heavily on on-board software and hardware systems. One example of a new on-board capability in a distributed air traffic management system is air traffic conflict detection and resolution (CD&R). Traditional methods for safety assessment such as human-in-the-loop simulations, testing, and flight experiments may not be sufficient for this highly distributed system, as the set of possible scenarios is too large to have reasonable coverage. This paper proposes a new method for the safety assessment of avionics systems that makes use of formal methods to drive the development of critical systems. As a case study of this approach, the mechanical verification of an algorithm for air traffic conflict resolution and recovery called RR3D is presented. The RR3D algorithm uses a geometric optimization technique to provide a choice of resolution and recovery maneuvers. If the aircraft adheres to these maneuvers, they will bring the aircraft out of conflict and the aircraft will follow a conflict-free path to its original destination. Verification of RR3D is carried out using the Prototype Verification System (PVS).

  9. High-Resolution Fast-Neutron Spectrometry for Arms Control and Treaty Verification

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    David L. Chichester; James T. Johnson; Edward H. Seabury

    2012-07-01

    Many nondestructive nuclear analysis techniques have been developed to support the measurement needs of arms control and treaty verification, including gross photon and neutron counting, low- and high-resolution gamma spectrometry, time-correlated neutron measurements, and photon and neutron imaging. One notable measurement technique that has not been extensively studied to date for these applications is high-resolution fast-neutron spectrometry (HRFNS). Applied for arms control and treaty verification, HRFNS has the potential to serve as a complementary measurement approach to these other techniques by providing a means to either qualitatively or quantitatively determine the composition and thickness of non-nuclear materials surrounding neutron-emitting materials. The technique uses the normally-occurring neutrons present in arms control and treaty verification objects of interest as an internal source of neutrons for performing active-interrogation transmission measurements. Most low-Z nuclei of interest for arms control and treaty verification, including 9Be, 12C, 14N, and 16O, possess fast-neutron resonance features in their absorption cross sections in the 0.5- to 5-MeV energy range. Measuring the selective removal of source neutrons over this energy range, assuming for example a fission-spectrum starting distribution, may be used to estimate the stoichiometric composition of intervening materials between the neutron source and detector. At a simpler level, determination of the emitted fast-neutron spectrum may be used for fingerprinting 'known' assemblies for later use in template-matching tests. As with photon spectrometry, automated analysis of fast-neutron spectra may be performed to support decision making and reporting systems protected behind information barriers. This paper will report recent work at Idaho National Laboratory to explore the feasibility of using HRFNS for arms control and treaty verification applications, including simulations and experiments, using fission-spectrum neutron sources to assess neutron transmission through composite low-Z attenuators.

  10. Control/structure interaction design methodology

    NASA Technical Reports Server (NTRS)

    Briggs, Hugh C.; Layman, William E.

    1989-01-01

    The Control Structure Interaction Program is a technology development program for spacecraft that exhibit interactions between the control system and structural dynamics. The program objectives include development and verification of new design concepts (such as active structure) and new tools (such as a combined structure and control optimization algorithm) and their verification in ground and possibly flight test. The new CSI design methodology is centered around interdisciplinary engineers using new tools that closely integrate structures and controls. Verification is an important CSI theme and analysts will be closely integrated to the CSI Test Bed laboratory. Components, concepts, tools and algorithms will be developed and tested in the lab and in future Shuttle-based flight experiments. The design methodology is summarized in block diagrams depicting the evolution of a spacecraft design and descriptions of analytical capabilities used in the process. The multiyear JPL CSI implementation plan is described along with the essentials of several new tools. A distributed network of computation servers and workstations was designed that will provide a state-of-the-art development base for the CSI technologies.

  11. A formally verified algorithm for interactive consistency under a hybrid fault model

    NASA Technical Reports Server (NTRS)

    Lincoln, Patrick; Rushby, John

    1993-01-01

    Consistent distribution of single-source data to replicated computing channels is a fundamental problem in fault-tolerant system design. The 'Oral Messages' (OM) algorithm solves this problem of Interactive Consistency (Byzantine Agreement) assuming that all faults are worst-case. Thambidurai and Park introduced a 'hybrid' fault model that distinguishes three fault modes: asymmetric (Byzantine), symmetric, and benign; they also exhibited, along with an informal 'proof of correctness', a modified version of OM. Unfortunately, their algorithm is flawed. The discipline of mechanically checked formal verification eventually enabled us to develop a correct algorithm for Interactive Consistency under the hybrid fault model. This algorithm withstands $a$ asymmetric, $s$ symmetric, and $b$ benign faults simultaneously, using $m+1$ rounds, provided $n > 2a + 2s + b + m$ and $m \ge a$. We present this algorithm, discuss its subtle points, and describe its formal specification and verification in PVS. We argue that formal verification systems such as PVS are now sufficiently effective that their application to fault-tolerance algorithms should be considered routine.
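
    For context, the classic OM(m) recursion that the hybrid-fault algorithm generalizes can be sketched as below; the hybrid extension (e.g., letting benign faults manifest as a detectably missing value that is excluded from the vote) is described in the paper and not reproduced here.

    ```python
    # Classic OM(m) Oral Messages (Lamport-Shostak-Pease); faulty nodes send
    # arbitrary values. With n = 7 and m = 2 the loyal lieutenants still agree.
    import random
    from collections import Counter

    VALUES, DEFAULT = ("ATTACK", "RETREAT"), "RETREAT"

    def majority(votes):
        (top, n1), *rest = Counter(votes).most_common()
        return top if not rest or n1 > rest[0][1] else DEFAULT

    def om(general, others, value, m, faulty, rng):
        """Return {receiver: decided value} after OM(m) with `general` commanding."""
        def send(v):                      # a faulty general sends random values
            return rng.choice(VALUES) if general in faulty else v
        received = {p: send(value) for p in others}
        if m == 0:
            return received
        decided = {}
        for p in others:
            votes = [received[p]]
            for q in others:
                if q != p:                # q relays what it received via OM(m-1)
                    votes.append(om(q, [r for r in others if r != q],
                                    received[q], m - 1, faulty, rng)[p])
            decided[p] = majority(votes)
        return decided

    rng = random.Random(0)
    print(om(0, list(range(1, 7)), "ATTACK", 2, faulty={3, 5}, rng=rng))
    ```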

  12. Evaluation of Gafchromic EBT-XD film, with comparison to EBT3 film, and application in high dose radiotherapy verification.

    PubMed

    Palmer, Antony L; Dimitriadis, Alexis; Nisbet, Andrew; Clark, Catharine H

    2015-11-21

    There is renewed interest in film dosimetry for the verification of dose delivery of complex treatments, particularly small fields, compared to treatment planning system calculations. A new radiochromic film, Gafchromic EBT-XD, is available for high-dose treatment verification and we present the first published evaluation of its use. We evaluate the new film for MV photon dosimetry, including calibration curves, performance with single- and triple-channel dosimetry, and comparison to existing EBT3 film. In the verification of a typical 25 Gy stereotactic radiotherapy (SRS) treatment, compared to TPS planned dose distribution, excellent agreement was seen with EBT-XD using triple-channel dosimetry, in isodose overlay, maximum 1.0 mm difference over 200-2400 cGy, and gamma evaluation, mean passing rate 97% at 3% locally-normalised, 1.5 mm criteria. In comparison to EBT3, EBT-XD gave improved evaluation results for the SRS-plan, had improved calibration curve gradients at high doses, and had reduced lateral scanner effect. The dimensions of the two films are identical. The optical density of EBT-XD is lower than EBT3 for the same dose. The effective atomic number for both may be considered water-equivalent in MV radiotherapy. We have validated the use of EBT-XD for high-dose, small-field radiotherapy, for routine QC and a forthcoming multi-centre SRS dosimetry intercomparison.
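
    The gamma evaluation quoted above combines a dose-difference criterion with a distance-to-agreement criterion; a brute-force 2D sketch with local normalization (toy dose maps; 3%/1.5 mm criteria as in the abstract):

    ```python
    # Fraction of measured points with gamma <= 1 against a reference dose map.
    import numpy as np

    def gamma_pass_rate(ref, meas, pitch_mm, dd=0.03, dta_mm=1.5, search_mm=5.0):
        r = int(np.ceil(search_mm / pitch_mm))
        ny, nx = meas.shape
        cutoff = 0.1 * meas.max()                 # ignore the low-dose region
        passed = total = 0
        for i in range(ny):
            for j in range(nx):
                if meas[i, j] < cutoff:
                    continue
                best = np.inf
                for di in range(-r, r + 1):
                    for dj in range(-r, r + 1):
                        ii, jj = i + di, j + dj
                        if 0 <= ii < ny and 0 <= jj < nx:
                            dist2 = (di * di + dj * dj) * pitch_mm ** 2
                            ddose = (meas[i, j] - ref[ii, jj]) / ref[ii, jj]
                            best = min(best, dist2 / dta_mm ** 2 + (ddose / dd) ** 2)
                passed += best <= 1.0
                total += 1
        return passed / total

    yy, xx = np.mgrid[0:60, 0:60].astype(float)
    plan = 2400 * np.exp(-((yy - 30) ** 2 + (xx - 30) ** 2) / 200)       # cGy
    film = 0.99 * 2400 * np.exp(-((yy - 30.8) ** 2 + (xx - 30) ** 2) / 200)
    print(f"gamma pass rate: {100 * gamma_pass_rate(plan, film, 0.5):.1f}%")
    ```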

  13. Evaluation of Gafchromic EBT-XD film, with comparison to EBT3 film, and application in high dose radiotherapy verification

    NASA Astrophysics Data System (ADS)

    Palmer, Antony L.; Dimitriadis, Alexis; Nisbet, Andrew; Clark, Catharine H.

    2015-11-01

    There is renewed interest in film dosimetry for the verification of dose delivery of complex treatments, particularly small fields, compared to treatment planning system calculations. A new radiochromic film, Gafchromic EBT-XD, is available for high-dose treatment verification and we present the first published evaluation of its use. We evaluate the new film for MV photon dosimetry, including calibration curves, performance with single- and triple-channel dosimetry, and comparison to existing EBT3 film. In the verification of a typical 25 Gy stereotactic radiotherapy (SRS) treatment, compared to TPS planned dose distribution, excellent agreement was seen with EBT-XD using triple-channel dosimetry, in isodose overlay, maximum 1.0 mm difference over 200-2400 cGy, and gamma evaluation, mean passing rate 97% at 3% locally-normalised, 1.5 mm criteria. In comparison to EBT3, EBT-XD gave improved evaluation results for the SRS-plan, had improved calibration curve gradients at high doses, and had reduced lateral scanner effect. The dimensions of the two films are identical. The optical density of EBT-XD is lower than EBT3 for the same dose. The effective atomic number for both may be considered water-equivalent in MV radiotherapy. We have validated the use of EBT-XD for high-dose, small-field radiotherapy, for routine QC and a forthcoming multi-centre SRS dosimetry intercomparison.

  14. Dosimetry investigation of MOSFET for clinical IMRT dose verification.

    PubMed

    Deshpande, Sudesh; Kumar, Rajesh; Ghadi, Yogesh; Neharu, R M; Kannan, V

    2013-06-01

    In IMRT, patient-specific dose verification is performed regularly at each centre. Simple and efficient dosimetry techniques play a very important role in routine clinical dosimetry QA. The MOSFET dosimeter offers several advantages over conventional dosimeters, such as its small detector size, immediate readout, immediate reuse, and multiple point dose measurements. To use the MOSFET as a routine clinical dosimetry system for pre-treatment dose verification in IMRT, a comprehensive set of experiments was conducted to investigate its linearity, reproducibility, dose rate effect and angular dependence for a 6 MV x-ray beam. The MOSFET shows a linear response with a linearity coefficient of 0.992 for a dose range of 35 cGy to 427 cGy. The reproducibility of the MOSFET was measured by irradiating the MOSFET for ten consecutive irradiations in the dose range of 35 cGy to 427 cGy. The measured reproducibility was found to be within 4% up to 70 cGy and within 1.4% above 70 cGy. The dose rate effect on the MOSFET was investigated in the dose rate range 100 MU/min to 600 MU/min. The response of the MOSFET varies from -1.7% to 2.1%. The angular responses of the MOSFETs were measured at 10-degree intervals from 90 to 270 degrees in an anticlockwise direction, normalized at gantry angle zero, and found to be in the range of 0.98 ± 0.014 to 1.01 ± 0.014. The MOSFETs were calibrated in a phantom which was later used for IMRT verification. The measured calibration coefficients were found to be 1 mV/cGy and 2.995 mV/cGy in standard and high sensitivity modes, respectively. The MOSFETs were used for pre-treatment dose verification in IMRT. Nine dosimeters were used for each patient to measure the dose in different planes. The average variation between calculated and measured dose at any location was within 3%. Dose verification using the MOSFET and IMRT phantom was found to be quick and efficient and well suited to a busy radiotherapy department.

  15. INNOVATIVE TECHNOLOGY VERIFICATION REPORT XRF TECHNOLOGIES FOR MEASURING TRACE ELEMENTS IN SOIL AND SEDIMENT OXFORD X-MET 3000TX XRF ANALYZER

    EPA Science Inventory

    The Elvatech, Ltd. ElvaX (ElvaX) x-ray fluorescence (XRF) analyzer distributed in the United States by Xcalibur XRF Services (Xcalibur), was demonstrated under the U.S. Environmental Protection Agency (EPA) Superfund Innovative Technology Evaluation (SITE) Program. The field por...

  16. Field Verification of New and Innovative Technologies for the Assessment and Rehabilitation of Drinking Water Distribution Systems and Wastewater Collection Systems

    EPA Science Inventory

    This project will contribute valuable information on the performance characteristics of new technology for use in infrastructure rehabilitation, and will provide additional credibility to the U.S. Environment Protection Agency’s (EPA) Office of Research and Development’s (ORD) fo...

  17. Soundscapes

    DTIC Science & Technology

    2013-09-30

    STATEMENT A. Approved for public release; distribution is unlimited. Soundscapes. Michael B...models to provide hindcasts, nowcasts, and forecasts of the time-evolving soundscape. In terms of the types of sound sources, we will focus initially on...APPROACH: The research has two principal thrusts: 1) the modeling of the soundscape, and 2) verification using datasets that have been collected

  18. Soundscapes

    DTIC Science & Technology

    2012-09-30

    STATEMENT A. Approved for public release; distribution is unlimited. Soundscapes. Michael B...models to provide hindcasts, nowcasts, and forecasts of the time-evolving soundscape. In terms of the types of sound sources, we will focus initially on...APPROACH: The research has two principal thrusts: 1) the modeling of the soundscape, and 2) verification using datasets that have been collected

  19. Calibration of a distributed routing rainfall-runoff model at four urban sites near Miami, Florida

    USGS Publications Warehouse

    Doyle, W. Harry; Miller, Jeffrey E.

    1980-01-01

    Urban stormwater data from four Miami, Fla., catchments were collected and compiled by the U.S. Geological Survey and were used for testing the applicability of deterministic modeling for characterizing stormwater flows from small land-use areas. A description of model calibration and verification is presented for: (1) a 40.8-acre single-family residential area, (2) a 58.3-acre highway area, (3) a 20.4-acre commercial area, and (4) a 14.7-acre multifamily residential area. Rainfall-runoff data for 80, 108, 114, and 52 storms at sites 1, 2, 3, and 4, respectively, were collected, analyzed, and stored on direct-access files. Rainfall and runoff data for these storms (at 1-minute time intervals) were used in flow-modeling simulation analyses. A distributed routing Geological Survey rainfall-runoff model was used to determine rainfall excess and route overland and channel flows at each site. Optimization of soil-moisture-accounting and infiltration parameters was performed during the calibration phases. The results of this study showed that, with qualifications, an acceptable verification of the Geological Survey model can be achieved. (Kosco-USGS)
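
    A much-simplified stand-in for the rainfall-excess and routing steps (a single linear reservoir with a constant infiltration loss; all parameters invented, unlike the model's optimized soil-moisture accounting):

    ```python
    # Rainfall excess = rain minus a constant infiltration rate; route it through
    # a linear reservoir with outflow Q = S / k (explicit Euler, 1-minute steps).
    import numpy as np

    dt = 60.0                                         # seconds per 1-minute step
    rain = np.zeros(180); rain[10:40] = 25.0 / 3600.0 # 30-minute 25 mm/h burst, mm/s
    infil_rate = 5.0 / 3600.0                         # infiltration loss, mm/s
    k = 1200.0                                        # reservoir time constant, s

    excess = np.clip(rain - infil_rate, 0.0, None)
    S, Q = 0.0, np.zeros(rain.size)                   # storage (mm), outflow (mm/s)
    for i, p in enumerate(excess):
        S += (p - S / k) * dt                         # water balance dS/dt = P - S/k
        Q[i] = S / k

    print(f"peak runoff {Q.max()*3600:.1f} mm/h at minute {int(Q.argmax())}")
    ```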

  20. MAGIC polymer gel for dosimetric verification in boron neutron capture therapy

    PubMed Central

    Heikkinen, Sami; Kotiluoto, Petri; Serén, Tom; Seppälä, Tiina; Auterinen, Iiro; Savolainen, Sauli

    2007-01-01

    Radiation‐sensitive polymer gels are among the most promising three‐dimensional dose verification tools developed to date. We tested the normoxic polymer gel dosimeter known by the acronym MAGIC (methacrylic and ascorbic acid in gelatin initiated by copper) to evaluate its use in boron neutron capture therapy (BNCT) dosimetry. We irradiated a large cylindrical gel phantom (diameter: 10 cm; length: 20 cm) in the epithermal neutron beam of the Finnish BNCT facility at the FiR 1 nuclear reactor. Neutron irradiation was simulated with a Monte Carlo radiation transport code MCNP. To compare dose–response, gel samples from the same production batch were also irradiated with 6 MV photons from a medical linear accelerator. Irradiated gel phantoms then underwent magnetic resonance imaging to determine their R2 relaxation rate maps. The measured and normalized dose distribution in the epithermal neutron beam was compared with the dose distribution calculated by computer simulation. The results support the feasibility of using MAGIC gel in BNCT dosimetry. PACS numbers: 87.53.Qc, 87.53.Wz, 87.66.Ff PMID:17592463

  1. Retrospective cost adaptive Reynolds-averaged Navier-Stokes k-ω model for data-driven unsteady turbulent simulations

    NASA Astrophysics Data System (ADS)

    Li, Zhiyong; Hoagg, Jesse B.; Martin, Alexandre; Bailey, Sean C. C.

    2018-03-01

    This paper presents a data-driven computational model for simulating unsteady turbulent flows, where sparse measurement data is available. The model uses the retrospective cost adaptation (RCA) algorithm to automatically adjust the closure coefficients of the Reynolds-averaged Navier-Stokes (RANS) k-ω turbulence equations to improve agreement between the simulated flow and the measurements. The RCA-RANS k-ω model is verified for steady flow using a pipe-flow test case and for unsteady flow using a surface-mounted-cube test case. Measurements used for adaptation of the verification cases are obtained from baseline simulations with known closure coefficients. These verification test cases demonstrate that the RCA-RANS k-ω model can successfully adapt the closure coefficients to improve agreement between the simulated flow field and a set of sparse flow-field measurements. Furthermore, the RCA-RANS k-ω model improves agreement between the simulated flow and the baseline flow at locations at which measurements do not exist. The RCA-RANS k-ω model is also validated with experimental data from 2 test cases: steady pipe flow, and unsteady flow past a square cylinder. In both test cases, the adaptation improves agreement with experimental data in comparison to the results from a non-adaptive RANS k-ω model that uses the standard values of the k-ω closure coefficients. For the steady pipe flow, adaptation is driven by mean stream-wise velocity measurements at 24 locations along the pipe radius. The RCA-RANS k-ω model reduces the average velocity error at these locations by over 35%. For the unsteady flow over a square cylinder, adaptation is driven by time-varying surface pressure measurements at 2 locations on the square cylinder. The RCA-RANS k-ω model reduces the average surface-pressure error at these locations by 88.8%.
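
    A scalar caricature of the retrospective-cost idea, adjusting one model coefficient to reduce the accumulated error against sparse measurements (a toy decay model; not the RCA-RANS implementation or its actual cost function):

    ```python
    # Gradient descent on the retrospective (accumulated) squared error of a
    # one-parameter model against sparse "measurements".
    import numpy as np

    def simulate(c, t):                  # toy model: c plays the role of a
        return np.exp(-c * t)            # turbulence closure coefficient

    t_meas = np.array([0.5, 1.0, 2.0])   # sparse measurement locations
    y_meas = simulate(0.8, t_meas) + 0.01 * np.random.default_rng(7).normal(size=3)

    c, lr = 0.3, 0.2                     # initial coefficient, step size
    for _ in range(300):
        resid = simulate(c, t_meas) - y_meas            # retrospective errors
        grad = 2 * np.sum(resid * (-t_meas) * simulate(c, t_meas))
        c -= lr * grad                                  # descend the cost

    print(f"adapted coefficient c = {c:.3f} (truth 0.8)")
    ```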

  2. Location of photographs showing landslide features in the Little North Santiam River Basin, Oregon

    USGS Publications Warehouse

    Sobieszczyk, Steven

    2010-01-01

    Data points represent locations of photographs taken of landslides in the Little North Santiam River Basin, Oregon. Photos were taken in spring of 2010 during field verification of landslide locations (deposits previously mapped using LiDAR-derived imagery). The photographs depict various landslide features, such as scarps, pistol-butt trees, or colluvium deposits. This work was completed as part of the Master's thesis "Turbidity Monitoring and LiDAR Imagery Indicate Landslides are Primary Source of Suspended-Sediment Load in the Little North Santiam River Basin, Oregon, Winter 2009-2010" by Steven Sobieszczyk, Portland State University and U.S. Geological Survey. Data layers in this geodatabase include: landslide deposit boundaries (Deposits); field-verified location imagery (Photos); head scarp or scarp flanks (Scarp_Flanks); and secondary scarp features (Scarps). The geodatabase template was developed by the Oregon Department of Geology and Mineral Industries (Burns and Madin, 2009).

  3. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jordan, Amy B.; Zyvoloski, George Anthony; Weaver, Douglas James

    The simulation work presented in this report supports DOE-NE Used Fuel Disposition Campaign (UFDC) goals related to the development of drift-scale in-situ field testing of heat-generating nuclear waste (HGNW) in salt formations. Numerical code verification and validation is an important part of the lead-up to field testing, allowing exploration of potential heater emplacement designs and monitoring locations, and perhaps most importantly the ability to predict heat and mass transfer around an evolving test. Such predictions are crucial for the design and location of sampling and monitoring that can be used to validate our understanding of a drift-scale test that is likely to span several years.

  4. EPA Facility Registry System (FRS): NEPT

    EPA Pesticide Factsheets

    This web feature service contains location and facility identification information from EPA's Facility Registry System (FRS) for the subset of facilities that link to the National Environmental Performance Track (NEPT) Program dataset. FRS identifies and geospatially locates facilities, sites or places subject to environmental regulations or of environmental interest. Using vigorous verification and data management procedures, FRS integrates facility data from EPA's national program systems, other federal agencies, and State and tribal master facility records and provides EPA with a centrally managed, single source of comprehensive and authoritative information on facilities. Additional information on FRS is available at the EPA website https://www.epa.gov/enviro/facility-registry-service-frs

  5. EPA Facility Registry Service (FRS): NEI

    EPA Pesticide Factsheets

    This web feature service contains location and facility identification information from EPA's Facility Registry Service (FRS) for the subset of facilities that link to the National Emissions Inventory (NEI) Program dataset. FRS identifies and geospatially locates facilities, sites or places subject to environmental regulations or of environmental interest. Using vigorous verification and data management procedures, FRS integrates facility data from EPA's national program systems, other federal agencies, and State and tribal master facility records and provides EPA with a centrally managed, single source of comprehensive and authoritative information on facilities. Additional information on FRS is available at the EPA website https://www.epa.gov/enviro/facility-registry-service-frs

  6. Proceedings of the Second NASA Formal Methods Symposium

    NASA Technical Reports Server (NTRS)

    Munoz, Cesar (Editor)

    2010-01-01

    This publication contains the proceedings of the Second NASA Formal Methods Symposium sponsored by the National Aeronautics and Space Administration and held in Washington D.C. April 13-15, 2010. Topics covered include: Decision Engines for Software Analysis using Satisfiability Modulo Theories Solvers; Verification and Validation of Flight-Critical Systems; Formal Methods at Intel -- An Overview; Automatic Review of Abstract State Machines by Meta Property Verification; Hardware-independent Proofs of Numerical Programs; Slice-based Formal Specification Measures -- Mapping Coupling and Cohesion Measures to Formal Z; How Formal Methods Impels Discovery: A Short History of an Air Traffic Management Project; A Machine-Checked Proof of A State-Space Construction Algorithm; Automated Assume-Guarantee Reasoning for Omega-Regular Systems and Specifications; Modeling Regular Replacement for String Constraint Solving; Using Integer Clocks to Verify the Timing-Sync Sensor Network Protocol; Can Regulatory Bodies Expect Efficient Help from Formal Methods?; Synthesis of Greedy Algorithms Using Dominance Relations; A New Method for Incremental Testing of Finite State Machines; Verification of Faulty Message Passing Systems with Continuous State Space in PVS; Phase Two Feasibility Study for Software Safety Requirements Analysis Using Model Checking; A Prototype Embedding of Bluespec System Verilog in the PVS Theorem Prover; SimCheck: An Expressive Type System for Simulink; Coverage Metrics for Requirements-Based Testing: Evaluation of Effectiveness; Software Model Checking of ARINC-653 Flight Code with MCP; Evaluation of a Guideline by Formal Modelling of Cruise Control System in Event-B; Formal Verification of Large Software Systems; Symbolic Computation of Strongly Connected Components Using Saturation; Towards the Formal Verification of a Distributed Real-Time Automotive System; Slicing AADL Specifications for Model Checking; Model Checking with Edge-valued Decision Diagrams; and Data-flow based Model Analysis.

  7. Technical Note: Range verification system using edge detection method for a scintillator and a CCD camera system.

    PubMed

    Saotome, Naoya; Furukawa, Takuji; Hara, Yousuke; Mizushima, Kota; Tansho, Ryohei; Saraya, Yuichi; Shirai, Toshiyuki; Noda, Koji

    2016-04-01

    Three-dimensional irradiation with a scanned carbon-ion beam has been performed since 2011 at the authors' facility. The authors have developed a rotating gantry equipped with the scanning irradiation system. The number of combinations of beam properties to measure for commissioning is more than 7200, i.e., 201 energy steps, 3 intensities, and 12 gantry angles. To compress the commissioning time, a quick and simple range verification system is required. In this work, the authors develop a quick range verification system using a scintillator and a charge-coupled device (CCD) camera and estimate the accuracy of the range verification. A cylindrical plastic scintillator block and a CCD camera were installed in a black box. The optical spatial resolution of the system is 0.2 mm/pixel. The camera control system was connected to and communicates with the measurement system that is part of the scanning system. The range was determined by image processing. The reference range for each beam energy was determined by a difference of Gaussian (DOG) method and by the 80% distal dose of the depth-dose distribution measured with a large parallel-plate ionization chamber. The authors compared a threshold method and a DOG method and found that the edge detection method (i.e., the DOG method) is best for range detection. The accuracy of range detection using this system is within 0.2 mm, and the reproducibility of the same energy measurement is within 0.1 mm without setup error. The results of this study demonstrate that the authors' range check system is capable of quick and easy range verification with sufficient accuracy.
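    The DOG edge detection named above can be illustrated on a one-dimensional depth-light profile: subtracting two Gaussian smoothings acts approximately as a curvature filter, so the zero crossing distal to the Bragg peak marks the steepest point of the falloff. A minimal sketch with a synthetic profile and made-up parameters (the paper's actual processing details are not reproduced):

```python
import numpy as np
from scipy.ndimage import gaussian_filter1d

# Synthetic depth-light profile (0.2 mm/pixel, as in the paper's optics):
# a Bragg-peak-like bump followed by a sharp distal falloff near pixel 150.
z = np.arange(300)
profile = 1.0 / (1.0 + np.exp((z - 150) / 2.0)) \
        + 0.4 * np.exp(-0.5 * ((z - 140) / 6.0) ** 2)
profile += np.random.default_rng(0).normal(0.0, 0.005, z.size)  # detector noise

# Difference of Gaussians: approximately a band-pass (curvature) filter,
# so its zero crossing sits at the inflection of the distal falloff.
dog = gaussian_filter1d(profile, 2.0) - gaussian_filter1d(profile, 6.0)

peak = int(np.argmax(profile))              # Bragg peak position (pixels)
sign = np.sign(dog[peak:])
zc = peak + int(np.argmax(sign[:-1] * sign[1:] < 0))  # first sign change distal to peak
print("estimated range: %.1f mm" % (zc * 0.2))
```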

  8. Analyzing Personalized Policies for Online Biometric Verification

    PubMed Central

    Sadhwani, Apaar; Yang, Yan; Wein, Lawrence M.

    2014-01-01

    Motivated by India’s nationwide biometric program for social inclusion, we analyze verification (i.e., one-to-one matching) in the case where we possess similarity scores for 10 fingerprints and two irises between a resident’s biometric images at enrollment and his biometric images during his first verification. At subsequent verifications, we allow individualized strategies based on these 12 scores: we acquire a subset of the 12 images, get new scores for this subset that quantify the similarity to the corresponding enrollment images, and use the likelihood ratio (i.e., the likelihood of observing these scores if the resident is genuine divided by the corresponding likelihood if the resident is an imposter) to decide whether a resident is genuine or an imposter. We also consider two-stage policies, where additional images are acquired in a second stage if the first-stage results are inconclusive. Using performance data from India’s program, we develop a new probabilistic model for the joint distribution of the 12 similarity scores and find near-optimal individualized strategies that minimize the false reject rate (FRR) subject to constraints on the false accept rate (FAR) and mean verification delay for each resident. Our individualized policies achieve the same FRR as a policy that acquires (and optimally fuses) 12 biometrics for each resident, which represents a five (four, respectively) log reduction in FRR relative to fingerprint (iris, respectively) policies previously proposed for India’s biometric program. The mean delay is sec for our proposed policy, compared to 30 sec for a policy that acquires one fingerprint and 107 sec for a policy that acquires all 12 biometrics. This policy acquires iris scans from 32–41% of residents (depending on the FAR) and acquires an average of 1.3 fingerprints per resident. PMID:24787752

  9. Technical Note: Range verification system using edge detection method for a scintillator and a CCD camera system

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Saotome, Naoya, E-mail: naosao@nirs.go.jp; Furukawa, Takuji; Hara, Yousuke

    Purpose: Three-dimensional irradiation with a scanned carbon-ion beam has been performed since 2011 at the authors' facility. The authors have developed a rotating gantry equipped with the scanning irradiation system. The number of combinations of beam properties to measure for commissioning is more than 7200, i.e., 201 energy steps, 3 intensities, and 12 gantry angles. To compress the commissioning time, a quick and simple range verification system is required. In this work, the authors develop a quick range verification system using a scintillator and a charge-coupled device (CCD) camera and estimate the accuracy of the range verification. Methods: A cylindrical plastic scintillator block and a CCD camera were installed in a black box. The optical spatial resolution of the system is 0.2 mm/pixel. The camera control system was connected to and communicates with the measurement system that is part of the scanning system. The range was determined by image processing. The reference range for each beam energy was determined by a difference of Gaussian (DOG) method and by the 80% distal dose of the depth-dose distribution measured with a large parallel-plate ionization chamber. The authors compared a threshold method and a DOG method. Results: The authors found that the edge detection method (i.e., the DOG method) is best for range detection. The accuracy of range detection using this system is within 0.2 mm, and the reproducibility of the same energy measurement is within 0.1 mm without setup error. Conclusions: The results of this study demonstrate that the authors' range check system is capable of quick and easy range verification with sufficient accuracy.

  10. Analyzing personalized policies for online biometric verification.

    PubMed

    Sadhwani, Apaar; Yang, Yan; Wein, Lawrence M

    2014-01-01

    Motivated by India's nationwide biometric program for social inclusion, we analyze verification (i.e., one-to-one matching) in the case where we possess similarity scores for 10 fingerprints and two irises between a resident's biometric images at enrollment and his biometric images during his first verification. At subsequent verifications, we allow individualized strategies based on these 12 scores: we acquire a subset of the 12 images, get new scores for this subset that quantify the similarity to the corresponding enrollment images, and use the likelihood ratio (i.e., the likelihood of observing these scores if the resident is genuine divided by the corresponding likelihood if the resident is an imposter) to decide whether a resident is genuine or an imposter. We also consider two-stage policies, where additional images are acquired in a second stage if the first-stage results are inconclusive. Using performance data from India's program, we develop a new probabilistic model for the joint distribution of the 12 similarity scores and find near-optimal individualized strategies that minimize the false reject rate (FRR) subject to constraints on the false accept rate (FAR) and mean verification delay for each resident. Our individualized policies achieve the same FRR as a policy that acquires (and optimally fuses) 12 biometrics for each resident, which represents a five (four, respectively) log reduction in FRR relative to fingerprint (iris, respectively) policies previously proposed for India's biometric program. The mean delay is [Formula: see text] sec for our proposed policy, compared to 30 sec for a policy that acquires one fingerprint and 107 sec for a policy that acquires all 12 biometrics. This policy acquires iris scans from 32-41% of residents (depending on the FAR) and acquires an average of 1.3 fingerprints per resident.
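    The likelihood-ratio decision rule described in both versions of this abstract can be sketched compactly. The toy below assumes independent Gaussian score models for genuine users and imposters, which is a deliberate simplification: the paper fits a joint probabilistic model over all 12 similarity scores. All distribution parameters and the threshold are invented.

```python
import numpy as np
from scipy.stats import norm

# Hypothetical score models: similarity scores are higher for genuine users.
genuine = norm(loc=70.0, scale=8.0)     # fitted from enrollment/verification data
imposter = norm(loc=40.0, scale=10.0)

def likelihood_ratio(scores):
    """LR of the observed scores under the genuine vs. imposter model,
    assuming (unrealistically) independent scores."""
    s = np.asarray(scores, dtype=float)
    return float(np.exp(genuine.logpdf(s).sum() - imposter.logpdf(s).sum()))

threshold = 100.0                # tuned to meet a target false-accept rate (FAR)
scores = [65.0, 72.0]            # e.g., two acquired fingerprint scores
decision = "genuine" if likelihood_ratio(scores) > threshold else "imposter"
print(round(likelihood_ratio(scores), 2), decision)
```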

  11. Evaluation of a single-scan protocol for radiochromic film dosimetry.

    PubMed

    Shimohigashi, Yoshinobu; Araki, Fujio; Maruyama, Masato; Nakaguchi, Yuji; Kuwahara, Satoshi; Nagasue, Nozomu; Kai, Yudai

    2015-03-08

    The purpose of this study was to evaluate a single-scan protocol using Gafchromic EBT3 film (EBT3) by comparing it with the commonly used 24-hr measurement protocol for radiochromic film dosimetry. Radiochromic film is generally scanned 24 hr after film exposure (24-hr protocol). The single-scan protocol enables measurement results within a short time using only the verification film, one calibration film, and an unirradiated film. In the single-scan protocol, films were scanned 30 min after irradiation. The EBT3 calibration curves were obtained with the multichannel film dosimetry method. The dose verifications for each protocol were performed with the step pattern, pyramid pattern, and clinical treatment plans for intensity-modulated radiation therapy (IMRT). The absolute dose distributions for each protocol were compared with those calculated by the treatment planning system (TPS) using gamma evaluation at 3% and 3 mm. The dose distribution for the single-scan protocol was within 2% of the 24-hr protocol dose distribution. For the step pattern, the absolute dose discrepancies from the TPS for the single-scan and 24-hr protocols were 2.0 ± 1.8 cGy and 1.4 ± 1.2 cGy at the dose plateau, respectively. The pass rates were 96.0% for the single-scan protocol and 95.9% for the 24-hr protocol. Similarly, the dose discrepancies for the pyramid pattern were 3.6 ± 3.5 cGy and 2.9 ± 3.3 cGy, respectively, while the pass rates were 95.3% and 96.4%, respectively. The average pass rates for the four IMRT plans were 96.7% ± 1.8% for the single-scan protocol and 97.3% ± 1.4% for the 24-hr protocol. Thus, the single-scan protocol is useful for dose verification of IMRT, based on its accuracy and efficiency.
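    Gamma evaluation at 3%/3 mm, used here to compare film and TPS dose distributions, combines a dose-difference criterion with a distance-to-agreement criterion. A minimal one-dimensional, global-normalization version is sketched below; clinical tools operate on 2D/3D grids with interpolation, so this is illustrative only.

```python
import numpy as np

def gamma_1d(ref_dose, eval_dose, positions, dd=0.03, dta=3.0):
    """Simple global 1D gamma index (dose difference dd as a fraction of the
    reference maximum, distance-to-agreement dta in mm); brute-force search
    over all evaluated points for each reference point."""
    ref_dose = np.asarray(ref_dose, float)
    eval_dose = np.asarray(eval_dose, float)
    positions = np.asarray(positions, float)
    norm = dd * ref_dose.max()
    gammas = np.empty_like(ref_dose)
    for i, (r, x) in enumerate(zip(ref_dose, positions)):
        dose_term = ((eval_dose - r) / norm) ** 2
        dist_term = ((positions - x) / dta) ** 2
        gammas[i] = np.sqrt((dose_term + dist_term).min())
    return gammas

x = np.linspace(0, 100, 201)                             # mm, 0.5 mm grid
planned = np.exp(-0.5 * ((x - 50) / 15) ** 2)            # TPS profile (a.u.)
measured = 1.01 * np.exp(-0.5 * ((x - 50.8) / 15) ** 2)  # film, slight shift
g = gamma_1d(planned, measured, x)
print("pass rate at 3%%/3 mm: %.1f%%" % (100 * np.mean(g <= 1)))
```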

  12. Evaluation of a single‐scan protocol for radiochromic film dosimetry

    PubMed Central

    Araki, Fujio; Maruyama, Masato; Nakaguchi, Yuji; Kuwahara, Satoshi; Nagasue, Nozomu; Kai, Yudai

    2015-01-01

    The purpose of this study was to evaluate a single‐scan protocol using Gafchromic EBT3 film (EBT3) by comparing it with the commonly used 24‐hr measurement protocol for radiochromic film dosimetry. Radiochromic film is generally scanned 24 hr after film exposure (24‐hr protocol). The single‐scan protocol enables measurement results within a short time using only the verification film, one calibration film, and unirradiated film. The single‐scan protocol was scanned 30 min after film irradiation. The EBT3 calibration curves were obtained with the multichannel film dosimetry method. The dose verifications for each protocol were performed with the step pattern, pyramid pattern, and clinical treatment plans for intensity‐modulated radiation therapy (IMRT). The absolute dose distributions for each protocol were compared with those calculated by the treatment planning system (TPS) using gamma evaluation at 3% and 3 mm. The dose distribution for the single‐scan protocol was within 2% of the 24‐hr protocol dose distribution. For the step pattern, the absolute dose discrepancies between the TPS for the single‐scan and 24‐hr protocols were 2.0±1.8 cGy and 1.4±1.2 cGy at the dose plateau, respectively. The pass rates were 96.0% for the single‐scan protocol and 95.9% for the 24‐hr protocol. Similarly, the dose discrepancies for the pyramid pattern were 3.6±3.5 cGy and 2.9±3.3 cGy, respectively, while the pass rates for the pyramid pattern were 95.3% and 96.4%, respectively. The average pass rates for the four IMRT plans were 96.7%±1.8% for the single‐scan protocol and 97.3%±1.4% for the 24‐hr protocol. Thus, the single‐scan protocol measurement is useful for dose verification of IMRT, based on its accuracy and efficiency. PACS number: 87.55.Qr PMID:26103194

  13. Wide-field lensing mass maps from Dark Energy Survey science verification data: Methodology and detailed analysis

    DOE PAGES

    Vikram, V.

    2015-07-29

    Weak gravitational lensing allows one to reconstruct the spatial distribution of the projected mass density across the sky. These “mass maps” provide a powerful tool for studying cosmology as they probe both luminous and dark matter. In this paper, we present a weak lensing mass map reconstructed from shear measurements in a 139 deg² area from the Dark Energy Survey (DES) science verification data. We compare the distribution of mass with that of the foreground distribution of galaxies and clusters. The overdensities in the reconstructed map correlate well with the distribution of optically detected clusters. We demonstrate that candidate superclusters and voids along the line of sight can be identified, exploiting the tight scatter of the cluster photometric redshifts. We cross-correlate the mass map with a foreground magnitude-limited galaxy sample from the same data. Our measurement gives results consistent with mock catalogs from N-body simulations that include the primary sources of statistical uncertainties in the galaxy, lensing, and photo-z catalogs. The statistical significance of the cross-correlation is at the 6.8σ level with 20 arcminute smoothing. We find that the contribution of systematics to the lensing mass maps is generally within measurement uncertainties. In this study, we analyze less than 3% of the final area that will be mapped by the DES; the tools and analysis techniques developed in this paper can be applied to forthcoming larger data sets from the survey.

  14. Verification and Validation of the Coastal Modeling System. Report 2: CMS-Wave

    DTIC Science & Technology

    2011-12-01

    Report excerpts: offshore bathymetry showing NDBC and CDIP buoy locations; CMS-Wave modeling domain ... the four measurement stations. During the same time intervals, offshore wave information was available from a Coastal Data Information Program (CDIP) buoy ... Simulations were conducted with a grid of 236 × 398 cells with variable cell spacing of 30 to 200 m (see Figure 28). Directional wave spectra from CDIP 036 served ...

  15. Advanced in-production hotspot prediction and monitoring with micro-topography

    NASA Astrophysics Data System (ADS)

    Fanton, P.; Hasan, T.; Lakcher, A.; Le-Gratiet, B.; Prentice, C.; Simiz, J.-G.; La Greca, R.; Depre, L.; Hunsche, S.

    2017-03-01

    At the 28nm technology node and below, hotspot prediction and process window control across production wafers have become increasingly critical to prevent hotspots from becoming yield-limiting defects. We previously established proof of concept for a systematic approach to identify the most critical pattern locations, i.e. hotspots, in a reticle layout by computational lithography, combining process window characteristics of these patterns with across-wafer process variation data to predict where hotspots may become yield-impacting defects [1,2]. The current paper establishes the impact of micro-topography on a 28nm metal layer, and its correlation with hotspot best focus variations across a production chip layout. Detailed topography measurements are obtained from an offline tool, and pattern-dependent best focus (BF) shifts are determined from litho simulations that include mask-3D effects. We also establish hotspot metrology and defect verification by SEM image contour extraction and contour analysis. This enables detection of catastrophic defects as well as quantitative characterization of pattern variability, i.e. local and global CD uniformity, across a wafer to establish hotspot defect and variability maps. Finally, we combine defect prediction and verification capabilities for process monitoring by on-product, guided hotspot metrology, i.e. with sampling locations determined from the defect prediction model, and achieve a prediction accuracy (capture rate) of around 75%.

  16. Dynamic analysis methods for detecting anomalies in asynchronously interacting systems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kumar, Akshat; Solis, John Hector; Matschke, Benjamin

    2014-01-01

    Detecting modifications to digital system designs, whether malicious or benign, is problematic due to the complexity of the systems being analyzed. Moreover, static analysis techniques and tools can only be used during the initial design and implementation phases to verify safety and liveness properties. It is computationally intractable to guarantee that any previously verified properties still hold after a system, or even a single component, has been produced by a third-party manufacturer. In this paper we explore new approaches for creating a robust system design by investigating highly-structured computational models that simplify verification and analysis. Our approach avoids the need to fully reconstruct the implemented system by incorporating a small verification component that dynamically detects deviations from the design specification at run-time. The first approach encodes information extracted from the original system design algebraically into a verification component. During run-time this component randomly queries the implementation for trace information and verifies that no design-level properties have been violated. If any deviation is detected then a pre-specified fail-safe or notification behavior is triggered. Our second approach utilizes a partitioning methodology to view liveness and safety properties as a distributed decision task and the implementation as a proposed protocol that solves this task. Thus the problem of verifying safety and liveness properties is translated to that of verifying that the implementation solves the associated decision task. We build upon results from distributed systems and algebraic topology to construct a learning mechanism for verifying safety and liveness properties from samples of run-time executions.

  17. Maximizing Statistical Power When Verifying Probabilistic Forecasts of Hydrometeorological Events

    NASA Astrophysics Data System (ADS)

    DeChant, C. M.; Moradkhani, H.

    2014-12-01

    Hydrometeorological events (i.e. floods, droughts, precipitation) are increasingly being forecasted probabilistically, owing to the uncertainties in the underlying causes of the phenomenon. In these forecasts, the probability of the event, over some lead time, is estimated based on some model simulations or predictive indicators. By issuing probabilistic forecasts, agencies may communicate the uncertainty in the event occurring. Assuming that the assigned probability of the event is correct, which is referred to as a reliable forecast, the end user may perform some risk management based on the potential damages resulting from the event. Alternatively, an unreliable forecast may give false impressions of the actual risk, leading to improper decision making when protecting resources from extreme events. Due to this requisite for reliable forecasts to perform effective risk management, this study takes a renewed look at reliability assessment in event forecasts. Illustrative experiments will be presented, showing deficiencies in the commonly available approaches (Brier Score, Reliability Diagram). Overall, it is shown that the conventional reliability assessment techniques do not maximize the ability to distinguish between a reliable and unreliable forecast. In this regard, a theoretical formulation of the probabilistic event forecast verification framework will be presented. From this analysis, hypothesis testing with the Poisson-Binomial distribution is the most exact model available for the verification framework, and therefore maximizes one's ability to distinguish between a reliable and unreliable forecast. Application of this verification system was also examined within a real forecasting case study, highlighting the additional statistical power provided with the use of the Poisson-Binomial distribution.
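    The Poisson-Binomial distribution advocated above is the exact distribution of the number of events when each occasion carries its own forecast probability. Its pmf can be built by repeated convolution, after which the observed event count can be tested against the hypothesis that the forecasts are reliable. A minimal sketch with invented forecast probabilities:

```python
import numpy as np

def poisson_binomial_pmf(probs):
    """Exact distribution of the number of events when event i occurs with
    probability probs[i] (independent trials), built by convolution."""
    pmf = np.array([1.0])
    for p in probs:
        pmf = np.convolve(pmf, [1.0 - p, p])
    return pmf

# Forecast probabilities issued on 8 occasions, and the observed event count.
forecast_p = [0.9, 0.8, 0.7, 0.6, 0.3, 0.2, 0.2, 0.1]
observed = 2

pmf = poisson_binomial_pmf(forecast_p)
# One-sided p-value: probability of seeing this few events or fewer if the
# forecasts were perfectly reliable.
p_low = pmf[:observed + 1].sum()
print("P(K <= %d | reliable) = %.4f" % (observed, p_low))
```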

  18. Fast 3D dosimetric verifications based on an electronic portal imaging device using a GPU calculation engine.

    PubMed

    Zhu, Jinhan; Chen, Lixin; Chen, Along; Luo, Guangwen; Deng, Xiaowu; Liu, Xiaowei

    2015-04-11

    This study used a graphics processing unit (GPU) calculation engine to implement a fast 3D pre-treatment dosimetric verification procedure based on an electronic portal imaging device (EPID). The GPU algorithm includes the deconvolution and convolution method for the fluence-map calculations, the collapsed-cone convolution/superposition (CCCS) algorithm for the 3D dose calculations and the 3D gamma evaluation calculations. The results of the GPU-based CCCS algorithm were compared to those of Monte Carlo simulations. The planned and EPID-based reconstructed dose distributions in overridden-to-water phantoms and the original patients were compared for 6 MV and 10 MV photon beams in intensity-modulated radiation therapy (IMRT) treatment plans based on dose differences and gamma analysis. The total single-field dose computation time was less than 8 s, and the gamma evaluation for a 0.1-cm grid resolution was completed in approximately 1 s. The results of the GPU-based CCCS algorithm exhibited good agreement with those of the Monte Carlo simulations. The gamma analysis indicated good agreement between the planned and reconstructed dose distributions for the treatment plans. For the target volume, the differences in the mean dose were less than 1.8%, and the differences in the maximum dose were less than 2.5%. For the critical organs, minor differences were observed between the reconstructed and planned doses. The GPU calculation engine was used to boost the speed of 3D dose and gamma evaluation calculations, thus offering the possibility of true real-time 3D dosimetric verification.

  19. In-Situ Pumping Test for Multilayer Hydrogeological Site in Taiwan

    NASA Astrophysics Data System (ADS)

    Lin, S.; Tan, Y.; Lien, I.; Hsu, G.; Bao, K.

    2010-12-01

    Pingtung Plain is located in southwestern Taiwan, and rainfall is concentrated from May to October, with an average annual precipitation of 2000 mm. However, the steep topography and fast-flowing streams cause most of the precipitation to become runoff that drains to the ocean in a short time. Because of the shortage of surface water, groundwater is one of the most important water resources. Additionally, the government plans to build artificial lakes in the proximal fan of Pingtung Plain to increase recharge and to supply agricultural and aquaculture use. The locations of the pumping wells, however, affect not only the developable quantity of groundwater but also economic growth. Therefore, MODFLOW-96 was used to simulate the groundwater distribution and to identify the better recharge zones at the regional scale. Based on the model calibration and verification results, the Tuku farm site of the Kaoping Lake study area is recharged mainly from the Laonong Stream and the northeast, and its safe yield is larger than that of the other study zones. Additionally, directional variations in permeability in anisotropic formations have important effects on groundwater velocities and storage. We further applied a modified ANN (artificial neural network) approach, incorporating the Papadopoulos analytical solution [Lin et al., 2010], to estimate the direction and magnitude of the permeability parameters from the pumping test at the Tuku farm site, which improves the accuracy of parameter estimation. According to the drawdown records of six observation wells, the results suggest that the pumping wells should be set up in the northeast and northwest, since the sedimentary formations are more permeable along the major directions toward the northeast and northwest. This information can support groundwater management and supply in the Pingtung Plain.

  20. Verification and Validation of EnergyPlus Phase Change Material Model for Opaque Wall Assemblies

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Tabares-Velasco, P. C.; Christensen, C.; Bianchi, M.

    2012-08-01

    Phase change materials (PCMs) represent a technology that may reduce peak loads and HVAC energy consumption in buildings. A few building energy simulation programs have the capability to simulate PCMs, but their accuracy has not been completely tested. This study shows the procedure used to verify and validate the PCM model in EnergyPlus using an approach similar to that dictated by ASHRAE Standard 140, which consists of analytical verification, comparative testing, and empirical validation. This process was valuable, as two bugs were identified and fixed in the PCM model, and version 7.1 of EnergyPlus will have a validated PCM model. Preliminary results using whole-building energy analysis show that careful analysis should be done when designing PCMs in homes, as their thermal performance depends on several variables such as PCM properties and location in the building envelope.

  1. Dual-mode capability for hardware-in-the-loop

    NASA Astrophysics Data System (ADS)

    Vamivakas, A. N.; Jackson, Ron L.

    2000-07-01

    This paper details a Hardware-in-the-Loop (HIL) facility developed for the evaluation and verification of a missile system with dual-mode capability. The missile has the capability of tracking and intercepting a target using either an RF antenna or an IR sensor. The testing of a dual-mode system presents a significant challenge in the development of the HIL facility: an IR and an RF target environment must be presented simultaneously to the missile under test. These targets, simulated by IR and RF sources, must be presented to the missile under test without interference from each other. The location of each source is critical in the development of the HIL facility. The requirements for building a HIL facility with dual-mode capability and the methodology for testing the dual-mode system are defined within this paper. Methods for the verification and validation of the facility are discussed.

  2. Numerical Weather Predictions Evaluation Using Spatial Verification Methods

    NASA Astrophysics Data System (ADS)

    Tegoulias, I.; Pytharoulis, I.; Kotsopoulos, S.; Kartsios, S.; Bampzelis, D.; Karacostas, T.

    2014-12-01

    During the last years high-resolution numerical weather prediction simulations have been used to examine meteorological events with increased convective activity. Traditional verification methods do not provide the desired level of information to evaluate those high-resolution simulations. To assess those limitations new spatial verification methods have been proposed. In the present study an attempt is made to estimate the ability of the WRF model (WRF-ARW ver. 3.5.1) to reproduce selected days with high convective activity during the year 2010 using those feature-based verification methods. Three model domains, covering Europe, the Mediterranean Sea and northern Africa (d01), the wider area of Greece (d02) and central Greece - Thessaly region (d03) are used at horizontal grid-spacings of 15km, 5km and 1km respectively. By alternating microphysics (Ferrier, WSM6, Goddard), boundary layer (YSU, MYJ) and cumulus convection (Kain-Fritsch, BMJ) schemes, a set of twelve model setups is obtained. The results of those simulations are evaluated against data obtained using a C-Band (5cm) radar located at the centre of the innermost domain. Spatial characteristics are well captured but with a variable time lag between simulation results and radar data. Acknowledgements: This research is co-financed by the European Union (European Regional Development Fund) and Greek national funds, through the action "COOPERATION 2011: Partnerships of Production and Research Institutions in Focused Research and Technology Sectors" (contract number 11SYN_8_1088 - DAPHNE) in the framework of the operational programme "Competitiveness and Entrepreneurship" and Regions in Transition (OPC II, NSRF 2007-2013).

  3. Infrared remote sensing of the vertical and horizontal distribution of clouds

    NASA Technical Reports Server (NTRS)

    Chahine, M. T.; Haskins, R. D.

    1982-01-01

    An algorithm has been developed to derive the horizontal and vertical distribution of clouds from the same set of infrared radiance data used to retrieve atmospheric temperature profiles. The method leads to the determination of the vertical atmospheric temperature structure and the cloud distribution simultaneously, providing information on heat sources and sinks, storage rates and transport phenomena in the atmosphere. Experimental verification of this algorithm was obtained using the 15-micron data measured by the NOAA-VTPR temperature sounder. After correcting for water vapor emission, the results show that the cloud cover derived from 15-micron data is less than that obtained from visible data.

  4. Verification, Validation, and Accreditation Challenges of Distributed Simulation for Space Exploration Technology

    NASA Technical Reports Server (NTRS)

    Thomas, Danny; Hartway, Bobby; Hale, Joe

    2006-01-01

    Throughout its rich history, NASA has invested heavily in sophisticated simulation capabilities. These capabilities reside in NASA facilities across the country - and with partners around the world. NASA's Exploration Systems Mission Directorate (ESMD) has the opportunity to leverage these considerable investments to resolve technical questions relating to its missions. The distributed nature of the assets, both in terms of geography and organization, presents challenges to their combined and coordinated use, but precedents of geographically distributed real-time simulations exist. This paper will show how technological advances in simulation can be employed to address the issues associated with netting NASA simulation assets.

  5. SU-F-T-267: A Clarkson-Based Independent Dose Verification for the Helical Tomotherapy

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Nagata, H; Juntendo University, Hongo, Tokyo; Hongo, H

    2016-06-15

    Purpose: There have been few reports on independent dose verification for Tomotherapy. We evaluated the accuracy and the effectiveness of an independent dose verification system for Tomotherapy. Methods: Simple MU Analysis (SMU, Triangle Product, Ishikawa, Japan) was used as the independent verification system; it implements a Clarkson-based dose calculation algorithm using the CT image dataset. For dose calculation in the SMU, the Tomotherapy machine-specific dosimetric parameters (TMR, Scp, OAR and MLC transmission factor) were registered as the machine beam data. Dose calculation was performed after the Tomotherapy sinogram from the DICOM-RT plan information was converted to MU and MLC location information at more finely segmented control points. The performance of the SMU was assessed by a point dose measurement in non-IMRT and IMRT plans (simple target and mock prostate plans). Subsequently, 30 patients' treatment plans for prostate were compared. Results: From the comparison, dose differences between the SMU and the measurement were within 3% for all cases in non-IMRT plans. In the IMRT plan for the simple target, the differences (average ± 1 SD) were −0.70 ± 1.10% (SMU vs. TPS), −0.40 ± 0.10% (measurement vs. TPS) and −1.20 ± 1.00% (measurement vs. SMU), respectively. For the mock prostate, the differences were −0.40 ± 0.60% (SMU vs. TPS), −0.50 ± 0.90% (measurement vs. TPS) and −0.90 ± 0.60% (measurement vs. SMU), respectively. For the patients' plans, the difference was −0.50 ± 2.10% (SMU vs. TPS). Conclusion: A Clarkson-based independent dose verification for Tomotherapy can be clinically available as a secondary check with a tolerance level similar to that of AAPM Task Group 114. This research is partially supported by the Japan Agency for Medical Research and Development (AMED).
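    The Clarkson method referenced above estimates the scatter dose at a point in an irregular field by sector integration: the field is split into angular sectors, and each sector contributes the scatter-air ratio (SAR) of the circular field matching its edge radius. The sketch below is the textbook version of that integration, not the SMU implementation, and the SAR table values are invented.

```python
import numpy as np

# Hypothetical scatter-air ratio (SAR) table vs. circular-field radius (cm)
# at the calculation depth; real values come from machine beam measurements.
sar_radius = np.array([0.0, 2.0, 4.0, 6.0, 8.0, 10.0])
sar_value  = np.array([0.00, 0.08, 0.14, 0.18, 0.21, 0.23])

def clarkson_sar(edge_radius_fn, n_sectors=36):
    """Clarkson sector integration: average the SAR of the circular field
    that matches the field-edge radius in each angular sector."""
    angles = np.linspace(0.0, 2 * np.pi, n_sectors, endpoint=False)
    radii = np.array([edge_radius_fn(a) for a in angles])
    return float(np.mean(np.interp(radii, sar_radius, sar_value)))

# Example: point at the centre of a 6 cm x 10 cm rectangular field.
def rect_edge_radius(theta, half_x=3.0, half_y=5.0):
    # Distance from the field centre to the rectangle edge along direction theta.
    return min(abs(half_x / np.cos(theta)) if np.cos(theta) else np.inf,
               abs(half_y / np.sin(theta)) if np.sin(theta) else np.inf)

print("sector-averaged SAR: %.3f" % clarkson_sar(rect_edge_radius))
```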

  6. Experimental verification of the Acuros XB and AAA dose calculation adjacent to heterogeneous media for IMRT and RapidArc of nasopharyngeal carcinoma.

    PubMed

    Kan, Monica W K; Leung, Lucullus H T; So, Ronald W K; Yu, Peter K N

    2013-03-01

    To compare the doses calculated by the Acuros XB (AXB) algorithm and the analytical anisotropic algorithm (AAA) with experimentally measured data adjacent to and within heterogeneous media, using intensity-modulated radiation therapy (IMRT) and RapidArc® (RA) volumetric arc therapy plans for nasopharyngeal carcinoma (NPC). Two-dimensional dose distributions immediately adjacent to both air and bone inserts of a rectangular tissue-equivalent phantom irradiated using IMRT and RA plans for NPC cases were measured with GafChromic® EBT3 films. Doses near and within the nasopharyngeal (NP) region of an anthropomorphic phantom containing heterogeneous media were also measured with thermoluminescent dosimeters (TLD) and EBT3 films. The measured data were then compared with the data calculated by AAA and AXB. For AXB, dose calculations were performed using both dose-to-medium (AXB_Dm) and dose-to-water (AXB_Dw) options. Furthermore, target dose differences between AAA and AXB were analyzed for the corresponding real patients. The comparison of real patient plans was performed by stratifying the targets into components of different densities, including tissue, bone, and air. For the verification of planar dose distributions adjacent to air and bone using the rectangular phantom, the percentages of pixels that passed the gamma analysis with the ±3%/3 mm criteria were 98.7%, 99.5%, and 97.7% on the axial plane for AAA, AXB_Dm, and AXB_Dw, respectively, averaged over all IMRT and RA plans, while they were 97.6%, 98.2%, and 97.7%, respectively, on the coronal plane. For the verification of planar dose distributions within the NP region of the anthropomorphic phantom, the percentages of pixels that passed the gamma analysis with the ±3%/3 mm criteria were 95.1%, 91.3%, and 99.0% for AAA, AXB_Dm, and AXB_Dw, respectively, averaged over all IMRT and RA plans. Within the NP region where air and bone were present, the film measurements represented the dose close to unit-density water in a heterogeneous medium and produced the best agreement with AXB_Dw. For the verification of point doses within the target using TLD in the anthropomorphic phantom, the absolute percentage deviations between the calculated and measured data, averaged over all IMRT and RA plans, were 1.8%, 1.7%, and 1.8% for AAA, AXB_Dm, and AXB_Dw, respectively. From all the verification results, no significant difference was found between the IMRT and RA plans. The target dose analysis of the real patient plans showed that the discrepancies in mean doses to the PTV component in tissue among the three dose calculation options were within 2%, but up to about 4% in the bone content, with AXB_Dm giving the lowest values and AXB_Dw giving the highest values. In general, the verification measurements demonstrated that both algorithms produced acceptable accuracy when compared to the measured data. GafChromic® film results indicated that AXB produced slightly better accuracy compared to AAA for dose calculation adjacent to and within the heterogeneous media. Users should be aware of the differences in calculated target doses between the AXB_Dm and AXB_Dw options, especially in bone, for IMRT and RA in NPC cases.

  7. High-Assurance Spiral

    DTIC Science & Technology

    2017-11-01

    Approved for public release; distribution unlimited (PA# 88ABW-2017-5388, cleared 30 OCT 2017). Abstract: Cyber-physical systems ... physical processes that interact in intricate manners. This makes verification of the software complex and unwieldy. In this report, an approach towards ... resulting implementations. Subject terms: cyber-physical systems, formal guarantees, code generation.

  8. CS-10 Verification Survey at Former McClellan AFB, Sacramento, CA

    DTIC Science & Technology

    2013-09-26


  9. DownscaleConcept 2.3 User Manual. Downscaled, Spatially Distributed Soil Moisture Calculator

    DTIC Science & Technology

    2011-01-01

    ... be first presented with the dataset results of your query. From this page, check the box next to the ASTER GDEM dataset and press the "List ..." ... information for verification. No charge will be associated with GDEM data archives. 14. Select "Submit Order Now!" to process your order. 15. Wait for ...

  10. Fuel and fire behavior prediction in big sagebrush

    Treesearch

    James K. Brown

    1982-01-01

    Relationships between height of big sagebrush and crown area, fuel loading, bulk density, size distribution of foliage and stemwood, and fraction dead stemwood are presented. Based upon these relationships, modeled rate-of-fire spread and fireline intensity are shown for sagebrush ranging in height from 20 to 120 cm and in coverage from 10 to 40 percent. Verification...

  11. A New Integrated Weighted Model in SNOW-V10: Verification of Categorical Variables

    NASA Astrophysics Data System (ADS)

    Huang, Laura X.; Isaac, George A.; Sheng, Grant

    2014-01-01

    This paper presents the verification results for nowcasts of seven categorical variables from an integrated weighted model (INTW) and the underlying numerical weather prediction (NWP) models. Nowcasting, or short range forecasting (0-6 h), over complex terrain with sufficient accuracy is highly desirable but a very challenging task. A weighting, evaluation, bias correction and integration system (WEBIS) for generating nowcasts by integrating NWP forecasts and high frequency observations was used during the Vancouver 2010 Olympic and Paralympic Winter Games as part of the Science of Nowcasting Olympic Weather for Vancouver 2010 (SNOW-V10) project. Forecast data from Canadian high-resolution deterministic NWP system with three nested grids (at 15-, 2.5- and 1-km horizontal grid-spacing) were selected as background gridded data for generating the integrated nowcasts. Seven forecast variables of temperature, relative humidity, wind speed, wind gust, visibility, ceiling and precipitation rate are treated as categorical variables for verifying the integrated weighted forecasts. By analyzing the verification of forecasts from INTW and the NWP models among 15 sites, the integrated weighted model was found to produce more accurate forecasts for the 7 selected forecast variables, regardless of location. This is based on the multi-categorical Heidke skill scores for the test period 12 February to 21 March 2010.
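    The multi-categorical Heidke skill score used for this verification measures forecast accuracy relative to random chance, computed from an n × n contingency table of forecast versus observed categories. A minimal sketch with an invented 3-category table:

```python
import numpy as np

def heidke_skill_score(table):
    """Multi-category Heidke skill score from an n x n contingency table
    (rows: forecast category, columns: observed category)."""
    t = np.asarray(table, dtype=float)
    n = t.sum()
    pc = np.trace(t) / n                                   # proportion correct
    # Expected proportion correct by chance, from the marginals.
    pe = float((t.sum(axis=1) * t.sum(axis=0)).sum()) / n ** 2
    return (pc - pe) / (1.0 - pe)

# Hypothetical 3-category verification (e.g., visibility classes).
table = [[30,  5,  2],
         [ 6, 20,  4],
         [ 1,  3, 12]]
print("HSS = %.3f" % heidke_skill_score(table))
```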

  12. INTERIM REPORT--INDEPENDENT VERIFICATION SURVEY OF SECTION 3, SURVEY UNITS 1, 4 AND 5 EXCAVATED SURFACES, WHITTAKER CORPORATION, REYNOLDS INDUSTRIAL PARK, TRANSFER, PENNSYLVANIA DCN: 5002-SR-04-0"

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    ADAMS, WADE C

    At the Pennsylvania Department of Environmental Protection's request, ORAU's IEAV program conducted verification surveys on the excavated surfaces of Section 3, SUs 1, 4, and 5 at the Whittaker site on March 13 and 14, 2013. The survey activities included visual inspections, gamma radiation surface scans, gamma activity measurements, and soil sampling activities. Verification activities also included the review and assessment of the licensee's project documentation and methodologies. Surface scans identified four areas of elevated direct gamma radiation distinguishable from background: one area within SUs 1 and 4 and two areas within SU5. One area within SU5 was remediated by removing a golf-ball-sized piece of slag while ORAU staff was onsite. With the exception of the golf-ball-sized piece of slag within SU5, a review of the ESL Section 3 EXS data packages for SUs 1, 4, and 5 indicated that these locations of elevated gamma radiation were also identified by the ESL gamma scans and that ESL personnel performed additional investigations and soil sampling within these areas. The investigative results indicated that the areas met the release criteria.

  13. Technical review of SRT-CMA-930058 revalidation studies of Mark 16 experiments: J70

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Reed, R.L.

    1993-10-25

    This study is a reperformance of a set of MGBS-TGAL criticality safety code validation calculations previously reported by Clark. The reperformance was needed because the records of the previous calculations could not be located in current APG files and records. As noted by the author, preliminary attempts to reproduce the Clark results by direct modeling in MGBS and TGAL were unsuccessful. Consultation with Clark indicated that the MGBS-TGAL (EXPT) option within the KOKO system should be used to set up the MGBS and TGAL input data records. The results of the study indicate that the technique used by Clark has been established and that the technique is now documented for future use. File records of the calculations have also been established in APG files. The review was performed per QAP 11-14 of 1Q34. Since the reviewer was involved in developing the procedural technique used for this study, this review cannot be considered a fully independent review, but it should be considered a verification that the document contains adequate information to allow a new user to perform similar calculations, a verification of the procedure by performing several calculations independently with results identical to those reported, and a verification of the readability of the report.

  14. Costs and effects of two public sector delivery channels for long-lasting insecticidal nets in Uganda

    PubMed Central

    2010-01-01

    Background In Uganda, long-lasting insecticidal nets (LLIN) have been predominantly delivered through two public sector channels: targeted campaigns or routine antenatal care (ANC) services. Their combination in a mixed-model strategy is being advocated to quickly increase LLIN coverage and maintain it over time, but there is little evidence on the efficiency of each system. This study evaluated the two delivery channels regarding LLIN retention and use, and estimated the associated costs, to contribute towards the evidence-base on LLIN delivery channels in Uganda. Methods Household surveys were conducted 5-7 months after LLIN distribution, combining questionnaires with visual verification of LLIN presence. Focus groups and interviews were conducted to further investigate determinants of LLIN retention and use. Campaign distribution was evaluated in Jinja and Adjumani while ANC distribution was evaluated only in the latter district. Costs were calculated from the provider perspective through retrospective analysis of expenditure data, and effects were estimated as cost per LLIN delivered and cost per treated-net-year (TNY). These effects were calculated for the total number of LLINs delivered and for those retained and used. Results After 5-7 months, over 90% of LLINs were still owned by recipients, and between 74% (Jinja) and 99% (ANC Adjumani) were being used. Costing results showed that delivery was cheapest for the campaign in Jinja and highest for the ANC channel, with economic delivery cost per net retained and used of USD 1.10 and USD 2.31, respectively. Financial delivery costs for the two channels were similar in the same location, USD 1.04 for campaign or USD 1.07 for ANC delivery in Adjumani, but differed between locations (USD 0.67 for campaign delivery in Jinja). Economic cost for ANC distribution were considerably higher (USD 2.27) compared to campaign costs (USD 1.23) in Adjumani. Conclusions Targeted campaigns and routine ANC services can both achieve high LLIN retention and use among the target population. The comparatively higher economic cost of delivery through ANC facilities was at least partially due to the relatively short time this system had been in existence. Further studies comparing the cost of well-established ANC delivery with LLIN campaigns and other delivery channels are thus encouraged. PMID:20406448

  15. Costs and effects of two public sector delivery channels for long-lasting insecticidal nets in Uganda.

    PubMed

    Kolaczinski, Jan H; Kolaczinski, Kate; Kyabayinze, Daniel; Strachan, Daniel; Temperley, Matilda; Wijayanandana, Nayantara; Kilian, Albert

    2010-04-20

    In Uganda, long-lasting insecticidal nets (LLIN) have been predominantly delivered through two public sector channels: targeted campaigns or routine antenatal care (ANC) services. Their combination in a mixed-model strategy is being advocated to quickly increase LLIN coverage and maintain it over time, but there is little evidence on the efficiency of each system. This study evaluated the two delivery channels regarding LLIN retention and use, and estimated the associated costs, to contribute towards the evidence-base on LLIN delivery channels in Uganda. Household surveys were conducted 5-7 months after LLIN distribution, combining questionnaires with visual verification of LLIN presence. Focus groups and interviews were conducted to further investigate determinants of LLIN retention and use. Campaign distribution was evaluated in Jinja and Adjumani while ANC distribution was evaluated only in the latter district. Costs were calculated from the provider perspective through retrospective analysis of expenditure data, and effects were estimated as cost per LLIN delivered and cost per treated-net-year (TNY). These effects were calculated for the total number of LLINs delivered and for those retained and used. After 5-7 months, over 90% of LLINs were still owned by recipients, and between 74% (Jinja) and 99% (ANC Adjumani) were being used. Costing results showed that delivery was cheapest for the campaign in Jinja and highest for the ANC channel, with economic delivery cost per net retained and used of USD 1.10 and USD 2.31, respectively. Financial delivery costs for the two channels were similar in the same location, USD 1.04 for campaign or USD 1.07 for ANC delivery in Adjumani, but differed between locations (USD 0.67 for campaign delivery in Jinja). Economic cost for ANC distribution were considerably higher (USD 2.27) compared to campaign costs (USD 1.23) in Adjumani. Targeted campaigns and routine ANC services can both achieve high LLIN retention and use among the target population. The comparatively higher economic cost of delivery through ANC facilities was at least partially due to the relatively short time this system had been in existence. Further studies comparing the cost of well-established ANC delivery with LLIN campaigns and other delivery channels are thus encouraged.
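    The cost metrics in these two records reduce to simple arithmetic: the cost per net retained and used divides the total delivery cost by the number of nets still owned and in use at follow-up. The sketch below is illustrative only; the retention and use rates echo the Jinja campaign figures above, but the total cost and net count are invented.

```python
# Worked example of the cost metrics used above (illustrative numbers only).
def delivery_costs(total_cost_usd, nets_delivered, retention, use_rate):
    cost_per_net = total_cost_usd / nets_delivered
    cost_per_net_used = total_cost_usd / (nets_delivered * retention * use_rate)
    return cost_per_net, cost_per_net_used

per_net, per_used = delivery_costs(67000.0, 100000, retention=0.90, use_rate=0.74)
print("USD %.2f per LLIN delivered; USD %.2f per LLIN retained and used"
      % (per_net, per_used))
```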

  16. Integration of the instrument control electronics for the ESPRESSO spectrograph at ESO-VLT

    NASA Astrophysics Data System (ADS)

    Baldini, V.; Calderone, G.; Cirami, R.; Coretti, I.; Cristiani, S.; Di Marcantonio, P.; Mégevand, D.; Riva, M.; Santin, P.

    2016-07-01

    ESPRESSO, the Echelle SPectrograph for Rocky Exoplanets and Stable Spectroscopic Observations at the ESO Very Large Telescope site, is now in its integration phase. The large number of functions of this complex instrument are fully controlled by a Beckhoff PLC-based control electronics architecture. Four small cabinets and one large cabinet host the main electronic parts that control all the sensors, motorized stages and other analogue and digital functions of ESPRESSO. The Instrument Control Electronics (ICE) is built following the latest ESO standards and requirements. Two main PLC CPUs are used and are programmed through the TwinCAT Beckhoff dedicated software. The assembly, integration and verification phase of ESPRESSO is quite challenging due to its distributed nature and the different geographical locations of the consortium partners. After the preliminary assembly and test of the electronic components at the Astronomical Observatory of Trieste and the test of some electronics and software parts at ESO (Garching), the complete system for the control of the four Front End Unit (FEU) arms of ESPRESSO was fully assembled and tested in Merate (Italy) at the beginning of 2016. After these first tests, the system will be located at the Geneva Observatory (Switzerland) until the Preliminary Acceptance Europe (PAE) and finally shipped to Chile for commissioning. This paper describes the integration strategy of the ICE work package of ESPRESSO and the hardware and software tests that have been performed, with an overall view of the experience gained during these project phases.

  17. Transcription Start Site Evolution in Drosophila

    PubMed Central

    Main, Bradley J.; Smith, Andrew D.; Jang, Hyosik; Nuzhdin, Sergey V.

    2013-01-01

    Transcription start site (TSS) evolution remains largely undescribed in Drosophila, likely due to limited annotations in non-melanogaster species. In this study, we introduce a concise new method that selectively sequences from the 5′-end of mRNA and used it to identify TSSs in four Drosophila species: Drosophila melanogaster, D. simulans, D. sechellia, and D. pseudoobscura. For verification, we compared our results in D. melanogaster with known annotations, published 5′-rapid amplification of cDNA ends data, and RNAseq from the same mRNA pool. Then, we paired each of 2,849 D. melanogaster TSSs with its closest equivalent TSS in each species (likely to be its true ortholog) using the available multiple sequence alignments. Most of the D. melanogaster TSSs were successfully paired with an ortholog in each species (83%, 86%, and 55% for D. simulans, D. sechellia, and D. pseudoobscura, respectively). On the basis of the number and distribution of reads mapped at each TSS, we also estimated promoter-specific expression (PSE) and TSS peak shape, respectively. Among paired TSS orthologs, the location and promoter activity were largely conserved. TSS location appears important, as PSE and TSS peak shape were more frequently divergent among TSSs that had moved. Unpaired TSSs were surprisingly common in D. pseudoobscura. An increased mutation rate upstream of TSSs might explain this pattern. We found an enrichment of ribosomal protein genes among diverged TSSs, suggesting that TSS evolution is not uniform across the genome. PMID:23649539
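    The pairing step described above — matching each D. melanogaster TSS to the nearest TSS in another species — can be sketched as a closest-coordinate search, assuming positions have already been projected into a common frame via the multiple sequence alignments (which is where the real work lies). The max_dist cutoff and all coordinates are invented.

```python
import numpy as np

def pair_closest(tss_mel, tss_other, max_dist=100):
    """Pair each D. melanogaster TSS (alignment-projected coordinate) with
    the closest TSS in the other species, if within max_dist bp."""
    other = np.sort(np.asarray(tss_other))
    pairs = {}
    for t in tss_mel:
        i = np.searchsorted(other, t)
        cands = other[max(i - 1, 0):i + 1]   # nearest neighbours on each side
        if cands.size:
            best = int(cands[np.argmin(np.abs(cands - t))])
            if abs(best - t) <= max_dist:
                pairs[t] = best
    return pairs

# Toy example: the third TSS has no ortholog within 100 bp and stays unpaired.
print(pair_closest([100, 550, 2000], [120, 560, 3000]))
```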

  18. Study of a Single-Power Two-Circuit ESR Process with Current-Carrying Mold: Mathematical Simulation of the Process and Experimental Verification

    NASA Astrophysics Data System (ADS)

    Dong, Yanwu; Hou, Zhiwen; Jiang, Zhouhua; Cao, Haibo; Feng, Qianlong; Cao, Yulong

    2018-02-01

    A novel single-power two-circuit ESR process (ESR-STCCM) with current-carrying mold has been investigated via numerical simulation and experimental research in this paper. A 2D quasi-steady-state mathematical model is developed to describe ESR-STCCM. The electromagnetic field, flow field, slag pool temperature distribution, and the shape of a molten steel pool in ESR-STCCM have been investigated by FLUENT software as well as user-defined functions (UDF). The results indicate that ESR-STCCM is different from the conventional ESR process. The maximum electromagnetic force, current density, Joule heat, and slag pool flow velocity are located in the lower part of the conductor in the ESR-STCCM process. The direction of the maximum electromagnetic force inclines upward. There are two distinct vortices in the slag pool. The larger swirl rotates counterclockwise near the conductor, with a value of 0.0263 m s-1 due to the interaction of the electromagnetic force and gravity. The maximum temperature of the slag pool is 2070 K (1797 °C) and is located in the center of the swirl with a filling ratio of 0.6 and a 20 mm electrode immersion depth. The depth of a molten steel pool is shallower, which is conducive to improving solidification quality. In addition, the filling ratio of 0.6 is conducive to controlling steel solidification quality. Some experiments have been done, and the numerical model is confirmed by experimental results.

  19. TU-FG-BRB-09: Thermoacoustic Range Verification with Perfect Co-Registered Overlay of Bragg Peak onto Ultrasound Image

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Patch, S; Kireeff Covo, M; Jackson, A

    Purpose: The potential of particle therapy has not yet been fully realized due to inaccuracies in range verification. The purpose of this work was to correlate the Bragg peak location with target structure by overlaying a thermoacoustic localization of the Bragg peak onto an ultrasound image. Methods: Pulsed delivery of 50 MeV protons was accomplished by a fast chopper installed between the ion source and the inflector of the 88″ cyclotron at Lawrence Berkeley National Lab. 2 Gy were delivered in 2 µs by a beam with a peak current of 2 µA. Thermoacoustic emissions were detected by a cardiac array and a Verasonics V1 ultrasound system, which also generated a grayscale ultrasound image. 1024 thermoacoustic pulses were averaged before filtering; one-way beamforming then focused the signal onto the Bragg peak location with perfect co-registration to the ultrasound images. Data were collected in a room-temperature water bath and in a gelatin phantom with a cavity designed to mimic the intestine, in which gas pockets can displace the Bragg peak. Experiments were performed with the cavity both empty and filled with olive oil. Results: In the water bath, overlays of the Bragg peak agreed with Monte Carlo simulations to within 800±170 µm. Agreement within 1.3 ± 0.2 mm was achieved in the gelatin phantom, although relative stopping powers were estimated only to first order from CT scans. Protoacoustic signals were detected after travel from the Bragg peak through 29 mm and 65 mm of phantom material when the cavity was empty and full of olive oil, respectively. Conclusion: Protoacoustic range verification is feasible with a commercial clinical ultrasound array, but at doses exceeding the clinical realm. Further optimization of both the transducer array and the injection line chopper is required to enable range verification within a 2 Gy dose limit, which would enable online adaptive treatment. This work was supported in part by a UWM Intramural Instrumentation Grant and by the Director, Office of Science, Office of Nuclear Physics, of the U.S. Department of Energy under Contract No. DE-AC02-05CH11231. YMQ was supported by a UWM-OUR summer fellowship.
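
    The core of the focusing step is one-way delay-and-sum beamforming: each pixel is assigned the sum of the element traces sampled at the single (source-to-element) time of flight. A minimal sketch under simplifying assumptions (uniform sound speed, point-like elements; the array geometry, sampling rate, and data below are invented for illustration):

        import numpy as np

        def one_way_das(rf, elem_xy, fs, c, grid_xy):
            """Delay-and-sum focusing of thermoacoustic RF data.
            rf: (n_elem, n_samples) averaged, filtered traces with t=0 at the proton pulse.
            elem_xy: (n_elem, 2) element positions [m]; grid_xy: (n_pix, 2) pixels [m].
            One-way delays only: the acoustic wave travels source -> element."""
            n_elem, n_samp = rf.shape
            img = np.zeros(len(grid_xy))
            for k, p in enumerate(grid_xy):
                d = np.linalg.norm(elem_xy - p, axis=1)   # source-to-element path [m]
                idx = np.round(d / c * fs).astype(int)    # one-way delay in samples
                valid = idx < n_samp
                img[k] = rf[np.arange(n_elem)[valid], idx[valid]].sum()
            return img

        # Toy usage: 64-element linear array, 20 MHz sampling, water (c ~ 1480 m/s).
        elems = np.c_[np.linspace(-0.019, 0.019, 64), np.zeros(64)]
        rf = np.random.randn(64, 2048)                    # placeholder for averaged traces
        pixels = np.c_[np.zeros(41), np.linspace(0.01, 0.05, 41)]
        profile = one_way_das(rf, elems, 20e6, 1480.0, pixels)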

  20. Smart intimation and location of faults in distribution system

    NASA Astrophysics Data System (ADS)

    Hari Krishna, K.; Srinivasa Rao, B.

    2018-04-01

    Locating faults in the distribution system is one of the most complicated problems we face today. Identifying the location and severity of a fault within a short time is required to provide a continuous power supply, but fault identification and information transfer to the operator are the biggest challenges in the distribution network. This paper proposes a fault location method for the distribution system based on an Arduino Nano and a GSM module with a flame sensor. The main idea is to locate a fault in the distribution transformer by sensing the arc coming out of the fuse element. Well-operated transmission and distribution systems play a key role in an uninterrupted power supply, so whenever a fault occurs in the distribution system, the time taken to locate and eliminate it has to be reduced. The proposed design was realized with a flame sensor and a GSM module. Under a faulty condition, the system automatically sends an alert message to the operator in the distribution system, reporting the abnormal condition near the transformer, the site code, and its exact location for possible power restoration.
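
    The alert logic is simple enough to sketch. The fragment below mimics the flow in Python rather than on the paper's Arduino Nano: poll the flame sensor and, on detection, send an SMS through a GSM modem with the standard AT commands (AT+CMGF, AT+CMGS). The serial port, phone number, and site code are hypothetical placeholders.

        import time
        import serial  # pyserial

        PHONE = "+10000000000"    # hypothetical operator number
        SITE = "DT-042/feeder-3"  # hypothetical transformer site code

        def send_sms(port, text):
            """Send one SMS via a GSM modem using standard AT commands."""
            port.write(b"AT+CMGF=1\r")           # text mode
            time.sleep(0.5)
            port.write(f'AT+CMGS="{PHONE}"\r'.encode())
            time.sleep(0.5)
            port.write(text.encode() + b"\x1a")  # Ctrl-Z terminates the message
            time.sleep(3)

        def flame_detected():
            # Placeholder: on the real device this reads the flame sensor input.
            return False

        gsm = serial.Serial("/dev/ttyUSB0", 9600, timeout=1)
        while True:
            if flame_detected():
                send_sms(gsm, f"FAULT: arc at fuse, transformer {SITE}")
                time.sleep(60)  # avoid flooding the operator with repeats
            time.sleep(0.2)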

  1. A point-by-point multi-scale surface temperature reconstruction method and tests by pseudo proxy experiments

    NASA Astrophysics Data System (ADS)

    Chen, X.

    2016-12-01

    This study presents a multi-scale approach combining the Mode Decomposition and Variance Matching (MDVM) method with the basic process of the Point-by-Point Regression (PPR) method. Unlike the widely applied PPR method, the scanning radius for each grid box was recalculated to account for topography (i.e., mean altitude and its fluctuations), so that appropriate proxy records were selected as candidates for reconstruction. This multi-scale methodology provides not only the reconstructed gridded temperature but also the corresponding uncertainties at four typical timescales; a further advantage is that the spatial distribution of the uncertainty at different scales can be quantified. To demonstrate the necessity of scale separation in calibration, we performed two sets of pseudo proxy experiments (PPEs) with proxy record locations over eastern Asia, based on different ensembles of climate model simulations. One consists of seven simulations by five models (BCC-CSM1-1, CSIRO-MK3L-1-2, HadCM3, MPI-ESM-P, and GISS-E2-R) from the "past1000" experiment of the Coupled Model Intercomparison Project Phase 5; the other is based on the simulations of the Community Earth System Model Last Millennium Ensemble (CESM-LME). The pseudo-proxy networks were obtained by adding white noise, with signal-to-noise ratios (SNR) increasing from 0.1 to 1.0, to the simulated true state, with locations mainly following the PAGES 2k network in Asia. In total, 400 years (1601-2000) of simulation were used for calibration and 600 years (1001-1600) for verification. The reconstructions were evaluated by three metrics: 1) root mean squared error (RMSE), 2) correlation, and 3) the reduction of error (RE) score. The PPE verification results show that, in comparison with an ordinary linear calibration method (variance matching), the RMSE and RE score of PPR-MDVM are improved, especially in areas with sparse proxy records. Notably, in some periods with large volcanic activity, the RMSE of MDVM becomes larger than that of VM for higher-SNR cases; volcanic eruptions might blur the intrinsic multi-scale variability of the climate system, in which case the MDVM method shows less advantage.
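
    The three verification metrics are standard and compact enough to state in code. A minimal sketch with synthetic series (RE is computed against the calibration-period mean, the usual convention):

        import numpy as np

        def verification_scores(truth, recon, calib_mean):
            """RMSE, Pearson correlation, and reduction of error (RE).
            RE = 1 - SSE(recon) / SSE(calibration-mean climatology);
            RE > 0 means the reconstruction beats the climatology."""
            err = recon - truth
            rmse = np.sqrt(np.mean(err ** 2))
            corr = np.corrcoef(truth, recon)[0, 1]
            re = 1.0 - np.sum(err ** 2) / np.sum((truth - calib_mean) ** 2)
            return rmse, corr, re

        rng = np.random.default_rng(0)
        truth = rng.standard_normal(600)             # stand-in for the 1001-1600 "true" values
        recon = truth + 0.5 * rng.standard_normal(600)
        print(verification_scores(truth, recon, calib_mean=0.0))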

  2. Applying monitoring, verification, and accounting techniques to a real-world, enhanced oil recovery operational CO2 leak

    USGS Publications Warehouse

    Wimmer, B.T.; Krapac, I.G.; Locke, R.; Iranmanesh, A.

    2011-01-01

    The use of carbon dioxide (CO2) for enhanced oil recovery (EOR) is being tested in oil fields of the Illinois Basin, USA. While this technology has shown promise for improving oil production, it has raised some issues about the safety of CO2 injection and storage. The Midwest Geological Sequestration Consortium (MGSC) organized a Monitoring, Verification, and Accounting (MVA) team to develop and deploy monitoring programs at three EOR sites in Illinois, Indiana, and Kentucky, USA. MVA goals include establishing baseline conditions to evaluate potential impacts from CO2 injection, demonstrating that project activities are protective of human health and the environment, and providing an accurate accounting of stored CO2. This paper focuses on the use of MVA techniques in monitoring a small CO2 leak from a supply line at an EOR facility under real-world conditions. The ability of shallow monitoring techniques to detect and quantify a CO2 leak under real-world conditions has been largely unproven. In July 2009, a leak in the pipe supplying pressurized CO2 to an injection well was observed at an MGSC EOR site located in west-central Kentucky. Carbon dioxide was escaping from the supply pipe located approximately 1 m underground. The leak was discovered visually by site personnel and injection was halted immediately. At its largest extent, the hole created by the leak was approximately 1.9 m long by 1.7 m wide and 0.7 m deep. This circumstance provided an excellent opportunity to evaluate the performance of several monitoring techniques, including soil CO2 flux measurements, portable infrared gas analysis, thermal infrared imagery, and aerial hyperspectral imagery. Valuable experience was gained during this effort. Lessons learned included: 1) hyperspectral imagery was not effective in detecting this relatively small, short-term CO2 leak; 2) even though injection was halted, the leak remained dynamic and presented a safety concern during monitoring activities; and 3) the atmospheric and soil monitoring techniques used were relatively cost-effective, easily and rapidly deployable, and required minimal manpower to set up and maintain for short-term assessments. However, characterizing the CO2 distribution near the land surface resulting from a dynamic leak with widely variable concentrations and fluxes was challenging. © 2011 Published by Elsevier Ltd.

  3. Verification of Space Station Secondary Power System Stability Using Design of Experiment

    NASA Technical Reports Server (NTRS)

    Karimi, Kamiar J.; Booker, Andrew J.; Mong, Alvin C.; Manners, Bruce

    1998-01-01

    This paper describes analytical methods used in the verification of large DC power systems, with application to the International Space Station (ISS). Large DC power systems contain many switching power converters with negative-resistance characteristics. The ISS power system presents numerous challenges with respect to system stability, such as complex sources and undefined loads. The Space Station program has developed impedance specifications for sources and loads. The overall approach to system stability consists of specific hardware requirements coupled with extensive system analysis and testing. Testing of large, complex distributed power systems is not practical due to the size and complexity of the system, so computer modeling has been used extensively to develop hardware specifications as well as to identify system configurations for lab testing. The statistical method of Design of Experiments (DoE) is used as an analysis tool for verification of these large systems. DoE reduces the number of computer runs necessary to analyze the performance of a complex power system consisting of hundreds of DC/DC converters. DoE also provides valuable information about the effect of changes in system parameters on the performance of the system, about various operating scenarios, and about the identification of those scenarios with potential for instability. In this paper we describe how we have used computer modeling to analyze a large DC power system. A brief description of DoE is given. Examples applying DoE to the analysis and verification of the ISS power system are provided.
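
    The run-saving idea behind DoE can be illustrated with a two-level fractional factorial: instead of exhausting all 2^k corner cases of converter parameters, a half fraction is generated from a defining relation and the main effects are estimated from those runs alone. A toy sketch (the factor names and response function are invented and have no relation to the actual ISS models):

        import itertools
        import numpy as np

        factors = ["source_impedance", "load_power", "filter_C", "cable_length"]

        # Full factorial: 2^4 = 16 runs at coded levels -1/+1.
        full = np.array(list(itertools.product([-1, 1], repeat=4)))

        # Half fraction 2^(4-1) via the defining relation I = ABCD:
        # keep runs whose level product is +1, giving 8 runs instead of 16.
        half = full[full.prod(axis=1) == 1]

        def response(x):
            # Invented stand-in for a stability margin from a circuit simulation.
            return 10 - 2 * x[0] + 1.5 * x[1] - 0.5 * x[0] * x[1] + 0.3 * x[2]

        y = np.array([response(x) for x in half])
        # Main effect of each factor = mean(y at +1) - mean(y at -1).
        effects = {f: y[half[:, i] == 1].mean() - y[half[:, i] == -1].mean()
                   for i, f in enumerate(factors)}
        print(effects)  # large |effect| flags parameters that drive stability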

  4. Probabilistic Sizing and Verification of Space Ceramic Structures

    NASA Astrophysics Data System (ADS)

    Denaux, David; Ballhause, Dirk; Logut, Daniel; Lucarelli, Stefano; Coe, Graham; Laine, Benoit

    2012-07-01

    Sizing of ceramic parts is best optimised using a probabilistic approach that takes into account the pre-existing flaw distribution in the ceramic part to compute a probability of failure depending on the applied load, instead of a maximum allowable load as for a metallic part. This requires extensive knowledge of the material itself but also accurate control of the manufacturing process. In the end, risk-reduction approaches such as proof testing may be used to lower the final probability of failure of the part. Sizing and verification of ceramic space structures have been performed by Astrium for more than 15 years, both with Zerodur and SiC: the Silex telescope structure, the Seviri primary mirror, the Herschel telescope, the Formosat-2 instrument, and other ceramic structures flying today. Throughout this period, Astrium has investigated and developed experimental ceramic analysis tools based on the Weibull probabilistic approach. In the scope of the ESA/ESTEC study "Mechanical Design and Verification Methodologies for Ceramic Structures", to be concluded at the beginning of 2012, existing theories, the technical state-of-the-art from international experts, and Astrium's experience with probabilistic analysis tools have been synthesized into a comprehensive sizing and verification method for ceramics. Both classical deterministic and more optimised probabilistic methods are available, depending on the criticality of the item and on optimisation needs. The methodology, based on proven theory, has been successfully applied to demonstration cases and has shown its practical feasibility.
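
    The Weibull weakest-link machinery behind such sizing is compact. A minimal sketch (illustrative parameters only; uniaxial stress, unit reference volume) of the failure probability and of how a proof test truncates the flaw population:

        import numpy as np

        def p_fail(sigma, sigma0, m):
            """Two-parameter Weibull probability of failure at stress sigma [MPa]."""
            return 1.0 - np.exp(-(sigma / sigma0) ** m)

        def p_fail_after_proof(sigma, sigma_proof, sigma0, m):
            """Conditional failure probability at service stress sigma for parts
            that survived a proof test at sigma_proof (valid for sigma >= sigma_proof)."""
            surv_service = np.exp(-(sigma / sigma0) ** m)
            surv_proof = np.exp(-(sigma_proof / sigma0) ** m)
            return 1.0 - surv_service / surv_proof

        # Illustrative SiC-like numbers: m = 10, characteristic strength 300 MPa.
        print(p_fail(150.0, 300.0, 10))                     # ~1e-3 at service stress
        print(p_fail(250.0, 300.0, 10))                     # ~0.149 unconditioned
        print(p_fail_after_proof(250.0, 200.0, 300.0, 10))  # ~0.134 after a 200 MPa proof test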

  5. Verification of intensity modulated radiation therapy beams using a tissue equivalent plastic scintillator dosimetry system

    NASA Astrophysics Data System (ADS)

    Petric, Martin Peter

    This thesis describes the development and implementation of a novel method for the dosimetric verification of intensity modulated radiation therapy (IMRT) fields with several advantages over current techniques. Through the use of a tissue equivalent plastic scintillator sheet viewed by a charge-coupled device (CCD) camera, this method provides a truly tissue equivalent dosimetry system capable of efficiently and accurately performing field-by-field verification of IMRT plans. This work was motivated by an initial study comparing two IMRT treatment planning systems. The clinical functionality of BrainLAB's BrainSCAN and Varian's Helios IMRT treatment planning systems was compared in terms of implementation and commissioning, dose optimization, and plan assessment. Implementation and commissioning revealed differences in the beam data required to characterize the beam prior to use, with the BrainSCAN system requiring higher-resolution data than Helios. This difference was found to impact the ability of the systems to accurately calculate dose for highly modulated fields, with BrainSCAN being more successful than Helios. The dose optimization and plan assessment comparisons revealed that while both systems use considerably different optimization algorithms and user-control interfaces, they are both capable of producing substantially equivalent dose plans. The extensive use of dosimetric verification techniques in the IMRT treatment planning comparison study motivated the development and implementation of a novel IMRT dosimetric verification system. The system consists of a water-filled phantom with a tissue equivalent plastic scintillator sheet built into the top surface. Scintillation light is reflected by a plastic mirror within the phantom towards a viewing window, where it is captured using a CCD camera. Optical photon spread is removed using a micro-louvre optical collimator and by deconvolving a glare kernel from the raw images. Characterization of this new dosimetric verification system indicates excellent dose response and spatial linearity, high spatial resolution, and good signal uniformity and reproducibility. Dosimetric results from square fields, dynamic wedged fields, and a 7-field head and neck IMRT treatment plan indicate good agreement with film dosimetry distributions. Efficiency analysis of the system reveals a 50% reduction in time requirements for field-by-field verification of a 7-field IMRT treatment plan compared to film dosimetry.
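
    The glare-removal step described above is a standard image deconvolution. A minimal sketch (synthetic image and an invented Gaussian glare kernel; regularized inverse filtering in the Fourier domain stands in for whatever specific deconvolution the thesis used):

        import numpy as np

        def deconvolve_glare(image, kernel, eps=1e-3):
            """Remove a glare point-spread function by regularized
            inverse filtering in the Fourier domain."""
            K = np.fft.rfft2(kernel, s=image.shape)
            I = np.fft.rfft2(image)
            # Wiener-like regularization keeps the division stable where |K| ~ 0.
            est = I * np.conj(K) / (np.abs(K) ** 2 + eps)
            return np.fft.irfft2(est, s=image.shape)

        # Toy demo: blur a point-like light distribution, then recover it.
        img = np.zeros((64, 64)); img[32, 32] = 1.0
        y, x = np.mgrid[-32:32, -32:32]
        glare = np.exp(-(x**2 + y**2) / (2 * 6.0**2)); glare /= glare.sum()
        glare = np.fft.ifftshift(glare)              # center the kernel at (0, 0)
        blurred = np.fft.irfft2(np.fft.rfft2(img) * np.fft.rfft2(glare), s=img.shape)
        restored = deconvolve_glare(blurred, glare)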

  6. Polythelia (supernumerary nipple): an update.

    PubMed

    Johnson, C A; Felson, B; Jolles, H

    1986-09-01

    Supernumerary mammary glands, nipples, or areolae (with neither nipple nor mammary tissue) have been well documented in the medical literature of the last two decades. Though predominantly a cosmetic blemish, the anomalous appendage may give rise to a neoplasm. Because of its atypical appearance and ectopic location, diagnosis of the anomaly may require a high index of suspicion and histologic verification. Given the current concern with breast cancer, there is a need to be aware of this entity.

  7. Small-Field Measurements of 3D Polymer Gel Dosimeters through Optical Computed Tomography.

    PubMed

    Shih, Tian-Yu; Wu, Jay; Shih, Cheng-Ting; Lee, Yao-Ting; Wu, Shin-Hua; Yao, Chun-Hsu; Hsieh, Bor-Tsung

    2016-01-01

    With advances in therapeutic instruments and techniques, three-dimensional dose delivery has been widely used in radiotherapy. Verification of the dose distribution in a small field becomes critical because of the steep dose gradient within the field. This study investigates the dose distributions of various field sizes using a NIPAM polymer gel dosimeter. The dosimeter consists of 5% gelatin, 5% monomers, 3% cross-linkers, and 5 mM THPC. After irradiation, a 24 to 96 hour delay was applied, and the gel dosimeters were read by a cone-beam optical computed tomography (optical CT) scanner. The dose distributions measured by the NIPAM gel dosimeter were compared to the outputs of the treatment planning system using gamma evaluation. For the 3%/3 mm criteria, the pass rates for the 5 × 5, 3 × 3, 2 × 2, 1 × 1, and 0.5 × 0.5 cm2 fields were 91.7%, 90.7%, 88.2%, 74.8%, and 37.3%, respectively. For the 5%/5 mm criteria, the gamma pass rates of the 5 × 5, 3 × 3, and 2 × 2 cm2 fields were over 99%. The NIPAM gel dosimeter provides high chemical stability. With cone-beam optical CT readout, the NIPAM polymer gel dosimeter has potential for clinical dose verification of small-field irradiation.
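
    Gamma evaluation itself is easy to sketch. A brute-force 1D version (real tools work in 2D/3D with interpolation; the criteria below are the study's 3%/3 mm with global dose normalization, and the profiles are synthetic):

        import numpy as np

        def gamma_pass_rate(ref, ev, dx_mm, dd=0.03, dta_mm=3.0, threshold=0.1):
            """1D global gamma: for each reference point, minimize the combined
            dose-difference / distance-to-agreement metric over all evaluated points."""
            x = np.arange(len(ref)) * dx_mm
            dmax = ref.max()
            passes, total = 0, 0
            for i, d in enumerate(ref):
                if d < threshold * dmax:          # ignore the low-dose region
                    continue
                total += 1
                gamma2 = ((ev - d) / (dd * dmax)) ** 2 + ((x - x[i]) / dta_mm) ** 2
                passes += gamma2.min() <= 1.0
            return 100.0 * passes / total

        # Toy profiles: a slightly shifted Gaussian "measurement".
        x = np.linspace(-30, 30, 241)                 # 0.25 mm grid
        ref = np.exp(-x**2 / (2 * 8.0**2))
        ev = np.exp(-(x - 1.0)**2 / (2 * 8.0**2))     # 1 mm shift
        print(gamma_pass_rate(ref, ev, dx_mm=0.25))   # expected: high pass rate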

  8. Comparative evaluation of Kodak EDR2 and XV2 films for verification of intensity modulated radiation therapy.

    PubMed

    Dogan, Nesrin; Leybovich, Leonid B; Sethi, Anil

    2002-11-21

    Film dosimetry provides a convenient tool for determining dose distributions, especially for the verification of IMRT plans. However, the film response to radiation shows a significant dependence on depth, energy and field size that compromises the accuracy of measurements. Kodak's XV2 film has a low saturation dose (approximately 100 cGy) and, consequently, a relatively short region of linear dose response. The recently introduced Kodak extended-range EDR2 film was reported to have a linear dose-response region extending to 500 cGy. This increased dose range may be particularly useful in the verification of IMRT plans. In this work, the dependence of Kodak EDR2 film's response on depth, field size and energy was evaluated and compared with Kodak XV2 film. Co-60, 6 MV, 10 MV and 18 MV beams were used. Field sizes were 2 x 2, 6 x 6, 10 x 10, 14 x 14, 18 x 18 and 24 x 24 cm2. Doses for the XV2 and EDR2 films were 80 cGy and 300 cGy, respectively. Optical density was converted to dose using depth-corrected sensitometric (Hurter and Driffield, or H&D) curves. For each field size, XV2 and EDR2 depth-dose curves were compared with ion chamber depth-dose curves. Both films demonstrated similar (within 1%) field size dependence. The deviation from the ion chamber for both films was small for the fields ranging from 2 x 2 to 10 x 10 cm2: ≤2% for the 6, 10 and 18 MV beams. No deviation was observed for the Co-60 beam. As the field size increased to 24 x 24 cm2, the deviation became significant for both films: approximately 7.5% for Co-60, approximately 5% for 6 MV and 10 MV, and approximately 6% for 18 MV. During the verification of IMRT plans, EDR2 film showed better agreement with the calculated dose distributions than the XV2 film.

  9. Density scaling of phantom materials for a 3D dose verification system.

    PubMed

    Tani, Kensuke; Fujita, Yukio; Wakita, Akihisa; Miyasaka, Ryohei; Uehara, Ryuzo; Kodama, Takumi; Suzuki, Yuya; Aikawa, Ako; Mizuno, Norifumi; Kawamori, Jiro; Saitoh, Hidetoshi

    2018-05-21

    In this study, the optimum density scaling factors of phantom materials for a commercially available three-dimensional (3D) dose verification system (Delta4) were investigated in order to improve the accuracy of the calculated dose distributions in the phantom materials. At field sizes of 10 × 10 and 5 × 5 cm2 with the same geometry, tissue-phantom ratios (TPRs) in water, polymethyl methacrylate (PMMA), and Plastic Water Diagnostic Therapy (PWDT) were measured, and TPRs for various density scaling factors of water were calculated by Monte Carlo simulation, Adaptive Convolve (AdC, Pinnacle3), Collapsed Cone Convolution (CCC, RayStation), and AcurosXB (AXB, Eclipse). Effective linear attenuation coefficients (μeff) were obtained from the TPRs. The ratios of μeff in phantom and water ((μeff)pl,water) were compared between the measurements and calculations. For each phantom material, the density scaling factor proposed in this study (DSF) was set to the value providing a match between the calculated and measured (μeff)pl,water. The optimum density scaling factor was verified through comparison of the dose distributions measured by Delta4 and calculated with three different density scaling factors: the nominal physical density (PD), the nominal relative electron density (ED), and the DSF. Three plans were used for the verifications: a static 10 × 10 cm2 field and two intensity modulated radiation therapy (IMRT) treatment plans. The DSF was determined to be 1.13 for PMMA and 0.98 for PWDT. The DSF for PMMA showed good agreement for AdC and CCC with 6 MV x rays, and for AdC with 10 MV x rays. The DSF for PWDT showed good agreement regardless of the dose calculation algorithm and x-ray energy. The DSF can be considered a reference density scaling factor for Delta4 phantom materials and may help improve the accuracy of IMRT dose verification using Delta4. © 2018 The Authors. Journal of Applied Clinical Medical Physics published by Wiley Periodicals, Inc. on behalf of American Association of Physicists in Medicine.
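
    The μeff extraction is essentially a one-line fit: beyond the depth of maximum dose, a TPR curve falls off nearly exponentially, so ln(TPR) versus depth is a line of slope -μeff. A minimal sketch with made-up TPR values (the resulting ratio is constructed to resemble, not reproduce, the paper's numbers):

        import numpy as np

        def mu_eff(depths_cm, tpr):
            """Effective linear attenuation coefficient [1/cm] from the
            exponential tail of a TPR curve: ln(TPR) = -mu_eff * depth + c."""
            slope, _ = np.polyfit(depths_cm, np.log(tpr), 1)
            return -slope

        depths = np.array([10.0, 15.0, 20.0, 25.0])    # depths beyond dmax
        tpr_water = np.exp(-0.046 * depths)            # made-up 6 MV-like fall-off
        tpr_pmma = np.exp(-0.052 * depths)             # made-up phantom material

        ratio = mu_eff(depths, tpr_pmma) / mu_eff(depths, tpr_water)
        print(ratio)  # (mu_eff)pl,water; the DSF is tuned until calculation matches this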

  10. SU-E-T-278: Realization of Dose Verification Tool for IMRT Plan Based On DPM

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cai, Jinfeng; Cao, Ruifen; Dai, Yumei

    Purpose: To build a Monte Carlo dose verification tool for IMRT plans by implementing an irradiation source model in the DPM code, extending the ability of DPM to handle arbitrary incident angles and irregular, inhomogeneous fields. Methods: The virtual source and the energy spectrum unfolded from accelerator measurement data were combined with optimized intensity maps to calculate the dose distribution of irregular, inhomogeneous irradiation fields. The accelerator irradiation source model was replaced by a grid-based surface source. The contour and the intensity distribution of the surface source were optimized by the ARTS (Accurate/Advanced Radiotherapy System) optimization module based on the tumor configuration. The weight of each emitter was determined by the grid intensity, and its direction by the combination of the virtual source and the emitter's position. The photon energy spectrum unfolded from the accelerator measurement data was adjusted by compensating for the contaminating electron source. For verification, measured data and a realistic clinical IMRT plan were compared with DPM dose calculations. Results: The regular field was verified against the measured data; the differences were acceptable (<2% inside the field, 2-3 mm in the penumbra). The dose calculation of an irregular field by DPM simulation was also compared with that of FSPB (Finite Size Pencil Beam), and the passing rate of the gamma analysis was 95.1% for a peripheral lung cancer case. The regular field and the irregular rotational field were both within the permitted error range. The computing time for regular fields was less than 2 h, and the peripheral lung cancer test took 160 min. Through parallel processing, the adapted DPM could complete the calculation of an IMRT plan within half an hour. Conclusion: The adapted, parallelized DPM code with the irradiation source model is faster than classic Monte Carlo codes. Its computational accuracy and speed satisfy clinical requirements, and it is expected to become a Monte Carlo dose verification tool for IMRT plans. Strategic Priority Research Program of the Chinese Academy of Sciences (XDA03040000); National Natural Science Foundation of China (81101132)

  11. Determining the mechanical properties of a radiochromic silicone-based 3D dosimeter

    NASA Astrophysics Data System (ADS)

    Kaplan, L. P.; Høye, E. M.; Balling, P.; Muren, L. P.; Petersen, J. B. B.; Poulsen, P. R.; Yates, E. S.; Skyt, P. S.

    2017-07-01

    New treatment modalities in radiotherapy (RT) enable delivery of highly conformal dose distributions in patients. This creates a need for precise dose verification in three dimensions (3D). A radiochromic silicone-based 3D dosimetry system has recently been developed. Such a dosimeter can be used for dose verification in deformed geometries, which requires knowledge of the dosimeter’s mechanical properties. In this study we have characterized the dosimeter’s elastic behaviour under tensile and compressive stress. In addition, the dose response under strain was determined. It was found that the dosimeter behaved as an incompressible hyperelastic material with a non-linear stress/strain curve and with no observable hysteresis or plastic deformation even at high strains. The volume was found to be constant within a 2% margin at deformations up to 60%. Furthermore, it was observed that the dosimeter returned to its original geometry within a 2% margin when irradiated under stress, and that the change in optical density per centimeter was constant regardless of the strain during irradiation. In conclusion, we have shown that this radiochromic silicone-based dosimeter’s mechanical properties make it a viable candidate for dose verification in deformable 3D geometries.

  12. SU-E-T-255: Optimized Supine Craniospinal Irradiation with Image-Guided and Field Matched Beams

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jiang, Z; Holupka, E; Naughton, J

    2014-06-01

    Purpose: Conventional craniospinal irradiation (CSI) challenges include dose inhomogeneity at field junctions and positional uncertainty due to field divergence, particularly for the two spinal fields. Here we outline a new supine CSI technique to address these difficulties. Methods: The patient was simulated in the supine position. The cranial fields had their isocenter at the C2/C3 vertebral level and were matched with the first spinal field. Their inferior border was chosen to avoid the shoulders, as well as the chin in the first spinal field. Their collimator angles depended on the asymmetric jaw setting of the first spinal field. With couch rotation, the spinal field gantry angles were adjusted to ensure that the inferior border of the first and the superior border of the second spinal field were perpendicular to the table top. The radio-opaque wire marking the spinal junction was located initially by the light field of an anterior setup beam and finalized using portal imaging of the first spinal field. With reference to the spinal junction wire, the fields were matched by positioning the isocenter of the second spinal field. A formula was derived to optimize supine CSI treatment planning by utilizing the relationship among the Y-jaw settings, the spinal field gantry angles, the cranial field collimator angles, and the spinal field isocenter locations. The plan was delivered with portal imaging alignment for both the cranial and spinal junctions. Results: Utilizing this technique with matched beams, together with conventional techniques such as feathering and forward planning, a homogeneous dose distribution was achieved throughout the entire CSI treatment volume, including the spinal junction. Placing the spinal junction wire so that it is visualized in both spinal portals allows precise determination and verification of the appropriate match line of the spinal fields. Conclusion: This optimized supine CSI technique achieved homogeneous dose distributions and accurate patient localization with image-guided, matched beams.
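
    Although the paper's exact formula is not reproduced here, the textbook matching geometry it builds on can be recalled. For a spinal field of length L_spine and a cranial field of length L_cranial matched at the junction, with source-axis distance SAD,

        \theta_{\mathrm{coll}} = \arctan\!\left(\frac{0.5\,L_{\mathrm{spine}}}{\mathrm{SAD}}\right),
        \qquad
        \theta_{\mathrm{couch}} = \arctan\!\left(\frac{0.5\,L_{\mathrm{cranial}}}{\mathrm{SAD}}\right),

    where the collimator rotation of the cranial fields follows the diverging superior edge of the spinal field, and the couch kick removes cranial-field divergence at the junction. With asymmetric Y jaws, as used here, the half-length 0.5 L is replaced by the corresponding Y-jaw opening, which is the dependence the paper's derived formula exploits.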

  13. Predicting patterns of non-native plant invasions in Yosemite National Park, California, USA

    USGS Publications Warehouse

    Underwood, E.C.; Klinger, R.; Moore, P.E.

    2004-01-01

    One of the major issues confronting the management of parks and reserves is the invasion of non-native plant species. Yosemite National Park is one of the largest and best-known parks in the United States, harbouring significant cultural and ecological resources. Effective management of non-natives would be greatly assisted by information on their potential distribution, which can be generated by predictive modelling techniques. Our goal was to identify key environmental factors correlated with the percent cover of non-native species and then develop a predictive model using the Genetic Algorithm for Rule-set Production (GARP) technique. We performed a series of analyses using community-level data on species composition in 236 plots located throughout the park. A total of 41 non-native species were recorded, occurring in 23.7% of the plots. Plots with non-natives occurred most frequently at low to mid elevations, in flat areas with other herbaceous species. Based on the community-level results, we selected elevation, slope, and vegetation structure as inputs to the GARP model to predict the environmental niche of non-native species. Verification was performed using plot data withheld from the model, which showed a correct prediction rate of 76% for non-native species occurrence. The majority of the western, lower-elevation portion of the park was predicted to have relatively low levels of non-native species occurrence, with the highest concentrations predicted at the west and south entrances and in the Yosemite Valley. Distribution maps of predicted occurrences will be used by management to efficiently target monitoring of non-native species, prioritize control efforts according to the likelihood of non-native occurrences, and inform decisions relating to the management of non-native species in post-fire environments. Our approach provides a valuable tool for assisting decision makers to better manage non-native species and can be readily adapted to target non-native species in other locations.

  14. SU-F-T-463: Light-Field Based Dynalog Verification

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Atwal, P; Ramaseshan, R

    2016-06-15

    Purpose: To independently verify leaf positions in so-called dynalog files for a Varian iX linac with a Millennium 120 MLC. This verification provides a measure of confidence that the files can be used directly as part of a more extensive intensity modulated radiation therapy / volumetric modulated arc therapy QA program. Methods: Initial testing used white paper placed at the collimator plane and a standard hand-held digital camera to image the light and shadow of a static MLC field through the paper. Known markings on the paper allow for image calibration. Noise reduction was attempted by removing the 'inherent noise' of an open-field light image through the paper, but the method was found to be inconsequential, likely because the environment could not be controlled to the precision required for the reproducible characterization of the quantum noise needed to meaningfully account for it. A multi-scale iterative edge detection algorithm was used to localize the leaf ends, which were compared with the planned locations from the treatment console. Results: With a very basic setup, the imaged positions of bank A central leaves 15-45, which are arguably the most important for beam modulation, differed from the planned locations by 0.38 ± 0.28 mm. Similarly, bank B leaves 15-45 differed by 0.42 ± 0.28 mm. Conclusion: It should be possible to determine leaf positions accurately with not much more than a modern hand-held camera and some software, meaning that a periodic and independent verification of the dynalog file information is achievable. This is indicated by the precision already obtained using a basic setup and analysis methodology. Currently, work is being done to reduce imaging and setup errors, which will bring the leaf position error down further and allow meaningful analysis over the full range of leaves.

  15. Quantitative assessment of the physical potential of proton beam range verification with PET/CT.

    PubMed

    Knopf, A; Parodi, K; Paganetti, H; Cascio, E; Bonab, A; Bortfeld, T

    2008-08-07

    A recent clinical pilot study demonstrated the feasibility of offline PET/CT range verification for proton therapy treatments. In vivo PET measurements are challenged by blood perfusion, variations of tissue compositions, patient motion and image co-registration uncertainties. Besides these biological and treatment specific factors, the accuracy of the method is constrained by the underlying physical processes. This phantom study distinguishes physical factors from other factors, assessing the reproducibility, consistency and sensitivity of the PET/CT range verification method. A spread-out Bragg-peak (SOBP) proton field was delivered to a phantom consisting of poly-methyl methacrylate (PMMA), lung and bone equivalent material slabs. PET data were acquired in listmode at a commercial PET/CT scanner available within 10 min walking distance from the proton therapy unit. The measured PET activity distributions were compared to simulations of the PET signal based on Geant4 and FLUKA Monte Carlo (MC) codes. To test the reproducibility of the measured PET signal, data from two independent measurements at the same geometrical position in the phantom were compared. Furthermore, activation depth profiles within identical material arrangements but at different positions within the irradiation field were compared to test the consistency of the measured PET signal. Finally, activation depth profiles through air/lung, air/bone and lung/bone interfaces parallel as well as at 6 degrees to the beam direction were studied to investigate the sensitivity of the PET/CT range verification method. The reproducibility and the consistency of the measured PET signal were found to be of the same order of magnitude. They determine the physical accuracy of the PET measurement to be about 1 mm. However, range discrepancies up to 2.6 mm between two measurements and range variations up to 2.6 mm within one measurement were found at the beam edge and at the edge of the field of view (FOV) of the PET scanner. PET/CT range verification was found to be able to detect small range modifications in the presence of complex tissue inhomogeneities. This study indicates the physical potential of the PET/CT verification method to detect the full-range characteristic of the delivered dose in the patient.

  16. Quantitative assessment of the physical potential of proton beam range verification with PET/CT

    NASA Astrophysics Data System (ADS)

    Knopf, A.; Parodi, K.; Paganetti, H.; Cascio, E.; Bonab, A.; Bortfeld, T.

    2008-08-01

    A recent clinical pilot study demonstrated the feasibility of offline PET/CT range verification for proton therapy treatments. In vivo PET measurements are challenged by blood perfusion, variations of tissue compositions, patient motion and image co-registration uncertainties. Besides these biological and treatment specific factors, the accuracy of the method is constrained by the underlying physical processes. This phantom study distinguishes physical factors from other factors, assessing the reproducibility, consistency and sensitivity of the PET/CT range verification method. A spread-out Bragg-peak (SOBP) proton field was delivered to a phantom consisting of poly-methyl methacrylate (PMMA), lung and bone equivalent material slabs. PET data were acquired in listmode at a commercial PET/CT scanner available within 10 min walking distance from the proton therapy unit. The measured PET activity distributions were compared to simulations of the PET signal based on Geant4 and FLUKA Monte Carlo (MC) codes. To test the reproducibility of the measured PET signal, data from two independent measurements at the same geometrical position in the phantom were compared. Furthermore, activation depth profiles within identical material arrangements but at different positions within the irradiation field were compared to test the consistency of the measured PET signal. Finally, activation depth profiles through air/lung, air/bone and lung/bone interfaces parallel as well as at 6° to the beam direction were studied to investigate the sensitivity of the PET/CT range verification method. The reproducibility and the consistency of the measured PET signal were found to be of the same order of magnitude. They determine the physical accuracy of the PET measurement to be about 1 mm. However, range discrepancies up to 2.6 mm between two measurements and range variations up to 2.6 mm within one measurement were found at the beam edge and at the edge of the field of view (FOV) of the PET scanner. PET/CT range verification was found to be able to detect small range modifications in the presence of complex tissue inhomogeneities. This study indicates the physical potential of the PET/CT verification method to detect the full-range characteristic of the delivered dose in the patient.

  17. Making statistical inferences about software reliability

    NASA Technical Reports Server (NTRS)

    Miller, Douglas R.

    1988-01-01

    Failure times of software undergoing random debugging can be modelled as order statistics of independent but non-identically distributed exponential random variables. Using this model, inferences can be made about current reliability and, if debugging continues, future reliability. The model also shows the difficulty inherent in the statistical verification of very highly reliable software, such as that used by digital avionics in commercial aircraft.
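
    The model is easy to simulate. A minimal sketch (fault count and detection rates invented): each fault has an independent exponential detection time, and debugging reveals the faults in sorted order, so the observed failure times are order statistics of non-identical exponentials.

        import numpy as np

        rng = np.random.default_rng(1)

        # Invented example: 20 faults; "bigger" faults are found faster.
        rates = np.linspace(2.0, 0.1, 20)        # per-fault detection rates

        t = rng.exponential(1.0 / rates)         # detection time of each fault
        order = np.argsort(t)
        failure_times = t[order]                 # observed failure times (order statistics)

        # After k failures have been observed and fixed, the program's failure
        # rate is the sum of the rates of the faults still present, a quantity
        # unknown in practice, which is what makes statistical verification of
        # ultra-reliable software so demanding.
        k = 10
        print(failure_times[:k])
        print("residual failure rate:", rates[order[k:]].sum())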

  18. Stopping power and dose calculations with analytical and Monte Carlo methods for protons and prompt gamma range verification

    NASA Astrophysics Data System (ADS)

    Usta, Metin; Tufan, Mustafa Çağatay; Aydın, Güral; Bozkurt, Ahmet

    2018-07-01

    In this study, we have performed calculations of stopping power, depth dose, and range verification for proton beams using dielectric and Bethe-Bloch theories and the FLUKA, Geant4 and MCNPX Monte Carlo codes. As analytical approaches, the Drude model was applied within dielectric theory, and the effective charge approach with Roothaan-Hartree-Fock charge densities was used within Bethe theory. In the simulations, different setup parameters were selected to evaluate the performance of the three distinct Monte Carlo codes. Lung and breast tissues were investigated, as they are associated with some of the most common types of cancer worldwide. The results were compared with each other and with the available data in the literature; in addition, they were verified against prompt gamma range data. For both stopping power values and depth-dose distributions, the Monte Carlo values gave better results than the analytical ones; the results that agree best with ICRU stopping power data are those of the effective charge approach among the analytical methods and of the FLUKA code among the MC packages. In the depth-dose distributions of the examined tissues, although the Bragg curves for the Monte Carlo codes almost overlap, the analytical ones show significant deviations that become more pronounced with increasing energy. Verification against prompt gamma photon results was attempted for 100-200 MeV protons, which are regarded as important for proton therapy. The analytical results are within 2%-5% and the Monte Carlo values within 0%-2% of those of the prompt gammas.
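
    For reference, the Bethe mass stopping power underlying the analytical track can be coded directly. A minimal sketch for protons, omitting shell, density-effect, and effective-charge corrections (so it is only indicative above a few MeV); the material constants assumed here are for water (Z/A = 0.5551, I ≈ 78 eV):

        import numpy as np

        K = 0.307075        # MeV cm^2 / mol
        ME_C2 = 0.510999    # electron rest energy [MeV]
        MP_C2 = 938.272     # proton rest energy [MeV]

        def bethe_stopping_power(T_mev, Z_over_A=0.5551, I_ev=78.0, z=1):
            """Mass stopping power -dE/dx [MeV cm^2/g] from the Bethe formula,
            using W_max ~ 2 me c^2 beta^2 gamma^2 (valid for M >> me)."""
            gamma = 1.0 + T_mev / MP_C2
            beta2 = 1.0 - 1.0 / gamma**2
            I = I_ev * 1e-6                          # convert eV -> MeV
            wmax = 2 * ME_C2 * beta2 * gamma**2
            log_term = np.log(2 * ME_C2 * beta2 * gamma**2 * wmax / I**2)
            return K * z**2 * Z_over_A / beta2 * (0.5 * log_term - beta2)

        for T in (10, 100, 200):                     # therapy-relevant energies [MeV]
            print(T, bethe_stopping_power(T))        # ~45.6, ~7.3, ~4.5 for water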

  19. Retina verification system based on biometric graph matching.

    PubMed

    Lajevardi, Seyed Mehdi; Arakala, Arathi; Davis, Stephen A; Horadam, Kathy J

    2013-09-01

    This paper presents an automatic retina verification framework based on the biometric graph matching (BGM) algorithm. The retinal vasculature is extracted using a family of matched filters in the frequency domain and morphological operators. Retinal templates are then defined as formal spatial graphs derived from the retinal vasculature. The BGM algorithm, a noisy graph matching algorithm robust to translation, non-linear distortion, and small rotations, is used to compare retinal templates. The BGM algorithm uses graph topology to define three distance measures between a pair of graphs, two of which are new. A support vector machine (SVM) classifier is used to distinguish between genuine and impostor comparisons. Using single as well as multiple graph measures, the classifier achieves complete separation on a training set of images from the VARIA database (60% of the data), equaling the state of the art for retina verification. Because the available data set is small, kernel density estimates (KDE) of the genuine and impostor score distributions of the training set are used to measure the performance of the BGM algorithm. In the one-dimensional case, the KDE model is validated with the testing set; a zero equal error rate (EER) on testing shows that the KDE model is a good fit to the empirical distribution. For the multiple graph measures, a novel combination of the SVM boundary and the KDE model is used to obtain a fair comparison with the KDE model for the single measure. A clear benefit of using multiple graph measures over a single measure to distinguish genuine and impostor comparisons is demonstrated by a drop in theoretical error of between 60% and more than two orders of magnitude.
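
    The KDE-based performance measurement can be sketched compactly: fit separate densities to genuine and impostor score samples, then find the threshold where the false accept and false reject rates cross (the EER). The scores below are synthetic; the study itself used BGM distances on VARIA images.

        import numpy as np
        from scipy.stats import gaussian_kde

        rng = np.random.default_rng(42)
        genuine = rng.normal(0.2, 0.08, 200)     # synthetic match scores (small = similar)
        impostor = rng.normal(0.6, 0.10, 2000)

        g_kde, i_kde = gaussian_kde(genuine), gaussian_kde(impostor)

        thr = np.linspace(0.0, 1.0, 1001)
        # FRR: genuine scores above the threshold; FAR: impostor scores below it.
        frr = np.array([g_kde.integrate_box_1d(t, np.inf) for t in thr])
        far = np.array([i_kde.integrate_box_1d(-np.inf, t) for t in thr])

        i_eer = np.argmin(np.abs(far - frr))
        print(f"EER ~ {0.5 * (far[i_eer] + frr[i_eer]):.4f} at threshold {thr[i_eer]:.3f}")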

  20. Convective Weather Forecast Quality Metrics for Air Traffic Management Decision-Making

    NASA Technical Reports Server (NTRS)

    Chatterji, Gano B.; Gyarfas, Brett; Chan, William N.; Meyn, Larry A.

    2006-01-01

    Since numerical weather prediction models are unable to accurately forecast the severity and location of storm cells several hours into the future when compared with observation data, there has been growing interest in probabilistic descriptions of convective weather. The classical approach for generating uncertainty bounds consists of integrating the state equations and covariance propagation equations forward in time. This step is readily recognized as the process update step of the Kalman Filter algorithm. The second well-known method, the Monte Carlo method, consists of generating output samples by driving the forecast algorithm with input samples selected from distributions. The statistical properties of the distributions of the output samples are then used to define the uncertainty bounds of the output variables. This method is computationally expensive for a complex model compared to the covariance propagation method; its main advantage is that a complex non-linear model can be easily handled. Recently, a few different methods for probabilistic forecasting have appeared in the literature. A method for computing the probability of convection in a region using forecast data is described in Ref. 5. Probability at a grid location is computed as the fraction of grid points, within a box of specified dimensions around the grid location, with forecast convective precipitation exceeding a specified threshold. The main limitation of this method is that the results depend on the chosen dimensions of the box. The examples presented in Ref. 5 show that this process is equivalent to low-pass filtering of the forecast data with a finite-support spatial filter. References 6 and 7 describe a technique for computing percentage coverage within a 92 x 92 square-kilometer box and assigning the value to the center 4 x 4 square-kilometer box; this technique is the same as that described in Ref. 5. Characterizing the forecast, following the process described in Refs. 5 through 7, in terms of percentage coverage or confidence level is notionally sound compared to characterizing it in terms of probabilities, because the probability of the forecast being correct can only be determined using actual observations; Refs. 5 through 7 use only the forecast data and not the observations. A method for computing the probability of detection, the false alarm ratio, and several forecast quality metrics (skill scores) using both the forecast and observation data is given in Ref. 2. This paper extends the statistical verification method in Ref. 2 to determine co-occurrence probabilities. The method consists of computing the probability that a severe weather cell (grid location) is detected in the observation data in the neighborhood of the severe weather cell in the forecast data. Probabilities of occurrence at the grid location and in its neighborhood with higher severity, and with lower severity, in the observation data compared to the forecast data are examined. The method proposed in Refs. 5 through 7 is used for computing the probability that a certain number of cells in the neighborhood of severe weather cells in the forecast data are seen as severe weather cells in the observation data. Finally, the probability of existence of gaps in the observation data in the neighborhood of severe weather cells in the forecast data is computed. Gaps are defined as openings between severe weather cells through which an aircraft can safely fly to its intended destination.
The rest of the paper is organized as follows. Section II summarizes the statistical verification method described in Ref. 2. The extension of this method for computing the co-occurrence probabilities is discussed in Section III. Numerical examples using NCWF forecast data and NCWD observation data are presented in Section III to elucidate the characteristics of the co-occurrence probabilities. This section also discusses the procedure for computing the probabilities that the severity of convection in the observation data will be higher or lower in the neighborhood of grid locations compared to that indicated at the grid locations in the forecast data. The probability of coverage of neighborhood grid cells is also described via examples in this section. Section IV discusses the gap detection algorithm and presents a numerical example to illustrate the method. The locations of the detected gaps in the observation data are used along with the locations of convective weather cells in the forecast data to determine the probability of existence of gaps in the neighborhood of these cells. Finally, the paper is concluded in Section V.
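
    The box-fraction construction of Refs. 5 through 7 is essentially a moving-average filter applied to a binary exceedance field. A minimal sketch (synthetic convection proxy; the box size is a free parameter, which is exactly the dependence criticized above):

        import numpy as np
        from scipy.ndimage import uniform_filter

        def neighborhood_probability(field, threshold, box_km, grid_km=4.0):
            """Fraction of grid points within a box exceeding a threshold,
            assigned to the box center (cf. the coverage-style products)."""
            exceed = (field >= threshold).astype(float)
            size = max(1, int(round(box_km / grid_km)))
            return uniform_filter(exceed, size=size, mode="constant")

        rng = np.random.default_rng(7)
        fake_vil = rng.gamma(2.0, 1.5, size=(100, 100))   # synthetic convection proxy
        prob = neighborhood_probability(fake_vil, threshold=8.0, box_km=92.0)
        print(prob.max(), prob.mean())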

  1. Computational control of flexible aerospace systems

    NASA Technical Reports Server (NTRS)

    Sharpe, Lonnie, Jr.; Shen, Ji Yao

    1994-01-01

    The main objective of this project is to establish a distributed parameter modeling technique for structural analysis, parameter estimation, vibration suppression and control synthesis of large flexible aerospace structures. This report concentrates on the research outputs produced in the last two years. The main accomplishments can be summarized as follows. A new version of the PDEMOD code has been completed, based on several incomplete versions. Verification of the code was conducted by comparing the results with examples for which exact theoretical solutions can be obtained. The theoretical background of the package and the verification examples have been reported in a technical paper submitted to the Joint Applied Mechanics & Materials Conference, ASME. A brief USER'S MANUAL has been compiled, which includes three parts: (1) input data preparation; (2) explanation of the subroutines; and (3) specification of control variables. Meanwhile, a theoretical investigation of the NASA MSFC two-dimensional ground-based manipulator facility using the distributed parameter modeling technique has been conducted. A new mathematical treatment for the dynamic analysis and control of large flexible manipulator systems has been conceived, which may provide an embryonic form of a more sophisticated mathematical model for future versions of the PDEMOD code.

  2. Review of the Microdosimetric Studies for High-Energy Charged Particle Beams Using a Tissue-Equivalent Proportional Counter

    NASA Astrophysics Data System (ADS)

    Tsuda, Shuichi; Sato, Tatsuhiko; Ogawa, Tatsuhiko; Sasaki, Shinichi

    Lineal energy (y) distributions were measured for various types of charged particles, such as protons and iron ions, with kinetic energies up to 500 MeV/u, using a wall-less tissue-equivalent proportional counter (TEPC). The radial dependence of the y distributions was also evaluated experimentally to investigate the track structures of proton, carbon, and iron beams. This paper reviews the series of data measured with the aforementioned TEPC and assesses the systematic verification of a microdosimetric calculation model of the y distribution incorporated into the Particle and Heavy Ion Transport code System (PHITS) and its associated track structure models.

  3. Finite Element Simulation and Experimental Verification of Internal Stress of Quenched AISI 4140 Cylinders

    NASA Astrophysics Data System (ADS)

    Liu, Yu; Qin, Shengwei; Hao, Qingguo; Chen, Nailu; Zuo, Xunwei; Rong, Yonghua

    2017-03-01

    The study of internal stress in quenched AISI 4140 medium carbon steel is of importance in engineering. In this work, finite element simulation (FES) was employed to predict the distribution of internal stress in quenched AISI 4140 cylinders of two diameters, based on an exponent-modified (Ex-Modified) normalized function. The results indicate that FES based on the proposed Ex-Modified normalized function is more consistent with X-ray diffraction measurements of the stress distribution than FES based on the normalized functions proposed by Abrassart, Desalos, and Leblond, which is attributed to the Ex-Modified function better describing transformation plasticity. The effect of the temperature distribution on phase formation, the origin of the residual stress distribution, and the effect of the transformation plasticity function on the residual stress distribution were further discussed.

  4. Interfacing Space Communications and Navigation Network Simulation with Distributed System Integration Laboratories (DSIL)

    NASA Technical Reports Server (NTRS)

    Jennings, Esther H.; Nguyen, Sam P.; Wang, Shin-Ywan; Woo, Simon S.

    2008-01-01

    NASA's planned Lunar missions will involve multiple NASA centers, where each participating center has a specific role and specialization. In this vision, the Constellation Program (CxP)'s Distributed System Integration Laboratories (DSIL) architecture consists of multiple System Integration Labs (SILs), with simulators, emulators, test labs and control centers interacting with each other over a broadband network to perform test and verification for mission scenarios. To support the end-to-end simulation and emulation effort of NASA's exploration initiatives, different NASA centers are interconnected to participate in distributed simulations. Currently, DSIL has interconnections among the following NASA centers: Johnson Space Center (JSC), Kennedy Space Center (KSC), Marshall Space Flight Center (MSFC) and the Jet Propulsion Laboratory (JPL). Through interconnections and interactions among different NASA centers, critical resources and data can be shared, while independent simulations can be performed simultaneously at different NASA locations, to effectively utilize the simulation and emulation capabilities at each center. Furthermore, the development of DSIL can maximally leverage existing project simulation and testing plans. In this work, we describe the specific role and development activities at JPL for the Space Communications and Navigation Network (SCaN) simulator, using the Multi-mission Advanced Communications Hybrid Environment for Test and Evaluation (MACHETE) tool to simulate communications effects among mission assets. Using MACHETE, different space network configurations among spacecraft and ground systems with various parameter sets can be simulated. Data necessary for the tracking, navigation, and guidance of spacecraft such as the Crew Exploration Vehicle (CEV), Crew Launch Vehicle (CLV), and Lunar Relay Satellite (LRS), together with orbit calculation data, are disseminated to different NASA centers and updated periodically using the High Level Architecture (HLA). In addition, the performance of DSIL under different traffic loads with different mixes of data and priorities is evaluated.

  5. Precipitation From a Multiyear Database of Convection-Allowing WRF Simulations

    NASA Astrophysics Data System (ADS)

    Goines, D. C.; Kennedy, A. D.

    2018-03-01

    Convection-allowing models (CAMs) have become frequently used for operational forecasting and, more recently, have been utilized for general circulation model downscaling. CAM forecasts have typically been analyzed for a few case studies or over short time periods, which limits the ability to judge the overall skill of deterministic simulations; analysis over long time periods can yield a better understanding of systematic model error. Four warm seasons (April-August, 2010-2013) of simulated precipitation have been accumulated from two Weather Research and Forecasting (WRF) models with 4 km grid spacing. The simulations were provided by the National Centers for Environmental Prediction (NCEP) and the National Severe Storms Laboratory (NSSL), each with a different dynamic core and parameterization schemes. These simulations are evaluated against the NCEP Stage-IV precipitation data set with similar 4 km grid spacing. The spatial distribution and diurnal cycle of precipitation in the central United States are analyzed using Hovmöller diagrams, grid point correlations, and traditional verification skill scores such as the Equitable Threat Score (ETS). Although NCEP-WRF had a high positive error in total precipitation, its spatial characteristics were similar to observations; for example, the spatial distribution of NCEP-WRF precipitation correlated better than NSSL-WRF for the Northern Plains. Hovmöller results exposed a delay in the initiation and decay of diurnal precipitation by NCEP-WRF, while both models had difficulty reproducing the timing and location of propagating precipitation. ETS was highest for NSSL-WRF in all domains at all times, and was also higher in areas of propagating precipitation than in areas of unorganized, scattered diurnal precipitation. Monthly analysis identified distinct differences between the two models in their ability to correctly simulate the spatial distribution and zonal motion of precipitation through the warm season.
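
    The ETS is computed from the 2x2 contingency table of forecast versus observed exceedances, with the hit count corrected for chance. A minimal sketch with synthetic binary fields:

        import numpy as np

        def equitable_threat_score(fcst, obs):
            """ETS = (hits - hits_random) / (hits + misses + false_alarms - hits_random),
            where hits_random = (hits + misses) * (hits + false_alarms) / N."""
            f, o = fcst.astype(bool), obs.astype(bool)
            hits = np.sum(f & o)
            misses = np.sum(~f & o)
            fals = np.sum(f & ~o)
            n = f.size
            h_rand = (hits + misses) * (hits + fals) / n
            return (hits - h_rand) / (hits + misses + fals - h_rand)

        rng = np.random.default_rng(3)
        obs = rng.random((200, 200)) < 0.05                  # observed exceedances
        fcst = obs ^ (rng.random((200, 200)) < 0.03)         # imperfect forecast
        print(equitable_threat_score(fcst, obs))             # 1 = perfect, <= 0 = no skill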

  6. Real-Time Impact Visualization Inspection of Aerospace Composite Structures with Distributed Sensors.

    PubMed

    Si, Liang; Baier, Horst

    2015-07-08

    For the future design of smart aerospace structures, the development and application of reliable, real-time and automatic monitoring and diagnostic techniques is essential. Thus, with distributed sensor networks, a real-time automatic structural health monitoring (SHM) technique is designed and investigated to monitor and predict the locations and force magnitudes of unforeseen foreign impacts on composite structures and to estimate the structural state in real time when impacts occur. The proposed smart impact visualization inspection (IVI) technique mainly consists of five functional modules: the signal data preprocessing (SDP), the forward model generator (FMG), the impact positioning calculator (IPC), the inverse model operator (IMO) and the structural state estimator (SSE). To verify the practicality of the proposed IVI technique, various structural configurations are considered, namely a normal CFRP panel and another CFRP panel with "orange peel" surfaces and a cutout hole. Additionally, since robustness against background disturbances is also an essential criterion for practical engineering demands, investigations and experimental tests are carried out under random vibration interfering noise (RVIN) conditions. The accuracy of the predictions for unknown impact events on composite structures using the IVI technique is validated under various structural configurations and under changing environmental conditions. The evaluated errors all fall well within a satisfactory limit range. Furthermore, it is concluded that the IVI technique is applicable for impact monitoring, diagnosis and assessment of aerospace composite structures in complex practical engineering environments.

  7. Real-Time Impact Visualization Inspection of Aerospace Composite Structures with Distributed Sensors

    PubMed Central

    Si, Liang; Baier, Horst

    2015-01-01

    For the future design of smart aerospace structures, the development and application of a reliable, real-time, and automatic monitoring and diagnostic technique is essential. Thus, with distributed sensor networks, a real-time automatic structural health monitoring (SHM) technique is designed and investigated to monitor and predict the locations and force magnitudes of unforeseen foreign impacts on composite structures and to estimate the structural state in real time when impacts occur. The proposed smart impact visualization inspection (IVI) technique mainly consists of five functional modules: the signal data preprocessing (SDP), the forward model generator (FMG), the impact positioning calculator (IPC), the inverse model operator (IMO), and the structural state estimator (SSE). To verify the practicality of the proposed IVI technique, various structural configurations are considered: a normal CFRP panel and another CFRP panel with “orange peel” surfaces and a cutout hole. Additionally, since robustness against background disturbances is also an essential criterion for practical engineering demands, investigations and experimental tests are carried out under random vibration interfering noise (RVIN) conditions. The accuracy of the predictions for unknown impact events on composite structures using the IVI technique is validated under various structural configurations and changing environmental conditions. The evaluated errors all fall within a satisfactory limit range. Furthermore, it is concluded that the IVI technique is applicable for impact monitoring, diagnosis, and assessment of aerospace composite structures in complex practical engineering environments. PMID:26184196

  8. Real-Time System Verification by k-Induction

    NASA Technical Reports Server (NTRS)

    Pike, Lee S.

    2005-01-01

    We report the first formal verification of a reintegration protocol for a safety-critical, fault-tolerant, real-time distributed embedded system. A reintegration protocol increases system survivability by allowing a node that has suffered a fault to regain state consistent with the operational nodes. The protocol is verified in the Symbolic Analysis Laboratory (SAL), where bounded model checking and decision procedures are used to verify infinite-state systems by k-induction. The protocol and its environment are modeled as synchronizing timeout automata. Because k-induction is exponential with respect to k, we optimize the formal model to reduce the size of k. Also, the reintegrator's event-triggered behavior is conservatively modeled as time-triggered behavior to further reduce the size of k and to make it invariant to the number of nodes modeled. A corollary is that a clique avoidance property is satisfied.
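
    Since the abstract turns on the mechanics of k-induction, the sketch below spells out the principle on a tiny explicit-state system: the base case checks all length-k paths from the initial states, and the inductive step checks that k consecutive property-satisfying states cannot step to a violation. The toy transition system is invented for illustration and is unrelated to the SAL reintegration model; real tools discharge both checks with SAT/SMT solvers rather than enumeration.

    ```python
    from itertools import product

    # Toy transition system: state 3 is "bad" but unreachable; state 4 is an
    # unconstrained state that plain 1-induction trips over.
    STATES = range(5)
    TRANS = {(0, 1), (1, 2), (2, 0), (3, 3), (4, 3)}

    def init(s): return s == 0
    def trans(s, t): return (s, t) in TRANS
    def prop(s): return s != 3          # the invariant to prove

    def k_induction(k):
        # Base case: every state on every length-k path from an initial state
        # satisfies prop (assumes no deadlocks, as in this toy model).
        paths = [[s] for s in STATES if init(s)]
        for _ in range(k):
            paths = [p + [t] for p in paths for t in STATES if trans(p[-1], t)]
        if any(not prop(s) for p in paths for s in p):
            return False
        # Inductive step: no path of k consecutive prop-states may step to a
        # violation anywhere in the state space.
        for path in product(STATES, repeat=k + 1):
            if (all(trans(a, b) for a, b in zip(path, path[1:]))
                    and all(prop(s) for s in path[:-1])
                    and not prop(path[-1])):
                return False
        return True

    print(k_induction(1), k_induction(2))   # False True: the property is 2-inductive
    ```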

  9. A Practical Approach to Identity on Digital Ecosystems Using Claim Verification and Trust

    NASA Astrophysics Data System (ADS)

    McLaughlin, Mark; Malone, Paul

    Central to the ethos of digital ecosystems (DEs) is that DEs should be distributed and have no central points of failure or control. This essentially mandates a decentralised system, which poses significant challenges for identity. Identity in decentralised environments must be treated very differently to identity in traditional environments, where centralised naming, authentication and authorisation can be assumed, and where identifiers can be considered global and absolute. In the absence of such guarantees we have expanded on the OPAALS identity model to produce a general implementation for the OPAALS DE that uses a combination of identity claim verification protocols and trust to give assurances in place of centralised servers. We outline how the components of this implementation function and give an illustrated workflow of how identity issues are solved on the OPAALS DE in practice.

  10. Airside HVAC BESTEST: HVAC Air-Distribution System Model Test Cases for ASHRAE Standard 140

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Judkoff, Ronald; Neymark, Joel; Kennedy, Mike D.

    This paper summarizes recent work to develop new airside HVAC equipment model analytical verification test cases for ANSI/ASHRAE Standard 140, Standard Method of Test for the Evaluation of Building Energy Analysis Computer Programs. The analytical verification test method allows comparison of simulation results from a wide variety of building energy simulation programs with quasi-analytical solutions, further described below. Standard 140 is widely cited for evaluating software for use with performance-path energy efficiency analysis, in conjunction with well-known energy-efficiency standards including ASHRAE Standard 90.1, the International Energy Conservation Code, and other international standards. Airside HVAC equipment is a common area of modelling not previously explicitly tested by Standard 140. Integration of the completed test suite into Standard 140 is in progress.

  11. Spot: A Programming Language for Verified Flight Software

    NASA Technical Reports Server (NTRS)

    Bocchino, Robert L., Jr.; Gamble, Edward; Gostelow, Kim P.; Some, Raphael R.

    2014-01-01

    The C programming language is widely used for programming space flight software and other safety-critical real-time systems. C, however, is far from ideal for this purpose: as is well known, it is both low-level and unsafe. This paper describes Spot, a language derived from C for programming space flight systems. Spot aims to maintain compatibility with existing C code while improving the language and supporting verification with the SPIN model checker. The major features of Spot include actor-based concurrency, distributed state with message passing and transactional updates, and annotations for testing and verification. Spot also supports domain-specific annotations for managing spacecraft state, e.g., communicating telemetry information to the ground. We describe the motivation and design rationale for Spot, give an overview of the design, provide examples of Spot's capabilities, and discuss the current status of the implementation.

  12. HTTP-based remote operational options for the Vacuum Tower Telescope, Tenerife

    NASA Astrophysics Data System (ADS)

    Staiger, J.

    2012-09-01

    We are currently developing network-based tools for the Vacuum Tower Telescope (VTT), Tenerife, which will allow the telescope to be operated together with the newly developed 2D spectrometer HELLRIDE under remote-control conditions. The computational configuration can be viewed as a distributed system linking hardware components of various functionality at different locations. We have developed a communication protocol, basically an extension of the HTTP standard, which serves as a carrier for command and data transfers. The server-client software is based on Berkeley Unix sockets in a C++ programming environment. A customized CMS allows browser-accessible information to be created on the fly. Java-based applet pages have been tested as optional user-access GUIs. An access tool has been implemented to download near-real-time, web-based target information from NASA/SDO. Latency tests have been carried out at the VTT and the Swedish SST at La Palma for concept verification. Short response times indicate that under favorable network conditions remote interactive telescope handling may be possible. The scientific focus of possible future remote operations will be on the helioseismology of the solar atmosphere, the monitoring of flares, and the footpoint analysis of coronal loops and chromospheric events.

  13. Palm Vein Verification Using Multiple Features and Locality Preserving Projections

    PubMed Central

    Bu, Wei; Wu, Xiangqian; Zhao, Qiushi

    2014-01-01

    Biometrics is defined as identifying people by their physiological characteristics, such as iris pattern, fingerprint, and face, or by some aspects of their behavior, such as voice, signature, and gesture. Considerable attention has been drawn to these issues during the last several decades, and many biometric systems for commercial applications have been successfully developed. Recently, vein pattern biometrics has become increasingly attractive for its uniqueness, stability, and noninvasiveness. A vein pattern is the physical distribution structure of the blood vessels underneath a person's skin. The palm vein pattern forms a dense network comprising a large number of vessels. The layout of the palm vein vessels remains in the same location for the whole of a person's life, and its pattern is unique to the individual. In this work, a matched filter method is proposed for palm vein image enhancement. New palm vein feature extraction methods are proposed: a global feature extracted based on wavelet coefficients and locality preserving projections (WLPP), and a local feature based on local binary pattern variance and locality preserving projections (LBPV_LPP). Finally, a nearest-neighbour matching method is used to verify the test palm vein images. The experimental results show that the EER of the proposed method is 0.1378%. PMID:24693230
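
    The reported EER is the operating point where false accept and false reject rates coincide. Below is a minimal sketch of estimating an equal error rate from genuine and impostor matching scores; the Gaussian score distributions are synthetic stand-ins, not data from this paper.

    ```python
    import numpy as np

    def equal_error_rate(genuine, impostor):
        """EER: where the false accept rate (FAR) meets the false reject rate (FRR).

        genuine, impostor: similarity scores (higher = more similar) for true
        matches and non-matches respectively.
        """
        thresholds = np.sort(np.concatenate([genuine, impostor]))
        far = np.array([(impostor >= t).mean() for t in thresholds])  # false accepts
        frr = np.array([(genuine < t).mean() for t in thresholds])    # false rejects
        i = int(np.argmin(np.abs(far - frr)))
        return (far[i] + frr[i]) / 2.0

    rng = np.random.default_rng(1)
    genuine = rng.normal(0.8, 0.1, 1000)    # scores for same-palm comparisons
    impostor = rng.normal(0.4, 0.1, 1000)   # scores for different-palm comparisons
    print(f"EER = {100 * equal_error_rate(genuine, impostor):.4f}%")
    ```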

  14. Forecasting of monsoon heavy rains: challenges in NWP

    NASA Astrophysics Data System (ADS)

    Sharma, Kuldeep; Ashrit, Raghavendra; Iyengar, Gopal; Bhatla, R.; Rajagopal, E. N.

    2016-05-01

    The last decade has seen a tremendous improvement in the forecasting skill of numerical weather prediction (NWP) models. This is attributed to increased sophistication in NWP models, which resolve complex physical processes, along with advanced data assimilation, increased grid resolution, and satellite observations. However, prediction of heavy rains is still a challenge, since models exhibit large errors in amounts as well as in spatial and temporal distribution. Two state-of-the-art NWP models have been investigated over the Indian monsoon region to assess their ability to predict heavy rainfall events: the unified model operational at the National Centre for Medium Range Weather Forecasting (NCUM) and the unified model operational at the Australian Bureau of Meteorology (the Australian Community Climate and Earth-System Simulator -- Global, ACCESS-G). The recent (JJAS 2015) Indian monsoon season witnessed six depressions and two cyclonic storms, which resulted in heavy rains and flooding. The contiguous rain area (CRA) method of verification allows the decomposition of forecast errors into errors in rainfall volume, pattern, and location. The case-by-case study using the CRA technique shows that the contributions to the rainfall errors from pattern and displacement are large, while the contribution from error in the predicted rainfall volume is smallest.
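
    To make the CRA-style decomposition concrete, the sketch below splits a forecast's total mean-squared error into displacement, volume, and pattern components by shifting the forecast field to its best-matching position. This is a simplification under stated assumptions: np.roll wraps at the grid borders, and a full CRA implementation matches contiguous rain entities rather than whole fields.

    ```python
    import numpy as np

    def cra_decomposition(forecast, observed, max_shift=10):
        """Decompose forecast MSE into displacement, volume and pattern terms
        by translating the forecast to its best-matching position."""
        mse_total = np.mean((forecast - observed) ** 2)
        best = mse_total
        for dy in range(-max_shift, max_shift + 1):
            for dx in range(-max_shift, max_shift + 1):
                shifted = np.roll(forecast, (dy, dx), axis=(0, 1))
                best = min(best, np.mean((shifted - observed) ** 2))
        mse_displacement = mse_total - best            # removed by best shift
        mse_volume = (forecast.mean() - observed.mean()) ** 2  # bias term
        mse_pattern = best - mse_volume                # what shifting can't fix
        return mse_displacement, mse_volume, mse_pattern

    # Synthetic example: a displaced, over-forecast rain field.
    rng = np.random.default_rng(6)
    obs = rng.gamma(0.5, 4.0, (80, 80))
    fcst = np.roll(obs, (3, 5), axis=(0, 1)) * 1.2
    print(cra_decomposition(fcst, obs))
    ```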

  15. Application of semipermeable membrane devices for long-term monitoring of polycyclic aromatic hydrocarbons at various stages of drinking water treatment.

    PubMed

    Pogorzelec, Marta; Piekarska, Katarzyna

    2018-08-01

    The primary goal of the presented study was to investigate the occurrence and concentration of sixteen selected polycyclic aromatic hydrocarbons (PAHs) in samples from various stages of water treatment and to verify the applicability of semipermeable membrane devices (SPMDs) in the monitoring of drinking water. Another objective was to verify whether the season affects the concentration and complexity of PAHs. For these purposes, semipermeable membrane devices were installed in a surface water treatment plant located in Lower Silesia (Poland). Samples were collected monthly over a period of one year. To determine the effect of water treatment on PAH concentrations, four sampling sites were selected: the raw water input, the stream of water just before ozonation, the treated water output, and water after passing through the distribution system. After each month of sampling, SPMDs were exchanged for fresh ones and prepared for instrumental analysis. Concentrations of polycyclic aromatic hydrocarbons were determined by high-performance liquid chromatography (HPLC). The presented study indicates that semipermeable membrane devices can be an effective tool for the analysis of drinking water, in which organic micropollutants occur at very low concentrations. Copyright © 2018 Elsevier B.V. All rights reserved.

  16. Palm vein verification using multiple features and locality preserving projections.

    PubMed

    Al-Juboori, Ali Mohsin; Bu, Wei; Wu, Xiangqian; Zhao, Qiushi

    2014-01-01

    Biometrics is defined as identifying people by their physiological characteristics, such as iris pattern, fingerprint, and face, or by some aspects of their behavior, such as voice, signature, and gesture. Considerable attention has been drawn to these issues during the last several decades, and many biometric systems for commercial applications have been successfully developed. Recently, vein pattern biometrics has become increasingly attractive for its uniqueness, stability, and noninvasiveness. A vein pattern is the physical distribution structure of the blood vessels underneath a person's skin. The palm vein pattern forms a dense network comprising a large number of vessels. The layout of the palm vein vessels remains in the same location for the whole of a person's life, and its pattern is unique to the individual. In this work, a matched filter method is proposed for palm vein image enhancement. New palm vein feature extraction methods are proposed: a global feature extracted based on wavelet coefficients and locality preserving projections (WLPP), and a local feature based on local binary pattern variance and locality preserving projections (LBPV_LPP). Finally, a nearest-neighbour matching method is used to verify the test palm vein images. The experimental results show that the EER of the proposed method is 0.1378%.

  17. Subpixel edge estimation with lens aberrations compensation based on the iterative image approximation for high-precision thermal expansion measurements of solids

    NASA Astrophysics Data System (ADS)

    Inochkin, F. M.; Kruglov, S. K.; Bronshtein, I. G.; Kompan, T. A.; Kondratjev, S. V.; Korenev, A. S.; Pukhov, N. F.

    2017-06-01

    A new method for precise subpixel edge estimation is presented. The principle of the method is iterative image approximation in 2D with subpixel accuracy, continued until a simulated image is found that matches the acquired image. A numerical image model is presented consisting of three parts: an edge model, an object and background brightness distribution model, and a lens aberrations model including diffraction. The optimal values of the model parameters are determined by conjugate-gradient numerical optimization of a merit function corresponding to the L2 distance between the acquired and simulated images. A computationally effective procedure for the merit function calculation, along with a sufficiently accurate gradient approximation, is described. Subpixel-accuracy image simulation is performed in the Fourier domain with theoretically unlimited precision of edge point locations. The method is capable of compensating lens aberrations and obtaining edge information with increased resolution. Experimental verification of the method, with a digital micromirror device applied to physically simulate an object with known edge geometry, is shown. Experimental results for various high-temperature materials within the temperature range of 1000°C to 2400°C are presented.
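
    The sketch below reproduces the core idea in one dimension: fit a parametric blurred-edge model to an acquired profile by conjugate-gradient minimization of the L2 merit function. The error-function edge model, the noise level, and all parameter values are illustrative assumptions; the paper's full method works on 2D images with an aberration model.

    ```python
    import numpy as np
    from scipy.optimize import minimize
    from scipy.special import erf

    x = np.arange(64, dtype=float)   # pixel coordinates of a 1-D profile

    def edge_model(p):
        # Blurred step edge: subpixel position, brightness levels, blur sigma.
        pos, lo, hi, sigma = p
        return lo + (hi - lo) * 0.5 * (1 + erf((x - pos) / (np.sqrt(2) * sigma)))

    def merit(p, acquired):
        # L2 distance between simulated and acquired profiles.
        return np.sum((edge_model(p) - acquired) ** 2)

    true_profile = edge_model([31.37, 10.0, 200.0, 1.8])
    acquired = true_profile + np.random.default_rng(2).normal(0.0, 1.0, x.size)

    fit = minimize(merit, x0=[30.0, 0.0, 180.0, 2.0],
                   args=(acquired,), method="CG")   # conjugate gradient
    print(f"estimated edge position: {fit.x[0]:.3f} px")
    ```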

  18. GIS for the Assessment of the Groundwater Recharge Potential Zone

    NASA Astrophysics Data System (ADS)

    Lee, C.; Yeh, H.; Chen, J.; Hsu, K.

    2008-12-01

    Water resources in Taiwan are unevenly distributed in both space and time, and effectively utilizing them is an imperative task under climate change. At present, groundwater contributes 34% of the total annual water supply and is an important fresh water resource. However, over-exploitation has decreased groundwater availability and has led to land subsidence. Assessing the potential zones of groundwater recharge is therefore extremely important for the protection of water quality and the management of groundwater systems. The Chih-Pen Creek basin in eastern Taiwan is examined in this study to assess its groundwater resources potential. Remote sensing and a Geographical Information System (GIS) are used to integrate five contributing factors: lithology, land cover/land use, lineaments, drainage, and slope. The weights of the factors contributing to groundwater recharge are derived using aerial photos, geology maps, a land use database, and field verification. The resultant map of groundwater potential zones demonstrates that the highest recharge potential area is located towards the downstream regions of the basin, because of the high infiltration rates caused by gravelly sand and agricultural land use there. In contrast, the least effective recharge potential area is in the upstream regions, due to the low infiltration of limestone.
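
    A weighted overlay of reclassified factor rasters is the standard GIS mechanics behind such an assessment. The sketch below shows the operation with random ratings and hypothetical weights; the actual weights in the study were derived from aerial photos, maps, and fieldwork, not the numbers used here.

    ```python
    import numpy as np

    # Illustrative factor rasters, each reclassified to a 1-5 recharge rating.
    rng = np.random.default_rng(3)
    shape = (200, 200)
    factors = {name: rng.integers(1, 6, shape)
               for name in ["lithology", "land_use", "lineaments",
                            "drainage", "slope"]}

    # Hypothetical weights summing to 1 (assumed values for illustration only).
    weights = {"lithology": 0.35, "land_use": 0.25, "lineaments": 0.15,
               "drainage": 0.15, "slope": 0.10}

    # Weighted overlay: the recharge potential surface.
    potential = sum(weights[n] * factors[n] for n in factors)
    print("recharge potential range:", potential.min(), "-", potential.max())
    ```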

  19. Strategies for Ground Testing of Manned Lunar Surface Systems

    NASA Technical Reports Server (NTRS)

    Beyer, Jeff; Gill, Tracy; Peacock, Mike

    2009-01-01

    One of the primary objectives of NASA's Vision for Space Exploration is the creation of a permanently manned lunar outpost. The challenge of establishing a human presence on the Moon will require new innovations and technologies that will be critical to expanding this exploration to Mars and beyond. However, accomplishing this task presents an unprecedented set of obstacles, one of the more significant of which is the development of new strategies for ground test and verification. Present concepts for the Lunar Surface System (LSS) architecture call for the construction of a series of independent yet tightly coupled modules and elements to be launched and assembled in incremental stages. Many of these will be fabricated at distributed locations and delivered shortly before launch, precluding any opportunity for testing in an actual integrated configuration. Furthermore, these components must operate flawlessly once delivered to the lunar surface, since there is no possibility of returning a malfunctioning module to Earth for repair or modification. This paper presents the current state of the plans and models, still undergoing continual refinement, that have been devised to meet the challenge of ground-based testing for the Constellation Program LSS, as well as the rationale behind their selection.

  20. Experimental and Numerical Simulation Analysis of Typical Carbon Woven Fabric/Epoxy Laminates Subjected to Lightning Strike

    NASA Astrophysics Data System (ADS)

    Yin, J. J.; Chang, F.; Li, S. L.; Yao, X. L.; Sun, J. R.; Xiao, Y.

    2017-12-01

    To clarify the evolution of damage in typical carbon woven fabric/epoxy laminates exposed to lightning strike, artificial lightning testing of carbon woven fabric/epoxy laminates was conducted, and damage was assessed using visual inspection and damage peeling approaches. Relationships between damage size and action integral were also elucidated. Results showed that the damage on a carbon woven fabric/epoxy laminate presents a roughly circular distribution, with the center of the circle located approximately at the lightning attachment point; the projected damage areas of the different layers show no dislocation, so the visible damage territory represents the maximum damage scope. Visible damage can be categorized into five modes: resin ablation, fiber fracture and sublimation, delamination, ablation scallops, and block-shaped ply-lift. Delamination damage due to resin pyrolysis and that due to internal pressure are clearly distinguishable. The projected area of total damage is linear with the action integral for specimens of the same type, and that of resin ablation damage is linear with the action integral regardless of specimen type; for all specimens, damage depth is linear with the logarithm of the action integral. Through experimental verification, the coupled thermal-electrical model constructed is shown to be capable of simulating the ablation damage of carbon woven fabric/epoxy laminates exposed to simulated lightning current.

  1. Detection, location, and characterization of hydroacoustic signals using seafloor cable networks offshore Japan (Invited)

    NASA Astrophysics Data System (ADS)

    Sugioka, H.; Suyehiro, K.; Shinohara, M.

    2009-12-01

    Hydroacoustic monitoring by the International Monitoring System (IMS) for the Comprehensive Nuclear-Test-Ban Treaty (CTBT) verification system utilizes hydrophone stations and seismic stations called T-phase stations for worldwide detection. Signals of natural origin include those from earthquakes, submarine volcanic eruptions, and whale calls; among artificial sources there are non-nuclear explosions and air-gun shots. It is important for the IMS to detect and locate hydroacoustic events with sufficient accuracy and to correctly characterize the signals and identify the source. As there are a number of seafloor cable networks operated offshore the Japanese islands, basically facing the Pacific Ocean, for monitoring regional seismicity, the data from these stations (pressure gauges, hydrophones, and seismic sensors) may be utilized to verify and increase the capability of the IMS. We use these data to compare selected event parameters with those reported for the Pacific in the period from 2004 to the present. These anomalous examples, along with dynamite shots used for seismic crustal structure studies and other natural sources, will be presented in order to help improve the IMS verification capabilities for detection, location, and characterization of anomalous signals. A seafloor cable network composed of three hydrophones and six seismometers, together with a temporary dense seismic array, detected and located hydroacoustic events offshore the Japanese islands on 12 March 2008, which had been reported by the IMS. We detected not only the hydroacoustic waves reverberating between the sea surface and the sea bottom but also the seismic waves traveling through the crust associated with the events. The determined source of the seismic waves is nearly coincident with that of the hydroacoustic waves, suggesting that the seismic waves are converted very close to the origin of the hydroacoustic source. On 16 March 2009 we also detected signals very similar to those associated with the event of 12 March 2008.

  2. Completing and sustaining IMS network for the CTBT Verification Regime

    NASA Astrophysics Data System (ADS)

    Meral Ozel, N.

    2015-12-01

    The CTBT International Monitoring System is to comprise 337 facilities located all over the world for the purpose of detecting and locating nuclear test explosions. Major challenges remain, namely the completion of the network, where most of the remaining stations have environmental, logistical, and/or political issues to surmount (89% of the stations have already been built), and the sustainment of a reliable, state-of-the-art network covering four technologies: seismic, infrasound, hydroacoustic, and radionuclide. To have a credible and trustworthy verification system ready for entry into force of the Treaty, the CTBTO is protecting and enhancing its investment in its global network of stations and is providing effective data to the International Data Centre (IDC) and Member States. Regarding the protection of the CTBTO's investment and enhanced sustainment of IMS station operations, the IMS Division is enhancing the capabilities of the monitoring system by applying advances in instrumentation and introducing new software applications that are fit for purpose. Some examples are the development of noble gas laboratory systems to process and analyse subsoil samples, the development of a mobile noble gas system for on-site inspection purposes, the optimization of beta-gamma detectors for xenon detection, assessing and improving the efficiency of wind noise reduction systems for infrasound stations, the development and testing of infrasound stations with a self-calibrating capability, and research into the use of modular designs for the hydroacoustic network.

  3. Proceedings of the 22nd Annual DoD/DOE Seismic Research Symposium: Planning for Verification of and Compliance with the Comprehensive Nuclear-Test-Ban Treaty (CTBT)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Nichols, James W., LTC

    2000-09-15

    These proceedings contain papers prepared for the 22nd Annual DoD/DOE Seismic Research Symposium: Planning for Verification of and Compliance with the Comprehensive Nuclear-Test-Ban Treaty (CTBT), held 13-15 September 2000 in New Orleans, Louisiana. These papers represent the combined research related to ground-based nuclear explosion monitoring funded by the National Nuclear Security Administration (NNSA), Defense Threat Reduction Agency (DTRA), Air Force Technical Applications Center (AFTAC), Department of Defense (DoD), US Army Space and Missile Defense Command, Defense Special Weapons Agency (DSWA), and other invited sponsors. The scientific objectives of the research are to improve the United States' capability to detect, locate, and identify nuclear explosions. The purpose of the meeting is to provide the sponsoring agencies, as well as potential users, an opportunity to review research accomplished during the preceding year and to discuss areas of investigation for the coming year. For the researchers, it provides a forum for the exchange of scientific information toward achieving program goals, and an opportunity to discuss results and future plans. Paper topics include: seismic regionalization and calibration; detection and location of sources; wave propagation from source to receiver; the nature of seismic sources, including mining practices; hydroacoustic, infrasound, and radionuclide methods; on-site inspection; and data processing.

  4. Real-Time Simulation for Verification and Validation of Diagnostic and Prognostic Algorithms

    NASA Technical Reports Server (NTRS)

    Aguilar, Robet; Luu, Chuong; Santi, Louis M.; Sowers, T. Shane

    2005-01-01

    To verify that a health management system (HMS) performs as expected, a virtual system simulation capability, including interaction with the associated platform or vehicle, very likely will need to be developed. The rationale for developing this capability is discussed and includes the limited capability to seed faults into the actual target system due to the risk of potential damage to high value hardware. The capability envisioned would accurately reproduce the propagation of a fault or failure as observed by sensors located at strategic locations on and around the target system and would also accurately reproduce the control system and vehicle response. In this way, HMS operation can be exercised over a broad range of conditions to verify that it meets requirements for accurate, timely response to actual faults with adequate margin against false and missed detections. An overview is also presented of a real-time rocket propulsion health management system laboratory which is available for future rocket engine programs. The health management elements and approaches of this lab are directly applicable for future space systems. In this paper the various components are discussed and the general fault detection, diagnosis, isolation and the response (FDIR) concept is presented. Additionally, the complexities of V&V (Verification and Validation) for advanced algorithms and the simulation capabilities required to meet the changing state-of-the-art in HMS are discussed.

  5. 30 CFR 250.913 - When must I resubmit Platform Verification Program plans?

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... Structures Platform Verification Program § 250.913 When must I resubmit Platform Verification Program plans? (a) You must resubmit any design verification, fabrication verification, or installation verification... 30 Mineral Resources 2 2010-07-01 2010-07-01 false When must I resubmit Platform Verification...

  6. Pathological fracture of the patella due to an atypically located aneurysmal bone cyst: verification by means of ultrasound-guided biopsy.

    PubMed

    Plaikner, Michaela; Gruber, Hannes; Henninger, Benjamin; Gruber, Leonhard; Kosiol, Juana; Loizides, Alexander

    2016-03-01

    We report on a rare case of an atypically located aneurysmal bone cyst (ABC) in the patella, presenting with pathological fracture after trauma. Using all available diagnostic modalities and ultrasound-guided core-needle biopsy, an unclear, suspected pathologically fractured cystic bone lesion in the patella of a young man could be further clarified. The acquired images suggested the diagnosis of a pathologically fractured aneurysmal bone cyst after mild trauma. However, due to the extraordinary location and clinical presentation, the diagnosis was secured by means of ultrasound-guided biopsy through a small cortical gap. As shown in this rare case of an atypical aneurysmal bone cyst of the patella, the seldom-used but sometimes feasible ultrasound-guided biopsy of intraosseous lesions can help achieve diagnostic clarification and should be considered as a non-standard procedure.

  7. Quantitative trait locus gene mapping: a new method for locating alcohol response genes.

    PubMed

    Crabbe, J C

    1996-01-01

    Alcoholism is a multigenic trait with important non-genetic determinants. Studies with genetic animal models of susceptibility to several of alcohol's effects suggest that several genes, each contributing a modest effect on susceptibility (quantitative trait loci, or QTLs), are important. A new technique of QTL gene mapping has allowed the identification of the locations in the mouse genome of several such QTLs. The method is described, and the locations of QTLs affecting the acute alcohol withdrawal reaction are given as an example. Verification of these QTLs in ancillary studies is described, and the strengths, limitations, and future directions to be pursued are discussed. QTL mapping is a promising method for identifying genes in rodents, with the hope of directly extrapolating the results to the human genome. This review is based on a paper presented at the First International Congress of the Latin American Society for Biomedical Research on Alcoholism, Santiago, Chile, November 1994.

  8. Monte Carlo simulations to replace film dosimetry in IMRT verification.

    PubMed

    Goetzfried, Thomas; Rickhey, Mark; Treutwein, Marius; Koelbl, Oliver; Bogner, Ludwig

    2011-01-01

    Patient-specific verification of intensity-modulated radiation therapy (IMRT) plans can be done by dosimetric measurements or by independent dose or monitor unit calculations. The aim of this study was the clinical evaluation of IMRT verification based on a fast Monte Carlo (MC) program, with regard to possible benefits compared to commonly used film dosimetry. 25 head-and-neck IMRT plans were recalculated by a pencil-beam-based treatment planning system (TPS) using an appropriate quality assurance (QA) phantom. All plans were verified both by film and diode dosimetry and compared to MC simulations. The irradiated films, the results of diode measurements, and the computed dose distributions were evaluated, and the data were compared on the basis of gamma maps and dose-difference histograms. Average deviations in the high-dose region between diode measurements and point dose calculations performed with the TPS and the MC program were 0.7 ± 2.7% and 1.2 ± 3.1%, respectively. For film measurements, the mean gamma values with 3% dose difference and 3 mm distance-to-agreement were 0.74 ± 0.28 (TPS as reference), with dose deviations up to 10%. Corresponding values were significantly reduced to 0.34 ± 0.09 for the MC dose calculation. The total time needed for the two verification procedures is comparable; however, the MC approach is far less labor-intensive. The presented study showed that independent dose calculation verification of IMRT plans with a fast MC program has the potential to progressively replace film dosimetry in the near future. Thus, the linac-specific QA part will necessarily become more important. In combination with MC simulations, and due to the simple set-up, point-dose measurements for dosimetric plausibility checks are recommended, at least in the IMRT introduction phase. Copyright © 2010. Published by Elsevier GmbH.
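
    The gamma maps referred to above combine a dose-difference criterion with a distance-to-agreement (DTA) search. Below is a minimal one-dimensional sketch of a global gamma computation with the 3%/3 mm criteria used in the study; the Gaussian test profiles are synthetic, and a clinical implementation works on 2D/3D dose grids with interpolation.

    ```python
    import numpy as np

    def gamma_index(dose_eval, dose_ref, spacing, dd=0.03, dta=3.0):
        """1-D global gamma: for each reference point, the minimum combined
        dose-difference / distance-to-agreement metric over evaluated points."""
        x = np.arange(dose_ref.size) * spacing      # positions in mm
        norm = dd * dose_ref.max()                  # global dose normalization
        gamma = np.empty_like(dose_ref)
        for i, (xi, di) in enumerate(zip(x, dose_ref)):
            dist2 = ((x - xi) / dta) ** 2
            dose2 = ((dose_eval - di) / norm) ** 2
            gamma[i] = np.sqrt(np.min(dist2 + dose2))
        return gamma

    xs = np.arange(100)
    ref = np.exp(-((xs - 50) / 15.0) ** 2)          # reference profile
    ev = 1.02 * np.exp(-((xs - 51) / 15.0) ** 2)    # slightly shifted, scaled
    g = gamma_index(ev, ref, spacing=1.0)
    print(f"pass rate (gamma <= 1): {100 * (g <= 1).mean():.1f}%")
    ```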

  9. SU-E-T-490: Independent Three-Dimensional (3D) Dose Verification of VMAT/SBRT Using EPID and Cloud Computing

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ding, A; Han, B; Bush, K

    Purpose: Dosimetric verification of VMAT/SBRT is currently performed on one or two planes in a phantom with either film or array detectors. A robust and easy-to-use 3D dosimetric tool has been sought since the advent of conformal radiation therapy. Here we present such a strategy for an independent 3D VMAT/SBRT plan verification system by a combined use of EPID and cloud-based Monte Carlo (MC) dose calculation. Methods: The 3D dosimetric verification proceeds in two steps. First, the plan was delivered with a high-resolution portable EPID mounted on the gantry, and the EPID-captured gantry-angle-resolved VMAT/SBRT field images were converted into fluence by using the EPID pixel response function derived from MC simulations. The fluence was resampled and used as the input for an in-house developed Amazon cloud-based MC software to reconstruct the 3D dose distribution. The accuracy of the developed 3D dosimetric tool was assessed using a Delta4 phantom with various field sizes (square, circular, rectangular, and irregular MLC fields) and different patient cases. The method was applied to validate VMAT/SBRT plans using WFF and FFF photon beams (Varian TrueBeam STX). Results: It was found that the proposed method yielded results consistent with the Delta4 measurements. For points on the two detector planes, a good agreement within 1.5% was found for all the testing fields. Patient VMAT/SBRT plan studies revealed a similar level of accuracy: an average γ-index passing rate of 99.2 ± 0.6% (3%/3 mm), 97.4 ± 2.4% (2%/2 mm), and 72.6 ± 8.4% (1%/1 mm). Conclusion: A valuable 3D dosimetric verification strategy has been developed for VMAT/SBRT plan validation. The technique provides a viable solution for a number of intractable dosimetry problems, such as small fields and plans with high dose gradients.

  10. Tolerance limits and methodologies for IMRT measurement-based verification QA: Recommendations of AAPM Task Group No. 218.

    PubMed

    Miften, Moyed; Olch, Arthur; Mihailidis, Dimitris; Moran, Jean; Pawlicki, Todd; Molineu, Andrea; Li, Harold; Wijesooriya, Krishni; Shi, Jie; Xia, Ping; Papanikolaou, Nikos; Low, Daniel A

    2018-04-01

    Patient-specific IMRT QA measurements are important components of processes designed to identify discrepancies between calculated and delivered radiation doses. Discrepancy tolerance limits are neither well defined nor consistently applied across centers. The AAPM TG-218 report provides a comprehensive review aimed at improving the understanding and consistency of these processes, as well as recommendations for methodologies and tolerance limits in patient-specific IMRT QA. The performance of the dose difference/distance-to-agreement (DTA) and γ dose distribution comparison metrics is investigated. Measurement methods are reviewed, followed by a discussion of the pros and cons of each. Methodologies for absolute dose verification are discussed and new IMRT QA verification tools are presented. Literature on the expected or achievable agreement between measurements and calculations for different types of planning and delivery systems is reviewed and analyzed. Tests of vendor implementations of the γ verification algorithm employing benchmark cases are presented. Operational shortcomings that can reduce the γ tool accuracy and subsequent effectiveness for IMRT QA are described. Practical considerations including spatial resolution, normalization, dose threshold, and data interpretation are discussed. Published data on IMRT QA and the clinical experience of the group members are used to develop guidelines and recommendations on tolerance and action limits for IMRT QA. Steps to check failed IMRT QA plans are outlined. Recommendations on delivery methods, data interpretation, dose normalization, the use of γ analysis routines, and the choice of tolerance limits for IMRT QA are made, with a focus on detecting differences between calculated and measured doses via the use of robust analysis methods and an in-depth understanding of IMRT verification metrics. The recommendations are intended to improve the IMRT QA process and establish consistent and comparable IMRT QA criteria among institutions. © 2018 American Association of Physicists in Medicine.
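
    TG-218's headline numbers are per-plan tolerance and action limits on the gamma passing rate. As a simple illustration of how such limits are applied in a QA workflow, the sketch below triages hypothetical plans against the report's universal limits (tolerance 95%, action 90%, for 3%/2 mm global gamma with a 10% dose threshold); the plan names and rates are invented.

    ```python
    # Universal limits on the gamma passing rate as recommended by TG-218
    # (3%/2 mm global gamma, 10% dose threshold).
    TOLERANCE, ACTION = 95.0, 90.0

    def triage(passing_rate):
        if passing_rate >= TOLERANCE:
            return "pass"
        if passing_rate >= ACTION:
            return "investigate"    # between the tolerance and action limits
        return "fail: resolve before treatment"

    # Hypothetical per-plan QA records, purely for illustration.
    for plan, rate in [("HN_01", 99.1), ("prostate_02", 93.4), ("lung_03", 86.7)]:
        print(f"{plan}: {rate}% -> {triage(rate)}")
    ```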

  11. Verification and Improvement of ERS-1/2 Altimeter Geophysical Data Records for Global Change Studies

    NASA Technical Reports Server (NTRS)

    Shum, C. K.

    2000-01-01

    This Final Technical Report summarizes the research work conducted under NASA's Physical Oceanography Program entitled, Verification and Improvement of ERS-1/2 Altimeter Geophysical Data Records for Global Change Studies, for the time period from January 1, 2000 through June 30, 2000. This report also provides a summary of the investigation from July 1, 1997 - June 30, 2000. The primary objectives of this investigation include verification and improvement of the ERS-1 and ERS-2 radar altimeter geophysical data records for distribution of the data to the ESA-approved U.S. ERS-1/-2 investigators for global climate change studies. Specifically, the investigation is to verify and improve the ERS geophysical data record products by calibrating the instruments and assessing the accuracy of the ERS-1/-2 orbital, geophysical, media, and instrument corrections. The purpose is to ensure consistency of constants, standards, and algorithms with the TOPEX/POSEIDON radar altimeter for global climate change studies, such as the monitoring and interpretation of long-term sea level change. This investigation has provided the current best precise orbits, with the radial orbit accuracy for ERS-1 (Phases C-G) and ERS-2 estimated at the 3-5 cm rms level, a 30-fold improvement compared to the 1993 accuracy. We have finalized the production and verification of the value-added ERS-1 mission data (Phases A through G), in collaboration with the JPL PODAAC and the University of Texas. Orbit and data verification and improvement of algorithms led to the best data product available to date. ERS-2 altimeter data have been improved, and we have been active in the Envisat (2001 launch) GDR algorithm review and improvement. The data improvement of ERS-1 and ERS-2 led to improvements in the global mean sea surface, marine gravity anomaly, and bathymetry models, and to a study of Antarctica mass balance, which was published in Science in 1998.

  12. PET/CT imaging for treatment verification after proton therapy: A study with plastic phantoms and metallic implants

    PubMed Central

    Parodi, Katia; Paganetti, Harald; Cascio, Ethan; Flanz, Jacob B.; Bonab, Ali A.; Alpert, Nathaniel M.; Lohmann, Kevin; Bortfeld, Thomas

    2008-01-01

    The feasibility of off-line positron emission tomography/computed tomography (PET/CT) for routine three-dimensional in-vivo treatment verification of proton radiation therapy is currently under investigation at Massachusetts General Hospital in Boston. In preparation for clinical trials, phantom experiments were carried out to investigate the sensitivity and accuracy of the method depending on irradiation and imaging parameters. Furthermore, they addressed the feasibility of PET/CT as a robust verification tool in the presence of metallic implants, which produce x-ray CT artifacts and fluence perturbations that may compromise the accuracy of treatment planning algorithms. Spread-out Bragg peak proton fields were delivered to different phantoms consisting of polymethylmethacrylate (PMMA), PMMA stacked with lung and bone equivalent materials, and PMMA with titanium rods to mimic implants in patients. PET data were acquired in list mode starting within 20 min after irradiation at a commercial lutetium-oxyorthosilicate (LSO)-based PET/CT scanner. The amount and spatial distribution of the measured activity could be well reproduced by calculations based on the GEANT4 and FLUKA Monte Carlo codes. This phantom study supports the potential of millimeter accuracy for range monitoring and lateral field position verification even after low therapeutic dose exposures of 2 Gy, despite the delay between irradiation and imaging. It also indicates the value of PET for treatment verification in the presence of metallic implants, demonstrating a higher sensitivity to fluence perturbations in comparison to a commercial analytical treatment planning system. Finally, it addresses the suitability of LSO-based PET detectors for hadron therapy monitoring. This unconventional application of PET involves count rates which are orders of magnitude lower than in diagnostic tracer imaging, i.e., the signal of interest is comparable to the noise originating from the intrinsic radioactivity of the detector itself. In addition to PET alone, PET/CT imaging provides accurate information on the position of the imaged object and may assess possible anatomical changes during fractionated radiotherapy in clinical applications. PMID:17388158

  13. The DES Bright Arcs Survey: Hundreds of Candidate Strongly Lensed Galaxy Systems from the Dark Energy Survey Science Verification and Year 1 Observations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Diehl, H. T.; Buckley-Geer, E. J.; Lindgren, K. A.

    We report the results of searches for strong gravitational lens systems in the Dark Energy Survey (DES) Science Verification and Year 1 observations. The Science Verification data span approximately 250 sq. deg. with a median i-band limiting magnitude for extended objects (10σ) of 23.0. The Year 1 data span approximately 2000 sq. deg. and have an i-band limiting magnitude for extended objects (10σ) of 22.9. As these data sets are both wide and deep, they are particularly useful for identifying strong gravitational lens candidates. Potential strong gravitational lens candidate systems were initially identified based on a color and magnitude selection in the DES object catalogs or because the system is at the location of a previously identified galaxy cluster. Cutout images of potential candidates were then visually scanned using an object viewer and numerically ranked according to whether or not we judged them to be likely strong gravitational lens systems. Having scanned nearly 400,000 cutouts, we present 374 candidate strong lens systems, of which 348 are identified for the first time. We provide the R.A. and decl., the magnitudes and photometric properties of the lens and source objects, and the distance (radius) of the source(s) from the lens center for each system.

  14. Linear models to perform treaty verification tasks for enhanced information security

    DOE PAGES

    MacGahan, Christopher J.; Kupinski, Matthew A.; Brubaker, Erik M.; ...

    2016-11-12

    Linear mathematical models were applied to binary-discrimination tasks relevant to arms control verification measurements in which a host party wishes to convince a monitoring party that an item is or is not treaty accountable. These models process data in list-mode format and can compensate for the presence of variability in the source, such as uncertain object orientation and location. The Hotelling observer applies an optimal set of weights to binned detector data, yielding a test statistic that is thresholded to make a decision. The channelized Hotelling observer applies a channelizing matrix to the vectorized data, resulting in a lower dimensional vector available to the monitor to make decisions. We demonstrate how incorporating additional terms in this channelizing-matrix optimization offers benefits for treaty verification. We present two methods to increase shared information and trust between the host and monitor. The first method penalizes individual channel performance in order to maximize the information available to the monitor while maintaining optimal performance. Second, we present a method that penalizes predefined sensitive information while maintaining the capability to discriminate between binary choices. Data used in this study was generated using Monte Carlo simulations for fission neutrons, accomplished with the GEANT4 toolkit. Custom models for plutonium inspection objects were measured in simulation by a radiation imaging system. Model performance was evaluated and presented using the area under the receiver operating characteristic curve.
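
    As an illustration of the Hotelling observer described above, the sketch below builds the template w = S⁻¹(μ₁ − μ₀) from simulated binned detector data under two hypotheses and scores discrimination performance by the area under the ROC curve via the Mann-Whitney statistic. The Gaussian data and signal shape are invented stand-ins for the GEANT4-simulated measurements.

    ```python
    import numpy as np

    def hotelling_observer(data0, data1):
        """Hotelling template w = S^-1 (mean1 - mean0) for binned detector data.

        data0, data1: (n_measurements, n_bins) arrays under the two hypotheses
        (e.g. item is / is not treaty accountable)."""
        m0, m1 = data0.mean(axis=0), data1.mean(axis=0)
        s = 0.5 * (np.cov(data0, rowvar=False) + np.cov(data1, rowvar=False))
        return np.linalg.solve(s, m1 - m0)

    rng = np.random.default_rng(4)
    n_bins = 16
    signal = np.zeros(n_bins)
    signal[5:9] = 2.0                           # illustrative signal shape
    data0 = rng.normal(10.0, 1.0, (500, n_bins))
    data1 = rng.normal(10.0 + signal, 1.0, (500, n_bins))

    w = hotelling_observer(data0, data1)
    t0, t1 = data0 @ w, data1 @ w               # test statistics to threshold
    # AUC via the Mann-Whitney statistic (area under the ROC curve).
    auc = (t1[:, None] > t0[None, :]).mean()
    print(f"AUC = {auc:.3f}")
    ```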

  15. Linear models to perform treaty verification tasks for enhanced information security

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    MacGahan, Christopher J.; Kupinski, Matthew A.; Brubaker, Erik M.

    Linear mathematical models were applied to binary-discrimination tasks relevant to arms control verification measurements in which a host party wishes to convince a monitoring party that an item is or is not treaty accountable. These models process data in list-mode format and can compensate for the presence of variability in the source, such as uncertain object orientation and location. The Hotelling observer applies an optimal set of weights to binned detector data, yielding a test statistic that is thresholded to make a decision. The channelized Hotelling observer applies a channelizing matrix to the vectorized data, resulting in a lower dimensional vector available to the monitor to make decisions. We demonstrate how incorporating additional terms in this channelizing-matrix optimization offers benefits for treaty verification. We present two methods to increase shared information and trust between the host and monitor. The first method penalizes individual channel performance in order to maximize the information available to the monitor while maintaining optimal performance. Second, we present a method that penalizes predefined sensitive information while maintaining the capability to discriminate between binary choices. Data used in this study was generated using Monte Carlo simulations for fission neutrons, accomplished with the GEANT4 toolkit. Custom models for plutonium inspection objects were measured in simulation by a radiation imaging system. Model performance was evaluated and presented using the area under the receiver operating characteristic curve.

  16. The DES Bright Arcs Survey: Hundreds of Candidate Strongly Lensed Galaxy Systems from the Dark Energy Survey Science Verification and Year 1 Observations

    NASA Astrophysics Data System (ADS)

    Diehl, H. T.; Buckley-Geer, E. J.; Lindgren, K. A.; Nord, B.; Gaitsch, H.; Gaitsch, S.; Lin, H.; Allam, S.; Collett, T. E.; Furlanetto, C.; Gill, M. S. S.; More, A.; Nightingale, J.; Odden, C.; Pellico, A.; Tucker, D. L.; da Costa, L. N.; Fausti Neto, A.; Kuropatkin, N.; Soares-Santos, M.; Welch, B.; Zhang, Y.; Frieman, J. A.; Abdalla, F. B.; Annis, J.; Benoit-Lévy, A.; Bertin, E.; Brooks, D.; Burke, D. L.; Carnero Rosell, A.; Carrasco Kind, M.; Carretero, J.; Cunha, C. E.; D'Andrea, C. B.; Desai, S.; Dietrich, J. P.; Drlica-Wagner, A.; Evrard, A. E.; Finley, D. A.; Flaugher, B.; García-Bellido, J.; Gerdes, D. W.; Goldstein, D. A.; Gruen, D.; Gruendl, R. A.; Gschwend, J.; Gutierrez, G.; James, D. J.; Kuehn, K.; Kuhlmann, S.; Lahav, O.; Li, T. S.; Lima, M.; Maia, M. A. G.; Marshall, J. L.; Menanteau, F.; Miquel, R.; Nichol, R. C.; Nugent, P.; Ogando, R. L. C.; Plazas, A. A.; Reil, K.; Romer, A. K.; Sako, M.; Sanchez, E.; Santiago, B.; Scarpine, V.; Schindler, R.; Schubnell, M.; Sevilla-Noarbe, I.; Sheldon, E.; Smith, M.; Sobreira, F.; Suchyta, E.; Swanson, M. E. C.; Tarle, G.; Thomas, D.; Walker, A. R.; DES Collaboration

    2017-09-01

    We report the results of searches for strong gravitational lens systems in the Dark Energy Survey (DES) Science Verification and Year 1 observations. The Science Verification data span approximately 250 sq. deg. with a median i-band limiting magnitude for extended objects (10σ) of 23.0. The Year 1 data span approximately 2000 sq. deg. and have an i-band limiting magnitude for extended objects (10σ) of 22.9. As these data sets are both wide and deep, they are particularly useful for identifying strong gravitational lens candidates. Potential strong gravitational lens candidate systems were initially identified based on a color and magnitude selection in the DES object catalogs or because the system is at the location of a previously identified galaxy cluster. Cutout images of potential candidates were then visually scanned using an object viewer and numerically ranked according to whether or not we judged them to be likely strong gravitational lens systems. Having scanned nearly 400,000 cutouts, we present 374 candidate strong lens systems, of which 348 are identified for the first time. We provide the R.A. and decl., the magnitudes and photometric properties of the lens and source objects, and the distance (radius) of the source(s) from the lens center for each system.

  17. Linear models to perform treaty verification tasks for enhanced information security

    NASA Astrophysics Data System (ADS)

    MacGahan, Christopher J.; Kupinski, Matthew A.; Brubaker, Erik M.; Hilton, Nathan R.; Marleau, Peter A.

    2017-02-01

    Linear mathematical models were applied to binary-discrimination tasks relevant to arms control verification measurements in which a host party wishes to convince a monitoring party that an item is or is not treaty accountable. These models process data in list-mode format and can compensate for the presence of variability in the source, such as uncertain object orientation and location. The Hotelling observer applies an optimal set of weights to binned detector data, yielding a test statistic that is thresholded to make a decision. The channelized Hotelling observer applies a channelizing matrix to the vectorized data, resulting in a lower dimensional vector available to the monitor to make decisions. We demonstrate how incorporating additional terms in this channelizing-matrix optimization offers benefits for treaty verification. We present two methods to increase shared information and trust between the host and monitor. The first method penalizes individual channel performance in order to maximize the information available to the monitor while maintaining optimal performance. Second, we present a method that penalizes predefined sensitive information while maintaining the capability to discriminate between binary choices. Data used in this study was generated using Monte Carlo simulations for fission neutrons, accomplished with the GEANT4 toolkit. Custom models for plutonium inspection objects were measured in simulation by a radiation imaging system. Model performance was evaluated and presented using the area under the receiver operating characteristic curve.

  18. Generic Verification Protocol for Verification of Online Turbidimeters

    EPA Science Inventory

    This protocol provides generic procedures for implementing a verification test for the performance of online turbidimeters. The verification tests described in this document will be conducted under the Environmental Technology Verification (ETV) Program. Verification tests will...

  19. Poster - Thurs Eve-43: Verification of dose calculation with tissue inhomogeneity using MapCHECK.

    PubMed

    Korol, R; Chen, J; Mosalaei, H; Karnas, S

    2008-07-01

    MapCHECK (Sun Nuclear, Melbourne, FL), with 445 diode detectors, has been widely used for routine IMRT quality assurance (QA). However, routine IMRT QA has not included verification of inhomogeneity effects. The objective of this study is to use MapCHECK and a phantom to verify dose calculation and IMRT delivery with tissue inhomogeneity. A phantom with tissue inhomogeneities was placed on top of MapCHECK to measure the planar dose for an anterior beam with photon energy 6 MV or 18 MV. The phantom was composed of a 3.5 cm thick block of lung-equivalent material and solid water arranged side by side, with a 0.5 cm slab of solid water on top. The phantom setup, including MapCHECK, was CT scanned and imported into Pinnacle 8.0d for dose calculation. Absolute dose distributions were compared with gamma criteria of 3% for dose difference and 3 mm for distance-to-agreement. The measured and calculated planar doses are in good agreement, with an 88% pass rate based on the gamma analysis. The major dose difference was at the lung-water interface. Further investigation will be performed on a custom-designed inhomogeneity phantom with inserts of varying densities and effective depths to create various dose gradients at the interface for dose calculation and delivery verification. In conclusion, a phantom with tissue inhomogeneities can be used with MapCHECK for verification of dose calculation and delivery with tissue inhomogeneity. © 2008 American Association of Physicists in Medicine.

  20. Spatial Evaluation and Verification of Earthquake Simulators

    NASA Astrophysics Data System (ADS)

    Wilson, John Max; Yoder, Mark R.; Rundle, John B.; Turcotte, Donald L.; Schultz, Kasey W.

    2017-06-01

    In this paper, we address the problem of verifying earthquake simulators with observed data. Earthquake simulators are a class of computational simulations which attempt to mirror the topological complexity of fault systems on which earthquakes occur. In addition, the physics of friction and elastic interactions between fault elements are included in these simulations. Simulation parameters are adjusted so that natural earthquake sequences are matched in their scaling properties. Physically based earthquake simulators can generate many thousands of years of simulated seismicity, allowing for a robust capture of the statistical properties of large, damaging earthquakes that have long recurrence time scales. Verification of simulations against currently observed earthquake seismicity is necessary, and following past simulator and forecast-model verification methods, we address the challenges of applying spatial forecast verification to simulators; namely, that simulator outputs are confined to the modeled faults, while observed earthquake epicenters often occur off of known faults. We present two methods for addressing this discrepancy: a simplistic approach whereby observed earthquakes are shifted to the nearest fault element, and a smoothing method based on the power laws of the epidemic-type aftershock sequence (ETAS) model, which distributes the seismicity of each simulated earthquake over the entire test region at a rate that decays with epicentral distance. To test these methods, a receiver operating characteristic plot was produced by comparing the rate maps to observed M > 6.0 earthquakes in California since 1980. We found that the nearest-neighbor mapping produced poor forecasts, while the ETAS power-law method produced rate maps that agreed reasonably well with observations.
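
    A minimal sketch of the smoothing-plus-ROC evaluation follows: simulated epicenters are spread over a grid with a power-law kernel that decays with epicentral distance, and the resulting rate map is scored against a set of test events by sweeping an alarm threshold. The kernel form (r² + d²)^(−q), its parameters, and the uniformly scattered events are all illustrative assumptions, not the paper's calibrated values.

    ```python
    import numpy as np

    def etas_rate_map(epicenters, grid_x, grid_y, q=1.5, d=5.0):
        # ETAS-style power-law spatial decay; q and d are illustrative.
        xx, yy = np.meshgrid(grid_x, grid_y)
        rate = np.zeros_like(xx)
        for ex, ey in epicenters:
            rate += ((xx - ex) ** 2 + (yy - ey) ** 2 + d ** 2) ** (-q)
        return rate / rate.sum()    # normalized to a probability map

    rng = np.random.default_rng(5)
    gx = gy = np.linspace(0.0, 100.0, 101)
    simulated = rng.uniform(20, 80, (200, 2))   # simulated epicenters
    observed = rng.uniform(20, 80, (25, 2))     # "observed" test events

    rate = etas_rate_map(simulated, gx, gy)
    # ROC-style sweep: fraction of events captured vs fraction of area alarmed,
    # lowering the threshold through the rate map cell by cell.
    cells = np.ravel_multi_index(
        (np.searchsorted(gy, observed[:, 1]), np.searchsorted(gx, observed[:, 0])),
        rate.shape)
    order = np.argsort(rate.ravel())[::-1]
    hit_rate = np.isin(order, cells).cumsum() / cells.size
    print("area alarmed to capture 80% of events:",
          f"{(np.argmax(hit_rate >= 0.8) + 1) / order.size:.1%}")
    ```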

  1. 75 FR 8920 - Grant of Authority for Subzone Status; IKEA Distribution Services (Distribution of Home...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-02-26

    ... Status; IKEA Distribution Services (Distribution of Home Furnishings and Accessories); Baltimore, MD... subzone at the warehouse and distribution facility of IKEA Distribution Services, located in Perryville... and distribution at the facility of IKEA Distribution Services, located in Perryville, Maryland...

  2. Commissioning of a grid-based Boltzmann solver for cervical cancer brachytherapy treatment planning with shielded colpostats.

    PubMed

    Mikell, Justin K; Klopp, Ann H; Price, Michael; Mourtada, Firas

    2013-01-01

    We sought to commission a gynecologic shielded colpostat analytic model provided in a treatment planning system (TPS) library, and we retrospectively report the dosimetric impact of this applicator model in a cohort of patients. A commercial TPS with a grid-based Boltzmann solver (GBBS) was commissioned for (192)Ir high-dose-rate (HDR) brachytherapy for cervical cancer with stainless steel-shielded colpostats. The colpostat analytic model was verified against a radiograph and vendor schematics. MCNPX v2.6 Monte Carlo simulations were performed to compare dose distributions around the applicator in water with the TPS GBBS dose predictions. Retrospectively, the dosimetric impact was assessed over 24 cervical cancer patients' HDR plans. The applicator (TPS ID #AL13122005) shield dimensions were within 0.4 mm of the independent verification of the shield dimensions. GBBS profiles in planes bisecting the cap around the applicator agreed with Monte Carlo simulations within 2% at most locations; differing screw representations resulted in differences of up to 9%. For the retrospective study, the GBBS doses differed from TG-43 as follows (mean ± standard deviation [min, max]): International Commission on Radiation Units [ICRU]rectum (-8.4 ± 2.5% [-14.1, -4.1%]), ICRUbladder (-7.2 ± 3.6% [-15.7, -2.1%]), D2cc-rectum (-6.2 ± 2.6% [-11.9, -0.8%]), D2cc-sigmoid (-5.6 ± 2.6% [-9.3, -2.0%]), and D2cc-bladder (-3.4 ± 1.9% [-7.2, -1.1%]). As brachytherapy TPSs implement advanced model-based dose calculations, the analytic applicator models stored in TPSs should be independently validated before clinical use. For this cohort, clinically meaningful differences (>5%) from TG-43 were observed. Accurate dosimetric modeling of shielded applicators may help to refine organ toxicity studies. Copyright © 2013 American Brachytherapy Society. Published by Elsevier Inc. All rights reserved.

  3. Off-fault plasticity in three-dimensional dynamic rupture simulations using a modal Discontinuous Galerkin method on unstructured meshes: Implementation, verification, and application

    NASA Astrophysics Data System (ADS)

    Wollherr, Stephanie; Gabriel, Alice-Agnes; Uphoff, Carsten

    2018-05-01

    The dynamics and potential size of earthquakes depend crucially on rupture transfers between adjacent fault segments. To describe earthquake source dynamics accurately, numerical models can account for realistic fault geometries and rheologies, such as nonlinear inelastic processes off the slip interface. We present the implementation, verification, and application of off-fault Drucker-Prager plasticity in the open-source software SeisSol (www.seissol.org). SeisSol is based on an arbitrary high-order derivative modal Discontinuous Galerkin (ADER-DG) method using unstructured tetrahedral meshes specifically suited for complex geometries. Two implementation approaches are detailed, modelling plastic failure either by employing sub-elemental quadrature points or by switching to nodal basis coefficients. At fine fault discretizations the nodal basis approach is up to 6 times more efficient in terms of computational cost while yielding comparable accuracy. Both methods are verified in community benchmark problems and by three-dimensional numerical h- and p-refinement studies with heterogeneous initial stresses. We observe no spectral convergence for on-fault quantities with respect to a given reference solution, but rather discuss a limitation to low-order convergence for heterogeneous 3D dynamic rupture problems. For simulations including plasticity, a high fault resolution may be less crucial than commonly assumed, owing to the regularization of peak slip rate and an increase in the minimum cohesive zone width. In large-scale dynamic rupture simulations based on the 1992 Landers earthquake, we observe high rupture complexity, including reverse slip, direct branching, and dynamic triggering. The spatio-temporal distribution of rupture transfers is altered distinctively by plastic energy absorption, correlated with locations of geometrical fault complexity. Computational cost increases by 7% when accounting for off-fault plasticity in this demonstration application. Our results imply that the combination of fully 3D dynamic modelling, complex fault geometries, and off-fault plastic yielding is important for realistically capturing dynamic rupture transfers in natural fault systems.
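
    For readers unfamiliar with the yield criterion, a minimal Python sketch of a Drucker-Prager plasticity check follows. The particular form F = sqrt(J2) - (c cos(phi) - sigma_m sin(phi)), the sign convention (compression negative), and the parameter names are common textbook choices and are assumptions here; SeisSol's actual implementation should be consulted for details.

        # Drucker-Prager yield check for a 3x3 stress tensor (sketch).
        import numpy as np

        def drucker_prager_yield(sigma, cohesion, friction_angle):
            sigma_m = np.trace(sigma) / 3.0      # mean stress
            s = sigma - sigma_m * np.eye(3)      # deviatoric stress
            J2 = 0.5 * np.tensordot(s, s)        # second deviatoric invariant
            tau_c = cohesion * np.cos(friction_angle) - sigma_m * np.sin(friction_angle)
            return np.sqrt(J2) - tau_c           # > 0 indicates plastic yielding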

  4. Dosimetric validation and clinical implementation of two 3D dose verification systems for quality assurance in volumetric-modulated arc therapy techniques.

    PubMed

    Clemente-Gutiérrez, Francisco; Pérez-Vara, Consuelo

    2015-03-08

    A pretreatment quality assurance program for volumetric techniques should include redundant calculations and measurement-based verifications, and the patient-specific quality assurance process must be based on clinically relevant metrics. The aim of this study was to present the commissioning, clinical implementation, and comparison of two systems that perform a 3D redundant dose calculation; in addition, one of them can reconstruct the dose on the patient anatomy from measurements taken with a 2D ion chamber array. Both systems were compared in terms of reference calibration data (absolute dose, output factors, percentage depth-dose curves, and profiles). Results were in good agreement for absolute dose values (discrepancies below 0.5%) and output factors (mean differences below 1%). Maximum mean discrepancies were located between 10 and 20 cm of depth for PDDs (-2.7%) and in the penumbra region for profiles (mean DTA of 1.5 mm). Validation of the systems was performed by comparing point-dose measurements with values obtained by the two systems for static fields, dynamic fields from the AAPM TG-119 report, and 12 real VMAT plans for different anatomical sites (differences within 1.2%). Comparisons between measurements taken with a 2D ion chamber array and results obtained by both systems for real VMAT plans were also performed (mean global gamma passing rates better than 87.0% and 97.9% for the 2%/2 mm and 3%/3 mm criteria). Clinical implementation of the systems was evaluated by comparing dose-volume parameters for all TG-119 tests and real VMAT plans with TPS values (mean differences below 1%). In addition, dose distributions calculated by the TPS were compared with those extracted by the two systems for real VMAT plans (mean global gamma passing rates better than 86.0% and 93.0% for the 2%/2 mm and 3%/3 mm criteria). The clinical use of both systems was successfully evaluated.
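
    The gamma passing rates quoted above come from the standard gamma-index comparison; a brute-force global 2D version is sketched below in Python. The grid spacing, criteria, and low-dose cutoff are parameters, and clinical tools add interpolation and other refinements, so this is illustrative only.

        # Global 2D gamma index with dose-difference/DTA criteria (sketch).
        import numpy as np

        def gamma_pass_rate(ref, ev, spacing_mm=1.0, dd=0.03, dta_mm=3.0, cutoff=0.10):
            ny, nx = ref.shape
            dmax = ref.max()
            yy, xx = np.mgrid[0:ny, 0:nx]
            passed = total = 0
            for i in range(ny):
                for j in range(nx):
                    if ref[i, j] < cutoff * dmax:
                        continue                  # skip low-dose points
                    dist2 = ((yy - i) ** 2 + (xx - j) ** 2) * spacing_mm ** 2
                    dose2 = ((ev - ref[i, j]) / (dd * dmax)) ** 2
                    gamma = np.sqrt(dist2 / dta_mm ** 2 + dose2).min()
                    total += 1
                    passed += gamma <= 1.0
            return 100.0 * passed / total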

  5. Real-time high speed generator system emulation with hardware-in-the-loop application

    NASA Astrophysics Data System (ADS)

    Stroupe, Nicholas

    The emerging emphasis on, and benefits of, distributed generation on smaller-scale networks has drawn considerable research attention, which in turn has stimulated the development of simulation software and techniques. Testing and verification of these distributed power networks is a complex task, and real hardware testing is often desired; this is where simulation methods such as hardware-in-the-loop (HIL) become important, in which an actual hardware unit is interfaced with a software-simulated environment to verify proper functionality. This thesis takes the HIL technique one step further by emulating the output voltage of a generator system interfaced to a scaled hardware distributed power system for testing; its purpose is to demonstrate a new method of testing a virtually simulated generation system supplying a scaled distributed power system in hardware. The work uses the Non-Linear Loads Test Bed developed by the Energy Conversion and Integration Thrust at the Center for Advanced Power Systems. This test bed consists of a series of hardware converters consistent with the Navy's proposed All-Electric-Ship power system and is used to perform tests on controls and stability under the non-linear load environment expected from Navy weaponry; it can also support other distributed power system research topics and serves as a flexible hardware unit for a variety of tests. Here, the dynamics of a high-speed permanent magnet generator directly coupled to a microturbine are simulated on an FPGA in real time. The calculated stator output voltage then serves as a reference for a controllable three-phase inverter at the input of the test bed, which emulates and reproduces these voltages on real hardware. The inverter output is connected to the rest of the test bed, which can take on a variety of distributed system topologies for many testing scenarios. In this way, the distributed power system under test in hardware can incorporate real generator system dynamics without physically involving an actual generator system. Successful generator system emulation enables much more detailed system studies without the drawbacks of needing physical generator units, offering safety, reduced cost, and the ability to scale while preserving the appropriate system dynamics. The thesis introduces the ideas behind generator emulation, explains the process and necessary steps to achieve it, and demonstrates real results with verification of numerical values in real time, showing that the approach is attainable and can prove to be a highly useful tool in the simulation and verification of distributed power systems.
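
    A toy Python sketch of the core idea: a simulated machine model produces three-phase voltage references that the inverter then reproduces in hardware. Here an idealized permanent-magnet back-EMF stands in for the full FPGA generator model; the machine constants and function name are assumptions for illustration.

        # Three-phase back-EMF reference for an emulating inverter (sketch).
        import numpy as np

        def emf_reference(t, omega_e, lambda_m=0.05):
            """t: time samples [s]; omega_e: electrical speed [rad/s];
            lambda_m: permanent-magnet flux linkage [Wb]."""
            theta = omega_e * t
            amp = omega_e * lambda_m   # EMF magnitude scales with speed
            return np.stack([amp * np.cos(theta),
                             amp * np.cos(theta - 2 * np.pi / 3),
                             amp * np.cos(theta + 2 * np.pi / 3)])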

  6. Satellite detection of oil on the marine surface

    NASA Technical Reports Server (NTRS)

    Wilson, M. J.; Oneill, P. E.; Estes, J. E.

    1981-01-01

    The ability of two widely dissimilar spaceborne imaging sensors to detect surface oil accumulations in the marine environment has been evaluated using broadly different techniques. Digital Landsat multispectral scanner (MSS) data, consisting of two visible and two near-infrared channels, were processed to enhance contrast between areas of known oil coverage and background clean surface water. These enhanced images were then compared to surface verification data gathered by aerial reconnaissance during the October 15, 1975, Landsat overpass. A similar evaluation of oil slick imaging potential was made for digitally enhanced Seasat-A synthetic aperture radar (SAR) data from July 18, 1979. Due to the premature failure of this satellite, however, no concurrent surface verification data were collected; as a substitute, oil slick configuration information was generated for the comparison using meteorological and oceanographic data. The test site used in both studies was the extensive area of natural seepage located off Coal Oil Point, adjacent to the University of California, Santa Barbara.
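
    The contrast enhancement referred to is typically a simple radiometric stretch; a Python sketch of a percentile-based linear stretch applied to one MSS band is shown below. The percentile limits are illustrative assumptions, not the enhancement actually applied in the study.

        # Percentile-based linear contrast stretch for a single band (sketch).
        import numpy as np

        def contrast_stretch(band, lo_pct=2.0, hi_pct=98.0):
            lo, hi = np.percentile(band, [lo_pct, hi_pct])
            return np.clip((band - lo) / (hi - lo), 0.0, 1.0)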

  7. Specification and Verification of Secure Concurrent and Distributed Software Systems

    DTIC Science & Technology

    1992-02-01

    primitive search strategies work for operating systems that contain relatively few operations. As the number of operations increases, so does the... others have granted him access to, etc. The burden of security falls on the operating system, although appropriate hardware support can minimize the... Guttag, J. Horning, and R. Levin. Synchronization primitives for a multiprocessor: a formal specification. Symposium on Operating System Principles

  8. High Fidelity Modeling of Field-Reversed Configuration (FRC) Thrusters (Briefing Charts)

    DTIC Science & Technology

    2017-05-24

    Briefing charts on verification and validation of high-fidelity FRC thruster models (Martin, Sousa, Tran, AFRL/RQRS; Distribution A - approved for public release). Verification: asymptotic models reduced to analytical solutions yield exact convergence tests, with the caveat that converged math can still produce irrelevant solutions, so one must be aware of valid assumption regions. Validation: a fluids example using Stokes flow.

  9. An In Vitro Model for Retinal Laser Damage

    DTIC Science & Technology

    2007-01-01

    Approved for public release; distribution unlimited. This paper is part of the report: Conference on Optical Interactions with Tissue... necessarily endorsed by the United States Air Force. Optical Interactions with Tissue and Cells XVIII, edited by Steven L. Jacques, William P. Roach, Proc... used for the 532-nm exposures. Verification of laser wavelength was performed with a spectrometer (Ocean Optics). Figure 4 provides a schematic.

  10. MAGAT gel and EBT2 film‐based dosimetry for evaluating source plugging‐based treatment plan in Gamma Knife stereotactic radiosurgery

    PubMed Central

    Vivekanandhan, S.; Kale, S.S.; Rath, G.K.; Senthilkumaran, S.; Thulkar, S.; Subramani, V.; Laviraj, M.A.; Bisht, R.K.; Mahapatra, A.K.

    2012-01-01

    This work illustrates a procedure to assess the overall accuracy associated with Gamma Knife treatment planning using plugging. The main role of source plugging, or blocking, is to create dose falloff in the junction between a target and a critical structure. We report the use of a MAGAT gel dosimeter for verification of an experimental treatment plan based on plugging. The polymer gel, contained in a head-sized glass container, simulated all major aspects of the treatment process of Gamma Knife radiosurgery. The 3D dose distribution recorded in the gel dosimeter was read using a 1.5T MRI scanner. The scanning protocol was a CPMG pulse sequence with 8 equidistant echoes, TR=7 s, echo step=14 ms, pixel size=0.5 mm x 0.5 mm, and slice thickness of 2 mm. Using a calibration relationship between absorbed dose and spin-spin relaxation rate (R2), we converted R2 images to dose images. Volumetric dose comparison between the treatment planning system (TPS) and the gel measurement was accomplished using an in-house MATLAB-based program. The isodose overlay of the measured and computed dose distributions on axial planes was in close agreement. Gamma index analysis of the 3D data showed more than a 94% voxel pass rate for tolerance criteria of 3%/2 mm, 3%/1 mm, and 2%/2 mm. Film dosimetry with GAFCHROMIC EBT2 film was also performed to compare the results with the calculated TPS dose; gamma index analysis of the film measurement, for the same tolerance criteria used in the gel evaluation, showed more than a 95% voxel pass rate. Verification of the GammaPlan-calculated dose in the presence of shielding is not part of the acceptance testing of the Leksell Gamma Knife (LGK). Through this study we accomplished a volumetric comparison of dose distributions measured with a polymer gel dosimeter and Leksell GammaPlan (LGP) calculations for plans using plugging. We propose the gel dosimeter as a quality assurance (QA) tool for verification of plug-based planning. PACS number: 87.53.Ly, 87.55.-x, 87.56.N- PMID:23149780
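
    A minimal sketch of the R2-to-dose conversion step, assuming the linear calibration D = (R2 - R2_0)/m that such studies typically fit; the coefficient values below are placeholders, not the calibration from this work.

        # Convert an R2 map (1/s) to dose (Gy) with a linear calibration (sketch).
        import numpy as np

        def r2_to_dose(r2_map, slope=0.1, r2_zero=1.2):
            """slope: (1/s) per Gy; r2_zero: R2 of unirradiated gel."""
            return np.clip((r2_map - r2_zero) / slope, 0.0, None)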

  11. SU-E-T-52: A New Device for Quality Assurance of a Single Isocenter Technique for the Simultaneous Treatment of Multiple Brain Metastases

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Maurer, J; Sintay, B; Varchena, V

    2015-06-15

    Purpose: Comprehensive quality assurance (QA) of a single-isocenter technique for the simultaneous treatment of multiple brain metastases is presently impractical due to the time-consuming nature of measuring each lesion's dose on film or with a micro-chamber. Three-dimensional diode array and full-field film measurements are sometimes used to evaluate these plans, but gamma analysis may not reveal local errors that have significant effects on one or a few of several targets. This work aimed to design, build, and test a phantom to simplify comprehensive measurement and evaluation. Methods: A phantom was designed with 28 stackable slabs. The top and bottom slabs are 1.5 centimeters (cm) thick, and the central 26 slabs are 0.5 cm thick. When assembled with radiochromic film in all 27 gaps, the phantom measures 16.5 x 15 x 19 cm. Etchings were designed to aid in the identification of specific film planes on computed tomography (CT) images and the correlation of individual PTVs with the closest bisecting planes. Patient verification plans with a total of 16 PTVs were calculated on the phantom CT, and test deliveries both with and without couch kicks were performed to test the ability to identify correct film placements and the subsequent PTV-specific dose distributions on the films. Results: Bisecting planes corresponding to PTV locations were easily identified, and PTV-specific dose distributions were clear for all 16 targets. For deliveries with couch kicks, the phantom PTV dose distributions closely approximated those calculated on the patient's CT. For deliveries without couch kicks, PTV-specific dosimetry was also possible, although the distributions had 'ghosts' equaling the number of couch kicks, with the distance between ghosts increasing with distance from the isocenter. Conclusion: A new phantom facilitates fast, comprehensive commissioning validation and PTV-specific dosimetry for a single-isocenter technique for treating multiple brain metastases. This work was partially funded by CIRS, Inc.
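
    A small Python sketch of the plane-assignment step described above: each PTV centroid is mapped to its nearest film plane. The plane coordinates follow the stated slab geometry (1.5 cm end slabs, 0.5 cm central slabs, film in all 27 gaps) but are otherwise assumptions for illustration.

        # Map a PTV centroid to the closest bisecting film plane (sketch).
        import numpy as np

        film_z_cm = 1.5 + 0.5 * np.arange(27)   # gap positions along the stack

        def nearest_film_plane(ptv_z_cm):
            idx = int(np.argmin(np.abs(film_z_cm - ptv_z_cm)))
            return idx, film_z_cm[idx]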

  12. Integrated cockpit design for the Army helicopter improvement program

    NASA Technical Reports Server (NTRS)

    Drennen, T.; Bowen, B.

    1984-01-01

    The main Army Helicopter Improvement Program (AHIP) mission is to navigate precisely, locate targets accurately, communicate their position to other battlefield elements, and designate them for laser-guided weapons. The onboard navigation and mast-mounted sight (MMS) avionics enable accurate tracking of current aircraft position and subsequent target location. The AHIP crewstation development was based on extensive mission/task analysis, function allocation, total system design, and test and verification. The avionics available to meet the mission requirements were limited by the existing aircraft structural and performance characteristics and the resultant space, weight, and power restrictions. These limitations, together with the night-operations requirement, led to the use of night vision goggles. The combination of these requirements and limitations dictated an integrated control/display approach using multifunction displays and controls.

  13. Ground calibration of the spatial response and quantum efficiency of the CdZnTe hard x-ray detectors for NuSTAR

    NASA Astrophysics Data System (ADS)

    Grefenstette, Brian W.; Bhalerao, Varun; Cook, W. Rick; Harrison, Fiona A.; Kitaguchi, Takao; Madsen, Kristin K.; Mao, Peter H.; Miyasaka, Hiromasa; Rana, Vikram

    2017-08-01

    Pixelated Cadmium Zinc Telluride (CdZnTe) detectors are currently flying on the Nuclear Spectroscopic Telescope ARray (NuSTAR) NASA Astrophysics Small Explorer. While the pixel pitch of the detectors is ≈ 605 μm, we can leverage the detector readout architecture to determine the interaction location of an individual photon to much higher spatial accuracy. The sub-pixel spatial location allows us to finely oversample the point spread function of the optics and reduces imaging artifacts due to pixelation. In this paper we demonstrate how the sub-pixel information is obtained, describe how the detectors were calibrated, and provide ground verification of the quantum efficiency of our Monte Carlo model of the detector response.
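
    As a rough illustration of sub-pixel positioning, the Python sketch below estimates the interaction location from the charge shared across a 3 x 3 pixel neighbourhood via a signal-weighted centroid. This centroid method is an assumption for illustration; the flight algorithm is more sophisticated, and only the pixel pitch is taken from the paper.

        # Charge-weighted centroid over a 3x3 pixel neighbourhood (sketch).
        import numpy as np

        PITCH_UM = 605.0   # approximate pixel pitch quoted above

        def subpixel_centroid(signals_3x3):
            offsets = np.array([-1.0, 0.0, 1.0]) * PITCH_UM
            w = np.clip(signals_3x3, 0.0, None)
            x = (w.sum(axis=0) * offsets).sum() / w.sum()   # column-weighted
            y = (w.sum(axis=1) * offsets).sum() / w.sum()   # row-weighted
            return x, y   # microns from the centre of the hit pixel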

  14. Verification Games: Crowd-Sourced Formal Verification

    DTIC Science & Technology

    2016-03-01

    Verification Games: Crowd-Sourced Formal Verification. University of Washington, March 2016, final technical report; dates covered June 2012 - September 2015; contract number FA8750-... clarification memorandum dated 16 Jan 09. From the abstract: over the more than three years of the project Verification Games: Crowd-sourced...

  15. SU-F-T-288: Impact of Trajectory Log Files for Clarkson-Based Independent Dose Verification of IMRT and VMAT

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Takahashi, R; Kamima, T; Tachibana, H

    2016-06-15

    Purpose: To investigate the effect of trajectory log files from the linear accelerator on Clarkson-based independent dose verification of IMRT and VMAT plans. Methods: A CT-based independent dose verification software package (Simple MU Analysis: SMU, Triangle Products, Japan) with a Clarkson-based algorithm was modified to calculate dose using the trajectory log files. Eclipse, with the three techniques of step-and-shoot (SS), sliding window (SW), and RapidArc (RA), was used as the treatment planning system (TPS). In this study, clinically approved IMRT and VMAT plans for prostate and head and neck (HN) at two institutions were retrospectively analyzed to assess the dose deviation between the DICOM-RT plan (PL) and the trajectory log file (TJ). An additional analysis was performed to evaluate the MLC error detection capability of SMU when the trajectory log files were modified by adding systematic errors (0.2, 0.5, 1.0 mm) and random errors (5, 10, 30 mm) to the actual MLC positions. Results: The dose deviations for prostate and HN at the two sites were 0.0% and 0.0% in SS, 0.1±0.0% and 0.1±0.1% in SW, and 0.6±0.5% and 0.7±0.9% in RA, respectively. The MLC error detection analysis showed that HN IMRT plans were the most sensitive: a 0.2 mm systematic error produced a 0.7% dose deviation on average. Random MLC errors did not affect the dose deviation. Conclusion: The use of trajectory log files, which include the actual MLC positions, gantry angles, etc., should make independent verification more effective. The tolerance level for a secondary check using the trajectory file may be similar to that of verification using the DICOM-RT plan file. In terms of the resolution of MLC positional error detection, the secondary check could detect MLC position errors commensurate with the treatment sites and techniques. This research is partially supported by the Japan Agency for Medical Research and Development (AMED).
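
    A minimal Python sketch of the error-injection step used in the detectability test: logged leaf positions are shifted by a systematic offset and, optionally, random noise. The data layout (an array of leaf positions in mm) and the normal noise model are assumptions for illustration.

        # Perturb MLC leaf positions with systematic and random errors (sketch).
        import numpy as np

        def inject_mlc_errors(leaf_mm, systematic_mm=0.2, random_mm=0.0, seed=0):
            rng = np.random.default_rng(seed)
            noise = rng.normal(0.0, random_mm, leaf_mm.shape) if random_mm else 0.0
            return leaf_mm + systematic_mm + noise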

  16. Electricity End Uses, Energy Efficiency, and Distributed Energy Resources Baseline

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Schwartz, Lisa; Wei, Max; Morrow, William

    This report was developed by a team of analysts at Lawrence Berkeley National Laboratory, with Argonne National Laboratory contributing the transportation section. It is a DOE EPSA product and part of a series of "baseline" reports intended to inform the second installment of the Quadrennial Energy Review (QER 1.2). QER 1.2 provides a comprehensive review of the nation's electricity system, covering the current state and key trends related to the electricity system, including generation, transmission, distribution, grid operations and planning, and end use. The baseline reports provide an overview of elements of the electricity system. This report focuses on end uses, electricity consumption, electric energy efficiency, distributed energy resources (DERs) (such as demand response, distributed generation, and distributed storage), and evaluation, measurement, and verification (EM&V) methods for energy efficiency and DERs.

  17. Analysis-Based Verification: A Programmer-Oriented Approach to the Assurance of Mechanical Program Properties

    DTIC Science & Technology

    2010-05-27

    programming language, threads can only communicate through fields, and this assertion prohibits an alias to the object under construction from being written... 1.9. We call this type of reporting "compiler-like" in the sense that the descriptive message output by the tool has to communicate the semantics of... way to communicate a "need" for further annotation to the tool user, because a precise expression of both the location and content of the needed

  18. An integral equation formulation for predicting radiation patterns of a space shuttle annular slot antenna

    NASA Technical Reports Server (NTRS)

    Jones, J. E.; Richmond, J. H.

    1974-01-01

    An integral equation formulation is applied to predict pitch- and roll-plane radiation patterns of a thin VHF/UHF (very high frequency/ultra high frequency) annular slot communications antenna operating at several locations in the nose region of the space shuttle orbiter. Digital computer programs used to compute the radiation patterns are given, and their use is illustrated. Experimental verification of the computed patterns is provided by measurements made on 1/35-scale models of the orbiter.

  19. Design, analysis and test verification of advanced encapsulation systems

    NASA Technical Reports Server (NTRS)

    Garcia, A., III

    1984-01-01

    Investigations into transparent conductive polymers were begun. Polypyrrole was electrochemically deposited, but the film characteristics were poor. A proprietary polymer material supplied by Polaroid was evaluated and showed promise as a readily processable material. A method was developed for calculating the magnitude and location of the maximum electric field for the family of solar-cell-like shapes. A method for calculating the lines of force for three dimensional electric fields was developed and applied to a geometry of interest to the photovoltaic program.

  20. Typification of Zapałowicz’s names in Aconitum section Aconitum

    PubMed Central

    Wacławska-Ćwiertnia, Klaudia; Mitka, Józef

    2016-01-01

    Abstract Hugo Zapałowicz described and named 27 taxa in Aconitum sect. Aconitum. Their names are typified here. Two of them (Aconitum berdaui, Aconitum bucovinense) are deemed correct for currently accepted species of the Carpathians, 24 are reduced to synonymy under five taxa, and for one no original material has been located. The correct place and exact date of their publication, which differ from those usually assumed, have been ascertained by bibliographic verification and the study of archival documents. PMID:26884711
