Sample records for regional field verification

  1. Idaho out-of-service verification field operational test

    DOT National Transportation Integrated Search

    2000-02-01

    The Out-of-Service Verification Field Operational Test Project was initiated in 1994. The purpose of the project was to test the feasibility of using sensors and a computerized tracking system to augment the ability of inspectors to monitor and contr...

  2. Verification of operational solar flare forecast: Case of Regional Warning Center Japan

    NASA Astrophysics Data System (ADS)

    Kubo, Yûki; Den, Mitsue; Ishii, Mamoru

    2017-08-01

    In this article, we discuss a verification study of an operational solar flare forecast in the Regional Warning Center (RWC) Japan. The RWC Japan has been issuing four-categorical deterministic solar flare forecasts for a long time. In this forecast verification study, we used solar flare forecast data accumulated over 16 years (from 2000 to 2015). We compiled the forecast data together with solar flare data obtained with the Geostationary Operational Environmental Satellites (GOES). Using the compiled data sets, we estimated some conventional scalar verification measures with 95% confidence intervals. We also estimated a multi-categorical scalar verification measure. These scalar verification measures were compared with those obtained by the persistence method and recurrence method. As solar activity varied during the 16 years, we also applied verification analyses to four subsets of forecast-observation pair data with different solar activity levels. We cannot conclude definitely that there are significant performance differences between the forecasts of RWC Japan and the persistence method, although a slightly significant difference is found for some event definitions. We propose to use a scalar verification measure to assess the judgment skill of the operational solar flare forecast. Finally, we propose a verification strategy for deterministic operational solar flare forecasting. For dichotomous forecast, a set of proposed verification measures is a frequency bias for bias, proportion correct and critical success index for accuracy, probability of detection for discrimination, false alarm ratio for reliability, Peirce skill score for forecast skill, and symmetric extremal dependence index for association. For multi-categorical forecast, we propose a set of verification measures as marginal distributions of forecast and observation for bias, proportion correct for accuracy, correlation coefficient and joint probability distribution for association, the
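    The dichotomous measures proposed in the abstract can all be read off a standard 2×2 contingency table. A minimal sketch (standard forecast-verification formulas, not code from the paper; the counts are made up for illustration):

```python
# Dichotomous forecast verification measures from a 2x2 contingency
# table: a = hits, b = false alarms, c = misses, d = correct rejections.
# Formulas are the conventional ones from forecast verification
# practice; the example counts are illustrative, not the paper's data.

def verification_measures(a, b, c, d):
    n = a + b + c + d
    return {
        "frequency_bias": (a + b) / (a + c),              # bias
        "proportion_correct": (a + d) / n,                # accuracy
        "critical_success_index": a / (a + b + c),        # accuracy
        "probability_of_detection": a / (a + c),          # discrimination
        "false_alarm_ratio": b / (a + b),                 # reliability
        "peirce_skill_score": a / (a + c) - b / (b + d),  # forecast skill
    }

scores = verification_measures(a=30, b=20, c=10, d=940)
print(scores["probability_of_detection"])  # 0.75
```

    The same table also supports interval estimates: bootstrap resampling of forecast-observation pairs is one common way to obtain the 95% confidence intervals the abstract mentions.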

  3. An unattended verification station for UF 6 cylinders: Field trial findings

    DOE PAGES

    Smith, L. E.; Miller, K. A.; McDonald, B. S.; ...

    2017-08-26

    In recent years, the International Atomic Energy Agency (IAEA) has pursued innovative techniques and an integrated suite of safeguards measures to address the verification challenges posed by the front end of the nuclear fuel cycle. Among the unattended instruments currently being explored by the IAEA is an Unattended Cylinder Verification Station (UCVS), which could provide automated, independent verification of the declared relative enrichment, 235U mass, total uranium mass, and identification for all declared uranium hexafluoride cylinders in a facility (e.g., uranium enrichment plants and fuel fabrication plants). Under the auspices of the United States and European Commission Support Programs to the IAEA, a project was undertaken to assess the technical and practical viability of the UCVS concept. The first phase of the UCVS viability study was centered on a long-term field trial of a prototype UCVS system at a fuel fabrication facility. A key outcome of the study was a quantitative performance evaluation of two nondestructive assay (NDA) methods being considered for inclusion in a UCVS: Hybrid Enrichment Verification Array (HEVA), and Passive Neutron Enrichment Meter (PNEM). This paper provides a description of the UCVS prototype design and an overview of the long-term field trial. In conclusion, analysis results and interpretation are presented with a focus on the performance of PNEM and HEVA for the assay of over 200 “typical” Type 30B cylinders, and the viability of an “NDA Fingerprint” concept as a high-fidelity means to periodically verify that material diversion has not occurred.

  4. An unattended verification station for UF 6 cylinders: Field trial findings

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Smith, L. E.; Miller, K. A.; McDonald, B. S.

    In recent years, the International Atomic Energy Agency (IAEA) has pursued innovative techniques and an integrated suite of safeguards measures to address the verification challenges posed by the front end of the nuclear fuel cycle. Among the unattended instruments currently being explored by the IAEA is an Unattended Cylinder Verification Station (UCVS), which could provide automated, independent verification of the declared relative enrichment, 235U mass, total uranium mass, and identification for all declared uranium hexafluoride cylinders in a facility (e.g., uranium enrichment plants and fuel fabrication plants). Under the auspices of the United States and European Commission Support Programs to the IAEA, a project was undertaken to assess the technical and practical viability of the UCVS concept. The first phase of the UCVS viability study was centered on a long-term field trial of a prototype UCVS system at a fuel fabrication facility. A key outcome of the study was a quantitative performance evaluation of two nondestructive assay (NDA) methods being considered for inclusion in a UCVS: Hybrid Enrichment Verification Array (HEVA), and Passive Neutron Enrichment Meter (PNEM). This paper provides a description of the UCVS prototype design and an overview of the long-term field trial. In conclusion, analysis results and interpretation are presented with a focus on the performance of PNEM and HEVA for the assay of over 200 “typical” Type 30B cylinders, and the viability of an “NDA Fingerprint” concept as a high-fidelity means to periodically verify that material diversion has not occurred.

  5. An unattended verification station for UF6 cylinders: Field trial findings

    NASA Astrophysics Data System (ADS)

    Smith, L. E.; Miller, K. A.; McDonald, B. S.; Webster, J. B.; Zalavadia, M. A.; Garner, J. R.; Stewart, S. L.; Branney, S. J.; Todd, L. C.; Deshmukh, N. S.; Nordquist, H. A.; Kulisek, J. A.; Swinhoe, M. T.

    2017-12-01

    In recent years, the International Atomic Energy Agency (IAEA) has pursued innovative techniques and an integrated suite of safeguards measures to address the verification challenges posed by the front end of the nuclear fuel cycle. Among the unattended instruments currently being explored by the IAEA is an Unattended Cylinder Verification Station (UCVS), which could provide automated, independent verification of the declared relative enrichment, 235U mass, total uranium mass, and identification for all declared uranium hexafluoride cylinders in a facility (e.g., uranium enrichment plants and fuel fabrication plants). Under the auspices of the United States and European Commission Support Programs to the IAEA, a project was undertaken to assess the technical and practical viability of the UCVS concept. The first phase of the UCVS viability study was centered on a long-term field trial of a prototype UCVS system at a fuel fabrication facility. A key outcome of the study was a quantitative performance evaluation of two nondestructive assay (NDA) methods being considered for inclusion in a UCVS: Hybrid Enrichment Verification Array (HEVA), and Passive Neutron Enrichment Meter (PNEM). This paper provides a description of the UCVS prototype design and an overview of the long-term field trial. Analysis results and interpretation are presented with a focus on the performance of PNEM and HEVA for the assay of over 200 "typical" Type 30B cylinders, and the viability of an "NDA Fingerprint" concept as a high-fidelity means to periodically verify that material diversion has not occurred.

  6. A calibration method for patient specific IMRT QA using a single therapy verification film

    PubMed Central

    Shukla, Arvind Kumar; Oinam, Arun S.; Kumar, Sanjeev; Sandhu, I.S.; Sharma, S.C.

    2013-01-01

    Aim The aim of the present study is to develop and verify a single-film calibration procedure for use in intensity-modulated radiation therapy (IMRT) quality assurance. Background Radiographic films have been regularly used in routine commissioning of treatment modalities and verification of treatment planning systems (TPS). Radiation dosimetry based on radiographic film can give absolute two-dimensional dose distributions and is preferred for IMRT quality assurance. A single therapy verification film, moreover, offers a quick and reliable method for IMRT verification. Materials and methods A single extended dose range (EDR 2) film was used to generate the sensitometric curve of film optical density versus radiation dose. The EDR 2 film was exposed with nine 6 cm × 6 cm fields of a 6 MV photon beam obtained from a medical linear accelerator at 5-cm depth in a solid water phantom. The nine regions of the single film were exposed with radiation doses ranging from 10 to 362 cGy. The actual dose measurements inside the field regions were performed using a 0.6 cm3 ionization chamber. The exposed film was processed after irradiation, scanned using a VIDAR film scanner, and the optical density was noted for each region. Ten IMRT plans of head and neck carcinoma were used for verification using a dynamic IMRT technique, and evaluated against the TPS-calculated dose distribution using the gamma index method. Results A sensitometric curve was generated from the single film exposed at nine field regions to enable quantitative dose verification of IMRT treatments. The radiation scatter factor was observed to decrease exponentially with increasing distance from the centre of each field region. The IMRT plans based on the calibration curve were verified using the gamma index method and found to be within acceptable criteria. Conclusion The single film method proved to be superior to the traditional calibration method and produce fast daily film calibration for highly
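    The calibration step the abstract describes (nine optical-density/dose pairs fitted to a sensitometric curve, then inverted to read dose from film) might be sketched as follows. The OD and dose values and the cubic fit are illustrative assumptions, not the paper's data or method:

```python
import numpy as np

# Hypothetical single-film calibration: nine film regions give
# optical-density/dose pairs; a smooth curve is fitted so that a
# measured OD on a verification film can be converted to dose.
dose_cGy = np.array([10, 50, 90, 130, 180, 220, 270, 320, 362])
optical_density = np.array([0.25, 0.70, 1.05, 1.32, 1.60, 1.78, 1.97, 2.12, 2.22])

# Fit dose as a function of OD (a low-order polynomial is one common choice)
coeffs = np.polyfit(optical_density, dose_cGy, deg=3)
od_to_dose = np.poly1d(coeffs)

# Convert a measured OD from an IMRT verification film into dose
print(round(float(od_to_dose(1.5)), 1))
```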

  7. Flow visualization methods for field test verification of CFD analysis of an open gloveport

    DOE PAGES

    Strons, Philip; Bailey, James L.

    2017-01-01

    Anemometer readings alone cannot provide a complete picture of air flow patterns at an open gloveport. Having a means to visualize air flow for field tests in general provides greater insight by indicating direction in addition to the magnitude of the air flow velocities in the region of interest. Furthermore, flow visualization is essential for Computational Fluid Dynamics (CFD) verification, where important modeling assumptions play a significant role in analyzing the chaotic nature of low-velocity air flow. A good example is shown in Figure 1, where an unexpected vortex pattern occurred during a field test that could not have been measured relying only on anemometer readings. Here, observing and measuring the patterns of the smoke flowing into the gloveport allowed the CFD model to be appropriately updated to match the actual flow velocities in both magnitude and direction.

  8. Competitive region orientation code for palmprint verification and identification

    NASA Astrophysics Data System (ADS)

    Tang, Wenliang

    2015-11-01

    Orientation features of the palmprint have been widely investigated in coding-based palmprint-recognition methods. Conventional orientation-based coding methods usually used discrete filters to extract the orientation feature of palmprint. However, in real operations, the orientations of the filter usually are not consistent with the lines of the palmprint. We thus propose a competitive region orientation-based coding method. Furthermore, an effective weighted balance scheme is proposed to improve the accuracy of the extracted region orientation. Compared with conventional methods, the region orientation of the palmprint extracted using the proposed method can precisely and robustly describe the orientation feature of the palmprint. Extensive experiments on the baseline PolyU and multispectral palmprint databases are performed and the results show that the proposed method achieves a promising performance in comparison to conventional state-of-the-art orientation-based coding methods in both palmprint verification and identification.
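    As a toy illustration of the competitive idea (the winning orientation among a discrete set of directions becomes the code for a region), assuming simple line-sampling responses rather than the paper's actual filter bank or weighted balance scheme:

```python
import numpy as np

# Toy competitive orientation coding for one palmprint region.
# The filter bank here is just intensity sampling along six discrete
# line directions; real methods use tuned line filters. The random
# patch stands in for a palmprint region and is purely illustrative.
rng = np.random.default_rng(0)
patch = rng.random((16, 16))

angles = np.deg2rad(np.arange(0, 180, 30))  # 0, 30, ..., 150 degrees
responses = []
for th in angles:
    # Sample 9 points along a line through the patch center
    t = np.arange(-4, 5)
    rows = np.clip((8 + t * np.sin(th)).astype(int), 0, 15)
    cols = np.clip((8 + t * np.cos(th)).astype(int), 0, 15)
    responses.append(patch[rows, cols].mean())

# Palm lines are dark, so the direction with the minimal mean
# intensity "wins" the competition and becomes the orientation code.
code = int(np.argmin(responses))
print(code)
```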

  9. Influence of the Redundant Verification and the Non-Redundant Verification on the Hydraulic Tomography

    NASA Astrophysics Data System (ADS)

    Wei, T. B.; Chen, Y. L.; Lin, H. R.; Huang, S. Y.; Yeh, T. C. J.; Wen, J. C.

    2016-12-01

    In groundwater studies, hydraulic tomography (HT) has been used to estimate the heterogeneous spatial distribution of hydraulic properties of an aquifer from field-site pumping tests, and many such studies have shown that most field-site aquifers have heterogeneous spatial distributions of hydrogeological parameters. Huang et al. [2011] used a non-redundant verification analysis, in which the pumping-well locations changed while the observation wells remained fixed for both the inverse and forward analyses, to demonstrate the feasibility of estimating the heterogeneous spatial distribution of hydraulic properties of a field-site aquifer with a steady-state model. The literature to date, however, has considered only steady-state, non-redundant verification with changing pumping wells and fixed observation wells. The various combinations of fixed or changing pumping-well locations with fixed observation wells (redundant verification) or changing observation wells (non-redundant verification), and their influence on the hydraulic tomography method, have not yet been explored. In this study, both the redundant and the non-redundant verification methods were carried out in the forward analysis to examine their influence on the hydraulic tomography method under transient conditions. The methods were applied to an actual case at the NYUST campus site; the analysis results demonstrate the effectiveness of hydraulic tomography and confirm the feasibility of the inverse and forward analyses. Keywords: Hydraulic Tomography, Redundant Verification, Heterogeneous, Inverse, Forward

  10. GENERIC VERIFICATION PROTOCOL FOR THE VERIFICATION OF PESTICIDE SPRAY DRIFT REDUCTION TECHNOLOGIES FOR ROW AND FIELD CROPS

    EPA Science Inventory

    This ETV program generic verification protocol was prepared and reviewed for the Verification of Pesticide Drift Reduction Technologies project. The protocol provides a detailed methodology for conducting and reporting results from a verification test of pesticide drift reductio...

  11. Field verification for the effectiveness of continuity diaphragms for skewed continuous P/C P/S concrete girder bridges.

    DOT National Transportation Integrated Search

    2009-10-01

    The research presented herein describes the field verification for the effectiveness of continuity diaphragms for skewed continuous precast, prestressed, concrete girder bridges. The objectives of this research are (1) to perform field load testi...

  12. Registration verification of SEA/AR fields. [Oregon, Texas, Montana, Nebraska, Washington, Colorado, Kansas, Oklahoma, and North Dakota

    NASA Technical Reports Server (NTRS)

    Austin, W. W.; Lautenschlager, L. (Principal Investigator)

    1981-01-01

    A method of field registration verification for 20 SEA/AR sites for the 1979 crop year is evaluated. Field delineations for the sites were entered into the data base, and their registration verified using single channel gray scale computer printout maps of LANDSAT data taken over the site.

  13. Field Verification Program (Aquatic Disposal): Comparison of Field and Laboratory Bioaccumulation of Organic and Inorganic Contaminants from Black Rock Harbor Dredged Material

    DTIC Science & Technology

    1988-05-01

    Technical Report D-87-6, Field Verification Program (Aquatic Disposal): Comparison of Field and Laboratory Bioaccumulation of Organic and Inorganic Contaminants from Black Rock Harbor Dredged Material. Personal authors: Lake, James L.; Galloway. The contaminants examined include polychlorinated biphenyls (PCBs) and related chlorinated pesticides of similar polarity, in addition to petroleum hydrocarbons.

  14. Comparison of individual and composite field analysis using array detector for Intensity Modulated Radiotherapy dose verification.

    PubMed

    Saminathan, Sathiyan; Chandraraj, Varatharaj; Sridhar, C H; Manickam, Ravikumar

    2012-01-01

    To compare the measured and calculated individual and composite field planar dose distributions of Intensity Modulated Radiotherapy plans. The measurements were performed on a Clinac DHX linear accelerator with 6 MV photons using a Matrixx device and a solid water phantom. Twenty brain tumor patients were selected for this study. The IMRT plan was carried out for all the patients using the Eclipse treatment planning system. A verification plan was produced for every original plan using a CT scan of the Matrixx embedded in the phantom. Every verification field was measured by the Matrixx. The TPS-calculated and measured dose distributions were compared for individual and composite fields. The percentage of gamma pixel match for the dose distribution patterns was evaluated using a gamma histogram. The gamma pixel match was 95-98% for 41 fields (39%) and above 98% for 59 fields (61%) with individual fields. The percentage of gamma pixel match was 95-98% for 5 patients and above 98% for another 12 patients with composite fields. Three patients showed a gamma pixel match of less than 95%. The comparison of percentage gamma pixel match for individual and composite fields showed more than 2.5% variation for 6 patients and more than 1% variation for 4 patients, while the remaining 10 patients showed less than 1% variation. The individual and composite field measurements showed good agreement with the TPS-calculated dose distribution for the studied patients. Since measurement and data analysis for individual fields is a time-consuming process, composite field analysis may be sufficient for smaller-field dose distribution analysis with array detectors.
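    The gamma comparison used in studies like this one can be sketched in miniature. The following is a simplified global gamma on toy 1-D profiles (the 3%/3 mm criterion and all numbers are illustrative, not the study's data or its 2-D implementation):

```python
import numpy as np

# Minimal gamma-index sketch: a measured point passes if some
# reference point satisfies the combined dose-difference /
# distance-to-agreement criterion (3% of max dose / 3 mm here).

def gamma_pass_rate(ref, meas, spacing_mm, dose_tol=0.03, dta_mm=3.0):
    x = np.arange(len(ref)) * spacing_mm
    passed = 0
    for i, d_m in enumerate(meas):
        # Gamma at point i: minimum combined metric over all reference points
        dist2 = ((x - x[i]) / dta_mm) ** 2
        dose2 = ((ref - d_m) / (dose_tol * ref.max())) ** 2
        gamma = np.sqrt(np.min(dist2 + dose2))
        passed += gamma <= 1.0
    return passed / len(meas)

ref = np.array([10.0, 50.0, 100.0, 50.0, 10.0])   # planned profile (cGy)
meas = np.array([10.5, 51.0, 99.0, 49.0, 10.2])   # measured profile (cGy)
print(gamma_pass_rate(ref, meas, spacing_mm=2.5))  # 1.0
```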

  15. SU-F-T-463: Light-Field Based Dynalog Verification

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Atwal, P; Ramaseshan, R

    2016-06-15

    Purpose: To independently verify leaf positions in so-called dynalog files for a Varian iX linac with a Millennium 120 MLC. This verification provides a measure of confidence that the files can be used directly as part of a more extensive intensity modulated radiation therapy / volumetric modulated arc therapy QA program. Methods: Initial testing used white paper placed at the collimator plane and a standard hand-held digital camera to image the light and shadow of a static MLC field through the paper. Known markings on the paper allow for image calibration. Noise reduction was attempted with removal of ‘inherent noise’ from an open-field light image through the paper, but the method was found to be inconsequential. This is likely because the environment could not be controlled to the precision required for the sort of reproducible characterization of the quantum noise needed in order to meaningfully characterize and account for it. A multi-scale iterative edge detection algorithm was used for localizing the leaf ends. These were compared with the planned locations from the treatment console. Results: With a very basic setup, the image of the central bank A leaves 15–45, which are arguably the most important for beam modulation, differed from the planned location by [0.38±0.28] mm. Similarly, bank B leaves 15–45 showed a difference of [0.42±0.28] mm. Conclusion: It should be possible to determine leaf position accurately with not much more than a modern hand-held camera and some software. This means we can have a periodic and independent verification of the dynalog file information. This is indicated by the precision already achieved using a basic setup and analysis methodology. Currently, work is being done to reduce imaging and setup errors, which will bring the leaf position error down further, and allow meaningful analysis over the full range of leaves.

  16. PERFORMANCE VERIFICATION TEST FOR FIELD-PORTABLE MEASUREMENTS OF LEAD IN DUST

    EPA Science Inventory

    The US Environmental Protection Agency (EPA) Environmental Technology Verification (ETV) program (www.epa.gov/etv) conducts performance verification tests of technologies used for the characterization and monitoring of contaminated media. The program exists to provide high-quali...

  17. Load Model Verification, Validation and Calibration Framework by Statistical Analysis on Field Data

    NASA Astrophysics Data System (ADS)

    Jiao, Xiangqing; Liao, Yuan; Nguyen, Thai

    2017-11-01

    Accurate load models are critical for power system analysis and operation. A large amount of research work has been done on load modeling. Most of the existing research focuses on developing load models, while little has been done on developing formal load model verification and validation (V&V) methodologies or procedures. Most existing load model validation is based on qualitative rather than quantitative analysis. In addition, not all aspects of the model V&V problem have been addressed by the existing approaches. To complement the existing methods, this paper proposes a novel load model verification and validation framework that can systematically and more comprehensively examine a load model's effectiveness and accuracy. Statistical analysis, instead of visual check, quantifies the load model's accuracy and provides model users with a confidence level for the developed load model. The analysis results can also be used to calibrate load models. The proposed framework can be used as guidance for utility engineers and researchers to systematically examine load models. The proposed method is demonstrated through analysis of field measurements collected from a utility system.
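    The statistical, rather than visual, comparison such a framework calls for can be as simple as quantifying residuals between field measurements and model output and attaching a confidence interval to the mean error. A hedged sketch with made-up numbers (not the paper's data or its specific statistics):

```python
import statistics

# Quantitative model validation sketch: compare load model output
# against field measurements and report a 95% confidence interval
# for the mean residual. All values are illustrative (per-unit).
measured = [1.02, 0.98, 1.05, 1.01, 0.97, 1.03]   # field data
modeled  = [1.00, 1.00, 1.02, 1.00, 0.99, 1.01]   # load model output

residuals = [m - s for m, s in zip(measured, modeled)]
mean_err = statistics.mean(residuals)
# Standard error of the mean, then a normal-approximation 95% CI
sem = statistics.stdev(residuals) / len(residuals) ** 0.5
ci95 = (mean_err - 1.96 * sem, mean_err + 1.96 * sem)
print(round(mean_err, 4), [round(v, 4) for v in ci95])
```

    If the confidence interval excludes zero, the model has a statistically detectable bias and is a candidate for the calibration step the abstract mentions.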

  18. Hydrologic data-verification management program plan

    USGS Publications Warehouse

    Alexander, C.W.

    1982-01-01

    Data verification refers to the performance of quality control on hydrologic data that have been retrieved from the field and are being prepared for dissemination to water-data users. Water-data users now have access to computerized data files containing unpublished, unverified hydrologic data. Therefore, it is necessary to develop techniques and systems whereby the computer can perform some data-verification functions before the data are stored in user-accessible files. Computerized data-verification routines can be developed for this purpose. A single, unified concept, describing a master data-verification program that uses multiple special-purpose subroutines and a screen file containing verification criteria, can probably be adapted to any type and size of computer-processing system. Some traditional manual-verification procedures can be adapted for computerized verification, but new procedures can also be developed that would take advantage of the powerful statistical tools and data-handling procedures available to the computer. Prototype data-verification systems should be developed for all three data-processing environments as soon as possible. The WATSTORE system probably affords the greatest opportunity for long-range research and testing of new verification subroutines. (USGS)
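    The master-program-plus-screen-file concept might look like the following. The parameter names and limits are hypothetical; the abstract does not give the actual WATSTORE criteria:

```python
# Hypothetical screen-file-driven verification subroutine: each
# criterion bounds a parameter, and records falling outside their
# bounds are flagged before entering user-accessible files.

screen_file = {                      # verification criteria (illustrative)
    "stage_ft": (0.0, 30.0),
    "discharge_cfs": (0.0, 50000.0),
    "water_temp_c": (-0.5, 35.0),
}

def verify_record(record):
    """Return the list of parameters that fail their screen criteria."""
    flags = []
    for param, value in record.items():
        low, high = screen_file.get(param, (float("-inf"), float("inf")))
        if not (low <= value <= high):
            flags.append(param)
    return flags

print(verify_record({"stage_ft": 4.2, "water_temp_c": 41.0}))  # ['water_temp_c']
```

    Range checks like these are only the simplest screen; the statistical procedures the abstract anticipates (e.g. outlier tests against historical records) would slot into the same subroutine structure.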

  19. Verification Games: Crowd-Sourced Formal Verification

    DTIC Science & Technology

    2016-03-01

    Verification Games: Crowd-Sourced Formal Verification. Final technical report, University of Washington, March 2016. Dates covered (from - to): June 2012 – September 2015. Contract number: FA8750... Abstract: Over the more than three years of the project Verification Games: Crowd-sourced

  20. 40 CFR 1065.920 - PEMS calibrations and verifications.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... POLLUTION CONTROLS ENGINE-TESTING PROCEDURES Field Testing and Portable Emission Measurement Systems § 1065... that your new configuration meets this verification. The verification consists of operating an engine... with data simultaneously generated and recorded by laboratory equipment as follows: (1) Mount an engine...

  1. 40 CFR 1065.920 - PEMS calibrations and verifications.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... POLLUTION CONTROLS ENGINE-TESTING PROCEDURES Field Testing and Portable Emission Measurement Systems § 1065... that your new configuration meets this verification. The verification consists of operating an engine... with data simultaneously generated and recorded by laboratory equipment as follows: (1) Mount an engine...

  2. 40 CFR 1065.920 - PEMS calibrations and verifications.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... POLLUTION CONTROLS ENGINE-TESTING PROCEDURES Field Testing and Portable Emission Measurement Systems § 1065... that your new configuration meets this verification. The verification consists of operating an engine... with data simultaneously generated and recorded by laboratory equipment as follows: (1) Mount an engine...

  3. Environmental Technology Verification: Pesticide Spray Drift Reduction Technologies for Row and Field Crops

    EPA Pesticide Factsheets

    The Environmental Technology Verification Program, established by the EPA, is designed to accelerate the development and commercialization of new or improved technologies through third-party verification and reporting of performance.

  4. Open magnetic fields in active regions

    NASA Technical Reports Server (NTRS)

    Svestka, Z.; Solodyna, C. V.; Howard, R.; Levine, R. H.

    1977-01-01

    Soft X-ray images and magnetograms of several active regions and coronal holes are examined which support the interpretation that some of the dark X-ray gaps seen between interconnecting loops and inner cores of active regions are foot points of open field lines inside the active regions. Characteristics of the investigated dark gaps are summarized. All the active regions with dark X-ray gaps at the proper place and with the correct polarity predicted by global potential extrapolation of photospheric magnetic fields are shown to be old active regions, indicating that field opening is accomplished only in a late phase of active-region development. It is noted that some of the observed dark gaps probably have nothing in common with open fields, but are either due to the decreased temperature in low-lying portions of interconnecting loops or are the roots of higher and less dense or cooler loops.

  5. 40 CFR 1065.920 - PEMS Calibrations and verifications.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... POLLUTION CONTROLS ENGINE-TESTING PROCEDURES Field Testing and Portable Emission Measurement Systems § 1065... verification. The verification consists of operating an engine over a duty cycle in the laboratory and... by laboratory equipment as follows: (1) Mount an engine on a dynamometer for laboratory testing...

  6. SU-F-T-440: The Feasibility Research of Checking Cervical Cancer IMRT Pre- Treatment Dose Verification by Automated Treatment Planning Verification System

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Liu, X; Yin, Y; Lin, X

    Purpose: To assess the preliminary feasibility of an automated treatment planning verification system for cervical cancer IMRT pre-treatment dose verification. Methods: The study randomly selected clinical IMRT treatment planning data for twenty patients with cervical cancer; all IMRT plans were divided into 7 fields to meet the dosimetric goals using a commercial treatment planning system (Pinnacle version 9.2 and Eclipse version 13.5). The plans were exported to the Mobius 3D (M3D) server, and percentage differences in the volume of each region of interest (ROI) and in the calculated dose to the target region and organs at risk were evaluated, in order to validate the accuracy of the automated treatment planning verification system. Results: The difference in volume between Pinnacle and M3D was less than that between Eclipse and M3D for each ROI; the biggest differences were 0.22±0.69% and 3.5±1.89% for Pinnacle and Eclipse, respectively. M3D showed slightly better agreement in the dose to the target and organs at risk compared with the TPS. After recalculating the plans in M3D, the dose difference for Pinnacle was less than that for Eclipse on average, and results were within 3%. Conclusion: The method of utilizing the automated treatment planning system to validate the accuracy of plans is convenient, but the scope of the differences still needs more clinical patient cases to determine. At present, it should be used as a secondary check tool to improve safety in clinical treatment planning.

  7. GENERIC VERIFICATION PROTOCOL: DISTRIBUTED GENERATION AND COMBINED HEAT AND POWER FIELD TESTING PROTOCOL

    EPA Science Inventory

    This report is a generic verification protocol by which EPA’s Environmental Technology Verification program tests newly developed equipment for distributed generation of electric power, usually micro-turbine generators and internal combustion engine generators. The protocol will ...

  8. An Efficient Location Verification Scheme for Static Wireless Sensor Networks.

    PubMed

    Kim, In-Hwan; Kim, Bo-Sung; Song, JooSeok

    2017-01-24

    In wireless sensor networks (WSNs), the accuracy of location information is vital to support many interesting applications. Unfortunately, sensors have difficulty in estimating their location when malicious sensors attack the location estimation process. Even though secure localization schemes have been proposed to protect the location estimation process from attacks, they are not enough to eliminate wrong location estimations in some situations. Location verification can be the solution to these situations, or serve as a second line of defense. The problem with most location verification schemes is the explicit involvement of many sensors in the verification process and requirements such as special hardware, a dedicated verifier and a trusted third party, which cause extra communication and computation overhead. In this paper, we propose an efficient location verification scheme for static WSNs called mutually-shared region-based location verification (MSRLV), which reduces those overheads by utilizing the implicit involvement of sensors and eliminating several requirements. In order to achieve this, we use the mutually-shared region between the location claimant and the verifier for location verification. The analysis shows that MSRLV reduces communication overhead by 77% and computation overhead by 92% on average when compared with the other location verification schemes in a single sensor verification. In addition, simulation results for the verification of the whole network show that MSRLV can detect the malicious sensors by over 90% when sensors in the network have five or more neighbors.
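    As an illustrative geometric sketch of the mutually-shared-region idea (not the paper's actual MSRLV protocol): a claimed location is plausible only if it lies within radio range of both the verifier and a neighbor the two nodes share.

```python
import math

# Toy shared-region membership test. The radio range r and the idea of
# checking the claim against verifier and shared neighbor are assumed
# for illustration; MSRLV's real message flow is more involved.

def in_shared_region(claim, verifier, neighbor, r=10.0):
    d = lambda p, q: math.hypot(p[0] - q[0], p[1] - q[1])
    return d(claim, verifier) <= r and d(claim, neighbor) <= r

print(in_shared_region(claim=(3, 4), verifier=(0, 0), neighbor=(6, 0)))   # True
print(in_shared_region(claim=(20, 4), verifier=(0, 0), neighbor=(6, 0)))  # False
```

    A claim rejected by this test is geometrically inconsistent with the two nodes' shared coverage and would be flagged without involving additional sensors.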

  9. An Efficient Location Verification Scheme for Static Wireless Sensor Networks

    PubMed Central

    Kim, In-hwan; Kim, Bo-sung; Song, JooSeok

    2017-01-01

    In wireless sensor networks (WSNs), the accuracy of location information is vital to support many interesting applications. Unfortunately, sensors have difficulty in estimating their location when malicious sensors attack the location estimation process. Even though secure localization schemes have been proposed to protect the location estimation process from attacks, they are not sufficient to eliminate wrong location estimates in all situations. Location verification can resolve these situations or serve as a second line of defense. The problem with most location verification schemes is the explicit involvement of many sensors in the verification process, together with requirements such as special hardware, a dedicated verifier, or a trusted third party, all of which cause additional communication and computation overhead. In this paper, we propose an efficient location verification scheme for static WSNs called mutually-shared region-based location verification (MSRLV), which reduces these overheads by exploiting the implicit involvement of sensors and eliminating several requirements. To achieve this, we use the mutually-shared region between the location claimant and the verifier for location verification. Analysis shows that MSRLV reduces communication overhead by 77% and computation overhead by 92% on average, compared with other location verification schemes, in a single sensor verification. In addition, simulation results for verification of the whole network show that MSRLV can detect over 90% of malicious sensors when sensors in the network have five or more neighbors. PMID:28125007

  10. Generic Verification Protocol for Verification of Online Turbidimeters

    EPA Science Inventory

    This protocol provides generic procedures for implementing a verification test for the performance of online turbidimeters. The verification tests described in this document will be conducted under the Environmental Technology Verification (ETV) Program. Verification tests will...

  11. Application verification research of cloud computing technology in the field of real time aerospace experiment

    NASA Astrophysics Data System (ADS)

    Wan, Junwei; Chen, Hongyan; Zhao, Jing

    2017-08-01

    To meet the real-time, reliability, and safety requirements of aerospace experiments, a single-center cloud computing application verification platform was constructed. At the IaaS level, the feasibility of applying cloud computing technology to the field of aerospace experiments was tested and verified. Based on analysis of the test results, a preliminary conclusion is drawn: a cloud computing platform can be applied to compute-intensive aerospace experiment workloads. For I/O-intensive workloads, traditional physical machines are recommended.

  12. Verification, Validation and Sensitivity Studies in Computational Biomechanics

    PubMed Central

    Anderson, Andrew E.; Ellis, Benjamin J.; Weiss, Jeffrey A.

    2012-01-01

    Computational techniques and software for the analysis of problems in mechanics have naturally moved from their origins in the traditional engineering disciplines to the study of cell, tissue and organ biomechanics. Increasingly complex models have been developed to describe and predict the mechanical behavior of such biological systems. While the availability of advanced computational tools has led to exciting research advances in the field, the utility of these models is often the subject of criticism due to inadequate model verification and validation. The objective of this review is to present the concepts of verification, validation and sensitivity studies with regard to the construction, analysis and interpretation of models in computational biomechanics. Specific examples from the field are discussed. It is hoped that this review will serve as a guide to the use of verification and validation principles in the field of computational biomechanics, thereby improving the peer acceptance of studies that use computational modeling techniques. PMID:17558646

  13. Hierarchical Representation Learning for Kinship Verification.

    PubMed

    Kohli, Naman; Vatsa, Mayank; Singh, Richa; Noore, Afzel; Majumdar, Angshul

    2017-01-01

    Kinship verification has a number of applications, such as organizing large collections of images and recognizing resemblances among humans. In this paper, first, a human study is conducted to understand the capabilities of the human mind and to identify the discriminatory areas of a face that provide kinship cues. The visual stimuli presented to the participants determine their ability to recognize kin relationships using the whole face as well as specific facial regions. The effects of participant gender and age and of the kin-relation pair of the stimulus are analyzed using quantitative measures such as accuracy, discriminability index d', and perceptual information entropy. Utilizing the information obtained from the human study, a hierarchical kinship verification via representation learning (KVRL) framework is utilized to learn the representation of different face regions in an unsupervised manner. We propose a novel approach for feature representation termed filtered contractive deep belief networks (fcDBN). The proposed feature representation encodes relational information present in images using filters and a contractive regularization penalty. A compact representation of facial images of kin is extracted as an output from the learned model, and a multi-layer neural network is utilized to verify kinship accurately. A new WVU kinship database is created, which consists of multiple images per subject to facilitate kinship verification. The results show that the proposed deep learning framework (KVRL-fcDBN) yields state-of-the-art kinship verification accuracy on the WVU kinship database and on four existing benchmark data sets. Furthermore, kinship information is used as a soft biometric modality to boost the performance of face verification via product-of-likelihood-ratio and support vector machine based approaches. Using the proposed KVRL-fcDBN framework, an improvement of over 20% is observed in the performance of face verification.

  14. Formal verification of mathematical software

    NASA Technical Reports Server (NTRS)

    Sutherland, D.

    1984-01-01

    Methods are investigated for formally specifying and verifying the correctness of mathematical software (software which uses floating point numbers and arithmetic). Previous work in the field was reviewed. A new model of floating point arithmetic called the asymptotic paradigm was developed and formalized. Two different conceptual approaches to program verification, the classical Verification Condition approach and the more recently developed Programming Logic approach, were adapted to use the asymptotic paradigm. These approaches were then used to verify several programs; the programs chosen were simplified versions of actual mathematical software.

  15. Comparative evaluation of Kodak EDR2 and XV2 films for verification of intensity modulated radiation therapy.

    PubMed

    Dogan, Nesrin; Leybovich, Leonid B; Sethi, Anil

    2002-11-21

    Film dosimetry provides a convenient tool to determine dose distributions, especially for verification of IMRT plans. However, the film response to radiation shows a significant dependence on depth, energy and field size that compromises the accuracy of measurements. Kodak's XV2 film has a low saturation dose (approximately 100 cGy) and, consequently, a relatively short region of linear dose-response. The recently introduced Kodak extended range EDR2 film was reported to have a linear dose-response region extending to 500 cGy. This increased dose range may be particularly useful in the verification of IMRT plans. In this work, the dependence of Kodak EDR2 film's response on depth, field size and energy was evaluated and compared with Kodak XV2 film. Co-60, 6 MV, 10 MV and 18 MV beams were used. Field sizes were 2 x 2, 6 x 6, 10 x 10, 14 x 14, 18 x 18 and 24 x 24 cm2. Doses for XV2 and EDR2 films were 80 cGy and 300 cGy, respectively. Optical density was converted to dose using depth-corrected sensitometric (Hurter and Driffield, or H&D) curves. For each field size, XV2 and EDR2 depth-dose curves were compared with ion chamber depth-dose curves. Both films demonstrated similar (within 1%) field size dependence. The deviation from the ion chamber for both films was small for fields ranging from 2 x 2 to 10 x 10 cm2: < or =2% for 6, 10 and 18 MV beams. No deviation was observed for the Co-60 beam. As the field size increased to 24 x 24 cm2, the deviation became significant for both films: approximately 7.5% for Co-60, approximately 5% for 6 MV and 10 MV, and approximately 6% for 18 MV. During the verification of IMRT plans, EDR2 film showed better agreement with the calculated dose distributions than the XV2 film.
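
The sensitometric (H&D) conversion described above amounts to interpolating a measured calibration curve from net optical density to dose. A minimal sketch; all calibration numbers below are invented placeholders, not data from the paper:

```python
import numpy as np

# Hypothetical depth-corrected H&D calibration points:
# net optical density measured at known delivered doses (cGy).
cal_dose = np.array([0.0, 50.0, 100.0, 200.0, 300.0, 400.0, 500.0])  # cGy
cal_od   = np.array([0.0, 0.35, 0.62, 1.05, 1.38, 1.63, 1.82])

def od_to_dose(od):
    """Convert measured optical density to dose (cGy) by monotone
    piecewise-linear interpolation of the calibration curve."""
    return np.interp(od, cal_od, cal_dose)

print(od_to_dose(0.62))  # a calibration knot maps back to its dose, 100.0
```

In practice a separate curve (or depth correction) is used per beam energy and depth, which is why the depth dependence of the film response matters.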

  16. Verification of NWP Cloud Properties using A-Train Satellite Observations

    NASA Astrophysics Data System (ADS)

    Kucera, P. A.; Weeks, C.; Wolff, C.; Bullock, R.; Brown, B.

    2011-12-01

    Recently, the NCAR Model Evaluation Tools (MET) has been enhanced to incorporate satellite observations for the verification of Numerical Weather Prediction (NWP) cloud products. We have developed tools that match fields spatially (both in the vertical and horizontal dimensions) to compare NWP products with satellite observations. These matched fields provide diagnostic evaluation of cloud macro attributes such as vertical distribution of clouds, cloud top height, and the spatial and seasonal distribution of cloud fields. For this research study, we have focused on using CloudSat, CALIPSO, and MODIS observations to evaluate cloud fields for a variety of NWP fields and derived products. We have selected cases ranging from large, mid-latitude synoptic systems to well-organized tropical cyclones. For each case, we matched the observed cloud field with gridded model and/or derived product fields. CloudSat and CALIPSO observations and model fields were matched and compared in the vertical along the orbit track. MODIS data and model fields were matched and compared in the horizontal. We then use MET to compute the verification statistics to quantify the performance of the models in representing the cloud fields. In this presentation we will give a summary of our comparison and show verification results for both synoptic and tropical cyclone cases.
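
Once forecast and observed cloud fields are matched onto a common grid, categorical verification statistics of the kind MET computes reduce to a 2x2 contingency table. A minimal sketch for binary cloud masks (function name and inputs are illustrative, not MET's API):

```python
import numpy as np

def cloud_verification_stats(forecast_mask, observed_mask):
    """Standard contingency-table scores for matched binary cloud masks:
    probability of detection (POD), false alarm ratio (FAR),
    and critical success index (CSI)."""
    f = np.asarray(forecast_mask, bool)
    o = np.asarray(observed_mask, bool)
    hits = np.sum(f & o)
    misses = np.sum(~f & o)
    false_alarms = np.sum(f & ~o)
    pod = hits / (hits + misses)
    far = false_alarms / (hits + false_alarms)
    csi = hits / (hits + misses + false_alarms)
    return pod, far, csi
```

The same table underlies verification along a CloudSat/CALIPSO orbit track (vertical slices) and on a MODIS horizontal grid; only the matching step differs.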

  17. A Survey of Measurement, Mitigation, and Verification Field Technologies for Carbon Sequestration Geologic Storage

    NASA Astrophysics Data System (ADS)

    Cohen, K. K.; Klara, S. M.; Srivastava, R. D.

    2004-12-01

    The U.S. Department of Energy's (U.S. DOE's) Carbon Sequestration Program is developing state-of-the-science technologies for measurement, mitigation, and verification (MM&V) in field operations of geologic sequestration. MM&V of geologic carbon sequestration operations will play an integral role in the pre-injection, injection, and post-injection phases of carbon capture and storage projects to reduce anthropogenic greenhouse gas emissions. Effective MM&V is critical to the success of CO2 storage projects and will be used by operators, regulators, and stakeholders to ensure safe and permanent storage of CO2. In the U.S. DOE's program, carbon sequestration MM&V has numerous instrumental roles: measurement of a site's characteristics and capability for sequestration; monitoring of the site to ensure storage integrity; verification that the CO2 is safely stored; and protection of ecosystems. Other drivers for MM&V technology development include cost-effectiveness, measurement precision, and the frequency of measurements required. As sequestration operations are implemented in the future, it is anticipated that measurements over long time periods and at different scales will be required; this will present a significant challenge. MM&V sequestration technologies generally utilize one of the following approaches: below-ground measurements; surface/near-surface measurements; aerial and satellite imagery; and modeling/simulations. Advanced subsurface geophysical technologies will play a primary role in MM&V. It is likely that successful MM&V programs will incorporate multiple technologies, including but not limited to: reservoir modeling and simulations; geophysical techniques (a wide variety of seismic methods, microgravity, electrical, and electromagnetic techniques); subsurface fluid movement monitoring methods such as injection of tracers, borehole and wellhead pressure sensors, and tiltmeters; and surface/near-surface methods such as soil gas monitoring and infrared

  18. Control of embankment settlement field verification on PCPT prediction methods.

    DOT National Transportation Integrated Search

    2011-07-01

    Piezocone penetration tests (PCPT) have been widely used by geotechnical engineers for subsurface investigation and evaluation of different soil properties such as strength and deformation characteristics of the soil. This report focuses on the verif...

  19. Commissioning results of an automated treatment planning verification system

    PubMed Central

    Mason, Bryan E.; Robinson, Ronald C.; Kisling, Kelly D.; Kirsner, Steven M.

    2014-01-01

    A dose calculation verification system (VS) was acquired and commissioned as a second check on the treatment planning system (TPS). This system reads DICOM CT datasets, RT plans, RT structures, and RT dose from the TPS and automatically, using its own collapsed cone superposition/convolution algorithm, computes dose on the same CT dataset. The system was commissioned by extracting basic beam parameters for simple field geometries and by dose verification for complex treatments. Percent depth doses (PDD) and profiles were extracted for field sizes using jaw settings from 3 × 3 cm2 to 40 × 40 cm2 and compared to measured data, as well as to our TPS model. Smaller fields of 1 × 1 cm2 and 2 × 2 cm2 generated using the multileaf collimator (MLC) were analyzed in the same fashion as the open fields. In addition, 40 patient plans consisting of both IMRT and VMAT were computed and the following comparisons were made: 1) TPS to the VS, 2) VS to measured data, and 3) TPS to measured data, where measured data comprise both ion chamber (IC) and film measurements. Our results indicated that, for all field sizes using jaw settings, PDD errors for the VS were on average less than 0.87%, 1.38%, and 1.07% for 6x, 15x, and 18x, respectively, relative to measured data. PDD errors for MLC field sizes were less than 2.28%, 1.02%, and 2.23% for 6x, 15x, and 18x, respectively. The infield profile analysis yielded results of less than 0.58% for 6x, 0.61% for 15x, and 0.77% for 18x for the VS relative to measured data. Analysis of the penumbra region yielded results ranging from 66.5% of points meeting the DTA criteria up to 100% of points for smaller field sizes for all energies. Analysis of profile data for field sizes generated using the MLC showed infield DTA agreement ranging from 68.8% to 100% of points passing the 1.5%/1.5 mm criteria. Results from the dose verification for IMRT and VMAT beams indicated that, on average, the ratio of TPS to IC and VS to IC measurements was
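
A PDD comparison of the kind described above amounts to normalizing both curves at a reference depth and differencing them. A minimal sketch with hypothetical inputs (the 10 cm normalization depth is an assumption for illustration, not stated in the abstract):

```python
import numpy as np

def pdd_percent_error(pdd_test, pdd_ref, depths_cm, norm_depth_cm=10.0):
    """Difference (in percentage points) between a test PDD curve and a
    reference (e.g. ion chamber) curve, both renormalized to 100% at
    `norm_depth_cm` via linear interpolation."""
    depths = np.asarray(depths_cm, float)
    t = np.asarray(pdd_test, float)
    r = np.asarray(pdd_ref, float)
    t_norm = t / np.interp(norm_depth_cm, depths, t) * 100.0
    r_norm = r / np.interp(norm_depth_cm, depths, r) * 100.0
    return t_norm - r_norm
```

Identical curves yield zero error at every depth; the per-energy error bounds quoted above would be the maximum absolute value of this array over the depths of interest.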

  20. Verification and transfer of thermal pollution model. Volume 5: Verification of 2-dimensional numerical model

    NASA Technical Reports Server (NTRS)

    Lee, S. S.; Sengupta, S.; Nwadike, E. V.

    1982-01-01

    The six-volume report describes the theory of a three-dimensional (3-D) mathematical thermal discharge model and a related one-dimensional (1-D) model, includes model verification at two sites, and provides a separate user's manual for each model. The 3-D model has two forms: free surface and rigid lid. The former, verified at Anclote Anchorage (FL), allows a free air/water interface and is suited for significant surface wave heights compared to mean water depth; e.g., estuaries and coastal regions. The latter, verified at Lake Keowee (SC), is suited for small surface wave heights compared to depth (e.g., natural or man-made inland lakes) because surface elevation has been removed as a parameter. These models allow computation of time-dependent velocity and temperature fields for given initial conditions and time-varying boundary conditions.

  1. The experimental verification of a streamline curvature numerical analysis method applied to the flow through an axial flow fan

    NASA Technical Reports Server (NTRS)

    Pierzga, M. J.

    1981-01-01

    The experimental verification of an inviscid, incompressible through-flow analysis method is presented. The primary component of this method is an axisymmetric streamline curvature technique which is used to compute the hub-to-tip flow field of a given turbomachine. To analyze the flow field in the blade-to-blade plane of the machine, the potential flow solution of an infinite cascade of airfoils is also computed using a source model technique. To verify the accuracy of such an analysis method an extensive experimental verification investigation was conducted using an axial flow research fan. Detailed surveys of the blade-free regions of the machine along with intra-blade surveys using rotating pressure sensing probes and blade surface static pressure taps provide a one-to-one relationship between measured and predicted data. The results of this investigation indicate the ability of this inviscid analysis method to predict the design flow field of the axial flow fan test rotor to within a few percent of the measured values.

  2. Field test of short-notice random inspections for inventory-change verification at a low-enriched-uranium fuel-fabrication plant: Preliminary summary

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Fishbone, L.G.; Moussalli, G.; Naegele, G.

    1994-04-01

    An approach of short-notice random inspections (SNRIs) for inventory-change verification can enhance the effectiveness and efficiency of international safeguards at natural or low-enriched uranium (LEU) fuel fabrication plants. According to this approach, the plant operator declares the contents of nuclear material items before knowing whether an inspection will occur to verify them. Additionally, items about which declarations are newly made should remain available for verification for an agreed time. This report details a six-month field test of the feasibility of such SNRIs, which took place at the Westinghouse Electric Corporation Commercial Nuclear Fuel Division. Westinghouse personnel made daily declarations about both feed and product items, uranium hexafluoride cylinders and finished fuel assemblies, using a custom-designed computer "mailbox". Safeguards inspectors from the IAEA conducted eight SNRIs to verify these declarations. Items from both strata were verified during the SNRIs by means of nondestructive assay equipment. The field test demonstrated the feasibility and practicality of key elements of the SNRI approach for a large LEU fuel fabrication plant.
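
The statistical leverage of random inspections can be illustrated with standard attribute-sampling arithmetic (textbook safeguards math, not a formula from this report): the probability that verifying n randomly chosen items out of N hits at least one of d falsified items is hypergeometric.

```python
from math import comb

def detection_probability(N, d, n):
    """P(at least one of d falsified items appears in a simple random
    sample of n items drawn without replacement from N).
    Standard attribute-sampling result: 1 - C(N-d, n) / C(N, n)."""
    if n > N - d:
        return 1.0  # sample too large to miss all falsified items
    return 1.0 - comb(N - d, n) / comb(N, n)

# e.g. verifying 1 of 100 items when 10 are falsified detects with p = 0.1
print(detection_probability(100, 10, 1))
```

Because the operator declares items before knowing whether an inspection will occur, every declaration is exposed to this detection probability, which is what gives short-notice randomization its deterrent value.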

  3. Monitoring/Verification using DMS: TATP Example

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Stephan Weeks, Kevin Kyle, Manuel Manard

    Field-rugged and field-programmable differential mobility spectrometry (DMS) networks provide highly selective, universal monitoring of vapors and aerosols at detectable levels from persons or areas involved with illicit chemical/biological/explosives (CBE) production. CBE sensor motes used in conjunction with automated fast gas chromatography with DMS detection (GC/DMS) verification instrumentation integrated into situational operations-management systems can be readily deployed and optimized for changing application scenarios. The feasibility of developing selective DMS motes for a “smart dust” sampling approach with guided, highly selective, fast GC/DMS verification analysis is a compelling approach to minimize or prevent the illegal use of explosives or chemical and biological materials. DMS is currently one of the foremost emerging technologies for field separation and detection of gas-phase chemical species. This is due to trace-level detection limits, high selectivity, and small size. Fast GC is the leading field analytical method for gas-phase separation of chemical species in complex mixtures. Low-thermal-mass GC columns have led to compact, low-power field systems capable of complete analyses in 15–300 seconds. A collaborative effort optimized a handheld, fast GC/DMS, equipped with a non-rad ionization source, for peroxide-based explosive measurements.

  4. Overview of RICOR's reliability theoretical analysis, accelerated life demonstration test results and verification by field data

    NASA Astrophysics Data System (ADS)

    Vainshtein, Igor; Baruch, Shlomi; Regev, Itai; Segal, Victor; Filis, Avishai; Riabzev, Sergey

    2018-05-01

    The growing demand for EO applications that operate around the clock, 24 hours a day and 7 days a week, such as border surveillance systems, emphasizes the need for a highly reliable cryocooler with increased operational availability and optimized system Integrated Logistic Support (ILS). To meet this need, RICOR developed linear and rotary cryocoolers that successfully achieved this goal. Cryocooler MTTF was analyzed by theoretical reliability evaluation methods, demonstrated by normal and accelerated life tests at the cryocooler level, and finally verified by analysis of field data from cryocoolers operating at the system level. The following paper reviews theoretical reliability analysis methods together with reliability test results derived from standard and accelerated life demonstration tests performed at RICOR's advanced reliability laboratory. To summarize the work process, reliability verification data are presented as feedback from fielded systems.
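
Accelerated life demonstration tests of the kind described are commonly reduced to an MTTF estimate via an acceleration factor such as the Arrhenius temperature model. The sketch below shows only these standard textbook formulas, not RICOR's specific methodology; the activation energy, temperatures, and test hours are placeholders:

```python
import math

K_BOLTZMANN_EV = 8.617e-5  # Boltzmann constant, eV/K

def arrhenius_af(ea_ev, t_use_c, t_stress_c):
    """Arrhenius acceleration factor between a stress temperature and
    the use temperature: AF = exp(Ea/k * (1/T_use - 1/T_stress))."""
    t_use = t_use_c + 273.15
    t_stress = t_stress_c + 273.15
    return math.exp(ea_ev / K_BOLTZMANN_EV * (1.0 / t_use - 1.0 / t_stress))

def mttf_from_test(total_unit_hours, failures, af=1.0):
    """Point estimate of use-condition MTTF from an accelerated test:
    accumulated device-hours scaled by AF, divided by observed failures."""
    return total_unit_hours * af / max(failures, 1)

# e.g. 10,000 accumulated test hours, 2 failures, acceleration factor 5
print(mttf_from_test(10000.0, 2, af=5.0))  # 25000.0 hours
```

Field-data verification then consists of comparing such test-derived estimates against MTTF computed from fielded-system operating hours and failure counts.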

  5. Verification and transfer of thermal pollution model. Volume 3: Verification of 3-dimensional rigid-lid model

    NASA Technical Reports Server (NTRS)

    Lee, S. S.; Sengupta, S.; Nwadike, E. V.; Sinha, S. K.

    1982-01-01

    The six-volume report describes the theory of a three-dimensional (3-D) mathematical thermal discharge model and a related one-dimensional (1-D) model, includes model verification at two sites, and provides a separate user's manual for each model. The 3-D model has two forms: free surface and rigid lid. The former, verified at Anclote Anchorage (FL), allows a free air/water interface and is suited for significant surface wave heights compared to mean water depth; e.g., estuaries and coastal regions. The latter, verified at Lake Keowee (SC), is suited for small surface wave heights compared to depth (e.g., natural or man-made inland lakes) because surface elevation has been removed as a parameter. These models allow computation of time-dependent velocity and temperature fields for given initial conditions and time-varying boundary conditions. The free-surface model also provides surface height variations with time.

  6. Dosimetric verification of small fields in the lung using lung-equivalent polymer gel and Monte Carlo simulation.

    PubMed

    Gharehaghaji, Nahideh; Dadgar, Habib Alah

    2018-01-01

    The main purpose of this study was to evaluate a polymer gel dosimeter (PGD) for three-dimensional verification of dose distributions in the lung, called lung-equivalent gel (LEG), and to compare its results with the Monte Carlo (MC) method. In the present study, to achieve lung density in the PGD, the gel is beaten until foam is obtained, and sodium dodecyl sulfate is then added as a surfactant to stabilize the foam. The foam gel was irradiated with a 1 cm × 1 cm field in the 6 MV photon beam of an ONCOR SIEMENS LINAC, along the central axis of the gel. The LEG was then scanned on a 1.5 Tesla magnetic resonance imaging scanner after irradiation using a multiple-spin echo sequence. The R2 relaxation rates were derived by least-squares fitting of the pixel values from 32 consecutive images with a single exponential decay function. Moreover, the 6 and 18 MV photon beams of the ONCOR SIEMENS LINAC were simulated using the MCNPX MC code. The MC model was used to calculate depth dose in water and in low-density water resembling soft tissue and lung, respectively. Percentages of dose reduction in the lung region relative to the homogeneous phantom for the 6 MV photon beam were 44.6%, 39%, 13%, and 7% for 0.5 cm × 0.5 cm, 1 cm × 1 cm, 2 cm × 2 cm, and 3 cm × 3 cm fields, respectively. For the 18 MV photon beam, the results were 82%, 69%, 46%, and 25.8% for the same field sizes. Preliminary results show good agreement between the depth dose measured with the LEG and the depth dose calculated using the MCNP code. Our study showed that the dose reduction with small fields in the lung was very high. Thus, inaccurate prediction of absorbed dose inside the lung, and at lung/soft-tissue interfaces, with small photon beams may lead to critical consequences for treatment outcome.
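
The single-exponential R2 fit described above, S(TE) = S0 * exp(-R2 * TE), can be performed per pixel by linear least squares on the log-signal. A minimal sketch with synthetic data (echo spacing and signal values are illustrative, not from the study):

```python
import numpy as np

def fit_r2(echo_times_ms, signal):
    """Fit S(TE) = S0 * exp(-R2 * TE) to a multi-echo signal by linear
    least squares on log(S); returns R2 in 1/ms."""
    te = np.asarray(echo_times_ms, float)
    log_s = np.log(np.asarray(signal, float))
    slope, _intercept = np.polyfit(te, log_s, 1)
    return -slope

# Synthetic 32-echo decay with R2 = 0.05 /ms recovers the input rate.
te = np.arange(10, 330, 10)          # 32 echo times, ms
s = 1000.0 * np.exp(-0.05 * te)
print(fit_r2(te, s))
```

Noisy data would typically use weighted or nonlinear least squares instead of the log-linear shortcut, but the dose map in either case comes from calibrating R2 against known delivered doses.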

  7. Generic Verification Protocol for Testing Pesticide Application Spray Drift Reduction Technologies for Row and Field Crops

    EPA Pesticide Factsheets

    This generic verification protocol provides a detailed method to conduct and report results from a verification test of pesticide application technologies that can be used to evaluate these technologies for their potential to reduce spray drift.

  8. Verification of the isotopic composition of precipitation simulated by a regional isotope circulation model over Japan.

    PubMed

    Tanoue, Masahiro; Ichiyanagi, Kimpei; Yoshimura, Kei

    2016-01-01

    The isotopic composition (δ(18)O and δ(2)H) of precipitation simulated by a regional isotope circulation model with horizontal resolutions of 10, 30 and 50 km was compared with observations at 56 sites over Japan in 2013. All simulations produced reasonable spatio-temporal variations in δ(18)O in precipitation over Japan, except in January. In January, simulated δ(18)O values in precipitation were higher than observed values on the Pacific side of Japan, especially during an explosively developing extratropical cyclone event. This was attributed to the parameterisation of precipitation formation, in which a large fraction of precipitated water is converted to liquid detrained water in the lower troposphere. As a result, most of the water vapour transported from the Sea of Japan precipitated on the Sea of Japan side. The isotopic composition of precipitation proved a useful verification tool for the parameterisation of precipitation formation, as well as for large-scale moisture transport processes, in the regional isotope circulation model.
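
For reference, the δ notation used above expresses an isotope ratio relative to a standard: δ = (R_sample / R_standard - 1) × 1000, in per mil. A minimal sketch; the VSMOW 18O/16O ratio below is the commonly quoted literature value, included only for illustration:

```python
VSMOW_R18 = 2.0052e-3  # 18O/16O ratio of the VSMOW standard (literature value)

def delta18o_permil(r_sample):
    """delta-18O in per mil relative to VSMOW:
    delta = (R_sample / R_standard - 1) * 1000."""
    return (r_sample / VSMOW_R18 - 1.0) * 1000.0

print(delta18o_permil(VSMOW_R18))  # the standard itself has delta = 0.0
```

Negative δ(18)O values, typical of mid-latitude winter precipitation, simply mean the sample is depleted in the heavy isotope relative to VSMOW.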

  9. Monitoring/Verification Using DMS: TATP Example

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kevin Kyle; Stephan Weeks

    Field-rugged and field-programmable differential mobility spectrometry (DMS) networks provide highly selective, universal monitoring of vapors and aerosols at detectable levels from persons or areas involved with illicit chemical/biological/explosives (CBE) production. CBE sensor motes used in conjunction with automated fast gas chromatography with DMS detection (GC/DMS) verification instrumentation integrated into situational operations-management systems can be readily deployed and optimized for changing application scenarios. The feasibility of developing selective DMS motes for a “smart dust” sampling approach with guided, highly selective, fast GC/DMS verification analysis is a compelling approach to minimize or prevent the illegal use of explosives or chemical and biological materials. DMS is currently one of the foremost emerging technologies for field separation and detection of gas-phase chemical species. This is due to trace-level detection limits, high selectivity, and small size. GC is the leading analytical method for the separation of chemical species in complex mixtures. Low-thermal-mass GC columns have led to compact, low-power field systems capable of complete analyses in 15–300 seconds. A collaborative effort optimized a handheld, fast GC/DMS, equipped with a non-rad ionization source, for peroxide-based explosive measurements.

  10. Feasibility study on dosimetry verification of volumetric-modulated arc therapy-based total marrow irradiation.

    PubMed

    Liang, Yun; Kim, Gwe-Ya; Pawlicki, Todd; Mundt, Arno J; Mell, Loren K

    2013-03-04

    The purpose of this study was to develop dosimetry verification procedures for volumetric-modulated arc therapy (VMAT)-based total marrow irradiation (TMI). VMAT-based TMI plans were generated for three patients: one child and two adults. The planning target volume (PTV) was defined as the bony skeleton, from head to mid-femur, with a 3 mm margin. A planning strategy similar to published studies was adopted. The PTV was divided into head and neck, chest, and pelvic regions, each with a separate plan composed of 2-3 arcs/fields. Multiple isocenters were evenly distributed along the patient's axial direction. The focus of this study was to establish a dosimetry quality assurance procedure involving both two-dimensional (2D) and three-dimensional (3D) volumetric verification, which is desirable for a large PTV treated with multiple isocenters. The 2D dose verification was performed with film for gamma evaluation, and absolute point dose was measured with an ion chamber, with attention to hot/cold spots at the junctions between neighboring plans. The 3D volumetric dose verification used commercial dose reconstruction software to reconstruct dose from electronic portal imaging device (EPID) images. The gamma evaluation criteria in both 2D and 3D verification were 5% absolute point dose difference and 3 mm distance to agreement. With film dosimetry, the overall average gamma passing rate was 98.2% and the absolute dose difference was 3.9% in junction areas among the test patients; with volumetric portal dosimetry, the corresponding numbers were 90.7% and 2.4%. A dosimetry verification procedure involving both 2D and 3D measurements was developed for VMAT-based TMI. The initial results are encouraging and warrant further investigation in clinical trials.
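
The gamma evaluation used for such 2D film comparisons combines a dose-difference criterion with a distance-to-agreement (DTA) search. A brute-force sketch of a simplified global 2D gamma passing rate, matching the 5%/3 mm criteria quoted above (this is an illustrative implementation, not the commercial software used in the study):

```python
import numpy as np

def gamma_pass_rate(measured, calculated, spacing_mm, dose_tol=0.05, dta_mm=3.0):
    """Simplified global 2D gamma passing rate (brute force).
    dose_tol is a fraction of the calculated maximum (0.05 = 5%);
    a measured point passes if min over the calculated grid of
    sqrt(dist^2/dta^2 + dose_diff^2/tol^2) is <= 1."""
    m = np.asarray(measured, float)
    c = np.asarray(calculated, float)
    dd_norm = dose_tol * c.max()
    ny, nx = m.shape
    ys, xs = np.mgrid[0:ny, 0:nx]
    passed = 0
    for i in range(ny):
        for j in range(nx):
            dist2 = ((ys - i) ** 2 + (xs - j) ** 2) * spacing_mm ** 2
            dose2 = (c - m[i, j]) ** 2
            gamma2 = dist2 / dta_mm ** 2 + dose2 / dd_norm ** 2
            if gamma2.min() <= 1.0:
                passed += 1
    return passed / m.size
```

Production implementations interpolate the calculated grid and restrict the search radius for speed, but the pass-rate figures quoted above (98.2% film, 90.7% EPID) are exactly this fraction of points with gamma <= 1.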

  11. Polar Field Reversals and Active Region Decay

    NASA Astrophysics Data System (ADS)

    Petrie, Gordon; Ettinger, Sophie

    2017-09-01

    We study the relationship between polar field reversals and decayed active region magnetic flux. Photospheric active region flux is dispersed by differential rotation and turbulent diffusion, and is transported poleward by meridional flows and diffusion. We summarize the published evidence from observation and modeling of the influence of meridional flow variations and of the spatial distribution of decaying active region flux, such as the Joy's law tilt angle. Using NSO Kitt Peak synoptic magnetograms covering cycles 21-24, we investigate in detail the relationship between the transport of decayed active region flux to high latitudes and changes in the polar field strength, including reversals in the magnetic polarity at the poles. By means of stack plots of low- and high-latitude slices of the synoptic magnetograms, the dispersal of flux from low to high latitudes is tracked, and the timing of this dispersal is compared to the polar field changes. In the most abrupt cases of polar field reversal, a few activity complexes (systems of active regions) are identified as the main cause. The poleward transport of large quantities of decayed trailing-polarity flux from these complexes is found to correlate well in time with the abrupt polar field changes. In each case, significant latitudinal displacements were found between the positive and negative flux centroids of the complexes, consistent with Joy's law bipole tilt with trailing-polarity flux located poleward of leading-polarity flux. The activity complexes of the cycle 21 and 22 maxima were larger and longer-lived than those of the cycle 23 and 24 maxima, and the poleward surges were stronger and more unipolar and the polar field changes larger and faster. The cycle 21 and 22 polar reversals were dominated by only a few long-lived complexes whereas the cycle 23 and 24 reversals were the cumulative effects of more numerous, shorter-lived regions. We conclude that sizes and lifetimes of activity complexes are key to
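
The latitudinal displacement between polarity centroids mentioned above is a flux-weighted average latitude, computed separately for each polarity. A minimal sketch (inputs are illustrative, not Kitt Peak data):

```python
import numpy as np

def flux_centroid_latitude(latitudes_deg, flux):
    """Flux-weighted centroid latitude of one polarity in a magnetogram
    slice; `flux` holds the (signed or unsigned) flux of that polarity."""
    lat = np.asarray(latitudes_deg, float)
    f = np.abs(np.asarray(flux, float))
    return np.sum(lat * f) / np.sum(f)

# Equal flux at 10 and 20 degrees puts the centroid at 15 degrees.
print(flux_centroid_latitude([10.0, 20.0], [1.0, 1.0]))
```

Computing this separately for leading- and trailing-polarity flux of a complex and differencing the two centroids gives the latitudinal displacement that signals Joy's-law tilt.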

  12. Verification of intensity modulated radiation therapy beams using a tissue equivalent plastic scintillator dosimetry system

    NASA Astrophysics Data System (ADS)

    Petric, Martin Peter

    This thesis describes the development and implementation of a novel method for the dosimetric verification of intensity modulated radiation therapy (IMRT) fields with several advantages over current techniques. Through the use of a tissue equivalent plastic scintillator sheet viewed by a charge-coupled device (CCD) camera, this method provides a truly tissue equivalent dosimetry system capable of efficiently and accurately performing field-by-field verification of IMRT plans. This work was motivated by an initial study comparing two IMRT treatment planning systems. The clinical functionality of BrainLAB's BrainSCAN and Varian's Helios IMRT treatment planning systems was compared in terms of implementation and commissioning, dose optimization, and plan assessment. Implementation and commissioning revealed differences in the beam data required to characterize the beam prior to use, with the BrainSCAN system requiring higher resolution data than Helios. This difference was found to affect the ability of the systems to accurately calculate dose for highly modulated fields, with BrainSCAN being more successful than Helios. The dose optimization and plan assessment comparisons revealed that while the two systems use considerably different optimization algorithms and user-control interfaces, they are both capable of producing substantially equivalent dose plans. The extensive use of dosimetric verification techniques in the IMRT treatment planning comparison study motivated the development and implementation of a novel IMRT dosimetric verification system. The system consists of a water-filled phantom with a tissue equivalent plastic scintillator sheet built into the top surface. Scintillation light is reflected by a plastic mirror within the phantom towards a viewing window, where it is captured using a CCD camera. Optical photon spread is removed using a micro-louvre optical collimator and by deconvolving a glare kernel from the raw images. Characterization of this

  13. Verification of the Rigidity of the Coulomb Field in Motion

    NASA Astrophysics Data System (ADS)

    Blinov, S. V.; Bulyzhenkov, I. É.

    2018-06-01

    Laplace, analyzing the stability of the Solar System, was the first to calculate that the velocity of the motion of force fields can significantly exceed the velocity of light waves. In electrodynamics, the Coulomb field should rigidly accompany its source for instantaneous force action in distant regions. Such rigid motion was recently inferred from experiments at the Frascati Beam Test Facility with short beams of relativistic electrons. The comments of the authors on their observations are at odds with the comments of theoreticians on retarded potentials, which motivates a detailed study of the positions of both sides. Predictions of measurements, based on the Lienard-Wiechert potentials, are used to propose an unambiguous scheme for testing the rigidity of the Coulomb field. Realization of the proposed experimental scheme could independently refute or support the assertions of the Italian physicists regarding the rigid motion of Coulomb fields and likewise the nondual field approach to macroscopic reality.

  14. "Edge-on" MOSkin detector for stereotactic beam measurement and verification.

    PubMed

    Jong, Wei Loong; Ung, Ngie Min; Vannyat, Ath; Jamalludin, Zulaikha; Rosenfeld, Anatoly; Wong, Jeannie Hsiu Ding

    2017-01-01

    Dosimetry in small radiation fields is challenging and complicated because of dose volume averaging and beam perturbations in a detector. We evaluated the suitability of the "Edge-on" MOSkin (MOSFET) detector for small radiation field measurement, and tested its feasibility for dosimetric verification in stereotactic radiosurgery (SRS) and stereotactic radiotherapy (SRT). The "Edge-on" MOSkin detector was calibrated and its reproducibility and linearity were determined. Lateral dose profiles and output factors were measured using the "Edge-on" MOSkin detector, an ionization chamber, an SRS diode and EBT2 film. Dosimetric verification was carried out on two SRS and five SRT plans. In dose profile measurements, the "Edge-on" MOSkin measurements concurred with the EBT2 film measurements: compared to the film, the full width at half maximum of the dose profile differed by 0.11 mm on average, and the penumbral width differed by within ±0.2 mm for all SRS cones. For output factor measurements, a 1.1% difference was observed between the "Edge-on" MOSkin detector and EBT2 film for the 4 mm SRS cone. The "Edge-on" MOSkin detector provided reproducible measurements for dose verification in real time. The measured doses concurred with the calculated doses for SRS (within 1%) and SRT (within 3%). A set of output correction factors for the "Edge-on" MOSkin detector for small radiation fields, derived from EBT2 film measurements, is presented. This study showed that the "Edge-on" MOSkin detector is a suitable tool for dose verification in small radiation fields. Copyright © 2017 Associazione Italiana di Fisica Medica. Published by Elsevier Ltd. All rights reserved.

  15. Swarm Verification

    NASA Technical Reports Server (NTRS)

    Holzmann, Gerard J.; Joshi, Rajeev; Groce, Alex

    2008-01-01

    Reportedly, supercomputer designer Seymour Cray once said that he would sooner use two strong oxen to plow a field than a thousand chickens. Although this is undoubtedly wise when it comes to plowing a field, it is not so clear for other types of tasks. Model checking problems are of the proverbial "needle in a haystack" type. Such problems can often be parallelized easily. Alas, none of the usual divide-and-conquer methods can be used to parallelize the workings of a model checker. Given that it has become easier than ever to gain access to large numbers of computers to perform even routine tasks, it is becoming more and more attractive to find alternate ways to use these resources to speed up model checking tasks. This paper describes one such method, called swarm verification.
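As a minimal sketch of the idea (toy code, not the actual Spin-based swarm tool; the state space, seeds, and worker count below are all illustrative), each member of the swarm runs a small, step-bounded search with its own randomized successor ordering, and any single hit answers the whole search problem:

```python
import random
from collections import deque

def bounded_search(successors, start, is_target, seed, max_steps=1000):
    """One swarm worker: a step-bounded search with randomized successor order."""
    rng = random.Random(seed)
    queue, seen = deque([start]), {start}
    for _ in range(max_steps):
        if not queue:
            break
        state = queue.popleft()
        if is_target(state):
            return state
        succs = list(successors(state))
        rng.shuffle(succs)  # diversification: each worker explores differently
        for s in succs:
            if s not in seen:
                seen.add(s)
                queue.append(s)
    return None  # budget exhausted without a hit

def swarm_verify(successors, start, is_target, n_workers=8):
    """Launch many differently-seeded bounded searches instead of one big one."""
    for seed in range(n_workers):
        hit = bounded_search(successors, start, is_target, seed)
        if hit is not None:
            return hit
    return None

# Toy state space: states are integers; the "needle" is the state 50.
succ = lambda n: [(3 * n) % 1000003, n + 7]
needle = swarm_verify(succ, 1, lambda n: n == 50)
```

In the real tool the workers would run in parallel on separate machines; the sequential loop above only illustrates the diversification.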

  16. A new verification film system for routine quality control of radiation fields: Kodak EC-L.

    PubMed

    Hermann, A; Bratengeier, K; Priske, A; Flentje, M

    2000-06-01

    The use of modern irradiation techniques requires better verification films for determining set-up deviations and patient movements during the course of radiation treatment. This is an investigation of the image quality and time requirements of a new verification film system compared to a conventional portal film system. For conventional verification we used Agfa Curix HT 1000 films, which were compared to the new Kodak EC-L film system. 344 Agfa Curix HT 1000 and 381 Kodak EC-L portal films of different tumor sites (prostate, rectum, head and neck) were visually judged on a light box by 2 experienced physicians. Subjective judgement of image quality, masking of films and time requirements were checked. In this investigation, 68% of 175 Kodak EC-L ap/pa films were judged "good", 18% "moderate" and 14% "poor"; by contrast, only 22% of 173 conventional ap/pa verification films (Agfa Curix HT 1000) were judged to be "good". The image quality, detail perception and time required for film inspection of the new Kodak EC-L film system were significantly improved compared with standard portal films. The films could be read more accurately and the detection of set-up deviations was facilitated.

  17. A framework of multitemplate ensemble for fingerprint verification

    NASA Astrophysics Data System (ADS)

    Yin, Yilong; Ning, Yanbin; Ren, Chunxiao; Liu, Li

    2012-12-01

    How to improve the performance of an automatic fingerprint verification system (AFVS) is a perennial challenge in the biometric verification field. Recently, it has become popular to improve the performance of an AFVS by using ensemble learning to fuse related fingerprint information. In this article, we propose a novel framework for fingerprint verification based on a multitemplate ensemble method. The framework consists of three stages. In the first stage (enrollment), we adopt an effective template selection method to select those fingerprints which best represent a finger; a polyhedron is then created from the matching results of the multiple template fingerprints, and a virtual centroid of the polyhedron is computed. In the second stage (verification), we measure the distance between the centroid of the polyhedron and a query image. In the final stage, a fusion rule is used to choose a proper distance from a distance set. The experimental results on the FVC2004 database demonstrate the effectiveness of the new framework for fingerprint verification. With a minutiae-based matching method, the average EER over the four FVC2004 databases drops from 10.85 to 0.88, and with a ridge-based matching method, the average EER over these four databases decreases from 14.58 to 2.51.
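A toy sketch of the centroid-and-fusion idea (illustrative only: plain Euclidean distance on made-up feature vectors stands in for a real minutiae or ridge matcher, and the fusion rule shown is just one plausible choice):

```python
def euclidean(a, b):
    # stand-in for a real fingerprint matching distance
    return sum((x - y) ** 2 for x, y in zip(a, b)) ** 0.5

def enroll(templates):
    """Enrollment: the selected templates span a 'polyhedron'; return its virtual centroid."""
    dim = len(templates[0])
    return [sum(t[i] for t in templates) / len(templates) for i in range(dim)]

def verify(templates, centroid, query, threshold):
    """Verification: fuse the centroid distance with the per-template distances."""
    distances = [euclidean(t, query) for t in templates] + [euclidean(centroid, query)]
    return min(distances) <= threshold  # fusion rule: accept on the best distance

templates = [[1.0, 2.0], [1.2, 2.1], [0.9, 1.8]]   # invented feature vectors
centroid = enroll(templates)
genuine = verify(templates, centroid, [1.05, 1.95], threshold=0.5)   # True
impostor = verify(templates, centroid, [5.0, 5.0], threshold=0.5)    # False
```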

  18. Systematic study of source mask optimization and verification flows

    NASA Astrophysics Data System (ADS)

    Ben, Yu; Latypov, Azat; Chua, Gek Soon; Zou, Yi

    2012-06-01

    Source mask optimization (SMO) has emerged as a powerful resolution enhancement technique (RET) for advanced technology nodes. However, there is a plethora of flows and verification metrics in the field, confounding end users of the technique, and a systematic study of the different flows, and of their possible unification, has been missing. This contribution is intended to reveal the pros and cons of different SMO approaches and verification metrics, understand their commonalities and differences, and provide a generic guideline for RET selection via SMO. The paper discusses three types of variation that commonly arise in SMO: pattern preparation and selection, the availability of a relevant OPC recipe for a freeform source, and the metrics used in source verification. Several pattern selection algorithms are compared and the advantages of systematic pattern selection algorithms are discussed. In the absence of a full resist model for SMO, an alternative SMO flow without one is reviewed. A preferred verification flow with DOF and MEEF quality metrics is examined.

  19. Design for Verification: Enabling Verification of High Dependability Software-Intensive Systems

    NASA Technical Reports Server (NTRS)

    Mehlitz, Peter C.; Penix, John; Markosian, Lawrence Z.; Koga, Dennis (Technical Monitor)

    2003-01-01

    Strategies to achieve confidence that high-dependability applications are correctly implemented include testing and automated verification. Testing deals mainly with a limited number of expected execution paths. Verification usually attempts to deal with a larger number of possible execution paths. While the impact of architecture design on testing is well known, its impact on most verification methods is not as well understood. The Design for Verification approach considers verification from the application development perspective, in which system architecture is designed explicitly according to the application's key properties. The D4V-hypothesis is that the same general architecture and design principles that lead to good modularity, extensibility and complexity/functionality ratio can be adapted to overcome some of the constraints on verification tools, such as the production of hand-crafted models and the limits on dynamic and static analysis caused by state space explosion.

  20. Verification of Triple Modular Redundancy (TMR) Insertion for Reliable and Trusted Systems

    NASA Technical Reports Server (NTRS)

    Berg, Melanie; LaBel, Kenneth A.

    2016-01-01

    We propose a method for TMR insertion verification that satisfies the process for reliable and trusted systems. If a system is expected to be protected using TMR, improper insertion can jeopardize the reliability and security of the system. Due to the complexity of the verification process, there are currently no available techniques that can provide complete and reliable confirmation of TMR insertion. This manuscript addresses the challenge of confirming that TMR has been inserted without corruption of functionality and with correct application of the expected TMR topology. The proposed verification method combines the usage of existing formal analysis tools with a novel search-detect-and-verify tool. Keywords: field programmable gate array (FPGA), triple modular redundancy (TMR), verification, trust, reliability.

  1. Spacecraft attitude calibration/verification baseline study

    NASA Technical Reports Server (NTRS)

    Chen, L. C.

    1981-01-01

    A baseline study for a generalized spacecraft attitude calibration/verification system is presented. It can be used to define software specifications for three major functions required by a mission: the pre-launch parameter observability and data collection strategy study; the in-flight sensor calibration; and the post-calibration attitude accuracy verification. Analytical considerations are given for both single-axis and three-axis spacecraft. The three-axis attitudes considered include inertial-pointing attitudes, reference-pointing attitudes, and attitudes undergoing specific maneuvers. The attitude sensors and hardware considered include Earth horizon sensors, plane-field Sun sensors, coarse and fine two-axis digital Sun sensors, three-axis magnetometers, fixed-head star trackers, and inertial reference gyros.

  2. The effects of magnetic field in plume region on the performance of multi-cusped field thruster

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hu, Peng, E-mail: hupengemail@126.com; Liu, Hui, E-mail: thruster@126.com; Yu, Daren

    2015-10-15

    The performance characteristics of a Multi-cusped Field Thruster depending on the magnetic field in the plume region were investigated. Five magnetic field shielding rings were separately mounted near the exit of the discharge channel to decrease the strength of the magnetic field in the plume region to different degrees, while the magnetic field upstream was well maintained. The test results show that the electron current increases as the magnetic field strength in the plume region decreases, which gives rise to higher propellant utilization and lower current utilization. On the other hand, a stronger magnetic field in the plume region improves the performance at low voltages (high current mode), while a lower magnetic field improves the performance at high voltages (low current mode). This work can provide some optimal design ideas for the magnetic field strength in the plume region to improve thruster performance.

  3. Comparison between In-house developed and Diamond commercial software for patient specific independent monitor unit calculation and verification with heterogeneity corrections.

    PubMed

    Kuppusamy, Vijayalakshmi; Nagarajan, Vivekanandan; Jeevanandam, Prakash; Murugan, Lavanya

    2016-02-01

    The study aimed to compare two different monitor unit (MU) or dose verification software packages in volumetric modulated arc therapy (VMAT), using a modified Clarkson's integration technique for 6 MV photon beams. An in-house Excel spreadsheet-based monitor unit verification calculation (MUVC) program and PTW's DIAMOND secondary check software (SCS), version 6, were used as a secondary check to verify the monitor units (MU) or dose calculated by the treatment planning system (TPS). In this study 180 patients were grouped into 61 head and neck, 39 thorax and 80 pelvic sites. Verification plans were created using the PTW OCTAVIUS-4D phantom and also measured using the 729 detector chamber array, with the isocentre as the point of measurement for each field. In the analysis of 154 clinically approved VMAT plans with isocentre in a region above -350 HU, using heterogeneity corrections, the in-house spreadsheet-based MUVC program and Diamond SCS showed good agreement with the TPS. The overall average percentage deviations for all sites were (-0.93 ± 1.59)% and (1.37 ± 2.72)% for the in-house Excel spreadsheet-based MUVC program and Diamond SCS, respectively. The 26 clinically approved VMAT plans with isocentre in a region below -350 HU showed higher variations for both the in-house spreadsheet-based MUVC program and Diamond SCS. It can be concluded that for patient-specific quality assurance (QA), the in-house Excel spreadsheet-based MUVC program and Diamond SCS can be used as a simple and fast accompaniment to measurement-based verification for plans with isocentre in a region above -350 HU. Copyright © 2016 Associazione Italiana di Fisica Medica. Published by Elsevier Ltd. All rights reserved.
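As an illustration of what such a secondary check computes (a generic point-dose MU formula, not the authors' spreadsheet or the DIAMOND software; every factor name and number below is invented for the example):

```python
def independent_mu(dose_cgy, ref_dose_rate, sc, sp, tmr):
    """Generic point-dose monitor-unit check:
    MU = D / (D_ref * Sc * Sp * TMR), with D_ref in cGy/MU at reference conditions.
    Sc/Sp are collimator/phantom scatter factors, TMR the tissue-maximum ratio."""
    return dose_cgy / (ref_dose_rate * sc * sp * tmr)

def percent_deviation(mu_tps, mu_check):
    """Deviation of the TPS value from the independent check, in percent."""
    return 100.0 * (mu_tps - mu_check) / mu_check

# Illustrative numbers only: 200 cGy per fraction, 1 cGy/MU calibration.
mu_check = independent_mu(200.0, 1.0, sc=0.99, sp=0.98, tmr=0.85)
dev = percent_deviation(245.0, mu_check)  # compare against a hypothetical TPS value
```

A real check of this kind would draw Sc, Sp, and TMR from measured beam-data tables and would sum scatter sector-by-sector (Clarkson integration) for irregular fields.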

  4. Numerical verification of three point bending experiment of magnetorheological elastomer (MRE) in magnetic field

    NASA Astrophysics Data System (ADS)

    Miedzinska, Danuta; Boczkowska, Anna; Zubko, Konrad

    2010-07-01

    In this article a method for numerical verification of experimental results for magnetorheological elastomer (MRE) samples is presented. The samples were shaped into cylinders with a diameter of 8 mm and a height of 20 mm, with various carbonyl iron volume shares (1.5%, 11.5% and 33%). The diameter of the soft ferromagnetic particles ranged from 6 to 9 μm. During the experiment, initially bent samples were exposed to magnetic fields with intensities of 0.1 T, 0.3 T, 0.5 T, 0.7 T and 1 T. The reaction of the sample to the field was measured as a displacement of the specimen. Numerical calculations were carried out with the MSC Patran/Marc computer code. For the purpose of numerical analysis an orthotropic material model was applied, with the material properties of the magnetorheological elastomer along the iron chains and of the pure elastomer along the other directions. The material properties were obtained from experimental tests. During the numerical analysis, the initial mechanical load resulting from cylinder deflection was set. Then the equivalent external force, determined from analytical calculations of the intermolecular reaction within the iron chains in the specific magnetic field, was applied to the bent sample. The correspondence of this numerical model with the results of the experiment was verified. The similar results of the experiments and of both the theoretical and FEM analyses indicate that macroscopic modeling of magnetorheological elastomer mechanical properties as an orthotropic material delivers an accurate enough description of the material's behavior.

  5. Molecular Verification of Cryptops hortensis (Scolopendromorpha: Cryptopidae) in the Nearctic Region

    DTIC Science & Technology

    2018-01-29

    Journal article (dates covered: March-April 2016). Performing organization: USAF School of Aerospace Medicine, Public Health and Preventive Medicine Dept/PHR, 2510 Fifth St., Bldg. 840, Wright-Patterson AFB, OH 45433-7913.

  6. Simulation verification techniques study

    NASA Technical Reports Server (NTRS)

    Schoonmaker, P. B.; Wenglinski, T. H.

    1975-01-01

    This report summarizes the results of the simulation verification techniques study, which consisted of two tasks: developing techniques for simulator hardware checkout and developing techniques for simulation performance verification (validation). The hardware verification task involved definition of simulation hardware (hardware units and integrated simulator configurations), a survey of current hardware self-test techniques, and definition of hardware and software techniques for checkout of simulator subsystems. The performance verification task included definition of simulation performance parameters (and critical performance parameters), definition of methods for establishing standards of performance (sources of reference data for validation), and definition of methods for validating performance. Both major tasks included definition of verification software and assessment of the verification database impact. An annotated bibliography of all documents generated during this study is provided.

  7. Verification of VLSI designs

    NASA Technical Reports Server (NTRS)

    Windley, P. J.

    1991-01-01

    In this paper we explore the specification and verification of VLSI designs. The paper focuses on abstract specification and verification of functionality using mathematical logic, as opposed to low-level boolean equivalence verification such as that done using BDDs and model checking. Specification and verification, sometimes called formal methods, are one tool for increasing computer dependability in the face of an exponentially increasing testing effort.

  8. ENVIRONMENTAL TECHNOLOGY VERIFICATION: JOINT (NSF-EPA) VERIFICATION STATEMENT AND REPORT: TRITON SYSTEMS, LLC SOLID BOWL CENTRIFUGE, MODEL TS-5000

    EPA Science Inventory

    Verification testing of the Triton Systems, LLC Solid Bowl Centrifuge Model TS-5000 (TS-5000) was conducted at the Lake Wheeler Road Field Laboratory Swine Educational Unit in Raleigh, North Carolina. The TS-5000 was 48" in diameter and 30" deep, with a bowl capacity of 16 ft3. ...

  9. The SeaHorn Verification Framework

    NASA Technical Reports Server (NTRS)

    Gurfinkel, Arie; Kahsai, Temesghen; Komuravelli, Anvesh; Navas, Jorge A.

    2015-01-01

    In this paper, we present SeaHorn, a software verification framework. The key distinguishing feature of SeaHorn is its modular design that separates the concerns of the syntax of the programming language, its operational semantics, and the verification semantics. SeaHorn encompasses several novelties: it (a) encodes verification conditions using an efficient yet precise inter-procedural technique, (b) provides flexibility in the verification semantics to allow different levels of precision, (c) leverages the state-of-the-art in software model checking and abstract interpretation for verification, and (d) uses Horn-clauses as an intermediate language to represent verification conditions which simplifies interfacing with multiple verification tools based on Horn-clauses. SeaHorn provides users with a powerful verification tool and researchers with an extensible and customizable framework for experimenting with new software verification techniques. The effectiveness and scalability of SeaHorn are demonstrated by an extensive experimental evaluation using benchmarks from SV-COMP 2015 and real avionics code.
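SeaHorn's verification conditions are constrained Horn clauses solved modulo SMT theories; as a much-simplified illustration of why Horn clauses make a convenient intermediate form (this toy encoding is purely propositional and is not SeaHorn's), derivability can be settled by a least-fixpoint forward-chaining pass:

```python
def solve_horn(clauses, facts):
    """Least-fixpoint (forward-chaining) solution of propositional Horn clauses.
    Each clause is (body, head): once every body atom is derived, derive head."""
    derived = set(facts)
    changed = True
    while changed:
        changed = False
        for body, head in clauses:
            if head not in derived and all(atom in derived for atom in body):
                derived.add(head)
                changed = True
    return derived

# Tiny invented "verification condition": initialization and each loop step
# imply an invariant, and the invariant plus the exit condition imply safety.
clauses = [
    ((), "inv@entry"),                # init establishes the invariant
    (("inv@entry",), "inv@loop"),     # entry invariant carries into the loop
    (("inv@loop",), "inv@loop"),      # loop body preserves the invariant
    (("inv@loop", "exit"), "safe"),   # invariant + exit condition => safe
    ((), "exit"),
]
result = solve_horn(clauses, facts=set())
```

In SeaHorn the atoms carry SMT constraints over program variables and the clauses are discharged by a Horn solver, but the fixpoint structure is the same.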

  10. Numerical Weather Predictions Evaluation Using Spatial Verification Methods

    NASA Astrophysics Data System (ADS)

    Tegoulias, I.; Pytharoulis, I.; Kotsopoulos, S.; Kartsios, S.; Bampzelis, D.; Karacostas, T.

    2014-12-01

    In recent years, high-resolution numerical weather prediction simulations have been used to examine meteorological events with increased convective activity. Traditional verification methods do not provide the desired level of information to evaluate these high-resolution simulations; to address their limitations, new spatial verification methods have been proposed. In the present study an attempt is made to estimate the ability of the WRF model (WRF-ARW ver. 3.5.1) to reproduce selected days with high convective activity during the year 2010 using these feature-based verification methods. Three model domains, covering Europe, the Mediterranean Sea and northern Africa (d01), the wider area of Greece (d02) and central Greece - Thessaly region (d03), are used at horizontal grid spacings of 15 km, 5 km and 1 km, respectively. By alternating microphysics (Ferrier, WSM6, Goddard), boundary layer (YSU, MYJ) and cumulus convection (Kain-Fritsch, BMJ) schemes, a set of twelve model setups is obtained. The results of these simulations are evaluated against data obtained using a C-band (5 cm) radar located at the centre of the innermost domain. Spatial characteristics are well captured, but with a variable time lag between simulation results and radar data. Acknowledgements: This research is co-financed by the European Union (European Regional Development Fund) and Greek national funds, through the action "COOPERATION 2011: Partnerships of Production and Research Institutions in Focused Research and Technology Sectors" (contract number 11SYN_8_1088 - DAPHNE) in the framework of the operational programme "Competitiveness and Entrepreneurship" and Regions in Transition (OPC II, NSRF 2007-2013).
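One widely used neighborhood-based spatial verification measure is the Fractions Skill Score (FSS); the pure-Python sketch below (illustrative only, not the verification code used in the study) rewards a forecast for placing events near the observed events rather than requiring exact gridpoint matches:

```python
def neighborhood_fraction(field, i, j, half):
    """Fraction of event cells in a (2*half+1)-wide square window around (i, j)."""
    rows, cols = len(field), len(field[0])
    cells = [field[r][c]
             for r in range(max(0, i - half), min(rows, i + half + 1))
             for c in range(max(0, j - half), min(cols, j + half + 1))]
    return sum(cells) / len(cells)

def fss(fcst, obs, window=3):
    """Fractions Skill Score: 1 is a perfect match, 0 is no skill."""
    half = window // 2
    rows, cols = len(obs), len(obs[0])
    pf = [[neighborhood_fraction(fcst, i, j, half) for j in range(cols)]
          for i in range(rows)]
    po = [[neighborhood_fraction(obs, i, j, half) for j in range(cols)]
          for i in range(rows)]
    mse = sum((pf[i][j] - po[i][j]) ** 2 for i in range(rows) for j in range(cols))
    ref = sum(pf[i][j] ** 2 + po[i][j] ** 2 for i in range(rows) for j in range(cols))
    return 1.0 - mse / ref if ref else 1.0

# Binary rain/no-rain grids: the forecast is the observed cell block shifted by one.
obs = [[1 if 1 <= r <= 2 and 1 <= c <= 2 else 0 for c in range(5)] for r in range(5)]
fcst = [[1 if 2 <= r <= 3 and 2 <= c <= 3 else 0 for c in range(5)] for r in range(5)]
perfect = fss(obs, obs)     # 1.0
shifted = fss(fcst, obs)    # between 0 and 1: a near miss earns partial credit
```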

  11. Geomagnetic field observations in the Kopaonik thrust region, Yugoslavia.

    NASA Astrophysics Data System (ADS)

    Bicskei, T.; Popeskov, M.

    1991-09-01

    In the absence of continuous recordings of geomagnetic field variations in the surveyed region, the records of the nearest permanent observatory had to be used in the data reduction procedure. The proposed method estimates the differences between the hourly mean values at a particular measuring site, which are not actually known, and at the observatory, on the basis of a series of instantaneous total field intensity values measured simultaneously at the two places. The application of this method to geomagnetic field data from the wider area of the Kopaonik thrust region has revealed local field changes which show a connection with the pronounced seismic activity that has been going on in this region since it was affected by the M = 6.0 earthquake on May 18, 1980.
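The essence of the reduction, estimating the unknown site-observatory difference from simultaneous spot readings and applying it to the observatory hourly means, can be sketched as follows (a deliberately simplified, constant-offset illustration; all readings are invented numbers in nT):

```python
def site_hourly_means(spot_site, spot_obs, obs_hourly_means):
    """Estimate hourly means at a field site that has no continuous recording.
    The mean of simultaneous spot differences stands in for the unknown
    site-observatory difference and is applied to the observatory hourly means."""
    diffs = [s - o for s, o in zip(spot_site, spot_obs)]
    offset = sum(diffs) / len(diffs)
    return [h + offset for h in obs_hourly_means]

# Simultaneous total-field spot readings (nT) at the site and the observatory:
site_spots = [47012.3, 47011.8, 47012.9]
obs_spots = [46900.1, 46899.5, 46900.6]
hourly = site_hourly_means(site_spots, obs_spots, [46901.0, 46902.0])
```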

  12. Physics Verification Overview

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Doebling, Scott William

    The purpose of the verification project is to establish, through rigorous convergence analysis, that each ASC computational physics code correctly implements its set of physics models and algorithms (code verification); to evaluate and analyze the uncertainties of code outputs associated with the choice of temporal and spatial discretization (solution or calculation verification); and to develop and maintain the capability to expand and update these analyses on demand. This presentation describes project milestones.

  13. EMC: Verification

    Science.gov Websites

    Real-time verification of NCEP operational models (NAM, GFS, RAP, HRRR, HIRESW, SREF mean, and international global models) against observations and analyses, including precipitation skill scores from 1995 to present and EMC forecast verification statistics for the NAM, GFS, and NAM CONUS nest.

  14. Requirement Assurance: A Verification Process

    NASA Technical Reports Server (NTRS)

    Alexander, Michael G.

    2011-01-01

    Requirement Assurance is an act of requirement verification which assures the stakeholder or customer that a product requirement has produced its "as realized product" and has been verified with conclusive evidence. Product requirement verification answers the question, "Did the product meet the stated specification, performance, or design documentation?" In order to ensure the system was built correctly, the practicing systems engineer must verify each product requirement using the verification methods of inspection, analysis, demonstration, or test. The products of these methods are the "verification artifacts" or "closure artifacts", which are the objective evidence needed to prove that the product requirements meet the verification success criteria. Institutional direction is given to the systems engineer in NPR 7123.1A, NASA Systems Engineering Processes and Requirements, with regard to the requirement verification process. In response, the verification methodology offered in this report meets both the institutional process and requirement verification best practices.
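The closure bookkeeping described above can be sketched as a small data structure (the class, field names, and requirement IDs are illustrative inventions, not taken from NPR 7123.1A):

```python
from dataclasses import dataclass, field

VALID_METHODS = {"inspection", "analysis", "demonstration", "test"}

@dataclass
class Requirement:
    """A product requirement with its verification method and closure artifacts."""
    req_id: str
    method: str                                    # one of the four methods
    artifacts: list = field(default_factory=list)  # objective closure evidence

    def is_closed(self):
        # closed only with a valid method and at least one verification artifact
        return self.method in VALID_METHODS and len(self.artifacts) > 0

reqs = [
    Requirement("SYS-001", "test", ["test-report-042"]),
    Requirement("SYS-002", "analysis", []),  # no evidence yet: still open
]
closed = [r.req_id for r in reqs if r.is_closed()]
```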

  15. Magnetic field in the NGC7023 photodissociation region

    NASA Astrophysics Data System (ADS)

    Alves, Marta

    2015-10-01

    The far-UV radiation of massive stars illuminates molecular clouds, creating photodissociation regions (PDRs), the transition layers between atomic and molecular media. Recent results based on Herschel observations reveal the presence of small regions of high gas pressure in PDRs, whose origin is still not well understood, while polarization measurements towards a few PDRs indicate that magnetic fields can play a significant role in their structure. The limited number of existing polarization observations suggests that, when subject to high gas and radiation pressure from the stars, the magnetic field tends to align and to be compressed in the PDR. As a consequence, bright PDRs should be magnetically dominated. However, this possibility has been the subject of very few studies due to the sparsity of relevant data. We propose to map the magnetic field in a nearby bright PDR, NGC 7023, using the unique capabilities of HAWC+ onboard SOFIA. First, we wish to test the hypothesis that the magnetic field should be parallel to this PDR, which is illuminated by a radiation field of 2600 (in Habing units). Second, since NGC 7023 is a well studied region, its physical conditions (density, temperature) are known and can thus be related to the magnetic field across the PDR. We can investigate the relation between the field structure and the geometry of the PDR, and, aided by Herschel observations, we can also explore a possible connection between the magnetic field and the existence of high density regions in the PDR. SOFIA HAWC+ is the only instrument capable of imaging the polarized emission of extended objects with structure at arcsecond scales. Moreover, it allows us to trace the magnetic field within the PDR, owing to its 63 micron band that traces the warm (40 K) dust present at the illuminated surface. Our observations will be complementary to those led by the instrument team, who will observe NGC 7023 using the three highest wavelength filters.

  16. Voltage verification unit

    DOEpatents

    Martin, Edward J [Virginia Beach, VA

    2008-01-15

    A voltage verification unit, and a method for determining the absence of potentially dangerous potentials within a power supply enclosure without Mode 2 work, is disclosed. With this device and method, a qualified worker follows a relatively simple protocol: a function test (hot, cold, hot) of the voltage verification unit before Lock Out/Tag Out; then, once the Lock Out/Tag Out is completed, testing or "trying" by simply reading a display on the voltage verification unit, accomplished without exposing the operator to the interior of the voltage supply enclosure. According to a preferred embodiment, the voltage verification unit includes test leads to allow diagnostics with other meters, without the necessity of accessing potentially dangerous bus bars or the like.

  17. Cognitive Bias in Systems Verification

    NASA Technical Reports Server (NTRS)

    Larson, Steve

    2012-01-01

    Working definition of cognitive bias: patterns by which information is sought and interpreted that can lead to systematic errors in decisions. Cognitive bias is studied in diverse fields: economics, politics, intelligence, and marketing, to name a few. Attempts to ground cognitive science in physical characteristics of the cognitive apparatus exceed our knowledge; studies are based on correlations, and strict cause and effect is difficult to pinpoint. The effects cited in the paper and discussed here have been replicated many times over, and appear sound. Many biases have been described, but it is still unclear whether they are all distinct; there may be only a handful of fundamental biases, which manifest in various ways. Bias can affect system verification in many ways: overconfidence leads to questionable decisions to deploy; availability, to an inability to conceive critical tests; representativeness, to overinterpretation of results; and positive test strategies, to confirmation bias. Debiasing at the individual level is very difficult. The potential effect of bias on the verification process can be managed, but not eliminated, and is worth considering at key points in the process.

  18. Relationship between Birkeland current regions, particle precipitation, and electric fields

    NASA Technical Reports Server (NTRS)

    De La Beaujardiere, O.; Watermann, J.; Newell, P.; Rich, F.

    1993-01-01

    The relationship of the large-scale dayside Birkeland currents to large-scale particle precipitation patterns, currents, and convection is examined using DMSP and Sondrestrom radar observations. It is found that the local time of the mantle currents is not limited to the longitude of the cusp proper, but covers a larger local time extent. The mantle currents flow entirely on open field lines. About half of region 1 currents flow on open field lines, consistent with the assumption that the region 1 currents are generated by the solar wind dynamo and flow within the surface that separates open and closed field lines. More than 80 percent of the Birkeland current boundaries do not correspond to particle precipitation boundaries. Region 2 currents extend beyond the plasma sheet poleward boundary; region 1 currents flow in part on open field lines; mantle currents and mantle particles are not coincident. On most passes when a triple current sheet is observed, the convection reversal is located on closed field lines.

  19. High-resolution face verification using pore-scale facial features.

    PubMed

    Li, Dong; Zhou, Huiling; Lam, Kin-Man

    2015-08-01

    Face recognition methods, which usually represent face images using holistic or local facial features, rely heavily on alignment. Their performance also degrades severely under variations in expression or pose, especially when there is only one gallery image per subject. With the easy access to high-resolution (HR) face images nowadays, some HR face databases have recently been developed. However, few studies have tackled the use of HR information for face recognition or verification. In this paper, we propose a pose-invariant face-verification method, which is robust to alignment errors, using the HR information based on pore-scale facial features. A new keypoint descriptor, namely, pore-Principal Component Analysis (PCA)-Scale Invariant Feature Transform (PPCASIFT)-adapted from PCA-SIFT-is devised for the extraction of a compact set of distinctive pore-scale facial features. Having matched the pore-scale features of two face regions, an effective robust-fitting scheme is proposed for the face-verification task. Experiments show that, with only one frontal-view gallery image per subject, our proposed method outperforms a number of standard verification methods and can achieve excellent accuracy even when the faces are under large variations in expression and pose.

  20. MO-FG-202-01: A Fast Yet Sensitive EPID-Based Real-Time Treatment Verification System

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ahmad, M; Nourzadeh, H; Neal, B

    2016-06-15

    Purpose: To create a real-time EPID-based treatment verification system which robustly detects treatment delivery and patient attenuation variations. Methods: Treatment plan DICOM files sent to the record-and-verify system are captured and utilized to predict EPID images for each planned control point using a modified GPU-based digitally reconstructed radiograph algorithm which accounts for the patient attenuation, source energy fluence, source size effects, and MLC attenuation. The DICOM and predicted images are utilized by our C++ treatment verification software, which compares them against 1024×768-resolution EPID frames acquired at ∼8.5 Hz from a Varian TrueBeam™ system. To maximize detection sensitivity, image comparisons determine (1) if radiation exists outside of the desired treatment field; (2) if radiation is lacking inside the treatment field; (3) if translations, rotations, and magnifications of the image are within tolerance. Acquisition was tested with known test fields and prior patient fields. Error detection was tested in real time and utilizing images acquired during treatment with another system. Results: The computational time of the prediction algorithms, for a patient plan with 350 control points and a 60×60×42 cm^3 CT volume, is 2–3 minutes on CPU and <27 seconds on GPU for 1024×768 images. The verification software requires a maximum of ∼9 ms and ∼19 ms for 512×384 and 1024×768 resolution images, respectively, to perform image analysis and dosimetric validations. Typical variations in geometric parameters between reference and measured images are 0.32° for gantry rotation, 1.006 for scaling factor, and 0.67 mm for translation. For excess out-of-field/missing in-field fluence, with masks extending 1 mm (at isocenter) from the detected aperture edge, the average total in-field area missing EPID fluence was 1.5 mm^2 and the out-of-field excess EPID fluence was 8 mm^2, both below error tolerances. Conclusion: A real-time verification
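
The in-field/out-of-field fluence checks described above amount to comparing a thresholded EPID frame against margin-grown and margin-shrunk copies of the predicted aperture. A minimal numpy sketch under stated assumptions (the mask encoding, threshold, and function names are illustrative, not the authors' implementation; the margin is expressed in pixels rather than mm at isocenter):

```python
import numpy as np

def dilate(mask, it=1):
    """Binary dilation with a 3x3 (8-connected) structuring element."""
    m = mask.copy()
    for _ in range(it):
        padded = np.pad(m, 1)
        m = np.zeros_like(m)
        for dy in (-1, 0, 1):
            for dx in (-1, 0, 1):
                m |= padded[1 + dy : 1 + dy + mask.shape[0],
                            1 + dx : 1 + dx + mask.shape[1]]
    return m

def fluence_errors(aperture, measured, threshold, margin_px=1):
    """Count pixels violating the in/out-of-field fluence checks.

    aperture  -- boolean mask of the predicted open field
    measured  -- EPID frame (arbitrary fluence units)
    threshold -- fluence level separating 'irradiated' from 'not irradiated'
    margin_px -- mask margin in pixels (the record quotes 1 mm at isocenter)
    """
    hot = measured > threshold
    outer = dilate(aperture, margin_px)      # aperture grown by the margin
    inner = ~dilate(~aperture, margin_px)    # aperture shrunk by the margin
    excess_out = int(np.sum(hot & ~outer))   # radiation outside the field
    missing_in = int(np.sum(~hot & inner))   # no radiation inside the field
    return excess_out, missing_in
```

With a 1-pixel margin, a stray irradiated pixel far outside the aperture counts as out-of-field excess, and a cold pixel in the aperture interior counts as missing in-field fluence.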

  1. Verification of Functional Fault Models and the Use of Resource Efficient Verification Tools

    NASA Technical Reports Server (NTRS)

    Bis, Rachael; Maul, William A.

    2015-01-01

    Functional fault models (FFMs) are a directed graph representation of the failure effect propagation paths within a system's physical architecture and are used to support development and real-time diagnostics of complex systems. Verification of these models is required to confirm that the FFMs are correctly built and accurately represent the underlying physical system. However, a manual, comprehensive verification process applied to the FFMs was found to be error prone, due to the intensive and customized process necessary to verify each individual component model, and to require a burdensome level of resources. To address this problem, automated verification tools have been developed and utilized to mitigate these key pitfalls. This paper discusses the verification of the FFMs and presents the tools that were developed to make the verification process more efficient and effective.
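
This record does not detail the automated tools, but the core FFM check is reachability over the directed failure-propagation graph. A minimal sketch under stated assumptions (the `edges` dict encoding and both function names are illustrative, not NASA's tooling):

```python
from collections import deque

def propagated_effects(edges, source):
    """Breadth-first traversal of an FFM's directed graph: returns every
    downstream node that a failure effect at `source` propagates to."""
    seen, queue = set(), deque([source])
    while queue:
        for nxt in edges.get(queue.popleft(), []):
            if nxt not in seen:
                seen.add(nxt)
                queue.append(nxt)
    return seen

def undetectable_modes(edges, failure_modes, sensed_nodes):
    """Verification check: failure modes whose effects never reach a sensor."""
    return [m for m in failure_modes
            if not propagated_effects(edges, m) & set(sensed_nodes)]
```

A mode whose propagation set contains no sensed node is flagged, which is one way an automated tool can surface modeling gaps that a manual review might miss.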

  2. HDL to verification logic translator

    NASA Technical Reports Server (NTRS)

    Gambles, J. W.; Windley, P. J.

    1992-01-01

    The increasingly higher number of transistors possible in VLSI circuits compounds the difficulty of ensuring correct designs. As the number of test cases required to exhaustively simulate a circuit design explodes, a better method is required to confirm the absence of design faults. Formal verification methods provide a way to prove, using logic, that a circuit structure correctly implements its specification. Before verification is accepted by VLSI design engineers, the stand-alone verification tools in use in the research community must be integrated with the CAD tools used by the designers. One problem facing the acceptance of formal verification into circuit design methodology is that the structural circuit descriptions used by the designers are not appropriate for verification work, and those required for verification lack some of the features needed for design. We offer a solution to this dilemma: an automatic translation from the designers' HDL models into definitions for the higher-order logic (HOL) verification system. The translated definitions become the low-level basis of circuit verification, which in turn increases the designer's confidence in the correctness of higher-level behavioral models.

  3. Verification Challenges at Low Numbers

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Benz, Jacob M.; Booker, Paul M.; McDonald, Benjamin S.

    2013-06-01

    Many papers have dealt with the political difficulties and ramifications of deep nuclear arms reductions and the issues of “Going to Zero”. Political issues include extended deterrence, conventional weapons, ballistic missile defense, and regional and geo-political security issues. At each step on the road to low numbers, the verification required to ensure compliance of all parties will increase significantly. Looking post New START, the next step will likely include warhead limits in the neighborhood of 1000. Further reductions will include stepping stones at 1000 warheads, 100’s of warheads, and then 10’s of warheads before final elimination of the last few remaining warheads and weapons could be considered. This paper will focus on these three threshold reduction levels: 1000, 100’s, and 10’s. For each, the issues and challenges will be discussed, potential solutions will be identified, and the verification technologies and chain-of-custody measures that address these solutions will be surveyed. It is important to note that many of the issues that need to be addressed have no current solution. In these cases, the paper will explore new or novel technologies that could be applied. These technologies will draw from the research and development that is ongoing throughout the national laboratory complex, and will look at technologies utilized in other areas of industry for their application to arms control verification.

  4. Regional Modeling and Power Spectra of Mercury's Crustal Magnetic Field

    NASA Astrophysics Data System (ADS)

    Plattner, A. M.; Johnson, C. L.

    2018-05-01

    Mercury's crustal magnetic field and magnetic power spectra for select regions show distinct patterns for regions without magnetized impact craters, regions with magnetized impact craters, and the region north of Caloris.

  5. Generic Verification Protocol for Testing Pesticide Application Spray Drift Reduction Technologies for Row and Field Crops (Version 1.4)

    EPA Science Inventory

    This generic verification protocol provides a detailed method for conducting and reporting results from verification testing of pesticide application technologies. It can be used to evaluate technologies for their potential to reduce spray drift, hence the term “drift reduction t...

  6. ENVIRONMENTAL TECHNOLOGY VERIFICATION: JOINT (NSF-EPA) VERIFICATION STATEMENT AND REPORT: BROME AGRI SALES, LTD., MAXIMIZER SEPARATOR, MODEL MAX 1016 - 03/01/WQPC-SWP

    EPA Science Inventory

    Verification testing of the Brome Agri Sales Ltd. Maximizer Separator, Model MAX 1016 (Maximizer) was conducted at the Lake Wheeler Road Field Laboratory Swine Educational Unit in Raleigh, North Carolina. The Maximizer is an inclined screen solids separator that can be used to s...

  7. Verification of road databases using multiple road models

    NASA Astrophysics Data System (ADS)

    Ziems, Marcel; Rottensteiner, Franz; Heipke, Christian

    2017-08-01

    In this paper a new approach for automatic road database verification based on remote sensing images is presented. In contrast to existing methods, the applicability of the new approach is not restricted to specific road types, context areas or geographic regions. This is achieved by combining several state-of-the-art road detection and road verification approaches that work well under different circumstances. Each one serves as an independent module representing a unique road model and a specific processing strategy. All modules provide independent solutions for the verification problem of each road object stored in the database in the form of two probability distributions, the first one for the state of a database object (correct or incorrect), and the second one for the state of the underlying road model (applicable or not applicable). In accordance with the Dempster-Shafer theory, both distributions are mapped to a new state space comprising the classes correct, incorrect and unknown. Statistical reasoning is applied to obtain the optimal state of a road object. A comparison with state-of-the-art road detection approaches using benchmark datasets shows that in general the proposed approach provides results with greater completeness. Additional experiments reveal that, based on the proposed method, a highly reliable semi-automatic approach for road database verification can be designed.
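
The mapping of each module's two distributions onto {correct, incorrect, unknown} and their subsequent fusion can be sketched with Dempster's rule. The factorization in `combine_module` below is an illustrative assumption, not necessarily the authors' exact construction:

```python
def combine_module(p_correct, p_applicable):
    """Map one module's two distributions onto {correct, incorrect, unknown}.

    p_correct    -- probability the database object is correct, given that
                    this module's road model applies
    p_applicable -- probability that this module's road model applies here
    Mass not backed by an applicable model goes to 'unknown'.
    """
    return {
        "correct": p_applicable * p_correct,
        "incorrect": p_applicable * (1.0 - p_correct),
        "unknown": 1.0 - p_applicable,
    }

def dempster_combine(m1, m2):
    """Dempster's rule for two mass functions over {correct, incorrect, unknown}.

    'unknown' plays the role of the full frame, so it is compatible with
    everything; 'correct' and 'incorrect' conflict with each other.
    """
    combined = {"correct": 0.0, "incorrect": 0.0, "unknown": 0.0}
    conflict = 0.0
    for a, ma in m1.items():
        for b, mb in m2.items():
            if a == b:
                combined[a] += ma * mb
            elif "unknown" in (a, b):
                # intersection with the frame keeps the specific class
                specific = a if b == "unknown" else b
                combined[specific] += ma * mb
            else:
                conflict += ma * mb  # correct vs incorrect: contradictory
    norm = 1.0 - conflict
    return {k: v / norm for k, v in combined.items()}
```

Combining a module's masses with a fully inapplicable module (all mass on unknown) leaves the first module's verdict unchanged, which matches the intended role of the "model applicable" distribution.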

  8. Nonpotential features observed in the magnetic field of an active region

    NASA Technical Reports Server (NTRS)

    Gary, G. A.; Moore, R. L.; Hagyard, M. J.; Haisch, Bernhard M.

    1987-01-01

    A unique coordinated data set consisting of vector magnetograms, H-alpha photographs, and high-resolution ultraviolet images of a solar active region is used, together with mathematical models, to calculate potential and force-free magnetic field lines and to examine the nonpotential nature of the active region structure. It is found that the overall bipolar magnetic field of the active region had a net twist corresponding to a net current of order 3 × 10^12 A and an average current density of order 4 × 10^-4 A/m^2 flowing antiparallel to the field. There were three regions of enhanced nonpotentiality in the interior of the active region; in one, the field had a marked nonpotential twist or shear with height above the photosphere. The measured total nonpotential magnetic energy stored in the entire active region was of order 10^32 ergs, about 3 sigma above the noise level.
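
The net current quoted above is the kind of quantity obtained by applying Ampère's law to the measured transverse field, Jz = (∂By/∂x − ∂Bx/∂y)/μ0. A short numpy sketch of that standard computation (the grid spacing and field values below are placeholders, not the paper's magnetogram data):

```python
import numpy as np

MU0 = 4e-7 * np.pi  # vacuum permeability [T m / A]

def vertical_current_density(bx, by, dx, dy):
    """Vertical current density Jz from the transverse magnetic field.

    Ampere's law: Jz = (dBy/dx - dBx/dy) / mu0
    bx, by -- 2-D arrays of transverse field components [T], indexed [y, x]
    dx, dy -- pixel sizes [m]
    """
    dby_dx = np.gradient(by, dx, axis=1)
    dbx_dy = np.gradient(bx, dy, axis=0)
    return (dby_dx - dbx_dy) / MU0
```

Summing Jz over the pixels of one magnetic polarity (times the pixel area) gives a net-current estimate of the kind reported in the abstract.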

  9. Integration and verification testing of the Large Synoptic Survey Telescope camera

    NASA Astrophysics Data System (ADS)

    Lange, Travis; Bond, Tim; Chiang, James; Gilmore, Kirk; Digel, Seth; Dubois, Richard; Glanzman, Tom; Johnson, Tony; Lopez, Margaux; Newbry, Scott P.; Nordby, Martin E.; Rasmussen, Andrew P.; Reil, Kevin A.; Roodman, Aaron J.

    2016-08-01

    We present an overview of the Integration and Verification Testing activities of the Large Synoptic Survey Telescope (LSST) Camera at the SLAC National Accelerator Lab (SLAC). The LSST Camera, the sole instrument for LSST and under construction now, comprises a 3.2-gigapixel imager and a three-element corrector with a 3.5-degree-diameter field of view. LSST Camera Integration and Test will take place over the next four years, with final delivery to the LSST observatory anticipated in early 2020. We outline the planning for Integration and Test, describe some of the key verification hardware systems being developed, and identify some of the more complicated assembly/integration activities. Specific details of integration and verification hardware systems will be discussed, highlighting some of the technical challenges anticipated.

  10. Cleanup Verification Package for the 300 VTS Waste Site

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    S. W. Clark and T. H. Mitchell

    2006-03-13

    This cleanup verification package documents completion of remedial action for the 300 Area Vitrification Test Site, also known as the 300 VTS site. The site was used by Pacific Northwest National Laboratory as a field demonstration site for in situ vitrification of soils containing simulated waste.

  11. Discriminative Features Mining for Offline Handwritten Signature Verification

    NASA Astrophysics Data System (ADS)

    Neamah, Karrar; Mohamad, Dzulkifli; Saba, Tanzila; Rehman, Amjad

    2014-03-01

    Signature verification is an active research area in the field of pattern recognition. It is employed to identify a particular person from the characteristics of his/her signature, such as pen pressure, the shape of loops, writing speed, and the up-down motion of the pen. In the entire process, the feature extraction and selection stage is of prime importance, since several signatures have similar strokes, characteristics, and sizes. Accordingly, this paper presents a combination of skeleton orientation and the gravity centre point to extract accurate pattern features of signature data in an offline signature verification system. Promising results have proved the success of the integration of the two methods.
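
The gravity-centre part of the feature extraction can be sketched as a recursive centre-of-mass decomposition of the binary signature image. This is an illustrative reconstruction (function names and the quadrant-splitting scheme are assumptions, not the authors' exact algorithm):

```python
import numpy as np

def gravity_center(img):
    """Gravity centre (row, col) of a binary signature image."""
    ys, xs = np.nonzero(img)
    return float(ys.mean()), float(xs.mean())

def center_features(img, depth=2):
    """Recursive gravity-centre features: record the centre of the image,
    split it into four quadrants at that centre, and recurse `depth` levels."""
    feats = []

    def recurse(sub, y0, x0, d):
        if d == 0 or not sub.any():
            return
        cy, cx = gravity_center(sub)
        feats.append((y0 + cy, x0 + cx))
        iy, ix = int(round(cy)), int(round(cx))
        recurse(sub[:iy, :ix], y0, x0, d - 1)
        recurse(sub[:iy, ix:], y0, x0 + ix, d - 1)
        recurse(sub[iy:, :ix], y0 + iy, x0, d - 1)
        recurse(sub[iy:, ix:], y0 + iy, x0 + ix, d - 1)

    recurse(img.astype(bool), 0.0, 0.0, depth)
    return np.array(feats)
```

The collected centre coordinates form a compact feature vector that can be compared between a questioned signature and a reference signature.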

  12. Comparison of Kodak EDR2 and Gafchromic EBT film for intensity-modulated radiation therapy dose distribution verification.

    PubMed

    Sankar, A; Ayyangar, Komanduri M; Nehru, R Mothilal; Kurup, P G Gopalakrishna; Murali, V; Enke, Charles A; Velmurugan, J

    2006-01-01

    The quantitative dose validation of intensity-modulated radiation therapy (IMRT) plans requires 2-dimensional (2D) high-resolution dosimetry systems with uniform response over their sensitive region. The present work deals with clinical use of a commercially available self-developing radiochromic film, Gafchromic EBT film, for IMRT dose verification. Dose response curves were generated for the films using a VXR-16 film scanner. The results obtained with EBT films were compared with the results of Kodak extended dose range 2 (EDR2) films. The EBT film had a linear response over the dose range of 0 to 600 cGy. The dose-related characteristics of the EBT film, such as post-irradiation color growth with time, film uniformity, and effect of scanning orientation, were studied. There was up to an 8.6% increase in the color density between 2 and 40 hours after irradiation. There was a considerable variation, up to 8.5%, in the film uniformity over its sensitive region. The quantitative differences between calculated and measured dose distributions were analyzed using distance-to-agreement (DTA) and the gamma index, with a tolerance of 3% dose difference and 3-mm distance-to-agreement. The EDR2 films showed consistent results with the calculated dose distributions, whereas the results obtained using EBT were inconsistent. The variation in the film uniformity limits the use of EBT film for conventional large-field IMRT verification. For IMRT of smaller field sizes (4.5 x 4.5 cm), the results obtained with EBT were comparable with results of EDR2 films.
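
The 3%/3-mm gamma analysis used above compares each measured point against all calculated points, normalizing dose difference and distance by their respective tolerances. A minimal 1-D numpy sketch of the standard global-gamma definition (the authors' analysis was 2-D film dosimetry; this sketch is simplified):

```python
import numpy as np

def gamma_index(x_ref, d_ref, x_eval, d_eval, dd=0.03, dta=3.0):
    """1-D global gamma analysis (pass criterion: gamma <= 1).

    x_ref, d_ref   -- reference (calculated) positions [mm] and doses
    x_eval, d_eval -- evaluated (measured) positions [mm] and doses
    dd             -- dose-difference criterion as a fraction of max dose (3%)
    dta            -- distance-to-agreement criterion [mm] (3 mm)
    """
    d_max = np.max(d_ref)
    gammas = np.empty(len(x_eval))
    for i, (xe, de) in enumerate(zip(x_eval, d_eval)):
        dist2 = ((x_ref - xe) / dta) ** 2
        dose2 = ((d_ref - de) / (dd * d_max)) ** 2
        gammas[i] = np.sqrt(np.min(dist2 + dose2))
    return gammas
```

A point passes when its gamma value is at most 1; the pass rate over all points is the usual summary statistic.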

  13. From field to region yield predictions in response to pedo-climatic variations in Eastern Canada

    NASA Astrophysics Data System (ADS)

    JÉGO, G.; Pattey, E.; Liu, J.

    2013-12-01

    The increase in global population coupled with new pressures to produce energy and bioproducts from agricultural land requires an increase in crop productivity. However, the influence of climate and soil variations on crop production and environmental performance is not fully understood and accounted for to define more sustainable and economical management strategies. Regional crop modeling can be a great tool for understanding the impact of climate variations on crop production, for planning grain handling, and for assessing the impact of agriculture on the environment, but it is often limited by the availability of input data. The STICS ("Simulateur mulTIdisciplinaire pour les Cultures Standard") crop model, developed by INRA (France), is a functional crop model which has a built-in module to optimize several input parameters by minimizing the difference between calculated and measured output variables, such as Leaf Area Index (LAI). The STICS crop model was adapted to the short growing season of the Mixedwood Plains Ecozone using field experiment results, to predict biomass and yield of soybean, spring wheat and corn. To minimize the number of inferences required for regional applications, 'generic' cultivars rather than specific ones have been calibrated in STICS. After the calibration of several model parameters, the root mean square error (RMSE) of yield and biomass predictions ranged from 10% to 30% for the three crops. A bit more scatter was obtained for LAI (20% […] verification of the sensitivity of the biomass prediction to climate variations. Using RS data to re-initialize input parameters that are not readily available (e.g. seeding date) is considered an effective way

  14. Field Verification of Stable Perched Groundwater in Layered Bedrock Uplands

    USGS Publications Warehouse

    Carter, J.T.; Gotkowitz, M.B.; Anderson, M.P.

    2011-01-01

    Data substantiating perched conditions in layered bedrock uplands are rare and have not been widely reported. Field observations in layered sedimentary bedrock in southwestern Wisconsin, USA, provide evidence of a stable, laterally extensive perched aquifer. Data from a densely instrumented field site show a perched aquifer in shallow dolomite, underlain by a shale-and-dolomite aquitard approximately 25 m thick, which is in turn underlain by sandstone containing a 30-m-thick unsaturated zone above a regional aquifer. Heads in water supply wells indicate that perched conditions extend at least several kilometers into hillsides, which is consistent with published modeling studies. Observations of unsaturated conditions in the sandstone over a 4-year period, historical development of the perched aquifer, and perennial flow from upland springs emanating from the shallow dolomite suggest that perched groundwater is a stable hydrogeologic feature under current climate conditions. Water-table hydrographs exhibit apparent differences in the amount and timing of recharge to the perched and regional flow systems; steep hydraulic gradients and tritium and chloride concentrations suggest there is limited hydraulic connection between the two. Recognition and characterization of perched flow systems have practical importance because their groundwater flow and transport pathways may differ significantly from those in underlying flow systems. Construction of multi-aquifer wells and groundwater withdrawal in perched systems can further alter such pathways. ?? 2010 The Author(s). Journal compilation ?? 2010 National Ground Water Association.

  15. On-Ground Processing of Yaogan-24 Remote Sensing Satellite Attitude Data and Verification Using Geometric Field Calibration

    PubMed Central

    Wang, Mi; Fan, Chengcheng; Yang, Bo; Jin, Shuying; Pan, Jun

    2016-01-01

    Satellite attitude accuracy is an important factor affecting the geometric processing accuracy of high-resolution optical satellite imagery. To address the problem that the accuracy of the Yaogan-24 remote sensing satellite’s on-board attitude data processing is not high enough to meet its image geometry processing requirements, we developed an approach involving on-ground attitude data processing and verification against the digital orthophoto (DOM) and digital elevation model (DEM) of a geometric calibration field. The approach focuses on three modules: on-ground processing based on a bidirectional filter, overall weighted smoothing and fitting, and evaluation in the geometric calibration field. Our experimental results demonstrate that the proposed on-ground processing method is both robust and feasible, which ensures the quality of the observation data and the convergence and stability of the parameter estimation model. In addition, both the Euler angle and the quaternion could be used to build a mathematical fitting model, while the orthogonal polynomial fitting model is more suitable for modeling the attitude parameter. Furthermore, compared to the image geometric processing results based on on-board attitude data, the accuracy of uncontrolled and relative geometric positioning can be increased by about 50%. PMID:27483287
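
The record does not give the exact orthogonal polynomial model, but the idea of fitting an attitude-angle series in an orthogonal basis can be sketched with numpy's Chebyshev class, which maps the time axis onto [-1, 1] so the least-squares problem stays well conditioned (the degree and data below are illustrative):

```python
import numpy as np
from numpy.polynomial import Chebyshev

def fit_attitude_angle(t, angle, degree=5):
    """Least-squares fit of one attitude-angle time series in a Chebyshev basis.

    Chebyshev.fit rescales t onto [-1, 1] internally, which keeps the fit
    well conditioned for long observation arcs; the returned object is
    callable, so it serves directly as the smoothed attitude model.
    """
    return Chebyshev.fit(t, angle, degree)
```

The same fit-and-evaluate pattern applies whether the fitted quantity is an Euler angle or a quaternion component, which is the comparison the abstract describes.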

  16. ACTIVE REGION FILAMENTS MIGHT HARBOR WEAK MAGNETIC FIELDS

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Díaz Baso, C. J.; Martínez González, M. J.; Asensio Ramos, A., E-mail: cdiazbas@iac.es

    Recent spectropolarimetric observations of active region filaments have revealed polarization profiles with signatures typical of the strong-field Zeeman regime. The conspicuous absence in those observations of scattering polarization and Hanle effect signatures was then pointed out by some authors. This was interpreted as either a signature of mixed “turbulent” field components or as a result of optical thickness. In this article, we present a natural scenario to explain these Zeeman-only spectropolarimetric observations of active region (AR) filaments. We propose a two-component model, one on top of the other. Both components have horizontal fields, with the azimuth difference between them being close to 90°. The component that lies lower in the atmosphere is permeated by a strong field of the order of 600 G, while the upper component has much weaker fields, of the order of 10 G. The ensuing scattering polarization signatures of the individual components have opposite signs, so their combination along the line of sight reduces, and can even cancel out, the Hanle signatures, giving rise to an apparent Zeeman-only profile. This model is also applicable to other chromospheric structures seen in absorption above ARs.

  17. Verification of a Viscous Computational Aeroacoustics Code using External Verification Analysis

    NASA Technical Reports Server (NTRS)

    Ingraham, Daniel; Hixon, Ray

    2015-01-01

    The External Verification Analysis approach to code verification is extended to solve the three-dimensional Navier-Stokes equations with constant properties, and is used to verify a high-order computational aeroacoustics (CAA) code. After a brief review of the relevant literature, the details of the EVA approach are presented and compared to the similar Method of Manufactured Solutions (MMS). Pseudocode representations of EVA's algorithms are included, along with the recurrence relations needed to construct the EVA solution. The code verification results show that EVA was able to convincingly verify a high-order, viscous CAA code without the addition of MMS-style source terms, or any other modifications to the code.
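
For contrast with EVA, the MMS baseline mentioned above can be illustrated by a manufactured-solution order-of-accuracy check on a simple operator: pick an exact solution, evaluate the discrete operator against the known derivative, and confirm the error converges at the design order. A generic sketch (not the CAA code or the EVA procedure itself):

```python
import numpy as np

def laplacian_error(n):
    """Max error of the 2nd-order central stencil against u(x) = sin(pi x)."""
    x = np.linspace(0.0, 1.0, n + 1)
    h = x[1] - x[0]
    u = np.sin(np.pi * x)
    exact = -np.pi ** 2 * np.sin(np.pi * x[1:-1])      # u''(x) at interior nodes
    approx = (u[2:] - 2.0 * u[1:-1] + u[:-2]) / h ** 2  # central difference
    return np.max(np.abs(approx - exact))

# Observed order of accuracy from two grid levels.
e1, e2 = laplacian_error(64), laplacian_error(128)
order = np.log2(e1 / e2)
```

Halving the grid spacing should divide the error by about four for this second-order stencil; EVA performs an analogous consistency check but, as the abstract notes, without adding MMS-style source terms to the code.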

  19. Bridging scale gaps between regional maps of forest aboveground biomass and field sampling plots using TanDEM-X data

    NASA Astrophysics Data System (ADS)

    Ni, W.; Zhang, Z.; Sun, G.

    2017-12-01

    Several large-scale maps of forest AGB have been released [1] [2] [3]. However, these existing global or regional datasets were only approximations based on combining land cover type and representative values instead of measurements of actual forest aboveground biomass or forest heights [4]. Rodríguez-Veiga et al [5] reported obvious discrepancies between existing forest biomass stock maps and in-situ observations in Mexico. One of the biggest challenges to the credibility of these maps comes from the scale gaps between the size of field sampling plots used to develop (or validate) estimation models and the pixel size of these maps, and from the availability of field sampling plots of sufficient size for the verification of these products [6]. It is time-consuming and labor-intensive to collect a sufficient number of field sampling data over plot sizes matching the resolutions of regional maps. Smaller field sampling plots cannot fully represent the spatial heterogeneity of forest stands, as shown in Figure 1. Forest AGB is directly determined by forest heights, diameter at breast height (DBH) of each tree, forest density and tree species. What is measured in the field sampling are the geometrical characteristics of forest stands, including the DBH, tree heights and forest densities. LiDAR data are considered the best dataset for the estimation of forest AGB, mainly because LiDAR can directly capture geometrical features of forest stands through its range detection capabilities. A remotely sensed dataset that is capable of direct measurements of forest spatial structures may therefore serve as a ladder to bridge the scale gaps between the pixel size of regional maps of forest AGB and field sampling plots. Several studies report that TanDEM-X data can be used to characterize forest spatial structures [7, 8]. In this study, a forest AGB map of northeast China was produced using ALOS/PALSAR data, taking TanDEM-X data as a bridge.
The TanDEM-X InSAR data used in

  20. Can self-verification strivings fully transcend the self-other barrier? Seeking verification of ingroup identities.

    PubMed

    Gómez, Angel; Seyle, D Conor; Huici, Carmen; Swann, William B

    2009-12-01

    Recent research has demonstrated self-verification strivings in groups, such that people strive to verify collective identities, which are personal self-views (e.g., "sensitive") associated with group membership (e.g., "women"). Such demonstrations stop short of showing that the desire for self-verification can fully transcend the self-other barrier, as in people working to verify ingroup identities (e.g., "Americans are loud") even when such identities are not self-descriptive ("I am quiet and unassuming"). Five studies focus on such ingroup verification strivings. Results indicate that people prefer to interact with individuals who verify their ingroup identities over those who enhance these identities (Experiments 1-5). Strivings for ingroup identity verification were independent of the extent to which the identities were self-descriptive but were stronger among participants who were highly invested in their ingroup identities, as reflected in high certainty of these identities (Experiments 1-4) and high identification with the group (Experiments 1-5). In addition, whereas past demonstrations of self-verification strivings have been limited to efforts to verify the content of identities (Experiments 1-3), the findings also show that people strive to verify the valence of their identities (i.e., the extent to which the identities are valued; Experiments 4 and 5). Self-verification strivings, rather than self-enhancement strivings, appeared to motivate participants' strivings for ingroup identity verification. Links to collective self-verification strivings and social identity theory are discussed.

  1. 40 CFR 1065.550 - Gas analyzer range verification and drift verification.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... with a CLD and the removed water is corrected based on measured CO2, CO, THC, and NOX concentrations... concentration subcomponents (e.g., THC and CH4 for NMHC) separately. For example, for NMHC measurements, perform drift verification on NMHC; do not verify THC and CH4 separately. (2) Drift verification requires two...

  2. Space Station automated systems testing/verification and the Galileo Orbiter fault protection design/verification

    NASA Technical Reports Server (NTRS)

    Landano, M. R.; Easter, R. W.

    1984-01-01

    Aspects of Space Station automated systems testing and verification are discussed, taking into account several program requirements. It is found that these requirements lead to a number of issues and uncertainties which require study and resolution during the Space Station definition phase. Most, if not all, of the considered uncertainties have implications for the overall testing and verification strategy adopted by the Space Station Program. A description is given of the Galileo Orbiter fault protection design/verification approach. Attention is given to a mission description, an Orbiter description, the design approach and process, the fault protection design verification approach/process, and problems of 'stress' testing.

  3. Experimental verification of nanofluid shear-wave reconversion in ultrasonic fields.

    PubMed

    Forrester, Derek Michael; Huang, Jinrui; Pinfield, Valerie J; Luppé, Francine

    2016-03-14

    Here we present the verification of shear-mediated contributions to multiple scattering of ultrasound in suspensions. Acoustic spectroscopy was carried out with suspensions of silica of differing particle sizes and concentrations in water to find the attenuation at a broad range of frequencies. As the particle sizes approach the nanoscale, commonly used multiple scattering models fail to match experimental results. We develop a new model, taking into account shear mediated contributions, and find excellent agreement with the attenuation spectra obtained using two types of spectrometer. The results determine that shear-wave phenomena must be considered in ultrasound characterisation of nanofluids at even relatively low concentrations of scatterers that are smaller than one micrometre in diameter.

  4. Verification of Gyrokinetic codes: Theoretical background and applications

    NASA Astrophysics Data System (ADS)

    Tronko, Natalia; Bottino, Alberto; Görler, Tobias; Sonnendrücker, Eric; Told, Daniel; Villard, Laurent

    2017-05-01

    In fusion plasmas, the strong magnetic field allows the fast gyro-motion to be systematically removed from the description of the dynamics, resulting in a considerable model simplification and gain of computational time. Nowadays, the gyrokinetic (GK) codes play a major role in the understanding of the development and the saturation of turbulence and in the prediction of the subsequent transport. Naturally, these codes require thorough verification and validation. Here, we present a new and generic theoretical framework and specific numerical applications to test the faithfulness of the implemented models to theory and to verify the domain of applicability of existing GK codes. For a sound verification process, the underlying theoretical GK model and the numerical scheme must be considered at the same time, which has rarely been done and therefore makes this approach pioneering. At the analytical level, the main novelty consists in using advanced mathematical tools such as variational formulation of dynamics for systematization of basic GK code's equations to access the limits of their applicability. The verification of the numerical scheme is proposed via the benchmark effort. In this work, specific examples of code verification are presented for two GK codes: the multi-species electromagnetic ORB5 (PIC) and the radially global version of GENE (Eulerian). The proposed methodology can be applied to any existing GK code. We establish a hierarchy of reduced GK Vlasov-Maxwell equations implemented in the ORB5 and GENE codes using the Lagrangian variational formulation. At the computational level, detailed verifications of global electromagnetic test cases developed from the CYCLONE Base Case are considered, including a parametric β-scan covering the transition from ITG to KBM and the spectral properties at the nominal β value.

  5. Applications of Bayesian Procrustes shape analysis to ensemble radar reflectivity nowcast verification

    NASA Astrophysics Data System (ADS)

    Fox, Neil I.; Micheas, Athanasios C.; Peng, Yuqiang

    2016-07-01

    This paper introduces the use of Bayesian full Procrustes shape analysis in object-oriented meteorological applications. In particular, the Procrustes methodology is used to generate mean forecast precipitation fields from a set of ensemble forecasts. This approach has advantages over other ensemble averaging techniques in that it can produce a forecast that retains the morphological features of the precipitation structures and present the range of forecast outcomes represented by the ensemble. The production of the ensemble mean avoids the problems of smoothing that result from simple pixel or cell averaging, while producing credible sets that retain information on ensemble spread. Also in this paper, the full Bayesian Procrustes scheme is used as an object verification tool for precipitation forecasts. This is an extension of a previously presented Procrustes shape analysis based verification approach into a full Bayesian format designed to handle the verification of precipitation forecasts that match objects from an ensemble of forecast fields to a single truth image. The methodology is tested on radar reflectivity nowcasts produced in the Warning Decision Support System - Integrated Information (WDSS-II) by varying parameters in the K-means cluster tracking scheme.
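
The core alignment step behind this approach can be sketched with ordinary (non-Bayesian) Procrustes analysis: each ensemble member shape is translated, scaled, and rotated onto a reference before averaging, so morphology is not smoothed away as in pixel averaging. The landmark matrices and the simple iterative mean below are illustrative assumptions, not the paper's full Bayesian scheme:

```python
import numpy as np

def align(shape, ref):
    """Ordinary Procrustes: remove translation and scale, then rotate
    `shape` (an n x d landmark matrix) onto `ref` via the SVD."""
    a = shape - shape.mean(axis=0)
    b = ref - ref.mean(axis=0)
    a = a / np.linalg.norm(a)
    b = b / np.linalg.norm(b)
    u, _, vt = np.linalg.svd(a.T @ b)   # optimal rotation factors
    return a @ (u @ vt)

def procrustes_mean(shapes, iters=10):
    """Mean shape of an ensemble: iteratively align all members to the
    current reference, average, and renormalize."""
    ref = shapes[0] - shapes[0].mean(axis=0)
    ref = ref / np.linalg.norm(ref)
    for _ in range(iters):
        ref = np.mean([align(s, ref) for s in shapes], axis=0)
        ref = ref / np.linalg.norm(ref)
    return ref
```

Because the mean is taken over aligned shapes, precipitation structures keep their shape in the ensemble-mean field rather than blurring into one another.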

  6. PERFORMANCE VERIFICATION OF ANIMAL WATER TREATMENT TECHNOLOGIES THROUGH EPA'S ENVIRONMENTAL TECHNOLOGY VERIFICATION PROGRAM

    EPA Science Inventory

    The U.S. Environmental Protection Agency created the Environmental Technology Verification Program (ETV) to further environmental protection by accelerating the commercialization of new and innovative technology through independent performance verification and dissemination of in...

  7. Multi-level slug tests in highly permeable formations: 2. Hydraulic conductivity identification, method verification, and field applications

    USGS Publications Warehouse

    Zlotnik, V.A.; McGuire, V.L.

    1998-01-01

Using the developed theory and modified Springer-Gelhar (SG) model, an identification method is proposed for estimating hydraulic conductivity from multi-level slug tests. The computerized algorithm calculates hydraulic conductivity from both monotonic and oscillatory well responses obtained using a double-packer system. Field verification of the method was performed at a specially designed fully penetrating well of 0.1-m diameter with a 10-m screen in a sand and gravel alluvial aquifer (MSEA site, Shelton, Nebraska). During well installation, disturbed core samples were collected every 0.6 m using a split-spoon sampler. Vertical profiles of hydraulic conductivity were produced on the basis of grain-size analysis of the disturbed core samples. These results closely correlate with the vertical profile of horizontal hydraulic conductivity obtained by interpreting multi-level slug test responses using the modified SG model. The identification method was applied to interpret the response from 474 slug tests in 156 locations at the MSEA site. More than 60% of responses were oscillatory. The method produced a good match to experimental data for both oscillatory and monotonic responses using an automated curve matching procedure. The proposed method allowed us to drastically increase the efficiency of each well used for aquifer characterization and to process massive arrays of field data. Recommendations generalizing this experience to massive application of the proposed method are developed.

  8. SU-G-IeP4-06: Feasibility of External Beam Treatment Field Verification Using Cherenkov Imaging

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Black, P; Na, Y; Wuu, C

    2016-06-15

Purpose: Cherenkov light emission has been shown to correlate with ionizing radiation (IR) dose delivery in solid tissue. In order to properly correlate Cherenkov light images with real time dose delivery in a patient, we must account for geometric and intensity distortions arising from observation angle, as well as the effect of monitor units (MU) and field size on Cherenkov light emission. To test the feasibility of treatment field verification, we first focused on Cherenkov light emission efficiency based on MU and known field size (FS). Methods: Cherenkov light emission was captured using a PI-MAX4 intensified charge-coupled device (ICCD) system (Princeton Instruments), positioned at a fixed angle of 40° relative to the beam central axis. A Varian TrueBeam linear accelerator (linac) was operated at 6 MV and 600 MU/min to deliver an Anterior-Posterior beam to a 5 cm thick block phantom positioned at 100 cm Source-to-Surface Distance (SSD). FS of 10×10, 5×5, and 2×2 cm² were used. Before beam delivery, projected light field images were acquired, ensuring that geometric distortions were consistent when measuring Cherenkov field discrepancies. Cherenkov image acquisition was triggered by linac target current. 500 frames were acquired for each FS. Composite images were created through summation of frames and background subtraction. MU per image was calculated based on linac pulse delay of 2.8 ms. Cherenkov and projected light FS were evaluated using ImageJ software. Results: Mean Cherenkov FS discrepancies compared to light field were <0.5 cm for 5.6, 2.8, and 8.6 MU for 10×10, 5×5, and 2×2 cm² FS, respectively. Discrepancies were reduced with increasing field size and MU. We predict a minimum of 100 frames is needed for reliable confirmation of delivered FS. Conclusion: Current discrepancies in Cherenkov field sizes are within a usable range to confirm treatment delivery in standard and respiratory gated clinical scenarios at MU levels appropriate to

  9. PERFORMANCE VERIFICATION OF STORMWATER TREATMENT DEVICES UNDER EPA�S ENVIRONMENTAL TECHNOLOGY VERIFICATION PROGRAM

    EPA Science Inventory

    The Environmental Technology Verification (ETV) Program was created to facilitate the deployment of innovative or improved environmental technologies through performance verification and dissemination of information. The program�s goal is to further environmental protection by a...

  10. Verification of the databases EXFOR and ENDF

    NASA Astrophysics Data System (ADS)

    Berton, Gottfried; Damart, Guillaume; Cabellos, Oscar; Beauzamy, Bernard; Soppera, Nicolas; Bossant, Manuel

    2017-09-01

The objective of this work is the verification of large experimental (EXFOR) and evaluated nuclear reaction databases (JEFF, ENDF, JENDL, TENDL…). The work is applied to neutron reactions in EXFOR data, including threshold reactions, isomeric transitions, angular distributions, and data in the resonance region of both isotopes and natural elements. Finally, a comparison of the resonance integrals compiled in the EXFOR database with those derived from the evaluated libraries is also performed.

  11. Alignment verification procedures

    NASA Technical Reports Server (NTRS)

    Edwards, P. R.; Phillips, E. P.; Newman, J. C., Jr.

    1988-01-01

    In alignment verification procedures each laboratory is required to align its test machines and gripping fixtures to produce a nearly uniform tensile stress field on an un-notched sheet specimen. The blank specimens (50 mm w X 305 mm l X 2.3 mm th) supplied by the coordinators were strain gauged. Strain gauge readings were taken at all gauges (n = 1 through 10). The alignment verification procedures are as follows: (1) zero all strain gauges while specimen is in a free-supported condition; (2) put strain-gauged specimen in the test machine so that specimen front face (face 1) is in contact with reference jaw (standard position of specimen), tighten grips, and at zero load measure strains on all gauges. (epsilon sub nS0 is strain at gauge n, standard position, zero load); (3) with specimen in machine and at a tensile load of 10 kN measure strains (specimen in standard position). (Strain = epsilon sub nS10); (4) remove specimen from machine. Put specimen in machine so that specimen back face (face 2) is in contact with reference jaw (reverse position of specimen), tighten grips, and at zero load measure strains on all gauges. (Strain - epsilon sub nR0); and (5) with specimen in machine and at tensile load of 10 kN measure strains (specimen in reverse position). (epsilon sub nR10 is strain at gauge n, reverse position, 10 kN load).
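
The abstract does not spell out how these readings are reduced to an alignment figure; a common practice (in the spirit of ASTM E1012) is to subtract the zero-load readings, average the standard and reversed positions to cancel initial specimen curvature, and report the worst gauge deviation as percent bending. The sketch below is a hypothetical reduction along those lines, not the coordinators' prescribed formula:

```python
import numpy as np

def percent_bending(eps_s0, eps_s10, eps_r0, eps_r10):
    """Percent bending from strain-gauge readings taken with the specimen
    in the standard (S) and reversed (R) positions at zero load and 10 kN.

    Each argument is an array of readings for gauges n = 1..10. Zero-load
    readings are subtracted to isolate load-induced strain; averaging the
    two positions cancels initial specimen curvature."""
    s = np.asarray(eps_s10) - np.asarray(eps_s0)   # load-induced strain, standard
    r = np.asarray(eps_r10) - np.asarray(eps_r0)   # load-induced strain, reversed
    net = (s + r) / 2.0                            # curvature-corrected strain
    mean = net.mean()                              # nominal axial strain
    return 100.0 * np.max(np.abs(net - mean)) / mean
```

A perfectly aligned machine yields identical net strains at all gauges and hence zero percent bending.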

  12. 18 CFR 281.213 - Data Verification Committee.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... preparation. (e) The Data Verification Committee shall prepare a report concerning the proposed index of... 18 Conservation of Power and Water Resources 1 2011-04-01 2011-04-01 false Data Verification....213 Data Verification Committee. (a) Each interstate pipeline shall establish a Data Verification...

  13. 18 CFR 281.213 - Data Verification Committee.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... preparation. (e) The Data Verification Committee shall prepare a report concerning the proposed index of... 18 Conservation of Power and Water Resources 1 2010-04-01 2010-04-01 false Data Verification....213 Data Verification Committee. (a) Each interstate pipeline shall establish a Data Verification...

  14. A Practitioners Perspective on Verification

    NASA Astrophysics Data System (ADS)

    Steenburgh, R. A.

    2017-12-01

NOAA's Space Weather Prediction Center offers a wide range of products and services to meet the needs of an equally wide range of customers. A robust verification program is essential to the informed use of model guidance and other tools by forecasters and end users alike. In this talk, we present current SWPC practices and results, and examine emerging requirements and potential approaches to satisfy them. We explore the varying verification needs of forecasters and end users, as well as the role of subjective and objective verification. Finally, we describe a vehicle used in the meteorological community to unify approaches to model verification and facilitate intercomparison.
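
The objective measures such a program relies on for yes/no forecasts are all simple functions of the 2×2 contingency table (hits, false alarms, misses, correct negatives). The sketch below covers the standard set; it mirrors common forecast-verification practice rather than any specific SWPC implementation:

```python
def verification_measures(hits, false_alarms, misses, correct_negs):
    """Standard dichotomous (yes/no) forecast verification measures
    computed from the 2x2 contingency table."""
    a, b, c, d = hits, false_alarms, misses, correct_negs
    n = a + b + c + d
    return {
        "frequency_bias": (a + b) / (a + c),    # bias
        "proportion_correct": (a + d) / n,      # accuracy
        "csi": a / (a + b + c),                 # critical success index
        "pod": a / (a + c),                     # probability of detection
        "far": b / (a + b),                     # false alarm ratio
        "peirce": a / (a + c) - b / (b + d),    # Peirce skill score (POD - POFD)
    }
```

For example, a forecast set with 80 hits, 20 false alarms, 20 misses, and 80 correct negatives is unbiased (frequency bias 1.0) with POD 0.8, FAR 0.2, and Peirce skill score 0.6.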

  15. SIR-A imagery in geologic studies of the Sierra Madre Oriental, northeastern Mexico. Part 1 (Regional stratigraphy): The use of morphostratigraphic units in remote sensing mapping

    NASA Technical Reports Server (NTRS)

    Longoria, J. F.; Jimenez, O. H.

    1985-01-01

SIR-A imaging was used in geological studies of sedimentary terrains in the Sierra Madre Oriental, northeastern Mexico. Geological features such as regional strike and dip, bedding, folding, and faulting were readily detected on the imagery. The recognition of morphostructural units in the imagery, coupled with field verification, enabled geological mapping of the region at the scale of 1:250 000. Structural profiling led to the elaboration of a morphostructural map allowing the recognition of en echelon folds and fault trends, which were used to postulate the tectonic setting of the region.

  16. Online 3D EPID-based dose verification: Proof of concept

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Spreeuw, Hanno; Rozendaal, Roel, E-mail: r.rozenda

Purpose: Delivery errors during radiotherapy may lead to medical harm and reduced life expectancy for patients. Such serious incidents can be avoided by performing dose verification online, i.e., while the patient is being irradiated, creating the possibility of halting the linac in case of a large overdosage or underdosage. The offline EPID-based 3D in vivo dosimetry system clinically employed at our institute is in principle suited for online treatment verification, provided the system is able to complete 3D dose reconstruction and verification within 420 ms, the present acquisition time of a single EPID frame. It is the aim of this study to show that our EPID-based dosimetry system can be made fast enough to achieve online 3D in vivo dose verification. Methods: The current dose verification system was sped up in two ways. First, a new software package was developed to perform all computations that are not dependent on portal image acquisition separately, thus removing the need for doing these calculations in real time. Second, the 3D dose reconstruction algorithm was sped up via a new, multithreaded implementation. Dose verification was implemented by comparing planned with reconstructed 3D dose distributions delivered to two regions in a patient: the target volume and the nontarget volume receiving at least 10 cGy. In both volumes, the mean dose is compared, while in the nontarget volume, the near-maximum dose (D2) is compared as well. The real-time dosimetry system was tested by irradiating an anthropomorphic phantom with three VMAT plans: a 6 MV head-and-neck treatment plan, a 10 MV rectum treatment plan, and a 10 MV prostate treatment plan. In all plans, two types of serious delivery errors were introduced. The functionality of automatically halting the linac was also implemented and tested. Results: The precomputation time per treatment was ∼180 s/treatment arc, depending on gantry angle resolution. The complete processing of a single portal frame

  17. Online 3D EPID-based dose verification: Proof of concept.

    PubMed

    Spreeuw, Hanno; Rozendaal, Roel; Olaciregui-Ruiz, Igor; González, Patrick; Mans, Anton; Mijnheer, Ben; van Herk, Marcel

    2016-07-01

    Delivery errors during radiotherapy may lead to medical harm and reduced life expectancy for patients. Such serious incidents can be avoided by performing dose verification online, i.e., while the patient is being irradiated, creating the possibility of halting the linac in case of a large overdosage or underdosage. The offline EPID-based 3D in vivo dosimetry system clinically employed at our institute is in principle suited for online treatment verification, provided the system is able to complete 3D dose reconstruction and verification within 420 ms, the present acquisition time of a single EPID frame. It is the aim of this study to show that our EPID-based dosimetry system can be made fast enough to achieve online 3D in vivo dose verification. The current dose verification system was sped up in two ways. First, a new software package was developed to perform all computations that are not dependent on portal image acquisition separately, thus removing the need for doing these calculations in real time. Second, the 3D dose reconstruction algorithm was sped up via a new, multithreaded implementation. Dose verification was implemented by comparing planned with reconstructed 3D dose distributions delivered to two regions in a patient: the target volume and the nontarget volume receiving at least 10 cGy. In both volumes, the mean dose is compared, while in the nontarget volume, the near-maximum dose (D2) is compared as well. The real-time dosimetry system was tested by irradiating an anthropomorphic phantom with three VMAT plans: a 6 MV head-and-neck treatment plan, a 10 MV rectum treatment plan, and a 10 MV prostate treatment plan. In all plans, two types of serious delivery errors were introduced. The functionality of automatically halting the linac was also implemented and tested. The precomputation time per treatment was ∼180 s/treatment arc, depending on gantry angle resolution. 
The complete processing of a single portal frame, including dose verification, took
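
The per-region comparison described in the abstract reduces to a few array operations on the planned and reconstructed dose grids. In the sketch below, taking D2 as the 98th percentile, using a 3% relative tolerance, and the mask layout are illustrative assumptions, not the authors' clinical criteria:

```python
import numpy as np

def dose_check(planned, reconstructed, target_mask, tol=0.03):
    """Compare planned and reconstructed 3D dose (cGy) in the two regions
    named in the abstract: the target volume and the nontarget volume
    receiving at least 10 cGy. Mean dose is checked in both regions and
    the near-maximum dose D2 in the nontarget region. Returns True when
    every relative deviation is within `tol`."""
    nontarget = ~target_mask & (planned >= 10.0)
    pairs = [
        (planned[target_mask].mean(), reconstructed[target_mask].mean()),
        (planned[nontarget].mean(), reconstructed[nontarget].mean()),
        (np.percentile(planned[nontarget], 98),
         np.percentile(reconstructed[nontarget], 98)),
    ]
    return all(abs(r - p) / p <= tol for p, r in pairs)
```

In an online setting a False result would be the trigger for halting the linac.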

  18. Cleanup Verification Package for the 600-47 Waste Site

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    M. J. Cutlip

    This cleanup verification package documents completion of interim remedial action for the 600-47 waste site. This site consisted of several areas of surface debris and contamination near the banks of the Columbia River across from Johnson Island. Contaminated material identified in field surveys included four areas of soil, wood, nuts, bolts, and other metal debris.

  19. 42 CFR 457.380 - Eligibility verification.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 42 Public Health 4 2010-10-01 2010-10-01 false Eligibility verification. 457.380 Section 457.380... Requirements: Eligibility, Screening, Applications, and Enrollment § 457.380 Eligibility verification. (a) The... State may establish reasonable eligibility verification mechanisms to promote enrollment of eligible...

  20. Correlation between magnetic and electric field perturbations in the field-aligned current regions deduced from DE 2 observations

    NASA Technical Reports Server (NTRS)

    Ishii, M.; Sugiura, M.; Iyemori, T.; Slavin, J. A.

    1992-01-01

    The satellite-observed high correlations between magnetic and electric field perturbations in the high-latitude field-aligned current regions are investigated by examining the dependence of the relationship between Delta-B and E on spatial scale, using the electric and magnetic field data obtained by DE 2 in the polar regions. The results are compared with the Pedersen conductivity inferred from the international reference ionosphere model and the Alfven wave velocity calculated from the in situ ion density and magnetic field measurements.

  1. Explaining Verification Conditions

    NASA Technical Reports Server (NTRS)

    Deney, Ewen; Fischer, Bernd

    2006-01-01

    The Hoare approach to program verification relies on the construction and discharge of verification conditions (VCs) but offers no support to trace, analyze, and understand the VCs themselves. We describe a systematic extension of the Hoare rules by labels so that the calculus itself can be used to build up explanations of the VCs. The labels are maintained through the different processing steps and rendered as natural language explanations. The explanations can easily be customized and can capture different aspects of the VCs; here, we focus on their structure and purpose. The approach is fully declarative and the generated explanations are based only on an analysis of the labels rather than directly on the logical meaning of the underlying VCs or their proofs. Keywords: program verification, Hoare calculus, traceability.

  2. Viability Study for an Unattended UF 6 Cylinder Verification Station: Phase I Final Report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Smith, Leon E.; Miller, Karen A.; Garner, James R.

In recent years, the International Atomic Energy Agency (IAEA) has pursued innovative techniques and an integrated suite of safeguards measures to address the verification challenges posed by the front end of the nuclear fuel cycle. Among the unattended instruments currently being explored by the IAEA is an Unattended Cylinder Verification Station (UCVS) that could provide automated, independent verification of the declared relative enrichment, 235U mass, total uranium mass and identification for all declared UF 6 cylinders in a facility (e.g., uranium enrichment plants and fuel fabrication plants). Under the auspices of the United States and European Commission Support Programs to the IAEA, a project was undertaken to assess the technical and practical viability of the UCVS concept. The US Support Program team consisted of Pacific Northwest National Laboratory (PNNL, lead), Los Alamos National Laboratory (LANL), Oak Ridge National Laboratory (ORNL) and Savannah River National Laboratory (SRNL). At the core of the viability study is a long-term field trial of a prototype UCVS system at a Westinghouse fuel fabrication facility. A key outcome of the study is a quantitative performance evaluation of two nondestructive assay (NDA) methods being considered for inclusion in a UCVS: Hybrid Enrichment Verification Array (HEVA), and Passive Neutron Enrichment Meter (PNEM). This report provides context for the UCVS concept and the field trial: potential UCVS implementation concepts at an enrichment facility; an overview of UCVS prototype design; field trial objectives and activities. Field trial results and interpretation are presented, with a focus on the performance of PNEM and HEVA for the assay of over 200 “typical” Type 30B cylinders, and the viability of an “NDA Fingerprint” concept as a high-fidelity means to periodically verify that the contents of a given cylinder are consistent with previous scans. A modeling study, combined with field

  3. Quantum money with classical verification

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gavinsky, Dmitry

    We propose and construct a quantum money scheme that allows verification through classical communication with a bank. This is the first demonstration that a secure quantum money scheme exists that does not require quantum communication for coin verification. Our scheme is secure against adaptive adversaries - this property is not directly related to the possibility of classical verification, nevertheless none of the earlier quantum money constructions is known to possess it.

  4. Quantum money with classical verification

    NASA Astrophysics Data System (ADS)

    Gavinsky, Dmitry

    2014-12-01

    We propose and construct a quantum money scheme that allows verification through classical communication with a bank. This is the first demonstration that a secure quantum money scheme exists that does not require quantum communication for coin verification. Our scheme is secure against adaptive adversaries - this property is not directly related to the possibility of classical verification, nevertheless none of the earlier quantum money constructions is known to possess it.

  5. Kleene Algebra and Bytecode Verification

    DTIC Science & Technology

    2016-04-27

computing the star (Kleene closure) of a matrix of transfer functions. In this paper we show how this general framework applies to the problem of Java ...bytecode verification. We show how to specify transfer functions arising in Java bytecode verification in such a way that the Kleene algebra operations...potentially improve the performance over the standard worklist algorithm when a small cutset can be found. Key words: Java, bytecode, verification, static
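
The central operation, the star of a square matrix over a Kleene algebra, can be illustrated over the simplest Kleene algebra, the Booleans, where it reduces to reflexive-transitive closure. A real bytecode verifier would use dataflow transfer functions as matrix entries; Booleans stand in here as an assumption for brevity:

```python
def matrix_star(m):
    """Kleene star of a square Boolean matrix: E + M + M^2 + ...,
    i.e. the reflexive-transitive closure, computed by a
    Floyd-Warshall-style elimination."""
    n = len(m)
    # start from E + M (add the identity for reflexivity)
    a = [[bool(m[i][j]) or i == j for j in range(n)] for i in range(n)]
    for k in range(n):
        for i in range(n):
            for j in range(n):
                a[i][j] = a[i][j] or (a[i][k] and a[k][j])
    return a
```

With transfer functions instead of Booleans, `or` becomes join and `and` becomes composition, and the same elimination computes the fixed point that the worklist algorithm iterates toward.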

  6. Mutation Testing for Effective Verification of Digital Components of Physical Systems

    NASA Astrophysics Data System (ADS)

    Kushik, N. G.; Evtushenko, N. V.; Torgaev, S. N.

    2015-12-01

Digital components of modern physical systems are often designed using circuit solutions based on field-programmable gate array (FPGA) technology. Such (embedded) digital components should be carefully tested. In this paper, an approach to the verification of digital physical system components based on mutation testing is proposed. The reference description of the behavior of a digital component in a hardware description language (HDL) is mutated by introducing the most probable errors into it and, unlike mutants in high-level programming languages, the corresponding test case is effectively derived based on a comparison of special scalable representations of the specification and the constructed mutant using various logic synthesis and verification systems.

  7. Capturing field-scale variability in crop performance across a regional-scale climosequence

    NASA Astrophysics Data System (ADS)

    Brooks, E. S.; Poggio, M.; Anderson, T. R.; Gasch, C.; Yourek, M. A.; Ward, N. K.; Magney, T. S.; Brown, D. J.; Huggins, D. R.

    2014-12-01

    With the increasing availability of variable rate technology for applying fertilizers and other agrichemicals in dryland agricultural production systems there is a growing need to better capture and understand the processes driving field scale variability in crop yield and soil water. This need for a better understanding of field scale variability has led to the recent designation of the R. J. Cook Agronomy Farm (CAF) (Pullman, WA, USA) as a United States Department of Agriculture Long-Term Agro-Ecosystem Research (LTAR) site. Field scale variability at the CAF is closely monitored using extensive environmental sensor networks and intensive hand sampling. As investigating land-soil-water dynamics at CAF is essential for improving precision agriculture, transferring this knowledge across the regional-scale climosequence is challenging. In this study we describe the hydropedologic functioning of the CAF in relation to five extensively instrumented field sites located within 50 km in the same climatic region. The formation of restrictive argillic soil horizons in the wetter, cooler eastern edge of the region results in the development of extensive perched water tables, surface saturation, and surface runoff, whereas excess water is not an issue in the warmer, drier, western edge of the region. Similarly, crop and tillage management varies across the region as well. We discuss the implications of these regional differences on field scale management decisions and demonstrate how we are using proximal soil sensing and remote sensing imagery to better understand and capture field scale variability at a particular field site.

  8. MARATHON Verification (MARV)

    DTIC Science & Technology

    2017-08-01

comparable with MARATHON 1 in terms of output. Rather, the MARATHON 2 verification cases were designed to ensure correct implementation of the new algorithms...for employment against demands. This study is a comparative verification of the functionality of MARATHON 4 (our newest implementation of MARATHON

  9. Integrated Verification Experiment data collected as part of the Los Alamos National Laboratory`s Source Region Program. Appendix B: Surface ground motion

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Weaver, T.A.; Baker, D.F.; Edwards, C.L.

    1993-10-01

Surface ground motion was recorded for many of the Integrated Verification Experiments using standard 10-, 25- and 100-g accelerometers, force-balanced accelerometers and, for some events, using golf balls and 0.39-cm steel balls as surface inertial gauges (SIGs). This report contains the semi-processed acceleration, velocity, and displacement data for the accelerometers fielded and the individual observations for the SIG experiments. Most acceleration, velocity, and displacement records have had calibrations applied and have been deramped, offset corrected, and deglitched but are otherwise unfiltered or processed from their original records. Digital data for all of these records are stored at Los Alamos National Laboratory.

  10. Exomars Mission Verification Approach

    NASA Astrophysics Data System (ADS)

    Cassi, Carlo; Gilardi, Franco; Bethge, Boris

According to the long-term cooperation plan established by ESA and NASA in June 2009, the ExoMars project now consists of two missions: A first mission will be launched in 2016 under ESA lead, with the objectives to demonstrate the European capability to safely land a surface package on Mars, to perform Mars atmosphere investigation, and to provide communication capability for present and future ESA/NASA missions. For this mission ESA provides a spacecraft composite, made up of an "Entry Descent & Landing Demonstrator Module (EDM)" and a Mars Orbiter Module (OM); NASA provides the launch vehicle and the scientific instruments located on the Orbiter for Mars atmosphere characterisation. A second mission, with its launch foreseen in 2018, is led by NASA, who provides the spacecraft and launcher, the EDL system, and a rover. ESA contributes the ExoMars Rover Module (RM) to provide surface mobility. It includes a drill system allowing drilling down to 2 meters, collecting samples and investigating them for signs of past and present life with exobiological experiments, and investigating the Mars water/geochemical environment. In this scenario Thales Alenia Space Italia, as ESA prime industrial contractor, is in charge of the design, manufacturing, integration and verification of the ESA ExoMars modules, i.e.: the Spacecraft Composite (OM + EDM) for the 2016 mission, the RM for the 2018 mission and the Rover Operations Control Centre, which will be located at Altec-Turin (Italy). The verification process of the above products is quite complex and will include some peculiarities with limited or no heritage in Europe. Furthermore the verification approach has to be optimised to allow full verification despite significant schedule and budget constraints. The paper presents the verification philosophy tailored for the ExoMars mission in line with the above considerations, starting from the model philosophy, showing the verification activities flow and the sharing of tests

  11. Verification and benchmark testing of the NUFT computer code

    NASA Astrophysics Data System (ADS)

    Lee, K. H.; Nitao, J. J.; Kulshrestha, A.

    1993-10-01

    This interim report presents results of work completed in the ongoing verification and benchmark testing of the NUFT (Nonisothermal Unsaturated-saturated Flow and Transport) computer code. NUFT is a suite of multiphase, multicomponent models for numerical solution of thermal and isothermal flow and transport in porous media, with application to subsurface contaminant transport problems. The code simulates the coupled transport of heat, fluids, and chemical components, including volatile organic compounds. Grid systems may be cartesian or cylindrical, with one-, two-, or fully three-dimensional configurations possible. In this initial phase of testing, the NUFT code was used to solve seven one-dimensional unsaturated flow and heat transfer problems. Three verification and four benchmarking problems were solved. In the verification testing, excellent agreement was observed between NUFT results and the analytical or quasianalytical solutions. In the benchmark testing, results of code intercomparison were very satisfactory. From these testing results, it is concluded that the NUFT code is ready for application to field and laboratory problems similar to those addressed here. Multidimensional problems, including those dealing with chemical transport, will be addressed in a subsequent report.

  12. Microcode Verification Project.

    DTIC Science & Technology

    1980-05-01

    numerical constant. The internal syntax for these minimum and maximum values is REALMIN and REALMAX. ISPSSIMP is the file simplifying bitstring... To be fair, it is quite clear that much of the labor of the verification task can be reduced if verification and code development are carried out... basis of the language we have chosen for both encoding our descriptions of machines and reasoning about the course of computations. Internally, our

  13. Analysis of potential errors in real-time streamflow data and methods of data verification by digital computer

    USGS Publications Warehouse

    Lystrom, David J.

    1972-01-01

    Various methods of verifying real-time streamflow data are outlined in part II. Relatively large errors (those greater than 20-30 percent) can be detected readily by use of well-designed verification programs for a digital computer, and smaller errors can be detected only by discharge measurements and field observations. The capability to substitute a simulated discharge value for missing or erroneous data is incorporated in some of the verification routines described. The routines represent concepts ranging from basic statistical comparisons to complex watershed modeling and provide a selection from which real-time data users can choose a suitable level of verification.
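
    The simplest of the verification concepts mentioned, a statistical comparison with substitution of simulated values for missing or erroneous data, might look like the following sketch (the function name, tolerance, and data are illustrative, not from the report):

    ```python
    # Flag gauge readings that deviate from a simulated discharge by more than a
    # relative tolerance, and substitute the simulated value for flagged readings.
    def verify_streamflow(observed, simulated, tol=0.25):
        """Return (verified_series, flags); a flag marks a substituted value."""
        verified, flags = [], []
        for obs, sim in zip(observed, simulated):
            if obs is None or (sim > 0 and abs(obs - sim) / sim > tol):
                verified.append(sim)   # substitute the simulated discharge
                flags.append(True)     # mark as missing/erroneous
            else:
                verified.append(obs)
                flags.append(False)
        return verified, flags

    obs = [10.2, 10.5, None, 25.0, 10.9]   # cfs; None = missing reading
    sim = [10.0, 10.4, 10.6, 10.8, 11.0]
    vals, flags = verify_streamflow(obs, sim)
    print(vals)   # [10.2, 10.5, 10.6, 10.8, 10.9]
    print(flags)  # [False, False, True, True, False]
    ```

    Catching smaller errors, as the abstract notes, is beyond such a routine and requires discharge measurements and field observations.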

  14. Face verification with balanced thresholds.

    PubMed

    Yan, Shuicheng; Xu, Dong; Tang, Xiaoou

    2007-01-01

    The process of face verification is guided by a pre-learned global threshold, which, however, is often inconsistent with class-specific optimal thresholds. It is, hence, beneficial to pursue a balance of the class-specific thresholds in the model-learning stage. In this paper, we present a new dimensionality reduction algorithm tailored to the verification task that ensures threshold balance. This is achieved by the following aspects. First, feasibility is guaranteed by employing an affine transformation matrix, instead of the conventional projection matrix, for dimensionality reduction, and, hence, we call the proposed algorithm threshold balanced transformation (TBT). Then, the affine transformation matrix, constrained as the product of an orthogonal matrix and a diagonal matrix, is optimized to improve the threshold balance and classification capability in an iterative manner. Unlike most algorithms for face verification which are directly transplanted from face identification literature, TBT is specifically designed for face verification and clarifies the intrinsic distinction between these two tasks. Experiments on three benchmark face databases demonstrate that TBT significantly outperforms the state-of-the-art subspace techniques for face verification.

  15. Verification of intensity modulated profiles using a pixel segmented liquid-filled linear array.

    PubMed

    Pardo, J; Roselló, J V; Sánchez-Doblado, F; Gómez, F

    2006-06-07

    A liquid isooctane (C8H18) filled ionization chamber linear array developed for radiotherapy quality assurance, consisting of 128 pixels (each of them with a 1.7 mm pitch), has been used to acquire profiles of several intensity modulated fields. The results were compared with film measurements using the gamma test. The comparisons show a very good matching, even in high gradient dose regions. The volume-averaging effect of the pixels is negligible and the spatial resolution is sufficient to verify these regions. However, some mismatches between the detectors have been found in regions where low-energy scattered photons significantly contribute to the total dose. These differences are not very important (in fact, the measurements of both detectors are in agreement using the gamma test with tolerances of 3% and 3 mm in most of those regions), and may be associated with the film energy dependence. In addition, the linear array repeatability (0.27% one standard deviation) is much better than that of the film (approximately 3%). The good repeatability, small pixel size and high spatial resolution make the detector ideal for the real time profile verification of high gradient beam profiles like those present in intensity modulated radiation therapy and radiosurgery.
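
    The gamma test referred to above combines a dose-difference criterion with a distance-to-agreement criterion: a reference point passes if some point of the evaluated profile lies within the combined tolerance ellipse. A minimal 1-D version (illustrative data; 3%/3 mm global criteria) could be sketched as:

    ```python
    import math

    def gamma_1d(ref_pos, ref_dose, eval_pos, eval_dose, dta=3.0, dd=0.03):
        """Gamma index of each reference point against an evaluated profile.

        Positions in mm, doses normalized to the global maximum; a point
        passes the test when its gamma value is <= 1.
        """
        gammas = []
        for rp, rd in zip(ref_pos, ref_dose):
            g2 = min(((ep - rp) / dta) ** 2 + ((ed - rd) / dd) ** 2
                     for ep, ed in zip(eval_pos, eval_dose))
            gammas.append(math.sqrt(g2))
        return gammas

    pos = [i * 1.7 for i in range(8)]   # 1.7 mm pixel pitch, as in the array
    film = [0.05, 0.08, 0.30, 0.90, 1.00, 0.88, 0.28, 0.06]   # illustrative
    array = [0.06, 0.09, 0.32, 0.92, 0.99, 0.90, 0.30, 0.05]  # illustrative
    g = gamma_1d(pos, film, pos, array)
    pass_rate = sum(1 for x in g if x <= 1.0) / len(g)
    print(f"gamma pass rate: {pass_rate:.0%}")   # gamma pass rate: 100%
    ```

    Clinical gamma implementations interpolate the evaluated distribution and work in 2-D or 3-D; the brute-force minimum over sample points above only conveys the idea.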

  16. Partial defect verification of spent fuel assemblies by PDET: Principle and field testing in Interim Spent fuel Storage Facility (CLAB) in Sweden

    DOE PAGES

    Ham, Y.; Kerr, P.; Sitaraman, S.; ...

    2016-05-05

    Here, the need for the development of a credible method and instrument for partial defect verification of spent fuel has been emphasized over a few decades in the safeguards communities, as diverted spent fuel pins can be the source of nuclear terrorism or devices. The need is increasingly more important and even urgent as many countries have started to transfer spent fuel to so called "difficult-to-access" areas such as dry storage casks, reprocessing or geological repositories. Partial defect verification is required by the IAEA before spent fuel is placed into "difficult-to-access" areas. Earlier, Lawrence Livermore National Laboratory (LLNL) reported the successful development of a new, credible partial defect verification method for pressurized water reactor (PWR) spent fuel assemblies without use of operator data, and further reported the validation experiments using commercial spent fuel assemblies with some missing fuel pins. The method was found to be robust, as it is relatively invariant to the characteristic variations of spent fuel assemblies such as initial fuel enrichment, cooling time, and burn-up. Since then, the PDET system has been designed and prototyped for 17×17 PWR spent fuel assemblies, complete with data acquisition software and acquisition electronics. In this paper, a summary description of the PDET development will be presented, followed by results of the first successful field testing using the integrated PDET system and actual spent fuel assemblies, performed in a commercial spent fuel storage site known as the Central Interim Spent Fuel Storage Facility (CLAB) in Sweden. In addition to partial defect detection, initial studies have determined that the tool can be used to verify the operator-declared average burnup of the assembly as well as intra-assembly burnup levels.

  19. Space transportation system payload interface verification

    NASA Technical Reports Server (NTRS)

    Everline, R. T.

    1977-01-01

    The paper considers STS payload-interface verification requirements and the capability provided by STS to support verification. The intent is to standardize as many interfaces as possible, not only through the design, development, test and evaluation (DDT and E) phase of the major payload carriers but also into the operational phase. The verification process is discussed in terms of its various elements, such as the Space Shuttle DDT and E (including the orbital flight test program) and the major payload carriers DDT and E (including the first flights). Five tools derived from the Space Shuttle DDT and E are available to support the verification process: mathematical (structural and thermal) models, the Shuttle Avionics Integration Laboratory, the Shuttle Manipulator Development Facility, and interface-verification equipment (cargo-integration test equipment).

  20. How Much Energy Can Be Stored in Solar Active Region Magnetic Fields?

    NASA Astrophysics Data System (ADS)

    Linker, J.; Downs, C.; Torok, T.; Titov, V. S.; Lionello, R.; Mikic, Z.; Riley, P.

    2015-12-01

    Major solar eruptions such as X-class flares and very fast coronal mass ejections usually originate in active regions on the Sun. The energy that powers these events is believed to be stored as free magnetic energy (energy above the potential field state) prior to eruption. While coronal magnetic fields are not in general force-free, active regions have very strong magnetic fields, and at low coronal heights the plasma beta is therefore very small, making the field (in equilibrium) essentially force-free. The Aly-Sturrock theorem shows that the energy of a fully force-free field cannot exceed the energy of the so-called open field. If the theorem holds, this places an upper limit on the amount of free energy that can be stored: the maximum free energy (MFE) is the difference between the open field energy and the potential field energy of the active region. In thermodynamic MHD simulations of a major eruption (the July 14, 2000 'Bastille Day' event) and a modest event (February 13, 2009), we have found that the MFE indeed bounds the energy stored prior to eruption. We compute the MFE for major eruptive events in cycles 23 and 24 to investigate the maximum amount of energy that can be stored in solar active regions. Research supported by AFOSR, NASA, and NSF.
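
    The bound described above can be written compactly. With E the magnetic energy of the (force-free) coronal field, E_pot the potential field energy, and E_open the open field energy of the same boundary flux distribution, the Aly-Sturrock theorem gives:

    ```latex
    E_{\text{free}} = E - E_{\text{pot}}, \qquad E \le E_{\text{open}}
    \quad\Longrightarrow\quad
    E_{\text{free}} \;\le\; E_{\text{open}} - E_{\text{pot}} \;\equiv\; \text{MFE}.
    ```

    This is only a restatement of the abstract's argument in symbols, not a new result.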

  1. Verification of different forecasts of Hungarian Meteorological Service

    NASA Astrophysics Data System (ADS)

    Feher, B.

    2009-09-01

    In this paper I show the results of the forecasts made by the Hungarian Meteorological Service. I focus on the general short- and medium-range forecasts, which contain cloudiness, precipitation, wind speed and temperature for six regions of Hungary. I would also like to show the results of some special forecasts, such as precipitation predictions made for the catchment areas of the Danube and Tisza rivers, and daily mean temperature predictions used by Hungarian energy companies. The product received by the user is made by the general forecaster, but these predictions are based on the ALADIN and ECMWF outputs. For this reason, the products of both the forecasters and the models were verified. A method like this can show us which weather elements are more difficult to forecast and which regions have higher errors. During the verification procedure the basic errors (mean error, mean absolute error) are calculated. Precipitation amount is classified into five categories, and scores such as POD, TS and PC are defined from the contingency table determined by these categories. The procedure runs fully automatically; all the forecasters have to do is print the daily result each morning. Besides the daily results, verification is also made for longer periods such as a week, month or year. Analyzing the results of longer periods, we can say that the best predictions are made for the first few days, that precipitation forecasts are less good for mountainous areas, and that the scores of the forecasters are sometimes even better than those of the models. Since forecasters receive the results the next day, this can help them reduce mistakes and learn the weaknesses of the models. This paper contains the verification scores, their trends, the method by which these scores are calculated, and some case studies on worse forecasts.
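
    The contingency-table scores mentioned (POD, TS, PC, and related measures from the verification literature) can all be computed from the four cell counts of a dichotomous table. A minimal sketch, with illustrative counts:

    ```python
    # 2x2 contingency table: a = hits, b = false alarms,
    # c = misses, d = correct negatives.
    def dichotomous_scores(a, b, c, d):
        n = a + b + c + d
        return {
            "POD":  a / (a + c),               # probability of detection
            "FAR":  b / (a + b),               # false alarm ratio
            "TS":   a / (a + b + c),           # threat / critical success index
            "PC":   (a + d) / n,               # proportion correct
            "bias": (a + b) / (a + c),         # frequency bias
            "PSS":  a / (a + c) - b / (b + d), # Peirce skill score
        }

    # illustrative counts for one precipitation category
    s = dichotomous_scores(a=42, b=18, c=8, d=132)
    print({k: round(v, 2) for k, v in s.items()})
    ```

    Multi-category forecasts, such as the five precipitation classes described, reduce to a set of such tables (one per category) or are scored with genuinely multi-categorical measures.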

  2. Magnetic fields, stellar feedback, and the geometry of H II regions

    NASA Astrophysics Data System (ADS)

    Ferland, Gary J.

    2009-04-01

    Magnetic pressure has long been known to dominate over gas pressure in atomic and molecular regions of the interstellar medium. Here I review several recent observational studies of the relationships between the H+, H0 and H2 regions in M42 (the Orion complex) and M17. A simple picture results. When stars form they push back surrounding material, mainly through the outward momentum of starlight acting on grains, and field lines are dragged with the gas due to flux freezing. The magnetic field is compressed and the magnetic pressure increases until it is able to resist further expansion and the system comes into approximate magnetostatic equilibrium. Magnetic field lines can be preferentially aligned perpendicular to the long axis of a quiescent cloud before stars form. After star formation and pushback occur, ionized gas will be constrained to flow along field lines and escape from the system along directions perpendicular to the long axis. The magnetic field may play other roles in the physics of the H II region and associated PDR. Cosmic rays may be enhanced along with the field and provide additional heating of atomic and molecular material. Wave motions may be associated with the field and contribute a component of turbulence to observed line profiles.

  3. Formulating face verification with semidefinite programming.

    PubMed

    Yan, Shuicheng; Liu, Jianzhuang; Tang, Xiaoou; Huang, Thomas S

    2007-11-01

    This paper presents a unified solution to three unsolved problems existing in face verification with subspace learning techniques: selection of verification threshold, automatic determination of subspace dimension, and deducing feature fusing weights. In contrast to previous algorithms which search for the projection matrix directly, our new algorithm investigates a similarity metric matrix (SMM). With a certain verification threshold, this matrix is learned by a semidefinite programming approach, along with the constraints of the kindred pairs with similarity larger than the threshold, and inhomogeneous pairs with similarity smaller than the threshold. Then, the subspace dimension and the feature fusing weights are simultaneously inferred from the singular value decomposition of the derived SMM. In addition, the weighted and tensor extensions are proposed to further improve the algorithmic effectiveness and efficiency, respectively. Essentially, the verification is conducted within an affine subspace in this new algorithm and is, hence, called the affine subspace for verification (ASV). Extensive experiments show that the ASV can achieve encouraging face verification accuracy in comparison to other subspace algorithms, even without the need to explore any parameters.

  4. Model based verification and prognosis of acidification and sulphate releasing processes downstream of a former sewage field in Berlin (Germany).

    PubMed

    Horner, Christoph; Engelmann, Frank; Nützmann, Gunnar

    2009-04-15

    An ammonium contamination plume originating from sewage field management practices over several decades is affecting the water quality at the well fields of the Friedrichshagen waterworks in Berlin, Germany. Because hydraulic measures were unsuccessful due to the fixation of ammonium on the aquifer matrix by cation exchange, an in situ nitrification measure by injection of oxygen gas was chosen to protect the extraction wells. In order to assess the hydrochemical processes accompanying this in situ measure, reactive transport modelling was performed. The relevant processes are the dissolution of oxygen gas and the nitrification of ammonium, which initiate secondary geochemical processes such as sulphate release, acidification and hardening. The reactive transport modelling began with the deduction of a reaction network, followed by the mathematical formulation and incorporation of reactive terms into a reactive transport solver. Two model versions were set up: (1) a simplified large scale model to evaluate the long-term reaction zoning to be expected due to permanent oxygen gas injection, and (2) a verification of the monitored hydrochemistry during a first field test performed near the contamination source. The results of reactive transport modelling demonstrate that in situ injection of oxygen gas will be effective in reducing the ammonium load from the well fields, and that acidification processes near the production wells can be minimized. Finally, a line of gas injection wells extending over the whole width of the ammonium contamination plume will be constructed to protect the well fields from further ammonium load.

  5. 25 CFR 61.8 - Verification forms.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... using the last address of record. The verification form will be used to ascertain the previous enrollee... death. Name and/or address changes will only be made if the verification form is signed by an adult... 25 Indians 1 2010-04-01 2010-04-01 false Verification forms. 61.8 Section 61.8 Indians BUREAU OF...

  6. Verification of Gyrokinetic codes: theoretical background and applications

    NASA Astrophysics Data System (ADS)

    Tronko, Natalia

    2016-10-01

    In fusion plasmas the strong magnetic field allows the fast gyro motion to be systematically removed from the description of the dynamics, resulting in a considerable model simplification and gain of computational time. Nowadays, gyrokinetic (GK) codes play a major role in the understanding of the development and the saturation of turbulence and in the prediction of the consequent transport. We present a new and generic theoretical framework and specific numerical applications to test the validity and the domain of applicability of existing GK codes. For a sound verification process, the underlying theoretical GK model and the numerical scheme must be considered at the same time, which makes this approach pioneering. At the analytical level, the main novelty consists in using advanced mathematical tools such as the variational formulation of dynamics for systematizing the basic equations of GK codes and assessing the limits of their applicability. Indirect verification of the numerical scheme is proposed via the benchmark process. In this work, specific examples of code verification are presented for two GK codes: the multi-species electromagnetic ORB5 (PIC), and the radially global version of GENE (Eulerian). The proposed methodology can be applied to any existing GK code. We establish a hierarchy of reduced GK Vlasov-Maxwell equations using the generic variational formulation. Then, we derive and include the models implemented in ORB5 and GENE inside this hierarchy. At the computational level, detailed verification of global electromagnetic test cases based on the CYCLONE base case is considered, including a parametric β-scan covering the transition from ITG to KBM and the spectral properties at the nominal β value.

  7. High stakes in INF verification

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Krepon, M.

    1987-06-01

    The stakes involved in negotiating INF verification arrangements are high. While these proposals deal only with intermediate-range ground-launched cruise and mobile missiles, if properly devised they could help pave the way for comprehensive limits on other cruise missiles and strategic mobile missiles. In contrast, poorly drafted monitoring provisions could compromise national industrial security and generate numerous compliance controversies. Any verification regime will require new openness on both sides, but that means significant risks as well as opportunities. US and Soviet negotiators could spend weeks, months, and even years working out in painstaking detail verification provisions for medium-range missiles. Alternatively, if the two sides wished to conclude an INF agreement quickly, they could defer most of the difficult verification issues to the strategic arms negotiations.

  8. Verification of Ceramic Structures

    NASA Astrophysics Data System (ADS)

    Behar-Lafenetre, Stephanie; Cornillon, Laurence; Rancurel, Michael; De Graaf, Dennis; Hartmann, Peter; Coe, Graham; Laine, Benoit

    2012-07-01

    In the framework of the “Mechanical Design and Verification Methodologies for Ceramic Structures” contract [1] awarded by ESA, Thales Alenia Space has investigated literature and practices in affiliated industries to propose a methodological guideline for verification of ceramic spacecraft and instrument structures. It has been written in order to be applicable to most types of ceramic or glass-ceramic materials - typically Cesic®, HBCesic®, Silicon Nitride, Silicon Carbide and ZERODUR®. The proposed guideline describes the activities to be performed at material level in order to cover all the specific aspects of ceramics (Weibull distribution, brittle behaviour, sub-critical crack growth). Elementary tests and their post-processing methods are described, and recommendations for optimization of the test plan are given in order to have a consistent database. The application of this method is shown on an example in a dedicated article [7]. Then the verification activities to be performed at system level are described. This includes classical verification activities based on the relevant standard (ECSS Verification [4]), plus specific analytical, testing and inspection features. The analysis methodology takes into account the specific behaviour of ceramic materials, especially the statistical distribution of failures (Weibull) and the method to transfer it from elementary data to a full-scale structure. The demonstration of the efficiency of this method is described in a dedicated article [8]. The verification is completed by classical full-scale testing activities. Indications about proof testing, cases of use and implementation are given, and specific inspection and protection measures are described. These additional activities are necessary to ensure the required reliability. The aim of the guideline is to describe how to reach the same reliability level as for structures made of more classical materials (metals, composites).
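
    The Weibull transfer step described, from elementary coupon data to a full-scale structure, can be sketched as follows. All parameter values below are illustrative, not from the guideline, and real analyses use effective (stressed) volumes rather than raw volumes:

    ```python
    import math

    def failure_probability(sigma, m, sigma0, V, V0):
        """Weibull failure probability at stress sigma for effective volume V,
        given modulus m and characteristic strength sigma0 measured at V0."""
        return 1.0 - math.exp(-(V / V0) * (sigma / sigma0) ** m)

    def scaled_characteristic_strength(m, sigma0, V, V0):
        """Stress giving 63.2% failure probability at volume V (size effect)."""
        return sigma0 * (V0 / V) ** (1.0 / m)

    m, sigma0 = 10.0, 300.0   # Weibull modulus, coupon characteristic strength (MPa)
    V0, V = 1.0, 1000.0       # coupon vs full-scale effective volume (cm^3)
    print(scaled_characteristic_strength(m, sigma0, V, V0))   # ≈ 150.4 MPa
    ```

    The size effect is the key point: a part a thousand times larger than the coupons, with m = 10, keeps only about half the coupon strength at the same failure probability, which is why coupon data cannot be applied to a full-scale ceramic structure without this transfer.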

  9. Correlation of DCPI with deformation under proof roller loading to assess soft subgrade stabilization criterion : addendum to NCDOT final report 2011-05, entitled : "Field verification of undercut criteria and alternatives for subgrade stabilization in th

    DOT National Transportation Integrated Search

    2016-11-21

    Work presented herein is an addendum to the final report for NCDOT Project 2011-05 entitled : Field Verification of Undercut Criteria and Alternatives for Subgrade Stabilization in the : Piedmont Area. The objective of the addendum work is to p...

  10. SU-E-T-138: Dosimetric Verification For Volumetric Modulated Arc Therapy Cranio-Spinal Irradiation Technique

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Goksel, E; Bilge, H; Yildiz, Yarar

    2014-06-01

    Purpose: The dosimetric feasibility of the volumetric modulated arc therapy cranio-spinal irradiation (VMAT-CSI) technique, in terms of dose distribution accuracy, was investigated using a humanlike phantom. Methods: The OAR and PTV volumes for the Rando phantom were generated on supine CT images. The Eclipse (version 8.6) TPS with the AAA algorithm was used to create the treatment plan with the VMAT-CSI technique. The RapidArc plan consisted of cranial, upper spinal (US) and lower spinal (LS) regions that were optimized in the same plan. The US field overlapped the cranial and LS fields by 3 cm. Three partial arcs for the cranium and one full arc for each of the US and LS regions were used. The VMAT-CSI dose distribution inside the Rando phantom was measured with thermoluminescent detectors (TLD) and film dosimetry, and was compared to the calculated doses of field junctions, target and OARs. TLDs were placed at 24 positions throughout the phantom. The measured TLD doses were compared to the calculated point doses. Planar doses for field junctions were verified with Gafchromic films. Films were analyzed in PTW Verisoft application software using the gamma analysis method with 4 mm distance-to-agreement (DTA) and 4% dose agreement criteria. Results: TLD readings demonstrated accurate dose delivery, with a median dose difference of -0.3% (range: -8% to 12%) when compared with calculated doses for the areas inside the treatment portal. The maximum dose difference was 12% higher in the testes, which are outside the treatment region, and 8% lower in the lungs, where the heterogeneity was higher. All planar dose verifications for field junctions passed the gamma analysis, and the measured planar dose distributions demonstrated 97% average agreement with the calculated doses. Conclusion: The dosimetric data verified with TLD and film dosimetry show that the VMAT-CSI technique provides accurate dose distribution and can be delivered safely.

  11. CD volume design and verification

    NASA Technical Reports Server (NTRS)

    Li, Y. P.; Hughes, J. S.

    1993-01-01

    In this paper, we describe a prototype for CD-ROM volume design and verification. This prototype allows users to create their own model of CD volumes by modifying a prototypical model. Rule-based verification of the test volumes can then be performed against the volume definition. This working prototype has proven the concept of model-driven rule-based design and verification for large quantities of data. The model defined for the CD-ROM volumes becomes a data model as well as an executable specification.

  12. Formal Verification of Air Traffic Conflict Prevention Bands Algorithms

    NASA Technical Reports Server (NTRS)

    Narkawicz, Anthony J.; Munoz, Cesar A.; Dowek, Gilles

    2010-01-01

    In air traffic management, a pairwise conflict is a predicted loss of separation between two aircraft, referred to as the ownship and the intruder. A conflict prevention bands system computes ranges of maneuvers for the ownship that characterize regions in the airspace that are either conflict-free or 'don't go' zones that the ownship has to avoid. Conflict prevention bands are surprisingly difficult to define and analyze. Errors in the calculation of prevention bands may result in incorrect separation assurance information being displayed to pilots or air traffic controllers. This paper presents provably correct 3-dimensional prevention bands algorithms for ranges of track angle, ground speed, and vertical speed maneuvers. The algorithms have been mechanically verified in the Prototype Verification System (PVS). The verification presented in this paper extends in a non-trivial way that of previously published 2-dimensional algorithms.
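
    As background, a pairwise conflict of the kind described can be predicted from the relative state of ownship and intruder via the closest point of approach. The sketch below is a simplified horizontal-only illustration, not the PVS-verified algorithm of the paper, and the separation and lookahead values are illustrative:

    ```python
    import math

    def horizontal_conflict(rel_pos, rel_vel, D=5.0, lookahead=5/60):
        """True if horizontal separation drops below D nmi within the lookahead.

        rel_pos: intruder position relative to ownship (nmi).
        rel_vel: intruder velocity relative to ownship (kt).
        lookahead: prediction horizon in hours (5 minutes here).
        """
        sx, sy = rel_pos
        vx, vy = rel_vel
        v2 = vx*vx + vy*vy
        # time of closest approach, clamped into [0, lookahead]
        t_cpa = 0.0 if v2 == 0 else max(0.0, min(lookahead, -(sx*vx + sy*vy) / v2))
        dx, dy = sx + vx*t_cpa, sy + vy*t_cpa
        return math.hypot(dx, dy) < D

    # intruder 10 nmi ahead, closing head-on at 480 kt relative speed
    print(horizontal_conflict((10.0, 0.0), (-480.0, 0.0)))  # True
    # intruder 10 nmi abeam and diverging
    print(horizontal_conflict((0.0, 10.0), (0.0, 120.0)))   # False
    ```

    A prevention bands algorithm effectively sweeps a check like this over ranges of candidate ownship maneuvers (track, ground speed, vertical speed) to classify each range as conflict-free or 'don't go'; proving such algorithms correct for all geometries is what motivates the mechanical verification in PVS.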

  13. A time-averaged regional model of the Hermean magnetic field

    NASA Astrophysics Data System (ADS)

    Thébault, E.; Langlais, B.; Oliveira, J. S.; Amit, H.; Leclercq, L.

    2018-03-01

    This paper presents the first regional model of the magnetic field of Mercury developed with mathematical continuous functions. The model has a horizontal spatial resolution of about 830 km at the surface of the planet, and it is derived without any a priori information about the geometry of the internal and external fields or regularization. It relies on an extensive dataset of the MESSENGER's measurements selected over its entire orbital lifetime between 2011 and 2015. A first order separation between the internal and the external fields over the Northern hemisphere is achieved under the assumption that the magnetic field measurements are acquired in a source free region within the magnetospheric cavity. When downward continued to the core-mantle boundary, the model confirms some of the general structures observed in previous studies such as the dominance of zonal field, the location of the North magnetic pole, and the global absence of significant small scale structures. The transformation of the regional model into a global spherical harmonic one provides an estimate for the axial quadrupole to axial dipole ratio of about g20/g10 = 0.27 . This is much lower than previous estimates of about 0.40. We note that it is possible to obtain a similar ratio provided that more weight is put on the location of the magnetic equator and less elsewhere.

  14. Monitoring and verification R&D

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Pilat, Joseph F; Budlong - Sylvester, Kory W; Fearey, Bryan L

    2011-01-01

    The 2010 Nuclear Posture Review (NPR) report outlined the Administration's approach to promoting the agenda put forward by President Obama in Prague on April 5, 2009. The NPR calls for a national monitoring and verification R&D program to meet future challenges arising from the Administration's nonproliferation, arms control and disarmament agenda. Verification of a follow-on to New START could have to address warheads and possibly components along with delivery capabilities. Deeper cuts and disarmament would need to address all of these elements along with nuclear weapon testing, nuclear material and weapon production facilities, virtual capabilities from old weapon and existing energy programs and undeclared capabilities. We only know how to address some elements of these challenges today, and the requirements may be more rigorous in the context of deeper cuts as well as disarmament. Moreover, there is a critical need for multiple options to sensitive problems and to address other challenges. There will be other verification challenges in a world of deeper cuts and disarmament, some of which we are already facing. At some point, if the reductions process is progressing, uncertainties about past nuclear materials and weapons production will have to be addressed. IAEA safeguards will need to continue to evolve to meet current and future challenges, and to take advantage of new technologies and approaches. Transparency/verification of nuclear and dual-use exports will also have to be addressed, and there will be a need to make nonproliferation measures more watertight and transparent. In this context, and recognizing we will face all of these challenges even if disarmament is not achieved, this paper will explore possible agreements and arrangements; verification challenges; gaps in monitoring and verification technologies and approaches; and the R&D required to address these gaps and other monitoring and verification challenges.

  15. 40 CFR 1065.920 - PEMS calibrations and verifications.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ....920 PEMS calibrations and verifications. (a) Subsystem calibrations and verifications. Use all the... verifications and analysis. It may also be necessary to limit the range of conditions under which the PEMS can... additional information or analysis to support your conclusions. (b) Overall verification. This paragraph (b...

  16. Hard and Soft Safety Verifications

    NASA Technical Reports Server (NTRS)

    Wetherholt, Jon; Anderson, Brenda

    2012-01-01

    The purpose of this paper is to examine the differences between, and the effects of, hard and soft safety verifications. Initially, the terminology should be defined and clarified. A hard safety verification is a datum which demonstrates how a safety control is enacted. An example of this is relief valve testing. A soft safety verification is something which is usually described as nice to have, but is not necessary to prove safe operation. An example of a soft verification is the loss of the Solid Rocket Booster (SRB) casings from Shuttle flight STS-4. When the main parachutes failed, the casings impacted the water and sank. In the nose cap of the SRBs, video cameras recorded the release of the parachutes to determine safe operation and to provide information for potential anomaly resolution. Generally, examination of the casings and nozzles contributed to understanding of the newly developed boosters and their operation. Safety verification of SRB operation was demonstrated by examination of the casings and nozzle for erosion or wear. Loss of the SRBs and associated data did not delay the launch of the next Shuttle flight.

  17. Preventing illegal tobacco and alcohol sales to minors through electronic age-verification devices: a field effectiveness study.

    PubMed

    Krevor, Brad; Capitman, John A; Oblak, Leslie; Cannon, Joanna B; Ruwe, Mathilda

    2003-01-01

    Efforts to prohibit the sale of tobacco and alcohol products to minors are widespread. Electronic Age Verification (EAV) devices are one possible means to improve compliance with sales-to-minors laws. The purpose of this study was to evaluate the implementation and effectiveness of EAV devices in terms of the frequency and accuracy of age verification, as well as to examine the impact of EAVs on the retail environment. Two study locations were selected: Tallahassee, Florida and Iowa City, Iowa. Retail stores were invited to participate in the study, producing a self-selected experimental group. Stores that did not elect to test the EAVs comprised the comparison group. The data sources included: 1) mystery shopper inspections: two pre- and five post-EAV-installation mystery shopper inspections of tobacco and alcohol retailers; 2) retail clerk and manager interviews; and 3) customer interviews. The study found that installing EAV devices with minimal training and encouragement did not increase age verification and underage sales refusal. Surveyed clerks reported positive experiences using the electronic ID readers, and customers reported almost no discomfort about being asked to swipe their IDs. Observations from this study support the need for a more comprehensive system for responsible retailing.

  18. Monthly and seasonally verification of precipitation in Poland

    NASA Astrophysics Data System (ADS)

    Starosta, K.; Linkowska, J.

    2009-04-01

    The national meteorological service of Poland, the Institute of Meteorology and Water Management (IMWM), joined COSMO (the Consortium for Small-Scale Modelling) in July 2004. In Poland, version 3.5 of the COSMO_PL model ran until June 2007; version 4.0 has been running since July 2007. The model runs in an operational mode at 14-km grid spacing, twice a day (00 UTC, 12 UTC); a model with 7-km grid spacing is also run for scientific research. Monthly and seasonal verification of the 24-hour (06 UTC - 06 UTC) accumulated precipitation is presented in this paper. The precipitation field of COSMO_LM was verified against the rain gauge network (308 points). The verification was performed for every month and every season from December 2007 to December 2008, for three forecast days and for selected thresholds: 0.5, 1, 2.5, 5, 10, 20, 25 and 30 mm. The following indices from the contingency table were calculated: FBI (frequency bias), POD (probability of detection), PON (probability of detection of non-events), FAR (false alarm ratio), TSS (true skill statistic), HSS (Heidke skill score) and ETS (equitable threat score). Percentile ranks and the ROC (relative operating characteristic) are also presented. The ROC is a graph of the hit rate (y-axis) against the false alarm rate (x-axis) for different decision thresholds.
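
    The contingency-table indices named in this abstract have standard textbook definitions; a minimal sketch follows (function and variable names are illustrative, not from the paper; the table-based FAR computed here is the false alarm ratio).

```python
# Categorical verification scores from a 2x2 contingency table.
# a = hits, b = false alarms, c = misses, d = correct negatives.

def scores(a, b, c, d):
    n = a + b + c + d
    a_ref = (a + b) * (a + c) / n            # hits expected by chance
    exp = ((a + b) * (a + c) + (c + d) * (b + d)) / n
    return {
        "FBI": (a + b) / (a + c),            # frequency bias
        "POD": a / (a + c),                  # probability of detection
        "PON": d / (b + d),                  # detection of non-events
        "FAR": b / (a + b),                  # false alarm ratio
        "TSS": a / (a + c) - b / (b + d),    # true skill statistic (Peirce)
        "HSS": (a + d - exp) / (n - exp),    # Heidke skill score
        "ETS": (a - a_ref) / (a + b + c - a_ref),  # equitable threat score
    }
```

    For an example table with 40 hits, 20 false alarms, 10 misses and 30 correct negatives, this gives POD = 0.8 and FBI = 1.2.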

  20. A Scala DSL for RETE-Based Runtime Verification

    NASA Technical Reports Server (NTRS)

    Havelund, Klaus

    2013-01-01

    Runtime verification (RV) consists in part of checking execution traces against formalized specifications. Several systems have emerged, most of which support specification notations based on state machines, regular expressions, temporal logic, or grammars. The field of Artificial Intelligence (AI) has for an even longer period of time studied rule-based production systems, which at a closer look appear to be relevant for RV, although seemingly focused on slightly different application domains, such as for example business processes and expert systems. The core algorithm in many of these systems is the Rete algorithm. We have implemented a Rete-based runtime verification system, named LogFire (originally intended for offline log analysis but also applicable to online analysis), as an internal DSL in the Scala programming language, using Scala's support for defining DSLs. This combination appears attractive from a practical point of view. Our contribution is in part conceptual in arguing that such rule-based frameworks originating from AI may be suited for RV.
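
    The rule-based trace checking described in this abstract can be illustrated with a deliberately naive monitor. A Rete-based engine such as LogFire indexes facts incrementally rather than re-scanning them, and the names below are hypothetical, not LogFire's API.

```python
# Deliberately naive rule-based trace monitor (illustration only).
# A rule is a (name, predicate) pair; the predicate sees the current
# event and the set of events observed so far, and returns True on a
# violation. A real Rete engine avoids re-evaluating every rule on
# every event by indexing facts in a discrimination network.

def monitor(trace, rules):
    seen = set()            # events observed so far
    violations = []
    for event in trace:
        seen.add(event)
        for name, is_violated in rules:
            if is_violated(event, seen):
                violations.append(name)
    return violations

# Example safety property: a "release" must be preceded by an "acquire".
rules = [("release-before-acquire",
          lambda ev, seen: ev == "release" and "acquire" not in seen)]
```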

  1. DeepMitosis: Mitosis detection via deep detection, verification and segmentation networks.

    PubMed

    Li, Chao; Wang, Xinggang; Liu, Wenyu; Latecki, Longin Jan

    2018-04-01

    Mitotic count is a critical predictor of tumor aggressiveness in breast cancer diagnosis. Nowadays mitosis counting is mainly performed manually by pathologists, which is extremely arduous and time-consuming. In this paper, we propose an accurate method for detecting mitotic cells in histopathological slides using a novel multi-stage deep learning framework. Our method consists of a deep segmentation network for generating the mitosis region when only a weak label is given (i.e., only the centroid pixel of the mitosis is annotated), an elaborately designed deep detection network for localizing mitosis using contextual region information, and a deep verification network for improving detection accuracy by removing false positives. We validate the proposed deep learning method on two widely used Mitosis Detection in Breast Cancer Histological Images (MITOSIS) datasets. Experimental results show that we achieve the highest F-score on the MITOSIS dataset from the ICPR 2012 grand challenge using the deep detection network alone. For the ICPR 2014 MITOSIS dataset, which only provides the centroid location of mitosis, we employ the segmentation model to estimate the bounding box annotation for training the deep detection network. We also apply the verification model to eliminate some false positives produced by the detection model. By fusing the scores of the detection and verification models, we achieve state-of-the-art results. Moreover, our method is very fast with GPU computing, which makes it feasible for clinical practice. Copyright © 2018 Elsevier B.V. All rights reserved.

  2. 40 CFR 1066.240 - Torque transducer verification.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... POLLUTION CONTROLS VEHICLE-TESTING PROCEDURES Dynamometer Specifications § 1066.240 Torque transducer verification. Verify torque-measurement systems by performing the verifications described in §§ 1066.270 and... 40 Protection of Environment 33 2014-07-01 2014-07-01 false Torque transducer verification. 1066...

  3. Experimental verification of force fields for molecular dynamics simulations using Gly-Pro-Gly-Gly.

    PubMed

    Aliev, Abil E; Courtier-Murias, Denis

    2010-09-30

    Experimental NMR verification of MD simulations using 12 different force fields (AMBER, CHARMM, GROMOS, and OPLS-AA) and 5 different water models has been undertaken to identify reliable MD protocols for structure and dynamics elucidation of small open-chain peptides containing Gly and Pro. A conformationally flexible tetrapeptide, Gly-Pro-Gly-Gly, was selected for NMR (3)J-coupling, chemical shift, and internuclear distance measurements, followed by their calculation using 2 μs long MD simulations in water. In addition, Ramachandran population maps for the Pro-2 and Gly-3 residues of GPGG obtained from MD simulations were used for detailed comparisons with similar maps from the Protein Data Bank (PDB) for a large number of Gly and Pro residues in proteins. The MD simulations revealed a strong dependence of the populations and geometries of preferred backbone and side chain conformations, as well as the time scales of the peptide torsional transitions, on the force field used. On the basis of the analysis of the measured and calculated data, AMBER99SB is identified as the most reliable force field for reproducing NMR-measured parameters that depend on the peptide backbone and Pro side chain geometries and dynamics. Ramachandran maps showing conformational populations as a function of backbone ϕ/ψ angles for the Pro-2 and Gly-3 residues of GPGG from MD simulations using AMBER99SB, AMBER03, and CHARMM were found to resemble similar maps for Gly and Pro residues from the PDB survey. Three force fields (AMBER99, AMBER99ϕ, and AMBER94) showed the least satisfactory agreement with both the solution NMR and the PDB survey data. The poor performance of these force fields is attributed to their propensity to overstabilize helical peptide backbone conformations at the Pro-2 and Gly-3 residues. On the basis of the similarity of the MD and PDB Ramachandran plots, the following sequence of transitions is suggested for the Gly backbone conformation: α(L)

  4. 47 CFR 73.151 - Field strength measurements to establish performance of directional antennas.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... verified either by field strength measurement or by computer modeling and sampling system verification. (a... specifically identified by the Commission. (c) Computer modeling and sample system verification of modeled... performance verified by computer modeling and sample system verification. (1) A matrix of impedance...

  5. Clinical Experience and Evaluation of Patient Treatment Verification With a Transit Dosimeter.

    PubMed

    Ricketts, Kate; Navarro, Clara; Lane, Katherine; Blowfield, Claire; Cotten, Gary; Tomala, Dee; Lord, Christine; Jones, Joanne; Adeyemi, Abiodun

    2016-08-01

    To prospectively evaluate a protocol for transit dosimetry on a patient population undergoing intensity modulated radiation therapy (IMRT) and to assess the issues in clinical implementation of electronic portal imaging devices (EPIDs) for treatment verification. Fifty-eight patients were enrolled in the study. Amorphous silicon EPIDs were calibrated for dose and used to acquire images of delivered fields. Measured EPID dose maps were back-projected using the planning computed tomographic (CT) images to calculate dose at prespecified points within the patient and compared with treatment planning system dose offline using point dose difference and point γ analysis. The deviation of the results was used to inform future action levels. Two hundred twenty-five transit images were analyzed, composed of breast, prostate, and head and neck IMRT fields. Patient measurements demonstrated the potential of the dose verification protocol to model dose well under complex conditions: 83.8% of all delivered beams achieved the initial set tolerance level of ΔD of 0 ± 5 cGy or %ΔD of 0% ± 5%. Importantly, the protocol was also sensitive to anatomic changes and identified that 3 of the 20 measured prostate patients had undergone anatomic change in comparison with the planning CT. Patient data suggested an EPID-reconstructed versus treatment planning system dose difference action level of 0% ± 7% for breast fields. Asymmetric action levels were more appropriate for inversed IMRT fields, using absolute dose difference (-2 ± 5 cGy) or summed field percentage dose difference (-6% ± 7%). The in vivo dose verification method was easy to use and simple to implement, and it could detect patient anatomic changes that impacted dose delivery. The system required no extra dose to the patient or treatment time delay and so could be used throughout the course of treatment to identify and limit systematic and random errors in dose delivery for patient groups.
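
    The asymmetric action level quoted above for inversed IMRT fields (an absolute dose difference between -2 and +5 cGy) can be applied per point as a simple pass-rate check. This is an illustrative sketch only, with hypothetical names; the clinical protocol also used point gamma analysis, which is not reproduced here.

```python
# Per-point dose-difference pass rate with an asymmetric action level,
# e.g. the -2 to +5 cGy absolute-dose tolerance quoted for inversed
# IMRT fields. A point passes if (measured - planned) lies inside the
# [low, high] window.

def pass_rate(measured_cgy, planned_cgy, low=-2.0, high=5.0):
    diffs = [m - p for m, p in zip(measured_cgy, planned_cgy)]
    passed = sum(low <= d <= high for d in diffs)
    return passed / len(diffs)
```

    For example, measured doses [100, 98, 106] cGy against planned [100, 101, 100] cGy give differences of 0, -3 and +6 cGy, so only one of the three points falls inside the window.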

  6. Clinical Experience and Evaluation of Patient Treatment Verification With a Transit Dosimeter

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ricketts, Kate, E-mail: k.ricketts@ucl.ac.uk; Department of Radiotherapy Physics, Royal Berkshire NHS Foundation Trust, Reading; Navarro, Clara

    2016-08-01

    Purpose: To prospectively evaluate a protocol for transit dosimetry on a patient population undergoing intensity modulated radiation therapy (IMRT) and to assess the issues in clinical implementation of electronic portal imaging devices (EPIDs) for treatment verification. Methods and Materials: Fifty-eight patients were enrolled in the study. Amorphous silicon EPIDs were calibrated for dose and used to acquire images of delivered fields. Measured EPID dose maps were back-projected using the planning computed tomographic (CT) images to calculate dose at prespecified points within the patient and compared with treatment planning system dose offline using point dose difference and point γ analysis. The deviation of the results was used to inform future action levels. Results: Two hundred twenty-five transit images were analyzed, composed of breast, prostate, and head and neck IMRT fields. Patient measurements demonstrated the potential of the dose verification protocol to model dose well under complex conditions: 83.8% of all delivered beams achieved the initial set tolerance level of ΔD of 0 ± 5 cGy or %ΔD of 0% ± 5%. Importantly, the protocol was also sensitive to anatomic changes and spotted that 3 patients from 20 measured prostate patients had undergone anatomic change in comparison with the planning CT. Patient data suggested an EPID-reconstructed versus treatment planning system dose difference action level of 0% ± 7% for breast fields. Asymmetric action levels were more appropriate for inversed IMRT fields, using absolute dose difference (−2 ± 5 cGy) or summed field percentage dose difference (−6% ± 7%). Conclusions: The in vivo dose verification method was easy to use and simple to implement, and it could detect patient anatomic changes that impacted dose delivery. The system required no extra dose to the patient or treatment time delay and so could be used throughout the course of treatment to identify and limit

  7. Multibody modeling and verification

    NASA Technical Reports Server (NTRS)

    Wiens, Gloria J.

    1989-01-01

    A summary of a ten-week project on flexible multibody modeling, verification and control is presented. Emphasis was on the need for experimental verification. A literature survey was conducted to gather information on existing experimental work related to flexible multibody systems. The first portion of the assigned task encompassed the modeling aspects of flexible multibodies that can undergo large angular displacements. Research in the area of modeling aspects was also surveyed, with special attention given to the component mode approach. Resulting from this is a research plan on various modeling aspects to be investigated over the next year. The relationship between the large angular displacements, boundary conditions, mode selection, and system modes is of particular interest. The other portion of the assigned task was the generation of a test plan for experimental verification of analytical and/or computer analysis techniques used for flexible multibody systems. Based on current and expected frequency ranges of flexible multibody systems to be used in space applications, an initial test article was selected and designed. A preliminary TREETOPS computer analysis was run to ensure frequency content in the low frequency range, 0.1 to 50 Hz. The initial specifications of experimental measurement and instrumentation components were also generated. Resulting from this effort is the initial multi-phase plan for a Ground Test Facility of Flexible Multibody Systems for Modeling Verification and Control. The plan focuses on the Multibody Modeling and Verification (MMV) Laboratory. General requirements of the Unobtrusive Sensor and Effector (USE) and the Robot Enhancement (RE) laboratories were considered during the laboratory development.

  8. Deductive Verification of Cryptographic Software

    NASA Technical Reports Server (NTRS)

    Almeida, Jose Barcelar; Barbosa, Manuel; Pinto, Jorge Sousa; Vieira, Barbara

    2009-01-01

    We report on the application of an off-the-shelf verification platform to the RC4 stream cipher cryptographic software implementation (as available in the openSSL library), and introduce a deductive verification technique based on self-composition for proving the absence of error propagation.

  9. 14 CFR 211.11 - Verification.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 14 Aeronautics and Space 4 2010-01-01 2010-01-01 false Verification. 211.11 Section 211.11 Aeronautics and Space OFFICE OF THE SECRETARY, DEPARTMENT OF TRANSPORTATION (AVIATION PROCEEDINGS) ECONOMIC REGULATIONS APPLICATIONS FOR PERMITS TO FOREIGN AIR CARRIERS General Requirements § 211.11 Verification...

  10. Quantitative assessment of the physical potential of proton beam range verification with PET/CT.

    PubMed

    Knopf, A; Parodi, K; Paganetti, H; Cascio, E; Bonab, A; Bortfeld, T

    2008-08-07

    A recent clinical pilot study demonstrated the feasibility of offline PET/CT range verification for proton therapy treatments. In vivo PET measurements are challenged by blood perfusion, variations of tissue compositions, patient motion and image co-registration uncertainties. Besides these biological and treatment specific factors, the accuracy of the method is constrained by the underlying physical processes. This phantom study distinguishes physical factors from other factors, assessing the reproducibility, consistency and sensitivity of the PET/CT range verification method. A spread-out Bragg-peak (SOBP) proton field was delivered to a phantom consisting of poly-methyl methacrylate (PMMA), lung and bone equivalent material slabs. PET data were acquired in listmode at a commercial PET/CT scanner available within 10 min walking distance from the proton therapy unit. The measured PET activity distributions were compared to simulations of the PET signal based on Geant4 and FLUKA Monte Carlo (MC) codes. To test the reproducibility of the measured PET signal, data from two independent measurements at the same geometrical position in the phantom were compared. Furthermore, activation depth profiles within identical material arrangements but at different positions within the irradiation field were compared to test the consistency of the measured PET signal. Finally, activation depth profiles through air/lung, air/bone and lung/bone interfaces parallel as well as at 6 degrees to the beam direction were studied to investigate the sensitivity of the PET/CT range verification method. The reproducibility and the consistency of the measured PET signal were found to be of the same order of magnitude. They determine the physical accuracy of the PET measurement to be about 1 mm. 
However, range discrepancies up to 2.6 mm between two measurements and range variations up to 2.6 mm within one measurement were found at the beam edge and at the edge of the field of view (FOV) of the

  11. Quantitative assessment of the physical potential of proton beam range verification with PET/CT

    NASA Astrophysics Data System (ADS)

    Knopf, A.; Parodi, K.; Paganetti, H.; Cascio, E.; Bonab, A.; Bortfeld, T.

    2008-08-01

    A recent clinical pilot study demonstrated the feasibility of offline PET/CT range verification for proton therapy treatments. In vivo PET measurements are challenged by blood perfusion, variations of tissue compositions, patient motion and image co-registration uncertainties. Besides these biological and treatment specific factors, the accuracy of the method is constrained by the underlying physical processes. This phantom study distinguishes physical factors from other factors, assessing the reproducibility, consistency and sensitivity of the PET/CT range verification method. A spread-out Bragg-peak (SOBP) proton field was delivered to a phantom consisting of poly-methyl methacrylate (PMMA), lung and bone equivalent material slabs. PET data were acquired in listmode at a commercial PET/CT scanner available within 10 min walking distance from the proton therapy unit. The measured PET activity distributions were compared to simulations of the PET signal based on Geant4 and FLUKA Monte Carlo (MC) codes. To test the reproducibility of the measured PET signal, data from two independent measurements at the same geometrical position in the phantom were compared. Furthermore, activation depth profiles within identical material arrangements but at different positions within the irradiation field were compared to test the consistency of the measured PET signal. Finally, activation depth profiles through air/lung, air/bone and lung/bone interfaces parallel as well as at 6° to the beam direction were studied to investigate the sensitivity of the PET/CT range verification method. The reproducibility and the consistency of the measured PET signal were found to be of the same order of magnitude. They determine the physical accuracy of the PET measurement to be about 1 mm. However, range discrepancies up to 2.6 mm between two measurements and range variations up to 2.6 mm within one measurement were found at the beam edge and at the edge of the field of view (FOV) of the PET

  12. Fresnel-region fields and antenna noise-temperature calculations for advanced microwave sounding units

    NASA Technical Reports Server (NTRS)

    Schmidt, R. F.

    1982-01-01

    A transition from the antenna noise temperature formulation for extended noise sources in the far-field or Fraunhofer-region of an antenna to one of the intermediate near field or Fresnel-region is discussed. The effort is directed toward microwave antenna simulations and high-speed digital computer analysis of radiometric sounding units used to obtain water vapor and temperature profiles of the atmosphere. Fresnel-region fields are compared at various distances from the aperture. The antenna noise temperature contribution of an annular noise source is computed in the Fresnel-region (D squared/16 lambda) for a 13.2 cm diameter offset-paraboloid aperture at 60 GHz. The time-average Poynting vector is used to effect the computation.
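
    The Fresnel-region distance quoted above, D squared/16 lambda, can be checked for the stated 13.2 cm aperture at 60 GHz; the result is roughly 0.22 m. A sketch, assuming only the figures given in the abstract:

```python
# Worked check of the Fresnel-region distance D**2 / (16 * wavelength)
# for a 13.2 cm diameter aperture at 60 GHz.

c = 299_792_458.0            # speed of light, m/s
f = 60e9                     # frequency, Hz
D = 0.132                    # aperture diameter, m

wavelength = c / f           # about 5 mm
r_fresnel = D ** 2 / (16 * wavelength)   # about 0.218 m
```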

  13. REGIONAL-SCALE WIND FIELD CLASSIFICATION EMPLOYING CLUSTER ANALYSIS

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Glascoe, L G; Glaser, R E; Chin, H S

    2004-06-17

    The classification of time-varying multivariate regional-scale wind fields at a specific location can assist event planning as well as consequence and risk analysis. Further, wind field classification involves data transformation and inference techniques that effectively characterize stochastic wind field variation. Such a classification scheme is potentially useful for addressing overall atmospheric transport uncertainty and meteorological parameter sensitivity issues. Different methods to classify wind fields over a location include the principal component analysis of wind data (e.g., Hardy and Walton, 1978) and the use of cluster analysis for wind data (e.g., Green et al., 1992; Kaufmann and Weber, 1996). The goal of this study is to use a clustering method to classify the winds of a gridded data set, i.e., from meteorological simulations generated by a forecast model.

  14. Identifying open magnetic field regions of the Sun and their heliospheric counterparts

    NASA Astrophysics Data System (ADS)

    Krista, L. D.; Reinard, A.

    2017-12-01

    Open magnetic regions on the Sun are either long-lived (coronal holes) or transient (dimmings) in nature. Both phenomena are fundamental to our understanding of the solar behavior as a whole. Coronal holes are the sources of high-speed solar wind streams that cause recurrent geomagnetic storms. Furthermore, the variation of coronal hole properties (area, location, magnetic field strength) over the solar activity cycle is an important marker of the global evolution of the solar magnetic field. Dimming regions, on the other hand, are short-lived coronal holes that often emerge in the wake of solar eruptions. By analyzing their physical properties and their temporal evolution, we aim to understand their connection with their eruptive counterparts (flares and coronal mass ejections) and predict the possibility of a geomagnetic storm. The author developed the Coronal Hole Automated Recognition and Monitoring (CHARM) and the Coronal Dimming Tracker (CoDiT) algorithms. These tools not only identify but track the evolution of open magnetic field regions. CHARM also provides daily coronal hole maps, that are used for forecasts at the NOAA Space Weather Prediction Center. Our goal is to better understand the processes that give rise to eruptive and non-eruptive open field regions and investigate how these regions evolve over time and influence space weather.

  15. Spatial Evaluation and Verification of Earthquake Simulators

    NASA Astrophysics Data System (ADS)

    Wilson, John Max; Yoder, Mark R.; Rundle, John B.; Turcotte, Donald L.; Schultz, Kasey W.

    2017-06-01

    In this paper, we address the problem of verifying earthquake simulators with observed data. Earthquake simulators are a class of computational simulations which attempt to mirror the topological complexity of fault systems on which earthquakes occur. In addition, the physics of friction and elastic interactions between fault elements are included in these simulations. Simulation parameters are adjusted so that natural earthquake sequences are matched in their scaling properties. Physically based earthquake simulators can generate many thousands of years of simulated seismicity, allowing for a robust capture of the statistical properties of large, damaging earthquakes that have long recurrence time scales. Verification of simulations against currently observed earthquake seismicity is necessary, and following past simulator and forecast model verification methods, we address the challenges of applying spatial forecast verification to simulators: namely, that simulator outputs are confined to the modeled faults, while observed earthquake epicenters often occur off of known faults. We present two methods for addressing this discrepancy: a simplistic approach whereby observed earthquakes are shifted to the nearest fault element, and a smoothing method based on the power laws of the epidemic-type aftershock (ETAS) model, which distributes the seismicity of each simulated earthquake over the entire test region at a rate that decays with epicentral distance. To test these methods, a receiver operating characteristic plot was produced by comparing the rate maps to observed m>6.0 earthquakes in California since 1980. We found that the nearest-neighbor mapping produced poor forecasts, while the ETAS power-law method produced rate maps that agreed reasonably well with observations.
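
    The receiver operating characteristic construction used here, hit rate against false alarm rate over a range of decision thresholds, can be sketched as below; names are illustrative, and the simulator rate maps are replaced by generic forecast scores.

```python
# Building ROC points: for each decision threshold, forecasts at or
# above the threshold count as "yes" forecasts and are tallied against
# observed events. Assumes at least one event and one non-event.

def roc_points(forecast_scores, observed_events, thresholds):
    points = []
    for t in thresholds:
        hits = misses = false_alarms = correct_negs = 0
        for s, e in zip(forecast_scores, observed_events):
            if e:
                if s >= t:
                    hits += 1
                else:
                    misses += 1
            else:
                if s >= t:
                    false_alarms += 1
                else:
                    correct_negs += 1
        hit_rate = hits / (hits + misses)                       # y-axis
        fa_rate = false_alarms / (false_alarms + correct_negs)  # x-axis
        points.append((fa_rate, hit_rate))
    return points
```

    A perfectly discriminating forecast yields a point at (0, 1); a point on the diagonal indicates no skill.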

  16. Scaling up from field to region for wind erosion prediction using a field-scale wind erosion model and GIS

    USGS Publications Warehouse

    Zobeck, T.M.; Parker, N.C.; Haskell, S.; Guoding, K.

    2000-01-01

    Factors that affect wind erosion such as surface vegetative and other cover, soil properties and surface roughness usually change spatially and temporally at the field-scale to produce important field-scale variations in wind erosion. Accurate estimation of wind erosion when scaling up from fields to regions, while maintaining meaningful field-scale process details, remains a challenge. The objectives of this study were to evaluate the feasibility of using a field-scale wind erosion model with a geographic information system (GIS) to scale up to regional levels and to quantify the differences in wind erosion estimates produced by different scales of soil mapping used as a data layer in the model. A GIS was used in combination with the revised wind erosion equation (RWEQ), a field-scale wind erosion model, to estimate wind erosion for two 50 km2 areas. Landsat Thematic Mapper satellite imagery from 1993 with 30 m resolution was used as a base map. The GIS database layers included land use, soils, and other features such as roads. The major land use was agricultural fields. Data on 1993 crop management for selected fields of each crop type were collected from local government agency offices and used to 'train' the computer to classify land areas by crop and type of irrigation (agroecosystem) using commercially available software. The land area of the agricultural land uses was overestimated by 6.5% in one region (Lubbock County, TX, USA) and underestimated by about 21% in an adjacent region (Terry County, TX, USA). The total estimated wind erosion potential for Terry County was about four times that estimated for adjacent Lubbock County. The difference in potential erosion among the counties was attributed to regional differences in surface soil texture. In a comparison of different soil map scales in Terry County, the generalised soil map had over 20% more of the land area and over 15% greater erosion potential in loamy sand soils than did the detailed soil map. 

  17. A Verification-Driven Approach to Control Analysis and Tuning

    NASA Technical Reports Server (NTRS)

    Crespo, Luis G.; Kenny, Sean P.; Giesy, Daniel P.

    2008-01-01

    This paper proposes a methodology for the analysis and tuning of controllers using control verification metrics. These metrics, which are introduced in a companion paper, measure the size of the largest uncertainty set of a given class for which the closed-loop specifications are satisfied. This framework integrates deterministic and probabilistic uncertainty models into a setting that enables the deformation of sets in the parameter space, the control design space, and in the union of these two spaces. In regard to control analysis, we propose strategies that enable bounding regions of the design space where the specifications are satisfied by all the closed-loop systems associated with a prescribed uncertainty set. When this is infeasible, we bound regions where the probability of satisfying the requirements exceeds a prescribed value. In regard to control tuning, we propose strategies for the improvement of the robust characteristics of a baseline controller. Some of these strategies use multi-point approximations to the control verification metrics in order to alleviate the numerical burden of solving a min-max problem. Since this methodology targets non-linear systems having an arbitrary, possibly implicit, functional dependency on the uncertain parameters and for which high-fidelity simulations are available, it is applicable to realistic engineering problems.

  18. Automated verification of flight software. User's manual

    NASA Technical Reports Server (NTRS)

    Saib, S. H.

    1982-01-01

    AVFS (Automated Verification of Flight Software), a collection of tools for analyzing source programs written in FORTRAN and AED, is documented. The quality and the reliability of flight software are improved by: (1) indented listings of source programs, (2) static analysis to detect inconsistencies in the use of variables and parameters, (3) automated documentation, (4) instrumentation of source code, (5) retesting guidance, (6) analysis of assertions, (7) symbolic execution, (8) generation of verification conditions, and (9) simplification of verification conditions. Use of AVFS in the verification of flight software is described.

  19. ENVIRONMENTAL TECHNOLOGY VERIFICATION REPORT - PERFORMANCE VERIFICATION OF THE W.L. GORE & ASSOCIATES GORE-SORBER SCREENING SURVEY

    EPA Science Inventory

    The U.S. Environmental Protection Agency (EPA) has created the Environmental Technology Verification Program (ETV) to facilitate the deployment of innovative or improved environmental technologies through performance verification and dissemination of information. The goal of the...

  20. Formal design and verification of a reliable computing platform for real-time control. Phase 2: Results

    NASA Technical Reports Server (NTRS)

    Butler, Ricky W.; Divito, Ben L.

    1992-01-01

    The design and formal verification of the Reliable Computing Platform (RCP), a fault-tolerant computing system for digital flight control applications, is presented. The RCP uses N-Modular Redundancy (NMR) style redundancy to mask faults and internal majority voting to flush the effects of transient faults. The system is formally specified and verified using the Ehdm verification system. A major goal of this work is to provide the system with significant capability to withstand the effects of High Intensity Radiated Fields (HIRF).

  1. Working Memory Mechanism in Proportional Quantifier Verification

    ERIC Educational Resources Information Center

    Zajenkowski, Marcin; Szymanik, Jakub; Garraffa, Maria

    2014-01-01

    The paper explores the cognitive mechanisms involved in the verification of sentences with proportional quantifiers (e.g. "More than half of the dots are blue"). The first study shows that the verification of proportional sentences is more demanding than the verification of sentences such as: "There are seven blue and eight yellow…

  2. 18 CFR 158.5 - Verification.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... 18 Conservation of Power and Water Resources 1 2010-04-01 2010-04-01 false Verification. 158.5 Section 158.5 Conservation of Power and Water Resources FEDERAL ENERGY REGULATORY COMMISSION, DEPARTMENT....5 Verification. The facts stated in the memorandum must be sworn to by persons having knowledge...

  3. Experimental verification of the role of electron pressure in fast magnetic reconnection with a guide field

    DOE PAGES

    Fox, W.; Sciortino, F.; v. Stechow, A.; ...

    2017-03-21

    We report detailed laboratory observations of the structure of a reconnection current sheet in a two-fluid plasma regime with a guide magnetic field. We observe and quantitatively analyze the quadrupolar electron pressure variation in the ion-diffusion region, as originally predicted by extended magnetohydrodynamics simulations. The projection of the electron pressure gradient parallel to the magnetic field contributes significantly to balancing the parallel electric field, and the resulting cross-field electron jets in the reconnection layer are diamagnetic in origin. Furthermore, these results demonstrate how parallel and perpendicular force balance are coupled in guide field reconnection and confirm basic theoretical models of the importance of electron pressure gradients for obtaining fast magnetic reconnection.

  4. Industrial methodology for process verification in research (IMPROVER): toward systems biology verification

    PubMed Central

    Meyer, Pablo; Hoeng, Julia; Rice, J. Jeremy; Norel, Raquel; Sprengel, Jörg; Stolle, Katrin; Bonk, Thomas; Corthesy, Stephanie; Royyuru, Ajay; Peitsch, Manuel C.; Stolovitzky, Gustavo

    2012-01-01

    Motivation: Analyses and algorithmic predictions based on high-throughput data are essential for the success of systems biology in academic and industrial settings. Organizations, such as companies and academic consortia, conduct large multi-year scientific studies that entail the collection and analysis of thousands of individual experiments, often over many physical sites and with internal and outsourced components. To extract maximum value, the interested parties need to verify the accuracy and reproducibility of data and methods before the initiation of such large multi-year studies. However, systematic and well-established verification procedures do not exist for automated collection and analysis workflows in systems biology, which could lead to inaccurate conclusions. Results: We present here a review of the current state of systems biology verification and a detailed methodology to address its shortcomings. This methodology, named 'Industrial Methodology for Process Verification in Research' or IMPROVER, consists of evaluating a research program by dividing a workflow into smaller building blocks that are individually verified. The verification of each building block can be done internally by members of the research program or externally by 'crowd-sourcing' to an interested community. www.sbvimprover.com Implementation: This methodology could become the preferred choice to verify systems biology research workflows that are becoming increasingly complex and sophisticated in industrial and academic settings. Contact: gustavo@us.ibm.com PMID:22423044

  5. Code Verification Results of an LLNL ASC Code on Some Tri-Lab Verification Test Suite Problems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Anderson, S R; Bihari, B L; Salari, K

    As scientific codes become more complex and involve larger numbers of developers and algorithms, chances for algorithmic implementation mistakes increase. In this environment, code verification becomes essential to building confidence in the code implementation. This paper will present first results of a new code verification effort within LLNL's B Division. In particular, we will show results of code verification of the LLNL ASC ARES code on the test problems: Su-Olson non-equilibrium radiation diffusion, Sod shock tube, Sedov point blast modeled with shock hydrodynamics, and Noh implosion.

  6. Verification of ICESat-2/ATLAS Science Receiver Algorithm Onboard Databases

    NASA Astrophysics Data System (ADS)

    Carabajal, C. C.; Saba, J. L.; Leigh, H. W.; Magruder, L. A.; Urban, T. J.; Mcgarry, J.; Schutz, B. E.

    2013-12-01

    NASA's ICESat-2 mission will fly the Advanced Topographic Laser Altimetry System (ATLAS) instrument on a 3-year mission scheduled to launch in 2016. ATLAS is a single-photon detection system transmitting at 532 nm with a laser repetition rate of 10 kHz and a 6-spot pattern on the Earth's surface. A set of onboard Receiver Algorithms will perform signal processing to reduce the data rate and data volume to acceptable levels. These algorithms distinguish surface echoes from the background noise, limit the daily data volume, and allow the instrument to telemeter only a small vertical region about the signal. For this purpose, three onboard databases are used: a Surface Reference Map (SRM), a Digital Elevation Model (DEM), and Digital Relief Maps (DRMs). The DEM provides minimum and maximum heights that limit the signal search region of the onboard algorithms, including a margin for errors in the source databases and onboard geolocation. Since the surface echoes will be correlated while noise will be randomly distributed, the signal location is found by histogramming the received event times and identifying the histogram bins with statistically significant counts. Once the signal location has been established, the DRMs will be used to determine the vertical width of the telemetry band about the signal. The University of Texas-Center for Space Research (UT-CSR) is developing the ICESat-2 onboard databases, which are currently being tested using preliminary versions and equivalent representations of elevation ranges and relief more recently developed at Goddard Space Flight Center (GSFC). Global and regional elevation models have been assessed in terms of their accuracy using ICESat geodetic control, and have been used to develop equivalent representations of the onboard databases for testing against the UT-CSR databases, with special emphasis on the ice sheet regions. A series of verification checks have been implemented, including
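    The histogram-based signal finding described in this abstract can be sketched as follows. This is an illustrative reconstruction only, not the actual ATLAS onboard algorithm: the significance test (bin count above median plus a multiple of the Poisson standard deviation) is an assumption introduced here.

    ```python
    import numpy as np

    def find_signal_bin(event_times, bin_width, n_sigma=3.0):
        """Locate a correlated signal in a stream of photon event times by
        histogramming and flagging bins whose counts rise significantly above
        the background noise floor (assumed Poisson-distributed)."""
        edges = np.arange(event_times.min(), event_times.max() + bin_width, bin_width)
        counts, _ = np.histogram(event_times, bins=edges)
        noise_mean = np.median(counts)  # robust estimate of the background rate
        threshold = noise_mean + n_sigma * np.sqrt(max(noise_mean, 1.0))
        significant = np.where(counts > threshold)[0]
        if significant.size == 0:
            return None  # no statistically significant bin found
        # Return the centre of the most populated significant bin.
        best = significant[np.argmax(counts[significant])]
        return 0.5 * (edges[best] + edges[best + 1])
    ```

    Because noise events are uniformly distributed while surface echoes cluster in time, the cluster's bin stands out against the Poisson background, which is the property the abstract exploits.
    
    
    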

  7. CHEMICAL INDUCTION MIXER VERIFICATION - ENVIRONMENTAL TECHNOLOGY VERIFICATION PROGRAM

    EPA Science Inventory

    The Wet-Weather Flow Technologies Pilot of the Environmental Technology Verification (ETV) Program, which is supported by the U.S. Environmental Protection Agency and facilitated by NSF International, has recently evaluated the performance of chemical induction mixers used for di...

  8. Expert system verification and validation study. Delivery 3A and 3B: Trip summaries

    NASA Technical Reports Server (NTRS)

    French, Scott

    1991-01-01

    Key results are documented from attending the 4th workshop on verification, validation, and testing. The most interesting part of the workshop was when representatives from the U.S., Japan, and Europe presented surveys of VV&T within their respective regions. Another interesting part focused on current efforts to define industry standards for artificial intelligence and how that might affect approaches to VV&T of expert systems. The next part of the workshop focused on VV&T methods of applying mathematical techniques to verification of rule bases and techniques for capturing information relating to the process of developing software. The final part focused on software tools. A summary is also presented of the EPRI conference on 'Methodologies, Tools, and Standards for Cost Effective Reliable Software Verification and Validation'. The conference was divided into discussion sessions on the following issues: development process, automated tools, software reliability, methods, standards, and cost/benefit considerations.

  9. Regional-Scale High-Latitude Extreme Geoelectric Fields Pertaining to Geomagnetically Induced Currents

    NASA Technical Reports Server (NTRS)

    Pulkkinen, Antti; Bernabeu, Emanuel; Eichner, Jan; Viljanen, Ari; Ngwira, Chigomezyo

    2015-01-01

    Motivated by the needs of the high-voltage power transmission industry, we use data from the high-latitude IMAGE magnetometer array to study characteristics of extreme geoelectric fields at regional scales. We use 10-s resolution data for years 1993-2013, and the fields are characterized using average horizontal geoelectric field amplitudes taken over station groups that span about 500-km distance. We show that geoelectric field structures associated with localized extremes at single stations can be greatly different from structures associated with regionally uniform geoelectric fields, which are well represented by spatial averages over single stations. Visual extrapolation and rigorous extreme value analysis of spatially averaged fields indicate that the expected ranges for 1-in-100-year extreme events are 3-8 V/km and 3.4-7.1 V/km, respectively. The Quebec reference ground model is used in the calculations.
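    A 1-in-100-year return level of the kind quoted above is typically obtained by fitting an extreme value distribution to block maxima. The sketch below fits a generalized extreme value (GEV) distribution to annual maxima with SciPy; the block-maxima/GEV choice is a standard textbook approach assumed here for illustration, not necessarily the authors' exact procedure.

    ```python
    import numpy as np
    from scipy.stats import genextreme

    def return_level(annual_maxima, return_period_years=100):
        """Estimate the 1-in-N-year return level from annual maxima:
        fit a GEV distribution, then find the level exceeded with
        probability 1/N in any given year (the inverse survival function)."""
        shape, loc, scale = genextreme.fit(annual_maxima)
        return genextreme.isf(1.0 / return_period_years, shape, loc, scale)
    ```

    For example, feeding in 20 years of annual maximum geoelectric field amplitudes (in V/km) would yield an extrapolated 100-year level, with the caveat that such extrapolations carry wide confidence intervals when the record is short.
    
    
    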

  10. The Use of Remote Sensing Satellites for Verification in International Law

    NASA Astrophysics Data System (ADS)

    Hettling, J. K.

    The contribution addresses a very sensitive topic that is currently gaining significance and importance in the international community. It involves questions of international law as well as consideration of new developments and decisions in international politics. The paper will begin with the meaning and current status of verification in international law as well as the legal basis of satellite remote sensing in international treaties and resolutions. For the verification part, this implies giving a definition of verification and naming its fields of application and the different means of verification. For the remote sensing part, it involves the identification of relevant provisions in the Outer Space Treaty and the United Nations General Assembly Principles on Remote Sensing. Furthermore, practical examples will be examined: to what extent have remote sensing satellites been used to verify international obligations? Are there treaties which would considerably profit from the use of remote sensing satellites? In this respect, there are various examples which can be contemplated, such as the ABM Treaty (even though out of force now), the SALT and START Agreements, the Chemical Weapons Convention and the Comprehensive Test Ban Treaty. It will also be mentioned that NGOs have started to verify international conventions; e.g., Landmine Monitor is verifying the Mine-Ban Convention. Apart from verifying arms control and disarmament treaties, satellites can also strengthen the negotiation of peace agreements (such as the Dayton Peace Talks) and the prevention of international conflicts. Verification has played an increasingly prominent role in high-profile UN operations. Verification and monitoring can be applied to the whole range of elements that constitute a peace implementation process, ranging from the military aspects through electoral monitoring and human rights monitoring, from negotiating an accord to finally monitoring it.

  11. Field guide to diseases & insects of the Rocky Mountain Region

    Treesearch

    Forest Health Protection Rocky Mountain Region

    2010-01-01

    This field guide is a forest management tool for field identification of biotic and abiotic agents that damage native trees in Colorado, Kansas, Nebraska, South Dakota, and Wyoming, which constitute the USDA Forest Service's Rocky Mountain Region. The guide focuses only on tree diseases and forest insects that have significant economic, ecological, and/ or...

  12. Requirements, Verification, and Compliance (RVC) Database Tool

    NASA Technical Reports Server (NTRS)

    Rainwater, Neil E., II; McDuffee, Patrick B.; Thomas, L. Dale

    2001-01-01

    This paper describes the development, design, and implementation of the Requirements, Verification, and Compliance (RVC) database used on the International Space Welding Experiment (ISWE) project managed at Marshall Space Flight Center. The RVC is a systems engineer's tool for automating and managing the following information: requirements; requirements traceability; verification requirements; verification planning; verification success criteria; and compliance status. This information normally contained within documents (e.g. specifications, plans) is contained in an electronic database that allows the project team members to access, query, and status the requirements, verification, and compliance information from their individual desktop computers. Using commercial-off-the-shelf (COTS) database software that contains networking capabilities, the RVC was developed not only with cost savings in mind but primarily for the purpose of providing a more efficient and effective automated method of maintaining and distributing the systems engineering information. In addition, the RVC approach provides the systems engineer the capability to develop and tailor various reports containing the requirements, verification, and compliance information that meets the needs of the project team members. The automated approach of the RVC for capturing and distributing the information improves the productivity of the systems engineer by allowing that person to concentrate more on the job of developing good requirements and verification programs and not on the effort of being a "document developer".

  13. Comparison of transient horizontal magnetic fields in a plage region and in the quiet Sun

    NASA Astrophysics Data System (ADS)

    Ishikawa, R.; Tsuneta, S.

    2009-02-01

    Aims: The properties of transient horizontal magnetic fields (THMFs) in both plage and quiet Sun regions are obtained and compared. Methods: Spectro-polarimetric observations with the Solar Optical Telescope (SOT) on the Hinode satellite were carried out with a cadence of about 30 s for both plage and quiet regions located near the disk center. We selected THMFs that have net linear polarization (LP) higher than 0.22%, and an area larger than or equal to 3 pixels, and compared their occurrence rates and distribution of magnetic field azimuth. We obtained probability density functions (PDFs) of magnetic field strength and inclination for both regions. Results: The occurrence rate in the plage region is the same as for the quiet Sun. The vertical magnetic flux in the plage region is ~8 times more than in the quiet Sun. There is essentially no preferred orientation for the THMFs in either region; however, THMFs in the plage region with higher LP have a preferred direction consistent with that of the plage-region's large-scale vertical field pattern. PDFs show that there is no difference in the distribution of field strength of horizontal fields between the quiet Sun and the plage regions when we avoid the persistent vertical flux concentrations for the plage region. Conclusions: The similarity between the PDFs and the occurrence rates in plage and quiet regions suggests that a local dynamo process due to the granular motion may generate THMFs all over the Sun. The preferred orientation for higher LP in the plage indicates that the THMFs may be somewhat influenced by the larger-scale magnetic field pattern of the plage. A movie is only available in electronic form at http://www.aanda.org

  14. Two years experience with quality assurance protocol for patient related Rapid Arc treatment plan verification using a two dimensional ionization chamber array

    PubMed Central

    2011-01-01

    Purpose: To verify the dose distribution and number of monitor units (MU) for dynamic treatment techniques like volumetric modulated single arc radiation therapy (Rapid Arc), each patient treatment plan has to be verified prior to the first treatment. The purpose of this study was to develop a patient-related treatment plan verification protocol using a two-dimensional ionization chamber array (MatriXX, IBA, Schwarzenbruck, Germany). Method: Measurements were done to determine the dependence between the response of the 2D ionization chamber array, beam direction, and field size. The reproducibility of the measurements was also checked. For the patient-related verifications, the original patient Rapid Arc treatment plan was projected onto a CT dataset of the MatriXX and the dose distribution was calculated. After irradiation of the Rapid Arc verification plans, measured and calculated 2D dose distributions were compared using the gamma evaluation method implemented in the measuring software OmniPro (version 1.5, IBA, Schwarzenbruck, Germany). Results: The dependence between the response of the 2D ionization chamber array, field size, and beam direction showed a passing rate of 99% for field sizes between 7 cm × 7 cm and 24 cm × 24 cm for measurements of a single arc. For field sizes smaller than 7 cm × 7 cm or larger than 24 cm × 24 cm, the passing rate was less than 99%. The reproducibility was within a passing rate of 99% to 100%. The accuracy of the whole process, including the uncertainty of the measuring system, treatment planning system, linear accelerator, and isocentric laser system in the treatment room, was acceptable for treatment plan verification using gamma criteria of 3% and 3 mm (2D global gamma index). Conclusion: It was possible to verify the 2D dose distribution and MU of Rapid Arc treatment plans using the MatriXX, and the use of the MatriXX for Rapid Arc treatment plan verification in clinical routine is reasonable. The passing rate should be 99% than the verification
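    The gamma evaluation named in this abstract combines a dose-difference criterion with a distance-to-agreement criterion: a point passes if some nearby evaluated point agrees within both tolerances. A minimal 1-D global gamma sketch (3%/3 mm) is shown below; it is an assumed, simplified implementation for illustration, far cruder than clinical QA software such as OmniPro, which works on 2-D distributions with sub-grid interpolation.

    ```python
    import numpy as np

    def gamma_pass_rate(dose_ref, dose_eval, positions, dose_tol=0.03, dist_tol_mm=3.0):
        """1-D global gamma index pass rate between a reference (calculated)
        and an evaluated (measured) dose profile sampled at the same positions.
        Dose differences are normalised to the global maximum (global gamma)."""
        d_max = dose_ref.max()
        gammas = np.empty_like(dose_ref)
        for i, (r_i, d_i) in enumerate(zip(positions, dose_ref)):
            # Normalised dose difference and distance to every evaluated point.
            dose_diff = (dose_eval - d_i) / (dose_tol * d_max)
            dist = (positions - r_i) / dist_tol_mm
            # Gamma is the minimum combined metric over the evaluated profile.
            gammas[i] = np.sqrt(dose_diff ** 2 + dist ** 2).min()
        return float(np.mean(gammas <= 1.0))
    ```

    A point with a 3% dose error at the exact position, or a perfect dose match 3 mm away, both sit exactly at gamma = 1; the "passing rate" quoted in the abstract is the fraction of points with gamma at or below 1.
    
    
    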

  15. On the Electron Diffusion Region in Asymmetric Reconnection with a Guide Magnetic Field

    NASA Technical Reports Server (NTRS)

    Hesse, Michael; Liu, Yi-Hsin; Chen, Li-Jen; Bessho, Naoki; Kuznetsova, Masha; Birn, Joachim; Burch, James L.

    2016-01-01

    Particle-in-cell simulations in a 2.5-D geometry and analytical theory are employed to study the electron diffusion region in asymmetric reconnection with a guide magnetic field. The analysis presented here demonstrates that similar to the case without guide field, in-plane flow stagnation and null of the in-plane magnetic field are well separated. In addition, it is shown that the electric field at the local magnetic X point is again dominated by inertial effects, whereas it remains dominated by nongyrotropic pressure effects at the in-plane flow stagnation point. A comparison between local electron Larmor radii and the magnetic gradient scale lengths predicts that distribution should become nongyrotropic in a region enveloping both field reversal and flow stagnation points. This prediction is verified by an analysis of modeled electron distributions, which show clear evidence of mixing in the critical region.

  16. Verification and transfer of thermal pollution model. Volume 4: User's manual for three-dimensional rigid-lid model

    NASA Technical Reports Server (NTRS)

    Lee, S. S.; Nwadike, E. V.; Sinha, S. E.

    1982-01-01

    The theory of a three-dimensional (3-D) mathematical thermal discharge model and a related one-dimensional (1-D) model are described. Model verification at two sites and a separate user's manual for each model are included. The 3-D model has two forms: free surface and rigid lid. The former allows a free air/water interface and is suited for cases where surface wave heights are significant compared to the mean water depth, e.g., estuaries and coastal regions. The latter is suited for small surface wave heights compared to depth because surface elevation is removed as a parameter. These models allow computation of time-dependent velocity and temperature fields for given initial conditions and time-varying boundary conditions. The free surface model also provides surface height variations with time.

  17. Verification and transfer of thermal pollution model. Volume 2: User's manual for 3-dimensional free-surface model

    NASA Technical Reports Server (NTRS)

    Lee, S. S.; Sengupta, S.; Tuann, S. Y.; Lee, C. R.

    1982-01-01

    The six-volume report: describes the theory of a three-dimensional (3-D) mathematical thermal discharge model and a related one-dimensional (1-D) model, includes model verification at two sites, and provides a separate user's manual for each model. The 3-D model has two forms: free surface and rigid lid. The former, verified at Anclote Anchorage (FL), allows a free air/water interface and is suited for significant surface wave heights compared to mean water depth; e.g., estuaries and coastal regions. The latter, verified at Lake Keowee (SC), is suited for small surface wave heights compared to depth. These models allow computation of time-dependent velocity and temperature fields for given initial conditions and time-varying boundary conditions.

  18. 14 CFR 460.17 - Verification program.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... software in an operational flight environment before allowing any space flight participant on board during a flight. Verification must include flight testing. ... TRANSPORTATION LICENSING HUMAN SPACE FLIGHT REQUIREMENTS Launch and Reentry with Crew § 460.17 Verification...

  19. 14 CFR 460.17 - Verification program.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... software in an operational flight environment before allowing any space flight participant on board during a flight. Verification must include flight testing. ... TRANSPORTATION LICENSING HUMAN SPACE FLIGHT REQUIREMENTS Launch and Reentry with Crew § 460.17 Verification...

  20. 14 CFR 460.17 - Verification program.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... software in an operational flight environment before allowing any space flight participant on board during a flight. Verification must include flight testing. ... TRANSPORTATION LICENSING HUMAN SPACE FLIGHT REQUIREMENTS Launch and Reentry with Crew § 460.17 Verification...

  1. 14 CFR 460.17 - Verification program.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... software in an operational flight environment before allowing any space flight participant on board during a flight. Verification must include flight testing. ... TRANSPORTATION LICENSING HUMAN SPACE FLIGHT REQUIREMENTS Launch and Reentry with Crew § 460.17 Verification...

  2. 14 CFR 460.17 - Verification program.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... software in an operational flight environment before allowing any space flight participant on board during a flight. Verification must include flight testing. ... TRANSPORTATION LICENSING HUMAN SPACE FLIGHT REQUIREMENTS Launch and Reentry with Crew § 460.17 Verification...

  3. Real time radiotherapy verification with Cherenkov imaging: development of a system for beamlet verification

    NASA Astrophysics Data System (ADS)

    Pogue, B. W.; Krishnaswamy, V.; Jermyn, M.; Bruza, P.; Miao, T.; Ware, William; Saunders, S. L.; Andreozzi, J. M.; Gladstone, D. J.; Jarvis, L. A.

    2017-05-01

    Cherenkov imaging has been shown to allow near real-time imaging of the beam entrance and exit on patient tissue, with an appropriate intensified camera and associated image processing. A dedicated system has been developed for research into full-torso imaging of whole breast irradiation, where the dual-camera system captures the beam shape for all beamlets used in this treatment protocol. Particularly challenging verification measurements exist in dynamic wedge, field-in-field, and boost delivery, and the system was designed to capture these as they are delivered. Two intensified CMOS (ICMOS) cameras were developed and mounted in a breast treatment room, and pilot studies for intensity and stability were completed. Software tools to contour the treatment area have been developed and are being tested prior to initiation of the full trial. At present, it is possible to record delivery of individual beamlets as small as a single MLC leaf thickness, and readout at 20 frames per second is achieved. Statistical analysis of system repeatability and stability is presented, as well as pilot human studies.

  4. ADEN ALOS PALSAR Product Verification

    NASA Astrophysics Data System (ADS)

    Wright, P. A.; Meadows, P. J.; Mack, G.; Miranda, N.; Lavalle, M.

    2008-11-01

    Within the ALOS Data European Node (ADEN) the verification of PALSAR products is an important and continuing activity, to ensure data utility for the users. The paper will give a summary of the verification activities, the status of the ADEN PALSAR processor and the current quality issues that are important for users of ADEN PALSAR data.

  5. A region of intense plasma wave turbulence on auroral field lines

    NASA Technical Reports Server (NTRS)

    Gurnett, D. A.; Frank, L. A.

    1976-01-01

    This report presents a detailed study of the plasma wave turbulence observed by HAWKEYE-1 and IMP-6 on high latitude auroral field lines and investigates the relationship of this turbulence to magnetic field and plasma measurements obtained in the same region.

  6. Prompt Gamma Imaging for In Vivo Range Verification of Pencil Beam Scanning Proton Therapy.

    PubMed

    Xie, Yunhe; Bentefour, El Hassane; Janssens, Guillaume; Smeets, Julien; Vander Stappen, François; Hotoiu, Lucian; Yin, Lingshu; Dolney, Derek; Avery, Stephen; O'Grady, Fionnbarr; Prieels, Damien; McDonough, James; Solberg, Timothy D; Lustig, Robert A; Lin, Alexander; Teo, Boon-Keng K

    2017-09-01

    To report the first clinical results and value assessment of prompt gamma imaging for in vivo proton range verification in pencil beam scanning mode. A stand-alone, trolley-mounted, prototype prompt gamma camera utilizing a knife-edge slit collimator design was used to record the prompt gamma signal emitted along the proton tracks during delivery of proton therapy for a brain cancer patient. The recorded prompt gamma depth detection profiles of individual pencil beam spots were compared with the expected profiles simulated from the treatment plan. In 6 treatment fractions recorded over 3 weeks, the mean (± standard deviation) range shifts aggregated over all spots in 9 energy layers were -0.8 ± 1.3 mm for the lateral field, 1.7 ± 0.7 mm for the right-superior-oblique field, and -0.4 ± 0.9 mm for the vertex field. This study demonstrates the feasibility and illustrates the distinctive benefits of prompt gamma imaging in pencil beam scanning treatment mode. Accuracy in range verification was found in this first clinical case to be better than the range uncertainty margin applied in the treatment plan. These first results lay the foundation for additional work toward tighter integration of the system for in vivo proton range verification and quantification of range uncertainties. Copyright © 2017 Elsevier Inc. All rights reserved.

  7. SU-E-T-762: Toward Volume-Based Independent Dose Verification as Secondary Check

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Tachibana, H; Tachibana, R

    2015-06-15

    Purpose: Lung SBRT planning has shifted to the volume prescription technique. However, point dose agreement is still verified using independent dose verification at the secondary check. Volume dose verification is more affected by the inhomogeneity correction than the point dose verification currently used as the check. A feasibility study for volume dose verification was conducted for lung SBRT plans. Methods: Six SBRT plans were collected at our institute. Two dose distributions, with and without inhomogeneity correction, were generated using Adaptive Convolve (AC) in Pinnacle3. Simple MU Analysis (SMU, Triangle Product, Ishikawa, JP) was used as the independent dose verification software program, in which a modified Clarkson-based algorithm was implemented and the radiological path length was computed from CT images independently of the treatment planning system. The agreement in point dose and mean dose between the AC with/without the correction and the SMU was assessed. Results: In the point dose evaluation for the center of the GTV, the difference showed a systematic shift (4.5% ± 1.9%) in comparison with the AC with the inhomogeneity correction; on the other hand, there was good agreement of 0.2 ± 0.9% between the SMU and the AC without the correction. In the volume evaluation, there were significant differences in mean dose not only for the PTV (14.2 ± 5.1%) but also for the GTV (8.0 ± 5.1%) compared to the AC with the correction. Without the correction, the SMU showed good agreement for the GTV (1.5 ± 0.9%) as well as the PTV (0.9% ± 1.0%). Conclusion: Volume evaluation for the secondary check may be possible in homogeneous regions. However, volumes including inhomogeneous media would show larger discrepancies. The dose calculation algorithm for independent verification needs to be modified to take the inhomogeneity correction into account.
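    The radiological path length mentioned in this abstract is the water-equivalent depth along a ray through the CT volume: the line integral of relative electron density. A toy sketch follows; the bilinear HU-to-density calibration used here is a generic textbook approximation assumed for illustration, not the calibration actually used by the SMU software.

    ```python
    import numpy as np

    def radiological_depth(hu_along_ray, step_mm):
        """Water-equivalent (radiological) depth along a ray sampled through
        CT data: convert each sample's Hounsfield number to relative electron
        density with a simple bilinear calibration, then integrate."""
        hu = np.asarray(hu_along_ray, dtype=float)
        # Assumed bilinear calibration: water = 0 HU -> density 1.0,
        # air = -1000 HU -> 0.0, with a shallower slope above water.
        density = np.where(hu <= 0.0, 1.0 + hu / 1000.0, 1.0 + hu / 1950.0)
        density = np.clip(density, 0.0, None)
        return float(np.sum(density) * step_mm)
    ```

    In homogeneous water the radiological depth equals the geometric depth, which is why the abstract finds good agreement without the inhomogeneity correction: low-density lung tissue shortens the radiological path and is exactly what the simple point-dose check misses.
    
    
    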

  8. Wide Field Imaging of the Hubble Deep Field-South Region III: Catalog

    NASA Technical Reports Server (NTRS)

    Palunas, Povilas; Collins, Nicholas R.; Gardner, Jonathan P.; Hill, Robert S.; Malumuth, Eliot M.; Rhodes, Jason; Teplitz, Harry I.; Woodgate, Bruce E.

    2002-01-01

    We present 1/2 square degree uBVRI imaging around the Hubble Deep Field - South. These data have been used in earlier papers to examine the QSO population and the evolution of the correlation function in the region around the HDF-S. The images were obtained with the Big Throughput Camera at CTIO in September 1998. The images reach 5 sigma limits of u approx. 24.4, B approx. 25.6, V approx. 25.3, R approx. 24.9 and I approx. 23.9. We present a catalog of approx. 22,000 galaxies. We also present number-magnitude counts and a comparison with other observations of the same field. The data presented here are available over the world wide web.

  9. Calibration and verification of thermographic cameras for geometric measurements

    NASA Astrophysics Data System (ADS)

    Lagüela, S.; González-Jorge, H.; Armesto, J.; Arias, P.

    2011-03-01

Infrared thermography is a technique with an increasing degree of development and applications. Quality assessment of measurements performed with thermal cameras should be achieved through metrological calibration and verification. Infrared cameras acquire both temperature and geometric information, although calibration and verification procedures are usual only for thermal data; black bodies are used for these purposes. Moreover, the geometric information is important for many fields, such as architecture, civil engineering and industry. This work presents a calibration procedure that allows photogrammetric restitution, and a portable artefact to verify the geometric accuracy, repeatability and drift of thermographic cameras. These results allow the incorporation of this information into the quality control processes of companies. A grid based on burning lamps is used for the geometric calibration of thermographic cameras. The artefact designed for the geometric verification consists of five delrin spheres and seven cubes of different sizes. Metrological traceability for the artefact is obtained from a coordinate measuring machine. Two sets of targets with different reflectivity are fixed to the spheres and cubes to make data processing and photogrammetric restitution possible. Reflectivity was the chosen material property because both thermographic and visual cameras are able to detect it. Two thermographic cameras, from the Flir and Nec manufacturers, and one visible camera from Jai are calibrated, verified and compared using the calibration grids and the standard artefact. The calibration system based on burning lamps shows its capability to perform the internal orientation of the thermal cameras. Verification results show repeatability better than 1 mm for all cases, and better than 0.5 mm for the visible camera. As expected, accuracy is also higher for the visible camera, and the geometric comparison between thermographic cameras shows slightly better

  10. Post-OPC verification using a full-chip pattern-based simulation verification method

    NASA Astrophysics Data System (ADS)

    Hung, Chi-Yuan; Wang, Ching-Heng; Ma, Cliff; Zhang, Gary

    2005-11-01

In this paper, we evaluated and investigated techniques for performing fast full-chip post-OPC verification using a commercial product platform. A number of databases from several technology nodes, i.e. 0.13um, 0.11um and 90nm, were used in the investigation. Although our OPC technology has proven robust for most cases, given the variety of tape-outs with complicated design styles and technologies, it is difficult to develop a "complete or bullet-proof" OPC algorithm that covers every possible layout pattern. In the evaluation, among dozens of databases, errors were found in some OPC databases by model-based post-OPC checking; these could incur significant manufacturing costs - reticle, wafer processing and, more importantly, production delay. From such full-chip OPC database verification, we have learned that optimizing OPC models and recipes on a limited set of test chip designs may not provide sufficient coverage across the range of designs to be produced in the process, and fatal errors (such as pinching or bridging), poor CD distribution and process-sensitive patterns may still occur. As a result, more than one reticle tape-out cycle is not uncommon to prove models and recipes that approach the center of process for a range of designs. So, we will describe a full-chip pattern-based simulation verification flow that serves both OPC model and recipe development as well as post-OPC verification after production release of the OPC. Lastly, we will discuss how the new pattern-based verification tool differs from conventional edge-based tools and summarize the advantages of our new tool and methodology: 1) Accuracy: superior inspection algorithms, down to 1 nm accuracy with the new pattern-based approach; 2) High-speed performance: pattern-centric algorithms to give the best full-chip inspection efficiency; 3) Powerful analysis capability: flexible error distribution, grouping, interactive viewing and hierarchical pattern extraction to narrow

  11. 45 CFR 95.626 - Independent Verification and Validation.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... 45 Public Welfare 1 2013-10-01 2013-10-01 false Independent Verification and Validation. 95.626... (FFP) Specific Conditions for Ffp § 95.626 Independent Verification and Validation. (a) An assessment for independent verification and validation (IV&V) analysis of a State's system development effort may...

  12. 45 CFR 95.626 - Independent Verification and Validation.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... 45 Public Welfare 1 2014-10-01 2014-10-01 false Independent Verification and Validation. 95.626... (FFP) Specific Conditions for Ffp § 95.626 Independent Verification and Validation. (a) An assessment for independent verification and validation (IV&V) analysis of a State's system development effort may...

  13. 45 CFR 95.626 - Independent Verification and Validation.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... 45 Public Welfare 1 2011-10-01 2011-10-01 false Independent Verification and Validation. 95.626... (FFP) Specific Conditions for Ffp § 95.626 Independent Verification and Validation. (a) An assessment for independent verification and validation (IV&V) analysis of a State's system development effort may...

  14. 45 CFR 95.626 - Independent Verification and Validation.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... 45 Public Welfare 1 2012-10-01 2012-10-01 false Independent Verification and Validation. 95.626... (FFP) Specific Conditions for Ffp § 95.626 Independent Verification and Validation. (a) An assessment for independent verification and validation (IV&V) analysis of a State's system development effort may...

  15. 40 CFR 1065.395 - Inertial PM balance verifications.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 40 Protection of Environment 32 2010-07-01 2010-07-01 false Inertial PM balance verifications... Inertial PM balance verifications. This section describes how to verify the performance of an inertial PM balance. (a) Independent verification. Have the balance manufacturer (or a representative approved by the...

  16. 40 CFR 1065.395 - Inertial PM balance verifications.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... 40 Protection of Environment 33 2011-07-01 2011-07-01 false Inertial PM balance verifications... Inertial PM balance verifications. This section describes how to verify the performance of an inertial PM balance. (a) Independent verification. Have the balance manufacturer (or a representative approved by the...

  17. Security Verification of Secure MANET Routing Protocols

    DTIC Science & Technology

    2012-03-22

Security Verification of Secure MANET Routing Protocols. Thesis AFIT/GCS/ENG/12-03, presented to the Faculty, Department of Electrical and Computer Engineering, Air Force Institute of Technology, Department of the Air Force. Matthew F. Steele, B.S.E.E., Captain, USAF. Distribution unlimited.

  18. NEUTRON MULTIPLICITY AND ACTIVE WELL NEUTRON COINCIDENCE VERIFICATION MEASUREMENTS PERFORMED FOR MARCH 2009 SEMI-ANNUAL DOE INVENTORY

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dewberry, R.; Ayers, J.; Tietze, F.

The Analytical Development (AD) Section field nuclear measurement group performed six 'best available technique' verification measurements to satisfy a DOE requirement instituted for the March 2009 semi-annual inventory. The requirement of reference 1 yielded the need for the SRNL Research Operations Department Material Control & Accountability (MC&A) group to measure the Pu content of five items and the highly enriched uranium (HEU) content of two. No 14Q-qualified measurement equipment was available to satisfy the requirement. The AD field nuclear group has routinely performed the required confirmatory measurements for the semi-annual inventories for fifteen years using sodium iodide and high-purity germanium (HpGe) {gamma}-ray pulse height analysis nondestructive assay (NDA) instruments. With appropriate {gamma}-ray acquisition modeling, the HpGe spectrometers can be used to perform verification-type quantitative assay for Pu isotopics and HEU content. The AD nuclear NDA group is widely experienced with this type of measurement and reports content for these species in requested process control, MC&A booking, and holdup measurement assays Site-wide. However, none of the AD HpGe {gamma}-ray spectrometers has been 14Q-qualified, and the requirement of reference 1 specifically excluded {gamma}-ray PHA measurements from those it would accept for the required verification measurements. The requirement of reference 1 was a new requirement for which the Savannah River National Laboratory (SRNL) Research Operations Department (ROD) MC&A group was unprepared. The criteria for exemption from verification were: (1) isotope content below 50 grams; (2) intrinsically tamper-indicating or TID-sealed items which contain a Category IV quantity of material; (3) assembled components; and (4) laboratory samples. Therefore all SRNL Material Balance Area (MBA) items with greater than 50 grams total Pu or greater than 50 grams HEU were subject to a verification measurement. The pass

  19. ENVIRONMENTAL TECHNOLOGY VERIFICATION AND INDOOR AIR

    EPA Science Inventory

    The paper discusses environmental technology verification and indoor air. RTI has responsibility for a pilot program for indoor air products as part of the U.S. EPA's Environmental Technology Verification (ETV) program. The program objective is to further the development of sel...

  20. Gender Verification of Female Olympic Athletes.

    ERIC Educational Resources Information Center

    Dickinson, Barry D.; Genel, Myron; Robinowitz, Carolyn B.; Turner, Patricia L.; Woods, Gary L.

    2002-01-01

    Gender verification of female athletes has long been criticized by geneticists, endocrinologists, and others in the medical community. Recently, the International Olympic Committee's Athletic Commission called for discontinuation of mandatory laboratory-based gender verification of female athletes. This article discusses normal sexual…

  1. Flexible gas insulated transmission line having regions of reduced electric field

    DOEpatents

    Cookson, Alan H.; Fischer, William H.; Yoon, Kue H.; Meyer, Jeffry R.

    1983-01-01

A gas insulated transmission line having radially flexible field control means for reducing the electric field along the periphery of the inner conductor at the predetermined locations where the support insulators are located. The radially flexible field control means of the invention includes several structural variations of the inner conductor, wherein carefully controlling the length-to-depth ratio of surface depressions produces regions of reduced electric field. Several embodiments of the invention dispose a flexible connector at the predetermined location along the inner conductor where the surface depressions that control the reduced electric field are located.

  2. Background field removal using a region adaptive kernel for quantitative susceptibility mapping of human brain

    NASA Astrophysics Data System (ADS)

    Fang, Jinsheng; Bao, Lijun; Li, Xu; van Zijl, Peter C. M.; Chen, Zhong

    2017-08-01

Background field removal is an important MR phase preprocessing step for quantitative susceptibility mapping (QSM). It separates the local field induced by tissue magnetic susceptibility sources from the background field generated by sources outside a region of interest, e.g. the brain, such as the air-tissue interface. In the vicinity of air-tissue boundaries, e.g. the skull and paranasal sinuses, where large susceptibility variations exist, present background field removal methods are usually insufficient, and these regions often need to be excluded by brain mask erosion at the expense of losing local field information, and thus susceptibility measures, in these regions. In this paper, we propose an extension of the variable-kernel sophisticated harmonic artifact reduction for phase data (V-SHARP) background field removal method using a region-adaptive kernel (R-SHARP), in which a scalable spherical Gaussian kernel (SGK) is employed with its kernel radius and weights adjustable according to an energy functional reflecting the magnitude of field variation. The energy functional is defined in terms of a contour and two fitting functions incorporating regularization terms, from which a curve evolution model in level set formulation is derived for energy minimization. We use it to detect regions with a large field gradient caused by strong susceptibility variation. In such regions, the SGK has a small radius and a high weight at the sphere center, adaptive to the voxel energy of the field perturbation. Using the proposed method, the background field generated by external sources can be effectively removed, giving a more accurate estimation of the local field and thus of the QSM dipole inversion that maps local tissue susceptibility sources.
Numerical simulation, phantom and in vivo human brain data demonstrate improved performance of R-SHARP compared to V-SHARP and RESHARP (regularization enabled SHARP) methods, even when the whole paranasal sinus regions
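The mean-value idea behind SHARP-family filters can be illustrated with a 1D toy sketch (a hypothetical simplification, not the R-SHARP algorithm): a harmonic background field, here a linear ramp, equals its own symmetric local mean, so subtracting an interval mean (the 1D analogue of the spherical mean) removes the background and leaves a blurred local field, which the real methods then deconvolve.

```python
def smv(f, r):
    """Symmetric mean-value filter: the 1D analogue of the spherical
    mean used by SHARP-type background field removal."""
    out = []
    for i in range(len(f)):
        lo, hi = max(0, i - r), min(len(f), i + r + 1)
        out.append(sum(f[lo:hi]) / (hi - lo))
    return out

n, r = 200, 10
background = [0.05 * i for i in range(n)]                     # harmonic in 1D
local = [1.0 if 95 <= i <= 105 else 0.0 for i in range(n)]    # tissue source
field = [b + l for b, l in zip(background, local)]

# field - SMV(field) cancels the harmonic background exactly at interior
# points; what survives is the (blurred) local field to be deconvolved.
residual = [fi - si for fi, si in zip(field, smv(field, r))]
```

The same cancellation fails where the kernel overlaps the mask boundary, which is why V-SHARP shrinks the kernel near edges and why R-SHARP additionally adapts the kernel to regions of strong field variation.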

  3. Regression Verification Using Impact Summaries

    NASA Technical Reports Server (NTRS)

    Backes, John; Person, Suzette J.; Rungta, Neha; Thachuk, Oksana

    2013-01-01

Regression verification techniques are used to prove equivalence of syntactically similar programs. Checking equivalence of large programs, however, can be computationally expensive. Existing regression verification techniques rely on abstraction and decomposition techniques to reduce the computational effort of checking equivalence of the entire program. These techniques are sound but not complete. In this work, we propose a novel approach to improve scalability of regression verification by classifying the program behaviors generated during symbolic execution as either impacted or unimpacted. Our technique uses a combination of static analysis and symbolic execution to generate summaries of impacted program behaviors. The impact summaries are then checked for equivalence using an off-the-shelf decision procedure. We prove that our approach is both sound and complete for sequential programs, with respect to the depth bound of symbolic execution. Our evaluation on a set of sequential C artifacts shows that reducing the size of the summaries can help reduce the cost of software equivalence checking. Various reduction, abstraction, and compositional techniques have been developed to help scale software verification techniques to industrial-sized systems. Although such techniques have greatly increased the size and complexity of systems that can be checked, analysis of large software systems remains costly. Regression analysis techniques, e.g., regression testing [16], regression model checking [22], and regression verification [19], restrict the scope of the analysis by leveraging the differences between program versions. These techniques are based on the idea that if code is checked early in development, then subsequent versions can be checked against a prior (checked) version, leveraging the results of the previous analysis to reduce analysis cost of the current version. 
Regression verification addresses the problem of proving equivalence of closely related program
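The "sound and complete up to a depth bound" guarantee has a concrete everyday analogue: exhaustively checking two program versions over a bounded input domain. The sketch below is a toy stand-in for depth-bounded symbolic execution, not the impact-summary technique itself; `clamp_v1`/`clamp_v2` are invented example programs.

```python
from itertools import product

def clamp_v1(x, lo, hi):          # original version
    if x < lo:
        return lo
    if x > hi:
        return hi
    return x

def clamp_v2(x, lo, hi):          # syntactically different refactoring
    return max(lo, min(x, hi))

def bounded_equiv(f, g, domain):
    """Declare f and g equivalent if they agree on every input in the
    bounded domain -- sound and complete with respect to that bound,
    mirroring the paper's depth-bounded equivalence claim."""
    return all(f(*args) == g(*args) for args in domain)

# All (x, lo, hi) triples with lo <= hi in a small range.
dom = [a for a in product(range(-3, 4), repeat=3) if a[1] <= a[2]]
equivalent = bounded_equiv(clamp_v1, clamp_v2, dom)
```

A real regression verifier replaces the enumeration with symbolic path exploration and hands the resulting summaries to a decision procedure, but the shape of the guarantee is the same.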

  4. Verification measurements and clinical evaluation of the iPlan RT Monte Carlo dose algorithm for 6 MV photon energy

    NASA Astrophysics Data System (ADS)

    Petoukhova, A. L.; van Wingerden, K.; Wiggenraad, R. G. J.; van de Vaart, P. J. M.; van Egmond, J.; Franken, E. M.; van Santvoort, J. P. C.

    2010-08-01

    This study presents data for verification of the iPlan RT Monte Carlo (MC) dose algorithm (BrainLAB, Feldkirchen, Germany). MC calculations were compared with pencil beam (PB) calculations and verification measurements in phantoms with lung-equivalent material, air cavities or bone-equivalent material to mimic head and neck and thorax and in an Alderson anthropomorphic phantom. Dosimetric accuracy of MC for the micro-multileaf collimator (MLC) simulation was tested in a homogeneous phantom. All measurements were performed using an ionization chamber and Kodak EDR2 films with Novalis 6 MV photon beams. Dose distributions measured with film and calculated with MC in the homogeneous phantom are in excellent agreement for oval, C and squiggle-shaped fields and for a clinical IMRT plan. For a field with completely closed MLC, MC is much closer to the experimental result than the PB calculations. For fields larger than the dimensions of the inhomogeneities the MC calculations show excellent agreement (within 3%/1 mm) with the experimental data. MC calculations in the anthropomorphic phantom show good agreement with measurements for conformal beam plans and reasonable agreement for dynamic conformal arc and IMRT plans. For 6 head and neck and 15 lung patients a comparison of the MC plan with the PB plan was performed. Our results demonstrate that MC is able to accurately predict the dose in the presence of inhomogeneities typical for head and neck and thorax regions with reasonable calculation times (5-20 min). Lateral electron transport was well reproduced in MC calculations. We are planning to implement MC calculations for head and neck and lung cancer patients.

  5. 46 CFR 61.40-3 - Design verification testing.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... 46 Shipping 2 2011-10-01 2011-10-01 false Design verification testing. 61.40-3 Section 61.40-3... INSPECTIONS Design Verification and Periodic Testing of Vital System Automation § 61.40-3 Design verification testing. (a) Tests must verify that automated vital systems are designed, constructed, and operate in...

  6. Dosimetric accuracy of Kodak EDR2 film for IMRT verifications.

    PubMed

    Childress, Nathan L; Salehpour, Mohammad; Dong, Lei; Bloch, Charles; White, R Allen; Rosen, Isaac I

    2005-02-01

difference and 2 mm criteria to ion chamber measurements for both sliding window and step-and-shoot fluence map verifications. Calibrated film results agreed with ion chamber measurements within 5%/2 mm criteria for transverse-plane full-plan verifications, but were consistently low. When properly calibrated, EDR2 film can be an adequate two-dimensional dosimeter for IMRT verifications, although it may underestimate doses in regions with long exposure times.
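The "%/mm" acceptance criteria quoted above are usually evaluated with a gamma-style composite of dose difference and distance-to-agreement. A simplified 1D gamma index sketch follows (an illustration of the concept, not the film-analysis software used in the study; dose tolerance is taken as an absolute value here rather than a percentage of maximum dose):

```python
import math

def gamma_1d(ref, ev, spacing_mm, dose_tol, dta_mm):
    """1D gamma index: for each reference point, the minimum over all
    evaluated points of the combined dose-difference/distance metric.
    A reference point passes when its gamma is <= 1."""
    gammas = []
    for i, dr in enumerate(ref):
        best = min(
            math.hypot((de - dr) / dose_tol, (j - i) * spacing_mm / dta_mm)
            for j, de in enumerate(ev)
        )
        gammas.append(best)
    return gammas

ref = [0.0, 0.2, 0.8, 1.0, 1.0, 0.8, 0.2, 0.0]   # planned profile
ev = ref[:]                                       # measured (identical here)
g = gamma_1d(ref, ev, spacing_mm=1.0, dose_tol=0.05, dta_mm=2.0)
pass_rate = sum(gi <= 1.0 for gi in g) / len(g)
```

For identical profiles every gamma is zero; a small spatial shift or dose offset raises individual gamma values, and the pass rate summarizes agreement over the whole plane in 2D practice.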

  7. Simulation-based MDP verification for leading-edge masks

    NASA Astrophysics Data System (ADS)

    Su, Bo; Syrel, Oleg; Pomerantsev, Michael; Hagiwara, Kazuyuki; Pearman, Ryan; Pang, Leo; Fujimara, Aki

    2017-07-01

For IC design starts below the 20 nm technology node, the assist features on photomasks shrink well below 60 nm, and the printed patterns of those features on masks written by VSB eBeam writers start to show large deviations from the mask designs. Traditional geometry-based fracturing starts to show large errors for those small features. As a result, other mask data preparation (MDP) methods have become available and been adopted, such as rule-based mask process correction (MPC), model-based MPC and, eventually, model-based MDP. The new MDP methods may place shot edges slightly differently from the target to compensate for mask process effects, so that the final patterns on a mask are much closer to the design (which can be viewed as the ideal mask), especially for the assist features. Such an alteration generally produces better masks that are closer to the intended mask design. Traditional XOR-based MDP verification cannot detect problems caused by eBeam effects. Much like model-based OPC verification, which became a necessity for OPC a decade ago, we see the same trend in MDP today. A simulation-based MDP verification solution requires a GPU-accelerated computational geometry engine with simulation capabilities. To have a meaningful simulation-based mask check, a good mask process model is needed. The TrueModel® system is a field-tested physical mask model developed by D2S. The GPU-accelerated D2S Computational Design Platform (CDP) is used to run simulation-based mask checks as well as model-based MDP. In addition to simulation-based checks such as mask EPE or dose margin, geometry-based rules are also available to detect quality issues such as slivers or CD splits. Dose-margin-related hotspots can also be detected by setting a correct detection threshold. In this paper, we will demonstrate GPU acceleration for geometry processing and give examples of mask check results and performance data. GPU acceleration is necessary to make simulation-based mask MDP verification

  8. SU-E-T-435: Development and Commissioning of a Complete System for In-Vivo Dosimetry and Range Verification in Proton Therapy

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Samuel, D; Testa, M; Park, Y

Purpose: In-vivo dose and beam range verification in proton therapy could play significant roles in proton treatment validation and improvement. In-vivo beam range verification, in particular, could enable new treatment techniques, one of which, for example, could be the use of anterior fields for prostate treatment instead of the opposed lateral fields of current practice. We have developed and commissioned an integrated system with hardware, software and workflow protocols to provide a complete solution, simultaneously, for both in-vivo dosimetry and range verification in proton therapy. Methods: The system uses a matrix of diodes, up to 12 in total, but separable into three groups for flexibility in application. A special amplifier was developed to capture the extremely small signals from very low proton beam current. The software was developed within iMagX, a general platform for image processing in radiation therapy applications. The range determination exploits the inherent relationship between the internal range modulation clock of the proton therapy system and the radiological depth at the point of measurement. The commissioning of the system for in-vivo dosimetry and for range verification was conducted separately using an anthropomorphic phantom. EBT films and TLDs were used for dose comparisons, and a range scan of the beam distal fall-off was used as ground truth for range verification. Results: For in-vivo dose measurement, the results were in agreement with TLD and EBT films and were within 3% of treatment planning calculations. For range verification, a precision of 0.5 mm was achieved in homogeneous phantoms, and a precision of 2 mm for the anthropomorphic pelvic phantom, except at points with significant range mixing. Conclusion: We completed the commissioning of our system for in-vivo dosimetry and range verification in proton therapy. The results suggest that the system is ready for clinical trials on patients.

  9. Magnetic cleanliness verification approach on tethered satellite

    NASA Technical Reports Server (NTRS)

    Messidoro, Piero; Braghin, Massimo; Grande, Maurizio

    1990-01-01

    Magnetic cleanliness testing was performed on the Tethered Satellite as the last step of an articulated verification campaign aimed at demonstrating the capability of the satellite to support its TEMAG (TEthered MAgnetometer) experiment. Tests at unit level and analytical predictions/correlations using a dedicated mathematical model (GANEW program) are also part of the verification activities. Details of the tests are presented, and the results of the verification are described together with recommendations for later programs.

  10. 24 CFR 4001.112 - Income verification.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... 24 Housing and Urban Development 5 2010-04-01 2010-04-01 false Income verification. 4001.112... Requirements and Underwriting Procedures § 4001.112 Income verification. The mortgagee shall use FHA's procedures to verify the mortgagor's income and shall comply with the following additional requirements: (a...

  11. Self-verification and contextualized self-views.

    PubMed

    Chen, Serena; English, Tammy; Peng, Kaiping

    2006-07-01

Whereas most self-verification research has focused on people's desire to verify their global self-conceptions, the present studies examined self-verification with regard to contextualized self-views, i.e. views of the self in particular situations and relationships. It was hypothesized that individuals whose core self-conceptions include contextualized self-views should seek to verify these self-views. In Study 1, the more individuals defined the self in dialectical terms, the more their judgments were biased in favor of verifying over nonverifying feedback about a negative, situation-specific self-view. In Study 2, consistent with research on gender differences in the importance of relationships to the self-concept, women but not men showed a similar bias toward feedback about a negative, relationship-specific self-view, a pattern not seen for global self-views. Together, the results support the notion that self-verification occurs for core self-conceptions, whatever form(s) they may take. Individual differences in self-verification and the nature of selfhood and authenticity are discussed.

  12. HDM/PASCAL Verification System User's Manual

    NASA Technical Reports Server (NTRS)

    Hare, D.

    1983-01-01

    The HDM/Pascal verification system is a tool for proving the correctness of programs written in PASCAL and specified in the Hierarchical Development Methodology (HDM). This document assumes an understanding of PASCAL, HDM, program verification, and the STP system. The steps toward verification which this tool provides are parsing programs and specifications, checking the static semantics, and generating verification conditions. Some support functions are provided such as maintaining a data base, status management, and editing. The system runs under the TOPS-20 and TENEX operating systems and is written in INTERLISP. However, no knowledge is assumed of these operating systems or of INTERLISP. The system requires three executable files, HDMVCG, PARSE, and STP. Optionally, the editor EMACS should be on the system in order for the editor to work. The file HDMVCG is invoked to run the system. The files PARSE and STP are used as lower forks to perform the functions of parsing and proving.

  13. TH-AB-202-02: Real-Time Verification and Error Detection for MLC Tracking Deliveries Using An Electronic Portal Imaging Device

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    J Zwan, B; Central Coast Cancer Centre, Gosford, NSW; Colvill, E

    2016-06-15

Purpose: The added complexity of real-time adaptive multi-leaf collimator (MLC) tracking increases the likelihood of undetected MLC delivery errors. In this work we develop and test a system for real-time delivery verification and error detection for MLC tracking radiotherapy using an electronic portal imaging device (EPID). Methods: The delivery verification system relies on acquisition and real-time analysis of transit EPID image frames acquired at 8.41 fps. In-house software was developed to extract the MLC positions from each image frame. Three comparison metrics were used to verify the MLC positions in real time: (1) field size, (2) field location and (3) field shape. The delivery verification system was tested for 8 VMAT MLC tracking deliveries (4 prostate and 4 lung) where real patient target motion was reproduced using a HexaMotion motion stage and a Calypso system. Sensitivity and detection delay were quantified for various types of MLC and system errors. Results: For both the prostate and lung test deliveries the MLC-defined field size was measured with an accuracy of 1.25 cm{sup 2} (1 SD). The field location was measured with an accuracy of 0.6 mm and 0.8 mm (1 SD) for lung and prostate respectively. Field location errors (i.e. tracking in the wrong direction) with a magnitude of 3 mm were detected within 0.4 s of occurrence in the X direction and 0.8 s in the Y direction. Systematic MLC gap errors were detected as small as 3 mm. The method was not found to be sensitive to random MLC errors or individual MLC calibration errors up to 5 mm. Conclusion: EPID imaging may be used for independent real-time verification of MLC trajectories during MLC tracking deliveries. Thresholds have been determined for error detection, and the system has been shown to be sensitive to a range of delivery errors.
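Two of the three comparison metrics are straightforward to compute once the aperture has been segmented from an EPID frame. The sketch below (hypothetical mask and pixel size, not the in-house software described) derives field size and field location from a binary aperture mask:

```python
def aperture_metrics(mask, pixel_mm):
    """Field size (cm^2) and field location (centroid, mm) from a binary
    MLC-aperture mask extracted from a transit EPID frame."""
    pts = [(r, c) for r, row in enumerate(mask)
                  for c, v in enumerate(row) if v]
    area_cm2 = len(pts) * (pixel_mm / 10.0) ** 2       # pixel area in cm^2
    cy = sum(r for r, _ in pts) / len(pts) * pixel_mm  # centroid row (mm)
    cx = sum(c for _, c in pts) / len(pts) * pixel_mm  # centroid col (mm)
    return area_cm2, (cx, cy)

# 10 x 10 pixel open field at 1 mm/pixel: 1 cm^2, centred at (4.5, 4.5) mm.
mask = [[1] * 10 for _ in range(10)]
size, centre = aperture_metrics(mask, pixel_mm=1.0)
```

Comparing each frame's metrics against the planned values, with thresholds like those quoted in the abstract, is what turns these measurements into real-time error detection.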

  14. 30 CFR 250.909 - What is the Platform Verification Program?

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... 30 Mineral Resources 2 2011-07-01 2011-07-01 false What is the Platform Verification Program? 250... Platforms and Structures Platform Verification Program § 250.909 What is the Platform Verification Program? The Platform Verification Program is the MMS approval process for ensuring that floating platforms...

  15. Fast regional readout CMOS Image Sensor for dynamic MLC tracking

    NASA Astrophysics Data System (ADS)

    Zin, H.; Harris, E.; Osmond, J.; Evans, P.

    2014-03-01

Advanced radiotherapy techniques such as volumetric modulated arc therapy (VMAT) require verification of the complex beam delivery, including tracking of multileaf collimators (MLC) and monitoring the dose rate. This work explores the feasibility of a prototype Complementary Metal-Oxide-Semiconductor (CMOS) Image Sensor (CIS) for tracking these complex treatments by utilising fast, region-of-interest (ROI) readout functionality. An automatic edge-tracking algorithm was used to locate the MLC leaf edges moving at various speeds (from a moving triangular field shape) and imaged at various sensor frame rates. The CIS demonstrates successful edge detection of the dynamic MLC motion to within an accuracy of 1.0 mm. This demonstrates the feasibility of the sensor for verifying treatment deliveries involving dynamic MLC at up to ~400 frames per second (equivalent to the linac pulse rate), which is superior to current techniques such as electronic portal imaging devices (EPID). CIS provides the basis for an essential real-time verification tool, useful in assessing accurate delivery of complex high-energy radiation to the tumour and, ultimately, in achieving better cure rates for cancer patients.
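A common way to localise a leaf edge with sub-pixel accuracy, consistent with the millimetre-level figure quoted, is the 50%-of-maximum crossing of the penumbra profile. This is a generic sketch under that assumption, not the paper's actual algorithm; the profile values and pixel pitch are invented:

```python
def edge_position_mm(profile, pixel_mm):
    """Leaf-edge position as the 50%-of-max crossing of a 1D intensity
    profile, linearly interpolated between pixels for sub-pixel accuracy."""
    half = 0.5 * max(profile)
    for i in range(len(profile) - 1):
        if profile[i] < half <= profile[i + 1]:
            frac = (half - profile[i]) / (profile[i + 1] - profile[i])
            return (i + frac) * pixel_mm
    return None  # no rising edge found in this profile

# Rising penumbra sampled at 1 mm/pixel; the half-maximum lies between
# pixels 3 (0.25) and 4 (0.75), giving an edge at 3.5 mm.
edge = edge_position_mm([0.0, 0.0, 0.0, 0.25, 0.75, 1.0, 1.0], 1.0)
```

Running such a detector per leaf row on each ROI frame is what allows edge positions to be reported at the several-hundred-fps rates discussed above.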

  16. SU-E-T-278: Realization of Dose Verification Tool for IMRT Plan Based On DPM

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cai, Jinfeng; Cao, Ruifen; Dai, Yumei

    Purpose: To build a Monte Carlo dose verification tool for IMRT plans by implementing an irradiation source model in the DPM code, extending the ability of DPM to calculate arbitrary incident angles and irregular, inhomogeneous fields. Methods: The dose distribution of an irregular, inhomogeneous irradiation field was calculated using a virtual source and an energy spectrum unfolded from accelerator measurement data, combined with optimized intensity maps. The irradiation source model of the accelerator was replaced by a grid-based surface source. The contour and intensity distribution of the surface source were optimized by the ARTS (Accurate/Advanced Radiotherapy System) optimization module based on the tumor configuration. The weight of each emitter was determined by the grid intensity, and its direction by the combination of the virtual source and the emitting position. The photon energy spectrum unfolded from the accelerator measurement data was adjusted by compensating for the contaminant electron source. For verification, measured data and a realistic clinical IMRT plan were compared with DPM dose calculations. Results: The regular field was verified against measured data; the differences were acceptable (<2% inside the field, 2-3 mm in the penumbra). The DPM dose calculation for an irregular field was also compared with that of FSPB (Finite Size Pencil Beam), with a gamma-analysis passing rate of 95.1% for peripheral lung cancer. Both the regular field and the irregular rotational field were within the permitted error range. The computing time for regular fields was less than 2 h, and 160 min for the peripheral lung cancer test. Through parallel processing, the adapted DPM could complete the calculation of an IMRT plan within half an hour. Conclusion: The adapted parallelized DPM code with the irradiation source model is faster than classic Monte Carlo codes. Its computational accuracy

  17. 30 CFR 250.913 - When must I resubmit Platform Verification Program plans?

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... Structures Platform Verification Program § 250.913 When must I resubmit Platform Verification Program plans? (a) You must resubmit any design verification, fabrication verification, or installation verification... 30 Mineral Resources 2 2010-07-01 2010-07-01 false When must I resubmit Platform Verification...

  18. Built-in-Test Verification Techniques

    DTIC Science & Technology

    1987-02-01

    report documents the results of the effort for the Rome Air Development Center Contract F30602-84-C-0021, BIT Verification Techniques. The work was...Richard Spillman of Spillman Research Associates. The principal investigators were Mike Partridge and subsequently Jeffrey Albert. The contract was...two year effort to develop techniques for Built-In Test (BIT) verification. The objective of the contract was to develop specifications and technical

  19. Recover Act. Verification of Geothermal Tracer Methods in Highly Constrained Field Experiments

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Becker, Matthew W.

    2014-05-16

    The prediction of geothermal system efficiency is strongly linked to the character of the flow system that connects injector and producer wells. If water flow develops channels or “short circuiting” between injection and extraction wells, thermal sweep is poor and much of the reservoir is left untapped. The purpose of this project was to understand how channelized flow develops in fractured geothermal reservoirs and how it can be measured in the field. We explored two methods of assessing channelization: hydraulic connectivity tests and tracer tests. These methods were tested at a field site using two verification methods: ground penetrating radar (GPR) images of saline tracer and heat transfer measurements using distributed temperature sensing (DTS). The field site for these studies was the Altona Flat Fractured Rock Research Site located in northeastern New York State. Altona Flat Rock is an experimental site considered a geologic analog for some geothermal reservoirs given its low matrix porosity. Because the soil overburden is thin, the site provided unique access to saturated bedrock fractures and the ability to image using GPR, which does not effectively penetrate most soils. Five boreholes were drilled in a “five spot” pattern covering 100 m2 and hydraulically isolated in a single bedding plane fracture. This simple system allowed a complete characterization of the fracture. Nine small diameter boreholes were drilled from the surface to just above the fracture to allow measurement of heat transfer between the fracture and the rock matrix. The focus of the hydraulic investigation was periodic hydraulic testing. In such tests, rather than pumping or injecting in a well at a constant rate, the flow is varied to produce an oscillating pressure signal. This pressure signal is sensed in other wells, and the attenuation and phase lag between source and receptor is an indication of hydraulic connection. We found that these tests were much more effective than
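
    The attenuation and phase lag of a periodic hydraulic test can be estimated by demodulating both pressure records at the forcing frequency. The function below is an illustrative sketch under that assumption; the names and the demodulation approach are not the project's actual processing chain.

```python
import numpy as np

def phase_lag_and_attenuation(source, receptor, dt, freq):
    """Project both pressure records onto a complex exponential at the
    forcing frequency and return (phase lag in radians, amplitude
    ratio receptor/source). Lag is positive when the receptor lags."""
    t = np.arange(len(source)) * dt
    ref = np.exp(-2j * np.pi * freq * t)
    a = np.dot(source, ref)    # complex amplitude of the source signal
    b = np.dot(receptor, ref)  # complex amplitude of the receptor signal
    return np.angle(a / b), abs(b) / abs(a)
```

    Larger attenuation and phase lag indicate a weaker hydraulic connection between the source and receptor wells; the estimate is exact when the record spans an integer number of forcing periods.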

  20. Background field removal using a region adaptive kernel for quantitative susceptibility mapping of human brain.

    PubMed

    Fang, Jinsheng; Bao, Lijun; Li, Xu; van Zijl, Peter C M; Chen, Zhong

    2017-08-01

    Background field removal is an important MR phase preprocessing step for quantitative susceptibility mapping (QSM). It separates the local field induced by tissue magnetic susceptibility sources from the background field generated by sources outside a region of interest (e.g. the brain), such as the air-tissue interface. In the vicinity of air-tissue boundaries, e.g. the skull and paranasal sinuses, where large susceptibility variations exist, present background field removal methods are usually insufficient, and these regions often need to be excluded by brain mask erosion at the expense of losing local field information, and thus susceptibility measures, in these regions. In this paper, we propose an extension of the variable-kernel sophisticated harmonic artifact reduction for phase data (V-SHARP) background field removal method using a region adaptive kernel (R-SHARP), in which a scalable spherical Gaussian kernel (SGK) is employed with its kernel radius and weights adjustable according to an energy functional reflecting the magnitude of field variation. The energy functional is defined in terms of a contour and two fitting functions incorporating regularization terms, from which a curve evolution model in level set formulation is derived for energy minimization. We utilize it to detect regions with a large field gradient caused by strong susceptibility variation. In such regions, the SGK will have a small radius and a high weight at the sphere center, in a manner adaptive to the voxel energy of the field perturbation. Using the proposed method, the background field generated by external sources can be effectively removed to obtain a more accurate estimation of the local field, and thus of the QSM dipole inversion to map local tissue susceptibility sources.
Numerical simulation, phantom and in vivo human brain data demonstrate improved performance of R-SHARP compared to V-SHARP and RESHARP (regularization enabled SHARP) methods, even when the whole paranasal sinus regions
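
    The scalable spherical Gaussian kernel idea can be sketched as below: a normalized Gaussian truncated to a sphere, with a radius that shrinks where the local field gradient is large. The radius mapping is a simple illustrative stand-in for the paper's energy functional, and all parameter values are assumptions.

```python
import numpy as np

def spherical_gaussian_kernel(radius_vox, sigma):
    """Normalized spherical Gaussian kernel (SGK): Gaussian weights
    inside a sphere of the given radius, zero outside, summing to 1."""
    r = int(np.ceil(radius_vox))
    z, y, x = np.mgrid[-r:r + 1, -r:r + 1, -r:r + 1]
    d2 = x * x + y * y + z * z
    w = np.exp(-d2 / (2.0 * sigma ** 2))
    w[d2 > radius_vox ** 2] = 0.0      # truncate to the sphere
    return w / w.sum()

def adaptive_radius(grad_mag, r_min=1.0, r_max=6.0):
    """Map a normalized field-gradient magnitude in [0, 1] to a kernel
    radius: small radius (and hence sharper weighting) where the field
    varies strongly, large radius in smooth regions."""
    g = np.clip(grad_mag, 0.0, 1.0)
    return r_max - (r_max - r_min) * g
```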

  1. Automated detection of open magnetic field regions in EUV images

    NASA Astrophysics Data System (ADS)

    Krista, Larisza Diana; Reinard, Alysha

    2016-05-01

    Open magnetic regions on the Sun are either long-lived (coronal holes) or transient (dimmings) in nature, but both appear as dark regions in EUV images. For this reason their detection can be done in a similar way. As coronal holes are often large and long-lived in comparison to dimmings, their detection is more straightforward. The Coronal Hole Automated Recognition and Monitoring (CHARM) algorithm detects coronal holes using EUV images and a magnetogram. The EUV images are used to identify dark regions, and the magnetogram allows us to determine whether a dark region is unipolar - a characteristic of coronal holes. There is no temporal sensitivity in this process, since coronal hole lifetimes span days to months. Dimming regions, however, emerge and disappear within hours. Hence, the time and location of a dimming emergence need to be known to successfully identify dimmings and distinguish them from regular coronal holes. Currently, the Coronal Dimming Tracker (CoDiT) algorithm is semi-automated - it requires the dimming emergence time and location as input. With those inputs we can identify the dimming and track it through its lifetime. CoDiT has also been developed to allow the tracking of dimmings that split or merge - a typical feature of dimmings. The advantage of these particular algorithms is their ability to adapt to detecting different types of open field regions. For coronal hole detection, each full-disk solar image is processed individually to determine a threshold for that image; hence, we are not limited to a single pre-determined threshold. For dimming regions we also allow individual thresholds for each dimming, as they can differ substantially. This flexibility is necessary for a subjective analysis of the studied regions. These algorithms were developed with the goal of allowing us to better understand the processes that give rise to eruptive and non-eruptive open field regions. We aim to study how these regions evolve over time and what environmental
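
    The two-step detection logic (per-image dark-region threshold, then a unipolarity check against the magnetogram) can be sketched as follows. The threshold rule and the skewness cutoff are illustrative assumptions, not CHARM's actual criteria.

```python
import numpy as np

def detect_dark_regions(img, k=0.5):
    """Per-image threshold for dark (open-field candidate) pixels:
    below k times the median disk intensity (illustrative rule)."""
    return img < k * np.median(img)

def is_unipolar(mask, magnetogram, skew_tol=0.6):
    """Call a dark region unipolar - a coronal-hole signature - if one
    magnetic polarity dominates the signed flux inside the mask."""
    b = magnetogram[mask]
    return abs(b.sum()) / np.abs(b).sum() > skew_tol
```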

  2. Complete data preparation flow for Massively Parallel E-Beam lithography on 28nm node full-field design

    NASA Astrophysics Data System (ADS)

    Fay, Aurélien; Browning, Clyde; Brandt, Pieter; Chartoire, Jacky; Bérard-Bergery, Sébastien; Hazart, Jérôme; Chagoya, Alexandre; Postnikov, Sergei; Saib, Mohamed; Lattard, Ludovic; Schavione, Patrick

    2016-03-01

    Massively parallel mask-less electron beam lithography (MP-EBL) offers large intrinsic flexibility at a low cost of ownership in comparison to conventional optical lithography tools. This attractive direct-write technique needs a dedicated data preparation flow to correct both electronic and resist processes. Moreover, data preparation has to be completed in a short enough time to preserve the flexibility advantage of MP-EBL. While MP-EBL tools have currently entered an advanced stage of development, this paper focuses on the data preparation side of the work, specifically for the MAPPER Lithography FLX-1200 tool [1]-[4], using the ASELTA Nanographics Inscale software. The complete flow, as well as the methodology used to achieve full-field layout data preparation within an acceptable cycle time, is presented. The layout used for the data preparation evaluation was a 28 nm technology node Metal1 chip with a field size of 26x33 mm2, compatible with typical stepper/scanner field sizes and wafer stepping plans. Proximity Effect Correction (PEC) was applied to the entire field, which was then exported as a single file in MAPPER Lithography's machine format, containing fractured shapes and dose assignments. The Soft Edge beam-to-beam stitching method was employed in the specific overlap regions defined by the machine format as well. In addition to PEC, verification of the correction was included as part of the overall data preparation cycle time. This verification step was executed on the machine file format to ensure pattern fidelity and accuracy as late in the flow as possible. Verification over the full chip, involving billions of evaluation points, is performed both at nominal conditions and at process-window corners in order to ensure proper exposure and process latitude. The complete MP-EBL data preparation flow was demonstrated for a 28 nm node Metal1 layout in 37 hours. The final verification step shows that the Edge Placement Error (EPE) is kept below 2.25 nm

  3. 30 CFR 250.909 - What is the Platform Verification Program?

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 30 Mineral Resources 2 2010-07-01 2010-07-01 false What is the Platform Verification Program? 250... Verification Program § 250.909 What is the Platform Verification Program? The Platform Verification Program is the MMS approval process for ensuring that floating platforms; platforms of a new or unique design...

  4. SU-E-T-490: Independent Three-Dimensional (3D) Dose Verification of VMAT/SBRT Using EPID and Cloud Computing

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ding, A; Han, B; Bush, K

    Purpose: Dosimetric verification of VMAT/SBRT is currently performed on one or two planes in a phantom with either film or array detectors. A robust and easy-to-use 3D dosimetric tool has been sought since the advent of conformal radiation therapy. Here we present such a strategy for an independent 3D VMAT/SBRT plan verification system based on the combined use of an EPID and cloud-based Monte Carlo (MC) dose calculation. Methods: The 3D dosimetric verification proceeds in two steps. First, the plan is delivered with a high-resolution portable EPID mounted on the gantry, and the EPID-captured gantry-angle-resolved VMAT/SBRT field images are converted into fluence using the EPID pixel response function derived from MC simulations. The fluence is then resampled and used as the input for an in-house developed Amazon cloud-based MC software to reconstruct the 3D dose distribution. The accuracy of the developed 3D dosimetric tool was assessed using a Delta4 phantom with various field sizes (square, circular, rectangular, and irregular MLC fields) and different patient cases. The method was applied to validate VMAT/SBRT plans using WFF and FFF photon beams (Varian TrueBeam STx). Results: The proposed method yielded results consistent with the Delta4 measurements. For points on the two detector planes, good agreement within 1.5% was found for all the test fields. Patient VMAT/SBRT plan studies revealed a similar level of accuracy: average gamma-index passing rates of 99.2 ± 0.6% (3mm/3%), 97.4 ± 2.4% (2mm/2%), and 72.6 ± 8.4% (1mm/1%). Conclusion: A valuable 3D dosimetric verification strategy has been developed for VMAT/SBRT plan validation. The technique provides a viable solution for a number of intractable dosimetry problems, such as small fields and plans with high dose gradients.

  5. Regional United States electric field and GIC hazard impacts (Invited)

    NASA Astrophysics Data System (ADS)

    Gannon, J. L.; Balch, C. C.; Trichtchenko, L.

    2013-12-01

    Geomagnetically Induced Currents (GICs) are primarily driven by impulsive geomagnetic disturbances created by the interaction between the Earth's magnetosphere and sharp velocity, density, and magnetic field enhancements in the solar wind. However, the magnitude of the induced electric field response at the ground level, and therefore the resulting hazard to the bulk power system, is determined not only by magnetic drivers, but also by the underlying geology. Convolution techniques are used to calculate surface electric fields beginning from the spectral characteristics of magnetic field drivers and the frequency response of the local geology. Using these techniques, we describe historical scenarios for regions across the United States, and the potential impact of large events on electric power infrastructure.
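
    The convolution technique described above amounts to multiplying the magnetic-field spectrum by the ground's frequency response. A minimal frequency-domain sketch under the plane-wave assumption is shown below; the function name and the impedance callable are illustrative, not the authors' code.

```python
import numpy as np

def surface_electric_field(b_t, dt, impedance):
    """Estimate a surface E-field from a magnetic-field time series by
    applying a ground transfer function Z(f) in the frequency domain
    (plane-wave method): E(f) = Z(f) * B(f) / mu0."""
    B = np.fft.rfft(b_t)                     # spectrum of B (tesla)
    f = np.fft.rfftfreq(len(b_t), dt)        # frequency axis (Hz)
    mu0 = 4e-7 * np.pi                       # vacuum permeability (H/m)
    return np.fft.irfft(impedance(f) * B / mu0, n=len(b_t))
```

    For a layered-earth model, `impedance` would return the complex surface impedance of the local geology at each frequency, which is how the underlying geology enters the hazard estimate.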

  6. Verification Challenges at Low Numbers

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Benz, Jacob M.; Booker, Paul M.; McDonald, Benjamin S.

    2013-07-16

    This paper will explore the difficulties of deep reductions by examining the technical verification challenges. At each step on the road to low numbers, the verification required to ensure compliance of all parties will increase significantly. Looking beyond New START, the next step will likely include warhead limits in the neighborhood of 1000 (Pifer 2010). Further reductions will include stepping stones at hundreds of warheads, and then tens of warheads, before final elimination of the last few remaining warheads and weapons could be considered. This paper will focus on these three threshold reduction levels: 1000, hundreds, and tens. For each, the issues and challenges will be discussed, potential solutions will be identified, and the verification technologies and chain-of-custody measures that address these solutions will be surveyed. It is important to note that many of the issues that need to be addressed have no current solution. In these cases, the paper will explore new or novel technologies that could be applied. These technologies will draw from the research and development that is ongoing throughout the national lab complex, and will look at technologies utilized in other areas of industry for their application to arms control verification.

  7. Model Based Verification of Cyber Range Event Environments

    DTIC Science & Technology

    2015-12-10

    Model Based Verification of Cyber Range Event Environments Suresh K. Damodaran MIT Lincoln Laboratory 244 Wood St., Lexington, MA, USA...apply model based verification to cyber range event environment configurations, allowing for the early detection of errors in event environment...Environment Representation (CCER) ontology. We also provide an overview of a methodology to specify verification rules and the corresponding error

  8. Three-dimensional verification of ¹²⁵I seed stability after permanent implantation in the parotid gland and periparotid region.

    PubMed

    Fan, Yi; Huang, Ming-Wei; Zheng, Lei; Zhao, Yi-Jiao; Zhang, Jian-Guo

    2015-11-24

    To evaluate seed stability after permanent implantation in the parotid gland and periparotid region via a three-dimensional reconstruction of CT data. Fifteen patients treated from June 2008 to June 2012 at Peking University School and Hospital of Stomatology for parotid gland tumors with postoperative adjunctive (125)I interstitial brachytherapy were retrospectively reviewed in this study. Serial CT data were obtained during follow-up. Mimics and Geomagic Studio software were used for seed reconstruction and stability analysis, respectively. Seed loss and/or migration outside of the treated area were absent in all patients during follow-up (23-71 months). Total seed cluster volume was maximized on day 1 post-implantation due to edema and decreased significantly by an average of 13.5 % (SD = 9.80 %; 95 % CI, 6.82-17.68 %) during the first two months and an average of 4.5 % (SD = 3.60 %; 95 % CI, 2.29-6.29 %) during the next four months. Volume stabilized over the subsequent six months. (125)I seed number and location were stable with a general volumetric shrinkage tendency in the parotid gland and periparotid region. Three-dimensional seed reconstruction of CT images is feasible for visualization and verification of implanted seeds in parotid brachytherapy.

  9. Active alignment/contact verification system

    DOEpatents

    Greenbaum, William M.

    2000-01-01

    A system involving an active (i.e. electrical) technique for the verification of: 1) close-tolerance mechanical alignment between two components, and 2) electrical contact between mating parts through an elastomeric interface. For example, the two components may be an alumina carrier and a printed circuit board - two mating parts that are extremely small, high-density parts requiring alignment within a fraction of a mil, as well as a specified interface point of engagement between the parts. The system comprises pairs of conductive structures defined in the surface layers of the alumina carrier and the printed circuit board, for example. The first pair of conductive structures relates to item 1) above and permits alignment verification between mating parts. The second pair of conductive structures relates to item 2) above and permits verification of electrical contact between mating parts.

  10. Students' Verification Strategies for Combinatorial Problems

    ERIC Educational Resources Information Center

    Mashiach Eizenberg, Michal; Zaslavsky, Orit

    2004-01-01

    We focus on a major difficulty in solving combinatorial problems, namely, on the verification of a solution. Our study aimed at identifying undergraduate students' tendencies to verify their solutions, and the verification strategies that they employ when solving these problems. In addition, an attempt was made to evaluate the level of efficiency…

  11. Turbulence Modeling Verification and Validation

    NASA Technical Reports Server (NTRS)

    Rumsey, Christopher L.

    2014-01-01

    Computational fluid dynamics (CFD) software that solves the Reynolds-averaged Navier-Stokes (RANS) equations has been in routine use for more than a quarter of a century. It is currently employed not only for basic research in fluid dynamics, but also for the analysis and design processes in many industries worldwide, including aerospace, automotive, power generation, chemical manufacturing, polymer processing, and petroleum exploration. A key feature of RANS CFD is the turbulence model. Because the RANS equations are unclosed, a model is necessary to describe the effects of the turbulence on the mean flow, through the Reynolds stress terms. The turbulence model is one of the largest sources of uncertainty in RANS CFD, and most models are known to be flawed in one way or another. Alternative methods such as direct numerical simulations (DNS) and large eddy simulations (LES) rely less on modeling and hence include more physics than RANS. In DNS all turbulent scales are resolved, and in LES the large scales are resolved and the effects of the smallest turbulence scales are modeled. However, both DNS and LES are too expensive for most routine industrial usage on today's computers. Hybrid RANS-LES, which blends RANS near walls with LES away from walls, helps to moderate the cost while still retaining some of the scale-resolving capability of LES, but for some applications it can still be too expensive. Even considering its associated uncertainties, RANS turbulence modeling has proved to be very useful for a wide variety of applications. For example, in the aerospace field, many RANS models are considered to be reliable for computing attached flows. However, existing turbulence models are known to be inaccurate for many flows involving separation. Research has been ongoing for decades in an attempt to improve turbulence models for separated and other nonequilibrium flows. When developing or improving turbulence models, both verification and validation are important

  12. Verification and Validation Studies for the LAVA CFD Solver

    NASA Technical Reports Server (NTRS)

    Moini-Yekta, Shayan; Barad, Michael F; Sozer, Emre; Brehm, Christoph; Housman, Jeffrey A.; Kiris, Cetin C.

    2013-01-01

    The verification and validation of the Launch Ascent and Vehicle Aerodynamics (LAVA) computational fluid dynamics (CFD) solver is presented. A modern strategy for verification and validation is described incorporating verification tests, validation benchmarks, continuous integration and version control methods for automated testing in a collaborative development environment. The purpose of the approach is to integrate the verification and validation process into the development of the solver and improve productivity. This paper uses the Method of Manufactured Solutions (MMS) for the verification of 2D Euler equations, 3D Navier-Stokes equations as well as turbulence models. A method for systematic refinement of unstructured grids is also presented. Verification using inviscid vortex propagation and flow over a flat plate is highlighted. Simulation results using laminar and turbulent flow past a NACA 0012 airfoil and ONERA M6 wing are validated against experimental and numerical data.
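
    In the Method of Manufactured Solutions, an exact solution is postulated, substituted into the governing equations to derive a source term, and the solver is run on successively refined grids; verification then checks that the observed order of accuracy matches the design order. A minimal sketch of that last step is below; it is a generic MMS utility, not LAVA's code.

```python
import numpy as np

def observed_order(errors, r=2.0):
    """Observed order of accuracy p from discretization-error norms on
    successively refined grids with refinement ratio r:
    p = log(e_coarse / e_fine) / log(r)."""
    e = np.asarray(errors, dtype=float)
    return np.log(e[:-1] / e[1:]) / np.log(r)
```

    A second-order scheme should produce errors that drop by a factor of four per grid halving, giving observed orders close to 2.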

  13. Enhanced Verification Test Suite for Physics Simulation Codes

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kamm, J R; Brock, J S; Brandon, S T

    2008-10-10

    This document discusses problems with which to augment, in quantity and in quality, the existing tri-laboratory suite of verification problems used by Los Alamos National Laboratory (LANL), Lawrence Livermore National Laboratory (LLNL), and Sandia National Laboratories (SNL). The purpose of verification analysis is to demonstrate whether the numerical results of the discretization algorithms in physics and engineering simulation codes provide correct solutions of the corresponding continuum equations. The key points of this document are: (1) Verification deals with the mathematical correctness of the numerical algorithms in a code, while validation deals with the physical correctness of a simulation in a regime of interest. This document is about verification. (2) The current seven-problem Tri-Laboratory Verification Test Suite, which has been used for approximately five years at the DOE WP laboratories, is limited. (3) Both the methodology for and the technology used in verification analysis have evolved and been improved since the original test suite was proposed. (4) The proposed test problems are in three basic areas: (a) hydrodynamics; (b) transport processes; and (c) dynamic strength of materials. (5) For several of the proposed problems we provide a 'strong sense verification benchmark', consisting of (i) a clear mathematical statement of the problem with sufficient information to run a computer simulation, (ii) an explanation of how the code result and benchmark solution are to be evaluated, and (iii) a description of the acceptance criterion for simulation code results. (6) It is proposed that the set of verification test problems with which any particular code is evaluated include some of the problems described in this document. Analysis of the proposed verification test problems constitutes part of a necessary - but not sufficient - step that builds confidence in physics and engineering simulation codes. More complicated test cases, including physics models of

  14. Monte Carlo simulations to replace film dosimetry in IMRT verification.

    PubMed

    Goetzfried, Thomas; Rickhey, Mark; Treutwein, Marius; Koelbl, Oliver; Bogner, Ludwig

    2011-01-01

    Patient-specific verification of intensity-modulated radiation therapy (IMRT) plans can be done by dosimetric measurements or by independent dose or monitor unit calculations. The aim of this study was the clinical evaluation of IMRT verification based on a fast Monte Carlo (MC) program, with regard to possible benefits compared to commonly used film dosimetry. 25 head-and-neck IMRT plans were recalculated by a pencil-beam based treatment planning system (TPS) using an appropriate quality assurance (QA) phantom. All plans were verified by both film and diode dosimetry and compared to MC simulations. The irradiated films, the results of diode measurements, and the computed dose distributions were evaluated, and the data were compared on the basis of gamma maps and dose-difference histograms. Average deviations in the high-dose region between diode measurements and point dose calculations performed with the TPS and MC program were 0.7 ± 2.7% and 1.2 ± 3.1%, respectively. For film measurements, the mean gamma values with 3% dose difference and 3 mm distance-to-agreement were 0.74 ± 0.28 (TPS as reference), with dose deviations up to 10%. Corresponding values were significantly reduced to 0.34 ± 0.09 for the MC dose calculation. The total time needed for both verification procedures is comparable; however, it is far less labor intensive in the case of MC simulations. The presented study showed that independent dose calculation verification of IMRT plans with a fast MC program has the potential to increasingly supplant film dosimetry in the near future. Thus, the linac-specific QA part will necessarily become more important. In combination with MC simulations, and due to the simple set-up, point-dose measurements for dosimetric plausibility checks are recommended, at least in the IMRT introduction phase. Copyright © 2010. Published by Elsevier GmbH.
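
    The gamma maps used above combine a dose-difference criterion with a distance-to-agreement criterion. A minimal discrete 1-D sketch of the global gamma index (Low's metric) is shown below; the 3%/3 mm defaults mirror the study's criteria, but the implementation itself is illustrative.

```python
import numpy as np

def gamma_index_1d(x, ref, meas, dose_tol=0.03, dist_tol_mm=3.0):
    """Global 1-D gamma index: for each reference point, minimize the
    combined dose-difference / distance-to-agreement metric over all
    measured points. gamma <= 1 counts as a pass."""
    dmax = ref.max()
    gam = np.empty_like(ref, dtype=float)
    for i in range(len(x)):
        dd = (meas - ref[i]) / (dose_tol * dmax)  # dose axis, tolerance units
        dx = (x - x[i]) / dist_tol_mm             # spatial axis, tolerance units
        gam[i] = np.sqrt(dd ** 2 + dx ** 2).min()
    return gam
```

    The passing rate is then simply the fraction of points with gamma at or below 1; the 2-D gamma maps in the study extend the same minimization over a plane.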

  15. General Environmental Verification Specification

    NASA Technical Reports Server (NTRS)

    Milne, J. Scott, Jr.; Kaufman, Daniel S.

    2003-01-01

    The NASA Goddard Space Flight Center's General Environmental Verification Specification (GEVS) for STS and ELV Payloads, Subsystems, and Components is currently being revised based on lessons learned from GSFC engineering and flight assurance. The GEVS has been used by Goddard flight projects for the past 17 years as a baseline from which to tailor their environmental test programs. A summary of the requirements and updates is presented, along with the rationale behind the changes. The major test areas covered by the GEVS include mechanical, thermal, and EMC, as well as more general requirements for planning and tracking of the verification programs.

  16. 22 CFR 123.14 - Import certificate/delivery verification procedure.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... REGULATIONS LICENSES FOR THE EXPORT OF DEFENSE ARTICLES § 123.14 Import certificate/delivery verification procedure. (a) The Import Certificate/Delivery Verification Procedure is designed to assure that a commodity... 22 Foreign Relations 1 2010-04-01 2010-04-01 false Import certificate/delivery verification...

  17. 22 CFR 123.14 - Import certificate/delivery verification procedure.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... REGULATIONS LICENSES FOR THE EXPORT OF DEFENSE ARTICLES § 123.14 Import certificate/delivery verification procedure. (a) The Import Certificate/Delivery Verification Procedure is designed to assure that a commodity... 22 Foreign Relations 1 2011-04-01 2011-04-01 false Import certificate/delivery verification...

  18. Self-verification motives at the collective level of self-definition.

    PubMed

    Chen, Serena; Chen, Karen Y; Shaw, Lindsay

    2004-01-01

    Three studies examined self-verification motives in relation to collective aspects of the self. Several moderators of collective self-verification were also examined--namely, the certainty with which collective self-views are held, the nature of one's ties to a source of self-verification, the salience of the collective self, and the importance of group identification. Evidence for collective self-verification emerged across all studies, particularly when collective self-views were held with high certainty (Studies 1 and 2), perceivers were somehow tied to the source of self-verification (Study 1), the collective self was salient (Study 2), and group identification was important (Study 3). To the authors' knowledge, these studies are the first to examine self-verification at the collective level of self-definition. The parallel and distinct ways in which self-verification processes may operate at different levels of self-definition are discussed.

  19. About Region 3's Laboratory and Field Services at EPA's Environmental Science Center

    EPA Pesticide Factsheets

    Mission & contact information for EPA Region 3's Laboratory and Field Services located at EPA's Environmental Science Center: the Office of Analytical Services and Quality Assurance & Field Inspection Program

  20. Critical Surface Cleaning and Verification Alternatives

    NASA Technical Reports Server (NTRS)

    Melton, Donald M.; McCool, A. (Technical Monitor)

    2000-01-01

    As a result of federal and state requirements, historical critical cleaning and verification solvents such as Freon 113, Freon TMC, and Trichloroethylene (TCE) are either highly regulated or no longer available. Interim replacements such as HCFC 225 have been qualified; however, toxicity and future phase-out regulations necessitate long-term solutions. The scope of this project was to qualify a safe and environmentally compliant LOX surface verification alternative to Freon 113, TCE, and HCFC 225. The main effort was focused on initiating the evaluation and qualification of HCFC 225G as an alternate LOX verification solvent. The project was scoped in FY 99/00 to perform LOX compatibility, cleaning efficiency, and qualification testing on flight hardware.

  1. Optimal sensitometric curves of Kodak EDR2 film for dynamic intensity modulated radiation therapy verification.

    PubMed

    Suriyapee, S; Pitaxtarnin, N; Oonsiri, S; Jumpangern, C; Israngkul Na Ayuthaya, I

    2008-01-01

    To investigate the optimal sensitometric curves of extended dose range (EDR2) radiographic film in terms of depth, field size, dose range and processing conditions for dynamic intensity modulated radiation therapy (IMRT) dosimetry verification with 6 MV X-ray beams. A Varian Clinac 23 EX linear accelerator with 6 MV X-ray beam was used to study the response of Kodak EDR2 film. Measurements were performed at depths of 5, 10 and 15 cm in a MedTec virtual water phantom and with field sizes of 2x2, 3x3, 10x10 and 15x15 cm². Doses ranging from 20 to 450 cGy were used. The film was developed with the Kodak RP X-OMAT Model M6B automatic film processor. Film response was measured with the Vidar model VXR-16 scanner. Sensitometric curves were applied to the dose profiles measured with film at 5 cm in the virtual water phantom with field sizes of 2x2 and 10x10 cm² and compared with ion chamber data. Scanditronix/Wellhofer OmniPro™ IMRT software was used for the evaluation of the IMRT plan calculated by Eclipse treatment planning. Investigation of the reproducibility and accuracy of the film responses, which depend mainly on the film processor, was carried out by irradiating one film nine times with doses of 20 to 450 cGy. A maximum standard deviation of 4.9% was found, which decreased to 1.9% for doses between 20 and 200 cGy. The sensitometric curves for various field sizes at fixed depth showed a maximum difference of 4.2% between 2x2 and 15x15 cm² at 5 cm depth with a dose of 450 cGy. The shallow depth tended to show a greater effect of field size responses than the deeper depths. The sensitometric curves for various depths at fixed field size showed slightly different film responses; the difference due to depth was within 1.8% for all field sizes studied. Both field size and depth effects were reduced when the doses were lower than 450 cGy. The difference was within 2.5% in the dose range from 20 to 300 cGy for all field sizes and depths studied. Dose profiles
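    As an illustration of how a sensitometric (OD-to-dose) calibration curve is typically applied, the sketch below converts a net optical density reading to dose by piecewise-linear interpolation. The calibration pairs are invented placeholders, not values from this study; as the abstract emphasizes, real curves must be measured for each film batch, depth and field size.

```python
from bisect import bisect_left

# Hypothetical calibration pairs (net optical density -> dose in cGy).
# These are illustrative only; real values are measured per film batch.
CAL = [(0.00, 0.0), (0.45, 50.0), (0.90, 120.0), (1.40, 250.0), (1.80, 450.0)]

def od_to_dose(od):
    """Convert net optical density to dose by piecewise-linear
    interpolation of the sensitometric curve, clamping at the ends."""
    ods = [p[0] for p in CAL]
    if od <= ods[0]:
        return CAL[0][1]
    if od >= ods[-1]:
        return CAL[-1][1]
    i = bisect_left(ods, od)
    (od0, d0), (od1, d1) = CAL[i - 1], CAL[i]
    return d0 + (d1 - d0) * (od - od0) / (od1 - od0)
```

    In practice a smooth fit (e.g. a polynomial) is often used instead of linear segments, but the lookup structure is the same.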

  2. Policing the boundaries of sex: a critical examination of gender verification and the Caster Semenya controversy.

    PubMed

    Cooky, Cheryl; Dworkin, Shari L

    2013-01-01

    On August 19, 2009, Caster Semenya, a South African track star, won a gold medal in the women's 800-meter event. According to media reports, on the same day, the International Association of Athletics Federations (IAAF) ordered Semenya to undergo gender verification testing. This article critically assesses the main concepts and claims that undergird international sport organizations' policies regarding "gender verification" or "sex testing." We examine the ways in which these policies operate through several highly contested assumptions, including that (a) sex exists as a binary; (b) sport is a level playing field for competitors; and (c) some intersex athletes have an unfair advantage over women who are not intersex and, as such, they should be banned from competition to ensure that sport is a level playing field. To conclude, we make three recommendations that are consistent with the attainment of sex and gender justice in sport, which include acknowledging that myriad physical advantages are accepted in sport, recognizing that sport as a level playing field is a myth, and eliminating sex testing in sport.

  3. Commissioning and Science Verification of JAST/T80

    NASA Astrophysics Data System (ADS)

    Ederoclte, A.; Cenarro, A. J.; Marín-Franch, A.; Cristóbal-Hornillos, D.; Vázquez Ramió, H.; Varela, J.; Hurier, G.; Moles, M.; Lamadrid, J. L.; Díaz-Martín, M. C.; Iglesias Marzoa, R.; Tilve, V.; Rodríguez, S.; Maícas, N.; Abri, J.

    2017-03-01

    Located at the Observatorio Astrofísico de Javalambre, the "Javalambre Auxiliary Survey Telescope" is an 80 cm telescope with an unvignetted 2 square degree field of view. The telescope is equipped with T80Cam, a camera with a large-format CCD and two filter wheels which can host 12 filters at any given time. The telescope has been designed to provide optical quality all across the field of view, which is achieved with a field corrector. In this talk, I will review the commissioning of the telescope. The optical performance in the centre of the field of view has been tested with the lucky imaging technique, providing a telescope PSF of 0.4'', which is close to the one expected from theory. Moreover, the tracking of the telescope does not affect the image quality: stars appear round even in 10-minute exposures obtained without guiding. Most importantly, we present the preliminary results of science verification observations which combine the two main characteristics of this telescope: the large field of view and the special filter set.

  4. Characteristics of ionospheric convection and field-aligned current in the dayside cusp region

    NASA Technical Reports Server (NTRS)

    Lu, G.; Lyons, L. R.; Reiff, P. H.; Denig, W. F.; Beaujardiere, O. De LA; Kroehl, H. W.; Newell, P. T.; Rich, F. J.; Opgenoorth, H.; Persson, M. A. L.

    1995-01-01

    The assimilative mapping of ionospheric electrodynamics (AMIE) technique has been used to estimate global distributions of high-latitude ionospheric convection and field-aligned current by combining data obtained nearly simultaneously both from ground and from space. Therefore, unlike the statistical patterns, the 'snapshot' distributions derived by AMIE allow us to examine in more detail the distinctions between field-aligned current systems associated with separate magnetospheric processes, especially in the dayside cusp region. By comparing the field-aligned current and ionospheric convection patterns with the corresponding spectrograms of precipitating particles, the following signatures have been identified: (1) For the three cases studied, which all had an IMF with negative y and z components, the cusp precipitation was encountered by the DMSP satellites in the postnoon sector in the northern hemisphere and in the prenoon sector in the southern hemisphere. The equatorward part of the cusp in both hemispheres is in the sunward flow region and marks the beginning of the flow rotation from sunward to antisunward. (2) The pair of field-aligned currents near local noon, i.e., the cusp/mantle currents, are coincident with the cusp or mantle particle precipitation. In distinction, the field-aligned currents on the dawnside and duskside, i.e., the normal region 1 currents, are usually associated with the plasma sheet particle precipitation. Thus the cusp/mantle currents are generated on open field lines and the region 1 currents mainly on closed field lines. (3) Topologically, the cusp/mantle currents appear as an expansion of the region 1 currents from the dawnside and duskside and they overlap near local noon. When By is negative, in the northern hemisphere the downward field-aligned current is located poleward of the upward current, whereas in the southern hemisphere the upward current is located poleward of the downward current.
(4) Under the assumption of

  5. In-Field Performance Testing of the Fork Detector for Quantitative Spent Fuel Verification

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gauld, Ian C.; Hu, Jianwei; De Baere, P.

    Expanding spent fuel dry storage activities worldwide are increasing demands on safeguards authorities that perform inspections. The European Atomic Energy Community (EURATOM) and the International Atomic Energy Agency (IAEA) require measurements to verify declarations when spent fuel is transferred to difficult-to-access locations, such as dry storage casks and the repositories planned in Finland and Sweden. EURATOM makes routine use of the Fork detector to obtain gross gamma and total neutron measurements during spent fuel inspections. Data analysis is performed by modules in the integrated Review and Analysis Program (iRAP) software, developed jointly by EURATOM and the IAEA. Under the framework of the US Department of Energy–EURATOM cooperation agreement, a module for automated Fork detector data analysis has been developed by Oak Ridge National Laboratory (ORNL) using the ORIGEN code from the SCALE code system and implemented in iRAP. EURATOM and ORNL recently performed measurements on 30 spent fuel assemblies at the Swedish Central Interim Storage Facility for Spent Nuclear Fuel (Clab), operated by the Swedish Nuclear Fuel and Waste Management Company (SKB). The measured assemblies represent a broad range of fuel characteristics. Neutron count rates for 15 measured pressurized water reactor assemblies are predicted with an average relative standard deviation of 4.6%, and gamma signals are predicted on average within 2.6% of the measurement. The 15 measured boiling water reactor assemblies exhibit slightly larger deviations of 5.2% for the gamma signals and 5.7% for the neutron count rates, compared to measurements. These findings suggest that with improved analysis of the measurement data, existing instruments can provide increased verification of operator declarations of the spent fuel and thereby also provide greater ability to confirm integrity of an assembly. These results support the application of the Fork detector as a fully quantitative

  6. 24 CFR 5.512 - Verification of eligible immigration status.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... immigration status. 5.512 Section 5.512 Housing and Urban Development Office of the Secretary, Department of... Noncitizens § 5.512 Verification of eligible immigration status. (a) General. Except as described in paragraph...) Primary verification—(1) Automated verification system. Primary verification of the immigration status of...

  7. Magnetospheric Multiscale Observations of the Electron Diffusion Region of Large Guide Field Magnetic Reconnection

    NASA Technical Reports Server (NTRS)

    Eriksson, S.; Wilder, F. D.; Ergun, R. E.; Schwartz, S. J.; Cassak, P. A.; Burch, J. L.; Chen, Li-Jen; Torbert, R. B.; Phan, T. D.; Lavraud, B.; hide

    2016-01-01

    We report observations from the Magnetospheric Multiscale (MMS) satellites of a large guide field magnetic reconnection event. The observations suggest that two of the four MMS spacecraft sampled the electron diffusion region, whereas the other two spacecraft detected the exhaust jet from the event. The guide magnetic field amplitude is approximately 4 times that of the reconnecting field. The event is accompanied by a significant parallel electric field (E∥) that is larger than predicted by simulations. The high-speed (approximately 300 km/s) crossing of the electron diffusion region limited the data set to one complete electron distribution inside of the electron diffusion region, which shows significant parallel heating. The data suggest that E∥ is balanced by a combination of electron inertia and a parallel gradient of the gyrotropic electron pressure.

  8. Observations of photospheric magnetic fields and shear flows in flaring active regions

    NASA Astrophysics Data System (ADS)

    Tarbell, T.; Ferguson, S.; Frank, Z.; Title, A.; Topka, K.

    1988-11-01

    Horizontal flows in the photosphere and subsurface convection zone move the footpoints of coronal magnetic field lines. Magnetic energy to power flares can be stored in the corona if the flows drive the fields far from the potential configuration. Videodisk movies were shown with 0.5 to 1 arcsecond resolution of the following simultaneous observations: green continuum, longitudinal magnetogram, Fe I 5576 A line center (mid-photosphere), H alpha wings, and H alpha line center. The movies show a 90 x 90 arcsecond field of view of an active region at S29, W11. When viewed at speeds of a few thousand times real-time, the photospheric movies clearly show the active region fields being distorted by a remarkable combination of systematic flows and small eruptions of new flux. Magnetic bipoles are emerging over a large area, and the polarities are systematically flowing apart. The horizontal flows were mapped in detail from the continuum movies, and these may be used to predict the future evolution of the region. The horizontal flows are not discernable in H alpha. The H alpha movies strongly suggest reconnection processes in the fibrils joining opposite polarities. When viewed in combination with the magnetic movies, the cause for this evolution is apparent: opposite polarity fields collide and partially cancel, and the fibrils reconnect above the surface. This type of reconnection, driven by subphotospheric flows, complicates the chromospheric and coronal fields, causing visible braiding and twisting of the fibrils. Some of the transient emission events in the fibrils and adjacent plage may also be related.

  9. Structural verification for GAS experiments

    NASA Technical Reports Server (NTRS)

    Peden, Mark Daniel

    1992-01-01

    The purpose of this paper is to assist the Get Away Special (GAS) experimenter in conducting a thorough structural verification of its experiment structural configuration, thus expediting the structural review/approval process and the safety process in general. Material selection for structural subsystems will be covered with an emphasis on fasteners (GSFC fastener integrity requirements) and primary support structures (Stress Corrosion Cracking requirements and National Space Transportation System (NSTS) requirements). Different approaches to structural verifications (tests and analyses) will be outlined especially those stemming from lessons learned on load and fundamental frequency verification. In addition, fracture control will be covered for those payloads that utilize a door assembly or modify the containment provided by the standard GAS Experiment Mounting Plate (EMP). Structural hazard assessment and the preparation of structural hazard reports will be reviewed to form a summation of structural safety issues for inclusion in the safety data package.

  10. CASL Verification and Validation Plan

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mousseau, Vincent Andrew; Dinh, Nam

    2016-06-30

    This report documents the Consortium for Advanced Simulation of LWRs (CASL) verification and validation plan. The document builds upon input from CASL subject matter experts, most notably the CASL Challenge Problem Product Integrators, CASL Focus Area leaders, and CASL code development and assessment teams. This document will be a living document that will track progress on CASL to do verification and validation for both the CASL codes (including MPACT, CTF, BISON, MAMBA) and for the CASL challenge problems (CIPS, PCI, DNB). The CASL codes and the CASL challenge problems are at differing levels of maturity with respect to validation and verification. The gap analysis will summarize additional work that needs to be done. Additional VVUQ work will be done as resources permit. This report is prepared for the Department of Energy's (DOE's) CASL program in support of milestone CASL.P13.02.

  11. The monitoring and verification of nuclear weapons

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Garwin, Richard L., E-mail: RLG2@us.ibm.com

    2014-05-09

    This paper partially reviews and updates the potential for monitoring and verification of nuclear weapons, including verification of their destruction. Cooperative monitoring with templates of the gamma-ray spectrum is an important tool, dependent on the use of information barriers.

  12. ENVIRONMENTAL TECHNOLOGY VERIFICATION: ADD-ON NOX CONTROLS

    EPA Science Inventory

    The paper discusses the environmental technology verification (ETV) of add-on nitrogen oxide (NOx) controls. Research Triangle Institute (RTI) is EPA's cooperating partner for the Air Pollution Control Technology (APCT) Program, one of a dozen ETV pilot programs. Verification of ...

  13. Verification and quality control of routine hematology analyzers.

    PubMed

    Vis, J Y; Huisman, A

    2016-05-01

    Verification of hematology analyzers (automated blood cell counters) is mandatory before new hematology analyzers may be used in routine clinical care. The verification process consists of several items, comprising among others precision, accuracy, comparability, carryover, background, and linearity throughout the expected range of results. Yet, which standard should be met or which verification limit should be used is at the discretion of the laboratory specialist. This paper offers practical guidance on verification and quality control of automated hematology analyzers and provides an expert opinion on the performance standard that should be met by the contemporary generation of hematology analyzers. Therefore (i) the state-of-the-art performance of hematology analyzers for complete blood count parameters is summarized, (ii) considerations, challenges, and pitfalls concerning the development of a verification plan are discussed, (iii) guidance is given regarding the establishment of reference intervals, and (iv) different methods on quality control of hematology analyzers are reviewed. © 2016 John Wiley & Sons Ltd.
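    As one concrete example of the verification items listed above, carryover is commonly estimated by analyzing three consecutive high samples followed by three low samples. The sketch below implements that widely used formula; the exact protocol (sample counts, acceptance limits) is an assumption for illustration and should be taken from the laboratory's own verification plan.

```python
def carryover_percent(high, low):
    """Estimate carryover from three consecutive high-sample results
    followed by three low-sample results (a common protocol):
        carryover (%) = (L1 - L3) / (H3 - L3) * 100
    where H3 is the third high result, L1 and L3 the first and third
    low results."""
    if len(high) != 3 or len(low) != 3:
        raise ValueError("expected three high and three low results")
    return (low[0] - low[2]) / (high[2] - low[2]) * 100.0
```

    For example, high results (9.8, 9.9, 10.0) followed by low results (1.2, 1.0, 1.0) give a carryover of about 2.2%, which would then be compared against the analyzer's acceptance limit.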

  14. Characterization of the IMF By-dependent field-aligned currents in the cleft region based on DE 2 observations

    NASA Technical Reports Server (NTRS)

    Taguchi, S.; Sugiura, M.; Winningham, J. D.; Slavin, J. A.

    1993-01-01

    The magnetic field and plasma data from 47 passes of DE-2 are used to study the IMF By-dependent distribution of field-aligned currents in the cleft region. It is proposed that the low-latitude cleft current (LCC) region is not an extension of the region 1 or region 2 current system and that a pair of LCCs and high-latitude cleft currents (HCCs) constitutes the cleft field-aligned current regime. The proposed pair of cleft field-aligned currents is explained with a qualitative model in which this pair of currents is generated on open field lines that have just been reconnected on the dayside magnetopause. The electric fields are transmitted along the field lines to the ionosphere, creating a poleward electric field and a pair of field-aligned currents when By is positive; the pair of field-aligned currents consists of a downward current at lower latitudes and an upward current at higher latitudes. In the By negative case, the model explains the reversal of the field-aligned current direction in the LCC and HCC regions.

  15. Multi-canister overpack project -- verification and validation, MCNP 4A

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Goldmann, L.H.

    This supporting document contains the software verification and validation (V and V) package used for Phase 2 design of the Spent Nuclear Fuel Multi-Canister Overpack. V and V packages for both ANSYS and MCNP are included. Description of Verification Run(s): This software requires that it be compiled specifically for the machine it is to be used on. Therefore, to facilitate ease in the verification process, the software automatically runs 25 sample problems to ensure proper installation and compilation. Once the runs are completed, the software checks for verification by performing a file comparison on the new output file and the old output file. Any differences between any of the files will cause a verification error. Due to the manner in which the verification is completed, a verification error does not necessarily indicate a problem. This indicates that a closer look at the output files is needed to determine the cause of the error.

  16. ENVIRONMENTAL TECHNOLOGY VERIFICATION FOR INDOOR AIR PRODUCTS

    EPA Science Inventory

    The paper discusses environmental technology verification (ETV) for indoor air products. RTI is developing the framework for a verification testing program for indoor air products, as part of EPA's ETV program. RTI is establishing test protocols for products that fit into three...

  17. IMPROVING AIR QUALITY THROUGH ENVIRONMENTAL TECHNOLOGY VERIFICATIONS

    EPA Science Inventory

    The U.S. Environmental Protection Agency (EPA) began the Environmental Technology Verification (ETV) Program in 1995 as a means of working with the private sector to establish a market-based verification process available to all environmental technologies. Under EPA's Office of R...

  18. Simulation environment based on the Universal Verification Methodology

    NASA Astrophysics Data System (ADS)

    Fiergolski, A.

    2017-01-01

    Universal Verification Methodology (UVM) is a standardized approach to verifying integrated circuit designs, targeting Coverage-Driven Verification (CDV). It combines automatic test generation, self-checking testbenches, and coverage metrics to indicate progress in the design verification. The flow of CDV differs from the traditional directed-testing approach. With CDV, a testbench developer starts with a structured plan that sets the verification goals. Those goals are then targeted by the developed testbench, which generates legal stimuli and sends them to a device under test (DUT). Progress is measured by coverage monitors added to the simulation environment; in this way, the non-exercised functionality can be identified. Moreover, additional scoreboards indicate undesired DUT behaviour. Such verification environments were developed for three recent ASIC and FPGA projects which have successfully implemented the new work-flow: (1) the CLICpix2 65 nm CMOS hybrid pixel readout ASIC design; (2) the C3PD 180 nm HV-CMOS active sensor ASIC design; (3) the FPGA-based DAQ system of the CLICpix chip. This paper, based on the experience from the above projects, introduces UVM briefly and presents a set of tips and advice applicable at different stages of the verification process cycle.

  19. Glove-based approach to online signature verification.

    PubMed

    Kamel, Nidal S; Sayeed, Shohel; Ellis, Grant A

    2008-06-01

    Utilizing the multiple degrees of freedom offered by the data glove for each finger and the hand, a novel online signature verification system using the Singular Value Decomposition (SVD) numerical tool for signature classification and verification is presented. The proposed technique uses the SVD to find the r singular vectors that capture the maximal energy of the glove data matrix A, called the principal subspace, so that the effective dimensionality of A can be reduced. Having modeled the data glove signature through its r-principal subspace, signature authentication is performed by finding the angles between the different subspaces. A demonstration of the data glove is presented as an effective high-bandwidth data entry device for signature verification. This SVD-based signature verification technique is tested and shown to recognize forgery signatures with a false acceptance rate of less than 1.2%.
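    The subspace-angle comparison can be illustrated in its simplest form, r = 1: each signature is reduced to a single direction (in the full method, the leading singular vector of its glove data matrix) and two signatures are compared by the angle between the lines they span. This is a hedged sketch, not the paper's implementation; the acceptance threshold is invented for illustration.

```python
import math

def principal_angle_1d(u, v):
    """Angle (radians) between the 1-D subspaces spanned by u and v.
    This is the r = 1 special case of comparing principal subspaces;
    the absolute value makes the result sign-invariant, as subspace
    angles are."""
    dot = sum(a * b for a, b in zip(u, v))
    nu = math.sqrt(sum(a * a for a in u))
    nv = math.sqrt(sum(b * b for b in v))
    c = min(1.0, abs(dot) / (nu * nv))
    return math.acos(c)

def verify(u, v, threshold=0.2):
    """Accept if the subspace angle is below a threshold.
    The 0.2 rad value is illustrative, not from the paper."""
    return principal_angle_1d(u, v) < threshold
```

    A full implementation would compute the r leading singular vectors of each glove data matrix and the vector of principal angles between the two r-dimensional subspaces.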

  20. Verification of S&D Solutions for Network Communications and Devices

    NASA Astrophysics Data System (ADS)

    Rudolph, Carsten; Compagna, Luca; Carbone, Roberto; Muñoz, Antonio; Repp, Jürgen

    This chapter describes the tool-supported verification of S&D Solutions on the level of network communications and devices. First, the general goals and challenges of verification in the context of AmI systems are highlighted, and the role of verification and validation within the SERENITY processes is explained. Then, SERENITY extensions to the SH Verification tool are explained using small examples. Finally, the applicability of existing verification tools is discussed in the context of the AVISPA toolset. The two different tools show that, for the security analysis of network and device S&D Patterns, relevant complementary approaches exist and can be used.

  1. Considerations in STS payload environmental verification

    NASA Technical Reports Server (NTRS)

    Keegan, W. B.

    1978-01-01

    Considerations regarding the Space Transportation System (STS) payload environmental verification are reviewed. It is noted that emphasis is placed on testing at the subassembly level and that the basic objective of structural dynamic payload verification is to ensure reliability in a cost-effective manner. Structural analyses consist of: (1) stress analysis for critical loading conditions, (2) modal analysis for launch and orbital configurations, (3) flight loads analysis, (4) test simulation analysis to verify models, (5) kinematic analysis of deployment/retraction sequences, and (6) structural-thermal-optical program analysis. In addition to these approaches, payload verification programs are being developed in the thermal-vacuum area. These include the exposure to extreme temperatures, temperature cycling, thermal-balance testing and thermal-vacuum testing.

  2. Formal Multilevel Hierarchical Verification of Synchronous MOS VLSI Circuits.

    DTIC Science & Technology

    1987-06-01

    ...depend on whether it is performing flat verification or hierarchical verification. The primary operations of Silica Pithecus when performing flat... The primary operation of Silica Pithecus when performing hierarchical verification is processing constraints to show they hold

  3. Patient-specific IMRT verification using independent fluence-based dose calculation software: experimental benchmarking and initial clinical experience.

    PubMed

    Georg, Dietmar; Stock, Markus; Kroupa, Bernhard; Olofsson, Jörgen; Nyholm, Tufve; Ahnesjö, Anders; Karlsson, Mikael

    2007-08-21

    Experimental methods are commonly used for patient-specific intensity-modulated radiotherapy (IMRT) verification. The purpose of this study was to investigate the accuracy and performance of independent dose calculation software (denoted 'MUV', monitor unit verification) for patient-specific quality assurance (QA). 52 patients receiving step-and-shoot IMRT were considered. IMRT plans were recalculated by the treatment planning systems (TPS) in a dedicated QA phantom, in which an experimental 1D and 2D verification (0.3 cm³ ionization chamber; films) was performed. Additionally, an independent dose calculation was performed. The fluence-based algorithm of MUV accounts for collimator transmission, rounded leaf ends, tongue-and-groove effect, backscatter to the monitor chamber and scatter from the flattening filter. The dose calculation utilizes a pencil beam model based on a beam quality index. DICOM RT files from patient plans, exported from the TPS, were directly used as patient-specific input data in MUV. For composite IMRT plans, average deviations in the high dose region between ionization chamber measurements and point dose calculations performed with the TPS and MUV were 1.6 +/- 1.2% and 0.5 +/- 1.1% (1 S.D.). The dose deviations between MUV and TPS slightly depended on the distance from the isocentre position. For individual intensity-modulated beams (total 367), an average deviation of 1.1 +/- 2.9% was determined between calculations performed with the TPS and with MUV, with maximum deviations up to 14%. However, absolute dose deviations were mostly less than 3 cGy. Based on the current results, we aim to apply a confidence limit of 3% (with respect to the prescribed dose) or 6 cGy for routine IMRT verification. For off-axis points at distances larger than 5 cm and for low dose regions, we consider 5% dose deviation or 10 cGy acceptable. The time needed for an independent calculation compares very favourably with the net time for an experimental
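    A minimal sketch of the kind of action-level check the quoted confidence limits imply (a deviation passes if within 3% of the prescribed dose or within 6 cGy); the function name and the OR-combination of the two criteria are assumptions for illustration, not the authors' implementation.

```python
def within_action_level(d_tps, d_indep, d_prescribed,
                        pct=3.0, abs_cgy=6.0):
    """Compare a TPS point dose with an independent calculation
    (all doses in cGy). The point passes if the absolute deviation
    is within pct % of the prescribed dose OR within abs_cgy.
    The OR-combination is an illustrative assumption."""
    dev = abs(d_tps - d_indep)
    return dev <= pct / 100.0 * d_prescribed or dev <= abs_cgy
```

    With a 200 cGy prescription, a 5 cGy point deviation passes (within both limits), while a 10 cGy deviation fails both the 3% and the 6 cGy criterion.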

  4. 40 CFR 1066.250 - Base inertia verification.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... 40 Protection of Environment 34 2013-07-01 2013-07-01 false Base inertia verification. 1066.250... CONTROLS VEHICLE-TESTING PROCEDURES Dynamometer Specifications § 1066.250 Base inertia verification. (a) Overview. This section describes how to verify the dynamometer's base inertia. (b) Scope and frequency...

  5. 40 CFR 1066.250 - Base inertia verification.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... 40 Protection of Environment 34 2012-07-01 2012-07-01 false Base inertia verification. 1066.250... CONTROLS VEHICLE-TESTING PROCEDURES Dynamometer Specifications § 1066.250 Base inertia verification. (a) Overview. This section describes how to verify the dynamometer's base inertia. (b) Scope and frequency...

  6. 40 CFR 1066.250 - Base inertia verification.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... 40 Protection of Environment 33 2014-07-01 2014-07-01 false Base inertia verification. 1066.250... CONTROLS VEHICLE-TESTING PROCEDURES Dynamometer Specifications § 1066.250 Base inertia verification. (a) Overview. This section describes how to verify the dynamometer's base inertia. (b) Scope and frequency...

  7. Verification of BOUT++ by the method of manufactured solutions

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dudson, B. D., E-mail: benjamin.dudson@york.ac.uk; Hill, P.; Madsen, J.

    2016-06-15

    BOUT++ is a software package designed for solving plasma fluid models. It has been used to simulate a wide range of plasma phenomena ranging from linear stability analysis to 3D plasma turbulence and is capable of simulating a wide range of drift-reduced plasma fluid and gyro-fluid models. A verification exercise has been performed as part of a EUROfusion Enabling Research project, to rigorously test the correctness of the algorithms implemented in BOUT++, by testing order-of-accuracy convergence rates using the Method of Manufactured Solutions (MMS). We present tests of individual components including time-integration and advection schemes, non-orthogonal toroidal field-aligned coordinate systems and the shifted metric procedure which is used to handle highly sheared grids. The flux coordinate independent approach to differencing along magnetic field-lines has been implemented in BOUT++ and is here verified using the MMS in a sheared slab configuration. Finally, we show tests of three complete models: 2-field Hasegawa-Wakatani in 2D slab, 3-field reduced magnetohydrodynamics (MHD) in 3D field-aligned toroidal coordinates, and 5-field reduced MHD in slab geometry.
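    The core MMS idea of measuring an observed order of accuracy can be shown on a toy problem: choose a manufactured solution u(x) = sin(x), apply a centered second-difference operator, and compare the maximum error at two grid resolutions. This sketch is not BOUT++ code; it only illustrates the convergence-rate test that the verification exercise performs on each component.

```python
import math

def max_error(n):
    """Max error of the centered second-difference approximation to
    u'' for the manufactured solution u(x) = sin(x) on [0, pi],
    using n uniform intervals (interior points only)."""
    h = math.pi / n
    err = 0.0
    for i in range(1, n):
        x = i * h
        d2 = (math.sin(x - h) - 2 * math.sin(x) + math.sin(x + h)) / h**2
        err = max(err, abs(d2 - (-math.sin(x))))  # exact u'' = -sin(x)
    return err

# Observed order of accuracy from two grid resolutions:
e1, e2 = max_error(32), max_error(64)
order = math.log(e1 / e2, 2)  # should approach 2 for a 2nd-order scheme
```

    If the observed order falls short of the scheme's formal order as the grid is refined, the implementation (not just the model) contains an error, which is exactly what MMS is designed to expose.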

  8. Phase unwrapping using region-based markov random field model.

    PubMed

    Dong, Ying; Ji, Jim

    2010-01-01

    Phase unwrapping is a classical problem in Magnetic Resonance Imaging (MRI), Interferometric Synthetic Aperture Radar and Sonar (InSAR/InSAS), fringe pattern analysis, and spectroscopy. Although many methods have been proposed to address this problem, robust and effective phase unwrapping remains a challenge. This paper presents a novel phase unwrapping method using a region-based Markov Random Field (MRF) model. Specifically, the phase image is segmented into regions within which the phase is not wrapped. Then, the phase image is unwrapped between different regions using an improved Highest Confidence First (HCF) algorithm to optimize the MRF model. The proposed method has desirable theoretical properties as well as an efficient implementation. Simulations and experimental results on MRI images show that the proposed method provides phase unwrapping similar or superior to that of the phase unwrapping max-flow/min-cut (PUMA) method and the ZpM method.
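    For contrast with the region-based MRF approach, the classical 1-D line-scan baseline simply removes 2π jumps between neighbouring samples; it is this kind of local method that fails in noisy or disconnected regions, motivating the global MRF formulation. A minimal sketch:

```python
import math

def unwrap_1d(phases):
    """Unwrap a 1-D sequence of wrapped phases (radians): whenever the
    jump between neighbours exceeds pi in magnitude, add the multiple
    of 2*pi that brings the jump back into (-pi, pi]."""
    out = [phases[0]]
    for p in phases[1:]:
        d = p - out[-1]
        d -= 2 * math.pi * round(d / (2 * math.pi))  # nearest-multiple correction
        out.append(out[-1] + d)
    return out
```

    Given phases wrapped into (-π, π], this recovers the original ramp up to an additive constant, as long as true neighbour-to-neighbour jumps stay below π; 2-D methods such as the paper's must additionally resolve inconsistencies between paths.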

  9. Palmprint Based Verification System Using SURF Features

    NASA Astrophysics Data System (ADS)

    Srinivas, Badrinath G.; Gupta, Phalguni

    This paper describes the design and development of a prototype robust biometric verification system. The system uses features of the human hand extracted with the Speeded Up Robust Features (SURF) operator. The hand image is acquired using a low-cost scanner. The extracted palmprint region is robust to hand translation and rotation on the scanner. The system is tested on the IITK database of 200 images and the PolyU database of 7751 images and is found to be robust with respect to translation and rotation. It has a FAR of 0.02%, an FRR of 0.01% and an accuracy of 99.98%, and can be a suitable system for civilian applications and high-security environments.

  10. Dosimetric verification for primary focal hypermetabolism of nasopharyngeal carcinoma patients treated with dynamic intensity-modulated radiation therapy.

    PubMed

    Xin, Yong; Wang, Jia-Yang; Li, Liang; Tang, Tian-You; Liu, Gui-Hong; Wang, Jian-She; Xu, Yu-Mei; Chen, Yong; Zhang, Long-Zhen

    2012-01-01

    To verify the feasibility of (18F)FDG PET/CT-guided dynamic intensity-modulated radiation therapy (IMRT) for nasopharyngeal carcinoma patients by dosimetric verification before treatment. Eleven patients with stage III~IVA nasopharyngeal carcinoma treated with functional image-guided IMRT were chosen; absolute and relative dosimetric verification were performed with a Varian 23EX linear accelerator, an ionization chamber, the 2DICA of I'mRT Matrixx, and an IBA detachable phantom. Outlines were drawn and treatment plans made using different imaging techniques (CT and (18F)FDG PET/CT). The dose distributions of the various regions were realized by SMART. The absolute mean error of the region of interest was 2.39%±0.66 using a 0.6 cc ionization chamber. Using the DTA method, the average relative dose agreement within our protocol (3%, 3 mm) was 87.64% at 300 MU/min in all fields. Dosimetric verification before IMRT is obligatory and necessary. The ionization chamber and the 2DICA of I'mRT Matrixx were effective dosimetric verification tools for primary focal hypermetabolism in functional image-guided dynamic IMRT for nasopharyngeal carcinoma. Our preliminary evidence indicates that functional image-guided dynamic IMRT is feasible.
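
The (3%, 3 mm) criterion combines a dose-difference tolerance with a distance-to-agreement (DTA) tolerance. A minimal 1D gamma-index sketch, illustrative only (clinical tools use 2D/3D implementations with interpolation; this version normalizes by the global dose maximum):

```python
import numpy as np

def gamma_1d(x_ref, d_ref, x_eval, d_eval, dd=0.03, dta=3.0):
    """1D gamma index: for each reference point, the minimum combined
    dose-difference / distance-to-agreement metric over evaluated points."""
    gammas = []
    for xr, dr in zip(x_ref, d_ref):
        dose_term = (d_eval - dr) / (dd * np.max(d_ref))  # global normalization
        dist_term = (x_eval - xr) / dta
        gammas.append(np.min(np.sqrt(dose_term**2 + dist_term**2)))
    return np.array(gammas)

x = np.arange(0.0, 50.0, 1.0)                       # position in mm
ref = 100.0 * np.exp(-((x - 25.0) / 10.0) ** 2)     # reference dose profile
meas = ref * 1.01                                   # measurement 1% hot everywhere
g = gamma_1d(x, ref, x, meas)
passing = 100.0 * np.mean(g <= 1.0)
print(passing)  # a uniform 1% offset passes the 3%/3 mm criterion everywhere
```

A point passes when gamma ≤ 1; the reported pass rate is the percentage of reference points that do.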

  11. 47 CFR 2.952 - Limitation on verification.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 47 Telecommunication 1 2010-10-01 2010-10-01 false Limitation on verification. 2.952 Section 2.952 Telecommunication FEDERAL COMMUNICATIONS COMMISSION GENERAL FREQUENCY ALLOCATIONS AND RADIO TREATY MATTERS; GENERAL... person shall, in any advertising matter, brochure, etc., use or make reference to a verification in a...

  12. Handbook: Design of automated redundancy verification

    NASA Technical Reports Server (NTRS)

    Ford, F. A.; Hasslinger, T. W.; Moreno, F. J.

    1971-01-01

    The use of the handbook is discussed and the design progress is reviewed. A description of the problem is presented, and examples are given to illustrate the necessity for redundancy verification, along with the types of situations to which it is typically applied. Reusable space vehicles, such as the space shuttle, are recognized as being significant in the development of the automated redundancy verification problem.

  13. Earth Science Enterprise Scientific Data Purchase Project: Verification and Validation

    NASA Technical Reports Server (NTRS)

    Jenner, Jeff; Policelli, Fritz; Fletcher, Rosea; Holecamp, Kara; Owen, Carolyn; Nicholson, Lamar; Dartez, Deanna

    2000-01-01

    This paper presents viewgraphs on the Earth Science Enterprise Scientific Data Purchase Project's verification and validation process. The topics include: 1) What is Verification and Validation? 2) Why Verification and Validation? 3) Background; 4) ESE Data Purchase Validation Process; 5) Data Validation System and Ingest Queue; 6) Shipment Verification; 7) Tracking and Metrics; 8) Validation of Contract Specifications; 9) Earth Watch Data Validation; 10) Validation of Vertical Accuracy; and 11) Results of Vertical Accuracy Assessment.

  14. The politics of verification and the control of nuclear tests, 1945-1980

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gallagher, N.W.

    1990-01-01

    This dissertation addresses two questions: (1) why has agreement been reached on verification regimes to support some arms control accords but not others; and (2) what determines the extent to which verification arrangements promote stable cooperation. This study develops an alternative framework for analysis by examining the politics of verification at two levels. The logical politics of verification are shaped by the structure of the problem of evaluating cooperation under semi-anarchical conditions. The practical politics of verification are driven by players' attempts to use verification arguments to promote their desired security outcome. The historical material shows that agreements on verification regimes are reached when key domestic and international players desire an arms control accord and believe that workable verification will not have intolerable costs. Clearer understanding of how verification is itself a political problem, and how players manipulate it to promote other goals, is necessary if the politics of verification are to support rather than undermine the development of stable cooperation.

  15. Flat field concave holographic grating with broad spectral region and moderately high resolution.

    PubMed

    Wu, Jian Fen; Chen, Yong Yan; Wang, Tai Sheng

    2012-02-01

    In order to deal with the conflicts between broad spectral region and high resolution in compact spectrometers based on a flat field concave holographic grating and line array CCD, we present a simple and practical method to design a flat field concave holographic grating that is capable of imaging a broad spectral region at a moderately high resolution. First, we discuss the principle of realizing a broad spectral region and moderately high resolution. Second, we provide the practical method to realize our ideas, in which Namioka grating theory, a genetic algorithm, and ZEMAX are used to reach this purpose. Finally, a near-normal-incidence example modeled in ZEMAX is shown to verify our ideas. The results show that our work probably has a general applicability in compact spectrometers with a broad spectral region and moderately high resolution.

  16. Four large-scale field-aligned current systems in the dayside high-latitude region

    NASA Technical Reports Server (NTRS)

    Ohtani, S.; Potemra, T. A.; Newell, P.T.; Zanetti, L. J.; Iijima, T.; Watanabe, M.; Blomberg, L. G.; Elphinstone, R. D.; Murphree, J. S.; Yamauchi, M.

    1995-01-01

    A system of four current sheets of large-scale field-aligned currents (FACs) was discovered in the data set of simultaneous Viking and Defense Meteorological Satellite Program-F7 (DMSP-F7) crossings of the dayside high-latitude region. This paper reports four examples of this system that were observed in the prenoon sector. The flow polarities of the FACs are upward, downward, upward, and downward, from equatorward to poleward. The lowest-latitude upward current flows mostly in the central plasma sheet (CPS) precipitation region, often overlapping with the boundary plasma sheet (BPS) at its poleward edge, and is interpreted as a region 2 current. The pair of downward and upward FACs in the middle of the structure are collocated with structured electron precipitation. The precipitation of high-energy (greater than 1 keV) electrons is more intense in the lower-latitude downward current sheet. The highest-latitude downward flowing current sheet is located in a weak, low-energy particle precipitation region, suggesting that this current is flowing on open field lines. Simultaneous observations in the postnoon local time sector reveal the standard three-sheet structure of FACs, sometimes described as region 2, region 1, and mantle (referred to as the midday region O) currents. A high correlation was found between the occurrence of the four-FAC-sheet structure and negative interplanetary magnetic field (IMF) B(sub Y). We discuss the FAC structure in terms of three types of convection cells: the merging, viscous, and lobe cells. During strongly negative IMF B(sub Y), two convection reversals exist in the prenoon sector; one is inside the viscous cell, and the other is between the viscous cell and the lobe cell. This structure of convection flow is supported by the Viking electric field and auroral UV image data. Based on the convection pattern, the four-FAC-sheet structure is interpreted as the latitude overlap of midday and morning FAC systems. We suggest that the for

  17. Formal specification and verification of Ada software

    NASA Technical Reports Server (NTRS)

    Hird, Geoffrey R.

    1991-01-01

    The use of formal methods in software development achieves levels of quality assurance unobtainable by other means. The Larch approach to specification is described, and the specification of avionics software designed to implement the logic of a flight control system is given as an example. Penelope, an Ada verification environment, is described. The Penelope user inputs mathematical definitions, Larch-style specifications, and Ada code, and performs machine-assisted proofs that the code obeys its specifications. As an example, the verification of a binary search function is considered. Emphasis is given to techniques assisting the reuse of a verification effort on modified code.
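
The flavor of checking code against a specification can be suggested, very loosely, with an executable precondition on a binary search (Penelope performs static, machine-assisted proofs over Larch specifications and Ada code, not runtime checks; this Python analogue is only illustrative):

```python
def binary_search(a, key):
    """Return an index i with a[i] == key, or -1 if key is absent.
    Precondition (the 'specification'): a is sorted ascending."""
    assert all(a[i] <= a[i + 1] for i in range(len(a) - 1)), "precondition"
    lo, hi = 0, len(a) - 1
    while lo <= hi:
        mid = (lo + hi) // 2
        if a[mid] == key:
            return mid
        if a[mid] < key:
            lo = mid + 1
        else:
            hi = mid - 1
    return -1

idx = binary_search([2, 3, 5, 7, 11], 7)
print(idx)  # 3
```

A formal verifier proves the postcondition holds for all inputs satisfying the precondition, whereas runtime assertions only check the inputs actually seen.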

  18. Current status of verification practices in clinical biochemistry in Spain.

    PubMed

    Gómez-Rioja, Rubén; Alvarez, Virtudes; Ventura, Montserrat; Alsina, M Jesús; Barba, Núria; Cortés, Mariano; Llopis, María Antonia; Martínez, Cecilia; Ibarz, Mercè

    2013-09-01

    Verification uses logical algorithms to detect potential errors before laboratory results are released to the clinician. Even though verification is one of the main processes in all laboratories, there is a lack of standardization, mainly in the algorithms used and in the criteria and verification limits applied. A survey of clinical laboratories in Spain was conducted in order to assess the verification process, particularly the use of autoverification. Questionnaires were sent to the laboratories involved in the External Quality Assurance Program organized by the Spanish Society of Clinical Biochemistry and Molecular Pathology. Seven common biochemical parameters were included (glucose, cholesterol, triglycerides, creatinine, potassium, calcium, and alanine aminotransferase). Completed questionnaires were received from 85 laboratories. Nearly all the laboratories reported using the following seven verification criteria: internal quality control, instrument warnings, sample deterioration, reference limits, clinical data, concordance between parameters, and verification of results. The use of all verification criteria varied according to the type of verification (automatic, technical, or medical). Verification limits for these parameters are similar to biological reference ranges. Delta check was used in 24% of laboratories. Most laboratories (64%) reported using autoverification systems. Autoverification use was related to laboratory size, ownership, and type of laboratory information system, but amount of use (percentage of tests autoverified) was not related to laboratory size. A total of 36% of Spanish laboratories do not use autoverification, despite the general implementation of laboratory information systems, most of which have autoverification capability. Criteria and rules for seven routine biochemical tests were obtained.
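
The rule types surveyed (verification limits, delta checks) can be sketched as follows. The analytes appear in the survey, but the numeric limits below are invented for illustration only:

```python
# Minimal autoverification sketch; limits are illustrative, not from the survey.
RANGE_LIMITS = {"glucose": (2.5, 25.0), "potassium": (2.8, 6.5)}  # mmol/L
DELTA_LIMITS = {"glucose": 7.0, "potassium": 1.0}  # max change vs. prior result

def autoverify(analyte, value, previous=None):
    """Return True if the result can be released without manual review."""
    lo, hi = RANGE_LIMITS[analyte]
    if not lo <= value <= hi:
        return False                     # outside verification limits
    if previous is not None and abs(value - previous) > DELTA_LIMITS[analyte]:
        return False                     # delta check failure
    return True

print(autoverify("potassium", 4.1, previous=4.0))  # True
print(autoverify("potassium", 5.9, previous=3.5))  # False: delta 2.4 > 1.0
```

Real systems layer further criteria on top (instrument flags, sample-integrity indices, cross-analyte concordance), which is exactly where the survey found the least standardization.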

  19. Fingerprint changes and verification failure among patients with hand dermatitis.

    PubMed

    Lee, Chew Kek; Chang, Choong Chor; Johar, Asmah; Puwira, Othman; Roshidah, Baba

    2013-03-01

    To determine the prevalence of fingerprint verification failure and to define and quantify the fingerprint changes associated with fingerprint verification failure. Case-control study. Referral public dermatology center. The study included 100 consecutive patients with clinical hand dermatitis involving the palmar distal phalanx of either thumb and 100 age-, sex-, and ethnicity-matched controls. Patients with an altered thumb print due to other causes and palmar hyperhidrosis were excluded. Fingerprint verification (pass/fail) and hand eczema severity index score. Twenty-seven percent of patients failed fingerprint verification compared with 2% of controls. Fingerprint verification failure was associated with a higher hand eczema severity index score (P < .001). The main fingerprint abnormalities were fingerprint dystrophy (42.0%) and abnormal white lines (79.5%). The number of abnormal white lines was significantly higher among the patients with hand dermatitis compared with controls (P = .001). Among the patients with hand dermatitis, the odds of failing fingerprint verification with fingerprint dystrophy was 4.01. The presence of broad lines and long lines was associated with a greater odds of fingerprint verification failure (odds ratio [OR], 8.04; 95% CI, 3.56-18.17 and OR, 2.37; 95% CI, 1.31-4.27, respectively), while the presence of thin lines was protective of verification failure (OR, 0.45; 95% CI, 0.23-0.89). Fingerprint verification failure is a significant problem among patients with more severe hand dermatitis. It is mainly due to fingerprint dystrophy and abnormal white lines. Malaysian National Medical Research Register Identifier: NMRR-11-30-8226
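
Odds ratios with 95% confidence intervals like those reported are computed from a 2×2 contingency table, commonly with the Woolf log-normal interval. A sketch with illustrative counts (not the study's raw data):

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """OR and Woolf (log-normal) 95% CI for a 2x2 table:
    a, b = outcome present/absent in exposed; c, d = in unexposed."""
    or_ = (a * d) / (b * c)
    se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)  # SE of log(OR)
    lo = math.exp(math.log(or_) - z * se)
    hi = math.exp(math.log(or_) + z * se)
    return or_, lo, hi

# Illustrative counts only: 20/30 verification failures among exposed,
# 10/60 among unexposed.
or_, lo, hi = odds_ratio_ci(20, 10, 10, 50)
print(round(or_, 2), round(lo, 2), round(hi, 2))
```

An interval excluding 1.0 (as for the broad-line and thin-line findings above) indicates a statistically significant association at the 5% level.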

  20. Identity Verification, Control, and Aggression in Marriage

    ERIC Educational Resources Information Center

    Stets, Jan E.; Burke, Peter J.

    2005-01-01

    In this research we study the identity verification process and its effects in marriage. Drawing on identity control theory, we hypothesize that a lack of verification in the spouse identity (1) threatens stable self-meanings and interaction patterns between spouses, and (2) challenges a (nonverified) spouse's perception of control over the…

  1. Integrated Verification Experiment data collected as part of the Los Alamos National Laboratory's Source Region Program

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Fitzgerald, T.J.; Carlos, R.C.; Argo, P.E.

    As part of the Integrated Verification Experiment (IVE), we deployed a network of HF ionospheric sounders to detect the effects of acoustic waves generated by surface ground motion following underground nuclear tests at the Nevada Test Site. The network sampled up to four geographic locations in the ionosphere from almost directly overhead of the surface ground zero out to a horizontal range of 60 km. We present sample results for four of the IVEs: Misty Echo, Texarkana, Mineral Quarry, and Bexar.

  2. Optimal sensitometric curves of Kodak EDR2 film for dynamic intensity modulated radiation therapy verification

    PubMed Central

    Suriyapee, S; Pitaxtarnin, N; Oonsiri, S; Jumpangern, C; Israngkul Na Ayuthaya, I

    2008-01-01

    Purpose: To investigate the optimal sensitometric curves of extended dose range (EDR2) radiographic film in terms of depth, field size, dose range and processing conditions for dynamic intensity modulated radiation therapy (IMRT) dosimetry verification with 6 MV X-ray beams. Materials and methods: A Varian Clinac 23 EX linear accelerator with 6 MV X-ray beam was used to study the response of Kodak EDR2 film. Measurements were performed at depths of 5, 10 and 15 cm in a MedTec virtual water phantom and with field sizes of 2x2, 3x3, 10x10 and 15x15 cm2. Doses ranging from 20 to 450 cGy were used. The film was developed with the Kodak RP X-OMAT Model M6B automatic film processor. Film response was measured with the Vidar model VXR-16 scanner. Sensitometric curves were applied to the dose profiles measured with film at 5 cm in the virtual water phantom with field sizes of 2x2 and 10x10 cm2 and compared with ion chamber data. Scanditronix/Wellhofer OmniPro™ IMRT software was used for the evaluation of the IMRT plan calculated by Eclipse treatment planning. Results: Investigation of the reproducibility and accuracy of the film responses, which depend mainly on the film processor, was carried out by irradiating one film nine times with doses of 20 to 450 cGy. A maximum standard deviation of 4.9% was found, which decreased to 1.9% for doses between 20 and 200 cGy. The sensitometric curves for various field sizes at fixed depth showed a maximum difference of 4.2% between 2x2 and 15x15 cm2 at 5 cm depth with a dose of 450 cGy. The shallow depth tended to show a greater effect of field size responses than the deeper depths. The sensitometric curves for various depths at fixed field size showed slightly different film responses; the difference due to depth was within 1.8% for all field sizes studied. Both field size and depth effects were reduced when the doses were lower than 450 cGy. The difference was within 2.5% in the dose range from 20 to 300 cGy for all field sizes and
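
A sensitometric (calibration) curve is used in practice by inverting measured optical density back to dose, for example by monotone interpolation between calibration points. The calibration data below are invented for illustration, not the paper's measurements:

```python
import numpy as np

# Illustrative calibration points: delivered dose (cGy) vs. net optical density.
dose_cal = np.array([20.0, 50.0, 100.0, 200.0, 300.0, 450.0])
od_cal = np.array([0.12, 0.28, 0.52, 0.95, 1.30, 1.72])

def od_to_dose(od):
    """Invert the sensitometric curve by monotone linear interpolation."""
    return np.interp(od, od_cal, dose_cal)

print(od_to_dose(0.52))  # lands on a calibration point: 100.0 cGy
print(od_to_dose(0.40))  # halfway between 0.28 and 0.52: 75.0 cGy
```

Because the film response varies with field size and depth, as the study quantifies, the calibration conditions should match the measurement conditions as closely as possible.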

  3. Dynamic testing for shuttle design verification

    NASA Technical Reports Server (NTRS)

    Green, C. E.; Leadbetter, S. A.; Rheinfurth, M. H.

    1972-01-01

    Space shuttle design verification requires dynamic data from full scale structural component and assembly tests. Wind tunnel and other scaled model tests are also required early in the development program to support the analytical models used in design verification. Presented is a design philosophy based on mathematical modeling of the structural system strongly supported by a comprehensive test program; some of the types of required tests are outlined.

  4. GHG MITIGATION TECHNOLOGY PERFORMANCE EVALUATIONS UNDERWAY AT THE GHG TECHNOLOGY VERIFICATION CENTER

    EPA Science Inventory

    The paper outlines the verification approach and activities of the Greenhouse Gas (GHG) Technology Verification Center, one of 12 independent verification entities operating under the U.S. EPA-sponsored Environmental Technology Verification (ETV) program. (NOTE: The ETV program...

  5. INF verification: a guide for the perplexed

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mendelsohn, J.

    1987-09-01

    The administration has dug itself some deep holes on the verification issue. It will have to conclude an arms control treaty without having resolved earlier (but highly questionable) compliance issues on which it has placed great emphasis. It will probably have to abandon its more sweeping (and unnecessary) on-site inspection (OSI) proposals because of adverse security and political implications for the United States and its allies. And, finally, it will probably have to present to the Congress an INF treaty that will provide for a considerably less stringent (but nonetheless adequate) verification regime than it had originally demanded. It is difficult to dispel the impression that, when the likelihood of concluding an INF treaty seemed remote, the administration indulged its penchant for intrusive and non-negotiable verification measures. As the possibility of, and eagerness for, a treaty increased, and as the Soviet Union shifted its policy from one of resistance to OSI to one of indicating that on-site verification involved reciprocal obligations, the administration was forced to scale back its OSI rhetoric. This re-evaluation of OSI by the administration does not make the INF treaty any less verifiable; from the outset the Reagan administration was asking for a far more extensive verification package than was necessary, practicable, acceptable, or negotiable.

  6. MO-F-16A-01: Implementation of MPPG TPS Verification Tests On Various Accelerators

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Smilowitz, J; Bredfeldt, J; Geurts, M

    2014-06-15

    Purpose: To demonstrate the implementation of the Medical Physics Practice Guideline (MPPG) for dose calculation and beam parameter verification of treatment planning systems (TPS). Methods: We implemented the draft TPS MPPG for three linacs: Varian Trilogy, TomoHDA and Elekta Infinity. Static and modulated test plans were created. The static fields are different from those used in commissioning. Data was collected using ion chambers and diodes in a scanning water tank, a Delta4 phantom and a custom phantom. MATLAB and Microsoft Excel were used to create analysis tools to compare reference DICOM dose with scan data. This custom code allowed for the interpolation, registration and gamma analysis of arbitrary dose profiles. It will be provided as open source code. IMRT fields were validated with Delta4 registration and comparison tools. The time for each task was recorded. Results: The tests confirmed the strengths, and revealed some limitations, of our TPS. The agreement between calculated and measured dose was reported for all beams. For static fields, percent depth dose and profiles were analyzed with criteria in the draft MPPG. The results reveal areas of slight mismatch with the model (MLC leaf penumbra, buildup region). For TomoTherapy, the IMRT plan 2%/2 mm gamma analysis revealed the poorest agreement in the low dose regions. For one static test plan for all 10 MV Trilogy photon beams, the plan generation, scan queue creation, data collection, data analysis and report took 2 hours, excluding tank setup. Conclusions: We have demonstrated the implementation feasibility of the TPS MPPG. This exercise generated an open source tool for dose comparisons between scan data and DICOM dose data. An easily reproducible and efficient infrastructure with streamlined data collection was created for repeatable, robust testing of the TPS. The tests revealed minor discrepancies in our models and areas for improvement that are being investigated.

  7. 78 FR 58492 - Generator Verification Reliability Standards

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-09-24

    ... Control Functions), MOD-027-1 (Verification of Models and Data for Turbine/Governor and Load Control or...), MOD-027-1 (Verification of Models and Data for Turbine/Governor and Load Control or Active Power... Category B and C contingencies, as required by wind generators in Order No. 661, or that those generators...

  8. 24 CFR 985.3 - Indicators, HUD verification methods and ratings.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... 24 Housing and Urban Development 4 2010-04-01 2010-04-01 false Indicators, HUD verification..., HUD verification methods and ratings. This section states the performance indicators that are used to assess PHA Section 8 management. HUD will use the verification method identified for each indicator in...

  9. ENVIRONMENTAL TECHNOLOGY VERIFICATION PROGRAM FOR MONITORING AND CHARACTERIZATION

    EPA Science Inventory

    The Environmental Technology Verification Program is a service of the Environmental Protection Agency designed to accelerate the development and commercialization of improved environmental technology through third party verification and reporting of performance. The goal of ETV i...

  10. Local field radiotherapy without elective nodal irradiation for postoperative loco-regional recurrence of esophageal cancer.

    PubMed

    Kimoto, Takuya; Yamazaki, Hideya; Suzuki, Gen; Aibe, Norihiro; Masui, Koji; Tatekawa, Kotoha; Sasaki, Naomi; Fujiwara, Hitoshi; Shiozaki, Atsushi; Konishi, Hirotaka; Nakamura, Satoaki; Yamada, Kei

    2017-09-01

    Radiotherapy is an effective treatment for the postoperative loco-regional recurrence of esophageal cancer; however, the optimal treatment field remains controversial. This study aims to evaluate the outcome of local field radiotherapy without elective nodal irradiation for postoperative loco-regional recurrence of esophageal cancer. We retrospectively investigated 35 patients treated for a postoperative loco-regional recurrence of esophageal cancer with local field radiotherapy between December 2008 and March 2016. The median irradiation dose was 60 Gy (range: 50-67.5 Gy). Thirty-one (88.6%) patients received concurrent chemotherapy. The median follow-up period was 18 months (range: 5-94 months). The 2-year overall survival was 55.7%, with a median survival time of 29.9 months. In the univariate analysis, a maximal diameter ≤20 mm (P = 0.0383), a solitary lesion (P = 0.0352), and complete remission after treatment (P = 0.00411) were associated with a significantly better prognosis. A total of 27 of 35 patients (77.1%) had progressive disease (loco-regional failure [n = 9], distant metastasis [n = 7], and both loco-regional failure and distant metastasis [n = 11]). No patients had Grade 3 or greater mucositis. Local field radiotherapy is a viable treatment option for postoperative loco-regional recurrence of esophageal cancer. © The Author 2017. Published by Oxford University Press. All rights reserved.

  11. Biomarker Discovery and Verification of Esophageal Squamous Cell Carcinoma Using Integration of SWATH/MRM.

    PubMed

    Hou, Guixue; Lou, Xiaomin; Sun, Yulin; Xu, Shaohang; Zi, Jin; Wang, Quanhui; Zhou, Baojin; Han, Bo; Wu, Lin; Zhao, Xiaohang; Lin, Liang; Liu, Siqi

    2015-09-04

    We propose an efficient integration of SWATH with MRM for biomarker discovery and verification when the corresponding ion library is well established. We strictly controlled the false positive rate associated with SWATH MS signals and carefully selected the target peptides coupled with SWATH and MRM. We collected 10 paired samples of esophageal squamous cell carcinoma (ESCC) tissue (tumor and adjacent region) and quantified 1758 unique proteins with FDR 1% at the protein level using SWATH, of which 467 proteins showed ESCC-dependent abundance. After carefully evaluating the SWATH MS signals of the up-regulated proteins, we selected 120 proteins for MRM verification. MRM analysis of the pooled and individual esophageal tissues resulted in 116 proteins that exhibited similar abundance response modes to ESCC as those acquired with SWATH. Because the ESCC-related proteins included a high percentage of secreted proteins, we conducted the MRM assay on patient sera collected pre- and postoperation. Of the 116 target proteins, 42 were identified in the ESCC sera, including 11 with lowered abundances postoperation. Coupling SWATH and MRM is thus feasible and efficient for the discovery and verification of cancer-related protein biomarkers.

  12. Numerical analyses of trapped field magnet and stable levitation region of HTSC

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Tsuchimoto, M.; Kojima, T.; Waki, H.

    Stable levitation with a permanent magnet and a bulk high-Tc superconductor (HTSC) is examined numerically using the critical state model and the frozen field model. Differences between a permanent magnet and a trapped field magnet are first discussed in terms of levitation force properties. The stable levitation regions of the HTSC on a ring magnet and on a solenoid coil are calculated with the numerical methods. The obtained results are discussed in terms of differences in the magnetic field configuration.

  13. Land Ice Verification and Validation Kit

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    2015-07-15

    To address a pressing need to better understand the behavior and complex interaction of ice sheets within the global Earth system, significant development of continental-scale, dynamical ice-sheet models is underway. The associated verification and validation process of these models is being coordinated through a new, robust, python-based extensible software package, the Land Ice Verification and Validation toolkit (LIVV). This release provides robust and automated verification and a performance evaluation on LCF platforms. The performance V&V involves a comprehensive comparison of model performance relative to expected behavior on a given computing platform. LIVV operates on a set of benchmark and test data, and provides comparisons for a suite of community-prioritized tests, including configuration and parameter variations, bit-for-bit evaluation, and plots of tests where differences occur.
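
The bit-for-bit evaluation mentioned above amounts to checking that model output files are byte-identical to a benchmark. A generic sketch using digest comparison (not LIVV's actual API; file names are hypothetical):

```python
import hashlib
import os
import tempfile

def bit_for_bit(path_a, path_b, chunk=1 << 16):
    """True iff two output files are byte-identical (SHA-256 digests match)."""
    def digest(path):
        h = hashlib.sha256()
        with open(path, "rb") as f:
            for block in iter(lambda: f.read(chunk), b""):
                h.update(block)
        return h.digest()
    return digest(path_a) == digest(path_b)

# Demo with two temporary files standing in for model output
with tempfile.TemporaryDirectory() as d:
    a, b = os.path.join(d, "bench.nc"), os.path.join(d, "test.nc")
    for p in (a, b):
        with open(p, "wb") as f:
            f.write(b"thickness=1.0\n")
    print(bit_for_bit(a, b))  # True
```

Bit-for-bit agreement is a strict criterion; compiler, library, or platform changes often require falling back to tolerance-based comparisons instead.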

  14. Verification in Referral-Based Crowdsourcing

    PubMed Central

    Naroditskiy, Victor; Rahwan, Iyad; Cebrian, Manuel; Jennings, Nicholas R.

    2012-01-01

    Online social networks offer unprecedented potential for rallying a large number of people to accomplish a given task. Here we focus on information gathering tasks where rare information is sought through “referral-based crowdsourcing”: the information request is propagated recursively through invitations among members of a social network. Whereas previous work analyzed incentives for the referral process in a setting with only correct reports, misreporting is known to be both pervasive in crowdsourcing applications, and difficult/costly to filter out. A motivating example for our work is the DARPA Red Balloon Challenge where the level of misreporting was very high. In order to undertake a formal study of verification, we introduce a model where agents can exert costly effort to perform verification and false reports can be penalized. This is the first model of verification and it provides many directions for future research, which we point out. Our main theoretical result is the compensation scheme that minimizes the cost of retrieving the correct answer. Notably, this optimal compensation scheme coincides with the winning strategy of the Red Balloon Challenge. PMID:23071530
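
The winning Red Balloon strategy referenced above split the reward geometrically up the referral chain. A sketch under that assumption (names and amounts are illustrative):

```python
def split_reward(total, chain, fraction=0.5):
    """Geometric reward split along a referral chain: the finder receives
    `fraction` of the total, and each recruiter up the chain half as much
    as the person they recruited."""
    payouts = {}
    share = total * fraction
    for member in chain:  # chain[0] found the item, the rest recruited them
        payouts[member] = share
        share *= fraction
    return payouts

# Finder alice was recruited by bob, who was recruited by carol.
payouts = split_reward(4000.0, ["alice", "bob", "carol"])
print(payouts)  # {'alice': 2000.0, 'bob': 1000.0, 'carol': 500.0}
```

Because the geometric series sums below the total, some budget is always left over, and every recruiter has a direct incentive to grow the chain, which is the property the compensation-scheme analysis in the paper formalizes.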

  15. The Electronic View Box: a software tool for radiation therapy treatment verification.

    PubMed

    Bosch, W R; Low, D A; Gerber, R L; Michalski, J M; Graham, M V; Perez, C A; Harms, W B; Purdy, J A

    1995-01-01

    We have developed a software tool for interactively verifying treatment plan implementation. The Electronic View Box (EVB) tool copies the paradigm of current practice but does so electronically. A portal image (online portal image or digitized port film) is displayed side by side with a prescription image (digitized simulator film or digitally reconstructed radiograph). The user can measure distances between features in prescription and portal images and "write" on the display, either to approve the image or to indicate required corrective actions. The EVB tool also provides several features not available in conventional verification practice using a light box. The EVB tool has been written in ANSI C using the X window system. The tool makes use of the Virtual Machine Platform and Foundation Library specifications of the NCI-sponsored Radiation Therapy Planning Tools Collaborative Working Group for portability into an arbitrary treatment planning system that conforms to these specifications. The present EVB tool is based on an earlier Verification Image Review tool, but with a substantial redesign of the user interface. A graphical user interface prototyping system was used in iteratively refining the tool layout to allow rapid modifications of the interface in response to user comments. Features of the EVB tool include 1) hierarchical selection of digital portal images based on physician name, patient name, and field identifier; 2) side-by-side presentation of prescription and portal images at equal magnification and orientation, and with independent grayscale controls; 3) "trace" facility for outlining anatomical structures; 4) "ruler" facility for measuring distances; 5) zoomed display of corresponding regions in both images; 6) image contrast enhancement; and 7) communication of portal image evaluation results (approval, block modification, repeat image acquisition, etc.). The EVB tool facilitates the rapid comparison of prescription and portal images and

  16. On open and closed field line regions in Tsyganenko's field model and their possible associations with horse collar auroras

    NASA Technical Reports Server (NTRS)

    Birn, J.; Hones, E. W., Jr.; Craven, J. D.; Frank, L. A.; Elphinstone, R. D.; Stern, D. P.

    1991-01-01

    The boundary between open and closed field lines is investigated in the empirical Tsyganenko (1987) magnetic field model. All field lines extending to distances beyond -70 R(E), the tailward validity limit of the Tsyganenko model, are defined as open, while all other field lines, which cross the equatorial plane earthward of -70 R(E) and are connected with the earth at both ends, are assumed closed. It is found that this boundary at the surface of the earth, identified as the polar cap boundary, can exhibit the arrowhead shape, pointed toward the sun, which is found in horse collar auroras. For increasing activity levels, the polar cap increases in area and becomes rounder, so that the arrowhead shape is less pronounced. The presence of a net B(y) component can also lead to considerable rounding of the open flux region. The arrowhead shape is found to be closely associated with the increase of B(z) from the midnight region to the flanks of the tail, consistent with a similar increase of the plasma sheet thickness.

  17. HELIOSHEATH MAGNETIC FIELDS BETWEEN 104 AND 113 AU IN A REGION OF DECLINING SPEEDS AND A STAGNATION REGION

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Burlaga, L. F.; Ness, N. F., E-mail: lburlagahsp@verizon.net, E-mail: nfnudel@yahoo.com

    2012-04-10

    We examine the relationships between the magnetic field and the radial velocity component V_R observed in the heliosheath by instruments on Voyager 1 (V1). No increase in the magnetic field strength B was observed in a region where V_R decreased linearly from 70 km/s to 0 km/s as plasma moved outward past V1. An unusually broad transition from positive to negative polarity was observed during a ≈26 day interval when the heliospheric current sheet (HCS) moved below the latitude of V1 and the speed of V1 was comparable to the radial speed of the heliosheath flow. When V1 moved through a region where V_R ≈ 0 (the 'stagnation region'), B increased linearly with time by a factor of two, and the average of B was 0.14 nT. Nothing comparable to this was observed previously. The magnetic polarity was negative throughout the stagnation region for ≈580 days until 2011 DOY 235, indicating that the HCS was below the latitude of V1. The average passage times of the magnetic holes and proton boundary layers were the same during 2009 and 2011, because the plasma moved past V1 during 2009 at the same speed that V1 moved through the stagnation region during 2011. The microscale fluctuations of B in the stagnation region during 2011 are qualitatively the same as those observed in the heliosheath during 2009. These results suggest that the stagnation region is a part of the heliosheath, rather than a 'transition region' associated with the heliopause.

  18. 7 CFR 272.8 - State income and eligibility verification system.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 7 Agriculture 4 2010-01-01 2010-01-01 false State income and eligibility verification system. 272... PARTICIPATING STATE AGENCIES § 272.8 State income and eligibility verification system. (a) General. (1) State agencies may maintain and use an income and eligibility verification system (IEVS), as specified in this...

  19. Guidelines for qualifying cleaning and verification materials

    NASA Technical Reports Server (NTRS)

    Webb, D.

    1995-01-01

    This document is intended to provide guidance in identifying technical issues which must be addressed in a comprehensive qualification plan for materials used in cleaning and cleanliness verification processes. Information presented herein is intended to facilitate development of a definitive checklist that should address all pertinent materials issues when down-selecting a cleaning/verification medium.

  20. 21 CFR 21.44 - Verification of identity.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... 21 Food and Drugs 1 2010-04-01 2010-04-01 false Verification of identity. 21.44 Section 21.44 Food... Verification of identity. (a) An individual seeking access to records in a Privacy Act Record System may be... identity. The identification required shall be suitable considering the nature of the records sought. No...

  1. 21 CFR 21.44 - Verification of identity.

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ... 21 Food and Drugs 1 2013-04-01 2013-04-01 false Verification of identity. 21.44 Section 21.44 Food... Verification of identity. (a) An individual seeking access to records in a Privacy Act Record System may be... identity. The identification required shall be suitable considering the nature of the records sought. No...

  2. 21 CFR 21.44 - Verification of identity.

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... 21 Food and Drugs 1 2014-04-01 2014-04-01 false Verification of identity. 21.44 Section 21.44 Food... Verification of identity. (a) An individual seeking access to records in a Privacy Act Record System may be... identity. The identification required shall be suitable considering the nature of the records sought. No...

  3. 21 CFR 21.44 - Verification of identity.

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... 21 Food and Drugs 1 2012-04-01 2012-04-01 false Verification of identity. 21.44 Section 21.44 Food... Verification of identity. (a) An individual seeking access to records in a Privacy Act Record System may be... identity. The identification required shall be suitable considering the nature of the records sought. No...

  4. Study of techniques for redundancy verification without disrupting systems, phases 1-3

    NASA Technical Reports Server (NTRS)

    1970-01-01

    The problem of verifying the operational integrity of redundant equipment and the impact of a requirement for verification on such equipment are considered. Redundant circuits are examined and the characteristics which determine adaptability to verification are identified. Mutually exclusive and exhaustive categories for verification approaches are established. The range of applicability of these techniques is defined in terms of signal characteristics and redundancy features. Verification approaches are discussed and a methodology for the design of redundancy verification is developed. A case study is presented which involves the design of a verification system for a hypothetical communications system. Design criteria for redundant equipment are presented. Recommendations for the development of technological areas pertinent to the goal of increased verification capabilities are given.

  5. Verification of Autonomous Systems for Space Applications

    NASA Technical Reports Server (NTRS)

    Brat, G.; Denney, E.; Giannakopoulou, D.; Frank, J.; Jonsson, A.

    2006-01-01

    Autonomous software, especially if it is model-based, can play an important role in future space applications. For example, it can help streamline ground operations, assist in autonomous rendezvous and docking operations, or even help recover from problems (e.g., planners can be used to explore the space of recovery actions for a power subsystem and implement a solution without, or with minimal, human intervention). In general, the exploration capabilities of model-based systems give them great flexibility. Unfortunately, this also makes them unpredictable to our human eyes, both in terms of their execution and their verification. Traditional verification techniques are inadequate for these systems, since they are mostly based on testing, which implies a very limited exploration of the behavioral space. In our work, we explore how advanced V&V techniques, such as static analysis, model checking, and compositional verification, can be used to gain trust in model-based systems. We also describe how synthesis can be used in the context of system reconfiguration and in the context of verification.

  6. Wide-Field Lensing Mass Maps from Dark Energy Survey Science Verification Data

    DOE PAGES

    Chang, C.

    2015-07-29

    We present a mass map reconstructed from weak gravitational lensing shear measurements over 139 deg 2 from the Dark Energy Survey science verification data. The mass map probes both luminous and dark matter, thus providing a tool for studying cosmology. We also find good agreement between the mass map and the distribution of massive galaxy clusters identified using a red-sequence cluster finder. Potential candidates for superclusters and voids are identified using these maps. We measure the cross-correlation between the mass map and a magnitude-limited foreground galaxy sample and find a detection at the 6.8σ level with 20 arc min smoothing. These measurements are consistent with simulated galaxy catalogs based on N-body simulations from a cold dark matter model with a cosmological constant. This suggests low systematics uncertainties in the map. Finally, we summarize our key findings in this Letter; the detailed methodology and tests for systematics are presented in a companion paper.

  7. Wide-Field Lensing Mass Maps from Dark Energy Survey Science Verification Data.

    PubMed

    Chang, C; Vikram, V; Jain, B; Bacon, D; Amara, A; Becker, M R; Bernstein, G; Bonnett, C; Bridle, S; Brout, D; Busha, M; Frieman, J; Gaztanaga, E; Hartley, W; Jarvis, M; Kacprzak, T; Kovács, A; Lahav, O; Lin, H; Melchior, P; Peiris, H; Rozo, E; Rykoff, E; Sánchez, C; Sheldon, E; Troxel, M A; Wechsler, R; Zuntz, J; Abbott, T; Abdalla, F B; Allam, S; Annis, J; Bauer, A H; Benoit-Lévy, A; Brooks, D; Buckley-Geer, E; Burke, D L; Capozzi, D; Carnero Rosell, A; Carrasco Kind, M; Castander, F J; Crocce, M; D'Andrea, C B; Desai, S; Diehl, H T; Dietrich, J P; Doel, P; Eifler, T F; Evrard, A E; Fausti Neto, A; Flaugher, B; Fosalba, P; Gruen, D; Gruendl, R A; Gutierrez, G; Honscheid, K; James, D; Kent, S; Kuehn, K; Kuropatkin, N; Maia, M A G; March, M; Martini, P; Merritt, K W; Miller, C J; Miquel, R; Neilsen, E; Nichol, R C; Ogando, R; Plazas, A A; Romer, A K; Roodman, A; Sako, M; Sanchez, E; Sevilla, I; Smith, R C; Soares-Santos, M; Sobreira, F; Suchyta, E; Tarle, G; Thaler, J; Thomas, D; Tucker, D; Walker, A R

    2015-07-31

    We present a mass map reconstructed from weak gravitational lensing shear measurements over 139  deg2 from the Dark Energy Survey science verification data. The mass map probes both luminous and dark matter, thus providing a tool for studying cosmology. We find good agreement between the mass map and the distribution of massive galaxy clusters identified using a red-sequence cluster finder. Potential candidates for superclusters and voids are identified using these maps. We measure the cross-correlation between the mass map and a magnitude-limited foreground galaxy sample and find a detection at the 6.8σ level with 20 arc min smoothing. These measurements are consistent with simulated galaxy catalogs based on N-body simulations from a cold dark matter model with a cosmological constant. This suggests low systematics uncertainties in the map. We summarize our key findings in this Letter; the detailed methodology and tests for systematics are presented in a companion paper.

  8. Formal Verification for a Next-Generation Space Shuttle

    NASA Technical Reports Server (NTRS)

    Nelson, Stacy D.; Pecheur, Charles; Koga, Dennis (Technical Monitor)

    2002-01-01

    This paper discusses the verification and validation (V&V) of advanced software used for integrated vehicle health monitoring (IVHM), in the context of NASA's next-generation space shuttle. We survey the current V&V practice and standards used in selected NASA projects, review applicable formal verification techniques, and discuss their integration into existing development practice and standards. We also describe two verification tools, JMPL2SMV and Livingstone PathFinder, that can be used to thoroughly verify diagnosis applications that use model-based reasoning, such as the Livingstone system.

  9. 30 CFR 250.913 - When must I resubmit Platform Verification Program plans?

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... CONTINENTAL SHELF Platforms and Structures Platform Verification Program § 250.913 When must I resubmit Platform Verification Program plans? (a) You must resubmit any design verification, fabrication... 30 Mineral Resources 2 2011-07-01 2011-07-01 false When must I resubmit Platform Verification...

  10. A digital flight control system verification laboratory

    NASA Technical Reports Server (NTRS)

    De Feo, P.; Saib, S.

    1982-01-01

    A NASA/FAA program has been established for the verification and validation of digital flight control systems (DFCS), with the primary objective being the development and analysis of automated verification tools. To enhance the capabilities, effectiveness, and ease of use of the test environment, software verification tools can be applied. The tool design includes a static analyzer, an assertion generator, a symbolic executor, a dynamic analysis instrument, and an automated documentation generator. Static and dynamic tools are integrated with error detection capabilities, resulting in a facility that analyzes a representative testbed of DFCS software. Future investigations will focus on increasing the number of software test tools and on assessing cost effectiveness.

  11. Preliminary report on the Black Thunder, Wyoming CTBT R and D experiment quicklook report: LLNL input from regional stations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Harben, P.E.; Glenn, L.A.

    This report presents a preliminary summary of the data recorded at three regional seismic stations from surface blasting at the Black Thunder Coal Mine in northeast Wyoming. The regional stations are part of a larger effort that includes many more seismic stations in the immediate vicinity of the mine. The overall purpose of this effort is to characterize the source function and propagation characteristics of large typical surface mine blasts. A detailed study of source and propagation features of conventional surface blasts is a prerequisite to attempts at discriminating this type of blasting activity from other sources of seismic events. The Black Thunder Seismic experiment is a joint verification effort to determine seismic source and path effects that result from very large, but routine ripple-fired surface mining blasts. Studies of the data collected will be for the purpose of understanding how the near-field and regional seismic waveforms from these surface mining blasts are similar to, and different from, point shot explosions and explosions at greater depth. The Black Hills Station is a Designated Seismic Station that was constructed for temporary occupancy by Former Soviet Union seismic verification scientists in accordance with the Threshold Test Ban Treaty protocol.

  12. A methodology for the rigorous verification of plasma simulation codes

    NASA Astrophysics Data System (ADS)

    Riva, Fabio

    2016-10-01

    The methodology used to assess the reliability of numerical simulation codes constitutes the Verification and Validation (V&V) procedure. V&V is composed of two separate tasks: the verification, which is a mathematical issue aimed at assessing that the physical model is correctly solved, and the validation, which determines the consistency of the code results, and therefore of the physical model, with experimental data. In the present talk we focus our attention on verification, which in turn is composed of code verification, aimed at assessing that a physical model is correctly implemented in a simulation code, and solution verification, which quantifies the numerical error affecting a simulation. Bridging the gap between plasma physics and other scientific domains, we introduced for the first time in our domain a rigorous methodology for code verification, based on the method of manufactured solutions, as well as a solution verification based on the Richardson extrapolation. This methodology was applied to GBS, a three-dimensional fluid code based on a finite difference scheme, used to investigate plasma turbulence in basic plasma physics experiments and in the tokamak scrape-off layer. Overcoming the difficulty of dealing with a numerical method intrinsically affected by statistical noise, we have now generalized the rigorous verification methodology to simulation codes based on the particle-in-cell algorithm, which are employed to solve the Vlasov equation in the investigation of a number of plasma physics phenomena.
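    The solution-verification step based on Richardson extrapolation can be illustrated with a minimal sketch (manufactured values, not GBS output): given a quantity computed on three grids with a constant refinement ratio, one estimates the observed order of accuracy and an extrapolated grid-converged value.

    ```python
    import math

    # Richardson extrapolation from three grid levels with constant
    # refinement ratio r: estimate the observed order of accuracy p
    # and the extrapolated "exact" value. Values below come from a
    # manufactured solution f(h) = 1 + 0.5*h**2, not from any real code.

    def richardson(f_coarse, f_medium, f_fine, r):
        p = math.log(abs((f_coarse - f_medium) / (f_medium - f_fine))) / math.log(r)
        f_exact = f_fine + (f_fine - f_medium) / (r**p - 1.0)
        return p, f_exact

    # Grids h = 0.4, 0.2, 0.1 (refinement ratio r = 2)
    p, f_ex = richardson(1 + 0.5 * 0.4**2, 1 + 0.5 * 0.2**2, 1 + 0.5 * 0.1**2, 2.0)
    print(round(p, 6), round(f_ex, 6))  # observed order ≈ 2, extrapolated ≈ 1
    ```
    
    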

  13. Verification of clinical samples, positive in AMPLICOR Neisseria gonorrhoeae polymerase chain reaction, by 16S rRNA and gyrA compared with culture.

    PubMed

    Airell, Asa; Lindbäck, Emma; Ataker, Ferda; Pörnull, Kirsti Jalakas; Wretlind, Bengt

    2005-06-01

    We compared 956 samples analysed by the AMPLICOR Neisseria gonorrhoeae polymerase chain reaction (PCR) (Roche) with species verification using the 16S rRNA gene to verification using the gyrA gene, with culture as the control method. The gyrA verification uses pyrosequencing of the quinolone resistance-determining region of gyrA. Of 52 samples with optical density ≥0.2 in PCR, 27 were negative in culture, two samples from the pharynx were false negative in culture, and four samples from the pharynx were false positive in verification with 16S rRNA. Twenty-five samples showed growth of gonococci; 18 of the corresponding PCR samples were verified by both methods, three urine samples were positive only in gyrA, and one pharynx specimen was positive only in 16S rRNA. Three samples were lost. We conclude that AMPLICOR N. gonorrhoeae PCR with verification in the gyrA gene can be considered as a diagnostic tool in populations with low prevalence of gonorrhoea, and that pharynx specimens should not be analysed by PCR.

  14. Static and Dynamic Verification of Critical Software for Space Applications

    NASA Astrophysics Data System (ADS)

    Moreira, F.; Maia, R.; Costa, D.; Duro, N.; Rodríguez-Dapena, P.; Hjortnaes, K.

    Space technology is no longer used only for highly specialised research activities or for sophisticated manned space missions. Modern society relies more and more on space technology and applications for everyday activities. Worldwide telecommunications, Earth observation, navigation and remote sensing are only a few examples of space applications on which we rely daily. The European-driven global navigation system Galileo and its associated applications, e.g. air traffic management, vessel and car navigation, will significantly expand the already stringent safety requirements for space-based applications. Apart from their usefulness and practical applications, every single piece of onboard software deployed into space represents an enormous investment. With a long operational lifetime, and being extremely difficult to maintain and upgrade, at least when compared with "mainstream" software development, the importance of ensuring their correctness before deployment is immense. Verification & Validation techniques and technologies have a key role in ensuring that the onboard software is correct and error free, or at least free from errors that can potentially lead to catastrophic failures. Many RAMS techniques, including both static criticality analysis and dynamic verification techniques, have been used as a means to verify and validate critical software and to ensure its correctness. But, traditionally, these have been applied in isolation. One of the main reasons is the immaturity of this field as applied to the growing body of software within space systems. This paper presents an innovative way of combining both static and dynamic techniques, exploiting their synergy and complementarity for software fault removal. The proposed methodology is based on the combination of Software FMEA and FTA with fault-injection techniques. The case study herein described is implemented with support from two tools: The SoftCare tool for the SFMEA and SFTA

  15. ENVIRONMENTAL TECHNOLOGY VERIFICATION TEST PROTOCOL, GENERAL VENTILATION FILTERS

    EPA Science Inventory

    The Environmental Technology Verification Test Protocol, General Ventilation Filters provides guidance for verification tests.

    Reference is made in the protocol to the ASHRAE 52.2P "Method of Testing General Ventilation Air-cleaning Devices for Removal Efficiency by P...

  16. Signature Verification Using N-tuple Learning Machine.

    PubMed

    Maneechot, Thanin; Kitjaidure, Yuttana

    2005-01-01

    This research presents a new algorithm for signature verification using an N-tuple learning machine. The features are taken from handwritten signatures on a digital tablet (on-line). The recognition algorithm uses four extracted features, namely horizontal and vertical pen-tip position (x-y position), pen-tip pressure, and pen altitude angles. Verification uses the N-tuple technique with Gaussian thresholding.
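    The N-tuple idea can be sketched roughly as follows (a generic WiSARD-style illustration, not the authors' algorithm: it assumes signatures have already been encoded as fixed-length binary feature vectors, and it replaces the Gaussian thresholding step with a plain score):

    ```python
    import random

    # N-tuple learning sketch: the input bit vector is partitioned into
    # random groups of N bits; training memorizes each group's observed
    # bit pattern, and scoring counts how many groups of a query vector
    # match a memorized pattern.

    N = 4  # tuple size

    def make_tuples(length, seed=0):
        idx = list(range(length))
        random.Random(seed).shuffle(idx)
        return [tuple(idx[i:i + N]) for i in range(0, length, N)]

    def train(vectors, tuples):
        memory = [set() for _ in tuples]
        for v in vectors:
            for mem, t in zip(memory, tuples):
                mem.add(tuple(v[i] for i in t))
        return memory

    def score(v, memory, tuples):
        hits = sum(tuple(v[i] for i in t) in mem for mem, t in zip(memory, tuples))
        return hits / len(tuples)

    tuples = make_tuples(16)
    genuine = [[1, 1, 0, 0] * 4, [1, 0, 0, 0] * 4]  # toy "genuine" encodings
    memory = train(genuine, tuples)
    print(score([1, 1, 0, 0] * 4, memory, tuples))  # 1.0 for a trained pattern
    ```

    A verification decision would then compare this score against a threshold; the abstract's Gaussian thresholding would set that threshold from the score distribution of genuine signatures.
    
    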

  17. Verification of eye lens dose in IMRT by MOSFET measurement.

    PubMed

    Wang, Xuetao; Li, Guangjun; Zhao, Jianling; Song, Ying; Xiao, Jianghong; Bai, Sen

    2018-04-17

    The eye lens is recognized as one of the most radiosensitive structures in the human body. The widespread use of intensity-modulated radiotherapy (IMRT) complicates dose verification and necessitates high standards of dose computation. The purpose of this work was to assess the accuracy of the computed eye lens dose through measurements using a metal-oxide-semiconductor field-effect transistor (MOSFET) dosimetry system. Sixteen clinical IMRT plans of head and neck patients were copied to an anthropomorphic head phantom. Measurements were performed using the MOSFET dosimetry system based on the head phantom. Two MOSFET detectors were embedded in the eyes of the head phantom as the left and the right lens, covered by approximately 5-mm-thick paraffin wax. The measurement results were compared with the calculated values with a dose grid size of 1 mm. Sixteen IMRT plans were delivered, and 32 measured lens doses were obtained for analysis. The MOSFET dosimetry system can be used to verify the lens dose, and our measurements showed that the treatment planning system used in our clinic can provide adequate dose assessment in eye lenses. The average discrepancy between measurement and calculation was 6.7 ± 3.4%, and the largest discrepancy was 14.3%, which met the acceptability criterion set by the American Association of Physicists in Medicine Task Group 53 for external beam calculation for multileaf collimator-shaped fields in buildup regions. Copyright © 2018 American Association of Medical Dosimetrists. Published by Elsevier Inc. All rights reserved.
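    The comparison reported above reduces to a simple percent-discrepancy statistic, sketched here with illustrative dose values (not the paper's data):

    ```python
    from statistics import mean, stdev

    # Percent discrepancy between a MOSFET-measured dose and the
    # treatment-planning-system calculation, plus the mean ± SD over a
    # set of measurements. All dose values below are made up.

    def percent_discrepancy(measured, calculated):
        return 100.0 * abs(measured - calculated) / calculated

    measured_cgy   = [182.0, 95.0, 140.0, 210.0]
    calculated_cgy = [175.0, 90.0, 150.0, 200.0]

    d = [percent_discrepancy(m, c) for m, c in zip(measured_cgy, calculated_cgy)]
    print(f"mean discrepancy: {mean(d):.1f} ± {stdev(d):.1f}%")
    ```
    
    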

  18. INNOVATIVE TECHNOLOGY VERIFICATION REPORT XRF ...

    EPA Pesticide Factsheets

    The Niton XLt 700 Series (XLt) XRF Services x-ray fluorescence (XRF) analyzer was demonstrated under the U.S. Environmental Protection Agency (EPA) Superfund Innovative Technology Evaluation (SITE) Program. The field portion of the demonstration was conducted in January 2005 at the Kennedy Athletic, Recreational and Social Park (KARS) at Kennedy Space Center on Merritt Island, Florida. The demonstration was designed to collect reliable performance and cost data for the XLt analyzer and seven other commercially available XRF instruments for measuring trace elements in soil and sediment. The performance and cost data were evaluated to document the relative performance of each XRF instrument. This innovative technology verification report describes the objectives and the results of that evaluation and serves to verify the performance and cost of the XLt analyzer. Separate reports have been prepared for the other XRF instruments that were evaluated as part of the demonstration. The objectives of the evaluation included determining each XRF instrument’s accuracy, precision, sample throughput, and tendency for matrix effects. To fulfill these objectives, the field demonstration incorporated the analysis of 326 prepared samples of soil and sediment that contained 13 target elements. The prepared samples included blends of environmental samples from nine different sample collection sites as well as spiked samples with certified element concentrations. Accuracy

  19. The magnetic field structure in high-mass star formation regions

    NASA Technical Reports Server (NTRS)

    Davidson, Jacqueline A.; Schleuning, D.; Dotson, J. L.; Dowell, C. Darren; Hildebrand, Roger H.

    1995-01-01

    We present a preliminary analysis of far-IR polarimetric observations, which were made to study the magnetic field structure in the high-mass star formation regions of M42, NGC2024, and W3. These observations were made from the Kuiper Airborne Observatory (KAO), using the University of Chicago far-IR polarimeter, Stokes.

  20. 24 CFR 5.659 - Family information and verification.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... 24 Housing and Urban Development 1 2011-04-01 2011-04-01 false Family information and verification... Assisted Housing Serving Persons with Disabilities: Family Income and Family Payment; Occupancy... § 5.659 Family information and verification. (a) Applicability. This section states requirements for...

  1. 24 CFR 5.659 - Family information and verification.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... 24 Housing and Urban Development 1 2010-04-01 2010-04-01 false Family information and verification... Assisted Housing Serving Persons with Disabilities: Family Income and Family Payment; Occupancy... § 5.659 Family information and verification. (a) Applicability. This section states requirements for...

  2. 40 CFR 1066.275 - Daily dynamometer readiness verification.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ...) AIR POLLUTION CONTROLS VEHICLE-TESTING PROCEDURES Dynamometer Specifications § 1066.275 Daily... automated process for this verification procedure, perform this evaluation by setting the initial speed and... your dynamometer does not perform this verification with an automated process: (1) With the dynamometer...

  3. Software Verification of Orion Cockpit Displays

    NASA Technical Reports Server (NTRS)

    Biswas, M. A. Rafe; Garcia, Samuel; Prado, Matthew; Hossain, Sadad; Souris, Matthew; Morin, Lee

    2017-01-01

    NASA's latest spacecraft, Orion, is in the development process of taking humans deeper into space. Orion is equipped with three main displays to monitor and control the spacecraft. To ensure the software behind the glass displays operates without faults, rigorous testing is needed. To conduct such testing, the Rapid Prototyping Lab at NASA's Johnson Space Center, along with the University of Texas at Tyler, employed a software verification tool, EggPlant Functional by TestPlant. It is an image-based test automation tool that allows users to create scripts to verify the functionality within a program. An edge key framework and a set of Common EggPlant Functions were developed to enable efficient creation of scripts. This framework standardized the way to code and to simulate user inputs in the verification process. Moreover, the Common EggPlant Functions can be reused in the verification of different displays.

  4. Contribution to the Solar Mean Magnetic Field from Different Solar Regions

    NASA Astrophysics Data System (ADS)

    Kutsenko, A. S.; Abramenko, V. I.; Yurchyshyn, V. B.

    2017-09-01

    Seven-year-long seeing-free observations of solar magnetic fields with the Helioseismic and Magnetic Imager (HMI) on board the Solar Dynamics Observatory (SDO) were used to study the sources of the solar mean magnetic field, SMMF, defined as the net line-of-sight magnetic flux divided by the solar disk area. To evaluate the contribution of different regions to the SMMF, we separated all the pixels of each SDO/HMI magnetogram into three subsets: weak (BW), intermediate (BI), and strong (BS) fields. The BW component represents areas with magnetic flux densities below the chosen threshold; the BI component is mainly represented by network fields, remains of decayed active regions (ARs), and ephemeral regions. The BS component consists of magnetic elements in ARs. To derive the contribution of a subset to the total SMMF, the linear regression coefficients between the corresponding component and the SMMF were calculated. We found that i) when the threshold level of 30 Mx cm-2 is applied, the BI and BS components together contribute from 65% to 95% of the SMMF, while the fraction of the occupied area varies in a range of 2 - 6% of the disk area; ii) as the threshold magnitude is lowered to 6 Mx cm-2, the contribution from BI+BS grows to 98%, and the fraction of the occupied area reaches a value of about 40% of the solar disk. In summary, we found that regardless of the threshold level, only a small part of the solar disk area contributes to the SMMF. This means that the photospheric magnetic structure is an intermittent inherently porous medium, resembling a percolation cluster. These findings suggest that the long-standing concept that continuous vast unipolar areas on the solar surface are the source of the SMMF may need to be reconsidered.
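    The attribution step described above (the linear regression coefficient between a flux component and the total SMMF) can be sketched as an ordinary least-squares slope. The time series below are synthetic, not SDO/HMI measurements:

    ```python
    # Contribution of one component (e.g. BI+BS) to the SMMF, estimated
    # as the OLS slope of that component regressed against the total
    # SMMF time series. Synthetic example: the component is exactly 80%
    # of the total, so the fitted slope is 0.8.

    def ols_slope(x, y):
        n = len(x)
        mx, my = sum(x) / n, sum(y) / n
        return sum((xi - mx) * (yi - my) for xi, yi in zip(x, y)) / \
               sum((xi - mx) ** 2 for xi in x)

    smmf      = [0.10, 0.25, -0.05, 0.40, 0.15]   # total SMMF (arbitrary units)
    component = [0.08, 0.20, -0.04, 0.32, 0.12]   # one component's net flux

    slope = ols_slope(smmf, component)
    print(f"contribution fraction ≈ {slope:.2f}")
    ```
    
    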

  5. Regional gravity field modelling from GOCE observables

    NASA Astrophysics Data System (ADS)

    Pitoňák, Martin; Šprlák, Michal; Novák, Pavel; Tenzer, Robert

    2017-01-01

    In this article we discuss a regional recovery of gravity disturbances at the mean geocentric sphere approximating the Earth over the area of Central Europe from satellite gravitational gradients. For this purpose, we derive integral formulas which allow converting the gravity disturbances onto the disturbing gravitational gradients in the local north-oriented frame (LNOF). The derived formulas are free of singularities in case of r ≠ R . We then investigate three numerical approaches for solving their inverses. In the initial approach, the integral formulas are firstly modified for solving individually the near- and distant-zone contributions. While the effect of the near-zone gravitational gradients is solved as an inverse problem, the effect of the distant-zone gravitational gradients is computed by numerical integration from the global gravitational model (GGM) TIM-r4. In the second approach, we further elaborate the first scenario by reducing measured gravitational gradients for gravitational effects of topographic masses. In the third approach, we apply additional modification by reducing gravitational gradients for the reference GGM. In all approaches we determine the gravity disturbances from each of the four accurately measured gravitational gradients separately as well as from their combination. Our regional gravitational field solutions are based on the GOCE EGG_TRF_2 gravitational gradients collected within the period from November 1 2009 until January 11 2010. Obtained results are compared with EGM2008, DIR-r1, TIM-r1 and SPW-r1. The best fit, in terms of RMS (2.9 mGal), is achieved for EGM2008 while using the third approach which combine all four well-measured gravitational gradients. This is explained by the fact that a-priori information about the Earth's gravitational field up to the degree and order 180 was used.

  6. Possible Connection of Geological Composition With Geomagnetic Field Change In Kopaonik Thrust Region

    NASA Astrophysics Data System (ADS)

    Popeskov, Mirjana; Cukavac, Milena; Lazovic, Caslav

    This paper considers the interpretation of geomagnetic field changes on the basis of a possible connection with the geological composition of a deformation zone. Analysis of total magnetic field intensity data from 38 surveys, carried out in the period May 1980 - November 2001 in the Kopaonik thrust region, central Serbia, reveals anomalous behaviour of local field changes in particular time intervals. These data give us the possibility to observe geomagnetic changes over a long period of time. This paper considers if and how different magnetizations in the geological composition of the array are connected with anomalous geomagnetic field change. We consider how a non-uniform geological structure, or rocks with different magnetizations, can affect geomagnetic observations, and whether a sharp contrast in rock magnetization between neighbouring layers can give rise to larger changes in the geomagnetic total intensity than those for a uniform layer. For that purpose we consider the geological and tectonic map of the Kopaonik region. We also consider the map of the vertical component of the geomagnetic field, because Kopaonik belongs to a high magnetic anomaly zone. Correlation of geomagnetic and geological data is expected to give us some answers to the question of the origin of some anomalous geomagnetic changes in the total intensity of the geomagnetic field. It can also represent a first step in correlating geomagnetic field changes with other geophysical, seismological or geological data that can be the cause of geomagnetic field change.

  7. Verification and intercomparison of mesoscale ensemble prediction systems in the Beijing 2008 Olympics Research and Development Project

    NASA Astrophysics Data System (ADS)

    Kunii, Masaru; Saito, Kazuo; Seko, Hiromu; Hara, Masahiro; Hara, Tabito; Yamaguchi, Munehiko; Gong, Jiandong; Charron, Martin; Du, Jun; Wang, Yong; Chen, Dehui

    2011-05-01

    During the period around the Beijing 2008 Olympic Games, the Beijing 2008 Olympics Research and Development Project (B08RDP) was conducted as part of the World Weather Research Program short-range weather forecasting research project. Mesoscale ensemble prediction (MEP) experiments were carried out by six organizations in near-real time, in order to share their experiences in the development of MEP systems. The purpose of this study is to objectively verify these experiments and to clarify the problems associated with current MEP systems through this shared experience. Verification was performed using the MEP outputs interpolated onto a common verification domain with a horizontal resolution of 15 km. For all systems, the ensemble spreads grew as the forecast time increased, and the ensemble mean improved on the forecast errors of the individual control forecasts when verified against the analysis fields. However, each system exhibited individual characteristics according to the MEP method. Some participants used physical perturbation methods, and the significance of these methods was confirmed by the verification. However, the mean error (ME) of the ensemble forecast in some systems was worse than that of the individual control forecast. This result suggests that careful attention must be paid to physical perturbations.
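    The finding that the ensemble mean beats the control forecast reflects the averaging-out of independent member errors; a hedged illustration with synthetic scalar data (the B08RDP verification itself used gridded fields interpolated to a common 15 km domain):

```python
import numpy as np

def rmse(forecast, truth):
    """Root-mean-square error of a forecast against the verifying truth."""
    return float(np.sqrt(np.mean((np.asarray(forecast) - np.asarray(truth)) ** 2)))

rng = np.random.default_rng(1)
truth = np.sin(np.linspace(0, 4 * np.pi, 200))              # verifying analysis (synthetic)
control = truth + rng.normal(0, 0.5, truth.shape)           # single control forecast
members = truth + rng.normal(0, 0.5, (20,) + truth.shape)   # 20 perturbed members
ens_mean = members.mean(axis=0)

# Averaging ~independent errors shrinks the ensemble-mean RMSE toward 1/sqrt(20) of a member's
spread = float(members.std(axis=0, ddof=1).mean())          # average ensemble spread
print(f"control RMSE {rmse(control, truth):.3f}  "
      f"ensemble-mean RMSE {rmse(ens_mean, truth):.3f}  spread {spread:.3f}")
```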

  8. Internal and external potential-field estimation from regional vector data at varying satellite altitude

    NASA Astrophysics Data System (ADS)

    Plattner, Alain; Simons, Frederik J.

    2017-10-01

    When modelling satellite data to recover a global planetary magnetic or gravitational potential field, the method of choice remains their analysis in terms of spherical harmonics. When only regional data are available, or when data quality varies strongly with geographic location, the inversion problem becomes severely ill-posed. In those cases, adopting explicitly local methods is to be preferred over adapting global ones (e.g. by regularization). Here, we develop the theory behind a procedure to invert for planetary potential fields from vector observations collected within a spatially bounded region at varying satellite altitude. Our method relies on the construction of spatiospectrally localized bases of functions that mitigate the noise amplification caused by downward continuation (from the satellite altitude to the source) while balancing the conflicting demands for spatial concentration and spectral limitation. The `altitude-cognizant' gradient vector Slepian functions (AC-GVSF) enjoy a noise tolerance under downward continuation that is much improved relative to the `classical' gradient vector Slepian functions (CL-GVSF), which do not factor satellite altitude into their construction. Furthermore, going beyond their first application, published in a preceding paper, in the present article we extend the theory to handle both internal and external potential-field estimation. Solving simultaneously for internal and external fields under the limitation of regional data availability reduces internal-field artefacts introduced by downward-continuing unmodelled external fields, as we show with numerical examples. We explain our solution strategies on the basis of analytic expressions for the behaviour of the estimation bias and variance of models for which signal and noise are uncorrelated, (essentially) space- and band-limited, and spectrally (almost) white. The AC-GVSF are optimal linear combinations of vector spherical harmonics.

  9. 37 CFR 262.7 - Verification of royalty payments.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... Designated Agent have agreed as to proper verification methods. (b) Frequency of verification. A Copyright Owner or a Performer may conduct a single audit of the Designated Agent upon reasonable notice and... COPYRIGHT ARBITRATION ROYALTY PANEL RULES AND PROCEDURES RATES AND TERMS FOR CERTAIN ELIGIBLE...

  10. 46 CFR 61.40-3 - Design verification testing.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... Shipping COAST GUARD, DEPARTMENT OF HOMELAND SECURITY (CONTINUED) MARINE ENGINEERING PERIODIC TESTS AND INSPECTIONS Design Verification and Periodic Testing of Vital System Automation § 61.40-3 Design verification testing. (a) Tests must verify that automated vital systems are designed, constructed, and operate in...

  11. Distribution of the Crustal Magnetic Field in Sichuan-Yunnan Region, Southwest China

    PubMed Central

    Bai, Chunhua; Kang, Guofa; Gao, Guoming

    2014-01-01

    Based on the new, higher-degree geomagnetic model NGDC-720-V3, we have investigated the spatial distribution and altitude-decay characteristics of the crustal magnetic anomaly, the contributions of different wavelength bands to the anomaly, and the relationships among the anomaly, the geological structure, and the geophysical field in the Sichuan-Yunnan region of China. The most striking feature in this area is the strong positive magnetic anomaly over the Sichuan Basin, a geologically stable block. In contrast, a strong negative anomaly is seen nearby over the Longmen Mountain block, an active block. This contrast implies a possible relationship between the magnetic field and geological activity. A completely different pattern appears in the central Yunnan block, another active region, where positive and negative anomalies alternate, producing a complex magnetic anomaly map. Some fault belts, such as the Longmen Mountain fault, the Lijiang-Xiaojinhe fault, and the Red River fault, are transitional zones between strong and weak, or negative and positive, anomalies. The corresponding relationship between the magnetic anomaly and the geophysical fields was confirmed. PMID:25243232
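    The altitude-decay behaviour mentioned above follows from potential-field theory: a spherical-harmonic degree-n term of the crustal field attenuates roughly as (a/r)^(n+2) with geocentric distance r, so short-wavelength (high-degree) anomalies fade fastest with altitude. A small illustration (the degrees and altitudes are arbitrary choices, not values from the paper):

```python
# Relative attenuation of a degree-n crustal-field harmonic with altitude:
#   B_n(a + h) / B_n(a) = (a / (a + h)) ** (n + 2),  a = reference radius
a = 6371.2  # km, geomagnetic reference radius
for n in (16, 60, 133):            # long-, medium-, short-wavelength terms
    for h in (0.0, 50.0, 400.0):   # ground, aircraft, satellite altitude (km)
        atten = (a / (a + h)) ** (n + 2)
        print(f"n={n:3d} h={h:5.0f} km  relative amplitude {atten:.4f}")
```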

  12. Adiabatic theory in regions of strong field gradients. [in magnetosphere

    NASA Technical Reports Server (NTRS)

    Whipple, E. C.; Northrop, T. G.; Birmingham, T. J.

    1986-01-01

    The theory of the generalized first invariant for adiabatic motion of charged particles in regions with large gradients in the magnetic or electric fields is developed. The general condition for an invariant to exist in such regions is that the potential well in which the particle oscillates change its shape slowly as the particle drifts. It is shown how the Kruskal (1962) procedure can be applied to obtain expressions for the invariant and for drift velocities that are asymptotic in a smallness parameter epsilon. The procedure is illustrated by obtaining the invariant and drift velocities for particles traversing a perpendicular shock; for a particular case, the generalized invariant is compared with the magnetic moment, and the drift orbits with the actual orbits. In contrast to the magnetic moment, the generalized first invariant is better for large gyroradii (large kinetic energies) than for small gyroradii. Expressions are given for the invariant when an electrostatic potential jump is imposed across the perpendicular shock, and when the particle traverses a rotational shear layer with a small normal component of the magnetic field.

  13. Large Scale Skill in Regional Climate Modeling and the Lateral Boundary Condition Scheme

    NASA Astrophysics Data System (ADS)

    Veljović, K.; Rajković, B.; Mesinger, F.

    2009-04-01

    Several points are made concerning the somewhat controversial issue of regional climate modeling: should a regional climate model (RCM) be expected to maintain the large scale skill of the driver global model that is supplying its lateral boundary condition (LBC)? Given that this is normally desired, is it able to do so without help via the fairly popular large scale nudging? Specifically, without such nudging, will the RCM kinetic energy necessarily decrease with time compared to that of the driver model or analysis data, as suggested by a study using the Regional Atmospheric Modeling System (RAMS)? Finally, can the lateral boundary condition scheme make a difference: is the almost universally used but somewhat costly relaxation scheme necessary for a desirable RCM performance? Experiments are made to explore these questions by running the Eta model in two versions differing in the lateral boundary scheme used. One of these schemes is the traditional relaxation scheme; the other is the Eta model scheme, in which information is used at the outermost boundary only, and not all variables are prescribed at the outflow boundary. Forecast lateral boundary conditions are used, and results are verified against the analyses. Thus, the skill of the two RCM forecasts can be and is compared not only against each other but also against that of the driver global forecast. A novel verification method is used in the manner of customary precipitation verification, in that the forecast spatial wind speed distribution is verified against analyses by calculating bias-adjusted equitable threat scores and bias scores for wind speeds greater than chosen thresholds. In this way, focusing on a high wind speed value in the upper troposphere, we suggest that verification of large scale features can be done in a manner that may be more physically meaningful than verification via spectral decomposition, which is a standard RCM verification method. The results we have at this point are somewhat
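    The threat and bias scores used in this verification derive from a 2x2 contingency table of forecast versus analyzed threshold exceedances. A minimal sketch of the standard (unadjusted) formulas with hypothetical counts; the study's bias-adjusted variant involves an additional correction not shown here:

```python
def bias_and_ets(hits, false_alarms, misses, correct_negatives):
    """Frequency bias and equitable threat score from a 2x2 contingency table."""
    n = hits + false_alarms + misses + correct_negatives
    bias = (hits + false_alarms) / (hits + misses)           # forecast/observed frequency
    hits_random = (hits + misses) * (hits + false_alarms) / n  # hits expected by chance
    ets = (hits - hits_random) / (hits + misses + false_alarms - hits_random)
    return bias, ets

# Hypothetical counts of grid points exceeding a wind-speed threshold
bias, ets = bias_and_ets(hits=120, false_alarms=40, misses=30, correct_negatives=810)
print(f"bias {bias:.2f}  ETS {ets:.2f}")
```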

  14. Contribution of Field Strength Gradients to the Net Vertical Current of Active Regions

    NASA Astrophysics Data System (ADS)

    Vemareddy, P.

    2017-12-01

    We examined the contribution of field strength gradients to the degree of net vertical current (NVC) neutralization in active regions (ARs). We used photospheric vector magnetic field observations of AR 11158 obtained by the Helioseismic and Magnetic Imager on board SDO and by Hinode. The vertical component of the electric current is decomposed into twist and shear terms. The NVC exhibits systematic evolution owing to the presence of the sheared polarity inversion line between rotating and shearing magnetic regions. We found that the sign of the shear current distribution is opposite, in a majority of pixels (60%-65%), to that of the twist current distribution, and that its time profile bears no systematic trend. This result indicates that the gradient of magnetic field strength contributes a current of opposite sign, though smaller magnitude, to that contributed by the magnetic field direction in the vertical component of the current. Consequently, the net value of the shear current is negative in both polarity regions, which, when added to the net twist current, reduces the direct current value in the north (Bz > 0) polarity, resulting in a higher degree of NVC neutralization. We conjecture that the observed opposite signs of shear and twist currents are an indication, according to Parker, that the direct volume currents of flux tubes are canceled by their return currents, which are contributed by field strength gradients. Furthermore, with increasing spatial resolution, we found higher values of the twist and shear current distributions. The increased resolution mainly helps to resolve the field strength gradients, and therefore suggests a larger contribution from the shear current to the degree of NVC neutralization.
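    Writing the horizontal field as Bx = Bh cos(theta), By = Bh sin(theta), the vertical current Jz ∝ ∂By/∂x − ∂Bx/∂y splits into a twist term from gradients of the direction theta and a shear term from gradients of the strength Bh. A numerical sketch on a synthetic field (an illustration of this decomposition, not the paper's code):

```python
import numpy as np

# Synthetic horizontal field: strength Bh and azimuth th on a 2-D grid
x = np.linspace(-2, 2, 400)
y = np.linspace(-2, 2, 400)
X, Y = np.meshgrid(x, y)                   # arrays indexed [y, x]
Bh = 1000.0 * np.exp(-(X**2 + Y**2))       # field strength (e.g. gauss)
th = 0.4 * X + 0.3 * Y                     # field azimuth (rad)
Bx, By = Bh * np.cos(th), Bh * np.sin(th)

ddx = lambda f: np.gradient(f, x, axis=1)
ddy = lambda f: np.gradient(f, y, axis=0)

jz_total = ddx(By) - ddy(Bx)                                    # ∝ vertical current
jz_twist = Bh * (np.cos(th) * ddx(th) + np.sin(th) * ddy(th))   # direction gradients
jz_shear = np.sin(th) * ddx(Bh) - np.cos(th) * ddy(Bh)          # strength gradients

# Product rule: twist + shear reproduces the total (up to finite-difference error)
err = np.max(np.abs(jz_twist + jz_shear - jz_total))
print(f"max decomposition residual: {err:.2e}")
```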

  15. Soil-geographical regionalization as a basis for digital soil mapping: Karelia case study

    NASA Astrophysics Data System (ADS)

    Krasilnikov, P.; Sidorova, V.; Dubrovina, I.

    2010-12-01

    Recent development of digital soil mapping (DSM) has significantly improved the quality of soil maps. We set out to build a set of empirical models for the territory of Karelia, a republic in the north-east of the European part of the Russian Federation. This territory was selected for the DSM pilot study for two reasons. First, the soils of the region are mainly monogenetic; thus, the effect of the paleogeographic environment on recent soils is reduced. Second, the territory was poorly mapped because of low agricultural development: only 1.8% of the total area of the republic is used for agriculture and has large-scale soil maps. The rest of the territory has only small-scale soil maps, compiled on the basis of general geographic concepts rather than field surveys. Thus, the only solution for soil inventory was predictive digital mapping. The absence of large-scale soil maps did not allow data mining from previous soil surveys, and only empirical models could be applied. For regionalization purposes, we accepted the division into Northern and Southern Karelia proposed in the general scheme of soil regionalization of Russia; boundaries between the regions were somewhat modified. Within each region, we specified from 15 (Northern Karelia) to 32 (Southern Karelia) individual soilscapes and proposed soil-topographic and soil-lithological relationships for every soilscape. Further field verification is needed to adjust the models.

  16. Proton Therapy Verification with PET Imaging

    PubMed Central

    Zhu, Xuping; Fakhri, Georges El

    2013-01-01

    Proton therapy is very sensitive to uncertainties introduced during treatment planning and dose delivery. PET imaging of proton induced positron emitter distributions is the only practical approach for in vivo, in situ verification of proton therapy. This article reviews the current status of proton therapy verification with PET imaging. The different data detecting systems (in-beam, in-room and off-line PET), calculation methods for the prediction of proton induced PET activity distributions, and approaches for data evaluation are discussed. PMID:24312147

  17. Formal verification of an oral messages algorithm for interactive consistency

    NASA Technical Reports Server (NTRS)

    Rushby, John

    1992-01-01

    The formal specification and verification of an algorithm for Interactive Consistency based on the Oral Messages algorithm for Byzantine Agreement is described. We compare our treatment with that of Bevier and Young, who presented a formal specification and verification for a very similar algorithm. Unlike Bevier and Young, who observed that 'the invariant maintained in the recursive subcases of the algorithm is significantly more complicated than is suggested by the published proof' and who found its formal verification 'a fairly difficult exercise in mechanical theorem proving,' our treatment is very close to the previously published analysis of the algorithm, and our formal specification and verification are straightforward. This example illustrates how delicate choices in the formulation of the problem can have significant impact on the readability of its formal specification and on the tractability of its formal verification.

  18. Magnetic Field Diagnostics and Spatio-Temporal Variability of the Solar Transition Region

    NASA Astrophysics Data System (ADS)

    Peter, H.

    2013-12-01

    Magnetic field diagnostics of the transition region from the chromosphere to the corona present the problem that one has to apply extreme-ultraviolet (EUV) spectro-polarimetry. While techniques already exist for coronal diagnostics, in the form of infrared coronagraphy above the limb and radio observations on the disk, for the transition region one has to turn to EUV observations. So far the success of such observations has been limited, but various current projects aim to obtain spectro-polarimetric data in the extreme UV in the near future. It is therefore timely to study the polarimetric signals we can expect from these observations through realistic forward modeling. We employ a 3D magneto-hydrodynamic (MHD) forward model of the solar corona and synthesize the Stokes I and Stokes V profiles of C iv (1548 Å). A signal well above 0.001 in Stokes V can be expected even if one integrates for several minutes to reach the required signal-to-noise ratio, and despite the rapidly changing intensity in the model (just as in observations). This variability of the intensity is often used as an argument against transition-region magnetic diagnostics, which require exposure times of minutes. However, the magnetic field evolves much more slowly than the intensity, and therefore the degree of (circular) polarization remains rather constant when one integrates in time. Our study shows that it is possible to measure the transition region magnetic field if a polarimetric accuracy on the order of 0.001 can be reached, which we can expect from planned instrumentation.
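    The time-integration argument can be illustrated with a toy model: if the polarization degree p = V/I tracks the slowly evolving field while the intensity I fluctuates rapidly, the ratio of the separately integrated signals still recovers p (all numbers below are synthetic and illustrative, not model output):

```python
import numpy as np

rng = np.random.default_rng(2)
t = np.arange(600.0)                                    # seconds of simulated observation
# Slowly evolving field -> slowly varying circular-polarization degree near 0.2%
p = 0.002 * (1 + 0.05 * np.sin(2 * np.pi * t / 3000))
I = np.exp(0.3 * rng.standard_normal(t.size))           # strongly fluctuating intensity
V = p * I                                               # Stokes V follows the field

# Integrating V and I separately in time preserves the polarization degree
ratio = V.sum() / I.sum()
print(f"intensity variability {I.std() / I.mean():.2f}, "
      f"integrated V/I {ratio:.5f}, mean p {p.mean():.5f}")
```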

  19. ENVIRONMENTAL TECHNOLOGY VERIFICATION: Separation of Manure Solids from Flushed Swine Waste. Hoffland Environmental Inc. Drag Screen and Clarifier

    EPA Science Inventory

    Verification testing of the Hoffland Drag Screen and Clarifier was conducted at the North Carolina State University's Lake Wheeler Road Field Laboratory, in Raleigh, North Carolina. The farm is designed to operate as a research and teaching facility with the capacity for 250 so...

  20. 10 CFR 60.47 - Facility information and verification.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 10 Energy 2 2010-01-01 2010-01-01 false Facility information and verification. 60.47 Section 60.47 Energy NUCLEAR REGULATORY COMMISSION (CONTINUED) DISPOSAL OF HIGH-LEVEL RADIOACTIVE WASTES IN GEOLOGIC REPOSITORIES Licenses Us/iaea Safeguards Agreement § 60.47 Facility information and verification. (a) In...

  1. Three Lectures on Theorem-proving and Program Verification

    NASA Technical Reports Server (NTRS)

    Moore, J. S.

    1983-01-01

    Topics concerning theorem proving and program verification are discussed, with particular emphasis on the Boyer/Moore theorem prover, and approaches to program verification such as the functional and interpreter methods and the inductive assertion approach. A history of the discipline and specific program examples are included.

  2. 10 CFR 61.32 - Facility information and verification.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... 10 Energy 2 2012-01-01 2012-01-01 false Facility information and verification. 61.32 Section 61.32 Energy NUCLEAR REGULATORY COMMISSION (CONTINUED) LICENSING REQUIREMENTS FOR LAND DISPOSAL OF RADIOACTIVE WASTE Licenses Us/iaea Safeguards Agreement § 61.32 Facility information and verification. (a) In...

  3. 10 CFR 61.32 - Facility information and verification.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... 10 Energy 2 2014-01-01 2014-01-01 false Facility information and verification. 61.32 Section 61.32 Energy NUCLEAR REGULATORY COMMISSION (CONTINUED) LICENSING REQUIREMENTS FOR LAND DISPOSAL OF RADIOACTIVE WASTE Licenses Us/iaea Safeguards Agreement § 61.32 Facility information and verification. (a) In...

  4. 10 CFR 61.32 - Facility information and verification.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... 10 Energy 2 2011-01-01 2011-01-01 false Facility information and verification. 61.32 Section 61.32 Energy NUCLEAR REGULATORY COMMISSION (CONTINUED) LICENSING REQUIREMENTS FOR LAND DISPOSAL OF RADIOACTIVE WASTE Licenses Us/iaea Safeguards Agreement § 61.32 Facility information and verification. (a) In...

  5. Distilling the Verification Process for Prognostics Algorithms

    NASA Technical Reports Server (NTRS)

    Roychoudhury, Indranil; Saxena, Abhinav; Celaya, Jose R.; Goebel, Kai

    2013-01-01

    The goal of prognostics and health management (PHM) systems is to ensure system safety, and reduce downtime and maintenance costs. It is important that a PHM system is verified and validated before it can be successfully deployed. Prognostics algorithms are integral parts of PHM systems. This paper investigates a systematic process of verification of such prognostics algorithms. To this end, first, this paper distinguishes between technology maturation and product development. Then, the paper describes the verification process for a prognostics algorithm as it moves up to higher maturity levels. This process is shown to be an iterative process where verification activities are interleaved with validation activities at each maturation level. In this work, we adopt the concept of technology readiness levels (TRLs) to represent the different maturity levels of a prognostics algorithm. It is shown that at each TRL, the verification of a prognostics algorithm depends on verifying the different components of the algorithm according to the requirements laid out by the PHM system that adopts this prognostics algorithm. Finally, using simplified examples, the systematic process for verifying a prognostics algorithm is demonstrated as the prognostics algorithm moves up TRLs.

  6. Verification of the FtCayuga fault-tolerant microprocessor system. Volume 1: A case study in theorem prover-based verification

    NASA Technical Reports Server (NTRS)

    Srivas, Mandayam; Bickford, Mark

    1991-01-01

    The design and formal verification of a hardware system for a task that is an important component of a fault tolerant computer architecture for flight control systems is presented. The hardware system implements an algorithm for obtaining interactive consistency (Byzantine agreement) among four microprocessors as a special instruction on the processors. The property verified ensures that an execution of the special instruction by the processors correctly accomplishes interactive consistency, provided certain preconditions hold. An assumption is made that the processors execute synchronously. For verification, the authors used a computer-aided hardware design verification tool, Spectool, and the theorem prover, Clio. A major contribution of the work is the demonstration of a significant fault tolerant hardware design that is mechanically verified by a theorem prover.

  7. MULTI-WAVELENGTH STUDY OF A DELTA-SPOT. I. A REGION OF VERY STRONG, HORIZONTAL MAGNETIC FIELD

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jaeggli, S. A., E-mail: sarah.jaeggli@nasa.gov

    Active region NOAA 11035 appeared in 2009 December, early in the new solar activity cycle. This region achieved a delta sunspot (δ spot) configuration when parasitic flux emerged near the rotationally leading magnetic polarity and traveled through the penumbra of the largest sunspot in the group. Both visible and infrared imaging spectropolarimetry of the magnetically sensitive Fe i line pairs at 6302 and 15650 Å show large Zeeman splitting in the penumbra between the parasitic umbra and the main sunspot umbra. The polarized Stokes spectra in the strongest field region display anomalous profiles, and strong blueshifts are seen in an adjacent region. Analysis of the profiles is carried out using a Milne–Eddington inversion code capable of fitting either a single magnetic component with stray light or two independent magnetic components to verify the field strength. The inversion results show that the anomalous profiles cannot be produced by the combination of two profiles with moderate magnetic fields. The largest field strengths are 3500–3800 G in close proximity to blueshifts as strong as 3.8 km s⁻¹. The strong, nearly horizontal magnetic field seen near the polarity inversion line in this region is difficult to understand in the context of a standard model of sunspot magnetohydrostatic equilibrium.

  8. Evolution of the Active Region NOAA 12443 based on magnetic field extrapolations: preliminary results

    NASA Astrophysics Data System (ADS)

    Chicrala, André; Dallaqua, Renato Sergio; Antunes Vieira, Luis Eduardo; Dal Lago, Alisson; Rodríguez Gómez, Jenny Marcela; Palacios, Judith; Coelho Stekel, Tardelli Ronan; Rezende Costa, Joaquim Eduardo; da Silva Rockenbach, Marlos

    2017-10-01

    The behavior of Active Regions (ARs) is directly related to the occurrence of some remarkable phenomena on the Sun, such as solar flares and coronal mass ejections (CMEs). In this sense, changes in the magnetic field of a region can be used to uncover other relevant features, such as the evolution of the AR's magnetic structure and the plasma flow related to it. In this work we describe the evolution of the magnetic structure of the active region AR NOAA 12443, observed from 2015/10/30 to 2015/11/10, which may be associated with several X-ray flares of classes C and M. The analysis is based on observations of the solar surface and atmosphere provided by the HMI and AIA instruments on board the SDO spacecraft. In order to investigate the magnetic energy buildup and release of the ARs, we employ potential and linear force-free extrapolations based on the solar surface magnetic field distribution and the photospheric velocity fields.
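    Potential-field extrapolation, the simplest of the extrapolations mentioned, is often implemented in Fourier space, where each horizontal mode of the photospheric Bz decays with height as exp(−|k|z). A minimal periodic-domain sketch (a generic method, not the authors' implementation, which also includes linear force-free fields):

```python
import numpy as np

def potential_extrapolate_bz(bz0, dx, dy, z):
    """Upward-continue the vertical field Bz of a potential field to height z
    on a periodic domain, using Bz(k, z) = Bz(k, 0) * exp(-|k| z)."""
    ny, nx = bz0.shape
    kx = 2 * np.pi * np.fft.fftfreq(nx, d=dx)
    ky = 2 * np.pi * np.fft.fftfreq(ny, d=dy)
    KX, KY = np.meshgrid(kx, ky)
    k = np.hypot(KX, KY)
    return np.real(np.fft.ifft2(np.fft.fft2(bz0) * np.exp(-k * z)))

# Single-mode test field: amplitude should drop by exp(-4 * 0.1) at z = 0.1
L, n = 2 * np.pi, 128
x = np.linspace(0, L, n, endpoint=False)
bz0 = np.cos(4 * x)[None, :] * np.ones((n, 1))
bz = potential_extrapolate_bz(bz0, L / n, L / n, z=0.1)
print(f"amplitude ratio: {bz.max() / bz0.max():.4f}")   # ~ exp(-0.4) = 0.6703
```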

  9. Using the full tensor of GOCE gravity gradients for regional gravity field modelling

    NASA Astrophysics Data System (ADS)

    Lieb, Verena; Bouman, Johannes; Dettmering, Denise; Fuchs, Martin; Schmidt, Michael

    2013-04-01

    With its 3-axis gradiometer, GOCE delivers 3-dimensional (3D) information on the Earth's gravity field. This essential advantage - e.g. compared with the 1D gravity field information from GRACE - can be used for research on the Earth's interior and for geophysical exploration. To benefit from this multidimensional measurement system, the combination of all 6 GOCE gradients, and additionally the consistent combination with other gravity observations, poses an innovative challenge for regional gravity field modelling. As the individual gravity gradients reflect the gravity field along different spatial directions, observation equations are formulated separately for each of these components. In our approach we use spherical localizing base functions to represent the gravity field for specified regions. Therefore the series expansions based on Legendre polynomials have to be adapted to obtain mathematical expressions for the second derivatives of the gravitational potential, which are observed by GOCE in the Cartesian Gradiometer Reference Frame (GRF). We (1) have to transform the equations from the spherical terrestrial frame into a Cartesian Local North-Oriented Reference Frame (LNOF), (2) set up a 3x3 tensor of observation equations and (3) finally rotate the tensor defined in the terrestrial LNOF into the GRF. Thus we ensure the use of the original, non-rotated and unaffected GOCE measurements within the analysis procedure. As output from the synthesis procedure we then obtain the second derivatives of the gravitational potential for all combinations of the xyz Cartesian coordinates in the LNOF. Further, the implementation of variance component estimation provides a flexible tool to adjust the influence of the input gradiometer observations. On the one hand, the less accurate xy and yz measurements are nearly excluded by estimating large variance components. On the other hand, the yy measurements, which show systematic errors increasing at high latitudes, could be
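    Step (3) above, rotating the tensor of observation equations between frames, is a similarity transformation V_GRF = R V_LNOF Rᵀ. A minimal sketch with a hypothetical rotation matrix (in practice the GOCE attitude comes from gradiometer and star-tracker data):

```python
import numpy as np

def rotate_gradient_tensor(v_lnof, r):
    """Rotate a gravitational gradient tensor with rotation matrix r (similarity transform)."""
    return r @ v_lnof @ r.T

# Symmetric, trace-free tensor in the LNOF (Laplace: Vxx + Vyy + Vzz = 0), in Eotvos
v_lnof = np.array([[-1.3, 0.2, 0.4],
                   [ 0.2, -1.5, 0.1],
                   [ 0.4, 0.1, 2.8]])

# Hypothetical attitude: a 30-degree rotation about the z axis
c, s = np.cos(np.radians(30)), np.sin(np.radians(30))
r = np.array([[c, -s, 0.0], [s, c, 0.0], [0.0, 0.0, 1.0]])

v_grf = rotate_gradient_tensor(v_lnof, r)
# Symmetry and the trace (Laplace condition) are preserved by the rotation
print(np.trace(v_grf), np.allclose(v_grf, v_grf.T))
```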

  10. A Roadmap for the Implementation of Continued Process Verification.

    PubMed

    Boyer, Marcus; Gampfer, Joerg; Zamamiri, Abdel; Payne, Robin

    2016-01-01

    In 2014, the members of the BioPhorum Operations Group (BPOG) produced a 100-page continued process verification case study, entitled "Continued Process Verification: An Industry Position Paper with Example Protocol". This case study captures the thought processes involved in creating a continued process verification plan for a new product in response to the U.S. Food and Drug Administration's guidance on the subject introduced in 2011. In so doing, it provided the specific example of a plan developed for a new monoclonal antibody product based on the "A MAb Case Study" that preceded it in 2009. This document provides a roadmap that draws on the content of the continued process verification case study to provide a step-by-step guide in a more accessible form, with reference to a process map of the product life cycle. It could be used as a basis for continued process verification implementation in a number of different scenarios: for a single product and process; for a single site; to assist in the sharing of data monitoring responsibilities among sites; or to assist in establishing data monitoring agreements between a customer company and a contract manufacturing organization. The U.S. Food and Drug Administration issued guidance on the management of manufacturing processes designed to improve quality and control of drug products. This involved increased focus on regular monitoring of manufacturing processes, reporting of the results, and the taking of opportunities to improve. The guidance and the practice associated with it are known as continued process verification. This paper summarizes good practice in responding to continued process verification guidance, gathered from subject matter experts in the biopharmaceutical industry. © PDA, Inc. 2016.

  11. Land surface Verification Toolkit (LVT)

    NASA Technical Reports Server (NTRS)

    Kumar, Sujay V.

    2017-01-01

    LVT is a framework developed to provide an automated, consolidated environment for systematic land surface model evaluation. It includes support for a range of in-situ, remote-sensing, and other model and reanalysis products, and supports the analysis of outputs from various LIS subsystems, including LIS-DA, LIS-OPT, and LIS-UE. Note: The Land Information System Verification Toolkit (LVT) is a NASA software tool designed to enable the evaluation, analysis, and comparison of outputs generated by the Land Information System (LIS). The LVT software is released under the terms and conditions of the NASA Open Source Agreement (NOSA) Version 1.1 or later.

  12. EOS-AM precision pointing verification

    NASA Technical Reports Server (NTRS)

    Throckmorton, A.; Braknis, E.; Bolek, J.

    1993-01-01

    The Earth Observing System (EOS) AM mission requires tight pointing knowledge to meet scientific objectives, in a spacecraft with low-frequency flexible appendage modes. As the spacecraft controller reacts to various disturbance sources, and as the inherent appendage modes are excited by this control action, verification of precision pointing knowledge becomes particularly challenging for the EOS-AM mission. As presently conceived, this verification includes a complementary set of multi-disciplinary analyses, hardware tests, and real-time computer-in-the-loop simulations, followed by collection and analysis of hardware test and flight data, supported by a comprehensive database repository of validated program values.

  13. 78 FR 27882 - VA Veteran-Owned Small Business (VOSB) Verification Guidelines

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-05-13

    ... Verification Self-Assessment Tool that walks the veteran through the regulation and how it applies to the...) Verification Guidelines AGENCY: Department of Veterans Affairs. ACTION: Advanced notice of proposed rulemaking... regulations governing the Department of Veterans Affairs (VA) Veteran-Owned Small Business (VOSB) Verification...

  14. 78 FR 32010 - Pipeline Safety: Public Workshop on Integrity Verification Process

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-05-28

    .... PHMSA-2013-0119] Pipeline Safety: Public Workshop on Integrity Verification Process AGENCY: Pipeline and... announcing a public workshop to be held on the concept of ``Integrity Verification Process.'' The Integrity Verification Process shares similar characteristics with fitness for service processes. At this workshop, the...

  15. Crowd Sourced Formal Verification-Augmentation (CSFV-A)

    DTIC Science & Technology

    2016-06-01

    The Crowd Sourced Formal Verification (CSFV) program built games that recast FV problems into puzzles to make these problems more accessible, increasing the manpower to...construct FV proofs. This effort supported the CSFV program by hosting the games on a public website, and analyzed the gameplay for efficiency to...provide FV proofs.

  16. 24 CFR 960.259 - Family information and verification.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... 24 Housing and Urban Development 4 2011-04-01 2011-04-01 false Family information and verification... URBAN DEVELOPMENT ADMISSION TO, AND OCCUPANCY OF, PUBLIC HOUSING Rent and Reexamination § 960.259 Family information and verification. (a) Family obligation to supply information. (1) The family must supply any...

  17. 24 CFR 960.259 - Family information and verification.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... 24 Housing and Urban Development 4 2010-04-01 2010-04-01 false Family information and verification... URBAN DEVELOPMENT ADMISSION TO, AND OCCUPANCY OF, PUBLIC HOUSING Rent and Reexamination § 960.259 Family information and verification. (a) Family obligation to supply information. (1) The family must supply any...

  18. Student-Teacher Linkage Verification: Model Process and Recommendations

    ERIC Educational Resources Information Center

    Watson, Jeffery; Graham, Matthew; Thorn, Christopher A.

    2012-01-01

    As momentum grows for tracking the role of individual educators in student performance, school districts across the country are implementing projects that involve linking teachers to their students. Programs that link teachers to student outcomes require a verification process for student-teacher linkages. Linkage verification improves accuracy by…

  19. Characterization of methanol as a magnetic field tracer in star-forming regions

    NASA Astrophysics Data System (ADS)

    Lankhaar, Boy; Vlemmings, Wouter; Surcis, Gabriele; van Langevelde, Huib Jan; Groenenboom, Gerrit C.; van der Avoird, Ad

    2018-02-01

    Magnetic fields play an important role during star formation [1]. Direct magnetic field strength observations have proven particularly challenging in the extremely dynamic protostellar phase [2-4]. Because of their occurrence in the densest parts of star-forming regions, masers, through polarization observations, are the main source of magnetic field strength and morphology measurements around protostars [2]. Of all maser species, methanol is one of the strongest and most abundant tracers of gas around high-mass protostellar disks and in outflows. However, as experimental determination of the magnetic characteristics of methanol has remained largely unsuccessful [5], a robust magnetic field strength analysis of these regions could hitherto not be performed. Here, we report a quantitative theoretical model of the magnetic properties of methanol, including the complicated hyperfine structure that results from its internal rotation [6]. We show that the large range in values of the Landé g factors of the hyperfine components of each maser line leads to conclusions that differ substantially from the current interpretation based on a single effective g factor. These conclusions are more consistent with other observations [7,8] and confirm the presence of dynamically important magnetic fields around protostars. Additionally, our calculations show that (nonlinear) Zeeman effects must be taken into account to further enhance the accuracy of cosmological electron-to-proton mass ratio determinations using methanol [9-12].
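    The "(nonlinear) Zeeman effects" mentioned above go beyond the familiar linear regime. For orientation, the standard linear Zeeman shift of a hyperfine level with Landé factor g and magnetic quantum number m_F in a field B is (a textbook relation, not a formula taken from this paper):

    \[ \Delta E_Z = g \,\mu_N\, B\, m_F , \]

    where \mu_N is the nuclear magneton. The paper's central point is that g varies widely among the hyperfine components of each methanol maser line, so no single effective g factor characterizes a line.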

  20. The use of robots for arms control treaty verification

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Michalowski, S.J.

    1991-01-01

    Many aspects of the superpower relationship now present a new set of challenges and opportunities, including the vital area of arms control. This report addresses one such possibility: the use of robots for the verification of arms control treaties. The central idea of this report is far from commonly accepted; in fact, it was encountered only once in the bibliographic review phase of the project. Nonetheless, the incentive for using robots is simple and coincides with that of industrial applications: to replace or supplement human activity in the performance of tasks for which human participation is unnecessary, undesirable, impossible, too dangerous or too expensive. As in industry, robots should replace workers (in this case, arms control inspectors) only when questions of efficiency, reliability, safety, security and cost-effectiveness have been answered satisfactorily. In writing this report, it is not our purpose to strongly advocate the application of robots in verification. Rather, we wish to explore the significant aspects, pro and con, of applying experience from the field of flexible automation to the complex task of assuring arms control treaty compliance. We want to establish a framework for further discussion of this topic and to define criteria for evaluating future proposals. The author's expertise is in robots, not arms control. His practical experience has been in developing systems for use in the rehabilitation of severely disabled persons (such as quadriplegics), who can use robots for assistance during activities of everyday living, as well as in vocational applications. This creates a special interest in implementations that, in some way, include a human operator in the control scheme of the robot. As we hope to show in this report, such interactive systems offer the greatest promise of making a contribution to the challenging problems of treaty verification. 15 refs.

  1. Verification of the NWP models operated at ICM, Poland

    NASA Astrophysics Data System (ADS)

    Melonek, Malgorzata

    2010-05-01

    The Interdisciplinary Centre for Mathematical and Computational Modelling, University of Warsaw (ICM) started its activity in the field of NWP in May 1997. Since that time, numerical weather forecasts covering Central Europe have been routinely published on our publicly available website. The first NWP model used at ICM was the hydrostatic Unified Model developed by the UK Meteorological Office, run as a mesoscale version with a horizontal resolution of 17 km and 31 vertical levels. At present, two non-hydrostatic NWP models are running in quasi-operational regime. The main model, a new UM configuration with 4 km horizontal resolution, 38 vertical levels and a forecast range of 48 hours, is running four times a day. The second, the COAMPS model (Coupled Ocean/Atmosphere Mesoscale Prediction System) developed by the US Naval Research Laboratory, configured with three nested grids (with corresponding resolutions of 39 km, 13 km and 4.3 km, and 30 vertical levels), is running twice a day (for 00 and 12 UTC). The second grid covers Central Europe and has a forecast range of 84 hours. Results of both NWP models, i.e. COAMPS computed on the 13 km mesh and UM, are verified against observations from the Polish synoptic stations. Verification uses surface observations and nearest-grid-point forecasts. The following meteorological elements are verified: air temperature at 2 m, mean sea level pressure, wind speed and wind direction at 10 m, and 12-hour accumulated precipitation. Different statistical indices are presented. For continuous variables, the Mean Error (ME), Mean Absolute Error (MAE) and Root Mean Squared Error (RMSE) are computed in 6-hour intervals. In the case of precipitation, contingency tables for different thresholds are computed and some of the verification scores such as FBI, ETS, POD and FAR are presented graphically. The verification sample covers nearly one year.
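    The categorical precipitation scores named above (FBI, POD, FAR, ETS) all derive from a 2x2 contingency table of forecast versus observed events. A minimal sketch with hypothetical counts (not data from the study):

```python
# Dichotomous (yes/no) forecast verification scores from a 2x2 contingency
# table, as used for threshold-based precipitation verification.
# The counts below are hypothetical illustration values.

def contingency_scores(hits, false_alarms, misses, correct_negatives):
    """Return common categorical verification scores.

    FBI : frequency bias index (forecast yes / observed yes)
    POD : probability of detection
    FAR : false alarm ratio
    ETS : equitable threat score (Gilbert skill score)
    """
    n = hits + false_alarms + misses + correct_negatives
    fbi = (hits + false_alarms) / (hits + misses)
    pod = hits / (hits + misses)
    far = false_alarms / (hits + false_alarms)
    # Hits expected by random chance, which makes the threat score "equitable".
    hits_random = (hits + false_alarms) * (hits + misses) / n
    ets = (hits - hits_random) / (hits + false_alarms + misses - hits_random)
    return {"FBI": fbi, "POD": pod, "FAR": far, "ETS": ets}

scores = contingency_scores(hits=80, false_alarms=20, misses=40,
                            correct_negatives=860)
print(scores)
```

    An unbiased forecast has FBI near 1; ETS discounts hits that would occur by chance, so it is the usual headline score for rare-event thresholds.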

  2. Regional-Scale Surface Magnetic Fields and Proton Fluxes to Mercury's Surface from Proton-Reflection Magnetometry

    NASA Astrophysics Data System (ADS)

    Winslow, R. M.; Johnson, C. L.; Anderson, B. J.; Gershman, D. J.; Raines, J. M.; Lillis, R. J.; Korth, H.; Slavin, J. A.; Solomon, S. C.; Zurbuchen, T.

    2014-12-01

    The application of a recently developed proton-reflection magnetometry technique to MESSENGER spacecraft observations at Mercury has yielded two significant findings. First, loss-cone observations directly confirm particle precipitation to Mercury's surface and indicate that solar wind plasma persistently bombards the planet not only in the magnetic cusp regions but over a large fraction of the southern hemisphere. Second, the inferred surface field strengths independently confirm the north-south asymmetry in Mercury's global magnetic field structure first documented from observations of magnetic equator crossings. Here we extend this work with 1.5 additional years of observations (i.e., to 2.5 years in all) to further probe Mercury's surface magnetic field and better resolve proton flux precipitation to the planet's surface. We map regions where proton loss cones are observed; these maps indicate regions where protons precipitate directly onto the surface. The augmentation of our data set over that used in our original study allows us to examine the proton loss cones in cells of dimension 10° latitude by 20° longitude in Mercury body-fixed coordinates. We observe a transition from double-sided to single-sided loss cones in the pitch-angle distributions; this transition marks the boundary between open and closed field lines. At the surface this boundary lies between 60° and 70°N. Our observations allow the estimation of surface magnetic field strengths in the northern cusp region and the calculation of incident proton fluxes to both hemispheres. In the northern cusp, our regional-scale observations are consistent with an offset dipole field and a dipole moment of 190 nT R_M^3, where R_M is Mercury's radius, implying that any regional-scale variations in surface magnetic field strengths are either weak relative to the dipole field or occur at length scales smaller than the resolution of our observations (~300 km). From the global proton flux map (north of 40° S

  3. Active Region Photospheric Magnetic Properties Derived from Line-of-Sight and Radial Fields

    NASA Astrophysics Data System (ADS)

    Guerra, J. A.; Park, S.-H.; Gallagher, P. T.; Kontogiannis, I.; Georgoulis, M. K.; Bloomfield, D. S.

    2018-01-01

    The effect of using two representations of the normal-to-surface magnetic field to calculate photospheric measures that are related to the active region (AR) potential for flaring is presented. Several AR properties were computed using line-of-sight (B_los) and spherical-radial (B_r) magnetograms from the Space-weather HMI Active Region Patch (SHARP) products of the Solar Dynamics Observatory, characterizing the presence and features of magnetic polarity inversion lines, fractality, and magnetic connectivity of the AR photospheric field. The data analyzed correspond to ≈4,000 AR observations, achieved by randomly selecting 25% of days between September 2012 and May 2016 for analysis at 6-hr cadence. Results from this statistical study include: i) the B_r component results in a slight upwards shift of property values in a manner consistent with a field-strength underestimation by the B_los component; ii) using the B_r component results in significantly lower inter-property correlation in one-third of the cases, implying more independent information as regards the state of the AR photospheric magnetic field; iii) flaring rates for each property vary between the field components in a manner consistent with the differences in property-value ranges resulting from the components; iv) flaring rates generally increase for higher values of properties, except the Fourier spectral power index that has flare rates peaking around a value of 5/3. These findings indicate that there may be advantages in using B_r rather than B_los in calculating flare-related AR magnetic properties, especially for regions located far from central meridian.

  4. Electron diffusion region during magnetopause reconnection with an intermediate guide field: Magnetospheric multiscale observations

    NASA Astrophysics Data System (ADS)

    Chen, L.-J.; Hesse, M.; Wang, S.; Gershman, D.; Ergun, R. E.; Burch, J.; Bessho, N.; Torbert, R. B.; Giles, B.; Webster, J.; Pollock, C.; Dorelli, J.; Moore, T.; Paterson, W.; Lavraud, B.; Strangeway, R.; Russell, C.; Khotyaintsev, Y.; Lindqvist, P.-A.; Avanov, L.

    2017-05-01

    An electron diffusion region (EDR) in magnetic reconnection with a guide magnetic field approximately 0.2 times the reconnecting component is encountered by the four Magnetospheric Multiscale spacecraft at the Earth's magnetopause. The distinct substructures in the EDR on both sides of the reconnecting current sheet are visualized with electron distribution functions measured at a cadence 2 orders of magnitude higher than previously achieved, enabling the following new findings: (1) motion of the demagnetized electrons plays an important role in sustaining the reconnection current and contributes to the dissipation due to the nonideal electric field; (2) the finite guide field dominates over the Hall magnetic field in an electron-scale region in the exhaust and modifies the electron flow dynamics in the EDR; (3) the reconnection current is in part carried by inflowing field-aligned electrons in the magnetosphere part of the EDR; and (4) the reconnection electric field measured by multiple spacecraft is uniform over at least eight electron skin depths and corresponds to a reconnection rate of approximately 0.1. The observations establish the first look at the structure of the EDR under a weak but not negligible guide field.

  5. Overview of the TOPEX/Poseidon Platform Harvest Verification Experiment

    NASA Technical Reports Server (NTRS)

    Morris, Charles S.; DiNardo, Steven J.; Christensen, Edward J.

    1995-01-01

    An overview is given of the in situ measurement system installed on Texaco's Platform Harvest for verification of the sea level measurement from the TOPEX/Poseidon satellite. The prelaunch error budget suggested that the total root mean square (RMS) error due to measurements made at this verification site would be less than 4 cm. The actual error budget for the verification site is within these original specifications. However, evaluation of the sea level data from three measurement systems at the platform has resulted in unexpectedly large differences between the systems. Comparison of the sea level measurements from the different tide gauge systems has led to a better understanding of the problems of measuring sea level in relatively deep ocean. As of May 1994, the Platform Harvest verification site has successfully supported 60 TOPEX/Poseidon overflights.

  6. International Space Station Requirement Verification for Commercial Visiting Vehicles

    NASA Technical Reports Server (NTRS)

    Garguilo, Dan

    2017-01-01

    The COTS program demonstrated that NASA could rely on commercial providers for safe, reliable, and cost-effective cargo delivery to ISS. The ISS Program has developed a streamlined process to safely integrate commercial visiting vehicles and ensure requirements are met: levy a minimum requirement set (down from 1000s to 100s) focusing on the ISS interface and safety, reducing the level of NASA oversight/insight and the burden on the commercial partner. Partners provide a detailed verification and validation plan documenting how they will show they have met NASA requirements. NASA conducts process sampling to ensure that the established verification processes are being followed. NASA participates in joint verification events and analysis for requirements that both parties must verify. Verification compliance is approved by NASA, and launch readiness is certified at mission readiness reviews.

  7. Property-driven functional verification technique for high-speed vision system-on-chip processor

    NASA Astrophysics Data System (ADS)

    Nshunguyimfura, Victor; Yang, Jie; Liu, Liyuan; Wu, Nanjian

    2017-04-01

    The implementation of functional verification in a fast, reliable, and effective manner is a challenging task in a vision chip verification process. The main reason for this challenge is the stepwise nature of existing functional verification techniques. The complexity of vision chip verification is also related to the fact that, in most vision chip design cycles, extensive effort is focused on optimizing chip metrics such as performance, power, and area, while design functional verification is not explicitly considered at the earlier stages, where the soundest decisions are made. In this paper, we propose a semi-automatic property-driven verification technique in which the implementation of all verification components is based on design properties. We introduce a low-dimension property space between the specification space and the implementation space. The aim of this technique is to speed up the verification process for high-performance parallel-processing vision chips. Our experimental results show that the proposed technique can improve verification efficiency by up to 20% for a complex vision chip design while reducing simulation and debugging overheads.

  8. Challenges in High-Assurance Runtime Verification

    NASA Technical Reports Server (NTRS)

    Goodloe, Alwyn E.

    2016-01-01

    Safety-critical systems are growing more complex and becoming increasingly autonomous. Runtime Verification (RV) has the potential to provide protections when a system cannot be assured by conventional means, but only if the RV itself can be trusted. In this paper, we proffer a number of challenges to realizing high-assurance RV and illustrate how we have addressed them in our research. We argue that high-assurance RV provides a rich target for automated verification tools in hope of fostering closer collaboration among the communities.

  9. Palmprint verification using Lagrangian decomposition and invariant interest points

    NASA Astrophysics Data System (ADS)

    Gupta, P.; Rattani, A.; Kisku, D. R.; Hwang, C. J.; Sing, J. K.

    2011-06-01

    This paper presents a palmprint-based verification system using SIFT features and a Lagrangian network graph technique. We employ SIFT for feature extraction from palmprint images; the region of interest (ROI), extracted from the wide palm texture at the preprocessing stage, is considered for invariant point extraction. Finally, identity is established by finding a permutation matrix for a pair of reference and probe palm graphs drawn on the extracted SIFT features. The permutation matrix is used to minimize the distance between the two graphs. The proposed system has been tested on the CASIA and IITK palmprint databases, and experimental results reveal the effectiveness and robustness of the system.

  10. Formal Verification of the AAMP-FV Microcode

    NASA Technical Reports Server (NTRS)

    Miller, Steven P.; Greve, David A.; Wilding, Matthew M.; Srivas, Mandayam

    1999-01-01

    This report describes the experiences of Collins Avionics & Communications and SRI International in formally specifying and verifying the microcode in a Rockwell proprietary microprocessor, the AAMP-FV, using the PVS verification system. This project built extensively on earlier experiences using PVS to verify the microcode in the AAMP5, a complex, pipelined microprocessor designed for use in avionics displays and global positioning systems. While the AAMP5 experiment demonstrated the technical feasibility of formal verification of microcode, the steep learning curve encountered left unanswered the question of whether it could be performed at reasonable cost. The AAMP-FV project was conducted to determine whether the experience gained on the AAMP5 project could be used to make formal verification of microcode cost effective for safety-critical and high volume devices.

  11. Verification test report on a solar heating and hot water system

    NASA Technical Reports Server (NTRS)

    1978-01-01

    Information is provided on the development, qualification, and acceptance verification of commercial solar heating and hot water systems and components. The verification covers performance, efficiencies, and the various methods used (similarity, analysis, inspection, test, etc.) that are applicable to satisfying the verification requirements.

  12. Plasma Equilibrium in a Magnetic Field with Stochastic Regions

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    J.A. Krommes and Allan H. Reiman

    The nature of plasma equilibrium in a magnetic field with stochastic regions is examined. It is shown that the magnetic differential equation that determines the equilibrium Pfirsch-Schlüter currents can be cast in a form similar to various nonlinear equations for a turbulent plasma, allowing application of the mathematical methods of statistical turbulence theory. An analytically tractable model, previously studied in the context of resonance-broadening theory, is applied with particular attention paid to the periodicity constraints required in toroidal configurations. It is shown that even a very weak radial diffusion of the magnetic field lines can have a significant effect on the equilibrium in the neighborhood of the rational surfaces, strongly modifying the near-resonant Pfirsch-Schlüter currents. Implications for the numerical calculation of 3D equilibria are discussed.

  13. High-resolution fluence verification for treatment plan specific QA in ion beam radiotherapy

    NASA Astrophysics Data System (ADS)

    Martišíková, Mária; Brons, Stephan; Hesse, Bernd M.; Jäkel, Oliver

    2013-03-01

    Ion beam radiotherapy exploits the finite range of ion beams and the increased dose deposition of ions toward the end of their range in material. This results in high dose conformation to the target region, which can be further increased using scanning ion beams. The standard method for patient-plan verification in ion beam therapy is ionization chamber dosimetry. The spatial resolution of this method is given by the distance between the chambers (typically 1 cm). However, steep dose gradients created by scanning ion beams call for more information and improved spatial resolution. Here we propose a clinically applicable method, supplementary to standard patient-plan verification. It is based on ion fluence measurements in the entrance region with high spatial resolution in the plane perpendicular to the beam, separately for each energy slice. In this paper the usability of the RID256 L amorphous silicon flat-panel detector for the measurements proposed is demonstrated for carbon ion beams. The detector provides sufficient spatial resolution for this kind of measurement (pixel pitch 0.8 mm). The experiments were performed at the Heidelberg Ion-Beam Therapy Center in Germany. This facility is equipped with a synchrotron capable of accelerating ions from protons up to oxygen to energies between 48 and 430 MeV u^-1. Beam application is based on beam scanning technology. The measured signal corresponding to single energy slices was translated to ion fluence on a pixel-by-pixel basis, using a calibration that is dependent on energy and ion type. To quantify the agreement of the fluence distributions measured with those planned, a gamma-index criterion was used. In the patient field investigated excellent agreement was found between the two distributions. At least 95% of the slices contained more than 96% of points agreeing with our criteria. Due to the high spatial resolution, this method is especially valuable for measurements of strongly inhomogeneous fluence
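    The gamma-index criterion mentioned above combines a dose-difference tolerance and a distance-to-agreement tolerance into a single pass/fail number per point. A minimal generic 1D sketch (the profiles, the 3% dose tolerance, and the 3 mm distance criterion are hypothetical illustration values, not the authors' clinical 2D implementation):

```python
import math

def gamma_index(ref, meas, spacing_mm, dose_tol=0.03, dist_tol_mm=3.0):
    """Generic 1D gamma index: for each reference point, take the minimum
    over measured points of the combined dose-difference / distance metric.
    A point passes the comparison when gamma <= 1."""
    gammas = []
    for i, r in enumerate(ref):
        best = math.inf
        for j, m in enumerate(meas):
            ddose = (m - r) / max(abs(r), 1e-12)   # relative dose difference
            ddist = (j - i) * spacing_mm           # spatial separation in mm
            g = math.sqrt((ddose / dose_tol) ** 2 + (ddist / dist_tol_mm) ** 2)
            best = min(best, g)
        gammas.append(best)
    return gammas

# Hypothetical reference and measured profiles on a 1 mm grid.
ref = [1.00, 0.98, 0.50, 0.10]
meas = [1.01, 0.97, 0.52, 0.11]
g = gamma_index(ref, meas, spacing_mm=1.0)
pass_rate = sum(1 for x in g if x <= 1.0) / len(g)
```

    Reporting the fraction of points with gamma <= 1 (the pass rate) is the standard way such comparisons are summarized.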

  14. Temporal Specification and Verification of Real-Time Systems.

    DTIC Science & Technology

    1991-08-30

    of concrete real-time systems can be modeled adequately. Specification: We present two conservative extensions of temporal logic that allow for the...logic. We present both model-checking algorithms for the automatic verification of finite-state real-time systems and proof methods for the deductive verification of real-time systems.

  15. 20 CFR 212.5 - Verification of military service.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... 20 Employees' Benefits 1 2010-04-01 2010-04-01 false Verification of military service. 212.5... MILITARY SERVICE § 212.5 Verification of military service. Military service may be verified by the... armed forces that shows the beginning and ending dates of the individual's active military service; or a...

  16. On Verifying Currents and Other Features in the Hawaiian Islands Region Using Fully Coupled Ocean/Atmosphere Mesoscale Prediction System Compared to Global Ocean Model and Ocean Observations

    NASA Astrophysics Data System (ADS)

    Jessen, P. G.; Chen, S.

    2014-12-01

    This poster introduces and evaluates ocean features in the Hawaii, USA region using the U.S. Navy's fully Coupled Ocean/Atmosphere Mesoscale Prediction System (COAMPS-OS™) coupled to the Navy Coastal Ocean Model (NCOM). It also outlines some challenges in verifying ocean currents in the open ocean. The system is evaluated using in situ ocean data and initial forcing fields from the operational global Hybrid Coordinate Ocean Model (HYCOM). Verification shows difficulties in modelling downstream currents off the Hawaiian islands (Hawaii's wake). Comparing HYCOM to NCOM current fields shows some displacement of small features such as eddies. Generally, there is fair agreement between HYCOM and NCOM in the salinity and temperature fields, and good agreement in the SSH fields.

  17. 10 CFR 9.54 - Verification of identity of individuals making requests.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 10 Energy 1 2010-01-01 2010-01-01 false Verification of identity of individuals making requests. 9... About Them § 9.54 Verification of identity of individuals making requests. (a) Identification... respecting records about himself, except that no verification of identity shall be required if the records...

  18. 10 CFR 9.54 - Verification of identity of individuals making requests.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... 10 Energy 1 2013-01-01 2013-01-01 false Verification of identity of individuals making requests. 9... About Them § 9.54 Verification of identity of individuals making requests. (a) Identification... respecting records about himself, except that no verification of identity shall be required if the records...

  19. 10 CFR 9.54 - Verification of identity of individuals making requests.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... 10 Energy 1 2012-01-01 2012-01-01 false Verification of identity of individuals making requests. 9... About Them § 9.54 Verification of identity of individuals making requests. (a) Identification... respecting records about himself, except that no verification of identity shall be required if the records...

  20. 10 CFR 9.54 - Verification of identity of individuals making requests.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... 10 Energy 1 2014-01-01 2014-01-01 false Verification of identity of individuals making requests. 9... About Them § 9.54 Verification of identity of individuals making requests. (a) Identification... respecting records about himself, except that no verification of identity shall be required if the records...

  1. Offline signature verification using convolution Siamese network

    NASA Astrophysics Data System (ADS)

    Xing, Zi-Jian; Yin, Fei; Wu, Yi-Chao; Liu, Cheng-Lin

    2018-04-01

    This paper presents an offline signature verification approach using a convolutional Siamese neural network. Unlike existing methods, which treat feature extraction and metric learning as two independent stages, we adopt a deep-learning-based framework which combines the two stages and can be trained end-to-end. Experimental results on two public offline databases (GPDSsynthetic and CEDAR) demonstrate the superiority of our method on the offline signature verification problem.
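    After training, verification in such a Siamese setup reduces to thresholding the distance between the two embeddings produced by the shared network. A minimal sketch of that decision step (the embedding vectors and threshold here are hypothetical stand-ins for CNN outputs, not values from the paper):

```python
import math

def euclidean(a, b):
    """Euclidean distance between two embedding vectors."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def verify(emb_ref, emb_query, threshold):
    """Accept the query signature as genuine when the distance between
    the reference and query embeddings falls below the threshold."""
    return euclidean(emb_ref, emb_query) < threshold

genuine = [0.10, 0.90, 0.30]    # hypothetical embedding of a reference signature
query_ok = [0.12, 0.88, 0.33]   # a close (genuine) query
query_forged = [0.80, 0.20, 0.70]  # a distant (forged) query

accept_genuine = verify(genuine, query_ok, threshold=0.5)
accept_forged = verify(genuine, query_forged, threshold=0.5)
```

    In practice the threshold is tuned on a validation set to balance false acceptance against false rejection.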

  2. A high-resolution regional reanalysis for the European CORDEX region

    NASA Astrophysics Data System (ADS)

    Bollmeyer, Christoph; Keller, Jan; Ohlwein, Christian; Wahl, Sabrina

    2015-04-01

    Within the Hans-Ertel-Centre for Weather Research (HErZ), the climate monitoring branch concentrates its efforts on the assessment and analysis of regional climate in Germany and Europe. In joint cooperation with DWD (German Weather Service), a high-resolution reanalysis system based on the COSMO model has been developed. Reanalyses are gaining importance as a source of meteorological information for many purposes and applications. Several global reanalysis projects (e.g., ERA, MERRA, CFSR, JMA) produce and verify these data sets to provide time series that are as long as possible combined with high data quality. However, with spatial resolutions of only 50-70 km and 3-hourly temporal output, they are not suitable for small-scale problems (e.g., regional climate assessment, mesoscale NWP verification, input for subsequent models such as river runoff simulations, renewable energy applications). The implementation of regional reanalyses based on a limited-area model along with a data assimilation scheme can generate reanalysis data sets with high spatio-temporal resolution. The work presented here focuses on two regional reanalyses for Europe and Germany. The European reanalysis COSMO-REA6 matches the CORDEX EURO-11 specifications, albeit at a higher spatial resolution, i.e., 0.055° (6 km) instead of 0.11° (12 km). Nested into COSMO-REA6 is COSMO-REA2, a convective-scale reanalysis with 2 km resolution for Germany. COSMO-REA6 comprises the assimilation of observational data using the existing nudging scheme of COSMO and is complemented by a special soil moisture analysis and boundary conditions given by ERA-Interim data. COSMO-REA2 also uses the nudging scheme, complemented by latent heat nudging of radar information. The reanalysis data set currently covers 17 years (1997-2013) for COSMO-REA6 and 4 years (2010-2013) for COSMO-REA2, with a very large set of output variables and a high temporal output step of hourly 3D fields and quarter-hourly 2D fields. The evaluation

  3. Compressive sensing using optimized sensing matrix for face verification

    NASA Astrophysics Data System (ADS)

    Oey, Endra; Jeffry; Wongso, Kelvin; Tommy

    2017-12-01

    Biometrics offers a solution to problems that occur with password-based data access, for example the possibility of forgetting a password and the difficulty of recalling many different passwords. With biometrics, the physical characteristics of a person can be captured and used in the identification process. In this research, facial biometrics is used in the verification process to determine whether the user has the authority to access the data or not. Facial biometrics is chosen for its low-cost implementation and reasonably accurate user identification. The face verification system adopted in this research uses the Compressive Sensing (CS) technique, which aims to reduce dimension size as well as encrypt the facial test image, with the image represented as sparse signals. The encrypted data can be reconstructed using a sparse coding algorithm. Two types of sparse coding, namely Orthogonal Matching Pursuit (OMP) and Iteratively Reweighted Least Squares-ℓp (IRLS-ℓp), are compared in this face verification research. The reconstruction results of the sparse signals are then used to compute the Euclidean norm against the sparse signal of the user previously saved in the system, to determine the validity of the facial test image. The system accuracies obtained in this research are 99% for IRLS with a face verification response time of 4.917 seconds and 96.33% for OMP with a response time of 0.4046 seconds using a non-optimized sensing matrix, versus 99% for IRLS with a response time of 13.4791 seconds and 98.33% for OMP with a response time of 3.1571 seconds using an optimized sensing matrix.
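    The OMP algorithm named above admits a compact sketch: greedily pick the dictionary column most correlated with the residual, then re-fit by least squares on the selected support. This is a generic pure-Python illustration of recovering a sparse x from y = A·x; the matrix and signal are hypothetical toy values, not the facial data of the study:

```python
# Minimal Orthogonal Matching Pursuit (OMP) sketch in pure Python.

def column(A, j):
    return [row[j] for row in A]

def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

def solve(M, b):
    """Gaussian elimination with partial pivoting for a small dense system."""
    n = len(b)
    M = [row[:] + [b[i]] for i, row in enumerate(M)]
    for c in range(n):
        p = max(range(c, n), key=lambda r: abs(M[r][c]))
        M[c], M[p] = M[p], M[c]
        for r in range(c + 1, n):
            f = M[r][c] / M[c][c]
            for k in range(c, n + 1):
                M[r][k] -= f * M[c][k]
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        x[r] = (M[r][n] - sum(M[r][k] * x[k] for k in range(r + 1, n))) / M[r][r]
    return x

def omp(A, y, sparsity):
    m, n = len(A), len(A[0])
    support, residual = [], y[:]
    for _ in range(sparsity):
        # Greedy step: column most correlated with the current residual.
        j = max(range(n), key=lambda c: abs(dot(column(A, c), residual)))
        if j not in support:
            support.append(j)
        # Least squares on the selected columns via the normal equations.
        cols = [column(A, c) for c in support]
        G = [[dot(ci, cj) for cj in cols] for ci in cols]
        coef = solve(G, [dot(ci, y) for ci in cols])
        approx = [sum(coef[k] * cols[k][i] for k in range(len(cols)))
                  for i in range(m)]
        residual = [y[i] - approx[i] for i in range(m)]
    x = [0.0] * n
    for k, c in enumerate(support):
        x[c] = coef[k]
    return x

# Toy example: 3 measurements, 4 unit-norm atoms, 2-sparse true signal.
A = [[1.0, 0.0, 0.0, 0.6],
     [0.0, 1.0, 0.0, 0.8],
     [0.0, 0.0, 1.0, 0.0]]
y = [0.6, 0.8, 2.0]            # = A @ [0, 0, 2, 1]
x_hat = omp(A, y, sparsity=2)
```

    IRLS-ℓp attacks the same reconstruction problem by iteratively re-weighted least squares rather than greedy selection, trading speed for accuracy, consistent with the timing/accuracy trade-off reported above.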

  4. The formal verification of generic interpreters

    NASA Technical Reports Server (NTRS)

    Windley, P.; Levitt, K.; Cohen, G. C.

    1991-01-01

    Task assignment 3 of the design and validation of digital flight control systems suitable for fly-by-wire applications is studied. Task 3 is associated with formal verification of embedded systems. In particular, results are presented that provide a methodological approach to microprocessor verification. A hierarchical decomposition strategy for specifying microprocessors is also presented. A theory of generic interpreters is presented that can be used to model microprocessor behavior. The generic interpreter theory abstracts away the details of instruction functionality, leaving a general model of what an interpreter does.

  5. Considerations in STS payload environmental verification

    NASA Technical Reports Server (NTRS)

    Keegan, W. B.

    1978-01-01

    The current philosophy of the GSFC regarding environmental verification of Shuttle payloads is reviewed. In the structures area, increased emphasis will be placed on the use of analysis for design verification, with selective testing performed as necessary. Furthermore, as a result of recent cost optimization analysis, the multitier test program will presumably give way to a comprehensive test program at the major payload subassembly level after adequate workmanship at the component level has been verified. In the thermal vacuum area, thought is being given to modifying the approaches used for conventional spacecraft.

  6. Comments for A Conference on Verification in the 21st Century

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Doyle, James E.

    2012-06-12

    The author offers 5 points for the discussion of Verification and Technology: (1) Experience with the implementation of arms limitation and arms reduction agreements confirms that technology alone has never been relied upon to provide effective verification. (2) The historical practice of verification of arms control treaties between Cold War rivals may constrain the cooperative and innovative use of technology for transparency, verification and confidence building in the future. (3) An area that has been identified by many, including the US State Department and NNSA, as being rich for exploration for potential uses of technology for transparency and verification is information and communications technology (ICT). This includes social media, crowd-sourcing, the internet of things, and the concept of societal verification, but there are issues. (4) On the issue of the extent to which verification technologies are keeping pace with the demands of future protocols and agreements, I think the more direct question is "are they effective in supporting the objectives of the treaty or agreement?" In this regard it is important to acknowledge that there is a verification grand challenge at our doorstep: "how does one verify limitations on nuclear warheads in national stockpiles?" (5) Finally, while recognizing the daunting political and security challenges of such an approach, multilateral engagement and cooperation at the conceptual and technical levels provide benefits for addressing future verification challenges.

  7. Experimental verification of a Monte Carlo-based MLC simulation model for IMRT dose calculations in heterogeneous media

    NASA Astrophysics Data System (ADS)

    Tyagi, N.; Curran, B. H.; Roberson, P. L.; Moran, J. M.; Acosta, E.; Fraass, B. A.

    2008-02-01

    IMRT often requires delivering small fields which may suffer from electronic disequilibrium effects. The presence of heterogeneities, particularly low-density tissues in patients, complicates such situations. In this study, we report on verification of the DPM MC code for IMRT treatment planning in heterogeneous media, using a previously developed model of the Varian 120-leaf MLC. The purpose of this study is twofold: (a) to design a comprehensive list of experiments in heterogeneous media for verification of any dose calculation algorithm and (b) to verify our MLC model in heterogeneous geometries that mimic an actual patient geometry for IMRT treatment. The measurements have been done using an IMRT head and neck phantom (CIRS phantom) and slab phantom geometries. Verification of the MLC model has been carried out using point doses measured with an A14 slim line (SL) ion chamber inside a tissue-equivalent and a bone-equivalent material using the CIRS phantom. Planar doses using lung- and bone-equivalent slabs have been measured and compared using EDR films (Kodak, Rochester, NY).

  8. DIRECT OBSERVATION OF SOLAR CORONAL MAGNETIC FIELDS BY VECTOR TOMOGRAPHY OF THE CORONAL EMISSION LINE POLARIZATIONS

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kramar, M.; Lin, H.; Tomczyk, S., E-mail: kramar@cua.edu, E-mail: lin@ifa.hawaii.edu, E-mail: tomczyk@ucar.edu

    We present the first direct “observation” of the global-scale, 3D coronal magnetic fields of Carrington Rotation (CR) Cycle 2112 using vector tomographic inversion techniques. The vector tomographic inversion uses measurements of the Fe xiii 10747 Å Hanle effect polarization signals by the Coronal Multichannel Polarimeter (CoMP) and 3D coronal density and temperature derived from scalar tomographic inversion of Solar Terrestrial Relations Observatory (STEREO)/Extreme Ultraviolet Imager (EUVI) coronal emission lines (CELs) intensity images as inputs to derive a coronal magnetic field model that best reproduces the observed polarization signals. While independent verifications of the vector tomography results cannot be performed, we compared the tomography inverted coronal magnetic fields with those constructed by magnetohydrodynamic (MHD) simulations based on observed photospheric magnetic fields of CR 2112 and 2113. We found that the MHD model for CR 2112 is qualitatively consistent with the tomography inverted result for most of the reconstruction domain except for several regions. Particularly, for one of the most noticeable regions, we found that the MHD simulation for CR 2113 predicted a model that more closely resembles the vector tomography inverted magnetic fields. In another case, our tomographic reconstruction predicted an open magnetic field at a region where a coronal hole can be seen directly from a STEREO-B/EUVI image. We discuss the utilities and limitations of the tomographic inversion technique, and present ideas for future developments.

  9. Security Verification Techniques Applied to PatchLink COTS Software

    NASA Technical Reports Server (NTRS)

    Gilliam, David P.; Powell, John D.; Bishop, Matt; Andrew, Chris; Jog, Sameer

    2006-01-01

    Verification of the security of software artifacts is a challenging task. An integrated approach that combines verification techniques can increase the confidence in the security of software artifacts. Such an approach has been developed by the Jet Propulsion Laboratory (JPL) and the University of California at Davis (UC Davis). Two security verification instruments were developed and then piloted on PatchLink's UNIX Agent, a Commercial-Off-The-Shelf (COTS) software product, to assess the value of the instruments and the approach. The two instruments are the Flexible Modeling Framework (FMF) -- a model-based verification instrument (JPL), and a Property-Based Tester (UC Davis). Security properties were formally specified for the COTS artifact and then verified using these instruments. The results were then reviewed to determine the effectiveness of the approach and the security of the COTS product.

  10. Stochastic analysis of concentration field in a wake region.

    PubMed

    Yassin, Mohamed F; Elmi, Abdirashid A

    2011-02-01

    Identifying the geographic locations in urban areas from which air pollutants enter the atmosphere is among the most important information needed to develop effective mitigation strategies for pollution control. Stochastic analysis is a powerful tool for estimating concentration fluctuations in plume dispersion in the wake region around buildings. Only a few studies have evaluated applications of stochastic analysis to pollutant dispersion in urban areas. This study investigated the concentration fields in the wake region using an obstacle model, namely an isolated building model. We measured concentration fluctuations at the plume centerline at various downwind distances from the source and at different heights, with a sampling frequency of 1 kHz. Concentration fields were analyzed stochastically using probability density functions (pdf). Stochastic analysis was performed on the concentration fluctuation and on the pdf of mean concentration, fluctuation intensity, and crosswind mean-plume dispersion. The pdf of the concentration fluctuation data showed significantly non-Gaussian behavior. The lognormal distribution appeared to be the best fit to the shape of the concentration measured in the boundary layer. We observed that the plume dispersion pdf near the source was shorter than the plume dispersion far from the source. Our findings suggest that stochastic techniques in complex building environments can be a powerful tool for understanding the distribution and location of air pollutants.
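The lognormal fit reported above can be sketched with surrogate data; the sampling parameters below are hypothetical, not the study's measurements:

```python
import numpy as np

# Surrogate stand-in for a 1 kHz concentration time series at one downwind
# location (hypothetical parameters, not the study's data).
rng = np.random.default_rng(1)
c = rng.lognormal(mean=0.5, sigma=0.8, size=20_000)

# Lognormal hypothesis: log-concentrations should look Gaussian, so the
# pdf can be fitted from log-space moments.
log_c = np.log(c)
mu_hat, sigma_hat = log_c.mean(), log_c.std(ddof=1)

def lognorm_pdf(x, mu, sigma):
    """Lognormal probability density function."""
    return np.exp(-(np.log(x) - mu) ** 2 / (2 * sigma ** 2)) / (
        x * sigma * np.sqrt(2 * np.pi))

# Fitted pdf on a grid, ready to overlay on a histogram of c to judge
# the goodness of the lognormal fit visually.
xs = np.linspace(0.1, 10.0, 200)
fitted = lognorm_pdf(xs, mu_hat, sigma_hat)
```

With samples this plentiful, the moment estimates recover the generating parameters closely; on real wake-region data, departures of the histogram from the fitted curve are what reveal non-Gaussian (and non-lognormal) behavior.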

  11. Development and verification of global/local analysis techniques for laminated composites

    NASA Technical Reports Server (NTRS)

    Thompson, Danniella Muheim; Griffin, O. Hayden, Jr.

    1991-01-01

    A two-dimensional to three-dimensional global/local finite element approach was developed, verified, and applied to a laminated composite plate of finite width and length containing a central circular hole. The resulting stress fields for axial compression loads were examined for several symmetric stacking sequences and hole sizes. Verification was based on comparison of the displacements and stress fields with accepted trends from previous free-edge investigations and with a complete three-dimensional finite element solution of the plate. The laminates in the compression study included symmetric cross-ply, angle-ply, and quasi-isotropic stacking sequences. The entire plate was selected as the global model and analyzed with two-dimensional finite elements. Displacements along a region identified as the global/local interface were applied in a kinematically consistent fashion to independent three-dimensional local models. Local areas of interest in the plate included a portion of the straight free edge near the hole and the immediate area around the hole. Interlaminar stress results obtained from the global/local analyses compare well with previously reported trends, and some new conclusions about interlaminar stress fields in plates with different laminate orientations and hole sizes are presented for compressive loading. The effectiveness of the global/local procedure in reducing the computational effort required to solve these problems is clearly demonstrated by comparing the computer time required to formulate and solve the linear, static system of equations for the global and local analyses with that required for a complete three-dimensional formulation of a cross-ply laminate. Specific processors used during the analyses are described in general terms. The application of this global/local technique is not limited to a particular software system, and it was developed and described in as general a manner as possible.

  12. Ontology Matching with Semantic Verification.

    PubMed

    Jean-Mary, Yves R; Shironoshita, E Patrick; Kabuka, Mansur R

    2009-09-01

    ASMOV (Automated Semantic Matching of Ontologies with Verification) is a novel algorithm that uses lexical and structural characteristics of two ontologies to iteratively calculate a similarity measure between them, derives an alignment, and then verifies it to ensure that it does not contain semantic inconsistencies. In this paper, we describe the ASMOV algorithm, and then present experimental results that measure its accuracy using the OAEI 2008 tests, and that evaluate its use with two different thesauri: WordNet, and the Unified Medical Language System (UMLS). These results show the increased accuracy obtained by combining lexical, structural and extensional matchers with semantic verification, and demonstrate the advantage of using a domain-specific thesaurus for the alignment of specialized ontologies.
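A toy sketch of the lexical ingredient of such an alignment algorithm (not ASMOV itself, and omitting its structural and extensional matchers and the semantic verification step): score concept labels pairwise and keep the best match above a cutoff. The labels and threshold below are illustrative.

```python
from difflib import SequenceMatcher

# Toy lexical matcher only (NOT ASMOV): align concepts of two small
# ontologies by label similarity. Labels and cutoff are illustrative.

def label_similarity(a, b):
    """Case-insensitive string similarity in [0, 1]."""
    return SequenceMatcher(None, a.lower(), b.lower()).ratio()

onto_a = ["Myocardial Infarction", "Blood Pressure"]
onto_b = ["myocardial infarction", "arterial blood pressure"]

alignment = {}
for concept in onto_a:
    best = max(onto_b, key=lambda other: label_similarity(concept, other))
    if label_similarity(concept, best) > 0.6:  # hypothetical cutoff
        alignment[concept] = best
```

A full matcher like ASMOV iterates such similarities together with structural evidence, consults a thesaurus for synonyms, and then verifies the candidate alignment for semantic inconsistencies; a lexical pass is only the starting point.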

  13. What is the Final Verification of Engineering Requirements?

    NASA Technical Reports Server (NTRS)

    Poole, Eric

    2010-01-01

    This slide presentation reviews the process of development through the final verification of engineering requirements. The definition of the requirements is driven by basic needs and should be reviewed by both the supplier and the customer. All involved need to agree upon a formal requirements document, including any changes to the original requirements. After the requirements have been developed, the engineering team begins to design the system, and the final design is reviewed by other organizations. The final operational system must satisfy the original requirements, and many verifications should be performed during the process. The verification methods used are test, inspection, analysis, and demonstration. The verification plan should be created once the system requirements are documented. The plan should ensure that every requirement is formally verified, that the methods and the responsible organizations are specified, and that the plan is reviewed by all parties. Also discussed are the options of having the engineering team involved in all phases of development, as opposed to having some other organization continue the process once the design is complete.
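The bookkeeping such a verification plan requires can be sketched as a simple audit: every requirement must map to one of the four accepted methods and to a responsible organization. The data below is illustrative, not from the presentation.

```python
# Illustrative verification-plan audit (hypothetical data): each
# requirement needs one of the four accepted methods and an owner.
ACCEPTED_METHODS = {"test", "inspection", "analysis", "demonstration"}

plan = [
    {"req": "REQ-001", "method": "test",          "owner": "Propulsion"},
    {"req": "REQ-002", "method": "analysis",      "owner": "Structures"},
    {"req": "REQ-003", "method": "demonstration", "owner": "Avionics"},
]

def audit(entries):
    """Return the requirements that would fail the plan review."""
    return [e["req"] for e in entries
            if e.get("method") not in ACCEPTED_METHODS or not e.get("owner")]

gaps = audit(plan)  # an empty list means every requirement is covered
```

Running such a check at each review keeps the plan's core promise: no requirement reaches the final verification stage without a method and an accountable organization.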

  14. ENVIRONMENTAL TECHNOLOGY VERIFICATION PROGRAM

    EPA Science Inventory

    This presentation will be given at the EPA Science Forum 2005 in Washington, DC. The Environmental Technology Verification Program (ETV) was initiated in 1995 to speed implementation of new and innovative commercial-ready environmental technologies by providing objective, 3rd pa...

  15. Design and Realization of Controllable Ultrasonic Fault Detector Automatic Verification System

    NASA Astrophysics Data System (ADS)

    Sun, Jing-Feng; Liu, Hui-Ying; Guo, Hui-Juan; Shu, Rong; Wei, Kai-Li

    An ultrasonic flaw detection instrument with a remote control interface was researched and an automatic verification system for it was developed. By using Extensible Markup Language to build the protocol instruction set and the database of data analysis methods, the system software achieves a configurable design and accommodates the diversity of unreleased device interfaces and protocols. By cascading a signal generator with a fixed attenuator, a dynamic error compensation method is proposed that performs the role of the fixed attenuator in traditional verification and improves the accuracy of the verification results. Operation of the automatic verification system confirms the feasibility of the hardware and software architecture and the correctness of the analysis method, while eliminating the cumbersome operations of the traditional verification process and reducing the labor intensity for test personnel.

  16. Evaluation of HCFC AK 225 Alternatives for Precision Cleaning and Verification

    NASA Technical Reports Server (NTRS)

    Melton, D. M.

    1998-01-01

    Maintaining qualified cleaning and verification processes is essential in a production environment. Environmental regulations have impacted, and continue to impact, cleaning and verification processing for components and large structures, both at the Michoud Assembly Facility and at component suppliers. The goal of the effort was to assure that cleaning and verification proceed unimpeded and that qualified, environmentally compliant material and process replacements are implemented and perform to specifications. The approach consisted of (1) selection of a Supersonic Gas-Liquid Cleaning System; (2) selection and evaluation of three cleaning and verification solvents as candidate alternatives to HCFC 225 (Vertrel 423 (HCFC), Vertrel MCA (HFC/1,2-Dichloroethylene), and HFE 7100DE (HFE/1,2-Dichloroethylene)); and (3) evaluation of an analytical instrumental post-cleaning verification technique. This document is presented in viewgraph format.

  17. Implementation and verification of global optimization benchmark problems

    NASA Astrophysics Data System (ADS)

    Posypkin, Mikhail; Usov, Alexander

    2017-12-01

    The paper considers the implementation and verification of a test suite containing 150 benchmarks for global deterministic box-constrained optimization. A C++ library for describing standard mathematical expressions was developed for this purpose. The library automates the generation of the value of a function and its gradient at a given point, and of interval estimates of a function and its gradient on a given box, from a single description. Based on this functionality, we have developed a collection of tests for automatic verification of the proposed benchmarks. The verification has shown that literature sources contain mistakes in the benchmark descriptions. The library and the test suite are available for download and can be used freely.
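The interval estimates on a box mentioned above can be illustrated with a tiny interval-arithmetic sketch (a Python stand-in, not the authors' C++ library): bound f(x) = x^2 - 2x on a box by composing interval operations.

```python
# Tiny interval-arithmetic sketch (Python stand-in for the authors' C++
# library): each operation returns an interval guaranteed to enclose all
# possible results over the input intervals.

class Interval:
    def __init__(self, lo, hi):
        self.lo, self.hi = lo, hi

    def __add__(self, other):
        return Interval(self.lo + other.lo, self.hi + other.hi)

    def __sub__(self, other):
        return Interval(self.lo - other.hi, self.hi - other.lo)

    def __mul__(self, other):
        p = [self.lo * other.lo, self.lo * other.hi,
             self.hi * other.lo, self.hi * other.hi]
        return Interval(min(p), max(p))

def f(x):
    """f(x) = x^2 - 2x, written with interval operations."""
    return x * x - Interval(2, 2) * x

bounds = f(Interval(0.0, 3.0))  # encloses the true range [-1, 3]
```

The enclosure comes out as [-6, 9]: valid but looser than the true range [-1, 3]. This dependency problem is what makes computing tight automatic bounds, and hence rigorously verifying benchmark descriptions, nontrivial.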

  18. A study of applications scribe frame data verifications using design rule check

    NASA Astrophysics Data System (ADS)

    Saito, Shoko; Miyazaki, Masaru; Sakurai, Mitsuo; Itoh, Takahisa; Doi, Kazumasa; Sakurai, Norioko; Okada, Tomoyuki

    2013-06-01

    In semiconductor manufacturing, scribe frame data is generally generated for each LSI product according to its specific process design. Scribe frame data is designed based on definition tables for scanner alignment, wafer inspection, and customer-specified marks, and at the end we check that the scribe frame design conforms to the specifications for alignment and inspection marks. Recently, in COT (customer owned tooling) business and new technology development, there has been no effective verification method for scribe frame data, and verification takes a great deal of time. Therefore, we tried to establish a new verification method for scribe frame data by applying pattern matching and DRC (Design Rule Check), which are used in device verification. We present the scheme of scribe frame data verification using DRC that we applied. First, verification rules are created based on the specifications of the scanner, inspection, and other tools, and a mark library is also created for pattern matching. Next, DRC verification, which includes pattern matching using the mark library, is performed on the scribe frame data. Our experiments demonstrated that, by use of pattern matching and DRC verification, the new method can yield speed improvements of more than 12 percent compared to conventional mark checks by visual inspection, and the inspection time can be reduced to less than 5 percent if multi-CPU processing is used. Our method delivers both short processing time and excellent accuracy when checking many marks. It is easy to maintain and provides an easy way for COT customers to use original marks. We believe that our new DRC verification method for scribe frame data is indispensable and mutually beneficial.

  19. INNOVATIVE TECHNOLOGY VERIFICATION REPORT XRF ...

    EPA Pesticide Factsheets

    The Rigaku ZSX Mini II (ZSX Mini II) XRF Services x-ray fluorescence (XRF) analyzer was demonstrated under the U.S. Environmental Protection Agency (EPA) Superfund Innovative Technology Evaluation (SITE) Program. The field portion of the demonstration was conducted in January 2005 at the Kennedy Athletic, Recreational and Social Park (KARS) at Kennedy Space Center on Merritt Island, Florida. The demonstration was designed to collect reliable performance and cost data for the ZSX Mini II analyzer and seven other commercially available XRF instruments for measuring trace elements in soil and sediment. The performance and cost data were evaluated to document the relative performance of each XRF instrument. This innovative technology verification report describes the objectives and the results of that evaluation and serves to verify the performance and cost of the ZSX Mini II analyzer. Separate reports have been prepared for the other XRF instruments that were evaluated as part of the demonstration. The objectives of the evaluation included determining each XRF instrument’s accuracy, precision, sample throughput, and tendency for matrix effects. To fulfill these objectives, the field demonstration incorporated the analysis of 326 prepared samples of soil and sediment that contained 13 target elements. The prepared samples included blends of environmental samples from nine different sample collection sites as well as spiked samples with certified element con

  20. INNOVATIVE TECHNOLOGY VERIFICATION REPORT XRF ...

    EPA Pesticide Factsheets

    The Rontec PicoTAX x-ray fluorescence (XRF) analyzer was demonstrated under the U.S. Environmental Protection Agency (EPA) Superfund Innovative Technology Evaluation (SITE) Program. The field portion of the demonstration was conducted in January 2005 at the Kennedy Athletic, Recreational and Social Park (KARS) at Kennedy Space Center on Merritt Island, Florida. The demonstration was designed to collect reliable performance and cost data for the PicoTAX analyzer and seven other commercially available XRF instruments for measuring trace elements in soil and sediment. The performance and cost data were evaluated to document the relative performance of each XRF instrument. This innovative technology verification report describes the objectives and the results of that evaluation and serves to verify the performance and cost of the PicoTAX analyzer. Separate reports have been prepared for the other XRF instruments that were evaluated as part of the demonstration. The objectives of the evaluation included determining each XRF instrument’s accuracy, precision, sample throughput, and tendency for matrix effects. To fulfill these objectives, the field demonstration incorporated the analysis of 326 prepared samples of soil and sediment that contained 13 target elements. The prepared samples included blends of environmental samples from nine different sample collection sites as well as spiked samples with certified element concentrations. Accuracy was assessed by c

  1. INNOVATIVE TECHNOLOGY VERIFICATION REPORT XRF ...

    EPA Pesticide Factsheets

    The Elvatech, Ltd. ElvaX (ElvaX) x-ray fluorescence (XRF) analyzer distributed in the United States by Xcalibur XRF Services (Xcalibur), was demonstrated under the U.S. Environmental Protection Agency (EPA) Superfund Innovative Technology Evaluation (SITE) Program. The field portion of the demonstration was conducted in January 2005 at the Kennedy Athletic, Recreational and Social Park (KARS) at Kennedy Space Center on Merritt Island, Florida. The demonstration was designed to collect reliable performance and cost data for the ElvaX analyzer and seven other commercially available XRF instruments for measuring trace elements in soil and sediment. The performance and cost data were evaluated to document the relative performance of each XRF instrument. This innovative technology verification report describes the objectives and the results of that evaluation and serves to verify the performance and cost of the ElvaX analyzer. Separate reports have been prepared for the other XRF instruments that were evaluated as part of the demonstration. The objectives of the evaluation included determining each XRF instrument’s accuracy, precision, sample throughput, and tendency for matrix effects. To fulfill these objectives, the field demonstration incorporated the analysis of 326 prepared samples of soil and sediment that contained 13 target elements. The prepared samples included blends of environmental samples from nine different sample collection sites as well as s

  2. INNOVATIVE TECHNOLOGY VERIFICATION REPORT XRF ...

    EPA Pesticide Factsheets

    The Oxford ED2000 x-ray fluorescence (XRF) analyzer was demonstrated under the U.S. Environmental Protection Agency (EPA) Superfund Innovative Technology Evaluation (SITE) Program. The field portion of the demonstration was conducted in January 2005 at the Kennedy Athletic, Recreational and Social Park (KARS) at Kennedy Space Center on Merritt Island, Florida. The demonstration was designed to collect reliable performance and cost data for the ED2000 analyzer and seven other commercially available XRF instruments for measuring trace elements in soil and sediment. The performance and cost data were evaluated to document the relative performance of each XRF instrument. This innovative technology verification report describes the objectives and the results of that evaluation and serves to verify the performance and cost of the ED2000 analyzer. Separate reports have been prepared for the other XRF instruments that were evaluated as part of the demonstration. The objectives of the evaluation included determining each XRF instrument’s accuracy, precision, sample throughput, and tendency for matrix effects. To fulfill these objectives, the field demonstration incorporated the analysis of 326 prepared samples of soil and sediment that contained 13 target elements. The prepared samples included blends of environmental samples from nine different sample collection sites as well as spiked samples with certified element concentrations. Accuracy was assessed by com

  3. INNOVATIVE TECHNOLOGY VERIFICATION REPORT XRF ...

    EPA Pesticide Factsheets

    The Innov-X XT400 Series (XT400) x-ray fluorescence (XRF) analyzer was demonstrated under the U.S. Environmental Protection Agency (EPA) Superfund Innovative Technology Evaluation (SITE) Program. The field portion of the demonstration was conducted in January 2005 at the Kennedy Athletic, Recreational and Social Park (KARS) at Kennedy Space Center on Merritt Island, Florida. The demonstration was designed to collect reliable performance and cost data for the XT400 analyzer and seven other commercially available XRF instruments for measuring trace elements in soil and sediment. The performance and cost data were evaluated to document the relative performance of each XRF instrument. This innovative technology verification report describes the objectives and the results of that evaluation and serves to verify the performance and cost of the XT400 analyzer. Separate reports have been prepared for the other XRF instruments that were evaluated as part of the demonstration. The objectives of the evaluation included determining each XRF instrument’s accuracy, precision, sample throughput, and tendency for matrix effects. To fulfill these objectives, the field demonstration incorporated the analysis of 326 prepared samples of soil and sediment that contained 13 target elements. The prepared samples included blends of environmental samples from nine different sample collection sites as well as spiked samples with certified element concentrations. Accuracy was as

  4. Projected Impact of Compositional Verification on Current and Future Aviation Safety Risk

    NASA Technical Reports Server (NTRS)

    Reveley, Mary S.; Withrow, Colleen A.; Leone, Karen M.; Jones, Sharon M.

    2014-01-01

    The projected impact of compositional verification research conducted by the National Aeronautics and Space Administration System-Wide Safety and Assurance Technologies project on aviation safety risk was assessed. Software and compositional verification are described. Traditional verification techniques have two major problems: testing at the prototype stage, where error discovery can be quite costly, and the inability to test for all potential interactions, leaving some errors undetected until the system is used by the end user. Increasingly complex and nondeterministic aviation systems are becoming too large for these tools to check and verify. Compositional verification is a "divide and conquer" solution to addressing increasingly large and complex systems. A review of compositional verification research being conducted by academia, industry, and Government agencies is provided. Forty-four aviation safety risks in the Biennial NextGen Safety Issues Survey were identified that could be impacted by compositional verification; these were grouped into five categories: automation design; system complexity; software, flight control, or equipment failure or malfunction; new technology or operations; and verification and validation. Also identified were 1 capability, 1 research action, 5 operational improvements, and 13 enablers within the Federal Aviation Administration Joint Planning and Development Office Integrated Work Plan that could be addressed by compositional verification.

  5. Ada(R) Test and Verification System (ATVS)

    NASA Technical Reports Server (NTRS)

    Strelich, Tom

    1986-01-01

    The Ada Test and Verification System (ATVS) functional description and high level design are completed and summarized. The ATVS will provide a comprehensive set of test and verification capabilities specifically addressing the features of the Ada language, support for embedded system development, distributed environments, and advanced user interface capabilities. Its design emphasis was on effective software development environment integration and flexibility to ensure its long-term use in the Ada software development community.

  6. A verification library for multibody simulation software

    NASA Technical Reports Server (NTRS)

    Kim, Sung-Soo; Haug, Edward J.; Frisch, Harold P.

    1989-01-01

    A multibody dynamics verification library that maintains and manages test and validation data is proposed, based on RRC robot arm and CASE backhoe validation and on a comparative study of DADS, DISCOS, and CONTOPS, which are existing public-domain and commercial multibody dynamics simulation programs. Using simple representative problems, simulation results from each program are cross-checked, and the validation results are presented. Functionalities of the verification library are defined in order to automate the validation procedure.

  7. SOLAR FLARE PREDICTION USING SDO/HMI VECTOR MAGNETIC FIELD DATA WITH A MACHINE-LEARNING ALGORITHM

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bobra, M. G.; Couvidat, S., E-mail: couvidat@stanford.edu

    2015-01-10

    We attempt to forecast M- and X-class solar flares using a machine-learning algorithm, called support vector machine (SVM), and four years of data from the Solar Dynamics Observatory's Helioseismic and Magnetic Imager, the first instrument to continuously map the full-disk photospheric vector magnetic field from space. Most flare forecasting efforts described in the literature use either line-of-sight magnetograms or a relatively small number of ground-based vector magnetograms. This is the first time a large data set of vector magnetograms has been used to forecast solar flares. We build a catalog of flaring and non-flaring active regions sampled from a database of 2071 active regions, comprised of 1.5 million active region patches of vector magnetic field data, and characterize each active region by 25 parameters. We then train and test the machine-learning algorithm and estimate its performance using forecast verification metrics, with an emphasis on the true skill statistic (TSS). We obtain relatively high TSS scores and overall predictive abilities. We surmise that this is partly due to fine-tuning the SVM for this purpose and also to an advantageous set of features that can only be calculated from vector magnetic field data. We also apply a feature selection algorithm to determine which of our 25 features are useful for discriminating between flaring and non-flaring active regions and conclude that only a handful are needed for good predictive abilities.
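The true skill statistic emphasized above has a simple closed form, TSS = POD - POFD (probability of detection minus probability of false detection). A minimal sketch with illustrative counts, not the paper's results:

```python
import numpy as np

# Sketch of the verification step: the true skill statistic for a binary
# flare/no-flare forecast. The example observations/forecasts below are
# illustrative only.

def true_skill_statistic(y_true, y_pred):
    """TSS = POD - POFD for 0/1 observations and forecasts."""
    y_true, y_pred = np.asarray(y_true), np.asarray(y_pred)
    hits         = np.sum((y_pred == 1) & (y_true == 1))
    misses       = np.sum((y_pred == 0) & (y_true == 1))
    false_alarms = np.sum((y_pred == 1) & (y_true == 0))
    corr_negs    = np.sum((y_pred == 0) & (y_true == 0))
    pod  = hits / (hits + misses)                     # probability of detection
    pofd = false_alarms / (false_alarms + corr_negs)  # prob. of false detection
    return pod - pofd

y_obs  = [1, 1, 1, 0, 0, 0, 0, 0]   # observed flares
y_fcst = [1, 1, 0, 1, 0, 0, 0, 0]   # forecasts
tss = true_skill_statistic(y_obs, y_fcst)  # 2/3 - 1/5
```

TSS ranges from -1 to 1, with 1 for a perfect forecast and 0 for no skill, and it is insensitive to the strong class imbalance between flaring and quiet active regions, which is why it is favored for flare forecast verification.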

  8. Formal verification of an avionics microprocessor

    NASA Technical Reports Server (NTRS)

    Srivas, Mandayam K.; Miller, Steven P.

    1995-01-01

    Formal specification combined with mechanical verification is a promising approach for achieving the extremely high levels of assurance required of safety-critical digital systems. However, many questions remain regarding their use in practice: Can these techniques scale up to industrial systems, where are they likely to be useful, and how should industry go about incorporating them into practice? This report discusses a project undertaken to answer some of these questions, the formal verification of the AAMP5 microprocessor. This project consisted of formally specifying in the PVS language a Rockwell proprietary microprocessor at both the instruction-set and register-transfer levels and using the PVS theorem prover to show that the microcode correctly implemented the instruction-level specification for a representative subset of instructions. Notable aspects of this project include the use of a formal specification language by practicing hardware and software engineers, the integration of traditional inspections with formal specifications, and the use of a mechanical theorem prover to verify a portion of a commercial, pipelined microprocessor that was not explicitly designed for formal verification.
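The core obligation in such a project, showing that microcode implements an instruction-level specification, can be illustrated in miniature. The toy below (an assumption for illustration, not the PVS development) exhaustively checks a shift-and-carry "microcode" adder against a modular-addition ISA spec over a 4-bit word, the brute-force analogue of what a theorem prover establishes symbolically for full word widths:

```python
WIDTH = 4
MASK = (1 << WIDTH) - 1  # 4-bit word

def isa_add(a, b):
    """Instruction-level specification: modular addition."""
    return (a + b) & MASK

def microcode_add(a, b):
    """'Microcode' implementation: ripple shift-and-carry loop."""
    while b:
        carry = (a & b) << 1
        a = (a ^ b) & MASK
        b = carry & MASK
    return a

# Exhaustive verification over the whole 4-bit operand space
assert all(isa_add(a, b) == microcode_add(a, b)
           for a in range(16) for b in range(16))
print("ADD microcode matches ISA spec on all 256 operand pairs")
```

A mechanical prover replaces the exhaustive loop with an induction over word width, which is what makes the approach scale to a real 32-bit datapath.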

  9. SITE CHARACTERIZATION AND MONITORING TECHNOLOGY VERIFICATION: PROGRESS AND RESULTS

    EPA Science Inventory

    The Site Characterization and Monitoring Technology Pilot of the U.S. Environmental Protection Agency's Environmental Technology Verification Program (ETV) has been engaged in verification activities since the fall of 1994 (U.S. EPA, 1997). The purpose of the ETV is to promote th...

  10. Certification and verification for Calmac flat plate solar collector

    NASA Technical Reports Server (NTRS)

    1978-01-01

    Information used in the certification and verification of the Calmac Flat Plate Collector is presented. Contained are such items as test procedures and results, information on materials used, installation, operation, and maintenance manuals, and other information pertaining to the verification and certification.

  11. 40 CFR 1066.240 - Torque transducer verification and calibration.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ...) AIR POLLUTION CONTROLS VEHICLE-TESTING PROCEDURES Dynamometer Specifications § 1066.240 Torque transducer verification and calibration. Calibrate torque-measurement systems as described in 40 CFR 1065.310. ... 40 Protection of Environment 34 2013-07-01 2013-07-01 false Torque transducer verification and...

  12. 40 CFR 1066.240 - Torque transducer verification and calibration.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ...) AIR POLLUTION CONTROLS VEHICLE-TESTING PROCEDURES Dynamometer Specifications § 1066.240 Torque transducer verification and calibration. Calibrate torque-measurement systems as described in 40 CFR 1065.310. ... 40 Protection of Environment 34 2012-07-01 2012-07-01 false Torque transducer verification and...

  13. 48 CFR 552.204-9 - Personal Identity Verification requirements.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... 48 Federal Acquisition Regulations System 4 2014-10-01 2014-10-01 false Personal Identity....204-9 Personal Identity Verification requirements. As prescribed in 504.1303, insert the following clause: Personal Identity Verification Requirements (OCT 2012) (a) The contractor shall comply with GSA...

  14. 48 CFR 552.204-9 - Personal Identity Verification requirements.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... 48 Federal Acquisition Regulations System 4 2012-10-01 2012-10-01 false Personal Identity....204-9 Personal Identity Verification requirements. As prescribed in 504.1303, insert the following clause: Personal Identity Verification Requirements (OCT 2012) (a) The contractor shall comply with GSA...

  15. 48 CFR 552.204-9 - Personal Identity Verification requirements.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... 48 Federal Acquisition Regulations System 4 2013-10-01 2013-10-01 false Personal Identity....204-9 Personal Identity Verification requirements. As prescribed in 504.1303, insert the following clause: Personal Identity Verification Requirements (OCT 2012) (a) The contractor shall comply with GSA...

  16. Joint ETV/NOWATECH verification protocol for the Sorbisense GSW40 passive sampler

    EPA Science Inventory

    Environmental technology verification (ETV) is an independent (third party) assessment of the performance of a technology or a product for a specified application, under defined conditions and quality assurance. This verification is a joint verification with the US EPA ETV schem...

  17. 4D ML reconstruction as a tool for volumetric PET-based treatment verification in ion beam radiotherapy

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    De Bernardi, E., E-mail: elisabetta.debernardi@unimib.it; Ricotti, R.; Riboldi, M.

    2016-02-15

    Purpose: An innovative strategy to improve the sensitivity of positron emission tomography (PET)-based treatment verification in ion beam radiotherapy is proposed. Methods: Low counting statistics PET images acquired during or shortly after the treatment (Measured PET) and a Monte Carlo estimate of the same PET images derived from the treatment plan (Expected PET) are considered as two frames of a 4D dataset. A 4D maximum likelihood reconstruction strategy was adapted to iteratively estimate the annihilation events distribution in a reference frame and the deformation motion fields that map it in the Expected PET and Measured PET frames. The outputs generated by the proposed strategy are as follows: (1) an estimate of the Measured PET with an image quality comparable to the Expected PET and (2) an estimate of the motion field mapping Expected PET to Measured PET. The details of the algorithm are presented and the strategy is preliminarily tested on analytically simulated datasets. Results: The algorithm demonstrates (1) robustness against noise, even in the worst conditions where 1.5 × 10^4 true coincidences and a random fraction of 73% are simulated; (2) proper sensitivity to different kinds and grades of mismatch ranging between 1 and 10 mm; (3) robustness against bias due to incorrect washout modeling in the Monte Carlo simulation up to 1/3 of the original signal amplitude; and (4) an ability to describe the mismatch even in the presence of complex annihilation distributions such as those induced by two perpendicular superimposed ion fields. Conclusions: The promising results obtained in this work suggest the applicability of the method as a quantification tool for PET-based treatment verification in ion beam radiotherapy. An extensive assessment of the proposed strategy on real treatment verification data is planned.
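The 4D strategy above builds on the classic maximum-likelihood expectation-maximisation (MLEM) update for emission tomography. As a minimal single-frame sketch (a toy 2-voxel, 3-detector system with illustrative numbers, not the authors' 4D algorithm), each iteration multiplies the current estimate by the back-projected ratio of measured to predicted counts:

```python
import numpy as np

def mlem(A, y, n_iter=200):
    """Classic MLEM: lam <- lam * A^T(y / (A lam)) / A^T 1."""
    lam = np.ones(A.shape[1])
    sens = A.sum(axis=0)                  # sensitivity image, A^T 1
    for _ in range(n_iter):
        proj = A @ lam                    # forward projection
        ratio = np.divide(y, proj, out=np.zeros_like(y), where=proj > 0)
        lam *= (A.T @ ratio) / sens       # multiplicative update
    return lam

# Toy system: 3 detectors viewing 2 voxels
A = np.array([[1.0, 0.0],
              [0.0, 1.0],
              [0.5, 0.5]])
true_activity = np.array([4.0, 2.0])
y = A @ true_activity                     # noiseless measured counts
est = mlem(A, y)
print(np.round(est, 2))                   # should approach [4. 2.]
```

The paper's extension treats Expected PET and Measured PET as two frames and additionally estimates the deformation field between them inside the same likelihood framework.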

  18. Improved Detection Technique for Solvent Rinse Cleanliness Verification

    NASA Technical Reports Server (NTRS)

    Hornung, S. D.; Beeson, H. D.

    2001-01-01

    The NASA White Sands Test Facility (WSTF) has an ongoing effort to reduce or eliminate usage of cleaning solvents such as CFC-113 and its replacements. These solvents are used in the final clean and cleanliness verification processes for flight and ground support hardware, especially for oxygen systems where organic contaminants can pose an ignition hazard. For the final cleanliness verification in the standard process, the equivalent of one square foot of surface area of parts is rinsed with the solvent, and the final 100 mL of the rinse is captured. The amount of nonvolatile residue (NVR) in the solvent is determined by weight after the evaporation of the solvent. An improved process of sampling this rinse, developed at WSTF, requires evaporation of less than 2 mL of the solvent to make the cleanliness verification. Small amounts of the solvent are evaporated in a clean stainless steel cup, and the cleanliness of the stainless steel cup is measured using a commercially available surface quality monitor. The effectiveness of this new cleanliness verification technique was compared to the accepted NVR sampling procedures. Testing with known contaminants in solution, such as hydraulic fluid, fluorinated lubricants, and cutting and lubricating oils, was performed to establish a correlation between amount in solution and the process response. This report presents the approach and results and discusses the issues in establishing the surface quality monitor-based cleanliness verification.

  19. A Vehicular Mobile Standard Instrument for Field Verification of Traffic Speed Meters Based on Dual-Antenna Doppler Radar Sensor

    PubMed Central

    Du, Lei; Sun, Qiao; Cai, Changqing; Bai, Jie; Fan, Zhe; Zhang, Yue

    2018-01-01

    Traffic speed meters are important legal measuring instruments used specifically for traffic speed enforcement, and they must be tested and verified in the field every year, using a vehicular mobile standard speed-measuring instrument, to ensure their speed-measuring performance. The non-contact optical speed sensor and the GPS speed sensor are the two most common types of standard speed-measuring instruments. The non-contact optical speed sensor requires extremely high installation accuracy, and its speed-measuring error is nonlinear and uncorrectable. The speed-measuring accuracy of the GPS speed sensor is rapidly reduced if the number of received satellites is insufficient, which often occurs in urban high-rise regions, tunnels, and mountainous regions. In this paper, a new standard speed-measuring instrument using a dual-antenna Doppler radar sensor is proposed based on a tradeoff between the installation accuracy requirement and the usage region limitation; it has no specified requirements for its mounting distance, has no limitation on usage regions, and can automatically compensate for the effect of an inclined installation angle on its speed-measuring accuracy. Theoretical model analysis, simulated speed measurement results, and field experimental results compared with a high-accuracy GPS speed sensor showed that the dual-antenna Doppler radar sensor is effective and reliable as a new standard speed-measuring instrument. PMID:29621142
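The angle-compensation idea can be sketched numerically. Assuming a Janus-style arrangement (two beams at nominal angles ±θ to the direction of travel, an illustrative geometry, not necessarily the authors' exact design), an unknown mount misalignment α shifts the two Doppler frequencies in opposite directions, so their sum cancels the error to first order:

```python
import math

C = 3.0e8                  # propagation speed, m/s
F0 = 24.15e9               # illustrative K-band carrier, Hz
THETA = math.radians(45)   # nominal beam angle of each antenna

def doppler(v, angle):
    """Doppler shift of one beam for ground speed v at the given beam angle."""
    return 2.0 * v * F0 * math.cos(angle) / C

def speed_dual(f1, f2):
    """Dual-antenna estimate: mount error cancels to first order."""
    return C * (f1 + f2) / (4.0 * F0 * math.cos(THETA))

v_true = 27.78                        # 100 km/h
alpha = math.radians(2)               # unknown installation misalignment
f1 = doppler(v_true, THETA - alpha)   # beam tilted toward travel direction
f2 = doppler(v_true, THETA + alpha)   # beam tilted away

v_single = C * f1 / (2.0 * F0 * math.cos(THETA))  # single beam, uncorrected
v_dual = speed_dual(f1, f2)
print(f"single-beam error: {abs(v_single - v_true):.3f} m/s")
print(f"dual-beam error:   {abs(v_dual - v_true):.4f} m/s")
```

Because cos(θ-α) + cos(θ+α) = 2 cosθ cosα, the residual error of the dual-beam estimate scales as α²/2 rather than α, which is why no precise mounting alignment is needed.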

  20. A Vehicular Mobile Standard Instrument for Field Verification of Traffic Speed Meters Based on Dual-Antenna Doppler Radar Sensor.

    PubMed

    Du, Lei; Sun, Qiao; Cai, Changqing; Bai, Jie; Fan, Zhe; Zhang, Yue

    2018-04-05

    Traffic speed meters are important legal measuring instruments used specifically for traffic speed enforcement, and they must be tested and verified in the field every year, using a vehicular mobile standard speed-measuring instrument, to ensure their speed-measuring performance. The non-contact optical speed sensor and the GPS speed sensor are the two most common types of standard speed-measuring instruments. The non-contact optical speed sensor requires extremely high installation accuracy, and its speed-measuring error is nonlinear and uncorrectable. The speed-measuring accuracy of the GPS speed sensor is rapidly reduced if the number of received satellites is insufficient, which often occurs in urban high-rise regions, tunnels, and mountainous regions. In this paper, a new standard speed-measuring instrument using a dual-antenna Doppler radar sensor is proposed based on a tradeoff between the installation accuracy requirement and the usage region limitation; it has no specified requirements for its mounting distance, has no limitation on usage regions, and can automatically compensate for the effect of an inclined installation angle on its speed-measuring accuracy. Theoretical model analysis, simulated speed measurement results, and field experimental results compared with a high-accuracy GPS speed sensor showed that the dual-antenna Doppler radar sensor is effective and reliable as a new standard speed-measuring instrument.

  1. Evaluation of Gafchromic EBT-XD film, with comparison to EBT3 film, and application in high dose radiotherapy verification.

    PubMed

    Palmer, Antony L; Dimitriadis, Alexis; Nisbet, Andrew; Clark, Catharine H

    2015-11-21

    There is renewed interest in film dosimetry for the verification of dose delivery of complex treatments, particularly small fields, compared to treatment planning system calculations. A new radiochromic film, Gafchromic EBT-XD, is available for high-dose treatment verification and we present the first published evaluation of its use. We evaluate the new film for MV photon dosimetry, including calibration curves, performance with single- and triple-channel dosimetry, and comparison to existing EBT3 film. In the verification of a typical 25 Gy stereotactic radiosurgery (SRS) treatment, compared to TPS planned dose distribution, excellent agreement was seen with EBT-XD using triple-channel dosimetry, in isodose overlay, maximum 1.0 mm difference over 200-2400 cGy, and gamma evaluation, mean passing rate 97% at 3% locally-normalised, 1.5 mm criteria. In comparison to EBT3, EBT-XD gave improved evaluation results for the SRS plan, had improved calibration curve gradients at high doses, and had reduced lateral scanner effect. The dimensions of the two films are identical. The optical density of EBT-XD is lower than EBT3 for the same dose. The effective atomic number for both may be considered water-equivalent in MV radiotherapy. We have validated the use of EBT-XD for high-dose, small-field radiotherapy, for routine QC and a forthcoming multi-centre SRS dosimetry intercomparison.
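The gamma evaluation quoted above (dose-difference and distance-to-agreement criteria) can be illustrated with a brute-force 1D sketch. The profiles, grid spacings, and global normalisation below are illustrative assumptions, not the paper's data or its locally-normalised implementation:

```python
import numpy as np

def gamma_pass_rate(ref_x, ref_d, meas_x, meas_d, dose_tol=0.03, dist_tol=1.5):
    """Brute-force 1D global gamma. dose_tol is a fraction of the reference
    maximum; dist_tol and the x grids are in mm."""
    d_max = ref_d.max()
    n_pass = 0
    for xi, di in zip(meas_x, meas_d):
        # gamma = min over reference points of the combined dose/distance metric
        g2 = ((ref_x - xi) / dist_tol) ** 2 + ((ref_d - di) / (dose_tol * d_max)) ** 2
        if np.sqrt(g2.min()) <= 1.0:
            n_pass += 1
    return n_pass / len(meas_x)

profile = lambda x: 2400.0 * np.exp(-((x - 25.0) / 8.0) ** 2)  # toy SRS-like peak, cGy
ref_x = np.arange(0.0, 50.0, 0.1)       # finely sampled TPS dose
meas_x = np.arange(0.0, 50.0, 1.0)      # film sampled every 1 mm
ref_d = profile(ref_x)
meas_d = profile(meas_x + 0.5)          # film shifted by 0.5 mm
rate = gamma_pass_rate(ref_x, ref_d, meas_x, meas_d)
print(f"gamma passing rate: {rate:.0%}")
```

Since the 0.5 mm shift is well inside the 1.5 mm distance-to-agreement tolerance, every point passes in this toy case; real film comparisons also fold in scanner response and calibration uncertainty.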

  2. Evaluation of Gafchromic EBT-XD film, with comparison to EBT3 film, and application in high dose radiotherapy verification

    NASA Astrophysics Data System (ADS)

    Palmer, Antony L.; Dimitriadis, Alexis; Nisbet, Andrew; Clark, Catharine H.

    2015-11-01

    There is renewed interest in film dosimetry for the verification of dose delivery of complex treatments, particularly small fields, compared to treatment planning system calculations. A new radiochromic film, Gafchromic EBT-XD, is available for high-dose treatment verification and we present the first published evaluation of its use. We evaluate the new film for MV photon dosimetry, including calibration curves, performance with single- and triple-channel dosimetry, and comparison to existing EBT3 film. In the verification of a typical 25 Gy stereotactic radiosurgery (SRS) treatment, compared to TPS planned dose distribution, excellent agreement was seen with EBT-XD using triple-channel dosimetry, in isodose overlay, maximum 1.0 mm difference over 200-2400 cGy, and gamma evaluation, mean passing rate 97% at 3% locally-normalised, 1.5 mm criteria. In comparison to EBT3, EBT-XD gave improved evaluation results for the SRS plan, had improved calibration curve gradients at high doses, and had reduced lateral scanner effect. The dimensions of the two films are identical. The optical density of EBT-XD is lower than EBT3 for the same dose. The effective atomic number for both may be considered water-equivalent in MV radiotherapy. We have validated the use of EBT-XD for high-dose, small-field radiotherapy, for routine QC and a forthcoming multi-centre SRS dosimetry intercomparison.

  3. Verification of fluid-structure-interaction algorithms through the method of manufactured solutions for actuator-line applications

    NASA Astrophysics Data System (ADS)

    Vijayakumar, Ganesh; Sprague, Michael

    2017-11-01

    Demonstrating expected convergence rates with spatial- and temporal-grid refinement is the "gold standard" of code and algorithm verification. However, the lack of analytical solutions and the difficulty of generating manufactured solutions present challenges for verifying codes for complex systems. The application of the method of manufactured solutions (MMS) for verification of coupled multi-physics phenomena like fluid-structure interaction (FSI) has only seen recent investigation. While many FSI algorithms for aeroelastic phenomena have focused on boundary-resolved CFD simulations, the actuator-line representation of the structure is widely used for FSI simulations in wind-energy research. In this work, we demonstrate the verification of an FSI algorithm using MMS for actuator-line CFD simulations with a simplified spring-mass-damper (SMD) structural model. We use a manufactured solution for the fluid velocity field and the displacement of the SMD system. We demonstrate the convergence of both the fluid and structural solver to second-order accuracy with grid and time-step refinement. This work was funded by the U.S. Department of Energy, Office of Energy Efficiency and Renewable Energy, Wind Energy Technologies Office, under Contract No. DE-AC36-08-GO28308 with the National Renewable Energy Laboratory.

  4. Compromises produced by the dialectic between self-verification and self-enhancement.

    PubMed

    Morling, B; Epstein, S

    1997-12-01

    Three studies of people's reactions to evaluative feedback demonstrated that the dialectic between self-enhancement and self-verification results in compromises between these 2 motives, as hypothesized in cognitive-experiential self-theory. The demonstration was facilitated by 2 procedural improvements: Enhancement and verification were established by calibrating evaluative feedback against self-appraisals, and degree of enhancement and of verification were varied along a continuum, rather than categorically. There was also support for the hypotheses that processing in an intuitive-experiential mode favors enhancement and processing in an analytical-rational mode favors verification in the kinds of situations investigated.

  5. Bias Corrections for Regional Estimates of the Time-averaged Geomagnetic Field

    NASA Astrophysics Data System (ADS)

    Constable, C.; Johnson, C. L.

    2009-05-01

    We assess two sources of bias in the time-averaged geomagnetic field (TAF) and paleosecular variation (PSV): inadequate temporal sampling, and the use of unit vectors in deriving temporal averages of the regional geomagnetic field. For the temporal sampling question we use statistical resampling of existing data sets to minimize and correct for bias arising from uneven temporal sampling in studies of the TAF and its PSV. The techniques are illustrated using data derived from Hawaiian lava flows for 0-5 Ma: directional observations are an updated version of a previously published compilation of paleomagnetic directional data centered on ±20° latitude by Lawrence et al. (2006); intensity data are drawn from Tauxe & Yamazaki (2007). We conclude that poor temporal sampling can produce biased estimates of TAF and PSV, and that resampling to an appropriate statistical distribution of ages reduces this bias. We suggest that similar resampling should be attempted as a bias correction for all regional paleomagnetic data to be used in TAF and PSV modeling. The second potential source of bias is the use of directional data in place of full vector data to estimate the average field. This is investigated for the full vector subset of the updated Hawaiian data set. Lawrence, K.P., C.G. Constable, and C.L. Johnson, 2006, Geochem. Geophys. Geosyst., 7, Q07007, doi:10.1029/2005GC001181. Tauxe, L., & Yamazaki, T., 2007, Treatise on Geophysics, 5, Geomagnetism, Elsevier, Amsterdam, Chapter 13, p. 509.
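One simple form of the resampling idea, correcting for an age distribution clustered toward young flows, is to draw a weighted bootstrap whose weights are inversely proportional to the empirical age density. The ages, bin widths, and sample sizes below are invented for illustration, not the Hawaiian compilation:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical site ages (Ma): heavily clustered toward young flows,
# as is typical of lava-flow records
ages = np.concatenate([rng.uniform(0.0, 1.0, 400), rng.uniform(1.0, 5.0, 100)])

# Weight each datum inversely to the empirical age density of its bin,
# so a weighted resample approximates uniform coverage of 0-5 Ma
bins = np.linspace(0.0, 5.0, 11)
counts, _ = np.histogram(ages, bins)
bin_of = np.digitize(ages, bins[1:-1])          # bin index 0..9 per datum
weights = 1.0 / np.maximum(counts[bin_of], 1)   # guard against empty bins
weights /= weights.sum()

resampled = rng.choice(ages, size=500, replace=True, p=weights)
print("mean age before resampling:", round(float(ages.mean()), 2))
print("mean age after resampling: ", round(float(resampled.mean()), 2))
```

TAF/PSV statistics (mean direction, dispersion) computed on the resampled set are then less dominated by the over-sampled young interval; repeating the draw many times gives bootstrap confidence intervals.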

  6. Transmutation Fuel Performance Code Thermal Model Verification

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gregory K. Miller; Pavel G. Medvedev

    2007-09-01

    The FRAPCON fuel performance code is being modified to model the performance of the nuclear fuels of interest to the Global Nuclear Energy Partnership (GNEP). The present report documents the effort to verify the FRAPCON thermal model. It was found that, with minor modifications, the FRAPCON thermal model temperature calculation agrees with that of the commercial software ABAQUS (Version 6.4-4). This report outlines the methodology of the verification, the code input, and the calculation results.
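Code-to-code and code-to-analytic comparison of this kind can be sketched on the simplest relevant problem: steady radial conduction in a cylindrical pellet with uniform heat generation, where T(r) = Ts + q'''(R² - r²)/(4k) is known exactly. The finite-difference solver and all pellet parameters below are illustrative stand-ins, not FRAPCON or ABAQUS models:

```python
import numpy as np

def fuel_temp_fd(n, R, k, q, Ts):
    """Radial finite-volume solve of (1/r) d/dr(r k dT/dr) + q''' = 0."""
    r = np.linspace(0.0, R, n)
    dr = r[1] - r[0]
    A = np.zeros((n, n))
    b = np.zeros(n)
    A[0, 0], A[0, 1] = -1.0, 1.0          # symmetry at the centreline: dT/dr = 0
    A[-1, -1], b[-1] = 1.0, Ts            # fixed surface temperature
    for i in range(1, n - 1):
        rm, rp = r[i] - dr / 2.0, r[i] + dr / 2.0
        A[i, i - 1], A[i, i], A[i, i + 1] = rm, -(rm + rp), rp
        b[i] = -q * r[i] * dr**2 / k
    return r, np.linalg.solve(A, b)

# Illustrative pellet parameters: radius (m), conductivity (W/m-K),
# volumetric heat rate (W/m^3), surface temperature (K)
R, k, q, Ts = 0.005, 3.0, 3.0e8, 700.0
r, T = fuel_temp_fd(201, R, k, q, Ts)
T_exact = Ts + q * (R**2 - r**2) / (4.0 * k)   # analytic solution
print("centreline temperature:", round(float(T[0]), 1), "K")
print("max |FD - analytic|:", round(float(np.abs(T - T_exact).max()), 3), "K")
```

Agreement of the numerical and analytic centreline temperatures (here to a small fraction of a kelvin) is the basic acceptance check; grid refinement then confirms the expected convergence order.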

  7. Specification, Validation and Verification of Mobile Application Behavior

    DTIC Science & Technology

    2013-03-01

    Specification, Validation and Verification of Mobile Application Behavior. Master's thesis by Christopher B. Bonine, Lieutenant, United States Navy (B.S., Southern Polytechnic State University), Naval Postgraduate School, Monterey, CA 93943-5000, March 2013. Thesis Advisor: Man-Tak Shing; Thesis Co-Advisor: ...

  8. Control and Non-Payload Communications (CNPC) Prototype Radio Verification Test Report

    NASA Technical Reports Server (NTRS)

    Bishop, William D.; Frantz, Brian D.; Thadhani, Suresh K.; Young, Daniel P.

    2017-01-01

    This report provides an overview and results from the verification of the specifications that define the operational capabilities of the airborne and ground, L-band and C-band, Control and Non-Payload Communications (CNPC) radio link system. An overview of system verification is provided along with an overview of the operation of the radio. Measurement results are presented for verification of the radio's operation.

  9. Is the Magnetic Field in the Heliosheath Sector Region and in the Outer Heliosheath Laminar?

    NASA Astrophysics Data System (ADS)

    Opher, M.; Drake, J. F.; Swisdak, M. M.; Toth, G.

    2010-12-01

    All the current global models of the heliosphere are based on the assumption that the magnetic field in the outer heliosheath close to the heliopause is laminar. We argue that in the outer heliosheath the heliospheric magnetic field is not laminar but instead consists of nested magnetic islands. Recently, we proposed (Drake et al. 2009) that the annihilation of the "sectored" magnetic field within the heliosheath as it is compressed on its approach to the heliopause produces the anomalous cosmic rays (ACRs) and also energetic electrons. As a product of the annihilation of the sectored magnetic field, densely packed magnetic islands are produced. These magnetic islands will be convected with the ambient flows as the sector boundary is carried to higher latitudes, filling the outer heliosheath. We further argue that the magnetic islands will develop upstream (but still within the heliosheath) where collisionless reconnection is unfavorable: large perturbations of the sector structure near the heliopause will cause compressions of the current sheet upstream, triggering reconnection. As a result, the magnetic field in the heliosheath sector region will be disordered well upstream of the heliopause. We present a 3D MHD simulation with unprecedented numerical resolution that captures the sector boundary. We show that due to the high pressure of the interstellar magnetic field the disordered sectored region fills a large portion of the northern part of the heliosphere with a smaller extension in the southern hemisphere. We test these ideas with observations of energetic electrons, which because of their high velocity are most sensitive to the structure of the magnetic field. We suggest that within our scenario we can explain two significant anomalies in the observations of energetic electrons in the outer heliosphere: the sudden decrease in the intensity of low-energy electrons (0.02-1.5 MeV) from the LECP instrument on Voyager 2 in 2008 (Decker 2010); and the dramatic...

  10. Geology of the National Capital Region: field trip guidebook

    USGS Publications Warehouse

    Burton, William; Southworth, Scott

    2004-01-01

    The 2004 Joint Northeast-Southeast Section Meeting of the Geological Society of America is the fourth such meeting and the third to be held in or near Washington, D.C. This guidebook and the field trips presented herein are intended to provide meeting participants, as well as other interested readers, a means to understand and enjoy the rich geological and historical legacy of the National Capital Region. The field trips cover all of the major physiographic and geologic provinces of the central Appalachians in the Mid-Atlantic region. Trip 1 outlines the tectonic history of northern Virginia along an east-to-west transect from the Coastal Plain province to the Blue Ridge province, whereas the other field trips each focus on a specific province. From west to east, these excursions investigate the paleoclimate controls on the stratigraphy of the Paleozoic rocks of the Allegheny Plateau and Valley and Ridge province in West Virginia, Pennsylvania, and Maryland (Trip 3); Eocene volcanic rocks that intrude Paleozoic rocks in the westernmost Valley and Ridge province in Virginia and West Virginia (Trip 4); age, petrology, and structure of Mesoproterozoic gneisses and granitoids located in the Blue Ridge province within and near Shenandoah National Park, Virginia (Trip 2); the use of argon data to unravel the complex structural and thermal history of the metamorphic rocks of the eastern Piedmont province in Maryland and Virginia (Trip 5); the use of cosmogenic isotopes to understand the timing of bedrock incision and formation of terraces along the Potomac River in the eastern Piedmont province near Great Falls, Virginia and Maryland (Trip 6); the nature of the boundary between rocks of the Goochland and Chopawamsic terranes in the eastern Piedmont of Virginia (Trip 7); the role of bluffs and fluvial terraces of the Coastal Plain in the Civil War Battle of Fredericksburg, Virginia (Trip 8); and the Tertiary lithology and paleontology of Coastal Plain strata around the

  11. Enhanced dynamic wedge and independent monitor unit verification.

    PubMed

    Howlett, S J

    2005-03-01

    Some serious radiation accidents have occurred around the world during the delivery of radiotherapy treatment. The regrettable incident in Panama clearly indicated the need for independent monitor unit (MU) verification. Indeed the International Atomic Energy Agency (IAEA), after investigating the incident, made specific recommendations for radiotherapy centres, which included an independent monitor unit check for all treatments. Independent monitor unit verification is practiced in many radiotherapy centres in developed countries around the world. It is mandatory in the USA but not yet in Australia. This paper describes the development of an independent MU program, concentrating on the implementation of the Enhanced Dynamic Wedge (EDW) component. The difficult case of non-centre-of-field (COF) calculation points under the EDW was studied in some detail. Results of a survey of Australasian centres regarding the use of independent MU check systems are also presented. The system was developed with reference to MU calculations made by the Pinnacle 3D Radiotherapy Treatment Planning (RTP) system (ADAC - Philips) for 4 MV, 6 MV and 18 MV X-ray beams used at the Newcastle Mater Misericordiae Hospital (NMMH) in the clinical environment. A small systematic error was detected in the equation used for the EDW calculations. Results indicate that COF equations may be used in the non-COF situation with similar accuracy to that achieved with profile-corrected methods. Further collaborative work with other centres is planned to extend these findings.
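An independent MU check of the kind described reduces, in its simplest form, to recomputing MU from the prescribed dose and a chain of measured factors and comparing against the TPS value. The generic point-dose formula and every number below are illustrative assumptions, not the paper's beam data or its EDW model:

```python
def independent_mu(dose_cgy, ref_dose_per_mu=1.0, sc=1.0, sp=1.0,
                   tpr=1.0, wedge_factor=1.0, off_axis_ratio=1.0):
    """Generic point-dose monitor unit check (illustrative formula):
    MU = D / (D_ref * Sc * Sp * TPR * WF * OAR)."""
    return dose_cgy / (ref_dose_per_mu * sc * sp * tpr
                       * wedge_factor * off_axis_ratio)

# Hypothetical EDW field: 200 cGy prescribed; 0.75 stands in for the
# wedge factor at an off-centre calculation point
mu_check = independent_mu(200.0, sc=1.01, sp=0.99, tpr=0.9, wedge_factor=0.75)
mu_tps = 295.0                     # hypothetical TPS-calculated MU
deviation = abs(mu_check - mu_tps) / mu_tps
print(f"independent MU: {mu_check:.1f}  (deviation from TPS: {deviation:.1%})")
```

In clinical use the deviation is compared against an action level (commonly a few percent); exceeding it triggers investigation before treatment.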

  12. A Hybrid On-line Verification Method of Relay Setting

    NASA Astrophysics Data System (ADS)

    Gao, Wangyuan; Chen, Qing; Si, Ji; Huang, Xin

    2017-05-01

    Along with the rapid development of the power industry, grid structures have become more sophisticated. The validity and rationality of protective relay settings are vital to the security of power systems. To increase that security, it is essential to verify the setting values of relays online. Traditional verification methods mainly include comparison of the protection range and comparison of the calculated setting value. For on-line verification, verification speed is the key. Comparing the protection range gives an accurate result, but the computational burden is heavy and the verification is slow. Comparing the calculated setting value is much faster, but the result is conservative and inaccurate. Taking overcurrent protection as an example, this paper analyses the advantages and disadvantages of the two traditional methods and proposes a hybrid on-line verification method that synthesizes the advantages of both. This hybrid method can meet the requirements of accurate on-line verification.
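The hybrid idea, a fast conservative screen with an accurate fallback, can be sketched as follows. The two check functions and all relay numbers are simplified stand-ins for the paper's setting-value and protection-range computations:

```python
def quick_setting_check(setting, calculated, margin=0.05):
    """Fast screen: accept only if the stored setting is close to a freshly
    calculated value (conservative; borderline cases are escalated)."""
    return abs(setting - calculated) / calculated <= margin

def protection_range_check(setting, zone_fault_currents):
    """Slow, accurate check: the relay must pick up for every simulated
    fault in its protected zone (stands in for a full range computation)."""
    return all(i_fault > setting for i_fault in zone_fault_currents)

def hybrid_verify(setting, calculated, zone_fault_currents):
    # Run the cheap screen first; invoke the expensive check only when needed
    if quick_setting_check(setting, calculated):
        return True
    return protection_range_check(setting, zone_fault_currents)

# Hypothetical overcurrent relay: 800 A pickup vs 820 A recalculated setting
print(hybrid_verify(800.0, 820.0, [1200.0, 950.0, 1500.0]))  # fast path accepts
# 800 A pickup vs 900 A recalculated: escalates to the range check
print(hybrid_verify(800.0, 900.0, [1200.0, 950.0, 1500.0]))
```

Because most relays pass the cheap screen, the average on-line verification time stays close to that of the fast method while retaining the accuracy of the slow one for the cases that matter.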

  13. Verification of Space Weather Forecasts using Terrestrial Weather Approaches

    NASA Astrophysics Data System (ADS)

    Henley, E.; Murray, S.; Pope, E.; Stephenson, D.; Sharpe, M.; Bingham, S.; Jackson, D.

    2015-12-01

    The Met Office Space Weather Operations Centre (MOSWOC) provides a range of 24/7 operational space weather forecasts, alerts, and warnings, which provide valuable information on space weather that can degrade electricity grids, radio communications, and satellite electronics. Forecasts issued include arrival times of coronal mass ejections (CMEs), and probabilistic forecasts for flares, geomagnetic storm indices, and energetic particle fluxes and fluences. These forecasts are produced twice daily using a combination of output from models such as Enlil, near-real-time observations, and forecaster experience. Verification of forecasts is crucial for users, researchers, and forecasters to understand the strengths and limitations of forecasts, and to assess forecaster added value. To this end, the Met Office (in collaboration with Exeter University) has been adapting verification techniques from terrestrial weather, and has been working closely with the International Space Environment Service (ISES) to standardise verification procedures. We will present the results of part of this work, analysing forecast and observed CME arrival times, assessing skill using 2x2 contingency tables. These MOSWOC forecasts can be objectively compared to those produced by the NASA Community Coordinated Modeling Center - a useful benchmark. This approach cannot be taken for the other forecasts, as they are probabilistic and categorical (e.g., geomagnetic storm forecasts give probabilities of exceeding levels from minor to extreme). We will present appropriate verification techniques being developed to address these forecasts, such as the rank probability skill score, and comparing forecasts against climatology and persistence benchmarks. As part of this, we will outline the use of discrete-time Markov chains to assess and improve the performance of our geomagnetic storm forecasts. We will also discuss work to adapt a terrestrial verification visualisation system to space weather, to help...
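The ranked probability score underlying the skill score mentioned above compares the cumulative forecast distribution to the observed category across ordinal categories. A minimal sketch, with an invented four-level storm forecast for illustration:

```python
import numpy as np

def rps(forecast_probs, obs_category):
    """Ranked probability score for one ordinal-category forecast.
    forecast_probs: per-category probabilities (sum to 1);
    obs_category: index of the observed category. Lower is better."""
    k = len(forecast_probs)
    cdf_fcst = np.cumsum(forecast_probs)
    cdf_obs = np.zeros(k)
    cdf_obs[obs_category:] = 1.0        # step function at the observed category
    return np.sum((cdf_fcst - cdf_obs) ** 2) / (k - 1)

# Hypothetical storm forecast over four ordered levels (minor..extreme),
# with the second level observed
fcst = np.array([0.6, 0.3, 0.08, 0.02])
print(round(rps(fcst, obs_category=1), 4))  # 0.1235
```

The skill score is then formed against a reference such as climatology, RPSS = 1 - RPS/RPS_clim, so positive values indicate added value over the benchmark.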

  14. Methods for identification and verification using vacuum XRF system

    NASA Technical Reports Server (NTRS)

    Kaiser, Bruce (Inventor); Schramm, Fred (Inventor)

    2005-01-01

    Apparatus and methods in which one or more elemental taggants that are intrinsically located in an object are detected by x-ray fluorescence analysis under vacuum conditions to identify or verify the object's elemental content for elements with lower atomic numbers. By using x-ray fluorescence analysis, the apparatus and methods of the invention are simple and easy to use, and provide non-line-of-sight detection to establish the origin of objects, their point of manufacture, authenticity, verification, security, and the presence of impurities. The invention is extremely advantageous because it provides the capability to measure lower atomic number elements in the field with a portable instrument.

  15. Prompt gamma timing range verification for scattered proton beams

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kormoll, T.; Golnik, C.; Hueso Gonzalez, F.

    2015-07-01

    Range verification is a very important point in order to fully exploit the physical advantages of protons compared to photons in cancer irradiation. Recently, a simple method has been proposed which makes use of the time of flight of protons in tissue and the promptly emitted secondary photons along the proton path (Prompt Gamma Timing, PGT). This has been considered so far for monoenergetic pencil beams only. In this work, it has been studied whether this technique can also be applied in passively formed irradiation fields with a so-called spread-out Bragg peak. Time-correlated profiles could be recorded, which show a trend that is consistent with theoretical predictions. (authors)

  16. Integrating Fingerprint Verification into the Smart Card-Based Healthcare Information System

    NASA Astrophysics Data System (ADS)

    Moon, Daesung; Chung, Yongwha; Pan, Sung Bum; Park, Jin-Won

    2009-12-01

    As VLSI technology has improved, smart cards employing 32-bit processors have been released, and more personal information, such as medical and financial data, can be stored in the card. It has therefore become important to protect personal information stored in the card. Verification of the card holder's identity using a fingerprint has advantages over the present practices of Personal Identification Numbers (PINs) and passwords. However, the computational workload of fingerprint verification is much heavier than that of the typical PIN-based solution. In this paper, we consider three strategies for implementing fingerprint verification in a smart card environment, differing in how the modules of fingerprint verification are distributed between the smart card and the card reader. We first evaluate the number of instructions of each step of a typical fingerprint verification algorithm, and estimate the execution time of several cryptographic algorithms that guarantee the security/privacy of the fingerprint data transmitted between the smart card and the client-server environment. Based on the evaluation results, we analyze each scenario with respect to the security level and the real-time execution requirements for implementing fingerprint verification in the smart card within the client-server environment.

  17. Measles and rubella elimination in the WHO Region for Europe: progress and challenges.

    PubMed

    O'Connor, P; Jankovic, D; Muscat, M; Ben-Mamou, M; Reef, S; Papania, M; Singh, S; Kaloumenos, T; Butler, R; Datta, S

    2017-08-01

    Globally, measles remains one of the leading causes of death among young children, even though a safe and cost-effective vaccine is available. The World Health Organization (WHO) European Region has seen a decline in measles and rubella cases in recent years. The recent outbreaks have primarily affected adolescents and young adults with no vaccination or an incomplete vaccination history. Eliminating measles and rubella is one of the top immunization priorities of the European Region, as outlined in the European Vaccine Action Plan 2015-2020. Following the 2010 decision by the Member States in the Region to initiate the process of verifying elimination, the European Regional Verification Commission for Measles and Rubella Elimination (RVC) was established in 2011. The RVC meets every year to evaluate the status of measles and rubella elimination in the Region based on documentation submitted by each country's National Verification Committee. The verification process was, however, modified in late 2014 to assess the elimination status at the individual country level instead of at the regional level. The WHO European Region has made substantial progress towards measles and rubella elimination over the past 5 years. The RVC's conclusion in 2016 that 70% and 66% of the 53 Member States in the Region had interrupted the endemic transmission of measles and rubella, respectively, by 2015 is a testament to this progress. Nevertheless, where measles and rubella remain endemic, challenges in vaccination service delivery and disease surveillance will need to be addressed through focused technical assistance from WHO and development partners. Copyright © 2017 European Society of Clinical Microbiology and Infectious Diseases. Published by Elsevier Ltd. All rights reserved.

  18. Formal System Verification for Trustworthy Embedded Systems

    DTIC Science & Technology

    2011-04-19

    microkernel basis. We had previously achieved code-level formal verification of the seL4 microkernel [3]. In the present project, over 12 months with 0.6 FTE...project, we designed and implemented a secure network access device (SAC) on top of the verified seL4 microkernel. The device allows a trusted front...Engelhardt, Rafal Kolanski, Michael Norrish, Thomas Sewell, Harvey Tuch, and Simon Winwood. seL4: Formal verification of an OS kernel. CACM, 53(6):107

  19. Regional Seismic Arrays and Nuclear Test Ban Verification

    DTIC Science & Technology

    1990-12-01

    estimation has been difficult to automate, at least for regional and teleseismic signals. A neural network approach might be applicable here. The data must...use of trained neural networks. Of the 95 events examined, 66 were selected for the classification study based on high signal-to-noise ratio and...the International Joint Conference on Neural Networks, Washington, D.C., June 1989. Menke, W. Geophysical Data Analysis: Discrete Inverse Theory

  20. 18 CFR 385.2005 - Subscription and verification (Rule 2005).

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... 18 Conservation of Power and Water Resources 1 2011-04-01 2011-04-01 false Subscription and verification (Rule 2005). 385.2005 Section 385.2005 Conservation of Power and Water Resources FEDERAL ENERGY... Requirements for Filings in Proceedings Before the Commission § 385.2005 Subscription and verification (Rule...

  1. 18 CFR 385.2005 - Subscription and verification (Rule 2005).

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ... 18 Conservation of Power and Water Resources 1 2013-04-01 2013-04-01 false Subscription and verification (Rule 2005). 385.2005 Section 385.2005 Conservation of Power and Water Resources FEDERAL ENERGY... Requirements for Filings in Proceedings Before the Commission § 385.2005 Subscription and verification (Rule...

  2. 18 CFR 385.2005 - Subscription and verification (Rule 2005).

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... 18 Conservation of Power and Water Resources 1 2014-04-01 2014-04-01 false Subscription and verification (Rule 2005). 385.2005 Section 385.2005 Conservation of Power and Water Resources FEDERAL ENERGY... Requirements for Filings in Proceedings Before the Commission § 385.2005 Subscription and verification (Rule...

  3. 18 CFR 385.2005 - Subscription and verification (Rule 2005).

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... 18 Conservation of Power and Water Resources 1 2012-04-01 2012-04-01 false Subscription and verification (Rule 2005). 385.2005 Section 385.2005 Conservation of Power and Water Resources FEDERAL ENERGY... Requirements for Filings in Proceedings Before the Commission § 385.2005 Subscription and verification (Rule...

  4. 45 CFR 1626.7 - Verification of eligible alien status.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... 45 Public Welfare 4 2014-10-01 2014-10-01 false Verification of eligible alien status. 1626.7... CORPORATION RESTRICTIONS ON LEGAL ASSISTANCE TO ALIENS § 1626.7 Verification of eligible alien status. (a) An alien seeking representation shall submit appropriate documents to verify eligibility, unless the only...

  5. 45 CFR 1626.7 - Verification of eligible alien status.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... 45 Public Welfare 4 2012-10-01 2012-10-01 false Verification of eligible alien status. 1626.7... CORPORATION RESTRICTIONS ON LEGAL ASSISTANCE TO ALIENS § 1626.7 Verification of eligible alien status. (a) An alien seeking representation shall submit appropriate documents to verify eligibility, unless the only...

  6. 45 CFR 1626.7 - Verification of eligible alien status.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... 45 Public Welfare 4 2013-10-01 2013-10-01 false Verification of eligible alien status. 1626.7... CORPORATION RESTRICTIONS ON LEGAL ASSISTANCE TO ALIENS § 1626.7 Verification of eligible alien status. (a) An alien seeking representation shall submit appropriate documents to verify eligibility, unless the only...

  7. 45 CFR 1626.7 - Verification of eligible alien status.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... 45 Public Welfare 4 2011-10-01 2011-10-01 false Verification of eligible alien status. 1626.7... CORPORATION RESTRICTIONS ON LEGAL ASSISTANCE TO ALIENS § 1626.7 Verification of eligible alien status. (a) An alien seeking representation shall submit appropriate documents to verify eligibility, unless the only...

  8. 45 CFR 1626.7 - Verification of eligible alien status.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 45 Public Welfare 4 2010-10-01 2010-10-01 false Verification of eligible alien status. 1626.7... CORPORATION RESTRICTIONS ON LEGAL ASSISTANCE TO ALIENS § 1626.7 Verification of eligible alien status. (a) An alien seeking representation shall submit appropriate documents to verify eligibility, unless the only...

  9. 29 CFR 403.8 - Dissemination and verification of reports.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... LABOR-MANAGEMENT STANDARDS LABOR ORGANIZATION ANNUAL FINANCIAL REPORTS § 403.8 Dissemination and verification of reports. (a) Every labor organization required to submit a report under section 201(b) of the... 29 Labor 2 2010-07-01 2010-07-01 false Dissemination and verification of reports. 403.8 Section...

  10. GAMMA–GAMMA ABSORPTION IN THE BROAD LINE REGION RADIATION FIELDS OF GAMMA-RAY BLAZARS

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Böttcher, Markus; Els, Paul, E-mail: Markus.Bottcher@nwu.ac.za

    2016-04-20

    The expected level of γγ absorption in the Broad Line Region (BLR) radiation field of γ-ray loud Flat Spectrum Radio Quasars (FSRQs) is evaluated as a function of the location of the γ-ray emission region. This is done self-consistently with parameters inferred from the shape of the spectral energy distribution (SED) in a single-zone leptonic EC-BLR model scenario. We take into account all geometrical effects both in the calculation of the γγ opacity and the normalization of the BLR radiation energy density. As specific examples, we study the FSRQs 3C279 and PKS 1510-089, keeping the BLR radiation energy density at the location of the emission region fixed at the values inferred from the SED. We confirm previous findings that the optical depth due to γγ absorption in the BLR radiation field exceeds unity for both 3C279 and PKS 1510-089 for locations of the γ-ray emission region inside the inner boundary of the BLR. It decreases monotonically with distance from the central engine and drops below unity for locations within the BLR. For locations outside the BLR, the BLR radiation energy density required for the production of GeV γ-rays rapidly increases beyond observational constraints, thus making the EC-BLR mechanism implausible. Therefore, in order to avoid significant γγ absorption by the BLR radiation field, the γ-ray emission region must be located near the outer boundary of the BLR.

  11. A Quantitative Approach to the Formal Verification of Real-Time Systems.

    DTIC Science & Technology

    1996-09-01

    Computer Science: A Quantitative Approach to the Formal Verification of Real-Time Systems. Sergio Vale Aguiar Campos. September 1996. CMU-CS-96-199...A Quantitative Approach to the Formal Verification of Real-Time Systems. Sergio Vale Aguiar Campos...implied, of NSF, the Semiconductor Research Corporation, ARPA or the U.S. government. Keywords: real-time systems, formal verification, symbolic

  12. Observations of vector magnetic fields in flaring active regions

    NASA Technical Reports Server (NTRS)

    Chen, Jimin; Wang, Haimin; Zirin, Harold; Ai, Guoxiang

    1994-01-01

    We present vector magnetograph data of 6 active regions, all of which produced major flares. Of the 20 M-class (or above) flares, 7 satisfy the flare conditions prescribed by Hagyard (high shear and strong transverse fields). Strong photospheric shear, however, is not necessarily a condition for a flare. We find an increase in the shear for two flares: a 6-deg shear increase along the neutral line after an X-2 flare and a 13-deg increase after an M-1.9 flare. For the other flares, we did not detect substantial shear changes.

  13. The effects of incidence angle on film dosimetry and their consequences in IMRT dose verification.

    PubMed

    Srivastava, R P; De Wagter, C

    2012-10-01

    The dosimetric accuracy of EDR2 radiographic film has been rigorously assessed in regular and intensity-modulated beams for various incidence angles, including the parallel and perpendicular orientations. There clearly exists confusion in the literature regarding the effect of film orientation. The primary aim is to clarify potential sources of the confusion and to gain physical insight into the film orientation effect, with a link to radiochromic film as well. An inverse-pyramid IMRT field, consisting of six regular and elongated 3 × 20 cm² field segments, was studied in perpendicular and parallel orientation. Assessment of film self-perturbation and intrinsic directional sensitivity were also included in the experiments. Finally, the authors investigated the orientational effect in composite beams in the two extreme orientations, i.e., perpendicular and parallel. The study of an inverse-pyramid dose profile revealed good agreement between the perpendicular film and the diamond detector within 0.5% in the low-scatter regions for both 6 and 18 MV. The parallel-oriented film demonstrated a 3% under-response at 5-cm depth (6 MV) against the perpendicular orientation, but both orientations over-responded equally in the central region, which received only scattered dose, at both 5- and 20-cm depths. In a regular 6-MV 5 × 5 cm² field, a 4.1% lower film response was observed in the parallel orientation compared to the perpendicular orientation. The under-response gradually increased to 6% when reducing the field size to 0.5 × 5 cm². On the other hand, the film showed a 1.7% lower response in parallel orientation for the large field size of 20 × 20 cm² at 5-cm depth, but the difference disappeared at 10 cm. At 18 MV, similar but somewhat lower differences were found between the two orientations. The directional sensitivity of the film diminishes with increasing field size and depth. Surprisingly, a composite IMRT beam consisting of 20 adjacent strip segments also

  14. Enhanced verification test suite for physics simulation codes

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kamm, James R.; Brock, Jerry S.; Brandon, Scott T.

    2008-09-01

    This document discusses problems with which to augment, in quantity and in quality, the existing tri-laboratory suite of verification problems used by Los Alamos National Laboratory (LANL), Lawrence Livermore National Laboratory (LLNL), and Sandia National Laboratories (SNL). The purpose of verification analysis is to demonstrate whether the numerical results of the discretization algorithms in physics and engineering simulation codes provide correct solutions of the corresponding continuum equations.
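    One standard ingredient of such verification problems is checking that a discretization converges to the continuum solution at its formal order of accuracy. The following sketch is purely illustrative (it is not drawn from the tri-laboratory suite): it estimates the observed order of convergence of a second-order central difference against an exact derivative.

```python
import math

# Code-verification sketch: estimate the observed order of convergence of a
# discrete operator against the exact continuum answer, using two mesh spacings.
def central_diff(f, x, h):
    """Second-order central difference approximation to f'(x)."""
    return (f(x + h) - f(x - h)) / (2 * h)

exact = math.cos(1.0)                                  # exact d/dx sin(x) at x = 1
e1 = abs(central_diff(math.sin, 1.0, 1e-2) - exact)    # error at spacing h
e2 = abs(central_diff(math.sin, 1.0, 5e-3) - exact)    # error at spacing h/2

# Observed order p = log(e1/e2) / log(h1/h2); a correct implementation
# should recover the formal order of the scheme (2 here).
p = math.log(e1 / e2) / math.log(2.0)
print(f"observed order of convergence ≈ {p:.3f}")
```

If the observed order falls short of the formal order, either the implementation or the asserted order of the scheme is wrong, which is exactly the kind of defect verification problems are designed to expose.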

  15. A Solar Eruption from a Weak Magnetic Field Region with Relatively Strong Geo-Effectiveness

    NASA Astrophysics Data System (ADS)

    Wang, R.

    2017-12-01

    A moderate flare eruption on 2015 November 4, giving rise to a series of geo-effective disturbances, caught our attention because it originated from a relatively weak magnetic field region. The associated characteristics near the Earth are presented, which indicate that the southward magnetic field in the sheath and the ICME induced a geomagnetic storm sequence with a global Dst minimum of −90 nT. The ICME is indicated to have a small inclination angle by using a Grad-Shafranov technique, and corresponds to the flux rope (FR) structure horizontally lying on the solar surface. A small-scale magnetic cancelling feature was detected beneath the FR, co-aligned with the Atmospheric Imaging Assembly (AIA) EUV brightening prior to the eruption. Various magnetic features for space-weather forecasting are computed using a data product from the Helioseismic and Magnetic Imager (HMI) onboard the Solar Dynamics Observatory (SDO) called Space-weather HMI Active Region Patches (SHARPs), which help us identify the changes of the photospheric magnetic fields during the magnetic cancellation process and show that the magnetic reconnection associated with the flux cancellation is driven by the magnetic shearing motion on the photosphere. An analysis of the distributions of the decay index at different heights is carried out. Combined with a filament height estimation method, the configuration of the FR is identified, and a decay index critical value n = 1 is considered to be more appropriate for such a weak magnetic field region. Through a comprehensive analysis of the trigger mechanisms and conditions of the eruption, a clearer scenario of a CME from a relatively weak region is presented.

  16. 34 CFR 668.54 - Selection of applications for verification.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 34 Education 3 2010-07-01 2010-07-01 false Selection of applications for verification. 668.54 Section 668.54 Education Regulations of the Offices of the Department of Education (Continued) OFFICE OF POSTSECONDARY EDUCATION, DEPARTMENT OF EDUCATION STUDENT ASSISTANCE GENERAL PROVISIONS Verification of Student Aid Application Information § 668.54...

  17. PET/CT imaging for treatment verification after proton therapy: A study with plastic phantoms and metallic implants

    PubMed Central

    Parodi, Katia; Paganetti, Harald; Cascio, Ethan; Flanz, Jacob B.; Bonab, Ali A.; Alpert, Nathaniel M.; Lohmann, Kevin; Bortfeld, Thomas

    2008-01-01

    The feasibility of off-line positron emission tomography/computed tomography (PET/CT) for routine three-dimensional in-vivo treatment verification of proton radiation therapy is currently under investigation at Massachusetts General Hospital in Boston. In preparation for clinical trials, phantom experiments were carried out to investigate the sensitivity and accuracy of the method depending on irradiation and imaging parameters. Furthermore, they addressed the feasibility of PET/CT as a robust verification tool in the presence of metallic implants. These produce x-ray CT artifacts and fluence perturbations which may compromise the accuracy of treatment planning algorithms. Spread-out Bragg peak proton fields were delivered to different phantoms consisting of polymethylmethacrylate (PMMA), PMMA stacked with lung and bone equivalent materials, and PMMA with titanium rods to mimic implants in patients. PET data were acquired in list mode starting within 20 min after irradiation at a commercial lutetium-oxyorthosilicate (LSO)-based PET/CT scanner. The amount and spatial distribution of the measured activity could be well reproduced by calculations based on the GEANT4 and FLUKA Monte Carlo codes. This phantom study supports the potential of millimeter accuracy for range monitoring and lateral field position verification even after low therapeutic dose exposures of 2 Gy, despite the delay between irradiation and imaging. It also indicates the value of PET for treatment verification in the presence of metallic implants, demonstrating a higher sensitivity to fluence perturbations in comparison to a commercial analytical treatment planning system. Finally, it addresses the suitability of LSO-based PET detectors for hadron therapy monitoring.
This unconventional application of PET involves count rates which are orders of magnitude lower than in diagnostic tracer imaging, i.e., the signal of interest is comparable to the noise originating from the intrinsic radioactivity of

  18. NCEP Model Verification

    Science.gov Websites

    Daily and monthly verification statistics are produced; the processing is broken down into three regions: the Northern Hemisphere, the Southern Hemisphere, and the Tropics. Verification of geopotential height and wind uses daily statistics from the gdas1 prepbufr files at 00Z, 06Z, 12Z, and 18Z. Daily S1 scores from the GFS and NAM models are

  19. The Verification-based Analysis of Reliable Multicast Protocol

    NASA Technical Reports Server (NTRS)

    Wu, Yunqing

    1996-01-01

    Reliable Multicast Protocol (RMP) is a communication protocol that provides an atomic, totally ordered, reliable multicast service on top of unreliable IP Multicasting. In this paper, we develop formal models for RMP using existing automatic verification systems, and perform verification-based analysis on the formal RMP specifications. We also use the formal models of the RMP specifications to generate a test suite for conformance testing of the RMP implementation. Throughout the process of RMP development, we follow an iterative, interactive approach that emphasizes concurrent and parallel progress between the implementation and verification processes. Through this approach, we incorporate formal techniques into our development process, promote a common understanding of the protocol, increase the reliability of our software, and maintain high fidelity between the specifications of RMP and its implementation.

  20. Effect of verification cores on tip capacity of drilled shafts.

    DOT National Transportation Integrated Search

    2009-02-01

    This research addressed two key issues: 1) Will verification core holes fill during concrete backfilling? If so, what are the mechanical properties of the filling material? In dry conditions, verification core holes always completely fill with c...

  1. PFLOTRAN Verification: Development of a Testing Suite to Ensure Software Quality

    NASA Astrophysics Data System (ADS)

    Hammond, G. E.; Frederick, J. M.

    2016-12-01

    In scientific computing, code verification ensures the reliability and numerical accuracy of a model simulation by comparing the simulation results to experimental data or known analytical solutions. The model is typically defined by a set of partial differential equations with initial and boundary conditions, and verification determines whether the mathematical model is solved correctly by the software. Code verification is especially important if the software is used to model high-consequence systems which cannot be physically tested in a fully representative environment [Oberkampf and Trucano (2007)]. Justified confidence in a particular computational tool requires clarity in the exercised physics and transparency in its verification process with proper documentation. We present a quality assurance (QA) testing suite developed by Sandia National Laboratories that performs code verification for PFLOTRAN, an open source, massively-parallel subsurface simulator. PFLOTRAN solves systems of generally nonlinear partial differential equations describing multiphase, multicomponent and multiscale reactive flow and transport processes in porous media. PFLOTRAN's QA test suite compares the numerical solutions of benchmark problems in heat and mass transport against known, closed-form, analytical solutions, including documentation of the exercised physical process models implemented in each PFLOTRAN benchmark simulation. The QA test suite development strives to follow the recommendations given by Oberkampf and Trucano (2007), which describes four essential elements in high-quality verification benchmark construction: (1) conceptual description, (2) mathematical description, (3) accuracy assessment, and (4) additional documentation and user information.
Several QA tests within the suite will be presented, including details of the benchmark problems and their closed-form analytical solutions, implementation of benchmark problems in PFLOTRAN simulations, and the criteria used to
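    The benchmark pattern described in this record, numerical output compared against a known closed-form analytical solution under an explicit accuracy criterion, can be sketched in a few lines. The problem, discretization, and tolerance below are illustrative assumptions, not PFLOTRAN's actual QA tests:

```python
import math

# QA-style benchmark sketch: solve 1D transient heat conduction
#   u_t = D * u_xx,  u(0,t) = u(1,t) = 0,  u(x,0) = sin(pi * x)
# with an explicit finite-difference scheme, then compare against the
# closed-form analytical solution u(x,t) = exp(-D * pi^2 * t) * sin(pi * x).
def run_benchmark(nx=51, D=1.0, t_end=0.01):
    dx = 1.0 / (nx - 1)
    dt = 0.25 * dx * dx / D                 # stable explicit time step
    steps = int(round(t_end / dt))
    u = [math.sin(math.pi * i * dx) for i in range(nx)]
    for _ in range(steps):
        # Forward-time, centered-space update with fixed zero boundaries.
        u = [0.0] + [
            u[i] + D * dt / dx**2 * (u[i + 1] - 2 * u[i] + u[i - 1])
            for i in range(1, nx - 1)
        ] + [0.0]
    t = steps * dt
    exact = [math.exp(-D * math.pi**2 * t) * math.sin(math.pi * i * dx)
             for i in range(nx)]
    return max(abs(num - ana) for num, ana in zip(u, exact))

err = run_benchmark()
print(f"max |numerical - analytical| = {err:.2e}")
assert err < 1e-3   # accuracy-assessment criterion for this benchmark
```

The four-element structure recommended by Oberkampf and Trucano maps naturally onto such a test: the comments give the conceptual and mathematical description, and the assertion encodes the accuracy assessment.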

  2. Generic Protocol for the Verification of Ballast Water Treatment Technology

    EPA Science Inventory

    In anticipation of the need to address performance verification and subsequent approval of new and innovative ballast water treatment technologies for shipboard installation, the U.S Coast Guard and the Environmental Protection Agency‘s Environmental Technology Verification Progr...

  3. 24 CFR 5.512 - Verification of eligible immigration status.

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... (b) of this section and § 5.514, no individual or family applying for assistance may receive such assistance prior to the verification of the eligibility of at least the individual or one family member. Verification of eligibility consistent with § 5.514 occurs when the individual or family members have submitted...

  4. 24 CFR 5.512 - Verification of eligible immigration status.

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... (b) of this section and § 5.514, no individual or family applying for assistance may receive such assistance prior to the verification of the eligibility of at least the individual or one family member. Verification of eligibility consistent with § 5.514 occurs when the individual or family members have submitted...

  5. 24 CFR 5.512 - Verification of eligible immigration status.

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ... (b) of this section and § 5.514, no individual or family applying for assistance may receive such assistance prior to the verification of the eligibility of at least the individual or one family member. Verification of eligibility consistent with § 5.514 occurs when the individual or family members have submitted...

  6. Gate-Level Commercial Microelectronics Verification with Standard Cell Recognition

    DTIC Science & Technology

    2015-03-26

    2.2.1.4 Algorithm Insufficiencies as Applied to DARPA's Circuit Verification Efforts...4.2 Discussion of SCR Algorithm and Code...4.2.1 Explication of SCR Algorithm...4.2.2 Algorithm Attributes...4.3 Advantages of Transistor-level Verification with SCR

  7. International Cooperative for Aerosol Prediction Workshop on Aerosol Forecast Verification

    NASA Technical Reports Server (NTRS)

    Benedetti, Angela; Reid, Jeffrey S.; Colarco, Peter R.

    2011-01-01

    The purpose of this workshop was to reinforce the working partnership between centers that are actively involved in global aerosol forecasting, and to discuss issues related to forecast verification. Participants included representatives from operational centers with global aerosol forecasting requirements, a panel of experts on Numerical Weather Prediction and Air Quality forecast verification, data providers, and several observers from the research community. The presentations centered on a review of current NWP and AQ practices, with subsequent discussion focused on the challenges in defining appropriate verification measures for the next generation of aerosol forecast systems.

  8. An MHD Simulation of Solar Active Region 11158 Driven with a Time-dependent Electric Field Determined from HMI Vector Magnetic Field Measurement Data

    NASA Astrophysics Data System (ADS)

    Hayashi, Keiji; Feng, Xueshang; Xiong, Ming; Jiang, Chaowei

    2018-03-01

    For realistic magnetohydrodynamics (MHD) simulation of the solar active region (AR), two types of capabilities are required. The first is the capability to calculate the bottom-boundary electric field vector, with which the observed magnetic field can be reconstructed through the induction equation. The second is a proper boundary treatment to limit the size of the sub-Alfvénic simulation region. We developed (1) a practical inversion method to yield the solar-surface electric field vector from the temporal evolution of the three components of magnetic field data maps, and (2) a characteristic-based free boundary treatment for the top and side sub-Alfvénic boundary surfaces. We simulate the temporal evolution of AR 11158 over 16 hr for testing, using Solar Dynamics Observatory/Helioseismic and Magnetic Imager vector magnetic field observation data and our time-dependent three-dimensional MHD simulation with these two features. Despite several assumptions in calculating the electric field and compromises for mitigating computational difficulties in the very low beta regime, several features of the AR were reasonably retrieved, such as twisting field structures, energy accumulation comparable to an X-class flare, and sudden changes at the time of the X-flare. The present MHD model can be a first step toward more realistic modeling of AR in the future.

  9. Integral field spectroscopy of H II regions in M33

    NASA Astrophysics Data System (ADS)

    López-Hernández, Jesús; Terlevich, Elena; Terlevich, Roberto; Rosa-González, Daniel; Díaz, Ángeles; García-Benito, Rubén; Vílchez, José; Hägele, Guillermo

    2013-03-01

    Integral field spectroscopy is presented for star-forming regions in M33. A central area of 300 × 500 pc2 and the external H II region IC 132, at a galactocentric distance ˜19 arcmin (4.69 kpc), were observed with the Potsdam Multi-Aperture Spectrophotometer instrument at the 3.5-m telescope of the Centro Astronómico Hispano-Alemán (CAHA, aka Calar Alto Observatory). The spectral coverage goes from 3600 Å to 1 μm to include from [O II] λ3727 Å to the near-infrared lines required for deriving sulphur electron temperature and abundance diagnostics. Local conditions within individual H II regions are presented in the form of emission-line fluxes and physical conditions for each spatial resolution element (spaxel) and for segments with similar Hα surface brightness. A clear dichotomy is observed when comparing the central to outer disc H II regions. While the external H II region has higher electron temperature plus larger Hβ equivalent width, size and excitation, the central region has higher extinction and metal content. The dichotomy extends to the Baldwin-Phillips-Terlevich (BPT) diagnostic diagrams that show two orthogonal broad distributions of points. By comparing with pseudo-3D photoionization models, we conclude that the bulk of observed differences are probably related to a different ionization parameter and metallicity. Wolf-Rayet (WR) features are detected in IC 132, and resolved into two concentrations whose integrated spectra were used to estimate the characteristic number of WR stars. No WR features were detected in the central H II regions despite their higher metallicity.

  10. Use of high-field and low-field magnetic resonance imaging to describe the anatomy of the proximal portion of the tarsal region of nonlame horses.

    PubMed

    Biggi, Marianna; Dyson, Sue J

    2018-03-01

    OBJECTIVE To use high-field and low-field MRI to describe the anatomy of the proximal portion of the tarsal region (proximal tarsal region) of nonlame horses. SAMPLE 25 cadaveric equine tarsi. PROCEDURES The proximal portion of 1 tarsus from each of 25 nonlame horses with no history of tarsal lameness underwent high-field (1.5-T) and low-field (0.27-T) MRI. Resulting images were used to subjectively describe the anatomy of that region and obtain measurements of the collateral ligaments of the tarsocrural joint. RESULTS Long and short components of the lateral and medial collateral ligaments of the tarsocrural joint were identified. Various bundles of the short collateral ligaments were difficult to delineate on low-field images. Ligaments typically had low signal intensity in all sequences; however, multiple areas of increased signal intensity were identified at specific locations in most tarsi. This signal intensity was attributed to focal magic angle effect associated with orientation of collagen fibers within the ligaments at those locations. Subchondral bone of the distal aspect of the tibia was uniform in thickness, whereas that of the medial trochlear ridge of the talus was generally thicker than that of the lateral trochlear ridge. In most tarsi, subchondral bone of the talocalcaneal joint decreased in thickness from proximal to distal. CONCLUSIONS AND CLINICAL RELEVANCE Results generated in this study can be used as a reference for interpretation of MRI images of the proximal tarsal region in horses.

  11. 47 CFR 64.1120 - Verification of orders for telecommunications service.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... 47 Telecommunication 3 2013-10-01 2013-10-01 false Verification of orders for telecommunications service. 64.1120 Section 64.1120 Telecommunication FEDERAL COMMUNICATIONS COMMISSION (CONTINUED) COMMON... Telecommunications Service Providers § 64.1120 Verification of orders for telecommunications service. (a) No...

  12. 47 CFR 64.1120 - Verification of orders for telecommunications service.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... 47 Telecommunication 3 2012-10-01 2012-10-01 false Verification of orders for telecommunications service. 64.1120 Section 64.1120 Telecommunication FEDERAL COMMUNICATIONS COMMISSION (CONTINUED) COMMON... Telecommunications Service Providers § 64.1120 Verification of orders for telecommunications service. (a) No...

  13. 47 CFR 64.1120 - Verification of orders for telecommunications service.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... 47 Telecommunication 3 2014-10-01 2014-10-01 false Verification of orders for telecommunications service. 64.1120 Section 64.1120 Telecommunication FEDERAL COMMUNICATIONS COMMISSION (CONTINUED) COMMON... Telecommunications Service Providers § 64.1120 Verification of orders for telecommunications service. (a) No...

  14. 76 FR 23861 - Documents Acceptable for Employment Eligibility Verification; Correction

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-04-29

    ... Documents Acceptable for Employment Eligibility Verification; Correction AGENCY: U.S. Citizenship and... titled Documents Acceptable for Employment Eligibility Verification published in the Federal Register on... a final rule in the Federal Register at 76 FR 21225 establishing Documents Acceptable for Employment...

  15. 47 CFR 64.1120 - Verification of orders for telecommunications service.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... 47 Telecommunication 3 2011-10-01 2011-10-01 false Verification of orders for telecommunications service. 64.1120 Section 64.1120 Telecommunication FEDERAL COMMUNICATIONS COMMISSION (CONTINUED) COMMON... Telecommunications Service Providers § 64.1120 Verification of orders for telecommunications service. (a) No...

  16. A risk analysis approach applied to field surveillance in utility meters in legal metrology

    NASA Astrophysics Data System (ADS)

    Rodrigues Filho, B. A.; Nonato, N. S.; Carvalho, A. D.

    2018-03-01

    Field surveillance is the level of control in metrological supervision responsible for checking the conformity of measuring instruments in service. Utility meters represent the majority of measuring instruments produced by notified bodies under self-verification in Brazil. They play a major role in the economy, since electricity, gas and water are the main inputs to industrial production processes. To optimize the resources allocated to controlling these devices, the present study applied a risk analysis to identify, among the 11 manufacturers notified for self-verification, the instruments that demand field surveillance.
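    The risk-based prioritization described in this record can be sketched as a simple scoring-and-ranking step. This is an illustrative assumption only: the study's actual risk model, weights, manufacturer names and data are not given here, so the score (in-service failure rate × installed base) and all values below are hypothetical.

    ```python
    # Illustrative risk ranking for field surveillance of utility meters.
    # The scoring model (failure_rate * installed_base) is an assumption for
    # demonstration; it is NOT the model used in the cited study.

    def risk_score(failure_rate: float, installed_base: int) -> float:
        """Risk = likelihood of nonconformity x exposure (meters in service)."""
        return failure_rate * installed_base

    # Hypothetical manufacturers with assumed in-service failure rates and
    # assumed numbers of meters in the field.
    manufacturers = {
        "A": (0.02, 500_000),
        "B": (0.05, 120_000),
        "C": (0.01, 900_000),
    }

    # Rank manufacturers by descending risk to prioritize field surveillance.
    ranked = sorted(manufacturers.items(),
                    key=lambda kv: risk_score(*kv[1]),
                    reverse=True)

    for name, (rate, base) in ranked:
        print(f"{name}: risk={risk_score(rate, base):,.0f}")
    ```

    Under these assumed figures, surveillance effort would go first to the manufacturer with the largest combined likelihood and exposure, not simply to the one with the highest failure rate.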

  17. Generic interpreters and microprocessor verification

    NASA Technical Reports Server (NTRS)

    Windley, Phillip J.

    1990-01-01

    The following topics are covered in viewgraph form: (1) generic interpreters; (2) Viper microprocessors; (3) microprocessor verification; (4) determining correctness; (5) hierarchical decomposition; (6) interpreter theory; (7) AVM-1; (8) phase-level specification; and future work.

  18. Exploring the Possible Use of Information Barriers for future Biological Weapons Verification Regimes

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Luke, S J

    2011-12-20

    This report describes a path forward for implementing information barriers in a future generic biological arms-control verification regime. Information barriers have become a staple of discussion in the area of arms-control verification approaches for nuclear weapons and components. An information barrier, when used with a measurement system, allows for the determination that an item has sensitive characteristics without releasing any of the sensitive information. Over the last 15 years the United States (with the Russian Federation) has led the development of information barriers in the area of the verification of nuclear weapons and nuclear components. The work of the US and the Russian Federation has prompted other states (e.g., the UK and Norway) to consider the merits of information barriers for possible verification regimes. In the context of a biological weapons control verification regime, the dual-use nature of biotechnology will require protection of sensitive information while allowing for the verification of treaty commitments. A major question that has arisen is whether, in a biological weapons verification regime, the presence or absence of a weapon pathogen can be determined without revealing any information about possible sensitive or proprietary information contained in the genetic materials being declared under a verification regime. This study indicates that a verification regime could be constructed using a small number of pathogens that spans the range of known biological weapons agents. Since the number of possible pathogens is small, it is possible and prudent to treat these pathogens as analogues to attributes in a nuclear verification regime. This study has determined that there may be some information that needs to be protected in a biological weapons control verification regime. To protect this information, the study concludes that the Lawrence Livermore Microbial Detection Array may be a suitable technology for the detection of

  19. Today's Environmental Technologies-Innovative Solutions for Regional Issues, U.S. EPA ETV and SBIR Programs Regional Workshop, October 7-8, 2008, U.S. EPA Region 2, New York City, New York, Meeting Summary Report

    EPA Science Inventory

    The U.S. Environmental Protection Agency (EPA) Environmental Technology Verification (ETV) and Small Business Innovation Research (SBIR) Programs hosted a workshop on October 7–8, 2008, at the EPA Region 2 office in New York City, New York. The goals of the workshop were to: (1) ...

  20. Experimental preparation and verification of quantum money

    NASA Astrophysics Data System (ADS)

    Guan, Jian-Yu; Arrazola, Juan Miguel; Amiri, Ryan; Zhang, Weijun; Li, Hao; You, Lixing; Wang, Zhen; Zhang, Qiang; Pan, Jian-Wei

    2018-03-01

    A quantum money scheme enables a trusted bank to provide untrusted users with verifiable quantum banknotes that cannot be forged. In this work, we report a proof-of-principle experimental demonstration of the preparation and verification of unforgeable quantum banknotes. We employ a security analysis that takes experimental imperfections fully into account. We measure a total of 3.6 × 10⁶ states in one verification round, limiting the forging probability to 10⁻⁷ based on the security analysis. Our results demonstrate the feasibility of preparing and verifying quantum banknotes using currently available experimental techniques.