DOE Office of Scientific and Technical Information (OSTI.GOV)
L. M. Dittmer
2007-12-03
The 100-F-26:10 waste site includes sanitary sewer lines that serviced the former 182-F, 183-F, and 151-F Buildings. In accordance with this evaluation, the verification sampling results support a reclassification of this site to Interim Closed Out. The results of verification sampling show that residual contaminant concentrations do not preclude any future uses and allow for unrestricted use of shallow zone soils. The results also demonstrate that residual contaminant concentrations are protective of groundwater and the Columbia River.
DOE Office of Scientific and Technical Information (OSTI.GOV)
L. M. Dittmer
2008-03-18
The 100-F-26:15 waste site consisted of the remnant portions of underground process effluent and floor drain pipelines that originated at the 105-F Reactor. In accordance with this evaluation, the verification sampling results support a reclassification of this site to Interim Closed Out. The results of verification sampling show that residual contaminant concentrations do not preclude any future uses and allow for unrestricted use of shallow zone soils. The results also demonstrate that residual contaminant concentrations are protective of groundwater and the Columbia River.
INNOVATIVE TECHNOLOGY VERIFICATION REPORT XRF ...
The Rigaku ZSX Mini II (ZSX Mini II) x-ray fluorescence (XRF) analyzer was demonstrated under the U.S. Environmental Protection Agency (EPA) Superfund Innovative Technology Evaluation (SITE) Program. The field portion of the demonstration was conducted in January 2005 at the Kennedy Athletic, Recreational and Social Park (KARS) at Kennedy Space Center on Merritt Island, Florida. The demonstration was designed to collect reliable performance and cost data for the ZSX Mini II analyzer and seven other commercially available XRF instruments for measuring trace elements in soil and sediment. The performance and cost data were evaluated to document the relative performance of each XRF instrument. This innovative technology verification report describes the objectives and the results of that evaluation and serves to verify the performance and cost of the ZSX Mini II analyzer. Separate reports have been prepared for the other XRF instruments that were evaluated as part of the demonstration. The objectives of the evaluation included determining each XRF instrument's accuracy, precision, sample throughput, and tendency for matrix effects. To fulfill these objectives, the field demonstration incorporated the analysis of 326 prepared samples of soil and sediment that contained 13 target elements. The prepared samples included blends of environmental samples from nine different sample collection sites as well as spiked samples with certified element concentrations.
INNOVATIVE TECHNOLOGY VERIFICATION REPORT XRF ...
The Rontec PicoTAX x-ray fluorescence (XRF) analyzer was demonstrated under the U.S. Environmental Protection Agency (EPA) Superfund Innovative Technology Evaluation (SITE) Program. The field portion of the demonstration was conducted in January 2005 at the Kennedy Athletic, Recreational and Social Park (KARS) at Kennedy Space Center on Merritt Island, Florida. The demonstration was designed to collect reliable performance and cost data for the PicoTAX analyzer and seven other commercially available XRF instruments for measuring trace elements in soil and sediment. The performance and cost data were evaluated to document the relative performance of each XRF instrument. This innovative technology verification report describes the objectives and the results of that evaluation and serves to verify the performance and cost of the PicoTAX analyzer. Separate reports have been prepared for the other XRF instruments that were evaluated as part of the demonstration. The objectives of the evaluation included determining each XRF instrument's accuracy, precision, sample throughput, and tendency for matrix effects. To fulfill these objectives, the field demonstration incorporated the analysis of 326 prepared samples of soil and sediment that contained 13 target elements. The prepared samples included blends of environmental samples from nine different sample collection sites as well as spiked samples with certified element concentrations.
INNOVATIVE TECHNOLOGY VERIFICATION REPORT XRF ...
The Niton XLt 700 Series (XLt) x-ray fluorescence (XRF) analyzer was demonstrated under the U.S. Environmental Protection Agency (EPA) Superfund Innovative Technology Evaluation (SITE) Program. The field portion of the demonstration was conducted in January 2005 at the Kennedy Athletic, Recreational and Social Park (KARS) at Kennedy Space Center on Merritt Island, Florida. The demonstration was designed to collect reliable performance and cost data for the XLt analyzer and seven other commercially available XRF instruments for measuring trace elements in soil and sediment. The performance and cost data were evaluated to document the relative performance of each XRF instrument. This innovative technology verification report describes the objectives and the results of that evaluation and serves to verify the performance and cost of the XLt analyzer. Separate reports have been prepared for the other XRF instruments that were evaluated as part of the demonstration. The objectives of the evaluation included determining each XRF instrument's accuracy, precision, sample throughput, and tendency for matrix effects. To fulfill these objectives, the field demonstration incorporated the analysis of 326 prepared samples of soil and sediment that contained 13 target elements. The prepared samples included blends of environmental samples from nine different sample collection sites as well as spiked samples with certified element concentrations.
INNOVATIVE TECHNOLOGY VERIFICATION REPORT XRF ...
The Oxford ED2000 x-ray fluorescence (XRF) analyzer was demonstrated under the U.S. Environmental Protection Agency (EPA) Superfund Innovative Technology Evaluation (SITE) Program. The field portion of the demonstration was conducted in January 2005 at the Kennedy Athletic, Recreational and Social Park (KARS) at Kennedy Space Center on Merritt Island, Florida. The demonstration was designed to collect reliable performance and cost data for the ED2000 analyzer and seven other commercially available XRF instruments for measuring trace elements in soil and sediment. The performance and cost data were evaluated to document the relative performance of each XRF instrument. This innovative technology verification report describes the objectives and the results of that evaluation and serves to verify the performance and cost of the ED2000 analyzer. Separate reports have been prepared for the other XRF instruments that were evaluated as part of the demonstration. The objectives of the evaluation included determining each XRF instrument's accuracy, precision, sample throughput, and tendency for matrix effects. To fulfill these objectives, the field demonstration incorporated the analysis of 326 prepared samples of soil and sediment that contained 13 target elements. The prepared samples included blends of environmental samples from nine different sample collection sites as well as spiked samples with certified element concentrations.
INNOVATIVE TECHNOLOGY VERIFICATION REPORT XRF ...
The Innov-X XT400 Series (XT400) x-ray fluorescence (XRF) analyzer was demonstrated under the U.S. Environmental Protection Agency (EPA) Superfund Innovative Technology Evaluation (SITE) Program. The field portion of the demonstration was conducted in January 2005 at the Kennedy Athletic, Recreational and Social Park (KARS) at Kennedy Space Center on Merritt Island, Florida. The demonstration was designed to collect reliable performance and cost data for the XT400 analyzer and seven other commercially available XRF instruments for measuring trace elements in soil and sediment. The performance and cost data were evaluated to document the relative performance of each XRF instrument. This innovative technology verification report describes the objectives and the results of that evaluation and serves to verify the performance and cost of the XT400 analyzer. Separate reports have been prepared for the other XRF instruments that were evaluated as part of the demonstration. The objectives of the evaluation included determining each XRF instrument's accuracy, precision, sample throughput, and tendency for matrix effects. To fulfill these objectives, the field demonstration incorporated the analysis of 326 prepared samples of soil and sediment that contained 13 target elements. The prepared samples included blends of environmental samples from nine different sample collection sites as well as spiked samples with certified element concentrations.
INNOVATIVE TECHNOLOGY VERIFICATION REPORT XRF ...
The Elvatech, Ltd. ElvaX (ElvaX) x-ray fluorescence (XRF) analyzer, distributed in the United States by Xcalibur XRF Services (Xcalibur), was demonstrated under the U.S. Environmental Protection Agency (EPA) Superfund Innovative Technology Evaluation (SITE) Program. The field portion of the demonstration was conducted in January 2005 at the Kennedy Athletic, Recreational and Social Park (KARS) at Kennedy Space Center on Merritt Island, Florida. The demonstration was designed to collect reliable performance and cost data for the ElvaX analyzer and seven other commercially available XRF instruments for measuring trace elements in soil and sediment. The performance and cost data were evaluated to document the relative performance of each XRF instrument. This innovative technology verification report describes the objectives and the results of that evaluation and serves to verify the performance and cost of the ElvaX analyzer. Separate reports have been prepared for the other XRF instruments that were evaluated as part of the demonstration. The objectives of the evaluation included determining each XRF instrument's accuracy, precision, sample throughput, and tendency for matrix effects. To fulfill these objectives, the field demonstration incorporated the analysis of 326 prepared samples of soil and sediment that contained 13 target elements. The prepared samples included blends of environmental samples from nine different sample collection sites as well as spiked samples with certified element concentrations.
International Space Station Requirement Verification for Commercial Visiting Vehicles
NASA Technical Reports Server (NTRS)
Garguilo, Dan
2017-01-01
The COTS program demonstrated NASA could rely on commercial providers for safe, reliable, and cost-effective cargo delivery to ISS. The ISS Program has developed a streamlined process to safely integrate commercial visiting vehicles and ensure requirements are met: levy a minimum requirement set (reduced from thousands to hundreds) focused on the ISS interface and safety, reducing the level of NASA oversight/insight and the burden on the commercial partner. Partners provide a detailed verification and validation plan documenting how they will show they have met NASA requirements. NASA conducts process sampling to ensure that the established verification processes are being followed. NASA participates in joint verification events and analyses for requirements that both parties must verify. Verification compliance is approved by NASA, and launch readiness is certified at mission readiness reviews.
DOE Office of Scientific and Technical Information (OSTI.GOV)
J. M. Capron
2008-04-29
The 100-F-26:12 waste site was an approximately 308-m-long, 1.8-m-diameter east-west-trending reinforced concrete pipe that joined the North Process Sewer Pipelines (100-F-26:1) and the South Process Pipelines (100-F-26:4) with the 1.8-m reactor cooling water effluent pipeline (100-F-19). In accordance with this evaluation, the verification sampling results support a reclassification of this site to Interim Closed Out. The results of verification sampling show that residual contaminant concentrations do not preclude any future uses and allow for unrestricted use of shallow zone soils. The results also demonstrate that residual contaminant concentrations are protective of groundwater and the Columbia River.
Development of Sample Verification System for Sample Return Missions
NASA Technical Reports Server (NTRS)
Toda, Risaku; McKinney, Colin; Jackson, Shannon P.; Mojarradi, Mohammad; Trebi-Ollennu, Ashitey; Manohara, Harish
2011-01-01
This paper describes the development of a proof-of-concept sample verification system (SVS) for in-situ mass measurement of planetary rock and soil samples in future robotic sample return missions. Our proof-of-concept SVS device contains a 10 cm diameter pressure-sensitive elastic membrane placed at the bottom of a sample canister. The membrane deforms under the weight of accumulating planetary sample. The membrane is positioned in proximity to an opposing substrate with a narrow gap. The deformation of the membrane narrows the gap, resulting in increased capacitance between the two nearly parallel plates. Capacitance readout circuitry on a nearby printed circuit board (PCB) transmits data via a low-voltage differential signaling (LVDS) interface. The fabricated SVS proof-of-concept device has successfully demonstrated a capacitance change of approximately 1 pF per gram of sample.
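The membrane-gap readout described above lends itself to a quick back-of-the-envelope check. The Python sketch below computes an ideal parallel-plate capacitance for an assumed gap and converts a measured capacitance change into grams using the reported ~1 pF/gram sensitivity; the gap values and everything beyond the 10 cm diameter and the stated sensitivity are illustrative assumptions, not figures from the paper.

```python
# Illustrative parallel-plate model of the SVS readout (assumed geometry, not flight values).
import math

EPS0 = 8.854e-12              # vacuum permittivity, F/m
PLATE_DIAMETER_M = 0.10       # 10 cm membrane, per the abstract
SENSITIVITY_F_PER_G = 1e-12   # ~1 pF per gram, per the abstract

def parallel_plate_capacitance(gap_m: float) -> float:
    """Ideal parallel-plate capacitance for the assumed membrane area and gap."""
    area = math.pi * (PLATE_DIAMETER_M / 2) ** 2
    return EPS0 * area / gap_m

def estimated_sample_mass_g(delta_capacitance_f: float) -> float:
    """Convert a measured capacitance change into grams using the stated sensitivity."""
    return delta_capacitance_f / SENSITIVITY_F_PER_G

# Example: an assumed gap shrinking from 100 um to 90 um under sample load.
c_empty = parallel_plate_capacitance(100e-6)
c_loaded = parallel_plate_capacitance(90e-6)
print(f"capacitance change: {(c_loaded - c_empty) * 1e12:.1f} pF")
print(f"inferred mass for a 1.5 pF change: {estimated_sample_mass_g(1.5e-12):.2f} g")
```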
DOE Office of Scientific and Technical Information (OSTI.GOV)
L. M. Dittmer
2008-01-31
The 116-C-3 waste site consisted of two underground storage tanks designed to receive mixed waste from the 105-C Reactor Metals Examination Facility chemical dejacketing process. Confirmatory evaluation and subsequent characterization of the site determined that the southern tank contained approximately 34,000 L (9,000 gal) of dejacketing wastes, and that the northern tank was unused. In accordance with this evaluation, the verification sampling and modeling results support a reclassification of this site to Interim Closed Out. The results of verification sampling demonstrate that residual contaminant concentrations do not preclude any future uses and allow for unrestricted use of shallow zone soils. The results also show that residual contaminant concentrations are protective of groundwater and the Columbia River.
He, Hua; McDermott, Michael P.
2012-01-01
Sensitivity and specificity are common measures of the accuracy of a diagnostic test. The usual estimators of these quantities are unbiased if data on the diagnostic test result and the true disease status are obtained from all subjects in an appropriately selected sample. In some studies, verification of the true disease status is performed only for a subset of subjects, possibly depending on the result of the diagnostic test and other characteristics of the subjects. Estimators of sensitivity and specificity based on this subset of subjects are typically biased; this is known as verification bias. Methods have been proposed to correct verification bias under the assumption that the missing data on disease status are missing at random (MAR), that is, the probability of missingness depends on the true (missing) disease status only through the test result and observed covariate information. When some of the covariates are continuous, or the number of covariates is relatively large, the existing methods require parametric models for the probability of disease or the probability of verification (given the test result and covariates), and hence are subject to model misspecification. We propose a new method for correcting verification bias based on the propensity score, defined as the predicted probability of verification given the test result and observed covariates. This is estimated separately for those with positive and negative test results. The new method classifies the verified sample into several subsamples that have homogeneous propensity scores and allows correction for verification bias. Simulation studies demonstrate that the new estimators are more robust to model misspecification than existing methods, but still perform well when the models for the probability of disease and probability of verification are correctly specified. PMID:21856650
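The stratification idea described above can be sketched briefly in code. The Python below is a minimal illustration, not the authors' implementation: it assumes a binary test result T, a verification indicator V, a numeric covariate matrix X, disease status D observed only where V == 1 (NaN elsewhere), quintile strata, and the availability of scikit-learn for the propensity model.

```python
# Minimal sketch of propensity-score stratification for verification bias (assumptions noted above).
import numpy as np
from sklearn.linear_model import LogisticRegression

def corrected_sens_spec(T, X, V, D, n_strata=5):
    """Estimate sensitivity/specificity, stratifying on P(V=1 | T, X) within each test group."""
    T, V = np.asarray(T), np.asarray(V)
    D = np.asarray(D, dtype=float)           # D may be NaN where V == 0
    est_diseased = {0: 0.0, 1: 0.0}          # expected number of diseased subjects per test group
    est_total = {0: 0.0, 1: 0.0}
    for t in (0, 1):
        idx = np.where(T == t)[0]
        # Propensity of verification, fitted separately for positive and negative test results.
        ps = LogisticRegression(max_iter=1000).fit(X[idx], V[idx]).predict_proba(X[idx])[:, 1]
        edges = np.quantile(ps, np.linspace(0, 1, n_strata + 1))
        strata = np.clip(np.searchsorted(edges, ps, side="right") - 1, 0, n_strata - 1)
        for s in range(n_strata):
            members = idx[strata == s]
            verified = members[V[members] == 1]
            if len(verified) == 0:
                continue                      # a real analysis would flag an empty stratum
            prevalence = np.nanmean(D[verified])
            est_diseased[t] += prevalence * len(members)
            est_total[t] += len(members)
    sens = est_diseased[1] / (est_diseased[0] + est_diseased[1])
    nondiseased = {t: est_total[t] - est_diseased[t] for t in (0, 1)}
    spec = nondiseased[0] / (nondiseased[0] + nondiseased[1])
    return sens, spec
```

Within each stratum the verified subjects stand in for all subjects with similar verification propensity, which is what the MAR assumption licenses.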
DOE Office of Scientific and Technical Information (OSTI.GOV)
L. M. Dittmer
2008-03-03
The 100-F-26:13 waste site is the network of process sewer pipelines that received effluent from the 108-F Biological Laboratory and discharged it to the 188-F Ash Disposal Area (126-F-1 waste site). The pipelines included one 0.15-m (6-in.)-, two 0.2-m (8-in.)-, and one 0.31-m (12-in.)-diameter vitrified clay pipe segments encased in concrete. In accordance with this evaluation, the verification sampling results support a reclassification of this site to Interim Closed Out. The results of verification sampling demonstrated that residual contaminant concentrations do not preclude any future uses and allow for unrestricted use of shallow zone soils. The results also showed that residual contaminant concentrations are protective of groundwater and the Columbia River.
NASA Technical Reports Server (NTRS)
Hughes, David W.; Hedgeland, Randy J.
1994-01-01
A mechanical simulator of the Hubble Space Telescope (HST) Aft Shroud was built to perform verification testing of the Servicing Mission Scientific Instruments (SI's) and to provide a facility for astronaut training. All assembly, integration, and test activities occurred under the guidance of a contamination control plan, and all work was reviewed by a contamination engineer prior to implementation. An integrated approach was followed in which materials selection, manufacturing, assembly, subsystem integration, and end product use were considered and controlled to ensure that the use of the High Fidelity Mechanical Simulator (HFMS) as a verification tool would not contaminate mission critical hardware. Surfaces were cleaned throughout manufacturing, assembly, and integration, and reverification was performed following major activities. Direct surface sampling was the preferred method of verification, but access and material constraints led to the use of indirect methods as well. Although surface geometries and coatings often made contamination verification difficult, final contamination sampling and monitoring demonstrated the ability to maintain a class M5.5 environment with surface levels less than 400B inside the HFMS.
Demonstration Report for Visual Sample Plan (VSP) Verification Sampling Methods at the Navy/DRI Site
2011-08-01
population of 537,197 with an overall population density of 608 people per square mile (people/mi2). However, according to the Preliminary Assessment findings, the population density in the vicinity of the site is approximately 12 people/mi2. Population density is expected to greatly increase following development of the site.
NASA Astrophysics Data System (ADS)
Miller, Jacob; Sanders, Stephen; Miyake, Akimasa
2017-12-01
While quantum speed-up in solving certain decision problems by a fault-tolerant universal quantum computer has been promised, a timely research interest includes how far one can reduce the resource requirement to demonstrate a provable advantage in quantum devices without demanding quantum error correction, which is crucial for prolonging the coherence time of qubits. We propose a model device made of locally interacting multiple qubits, designed such that simultaneous single-qubit measurements on it can output probability distributions whose average-case sampling is classically intractable, under similar assumptions as the sampling of noninteracting bosons and instantaneous quantum circuits. Notably, in contrast to these previous unitary-based realizations, our measurement-based implementation has two distinctive features. (i) Our implementation involves no adaptation of measurement bases, leading output probability distributions to be generated in constant time, independent of the system size. Thus, it could be implemented in principle without quantum error correction. (ii) Verifying the classical intractability of our sampling is done by changing the Pauli measurement bases only at certain output qubits. Our usage of random commuting quantum circuits in place of computationally universal circuits allows a unique unification of sampling and verification, so they require the same physical resource requirements in contrast to the more demanding verification protocols seen elsewhere in the literature.
Yin, Hong-Rui; Zhang, Lei; Xie, Li-Qi; Huang, Li-Yong; Xu, Ye; Cai, San-Jun; Yang, Peng-Yuan; Lu, Hao-Jie
2013-09-06
Novel biomarker verification assays are urgently required to improve the efficiency of biomarker development. Benefitting from lower development costs, multiple reaction monitoring (MRM) has been used for biomarker verification as an alternative to immunoassay. However, in general MRM analysis, only one sample can be quantified in a single experiment, which restricts its application. Here, a Hyperplex-MRM quantification approach, which combined mTRAQ for absolute quantification and iTRAQ for relative quantification, was developed to increase the throughput of biomarker verification. In this strategy, equal amounts of internal standard peptides were labeled with mTRAQ reagents Δ0 and Δ8, respectively, as double references, while 4-plex iTRAQ reagents were used to label four different samples as an alternative to mTRAQ Δ4. From the MRM trace and MS/MS spectrum, total amounts and relative ratios of target proteins/peptides of four samples could be acquired simultaneously. Accordingly, absolute amounts of target proteins/peptides in four different samples could be achieved in a single run. In addition, double references were used to increase the reliability of the quantification results. Using this approach, three biomarker candidates, adenosylhomocysteinase (AHCY), cathepsin D (CTSD), and lysozyme C (LYZ), were successfully quantified in colorectal cancer (CRC) tissue specimens of different stages with high accuracy, sensitivity, and reproducibility. To summarize, we demonstrated a promising quantification method for high-throughput verification of biomarker candidates.
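The arithmetic behind the combined readout is simple and can be sketched as follows; the channel labels and numbers are illustrative assumptions, not values from the paper. The mTRAQ-referenced MRM trace yields an absolute total for the target peptide across the mixed run, and the 4-plex iTRAQ reporter intensities split that total among the four co-analysed samples.

```python
# Toy distribution of an MRM-derived absolute total across iTRAQ reporter channels (illustrative only).

def absolute_amounts(total_pmol_from_mrm, itraq_reporter_intensities):
    """Distribute the MRM-derived total in proportion to the iTRAQ reporter intensities."""
    total_intensity = sum(itraq_reporter_intensities)
    return [total_pmol_from_mrm * i / total_intensity for i in itraq_reporter_intensities]

# Example: 2.0 pmol total, hypothetical reporter intensities for channels 114/115/116/117.
print(absolute_amounts(2.0, [1.0e5, 0.8e5, 1.4e5, 0.8e5]))
# -> [0.5, 0.4, 0.7, 0.4] pmol for the four samples
```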
Storms, S M; Feltus, A; Barker, A R; Joly, M-A; Girard, M
2009-03-01
Measurement of somatropin charged variants by isoelectric focusing was replaced with capillary zone electrophoresis in the January 2006 European Pharmacopoeia Supplement 5.3, based on results from an interlaboratory collaborative study. Due to incompatibilities and method-robustness issues encountered prior to verification, a number of method parameters required optimisation. As the use of a diode array detector at 195 nm or 200 nm led to a loss of resolution, a variable wavelength detector using a 200 nm filter was employed. Improved injection repeatability was obtained by increasing the injection time and pressure, and changing the sample diluent from water to running buffer. Finally, definition of capillary pre-treatment and rinse procedures resulted in more consistent separations over time. Method verification data are presented demonstrating linearity, specificity, repeatability, intermediate precision, limit of quantitation, sample stability, solution stability, and robustness. Based on these experiments, several modifications to the current method have been recommended and incorporated into the European Pharmacopoeia to help improve method performance across laboratories globally.
Airell, Asa; Lindbäck, Emma; Ataker, Ferda; Pörnull, Kirsti Jalakas; Wretlind, Bengt
2005-06-01
We compared 956 samples analysed by AMPLICOR Neisseria gonorrhoeae polymerase chain reaction (PCR) (Roche), with species verification using the 16S rRNA gene compared to verification using the gyrA gene. Culture served as the control method. The gyrA verification uses pyrosequencing of the quinolone resistance-determining region of gyrA. Of 52 samples with an optical density ≥0.2 in PCR, 27 were negative in culture, two pharynx samples were false negatives in culture, and four pharynx samples were false positives in verification with 16S rRNA. Twenty-five samples showed growth of gonococci; 18 of the corresponding PCR samples were verified by both methods, three urine samples were positive only in gyrA, and one pharynx specimen was positive only in 16S rRNA. Three samples were lost. We conclude that AMPLICOR N. gonorrhoeae PCR with verification of the gyrA gene can be considered a diagnostic tool in populations with a low prevalence of gonorrhoea and that pharynx specimens should not be analysed by PCR.
Federal Register 2010, 2011, 2012, 2013, 2014
2011-07-13
... Service [Docket No. FSIS-2008-0008] Salmonella Verification Sampling Program: Response to Comments on New... establishments that participate in SIP. The Agency intends to conduct its own unannounced, small- set sampling to... considering publishing verification sampling results for other product classes. In the 2006 Federal Register...
NASA Astrophysics Data System (ADS)
Rieben, James C., Jr.
This study focuses on the effects of relevance and lab design on student learning within the chemistry laboratory environment. A general chemistry conductivity of solutions experiment and an upper level organic chemistry cellulose regeneration experiment were employed. In the conductivity experiment, the two main variables studied were the effect of relevant (or "real world") samples on student learning and a verification-based lab design versus a discovery-based lab design. With the cellulose regeneration experiment, the effect of a discovery-based lab design vs. a verification-based lab design was the sole focus. Evaluation surveys consisting of six questions were used at three different times to assess student knowledge of experimental concepts. In the general chemistry laboratory portion of this study, four experimental variants were employed to investigate the effect of relevance and lab design on student learning. These variants consisted of a traditional (or verification) lab design, a traditional lab design using "real world" samples, a new lab design employing real world samples/situations using unknown samples, and the new lab design using real world samples/situations that were known to the student. Data used in this analysis were collected during the Fall 08, Winter 09, and Fall 09 terms. For the second part of this study a cellulose regeneration experiment was employed to investigate the effects of lab design. A demonstration creating regenerated cellulose "rayon" was modified and converted to an efficient and low-waste experiment. In the first variant students tested their products and verified a list of physical properties. In the second variant, students filled in a blank physical property chart with their own experimental results for the physical properties. Results from the conductivity experiment show significant student learning of the effects of concentration on conductivity and how to use conductivity to differentiate solution types with the use of real world samples. In the organic chemistry experiment, results suggest that the discovery-based design improved student retention of the chain length differentiation by physical properties relative to the verification-based design.
Design and Verification of Critical Pressurised Windows for Manned Spaceflight
NASA Astrophysics Data System (ADS)
Lamoure, Richard; Busto, Lara; Novo, Francisco; Sinnema, Gerben; Leal, Mendes M.
2014-06-01
The Window Design for Manned Spaceflight (WDMS) project was tasked with establishing the state of the art and exploring possible improvements to the current structural integrity verification and fracture control methodologies for manned spacecraft windows. A critical review of the state of the art in spacecraft window design, materials and verification practice was conducted. Shortcomings of the methodology in terms of analysis, inspection and testing were identified. Schemes for improving verification practices and reducing conservatism whilst maintaining the required safety levels were then proposed. An experimental materials characterisation programme was defined and carried out with the support of the 'Glass and Façade Technology Research Group' at the University of Cambridge. Results of the sample testing campaign were analysed, post-processed and subsequently applied to the design of a breadboard window demonstrator. Two fused silica glass window panes were procured and subjected to dedicated analyses, inspection and testing comprising both qualification and acceptance programmes specifically tailored to the objectives of the activity. Finally, the main outcomes have been compiled into a Structural Verification Guide for Pressurised Windows in manned spacecraft, incorporating best practices and lessons learned throughout this project.
47 CFR 73.151 - Field strength measurements to establish performance of directional antennas.
Code of Federal Regulations, 2010 CFR
2010-10-01
... verified either by field strength measurement or by computer modeling and sampling system verification. (a... specifically identified by the Commission. (c) Computer modeling and sample system verification of modeled... performance verified by computer modeling and sample system verification. (1) A matrix of impedance...
DOE Office of Scientific and Technical Information (OSTI.GOV)
R. A. Carlson
2006-02-23
The 1607-D4 Septic System was a septic tank and tile field that received sanitary sewage from the 115-D/DR Gas Recirculation Facility. This septic system operated from 1944 to 1968. Decommissioning took place in 1985 and 1986 when all above-grade features were demolished and the tank backfilled. The results of verification sampling demonstrated that residual contaminant concentrations do not preclude any future uses and allow for unrestricted use of shallow zone soils. The results also showed that residual contaminant concentrations are protective of groundwater and the Columbia River.
AERIAL PHOTOGRAPHY AND GROUND VERIFICATION AT POWER PLANT SITES: WISCONSIN POWER PLANT IMPACT STUDY
This study demonstrated and evaluated nine methods for monitoring the deterioration of a large wetland on the site of a newly-constructed coal-fired power plant in Columbia County, Wisconsin. Four of the nine methods used data from ground sampling; two were remote sensing method...
A performance verification demonstration of technologies capable of detecting dioxin and dioxin-like compounds in soil and sediment samples was conducted in April 2004 under the U.S. Environmental Protection Agency's Superfund Innovative Technology Evaluation (SITE) Monitoring an...
DOE Office of Scientific and Technical Information (OSTI.GOV)
Yashchuk, V.V.; Conley, R.; Anderson, E.H.
Verification of the reliability of metrology data from high quality X-ray optics requires that adequate methods for test and calibration of the instruments be developed. For such verification for optical surface profilometers in the spatial frequency domain, a modulation transfer function (MTF) calibration method based on binary pseudo-random (BPR) gratings and arrays has been suggested and proven to be an effective calibration method for a number of interferometric microscopes, a phase shifting Fizeau interferometer, and a scatterometer. Here we describe the details of development of binary pseudo-random multilayer (BPRML) test samples suitable for characterization of scanning (SEM) and transmission (TEM) electron microscopes. We discuss the results of TEM measurements with the BPRML test samples fabricated from a WiSi2/Si multilayer coating with pseudo-randomly distributed layers. In particular, we demonstrate that significant information about the metrological reliability of the TEM measurements can be extracted even when the fundamental frequency of the BPRML sample is smaller than the Nyquist frequency of the measurements. The measurements demonstrate a number of problems related to the interpretation of the SEM and TEM data. Note that similar BPRML test samples can be used to characterize X-ray microscopes. Corresponding work with X-ray microscopes is in progress.
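The reason binary pseudo-random patterns suit MTF calibration is that a maximum-length sequence has an essentially flat (white) power spectrum, so any roll-off measured through an instrument reflects the instrument's own transfer function rather than the test pattern. The short Python sketch below illustrates this property; the 7-bit LFSR taps and the fence-free spectrum check are illustrative assumptions, not the BPRML fabrication recipe.

```python
# Generate a maximum-length binary sequence and confirm its non-DC power spectrum is flat.
import numpy as np

def mls(register_bits=7, taps=(7, 6)):
    """Maximum-length sequence from a simple Fibonacci LFSR (taps assumed primitive)."""
    state = [1] * register_bits
    seq = []
    for _ in range(2 ** register_bits - 1):
        seq.append(state[-1])
        feedback = state[taps[0] - 1] ^ state[taps[1] - 1]
        state = [feedback] + state[:-1]
    return np.array(seq)

pattern = 2 * mls() - 1                      # map {0, 1} -> {-1, +1}
spectrum = np.abs(np.fft.rfft(pattern)) ** 2
print("relative spread of non-DC power:", spectrum[1:].std() / spectrum[1:].mean())
```

The printed spread is at the level of floating-point noise, reflecting the perfectly flat spectrum that makes such patterns useful as MTF calibration targets.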
DOE Office of Scientific and Technical Information (OSTI.GOV)
V Yashchuk; R Conley; E Anderson
Verification of the reliability of metrology data from high quality X-ray optics requires that adequate methods for test and calibration of the instruments be developed. For such verification for optical surface profilometers in the spatial frequency domain, a modulation transfer function (MTF) calibration method based on binary pseudo-random (BPR) gratings and arrays has been suggested [1,2] and proven to be an effective calibration method for a number of interferometric microscopes, a phase shifting Fizeau interferometer, and a scatterometer [5]. Here we describe the details of development of binary pseudo-random multilayer (BPRML) test samples suitable for characterization of scanning (SEM) and transmission (TEM) electron microscopes. We discuss the results of TEM measurements with the BPRML test samples fabricated from a WiSi2/Si multilayer coating with pseudo-randomly distributed layers. In particular, we demonstrate that significant information about the metrological reliability of the TEM measurements can be extracted even when the fundamental frequency of the BPRML sample is smaller than the Nyquist frequency of the measurements. The measurements demonstrate a number of problems related to the interpretation of the SEM and TEM data. Note that similar BPRML test samples can be used to characterize X-ray microscopes. Corresponding work with X-ray microscopes is in progress.
Urine sampling and collection system optimization and testing
NASA Technical Reports Server (NTRS)
Fogal, G. L.; Geating, J. A.; Koesterer, M. G.
1975-01-01
A Urine Sampling and Collection System (USCS) engineering model was developed to provide for the automatic collection, volume sensing and sampling of urine from each micturition. The purpose of the engineering model was to demonstrate verification of the system concept. The objective of the optimization and testing program was to update the engineering model, to provide additional performance features and to conduct system testing to determine operational problems. Optimization tasks were defined as modifications to minimize system fluid residual and addition of thermoelectric cooling.
The U.S. Environmental Protection Agency (EPA) has created the Environmental Technology Verification Program (ETV) to facilitate the deployment of innovative or improved environmental technologies through performance verification and dissemination of information. The goal of the ...
DOE Office of Scientific and Technical Information (OSTI.GOV)
L. M. Dittmer
2006-10-19
The 1607-F7, 141-M Building Septic Tank waste site was a septic tank and drain field that received sanitary sewage from the former 141-M Building. Remedial action was performed in August and November 2005. The results of verification sampling demonstrate that residual contaminant concentrations support future unrestricted land uses that can be represented by a rural-residential scenario. These results also show that residual concentrations support unrestricted future use of shallow zone soil and that contaminant levels remaining in the soil are protective of groundwater and the Columbia River.
Federal Register 2010, 2011, 2012, 2013, 2014
2012-03-21
... children and provide low cost or free school lunch meals to qualified students through subsidies to schools... records to demonstrate compliance with the meal requirements. To the extent practicable, schools ensure... verification of a required sample size), the number of meals served, and data from required reviews conducted...
Compromises produced by the dialectic between self-verification and self-enhancement.
Morling, B; Epstein, S
1997-12-01
Three studies of people's reactions to evaluative feedback demonstrated that the dialectic between self-enhancement and self-verification results in compromises between these 2 motives, as hypothesized in cognitive-experiential self-theory. The demonstration was facilitated by 2 procedural improvements: Enhancement and verification were established by calibrating evaluative feedback against self appraisals, and degree of enhancement and of verification were varied along a continuum, rather than categorically. There was also support for the hypotheses that processing in an intuitive-experiential mode favors enhancement and processing in an analytical-rational mode favors verification in the kinds of situations investigated.
Verification of hypergraph states
NASA Astrophysics Data System (ADS)
Morimae, Tomoyuki; Takeuchi, Yuki; Hayashi, Masahito
2017-12-01
Hypergraph states are generalizations of graph states where controlled-Z gates on edges are replaced with generalized controlled-Z gates on hyperedges. Hypergraph states have several advantages over graph states. For example, certain hypergraph states, such as the Union Jack states, are universal resource states for measurement-based quantum computing with only Pauli measurements, while graph state measurement-based quantum computing needs non-Clifford basis measurements. Furthermore, it is impossible to classically efficiently sample measurement results on hypergraph states unless the polynomial hierarchy collapses to the third level. Although several protocols have been proposed to verify graph states with only sequential single-qubit Pauli measurements, there was no verification method for hypergraph states. In this paper, we propose a method for verifying a certain class of hypergraph states with only sequential single-qubit Pauli measurements. Importantly, no i.i.d. property of samples is assumed in our protocol: any artificial entanglement among samples cannot fool the verifier. As applications of our protocol, we consider verified blind quantum computing with hypergraph states, and quantum computational supremacy demonstrations with hypergraph states.
Exomars Mission Verification Approach
NASA Astrophysics Data System (ADS)
Cassi, Carlo; Gilardi, Franco; Bethge, Boris
According to the long-term cooperation plan established by ESA and NASA in June 2009, the ExoMars project now consists of two missions. The first mission will be launched in 2016 under ESA lead, with the objectives to demonstrate the European capability to safely land a surface package on Mars, to perform Mars atmosphere investigation, and to provide communication capability for present and future ESA/NASA missions. For this mission ESA provides a spacecraft composite, made up of an "Entry Descent & Landing Demonstrator Module (EDM)" and a Mars Orbiter Module (OM); NASA provides the launch vehicle and the scientific instruments located on the Orbiter for Mars atmosphere characterisation. A second mission, with its launch foreseen in 2018, is led by NASA, who provides the spacecraft, the launcher, the EDL system, and a rover. ESA contributes the ExoMars Rover Module (RM) to provide surface mobility. It includes a drill system allowing drilling down to 2 metres, collecting samples, investigating them for signs of past and present life with exobiological experiments, and investigating the Mars water/geochemical environment. In this scenario Thales Alenia Space Italia, as ESA prime industrial contractor, is in charge of the design, manufacturing, integration and verification of the ESA ExoMars modules, i.e. the Spacecraft Composite (OM + EDM) for the 2016 mission, the RM for the 2018 mission and the Rover Operations Control Centre, which will be located at Altec-Turin (Italy). The verification process of the above products is quite complex and will include some peculiarities with limited or no heritage in Europe. Furthermore, the verification approach has to be optimised to allow full verification despite significant schedule and budget constraints. The paper presents the verification philosophy tailored for the ExoMars mission in line with the above considerations, starting from the model philosophy, showing the flow of verification activities and the sharing of tests between the different levels (system, modules, subsystems, etc.), and giving an overview of the main tests defined at spacecraft level. The paper is mainly focused on the verification aspects of the EDL Demonstrator Module and the Rover Module, for which an intense testing activity without previous heritage in Europe is foreseen. In particular, the Descent Module has to survive the Mars atmospheric entry and landing, and its surface platform has to stay operational for 8 sols on the Martian surface, transmitting scientific data to the Orbiter. The Rover Module has to perform a 180-sol mission in the Mars surface environment. These operative conditions cannot be verified only by analysis; consequently, a test campaign is defined including mechanical tests to simulate the entry loads, thermal tests in the Mars environment, and the simulation of Rover operations on a 'Mars-like' terrain. Finally, the paper presents an overview of the documentation flow defined to ensure the correct translation of the mission requirements into verification activities (test, analysis, review of design) until the final verification close-out of the above requirements with the final verification reports.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Yashchuk, Valeriy V; Conley, Raymond; Anderson, Erik H
Verification of the reliability of metrology data from high quality x-ray optics requires that adequate methods for test and calibration of the instruments be developed. For such verification for optical surface profilometers in the spatial frequency domain, a modulation transfer function (MTF) calibration method based on binary pseudo-random (BPR) gratings and arrays has been suggested [Proc. SPIE 7077-7 (2007), Opt. Eng. 47(7), 073602-1-5 (2008)] and proven to be an effective calibration method for a number of interferometric microscopes, a phase shifting Fizeau interferometer, and a scatterometer [Nucl. Instr. and Meth. A 616, 172-82 (2010)]. Here we describe the details of development of binary pseudo-random multilayer (BPRML) test samples suitable for characterization of scanning (SEM) and transmission (TEM) electron microscopes. We discuss the results of TEM measurements with the BPRML test samples fabricated from a WiSi2/Si multilayer coating with pseudo-randomly distributed layers. In particular, we demonstrate that significant information about the metrological reliability of the TEM measurements can be extracted even when the fundamental frequency of the BPRML sample is smaller than the Nyquist frequency of the measurements. The measurements demonstrate a number of problems related to the interpretation of the SEM and TEM data. Note that similar BPRML test samples can be used to characterize x-ray microscopes. Corresponding work with x-ray microscopes is in progress.
Chua, Hoe-Chee; Lee, Hoi-Sim; Sng, Mui-Tiang
2006-01-13
Analysing nitrogen mustards and their degradation products in decontamination emulsions posed a significant challenge due to the different phases present in such matrices. Extensive sample preparation may be required to isolate target analytes. Furthermore, numerous reaction products are formed in the decontamination emulsion. A fast and effective qualitative screening procedure was developed for these compounds, using liquid chromatography-mass spectrometry (LC-MS). This eliminated the need for the additional sample handling and derivatisation that are required for gas chromatographic-mass spectrometric (GC-MS) analysis. A liquid chromatograph with a mixed-mode column and isocratic elution gave good chromatography. The feasibility of applying this technique for detecting these compounds in spiked water and decontamination emulsion was demonstrated. Detailed characterisation of the degradation products in these two matrices was carried out. The results demonstrated that N-methyldiethanolamine (MDEA), N-ethyldiethanolamine (EDEA) and triethanolamine (TEA) are not the major degradation products of their respective nitrogen mustards. Degradation profiles of nitrogen mustards in water were also established. In verification analysis, it is important not only to develop methods for the identification of the actual chemical agents; the methods must also encompass degradation products of the chemical agents so as to exclude false negatives. This study demonstrated the increasingly pivotal role that LC-MS plays in verification analysis.
An experimental verification of laser-velocimeter sampling bias and its correction
NASA Technical Reports Server (NTRS)
Johnson, D. A.; Modarress, D.; Owen, F. K.
1982-01-01
The existence of 'sampling bias' in individual-realization laser velocimeter measurements is experimentally verified and shown to be independent of sample rate. The experiments were performed in a simple two-stream mixing shear flow with the standard for comparison being laser-velocimeter results obtained under continuous-wave conditions. It is also demonstrated that the errors resulting from sampling bias can be removed by a proper interpretation of the sampling statistics. In addition, data obtained in a shock-induced separated flow and in the near-wake of airfoils are presented, both bias-corrected and uncorrected, to illustrate the effects of sampling bias in the extreme.
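One standard textbook correction for individual-realization velocity bias is to weight each realization by the inverse of its velocity magnitude (McLaughlin-Tiederman weighting), which compensates for fast particles crossing the probe volume more often. The Python sketch below shows that weighting as an illustrative assumption; it is not necessarily the exact interpretation of the sampling statistics used by the authors, and the sample values are hypothetical.

```python
# Inverse-velocity-magnitude weighting of individual LV realizations (illustrative correction).
import numpy as np

def bias_corrected_mean(u, v=None):
    """Velocity-magnitude-weighted mean of the streamwise component u."""
    u = np.asarray(u, dtype=float)
    speed = np.abs(u) if v is None else np.hypot(u, np.asarray(v, dtype=float))
    weights = 1.0 / speed                 # fast particles are over-sampled, so down-weight them
    return np.sum(weights * u) / np.sum(weights)

samples = np.array([12.0, 15.0, 9.0, 14.0, 11.0])   # hypothetical individual realizations, m/s
print("arithmetic mean:", samples.mean(), " bias-corrected mean:", bias_corrected_mean(samples))
```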
Category V Compliant Container for Mars Sample Return Missions
NASA Technical Reports Server (NTRS)
Dolgin, Benjamin; Sanok, Joseph; Sevilla, Donald; Bement, Laurence J.
2000-01-01
A novel containerization technique that satisfies Planetary Protection (PP) Category V requirements has been developed and demonstrated on the mock-up of the Mars Sample Return Container. The proposed approach uses explosive welding with a sacrificial layer and cut-through-the-seam techniques. The technology produces a container that is free from Martian contaminants on an atomic level. The containerization technique can be used on any celestial body that may support life. A major advantage of the proposed technology is the possibility of very fast (less than an hour) verification of both containment and cleanliness with typical metallurgical laboratory equipment. No separate biological verification is required. In addition to meeting Category V requirements, the proposed container presents a surface that is free of any organisms, even nonviable ones, and of any molecular fragments of biological origin that are unique to Mars or any other celestial body other than Earth.
NASA/BLM APT, phase 2. Volume 2: Technology demonstration. [Arizona
NASA Technical Reports Server (NTRS)
1981-01-01
Techniques described include: (1) steps in the preprocessing of LANDSAT data; (2) the training of a classifier; (3) maximum likelihood classification and precision; (4) geometric correction; (5) class description; (6) digitizing; (7) digital terrain data; (8) an overview of sample design; (9) allocation and selection of primary sample units; (10) interpretation of secondary sample units; (11) data collection ground plots; (12) data reductions; (13) analysis for productivity estimation and map verification; (14) cost analysis; and (15) LANDSAT digital products. The evaluation of the pre-inventory planning for P.J. is included.
Gómez, Angel; Seyle, D Conor; Huici, Carmen; Swann, William B
2009-12-01
Recent research has demonstrated self-verification strivings in groups, such that people strive to verify collective identities, which are personal self-views (e.g., "sensitive") associated with group membership (e.g., "women"). Such demonstrations stop short of showing that the desire for self-verification can fully transcend the self-other barrier, as in people working to verify ingroup identities (e.g., "Americans are loud") even when such identities are not self-descriptive ("I am quiet and unassuming"). Five studies focus on such ingroup verification strivings. Results indicate that people prefer to interact with individuals who verify their ingroup identities over those who enhance these identities (Experiments 1-5). Strivings for ingroup identity verification were independent of the extent to which the identities were self-descriptive but were stronger among participants who were highly invested in their ingroup identities, as reflected in high certainty of these identities (Experiments 1-4) and high identification with the group (Experiments 1-5). In addition, whereas past demonstrations of self-verification strivings have been limited to efforts to verify the content of identities (Experiments 1 to 3), the findings also show that they strive to verify the valence of their identities (i.e., the extent to which the identities are valued; Experiments 4 and 5). Self-verification strivings, rather than self-enhancement strivings, appeared to motivate participants' strivings for ingroup identity verification. Links to collective self-verification strivings and social identity theory are discussed.
NASA Technical Reports Server (NTRS)
Venkatapathy, Ethiraj; Gage, Peter; Wright, Michael J.
2017-01-01
Mars Sample Return is our Grand Challenge for the coming decade. TPS (Thermal Protection System) nominal performance is not the key challenge. The main difficulty for designers is the need to verify unprecedented reliability for the entry system: current guidelines for prevention of backward contamination require that the probability of spores larger than 1 micron diameter escaping into the Earth environment be lower than one in a million for the entire system, and the allocation to TPS would be more stringent than that. For reference, the reliability allocation for Orion TPS is closer to 1 in 1,000, and the demonstrated reliability for previous human Earth return systems was closer to 1 in 100. Improving reliability by more than 3 orders of magnitude is a grand challenge indeed. The TPS community must embrace the possibility of new architectures that are focused on reliability above thermal performance and mass efficiency. The MSR (Mars Sample Return) EEV (Earth Entry Vehicle) will be hit with MMOD (Micrometeoroid and Orbital Debris) prior to reentry. A chute-less aero-shell design which allows for a self-righting shape was baselined in prior MSR studies, with the assumption that a passive system will maximize EEV robustness. Hence the aero-shell, along with the TPS, has to take ground impact and not break apart. System verification will require testing to establish ablative performance and thermal failure, but also testing of damage from MMOD and of structural performance at ground impact. Mission requirements will demand analysis, testing and verification that are focused on establishing reliability of the design. In this proposed talk, we will focus on the grand challenge of MSR EEV TPS and the need for innovative approaches to address challenges in modeling, testing, manufacturing and verification.
40 CFR 1065.342 - Sample dryer verification.
Code of Federal Regulations, 2011 CFR
2011-07-01
...)(2) to remove water from the sample gas, verify the performance upon installation, after major... before the sample gas reaches the analyzer. For example water can negatively interfere with a CLD's NOX... time. You may run this verification on the sample dryer alone, but you must use the maximum gas flow...
40 CFR 1065.342 - Sample dryer verification.
Code of Federal Regulations, 2010 CFR
2010-07-01
...)(2) to remove water from the sample gas, verify the performance upon installation, after major... before the sample gas reaches the analyzer. For example water can negatively interfere with a CLD's NOX... time. You may run this verification on the sample dryer alone, but you must use the maximum gas flow...
NASA Technical Reports Server (NTRS)
Johnson, Kenneth L.; White, K, Preston, Jr.
2012-01-01
The NASA Engineering and Safety Center was requested to improve on the Best Practices document produced for the NESC assessment, Verification of Probabilistic Requirements for the Constellation Program, by giving a recommended procedure for using acceptance sampling by variables techniques. This recommended procedure would be used as an alternative to the potentially resource-intensive acceptance sampling by attributes method given in the document. This document contains the outcome of the assessment.
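To make the acceptance-sampling-by-variables idea concrete, the sketch below applies the single-sided k-method decision rule: accept the lot when the margin between the specification limit and the sample mean is at least k sample standard deviations. The specification limit, measurements, and acceptability constant k are illustrative assumptions, not values from the NESC assessment.

```python
# k-method acceptance sampling by variables, single upper specification limit (example values).
import statistics

def accept_lot(measurements, upper_spec_limit, k):
    """Accept when (USL - sample mean) / sample std dev >= k."""
    mean = statistics.mean(measurements)
    s = statistics.stdev(measurements)
    return (upper_spec_limit - mean) / s >= k

data = [98.2, 99.1, 97.8, 98.6, 99.4, 98.0, 98.9, 97.5]   # hypothetical unit measurements
print(accept_lot(data, upper_spec_limit=103.0, k=2.5))      # -> True for these example numbers
```

Compared with attributes sampling, the variables rule extracts more information per unit measured, which is why far fewer samples are typically needed for the same protection.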
Corrigan, Damion K; Salton, Neale A; Preston, Chris; Piletsky, Sergey
2010-09-01
Cleaning verification is a scientific and economic problem for the pharmaceutical industry. A large amount of potential manufacturing time is lost to the process of cleaning verification. This involves the analysis of residues on soiled manufacturing equipment, with high-performance liquid chromatography (HPLC) being the predominantly employed analytical technique. The aim of this study was to develop a portable cleaning verification system for nelarabine using surface enhanced Raman spectroscopy (SERS). SERS was conducted using a portable Raman spectrometer and a commercially available SERS substrate to develop a rapid and portable cleaning verification system for nelarabine. Samples of standard solutions and swab extracts were deposited onto the SERS active surfaces, allowed to dry and then subjected to spectroscopic analysis. Nelarabine was amenable to analysis by SERS and the necessary levels of sensitivity were achievable. It is possible to use this technology for a semi-quantitative limits test. Replicate precision, however, was poor due to the heterogeneous drying pattern of nelarabine on the SERS active surface. Understanding and improving the drying process in order to produce a consistent SERS signal for quantitative analysis is desirable. This work shows the potential application of SERS for cleaning verification analysis. SERS may not replace HPLC as the definitive analytical technique, but it could be used in conjunction with HPLC so that swabbing is only carried out once the portable SERS equipment has demonstrated that the manufacturing equipment is below the threshold contamination level.
Alternative sample sizes for verification dose experiments and dose audits
NASA Astrophysics Data System (ADS)
Taylor, W. A.; Hansen, J. M.
1999-01-01
ISO 11137 (1995), "Sterilization of Health Care Products—Requirements for Validation and Routine Control—Radiation Sterilization", provides sampling plans for performing initial verification dose experiments and quarterly dose audits. Alternative sampling plans are presented which provide equivalent protection. These sampling plans can significantly reduce the cost of testing. These alternative sampling plans have been included in a draft ISO Technical Report (type 2). This paper examines the rationale behind the proposed alternative sampling plans. The protection provided by the current verification and audit sampling plans is first examined. Then methods for identifying equivalent plans are highlighted. Finally, methods for comparing the cost associated with the different plans are provided. This paper includes additional guidance for selecting between the original and alternative sampling plans not included in the technical report.
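The notion of "equivalent protection" can be illustrated by comparing the operating characteristic (probability of passing the experiment) of two attribute plans as a function of the per-item probability of a positive sterility test. The plan parameters below (n = 100 accepting up to 2 positives versus n = 40 accepting none) are example values for the sketch, not the plans specified in ISO 11137 or the draft technical report.

```python
# Operating characteristic comparison of two attribute sampling plans (example parameters).
from math import comb

def prob_pass(n_items, accept_positives, p_positive):
    """Binomial probability of observing accept_positives or fewer positives among n_items."""
    return sum(comb(n_items, k) * p_positive**k * (1 - p_positive)**(n_items - k)
               for k in range(accept_positives + 1))

for p in (0.01, 0.05, 0.10):
    print(f"p={p:.2f}  plan A (n=100, <=2 positives): {prob_pass(100, 2, p):.3f}"
          f"  plan B (n=40, 0 positives): {prob_pass(40, 0, p):.3f}")
```

Plotting or tabulating these curves over a range of p values is how one argues that a cheaper plan offers protection equivalent to the original.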
40 CFR 1065.545 - Verification of proportional flow control for batch sampling.
Code of Federal Regulations, 2014 CFR
2014-07-01
... control for batch sampling. 1065.545 Section 1065.545 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) AIR POLLUTION CONTROLS ENGINE-TESTING PROCEDURES Performing an Emission Test Over Specified Duty Cycles § 1065.545 Verification of proportional flow control for batch sampling. For any...
DEMONSTRATION AND QUALITY ASSURANCE PROJECT ...
A demonstration of field portable/mobile technologies for measuring trace elements in soil and sediments was conducted under the U.S. Environmental Protection Agency Superfund Innovative Technology Evaluation (SITE) Program. The demonstration took place from January 24 to 28, 2005, at the Kennedy Athletic, Recreational and Social Park at Kennedy Space Center on Merritt Island, Florida. The purpose of the demonstration was to verify the performance of various instruments that employ X-ray fluorescence (XRF) measurement technologies for the determination of 13 toxic elements in a variety of soil and sediment samples. Instruments from the following technology developers were demonstrated: Innov-X Systems, Inc.; NITON LLC (two instruments); Oxford Instruments Portable Division (formerly Metorex, Inc.); Oxford Instruments Analytical; Rigaku, Inc.; RONTEC USA Inc.; and Xcalibur XRF Services Inc. (Division of Elvatech Ltd.). This demonstration plan describes the procedures that will be used to verify the performance and cost of the XRF instruments provided by these technology developers. The plan incorporates the quality assurance and quality control elements needed to generate data of sufficient quality to perform this verification. A separate innovative technology verification report (ITVR) will be prepared for each instrument. The objective of this program is to promote the acceptance and use of innovative field technologies by providing well-documented performance data.
Cleaning and Cleanliness Verification Techniques for Mars Returned Sample Handling
NASA Technical Reports Server (NTRS)
Mickelson, E. T.; Lindstrom, D. J.; Allton, J. H.; Hittle, J. D.
2002-01-01
Precision cleaning and cleanliness verification techniques are examined as a subset of a comprehensive contamination control strategy for a Mars sample return mission. Additional information is contained in the original extended abstract.
INNOVATIVE TECHNOLOGY VERIFICATION REPORT " ...
The EnSys Petro Test System developed by Strategic Diagnostics Inc. (SDI), was demonstrated under the U.S. Environmental Protection Agency Superfund Innovative Technology Evaluation Program in June 2000 at the Navy Base Ventura County site in Port Hueneme, California. The purpose of the demonstration was to collect reliable performance and cost data for the EnSys Petro Test System and six other field measurement devices for total petroleum hydrocarbons (TPH) in soil. In addition to assessing ease of device operation, the key objectives of the demonstration included determining the (1) method detection limit, (2) accuracy and precision, (3) effects of interferents and soil moisture content on TPH measurement, (4) sample throughput, and (5) TPH measurement costs for each device. The demonstration involved analysis of both performance evaluation samples and environmental samples collected in four areas contaminated with gasoline, diesel, or other petroleum products. The performance and cost results for a given field measurement device were compared to those for an off-site laboratory reference method,
INNOVATIVE TECHNOLOGY VERIFICATION REPORT " ...
The Synchronous Scanning Luminoscope (Luminoscope) developed by the Oak Ridge National Laboratory in collaboration with Environmental Systems Corporation (ESC) was demonstrated under the U.S. Environmental Protection Agency Superfund Innovative Technology Evaluation Program in June 2000 at the Navy Base Ventura County site in Port Hueneme, California. The purpose of the demonstration was to collect reliable performance and cost data for the Luminoscope and six other field measurement devices for total petroleum hydrocarbons (TPH) in soil. In addition to assessing ease of device operation, the key objectives of the demonstration included determining the (1) method detection limit, (2) accuracy and precision, (3) effects of interferents and soil moisture content on TPH measurement, (4) sample throughput, and (5) TPH measurement costs for each device. The demonstration involved analysis of both performance evaluation samples and environmental samples collected in five areas contaminated with gasoline, diesel, lubricating oil, or other petroleum products. The performance and cost results for a given field measurement device were compared to those for an off-site laboratory reference method,
Formal verification of medical monitoring software using Z language: a representative sample.
Babamir, Seyed Morteza; Borhani, Mehdi
2012-08-01
Medical monitoring systems are useful aids that assist physicians in keeping patients under constant surveillance; however, ensuring that the systems take sound decisions remains a physician's concern. As a result, verification of the systems' behavior in monitoring patients is a matter of significance. In modern medical systems, patient monitoring is undertaken by software, so software verification of such systems has received attention. This verification can be achieved with formal languages, which have mathematical foundations; among others, the Z language is a suitable formal language that has been used for the formal verification of systems. This study aims to present a constructive method to verify a representative sample of a medical system in which the system is visually specified and formally verified against patient constraints stated in the Z language. Exploiting our past experience in formally modeling the Continuous Infusion Insulin Pump (CIIP), we treat the CIIP system as a representative sample of medical systems in the present study. The system is responsible for monitoring a diabetic patient's blood sugar.
The U.S. Environmental Protection Agency (EPA) has created the Environmental Technology Verification Program (ETV) to facilitate the deployment of innovative or improved environmental technologies through performance verification and dissemination of information. The goal of the...
The U.S. Environmental Protection Agency has created the Environmental Technology Verification Program to facilitate the deployment of innovative or improved environmental technologies through performance verification and dissemination of information. The goal of the ETV Program...
2007-03-01
Characterisation. In Nanotechnology Aerospace Applications – 2006 (pp. 4-1 – 4-8). Educational Notes RTO-EN-AVT-129bis, Paper 4. Neuilly-sur-Seine, France: RTO. [Remainder of the record is slide residue outlining a commercialisation process: Concept, Proof-of-Principle, Trial Samples, Engineering Verification Samples, Design Verification Samples, and the roles of design houses, users/integrators, and wafer-processing fabs.]
Peng, Jun; Chen, Yi-Ting; Chen, Chien-Lun; Li, Liang
2014-07-01
Large-scale metabolomics studies require a quantitative method that can generate metabolome data over an extended period with high technical reproducibility. We report a universal metabolome-standard (UMS) method, in conjunction with chemical isotope labeling liquid chromatography-mass spectrometry (LC-MS), to provide long-term analytical reproducibility and facilitate metabolome comparison among different data sets. In this method, a UMS of a specific type of sample, labeled with one form of an isotope reagent, is prepared a priori. The UMS is spiked into each individual sample, labeled with another form of the isotope reagent, in a metabolomics study. The resultant mixture is analyzed by LC-MS to provide relative quantification of the individual sample metabolome against the UMS. The UMS is independent of any particular study and of the time of analysis, and is useful for profiling the same type of samples in multiple studies. In this work, the UMS method was developed and applied to a urine metabolomics study of bladder cancer. A UMS of human urine was prepared by (13)C2-dansyl labeling of a pooled sample from 20 healthy individuals. This method was first used to profile the discovery samples to generate a list of putative biomarkers potentially useful for bladder cancer detection and then used to analyze the verification samples about one year later. Within the discovery sample set, three-month technical reproducibility was examined using a quality control sample, which showed a mean CV of 13.9% and a median CV of 9.4% across all quantified metabolites. Statistical analysis of the urine metabolome data showed a clear separation between the bladder cancer group and the control group in the discovery samples, which was confirmed by the verification samples. A receiver operating characteristic (ROC) test showed that the area under the curve (AUC) was 0.956 in the discovery data set and 0.935 in the verification data set. These results demonstrate the utility of the UMS method for long-term metabolomics and for discovering potential metabolite biomarkers for the diagnosis of bladder cancer.
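For readers unfamiliar with the reproducibility figures quoted above, the per-metabolite coefficient of variation (CV) is simply the standard deviation of the measured quantity across repeated quality-control injections divided by its mean. A minimal sketch, with invented numbers, follows.

```python
# Minimal sketch: per-metabolite CV from repeated QC injections (invented data).
import numpy as np

qc = np.array([            # rows: QC injections over time; columns: metabolites
    [1.02, 0.98, 1.10, 0.95],
    [0.97, 1.01, 1.05, 0.99],
    [1.05, 0.99, 1.15, 0.97],
])

cv_percent = qc.std(axis=0, ddof=1) / qc.mean(axis=0) * 100
print("mean CV %.1f%%, median CV %.1f%%" % (cv_percent.mean(), np.median(cv_percent)))
```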
Federal Register 2010, 2011, 2012, 2013, 2014
2011-12-29
... non-federal community, including the academic, commercial, and public safety sectors, to implement a..., Verification, Demonstration and Trials: Technical Workshop II on Coordinating Federal Government/Private Sector Spectrum Innovation Testing Needs AGENCY: The National Coordination Office (NCO) for Networking and...
NASA Technical Reports Server (NTRS)
1989-01-01
The design and verification requirements appropriate to hardware at the detail, subassembly, component, and engine levels are defined and correlated to the development demonstrations that provide verification that design objectives are achieved. The high pressure fuel turbopump requirements verification matrix provides correlation between design requirements and the tests required to verify that the requirements have been met.
Quantum money with classical verification
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gavinsky, Dmitry
We propose and construct a quantum money scheme that allows verification through classical communication with a bank. This is the first demonstration that a secure quantum money scheme exists that does not require quantum communication for coin verification. Our scheme is secure against adaptive adversaries; this property is not directly related to the possibility of classical verification, but none of the earlier quantum money constructions is known to possess it.
Quantum money with classical verification
NASA Astrophysics Data System (ADS)
Gavinsky, Dmitry
2014-12-01
We propose and construct a quantum money scheme that allows verification through classical communication with a bank. This is the first demonstration that a secure quantum money scheme exists that does not require quantum communication for coin verification. Our scheme is secure against adaptive adversaries; this property is not directly related to the possibility of classical verification, but none of the earlier quantum money constructions is known to possess it.
The U.S. EPA has created the Environmental Technology Verification program to facilitate the deployment of innovative or improved environmental technologies through performance verification and dissemination of information. The program tested the performance of baghouse filtrati...
The SeaHorn Verification Framework
NASA Technical Reports Server (NTRS)
Gurfinkel, Arie; Kahsai, Temesghen; Komuravelli, Anvesh; Navas, Jorge A.
2015-01-01
In this paper, we present SeaHorn, a software verification framework. The key distinguishing feature of SeaHorn is its modular design that separates the concerns of the syntax of the programming language, its operational semantics, and the verification semantics. SeaHorn encompasses several novelties: it (a) encodes verification conditions using an efficient yet precise inter-procedural technique, (b) provides flexibility in the verification semantics to allow different levels of precision, (c) leverages the state-of-the-art in software model checking and abstract interpretation for verification, and (d) uses Horn-clauses as an intermediate language to represent verification conditions which simplifies interfacing with multiple verification tools based on Horn-clauses. SeaHorn provides users with a powerful verification tool and researchers with an extensible and customizable framework for experimenting with new software verification techniques. The effectiveness and scalability of SeaHorn are demonstrated by an extensive experimental evaluation using benchmarks from SV-COMP 2015 and real avionics code.
The U.S. Environmental Protection Agency (EPA) has created the Environmental Technology Verification Program (ETV) to facilitate the deployment of innovative or improved environmental technologies through performance verification and dissemination of information. The goal of the...
Towards real-time VMAT verification using a prototype, high-speed CMOS active pixel sensor.
Zin, Hafiz M; Harris, Emma J; Osmond, John P F; Allinson, Nigel M; Evans, Philip M
2013-05-21
This work investigates the feasibility of using a prototype complementary metal oxide semiconductor active pixel sensor (CMOS APS) for real-time verification of volumetric modulated arc therapy (VMAT) treatment. The prototype CMOS APS used region-of-interest readout on the chip to allow fast imaging of up to 403.6 frames per second (f/s). The sensor was made larger (5.4 cm × 5.4 cm) using recent advances in photolithographic techniques but retains fast imaging speed through the sensor's regional readout. There is a paradigm shift in radiotherapy treatment verification with the advent of advanced treatment techniques such as VMAT. This work has demonstrated that the APS can track multi-leaf collimator (MLC) leaves moving at 18 mm s(-1) with an automatic edge-tracking algorithm at an accuracy better than 1.0 mm, even at the fastest imaging speed. The measured fluence distribution for an example VMAT delivery sampled at 50.4 f/s was shown to agree well with the planned fluence distribution, with an average gamma pass rate of 96% at 3%/3 mm. The MLC leaf motion and linac pulse rate variation delivered throughout the VMAT treatment can also be measured. The results demonstrate the potential of CMOS APS technology as a real-time radiotherapy dosimeter for delivery of complex treatments such as VMAT.
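The 3%/3 mm gamma pass rate quoted above is a standard dose/fluence comparison metric: a measured point passes if some planned point lies within a combined dose-difference and distance-to-agreement tolerance. The one-dimensional, brute-force sketch below illustrates the metric only; it is not the analysis pipeline used in the study, and the profiles are invented.

```python
# Illustrative 1-D gamma analysis (3%/3 mm, global normalization); toy data only.
import numpy as np

def gamma_1d(positions_mm, measured, planned, dose_pct=3.0, dta_mm=3.0):
    """Return the fraction of measured points with gamma <= 1."""
    dose_crit = dose_pct / 100.0 * planned.max()   # global dose criterion
    passes = 0
    for xm, dm in zip(positions_mm, measured):
        # gamma at this point: minimum over all planned points
        g2 = ((positions_mm - xm) / dta_mm) ** 2 + ((planned - dm) / dose_crit) ** 2
        if np.sqrt(g2.min()) <= 1.0:
            passes += 1
    return passes / len(measured)

x = np.linspace(0, 100, 201)                  # positions in mm
planned = np.exp(-((x - 50) / 20) ** 2)       # toy planned profile
measured = planned * 1.02 + 0.005             # small systematic offset
print("gamma pass rate: %.1f%%" % (100 * gamma_1d(x, measured, planned)))
```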
NASA Astrophysics Data System (ADS)
Lin, Y. Q.; Ren, W. X.; Fang, S. E.
2011-11-01
Although most vibration-based damage detection methods can acquire satisfactory verification on analytical or numerical structures, most of them may encounter problems when applied to real-world structures under varying environments. The damage detection methods that directly extract damage features from the periodically sampled dynamic time history response measurements are desirable but relevant research and field application verification are still lacking. In this second part of a two-part paper, the robustness and performance of the statistics-based damage index using the forward innovation model by stochastic subspace identification of a vibrating structure proposed in the first part have been investigated against two prestressed reinforced concrete (RC) beams tested in the laboratory and a full-scale RC arch bridge tested in the field under varying environments. Experimental verification is focused on temperature effects. It is demonstrated that the proposed statistics-based damage index is insensitive to temperature variations but sensitive to the structural deterioration or state alteration. This makes it possible to detect the structural damage for the real-scale structures experiencing ambient excitations and varying environmental conditions.
Measuring Data Quality Through a Source Data Verification Audit in a Clinical Research Setting.
Houston, Lauren; Probst, Yasmine; Humphries, Allison
2015-01-01
Health data have long been scrutinised in relation to data quality and integrity problems. Currently, no internationally accepted or "gold standard" method exists for measuring data quality and error rates within datasets. We conducted a source data verification (SDV) audit on a prospective clinical trial dataset. An audit plan was applied to conduct 100% manual verification checks on a 10% random sample of participant files. A quality assurance rule was developed whereby, if >5% of data variables were incorrect, a second 10% random sample would be extracted from the trial data set. Errors were coded as: correct, incorrect (valid or invalid), not recorded, or not entered. Audit-1 had a total error rate of 33% and audit-2 of 36%. The physiological section was the only audit section with <5% error. Data not recorded on case report forms had the greatest impact on error calculations. A significant association (p=0.00) was found between audit-1 and audit-2 and whether data were deemed correct or incorrect. Our study developed a straightforward method for performing an SDV audit; an audit rule was identified and error coding was implemented. The findings demonstrate that monitoring data quality through an SDV audit can identify data quality and integrity issues within clinical research settings, allowing quality improvements to be made. The authors suggest this approach be implemented for future research.
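The audit logic described above (a 10% random sample of files, 100% manual checks, and a >5% error rule triggering a second sample) can be expressed compactly. The sketch below is an illustration under assumed error codes and a hypothetical verification callback, not the authors' audit software.

```python
# Illustrative SDV audit sketch; error codes and the verify() callback are assumptions.
import random

def audit(files, verify, sample_fraction=0.10, threshold=0.05):
    """Run a source data verification pass on a random sample of files.
    verify(file) must return a list of per-variable codes such as
    'correct', 'incorrect', 'not_recorded', or 'not_entered'."""
    sample = random.sample(files, max(1, int(len(files) * sample_fraction)))
    codes = [code for f in sample for code in verify(f)]
    error_rate = sum(code != "correct" for code in codes) / len(codes)
    return error_rate, error_rate > threshold   # True => draw a second sample

# Toy usage with a made-up verification function
fake_verify = lambda f: random.choices(["correct", "incorrect"], weights=[0.7, 0.3], k=50)
rate, redo = audit([f"participant_{i}" for i in range(200)], fake_verify)
print(f"audit error rate: {rate:.1%}; second sample required: {redo}")
```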
This verification test was conducted according to procedures specifiedin the Test/QA Planfor Verification of Enzyme-Linked Immunosorbent Assay (ELISA) Test Kis for the Quantitative Determination of Endocrine Disrupting Compounds (EDCs) in Aqueous Phase Samples. Deviations to the...
Experimental preparation and verification of quantum money
NASA Astrophysics Data System (ADS)
Guan, Jian-Yu; Arrazola, Juan Miguel; Amiri, Ryan; Zhang, Weijun; Li, Hao; You, Lixing; Wang, Zhen; Zhang, Qiang; Pan, Jian-Wei
2018-03-01
A quantum money scheme enables a trusted bank to provide untrusted users with verifiable quantum banknotes that cannot be forged. In this work, we report a proof-of-principle experimental demonstration of the preparation and verification of unforgeable quantum banknotes. We employ a security analysis that takes experimental imperfections fully into account. We measure a total of 3.6 × 10^6 states in one verification round, limiting the forging probability to 10^-7 based on the security analysis. Our results demonstrate the feasibility of preparing and verifying quantum banknotes using currently available experimental techniques.
Magnetic cleanliness verification approach on tethered satellite
NASA Technical Reports Server (NTRS)
Messidoro, Piero; Braghin, Massimo; Grande, Maurizio
1990-01-01
Magnetic cleanliness testing was performed on the Tethered Satellite as the last step of an articulated verification campaign aimed at demonstrating the capability of the satellite to support its TEMAG (TEthered MAgnetometer) experiment. Tests at unit level and analytical predictions/correlations using a dedicated mathematical model (GANEW program) are also part of the verification activities. Details of the tests are presented, and the results of the verification are described together with recommendations for later programs.
Requirement Assurance: A Verification Process
NASA Technical Reports Server (NTRS)
Alexander, Michael G.
2011-01-01
Requirement Assurance is an act of requirement verification which assures the stakeholder or customer that a product requirement has produced its "as realized product" and has been verified with conclusive evidence. Product requirement verification answers the question, "did the product meet the stated specification, performance, or design documentation?". In order to ensure the system was built correctly, the practicing system engineer must verify each product requirement using verification methods of inspection, analysis, demonstration, or test. The products of these methods are the "verification artifacts" or "closure artifacts" which are the objective evidence needed to prove the product requirements meet the verification success criteria. Institutional direction is given to the System Engineer in NPR 7123.1A NASA Systems Engineering Processes and Requirements with regards to the requirement verification process. In response, the verification methodology offered in this report meets both the institutional process and requirement verification best practices.
Hard and Soft Safety Verifications
NASA Technical Reports Server (NTRS)
Wetherholt, Jon; Anderson, Brenda
2012-01-01
The purpose of this paper is to examine the differences between and the effects of hard and soft safety verifications. Initially, the terminology should be defined and clarified. A hard safety verification is a datum which demonstrates how a safety control is enacted; an example is relief valve testing. A soft safety verification is something usually described as nice to have, but not necessary to prove safe operation; an example is the loss of the Solid Rocket Booster (SRB) casings from Shuttle flight STS-4. When the main parachutes failed, the casings impacted the water and sank. In the nose cap of the SRBs, video cameras recorded the release of the parachutes to determine safe operation and to provide information for potential anomaly resolution. Generally, examination of the casings and nozzles contributed to understanding of the newly developed boosters and their operation. Safety verification of SRB operation was demonstrated by examination of the casings and nozzle for erosion or wear. Loss of the SRBs and associated data did not delay the launch of the next Shuttle flight.
DOE Office of Scientific and Technical Information (OSTI.GOV)
L. M. Dittmer
The 100-B-14:2 subsite encompasses the former sanitary sewer feeder lines associated with the 1607-B2 and 1607-B7 septic systems. Feeder lines associated with the 185/190-B building have also been identified as the 100-B-14:8 subsite, and feeder lines associated with the 1607-B7 septic system have also been identified as the 100-B-14:9 subsite. These two subsites have been administratively cancelled to resolve the redundancy. The results of verification sampling show that residual contaminant concentrations do not preclude any future uses and allow for unrestricted use of shallow zone soils. The results also demonstrate that residual contaminant concentrations are protective of groundwater and the Columbia River.
DOE Office of Scientific and Technical Information (OSTI.GOV)
L. M. Dittmer
The 1607-B2 waste site is a former septic system associated with various 100-B facilities, including the 105-B, 108-B, 115-B/C, and 185/190-B buildings. The site was evaluated based on confirmatory results for feeder lines within the 100-B-14:2 subsite and determined to require remediation. The 1607-B2 waste site has been remediated to achieve the remedial action objectives specified in the Remaining Sites ROD. The results of verification sampling show that residual contaminant concentrations do not preclude any future uses and allow for unrestricted use of shallow zone soils. The results also demonstrate that residual contaminant concentrations are protective of groundwater and the Columbia River.
Automated biowaste sampling system urine subsystem operating model, part 1
NASA Technical Reports Server (NTRS)
Fogal, G. L.; Mangialardi, J. K.; Rosen, F.
1973-01-01
The urine subsystem automatically provides for the collection, volume sensing, and sampling of urine from six subjects during space flight. Verification of the subsystem design was a primary objective of the current effort, which was accomplished through the detail design, fabrication, and verification testing of an operating model of the subsystem.
Improving semi-text-independent method of writer verification using difference vector
NASA Astrophysics Data System (ADS)
Li, Xin; Ding, Xiaoqing
2009-01-01
The semi-text-independent method of writer verification based on the linear framework can use all characters of two handwritings to discriminate between writers when the text contents are known. The handwritings are allowed to contain only small numbers of, or even totally different, characters. This fills the gap between the classical text-dependent and text-independent methods of writer verification. Moreover, in this paper the information about what each character is is exploited by the semi-text-independent method. Two types of standard templates, generated from many writer-unknown handwritten samples and printed samples of each character, are introduced to represent the content information of each character. Difference vectors for the character samples are obtained by subtracting the standard templates from the original feature vectors and are used in place of the original vectors in the writer verification process. By removing a large amount of content information while retaining the style information, the verification accuracy of the semi-text-independent method is improved. On a handwriting database involving 30 writers, when the query handwriting and the reference handwriting are each composed of 30 distinct characters, the average equal error rate (EER) of writer verification reaches 9.96%; when the handwritings contain 50 characters, the average EER falls to 6.34%, which is 23.9% lower than the EER obtained without the difference vectors.
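The core operation of the method above is the subtraction of a per-character standard template from each character's feature vector, leaving mostly style information. A minimal sketch of that step, with invented shapes and values, follows.

```python
# Minimal sketch of the difference-vector step; shapes and data are invented.
import numpy as np

def difference_vectors(features, char_labels, templates):
    """features: (n_chars, dim) array; char_labels: length-n list of characters;
    templates: dict mapping character -> (dim,) standard template vector."""
    return np.stack([f - templates[c] for f, c in zip(features, char_labels)])

# Toy example: two characters, 4-dimensional features
templates = {"a": np.array([1.0, 0.5, 0.2, 0.0]), "b": np.array([0.3, 0.9, 0.1, 0.4])}
feats = np.array([[1.1, 0.4, 0.3, 0.1], [0.2, 1.0, 0.2, 0.5]])
print(difference_vectors(feats, ["a", "b"], templates))
```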
Tuerxunyiming, Muhadasi; Xian, Feng; Zi, Jin; Yimamu, Yilihamujiang; Abuduwayite, Reshalaiti; Ren, Yan; Li, Qidan; Abudula, Abulizi; Liu, SiQi; Mohemaiti, Patamu
2018-01-05
Maturity-onset diabetes of the young (MODY) is an inherited monogenic type of diabetes. Genetic mutations in MODY often cause nonsynonymous changes that directly lead to functional distortion of proteins and pathological consequences. Herein, we proposed that the inherited mutations found in a MODY family could cause a disturbance of protein abundance, specifically in serum. Serum samples were collected from a Uyghur MODY family across three generations, and the serum proteins remaining after depletion treatment were examined by quantitative proteomics to characterize MODY-related serum proteins, followed by verification using targeted quantitative proteomics. A total of 32 serum proteins were preliminarily identified as MODY-related. A further verification test on the individual samples demonstrated 12 candidates with significantly different abundance in the MODY patients. A comparison of the 12 proteins among the sera of type 1 diabetes, type 2 diabetes, MODY, and healthy subjects revealed a protein signature related to MODY composed of serum proteins such as SERPINA7, APOC4, LPA, C6, and F5.
The Mars Science Laboratory Organic Check Material
NASA Astrophysics Data System (ADS)
Conrad, Pamela G.; Eigenbrode, Jennifer L.; Von der Heydt, Max O.; Mogensen, Claus T.; Canham, John; Harpold, Dan N.; Johnson, Joel; Errigo, Therese; Glavin, Daniel P.; Mahaffy, Paul R.
2012-09-01
Mars Science Laboratory's Curiosity rover carries a set of five external verification standards in hermetically sealed containers that can be sampled as would be a Martian rock, by drilling and then portioning into the solid sample inlet of the Sample Analysis at Mars (SAM) suite. Each organic check material (OCM) canister contains a porous ceramic solid, which has been doped with a fluorinated hydrocarbon marker that can be detected by SAM. The purpose of the OCM is to serve as a verification tool for the organic cleanliness of those parts of the sample chain that cannot be cleaned other than by dilution, i.e., repeated sampling of Martian rock. SAM possesses internal calibrants for verification of both its performance and its internal cleanliness, and the OCM is not used for that purpose. Each OCM unit is designed for one use only, and the choice to do so will be made by the project science group (PSG).
One-time pad, complexity of verification of keys, and practical security of quantum cryptography
DOE Office of Scientific and Technical Information (OSTI.GOV)
Molotkov, S. N., E-mail: sergei.molotkov@gmail.com
2016-11-15
A direct relation between the complexity of the complete verification of keys, which is one of the main criteria of security in classical systems, and a trace distance used in quantum cryptography is demonstrated. Bounds for the minimum and maximum numbers of verification steps required to determine the actual key are obtained.
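For reference, the trace distance mentioned above is the standard quantum-information quantity; this is the textbook definition, not a result of the paper:

```latex
D(\rho,\sigma) \;=\; \tfrac{1}{2}\,\lVert \rho - \sigma \rVert_{1}
            \;=\; \tfrac{1}{2}\,\operatorname{Tr}\sqrt{(\rho-\sigma)^{\dagger}(\rho-\sigma)}
```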
DOE Office of Scientific and Technical Information (OSTI.GOV)
King, David A.
2012-08-16
Oak Ridge Associated Universities (ORAU) conducted in-process inspections and independent verification (IV) surveys in support of DOE's remedial efforts in Zone 1 of East Tennessee Technology Park (ETTP) in Oak Ridge, Tennessee. Inspections concluded that the remediation contractor's soil removal and survey objectives were satisfied and the dynamic verification strategy (DVS) was implemented as designed. Independent verification (IV) activities included gamma walkover surveys and soil sample collection/analysis over multiple exposure units (EUs).
Results of the performance verification of the CoaguChek XS system.
Plesch, W; Wolf, T; Breitenbeck, N; Dikkeschei, L D; Cervero, A; Perez, P L; van den Besselaar, A M H P
2008-01-01
This is the first paper reporting a performance verification study of a point-of-care (POC) monitor for prothrombin time (PT) testing according to the requirements given in chapter 8 of the International Organization for Standardization (ISO) 17593:2007 standard "Clinical laboratory testing and in vitro medical devices - Requirements for in vitro monitoring systems for self-testing of oral anticoagulant therapy". The monitor under investigation was the new CoaguChek XS system which is designed for use in patient self testing. Its detection principle is based on the amperometric measurement of the thrombin activity generated by starting the coagulation cascade using a recombinant human thromboplastin. The system performance verification study was performed at four study centers using venous and capillary blood samples on two test strip lots. Laboratory testing was performed from corresponding frozen plasma samples with six commercial thromboplastins. Samples from 73 normal donors and 297 patients on oral anticoagulation therapy were collected. Results were assessed using a refined data set of 260 subjects according to the ISO 17593:2007 standard. Each of the two test strip lots met the acceptance criteria of ISO 17593:2007 versus all thromboplastins (bias -0.19 to 0.18 INR; >97% of data within accuracy limits). The coefficient of variation for imprecision of the PT determinations in INR ranged from 2.0% to 3.2% in venous, and from 2.9% to 4.0% in capillary blood testing. Capillary versus venous INR data showed agreement of results with regression lines equal to the line of identity. The new system demonstrated a high level of trueness and accuracy, and low imprecision in INR testing. It can be concluded that the CoaguChek XS system complies with the requirements in chapter 8 of the ISO standard 17593:2007.
The impact of non-concordant self-report of substance use in clinical trials research.
Clark, C Brendan; Zyambo, Cosmas M; Li, Ye; Cropsey, Karen L
2016-07-01
Studies comparing self-reported substance use data to biochemical verification generally demonstrate high rates of concordance. We argue that these rates are due to the relatively high true negative rate in the general population and the high degree of honesty in treatment-seeking individuals. We hypothesized that high-risk individuals not seeking treatment would demonstrate low concordance and a high false negative rate of self-reported substance use. A sample of 500 individuals from a smoking cessation clinical trial was assessed over 1 year. Assessments included semi-structured interviews, questionnaires (e.g., the Addiction Severity Index), and urine drug screen assays (UDS). Generalized estimating equations (GEEs) were used to predict false negative reports for various substances across the study and to determine the influence of substance use on the primary study outcome of smoking cessation. Participants demonstrated high false negative rates in reporting substance use, and the false negative rates increased as the study progressed. Established predictors of false negatives generalized to the current sample. High concordance and low false negative rates were found for self-reported nicotine use. A small but significant effect of biochemically verified substance use on smoking cessation was found. Biochemical verification of substance use is needed in high-risk populations involved in studies not directly related to the treatment of substance use, especially in populations facing a high threat of stigmatization. Testing should continue throughout the study period for maximal identification of substance use. Copyright © 2016 Elsevier Ltd. All rights reserved.
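Treating the urine drug screen as the criterion measure, the false negative rate of self-report is the fraction of biochemically positive assessments at which use was not reported. A minimal sketch with invented data:

```python
# Minimal sketch: false-negative rate of self-report against a biochemical criterion.
def false_negative_rate(self_report, uds_positive):
    """Both arguments are parallel lists of booleans (True = use reported/detected)."""
    fn = sum((not sr) and uds for sr, uds in zip(self_report, uds_positive))
    positives = sum(uds_positive)
    return fn / positives if positives else float("nan")

self_report  = [False, False, True, False, True, False]   # invented assessments
uds_positive = [True,  False, True, True,  True, False]
print(f"false-negative rate: {false_negative_rate(self_report, uds_positive):.0%}")
```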
Feasibility of biochemical verification in a web-based smoking cessation study.
Cha, Sarah; Ganz, Ollie; Cohn, Amy M; Ehlke, Sarah J; Graham, Amanda L
2017-10-01
Cogent arguments have been made against the need for biochemical verification in population-based studies with low-demand characteristics. Despite this, studies involving digital interventions (low demand) are often required in peer review to report biochemically verified abstinence. To address this discrepancy, we examined the feasibility and costs of biochemical verification in a web-based study conducted with a national sample. Participants were 600 U.S. adult current smokers who registered on a web-based smoking cessation program and completed surveys at baseline and 3 months. Saliva sampling kits were sent to participants who reported 7-day abstinence at 3 months, and the samples were analyzed for cotinine. The response rate at 3 months was 41.2% (n=247): 93 participants reported 7-day abstinence (38%) and were mailed a saliva kit (71% returned). The discordance rate was 36.4%. Participants with discordant responses were more likely than those with concordant responses to report 3-month use of nicotine replacement therapy or e-cigarettes (79.2% vs. 45.2%, p=0.007). The total cost of saliva sampling was $8280 ($125/sample). Biochemical verification was both time- and cost-intensive, and yielded a relatively small number of samples due to low response rates and the use of other nicotine products during the follow-up period. There was a high rate of discordance between self-reported abstinence and saliva testing. Costs for data collection may be prohibitive for studies with large sample sizes or limited budgets. Our findings echo previous statements that biochemical verification is not necessary in population-based studies, and add evidence specific to technology-based studies. Copyright © 2017 Elsevier Ltd. All rights reserved.
Design of an occulter testbed at flight Fresnel numbers
NASA Astrophysics Data System (ADS)
Sirbu, Dan; Kasdin, N. Jeremy; Kim, Yunjong; Vanderbei, Robert J.
2015-01-01
An external occulter is a spacecraft flown along the line of sight of a space telescope to suppress starlight and enable high-contrast direct imaging of exoplanets. Laboratory verification of occulter designs is necessary to validate the optical models used to design and predict occulter performance. At Princeton, we are designing and building a testbed that allows verification of scaled occulter designs whose suppressed shadow is mathematically identical to that of space occulters. Here, we present a sample design operating at a flight Fresnel number, and thus representative of a realistic space mission. We present calculations of experimental limits arising from the finite size and propagation distance available in the testbed, limitations due to manufacturing feature size, and a non-ideal input beam. We demonstrate how the testbed is designed to be feature-size limited, and provide an estimate of the expected performance.
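The "flight Fresnel number" referred to above is the standard diffraction scaling parameter; in the usual notation (symbols assumed here, not taken from the paper),

```latex
N_{F} \;=\; \frac{a^{2}}{\lambda\,Z}
```

where a is the occulter radius, λ the observing wavelength, and Z the occulter–telescope separation. A scaled laboratory occulter that preserves N_F produces a shadow governed by the same Fresnel diffraction integral as the full-scale space configuration.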
Wan, Xiaohua; Katchalski, Tsvi; Churas, Christopher; Ghosh, Sreya; Phan, Sebastien; Lawrence, Albert; Hao, Yu; Zhou, Ziying; Chen, Ruijuan; Chen, Yu; Zhang, Fa; Ellisman, Mark H
2017-05-01
Because of the significance of electron microscope tomography in the investigation of biological structure at nanometer scales, ongoing improvement efforts have been continuous over recent years. This is particularly true in the case of software developments. Nevertheless, verification of improvements delivered by new algorithms and software remains difficult. Current analysis tools do not provide adaptable and consistent methods for quality assessment. This is particularly true with images of biological samples, due to image complexity, variability, low contrast and noise. We report an electron tomography (ET) simulator with accurate ray optics modeling of image formation that includes curvilinear trajectories through the sample, warping of the sample and noise. As a demonstration of the utility of our approach, we have concentrated on providing verification of the class of reconstruction methods applicable to wide field images of stained plastic-embedded samples. Accordingly, we have also constructed digital phantoms derived from serial block face scanning electron microscope images. These phantoms are also easily modified to include alignment features to test alignment algorithms. The combination of more realistic phantoms with more faithful simulations facilitates objective comparison of acquisition parameters, alignment and reconstruction algorithms and their range of applicability. With proper phantoms, this approach can also be modified to include more complex optical models, including distance-dependent blurring and phase contrast functions, such as may occur in cryotomography. Copyright © 2017 Elsevier Inc. All rights reserved.
Galaxy-galaxy lensing in the Dark Energy Survey Science Verification data
NASA Astrophysics Data System (ADS)
Clampitt, J.; Sánchez, C.; Kwan, J.; Krause, E.; MacCrann, N.; Park, Y.; Troxel, M. A.; Jain, B.; Rozo, E.; Rykoff, E. S.; Wechsler, R. H.; Blazek, J.; Bonnett, C.; Crocce, M.; Fang, Y.; Gaztanaga, E.; Gruen, D.; Jarvis, M.; Miquel, R.; Prat, J.; Ross, A. J.; Sheldon, E.; Zuntz, J.; Abbott, T. M. C.; Abdalla, F. B.; Armstrong, R.; Becker, M. R.; Benoit-Lévy, A.; Bernstein, G. M.; Bertin, E.; Brooks, D.; Burke, D. L.; Carnero Rosell, A.; Carrasco Kind, M.; Cunha, C. E.; D'Andrea, C. B.; da Costa, L. N.; Desai, S.; Diehl, H. T.; Dietrich, J. P.; Doel, P.; Estrada, J.; Evrard, A. E.; Fausti Neto, A.; Flaugher, B.; Fosalba, P.; Frieman, J.; Gruendl, R. A.; Honscheid, K.; James, D. J.; Kuehn, K.; Kuropatkin, N.; Lahav, O.; Lima, M.; March, M.; Marshall, J. L.; Martini, P.; Melchior, P.; Mohr, J. J.; Nichol, R. C.; Nord, B.; Plazas, A. A.; Romer, A. K.; Sanchez, E.; Scarpine, V.; Schubnell, M.; Sevilla-Noarbe, I.; Smith, R. C.; Soares-Santos, M.; Sobreira, F.; Suchyta, E.; Swanson, M. E. C.; Tarle, G.; Thomas, D.; Vikram, V.; Walker, A. R.
2017-03-01
We present galaxy-galaxy lensing results from 139 deg^2 of Dark Energy Survey (DES) Science Verification (SV) data. Our lens sample consists of red galaxies, known as redMaGiC, which are specifically selected to have a low photometric redshift error and outlier rate. The lensing measurement has a total signal-to-noise ratio of 29 over scales 0.09 < R < 15 Mpc h^-1, including all lenses over a wide redshift range 0.2 < z < 0.8. Dividing the lenses into three redshift bins for this constant moving number density sample, we find no evidence for evolution in the halo mass with redshift. We obtain consistent results for the lensing measurement with two independent shear pipelines, NGMIX and IM3SHAPE. We perform a number of null tests on the shear and photometric redshift catalogues and quantify resulting systematic uncertainties. Covariances from jackknife subsamples of the data are validated with a suite of 50 mock surveys. The result and systematic checks in this work provide a critical input for future cosmological and galaxy evolution studies with the DES data and redMaGiC galaxy samples. We fit a halo occupation distribution (HOD) model, and demonstrate that our data constrain the mean halo mass of the lens galaxies, despite strong degeneracies between individual HOD parameters.
Martinez-Garcia, Elena; Lesur, Antoine; Devis, Laura; Campos, Alexandre; Cabrera, Silvia; van Oostrum, Jan; Matias-Guiu, Xavier; Gil-Moreno, Antonio; Reventos, Jaume; Colas, Eva; Domon, Bruno
2016-08-16
About 30% of endometrial cancer (EC) patients are diagnosed at an advanced stage of the disease, which is associated with a drastic decrease in the 5-year survival rate. The identification of biomarkers in uterine aspirate samples, which are collected by a minimally invasive procedure, would improve early diagnosis of EC. We present a sequential workflow to select, from a list of potential EC biomarkers, those most promising to enter a validation study. After the elimination of confounding contributions by residual blood proteins, 52 potential biomarkers were analyzed in uterine aspirates from 20 EC patients and 18 non-EC controls by a high-resolution accurate-mass spectrometer operated in parallel reaction monitoring mode. Differential abundance was observed for 26 biomarkers, and among them ten proteins showed high sensitivity and specificity (AUC > 0.9). The study demonstrates that uterine aspirates are valuable samples for EC protein biomarker screening. It also illustrates the importance of a biomarker verification phase to fill the gap between discovery and validation studies and highlights the benefits of high-resolution mass spectrometry for this purpose. The proteins verified in this study have an increased likelihood of becoming a clinical assay after a subsequent validation phase.
A Rubric for Extracting Idea Density from Oral Language Samples
Chand, Vineeta; Baynes, Kathleen; Bonnici, Lisa M.; Farias, Sarah Tomaszewski
2012-01-01
While past research has demonstrated that low idea density (ID) scores from natural language samples correlate with late life risk for cognitive decline and Alzheimer’s disease pathology, there are no published rubrics for collecting and analyzing language samples for idea density to verify or extend these findings into new settings. This paper outlines the history of ID research and findings, discusses issues with past rubrics, and then presents an operationalized method for the systematic measurement of ID in language samples, with an extensive manual available as a supplement to this article (Analysis of Idea Density, AID). Finally, reliability statistics for this rubric in the context of dementia research on aging populations and verification that AID can replicate the significant association between ID and late life cognition are presented. PMID:23042498
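Idea density itself is commonly expressed as the number of propositions per ten words of the language sample (conventions differ slightly across rubrics):

```latex
\mathrm{ID} \;=\; \frac{\text{number of propositions expressed}}{\text{number of words in the sample}} \times 10
```

so, for example, a 100-word sample containing 48 propositions would score ID = 4.8.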
NASA Technical Reports Server (NTRS)
Johnson, Kenneth L.; White, K. Preston, Jr.
2012-01-01
The NASA Engineering and Safety Center was requested to improve on the Best Practices document produced for the NESC assessment, Verification of Probabilistic Requirements for the Constellation Program, by giving a recommended procedure for using acceptance sampling by variables techniques as an alternative to the potentially resource-intensive acceptance sampling by attributes method given in the document. In this paper, the results of empirical tests intended to assess the accuracy of acceptance sampling plan calculators implemented for six variable distributions are presented.
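As a reminder of the technique being assessed, single-sided acceptance sampling by variables (the "k-method") accepts a lot when the sample mean lies at least k sample standard deviations inside the specification limit. The sketch below illustrates the decision rule only; the plan parameters n and k are invented and are not taken from the NESC document.

```python
# Illustrative k-method acceptance sampling by variables; plan parameters are invented.
import numpy as np

def accept_by_variables(measurements, upper_spec_limit, k):
    """Accept the lot if (USL - sample mean) / sample std >= k."""
    x = np.asarray(measurements, dtype=float)
    xbar, s = x.mean(), x.std(ddof=1)
    return (upper_spec_limit - xbar) / s >= k

rng = np.random.default_rng(0)
sample = rng.normal(loc=9.0, scale=0.5, size=30)   # n = 30 simulated measurements
print("accept lot:", accept_by_variables(sample, upper_spec_limit=10.5, k=2.0))
```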
Enhanced verification test suite for physics simulation codes
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kamm, James R.; Brock, Jerry S.; Brandon, Scott T.
2008-09-01
This document discusses problems with which to augment, in quantity and in quality, the existing tri-laboratory suite of verification problems used by Los Alamos National Laboratory (LANL), Lawrence Livermore National Laboratory (LLNL), and Sandia National Laboratories (SNL). The purpose of verification analysis is to demonstrate whether the numerical results of the discretization algorithms in physics and engineering simulation codes provide correct solutions of the corresponding continuum equations.
Multi-canister overpack project -- verification and validation, MCNP 4A
DOE Office of Scientific and Technical Information (OSTI.GOV)
Goldmann, L.H.
This supporting document contains the software verification and validation (V and V) package used for Phase 2 design of the Spent Nuclear Fuel Multi-Canister Overpack. V and V packages for both ANSYS and MCNP are included. Description of verification run(s): the software must be compiled specifically for the machine on which it is to be used. Therefore, to facilitate the verification process, the software automatically runs 25 sample problems to ensure proper installation and compilation. Once the runs are completed, the software checks for verification by performing a file comparison between the new output files and the old output files; any difference between the files causes a verification error. Because of the manner in which the verification is performed, a verification error does not necessarily indicate a problem; it indicates that a closer look at the output files is needed to determine the cause of the error.
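The installation-verification step described above amounts to running the sample problems and diffing the new outputs against reference outputs. A generic sketch of that loop is shown below; the directory layout and file naming are assumptions, not the actual MCNP 4A package structure.

```python
# Generic installation-verification sketch; paths and file names are placeholders.
import filecmp
from pathlib import Path

def verify_installation(new_dir, reference_dir, problems):
    """Return the sample problems whose new output differs from the reference output."""
    failures = []
    for name in problems:
        new_out = Path(new_dir) / f"{name}.out"
        ref_out = Path(reference_dir) / f"{name}.out"
        if not filecmp.cmp(new_out, ref_out, shallow=False):
            failures.append(name)   # a difference warrants a closer look, not necessarily a bug
    return failures

# Example usage (paths are placeholders):
# problems = [f"sample{i:02d}" for i in range(1, 26)]   # the 25 sample problems
# print("needs review:", verify_installation("runs/new", "runs/reference", problems))
```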
Toward Automatic Verification of Goal-Oriented Flow Simulations
NASA Technical Reports Server (NTRS)
Nemec, Marian; Aftosmis, Michael J.
2014-01-01
We demonstrate the power of adaptive mesh refinement with adjoint-based error estimates in verification of simulations governed by the steady Euler equations. The flow equations are discretized using a finite volume scheme on a Cartesian mesh with cut cells at the wall boundaries. The discretization error in selected simulation outputs is estimated using the method of adjoint-weighted residuals. Practical aspects of the implementation are emphasized, particularly in the formulation of the refinement criterion and the mesh adaptation strategy. Following a thorough code verification example, we demonstrate simulation verification of two- and three-dimensional problems. These involve an airfoil performance database, a pressure signature of a body in supersonic flow and a launch abort with strong jet interactions. The results show reliable estimates and automatic control of discretization error in all simulations at an affordable computational cost. Moreover, the approach remains effective even when theoretical assumptions, e.g., steady-state and solution smoothness, are relaxed.
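The adjoint-weighted residual estimate referenced above takes the standard form (generic notation assumed here; sign conventions vary):

```latex
J(u) - J(u_{H}) \;\approx\; -\,\psi^{\mathsf{T}} R(u_{H})
```

where u_H is the current discrete solution, R(u_H) its residual evaluated in a richer (e.g. embedded or refined) space, ψ the corresponding discrete adjoint solution for the output J, and the local contributions to this inner product serve as the mesh refinement indicator.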
NASA Technical Reports Server (NTRS)
Powell, John D.
2003-01-01
This document discusses the verification of the Secure Socket Layer (SSL) communication protocol as a demonstration of the Model Based Verification (MBV) portion of the verification instrument set being developed under the Reducing Software Security Risk Through an Integrated Approach research initiative. Code Q of the National Aeronautics and Space Administration (NASA) funds this project. The NASA Goddard Independent Verification and Validation (IV&V) facility manages this research program at the NASA agency level, and the Assurance Technology Program Office (ATPO) manages the research locally at the Jet Propulsion Laboratory (California Institute of Technology), where the research is being carried out.
Madon, Stephanie; Guyll, Max; Buller, Ashley A.; Scherr, Kyle C.; Willard, Jennifer; Spoth, Richard
2010-01-01
This research examined whether self-fulfilling prophecy effects are mediated by self-verification, informational conformity, and modeling processes. The authors examined these mediational processes across multiple time frames with longitudinal data obtained from two samples of mother–child dyads (N1 = 487; N2 = 287). Children's alcohol use was the outcome variable. The results provided consistent support for the mediational process of self-verification. In both samples and across several years of adolescence, there was a significant indirect effect of mothers' beliefs on children's alcohol use through children's self-assessed likelihood of drinking alcohol in the future. Comparatively less support was found for informational conformity and modeling processes as mediators of mothers' self-fulfilling effects. The potential for self-fulfilling prophecies to produce long-lasting changes in targets' behavior via self-verification processes is discussed. PMID:18665708
Madon, Stephanie; Guyll, Max; Buller, Ashley A; Scherr, Kyle C; Willard, Jennifer; Spoth, Richard
2008-08-01
This research examined whether self-fulfilling prophecy effects are mediated by self-verification, informational conformity, and modeling processes. The authors examined these mediational processes across multiple time frames with longitudinal data obtained from two samples of mother-child dyads (N-sub-1 = 486; N-sub-2 = 287), with children's alcohol use as the outcome variable. The results provided consistent support for the mediational process of self-verification. In both samples and across several years of adolescence, there was a significant indirect effect of mothers' beliefs on children's alcohol use through children's self-assessed likelihood of drinking alcohol in the future. Comparatively less support was found for informational conformity and modeling processes as mediators of mothers' self-fulfilling effects. The potential for self-fulfilling prophecies to produce long-lasting changes in targets' behavior via self-verification processes is discussed. (c) 2008 APA, all rights reserved
Valentiner, David P; Skowronski, John J; McGrath, Patrick B; Smith, Sarah A; Renner, Kerry A
2011-10-01
A self-verification model of social anxiety views negative social self-esteem as a core feature of social anxiety. This core feature is proposed to be maintained through self-verification processes, such as by leading individuals with negative social self-esteem to prefer negative social feedback. This model is tested in two studies. In Study 1, questionnaires were administered to a college sample (N = 317). In Study 2, questionnaires were administered to anxiety disordered patients (N = 62) before and after treatment. Study 1 developed measures of preference for negative social feedback and social self-esteem, and provided evidence of their incremental validity in a college sample. Study 2 found that these two variables are not strongly related to fears of evaluation, are relatively unaffected by a treatment that targets such fears, and predict residual social anxiety following treatment. Overall, these studies provide preliminary evidence for a self-verification model of social anxiety.
Offline signature verification using convolution Siamese network
NASA Astrophysics Data System (ADS)
Xing, Zi-Jian; Yin, Fei; Wu, Yi-Chao; Liu, Cheng-Lin
2018-04-01
This paper presents an offline signature verification approach using a convolutional Siamese neural network. Unlike existing methods which treat feature extraction and metric learning as two independent stages, we adopt a deep-learning-based framework which combines the two stages and can be trained end-to-end. Experimental results on two public offline databases (GPDSsynthetic and CEDAR) demonstrate the superiority of our method on the offline signature verification problem.
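A Siamese verifier applies one shared embedding to both the reference and the questioned signature and thresholds the distance between the two embeddings. The sketch below is only a toy illustration of that decision structure, using a fixed random linear embedding; the paper's actual model is a learned convolutional network trained end-to-end.

```python
# Toy Siamese-style decision sketch; the embedding, features, and threshold are invented.
import numpy as np

rng = np.random.default_rng(1)
W = rng.normal(size=(32, 128))            # shared embedding weights (illustrative only)

def embed(x):
    return np.tanh(W @ x)                 # identical mapping applied to both branches

def verify(reference_feat, questioned_feat, threshold=3.0):
    dist = np.linalg.norm(embed(reference_feat) - embed(questioned_feat))
    return dist < threshold               # True => accept as the same writer

ref, questioned = rng.normal(size=128), rng.normal(size=128)
print("accepted:", verify(ref, questioned))
```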
NASA Astrophysics Data System (ADS)
Davis, C.; Rozo, E.; Roodman, A.; Alarcon, A.; Cawthon, R.; Gatti, M.; Lin, H.; Miquel, R.; Rykoff, E. S.; Troxel, M. A.; Vielzeuf, P.; Abbott, T. M. C.; Abdalla, F. B.; Allam, S.; Annis, J.; Bechtol, K.; Benoit-Lévy, A.; Bertin, E.; Brooks, D.; Buckley-Geer, E.; Burke, D. L.; Carnero Rosell, A.; Carrasco Kind, M.; Carretero, J.; Castander, F. J.; Crocce, M.; Cunha, C. E.; D'Andrea, C. B.; da Costa, L. N.; Desai, S.; Diehl, H. T.; Doel, P.; Drlica-Wagner, A.; Fausti Neto, A.; Flaugher, B.; Fosalba, P.; Frieman, J.; García-Bellido, J.; Gaztanaga, E.; Gerdes, D. W.; Giannantonio, T.; Gruen, D.; Gruendl, R. A.; Gutierrez, G.; Honscheid, K.; Jain, B.; James, D. J.; Jeltema, T.; Krause, E.; Kuehn, K.; Kuhlmann, S.; Kuropatkin, N.; Lahav, O.; Li, T. S.; Lima, M.; March, M.; Marshall, J. L.; Martini, P.; Melchior, P.; Ogando, R. L. C.; Plazas, A. A.; Romer, A. K.; Sanchez, E.; Scarpine, V.; Schindler, R.; Schubnell, M.; Sevilla-Noarbe, I.; Smith, M.; Soares-Santos, M.; Sobreira, F.; Suchyta, E.; Swanson, M. E. C.; Tarle, G.; Thomas, D.; Vikram, V.; Walker, A. R.; Wechsler, R. H.
2018-06-01
Galaxy cross-correlations with high-fidelity redshift samples hold the potential to precisely calibrate systematic photometric redshift uncertainties arising from the unavailability of complete and representative training and validation samples of galaxies. However, application of this technique in the Dark Energy Survey (DES) is hampered by the relatively low number density, small area, and modest redshift overlap between photometric and spectroscopic samples. We propose instead using photometric catalogues with reliable photometric redshifts for photo-z calibration via cross-correlations. We verify the viability of our proposal using redMaPPer clusters from the Sloan Digital Sky Survey (SDSS) to successfully recover the redshift distribution of SDSS spectroscopic galaxies. We demonstrate how to combine photo-z with cross-correlation data to calibrate photometric redshift biases while marginalizing over possible clustering bias evolution in either the calibration or unknown photometric samples. We apply our method to DES Science Verification (DES SV) data in order to constrain the photometric redshift distribution of a galaxy sample selected for weak lensing studies, constraining the mean of the tomographic redshift distributions to a statistical uncertainty of Δz ˜ ±0.01. We forecast that our proposal can, in principle, control photometric redshift uncertainties in DES weak lensing experiments at a level near the intrinsic statistical noise of the experiment over the range of redshifts where redMaPPer clusters are available. Our results provide strong motivation to launch a programme to fully characterize the systematic errors from bias evolution and photo-z shapes in our calibration procedure.
Davis, C.; Rozo, E.; Roodman, A.; ...
2018-03-26
Galaxy cross-correlations with high-fidelity redshift samples hold the potential to precisely calibrate systematic photometric redshift uncertainties arising from the unavailability of complete and representative training and validation samples of galaxies. However, application of this technique in the Dark Energy Survey (DES) is hampered by the relatively low number density, small area, and modest redshift overlap between photometric and spectroscopic samples. We propose instead using photometric catalogs with reliable photometric redshifts for photo-z calibration via cross-correlations. We verify the viability of our proposal using redMaPPer clusters from the Sloan Digital Sky Survey (SDSS) to successfully recover the redshift distribution of SDSS spectroscopic galaxies. We demonstrate how to combine photo-z with cross-correlation data to calibrate photometric redshift biases while marginalizing over possible clustering bias evolution in either the calibration or unknown photometric samples. We apply our method to DES Science Verification (DES SV) data in order to constrain the photometric redshift distribution of a galaxy sample selected for weak lensing studies, constraining the mean of the tomographic redshift distributions to a statistical uncertainty of Δz ∼ ±0.01. We forecast that our proposal can in principle control photometric redshift uncertainties in DES weak lensing experiments at a level near the intrinsic statistical noise of the experiment over the range of redshifts where redMaPPer clusters are available. Here, our results provide strong motivation to launch a program to fully characterize the systematic errors from bias evolution and photo-z shapes in our calibration procedure.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Davis, C.; Rozo, E.; Roodman, A.
Galaxy cross-correlations with high-fidelity redshift samples hold the potential to precisely calibrate systematic photometric redshift uncertainties arising from the unavailability of complete and representative training and validation samples of galaxies. However, application of this technique in the Dark Energy Survey (DES) is hampered by the relatively low number density, small area, and modest redshift overlap between photometric and spectroscopic samples. We propose instead using photometric catalogs with reliable photometric redshifts for photo-z calibration via cross-correlations. We verify the viability of our proposal using redMaPPer clusters from the Sloan Digital Sky Survey (SDSS) to successfully recover the redshift distribution of SDSS spectroscopic galaxies. We demonstrate how to combine photo-z with cross-correlation data to calibrate photometric redshift biases while marginalizing over possible clustering bias evolution in either the calibration or unknown photometric samples. We apply our method to DES Science Verification (DES SV) data in order to constrain the photometric redshift distribution of a galaxy sample selected for weak lensing studies, constraining the mean of the tomographic redshift distributions to a statistical uncertainty of Δz ∼ ±0.01. We forecast that our proposal can in principle control photometric redshift uncertainties in DES weak lensing experiments at a level near the intrinsic statistical noise of the experiment over the range of redshifts where redMaPPer clusters are available. Here, our results provide strong motivation to launch a program to fully characterize the systematic errors from bias evolution and photo-z shapes in our calibration procedure.
ENVIRONMENTAL TECHNOLOGY VERIFICATION (ETV) PROGRAM CASE STUDIES: DEMONSTRATING PROGRAM OUTCOMES
The U.S. Environmental Protection Agency (EPA) Environmental Technology Verification (ETV) Program evaluates the performance of innovative air, water, pollution prevention and monitoring technologies that have the potential to improve human health and the environment. This bookle...
Environmental Technology Verification Program - ETV - Case Studies: Demonstrating Program Outcomes
The U.S. Environmental Protection Agency (EPA) Environmental Technology Verification (ETV) Program evaluates the performance of innovative air, water, pollution prevention and monitoring technologies that have the potential to improve human health and the environment. This cd con...
An Approach to Biometric Verification Based on Human Body Communication in Wearable Devices
Li, Jingzhen; Liu, Yuhang; Nie, Zedong; Qin, Wenjian; Pang, Zengyao; Wang, Lei
2017-01-01
In this paper, an approach to biometric verification based on human body communication (HBC) is presented for wearable devices. For this purpose, the transmission gain S21 of a volunteer's forearm is measured with a vector network analyzer (VNA). Specifically, in order to determine the chosen frequency for biometric verification, 1800 groups of data are acquired from 10 volunteers in the frequency range 0.3 MHz to 1500 MHz, and each group includes 1601 sample points. In addition, to achieve rapid verification, 30 groups of data for each volunteer are acquired at the chosen frequency, and each group contains only 21 sample points. Furthermore, a threshold-adaptive template matching (TATM) algorithm based on weighted Euclidean distance is proposed for rapid verification in this work. The results indicate that the chosen frequency for biometric verification is from 650 MHz to 750 MHz. The false acceptance rate (FAR) and false rejection rate (FRR) based on TATM are approximately 5.79% and 6.74%, respectively. In contrast, the FAR and FRR were 4.17% and 37.5%, 3.37% and 33.33%, and 3.80% and 34.17% using K-nearest neighbor (KNN) classification, support vector machines (SVM), and naive Bayesian method (NBM) classification, respectively. In addition, the running time of TATM is 0.019 s, whereas the running times of KNN, SVM and NBM are 0.310 s, 0.0385 s, and 0.168 s, respectively. Therefore, TATM is suggested to be appropriate for rapid verification use in wearable devices. PMID:28075375
An Approach to Biometric Verification Based on Human Body Communication in Wearable Devices.
Li, Jingzhen; Liu, Yuhang; Nie, Zedong; Qin, Wenjian; Pang, Zengyao; Wang, Lei
2017-01-10
In this paper, an approach to biometric verification based on human body communication (HBC) is presented for wearable devices. For this purpose, the transmission gain S21 of a volunteer's forearm is measured with a vector network analyzer (VNA). Specifically, in order to determine the chosen frequency for biometric verification, 1800 groups of data are acquired from 10 volunteers in the frequency range 0.3 MHz to 1500 MHz, and each group includes 1601 sample points. In addition, to achieve rapid verification, 30 groups of data for each volunteer are acquired at the chosen frequency, and each group contains only 21 sample points. Furthermore, a threshold-adaptive template matching (TATM) algorithm based on weighted Euclidean distance is proposed for rapid verification in this work. The results indicate that the chosen frequency for biometric verification is from 650 MHz to 750 MHz. The false acceptance rate (FAR) and false rejection rate (FRR) based on TATM are approximately 5.79% and 6.74%, respectively. In contrast, the FAR and FRR were 4.17% and 37.5%, 3.37% and 33.33%, and 3.80% and 34.17% using K-nearest neighbor (KNN) classification, support vector machines (SVM), and naive Bayesian method (NBM) classification, respectively. In addition, the running time of TATM is 0.019 s, whereas the running times of KNN, SVM and NBM are 0.310 s, 0.0385 s, and 0.168 s, respectively. Therefore, TATM is suggested to be appropriate for rapid verification use in wearable devices.
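The TATM decision rule can be sketched from the abstract's wording alone: a weighted Euclidean distance to an enrollment template, accepted against a threshold that adapts to the enrollment spread. The weighting scheme, threshold rule, and synthetic S21 data below are assumptions for illustration, not the authors' implementation.

```python
# Hedged sketch of threshold-adaptive template matching (TATM) with a
# weighted Euclidean distance; all parameters and data are illustrative.
import numpy as np

def weighted_euclidean(sample, template, weights):
    return np.sqrt(np.sum(weights * (sample - template) ** 2))

def tatm_verify(sample, enroll_samples, k=1.5):
    """Accept if the sample lies within an adaptive threshold of the template.

    enroll_samples : (n, d) array of enrollment measurements (e.g., |S21| at
                     21 frequency points, per the abstract).
    """
    template = enroll_samples.mean(axis=0)
    weights = 1.0 / (enroll_samples.var(axis=0) + 1e-9)   # down-weight noisy points
    dists = [weighted_euclidean(s, template, weights) for s in enroll_samples]
    threshold = np.mean(dists) + k * np.std(dists)        # adaptive threshold
    return weighted_euclidean(sample, template, weights) <= threshold

rng = np.random.default_rng(0)
enroll = rng.normal(0.0, 0.05, size=(30, 21)) - 40.0      # fake S21 data in dB
print(tatm_verify(enroll[0] + 0.01, enroll))
```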
Code Verification Capabilities and Assessments in Support of ASC V&V Level 2 Milestone #6035
DOE Office of Scientific and Technical Information (OSTI.GOV)
Doebling, Scott William; Budzien, Joanne Louise; Ferguson, Jim Michael
This document provides a summary of the code verification activities supporting the FY17 Level 2 V&V milestone entitled “Deliver a Capability for V&V Assessments of Code Implementations of Physics Models and Numerical Algorithms in Support of Future Predictive Capability Framework Pegposts.” The physics validation activities supporting this milestone are documented separately. The objectives of this portion of the milestone are: 1) Develop software tools to support code verification analysis; 2) Document standard definitions of code verification test problems; and 3) Perform code verification assessments (focusing on error behavior of algorithms). This report and a set of additional standalone documents serve as the compilation of results demonstrating accomplishment of these objectives.
Universal Linear Optics: An implementation of Boson Sampling on a Fully Reconfigurable Circuit
NASA Astrophysics Data System (ADS)
Harrold, Christopher; Carolan, Jacques; Sparrow, Chris; Russell, Nicholas J.; Silverstone, Joshua W.; Marshall, Graham D.; Thompson, Mark G.; Matthews, Jonathan C. F.; O'Brien, Jeremy L.; Laing, Anthony; Martín-López, Enrique; Shadbolt, Peter J.; Matsuda, Nobuyuki; Oguma, Manabu; Itoh, Mikitaka; Hashimoto, Toshikazu
Linear optics has paved the way for fundamental tests in quantum mechanics and has gone on to enable a broad range of quantum information processing applications for quantum technologies. We demonstrate an integrated photonics processor that is universal for linear optics. The device is a silica-on-silicon planar waveguide circuit (PLC) comprising a cascade of 15 Mach-Zehnder interferometers, with 30 directional couplers and 30 tunable thermo-optic phase shifters which are electrically interfaced for the arbitrary setting of a phase. We input ensembles of up to six photons, and monitor the output with a 12-single-photon-detector system. The calibrated device is capable of implementing any linear optical protocol. This enables the implementation of new quantum information processing tasks in seconds, which would have previously taken months to realise. We demonstrate 100 instances of the boson sampling problem with verification tests, and six-dimensional complex Hadamards.
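As background to the reconfigurable interferometer mesh described above, the sketch below builds the 2x2 transfer matrix of a single phase-tunable Mach-Zehnder cell; meshes of such cells can realize arbitrary unitaries (Reck-type decompositions). The parameterization is a common textbook convention and is not necessarily the one used on this device.

```python
# One Mach-Zehnder cell: two 50:50 couplers with an internal phase (theta)
# and an external phase (phi). Convention here is illustrative.
import numpy as np

def mzi_unitary(theta, phi):
    """2x2 transfer matrix of one thermo-optically tuned MZI cell."""
    bs = np.array([[1, 1j], [1j, 1]]) / np.sqrt(2)   # 50:50 directional coupler
    internal = np.diag([np.exp(1j * theta), 1.0])     # internal phase shift
    external = np.diag([np.exp(1j * phi), 1.0])       # output phase shift
    return external @ bs @ internal @ bs

u = mzi_unitary(0.3, 1.1)
print(np.allclose(u.conj().T @ u, np.eye(2)))          # unitarity check
```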
Collins, Jennifer; Pecher, Diane; Zeelenberg, René; Coulson, Seana
2011-01-01
The perceptual modalities associated with property words, such as flicker or click, have previously been demonstrated to affect subsequent property verification judgments (Pecher et al., 2003). Known as the conceptual modality switch effect, this finding supports the claim that brain systems for perception and action help subserve the representation of concepts. The present study addressed the cognitive and neural substrate of this effect by recording event-related potentials (ERPs) as participants performed a property verification task with visual or auditory properties in key trials. We found that for visual property verifications, modality switching was associated with an increased amplitude N400. For auditory verifications, switching led to a larger late positive complex. Observed ERP effects of modality switching suggest property words access perceptual brain systems. Moreover, the timing and pattern of the effects suggest perceptual systems impact the decision-making stage in the verification of auditory properties, and the semantic stage in the verification of visual properties. PMID:21713128
Collins, Jennifer; Pecher, Diane; Zeelenberg, René; Coulson, Seana
2011-01-01
The perceptual modalities associated with property words, such as flicker or click, have previously been demonstrated to affect subsequent property verification judgments (Pecher et al., 2003). Known as the conceptual modality switch effect, this finding supports the claim that brain systems for perception and action help subserve the representation of concepts. The present study addressed the cognitive and neural substrate of this effect by recording event-related potentials (ERPs) as participants performed a property verification task with visual or auditory properties in key trials. We found that for visual property verifications, modality switching was associated with an increased amplitude N400. For auditory verifications, switching led to a larger late positive complex. Observed ERP effects of modality switching suggest property words access perceptual brain systems. Moreover, the timing and pattern of the effects suggest perceptual systems impact the decision-making stage in the verification of auditory properties, and the semantic stage in the verification of visual properties.
NASA Astrophysics Data System (ADS)
Vijayakumar, Ganesh; Sprague, Michael
2017-11-01
Demonstrating expected convergence rates with spatial- and temporal-grid refinement is the "gold standard" of code and algorithm verification. However, the lack of analytical solutions and the difficulty of generating manufactured solutions present challenges for verifying codes for complex systems. The application of the method of manufactured solutions (MMS) for verification of coupled multi-physics phenomena like fluid-structure interaction (FSI) has only seen recent investigation. While many FSI algorithms for aeroelastic phenomena have focused on boundary-resolved CFD simulations, the actuator-line representation of the structure is widely used for FSI simulations in wind-energy research. In this work, we demonstrate the verification of an FSI algorithm using MMS for actuator-line CFD simulations with a simplified structural model. We use a manufactured solution for the fluid velocity field and the displacement of the spring-mass-damper (SMD) system. We demonstrate the convergence of both the fluid and structural solvers to second-order accuracy with grid and time-step refinement. This work was funded by the U.S. Department of Energy, Office of Energy Efficiency and Renewable Energy, Wind Energy Technologies Office, under Contract No. DE-AC36-08-GO28308 with the National Renewable Energy Laboratory.
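For context, the convergence-rate check at the heart of an MMS study reduces to a one-line calculation: with errors measured against the manufactured solution on two grids related by a refinement ratio r, the observed order is p = log(e_coarse/e_fine)/log(r). A generic sketch with toy error values follows; the numbers are invented for illustration.

```python
# Observed order of accuracy from a two-grid refinement study (generic MMS
# practice, not the specific solver in the abstract above).
import math

def observed_order(err_coarse, err_fine, refinement_ratio=2.0):
    """p such that error ~ h^p, from errors on two successively refined grids."""
    return math.log(err_coarse / err_fine) / math.log(refinement_ratio)

# toy errors that drop by ~4x when the grid spacing halves (second order)
print(observed_order(1.0e-3, 2.6e-4))   # ~1.94, close to the expected p = 2
```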
Analytical Verifications in Cryogenic Testing of NGST Advanced Mirror System Demonstrators
NASA Technical Reports Server (NTRS)
Cummings, Ramona; Levine, Marie; VanBuren, Dave; Kegley, Jeff; Green, Joseph; Hadaway, James; Presson, Joan; Cline, Todd; Stahl, H. Philip (Technical Monitor)
2002-01-01
Ground-based testing is a critical and costly part of component, assembly, and system verifications of large space telescopes. At such tests, however, with integral teamwork by planners, analysts, and test personnel, segments can be included to validate specific analytical parameters and algorithms at relatively low additional cost. This paper opens with the strategy of analytical verification segments added to vacuum cryogenic testing of Advanced Mirror System Demonstrator (AMSD) assemblies. These AMSD assemblies incorporate material and architecture concepts being considered in the Next Generation Space Telescope (NGST) design. The test segments for workmanship testing, cold survivability, and cold operation optical throughput are supplemented by segments for analytical verifications of specific structural, thermal, and optical parameters. Utilizing integrated modeling and separate materials testing, the paper continues with the support plan for analyses, data, and observation requirements during the AMSD testing, currently slated for late calendar year 2002 to mid calendar year 2003. The paper includes anomaly resolution as gleaned by the authors from similar analytical verification support of a previous large space telescope, then closes with a draft of plans for parameter extrapolations, to form a well-verified portion of the integrated modeling being done for NGST performance predictions.
Galaxy-galaxy lensing in the Dark Energy Survey Science Verification data
Clampitt, J.; Sánchez, C.; Kwan, J.; ...
2016-11-22
We present galaxy-galaxy lensing results from 139 square degrees of Dark Energy Survey (DES) Science Verification (SV) data. Our lens sample consists of red galaxies, known as redMaGiC, which are specifically selected to have a low photometric redshift error and outlier rate. The lensing measurement has a total signal-to-noise of 29 over scales $0.09 < R < 15$ Mpc/$h$, including all lenses over a wide redshift range $0.2 < z < 0.8$. Dividing the lenses into three redshift bins for this constant moving number density sample, we find no evidence for evolution in the halo mass with redshift. We obtain consistent results for the lensing measurement with two independent shear pipelines, ngmix and im3shape. We perform a number of null tests on the shear and photometric redshift catalogs and quantify resulting systematic uncertainties. Covariances from jackknife subsamples of the data are validated with a suite of 50 mock surveys. The results and systematics checks in this work provide a critical input for future cosmological and galaxy evolution studies with the DES data and redMaGiC galaxy samples. We fit a Halo Occupation Distribution (HOD) model, and demonstrate that our data constrains the mean halo mass of the lens galaxies, despite strong degeneracies between individual HOD parameters.
Galaxy-galaxy lensing in the Dark Energy Survey Science Verification data
DOE Office of Scientific and Technical Information (OSTI.GOV)
Clampitt, J.; Sánchez, C.; Kwan, J.
We present galaxy-galaxy lensing results from 139 square degrees of Dark Energy Survey (DES) Science Verification (SV) data. Our lens sample consists of red galaxies, known as redMaGiC, which are specifically selected to have a low photometric redshift error and outlier rate. The lensing measurement has a total signal-to-noise of 29 over scales $0.09 < R < 15$ Mpc/$h$, including all lenses over a wide redshift range $0.2 < z < 0.8$. Dividing the lenses into three redshift bins for this constant moving number density sample, we find no evidence for evolution in the halo mass with redshift. We obtain consistent results for the lensing measurement with two independent shear pipelines, ngmix and im3shape. We perform a number of null tests on the shear and photometric redshift catalogs and quantify resulting systematic uncertainties. Covariances from jackknife subsamples of the data are validated with a suite of 50 mock surveys. The results and systematics checks in this work provide a critical input for future cosmological and galaxy evolution studies with the DES data and redMaGiC galaxy samples. We fit a Halo Occupation Distribution (HOD) model, and demonstrate that our data constrains the mean halo mass of the lens galaxies, despite strong degeneracies between individual HOD parameters.
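The jackknife covariance mentioned in the abstract is a standard construction; a minimal sketch is given below. The patch count, bin count, and data are placeholders, and the formula shown is the usual delete-one spatial-jackknife estimator rather than the exact DES implementation.

```python
# Delete-one jackknife covariance for a binned measurement (e.g., a lensing
# signal in radial bins). Data below are synthetic placeholders.
import numpy as np

def jackknife_covariance(subsample_estimates):
    """subsample_estimates : (N_jk, n_bins) array, each row the measurement
    recomputed with one spatial patch removed."""
    n_jk = subsample_estimates.shape[0]
    diff = subsample_estimates - subsample_estimates.mean(axis=0)
    return (n_jk - 1) / n_jk * diff.T @ diff

rng = np.random.default_rng(1)
est = rng.normal(1.0, 0.1, size=(50, 10))   # 50 jackknife patches, 10 bins
print(np.sqrt(np.diag(jackknife_covariance(est))))   # per-bin error estimates
```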
redMaGiC: Selecting luminous red galaxies from the DES Science Verification data
DOE Office of Scientific and Technical Information (OSTI.GOV)
Rozo, E.; Rykoff, E. S.; Abate, A.
Here, we introduce redMaGiC, an automated algorithm for selecting luminous red galaxies (LRGs). The algorithm was specifically developed to minimize photometric redshift uncertainties in photometric large-scale structure studies. redMaGiC achieves this by self-training the colour cuts necessary to produce a luminosity-thresholded LRG sample of constant comoving density. We demonstrate that redMaGiC photo-zs are very nearly as accurate as the best machine learning-based methods, yet they require minimal spectroscopic training, do not suffer from extrapolation biases, and are very nearly Gaussian. We apply our algorithm to Dark Energy Survey (DES) Science Verification (SV) data to produce a redMaGiC catalogue sampling the redshift range z ∈ [0.2, 0.8]. Our fiducial sample has a comoving space density of 10⁻³ (h⁻¹ Mpc)⁻³, and a median photo-z bias (z_spec – z_photo) and scatter (σ_z/(1 + z)) of 0.005 and 0.017, respectively. The corresponding 5σ outlier fraction is 1.4 per cent. We also test our algorithm with Sloan Digital Sky Survey Data Release 8 and Stripe 82 data, and discuss how spectroscopic training can be used to control photo-z biases at the 0.1 per cent level.
redMaGiC: Selecting luminous red galaxies from the DES Science Verification data
Rozo, E.; Rykoff, E. S.; Abate, A.; ...
2016-05-30
Here, we introduce redMaGiC, an automated algorithm for selecting luminous red galaxies (LRGs). The algorithm was specifically developed to minimize photometric redshift uncertainties in photometric large-scale structure studies. redMaGiC achieves this by self-training the colour cuts necessary to produce a luminosity-thresholded LRG sample of constant comoving density. We demonstrate that redMaGiC photo-zs are very nearly as accurate as the best machine learning-based methods, yet they require minimal spectroscopic training, do not suffer from extrapolation biases, and are very nearly Gaussian. We apply our algorithm to Dark Energy Survey (DES) Science Verification (SV) data to produce a redMaGiC catalogue sampling the redshift range z ∈ [0.2, 0.8]. Our fiducial sample has a comoving space density of 10⁻³ (h⁻¹ Mpc)⁻³, and a median photo-z bias (z_spec – z_photo) and scatter (σ_z/(1 + z)) of 0.005 and 0.017, respectively. The corresponding 5σ outlier fraction is 1.4 per cent. We also test our algorithm with Sloan Digital Sky Survey Data Release 8 and Stripe 82 data, and discuss how spectroscopic training can be used to control photo-z biases at the 0.1 per cent level.
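The summary statistics quoted for redMaGiC (median bias z_spec − z_photo, scatter σ_z/(1 + z), and a 5σ outlier fraction) can be computed as in the sketch below. The robust-scatter definition and the simulated catalog are assumptions chosen to mirror the quoted numbers, not the collaboration's exact estimator.

```python
# Common photo-z summary metrics: median bias, robust scatter, outlier rate.
import numpy as np

def photoz_metrics(z_spec, z_photo, n_sigma=5.0):
    dz = z_spec - z_photo
    bias = np.median(dz)
    scatter = 1.4826 * np.median(np.abs(dz - bias) / (1.0 + z_spec))  # robust sigma
    outliers = np.mean(np.abs(dz) / (1.0 + z_spec) > n_sigma * scatter)
    return bias, scatter, outliers

# toy catalog roughly matching the quoted bias ~0.005 and scatter ~0.017
rng = np.random.default_rng(2)
z_spec = rng.uniform(0.2, 0.8, 10000)
z_photo = z_spec - (0.005 + rng.normal(0.0, 0.017, 10000)) * (1.0 + z_spec)
print(photoz_metrics(z_spec, z_photo))
```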
Deductive Evaluation: Implicit Code Verification With Low User Burden
NASA Technical Reports Server (NTRS)
Di Vito, Ben L.
2016-01-01
We describe a framework for symbolically evaluating C code using a deductive approach that discovers and proves program properties. The framework applies Floyd-Hoare verification principles in its treatment of loops, with a library of iteration schemes serving to derive loop invariants. During evaluation, theorem proving is performed on-the-fly, obviating the generation of verification conditions normally needed to establish loop properties. A PVS-based prototype is presented along with results for sample C functions.
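To illustrate the Floyd-Hoare ingredients the framework automates (a loop invariant strong enough that, together with the loop exit condition, it implies the postcondition), here is a small runtime-checked example. It is purely illustrative and unrelated to the PVS-based prototype itself.

```python
# Toy Floyd-Hoare style loop: the invariant "total equals the sum of the first
# i elements" plus loop exit (i == len(xs)) implies the postcondition.
def array_sum(xs):
    total, i = 0, 0
    while i < len(xs):
        assert total == sum(xs[:i])   # loop invariant, checked each iteration
        total += xs[i]
        i += 1
    assert total == sum(xs)           # postcondition
    return total

print(array_sum([3, 1, 4, 1, 5]))
```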
Improved Detection Technique for Solvent Rinse Cleanliness Verification
NASA Technical Reports Server (NTRS)
Hornung, S. D.; Beeson, H. D.
2001-01-01
The NASA White Sands Test Facility (WSTF) has an ongoing effort to reduce or eliminate usage of cleaning solvents such as CFC-113 and its replacements. These solvents are used in the final clean and cleanliness verification processes for flight and ground support hardware, especially for oxygen systems where organic contaminants can pose an ignition hazard. For the final cleanliness verification in the standard process, the equivalent of one square foot of surface area of parts is rinsed with the solvent, and the final 100 mL of the rinse is captured. The amount of nonvolatile residue (NVR) in the solvent is determined by weight after the evaporation of the solvent. An improved process of sampling this rinse, developed at WSTF, requires evaporation of less than 2 mL of the solvent to make the cleanliness verification. Small amounts of the solvent are evaporated in a clean stainless steel cup, and the cleanliness of the stainless steel cup is measured using a commercially available surface quality monitor. The effectiveness of this new cleanliness verification technique was compared to the accepted NVR sampling procedures. Testing with known contaminants in solution, such as hydraulic fluid, fluorinated lubricants, and cutting and lubricating oils, was performed to establish a correlation between amount in solution and the process response. This report presents the approach and results and discusses the issues in establishing the surface quality monitor-based cleanliness verification.
9 CFR 417.8 - Agency verification.
Code of Federal Regulations, 2014 CFR
2014-01-01
....8 Animals and Animal Products FOOD SAFETY AND INSPECTION SERVICE, DEPARTMENT OF AGRICULTURE... ANALYSIS AND CRITICAL CONTROL POINT (HACCP) SYSTEMS § 417.8 Agency verification. FSIS will verify the... plan or system; (f) Direct observation or measurement at a CCP; (g) Sample collection and analysis to...
9 CFR 417.8 - Agency verification.
Code of Federal Regulations, 2012 CFR
2012-01-01
....8 Animals and Animal Products FOOD SAFETY AND INSPECTION SERVICE, DEPARTMENT OF AGRICULTURE... ANALYSIS AND CRITICAL CONTROL POINT (HACCP) SYSTEMS § 417.8 Agency verification. FSIS will verify the... plan or system; (f) Direct observation or measurement at a CCP; (g) Sample collection and analysis to...
9 CFR 417.8 - Agency verification.
Code of Federal Regulations, 2013 CFR
2013-01-01
....8 Animals and Animal Products FOOD SAFETY AND INSPECTION SERVICE, DEPARTMENT OF AGRICULTURE... ANALYSIS AND CRITICAL CONTROL POINT (HACCP) SYSTEMS § 417.8 Agency verification. FSIS will verify the... plan or system; (f) Direct observation or measurement at a CCP; (g) Sample collection and analysis to...
9 CFR 417.8 - Agency verification.
Code of Federal Regulations, 2010 CFR
2010-01-01
....8 Animals and Animal Products FOOD SAFETY AND INSPECTION SERVICE, DEPARTMENT OF AGRICULTURE... ANALYSIS AND CRITICAL CONTROL POINT (HACCP) SYSTEMS § 417.8 Agency verification. FSIS will verify the... plan or system; (f) Direct observation or measurement at a CCP; (g) Sample collection and analysis to...
9 CFR 417.8 - Agency verification.
Code of Federal Regulations, 2011 CFR
2011-01-01
....8 Animals and Animal Products FOOD SAFETY AND INSPECTION SERVICE, DEPARTMENT OF AGRICULTURE... ANALYSIS AND CRITICAL CONTROL POINT (HACCP) SYSTEMS § 417.8 Agency verification. FSIS will verify the... plan or system; (f) Direct observation or measurement at a CCP; (g) Sample collection and analysis to...
Baghouse filtration products (BFPs) were evaluated by the Air Pollution Control Technology (APCT) pilot of the Environmental Technology Verification (ETV) Program. The performance factor verified was the mean outlet particle concentration for the filter fabric as a function of th...
Baghouse filtration products (BFPs) were evaluated by the Air Pollution Control Technology (APCT) pilot of the Environmental Technology Verification (ETV) Program. The performance factor verified was the mean outlet particle concentration for the filter fabric as a function of th...
The U.S. Environmental Protection Agency (EPA) has created the Environmental Technology Verification Program (ETV) to facilitate the deployment of innovative or improved environmental technologies and to design efficient processes for conducting performance tests of innovative technologies...
The U.S. Environmental Protection Agency (EPA) has created the Environmental Technology Verification Program (ETV) to facilitate the deployment of innovative or improved environmental technologies and to design efficient processes for conducting performance tests of innovative technologies...
NASA Technical Reports Server (NTRS)
1986-01-01
Activities that will be conducted in support of the development and verification of the Block 2 Solid Rocket Motor (SRM) are described. Development includes design, fabrication, processing, and testing activities in which the results are fed back into the project. Verification includes analytical and test activities which demonstrate SRM component/subassembly/assembly capability to perform its intended function. The management organization responsible for formulating and implementing the verification program is introduced. It also identifies the controls which will monitor and track the verification program. Integral with the design and certification of the SRM are other pieces of equipment used in transportation, handling, and testing which influence the reliability and maintainability of the SRM configuration. The certification of this equipment is also discussed.
DOE Office of Scientific and Technical Information (OSTI.GOV)
P.C. Weaver
2009-02-17
Conduct verification surveys of grids at the DWI 1630 Site in Knoxville, Tennessee. The independent verification team (IVT) from ORISE conducted verification activities in whole and partial grids, as completed by BJC. ORISE site activities included gamma surface scans and soil sampling within 33 grids: G11 through G14; H11 through H15; X14, X15, X19, and X21; J13 through J15 and J17 through J21; K7 through K9 and K13 through K15; L13 through L15; and M14 through M16.
Clinical Skills Verification, Formative Feedback, and Psychiatry Residency Trainees
ERIC Educational Resources Information Center
Dalack, Gregory W.; Jibson, Michael D.
2012-01-01
Objective: The authors describe the implementation of Clinical Skills Verification (CSV) in their program as an in-training assessment intended primarily to provide formative feedback to trainees, strengthen the supervisory experience, and identify the need for remediation of interviewing skills, and secondarily to demonstrate resident competence…
40 CFR 86.1823-01 - Durability demonstration procedures for exhaust emissions.
Code of Federal Regulations, 2010 CFR
2010-07-01
...) Discussion of the manufacturer's in-use verification procedures including testing performed, vehicle... performed should also be documented in the manufacturer's submission. The in-use verification program shall...), the Alternate Service Accumulation Durability Program described in § 86.094-13(e) or the Standard Self...
Cleanup Verification Package for the 300 VTS Waste Site
DOE Office of Scientific and Technical Information (OSTI.GOV)
S. W. Clark and T. H. Mitchell
2006-03-13
This cleanup verification package documents completion of remedial action for the 300 Area Vitrification Test Site, also known as the 300 VTS site. The site was used by Pacific Northwest National Laboratory as a field demonstration site for in situ vitrification of soils containing simulated waste.
Formal Methods for Life-Critical Software
NASA Technical Reports Server (NTRS)
Butler, Ricky W.; Johnson, Sally C.
1993-01-01
The use of computer software in life-critical applications, such as for civil air transports, demands the use of rigorous formal mathematical verification procedures. This paper demonstrates how to apply formal methods to the development and verification of software by leading the reader step-by-step through requirements analysis, design, implementation, and verification of an electronic phone book application. The current maturity and limitations of formal methods tools and techniques are then discussed, and a number of examples of the successful use of formal methods by industry are cited.
Use of metaknowledge in the verification of knowledge-based systems
NASA Technical Reports Server (NTRS)
Morell, Larry J.
1989-01-01
Knowledge-based systems are modeled as deductive systems. The model indicates that the two primary areas of concern in verification are demonstrating consistency and completeness. A system is inconsistent if it asserts something that is not true of the modeled domain. A system is incomplete if it lacks deductive capability. Two forms of consistency are discussed along with appropriate verification methods. Three forms of incompleteness are discussed. The use of metaknowledge, knowledge about knowledge, is explored in connection to each form of incompleteness.
Baghouse filtration products (BFPs) were evaluated by the Air Pollution Control Technology (APCT) pilot of the Environmental Technology Verification (ETV) Program. The performance factor verified was the mean outlet particle concentration for the filter fabric as a function of th...
Baghouse filtration products (BFPs) were evaluated by the Air Pollution Control Technology (APCT) pilot of the Environmental Technology Verification (ETV) Program. The performance factor verified was the mean outlet particle concentration for the filter fabric as a function of th...
Baghouse filtration products (BFPs) were evaluated by the Air Pollution Control Technology (APCT) pilot of the Environmental Technology Verification (ETV) Program. The performance factor verified was the mean outlet particle concentration for the filter fabric as a function of th...
DOE Office of Scientific and Technical Information (OSTI.GOV)
J. M. Capron
2008-08-08
The 100-F-46 french drain consisted of a 1.5 to 3 m long, vertically buried, gravel-filled pipe that was approximately 1 m in diameter. Also included in this waste site was a 5 cm cast-iron pipeline that drained condensate from the 119-F Stack Sampling Building into the 100-F-46 french drain. In accordance with this evaluation, the confirmatory sampling results support a reclassification of this site to No Action. The current site conditions achieve the remedial action objectives and the corresponding remedial action goals established in the Remaining Sites ROD. The results of confirmatory sampling show that residual contaminant concentrations do not preclude any future uses and allow for unrestricted use of shallow zone soils. The results also demonstrate that residual contaminant concentrations are protective of groundwater and the Columbia River.
NASA Technical Reports Server (NTRS)
Eichenlaub, Carl T.; Harper, C. Douglas; Hird, Geoffrey
1993-01-01
Life-critical applications warrant a higher level of software reliability than has yet been achieved. Since it is not certain that traditional methods alone can provide the required ultra reliability, new methods should be examined as supplements or replacements. This paper describes a mathematical counterpart to the traditional process of empirical testing. ORA's Penelope verification system is demonstrated as a tool for evaluating the correctness of Ada software. Grady Booch's Ada calendar utility package, obtained through NASA, was specified in the Larch/Ada language. Formal verification in the Penelope environment established that many of the package's subprograms met their specifications. In other subprograms, failed attempts at verification revealed several errors that had escaped detection by testing.
78 FR 5409 - Ongoing Equivalence Verifications of Foreign Food Regulatory Systems
Federal Register 2010, 2011, 2012, 2013, 2014
2013-01-25
... of data shared. Finally, with respect to POE re-inspections, NACMPI recommended the targeting of high-risk product and high-risk imports for sampling and other verification activities during reinspection... authority; the availability of contingency plans in the country for containing and mitigating the effects of...
Baghouse filtration products (BFPs) were evaluated by the Air Pollution Control Technology (APCT) Verification Center. The performance factor verified was the mean outlet particle concentration for the filter fabric as a function of the size of those particles equal to and smalle...
Baghouse filtration products (BFPs) were evaluated by the Air Pollution Control Technology (APCT) Verification Center. The performance factor verified was the mean outlet particle concentration for the filter fabric as a function of the size of those particles equal to and smalle...
Baghouse filtration products (BFPs) were evaluated by the Air Pollution Control Technology (APCT) pilot of the Environmental Technology Verification (ETV) Program. The performance factor verified was the mean outlet particle concentration for the filter fabric as a function of th...
Baghouse filtration products (BFPs) were evaluated by the Air Pollution Control Technology (APCT) Verification Center. The performance factor verified was the mean outlet particle concentration for the filter fabric as a function of the size of those particles equal to and smalle...
Baghouse filtration products (BFPs) were evaluated by the Air Pollution Control Technology (APCT) pilot of the Environmental Technology Verification (ETV) Program. The performance factor verified was the mean outlet particle concentration for the filter fabric as a function of th...
Baghouse filtration products (BFPs) were evaluated by the Air Pollution Control Technology (APCT) Verification Center. The performance factor verified was the mean outlet particle concentration for the filter fabric as a function of the size for particles equal to or smaller than...
Baghouse filtration products (BFPs) were evaluated by the Air Pollution Control Technology (APCT) Verification Center. The performance factor verified was the mean outlet particle concentration for the filter fabric as a function of the size of those particles equal to and smalle...
Baghouse filtration products (BFPs) were evaluated by the Air Pollution Control Technology (APCT) pilot of the Environmental Technology Verification (ETV) Program. The performance factor verified was the mean outlet particle concentration for the filter fabric as a function of th...
Baghouse filtration products (BFPs) were evaluated by the Air Pollution Control Technology (APCT) Verification Center. The performance factor verified was the mean outlet particle concentration for the filter fabric as a function of the size for particles equal to or smaller than...
NASA Technical Reports Server (NTRS)
Srivas, Mandayam; Bickford, Mark
1991-01-01
The design and formal verification of a hardware system for a task that is an important component of a fault tolerant computer architecture for flight control systems is presented. The hardware system implements an algorithm for obtaining interactive consistency (byzantine agreement) among four microprocessors as a special instruction on the processors. The property verified ensures that an execution of the special instruction by the processors correctly accomplishes interactive consistency, provided certain preconditions hold. An assumption is made that the processors execute synchronously. For verification, the authors used a computer-aided hardware design verification tool, Spectool, and the theorem prover, Clio. A major contribution of the work is the demonstration of a significant fault tolerant hardware design that is mechanically verified by a theorem prover.
Weak lensing magnification in the Dark Energy Survey Science Verification data
NASA Astrophysics Data System (ADS)
Garcia-Fernandez, M.; Sanchez, E.; Sevilla-Noarbe, I.; Suchyta, E.; Huff, E. M.; Gaztanaga, E.; Aleksić, J.; Ponce, R.; Castander, F. J.; Hoyle, B.; Abbott, T. M. C.; Abdalla, F. B.; Allam, S.; Annis, J.; Benoit-Lévy, A.; Bernstein, G. M.; Bertin, E.; Brooks, D.; Buckley-Geer, E.; Burke, D. L.; Carnero Rosell, A.; Carrasco Kind, M.; Carretero, J.; Crocce, M.; Cunha, C. E.; D'Andrea, C. B.; da Costa, L. N.; DePoy, D. L.; Desai, S.; Diehl, H. T.; Eifler, T. F.; Evrard, A. E.; Fernandez, E.; Flaugher, B.; Fosalba, P.; Frieman, J.; García-Bellido, J.; Gerdes, D. W.; Giannantonio, T.; Gruen, D.; Gruendl, R. A.; Gschwend, J.; Gutierrez, G.; James, D. J.; Jarvis, M.; Kirk, D.; Krause, E.; Kuehn, K.; Kuropatkin, N.; Lahav, O.; Lima, M.; MacCrann, N.; Maia, M. A. G.; March, M.; Marshall, J. L.; Melchior, P.; Miquel, R.; Mohr, J. J.; Plazas, A. A.; Romer, A. K.; Roodman, A.; Rykoff, E. S.; Scarpine, V.; Schubnell, M.; Smith, R. C.; Soares-Santos, M.; Sobreira, F.; Tarle, G.; Thomas, D.; Walker, A. R.; Wester, W.; DES Collaboration
2018-05-01
In this paper, the effect of weak lensing magnification on galaxy number counts is studied by cross-correlating the positions of two galaxy samples, separated by redshift, using the Dark Energy Survey Science Verification data set. This analysis is carried out for galaxies that are selected only by their photometric redshift. An extensive analysis of the systematic effects is performed using new methods based on simulations, including a Monte Carlo sampling of the selection function of the survey.
Height-resolved large-sample INAA of a 1 m long, 13 cm diameter ditch-bottom sample
NASA Astrophysics Data System (ADS)
Blaauw, M.; Baas, H. W.; Donze, M.
2003-06-01
A facility for instrumental neutron activation analysis (INAA) of large samples (up to 1 m long and 15 cm diameter) has been built. Correction methods for the simultaneous occurrence of neutron self-shielding and gamma-ray self-attenuation effects have been implemented and tested with a variety of samples. Now, the method has been extended to allow for the interpretation of scanned, collimated measurements, where results are obtained for individual voxels. As a validation and demonstration, a ditch-bottom sample of the maximum size was taken in a frozen condition. It was cut into 2 cm slices while still frozen and reassembled with each slice in a 2-cm-high Petri dish divided into three sections. This allowed for verification of the results by ordinary INAA. Possible explanations for the discrepancies we observed between ordinary and large-sample INAA in the region where the concentration gradients are the steepest are discussed.
Jastrzębska, Aneta; Piasta, Anna; Szłyk, Edward
2014-01-01
A simple and useful method for the determination of biogenic amines in beverage samples based on isotachophoretic separation is described. The proposed procedure permitted simultaneous analysis of histamine, tyramine, cadaverine, putrescine, tryptamine, 2-phenylethylamine, spermine and spermidine. The data presented demonstrate the utility, simplicity, flexibility, sensitivity and environmentally friendly character of the proposed method. The precision of the method, expressed as coefficients of variation, varied from 0.1% to 5.9% for beverage samples, whereas recoveries varied from 91% to 101%. The results for the determination of biogenic amines were compared with an HPLC procedure based on a pre-column derivatisation reaction of biogenic amines with dansyl chloride. Furthermore, the derivatisation procedure was optimised by verification of the concentration and pH of the buffer, the addition of organic solvents, and the reaction time and temperature.
NASA Astrophysics Data System (ADS)
Poinsot, Audrey; Yang, Fan; Brost, Vincent
2011-02-01
Including multiple sources of information in personal identity recognition and verification gives the opportunity to greatly improve performance. We propose a contactless biometric system that combines two modalities: palmprint and face. Hardware implementations are proposed on the Texas Instruments Digital Signal Processor and Xilinx Field-Programmable Gate Array (FPGA) platforms. The algorithmic chain consists of preprocessing (which includes palm extraction from hand images), Gabor feature extraction, comparison by Hamming distance, and score fusion. Fusion possibilities are discussed and tested first using a bimodal database of 130 subjects that we designed (the uB database), and then two common public biometric databases (AR for face and PolyU for palmprint). High performance has been obtained for recognition and verification purposes: a recognition rate of 97.49% with the AR-PolyU database and an equal error rate of 1.10% on the uB database using only two training samples per subject have been obtained. Hardware results demonstrate that preprocessing can easily be performed during the acquisition phase, and multimodal biometric recognition can be treated almost instantly (0.4 ms on FPGA). We show the feasibility of a robust and efficient multimodal hardware biometric system that offers several advantages, such as user-friendliness and flexibility.
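The matching and fusion stages of such a bimodal chain can be sketched briefly: binary feature codes compared by a normalized Hamming similarity, followed by a weighted-sum fusion of the palmprint and face scores. The weights, threshold, and synthetic codes below are illustrative assumptions, not the parameters of the system described.

```python
# Hamming-similarity matching and weighted score fusion; all values are toy.
import numpy as np

def hamming_score(code_a, code_b):
    """Similarity in [0, 1]; 1 means identical binary feature codes."""
    return 1.0 - np.mean(code_a != code_b)

def fused_decision(palm_score, face_score, w_palm=0.5, threshold=0.7):
    fused = w_palm * palm_score + (1.0 - w_palm) * face_score
    return fused >= threshold

rng = np.random.default_rng(3)
enrolled = rng.integers(0, 2, 1024)          # enrolled binary feature code
probe = enrolled.copy()
probe[:50] ^= 1                              # probe differs in 50 bits
palm = hamming_score(probe, enrolled)
face = 0.82                                  # hypothetical face-matcher score
print(palm, fused_decision(palm, face))
```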
redMaGiC: selecting luminous red galaxies from the DES Science Verification data
DOE Office of Scientific and Technical Information (OSTI.GOV)
Rozo, E.
We introduce redMaGiC, an automated algorithm for selecting Luminous Red Galaxies (LRGs). The algorithm was developed to minimize photometric redshift uncertainties in photometric large-scale structure studies. redMaGiC achieves this by self-training the color cuts necessary to produce a luminosity-thresholded LRG sample of constant comoving density. Additionally, we demonstrate that redMaGiC photo-zs are very nearly as accurate as the best machine-learning based methods, yet they require minimal spectroscopic training, do not suffer from extrapolation biases, and are very nearly Gaussian. We apply our algorithm to Dark Energy Survey (DES) Science Verification (SV) data to produce a redMaGiC catalog sampling the redshift range z ∈ [0.2, 0.8]. Our fiducial sample has a comoving space density of 10⁻³ (h⁻¹ Mpc)⁻³, and a median photo-z bias (z_spec – z_photo) and scatter (σ_z/(1 + z)) of 0.005 and 0.017 respectively. The corresponding 5σ outlier fraction is 1.4%. We also test our algorithm with Sloan Digital Sky Survey (SDSS) Data Release 8 (DR8) and Stripe 82 data, and discuss how spectroscopic training can be used to control photo-z biases at the 0.1% level.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Dewberry, R.; Ayers, J.; Tietze, F.
The Analytical Development (AD) Section field nuclear measurement group performed six 'best available technique' verification measurements to satisfy a DOE requirement instituted for the March 2009 semi-annual inventory. The requirement of reference 1 yielded the need for the SRNL Research Operations Department Material Control & Accountability (MC&A) group to measure the Pu content of five items and the highly enriched uranium (HEU) content of two. No 14Q-qualified measurement equipment was available to satisfy the requirement. The AD field nuclear group has routinely performed the required Confirmatory Measurements for the semi-annual inventories for fifteen years using sodium iodide and high purity germanium (HpGe) γ-ray pulse-height analysis nondestructive assay (NDA) instruments. With appropriate γ-ray acquisition modeling, the HpGe spectrometers can be used to perform verification-type quantitative assay for Pu isotopics and HEU content. The AD nuclear NDA group is widely experienced with this type of measurement and reports content for these species in requested process control, MC&A booking, and holdup measurement assays Site-wide. However, none of the AD HpGe γ-ray spectrometers have been 14Q-qualified, and the requirement of reference 1 specifically excluded a γ-ray PHA measurement from those it would accept for the required verification measurements. The requirement of reference 1 was a new requirement for which the Savannah River National Laboratory (SRNL) Research Operations Department (ROD) MC&A group was unprepared. The criteria for exemption from verification were: (1) isotope content below 50 grams; (2) intrinsically tamper indicating or TID sealed items which contain a Category IV quantity of material; (3) assembled components; and (4) laboratory samples. Therefore all SRNL Material Balance Area (MBA) items with greater than 50 grams total Pu or greater than 50 grams HEU were subject to a verification measurement. The pass/fail criteria of reference 7 stated 'The facility will report measured values, book values, and statistical control limits for the selected items to DOE SR...', and 'The site/facility operator must develop, document, and maintain measurement methods for all nuclear material on inventory'. These new requirements exceeded SRNL's experience with prior semi-annual inventory expectations, but allowed the AD nuclear field measurement group to demonstrate its excellent adaptability and superior flexibility to respond to unpredicted expectations from the DOE customer. The requirements yielded five SRNL items subject to Pu verification and two SRNL items subject to HEU verification. These items are listed and described in Table 1.
Distributed Capacitive Sensor for Sample Mass Measurement
NASA Technical Reports Server (NTRS)
Toda, Risaku; McKinney, Colin; Jackson, Shannon P.; Mojarradi, Mohammad; Manohara, Harish; Trebi-Ollennu, Ashitey
2011-01-01
Previous robotic sample return missions lacked in situ sample verification/quantity measurement instruments. Therefore, the outcome of the mission remained unclear until spacecraft return. In situ sample verification systems such as this Distributed Capacitive (DisC) sensor would enable an unmanned spacecraft system to re-attempt the sample acquisition procedures until the capture of the desired sample quantity is positively confirmed, thereby maximizing the prospect for scientific reward. The DisC device contains a 10-cm-diameter pressure-sensitive elastic membrane placed at the bottom of a sample canister. The membrane deforms under the weight of accumulating planetary sample. The membrane is positioned in close proximity to an opposing rigid substrate with a narrow gap. The deformation of the membrane makes the gap narrower, resulting in increased capacitance between the two parallel plates (elastic membrane and rigid substrate). C-V conversion circuits on a nearby PCB (printed circuit board) provide capacitance readout via an LVDS (low-voltage differential signaling) interface. The capacitance method was chosen over other potential approaches such as the piezoelectric method because of its inherent temperature stability advantage. A reference capacitor and temperature sensor are embedded in the system to compensate for temperature effects. The pressure-sensitive membranes are aluminum 6061, stainless steel (SUS) 403, and metal-coated polyimide plates. The thicknesses of these membranes range from 250 to 500 µm. The rigid substrate is made with a 1- to 2-mm-thick wafer of one of the following materials, depending on the application requirements: glass, silicon, polyimide, or PCB substrate. The glass substrate is fabricated by a microelectromechanical systems (MEMS) fabrication approach. Several concentric electrode patterns are printed on the substrate. The initial gap between the two plates, 100 µm, is defined by a silicon spacer ring that is anodically bonded to the glass substrate. The fabricated proof-of-concept devices have successfully demonstrated tens to hundreds of picofarads of capacitance change when a simulated sample (100 g to 500 g) is placed on the membrane.
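The readout principle scales as a simple parallel-plate estimate, C = eps0*A/d, so narrowing a 100 µm gap under sample weight produces the tens-to-hundreds of picofarads quoted. The sketch below uses a uniform-gap simplification of the real membrane deflection, with an assumed 20 µm gap change purely for illustration.

```python
# Parallel-plate estimate for the DisC readout principle (uniform-gap
# simplification; real membrane deflection is non-uniform).
import math

EPS0 = 8.854e-12                       # vacuum permittivity, F/m

def capacitance_pF(diameter_m, gap_m):
    area = math.pi * (diameter_m / 2.0) ** 2
    return EPS0 * area / gap_m * 1e12  # picofarads

d = 0.10                               # 10 cm membrane
print(capacitance_pF(d, 100e-6))       # initial 100-um gap: ~695 pF
print(capacitance_pF(d, 80e-6) - capacitance_pF(d, 100e-6))  # ~174 pF change
```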
Code of Federal Regulations, 2010 CFR
2010-07-01
... which you sample and record gas-analyzer concentrations. (b) Measurement principles. This test verifies... appropriate frequency to prevent loss of information. This test also verifies that the measurement system... instructions. Adjust the measurement system as needed to optimize performance. Run this verification with the...
40 CFR 1065.390 - PM balance verifications and weighing process verification.
Code of Federal Regulations, 2011 CFR
2011-07-01
... days before weighing any filter. (2) Zero and span the balance within 12 h before weighing any filter. (3) Verify that the mass determination of reference filters before and after a filter weighing... reference PM sample media (e.g., filters) before and after a weighing session. A weighing session may be as...
40 CFR 1065.390 - PM balance verifications and weighing process verification.
Code of Federal Regulations, 2014 CFR
2014-07-01
... days before weighing any filter. (2) Zero and span the balance within 12 h before weighing any filter. (3) Verify that the mass determination of reference filters before and after a filter weighing... reference PM sample media (e.g., filters) before and after a weighing session. A weighing session may be as...
40 CFR 1065.390 - PM balance verifications and weighing process verification.
Code of Federal Regulations, 2013 CFR
2013-07-01
... days before weighing any filter. (2) Zero and span the balance within 12 h before weighing any filter. (3) Verify that the mass determination of reference filters before and after a filter weighing... reference PM sample media (e.g., filters) before and after a weighing session. A weighing session may be as...
40 CFR 1065.390 - PM balance verifications and weighing process verification.
Code of Federal Regulations, 2012 CFR
2012-07-01
... days before weighing any filter. (2) Zero and span the balance within 12 h before weighing any filter. (3) Verify that the mass determination of reference filters before and after a filter weighing... reference PM sample media (e.g., filters) before and after a weighing session. A weighing session may be as...
40 CFR 1065.390 - PM balance verifications and weighing process verification.
Code of Federal Regulations, 2010 CFR
2010-07-01
... days before weighing any filter. (2) Zero and span the balance within 12 h before weighing any filter. (3) Verify that the mass determination of reference filters before and after a filter weighing... weighing session by weighing reference PM sample media (e.g., filters) before and after a weighing session...
Current status of verification practices in clinical biochemistry in Spain.
Gómez-Rioja, Rubén; Alvarez, Virtudes; Ventura, Montserrat; Alsina, M Jesús; Barba, Núria; Cortés, Mariano; Llopis, María Antonia; Martínez, Cecilia; Ibarz, Mercè
2013-09-01
Verification uses logical algorithms to detect potential errors before laboratory results are released to the clinician. Even though verification is one of the main processes in all laboratories, there is a lack of standardization, mainly in the algorithms used and in the criteria and verification limits applied. A survey of clinical laboratories in Spain was conducted in order to assess the verification process, particularly the use of autoverification. Questionnaires were sent to the laboratories involved in the External Quality Assurance Program organized by the Spanish Society of Clinical Biochemistry and Molecular Pathology. Seven common biochemical parameters were included (glucose, cholesterol, triglycerides, creatinine, potassium, calcium, and alanine aminotransferase). Completed questionnaires were received from 85 laboratories. Nearly all the laboratories reported using the following seven verification criteria: internal quality control, instrument warnings, sample deterioration, reference limits, clinical data, concordance between parameters, and verification of results. The use of all verification criteria varied according to the type of verification (automatic, technical, or medical). Verification limits for these parameters are similar to biological reference ranges. Delta check was used in 24% of laboratories. Most laboratories (64%) reported using autoverification systems. Autoverification use was related to laboratory size, ownership, and type of laboratory information system, but the amount of use (percentage of tests autoverified) was not related to laboratory size. A total of 36% of Spanish laboratories do not use autoverification, despite the general implementation of laboratory information systems, most of which have autoverification capability. Criteria and rules for seven routine biochemical tests were obtained.
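Two of the rules surveyed, verification limits and the delta check, can be expressed as small predicates, as in the hedged sketch below. The numeric limits and the relative-change cutoff are placeholders, not recommended clinical values.

```python
# Toy autoverification rules: a verification-limit check plus a delta check
# against the patient's previous result. Limits are illustrative only.
def within_limits(value, low, high):
    return low <= value <= high

def delta_check(current, previous, max_relative_change=0.25):
    """Pass unless the result moved too far from the prior result."""
    if previous is None:
        return True                    # no prior result available
    return abs(current - previous) / abs(previous) <= max_relative_change

def autoverify_potassium(value_mmol_l, previous=None):
    return within_limits(value_mmol_l, 2.5, 6.5) and delta_check(value_mmol_l, previous)

print(autoverify_potassium(4.1, previous=4.3))   # released automatically
print(autoverify_potassium(6.9))                 # held for technical/medical review
```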
Glove-based approach to online signature verification.
Kamel, Nidal S; Sayeed, Shohel; Ellis, Grant A
2008-06-01
Utilizing the multiple degrees of freedom offered by the data glove for each finger and the hand, a novel on-line signature verification system using the Singular Value Decomposition (SVD) numerical tool for signature classification and verification is presented. The proposed technique is based on the Singular Value Decomposition in finding r singular vectors sensing the maximal energy of glove data matrix A, called principal subspace, so the effective dimensionality of A can be reduced. Having modeled the data glove signature through its r-principal subspace, signature authentication is performed by finding the angles between the different subspaces. A demonstration of the data glove is presented as an effective high-bandwidth data entry device for signature verification. This SVD-based signature verification technique is tested and its performance is shown to be able to recognize forgery signatures with a false acceptance rate of less than 1.2%.
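The SVD-based comparison can be sketched as follows: keep the r leading singular vectors of each glove-data matrix as a principal subspace and compare signatures by the principal angles between subspaces. The choice of r, the data layout (time samples by glove channels), and the synthetic recordings below are assumptions for illustration, not the authors' configuration.

```python
# Principal-subspace comparison of two glove recordings via principal angles.
import numpy as np

def principal_subspace(A, r=3):
    """A: (time_samples, channels) recording; returns a (channels, r)
    orthonormal basis for the r-dimensional principal subspace."""
    _, _, vt = np.linalg.svd(A, full_matrices=False)
    return vt[:r].T

def max_principal_angle_deg(basis1, basis2):
    s = np.clip(np.linalg.svd(basis1.T @ basis2, compute_uv=False), -1.0, 1.0)
    return np.degrees(np.arccos(s.min()))   # largest angle between subspaces

rng = np.random.default_rng(4)
latent = rng.normal(size=(200, 3)) @ rng.normal(size=(3, 14))  # rank-3 "signature"
ref = latent + 0.01 * rng.normal(size=latent.shape)
probe = latent + 0.01 * rng.normal(size=latent.shape)
print(max_principal_angle_deg(principal_subspace(ref), principal_subspace(probe)))
```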
Scheuermann, Taneisha S; Richter, Kimber P; Rigotti, Nancy A; Cummins, Sharon E; Harrington, Kathleen F; Sherman, Scott E; Zhu, Shu-Hong; Tindle, Hilary A; Preacher, Kristopher J
2017-12-01
To estimate the prevalence and predictors of failed biochemical verification of self-reported abstinence among participants enrolled in trials of hospital-initiated smoking cessation interventions. Comparison of characteristics between participants who verified and those who failed to verify self-reported abstinence. Multi-site randomized clinical trials conducted between 2010 and 2014 in hospitals throughout the United States. Recently hospitalized smokers who reported tobacco abstinence 6 months post-randomization and provided a saliva sample for verification purposes (n = 822). Outcomes were salivary cotinine-verified smoking abstinence at 10 and 15 ng/ml cut-points. Predictors and correlates included participant demographics and tobacco use; hospital diagnoses and treatment; and study characteristics collected via surveys and electronic medical records. Usable samples were returned by 69.8% of the 1178 eligible trial participants who reported 7-day point prevalence abstinence. The proportion of participants verified as quit was 57.8% [95% confidence interval (CI) = 54.4, 61.2; 10 ng/ml cut-off] or 60.6% (95% CI = 57.2, 63.9; 15 ng/ml). Factors associated independently with verification at 10 ng/ml were education beyond high school [odds ratio (OR) = 1.51; 95% CI = 1.07, 2.11], continuous abstinence since hospitalization (OR = 2.82; 95% CI = 2.02, 3.94), mailed versus in-person sample (OR = 3.20; 95% CI = 1.96, 5.21), and race. African American participants were less likely to verify abstinence than white participants (OR = 0.64; 95% CI = 0.44, 0.93). Findings were similar for verification at 15 ng/ml. Verification rates did not differ by treatment group. In the United States, a high proportion (about 40%) of recently hospitalized smokers enrolled in smoking cessation trials fail biochemical verification of their self-reported abstinence. © 2017 Society for the Study of Addiction.
Koller, Marianne; Becker, Christian; Thiermann, Horst; Worek, Franz
2010-05-15
The purpose of this study was to check the applicability of different analytical methods for the identification of unknown nerve agents in human body fluids. Plasma and urine samples were spiked with nerve agents (plasma) or with their metabolites (urine) or were left blank. Seven random samples (35% of all samples) were selected for the verification test. Plasma was worked up for unchanged nerve agents and for regenerated nerve agents after fluoride-induced reactivation of nerve agent-inhibited butyrylcholinesterase. Both extracts were analysed by GC-MS. Metabolites were extracted from plasma and urine, respectively, and were analysed by LC-MS. The urinary metabolites and two blank samples could be identified without further measurements, plasma metabolites and blanks were identified in six of seven samples. The analysis of unchanged nerve agent provided five agents/blanks and the sixth agent after further investigation. The determination of the regenerated agents also provided only five clear findings during the first screening because of a rather noisy baseline. Therefore, the sample preparation was extended by a size exclusion step performed before addition of fluoride which visibly reduced baseline noise and thus improved identification of the two missing agents. The test clearly showed that verification should be performed by analysing more than one biomarker to ensure identification of the agent(s). Copyright (c) 2010 Elsevier B.V. All rights reserved.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Weaver, Phyllis C.
A team from ORAU's Independent Environmental Assessment and Verification Program performed verification survey activities on the South Test Tower and four Hazardous Waste Storage Lockers. Scan data collected by ORAU determined that both the alpha and alpha-plus-beta activity was representative of radiological background conditions. The count rate distribution showed no outliers that would be indicative of alpha or alpha-plus-beta count rates in excess of background. It is the opinion of ORAU that independent verification data collected support the site's conclusions that the South Tower and Lockers sufficiently meet the site criteria for release to recycle and reuse.
DOE Office of Scientific and Technical Information (OSTI.GOV)
P.C. Weaver
2009-04-29
The primary objective of the independent verification was to determine if BJC performed the appropriate actions to meet the specified “hot spot” cleanup criteria of 500 picocuries per gram (pCi/g) uranium-238 (U-238) in surface soil. Specific tasks performed by the independent verification team (IVT) to satisfy this objective included: 1) performing radiological walkover surveys, and 2) collecting soil samples for independent analyses. The independent verification (IV) efforts were designed to evaluate radioactive contaminants (specifically U-238) in the exposed surfaces below one foot of the original site grade, given that the top one foot layer of soil on the site was removed in its entirety.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Demuth, Scott F.; Trahan, Alexis Chanel
2017-06-26
DIV of facility layout, material flows, and other information provided in the DIQ. Material accountancy through an annual PIV and a number of interim inventory verifications, including UF6 cylinder identification and counting, NDA of cylinders, and DA on a sample collection of UF6. Application of C/S technologies utilizing seals and tamper-indicating devices (TIDs) on cylinders, containers, storage rooms, and IAEA instrumentation to provide continuity of knowledge between inspections. Verification of the absence of undeclared material and operations, especially HEU production, through SNRIs, LFUA of cascade halls, and environmental swipe sampling.
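The material-accountancy element listed above ultimately rests on closing a material balance; a minimal sketch of that arithmetic, with hypothetical inventory figures, is shown below.

```python
# Illustrative sketch of the material-balance arithmetic underlying material
# accountancy at a declared facility (not IAEA code; all figures are hypothetical).
# MUF = beginning inventory + receipts - shipments - ending inventory.
def material_unaccounted_for(beginning_kg, receipts_kg, shipments_kg, ending_kg):
    return beginning_kg + receipts_kg - shipments_kg - ending_kg

muf = material_unaccounted_for(beginning_kg=1250.0, receipts_kg=400.0,
                               shipments_kg=395.0, ending_kg=1254.2)
print(f"MUF = {muf:.1f} kg U")  # a nonzero MUF is judged against its measurement uncertainty
```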
DOE Office of Scientific and Technical Information (OSTI.GOV)
Fitzgerald, T.J.; Carlos, R.C.; Argo, P.E.
As part of the integrated verification experiment (IVE), we deployed a network of hf ionospheric sounders to detect the effects of acoustic waves generated by surface ground motion following underground nuclear tests at the Nevada Test Site. The network sampled up to four geographic locations in the ionosphere from almost directly overhead of the surface ground zero out to a horizontal range of 60 km. We present sample results for four of the IVEs: Misty Echo, Texarkana, Mineral Quarry, and Bexar.
Application of software technology to automatic test data analysis
NASA Technical Reports Server (NTRS)
Stagner, J. R.
1991-01-01
The verification process for a major software subsystem was partially automated as part of a feasibility demonstration. The methods employed are generally useful and applicable to other types of subsystems. The effort resulted in substantial savings in test engineer analysis time and offers a method for inclusion of automatic verification as a part of regression testing.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lawrence, Chris C.; Flaska, Marek; Pozzi, Sara A.
2016-08-14
Verification of future warhead-dismantlement treaties will require detection of certain warhead attributes without the disclosure of sensitive design information, and this presents an unusual measurement challenge. Neutron spectroscopy—commonly eschewed as an ill-posed inverse problem—may hold special advantages for warhead verification by virtue of its insensitivity to certain neutron-source parameters like plutonium isotopics. In this article, we investigate the usefulness of unfolded neutron spectra obtained from organic-scintillator data for verifying a particular treaty-relevant warhead attribute: the presence of high-explosive and neutron-reflecting materials. Toward this end, several improvements on current unfolding capabilities are demonstrated: deuterated detectors are shown to have superior response-matrix condition to that of standard hydrogen-based scintillators; a novel data-discretization scheme is proposed which removes important detector nonlinearities; and a technique is described for re-parameterizing the unfolding problem in order to constrain the parameter space of solutions sought, sidestepping the inverse problem altogether. These improvements are demonstrated with trial measurements and verified using accelerator-based time-of-flight calculation of reference spectra. Then, a demonstration is presented in which the elemental compositions of low-Z neutron-attenuating materials are estimated to within 10%. These techniques could have direct application in verifying the presence of high-explosive materials in a neutron-emitting test item, as well as in other treaty-verification challenges.
NASA Astrophysics Data System (ADS)
Lawrence, Chris C.; Febbraro, Michael; Flaska, Marek; Pozzi, Sara A.; Becchetti, F. D.
2016-08-01
Verification of future warhead-dismantlement treaties will require detection of certain warhead attributes without the disclosure of sensitive design information, and this presents an unusual measurement challenge. Neutron spectroscopy—commonly eschewed as an ill-posed inverse problem—may hold special advantages for warhead verification by virtue of its insensitivity to certain neutron-source parameters like plutonium isotopics. In this article, we investigate the usefulness of unfolded neutron spectra obtained from organic-scintillator data for verifying a particular treaty-relevant warhead attribute: the presence of high-explosive and neutron-reflecting materials. Toward this end, several improvements on current unfolding capabilities are demonstrated: deuterated detectors are shown to have superior response-matrix condition to that of standard hydrogen-based scintillators; a novel data-discretization scheme is proposed which removes important detector nonlinearities; and a technique is described for re-parameterizing the unfolding problem in order to constrain the parameter space of solutions sought, sidestepping the inverse problem altogether. These improvements are demonstrated with trial measurements and verified using accelerator-based time-of-flight calculation of reference spectra. Then, a demonstration is presented in which the elemental compositions of low-Z neutron-attenuating materials are estimated to within 10%. These techniques could have direct application in verifying the presence of high-explosive materials in a neutron-emitting test item, as well as in other treaty-verification challenges.
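Two of the quantities discussed in these abstracts, the conditioning of a detector response matrix and a constrained unfolding of a measured spectrum, can be illustrated with a small numerical sketch; the synthetic response matrix and spectrum below are assumptions, not the detectors or data of the article.

```python
# Illustrative sketch: condition number of a response matrix and a simple
# non-negative least-squares unfolding of a measured pulse-height spectrum.
# The matrix and data are synthetic, not detector characterizations from the article.
import numpy as np
from scipy.optimize import nnls

rng = np.random.default_rng(0)
n_bins = 20
# Synthetic response matrix: each incident-energy column smears into lower channels.
R = np.tril(rng.random((n_bins, n_bins))) + np.eye(n_bins)
print("response-matrix condition number:", np.linalg.cond(R))

true_spectrum = np.exp(-np.linspace(0, 3, n_bins))        # assumed incident spectrum
measured = R @ true_spectrum + 0.01 * rng.random(n_bins)  # smeared, noisy measurement
unfolded, residual = nnls(R, measured)                    # constrain the solution to be >= 0
print("unfolding residual:", residual)
```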
Code of Federal Regulations, 2010 CFR
2010-10-01
... that the ERMM meets the performance criteria contained in this appendix and that test verification data are available to a railroad or to FRA upon request. 2. The test verification data shall contain, at a minimum, all pertinent original data logs and documentation that the test sample preparation, test set up...
Code of Federal Regulations, 2011 CFR
2011-10-01
... that the ERMM meets the performance criteria contained in this appendix and that test verification data are available to a railroad or to FRA upon request. 2. The test verification data shall contain, at a minimum, all pertinent original data logs and documentation that the test sample preparation, test set up...
Code of Federal Regulations, 2014 CFR
2014-10-01
... that the ERMM meets the performance criteria contained in this appendix and that test verification data are available to a railroad or to FRA upon request. 2. The test verification data shall contain, at a minimum, all pertinent original data logs and documentation that the test sample preparation, test set up...
Code of Federal Regulations, 2012 CFR
2012-10-01
... that the ERMM meets the performance criteria contained in this appendix and that test verification data are available to a railroad or to FRA upon request. 2. The test verification data shall contain, at a minimum, all pertinent original data logs and documentation that the test sample preparation, test set up...
Code of Federal Regulations, 2013 CFR
2013-10-01
... that the ERMM meets the performance criteria contained in this appendix and that test verification data are available to a railroad or to FRA upon request. 2. The test verification data shall contain, at a minimum, all pertinent original data logs and documentation that the test sample preparation, test set up...
Face verification with balanced thresholds.
Yan, Shuicheng; Xu, Dong; Tang, Xiaoou
2007-01-01
The process of face verification is guided by a pre-learned global threshold, which, however, is often inconsistent with class-specific optimal thresholds. It is, hence, beneficial to pursue a balance of the class-specific thresholds in the model-learning stage. In this paper, we present a new dimensionality reduction algorithm tailored to the verification task that ensures threshold balance. This is achieved by the following aspects. First, feasibility is guaranteed by employing an affine transformation matrix, instead of the conventional projection matrix, for dimensionality reduction, and, hence, we call the proposed algorithm threshold balanced transformation (TBT). Then, the affine transformation matrix, constrained as the product of an orthogonal matrix and a diagonal matrix, is optimized to improve the threshold balance and classification capability in an iterative manner. Unlike most algorithms for face verification which are directly transplanted from face identification literature, TBT is specifically designed for face verification and clarifies the intrinsic distinction between these two tasks. Experiments on three benchmark face databases demonstrate that TBT significantly outperforms the state-of-the-art subspace techniques for face verification.
Investigation of a Verification and Validation Tool with a Turbofan Aircraft Engine Application
NASA Technical Reports Server (NTRS)
Uth, Peter; Narang-Siddarth, Anshu; Wong, Edmond
2018-01-01
The development of more advanced control architectures for turbofan aircraft engines can yield gains in performance and efficiency over the lifetime of an engine. However, the implementation of these increasingly complex controllers is contingent on their ability to provide safe, reliable engine operation. Therefore, having the means to verify the safety of new control algorithms is crucial. As a step towards this goal, CoCoSim, a publicly available verification tool for Simulink, is used to analyze C-MAPSS40k, a 40,000 lbf class turbo-fan engine model developed at NASA for testing new control algorithms. Due to current limitations of the verification software, several modifications are made to C-MAPSS40k to achieve compatibility with CoCoSim. Some of these modifications sacrifice fidelity to the original model. Several safety and performance requirements typical for turbofan engines are identified and constructed into a verification framework. Preliminary results using an industry standard baseline controller for these requirements are presented. While verification capabilities are demonstrated, a truly comprehensive analysis will require further development of the verification tool.
Kang, Leni; Zhang, Shaokai; Zhao, Fanghui; Qiao, Youlin
2014-03-01
To evaluate and adjust for verification bias in screening or diagnostic tests. The inverse-probability weighting method was used to adjust the sensitivity and specificity of diagnostic tests, with a cervical cancer screening example used to introduce the Compare Tests package in R, in which the method is implemented. Sensitivity and specificity calculated from the traditional method and the maximum likelihood estimation method were compared with the results from the inverse-probability weighting method in the random-sampled example. The true sensitivity and specificity of the HPV self-sampling test were 83.53% (95% CI: 74.23-89.93) and 85.86% (95% CI: 84.23-87.36). In the analysis of data with randomly missing verification by the gold standard, the sensitivity and specificity calculated by the traditional method were 90.48% (95% CI: 80.74-95.56) and 71.96% (95% CI: 68.71-75.00), respectively. The adjusted sensitivity and specificity under the inverse-probability weighting method were 82.25% (95% CI: 63.11-92.62) and 85.80% (95% CI: 85.09-86.47), respectively, whereas they were 80.13% (95% CI: 66.81-93.46) and 85.80% (95% CI: 84.20-87.41) under the maximum likelihood estimation method. The inverse-probability weighting method could effectively adjust the sensitivity and specificity of a diagnostic test when verification bias exists, especially under complex sampling.
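A minimal sketch of the inverse-probability weighting idea, in which verified subjects are reweighted by the inverse of their verification probability before sensitivity and specificity are computed, is shown below; the data and verification probabilities are hypothetical, and this is not the R package referenced in the abstract.

```python
# Minimal sketch of inverse-probability weighting to correct sensitivity/specificity
# for verification bias: among verified subjects, each record is weighted by the
# inverse of its probability of being referred for gold-standard verification.
import numpy as np

def ipw_sens_spec(test, disease, verified, p_verified):
    w = 1.0 / p_verified                 # weights; only verified rows are used below
    v = verified.astype(bool)
    t, d, w = test[v], disease[v], w[v]
    sens = np.sum(w * t * d) / np.sum(w * d)
    spec = np.sum(w * (1 - t) * (1 - d)) / np.sum(w * (1 - d))
    return sens, spec

# Hypothetical example: all test-positives are verified; 20% of test-negatives are verified.
test     = np.array([1, 1, 1, 0, 0, 0, 0, 0])
disease  = np.array([1, 1, 0, 0, 0, 0, 0, 0])   # gold standard (only meaningful for verified rows)
verified = np.array([1, 1, 1, 1, 1, 0, 0, 0])
p_verif  = np.where(test == 1, 1.0, 0.2)
print(ipw_sens_spec(test, disease, verified, p_verif))
```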
A study of applications scribe frame data verifications using design rule check
NASA Astrophysics Data System (ADS)
Saito, Shoko; Miyazaki, Masaru; Sakurai, Mitsuo; Itoh, Takahisa; Doi, Kazumasa; Sakurai, Norioko; Okada, Tomoyuki
2013-06-01
In semiconductor manufacturing, scribe frame data is generally generated for each LSI product according to its specific process design. Scribe frame data is designed based on definition tables for scanner alignment, wafer inspection, and customer-specified marks. At the end, we check that the scribe frame design conforms to the specifications of the alignment and inspection marks. Recently, in the COT (customer owned tooling) business or in new technology development, there has been no effective verification method for scribe frame data, and verification takes a lot of time. Therefore, we tried to establish a new verification method for scribe frame data by applying pattern matching and DRC (Design Rule Check), which is used in device verification. We would like to show the scheme of scribe frame data verification using DRC that we tried to apply. First, verification rules are created based on the specifications of the scanner, inspection, and others, and a mark library is also created for pattern matching. Next, DRC verification is performed on the scribe frame data; this verification includes pattern matching using the mark library. As a result, our experiments demonstrated that, by use of pattern matching and DRC verification, our new method can yield speed improvements of more than 12 percent compared to conventional mark checks by visual inspection, and the inspection time can be reduced to less than 5 percent if multi-CPU processing is used. Our method delivers both short processing time and excellent accuracy when checking many marks. It is easy to maintain and provides an easy way for COT customers to use original marks. We believe that our new DRC verification method for scribe frame data is indispensable and mutually beneficial.
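A minimal sketch of the kind of geometric rule such a DRC-based check can encode, here a minimum-spacing check between two mark bounding boxes, is shown below; the coordinates and rule value are hypothetical, and a production flow would use an EDA DRC engine rather than this code.

```python
# Hypothetical illustration of a DRC-style spacing rule between two scribe-frame marks:
# verify that their bounding boxes keep at least a minimum spacing.
def min_spacing_ok(box_a, box_b, min_spacing_um):
    (ax0, ay0, ax1, ay1), (bx0, by0, bx1, by1) = box_a, box_b
    dx = max(bx0 - ax1, ax0 - bx1, 0.0)   # horizontal gap (0 if the boxes overlap in x)
    dy = max(by0 - ay1, ay0 - by1, 0.0)   # vertical gap (0 if the boxes overlap in y)
    return (dx**2 + dy**2) ** 0.5 >= min_spacing_um

alignment_mark = (0.0, 0.0, 40.0, 40.0)       # (x0, y0, x1, y1) in micrometres
inspection_mark = (55.0, 0.0, 95.0, 40.0)
print(min_spacing_ok(alignment_mark, inspection_mark, min_spacing_um=10.0))  # True
```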
Atkinson, David A.
2002-01-01
Methods and apparatus for ion mobility spectrometry and analyte detection and identification verification system are disclosed. The apparatus is configured to be used in an ion mobility spectrometer and includes a plurality of reactant reservoirs configured to contain a plurality of reactants which can be reacted with the sample to form adducts having varying ion mobilities. A carrier fluid, such as air or nitrogen, is used to carry the sample into the spectrometer. The plurality of reactants are configured to be selectively added to the carrier stream by use of inlet and outlet manifolds in communication with the reagent reservoirs, the reservoirs being selectively isolatable by valves. The invention further includes a spectrometer having the reagent system described. In the method, a first reactant is used with the sample. Following a positive result, a second reactant is used to determine whether a predicted response occurs. The occurrence of the second predicted response tends to verify the existence of a component of interest within the sample. A third reactant can also be used to provide further verification of the existence of a component of interest. A library can be established of known responses of compounds of interest with various reactants and the results of a specific multi-reactant survey of a sample can be compared against the library to determine whether a component detected in the sample is likely to be a specific component of interest.
Formal verification of a set of memory management units
NASA Technical Reports Server (NTRS)
Schubert, E. Thomas; Levitt, K.; Cohen, Gerald C.
1992-01-01
This document describes the verification of a set of memory management units (MMU). The verification effort demonstrates the use of hierarchical decomposition and abstract theories. The MMUs can be organized into a complexity hierarchy. Each new level in the hierarchy adds a few significant features or modifications to the lower level MMU. The units described include: (1) a page check translation look-aside module (TLM); (2) a page check TLM with supervisor line; (3) a base bounds MMU; (4) a virtual address translation MMU; and (5) a virtual address translation MMU with memory resident segment table.
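As a concrete illustration of the simplest unit in the hierarchy above, the sketch below models a base-bounds MMU that checks a virtual address against a bound register and relocates it by a base register; the register values are assumptions for illustration, not part of the verified designs.

```python
# Illustrative sketch of a base-bounds MMU: a virtual address is valid if it falls
# below the segment bound, and the physical address is base + offset.
class BaseBoundsMMU:
    def __init__(self, base, bound):
        self.base, self.bound = base, bound

    def translate(self, virtual_addr):
        if virtual_addr >= self.bound:        # protection check
            raise MemoryError("address out of bounds")
        return self.base + virtual_addr       # relocation

mmu = BaseBoundsMMU(base=0x4000, bound=0x1000)
print(hex(mmu.translate(0x0FF0)))  # 0x4ff0
```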
Report on the formal specification and partial verification of the VIPER microprocessor
NASA Technical Reports Server (NTRS)
Brock, Bishop; Hunt, Warren A., Jr.
1991-01-01
The formal specification and partial verification of the VIPER microprocessor is reviewed. The VIPER microprocessor was designed by RSRE, Malvern, England, for safety critical computing applications (e.g., aircraft, reactor control, medical instruments, armaments). The VIPER was carefully specified and partially verified in an attempt to provide a microprocessor with completely predictable operating characteristics. The specification of VIPER is divided into several levels of abstraction, from a gate-level description up to an instruction execution model. Although the consistency between certain levels was demonstrated with mechanically-assisted mathematical proof, the formal verification of VIPER was never completed.
Li, Xue; Ahmad, Imad A Haidar; Tam, James; Wang, Yan; Dao, Gina; Blasko, Andrei
2018-02-05
A Total Organic Carbon (TOC) based analytical method to quantitate trace residues of clean-in-place (CIP) detergents CIP100® and CIP200® on the surfaces of pharmaceutical manufacturing equipment was developed and validated. Five factors affecting the development and validation of the method were identified: diluent composition, diluent volume, extraction method, location for TOC sample preparation, and oxidant flow rate. Key experimental parameters were optimized to minimize contamination and to improve the sensitivity, recovery, and reliability of the method. The optimized concentration of the phosphoric acid in the swabbing solution was 0.05 M, and the optimal volume of the sample solution was 30 mL. The swab extraction method was 1 min of sonication. The use of a clean room, as compared to an isolated lab environment, was not required for method validation. The method was demonstrated to be linear with a correlation coefficient (R) of 0.9999. The average recoveries from stainless steel surfaces at multiple spike levels were >90%. The repeatability and intermediate precision results were ≤5% across the 2.2-6.6 ppm range (50-150% of the target maximum carry over, MACO, limit). The method was also shown to be sensitive with a detection limit (DL) of 38 ppb and a quantitation limit (QL) of 114 ppb. The method validation demonstrated that the developed method is suitable for its intended use. The methodology developed in this study is generally applicable to the cleaning verification of any organic detergents used for the cleaning of pharmaceutical manufacturing equipment made of electropolished stainless steel material. Copyright © 2017 Elsevier B.V. All rights reserved.
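Two calculations that commonly accompany this kind of cleaning-verification validation, percent recovery from spiked surfaces and ICH-style detection and quantitation limits (DL = 3.3σ/S, QL = 10σ/S), are sketched below with hypothetical numbers rather than the validated values reported in the abstract.

```python
# Hedged sketch of validation arithmetic often used alongside a TOC cleaning-verification
# method: percent recovery from spiked coupons and ICH-style DL/QL estimates.
# All numbers are hypothetical, not the abstract's validated results.
def percent_recovery(measured_ug, spiked_ug):
    return 100.0 * measured_ug / spiked_ug

def dl_ql(sigma_response, slope):
    # DL = 3.3*sigma/S, QL = 10*sigma/S (sigma in response units, slope in response per ppm)
    return 3.3 * sigma_response / slope, 10.0 * sigma_response / slope

print(f"recovery: {percent_recovery(measured_ug=9.3, spiked_ug=10.0):.1f}%")
dl, ql = dl_ql(sigma_response=0.0023, slope=0.20)
print(f"DL = {dl * 1000:.0f} ppb, QL = {ql * 1000:.0f} ppb")
```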
Alloy, L B; Lipman, A J
1992-05-01
In this commentary we examine Swann, Wenzlaff, Krull, and Pelham's (1992) findings with respect to each of 5 central propositions in self-verification theory. We conclude that although the data are consistent with self-verification theory, none of the 5 components of the theory have been demonstrated convincingly as yet. Specifically, we argue that depressed subjects' selection of social feedback appears to be balanced or evenhanded rather than biased toward negative feedback and that there is little evidence to indicate that depressives actively seek negative appraisals. Furthermore, we suggest that the studies are silent with respect to the motivational postulates of self-verification theory and that a variety of competing cognitive and motivational models can explain Swann et al.'s findings as well as self-verification theory.
Formal Verification of the AAMP-FV Microcode
NASA Technical Reports Server (NTRS)
Miller, Steven P.; Greve, David A.; Wilding, Matthew M.; Srivas, Mandayam
1999-01-01
This report describes the experiences of Collins Avionics & Communications and SRI International in formally specifying and verifying the microcode in a Rockwell proprietary microprocessor, the AAMP-FV, using the PVS verification system. This project built extensively on earlier experiences using PVS to verify the microcode in the AAMP5, a complex, pipelined microprocessor designed for use in avionics displays and global positioning systems. While the AAMP5 experiment demonstrated the technical feasibility of formal verification of microcode, the steep learning curve encountered left unanswered the question of whether it could be performed at reasonable cost. The AAMP-FV project was conducted to determine whether the experience gained on the AAMP5 project could be used to make formal verification of microcode cost effective for safety-critical and high volume devices.
UIVerify: A Web-Based Tool for Verification and Automatic Generation of User Interfaces
NASA Technical Reports Server (NTRS)
Shiffman, Smadar; Degani, Asaf; Heymann, Michael
2004-01-01
In this poster, we describe a web-based tool for verification and automatic generation of user interfaces. The verification component of the tool accepts as input a model of a machine and a model of its interface, and checks that the interface is adequate (correct). The generation component of the tool accepts a model of a given machine and the user's task, and then generates a correct and succinct interface. This write-up will demonstrate the usefulness of the tool by verifying the correctness of a user interface to a flight-control system. The poster will include two more examples of using the tool: verification of the interface to an espresso machine, and automatic generation of a succinct interface to a large hypothetical machine.
NASA Astrophysics Data System (ADS)
Zhong, Shenlu; Li, Mengjiao; Tang, Xiajie; He, Weiqing; Wang, Xiaogang
2017-01-01
A novel optical information verification and encryption method is proposed based on the interference principle and phase retrieval with sparsity constraints. In this method, a target image is encrypted into two phase-only masks (POMs), which comprise sparse phase data used for verification. Both of the two POMs need to be authenticated before being applied for decrypting. The target image can be optically reconstructed when the two authenticated POMs are Fourier transformed and convolved by the correct decryption key, which is also generated in the encryption process. No holographic scheme is involved in the proposed optical verification and encryption system and there is also no problem of information disclosure in the two authenticable POMs. Numerical simulation results demonstrate the validity and good performance of this new proposed method.
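A toy numerical illustration of combining two phase-only masks in the Fourier domain is given below; the masks here are random rather than derived by phase retrieval, so it only demonstrates the propagation arithmetic, not the authors' verification scheme.

```python
# Toy illustration (not the authors' algorithm): modulate an image with one phase-only
# mask, apply a second mask in the Fourier plane, and take the output intensity.
# Real schemes derive the masks by phase retrieval with sparsity constraints;
# here they are random, so this only shows the Fourier-optics arithmetic.
import numpy as np

rng = np.random.default_rng(1)
n = 64
target = np.zeros((n, n)); target[20:44, 20:44] = 1.0      # simple test image

pom1 = np.exp(1j * 2 * np.pi * rng.random((n, n)))         # phase-only mask 1
pom2 = np.exp(1j * 2 * np.pi * rng.random((n, n)))         # phase-only mask 2 (Fourier plane)
field = np.fft.ifft2(np.fft.fft2(target * pom1) * pom2)    # propagate through both masks
intensity = np.abs(field) ** 2
print(intensity.shape, intensity.max())
```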
Development of a new lattice physics code robin for PWR application
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zhang, S.; Chen, G.
2013-07-01
This paper presents a description of the methodologies and preliminary verification results of a new lattice physics code, ROBIN, being developed for PWR application at Shanghai NuStar Nuclear Power Technology Co., Ltd. The methods used in ROBIN to fulfill the various tasks of lattice physics analysis are an integration of historical methods and new methods that came into being very recently. Not only are methods like equivalence theory for resonance treatment and the method of characteristics for neutron transport calculation adopted, as they are applied in many of today's production-level LWR lattice codes, but also very useful new methods like the enhanced neutron current method for Dancoff correction in large and complicated geometry and the log linear rate constant power depletion method for Gd-bearing fuel are implemented in the code. A small sample of verification results is provided to illustrate the type of accuracy achievable using ROBIN. It is demonstrated that ROBIN is capable of satisfying most of the needs for PWR lattice analysis and has the potential to become a production quality code in the future. (authors)
An Identity-Based Anti-Quantum Privacy-Preserving Blind Authentication in Wireless Sensor Networks.
Zhu, Hongfei; Tan, Yu-An; Zhu, Liehuang; Wang, Xianmin; Zhang, Quanxin; Li, Yuanzhang
2018-05-22
With the development of wireless sensor networks, IoT devices are crucial for the Smart City; these devices change people's lives, for example through e-payment and e-voting systems. However, in these two systems, the state-of-the-art authentication protocols based on traditional number theory cannot defeat a quantum computer attack. In order to protect user privacy and guarantee the trustworthiness of big data, we propose a new identity-based blind signature scheme based on a number theorem research unit lattice; this scheme mainly uses a rejection sampling theorem instead of constructing a trapdoor. Meanwhile, this scheme does not depend on complex public key infrastructure and can resist quantum computer attack. Then we design an e-payment protocol using the proposed scheme. Furthermore, we prove our scheme is secure in the random oracle model, and satisfies confidentiality, integrity, and non-repudiation. Finally, we demonstrate that the proposed scheme outperforms other existing identity-based blind signature schemes in signing speed and verification speed, and outperforms other lattice-based blind signatures in signing speed, verification speed, and signing secret key size.
An Identity-Based Anti-Quantum Privacy-Preserving Blind Authentication in Wireless Sensor Networks
Zhu, Hongfei; Tan, Yu-an; Zhu, Liehuang; Wang, Xianmin; Zhang, Quanxin; Li, Yuanzhang
2018-01-01
With the development of wireless sensor networks, IoT devices are crucial for the Smart City; these devices change people's lives, for example through e-payment and e-voting systems. However, in these two systems, the state-of-the-art authentication protocols based on traditional number theory cannot defeat a quantum computer attack. In order to protect user privacy and guarantee the trustworthiness of big data, we propose a new identity-based blind signature scheme based on a number theorem research unit lattice; this scheme mainly uses a rejection sampling theorem instead of constructing a trapdoor. Meanwhile, this scheme does not depend on complex public key infrastructure and can resist quantum computer attack. Then we design an e-payment protocol using the proposed scheme. Furthermore, we prove our scheme is secure in the random oracle model, and satisfies confidentiality, integrity, and non-repudiation. Finally, we demonstrate that the proposed scheme outperforms other existing identity-based blind signature schemes in signing speed and verification speed, and outperforms other lattice-based blind signatures in signing speed, verification speed, and signing secret key size. PMID:29789475
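The rejection-sampling idea referenced in both abstracts can be illustrated in one dimension: a candidate signature component z = y + c*s is released only with a probability chosen so that accepted values no longer depend on the secret s. The sketch below is a simplified illustration of that idea, not the lattice scheme of the paper; the distribution, rejection constant, and parameters are assumptions.

```python
# Toy one-dimensional illustration of rejection sampling as used in lattice signatures
# ("Fiat-Shamir with aborts"): release z = y + c*s only with probability
# pdf(z) / (M * pdf(z - c*s)), so accepted z follow the secret-independent distribution.
# Parameters are simplified assumptions, not the paper's scheme.
import math, random

SIGMA = 100.0
M = math.exp(1.0)  # rejection constant (assumed)

def gauss_pdf(x, sigma=SIGMA):
    return math.exp(-x * x / (2 * sigma * sigma))

def sign_once(secret_s, challenge_c):
    y = random.gauss(0, SIGMA)
    z = y + challenge_c * secret_s
    accept_p = min(1.0, gauss_pdf(z) / (M * gauss_pdf(z - challenge_c * secret_s)))
    return z if random.random() < accept_p else None   # None means "abort and retry"

samples = [sign_once(secret_s=7, challenge_c=1) for _ in range(10000)]
accepted = [z for z in samples if z is not None]
print(f"acceptance rate: {len(accepted) / len(samples):.2f}")
```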
Action-based verification of RTCP-nets with CADP
NASA Astrophysics Data System (ADS)
Biernacki, Jerzy; Biernacka, Agnieszka; Szpyrka, Marcin
2015-12-01
The paper presents an algorithm for translating RTCP-net (real-time coloured Petri net) coverability graphs into the Aldebaran format. The approach provides the possibility of automatic RTCP-net verification using model checking techniques provided by the CADP toolbox. An actual fire alarm control panel system has been modelled and several of its crucial properties have been verified to demonstrate the usability of the approach.
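For context, the Aldebaran (.aut) textual format accepted by CADP consists of a descriptor line followed by labelled transitions; a minimal sketch of emitting such a file from an already-numbered state graph is shown below, with a made-up example graph rather than an actual RTCP-net coverability graph.

```python
# Minimal sketch of emitting a labelled transition system in Aldebaran (.aut) format,
# assuming states are already numbered with 0 as the initial state. The example graph
# is hypothetical; a real translation would walk an RTCP-net coverability graph.
def to_aut(initial, transitions, n_states):
    lines = [f"des ({initial}, {len(transitions)}, {n_states})"]
    lines += [f'({src}, "{label}", {dst})' for src, label, dst in transitions]
    return "\n".join(lines)

transitions = [(0, "startMonitoring", 1), (1, "alarmRaised", 2), (2, "reset", 0)]
print(to_aut(initial=0, transitions=transitions, n_states=3))
```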
40 CFR 1066.220 - Linearity verification.
Code of Federal Regulations, 2012 CFR
2012-07-01
..., you must demonstrate to us that the deficiency does not adversely affect your ability to demonstrate... system at the specified temperatures and pressures. This may include any specified adjustment or periodic...
40 CFR 1066.220 - Linearity verification.
Code of Federal Regulations, 2013 CFR
2013-07-01
..., you must demonstrate to us that the deficiency does not adversely affect your ability to demonstrate... system at the specified temperatures and pressures. This may include any specified adjustment or periodic...
Code of Federal Regulations, 2013 CFR
2013-07-01
... discrete-mode testing. For this check we consider water vapor a gaseous constituent. This verification does... for water removed from the sample done in post-processing according to § 1065.659 and it does not... humidification vessel that contains water. You must humidify NO2 span gas with another moist gas stream. We...
Code of Federal Regulations, 2014 CFR
2014-07-01
... discrete-mode testing. For this check we consider water vapor a gaseous constituent. This verification does... for water removed from the sample done in post-processing according to § 1065.659 (40 CFR 1066.620 for... contains water. You must humidify NO2 span gas with another moist gas stream. We recommend humidifying your...
Cleanup Verification Package for the 118-F-5 PNL Sawdust Pit
DOE Office of Scientific and Technical Information (OSTI.GOV)
L. D. Habel
2008-05-20
This cleanup verification package documents completion of remedial action, sampling activities, and compliance with cleanup criteria for the 118-F-5 Burial Ground, the PNL (Pacific Northwest Laboratory) Sawdust Pit. The 118-F-5 Burial Ground was an unlined trench that received radioactive sawdust from the floors of animal pens in the 100-F Experimental Animal Farm.
Feng, Qin; Gai, Fei; Sang, Yaxiong; Zhang, Jie; Wang, Ping; Wang, Yue; Liu, Bing; Lin, Dongmei; Yu, Yang; Fang, Jian
2018-01-01
The AURA3 clinical trial has shown that advanced non-small cell lung cancer (NSCLC) patients with EGFR T790M mutations in circulating tumor DNA (ctDNA) could benefit from osimertinib. The aim of this study was to assess the usefulness of the QuantStudio™ 3D Digital PCR System platform for the detection of plasma EGFR T790M mutations in NSCLC patients, and compare the performances of 3D Digital PCR and ARMS-PCR. A total of 119 Chinese patients were enrolled in this study. Mutant allele frequency of plasma EGFR T790M was detected by 3D Digital PCR, then 25 selected samples were verified by ARMS-PCR and four of them were verified by next generation sequencing (NGS). In total, 52.94% (69/119) had EGFR T790M mutations detected by 3D Digital PCR. In the 69 positive samples, the median mutant allele frequency (AF) was 1.09% and three cases presented low concentration (AF < 0.1%). Limited by the amount of plasma DNA, 17 samples (AF < 2.5%) and eight samples (T790M-) were selected for verification by ARMS-PCR. Four of those samples were verified by NGS as a third verification method. Among the selected 17 positive cases, ten samples presented mutant allele frequency < 0.5%, and seven samples presented intermediate mutant allele frequency (0.5% ≤ AF < 2.5%). However, only three samples (3/17) were identified as positive by ARMS-PCR, namely, P6 (AF = 1.09%), P7 (AF = 2.09%), and P8 (AF = 2.21%). It is worth mentioning that sample P9 (AF = 2.05%, analyzed by 3D Digital PCR) was identified as T790M- by ARMS-PCR. Four samples were identified as T790M+ by both NGS and 3D Digital PCR, and typically three samples (3/4) presented at a low ratio (AF < 0.5%). Our study demonstrated that 3D Digital PCR is a novel method with high sensitivity and specificity to detect EGFR T790M mutation in plasma.
Sang, Yaxiong; Zhang, Jie; Wang, Ping; Wang, Yue; Liu, Bing; Lin, Dongmei; Yu, Yang; Fang, Jian
2018-01-01
Background The AURA3 clinical trial has shown that advanced non-small cell lung cancer (NSCLC) patients with EGFR T790M mutations in circulating tumor DNA (ctDNA) could benefit from osimertinib. Purpose The aim of this study was to assess the usefulness of the QuantStudio™ 3D Digital PCR System platform for the detection of plasma EGFR T790M mutations in NSCLC patients, and compare the performances of 3D Digital PCR and ARMS-PCR. Patients and methods A total of 119 Chinese patients were enrolled in this study. Mutant allele frequency of plasma EGFR T790M was detected by 3D Digital PCR, then 25 selected samples were verified by ARMS-PCR and four of them were verified by next generation sequencing (NGS). Results In total, 52.94% (69/119) had EGFR T790M mutations detected by 3D Digital PCR. In the 69 positive samples, the median mutant allele frequency (AF) was 1.09% and three cases presented low concentration (AF < 0.1%). Limited by the amount of plasma DNA, 17 samples (AF < 2.5%) and eight samples (T790M-) were selected for verification by ARMS-PCR. Four of those samples were verified by NGS as a third verification method. Among the selected 17 positive cases, ten samples presented mutant allele frequency < 0.5%, and seven samples presented intermediate mutant allele frequency (0.5% ≤ AF < 2.5%). However, only three samples (3/17) were identified as positive by ARMS-PCR, namely, P6 (AF = 1.09%), P7 (AF = 2.09%), and P8 (AF = 2.21%). It is worth mentioning that sample P9 (AF = 2.05%, analyzed by 3D Digital PCR) was identified as T790M- by ARMS-PCR. Four samples were identified as T790M+ by both NGS and 3D Digital PCR, and typically three samples (3/4) presented at a low ratio (AF < 0.5%). Conclusion Our study demonstrated that 3D Digital PCR is a novel method with high sensitivity and specificity to detect EGFR T790M mutation in plasma. PMID:29403309
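For readers unfamiliar with how digital PCR platforms report allele fractions, the sketch below shows the usual Poisson correction from partition counts to a mutant allele fraction; the partition counts are hypothetical, and this is not the instrument's analysis software.

```python
# Worked sketch of the Poisson correction typically used in digital PCR: convert counts
# of positive partitions into mean copies per partition, then into a mutant allele fraction.
# The partition counts below are hypothetical.
import math

def copies_per_partition(positive, total):
    return -math.log(1 - positive / total)   # Poisson estimate of mean copies/partition

def mutant_allele_fraction(mut_pos, wt_pos, total):
    lam_mut = copies_per_partition(mut_pos, total)
    lam_wt = copies_per_partition(wt_pos, total)
    return lam_mut / (lam_mut + lam_wt)

af = mutant_allele_fraction(mut_pos=180, wt_pos=16000, total=20000)
print(f"mutant allele fraction: {af:.2%}")
```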
Options and Risk for Qualification of Electric Propulsion System
NASA Technical Reports Server (NTRS)
Bailey, Michelle; Daniel, Charles; Cook, Steve (Technical Monitor)
2002-01-01
Electric propulsion vehicle systems encompass a wide range of propulsion alternatives, including solar and nuclear, which present unique circumstances for qualification. This paper will address the alternatives for qualification of electric propulsion spacecraft systems. The approach taken will be to address the considerations for qualification at the various levels of systems definition. Additionally, for each level of qualification the system-level risk implications will be developed. The paper will also explore the implications of analysis versus test for various levels of systems definition, while retaining the objectives of a verification program. The limitations of terrestrial testing will be explored along with the risk and implications of orbital demonstration testing. The paper will seek to develop a template for structuring a verification program based on cost, risk, and value return. A successful verification program should establish controls and define the objectives of the verification compliance program. Finally, the paper will address the political and programmatic factors that may impact options for system verification.
Dillon, Gerald Patrick; Keegan, Jason D; Wallace, Geoff; Yiannikouris, Alexandros; Moran, Colm Anthony
2018-06-01
Docosahexaenoic acid (DHA) is an omega-3 fatty acid (n-3 FA) that has been shown to play a role in canine growth and physiological integrity and in improvements in skin and coat condition. However, potential adverse effects of n-3 FAs, specifically impaired cellular immunity, have been observed in dogs fed diets with elevated levels of n-3 FAs. As such, a safe upper limit (SUL) for total n-3 FAs (DHA and EPA) in dogs has been established. Considering this SUL, sensitive methods for detecting DHA in blood serum as a biomarker are required when conducting n-3 FA supplementation trials involving dogs. In this study, an LC-ESI-MS/MS method for DHA detection in dog serum was validated and verified. Recovery of DHA was optimized and parallelism tests were conducted with spiked samples, demonstrating that the serum matrix did not interfere with quantitation. The stability of DHA in serum was also investigated, with -80 °C considered suitable when storing samples for up to six months. The method was linear over a calibration range of 1-500 μg/mL and precision and accuracy were found to meet the requirements for validation. This method was verified in an alternative laboratory using a different analytical system and operator, with the results meeting the criteria for verification. Copyright © 2018. Published by Elsevier Inc.
High-speed autoverifying technology for printed wiring boards
NASA Astrophysics Data System (ADS)
Ando, Moritoshi; Oka, Hiroshi; Okada, Hideo; Sakashita, Yorihiro; Shibutani, Nobumi
1996-10-01
We have developed an automated pattern verification technique. The output of an automated optical inspection system contains many false alarms. Verification is needed to distinguish between minor irregularities and serious defects. In the past, this verification was usually done manually, which led to unsatisfactory product quality. The goal of our new automated verification system is to detect pattern features on surface mount technology boards. In our system, we employ a new illumination method, which uses multiple colors and multiple illumination directions. Images are captured with a CCD camera. We have developed a new algorithm that uses CAD data for both pattern matching and pattern structure determination. This helps to search for patterns around a defect and to examine defect definition rules. These are processed with a high-speed workstation and hard-wired circuits. The system can verify a defect within 1.5 seconds. The verification system was tested in a factory. It verified 1,500 defective samples and detected all significant defects with only a 0.1 percent error rate (false alarms).
Spacecraft servicing demonstration plan
NASA Technical Reports Server (NTRS)
Bergonz, F. H.; Bulboaca, M. A.; Derocher, W. L., Jr.
1984-01-01
A preliminary spacecraft servicing demonstration plan is prepared which leads to a fully verified operational on-orbit servicing system based on the module exchange, refueling, and resupply technologies. The resulting system can be applied at the space station, in low Earth orbit with an orbital maneuvering vehicle (OMV), or be carried with an OMV to geosynchronous orbit by an orbital transfer vehicle. The three-phase plan includes ground demonstrations, cargo bay demonstrations, and free-flight verifications. The plan emphasizes the exchange of multimission modular spacecraft (MMS) modules, which involves space-repairable satellites. The three servicer mechanism configurations are the engineering test unit, a protoflight-quality unit, and two fully operational units that have been qualified and documented for use in free-flight verification activity. The plan balances costs and risks by overlapping study phases, utilizing existing equipment for ground demonstrations, maximizing use of existing MMS equipment, and renting a spacecraft bus.
40 CFR 1066.135 - Linearity verification.
Code of Federal Regulations, 2014 CFR
2014-07-01
... CVS, double-dilution, and partial-flow systems. (3) PM sample. (4) Chiller sample, for gaseous sampling systems that use thermal chillers to dry samples, and that use chiller temperature to calculate dewpoint at the chiller outlet. For testing, if you choose to use the high alarm temperature setpoint for...
NASA Astrophysics Data System (ADS)
Kim, A. A.; Klochkov, D. V.; Konyaev, M. A.; Mihaylenko, A. S.
2017-11-01
The article considers the problem of control and verification of laser ceilometers' basic performance parameters and describes an alternative method based on the use of a multi-length fiber-optic delay line simulating an atmospheric track. The results of the described experiment demonstrate the great potential of this method for inspection and verification procedures for laser ceilometers.
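The delay-line idea can be quantified with a one-line relation: a fiber of length L and group index n delays the return pulse by t = nL/c, which a ceilometer interprets as an apparent height of ct/2 = nL/2. A back-of-the-envelope sketch with assumed index and lengths follows.

```python
# Back-of-the-envelope sketch of the relation above; the group index and fiber lengths
# are assumed values for illustration, not the delay line used in the article.
C = 299_792_458.0  # speed of light in m/s

def simulated_height_m(fiber_length_m, group_index=1.468):
    delay_s = group_index * fiber_length_m / C
    return C * delay_s / 2          # equals group_index * fiber_length_m / 2

for L in (100, 500, 2000):          # metres of fiber
    print(f"{L:5d} m of fiber -> apparent height {simulated_height_m(L):7.1f} m")
```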
NASA Astrophysics Data System (ADS)
Marconi, S.; Conti, E.; Christiansen, J.; Placidi, P.
2018-05-01
The operating conditions of the High Luminosity upgrade of the Large Hadron Collider are very demanding for the design of next generation hybrid pixel readout chips in terms of particle rate, radiation level and data bandwidth. To this purpose, the RD53 Collaboration has developed for the ATLAS and CMS experiments a dedicated simulation and verification environment using industry-consolidated tools and methodologies, such as SystemVerilog and the Universal Verification Methodology (UVM). This paper presents how the so-called VEPIX53 environment has first guided the design of digital architectures, optimized for processing and buffering very high particle rates, and secondly how it has been reused for the functional verification of the first large scale demonstrator chip designed by the collaboration, which has recently been submitted.
A Verification System for Distributed Objects with Asynchronous Method Calls
NASA Astrophysics Data System (ADS)
Ahrendt, Wolfgang; Dylla, Maximilian
We present a verification system for Creol, an object-oriented modeling language for concurrent distributed applications. The system is an instance of KeY, a framework for object-oriented software verification, which has so far been applied foremost to sequential Java. Building on KeY characteristic concepts, like dynamic logic, sequent calculus, explicit substitutions, and the taclet rule language, the system presented in this paper addresses functional correctness of Creol models featuring local cooperative thread parallelism and global communication via asynchronous method calls. The calculus heavily operates on communication histories which describe the interfaces of Creol units. Two example scenarios demonstrate the usage of the system.
Behavioral biometrics for verification and recognition of malicious software agents
NASA Astrophysics Data System (ADS)
Yampolskiy, Roman V.; Govindaraju, Venu
2008-04-01
Homeland security requires technologies capable of positive and reliable identification of humans for law enforcement, government, and commercial applications. As artificially intelligent agents improve in their abilities and become a part of our everyday life, the possibility of using such programs for undermining homeland security increases. Virtual assistants, shopping bots, and game playing programs are used daily by millions of people. We propose applying statistical behavior modeling techniques developed by us for recognition of humans to the identification and verification of intelligent and potentially malicious software agents. Our experimental results demonstrate feasibility of such methods for both artificial agent verification and even for recognition purposes.
RF model of the distribution system as a communication channel, phase 2. Volume 1: Summary Report
NASA Technical Reports Server (NTRS)
Rustay, R. C.; Gajjar, J. T.; Rankin, R. W.; Wentz, R. C.; Wooding, R.
1982-01-01
The design, implementation, and verification of a computerized model for predicting the steady-state sinusoidal response of radial (tree) configured distribution feeders were undertaken. That work demonstrated the feasibility and validity of the model based on verification measurements made on a limited-size portion of an actual live feeder. On that basis, a follow-on effort was conducted concerned with (1) extending the verification based on a greater variety of situations and network size, (2) extending the model capabilities for reverse direction propagation, (3) investigating parameter sensitivities, (4) improving transformer models, and (5) investigating procedures/fixes for ameliorating propagation trouble spots. Results are summarized.
NASA Technical Reports Server (NTRS)
Zoutendyk, J. A.; Smith, L. S.; Soli, G. A.; Thieberger, P.; Wegner, H. E.
1985-01-01
Single-Event Upset (SEU) response of a bipolar low-power Schottky-diode-clamped TTL static RAM has been observed using Br ions in the 100-240 MeV energy range and O ions in the 20-100 MeV range. These data complete the experimental verification of circuit-simulation SEU modeling for this device. The threshold for onset of SEU has been observed by the variation of energy, ion species and angle of incidence. The results obtained from the computer circuit-simulation modeling and experimental model verification demonstrate a viable methodology for modeling SEU in bipolar integrated circuits.
Deductive Evaluation: Formal Code Analysis With Low User Burden
NASA Technical Reports Server (NTRS)
Di Vito, Ben L.
2016-01-01
We describe a framework for symbolically evaluating iterative C code using a deductive approach that automatically discovers and proves program properties. Although verification is not performed, the method can infer detailed program behavior. Software engineering work flows could be enhanced by this type of analysis. Floyd-Hoare verification principles are applied to synthesize loop invariants, using a library of iteration-specific deductive knowledge. When needed, theorem proving is interleaved with evaluation and performed on the fly. Evaluation results take the form of inferred expressions and type constraints for values of program variables. An implementation using PVS (Prototype Verification System) is presented along with results for sample C functions.
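A small concrete example of the Floyd-Hoare style reasoning described above is given below: a loop whose invariant is stated explicitly and checked at run time here, whereas the framework in the abstract would synthesize and prove it symbolically. The function and invariant are illustrative, not output of that tool.

```python
# Illustrative loop with an explicit Floyd-Hoare loop invariant, checked at run time.
def array_sum(xs):
    total, i = 0, 0
    while i < len(xs):
        # Loop invariant: total == xs[0] + ... + xs[i-1]  and  0 <= i <= len(xs)
        assert total == sum(xs[:i]) and 0 <= i <= len(xs)
        total += xs[i]
        i += 1
    # Postcondition follows from the invariant with i == len(xs)
    assert total == sum(xs)
    return total

print(array_sum([3, 1, 4, 1, 5]))
```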
Federal Register 2010, 2011, 2012, 2013, 2014
2012-11-14
... subsequent soil samples showed levels of metals at or below generic residential criteria or background values... 1994- 1996 and additional sampling between 1998 and 2007. Area A--Site Entrance: Soil boring samples... verification samples. Additional soil samples were collected from the same location as the previous collection...
Material Selection for Cable Gland to Improved Reliability of the High-hazard Industries
NASA Astrophysics Data System (ADS)
Vashchuk, S. P.; Slobodyan, S. M.; Deeva, V. S.; Vashchuk, D. S.
2018-01-01
Sealed cable glands (SCGs) are available to ensure the safe connection of sheathed single-wire cables at high-hazard production facilities (nuclear power plants and others), as well as pilot cables, control cables, radio-frequency cables, and similar cabling. In this paper, we investigate the specifics of material selection for SCGs intended expressly for hazardous man-made facilities. We discuss the safe working conditions for cable glands. The research indicates that cables made from sintered powdered metals improve reliability owing to their material properties. A number of studies have provided verification of the material selection. Our findings indicate that double-glazed sealed units could further enhance reliability. We evaluated sample reliability under fire conditions, seismic load, and pressure containment failure. Mineral-insulated thermocouple cable samples were used.
Sridhar, L; Karthikraj, R; Lakshmi, V V S; Raju, N Prasada; Prabhakar, S
2014-08-01
Rapid detection and identification of chemical warfare agents and related precursors/degradation products in various environmental matrices is of paramount importance for verification of standards set by the chemical weapons convention (CWC). Nitrogen mustards, N,N-dialkylaminoethyl-2-chlorides, N,N-dialkylaminoethanols, N-alkyldiethanolamines, and triethanolamine, which are listed CWC scheduled chemicals, are prone to undergo N-oxidation in environmental matrices or during decontamination process. Thus, screening of the oxidized products of these compounds is also an important task in the verification process because the presence of these products reveals alleged use of nitrogen mustards or precursors of VX compounds. The N-oxides of aminoethanols and aminoethylchlorides easily produce [M + H]+ ions under electrospray ionization conditions, and their collision-induced dissociation spectra include a specific neutral loss of 48 u (OH + CH2OH) and 66 u (OH + CH2Cl), respectively. Based on this specific fragmentation, a rapid screening method was developed for screening of the N-oxides by applying neutral loss scan technique. The method was validated and the applicability of the method was demonstrated by analyzing positive and negative samples. The method was useful in the detection of N-oxides of aminoethanols and aminoethylchlorides in environmental matrices at trace levels (LOD, up to 500 ppb), even in the presence of complex masking agents, without the use of time-consuming sample preparation methods and chromatographic steps. This method is advantageous for the off-site verification program and also for participation in official proficiency tests conducted by the Organization for the Prohibition of Chemical Weapons (OPCW), the Netherlands. The structure of N-oxides can be confirmed by the MS/MS experiments on the detected peaks. A liquid chromatography-mass spectrometry (LC-MS) method was developed for the separation of isomeric N-oxides of aminoethanols and aminoethylchlorides using a C18 Hilic column. Critical isomeric compounds can be confirmed by LC-MS/MS experiments, after detecting the N-oxides from the neutral loss scanning method.
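The neutral-loss arithmetic behind the screening method is simple to state: a product ion 48 u below the [M + H]+ precursor flags an aminoethanol N-oxide, and a 66 u loss flags the chloride analogue. The sketch below computes the expected product-ion m/z values; the precursor masses are placeholders, not reference values.

```python
# Worked sketch of the neutral-loss arithmetic: expected product-ion m/z for the
# 48 u (OH + CH2OH) and 66 u (OH + CH2Cl) losses. Precursor m/z values are placeholders.
NEUTRAL_LOSSES = {"aminoethanol N-oxide": 48.0, "aminoethyl chloride N-oxide": 66.0}

def expected_product_ions(precursor_mz):
    return {name: round(precursor_mz - loss, 1) for name, loss in NEUTRAL_LOSSES.items()}

for mz in (134.1, 152.1):
    print(mz, expected_product_ions(mz))
```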
LLNL Genomic Assessment: Viral and Bacterial Sequencing Needs for TMTI, Task 1.4.2 Report
DOE Office of Scientific and Technical Information (OSTI.GOV)
Slezak, T; Borucki, M; Lam, M
Good progress has been made on both bacterial and viral sequencing by the TMTI centers. While access to appropriate samples is a limiting factor to throughput, excellent progress has been made with respect to getting agreements in place with key sources of relevant materials. Sharing of sequenced genomes funded by TMTI has been extremely limited to date. The April 2010 exercise should force a resolution to this, but additional managerial pressures may be needed to ensure that rapid sharing of TMTI-funded sequencing occurs, regardless of collaborator constraints concerning ultimate publication(s). Policies to permit TMTI-internal rapid sharing of sequenced genomes should be written into all TMTI agreements with collaborators now being negotiated. TMTI needs to establish a Web-based system for tracking samples destined for sequencing. This includes metadata on sample origins and contributor, information on sample shipment/receipt, prioritization by TMTI, assignment to one or more sequencing centers (including possible TMTI-sponsored sequencing at a contributor site), and status history of the sample sequencing effort. While this system could be a component of the AFRL system, it is not part of any current development effort. Policy and standardized procedures are needed to ensure appropriate verification of all TMTI samples prior to the investment in sequencing. PCR, arrays, and classical biochemical tests are examples of potential verification methods. Verification is needed to detect mislabeled, degraded, mixed or contaminated samples. Regular QC exercises are needed to ensure that the TMTI-funded centers are meeting all standards for producing quality genomic sequence data.
Direct and full-scale experimental verifications towards ground-satellite quantum key distribution
NASA Astrophysics Data System (ADS)
Wang, Jian-Yu; Yang, Bin; Liao, Sheng-Kai; Zhang, Liang; Shen, Qi; Hu, Xiao-Fang; Wu, Jin-Cai; Yang, Shi-Ji; Jiang, Hao; Tang, Yan-Lin; Zhong, Bo; Liang, Hao; Liu, Wei-Yue; Hu, Yi-Hua; Huang, Yong-Mei; Qi, Bo; Ren, Ji-Gang; Pan, Ge-Sheng; Yin, Juan; Jia, Jian-Jun; Chen, Yu-Ao; Chen, Kai; Peng, Cheng-Zhi; Pan, Jian-Wei
2013-05-01
Quantum key distribution (QKD) provides the only intrinsically unconditional secure method for communication based on the principle of quantum mechanics. Compared with fibre-based demonstrations, free-space links could provide the most appealing solution for communication over much larger distances. Despite significant efforts, all realizations to date rely on stationary sites. Experimental verifications are therefore extremely crucial for applications to a typical low Earth orbit satellite. To achieve direct and full-scale verifications of our set-up, we have carried out three independent experiments with a decoy-state QKD system, and overcome all conditions. The system is operated on a moving platform (using a turntable), on a floating platform (using a hot-air balloon), and with a high-loss channel to demonstrate performances under conditions of rapid motion, attitude change, vibration, random movement of satellites, and a high-loss regime. The experiments address wide ranges of all leading parameters relevant to low Earth orbit satellites. Our results pave the way towards ground-satellite QKD and a global quantum communication network.
Improving Patient Safety With Error Identification in Chemotherapy Orders by Verification Nurses.
Baldwin, Abigail; Rodriguez, Elizabeth S
2016-02-01
The prevalence of medication errors associated with chemotherapy administration is not precisely known. Little evidence exists concerning the extent or nature of errors; however, some evidence demonstrates that errors are related to prescribing. This article demonstrates how the review of chemotherapy orders by a designated nurse known as a verification nurse (VN) at a National Cancer Institute-designated comprehensive cancer center helps to identify prescribing errors that may prevent chemotherapy administration mistakes and improve patient safety in outpatient infusion units. This article will describe the role of the VN and details of the verification process. To identify benefits of the VN role, a retrospective review and analysis of chemotherapy near-miss events from 2009-2014 was performed. A total of 4,282 events related to chemotherapy were entered into the Reporting to Improve Safety and Quality system. A majority of the events were categorized as near-miss events, or those that, because of chance, did not result in patient injury, and were identified at the point of prescribing.
40 CFR 1065.342 - Sample dryer verification.
Code of Federal Regulations, 2013 CFR
2013-07-01
... condensation as required in § 1065.145(d)(1)(i). We recommend that the sample system components be maintained at least 5 °C above the local humidified gas dewpoint to prevent aqueous condensation. (5) Measure...
40 CFR 1065.342 - Sample dryer verification.
Code of Federal Regulations, 2014 CFR
2014-07-01
... condensation as required in § 1065.145(d)(1)(i). We recommend that the sample system components be maintained at least 5 °C above the local humidified gas dewpoint to prevent aqueous condensation. (5) Measure...
40 CFR 1065.342 - Sample dryer verification.
Code of Federal Regulations, 2012 CFR
2012-07-01
... condensation as required in § 1065.145(d)(1)(i). We recommend that the sample system components be maintained at least 5 °C above the local humidified gas dewpoint to prevent aqueous condensation. (5) Measure...
40 CFR 761.130 - Sampling requirements.
Code of Federal Regulations, 2014 CFR
2014-07-01
... developed by the Midwest Research Institute (MRI) for use in enforcement inspections: “Verification of PCB... the MRI report “Field Manual for Grid Sampling of PCB Spill Sites to Verify Cleanup.” Both the MRI...
40 CFR 761.130 - Sampling requirements.
Code of Federal Regulations, 2012 CFR
2012-07-01
... developed by the Midwest Research Institute (MRI) for use in enforcement inspections: “Verification of PCB... the MRI report “Field Manual for Grid Sampling of PCB Spill Sites to Verify Cleanup.” Both the MRI...
40 CFR 761.130 - Sampling requirements.
Code of Federal Regulations, 2013 CFR
2013-07-01
... developed by the Midwest Research Institute (MRI) for use in enforcement inspections: “Verification of PCB... the MRI report “Field Manual for Grid Sampling of PCB Spill Sites to Verify Cleanup.” Both the MRI...
Applying Independent Verification and Validation to Automatic Test Equipment
NASA Technical Reports Server (NTRS)
Calhoun, Cynthia C.
1997-01-01
This paper describes a general overview of applying Independent Verification and Validation (IV&V) to Automatic Test Equipment (ATE). The overview is not inclusive of all IV&V activities that can occur or of all development and maintenance items that can be validated and verified, during the IV&V process. A sampling of possible IV&V activities that can occur within each phase of the ATE life cycle are described.
NASA Astrophysics Data System (ADS)
Tang, Xiaoli; Lin, Tong; Jiang, Steve
2009-09-01
We propose a novel approach for potential online treatment verification using cine EPID (electronic portal imaging device) images for hypofractionated lung radiotherapy based on a machine learning algorithm. Hypofractionated radiotherapy requires high precision. It is essential to effectively monitor the target to ensure that the tumor is within the beam aperture. We modeled the treatment verification problem as a two-class classification problem and applied an artificial neural network (ANN) to classify the cine EPID images acquired during the treatment into corresponding classes—with the tumor inside or outside of the beam aperture. Training samples were generated for the ANN using digitally reconstructed radiographs (DRRs) with artificially added shifts in the tumor location—to simulate cine EPID images with different tumor locations. Principal component analysis (PCA) was used to reduce the dimensionality of the training samples and cine EPID images acquired during the treatment. The proposed treatment verification algorithm was tested on five hypofractionated lung patients in a retrospective fashion. On average, our proposed algorithm achieved a 98.0% classification accuracy, a 97.6% recall rate and a 99.7% precision rate. This work was first presented at the Seventh International Conference on Machine Learning and Applications, San Diego, CA, USA, 11-13 December 2008.
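A minimal sketch of the pipeline described above, PCA for dimensionality reduction followed by a small neural-network classifier, is shown below using generic scikit-learn components; the random arrays stand in for DRR-derived training frames and cine EPID images, and this is not the authors' implementation.

```python
# Minimal sketch (generic scikit-learn, not the authors' code): reduce flattened image
# vectors with PCA and train a small neural network to label frames as
# 1 = tumor inside the beam aperture, 0 = outside. The data are random stand-ins.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.neural_network import MLPClassifier
from sklearn.pipeline import make_pipeline

rng = np.random.default_rng(0)
n_train, n_pixels = 200, 64 * 64
X = rng.random((n_train, n_pixels))            # flattened DRR-like training frames
y = rng.integers(0, 2, size=n_train)           # class labels

model = make_pipeline(PCA(n_components=20),
                      MLPClassifier(hidden_layer_sizes=(16,), max_iter=500))
model.fit(X, y)

cine_frame = rng.random((1, n_pixels))         # a new EPID frame, flattened
print("predicted class:", model.predict(cine_frame)[0])
```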
A Methodology for Evaluating Artifacts Produced by a Formal Verification Process
NASA Technical Reports Server (NTRS)
Siminiceanu, Radu I.; Miner, Paul S.; Person, Suzette
2011-01-01
The goal of this study is to produce a methodology for evaluating the claims and arguments employed in, and the evidence produced by, formal verification activities. To illustrate the process, we conduct a full assessment of a representative case study for the Enabling Technology Development and Demonstration (ETDD) program. We assess the model checking and satisfiability solving techniques as applied to a suite of abstract models of fault tolerant algorithms which were selected to be deployed in Orion, namely the TTEthernet startup services specified and verified in the Symbolic Analysis Laboratory (SAL) by TTTech. To this end, we introduce the Modeling and Verification Evaluation Score (MVES), a metric that is intended to estimate the amount of trust that can be placed on the evidence that is obtained. The results of the evaluation process and the MVES can then be used by non-experts and evaluators in assessing the credibility of the verification results.
Feedback Seeking in Early Adolescence: Self-Enhancement or Self-Verification?
Rosen, Lisa H; Principe, Connor P; Langlois, Judith H
2013-02-13
The authors examined whether early adolescents ( N = 90) solicit self-enhancing feedback (i.e., positive feedback) or self-verifying feedback (i.e., feedback congruent with self-views, even when these views are negative). Sixth, seventh, and eighth graders first completed a self-perception measure and then selected whether to receive positive or negative feedback from an unknown peer in different domains of self. Results were consistent with self-verification theory; adolescents who perceived themselves as having both strengths and weaknesses were more likely to seek negative feedback regarding a self-perceived weakness compared to a self-perceived strength. The authors found similar support for self-verification processes when they considered the entire sample regardless of perceived strengths and weaknesses; hierarchical linear modeling (HLM) examined the predictive power of ratings of self-perceived ability, certainty, and importance on feedback seeking for all participants and provided additional evidence of self-verification strivings in adolescence.
Feedback Seeking in Early Adolescence: Self-Enhancement or Self-Verification?
Rosen, Lisa H.; Principe, Connor P.; Langlois, Judith H.
2012-01-01
The authors examined whether early adolescents (N = 90) solicit self-enhancing feedback (i.e., positive feedback) or self-verifying feedback (i.e., feedback congruent with self-views, even when these views are negative). Sixth, seventh, and eighth graders first completed a self-perception measure and then selected whether to receive positive or negative feedback from an unknown peer in different domains of self. Results were consistent with self-verification theory; adolescents who perceived themselves as having both strengths and weaknesses were more likely to seek negative feedback regarding a self-perceived weakness compared to a self-perceived strength. The authors found similar support for self-verification processes when they considered the entire sample regardless of perceived strengths and weaknesses; hierarchical linear modeling (HLM) examined the predictive power of ratings of self-perceived ability, certainty, and importance on feedback seeking for all participants and provided additional evidence of self-verification strivings in adolescence. PMID:23543746
Distilling the Verification Process for Prognostics Algorithms
NASA Technical Reports Server (NTRS)
Roychoudhury, Indranil; Saxena, Abhinav; Celaya, Jose R.; Goebel, Kai
2013-01-01
The goal of prognostics and health management (PHM) systems is to ensure system safety, and reduce downtime and maintenance costs. It is important that a PHM system is verified and validated before it can be successfully deployed. Prognostics algorithms are integral parts of PHM systems. This paper investigates a systematic process of verification of such prognostics algorithms. To this end, first, this paper distinguishes between technology maturation and product development. Then, the paper describes the verification process for a prognostics algorithm as it moves up to higher maturity levels. This process is shown to be an iterative process where verification activities are interleaved with validation activities at each maturation level. In this work, we adopt the concept of technology readiness levels (TRLs) to represent the different maturity levels of a prognostics algorithm. It is shown that at each TRL, the verification of a prognostics algorithm depends on verifying the different components of the algorithm according to the requirements laid out by the PHM system that adopts this prognostics algorithm. Finally, using simplified examples, the systematic process for verifying a prognostics algorithm is demonstrated as the prognostics algorithm moves up TRLs.
What is the Final Verification of Engineering Requirements?
NASA Technical Reports Server (NTRS)
Poole, Eric
2010-01-01
This slide presentation reviews the development process through the final verification of engineering requirements. The definition of the requirements is driven by basic needs and should be reviewed by both the supplier and the customer. All involved need to agree upon a formal requirements document, including any changes to the original requirements. After the requirements have been developed, the engineering team begins to design the system, and the final design is reviewed by other organizations. The final operational system must satisfy the original requirements, though many verifications should be performed during the process. The verification methods used are test, inspection, analysis, and demonstration. The verification plan should be created once the system requirements are documented; it should ensure that every requirement is formally verified, that the methods and responsible organizations are specified, and that the plan is reviewed by all parties. The presentation also discusses the option of keeping the engineering team involved in all phases of development, as opposed to having another organization continue the process once the design is complete.
This Test and Quality Assurance Plan (TQAP) provides data quality objectives for the success factors validated during this demonstration, which include energy production, emissions and emission reductions compared to alternative systems, economics, and operability, including r...
NASA Technical Reports Server (NTRS)
1976-01-01
The framework within which the Applications Systems Verification Tests (ASVTs) are performed and the economic consequences of improved meteorological information demonstrated is described. This framework considers the impact of improved information on decision processes, the data needs to demonstrate the economic impact of the improved information, the data availability, the methodology for determining and analyzing the collected data and demonstrating the economic impact of the improved information, and the possible methods of data collection. Three ASVTs are considered and program outlines and plans are developed for performing experiments to demonstrate the economic consequences of improved meteorological information. The ASVTs are concerned with the citrus crop in Florida, the cotton crop in Mississippi and a group of diverse crops in Oregon. The program outlines and plans include schedules, manpower estimates and funding requirements.
Ontology Matching with Semantic Verification.
Jean-Mary, Yves R; Shironoshita, E Patrick; Kabuka, Mansur R
2009-09-01
ASMOV (Automated Semantic Matching of Ontologies with Verification) is a novel algorithm that uses lexical and structural characteristics of two ontologies to iteratively calculate a similarity measure between them, derives an alignment, and then verifies it to ensure that it does not contain semantic inconsistencies. In this paper, we describe the ASMOV algorithm, and then present experimental results that measure its accuracy using the OAEI 2008 tests, and that evaluate its use with two different thesauri: WordNet, and the Unified Medical Language System (UMLS). These results show the increased accuracy obtained by combining lexical, structural and extensional matchers with semantic verification, and demonstrate the advantage of using a domain-specific thesaurus for the alignment of specialized ontologies.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Paul, J. N.; Chin, M. R.; Sjoden, G. E.
2013-07-01
A mobile 'drive by' passive radiation detection system to be applied in special nuclear materials (SNM) storage facilities for validation and compliance purposes has been designed through the use of computational modeling and new radiation detection methods. This project was the result of work over a 1 year period to create optimal design specifications to include creation of 3D models using both Monte Carlo and deterministic codes to characterize the gamma and neutron leakage out each surface of SNM-bearing canisters. Results were compared and agreement was demonstrated between both models. Container leakages were then used to determine the expected reaction rates using transport theory in the detectors when placed at varying distances from the can. A 'typical' background signature was incorporated to determine the minimum signatures versus the probability of detection to evaluate moving source protocols with collimation. This established the criteria for verification of source presence and time gating at a given vehicle speed. New methods for the passive detection of SNM were employed and shown to give reliable identification of age and material for highly enriched uranium (HEU) and weapons grade plutonium (WGPu). The finalized 'Mobile Pit Verification System' (MPVS) design demonstrated that a 'drive-by' detection system, collimated and operating at nominally 2 mph, is capable of rapidly verifying each and every weapon pit stored in regularly spaced, shelved storage containers, using completely passive gamma and neutron signatures for HEU and WGPu. This system is ready for real evaluation to demonstrate passive total material accountability in storage facilities. (authors)
VAVUQ, Python and Matlab freeware for Verification and Validation, Uncertainty Quantification
NASA Astrophysics Data System (ADS)
Courtney, J. E.; Zamani, K.; Bombardelli, F. A.; Fleenor, W. E.
2015-12-01
A package of scripts is presented for automated Verification and Validation (V&V) and Uncertainty Quantification (UQ) for engineering codes that approximate Partial Differential Equations (PDEs). The code post-processes model results to produce V&V and UQ information. This information can be used to assess model performance. Automated information on code performance can allow for a systematic methodology to assess the quality of model approximations. The software implements common and accepted code verification schemes. The software uses the Method of Manufactured Solutions (MMS), the Method of Exact Solution (MES), Cross-Code Verification, and Richardson Extrapolation (RE) for solution (calculation) verification. It also includes common statistical measures that can be used for model skill assessment. Complete RE can be conducted for complex geometries by implementing high-order non-oscillating numerical interpolation schemes within the software. Model approximation uncertainty is quantified by calculating lower and upper bounds of numerical error from the RE results. The software is also able to calculate the Grid Convergence Index (GCI), and to handle adaptive meshes and models that implement mixed order schemes. Four examples are provided to demonstrate the use of the software for code and solution verification, model validation and uncertainty quantification. The software is used for code verification of a mixed-order compact difference heat transport solver; the solution verification of a 2D shallow-water-wave solver for tidal flow modeling in estuaries; the model validation of a two-phase flow computation in a hydraulic jump compared to experimental data; and numerical uncertainty quantification for 3D CFD modeling of the flow patterns in a Gust erosion chamber.
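As a concrete example of the solution-verification machinery listed above, the sketch below evaluates the observed order of accuracy, the Richardson-extrapolated value, and the Grid Convergence Index from a scalar result computed on three systematically refined grids. It illustrates the standard RE/GCI formulas rather than VAVUQ itself; the input values and the safety factor of 1.25 are assumptions.

```python
# Richardson extrapolation and GCI from three grid levels (illustrative sketch).
import math

def richardson_gci(f_fine, f_med, f_coarse, r, safety=1.25):
    # Observed order of accuracy from the three solutions.
    p = math.log(abs(f_coarse - f_med) / abs(f_med - f_fine)) / math.log(r)
    # Richardson-extrapolated estimate of the grid-independent value.
    f_exact = f_fine + (f_fine - f_med) / (r**p - 1.0)
    # Grid Convergence Index on the fine grid, as a percent error band.
    gci_fine = safety * abs((f_fine - f_med) / f_fine) / (r**p - 1.0) * 100.0
    return p, f_exact, gci_fine

# Hypothetical values of one output quantity on coarse, medium, and fine grids
# with a constant refinement factor r = 2.
p, f_ext, gci = richardson_gci(f_fine=0.9713, f_med=0.9688, f_coarse=0.9612, r=2.0)
print(f"observed order p = {p:.2f}, extrapolated value = {f_ext:.4f}, GCI = {gci:.2f}%")
```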
VEG-01: Veggie Hardware Verification Testing
NASA Technical Reports Server (NTRS)
Massa, Gioia; Newsham, Gary; Hummerick, Mary; Morrow, Robert; Wheeler, Raymond
2013-01-01
The Veggie plant/vegetable production system is scheduled to fly on ISS at the end of 2013. Since much of the technology associated with Veggie has not been previously tested in microgravity, a hardware validation flight was initiated. This test will allow data to be collected about Veggie hardware functionality on ISS, allow crew interactions to be vetted for future improvements, validate the ability of the hardware to grow and sustain plants, and collect data that will be helpful to future Veggie investigators as they develop their payloads. Additionally, food safety data on the lettuce plants grown will be collected to help support the development of a pathway for the crew to safely consume produce grown on orbit. Significant background research has been performed on the Veggie plant growth system, with early tests focusing on the development of the rooting pillow concept, and the selection of fertilizer, rooting medium and plant species. More recent testing has been conducted to integrate the pillow concept into the Veggie hardware and to ensure that adequate water is provided throughout the growth cycle. Seed sanitation protocols have been established for flight, and hardware sanitation between experiments has been studied. Methods for shipping and storage of rooting pillows and the development of crew procedures and crew training videos for plant activities on-orbit have been established. Science verification testing was conducted, and lettuce plants were successfully grown in prototype Veggie hardware; microbial samples were taken, and plants were harvested, frozen, stored, and later analyzed for microbial growth, nutrients, and ATP levels. An additional verification test, prior to the final payload verification testing, is desired to demonstrate similar growth in the flight hardware and also to test a second set of pillows containing zinnia seeds. Issues with root mat water supply are being resolved, with final testing and flight scheduled for later in 2013.
Secure Image Hash Comparison for Warhead Verification
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bruillard, Paul J.; Jarman, Kenneth D.; Robinson, Sean M.
2014-06-06
The effort to inspect and verify warheads in the context of possible future arms control treaties is rife with security and implementation issues. In this paper we review prior work on perceptual image hashing for template-based warhead verification. Furthermore, we formalize the notion of perceptual hashes and demonstrate that large classes of such functions are likely not cryptographically secure. We close with a brief discussion of fully homomorphic encryption as an alternative technique.
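For readers unfamiliar with perceptual hashing, the sketch below implements a generic 'average hash'. It is not the scheme analyzed in the paper and the inputs are synthetic, but it shows the defining property: small perturbations of the input change few hash bits, which is also why such constructions are generally not cryptographically secure.

```python
# Generic average-hash sketch (assumed example, not the paper's hash function).
import numpy as np

def average_hash(image: np.ndarray, size: int = 8) -> int:
    # Block-mean downsample to size x size, threshold at the mean, pack bits.
    h, w = image.shape
    trimmed = image[: h - h % size, : w - w % size]
    small = trimmed.reshape(size, h // size, size, w // size).mean(axis=(1, 3))
    bits = (small > small.mean()).flatten()
    return int("".join("1" if b else "0" for b in bits), 2)

rng = np.random.default_rng(1)
img = rng.random((64, 64))
perturbed = img + 0.001 * rng.random((64, 64))
# Hamming distance between the two hashes stays small under small perturbations.
print(bin(average_hash(img) ^ average_hash(perturbed)).count("1"))
```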
ENVIRONMENTAL TECHNOLOGY VERIFICATION REPORT - 4100 VAPOR DETECTOR - ELECTRONIC SENSOR TECHNOLOGY
In July 1997, the U.S. Environmental Protection Agency conducted a demonstration of polychlorinated biphenyl (PCB) FIELD ANALYTICAL TECHNIQUES. The demonstration design was subjected to extensive review and comment by EPA's National Exposure Research Laboratory (NERL) Environmen...
DOE Office of Scientific and Technical Information (OSTI.GOV)
J. M. Capron
2008-05-30
The 100-F-44:2 waste site is a steel pipeline that was discovered in a junction box during confirmatory sampling of the 100-F-26:4 pipeline from December 2004 through January 2005. The 100-F-44:2 pipeline feeds into the 100-F-26:4 subsite vitrified clay pipe (VCP) process sewer pipeline from the 108-F Biology Laboratory at the junction box. In accordance with this evaluation, the confirmatory sampling results support a reclassification of this site to No Action. The current site conditions achieve the remedial action objectives and the corresponding remedial action goals established in the Remaining Sites ROD. The results of confirmatory sampling show that residual contaminant concentrations do not preclude any future uses and allow for unrestricted use of shallow zone soils. The results also demonstrate that residual contaminant concentrations are protective of groundwater and the Columbia River.
Ultrasonic Method for Deployment Mechanism Bolt Element Preload Verification
NASA Technical Reports Server (NTRS)
Johnson, Eric C.; Kim, Yong M.; Morris, Fred A.; Mitchell, Joel; Pan, Robert B.
2014-01-01
Deployment mechanisms play a pivotal role in mission success. These mechanisms often incorporate bolt elements for which a preload within a specified range is essential for proper operation. A common practice is to torque these bolt elements to a specified value during installation. The resulting preload, however, can vary significantly with applied torque for a number of reasons. The goal of this effort was to investigate ultrasonic methods as an alternative for bolt preload verification in such deployment mechanisms. A family of non-explosive release mechanisms widely used by satellite manufacturers was chosen for the work. A willing contractor permitted measurements on a sampling of bolt elements for these release mechanisms that were installed by a technician following a standard practice. A variation of approximately 50% (+/- 25%) in the resultant preloads was observed. An alternative ultrasonic method to set the preloads was then developed and calibration data was accumulated. The method was demonstrated on bolt elements installed in a fixture instrumented with a calibrated load cell and designed to mimic production practice. The ultrasonic method yielded results within +/- 3% of the load cell reading. The contractor has since adopted the alternative method for its future production.
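A simplified picture of the calibration step described above: over the working range, the change in ultrasonic round-trip time of flight in the bolt grows roughly linearly with axial preload, so a load-cell calibration yields a slope that converts a measured time shift into an estimated preload. The numbers and the linear model in this sketch are assumptions for illustration, not data from the study.

```python
# Illustrative time-of-flight vs. preload calibration (made-up numbers).
import numpy as np

load_N = np.array([0.0, 2000.0, 4000.0, 6000.0, 8000.0])   # load-cell readings
tof_shift_ns = np.array([0.0, 1.1, 2.2, 3.2, 4.3])          # measured ToF changes

# Fit a straight line: ToF shift [ns] = slope * load [N] + intercept.
slope, intercept = np.polyfit(load_N, tof_shift_ns, 1)

measured_shift_ns = 2.7
preload_estimate_N = (measured_shift_ns - intercept) / slope
print(f"estimated preload ~ {preload_estimate_N:.0f} N")
```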
CMB lensing tomography with the DES Science Verification galaxies
DOE Office of Scientific and Technical Information (OSTI.GOV)
Giannantonio, T.
We measure the cross-correlation between the galaxy density in the Dark Energy Survey (DES) Science Verification data and the lensing of the cosmic microwave background (CMB) as reconstructed with the Planck satellite and the South Pole Telescope (SPT). When using the DES main galaxy sample over the full redshift range 0.2 < z phot < 1.2, a cross-correlation signal is detected at 6σ and 4σ with SPT and Planck respectively. We then divide the DES galaxies into five photometric redshift bins, finding significant (>2σ) detections in all bins. Comparing to the fiducial Planck cosmology, we find the redshift evolution of the signal matches expectations, although the amplitude is consistently lower than predicted across redshift bins. We test for possible systematics that could affect our result and find no evidence for significant contamination. Finally, we demonstrate how these measurements can be used to constrain the growth of structure across cosmic time. We find the data are fit by a model in which the amplitude of structure in the z < 1.2 universe is 0.73 ± 0.16 times as large as predicted in the LCDM Planck cosmology, a 1.7σ deviation.
CMB lensing tomography with the DES Science Verification galaxies
Giannantonio, T.
2016-01-07
We measure the cross-correlation between the galaxy density in the Dark Energy Survey (DES) Science Verification data and the lensing of the cosmic microwave background (CMB) as reconstructed with the Planck satellite and the South Pole Telescope (SPT). When using the DES main galaxy sample over the full redshift range 0.2 < z phot < 1.2, a cross-correlation signal is detected at 6σ and 4σ with SPT and Planck respectively. We then divide the DES galaxies into five photometric redshift bins, finding significant (>2σ) detections in all bins. Comparing to the fiducial Planck cosmology, we find the redshift evolution of the signal matches expectations, although the amplitude is consistently lower than predicted across redshift bins. We test for possible systematics that could affect our result and find no evidence for significant contamination. Finally, we demonstrate how these measurements can be used to constrain the growth of structure across cosmic time. We find the data are fit by a model in which the amplitude of structure in the z < 1.2 universe is 0.73 ± 0.16 times as large as predicted in the LCDM Planck cosmology, a 1.7σ deviation.
Analysis of Nerve Agent Metabolites from Hair for Long-Term Verification of Nerve Agent Exposure
2016-05-09
Amanda S. Appel; John H. McDonough; Joseph D...
...feasible. In this study, hair was evaluated as a long-term repository of nerve agent hydrolysis products. Pinacolyl methylphosphonic acid (PMPA...hydrolysis product of soman) and isopropyl methylphosphonic acid (IMPA; hydrolysis product of sarin) were extracted from hair samples with N,N...
NASA Technical Reports Server (NTRS)
Sanders, Gerald B.; Araghi, Koorosh; Ess, Kim M.; Valencia, Lisa M.; Muscatello, Anthony C.; Calle, Carlos I.; Clark, Larry; Iacomini, Christie
2014-01-01
The making of oxygen from resources in the Martian atmosphere, known as In Situ Resource Utilization (ISRU), has the potential to provide substantial benefits for future robotic and human exploration. In particular, the ability to produce oxygen on Mars for use in propulsion, life support, and power systems can provide significant mission benefits such as reducing launch mass, lander size, and mission and crew risk. To advance ISRU for possible incorporation into future human missions to Mars, NASA proposed including an ISRU instrument on the Mars 2020 rover mission through an announcement of opportunity (AO). The purpose of the Mars Atmosphere Resource Verification INsitu (MARVIN) instrument is to provide the first demonstration on Mars of oxygen production from acquired and stored Martian atmospheric carbon dioxide, as well as take measurements of atmospheric pressure and temperature, and of suspended dust particle sizes and amounts entrained in collected atmosphere gases at different times of the Mars day and year. The hardware performance and environmental data obtained will be critical for future ISRU systems that will reduce the mass of propellants and other consumables launched from Earth for robotic and human exploration, for better understanding of Mars dust and mitigation techniques to improve crew safety, and to help further define Mars global circulation models and better understand the regional atmospheric dynamics on Mars. The technologies selected for MARVIN are also scalable for future robotic sample return and human missions to Mars using ISRU.
Urine sampling and collection system
NASA Technical Reports Server (NTRS)
Fogal, G. L.; Mangialardi, J. K.; Reinhardt, C. G.
1971-01-01
This specification defines the performance and design requirements for the urine sampling and collection system engineering model and establishes requirements for its design, development, and test. The model shall provide conceptual verification of a system applicable to manned space flight which will automatically provide for collection, volume sensing, and sampling of urine.
Verification bias: an underrecognized source of error in assessing the efficacy of medical imaging.
Petscavage, Jonelle M; Richardson, Michael L; Carr, Robert B
2011-03-01
Diagnostic tests are validated by comparison against a "gold standard" reference test. When the reference test is invasive or expensive, it may not be applied to all patients. This can result in biased estimates of the sensitivity and specificity of the diagnostic test. This type of bias is called "verification bias," and is a common problem in imaging research. The purpose of our study is to estimate the prevalence of verification bias in the recent radiology literature. All issues of the American Journal of Roentgenology (AJR), Academic Radiology, Radiology, and European Journal of Radiology (EJR) between November 2006 and October 2009 were reviewed for original research articles mentioning sensitivity or specificity as endpoints. Articles were read to determine whether verification bias was present and whether the authors acknowledged verification bias in the design. Over these 3 years, the journals published 2969 original research articles. A total of 776 articles used sensitivity or specificity as an outcome. Of these, 211 articles demonstrated potential verification bias. The fraction of articles with potential bias was respectively 36.4%, 23.4%, 29.5%, and 13.4% for AJR, Academic Radiology, Radiology, and EJR. The total fraction of papers with potential bias in which the authors acknowledged this bias was 17.1%. Verification bias is a common and frequently unacknowledged source of error in efficacy studies of diagnostic imaging. Bias can often be eliminated by proper study design. When it cannot be eliminated, it should be estimated and acknowledged.
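To make the mechanism concrete, the hypothetical numbers below show how sensitivity estimated only from patients who received the reference test can be inflated, and how a Begg-and-Greenes-style reweighting adjusts for it under the assumption that verification depends only on the index test result. This is an illustration, not a reanalysis of the surveyed literature.

```python
# Verification bias illustration with made-up counts.
def corrected_sensitivity(tp, fn_verified, verif_rate_pos, verif_rate_neg):
    # Scale verified counts back up by the verification rate for each test result.
    return (tp / verif_rate_pos) / (tp / verif_rate_pos + fn_verified / verif_rate_neg)

tp, fn_verified = 90, 5           # diseased patients among those who were verified
naive = tp / (tp + fn_verified)   # computed from verified patients only
corrected = corrected_sensitivity(tp, fn_verified, verif_rate_pos=1.0, verif_rate_neg=0.2)
print(f"naive sensitivity = {naive:.3f}, corrected = {corrected:.3f}")  # 0.947 vs 0.783
```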
Compliance and Verification of Standards and Labelling Programs in China: Lessons Learned
DOE Office of Scientific and Technical Information (OSTI.GOV)
Saheb, Yamina; Zhou, Nan; Fridley, David
2010-06-11
After implementing several energy efficiency standards and labels (30 products covered by MEPS, 50 products covered by voluntary labels and 19 products by mandatory labels), the China National Institute of Standardization (CNIS) is now implementing verification and compliance mechanisms to ensure that the energy information of labeled products complies with the requirements of their labels. CNIS is doing so by organizing check testing on a random basis for room air-conditioners, refrigerators, motors, heaters, computer displays, ovens, and self-ballasted lamps. The purpose of the check testing is to understand the implementation of the Chinese labeling scheme and help local authorities establish effective compliance mechanisms. In addition, to ensure robustness and consistency of testing results, CNIS has coordinated a round robin testing for room air conditioners. Eight laboratories (Chinese (6), Australian (1) and Japanese (1)) have been involved in the round robin testing and tests were performed on four sets of samples selected from the manufacturer's production line. This paper describes the methodology used in undertaking both check and round robin testing, provides analysis of testing results and reports on the findings. The analysis of both check and round robin testing demonstrated the benefits of a regularized verification and monitoring system for both laboratories and products, such as (i) identifying the possible deviations between laboratories to correct them, (ii) improving the quality of testing facilities, and (iii) ensuring the accuracy and reliability of energy label information in order to strengthen the social credibility of the labeling program and the enforcement mechanism in place.
Compliance and Verification of Standards and Labeling Programs in China: Lessons Learned
DOE Office of Scientific and Technical Information (OSTI.GOV)
Saheb, Yamina; Zhou, Nan; Fridley, David
2010-08-01
After implementing several energy efficiency standards and labels (30 products covered by MEPS, 50 products covered by voluntary labels and 19 products by mandatory labels), the China National Institute of Standardization (CNIS) is now implementing verification and compliance mechanisms to ensure that the energy information of labeled products complies with the requirements of their labels. CNIS is doing so by organizing check testing on a random basis for room air-conditioners, refrigerators, motors, heaters, computer displays, ovens, and self-ballasted lamps. The purpose of the check testing is to understand the implementation of the Chinese labeling scheme and help local authorities establish effective compliance mechanisms. In addition, to ensure robustness and consistency of testing results, CNIS has coordinated a round robin testing for room air conditioners. Eight laboratories (Chinese (6), Australian (1) and Japanese (1)) have been involved in the round robin testing and tests were performed on four sets of samples selected from the manufacturer's production line. This paper describes the methodology used in undertaking both check and round robin testing, provides analysis of testing results and reports on the findings. The analysis of both check and round robin testing demonstrated the benefits of a regularized verification and monitoring system for both laboratories and products, such as (i) identifying the possible deviations between laboratories to correct them, (ii) improving the quality of testing facilities, and (iii) ensuring the accuracy and reliability of energy label information in order to strengthen the social credibility of the labeling program and the enforcement mechanism in place.
A probabilistic verification score for contours demonstrated with idealized ice-edge forecasts
NASA Astrophysics Data System (ADS)
Goessling, Helge; Jung, Thomas
2017-04-01
We introduce a probabilistic verification score for ensemble-based forecasts of contours: the Spatial Probability Score (SPS). Defined as the spatial integral of local (Half) Brier Scores, the SPS can be considered the spatial analog of the Continuous Ranked Probability Score (CRPS). Applying the SPS to idealized seasonal ensemble forecasts of the Arctic sea-ice edge in a global coupled climate model, we demonstrate that the SPS responds properly to ensemble size, bias, and spread. When applied to individual forecasts or ensemble means (or quantiles), the SPS reduces to the 'volume' of mismatch, which for the ice edge corresponds to the Integrated Ice Edge Error (IIEE).
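In symbols (notation introduced here for illustration, not quoted from the paper): writing p(x) for the ensemble-derived probability that location x lies inside the forecast contour, o(x) in {0, 1} for the observed indicator, and A for the verification domain,

```latex
\mathrm{SPS} = \int_{A} \bigl( p(x) - o(x) \bigr)^{2} \, \mathrm{d}A .
```

For a single deterministic forecast p(x) takes only the values 0 or 1, so the integrand is 1 exactly where forecast and observation disagree and the integral collapses to the mismatch area, consistent with the IIEE reduction noted in the abstract.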
2015-03-01
ARL-TR-7244, March 2015, US Army Research Laboratory: Verification and Demonstration for Transition of Non-Hexavalent Chromium, Low-VOC Alternative Technologies to Replace DOD-P-15328 Wash Primer for Multimetal Applications, by John Kelley...
Verification and Validation of Adaptive and Intelligent Systems with Flight Test Results
NASA Technical Reports Server (NTRS)
Burken, John J.; Larson, Richard R.
2009-01-01
The F-15 IFCS project goals are: a) demonstrate control approaches that can efficiently optimize aircraft performance in both normal and failure conditions ([A] and [B] failures); and b) advance neural network-based flight control technology for new aerospace system designs with a pilot in the loop. Gen II objectives include: a) implement and fly a direct adaptive neural network-based flight controller; b) demonstrate the ability of the system to adapt to simulated system failures by 1) suppressing transients associated with the failure and 2) re-establishing sufficient control and handling of the vehicle for safe recovery; and c) provide flight experience for development of verification and validation processes for flight-critical neural network software.
Liu, Yong; Cao, Yu; Li, Yaxiong; Lei, Dongyun; Li, Lin; Hou, Zong Liu; Han, Shen; Meng, Mingyao; Shi, Jianlin; Zhang, Yayong; Wang, Yi; Niu, Zhaoyi; Xie, Yanhua; Xiao, Benshan; Wang, Yuanfei; Li, Xiao; Yang, Lirong
2018-01-01
Background Recently, mutations in several genes have been described to be associated with sporadic ASD, but some genetic variants remain to be identified. The aim of this study was to use whole-exome sequencing (WES) combined with bioinformatics analysis to identify novel genetic variants in cases of sporadic congenital ASD, followed by validation by Sanger sequencing. Material/Methods Five Han patients with secundum ASD were recruited, and their tissue samples were analyzed by WES, followed by verification by Sanger sequencing of tissue and blood samples. Further evaluation using blood samples included 452 additional patients with sporadic secundum ASD (212 male and 240 female patients) and 519 healthy subjects (252 male and 267 female subjects) for further verification by a multiplexed MassARRAY system. Bioinformatic analyses were performed to identify novel genetic variants associated with sporadic ASD. Results From five patients with sporadic ASD, a total of 181,762 genomic variants in 33 exon loci, validated by Sanger sequencing, were selected and underwent MassARRAY analysis in 452 patients with ASD and 519 healthy subjects. Three loci with high mutation frequencies, the 138665410 FOXL2 gene variant, the 23862952 MYH6 gene variant, and the 71098693 HYDIN gene variant were found to be significantly associated with sporadic ASD (P<0.05); variants in FOXL2 and MYH6 were found in patients with isolated, sporadic ASD (P<5×10^-4). Conclusions This was the first study that demonstrated variants in FOXL2 and HYDIN associated with sporadic ASD, and supported the use of WES and bioinformatics analysis to identify disease-associated mutations. PMID:29505555
Liu, Yong; Cao, Yu; Li, Yaxiong; Lei, Dongyun; Li, Lin; Hou, Zong Liu; Han, Shen; Meng, Mingyao; Shi, Jianlin; Zhang, Yayong; Wang, Yi; Niu, Zhaoyi; Xie, Yanhua; Xiao, Benshan; Wang, Yuanfei; Li, Xiao; Yang, Lirong; Wang, Wenju; Jiang, Lihong
2018-03-05
BACKGROUND Recently, mutations in several genes have been described to be associated with sporadic ASD, but some genetic variants remain to be identified. The aim of this study was to use whole-exome sequencing (WES) combined with bioinformatics analysis to identify novel genetic variants in cases of sporadic congenital ASD, followed by validation by Sanger sequencing. MATERIAL AND METHODS Five Han patients with secundum ASD were recruited, and their tissue samples were analyzed by WES, followed by verification by Sanger sequencing of tissue and blood samples. Further evaluation using blood samples included 452 additional patients with sporadic secundum ASD (212 male and 240 female patients) and 519 healthy subjects (252 male and 267 female subjects) for further verification by a multiplexed MassARRAY system. Bioinformatic analyses were performed to identify novel genetic variants associated with sporadic ASD. RESULTS From five patients with sporadic ASD, a total of 181,762 genomic variants in 33 exon loci, validated by Sanger sequencing, were selected and underwent MassARRAY analysis in 452 patients with ASD and 519 healthy subjects. Three loci with high mutation frequencies, the 138665410 FOXL2 gene variant, the 23862952 MYH6 gene variant, and the 71098693 HYDIN gene variant were found to be significantly associated with sporadic ASD (P<0.05); variants in FOXL2 and MYH6 were found in patients with isolated, sporadic ASD (P<5×10^-4). CONCLUSIONS This was the first study that demonstrated variants in FOXL2 and HYDIN associated with sporadic ASD, and supported the use of WES and bioinformatics analysis to identify disease-associated mutations.
Implications of sampling design and sample size for national carbon accounting systems
Michael Köhl; Andrew Lister; Charles T. Scott; Thomas Baldauf; Daniel Plugge
2011-01-01
Countries willing to adopt a REDD regime need to establish a national Measurement, Reporting and Verification (MRV) system that provides information on forest carbon stocks and carbon stock changes. Due to the extensive areas covered by forests the information is generally obtained by sample based surveys. Most operational sampling approaches utilize a combination of...
DOE Office of Scientific and Technical Information (OSTI.GOV)
L. M. Dittmer
2006-09-27
The 100-B-20 waste site, located in the 100-BC-1 Operable Unit of the Hanford Site, consisted of an underground oil tank that once serviced the 1716-B Maintenance Garage. The selected action for the 100-B-20 waste site involved removal of the oil tanks and their contents and demonstrating through confirmatory sampling that all cleanup goals have been met. In accordance with this evaluation, a reclassification status of interim closed out has been determined. The results demonstrate that the site will support future unrestricted land uses that can be represented by a rural-residential scenario. These results also show that residual concentrations support unrestricted future use of shallow zone soil and that contaminant levels remaining in the soil are protective of groundwater and the Columbia River.
DOE Office of Scientific and Technical Information (OSTI.GOV)
J. M. Capron
2008-04-17
The 100-F-54 waste site, part of the 100-FR-2 Operable Unit, is the soil associated with the former pastures for holding domestic farm animals used in experimental toxicology studies. Evaluation of historical information resulted in identification of the experimental animal farm pastures as having potential residual soil contamination due to excrement from experimental animals. The 100-F-54 animal farm pastures confirmatory sampling results support a reclassification of this site to No Action. The current site conditions achieve the remedial action objectives and the corresponding remedial action goals established in the Remaining Sites ROD. The results of confirmatory sampling show that residual contaminant concentrations do not preclude any future uses and allow for unrestricted use of shallow zone soils. The results also demonstrate that residual contaminant concentrations are protective of groundwater and the Columbia River.
NASA Astrophysics Data System (ADS)
Connick, Robert J.
Accurate measurement of normal incidence transmission loss is essential for the acoustic characterization of building materials. In this research, a method of measuring normal incidence sound transmission loss proposed by Salissou et al. as a complement to standard E2611-09 of the American Society for Testing and Materials [Standard Test Method for Measurement of Normal Incidence Sound Transmission of Acoustical Materials Based on the Transfer Matrix Method (American Society for Testing and Materials, New York, 2009)] is verified. Two samples from the original literature are used to verify the method as well as a Filtros RTM sample. Following the verification, several nano-material Aerogel samples are measured.
A physical zero-knowledge object-comparison system for nuclear warhead verification
Philippe, Sébastien; Goldston, Robert J.; Glaser, Alexander; d'Errico, Francesco
2016-01-01
Zero-knowledge proofs are mathematical cryptographic methods to demonstrate the validity of a claim while providing no further information beyond the claim itself. The possibility of using such proofs to process classified and other sensitive physical data has attracted attention, especially in the field of nuclear arms control. Here we demonstrate a non-electronic fast neutron differential radiography technique using superheated emulsion detectors that can confirm that two objects are identical without revealing their geometry or composition. Such a technique could form the basis of a verification system that could confirm the authenticity of nuclear weapons without sharing any secret design information. More broadly, by demonstrating a physical zero-knowledge proof that can compare physical properties of objects, this experiment opens the door to developing other such secure proof-systems for other applications. PMID:27649477
A physical zero-knowledge object-comparison system for nuclear warhead verification.
Philippe, Sébastien; Goldston, Robert J; Glaser, Alexander; d'Errico, Francesco
2016-09-20
Zero-knowledge proofs are mathematical cryptographic methods to demonstrate the validity of a claim while providing no further information beyond the claim itself. The possibility of using such proofs to process classified and other sensitive physical data has attracted attention, especially in the field of nuclear arms control. Here we demonstrate a non-electronic fast neutron differential radiography technique using superheated emulsion detectors that can confirm that two objects are identical without revealing their geometry or composition. Such a technique could form the basis of a verification system that could confirm the authenticity of nuclear weapons without sharing any secret design information. More broadly, by demonstrating a physical zero-knowledge proof that can compare physical properties of objects, this experiment opens the door to developing other such secure proof-systems for other applications.
A physical zero-knowledge object-comparison system for nuclear warhead verification
NASA Astrophysics Data System (ADS)
Philippe, Sébastien; Goldston, Robert J.; Glaser, Alexander; D'Errico, Francesco
2016-09-01
Zero-knowledge proofs are mathematical cryptographic methods to demonstrate the validity of a claim while providing no further information beyond the claim itself. The possibility of using such proofs to process classified and other sensitive physical data has attracted attention, especially in the field of nuclear arms control. Here we demonstrate a non-electronic fast neutron differential radiography technique using superheated emulsion detectors that can confirm that two objects are identical without revealing their geometry or composition. Such a technique could form the basis of a verification system that could confirm the authenticity of nuclear weapons without sharing any secret design information. More broadly, by demonstrating a physical zero-knowledge proof that can compare physical properties of objects, this experiment opens the door to developing other such secure proof-systems for other applications.
A physical zero-knowledge object-comparison system for nuclear warhead verification
Philippe, Sébastien; Goldston, Robert J.; Glaser, Alexander; ...
2016-09-20
Zero-knowledge proofs are mathematical cryptographic methods to demonstrate the validity of a claim while providing no further information beyond the claim itself. The possibility of using such proofs to process classified and other sensitive physical data has attracted attention, especially in the field of nuclear arms control. Here we demonstrate a non-electronic fast neutron differential radiography technique using superheated emulsion detectors that can confirm that two objects are identical without revealing their geometry or composition. Such a technique could form the basis of a verification system that could confirm the authenticity of nuclear weapons without sharing any secret design information. More broadly, by demonstrating a physical zero-knowledge proof that can compare physical properties of objects, this experiment opens the door to developing other such secure proof-systems for other applications.
A physical zero-knowledge object-comparison system for nuclear warhead verification
DOE Office of Scientific and Technical Information (OSTI.GOV)
Philippe, Sébastien; Goldston, Robert J.; Glaser, Alexander
Zero-knowledge proofs are mathematical cryptographic methods to demonstrate the validity of a claim while providing no further information beyond the claim itself. The possibility of using such proofs to process classified and other sensitive physical data has attracted attention, especially in the field of nuclear arms control. Here we demonstrate a non-electronic fast neutron differential radiography technique using superheated emulsion detectors that can confirm that two objects are identical without revealing their geometry or composition. Such a technique could form the basis of a verification system that could confirm the authenticity of nuclear weapons without sharing any secret design information. More broadly, by demonstrating a physical zero-knowledge proof that can compare physical properties of objects, this experiment opens the door to developing other such secure proof-systems for other applications.
This booklet, ETV Program Case Studies: Demonstrating Program Outcomes, Volume III contains two case studies, addressing verified environmental technologies for decentralized wastewater treatment and converting animal waste to energy. Each case study contains a brief description ...
Enhanced Verification Test Suite for Physics Simulation Codes
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kamm, J R; Brock, J S; Brandon, S T
2008-10-10
This document discusses problems with which to augment, in quantity and in quality, the existing tri-laboratory suite of verification problems used by Los Alamos National Laboratory (LANL), Lawrence Livermore National Laboratory (LLNL), and Sandia National Laboratories (SNL). The purpose of verification analysis is to demonstrate whether the numerical results of the discretization algorithms in physics and engineering simulation codes provide correct solutions of the corresponding continuum equations. The key points of this document are: (1) Verification deals with mathematical correctness of the numerical algorithms in a code, while validation deals with physical correctness of a simulation in a regime of interest. This document is about verification. (2) The current seven-problem Tri-Laboratory Verification Test Suite, which has been used for approximately five years at the DOE WP laboratories, is limited. (3) Both the methodology for and technology used in verification analysis have evolved and been improved since the original test suite was proposed. (4) The proposed test problems are in three basic areas: (a) Hydrodynamics; (b) Transport processes; and (c) Dynamic strength-of-materials. (5) For several of the proposed problems we provide a 'strong sense verification benchmark', consisting of (i) a clear mathematical statement of the problem with sufficient information to run a computer simulation, (ii) an explanation of how the code result and benchmark solution are to be evaluated, and (iii) a description of the acceptance criterion for simulation code results. (6) It is proposed that the set of verification test problems with which any particular code is evaluated include some of the problems described in this document. Analysis of the proposed verification test problems constitutes part of a necessary--but not sufficient--step that builds confidence in physics and engineering simulation codes. More complicated test cases, including physics models of greater sophistication or other physics regimes (e.g., energetic material response, magneto-hydrodynamics), would represent a scientifically desirable complement to the fundamental test cases discussed in this report. The authors believe that this document can be used to enhance the verification analyses undertaken at the DOE WP Laboratories and, thus, to improve the quality, credibility, and usefulness of the simulation codes that are analyzed with these problems.
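One common way to construct benchmarks of the 'strong sense' type when no convenient exact solution exists is the Method of Manufactured Solutions: choose an analytic field, substitute it into the governing equation, and add the resulting residual as a source term so that the chosen field becomes an exact solution of the modified problem. The sketch below does this for a 1-D heat equation; both the equation and the manufactured field are illustrative choices, not problems from the proposed suite.

```python
# Method of Manufactured Solutions sketch for u_t = alpha * u_xx (illustrative).
import sympy as sp

x, t, alpha = sp.symbols("x t alpha", positive=True)
u_manufactured = sp.sin(sp.pi * x) * sp.exp(-t)   # chosen analytic "exact" solution

# Source term that makes u_manufactured satisfy u_t = alpha * u_xx + source.
source = sp.diff(u_manufactured, t) - alpha * sp.diff(u_manufactured, x, 2)
print(sp.simplify(source))
# Add this source to the simulation, then measure the discretization error
# against u_manufactured on successively refined grids to verify the code.
```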
Juillet, Y; Dubois, C; Bintein, F; Dissard, J; Bossée, A
2014-08-01
A new rapid, sensitive and reliable method was developed for the determination of phosgene in air samples using thermal desorption (TD) followed by gas chromatography-mass spectrometry (GC-MS). The method is based on a fast (10 min) active sampling of only 1 L of air onto a Tenax® GR tube doped with 0.5 mL of derivatizing mixture containing dimercaptotoluene and triethylamine in hexane solution. Validation of the TD-GC-MS method showed a low limit of detection (40 ppbv), acceptable repeatability, intermediate precision (relative standard deviation within 12%) and excellent accuracy (>95%). Linearity was demonstrated for two concentration ranges (0.04 to 2.5 ppmv and 2.5 to 10 ppmv) owing to variation of derivatization recovery between low and high concentration levels. Due to its simple on-site implementation and its close similarity with the recommended operating procedure (ROP) for chemical warfare agent vapour sampling, the method is particularly useful in the process of verification of the Chemical Weapons Convention.
Precision segmented reflector, figure verification sensor
NASA Technical Reports Server (NTRS)
Manhart, Paul K.; Macenka, Steve A.
1989-01-01
The Precision Segmented Reflector (PSR) program currently under way at the Jet Propulsion Laboratory is a test bed and technology demonstration program designed to develop and study the structural and material technologies required for lightweight, precision segmented reflectors. A Figure Verification Sensor (FVS), which is designed to monitor the active control system of the segments, is described; a best-fit surface is defined; and the image or wavefront quality of the assembled array of reflecting panels is assessed.
Cleaning verification by air/water impingement
NASA Technical Reports Server (NTRS)
Jones, Lisa L.; Littlefield, Maria D.; Melton, Gregory S.; Caimi, Raoul E. B.; Thaxton, Eric A.
1995-01-01
This paper will discuss how the Kennedy Space Center intends to perform precision cleaning verification by Air/Water Impingement in lieu of chlorofluorocarbon-113 gravimetric nonvolatile residue analysis (NVR). Test results will be given that demonstrate the effectiveness of the Air/Water system. A brief discussion of the Total Carbon method via the use of a high temperature combustion analyzer will also be given. The necessary equipment for impingement will be shown along with other possible applications of this technology.
NASA Technical Reports Server (NTRS)
Pan, Jing; Levitt, Karl N.; Cohen, Gerald C.
1991-01-01
Discussed here is work to formally specify and verify a floating point coprocessor based on the MC68881. The HOL verification system developed at Cambridge University was used. The coprocessor consists of two independent units: the bus interface unit used to communicate with the CPU and the arithmetic processing unit used to perform the actual calculation. Reasoning about the interaction and synchronization among processes using higher order logic is demonstrated.
Restricted access processor - An application of computer security technology
NASA Technical Reports Server (NTRS)
Mcmahon, E. M.
1985-01-01
This paper describes a security guard device that is currently being developed by Computer Sciences Corporation (CSC). The methods used to provide assurance that the system meets its security requirements include the system architecture, a system security evaluation, and the application of formal and informal verification techniques. The combination of state-of-the-art technology and the incorporation of new verification procedures results in a demonstration of the feasibility of computer security technology for operational applications.
ETV TEST OF PCDD/F EMISSIONS MONITORING SYSTEMS
Four polychlorinated dibenzodioxin and furan (PCDD/F) emission monitors were tested under the EPA Environmental Technology and Verification (ETV) program. Two long-term sampling devices, the DioxinMonitoringSystem and Adsorption Method for Sampling Dioxins and Furans, and two sem...
Astrobiology's Central Dilemma: How can we detect Life if we cannot even Define it?
NASA Astrophysics Data System (ADS)
Clark, B. C.
2001-11-01
Culling and consolidating from a collection of 102 attributes asserted as properties of Life, and the numerous Definitions of Life which invoke them, a new definition is proposed. Analysis of the pathways to proving that any given entity, from micro-sample to planetary object, harbors one or more lifeforms provides strategies for the observations, experiments and detection approaches. These are necessarily varied because of the relative accessibility/inaccessibility of the samples themselves, for example, from Mars, Europa, the ancient Earth or extra-solar system planets. A two-tiered Definition of Life has been formulated, involving both Lifeform and Organism. Devising exploration strategies with a reasonable probability of success and acceptance should proceed along the steps needed for detection and verification of the minimal properties which define Life itself. Multiple approaches, such as high resolution remote spectroscopy for detection of biomarker gases, in situ demonstrations of energy utilization to perform functions such as anabolic or catabolic transformations, achievement of demonstrated reproduction through multi-condition incubations, and probes for macromolecular biochemicals which indicate information storage should be undertaken wherever possible, as should return of samples to terrestrial laboratories for more versatile, more sensitive and more definitive examinations. Use of control samples is paramount, as is detailed understanding of the chemistry and physics of the environment which constrains the activities and tracers being sought.
2006-09-30
High-Pressure Waterjet; CO2 Pellet/Turbine Wheel; Ultrahigh-Pressure Waterjet; Process Water Reuse/Recycle; Cross-Flow Microfiltration ... documented on a process or laboratory form. Corrective action will involve taking all necessary steps to restore a measuring system to proper working order ... In all cases, a nonconformance will be rectified before sample processing and analysis continues. If corrective action does not restore the ...
Skates, Steven J.; Gillette, Michael A.; LaBaer, Joshua; Carr, Steven A.; Anderson, N. Leigh; Liebler, Daniel C.; Ransohoff, David; Rifai, Nader; Kondratovich, Marina; Težak, Živana; Mansfield, Elizabeth; Oberg, Ann L.; Wright, Ian; Barnes, Grady; Gail, Mitchell; Mesri, Mehdi; Kinsinger, Christopher R.; Rodriguez, Henry; Boja, Emily S.
2014-01-01
Protein biomarkers are needed to deepen our understanding of cancer biology and to improve our ability to diagnose, monitor and treat cancers. Important analytical and clinical hurdles must be overcome to allow the most promising protein biomarker candidates to advance into clinical validation studies. Although contemporary proteomics technologies support the measurement of large numbers of proteins in individual clinical specimens, sample throughput remains comparatively low. This problem is amplified in typical clinical proteomics research studies, which routinely suffer from a lack of proper experimental design, resulting in analysis of too few biospecimens to achieve adequate statistical power at each stage of a biomarker pipeline. To address this critical shortcoming, a joint workshop was held by the National Cancer Institute (NCI), National Heart, Lung and Blood Institute (NHLBI), and American Association for Clinical Chemistry (AACC), with participation from the U.S. Food and Drug Administration (FDA). An important output from the workshop was a statistical framework for the design of biomarker discovery and verification studies. Herein, we describe the use of quantitative clinical judgments to set statistical criteria for clinical relevance, and the development of an approach to calculate biospecimen sample size for proteomic studies in discovery and verification stages prior to clinical validation stage. This represents a first step towards building a consensus on quantitative criteria for statistical design of proteomics biomarker discovery and verification research. PMID:24063748
Skates, Steven J; Gillette, Michael A; LaBaer, Joshua; Carr, Steven A; Anderson, Leigh; Liebler, Daniel C; Ransohoff, David; Rifai, Nader; Kondratovich, Marina; Težak, Živana; Mansfield, Elizabeth; Oberg, Ann L; Wright, Ian; Barnes, Grady; Gail, Mitchell; Mesri, Mehdi; Kinsinger, Christopher R; Rodriguez, Henry; Boja, Emily S
2013-12-06
Protein biomarkers are needed to deepen our understanding of cancer biology and to improve our ability to diagnose, monitor, and treat cancers. Important analytical and clinical hurdles must be overcome to allow the most promising protein biomarker candidates to advance into clinical validation studies. Although contemporary proteomics technologies support the measurement of large numbers of proteins in individual clinical specimens, sample throughput remains comparatively low. This problem is amplified in typical clinical proteomics research studies, which routinely suffer from a lack of proper experimental design, resulting in analysis of too few biospecimens to achieve adequate statistical power at each stage of a biomarker pipeline. To address this critical shortcoming, a joint workshop was held by the National Cancer Institute (NCI), National Heart, Lung, and Blood Institute (NHLBI), and American Association for Clinical Chemistry (AACC) with participation from the U.S. Food and Drug Administration (FDA). An important output from the workshop was a statistical framework for the design of biomarker discovery and verification studies. Herein, we describe the use of quantitative clinical judgments to set statistical criteria for clinical relevance and the development of an approach to calculate biospecimen sample size for proteomic studies in discovery and verification stages prior to clinical validation stage. This represents a first step toward building a consensus on quantitative criteria for statistical design of proteomics biomarker discovery and verification research.
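As a toy illustration of the kind of calculation such a framework supports (not the workshop's actual method), the sketch below computes a per-group biospecimen count for a two-sided, two-sample comparison of a log-transformed biomarker, given an assumed standardized effect size, significance level, and power.

```python
# Per-group sample size for a two-sample comparison (normal approximation).
from scipy.stats import norm

def n_per_group(effect_size, alpha=0.05, power=0.80):
    z_alpha = norm.ppf(1 - alpha / 2)   # two-sided significance threshold
    z_beta = norm.ppf(power)            # power requirement
    return 2 * ((z_alpha + z_beta) / effect_size) ** 2

# e.g., detect a 0.5-SD difference in mean log-abundance between cases and controls
print(round(n_per_group(0.5)))   # roughly 63 biospecimens per group
```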
ENVIRONMENTAL TECHNOLOGICAL VERIFICATION REPORT - L2000 PCB/CHLORIDE ANALYZER - DEXSIL CORPORATION
In July 1997, the U.S. Environmental Protection Agency (EPA) conducted a demonstration of Polychlorinated biphenyl (PCB) field analytical techniques. The purpose of this demonstration was to evaluate field analytical technologies capable of detecting and quantifying PCBs in soil...
ENVIRONMENTAL TECHNOLOGY VERIFICATION REPORT - ENVIROGARD PCB TEST KIT - STRATEGIC DIAGNOSTICS INC
In July 1997, the U.S. Environmental Protection Agency (EPA) conducted a demonstration of Polychlorinated biphenyl (PCB) field analytical techniques. The purpose of this demonstration was to evaluate field analytical technologies capable of detecting and quantifying PCBs in soil...
In July 1997, the U.S. Environmental Protection Agency (EPA) conducted a demonstration of polychlorinated biphenyl (PCB) field analytical techniques. The demonstration design was subjected to extensive review and comment by EPA's National Exposure Research Laboratory (NERL) Envi...
Hybrid Gamma Emission Tomography (HGET): FY16 Annual Report
DOE Office of Scientific and Technical Information (OSTI.GOV)
Miller, Erin A.; Smith, Leon E.; Wittman, Richard S.
2017-02-01
Current International Atomic Energy Agency (IAEA) methodologies for the verification of fresh low-enriched uranium (LEU) and mixed oxide (MOX) fuel assemblies are volume-averaging methods that lack sensitivity to individual pins. Further, as fresh fuel assemblies become more and more complex (e.g., heavy gadolinium loading, high degrees of axial and radial variation in fissile concentration), the accuracy of current IAEA instruments degrades and measurement time increases. Particularly in light of the fact that no special tooling is required to remove individual pins from modern fuel assemblies, the IAEA needs new capabilities for the verification of unirradiated (i.e., fresh LEU and MOX) assemblies to ensure that fissile material has not been diverted. Passive gamma emission tomography has demonstrated potential to provide pin-level verification of spent fuel, but gamma-ray emission rates from unirradiated fuel are significantly lower, precluding purely passive tomography methods. The work presented here introduces the concept of Hybrid Gamma Emission Tomography (HGET) for verification of unirradiated fuels, in which a neutron source is used to actively interrogate the fuel assembly and the resulting gamma-ray emissions are imaged using tomographic methods to provide pin-level verification of fissile material concentration.
Exploration of Uncertainty in Glacier Modelling
NASA Technical Reports Server (NTRS)
Thompson, David E.
1999-01-01
There are procedures and methods for verification of coding algebra and for validation of models and calculations that are in use in the aerospace computational fluid dynamics (CFD) community. These methods would be efficacious if used by the glacier dynamics modelling community. This paper presents some of those methods and how they might be applied to uncertainty management supporting code verification and model validation for glacier dynamics. The similarities and differences between their use in CFD analysis and the proposed application of these methods to glacier modelling are discussed. After establishing sources of uncertainty and methods for code verification, the paper looks at a representative sampling of verification and validation efforts that are underway in the glacier modelling community, and establishes a context for these within overall solution quality assessment. Finally, an information architecture and interactive interface are introduced and advocated. This Integrated Cryospheric Exploration (ICE) Environment is proposed for exploring and managing sources of uncertainty in glacier modelling codes and methods, and for supporting scientific numerical exploration and verification. The details and functionality of this Environment are described based on modifications of a system already developed for CFD modelling and analysis.
Reconstruction based finger-knuckle-print verification with score level adaptive binary fusion.
Gao, Guangwei; Zhang, Lei; Yang, Jian; Zhang, Lin; Zhang, David
2013-12-01
Recently, a new biometric identifier, namely the finger knuckle print (FKP), has been proposed for personal authentication with very interesting results. One of the advantages of FKP verification lies in its user friendliness in data collection. However, the user flexibility in positioning fingers also leads to a certain degree of pose variation in the collected query FKP images. The widely used Gabor filtering based competitive coding scheme is sensitive to such variations, resulting in many false rejections. We propose to alleviate this problem by reconstructing the query sample with a dictionary learned from the template samples in the gallery set. The reconstructed FKP image can greatly reduce the enlarged matching distance caused by finger pose variations; however, both the intra-class and inter-class distances are reduced. We then propose a score level adaptive binary fusion rule to adaptively fuse the matching distances before and after reconstruction, aiming to reduce false rejections without greatly increasing false acceptances. Experimental results on the benchmark PolyU FKP database show that the proposed method significantly improves the FKP verification accuracy.
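The authors' exact fusion rule is not given in the abstract. The sketch below only illustrates the general idea of a score-level binary fusion between the matching distance computed before and after dictionary-based reconstruction, with a hypothetical switching threshold; it is not the published algorithm.

```python
def fused_distance(d_orig, d_recon, tau=0.35):
    """Illustrative score-level binary fusion (not the authors' exact rule).

    d_orig  : matching distance of the raw query against the template
    d_recon : matching distance after reconstructing the query from a
              dictionary learned on the gallery templates
    tau     : hypothetical switching threshold

    Reconstruction shrinks distances for genuine pairs affected by pose
    variation, but it can also shrink impostor distances, so the reconstructed
    score is trusted only when it is confidently small.
    """
    return d_recon if d_recon < tau else d_orig

# Toy example: a genuine comparison penalized by finger pose variation.
print(fused_distance(d_orig=0.48, d_recon=0.22))   # -> 0.22 (accept side)
print(fused_distance(d_orig=0.55, d_recon=0.41))   # -> 0.55 (unchanged)
```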
40 CFR 1065.925 - PEMS preparation for field testing.
Code of Federal Regulations, 2010 CFR
2010-07-01
.... 1065.925 Section 1065.925 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) AIR... purge any gaseous sampling PEMS instruments with ambient air until sampling begins to prevent system contamination from excessive cold-start emissions. (e) Conduct calibrations and verifications. (f) Operate any...
ENVIRONMENTAL TECHNOLOGY VERIFICATION (ETV) TEST OF DIOXIN EMISSION MONITORS
The performance of four dioxin emission monitors including two long-term sampling devices, the DMS (DioxinMonitoringSystem) and AMESA (Adsorption Method for Sampling Dioxins and Furans), and two semi-real-time continuous monitors, RIMMPA-TOFMS (Resonance Ionization with Multi-Mir...
Self-verification and depression among youth psychiatric inpatients.
Joiner, T E; Katz, J; Lew, A S
1997-11-01
According to self-verification theory (e.g., W.B. Swann, 1983), people are motivated to preserve stable self-concepts by seeking self-confirming interpersonal responses, even if the responses are negative. In the current study of 72 youth psychiatric inpatients (36 boys; 36 girls; ages 7-17, M = 13.18; SD = 2.59), the authors provide the 1st test of self-verification theory among a youth sample. Participants completed self-report questionnaires on depression, self-esteem, anxiety, negative and positive affect, and interest in negative feedback from others. The authors made chart diagnoses available, and they collected peer rejection ratings. Consistent with hypotheses, the authors found that interest in negative feedback was associated with depression, was predictive of peer rejection (but only within relatively longer peer relationships), was more highly related to cognitive than emotional aspects of depression, and was specifically associated with depression, rather than being generally associated with emotional distress. The authors discuss implications for self-verification theory and for the phenomenology of youth depression.
Improved Hip-Based Individual Recognition Using Wearable Motion Recording Sensor
NASA Astrophysics Data System (ADS)
Gafurov, Davrondzhon; Bours, Patrick
In today's society the demand for reliable verification of a user's identity is increasing. Although biometric technologies based on fingerprint or iris can provide accurate and reliable recognition performance, they are inconvenient for periodic or frequent re-verification. In this paper we propose a hip-based user recognition method which can be suitable for implicit and periodic re-verification of identity. In our approach we use a wearable accelerometer sensor attached to the hip of the person, and the measured hip motion signal is then analysed for identity verification purposes. The main analysis steps consist of detecting gait cycles in the signal and matching two sets of detected gait cycles. Evaluating the approach on a hip data set consisting of 400 gait sequences (samples) from 100 subjects, we obtained an equal error rate (EER) of 7.5% and a rank-1 identification rate of 81.4%. These figures represent improvements of 37.5% and 11.2%, respectively, over the previous study using the same data set.
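The abstract reports an equal error rate (EER) of 7.5%. As a generic illustration of how an EER is typically obtained from genuine and impostor matching scores (standard biometrics bookkeeping, not the authors' code), one can sweep a decision threshold and locate the point where the false accept and false reject rates cross; the scores below are synthetic placeholders.

```python
import numpy as np

def equal_error_rate(genuine, impostor):
    """Generic EER estimate from matching *distances*: genuine comparisons
    should yield small distances, impostor comparisons large ones."""
    thresholds = np.unique(np.concatenate([genuine, impostor]))
    best_gap, eer = 1.0, None
    for t in thresholds:
        frr = np.mean(genuine > t)    # genuine pairs rejected
        far = np.mean(impostor <= t)  # impostor pairs accepted
        if abs(far - frr) < best_gap:
            best_gap, eer = abs(far - frr), (far + frr) / 2
    return eer

# Toy distances (hypothetical), e.g. from matched gait-cycle sets.
rng = np.random.default_rng(0)
genuine = rng.normal(0.4, 0.1, 400)
impostor = rng.normal(0.8, 0.1, 400)
print(f"EER ~ {equal_error_rate(genuine, impostor):.3f}")
```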
NASA Astrophysics Data System (ADS)
Szeleszczuk, Łukasz; Gubica, Tomasz; Zimniak, Andrzej; Pisklak, Dariusz M.; Dąbrowska, Kinga; Cyrański, Michał K.; Kańska, Marianna
2017-10-01
A convenient method for the indirect crystal structure verification of methyl glycosides was demonstrated. Single-crystal X-ray diffraction structures for methyl glycoside acetates were deacetylated and subsequently subjected to DFT calculations under periodic boundary conditions. Solid-state NMR spectroscopy served as a guide for the calculations. A high level of accuracy of the modelled crystal structures of methyl glycosides was confirmed by comparison with the published results of a neutron diffraction study using the RMSD method.
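The RMSD comparison mentioned above is a standard agreement metric between two sets of corresponding atomic coordinates. A minimal sketch, assuming the structures are already expressed in the same frame and the atoms are matched one-to-one (no superposition step is shown), is:

```python
import numpy as np

def rmsd(coords_a, coords_b):
    """Root-mean-square deviation between two matched (N, 3) coordinate sets,
    assumed to be expressed in the same frame (no superposition performed)."""
    coords_a = np.asarray(coords_a, dtype=float)
    coords_b = np.asarray(coords_b, dtype=float)
    return np.sqrt(np.mean(np.sum((coords_a - coords_b) ** 2, axis=1)))

# Toy 3-atom example with hypothetical coordinates in angstroms.
a = [[0.0, 0.0, 0.0], [1.4, 0.0, 0.0], [2.1, 1.1, 0.0]]
b = [[0.1, 0.0, 0.0], [1.5, 0.1, 0.0], [2.0, 1.2, 0.1]]
print(f"RMSD = {rmsd(a, b):.3f} A")
```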
2015-10-01
HASP: Health and Safety Plan; IDA: Institute for Defense Analyses; IVS: Instrument Verification Strip; m: meter; mm: millimeter; MPV: Man Portable... The ArcSecond laser ranger was impractical due to the requirement to maintain line-of-sight for three rovers and tedious calibration. The SERDP... (table fragment: within 0.1 m spacing and 99% within 0.15 m; repeatability of the Instrument Verification Strip (IVS) survey; amplitude of the EM anomaly; amplitude of ...)
In July 1997, the U.S. Environmental Protection Agency (EPA) conducted a demonstration of polychlorinated biphenyl (PCB) field analytical techniques. The purpose of this demonstration was to evaluate field analytical technologies capable of detecting and quantifying PCB's in soi...
On marker-based parentage verification via non-linear optimization.
Boerner, Vinzent
2017-06-15
Parentage verification by molecular markers is mainly based on short tandem repeat markers. Single nucleotide polymorphisms (SNPs), as bi-allelic markers, have become the markers of choice for genotyping projects. Thus, the subsequent step is to use SNP genotypes for parentage verification as well. Recent developments of algorithms such as evaluating opposing homozygous SNP genotypes have drawbacks, for example the inability to reject all animals of a sample of potential parents. This paper describes an algorithm for parentage verification by constrained regression which overcomes the latter limitation and proves to be very fast and accurate even when the number of SNPs is as low as 50. The algorithm was tested on a sample of 14,816 animals with 50, 100 and 500 SNP genotypes randomly selected from 40k genotypes. The samples of putative parents of these animals contained either five random animals, or four random animals and the true sire. Parentage assignment was performed by ranking of regression coefficients, or by setting a minimum threshold for regression coefficients. The assignment quality was evaluated by the power of assignment and the power of exclusion. If the sample of putative parents contained the true sire and parentage was assigned by coefficient ranking, both powers were higher than 0.99 for the 500 and 100 SNP genotypes, and higher than 0.98 for the 50 SNP genotypes. When parentage was assigned by a coefficient threshold, the power of assignment was higher than 0.99 regardless of the number of SNPs, but the power of exclusion decreased from 0.99 (500 SNPs) to 0.97 (100 SNPs) and 0.92 (50 SNPs). If the sample of putative parents did not contain the true sire and parentage was rejected using a coefficient threshold, the algorithm achieved a power of exclusion of 1 (500 SNPs), 0.99 (100 SNPs) and 0.97 (50 SNPs). The algorithm described here is easy to implement, fast and accurate, and is able to assign parentage using genomic marker data with as few as 50 SNPs.
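The details of the author's non-linear optimization are not given in the abstract. The sketch below shows one plausible form of the same idea, regressing an offspring's SNP genotypes on those of the candidate parents with bounded least squares so that each candidate receives a non-negative coefficient that can then be ranked or thresholded; the data, the [0, 1] bounds, and the expectation of coefficients near 0.5 for true parents are illustrative assumptions, not the published method.

```python
import numpy as np
from scipy.optimize import lsq_linear

def parent_coefficients(offspring, candidates):
    """Regress an offspring genotype vector (0/1/2 allele counts) on the
    genotype vectors of candidate parents, constraining coefficients to [0, 1].
    A true parent is expected to attract a comparatively large coefficient;
    unrelated candidates should stay near 0."""
    X = np.asarray(candidates, dtype=float).T      # (n_snps, n_candidates)
    y = np.asarray(offspring, dtype=float)
    return lsq_linear(X, y, bounds=(0.0, 1.0)).x

rng = np.random.default_rng(1)
n_snps = 100
sire = rng.integers(0, 3, n_snps)
dam = rng.integers(0, 3, n_snps)
# Toy offspring built from parental averages plus Mendelian-style noise.
offspring = np.clip(np.rint((sire + dam) / 2 + rng.normal(0, 0.5, n_snps)), 0, 2)
unrelated = [rng.integers(0, 3, n_snps) for _ in range(4)]

coefs = parent_coefficients(offspring, [sire] + unrelated)
print(np.round(coefs, 2))   # first coefficient (true sire) should dominate
```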
NASA Astrophysics Data System (ADS)
Vainshtein, Igor; Baruch, Shlomi; Regev, Itai; Segal, Victor; Filis, Avishai; Riabzev, Sergey
2018-05-01
The growing demand for EO applications that work around the clock, 24 hours a day and 7 days a week, such as border surveillance systems, emphasizes the need for a highly reliable cryocooler with increased operational availability and optimized system Integrated Logistic Support (ILS). To meet this need, RICOR developed linear and rotary cryocoolers that successfully achieved this goal. Cryocooler MTTF was analyzed by theoretical reliability evaluation methods, demonstrated by normal and accelerated life tests at the cryocooler level, and finally verified by analysis of field data from cryocoolers operating at the system level. The following paper reviews theoretical reliability analysis methods together with reliability test results derived from standard and accelerated life demonstration tests performed at RICOR's advanced reliability laboratory. As a summary of the work process, reliability verification data are presented as feedback from fielded systems.
Soil Gas Sample Handling: Evaluation of Water Removal and Sample Ganging
DOE Office of Scientific and Technical Information (OSTI.GOV)
Fritz, Brad G.; Abrecht, David G.; Hayes, James C.
2016-10-31
Soil gas sampling is currently conducted in support of Nuclear Test Ban treaty verification. Soil gas samples are collected and analyzed for isotopes of interest. Some issues that can impact sampling and analysis of these samples are excess moisture and sample processing time. Here we discuss three potential improvements to the current sampling protocol: a desiccant for water removal, use of a molecular sieve to remove CO2 from the sample during collection, and a ganging manifold to allow composite analysis of multiple samples.
Accurate Biomass Estimation via Bayesian Adaptive Sampling
NASA Technical Reports Server (NTRS)
Wheeler, Kevin R.; Knuth, Kevin H.; Castle, Joseph P.; Lvov, Nikolay
2005-01-01
The following concepts were introduced: a) Bayesian adaptive sampling for solving biomass estimation; b) characterization of MISR Rahman model parameters conditioned upon MODIS landcover; c) a rigorous non-parametric Bayesian approach to analytic mixture model determination; and d) a unique U.S. asset for science product validation and verification.
Rapid determination of alpha emitters using Actinide resin.
Navarro, N; Rodriguez, L; Alvarez, A; Sancho, C
2004-01-01
The European Commission has recently published the recommended radiological protection criteria for the clearance of buildings and building rubble from the dismantling of nuclear installations. Radionuclide-specific clearance levels for actinides are very low (between 0.1 and 1 Bq g(-1)). The prevalence of natural radionuclides in rubble materials makes the verification of these levels by direct alpha counting impossible. The capability of Actinide resin (Eichrom Industries, Inc.) for extracting plutonium and americium from rubble samples has been tested in this work. Besides a strong affinity for actinides in the tri-, tetra- and hexavalent oxidation states, this extraction chromatographic resin allows easy recovery of the absorbed radionuclides. The retention capability was evaluated on rubble samples spiked with certified radionuclide standards (239Pu and 241Am). Samples were leached with nitric acid and passed through a chromatographic column containing the resin, and the elution fraction was measured by LSC. Actinide retention varies from 60% to 80%. Based on these results, a rapid method for the verification of clearance levels for actinides in rubble samples is proposed.
Nguyen, Huynh; Morgan, David A F; Sly, Lindsay I; Benkovich, Morris; Cull, Sharon; Forwood, Mark R
2008-06-01
ISO 11137-2006 (ISO 11137-2a 2006) provides a VDmax 15 method for substantiation of 15 kGy as the radiation sterilisation dose (RSD) for health care products, with a relatively low sample requirement. Moreover, the method is valid for products in which the bioburden level is less than or equal to 1.5. In the literature, the bioburden level of processed bone allografts is extremely low. Similarly, the Queensland Bone Bank (QBB) usually recovers no viable organisms from processed bone allografts. Because bone allografts are treated as a type of health care product, the aim of this research was to substantiate 15 kGy as a RSD for frozen bone allografts at the QBB using method VDmax 15-ISO 11137-2: 2006 (ISO 11137-2e, Procedure for method VDmax 15 for multiple production batches. Sterilisation of health care products - radiation - part 2: establishing the sterilisation dose, 2006; ISO 11137-2f, Procedure for method VDmax 15 for a single production batch. Sterilisation of health care products - radiation - part 2: establishing the sterilisation dose, 2006). Thirty femoral heads, 40 milled bone allografts and 40 structural bone allografts manufactured according to QBB standard operating procedures were used. Estimated bioburdens for each bone allograft group were used to calculate the verification doses. Next, 10 samples per group were irradiated at the verification dose, sterility was tested and the number of positive tests of sterility recorded. If no more than 1 of the 10 tests carried out in each group was positive, the verification was accepted and 15 kGy was substantiated as the RSD for those bone allografts. The bioburdens in all three groups were 0, and therefore the verification doses were 0 kGy. Sterility tests of femoral heads and milled bones were all negative (no contamination), and there was one positive test of sterility in the structural bone allograft group. Accordingly, the verification was accepted. Using the ISO validated protocol, VDmax 15, 15 kGy was substantiated as the RSD for frozen bone allografts manufactured at the QBB.
Analysis of particulate contamination on tape lift samples from the VETA optical surfaces
NASA Technical Reports Server (NTRS)
Germani, Mark S.
1992-01-01
Particulate contamination analysis was carried out on samples taken from the Verification Engineering Test Article (VETA) x-ray detection system. A total of eighteen tape lift samples were taken from the VETA optical surfaces. Initially, the samples were tested using a scanning electron microscope. Additionally, particle composition was determined by energy dispersive x-ray spectrometry. Results are presented in terms of particle loading per sample.
NASA Technical Reports Server (NTRS)
Rumsey, Christopher L.
2009-01-01
In current practice, it is often difficult to draw firm conclusions about turbulence model accuracy when performing multi-code CFD studies ostensibly using the same model because of inconsistencies in model formulation or implementation in different codes. This paper describes an effort to improve the consistency, verification, and validation of turbulence models within the aerospace community through a website database of verification and validation cases. Some of the variants of two widely-used turbulence models are described, and two independent computer codes (one structured and one unstructured) are used in conjunction with two specific versions of these models to demonstrate consistency with grid refinement for several representative problems. Naming conventions, implementation consistency, and thorough grid resolution studies are key factors necessary for success.
In April 1995, the U.S. Environmental Protection Agency (EPA) sponsored a demonstration of field portable X-ray fluorescence (FPXRF) analyzers. The primary objectives of this demonstration were (1) to determine how well FPXRF analyzers perform in comparison to standard reference...
In April 1995, the Environmental Protection Agency (EPA) conducted a demonstration of field portable X-ray fluorescence (FPXRF) Analyzers. The primary objectives of this demonstration were (1) to determine how well FPXRF analyzers perform in comparison to a standard reference m...
A Method for the Evaluation of Thousands of Automated 3D Stem Cell Segmentations
Bajcsy, Peter; Simon, Mylene; Florczyk, Stephen; Simon, Carl G.; Juba, Derek; Brady, Mary
2016-01-01
There is no segmentation method that performs perfectly with any data set in comparison to human segmentation. Evaluation procedures for segmentation algorithms become critical for their selection. The problems associated with segmentation performance evaluations and visual verification of segmentation results are exaggerated when dealing with thousands of 3D image volumes because of the amount of computation and manual inputs needed. We address the problem of evaluating 3D segmentation performance when segmentation is applied to thousands of confocal microscopy images (z-stacks). Our approach is to incorporate experimental imaging and geometrical criteria, and map them into computationally efficient segmentation algorithms that can be applied to a very large number of z-stacks. This is an alternative approach to considering existing segmentation methods and evaluating most state-of-the-art algorithms. We designed a methodology for 3D segmentation performance characterization that consists of design, evaluation and verification steps. The characterization integrates manual inputs from projected surrogate “ground truth” of statistically representative samples and from visual inspection into the evaluation. The novelty of the methodology lies in (1) designing candidate segmentation algorithms by mapping imaging and geometrical criteria into algorithmic steps, and constructing plausible segmentation algorithms with respect to the order of algorithmic steps and their parameters, (2) evaluating segmentation accuracy using samples drawn from probability distribution estimates of candidate segmentations, and (3) minimizing human labor needed to create surrogate “truth” by approximating z-stack segmentations with 2D contours from three orthogonal z-stack projections and by developing visual verification tools. We demonstrate the methodology by applying it to a dataset of 1253 mesenchymal stem cells. The cells reside on 10 different types of biomaterial scaffolds, and are stained for actin and nucleus yielding 128 460 image frames (on average 125 cells/scaffold × 10 scaffold types × 2 stains × 51 frames/cell). After constructing and evaluating six candidates of 3D segmentation algorithms, the most accurate 3D segmentation algorithm achieved an average precision of 0.82 and an accuracy of 0.84 as measured by the Dice similarity index where values greater than 0.7 indicate a good spatial overlap. A probability of segmentation success was 0.85 based on visual verification, and a computation time was 42.3 h to process all z-stacks. While the most accurate segmentation technique was 4.2 times slower than the second most accurate algorithm, it consumed on average 9.65 times less memory per z-stack segmentation. PMID:26268699
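The Dice similarity index used above for measuring spatial overlap between a candidate 3D segmentation and the surrogate reference has a simple closed form, 2|A∩B| / (|A| + |B|). A minimal sketch for binary voxel masks (the toy volumes below are placeholders, not the study's data) is:

```python
import numpy as np

def dice_index(mask_a, mask_b):
    """Dice similarity index between two binary voxel masks of equal shape.
    Values above ~0.7 are commonly read as good spatial overlap."""
    a = np.asarray(mask_a, dtype=bool)
    b = np.asarray(mask_b, dtype=bool)
    intersection = np.logical_and(a, b).sum()
    total = a.sum() + b.sum()
    return 2.0 * intersection / total if total else 1.0

# Toy 3D example: two overlapping blocks in a small volume.
a = np.zeros((20, 20, 20), dtype=bool); a[4:14, 4:14, 4:14] = True
b = np.zeros((20, 20, 20), dtype=bool); b[6:16, 6:16, 6:16] = True
print(f"Dice = {dice_index(a, b):.2f}")
```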
Grewe, P M; Feutry, P; Hill, P L; Gunasekera, R M; Schaefer, K M; Itano, D G; Fuller, D W; Foster, S D; Davies, C R
2015-11-23
Tropical tuna fisheries are central to food security and economic development of many regions of the world. Contemporary population assessment and management generally assume these fisheries exploit a single mixed spawning population, within ocean basins. To date population genetics has lacked the required power to conclusively test this assumption. Here we demonstrate heterogeneous population structure among yellowfin tuna sampled at three locations across the Pacific Ocean (western, central, and eastern) via analysis of double digest restriction-site associated DNA using Next Generation Sequencing technology. The differences among locations are such that individuals sampled from one of the three regions examined can be assigned with close to 100% accuracy demonstrating the power of this approach for providing practical markers for fishery independent verification of catch provenance in a way not achieved by previous techniques. Given these results, an extended pan-tropical survey of yellowfin tuna using this approach will not only help combat the largest threat to sustainable fisheries (i.e. illegal, unreported, and unregulated fishing) but will also provide a basis to transform current monitoring, assessment, and management approaches for this globally significant species.
Grewe, P. M.; Feutry, P.; Hill, P. L.; Gunasekera, R. M.; Schaefer, K. M.; Itano, D. G.; Fuller, D. W.; Foster, S. D.; Davies, C. R.
2015-01-01
Tropical tuna fisheries are central to food security and economic development of many regions of the world. Contemporary population assessment and management generally assume these fisheries exploit a single mixed spawning population, within ocean basins. To date population genetics has lacked the required power to conclusively test this assumption. Here we demonstrate heterogeneous population structure among yellowfin tuna sampled at three locations across the Pacific Ocean (western, central, and eastern) via analysis of double digest restriction-site associated DNA using Next Generation Sequencing technology. The differences among locations are such that individuals sampled from one of the three regions examined can be assigned with close to 100% accuracy demonstrating the power of this approach for providing practical markers for fishery independent verification of catch provenance in a way not achieved by previous techniques. Given these results, an extended pan-tropical survey of yellowfin tuna using this approach will not only help combat the largest threat to sustainable fisheries (i.e. illegal, unreported, and unregulated fishing) but will also provide a basis to transform current monitoring, assessment, and management approaches for this globally significant species. PMID:26593698
Development of an Ultra-Low Background Liquid Scintillation Counter for Trace Level Analysis
DOE Office of Scientific and Technical Information (OSTI.GOV)
Erchinger, Jennifer L.; Orrell, John L.; Aalseth, Craig E.
2015-09-01
Low-level liquid scintillation counting (LSC) has been established as one of the radiation detection techniques useful in elucidating environmental processes and environmental monitoring around nuclear facilities. The Ultra-Low Background Liquid Scintillation Counter (ULB-LSC) under construction in the Shallow Underground Laboratory at Pacific Northwest National Laboratory aims to further reduce the MDAs and/or required sample processing. Through layers of passive shielding in conjunction with an active veto and 30 meters water equivalent overburden, the background reduction is expected to be 10 to 100 times below typical analytic low-background liquid scintillation systems. Simulations have shown an expected background of around 14 counts per day. A novel approach to the light collection will use a coated hollow light guide cut into the inner copper shielding. Demonstration LSC measurements will show low-energy detection, spectral deconvolution, and alpha/beta discrimination capabilities, from trials with standards of tritium, strontium-90, and actinium-227, respectively. An overview of the system design and expected demonstration measurements will emphasize the potential applications of the ULB-LSC in environmental monitoring for treaty verification, reach-back sample analysis, and facility inspections.
NASA Technical Reports Server (NTRS)
Hornung, Steven D.; Biesinger, Paul; Kirsch, Mike; Beeson, Harold; Leuders, Kathy
1999-01-01
The NASA White Sands Test Facility (WSTF) has developed an entirely aqueous final cleaning and verification process to replace the current chlorofluorocarbon (CFC) 113 based process. This process has been accepted for final cleaning and cleanliness verification of WSTF ground support equipment. The aqueous process relies on ultrapure water at 50 C (323 K) and ultrasonic agitation for removal of organic compounds and particulate. The cleanliness is verified by determining the total organic carbon (TOC) content and by filtration with particulate counting. The effectiveness of the aqueous methods for detecting hydrocarbon contamination and particulate was compared to the accepted CFC 113 sampling procedures. Testing with known contaminants, such as hydraulic fluid and cutting and lubricating oils, to establish a correlation between aqueous TOC and CFC 113 nonvolatile residue (NVR) was performed. Particulate sampling on cleaned batches of hardware that were randomly separated and sampled by the two methods was performed. This paper presents the approach and results, and discusses the issues in establishing the equivalence of aqueous sampling to CFC 113 sampling, while describing the approach for implementing aqueous techniques on Space Shuttle Propulsion hardware.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Peters, T. B.; Bannochie, C. J.
Savannah River National Laboratory (SRNL) analyzed samples from Tank 21H in support of verification of Macrobatch (Salt Batch) 11 for the Interim Salt Disposition Program (ISDP) for processing. This document reports characterization data on the samples of Tank 21H and fulfills the requirements of Deliverable 3 of the Technical Task Request (TTR).
Bowen, Raffick A R; Adcock, Dorothy M
2016-12-01
Blood collection tubes (BCTs) are an often under-recognized variable in the preanalytical phase of clinical laboratory testing. Unfortunately, even the best-designed and manufactured BCTs may not work well in all clinical settings. Clinical laboratories, in collaboration with healthcare providers, should carefully evaluate BCTs prior to putting them into clinical use to determine their limitations and ensure that patients are not placed at risk because of inaccuracies due to poor tube performance. Selection of the best BCTs can be achieved through comparing advertising materials, reviewing the literature, observing the device at a scientific meeting, receiving a demonstration, evaluating the device under simulated conditions, or testing the device with patient samples. Although many publications have discussed method validations, few detail how to perform experiments for tube verification and validation. This article highlights the most common and impactful variables related to BCTs and discusses the validation studies that a typical clinical laboratory should perform when selecting BCTs. We also present a brief review of how in vitro diagnostic devices, particularly BCTs, are regulated in the United States, the European Union, and Canada. The verification and validation of BCTs will help to avoid the economic and human costs associated with incorrect test results, including poor patient care, unnecessary testing, and delays in test results. We urge laboratorians, tube manufacturers, diagnostic companies, and other researchers to take all the necessary steps to protect against the adverse effects of BCT components and their additives on clinical assays. Copyright © 2016 The Canadian Society of Clinical Chemists. Published by Elsevier Inc. All rights reserved.
Dotan, Raffy
2012-06-01
The multisession maximal lactate steady-state (MLSS) test is the gold standard for anaerobic threshold (AnT) estimation. However, it is highly impractical, requires a high fitness level, and suffers additional shortcomings. Existing single-session AnT-estimating tests are of compromised validity, reliability, and resolution. The presented reverse lactate threshold test (RLT) is a single-session, AnT-estimating test aimed at avoiding the pitfalls of existing tests. It is based on the novel concept of identifying blood lactate's maximal appearance-disappearance equilibrium by approaching the AnT from higher, rather than from lower, exercise intensities. Rowing, cycling, and running case data (4 recreational and competitive athletes, male and female, aged 17-39 y) are presented. Subjects performed the RLT test and, in a separate session, a single 30-min MLSS-type verification test at the RLT-determined intensity. The RLT and its MLSS verification exhibited exceptional agreement at 0.5% discrepancy or better. The RLT's training sensitivity was demonstrated by a case of a 2.5-mo training regimen, following which the RLT's 15-W improvement was fully MLSS-verified. The RLT's test-retest reliability was examined in 10 trained and untrained subjects. Test 2 differed from test 1 by only 0.3%, with an intraclass correlation of 0.997. The data suggest that the RLT accurately and reliably estimates AnT (as represented by MLSS verification) with high resolution and in distinctly different sports, and that it is sensitive to training adaptations. Compared with MLSS, the single-session RLT is highly practical, and its lower fitness requirements make it applicable to athletes and untrained individuals alike. Further research is needed to establish RLT's validity and accuracy in larger samples.
Ghorbani Moghaddam, Masoud; Achuthan, Ajit; Bednarcyk, Brett A; Arnold, Steven M; Pineda, Evan J
2016-05-04
A multiscale computational model is developed for determining the elasto-plastic behavior of polycrystal metals by employing a single crystal plasticity constitutive model that can capture the microstructural scale stress field on a finite element analysis (FEA) framework. The generalized method of cells (GMC) micromechanics model is used for homogenizing the local field quantities. At first, the stand-alone GMC is applied for studying simple material microstructures such as a repeating unit cell (RUC) containing a single grain or two grains under uniaxial loading conditions. For verification, the results obtained by the stand-alone GMC are compared to those from an analogous FEA model incorporating the same single crystal plasticity constitutive model. This verification is then extended to samples containing tens to hundreds of grains. The results demonstrate that the GMC homogenization combined with the crystal plasticity constitutive framework is a promising approach for failure analysis of structures, as it allows for properly predicting the von Mises stress in the entire RUC, in an average sense, as well as at the local microstructural level, i.e., in each individual grain. A saving of two to three orders of magnitude in computational cost was obtained with GMC, at the expense of some accuracy in prediction, especially for the components of local tensor field quantities and for quantities near the grain boundaries. Finally, the capability of the developed multiscale model linking FEA and GMC to solve real-life-sized structures is demonstrated by successfully analyzing an engine disc component and determining the microstructural scale details of the field quantities.
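The von Mises stress on which the GMC and FEA predictions are compared is a standard scalar reduction of the stress tensor. A minimal, code-independent sketch (the stress values below are hypothetical) is:

```python
import numpy as np

def von_mises(stress):
    """Von Mises equivalent stress from a symmetric 3x3 Cauchy stress tensor."""
    s = np.asarray(stress, dtype=float)
    dev = s - np.trace(s) / 3.0 * np.eye(3)     # deviatoric part
    return np.sqrt(1.5 * np.tensordot(dev, dev))  # sqrt(3/2 * dev:dev)

# Toy stress state in MPa (hypothetical values).
sigma = np.array([[120.0, 30.0, 0.0],
                  [30.0, 80.0, 0.0],
                  [0.0, 0.0, 40.0]])
print(f"von Mises stress = {von_mises(sigma):.1f} MPa")
```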
Ghorbani Moghaddam, Masoud; Achuthan, Ajit; Bednarcyk, Brett A.; Arnold, Steven M.; Pineda, Evan J.
2016-01-01
A multiscale computational model is developed for determining the elasto-plastic behavior of polycrystal metals by employing a single crystal plasticity constitutive model that can capture the microstructural scale stress field on a finite element analysis (FEA) framework. The generalized method of cells (GMC) micromechanics model is used for homogenizing the local field quantities. At first, the stand-alone GMC is applied for studying simple material microstructures such as a repeating unit cell (RUC) containing a single grain or two grains under uniaxial loading conditions. For verification, the results obtained by the stand-alone GMC are compared to those from an analogous FEA model incorporating the same single crystal plasticity constitutive model. This verification is then extended to samples containing tens to hundreds of grains. The results demonstrate that the GMC homogenization combined with the crystal plasticity constitutive framework is a promising approach for failure analysis of structures, as it allows for properly predicting the von Mises stress in the entire RUC, in an average sense, as well as at the local microstructural level, i.e., in each individual grain. A saving of two to three orders of magnitude in computational cost was obtained with GMC, at the expense of some accuracy in prediction, especially for the components of local tensor field quantities and for quantities near the grain boundaries. Finally, the capability of the developed multiscale model linking FEA and GMC to solve real-life-sized structures is demonstrated by successfully analyzing an engine disc component and determining the microstructural scale details of the field quantities. PMID:28773458
NASA Astrophysics Data System (ADS)
Wier, Timothy P.; Moser, Cameron S.; Grant, Jonathan F.; Riley, Scott C.; Robbins-Wamsley, Stephanie H.; First, Matthew R.; Drake, Lisa A.
2017-10-01
Both L-shaped ("L") and straight ("Straight") sample probes have been used to collect water samples from a main ballast line in land-based or shipboard verification testing of ballast water management systems (BWMS). A series of experiments was conducted to quantify and compare the sampling efficiencies of L and Straight sample probes. The findings from this research, namely that both L and Straight probes sample organisms with similar efficiencies, permit increased flexibility for positioning sample probes aboard ships.
21 CFR 812.35 - Supplemental applications.
Code of Federal Regulations, 2014 CFR
2014-04-01
... control procedures of § 820.30, preclinical/animal testing, peer reviewed published literature, or other... the verification and validation testing, as appropriate, demonstrated that the design outputs met the...
21 CFR 812.35 - Supplemental applications.
Code of Federal Regulations, 2011 CFR
2011-04-01
... control procedures of § 820.30, preclinical/animal testing, peer reviewed published literature, or other... the verification and validation testing, as appropriate, demonstrated that the design outputs met the...
21 CFR 812.35 - Supplemental applications.
Code of Federal Regulations, 2013 CFR
2013-04-01
... control procedures of § 820.30, preclinical/animal testing, peer reviewed published literature, or other... the verification and validation testing, as appropriate, demonstrated that the design outputs met the...
21 CFR 812.35 - Supplemental applications.
Code of Federal Regulations, 2012 CFR
2012-04-01
... control procedures of § 820.30, preclinical/animal testing, peer reviewed published literature, or other... the verification and validation testing, as appropriate, demonstrated that the design outputs met the...
In April 1995, the U.S. Environmental Protection Agency (EPA) sponsored a demonstration of field portable X-ray fluorescence (FPXRF) analyzers. The primary objectives of this demonstration were (1) to determine how well FPXRF analyzers perform in comparison to standard reference...
The Rontec PicoTAX x-ray fluorescence (XRF) analyzer was demonstrated under the U.S. Environmental Protection Agency (EPA) Superfund Innovative Technology Evaluation (SITE) Program. The field portion of the demonstration was conducted in January 2005 at the Kennedy Athletic, Rec...
Microprocessor Based Temperature Control of Liquid Delivery with Flow Disturbances.
ERIC Educational Resources Information Center
Kaya, Azmi
1982-01-01
Discusses analytical design and experimental verification of a PID control value for a temperature controlled liquid delivery system, demonstrating that the analytical design techniques can be experimentally verified by using digital controls as a tool. Digital control instrumentation and implementation are also demonstrated and documented for…
Extremely accurate sequential verification of RELAP5-3D
Mesina, George L.; Aumiller, David L.; Buschman, Francis X.
2015-11-19
Large computer programs like RELAP5-3D solve complex systems of governing, closure and special process equations to model the underlying physics of nuclear power plants. Further, these programs incorporate many other features for physics, input, output, data management, user-interaction, and post-processing. For software quality assurance, the code must be verified and validated before being released to users. For RELAP5-3D, verification and validation are restricted to nuclear power plant applications. Verification means ensuring that the program is built right by checking that it meets its design specifications, comparing coding to algorithms and equations, and comparing calculations against analytical solutions and the method of manufactured solutions. Sequential verification performs these comparisons initially, but thereafter only compares code calculations between consecutive code versions to demonstrate that no unintended changes have been introduced. Recently, an automated, highly accurate sequential verification method has been developed for RELAP5-3D. The method also tests that no unintended consequences result from code development in the following code capabilities: repeating a timestep advancement, continuing a run from a restart file, multiple cases in a single code execution, and modes of coupled/uncoupled operation. In conclusion, mathematical analyses of the adequacy of the checks used in the comparisons are provided.
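The sequential-verification idea described above, comparing calculations between consecutive code versions to catch unintended changes, reduces in its simplest form to a tolerance-based comparison of archived results. A generic sketch (not the RELAP5-3D tooling itself; field names and tolerances are hypothetical) is:

```python
import numpy as np

def sequential_check(baseline, candidate, rel_tol=1e-12, abs_tol=1e-14):
    """Compare a candidate code version's results against the archived baseline
    from the previous version, field by field, to flag unintended changes."""
    diffs = {}
    for name, ref in baseline.items():
        new = np.asarray(candidate[name])
        if not np.allclose(new, ref, rtol=rel_tol, atol=abs_tol):
            diffs[name] = float(np.max(np.abs(new - np.asarray(ref))))
    return diffs

# Toy archived results for two consecutive versions (hypothetical fields).
v1 = {"pressure": np.array([1.01e5, 1.02e5]), "void_fraction": np.array([0.10, 0.12])}
v2 = {"pressure": np.array([1.01e5, 1.02e5]), "void_fraction": np.array([0.10, 0.12001])}
print(sequential_check(v1, v2) or "no unintended differences")
```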
Evaluation of the 29-km Eta Model for Weather Support to the United States Space Program
NASA Technical Reports Server (NTRS)
Manobianco, John; Nutter, Paul
1997-01-01
The Applied Meteorology Unit (AMU) conducted a year-long evaluation of NCEP's 29-km mesoscale Eta (meso-eta) weather prediction model in order to identify added value to forecast operations in support of the United States space program. The evaluation was stratified over warm and cool seasons and considered both objective and subjective verification methodologies. Objective verification results generally indicate that meso-eta model point forecasts at selected stations exhibit minimal error growth in terms of RMS errors and are reasonably unbiased. Conversely, results from the subjective verification demonstrate that model forecasts of developing weather events such as thunderstorms, sea breezes, and cold fronts are not always as accurate as implied by the seasonal error statistics. Sea-breeze case studies reveal that the model generates a dynamically consistent thermally direct circulation over the Florida peninsula, although at a larger scale than observed. Thunderstorm verification reveals that the meso-eta model is capable of predicting areas of organized convection, particularly during the late afternoon hours, but is not capable of forecasting individual thunderstorms. Verification of cold fronts during the cool season reveals that the model is capable of forecasting a majority of cold frontal passages through east central Florida to within ±1 h of the observed frontal passage.
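The objective point-forecast verification mentioned above rests on two simple statistics, the bias (mean error) and the RMS error. A minimal sketch of those computations (generic, not the AMU's verification system; the values are placeholders) is:

```python
import numpy as np

def bias_and_rmse(forecast, observed):
    """Mean error (bias) and root-mean-square error of point forecasts."""
    err = np.asarray(forecast, dtype=float) - np.asarray(observed, dtype=float)
    return err.mean(), np.sqrt(np.mean(err ** 2))

# Toy 2-m temperature forecasts vs. observations in deg C (hypothetical).
fcst = [28.1, 29.4, 30.2, 27.8, 31.0]
obs = [27.5, 29.9, 30.0, 28.4, 30.2]
b, rmse = bias_and_rmse(fcst, obs)
print(f"bias = {b:+.2f} C, RMSE = {rmse:.2f} C")
```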
Analysis of wolves and sheep. Final report
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hogden, J.; Papcun, G.; Zlokarnik, I.
1997-08-01
In evaluating speaker verification systems, asymmetries have been observed in the ease with which people are able to break into other people's voice locks. People who are good at breaking into voice locks are called wolves, and people whose locks are easy to break into are called sheep. (Goats are people that have a difficult time opening their own voice locks.) Analyses of speaker verification algorithms could be used to understand wolf/sheep asymmetries. Using the notion of a "speaker space", it is demonstrated that such asymmetries could arise even though the similarity of voice 1 to voice 2 is the same as the inverse similarity. This partially explains the wolf/sheep asymmetries, although there may be other factors. The speaker space can be computed from interspeaker similarity data using multidimensional scaling, and such a speaker space can be used to give a good approximation of the interspeaker similarities. The derived speaker space can be used to predict which of the enrolled speakers are likely to be wolves and which are likely to be sheep. However, a speaker must first enroll in the speaker key system and then be compared to each of the other speakers; a good estimate of a person's speaker space position could be obtained using only a speech sample.
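The step of computing a speaker space from interspeaker similarity data via multidimensional scaling can be sketched as follows, assuming a symmetric similarity matrix is already available and converting it to dissimilarities first; the matrix and the 1 - similarity conversion are illustrative assumptions, not the report's actual data or procedure.

```python
import numpy as np
from sklearn.manifold import MDS

# Toy symmetric interspeaker similarity matrix (hypothetical, in [0, 1]).
similarity = np.array([
    [1.00, 0.80, 0.30, 0.25],
    [0.80, 1.00, 0.35, 0.20],
    [0.30, 0.35, 1.00, 0.70],
    [0.25, 0.20, 0.70, 1.00],
])

dissimilarity = 1.0 - similarity                      # simple conversion
mds = MDS(n_components=2, dissimilarity="precomputed", random_state=0)
speaker_space = mds.fit_transform(dissimilarity)      # one point per speaker
print(np.round(speaker_space, 3))
```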
Branck, Tobyn A.; Hurley, Matthew J.; Prata, Gianna N.; Crivello, Christina A.
2017-01-01
Listeria monocytogenes is of great concern in food processing facilities because it persists in biofilms, facilitating biotransfer. Stainless steel is commonly used for food contact surfaces and transport containers. L. monocytogenes biofilms on stainless steel served as a model system for surface sampling, to test the performance of a sonicating swab in comparison with a standard cotton swab. Swab performance and consistency were determined using total viable counts. Stainless steel coupons sampled with both types of swabs were examined using scanning electron microscopy, to visualize biofilms and surface structures (i.e., polishing grooves and scratches). Laser scanning confocal microscopy was used to image and to quantitate the biofilms remaining after sampling with each swab type. The total viable counts were significantly higher (P ≤ 0.05) with the sonicating swab than with the standard swab in each trial. The sonicating swab was more consistent in cell recovery than was the standard swab, with coefficients of variation ranging from 8.9% to 12.3% and from 7.1% to 37.6%, respectively. Scanning electron microscopic imaging showed that biofilms remained in the polished grooves of the coupons sampled with the standard swab but were noticeably absent with the sonicating swab. Percent area measurements of biofilms remaining on the stainless steel coupons showed significantly (P ≤ 0.05) less biofilm remaining when the sonicating swab was used (median, 1.1%), compared with the standard swab (median, 70.4%). The sonicating swab provided greater recovery of cells, with more consistency, than did the standard swab, as it employs sonication, suction, and scrubbing. IMPORTANCE Inadequate surface sampling can result in foodborne illness outbreaks from biotransfer, since verification of sanitization protocols relies on surface sampling and recovery of microorganisms for detection and enumeration. Swabbing is a standard method for microbiological sampling of surfaces. Although swabbing offers portability and ease of use, there are limitations, such as high user variability and low recovery rates, which can be attributed to many different causes. This study demonstrates some benefits that a sonicating swab has over a standard swab for removal and collection of microbiological samples from a surface, to provide better verification of surface cleanliness and to help decrease the potential for biotransfer of pathogens into foods. PMID:28314729
Branck, Tobyn A; Hurley, Matthew J; Prata, Gianna N; Crivello, Christina A; Marek, Patrick J
2017-06-01
Listeria monocytogenes is of great concern in food processing facilities because it persists in biofilms, facilitating biotransfer. Stainless steel is commonly used for food contact surfaces and transport containers. L. monocytogenes biofilms on stainless steel served as a model system for surface sampling, to test the performance of a sonicating swab in comparison with a standard cotton swab. Swab performance and consistency were determined using total viable counts. Stainless steel coupons sampled with both types of swabs were examined using scanning electron microscopy, to visualize biofilms and surface structures (i.e., polishing grooves and scratches). Laser scanning confocal microscopy was used to image and to quantitate the biofilms remaining after sampling with each swab type. The total viable counts were significantly higher (P ≤ 0.05) with the sonicating swab than with the standard swab in each trial. The sonicating swab was more consistent in cell recovery than was the standard swab, with coefficients of variation ranging from 8.9% to 12.3% and from 7.1% to 37.6%, respectively. Scanning electron microscopic imaging showed that biofilms remained in the polished grooves of the coupons sampled with the standard swab but were noticeably absent with the sonicating swab. Percent area measurements of biofilms remaining on the stainless steel coupons showed significantly (P ≤ 0.05) less biofilm remaining when the sonicating swab was used (median, 1.1%), compared with the standard swab (median, 70.4%). The sonicating swab provided greater recovery of cells, with more consistency, than did the standard swab, as it employs sonication, suction, and scrubbing. IMPORTANCE Inadequate surface sampling can result in foodborne illness outbreaks from biotransfer, since verification of sanitization protocols relies on surface sampling and recovery of microorganisms for detection and enumeration. Swabbing is a standard method for microbiological sampling of surfaces. Although swabbing offers portability and ease of use, there are limitations, such as high user variability and low recovery rates, which can be attributed to many different causes. This study demonstrates some benefits that a sonicating swab has over a standard swab for removal and collection of microbiological samples from a surface, to provide better verification of surface cleanliness and to help decrease the potential for biotransfer of pathogens into foods. Copyright © 2017 American Society for Microbiology.
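The swab consistency reported above is summarized with the coefficient of variation (standard deviation divided by the mean). A small sketch of that comparison on hypothetical recovery counts (not the study's data) is:

```python
import numpy as np

def coefficient_of_variation(counts):
    """CV (%) of replicate recovery counts: 100 * sample std / mean."""
    counts = np.asarray(counts, dtype=float)
    return 100.0 * counts.std(ddof=1) / counts.mean()

# Hypothetical CFU recoveries from replicate coupons.
sonicating = [5.1e5, 5.6e5, 4.9e5, 5.4e5]
standard = [2.0e5, 3.4e5, 1.4e5, 2.8e5]
print(f"sonicating swab CV = {coefficient_of_variation(sonicating):.1f}%")
print(f"standard swab CV   = {coefficient_of_variation(standard):.1f}%")
```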
Verification of Ceramic Structures
NASA Astrophysics Data System (ADS)
Behar-Lafenetre, Stephanie; Cornillon, Laurence; Rancurel, Michael; De Graaf, Dennis; Hartmann, Peter; Coe, Graham; Laine, Benoit
2012-07-01
In the framework of the “Mechanical Design and Verification Methodologies for Ceramic Structures” contract [1] awarded by ESA, Thales Alenia Space has investigated literature and practices in affiliated industries to propose a methodological guideline for verification of ceramic spacecraft and instrument structures. It has been written in order to be applicable to most types of ceramic or glass-ceramic materials - typically Cesic®, HBCesic®, Silicon Nitride, Silicon Carbide and ZERODUR®. The proposed guideline describes the activities to be performed at material level in order to cover all the specific aspects of ceramics (Weibull distribution, brittle behaviour, sub-critical crack growth). Elementary tests and their post-processing methods are described, and recommendations for optimization of the test plan are given in order to have a consistent database. The application of this method is shown on an example in a dedicated article [7]. Then the verification activities to be performed at system level are described. This includes classical verification activities based on relevant standard (ECSS Verification [4]), plus specific analytical, testing and inspection features. The analysis methodology takes into account the specific behaviour of ceramic materials, especially the statistical distribution of failures (Weibull) and the method to transfer it from elementary data to a full-scale structure. The demonstration of the efficiency of this method is described in a dedicated article [8]. The verification is completed by classical full-scale testing activities. Indications about proof testing, case of use and implementation are given and specific inspection and protection measures are described. These additional activities are necessary to ensure the required reliability. The aim of the guideline is to describe how to reach the same reliability level as for structures made of more classical materials (metals, composites).
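The Weibull treatment referenced above, transferring a strength distribution measured on elementary coupons to a full-scale ceramic part, is commonly expressed through a size-scaled failure probability. The sketch below uses the common two-parameter weakest-link form for a uniformly stressed volume, with hypothetical parameter values; it is an illustration of the statistical idea, not the guideline's actual procedure or data.

```python
import numpy as np

def failure_probability(stress, sigma_0, m, volume, volume_0):
    """Two-parameter weakest-link Weibull failure probability for a uniformly
    stressed volume, scaled from a reference (coupon) volume volume_0."""
    return 1.0 - np.exp(-(volume / volume_0) * (stress / sigma_0) ** m)

# Hypothetical coupon-derived parameters for a ceramic material.
sigma_0 = 300.0   # MPa, characteristic strength at the coupon volume
m = 10.0          # Weibull modulus
print(failure_probability(stress=150.0, sigma_0=sigma_0, m=m,
                          volume=2.0e5, volume_0=1.0e3))
```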
The TraceDetect's SafeGuard is designed to automatically measure total arsenic concentrations in drinking water samples (including raw water and treated water) over a range from 1 ppb to over 100 ppb. Once the operator has introduced the sample vial and selected "measure"...
ENVIRONMENTAL TECHNOLOGY VERIFICATION REPORT, PCB DETECTION TECHNOLOGY, HYBRIZYME DELFIA TM ASSAY
The DELFIA PCB Assay is a solid-phase time-resolved fluoroimmunoassay based on the sequential addition of sample extract and europium-labeled PCB tracer to a monoclonal antibody reagent specific for PCBs. In this assay, the antibody reagent and sample extract are added to a strip...
Experimental verification of multipartite entanglement in quantum networks
McCutcheon, W.; Pappa, A.; Bell, B. A.; McMillan, A.; Chailloux, A.; Lawson, T.; Mafu, M.; Markham, D.; Diamanti, E.; Kerenidis, I.; Rarity, J. G.; Tame, M. S.
2016-01-01
Multipartite entangled states are a fundamental resource for a wide range of quantum information processing tasks. In particular, in quantum networks, it is essential for the parties involved to be able to verify if entanglement is present before they carry out a given distributed task. Here we design and experimentally demonstrate a protocol that allows any party in a network to check if a source is distributing a genuinely multipartite entangled state, even in the presence of untrusted parties. The protocol remains secure against dishonest behaviour of the source and other parties, including the use of system imperfections to their advantage. We demonstrate the verification protocol in a three- and four-party setting using polarization-entangled photons, highlighting its potential for realistic photonic quantum communication and networking applications. PMID:27827361
Federal Register 2010, 2011, 2012, 2013, 2014
2013-08-28
... persons to submit comments on this document. Comments may be submitted by one of the following methods... very low (less than one percent), and this carcass sampling was expensive for the Agency. As stated in.... Following the implementation of PR/HACCP, FSIS analyzed only one pathogen per sample. Then, in 2008, FSIS...
NASA Astrophysics Data System (ADS)
Martinez, J. C.; Guzmán-Sepúlveda, J. R.; Bolañoz Evia, G. R.; Córdova, T.; Guzmán-Cabrera, R.
2018-06-01
In this work, we applied machine learning techniques to Raman spectra for the characterization and classification of manufactured pharmaceutical products. Our measurements were taken with commercial equipment, for accurate assessment of variations with respect to one calibrated control sample. Unlike the typical use of Raman spectroscopy in pharmaceutical applications, in our approach the principal components of the Raman spectrum are used concurrently as attributes in machine learning algorithms. This permits an efficient comparison and classification of the spectra measured from the samples under study. This also allows for accurate quality control as all relevant spectral components are considered simultaneously. We demonstrate our approach with respect to the specific case of acetaminophen, which is one of the most widely used analgesics in the market. In the experiments, commercial samples from thirteen different laboratories were analyzed and compared against a control sample. The raw data were analyzed based on an arithmetic difference between the nominal active substance and the measured values in each commercial sample. The principal component analysis was applied to the data for quantitative verification (i.e., without considering the actual concentration of the active substance) of the difference in the calibrated sample. Our results show that by following this approach adulterations in pharmaceutical compositions can be clearly identified and accurately quantified.
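The pipeline described above, principal components of the Raman spectra used concurrently as attributes for a classifier, can be sketched generically as follows. The synthetic spectra, labels, number of components, and choice of a linear SVM are placeholders, not the study's data or exact machine learning setup.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

rng = np.random.default_rng(0)
n_per_lab, n_channels = 30, 500

# Synthetic stand-in for baseline-corrected Raman spectra from two "laboratories".
base = np.exp(-0.5 * ((np.arange(n_channels) - 250) / 20.0) ** 2)
lab_a = base + rng.normal(0, 0.02, (n_per_lab, n_channels))
lab_b = 0.9 * base + rng.normal(0, 0.02, (n_per_lab, n_channels))  # weaker peak

X = np.vstack([lab_a, lab_b])
y = np.array([0] * n_per_lab + [1] * n_per_lab)

# Principal components feed the classifier directly.
model = make_pipeline(StandardScaler(), PCA(n_components=5), SVC(kernel="linear"))
print(cross_val_score(model, X, y, cv=5).mean())
```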
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ashcraft, C. Chace; Niederhaus, John Henry; Robinson, Allen C.
We present a verification and validation analysis of a coordinate-transformation-based numerical solution method for the two-dimensional axisymmetric magnetic diffusion equation, implemented in the finite-element simulation code ALEGRA. The transformation, suggested by Melissen and Simkin, yields an equation set perfectly suited for linear finite elements and for problems with large jumps in material conductivity near the axis. The verification analysis examines transient magnetic diffusion in a rod or wire in a very low conductivity background by first deriving an approximate analytic solution using perturbation theory. This approach for generating a reference solution is shown to be not fully satisfactory. A specialized approach for manufacturing an exact solution is then used to demonstrate second-order convergence under spatial and temporal refinement. For this new implementation, a significant improvement relative to previously available formulations is observed. Benefits in accuracy for computed current density and Joule heating are also demonstrated. The validation analysis examines the circuit-driven explosion of a copper wire using resistive magnetohydrodynamics modeling, in comparison to experimental tests. The new implementation matches the accuracy of the existing formulation, with both formulations capturing the experimental burst time and action to within approximately 2%.
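Second-order convergence claims of this kind are usually backed by an observed-order calculation on a sequence of refined grids. A generic sketch, using error norms against a manufactured solution with hypothetical numbers (not the ALEGRA results), is:

```python
import math

def observed_order(error_coarse, error_fine, refinement_ratio=2.0):
    """Observed order of accuracy from error norms on two grids related by a
    uniform refinement ratio: p = log(e_coarse / e_fine) / log(r)."""
    return math.log(error_coarse / error_fine) / math.log(refinement_ratio)

# Hypothetical L2 error norms against a manufactured solution,
# on grids refined by a factor of 2 each time.
errors = [4.0e-3, 1.02e-3, 2.6e-4]
for coarse, fine in zip(errors, errors[1:]):
    print(f"observed order ~ {observed_order(coarse, fine):.2f}")
```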
Particle shape accounts for instrumental discrepancy in ice core dust size distributions
NASA Astrophysics Data System (ADS)
Folden Simonsen, Marius; Cremonesi, Llorenç; Baccolo, Giovanni; Bosch, Samuel; Delmonte, Barbara; Erhardt, Tobias; Kjær, Helle Astrid; Potenza, Marco; Svensson, Anders; Vallelonga, Paul
2018-05-01
The Klotz Abakus laser sensor and the Coulter counter are both used for measuring the size distribution of insoluble mineral dust particles in ice cores. While the Coulter counter measures particle volume accurately, the equivalent Abakus instrument measurement deviates substantially from the Coulter counter. We show that the difference between the Abakus and the Coulter counter measurements is mainly caused by the irregular shape of dust particles in ice core samples. The irregular shape means that a new calibration routine based on standard spheres is necessary for obtaining fully comparable data. This new calibration routine gives an increased accuracy to Abakus measurements, which may improve future ice core record intercomparisons. We derived an analytical model for extracting the aspect ratio of dust particles from the difference between Abakus and Coulter counter data. For verification, we measured the aspect ratio of the same samples directly using a single-particle extinction and scattering instrument. The results demonstrate that the model is accurate enough to discern between samples of aspect ratio 0.3 and 0.4 using only the comparison of Abakus and Coulter counter data.
Model-based engineering for medical-device software.
Ray, Arnab; Jetley, Raoul; Jones, Paul L; Zhang, Yi
2010-01-01
This paper demonstrates the benefits of adopting model-based design techniques for engineering medical device software. By using a patient-controlled analgesic (PCA) infusion pump as a candidate medical device, the authors show how using models to capture design information allows for i) fast and efficient construction of executable device prototypes, ii) creation of a standard, reusable baseline software architecture for a particular device family, iii) formal verification of the design against safety requirements, and iv) creation of a safety framework that reduces verification costs for future versions of the device software.
The Second NASA Formal Methods Workshop 1992
NASA Technical Reports Server (NTRS)
Johnson, Sally C. (Compiler); Holloway, C. Michael (Compiler); Butler, Ricky W. (Compiler)
1992-01-01
The primary goal of the workshop was to bring together formal methods researchers and aerospace industry engineers to investigate new opportunities for applying formal methods to aerospace problems. The first part of the workshop was tutorial in nature. The second part of the workshop explored the potential of formal methods to address current aerospace design and verification problems. The third part of the workshop involved on-line demonstrations of state-of-the-art formal verification tools. Also, a detailed survey was filled in by the attendees; the results of the survey are compiled.
NASA Technical Reports Server (NTRS)
Ponchak, George E.; Chun, Donghoon; Katehi, Linda P. B.; Yook, Jong-Gwan
1999-01-01
Coupling between microstrip lines in dense RF packages is a common problem that degrades circuit performance. Prior 3D-FEM electromagnetic simulations have shown that metal-filled via hole fences between two adjacent microstrip lines actually increase coupling between the lines; however, if the tops of the via posts are connected by a metal strip, coupling is reduced. In this paper, experimental verification of the 3D-FEM simulations is demonstrated for commercially fabricated LTCC packages.
Tethered satellite system dynamics and control review panel and related activities, phase 3
NASA Technical Reports Server (NTRS)
1991-01-01
Two major tests of the Tethered Satellite System (TSS) engineering and flight units were conducted to demonstrate the functionality of the hardware and software. Deficiencies in the hardware/software integration tests (HSIT) led to a recommendation for more testing to be performed. Selected problem areas of tether dynamics were analyzed, including verification of the severity of skip rope oscillations, verification or comparison runs to explore dynamic phenomena observed in other simulations, and data generation runs to explore the performance of the time domain and frequency domain skip rope observers.
The Environmental Response Laboratory Network supports the goal to increase national capacity for biological analysis of environmental samples. This includes methods development and verification, technology transfer, and collaboration with USDA, FERN, CDC.
Assume-Guarantee Verification of Source Code with Design-Level Assumptions
NASA Technical Reports Server (NTRS)
Giannakopoulou, Dimitra; Pasareanu, Corina S.; Cobleigh, Jamieson M.
2004-01-01
Model checking is an automated technique that can be used to determine whether a system satisfies certain required properties. To address the 'state explosion' problem associated with this technique, we propose to integrate assume-guarantee verification at different phases of system development. During design, developers build abstract behavioral models of the system components and use them to establish key properties of the system. To increase the scalability of model checking at this level, we have developed techniques that automatically decompose the verification task by generating component assumptions for the properties to hold. The design-level artifacts are subsequently used to guide the implementation of the system, but also to enable more efficient reasoning at the source-code level. In particular, we propose to use design-level assumptions to similarly decompose the verification of the actual system implementation. We demonstrate our approach on a significant NASA application, where design-level models were used to identify and correct a safety property violation, and design-level assumptions allowed us to check successfully that the property was preserved by the implementation.
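For reference, the decomposition described above rests on the classic asymmetric assume-guarantee rule from the compositional verification literature, in which an assumption A about the environment of component M1 mediates the proof of a property P for the composed system (stated here for context; it is not quoted from the abstract):

\[
\frac{\langle A \rangle\, M_1\, \langle P \rangle \qquad \langle \mathit{true} \rangle\, M_2\, \langle A \rangle}{\langle \mathit{true} \rangle\, M_1 \parallel M_2\, \langle P \rangle}
\]

Reusing a design-level assumption A at the code level is what allows the implementation of M1 to be checked against P without composing it with the full implementation of M2.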
A system verification platform for high-density epiretinal prostheses.
Chen, Kuanfu; Lo, Yi-Kai; Yang, Zhi; Weiland, James D; Humayun, Mark S; Liu, Wentai
2013-06-01
Retinal prostheses have restored light perception to people worldwide who have poor or no vision as a consequence of retinal degeneration. To advance the quality of visual stimulation for retinal implant recipients, a higher number of stimulation channels is expected in the next generation retinal prostheses, which poses a great challenge to system design and verification. This paper presents a system verification platform dedicated to the development of retinal prostheses. The system includes primary processing, dual-band power and data telemetry, a high-density stimulator array, and two methods for output verification. End-to-end system validation and individual functional block characterization can be achieved with this platform through visual inspection and software analysis. Custom-built software running on the computers also provides a good way for testing new features before they are realized by the ICs. Real-time visual feedbacks through the video displays make it easy to monitor and debug the system. The characterization of the wireless telemetry and the demonstration of the visual display are reported in this paper using a 256-channel retinal prosthetic IC as an example.
A software engineering approach to expert system design and verification
NASA Technical Reports Server (NTRS)
Bochsler, Daniel C.; Goodwin, Mary Ann
1988-01-01
Software engineering design and verification methods for developing expert systems are not yet well defined. Integration of expert system technology into software production environments will require effective software engineering methodologies to support the entire life cycle of expert systems. The software engineering methods used to design and verify an expert system, RENEX, are discussed. RENEX demonstrates autonomous rendezvous and proximity operations, including replanning trajectory events and subsystem fault detection, onboard a space vehicle during flight. The RENEX designers utilized a number of software engineering methodologies to deal with the complex problems inherent in this system. An overview is presented of the methods utilized. Details of the verification process receive special emphasis. The benefits and weaknesses of the methods for supporting the development life cycle of expert systems are evaluated, and recommendations are made based on the overall experiences with the methods.
NASA Technical Reports Server (NTRS)
Bickford, Mark; Srivas, Mandayam
1991-01-01
Presented here is a formal specification and verification of a property of a quadruplicately redundant fault-tolerant microprocessor system design. A complete listing of the formal specification of the system and the correctness theorems that are proved are given. The system performs the task of obtaining interactive consistency among the processors using a special instruction on the processors. The design is based on an algorithm proposed by Pease, Shostak, and Lamport. The verified property ensures that an execution of the special instruction by the processors correctly accomplishes interactive consistency, provided certain preconditions hold. The verification was performed using a computer-aided design verification tool, Spectool, and the theorem prover, Clio. A major contribution of the work is the demonstration of a significant fault-tolerant hardware design that is mechanically verified by a theorem prover.
WTP Waste Feed Qualification: Hydrogen Generation Rate Measurement Apparatus Testing Report
DOE Office of Scientific and Technical Information (OSTI.GOV)
Stone, M. E.; Newell, J. D.; Smith, T. E.
The generation rate of hydrogen gas in the Hanford tank waste will be measured during the qualification of the staged tank waste for processing in the Hanford Tank Waste Treatment and Immobilization Plant. Based on a review of past practices in measurement of the hydrogen generation, an apparatus to perform this measurement has been designed and tested for use during waste feed qualification. The hydrogen generation rate measurement apparatus (HGRMA) described in this document utilized a 100 milliliter sample in a continuously-purged, continuously-stirred vessel, with measurement of hydrogen concentration in the vent gas. The vessel and lid had a combined 220 milliliters of headspace. The vent gas system included a small condenser to prevent excessive evaporative losses from the sample during the test, as well as a demister and filter to prevent particle migration from the sample to the gas chromatography system. The gas chromatograph was an on-line automated instrument with a large-volume sample-injection system to allow measurement of very low hydrogen concentrations. This instrument automatically sampled the vent gas from the hydrogen generation rate measurement apparatus every five minutes and performed data regression in real time. The fabrication of the hydrogen generation rate measurement apparatus was in accordance with twenty-three (23) design requirements documented in the conceptual design package, as well as seven (7) required developmental activities documented in the task plan associated with this work scope. The HGRMA was initially tested for proof of concept with physical simulants, and a remote demonstration of the system was performed in the Savannah River National Laboratory Shielded Cells Mockup Facility. Final verification testing was performed using non-radioactive simulants of the Hanford tank waste. Three different simulants were tested to bound the rheological properties expected during waste feed qualification testing. These simulants were tested at different temperatures using purge gas spiked with varying amounts of hydrogen to verify that the system could accurately measure the hydrogen in the vent gas at steady state.
Cho, Gun-Sang; Kim, Dae-Sung; Yi, Eun-Surk
2015-12-01
The purpose of this study was to verify a relationship model between the Korean new elderly class's recovery resilience and productive aging. As of 2013, this study sampled preliminary elderly people in Gyeonggi-do and other provinces nationwide. Data from a total of 484 effective subjects were analyzed. The collected data were processed using IBM SPSS 20.0 and AMOS 20.0, and underwent descriptive statistical analysis, confirmatory factor analysis, and structural model verification. The path coefficient associated with model fitness was examined. The standardized path coefficient between recovery resilience and productive aging is β=0.975 (t=14.790), revealing a statistically significant positive effect. Thus, it was found that the proposed basic model on the direct path of recovery resilience and productive aging was an acceptable fit.
Sampled-Data Techniques Applied to a Digital Controller for an Altitude Autopilot
NASA Technical Reports Server (NTRS)
Schmidt, Stanley F.; Harper, Eleanor V.
1959-01-01
Sampled-data theory, using the Z transformation, is applied to the design of a digital controller for an aircraft-altitude autopilot. Particular attention is focused on the sensitivity of the design to parameter variations and the abruptness of the response, that is, the normal acceleration required to carry out a transient maneuver. Consideration of these two characteristics of the system has shown that the finite settling time design method produces an unacceptable system, primarily because of the high sensitivity of the response to parameter variations, although abruptness can be controlled by increasing the sampling period. Also demonstrated is the importance of having well-damped poles or zeros if cancellation is attempted in the design methods. A different method of smoothing the response and obtaining a design which is not excessively sensitive is proposed, and examples are carried through to demonstrate the validity of the procedure. This method is based on design concepts of continuous systems, and it is shown that if no pole-zero cancellations are allowed in the design, one can obtain a response which is not too abrupt, is relatively insensitive to parameter variations, and is not sensitive to practical limits on control-surface rate. This particular design also has the simplest possible pulse transfer function for the digital controller. Simulation techniques and root loci are used for the verification of the design philosophy.
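A minimal modern analogue of the sampled-data setup described above, assuming a simple hypothetical plant, is the zero-order-hold discretization of a continuous transfer function to obtain the pulse transfer function a digital controller acts on; the Python sketch below uses scipy purely for illustration and is not drawn from the 1959 report.

# Illustrative sketch: zero-order-hold discretization of a hypothetical
# second-order plant, giving the pulse transfer function seen by a digital
# controller at sample period T.
from scipy.signal import cont2discrete

num = [1.0]              # hypothetical continuous plant 1 / (s^2 + 2s + 1)
den = [1.0, 2.0, 1.0]
T = 0.1                  # sample period in seconds (placeholder)

num_d, den_d, dt = cont2discrete((num, den), T, method="zoh")
print("discrete numerator:  ", num_d)
print("discrete denominator:", den_d)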
Toward a durable superhydrophobic aluminum surface by etching and ZnO nanoparticle deposition.
Rezayi, Toktam; Entezari, Mohammad H
2016-02-01
Fabrication of suitable roughness is a fundamental step for acquiring superhydrophobic surfaces. For this purpose, deposition of ZnO nanoparticles on an Al surface was carried out by simple immersion and ultrasound approaches. Then, surface energy reduction was performed using a stearic acid (STA) ethanol solution for both methods. The results demonstrated that ultrasound leads to more stable superhydrophobic Al surfaces (STA-ZnO-Al-U) in comparison with the simple immersion method (STA-ZnO-Al-I). In addition, another sample was etched in HCl solution before ZnO deposition to acquire a more mechanically stable superhydrophobic surface. The potentiodynamic measurements demonstrate that etching in HCl solution under ultrasound leads to a superhydrophobic surface (STA-ZnO-Al(E)-U). This sample shows a remarkable decrease in corrosion current density (icorr) and improved long-term stability during immersion in NaCl solution (3.5%) in comparison with the sample prepared without etching (STA-ZnO-Al-U). Scanning electron microscopy (SEM) and energy-dispersive X-ray spectroscopy (EDX) confirmed denser and more extensive particle deposition on the Al substrate when ultrasound was applied in the system. The crystallite evaluation of deposited ZnO nanoparticles was carried out using an X-ray diffractometer (XRD). Finally, for verification of STA grafting on the Al surface, Fourier transform infrared spectroscopy in conjunction with attenuated total reflection (FTIR-ATR) was used as a proper technique. Copyright © 2015 Elsevier Inc. All rights reserved.
Time-Lapse Electrical Geophysical Monitoring of Amendment-Based Biostimulation
DOE Office of Scientific and Technical Information (OSTI.GOV)
Johnson, Timothy C.; Versteeg, Roelof; Day-Lewis, Frederick D.
Biostimulation is increasingly used to accelerate microbial remediation of recalcitrant groundwater contaminants. Effective application of biostimulation requires successful emplacement of amendment in the contaminant target zone. Verification of remediation performance requires postemplacement assessment and contaminant monitoring. Sampling-based approaches are expensive and provide low-density spatial and temporal information. Time-lapse electrical resistivity tomography (ERT) is an effective geophysical method for determining temporal changes in subsurface electrical conductivity. Because remedial amendments and biostimulation-related biogeochemical processes often change subsurface electrical conductivity, ERT can complement and enhance sampling-based approaches for assessing emplacement and monitoring biostimulation-based remediation. Field studies demonstrating the ability of time-lapse ERT to monitor amendment emplacement and behavior were performed during a biostimulation remediation effort conducted at the Department of Defense Reutilization and Marketing Office (DRMO) Yard, in Brandywine, Maryland, United States. Geochemical fluid sampling was used to calibrate a petrophysical relation in order to predict groundwater indicators of amendment distribution. The petrophysical relations were field validated by comparing predictions to sequestered fluid sample results, thus demonstrating the potential of electrical geophysics for quantitative assessment of amendment-related geochemical properties. Crosshole radar zero-offset profile and borehole geophysical logging were also performed to augment the data set and validate interpretation. In addition to delineating amendment transport in the first 10 months after emplacement, the time-lapse ERT results show later changes in bulk electrical properties interpreted as mineral precipitation. Results support the use of more cost-effective surface-based ERT in conjunction with limited field sampling to improve spatial and temporal monitoring of amendment emplacement and remediation performance.
Time-lapse electrical geophysical monitoring of amendment-based biostimulation
Johnson, Timothy C.; Versteeg, Roelof J.; Day-Lewis, Frederick D.; Major, William; Lane, John W.
2015-01-01
Biostimulation is increasingly used to accelerate microbial remediation of recalcitrant groundwater contaminants. Effective application of biostimulation requires successful emplacement of amendment in the contaminant target zone. Verification of remediation performance requires postemplacement assessment and contaminant monitoring. Sampling-based approaches are expensive and provide low-density spatial and temporal information. Time-lapse electrical resistivity tomography (ERT) is an effective geophysical method for determining temporal changes in subsurface electrical conductivity. Because remedial amendments and biostimulation-related biogeochemical processes often change subsurface electrical conductivity, ERT can complement and enhance sampling-based approaches for assessing emplacement and monitoring biostimulation-based remediation. Field studies demonstrating the ability of time-lapse ERT to monitor amendment emplacement and behavior were performed during a biostimulation remediation effort conducted at the Department of Defense Reutilization and Marketing Office (DRMO) Yard, in Brandywine, Maryland, United States. Geochemical fluid sampling was used to calibrate a petrophysical relation in order to predict groundwater indicators of amendment distribution. The petrophysical relations were field validated by comparing predictions to sequestered fluid sample results, thus demonstrating the potential of electrical geophysics for quantitative assessment of amendment-related geochemical properties. Crosshole radar zero-offset profile and borehole geophysical logging were also performed to augment the data set and validate interpretation. In addition to delineating amendment transport in the first 10 months after emplacement, the time-lapse ERT results show later changes in bulk electrical properties interpreted as mineral precipitation. Results support the use of more cost-effective surface-based ERT in conjunction with limited field sampling to improve spatial and temporal monitoring of amendment emplacement and remediation performance.
The Oxford ED2000 x-ray fluorescence (XRF) analyzer was demonstrated under the U.S. Environmental Protection Agency (EPA) Superfund Innovative Technology Evaluation (SITE) Program. The field portion of the demonstration was conducted in January 2005 at the Kennedy Athletic, Recr...
The Niton XLt 700 Series (XLt) XRF Services x-ray fluorescence (XRF) analyzer was demonstrated under the U.S. Environmental Protection Agency (EPA) Superfund Innovative Technology Evaluation (SITE) Program. The field portion of the demonstration was conducted in January 2005 at ...
This work represents the technical and editorial contributions of a large number of U.S. Environmental Protection Agency (EPA) employees and others familiar with or interested in the demonstration and evaluation of innovative site characterization and monitoring technologies. In ...
The Innov-X XT400 Series (XT400) x-ray fluorescence (XRF) analyzer was demonstrated under the U.S. Environmental Protection Agency (EPA) Superfund Innovative Technology Evaluation (SITE) Program. The field portion of the demonstration was conducted in January 2005 at the Kenned...
The Rigaku ZSX Mini II (ZSX Mini II) XRF Services x-ray fluorescence (XRF) analyzer was demon-strated under the U.S. Environmental Protection Agency (EPA) Superfund Innovative Technology Evaluation (SITE) Program. The field portion of the demonstration was conducted in January 2...
Verenitch, Sergei; Mazumder, Asit
2015-01-01
The use of nitrogen stable isotopes to discriminate between conventionally and organically grown crops has been further developed in this study. Soil and irrigation water from different regions, as well as nitrogen fertilizers used, have been examined in detail to determine their effects on nitrogen isotope composition of spinach, lettuce, broccoli and tomatoes. Over 1000 samples of various types of organically and conventionally grown produce of known origin, along with the samples of nitrogen fertilizers used for their growth, have been analysed in order to assemble the datasets of crop/fertilizer correlations. The results demonstrate that the developed approach can be used as a valuable component in the verification of agricultural practices for more than 25 different types of commercially grown green produce, either organic or conventional. Over a period of two years, various organic and non-organic greens, from different stores in Seattle (WA, USA) and Victoria (BC, Canada), were collected and analysed using this methodology with the objective of determining any pattern of misrepresentation.
DOE Office of Scientific and Technical Information (OSTI.GOV)
L. M. Dittmer
2007-08-30
The 1607-B1 Septic System includes a septic tank, drain field, and associated connecting pipelines and influent sanitary sewer lines. This septic system serviced the former 1701-B Badgehouse, 1720-B Patrol Building/Change Room, and the 1709-B Fire Headquarters. The 1607-B1 waste site received unknown amounts of nonhazardous, nonradioactive sanitary sewage from these facilities during its operational history from 1944 to approximately 1970. In accordance with this evaluation, the confirmatory sampling results support a reclassification of this site to No Action. The current site conditions achieve the remedial action objectives and the corresponding remedial action goals established in the Remaining Sites ROD. The results of confirmatory sampling show that residual contaminant concentrations do not preclude any future uses and allow for unrestricted use of shallow zone soils. The results also demonstrate that residual contaminant concentrations are protective of groundwater and the Columbia River.
NASA Astrophysics Data System (ADS)
Kaminski, Thomas; Rayner, Peter Julian
2017-10-01
Various observational data streams have been shown to provide valuable constraints on the state and evolution of the global carbon cycle. These observations have the potential to reduce uncertainties in past, current, and predicted natural and anthropogenic surface fluxes. In particular such observations provide independent information for verification of actions as requested by the Paris Agreement. It is, however, difficult to decide which variables to sample, and how, where, and when to sample them, in order to achieve an optimal use of the observational capabilities. Quantitative network design (QND) assesses the impact of a given set of existing or hypothetical observations in a modelling framework. QND has been used to optimise in situ networks and assess the benefit to be expected from planned space missions. This paper describes recent progress and highlights aspects that are not yet sufficiently addressed. It demonstrates the advantage of an integrated QND system that can simultaneously evaluate a multitude of observational data streams and assess their complementarity and redundancy.
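Quantitative network design of this kind is commonly assessed, in a linearised Gaussian setting, by propagating prior and observational uncertainties to a posterior parameter uncertainty. The Python sketch below illustrates that calculation for a hypothetical two-parameter, three-observation network; all matrices are placeholders and are not taken from the paper.

# Illustrative sketch: posterior parameter covariance for a linear(ised)
# observation operator H, prior covariance B, and observation error covariance R:
#   A = (H^T R^{-1} H + B^{-1})^{-1}
import numpy as np

H = np.array([[1.0, 0.5],
              [0.2, 1.0],
              [0.8, 0.3]])          # hypothetical sensitivities of 3 observations to 2 fluxes
B = np.diag([1.0, 1.0])             # prior flux uncertainty (variance)
R = np.diag([0.1, 0.1, 0.1])        # observation error variance

A = np.linalg.inv(H.T @ np.linalg.inv(R) @ H + np.linalg.inv(B))
print("prior std devs:    ", np.sqrt(np.diag(B)))
print("posterior std devs:", np.sqrt(np.diag(A)))  # smaller = more constraint from the network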
40 CFR 86.1845-01 - Manufacturer in-use verification testing requirements.
Code of Federal Regulations, 2010 CFR
2010-07-01
... of test vehicles in the sample comply with the sample size requirements of this section. Any post... vehicles, light-duty trucks, and complete heavy-duty vehicles shall test, or cause to have tested a...) Low mileage testing. [Reserved] (c) High-mileage testing—(1) Test groups. Testing must be conducted...
40 CFR 86.1845-01 - Manufacturer in-use verification testing requirements.
Code of Federal Regulations, 2012 CFR
2012-07-01
... of test vehicles in the sample comply with the sample size requirements of this section. Any post... vehicles, light-duty trucks, and complete heavy-duty vehicles shall test, or cause to have tested a...) Low mileage testing. [Reserved] (c) High-mileage testing—(1) Test groups. Testing must be conducted...
40 CFR 86.1845-01 - Manufacturer in-use verification testing requirements.
Code of Federal Regulations, 2011 CFR
2011-07-01
... of test vehicles in the sample comply with the sample size requirements of this section. Any post... vehicles, light-duty trucks, and complete heavy-duty vehicles shall test, or cause to have tested a...) Low mileage testing. [Reserved] (c) High-mileage testing—(1) Test groups. Testing must be conducted...
40 CFR 86.1845-01 - Manufacturer in-use verification testing requirements.
Code of Federal Regulations, 2013 CFR
2013-07-01
... of test vehicles in the sample comply with the sample size requirements of this section. Any post... vehicles, light-duty trucks, and complete heavy-duty vehicles shall test, or cause to have tested a...) Low mileage testing. [Reserved] (c) High-mileage testing—(1) Test groups. Testing must be conducted...
40 CFR 86.1845-04 - Manufacturer in-use verification testing requirements.
Code of Federal Regulations, 2014 CFR
2014-07-01
... of test vehicles in the sample comply with the sample size requirements of this section. Any post... HDV must test, or cause to have tested, a specified number of vehicles. Such testing must be conducted... first test will be considered the official results for the test vehicle, regardless of any test results...
A system for EPID-based real-time treatment delivery verification during dynamic IMRT treatment.
Fuangrod, Todsaporn; Woodruff, Henry C; van Uytven, Eric; McCurdy, Boyd M C; Kuncic, Zdenka; O'Connor, Daryl J; Greer, Peter B
2013-09-01
To design and develop a real-time electronic portal imaging device (EPID)-based delivery verification system for dynamic intensity modulated radiation therapy (IMRT) which enables detection of gross treatment delivery errors before delivery of substantial radiation to the patient. The system utilizes a comprehensive physics-based model to generate a series of predicted transit EPID image frames as a reference dataset and compares these to measured EPID frames acquired during treatment. The two datasets are compared using MLC aperture comparison and cumulative signal checking techniques. The system's real-time operation was simulated offline using previously acquired images for 19 IMRT patient deliveries with both frame-by-frame comparison and cumulative frame comparison. Simulated error case studies were used to demonstrate the system sensitivity and performance. The accuracy of the synchronization method was shown to agree within two control points, which corresponds to approximately 1% of the total MU to be delivered for dynamic IMRT. The system achieved mean real-time gamma results for frame-by-frame analysis of 86.6% and 89.0% for 3%, 3 mm and 4%, 4 mm criteria, respectively, and 97.9% and 98.6% for cumulative gamma analysis. The system can detect a 10% MU error using 3%, 3 mm criteria within approximately 10 s. The EPID-based real-time delivery verification system successfully detected simulated gross errors introduced into patient plan deliveries in near real-time (within 0.1 s). A real-time radiation delivery verification system for dynamic IMRT has been demonstrated that is designed to prevent major mistreatments in modern radiation therapy.
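The cumulative signal checking idea can be pictured with a small sketch: accumulate predicted and measured frame signals as delivery proceeds and flag the delivery as soon as their relative difference exceeds a tolerance. The frame arrays, injected error, and 5% threshold below are illustrative placeholders, not the published algorithm.

# Illustrative sketch of a cumulative-signal delivery check: compare the running
# sum of measured EPID frame signal against the predicted running sum and stop
# when the relative deviation exceeds a tolerance.
import numpy as np

rng = np.random.default_rng(1)
predicted_frames = rng.uniform(0.8, 1.2, size=200)   # hypothetical per-frame signal
measured_frames = predicted_frames.copy()
measured_frames[120:] *= 1.10                         # inject a 10% over-delivery

tolerance = 0.05
cum_pred = np.cumsum(predicted_frames)
cum_meas = np.cumsum(measured_frames)

for i, (p, m) in enumerate(zip(cum_pred, cum_meas)):
    if abs(m - p) / p > tolerance:
        print(f"gross delivery error flagged at frame {i}")
        break
else:
    print("delivery completed within tolerance")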
Weak lensing magnification in the Dark Energy Survey Science Verification Data
Garcia-Fernandez, M.; et al.
2018-02-02
In this paper the effect of weak lensing magnification on galaxy number counts is studied by cross-correlating the positions of two galaxy samples, separated by redshift, using data from the Dark Energy Survey Science Verification dataset. The analysis is carried out for two photometrically-selected galaxy samples, with mean photometric redshifts in the $0.2 < z < 0.4$ and $0.7 < z < 1.0$ ranges, in the riz bands. A signal is detected with a $3.5\sigma$ significance level in each of the bands tested, and is compatible with the magnification predicted by the $\Lambda$CDM model. After an extensive analysis, it cannot be attributed to any known systematic effect. The detection of the magnification signal is robust to estimated uncertainties in the outlier rate of the photometric redshifts, but this will be an important issue for use of photometric redshifts in magnification measurements from larger samples. In addition to the detection of the magnification signal, a method to select the sample with the maximum signal-to-noise is proposed and validated with data.
Weak lensing magnification in the Dark Energy Survey Science Verification Data
DOE Office of Scientific and Technical Information (OSTI.GOV)
Garcia-Fernandez, M.; et al.
2016-11-30
In this paper the effect of weak lensing magnification on galaxy number counts is studied by cross-correlating the positions of two galaxy samples, separated by redshift, using data from the Dark Energy Survey Science Verification dataset. The analysis is carried out for two photometrically-selected galaxy samples, with mean photometric redshifts in the $0.2 < z < 0.4$ and $0.7 < z < 1.0$ ranges, in the riz bands. A signal is detected with a $3.5\sigma$ significance level in each of the bands tested, and is compatible with the magnification predicted by the $\Lambda$CDM model. After an extensive analysis, it cannot be attributed to any known systematic effect. The detection of the magnification signal is robust to estimated uncertainties in the outlier rate of the photometric redshifts, but this will be an important issue for use of photometric redshifts in magnification measurements from larger samples. In addition to the detection of the magnification signal, a method to select the sample with the maximum signal-to-noise is proposed and validated with data.
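As a rough illustration of the measurement behind these results, the Python sketch below estimates an angular cross-correlation between a foreground and a background catalogue from pair counts against a random catalogue. This is a simplified Peebles-type estimator on made-up flat-sky coordinates; the actual analysis uses far more careful estimators, masks, and systematics control.

# Illustrative sketch: angular cross-correlation of a low-z (lens) and high-z
# (source) sample via pair counts, w(theta) ~ D1D2/D1R * (N_R/N_D2) - 1.
# Flat-sky approximation and tiny made-up catalogues, for illustration only.
import numpy as np

rng = np.random.default_rng(2)
lenses = rng.uniform(0.0, 1.0, size=(500, 2))    # (ra, dec) in degrees, hypothetical
sources = rng.uniform(0.0, 1.0, size=(500, 2))
randoms = rng.uniform(0.0, 1.0, size=(2000, 2))

bins = np.linspace(0.01, 0.2, 11)                # angular bins in degrees

def pair_counts(a, b, bins):
    d = np.hypot(a[:, None, 0] - b[None, :, 0], a[:, None, 1] - b[None, :, 1])
    return np.histogram(d.ravel(), bins=bins)[0]

d1d2 = pair_counts(lenses, sources, bins)
d1r = pair_counts(lenses, randoms, bins)
w = d1d2 / d1r * (len(randoms) / len(sources)) - 1.0
print(np.round(w, 3))   # ~0 here because the mock catalogues are unclustered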
Addona, Terri A; Abbatiello, Susan E; Schilling, Birgit; Skates, Steven J; Mani, D R; Bunk, David M; Spiegelman, Clifford H; Zimmerman, Lisa J; Ham, Amy-Joan L; Keshishian, Hasmik; Hall, Steven C; Allen, Simon; Blackman, Ronald K; Borchers, Christoph H; Buck, Charles; Cardasis, Helene L; Cusack, Michael P; Dodder, Nathan G; Gibson, Bradford W; Held, Jason M; Hiltke, Tara; Jackson, Angela; Johansen, Eric B; Kinsinger, Christopher R; Li, Jing; Mesri, Mehdi; Neubert, Thomas A; Niles, Richard K; Pulsipher, Trenton C; Ransohoff, David; Rodriguez, Henry; Rudnick, Paul A; Smith, Derek; Tabb, David L; Tegeler, Tony J; Variyath, Asokan M; Vega-Montoto, Lorenzo J; Wahlander, Åsa; Waldemarson, Sofia; Wang, Mu; Whiteaker, Jeffrey R; Zhao, Lei; Anderson, N Leigh; Fisher, Susan J; Liebler, Daniel C; Paulovich, Amanda G; Regnier, Fred E; Tempst, Paul; Carr, Steven A
2010-01-01
Verification of candidate biomarkers relies upon specific, quantitative assays optimized for selective detection of target proteins, and is increasingly viewed as a critical step in the discovery pipeline that bridges unbiased biomarker discovery to preclinical validation. Although individual laboratories have demonstrated that multiple reaction monitoring (MRM) coupled with isotope dilution mass spectrometry can quantify candidate protein biomarkers in plasma, reproducibility and transferability of these assays between laboratories have not been demonstrated. We describe a multilaboratory study to assess reproducibility, recovery, linear dynamic range and limits of detection and quantification of multiplexed, MRM-based assays, conducted by NCI-CPTAC. Using common materials and standardized protocols, we demonstrate that these assays can be highly reproducible within and across laboratories and instrument platforms, and are sensitive to low µg/ml protein concentrations in unfractionated plasma. We provide data and benchmarks against which individual laboratories can compare their performance and evaluate new technologies for biomarker verification in plasma. PMID:19561596
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ruedig, Elizabeth
Public Law 105-119 directs the U.S. Department of Energy (DOE) to convey or transfer parcels of land to the Incorporated County of Los Alamos or their designees and to the Department of Interior, Bureau of Indian Affairs, in trust for the Pueblo de San Ildefonso. Los Alamos National Security is tasked to support DOE in conveyance and/or transfer of identified land parcels no later than September 2022. Under DOE Order 458.1, Radiation Protection of the Public and the Environment (O458.1, 2013) and Los Alamos National Laboratory (LANL or the Laboratory) implementing Policy 412 (P412, 2014), real property with the potential to contain residual radioactive material must meet the criteria for clearance and release to the public. This Sampling and Analysis Plan (SAP) is a second investigation of Tract A-18-2 for the purpose of verifying the previous sampling results (LANL 2017). This sample plan requires 18 project-specific soil samples for use in radiological clearance decisions consistent with LANL Procedure ENV-ES-TP-238 (2015a) and guidance in the Multi-Agency Radiation Survey and Site Investigation Manual (MARSSIM, 2000). The sampling work will be conducted by LANL, and samples will be evaluated by a LANL-contracted independent lab. However, there will be federal review (verification) of all steps of the sampling process.
Certification of lightning protection for a full-authority digital engine control
NASA Technical Reports Server (NTRS)
Dargi, M.; Rupke, E.; Wiles, K.
1991-01-01
FADEC systems present many challenges to the lightning protection engineer. Verification of the protection-design adequacy for certification purposes presents additional challenges. The basic requirement of the certification plan for a FADEC is to demonstrate compliance with Federal Airworthiness Regulations (FAR) 25.1309 and 25.581. These FARs are intended for transport aircraft, but there are equivalent sections for general aviation aircraft and normal and transport rotorcraft. Military aircraft may have additional requirements. The criteria for demonstration of adequate lightning protection for FADEC systems include the procedures outlined in FAA Advisory Circular (AC) 20-136, Protection of Aircraft Electrical/Electronic Systems Against the Indirect Effects of Lightning. As FADEC systems, including the interconnecting wiring, are generally not susceptible to direct attachment of lightning currents, the verification of protection against indirect effects is primarily described.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Swart, Peter K.; Dixon, Tim
2014-09-30
A series of surface geophysical and geochemical techniques are tested in order to demonstrate and validate low-cost approaches for Monitoring, Verification and Accounting (MVA) of the integrity of deep reservoirs for CO2 storage. These techniques are (i) surface deformation by GPS; (ii) surface deformation by InSAR; (iii) passive source seismology via broad-band seismometers; and (iv) soil gas monitoring with a cavity ring-down spectrometer for measurement of CO2 concentration and carbon isotope ratio. The techniques were tested at an active EOR (Enhanced Oil Recovery) site in Texas. Each approach has demonstrated utility. Assuming Carbon Capture, Utilization and Storage (CCUS) activities become operational in the future, these techniques can be used to augment more expensive down-hole techniques.
Development and verification of a cementless novel tapered wedge stem for total hip arthroplasty.
Faizan, Ahmad; Wuestemann, Thies; Nevelos, Jim; Bastian, Adam C; Collopy, Dermot
2015-02-01
Most current tapered wedge hip stems were designed based upon the original Mueller straight stem design introduced in 1977. These stems were designed to have a single medial curvature and grew laterally to accommodate different sizes. In this preclinical study, the design and verification of a tapered wedge stem using computed tomography scans of 556 patients are presented. The computer simulation demonstrated that the novel stem, designed for proximal engagement, allowed for reduced distal fixation, particularly in the 40-60 year male population. Moreover, the physical micromotion testing and finite element analysis demonstrated that the novel stem allowed for reduced micromotion. In summary, preclinical data suggest that the computed tomography based stem design described here may offer enhanced implant fit and reduced micromotion. Copyright © 2014 Elsevier Inc. All rights reserved.
Model Checking for Verification of Interactive Health IT Systems
Butler, Keith A.; Mercer, Eric; Bahrami, Ali; Tao, Cui
2015-01-01
Rigorous methods for design and verification of health IT systems have lagged far behind their proliferation. The inherent technical complexity of healthcare, combined with the added complexity of health information technology, makes their resulting behavior unpredictable and introduces serious risk. We propose to mitigate this risk by formalizing the relationship between HIT and the conceptual work that increasingly typifies modern care. We introduce new techniques for modeling clinical workflows and the conceptual products within them that allow established, powerful model checking technology to be applied to interactive health IT systems. The new capability can evaluate the workflows of a new HIT system performed by clinicians and computers to improve safety and reliability. We demonstrate the method on a patient contact system, showing that model checking is effective for interactive systems and that much of it can be automated. PMID:26958166
Long, Ju
2016-05-01
In China, -(SEA), -α(3.7) and -α(4.2) are common deletional α-thalassemia alleles. Gap-PCR is the currently used detection method for these alleles, but its disadvantages include a time-consuming procedure and increased potential for PCR product contamination; the detection method therefore needs to be improved. Based on identical-primer homologous fragments, a qPCR system was developed for deletional α-thalassemia genotyping, composed of a group of quantitatively-related primers and their corresponding probes plus two groups of qualitatively-related primers and their corresponding probes. In order to verify the accuracy of the qPCR system, known genotype samples and random samples were employed. The standard curve results demonstrated that the designed primers and probes all yielded good amplification efficiency. In the tests of known genotype samples and random samples, sample detection results were consistent with verification results, and the αα, -(SEA), -α(3.7) and -α(4.2) alleles were accurately detected by this method. In addition, this method offers a wider detection range, greater speed and a reduced risk of PCR product contamination when compared with current common gap-PCR detection reagents. Copyright © 2016 Elsevier B.V. All rights reserved.
The 2014 Sandia Verification and Validation Challenge: Problem statement
Hu, Kenneth; Orient, George
2016-01-18
This paper presents a case study in utilizing information from experiments, models, and verification and validation (V&V) to support a decision. It consists of a simple system with data and models provided, plus a safety requirement to assess. The goal is to pose a problem that is flexible enough to allow challengers to demonstrate a variety of approaches, but constrained enough to focus attention on a theme. This was accomplished by providing a good deal of background information in addition to the data, models, and code, but directing the participants' activities with specific deliverables. In this challenge, the theme is how to gather and present evidence about the quality of model predictions, in order to support a decision. This case study formed the basis of the 2014 Sandia V&V Challenge Workshop and this resulting special edition of the ASME Journal of Verification, Validation, and Uncertainty Quantification.
Experimental measurement-device-independent verification of quantum steering
NASA Astrophysics Data System (ADS)
Kocsis, Sacha; Hall, Michael J. W.; Bennet, Adam J.; Saunders, Dylan J.; Pryde, Geoff J.
2015-01-01
Bell non-locality between distant quantum systems—that is, joint correlations which violate a Bell inequality—can be verified without trusting the measurement devices used, nor those performing the measurements. This leads to unconditionally secure protocols for quantum information tasks such as cryptographic key distribution. However, complete verification of Bell non-locality requires high detection efficiencies, and is not robust to typical transmission losses over long distances. In contrast, quantum or Einstein-Podolsky-Rosen steering, a weaker form of quantum correlation, can be verified for arbitrarily low detection efficiencies and high losses. The cost is that current steering-verification protocols require complete trust in one of the measurement devices and its operator, allowing only one-sided secure key distribution. Here we present measurement-device-independent steering protocols that remove this need for trust, even when Bell non-locality is not present. We experimentally demonstrate this principle for singlet states and states that do not violate a Bell inequality.
Experimental measurement-device-independent verification of quantum steering.
Kocsis, Sacha; Hall, Michael J W; Bennet, Adam J; Saunders, Dylan J; Pryde, Geoff J
2015-01-07
Bell non-locality between distant quantum systems--that is, joint correlations which violate a Bell inequality--can be verified without trusting the measurement devices used, nor those performing the measurements. This leads to unconditionally secure protocols for quantum information tasks such as cryptographic key distribution. However, complete verification of Bell non-locality requires high detection efficiencies, and is not robust to typical transmission losses over long distances. In contrast, quantum or Einstein-Podolsky-Rosen steering, a weaker form of quantum correlation, can be verified for arbitrarily low detection efficiencies and high losses. The cost is that current steering-verification protocols require complete trust in one of the measurement devices and its operator, allowing only one-sided secure key distribution. Here we present measurement-device-independent steering protocols that remove this need for trust, even when Bell non-locality is not present. We experimentally demonstrate this principle for singlet states and states that do not violate a Bell inequality.
Formal verification of a fault tolerant clock synchronization algorithm
NASA Technical Reports Server (NTRS)
Rushby, John; Vonhenke, Frieder
1989-01-01
A formal specification and mechanically assisted verification of the interactive convergence clock synchronization algorithm of Lamport and Melliar-Smith is described. Several technical flaws in the analysis given by Lamport and Melliar-Smith were discovered, even though their presentation is unusually precise and detailed. It seems that these flaws were not detected by informal peer scrutiny. The flaws are discussed and a revised presentation of the analysis is given that not only corrects the flaws but is also more precise and easier to follow. Some of the corrections to the flaws require slight modifications to the original assumptions underlying the algorithm and to the constraints on its parameters, and thus change the external specifications of the algorithm. The formal analysis of the interactive convergence clock synchronization algorithm was performed using the Enhanced Hierarchical Development Methodology (EHDM) formal specification and verification environment. This application of EHDM provides a demonstration of some of the capabilities of the system.
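The core step of the interactive convergence algorithm is simple to state: each processor adjusts its clock by the egocentric mean of its skews to all clocks, with any reading that differs from its own by more than a threshold Δ replaced by zero. The Python sketch below illustrates that averaging step on made-up clock values; it is a simplification that ignores read errors and the full fault model treated in the formal proof.

# Illustrative sketch of the interactive convergence adjustment: processor p
# averages its apparent skews to all clocks, zeroing out readings that differ
# from its own by more than delta (suspected faulty).
def egocentric_correction(own, others, delta):
    diffs = []
    for c in [own] + list(others):
        d = c - own
        diffs.append(d if abs(d) <= delta else 0.0)
    return sum(diffs) / len(diffs)

clocks = [100.0, 100.3, 99.8, 250.0]   # made-up readings; the last clock is faulty
delta = 1.0
for i, own in enumerate(clocks):
    others = clocks[:i] + clocks[i + 1:]
    print(f"processor {i}: adjust by {egocentric_correction(own, others, delta):+.3f}")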
Kuswandi, Bambang; Irmawati, Titi; Hidayat, Moch Amrun; Jayus; Ahmad, Musa
2014-01-27
A simple visual ethanol biosensor based on alcohol oxidase (AOX) immobilised onto polyaniline (PANI) film for halal verification of fermented beverage samples is described. This biosensor responds to ethanol via a colour change from green to blue, due to the enzymatic reaction of ethanol that produces acetaldehyde and hydrogen peroxide, when the latter oxidizes the PANI film. The procedure to obtain this biosensor consists of the immobilization of AOX onto PANI film by adsorption. For the immobilisation, an AOX solution is deposited on the PANI film and left at room temperature until dried (30 min). The biosensor was constructed as a dip stick for visual and simple use. The colour changes of the films have been scanned and analysed using image analysis software (i.e., ImageJ) to study the characteristics of the biosensor's response toward ethanol. The biosensor has a linear response in an ethanol concentration range of 0.01%-0.8%, with a correlation coefficient (r) of 0.996. The detection limit of the biosensor was 0.001%, with reproducibility (RSD) of 1.6% and a lifetime of up to seven weeks when stored at 4 °C. The biosensor provides accurate results for ethanol determination in fermented drinks, in good agreement with the standard method (gas chromatography). Thus, the biosensor could be used as a simple visual method for ethanol determination in fermented beverage samples, which can be useful to the Muslim community for halal verification.
Fernández-Galán, Esther; Bedini, Josep Lluís; Filella, Xavier
2017-12-01
This study is the first verification of the novel iPTH Siemens ADVIA Centaur® Intact Parathyroid Hormone (iPTHm) chemiluminescence immunoassay based on monoclonal antibodies. We also compared the iPTH results obtained using this assay with the previous ADVIA Centaur® Parathyroid Hormone assay (iPTHp) based on polyclonal antibodies. The analytical performance study of the iPTHm assay included LoD, LoQ, intra- and inter-assay reproducibility, and linearity. A comparison study was performed on 369 routine plasma samples. The results were analyzed independently for patients with normal and abnormal GFR, as well as patients on hemodialysis. In addition, clinical concordance between assays was assessed. Finally, we studied PTH stability of plasma samples at 4°C. For the iPTHm assay, LoD and LoQ were 0.03 pmol/L and 0.10 pmol/L, respectively. Intra- and inter-assay CV were between 2.3% and 6.2%. Linearity was correct in the range from 3.82 to 203.08 pmol/L. Correlation studies showed a good correlation (r=0.99) between iPTHm and iPTHp, with a bias of -2.55% (CI -3.48% to -1.62%) in the range from 0.32 to 117.07 pmol/L. Clinical concordance, assessed by Kappa index, was 0.874. The stability study showed that differences compared to the basal iPTH concentration did not exceed 20% in any of the samples analyzed. The iPTHm assay demonstrated acceptable performance and a very good clinical concordance with the iPTHp assay, currently used in our laboratory. Thus, the novel iPTHm assay can replace the previous iPTHp assay, since results provided by both assays are very similar. In our study, the stability of iPTH is not affected by storage for up to 14 days. Copyright © 2017 The Canadian Society of Clinical Chemists. Published by Elsevier Inc. All rights reserved.
Damage Detection and Verification System (DDVS) for In-Situ Health Monitoring
NASA Technical Reports Server (NTRS)
Williams, Martha K.; Lewis, Mark; Szafran, J.; Shelton, C.; Ludwig, L.; Gibson, T.; Lane, J.; Trautwein, T.
2015-01-01
Project presentation for the Game Changing Program Smart Book Release. The Damage Detection and Verification System (DDVS) expands the Flat Surface Damage Detection System (FSDDS) sensory panels' damage detection capabilities and includes an autonomous inspection capability utilizing cameras and dynamic computer vision algorithms to verify system health. The objectives of this formulation task are to establish the concept of operations, formulate the system requirements for a potential ISS flight experiment, and develop a preliminary design of an autonomous inspection capability system that will be demonstrated as a proof-of-concept ground-based damage detection and inspection system.
NASA Technical Reports Server (NTRS)
Salazar, Giovanni; Droba, Justin C.; Oliver, Brandon; Amar, Adam J.
2016-01-01
With the recent development of multi-dimensional thermal protection system (TPS) material response codes, the capability to account for radiative heating has become a requirement. This paper presents the recent efforts to implement such capabilities in the CHarring Ablator Response (CHAR) code developed at NASA's Johnson Space Center. This work also describes the different numerical methods implemented in the code to compute view factors for radiation problems involving multiple surfaces. Furthermore, verification and validation of the code's radiation capabilities are demonstrated by comparing solutions to analytical results, to other codes, and to radiant test data.
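View factors for multi-surface radiative exchange can be computed in several ways; one common option, Monte Carlo ray casting, is sketched below for the view factor from one unit square to a parallel unit square. The geometry, separation, and sample count are arbitrary placeholders, and this is not the CHAR implementation.

# Illustrative sketch: Monte Carlo view factor from a unit square at z=0 to a
# parallel unit square at z=gap, by casting cosine-weighted diffuse rays and
# counting the fraction that land on the receiving square.
import numpy as np

rng = np.random.default_rng(3)
n_rays = 200_000
gap = 1.0                                      # hypothetical separation

# emission points on the lower square
px = rng.uniform(0.0, 1.0, n_rays)
py = rng.uniform(0.0, 1.0, n_rays)

# cosine-weighted directions over the upward hemisphere
u1, u2 = rng.uniform(size=n_rays), rng.uniform(size=n_rays)
r, phi = np.sqrt(u1), 2.0 * np.pi * u2
dx, dy, dz = r * np.cos(phi), r * np.sin(phi), np.sqrt(1.0 - u1)

# intersection with the plane z = gap
t = gap / dz
hx, hy = px + t * dx, py + t * dy
hits = (hx >= 0.0) & (hx <= 1.0) & (hy >= 0.0) & (hy <= 1.0)

print("estimated view factor F_1->2 =", hits.mean())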
NASA Technical Reports Server (NTRS)
Stevens, G. H.; Anzic, G.
1979-01-01
NASA is conducting a series of millimeter wave satellite communication systems and market studies to: (1) determine potential domestic 30/20 GHz satellite concepts and market potential, and (2) establish the requirements for a suitable technology verification payload which, although intended to be modest in capacity, would sufficiently demonstrate key technologies and experimentally address key operational issues. Preliminary results and critical issues of the current contracted effort are described. Also included is a description of a NASA-developed multibeam satellite payload configuration which may be representative of concepts utilized in a technology flight verification program.
A thermal scale modeling study for Apollo and Apollo applications, volume 1
NASA Technical Reports Server (NTRS)
Shannon, R. L.
1972-01-01
The program is reported for developing and demonstrating the capabilities of thermal scale modeling as a thermal design and verification tool for Apollo and Apollo Applications Projects. The work performed for thermal scale modeling of STB; cabin atmosphere/spacecraft cabin wall thermal interface; closed loop heat rejection radiator; and docked module/spacecraft thermal interface are discussed along with the test facility requirements for thermal scale model testing of AAP spacecraft. It is concluded that thermal scale modeling can be used as an effective thermal design and verification tool to provide data early in a spacecraft development program.
Conditional High-Order Boltzmann Machines for Supervised Relation Learning.
Huang, Yan; Wang, Wei; Wang, Liang; Tan, Tieniu
2017-09-01
Relation learning is a fundamental problem in many vision tasks. Recently, high-order Boltzmann machines and their variants have shown great potential in learning various types of data relation in a range of tasks. But most of these models are learned in an unsupervised way, i.e., without using relation class labels, which makes them less discriminative for some challenging tasks, e.g., face verification. In this paper, with the goal of performing supervised relation learning, we introduce relation class labels into conventional high-order multiplicative interactions with pairwise input samples, and propose a conditional high-order Boltzmann machine (CHBM), which can learn to classify the data relation in a binary classification way. To be able to deal with more complex data relation, we develop two improved variants of CHBM: 1) latent CHBM, which jointly performs relation feature learning and classification, by using a set of latent variables to block the pathway from pairwise input samples to output relation labels and 2) gated CHBM, which untangles factors of variation in data relation, by exploiting a set of latent variables to multiplicatively gate the classification of CHBM. To reduce the large number of model parameters generated by the multiplicative interactions, we approximately factorize high-order parameter tensors into multiple matrices. Then, we develop efficient supervised learning algorithms, by first pretraining the models using the joint likelihood to provide good parameter initialization, and then fine-tuning them using the conditional likelihood to enhance the discriminative ability. We apply the proposed models to a series of tasks including invariant recognition, face verification, and action similarity labeling. Experimental results demonstrate that by exploiting supervised relation labels, our models can greatly improve the performance.
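The factorization trick mentioned here can be pictured with a toy numpy sketch of a gated, factored three-way interaction, where a high-order tensor W[i,j,k] is approximated by per-factor projection matrices. The shapes and values are arbitrary, and this is only a schematic of the scoring term, not the full CHBM training procedure.

# Illustrative sketch: a factored three-way multiplicative interaction between a
# sample pair (x, y) and relation variables z, replacing a full tensor W[i,j,k]
# with factor matrices Wx, Wy, Wz:  score_k = sum_f (x.Wx)_f * (y.Wy)_f * (Wz[k])_f
import numpy as np

rng = np.random.default_rng(4)
dim, n_factors, n_rel = 64, 32, 2        # hypothetical sizes

Wx = rng.normal(scale=0.1, size=(dim, n_factors))
Wy = rng.normal(scale=0.1, size=(dim, n_factors))
Wz = rng.normal(scale=0.1, size=(n_rel, n_factors))

x = rng.normal(size=dim)                 # first sample of the pair
y = rng.normal(size=dim)                 # second sample of the pair

# score of each relation class (e.g., "same" vs "different")
scores = Wz @ ((x @ Wx) * (y @ Wy))
probs = np.exp(scores - scores.max())
probs /= probs.sum()
print("relation class probabilities:", probs)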
NASA Technical Reports Server (NTRS)
Melendez, Orlando; Trizzino, Mary; Fedderson, Bryan
1997-01-01
The National Aeronautics and Space Administration (NASA), Kennedy Space Center (KSC) Materials Science Division conducted a study to evaluate alternative solvents for CFC-113 in precision cleaning and verification on typical samples that are used in the KSC environment. The effects of AK-225(R), Vertrel(R), MCA, and HFE A 7100 on selected metal and polymer materials were studied over 1, 7 and 30 day test times. This report addresses a study on the compatibility aspects of replacement solvents for materials in aerospace applications.
Self-Verification and Depressive Symptoms in Marriage and Courtship: A Multiple Pathway Model.
ERIC Educational Resources Information Center
Katz, Jennifer; Beach, Steven R. H.
1997-01-01
Examines whether self-verifying feedback may lead to decreased depressive symptoms. Results, based on 138 married women and 258 dating women, showed full mediational effects in the married sample and partial effects in the dating sample. Findings suggest that partner self-verifying feedback may intensify the effect of self-esteem on depression.…
The purpose of this SOP is to ensure suitable temperature maintenance of freezers used for storage of samples. This procedure was followed to ensure consistent data retrieval during the Arizona NHEXAS project and the "Border" study. Keywords: freezers; operation.
The National H...
The purpose of this SOP is to assure suitable temperature maintenance in refrigerators and freezers used for sample storage during the Arizona NHEXAS project and the "Border" study. Keywords: lab; equipment; refrigerators and freezers.
The National Human Exposure Assessment Su...
Krasteva, Vessela; Jekova, Irena; Schmid, Ramun
2018-01-01
This study aims to validate the 12-lead electrocardiogram (ECG) as a biometric modality based on two straightforward binary QRS template matching characteristics. Different perspectives of the human verification problem are considered, regarding the optimal lead selection and stability over sample size, gender, age, and heart rate (HR). A clinical 12-lead resting ECG database including a population of 460 subjects with two-session recordings (>1 year apart) is used. Cost-effective strategies for the extraction of personalized QRS patterns (100 ms) and binary template matching estimate similarity in the time scale (matching time) and dissimilarity in the amplitude scale (mismatch area). The two-class person verification task, taking the decision to validate or to reject the subject identity, is managed by linear discriminant analysis (LDA). Non-redundant LDA models for different lead configurations (I, II, III, aVR, aVL, aVF, V1-V6) are trained on the first half of 230 subjects by stepwise feature selection until maximization of the area under the receiver operating characteristic curve (ROC AUC). The operating point on the training ROC at equal error rate (EER) is tested on the independent dataset (second half of 230 subjects) to report an unbiased validation of test-ROC AUC and true verification rate (TVR = 100 - EER). The test results are further evaluated in groups by sample size, gender, age, and HR. The optimal QRS pattern projection for a single-lead ECG biometric modality is found in the frontal plane sector (60° to 0°), with the best (test-AUC/TVR) for lead II (0.941/86.8%) and a slight accuracy drop for -aVR (-0.017/-1.4%) and I (-0.01/-1.5%). Chest ECG leads have degrading accuracy from V1 (0.885/80.6%) to V6 (0.799/71.8%). Multi-lead ECG improves verification: 6 chest leads (0.97/90.9%), 6 limb leads (0.986/94.3%), 12 leads (0.995/97.5%). The QRS pattern matching model shows stable performance for verification of 10 to 230 individuals, with insignificant degradation of TVR in women (1.2-3.6%), adults ≥70 years (3.7%), younger subjects <40 years (1.9%), HR < 60 bpm (1.2%), HR > 90 bpm (3.9%), and no degradation for HR change (0 to >20 bpm).
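As a rough illustration of the reported operating point, the sketch below shows one common way to estimate an equal error rate and the corresponding TVR from genuine and impostor similarity scores by sweeping a decision threshold. It is not the authors' LDA pipeline; the score distributions and threshold search are purely illustrative.

```python
import numpy as np

def equal_error_rate(genuine, impostor):
    """Estimate the EER by sweeping a decision threshold over all observed scores.

    Higher scores are assumed to mean a better match (accept the identity claim).
    Returns (eer, threshold); the true verification rate is TVR = 100 * (1 - eer).
    """
    thresholds = np.unique(np.concatenate([genuine, impostor]))
    gap, eer, best_t = np.inf, 1.0, thresholds[0]
    for t in thresholds:
        frr = np.mean(genuine < t)      # genuine pairs wrongly rejected
        far = np.mean(impostor >= t)    # impostor pairs wrongly accepted
        if abs(far - frr) < gap:
            gap, eer, best_t = abs(far - frr), (far + frr) / 2.0, t
    return eer, best_t

# toy usage: genuine-pair scores tend to be higher than impostor-pair scores
rng = np.random.default_rng(1)
genuine = rng.normal(2.0, 1.0, 500)
impostor = rng.normal(0.0, 1.0, 5000)
eer, t = equal_error_rate(genuine, impostor)
print(f"EER = {100*eer:.1f}%, TVR = {100*(1-eer):.1f}% at threshold {t:.2f}")
```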
DOE Office of Scientific and Technical Information (OSTI.GOV)
ADAMS, WADE C
At Pennsylvania Department of Environmental Protection's request, ORAU's IEAV program conducted verification surveys on the excavated surfaces of Section 3, SUs 1, 4, and 5 at the Whittaker site on March 13 and 14, 2013. The survey activities included visual inspections, gamma radiation surface scans, gamma activity measurements, and soil sampling activities. Verification activities also included the review and assessment of the licensee's project documentation and methodologies. Surface scans identified four areas of elevated direct gamma radiation distinguishable from background; one area within SUs 1 and 4 and two areas within SU5. One area within SU5 was remediated by removing a golf ball size piece of slag while ORAU staff was onsite. With the exception of the golf ball size piece of slag within SU5, a review of the ESL Section 3 EXS data packages for SUs 1, 4, and 5 indicated that these locations of elevated gamma radiation were also identified by the ESL gamma scans and that ESL personnel performed additional investigations and soil sampling within these areas. The investigative results indicated that the areas met the release criteria.
Li, Chao; Zhang, Yan-po; Guo, Wei-dong; Zhu, Yue; Xu, Jing; Deng, Xun
2010-09-01
Fluorescence excitation-emission matrix (EEM) and absorption spectroscopy were applied to study the optical properties of 29 CDOM samples collected from different ballast tanks of nine international route vessels anchored in Xiamen Port between October 2007 and April 2008. The purpose was to examine the feasibility of these spectral properties as a tracer to verify whether these vessels followed the mid-ocean ballast water exchange (BWE) regulation. Using parallel factor analysis, four fluorescent components were identified, including two humic-like components (C1: 245, 300/386 nm; C2: 250, 345/458 nm) and two protein-like components (C3: 220, 275/306 nm; C4: 235, 290/345 nm), of which the C2 component was a suitable fluorescence verification indicator. The vertical distribution of all fluorescent components in the ballast tanks was nearly uniform, indicating that profile-mixing sampling was preferable. Combined use of the C2 component, the spectral slope ratio (SR) of absorption spectroscopy, and salinity may provide reasonable verification of whether BWE was carried out by these nine ships. The results suggested that the combined use of multiple parameters (fluorescence, absorption, and salinity) would be much more reliable for determining the origin of ballast water, and would provide a technical guarantee for fast examination of ballast water exchange in Chinese ports.
Haidar Ahmad, Imad A; Tam, James; Li, Xue; Duffield, William; Tarara, Thomas; Blasko, Andrei
2017-02-05
The parameters affecting the recovery of pharmaceutical residues from the surface of stainless steel coupons for quantitative cleaning verification method development have been studied, including active pharmaceutical ingredient (API) level, spiking procedure, API/excipient ratio, analyst-to-analyst variability, inter-day variability, and cleaning procedure of the coupons. The lack of a well-defined procedure that consistently cleaned the coupon surface was identified as the major contributor to low and variable recoveries. Assessment of acid, base, and oxidant washes, as well as the order of treatment, showed that a base-water-acid-water-oxidizer-water wash procedure resulted in consistent, accurate spiked recovery (>90%) and reproducible results (S_rel ≤ 4%). By applying this cleaning procedure to the previously used coupons that failed the cleaning acceptance criteria, multiple analysts were able to obtain consistent recoveries from day to day for different APIs and API/excipient ratios at various spike levels. We successfully applied our approach for cleaning verification of small molecules (MW < 1000 Da) as well as large biomolecules (MW up to 50,000 Da). Method robustness was greatly influenced by the sample preparation procedure, especially for analyses using total organic carbon (TOC) determination. Copyright © 2016 Elsevier B.V. All rights reserved.
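The acceptance figures quoted above (recovery >90%, S_rel ≤ 4%) reduce to simple statistics over replicate spiked coupons. A minimal sketch, with invented replicate values, is shown below; it assumes the measured amounts and the spiked amount are expressed in the same units.

```python
import numpy as np

def percent_recovery(measured_amounts, spiked_amount):
    """Mean spiked recovery and relative standard deviation for replicate coupons."""
    recoveries = 100.0 * np.asarray(measured_amounts, float) / spiked_amount
    mean = recoveries.mean()
    s_rel = 100.0 * recoveries.std(ddof=1) / mean   # relative standard deviation
    return mean, s_rel

# toy usage: six replicate coupons spiked with 10.0 ug of API (illustrative numbers)
measured = [9.4, 9.1, 9.6, 9.3, 9.5, 9.2]
mean_rec, s_rel = percent_recovery(measured, spiked_amount=10.0)
print(f"mean recovery {mean_rec:.1f}%, S_rel {s_rel:.1f}%")
```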
Hou, Guixue; Lou, Xiaomin; Sun, Yulin; Xu, Shaohang; Zi, Jin; Wang, Quanhui; Zhou, Baojin; Han, Bo; Wu, Lin; Zhao, Xiaohang; Lin, Liang; Liu, Siqi
2015-09-04
We propose an efficient integration of SWATH with MRM for biomarker discovery and verification when the corresponding ion library is well established. We strictly controlled the false positive rate associated with SWATH MS signals and carefully selected the target peptides coupled with SWATH and MRM. We collected 10 samples of esophageal squamous cell carcinoma (ESCC) tissues paired with tumors and adjacent regions and quantified 1758 unique proteins with FDR 1% at protein level using SWATH, in which 467 proteins were abundance-dependent with ESCC. After carefully evaluating the SWATH MS signals of the up-regulated proteins, we selected 120 proteins for MRM verification. MRM analysis of the pooled and individual esophageal tissues resulted in 116 proteins that exhibited similar abundance response modes to ESCC that were acquired with SWATH. Because the ESCC-related proteins consisted of a high percentile of secreted proteins, we conducted the MRM assay on patient sera that were collected from pre- and postoperation. Of the 116 target proteins, 42 were identified in the ESCC sera, including 11 with lowered abundances postoperation. Coupling SWATH and MRM is thus feasible and efficient for the discovery and verification of cancer-related protein biomarkers.
Enrichment Assay Methods Development for the Integrated Cylinder Verification System
DOE Office of Scientific and Technical Information (OSTI.GOV)
Smith, Leon E.; Misner, Alex C.; Hatchell, Brian K.
2009-10-22
International Atomic Energy Agency (IAEA) inspectors currently perform periodic inspections at uranium enrichment plants to verify UF6 cylinder enrichment declarations. Measurements are typically performed with handheld high-resolution sensors on a sampling of cylinders taken to be representative of the facility's entire product-cylinder inventory. Pacific Northwest National Laboratory (PNNL) is developing a concept to automate the verification of enrichment plant cylinders to enable 100 percent product-cylinder verification and, potentially, mass-balance calculations on the facility as a whole (by also measuring feed and tails cylinders). The Integrated Cylinder Verification System (ICVS) could be located at key measurement points to positively identify each cylinder, measure its mass and enrichment, store the collected data in a secure database, and maintain continuity of knowledge on measured cylinders until IAEA inspector arrival. The three main objectives of this FY09 project are summarized here and described in more detail in the report: (1) Develop a preliminary design for a prototype NDA system, (2) Refine PNNL's MCNP models of the NDA system, and (3) Procure and test key pulse-processing components. Progress against these tasks to date, and next steps, are discussed.
Development of CFC-Free Cleaning Processes at the NASA White Sands Test Facility
NASA Technical Reports Server (NTRS)
Beeson, Harold; Kirsch, Mike; Hornung, Steven; Biesinger, Paul
1995-01-01
The NASA White Sands Test Facility (WSTF) is developing cleaning and verification processes to replace currently used chlorofluorocarbon-113- (CFC-113-) based processes. The processes being evaluated include both aqueous- and solvent-based techniques. The presentation will include the findings of investigations of aqueous cleaning and verification processes that are based on a draft of a proposed NASA Kennedy Space Center (KSC) cleaning procedure. Verification testing with known contaminants, such as hydraulic fluid and commonly used oils, established correlations between nonvolatile residue and CFC-113. Recoveries ranged from 35 to 60 percent of theoretical. WSTF is also investigating enhancements to aqueous sampling for organics and particulates. Although aqueous alternatives have been identified for several processes, a need still exists for nonaqueous solvent cleaning, such as the cleaning and cleanliness verification of gauges used for oxygen service. The cleaning effectiveness of tetrachloroethylene (PCE), trichloroethylene (TCE), ethanol, hydrochlorofluorocarbon-225 (HCFC-225), tert-butylmethylether, and n-Hexane was evaluated using aerospace gauges and precision instruments and then compared to the cleaning effectiveness of CFC-113. Solvents considered for use in oxygen systems were also tested for oxygen compatibility using high-pressure oxygen autoignition and liquid oxygen mechanical impact testing.
Evaluation of the 29-km Eta Model. Part 1; Objective Verification at Three Selected Stations
NASA Technical Reports Server (NTRS)
Nutter, Paul A.; Manobianco, John; Merceret, Francis J. (Technical Monitor)
1998-01-01
This paper describes an objective verification of the National Centers for Environmental Prediction (NCEP) 29-km eta model from May 1996 through January 1998. The evaluation was designed to assess the model's surface and upper-air point forecast accuracy at three selected locations during separate warm (May - August) and cool (October - January) season periods. In order to enhance sample sizes available for statistical calculations, the objective verification includes two consecutive warm and cool season periods. Systematic model deficiencies comprise the larger portion of the total error in most of the surface forecast variables that were evaluated. The error characteristics for both surface and upper-air forecasts vary widely by parameter, season, and station location. At upper levels, a few characteristic biases are identified. Overall, however, the upper-level errors are more nonsystematic in nature and could be explained partly by observational measurement uncertainty. With a few exceptions, the upper-air results also indicate that 24-h model error growth is not statistically significant. In February and August 1997, NCEP implemented upgrades to the eta model's physical parameterizations that were designed to change some of the model's error characteristics near the surface. The results shown in this paper indicate that these upgrades led to identifiable and statistically significant changes in forecast accuracy for selected surface parameters. While some of the changes were expected, others were not consistent with the intent of the model updates and further emphasize the need for ongoing sensitivity studies and localized statistical verification efforts. Objective verification of point forecasts is a stringent measure of model performance, but when used alone, is not enough to quantify the overall value that model guidance may add to the forecast process. Therefore, results from a subjective verification of the meso-eta model over the Florida peninsula are discussed in the companion paper by Manobianco and Nutter. Overall verification results presented here and in Part 2 should establish a reasonable benchmark from which model users and developers may pursue the ongoing eta model verification strategies in the future.
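Objective point verification of the kind described above typically separates systematic from total error and tests whether error growth with lead time is statistically significant. The sketch below illustrates one plausible formulation (bias, RMSE, and a paired t-test on absolute errors) using synthetic station data; it is not the verification code used in the study.

```python
import numpy as np
from scipy import stats

def bias_and_rmse(forecast, observed):
    """Systematic (bias) and total (RMSE) error of point forecasts."""
    err = np.asarray(forecast, float) - np.asarray(observed, float)
    return err.mean(), np.sqrt(np.mean(err**2))

def error_growth_significant(err_short, err_long, alpha=0.05):
    """Paired test of whether absolute forecast error grows with lead time."""
    t_stat, p_value = stats.ttest_rel(np.abs(err_long), np.abs(err_short))
    return p_value < alpha, p_value

# toy usage with synthetic 2-m temperature errors (deg C) at one station
rng = np.random.default_rng(2)
obs = rng.normal(25.0, 3.0, 200)
fc12 = obs + rng.normal(0.5, 1.5, 200)     # 12-h forecasts: small warm bias
fc36 = obs + rng.normal(0.5, 1.8, 200)     # 36-h forecasts: slightly larger spread
print(bias_and_rmse(fc12, obs), error_growth_significant(fc12 - obs, fc36 - obs))
```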
Tegel, Hanna; Yderland, Louise; Boström, Tove; Eriksson, Cecilia; Ukkonen, Kaisa; Vasala, Antti; Neubauer, Peter; Ottosson, Jenny; Hober, Sophia
2011-08-01
Parallel protein production and analysis is today applied in laboratories worldwide, and there is a great need to improve the techniques and systems used for this purpose. In order to save time and money, a fast and reliable screening method for analysis of protein production and verification of the protein product is desired. Here, a micro-scale protocol for the parallel production and screening of 96 proteins in plate format is described. Protein capture was achieved using immobilized metal affinity chromatography, and the product was verified using matrix-assisted laser desorption ionization time-of-flight MS. In order to obtain sufficiently high cell densities and product yields in the small-volume cultivations, the EnBase® cultivation technology was applied, which enables cultivation in volumes as small as 150 μL. The efficiency of the method is demonstrated by producing 96 human recombinant proteins, both at micro-scale and using a standard full-scale protocol, and comparing the results with regard to both protein identity and sample purity. The results obtained are highly comparable to those acquired using standard full-scale purification protocols, thus validating this method as a successful initial screening step before protein production at a larger scale. Copyright © 2011 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
Cosmic shear measurements with Dark Energy Survey Science Verification data
Becker, M. R.
2016-07-06
Here, we present measurements of weak gravitational lensing cosmic shear two-point statistics using Dark Energy Survey Science Verification data. We demonstrate that our results are robust to the choice of shear measurement pipeline, either ngmix or im3shape, and robust to the choice of two-point statistic, including both real and Fourier-space statistics. Our results pass a suite of null tests including tests for B-mode contamination and direct tests for any dependence of the two-point functions on a set of 16 observing conditions and galaxy properties, such as seeing, airmass, galaxy color, galaxy magnitude, etc. We use a large suite of simulations to compute the covariance matrix of the cosmic shear measurements and assign statistical significance to our null tests. We find that our covariance matrix is consistent with the halo model prediction, indicating that it has the appropriate level of halo sample variance. We also compare the same jackknife procedure applied to the data and the simulations in order to search for additional sources of noise not captured by the simulations. We find no statistically significant extra sources of noise in the data. The overall detection significance with tomography for our highest source density catalog is 9.7σ. Cosmological constraints from the measurements in this work are presented in a companion paper.
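The jackknife comparison mentioned above relies on a standard delete-one covariance estimate. A minimal sketch is shown below, assuming a data vector (e.g., a binned two-point function) recomputed with each sky patch removed in turn; the patch count, bin count, and values are purely illustrative.

```python
import numpy as np

def jackknife_covariance(deleteone_estimates):
    """Delete-one jackknife covariance of a data vector.

    deleteone_estimates: shape (N, M); row i is the statistic (e.g. the cosmic
    shear two-point function in M angular bins) recomputed with patch i removed.
    """
    x = np.asarray(deleteone_estimates, float)
    n = x.shape[0]
    dev = x - x.mean(axis=0)
    return (n - 1.0) / n * dev.T @ dev

# toy usage: 50 sky patches, 8 angular bins of a correlation function
rng = np.random.default_rng(3)
est = rng.normal(1e-5, 2e-6, size=(50, 8))
cov = jackknife_covariance(est)
print(np.sqrt(np.diag(cov)))   # jackknife error bars per bin
```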
[Validation and verification of microbiology methods].
Camaró-Sala, María Luisa; Martínez-García, Rosana; Olmos-Martínez, Piedad; Catalá-Cuenca, Vicente; Ocete-Mochón, María Dolores; Gimeno-Cardona, Concepción
2015-01-01
Clinical microbiologists should ensure, to the maximum level allowed by scientific and technical development, the reliability of their results. This implies that, in addition to meeting the technical criteria that ensure their validity, tests must be performed under conditions that allow comparable results to be obtained regardless of the laboratory that performs them. In this sense, the use of recognized and accepted reference methods is the most effective tool to provide these guarantees. Activities related to the verification and validation of analytical methods have become very important, as techniques and analytical equipment are continuously being developed and becoming more complex, and professionals have an interest in ensuring quality processes and results. The definitions of validation and verification are described, along with the different types of validation/verification, the types of methods, and the level of validation necessary depending on the degree of standardization. The situations in which validation/verification is mandatory and/or recommended are discussed, including those particularly related to validation in microbiology. The importance of promoting the use of reference strains and standard controls in microbiology is stressed, as well as the importance of participation in External Quality Assessment programs to demonstrate technical competence. The emphasis is on how to calculate some of the parameters required for validation/verification, such as accuracy and precision. The development of these concepts can be found in the SEIMC microbiological procedure number 48: «Validation and verification of microbiological methods», www.seimc.org/protocols/microbiology. Copyright © 2013 Elsevier España, S.L.U. y Sociedad Española de Enfermedades Infecciosas y Microbiología Clínica. All rights reserved.
Raiszadeh, Michelle M.; Ross, Mark M.; Russo, Paul S.; Schaepper, Mary Ann H.; Zhou, Weidong; Deng, Jianghong; Ng, Daniel; Dickson, April; Dickson, Cindy; Strom, Monica; Osorio, Carolina; Soeprono, Thomas; Wulfkuhle, Julia D.; Kabbani, Nadine; Petricoin, Emanuel F.; Liotta, Lance A.; Kirsch, Wolff M.
2012-01-01
Liquid chromatography tandem mass spectrometry (LC-MS/MS) and multiple reaction monitoring mass spectrometry (MRM-MS) proteomics analyses were performed on eccrine sweat of healthy controls, and the results were compared with those from individuals diagnosed with schizophrenia (SZ). This is the first large scale study of the sweat proteome. First, we performed LC-MS/MS on pooled SZ samples and pooled control samples for global proteomics analysis. Results revealed a high abundance of diverse proteins and peptides in eccrine sweat. Most of the proteins identified from sweat samples were found to be different than the most abundant proteins from serum, which indicates that eccrine sweat is not simply a plasma transudate, and may thereby be a source of unique disease-associated biomolecules. A second independent set of patient and control sweat samples were analyzed by LC-MS/MS and spectral counting to determine qualitative protein differential abundances between the control and disease groups. Differential abundances of selected proteins, initially determined by spectral counting, were verified by MRM-MS analyses. Seventeen proteins showed a differential abundance of approximately two-fold or greater between the SZ pooled sample and the control pooled sample. This study demonstrates the utility of LC-MS/MS and MRM-MS as a viable strategy for the discovery and verification of potential sweat protein disease biomarkers. PMID:22256890
MEMS for Space Flight Applications
NASA Technical Reports Server (NTRS)
Lawton, R.
1998-01-01
Micro-Electrical Mechanical Systems (MEMS) are entering the stage of design and verification to demonstrate the utility of the technology for a wide range of applications including sensors and actuators for military, space, medical, industrial, consumer, automotive and instrumentation products.
Space fabrication demonstration system: Executive summary. [for large space structures
NASA Technical Reports Server (NTRS)
1979-01-01
The results of analysis and tests conducted to define the basic 1-m beam configuration required, and the design, development, fabrication, and verification tests of the machine required to automatically produce these beams are presented.
40 CFR 86.1838-01 - Small volume manufacturer certification procedures.
Code of Federal Regulations, 2010 CFR
2010-07-01
... paragraph (c)(2). (i) Small volume in-use verification test vehicles may be procured from customers or may... miles, a manufacturer may demonstrate to the satisfaction of the Agency that, based on owner survey data...
DOT National Transportation Integrated Search
2018-01-01
Connected vehicles (CVs) and their integration with transportation infrastructure provide new approaches to wrong-way driving (WWD) detection, warning, verification, and intervention that will help practitioners further reduce the occurrence and seve...
Wu, Xiaoping; Akgün, Can; Vaughan, J Thomas; Andersen, Peter; Strupp, John; Uğurbil, Kâmil; Van de Moortele, Pierre-François
2010-07-01
Parallel excitation holds strong promise to mitigate the impact of large transmit B1 (B1+) distortion at very high magnetic field. Accelerated RF pulses, however, inherently tend to require larger RF peak power, which may result in a substantial increase in the Specific Absorption Rate (SAR) in tissues, a constant concern for patient safety at very high field. In this study, we demonstrate an adapted-rate RF pulse design allowing for SAR reduction while preserving excitation target accuracy. Compared with other proposed implementations of adapted-rate RF pulses, our approach is compatible with any k-space trajectory, does not require an analytical expression of the gradient waveform, and can be used for large flip angle excitation. We demonstrate our method with numerical simulations based on electromagnetic modeling and include an experimental verification of transmit pattern accuracy on an 8-transmit-channel 9.4 T system.
Numerical Modeling of Ablation Heat Transfer
NASA Technical Reports Server (NTRS)
Ewing, Mark E.; Laker, Travis S.; Walker, David T.
2013-01-01
A unique numerical method has been developed for solving one-dimensional ablation heat transfer problems. This paper provides a comprehensive description of the method, along with detailed derivations of the governing equations. This methodology supports solutions for traditional ablation modeling including such effects as heat transfer, material decomposition, pyrolysis gas permeation and heat exchange, and thermochemical surface erosion. The numerical scheme utilizes a control-volume approach with a variable grid to account for surface movement. This method directly supports implementation of nontraditional models such as material swelling and mechanical erosion, extending capabilities for modeling complex ablation phenomena. Verifications of the numerical implementation are provided using analytical solutions, code comparisons, and the method of manufactured solutions. These verifications are used to demonstrate solution accuracy and proper error convergence rates. A simple demonstration of a mechanical erosion (spallation) model is also provided to illustrate the unique capabilities of the method.
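Verification by the method of manufactured solutions, as mentioned above, is usually summarized by an observed order of convergence extracted from errors on successively refined grids. The sketch below shows one common way to compute that order from error norms and grid spacings; the error values are synthetic and the log-log fit is only an illustration of the idea.

```python
import numpy as np

def observed_order(errors, h):
    """Observed convergence order from discretization errors on refined grids.

    errors[i] is a norm of the difference between the numerical and the
    manufactured (exact) solution on a grid with spacing h[i]; the order p is
    the slope of log(error) vs. log(h), assuming error ~ C * h**p.
    """
    errors, h = np.asarray(errors, float), np.asarray(h, float)
    return np.polyfit(np.log(h), np.log(errors), 1)[0]

# toy usage: halving the grid spacing three times for a second-order scheme
h = np.array([0.1, 0.05, 0.025, 0.0125])
errors = 3.0 * h**2 * (1 + 0.02 * np.random.default_rng(4).normal(size=h.size))
print(f"observed order ~ {observed_order(errors, h):.2f}")
```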
L(sub 1) Adaptive Flight Control System: Flight Evaluation and Technology Transition
NASA Technical Reports Server (NTRS)
Xargay, Enric; Hovakimyan, Naira; Dobrokhodov, Vladimir; Kaminer, Isaac; Gregory, Irene M.; Cao, Chengyu
2010-01-01
Certification of adaptive control technologies for both manned and unmanned aircraft represents a major challenge for current Verification and Validation techniques. A (missing) key step towards flight certification of adaptive flight control systems is the definition and development of analysis tools and methods to support Verification and Validation for nonlinear systems, similar to the procedures currently used for linear systems. In this paper, we describe and demonstrate the advantages of L(sub 1) adaptive control architectures for closing some of the gaps in certification of adaptive flight control systems, which may facilitate the transition of adaptive control into military and commercial aerospace applications. As illustrative examples, we present the results of a piloted simulation evaluation on the NASA AirSTAR flight test vehicle, and results of an extensive flight test program conducted by the Naval Postgraduate School to demonstrate the advantages of L(sub 1) adaptive control as a verifiable robust adaptive flight control system.
Time-Lapse Electrical Geophysical Monitoring of Amendment-Based Biostimulation.
Johnson, Timothy C; Versteeg, Roelof J; Day-Lewis, Frederick D; Major, William; Lane, John W
2015-01-01
Biostimulation is increasingly used to accelerate microbial remediation of recalcitrant groundwater contaminants. Effective application of biostimulation requires successful emplacement of amendment in the contaminant target zone. Verification of remediation performance requires postemplacement assessment and contaminant monitoring. Sampling-based approaches are expensive and provide low-density spatial and temporal information. Time-lapse electrical resistivity tomography (ERT) is an effective geophysical method for determining temporal changes in subsurface electrical conductivity. Because remedial amendments and biostimulation-related biogeochemical processes often change subsurface electrical conductivity, ERT can complement and enhance sampling-based approaches for assessing emplacement and monitoring biostimulation-based remediation. Field studies demonstrating the ability of time-lapse ERT to monitor amendment emplacement and behavior were performed during a biostimulation remediation effort conducted at the Department of Defense Reutilization and Marketing Office (DRMO) Yard, in Brandywine, Maryland, United States. Geochemical fluid sampling was used to calibrate a petrophysical relation in order to predict groundwater indicators of amendment distribution. The petrophysical relations were field validated by comparing predictions to sequestered fluid sample results, thus demonstrating the potential of electrical geophysics for quantitative assessment of amendment-related geochemical properties. Crosshole radar zero-offset profile and borehole geophysical logging were also performed to augment the data set and validate interpretation. In addition to delineating amendment transport in the first 10 months after emplacement, the time-lapse ERT results show later changes in bulk electrical properties interpreted as mineral precipitation. Results support the use of more cost-effective surface-based ERT in conjunction with limited field sampling to improve spatial and temporal monitoring of amendment emplacement and remediation performance. Published 2014. This article is a U.S. Government work and is in the public domain in the USA.
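A petrophysical calibration of the kind described above can be sketched as a simple regression between ERT-derived bulk conductivity and fluid conductivity from co-located samples, which is then inverted to predict groundwater indicators from new ERT results. The snippet below assumes a linear, Archie-like relation with a constant formation factor and invented coefficients; the actual field relation and units may differ.

```python
import numpy as np

def calibrate_petrophysical(bulk_sigma, fluid_sigma):
    """Least-squares fit of a linear bulk-fluid conductivity relation.

    Assumes sigma_bulk ~ a * sigma_fluid + b (an Archie-like relation with a
    constant formation factor plus a surface-conduction offset).
    """
    a, b = np.polyfit(fluid_sigma, bulk_sigma, 1)
    return a, b

def predict_fluid_sigma(bulk_sigma, a, b):
    """Invert the calibrated relation to predict fluid conductivity from ERT."""
    return (np.asarray(bulk_sigma, float) - b) / a

# toy usage with synthetic calibration pairs (S/m)
rng = np.random.default_rng(5)
fluid = rng.uniform(0.05, 0.5, 20)
bulk = 0.25 * fluid + 0.01 + rng.normal(0, 0.002, 20)
a, b = calibrate_petrophysical(bulk, fluid)
print(predict_fluid_sigma([0.05, 0.1], a, b))
```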
Area of Concern (AOC) 314 Verification Survey at Former McClellan AFB, Sacramento, CA
2015-03-31
USAFSAM/OEC personnel also collected 22 soil samples from within AOC 314. Laboratory analysis revealed that the concentration of radium-226 (Ra-226) in 10 of the soil ... at least one sample that exceeded 2.0 pCi/g. The highest concentration of Ra-226 found in any of the soil samples was 25.8 pCi/g. Based on these ... and ensure the potential health risk to future inhabitants is minimized.
Diode step stress program for JANTX1N5615
NASA Technical Reports Server (NTRS)
1979-01-01
The effect of power/temperature step stress when applied to the switching diode JANTX1N5615 manufactured by Semtech and Micro semiconductor was examined. A total of 48 samples from each manufacturer were submitted to the process. In addition, two control sample units were maintained for verification of the electrical parametric testing. All test samples were subjected to the electrical tests after completing the prior power/temperature step stress point. Results are presented.
Wind energy converter GROWIAN 2
NASA Astrophysics Data System (ADS)
Braun, D.; Kloeppel, V.; Marsch, G.; Meggle, R.; Mehlhose, R.; Schoebe, B.; Wennekers, R.
1984-04-01
Multi-MW wind energy conversion systems in the rotor class of 135 m diameter are described. A variable-speed, horizontal-axis, downwind machine with a one-bladed teetering rotor and a guyed soft steel tower was investigated, and a 1:3-scale demonstrator with a rotor diameter of 48 m was built. The demonstrator will undergo a 2 year verification test program.
Verification of Advective Bar Elements Implemented in the Aria Thermal Response Code.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Mills, Brantley
2016-01-01
A verification effort was undertaken to evaluate the implementation of the new advective bar capability in the Aria thermal response code. Several approaches to the verification process were taken: a mesh refinement study to demonstrate solution convergence in the fluid and the solid, visually examining the mapping of the advective bar element nodes to the surrounding surfaces, and a comparison of solutions produced using the advective bars for simple geometries with solutions from commercial CFD software. The mesh refinement study has shown solution convergence for simple pipe flow in both temperature and velocity. Guidelines were provided to achieve appropriate meshes between the advective bar elements and the surrounding volume. Simulations of pipe flow using advective bar elements in Aria have been compared to simulations using the commercial CFD software ANSYS Fluent and provided comparable solutions in temperature and velocity, supporting proper implementation of the new capability.
Treatment of Bottled Liquid Waste During Remediation of the Hanford 618-10 Burial Ground - 13001
DOE Office of Scientific and Technical Information (OSTI.GOV)
Faulk, Darrin E.; Pearson, Chris M.; Vedder, Barry L.
2013-07-01
A problematic waste form encountered during remediation of the Hanford Site 618-10 burial ground consists of bottled aqueous waste potentially contaminated with regulated metals. The liquid waste requires stabilization prior to landfill disposal. Prior remediation activities at other Hanford burial grounds resulted in a standard process for sampling and analyzing liquid waste using manual methods. Due to the highly dispersible characteristics of alpha contamination, and the potential for shock sensitive chemicals, a different method for bottle processing was needed for the 618-10 burial ground. Discussions with the United States Department of Energy (DOE) and United States Environmental Protection Agency (EPA) led to development of a modified approach. The modified approach involves treatment of liquid waste in bottles, up to one gallon per bottle, in a tray or box within the excavation of the remediation site. Bottles are placed in the box, covered with soil and fixative, crushed, and mixed with a Portland cement grout. The potential hazards of the liquid waste preclude sampling prior to treatment. Post treatment verification sampling is performed to demonstrate compliance with land disposal restrictions and disposal facility acceptance criteria. (authors)
The purpose of this SOP is to assure suitable temperature maintenance in refrigerators and freezers used for sample storage during the Arizona NHEXAS project and the Border study. Keywords: lab; equipment; refrigerators and freezers.
The U.S.-Mexico Border Program is sponsored...
Transistor step stress testing program for JANTX2N2484
NASA Technical Reports Server (NTRS)
1979-01-01
The effect of power/temperature step stress when applied to the transistor JANTX2N2484, manufactured by Raytheon and Teledyne was evaluated. Forty-eight samples from each manufacturer were divided equally (16 per group) into three groups and submitted to the processes outlined. In addition, two control sample units were maintained for verification of the electrical parametric testing.
ERIC Educational Resources Information Center
Mesmer-Magnus, Jessica R.; Viswesvaran, Chockalingam
2005-01-01
The overlap between measures of work-to-family (WFC) and family-to-work conflict (FWC) was meta-analytically investigated. Researchers have assumed WFC and FWC to be distinct; however, this assumption requires empirical verification. Across 25 independent samples (total N=9079) the sample-size weighted mean observed correlation was .38 and the…
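The reported sample-size weighted mean observed correlation is a simple weighted average across studies. A minimal sketch with invented study values is shown below.

```python
import numpy as np

def weighted_mean_correlation(r_values, sample_sizes):
    """Sample-size weighted mean observed correlation across independent samples."""
    r = np.asarray(r_values, float)
    n = np.asarray(sample_sizes, float)
    return np.sum(n * r) / np.sum(n)

# toy usage: three hypothetical studies reporting WFC-FWC correlations
print(weighted_mean_correlation([0.35, 0.42, 0.30], [310, 455, 220]))
```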
Pelle, Dominic W; Ringler, Jonathan W; Peacock, Jacqueline D; Kampfschulte, Kevin; Scholten, Donald J; Davis, Mary M; Mitchell, Deanna S; Steensma, Matthew R
2014-08-01
Aneurysmal bone cyst (ABC) is a benign tumor of bone presenting as a cystic, expansile lesion in both the axial and appendicular skeleton. Axial lesions demand special consideration, because treatment-related morbidity can be devastating. In similar lesions, such as giant cell tumor of bone (GCTB), the receptor-activator of nuclear kappaB ligand (RANKL)-receptor-activator of nuclear kappaB (RANK) signaling axis is essential to tumor progression. Although ABC and GCTB are distinct entities, they both contain abundant multinucleated giant cells and are characteristically osteolytic. We hypothesized that ABCs similarly express both RANKL and RANK in a cell-type-specific manner, and that targeted RANKL therapy will mitigate ABC tumor progression. Cellular expression of RANKL and RANK was determined in freshly harvested ABC samples using laser confocal microscopy. A consistent cell-type-specific pattern was observed: fibroblast-like stromal cells expressed RANKL strongly, whereas monocyte/macrophage precursor and multinucleated giant cells expressed RANK. Relative RANKL expression was determined by quantitative real-time polymerase chain reaction in ABC and GCTB tissue samples; no difference in relative expression was observed (P > 0.05). In addition, we review the case of a 5-year-old boy with a large, aggressive sacral ABC. After 3 months of targeted RANKL inhibition with denosumab, magnetic resonance imaging demonstrated tumor shrinkage, bone reconstitution, and healing of a pathologic fracture. Ambulation and bowel and bladder function were restored at 6 months. Denosumab treatment was well tolerated. Post hoc analysis demonstrated strong RANKL expression in the pretreatment tumor sample. These findings demonstrate that RANKL-RANK signal activation is essential to ABC tumor progression. RANKL-targeted therapy may be an effective alternative to surgery in select ABC presentations. Copyright © 2014 Mosby, Inc. All rights reserved.
Kim, Sang-Bog; Roche, Jennifer
2013-08-01
Organically bound tritium (OBT) is an important tritium species that can be measured in most environmental samples, but has only recently been recognized as a species of tritium in these samples. Currently, OBT is not routinely measured by environmental monitoring laboratories around the world. There are no certified reference materials (CRMs) for environmental samples. Thus, quality assurance (QA), or verification of the accuracy of the OBT measurement, is not possible. Alternatively, quality control (QC), or verification of the precision of the OBT measurement, can be achieved. In the past, there have been differences in OBT analysis results between environmental laboratories. A possible reason for the discrepancies may be differences in analytical methods. Therefore, inter-laboratory OBT comparisons among the environmental laboratories are important and would provide a good opportunity for adopting a reference OBT analytical procedure. Due to the analytical issues, only limited information is available on OBT measurement. Previously conducted OBT inter-laboratory practices are reviewed and the findings are described. Based on our experiences, a few considerations were suggested for the international OBT inter-laboratory comparison exercise to be completed in the near future. Crown Copyright © 2013. Published by Elsevier Ltd. All rights reserved.
Coherent lidar design and performance verification
NASA Technical Reports Server (NTRS)
Frehlich, Rod
1993-01-01
The verification of LAWS beam alignment in space can be achieved by a measurement of heterodyne efficiency using the surface return. The crucial element is a direct detection signal that can be identified for each surface return. This should be satisfied for LAWS but will not be satisfied for descoped LAWS. The performance of algorithms for velocity estimation can be described with two basic parameters: the number of coherently detected photo-electrons per estimate and the number of independent signal samples per estimate. The average error of spectral domain velocity estimation algorithms is bounded by a new periodogram Cramer-Rao Bound. Comparison of the periodogram CRB with the exact CRB indicates that a factor of two improvement in velocity accuracy is possible using non-spectral domain estimators. This improvement has been demonstrated with a maximum-likelihood estimator. The comparison of velocity estimation algorithms for 2 and 10 micron coherent lidar was performed by assuming all the system design parameters are fixed and the signal statistics are dominated by a 1 m/s rms wind fluctuation over the range gate. The beam alignment requirements for 2 micron are much more severe than for a 10 micron lidar. The effects of the random backscattered field on estimating the alignment error are a major problem for space based lidar operation, especially if the heterodyne efficiency cannot be estimated. For LAWS, the biggest science payoff would result from a short transmitted pulse, on the order of 0.5 microseconds instead of 3 microseconds. The numerical errors for simulation of laser propagation in the atmosphere have been determined as a joint project with the University of California, San Diego. Useful scaling laws were obtained for Kolmogorov atmospheric refractive turbulence and atmospheric refractive turbulence characterized by an inner scale. This permits verification of the simulation procedure, which is essential for the evaluation of the effects of refractive turbulence on coherent Doppler lidar systems. The analysis of 2 micron Doppler lidar data from Coherent Technologies, Inc. (CTI) has demonstrated many of the advantages of Doppler lidar measurements of boundary layer winds. The effects of wind shear and wind turbulence over the pulse volume are probably the dominant source of the reduced performance. The effects of wind shear and wind turbulence on the statistical description of Doppler lidar data have been derived and calculated.
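The spectral-domain velocity estimators discussed above can be illustrated with a basic periodogram peak estimator: take the power spectrum of the complex heterodyne signal in a range gate, find the Doppler peak, and convert frequency to radial velocity via v = λ f_d / 2. The sketch below uses an invented 2-μm system, sampling rate, and noise level, and omits the refinements (pulse-pair, maximum-likelihood) analyzed in the work.

```python
import numpy as np

def periodogram_velocity(signal, fs, wavelength):
    """Estimate radial velocity from one range gate of complex heterodyne data."""
    spectrum = np.abs(np.fft.fft(signal))**2          # periodogram
    freqs = np.fft.fftfreq(signal.size, d=1.0 / fs)
    f_doppler = freqs[np.argmax(spectrum)]            # peak Doppler frequency
    return wavelength * f_doppler / 2.0               # v = lambda * f_d / 2

# toy usage: 2-um lidar, 50 MHz sampling, 5 m/s target velocity, additive noise
rng = np.random.default_rng(6)
fs, wavelength, v_true, n = 50e6, 2.0e-6, 5.0, 256
t = np.arange(n) / fs
f_d = 2.0 * v_true / wavelength
signal = np.exp(2j * np.pi * f_d * t) + 0.5 * (rng.normal(size=n) + 1j * rng.normal(size=n))
print(f"estimated velocity: {periodogram_velocity(signal, fs, wavelength):.2f} m/s")
```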
Kuswandi, Bambang; Irmawati, Titi; Hidayat, Moch Amrun; Jayus; Ahmad, Musa
2014-01-01
A simple visual ethanol biosensor based on alcohol oxidase (AOX) immobilised onto a polyaniline (PANI) film for halal verification of fermented beverage samples is described. This biosensor responds to ethanol via a colour change from green to blue, due to the enzymatic reaction of ethanol that produces acetaldehyde and hydrogen peroxide; the latter oxidizes the PANI film. The procedure to obtain this biosensor consists of the immobilization of AOX onto the PANI film by adsorption. For the immobilisation, an AOX solution is deposited on the PANI film and left at room temperature until dried (30 min). The biosensor was constructed as a dip stick for visual and simple use. The colour changes of the films were scanned and analysed using image analysis software (i.e., ImageJ) to study the characteristics of the biosensor's response toward ethanol. The biosensor has a linear response over an ethanol concentration range of 0.01%–0.8%, with a correlation coefficient (r) of 0.996. The detection limit of the biosensor was 0.001%, with a reproducibility (RSD) of 1.6% and a lifetime of up to seven weeks when stored at 4 °C. The biosensor provides accurate results for ethanol determination in fermented drinks and was in good agreement with the standard method (gas chromatography) results. Thus, the biosensor could be used as a simple visual method for ethanol determination in fermented beverage samples, which can be useful to the Muslim community for halal verification. PMID:24473284
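The calibration figures quoted above (linear range, correlation coefficient, detection limit) correspond to a simple least-squares calibration of colour response against concentration, with the detection limit commonly taken as three standard deviations of the blank divided by the sensitivity. The sketch below uses invented colour-channel readings and blank values purely for illustration; it is not the authors' analysis script.

```python
import numpy as np

def calibrate_biosensor(concentrations, responses):
    """Linear calibration of colour response vs. ethanol concentration."""
    slope, intercept = np.polyfit(concentrations, responses, 1)
    predicted = slope * np.asarray(concentrations, float) + intercept
    r = np.corrcoef(responses, predicted)[0, 1]     # correlation coefficient
    return slope, intercept, r

def detection_limit(blank_responses, slope):
    """LOD as 3 standard deviations of the blank divided by the sensitivity."""
    return 3.0 * np.std(blank_responses, ddof=1) / slope

# toy usage with illustrative colour-channel readings from image analysis
conc = np.array([0.01, 0.05, 0.1, 0.2, 0.4, 0.8])     # % ethanol
resp = np.array([2.1, 9.8, 20.5, 41.0, 80.9, 161.2])  # arbitrary colour units
slope, intercept, r = calibrate_biosensor(conc, resp)
print(f"r = {r:.3f}, LOD ~ {detection_limit([0.1, 0.3, 0.2, 0.15], slope):.4f}%")
```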
Nuclear Energy Experiments to the Center for Global Security and Cooperation.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Osborn, Douglas M.
2015-06-01
This is to serve as verification that the Center 6200 experimental pieces supplied to the Technology Training and Demonstration Area within the Center for Global Security and Cooperation are indeed unclassified with unlimited release for viewing.
46 CFR 62.20-3 - Plans for information.
Code of Federal Regulations, 2011 CFR
2011-10-01
... detected by the crew, alternatives available to the crew, and possible design verification tests necessary... reliability of the design. It should be conducted to a level of detail necessary to demonstrate compliance... at an early stage of design. ...
ExoMars Raman laser spectrometer breadboard overview
NASA Astrophysics Data System (ADS)
Díaz, E.; Moral, A. G.; Canora, C. P.; Ramos, G.; Barcos, O.; Prieto, J. A. R.; Hutchinson, I. B.; Ingley, R.; Colombo, M.; Canchal, R.; Dávila, B.; Manfredi, J. A. R.; Jiménez, A.; Gallego, P.; Pla, J.; Margoillés, R.; Rull, F.; Sansano, A.; López, G.; Catalá, A.; Tato, C.
2011-10-01
The Raman Laser Spectrometer (RLS) is one of the Pasteur Payload instruments for the ExoMars mission, within ESA's Aurora Exploration Programme. The RLS instrument will perform Raman spectroscopy on powdered samples deposited in a small container after crushing of the cores obtained by the Rover's drill system. In response to ESA requirements for a delta-PDR to be held in mid-2012, an instrument breadboard (BB) programme was carried out by the RLS Assembly, Integration and Verification (AIV) team throughout late 2010 and 2011 to achieve Technology Readiness Level 5 (TRL5). The RLS instrument is currently being developed, pending its CoDR (Conceptual Design Review) with ESA in October 2011. A fully operative breadboard, composed of different unit and sub-unit breadboards, is planned to demonstrate the end-to-end performance of the flight-representative units by Q4 2011.
Arithmetic Circuit Verification Based on Symbolic Computer Algebra
NASA Astrophysics Data System (ADS)
Watanabe, Yuki; Homma, Naofumi; Aoki, Takafumi; Higuchi, Tatsuo
This paper presents a formal approach to verify arithmetic circuits using symbolic computer algebra. Our method describes arithmetic circuits directly with high-level mathematical objects based on weighted number systems and arithmetic formulae. Such circuit description can be effectively verified by polynomial reduction techniques using Gröbner Bases. In this paper, we describe how the symbolic computer algebra can be used to describe and verify arithmetic circuits. The advantageous effects of the proposed approach are demonstrated through experimental verification of some arithmetic circuits such as multiply-accumulator and FIR filter. The result shows that the proposed approach has a definite possibility of verifying practical arithmetic circuits.
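The polynomial reduction idea can be illustrated on a tiny example: encode the gates of a half adder and the Booleanity of its inputs as polynomials, compute a Gröbner basis, and check that the word-level specification lies in the generated ideal. The sketch below uses SymPy and rational coefficients rather than the weighted-number-system formulation of the paper; it is an assumption-laden miniature, not the authors' verification flow.

```python
from sympy import symbols, groebner

a, b, s, c = symbols('a b s c')

# Gate-level half adder as polynomial relations (over the rationals):
#   XOR: s = a + b - 2ab,  AND: c = ab,  plus Booleanity: a^2 = a, b^2 = b.
circuit = [s - (a + b - 2*a*b), c - a*b, a**2 - a, b**2 - b]

# Word-level specification: the two input bits sum to the weighted outputs 2c + s.
spec = a + b - (2*c + s)

# The circuit is correct iff the specification polynomial lies in the ideal
# generated by the circuit polynomials, i.e. it reduces to zero modulo a
# Groebner basis of that ideal.
G = groebner(circuit, a, b, s, c, order='lex')
print("circuit implements the spec:", G.contains(spec))
```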
Reasoning about Function Objects
NASA Astrophysics Data System (ADS)
Nordio, Martin; Calcagno, Cristiano; Meyer, Bertrand; Müller, Peter; Tschannen, Julian
Modern object-oriented languages support higher-order implementations through function objects such as delegates in C#, agents in Eiffel, or closures in Scala. Function objects bring a new level of abstraction to the object-oriented programming model, and require a comparable extension to specification and verification techniques. We introduce a verification methodology that extends function objects with auxiliary side-effect free (pure) methods to model logical artifacts: preconditions, postconditions and modifies clauses. These pure methods can be used to specify client code abstractly, that is, independently from specific instantiations of the function objects. To demonstrate the feasibility of our approach, we have implemented an automatic prover, which verifies several non-trivial examples.
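The idea of attaching pure specification queries to function objects can be sketched in a few lines. The snippet below is a hypothetical Python analogue, not the authors' methodology or prover: a wrapper carries precondition and postcondition callables, and client code is written against those contracts rather than against a particular closure.

```python
from dataclasses import dataclass
from typing import Callable

@dataclass
class SpecifiedFunction:
    """A function object bundled with side-effect-free specification queries."""
    body: Callable[[int], int]
    precondition: Callable[[int], bool]        # must hold before calling body
    postcondition: Callable[[int, int], bool]  # relates the argument and the result

    def __call__(self, x: int) -> int:
        assert self.precondition(x), "precondition violated"
        result = self.body(x)
        assert self.postcondition(x, result), "postcondition violated"
        return result

# client code written only against the specification, not a particular closure
def apply_twice(f: SpecifiedFunction, x: int) -> int:
    return f(f(x))

# one possible instantiation: integer square root on non-negative inputs
isqrt = SpecifiedFunction(
    body=lambda x: int(x ** 0.5),
    precondition=lambda x: x >= 0,
    postcondition=lambda x, r: r * r <= x < (r + 1) * (r + 1),
)
print(apply_twice(isqrt, 81))   # isqrt(isqrt(81)) = isqrt(9) = 3
```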
Diode step stress program, JANTX1N5614
NASA Technical Reports Server (NTRS)
1978-01-01
The reliability of switching diode JANTX1N5614 was tested. The effect of power/temperature step stress on the diode was determined. Control sample units were maintained for verification of the electrical parametric testing. Results are reported.
Smith, Ryan L; Haworth, Annette; Panettieri, Vanessa; Millar, Jeremy L; Franich, Rick D
2016-05-01
Verification of high dose rate (HDR) brachytherapy treatment delivery is an important step, but is generally difficult to achieve. A technique is required to monitor the treatment as it is delivered, allowing comparison with the treatment plan and error detection. In this work, we demonstrate a method for monitoring the treatment as it is delivered and directly comparing the delivered treatment with the treatment plan in the clinical workspace. This treatment verification system is based on a flat panel detector (FPD) used for both pre-treatment imaging and source tracking. A phantom study was conducted to establish the resolution and precision of the system. A pretreatment radiograph of a phantom containing brachytherapy catheters is acquired and registration between the measurement and treatment planning system (TPS) is performed using implanted fiducial markers. The measured catheter paths immediately prior to treatment were then compared with the plan. During treatment delivery, the position of the (192)Ir source is determined at each dwell position by measuring the exit radiation with the FPD and directly compared to the planned source dwell positions. The registration between the two corresponding sets of fiducial markers in the TPS and radiograph yielded a registration error (residual) of 1.0 mm. The measured catheter paths agreed with the planned catheter paths on average to within 0.5 mm. The source positions measured with the FPD matched the planned source positions for all dwells on average within 0.6 mm (s.d. 0.3, min. 0.1, max. 1.4 mm). We have demonstrated a method for directly comparing the treatment plan with the delivered treatment that can be easily implemented in the clinical workspace. Pretreatment imaging was performed, enabling visualization of the implant before treatment delivery and identification of possible catheter displacement. Treatment delivery verification was performed by measuring the source position as each dwell was delivered. This approach using a FPD for imaging and source tracking provides a noninvasive method of acquiring extensive information for verification in HDR prostate brachytherapy.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Smith, Ryan L., E-mail: ryan.smith@wbrc.org.au; Millar, Jeremy L.; Franich, Rick D.
Purpose: Verification of high dose rate (HDR) brachytherapy treatment delivery is an important step, but is generally difficult to achieve. A technique is required to monitor the treatment as it is delivered, allowing comparison with the treatment plan and error detection. In this work, we demonstrate a method for monitoring the treatment as it is delivered and directly comparing the delivered treatment with the treatment plan in the clinical workspace. This treatment verification system is based on a flat panel detector (FPD) used for both pre-treatment imaging and source tracking. Methods: A phantom study was conducted to establish the resolution and precision of the system. A pretreatment radiograph of a phantom containing brachytherapy catheters is acquired and registration between the measurement and treatment planning system (TPS) is performed using implanted fiducial markers. The measured catheter paths immediately prior to treatment were then compared with the plan. During treatment delivery, the position of the (192)Ir source is determined at each dwell position by measuring the exit radiation with the FPD and directly compared to the planned source dwell positions. Results: The registration between the two corresponding sets of fiducial markers in the TPS and radiograph yielded a registration error (residual) of 1.0 mm. The measured catheter paths agreed with the planned catheter paths on average to within 0.5 mm. The source positions measured with the FPD matched the planned source positions for all dwells on average within 0.6 mm (s.d. 0.3, min. 0.1, max. 1.4 mm). Conclusions: We have demonstrated a method for directly comparing the treatment plan with the delivered treatment that can be easily implemented in the clinical workspace. Pretreatment imaging was performed, enabling visualization of the implant before treatment delivery and identification of possible catheter displacement. Treatment delivery verification was performed by measuring the source position as each dwell was delivered. This approach using a FPD for imaging and source tracking provides a noninvasive method of acquiring extensive information for verification in HDR prostate brachytherapy.
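The fiducial-based registration and residual reported in both records above can be illustrated with a standard least-squares rigid registration (a Kabsch/Procrustes solution) between planned and imaged marker coordinates. The sketch below uses invented 2D marker positions in millimetres and is only an illustration of the kind of computation involved, not the clinical software.

```python
import numpy as np

def rigid_register_2d(source, target):
    """Least-squares rotation and translation mapping source points onto target."""
    src, tgt = np.asarray(source, float), np.asarray(target, float)
    src_c, tgt_c = src - src.mean(0), tgt - tgt.mean(0)
    u, _, vt = np.linalg.svd(src_c.T @ tgt_c)           # Kabsch algorithm
    d = np.sign(np.linalg.det(vt.T @ u.T))               # guard against reflection
    rot = vt.T @ np.diag([1.0, d]) @ u.T
    trans = tgt.mean(0) - rot @ src.mean(0)
    return rot, trans

def registration_residual(source, target, rot, trans):
    """Mean distance between mapped source markers and their targets."""
    mapped = (rot @ np.asarray(source, float).T).T + trans
    return np.mean(np.linalg.norm(mapped - np.asarray(target, float), axis=1))

# toy usage: planned (TPS) vs imaged fiducial positions in mm (illustrative)
planned = np.array([[10.0, 20.0], [35.0, 22.0], [25.0, 45.0], [12.0, 40.0]])
theta = np.deg2rad(2.0)
R_true = np.array([[np.cos(theta), -np.sin(theta)], [np.sin(theta), np.cos(theta)]])
imaged = (R_true @ planned.T).T + np.array([1.5, -0.8]) \
         + 0.3 * np.random.default_rng(7).normal(size=planned.shape)
rot, trans = rigid_register_2d(planned, imaged)
print(f"mean residual: {registration_residual(planned, imaged, rot, trans):.2f} mm")
```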
DOE Office of Scientific and Technical Information (OSTI.GOV)
Damiani, Rick
This manual summarizes the theory and preliminary verifications of the JacketSE module, which is an offshore jacket sizing tool that is part of the Wind-Plant Integrated System Design & Engineering Model toolbox. JacketSE is based on a finite-element formulation and on user-prescribed inputs and design standards' criteria (constraints). The physics are highly simplified, with a primary focus on satisfying ultimate limit states and modal performance requirements. Preliminary validation work included comparisons against industry data and verification against ANSYS, a commercial finite-element analysis package. The results are encouraging, and future improvements to the code are recommended in this manual.
Classical verification of quantum circuits containing few basis changes
NASA Astrophysics Data System (ADS)
Demarie, Tommaso F.; Ouyang, Yingkai; Fitzsimons, Joseph F.
2018-04-01
We consider the task of verifying the correctness of quantum computation for a restricted class of circuits which contain at most two basis changes. This contains circuits giving rise to the second level of the Fourier hierarchy, the lowest level for which there is an established quantum advantage. We show that when the circuit has an outcome with probability at least the inverse of some polynomial in the circuit size, the outcome can be checked in polynomial time with bounded error by a completely classical verifier. This verification procedure is based on random sampling of computational paths and is only possible given knowledge of the likely outcome.
[Tobacco quality analysis of producing areas of Yunnan tobacco using near-infrared (NIR) spectrum].
Wang, Yi; Ma, Xiang; Wen, Ya-Dong; Yu, Chun-Xia; Wang, Luo-Ping; Zhao, Long-Lian; Li, Jun-Hui
2013-01-01
In the present study, tobacco quality analysis of different producing areas was carried out using spectrum projection and correlation methods. The industrial classification data were near-infrared (NIR) spectra, collected in 2010, of leaves from the middle part of the tobacco plant, supplied by Hongta Tobacco (Group) Co., Ltd. A total of 1276 superior tobacco leaf samples were collected from four producing areas: three areas (Yuxi, Chuxiong, and Zhaotong) in Yunnan province grow the K326 tobacco variety, and one area (Dali) grows the Hongda variety. When the samples were randomly divided in a 2:1 ratio into analysis and verification sets, the verification set corresponded with the analysis set under spectrum projection: the correlation coefficients of the first and second projection dimensions were all above 0.99. The study also discussed a method for obtaining quantitative similarity values between samples from different producing areas. The similarity values are instructive for tobacco planting plans, quality management, acquisition of raw tobacco materials, and tobacco leaf blending.
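The 2:1 split and projection-based comparison can be approximated by a simple stand-in: randomly split the spectra, project both sets onto the leading principal components of the analysis set, and correlate the matched quantiles of the per-component score distributions. The sketch below uses synthetic spectra and is only a rough analogue of the spectrum projection method used in the study.

```python
import numpy as np

rng = np.random.default_rng(8)

def split_2_1(spectra):
    """Randomly split spectra into analysis and verification sets (2:1)."""
    idx = rng.permutation(len(spectra))
    cut = 2 * len(spectra) // 3
    return spectra[idx[:cut]], spectra[idx[cut:]]

def projection_correlations(analysis, verification, n_components=2):
    """Project both sets onto the leading PCs of the analysis set and correlate
    the per-component score distributions through their matched quantiles."""
    mean = analysis.mean(axis=0)
    _, _, vt = np.linalg.svd(analysis - mean, full_matrices=False)
    basis = vt[:n_components]
    a_scores = (analysis - mean) @ basis.T
    v_scores = (verification - mean) @ basis.T
    quantiles = np.linspace(0.05, 0.95, 19)
    return [np.corrcoef(np.quantile(a_scores[:, k], quantiles),
                        np.quantile(v_scores[:, k], quantiles))[0, 1]
            for k in range(n_components)]

# toy usage with synthetic "spectra" (1276 samples x 200 wavelength channels)
spectra = rng.normal(size=(1276, 200)).cumsum(axis=1)   # smooth-ish curves
analysis, verification = split_2_1(spectra)
print(projection_correlations(analysis, verification))
```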
Electroacoustic verification of frequency modulation systems in cochlear implant users.
Fidêncio, Vanessa Luisa Destro; Jacob, Regina Tangerino de Souza; Tanamati, Liége Franzini; Bucuvic, Érika Cristina; Moret, Adriane Lima Mortari
2017-12-26
The frequency modulation system is a device that helps to improve speech perception in noise and is considered the most beneficial approach to improving speech recognition in noise in cochlear implant users. According to guidelines, a check needs to be performed before fitting the frequency modulation system. Although there are recommendations regarding the behavioral tests that should be performed when fitting the frequency modulation system to cochlear implant users, there are no published recommendations regarding the electroacoustic test that should be performed. The objective was to perform and determine the validity of an electroacoustic verification test for frequency modulation systems coupled to different cochlear implant speech processors. The sample included 40 participants between 5 and 18 years of age who used four different models of speech processors. For the electroacoustic evaluation, we used the Audioscan Verifit device with the HA-1 coupler and the listening check devices corresponding to each speech processor model. In cases where transparency was not achieved, the frequency modulation gain adjustment was modified and the Brazilian version of the "Phrases in Noise Test" was used to evaluate speech perception in competitive noise. Transparency between the frequency modulation system and the cochlear implant was observed in 85% of the participants evaluated. After adjusting the gain of the frequency modulation receiver in the other participants, the devices showed transparency when the electroacoustic verification test was repeated. It was also observed that patients demonstrated better speech perception in noise after the new adjustment; that is, in these cases the electroacoustic transparency produced behavioral transparency. The suggested electroacoustic evaluation protocol was effective in evaluating transparency between the frequency modulation system and the cochlear implant. Adjusting the speech processor and the frequency modulation system gain is essential when fitting this device. Copyright © 2017 Associação Brasileira de Otorrinolaringologia e Cirurgia Cérvico-Facial. Published by Elsevier Editora Ltda. All rights reserved.
1988-01-01
under field conditions. Sampling and analytical laboratory activities were performed by Ecology and Environment, Inc., and California Analytical...the proposed AER3 test conditions. All test samples would be obtained onsite by Ecology and Environment, Inc., of Buffalo, New York, and sent to...ensuring its safe operation. Ecology and Environment performed onsite verification sampling. This activity was coordinated with the Huber project team
Transistor step stress testing program for JANTX2N2905A
NASA Technical Reports Server (NTRS)
1979-01-01
The effect of power/temperature step stress when applied to the transistor JANTX2N2905A manufactured by Texas Instruments and Motorola is reported. A total of 48 samples from each manufacturer was submitted to the process outlined. In addition, two control sample units were maintained for verification of the electrical parametric testing. All test samples were subjected to the electrical tests outlined in Table 2 after completing the prior power/temperature step stress point.
A Design Rationale Capture Tool to Support Design Verification and Re-use
NASA Technical Reports Server (NTRS)
Hooey, Becky Lee; Da Silva, Jonny C.; Foyle, David C.
2012-01-01
A design rationale tool (DR tool) was developed to capture design knowledge to support design verification and design knowledge re-use. The design rationale tool captures design drivers and requirements, and documents the design solution including: intent (why it is included in the overall design); features (why it is designed the way it is); information about how the design components support design drivers and requirements; and, design alternatives considered but rejected. For design verification purposes, the tool identifies how specific design requirements were met and instantiated within the final design, and which requirements have not been met. To support design re-use, the tool identifies which design decisions are affected when design drivers and requirements are modified. To validate the design tool, the design knowledge from the Taxiway Navigation and Situation Awareness (T-NASA; Foyle et al., 1996) system was captured and the DR tool was exercised to demonstrate its utility for validation and re-use.
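The two queries described above, finding unmet requirements for verification and finding decisions affected by a changed driver for re-use, follow naturally once each decision records what it supports. The sketch below is a hypothetical data model for illustration only; the record names, fields, and example entries are assumptions and not the actual schema of the DR tool or of the T-NASA capture.

```python
# Illustrative (hypothetical) data model for a design-rationale capture tool:
# each design decision records the drivers/requirements it supports, its intent,
# and rejected alternatives, so verification and re-use queries become lookups.
from dataclasses import dataclass, field

@dataclass
class Decision:
    name: str
    intent: str                                             # why it is in the design
    supports: set[str] = field(default_factory=set)         # driver/requirement IDs it satisfies
    alternatives: list[str] = field(default_factory=list)   # considered but rejected

def unmet_requirements(requirements: set[str], decisions: list[Decision]) -> set[str]:
    """Verification view: requirements no decision claims to satisfy."""
    covered = set().union(*(d.supports for d in decisions)) if decisions else set()
    return requirements - covered

def affected_decisions(changed_driver: str, decisions: list[Decision]) -> list[str]:
    """Re-use view: decisions to revisit when a driver or requirement changes."""
    return [d.name for d in decisions if changed_driver in d.supports]

decisions = [
    Decision("HUD taxi route overlay", "keep eyes out the window", {"REQ-SA-1", "DRV-LOW-VIS"}),
    Decision("Moving map display", "show ownship position", {"REQ-SA-2"}, ["paper chart only"]),
]
print(unmet_requirements({"REQ-SA-1", "REQ-SA-2", "REQ-SA-3"}, decisions))  # {'REQ-SA-3'}
print(affected_decisions("DRV-LOW-VIS", decisions))                         # ['HUD taxi route overlay']
```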
Systems Approach to Arms Control Verification
DOE Office of Scientific and Technical Information (OSTI.GOV)
Allen, K; Neimeyer, I; Listner, C
2015-05-15
Using the decades of experience of developing concepts and technologies for verifying bilateral and multilateral arms control agreements, a broad conceptual systems approach is being developed that takes into account varying levels of information and risk. The IAEA has already demonstrated the applicability of a systems approach by implementing safeguards at the State level, with acquisition path analysis as the key element. In order to test whether such an approach could also be implemented for arms control verification, an exercise was conducted in November 2014 at the JRC ITU Ispra. Based on the scenario of a hypothetical treaty between two model nuclear weapons states aimed at capping their nuclear arsenals at existing levels, the goal of this exercise was to explore how to use acquisition path analysis in an arms control context. Our contribution will present the scenario, objectives and results of this exercise, and attempt to define future workshops aimed at further developing verification measures that will deter or detect treaty violations.
Verification of combined thermal-hydraulic and heat conduction analysis code FLOWNET/TRUMP
NASA Astrophysics Data System (ADS)
Maruyama, Soh; Fujimoto, Nozomu; Kiso, Yoshihiro; Murakami, Tomoyuki; Sudo, Yukio
1988-09-01
This report presents the verification results for FLOWNET/TRUMP, a combined thermal-hydraulic and heat conduction analysis code. The code has been used in the core thermal-hydraulic design of the High Temperature Engineering Test Reactor (HTTR), in particular for analyzing the flow distribution among fuel block coolant channels, determining the thermal boundary conditions for fuel block stress analysis, and estimating fuel temperatures in the event of a fuel block coolant channel blockage accident. The Japan Atomic Energy Research Institute has been planning to construct the HTTR to establish basic technologies for future advanced very high temperature gas-cooled reactors and to serve as an irradiation test reactor for the promotion of innovative high-temperature frontier technologies. The code was verified by comparing analytical results with experimental results from the Helium Engineering Demonstration Loop Multi-channel Test Section (HENDEL T1-M), which uses simulated fuel rods and fuel blocks.
Verification and Validation of KBS with Neural Network Components
NASA Technical Reports Server (NTRS)
Wen, Wu; Callahan, John
1996-01-01
Artificial Neural Networks (ANNs) play an important role in developing robust Knowledge Based Systems (KBS). The ANN-based components used in these systems learn to give appropriate predictions through training with correct input-output data patterns. Unlike a traditional KBS that depends on a rule database and a production engine, an ANN-based system mimics the decisions of an expert without explicitly formulating if-then rules. In fact, ANNs demonstrate their superiority when such if-then rules are hard for a human expert to formulate. Verification of a traditional knowledge based system rests on proving the consistency and completeness of the rule knowledge base and the correctness of the production engine. These techniques, however, cannot be applied directly to ANN-based components. In this position paper, we propose a verification and validation procedure for KBS with ANN-based components. The essence of the procedure is to obtain an accurate system specification through incremental modification of the specification using an ANN rule extraction algorithm.
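The verification loop implied above, comparing an ANN component's outputs against a set of input-output specification cases and using the disagreements to refine the specification or retrain the component, can be illustrated very simply. This is a toy sketch of that checking step only; the threshold "network", the case format, and the function names are assumptions, and it is not the paper's rule-extraction algorithm.

```python
# Minimal sketch: check an ANN component against input-output specification
# patterns and report mismatches, which then drive incremental refinement of
# the specification (or retraining). Illustration only.
from typing import Callable, Sequence

def verify_against_spec(ann: Callable[[Sequence[float]], int],
                        spec_cases: list[tuple[Sequence[float], int]]):
    """Return (input, expected, actual) for every spec case the ANN violates."""
    return [(x, expected, ann(x)) for x, expected in spec_cases if ann(x) != expected]

# Toy stand-in for a trained network: a fixed threshold classifier.
toy_ann = lambda x: int(sum(x) > 1.0)

spec = [([0.2, 0.1], 0), ([0.9, 0.8], 1), ([0.6, 0.3], 1)]   # the last case will disagree
for x, expected, actual in verify_against_spec(toy_ann, spec):
    print(f"spec violation at {x}: expected {expected}, got {actual} -> refine spec or retrain")
```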
DOT National Transportation Integrated Search
1981-08-01
The Transit Reliability Information Program (TRIP) is a government-initiated program to assist the transit industry in satisfying its need for transit reliability information. TRIP provides this assistance through the operation of a national Data Ban...
Xu, Xinxing; Li, Wen; Xu, Dong
2015-12-01
In this paper, we propose a new approach to improve face verification and person re-identification in the RGB images by leveraging a set of RGB-D data, in which we have additional depth images in the training data captured using depth cameras such as Kinect. In particular, we extract visual features and depth features from the RGB images and depth images, respectively. As the depth features are available only in the training data, we treat the depth features as privileged information, and we formulate this task as a distance metric learning with privileged information problem. Unlike the traditional face verification and person re-identification tasks that only use visual features, we further employ the extra depth features in the training data to improve the learning of distance metric in the training process. Based on the information-theoretic metric learning (ITML) method, we propose a new formulation called ITML with privileged information (ITML+) for this task. We also present an efficient algorithm based on the cyclic projection method for solving the proposed ITML+ formulation. Extensive experiments on the challenging faces data sets EUROCOM and CurtinFaces for face verification as well as the BIWI RGBD-ID data set for person re-identification demonstrate the effectiveness of our proposed approach.
Basic principles and recent observations of rotationally sampled wind
NASA Technical Reports Server (NTRS)
Connell, James R.
1995-01-01
The concept of rotationally sampled wind speed is described. The unusual wind characteristics that result from rotationally sampling the wind are shown first for early measurements made using an 8-point ring of anemometers on a vertical plane array of meteorological towers. Quantitative characterization of the rotationally sampled wind is made in terms of the power spectral density function of the wind speed. Verification of the importance of the new concept is demonstrated with spectral analyses of the response of the MOD-OA blade flapwise root bending moment and the corresponding rotational analysis of the wind measured immediately upwind of the MOD-OA using a 12-point ring of anemometers on a 7-tower vertical plane array. The Pacific Northwest Laboratory (PNL) theory of the rotationally sampled wind speed power spectral density function is tested successfully against the wind spectrum measured at the MOD-OA vertical plane array. A single-tower empirical model of the rotationally sampled wind speed is also successfully tested against the measurements from the full vertical plane array. Rotational measurements of the wind velocity with hotfilm anemometers attached to rotating blades are shown to be accurate and practical for research on winds at the blades of wind turbines. Some measurements at the rotor blade of a MOD-2 turbine using the hotfilm technique in a pilot research program are shown. They are compared and contrasted to the expectations based upon application of the PNL theory of rotationally sampled wind to the MOD-2 size and rotation rate but without teeter, blade bending, or rotor induction accounted for. Finally, the importance of temperature layering and of wind modifications due to flow over complex terrain is demonstrated by the use of hotfilm anemometer data, and meteorological tower and acoustic doppler sounder data from the MOD-2 site at Goodnoe Hills, Washington.
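The central quantitative object in the abstract above is the power spectral density of the rotationally sampled wind speed, which concentrates energy at harmonics of the rotation rate (1P, 2P, 3P, ...). The sketch below computes such a spectrum with Welch's method on a synthetic series; the rotation rate, sampling rate, and signal construction are illustrative assumptions, not MOD-OA or MOD-2 measurements.

```python
# Hedged sketch: power spectral density of a rotationally sampled wind signal
# via Welch's method. The signal is synthetic -- turbulence-like drift plus
# energy at harmonics of the rotor rotation rate, the signature rotational
# sampling produces at 1P, 2P, 3P -- and is not measured data.
import numpy as np
from scipy.signal import welch

fs = 20.0                 # sampling rate [Hz] (assumed)
rotation_hz = 0.67        # rotor rotation rate "1P" [Hz] (illustrative value)
t = np.arange(0, 600, 1 / fs)

rng = np.random.default_rng(1)
turbulence = np.cumsum(rng.normal(scale=0.05, size=t.size))        # slow random drift
harmonics = sum(0.5 / n * np.sin(2 * np.pi * n * rotation_hz * t) for n in (1, 2, 3))
wind = 8.0 + turbulence + harmonics                                 # mean wind + fluctuations

freq, psd = welch(wind, fs=fs, nperseg=4096)
for n in (1, 2, 3):       # report the spectral density near the 1P, 2P, 3P peaks
    k = np.argmin(np.abs(freq - n * rotation_hz))
    print(f"{n}P ({freq[k]:.2f} Hz): PSD = {psd[k]:.3f} (m/s)^2/Hz")
```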
2008-02-28
An ER-2 high-altitude Earth science aircraft banks away during a flight over the southern Sierra Nevada. NASA’s Armstrong Flight Research Center operates two of the Lockheed-built aircraft on a wide variety of environmental science, atmospheric sampling, and satellite data verification missions.
Methods and Procedures in PIRLS 2016
ERIC Educational Resources Information Center
Martin, Michael O., Ed.; Mullis, Ina V. S., Ed.; Hooper, Martin, Ed.
2017-01-01
"Methods and Procedures in PIRLS 2016" documents the development of the Progress in International Reading Literacy Study (PIRLS) assessments and questionnaires and describes the methods used in sampling, translation verification, data collection, database construction, and the construction of the achievement and context questionnaire…
[The Dose Effect of Isocenter Selection during IMRT Dose Verification with the 2D Chamber Array].
Xie, Chuanbin; Cong, Xiaohu; Xu, Shouping; Dai, Xiangkun; Wang, Yunlai; Han, Lu; Gong, Hanshun; Ju, Zhongjian; Ge, Ruigang; Ma, Lin
2015-03-01
To investigate the dose effect of isocenter selection during IMRT dose verification with a 2D chamber array, IMRT plans were designed for samples collected from 10 patients. For each plan the isocenter was defined independently as P(o), P(x), or P(y): P(o) was fixed at the target center, and the other two points were shifted 8 cm from the target center in the x or y direction, respectively. The PTW 729 array was used for 2D dose verification in the three groups, with all beam angles set to 0 degrees. The γ-analysis passing rates for the whole plan and for each beam were obtained using different criteria in the three groups. The results showed that the mean γ-analysis passing rate was highest in the P(o) group, and that the mean passing rate for the whole plan was better than that for individual beams. In addition, the passing rate worsened as dose leakage between the leaves increased in the P(y) group. Therefore, the choice of isocenter has a visible effect on IMRT dose verification with a 2D chamber array, and the planning isocenter should be placed close to the geometric center of the target.
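The γ analysis referred to here compares a measured and a planned dose plane point by point, accepting a point if any nearby evaluated dose lies within the combined dose-difference and distance-to-agreement criteria. Below is a simplified, brute-force global 3%/3 mm implementation on synthetic dose planes; it illustrates the metric only, and the grid spacing, threshold, and dose values are assumptions, not the behavior of the clinical array software.

```python
# Simplified global gamma analysis (3%/3 mm) on a 2D grid, brute force over a
# local search window. Illustration of the metric only; clinical software adds
# interpolation, absolute calibration, and configurable low-dose thresholds.
import numpy as np

def gamma_pass_rate(ref, eval_, spacing_mm=2.5, dose_crit=0.03, dist_crit_mm=3.0):
    ny, nx = ref.shape
    dmax = ref.max()
    search = int(np.ceil(dist_crit_mm / spacing_mm)) + 1
    passed, total = 0, 0
    for i in range(ny):
        for j in range(nx):
            if ref[i, j] < 0.1 * dmax:                     # skip the low-dose region
                continue
            total += 1
            best = np.inf
            for di in range(-search, search + 1):
                for dj in range(-search, search + 1):
                    ii, jj = i + di, j + dj
                    if not (0 <= ii < ny and 0 <= jj < nx):
                        continue
                    dist2 = (di * spacing_mm) ** 2 + (dj * spacing_mm) ** 2
                    dose2 = ((eval_[ii, jj] - ref[i, j]) / (dose_crit * dmax)) ** 2
                    best = min(best, dist2 / dist_crit_mm ** 2 + dose2)
            passed += best <= 1.0
    return 100.0 * passed / total

rng = np.random.default_rng(2)
planned = np.outer(np.hanning(40), np.hanning(40)) * 200.0            # synthetic dose plane [cGy]
measured = planned * (1 + rng.normal(scale=0.01, size=planned.shape))  # 1% measurement noise
print(f"gamma passing rate: {gamma_pass_rate(planned, measured):.1f}%")
```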
OH/H2O Detection Capability Evaluation on Chang'e-5 Lunar Mineralogical Spectrometer (LMS)
NASA Astrophysics Data System (ADS)
Liu, Bin; Ren, Xin; Liu, Jianjun; Li, Chunlai; Mu, Lingli; Deng, Liyan
2016-10-01
The Chang'e-5 (CE-5) lunar sample return mission is scheduled to launch in 2017 to bring back lunar regolith and drill samples. The Chang'e-5 Lunar Mineralogical Spectrometer (LMS), one of the three sets of scientific payloads installed on the lander, is used to collect in-situ spectra and analyze the mineralogical composition of the sampling site. It also helps to select the sampling site and to compare laboratory spectra of returned samples with the in-situ data. LMS employs acousto-optic tunable filters (AOTFs) and is composed of a VIS/NIR module (0.48 μm-1.45 μm) and an IR module (1.4 μm-3.2 μm). It has a spectral resolution ranging from 3 to 25 nm, with a field of view (FOV) of 4.24°×4.24°. Unlike the Chang'e-3 VIS/NIR Imaging Spectrometer (VNIS), the spectral coverage of LMS is extended from 2.4 μm to 3.2 μm, giving it the capability to identify H2O/OH absorption features around 2.7 μm. An aluminum plate and an Infragold plate are fixed in the dust cover and are used as calibration targets in the VIS/NIR and IR spectral ranges, respectively, when the dust cover is open. Before launch, a ground verification test of LMS needs to be conducted in order to: 1) test and verify the detection capability of LMS through evaluation of the quality of the image and spectral data collected for simulated lunar samples; and 2) evaluate the accuracy of the data processing methods by simulating the instrument working on the Moon. The ground verification test will be conducted both in the lab and in the field. The spectra of simulated lunar regolith/mineral samples will be collected simultaneously by the LMS and two calibrated spectrometers: an FTIR spectrometer (Model 102F) and an ASD FieldSpec 4 Hi-Res spectrometer. In this study, the results of the LMS ground verification test will be reported, and the OH/H2O detection capability will be evaluated in particular.
MEMS resonant load cells for micro-mechanical test frames: feasibility study and optimal design
NASA Astrophysics Data System (ADS)
Torrents, A.; Azgin, K.; Godfrey, S. W.; Topalli, E. S.; Akin, T.; Valdevit, L.
2010-12-01
This paper presents the design, optimization and manufacturing of a novel micro-fabricated load cell based on a double-ended tuning fork. The device geometry and operating voltages are optimized for maximum force resolution and range, subject to a number of manufacturing and electromechanical constraints. All optimizations are enabled by analytical modeling (verified by selected finite elements analyses) coupled with an efficient C++ code based on the particle swarm optimization algorithm. This assessment indicates that force resolutions of ~0.5-10 nN are feasible in vacuum (~1-50 mTorr), with force ranges as large as 1 N. Importantly, the optimal design for vacuum operation is independent of the desired range, ensuring versatility. Experimental verifications on a sub-optimal device fabricated using silicon-on-glass technology demonstrate a resolution of ~23 nN at a vacuum level of ~50 mTorr. The device demonstrated in this article will be integrated in a hybrid micro-mechanical test frame for unprecedented combinations of force resolution and range, displacement resolution and range, optical (or SEM) access to the sample, versatility and cost.
DOE Office of Scientific and Technical Information (OSTI.GOV)
J. M. Capron
2008-04-15
The 100-F-50 waste site, part of the 100-FR-2 Operable Unit, is a steel stormwater runoff culvert that runs between two railroad grades in the south-central portion of the 100-F Area. The culvert exiting the west side of the railroad grade is mostly encased in concrete and surrounded by a concrete stormwater collection depression partially filled with soil and vegetation. The drain pipe exiting the east side of the railroad grade embankment is partially filled with soil and rocks. The 100-F-50 stormwater diversion culvert confirmatory sampling results support a reclassification of this site to no action. The current site conditions achieve the remedial action objectives and corresponding remedial action goals established in the Remaining Sites ROD. The results of confirmatory sampling show that residual contaminant concentrations do not preclude any future uses and allow for unrestricted use of shallow zone soils. The results also demonstrate that residual contaminant concentrations are protective of groundwater and the Columbia River.
Information Theory for Gabor Feature Selection for Face Recognition
NASA Astrophysics Data System (ADS)
Shen, Linlin; Bai, Li
2006-12-01
A discriminative and robust feature—kernel enhanced informative Gabor feature—is proposed in this paper for face recognition. Mutual information is applied to select a set of informative and nonredundant Gabor features, which are then further enhanced by kernel methods for recognition. Compared with one of the top performing methods in the 2004 Face Verification Competition (FVC2004), our methods demonstrate a clear advantage over existing methods in accuracy, computation efficiency, and memory cost. The proposed method has been fully tested on the FERET database using the FERET evaluation protocol. Significant improvements on three of the test data sets are observed. Compared with the classical Gabor wavelet-based approaches using a huge number of features, our method requires less than 4 milliseconds to retrieve a few hundreds of features. Due to the substantially reduced feature dimension, only 4 seconds are required to recognize 200 face images. The paper also unified different Gabor filter definitions and proposed a training sample generation algorithm to reduce the effects caused by unbalanced number of samples available in different classes.
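The feature-selection step described above, ranking Gabor filter responses by how informative they are about identity, can be sketched with standard tools. The sketch below builds a small Gabor bank, samples response magnitudes at fixed grid points, and ranks the features by mutual information with the class labels; it is a stand-in on synthetic images, not the paper's informative-Gabor algorithm, and the bank size, grid, and image dimensions are assumptions.

```python
# Hedged sketch of mutual-information-based Gabor feature selection: build a
# small Gabor filter bank, use response magnitudes at fixed grid points as
# features, and keep the features with the highest mutual information with the
# identity labels. Synthetic images stand in for face data.
import numpy as np
from scipy.signal import fftconvolve
from sklearn.feature_selection import mutual_info_classif

def gabor_kernel(freq, theta, sigma=4.0, size=21):
    half = size // 2
    y, x = np.mgrid[-half:half + 1, -half:half + 1]
    xr = x * np.cos(theta) + y * np.sin(theta)
    yr = -x * np.sin(theta) + y * np.cos(theta)
    return np.exp(-(xr**2 + yr**2) / (2 * sigma**2)) * np.cos(2 * np.pi * freq * xr)

rng = np.random.default_rng(3)
images = rng.normal(size=(60, 32, 32))          # synthetic "face" images
labels = np.repeat(np.arange(10), 6)            # 10 subjects, 6 images each

bank = [gabor_kernel(f, th) for f in (0.1, 0.2) for th in np.linspace(0, np.pi, 4, endpoint=False)]
grid = [(8, 8), (8, 24), (16, 16), (24, 8), (24, 24)]   # sample points per response map

features = np.array([[abs(fftconvolve(img, k, mode="same")[r, c])
                      for k in bank for (r, c) in grid] for img in images])

mi = mutual_info_classif(features, labels, random_state=0)
top = np.argsort(mi)[::-1][:10]                  # indices of the 10 most informative features
print("selected feature indices:", top)
```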
DOE Office of Scientific and Technical Information (OSTI.GOV)
Harpenau, Evan M.
The U.S. Department of Energy (DOE) Order 458.1 requires independent verification (IV) of DOE cleanup projects (DOE 2011). The Oak Ridge Institute for Science and Education (ORISE) has been designated as the responsible organization for IV of the High Flux Beam Reactor (HFBR) Stack and Grounds area at Brookhaven National Laboratory (BNL) in Upton, New York. The IV evaluation may consist of an in-process inspection with document and data reviews (Type A Verification) or a confirmatory survey of the site (Type B Verification). DOE and ORISE determined that a Type A verification of the documents and data for the HFBR Stack and Grounds: Survey Units (SU) 6, 7, and 8 was appropriate based on the initial survey unit classification, the walkover surveys, and the final analytical results provided by the Brookhaven Science Associates (BSA). The HFBR Stack and Grounds surveys began in June 2011 and were completed in September 2011. Survey activities by BSA included gamma walkover scans and sampling of the as-left soils in accordance with the BSA Work Procedure (BNL 2010a). The Field Sampling Plan - Stack and Remaining HFBR Outside Areas (FSP) stated that gamma walkover surveys would be conducted with a bare sodium iodide (NaI) detector, and a collimated detector would be used to check areas with elevated count rates to locate the source of the high readings (BNL 2010b). BSA used the Multi-Agency Radiation Survey and Site Investigation Manual (MARSSIM) principles for determining the classifications of each survey unit. Therefore, SUs 6 and 7 were identified as Class 1 and SU 8 was deemed Class 2 (BNL 2010b). Gamma walkover surveys of SUs 6, 7, and 8 were completed using a 2X2 NaI detector coupled to a data-logger with a global positioning system (GPS). The 100% scan surveys conducted prior to the final status survey (FSS) sampling identified two general soil areas and two isolated soil locations with elevated radioactivity. The general areas of elevated activity identified were investigated further with a collimated NaI detector. The uncollimated average gamma count rate was less than 15,000 counts per minute (cpm) for the SU 6, 7, and 8 composite area (BNL 2011a). Elevated count rates were observed in portions of each survey unit. The general areas of elevated counts near the Building 801 ventilation and operations and the entry to the Stack were determined to be directly related to the radioactive processes in those structures. To compensate for this radioactive shine, a collimated or shielded detector was used to lower the background count rate (BNL 2011b and c). This allowed the surveyor(s) to distinguish between background and actual radioactive contamination. Collimated gamma survey count rates in these shine-affected areas were below 9,000 cpm (BNL 2011a). The average background count rate of 7,500 cpm was reported by BSA for uncollimated NaI detectors (BNL 2011d). The average collimated background ranged from 4,500-6,500 cpm in the westernmost part of SU 8 and from 2,000-3,500 cpm in all other areas (BNL 2011e). Based on these data, no further investigations were necessary for these general areas. SU 8 was the only survey unit that exhibited verified elevated radioactivity levels. The first of two isolated locations of elevated radioactivity had an uncollimated direct measurement of 50,000 cpm with an area background of 7,500 cpm (BNL 2011f). The second small area exhibiting elevated radiation levels was identified at a depth of 6 inches from the surface.
The maximum reported count rate of 28,000 cpm was observed during scanning (BNL 2011g). The affected areas were remediated, and the contaminated soils were placed in an intermodal container for disposal. BSA's post-remediation walkover surveys were expanded to include a 10-foot radius around the excavated locations, and it was determined that further investigation was not required for these areas (BNL 2011f and g). The post-remediation soil samples were collected and analyzed with onsite gamma spectroscopy equipment. These samples were also included with the FSS samples that were analyzed at an offsite facility for the primary radionuclides of concern (ROCs) (i.e., cesium-137, strontium-90, and radium-226). Analysis included full-spectrum gamma spectroscopy for all samples. Sr-90 analysis was completed on all samples from SUs 6 and 7; however, in SU 8, Sr-90 analysis was completed only on the cores, composites, field blank, and duplicate samples. Alpha spectroscopy, as well as liquid scintillation analyses for tritium, carbon-14, and nickel-63 concentrations, was completed on the composite samples from SUs 6 and 7. Various cores, composites, and samples from the remediated areas of SU 8 received alpha spectroscopy as well as liquid scintillation analyses for tritium, carbon-14, and nickel-63 to determine the respective ROC concentrations (BNL 2011h, i, and j). BSA submitted the FSS data and analytical results to demonstrate that remediation efforts complied with the specified cleanup goal of less than or equal to 15 millirem per year (mrem/yr) above background to a resident in 50 years (BNL 2011a through j). ORISE has reviewed the project documentation and FSS data for the HFBR Stack and Grounds: SUs 6, 7, and 8. The highest concentrations of the primary ROCs reported were 5.92 picocuries per gram (pCi/g) for Cs-137 and 2.03 pCi/g for Sr-90, with both results qualified as less than the minimum detectable activity (MDA). For Ra-226, the highest detected concentration in the FSS data provided was 0.682 pCi/g. Other potential secondary contaminants were below their respective MDAs. Therefore, ORISE is of the opinion that BSA has provided sufficient evidence to demonstrate compliance with the 15 mrem/yr cleanup objective for the final status survey data provided.
Advanced Curation Protocols for Mars Returned Sample Handling
NASA Astrophysics Data System (ADS)
Bell, M.; Mickelson, E.; Lindstrom, D.; Allton, J.
Introduction: Johnson Space Center has over 30 years of experience handling precious samples, including lunar rocks and Antarctic meteorites. However, we recognize that future curation of samples from such missions as Genesis, Stardust, and Mars Sample Return will require a high degree of biosafety combined with extremely low levels of inorganic, organic, and biological contamination. To satisfy these requirements, research in the JSC Advanced Curation Lab is currently focused on two major areas: preliminary examination techniques, and cleaning and verification techniques. Preliminary Examination Techniques: In order to minimize the number of paths for contamination, we are exploring the synergy between human and robotic sample handling in a controlled environment to help determine the limits of clean curation. Within the Advanced Curation Laboratory is a prototype, next-generation glovebox, which contains a robotic micromanipulator. The remotely operated manipulator has six degrees of freedom and can be programmed to perform repetitive sample handling tasks. Protocols are being tested and developed to perform curation tasks such as rock splitting, weighing, imaging, and storing. Techniques for sample transfer enabling more detailed remote examination without compromising the integrity of sample science are also being developed. The glovebox is equipped with a rapid transfer port through which samples can be passed without exposure. The transfer is accomplished by using a unique seal and engagement system which allows passage between containers while maintaining a first seal to the outside environment and a second seal to prevent the outside of the container cover and port door from becoming contaminated by the material being transferred. Cleaning and Verification Techniques: As part of the contamination control effort, innovative cleaning techniques are being identified and evaluated in conjunction with sensitive cleanliness verification methods. Toward this end, cleaning techniques such as ultrasonication in ultra-pure water (UPW), oxygen (O2) plasma, and carbon dioxide (CO2) "snow" are being used to clean a variety of different contaminants on a variety of different surfaces. Additionally, once cleaned, techniques to directly verify the surface cleanliness are being developed. These include X-ray photoelectron spectroscopy (XPS) quantification and screening with contact angle measurements, which can be correlated with XPS standards. Methods developed in the Advanced Curation Laboratory will determine the extent to which inorganic and biological contamination can be controlled and minimized.
HiMAT highly maneuverable aircraft technology, flight report
NASA Technical Reports Server (NTRS)
1982-01-01
Flight verification of a primary flight control system designed to control the unstable HiMAT aircraft is presented. Also reported are the initial flight demonstration of a maneuver autopilot in the level cruise mode and the gathering of a limited amount of airspeed calibration data.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lavietes, A.; Kalkhoran, N.
The overall goal of this project was to demonstrate a compact gamma-ray spectroscopic system with better energy resolution and lower costs than scintillator-based detector systems for uranium enrichment analysis applications.
Verification of Wind Measurement to 450-Meter Altitude with Mobile Laser Doppler System
DOT National Transportation Integrated Search
1977-12-01
The Lockheed mobile atmospheric unit is a laser Doppler velocimeter system designed for the remote sensing of winds. The capability of the laser Doppler velocimeter accurately to measure winds to 150-meter altitude has been previously demonstrated. T...
Pyroelectric effect in triglycine sulphate single crystals - Differential measurement method
NASA Astrophysics Data System (ADS)
Trybus, M.
2018-06-01
A simple mathematical model of the pyroelectric phenomenon was used to explain the electric response of TGS (triglycine sulphate) samples during linear heating in the ferroelectric and paraelectric phases. Experimental verification of the mathematical model was carried out: TGS single crystals were grown and four-electrode samples were fabricated. Differential measurements of the pyroelectric response of two different regions of the samples were performed, and the results were compared with data obtained from the model. The experimental results are in good agreement with the model calculations.
AN ENVIRONMENTAL TECHNOLOGY VERIFICATION (ETV) TESTING OF FOUR MERCURY EMISSION SAMPLING SYSTEMS
CEMs - Tekran Instrument Corp. Series 3300 and Thermo Electron's Mercury Freedom System Continuous Emission Monitors (CEMs) for mercury are designed to determine total and/or chemically speciated vapor-phase mercury in combustion emissions. Performance for mercury CEMs are cont...
FIELD VERIFICATION OF LINERS FROM SANITARY LANDFILLS
Liner specimens from three existing landfill sites were collected and examined to determine the changes in their physical properties over time and to validate data being developed through laboratory research. Samples examined included a 15-mil PVC liner from a sludge lagoon in Ne...
Some Methods for Evaluating Program Implementation.
ERIC Educational Resources Information Center
Hardy, Roy A.
An approach to evaluating program implementation is described. This approach includes the development of a project description which includes a structure matrix, sampling from the structure matrix, and preparing an implementation evaluation plan. The implementation evaluation plan should include: (1) verification of implementation of planned…
NASA Astrophysics Data System (ADS)
Ament, F.; Weusthoff, T.; Arpagaus, M.; Rotach, M.
2009-04-01
The main aim of the WWRP Forecast Demonstration Project MAP D-PHASE is to demonstrate the ability of today's models to forecast heavy precipitation and flood events in the Alpine region. To this end, an end-to-end, real-time forecasting system was installed and operated during the D-PHASE Operations Period from June to November 2007. Part of this system is a set of 30 numerical weather prediction models (deterministic as well as ensemble systems) operated by weather services and research institutes, which issue alerts if predicted precipitation accumulations exceed critical thresholds. In addition to the real-time alerts, all relevant model fields of these simulations are stored in a central data archive. This comprehensive data set allows a detailed assessment of today's quantitative precipitation forecast (QPF) performance in the Alpine region. We will present results of QPF verification against Swiss radar and rain gauge data, both from a qualitative point of view, in terms of alerts, and from a quantitative perspective, in terms of precipitation rate. Various influencing factors such as lead time, accumulation time, selection of warning thresholds, and bias corrections will be discussed. In addition to traditional verification of area-average precipitation amounts, the ability of the models to predict the correct precipitation statistics, without requiring a point-to-point match, will be assessed using modern fuzzy verification techniques. Both analyses reveal significant advantages of deep-convection-resolving models over coarser models with parameterized convection. An intercomparison of the model forecasts themselves reveals remarkably high variability between different models and makes it worthwhile to evaluate the potential of a multi-model ensemble. Various multi-model ensemble strategies will be tested by combining D-PHASE models into virtual ensemble systems.
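One widely used fuzzy (neighbourhood) verification score of the kind mentioned above is the fractions skill score, which rewards a forecast that reproduces the precipitation statistics within a neighbourhood rather than requiring a point-to-point match. The sketch below computes it on synthetic fields; the thresholds, grid, and displaced-forecast construction are illustrative assumptions, not D-PHASE model or radar data.

```python
# Sketch of a neighbourhood ("fuzzy") verification score: the fractions skill
# score (FSS). Forecast and observed rain fields are thresholded, fractions of
# rainy pixels are computed over a square neighbourhood, and FSS compares the
# two fraction fields. Synthetic fields are used for illustration.
import numpy as np
from scipy.ndimage import uniform_filter

def fss(forecast, observed, threshold_mm, neighborhood_px):
    f = uniform_filter((forecast >= threshold_mm).astype(float), size=neighborhood_px)
    o = uniform_filter((observed >= threshold_mm).astype(float), size=neighborhood_px)
    mse = np.mean((f - o) ** 2)
    mse_ref = np.mean(f ** 2) + np.mean(o ** 2)
    return 1.0 - mse / mse_ref if mse_ref > 0 else np.nan

rng = np.random.default_rng(4)
obs = rng.gamma(shape=0.3, scale=5.0, size=(200, 200))        # synthetic rain field [mm/h]
fcst = np.roll(obs, shift=(5, -3), axis=(0, 1)) * 1.1          # displaced, slightly biased forecast

for scale in (1, 5, 25):
    print(f"FSS at {scale}-pixel scale, 1 mm/h threshold: {fss(fcst, obs, 1.0, scale):.3f}")
```

The score increases with neighbourhood size for a displaced forecast, which is exactly the behaviour that makes neighbourhood methods informative for convection-resolving models.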
Development and Multi-laboratory Verification of US EPA ...
A drinking water method for seven pesticides and pesticide degradates is presented that addresses the occurrence monitoring needs of the US Environmental Protection Agency (EPA) for a future Unregulated Contaminant Monitoring Regulation (UCMR). The method employs online solid phase extraction-liquid chromatography–tandem mass spectrometry (SPE-LC–MS-MS). Online SPE-LC–MS-MS has the potential to offer cost-effective, faster, more sensitive and more rugged methods than the traditional offline SPE approach due to complete automation of the SPE process, as well as seamless integration with the LC–MS-MS system. The method uses 2-chloroacetamide, ascorbic acid and Trizma to preserve the drinking water samples for up to 28 days. The mean recoveries in drinking water (from a surface water source) fortified with method analytes are 87.1–112% with relative standard deviations of <14%. Single laboratory lowest concentration minimum reporting levels of 0.27–1.7 ng/L are demonstrated with this methodology. Multi-laboratory data are presented that demonstrate method ruggedness and transferability. The final method meets all of the EPA's UCMR survey requirements for sample collection and storage, precision, accuracy, and sensitivity. The journal article describes the development of drinking water Method 543 for analysis of selected CCL 3 chemicals. It is anticipated this method may be used in a future Unregulated Contaminant Monitoring Regulation to gather nationw
NASA Technical Reports Server (NTRS)
Salazar, Giovanni; Droba, Justin C.; Oliver, Brandon; Amar, Adam J.
2016-01-01
With the recent development of multi-dimensional thermal protection system (TPS) material response codes, the capability to account for surface-to-surface radiation exchange in complex geometries is critical. This paper presents recent efforts to implement such capabilities in the CHarring Ablator Response (CHAR) code developed at NASA's Johnson Space Center. This work also describes the different numerical methods implemented in the code to compute geometric view factors for radiation problems involving multiple surfaces. Verification of the code's radiation capabilities and results of a code-to-code comparison are presented. Finally, a demonstration case of a two-dimensional ablating cavity with enclosure radiation accounting for a changing geometry is shown.
Formal Analysis of BPMN Models Using Event-B
NASA Astrophysics Data System (ADS)
Bryans, Jeremy W.; Wei, Wei
The use of business process models has gone far beyond documentation purposes. In the development of business applications, they can play the role of an artifact on which high level properties can be verified and design errors can be revealed in an effort to reduce overhead at later software development and diagnosis stages. This paper demonstrates how formal verification may add value to the specification, design and development of business process models in an industrial setting. The analysis of these models is achieved via an algorithmic translation from the de-facto standard business process modeling language BPMN to Event-B, a widely used formal language supported by the Rodin platform which offers a range of simulation and verification technologies.
Quantum blind dual-signature scheme without arbitrator
NASA Astrophysics Data System (ADS)
Li, Wei; Shi, Ronghua; Huang, Dazu; Shi, Jinjing; Guo, Ying
2016-03-01
Motivated by the elegant features of a blind signature, we propose a quantum blind dual-signature scheme with three phases, i.e., an initial phase, a signing phase and a verification phase. Different from conventional schemes, legal messages are signed not only by the blind signatory but also by the sender in the signing phase. The scheme does not rely heavily on an arbitrator in the verification phase, as previous quantum signature schemes usually do. Security is guaranteed by entanglement in quantum information processing. Security analysis demonstrates that the signature can be neither forged nor disavowed by illegal participants or attackers. The scheme provides a potential application for e-commerce or e-payment systems with current technology.
A fingerprint key binding algorithm based on vector quantization and error correction
NASA Astrophysics Data System (ADS)
Li, Liang; Wang, Qian; Lv, Ke; He, Ning
2012-04-01
In recent years, research on seamlessly combining cryptosystems with biometric technologies, e.g., fingerprint recognition, has been conducted by many researchers. In this paper, we propose an algorithm for binding a fingerprint template to a cryptographic key, so that the key is protected by, and released through, fingerprint verification. To tolerate the intrinsic fuzziness of fingerprint variation, vector quantization and error correction techniques are introduced to transform the fingerprint template, which is then bound to the key after fingerprint registration and extraction of the global ridge pattern of the fingerprint. The key itself is secure because only its hash value is stored, and the key is released only when fingerprint verification succeeds. Experimental results demonstrate the effectiveness of our approach.
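The binding idea, protecting a key with an error-correcting code XORed against a quantized template while storing only the key's hash, can be shown in its simplest "fuzzy commitment" form. The sketch below uses a toy repetition code and random bit strings as the quantized template; it illustrates the general principle only and is not the paper's vector-quantization-based algorithm, whose code and template representation are stronger.

```python
# Hedged sketch of key binding in the spirit of a fuzzy commitment: the key is
# encoded with a simple repetition error-correcting code and XORed with a
# quantized template; only the key's hash and the XORed "helper data" are
# stored. A noisy but genuine template still releases the key.
import hashlib
import numpy as np

REP = 5  # repetition factor of the toy error-correcting code

def encode(key_bits):                       # repetition-code encoder
    return np.repeat(key_bits, REP)

def decode(code_bits):                      # majority-vote decoder
    return (code_bits.reshape(-1, REP).sum(axis=1) > REP // 2).astype(np.uint8)

def bind(key_bits, template_bits):
    helper = encode(key_bits) ^ template_bits
    return helper, hashlib.sha256(key_bits.tobytes()).hexdigest()

def release(helper, key_hash, query_bits):
    key = decode(helper ^ query_bits)
    return key if hashlib.sha256(key.tobytes()).hexdigest() == key_hash else None

rng = np.random.default_rng(5)
key = rng.integers(0, 2, 128, dtype=np.uint8)              # 128-bit key
template = rng.integers(0, 2, 128 * REP, dtype=np.uint8)   # quantized fingerprint template

helper, stored_hash = bind(key, template)

noisy = template.copy()
noisy[::REP] ^= 1   # one bit error per 5-bit block: within the code's correction capability
print("genuine query releases key:", release(helper, stored_hash, noisy) is not None)

impostor = rng.integers(0, 2, 128 * REP, dtype=np.uint8)
print("impostor query releases key:", release(helper, stored_hash, impostor) is not None)
```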
Wet countdown demonstration and flight readiness firing
NASA Technical Reports Server (NTRS)
1981-01-01
The prelaunch tests for the Space Transportation System 1 flight are briefly described. Testing is divided into two major sections: the wet countdown demonstration test/flight readiness firing, which includes a 20 second test firing of the orbiter's three main engines, and a mission verification test, which is centered on flight and landing operations. The functions of the countdown sequence are listed and end of mission and mission abort exercises are described.
Jabbari, Keyvan; Pashaei, Fakhereh; Ay, Mohammad R.; Amouheidari, Alireza; Tavakoli, Mohammad B.
2018-01-01
Background: MapCHECK2 is a planar dosimetry verification system based on a two-dimensional diode array. Dosimetric results are evaluated with the gamma index. This study aims to provide comprehensive information on the impact of various factors on the gamma index values of MapCHECK2, which is mostly used for IMRT dose verification. Methods: Seven fields were planned for 6 and 18 MV photons. The azimuthal angle, defined as any rotation of the collimators or the MapCHECK2 around the central axis, was varied from 5 to −5°. The gantry angle was changed from −8 to 8°. The isodose sampling resolution was studied in the range of 0.5 to 4 mm. The effects of additional buildup on the gamma index were also assessed in three cases. The gamma test acceptance criteria were 3%/3 mm. Results: A 5° change in the azimuthal angle reduced the gamma index value by about 9%. Placing buildup of various thicknesses on the MapCHECK2 surface showed that the gamma index generally improved with thicker buildup, especially for 18 MV. Changing the sampling resolution from 4 to 2 mm increased the gamma index by about 3.7%. A gantry deviation of 8° in either direction changed the gamma index by only about 1.6% for 6 MV and 2.1% for 18 MV. Conclusion: Among the studied parameters, the azimuthal angle is one of the most influential factors on the gamma index value. The gantry angle deviation and the sampling resolution have a smaller effect on the gamma index value. PMID:29535922
Influenza forecasting with Google Flu Trends.
Dugas, Andrea Freyer; Jalalpour, Mehdi; Gel, Yulia; Levin, Scott; Torcaso, Fred; Igusa, Takeru; Rothman, Richard E
2013-01-01
We developed a practical influenza forecast model based on real-time, geographically focused, and easy to access data, designed to provide individual medical centers with advanced warning of the expected number of influenza cases, thus allowing for sufficient time to implement interventions. Secondly, we evaluated the effects of incorporating a real-time influenza surveillance system, Google Flu Trends, and meteorological and temporal information on forecast accuracy. Forecast models designed to predict one week in advance were developed from weekly counts of confirmed influenza cases over seven seasons (2004-2011) divided into seven training and out-of-sample verification sets. Forecasting procedures using classical Box-Jenkins, generalized linear models (GLM), and generalized linear autoregressive moving average (GARMA) methods were employed to develop the final model and assess the relative contribution of external variables such as, Google Flu Trends, meteorological data, and temporal information. A GARMA(3,0) forecast model with Negative Binomial distribution integrating Google Flu Trends information provided the most accurate influenza case predictions. The model, on the average, predicts weekly influenza cases during 7 out-of-sample outbreaks within 7 cases for 83% of estimates. Google Flu Trend data was the only source of external information to provide statistically significant forecast improvements over the base model in four of the seven out-of-sample verification sets. Overall, the p-value of adding this external information to the model is 0.0005. The other exogenous variables did not yield a statistically significant improvement in any of the verification sets. Integer-valued autoregression of influenza cases provides a strong base forecast model, which is enhanced by the addition of Google Flu Trends confirming the predictive capabilities of search query based syndromic surveillance. This accessible and flexible forecast model can be used by individual medical centers to provide advanced warning of future influenza cases.
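The modelling idea above, regressing weekly case counts on their own recent lags plus an external surveillance signal and verifying on a held-out season, can be sketched with a negative binomial GLM. This is a simplified stand-in for the paper's GARMA(3,0) model: the data are synthetic, the lag structure and hold-out split are assumptions, and statsmodels' GLM with a fixed-dispersion negative binomial family replaces the full GARMA estimation.

```python
# Hedged sketch of the forecasting idea: weekly influenza counts regressed on
# their three most recent lags plus an external surveillance signal, using a
# negative binomial GLM and a held-out verification season. Synthetic data.
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(6)
weeks = 7 * 52
season = 20 * np.exp(3 * np.cos(2 * np.pi * np.arange(weeks) / 52)) / np.exp(3)
cases = rng.poisson(season + 1)                        # synthetic weekly case counts
gft = season * rng.lognormal(0, 0.2, size=weeks)       # synthetic "search volume" signal

df = pd.DataFrame({"cases": cases, "gft": gft})
for lag in (1, 2, 3):
    df[f"lag{lag}"] = df["cases"].shift(lag)
df = df.dropna()

X = sm.add_constant(df[["lag1", "lag2", "lag3", "gft"]])
y = df["cases"]
train, test = slice(0, 6 * 52), slice(6 * 52, None)    # last season held out for verification

model = sm.GLM(y.iloc[train], X.iloc[train], family=sm.families.NegativeBinomial()).fit()
pred = model.predict(X.iloc[test])
mae = np.mean(np.abs(pred - y.iloc[test]))
print(f"one-week-ahead MAE on held-out season: {mae:.1f} cases")
```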
Qiu, Shi; Yang, Wen-Zhi; Yao, Chang-Liang; Qiu, Zhi-Dong; Shi, Xiao-Jian; Zhang, Jing-Xian; Hou, Jin-Jun; Wang, Qiu-Rong; Wu, Wan-Ying; Guo, De-An
2016-07-01
A key segment in the authentication of herbal medicines is the establishment of robust biomarkers that embody the intrinsic metabolite differences, independent of the growing environment or processing techniques. We present a strategy based on nontargeted metabolomics and "commercial-homophyletic" comparison-induced biomarker verification with new bioinformatic vehicles, to improve the efficiency and reliability of herbal medicine authentication. The chemical differentiation of five different parts (root, leaf, flower bud, berry, and seed) of Panax ginseng is illustrated as a case study. First, an optimized ultra-performance liquid chromatography/quadrupole time-of-flight-MS(E) (UPLC/QTOF-MS(E)) approach was established for global metabolite profiling. Second, UNIFI™ combined with a search of an in-house library was employed to automatically characterize the metabolites. Third, pattern recognition multivariate statistical analyses of the MS(E) data of different parts of commercial and homophyletic samples were performed separately to explore potential biomarkers. Fourth, potential biomarkers deduced from commercial and homophyletic root and leaf samples were cross-compared to infer robust biomarkers. Fifth, discriminating models based on an artificial neural network (ANN) were established to identify the different parts of P. ginseng. Consequently, 164 compounds were characterized, and 11 robust biomarkers enabling the differentiation among root, leaf, flower bud, and berry were discovered by removing those that were structurally unstable or possibly processing-related. The ANN models using the robust biomarkers were able to exactly discriminate the four different parts, as well as root adulterated with leaf. In conclusion, biomarker verification using homophyletic samples contributes to the discovery of robust biomarkers, and the integrated strategy facilitates the authentication of herbal medicines in a more efficient and more intelligent manner. Copyright © 2016 Elsevier B.V. All rights reserved.
Melchior, P.; Gruen, D.; McClintock, T.; ...
2017-05-16
Here, we use weak-lensing shear measurements to determine the mean mass of optically selected galaxy clusters in Dark Energy Survey Science Verification data. In a blinded analysis, we split the sample of more than 8000 redMaPPer clusters into 15 subsets, spanning ranges in the richness parameter 5 ≤ λ ≤ 180 and redshift 0.2 ≤ z ≤ 0.8, and fit the averaged mass density contrast profiles with a model that accounts for seven distinct sources of systematic uncertainty: shear measurement and photometric redshift errors; cluster-member contamination; miscentring; deviations from the NFW halo profile; halo triaxiality and line-of-sight projections.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Melchior, P.; Gruen, D.; McClintock, T.
Here, we use weak-lensing shear measurements to determine the mean mass of optically selected galaxy clusters in Dark Energy Survey Science Verification data. In a blinded analysis, we split the sample of more than 8000 redMaPPer clusters into 15 subsets, spanning ranges in the richness parameter 5 ≤ λ ≤ 180 and redshift 0.2 ≤ z ≤ 0.8, and fit the averaged mass density contrast profiles with a model that accounts for seven distinct sources of systematic uncertainty: shear measurement and photometric redshift errors; cluster-member contamination; miscentring; deviations from the NFW halo profile; halo triaxiality and line-of-sight projections.
Vikram, V.
2015-07-29
Weak gravitational lensing allows one to reconstruct the spatial distribution of the projected mass density across the sky. These “mass maps” provide a powerful tool for studying cosmology as they probe both luminous and dark matter. In this paper, we present a weak lensing mass map reconstructed from shear measurements in a 139 deg² area from the Dark Energy Survey (DES) science verification data. We compare the distribution of mass with that of the foreground distribution of galaxies and clusters. The overdensities in the reconstructed map correlate well with the distribution of optically detected clusters. We demonstrate that candidate superclusters and voids along the line of sight can be identified, exploiting the tight scatter of the cluster photometric redshifts. We cross-correlate the mass map with a foreground magnitude-limited galaxy sample from the same data. Our measurement gives results consistent with mock catalogs from N-body simulations that include the primary sources of statistical uncertainties in the galaxy, lensing, and photo-z catalogs. The statistical significance of the cross-correlation is at the 6.8σ level with 20 arcminute smoothing. We find that the contribution of systematics to the lensing mass maps is generally within measurement uncertainties. In this study, we analyze less than 3% of the final area that will be mapped by the DES; the tools and analysis techniques developed in this paper can be applied to forthcoming larger data sets from the survey.
NASA Technical Reports Server (NTRS)
Mitchell, Mark A.; Lowrey, Nikki M.
2015-01-01
Since the 1990's, when the Class I Ozone Depleting Substance (ODS) chlorofluorocarbon-113 (CFC-113) was banned, NASA's rocket propulsion test facilities at Marshall Space Flight Center (MSFC) and Stennis Space Center (SSC) have relied upon hydrochlorofluorocarbon-225 (HCFC-225) to safely clean and verify the cleanliness of large scale propulsion oxygen systems. Effective January 1, 2015, the production, import, export, and new use of HCFC-225, a Class II ODS, was prohibited by the Clean Air Act. In 2012 through 2014, leveraging resources from both the NASA Rocket Propulsion Test Program and the Defense Logistics Agency - Aviation Hazardous Minimization and Green Products Branch, test labs at MSFC, SSC, and Johnson Space Center's White Sands Test Facility (WSTF) collaborated to seek out, test, and qualify a replacement for HCFC-225 that is both an effective cleaner and safe for use with oxygen systems. Candidate solvents were selected and a test plan was developed following the guidelines of ASTM G127, Standard Guide for the Selection of Cleaning Agents for Oxygen Systems. Solvents were evaluated for materials compatibility, oxygen compatibility, cleaning effectiveness, and suitability for use in cleanliness verification and field cleaning operations. Two solvents were determined to be acceptable for cleaning oxygen systems and one was chosen for implementation at NASA's rocket propulsion test facilities. The test program and results are summarized. This project also demonstrated the benefits of cross-agency collaboration in a time of limited resources.
A TRMM-Calibrated Infrared Technique for Convective and Stratiform Rainfall: Analysis and Validation
NASA Technical Reports Server (NTRS)
Negri, Andrew; Starr, David OC. (Technical Monitor)
2001-01-01
A satellite infrared technique with passive microwave calibration has been developed for estimating convective and stratiform rainfall. The Convective-Stratiform Technique, calibrated by coincident, physically retrieved rain rates from the TRMM Microwave Imager (TMI), has been applied to 30 min interval GOES infrared data and aggregated over seasonal and yearly periods over northern South America. The diurnal cycle of rainfall, as well as the division between convective and stratiform rainfall, is presented. For the period Jan-April 1999, analysis revealed significant effects of local circulations (river breeze, land/sea breeze, mountain/valley) on both the total rainfall and its diurnal cycle. Results compared well (a one-hour lag) with the diurnal cycle derived from TOGA radar-estimated rainfall in Rondonia. The satellite estimates revealed that the convective rain constituted 24% of the rain area while accounting for 67% of the rain volume. Estimates of the diurnal cycle (both total rainfall and convective/stratiform) for an area encompassing the Amazon Basin (3 x 10(exp 6) sq km) were in phase with those from the TRMM Precipitation Radar, despite the latter's limited sampling. Results will be presented comparing the yearly (2000) diurnal cycle for large regions (including the Amazon Basin), and an intercomparison of January-March estimates for three years (1999-2001). We hope to demonstrate the utility of using the TRMM PR observations as verification for infrared estimates of the diurnal cycle, and as verification of the apportionment of rainfall into convective and stratiform components.
A TRMM-Calibrated Infrared Technique for Convective and Stratiform Rainfall: Analysis and Validation
NASA Technical Reports Server (NTRS)
Negri, Andrew; Starr, David OC. (Technical Monitor)
2001-01-01
A satellite infrared technique with passive microwave calibration has been developed for estimating convective and stratiform rainfall. The Convective-Stratiform Technique, calibrated by coincident, physically retrieved rain rates from the TRMM Microwave Imager (TMI), has been applied to 30 min interval GOES infrared data and aggregated over seasonal and yearly periods over northern South America. The diurnal cycle of rainfall, as well as the division between convective and stratiform rainfall, is presented. For the period Jan-April 1999, analysis revealed significant effects of local circulations (river breeze, land/sea breeze, mountain/valley) on both the total rainfall and its diurnal cycle. Results compared well (a one-hour lag) with the diurnal cycle derived from TOGA radar-estimated rainfall in Rondonia. The satellite estimates revealed that the convective rain constituted 24% of the rain area while accounting for 67% of the rain volume. Estimates of the diurnal cycle (both total rainfall and convective/stratiform) for an area encompassing the Amazon Basin (3 x 10(exp 6) square km) were in phase with those from the TRMM Precipitation Radar, despite the latter's limited sampling. Results will be presented comparing the yearly (2000) diurnal cycle for large regions (including the Amazon Basin), and an intercomparison of January-March estimates for three years, 1999-2001. We hope to demonstrate the utility of using the TRMM PR observations as verification for infrared estimates of the diurnal cycle, and as verification of the apportionment of rainfall into convective and stratiform components.
Hayabusa: Navigation Challenges for Earth Return
NASA Technical Reports Server (NTRS)
Haw, Robert J.; Bhaskaran, S.; Strauss, W.; Sklyanskiy, E.; Graat, E. J.; Smith, J. J.; Menom, P.; Ardalan, S.; Ballard, C.; Williams, P.;
2011-01-01
Hayabusa was a JAXA sample-return mission to Itokawa navigated, in part, by JPL personnel. Hayabusa survived several near mission-ending failures at Itokawa yet returned to Earth with an asteroid regolith sample on June 13, 2010. This paper describes NASA/JPL's participation in the Hayabusa mission during the last 100 days of its mission, wherein JPL provided tracking data and orbit determination, plus verification of maneuver design and entry, descent and landing.
Biometric Fusion Demonstration System Scientific Report
2004-03-01
verification and facial recognition, searching watchlist databases comprised of full or partial facial images or voice recordings. Multiple-biometric... [table-of-contents fragments: Fingerprint and Facial Recognition; Iris Recognition and Facial Recognition]
Physical property measurements on analog granites related to the joint verification experiment
NASA Astrophysics Data System (ADS)
Martin, Randolph J., III; Coyner, Karl B.; Haupt, Robert W.
1990-08-01
A key element in the JVE (Joint Verification Experiment) conducted jointly by the United States and the USSR is the analysis of the geology and physical properties of the rocks at the respective test sites. A study was initiated to examine unclassified crystalline rock specimens obtained from areas near the Soviet site, Semipalatinsk, and appropriate analog samples selected from Mt. Katahdin, Maine. These rocks were also compared to Sierra White and Westerly Granite, which have been studied in great detail. The measurements performed to characterize these rocks were: (1) uniaxial strain with simultaneous compressional and shear wave velocities; (2) hydrostatic compression to 150 MPa with simultaneous compressional and shear wave velocities; (3) attenuation measurements as a function of frequency and strain amplitude for both dry and water-saturated conditions. Elastic moduli determined from the hydrostatic compression and uniaxial strain tests show that the rock matrix/mineral properties were comparable in magnitude, varying within 25 percent from sample to sample. These properties appear to be approximately isotropic, especially at high pressures. However, anisotropy evident in certain samples at pressures below 35 MPa is attributed to dominant pre-existing microcrack populations and their alignments. The dependence of extensional attenuation and Young's modulus on strain amplitude was determined experimentally for intact Sierra White granite using the hysteresis loop technique.
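The moduli quoted above follow from the standard isotropic elastic relations between compressional velocity, shear velocity, and density. The short worked sketch below applies those relations; the input velocities and density are representative of granite at high confining pressure and are illustrative only, not the measured JVE data.

```python
# Standard isotropic elastic relations used to turn measured P- and S-wave
# velocities into moduli. The input values are representative of granite at
# high confining pressure and are illustrative only.
def isotropic_moduli(vp, vs, rho):
    """vp, vs in m/s; rho in kg/m^3. Returns (Poisson's ratio, E, K, G) with moduli in GPa."""
    g = rho * vs**2                                    # shear modulus
    k = rho * vp**2 - 4.0 * g / 3.0                    # bulk modulus
    nu = (vp**2 - 2.0 * vs**2) / (2.0 * (vp**2 - vs**2))
    e = 2.0 * g * (1.0 + nu)                           # Young's modulus
    return nu, e / 1e9, k / 1e9, g / 1e9

nu, E, K, G = isotropic_moduli(vp=5800.0, vs=3300.0, rho=2650.0)
print(f"Poisson's ratio = {nu:.2f}, E = {E:.0f} GPa, K = {K:.0f} GPa, G = {G:.0f} GPa")
```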
Photometric redshift analysis in the Dark Energy Survey Science Verification data
NASA Astrophysics Data System (ADS)
Sánchez, C.; Carrasco Kind, M.; Lin, H.; Miquel, R.; Abdalla, F. B.; Amara, A.; Banerji, M.; Bonnett, C.; Brunner, R.; Capozzi, D.; Carnero, A.; Castander, F. J.; da Costa, L. A. N.; Cunha, C.; Fausti, A.; Gerdes, D.; Greisel, N.; Gschwend, J.; Hartley, W.; Jouvel, S.; Lahav, O.; Lima, M.; Maia, M. A. G.; Martí, P.; Ogando, R. L. C.; Ostrovski, F.; Pellegrini, P.; Rau, M. M.; Sadeh, I.; Seitz, S.; Sevilla-Noarbe, I.; Sypniewski, A.; de Vicente, J.; Abbot, T.; Allam, S. S.; Atlee, D.; Bernstein, G.; Bernstein, J. P.; Buckley-Geer, E.; Burke, D.; Childress, M. J.; Davis, T.; DePoy, D. L.; Dey, A.; Desai, S.; Diehl, H. T.; Doel, P.; Estrada, J.; Evrard, A.; Fernández, E.; Finley, D.; Flaugher, B.; Frieman, J.; Gaztanaga, E.; Glazebrook, K.; Honscheid, K.; Kim, A.; Kuehn, K.; Kuropatkin, N.; Lidman, C.; Makler, M.; Marshall, J. L.; Nichol, R. C.; Roodman, A.; Sánchez, E.; Santiago, B. X.; Sako, M.; Scalzo, R.; Smith, R. C.; Swanson, M. E. C.; Tarle, G.; Thomas, D.; Tucker, D. L.; Uddin, S. A.; Valdés, F.; Walker, A.; Yuan, F.; Zuntz, J.
2014-12-01
We present results from a study of the photometric redshift performance of the Dark Energy Survey (DES), using the early data from a Science Verification period of observations in late 2012 and early 2013 that provided science-quality images for almost 200 sq. deg. at the nominal depth of the survey. We assess the photometric redshift (photo-z) performance using about 15 000 galaxies with spectroscopic redshifts available from other surveys. These galaxies are used, in different configurations, as a calibration sample, and photo-z's are obtained and studied using most of the existing photo-z codes. A weighting method in a multidimensional colour-magnitude space is applied to the spectroscopic sample in order to evaluate the photo-z performance with sets that mimic the full DES photometric sample, which is on average significantly deeper than the calibration sample due to the limited depth of spectroscopic surveys. Empirical photo-z methods using, for instance, artificial neural networks or random forests, yield the best performance in the tests, achieving core photo-z resolutions σ68 ˜ 0.08. Moreover, the results from most of the codes, including template-fitting methods, comfortably meet the DES requirements on photo-z performance, therefore, providing an excellent precedent for future DES data sets.
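The core resolution metric quoted above, σ68, is the half-width of the interval containing the central 68% of the scaled residuals (z_phot - z_spec)/(1 + z_spec). The short sketch below computes it on synthetic redshifts for illustration; the sample size and scatter are assumptions chosen to mimic the reported value.

```python
# Sketch of the core photo-z metric quoted above: sigma_68 is the half-width of
# the interval containing the central 68% of the scaled residuals
# (z_phot - z_spec) / (1 + z_spec). Synthetic redshifts are used for illustration.
import numpy as np

def sigma_68(z_phot, z_spec):
    resid = (z_phot - z_spec) / (1.0 + z_spec)
    lo, hi = np.percentile(resid, [16.0, 84.0])
    return 0.5 * (hi - lo)

rng = np.random.default_rng(7)
z_spec = rng.uniform(0.2, 1.2, size=15000)                            # calibration sample
z_phot = z_spec + 0.08 * (1 + z_spec) * rng.standard_normal(15000)    # ~0.08 scatter in (1+z)
print(f"sigma_68 = {sigma_68(z_phot, z_spec):.3f}")                   # expect ~0.08
```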
Photometric redshift analysis in the Dark Energy Survey Science Verification data
Sanchez, C.; Carrasco Kind, M.; Lin, H.; ...
2014-10-09
In this study, we present results from a study of the photometric redshift performance of the Dark Energy Survey (DES), using the early data from a Science Verification period of observations in late 2012 and early 2013 that provided science-quality images for almost 200 sq. deg. at the nominal depth of the survey. We assess the photometric redshift (photo-z) performance using about 15 000 galaxies with spectroscopic redshifts available from other surveys. These galaxies are used, in different configurations, as a calibration sample, and photo-z's are obtained and studied using most of the existing photo-z codes. A weighting method in a multidimensional colour–magnitude space is applied to the spectroscopic sample in order to evaluate the photo-z performance with sets that mimic the full DES photometric sample, which is on average significantly deeper than the calibration sample due to the limited depth of spectroscopic surveys. In addition, empirical photo-z methods using, for instance, artificial neural networks or random forests, yield the best performance in the tests, achieving core photo-z resolutions σ68 ~ 0.08. Moreover, the results from most of the codes, including template-fitting methods, comfortably meet the DES requirements on photo-z performance, therefore, providing an excellent precedent for future DES data sets.
Hierarchical specification of the SIFT fault tolerant flight control system
NASA Technical Reports Server (NTRS)
Melliar-Smith, P. M.; Schwartz, R. L.
1981-01-01
The specification and mechanical verification of the Software Implemented Fault Tolerance (SIFT) flight control system is described. The methodology employed in the verification effort is discussed, and a description of the hierarchical models of the SIFT system is given. To meet NASA's objective for the reliability of safety-critical flight control systems, the SIFT computer must achieve a reliability well beyond the levels at which reliability can actually be measured. The methodology employed to demonstrate rigorously that the SIFT computer meets its reliability requirements is described. The hierarchy of design specifications, from very abstract descriptions of system function down to the actual implementation, is explained. The most abstract design specifications can be used to verify that the system functions correctly and with the desired reliability, since almost all details of the realization were abstracted out. A succession of lower-level models refines these specifications to the level of the actual implementation and can be used to demonstrate that the implementation has the properties claimed of the abstract design specifications.
Characterization of lens based photoacoustic imaging system.
Francis, Kalloor Joseph; Chinni, Bhargava; Channappayya, Sumohana S; Pachamuthu, Rajalakshmi; Dogra, Vikram S; Rao, Navalgund
2017-12-01
Some of the challenges in translating photoacoustic (PA) imaging to clinical applications include a limited view of the target tissue, low signal-to-noise ratio, and the high cost of developing real-time systems. Acoustic lens based PA imaging systems, also known as PA cameras, are a potential alternative to conventional imaging systems in these scenarios. The 3D focusing action of the lens enables real-time C-scan imaging with a 2D transducer array. In this paper, we model the underlying physics of a PA camera in the mathematical framework of an imaging system and derive a closed-form expression for the point spread function (PSF). Experimental verification follows, including details on how to design and fabricate the lens inexpensively. The system PSF is evaluated over a 3D volume that can be imaged by this PA camera. Its utility is demonstrated by imaging a phantom and an ex vivo human prostate tissue sample.
NASA Technical Reports Server (NTRS)
Morrison, D. R.; Lewis, M. L.
1982-01-01
Static zone electrophoresis is an electrokinetic method of separating macromolecules and small particles. However, its application for the isolation of biological cells and concentrated protein solutions is limited by sedimentation and convection. Microgravity eliminates or reduces sedimentation, floatation, and density-driven convection arising from either Joule heating or concentration differences. The advantages of such an environment were first demonstrated in space during the Apollo 14 and 16 missions. In 1975 the Electrophoresis Technology Experiment (MA-011) was conducted during the Apollo-Soyuz Test Project flight. In 1979 a project was initiated to repeat the separations of human kidney cells. One of the major objectives of the Electrophoresis Equipment Verification Tests (EEVT) on STS-3 was to repeat and thereby validate the first successful electrophoretic separation of human kidney cells. Attention is given to the EEVT apparatus, the preflight electrophoresis, and inflight operational results.
Version 2.0 Visual Sample Plan (VSP): UXO Module Code Description and Verification
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gilbert, Richard O.; Wilson, John E.; O'Brien, Robert F.
2003-05-06
The Pacific Northwest National Laboratory (PNNL) is developing statistical methods for determining the amount of geophysical surveying along transects (swaths) that is needed to achieve specified levels of confidence of finding target areas (TAs) of anomalous readings and possibly unexploded ordnance (UXO) at closed, transferring, and transferred (CTT) Department of Defense (DoD) ranges and other sites. The statistical methods developed by PNNL have been coded into the UXO module of the Visual Sample Plan (VSP) software, which is being developed by PNNL with support from the DoD, the U.S. Department of Energy (DOE), and the U.S. Environmental Protection Agency (EPA). (The VSP software and VSP Users Guide (Hassig et al., 2002) may be downloaded from http://dqo.pnl.gov/vsp.) This report describes and documents the statistical methods developed and the calculations and verification testing that have been conducted to verify that VSP's implementation of these methods is correct and accurate.
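One simple geometric relation often used in transect-design calculations of this kind gives the chance that parallel transects traverse a circular target area. The sketch below implements only that relation, under the stated assumptions; it does not reproduce the fuller statistical machinery of the VSP UXO module.

    def traversal_probability(target_diameter, swath_width, transect_spacing):
        """Probability that parallel transects of the given swath width and spacing
        traverse a circular target area of the given diameter (geometric coverage only;
        assumes the target centre is uniformly located and ignores the chance of
        missing anomalies within a traversed swath)."""
        if transect_spacing <= 0:
            raise ValueError("transect spacing must be positive")
        return min(1.0, (target_diameter + swath_width) / transect_spacing)

    # Hypothetical cases: 100 m and 20 m diameter targets, 2 m swath, 50 m spacing
    print(traversal_probability(100.0, 2.0, 50.0))  # -> 1.0 (always traversed)
    print(traversal_probability(20.0, 2.0, 50.0))   # -> 0.44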
Verification of spectrophotometric method for nitrate analysis in water samples
NASA Astrophysics Data System (ADS)
Kurniawati, Puji; Gusrianti, Reny; Dwisiwi, Bledug Bernanti; Purbaningtias, Tri Esti; Wiyantoko, Bayu
2017-12-01
The aim of this research was to verify the spectrophotometric method for analyzing nitrate in water samples using the APHA 2012 Section 4500 NO3-B method. The verification parameters used were linearity, method detection limit, limit of quantitation, level of linearity, accuracy, and precision. Linearity was assessed using 0 to 50 mg/L nitrate standard solutions, and the correlation coefficient of the standard calibration linear regression was 0.9981. The method detection limit (MDL) was 0.1294 mg/L and the limit of quantitation (LOQ) was 0.4117 mg/L. The level of linearity (LOL) was 50 mg/L, and nitrate concentrations from 10 to 50 mg/L were linear at a 99% level of confidence. Accuracy, determined through the recovery value, was 109.1907%. Precision was expressed as the percent relative standard deviation (%RSD) of repeatability measurements and was 1.0886%. The tested performance criteria showed that the methodology was verified under the laboratory conditions.
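For readers reproducing this kind of method verification, the sketch below computes a calibration correlation coefficient, an EPA-style MDL from replicate spikes, a recovery value, and a %RSD. All numbers are hypothetical, and the LOQ convention used (10 × s) is an assumption rather than necessarily the one applied in the paper.

    import numpy as np
    from scipy import stats

    # Illustrative calibration data (mg/L nitrate), not the paper's measurements
    conc = np.array([0.0, 5.0, 10.0, 20.0, 30.0, 40.0, 50.0])
    absorbance = np.array([0.002, 0.110, 0.221, 0.438, 0.652, 0.880, 1.092])

    # Linearity: correlation coefficient of the calibration line
    fit = stats.linregress(conc, absorbance)
    print(f"r = {fit.rvalue:.4f}, slope = {fit.slope:.4f}, intercept = {fit.intercept:.4f}")

    # MDL from replicate low-level spikes (EPA-style): t(n-1, 99%) * s
    replicates = np.array([0.48, 0.52, 0.45, 0.50, 0.47, 0.53, 0.49])  # mg/L, hypothetical
    s = replicates.std(ddof=1)
    mdl = stats.t.ppf(0.99, df=len(replicates) - 1) * s
    loq = 10.0 * s  # one common convention for the limit of quantitation
    print(f"MDL = {mdl:.4f} mg/L, LOQ = {loq:.4f} mg/L")

    # Accuracy (% recovery) and precision (%RSD) from repeatability measurements
    spiked_true, spiked_measured = 10.0, 10.92  # mg/L, hypothetical
    recovery = 100.0 * spiked_measured / spiked_true
    rsd = 100.0 * s / replicates.mean()
    print(f"recovery = {recovery:.1f}%, %RSD = {rsd:.2f}%")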
Simulation-To-Flight (STF-1): A Mission to Enable CubeSat Software-Based Validation and Verification
NASA Technical Reports Server (NTRS)
Morris, Justin; Zemerick, Scott; Grubb, Matt; Lucas, John; Jaridi, Majid; Gross, Jason N.; Ohi, Nicholas; Christian, John A.; Vassiliadis, Dimitris; Kadiyala, Anand;
2016-01-01
The Simulation-to-Flight 1 (STF-1) CubeSat mission aims to demonstrate how legacy simulation technologies may be adapted for flexible and effective use on missions using the CubeSat platform. These technologies, named the NASA Operational Simulator (NOS), have demonstrated significant value on several missions, such as the James Webb Space Telescope, Global Precipitation Measurement, Juno, and the Deep Space Climate Observatory, in the areas of software development, mission operations/training, verification and validation (V&V), test procedure development, and software systems checkout. STF-1 will demonstrate a highly portable simulation and test platform that allows seamless transition of mission development artifacts to flight products. This environment will decrease the development time of future CubeSat missions by lessening the dependency on hardware resources. In addition, through a partnership between NASA GSFC, the West Virginia Space Grant Consortium, and West Virginia University, the STF-1 CubeSat will host payloads for three secondary objectives that aim to advance small-satellite navigation systems, provide useful data for understanding magnetosphere-ionosphere coupling and space weather, and verify the performance and durability of III-V nitride-based materials.
30 CFR 250.913 - When must I resubmit Platform Verification Program plans?
Code of Federal Regulations, 2010 CFR
2010-07-01
... Structures Platform Verification Program § 250.913 When must I resubmit Platform Verification Program plans? (a) You must resubmit any design verification, fabrication verification, or installation verification...
Wang, Yi; Xiang, Ma; Wen, Ya-Dong; Yu, Chun-Xia; Wang, Luo-Ping; Zhao, Long-Lian; Li, Jun-Hui
2012-11-01
In this study, tobacco quality analysis of the main industrial classifications across different years was carried out by applying spectrum projection and correlation methods. The data were near-infrared (NIR) spectra from Hongta Tobacco (Group) Co., Ltd.: 5730 tobacco leaf industrial classification samples from Yuxi in Yunnan province, collected by near-infrared spectroscopy from 2007 to 2010, which came from different parts and colors and all belonged to the HONGDA tobacco variety. The results showed that, when the samples from a given year were divided randomly into analysis and verification sets in a 2:1 ratio, the verification set corresponded with the analysis set under spectrum projection, with correlation coefficients above 0.98. The correlation coefficients between different years under spectrum projection were above 0.97; the highest was between 2008 and 2009, and the lowest between 2007 and 2010. The study also discussed a method to obtain quantitative similarity values for different industrial classification samples. These similarity and consistency values are instructive for combination and replacement in tobacco leaf blending.
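The abstract does not spell out the spectrum-projection algorithm; as a minimal stand-in, the sketch below simply correlates the mean NIR spectra of a synthetic analysis set and verification set. The array shapes, noise levels, and this choice of comparison are assumptions for illustration only.

    import numpy as np

    # Hypothetical NIR spectra: rows are samples, columns are wavelength channels
    rng = np.random.default_rng(1)
    wavelengths = 500
    base = np.sin(np.linspace(0, 6 * np.pi, wavelengths))
    analysis_set = base + 0.05 * rng.normal(size=(200, wavelengths))
    verification_set = base + 0.05 * rng.normal(size=(100, wavelengths))

    # Compare the two sets through the correlation of their mean spectra,
    # one simple way to express the set-to-set consistency described above
    mean_a = analysis_set.mean(axis=0)
    mean_v = verification_set.mean(axis=0)
    r = np.corrcoef(mean_a, mean_v)[0, 1]
    print(f"correlation between analysis and verification mean spectra: r = {r:.4f}")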
Biometric verification in dynamic writing
NASA Astrophysics Data System (ADS)
George, Susan E.
2002-03-01
Pen-tablet devices capable of capturing the dynamics of writing record temporal and pressure information as well as the spatial pattern. This paper explores biometric verification based upon the dynamics of writing, where writers are distinguished not on the basis of what they write (i.e., the signature), but how they write. We have collected samples of dynamic writing from 38 Chinese writers. Each writer was asked to provide 10 copies of a paragraph of text and the same number of signature samples. From the data we extracted stroke-based primitives from the sentence data, utilizing pen-up/down information and heuristic rules about the shape of the character. The x, y, and pressure values of each primitive were interpolated into an even temporal range based upon a 20 msec sampling rate. We applied the Daubechies 1 wavelet transform to the x, y, and pressure signals, using the coefficients as inputs to a multi-layer perceptron trained with back-propagation on the sentence data. We found a sensitivity of 0.977 and a specificity of 0.990 recognizing writers based on test primitives extracted from sentence data, and measures of 0.916 and 0.961, respectively, from test primitives extracted from signature data.
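A minimal sketch of the described pipeline, Daubechies 1 (Haar) wavelet coefficients of the x, y, and pressure signals feeding a multi-layer perceptron, is given below. The toy primitives, resampled length, and network size are assumptions, not the study's configuration.

    import numpy as np
    import pywt
    from sklearn.neural_network import MLPClassifier

    def primitive_features(x, y, pressure):
        """Haar (db1) wavelet coefficients of the x, y and pressure signals of one
        stroke primitive, concatenated into a single feature vector; signals are
        assumed to be resampled to a fixed length beforehand."""
        feats = []
        for sig in (x, y, pressure):
            cA, cD = pywt.dwt(sig, "db1")
            feats.extend(cA)
            feats.extend(cD)
        return np.asarray(feats)

    # Toy data: 64-point primitives for two hypothetical writers
    rng = np.random.default_rng(0)
    def make_primitive(offset):
        t = np.linspace(0, 1, 64)
        return (np.sin(2 * np.pi * t + offset) + 0.05 * rng.normal(size=64),
                np.cos(2 * np.pi * t + offset) + 0.05 * rng.normal(size=64),
                0.5 + 0.1 * rng.normal(size=64))

    X = np.array([primitive_features(*make_primitive(0.0)) for _ in range(50)] +
                 [primitive_features(*make_primitive(1.0)) for _ in range(50)])
    labels = np.array([0] * 50 + [1] * 50)

    clf = MLPClassifier(hidden_layer_sizes=(32,), max_iter=2000, random_state=0)
    clf.fit(X, labels)
    print("training accuracy:", clf.score(X, labels))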
Horseshoes in a Chaotic System with Only One Stable Equilibrium
NASA Astrophysics Data System (ADS)
Huan, Songmei; Li, Qingdu; Yang, Xiao-Song
To confirm the numerically demonstrated chaotic behavior in a chaotic system with only one stable equilibrium reported by Wang and Chen, we resort to the Poincaré map technique and present a rigorous computer-assisted verification of horseshoe chaos by virtue of topological horseshoe theory.
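For orientation, a numerical Poincaré-section sketch follows, assuming the Wang-Chen system takes the commonly cited form ẋ = yz + a, ẏ = x² − y, ż = 1 − 4x with a = 0.006; the section plane, initial condition, and integration settings are illustrative, and this numerical sampling is not the rigorous computer-assisted horseshoe verification of the paper.

    import numpy as np
    from scipy.integrate import solve_ivp

    # Assumed form of the Wang-Chen system with a single stable equilibrium
    # (a = 0.006 is the commonly quoted value; verify against the original paper)
    a = 0.006
    def wang_chen(t, s):
        x, y, z = s
        return [y * z + a, x * x - y, 1.0 - 4.0 * x]

    sol = solve_ivp(wang_chen, (0.0, 500.0), [1.0, 1.0, 1.0], max_step=0.01)
    keep = sol.t > 100.0                     # discard the initial transient
    x, y, z = sol.y[:, keep]

    # Record upward crossings of the Poincare section x = 1/4 (the equilibrium's x value)
    crossings = []
    for i in range(1, x.size):
        if x[i - 1] < 0.25 <= x[i]:
            f = (0.25 - x[i - 1]) / (x[i] - x[i - 1])   # linear interpolation to the plane
            crossings.append((y[i - 1] + f * (y[i] - y[i - 1]),
                              z[i - 1] + f * (z[i] - z[i - 1])))
    print(f"{len(crossings)} section crossings recorded")
    # Scattered, non-repeating crossings on the (y, z) section suggest chaos; the paper's
    # verification additionally bounds a return map with topological horseshoe arguments.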
The Synchronous Scanning Luminoscope (Luminoscope) developed by the Oak Ridge National Laboratory in collaboration with Environmental Systems Corporation (ESC) was demonstrated under the U.S. Environmental Protection Agency Superfund Innovative Technology Evaluation Program i...
Future Launch Vehicle Structures - Expendable and Reusable Elements
NASA Astrophysics Data System (ADS)
Obersteiner, M. H.; Borriello, G.
2002-01-01
Further evolution of existing expendable launch vehicles will be an obvious element influencing the future of space transportation. Beyond this, reusability might be the change with the highest potential for essential improvement. The expected cost reduction and, finally contributing to this, the improvement of reliability including safe mission abort capability are driving this idea. Although there are concepts for semi-reusable launch vehicles, typically two-stage vehicles with a reusable first stage or booster(s) and an expendable second or upper stage, it should be kept in mind that the benefit of reusability will only prevail if the reusable share is large enough to influence the cost calculation. Today there is the understanding that additional technology preparation and verification will be necessary to master reusability and obtain sufficient benefits compared with existing launch vehicles. This understanding is based on several technology and system concept preparation and verification programmes carried out mainly in the US but partially also in Europe and Japan. The major areas of necessary further activity are: system concepts including business plan considerations; sub-system or component technology refinement; system design and operation know-how and capabilities; and verification and demonstration oriented towards future mission mastering. One of the most important aspects for the creation of those coming programmes and activities will be the iterative process of requirements definition, derived from concept analyses including economic considerations and from the results achieved and verified within technology and verification programmes. It is the intention of this paper to provide major trends for those requirements, focused on future launch vehicle structures, including the aspects of requirements valid only for reusable launch vehicles and those common to expendable, semi-reusable, and reusable launch vehicles. Structures and materials is and will remain one of the important technology areas to be improved; this includes primary structures, thermal protection systems (for high and low temperatures), hot structures (leading edges, engine cowlings, ...), and tanks (for various propellants and fluids, cryo, ...). The requirements to be considered include material properties and a variety of load definitions, both static and dynamic. Based on existing knowledge and experience with expendable launch vehicles (Ariane, ...) and aircraft, a combined understanding needs to be established to provide the basis for an efficient RLV design. Health monitoring will support the cost-efficient operation of future reusable structures, but will also need a sound understanding of loads and failure mechanisms as a basis. Risk mitigation will require several steps of demonstration towards cost-efficient RLV (structures) operation; typically this has started or will start with basic technology, evolving to component demonstration (TPS, tanks, ...) and finally resulting in the demonstration of cost-efficient reuse operation. This paper will also include a programmatic logic concerning future LV structures demonstration.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bunch, Kyle J.; Williams, Laura S.; Jones, Anthony M.
The 2010 ratification of the New START Treaty has been widely regarded as a noteworthy national security achievement for both the Obama administration and the Medvedev-Putin regime, but deeper cuts are envisioned under future arms control regimes. Future verification needs will include monitoring the storage of warhead components and fissile materials and verifying dismantlement of warheads, pits, secondaries, and other materials. From both the diplomatic and technical perspectives, verification under future arms control regimes will pose new challenges. Since acceptable verification technology must protect sensitive design information and attributes, non-nuclear non-sensitive signatures may provide a significant verification tool without the use of additional information barriers. The use of electromagnetic signatures to monitor nuclear material storage containers is a promising technology with the potential to fulfill these challenging requirements. Research performed at Pacific Northwest National Laboratory (PNNL) has demonstrated that low frequency electromagnetic signatures of sealed metallic containers can be used to confirm the presence of specific components on a “yes/no” basis without revealing classified information. Arms control inspectors might use this technique to verify the presence or absence of monitored items, including both nuclear and non-nuclear materials. Although additional research is needed to study signature aspects such as uniqueness and investigate container-specific scenarios, the technique potentially offers a rapid and cost-effective tool to verify reduction and dismantlement of U.S. and Russian nuclear weapons.
New Aspects of Probabilistic Forecast Verification Using Information Theory
NASA Astrophysics Data System (ADS)
Tödter, Julian; Ahrens, Bodo
2013-04-01
This work deals with information-theoretical methods in probabilistic forecast verification, particularly concerning ensemble forecasts. Recent findings concerning the "Ignorance Score" are briefly reviewed, and then a consistent generalization to continuous forecasts is motivated. For ensemble-generated forecasts, the presented measures can be calculated exactly. The Brier Score (BS) and its generalizations to the multi-categorical Ranked Probability Score (RPS) and to the Continuous Ranked Probability Score (CRPS) are prominent verification measures for probabilistic forecasts. In particular, their decompositions into measures quantifying the reliability, resolution, and uncertainty of the forecasts are attractive. Information theory sets up a natural framework for forecast verification. Recently, it has been shown that the BS is a second-order approximation of the information-based Ignorance Score (IGN), which also contains easily interpretable components and can likewise be generalized to a ranked version (RIGN). Here, the IGN, its generalizations, and its decompositions are systematically discussed in analogy to the variants of the BS. Additionally, a Continuous Ranked IGN (CRIGN) is introduced in analogy to the CRPS. The useful properties of the conceptually appealing CRIGN are illustrated, together with an algorithm to evaluate its reliability, resolution, and uncertainty components for ensemble-generated forecasts. This algorithm can also be used to calculate the decomposition of the more traditional CRPS exactly. The applicability of the "new" measures is demonstrated in a small evaluation study of ensemble-based precipitation forecasts.
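As a small concrete reference point, the sketch below evaluates the Brier Score and the binary-event Ignorance Score for a set of hypothetical probability forecasts; the decompositions and the ranked/continuous generalizations discussed above are not implemented here.

    import numpy as np

    def brier_score(prob_forecasts, outcomes):
        """Mean Brier Score for binary-event probability forecasts."""
        p = np.asarray(prob_forecasts, dtype=float)
        o = np.asarray(outcomes, dtype=float)
        return np.mean((p - o) ** 2)

    def ignorance_score(prob_forecasts, outcomes, eps=1e-12):
        """Mean Ignorance Score: negative log2 probability assigned to the outcome."""
        p = np.asarray(prob_forecasts, dtype=float)
        o = np.asarray(outcomes, dtype=float)
        p_event = np.where(o == 1, p, 1.0 - p)
        return np.mean(-np.log2(np.clip(p_event, eps, 1.0)))

    # Hypothetical ensemble-derived rain probabilities and observed occurrences
    probs = np.array([0.9, 0.2, 0.7, 0.1, 0.5, 0.8])
    obs = np.array([1, 0, 1, 0, 1, 1])
    print(f"BS  = {brier_score(probs, obs):.3f}")
    print(f"IGN = {ignorance_score(probs, obs):.3f} bits")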
Bortolan, Giovanni
2015-01-01
Traditional means for identity validation (PIN codes, passwords) and physiological and behavioral biometric characteristics (fingerprint, iris, and speech) are susceptible to hacker attacks and/or falsification. This paper presents a method for person verification/identification based on the correlation of present-to-previous limb ECG leads I (rI) and II (rII), the first principal ECG component calculated from them (rPCA), and linear and nonlinear combinations of rI, rII, and rPCA. For the verification task, the one-to-one scenario is applied and threshold values for rI, rII, rPCA, and their combinations are derived. The identification task supposes a one-to-many scenario, and the tested subject is identified according to the maximal correlation with a previously recorded ECG in a database. The population-based ECG-ILSA database of 540 patients (147 healthy subjects, 175 patients with cardiac diseases, and 218 with hypertension) has been considered. In addition, a common reference PTB dataset (14 healthy individuals) with a short time interval between the two acquisitions has been taken into account. The results on the ECG-ILSA database were satisfactory for healthy people, and there was no significant decrease for non-healthy patients, demonstrating the robustness of the proposed method. With the PTB database, the method provides an identification accuracy of 92.9% and a verification sensitivity and specificity of 100% and 89.9%. PMID:26568954
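The core verification step, correlating a present acquisition against a stored lead and thresholding, can be sketched as follows; the threshold, the synthetic beat, and the noise level are assumptions for illustration, not the paper's derived values.

    import numpy as np

    def verify_identity(present_lead, previous_lead, threshold=0.9):
        """One-to-one verification: accept if the Pearson correlation between the
        present and previously stored ECG lead exceeds a preset threshold
        (the threshold value here is illustrative, not the paper's derived value)."""
        r = np.corrcoef(present_lead, previous_lead)[0, 1]
        return r, r >= threshold

    # Toy example: a stored lead-I beat and a new acquisition with small noise
    rng = np.random.default_rng(0)
    t = np.linspace(0, 1, 500)
    stored = np.sin(2 * np.pi * 1.2 * t) * np.exp(-((t - 0.3) / 0.05) ** 2)
    new = stored + 0.05 * rng.normal(size=t.size)

    r, accepted = verify_identity(new, stored)
    print(f"r_I = {r:.3f}, accepted = {accepted}")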
New generalized Noh solutions for HEDP hydrocode verification
NASA Astrophysics Data System (ADS)
Velikovich, A. L.; Giuliani, J. L.; Zalesak, S. T.; Tangri, V.
2017-10-01
The classic Noh solution describing stagnation of a cold ideal gas in a strong accretion shock wave has been the workhorse of compressible hydrocode verification for over three decades. We describe a number of its generalizations available for HEDP code verification. First, for an ideal gas, we have obtained self-similar solutions that describe adiabatic convergence either of a finite-pressure gas into an empty cavity or of a finite-amplitude sound wave into a uniform resting gas surrounding the center or axis of symmetry. At the moment of collapse, such a flow produces a uniform gas whose velocity at each point is constant and directed towards the axis or the center, i.e., an initial condition similar to that of the classic solution but with a finite pressure of the converging gas. After that, a constant-velocity accretion shock propagates into the incident gas, whose pressure and velocity profiles are not flat, in contrast with the classic solution. Second, for an arbitrary equation of state, we demonstrate the existence of self-similar solutions of the Noh problem in cylindrical and spherical geometry. Examples of such solutions with a three-term equation of state that includes cold, thermal ion/lattice, and thermal electron contributions are presented for aluminum and copper. These analytic solutions are compared to our numerical simulation results as an example of their use for code verification. Work supported by the US DOE/NNSA.
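For reference, the classic Noh solution itself is compact enough to evaluate directly. The sketch below implements the standard exact profiles for a γ-law gas in 1, 2, or 3 dimensions; the default parameters are the conventional γ = 5/3, ρ0 = u0 = 1 test case, not the generalized solutions introduced in this work.

    import numpy as np

    def noh_exact(r, t, gamma=5.0 / 3.0, rho0=1.0, u0=1.0, n=3):
        """Classic Noh solution (cold ideal gas imploding at speed u0) in n dimensions
        (1 planar, 2 cylindrical, 3 spherical): density, velocity, pressure at radius r, time t > 0."""
        r = np.asarray(r, dtype=float)
        r_shock = 0.5 * (gamma - 1.0) * u0 * t   # constant-speed accretion shock position
        post = r < r_shock
        rho = np.where(post,
                       rho0 * ((gamma + 1.0) / (gamma - 1.0)) ** n,
                       rho0 * (1.0 + u0 * t / r) ** (n - 1))
        vel = np.where(post, 0.0, -u0)
        prs = np.where(post,
                       0.5 * (gamma - 1.0) * rho0 * ((gamma + 1.0) / (gamma - 1.0)) ** n * u0 ** 2,
                       0.0)
        return rho, vel, prs

    # Spot check at t = 0.6 (gamma = 5/3, spherical): post-shock density should be 64
    # and post-shock pressure 64/3 in these units.
    r = np.linspace(0.01, 1.0, 5)
    rho, vel, prs = noh_exact(r, t=0.6)
    print(rho, vel, prs, sep="\n")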
NASA Astrophysics Data System (ADS)
Wijaya, Surya Li; Savvides, Marios; Vijaya Kumar, B. V. K.
2005-02-01
Face recognition on mobile devices, such as personal digital assistants and cell phones, is a big challenge owing to the limited computational resources available to run verifications on the devices themselves. One approach is to transmit the captured face images by use of the cell-phone connection and to run the verification on a remote station. However, owing to limitations in communication bandwidth, it may be necessary to transmit a compressed version of the image. We propose using the image compression standard JPEG2000, which is a wavelet-based compression engine used to compress the face images to low bit rates suitable for transmission over low-bandwidth communication channels. At the receiver end, the face images are reconstructed with a JPEG2000 decoder and are fed into the verification engine. We explore how advanced correlation filters, such as the minimum average correlation energy filter [Appl. Opt. 26, 3633 (1987)] and its variants, perform by using face images captured under different illumination conditions and encoded with different bit rates under the JPEG2000 wavelet-encoding standard. We evaluate the performance of these filters by using illumination variations from the Carnegie Mellon University's Pose, Illumination, and Expression (PIE) face database. We also demonstrate the tolerance of these filters to noisy versions of images with illumination variations.
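A minimal frequency-domain sketch of MACE filter synthesis, h = D⁻¹X(XᴴD⁻¹X)⁻¹u, follows; the toy images, unit peak constraints, and the peak-based decision mentioned in the comments are assumptions for illustration rather than the authors' exact configuration.

    import numpy as np

    def mace_filter(train_images, u=None, eps=1e-8):
        """Minimum Average Correlation Energy (MACE) filter in the frequency domain:
        h = D^-1 X (X^H D^-1 X)^-1 u, with X the matrix of vectorized training-image
        FFTs and D the diagonal average power spectrum."""
        N = len(train_images)
        shape = train_images[0].shape
        X = np.column_stack([np.fft.fft2(img).ravel() for img in train_images])  # d x N
        D = np.mean(np.abs(X) ** 2, axis=1) + eps                                # diagonal of D
        if u is None:
            u = np.ones(N)                     # unit correlation peak for each training image
        Dinv_X = X / D[:, None]
        h = Dinv_X @ np.linalg.solve(X.conj().T @ Dinv_X, u.astype(complex))
        return h.reshape(shape)

    def correlate(filt_freq, image):
        """Correlation plane of a probe image with the frequency-domain filter."""
        return np.real(np.fft.ifft2(np.fft.fft2(image) * np.conj(filt_freq)))

    # Toy usage with random "face" images; real use would apply this to the
    # JPEG2000-decoded probes and threshold a peak-sharpness measure such as PSR.
    rng = np.random.default_rng(0)
    train = [rng.normal(size=(32, 32)) for _ in range(3)]
    H = mace_filter(train)
    out = correlate(H, train[0])
    print("peak correlation:", out.max())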
Jekova, Irena; Bortolan, Giovanni
2015-01-01
Traditional means for identity validation (PIN codes, passwords) and physiological and behavioral biometric characteristics (fingerprint, iris, and speech) are susceptible to hacker attacks and/or falsification. This paper presents a method for person verification/identification based on the correlation of present-to-previous limb ECG leads I (rI) and II (rII), the first principal ECG component calculated from them (rPCA), and linear and nonlinear combinations of rI, rII, and rPCA. For the verification task, the one-to-one scenario is applied and threshold values for rI, rII, rPCA, and their combinations are derived. The identification task supposes a one-to-many scenario, and the tested subject is identified according to the maximal correlation with a previously recorded ECG in a database. The population-based ECG-ILSA database of 540 patients (147 healthy subjects, 175 patients with cardiac diseases, and 218 with hypertension) has been considered. In addition, a common reference PTB dataset (14 healthy individuals) with a short time interval between the two acquisitions has been taken into account. The results on the ECG-ILSA database were satisfactory for healthy people, and there was no significant decrease for non-healthy patients, demonstrating the robustness of the proposed method. With the PTB database, the method provides an identification accuracy of 92.9% and a verification sensitivity and specificity of 100% and 89.9%.
SMAP Verification and Validation Project - Final Report
NASA Technical Reports Server (NTRS)
Murry, Michael
2012-01-01
In 2007, the National Research Council (NRC) released the Decadal Survey of Earth science, which identified 15 new space missions of significant scientific and application value for the National Aeronautics and Space Administration (NASA) to undertake in the coming decade. One of these missions was the Soil Moisture Active Passive (SMAP) mission, which NASA assigned to the Jet Propulsion Laboratory (JPL) in 2008. The goal of SMAP is to provide global, high-resolution mapping of soil moisture and its freeze/thaw states. The SMAP project recently passed its Critical Design Review and is proceeding with its fabrication and testing phase. Verification and Validation (V&V) is widely recognized as a critical component of systems engineering and is vital to the success of any space mission. V&V is a process used to check that a system meets its design requirements and specifications in order to fulfill its intended purpose. Verification often refers to the question "Have we built the system right?" whereas Validation asks "Have we built the right system?" Currently the SMAP V&V team is verifying design requirements through inspection, demonstration, analysis, or testing. An example of the SMAP V&V process is the verification of the antenna pointing accuracy with mathematical models, since it is not possible to provide the appropriate micro-gravity environment for testing the antenna on Earth before launch.