Automated biowaste sampling system urine subsystem operating model, part 1
NASA Technical Reports Server (NTRS)
Fogal, G. L.; Mangialardi, J. K.; Rosen, F.
1973-01-01
The urine subsystem automatically provides for the collection, volume sensing, and sampling of urine from six subjects during space flight. Verification of the subsystem design was a primary objective of the current effort, accomplished through the detailed design, fabrication, and verification testing of an operating model of the subsystem.
Urine sampling and collection system
NASA Technical Reports Server (NTRS)
Fogal, G. L.; Mangialardi, J. K.; Reinhardt, C. G.
1971-01-01
This specification defines the performance and design requirements for the urine sampling and collection system engineering model and establishes requirements for its design, development, and test. The model shall provide conceptual verification of a system applicable to manned space flight which will automatically provide for collection, volume sensing, and sampling of urine.
Federal Register 2010, 2011, 2012, 2013, 2014
2012-11-14
... subsequent soil samples showed levels of metals at or below generic residential criteria or background values... 1994-1996 and additional sampling between 1998 and 2007. Area A--Site Entrance: Soil boring samples... verification samples. Additional soil samples were collected from the same location as the previous collection...
DOE Office of Scientific and Technical Information (OSTI.GOV)
King, David A.
2012-08-16
Oak Ridge Associated Universities (ORAU) conducted in-process inspections and independent verification (IV) surveys in support of DOE's remedial efforts in Zone 1 of East Tennessee Technology Park (ETTP) in Oak Ridge, Tennessee. Inspections concluded that the remediation contractor's soil removal and survey objectives were satisfied and the dynamic verification strategy (DVS) was implemented as designed. Independent verification (IV) activities included gamma walkover surveys and soil sample collection/analysis over multiple exposure units (EUs).
9 CFR 417.8 - Agency verification.
Code of Federal Regulations, 2014 CFR
2014-01-01
....8 Animals and Animal Products FOOD SAFETY AND INSPECTION SERVICE, DEPARTMENT OF AGRICULTURE... ANALYSIS AND CRITICAL CONTROL POINT (HACCP) SYSTEMS § 417.8 Agency verification. FSIS will verify the... plan or system; (f) Direct observation or measurement at a CCP; (g) Sample collection and analysis to...
9 CFR 417.8 - Agency verification.
Code of Federal Regulations, 2012 CFR
2012-01-01
....8 Animals and Animal Products FOOD SAFETY AND INSPECTION SERVICE, DEPARTMENT OF AGRICULTURE... ANALYSIS AND CRITICAL CONTROL POINT (HACCP) SYSTEMS § 417.8 Agency verification. FSIS will verify the... plan or system; (f) Direct observation or measurement at a CCP; (g) Sample collection and analysis to...
9 CFR 417.8 - Agency verification.
Code of Federal Regulations, 2013 CFR
2013-01-01
....8 Animals and Animal Products FOOD SAFETY AND INSPECTION SERVICE, DEPARTMENT OF AGRICULTURE... ANALYSIS AND CRITICAL CONTROL POINT (HACCP) SYSTEMS § 417.8 Agency verification. FSIS will verify the... plan or system; (f) Direct observation or measurement at a CCP; (g) Sample collection and analysis to...
9 CFR 417.8 - Agency verification.
Code of Federal Regulations, 2010 CFR
2010-01-01
....8 Animals and Animal Products FOOD SAFETY AND INSPECTION SERVICE, DEPARTMENT OF AGRICULTURE... ANALYSIS AND CRITICAL CONTROL POINT (HACCP) SYSTEMS § 417.8 Agency verification. FSIS will verify the... plan or system; (f) Direct observation or measurement at a CCP; (g) Sample collection and analysis to...
9 CFR 417.8 - Agency verification.
Code of Federal Regulations, 2011 CFR
2011-01-01
....8 Animals and Animal Products FOOD SAFETY AND INSPECTION SERVICE, DEPARTMENT OF AGRICULTURE... ANALYSIS AND CRITICAL CONTROL POINT (HACCP) SYSTEMS § 417.8 Agency verification. FSIS will verify the... plan or system; (f) Direct observation or measurement at a CCP; (g) Sample collection and analysis to...
Soil Gas Sample Handling: Evaluation of Water Removal and Sample Ganging
DOE Office of Scientific and Technical Information (OSTI.GOV)
Fritz, Brad G.; Abrecht, David G.; Hayes, James C.
2016-10-31
Soil gas sampling is currently conducted in support of Nuclear Test Ban Treaty verification. Soil gas samples are collected and analyzed for isotopes of interest. Excess moisture and sample processing time are two issues that can impact the sampling and analysis of these samples. Here we discuss three potential improvements to the current sampling protocol: a desiccant for water removal, use of a molecular sieve to remove CO2 from the sample during collection, and a ganging manifold to allow composite analysis of multiple samples.
Urine sampling and collection system optimization and testing
NASA Technical Reports Server (NTRS)
Fogal, G. L.; Geating, J. A.; Koesterer, M. G.
1975-01-01
A Urine Sampling and Collection System (USCS) engineering model was developed to provide for the automatic collection, volume sensing and sampling of urine from each micturition. The purpose of the engineering model was to demonstrate verification of the system concept. The objective of the optimization and testing program was to update the engineering model, to provide additional performance features and to conduct system testing to determine operational problems. Optimization tasks were defined as modifications to minimize system fluid residual and addition of thermoelectric cooling.
Self-verification motives at the collective level of self-definition.
Chen, Serena; Chen, Karen Y; Shaw, Lindsay
2004-01-01
Three studies examined self-verification motives in relation to collective aspects of the self. Several moderators of collective self-verification were also examined--namely, the certainty with which collective self-views are held, the nature of one's ties to a source of self-verification, the salience of the collective self, and the importance of group identification. Evidence for collective self-verification emerged across all studies, particularly when collective self-views were held with high certainty (Studies 1 and 2), perceivers were somehow tied to the source of self-verification (Study 1), the collective self was salient (Study 2), and group identification was important (Study 3). To the authors' knowledge, these studies are the first to examine self-verification at the collective level of self-definition. The parallel and distinct ways in which self-verification processes may operate at different levels of self-definition are discussed.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Demuth, Scott F.; Trahan, Alexis Chanel
2017-06-26
DIV of facility layout, material flows, and other information provided in the DIQ. Material accountancy through an annual PIV and a number of interim inventory verifications, including UF6 cylinder identification and counting, NDA of cylinders, and DA on a sample collection of UF6. Application of C/S technologies utilizing seals and tamper-indicating devices (TIDs) on cylinders, containers, storage rooms, and IAEA instrumentation to provide continuity of knowledge between inspections. Verification of the absence of undeclared material and operations, especially HEU production, through SNRIs, LFUA of cascade halls, and environmental swipe sampling.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Fitzgerald, T.J.; Carlos, R.C.; Argo, P.E.
As part of the integrated verification experiment (IVE), we deployed a network of HF ionospheric sounders to detect the effects of acoustic waves generated by surface ground motion following underground nuclear tests at the Nevada Test Site. The network sampled up to four geographic locations in the ionosphere, from almost directly overhead of the surface ground zero out to a horizontal range of 60 km. We present sample results for four of the IVEs: Misty Echo, Texarkana, Mineral Quarry, and Bexar.
Area of Concern (AOC) 314 Verification Survey at Former McClellan AFB, Sacramento, CA
2015-03-31
also collected 22 soil samples from within AOC 314. Laboratory analysis revealed that the concentration of radium-226 (Ra-226) in 10 of the soil ...at least one sample that exceeded 2.0 pCi/g. The highest concentration of Ra-226 found in any of the soil samples was 25.8 pCi/g. Based on these...and ensure the potential health risk to future inhabitants is minimized. USAFSAM/OEC personnel also collected 22 soil samples from within AOC 314
Reconstruction based finger-knuckle-print verification with score level adaptive binary fusion.
Gao, Guangwei; Zhang, Lei; Yang, Jian; Zhang, Lin; Zhang, David
2013-12-01
Recently, a new biometrics identifier, namely finger knuckle print (FKP), has been proposed for personal authentication with very interesting results. One of the advantages of FKP verification lies in its user friendliness in data collection. However, the user flexibility in positioning fingers also leads to a certain degree of pose variations in the collected query FKP images. The widely used Gabor filtering based competitive coding scheme is sensitive to such variations, resulting in many false rejections. We propose to alleviate this problem by reconstructing the query sample with a dictionary learned from the template samples in the gallery set. The reconstructed FKP image can reduce much the enlarged matching distance caused by finger pose variations; however, both the intra-class and inter-class distances will be reduced. We then propose a score level adaptive binary fusion rule to adaptively fuse the matching distances before and after reconstruction, aiming to reduce the false rejections without increasing much the false acceptances. Experimental results on the benchmark PolyU FKP database show that the proposed method significantly improves the FKP verification accuracy.
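The score-level adaptive binary fusion rule described in the abstract can be sketched as follows. This is a hypothetical illustration under assumed semantics, not the authors' published implementation: the threshold `tau` and the exact decision rule are assumptions made for clarity.

```python
def adaptive_binary_fusion(d_orig: float, d_recon: float, tau: float) -> float:
    """Fuse matching distances computed before (d_orig) and after (d_recon)
    dictionary-based reconstruction of the query FKP image.

    Hypothetical binary rule: if the post-reconstruction distance falls below
    the threshold tau, the reconstruction is trusted and the smaller distance
    is used (reducing false rejections for genuine pairs with pose variation);
    otherwise the original distance is kept, so impostor distances are not
    artificially shrunk (limiting extra false acceptances).
    """
    if d_recon < tau:
        return min(d_orig, d_recon)
    return d_orig

# Genuine pair with pose variation: reconstruction shrinks the distance.
print(adaptive_binary_fusion(0.80, 0.30, tau=0.50))  # -> 0.3
# Impostor pair: reconstruction not trusted, original distance kept.
print(adaptive_binary_fusion(0.80, 0.70, tau=0.50))  # -> 0.8
```

The binary switch captures the paper's stated goal, reducing false rejections without increasing false acceptances much, by only ever replacing a distance when the reconstruction itself looks like a plausible match.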
DOE Office of Scientific and Technical Information (OSTI.GOV)
P.C. Weaver
2009-04-29
The primary objective of the independent verification was to determine if BJC performed the appropriate actions to meet the specified "hot spot" cleanup criteria of 500 picocuries per gram (pCi/g) uranium-238 (U-238) in surface soil. Specific tasks performed by the independent verification team (IVT) to satisfy this objective included: 1) performing radiological walkover surveys, and 2) collecting soil samples for independent analyses. The independent verification (IV) efforts were designed to evaluate radioactive contaminants (specifically U-238) in the exposed surfaces below one foot of the original site grade, given that the top one-foot layer of soil on the site was removed in its entirety.
Federal Register 2010, 2011, 2012, 2013, 2014
2010-11-12
...) also requires Eligible Telecommunications Carriers (ETCs) to submit to the Universal Service.... Prior to 2009, USAC provided sample certification and verification letters on its website to assist ETCs... check box to accommodate wireless ETCs serving non-federal default states that do not assert...
Federal Register 2010, 2011, 2012, 2013, 2014
2012-03-21
... children and provide low cost or free school lunch meals to qualified students through subsidies to schools... records to demonstrate compliance with the meal requirements. To the extent practicable, schools ensure... verification of a required sample size), the number of meals served, and data from required reviews conducted...
Feasibility of biochemical verification in a web-based smoking cessation study.
Cha, Sarah; Ganz, Ollie; Cohn, Amy M; Ehlke, Sarah J; Graham, Amanda L
2017-10-01
Cogent arguments have been made against the need for biochemical verification in population-based studies with low-demand characteristics. Despite this fact, studies involving digital interventions (low-demand) are often required in peer review to report biochemically verified abstinence. To address this discrepancy, we examined the feasibility and costs of biochemical verification in a web-based study conducted with a national sample. Participants were 600 U.S. adult current smokers who registered on a web-based smoking cessation program and completed surveys at baseline and 3 months. Saliva sampling kits were sent to participants who reported 7-day abstinence at 3 months, and analyzed for cotinine. The response rate at 3 months was 41.2% (n=247): 93 participants reported 7-day abstinence (38%) and were mailed a saliva kit (71% returned). The discordance rate was 36.4%. Participants with discordant responses were more likely to report 3-month use of nicotine replacement therapy or e-cigarettes than those with concordant responses (79.2% vs. 45.2%, p=0.007). The total cost of saliva sampling was $8280 ($125/sample). Biochemical verification was both time- and cost-intensive, and yielded a relatively small number of samples due to low response rates and use of other nicotine products during the follow-up period. There was a high rate of discordance between self-reported abstinence and saliva testing. Costs for data collection may be prohibitive for studies with large sample sizes or limited budgets. Our findings echo previous statements that biochemical verification is not necessary in population-based studies, and add evidence specific to technology-based studies.
NASA Astrophysics Data System (ADS)
Wier, Timothy P.; Moser, Cameron S.; Grant, Jonathan F.; Riley, Scott C.; Robbins-Wamsley, Stephanie H.; First, Matthew R.; Drake, Lisa A.
2017-10-01
Both L-shaped ("L") and straight ("Straight") sample probes have been used to collect water samples from a main ballast line in land-based or shipboard verification testing of ballast water management systems (BWMS). A series of experiments was conducted to quantify and compare the sampling efficiencies of L and Straight sample probes. The findings from this research-that both L and Straight probes sample organisms with similar efficiencies-permit increased flexibility for positioning sample probes aboard ships.
Methods and Procedures in PIRLS 2016
ERIC Educational Resources Information Center
Martin, Michael O., Ed.; Mullis, Ina V. S., Ed.; Hooper, Martin, Ed.
2017-01-01
"Methods and Procedures in PIRLS 2016" documents the development of the Progress in International Reading Literacy Study (PIRLS) assessments and questionnaires and describes the methods used in sampling, translation verification, data collection, database construction, and the construction of the achievement and context questionnaire…
Hou, Guixue; Lou, Xiaomin; Sun, Yulin; Xu, Shaohang; Zi, Jin; Wang, Quanhui; Zhou, Baojin; Han, Bo; Wu, Lin; Zhao, Xiaohang; Lin, Liang; Liu, Siqi
2015-09-04
We propose an efficient integration of SWATH with MRM for biomarker discovery and verification when the corresponding ion library is well established. We strictly controlled the false positive rate associated with SWATH MS signals and carefully selected the target peptides coupled with SWATH and MRM. We collected 10 samples of esophageal squamous cell carcinoma (ESCC) tissues, paired tumors and adjacent regions, and quantified 1758 unique proteins with FDR 1% at the protein level using SWATH, of which 467 proteins showed abundance changes associated with ESCC. After carefully evaluating the SWATH MS signals of the up-regulated proteins, we selected 120 proteins for MRM verification. MRM analysis of the pooled and individual esophageal tissues resulted in 116 proteins that exhibited abundance responses to ESCC similar to those acquired with SWATH. Because the ESCC-related proteins included a high percentage of secreted proteins, we conducted the MRM assay on patient sera collected pre- and postoperation. Of the 116 target proteins, 42 were identified in the ESCC sera, including 11 with lowered abundances postoperation. Coupling SWATH and MRM is thus feasible and efficient for the discovery and verification of cancer-related protein biomarkers.
INNOVATIVE TECHNOLOGY VERIFICATION REPORT XRF ...
The Rigaku ZSX Mini II (ZSX Mini II) XRF Services x-ray fluorescence (XRF) analyzer was demon-strated under the U.S. Environmental Protection Agency (EPA) Superfund Innovative Technology Evaluation (SITE) Program. The field portion of the demonstration was conducted in January 2005 at the Kennedy Athletic, Recreational and Social Park (KARS) at Kennedy Space Center on Merritt Island, Florida. The demonstration was designed to collect reliable performance and cost data for the ZSX Mini II analyzer and seven other commercially available XRF instruments for measuring trace elements in soil and sediment. The performance and cost data were evaluated to document the relative performance of each XRF instrument. This innovative technology verification report describes the objectives and the results of that evaluation and serves to verify the performance and cost of the ZSX Mini II analyzer. Separate reports have been prepared for the other XRF instruments that were evaluated as part of the demonstration. The objectives of the evaluation included determining each XRF instrument’s accuracy, precision, sample throughput, and tendency for matrix effects. To fulfill these objectives, the field demonstration incorporated the analysis of 326 prepared samples of soil and sediment that contained 13 target elements. The prepared samples included blends of environmental samples from nine different sample collection sites as well as spiked samples with certified element con
INNOVATIVE TECHNOLOGY VERIFICATION REPORT XRF ...
The Rontec PicoTAX x-ray fluorescence (XRF) analyzer was demonstrated under the U.S. Environmental Protection Agency (EPA) Superfund Innovative Technology Evaluation (SITE) Program. The field portion of the demonstration was conducted in January 2005 at the Kennedy Athletic, Recreational and Social Park (KARS) at Kennedy Space Center on Merritt Island, Florida. The demonstration was designed to collect reliable performance and cost data for the PicoTAX analyzer and seven other commercially available XRF instruments for measuring trace elements in soil and sediment. The performance and cost data were evaluated to document the relative performance of each XRF instrument. This innovative technology verification report describes the objectives and the results of that evaluation and serves to verify the performance and cost of the PicoTAX analyzer. Separate reports have been prepared for the other XRF instruments that were evaluated as part of the demonstration. The objectives of the evaluation included determining each XRF instrument’s accuracy, precision, sample throughput, and tendency for matrix effects. To fulfill these objectives, the field demonstration incorporated the analysis of 326 prepared samples of soil and sediment that contained 13 target elements. The prepared samples included blends of environmental samples from nine different sample collection sites as well as spiked samples with certified element concentrations. Accuracy was assessed by c
INNOVATIVE TECHNOLOGY VERIFICATION REPORT XRF ...
The Niton XLt 700 Series (XLt) XRF Services x-ray fluorescence (XRF) analyzer was demonstrated under the U.S. Environmental Protection Agency (EPA) Superfund Innovative Technology Evaluation (SITE) Program. The field portion of the demonstration was conducted in January 2005 at the Kennedy Athletic, Recreational and Social Park (KARS) at Kennedy Space Center on Merritt Island, Florida. The demonstration was designed to collect reliable performance and cost data for the XLt analyzer and seven other commercially available XRF instruments for measuring trace elements in soil and sediment. The performance and cost data were evaluated to document the relative performance of each XRF instrument. This innovative technology verification report describes the objectives and the results of that evaluation and serves to verify the performance and cost of the XLt analyzer. Separate reports have been prepared for the other XRF instruments that were evaluated as part of the demonstration. The objectives of the evaluation included determining each XRF instrument’s accuracy, precision, sample throughput, and tendency for matrix effects. To fulfill these objectives, the field demonstration incorporated the analysis of 326 prepared samples of soil and sediment that contained 13 target elements. The prepared samples included blends of environmental samples from nine different sample collection sites as well as spiked samples with certified element concentrations. Accuracy
INNOVATIVE TECHNOLOGY VERIFICATION REPORT XRF ...
The Oxford ED2000 x-ray fluorescence (XRF) analyzer was demonstrated under the U.S. Environmental Protection Agency (EPA) Superfund Innovative Technology Evaluation (SITE) Program. The field portion of the demonstration was conducted in January 2005 at the Kennedy Athletic, Recreational and Social Park (KARS) at Kennedy Space Center on Merritt Island, Florida. The demonstration was designed to collect reliable performance and cost data for the ED2000 analyzer and seven other commercially available XRF instruments for measuring trace elements in soil and sediment. The performance and cost data were evaluated to document the relative performance of each XRF instrument. This innovative technology verification report describes the objectives and the results of that evaluation and serves to verify the performance and cost of the ED2000 analyzer. Separate reports have been prepared for the other XRF instruments that were evaluated as part of the demonstration. The objectives of the evaluation included determining each XRF instrument’s accuracy, precision, sample throughput, and tendency for matrix effects. To fulfill these objectives, the field demonstration incorporated the analysis of 326 prepared samples of soil and sediment that contained 13 target elements. The prepared samples included blends of environmental samples from nine different sample collection sites as well as spiked samples with certified element concentrations. Accuracy was assessed by com
INNOVATIVE TECHNOLOGY VERIFICATION REPORT XRF ...
The Innov-X XT400 Series (XT400) x-ray fluorescence (XRF) analyzer was demonstrated under the U.S. Environmental Protection Agency (EPA) Superfund Innovative Technology Evaluation (SITE) Program. The field portion of the demonstration was conducted in January 2005 at the Kennedy Athletic, Recreational and Social Park (KARS) at Kennedy Space Center on Merritt Island, Florida. The demonstration was designed to collect reliable performance and cost data for the XT400 analyzer and seven other commercially available XRF instruments for measuring trace elements in soil and sediment. The performance and cost data were evaluated to document the relative performance of each XRF instrument. This innovative technology verification report describes the objectives and the results of that evaluation and serves to verify the performance and cost of the XT400 analyzer. Separate reports have been prepared for the other XRF instruments that were evaluated as part of the demonstration. The objectives of the evaluation included determining each XRF instrument’s accuracy, precision, sample throughput, and tendency for matrix effects. To fulfill these objectives, the field demonstration incorporated the analysis of 326 prepared samples of soil and sediment that contained 13 target elements. The prepared samples included blends of environmental samples from nine different sample collection sites as well as spiked samples with certified element concentrations. Accuracy was as
FIELD VERIFICATION OF LINERS FROM SANITARY LANDFILLS
Liner specimens from three existing landfill sites were collected and examined to determine the changes in their physical properties over time and to validate data being developed through laboratory research. Samples examined included a 15-mil PVC liner from a sludge lagoon in Ne...
40 CFR 1065.520 - Pre-test verification procedures and pre-test data collection.
Code of Federal Regulations, 2013 CFR
2013-07-01
... 40 Protection of Environment 34 2013-07-01 2013-07-01 false Pre-test verification procedures and pre-test data collection. 1065.520 Section 1065.520 Protection of Environment ENVIRONMENTAL PROTECTION... Specified Duty Cycles § 1065.520 Pre-test verification procedures and pre-test data collection. (a) If your...
40 CFR 1065.520 - Pre-test verification procedures and pre-test data collection.
Code of Federal Regulations, 2012 CFR
2012-07-01
... 40 Protection of Environment 34 2012-07-01 2012-07-01 false Pre-test verification procedures and pre-test data collection. 1065.520 Section 1065.520 Protection of Environment ENVIRONMENTAL PROTECTION... Specified Duty Cycles § 1065.520 Pre-test verification procedures and pre-test data collection. (a) If your...
40 CFR 1065.520 - Pre-test verification procedures and pre-test data collection.
Code of Federal Regulations, 2014 CFR
2014-07-01
... 40 Protection of Environment 33 2014-07-01 2014-07-01 false Pre-test verification procedures and pre-test data collection. 1065.520 Section 1065.520 Protection of Environment ENVIRONMENTAL PROTECTION... Specified Duty Cycles § 1065.520 Pre-test verification procedures and pre-test data collection. (a) For...
40 CFR 1066.420 - Pre-test verification procedures and pre-test data collection.
Code of Federal Regulations, 2013 CFR
2013-07-01
... 40 Protection of Environment 34 2013-07-01 2013-07-01 false Pre-test verification procedures and pre-test data collection. 1066.420 Section 1066.420 Protection of Environment ENVIRONMENTAL PROTECTION... Test § 1066.420 Pre-test verification procedures and pre-test data collection. (a) Follow the...
40 CFR 1065.520 - Pre-test verification procedures and pre-test data collection.
Code of Federal Regulations, 2010 CFR
2010-07-01
... 40 Protection of Environment 32 2010-07-01 2010-07-01 false Pre-test verification procedures and pre-test data collection. 1065.520 Section 1065.520 Protection of Environment ENVIRONMENTAL PROTECTION... Specified Duty Cycles § 1065.520 Pre-test verification procedures and pre-test data collection. (a) If your...
40 CFR 1065.520 - Pre-test verification procedures and pre-test data collection.
Code of Federal Regulations, 2011 CFR
2011-07-01
... 40 Protection of Environment 33 2011-07-01 2011-07-01 false Pre-test verification procedures and pre-test data collection. 1065.520 Section 1065.520 Protection of Environment ENVIRONMENTAL PROTECTION... Specified Duty Cycles § 1065.520 Pre-test verification procedures and pre-test data collection. (a) If your...
40 CFR 1066.420 - Pre-test verification procedures and pre-test data collection.
Code of Federal Regulations, 2012 CFR
2012-07-01
... 40 Protection of Environment 34 2012-07-01 2012-07-01 false Pre-test verification procedures and pre-test data collection. 1066.420 Section 1066.420 Protection of Environment ENVIRONMENTAL PROTECTION... Test § 1066.420 Pre-test verification procedures and pre-test data collection. (a) Follow the...
INNOVATIVE TECHNOLOGY VERIFICATION REPORT XRF ...
The Elvatech, Ltd. ElvaX (ElvaX) x-ray fluorescence (XRF) analyzer distributed in the United States by Xcalibur XRF Services (Xcalibur), was demonstrated under the U.S. Environmental Protection Agency (EPA) Superfund Innovative Technology Evaluation (SITE) Program. The field portion of the demonstration was conducted in January 2005 at the Kennedy Athletic, Recreational and Social Park (KARS) at Kennedy Space Center on Merritt Island, Florida. The demonstration was designed to collect reliable performance and cost data for the ElvaX analyzer and seven other commercially available XRF instruments for measuring trace elements in soil and sediment. The performance and cost data were evaluated to document the relative performance of each XRF instrument. This innovative technology verification report describes the objectives and the results of that evaluation and serves to verify the performance and cost of the ElvaX analyzer. Separate reports have been prepared for the other XRF instruments that were evaluated as part of the demonstration. The objectives of the evaluation included determining each XRF instrument’s accuracy, precision, sample throughput, and tendency for matrix effects. To fulfill these objectives, the field demonstration incorporated the analysis of 326 prepared samples of soil and sediment that contained 13 target elements. The prepared samples included blends of environmental samples from nine different sample collection sites as well as s
OH/H2O Detection Capability Evaluation on Chang'e-5 Lunar Mineralogical Spectrometer (LMS)
NASA Astrophysics Data System (ADS)
Liu, Bin; Ren, Xin; Liu, Jianjun; Li, Chunlai; Mu, Lingli; Deng, Liyan
2016-10-01
The Chang'e-5 (CE-5) lunar sample return mission is scheduled to launch in 2017 to bring back lunar regolith and drill samples. The Chang'e-5 Lunar Mineralogical Spectrometer (LMS), one of the three scientific payloads installed on the lander, is used to collect in-situ spectra and analyze the mineralogical composition of the sampling site. It can also help to select the sampling site, and to compare the laboratory spectrum of the returned sample with in-situ data. LMS employs acousto-optic tunable filters (AOTFs) and is composed of a VIS/NIR module (0.48μm-1.45μm) and an IR module (1.4μm-3.2μm). It has a spectral resolution ranging from 3 to 25 nm, with a field of view (FOV) of 4.24°×4.24°. Unlike the Chang'e-3 VIS/NIR Imaging Spectrometer (VNIS), the spectral coverage of LMS is extended from 2.4μm to 3.2μm, giving it the capability to identify H2O/OH absorption features around 2.7μm. An aluminum plate and an Infragold plate are fixed in the dust cover and serve as calibration targets in the VIS/NIR and IR spectral ranges, respectively, when the dust cover is open. Before launch, a ground verification test of LMS needs to be conducted in order to: 1) test and verify the detection capability of LMS through evaluation of the quality of image and spectral data collected for the simulated lunar samples; and 2) evaluate the accuracy of data processing methods by simulating the instrument working on the moon. The ground verification test will be conducted both in the lab and in the field. The spectra of simulated lunar regolith/mineral samples will be collected simultaneously by the LMS and two calibrated spectrometers: an FTIR spectrometer (Model 102F) and an ASD FieldSpec 4 Hi-Res spectrometer. In this study, the results of the LMS ground verification test will be reported, with particular emphasis on evaluating OH/H2O detection capability.
NASA Astrophysics Data System (ADS)
Rieben, James C., Jr.
This study focuses on the effects of relevance and lab design on student learning within the chemistry laboratory environment. A general chemistry conductivity of solutions experiment and an upper level organic chemistry cellulose regeneration experiment were employed. In the conductivity experiment, the two main variables studied were the effect of relevant (or "real world") samples on student learning and a verification-based lab design versus a discovery-based lab design. With the cellulose regeneration experiment, the effect of a discovery-based lab design vs. a verification-based lab design was the sole focus. Evaluation surveys consisting of six questions were used at three different times to assess student knowledge of experimental concepts. In the general chemistry laboratory portion of this study, four experimental variants were employed to investigate the effect of relevance and lab design on student learning. These variants consisted of a traditional (or verification) lab design, a traditional lab design using "real world" samples, a new lab design employing real world samples/situations using unknown samples, and the new lab design using real world samples/situations that were known to the student. Data used in this analysis were collected during the Fall 08, Winter 09, and Fall 09 terms. For the second part of this study a cellulose regeneration experiment was employed to investigate the effects of lab design. A demonstration creating regenerated cellulose "rayon" was modified and converted to an efficient and low-waste experiment. In the first variant students tested their products and verified a list of physical properties. In the second variant, students filled in a blank physical property chart with their own experimental results for the physical properties. 
Results from the conductivity experiment show significant student learning of the effects of concentration on conductivity and how to use conductivity to differentiate solution types with the use of real world samples. In the organic chemistry experiment, results suggest that the discovery-based design improved student retention of the chain length differentiation by physical properties relative to the verification-based design.
Self-verification and depression among youth psychiatric inpatients.
Joiner, T E; Katz, J; Lew, A S
1997-11-01
According to self-verification theory (e.g., W.B. Swann, 1983), people are motivated to preserve stable self-concepts by seeking self-confirming interpersonal responses, even if the responses are negative. In the current study of 72 youth psychiatric inpatients (36 boys; 36 girls; ages 7-17, M = 13.18; SD = 2.59), the authors provide the 1st test of self-verification theory among a youth sample. Participants completed self-report questionnaires on depression, self-esteem, anxiety, negative and positive affect, and interest in negative feedback from others. The authors made chart diagnoses available, and they collected peer rejection ratings. Consistent with hypotheses, the authors found that interest in negative feedback was associated with depression, was predictive of peer rejection (but only within relatively longer peer relationships), was more highly related to cognitive than emotional aspects of depression, and was specifically associated with depression, rather than being generally associated with emotional distress. The authors discuss implications for self-verification theory and for the phenomenology of youth depression.
75 FR 17923 - Agency Information Collection Activities: Proposed Collection: Comment Request
Federal Register 2010, 2011, 2012, 2013, 2014
2010-04-08
... Verification and Community Site Information form, the Loan Information and Verification form, the Authorization to Release Information form, the Applicant Checklist, and the Self-Certification form. The annual... respondent responses response hours NHSC LRP Application 5,175 1 5,175 0.30 1,553 Employment Verification-- 5...
Federal Register 2010, 2011, 2012, 2013, 2014
2012-10-04
...-0008; OMB Number 1014-0009] Information Collection Activities: Legacy Data Verification Process (LDVP); Submitted for Office of Management and Budget (OMB) Review; Comment Request ACTION: 30-Day notice. SUMMARY... the Notice to Lessees (NTL) on the Legacy Data Verification Process (LDVP). This notice also provides...
Scheuermann, Taneisha S; Richter, Kimber P; Rigotti, Nancy A; Cummins, Sharon E; Harrington, Kathleen F; Sherman, Scott E; Zhu, Shu-Hong; Tindle, Hilary A; Preacher, Kristopher J
2017-12-01
To estimate the prevalence and predictors of failed biochemical verification of self-reported abstinence among participants enrolled in trials of hospital-initiated smoking cessation interventions. Comparison of characteristics between participants who verified and those who failed to verify self-reported abstinence. Multi-site randomized clinical trials conducted between 2010 and 2014 in hospitals throughout the United States. Recently hospitalized smokers who reported tobacco abstinence 6 months post-randomization and provided a saliva sample for verification purposes (n = 822). Outcomes were salivary cotinine-verified smoking abstinence at 10 and 15 ng/ml cut-points. Predictors and correlates included participant demographics and tobacco use; hospital diagnoses and treatment; and study characteristics collected via surveys and electronic medical records. Usable samples were returned by 69.8% of the 1178 eligible trial participants who reported 7-day point prevalence abstinence. The proportion of participants verified as quit was 57.8% [95% confidence interval (CI) = 54.4, 61.2; 10 ng/ml cut-off] or 60.6% (95% CI = 57.2, 63.9; 15 ng/ml). Factors independently associated with verification at 10 ng/ml were education beyond high school [odds ratio (OR) = 1.51; 95% CI = 1.07, 2.11], continuous abstinence since hospitalization (OR = 2.82; 95% CI = 2.02, 3.94), mailed versus in-person sample (OR = 3.20; 95% CI = 1.96, 5.21) and race. African American participants were less likely to verify abstinence than white participants (OR = 0.64; 95% CI = 0.44, 0.93). Findings were similar for verification at 15 ng/ml. Verification rates did not differ by treatment group. In the United States, a high proportion (approximately 40%) of recently hospitalized smokers enrolled in smoking cessation trials fail biochemical verification of their self-reported abstinence. © 2017 Society for the Study of Addiction.
Cho, Gun-Sang; Kim, Dae-Sung; Yi, Eun-Surk
2015-12-01
The purpose of this study was to verify a relationship model between the Korean new elderly class's recovery resilience and productive aging. As of 2013, this study sampled preliminary elderly people in Gyeonggi-do and other provinces nationwide. Data from a total of 484 effective subjects were analyzed. The collected data were processed using IBM SPSS 20.0 and AMOS 20.0 and underwent descriptive statistical analysis, confirmatory factor analysis, and structural model verification. The path coefficients associated with model fitness were examined. The standardized path coefficient between recovery resilience and productive aging was β = 0.975 (t = 14.790), revealing a statistically significant positive effect. Thus, the proposed basic model of the direct path from recovery resilience to productive aging was found to fit the data.
Airell, Asa; Lindbäck, Emma; Ataker, Ferda; Pörnull, Kirsti Jalakas; Wretlind, Bengt
2005-06-01
We compared 956 samples for AMPLICOR Neisseria gonorrhoeae polymerase chain reaction (PCR) (Roche), with species verification using the 16S rRNA gene versus verification using the gyrA gene; the culture method served as the control. The gyrA verification uses pyrosequencing of the quinolone resistance-determining region of gyrA. Of 52 samples with optical density ≥0.2 in PCR, 27 were negative in culture, two samples from the pharynx were false negatives in culture, and four samples from the pharynx were false positives in verification with 16S rRNA. Twenty-five samples showed growth of gonococci; 18 of the corresponding PCR samples were verified by both methods, three urine samples were positive only in gyrA, and one pharynx specimen was positive only in 16S rRNA. Three samples were lost. We conclude that AMPLICOR N. gonorrhoeae PCR with verification in the gyrA gene can be considered a diagnostic tool in populations with a low prevalence of gonorrhoea, and that pharynx specimens should not be analysed by PCR.
Federal Register 2010, 2011, 2012, 2013, 2014
2012-10-01
... Collection for Placement Verification and Follow-Up of Job Corps Participants; Extension Without Revisions... Placement Verification and Follow-up of Job Corps Participants, using post-center surveys of Job Corps... to Lawrence Lyford, Office of Job Corps, Room N-4507, Employment and Training Administration, U.S...
Michael C. Wiemann; Edgard O. Espinoza
2017-01-01
To evade endangered timber species laws, unscrupulous importers sometimes attempt to pass protected Dalbergia nigra off as the look-alike but unprotected Dalbergia spruceana. Wood density and fluorescence properties are sometimes used to identify the species. Although these properties are useful and do not require special equipment,...
Code of Federal Regulations, 2010 CFR
2010-01-01
...) Analysis of samples. Laboratories may use any quantitative method for analysis of E. coli that is approved... FSIS in its nationwide microbiological baseline data collection programs and surveys. (Copies of... Surveys used in determining the prevalence of Salmonella on raw products are available in the FSIS Docket...
Code of Federal Regulations, 2011 CFR
2011-01-01
...) Analysis of samples. Laboratories may use any quantitative method for analysis of E. coli that is approved... FSIS in its nationwide microbiological baseline data collection programs and surveys. (Copies of... Surveys used in determining the prevalence of Salmonella on raw products are available in the FSIS Docket...
Code of Federal Regulations, 2014 CFR
2014-01-01
...) Analysis of samples. Laboratories may use any quantitative method for analysis of E. coli that is approved... FSIS in its nationwide microbiological baseline data collection programs and surveys. (Copies of... Surveys used in determining the prevalence of Salmonella on raw products are available in the FSIS Docket...
Code of Federal Regulations, 2012 CFR
2012-01-01
...) Analysis of samples. Laboratories may use any quantitative method for analysis of E. coli that is approved... FSIS in its nationwide microbiological baseline data collection programs and surveys. (Copies of... Surveys used in determining the prevalence of Salmonella on raw products are available in the FSIS Docket...
Code of Federal Regulations, 2013 CFR
2013-01-01
...) Analysis of samples. Laboratories may use any quantitative method for analysis of E. coli that is approved... FSIS in its nationwide microbiological baseline data collection programs and surveys. (Copies of... Surveys used in determining the prevalence of Salmonella on raw products are available in the FSIS Docket...
Federal Register 2010, 2011, 2012, 2013, 2014
2011-07-13
... Service [Docket No. FSIS-2008-0008] Salmonella Verification Sampling Program: Response to Comments on New... establishments that participate in SIP. The Agency intends to conduct its own unannounced, small- set sampling to... considering publishing verification sampling results for other product classes. In the 2006 Federal Register...
Argon Collection And Purification For Proliferation Detection
DOE Office of Scientific and Technical Information (OSTI.GOV)
Achey, R.; Hunter, D.
2015-10-09
In order to determine whether a seismic event was a declared/undeclared underground nuclear weapon test, environmental samples must be taken and analyzed for signatures that are unique to a nuclear explosion. These signatures are either particles or gases. Particle samples are routinely taken and analyzed under the Comprehensive Nuclear-Test-Ban Treaty Organization (CTBTO) verification regime as well as by individual countries. Gas samples are analyzed for signature gases, especially radioactive xenon. Underground nuclear tests also produce radioactive argon, but that signature is not well monitored. A radioactive argon signature, along with other signatures, can more conclusively determine whether an event was a nuclear test. This project has developed capabilities for collecting and purifying argon samples for ultra-low-background proportional counting. SRNL has developed a continuous gas enrichment system that produces an output stream containing 97% argon from whole air using adsorbent separation technology (the flow diagram for the system is shown in the figure). The vacuum swing adsorption (VSA) enrichment system is easily scalable to produce ten liters or more of 97% argon within twelve hours. A gas chromatographic separation using a column of modified hydrogen mordenite molecular sieve has been developed that can further purify the sample to better than 99% purity after separation from the helium carrier gas. The combination of these concentration and purification systems has the capability of being used for a field-deployable system for collecting argon samples suitable for ultra-low-background proportional counting for detecting nuclear detonations under the On-Site Inspection program of the CTBTO verification regime. The technology also has applications for the bulk argon separation from air for industrial purposes such as the semi-conductor industry.
47 CFR 73.151 - Field strength measurements to establish performance of directional antennas.
Code of Federal Regulations, 2010 CFR
2010-10-01
... verified either by field strength measurement or by computer modeling and sampling system verification. (a... specifically identified by the Commission. (c) Computer modeling and sample system verification of modeled... performance verified by computer modeling and sample system verification. (1) A matrix of impedance...
Federal Register 2010, 2011, 2012, 2013, 2014
2011-05-26
... Collection Activities: Form G-845 and Form G- 845 Supplement, Revision of a Currently Approved Information Collection; Comment Request ACTION: 30-Day Notice of Information Collection under Review: Form G- 845 and Form G-845 Supplement, Document Verification Request and Document Verification Request Supplement; OMB...
Federal Register 2010, 2011, 2012, 2013, 2014
2011-09-28
... Collection; Comment Request; Data Collection and Verification for the Marine Protected Areas Inventory AGENCY... developing a national system of marine protected areas (MPAs). These departments are working closely with... Administration (NOAA) and DOI have created the Marine Protected Areas Inventory, an online spatial database that...
Wirojanagud, Wanpen; Srisatit, Thares
2014-01-01
A fuzzy overlay approach on three raster maps, including land slope, soil type, and distance to stream, can be used to identify the locations with the highest potential for arsenic contamination in soils. Verification of high arsenic contamination was made by collecting samples, analyzing their arsenic content, and interpolating the surface by a spatial anisotropic method. A total of 51 soil samples were collected at the potential contaminated locations identified by the fuzzy overlay approach. At each location, soil samples were taken at a depth of 0.00-1.00 m from the ground surface. An interpolated surface of the analysed arsenic content, produced using the spatial anisotropic method, was used to verify the potential arsenic contamination locations obtained from the fuzzy overlay outputs. The outputs of the spatial anisotropic surface and the fuzzy overlay mapping conformed significantly in their spatial patterns. Three contaminated areas with arsenic concentrations of 7.19 ± 2.86, 6.60 ± 3.04, and 4.90 ± 2.67 mg/kg exceeded 3.9 mg/kg, the maximum concentration level (MCL) for agricultural soils as designated by the Office of the National Environment Board of Thailand. It is concluded that fuzzy overlay mapping can be employed to identify potential contamination areas, with verification by the spatial anisotropic approach, including intensive sampling and analysis of the substances of interest. PMID:25110751
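As a toy illustration of the fuzzy overlay idea (not the authors' GIS workflow), each input layer can be rescaled to a [0, 1] membership grid and combined cell-wise; the fuzzy AND (cell-wise minimum) used here is one common combination operator, and all membership values below are invented.

```python
# Toy sketch of a fuzzy overlay of three raster layers. Each layer has already
# been rescaled to a [0, 1] membership grid ("favorable slope", "susceptible
# soil", "near stream"); the fuzzy AND (cell-wise minimum) combines them.
import numpy as np

def fuzzy_and_overlay(*memberships):
    """Cell-wise minimum of membership rasters -> combined potential map."""
    return np.minimum.reduce(memberships)

# Invented 2x2 membership rasters standing in for real map layers.
slope = np.array([[0.9, 0.2], [0.6, 0.8]])
soil  = np.array([[0.7, 0.9], [0.4, 0.9]])
dist  = np.array([[0.8, 0.5], [0.9, 0.7]])

potential = fuzzy_and_overlay(slope, soil, dist)
hotspots = potential >= 0.7   # candidate high-potential sampling locations
```

Other overlay operators (fuzzy OR, product, gamma) trade off conservatism differently; the minimum flags a cell only when every layer supports it.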
Logistic model of nitrate in streams of the upper-midwestern United States
Mueller, D.K.; Ruddy, B.C.; Battaglin, W.A.
1997-01-01
Nitrate in surface water can have adverse effects on aquatic life and, in drinking-water supplies, can be a risk to human health. As part of a regional study, nitrate as N (NO3-N) was analyzed in water samples collected from streams throughout 10 Midwestern states during synoptic surveys in 1989, 1990, and 1994. Data from the period immediately following crop planting at 124 sites were analyzed using logistic regression to relate discrete categories of NO3-N concentrations to characteristics of the basins upstream from the sites. The NO3-N data were divided into three categories representing probable background concentrations, elevated concentrations, and concentrations exceeding the maximum contaminant level (MCL) of 10 mg L-1. Nitrate-N concentrations were positively correlated with streamflow, upstream area planted in corn (Zea mays L.), and upstream N-fertilizer application rates. Elevated NO3-N concentrations were associated with poorly drained soils and were weakly correlated with population density. Nitrate-N and streamflow data collected during 1989 and 1990 were used to calibrate the model, and data collected during 1994 were used for verification. The model correctly estimated NO3-N concentration categories for 79% of the samples in the calibration data set and 60% of the samples in the verification data set. The model was used to indicate where NO3-N concentrations might be elevated or exceed the NO3-N MCL in streams throughout the study area. The potential for elevated NO3-N concentrations was predicted to be greatest for streams in Illinois, Indiana, Iowa, and western Ohio.
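A hedged sketch of the kind of categorical model described above, using multinomial logistic regression on synthetic data; the predictor names and all numbers are invented stand-ins, not the authors' calibrated model.

```python
# Sketch: fit a multinomial logistic regression assigning stream samples to
# three NO3-N categories (0 = background, 1 = elevated, 2 = above the MCL)
# from basin characteristics, using a calibration set and a verification set.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)

def simulate(n):
    # Hypothetical standardized predictors: streamflow, % corn, fertilizer rate
    X = rng.normal(size=(n, 3))
    score = 1.5 * X[:, 0] + 1.0 * X[:, 1] + 0.8 * X[:, 2]
    y = np.digitize(score, [-1.0, 1.5])   # thresholds are arbitrary
    return X, y

X_cal, y_cal = simulate(124)   # stands in for the 1989-90 calibration sites
X_ver, y_ver = simulate(60)    # stands in for the 1994 verification data

model = LogisticRegression(max_iter=1000).fit(X_cal, y_cal)
cal_acc = model.score(X_cal, y_cal)   # fraction of categories correctly estimated
ver_acc = model.score(X_ver, y_ver)
```

The paper's 79% calibration vs. 60% verification accuracies illustrate the usual drop when a fitted model is applied to data from a different year.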
NASA/BLM APT, phase 2. Volume 2: Technology demonstration. [Arizona
NASA Technical Reports Server (NTRS)
1981-01-01
Techniques described include: (1) steps in the preprocessing of LANDSAT data; (2) the training of a classifier; (3) maximum likelihood classification and precision; (4) geometric correction; (5) class description; (6) digitizing; (7) digital terrain data; (8) an overview of sample design; (9) allocation and selection of primary sample units; (10) interpretation of secondary sample units; (11) data collection ground plots; (12) data reductions; (13) analysis for productivity estimation and map verification; (14) cost analysis; and (15) LANDSAT digital products. The evaluation of the pre-inventory planning for P.J. is included.
77 FR 64596 - Proposed Information Collection (Income Verification) Activity: Comment Request
Federal Register 2010, 2011, 2012, 2013, 2014
2012-10-22
... DEPARTMENT OF VETERANS AFFAIRS [OMB Control No. 2900-0518] Proposed Information Collection (Income... to income- dependent benefits. DATES: Written comments and recommendations on the proposed collection... techniques or the use of other forms of information technology. Title: Income Verification, VA Form 21-0161a...
Haskew, John; Rø, Gunnar; Saito, Kaori; Turner, Kenrick; Odhiambo, George; Wamae, Annah; Sharif, Shahnaaz; Sugishita, Tomohiko
2015-05-01
Complete and timely health information is essential to inform public health decision-making for maternal and child health, but is often lacking in resource-constrained settings. Electronic medical record (EMR) systems are increasingly being adopted to support the delivery of health care, and are particularly amenable to maternal and child health services. An EMR system could enable the mother and child to be tracked and monitored throughout maternity shared care, improve quality and completeness of data collected and enhance sharing of health information between outpatient clinic and the hospital, and between clinical and public health services to inform decision-making. This study implemented a novel cloud-based electronic medical record system in a maternal and child health outpatient setting in Western Kenya between April and June 2013 and evaluated its impact on improving completeness of data collected by clinical and public health services. The impact of the system was assessed using a two-sample test of proportions pre- and post-implementation of EMR-based data verification. Significant improvements in completeness of the antenatal record were recorded through implementation of EMR-based data verification. A difference of 42.9% in missing data (including screening for hypertension, tuberculosis, malaria, HIV status or ART status of HIV positive women) was recorded pre- and post-implementation. Despite significant impact of EMR-based data verification on data completeness, overall screening rates in antenatal care were low. This study has shown that EMR-based data verification can improve the completeness of data collected in the patient record for maternal and child health. A number of issues, including data management and patient confidentiality, must be considered but significant improvements in data quality are recorded through implementation of this EMR model. Copyright © 2015 Elsevier Ireland Ltd. All rights reserved.
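The two-sample test of proportions mentioned above can be computed directly from the standard pooled z statistic. The record counts below are invented for illustration; the paper reports a 42.9% difference in missing data, not these numbers.

```python
# Minimal sketch of a two-sample (pooled) z test of proportions, as used to
# compare data completeness pre- vs post-implementation of EMR-based
# verification. Counts are hypothetical.
from math import sqrt, erf

def two_proportion_z(x1, n1, x2, n2):
    """Return the z statistic and two-sided p-value for H0: p1 == p2."""
    p1, p2 = x1 / n1, x2 / n2
    p = (x1 + x2) / (n1 + n2)                    # pooled proportion
    se = sqrt(p * (1 - p) * (1 / n1 + 1 / n2))   # pooled standard error
    z = (p1 - p2) / se
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))  # 2 * (1 - Phi(|z|))
    return z, p_value

# Hypothetical counts: records missing a screening field drop from 100/200
# before EMR-based verification to 14/200 after.
z, p = two_proportion_z(100, 200, 14, 200)
```

A significant z here would indicate that the drop in missing data is unlikely to be due to chance, which is the comparison the study's evaluation rests on.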
Federal Register 2010, 2011, 2012, 2013, 2014
2013-07-31
... Information Collection: Federal Labor Standards Payee Verification and Payment Processing AGENCY: Office of the Chief Information Officer, HUD. ACTION: Notice. SUMMARY: HUD has submitted the proposed information collection requirement described below to the Office of Management and Budget (OMB) for review, in...
Onsite Gaseous Centrifuge Enrichment Plant UF6 Cylinder Destructive Analysis
DOE Office of Scientific and Technical Information (OSTI.GOV)
Anheier, Norman C.; Cannon, Bret D.; Qiao, Hong
2012-07-17
The IAEA safeguards approach for gaseous centrifuge enrichment plants (GCEPs) includes measurements of gross, partial, and bias defects in a statistical sampling plan. These safeguard methods consist principally of mass and enrichment nondestructive assay (NDA) verification. Destructive assay (DA) samples are collected from a limited number of cylinders for high precision offsite mass spectrometer analysis. DA is typically used to quantify bias defects in the GCEP material balance. Under current safeguards measures, the operator collects a DA sample from a sample tap following homogenization. The sample is collected in a small UF6 sample bottle, then sealed and shipped under IAEA chain of custody to an offsite analytical laboratory. Current practice is expensive and resource intensive. We propose a new and novel approach for performing onsite gaseous UF6 DA analysis that provides rapid and accurate assessment of enrichment bias defects. DA samples are collected using a custom sampling device attached to a conventional sample tap. A few micrograms of gaseous UF6 is chemically adsorbed onto a sampling coupon in a matter of minutes. The collected DA sample is then analyzed onsite using Laser Ablation Absorption Ratio Spectrometry-Destructive Assay (LAARS-DA). DA results are determined in a matter of minutes at sufficient accuracy to support reliable bias defect conclusions, while greatly reducing DA sample volume, analysis time, and cost.
The U.S. Environmental Protection Agency (EPA) has created the Environmental Technology Verification Program (ETV) to facilitate the deployment of innovative or improved environmental technologies through performance verification and dissemination of information. The goal of the ...
NASA Technical Reports Server (NTRS)
Fries, M. D.; Allen, C. C.; Calaway, M. J.; Evans, C. A.; Stansbery, E. K.
2015-01-01
Curation of NASA's astromaterials sample collections is a demanding and evolving activity that supports valuable science from NASA missions for generations, long after the samples are returned to Earth. For example, NASA continues to loan hundreds of Apollo program samples to investigators every year and those samples are often analyzed using instruments that did not exist at the time of the Apollo missions themselves. The samples are curated in a manner that minimizes overall contamination, enabling clean, new high-sensitivity measurements and new science results over 40 years after their return to Earth. As our exploration of the Solar System progresses, upcoming and future NASA sample return missions will return new samples with stringent contamination control, sample environmental control, and Planetary Protection requirements. Therefore, an essential element of a healthy astromaterials curation program is a research and development (R&D) effort that characterizes and employs new technologies to maintain current collections and enable new missions - an Advanced Curation effort. JSC's Astromaterials Acquisition & Curation Office is continually performing Advanced Curation research, identifying and defining knowledge gaps about research, development, and validation/verification topics that are critical to support current and future NASA astromaterials sample collections. The following are highlighted knowledge gaps and research opportunities.
OSIRIS-REx Touch-and-Go (TAG) Mission Design for Asteroid Sample Collection
NASA Technical Reports Server (NTRS)
May, Alexander; Sutter, Brian; Linn, Timothy; Bierhaus, Beau; Berry, Kevin; Mink, Ron
2014-01-01
The Origins, Spectral Interpretation, Resource Identification, Security-Regolith Explorer (OSIRIS-REx) mission is a NASA New Frontiers mission launching in September 2016 to rendezvous with the near-Earth asteroid Bennu in October 2018. After several months of proximity operations to characterize the asteroid, OSIRIS-REx flies a Touch-And-Go (TAG) trajectory to the asteroid's surface to collect at least 60 g of pristine regolith sample for Earth return. This paper provides mission and flight system overviews, with more detail on the TAG mission design and the key events that must occur to safely and successfully collect the sample. An overview of the navigation performed relative to a chosen sample site, along with the maneuvers to reach the desired site, is described. Safety monitoring during descent is performed with onboard sensors, providing an option to abort, troubleshoot, and try again if necessary. Sample collection occurs using a collection device at the end of an articulating robotic arm during a brief five-second contact period, while a constant-force spring mechanism in the arm assists in rebounding the spacecraft away from the surface. Finally, the sample is measured quantitatively using the law of conservation of angular momentum, along with qualitative data from imagery of the sampling device. Upon sample mass verification, the arm places the sample into the Stardust-heritage Sample Return Capsule (SRC) for return to Earth in September 2023.
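The angular-momentum mass measurement can be illustrated with a back-of-envelope sketch: at (approximately) constant angular momentum, mass added at the end of the arm raises the spacecraft's moment of inertia and lowers its spin rate. This is a simplified point-mass illustration of the principle, not the flight algorithm, and every number below is invented.

```python
# Back-of-envelope illustration of the conservation-of-angular-momentum
# principle behind the sample mass verification. Point-mass approximation;
# all numbers are hypothetical.
def inferred_sample_mass(L, omega_before, omega_after, r_arm):
    """Infer collected mass from the spin-rate change at constant L."""
    I_before = L / omega_before        # I = L / omega
    I_after = L / omega_after
    delta_I = I_after - I_before       # extra inertia contributed by the sample
    return delta_I / r_arm**2          # point mass at the arm tip: I = m * r^2

# Hypothetical numbers: L = 400 kg m^2/s, sampler head 3 m from the spin axis,
# spin rate dropping from 0.100000 to 0.099989 rad/s after collection.
m_kg = inferred_sample_mass(400.0, 0.100000, 0.099989, 3.0)
```

With these invented numbers the inferred mass comes out in the tens of grams, the same order as the 60 g collection requirement, which shows why a tiny spin-rate change suffices for the verification.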
NASA Astrophysics Data System (ADS)
Tatli, Hamza; Yucel, Derya; Yilmaz, Sercan; Fayda, Merdan
2018-02-01
The aim of this study is to develop an algorithm for independent MU/treatment time (TT) verification of non-IMRT treatment plans, as part of a QA program to ensure treatment delivery accuracy. Two radiotherapy delivery units and their treatment planning systems (TPSs) were commissioned at the Liv Hospital Radiation Medicine Center, Tbilisi, Georgia. Beam data were collected according to the vendors' collection guidelines and AAPM report recommendations, and were processed in Microsoft Excel during in-house algorithm development. The algorithm is designed and optimized for calculating SSD and SAD treatment plans based on the AAPM TG-114 dose calculation recommendations, and is coded and embedded in an MS Excel spreadsheet as a preliminary verification algorithm (VA). Treatment verification plans were created by the TPSs based on IAEA TRS-430 recommendations and were also calculated by the VA; point measurements were collected with a solid water phantom and compared. The study showed that the in-house VA can be used for MU/TT verification of non-IMRT plans.
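A minimal sketch, in the spirit of AAPM TG-114, of the kind of independent SSD monitor-unit check such a verification algorithm performs. The spreadsheet formula itself is not published in the abstract; the factor values below are invented placeholders, not commissioned beam data.

```python
# Hedged sketch of an independent MU check for a simple SSD photon field.
# Factor names follow TG-114 conventions; all values are placeholders.
def mu_ssd(dose_cGy, dose_rate_ref, Sc, Sp, pdd_percent):
    """MU = D / (D'_ref * Sc * Sp * PDD/100) for an SSD setup."""
    return dose_cGy / (dose_rate_ref * Sc * Sp * pdd_percent / 100.0)

# Hypothetical plan: 200 cGy at 10 cm depth, reference 10x10 field,
# reference dose rate 1 cGy/MU at d_max, PDD(10 cm) = 66.6%.
mu = mu_ssd(dose_cGy=200.0, dose_rate_ref=1.0, Sc=1.0, Sp=1.0, pdd_percent=66.6)
```

The independent estimate (about 300 MU here) would then be compared against the TPS value within a TG-114-style action level of a few percent.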
Forecasting of cyanobacterial density in Torrão reservoir using artificial neural networks.
Torres, Rita; Pereira, Elisa; Vasconcelos, Vítor; Teles, Luís Oliva
2011-06-01
The ability of general regression neural networks (GRNN) to forecast the density of cyanobacteria in the Torrão reservoir (Tâmega river, Portugal) over a period of 15 days, based on three years of collected physical and chemical data, was assessed. Several models were developed, and 176 were selected based on their correlation values for the verification series. A time lag of 11 was used, equivalent to one sample (sampling periods of 15 days in the summer and 30 days in the winter). Several combinations of the series were used. Input and output data collected from three depths of the reservoir were applied (surface, euphotic zone limit, and bottom). The model with the highest average correlation value achieved correlations of 0.991, 0.843, and 0.978 for the training, verification, and test series, respectively. This model had the three series independent in time: first the test series, then the verification series, and finally the training series. Only six input variables were considered significant to the performance of this model: ammonia, phosphates, dissolved oxygen, water temperature, pH, and water evaporation, with the physical and chemical parameters referring to the three depths of the reservoir. These variables are common to the next four best models produced, and although those included other input variables, their performance was not better than that of the selected best model.
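A GRNN is essentially Nadaraya-Watson Gaussian kernel regression: each prediction is a similarity-weighted average of the training targets. The sketch below is a minimal illustration on synthetic data; the study's real models used six physico-chemical inputs and a 15-day forecast horizon, none of which is reproduced here.

```python
# Minimal sketch of a general regression neural network (GRNN), equivalent
# to Nadaraya-Watson kernel regression with a Gaussian kernel. Synthetic data.
import numpy as np

def grnn_predict(X_train, y_train, X_query, sigma=0.1):
    """Predict each query as the Gaussian-weighted average of training targets."""
    d2 = ((X_query[:, None, :] - X_train[None, :, :]) ** 2).sum(-1)
    w = np.exp(-d2 / (2.0 * sigma**2))      # pattern-layer activations
    return (w @ y_train) / w.sum(axis=1)    # summation / output layers

rng = np.random.default_rng(1)
X = rng.uniform(0.0, 1.0, size=(100, 2))                    # two toy inputs
y = np.sin(2.0 * np.pi * X[:, 0]) + 0.1 * rng.normal(size=100)

pred = grnn_predict(X, y, X, sigma=0.1)     # in-sample smoothed predictions
mae = float(np.mean(np.abs(pred - np.sin(2.0 * np.pi * X[:, 0]))))
```

The single smoothing parameter sigma is what a GRNN search like the one in the study tunes; too small overfits the noise, too large flattens the signal.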
Enrichment Assay Methods Development for the Integrated Cylinder Verification System
DOE Office of Scientific and Technical Information (OSTI.GOV)
Smith, Leon E.; Misner, Alex C.; Hatchell, Brian K.
2009-10-22
International Atomic Energy Agency (IAEA) inspectors currently perform periodic inspections at uranium enrichment plants to verify UF6 cylinder enrichment declarations. Measurements are typically performed with handheld high-resolution sensors on a sampling of cylinders taken to be representative of the facility's entire product-cylinder inventory. Pacific Northwest National Laboratory (PNNL) is developing a concept to automate the verification of enrichment plant cylinders to enable 100 percent product-cylinder verification and potentially, mass-balance calculations on the facility as a whole (by also measuring feed and tails cylinders). The Integrated Cylinder Verification System (ICVS) could be located at key measurement points to positively identify each cylinder, measure its mass and enrichment, store the collected data in a secure database, and maintain continuity of knowledge on measured cylinders until IAEA inspector arrival. The three main objectives of this FY09 project are summarized here and described in more detail in the report: (1) Develop a preliminary design for a prototype NDA system, (2) Refine PNNL's MCNP models of the NDA system, and (3) Procure and test key pulse-processing components. Progress against these tasks to date, and next steps, are discussed.
Federal Register 2010, 2011, 2012, 2013, 2014
2011-07-22
.... SUPPLEMENTARY INFORMATION: RI 38-107, Verification of Who is Getting Payments, is designed for use by the... OFFICE OF PERSONNEL MANAGEMENT Submission for Review: Verification of Who Is Getting Payments, RI... currently approved information collection request (ICR) 3206-0197, Verification of Who is Getting Payments...
[Tobacco quality analysis of producing areas of Yunnan tobacco using near-infrared (NIR) spectrum].
Wang, Yi; Ma, Xiang; Wen, Ya-Dong; Yu, Chun-Xia; Wang, Luo-Ping; Zhao, Long-Lian; Li, Jun-Hui
2013-01-01
In the present study, tobacco quality analysis of different producing areas was carried out using spectrum projection and correlation methods. The industrial classification data set consisted of near-infrared (NIR) spectra, collected in 2010, of leaves from the middle part of the tobacco plant, provided by Hongta Tobacco (Group) Co., Ltd. A total of 1276 superior tobacco leaf samples were collected from four producing areas: three areas in Yunnan province (Yuxi, Chuxiong, and Zhaotong), all of tobacco variety K326, and one area (Dali) of variety Hongda. The results showed that when the samples were divided randomly into two parts in a 2:1 ratio as analysis and verification sets, the verification set corresponded with the analysis set under spectrum projection, because their correlation coefficients for the first and second dimensional projections were all above 0.99. The study also presented a method for obtaining quantitative similarity values between samples from different producing areas. These similarity values are instructive for tobacco planting planning, quality management, acquisition of raw tobacco materials, and tobacco leaf blending.
Li, Chao; Zhang, Yan-po; Guo, Wei-dong; Zhu, Yue; Xu, Jing; Deng, Xun
2010-09-01
Fluorescence excitation-emission matrix (EEM) and absorption spectroscopy were applied to study the optical properties of 29 CDOM samples collected from different ballast tanks of nine international-route vessels anchored in Xiamen Port between October 2007 and April 2008. The purpose was to examine the feasibility of these spectral properties as tracers to verify whether these vessels followed the mid-ocean ballast water exchange (BWE) regulation. Using parallel factor analysis, four fluorescent components were identified, including two humic-like components (C1: 245, 300/386 nm; C2: 250, 345/458 nm) and two protein-like components (C3: 220, 275/306 nm; C4: 235, 290/345 nm), of which the C2 component was the suitable fluorescence verification indicator. The vertical distributions of all fluorescent components in the ballast tanks were nearly uniform, indicating that profile-mixing sampling is preferable. The combined use of the C2 component, the spectral slope ratio (SR) of the absorption spectra, and salinity may provide reasonable verification of whether BWE was carried out by these nine ships. The results suggest that the combined use of multiple parameters (fluorescence, absorption, and salinity) is more reliable for determining the origin of ballast water, providing a technical guarantee for fast examination of ballast water exchange in Chinese ports.
77 FR 291 - Agency Information Collection Activities: Proposed Collection; Comment Request
Federal Register 2010, 2011, 2012, 2013, 2014
2012-01-04
... Verification System (IEVS) Reporting and Supporting Regulations Contained in 42 CFR 431.17, 431.306, 435.910... verifications; Form Number: CMS-R-74 (OCN 0938-0467); Frequency: Monthly; Affected Public: State, Local, or..., issuers offering group health insurance coverage, and self-insured nonfederal governmental plans (through...
Federal Register 2010, 2011, 2012, 2013, 2014
2011-01-12
... (original and update), and verification audit; names of the person(s) who completed the self-assessment... of the self assessment, date of the verification audit report, name of the auditor, signature and... self assessment, (2) conducting a baseline survey of the regulated industry, and (3) obtaining an...
Federal Register 2010, 2011, 2012, 2013, 2014
2012-11-30
... for OMB Review; Comment Request; Placement Verification and Follow-Up of Job Corps Participants ACTION... Training Administration (ETA) sponsored information collection request (ICR) titled, ``Placement Verification and Follow-up of Job Corps Participants,'' to the Office of Management and Budget (OMB) for review...
Global Characterization of Protein Altering Mutations in Prostate Cancer
2011-08-01
prevalence of candidate cancer genes observed here in prostate cancer. (3) Perform integrative analyses of somatic mutation with gene expression and copy...analyses of somatic mutation with gene expression and copy number change data collected on the same samples. Body This is a “synergy” project between...However, to perform initial verification/validation studies, we have evaluated the mutation calls for several genes discovered initially by the
VEG-01: Veggie Hardware Verification Testing
NASA Technical Reports Server (NTRS)
Massa, Gioia; Newsham, Gary; Hummerick, Mary; Morrow, Robert; Wheeler, Raymond
2013-01-01
The Veggie plant/vegetable production system is scheduled to fly on ISS at the end of 2013. Since much of the technology associated with Veggie has not been previously tested in microgravity, a hardware validation flight was initiated. This test will allow data to be collected about Veggie hardware functionality on ISS, allow crew interactions to be vetted for future improvements, validate the ability of the hardware to grow and sustain plants, and collect data that will be helpful to future Veggie investigators as they develop their payloads. Additionally, food safety data on the lettuce plants grown will be collected to help support the development of a pathway for the crew to safely consume produce grown on orbit. Significant background research has been performed on the Veggie plant growth system, with early tests focusing on the development of the rooting pillow concept and the selection of fertilizer, rooting medium and plant species. More recent testing has been conducted to integrate the pillow concept into the Veggie hardware and to ensure that adequate water is provided throughout the growth cycle. Seed sanitation protocols have been established for flight, and hardware sanitation between experiments has been studied. Methods for shipping and storage of rooting pillows have been established, along with crew procedures and crew training videos for plant activities on orbit. Science verification testing was conducted: lettuce plants were successfully grown in prototype Veggie hardware, microbial samples were taken, and plants were harvested, frozen, stored and later analyzed for microbial growth, nutrients, and ATP levels. An additional verification test, prior to the final payload verification testing, is desired to demonstrate similar growth in the flight hardware and also to test a second set of pillows containing zinnia seeds. Issues with root mat water supply are being resolved, with final testing and flight scheduled for later in 2013.
40 CFR 1065.342 - Sample dryer verification.
Code of Federal Regulations, 2011 CFR
2011-07-01
...)(2) to remove water from the sample gas, verify the performance upon installation, after major... before the sample gas reaches the analyzer. For example water can negatively interfere with a CLD's NOX... time. You may run this verification on the sample dryer alone, but you must use the maximum gas flow...
40 CFR 1065.342 - Sample dryer verification.
Code of Federal Regulations, 2010 CFR
2010-07-01
...)(2) to remove water from the sample gas, verify the performance upon installation, after major... before the sample gas reaches the analyzer. For example water can negatively interfere with a CLD's NOX... time. You may run this verification on the sample dryer alone, but you must use the maximum gas flow...
Dukić, Lora; Simundić, Ana-Maria; Malogorski, Davorin
2014-01-01
The sample type recommended by the manufacturer for the Abbott digoxin assay is either serum collected in glass tubes or plasma (sodium heparin, lithium heparin, citrate, EDTA or oxalate as anticoagulant) collected in plastic tubes. In our hospital, samples are collected in plastic tubes. Our hypothesis was that serum samples collected in plastic serum tubes can be used interchangeably with plasma samples for measurement of digoxin concentration; our aim was verification of plastic serum tubes for determination of digoxin concentration. Digoxin concentration was determined simultaneously in 26 venous blood plasma (plastic Vacuette, LH Lithium heparin) and serum (plastic Vacuette, Z Serum Clot activator; both Greiner Bio-One GmbH, Kremsmünster, Austria) samples on an Abbott AxSYM analyzer using the original Abbott Digoxin III assay (Abbott, Wiesbaden, Germany). Tube comparability was assessed using Passing-Bablok regression and a Bland-Altman plot. Serum and plasma digoxin concentrations are comparable: the Passing-Bablok intercept (0.08 [95% CI = -0.10 to 0.20]) and slope (0.99 [95% CI = 0.92 to 1.11]) showed no constant or proportional error. Blood samples drawn in plastic serum tubes and plastic plasma tubes can therefore be used interchangeably for determination of digoxin concentration.
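Passing-Bablok regression, the comparison method used in this abstract, estimates the slope as a shifted median of all pairwise slopes. The sketch below is a simplified version (the full method's offset correction for negative slopes and its confidence intervals are omitted), applied to hypothetical paired serum/plasma values; a slope near 1 and intercept near 0 indicate no proportional or constant error.

```python
from statistics import median

def passing_bablok(x, y):
    # Simplified Passing-Bablok fit: slope = median of all pairwise slopes,
    # intercept = median of y - slope * x. The full procedure also applies
    # an offset correction for ties/negative slopes, omitted here.
    slopes = [
        (y[j] - y[i]) / (x[j] - x[i])
        for i in range(len(x))
        for j in range(i + 1, len(x))
        if x[j] != x[i]
    ]
    b = median(slopes)
    a = median([yi - b * xi for xi, yi in zip(x, y)])
    return a, b

# Hypothetical paired digoxin results (serum vs. plasma), in ug/L.
serum  = [0.5, 0.8, 1.1, 1.4, 1.7, 2.0, 2.3, 2.6]
plasma = [0.52, 0.79, 1.12, 1.38, 1.71, 2.02, 2.28, 2.61]
a, b = passing_bablok(serum, plasma)
print(f"intercept={a:.3f}, slope={b:.3f}")  # near 0 and 1 for comparable methods
```

In the published study, the decision of interchangeability additionally rests on the 95% confidence intervals for both parameters containing 0 and 1 respectively.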
NASA Technical Reports Server (NTRS)
Johnson, Kenneth L.; White, K. Preston, Jr.
2012-01-01
The NASA Engineering and Safety Center was requested to improve on the Best Practices document produced for the NESC assessment, Verification of Probabilistic Requirements for the Constellation Program, by giving a recommended procedure for using acceptance sampling by variables techniques. This recommended procedure would be used as an alternative to the potentially resource-intensive acceptance sampling by attributes method given in the document. This document contains the outcome of the assessment.
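Acceptance sampling by variables, the technique this assessment recommends, typically accepts a lot when the sample mean clears a specification limit by k sample standard deviations. The sketch below is a generic illustration of that criterion, not the NESC procedure itself; the limit, data and k value are hypothetical.

```python
from statistics import fmean, stdev

def accept_by_variables(sample, lower_limit, k):
    # Variables (k-method) acceptance criterion against a lower spec limit:
    # accept when mean - k * s >= lower_limit. The constants n and k are
    # chosen from producer/consumer risk targets (not derived here).
    return fmean(sample) - k * stdev(sample) >= lower_limit

# Hypothetical strength measurements against a lower limit of 90 with k = 2.
good = [100.2, 99.5, 101.1, 100.7, 98.9, 100.4]
marginal = [92.0, 95.5, 90.1, 97.3, 89.8, 94.2]
print(accept_by_variables(good, 90, 2))      # True: well above the limit
print(accept_by_variables(marginal, 90, 2))  # False: too close and too variable
```

Because the criterion exploits the measured values rather than just pass/fail counts, variables plans generally reach a given protection level with far fewer test articles than attributes plans, which is the resource saving the assessment targets.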
The Learner Verification of Series r: The New Macmillan Reading Program; Highlights.
ERIC Educational Resources Information Center
National Evaluation Systems, Inc., Amherst, MA.
National Evaluation Systems, Inc., has developed curriculum evaluation techniques, in terms of learner verification, which may be used to help the curriculum-development efforts of publishing companies, state education departments, and universities. This document includes a summary of the learner-verification approach, with data collected about a…
75 FR 51821 - Agency Information Collection Activities: Submission for OMB Review; Comment Request
Federal Register 2010, 2011, 2012, 2013, 2014
2010-08-23
... Verification form, the Employment Verification and Community Site Information form, the Payment Information Form, the Authorization to Release Information form and the Self-Certification Form. Once health...,035 Employment Verification and 5,175 1 5,175 .75 3,881 Community Site Information Form Loan...
78 FR 18305 - Notice of Request for Extension of a Currently Approved Information Collection
Federal Register 2010, 2011, 2012, 2013, 2014
2013-03-26
... Identity Verification (PIV) Request for Credential, the USDA Homeland Security Presidential Directive 12... consists of two phases of implementation: Personal Identity Verification phase I (PIV I) and Personal Identity Verification phase II (PIV II). The information requested must be provided by Federal employees...
Gómez, Angel; Seyle, D Conor; Huici, Carmen; Swann, William B
2009-12-01
Recent research has demonstrated self-verification strivings in groups, such that people strive to verify collective identities, which are personal self-views (e.g., "sensitive") associated with group membership (e.g., "women"). Such demonstrations stop short of showing that the desire for self-verification can fully transcend the self-other barrier, as in people working to verify ingroup identities (e.g., "Americans are loud") even when such identities are not self-descriptive ("I am quiet and unassuming"). Five studies focus on such ingroup verification strivings. Results indicate that people prefer to interact with individuals who verify their ingroup identities over those who enhance these identities (Experiments 1-5). Strivings for ingroup identity verification were independent of the extent to which the identities were self-descriptive but were stronger among participants who were highly invested in their ingroup identities, as reflected in high certainty of these identities (Experiments 1-4) and high identification with the group (Experiments 1-5). In addition, whereas past demonstrations of self-verification strivings have been limited to efforts to verify the content of identities (Experiments 1-3), the findings also show that people strive to verify the valence of their identities (i.e., the extent to which the identities are valued; Experiments 4 and 5). Self-verification strivings, rather than self-enhancement strivings, appeared to motivate participants' strivings for ingroup identity verification. Links to collective self-verification strivings and social identity theory are discussed.
Federal Register 2010, 2011, 2012, 2013, 2014
2013-05-30
... Proposed Information Collection to OMB; Enterprise Income Verification (EIV) System User Access Authorization Form and Rules of Behavior and User Agreement AGENCY: Office of the Chief Information Officer, HUD... user with information related to the Rules of Behavior for system usage and the user's responsibilities...
Federal Register 2010, 2011, 2012, 2013, 2014
2013-05-13
... Proposed Information Collection to OMB Enterprise Income Verification (EIV) System--Debts Owed to Public... of Management and Budget (OMB) for review, as required by the Paperwork Reduction Act. HUD is... Management and Budget, New Executive Office Building, Washington, DC 20503; fax: 202-395-5806. Email: OIRA...
Federal Register 2010, 2011, 2012, 2013, 2014
2013-03-04
... Public Housing Agency (PHA). The information is used by PHAs to determine a family's suitability for... Information Collection for Public Comment Enterprise Income Verification (EIV) Systems--Debts Owed to Public Housing Agencies and Terminations AGENCY: Office of the Assistance Secretary for Public and Indian Housing...
DOE Office of Scientific and Technical Information (OSTI.GOV)
Weaver, Phyllis C.
A team from ORAU's Independent Environmental Assessment and Verification Program performed verification survey activities on the South Test Tower and four Hazardous Waste Storage Lockers. Scan data collected by ORAU determined that both the alpha and alpha-plus-beta activity was representative of radiological background conditions. The count rate distribution showed no outliers that would be indicative of alpha or alpha-plus-beta count rates in excess of background. It is the opinion of ORAU that the independent verification data collected support the site's conclusions that the South Tower and Lockers sufficiently meet the site criteria for release to recycle and reuse.
[The Dose Effect of Isocenter Selection during IMRT Dose Verification with the 2D Chamber Array].
Xie, Chuanbin; Cong, Xiaohu; Xu, Shouping; Dai, Xiangkun; Wang, Yunlai; Han, Lu; Gong, Hanshun; Ju, Zhongjian; Ge, Ruigang; Ma, Lin
2015-03-01
To investigate the dose effect of isocenter selection during IMRT dose verification with a 2D chamber array, IMRT plans were designed for samples collected from 10 patients, with the isocenter independently defined as P(o), P(x) and P(y). P(o) was fixed at the target center, and the other points were shifted 8 cm from the target center in the x/y directions. The PTW 729 array was used for 2D dose verification in the three groups, with all beams of the plans set to 0 degrees. Gamma-analysis passing rates for the whole plan and for each beam were obtained using different criteria in the three groups. The results showed that the mean gamma-analysis passing rate was highest in the P(o) group, and that the mean passing rate of the whole plan was better than that of the individual beams; in addition, the passing rate worsened with increasing dose leakage between the leaves in the P(y) group. Therefore, the choice of isocenter has a visible effect on IMRT dose verification with a 2D chamber array, and the isocenter of the planning design should be close to the geometric center of the target.
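The gamma-analysis passing rate used in this abstract combines a dose-difference tolerance with a distance-to-agreement tolerance. The following is a deliberately simplified 1-D global gamma sketch (real arrays are 2-D and the study's criteria are not stated here, so the 3%/3 mm defaults and the profiles are illustrative only).

```python
from math import hypot

def gamma_pass_rate(ref, meas, spacing=1.0, dta=3.0, dd=0.03):
    # Simplified 1-D global gamma analysis: for each reference point, find
    # the minimum combined dose-difference / distance-to-agreement metric
    # over all measured points; a point passes if that minimum is <= 1.
    # spacing and dta are in mm; dd is a fraction of the max reference dose.
    dmax = max(ref)
    passed = 0
    for i, dr in enumerate(ref):
        g = min(
            hypot((j - i) * spacing / dta, (dm - dr) / (dd * dmax))
            for j, dm in enumerate(meas)
        )
        passed += g <= 1.0
    return passed / len(ref)

profile = [10, 30, 70, 100, 70, 30, 10]        # reference dose profile (a.u.)
shifted = [dose * 1.02 for dose in profile]    # 2% global dose offset
print(gamma_pass_rate(profile, profile))       # identical profiles pass fully
print(gamma_pass_rate(profile, shifted))       # 2% offset still within 3%/3 mm
```

Shifting the measurement isocenter effectively adds a spatial offset between `ref` and `meas`, which is why the passing rate degrades away from P(o) in the study.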
Automated verification of flight software. User's manual
NASA Technical Reports Server (NTRS)
Saib, S. H.
1982-01-01
AVFS (Automated Verification of Flight Software), a collection of tools for analyzing source programs written in FORTRAN and AED, is documented. The quality and the reliability of flight software are improved by: (1) indented listings of source programs, (2) static analysis to detect inconsistencies in the use of variables and parameters, (3) automated documentation, (4) instrumentation of source code, (5) retesting guidance, (6) analysis of assertions, (7) symbolic execution, (8) generation of verification conditions, and (9) simplification of verification conditions. Use of AVFS in the verification of flight software is described.
Verifying the Comprehensive Nuclear-Test-Ban Treaty by Radioxenon Monitoring
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ringbom, Anders
2005-05-24
The current status of the ongoing establishment of a verification system for the Comprehensive Nuclear-Test-Ban Treaty using radioxenon detection is discussed. As an example of equipment used in this application the newly developed fully automatic noble gas sampling and detection system SAUNA is described, and data collected with this system are discussed. It is concluded that the most important remaining scientific challenges in the field concern event categorization and meteorological backtracking.
Alternative sample sizes for verification dose experiments and dose audits
NASA Astrophysics Data System (ADS)
Taylor, W. A.; Hansen, J. M.
1999-01-01
ISO 11137 (1995), "Sterilization of Health Care Products—Requirements for Validation and Routine Control—Radiation Sterilization", provides sampling plans for performing initial verification dose experiments and quarterly dose audits. Alternative sampling plans are presented which provide equivalent protection and can significantly reduce the cost of testing. These alternative sampling plans have been included in a draft ISO Technical Report (type 2). This paper examines the rationale behind the proposed alternative sampling plans. The protection provided by the current verification and audit sampling plans is first examined; then methods for identifying equivalent plans are highlighted; finally, methods for comparing the costs associated with the different plans are provided. This paper includes additional guidance, not included in the technical report, for selecting between the original and alternative sampling plans.
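Equivalence between sampling plans of the kind discussed here is usually judged from their operating-characteristic (OC) curves: the probability of acceptance as a function of the true nonconforming fraction, computed from the binomial distribution. The plan parameters (n, c) below are hypothetical illustrations, not the ISO 11137 values.

```python
from math import comb

def accept_prob(n, c, p):
    # Binomial OC value: probability that a sample of n units contains
    # at most c positives when the true positive fraction is p.
    return sum(comb(n, k) * p**k * (1 - p) ** (n - k) for k in range(c + 1))

# Compare a larger plan with a hypothetical smaller alternative at several
# true positive fractions; similar columns indicate similar protection.
for p in (0.01, 0.05, 0.10):
    print(f"p={p:.2f}  n=100,c=2: {accept_prob(100, 2, p):.3f}"
          f"  n=55,c=1: {accept_prob(55, 1, p):.3f}")
```

Two plans are regarded as providing equivalent protection when their OC curves nearly coincide over the positive fractions of interest, which is the basis on which a cheaper (smaller-n) plan can replace the original.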
40 CFR 1065.545 - Verification of proportional flow control for batch sampling.
Code of Federal Regulations, 2014 CFR
2014-07-01
... control for batch sampling. 1065.545 Section 1065.545 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) AIR POLLUTION CONTROLS ENGINE-TESTING PROCEDURES Performing an Emission Test Over Specified Duty Cycles § 1065.545 Verification of proportional flow control for batch sampling. For any...
76 FR 60829 - Information Collection Being Reviewed by the Federal Communications Commission
Federal Register 2010, 2011, 2012, 2013, 2014
2011-09-30
... Authorization-Verification (Retention of Records). Form No.: N/A. Type of Review: Extension of a currently... verification, the responsible party, as shown in 47 CFR 2.909 shall maintain the records listed as follows: (1... laboratory, company, or individual performing the verification testing. The Commission may request additional...
Federal Register 2010, 2011, 2012, 2013, 2014
2011-12-05
...: 3060-0329. Title: Section 2.955, Equipment Authorization-Verification (Retention of Records). Form No.... Section 2.955 describes for each equipment device subject to verification, the responsible party, as shown... performing the verification testing. The Commission may request additional information regarding the test...
78 FR 6849 - Agency Information Collection (Verification of VA Benefits) Activity Under OMB Review
Federal Register 2010, 2011, 2012, 2013, 2014
2013-01-31
... (Verification of VA Benefits) Activity Under OMB Review AGENCY: Veterans Benefits Administration, Department of... ``OMB Control No. 2900-0406.'' SUPPLEMENTARY INFORMATION: Title: Verification of VA Benefits, VA Form 26... eliminate unlimited versions of lender- designed forms. The form also informs the lender whether or not the...
Cleaning and Cleanliness Verification Techniques for Mars Returned Sample Handling
NASA Technical Reports Server (NTRS)
Mickelson, E. T.; Lindstrom, D. J.; Allton, J. H.; Hittle, J. D.
2002-01-01
Precision cleaning and cleanliness verification techniques are examined as a subset of a comprehensive contamination control strategy for a Mars sample return mission. Additional information is contained in the original extended abstract.
Wang, Yi; Xiang, Ma; Wen, Ya-Dong; Yu, Chun-Xia; Wang, Luo-Ping; Zhao, Long-Lian; Li, Jun-Hui
2012-11-01
In this study, tobacco quality across the main industrial classifications of different years was analyzed using spectrum projection and correlation methods. The data were near-infrared (NIR) spectra from Hongta Tobacco (Group) Co., Ltd.: 5730 industrially classified tobacco leaf samples from Yuxi in Yunnan province, collected by near-infrared spectroscopy from 2007 to 2010, spanning different stalk positions and colors and all belonging to the tobacco variety Hongda. When the samples of a given year were randomly divided in a 2:1 ratio into analysis and verification sets, the verification set corresponded with the analysis set under spectrum projection, with correlation coefficients above 0.98. The correlation coefficients between different years under spectrum projection were above 0.97; the highest was between 2008 and 2009, and the lowest between 2007 and 2010. The study also presents a method for obtaining quantitative similarity values between samples of different industrial classifications. These similarity and consistency values are instructive for the combination and replacement of tobacco leaves in blending.
Biometric verification in dynamic writing
NASA Astrophysics Data System (ADS)
George, Susan E.
2002-03-01
Pen-tablet devices capable of capturing the dynamics of writing record temporal and pressure information as well as the spatial pattern. This paper explores biometric verification based upon the dynamics of writing, where writers are distinguished not on the basis of what they write (i.e., the signature), but how they write. We collected samples of dynamic writing from 38 Chinese writers. Each writer was asked to provide 10 copies of a paragraph of text and the same number of signature samples. From the data we extracted stroke-based primitives from the sentence data, utilizing pen-up/down information and heuristic rules about the shape of the character. The x, y and pressure values of each primitive were interpolated onto an even temporal range based upon a 20 ms sampling rate. We applied the Daubechies 1 wavelet transform to the x, y and pressure signals, using the coefficients as inputs to a multi-layer perceptron trained with back-propagation on the sentence data. We found a sensitivity of 0.977 and specificity of 0.990 recognizing writers from test primitives extracted from sentence data, and measures of 0.916 and 0.961, respectively, from test primitives extracted from signature data.
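The Daubechies 1 wavelet named in the abstract is the Haar wavelet, whose single-level transform is just scaled pairwise sums and differences. A minimal sketch follows, applied to a hypothetical resampled pressure signal; the 38-writer data and the perceptron itself are not reproduced here.

```python
def haar_step(signal):
    # One level of the Haar (Daubechies-1) transform: orthonormal pairwise
    # averages (approximation) and differences (detail). Assumes an even
    # number of samples, as produced by resampling onto a fixed 20 ms grid.
    s = 2 ** -0.5
    approx = [s * (signal[i] + signal[i + 1]) for i in range(0, len(signal), 2)]
    detail = [s * (signal[i] - signal[i + 1]) for i in range(0, len(signal), 2)]
    return approx, detail

# Hypothetical pressure samples from one stroke primitive (even length).
pressure = [0.10, 0.30, 0.55, 0.60, 0.58, 0.40, 0.20, 0.05]
approx, detail = haar_step(pressure)
# The coefficients from the x, y and pressure channels would be concatenated
# and fed to the multi-layer perceptron as the feature vector.
print(len(approx), len(detail))  # each half the input length
```

Because the transform is orthonormal, the coefficients preserve the signal's energy while separating its coarse shape (approximation) from fine stroke dynamics (detail).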
Starting a European Space Agency Sample Analogue Collection for Robotic Exploration Missions
NASA Astrophysics Data System (ADS)
Smith, C. L.; Mavris, C.; Michalski, J. R.; Rumsey, M. S.; Russell, S. S.; Jones, C.; Schroeven-Deceuninck, H.
2015-12-01
The Natural History Museum is working closely with the European Space Agency (ESA) and the UK Space Agency to develop a European collection of analogue materials with appropriate physical/mechanical and chemical (mineralogical) properties which can support the development and verification of both spacecraft and scientific systems for potential science and exploration missions to Phobos/Deimos, Mars, C-type asteroids and the Moon. As an ESA Collection it will be housed at the ESA Centre based at Harwell, UK. The "ESA Sample Analogues Collection" will be composed of both natural and artificial materials chosen to (as closely as possible) replicate the surfaces and near-surfaces of different Solar System target bodies of exploration interest. The analogue samples will be fully characterised in terms of both their physical/mechanical properties (compressive strength, bulk density, grain shape, grain size, cohesion and angle of internal friction) and their chemical/mineralogical properties (texture, modal mineralogy, bulk chemical composition - major, minor and trace elements and individual mineralogical compositions). The Collection will be fully curated to international standards including implementation of a user-friendly database and will be available for use by engineers and scientists across the UK and Europe. Enhancement of the initial Collection will be possible through collaborations with other ESA and UK Space Agency supported activities, such as the acquisition of new samples during field trials.
The International Space Station Urine Monitoring System (UMS)
NASA Technical Reports Server (NTRS)
Feeback, Daniel L.; Cibuzar, Branelle R.; Milstead, Jeffery R.; Pietrzyk, Robert A.; Clark, Mark S. F.
2009-01-01
A device capable of making in-flight volume measurements of single-void urine samples, the Urine Monitoring System (UMS), was developed and flown on seven U.S. Space Shuttle missions. This device provided volume data for each urine void from multiple crewmembers and allowed samples of each to be taken and returned to Earth for post-flight analysis. There were a number of design flaws in the original instrument, including liquid carry-over that produced invalid "actual" micturition volumes, and cross-contamination between successive users from residual urine in "dead spots". Additionally, high- or low-volume voids could not be accurately measured, the on-orbit calibration and nominal use sequence was time intensive, and the unit had to be returned and disassembled to retrieve the volume data. These problems have been resolved in a new version, the International Space Station (ISS) UMS, which has been designed to provide real-time in-flight volume data with accuracy and precision equivalent to measurements made on Earth, and the ability to provide urine samples that are unadulterated by the device. Originally conceived to interface with a U.S.-built Waste Collection System (WCS), the unit has now been modified to interface with the Russian-supplied Sanitary Hygiene Device (ASY). The ISS UMS provides significant advantages over the current method of collecting urine samples into Urine Collection Devices (UCDs), from which samples are removed and returned to Earth for analyses. A significant future advantage of the UMS is that it can provide an interface to analytical instrumentation for real-time measurement of urine bioanalytes, allowing monitoring of crewmember health status during flight and the ability to provide medical interventions based on the results of these measurements. Currently, the ISS UMS is scheduled to launch along with Node-3 on STS-130 (20A) in December 2009.
UMS will be installed and scientific/functional verification completed prior to placing the instrument into operation. Samples collected during the verification sequence will be returned for analyses on STS-131 (19A) currently scheduled for launch in March 2010. The presence of a UMS on ISS will provide the capability to conduct additional collaborative human life science investigations among the ISS International Partners.
Formal verification of medical monitoring software using Z language: a representative sample.
Babamir, Seyed Morteza; Borhani, Mehdi
2012-08-01
Medical monitoring systems are useful aids that assist physicians in keeping patients under constant surveillance; however, whether such systems take sound decisions is a physician's concern. As a result, verification of the systems' behavior in monitoring patients is a matter of significance. In modern medical systems, patient monitoring is undertaken by software, so software verification of modern medical systems has received attention. Such verification can be achieved with formal languages, which have mathematical foundations. Among others, the Z language is a suitable formal language that has been used for formal verification of systems. This study presents a constructive method to verify a representative sample of a medical system, in which the system is visually specified and formally verified against patient constraints stated in the Z language. Exploiting our past experience in formally modeling the Continuous Infusion Insulin Pump (CIIP), we take the CIIP system, which is responsible for monitoring a diabetic's blood sugar, as a representative sample of medical systems in the present study.
The U.S. Environmental Protection Agency (EPA) has created the Environmental Technology Verification Program (ETV) to facilitate the deployment of innovative or improved environmental technologies through performance verification and dissemination of information. The goal of the...
Enhancing pre-service physics teachers' creative thinking skills through HOT lab design
NASA Astrophysics Data System (ADS)
Malik, Adam; Setiawan, Agus; Suhandi, Andi; Permanasari, Anna
2017-08-01
A study on the implementation of the HOT (Higher Order Thinking) Laboratory has been carried out. This research aimed to compare the increase in creative thinking skills of pre-service physics teachers who received physics lessons with the HOT Lab design versus a verification lab, for the topic of electric circuits. The research used a quasi-experimental method with a control-group pretest-posttest design. The subjects were 40 pre-service physics teachers in Physics Education at UIN Sunan Gunung Djati Bandung; samples were selected by a class random sampling technique. Data on the pre-service teachers' creative thinking skills were collected using a test of creative thinking skills in essay form. The results reveal average N-gains in creative thinking skills of 0.69 for pre-service teachers who received lessons with the HOT Lab design and 0.39 for those who received lessons with the verification lab. We therefore conclude that the HOT Lab design is more effective at increasing creative thinking skills in lessons on electric circuits.
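The N-gain values reported in such pretest-posttest studies follow Hake's normalized gain: the fraction of the available improvement that was actually achieved. A minimal sketch, with hypothetical score values chosen only to reproduce gains of the reported magnitude:

```python
def normalized_gain(pre, post, max_score=100):
    # Hake's normalized gain <g> = (post - pre) / (max - pre):
    # achieved improvement divided by the maximum possible improvement.
    # Conventional categories: low < 0.3, medium 0.3-0.7, high > 0.7.
    return (post - pre) / (max_score - pre)

# Hypothetical mean class scores illustrating the two reported gains.
print(round(normalized_gain(35, 80), 2))    # 0.69
print(round(normalized_gain(35, 60.4), 2))  # 0.39
```

Normalizing by the room left for improvement makes gains comparable between groups that start from different pretest levels, which is why the measure is preferred over a raw score difference.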
NASA Astrophysics Data System (ADS)
Malik, A.; Setiawan, A.; Suhandi, A.; Permanasari, A.; Dirgantara, Y.; Yuniarti, H.; Sapriadil, S.; Hermita, N.
2018-01-01
This study aimed to investigate the improvement in pre-service teachers' communication skills through the Higher Order Thinking Laboratory (HOT Lab) on the electric circuit topic. The research used a quasi-experimental method with a pretest-posttest control-group design. The subjects were 60 Physics Education students at UIN Sunan Gunung Djati Bandung, chosen by a random sampling technique. Communication skill data were collected using a communication skills test instrument in essay form together with observation sheets. The results showed that pre-service teachers' communication skills with the HOT Lab were higher than with the verification lab, and that communication skills in the HOT Lab groups were not influenced by gender. Communication skills can increase under the HOT Lab because it is based on problem solving, which develops communication through hands-on activities. The conclusion is that the HOT Lab is more effective than the verification lab at improving pre-service teachers' communication skills on the electric circuit topic, and that gender is not related to a person's communication skills.
Tuerxunyiming, Muhadasi; Xian, Feng; Zi, Jin; Yimamu, Yilihamujiang; Abuduwayite, Reshalaiti; Ren, Yan; Li, Qidan; Abudula, Abulizi; Liu, SiQi; Mohemaiti, Patamu
2018-01-05
Maturity-onset diabetes of the young (MODY) is an inherited monogenic type of diabetes. Genetic mutations in MODY often cause nonsynonymous changes that directly lead to functional distortion of proteins and pathological consequences. Herein, we proposed that the inherited mutations found in a MODY family could cause a disturbance of protein abundance, specifically in serum. Serum samples were collected from a Uyghur MODY family across three generations, and the serum proteins, after depletion treatment, were examined by quantitative proteomics to characterize MODY-related serum proteins, followed by verification using targeted quantitative proteomics. A total of 32 serum proteins were preliminarily identified as MODY-related. Further verification tests on the individual samples demonstrated 12 candidates with significantly different abundances in the MODY patients. A comparison of the 12 proteins among the sera of type 1 diabetes, type 2 diabetes, MODY, and healthy subjects revealed a MODY-related protein signature composed of serum proteins such as SERPINA7, APOC4, LPA, C6, and F5.
The U.S. Environmental Protection Agency has created the Environmental Technology Verification Program to facilitate the deployment of innovative or improved environmental technologies through performance verification and dissemination of information. The goal of the ETV Program...
2007-03-01
Characterisation. In Nanotechnology Aerospace Applications – 2006 (pp. 4-1 – 4-8). Educational Notes RTO-EN-AVT-129bis, Paper 4. Neuilly-sur-Seine, France: RTO. [Slide residue: commercialisation-process stages (Concept, IDEA, Proof-of-Principle, Trial Samples, Engineering Verification Samples, Design Verification Samples) and SEIC systems engineering for commercialisation, spanning design houses, engineering and R&D, users and integrators, and fabs and wafer processing.]
Federal Register 2010, 2011, 2012, 2013, 2014
2012-04-06
... solicits comments on information needed to issue a Personal Identity Verification (PIV) identification card... Personnel Security and Identity Management (07C), Department of Veterans Affairs, 810 Vermont Avenue NW...
DOE Office of Scientific and Technical Information (OSTI.GOV)
Monson, Lawrence M.
2002-04-24
The following work was performed: (1) collected reconnaissance micro-magnetic data and background field data for Area 1, (2) identified and collected soil sample data in three anomalous regions of Area 1, (3) sampled soils in Northwest Poplar Oil Field, (4) graphed, mapped, and interpreted all data areas listed above, (5) registered for the AAPG Penrose Conference on Hydrocarbon Seepage Mechanisms and Migration (postponed from 9/16/01 until 4/7/02 in Vancouver, B.C.). Results include the identification and confirmation of an oil and gas prospect in the northwest part of Area 1 and the verification of a potential shallow gas prospect in the West Poplar Area. Correlation of hydrocarbon micro-seepage to TM tonal anomalies needs further data analysis.
The U.S. EPA has created the Environmental Technology Verification program to facilitate the deployment of innovative or improved environmental technologies through performance verification and dissemination of information. The program tested the performance of baghouse filtrati...
The U.S. Environmental Protection Agency (EPA) has created the Environmental Technology Verification Program (ETV) to facilitate the deployment of innovative or improved environmental technologies through performance verification and dissemination of information. The goal of the...
A Rubric for Extracting Idea Density from Oral Language Samples
Chand, Vineeta; Baynes, Kathleen; Bonnici, Lisa M.; Farias, Sarah Tomaszewski
2012-01-01
While past research has demonstrated that low idea density (ID) scores from natural language samples correlate with late life risk for cognitive decline and Alzheimer’s disease pathology, there are no published rubrics for collecting and analyzing language samples for idea density to verify or extend these findings into new settings. This paper outlines the history of ID research and findings, discusses issues with past rubrics, and then presents an operationalized method for the systematic measurement of ID in language samples, with an extensive manual available as a supplement to this article (Analysis of Idea Density, AID). Finally, reliability statistics for this rubric in the context of dementia research on aging populations and verification that AID can replicate the significant association between ID and late life cognition are presented. PMID:23042498
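As a toy illustration only (the AID rubric described above relies on trained manual propositional analysis, not automation): idea density is conventionally reported as propositions per 10 words, and automated approximations such as CPIDR typically count verbs, adjectives, adverbs, prepositions, and conjunctions as propositions. A minimal sketch under that assumption, with a hand-picked stand-in word list rather than a real part-of-speech tagger:

```python
def idea_density(words, proposition_words):
    """Idea density: propositions per 10 words.

    `proposition_words` is a hand-picked stand-in for real part-of-speech
    tagging (verbs, adjectives, adverbs, prepositions, conjunctions)."""
    propositions = sum(1 for w in words if w.lower() in proposition_words)
    return 10.0 * propositions / len(words)

# Illustrative word list only -- not a validated tagger.
props = {"ran", "quickly", "and", "over", "old"}
text = "the old dog ran quickly and jumped over the fence".split()
print(idea_density(text, props))  # 5.0
```

The sketch shows only the arithmetic of the measure; any serious replication would follow the AID manual's propositional rules.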
Results of the performance verification of the CoaguChek XS system.
Plesch, W; Wolf, T; Breitenbeck, N; Dikkeschei, L D; Cervero, A; Perez, P L; van den Besselaar, A M H P
2008-01-01
This is the first paper reporting a performance verification study of a point-of-care (POC) monitor for prothrombin time (PT) testing according to the requirements given in chapter 8 of the International Organization for Standardization (ISO) 17593:2007 standard "Clinical laboratory testing and in vitro medical devices - Requirements for in vitro monitoring systems for self-testing of oral anticoagulant therapy". The monitor under investigation was the new CoaguChek XS system which is designed for use in patient self testing. Its detection principle is based on the amperometric measurement of the thrombin activity generated by starting the coagulation cascade using a recombinant human thromboplastin. The system performance verification study was performed at four study centers using venous and capillary blood samples on two test strip lots. Laboratory testing was performed from corresponding frozen plasma samples with six commercial thromboplastins. Samples from 73 normal donors and 297 patients on oral anticoagulation therapy were collected. Results were assessed using a refined data set of 260 subjects according to the ISO 17593:2007 standard. Each of the two test strip lots met the acceptance criteria of ISO 17593:2007 versus all thromboplastins (bias -0.19 to 0.18 INR; >97% of data within accuracy limits). The coefficient of variation for imprecision of the PT determinations in INR ranged from 2.0% to 3.2% in venous, and from 2.9% to 4.0% in capillary blood testing. Capillary versus venous INR data showed agreement of results with regression lines equal to the line of identity. The new system demonstrated a high level of trueness and accuracy, and low imprecision in INR testing. It can be concluded that the CoaguChek XS system complies with the requirements in chapter 8 of the ISO standard 17593:2007.
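The bias and imprecision figures quoted above are straightforward summary statistics over paired and replicate INR results. A minimal sketch of how such values are computed, using hypothetical numbers (the full ISO 17593:2007 assessment additionally applies per-pair accuracy limits, which are not reproduced here):

```python
import statistics

def bias(monitor_inr, lab_inr):
    """Mean paired difference between monitor and laboratory INR values."""
    return statistics.mean(m - l for m, l in zip(monitor_inr, lab_inr))

def imprecision_cv(replicates):
    """Coefficient of variation (%) of replicate INR determinations."""
    return 100.0 * statistics.stdev(replicates) / statistics.mean(replicates)

# Hypothetical INR values for illustration:
print(round(bias([2.5, 3.1, 2.0], [2.4, 3.2, 2.1]), 2))   # -0.03
print(round(imprecision_cv([2.4, 2.5, 2.6]), 1))          # 4.0
```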
DOE Office of Scientific and Technical Information (OSTI.GOV)
Barrett, Christopher A.; Martinez, Alonzo; McNamara, Bruce K.
International Atomic Energy Agency (IAEA) safeguards verification measures in gaseous centrifuge enrichment plants (GCEPs) rely on environmental sampling, non-destructive assay (NDA), and destructive assay (DA) sampling and analysis to determine uranium enrichment. UF6 bias defect measurements are made by DA sampling and analysis to assure that enrichment is consistent with declarations. DA samples are collected from a limited number of cylinders for high-precision, offsite mass spectrometer analysis. Samples are typically drawn from a sampling tap into a UF6 sample bottle, then packaged, sealed, and shipped under IAEA chain of custody to an offsite analytical laboratory. Future DA safeguards measures may require improvements in efficiency and effectiveness as GCEP capacities increase and UF6 shipping regulations become increasingly restrictive. The Pacific Northwest National Laboratory (PNNL) DA sampler concept and Laser Ablation Absorption Ratio Spectrometry (LAARS) assay method are under development to potentially provide DA safeguards tools that increase inspection effectiveness and reduce sample shipping constraints. The PNNL DA sampler concept uses a handheld sampler to collect DA samples for either onsite LAARS assay or offsite laboratory analysis. The DA sampler design will use a small sampling planchet coated with an adsorptive film to collect controlled quantities of UF6 gas directly from a cylinder or process sampling tap. Development efforts are currently underway at PNNL to enhance LAARS assay performance to allow high-precision onsite bias defect measurements. In this paper, we report on the experimental investigation to develop adsorptive films for the PNNL DA sampler concept. These films are intended to efficiently capture UF6 and then stabilize the collected DA sample prior to onsite LAARS or offsite laboratory analysis.
Several porous material composite films were investigated, including a film designed to maximize the chemical adsorption and binding of gaseous UF6 onto the sampling planchet.
VERIFYING THE VOC CONTROL PERFORMANCE OF BIOREACTORS
The paper describes the verification testing approach used to collect high-quality, peer-reviewed data on the performance of bioreaction-based technologies for the control of volatile organic compounds (VOCs). The verification protocol that describes the approach for these tests ...
This verification test was conducted according to procedures specified in the Test/QA Plan for Verification of Enzyme-Linked Immunosorbent Assay (ELISA) Test Kits for the Quantitative Determination of Endocrine Disrupting Compounds (EDCs) in Aqueous Phase Samples. Deviations to the...
Lessons from UNSCOM and IAEA regarding remote monitoring and air sampling
DOE Office of Scientific and Technical Information (OSTI.GOV)
Dupree, S.A.
1996-01-01
In 1991, at the direction of the United Nations Security Council, UNSCOM and IAEA developed plans for On-going Monitoring and Verification (OMV) in Iraq. The plans were accepted by the Security Council, and remote monitoring and atmospheric sampling equipment has been installed at selected sites in Iraq. The remote monitoring equipment consists of video cameras and sensors positioned to observe equipment or activities at sites that could be used to support the development or manufacture of weapons of mass destruction, or long-range missiles. The atmospheric sampling equipment provides unattended collection of chemical samples from sites that could be used to support the development or manufacture of chemical weapon agents. To support OMV in Iraq, UNSCOM has established the Baghdad Monitoring and Verification Centre. Imagery from the remote monitoring cameras can be accessed in near-real time from the Centre through RIF communication links with the monitored sites. The OMV program in Iraq has implications for international cooperative monitoring in both global and regional contexts. However, monitoring systems such as those used in Iraq are not sufficient, in and of themselves, to guarantee the absence of prohibited activities. Such systems cannot replace on-site inspections by competent, trained inspectors. However, monitoring similar to that used in Iraq can contribute to openness and confidence building, to the development of mutual trust, and to the improvement of regional stability.
Martinez-Garcia, Elena; Lesur, Antoine; Devis, Laura; Campos, Alexandre; Cabrera, Silvia; van Oostrum, Jan; Matias-Guiu, Xavier; Gil-Moreno, Antonio; Reventos, Jaume; Colas, Eva; Domon, Bruno
2016-08-16
About 30% of endometrial cancer (EC) patients are diagnosed at an advanced stage of the disease, which is associated with a drastic decrease in the 5-year survival rate. The identification of biomarkers in uterine aspirate samples, which are collected by a minimally invasive procedure, would improve early diagnosis of EC. We present a sequential workflow to select, from a list of potential EC biomarkers, those which are the most promising to enter a validation study. After the elimination of confounding contributions by residual blood proteins, 52 potential biomarkers were analyzed in uterine aspirates from 20 EC patients and 18 non-EC controls by a high-resolution accurate mass spectrometer operated in parallel reaction monitoring mode. Differential abundance was observed for 26 biomarkers, and among them ten proteins showed high sensitivity and specificity (AUC > 0.9). The study demonstrates that uterine aspirates are valuable samples for EC protein biomarker screening. It also illustrates the importance of a biomarker verification phase to fill the gap between discovery and validation studies and highlights the benefits of high-resolution mass spectrometry for this purpose. The proteins verified in this study have an increased likelihood of becoming a clinical assay after a subsequent validation phase.
INNOVATIVE TECHNOLOGY VERIFICATION REPORT ...
The EnSys Petro Test System developed by Strategic Diagnostics Inc. (SDI), was demonstrated under the U.S. Environmental Protection Agency Superfund Innovative Technology Evaluation Program in June 2000 at the Navy Base Ventura County site in Port Hueneme, California. The purpose of the demonstration was to collect reliable performance and cost data for the EnSys Petro Test System and six other field measurement devices for total petroleum hydrocarbons (TPH) in soil. In addition to assessing ease of device operation, the key objectives of the demonstration included determining the (1) method detection limit, (2) accuracy and precision, (3) effects of interferents and soil moisture content on TPH measurement, (4) sample throughput, and (5) TPH measurement costs for each device. The demonstration involved analysis of both performance evaluation samples and environmental samples collected in four areas contaminated with gasoline, diesel, or other petroleum products. The performance and cost results for a given field measurement device were compared to those for an off-site laboratory reference method,
INNOVATIVE TECHNOLOGY VERIFICATION REPORT ...
The Synchronous Scanning Luminoscope (Luminoscope) developed by the Oak Ridge National Laboratory in collaboration with Environmental Systems Corporation (ESC) was demonstrated under the U.S. Environmental Protection Agency Superfund Innovative Technology Evaluation Program in June 2000 at the Navy Base Ventura County site in Port Hueneme, California. The purpose of the demonstration was to collect reliable performance and cost data for the Luminoscope and six other field measurement devices for total petroleum hydrocarbons (TPH) in soil. In addition to assessing ease of device operation, the key objectives of the demonstration included determining the (1) method detection limit, (2) accuracy and precision, (3) effects of interferents and soil moisture content on TPH measurement, (4) sample throughput, and (5) TPH measurement costs for each device. The demonstration involved analysis of both performance evaluation samples and environmental samples collected in five areas contaminated with gasoline, diesel, lubricating oil, or other petroleum products. The performance and cost results for a given field measurement device were compared to those for an off-site laboratory reference method,
Federal Register 2010, 2011, 2012, 2013, 2014
2010-07-08
... Protocol Gas Verification Program; EPA ICR No. 2375.01, OMB Control Number 2060-NEW AGENCY: Environmental... Air Protocol Gas Verification Program. ICR numbers: EPA ICR No. 2375.01, OMB Control No. 2060-NEW. ICR...
Federal Register 2010, 2011, 2012, 2013, 2014
2011-05-27
... Number 1076-0160, which expires August 31, 2011. DATES: Interested persons are invited to submit comments... Number: 1076-0160. Title: Verification of Indian preference for Employment in the BIA and IHS, 25 CFR 5...
Federal Register 2010, 2011, 2012, 2013, 2014
2011-08-31
... authorized by OMB Control Number 1076-0160, which expires August 31, 2011. DATES: Interested persons are... Number: 1076-0160. Title: Verification of Indian preference for Employment in the BIA and IHS, 25 CFR...
ENVIRONMENTAL TECHNOLOGY VERIFICATION REPORT FOR AMMONIA RECOVERY PROCESS
This Technology Verification report describes the nature and scope of an environmental evaluation of ThermoEnergy Corporation’s Ammonia Recovery Process (ARP) system. The information contained in this report represents data that were collected over a 3-month pilot study. The ti...
Test/QA Plan for Verification of Nitrate Sensors for Groundwater Remediation Monitoring
A submersible nitrate sensor is capable of collecting in-situ measurements of dissolved nitrate concentrations in groundwater. Although several types of nitrate sensors currently exist, this verification test will focus on submersible sensors equipped with a nitrate-specific ion...
Improving semi-text-independent method of writer verification using difference vector
NASA Astrophysics Data System (ADS)
Li, Xin; Ding, Xiaoqing
2009-01-01
The semi-text-independent method of writer verification based on the linear framework can use all characters of two handwriting samples to discriminate between writers when the text contents are known. The handwriting samples may share only a small number of characters, or even none at all. This fills the gap between the classical text-dependent and text-independent methods of writer verification. Moreover, in this paper the content information, i.e., which character each sample is, is exploited by the semi-text-independent method. Two types of standard templates, generated from many writer-unknown handwritten samples and printed samples of each character, are introduced to represent the content information of each character. The difference vectors of the character samples are obtained by subtracting the standard templates from the original feature vectors and are used in place of the original vectors in the writer verification process. By removing a large amount of content information while retaining the style information, the verification accuracy of the semi-text-independent method is improved. On a handwriting database of 30 writers, when the query and reference handwriting each comprise 30 distinct characters, the average equal error rate (EER) of writer verification reaches 9.96%. When the handwritings contain 50 characters, the average EER falls to 6.34%, which is 23.9% lower than the EER obtained without the difference vectors.
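The core difference-vector step can be sketched compactly: subtract a per-character standard template (e.g., the mean feature vector over many writer-unknown samples of that character) from the character's raw feature vector, so that what remains is mostly style rather than content. A minimal sketch under that assumption (the feature extraction itself is not shown, and the vectors are hypothetical):

```python
def difference_vector(features, template):
    """Subtract the character's standard template (content information)
    from its raw feature vector, leaving mostly style information."""
    return [f - t for f, t in zip(features, template)]

# Hypothetical 3-dimensional feature vector for one character sample;
# the template would be the mean over many writer-unknown samples.
sample = [1.0, 0.25, 0.5]
template = [0.75, 0.25, 0.25]
print(difference_vector(sample, template))  # [0.25, 0.0, 0.25]
```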
The Mars Science Laboratory Organic Check Material
NASA Astrophysics Data System (ADS)
Conrad, Pamela G.; Eigenbrode, Jennifer L.; Von der Heydt, Max O.; Mogensen, Claus T.; Canham, John; Harpold, Dan N.; Johnson, Joel; Errigo, Therese; Glavin, Daniel P.; Mahaffy, Paul R.
2012-09-01
Mars Science Laboratory's Curiosity rover carries a set of five external verification standards in hermetically sealed containers that can be sampled as would be a Martian rock, by drilling and then portioning into the solid sample inlet of the Sample Analysis at Mars (SAM) suite. Each organic check material (OCM) canister contains a porous ceramic solid, which has been doped with a fluorinated hydrocarbon marker that can be detected by SAM. The purpose of the OCM is to serve as a verification tool for the organic cleanliness of those parts of the sample chain that cannot be cleaned other than by dilution, i.e., repeated sampling of Martian rock. SAM possesses internal calibrants for verification of both its performance and its internal cleanliness, and the OCM is not used for that purpose. Each OCM unit is designed for one use only, and the choice to do so will be made by the project science group (PSG).
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wilson-Nichols, M.J.; Egidi, P.V.; Roemer, E.K.
2000-09-01
The Oak Ridge National Laboratory (ORNL) Environmental Technology Section conducted an independent verification (IV) survey of the clean storage pile at the Johnston Atoll Plutonium Contaminated Soil Remediation Project (JAPCSRP) from January 18-25, 1999. The goal of the JAPCSRP is to restore a 24-acre area that was contaminated with plutonium oxide particles during nuclear testing in the 1960s. The selected remedy was a soil sorting operation that combined radiological measurements and mining processes to identify and sequester plutonium-contaminated soil. The soil sorter operated from about 1990 to 1998. The remaining clean soil is stored on-site for planned beneficial use on Johnston Island. The clean storage pile currently consists of approximately 120,000 m3 of coral. ORNL conducted the survey according to a Sampling and Analysis Plan, which proposed to provide an IV of the clean pile by collecting a minimum number (99) of samples. The goal was to ascertain with 95% confidence whether 97% of the processed soil is at or below the accepted guideline (500 Bq/kg or 13.5 pCi/g) of total transuranic (TRU) activity.
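The quoted minimum of 99 samples is consistent with a standard zero-failure binomial argument: if all n samples fall at or below the guideline, the smallest n for which one can assert with 95% confidence that at least 97% of the soil complies satisfies 0.97^n <= 0.05. A quick sketch of the arithmetic (the actual design is in the cited Sampling and Analysis Plan):

```python
import math

def min_samples(confidence, proportion):
    """Smallest n such that zero exceedances in n samples demonstrates,
    at the given confidence, that at least `proportion` of the population
    is at or below the guideline (i.e., proportion**n <= 1 - confidence)."""
    return math.ceil(math.log(1.0 - confidence) / math.log(proportion))

print(min_samples(0.95, 0.97))  # 99
```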
75 FR 5853 - Proposed Collection; Comment Request for Form 13803
Federal Register 2010, 2011, 2012, 2013, 2014
2010-02-04
... 13803, Income Verification Express Service Application and Employee Delegation Form. DATES: Written... Application and Employee Delegation Form. OMB Number: 1545-2032. Form Number: Form 13803. Abstract: Form 13803, Income Verification Express Service Application and Employee Delegation Form, is used to submit the...
Based upon the structure and specifications in ANSI/ASQC E4-1994, Specifications and Guidelines for Quality Systems for Environmental Data Collection and Environmental Technology Programs, the Environmental Technology Verification (ETV) program Quality and Management Plan (QMP) f...
Verifying the operational set-up of a radionuclide air-monitoring station.
Werzi, R; Padoani, F
2007-05-01
A worldwide radionuclide network of 80 stations, part of the International Monitoring System, was designed to monitor compliance with the Comprehensive Nuclear-Test-Ban Treaty. After installation, the stations are certified to comply with the minimum requirements laid down by the Preparatory Commission of the Comprehensive Nuclear-Test-Ban Treaty Organization. Among the several certification tests carried out at each station, the verification of the radionuclide activity concentrations is a crucial one and is based on an independent testing of the airflow rate measurement system and of the gamma detector system, as well as on the assessment of the samples collected during parallel sampling and measured at radionuclide laboratories.
Verification of component mode techniques for flexible multibody systems
NASA Technical Reports Server (NTRS)
Wiens, Gloria J.
1990-01-01
Investigations were conducted into the modeling of flexible multibodies undergoing large angular displacements. Models were to be generated and analyzed through application of computer simulation packages employing 'component mode synthesis' techniques. The Multibody Modeling, Verification and Control (MMVC) Laboratory plan was implemented, which includes running experimental tests on flexible multibody test articles. From these tests, data were to be collected for later correlation and verification of the theoretical results predicted by the modeling and simulation process.
Meyer, Pablo; Hoeng, Julia; Rice, J. Jeremy; Norel, Raquel; Sprengel, Jörg; Stolle, Katrin; Bonk, Thomas; Corthesy, Stephanie; Royyuru, Ajay; Peitsch, Manuel C.; Stolovitzky, Gustavo
2012-01-01
Motivation: Analyses and algorithmic predictions based on high-throughput data are essential for the success of systems biology in academic and industrial settings. Organizations, such as companies and academic consortia, conduct large multi-year scientific studies that entail the collection and analysis of thousands of individual experiments, often over many physical sites and with internal and outsourced components. To extract maximum value, the interested parties need to verify the accuracy and reproducibility of data and methods before the initiation of such large multi-year studies. However, systematic and well-established verification procedures do not exist for automated collection and analysis workflows in systems biology, which could lead to inaccurate conclusions. Results: We present here a review of the current state of systems biology verification and a detailed methodology to address its shortcomings. This methodology, named 'Industrial Methodology for Process Verification in Research' or IMPROVER, consists of evaluating a research program by dividing a workflow into smaller building blocks that are individually verified. The verification of each building block can be done internally by members of the research program or externally by 'crowd-sourcing' to an interested community. www.sbvimprover.com Implementation: This methodology could become the preferred choice to verify systems biology research workflows that are becoming increasingly complex and sophisticated in industrial and academic settings. Contact: gustavo@us.ibm.com PMID:22423044
Blood venous sample collection: Recommendations overview and a checklist to improve quality.
Giavarina, Davide; Lippi, Giuseppe
2017-07-01
The extra-analytical phases of the total testing process have a substantial impact on managed care, as well as an inherently high risk of vulnerability to errors, often greater than that of the analytical phase. The collection of biological samples is a crucial preanalytical activity. Problems or errors occurring shortly before, or soon after, this preanalytical step may impair sample quality and characteristics, or modify the final results of testing. The standardization of fasting requirements, rest, patient position, and the psychological state of the patient is therefore crucial for mitigating the impact of preanalytical variability. Moreover, the quality of the materials used for collecting specimens, along with their compatibility, can guarantee sample quality and the persistence of the chemical and physical characteristics of the analytes over time, thus safeguarding the reliability of testing. Appropriate techniques and sampling procedures are effective in preventing problems such as hemolysis, undue clotting in the blood tube, collection of insufficient sample volume, and modification of analyte concentration. Accurate identification of both patient and blood samples is a key priority, as in other healthcare activities. Good laboratory practice and appropriate training of operators, specifically targeting the collection of biological samples, blood in particular, may greatly improve this issue, thus lowering the risk of errors and their adverse clinical consequences. The implementation of a simple and rapid checklist, including verification of blood collection devices, patient preparation, and sampling techniques, was found to be effective for enhancing sample quality and reducing some preanalytical errors associated with these procedures.
The use of this tool, along with implementation of objective and standardized systems for detecting non-conformities related to unsuitable samples, can be helpful for standardizing preanalytical activities and improving the quality of laboratory diagnostics, ultimately helping to reaffirm a "preanalytical" culture founded on knowledge and real risk perception. Copyright © 2017 The Canadian Society of Clinical Chemists. Published by Elsevier Inc. All rights reserved.
75 FR 82575 - Federal Acquisition Regulation; Personal Identity Verification of Contractor Personnel
Federal Register 2010, 2011, 2012, 2013, 2014
2010-12-30
... 9000-AL60 Federal Acquisition Regulation; Personal Identity Verification of Contractor Personnel AGENCY... requirement of collecting from contractors all forms of Government-provided identification once they are no...D Inspector General Audit Report No. D-2009-005, entitled ``Controls Over the Contractor Common...
Wilson Corners SWMU 001 2014 Annual Long Term Monitoring Report Kennedy Space Center, Florida
NASA Technical Reports Server (NTRS)
Langenbach, James
2015-01-01
This document presents the findings of the 2014 Long Term Monitoring (LTM) that was completed at the Wilson Corners site, located at the National Aeronautics and Space Administration (NASA) John F. Kennedy Space Center (KSC), Florida. The goals of the 2014 annual LTM event were to evaluate the groundwater flow direction and gradient and to monitor the vertical and downgradient horizontal extent of the volatile organic compounds (VOCs) in groundwater at the site. The LTM activities consisted of an annual groundwater sampling event in December 2014, which included the collection of water levels from the LTM wells. During the annual groundwater sampling event, depth to groundwater was measured and VOC samples were collected using passive diffusion bags (PDBs) from 30 monitoring wells. In addition to the LTM sampling, additional assessment sampling was performed at the site using low-flow techniques based on previous LTM results and assessment activities. Assessment of monitoring well MW0052DD was performed by collecting VOC samples using low-flow techniques before and after purging 100 gallons from the well. Monitoring well MW0064 was sampled to supplement shallow VOC data north of Hot Spot 2 and east of Hot Spot 4. Monitoring well MW0089 was sampled due to its proximity to MW0090. MW0090 is screened in a deeper interval and had an unexpected detection of trichloroethene (TCE) during the 2013 LTM, which was corroborated during the March 2014 verification sampling. Monitoring well MW0130 was sampled to provide additional VOC data beneath the semi-confining clay layer in the Hot Spot 2 area.
The Environmental Technology Verification report discusses the technology and performance of Seal Assist System (SAS) for natural gas reciprocating compressor rod packing manufactured by A&A Environmental Seals, Inc. The SAS uses a secondary containment gland to collect natural g...
76 FR 45902 - Agency Information Collection Activities: Proposed Request and Comment Request
Federal Register 2010, 2011, 2012, 2013, 2014
2011-08-01
... will allow our users to maintain one User ID, consisting of a self-selected Username and Password, to...) Registration and identity verification; (2) enhancement of the User ID; and (3) authentication. The...- person identification verification process for individuals who cannot or are not willing to register...
76 FR 9020 - Proposed Information Collection Activity; Comment Request
Federal Register 2010, 2011, 2012, 2013, 2014
2011-02-16
... -------- Preparation and Submission of Data 54 1 640 34,560 Verification Procedures--Sec. Sec. 261.60-261.63 Caseload... for Needy Families (TANF) program, it imposed a new data requirement that States prepare and submit data verification procedures and replaced other data requirements with new versions including: the TANF...
Federal Register 2010, 2011, 2012, 2013, 2014
2010-05-24
... Acquisition Regulation; FAR Case 2009-027, Personal Identity Verification of Contractor Personnel AGENCIES... of collecting from contractors all forms of Government provided identification once they are no..., titled Controls Over the Contractor Common Access Card (CAC) Life Cycle, was performed to determine...
NASA Technical Reports Server (NTRS)
Johnson, Kenneth L.; White, K. Preston, Jr.
2012-01-01
The NASA Engineering and Safety Center was requested to improve on the Best Practices document produced for the NESC assessment, Verification of Probabilistic Requirements for the Constellation Program, by giving a recommended procedure for using acceptance sampling by variables techniques as an alternative to the potentially resource-intensive acceptance sampling by attributes method given in the document. In this paper, the results of empirical tests intended to assess the accuracy of acceptance sampling plan calculators implemented for six variable distributions are presented.
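As a hedged illustration of the variables approach the abstract contrasts with attributes sampling: for a single upper specification limit and (assumed) approximately normal data, a k-method plan accepts the lot when (USL − x̄)/s ≥ k, where the acceptability constant k and sample size n come from the chosen plan (e.g., tabulated in standards such as ANSI/ASQ Z1.9). A minimal sketch with hypothetical measurements and an assumed k, not the NESC procedure itself:

```python
import statistics

def accept_lot_variables(measurements, usl, k):
    """k-method acceptance sampling by variables for a single upper
    specification limit: accept when the margin (USL - sample mean) is at
    least k sample standard deviations. Assumes roughly normal data."""
    mean = statistics.mean(measurements)
    s = statistics.stdev(measurements)
    return (usl - mean) / s >= k

# Hypothetical measurements against a hypothetical USL of 10.0:
print(accept_lot_variables([9.2, 9.5, 9.1, 9.4, 9.3], 10.0, k=2.0))  # True
print(accept_lot_variables([9.9, 9.8, 9.9, 10.0, 9.9], 10.0, k=2.0))  # False
```

The point of the variables method is that it uses the measured values themselves, so it typically reaches a decision with far fewer samples than counting pass/fail attributes.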
DOE Office of Scientific and Technical Information (OSTI.GOV)
L. M. Dittmer
2007-12-03
The 100-F-26:10 waste site includes sanitary sewer lines that serviced the former 182-F, 183-F, and 151-F Buildings. In accordance with this evaluation, the verification sampling results support a reclassification of this site to Interim Closed Out. The results of verification sampling show that residual contaminant concentrations do not preclude any future uses and allow for unrestricted use of shallow zone soils. The results also demonstrate that residual contaminant concentrations are protective of groundwater and the Columbia River.
DOE Office of Scientific and Technical Information (OSTI.GOV)
L. M. Dittmer
2008-03-18
The 100-F-26:15 waste site consisted of the remnant portions of underground process effluent and floor drain pipelines that originated at the 105-F Reactor. In accordance with this evaluation, the verification sampling results support a reclassification of this site to Interim Closed Out. The results of verification sampling show that residual contaminant concentrations do not preclude any future uses and allow for unrestricted use of shallow zone soils. The results also demonstrate that residual contaminant concentrations are protective of groundwater and the Columbia River.
Bowen, Raffick A R; Adcock, Dorothy M
2016-12-01
Blood collection tubes (BCTs) are an often under-recognized variable in the preanalytical phase of clinical laboratory testing. Unfortunately, even the best-designed and manufactured BCTs may not work well in all clinical settings. Clinical laboratories, in collaboration with healthcare providers, should carefully evaluate BCTs prior to putting them into clinical use to determine their limitations and ensure that patients are not placed at risk because of inaccuracies due to poor tube performance. Selection of the best BCTs can be achieved through comparing advertising materials, reviewing the literature, observing the device at a scientific meeting, receiving a demonstration, evaluating the device under simulated conditions, or testing the device with patient samples. Although many publications have discussed method validations, few detail how to perform experiments for tube verification and validation. This article highlights the most common and impactful variables related to BCTs and discusses the validation studies that a typical clinical laboratory should perform when selecting BCTs. We also present a brief review of how in vitro diagnostic devices, particularly BCTs, are regulated in the United States, the European Union, and Canada. The verification and validation of BCTs will help to avoid the economic and human costs associated with incorrect test results, including poor patient care, unnecessary testing, and delays in test results. We urge laboratorians, tube manufacturers, diagnostic companies, and other researchers to take all the necessary steps to protect against the adverse effects of BCT components and their additives on clinical assays. Copyright © 2016 The Canadian Society of Clinical Chemists. Published by Elsevier Inc. All rights reserved.
Genome-Scale Screen for DNA Methylation-Based Detection Markers for Ovarian Cancer
Houshdaran, Sahar; Shen, Hui; Widschwendter, Martin; Daxenbichler, Günter; Long, Tiffany; Marth, Christian; Laird-Offringa, Ite A.; Press, Michael F.; Dubeau, Louis; Siegmund, Kimberly D.; Wu, Anna H.; Groshen, Susan; Chandavarkar, Uma; Roman, Lynda D.; Berchuck, Andrew; Pearce, Celeste L.; Laird, Peter W.
2011-01-01
Background The identification of sensitive biomarkers for the detection of ovarian cancer is of high clinical relevance for early detection and/or monitoring of disease recurrence. We developed a systematic multi-step biomarker discovery and verification strategy to identify candidate DNA methylation markers for the blood-based detection of ovarian cancer. Methodology/Principal Findings We used the Illumina Infinium platform to analyze the DNA methylation status of 27,578 CpG sites in 41 ovarian tumors. We employed a marker selection strategy that emphasized sensitivity by requiring consistency of methylation across tumors, while achieving specificity by excluding markers with methylation in control leukocyte or serum DNA. Our verification strategy involved testing the ability of identified markers to monitor disease burden in serially collected serum samples from ovarian cancer patients who had undergone surgical tumor resection compared to CA-125 levels. We identified one marker, IFFO1 promoter methylation (IFFO1-M), that is frequently methylated in ovarian tumors and that is rarely detected in the blood of normal controls. When tested in 127 serially collected sera from ovarian cancer patients, IFFO1-M showed post-resection kinetics significantly correlated with serum CA-125 measurements in six out of 16 patients. Conclusions/Significance We implemented an effective marker screening and verification strategy, leading to the identification of IFFO1-M as a blood-based candidate marker for sensitive detection of ovarian cancer. Serum levels of IFFO1-M displayed post-resection kinetics consistent with a reflection of disease burden. We anticipate that IFFO1-M and other candidate markers emerging from this marker development pipeline may provide disease detection capabilities that complement existing biomarkers. PMID:22163280
NASA Astrophysics Data System (ADS)
Mittal, R.; Rao, P.; Kaur, P.
2018-01-01
Elemental evaluations of scanty powdered material have been made using energy dispersive X-ray fluorescence (EDXRF) measurements, for which formulations and a specific procedure for sample target preparation were developed. Evaluation of fractional amounts involves a sequence of steps: (i) collection of elemental characteristic X-ray counts in EDXRF spectra recorded with different weights of material, (ii) a check for linearity between X-ray counts and material weights, (iii) calculation of elemental fractions from the linear fit, and (iv) a second linear fit of the calculated fractions against sample weight, extrapolated to zero weight. The elemental fractions at zero weight are thus free of the material's self-absorption effects for incident and emitted photons. After verification with known synthetic samples of the macronutrients potassium and calcium, the analytical procedure was used for wheat plant and soil samples obtained from a pot experiment.
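Steps (ii)-(iv) can be sketched numerically. All values below (weights, counts, and the calibration constant) are invented for illustration; counts grow sub-linearly with weight, as self-absorption would cause:

```python
import numpy as np

# Made-up EDXRF data: sample weights (mg) and net K X-ray counts
weights = np.array([10.0, 20.0, 30.0, 40.0, 50.0])
counts = np.array([1050.0, 2010.0, 2890.0, 3720.0, 4480.0])

k_cal = 2100.0  # hypothetical calibration: counts per mg per unit fraction

# Steps (ii)-(iii): apparent elemental fraction at each sample weight
fractions = counts / (k_cal * weights)

# Step (iv): linear fit of fraction vs. weight, extrapolated to zero weight,
# where self-absorption of incident and emitted photons vanishes
slope, fraction_at_zero = np.polyfit(weights, fractions, 1)
print(round(fraction_at_zero, 4))
```

The negative slope reflects increasing self-absorption with sample weight; the intercept is the absorption-free fraction estimate.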
He, Hua; McDermott, Michael P.
2012-01-01
Sensitivity and specificity are common measures of the accuracy of a diagnostic test. The usual estimators of these quantities are unbiased if data on the diagnostic test result and the true disease status are obtained from all subjects in an appropriately selected sample. In some studies, verification of the true disease status is performed only for a subset of subjects, possibly depending on the result of the diagnostic test and other characteristics of the subjects. Estimators of sensitivity and specificity based on this subset of subjects are typically biased; this is known as verification bias. Methods have been proposed to correct verification bias under the assumption that the missing data on disease status are missing at random (MAR), that is, the probability of missingness depends on the true (missing) disease status only through the test result and observed covariate information. When some of the covariates are continuous, or the number of covariates is relatively large, the existing methods require parametric models for the probability of disease or the probability of verification (given the test result and covariates), and hence are subject to model misspecification. We propose a new method for correcting verification bias based on the propensity score, defined as the predicted probability of verification given the test result and observed covariates. This is estimated separately for those with positive and negative test results. The new method classifies the verified sample into several subsamples that have homogeneous propensity scores and allows correction for verification bias. Simulation studies demonstrate that the new estimators are more robust to model misspecification than existing methods, but still perform well when the models for the probability of disease and probability of verification are correctly specified. PMID:21856650
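The stratified correction described above can be illustrated with a small simulation. Here strata of the test result and a single binary covariate stand in for the estimated propensity-score classes, and every parameter is invented; this is not the authors' exact estimator:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 20000

# Simulated diagnostic-testing data (all parameters made up)
x = rng.integers(0, 2, n)                       # binary covariate
d = rng.random(n) < 0.2 + 0.2 * x               # true disease status
t = np.where(d, rng.random(n) < 0.85,           # sensitivity 0.85
                rng.random(n) < 0.10)           # 1 - specificity 0.10

# Verification depends only on the test result and covariate (MAR)
v = rng.random(n) < 0.9 * t + 0.2 * (1 - t) + 0.05 * x

# Naive sensitivity from verified subjects only is biased upward here,
# because test-positives are verified far more often
se_naive = (t & v & d).sum() / (v & d).sum()

# Stratified correction: within each (test result, covariate) stratum the
# verified subjects are representative of that stratum, so estimate
# P(D=1 | t, x) from them and reweight by full-sample stratum sizes
num = den = 0.0
for ti in (0, 1):
    for xi in (0, 1):
        stratum = (t == ti) & (x == xi)
        p_d = d[stratum & v].mean()             # P(D=1 | t, x) from verified subset
        den += stratum.sum() * p_d              # proportional to P(D=1)
        num += stratum.sum() * p_d * ti         # proportional to P(D=1, T=1)
se_corrected = num / den
```

With these settings the naive estimate lands well above the true sensitivity of 0.85, while the stratified estimate recovers it.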
Multi-canister overpack project -- verification and validation, MCNP 4A
DOE Office of Scientific and Technical Information (OSTI.GOV)
Goldmann, L.H.
This supporting document contains the software verification and validation (V and V) package used for Phase 2 design of the Spent Nuclear Fuel Multi-Canister Overpack. V and V packages for both ANSYS and MCNP are included. Description of verification run(s): this software must be compiled specifically for the machine on which it will be used. To ease the verification process, the software therefore automatically runs 25 sample problems to confirm proper installation and compilation. Once the runs are complete, the software checks for verification by comparing each new output file against the corresponding old output file. Any difference between files raises a verification error. Because of the way the verification is performed, a verification error does not necessarily indicate a problem; it indicates that a closer look at the output files is needed to determine the cause of the error.
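The self-check the abstract describes, rerunning sample problems and diffing outputs against reference copies, can be sketched as follows (the file names and `*.out` layout are hypothetical):

```python
import tempfile
from pathlib import Path

def verify_installation(new_dir, reference_dir):
    """Compare each sample-problem output against its reference copy.

    Returns the files that differ. As the abstract notes, a difference
    only flags the file for inspection; it is not necessarily an
    installation problem (e.g. timestamps or last-digit roundoff).
    """
    mismatches = []
    for ref in sorted(Path(reference_dir).glob("*.out")):
        new = Path(new_dir) / ref.name
        if not new.exists() or new.read_text() != ref.read_text():
            mismatches.append(ref.name)
    return mismatches

# Tiny demonstration with two fabricated sample-problem outputs
with tempfile.TemporaryDirectory() as refd, tempfile.TemporaryDirectory() as newd:
    for name, ref_text, new_text in [("prob01.out", "k-eff 1.000", "k-eff 1.000"),
                                     ("prob02.out", "k-eff 0.950", "k-eff 0.951")]:
        (Path(refd) / name).write_text(ref_text)
        (Path(newd) / name).write_text(new_text)
    flagged = verify_installation(newd, refd)
print(flagged)  # ['prob02.out']
```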
Implementation and verification of global optimization benchmark problems
NASA Astrophysics Data System (ADS)
Posypkin, Mikhail; Usov, Alexander
2017-12-01
The paper considers the implementation and verification of a test suite containing 150 benchmarks for global deterministic box-constrained optimization. A C++ library for describing standard mathematical expressions was developed for this purpose. From a single description, the library automates generation of the value of a function and its gradient at a given point, as well as interval estimates of the function and its gradient on a given box. Based on this functionality, we developed a collection of tests for automatic verification of the proposed benchmarks. The verification showed that literature sources contain mistakes in the benchmark descriptions. The library and the test suite are available for download and can be used freely.
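The "single description" idea, one expression tree yielding both a point value with gradient (forward-mode differentiation) and an interval enclosure, can be sketched in Python. The class names below are illustrative only; they are not the actual C++ library's API:

```python
class Var:
    def __init__(self, idx): self.idx = idx
    def value_grad(self, x):
        g = [0.0] * len(x); g[self.idx] = 1.0
        return x[self.idx], g
    def interval(self, box): return box[self.idx]

class Add:
    def __init__(self, a, b): self.a, self.b = a, b
    def value_grad(self, x):
        va, ga = self.a.value_grad(x); vb, gb = self.b.value_grad(x)
        return va + vb, [p + q for p, q in zip(ga, gb)]
    def interval(self, box):
        (lo1, hi1), (lo2, hi2) = self.a.interval(box), self.b.interval(box)
        return (lo1 + lo2, hi1 + hi2)

class Mul:
    def __init__(self, a, b): self.a, self.b = a, b
    def value_grad(self, x):
        va, ga = self.a.value_grad(x); vb, gb = self.b.value_grad(x)
        return va * vb, [va * q + vb * p for p, q in zip(ga, gb)]
    def interval(self, box):
        (lo1, hi1), (lo2, hi2) = self.a.interval(box), self.b.interval(box)
        prods = [lo1 * lo2, lo1 * hi2, hi1 * lo2, hi1 * hi2]
        return (min(prods), max(prods))

# f(x0, x1) = x0 * x1 + x0, described once, evaluated three ways
x0, x1 = Var(0), Var(1)
f = Add(Mul(x0, x1), x0)
print(f.value_grad([2.0, 3.0]))               # value 8.0, gradient [4.0, 2.0]
print(f.interval([(0.0, 1.0), (-1.0, 1.0)]))  # enclosure (-1.0, 2.0)
```

A verification test can then compare the gradient against finite differences and check that sampled point values fall inside the interval enclosure, which is how mistakes in benchmark descriptions can be caught automatically.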
Federal Register 2010, 2011, 2012, 2013, 2014
2012-05-14
... and natural gas resources in a manner that is consistent with the need to make such resources... to prevent or minimize the likelihood of blowouts, loss of well control, fires, spillages, physical... the environment or to property, or endanger life or health.'' BSEE's Legacy Data Verification Process...
38 CFR 74.11 - How does CVE process applications for VetBiz VIP Verification Program?
Code of Federal Regulations, 2012 CFR
2012-07-01
... electronic means. (The Office of Management and Budget has approved the information collection requirements... Veterans Enterprise, is authorized to approve or deny applications for VetBiz VIP Verification. The CVE... complete and suitable for evaluation and, if not, what additional information or clarification is required...
38 CFR 74.11 - How does CVE process applications for VetBiz VIP Verification Program?
Code of Federal Regulations, 2010 CFR
2010-07-01
... electronic means. (The Office of Management and Budget has approved the information collection requirements... Veterans Enterprise, is authorized to approve or deny applications for VetBiz VIP Verification. The CVE... complete and suitable for evaluation and, if not, what additional information or clarification is required...
38 CFR 74.11 - How does CVE process applications for VetBiz VIP Verification Program?
Code of Federal Regulations, 2011 CFR
2011-07-01
... electronic means. (The Office of Management and Budget has approved the information collection requirements... Veterans Enterprise, is authorized to approve or deny applications for VetBiz VIP Verification. The CVE... complete and suitable for evaluation and, if not, what additional information or clarification is required...
38 CFR 74.11 - How does CVE process applications for VetBiz VIP Verification Program?
Code of Federal Regulations, 2014 CFR
2014-07-01
... electronic means. (The Office of Management and Budget has approved the information collection requirements... Veterans Enterprise, is authorized to approve or deny applications for VetBiz VIP Verification. The CVE... complete and suitable for evaluation and, if not, what additional information or clarification is required...
38 CFR 74.11 - How does CVE process applications for VetBiz VIP Verification Program?
Code of Federal Regulations, 2013 CFR
2013-07-01
... electronic means. (The Office of Management and Budget has approved the information collection requirements... Veterans Enterprise, is authorized to approve or deny applications for VetBiz VIP Verification. The CVE... complete and suitable for evaluation and, if not, what additional information or clarification is required...
Madon, Stephanie; Guyll, Max; Buller, Ashley A.; Scherr, Kyle C.; Willard, Jennifer; Spoth, Richard
2010-01-01
This research examined whether self-fulfilling prophecy effects are mediated by self-verification, informational conformity, and modeling processes. The authors examined these mediational processes across multiple time frames with longitudinal data obtained from two samples of mother-child dyads (N1 = 487; N2 = 287). Children's alcohol use was the outcome variable. The results provided consistent support for the mediational process of self-verification. In both samples and across several years of adolescence, there was a significant indirect effect of mothers' beliefs on children's alcohol use through children's self-assessed likelihood of drinking alcohol in the future. Comparatively less support was found for informational conformity and modeling processes as mediators of mothers' self-fulfilling effects. The potential for self-fulfilling prophecies to produce long-lasting changes in targets' behavior via self-verification processes is discussed. PMID:18665708
Madon, Stephanie; Guyll, Max; Buller, Ashley A; Scherr, Kyle C; Willard, Jennifer; Spoth, Richard
2008-08-01
This research examined whether self-fulfilling prophecy effects are mediated by self-verification, informational conformity, and modeling processes. The authors examined these mediational processes across multiple time frames with longitudinal data obtained from two samples of mother-child dyads (N-sub-1 = 486; N-sub-2 = 287), with children's alcohol use as the outcome variable. The results provided consistent support for the mediational process of self-verification. In both samples and across several years of adolescence, there was a significant indirect effect of mothers' beliefs on children's alcohol use through children's self-assessed likelihood of drinking alcohol in the future. Comparatively less support was found for informational conformity and modeling processes as mediators of mothers' self-fulfilling effects. The potential for self-fulfilling prophecies to produce long-lasting changes in targets' behavior via self-verification processes is discussed. (c) 2008 APA, all rights reserved
Valentiner, David P; Skowronski, John J; McGrath, Patrick B; Smith, Sarah A; Renner, Kerry A
2011-10-01
A self-verification model of social anxiety views negative social self-esteem as a core feature of social anxiety. This core feature is proposed to be maintained through self-verification processes, such as by leading individuals with negative social self-esteem to prefer negative social feedback. This model is tested in two studies. In Study 1, questionnaires were administered to a college sample (N = 317). In Study 2, questionnaires were administered to anxiety disordered patients (N = 62) before and after treatment. Study 1 developed measures of preference for negative social feedback and social self-esteem, and provided evidence of their incremental validity in a college sample. Study 2 found that these two variables are not strongly related to fears of evaluation, are relatively unaffected by a treatment that targets such fears, and predict residual social anxiety following treatment. Overall, these studies provide preliminary evidence for a self-verification model of social anxiety.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bolme, David S; Tokola, Ryan A; Boehnen, Chris Bensing
Automatic recognition systems are a valuable tool for identifying unknown deceased individuals. Immediately after death, fingerprint and face biometric samples are easy to collect using standard sensors and cameras and can readily be matched to ante-mortem biometric samples. Even though post-mortem fingerprints and faces have been used for decades, there are no studies that track these biometrics through the later stages of decomposition to determine how long they remain viable. This paper discusses a multimodal dataset of fingerprints, faces, and irises from 14 human cadavers that decomposed outdoors under natural conditions. Results include predictive models relating time and temperature, measured as Accumulated Degree Days (ADD), and season (winter, spring, summer) to the predicted probability of automatic verification using a commercial algorithm.
Walsh, Stephen Joseph; Meador, Michael R.
1998-01-01
Fish community structure is characterized by the U.S. Geological Survey's National Water-Quality Assessment (NAWQA) Program as part of a perennial, multidisciplinary approach to evaluating the physical, chemical, and biological conditions of the Nation's water resources. The objective of quality assurance and quality control of fish taxonomic data that are collected as part of the NAWQA Program is to establish uniform guidelines and protocols for the identification, processing, and archiving of fish specimens to ensure that accurate and reliable data are collected. Study unit biologists, collaborating with regional biologists and fish taxonomic specialists, prepare a pre-sampling study plan that includes a preliminary faunal list and identification of an ichthyological curation center for receiving preserved fish specimens. Problematic taxonomic issues and protected taxa also are identified in the study plan, and collecting permits are obtained in advance of sampling activities. Taxonomic specialists are selected to identify fish specimens in the field and to assist in determining what fish specimens should be sacrificed, fixed, and preserved for laboratory identification, independent taxonomic verification, and long-term storage in reference or voucher collections. Quantitative and qualitative sampling of fishes follows standard methods previously established for the NAWQA Program. Common ichthyological techniques are used to process samples in the field and prepare fish specimens to be returned to the laboratory or sent to an institutional repository. Taxonomic identifications are reported by using a standardized list of scientific names that provides nomenclatural consistency and uniformity across study units.
Proceedings of the workshop for exchange of technology for CWC inspections
DOE Office of Scientific and Technical Information (OSTI.GOV)
McGuire, R.R.
1993-04-01
With the signing of the Chemical Weapons Convention (CWC), the work of the Preparatory Commission in defining the modalities of on-site verification inspections will begin early in 1993. One of the methods for increasing the effectiveness of inspections is the collection of samples for chemical analysis. The CWC allows this analysis to be performed either at the site of the inspection or in a dedicated off-site laboratory. The decision as to where samples are to be analyzed in any specific instance may involve consideration of the threat, real or perceived, of compromising legitimate sensitive host-party information. The ability to perform efficient chemical analysis at the inspection site, where samples remain in joint (host-inspector) custody and the analytical procedures can be observed by the host, can alleviate much of the concern over possible loss of confidential information in both government and industry. This workshop was designed to encourage the exchange of information among participants with experience in the use of analytical equipment for on-site sample collection and analysis. Individual projects are processed separately for the databases.
Feasibility of conducting wetfall chemistry investigations around the Bowen Power Plant
DOE Office of Scientific and Technical Information (OSTI.GOV)
Chen, N.C.J.; Patrinos, A.A.N.
1979-10-01
The feasibility of expanding the Meteorological Effects of Thermal Energy Releases - Oak Ridge National Laboratory (METER-ORNL) research at the Bowen Power Plant, a coal-fired power plant in northwest Georgia, to include wetfall chemistry is evaluated using results of similar studies around other power plants, several atmospheric washout models, analysis of spatial variability in precipitation, and field logistical considerations. An optimal wetfall chemistry network design is proposed, incorporating the inner portion of the existing rain-gauge network, augmented by additional sites to ensure adequate coverage of probable target areas. At a pH of 3, the predicted sulfate production rate differs by about four orders of magnitude among the models reviewed; no model can claim superiority over any other without substantive data verification. The spatial uniformity of rain amount is evaluated using four storms that occurred over the METER-ORNL network. Spatial variability ranged from 8 to 31% and decreased as mean rainfall increased. The field study of wetfall chemistry will require a minimum of 5 persons to operate the approximately 50 collectors covering an area of 740 km². Preliminary wetfall-only samples collected on an event basis showed lower pH and higher electrical conductivity in precipitation collected about 5 km downwind of the power plant relative to samples collected upwind. Wetfall samples collected weekly using automatic samplers, however, showed variable results with no consistent pattern, suggesting the need for event sampling to minimize the variable rain volume and multiple-source effects often associated with weekly samples.
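One plausible reading of the 8-31% spatial-variability figures is a coefficient of variation of gauge catches per storm; a minimal sketch with made-up gauge data:

```python
import statistics

def spatial_variability(rain_mm):
    """Coefficient of variation (%) of one storm's rainfall across gauges."""
    return 100 * statistics.pstdev(rain_mm) / statistics.mean(rain_mm)

# Hypothetical gauge catches (mm) for a single storm over the network
cv = spatial_variability([12.0, 14.0, 11.0, 13.0, 15.0])
print(round(cv, 1))  # 10.9
```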
DOE Office of Scientific and Technical Information (OSTI.GOV)
Linard, Joshua; Campbell, Sam
This event included annual sampling of groundwater and surface water locations at the Gunnison, Colorado, Processing Site. Sampling and analyses were conducted as specified in the Sampling and Analysis Plan for U.S. Department of Energy Office of Legacy Management Sites (LMS/PRO/S04351, continually updated, http://energy.gov/lm/downloads/sampling-and analysis-plan-us-department-energy-office-legacy-management-sites). Samples were collected from 28 monitoring wells, three domestic wells, and six surface locations in April at the processing site as specified in the draft 2010 Ground Water Compliance Action Plan for the Gunnison, Colorado, Processing Site. Planned monitoring locations are shown in Attachment 1, Sampling and Analysis Work Order. Domestic wells 0476 and 0477 were sampled in June because the homes were unoccupied in April and the wells were not in use. Duplicate samples were collected from locations 0126, 0477, and 0780. One equipment blank was collected during this sampling event. Water levels were measured at all monitoring wells that were sampled. See Attachment 2, Trip Reports, for additional details. The analytical data and associated qualifiers can be viewed in environmental database reports and are also available for viewing with dynamic mapping via the GEMS (Geospatial Environmental Mapping System) website at http://gems.lm.doe.gov/#. No issues were identified during the data validation process that require additional action or follow-up. An assessment of anomalous data is included in Attachment 3. Interpretation and presentation of results, including an assessment of the natural flushing compliance strategy, will be reported in the upcoming 2016 Verification Monitoring Report.
NASA Technical Reports Server (NTRS)
Fura, David A.; Windley, Phillip J.; Cohen, Gerald C.
1993-01-01
This technical report contains the Higher-Order Logic (HOL) listings of the partial verification of the requirements and design for a commercially developed processor interface unit (PIU). The PIU is an interface chip performing memory interface, bus interface, and additional support services for a commercial microprocessor within a fault tolerant computer system. This system, the Fault Tolerant Embedded Processor (FTEP), is targeted towards applications in avionics and space requiring extremely high levels of mission reliability, extended maintenance-free operation, or both. This report contains the actual HOL listings of the PIU verification as it currently exists. Section 2 of this report contains general-purpose HOL theories and definitions that support the PIU verification, including arithmetic theories dealing with inequalities and associativity, and a collection of tactics used in the PIU proofs. Section 3 contains the HOL listings for the completed PIU design verification. Section 4 contains the HOL listings for the partial requirements verification of the P-Port.
Using SysML for verification and validation planning on the Large Synoptic Survey Telescope (LSST)
NASA Astrophysics Data System (ADS)
Selvy, Brian M.; Claver, Charles; Angeli, George
2014-08-01
This paper provides an overview of the tool, language, and methodology used for Verification and Validation Planning on the Large Synoptic Survey Telescope (LSST) Project. LSST has implemented a Model Based Systems Engineering (MBSE) approach as a means of defining all systems engineering planning and definition activities that have historically been captured in paper documents. Specifically, LSST has adopted the Systems Modeling Language (SysML) standard and is utilizing a software tool called Enterprise Architect, developed by Sparx Systems. Much of the historical use of SysML has focused on the early phases of the project life cycle. Our approach is to extend the advantages of MBSE into later stages of the construction project. This paper details the methodology employed to use the tool to document the verification planning phases, including the extension of the language to accommodate the project's needs. The process includes defining the Verification Plan for each requirement, which in turn consists of a Verification Requirement, Success Criteria, Verification Method(s), Verification Level, and Verification Owner. Each Verification Method for each Requirement is defined as a Verification Activity and mapped into Verification Events, which are collections of activities that can be executed concurrently in an efficient and complementary way. Verification Event dependency and sequences are modeled using Activity Diagrams. The methodology employed also ties in to the Project Management Control System (PMCS), which utilizes Primavera P6 software, mapping each Verification Activity as a step in a planned activity. This approach leads to full traceability from initial Requirement to scheduled, costed, and resource loaded PMCS task-based activities, ensuring all requirements will be verified.
An Approach to Biometric Verification Based on Human Body Communication in Wearable Devices
Li, Jingzhen; Liu, Yuhang; Nie, Zedong; Qin, Wenjian; Pang, Zengyao; Wang, Lei
2017-01-01
In this paper, an approach to biometric verification based on human body communication (HBC) is presented for wearable devices. For this purpose, the transmission gain S21 of a volunteer's forearm is measured by a vector network analyzer (VNA). Specifically, to determine the chosen frequency for biometric verification, 1800 groups of data are acquired from 10 volunteers in the frequency range 0.3 MHz to 1500 MHz, and each group includes 1601 sample data. In addition, to achieve rapid verification, 30 groups of data for each volunteer are acquired at the chosen frequency, and each group contains only 21 sample data. Furthermore, a threshold-adaptive template matching (TATM) algorithm based on weighted Euclidean distance is proposed for rapid verification in this work. The results indicate that the chosen frequency for biometric verification is from 650 MHz to 750 MHz. The false acceptance rate (FAR) and false rejection rate (FRR) based on TATM are approximately 5.79% and 6.74%, respectively. In contrast, the FAR and FRR were 4.17% and 37.5%, 3.37% and 33.33%, and 3.80% and 34.17% using K-nearest neighbor (KNN) classification, support vector machines (SVM), and naive Bayesian method (NBM) classification, respectively. In addition, the running time of TATM is 0.019 s, whereas the running times of KNN, SVM and NBM are 0.310 s, 0.0385 s, and 0.168 s, respectively. Therefore, TATM is suggested to be appropriate for rapid verification use in wearable devices. PMID:28075375
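The core of threshold-adaptive template matching, accepting a sample when its weighted Euclidean distance to an enrolled template falls below a subject-specific threshold, can be sketched as follows. All data and parameter values here are made up, and the paper's exact weighting and threshold-adaptation rules are not reproduced:

```python
import numpy as np

def tatm_verify(sample, template, weights, threshold):
    """Accept if the weighted Euclidean distance to the enrolled
    template is within the subject-specific threshold (minimal sketch)."""
    d = np.sqrt(np.sum(weights * (sample - template) ** 2))
    return bool(d <= threshold)

# Hypothetical enrolled template from 21-point S21 measurements (dB)
template = np.linspace(-40.0, -38.0, 21)
weights = np.full(21, 1.0 / 21)      # uniform weights, for illustration only
genuine = template + 0.05            # small intra-subject variation
impostor = template + 1.5            # a different forearm channel
print(tatm_verify(genuine, template, weights, threshold=0.5))   # True
print(tatm_verify(impostor, template, weights, threshold=0.5))  # False
```

In the paper's setting, the per-subject threshold would itself be adapted from enrollment data rather than fixed as here.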
Exomars Mission Verification Approach
NASA Astrophysics Data System (ADS)
Cassi, Carlo; Gilardi, Franco; Bethge, Boris
According to the long-term cooperation plan established by ESA and NASA in June 2009, the ExoMars project now consists of two missions. The first mission will be launched in 2016 under ESA lead, with the objectives of demonstrating the European capability to safely land a surface package on Mars, performing Mars atmosphere investigation, and providing communication capability for present and future ESA/NASA missions. For this mission ESA provides a spacecraft composite, made up of an Entry, Descent & Landing Demonstrator Module (EDM) and a Mars Orbiter Module (OM); NASA provides the launch vehicle and the scientific instruments located on the Orbiter for Mars atmosphere characterisation. A second mission, with its launch foreseen in 2018, is led by NASA, who provides the spacecraft and launcher, the EDL system, and a rover. ESA contributes the ExoMars Rover Module (RM) to provide surface mobility. It includes a drill system allowing drilling down to 2 meters, collecting samples and investigating them for signs of past and present life with exobiological experiments, and investigating the Mars water/geochemical environment. In this scenario Thales Alenia Space Italia, as ESA prime industrial contractor, is in charge of the design, manufacturing, integration, and verification of the ESA ExoMars modules, i.e. the spacecraft composite (OM + EDM) for the 2016 mission, the RM for the 2018 mission, and the Rover Operations Control Centre, which will be located at Altec-Turin (Italy). The verification process for the above products is quite complex and will include some peculiarities with limited or no heritage in Europe. Furthermore, the verification approach has to be optimised to allow full verification despite significant schedule and budget constraints.
The paper presents the verification philosophy tailored for the ExoMars mission in line with the above considerations, starting from the model philosophy, showing the flow of verification activities and the sharing of tests between the different levels (system, modules, subsystems, etc.), and giving an overview of the main tests defined at spacecraft level. The paper focuses mainly on the verification aspects of the EDL Demonstrator Module and the Rover Module, for which an intense testing activity without previous heritage in Europe is foreseen. In particular, the Descent Module has to survive Mars atmospheric entry and landing, and its surface platform has to stay operational for 8 sols on the Martian surface, transmitting scientific data to the Orbiter. The Rover Module has to perform a 180-sol mission in the Mars surface environment. These operative conditions cannot be verified by analysis alone; consequently, a test campaign is defined, including mechanical tests to simulate the entry loads, thermal tests in the Mars environment, and simulation of Rover operations on a 'Mars-like' terrain. Finally, the paper presents an overview of the documentation flow defined to ensure the correct translation of the mission requirements into verification activities (test, analysis, review of design), through to the final verification close-out of those requirements with the final verification reports.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Porcella, D.B.; Bowie, G.L.; Campbell, C.L.
The Ecosystem Assessment Model (EAM) of the Cooling Lake Assessment Methodology was applied to the extensive ecological field data collected at Lake Norman, North Carolina, by Duke Power Company to evaluate its capability to simulate lake ecosystems and the ecological effects of steam electric power plants. The EAM provided simulations over a five-year verification period that behaved as expected based on a one-year calibration. Major state variables of interest to utilities and regulatory agencies are temperature, dissolved oxygen, and fish community variables. In qualitative terms, temperature simulation was very accurate, dissolved oxygen simulation was accurate, and fish prediction was reasonably accurate. The need for more accurate fisheries data collected at monthly intervals, and for non-destructive sampling techniques, was identified.
How Formal Dynamic Verification Tools Facilitate Novel Concurrency Visualizations
NASA Astrophysics Data System (ADS)
Aananthakrishnan, Sriram; Delisi, Michael; Vakkalanka, Sarvani; Vo, Anh; Gopalakrishnan, Ganesh; Kirby, Robert M.; Thakur, Rajeev
With the exploding scale of concurrency, presenting the valuable information collected by formal verification tools intuitively and graphically can greatly enhance concurrent system debugging. Traditional MPI program debuggers present trace views of MPI program executions. Such views are redundant, often containing equivalent traces that permute independent MPI calls. In our ISP formal dynamic verifier for MPI programs, we present a collection of alternate views made possible by the use of formal dynamic verification. Some of ISP's views help pinpoint errors, some facilitate discerning errors by eliminating redundancy, while others help users understand the program better by displaying the concurrent event orderings that must be respected by all MPI implementations, in the form of completes-before graphs. In this paper, we describe ISP's graphical user interface (GUI) capabilities in all these areas, which are currently supported by a portable Java-based GUI, a Microsoft Visual Studio GUI, and an Eclipse-based GUI whose development is in progress.
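As a minimal illustration of the completes-before idea (not ISP's actual data structures; the event names and edges below are invented), an observed trace can be checked against a completes-before graph like this:

```python
# Hypothetical sketch: an observed MPI trace respects a completes-before
# graph if every required ordering edge (a, b) appears in order.
def respects_completes_before(trace, edges):
    """trace: list of event ids in observed order.
    edges: set of (a, b) pairs meaning a must complete before b."""
    position = {event: i for i, event in enumerate(trace)}
    return all(position[a] < position[b] for a, b in edges)

# Invented events: a nonblocking send/receive pair and their waits.
edges = {("P0:Isend", "P0:Wait"), ("P1:Irecv", "P1:Wait"),
         ("P0:Isend", "P1:Wait")}
ok = respects_completes_before(
    ["P0:Isend", "P1:Irecv", "P0:Wait", "P1:Wait"], edges)   # valid order
bad = respects_completes_before(
    ["P0:Wait", "P0:Isend", "P1:Irecv", "P1:Wait"], edges)   # wait before send
```

Two traces that respect the same graph while permuting independent calls are exactly the kind of redundancy ISP's views eliminate.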
NASA Technical Reports Server (NTRS)
Barile, Ronald G.; Fogarty, Chris; Cantrell, Chris; Melton, Gregory S.
1994-01-01
NASA personnel at Kennedy Space Center's Material Science Laboratory have developed new environmentally sound precision cleaning and verification techniques for systems and components found at the center. This technology is required to replace existing methods traditionally employing CFC-113. The new patent-pending technique of precision cleaning verification is for large components of cryogenic fluid systems. These are stainless steel, sand cast valve bodies with internal surface areas ranging from 0.2 to 0.9 sq m. Extrapolation of this technique to components of even larger sizes (by orders of magnitude) is planned. Currently, the verification process is completely manual. In the new technique, a high velocity, low volume water stream impacts the part to be verified. This process is referred to as Breathing Air/Water Impingement and forms the basis for the Impingement Verification System (IVS). The system is unique in that a gas stream is used to accelerate the water droplets to high speeds. Water is injected into the gas stream in a small, continuous amount. The air/water mixture is then passed through a converging/diverging nozzle where the gas is accelerated to supersonic velocities. These droplets impart sufficient energy to the precision cleaned surface to place non-volatile residue (NVR) contaminants into suspension in the water. The sample water is collected and its NVR level is determined by total organic carbon (TOC) analysis at 880 C. The TOC, in ppm carbon, is used to establish the NVR level. A correlation between the present gravimetric CFC-113 NVR and the IVS NVR is found from experimental sensitivity factors measured for various contaminants. The sensitivity has the units of ppm of carbon per mg/sq ft of contaminant. In this paper, the equipment is described and data are presented showing the development of the sensitivity factors from a test set including four NVRs impinged from witness plates of 0.05 to 0.75 sq m.
NASA Technical Reports Server (NTRS)
Barile, Ronald G.; Fogarty, Chris; Cantrell, Chris; Melton, Gregory S.
1995-01-01
NASA personnel at Kennedy Space Center's Material Science Laboratory have developed new environmentally sound precision cleaning and verification techniques for systems and components found at the center. This technology is required to replace existing methods traditionally employing CFC-113. The new patent-pending technique of precision cleaning verification is for large components of cryogenic fluid systems. These are stainless steel, sand cast valve bodies with internal surface areas ranging from 0.2 to 0.9 m(exp 2). Extrapolation of this technique to components of even larger sizes (by orders of magnitude) is planned. Currently, the verification process is completely manual. In the new technique, a high velocity, low volume water stream impacts the part to be verified. This process is referred to as Breathing Air/Water Impingement and forms the basis for the Impingement Verification System (IVS). The system is unique in that a gas stream is used to accelerate the water droplets to high speeds. Water is injected into the gas stream in a small, continuous amount. The air/water mixture is then passed through a converging-diverging nozzle where the gas is accelerated to supersonic velocities. These droplets impart sufficient energy to the precision cleaned surface to place non-volatile residue (NVR) contaminants into suspension in the water. The sample water is collected and its NVR level is determined by total organic carbon (TOC) analysis at 880 C. The TOC, in ppm carbon, is used to establish the NVR level. A correlation between the present gravimetric CFC-113 NVR and the IVS NVR is found from experimental sensitivity factors measured for various contaminants. The sensitivity has the units of ppm of carbon per mg/ft(exp 2) of contaminant. In this paper, the equipment is described and data are presented showing the development of the sensitivity factors from a test set including four NVRs impinged from witness plates of 0.05 to 0.75 m(exp 2).
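The sensitivity-factor arithmetic described above can be sketched as follows; the numbers are illustrative placeholders, not measured values from the papers:

```python
# Hedged sketch of the IVS sensitivity-factor arithmetic: a sensitivity
# factor (ppm carbon per mg/sq ft) is derived from a witness plate with a
# known contaminant loading, then used to convert TOC readings to NVR.
def sensitivity_factor(toc_ppm, contaminant_mg, plate_area_sqft):
    """ppm carbon per (mg of contaminant per sq ft), from a witness plate."""
    return toc_ppm / (contaminant_mg / plate_area_sqft)

def nvr_level(toc_ppm, sensitivity):
    """Convert a TOC reading back to an NVR level in mg/sq ft."""
    return toc_ppm / sensitivity

# Illustrative numbers only: 10 mg on a 5 sq ft plate yielding 4 ppm TOC.
s = sensitivity_factor(toc_ppm=4.0, contaminant_mg=10.0, plate_area_sqft=5.0)
nvr = nvr_level(toc_ppm=3.0, sensitivity=s)
```

In practice a separate factor would be measured per contaminant, which is what the reported test set of four NVRs provides.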
NASA Technical Reports Server (NTRS)
Wolf, Michael
2012-01-01
A document describes an algorithm created to estimate the mass placed on a sample verification sensor (SVS) designed for lunar or planetary robotic sample return missions. A novel SVS measures the capacitance between a rigid bottom plate and an elastic top membrane in seven locations. As additional sample material (soil and/or small rocks) is placed on the top membrane, the deformation of the membrane increases the capacitance. The mass estimation algorithm addresses both the calibration of each SVS channel and how to combine the capacitances read from the seven channels into a single mass estimate. The probabilistic approach combines the channels according to the variance observed during the training phase, and provides not only the mass estimate but also a value for the certainty of the estimate. SVS capacitance data are collected for known masses under a wide variety of possible loading scenarios, though in all cases the distribution of sample within the canister is expected to be approximately uniform. A capacitance-vs-mass curve is fitted to these data and is subsequently used to determine the mass estimate from a single channel's capacitance reading during the measurement phase. This results in seven different mass estimates, one for each SVS channel. Moreover, the variance of the calibration data is used to place a Gaussian probability distribution function (pdf) around each mass estimate. To blend these seven estimates, the seven pdfs are combined into a single Gaussian distribution, providing the final mean and variance of the estimate. This blending technique essentially takes the final estimate as an average of the seven channels' estimates, weighted by the inverse of each channel's variance.
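The blending step can be sketched as inverse-variance weighting of Gaussian estimates; the per-channel numbers below are invented for illustration:

```python
# Sketch of inverse-variance fusion of per-channel Gaussian mass estimates,
# as described above. A channel with larger calibration variance (channel 4
# here) contributes less to the fused estimate.
def fuse_estimates(means, variances):
    """Combine per-channel Gaussian estimates into one Gaussian.
    Returns (fused mean, fused variance)."""
    weights = [1.0 / v for v in variances]
    total = sum(weights)
    mean = sum(w * m for w, m in zip(weights, means)) / total
    return mean, 1.0 / total

means = [10.0, 10.0, 10.0, 14.0, 10.0, 10.0, 10.0]       # grams, invented
variances = [1.0, 1.0, 1.0, 4.0, 1.0, 1.0, 1.0]          # invented
m, v = fuse_estimates(means, variances)
```

Note that the fused variance (1 over the summed weights) is smaller than any single channel's variance, which is the source of the certainty value the algorithm reports.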
Deductive Evaluation: Implicit Code Verification With Low User Burden
NASA Technical Reports Server (NTRS)
Di Vito, Ben L.
2016-01-01
We describe a framework for symbolically evaluating C code using a deductive approach that discovers and proves program properties. The framework applies Floyd-Hoare verification principles in its treatment of loops, with a library of iteration schemes serving to derive loop invariants. During evaluation, theorem proving is performed on-the-fly, obviating the generation of verification conditions normally needed to establish loop properties. A PVS-based prototype is presented along with results for sample C functions.
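The loop-invariant idea underlying Floyd-Hoare verification can be illustrated with a toy summation loop; here the invariant is merely checked dynamically at each iteration, whereas the framework described above derives and proves it deductively:

```python
# Toy illustration of a Floyd-Hoare loop invariant for a summation loop:
# at the top of every iteration, total equals the sum of the first i
# elements. The invariant plus the loop exit condition yields the
# postcondition total == sum(xs).
def summed_with_invariant(xs):
    total, i = 0, 0
    while i < len(xs):
        assert total == sum(xs[:i])  # loop invariant (checked, not proved)
        total += xs[i]
        i += 1
    assert total == sum(xs)          # postcondition
    return total
```

An iteration-scheme library, as in the paper, would recognize this accumulation pattern and supply the invariant automatically rather than requiring the user to state it.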
Improved Detection Technique for Solvent Rinse Cleanliness Verification
NASA Technical Reports Server (NTRS)
Hornung, S. D.; Beeson, H. D.
2001-01-01
The NASA White Sands Test Facility (WSTF) has an ongoing effort to reduce or eliminate usage of cleaning solvents such as CFC-113 and its replacements. These solvents are used in the final clean and cleanliness verification processes for flight and ground support hardware, especially for oxygen systems where organic contaminants can pose an ignition hazard. For the final cleanliness verification in the standard process, the equivalent of one square foot of surface area of parts is rinsed with the solvent, and the final 100 mL of the rinse is captured. The amount of nonvolatile residue (NVR) in the solvent is determined by weight after the evaporation of the solvent. An improved process of sampling this rinse, developed at WSTF, requires evaporation of less than 2 mL of the solvent to make the cleanliness verification. Small amounts of the solvent are evaporated in a clean stainless steel cup, and the cleanliness of the stainless steel cup is measured using a commercially available surface quality monitor. The effectiveness of this new cleanliness verification technique was compared to the accepted NVR sampling procedures. Testing with known contaminants in solution, such as hydraulic fluid, fluorinated lubricants, and cutting and lubricating oils, was performed to establish a correlation between amount in solution and the process response. This report presents the approach and results and discusses the issues in establishing the surface quality monitor-based cleanliness verification.
EOS-AM precision pointing verification
NASA Technical Reports Server (NTRS)
Throckmorton, A.; Braknis, E.; Bolek, J.
1993-01-01
The Earth Observing System (EOS) AM mission requires tight pointing knowledge to meet scientific objectives, in a spacecraft with low-frequency flexible appendage modes. As the spacecraft controller reacts to various disturbance sources and the inherent appendage modes are excited by this control action, verification of precision pointing knowledge becomes particularly challenging for the EOS-AM mission. As presently conceived, this verification includes a complementary set of multidisciplinary analyses, hardware tests, and real-time computer-in-the-loop simulations, followed by collection and analysis of hardware test and flight data, supported by a comprehensive database repository for validated program values.
Baghouse filtration products (BFPs) were evaluated by the Air Pollution Control Technology (APCT) pilot of the Environmental Technology Verification (ETV) Program. The performance factor verified was the mean outlet particle concentration for the filter fabric as a function of th...
The U.S. Environmental Protection Agency (EPA) has created the Environmental Technology Verification Program (ETV) to facilitate the deployment of innovative or improved environmental techn... and to design efficient processes for conducting performance tests of innovative technologies.
DOE Office of Scientific and Technical Information (OSTI.GOV)
P.C. Weaver
2009-02-17
The objective was to conduct verification surveys of grids at the DWI 1630 Site in Knoxville, Tennessee. The independent verification team (IVT) from ORISE conducted verification activities in whole and partial grids, as completed by BJC. ORISE site activities included gamma surface scans and soil sampling within 33 grids: G11 through G14; H11 through H15; X14, X15, X19, and X21; J13 through J15 and J17 through J21; K7 through K9 and K13 through K15; L13 through L15; and M14 through M16.
Environmental assessment of creosote-treated pilings in the marine environment
DOE Office of Scientific and Technical Information (OSTI.GOV)
Butala, J.H.; Webb, D.A.; Jop, K.M.
1995-12-31
A comprehensive ecological risk assessment was conducted to evaluate the environmental impact of creosote-treated pilings in the marine environment at Moss Landing Harbor, Moss Landing, California. The four areas of investigation comprising the risk assessment were (1) evaluation of environmental conditions around existing creosote-treated pilings, (2) investigation of effects related to restoration of pilings, (3) assessment of creosote migration into the surrounding environment one year after pile-driving, and (4) confirmation of creosote toxicity in laboratory studies. Biological and chemical evaluation of the impact of creosote-treated pilings was conducted on surface sheen, water column, and sediment samples collected at Moss Landing Harbor. Water samples (surface sheen, water column, and sediment pore water) were evaluated using short-term chronic exposures with Mysidopsis bahia, while bulk sediment samples were evaluated with 10-day sediment toxicity tests with Ampelisca abdita. Samples of surface sheen, column water, and sediment were analyzed for the constituents of creosote by GC mass spectrometry. In addition, a sample of the neat material used to preserve treated pilings served as a reference for the polyaromatic hydrocarbons. Verification of organism response and analyses of field-collected samples were performed by conducting 10-day A. abdita sediment and 7-day M. bahia elutriate exposures with creosote applied to clean sediment collected at Moss Landing. Evaluations were also performed to determine the effects of photoinduced toxicity on test organisms exposed to PAHs. The biological and analytical results of the field and laboratory exposures are being used to evaluate and determine the risk of creosote-treated pilings to the marine environment.
International Space Station Requirement Verification for Commercial Visiting Vehicles
NASA Technical Reports Server (NTRS)
Garguilo, Dan
2017-01-01
The COTS program demonstrated NASA could rely on commercial providers for safe, reliable, and cost-effective cargo delivery to ISS. The ISS Program has developed a streamlined process to safely integrate commercial visiting vehicles and ensure requirements are met. NASA levies a minimum requirement set (down from thousands to hundreds) focusing on the ISS interface and safety, reducing the level of NASA oversight/insight and the burden on the commercial partner. Partners provide a detailed verification and validation plan documenting how they will show they have met NASA requirements. NASA conducts process sampling to ensure that the established verification processes are being followed. NASA participates in joint verification events and analyses for requirements that both parties must verify. Verification compliance is approved by NASA, and launch readiness is certified at mission readiness reviews.
Eblen, Denise R; Barlow, Kristina E; Naugle, Alecia Larew
2006-11-01
The U.S. Food Safety and Inspection Service (FSIS) pathogen reduction-hazard analysis critical control point systems final rule, published in 1996, established Salmonella performance standards for broiler chicken, cow and bull, market hog, and steer and heifer carcasses and for ground beef, chicken, and turkey meat. In 1998, the FSIS began testing to verify that establishments are meeting performance standards. Samples are collected in sets in which the number of samples is defined but varies according to product class. A sample set fails when the number of positive Salmonella samples exceeds the maximum number of positive samples allowed under the performance standard. Salmonella sample sets collected at 1,584 establishments from 1998 through 2003 were examined to identify factors associated with failure of one or more sets. Overall, 1,282 (80.9%) of establishments never had failed sets. In establishments that did experience set failure(s), the failed sets were generally collected early in the establishment testing history, with the exception of broiler establishments, where failure(s) occurred both early and late in the course of testing. Small establishments were more likely to have experienced a set failure than were large or very small establishments, and broiler establishments were more likely to have failed than were ground beef, market hog, or steer-heifer establishments. The agency's response to failed Salmonella sample sets, in the form of in-depth verification reviews and related establishment-initiated corrective actions, has likely contributed to declines in the number of establishments that failed sets. A focus on food safety measures in small establishments and broiler processing establishments should further reduce the number of sample sets that fail to meet the Salmonella performance standard.
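The set-failure rule can be sketched as below; the per-class set sizes and maximum allowed positives are hypothetical placeholders, not the actual FSIS standards:

```python
# Sketch of the sample-set failure rule described above: a set fails when
# positive Salmonella samples exceed the class's allowed maximum. The
# limits and set sizes here are invented placeholders.
MAX_POSITIVES = {"broilers": 12, "ground beef": 5}   # hypothetical limits
SET_SIZE = {"broilers": 51, "ground beef": 53}       # hypothetical sizes

def set_fails(product_class, positives):
    """True when a completed sample set exceeds the allowed positives."""
    if positives > SET_SIZE[product_class]:
        raise ValueError("more positives than samples in the set")
    return positives > MAX_POSITIVES[product_class]
```

Because set size varies by product class, the same count of positives can fail one class's standard and pass another's.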
Verification and characterization of chromosome duplication in haploid maize.
de Oliveira Couto, E G; Resende Von Pinho, E V; Von Pinho, R G; Veiga, A D; de Carvalho, M R; de Oliveira Bustamante, F; Nascimento, M S
2015-06-26
Doubled haploid technology has been used by various private companies. However, information regarding chromosome duplication methodologies, particularly techniques used to identify duplication in cells, is limited. Thus, we analyzed and characterized artificially doubled haploids using microsatellite molecular markers, pollen viability, and flow cytometry techniques. The evaluated material was obtained by applying two different chromosome duplication protocols to maize seeds considered haploid, resulting from the cross between the haploid inducer line KEMS and 4 hybrids (GNS 3225, GNS 3032, GNS 3264, and DKB 393). Fourteen days after duplication, plant samples were collected and assessed by flow cytometry. The plants were then transplanted to a field, and samples were collected for DNA analyses using microsatellite markers. The tassels were collected during anthesis for pollen viability analyses. Haploid, diploid, and mixoploid individuals were detected using flow cytometry, demonstrating that this technique was efficient for identifying doubled haploids. The microsatellite markers were also efficient for confirming the ploidies preselected by flow cytometry and for identifying homozygous individuals. Pollen viability showed a significant difference between the evaluated ploidies when the Alexander and propionic-carmine stains were used. The viability rates of the ploidies analyzed show potential for fertilization.
Specification and Verification of Medical Monitoring System Using Petri-nets.
Majma, Negar; Babamir, Seyed Morteza
2014-07-01
To monitor patient behavior, data are collected from the patient's body by a medical monitoring device, and embedded software calculates the output. Incorrect calculations may endanger the patient's life if the software fails to meet the patient's requirements. Accordingly, the correctness of the software's behavior is a matter of concern in medicine; moreover, the data collected from the patient's body are fuzzy. Some methods have already dealt with monitoring medical monitoring devices; however, model-based monitoring of the fuzzy computations of such devices has been addressed less. The present paper synthesizes a fuzzy Petri-net (FPN) model to verify the behavior of a sample medical monitoring device, a continuous insulin infusion (INS) pump, because the Petri-net (PN) is a formal and visual method for verifying software behavior. The device is worn by diabetic patients, and its software calculates the INS dose and makes an injection decision. The input and output of the infusion INS software are not crisp in the real world; therefore, we represent them as fuzzy variables and use an FPN instead of a classical PN to model them. The paper follows three steps to synthesize an FPN for verification of the infusion INS device: (1) definition of fuzzy variables, (2) definition of fuzzy rules, and (3) design of the FPN model to verify the software behavior.
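Representing a sensed input as a fuzzy variable, the first of the three steps, might look like the following sketch; the membership breakpoints and the rule are invented for illustration and are not taken from the paper:

```python
# Sketch of a fuzzy variable for a sensed input, as an FPN model would use.
# Breakpoints and the rule below are invented, not the paper's values.
def triangular(x, a, b, c):
    """Triangular membership function: 0 outside [a, c], peaking at b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

# Hypothetical fuzzy sets for blood glucose (mg/dL): "normal" and "high".
glucose = 160.0
mu_normal = triangular(glucose, 70, 100, 140)
mu_high = triangular(glucose, 120, 180, 240)
# Firing strength of a rule "IF glucose is high THEN increase dose":
dose_increase_strength = mu_high
```

In the FPN, such membership degrees become token values, and rule firing strengths determine which transitions fire and how strongly.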
Comprehensive Evaluation and Implementation of Improvement Actions in Butcher Shops
Leotta, Gerardo A.; Brusa, Victoria; Galli, Lucía; Adriani, Cristian; Linares, Luciano; Etcheverría, Analía; Sanz, Marcelo; Sucari, Adriana; Peral García, Pilar; Signorini, Marcelo
2016-01-01
Foodborne pathogens can cause acute and chronic diseases and produce a wide range of symptoms. Since the consumption of ground beef is a risk factor for infections with some bacterial pathogens, we performed a comprehensive evaluation of butcher shops, implemented improvement actions for both butcher shops and consumers, and verified the impact of those actions implemented. A comprehensive evaluation was made and risk was quantified on a 1–100 scale as high-risk (1–40), moderate-risk (41–70) or low-risk (71–100). A total of 172 raw ground beef and 672 environmental samples were collected from 86 butcher shops during the evaluation (2010–2011) and verification (2013) stages of the study. Ground beef samples were analyzed for mesophilic aerobic organisms, Escherichia coli and coagulase-positive Staphylococcus aureus enumeration. Salmonella spp., E. coli O157:H7, non-O157 Shiga toxin-producing E. coli (STEC), and Listeria monocytogenes were detected and isolated from all samples. Risk quantification resulted in 43 (50.0%) high-risk, 34 (39.5%) moderate-risk, and nine (10.5%) low-risk butcher shops. Training sessions for 498 handlers and 4,506 consumers were held. Re-evaluation by risk quantification and microbiological analyses resulted in 19 (22.1%) high-risk, 42 (48.8%) moderate-risk and 25 (29.1%) low-risk butcher shops. The count of indicator microorganisms decreased with respect to the 2010–2011 period. After the implementation of improvement actions, the presence of L. monocytogenes, E. coli O157:H7 and stx genes in ground beef decreased. Salmonella spp. was isolated from 10 (11.6%) ground beef samples, without detecting statistically significant differences between both study periods (evaluation and verification). The percentage of pathogens in environmental samples was reduced in the verification period (Salmonella spp., 1.5%; L. monocytogenes, 10.7%; E. coli O157:H7, 0.6%; non-O157 STEC, 6.8%). 
Risk quantification was useful to identify those relevant facts in butcher shops. The reduction of contamination in ground beef and the environment was possible after training handlers based on the problems identified in their own butcher shops. Our results confirm the feasibility of implementing a comprehensive risk management program in butcher shops, and the importance of information campaigns targeting consumers. Further collaborative efforts would be necessary to improve foodstuffs safety at retail level and at home. PMID:27618439
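The 1–100 risk bands used in the evaluation map to categories as in this small sketch:

```python
# Sketch of the study's risk quantification bands on the 1-100 scale:
# high-risk (1-40), moderate-risk (41-70), low-risk (71-100).
def risk_category(score):
    if not 1 <= score <= 100:
        raise ValueError("score must be on the 1-100 scale")
    if score <= 40:
        return "high-risk"
    if score <= 70:
        return "moderate-risk"
    return "low-risk"
```

Applied to the reported counts, the 2013 verification moved many shops from the high-risk band into the moderate- and low-risk bands.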
Federal Register 2010, 2011, 2012, 2013, 2014
2012-10-19
... DEPARTMENT OF VETERANS AFFAIRS [OMB Control No. 2900-0406] Proposed Information Collection... any VA-guaranteed loans on an automatic basis. DATES: Written comments and recommendations on the... written comments on the collection of information through the Federal Docket Management System (FDMS) at...
78 FR 67204 - Agency Information Collection Activities: Proposed Collection; Comment Request
Federal Register 2010, 2011, 2012, 2013, 2014
2013-11-08
... action to submit an information collection request to the Office of Management and Budget (OMB) and... Verification System (LVS) has been developed, providing an electronic method for fulfilling this requirement... publicly available documents, including the draft supporting statement, at the NRC's Public Document Room...
78 FR 5409 - Ongoing Equivalence Verifications of Foreign Food Regulatory Systems
Federal Register 2010, 2011, 2012, 2013, 2014
2013-01-25
... of data shared. Finally, with respect to POE re-inspections, NACMPI recommended the targeting of high-risk product and high-risk imports for sampling and other verification activities during reinspection... authority; the availability of contingency plans in the country for containing and mitigating the effects of...
Baghouse filtration products (BFPs) were evaluated by the Air Pollution Control Technology (APCT) Verification Center. The performance factor verified was the mean outlet particle concentration for the filter fabric as a function of the size of those particles equal to and smalle...
Baghouse filtration products (BFPs) were evaluated by the Air Pollution Control Technology (APCT) pilot of the Environmental Technology Verification (ETV) Program. The performance factor verified was the mean outlet particle concentration for the filter fabric as a function of th...
Baghouse filtration products (BFPs) were evaluated by the Air Pollution Control Technology (APCT) Verification Center. The performance factor verified was the mean outlet particle concentration for the filter fabric as a function of the size for particles equal to or smaller than...
Zlotnik, V.A.; McGuire, V.L.
1998-01-01
Using the developed theory and modified Springer-Gelhar (SG) model, an identification method is proposed for estimating hydraulic conductivity from multi-level slug tests. The computerized algorithm calculates hydraulic conductivity from both monotonic and oscillatory well responses obtained using a double-packer system. Field verification of the method was performed at a specially designed fully penetrating well of 0.1-m diameter with a 10-m screen in a sand and gravel alluvial aquifer (MSEA site, Shelton, Nebraska). During well installation, disturbed core samples were collected every 0.6 m using a split-spoon sampler. Vertical profiles of hydraulic conductivity were produced on the basis of grain-size analysis of the disturbed core samples. These results closely correlate with the vertical profile of horizontal hydraulic conductivity obtained by interpreting multi-level slug test responses using the modified SG model. The identification method was applied to interpret the response from 474 slug tests in 156 locations at the MSEA site. More than 60% of responses were oscillatory. The method produced a good match to experimental data for both oscillatory and monotonic responses using an automated curve matching procedure. The proposed method allowed us to drastically increase the efficiency of each well used for aquifer characterization and to process massive arrays of field data. Recommendations generalizing this experience to massive application of the proposed method are developed.
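The abstract above describes fitting monotonic and oscillatory slug-test responses to recover hydraulic conductivity. As a minimal sketch of the simpler monotonic case (this is a classical Hvorslev-type analysis, not the modified Springer-Gelhar model the paper uses, and all well-geometry values are illustrative assumptions):

```python
import math

# Sketch (NOT the modified SG model): for a monotonic (overdamped)
# slug-test response, the normalized head decays roughly exponentially,
# h(t)/h0 = exp(-t/T0). The basic time lag T0 is recovered by a
# log-linear least-squares fit, and hydraulic conductivity K then follows
# from a Hvorslev-type shape formula. Geometry values are assumed.

def fit_time_lag(times, heads, h0):
    """Least-squares slope of ln(h/h0) vs t gives -1/T0."""
    ys = [math.log(h / h0) for h in heads]
    n = len(times)
    mx = sum(times) / n
    my = sum(ys) / n
    slope = sum((x - mx) * (y - my) for x, y in zip(times, ys)) / \
            sum((x - mx) ** 2 for x in times)
    return -1.0 / slope  # T0 in seconds

def hvorslev_K(r_casing, R_screen, L_screen, T0):
    """One common Hvorslev variant for a screen with L/R > 8."""
    return (r_casing ** 2) * math.log(L_screen / R_screen) / (2.0 * L_screen * T0)

# Synthetic monotonic response with a true time lag of 12 s
T0_true = 12.0
times = [float(t) for t in range(1, 21)]
heads = [math.exp(-t / T0_true) for t in times]

T0_est = fit_time_lag(times, heads, h0=1.0)
K = hvorslev_K(r_casing=0.05, R_screen=0.05, L_screen=1.0, T0=T0_est)
print(f"T0 = {T0_est:.2f} s, K = {K:.2e} m/s")
```

Oscillatory (underdamped) responses, which the paper reports for more than 60% of tests, require fitting a damped-oscillation model instead and are not covered by this sketch.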
Federal Register 2010, 2011, 2012, 2013, 2014
2013-07-31
... to determine Filipino Veterans or beneficiaries receiving benefit at the full-dollar rate continues... approved collection. Abstract: VA Form Letter 21-914 is used to verify whether Filipino Veterans of the...
Federal Register 2010, 2011, 2012, 2013, 2014
2010-10-07
... to determine Filipino veterans or beneficiaries receiving benefit at the full-dollar rate continues... approved collection. Abstract: VA Form Letter 21-914 is used to verify whether Filipino veterans of the...
Weak lensing magnification in the Dark Energy Survey Science Verification data
NASA Astrophysics Data System (ADS)
Garcia-Fernandez, M.; Sanchez, E.; Sevilla-Noarbe, I.; Suchyta, E.; Huff, E. M.; Gaztanaga, E.; Aleksić, J.; Ponce, R.; Castander, F. J.; Hoyle, B.; Abbott, T. M. C.; Abdalla, F. B.; Allam, S.; Annis, J.; Benoit-Lévy, A.; Bernstein, G. M.; Bertin, E.; Brooks, D.; Buckley-Geer, E.; Burke, D. L.; Carnero Rosell, A.; Carrasco Kind, M.; Carretero, J.; Crocce, M.; Cunha, C. E.; D'Andrea, C. B.; da Costa, L. N.; DePoy, D. L.; Desai, S.; Diehl, H. T.; Eifler, T. F.; Evrard, A. E.; Fernandez, E.; Flaugher, B.; Fosalba, P.; Frieman, J.; García-Bellido, J.; Gerdes, D. W.; Giannantonio, T.; Gruen, D.; Gruendl, R. A.; Gschwend, J.; Gutierrez, G.; James, D. J.; Jarvis, M.; Kirk, D.; Krause, E.; Kuehn, K.; Kuropatkin, N.; Lahav, O.; Lima, M.; MacCrann, N.; Maia, M. A. G.; March, M.; Marshall, J. L.; Melchior, P.; Miquel, R.; Mohr, J. J.; Plazas, A. A.; Romer, A. K.; Roodman, A.; Rykoff, E. S.; Scarpine, V.; Schubnell, M.; Smith, R. C.; Soares-Santos, M.; Sobreira, F.; Tarle, G.; Thomas, D.; Walker, A. R.; Wester, W.; DES Collaboration
2018-05-01
In this paper, the effect of weak lensing magnification on galaxy number counts is studied by cross-correlating the positions of two galaxy samples, separated in redshift, using the Dark Energy Survey Science Verification data set. This analysis is carried out for galaxies selected only by their photometric redshifts. An extensive analysis of systematic effects is performed using new simulation-based methods, including a Monte Carlo sampling of the selection function of the survey.
DOE Office of Scientific and Technical Information (OSTI.GOV)
J. M. Capron
2008-04-29
The 100-F-26:12 waste site was an approximately 308-m-long, 1.8-m-diameter east-west-trending reinforced concrete pipe that joined the North Process Sewer Pipelines (100-F-26:1) and the South Process Pipelines (100-F-26:4) with the 1.8-m reactor cooling water effluent pipeline (100-F-19). In accordance with this evaluation, the verification sampling results support a reclassification of this site to Interim Closed Out. The results of verification sampling show that residual contaminant concentrations do not preclude any future uses and allow for unrestricted use of shallow zone soils. The results also demonstrate that residual contaminant concentrations are protective of groundwater and the Columbia River.
Check-Standard Testing Across Multiple Transonic Wind Tunnels with the Modern Design of Experiments
NASA Technical Reports Server (NTRS)
Deloach, Richard
2012-01-01
This paper reports the results of an analysis of wind tunnel data acquired in support of the Facility Analysis Verification & Operational Reliability (FAVOR) project. The analysis uses methods referred to collectively at Langley Research Center as the Modern Design of Experiments (MDOE). These methods quantify the total variance in a sample of wind tunnel data and partition it into explained and unexplained components; the unexplained component is further partitioned into random and systematic components. This analysis was performed on data acquired in similar wind tunnel tests executed in four different U.S. transonic facilities. The measurement environment of each facility was quantified and compared.
Federal Register 2010, 2011, 2012, 2013, 2014
2012-06-27
... FURTHER INFORMATION CONTACT: Denise McLamb, Enterprise Records Service (005R1B), Department of Veterans... DEPARTMENT OF VETERANS AFFAIRS [OMB Control No. 2900-0673] Agency Information Collection (One-VA..., Security, and Preparedness, Department of Veterans Affairs, will submit the collection of information...
Development of Sample Verification System for Sample Return Missions
NASA Technical Reports Server (NTRS)
Toda, Risaku; McKinney, Colin; Jackson, Shannon P.; Mojarradi, Mohammad; Trebi-Ollennu, Ashitey; Manohara, Harish
2011-01-01
This paper describes the development of a proof-of-concept sample verification system (SVS) for in-situ mass measurement of planetary rock and soil samples in future robotic sample return missions. Our proof-of-concept SVS device contains a 10-cm-diameter pressure-sensitive elastic membrane placed at the bottom of a sample canister. The membrane deforms under the weight of accumulating planetary sample and is positioned in proximity to an opposing substrate across a narrow gap. Deformation of the membrane narrows the gap, increasing the capacitance between the two nearly parallel plates. Capacitance readout circuitry on a nearby printed circuit board (PCB) transmits data via a low-voltage differential signaling (LVDS) interface. The fabricated SVS proof-of-concept device has successfully demonstrated a capacitance change of approximately 1 pF/gram.
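The sensing principle described above is the parallel-plate relation C = ε₀A/d: sample weight deflects the membrane, shrinking the gap d and raising C. A minimal numeric sketch follows; the membrane diameter comes from the abstract, but the initial gap and deflection-per-gram figures are illustrative assumptions, not values from the paper:

```python
import math

# Parallel-plate sketch of the SVS capacitive readout: C = eps0 * A / d.
# Membrane deflection under sample weight shrinks the gap d, increasing C.
# Gap and deflection-per-gram values below are assumed for illustration.

EPS0 = 8.854e-12          # vacuum permittivity, F/m
DIAMETER = 0.10           # 10 cm membrane (from the abstract), m
AREA = math.pi * (DIAMETER / 2) ** 2

def capacitance(gap_m):
    return EPS0 * AREA / gap_m

gap0 = 500e-6                     # assumed initial gap: 500 um
deflection_per_gram = 1e-6        # assumed: 1 um of deflection per gram

c0 = capacitance(gap0)
c10 = capacitance(gap0 - 10 * deflection_per_gram)  # after 10 g of sample
delta_pf = (c10 - c0) * 1e12
print(f"C0 = {c0 * 1e12:.1f} pF, dC after 10 g = {delta_pf:.2f} pF")
```

Because C varies as 1/d, the response grows steeper as the gap closes, so a real readout would calibrate capacitance against known masses rather than assume linearity.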
Towards Trustable Digital Evidence with PKIDEV: PKI Based Digital Evidence Verification Model
NASA Astrophysics Data System (ADS)
Uzunay, Yusuf; Incebacak, Davut; Bicakci, Kemal
How can digital evidence be captured and preserved securely? For the investigation and prosecution of criminal activities that involve computers, digital evidence collected at the crime scene is of vital importance. On one hand, it is a very challenging task for forensics professionals to collect it without any loss or damage; on the other, there is the second problem of ensuring integrity and authenticity in order to achieve legal acceptance in a court of law. By conceiving of digital evidence simply as one instance of digital data, it is evident that modern cryptography offers elegant solutions for this second problem. However, to our knowledge, no previous work proposes a systematic model with a holistic view addressing all the related security problems in this particular case of digital evidence verification. In this paper, we present PKIDEV (Public Key Infrastructure based Digital Evidence Verification model) as an integrated solution to secure the process of capturing and preserving digital evidence. PKIDEV employs, inter alia, cryptographic techniques such as digital signatures and secure time-stamping, as well as technologies such as GPS and EDGE. In our study, we also identify the problems public-key cryptography brings when applied to the verification of digital evidence.
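The core integrity idea the abstract describes (hash the evidence at capture, bind the hash to collection metadata, authenticate the record) can be sketched with the Python standard library. This is not the PKIDEV protocol: a shared-key HMAC stands in for the public-key signature and trusted time-stamp the paper's model uses, and all names and values are hypothetical:

```python
import hashlib
import hmac
import json

# Minimal sketch of the capture-and-preserve idea behind PKIDEV: hash the
# evidence at collection time, bind the hash to collection metadata, and
# authenticate the record. A shared-key HMAC stands in for the PKI
# signature and secure time-stamp of the actual model.

KEY = b"demo-investigator-key"   # stand-in for a private signing key

def seal_evidence(data: bytes, collected_by: str) -> dict:
    record = {
        "sha256": hashlib.sha256(data).hexdigest(),
        "collected_by": collected_by,
        "timestamp": 1700000000,   # fixed value for reproducibility
    }
    payload = json.dumps(record, sort_keys=True).encode()
    record["tag"] = hmac.new(KEY, payload, hashlib.sha256).hexdigest()
    return record

def verify_evidence(data: bytes, record: dict) -> bool:
    body = {k: v for k, v in record.items() if k != "tag"}
    payload = json.dumps(body, sort_keys=True).encode()
    tag_ok = hmac.compare_digest(
        record["tag"], hmac.new(KEY, payload, hashlib.sha256).hexdigest())
    return tag_ok and hashlib.sha256(data).hexdigest() == record["sha256"]

evidence = b"disk image bytes..."
record = seal_evidence(evidence, "officer-42")
print(verify_evidence(evidence, record))            # intact evidence
print(verify_evidence(b"tampered bytes", record))   # altered evidence
```

A real PKI-based design replaces the shared key with per-investigator key pairs, so any party holding the public key (e.g. the court) can verify the seal without being able to forge one.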
Chemical Analysis Results for Potable Water from ISS Expeditions 21 to 25
NASA Technical Reports Server (NTRS)
Straub, John E., II; Plumlee, Debrah K.; Schultz, John R.; McCoy, J. Torin
2010-01-01
The Johnson Space Center Water and Food Analytical Laboratory (WAFAL) performed detailed ground-based analyses of archival water samples for verification of the chemical quality of the International Space Station (ISS) potable water supplies for Expeditions 21 to 25. Over a 14-month period, the Space Shuttle visited the ISS on five occasions to complete construction and deliver supplies. The onboard supplies of potable water available for consumption by the Expeditions 21 to 25 crews consisted of Russian ground-supplied potable water, Russian potable water regenerated from humidity condensate, and US potable water recovered from urine distillate and condensate. Chemical archival water samples that were collected with U.S. hardware during Expeditions 21 to 25 were returned on Shuttle flights STS-129 (ULF3), STS-130 (20A), STS-131 (19A), STS-132 (ULF4) and STS-133 (ULF5), as well as on Soyuz flights 19-22. This paper reports the analytical results for the returned archival water samples and evaluates their compliance with ISS water quality standards. The WAFAL also received and analyzed aliquots of some Russian potable water samples collected in-flight and pre-flight samples of Rodnik potable water delivered to the Station on the Russian Progress vehicle during Expeditions 21 to 25. These additional analytical results are also reported and discussed in this paper.
Yin, Hong-Rui; Zhang, Lei; Xie, Li-Qi; Huang, Li-Yong; Xu, Ye; Cai, San-Jun; Yang, Peng-Yuan; Lu, Hao-Jie
2013-09-06
Novel biomarker verification assays are urgently required to improve the efficiency of biomarker development. Benefiting from lower development costs, multiple reaction monitoring (MRM) has been used for biomarker verification as an alternative to immunoassay. However, in conventional MRM analysis only one sample can be quantified in a single experiment, which restricts its application. Here, a Hyperplex-MRM quantification approach, combining mTRAQ for absolute quantification and iTRAQ for relative quantification, was developed to increase the throughput of biomarker verification. In this strategy, equal amounts of internal standard peptides were labeled with mTRAQ reagents Δ0 and Δ8, respectively, as double references, while 4-plex iTRAQ reagents were used to label four different samples as an alternative to mTRAQ Δ4. From the MRM trace and MS/MS spectrum, the total amounts and relative ratios of target proteins/peptides of the four samples could be acquired simultaneously. Accordingly, absolute amounts of target proteins/peptides in four different samples could be obtained in a single run. In addition, double references were used to increase the reliability of the quantification results. Using this approach, three biomarker candidates, adenosylhomocysteinase (AHCY), cathepsin D (CTSD), and lysozyme C (LYZ), were successfully quantified in colorectal cancer (CRC) tissue specimens of different stages with high accuracy, sensitivity, and reproducibility. To summarize, we demonstrated a promising quantification method for high-throughput verification of biomarker candidates.
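The quantification arithmetic described above can be sketched simply: the mTRAQ reference channels carry a known amount of internal-standard peptide, anchoring the total absolute amount, and the 4-plex iTRAQ reporter-ion ratios apportion that total across the four co-analyzed samples. All intensity values below are hypothetical:

```python
# Sketch of the Hyperplex-MRM arithmetic (all intensities hypothetical).
# The mTRAQ reference channel carries a known amount of internal-standard
# peptide; the 4-plex iTRAQ reporter-ion intensities from the MS/MS
# spectrum apportion the total target amount across the four samples.

def absolute_amounts(ref_amount_fmol, ref_intensity, target_intensity,
                     itraq_reporters):
    # total amount of target peptide summed over all four samples
    total = ref_amount_fmol * (target_intensity / ref_intensity)
    rep_sum = sum(itraq_reporters)
    # apportion the total by relative reporter-ion intensity
    return [total * r / rep_sum for r in itraq_reporters]

# hypothetical run: 100 fmol of standard, MRM trace intensities, and
# four iTRAQ reporter intensities in a 1:2:3:4 ratio
amounts = absolute_amounts(ref_amount_fmol=100.0,
                           ref_intensity=2.0e6,
                           target_intensity=1.0e6,
                           itraq_reporters=[1.0, 2.0, 3.0, 4.0])
print(amounts)  # fmol of target peptide per sample
```

The paper's double-reference design (Δ0 and Δ8) would average two such anchors, improving reliability over a single reference channel.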
Code of Federal Regulations, 2010 CFR
2010-07-01
... which you sample and record gas-analyzer concentrations. (b) Measurement principles. This test verifies... appropriate frequency to prevent loss of information. This test also verifies that the measurement system... instructions. Adjust the measurement system as needed to optimize performance. Run this verification with the...
40 CFR 1065.390 - PM balance verifications and weighing process verification.
Code of Federal Regulations, 2011 CFR
2011-07-01
... days before weighing any filter. (2) Zero and span the balance within 12 h before weighing any filter. (3) Verify that the mass determination of reference filters before and after a filter weighing... reference PM sample media (e.g., filters) before and after a weighing session. A weighing session may be as...
40 CFR 1065.390 - PM balance verifications and weighing process verification.
Code of Federal Regulations, 2014 CFR
2014-07-01
... days before weighing any filter. (2) Zero and span the balance within 12 h before weighing any filter. (3) Verify that the mass determination of reference filters before and after a filter weighing... reference PM sample media (e.g., filters) before and after a weighing session. A weighing session may be as...
40 CFR 1065.390 - PM balance verifications and weighing process verification.
Code of Federal Regulations, 2013 CFR
2013-07-01
... days before weighing any filter. (2) Zero and span the balance within 12 h before weighing any filter. (3) Verify that the mass determination of reference filters before and after a filter weighing... reference PM sample media (e.g., filters) before and after a weighing session. A weighing session may be as...
40 CFR 1065.390 - PM balance verifications and weighing process verification.
Code of Federal Regulations, 2012 CFR
2012-07-01
... days before weighing any filter. (2) Zero and span the balance within 12 h before weighing any filter. (3) Verify that the mass determination of reference filters before and after a filter weighing... reference PM sample media (e.g., filters) before and after a weighing session. A weighing session may be as...
40 CFR 1065.390 - PM balance verifications and weighing process verification.
Code of Federal Regulations, 2010 CFR
2010-07-01
... days before weighing any filter. (2) Zero and span the balance within 12 h before weighing any filter. (3) Verify that the mass determination of reference filters before and after a filter weighing... weighing session by weighing reference PM sample media (e.g., filters) before and after a weighing session...
NASA Technical Reports Server (NTRS)
Hughes, David W.; Hedgeland, Randy J.
1994-01-01
A mechanical simulator of the Hubble Space Telescope (HST) Aft Shroud was built to perform verification testing of the Servicing Mission Scientific Instruments (SI's) and to provide a facility for astronaut training. All assembly, integration, and test activities occurred under the guidance of a contamination control plan, and all work was reviewed by a contamination engineer prior to implementation. An integrated approach was followed in which materials selection, manufacturing, assembly, subsystem integration, and end product use were considered and controlled to ensure that the use of the High Fidelity Mechanical Simulator (HFMS) as a verification tool would not contaminate mission critical hardware. Surfaces were cleaned throughout manufacturing, assembly, and integration, and reverification was performed following major activities. Direct surface sampling was the preferred method of verification, but access and material constraints led to the use of indirect methods as well. Although surface geometries and coatings often made contamination verification difficult, final contamination sampling and monitoring demonstrated the ability to maintain a class M5.5 environment with surface levels less than 400B inside the HFMS.
Current status of verification practices in clinical biochemistry in Spain.
Gómez-Rioja, Rubén; Alvarez, Virtudes; Ventura, Montserrat; Alsina, M Jesús; Barba, Núria; Cortés, Mariano; Llopis, María Antonia; Martínez, Cecilia; Ibarz, Mercè
2013-09-01
Verification uses logical algorithms to detect potential errors before laboratory results are released to the clinician. Even though verification is one of the main processes in all laboratories, there is a lack of standardization, mainly in the algorithms used and in the criteria and verification limits applied. A survey of clinical laboratories in Spain was conducted in order to assess the verification process, particularly the use of autoverification. Questionnaires were sent to the laboratories involved in the External Quality Assurance Program organized by the Spanish Society of Clinical Biochemistry and Molecular Pathology. Seven common biochemical parameters were included (glucose, cholesterol, triglycerides, creatinine, potassium, calcium, and alanine aminotransferase). Completed questionnaires were received from 85 laboratories. Nearly all the laboratories reported using the following seven verification criteria: internal quality control, instrument warnings, sample deterioration, reference limits, clinical data, concordance between parameters, and verification of results. The use of the verification criteria varied according to the type of verification (automatic, technical, or medical). Verification limits for these parameters are similar to biological reference ranges. Delta check was used in 24% of laboratories. Most laboratories (64%) reported using autoverification systems. Autoverification use was related to laboratory size, ownership, and type of laboratory information system, but the amount of use (percentage of tests autoverified) was not related to laboratory size. A total of 36% of Spanish laboratories do not use autoverification, despite the general implementation of laboratory information systems, most of which have autoverification capability. Criteria and rules for seven routine biochemical tests were obtained.
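The verification criteria surveyed above (reference limits, delta checks, and so on) are typically encoded as simple release rules in the laboratory information system. A minimal sketch of such a rule set follows; the analytes are from the survey, but the limits and delta threshold are hypothetical, not values the paper reports:

```python
# Minimal sketch of rule-based autoverification of the kind the survey
# describes: a result is auto-released only if it falls within verification
# limits and passes a delta check against the patient's previous value.
# The limits and delta threshold below are hypothetical.

VERIFICATION_LIMITS = {          # mmol/L unless noted; illustrative only
    "glucose":    (2.5, 25.0),
    "potassium":  (2.8, 6.2),
    "creatinine": (30.0, 800.0), # umol/L
}
DELTA_LIMIT = 0.50               # flag if |change| > 50% of previous value

def autoverify(analyte, value, previous=None):
    """Return True to auto-release, False to route to manual review."""
    lo, hi = VERIFICATION_LIMITS[analyte]
    if not (lo <= value <= hi):
        return False             # outside verification limits
    if previous is not None and previous > 0:
        if abs(value - previous) / previous > DELTA_LIMIT:
            return False         # delta check failed
    return True

print(autoverify("potassium", 4.1, previous=4.0))  # in range, small delta
print(autoverify("potassium", 6.0, previous=3.5))  # delta check fails
```

Real systems layer on the other surveyed criteria (internal QC status, instrument flags, sample-integrity indices, cross-parameter concordance) before releasing a result.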
DOE Office of Scientific and Technical Information (OSTI.GOV)
L. M. Dittmer
2008-01-31
The 116-C-3 waste site consisted of two underground storage tanks designed to receive mixed waste from the 105-C Reactor Metals Examination Facility chemical dejacketing process. Confirmatory evaluation and subsequent characterization of the site determined that the southern tank contained approximately 34,000 L (9,000 gal) of dejacketing wastes, and that the northern tank was unused. In accordance with this evaluation, the verification sampling and modeling results support a reclassification of this site to Interim Closed Out. The results of verification sampling demonstrate that residual contaminant concentrations do not preclude any future uses and allow for unrestricted use of shallow zone soils. The results also show that residual contaminant concentrations are protective of groundwater and the Columbia River.
Koller, Marianne; Becker, Christian; Thiermann, Horst; Worek, Franz
2010-05-15
The purpose of this study was to check the applicability of different analytical methods for the identification of unknown nerve agents in human body fluids. Plasma and urine samples were spiked with nerve agents (plasma) or with their metabolites (urine) or were left blank. Seven random samples (35% of all samples) were selected for the verification test. Plasma was worked up for unchanged nerve agents and for regenerated nerve agents after fluoride-induced reactivation of nerve agent-inhibited butyrylcholinesterase. Both extracts were analysed by GC-MS. Metabolites were extracted from plasma and urine, respectively, and were analysed by LC-MS. The urinary metabolites and two blank samples could be identified without further measurements, plasma metabolites and blanks were identified in six of seven samples. The analysis of unchanged nerve agent provided five agents/blanks and the sixth agent after further investigation. The determination of the regenerated agents also provided only five clear findings during the first screening because of a rather noisy baseline. Therefore, the sample preparation was extended by a size exclusion step performed before addition of fluoride which visibly reduced baseline noise and thus improved identification of the two missing agents. The test clearly showed that verification should be performed by analysing more than one biomarker to ensure identification of the agent(s).
Knoeferle, Pia; Urbach, Thomas P.; Kutas, Marta
2010-01-01
To re-establish picture-sentence verification, discredited possibly for its over-reliance on post-sentence response time (RT) measures, as a task for situated comprehension, we collected event-related brain potentials (ERPs) as participants read a subject-verb-object sentence, and RTs indicating whether or not the verb matched a previously depicted action. For mismatches (vs. matches), speeded RTs were longer, verb N400s over centro-parietal scalp larger, and ERPs to the object noun more negative. RT congruence effects correlated inversely with the centro-parietal verb N400s, and positively with the object ERP congruence effects. Verb N400s, object ERPs, and verbal working memory scores predicted more variance in RT effects (50%) than N400s alone. Thus, (1) verification processing is not all post-sentence; (2) simple priming cannot account for these results; and (3) verification tasks can inform studies of situated comprehension.
Spacecraft attitude calibration/verification baseline study
NASA Technical Reports Server (NTRS)
Chen, L. C.
1981-01-01
A baseline study for a generalized spacecraft attitude calibration/verification system is presented. It can be used to define software specifications for three major functions required by a mission: the pre-launch parameter observability and data collection strategy study; the in-flight sensor calibration; and the post-calibration attitude accuracy verification. Analytical considerations are given for both single-axis and three-axis spacecrafts. The three-axis attitudes considered include the inertial-pointing attitudes, the reference-pointing attitudes, and attitudes undergoing specific maneuvers. The attitude sensors and hardware considered include the Earth horizon sensors, the plane-field Sun sensors, the coarse and fine two-axis digital Sun sensors, the three-axis magnetometers, the fixed-head star trackers, and the inertial reference gyros.
Federal Register 2010, 2011, 2012, 2013, 2014
2011-09-23
... Nutrition Assistance Program Prisoner and Death Match Requirements AGENCY: Food and Nutrition Service (FNS.... SUPPLEMENTARY INFORMATION: Title: Supplemental Nutrition Assistance Program Prisoner and Death Match... verification and death matching procedures as mandated by legislation and previously implemented through agency...
76 FR 338 - Notice of Proposed Information Collection Requests
Federal Register 2010, 2011, 2012, 2013, 2014
2011-01-04
... information technology. Dated: December 28, 2010. James Hyler, Acting Director, Information Collection... Program (TQE) Scholarship Contract and Teaching Verification Forms on Scholarship Recipients. OMB Control... service obligation to teach in a high-need school in a high-need Local Educational Agency. This...
Code of Federal Regulations, 2010 CFR
2010-10-01
... that the ERMM meets the performance criteria contained in this appendix and that test verification data are available to a railroad or to FRA upon request. 2. The test verification data shall contain, at a minimum, all pertinent original data logs and documentation that the test sample preparation, test set up...
Code of Federal Regulations, 2011 CFR
2011-10-01
... that the ERMM meets the performance criteria contained in this appendix and that test verification data are available to a railroad or to FRA upon request. 2. The test verification data shall contain, at a minimum, all pertinent original data logs and documentation that the test sample preparation, test set up...
Code of Federal Regulations, 2014 CFR
2014-10-01
... that the ERMM meets the performance criteria contained in this appendix and that test verification data are available to a railroad or to FRA upon request. 2. The test verification data shall contain, at a minimum, all pertinent original data logs and documentation that the test sample preparation, test set up...
Code of Federal Regulations, 2012 CFR
2012-10-01
... that the ERMM meets the performance criteria contained in this appendix and that test verification data are available to a railroad or to FRA upon request. 2. The test verification data shall contain, at a minimum, all pertinent original data logs and documentation that the test sample preparation, test set up...
Code of Federal Regulations, 2013 CFR
2013-10-01
... that the ERMM meets the performance criteria contained in this appendix and that test verification data are available to a railroad or to FRA upon request. 2. The test verification data shall contain, at a minimum, all pertinent original data logs and documentation that the test sample preparation, test set up...
NASA Astrophysics Data System (ADS)
Miller, Jacob; Sanders, Stephen; Miyake, Akimasa
2017-12-01
While quantum speed-up in solving certain decision problems by a fault-tolerant universal quantum computer has been promised, a timely research question is how far one can reduce the resource requirements for demonstrating a provable advantage in quantum devices without demanding quantum error correction, which is crucial for prolonging the coherence time of qubits. We propose a model device made of locally interacting multiple qubits, designed such that simultaneous single-qubit measurements on it can output probability distributions whose average-case sampling is classically intractable, under assumptions similar to those for the sampling of noninteracting bosons and instantaneous quantum circuits. Notably, in contrast to these previous unitary-based realizations, our measurement-based implementation has two distinctive features. (i) Our implementation involves no adaptation of measurement bases, so output probability distributions are generated in constant time, independent of the system size. Thus, it could in principle be implemented without quantum error correction. (ii) Verifying the classical intractability of our sampling is done by changing the Pauli measurement bases only at certain output qubits. Our use of random commuting quantum circuits in place of computationally universal circuits allows a unique unification of sampling and verification, so they require the same physical resources, in contrast to the more demanding verification protocols seen elsewhere in the literature.
Kang, Leni; Zhang, Shaokai; Zhao, Fanghui; Qiao, Youlin
2014-03-01
To evaluate and adjust for verification bias in screening or diagnostic tests, an inverse-probability weighting method was used to adjust the sensitivity and specificity of the diagnostic tests, with an example from cervical cancer screening used to introduce the CompareTests package in R, in which the method is implemented. Sensitivity and specificity calculated by the traditional method and by maximum likelihood estimation were compared to the results from the inverse-probability weighting method in the random-sampled example. The true sensitivity and specificity of the HPV self-sampling test were 83.53% (95%CI: 74.23-89.93) and 85.86% (95%CI: 84.23-87.36). In the analysis of data with randomly missing verification by the gold standard, the sensitivity and specificity calculated by the traditional method were 90.48% (95%CI: 80.74-95.56) and 71.96% (95%CI: 68.71-75.00), respectively. The adjusted sensitivity and specificity under the inverse-probability weighting method were 82.25% (95%CI: 63.11-92.62) and 85.80% (95%CI: 85.09-86.47), respectively, whereas they were 80.13% (95%CI: 66.81-93.46) and 85.80% (95%CI: 84.20-87.41) under the maximum likelihood estimation method. The inverse-probability weighting method can effectively adjust the sensitivity and specificity of a diagnostic test when verification bias exists, especially when complex sampling is involved.
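The inverse-probability weighting idea above is small enough to sketch directly: subjects verified by the gold standard are up-weighted by 1/P(verified | screening result), so the verified subsample stands in for the full cohort. All counts and verification probabilities below are hypothetical, chosen only to show how naive sensitivity is inflated when screen-positives are verified more often than screen-negatives:

```python
# Sketch of the inverse-probability-weighting (IPW) adjustment for
# verification bias: each verified subject is weighted by the reciprocal
# of their probability of being verified, given the screening result.
# All counts and probabilities are hypothetical.

def ipw_sensitivity(tp_verified, fn_verified, p_verify_pos, p_verify_neg):
    w_tp = tp_verified / p_verify_pos   # weighted diseased, screen-positive
    w_fn = fn_verified / p_verify_neg   # weighted diseased, screen-negative
    return w_tp / (w_tp + w_fn)

# hypothetical cohort: 90% of screen-positives but only 30% of
# screen-negatives were verified; among verified diseased subjects,
# 90 screened positive and 3 screened negative.
naive = 90 / (90 + 3)
adjusted = ipw_sensitivity(90, 3, p_verify_pos=0.9, p_verify_neg=0.3)
print(f"naive sensitivity = {naive:.3f}, IPW-adjusted = {adjusted:.3f}")
```

The naive estimate overstates sensitivity because the under-verified screen-negative diseased subjects are under-counted; the weights restore their share. Specificity is adjusted symmetrically over the non-diseased cells.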
Atkinson, David A.
2002-01-01
Methods and apparatus for ion mobility spectrometry and an analyte detection and identification verification system are disclosed. The apparatus is configured to be used in an ion mobility spectrometer and includes a plurality of reactant reservoirs configured to contain a plurality of reactants which can be reacted with the sample to form adducts having varying ion mobilities. A carrier fluid, such as air or nitrogen, is used to carry the sample into the spectrometer. The plurality of reactants are configured to be selectively added to the carrier stream by use of inlet and outlet manifolds in communication with the reagent reservoirs, the reservoirs being selectively isolatable by valves. The invention further includes a spectrometer having the reagent system described. In the method, a first reactant is used with the sample. Following a positive result, a second reactant is used to determine whether a predicted response occurs. The occurrence of the second predicted response tends to verify the existence of a component of interest within the sample. A third reactant can also be used to provide further verification of the existence of a component of interest. A library can be established of known responses of compounds of interest with various reactants, and the results of a specific multi-reactant survey of a sample can be compared against the library to determine whether a component detected in the sample is likely to be a specific component of interest.
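The library-matching step the patent describes can be sketched as a lookup: each compound of interest maps to the responses expected with each reactant, and a detection is treated as verified only when every observed response matches the library entry. The compound names and response labels here are hypothetical, not from the patent:

```python
# Sketch of the multi-reactant verification logic described above: a
# library maps each compound of interest to its expected adduct response
# with each reactant; a detection is consistent with a compound only if
# every observed response matches. Names and responses are hypothetical.

LIBRARY = {
    "compound_A": {"reactant_1": "positive", "reactant_2": "shifted_peak",
                   "reactant_3": "positive"},
    "compound_B": {"reactant_1": "positive", "reactant_2": "no_change",
                   "reactant_3": "negative"},
}

def verify_detection(observed: dict) -> list:
    """Return the library compounds consistent with all observed responses."""
    return [name for name, expected in LIBRARY.items()
            if all(expected.get(r) == resp for r, resp in observed.items())]

# The first reactant is positive for both compounds; adding the second
# reactant's predicted response narrows the match, as in the patent's method.
print(verify_detection({"reactant_1": "positive"}))
print(verify_detection({"reactant_1": "positive",
                        "reactant_2": "shifted_peak"}))
```

Each additional reactant whose predicted response occurs shrinks the candidate set, which is exactly why the patent's method escalates from a first to a second and third reactant before declaring a component of interest verified.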
76 FR 78264 - Agency Information Collection Activities: Proposed Collection; Comment Request
Federal Register 2010, 2011, 2012, 2013, 2014
2011-12-16
... HHAs and Supp. Regs. in 42 CFR 48.55, 484.205, 484.245, 484.250; Use: This data set is currently... program. Since 1999, the Medicare CoPs have mandated that HHAs use the OASIS data set when evaluating... visits; the data collected during site visits facilitates the verification of the accuracy and...
DOE Office of Scientific and Technical Information (OSTI.GOV)
L. M. Dittmer
2008-03-03
The 100-F-26:13 waste site is the network of process sewer pipelines that received effluent from the 108-F Biological Laboratory and discharged it to the 188-F Ash Disposal Area (126-F-1 waste site). The pipelines included one 0.15-m (6-in.)-, two 0.2-m (8-in.)-, and one 0.31-m (12-in.)-diameter vitrified clay pipe segments encased in concrete. In accordance with this evaluation, the verification sampling results support a reclassification of this site to Interim Closed Out. The results of verification sampling demonstrated that residual contaminant concentrations do not preclude any future uses and allow for unrestricted use of shallow zone soils. The results also showed that residual contaminant concentrations are protective of groundwater and the Columbia River.
The Epoxytec, Inc. CPP™ epoxy coating used for wastewater collection system rehabilitation was evaluated by EPA’s Environmental Technology Verification Program under laboratory conditions at the Center for Innovative Grouting Material and Technology (CIGMAT) Laboratory at the Uni...
The Standard Cement Materials, Inc. Standard Epoxy Coating 4553™ (SEC 4553) epoxy coating used for wastewater collection system rehabilitation was evaluated by EPA’s Environmental Technology Verification Program under laboratory conditions at the Center for Innovative Grouting Ma...
Data Collection for Foreign Scholars. Working Paper #11.
ERIC Educational Resources Information Center
Dean, Michael F.
This working paper provides suggestions and considerations for anyone contemplating the electronic collection of data on foreign scholars (as distinct from international students). It is noted that, because the Immigration and Naturalization Service requires employment verification and immigration information such as country of citizenship and…
Municipalities are discovering rapid degradation of infrastructures in wastewater collection and treatment facilities due to the infiltration of water from the surrounding environments. Wastewater facilities are not only wet, but also experience hydrostatic pressure conditions un...
Make the World Safer from Nuclear Weapons
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bowyer, Ted
Senior Nuclear Scientist Ted Bowyer knows firsthand the challenges associated with protecting our nation. Ted and his colleagues help detect the proliferation of nuclear weapons. They developed award-winning technologies that give international treaty verification authorities “eyes and ears” around the globe. The instruments, located in 80 countries, help ensure compliance with the Comprehensive Nuclear Test-Ban Treaty, or CTBT. They are completely automated radionuclide monitoring systems that would detect airborne radioactive particles if a nuclear detonation occurred in the air, underground or at sea. Some samples collected through these technologies are sent to PNNL’s Shallow Underground Laboratory, the only certified U.S. radionuclide laboratory for the CTBT’s International Monitoring System Organization.
Chemical Analysis Results for Potable Water from ISS Expeditions 21 Through 25
NASA Technical Reports Server (NTRS)
Straub, John E., II; Plumlee, Debrah K.; Schultz, John R.; McCoy, J. Torin
2011-01-01
The Johnson Space Center Water and Food Analytical Laboratory (WAFAL) performed detailed ground-based analyses of archival water samples for verification of the chemical quality of the International Space Station (ISS) potable water supplies for Expeditions 21 through 25. Over a 14-month period the Space Shuttle visited the ISS on four occasions to complete construction and deliver supplies. The onboard supplies of potable water available for consumption by the Expeditions 21 to 25 crews consisted of Russian ground-supplied potable water, Russian potable water regenerated from humidity condensate, and US potable water recovered from urine distillate and condensate. Chemical archival water samples that were collected with U.S. hardware during Expeditions 21 to 25 were returned on Shuttle flights STS-129 (ULF3), STS-130 (20A), STS-131 (19A), and STS-132 (ULF4), as well as on Soyuz flights 19-23. This paper reports the analytical results for these returned potable water archival samples and their compliance with ISS water quality standards.
Hair Analysis in Forensic Toxicology: An Updated Review with a Special Focus on Pitfalls.
Kintz, Pascal
2017-01-01
The detection of drugs in hair has progressively emerged as a consequence of the enhanced sensitivity of the analytical techniques used in forensic toxicology; a key advantage of this matrix over classical ones (i.e., urine and blood) is easier, non-invasive sample collection, even when careful supervision by law enforcement officers is required to avoid the risk that the sample is adulterated or replaced. Moreover, depending on the length of the hair, the history of drug exposure can be retrospectively monitored from a few weeks up to months or years before sample collection. Through a detailed review of the existing literature, this manuscript provides information on proper sample collection, preparation and analysis, as well as pitfalls in forensic hair analysis, and summarizes the wide range of applications of this technology, including excessive alcohol drinking, doping, child abuse, and offences linked to drug use. Verification of the history of psychotropic drug, alcohol and doping agent use by hair analysis, hair testing for driving-license regranting and drug-facilitated crimes, and testing for drugs in the hair of children are reviewed, together with recent trends in hair contamination and the possibility of disclosing the use of new psychoactive substances by hair analysis. Hair analysis in forensic toxicology has quickly emerged and improved in recent years; a deeper knowledge of the advantages and limitations of this unique matrix is necessary for its better use in forensic casework.
Code of Federal Regulations, 2013 CFR
2013-07-01
... discrete-mode testing. For this check we consider water vapor a gaseous constituent. This verification does... for water removed from the sample done in post-processing according to § 1065.659 and it does not... humidification vessel that contains water. You must humidify NO2 span gas with another moist gas stream. We...
Code of Federal Regulations, 2014 CFR
2014-07-01
... discrete-mode testing. For this check we consider water vapor a gaseous constituent. This verification does... for water removed from the sample done in post-processing according to § 1065.659 (40 CFR 1066.620 for... contains water. You must humidify NO2 span gas with another moist gas stream. We recommend humidifying your...
Cleanup Verification Package for the 118-F-5 PNL Sawdust Pit
DOE Office of Scientific and Technical Information (OSTI.GOV)
L. D. Habel
2008-05-20
This cleanup verification package documents completion of remedial action, sampling activities, and compliance with cleanup criteria for the 118-F-5 Burial Ground, the PNL (Pacific Northwest Laboratory) Sawdust Pit. The 118-F-5 Burial Ground was an unlined trench that received radioactive sawdust from the floors of animal pens in the 100-F Experimental Animal Farm.
A round robin approach to the analysis of bisphenol a (BPA) in human blood samples
2014-01-01
Background Human exposure to bisphenol A (BPA) is ubiquitous, yet there are concerns about whether BPA can be measured in human blood. This Round Robin was designed to address this concern through three goals: 1) to identify collection materials, reagents and detection apparatuses that do not contribute BPA to serum; 2) to identify sensitive and precise methods to accurately measure unconjugated BPA (uBPA) and BPA-glucuronide (BPA-G), a metabolite, in serum; and 3) to evaluate whether inadvertent hydrolysis of BPA-G occurs during sample handling and processing. Methods Four laboratories participated in this Round Robin. Laboratories screened materials to identify BPA contamination in collection and analysis materials. Serum was spiked with concentrations of uBPA and/or BPA-G ranging from 0.09-19.5 (uBPA) and 0.5-32 (BPA-G) ng/mL. Additional samples were preserved unspiked as ‘environmental’ samples. Blinded samples were provided to laboratories that used LC/MS/MS to simultaneously quantify uBPA and BPA-G. To determine whether inadvertent hydrolysis of BPA metabolites occurred, samples spiked with only BPA-G were analyzed for the presence of uBPA. Finally, three laboratories compared direct and indirect methods of quantifying BPA-G. Results We identified collection materials and reagents that did not introduce BPA contamination. In the blinded spiked sample analysis, all laboratories were able to distinguish low from high values of uBPA and BPA-G, both over the whole spiked range and for the samples spiked with the three lowest concentrations (0.5-3.1 ng/mL). By completion of the Round Robin, three laboratories had verified methods for the analysis of uBPA and two for the analysis of BPA-G (verification criterion: 4 of 5 samples within 20% of the spiked concentrations). In the analysis of samples spiked only with BPA-G, all laboratories reported that BPA-G was the majority of the BPA detected (92.2-100%). Finally, laboratories were more likely to achieve verification using direct methods than indirect methods employing enzymatic hydrolysis. Conclusions Sensitive and accurate methods for the direct quantification of uBPA and BPA-G were developed in multiple laboratories and can be used for the analysis of human serum samples. BPA contamination can be controlled during sample collection, and inadvertent hydrolysis of BPA conjugates can be avoided during sample handling. PMID:24690217
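The Round Robin's acceptance rule quoted above (a method is "verified" when at least 4 of 5 spiked samples are recovered within 20% of the nominal concentration) can be expressed as a small check. The sketch below is illustrative only; the recovery values are hypothetical, not the study's data.

```python
def lab_verified(measured, spiked, max_rel_err=0.20, min_pass=4):
    """Round Robin acceptance rule: the method is 'verified' when at
    least min_pass spiked samples are recovered within max_rel_err
    (20%) of the nominal spiked concentration."""
    passes = sum(abs(m - s) / s <= max_rel_err
                 for m, s in zip(measured, spiked))
    return passes >= min_pass

# Hypothetical recoveries (ng/mL) against five spiked concentrations:
spiked = [0.5, 1.0, 3.1, 9.8, 19.5]
measured = [0.56, 0.93, 3.4, 11.2, 18.9]
print(lab_verified(measured, spiked))  # all five within 20%, so True
```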
METHOD 544. DETERMINATION OF MICROCYSTINS AND ...
Method 544 is an accurate and precise analytical method to determine six microcystins (including MC-LR) and nodularin in drinking water using solid phase extraction and liquid chromatography tandem mass spectrometry (SPE-LC/MS/MS). The advantages of this SPE-LC/MS/MS method are its sensitivity and its ability to speciate the microcystins. This method development task establishes sample preservation techniques, sample concentration and analytical procedures, aqueous and extract holding time criteria, and quality control procedures. Draft Method 544 underwent a multi-laboratory verification to ensure that other laboratories can implement the method and achieve the quality control measures specified in it. It is anticipated that Method 544 may be used in UCMR 4 to collect nationwide occurrence data for selected microcystins in drinking water. The purpose of this research project is to develop an accurate and precise analytical method to concentrate and determine selected MCs and nodularin in drinking water.
DOT National Transportation Integrated Search
2011-10-28
Since 1985, ODOT has been manually collecting rut : depth data using a straight edge and dial gauge (S&G). This : method is slow and dangerous to pavement condition raters : when traffic control is not available. According to the : Pavement Condition...
The Protective Liner Systems International, Inc. Epoxy Mastic PLS-614 coating used for wastewater collection system rehabilitation was evaluated by EPA’s Environmental Technology Verification Program under laboratory conditions at the Center for Innovative Grouting Material and T...
Municipalities are discovering rapid degradation of infrastructures in wastewater collection and treatment facilities due to infiltration of leaking water from the surrounding environments. Rehabilitation of these facilities by in situ methods, including the use of grouting, is u...
Federal Register 2010, 2011, 2012, 2013, 2014
2010-09-29
... Enterprise, Department of Veterans Affairs. ACTION: Notice. SUMMARY: The Center for Veterans Enterprise (CVE... veterans owned businesses. DATES: Written comments and recommendations on the proposed collection of... online through the Federal Docket Management System (FDMS) at http://www.Regulations.gov . FOR FURTHER...
High-speed autoverifying technology for printed wiring boards
NASA Astrophysics Data System (ADS)
Ando, Moritoshi; Oka, Hiroshi; Okada, Hideo; Sakashita, Yorihiro; Shibutani, Nobumi
1996-10-01
We have developed an automated pattern verification technique. The output of an automated optical inspection system contains many false alarms, so verification is needed to distinguish between minor irregularities and serious defects. In the past, this verification was usually done manually, which led to unsatisfactory product quality. The goal of our new automated verification system is to detect pattern features on surface mount technology boards. In our system, we employ a new illumination method that uses multiple colors and multiple illumination directions. Images are captured with a CCD camera. We have developed a new algorithm that uses CAD data for both pattern matching and pattern structure determination. This helps to search for patterns around a defect and to examine defect definition rules. These are processed with a high-speed workstation and hard-wired circuits. The system can verify a defect within 1.5 seconds. The verification system was tested in a factory: it verified 1,500 defective samples and detected all significant defects with a false-alarm rate of only 0.1 percent.
2014 Assessment of the Ballistic Missile Defense System (BMDS)
2015-03-23
for reviewing instructions, searching existing data sources, gathering and maintaining the data needed, and completing and reviewing the collection of...take several more years to collect the test data needed to adequately VV&A the BMDS M&S required to perform such assessments. As data are collected ...Accreditation is possible only if a sufficient quantity and quality of flight test data have been collected to support model verification and
40 CFR 1066.135 - Linearity verification.
Code of Federal Regulations, 2014 CFR
2014-07-01
... CVS, double-dilution, and partial-flow systems. (3) PM sample. (4) Chiller sample, for gaseous sampling systems that use thermal chillers to dry samples, and that use chiller temperature to calculate dewpoint at the chiller outlet. For testing, if you choose to use the high alarm temperature setpoint for...
2015-03-13
A. Lee. “A Programming Model for Time - Synchronized Distributed Real- Time Systems”. In: Proceedings of Real Time and Em- bedded Technology and Applications Symposium. 2007, pp. 259–268. ...From MetroII to Metronomy, Designing Contract-based Function-Architecture Co-simulation Framework for Timing Verification of Cyber-Physical Systems...the collection of information is estimated to average 1 hour per response, including the time for reviewing instructions, searching existing data
Federal Register 2010, 2011, 2012, 2013, 2014
2013-03-26
... affected agencies concerning the proposed collection of information to: (1) Evaluate whether the proposed... user with information related to the Rules of Behavior for system usage and the user's responsibilities... DEPARTMENT OF HOUSING AND URBAN DEVELOPMENT [Docket No. FR-5690-N-05] Proposed Information...
Federal Register 2010, 2011, 2012, 2013, 2014
2013-08-19
... (ETA) sponsored information collection request (ICR) titled, ``Income and Eligibility Verification... this request to the Office of Information and Regulatory Affairs, Attn: OMB Desk Officer for DOL-ETA..., the ETA issued a final rule regarding the Confidentiality and Disclosure of State Unemployment...
Federal Register 2010, 2011, 2012, 2013, 2014
2013-04-22
..., received. This procedure increases the effectiveness of controls on the international trade of strategic... collection). Affected Public: Business or other for-profit organizations. Estimated Number of Respondents... Annual Cost to Public: $0. IV. Request for Comments Comments are invited on: (a) Whether the proposed...
Publication Of Oceanographic Data on CD-ROM
NASA Technical Reports Server (NTRS)
Hilland, Jeffrey E.; Smith, Elizabeth A.; Martin, Michael D.
1992-01-01
Large collections of oceanographic data and other large data collections are published on CD-ROMs in formats facilitating access and analysis. Publication involves four major steps: preprocessing, premastering, mastering, and verification. The large capacity, small size, commercial availability, long life, and standard format of CD-ROMs offer advantages over computer-compatible magnetic tape.
Federal Register 2010, 2011, 2012, 2013, 2014
2012-08-22
... DEPARTMENT OF HOMELAND SECURITY U.S. Citizenship and Immigration Services Agency Information...), U.S. Citizenship and Immigration Services (USCIS) will be submitting the following information... sponsoring the collection: Form I-9. U.S. Citizenship and Immigration Services. (4) Affected public who will...
Deductive Evaluation: Formal Code Analysis With Low User Burden
NASA Technical Reports Server (NTRS)
Di Vito, Ben L.
2016-01-01
We describe a framework for symbolically evaluating iterative C code using a deductive approach that automatically discovers and proves program properties. Although verification is not performed, the method can infer detailed program behavior. Software engineering work flows could be enhanced by this type of analysis. Floyd-Hoare verification principles are applied to synthesize loop invariants, using a library of iteration-specific deductive knowledge. When needed, theorem proving is interleaved with evaluation and performed on the fly. Evaluation results take the form of inferred expressions and type constraints for values of program variables. An implementation using PVS (Prototype Verification System) is presented along with results for sample C functions.
SU-F-T-494: A Multi-Institutional Study of Independent Dose Verification Using Golden Beam Data
DOE Office of Scientific and Technical Information (OSTI.GOV)
Itano, M; Yamazaki, T; Tachibana, R
Purpose: In general, beam data measured on an individual linac are registered in an independent dose verification software program, and the verification is performed as a secondary check. In this study, independent dose verification using golden beam data was compared to verification using each linac's own beam data. Methods: Six institutions participated, and three different beam data sets were prepared. The first was the individually measured data (Original Beam Data, OBD). The others were generated from all measurements of the same linac model (Model-GBD) and of all linac models (All-GBD). The three beam data sets were registered in the independent verification software program at each institute. Subsequently, patient plans for eight sites (brain, head and neck, lung, esophagus, breast, abdomen, pelvis, and bone) were analyzed with the verification program to compare doses calculated using the three beam data sets. Results: 1116 plans were collected from the six institutes. Compared to the OBD, the variation using the Model-GBD-based calculation and the All-GBD was 0.0 ± 0.3% and 0.0 ± 0.6%, respectively. The maximum variations were 1.2% and 2.3%, respectively. In the plans with variation over 1%, the reference points were located away from the central axis, with or without a physical wedge. Conclusion: The confidence limit (2SD) using the Model-GBD and the All-GBD was within 0.6% and 1.2%, respectively. Thus, the use of golden beam data may be feasible for independent verification. In addition, verification using golden beam data provides quality assurance of planning from the viewpoint of an audit. This research is partially supported by the Japan Agency for Medical Research and Development (AMED).
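The summary statistics in this abstract (mean ± SD of per-plan dose deviation and a 2SD confidence limit) follow the usual convention for secondary dose checks. A minimal sketch, with made-up dose values rather than the study's data:

```python
import statistics

def dose_deviation_stats(ref_doses, check_doses):
    """Percent deviation of each secondary-check dose from its reference
    dose, summarized as mean, SD, and the mean +/- 2SD band (the
    '2SD confidence limit' used to judge agreement)."""
    devs = [100.0 * (c - r) / r for r, c in zip(ref_doses, check_doses)]
    mean = statistics.mean(devs)
    sd = statistics.stdev(devs)
    return mean, sd, (mean - 2 * sd, mean + 2 * sd)

# Hypothetical reference and independent-check doses (Gy) for four plans:
ref = [2.00, 2.00, 1.80, 2.50]
check = [2.00, 2.02, 1.78, 2.50]
mean, sd, band = dose_deviation_stats(ref, check)
```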
Scharer, Rachel M.; Patterson III, William F.; Carlson, John K.; Poulakis, Gregg R.
2012-01-01
Endangered smalltooth sawfish (Pristis pectinata) were opportunistically sampled in south Florida and aged by counting opaque bands in sectioned vertebrae (n = 15). The small sample size precluded traditional age verification, but fish collected in spring and summer had translucent vertebral margins, while fish collected in winter had opaque margins. Trends in Sr:Ca measured across vertebrae with laser ablation-inductively coupled plasma-mass spectrometry corresponded well to annual salinity trends observed in sawfish estuarine nursery habitats in south Florida, and thus serve as a chemical marker verifying annual formation of opaque bands. Based on that finding and on assumptions about mean birth date and the timing of opaque band formation, estimated ages ranged from 0.4 y for a 0.60 m total length (TL) male to 14.0 y for a 4.35 m TL female. Von Bertalanffy growth parameters computed from the size-at-age data were 4.48 m for L∞, 0.219 y⁻¹ for k, and −0.81 y for t0. Results of this study have important implications for sawfish conservation as well as for inferring habitat residency of euryhaline elasmobranchs via chemical analysis of vertebrae. PMID:23082225
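The fitted parameters reported above define the von Bertalanffy growth curve L(t) = L∞(1 − e^(−k(t − t0))). A minimal sketch using those values (the function name and usage are illustrative, not the authors' code):

```python
import math

def vbgf_length(age_y, L_inf=4.48, k=0.219, t0=-0.81):
    """von Bertalanffy growth function: predicted total length (m) at a
    given age (years), using the parameters fitted in the abstract."""
    return L_inf * (1.0 - math.exp(-k * (age_y - t0)))

# Predicted length approaches L_inf = 4.48 m asymptotically:
print(round(vbgf_length(14.0), 2))  # → 4.31
```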
Testing an online, dynamic consent portal for large population biobank research.
Thiel, Daniel B; Platt, Jodyn; Platt, Tevah; King, Susan B; Fisher, Nicole; Shelton, Robert; Kardia, Sharon L R
2015-01-01
Michigan's BioTrust for Health, a public health research biobank composed of residual dried bloodspot (DBS) cards from newborn screening, contains over 4 million samples collected without written consent. Participant-centric initiatives are IT tools that hold great promise for addressing the consent challenges in biobank research. Working with Private Access Inc., a pioneer in patient-centric web solutions, we created and pilot tested a dynamic informed consent simulation, paired with an educational website, focusing on consent for research utilizing DBSs in Michigan's BioTrust for Health. Of 187 pilot testers recruited in 2 groups, 137 completed the consent simulation and exit survey. Over 50% indicated their willingness to set up an account if the simulation went live and to recommend it to others. Participants raised concerns about the process of identity verification and appeared to have little experience with sharing health information online. Applying online, dynamic approaches to address the consent challenges raised by biobanks with legacy sample collections should be explored, given the positive reaction to our pilot test and the strong preference for active consent. Balancing security and privacy with accessibility and ease of use will continue to be a challenge.
Polkowska, Izabela; Bartoszcze-Tomaszewska, Małgorzata; Sobczyńska-Rak, Aleksandra; Matuszewski, Łukasz
2017-01-01
Dogs commonly serve as a model for various human conditions, including periodontal diseases. The aim of this study was to identify the anaerobic bacteria that colonize the subgingival areas in dogs and humans by using rapid real-time polymerase chain reaction (RT-PCR)-based tests and to compare the results obtained in each species. Bacterial microflora evaluations, both quantitative and qualitative, were performed by applying ready-made tests on twelve dogs and twelve humans. Five samples were collected from each subject's deepest gingival pockets and joined to form a collective sample. The results of the study revealed interspecies similarities in the prevalences of Porphyromonas (P.) gingivalis, Treponema denticola, Tannerella forsythia, and Fusobacterium nucleatum. Red complex bacteria comprised the largest portion of the studied bacterial complexes in all study groups, with P. gingivalis being the most commonly isolated bacterium. The results show similarities in the prevalence of bacterial microflora in dogs and humans. Microbiological analysis of gingival pockets by using rapid real-time PCR-based tests in clinical practice, both veterinary and human, can facilitate the choice of appropriate pharmacological treatment and can provide a basis for subsequent verification of the treatment's effectiveness. PMID:27297417
7 CFR 926.20 - Verification of reports and records.
Code of Federal Regulations, 2013 CFR
2013-01-01
... COLLECTION, REPORTING AND RECORDKEEPING REQUIREMENTS APPLICABLE TO CRANBERRIES NOT SUBJECT TO THE CRANBERRY... any premises where applicable records are maintained, where cranberries and cranberry products are...
7 CFR 926.20 - Verification of reports and records.
Code of Federal Regulations, 2012 CFR
2012-01-01
... COLLECTION, REPORTING AND RECORDKEEPING REQUIREMENTS APPLICABLE TO CRANBERRIES NOT SUBJECT TO THE CRANBERRY... any premises where applicable records are maintained, where cranberries and cranberry products are...
7 CFR 926.20 - Verification of reports and records.
Code of Federal Regulations, 2011 CFR
2011-01-01
... COLLECTION, REPORTING AND RECORDKEEPING REQUIREMENTS APPLICABLE TO CRANBERRIES NOT SUBJECT TO THE CRANBERRY... any premises where applicable records are maintained, where cranberries and cranberry products are...
7 CFR 926.20 - Verification of reports and records.
Code of Federal Regulations, 2014 CFR
2014-01-01
... COLLECTION, REPORTING AND RECORDKEEPING REQUIREMENTS APPLICABLE TO CRANBERRIES NOT SUBJECT TO THE CRANBERRY... any premises where applicable records are maintained, where cranberries and cranberry products are...
LLNL Genomic Assessment: Viral and Bacterial Sequencing Needs for TMTI, Task 1.4.2 Report
DOE Office of Scientific and Technical Information (OSTI.GOV)
Slezak, T; Borucki, M; Lam, M
Good progress has been made on both bacterial and viral sequencing by the TMTI centers. While access to appropriate samples is a limiting factor to throughput, excellent progress has been made with respect to getting agreements in place with key sources of relevant materials. Sharing of sequenced genomes funded by TMTI has been extremely limited to date. The April 2010 exercise should force a resolution to this, but additional managerial pressure may be needed to ensure that rapid sharing of TMTI-funded sequencing occurs, regardless of collaborator constraints concerning ultimate publication(s). Policies to permit TMTI-internal rapid sharing of sequenced genomes should be written into all TMTI agreements with collaborators now being negotiated. TMTI needs to establish a Web-based system for tracking samples destined for sequencing. This includes metadata on sample origins and contributor, information on sample shipment/receipt, prioritization by TMTI, assignment to one or more sequencing centers (including possible TMTI-sponsored sequencing at a contributor site), and status history of the sample sequencing effort. While this system could be a component of the AFRL system, it is not part of any current development effort. Policy and standardized procedures are needed to ensure appropriate verification of all TMTI samples prior to the investment in sequencing. PCR, arrays, and classical biochemical tests are examples of potential verification methods. Verification is needed to detect mislabeled, degraded, mixed, or contaminated samples. Regular QC exercises are needed to ensure that the TMTI-funded centers are meeting all standards for producing quality genomic sequence data.
NASA Technical Reports Server (NTRS)
Sung, Q. C.; Miller, L. D.
1977-01-01
Three methods were tested for collecting the training sets needed to establish the spectral signatures of the land uses/land covers sought, given the difficulties of retrospectively collecting representative ground control data. The computer preprocessing techniques applied to the digital images to improve the final classification results were geometric correction, spectral band or image ratioing, and statistical cleaning of the representative training sets. A minimal level of statistical verification was performed based on comparisons between the airphoto estimates and the classification results. The verification provided further support for the selection of MSS bands 5 and 7. It also indicated that the maximum likelihood ratioing technique achieves classification results more consistent with the airphoto estimates than does stepwise discriminant analysis.
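A maximum likelihood classifier of the kind evaluated here assigns each pixel to the class whose Gaussian density gives it the highest likelihood. A minimal single-band sketch with invented class statistics (the study itself used multiband MSS statistics):

```python
import math

def gaussian_ml_class(x, class_stats):
    """Assign pixel value x (single band) to the class with the highest
    Gaussian log-likelihood; class_stats maps class name -> (mean, sd).
    Invented statistics below stand in for training-set signatures."""
    best, best_ll = None, float("-inf")
    for name, (mu, sigma) in class_stats.items():
        ll = -math.log(sigma) - 0.5 * ((x - mu) / sigma) ** 2
        if ll > best_ll:
            best, best_ll = name, ll
    return best

stats = {"water": (12.0, 2.0), "forest": (28.0, 4.0), "urban": (45.0, 6.0)}
print(gaussian_ml_class(14.0, stats))  # → water
```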
A process improvement model for software verification and validation
NASA Technical Reports Server (NTRS)
Callahan, John; Sabolish, George
1994-01-01
We describe ongoing work at the NASA Independent Verification and Validation (IV&V) Facility to establish a process improvement model for software verification and validation (V&V) organizations. This model, similar to those used by some software development organizations, uses measurement-based techniques to identify problem areas and introduce incremental improvements. We seek to replicate this model for organizations involved in V&V on large-scale software development projects such as EOS and space station. At the IV&V Facility, a university research group and V&V contractors are working together to collect metrics across projects in order to determine the effectiveness of V&V and improve its application. Since V&V processes are intimately tied to development processes, this paper also examines the repercussions for development organizations in large-scale efforts.
40 CFR 1065.342 - Sample dryer verification.
Code of Federal Regulations, 2013 CFR
2013-07-01
... condensation as required in § 1065.145(d)(1)(i). We recommend that the sample system components be maintained at least 5 °C above the local humidified gas dewpoint to prevent aqueous condensation. (5) Measure...
40 CFR 1065.342 - Sample dryer verification.
Code of Federal Regulations, 2014 CFR
2014-07-01
... condensation as required in § 1065.145(d)(1)(i). We recommend that the sample system components be maintained at least 5 °C above the local humidified gas dewpoint to prevent aqueous condensation. (5) Measure...
40 CFR 1065.342 - Sample dryer verification.
Code of Federal Regulations, 2012 CFR
2012-07-01
... condensation as required in § 1065.145(d)(1)(i). We recommend that the sample system components be maintained at least 5 °C above the local humidified gas dewpoint to prevent aqueous condensation. (5) Measure...
40 CFR 761.130 - Sampling requirements.
Code of Federal Regulations, 2014 CFR
2014-07-01
... developed by the Midwest Research Institute (MRI) for use in enforcement inspections: “Verification of PCB... the MRI report “Field Manual for Grid Sampling of PCB Spill Sites to Verify Cleanup.” Both the MRI...
40 CFR 761.130 - Sampling requirements.
Code of Federal Regulations, 2012 CFR
2012-07-01
... developed by the Midwest Research Institute (MRI) for use in enforcement inspections: “Verification of PCB... the MRI report “Field Manual for Grid Sampling of PCB Spill Sites to Verify Cleanup.” Both the MRI...
40 CFR 761.130 - Sampling requirements.
Code of Federal Regulations, 2013 CFR
2013-07-01
... developed by the Midwest Research Institute (MRI) for use in enforcement inspections: “Verification of PCB... the MRI report “Field Manual for Grid Sampling of PCB Spill Sites to Verify Cleanup.” Both the MRI...
Storms, S M; Feltus, A; Barker, A R; Joly, M-A; Girard, M
2009-03-01
Measurement of somatropin charged variants by isoelectric focusing was replaced with capillary zone electrophoresis in the January 2006 European Pharmacopoeia Supplement 5.3, based on results from an interlaboratory collaborative study. Due to incompatibilities and method-robustness issues encountered prior to verification, a number of method parameters required optimisation. As the use of a diode array detector at 195 nm or 200 nm led to a loss of resolution, a variable wavelength detector using a 200 nm filter was employed. Improved injection repeatability was obtained by increasing the injection time and pressure, and changing the sample diluent from water to running buffer. Finally, definition of capillary pre-treatment and rinse procedures resulted in more consistent separations over time. Method verification data are presented demonstrating linearity, specificity, repeatability, intermediate precision, limit of quantitation, sample stability, solution stability, and robustness. Based on these experiments, several modifications to the current method have been recommended and incorporated into the European Pharmacopoeia to help improve method performance across laboratories globally.
Applying Independent Verification and Validation to Automatic Test Equipment
NASA Technical Reports Server (NTRS)
Calhoun, Cynthia C.
1997-01-01
This paper provides a general overview of applying Independent Verification and Validation (IV&V) to Automatic Test Equipment (ATE). The overview is not inclusive of all IV&V activities that can occur, or of all development and maintenance items that can be validated and verified, during the IV&V process. A sampling of possible IV&V activities that can occur within each phase of the ATE life cycle is described.
NASA Astrophysics Data System (ADS)
Tang, Xiaoli; Lin, Tong; Jiang, Steve
2009-09-01
We propose a novel approach for potential online treatment verification using cine EPID (electronic portal imaging device) images for hypofractionated lung radiotherapy based on a machine learning algorithm. Hypofractionated radiotherapy requires high precision. It is essential to effectively monitor the target to ensure that the tumor is within the beam aperture. We modeled the treatment verification problem as a two-class classification problem and applied an artificial neural network (ANN) to classify the cine EPID images acquired during the treatment into corresponding classes—with the tumor inside or outside of the beam aperture. Training samples were generated for the ANN using digitally reconstructed radiographs (DRRs) with artificially added shifts in the tumor location—to simulate cine EPID images with different tumor locations. Principal component analysis (PCA) was used to reduce the dimensionality of the training samples and cine EPID images acquired during the treatment. The proposed treatment verification algorithm was tested on five hypofractionated lung patients in a retrospective fashion. On average, our proposed algorithm achieved a 98.0% classification accuracy, a 97.6% recall rate and a 99.7% precision rate. This work was first presented at the Seventh International Conference on Machine Learning and Applications, San Diego, CA, USA, 11-13 December 2008.
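The pipeline the abstract describes (training images with known tumor shifts, PCA for dimensionality reduction, then a two-class learner) can be sketched as below. Everything here is illustrative: the synthetic Gaussian-blob "tumor" images, the shift ranges, and the logistic classifier standing in for the authors' ANN are assumptions, not the paper's implementation.

```python
import numpy as np

rng = np.random.default_rng(0)

def make_image(shift, size=16):
    """Synthetic stand-in for a DRR/EPID frame: a bright blob ("tumor")
    at a shifted position, plus a little detector noise."""
    y, x = np.mgrid[:size, :size]
    cx, cy = size / 2 + shift[0], size / 2 + shift[1]
    img = np.exp(-((x - cx) ** 2 + (y - cy) ** 2) / 8.0)
    return img.ravel() + 0.05 * rng.normal(size=size * size)

# Class 1: tumor inside the aperture (small shifts); class 0: outside (large shifts)
shifts_in = rng.uniform(-1, 1, (100, 2))
shifts_out = rng.uniform(4, 6, (100, 2)) * rng.choice([-1, 1], (100, 2))
X = np.array([make_image(s) for s in np.vstack([shifts_in, shifts_out])])
y = np.array([1] * 100 + [0] * 100)

# PCA via SVD on mean-centred data: keep a few principal components
Xc = X - X.mean(axis=0)
U, S, Vt = np.linalg.svd(Xc, full_matrices=False)
Z = Xc @ Vt[:5].T                     # 5-D feature vectors

# Logistic classifier trained by gradient descent (stand-in for the ANN)
w, b = np.zeros(5), 0.0
for _ in range(500):
    p = 1 / (1 + np.exp(-(Z @ w + b)))
    g = p - y
    w -= 0.1 * Z.T @ g / len(y)
    b -= 0.1 * g.mean()

acc = ((1 / (1 + np.exp(-(Z @ w + b))) > 0.5) == y).mean()
print(f"training accuracy: {acc:.2f}")
```

The two classes are linearly separable in the reduced PCA space, which is the property the real system relies on when classifying cine EPID frames against DRR-derived training samples.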
Automated Network Mapping and Topology Verification
2016-06-01
... collection of information includes amplifying data about the networked devices such as hardware details, logical addressing schemes, operating ... The current military reliance on computer networks for operational missions and administrative duties makes network ...
Federal Register 2010, 2011, 2012, 2013, 2014
2012-09-14
... DEPARTMENT OF HOMELAND SECURITY U.S. Citizenship and Immigration Services [OMB Control Number 1615... Department of Homeland Security (DHS), U.S. Citizenship and Immigration Services (USCIS) published a 30-day..., 2012, to ensure the public sufficient opportunity to comment on the information collection. In the 30...
Federal Register 2010, 2011, 2012, 2013, 2014
2010-12-07
... Disadvantaged Business Utilization (OSDBU), Department of Veterans Affairs, will submit the collection of information abstracted below to the Office of Management and Budget (OMB) for review and comment. The PRA... correspondence. FOR FURTHER INFORMATION CONTACT: Denise McLamb, Enterprise Records Service (005R1B), Department...
Feedback Seeking in Early Adolescence: Self-Enhancement or Self-Verification?
Rosen, Lisa H; Principe, Connor P; Langlois, Judith H
2013-02-13
The authors examined whether early adolescents ( N = 90) solicit self-enhancing feedback (i.e., positive feedback) or self-verifying feedback (i.e., feedback congruent with self-views, even when these views are negative). Sixth, seventh, and eighth graders first completed a self-perception measure and then selected whether to receive positive or negative feedback from an unknown peer in different domains of self. Results were consistent with self-verification theory; adolescents who perceived themselves as having both strengths and weaknesses were more likely to seek negative feedback regarding a self-perceived weakness compared to a self-perceived strength. The authors found similar support for self-verification processes when they considered the entire sample regardless of perceived strengths and weaknesses; hierarchical linear modeling (HLM) examined the predictive power of ratings of self-perceived ability, certainty, and importance on feedback seeking for all participants and provided additional evidence of self-verification strivings in adolescence.
Using expansive grasses for monitoring heavy metal pollution in the vicinity of roads.
Vachová, Pavla; Vach, Marek; Najnarová, Eva
2017-10-01
We propose a method for monitoring heavy metal deposition in the vicinity of roads using the leaf surfaces of two expansive, highly abundant grass species. A guiding principle of the proposed procedure is to minimize the number of operations in collecting and preparing samples for analysis. The monitored elements are extracted from the leaf surfaces using dilute nitric acid directly in the sample-collection bottle. The only ensuing steps are filtering the extraction solution and the elemental analysis itself. The verification results indicate that the selected grasses Calamagrostis epigejos and Arrhenatherum elatius are well suited to the proposed procedure. Selected heavy metals (Zn, Cu, Pb, Ni, Cr, and Cd) in concentrations appropriate for direct determination by elemental analysis can be extracted from the leaf surfaces of these species collected in the vicinity of roads with medium traffic loads. Comparing the two species showed that each had a different relationship between the amount of deposited heavy metals and distance from the road; this disparity can be explained by the specific morphological properties of the two species' leaf surfaces. Given the abundant occurrence of both species and the method's simplicity and ready availability, we regard the proposed approach as broadly usable and repeatable.
Design and Verification of Critical Pressurised Windows for Manned Spaceflight
NASA Astrophysics Data System (ADS)
Lamoure, Richard; Busto, Lara; Novo, Francisco; Sinnema, Gerben; Leal, Mendes M.
2014-06-01
The Window Design for Manned Spaceflight (WDMS) project was tasked with establishing the state of the art and exploring possible improvements to the current structural integrity verification and fracture control methodologies for manned spacecraft windows. A critical review of the state of the art in spacecraft window design, materials and verification practice was conducted. Shortcomings of the methodology in terms of analysis, inspection and testing were identified. Schemes for improving verification practices and reducing conservatism whilst maintaining the required safety levels were then proposed. An experimental materials characterisation programme was defined and carried out with the support of the 'Glass and Façade Technology Research Group' at the University of Cambridge. Results of the sample testing campaign were analysed, post-processed and subsequently applied to the design of a breadboard window demonstrator. Two Fused Silica glass window panes were procured and subjected to dedicated analyses, inspection and testing comprising both qualification and acceptance programmes specifically tailored to the objectives of the activity. Finally, the main outcomes have been compiled into a Structural Verification Guide for Pressurised Windows in manned spacecraft, incorporating best practices and lessons learned throughout the project.
Federal Register 2010, 2011, 2012, 2013, 2014
2013-01-31
...In compliance with the Paperwork Reduction Act (PRA) of 1995 (44 U.S.C. 3501-3521), this notice announces that the Veterans Benefits Administration (VBA), Department of Veterans Affairs, will submit the collection of information abstracted below to the Office of Management and Budget (OMB) for review and comment. The PRA submission describes the nature of the information collection and its expected cost and burden; it includes the actual data collection instrument.
Roberts, Marilyn C; Joshi, Prabhu Raj; Greninger, Alexander L; Melendez, Daira; Paudel, Saroj; Acharya, Mahesh; Bimali, Nabin Kishor; Koju, Narayan P; No, David; Chalise, Mukesh; Kyes, Randall C
2018-05-01
Swine nasal samples [n = 282] were collected from healthy animals on 12 randomly selected farms around Kathmandu, Nepal. In addition, wild monkey (Macaca mulatta) saliva samples [n = 59] were collected near temple areas in Kathmandu using a non-invasive sampling technique. All samples were processed for MRSA using standardized selective media and conventional biochemical tests. MRSA verification was performed and isolates were characterized by SCCmec typing, multilocus sequence typing, whole genome sequencing [WGS] and antibiotic susceptibilities. Six (2.1%) swine MRSA isolates were recovered from five of the swine herds tested: five were ST22 type IV and one ST88 type V. Four (6.8%) macaque MRSA isolates were recovered: three ST22 SCCmec type IV and one ST239 type III. WGS showed that the eight ciprofloxacin-resistant ST22 isolates carried the gyrA mutation [S84L]. Six isolates carried erm(C) genes, five carried aacC-aphD genes and four carried blaZ genes. The linezolid-resistant swine ST22 isolate did not carry any known acquired linezolid resistance genes but had a mutation in ribosomal protein L22 [A29V] and an insertion in L4 [68KG69], both previously associated with linezolid resistance. Multiple virulence factors were also identified. This is the first time MRSA ST22 SCCmec IV has been isolated from livestock or primates.
Defining the IEEE-854 floating-point standard in PVS
NASA Technical Reports Server (NTRS)
Miner, Paul S.
1995-01-01
A significant portion of the ANSI/IEEE-854 Standard for Radix-Independent Floating-Point Arithmetic is defined in PVS (Prototype Verification System). Since IEEE-854 is a generalization of the ANSI/IEEE-754 Standard for Binary Floating-Point Arithmetic, the definition of IEEE-854 in PVS also formally defines much of IEEE-754. This collection of PVS theories provides a basis for machine-checked verification of floating-point systems. This formal definition illustrates that formal specification techniques are sufficiently advanced that it is reasonable to consider their use in the development of future standards.
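As a lightweight illustration in Python (not PVS) of the kind of representation invariant such a formal definition makes machine-checkable: every finite nonzero binary64 value factors as sign * significand * 2^exponent with the significand in [1, 2). The helper below is a sketch of that invariant, not part of the PVS development.

```python
import math

def decompose(x):
    """Return (sign, significand, exponent) with x == sign * m * 2**e
    and 1 <= m < 2, mirroring the IEEE-754 normalised form."""
    s = -1 if math.copysign(1.0, x) < 0 else 1
    m, e = math.frexp(abs(x))          # gives x = m * 2**e with 0.5 <= m < 1
    return s, m * 2, e - 1             # renormalise to 1 <= m < 2

# Spot-check the representation invariant on a few binary64 values
for x in (1.0, -3.5, 0.1, 6.02e23):
    s, m, e = decompose(x)
    assert s * m * 2.0 ** e == x       # exact: scaling by powers of 2 is lossless
    assert 1.0 <= m < 2.0
print("representation invariant holds")
```

The equality check is exact because multiplying a float by a power of two changes only the exponent field, one of the facts a formal theory of IEEE arithmetic pins down precisely.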
NASA Astrophysics Data System (ADS)
Billings, Andrew; Kaiser, Carl; Young, Craig M.; Hiebert, Laurel S.; Cole, Eli; Wagner, Jamie K. S.; Van Dover, Cindy Lee
2017-03-01
The current standard for large-volume (thousands of cubic meters) zooplankton sampling in the deep sea is the MOCNESS, a system of multiple opening-closing nets, typically lowered to within 50 m of the seabed and towed obliquely to the surface to obtain low-spatial-resolution samples that integrate across tens of meters of water depth. The SyPRID (Sentry Precision Robotic Impeller Driven) sampler is an innovative, deep-rated (6000 m) plankton sampler that partners with the Sentry Autonomous Underwater Vehicle (AUV) to obtain paired, large-volume plankton samples at specified depths and survey lines to within 1.5 m of the seabed, with simultaneous collection of sensor data. SyPRID uses a perforated Ultra-High-Molecular-Weight (UHMW) plastic tube to support a fine mesh net within an outer carbon composite tube (tube-within-a-tube design), with an axial flow pump located aft of the capture filter. The pump facilitates flow through the system and reduces or possibly eliminates the bow wave at the mouth opening. The cod end, a hollow truncated cone, is also made of UHMW plastic and includes a collection volume designed to provide an area where zooplankton can collect, out of the high-flow region. SyPRID attaches as a saddle-pack to the Sentry vehicle. Sentry itself is configured with a flight control system that enables autonomous survey paths to low altitudes. In its verification deployment at the Blake Ridge Seep (2160 m) on the US Atlantic Margin, SyPRID was operated for 6 h at an altitude of 5 m. It recovered plankton samples, including delicate living larvae, from the near-bottom stratum that is seldom sampled by a typical MOCNESS tow. The prototype SyPRID and its next generations will enable studies of plankton or other particulate distributions associated with localized physico-chemical strata in the water column or above patchy habitats on the seafloor.
The USEPA has been very active in membrane research. The following areas are currently being investigated: in-house fouling research, Information Collection Rule (ICR) treatment studies, inorganic scaling modeling, Environmental Technology Verification (ETV) program implementati...
78 FR 57162 - Agency Information Collection Activities: Submission for OMB Review; Comment Request
Federal Register 2010, 2011, 2012, 2013, 2014
2013-09-17
... that must be capable of verification by qualified auditors. Besides determining program reimbursement...). Insurers, underwriters, third party administrators, and self-insured/self-administered employers use the...
Analysis of Nerve Agent Metabolites from Hair for Long-Term Verification of Nerve Agent Exposure
2016-05-09
Analysis of Nerve Agent Metabolites from Hair for Long-Term Verification of Nerve Agent Exposure. Amanda S. Appel, John H. McDonough, Joseph D. ... feasible. In this study, hair was evaluated as a long-term repository of nerve agent hydrolysis products. Pinacolyl methylphosphonic acid (PMPA ... hydrolysis product of soman) and isopropyl methylphosphonic acid (IMPA; hydrolysis product of sarin) were extracted from hair samples with N,N ...
Genetic characterization of Zostera asiatica on the Pacific Coast of North America
Talbot, S.L.; Wyllie-Echeverria, S.; Ward, D.H.; Rearick, J.R.; Sage, G.K.; Chesney, B.; Phillips, R.C.
2006-01-01
We gathered sequence information from the nuclear 5.8S rDNA gene and associated internal transcribed spacers, ITS-1 and ITS-2 (5.8S rDNA/ITS), and the chloroplast maturase K (matK) gene, from Zostera samples collected from subtidal habitats in Monterey and Santa Barbara (Isla Vista) bays, California, to test the hypothesis that these plants are conspecific with Z. asiatica Miki of Asia. Sequences from approximately 520 base pairs of the nuclear 5.8S rDNA/ITS obtained from the subtidal Monterey and Isla Vista Zostera samples were identical to homologous sequences obtained from Z. marina collected from intertidal habitats in Japan, Alaska, Oregon and California. Similarly, sequences from the matK gene from the subtidal Zostera samples were identical to matK sequences obtained from Z. marina collected from intertidal habitats in Japan, Alaska, Oregon and California, but differed from Z. asiatica sequences accessioned into GenBank. This suggests the subtidal plants are conspecific with Z. marina, not Z. asiatica. However, we found that herbarium samples accessioned into the Kyoto University Herbarium, determined to be Z. asiatica, yielded 5.8S rDNA/ITS sequences consistent with either Z. japonica, in two cases, or Z. marina, in one case. Similar results were observed for the chloroplast matK gene; we found haplotypes that were inconsistent with published matK sequences from Z. asiatica collected from Japan. These results underscore the need for closer examination of the relationship between Z. marina along the Pacific Coast of North America, and Z. asiatica of Asia, for the retention and verification of specimens examined in scientific studies, and for assessment of the usefulness of morphological characters in the determination of taxonomic relationships within Zosteraceae.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Jankovic, John; Zontek, Tracy L.; Ogle, Burton R.
We examined the calibration records of two direct-reading instruments designated as condensation particle counters to determine the number of times they were found to be out of tolerance at the annual manufacturer's recalibration. Both instruments were found to be out of tolerance more times than within tolerance, and it was concluded that annual calibration alone was insufficient to provide operational confidence in an instrument's response. Thus, a method based on subsequent agreement with data gathered from a newly calibrated instrument was developed to confirm operational readiness between annual calibrations, hereafter referred to as bump testing. The method consists of measuring source particles produced by a gas grille spark igniter in a gallon-size jar. Sampling from this chamber with a newly calibrated instrument to determine the calibrated response over the particle concentration range of interest serves as a reference. Agreement between this reference response and subsequent responses at later dates implies that the instrument is performing as it was at the time of calibration. Side-by-side sampling allows the level of agreement between two or more instruments to be determined. This is useful when simultaneously collected data are compared for differences, i.e., background versus process aerosol concentrations. A reference set of data was obtained using the spark igniter. The generation system was found to be reproducible and suitable to form the basis of calibration verification. Finally, the bump test is simple enough to be performed periodically throughout the calibration year or prior to field monitoring.
Jankovic, John; Zontek, Tracy L.; Ogle, Burton R.; ...
2015-01-27
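The acceptance logic of such a bump test can be sketched as a simple ratio check between the freshly calibrated reference instrument and the instrument under test. The side-by-side counts and the 10% tolerance below are made-up illustrative numbers, not values from the study.

```python
import statistics

# Hypothetical side-by-side particle counts (particles/cm^3) from the
# freshly calibrated reference CPC and the instrument under bump test
reference = [5200, 10400, 20800, 41500, 83000]
candidate = [5050, 10100, 20200, 40300, 80700]

# Per-point ratio of candidate response to reference response
ratios = [c / r for c, r in zip(candidate, reference)]
mean_ratio = statistics.mean(ratios)

# Accept if the mean response agrees within a chosen tolerance
# (10% here, an illustrative criterion, not one taken from the paper)
TOLERANCE = 0.10
passed = abs(mean_ratio - 1.0) <= TOLERANCE
print(f"mean ratio {mean_ratio:.3f} -> {'PASS' if passed else 'FAIL'}")
```

Covering several concentrations in one check mirrors the paper's point that agreement should hold over the whole particle-concentration range of interest, not at a single level.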
Bai, Zhiliang; Chen, Shili; Jia, Lecheng; Zeng, Zhoumo
2018-01-01
Embracing the fact that one can recover certain signals and images from far fewer measurements than traditional methods use, compressive sensing (CS) provides solutions to the huge amounts of data collected in phased array-based material characterization. This article describes how a CS framework can be utilized to effectively compress ultrasonic phased array images in the time and frequency domains. By projecting the image onto its Discrete Cosine transform domain, a novel scheme was implemented to verify the potential of CS for data reduction, as well as to explore its reconstruction accuracy. The results from CIVA simulations indicate that both time and frequency domain CS can accurately reconstruct array images using fewer samples than the minimum required by the Nyquist theorem. For experimental verification on three types of artificial flaws, although a considerable data reduction can be achieved with defects clearly preserved, it is currently impossible to break the Nyquist limit in the time domain. Fortunately, qualified recovery in the frequency domain makes it possible, marking a real breakthrough for phased array image reconstruction. As a case study, the proposed CS procedure is applied to the inspection of an engine cylinder cavity containing different pit defects, and the results show that orthogonal matching pursuit (OMP)-based CS guarantees the performance for real application. PMID:29738452
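A toy sketch of the frequency-domain CS idea the abstract reports: a DCT-sparse synthetic A-scan is recovered from half of its time samples by orthogonal matching pursuit. The signal length, sparsity level, and random sampling pattern are illustrative assumptions, not the article's experimental settings.

```python
import numpy as np

rng = np.random.default_rng(1)
n, k, m = 64, 3, 32                        # signal length, sparsity, samples kept

# Orthonormal DCT basis as columns of B; the signal is x = B @ s with s k-sparse
t = np.arange(n)
B = np.cos(np.pi * (2 * t[:, None] + 1) * t[None, :] / (2 * n))
B[:, 0] *= np.sqrt(1.0 / n)
B[:, 1:] *= np.sqrt(2.0 / n)

s = np.zeros(n)
s[rng.choice(n, k, replace=False)] = 5.0 * rng.choice([-1.0, 1.0], k)
x = B @ s                                   # synthetic DCT-sparse "A-scan"

idx = np.sort(rng.choice(n, m, replace=False))
y, A = x[idx], B[idx, :]                    # keep only m of the n time samples

# Orthogonal matching pursuit: greedy atom selection with least-squares refit
support, r, coef = [], y.copy(), np.zeros(0)
for _ in range(2 * k):
    if np.linalg.norm(r) < 1e-10:           # residual exhausted: stop early
        break
    j = int(np.argmax(np.abs(A.T @ r)))
    if j in support:
        break
    support.append(j)
    coef, *_ = np.linalg.lstsq(A[:, support], y, rcond=None)
    r = y - A[:, support] @ coef

s_hat = np.zeros(n)
s_hat[support] = coef
err = np.linalg.norm(B @ s_hat - x) / np.linalg.norm(x)
print(f"relative reconstruction error: {err:.2e}")
```

With noiseless measurements and a sparse DCT spectrum, OMP typically recovers the signal from well below the Nyquist sample count, which is the mechanism behind the frequency-domain results described above.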
Federal Register 2010, 2011, 2012, 2013, 2014
2011-09-22
... Mental Health Center Verification Template; (2) Attachment 2--Invoice Template; (3) Attachment 3--FCC.... SUPPLEMENTARY INFORMATION: OMB Control Number: 3060-0804. Title: Universal Service--Rural Health Care Program/Rural Health Care Pilot Program. Form No.: FCC Forms 465, 466, 466-A and 467. Type of Review: Revision...
77 FR 21616 - Agency Information Collection Activities: Proposed Request and Comment Request
Federal Register 2010, 2011, 2012, 2013, 2014
2012-04-10
... disability payments. SSA considers the claimants the primary sources of verification; therefore, if claimants... or private self-insured companies administering WC/PDB benefits to disability claimants. Type of...
Gosselin, Robert C; Adcock, Dorothy M; Bates, Shannon M; Douxfils, Jonathan; Favaloro, Emmanuel J; Gouin-Thibault, Isabelle; Guillermo, Cecilia; Kawai, Yohko; Lindhoff-Last, Edelgard; Kitchen, Steve
2018-03-01
This guidance document was prepared on behalf of the International Council for Standardization in Haematology (ICSH), which provides haemostasis-related guidance documents for clinical laboratories. This inaugural coagulation ICSH document was developed by an ad hoc committee comprised of international clinical and laboratory direct oral anticoagulant (DOAC) experts. The committee developed consensus recommendations for laboratory measurement of DOACs (dabigatran, rivaroxaban, apixaban and edoxaban), which would be germane for laboratories assessing DOAC anticoagulation. This guidance document addresses all phases of laboratory DOAC measurement, including pre-analytical (e.g. preferred sample collection time, preferred sample type, sample stability), analytical (gold standard method, screening and quantifying methods) and post-analytical (e.g. reporting units, quality assurance). The committee addressed the use and limitations of screening tests such as prothrombin time and activated partial thromboplastin time, as well as viscoelastic measurements of clotting blood and point-of-care methods. Additionally, the committee provided recommendations for the proper validation or verification of laboratory assay performance prior to implementation for clinical use, and for external quality assurance to provide continuous assessment of testing and reporting methods.
Implications of sampling design and sample size for national carbon accounting systems
Michael Köhl; Andrew Lister; Charles T. Scott; Thomas Baldauf; Daniel Plugge
2011-01-01
Countries willing to adopt a REDD regime need to establish a national Measurement, Reporting and Verification (MRV) system that provides information on forest carbon stocks and carbon stock changes. Due to the extensive areas covered by forests the information is generally obtained by sample based surveys. Most operational sampling approaches utilize a combination of...
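The plot-count side of such sample-based surveys is often planned with the classic relative-error formula n = (t * CV / E)^2. A minimal sketch follows; the 60% between-plot coefficient of variation and 10% allowable error are illustrative assumptions, not figures from the paper.

```python
import math

def plots_needed(cv_percent, allowable_error_percent, t=2.0):
    """Classic sample-size formula for a simple random sample of inventory
    plots: n = (t * CV% / E%)^2, rounded up. t ~ 2 approximates the 95%
    two-sided Student's t value for a reasonably large sample."""
    return math.ceil((t * cv_percent / allowable_error_percent) ** 2)

# Illustrative planning numbers for a forest carbon stock survey
n = plots_needed(cv_percent=60, allowable_error_percent=10)
print(n)   # 144 plots for CV = 60%, E = 10%, t = 2
```

The quadratic dependence on CV/E is why MRV systems for heterogeneous forests need markedly more plots than surveys of uniform stands, a central cost driver for national carbon accounting.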
An Abstract Systolic Model and Its Application to the Design of Finite Element Systems.
1983-01-01
... networks as a collection of communicating parallel processes, some of the techniques for the verification of distributed systems (see for ...) item must be collected, even if there is no interest in its value. In this case, the collection of the data is simply achieved by changing the state of ... the appropriate data, as well as for collecting the output data and performing some additional tasks that we will discuss later. A basic functional ...
NASA Astrophysics Data System (ADS)
Connick, Robert J.
Accurate measurement of normal-incidence transmission loss is essential for the acoustic characterization of building materials. In this research, a method of measuring normal-incidence sound transmission loss proposed by Salissou et al. as a complement to standard E2611-09 of the American Society for Testing and Materials [Standard Test Method for Measurement of Normal Incidence Sound Transmission of Acoustical Materials Based on the Transfer Matrix Method (American Society for Testing and Materials, New York, 2009)] is verified. Two samples from the original literature are used to verify the method, as well as a Filtros RTM sample. Following the verification, several nano-material Aerogel samples are measured.
Hierarchical Representation Learning for Kinship Verification.
Kohli, Naman; Vatsa, Mayank; Singh, Richa; Noore, Afzel; Majumdar, Angshul
2017-01-01
Kinship verification has a number of applications such as organizing large collections of images and recognizing resemblances among humans. In this paper, first, a human study is conducted to understand the capabilities of the human mind and to identify the discriminatory areas of a face that facilitate kinship cues. The visual stimuli presented to the participants determine their ability to recognize kin relationships using the whole face as well as specific facial regions. The effect of participant gender and age and the kin-relation pair of the stimulus is analyzed using quantitative measures such as accuracy, discriminability index d', and perceptual information entropy. Utilizing the information obtained from the human study, a hierarchical kinship verification via representation learning (KVRL) framework is utilized to learn the representation of different face regions in an unsupervised manner. We propose a novel approach for feature representation termed filtered contractive deep belief networks (fcDBN). The proposed feature representation encodes relational information present in images using filters and a contractive regularization penalty. A compact representation of facial images of kin is extracted as an output from the learned model, and a multi-layer neural network is utilized to verify kinship accurately. A new WVU kinship database is created, which consists of multiple images per subject to facilitate kinship verification. The results show that the proposed deep learning framework (KVRL-fcDBN) yields state-of-the-art kinship verification accuracy on the WVU kinship database and on four existing benchmark data sets. Furthermore, kinship information is used as a soft biometric modality to boost the performance of face verification via product-of-likelihood-ratio and support vector machine based approaches. Using the proposed KVRL-fcDBN framework, an improvement of over 20% is observed in the performance of face verification.
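The product-of-likelihood-ratios fusion mentioned at the end can be sketched as follows. The Gaussian genuine/impostor score distributions and their parameters are hypothetical stand-ins for illustration, not values from the paper.

```python
from math import exp, pi, sqrt

def gaussian_pdf(x, mu, sigma):
    """Density of a normal distribution at x."""
    return exp(-((x - mu) ** 2) / (2 * sigma ** 2)) / (sigma * sqrt(2 * pi))

# Hypothetical genuine/impostor match-score distributions per modality
PARAMS = {"face":    {"gen": (0.8, 0.10), "imp": (0.4, 0.10)},
          "kinship": {"gen": (0.7, 0.15), "imp": (0.5, 0.15)}}

def fused_lr(scores):
    """Product-of-likelihood-ratios fusion over independent modalities:
    a fused LR > 1 favours the 'genuine' (matching) hypothesis."""
    lr = 1.0
    for mod, s in scores.items():
        gen, imp = PARAMS[mod]["gen"], PARAMS[mod]["imp"]
        lr *= gaussian_pdf(s, *gen) / gaussian_pdf(s, *imp)
    return lr

print(fused_lr({"face": 0.75, "kinship": 0.68}))
```

Treating kinship as a soft biometric in this way lets a weak but independent cue multiply into the face-verification decision, which is how the reported performance boost is obtained.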
ETV TEST OF PCDD/F EMISSIONS MONITORING SYSTEMS
Four polychlorinated dibenzodioxin and furan (PCDD/F) emission monitors were tested under the EPA Environmental Technology and Verification (ETV) program. Two long-term sampling devices, the DioxinMonitoringSystem and Adsorption Method for Sampling Dioxins and Furans, and two sem...
1991-07-01
... regulatory and technical lessons learned concerning disposition of soil that is considered hazardous after treatment. The report also documents the data collected in support of soil disposition ... were undertaken to support delisting of the soil, including the Wii / verification test burn, a RCRA trial burn, and data collected during routine
Christensen, James C.; Shiyanov, Pavel A.; Estepp, Justin R.; Schlager, John J.
2014-01-01
Expanding interest in oxytocin, particularly the role of endogenous oxytocin in human social behavior, has created a pressing need for replication of results and verification of assay methods. In this study, we sought to replicate and extend previous results correlating plasma oxytocin with trust and trustworthy behavior. As a necessary first step, the two most commonly used commercial assays were compared in human plasma via the addition of a known quantity of exogenous oxytocin, with and without sample extraction. Plasma sample extraction was found to be critical in obtaining repeatable concentrations of oxytocin. In the subsequent trust experiment, twelve samples in duplicate, from each of 82 participants, were collected over approximately six hours during the performance of a Prisoner’s Dilemma task paradigm that stressed human interpersonal trust. We found no significant relationship between plasma oxytocin concentrations and trusting or trustworthy behavior. In light of these findings, previous published work that used oxytocin immunoassays without sample extraction should be reexamined and future research exploring links between endogenous human oxytocin and trust or social behavior should proceed with careful consideration of methods and appropriate biofluids for analysis. PMID:25549255
[Quality Management System in Pathological Laboratory].
Koyatsu, Junichi; Ueda, Yoshihiko
2015-07-01
Even compared to other clinical laboratories, the pathological laboratory performs troublesome work, and many of its work processes are manual. Therefore, the introduction of systematic management is necessary, and using existing standards such as ISO 15189 is a shortcut for this purpose. There is no standard specialized for the pathological laboratory, but the following points are considered particularly important for one. 1. Safety management of the personnel and environmental conditions: comply with laws and regulations concerning the handling of hazardous materials. 2. Pre-examination processes: the laboratory shall have documented procedures for the proper collection and handling of primary samples; developed and documented criteria for acceptance or rejection of samples are applied. 3. Examination processes: selection, verification, and validation of the examination procedures; devise a system that can constantly monitor the traceability of the sample. 4. Post-examination processes: storage, retention, and disposal of clinical samples. 5. Release of results: when examination results fall within established alert or critical intervals, immediately notify the physicians. The important point is to recognize the needs of the client and be aware that pathological diagnoses are always "the final diagnoses".
Pogorzelec, Marta; Piekarska, Katarzyna
2018-08-01
The primary goal of the presented study was the investigation of occurrence and concentration of sixteen selected polycyclic aromatic hydrocarbons in samples from various stages of water treatment and verification of the applicability of semi-permeable membrane devices in the monitoring of drinking water. Another objective was to verify if weather seasons affect the concentration and complexity of PAHs. For these purposes, semipermeable membrane devices were installed in a surface water treatment plant located in Lower Silesia (Poland). Samples were collected monthly over a period of one year. To determine the effect of water treatment on PAH concentrations, four sampling sites were selected: raw water input, a stream of water in the pipe just before ozonation, treated water output and water after passing through the distribution system. After each month of sampling, SPMDs were exchanged for fresh ones and prepared for instrumental analysis. Concentrations of polycyclic aromatic hydrocarbons were determined by high-performance liquid chromatography (HPLC). The presented study indicates that semipermeable membrane devices can be an effective tool for the analysis of drinking water, in which organic micropollutants occur at very low concentrations. Copyright © 2018 Elsevier B.V. All rights reserved.
2006-09-30
High-Pressure Waterjet • CO2 Pellet/Turbine Wheel • Ultrahigh-Pressure Waterjet 5 Process Water Reuse/Recycle • Cross-Flow Microfiltration ...documented on a process or laboratory form. Corrective action will involve taking all necessary steps to restore a measuring system to proper working order...In all cases, a nonconformance will be rectified before sample processing and analysis continues. If corrective action does not restore the
Skates, Steven J.; Gillette, Michael A.; LaBaer, Joshua; Carr, Steven A.; Anderson, N. Leigh; Liebler, Daniel C.; Ransohoff, David; Rifai, Nader; Kondratovich, Marina; Težak, Živana; Mansfield, Elizabeth; Oberg, Ann L.; Wright, Ian; Barnes, Grady; Gail, Mitchell; Mesri, Mehdi; Kinsinger, Christopher R.; Rodriguez, Henry; Boja, Emily S.
2014-01-01
Protein biomarkers are needed to deepen our understanding of cancer biology and to improve our ability to diagnose, monitor and treat cancers. Important analytical and clinical hurdles must be overcome to allow the most promising protein biomarker candidates to advance into clinical validation studies. Although contemporary proteomics technologies support the measurement of large numbers of proteins in individual clinical specimens, sample throughput remains comparatively low. This problem is amplified in typical clinical proteomics research studies, which routinely suffer from a lack of proper experimental design, resulting in analysis of too few biospecimens to achieve adequate statistical power at each stage of a biomarker pipeline. To address this critical shortcoming, a joint workshop was held by the National Cancer Institute (NCI), National Heart, Lung and Blood Institute (NHLBI), and American Association for Clinical Chemistry (AACC), with participation from the U.S. Food and Drug Administration (FDA). An important output from the workshop was a statistical framework for the design of biomarker discovery and verification studies. Herein, we describe the use of quantitative clinical judgments to set statistical criteria for clinical relevance, and the development of an approach to calculate biospecimen sample size for proteomic studies in discovery and verification stages prior to the clinical validation stage. This represents a first step towards building a consensus on quantitative criteria for statistical design of proteomics biomarker discovery and verification research. PMID:24063748
Skates, Steven J; Gillette, Michael A; LaBaer, Joshua; Carr, Steven A; Anderson, Leigh; Liebler, Daniel C; Ransohoff, David; Rifai, Nader; Kondratovich, Marina; Težak, Živana; Mansfield, Elizabeth; Oberg, Ann L; Wright, Ian; Barnes, Grady; Gail, Mitchell; Mesri, Mehdi; Kinsinger, Christopher R; Rodriguez, Henry; Boja, Emily S
2013-12-06
Protein biomarkers are needed to deepen our understanding of cancer biology and to improve our ability to diagnose, monitor, and treat cancers. Important analytical and clinical hurdles must be overcome to allow the most promising protein biomarker candidates to advance into clinical validation studies. Although contemporary proteomics technologies support the measurement of large numbers of proteins in individual clinical specimens, sample throughput remains comparatively low. This problem is amplified in typical clinical proteomics research studies, which routinely suffer from a lack of proper experimental design, resulting in analysis of too few biospecimens to achieve adequate statistical power at each stage of a biomarker pipeline. To address this critical shortcoming, a joint workshop was held by the National Cancer Institute (NCI), National Heart, Lung, and Blood Institute (NHLBI), and American Association for Clinical Chemistry (AACC) with participation from the U.S. Food and Drug Administration (FDA). An important output from the workshop was a statistical framework for the design of biomarker discovery and verification studies. Herein, we describe the use of quantitative clinical judgments to set statistical criteria for clinical relevance and the development of an approach to calculate biospecimen sample size for proteomic studies in discovery and verification stages prior to the clinical validation stage. This represents a first step toward building a consensus on quantitative criteria for statistical design of proteomics biomarker discovery and verification research.
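The idea of calculating biospecimen sample size for discovery and verification stages can be illustrated with a textbook two-group power calculation on the log scale. This is a generic sketch under an assumed log-normal measurement model; the function name and example parameter values are ours, not the workshop's specific framework:

```python
from statistics import NormalDist
import math

def samples_per_group(fold_change, cv, alpha=0.05, power=0.8):
    """Biospecimens per group needed to detect a given fold change in a
    log-normally distributed protein measurement whose total (biological
    plus technical) coefficient of variation is `cv`, using the standard
    two-sample normal approximation on the log scale."""
    sd = math.sqrt(math.log(1.0 + cv * cv))   # standard deviation of log-intensity
    delta = math.log(fold_change)             # effect size on the log scale
    z = NormalDist().inv_cdf(1 - alpha / 2) + NormalDist().inv_cdf(power)
    return math.ceil(2 * (z * sd / delta) ** 2)
```

For example, detecting a 2-fold change at 5% significance and 80% power with a 50% CV requires on the order of 8 specimens per group, while a 1.5-fold change under the same assumptions requires roughly three times as many.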
Branck, Tobyn A.; Hurley, Matthew J.; Prata, Gianna N.; Crivello, Christina A.
2017-01-01
Listeria monocytogenes is of great concern in food processing facilities because it persists in biofilms, facilitating biotransfer. Stainless steel is commonly used for food contact surfaces and transport containers. L. monocytogenes biofilms on stainless steel served as a model system for surface sampling, to test the performance of a sonicating swab in comparison with a standard cotton swab. Swab performance and consistency were determined using total viable counts. Stainless steel coupons sampled with both types of swabs were examined using scanning electron microscopy, to visualize biofilms and surface structures (i.e., polishing grooves and scratches). Laser scanning confocal microscopy was used to image and to quantitate the biofilms remaining after sampling with each swab type. The total viable counts were significantly higher (P ≤ 0.05) with the sonicating swab than with the standard swab in each trial. The sonicating swab was more consistent in cell recovery than was the standard swab, with coefficients of variation ranging from 8.9% to 12.3% and from 7.1% to 37.6%, respectively. Scanning electron microscopic imaging showed that biofilms remained in the polished grooves of the coupons sampled with the standard swab but were noticeably absent with the sonicating swab. Percent area measurements of biofilms remaining on the stainless steel coupons showed significantly (P ≤ 0.05) less biofilm remaining when the sonicating swab was used (median, 1.1%), compared with the standard swab (median, 70.4%). The sonicating swab, which combines sonication, suction, and scrubbing, provided greater recovery of cells, with more consistency, than did the standard swab. IMPORTANCE Inadequate surface sampling can result in foodborne illness outbreaks from biotransfer, since verification of sanitization protocols relies on surface sampling and recovery of microorganisms for detection and enumeration.
Swabbing is a standard method for microbiological sampling of surfaces. Although swabbing offers portability and ease of use, there are limitations, such as high user variability and low recovery rates, which can be attributed to many different causes. This study demonstrates some benefits that a sonicating swab has over a standard swab for removal and collection of microbiological samples from a surface, to provide better verification of surface cleanliness and to help decrease the potential for biotransfer of pathogens into foods. PMID:28314729
Branck, Tobyn A; Hurley, Matthew J; Prata, Gianna N; Crivello, Christina A; Marek, Patrick J
2017-06-01
Listeria monocytogenes is of great concern in food processing facilities because it persists in biofilms, facilitating biotransfer. Stainless steel is commonly used for food contact surfaces and transport containers. L. monocytogenes biofilms on stainless steel served as a model system for surface sampling, to test the performance of a sonicating swab in comparison with a standard cotton swab. Swab performance and consistency were determined using total viable counts. Stainless steel coupons sampled with both types of swabs were examined using scanning electron microscopy, to visualize biofilms and surface structures (i.e., polishing grooves and scratches). Laser scanning confocal microscopy was used to image and to quantitate the biofilms remaining after sampling with each swab type. The total viable counts were significantly higher (P ≤ 0.05) with the sonicating swab than with the standard swab in each trial. The sonicating swab was more consistent in cell recovery than was the standard swab, with coefficients of variation ranging from 8.9% to 12.3% and from 7.1% to 37.6%, respectively. Scanning electron microscopic imaging showed that biofilms remained in the polished grooves of the coupons sampled with the standard swab but were noticeably absent with the sonicating swab. Percent area measurements of biofilms remaining on the stainless steel coupons showed significantly (P ≤ 0.05) less biofilm remaining when the sonicating swab was used (median, 1.1%), compared with the standard swab (median, 70.4%). The sonicating swab, which combines sonication, suction, and scrubbing, provided greater recovery of cells, with more consistency, than did the standard swab. IMPORTANCE Inadequate surface sampling can result in foodborne illness outbreaks from biotransfer, since verification of sanitization protocols relies on surface sampling and recovery of microorganisms for detection and enumeration. Swabbing is a standard method for microbiological sampling of surfaces.
Although swabbing offers portability and ease of use, there are limitations, such as high user variability and low recovery rates, which can be attributed to many different causes. This study demonstrates some benefits that a sonicating swab has over a standard swab for removal and collection of microbiological samples from a surface, to provide better verification of surface cleanliness and to help decrease the potential for biotransfer of pathogens into foods. Copyright © 2017 American Society for Microbiology.
The F1000Research: Ebola article collection
Piot, Peter
2014-01-01
The explosion of information about Ebola requires rapid publication, transparent verification and unrestricted access. I urge everyone involved in all aspects of the Ebola epidemic to openly and rapidly report their experiences and findings. PMID:25580233
Federal Register 2010, 2011, 2012, 2013, 2014
2013-12-17
...: VA Form Letter 21-914 is used to verify whether Filipino veterans of the Special Philippine Scouts, Commonwealth Army of the Philippines, organized guerrilla groups receiving service-connected compensation...
Verenitch, Sergei; Mazumder, Asit
2015-01-01
The use of nitrogen stable isotopes to discriminate between conventionally and organically grown crops has been further developed in this study. Soil and irrigation water from different regions, as well as nitrogen fertilizers used, have been examined in detail to determine their effects on nitrogen isotope composition of spinach, lettuce, broccoli and tomatoes. Over 1000 samples of various types of organically and conventionally grown produce of known origin, along with the samples of nitrogen fertilizers used for their growth, have been analysed in order to assemble the datasets of crop/fertilizer correlations. The results demonstrate that the developed approach can be used as a valuable component in the verification of agricultural practices for more than 25 different types of commercially grown green produce, either organic or conventional. Over a period of two years, various organic and non-organic greens, from different stores in Seattle (WA, USA) and Victoria (BC, Canada), were collected and analysed using this methodology with the objective of determining any pattern of misrepresentation.
DECHADE: DEtecting slight Changes with HArd DEcisions in Wireless Sensor Networks
NASA Astrophysics Data System (ADS)
Ciuonzo, D.; Salvo Rossi, P.
2018-07-01
This paper focuses on the problem of change detection through a Wireless Sensor Network (WSN) whose nodes report only binary decisions (on the presence/absence of a certain event to be monitored), due to bandwidth/energy constraints. The resulting problem can be modelled as testing the equality of samples drawn from independent Bernoulli probability mass functions, when the bit probabilities under both hypotheses are not known. Both One-Sided (OS) and Two-Sided (TS) tests are considered, with reference to: (i) identical bit probability (a homogeneous scenario), (ii) different per-sensor bit probabilities (a non-homogeneous scenario) and (iii) regions with identical bit probability (a block-homogeneous scenario) for the observed samples. The goal is to provide a systematic framework collecting a plethora of viable detectors (designed via theoretically founded criteria) which can be used for each instance of the problem. Finally, verification of the derived detectors in two relevant WSN-related problems is provided to show the appeal of the proposed framework.
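The homogeneous two-sided instance of this problem, testing whether two sets of one-bit sensor reports are drawn from Bernoulli distributions with the same unknown probability, can be sketched with a standard generalized likelihood-ratio test. This is a textbook construction offered for illustration, not one of the paper's specific detectors:

```python
import math

def bernoulli_glrt(x, y):
    """Generalized likelihood-ratio statistic for the two-sided test of
    whether two binary samples share the same success probability.
    Under H0 it is asymptotically chi-squared with 1 degree of freedom."""
    def loglik(k, n, p):
        if p in (0.0, 1.0):
            # degenerate MLE: log-likelihood is 0 when the data agree with p
            return 0.0 if k in (0, n) else float("-inf")
        return k * math.log(p) + (n - k) * math.log(1 - p)

    n1, n2, k1, k2 = len(x), len(y), sum(x), sum(y)
    p1, p2 = k1 / n1, k2 / n2            # per-sample MLEs (H1)
    p0 = (k1 + k2) / (n1 + n2)           # pooled MLE (H0)
    return 2 * (loglik(k1, n1, p1) + loglik(k2, n2, p2)
                - loglik(k1, n1, p0) - loglik(k2, n2, p0))
```

Comparing the statistic against the chi-squared critical value (3.84 at the 5% level) gives a hard decision on whether the underlying bit probability has changed between the two collection windows.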
Exploration of Uncertainty in Glacier Modelling
NASA Technical Reports Server (NTRS)
Thompson, David E.
1999-01-01
There are procedures and methods for verification of coding algebra and for validations of models and calculations that are in use in the aerospace computational fluid dynamics (CFD) community. These methods would be efficacious if used by the glacier dynamics modelling community. This paper is a presentation of some of those methods, and how they might be applied to uncertainty management supporting code verification and model validation for glacier dynamics. The similarities and differences between their use in CFD analysis and the proposed application of these methods to glacier modelling are discussed. After establishing sources of uncertainty and methods for code verification, the paper looks at a representative sampling of verification and validation efforts that are underway in the glacier modelling community, and establishes a context for these within overall solution quality assessment. Finally, an information architecture and interactive interface is introduced and advocated. This Integrated Cryospheric Exploration (ICE) Environment is proposed for exploring and managing sources of uncertainty in glacier modelling codes and methods, and for supporting scientific numerical exploration and verification. The details and functionality of this Environment are described based on modifications of a system already developed for CFD modelling and analysis.
NASA Technical Reports Server (NTRS)
Venkatapathy, Ethiraj; Gage, Peter; Wright, Michael J.
2017-01-01
Mars Sample Return is our Grand Challenge for the coming decade. TPS (Thermal Protection System) nominal performance is not the key challenge. The main difficulty for designers is the need to verify unprecedented reliability for the entry system: current guidelines for prevention of backward contamination require that the probability of spores larger than 1 micron in diameter escaping into the Earth environment be lower than 1 in 1,000,000 for the entire system, and the allocation to TPS would be more stringent than that. For reference, the reliability allocation for Orion TPS is closer to 1 in 1,000, and the demonstrated reliability for previous human Earth return systems was closer to 1 in 100. Improving reliability by more than 3 orders of magnitude is a grand challenge indeed. The TPS community must embrace the possibility of new architectures that prioritize reliability over thermal performance and mass efficiency. The MSR (Mars Sample Return) EEV (Earth Entry Vehicle) will be hit by MMOD (Micrometeoroid and Orbital Debris) prior to reentry. A chute-less aero-shell design with a self-righting shape was baselined in prior MSR studies, on the assumption that a passive system will maximize EEV robustness. Hence the aero-shell, along with the TPS, has to withstand ground impact without breaking apart. System verification will require testing to establish ablative performance and thermal failure, but also testing of damage from MMOD and of structural performance at ground impact. Mission requirements will demand analysis, testing, and verification that are focused on establishing the reliability of the design. In this proposed talk, we will focus on the grand challenge of the MSR EEV TPS and the need for innovative approaches to address challenges in modeling, testing, manufacturing, and verification.
Earth Resources Technology Satellite Operations Control Center (OCC). ERTS-B flight activation plan
NASA Technical Reports Server (NTRS)
1974-01-01
Included in this plan are general objectives through Day 7, operational guidelines, and restraints. Following the activation of all subsystems (through Day 3), a special series of payload operations was performed to obtain data samples for the different combinations of exposure/gain settings. This took place from Day 4 through Day 7. The Orbit Adjust was employed to perform vernier corrections after the orbit had been defined. The orbit data was collected through Day 3, with the corrections being made from Day 4 through Day 7. ERTS command auxiliary memory (ECAM) was activated on Day 3 and the memory dumped to a narrow-band tape recorder. A verification of memory was done in the offline mode. ECAM was not used in a payload support mode until Day 7.
40 CFR 1065.925 - PEMS preparation for field testing.
Code of Federal Regulations, 2010 CFR
2010-07-01
.... 1065.925 Section 1065.925 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) AIR... purge any gaseous sampling PEMS instruments with ambient air until sampling begins to prevent system contamination from excessive cold-start emissions. (e) Conduct calibrations and verifications. (f) Operate any...
ENVIRONMENTAL TECHNOLOGY VERIFICATION (ETV) TEST OF DIOXIN EMISSION MONITORS
The performance of four dioxin emission monitors including two long-term sampling devices, the DMS (DioxinMonitoringSystem) and AMESA (Adsorption Method for Sampling Dioxins and Furans), and two semi-real-time continuous monitors, RIMMPA-TOFMS (Resonance Ionization with Multi-Mir...
Field procedures for verification and adjustment of fire behavior predictions
Richard C. Rothermel; George C. Rinehart
1983-01-01
The problem of verifying predictions of fire behavior, primarily rate of spread, is discussed in terms of the fire situation for which predictions are made, and the type of fire where data are to be collected. Procedures for collecting data and performing analysis are presented for both readily accessible fires where data should be complete, and for inaccessible fires...
Improved Hip-Based Individual Recognition Using Wearable Motion Recording Sensor
NASA Astrophysics Data System (ADS)
Gafurov, Davrondzhon; Bours, Patrick
In today's society the demand for reliable verification of a user's identity is increasing. Although biometric technologies based on fingerprint or iris can provide accurate and reliable recognition performance, they are inconvenient for periodic or frequent re-verification. In this paper we propose a hip-based user recognition method which can be suitable for implicit and periodic re-verification of identity. In our approach we use a wearable accelerometer sensor attached to the hip of the person, and the measured hip motion signal is then analysed for identity verification purposes. The main analysis steps consist of detecting gait cycles in the signal and matching two sets of detected gait cycles. Evaluating the approach on a hip data set consisting of 400 gait sequences (samples) from 100 subjects, we obtained an equal error rate (EER) of 7.5%, and the identification rate at rank 1 was 81.4%. These numbers are improvements of 37.5% and 11.2%, respectively, over a previous study using the same data set.
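An equal error rate such as the 7.5% reported above is the operating point where the false-reject rate (genuine users turned away) equals the false-accept rate (impostors let in). A minimal threshold sweep over match scores, shown here as generic background rather than the authors' gait-cycle matcher, looks like:

```python
def equal_error_rate(genuine, impostor):
    """Equal error rate from two lists of similarity scores: sweep a
    decision threshold and return the midpoint of FRR and FAR at the
    threshold where the two rates are closest."""
    best_gap, eer = float("inf"), None
    for t in sorted(set(genuine) | set(impostor)):
        frr = sum(g < t for g in genuine) / len(genuine)     # genuine rejected
        far = sum(i >= t for i in impostor) / len(impostor)  # impostors accepted
        if abs(frr - far) < best_gap:
            best_gap, eer = abs(frr - far), (frr + far) / 2
    return eer
```

In practice the genuine scores come from matching two gait samples of the same subject and the impostor scores from matching samples of different subjects.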
National Center for Nuclear Security: The Nuclear Forensics Project (F2012)
DOE Office of Scientific and Technical Information (OSTI.GOV)
Klingensmith, A. L.
These presentation visuals introduce the National Center for Nuclear Security. Its chartered mission is to enhance the Nation’s verification and detection capabilities in support of nuclear arms control and nonproliferation through R&D activities at the NNSS. It has three focus areas: Treaty Verification Technologies, Nonproliferation Technologies, and Technical Nuclear Forensics. The objectives of nuclear forensics are to reduce uncertainty in the nuclear forensics process and to improve the scientific defensibility of nuclear forensics conclusions when applied to near-surface nuclear detonations. Research is in four key areas: nuclear physics, debris collection and analysis, prompt diagnostics, and radiochemistry.
WASTEWATER INFRASTRUCTURE TECHNOLOGY VERIFICATION
Many of the wastewater collection systems in the United States were developed in the early part of the last century. Maintenance, retrofits, and rehabilitations since then have resulted in patchwork systems consisting of technologies from different eras. More advanced and cos...
NASA Technical Reports Server (NTRS)
1976-01-01
The framework within which the Applications Systems Verification Tests (ASVTs) are performed and the economic consequences of improved meteorological information demonstrated is described. This framework considers the impact of improved information on decision processes, the data needs to demonstrate the economic impact of the improved information, the data availability, the methodology for determining and analyzing the collected data and demonstrating the economic impact of the improved information, and the possible methods of data collection. Three ASVTs are considered and program outlines and plans are developed for performing experiments to demonstrate the economic consequences of improved meteorological information. The ASVTs are concerned with the citrus crop in Florida, the cotton crop in Mississippi and a group of diverse crops in Oregon. The program outlines and plans include schedules, manpower estimates and funding requirements.
Quality control and quality assurance plan for bridge channel-stability assessments in Massachusetts
Parker, Gene W.; Pinson, Harlow
1993-01-01
A quality control and quality assurance plan has been implemented as part of the Massachusetts bridge scour and channel-stability assessment program. This program is being conducted by the U.S. Geological Survey, Massachusetts-Rhode Island District, in cooperation with the Massachusetts Highway Department. Project personnel training, data-integrity verification, and new data-management technologies are being utilized in the channel-stability assessment process to improve current data-collection and management techniques. An automated data-collection procedure has been implemented to standardize channel-stability assessments on a regular basis within the State. An object-oriented data structure and new image management tools are used to produce a data base enabling management of multiple data object classes. Data will be reviewed by assessors and data base managers before being merged into a master bridge-scour data base, which includes automated data-verification routines.
On marker-based parentage verification via non-linear optimization.
Boerner, Vinzent
2017-06-15
Parentage verification by molecular markers is mainly based on short tandem repeat markers. Single nucleotide polymorphisms (SNPs) as bi-allelic markers have become the markers of choice for genotyping projects. Thus, the subsequent step is to use SNP genotypes for parentage verification as well. Recent developments of algorithms such as evaluating opposing homozygous SNP genotypes have drawbacks, for example the inability of rejecting all animals of a sample of potential parents. This paper describes an algorithm for parentage verification by constrained regression which overcomes the latter limitation and proves to be very fast and accurate even when the number of SNPs is as low as 50. The algorithm was tested on a sample of 14,816 animals with 50, 100 and 500 SNP genotypes randomly selected from 40k genotypes. The samples of putative parents of these animals contained either five random animals, or four random animals and the true sire. Parentage assignment was performed by ranking of regression coefficients, or by setting a minimum threshold for regression coefficients. The assignment quality was evaluated by the power of assignment (P_A) and the power of exclusion (P_E). If the sample of putative parents contained the true sire and parentage was assigned by coefficient ranking, P_A and P_E were both higher than 0.99 for the 500 and 100 SNP genotypes, and higher than 0.98 for the 50 SNP genotypes. When parentage was assigned by a coefficient threshold, P_A was higher than 0.99 regardless of the number of SNPs, but P_E decreased from 0.99 (500 SNPs) to 0.97 (100 SNPs) and 0.92 (50 SNPs). If the sample of putative parents did not contain the true sire and parentage was rejected using a coefficient threshold, the algorithm achieved a P_E of 1 (500 SNPs), 0.99 (100 SNPs) and 0.97 (50 SNPs).
The algorithm described here is easy to implement, fast, and accurate, and is able to assign parentage using genomic marker data with as few as 50 SNPs.
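The constrained-regression idea, regressing an offspring's SNP genotypes on those of all putative parents with coefficients forced to be non-negative and then ranking or thresholding the coefficients, can be sketched as follows. The projected-gradient solver, step size, and simulated genotypes are illustrative assumptions on our part, not the paper's implementation:

```python
import numpy as np

def rank_parents(offspring, candidates, steps=4000, lr=5e-4):
    """Regress an offspring's genotype vector (0/1/2 allele counts) on
    the genotypes of all putative parents with coefficients constrained
    to be non-negative, then rank candidates by coefficient size.
    Solved here by projected gradient descent for simplicity."""
    X = np.asarray(candidates, dtype=float).T   # SNPs x candidates
    y = np.asarray(offspring, dtype=float)
    b = np.zeros(X.shape[1])
    for _ in range(steps):
        grad = X.T @ (X @ b - y)                # least-squares gradient
        b = np.clip(b - lr * grad, 0.0, None)   # project onto b >= 0
    return np.argsort(b)[::-1], b
```

In a toy simulation where the offspring's genotype is the average of a true sire and an un-genotyped dam, the true sire's coefficient dominates those of unrelated candidates, so ranking (or thresholding) the coefficients recovers him.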
Accurate Biomass Estimation via Bayesian Adaptive Sampling
NASA Technical Reports Server (NTRS)
Wheeler, Kevin R.; Knuth, Kevin H.; Castle, Joseph P.; Lvov, Nikolay
2005-01-01
The following concepts were introduced: a) Bayesian adaptive sampling for solving biomass estimation; b) characterization of MISR Rahman model parameters conditioned upon MODIS landcover; c) a rigorous non-parametric Bayesian approach to analytic mixture model determination; and d) a unique U.S. asset for science product validation and verification.
Rapid determination of alpha emitters using Actinide resin.
Navarro, N; Rodriguez, L; Alvarez, A; Sancho, C
2004-01-01
The European Commission has recently published the recommended radiological protection criteria for the clearance of buildings and building rubble from the dismantling of nuclear installations. Radionuclide-specific clearance levels for actinides are very low (between 0.1 and 1 Bq g⁻¹). The prevalence of natural radionuclides in rubble materials makes the verification of these levels by direct alpha counting impossible. The capability of Actinide resin (Eichrom Industries, Inc.) for extracting plutonium and americium from rubble samples has been tested in this work. Besides a strong affinity for actinides in the tri-, tetra- and hexavalent oxidation states, this extraction chromatographic resin allows easy recovery of the sorbed radionuclides. The retention capability was evaluated on rubble samples spiked with certified radionuclide standards (239Pu and 241Am). Samples were leached with nitric acid and passed through a chromatographic column containing the resin, and the elution fraction was measured by liquid scintillation counting (LSC). Actinide retention varies from 60% to 80%. Based on these results, a rapid method for the verification of clearance levels for actinides in rubble samples is proposed.
Nguyen, Huynh; Morgan, David A F; Sly, Lindsay I; Benkovich, Morris; Cull, Sharon; Forwood, Mark R
2008-06-01
ISO 11137-2006 (ISO 11137-2a 2006) provides a VDmax 15 method for substantiation of 15 kGy as radiation sterilisation dose (RSD) for health care products with a relatively low sample requirement. Moreover, the method is also valid for products in which the bioburden level is less than or equal to 1.5. In the literature, the bioburden level of processed bone allografts is extremely low. Similarly, the Queensland Bone Bank (QBB) usually recovers no viable organisms from processed bone allografts. Because bone allografts are treated as a type of health care product, the aim of this research was to substantiate 15 kGy as a RSD for frozen bone allografts at the QBB using method VDmax 15-ISO 11137-2: 2006 (ISO 11137-2e, Procedure for method VDmax 15 for multiple production batches. Sterilisation of health care products - radiation - part 2: establishing the sterilisation dose, 2006; ISO 11137-2f, Procedure for method VDmax 15 for a single production batch. Sterilisation of health care products - radiation - part 2: establishing the sterilisation dose, 2006). 30 femoral heads, 40 milled bone allografts and 40 structural bone allografts manufactured according to QBB standard operating procedures were used. Estimated bioburdens for each bone allograft group were used to calculate the verification doses. Next, 10 samples per group were irradiated at the verification dose, sterility was tested and the number of positive tests of sterility recorded. If the number of positive samples was no more than 1, from the 10 tests carried out in each group, the verification was accepted and 15 kGy was substantiated as RSD for those bone allografts. The bioburdens in all three groups were 0, and therefore the verification doses were 0 kGy. Sterility tests of femoral heads and milled bones were all negative (no contamination), and there was one positive test of sterility in the structural bone allograft. Accordingly, the verification was accepted. 
Using the ISO validated protocol, VDmax 15, 15 kGy was substantiated as RSD for frozen bone allografts manufactured at the QBB.
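The accept/reject logic applied to each allograft group reduces to a simple rule, sketched here from the description above (the function name is ours, not from ISO 11137):

```python
def vdmax_verification_accepted(positive_tests, tests_run=10):
    """VDmax verification accept/reject rule as described above: of the
    10 samples per group irradiated at the verification dose and tested
    for sterility, at most 1 positive test of sterility is allowed."""
    if tests_run != 10:
        raise ValueError("the VDmax verification experiment uses 10 samples")
    return positive_tests <= 1
```

Under this rule, the single positive sterility test observed in the structural bone allograft group still leaves the verification accepted, which is why 15 kGy could be substantiated for all three groups.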
Analysis of particulate contamination on tape lift samples from the VETA optical surfaces
NASA Technical Reports Server (NTRS)
Germani, Mark S.
1992-01-01
Particulate contamination analysis was carried out on samples taken from the Verification Engineering Test Article (VETA) x-ray detection system. A total of eighteen tape lift samples were taken from the VETA optical surfaces. Initially, the samples were tested using a scanning electron microscope. Additionally, particle composition was determined by energy dispersive x-ray spectrometry. Results are presented in terms of particle loading per sample.
Federal Register 2010, 2011, 2012, 2013, 2014
2012-11-13
...The Veterans Benefits Administration (VBA), Department of Veterans Affairs (VA), is announcing an opportunity for public comment on the proposed collection of certain information by the agency. Under the Paperwork Reduction Act (PRA) of 1995, Federal agencies are required to publish notice in the Federal Register concerning each proposed collection of information, including each proposed extension of a currently approved collection, and allow 60 days for public comment in response to the notice. This notice solicits comments on information needed to determine an individual's continued entitlement to VA benefits.
Improved orbiter waste collection system study
NASA Technical Reports Server (NTRS)
Bastin, P. H.
1984-01-01
Design concepts for improved fecal waste collection both on the space shuttle orbiter and as a precursor for the space station are discussed. Inflight usage problems associated with the existing orbiter waste collection subsystem are considered. A basis was sought for the selection of an optimum waste collection system concept which may ultimately result in the development of an orbiter flight test article for concept verification and subsequent production of new flight hardware. Two concepts were selected for the orbiter and are described in detail. Additionally, one concept selected for application to the space station is presented.
Benchmarking desktop and mobile handwriting across COTS devices: The e-BioSign biometric database
Tolosana, Ruben; Vera-Rodriguez, Ruben; Fierrez, Julian; Morales, Aythami; Ortega-Garcia, Javier
2017-01-01
This paper describes the design, acquisition process and baseline evaluation of the new e-BioSign database, which includes dynamic signature and handwriting information. Data is acquired from 5 different COTS devices: three Wacom devices (STU-500, STU-530 and DTU-1031) specifically designed to capture dynamic signatures and handwriting, and two general-purpose tablets (Samsung Galaxy Note 10.1 and Samsung ATIV 7). For the two Samsung tablets, data is collected using both a pen stylus and the finger in order to study the performance of signature verification in a mobile scenario. Data was collected in two sessions for 65 subjects, and includes dynamic information of the signature, the full name and alphanumeric sequences. Skilled forgeries were also performed for signatures and full names. We also report a benchmark evaluation based on e-BioSign for person verification under three different real scenarios: 1) intra-device, 2) inter-device, and 3) mixed writing-tool. We evaluated the proposed benchmark using the main existing approaches for signature verification: feature-based and time-function-based. As a result, new insights into the problem of signature biometrics in sensor-interoperable scenarios have been obtained, namely: the importance of specific methods for dealing with device interoperability, and the necessity of a deeper analysis of signatures acquired using the finger as the writing tool. This e-BioSign public database allows the research community to: 1) further analyse and develop signature verification systems in realistic scenarios, and 2) work towards a better understanding of the nature of human handwriting when captured using electronic COTS devices in realistic conditions. PMID:28475590
Federal Register 2010, 2011, 2012, 2013, 2014
2011-07-14
... a TWIC and a voluntary customer satisfaction survey. DATES: Send your comments by August 15, 2011. A... identification verification and access control. TSA also conducts a survey to capture worker overall satisfaction...
Quality control of recycled asphaltic concrete: final report.
DOT National Transportation Integrated Search
1982-07-01
This study examined the variations found in recycled asphaltic concrete mix based upon plant quality control data and verification testing. The data were collected from four recycled hot-mix projects constructed in 1981. All plant control and acceptan...
TETAM Model Verification Study. Volume I. Representation of Intervisibility, Initial Comparisons
1976-02-01
simulation models in terms of firings, engagements, and losses between tank and antitank as compared with the field data collected during the free play battles of Field Experiment 11.8 are found in Volume III. (Author)
37 CFR 201.30 - Verification of Statements of Account.
Code of Federal Regulations, 2011 CFR
2011-07-01
... manufacturer or importer of digital devices or media who is required by 17 U.S.C. 1003 to file with the... applicable generally to attest engagements (collectively, the “AICPA Code”); and (ii) He or she is...
NASA Technical Reports Server (NTRS)
Hornung, Steven D.; Biesinger, Paul; Kirsch, Mike; Beeson, Harold; Leuders, Kathy
1999-01-01
The NASA White Sands Test Facility (WSTF) has developed an entirely aqueous final cleaning and verification process to replace the current chlorofluorocarbon (CFC) 113 based process. This process has been accepted for final cleaning and cleanliness verification of WSTF ground support equipment. The aqueous process relies on ultrapure water at 50 C (323 K) and ultrasonic agitation for removal of organic compounds and particulate. Cleanliness is verified by determining the total organic carbon (TOC) content and by filtration with particulate counting. The effectiveness of the aqueous methods for detecting hydrocarbon contamination and particulate was compared to the accepted CFC 113 sampling procedures. Testing with known contaminants, such as hydraulic fluid and cutting and lubricating oils, was performed to establish a correlation between aqueous TOC and CFC 113 nonvolatile residue (NVR). Particulate sampling was performed on cleaned batches of hardware that were randomly separated and sampled by the two methods. This paper presents the approach and results, and discusses the issues in establishing the equivalence of aqueous sampling to CFC 113 sampling, while describing the approach for implementing aqueous techniques on Space Shuttle propulsion hardware.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Peters, T. B.; Bannochie, C. J.
Savannah River National Laboratory (SRNL) analyzed samples from Tank 21H in support of verification of Macrobatch (Salt Batch) 11 for the Interim Salt Disposition Program (ISDP) for processing. This document reports characterization data on the samples of Tank 21H and fulfills the requirements of Deliverable 3 of the Technical Task Request (TTR).
Houck, Constance S; Deshpande, Jayant K; Flick, Randall P
2017-06-01
The Task Force for Children's Surgical Care, an ad-hoc multidisciplinary group of invited leaders in pediatric perioperative medicine, was assembled in May 2012 to consider approaches to optimize delivery of children's surgical care in today's competitive national healthcare environment. Over the subsequent 3 years, with support from the American College of Surgeons (ACS) and Children's Hospital Association (CHA), the group established principles regarding perioperative resource standards, quality improvement and safety processes, data collection, and verification that were used to develop an ACS-sponsored Children's Surgery Verification and Quality Improvement Program (ACS CSV). The voluntary ACS CSV was officially launched in January 2017 and more than 125 pediatric surgical programs have expressed interest in verification. ACS CSV-verified programs have specific requirements for pediatric anesthesia leadership, resources, and the availability of pediatric anesthesiologists or anesthesiologists with pediatric expertise to care for infants and young children. The present review outlines the history of the ACS CSV, key elements of the program, and the standards specific to pediatric anesthesiology. As with the pediatric trauma programs initiated more than 40 years ago, this program has the potential to significantly improve surgical care for infants and children in the United States and Canada.
Power Performance Verification of a Wind Farm Using the Friedman's Test.
Hernandez, Wilmar; López-Presa, José Luis; Maldonado-Correa, Jorge L
2016-06-03
In this paper, a method for verification of the power performance of a wind farm is presented. The method is based on Friedman's test, a nonparametric statistical inference technique, and it uses the information collected by the SCADA system from the sensors embedded in the wind turbines to carry out the power performance verification of a wind farm. Here, the guaranteed power curve of the wind turbines is used as one more wind turbine of the wind farm under assessment, and a multiple comparison method is used to investigate differences between pairs of wind turbines with respect to their power performance. The proposed method indicates whether the power performance of the specific wind farm under assessment differs significantly from what would be expected, and it also allows wind farm owners to know whether their wind farm has either a perfect power performance or an acceptable power performance. Finally, the power performance verification of an actual wind farm is carried out. The results of the application of the proposed method showed that the power performance of the specific wind farm under assessment was acceptable.
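At the core of this method, Friedman's test ranks the outputs of the k turbines (plus the guaranteed power curve treated as one more turbine) within each block, and asks whether the rank sums differ more than chance would allow. A minimal pure-Python sketch of the statistic, with illustrative function name and toy data (not from the paper):

```python
def friedman_statistic(blocks):
    """Friedman chi-square statistic for an n-blocks x k-treatments table.

    blocks: list of n lists, each holding one measurement per treatment
    (e.g. per turbine). The statistic is approximately chi-square with
    k - 1 degrees of freedom under the null of no treatment differences.
    """
    n, k = len(blocks), len(blocks[0])
    rank_sums = [0.0] * k
    for block in blocks:
        order = sorted(range(k), key=lambda j: block[j])
        i = 0
        while i < k:  # assign average ranks to tied groups
            j = i
            while j + 1 < k and block[order[j + 1]] == block[order[i]]:
                j += 1
            avg_rank = (i + j) / 2 + 1
            for t in range(i, j + 1):
                rank_sums[order[t]] += avg_rank
            i = j + 1
    return 12.0 / (n * k * (k + 1)) * sum(r * r for r in rank_sums) - 3.0 * n * (k + 1)
```

If the same turbine consistently underperforms in every block, the statistic is large; identical rank patterns reversed across blocks cancel to zero.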
Da Molin, Simona; Cappellini, Fabrizio; Falbo, Rosanna; Signorini, Stefano; Brambilla, Paolo
2014-11-01
Heart-type fatty acid-binding protein (H-FABP) is an early biomarker of cardiac injury. Randox Laboratories developed an immunoturbidimetric H-FABP assay for non-proprietary automated clinical chemistry analysers that could be useful in the emergency department. We verified the analytical performances claimed by Randox Laboratories on Roche Cobas 6000 clinical chemistry platform in use in our laboratory, and we defined our own 99th percentile upper reference limit for H-FABP. For the verification of method performances, we used pools of spared patient samples from routine and two levels of quality control material, while samples for the reference value study were collected from 545 blood donors. Following CLSI guidelines we verified limit of blank (LOB), limit of detection (LOD), limit of quantitation (LOQ), repeatability and within-laboratory precision, trueness, linearity, and the stability of H-FABP in EDTA over 24h. The LOQ (3.19 μg/L) was verified with a CV% of 10.4. The precision was verified for the low (mean 5.88 μg/L, CV=6.7%), the medium (mean 45.28 μg/L, CV=3.0%), and the high concentration (mean 88.81 μg/L, CV=4.0%). The trueness was verified as well as the linearity over the indicated measurement interval of 0.747-120 μg/L. The H-FABP in EDTA samples is stable throughout 24h both at room temperature and at 4 °C. The H-FABP 99th percentile upper reference limit for all subjects (3.60 μg/L, 95% CI 3.51-3.77) is more appropriate than gender-specific ones that are not statistically different. Copyright © 2014 The Canadian Society of Clinical Chemists. Published by Elsevier Inc. All rights reserved.
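The precision verification described here rests on the coefficient of variation, CV% = sample SD / mean × 100, computed from replicate measurements at each concentration level and compared against the manufacturer's claim. A small sketch of that check; the function names and acceptance rule are illustrative assumptions, not from the paper:

```python
import statistics

def within_lab_cv(replicates):
    """CV% = sample standard deviation / mean * 100 at one level."""
    return statistics.stdev(replicates) / statistics.mean(replicates) * 100.0

def precision_verified(replicates, claimed_cv):
    """Simple acceptance rule: observed CV does not exceed the claim."""
    return within_lab_cv(replicates) <= claimed_cv
```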
Ban, Jong-Wook; Emparanza, José Ignacio; Urreta, Iratxe; Burls, Amanda
2016-01-01
Background: Many new clinical prediction rules are derived and validated, but the design and reporting quality of clinical prediction research has been less than optimal. We aimed to assess whether design characteristics of validation studies were associated with overestimation of clinical prediction rules' performance, and to evaluate whether validation studies clearly reported important methodological characteristics. Methods: Electronic databases were searched for systematic reviews of clinical prediction rule studies published between 2006 and 2010. Data were extracted from the eligible validation studies included in the systematic reviews. A meta-analytic meta-epidemiological approach was used to assess the influence of design characteristics on predictive performance. For each validation study, it was assessed whether 7 design and 7 reporting characteristics were properly described. Results: A total of 287 validation studies of clinical prediction rules were collected from 15 systematic reviews (31 meta-analyses). Validation studies using a case-control design produced a summary diagnostic odds ratio (DOR) 2.2 times (95% CI: 1.2-4.3) larger than validation studies using cohort or unclear designs. When differential verification was used, the summary DOR was overestimated by twofold (95% CI: 1.2-3.1) compared to complete, partial and unclear verification. The summary relative DOR (RDOR) of validation studies with inadequate sample size was 1.9 (95% CI: 1.2-3.1) compared to studies with adequate sample size. Study site, reliability, and the clinical prediction rule were adequately described in 10.1%, 9.4%, and 7.0% of validation studies, respectively. Conclusion: Validation studies with design shortcomings may overestimate the performance of clinical prediction rules. The quality of reporting among studies validating clinical prediction rules needs to be improved. PMID:26730980
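The diagnostic odds ratio (DOR) summarizing each validation study, and the relative DOR (RDOR) used to compare design strata, follow directly from a 2x2 table. A hedged sketch with illustrative function names:

```python
def diagnostic_odds_ratio(tp, fp, fn, tn):
    """DOR = (TP/FN) / (FP/TN) = (TP*TN) / (FP*FN): the odds of a
    positive result among diseased vs non-diseased subjects."""
    return (tp * tn) / (fp * fn)

def relative_dor(dor_design_a, dor_design_b):
    """RDOR > 1 means design A yields a larger apparent accuracy,
    e.g. case-control vs cohort validation designs."""
    return dor_design_a / dor_design_b
```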
EXhype: A tool for mineral classification using hyperspectral data
NASA Astrophysics Data System (ADS)
Adep, Ramesh Nityanand; shetty, Amba; Ramesh, H.
2017-02-01
Various supervised classification algorithms have been developed to classify earth surface features using hyperspectral data, each modelled on different human expertise. However, the performance of conventional algorithms is not satisfactory for mapping minerals in particular, in view of their typical spectral responses. This study introduces a new expert system named 'EXhype (Expert system for hyperspectral data classification)' to map minerals. The system incorporates human expertise at several stages of its implementation: (i) to deal with intra-class variation; (ii) to identify absorption features; (iii) to discriminate spectra by considering absorption features, non-absorption features and by full-spectrum comparison; and (iv) finally to take a decision based on learning and by emphasizing the most important features. It is developed using a knowledge base consisting of an Optimal Spectral Library, the Segmented Upper Hull method, the Spectral Angle Mapper (SAM) and an Artificial Neural Network. The performance of EXhype is compared with the traditional and most commonly used SAM algorithm using Airborne Visible/Infrared Imaging Spectrometer (AVIRIS) data acquired over Cuprite, Nevada, USA. A virtual verification method is used to collect sample information for accuracy assessment. Further, a modified accuracy assessment method is used to obtain realistic user's accuracies in cases where only a limited set of desired classes is considered for classification. With the modified accuracy assessment method, SAM and EXhype yield overall accuracies of 60.35% and 90.75% and kappa coefficients of 0.51 and 0.89, respectively. It was also found that the virtual verification method allows use of the preferred stratified random sampling method and eliminates the difficulties associated with it. The experimental results show that EXhype not only produces better accuracy than traditional SAM but can also correctly classify the minerals, avoiding misclassification between target classes.
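The Spectral Angle Mapper used as the baseline here treats each spectrum as a vector and scores similarity by the angle between pixel and reference; a small angle means a close material match, and the measure is insensitive to illumination scaling. A minimal sketch, with a hypothetical two-mineral library for illustration:

```python
import math

def spectral_angle(pixel, reference):
    """Angle (radians) between two spectra viewed as vectors; invariant
    to overall brightness scaling of either spectrum."""
    dot = sum(p * r for p, r in zip(pixel, reference))
    norm = math.sqrt(sum(p * p for p in pixel)) * math.sqrt(sum(r * r for r in reference))
    return math.acos(max(-1.0, min(1.0, dot / norm)))

def classify_sam(pixel, library, threshold=0.1):
    """Assign the library entry with the smallest spectral angle, or
    'unclassified' if no angle falls below the threshold."""
    best = min(library, key=lambda name: spectral_angle(pixel, library[name]))
    if spectral_angle(pixel, library[best]) <= threshold:
        return best
    return "unclassified"
```

A pixel that is a scaled copy of a library spectrum yields an angle of zero and is assigned to that mineral regardless of brightness.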
NASA Technical Reports Server (NTRS)
Kahan, A. M. (Principal Investigator)
1975-01-01
The author has identified the following significant results. The LANDSAT data collection system has proven itself to be a valuable tool for control of cloud seeding operations and for verification of weather forecasts. These platforms have proven to be reliable weather resistant units suitable for the collection of hydrometeorological data from remote severe weather environments. The detailed design of the wind speed and direction system and the wire-wrapping of the logic boards were completed.
A 300 GHz collective scattering diagnostic for low temperature plasmas.
Hardin, Robert A; Scime, Earl E; Heard, John
2008-10-01
A compact and portable 300 GHz collective scattering diagnostic employing a homodyne detection scheme has been constructed and installed on the hot helicon experiment (HELIX). Verification of the homodyne detection scheme was accomplished with a rotating grooved aluminum wheel to Doppler shift the interaction beam. The HELIX chamber geometry and collection optics allow measurement of scattering angles ranging from 60 degrees to 90 degrees. Artificially driven ion-acoustic waves are also being investigated as a proof-of-principle test for the diagnostic system.
Peng, Jun; Chen, Yi-Ting; Chen, Chien-Lun; Li, Liang
2014-07-01
Large-scale metabolomics studies require a quantitative method to generate metabolome data over an extended period with high technical reproducibility. We report a universal metabolome-standard (UMS) method, in conjunction with chemical isotope labeling liquid chromatography-mass spectrometry (LC-MS), to provide long-term analytical reproducibility and facilitate metabolome comparison among different data sets. In this method, UMS of a specific type of sample labeled by an isotope reagent is prepared a priori. The UMS is spiked into any individual samples labeled by another form of the isotope reagent in a metabolomics study. The resultant mixture is analyzed by LC-MS to provide relative quantification of the individual sample metabolome to UMS. UMS is independent of the study undertaken as well as the time of analysis, and is useful for profiling the same type of samples in multiple studies. In this work, the UMS method was developed and applied to a urine metabolomics study of bladder cancer. UMS of human urine was prepared by (13)C2-dansyl labeling of a pooled sample from 20 healthy individuals. This method was first used to profile the discovery samples to generate a list of putative biomarkers potentially useful for bladder cancer detection and then used to analyze the verification samples about one year later. Within the discovery sample set, three-month technical reproducibility was examined using a quality control sample, which showed a mean CV of 13.9% and a median CV of 9.4% for all the quantified metabolites. Statistical analysis of the urine metabolome data showed a clear separation between the bladder cancer group and the control group in the discovery samples, which was confirmed by the verification samples. A receiver operating characteristic (ROC) test showed that the area under the curve (AUC) was 0.956 in the discovery data set and 0.935 in the verification data set. These results demonstrate the utility of the UMS method for long-term metabolomics and for discovering potential metabolite biomarkers for the diagnosis of bladder cancer.
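The reported AUC values have a direct rank interpretation: the ROC area equals the probability that a randomly chosen case sample scores higher than a randomly chosen control, with ties counting one half (the Mann-Whitney U statistic normalized by the number of case-control pairs). A minimal sketch with an illustrative function name:

```python
def roc_auc(case_scores, control_scores):
    """AUC = P(random case scores above random control), ties = 1/2.
    Equivalent to the Mann-Whitney U statistic divided by n1 * n2."""
    wins = 0.0
    for c in case_scores:
        for k in control_scores:
            wins += 1.0 if c > k else 0.5 if c == k else 0.0
    return wins / (len(case_scores) * len(control_scores))
```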
Corrigan, Damion K; Salton, Neale A; Preston, Chris; Piletsky, Sergey
2010-09-01
Cleaning verification is a scientific and economic problem for the pharmaceutical industry. A large amount of potential manufacturing time is lost to the process of cleaning verification, which involves the analysis of residues on soiled manufacturing equipment, with high-performance liquid chromatography (HPLC) being the predominantly employed analytical technique. The aim of this study was to develop a rapid and portable cleaning verification system for nelarabine using surface-enhanced Raman spectroscopy (SERS), conducted with a portable Raman spectrometer and a commercially available SERS substrate. Samples of standard solutions and swab extracts were deposited onto the SERS-active surfaces, allowed to dry and then subjected to spectroscopic analysis. Nelarabine was amenable to analysis by SERS and the necessary levels of sensitivity were achievable, making it possible to use this technology for a semi-quantitative limits test. Replicate precision, however, was poor due to the heterogeneous drying pattern of nelarabine on the SERS-active surface; understanding and improving the drying process in order to produce a consistent SERS signal for quantitative analysis is desirable. This work shows the potential application of SERS for cleaning verification analysis. SERS may not replace HPLC as the definitive analytical technique, but it could be used in conjunction with HPLC so that swabbing is only carried out once the portable SERS equipment has demonstrated that the manufacturing equipment is below the threshold contamination level.
2013-09-01
The EPA's ETV Program, in partnership with recognized testing organizations, objectively and systematically documents the performance of commercially ready technologies. Together with the full participation of the technology developer, they develop plans, conduct tests, collect and ana...
Overview of open resources to support automated structure verification and elucidation
Cheminformatics methods form an essential basis for providing analytical scientists with access to data, algorithms and workflows. There are an increasing number of free online databases (compound databases, spectral libraries, data repositories) and a rich collection of software...
The TraceDetect SafeGuard is designed to automatically measure total arsenic concentrations in drinking water samples (including raw water and treated water) over a range from 1 ppb to over 100 ppb. Once the operator has introduced the sample vial and selected "measure...
ENVIRONMENTAL TECHNOLOGY VERIFICATION REPORT, PCB DETECTION TECHNOLOGY, HYBRIZYME DELFIA TM ASSAY
The DELFIA PCB Assay is a solid-phase time-resolved fluoroimmunoassay based on the sequential addition of sample extract and europium-labeled PCB tracer to a monoclonal antibody reagent specific for PCBs. In this assay, the antibody reagent and sample extract are added to a strip...
Conducting Research from Small University Observatories: Investigating Exoplanet Candidates
NASA Astrophysics Data System (ADS)
Moreland, Kimberly D.
2018-01-01
Kepler has to date discovered 4,496 exoplanet candidates, but only half are confirmed, and only a handful are thought to be Earth-sized and in the habitable zone. Planet verification often involves extensive follow-up observations, which are both time and resource intensive. The data set collected by Kepler is massive and will be studied for decades. University and small observatories, such as the one at Texas State University, are in a good position to assist with the exoplanet candidate verification process. By performing extended monitoring campaigns, which are otherwise cost-ineffective for larger observatories, students gain valuable research experience and contribute valuable data and results to the scientific community.
Federal Register 2010, 2011, 2012, 2013, 2014
2013-08-28
... persons to submit comments on this document. Comments may be submitted by one of the following methods... very low (less than one percent), and this carcass sampling was expensive for the Agency. As stated in.... Following the implementation of PR/HACCP, FSIS analyzed only one pathogen per sample. Then, in 2008, FSIS...
SU-E-T-762: Toward Volume-Based Independent Dose Verification as Secondary Check
DOE Office of Scientific and Technical Information (OSTI.GOV)
Tachibana, H; Tachibana, R
2015-06-15
Purpose: Lung SBRT planning has shifted to the volume prescription technique. However, point dose agreement is still verified using independent dose verification at the secondary check. Volume dose verification is more affected by inhomogeneity correction than the point dose verification currently used as the check. A feasibility study of volume dose verification was conducted for lung SBRT plans. Methods: Six SBRT plans were collected at our institute. Two dose distributions, with and without inhomogeneity correction, were generated using Adaptive Convolve (AC) in Pinnacle3. Simple MU Analysis (SMU, Triangle Product, Ishikawa, JP) was used as the independent dose verification software program, in which a modified Clarkson-based algorithm was implemented and radiological path length was computed from CT images independently of the treatment planning system. The agreement in point dose and mean dose between the AC with/without the correction and the SMU was assessed. Results: In the point dose evaluation for the center of the GTV, the difference shows a systematic shift (4.5% ± 1.9%) in comparison with the AC with inhomogeneity correction; on the other hand, there was good agreement of 0.2 ± 0.9% between the SMU and the AC without the correction. In the volume evaluation, there were significant differences in mean dose not only for the PTV (14.2 ± 5.1%) but also for the GTV (8.0 ± 5.1%) compared to the AC with the correction. Without the correction, the SMU showed good agreement for the GTV (1.5 ± 0.9%) as well as the PTV (0.9% ± 1.0%). Conclusion: Volume evaluation for the secondary check may be possible in homogeneous regions. However, volumes including inhomogeneous media would produce larger discrepancies. The dose calculation algorithm for independent verification needs to be modified to take the inhomogeneity correction into account.
The Environmental Response Laboratory Network supports the goal to increase national capacity for biological analysis of environmental samples. This includes methods development and verification, technology transfer, and collaboration with USDA, FERN, CDC.
The MCNP6 Analytic Criticality Benchmark Suite
DOE Office of Scientific and Technical Information (OSTI.GOV)
Brown, Forrest B.
2016-06-16
Analytical benchmarks provide an invaluable tool for verifying computer codes used to simulate neutron transport. Several collections of analytical benchmark problems [1-4] are used routinely in the verification of production Monte Carlo codes such as MCNP® [5,6]. Verification of a computer code is a necessary prerequisite to the more complex validation process. The verification process confirms that a code performs its intended functions correctly. The validation process involves determining the absolute accuracy of code results vs. nature. In typical validations, results are computed for a set of benchmark experiments using a particular methodology (code, cross-section data with uncertainties, and modeling) and compared to the measured results from the set of benchmark experiments. The validation process determines bias, bias uncertainty, and possibly additional margins. Verification is generally performed by the code developers, while validation is generally performed by code users for a particular application space. The VERIFICATION_KEFF suite of criticality problems [1,2] was originally a set of 75 criticality problems found in the literature for which exact analytical solutions are available. Even though the spatial and energy detail is necessarily limited in analytical benchmarks, typically to a few regions or energy groups, the exact solutions obtained can be used to verify that the basic algorithms, mathematics, and methods used in complex production codes perform correctly. The present work has focused on revisiting this benchmark suite. A thorough review of the problems resulted in discarding some of them as not suitable for MCNP benchmarking. For the remaining problems, many of them were reformulated to permit execution in either multigroup mode or in the normal continuous-energy mode for MCNP. Execution of the benchmarks in continuous-energy mode provides a significant advance to MCNP verification methods.
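In practice, a verification run against such an analytic benchmark reduces to checking that the code's computed k-eff agrees with the exact analytical solution within the Monte Carlo statistical uncertainty. A hedged sketch of that acceptance test; the three-sigma tolerance convention and function name are illustrative assumptions, not from the suite:

```python
def keff_verified(computed_keff, analytic_keff, std_dev, n_sigma=3.0):
    """Pass if the Monte Carlo k-eff estimate agrees with the exact
    analytical value to within n_sigma statistical uncertainties."""
    return abs(computed_keff - analytic_keff) <= n_sigma * std_dev
```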
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bojechko, Casey; Phillps, Mark; Kalet, Alan
Purpose: Complex treatments in radiation therapy require robust verification in order to prevent errors that can adversely affect the patient. For this purpose, the authors estimate the effectiveness of detecting errors with a "defense in depth" system composed of electronic portal imaging device (EPID) based dosimetry and a software-based system composed of rules-based and Bayesian network verifications. Methods: The authors analyzed incidents with a high potential severity score, scored as a 3 or 4 on a 4-point scale, recorded in an in-house voluntary incident reporting system, collected from February 2012 to August 2014. The incidents were categorized into different failure modes. The detectability, defined as the number of incidents that are detectable divided by the total number of incidents, was calculated for each failure mode. Results: In total, 343 incidents were used in this study. Of the incidents, 67% were related to photon external beam therapy (EBRT). The majority of the EBRT incidents were related to patient positioning and only a small number of these could be detected by EPID dosimetry when performed prior to treatment (6%). A large fraction could be detected by in vivo dosimetry performed during the first fraction (74%). Rules-based and Bayesian network verifications were found to be complementary to EPID dosimetry, able to detect errors related to patient prescriptions and documentation, and errors unrelated to photon EBRT. Combining all of the verification steps together, 91% of all EBRT incidents could be detected. Conclusions: This study shows that the defense in depth system is potentially able to detect a large majority of incidents. The most effective EPID-based dosimetry verification is in vivo measurement during the first fraction, complemented by rules-based and Bayesian network plan checking.
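The detectability metric defined above, incidents caught by at least one verification layer divided by the total, can be sketched as follows; the incident categories and layer predicates are illustrative assumptions, not data from the study:

```python
def detectability(incidents, detectors):
    """Fraction of incidents flagged by at least one verification layer."""
    caught = sum(1 for inc in incidents if any(d(inc) for d in detectors))
    return caught / len(incidents)

# Illustrative failure modes and two hypothetical verification layers.
incidents = ["positioning", "positioning", "prescription", "documentation"]
layers = [
    lambda inc: inc == "positioning",                      # in vivo EPID dosimetry
    lambda inc: inc in ("prescription", "documentation"),  # rules/Bayesian checks
]
```

Adding the software-based layer on top of EPID dosimetry raises the detectable fraction, mirroring the complementarity reported in the abstract.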
76 FR 21225 - Documents Acceptable for Employment Eligibility Verification
Federal Register 2010, 2011, 2012, 2013, 2014
2011-04-15
... identity and employment authorization documents (EADs) and receipts that employees may present to employers... \\1\\ (hereinafter collectively referred to as ``employer(s)'') are required to verify the identity and... as acceptable for establishing identity and employment authorization. The employer must examine the...
Designing to Sample the Unknown: Lessons from OSIRIS-REx Project Systems Engineering
NASA Technical Reports Server (NTRS)
Everett, David; Mink, Ronald; Linn, Timothy; Wood, Joshua
2017-01-01
On September 8, 2016, the third NASA New Frontiers mission launched on an Atlas V 411. The Origins, Spectral Interpretation, Resource Identification, Security-Regolith Explorer (OSIRIS-REx) will rendezvous with asteroid Bennu in 2018, collect a sample in 2020, and return that sample to Earth in September 2023. The development team has overcome a number of challenges in order to design and build a system that will make contact with an unexplored, airless, low-gravity body. This paper will provide an overview of the mission, then focus on the system-level challenges and some of the key system-level processes. Some of the lessons here are unique to the type of mission, like discussion of operating at a largely unknown, low-gravity object. Other lessons, particularly from the build phase, have broad implications. The OSIRIS-REx risk management process was particularly effective in achieving an on-time and under-budget development effort. The systematic requirements management and verification and the system validation also helped identify numerous potential problems. The final assessment of the OSIRIS-REx performance will need to wait until the sample is returned in 2023, but this post-launch assessment captures some of the key systems-engineering lessons from the development team.
Antony, Matthieu; Bertone, Maria Paola; Barthes, Olivier
2017-03-14
Results-based financing (RBF) has been introduced in many countries across Africa, and a growing literature is building around the assessment of its impact. These studies are usually quantitative and often silent on the paths and processes through which results are achieved and on the wider health system effects of RBF. To address this gap, our study aims at exploring the implementation of an RBF pilot in Benin, focusing on the verification of results. The study is based on action research carried out by authors involved in the pilot as part of the agency supporting the RBF implementation in Benin. While our participant observation and operational collaboration with the project's stakeholders informed the study, the analysis is mostly based on quantitative and qualitative secondary data, collected throughout the project's implementation and documentation processes. Data include project documents, reports and budgets, RBF data on service outputs and on the outcome of the verification, daily activity timesheets of the technical assistants in the districts, as well as focus groups with community-based organizations and informal interviews with technical assistants and district medical officers. Our analysis focuses on the actual practices of quantitative, qualitative and community verification. Results show that the verification processes are complex, costly and time-consuming, and in practice they end up differing from what was originally designed. We explore the consequences of this for the operation of the scheme and for its potential to generate the envisaged change. We find, for example, that the time taken up by verification procedures limits the time available for data analysis and feedback to facility staff, thus limiting the potential to improve service delivery. Verification challenges also result in delays in bonus payment, which delink effort and reward.
Additionally, the limited integration of the verification activities of district teams with their routine tasks causes a further verticalization of the health system. Our results highlight the potential disconnect between the theory of change behind RBF and the actual scheme's implementation. The implications are relevant at the methodological level, stressing the importance of analyzing implementation processes to fully understand results, as well as at the operational level, pointing to the need to carefully adapt the design of RBF schemes (including verification and other key functions) to the context and to allow room to modify it iteratively during implementation. They also question whether the rationale for thorough and costly verification is justified, or whether adaptations are possible.
A Verification Method for MASOES.
Perozo, N; Aguilar Perozo, J; Terán, O; Molina, H
2013-02-01
MASOES is a multiagent architecture for designing and modeling self-organizing and emergent systems. This architecture describes the elements, relationships, and mechanisms, both at the individual and the collective levels, that favor the analysis of the self-organizing and emergent phenomenon without mathematically modeling the system. In this paper, a method is proposed for verifying MASOES from the point of view of design in order to study the self-organizing and emergent behaviors of the modeled systems. The verification criteria are set according to what is proposed in MASOES for modeling self-organizing and emergent systems and to the principles of the wisdom-of-crowds paradigm and fuzzy cognitive map (FCM) theory. The verification method for MASOES has been implemented in a tool called FCM Designer and has been tested by modeling a community of free-software developers that works in the bazaar style, as well as a Wikipedia community, in order to study their behavior and determine their self-organizing and emergent capacities.
40 CFR 86.1845-01 - Manufacturer in-use verification testing requirements.
Code of Federal Regulations, 2010 CFR
2010-07-01
... of test vehicles in the sample comply with the sample size requirements of this section. Any post... vehicles, light-duty trucks, and complete heavy-duty vehicles shall test, or cause to have tested a...) Low mileage testing. [Reserved] (c) High-mileage testing—(1) Test groups. Testing must be conducted...
40 CFR 86.1845-01 - Manufacturer in-use verification testing requirements.
Code of Federal Regulations, 2012 CFR
2012-07-01
... of test vehicles in the sample comply with the sample size requirements of this section. Any post... vehicles, light-duty trucks, and complete heavy-duty vehicles shall test, or cause to have tested a...) Low mileage testing. [Reserved] (c) High-mileage testing—(1) Test groups. Testing must be conducted...
40 CFR 86.1845-01 - Manufacturer in-use verification testing requirements.
Code of Federal Regulations, 2011 CFR
2011-07-01
... of test vehicles in the sample comply with the sample size requirements of this section. Any post... vehicles, light-duty trucks, and complete heavy-duty vehicles shall test, or cause to have tested a...) Low mileage testing. [Reserved] (c) High-mileage testing—(1) Test groups. Testing must be conducted...
40 CFR 86.1845-01 - Manufacturer in-use verification testing requirements.
Code of Federal Regulations, 2013 CFR
2013-07-01
... of test vehicles in the sample comply with the sample size requirements of this section. Any post... vehicles, light-duty trucks, and complete heavy-duty vehicles shall test, or cause to have tested a...) Low mileage testing. [Reserved] (c) High-mileage testing—(1) Test groups. Testing must be conducted...
40 CFR 86.1845-04 - Manufacturer in-use verification testing requirements.
Code of Federal Regulations, 2014 CFR
2014-07-01
... of test vehicles in the sample comply with the sample size requirements of this section. Any post... HDV must test, or cause to have tested, a specified number of vehicles. Such testing must be conducted... first test will be considered the official results for the test vehicle, regardless of any test results...
Demonstration Report for Visual Sample Plan (VSP) Verification Sampling Methods at the Navy/DRI Site
2011-08-01
population of 537,197 with an overall population density of 608 people per square mile (people/mi2). However, the population density in the vicinity... approximately 12 people/mi2. Population density is expected to greatly increase following development of the site
Weak lensing magnification in the Dark Energy Survey Science Verification Data
Garcia-Fernandez, M.; et al.
2018-02-02
In this paper the effect of weak lensing magnification on galaxy number counts is studied by cross-correlating the positions of two galaxy samples, separated by redshift, using data from the Dark Energy Survey Science Verification dataset. The analysis is carried out for two photometrically-selected galaxy samples, with mean photometric redshifts in the $0.2 < z < 0.4$ and $0.7 < z < 1.0$ ranges, in the riz bands. A signal is detected with a $3.5\sigma$ significance level in each of the bands tested, and is compatible with the magnification predicted by the $\Lambda$CDM model. After an extensive analysis, it cannot be attributed to any known systematic effect. The detection of the magnification signal is robust to estimated uncertainties in the outlier rate of the photometric redshifts, but this will be an important issue for use of photometric redshifts in magnification measurements from larger samples. In addition to the detection of the magnification signal, a method to select the sample with the maximum signal-to-noise is proposed and validated with data.
Merlyn J. Paulson
1979-01-01
This paper outlines a project level process (V.I.S.) which utilizes very accurate and flexible computer algorithms in combination with contemporary site analysis and design techniques for visual evaluation, design and management. The process provides logical direction and connecting bridges through problem identification, information collection and verification, visual...
75 FR 8294 - Submission for OMB Review; Comment Request
Federal Register 2010, 2011, 2012, 2013, 2014
2010-02-24
... of the Information: The USDA eAuthentication Service provides public and government businesses single sign-on capability for USDA applications, management of user credentials, and verification of identify... collection of information on those who are to respond, including through the use of appropriate automated...
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ruedig, Elizabeth
Public Law 105-119 directs the U.S. Department of Energy (DOE) to convey or transfer parcels of land to the Incorporated County of Los Alamos or their designees and to the Department of Interior, Bureau of Indian Affairs, in trust for the Pueblo de San Ildefonso. Los Alamos National Security is tasked to support DOE in conveyance and/or transfer of identified land parcels no later than September 2022. Under DOE Order 458.1, Radiation Protection of the Public and the Environment (O458.1, 2013) and Los Alamos National Laboratory (LANL or the Laboratory) implementing Policy 412 (P412, 2014), real property with the potential to contain residual radioactive material must meet the criteria for clearance and release to the public. This Sampling and Analysis Plan (SAP) is a second investigation of Tract A-18-2 for the purpose of verifying the previous sampling results (LANL 2017). This sample plan requires 18 project-specific soil samples for use in radiological clearance decisions consistent with LANL Procedure ENV-ES-TP-238 (2015a) and guidance in the Multi-Agency Radiation Survey and Site Investigation Manual (MARSSIM, 2000). The sampling work will be conducted by LANL, and samples will be evaluated by a LANL-contracted independent lab. However, there will be federal review (verification) of all steps of the sampling process.
Taghdisi, Mohammad Hossein; Babazadeh, Towhid; Moradi, Fatemeh; Shariat, Fariba
2016-01-01
The importance of consuming fruits and vegetables (F&V) in prevention of chronic diseases is known. Childhood plays an important role in the formation of healthy eating habits. The purpose of this study was to examine the effect of education, with application of the theory of planned behavior, on improvement of F&V consumption. In this quasi-experimental study, 184 fourth-, fifth-, and sixth-grade students were enrolled from Jan 2013 to Jun 2014. The samples were selected from 6 schools in Chalderan County, West Azerbaijan, Iran through a cluster random sampling method. Two out of the 6 schools were randomly selected and each was assigned to either the experimental or the control group. The data collection instruments included a researcher-made questionnaire and a 24-h F&V recall. Data were collected after verification of the reliability and validity of the questionnaire. Before the intervention, no significant difference was observed between the intervention and control groups regarding attitude, subjective norms, perceived behavioral control, behavioral intention and fruit and vegetable consumption (P > 0.05). However, after the educational intervention, the mean scores for attitude, subjective norms, perceived behavioral control, behavioral intention and fruit and vegetable consumption were significantly higher in the intervention group when compared to the control group (P < 0.05). Increased behavioral intention, attitude, subjective norms, and perceived behavioral control can promote F&V consumption among students.
Verification of hypergraph states
NASA Astrophysics Data System (ADS)
Morimae, Tomoyuki; Takeuchi, Yuki; Hayashi, Masahito
2017-12-01
Hypergraph states are generalizations of graph states where controlled-Z gates on edges are replaced with generalized controlled-Z gates on hyperedges. Hypergraph states have several advantages over graph states. For example, certain hypergraph states, such as the Union Jack states, are universal resource states for measurement-based quantum computing with only Pauli measurements, while graph state measurement-based quantum computing needs non-Clifford basis measurements. Furthermore, it is impossible to classically efficiently sample measurement results on hypergraph states unless the polynomial hierarchy collapses to the third level. Although several protocols have been proposed to verify graph states with only sequential single-qubit Pauli measurements, there was no verification method for hypergraph states. In this paper, we propose a method for verifying a certain class of hypergraph states with only sequential single-qubit Pauli measurements. Importantly, no i.i.d. property of samples is assumed in our protocol: any artificial entanglement among samples cannot fool the verifier. As applications of our protocol, we consider verified blind quantum computing with hypergraph states, and quantum computational supremacy demonstrations with hypergraph states.
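The hyperedge construction described above can be sketched with a plain statevector: start from |+>^n and apply a generalized controlled-Z to each hyperedge, i.e. flip the sign of every basis state in which all qubits of the hyperedge are 1. This is an assumed minimal illustration of the state definition, not the paper's verification protocol.

```python
# Minimal statevector sketch of a hypergraph state (assumed construction).
# Qubit 0 is taken as the most significant bit of the basis-state index.
import numpy as np

def hypergraph_state(n, hyperedges):
    dim = 2 ** n
    psi = np.full(dim, dim ** -0.5)          # uniform amplitudes: |+>^n
    for edge in hyperedges:
        for x in range(dim):
            # generalized CZ: phase -1 iff every qubit in the hyperedge is 1
            if all((x >> (n - 1 - q)) & 1 for q in edge):
                psi[x] = -psi[x]
    return psi
```

With a single two-qubit edge this reduces to an ordinary graph state; a three-qubit hyperedge applies the CCZ phase that makes the state non-Clifford-preparable from |+>^n.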
2013-12-16
ODONTOCETE STUDIES OFF THE PACIFIC MISSILE RANGE FACILITY IN FEBRUARY 2013: SATELLITE-TAGGING, PHOTO-IDENTIFICATION, AND PASSIVE ACOUSTIC...
Documentation of a Gulf sturgeon spawning site on the Yellow River, Alabama, USA
Kreiser, Brian R.; Berg, J.; Randall, M.; Parauka, F.; Floyd, S.; Young, B.; Sulak, Kenneth J.
2008-01-01
Parauka and Giorgianni (2002) reported that potential Gulf sturgeon spawning habitat is present in the Yellow River; however, efforts to document spawning by the collection of eggs or larvae have been unsuccessful in the past. Herein, we report on the first successful collection of eggs from a potential spawning site on the Yellow River and the verification of their identity as Gulf sturgeon by using molecular methods.
Geographic Information System Data Analysis
NASA Technical Reports Server (NTRS)
Billings, Chad; Casad, Christopher; Floriano, Luis G.; Hill, Tracie; Johnson, Rashida K.; Locklear, J. Mark; Penn, Stephen; Rhoulac, Tori; Shay, Adam H.; Taylor, Antone;
1995-01-01
Data were collected in order to further NASA Langley Research Center's Geographic Information System (GIS). Information on LaRC's communication, electrical, and facility configurations was collected. Existing data were corrected through verification, resulting in more accurate databases. In addition, Global Positioning System (GPS) points were used in order to accurately impose buildings on digitized images. Overall, this project will help the Imaging and CADD Technology Team (ICTT) prove GIS to be a valuable resource for LaRC.
Microbial soil community analyses for forensic science: Application to a blind test.
Demanèche, Sandrine; Schauser, Leif; Dawson, Lorna; Franqueville, Laure; Simonet, Pascal
2017-01-01
Soil complexity, heterogeneity and transferability make it valuable in forensic investigations to help obtain clues as to the origin of an unknown sample, or to compare samples from a suspect or object with samples collected at a crime scene. In a few countries, soil analysis is used in matters from site verification to estimates of time after death. However, to date, the application of soil information in criminal investigations has been limited. In particular, comparing bacterial communities in soil samples could be a useful tool for forensic science. To evaluate the relevance of this approach, a blind test was performed to determine the origin of two questioned samples (one from the mock crime scene and the other from a 50:50 mixture of the crime scene and the alibi site) compared to three control samples (soil samples from the crime scene, from a context site 25 m away from the crime scene and from the alibi site, which was the suspect's home). Two biological methods were used, Ribosomal Intergenic Spacer Analysis (RISA) and 16S rRNA gene sequencing with Illumina MiSeq, to evaluate the discriminating power of soil bacterial communities. Both techniques discriminated well between soils from a single source, but a combination of both techniques was necessary to show that the origin was a mixture of soils. This study illustrates the potential of applying microbial ecology methodologies in soil as an evaluative forensic tool. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.
Geographical provenance of palm oil by fatty acid and volatile compound fingerprinting techniques.
Tres, A; Ruiz-Samblas, C; van der Veer, G; van Ruth, S M
2013-04-15
Analytical methods are required in addition to administrative controls to verify the geographical origin of vegetable oils such as palm oil in an objective manner. In this study the application of fatty acid and volatile organic compound fingerprinting in combination with chemometrics has been applied to verify the geographical origin of crude palm oil (continental scale). For this purpose 94 crude palm oil samples were collected from South East Asia (55), South America (11) and Africa (28). Partial least squares discriminant analysis (PLS-DA) was used to develop a hierarchical classification model by combining two consecutive binary PLS-DA models. First, a PLS-DA model was built to distinguish South East Asian from non-South East Asian palm oil samples. Then a second model was developed, only for the non-Asian samples, to discriminate African from South American crude palm oil. Models were externally validated by using them to predict the identity of new authentic samples. The fatty acid fingerprinting model revealed three misclassified samples. The volatile compound fingerprinting models showed 88%, 100% and 100% accuracy for the South East Asian, African and American classes, respectively. The verification of the geographical origin of crude palm oil is feasible by fatty acid and volatile compound fingerprinting. Further research is required to validate the approach further and to increase its spatial specificity to the country/province scale. Copyright © 2012 Elsevier Ltd. All rights reserved.
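The hierarchical two-stage classification described above (South East Asia vs. the rest, then Africa vs. South America on the remainder) can be sketched as a simple cascade; the stage classifiers below are placeholder callables standing in for the two fitted PLS-DA models, not the paper's implementation.

```python
# Sketch of the hierarchical classification cascade: stage 1 separates
# South East Asia from the rest; stage 2 runs only on the rest and
# separates Africa from South America.

def hierarchical_classify(sample, is_sea, is_africa):
    """is_sea / is_africa stand in for the two binary PLS-DA models."""
    if is_sea(sample):
        return "South East Asia"
    return "Africa" if is_africa(sample) else "South America"
```

The cascade structure is what limits error propagation: a sample only reaches the second model if the first model has already excluded the South East Asian class.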
DOE Office of Scientific and Technical Information (OSTI.GOV)
J. M. Capron
2008-04-15
The 100-F-50 waste site, part of the 100-FR-2 Operable Unit, is a steel stormwater runoff culvert that runs between two railroad grades in the south-central portion of the 100-F Area. The culvert exiting the west side of the railroad grade is mostly encased in concrete and surrounded by a concrete stormwater collection depression partially filled with soil and vegetation. The drain pipe exiting the east side of the railroad grade embankment is partially filled with soil and rocks. The 100-F-50 stormwater diversion culvert confirmatory sampling results support a reclassification of this site to no action. The current site conditions achieve the remedial action objectives and corresponding remedial action goals established in the Remaining Sites ROD. The results of confirmatory sampling show that residual contaminant concentrations do not preclude any future uses and allow for unrestricted use of shallow zone soils. The results also demonstrate that residual contaminant concentrations are protective of groundwater and the Columbia River.
Kuswandi, Bambang; Irmawati, Titi; Hidayat, Moch Amrun; Jayus; Ahmad, Musa
2014-01-27
A simple visual ethanol biosensor based on alcohol oxidase (AOX) immobilised onto polyaniline (PANI) film for halal verification of fermented beverage samples is described. This biosensor responds to ethanol via a colour change from green to blue, due to the enzymatic reaction of ethanol that produces acetaldehyde and hydrogen peroxide, when the latter oxidizes the PANI film. The procedure to obtain this biosensor consists of the immobilization of AOX onto PANI film by adsorption. For the immobilisation, an AOX solution is deposited on the PANI film and left at room temperature until dried (30 min). The biosensor was constructed as a dip stick for visual and simple use. The colour changes of the films have been scanned and analysed using image analysis software (i.e., ImageJ) to study the characteristics of the biosensor's response toward ethanol. The biosensor has a linear response in an ethanol concentration range of 0.01%-0.8%, with a correlation coefficient (r) of 0.996. The limit of detection of the biosensor was 0.001%, with reproducibility (RSD) of 1.6% and a lifetime of up to seven weeks when stored at 4 °C. The biosensor provides accurate results for ethanol determination in fermented drinks and was in good agreement with the standard method (gas chromatography) results. Thus, the biosensor could be used as a simple visual method for ethanol determination in fermented beverage samples, which can be useful to the Muslim community for halal verification.
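The correlation coefficient quoted for the calibration curve above is a standard Pearson r over (concentration, response) pairs; the sketch below shows the computation with invented data points, not the paper's measurements.

```python
# Pearson correlation coefficient for a calibration curve,
# computed from paired concentration (xs) and response (ys) values.

def pearson_r(xs, ys):
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sxx = sum((x - mx) ** 2 for x in xs)
    syy = sum((y - my) ** 2 for y in ys)
    return sxy / (sxx * syy) ** 0.5
```

An r near 1 over the working range (0.01%-0.8% here) is what justifies reading concentration off the fitted line.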
Verification of rut depth collected with the INO laser rut measurement system (LRMS).
DOT National Transportation Integrated Search
2011-10-28
Pavement rutting can be an indicator that a section of roadway is in need of repair or replacement. It can also become : a hazard to drivers, causing loss of control or hydroplaning when water accumulates. To better monitor pavement : conditions thro...
This document provides a general set of guidelines that may be consistently applied for collecting, evaluation, and reporting the costs of technologies tested under the ETV Program. Because of the diverse nature of the technologies and industries covered in this program, each ETV...
NASA Astrophysics Data System (ADS)
Müller, A.; Urich, D.; Kreck, G.; Metzmacher, M.; Lindner, R.
2018-04-01
The presentation will cover results from an ESA supported investigation to collect lessons learned for mechanism assembly with the focus on quality and contamination requirements verification in exploration projects such as ExoMars.
2007-03-01
Data collection...sedimentation surface) derived from the CF:CS 210Pb model. Data are not available for vibracores CMS-SD-4210 and CMS-SD-4213 in 1938 and 1958
78 FR 66365 - Proposed Information Collection Activity; Comment Request
Federal Register 2010, 2011, 2012, 2013, 2014
2013-11-05
... for Needy Families (TANF) program, it imposed a new data requirement that States prepare and submit data verification procedures and replaced other data requirements with new versions including: the TANF Data Report, the SSP-MOE Data Report, the Caseload Reduction Documentation Process, and the Reasonable...
DOE Office of Scientific and Technical Information (OSTI.GOV)
R. A. Carlson
2006-02-23
The 1607-D4 Septic System was a septic tank and tile field that received sanitary sewage from the 115-D/DR Gas Recirculation Facility. This septic system operated from 1944 to 1968. Decommissioning took place in 1985 and 1986 when all above-grade features were demolished and the tank backfilled. The results of verification sampling demonstrated that residual contaminant concentrations do not preclude any future uses and allow for unrestricted use of shallow zone soils. The results also showed that residual contaminant concentrations are protective of groundwater and the Columbia River.
NASA Technical Reports Server (NTRS)
Melendez, Orlando; Trizzino, Mary; Fedderson, Bryan
1997-01-01
The National Aeronautics and Space Administration (NASA), Kennedy Space Center (KSC) Materials Science Division conducted a study to evaluate alternative solvents for CFC-113 in precision cleaning and verification on typical samples that are used in the KSC environment. The effects of AK-225(R), Vertrel(R), MCA, and HFE A 7100 on selected metal and polymer materials were studied over 1, 7 and 30 day test times. This report addresses a study on the compatibility aspects of replacement solvents for materials in aerospace applications.
Self-Verification and Depressive Symptoms in Marriage and Courtship: A Multiple Pathway Model.
ERIC Educational Resources Information Center
Katz, Jennifer; Beach, Steven R. H.
1997-01-01
Examines whether self-verifying feedback may lead to decreased depressive symptoms. Results, based on 138 married women and 258 dating women, showed full mediational effects in the married sample and partial effects in the dating sample. Findings suggest that partner self-verifying feedback may intensify the effect of self-esteem on depression.…
The purpose of this SOP is to ensure suitable temperature maintenance of freezers used for storage of samples. This procedure was followed to ensure consistent data retrieval during the Arizona NHEXAS project and the "Border" study. Keywords: freezers; operation.
The National H...
The purpose of this SOP is to assure suitable temperature maintenance in refrigerators and freezers used for sample storage during the Arizona NHEXAS project and the "Border" study. Keywords: lab; equipment; refrigerators and freezers.
The National Human Exposure Assessment Su...
2013-09-01
5.6 SAMPLING RESULTS...6.0 PERFORMANCE...8.0 IMPLEMENTATION ISSUES...8.1 FILTRATION...Table 1. Performance results
Extracellular Vesicles in Bile as Markers of Malignant Biliary Stenoses.
Severino, Valeria; Dumonceau, Jean-Marc; Delhaye, Myriam; Moll, Solange; Annessi-Ramseyer, Isabelle; Robin, Xavier; Frossard, Jean-Louis; Farina, Annarita
2017-08-01
Algorithms for diagnosis of malignant common bile duct (CBD) stenoses are complex and lack accuracy. Malignant tumors secrete large numbers of extracellular vesicles (EVs) into surrounding fluids; EVs might therefore serve as biomarkers for diagnosis. We investigated whether concentrations of EVs in bile could discriminate malignant from nonmalignant CBD stenoses. We collected bile and blood samples from 50 patients undergoing therapeutic endoscopic retrograde cholangiopancreatography at university hospitals in Europe for CBD stenosis of malignant (pancreatic cancer, n = 20 or cholangiocarcinoma, n = 5) or nonmalignant (chronic pancreatitis [CP], n = 15) origin. Ten patients with CBD obstruction due to biliary stones were included as controls. EV concentrations in samples were determined by nanoparticle tracking analyses. The discovery cohort comprised the first 10 patients with a diagnosis of pancreatic cancer, based on tissue analysis, and 10 consecutive controls. Using samples from these subjects, we identified a threshold concentration of bile EVs that could best discriminate between patients with pancreatic cancer from controls. We verified the diagnostic performance of bile EV concentration by analyzing samples from the 30 consecutive patients with a diagnosis of malignant (pancreatic cancer or cholangiocarcinoma, n = 15) or nonmalignant (CP, n = 15) CBD stenosis. Samples were compared using the Mann-Whitney test and nonparametric Spearman correlation analysis. Receiver operating characteristic area under the curve was used to determine diagnostic accuracy. In both cohorts, the median concentration of EVs was significantly higher in bile samples from patients with malignant CBD stenoses than controls or nonmalignant CBD stenoses (2.41 × 10 15 vs 1.60 × 10 14 nanoparticles/L in the discovery cohort; P < .0001 and 4.00 × 10 15 vs 1.26 × 10 14 nanoparticles/L in the verification cohort; P < .0001). 
A threshold of 9.46 × 10^14 nanoparticles/L in bile best distinguished patients with malignant CBD stenoses from controls in the discovery cohort. In the verification cohort, this threshold discriminated malignant from nonmalignant CBD stenoses with 100% accuracy. Serum concentration of EVs distinguished patients with malignant from patients with nonmalignant CBD stenoses with 63.3% diagnostic accuracy. Concentration of EVs in bile samples discriminates between patients with malignant and nonmalignant CBD stenosis with 100% accuracy. Further studies are needed to confirm these findings. Clinical Trial registration no: ISRCTN66835592. Copyright © 2017 AGA Institute. Published by Elsevier Inc. All rights reserved.
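The threshold-based discrimination described above (rank the EV concentrations, then pick the cut-off that best separates malignant cases from controls, scoring overall performance by ROC AUC) can be sketched in a few lines. The concentration values below are illustrative only, not data from the study:

```python
def roc_auc(positives, negatives):
    """AUC via the Mann-Whitney U statistic: the probability that a
    randomly chosen positive scores higher than a random negative."""
    wins = sum((p > n) + 0.5 * (p == n)
               for p in positives for n in negatives)
    return wins / (len(positives) * len(negatives))

def best_threshold(positives, negatives):
    """Cut-off maximizing classification accuracy (call malignant if >= t)."""
    candidates = sorted(set(positives) | set(negatives))
    def accuracy(t):
        tp = sum(p >= t for p in positives)
        tn = sum(n < t for n in negatives)
        return (tp + tn) / (len(positives) + len(negatives))
    return max(candidates, key=accuracy)

# Illustrative bile EV concentrations (nanoparticles/L), not study data:
malignant = [2.4e15, 4.0e15, 1.8e15, 3.1e15]
benign    = [1.6e14, 1.3e14, 2.0e14, 5.0e14]
print(roc_auc(malignant, benign))  # 1.0: perfect separation on these values
```

A fully separating cut-off, as the verification cohort showed here, corresponds to AUC = 1.0; any threshold between the highest benign and lowest malignant value then classifies with 100% accuracy.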
NASA Technical Reports Server (NTRS)
Ross, Howard (Compiler)
2000-01-01
This document contains the results of a collection of selected cooperative research projects between principal investigators in the microgravity combustion science programs, sponsored by NASA and NEDO. Cooperation involved the use of drop towers in Japan and the United States, and the sharing of subsequent research data and findings. The topical areas include: (1) Interacting droplet arrays, (2) high pressure binary fuel sprays, (3) sooting droplet combustion, (4) flammability limits and dynamics of spherical, premixed gaseous flames and, (5) ignition and transition of flame spread across thin solid fuel samples. All of the investigators view this collaboration as a success. Novel flame behaviors were found and later published in archival journals. In some cases the experiments provided verification of the design and behavior in subsequent experiments performed on the Space Shuttle. In other cases, the experiments provided guidance to experiments that are expected to be performed on the International Space Station.
NASA Technical Reports Server (NTRS)
Luvall, J. C.; Sprigg, W. A.; Levetin, E.; Huete, A.; Nickovic, S.; Pejanovic, G. A.; Vukovic, A.; VandeWater, P. K.; Myers, O. B.; Budge, A. M.;
2011-01-01
Pollen can be transported great distances. Van de Water et al. reported Juniperus spp. pollen transported 200-600 km. Hence, local observations of plant phenology may not be consistent with the timing and source of pollen collected by pollen sampling instruments. The DREAM (Dust REgional Atmospheric Model) is a verified model for atmospheric dust transport that uses MODIS data products to identify source regions and quantities of dust. We are modifying the DREAM model to incorporate pollen transport. Pollen release will be estimated based on MODIS-derived phenology of Juniperus spp. communities. Ground-based observational records of pollen release timing and quantities will be used as verification. This information will be used to support the Centers for Disease Control and Prevention's National Environmental Public Health Tracking Program and the State of New Mexico environmental public health decision support for asthma and allergy alerts.
You Can Run, But You Can't Hide Juniper Pollen Phenology and Dispersal
NASA Technical Reports Server (NTRS)
Luvall, Jeffrey C.
2013-01-01
Pollen can be transported great distances. Van de Water et al. (2003) reported Juniperus spp. pollen transported 200-600 km. Hence, local observations of plant phenology may not be consistent with the timing and source of pollen collected by pollen sampling instruments. The DREAM (Dust REgional Atmospheric Model, Nickovic et al. 2001) is a verified model for atmospheric dust transport that uses MODIS data products to identify source regions and quantities of dust. We have modified the DREAM model to incorporate pollen transport. Pollen release is estimated based on MODIS-derived phenology of Juniperus spp. communities. Ground-based observational records of pollen release timing and quantities are used as verification. This information will be used to support the Centers for Disease Control and Prevention's National Environmental Public Health Tracking Program and the State of New Mexico environmental public health decision support for asthma and allergy alerts.
Regional agriculture surveys using ERTS-1 data
NASA Technical Reports Server (NTRS)
Draeger, W. C.; Nichols, J. D.; Benson, A. S.; Larrabee, D. G.; Jenkus, W. M.; Hay, C. M.
1974-01-01
The Center for Remote Sensing Research has conducted studies designed to evaluate the potential application of ERTS data in performing agricultural inventories, and to develop efficient methods of data handling and analysis useful in an operational context for large-area surveys. This work has resulted in the development of an integrated system utilizing both human and computer analysis of ground, aerial, and space imagery, which has been shown to be very efficient for regional crop acreage inventories. The technique involves: (1) the delineation of ERTS images into relatively homogeneous strata by human interpreters; (2) the point-by-point classification of the area within each stratum by crop type using a human/machine interactive digital image processing system; and (3) a multistage sampling procedure for the collection of supporting aerial and ground data used in the adjustment and verification of the classification results.
Martins, Andréa Maria Eleutério de Barros Lima; da Costa, Fernanda Marques; Ferreira, Raquel Conceição; dos Santos Neto, Pedro Eleutério; de Magalhaes, Tatiana Almeida; de Sá, Maria Aparecida Barbosa; Pordeus, Isabela Almeida
2015-01-01
Cross-sectional study conducted among workers of the Family Health Strategy in Montes Claros to investigate self-reported vaccination against hepatitis B, verification of immunization, and the factors associated with anti-HBs titers. We collected blood samples from those who reported having received one or more doses of the vaccine and evaluated the association of anti-HBs titers with sociodemographic, occupational, and behavioral conditions. Associations were tested with the Mann-Whitney and Kruskal-Wallis tests, Spearman correlation, and linear regression using SPSS® 17.0. Among the 761 respondents, 504 (66.1%) were vaccinated, 52.5% had received three doses, and 30.4% had verified their immunization. Of the 397 evaluated for anti-HBs titers, 16.4% were immune. Longer duration of work was associated with higher levels of anti-HBs, while smoking was inversely associated with anti-HBs. These workers need vaccination campaigns.
Krasteva, Vessela; Jekova, Irena; Schmid, Ramun
2018-01-01
This study aims to validate the 12-lead electrocardiogram (ECG) as a biometric modality based on two straightforward binary QRS template matching characteristics. Different perspectives of the human verification problem are considered, regarding optimal lead selection and stability over sample size, gender, age, and heart rate (HR). A clinical 12-lead resting ECG database, including a population of 460 subjects with two-session recordings (>1 year apart), is used. Cost-effective strategies for extraction of personalized QRS patterns (100 ms) and binary template matching estimate similarity in the time scale (matching time) and dissimilarity in the amplitude scale (mismatch area). The two-class person verification task, taking the decision to validate or to reject the subject identity, is managed by linear discriminant analysis (LDA). Non-redundant LDA models for different lead configurations (I, II, III, aVR, aVL, aVF, V1-V6) are trained on the first half of 230 subjects by stepwise feature selection until maximization of the area under the receiver operating characteristic curve (ROC AUC). The operating point on the training ROC at equal error rate (EER) is tested on the independent dataset (second half of 230 subjects) to report unbiased validation of test-ROC AUC and true verification rate (TVR = 100 - EER). The test results are further evaluated in groups by sample size, gender, age, and HR. The optimal QRS pattern projection for a single-lead ECG biometric modality is found in the frontal plane sector (60°-0°), with the best (Test-AUC/TVR) for lead II (0.941/86.8%) and a slight accuracy drop for -aVR (-0.017/-1.4%) and I (-0.01/-1.5%). Chest ECG leads have degrading accuracy from V1 (0.885/80.6%) to V6 (0.799/71.8%). The multi-lead ECG improves verification: 6-chest (0.97/90.9%), 6-limb (0.986/94.3%), 12-lead (0.995/97.5%).
The QRS pattern matching model shows stable performance for verification of 10 to 230 individuals, with insignificant degradation of TVR in women (1.2-3.6%), in adults ≥70 years (3.7%) and <40 years (1.9%), and at HR <60 bpm (1.2%) or >90 bpm (3.9%), and no degradation for HR change (0 to >20 bpm).
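The equal-error-rate operating point used in this study can be located by sweeping a decision threshold until the false-reject rate on genuine comparisons matches the false-accept rate on impostor comparisons. A minimal sketch with invented similarity scores (not the paper's QRS matching features):

```python
def eer(genuine, impostor):
    """Equal error rate: threshold where the false-reject rate on genuine
    scores is closest to the false-accept rate on impostor scores."""
    thresholds = sorted(set(genuine) | set(impostor))
    best = None
    for t in thresholds:
        frr = sum(g < t for g in genuine) / len(genuine)     # rejected genuines
        far = sum(i >= t for i in impostor) / len(impostor)  # accepted impostors
        gap = abs(frr - far)
        if best is None or gap < best[0]:
            best = (gap, (frr + far) / 2)
    return best[1]

# Hypothetical matching scores from a QRS-template comparator (higher = more similar):
genuine  = [0.91, 0.88, 0.95, 0.70, 0.93]
impostor = [0.40, 0.55, 0.72, 0.30, 0.45]
rate = eer(genuine, impostor)
tvr = 100 * (1 - rate)  # true verification rate, following TVR = 100 - EER
```

On these made-up scores the EER is 0.2, giving TVR = 80%; the paper's reported TVRs correspond to the same construction on its LDA outputs.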
DOE Office of Scientific and Technical Information (OSTI.GOV)
Adams, Wade C.
At Pennsylvania Department of Environmental Protection's request, ORAU's IEAV program conducted verification surveys on the excavated surfaces of Section 3, SUs 1, 4, and 5 at the Whittaker site on March 13 and 14, 2013. The survey activities included visual inspections, gamma radiation surface scans, gamma activity measurements, and soil sampling activities. Verification activities also included the review and assessment of the licensee's project documentation and methodologies. Surface scans identified four areas of elevated direct gamma radiation distinguishable from background; one area within SUs 1 and 4 and two areas within SU5. One area within SU5 was remediated by removing a golf-ball-size piece of slag while ORAU staff was onsite. With the exception of the golf-ball-size piece of slag within SU5, a review of the ESL Section 3 EXS data packages for SUs 1, 4, and 5 indicated that these locations of elevated gamma radiation were also identified by the ESL gamma scans and that ESL personnel performed additional investigations and soil sampling within these areas. The investigative results indicated that the areas met the release criteria.
Haidar Ahmad, Imad A; Tam, James; Li, Xue; Duffield, William; Tarara, Thomas; Blasko, Andrei
2017-02-05
The parameters affecting the recovery of pharmaceutical residues from the surface of stainless steel coupons for quantitative cleaning verification method development have been studied, including active pharmaceutical ingredient (API) level, spiking procedure, API/excipient ratio, analyst-to-analyst variability, inter-day variability, and the cleaning procedure of the coupons. The lack of a well-defined procedure that consistently cleaned the coupon surface was identified as the major contributor to low and variable recoveries. Assessment of acid, base, and oxidant washes, as well as the order of treatment, showed that a base-water-acid-water-oxidizer-water wash procedure resulted in consistent, accurate spiked recovery (>90%) and reproducible results (RSD ≤4%). By applying this cleaning procedure to previously used coupons that had failed the cleaning acceptance criteria, multiple analysts were able to obtain consistent recoveries from day to day for different APIs and API/excipient ratios at various spike levels. We successfully applied our approach to cleaning verification of small molecules (MW < 1,000 Da) as well as large biomolecules (MW up to 50,000 Da). Method robustness was greatly influenced by the sample preparation procedure, especially for analyses using total organic carbon (TOC) determination. Copyright © 2016 Elsevier B.V. All rights reserved.
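The acceptance criteria quoted above (mean spiked recovery >90%, relative standard deviation ≤4%) reduce to two statistics over replicate coupons. A generic sketch with invented replicate values, not data from the study:

```python
import statistics

def recovery_stats(measured, spiked_amount):
    """Percent recovery per replicate, summarized as mean and %RSD."""
    recoveries = [100.0 * m / spiked_amount for m in measured]
    mean = statistics.mean(recoveries)
    rsd = 100.0 * statistics.stdev(recoveries) / mean  # relative std. deviation
    return mean, rsd

# Hypothetical API amounts (µg) recovered from coupons spiked with 10 µg each:
measured = [9.4, 9.1, 9.6, 9.3]
mean, rsd = recovery_stats(measured, 10.0)
passes = mean > 90.0 and rsd <= 4.0  # acceptance criteria from the abstract
```

Here the mean recovery is 93.5% with an RSD of about 2.2%, so this hypothetical coupon set would pass.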
Development of CFC-Free Cleaning Processes at the NASA White Sands Test Facility
NASA Technical Reports Server (NTRS)
Beeson, Harold; Kirsch, Mike; Hornung, Steven; Biesinger, Paul
1995-01-01
The NASA White Sands Test Facility (WSTF) is developing cleaning and verification processes to replace currently used chlorofluorocarbon-113- (CFC-113-) based processes. The processes being evaluated include both aqueous- and solvent-based techniques. The presentation will include the findings of investigations of aqueous cleaning and verification processes that are based on a draft of a proposed NASA Kennedy Space Center (KSC) cleaning procedure. Verification testing with known contaminants, such as hydraulic fluid and commonly used oils, established correlations between nonvolatile residue and CFC-113. Recoveries ranged from 35 to 60 percent of theoretical. WSTF is also investigating enhancements to aqueous sampling for organics and particulates. Although aqueous alternatives have been identified for several processes, a need still exists for nonaqueous solvent cleaning, such as the cleaning and cleanliness verification of gauges used for oxygen service. The cleaning effectiveness of tetrachloroethylene (PCE), trichloroethylene (TCE), ethanol, hydrochlorofluorocarbon-225 (HCFC-225), tert-butylmethylether, and n-Hexane was evaluated using aerospace gauges and precision instruments and then compared to the cleaning effectiveness of CFC-113. Solvents considered for use in oxygen systems were also tested for oxygen compatibility using high-pressure oxygen autoignition and liquid oxygen mechanical impact testing.
Evaluation of the 29-km Eta Model. Part 1; Objective Verification at Three Selected Stations
NASA Technical Reports Server (NTRS)
Nutter, Paul A.; Manobianco, John; Merceret, Francis J. (Technical Monitor)
1998-01-01
This paper describes an objective verification of the National Centers for Environmental Prediction (NCEP) 29-km eta model from May 1996 through January 1998. The evaluation was designed to assess the model's surface and upper-air point forecast accuracy at three selected locations during separate warm (May - August) and cool (October - January) season periods. In order to enhance sample sizes available for statistical calculations, the objective verification includes two consecutive warm and cool season periods. Systematic model deficiencies comprise the larger portion of the total error in most of the surface forecast variables that were evaluated. The error characteristics for both surface and upper-air forecasts vary widely by parameter, season, and station location. At upper levels, a few characteristic biases are identified. Overall however, the upper-level errors are more nonsystematic in nature and could be explained partly by observational measurement uncertainty. With a few exceptions, the upper-air results also indicate that 24-h model error growth is not statistically significant. In February and August 1997, NCEP implemented upgrades to the eta model's physical parameterizations that were designed to change some of the model's error characteristics near the surface. The results shown in this paper indicate that these upgrades led to identifiable and statistically significant changes in forecast accuracy for selected surface parameters. While some of the changes were expected, others were not consistent with the intent of the model updates and further emphasize the need for ongoing sensitivity studies and localized statistical verification efforts. Objective verification of point forecasts is a stringent measure of model performance, but when used alone, is not enough to quantify the overall value that model guidance may add to the forecast process. 
Therefore, results from a subjective verification of the meso-eta model over the Florida peninsula are discussed in the companion paper by Manobianco and Nutter. Overall verification results presented here and in part two should establish a reasonable benchmark from which model users and developers may pursue the ongoing eta model verification strategies in the future.
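The paper's split between systematic and nonsystematic error follows the standard decomposition of mean-square forecast error into squared bias (the systematic part) plus error variance (the nonsystematic part). A minimal illustration with fabricated forecast/observation pairs, not eta model output:

```python
import statistics

def error_decomposition(forecasts, observations):
    """Decompose MSE = bias**2 + error variance: the first term is the
    systematic error, the second the nonsystematic (random) error."""
    errors = [f - o for f, o in zip(forecasts, observations)]
    bias = statistics.mean(errors)
    var = statistics.pvariance(errors)            # population variance of errors
    mse = statistics.mean(e * e for e in errors)  # total mean-square error
    return bias, var, mse

# Fabricated surface temperature forecasts vs observations (°C):
forecasts    = [21.0, 23.5, 19.0, 25.0]
observations = [20.0, 22.0, 18.5, 23.0]
bias, var, mse = error_decomposition(forecasts, observations)
assert abs(mse - (bias**2 + var)) < 1e-9  # the decomposition holds exactly
```

A large bias term relative to the variance term is what the paper means by systematic deficiencies comprising the larger portion of the total error.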
76 FR 72225 - Proposed Collection; Comment Request
Federal Register 2010, 2011, 2012, 2013, 2014
2011-11-22
.... Independent public accountants must verify the fund's assets at least three times a year and two of the... independent public accountants when they perform verifications of fund assets.\\4\\ Approximately 243 funds rely... hours per fund) x $165 (fund senior accountant's hourly rate) = $82.50. \\3\\ Respondents estimated that...
VALIDATION EXISTING DATA IN THE ENVIRONMENTAL TECHNOLOGY VERIFICATION PROGRAM
Establishing the credibility of existing data is an ongoing issue, particularly when the data sets are to be used for a secondary purpose, not the original reason for which they were collected. If the secondary purpose is similar to the primary purpose, the potential user may hav...
Toward Ada Verification: A Collection of Relevant Topics
1986-06-01
presumably it is this: if there are no default values, a programming error which results in failure to initialize a variable is more likely to advertise ... disadvantages to using AVID. First, TDL is a more complicated interface than first-order logic (as used in the CSG). Second, AVID is unsupported and
Code of Federal Regulations, 2010 CFR
2010-07-01
..., Statements of Account, Auditor's Reports, and other verification information filed in the Copyright Office... statements of account under compulsory license for making/distributing phonorecords of 201.19 Nondramatic... works, Royalties and statements of account under compulsory license for making/distributing 201.19...
75 FR 28550 - Proposed Information Collection; Comment Request; Delivery Verification Procedure
Federal Register 2010, 2011, 2012, 2013, 2014
2010-05-21
... the commodities shipped to the U.S. were in fact received. This procedure increases the effectiveness... Review: Regular submission. Affected Public: Business or other for-profit organizations. Estimated Number.... Estimated Total Annual Cost to Public: $0. IV. Request for Comments Comments are invited on: (a) Whether the...
Testing a Model of Participant Retention in Longitudinal Substance Abuse Research
ERIC Educational Resources Information Center
Gilmore, Devin; Kuperminc, Gabriel P.
2014-01-01
Longitudinal substance abuse research has often been compromised by high rates of attrition, thought to be the result of the lifestyle that often accompanies addiction. Several studies have used strategies including collection of locator information at the baseline assessment, verification of the information, and interim contacts prior to…
78 FR 59029 - Information Collection Being Reviewed by the Federal Communications Commission
Federal Register 2010, 2011, 2012, 2013, 2014
2013-09-25
... Broadcast Station Antenna Patterns. Form No.: Not applicable. Type of Review: Revision of a currently... Rules Regarding AM Radio Service Directional Antenna Performance Verification, MM Docket No. 93-177, FCC... functions as the antenna. Consequently, a nearby tower may become an unintended part of the AM antenna...
32 CFR Attachment B to Subpart B... - Standard B-Single Scope Background Investigation (SSBI)
Code of Federal Regulations, 2010 CFR
2010-07-01
...) Employment: Verification of all employments for the past seven years; personal interviews of sources... most recent or most significant claimed attendance, degree, or diploma. Interviews of appropriate... of the subject and collectively span at least the last seven years. (9) Former Spouse: An interview...
76 FR 44006 - Information Collection Being Reviewed by the Federal Communications Commission
Federal Register 2010, 2011, 2012, 2013, 2014
2011-07-22
...) Attachment 1--Community Mental Health Center Verification Template; (2) Attachment 2--Invoice Template; (3... community's ability to provide a rapid and coordinated response in the event of a public health crisis.... Title: Universal Service--Rural Health Care Program/Rural Health Care Pilot Program. Form Nos.: FCC...
SU-F-T-268: A Feasibility Study of Independent Dose Verification for Vero4DRT
DOE Office of Scientific and Technical Information (OSTI.GOV)
Yamashita, M; Kokubo, M; Institute of Biomedical Research and Innovation, Kobe, Hyogo
2016-06-15
Purpose: Vero4DRT (Mitsubishi Heavy Industries Ltd.) has been released for a few years. The treatment planning system (TPS) of Vero4DRT is dedicated, so measurement is the only method of dose verification. There have been no reports of independent dose verification using a Clarkson-based algorithm for Vero4DRT. An independent dose verification software program for general-purpose linacs using a modified Clarkson-based algorithm was adapted for Vero4DRT. In this study, we evaluated the accuracy of the independent dose verification program and the feasibility of the secondary check for Vero4DRT. Methods: iPlan (Brainlab AG) was used as the TPS. Pencil Beam Convolution was used as the dose calculation algorithm for IMRT and X-ray Voxel Monte Carlo was used for the others. Simple MU Analysis (SMU, Triangle Products, Japan) was used as the independent dose verification software program, in which CT-based dose calculation was performed using a modified Clarkson-based algorithm. In this study, 120 patients' treatment plans were collected in our institute. The treatments were performed using conventional irradiation for lung and prostate, SBRT for lung, and step-and-shoot IMRT for prostate. Dose was compared between the TPS and the SMU, and confidence limits (CLs, mean ± 2SD %) were compared to those from the general-purpose linac. Results: As for the CLs, conventional irradiation (lung, prostate), SBRT (lung), and IMRT (prostate) show 2.2 ± 3.5% (CL of the general-purpose linac: 2.4 ± 5.3%), 1.1 ± 1.7% (-0.3 ± 2.0%), 4.8 ± 3.7% (5.4 ± 5.3%), and -0.5 ± 2.5% (-0.1 ± 3.6%), respectively. The CLs for Vero4DRT are similar to those for the general-purpose linac. Conclusion: The independent dose verification for the new linac is clinically available as a secondary check, and we performed the check with a tolerance level similar to that of the general-purpose linac.
This research is partially supported by the Japan Agency for Medical Research and Development (AMED).
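The confidence limits used in this abstract (CL = mean ± 2SD of the percentage dose differences between the independent check and the TPS) can be computed directly. A sketch assuming per-plan percentage differences as input; the values are invented, not from the study:

```python
import statistics

def confidence_limits(dose_diffs_percent):
    """CL = mean ± 2*SD (%) over per-plan (independent check - TPS) dose
    differences, as used for secondary-check tolerance levels."""
    mean = statistics.mean(dose_diffs_percent)
    sd = statistics.stdev(dose_diffs_percent)
    return mean, 2 * sd

# Hypothetical per-plan dose differences (%) for one treatment class:
diffs = [1.8, 2.6, 2.0, 3.0, 1.6]
mean, two_sd = confidence_limits(diffs)
within = all(abs(d - mean) <= two_sd for d in diffs)  # tolerance check
```

A plan whose difference falls outside mean ± 2SD for its treatment class would be flagged for investigation in such a secondary check.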
NASA Technical Reports Server (NTRS)
Sanders, Gerald B.; Araghi, Koorosh; Ess, Kim M.; Valencia, Lisa M.; Muscatello, Anthony C.; Calle, Carlos I.; Clark, Larry; Iacomini, Christie
2014-01-01
The making of oxygen from resources in the Martian atmosphere, known as In Situ Resource Utilization (ISRU), has the potential to provide substantial benefits for future robotic and human exploration. In particular, the ability to produce oxygen on Mars for use in propulsion, life support, and power systems can provide significant mission benefits such as reduced launch mass, lander size, and mission and crew risk. To advance ISRU for possible incorporation into future human missions to Mars, NASA proposed including an ISRU instrument on the Mars 2020 rover mission through an announcement of opportunity (AO). The purpose of the Mars Atmosphere Resource Verification In-situ (MARVIN) instrument is to provide the first demonstration on Mars of oxygen production from acquired and stored Martian atmospheric carbon dioxide, as well as to take measurements of atmospheric pressure and temperature, and of the sizes and amounts of suspended dust particles entrained in collected atmosphere gases at different times of the Mars day and year. The hardware performance and environmental data obtained will be critical for future ISRU systems that will reduce the mass of propellants and other consumables launched from Earth for robotic and human exploration, for better understanding of Mars dust and mitigation techniques to improve crew safety, and to help further define Mars global circulation models and better understand regional atmospheric dynamics on Mars. The technologies selected for MARVIN are also scalable for future robotic sample return and human missions to Mars using ISRU.
Category V Compliant Container for Mars Sample Return Missions
NASA Technical Reports Server (NTRS)
Dolgin, Benjamin; Sanok, Joseph; Sevilla, Donald; Bement, Laurence J.
2000-01-01
A novel containerization technique that satisfies Planetary Protection (PP) Category V requirements has been developed and demonstrated on the mock-up of the Mars Sample Return Container. The proposed approach uses explosive welding with a sacrificial layer and cut-through-the-seam techniques. The technology produces a container that is free from Martian contaminants on an atomic level. The containerization technique can be used on any celestial body that may support life. A major advantage of the proposed technology is the possibility of very fast (less than an hour) verification of both containment and cleanliness with typical metallurgical laboratory equipment. No separate biological verification is required. In addition to Category V requirements, the proposed container presents a surface that is clean from any, even nonviable organisms, and any molecular fragments of biological origin that are unique to Mars or any other celestial body other than Earth.
Diode step stress program for JANTX1N5615
NASA Technical Reports Server (NTRS)
1979-01-01
The effect of power/temperature step stress when applied to the switching diode JANTX1N5615 manufactured by Semtech and Micro Semiconductor was examined. A total of 48 samples from each manufacturer were submitted to the process. In addition, two control sample units were maintained for verification of the electrical parametric testing. All test samples were subjected to the electrical tests after completing the prior power/temperature step stress point. Results are presented.
The Golosiiv on-line plate archive database, management and maintenance
NASA Astrophysics Data System (ADS)
Pakuliak, L.; Sergeeva, T.
2007-08-01
We intend to create an online version of the database of the MAO NASU plate archive as VO-compatible structures, in accordance with principles developed by the International Virtual Observatory Alliance, in order to make it available to the world astronomical community. The online version of the log-book database is built with MySQL and PHP. The data management system provides a user interface that supports detailed, form-based radial searches of plates, auxiliary samplings, and listings of each collection, and allows browsing of the detailed descriptions of the collections. The administrative tool allows the database administrator to correct data, add new data sets, and control the integrity and consistency of the database as a whole. The VO-compatible database is currently being constructed to meet the demands, and follow the principles, of international data archives, and it must be strongly generalized to enable data mining through standard interfaces and to best fit the requirements of the WFPDB Group for databases of plate catalogues. Ongoing enhancement of the database toward the WFPDB brings the problem of data verification to the forefront, as it demands a high degree of data reliability. The process of data verification is practically endless and inseparable from data management, owing to the diversity of data errors and hence the variety of approaches needed to identify and fix them. The current status of the MAO NASU glass archive forces activity in both directions simultaneously: enhancement of the log-book database with new sets of observational data, creation of the generalized database, and cross-identification between them. The VO-compatible version of the database is being supplied with digitized data from plates scanned with a MicroTek ScanMaker 9800 XL TMA. Scanning is not exhaustive but is conducted selectively within the framework of special projects.
The purpose of this SOP is to assure suitable temperature maintenance in refrigerators and freezers used for sample storage during the Arizona NHEXAS project and the Border study. Keywords: lab; equipment; refrigerators and freezers.
The U.S.-Mexico Border Program is sponsored...
Transistor step stress testing program for JANTX2N2484
NASA Technical Reports Server (NTRS)
1979-01-01
The effect of power/temperature step stress when applied to the transistor JANTX2N2484, manufactured by Raytheon and Teledyne was evaluated. Forty-eight samples from each manufacturer were divided equally (16 per group) into three groups and submitted to the processes outlined. In addition, two control sample units were maintained for verification of the electrical parametric testing.
ERIC Educational Resources Information Center
Mesmer-Magnus, Jessica R.; Viswesvaran, Chockalingam
2005-01-01
The overlap between measures of work-to-family (WFC) and family-to-work conflict (FWC) was meta-analytically investigated. Researchers have assumed WFC and FWC to be distinct, however, this assumption requires empirical verification. Across 25 independent samples (total N=9079) the sample size weighted mean observed correlation was .38 and the…
Series: Pragmatic trials and real world evidence: Paper 8. Data collection and management.
Meinecke, Anna-Katharina; Welsing, Paco; Kafatos, George; Burke, Des; Trelle, Sven; Kubin, Maria; Nachbaur, Gaelle; Egger, Matthias; Zuidgeest, Mira
2017-11-01
Pragmatic trials can improve our understanding of how treatments will perform in routine practice. In a series of eight papers, the GetReal Consortium has evaluated the challenges in designing and conducting pragmatic trials and their specific methodological, operational, regulatory, and ethical implications. The present final paper of the series discusses the operational and methodological challenges of data collection in pragmatic trials. A more pragmatic data collection needs to balance the delivery of highly accurate and complete data with minimizing the level of interference that data entry and verification induce with clinical practice. Furthermore, it should allow for the involvement of a representative sample of practices, physicians, and patients who prescribe/receive treatment in routine care. This paper discusses challenges that are related to the different methods of data collection and presents potential solutions where possible. No one-size-fits-all recommendation can be given for the collection of data in pragmatic trials, although in general the application of existing routinely used data-collection systems and processes seems to best suit the pragmatic approach. However, data access and privacy, the time points of data collection, the level of detail in the data, and the lack of a clear understanding of the data-collection process were identified as main challenges for the usage of routinely collected data in pragmatic trials. A first step should be to determine to what extent existing health care databases provide the necessary study data and can accommodate data collection and management. When more elaborate or detailed data collection or more structured follow-up is required, data collection in a pragmatic trial will have to be tailor-made, often using a hybrid approach using a dedicated electronic case report form (eCRF). 
In this case, the eCRF should be kept as simple as possible to reduce the burden for practitioners and minimize influence on routine clinical practice. Copyright © 2017 The Authors. Published by Elsevier Inc. All rights reserved.
Kim, Sang-Bog; Roche, Jennifer
2013-08-01
Organically bound tritium (OBT) is an important tritium species that can be measured in most environmental samples, but has only recently been recognized as a species of tritium in these samples. Currently, OBT is not routinely measured by environmental monitoring laboratories around the world. There are no certified reference materials (CRMs) for environmental samples. Thus, quality assurance (QA), or verification of the accuracy of the OBT measurement, is not possible. Alternatively, quality control (QC), or verification of the precision of the OBT measurement, can be achieved. In the past, there have been differences in OBT analysis results between environmental laboratories. A possible reason for the discrepancies may be differences in analytical methods. Therefore, inter-laboratory OBT comparisons among the environmental laboratories are important and would provide a good opportunity for adopting a reference OBT analytical procedure. Due to the analytical issues, only limited information is available on OBT measurement. Previously conducted OBT inter-laboratory practices are reviewed and the findings are described. Based on our experiences, a few considerations were suggested for the international OBT inter-laboratory comparison exercise to be completed in the near future. Crown Copyright © 2013. Published by Elsevier Ltd. All rights reserved.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gastelum, Zoe Nellie; Sentz, Kari; Swanson, Meili Claire
Recent advances in information technology have led to an expansion of crowdsourcing activities that utilize the “power of the people” harnessed via online games, communities of interest, and other platforms to collect, analyze, verify, and provide technological solutions for challenges from a multitude of domains. In response to this surge in popularity, the research team developed a taxonomy of crowdsourcing activities as they relate to international nuclear safeguards, evaluated the potential legal and ethical issues surrounding the use of crowdsourcing to support safeguards, and proposed experimental designs to test the capabilities and prospects for the use of crowdsourcing to support nuclear safeguards verification.
NASA Technical Reports Server (NTRS)
Kleb, William L.; Wood, William A.
2004-01-01
The computational simulation community is not routinely publishing independently verifiable tests to accompany new models or algorithms. A survey reveals that only 22% of new models published are accompanied by tests suitable for independently verifying the new model. As the community develops larger codes with increased functionality, and hence increased complexity in terms of the number of building block components and their interactions, it becomes prohibitively expensive for each development group to derive the appropriate tests for each component. Therefore, the computational simulation community is building its collective castle on a very shaky foundation of components with unpublished and unrepeatable verification tests. The computational simulation community needs to begin publishing component level verification tests before the tide of complexity undermines its foundation.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Weaver, T.A.; Baker, D.F.; Edwards, C.L.
1993-10-01
Surface ground motion was recorded for many of the Integrated Verification Experiments using standard 10-, 25- and 100-g accelerometers, force-balanced accelerometers and, for some events, using golf balls and 0.39-cm steel balls as surface inertial gauges (SIGs). This report contains the semi-processed acceleration, velocity, and displacement data for the accelerometers fielded and the individual observations for the SIG experiments. Most acceleration, velocity, and displacement records have had calibrations applied and have been deramped, offset corrected, and deglitched but are otherwise unfiltered or processed from their original records. Digital data for all of these records are stored at Los Alamos National Laboratory.
Jornet, Núria; Carrasco, Pablo; Beltrán, Mercè; Calvo, Juan Francisco; Escudé, Lluís; Hernández, Victor; Quera, Jaume; Sáez, Jordi
2014-09-01
We performed a multicentre intercomparison of IMRT optimisation and dose planning and IMRT pre-treatment verification methods and results. The aims were to check consistency between dose plans and to validate whether in-house pre-treatment verification results agreed with those of an external audit. Participating centres used two mock cases (prostate and head and neck) for the intercomparison and audit. Compliance to dosimetric goals and total number of MU per plan were collected. A simple quality index to compare the different plans was proposed. We compared gamma index pass rates using the centre's equipment and methodology to those of an external audit. While for the prostate case, all centres fulfilled the dosimetric goals and plan quality was homogeneous, that was not the case for the head and neck case. The number of MU did not correlate with the plan quality index. Pre-treatment verification results of the external audit did not agree with those of the in-house measurements for two centres: results were within tolerance for the in-house measurements but unacceptable for the audit, or vice versa. Although all plans fulfilled dosimetric constraints, plan quality is highly dependent on the planner's expertise. External audits are an excellent tool to detect errors in IMRT implementation and cannot be replaced by intercomparisons based on results reported by the centres themselves. Copyright © 2014 Elsevier Ireland Ltd. All rights reserved.
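The gamma index pass rate mentioned above combines a dose-difference criterion with a distance-to-agreement (DTA) criterion. As an illustrative sketch (not the centres' verification software), a minimal 1-D global gamma computation might look like the following; the dose profiles, the 3 mm / 3% criteria, and the units are hypothetical:

```python
import math

def gamma_1d(ref_pos, ref_dose, eval_pos, eval_dose, dta=3.0, dd=0.03):
    """1-D global gamma index: for each evaluated point, take the minimum
    combined dose-difference / distance-to-agreement metric over all
    reference points. dta is in mm; dd is a fraction of the maximum
    reference dose (global normalisation)."""
    d_norm = dd * max(ref_dose)
    gammas = []
    for xe, de in zip(eval_pos, eval_dose):
        g = min(math.sqrt(((xe - xr) / dta) ** 2 + ((de - dr) / d_norm) ** 2)
                for xr, dr in zip(ref_pos, ref_dose))
        gammas.append(g)
    return gammas

def pass_rate(gammas):
    """Percentage of points with gamma <= 1 (the usual pass criterion)."""
    return 100.0 * sum(g <= 1.0 for g in gammas) / len(gammas)

# Hypothetical planned (reference) and measured (evaluated) profiles
ref_pos = [0.0, 1.0, 2.0, 3.0]        # positions in mm
ref_dose = [100.0, 90.0, 50.0, 10.0]  # doses in cGy
eval_dose = [101.0, 91.0, 49.0, 10.0]
gammas = gamma_1d(ref_pos, ref_dose, ref_pos, eval_dose)
```

In clinical practice the same comparison runs over 2-D or 3-D dose grids, but the per-point minimisation is identical in structure.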
Hailstorms over Switzerland: Verification of Crowd-sourced Data
NASA Astrophysics Data System (ADS)
Noti, Pascal-Andreas; Martynov, Andrey; Hering, Alessandro; Martius, Olivia
2016-04-01
The reports of smartphone users, witnessing hailstorms, can be used as a source of independent, ground-based observation data on ground-reaching hailstorms with high temporal and spatial resolution. The presented work focuses on the verification of crowd-sourced data collected over Switzerland with the help of a smartphone application recently developed by MeteoSwiss. The precise location, time of hail precipitation and the hailstone size are included in the crowd-sourced data, assessed on the basis of the weather radar data of MeteoSwiss. Two radar-based hail detection algorithms, POH (Probability of Hail) and MESHS (Maximum Expected Severe Hail Size), in use at MeteoSwiss are compared against the crowd-sourced data. The available data cover the investigation period from June to August 2015. Filter criteria have been applied in order to remove false reports from the crowd-sourced data. Neighborhood methods have been introduced to reduce the uncertainties which result from spatial and temporal biases. The crowd-sourced and radar data are converted into binary sequences according to previously set thresholds, allowing for categorical verification. Verification scores (e.g. hit rate) are then calculated from a 2x2 contingency table. The hail reporting activity and patterns corresponding to "hail" and "no hail" reports, sent from smartphones, have been analyzed. The relationship between the reported hailstone sizes and both radar-based hail detection algorithms has been investigated.
DOE Office of Scientific and Technical Information (OSTI.GOV)
NONE
1995-11-01
This report presents the results of instrumentation measurements and observations made during construction of the North Ramp Starter Tunnel (NRST) of the Exploratory Studies Facility (ESF). The information in this report was developed as part of the Design Verification Study, Section 8.3.1.15.1.8 of the Yucca Mountain Site Characterization Plan (DOE 1988). The ESF is being constructed by the US Department of Energy (DOE) to evaluate the feasibility of locating a potential high-level nuclear waste repository on lands within and adjacent to the Nevada Test Site (NTS), Nye County, Nevada. The Design Verification Studies are performed to collect information during construction of the ESF that will be useful for design and construction of the potential repository. Four experiments make up the Design Verification Study: Evaluation of Mining Methods, Monitoring Drift Stability, Monitoring of Ground Support Systems, and The Air Quality and Ventilation Experiment. This report describes Sandia National Laboratories' (SNL) efforts in the first three of these experiments in the NRST.
VerifEYE: a real-time meat inspection system for the beef processing industry
NASA Astrophysics Data System (ADS)
Kocak, Donna M.; Caimi, Frank M.; Flick, Rick L.; Elharti, Abdelmoula
2003-02-01
Described is a real-time meat inspection system developed for the beef processing industry by eMerge Interactive. Designed to detect and localize trace amounts of contamination on cattle carcasses in the packing process, the system affords the beef industry an accurate, high speed, passive optical method of inspection. Using a method patented by the United States Department of Agriculture and Iowa State University, the system takes advantage of fluorescing chlorophyll found in the animal's diet, and therefore the digestive tract, to allow detection and imaging of contaminated areas that may harbor potentially dangerous microbial pathogens. Featuring real-time image processing and documentation of performance, the system can be easily integrated into a processing facility's Hazard Analysis and Critical Control Point quality assurance program. This paper describes the VerifEYE carcass inspection and removal verification system. Results indicating the feasibility of the method, as well as field data collected using a prototype system during four university trials conducted in 2001, are presented. Two successful demonstrations using the prototype system were held at a major U.S. meat processing facility in early 2002.
Authentication Based on Pole-zero Models of Signature Velocity
Rashidi, Saeid; Fallah, Ali; Towhidkhah, Farzad
2013-01-01
With the increase of communication and financial transactions through the internet, on-line signature verification is an accepted biometric technology for access control and plays a significant role in authenticity and authorization in modernized society. Therefore, fast and precise algorithms for signature verification are very attractive. The goal of this paper is the modeling of the velocity signal, whose pattern and properties are stable for each person. Using pole-zero models based on the discrete cosine transform, a precise modeling method is proposed, and features are then extracted from strokes. Using linear, Parzen-window, and support vector machine classifiers, the signature verification technique was tested with a large number of authentic and forged signatures and has demonstrated the good potential of this technique. The signatures were collected from three different databases: a proprietary database and the SVC2004 and Sabanci University signature (SUSIG) benchmark databases. Experimental results based on the Persian, SVC2004 and SUSIG databases show that our method achieves an equal error rate of 5.91%, 5.62% and 3.91% on skilled forgeries, respectively. PMID:24696797
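The equal error rate (EER) reported above is the operating point where the false acceptance rate (FAR) equals the false rejection rate (FRR). As an illustrative sketch (not the authors' implementation), an EER can be estimated from classifier scores by sweeping the decision threshold; the score lists and the higher-score-means-genuine convention below are hypothetical:

```python
def far_frr(genuine, forgeries, threshold):
    """FRR: fraction of genuine signatures rejected; FAR: fraction of
    forgeries accepted. Assumes higher score = more likely genuine."""
    frr = sum(s < threshold for s in genuine) / len(genuine)
    far = sum(s >= threshold for s in forgeries) / len(forgeries)
    return far, frr

def equal_error_rate(genuine, forgeries):
    """Sweep candidate thresholds (the observed scores) and return the
    rate at the threshold where |FAR - FRR| is smallest."""
    best = None
    for t in sorted(set(genuine + forgeries)):
        far, frr = far_frr(genuine, forgeries, t)
        gap = abs(far - frr)
        if best is None or gap < best[0]:
            best = (gap, (far + frr) / 2)
    return best[1]

# Hypothetical matcher scores
genuine = [0.9, 0.8, 0.85, 0.7]
forgeries = [0.3, 0.4, 0.2, 0.75]
eer = equal_error_rate(genuine, forgeries)
```

With continuous score distributions the EER is usually read off an interpolated ROC/DET curve; the discrete sweep here is the simplest approximation.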
Load Model Verification, Validation and Calibration Framework by Statistical Analysis on Field Data
NASA Astrophysics Data System (ADS)
Jiao, Xiangqing; Liao, Yuan; Nguyen, Thai
2017-11-01
Accurate load models are critical for power system analysis and operation. A large amount of research work has been done on load modeling. Most of the existing research focuses on developing load models, while little has been done on developing formal load model verification and validation (V&V) methodologies or procedures. Most existing load model validation is based on qualitative rather than quantitative analysis. In addition, not all aspects of the model V&V problem have been addressed by the existing approaches. To complement the existing methods, this paper proposes a novel load model verification and validation framework that can systematically and more comprehensively examine a load model's effectiveness and accuracy. Statistical analysis, instead of visual check, quantifies the load model's accuracy and provides a confidence level of the developed load model for model users. The analysis results can also be used to calibrate load models. The proposed framework can be used by utility engineers and researchers as guidance for systematically examining load models. The proposed method is demonstrated through analysis of field measurements collected from a utility system.
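A statistical check of model accuracy against field data, of the general kind the framework above advocates, can be sketched as follows. This is an illustrative example, not the paper's method: the error metrics, the 95% normal-approximation interval, and the measurement values are all assumptions:

```python
import math
import statistics

def model_accuracy(measured, simulated):
    """Quantify model accuracy with simple statistics rather than a
    visual check. Returns RMSE, the mean error, and an approximate 95%
    confidence interval on the mean error (normal approximation)."""
    errors = [s - m for m, s in zip(measured, simulated)]
    rmse = math.sqrt(sum(e * e for e in errors) / len(errors))
    mean_err = statistics.mean(errors)
    sem = statistics.stdev(errors) / math.sqrt(len(errors))
    ci = (mean_err - 1.96 * sem, mean_err + 1.96 * sem)
    return rmse, mean_err, ci

# Hypothetical field measurements vs. load-model output (e.g., MW)
measured = [100.0, 120.0, 140.0, 160.0]
simulated = [101.0, 119.0, 142.0, 160.0]
rmse, mean_err, ci = model_accuracy(measured, simulated)
```

If the confidence interval on the mean error excludes zero, the model has a systematic bias that calibration should remove; the RMSE then summarizes the remaining scatter.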
Kuswandi, Bambang; Irmawati, Titi; Hidayat, Moch Amrun; Jayus; Ahmad, Musa
2014-01-01
A simple visual ethanol biosensor based on alcohol oxidase (AOX) immobilised onto polyaniline (PANI) film for halal verification of fermented beverage samples is described. This biosensor responds to ethanol via a colour change from green to blue, due to the enzymatic reaction of ethanol that produces acetaldehyde and hydrogen peroxide, when the latter oxidizes the PANI film. The procedure to obtain this biosensor consists of the immobilization of AOX onto PANI film by adsorption. For the immobilisation, an AOX solution is deposited on the PANI film and left at room temperature until dried (30 min). The biosensor was constructed as a dip stick for visual and simple use. The colour changes of the films have been scanned and analysed using image analysis software (i.e., ImageJ) to study the characteristics of the biosensor's response toward ethanol. The biosensor has a linear response in an ethanol concentration range of 0.01%–0.8%, with a correlation coefficient (r) of 0.996. The limit of detection of the biosensor was 0.001%, with reproducibility (RSD) of 1.6% and a lifetime of up to seven weeks when stored at 4 °C. The biosensor provides accurate results for ethanol determination in fermented drinks and was in good agreement with the standard method (gas chromatography) results. Thus, the biosensor could be used as a simple visual method for ethanol determination in fermented beverage samples, which can be useful to the Muslim community for halal verification. PMID:24473284
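The calibration figures quoted above (linear range, correlation coefficient r, limit of detection) come from fitting a straight line to concentration vs. response data. A minimal sketch, with hypothetical concentrations and colour-channel readings (not the paper's data) and the common 3-sigma LOD convention assumed:

```python
import statistics

def linear_calibration(conc, response):
    """Least-squares calibration line and Pearson correlation r."""
    mx, my = statistics.mean(conc), statistics.mean(response)
    sxy = sum((x - mx) * (y - my) for x, y in zip(conc, response))
    sxx = sum((x - mx) ** 2 for x in conc)
    syy = sum((y - my) ** 2 for y in response)
    slope = sxy / sxx
    intercept = my - slope * mx
    r = sxy / (sxx * syy) ** 0.5
    return slope, intercept, r

def limit_of_detection(blank_sd, slope):
    """Common 3-sigma convention: LOD = 3 * SD(blank) / slope."""
    return 3 * blank_sd / slope

conc = [0.0, 0.2, 0.4, 0.8]       # ethanol concentration (%)
response = [1.0, 1.4, 1.8, 2.6]   # hypothetical image-analysis readings
slope, intercept, r = linear_calibration(conc, response)
lod = limit_of_detection(0.01, slope)  # 0.01 = assumed blank SD
```

An r close to 1 over the working range justifies reading unknown concentrations off the fitted line; the LOD marks where the signal can no longer be distinguished from blank noise.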
Astrobiology's Central Dilemma: How can we detect Life if we cannot even Define it?
NASA Astrophysics Data System (ADS)
Clark, B. C.
2001-11-01
Culling and consolidating from a collection of 102 attributes asserted as properties of Life, and the numerous Definitions of Life which invoke them, a new definition is proposed. Analysis of the pathways to proving that any given entity, from micro-sample to planetary object, harbors one or more lifeforms provides strategies for the observations, experiments and detection approaches. These are necessarily varied because of the relative accessibility/inaccessibility of the samples themselves, for example, from Mars, Europa, the ancient Earth or extra-solar system planets. A two-tiered Definition of Life has been formulated, involving both Lifeform and Organism. Devising exploration strategies with a reasonable probability of success and acceptance should proceed along the steps needed for detection and verification of the minimal properties which define Life itself. Multiple approaches, such as high resolution remote spectroscopy for detection of biomarker gases, in situ demonstrations of energy utilization to perform functions such as anabolic or catabolic transformations, achievement of demonstrated reproduction through multi-condition incubations, and probes for macromolecular biochemicals which indicate information storage should be undertaken wherever possible, as should return of samples to terrestrial laboratories for more versatile, more sensitive and more definitive examinations. Use of control samples is paramount, as is detailed understanding of the chemistry and physics of the environment which constrains the activities and tracers being sought.
Nagy, Tamás; van Lien, René; Willemsen, Gonneke; Proctor, Gordon; Efting, Marieke; Fülöp, Márta; Bárdos, György; Veerman, Enno C I; Bosch, Jos A
2015-07-01
Salivary alpha-amylase (sAA) is used as a sympathetic (SNS) stress marker, though its release is likely co-determined by SNS and parasympathetic (PNS) activation. The SNS and PNS show asynchronous changes during acute stressors, and sAA responses may thus vary with sample timing. Thirty-four participants underwent an eight-minute memory task (MT) and cold pressor task (CPT). Cardiovascular SNS (pre-ejection period, blood pressure) and PNS (heart rate variability) activity were monitored continuously. Unstimulated saliva was collected repeatedly during and after each laboratory stressor, and sAA concentration (U/ml) and secretion (U/minute) determined. Both stressors increased anxiety. The MT caused an immediate and continued cardiac SNS activation, but sAA concentration increased at task cessation only (+54%); i.e., when there was SNS-PNS co-activation. During the MT sAA secretion even decreased (-35%) in conjunction with flow rate and vagal tone. The CPT robustly increased blood pressure but not sAA. In summary, sAA fluctuations did not parallel changes in cardiac SNS activity or anxiety. sAA responses seem contingent on sample timing and flow rate, likely involving both SNS and PNS influences. Verification using other stressors and contexts seems warranted. Copyright © 2015 Elsevier B.V. All rights reserved.
A Secure Region-Based Geographic Routing Protocol (SRBGR) for Wireless Sensor Networks
Adnan, Ali Idarous; Hanapi, Zurina Mohd; Othman, Mohamed; Zukarnain, Zuriati Ahmad
2017-01-01
Due to the lack of dependency for routing initiation and an inadequately allocated sextant on responding messages, the secure geographic routing protocols for Wireless Sensor Networks (WSNs) have attracted considerable attention. However, the existing protocols are more likely to drop packets when legitimate nodes fail to respond to the routing initiation messages while attackers in the allocated sextant manage to respond. Furthermore, these protocols are designed with an inefficient collection window and inadequate verification criteria, which may lead to a high number of attacker selections. To prevent the failure to find an appropriate relay node and undesirable packet retransmission, this paper presents the Secure Region-Based Geographic Routing Protocol (SRBGR) to increase the probability of selecting the appropriate relay node. By extending the allocated sextant and applying different message contention priorities, more legitimate nodes can be admitted to the routing process. Moreover, the paper also proposes a bounded collection window for sufficient collection time and verification cost, for both attacker identification and isolation. Extensive simulation experiments have been performed to evaluate the performance of the proposed protocol in comparison with other existing protocols. The results demonstrate that SRBGR increases network performance in terms of the packet delivery ratio and isolates attacks such as Sybil and Black hole. PMID:28121992
Diode step stress program, JANTX1N5614
NASA Technical Reports Server (NTRS)
1978-01-01
The reliability of switching diode JANTX1N5614 was tested. The effect of power/temperature step stress on the diode was determined. Control sample units were maintained for verification of the electrical parametric testing. Results are reported.
Trebitz, Anett S; Hoffman, Joel C; Darling, John A; Pilgrim, Erik M; Kelly, John R; Brown, Emily A; Chadderton, W Lindsay; Egan, Scott P; Grey, Erin K; Hashsham, Syed A; Klymus, Katy E; Mahon, Andrew R; Ram, Jeffrey L; Schultz, Martin T; Stepien, Carol A; Schardt, James C
2017-11-01
Following decades of ecologic and economic impacts from a growing list of nonindigenous and invasive species, government and management entities are committing to systematic early-detection monitoring (EDM). This has reinvigorated investment in the science underpinning such monitoring, as well as the need to convey that science in practical terms to those tasked with EDM implementation. Using the context of nonindigenous species in the North American Great Lakes, this article summarizes the current scientific tools and knowledge - including limitations, research needs, and likely future developments - relevant to various aspects of planning and conducting comprehensive EDM. We begin with the scope of the effort, contrasting target-species with broad-spectrum monitoring, reviewing information to support prioritization based on species and locations, and exploring the challenge of moving beyond individual surveys towards a coordinated monitoring network. Next, we discuss survey design, including effort to expend and its allocation over space and time. A section on sample collection and analysis overviews the merits of collecting actual organisms versus shed DNA, reviews the capabilities and limitations of identification by morphology, DNA target markers, or DNA barcoding, and examines best practices for sample handling and data verification. We end with a section addressing the analysis of monitoring data, including methods to evaluate survey performance and characterize and communicate uncertainty. 
Although the body of science supporting EDM implementation is already substantial, research and information needs (many already actively being addressed) include: better data to support risk assessments that guide choice of taxa and locations to monitor; improved understanding of spatiotemporal scales for sample collection; further development of DNA target markers, reference barcodes, genomic workflows, and synergies between DNA-based and morphology-based taxonomy; and tools and information management systems for better evaluating and communicating survey outcomes and uncertainty. Published by Elsevier Ltd.
DOE Office of Scientific and Technical Information (OSTI.GOV)
L. M. Dittmer
2006-10-19
The 1607-F7, 141-M Building Septic Tank waste site was a septic tank and drain field that received sanitary sewage from the former 141-M Building. Remedial action was performed in August and November 2005. The results of verification sampling demonstrate that residual contaminant concentrations support future unrestricted land uses that can be represented by a rural-residential scenario. These results also show that residual concentrations support unrestricted future use of shallow zone soil and that contaminant levels remaining in the soil are protective of groundwater and the Columbia River.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Damiani, Rick
This manual summarizes the theory and preliminary verifications of the JacketSE module, which is an offshore jacket sizing tool that is part of the Wind-Plant Integrated System Design & Engineering Model toolbox. JacketSE is based on a finite-element formulation and on user-prescribed inputs and design standards' criteria (constraints). The physics are highly simplified, with a primary focus on satisfying ultimate limit states and modal performance requirements. Preliminary validation work included comparing industry data and verification against ANSYS, a commercial finite-element analysis package. The results are encouraging, and future improvements to the code are recommended in this manual.
Classical verification of quantum circuits containing few basis changes
NASA Astrophysics Data System (ADS)
Demarie, Tommaso F.; Ouyang, Yingkai; Fitzsimons, Joseph F.
2018-04-01
We consider the task of verifying the correctness of quantum computation for a restricted class of circuits which contain at most two basis changes. This contains circuits giving rise to the second level of the Fourier hierarchy, the lowest level for which there is an established quantum advantage. We show that when the circuit has an outcome with probability at least the inverse of some polynomial in the circuit size, the outcome can be checked in polynomial time with bounded error by a completely classical verifier. This verification procedure is based on random sampling of computational paths and is only possible given knowledge of the likely outcome.
High-pressure swing system for measurements of radioactive fission gases in air samples
NASA Astrophysics Data System (ADS)
Schell, W. R.; Vives-Battle, J.; Yoon, S. R.; Tobin, M. J.
1999-01-01
Radionuclides emitted from nuclear reactors, fuel reprocessing facilities and nuclear weapons tests are distributed widely in the atmosphere but have very low concentrations. As part of the Comprehensive Test Ban Treaty (CTBT), identification and verification of the emission of radionuclides from such sources are fundamental in maintaining nuclear security. To detect underground and underwater nuclear weapons tests, only the gaseous components need to be analyzed. Equipment has now been developed that can be used to collect large volumes of air, separate and concentrate the radioactive gas constituents, such as xenon and krypton, and measure them quantitatively. By measuring xenon isotopes with different half-lives, the time since the fission event can be determined. Developments in high-pressure (3500 kPa) swing chromatography using molecular sieve adsorbents have provided the means to collect and purify trace quantities of the gases from large volumes of air automatically. New scintillation detectors, together with timing and pulse shaping electronics, have provided the low-background levels essential in identifying the gamma ray, X-ray, and electron energy spectra of specific radionuclides. System miniaturization and portability with remote control could be designed for a field-deployable production model.
NASA Astrophysics Data System (ADS)
Nunnallee, Edmund Pierce, Jr.
1980-03-01
This dissertation consists of an investigation into the empirical scaling of a digital echo integrator for assessment of a population of juvenile sockeye salmon in Cultus Lake, British Columbia, Canada. The scaling technique was developed over the last ten years for use with totally uncalibrated but stabilized data collection and analysis equipment, and has been applied to populations of fish over a wide geographical range. This is the first investigation into the sources of bias and the accuracy of the technique, however, and constitutes a verification of the method. The initial section of the investigation describes hydroacoustic data analysis methods for estimation of effective sampling volume, which is necessary for estimation of fish density. The second section consists of a computer simulation of effective sample volume estimation by this empirical method and is used to investigate the degree of bias introduced by electronic and physical parameters such as boat speed-fish depth interaction effects, electronic thresholding and saturation, transducer beam angle, fish depth stratification by size, and spread of the target strength distribution of the fish. Comparisons of simulation predictions of sample volume estimation bias to actual survey results are given at the end of this section. A verification of the scaling method is then presented by comparison of a hydroacoustically derived estimation of the Cultus Lake smolt population to an independent and concurrent estimate made by counting the migrant fish as they passed through a weir in the outlet stream of the lake. Finally, the effect on conduct and accuracy of hydroacoustic assessment of juvenile sockeye salmon due to several behavioral traits are discussed. These traits include movements of presmolt fish in a lake just prior to their outmigration, daily vertical migrations and the emergence and dispersal of sockeye fry in Cultus Lake. 
In addition, a comparison of the summer depth preferences of the fish over their entire geographical distribution on the west coast of the U.S. and Canada are discussed in terms of hydroacoustic accessibility.
SU-E-T-49: A Multi-Institutional Study of Independent Dose Verification for IMRT
DOE Office of Scientific and Technical Information (OSTI.GOV)
Baba, H; Tachibana, H; Kamima, T
2015-06-15
Purpose: AAPM TG-114 does not cover independent verification for IMRT. We conducted a study of independent dose verification for IMRT in seven institutes to show its feasibility. Methods: 384 IMRT plans for the prostate and head and neck (HN) sites were collected from the institutes, where planning was performed using Eclipse and Pinnacle3 with the two techniques of step and shoot (S&S) and sliding window (SW). All of the institutes used the same independent dose verification software program (Simple MU Analysis: SMU, Triangle Product, Ishikawa, JP), which is Clarkson-based and uses CT images to compute radiological path length. An ion-chamber measurement in a water-equivalent slab phantom was performed to compare the doses computed using the TPS and the independent dose verification program. Additionally, the agreement in dose computed in patient CT images between the TPS and the SMU was assessed. The dose of the composite beams in the plan was evaluated. Results: The agreement between the measurement and the SMU was −2.3±1.9 % and −5.6±3.6 % for the prostate and HN sites, respectively. The agreement between the TPSs and the SMU was −2.1±1.9 % and −3.0±3.7 % for the prostate and HN sites, respectively. There was a negative systematic difference with similar standard deviations, and the difference was larger for the HN site. The S&S technique showed a statistically significant difference from the SW technique, because the Clarkson-based method in the independent program cannot account for, and therefore underestimates, the dose under the MLC. Conclusion: Accuracy would be improved if the Clarkson-based algorithm were modified for IMRT, and the tolerance level would then be within 5%.
Federal Register 2010, 2011, 2012, 2013, 2014
2011-08-03
... management, retrofit strategies, home performance verification, and sustainable construction fundamentals... submitted to the Office of Management and Budget (OMB) for review, as required by the Paperwork Reduction... (2528--Pending) and should be sent to: HUD Desk Officer, Office of Management and Budget, New Executive...
Federal Register 2010, 2011, 2012, 2013, 2014
2010-01-12
... information or comments relative to alternative energy-related uses of the OCS; certified verification agents... evidence that agent is authorized to act for bidder; if applicable, submit information to support delay in... title, eligibility and other qualifications; and evidence that agent is authorized to execute assignment...
Top DoD Management Challenges, Fiscal Year 2018
2018-01-01
• Afghan Human Resource Information Management System to validate ANDSF personnel numbers and salaries;
• Afghan Personnel Pay System to facilitate...unit strength accountability and personnel verification; and
• Core Information Management System to improve accountability of equipment inventories...
ACQUISITION AND CONTRACT MANAGEMENT
Federal Acquisition Regulation requires contractor performance information be collected in the Contractor
Federal Register 2010, 2011, 2012, 2013, 2014
2010-03-24
... DEPARTMENT OF HOMELAND SECURITY U.S. Citizenship and Immigration Services [OMB Control No. 1615... Employment Eligibility Verification; OMB Control No. 1615- 0112. The Department of Homeland Security, U.S..., should be directed to the Department of Homeland Security (DHS), and to the Office of Management and...
78 FR 6852 - Agency Information Collection (Income Verification) Activity Under OMB Review
Federal Register 2010, 2011, 2012, 2013, 2014
2013-01-31
... refer to ``OMB Control No. 2900-0518'' in any correspondence. FOR FURTHER INFORMATION CONTACT: Crystal...., Washington, DC 20420, (202) 632-7492, FAX (202) 632-7583 or email crystal[email protected] . Please refer to.... Estimated Annual Burden: 15,000 hours. Frequency of Response: One time. Estimated Number of Respondents: 30...
75 FR 4100 - Enterprise Income Verification (EIV) System-Debts Owed to PHAs and Terminations
Federal Register 2010, 2011, 2012, 2013, 2014
2010-01-26
... Paperwork Reduction Act. The Department is soliciting public comments on the subject proposal. This information collection is required to identify families who no longer participate in a HUD rental assistance program due to adverse termination of tenancy and/or assistance, and owe a debt to a Public Housing...
Federal Register 2010, 2011, 2012, 2013, 2014
2013-03-14
... State Income and Eligibility Verification Provisions of the Deficit Reduction Act of 1984, Extension... the Deficit Reduction Act of 1984, which expires September 30, 2013. A copy of the proposed..., Attention: Patricia Mertens. Telephone number: 202-693-3182 (this is not a toll-free number). Fax: 202-693...
The forest and the trees: Applications for molecular markers in the Pecan Breeding Program
USDA-ARS?s Scientific Manuscript database
Inventory specific verification of accession identity is crucial to the function of the National Collection of Genetic Resources (NCGR) for Pecans and Hickories, and is an increasingly important component of the USDA ARS Pecan Breeding Program. The foundation of the NCGR is the living trees maintai...
This project will contribute valuable information on the performance characteristics of new technology for use in infrastructure rehabilitation, and will provide additional credibility to the U.S. Environment Protection Agency’s (EPA) Office of Research and Development’s (ORD) fo...
2013-09-30
STATEMENT A. Approved for public release; distribution is unlimited. Soundscapes Michael B...models to provide hindcasts, nowcasts, and forecasts of the time-evolving soundscape. In terms of the types of sound sources, we will focus initially on...APPROACH The research has two principal thrusts: 1) the modeling of the soundscape, and 2) verification using datasets that have been collected
2012-09-30
STATEMENT A. Approved for public release; distribution is unlimited. Soundscapes Michael B...models to provide hindcasts, nowcasts, and forecasts of the time-evolving soundscape. In terms of the types of sound sources, we will focus initially on...APPROACH The research has two principal thrusts: 1) the modeling of the soundscape, and 2) verification using datasets that have been collected
49 CFR 1572.17 - Applicant information required for TWIC security threat assessment.
Code of Federal Regulations, 2011 CFR
2011-10-01
... Border Protection Arrival-Departure Record, Form I-94. (9) Except as described in paragraph (a)(9)(i) of... renew a TWIC, must submit biometric information to be used for identity verification purposes. If an individual cannot provide the selected biometric, TSA will collect an alternative biometric identifier. (d...
49 CFR 1572.17 - Applicant information required for TWIC security threat assessment.
Code of Federal Regulations, 2010 CFR
2010-10-01
... Border Protection Arrival-Departure Record, Form I-94. (9) Except as described in paragraph (a)(9)(i) of... renew a TWIC, must submit biometric information to be used for identity verification purposes. If an individual cannot provide the selected biometric, TSA will collect an alternative biometric identifier. (d...
49 CFR 1572.17 - Applicant information required for TWIC security threat assessment.
Code of Federal Regulations, 2013 CFR
2013-10-01
... Border Protection Arrival-Departure Record, Form I-94. (9) Except as described in paragraph (a)(9)(i) of... renew a TWIC, must submit biometric information to be used for identity verification purposes. If an individual cannot provide the selected biometric, TSA will collect an alternative biometric identifier. (d...
49 CFR 1572.17 - Applicant information required for TWIC security threat assessment.
Code of Federal Regulations, 2014 CFR
2014-10-01
... Border Protection Arrival-Departure Record, Form I-94. (9) Except as described in paragraph (a)(9)(i) of... renew a TWIC, must submit biometric information to be used for identity verification purposes. If an individual cannot provide the selected biometric, TSA will collect an alternative biometric identifier. (d...
49 CFR 1572.17 - Applicant information required for TWIC security threat assessment.
Code of Federal Regulations, 2012 CFR
2012-10-01
... Border Protection Arrival-Departure Record, Form I-94. (9) Except as described in paragraph (a)(9)(i) of... renew a TWIC, must submit biometric information to be used for identity verification purposes. If an individual cannot provide the selected biometric, TSA will collect an alternative biometric identifier. (d...
Federal Register 2010, 2011, 2012, 2013, 2014
2012-05-15
... (LVC) will serve as the means by which the U.S. Department of Education (the Department) collects certain information from commercial holders of Federal Family Education Loan (FFEL) Program loans that a... DEPARTMENT OF EDUCATION Notice of Submission for OMB Review; Federal Student Aid; Loan...
Federal Register 2010, 2011, 2012, 2013, 2014
2012-09-10
... DEPARTMENT OF HOMELAND SECURITY U.S. Citizenship and Immigration Services [OMB Control Number 1615.... On August 22, 2012 the Department of Homeland Security, U.S. Citizenship and Immigration Services.... Citizenship and Immigration Services, Department of Homeland Security. [FR Doc. 2012-22138 Filed 9-7-12; 8:45...
Calibration of a distributed routing rainfall-runoff model at four urban sites near Miami, Florida
Doyle, W. Harry; Miller, Jeffrey E.
1980-01-01
Urban stormwater data from four Miami, Fla., catchments were collected and compiled by the U.S. Geological Survey and were used for testing the applicability of deterministic modeling for characterizing stormwater flows from small land-use areas. A description of model calibration and verification is presented for: (1) a 40.8-acre single-family residential area, (2) a 58.3-acre highway area, (3) a 20.4-acre commercial area, and (4) a 14.7-acre multifamily residential area. Rainfall-runoff data for 80, 108, 114, and 52 storms at sites 1, 2, 3, and 4, respectively, were collected, analyzed, and stored on direct-access files. Rainfall and runoff data for these storms (at 1-minute time intervals) were used in flow-modeling simulation analyses. A distributed routing Geological Survey rainfall-runoff model was used to determine rainfall excess and route overland and channel flows at each site. Optimization of soil-moisture-accounting and infiltration parameters was performed during the calibration phases. The results of this study showed that, with qualifications, an acceptable verification of the Geological Survey model can be achieved. (Kosco-USGS)
Modeling tidal hydrodynamics of San Diego Bay, California
Wang, P.-F.; Cheng, R.T.; Richter, K.; Gross, E.S.; Sutton, D.; Gartner, J.W.
1998-01-01
In 1983, current data were collected by the National Oceanic and Atmospheric Administration using mechanical current meters. During 1992 through 1996, acoustic Doppler current profilers as well as mechanical current meters and tide gauges were used. These measurements not only document tides and tidal currents in San Diego Bay, but also provide independent data sets for model calibration and verification. A high resolution (100-m grid), depth-averaged, numerical hydrodynamic model has been implemented for San Diego Bay to describe essential tidal hydrodynamic processes in the bay. The model is calibrated using the 1983 data set and verified using the more recent 1992-1996 data. Discrepancies between model predictions and field data in both model calibration and verification are on the order of the magnitude of uncertainties in the field data. The calibrated and verified numerical model has been used to quantify residence time and dilution and flushing of contaminant effluent into San Diego Bay. Furthermore, the numerical model has become an important research tool in ongoing hydrodynamic and water quality studies and in guiding future field data collection programs.
K-12th grade students as active contributors to research investigations
NASA Astrophysics Data System (ADS)
Rock, Barrett N.; Lauten, Gary N.
1996-12-01
The Earth Day: Forest Watch Program at the University of New Hampshire utilizes morphological and anatomical measurements made on branch and needle samples from eastern white pine (Pinus strobus), collected by K-12 students throughout New Hampshire and Maine. White pine is considered to be a bio-indicator species for ozone exposure. A University research project which monitors the response of white pine to elevated levels of tropospheric ozone has been developed by the authors, who incorporate student-made measurements such as needle length, occurrence of diagnostic foliar symptoms, needle retention, and cellular levels of damage, into an on-going project which characterizes conifer response to a variety of air pollutants. The research team compares classroom measurements with laboratory spectral reflectance measurements made on student-collected branch samples, and infers state-of-health conditions in white pine from the two-state area. These state-of-health data are, in turn, compared with State-monitored tropospheric ozone measurements on a yearly basis, resulting in change-over-time analysis of both regional ozone levels and relative levels of tree health. Based on the work to date (1991-1996), student-derived data have been found to correlate well with spectral parameters and with spatial patterns of summer ozone levels, suggesting that student measurements represent an accurate and reliable source of data for research scientists. Specific examples of student datasets and comparisons with reflectance data and how these can be used for Landsat data verification are presented, along with a discussion of the importance of being able to assess the accuracy of student data. Research scientists need to recognize the tremendous potential for access to reliable data represented by student data-collection programs such as Earth Day: Forest Watch.
Digital questionnaire platform in the Danish Blood Donor Study.
Burgdorf, K S; Felsted, N; Mikkelsen, S; Nielsen, M H; Thørner, L W; Pedersen, O B; Sørensen, E; Nielsen, K R; Bruun, M T; Werge, T; Erikstrup, C; Hansen, T; Ullum, H
2016-10-01
The Danish Blood Donor Study (DBDS) is a prospective, population-based study and biobank. Since 2010, 100,000 Danish blood donors have been included in the study. Prior to July 2015, all participating donors had to complete a paper-based questionnaire. Here we describe the establishment of a digital tablet-based questionnaire platform implemented in blood bank sites across Denmark. The digital questionnaire was developed using the open-source survey software tool LimeSurvey. Participants access the questionnaire online over a standard SSL-encrypted HTTP connection using their personal civil registration numbers. The questionnaire is placed on a front-end web server, and a collection server retrieves the completed questionnaires. Data from blood samples, register data, genetic data and verification of signed informed consent are then transferred to and merged with the questionnaire data in the DBDS database. The digital platform enables personalized questionnaires, presenting only questions relevant to the specific donor by hiding follow-up questions made irrelevant by the screening question results. New versions of questionnaires are immediately available at all blood collection facilities when new projects are initiated. The digital platform is a faster, more cost-effective and more flexible way to collect valid data from participating donors than paper-based questionnaires. The overall system can be used around the world over an Internet connection, but the required level of security depends on the sensitivity of the data to be collected. Copyright © 2016 The Authors. Published by Elsevier Ireland Ltd. All rights reserved.
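The conditional-display behaviour described in this abstract, showing follow-up questions only when a screening answer makes them relevant, can be sketched as follows. The question names and the screening-to-follow-up mapping here are illustrative assumptions, not details of the DBDS/LimeSurvey implementation:

```python
# Each screening question maps to the follow-up questions it unlocks.
# These names are hypothetical examples, not actual DBDS questions.
SCREENING_FOLLOWUPS = {
    "smokes": ["cigarettes_per_day", "years_smoking"],
    "takes_medication": ["medication_names"],
}

def visible_questions(base_questions, answers):
    """Return the question list a donor actually sees: all base (screening)
    questions, plus follow-ups for any screening question answered 'yes'."""
    shown = list(base_questions)
    for screening, followups in SCREENING_FOLLOWUPS.items():
        if answers.get(screening) == "yes":
            shown.extend(followups)
    return shown

# A donor who smokes but takes no medication sees the smoking follow-ups
# while the medication follow-ups stay hidden.
donor_answers = {"smokes": "yes", "takes_medication": "no"}
print(visible_questions(["smokes", "takes_medication"], donor_answers))
```

In a survey engine such as LimeSurvey this logic is expressed as per-question display conditions rather than application code, but the effect on the donor's screen is the same.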
1988-01-01
under field conditions. Sampling and analytical laboratory activities were performed by Ecology and Environment, Inc., and California Analytical...the proposed AER3 test conditions. All test samples would be obtained onsite by Ecology and Environment, Inc., of Buffalo, New York, and sent to...ensuring its safe operation. Ecology and Environment performed onsite verification sampling. This activity was coordinated with the Huber project team
Transistor step stress testing program for JANTX2N2905A
NASA Technical Reports Server (NTRS)
1979-01-01
The effect of power/temperature step stress when applied to the transistor JANTX2N2905A manufactured by Texas Instruments and Motorola is reported. A total of 48 samples from each manufacturer were submitted to the process outlined. In addition, two control sample units were maintained for verification of the electrical parametric testing. All test samples were subjected to the electrical tests outlined in Table 2 after completing the prior power/temperature step stress point.
Tempia, S; Salman, M D; Keefe, T; Morley, P; Freier, J E; DeMartini, J C; Wamwayi, H M; Njeumi, F; Soumaré, B; Abdi, A M
2010-12-01
A cross-sectional sero-survey, using a two-stage cluster sampling design, was conducted between 2002 and 2003 in ten administrative regions of central and southern Somalia, to estimate the seroprevalence and geographic distribution of rinderpest (RP) in the study area, as well as to identify potential risk factors for the observed seroprevalence distribution. The study was also used to test the feasibility of the spatially integrated investigation technique in nomadic and semi-nomadic pastoral systems. In the absence of a systematic list of livestock holdings, the primary sampling units were selected by generating random map coordinates. A total of 9,216 serum samples were collected from cattle aged 12 to 36 months at 562 sampling sites. Two apparent clusters of RP seroprevalence were detected. Four potential risk factors associated with the observed seroprevalence were identified: the mobility of cattle herds, the cattle population density, the proximity of cattle herds to cattle trade routes and cattle herd size. Risk maps were then generated to assist in designing more targeted surveillance strategies. The observed seroprevalence in these areas declined over time. In subsequent years, similar seroprevalence studies in neighbouring areas of Kenya and Ethiopia also showed a very low seroprevalence of RP or the absence of antibodies against RP. The progressive decline in RP antibody prevalence is consistent with virus extinction. Verification of freedom from RP infection in the Somali ecosystem is currently in progress.
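The first stage of the cluster design above, selecting primary sampling units by generating random map coordinates in the absence of a holdings list, can be sketched as follows. The bounding box is a rough illustration, not the actual study-area geometry; a real survey would also mask the draws to land and to the ten administrative regions:

```python
import random

def random_sampling_sites(n, lat_range, lon_range, seed=None):
    """Draw n random (lat, lon) primary sampling units uniformly
    within a rectangular bounding box."""
    rng = random.Random(seed)
    return [(rng.uniform(*lat_range), rng.uniform(*lon_range))
            for _ in range(n)]

# 562 sites, matching the number of sampling sites reported in the study;
# the coordinate ranges are approximate placeholders.
sites = random_sampling_sites(562, lat_range=(0.0, 5.0),
                              lon_range=(41.0, 48.0), seed=42)
print(len(sites))
```

Seeding the generator makes the site list reproducible, which matters when field teams must later navigate to the same coordinates.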
NASA Astrophysics Data System (ADS)
Mahmood, H.; Siddique, M. R. H.; Akhter, M.
2016-08-01
Estimations of biomass, volume and carbon stock are important in the decision-making process for the sustainable management of a forest. These estimations can be conducted by using available allometric equations of biomass and volume. The present study aims to: i. develop a compilation of verified allometric equations of biomass, volume, and carbon for trees and shrubs of Bangladesh, and ii. identify the gaps and scope for further development of allometric equations for different trees and shrubs of Bangladesh. Key stakeholders (government departments, research organizations, academic institutions, and potential individual researchers) were identified considering their involvement in the use and development of allometric equations. A list of documents containing allometric equations was prepared from secondary sources. The documents were collected, examined, and sorted to avoid repetition, yielding 50 documents. These equations were tested through a quality control scheme involving operational verification, conceptual verification, applicability, and statistical credibility. A total of 517 allometric equations for 80 species of trees, shrubs, palm, and bamboo were recorded. In addition, 222 allometric equations for 39 species were validated through the quality control scheme. Among the verified equations, 20%, 12% and 62% were for green biomass, oven-dried biomass, and volume respectively, and 4 tree species contributed 37% of the total verified equations. Five gaps were pinpointed in the existing allometric equations of Bangladesh: a. little work on allometric equations of common tree and shrub species, b. most of the work concentrated on certain species, c. a very small proportion of allometric equations for biomass estimation, d. no allometric equations for belowground biomass and carbon estimation, and e. a low proportion of valid allometric equations.
It is recommended that site and species specific allometric equations should be developed and consistency in field sampling, sample processing, data recording and selection of allometric equations should be maintained to ensure accuracy in estimation of biomass, volume, and carbon stock in different forest types of Bangladesh.
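Many catalogued allometric equations take a simple power-law form relating biomass to diameter at breast height (DBH). A minimal sketch of applying one, with placeholder coefficients rather than values from the Bangladesh compilation:

```python
def power_law_biomass(dbh_cm, a, b):
    """Estimate above-ground biomass (kg) from DBH (cm) using a
    power-law allometric equation: biomass = a * DBH**b.
    Coefficients a and b are species- and site-specific fitted values."""
    return a * dbh_cm ** b

# Hypothetical coefficients for illustration only (a = 0.12, b = 2.4).
estimate_kg = power_law_biomass(25.0, a=0.12, b=2.4)
```

Because a and b are fitted per species and site, applying an equation outside the DBH range and locality it was developed for is one of the sources of error the quality control scheme above is designed to catch.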
2008-02-28
An ER-2 high-altitude Earth science aircraft banks away during a flight over the southern Sierra Nevada. NASA’s Armstrong Flight Research Center operates two of the Lockheed-built aircraft on a wide variety of environmental science, atmospheric sampling, and satellite data verification missions.
Advanced Curation: Solving Current and Future Sample Return Problems
NASA Technical Reports Server (NTRS)
Fries, M.; Calaway, M.; Evans, C.; McCubbin, F.
2015-01-01
Advanced Curation is a wide-ranging and comprehensive research and development effort at NASA Johnson Space Center that identifies and remediates sample related issues. For current collections, Advanced Curation investigates new cleaning, verification, and analytical techniques to assess their suitability for improving curation processes. Specific needs are also assessed for future sample return missions. For each need, a written plan is drawn up to achieve the requirement. The plan draws upon current Curation practices, input from Curators, the analytical expertise of the Astromaterials Research and Exploration Science (ARES) team, and suitable standards maintained by ISO, IEST, NIST and other institutions. Additionally, new technologies are adopted on the basis of need and availability. Implementation plans are tested using customized trial programs with statistically robust courses of measurement, and are iterated if necessary until an implementable protocol is established. Upcoming and potential NASA missions such as OSIRIS-REx, the Asteroid Retrieval Mission (ARM), sample return missions in the New Frontiers program, and Mars sample return (MSR) all feature new difficulties and specialized sample handling requirements. The Mars 2020 mission in particular poses a suite of challenges since the mission will cache martian samples for possible return to Earth. In anticipation of future MSR, the following problems are among those under investigation: What is the most efficient means to achieve the less than 1.0 ng/sq cm total organic carbon (TOC) cleanliness required for all sample handling hardware? How do we maintain and verify cleanliness at this level? The Mars 2020 Organic Contamination Panel (OCP) predicts that organic carbon, if present, will be present at the "one to tens" of ppb level in martian near-surface samples. The same samples will likely contain wt% perchlorate salts, or approximately 1,000,000x as much perchlorate oxidizer as organic carbon.
The chemical kinetics of this reaction are poorly understood at present under the conditions of cached or curated martian samples. Among other parameters, what is the maximum temperature allowed during storage in order to preserve native martian organic compounds for analysis? What is the best means to collect headspace gases from cached martian (and other) samples? This gas will contain not only martian atmosphere but also off-gassed volatiles from the cached solids.
Measuring Data Quality Through a Source Data Verification Audit in a Clinical Research Setting.
Houston, Lauren; Probst, Yasmine; Humphries, Allison
2015-01-01
Health data have long been scrutinised in relation to data quality and integrity problems. Currently, no internationally accepted or "gold standard" method exists for measuring data quality and error rates within datasets. We conducted a source data verification (SDV) audit on a prospective clinical trial dataset. An audit plan was applied to conduct 100% manual verification checks on a 10% random sample of participant files. A quality assurance rule was developed whereby, if >5% of data variables were incorrect, a second 10% random sample would be extracted from the trial dataset. Errors were coded as: correct, incorrect (valid or invalid), not recorded, or not entered. Audit-1 had a total error of 33% and audit-2 36%. The physiological section was the only audit section to have <5% error. Data not recorded to case report forms had the greatest impact on error calculations. A significant association (p=0.00) was found between audit-1 and audit-2 and whether or not data were deemed correct or incorrect. Our study developed a straightforward method to perform a SDV audit. An audit rule was identified and error coding was implemented. Findings demonstrate that monitoring data quality by a SDV audit can identify data quality and integrity issues within clinical research settings, allowing quality improvements to be made. The authors suggest this approach be implemented in future research.
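The audit rule described above (100% checks on a 10% random sample of files, with a second sample triggered when more than 5% of variables are incorrect) can be sketched as follows. The file representation and the `verify` callback are illustrative assumptions; in practice each file yields one code per data variable rather than one per file:

```python
import random

# Error codes from the abstract; "incorrect" split into valid/invalid.
ERROR_CODES = {"correct", "incorrect_valid", "incorrect_invalid",
               "not_recorded", "not_entered"}

def sdv_audit(files, verify, sample_frac=0.10, threshold=0.05, seed=0):
    """One SDV audit pass: check a random sample of files and return
    (error_rate, second_audit_needed)."""
    rng = random.Random(seed)
    n = max(1, round(len(files) * sample_frac))
    codes = [verify(f) for f in rng.sample(files, n)]
    error_rate = sum(code != "correct" for code in codes) / len(codes)
    return error_rate, error_rate > threshold
```

If the first pass exceeds the threshold, the same function is called again with a different seed to draw the second 10% sample, mirroring the two-audit sequence reported in the study.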