Airell, Asa; Lindbäck, Emma; Ataker, Ferda; Pörnull, Kirsti Jalakas; Wretlind, Bengt
2005-06-01
We compared 956 samples analysed by the AMPLICOR Neisseria gonorrhoeae polymerase chain reaction (PCR) assay (Roche), with species verification using the 16S rRNA gene versus verification using the gyrA gene; culture served as the reference method. The gyrA verification uses pyrosequencing of the quinolone resistance-determining region of gyrA. Of 52 samples with optical density ≥0.2 in PCR, 27 were negative in culture, two pharyngeal samples were false negative in culture and four pharyngeal samples were false positive in verification with 16S rRNA. Twenty-five samples showed growth of gonococci; 18 of the corresponding PCR samples were verified by both methods, three urine samples were positive only in gyrA, and one pharynx specimen was positive only in 16S rRNA. Three samples were lost. We conclude that AMPLICOR N. gonorrhoeae PCR with verification in the gyrA gene can be considered a diagnostic tool in populations with a low prevalence of gonorrhoea and that pharynx specimens should not be analysed by PCR.
DOE Office of Scientific and Technical Information (OSTI.GOV)
L. M. Dittmer
2007-12-03
The 100-F-26:10 waste site includes sanitary sewer lines that serviced the former 182-F, 183-F, and 151-F Buildings. In accordance with this evaluation, the verification sampling results support a reclassification of this site to Interim Closed Out. The results of verification sampling show that residual contaminant concentrations do not preclude any future uses and allow for unrestricted use of shallow zone soils. The results also demonstrate that residual contaminant concentrations are protective of groundwater and the Columbia River.
DOE Office of Scientific and Technical Information (OSTI.GOV)
L. M. Dittmer
2008-03-18
The 100-F-26:15 waste site consisted of the remnant portions of underground process effluent and floor drain pipelines that originated at the 105-F Reactor. In accordance with this evaluation, the verification sampling results support a reclassification of this site to Interim Closed Out. The results of verification sampling show that residual contaminant concentrations do not preclude any future uses and allow for unrestricted use of shallow zone soils. The results also demonstrate that residual contaminant concentrations are protective of groundwater and the Columbia River.
International Space Station Requirement Verification for Commercial Visiting Vehicles
NASA Technical Reports Server (NTRS)
Garguilo, Dan
2017-01-01
The COTS program demonstrated that NASA could rely on commercial providers for safe, reliable, and cost-effective cargo delivery to the ISS. The ISS Program has developed a streamlined process to safely integrate commercial visiting vehicles and ensure requirements are met: levy a minimum requirement set (down from thousands to hundreds) focusing on the ISS interface and safety, reducing the level of NASA oversight/insight and the burden on the commercial partner. Partners provide a detailed verification and validation plan documenting how they will show they have met NASA requirements. NASA conducts process sampling to ensure that the established verification process is being followed. NASA participates in joint verification events and analyses for requirements that both parties must verify. Verification compliance is approved by NASA, and launch readiness is certified at mission readiness reviews.
Federal Register 2010, 2011, 2012, 2013, 2014
2012-11-14
... subsequent soil samples showed levels of metals at or below generic residential criteria or background values... 1994- 1996 and additional sampling between 1998 and 2007. Area A--Site Entrance: Soil boring samples... verification samples. Additional soil samples were collected from the same location as the previous collection...
DOE Office of Scientific and Technical Information (OSTI.GOV)
J. M. Capron
2008-04-29
The 100-F-26:12 waste site was an approximately 308-m-long, 1.8-m-diameter east-west-trending reinforced concrete pipe that joined the North Process Sewer Pipelines (100-F-26:1) and the South Process Pipelines (100-F-26:4) with the 1.8-m reactor cooling water effluent pipeline (100-F-19). In accordance with this evaluation, the verification sampling results support a reclassification of this site to Interim Closed Out. The results of verification sampling show that residual contaminant concentrations do not preclude any future uses and allow for unrestricted use of shallow zone soils. The results also demonstrate that residual contaminant concentrations are protective of groundwater and the Columbia River.
Federal Register 2010, 2011, 2012, 2013, 2014
2011-07-13
... Service [Docket No. FSIS-2008-0008] Salmonella Verification Sampling Program: Response to Comments on New... establishments that participate in SIP. The Agency intends to conduct its own unannounced, small- set sampling to... considering publishing verification sampling results for other product classes. In the 2006 Federal Register...
47 CFR 73.151 - Field strength measurements to establish performance of directional antennas.
Code of Federal Regulations, 2010 CFR
2010-10-01
... verified either by field strength measurement or by computer modeling and sampling system verification. (a... specifically identified by the Commission. (c) Computer modeling and sample system verification of modeled... performance verified by computer modeling and sample system verification. (1) A matrix of impedance...
DOE Office of Scientific and Technical Information (OSTI.GOV)
L. M. Dittmer
2008-01-31
The 116-C-3 waste site consisted of two underground storage tanks designed to receive mixed waste from the 105-C Reactor Metals Examination Facility chemical dejacketing process. Confirmatory evaluation and subsequent characterization of the site determined that the southern tank contained approximately 34,000 L (9,000 gal) of dejacketing wastes, and that the northern tank was unused. In accordance with this evaluation, the verification sampling and modeling results support a reclassification of this site to Interim Closed Out. The results of verification sampling demonstrate that residual contaminant concentrations do not preclude any future uses and allow for unrestricted use of shallow zone soils. The results also show that residual contaminant concentrations are protective of groundwater and the Columbia River.
DOE Office of Scientific and Technical Information (OSTI.GOV)
L. M. Dittmer
2008-03-03
The 100-F-26:13 waste site is the network of process sewer pipelines that received effluent from the 108-F Biological Laboratory and discharged it to the 188-F Ash Disposal Area (126-F-1 waste site). The pipelines included one 0.15-m (6-in.)-, two 0.2-m (8-in.)-, and one 0.31-m (12-in.)-diameter vitrified clay pipe segments encased in concrete. In accordance with this evaluation, the verification sampling results support a reclassification of this site to Interim Closed Out. The results of verification sampling demonstrated that residual contaminant concentrations do not preclude any future uses and allow for unrestricted use of shallow zone soils. The results also showed that residual contaminant concentrations are protective of groundwater and the Columbia River.
The U.S. Environmental Protection Agency (EPA) has created the Environmental Technology Verification Program (ETV) to facilitate the deployment of innovative or improved environmental technologies through performance verification and dissemination of information. The goal of the ...
Koller, Marianne; Becker, Christian; Thiermann, Horst; Worek, Franz
2010-05-15
The purpose of this study was to check the applicability of different analytical methods for the identification of unknown nerve agents in human body fluids. Plasma and urine samples were spiked with nerve agents (plasma) or with their metabolites (urine) or were left blank. Seven random samples (35% of all samples) were selected for the verification test. Plasma was worked up for unchanged nerve agents and for regenerated nerve agents after fluoride-induced reactivation of nerve agent-inhibited butyrylcholinesterase. Both extracts were analysed by GC-MS. Metabolites were extracted from plasma and urine, respectively, and were analysed by LC-MS. The urinary metabolites and two blank samples could be identified without further measurements, plasma metabolites and blanks were identified in six of seven samples. The analysis of unchanged nerve agent provided five agents/blanks and the sixth agent after further investigation. The determination of the regenerated agents also provided only five clear findings during the first screening because of a rather noisy baseline. Therefore, the sample preparation was extended by a size exclusion step performed before addition of fluoride which visibly reduced baseline noise and thus improved identification of the two missing agents. The test clearly showed that verification should be performed by analysing more than one biomarker to ensure identification of the agent(s). Copyright (c) 2010 Elsevier B.V. All rights reserved.
NASA Astrophysics Data System (ADS)
Rieben, James C., Jr.
This study focuses on the effects of relevance and lab design on student learning within the chemistry laboratory environment. A general chemistry conductivity of solutions experiment and an upper level organic chemistry cellulose regeneration experiment were employed. In the conductivity experiment, the two main variables studied were the effect of relevant (or "real world") samples on student learning and a verification-based lab design versus a discovery-based lab design. With the cellulose regeneration experiment, the effect of a discovery-based lab design vs. a verification-based lab design was the sole focus. Evaluation surveys consisting of six questions were used at three different times to assess student knowledge of experimental concepts. In the general chemistry laboratory portion of this study, four experimental variants were employed to investigate the effect of relevance and lab design on student learning. These variants consisted of a traditional (or verification) lab design, a traditional lab design using "real world" samples, a new lab design employing real world samples/situations using unknown samples, and the new lab design using real world samples/situations that were known to the student. Data used in this analysis were collected during the Fall 08, Winter 09, and Fall 09 terms. For the second part of this study a cellulose regeneration experiment was employed to investigate the effects of lab design. A demonstration creating regenerated cellulose "rayon" was modified and converted to an efficient and low-waste experiment. In the first variant students tested their products and verified a list of physical properties. In the second variant, students filled in a blank physical property chart with their own experimental results for the physical properties. Results from the conductivity experiment show significant student learning of the effects of concentration on conductivity and how to use conductivity to differentiate solution types with the use of real world samples. In the organic chemistry experiment, results suggest that the discovery-based design improved student retention of the chain length differentiation by physical properties relative to the verification-based design.
40 CFR 1065.342 - Sample dryer verification.
Code of Federal Regulations, 2011 CFR
2011-07-01
...)(2) to remove water from the sample gas, verify the performance upon installation, after major... before the sample gas reaches the analyzer. For example water can negatively interfere with a CLD's NOX... time. You may run this verification on the sample dryer alone, but you must use the maximum gas flow...
40 CFR 1065.342 - Sample dryer verification.
Code of Federal Regulations, 2010 CFR
2010-07-01
...)(2) to remove water from the sample gas, verify the performance upon installation, after major... before the sample gas reaches the analyzer. For example water can negatively interfere with a CLD's NOX... time. You may run this verification on the sample dryer alone, but you must use the maximum gas flow...
NASA Technical Reports Server (NTRS)
Johnson, Kenneth L.; White, K. Preston, Jr.
2012-01-01
The NASA Engineering and Safety Center was requested to improve on the Best Practices document produced for the NESC assessment, Verification of Probabilistic Requirements for the Constellation Program, by giving a recommended procedure for using acceptance sampling by variables techniques. This recommended procedure would be used as an alternative to the potentially resource-intensive acceptance sampling by attributes method given in the document. This document contains the outcome of the assessment.
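The abstract does not reproduce the recommended variables procedure itself; for context, the sketch below shows the textbook known-sigma acceptance-sampling-by-variables plan (sample size n and acceptability constant k derived from two risk points) on which such procedures are typically built. The risk-point values, the lower specification limit, and all function names are illustrative assumptions, not taken from the NESC document.

```python
from math import ceil
from statistics import NormalDist, mean

def variables_plan(p1, alpha, p2, beta):
    """Known-sigma variables sampling plan for a one-sided (lower) specification limit.

    p1, alpha: fraction nonconforming and producer's risk at the 'good' quality level.
    p2, beta : fraction nonconforming and consumer's risk at the 'bad' quality level.
    Returns (n, k): sample size and acceptability constant.
    """
    z = NormalDist().inv_cdf
    z_p1, z_p2 = z(1 - p1), z(1 - p2)
    z_a, z_b = z(1 - alpha), z(1 - beta)
    n = ceil(((z_a + z_b) / (z_p1 - z_p2)) ** 2)
    k = (z_p1 * z_b + z_p2 * z_a) / (z_a + z_b)
    return n, k

def accept(sample, lower_spec, sigma, k):
    """Accept the lot if (x_bar - L) / sigma >= k."""
    return (mean(sample) - lower_spec) / sigma >= k

# Illustrative risk points: 1% nonconforming passed 95% of the time,
# 10% nonconforming passed only 10% of the time.
n, k = variables_plan(p1=0.01, alpha=0.05, p2=0.10, beta=0.10)
print(n, round(k, 3))  # sample size and acceptability constant

# Decision for an illustrative measured sample with known sigma:
print(accept(sample=[10.6, 10.4, 10.9, 10.5, 10.7, 10.8, 10.3, 10.6],
             lower_spec=9.0, sigma=0.8, k=k))
```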
Self-Verification and Depressive Symptoms in Marriage and Courtship: A Multiple Pathway Model.
ERIC Educational Resources Information Center
Katz, Jennifer; Beach, Steven R. H.
1997-01-01
Examines whether self-verifying feedback may lead to decreased depressive symptoms. Results, based on 138 married women and 258 dating women, showed full mediational effects in the married sample and partial effects in the dating sample. Findings suggest that partner self-verifying feedback may intensify the effect of self-esteem on depression.…
Alternative sample sizes for verification dose experiments and dose audits
NASA Astrophysics Data System (ADS)
Taylor, W. A.; Hansen, J. M.
1999-01-01
ISO 11137 (1995), "Sterilization of Health Care Products—Requirements for Validation and Routine Control—Radiation Sterilization", provides sampling plans for performing initial verification dose experiments and quarterly dose audits. Alternative sampling plans are presented which provide equivalent protection. These sampling plans can significantly reduce the cost of testing. These alternative sampling plans have been included in a draft ISO Technical Report (type 2). This paper examines the rationale behind the proposed alternative sampling plans. The protection provided by the current verification and audit sampling plans is first examined. Then methods for identifying equivalent plans are highlighted. Finally, methods for comparing the cost associated with the different plans are provided. This paper includes additional guidance for selecting between the original and alternative sampling plans not included in the technical report.
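Comparing the protection offered by the original and alternative plans comes down to comparing their operating characteristic (OC) curves. A minimal sketch of that comparison for single-stage attributes plans (pass the verification dose experiment if at most c of the n irradiated items test positive for growth) is shown below; the plan parameters are placeholders for illustration, not the values proposed in the technical report.

```python
from math import comb

def prob_pass(n, c, p):
    """OC value: probability of passing a plan (n items, accept if <= c positives)
    when each item independently tests positive with probability p."""
    return sum(comb(n, k) * p**k * (1 - p)**(n - k) for k in range(c + 1))

# Illustrative plans to compare (placeholders, not the ISO/TR values).
plans = {"plan A (n=100, c=2)": (100, 2), "plan B (n=40, c=0)": (40, 0)}

for p in (0.005, 0.01, 0.02, 0.05, 0.10):
    row = ", ".join(f"{name}: {prob_pass(n, c, p):.3f}" for name, (n, c) in plans.items())
    print(f"true positive fraction {p:.3f} -> {row}")
```

Two plans provide equivalent protection when their OC curves nearly coincide over the range of positive fractions of interest.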
40 CFR 1065.545 - Verification of proportional flow control for batch sampling.
Code of Federal Regulations, 2014 CFR
2014-07-01
... control for batch sampling. 1065.545 Section 1065.545 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) AIR POLLUTION CONTROLS ENGINE-TESTING PROCEDURES Performing an Emission Test Over Specified Duty Cycles § 1065.545 Verification of proportional flow control for batch sampling. For any...
Cleaning and Cleanliness Verification Techniques for Mars Returned Sample Handling
NASA Technical Reports Server (NTRS)
Mickelson, E. T.; Lindstrom, D. J.; Allton, J. H.; Hittle, J. D.
2002-01-01
Precision cleaning and cleanliness verification techniques are examined as a subset of a comprehensive contamination control strategy for a Mars sample return mission. Additional information is contained in the original extended abstract.
Formal verification of medical monitoring software using Z language: a representative sample.
Babamir, Seyed Morteza; Borhani, Mehdi
2012-08-01
Medical monitoring systems are useful aids that assist physicians in keeping patients under constant surveillance; however, whether the systems take sound decisions is a physician's concern. As a result, verification of the systems' behavior in monitoring patients is a matter of significance. In modern medical systems, patient monitoring is undertaken by software; therefore, software verification of modern medical systems has attracted attention. Such verification can be achieved by formal languages having mathematical foundations. Among others, the Z language is a suitable formal language that has been used for formal verification of systems. This study aims to present a constructive method to verify a representative sample of a medical system, in which the system is visually specified and formally verified against patient constraints stated in the Z language. Exploiting our past experience in formally modeling the Continuous Infusion Insulin Pump (CIIP), we use the CIIP system as a representative sample of medical systems in the present study. The system is responsible for monitoring a diabetic's blood sugar.
Peng, Jun; Chen, Yi-Ting; Chen, Chien-Lun; Li, Liang
2014-07-01
Large-scale metabolomics studies require a quantitative method to generate metabolome data over an extended period with high technical reproducibility. We report a universal metabolome-standard (UMS) method, in conjunction with chemical isotope labeling liquid chromatography-mass spectrometry (LC-MS), to provide long-term analytical reproducibility and facilitate metabolome comparison among different data sets. In this method, a UMS of a specific type of sample, labeled by one form of an isotope reagent, is prepared a priori. The UMS is spiked into individual samples labeled by another form of the isotope reagent in a metabolomics study. The resultant mixture is analyzed by LC-MS to provide relative quantification of the individual sample metabolome against the UMS. The UMS is independent of the study being undertaken as well as the time of analysis and is useful for profiling the same type of samples in multiple studies. In this work, the UMS method was developed and applied for a urine metabolomics study of bladder cancer. A UMS of human urine was prepared by (13)C2-dansyl labeling of a pooled sample from 20 healthy individuals. This method was first used to profile the discovery samples to generate a list of putative biomarkers potentially useful for bladder cancer detection and then used to analyze the verification samples about one year later. Within the discovery sample set, three-month technical reproducibility was examined using a quality control sample, which showed a mean CV of 13.9% and a median CV of 9.4% for all the quantified metabolites. Statistical analysis of the urine metabolome data showed a clear separation between the bladder cancer group and the control group in the discovery samples, which was confirmed by the verification samples. Receiver operating characteristic (ROC) analysis showed that the area under the curve (AUC) was 0.956 in the discovery data set and 0.935 in the verification data set. These results demonstrated the utility of the UMS method for long-term metabolomics and for discovering potential metabolite biomarkers for the diagnosis of bladder cancer.
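Two of the summary statistics quoted above, the technical CV of repeated quality-control injections and the ROC AUC separating cancer from control samples, are simple functions of the peak-ratio data. The sketch below illustrates both calculations with made-up numbers; the variable names and the rank-based AUC estimator are assumptions for illustration, not the authors' code.

```python
from statistics import mean, stdev

def percent_cv(values):
    """Coefficient of variation (%) across repeated QC measurements of one metabolite."""
    return 100.0 * stdev(values) / mean(values)

def roc_auc(cases, controls):
    """Rank-based (Mann-Whitney) AUC: probability that a random case exceeds a random control."""
    wins = sum(1.0 if c > k else 0.5 if c == k else 0.0 for c in cases for k in controls)
    return wins / (len(cases) * len(controls))

# Made-up light/heavy dansyl peak-area ratios for one metabolite.
qc_ratios = [1.02, 0.97, 1.05, 0.99, 1.01, 0.95, 1.04]   # QC sample injected repeatedly
cancer    = [2.1, 1.8, 2.4, 1.9, 2.6]                     # bladder cancer group
control   = [1.1, 1.3, 0.9, 1.2, 1.0]                     # control group

print(f"QC CV = {percent_cv(qc_ratios):.1f}%")
print(f"AUC   = {roc_auc(cancer, control):.3f}")
```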
Reconstruction based finger-knuckle-print verification with score level adaptive binary fusion.
Gao, Guangwei; Zhang, Lei; Yang, Jian; Zhang, Lin; Zhang, David
2013-12-01
Recently, a new biometric identifier, namely the finger knuckle print (FKP), has been proposed for personal authentication with very interesting results. One of the advantages of FKP verification lies in its user friendliness in data collection. However, the user flexibility in positioning fingers also leads to a certain degree of pose variation in the collected query FKP images. The widely used Gabor filtering based competitive coding scheme is sensitive to such variations, resulting in many false rejections. We propose to alleviate this problem by reconstructing the query sample with a dictionary learned from the template samples in the gallery set. The reconstructed FKP image can greatly reduce the enlarged matching distance caused by finger pose variations; however, both the intra-class and inter-class distances will be reduced. We then propose a score level adaptive binary fusion rule to adaptively fuse the matching distances before and after reconstruction, aiming to reduce the false rejections without greatly increasing the false acceptances. Experimental results on the benchmark PolyU FKP database show that the proposed method significantly improves the FKP verification accuracy.
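The paper's exact fusion rule is not reproduced in the abstract; the sketch below shows one plausible reading of a score-level adaptive binary fusion: the reconstructed-image matching distance is used only when the original distance is large enough to suggest pose variation, so genuine matches recover while most impostor comparisons are left untouched. The threshold values and function names are illustrative assumptions.

```python
def fuse_distance(d_orig, d_recon, switch_threshold):
    """Adaptive binary fusion of matching distances.

    If the original competitive-code distance already looks like a genuine match,
    keep it; otherwise (possible pose variation) fall back to the distance computed
    on the dictionary-reconstructed query image.
    """
    return d_orig if d_orig <= switch_threshold else d_recon

def verify(d_orig, d_recon, switch_threshold, accept_threshold):
    """Accept the claimed identity if the fused distance is small enough."""
    return fuse_distance(d_orig, d_recon, switch_threshold) <= accept_threshold

# A genuine query whose pose shift inflated the original distance: rescued by reconstruction.
print(verify(d_orig=0.48, d_recon=0.31, switch_threshold=0.40, accept_threshold=0.35))  # True
# An impostor whose distance stays large even after reconstruction: still rejected.
print(verify(d_orig=0.55, d_recon=0.52, switch_threshold=0.40, accept_threshold=0.35))  # False
```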
The U.S. Environmental Protection Agency (EPA) has created the Environmental Technology Verification Program (ETV) to facilitate the deployment of innovative or improved environmental technologies through performance verification and dissemination of information. The goal of the...
DOE Office of Scientific and Technical Information (OSTI.GOV)
R. A. Carlson
2006-02-23
The 1607-D4 Septic System was a septic tank and tile field that received sanitary sewage from the 115-D/DR Gas Recirculation Facility. This septic system operated from 1944 to 1968. Decommissioning took place in 1985 and 1986 when all above-grade features were demolished and the tank backfilled. The results of verification sampling demonstrated that residual contaminant concentrations do not preclude any future uses and allow for unrestricted use of shallow zone soils. The results also showed that residual contaminant concentrations are protective of groundwater and the Columbia River.
The U.S. Environmental Protection Agency has created the Environmental Technology Verification Program to facilitate the deployment of innovative or improved environmental technologies through performance verification and dissemination of information. The goal of the ETV Program...
2007-03-01
Characterisation. In Nanotechnology Aerospace Applications – 2006 (pp. 4-1 – 4-8). Educational Notes RTO-EN-AVT-129bis, Paper 4. Neuilly-sur-Seine, France: RTO. [Slide/figure residue: the commercialisation process stages (Concept, Idea, Proof-of-Principle, Trial Samples, Engineering Verification Samples, Design Verification Samples) and the SEIC (Systems Engineering for Commercialisation) chain linking design houses, engineering and R&D users, integrators, and fab/wafer/die processing.]
The U.S. EPA has created the Environmental Technology Verification program to facilitate the deployment of innovative or improved environmental technologies through performance verification and dissemination of information. The program tested the performance of baghouse filtrati...
This verification test was conducted according to procedures specified in the Test/QA Plan for Verification of Enzyme-Linked Immunosorbent Assay (ELISA) Test Kits for the Quantitative Determination of Endocrine Disrupting Compounds (EDCs) in Aqueous Phase Samples. Deviations to the...
DOE Office of Scientific and Technical Information (OSTI.GOV)
L. M. Dittmer
2006-10-19
The 1607-F7, 141-M Building Septic Tank waste site was a septic tank and drain field that received sanitary sewage from the former 141-M Building. Remedial action was performed in August and November 2005. The results of verification sampling demonstrate that residual contaminant concentrations support future unrestricted land uses that can be represented by a rural-residential scenario. These results also show that residual concentrations support unrestricted future use of shallow zone soil and that contaminant levels remaining in the soil are protective of groundwater and the Columbia River.
Classical verification of quantum circuits containing few basis changes
NASA Astrophysics Data System (ADS)
Demarie, Tommaso F.; Ouyang, Yingkai; Fitzsimons, Joseph F.
2018-04-01
We consider the task of verifying the correctness of quantum computation for a restricted class of circuits which contain at most two basis changes. This contains circuits giving rise to the second level of the Fourier hierarchy, the lowest level for which there is an established quantum advantage. We show that when the circuit has an outcome with probability at least the inverse of some polynomial in the circuit size, the outcome can be checked in polynomial time with bounded error by a completely classical verifier. This verification procedure is based on random sampling of computational paths and is only possible given knowledge of the likely outcome.
Automated biowaste sampling system urine subsystem operating model, part 1
NASA Technical Reports Server (NTRS)
Fogal, G. L.; Mangialardi, J. K.; Rosen, F.
1973-01-01
The urine subsystem automatically provides for the collection, volume sensing, and sampling of urine from six subjects during space flight. Verification of the subsystem design was a primary objective of the current effort, which was accomplished through the detailed design, fabrication, and verification testing of an operating model of the subsystem.
Improving semi-text-independent method of writer verification using difference vector
NASA Astrophysics Data System (ADS)
Li, Xin; Ding, Xiaoqing
2009-01-01
The semi-text-independent method of writer verification based on a linear framework can use all characters of two handwritings to discriminate the writers when the text contents are known. The handwritings are allowed to share only a small number of characters, or even totally different characters. This fills the gap between the classical text-dependent and text-independent methods of writer verification. Moreover, the identity of each character is exploited by the semi-text-independent method in this paper. Two types of standard templates, generated from many writer-unknown handwritten samples and printed samples of each character, are introduced to represent the content information of each character. The difference vectors of the character samples are obtained by subtracting the standard templates from the original feature vectors and are used to replace the original vectors in the process of writer verification. By removing a large amount of content information and retaining the style information, the verification accuracy of the semi-text-independent method is improved. On a handwriting database involving 30 writers, when the query handwriting and the reference handwriting are each composed of 30 distinct characters, the average equal error rate (EER) of writer verification reaches 9.96%. When the handwritings contain 50 characters, the average EER falls to 6.34%, which is 23.9% lower than the EER obtained without the difference vectors.
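A minimal sketch of the difference-vector idea and of the equal-error-rate (EER) computation used to report the results is given below. The toy feature vectors, the averaged-distance score, and the threshold sweep are assumptions for illustration; only the subtraction of per-character standard templates follows the description above, and the paper's linear scoring framework is not reproduced.

```python
import numpy as np

def difference_vectors(char_features, templates):
    """Subtract each character's standard template from its feature vector,
    removing content information and keeping writing-style information."""
    return {ch: feats - templates[ch] for ch, feats in char_features.items()}

def handwriting_distance(query, reference):
    """Toy style distance: mean Euclidean distance over characters present in both handwritings."""
    shared = set(query) & set(reference)
    return float(np.mean([np.linalg.norm(query[c] - reference[c]) for c in shared]))

def equal_error_rate(genuine_scores, impostor_scores):
    """Sweep a threshold over all observed distances and return the point where
    the false rejection rate and false acceptance rate are closest."""
    best = None
    for t in sorted(genuine_scores + impostor_scores):
        frr = np.mean([g > t for g in genuine_scores])
        far = np.mean([i <= t for i in impostor_scores])
        gap = abs(frr - far)
        if best is None or gap < best[0]:
            best = (gap, (frr + far) / 2)
    return best[1]

# Toy example: distances already computed for genuine and impostor handwriting pairs.
print(equal_error_rate(genuine_scores=[0.8, 1.1, 0.9, 1.3],
                       impostor_scores=[1.2, 1.8, 2.0, 1.6]))
```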
The Mars Science Laboratory Organic Check Material
NASA Astrophysics Data System (ADS)
Conrad, Pamela G.; Eigenbrode, Jennifer L.; Von der Heydt, Max O.; Mogensen, Claus T.; Canham, John; Harpold, Dan N.; Johnson, Joel; Errigo, Therese; Glavin, Daniel P.; Mahaffy, Paul R.
2012-09-01
Mars Science Laboratory's Curiosity rover carries a set of five external verification standards in hermetically sealed containers that can be sampled as would be a Martian rock, by drilling and then portioning into the solid sample inlet of the Sample Analysis at Mars (SAM) suite. Each organic check material (OCM) canister contains a porous ceramic solid, which has been doped with a fluorinated hydrocarbon marker that can be detected by SAM. The purpose of the OCM is to serve as a verification tool for the organic cleanliness of those parts of the sample chain that cannot be cleaned other than by dilution, i.e., repeated sampling of Martian rock. SAM possesses internal calibrants for verification of both its performance and its internal cleanliness, and the OCM is not used for that purpose. Each OCM unit is designed for one use only, and the choice to do so will be made by the project science group (PSG).
Corrigan, Damion K; Salton, Neale A; Preston, Chris; Piletsky, Sergey
2010-09-01
Cleaning verification is a scientific and economic problem for the pharmaceutical industry. A large amount of potential manufacturing time is lost to the process of cleaning verification. This involves the analysis of residues on soiled manufacturing equipment, with high-performance liquid chromatography (HPLC) being the predominantly employed analytical technique. The aim of this study was to develop a portable cleaning verification system for nelarabine using surface enhanced Raman spectroscopy (SERS). SERS was conducted using a portable Raman spectrometer and a commercially available SERS substrate. Samples of standard solutions and swab extracts were deposited onto the SERS active surfaces, allowed to dry and then subjected to spectroscopic analysis. Nelarabine was amenable to analysis by SERS and the necessary levels of sensitivity were achievable. It is possible to use this technology for a semi-quantitative limits test. Replicate precision, however, was poor due to the heterogeneous drying pattern of nelarabine on the SERS active surface. Understanding and improving the drying process in order to produce a consistent SERS signal for quantitative analysis is desirable. This work shows the potential application of SERS for cleaning verification analysis. SERS may not replace HPLC as the definitive analytical technique, but it could be used in conjunction with HPLC so that swabbing is only carried out once the portable SERS equipment has demonstrated that the manufacturing equipment is below the threshold contamination level.
NASA Astrophysics Data System (ADS)
Malik, A.; Setiawan, A.; Suhandi, A.; Permanasari, A.; Dirgantara, Y.; Yuniarti, H.; Sapriadil, S.; Hermita, N.
2018-01-01
This study aimed to investigate the improvement of pre-service teachers' communication skills through a Higher Order Thinking Laboratory (HOT Lab) on the electric circuit topic. The research used a quasi-experimental method with a pretest-posttest control group design. Research subjects were 60 Physics Education students at UIN Sunan Gunung Djati Bandung, chosen by a random sampling technique. Students' communication skill data were collected using an essay-form communication skills test instrument and observation sheets. The results showed that the communication skills of pre-service teachers using the HOT Lab were higher than those using the verification lab. Students' communication skills in the HOT Lab groups were not influenced by gender. Communication skills could increase because the HOT Lab is based on problem solving, which develops communication through hands-on activities. Therefore, this research concludes that the HOT Lab is more effective than the verification lab for improving the communication skills of pre-service teachers on the electric circuit topic, and that gender is not related to a person's communication skills.
DOE Office of Scientific and Technical Information (OSTI.GOV)
King, David A.
2012-08-16
Oak Ridge Associated Universities (ORAU) conducted in-process inspections and independent verification (IV) surveys in support of DOE's remedial efforts in Zone 1 of East Tennessee Technology Park (ETTP) in Oak Ridge, Tennessee. Inspections concluded that the remediation contractor's soil removal and survey objectives were satisfied and the dynamic verification strategy (DVS) was implemented as designed. Independent verification (IV) activities included gamma walkover surveys and soil sample collection/analysis over multiple exposure units (EUs).
Feasibility of biochemical verification in a web-based smoking cessation study.
Cha, Sarah; Ganz, Ollie; Cohn, Amy M; Ehlke, Sarah J; Graham, Amanda L
2017-10-01
Cogent arguments have been made against the need for biochemical verification in population-based studies with low-demand characteristics. Despite this fact, studies involving digital interventions (low-demand) are often required in peer review to report biochemically verified abstinence. To address this discrepancy, we examined the feasibility and costs of biochemical verification in a web-based study conducted with a national sample. Participants were 600 U.S. adult current smokers who registered on a web-based smoking cessation program and completed surveys at baseline and 3 months. Saliva sampling kits were sent to participants who reported 7-day abstinence at 3 months, and analyzed for cotinine. The response rate at 3 months was 41.2% (n=247): 93 participants reported 7-day abstinence (38%) and were mailed a saliva kit (71% returned). The discordance rate was 36.4%. Participants with discordant responses were more likely to report 3-month use of nicotine replacement therapy or e-cigarettes than those with concordant responses (79.2% vs. 45.2%, p=0.007). The total cost of saliva sampling was $8280 ($125/sample). Biochemical verification was both time- and cost-intensive, and yielded a relatively small number of samples due to low response rates and use of other nicotine products during the follow-up period. There was a high rate of discordance of self-reported abstinence and saliva testing. Costs for data collection may be prohibitive for studies with large sample sizes or limited budgets. Our findings echo previous statements that biochemical verification is not necessary in population-based studies, and add evidence specific to technology-based studies. Copyright © 2017 Elsevier Ltd. All rights reserved.
NASA Technical Reports Server (NTRS)
Johnson, Kenneth L.; White, K. Preston, Jr.
2012-01-01
The NASA Engineering and Safety Center was requested to improve on the Best Practices document produced for the NESC assessment, Verification of Probabilistic Requirements for the Constellation Program, by giving a recommended procedure for using acceptance sampling by variables techniques as an alternative to the potentially resource-intensive acceptance sampling by attributes method given in the document. In this paper, the results of empirical tests intended to assess the accuracy of acceptance sampling plan calculators implemented for six variable distributions are presented.
Kleene Algebra and Bytecode Verification
2016-04-27
computing the star (Kleene closure) of a matrix of transfer functions. In this paper we show how this general framework applies to the problem of Java ...bytecode verification. We show how to specify transfer functions arising in Java bytecode verification in such a way that the Kleene algebra operations...potentially improve the performance over the standard worklist algorithm when a small cutset can be found. Key words: Java , bytecode, verification, static
Haidar Ahmad, Imad A; Tam, James; Li, Xue; Duffield, William; Tarara, Thomas; Blasko, Andrei
2017-02-05
The parameters affecting the recovery of pharmaceutical residues from the surface of stainless steel coupons for quantitative cleaning verification method development have been studied, including active pharmaceutical ingredient (API) level, spiking procedure, API/excipient ratio, analyst-to-analyst variability, inter-day variability, and cleaning procedure of the coupons. The lack of a well-defined procedure that consistently cleaned the coupon surface was identified as the major contributor to low and variable recoveries. Assessment of acid, base, and oxidant washes, as well as the order of treatment, showed that a base-water-acid-water-oxidizer-water wash procedure resulted in consistent, accurate spiked recovery (>90%) and reproducible results (S_rel ≤ 4%). By applying this cleaning procedure to the previously used coupons that failed the cleaning acceptance criteria, multiple analysts were able to obtain consistent recoveries from day to day for different APIs and API/excipient ratios at various spike levels. We successfully applied our approach for cleaning verification of small molecules (MW < 1000 Da) as well as large biomolecules (MW up to 50,000 Da). Method robustness was greatly influenced by the sample preparation procedure, especially for analyses using total organic carbon (TOC) determination. Copyright © 2016 Elsevier B.V. All rights reserved.
[Tobacco quality analysis of producing areas of Yunnan tobacco using near-infrared (NIR) spectrum].
Wang, Yi; Ma, Xiang; Wen, Ya-Dong; Yu, Chun-Xia; Wang, Luo-Ping; Zhao, Long-Lian; Li, Jun-Hui
2013-01-01
In the present study, tobacco quality analysis of different producing areas was carried out applying spectrum projection and correlation methods. The industrial classification data were near-infrared (NIR) spectra, collected in 2010, of middle-stalk tobacco leaves from Hongta Tobacco (Group) Co., Ltd. A total of 1276 superior tobacco leaf samples were collected from four producing areas: three areas (Yuxi, Chuxiong and Zhaotong, in Yunnan province) growing the K326 variety and one area (Dali) growing the Hongda variety. The conclusions showed that when the samples were randomly divided into analysis and verification sets in a 2:1 ratio, the verification set corresponded with the analysis set under spectrum projection, as their correlation coefficients for the first and second dimensional projections were all above 0.99. The study also discussed a method to obtain quantitative similarity values for samples from different producing areas. These similarity values are instructive for tobacco plant planning, quality management, acquisition of tobacco raw materials and tobacco leaf blending.
He, Hua; McDermott, Michael P.
2012-01-01
Sensitivity and specificity are common measures of the accuracy of a diagnostic test. The usual estimators of these quantities are unbiased if data on the diagnostic test result and the true disease status are obtained from all subjects in an appropriately selected sample. In some studies, verification of the true disease status is performed only for a subset of subjects, possibly depending on the result of the diagnostic test and other characteristics of the subjects. Estimators of sensitivity and specificity based on this subset of subjects are typically biased; this is known as verification bias. Methods have been proposed to correct verification bias under the assumption that the missing data on disease status are missing at random (MAR), that is, the probability of missingness depends on the true (missing) disease status only through the test result and observed covariate information. When some of the covariates are continuous, or the number of covariates is relatively large, the existing methods require parametric models for the probability of disease or the probability of verification (given the test result and covariates), and hence are subject to model misspecification. We propose a new method for correcting verification bias based on the propensity score, defined as the predicted probability of verification given the test result and observed covariates. This is estimated separately for those with positive and negative test results. The new method classifies the verified sample into several subsamples that have homogeneous propensity scores and allows correction for verification bias. Simulation studies demonstrate that the new estimators are more robust to model misspecification than existing methods, but still perform well when the models for the probability of disease and probability of verification are correctly specified. PMID:21856650
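The abstract describes stratifying the verified sample on the estimated propensity of verification; a compact way to see why that propensity matters is the inverse-probability-weighted estimator sketched below, which under the same MAR assumption reweights verified subjects by 1 / P(verified | test result, covariates). This is a simplification for illustration (the paper stratifies on the propensity score rather than weighting by it), and all data, field names, and propensity values are made up.

```python
def corrected_sens_spec(subjects):
    """Verification-bias-corrected sensitivity and specificity via inverse probability weighting.

    Each subject is a dict with:
      'test'      : 1/0 diagnostic test result (observed for everyone)
      'verified'  : True if the true disease status was ascertained
      'disease'   : 1/0 true status (meaningful only when verified)
      'propensity': estimated P(verified | test result, covariates), assumed MAR
    """
    def wsum(pred):
        return sum(1.0 / s["propensity"] for s in subjects if s["verified"] and pred(s))

    sens = wsum(lambda s: s["disease"] == 1 and s["test"] == 1) / wsum(lambda s: s["disease"] == 1)
    spec = wsum(lambda s: s["disease"] == 0 and s["test"] == 0) / wsum(lambda s: s["disease"] == 0)
    return sens, spec

# Tiny made-up cohort: test-negative subjects were verified far less often (propensity 0.2)
# than test-positive subjects (propensity 0.9), which is what biases the naive estimates.
data = (
    [{"test": 1, "verified": True,  "disease": 1, "propensity": 0.9}] * 40
    + [{"test": 1, "verified": True,  "disease": 0, "propensity": 0.9}] * 10
    + [{"test": 0, "verified": True,  "disease": 1, "propensity": 0.2}] * 4
    + [{"test": 0, "verified": True,  "disease": 0, "propensity": 0.2}] * 30
    + [{"test": 0, "verified": False, "disease": None, "propensity": 0.2}] * 136
)
print(corrected_sens_spec(data))
```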
Multi-canister overpack project -- verification and validation, MCNP 4A
DOE Office of Scientific and Technical Information (OSTI.GOV)
Goldmann, L.H.
This supporting document contains the software verification and validation (V and V) package used for Phase 2 design of the Spent Nuclear Fuel Multi-Canister Overpack. V and V packages for both ANSYS and MCNP are included. Description of Verification Run(s): This software requires that it be compiled specifically for the machine it is to be used on. Therefore, to facilitate ease in the verification process, the software automatically runs 25 sample problems to ensure proper installation and compilation. Once the runs are completed, the software checks for verification by performing a file comparison on the new output file and the old output file. Any differences between any of the files will cause a verification error. Due to the manner in which the verification is completed, a verification error does not necessarily indicate a problem; it indicates that a closer look at the output files is needed to determine the cause of the error.
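A minimal sketch of the kind of installation check described above, running the bundled sample problems and flagging any output file that differs from the stored reference output, is shown below. The directory layout, file names, and the choice to report a count of differing lines (so an analyst can take the closer look the document calls for) are assumptions for illustration, not the MCNP/ANSYS V&V package itself.

```python
import difflib
from pathlib import Path

def compare_outputs(reference_dir, new_dir, pattern="*.out"):
    """Compare each new sample-problem output against its stored reference copy.
    Returns a dict mapping file name -> number of differing lines (0 = verified)."""
    results = {}
    for ref_file in sorted(Path(reference_dir).glob(pattern)):
        new_file = Path(new_dir) / ref_file.name
        old_lines = ref_file.read_text().splitlines()
        new_lines = new_file.read_text().splitlines()
        diff = [d for d in difflib.unified_diff(old_lines, new_lines, lineterm="")
                if d.startswith(("+", "-")) and not d.startswith(("+++", "---"))]
        results[ref_file.name] = len(diff)
    return results

# Hypothetical layout: the 25 sample problems with stored reference outputs.
# for name, n_diff in compare_outputs("reference_outputs", "new_outputs").items():
#     status = "ok" if n_diff == 0 else f"{n_diff} differing lines - inspect manually"
#     print(f"{name}: {status}")
```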
Madon, Stephanie; Guyll, Max; Buller, Ashley A.; Scherr, Kyle C.; Willard, Jennifer; Spoth, Richard
2010-01-01
This research examined whether self-fulfilling prophecy effects are mediated by self-verification, informational conformity, and modeling processes. The authors examined these mediational processes across multiple time frames with longitudinal data obtained from two samples of mother-child dyads (N1 = 487; N2 = 287). Children's alcohol use was the outcome variable. The results provided consistent support for the mediational process of self-verification. In both samples and across several years of adolescence, there was a significant indirect effect of mothers' beliefs on children's alcohol use through children's self-assessed likelihood of drinking alcohol in the future. Comparatively less support was found for informational conformity and modeling processes as mediators of mothers' self-fulfilling effects. The potential for self-fulfilling prophecies to produce long-lasting changes in targets' behavior via self-verification processes is discussed. PMID:18665708
Madon, Stephanie; Guyll, Max; Buller, Ashley A; Scherr, Kyle C; Willard, Jennifer; Spoth, Richard
2008-08-01
This research examined whether self-fulfilling prophecy effects are mediated by self-verification, informational conformity, and modeling processes. The authors examined these mediational processes across multiple time frames with longitudinal data obtained from two samples of mother-child dyads (N1 = 486; N2 = 287), with children's alcohol use as the outcome variable. The results provided consistent support for the mediational process of self-verification. In both samples and across several years of adolescence, there was a significant indirect effect of mothers' beliefs on children's alcohol use through children's self-assessed likelihood of drinking alcohol in the future. Comparatively less support was found for informational conformity and modeling processes as mediators of mothers' self-fulfilling effects. The potential for self-fulfilling prophecies to produce long-lasting changes in targets' behavior via self-verification processes is discussed. (c) 2008 APA, all rights reserved
Valentiner, David P; Skowronski, John J; McGrath, Patrick B; Smith, Sarah A; Renner, Kerry A
2011-10-01
A self-verification model of social anxiety views negative social self-esteem as a core feature of social anxiety. This core feature is proposed to be maintained through self-verification processes, such as by leading individuals with negative social self-esteem to prefer negative social feedback. This model is tested in two studies. In Study 1, questionnaires were administered to a college sample (N = 317). In Study 2, questionnaires were administered to anxiety disordered patients (N = 62) before and after treatment. Study 1 developed measures of preference for negative social feedback and social self-esteem, and provided evidence of their incremental validity in a college sample. Study 2 found that these two variables are not strongly related to fears of evaluation, are relatively unaffected by a treatment that targets such fears, and predict residual social anxiety following treatment. Overall, these studies provide preliminary evidence for a self-verification model of social anxiety.
2008-07-01
[Front-matter residue: list-of-tables entries for Leachate and Table 24, Smelter Site Soil Lettuce Germination Percentage.] ...sand soil (Table 22). This discovery was contrary to the hypothesized results. Archived samples of leachate from each treatment were examined... but after further investigation, the pH and EC of the New Jersey leachate showed no remarkable differences between the unamended or sand-unamended
Verification of spectrophotometric method for nitrate analysis in water samples
NASA Astrophysics Data System (ADS)
Kurniawati, Puji; Gusrianti, Reny; Dwisiwi, Bledug Bernanti; Purbaningtias, Tri Esti; Wiyantoko, Bayu
2017-12-01
The aim of this research was to verify the spectrophotometric method for analyzing nitrate in water samples using the APHA 2012 Section 4500 NO3-B method. The verification parameters used were linearity, method detection limit, level of quantitation, level of linearity, accuracy and precision. Linearity was established using 0 to 50 mg/L nitrate standard solutions, and the correlation coefficient of the calibration linear regression was 0.9981. The method detection limit (MDL) was determined to be 0.1294 mg/L and the limit of quantitation (LOQ) was 0.4117 mg/L. The level of linearity (LOL) was 50 mg/L, and nitrate concentrations from 10 to 50 mg/L were linear at a 99% confidence level. Accuracy was determined through the recovery value, which was 109.1907%. Precision was assessed as the percent relative standard deviation (%RSD) of repeatability, which was 1.0886%. The tested performance criteria showed that the methodology was verified under the laboratory conditions.
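The verification parameters quoted above are simple statistics of the calibration and replicate data; the sketch below shows how each could be computed. The standard concentrations, replicate values, spike level, and the single-tailed Student t factor for seven replicates in the MDL formula are assumptions used for illustration, not the study's raw data.

```python
from statistics import mean, stdev

def calibration_r(conc, response):
    """Pearson correlation coefficient of the calibration line."""
    cm, rm = mean(conc), mean(response)
    sxy = sum((c - cm) * (r - rm) for c, r in zip(conc, response))
    sxx = sum((c - cm) ** 2 for c in conc)
    syy = sum((r - rm) ** 2 for r in response)
    return sxy / (sxx * syy) ** 0.5

# Made-up calibration standards (mg/L) and absorbances.
conc = [0, 10, 20, 30, 40, 50]
resp = [0.002, 0.101, 0.198, 0.305, 0.402, 0.497]
print(f"calibration r = {calibration_r(conc, resp):.4f}")

# Made-up replicate results (mg/L) for a sample spiked at 0.40 mg/L.
replicates = [0.41, 0.44, 0.39, 0.43, 0.42, 0.40, 0.45]
spike_level = 0.40
t_99 = 3.143                                      # one-sided 99% Student t for 6 degrees of freedom

mdl = t_99 * stdev(replicates)                    # method detection limit
loq = 10 * stdev(replicates)                      # one common level-of-quantitation convention
recovery = 100 * mean(replicates) / spike_level   # accuracy expressed as % recovery
rsd = 100 * stdev(replicates) / mean(replicates)  # precision expressed as %RSD

print(f"MDL = {mdl:.3f} mg/L, LOQ = {loq:.3f} mg/L, recovery = {recovery:.1f}%, RSD = {rsd:.2f}%")
```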
An Approach to Biometric Verification Based on Human Body Communication in Wearable Devices
Li, Jingzhen; Liu, Yuhang; Nie, Zedong; Qin, Wenjian; Pang, Zengyao; Wang, Lei
2017-01-01
In this paper, an approach to biometric verification based on human body communication (HBC) is presented for wearable devices. For this purpose, the transmission gain S21 of volunteer’s forearm is measured by vector network analyzer (VNA). Specifically, in order to determine the chosen frequency for biometric verification, 1800 groups of data are acquired from 10 volunteers in the frequency range 0.3 MHz to 1500 MHz, and each group includes 1601 sample data. In addition, to achieve the rapid verification, 30 groups of data for each volunteer are acquired at the chosen frequency, and each group contains only 21 sample data. Furthermore, a threshold-adaptive template matching (TATM) algorithm based on weighted Euclidean distance is proposed for rapid verification in this work. The results indicate that the chosen frequency for biometric verification is from 650 MHz to 750 MHz. The false acceptance rate (FAR) and false rejection rate (FRR) based on TATM are approximately 5.79% and 6.74%, respectively. In contrast, the FAR and FRR were 4.17% and 37.5%, 3.37% and 33.33%, and 3.80% and 34.17% using K-nearest neighbor (KNN) classification, support vector machines (SVM), and naive Bayesian method (NBM) classification, respectively. In addition, the running time of TATM is 0.019 s, whereas the running times of KNN, SVM and NBM are 0.310 s, 0.0385 s, and 0.168 s, respectively. Therefore, TATM is suggested to be appropriate for rapid verification use in wearable devices. PMID:28075375
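A minimal sketch of the matching step described above, a weighted Euclidean distance between a 21-point S21 measurement and a per-user template, combined with a per-user acceptance threshold, is given below. The way the adaptive threshold is derived here (mean plus a multiple of the standard deviation of enrollment distances) and all the toy data are assumptions for illustration; the paper's exact TATM rule is not reproduced in the abstract.

```python
import numpy as np

def weighted_euclidean(sample, template, weights):
    """Weighted Euclidean distance between a 21-point S21 sample and the user's template."""
    return float(np.sqrt(np.sum(weights * (np.asarray(sample, float) - template) ** 2)))

def enroll(enrollment_samples, weights, k=2.0):
    """Build a template (mean curve) and an adaptive per-user threshold from enrollment data."""
    samples = np.asarray(enrollment_samples, dtype=float)
    template = samples.mean(axis=0)
    dists = [weighted_euclidean(s, template, weights) for s in samples]
    threshold = float(np.mean(dists) + k * np.std(dists))   # assumed form of the adaptive threshold
    return template, threshold

def verify(sample, template, threshold, weights):
    """Accept the wearer if the query curve is close enough to the enrolled template."""
    return weighted_euclidean(sample, template, weights) <= threshold

# Toy data: five enrollment curves and one query curve of 21 S21 values (dB), uniform weights.
rng = np.random.default_rng(0)
enrollment = -40.0 + rng.normal(0.0, 0.5, size=(5, 21))
weights = np.ones(21) / 21
template, threshold = enroll(enrollment, weights)
query = -40.0 + rng.normal(0.0, 0.5, size=21)
print(verify(query, template, threshold, weights))
```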
Working Memory Mechanism in Proportional Quantifier Verification
ERIC Educational Resources Information Center
Zajenkowski, Marcin; Szymanik, Jakub; Garraffa, Maria
2014-01-01
The paper explores the cognitive mechanisms involved in the verification of sentences with proportional quantifiers (e.g. "More than half of the dots are blue"). The first study shows that the verification of proportional sentences is more demanding than the verification of sentences such as: "There are seven blue and eight yellow…
[The Dose Effect of Isocenter Selection during IMRT Dose Verification with the 2D Chamber Array].
Xie, Chuanbin; Cong, Xiaohu; Xu, Shouping; Dai, Xiangkun; Wang, Yunlai; Han, Lu; Gong, Hanshun; Ju, Zhongjian; Ge, Ruigang; Ma, Lin
2015-03-01
The aim was to investigate the dose effect of isocenter selection during IMRT dose verification with a 2D chamber array. Samples collected from 10 patients were used to design IMRT plans whose isocenters were independently defined as P(o), P(x) and P(y); P(o) was fixed at the target center and the other points were shifted 8 cm from the target center in the x/y directions. The PTW 729 array was used for 2D dose verification in the three groups, with all plan beams set to 0 degrees. The γ-analysis passing rates for the whole plan and for each beam were obtained using different criteria in the three groups. The results showed that the mean γ-analysis passing rate was highest in the P(o) group and that the mean passing rate of the whole plan was better than that of the individual beams. In addition, the passing rate worsened with increasing dose leakage between the leaves in the P(y) group. Therefore, the choice of isocenter has a visible effect on IMRT dose verification with a 2D chamber array, and the isocenter of the planning design should be close to the geometric center of the target.
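For context, the γ-analysis passing rate referred to above scores each measured point against the planned dose distribution with combined dose-difference and distance-to-agreement criteria; a brute-force 2D sketch with an illustrative global 3%/3 mm criterion is shown below. The dose grids, spacing, and criteria are placeholders, not the study's data.

```python
import numpy as np

def gamma_pass_rate(ref, meas, spacing_mm, dd_percent=3.0, dta_mm=3.0, low_dose_cutoff=0.10):
    """Global-normalization gamma analysis of two 2D dose grids with identical geometry.
    Returns the fraction of evaluated points (above the low-dose cutoff) with gamma <= 1."""
    ref = np.asarray(ref, float)
    meas = np.asarray(meas, float)
    dd_abs = dd_percent / 100.0 * ref.max()                 # global dose-difference criterion
    ys, xs = np.meshgrid(np.arange(ref.shape[0]), np.arange(ref.shape[1]), indexing="ij")
    coords = np.stack([ys.ravel(), xs.ravel()], axis=1) * spacing_mm
    ref_flat = ref.ravel()

    gammas = []
    for point_mm, dose in zip(coords, meas.ravel()):
        if dose < low_dose_cutoff * ref.max():
            continue                                        # skip the low-dose region
        dist2 = np.sum((coords - point_mm) ** 2, axis=1)
        gamma2 = dist2 / dta_mm**2 + (dose - ref_flat) ** 2 / dd_abs**2
        gammas.append(np.sqrt(gamma2.min()))
    return float(np.mean([g <= 1.0 for g in gammas]))

# Toy example: a planned dose distribution vs. a slightly shifted, rescaled measurement.
plan = np.outer(np.hanning(20), np.hanning(20)) * 200.0    # cGy, made-up
measured = np.roll(plan, 1, axis=1) * 1.02                  # one-pixel shift plus 2% scaling
print(f"gamma passing rate: {gamma_pass_rate(plan, measured, spacing_mm=2.0):.1%}")
```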
Formal Multilevel Hierarchical Verification of Synchronous MOS VLSI Circuits.
1987-06-01
[Table-of-contents residue: 12.4 Capacitance Coupling; 12.5 Multiple Abstraction Functions.] ...depend on whether it is performing flat verification or hierarchical verification. The primary operations of Silica Pithecus when performing flat... signals never arise. The primary operation of Silica Pithecus when performing hierarchical verification is processing constraints to show they hold
Deductive Evaluation: Implicit Code Verification With Low User Burden
NASA Technical Reports Server (NTRS)
Di Vito, Ben L.
2016-01-01
We describe a framework for symbolically evaluating C code using a deductive approach that discovers and proves program properties. The framework applies Floyd-Hoare verification principles in its treatment of loops, with a library of iteration schemes serving to derive loop invariants. During evaluation, theorem proving is performed on-the-fly, obviating the generation of verification conditions normally needed to establish loop properties. A PVS-based prototype is presented along with results for sample C functions.
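To make the Floyd-Hoare treatment of loops concrete, the fragment below shows the kind of invariant an iteration scheme would supply for a simple accumulation loop, checked here at run time with assertions rather than proved deductively. The example and its names are illustrative and are not taken from the PVS-based prototype.

```python
def sum_of_squares(a):
    """Compute the sum of a[i]**2 with its Floyd-Hoare loop annotations made explicit."""
    total, i = 0, 0
    # Loop invariant: 0 <= i <= len(a) and total == sum of the squares of a[0..i-1].
    while i < len(a):
        assert 0 <= i <= len(a) and total == sum(x * x for x in a[:i])  # invariant holds on entry
        total += a[i] * a[i]
        i += 1
        assert total == sum(x * x for x in a[:i])                       # invariant re-established
    # Postcondition follows from the invariant and the negated loop guard (i == len(a)).
    assert total == sum(x * x for x in a)
    return total

print(sum_of_squares([3, 1, 4, 1, 5]))   # 52
```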
Improved Detection Technique for Solvent Rinse Cleanliness Verification
NASA Technical Reports Server (NTRS)
Hornung, S. D.; Beeson, H. D.
2001-01-01
The NASA White Sands Test Facility (WSTF) has an ongoing effort to reduce or eliminate usage of cleaning solvents such as CFC-113 and its replacements. These solvents are used in the final clean and cleanliness verification processes for flight and ground support hardware, especially for oxygen systems where organic contaminants can pose an ignition hazard. For the final cleanliness verification in the standard process, the equivalent of one square foot of surface area of parts is rinsed with the solvent, and the final 100 mL of the rinse is captured. The amount of nonvolatile residue (NVR) in the solvent is determined by weight after the evaporation of the solvent. An improved process of sampling this rinse, developed at WSTF, requires evaporation of less than 2 mL of the solvent to make the cleanliness verification. Small amounts of the solvent are evaporated in a clean stainless steel cup, and the cleanliness of the stainless steel cup is measured using a commercially available surface quality monitor. The effectiveness of this new cleanliness verification technique was compared to the accepted NVR sampling procedures. Testing with known contaminants in solution, such as hydraulic fluid, fluorinated lubricants, and cutting and lubricating oils, was performed to establish a correlation between amount in solution and the process response. This report presents the approach and results and discusses the issues in establishing the surface quality monitor-based cleanliness verification.
9 CFR 417.8 - Agency verification.
Code of Federal Regulations, 2014 CFR
2014-01-01
....8 Animals and Animal Products FOOD SAFETY AND INSPECTION SERVICE, DEPARTMENT OF AGRICULTURE... ANALYSIS AND CRITICAL CONTROL POINT (HACCP) SYSTEMS § 417.8 Agency verification. FSIS will verify the... plan or system; (f) Direct observation or measurement at a CCP; (g) Sample collection and analysis to...
9 CFR 417.8 - Agency verification.
Code of Federal Regulations, 2012 CFR
2012-01-01
....8 Animals and Animal Products FOOD SAFETY AND INSPECTION SERVICE, DEPARTMENT OF AGRICULTURE... ANALYSIS AND CRITICAL CONTROL POINT (HACCP) SYSTEMS § 417.8 Agency verification. FSIS will verify the... plan or system; (f) Direct observation or measurement at a CCP; (g) Sample collection and analysis to...
9 CFR 417.8 - Agency verification.
Code of Federal Regulations, 2013 CFR
2013-01-01
....8 Animals and Animal Products FOOD SAFETY AND INSPECTION SERVICE, DEPARTMENT OF AGRICULTURE... ANALYSIS AND CRITICAL CONTROL POINT (HACCP) SYSTEMS § 417.8 Agency verification. FSIS will verify the... plan or system; (f) Direct observation or measurement at a CCP; (g) Sample collection and analysis to...
9 CFR 417.8 - Agency verification.
Code of Federal Regulations, 2010 CFR
2010-01-01
....8 Animals and Animal Products FOOD SAFETY AND INSPECTION SERVICE, DEPARTMENT OF AGRICULTURE... ANALYSIS AND CRITICAL CONTROL POINT (HACCP) SYSTEMS § 417.8 Agency verification. FSIS will verify the... plan or system; (f) Direct observation or measurement at a CCP; (g) Sample collection and analysis to...
9 CFR 417.8 - Agency verification.
Code of Federal Regulations, 2011 CFR
2011-01-01
....8 Animals and Animal Products FOOD SAFETY AND INSPECTION SERVICE, DEPARTMENT OF AGRICULTURE... ANALYSIS AND CRITICAL CONTROL POINT (HACCP) SYSTEMS § 417.8 Agency verification. FSIS will verify the... plan or system; (f) Direct observation or measurement at a CCP; (g) Sample collection and analysis to...
Baghouse filtration products (BFPs) were evaluated by the Air Pollution Control Technology (APCT) pilot of the Environmental Technology Verification (ETV) Program. The performance factor verified was the mean outlet particle concentration for the filter fabric as a function of th...
The U.S. Environmental Protection Agency (EPA) has created the Environmental Technology Verification Program (ETV) to facilitate the deployment of innovative or improved environmental technologies and to design efficient processes for conducting performance tests of innovative technologies.
DOE Office of Scientific and Technical Information (OSTI.GOV)
P.C. Weaver
2009-02-17
Conduct verification surveys of grids at the DWI 1630 Site in Knoxville, Tennessee. The independent verification team (IVT) from ORISE conducted verification activities in whole and partial grids, as completed by BJC. ORISE site activities included gamma surface scans and soil sampling within 33 grids: G11 through G14; H11 through H15; X14, X15, X19, and X21; J13 through J15 and J17 through J21; K7 through K9 and K13 through K15; L13 through L15; and M14 through M16.
Physical property measurements on analog granites related to the joint verification experiment
NASA Astrophysics Data System (ADS)
Martin, Randolph J., III; Coyner, Karl B.; Haupt, Robert W.
1990-08-01
A key element in the JVE (Joint Verification Experiment), conducted jointly between the United States and the USSR, is the analysis of the geology and physical properties of the rocks at the respective test sites. A study was initiated to examine unclassified crystalline rock specimens obtained from areas near the Soviet site, Semipalatinsk, and appropriate analog samples selected from Mt. Katahdin, Maine. These rocks were also compared to Sierra White and Westerly Granite, which have been studied in great detail. Measurements performed to characterize these rocks were: (1) uniaxial strain with simultaneous compressional and shear wave velocities; (2) hydrostatic compression to 150 MPa with simultaneous compressional and shear wave velocities; (3) attenuation measurements as a function of frequency and strain amplitude for both dry and water-saturated conditions. Elastic moduli determined from the hydrostatic compression and uniaxial strain tests show that the rock matrix/mineral properties were comparable, with magnitudes varying within 25 percent from sample to sample. These properties appear to be approximately isotropic, especially at high pressures. However, anisotropy evident in certain samples at pressures below 35 MPa is attributed to dominant pre-existing microcrack populations and their alignments. The dependence of extensional attenuation and Young's modulus on strain amplitude was experimentally determined for intact Sierra White granite using the hysteresis loop technique.
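For context, dynamic elastic moduli of the kind compared above are conventionally derived from measured compressional and shear velocities and density. The sketch below uses the standard isotropic relations; the density and velocity values are placeholders, not data from the report.

```python
# Hedged illustration of how dynamic elastic moduli are commonly derived from the
# compressional (Vp) and shear (Vs) velocities measured in tests like those above.
# The density and velocity values below are placeholders, not data from the report.

def dynamic_moduli(rho: float, vp: float, vs: float) -> dict:
    """rho in kg/m^3, velocities in m/s; returns moduli in GPa plus Poisson's ratio."""
    g = rho * vs**2                      # shear modulus
    k = rho * (vp**2 - 4.0 * vs**2 / 3)  # bulk modulus
    e = 9 * k * g / (3 * k + g)          # Young's modulus
    nu = (3 * k - 2 * g) / (2 * (3 * k + g))
    return {"G_GPa": g / 1e9, "K_GPa": k / 1e9, "E_GPa": e / 1e9, "poisson": nu}

print(dynamic_moduli(rho=2650.0, vp=5500.0, vs=3200.0))
```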
Wang, Yi; Xiang, Ma; Wen, Ya-Dong; Yu, Chun-Xia; Wang, Luo-Ping; Zhao, Long-Lian; Li, Jun-Hui
2012-11-01
In this study, tobacco quality across the main industrial classifications of different years was analysed using spectrum projection and correlation methods. The data were near-infrared (NIR) spectra from Hongta Tobacco (Group) Co., Ltd.: 5730 industrially classified tobacco leaf samples from Yuxi in Yunnan province, collected by NIR spectroscopy from 2007 to 2010, covering different plant positions and colors and all belonging to the HONGDA tobacco variety. The results showed that, when the samples of a given year were randomly divided 2:1 into analysis and verification sets, the verification set corresponded with the analysis set under spectrum projection, with correlation coefficients above 0.98. The correlation coefficients between different years under spectrum projection were above 0.97; the highest was between 2008 and 2009 and the lowest between 2007 and 2010. The study also discusses a method for obtaining quantitative similarity values for different industrial classification samples. The similarity and consistency values are instructive for the combination and replacement of tobacco leaf in blending.
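As a rough illustration of the set-to-set comparison described above, the sketch below splits a spectral matrix 2:1 and correlates the mean spectra of the two sets. The array shapes, the synthetic spectra, and the use of mean spectra are assumptions for illustration only, not the paper's projection algorithm.

```python
# A minimal sketch of the kind of set-to-set comparison described above: split the
# samples 2:1 and report the Pearson correlation between the mean NIR spectra of the
# analysis and verification sets. Shapes and data are illustrative assumptions.
import numpy as np

def split_and_correlate(spectra: np.ndarray, rng: np.random.Generator) -> float:
    """spectra: (n_samples, n_wavelengths). Randomly split 2:1 into analysis and
    verification sets and return the correlation of their mean spectra."""
    idx = rng.permutation(len(spectra))
    cut = 2 * len(spectra) // 3
    analysis, verification = spectra[idx[:cut]], spectra[idx[cut:]]
    return float(np.corrcoef(analysis.mean(axis=0), verification.mean(axis=0))[0, 1])

rng = np.random.default_rng(0)
demo = rng.normal(size=(90, 256)) + np.linspace(0, 1, 256)  # synthetic spectra
print(split_and_correlate(demo, rng))
```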
Baghouse filtration products (BFPs) were evaluated by the Air Pollution Control Technology (APCT) pilot of the Environmental Technology Verification (ETV) Program. The performance factor verified was the mean outlet particle concentration for the filter fabric as a function of th...
An Efficient Location Verification Scheme for Static Wireless Sensor Networks.
Kim, In-Hwan; Kim, Bo-Sung; Song, JooSeok
2017-01-24
In wireless sensor networks (WSNs), the accuracy of location information is vital to support many interesting applications. Unfortunately, sensors have difficulty estimating their location when malicious sensors attack the location estimation process. Even though secure localization schemes have been proposed to protect the location estimation process from attacks, they are not enough to eliminate wrong location estimates in some situations. Location verification can be the solution to these situations or serve as a second line of defense. The problem with most location verification schemes is the explicit involvement of many sensors in the verification process and requirements such as special hardware, a dedicated verifier, or a trusted third party, which cause additional communication and computation overhead. In this paper, we propose an efficient location verification scheme for static WSNs called mutually-shared region-based location verification (MSRLV), which reduces these overheads by utilizing the implicit involvement of sensors and eliminating several requirements. To achieve this, we use the mutually-shared region between the location claimant and the verifier for location verification. The analysis shows that MSRLV reduces communication overhead by 77% and computation overhead by 92% on average, compared with other location verification schemes, in single-sensor verification. In addition, simulation results for verification of the whole network show that MSRLV can detect malicious sensors over 90% of the time when sensors in the network have five or more neighbors.
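The geometric intuition behind region-based checks can be sketched very simply: a verifier only accepts a claimed position that is consistent with the region both nodes can reach. The code below is an illustration of that concept under assumed disk-shaped radio ranges; it is not the MSRLV protocol itself, and all names and values are hypothetical.

```python
# Hedged sketch of the geometric idea behind region-based location verification:
# a verifier accepts a claimed position only if it falls inside the region that the
# claimant and verifier can both reach (here, the intersection of two radio disks).
# This is an illustration of the concept, not the MSRLV protocol itself.
import math

def in_shared_region(claimed, verifier_pos, claimant_range, verifier_range) -> bool:
    """claimed / verifier_pos are (x, y); ranges are radii of the radio disks."""
    d_to_verifier = math.dist(claimed, verifier_pos)
    # The claimed point must lie within the verifier's range, and the verifier heard
    # the claim, so it must also be consistent with the claimant's own range.
    return d_to_verifier <= verifier_range and d_to_verifier <= claimant_range

print(in_shared_region(claimed=(3.0, 4.0), verifier_pos=(0.0, 0.0),
                       claimant_range=6.0, verifier_range=5.0))
```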
Krasteva, Vessela; Jekova, Irena; Schmid, Ramun
2018-01-01
This study aims to validate the 12-lead electrocardiogram (ECG) as a biometric modality based on two straightforward binary QRS template matching characteristics. Different perspectives of the human verification problem are considered, regarding optimal lead selection and stability over sample size, gender, age, and heart rate (HR). A clinical 12-lead resting ECG database, including a population of 460 subjects with two-session recordings (>1 year apart), is used. Cost-effective strategies for extraction of personalized QRS patterns (100 ms) and binary template matching estimate similarity in the time scale (matching time) and dissimilarity in the amplitude scale (mismatch area). The two-class person verification task, taking the decision to validate or reject the subject identity, is managed by linear discriminant analysis (LDA). Non-redundant LDA models for different lead configurations (I, II, III, aVR, aVL, aVF, V1-V6) are trained on the first half of the subjects (230 subjects) by stepwise feature selection until maximization of the area under the receiver operating characteristic curve (ROC AUC). The operating point on the training ROC at equal error rate (EER) is tested on the independent dataset (the second half, 230 subjects) to report unbiased validation of test-ROC AUC and true verification rate (TVR = 100-EER). The test results are further evaluated in groups by sample size, gender, age, and HR. The optimal QRS pattern projection for a single-lead ECG biometric modality is found in the frontal plane sector (60° to 0°), with the best (Test-AUC/TVR) for lead II (0.941/86.8%) and a slight accuracy drop for -aVR (-0.017/-1.4%) and I (-0.01/-1.5%). Chest ECG leads have degrading accuracy from V1 (0.885/80.6%) to V6 (0.799/71.8%). The multi-lead ECG improves verification: 6-chest (0.97/90.9%), 6-limb (0.986/94.3%), 12-lead (0.995/97.5%). The QRS pattern matching model shows stable performance for verification of 10 to 230 individuals, with insignificant degradation of TVR in women (1.2-3.6%), adults ≥70 years (3.7%), younger subjects <40 years (1.9%), HR<60 bpm (1.2%), HR>90 bpm (3.9%), and no degradation for HR change (0 to >20 bpm).
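To make the two matching characteristics concrete, the sketch below computes one plausible reading of "matching time" (fraction of samples where a binarized QRS window agrees with the template) and "mismatch area" (accumulated amplitude difference). The binarization rule, window length, and data are assumptions for illustration, not the authors' exact definitions.

```python
# Hedged sketch of two template-matching features of the kind described above.
# The binarization rule here is an assumption for illustration.
import numpy as np

def qrs_match_features(template: np.ndarray, probe: np.ndarray) -> tuple[float, float]:
    """Both inputs: 1-D QRS windows of equal length, amplitude-normalized."""
    t_bin = template > template.mean()               # binary template
    p_bin = probe > probe.mean()
    matching_time = float(np.mean(t_bin == p_bin))   # similarity in the time scale
    mismatch_area = float(np.sum(np.abs(template - probe)))  # amplitude dissimilarity
    return matching_time, mismatch_area

rng = np.random.default_rng(1)
tpl = np.sin(np.linspace(0, np.pi, 100))             # stand-in 100 ms QRS pattern
print(qrs_match_features(tpl, tpl + 0.05 * rng.normal(size=100)))
```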
78 FR 5409 - Ongoing Equivalence Verifications of Foreign Food Regulatory Systems
Federal Register 2010, 2011, 2012, 2013, 2014
2013-01-25
... of data shared. Finally, with respect to POE re-inspections, NACMPI recommended the targeting of high-risk product and high-risk imports for sampling and other verification activities during reinspection... authority; the availability of contingency plans in the country for containing and mitigating the effects of...
Baghouse filtration products (BFPs) were evaluated by the Air Pollution Control Technology (APCT) Verification Center. The performance factor verified was the mean outlet particle concentration for the filter fabric as a function of the size of those particles equal to and smalle...
Baghouse filtration products (BFPs) were evaluated by the Air Pollution Control Technology (APCT) pilot of the Environmental Technology Verification (ETV) Program. The performance factor verified was the mean outlet particle concentration for the filter fabric as a function of th...
Baghouse filtration products (BFPs) were evaluated by the Air Pollution Control Technology (APCT) Verification Center. The performance factor verified was the mean outlet particle concentration for the filter fabric as a function of the size for particles equal to or smaller than...
Weak lensing magnification in the Dark Energy Survey Science Verification data
NASA Astrophysics Data System (ADS)
Garcia-Fernandez, M.; Sanchez, E.; Sevilla-Noarbe, I.; Suchyta, E.; Huff, E. M.; Gaztanaga, E.; Aleksić, J.; Ponce, R.; Castander, F. J.; Hoyle, B.; Abbott, T. M. C.; Abdalla, F. B.; Allam, S.; Annis, J.; Benoit-Lévy, A.; Bernstein, G. M.; Bertin, E.; Brooks, D.; Buckley-Geer, E.; Burke, D. L.; Carnero Rosell, A.; Carrasco Kind, M.; Carretero, J.; Crocce, M.; Cunha, C. E.; D'Andrea, C. B.; da Costa, L. N.; DePoy, D. L.; Desai, S.; Diehl, H. T.; Eifler, T. F.; Evrard, A. E.; Fernandez, E.; Flaugher, B.; Fosalba, P.; Frieman, J.; García-Bellido, J.; Gerdes, D. W.; Giannantonio, T.; Gruen, D.; Gruendl, R. A.; Gschwend, J.; Gutierrez, G.; James, D. J.; Jarvis, M.; Kirk, D.; Krause, E.; Kuehn, K.; Kuropatkin, N.; Lahav, O.; Lima, M.; MacCrann, N.; Maia, M. A. G.; March, M.; Marshall, J. L.; Melchior, P.; Miquel, R.; Mohr, J. J.; Plazas, A. A.; Romer, A. K.; Roodman, A.; Rykoff, E. S.; Scarpine, V.; Schubnell, M.; Smith, R. C.; Soares-Santos, M.; Sobreira, F.; Tarle, G.; Thomas, D.; Walker, A. R.; Wester, W.; DES Collaboration
2018-05-01
In this paper, the effect of weak lensing magnification on galaxy number counts is studied by cross-correlating the positions of two galaxy samples, separated by redshift, using the Dark Energy Survey Science Verification data set. This analysis is carried out for galaxies selected only by their photometric redshifts. An extensive analysis of the systematic effects, using new methods based on simulations, is performed, including a Monte Carlo sampling of the selection function of the survey.
Assessment of test methods for evaluating effectiveness of cleaning flexible endoscopes.
Washburn, Rebecca E; Pietsch, Jennifer J
2018-06-01
Strict adherence to each step of reprocessing is imperative to removing potentially infectious agents. Multiple methods for verifying proper reprocessing exist; however, each presents challenges and limitations, and best practice within the industry has not been established. Our goal was to evaluate endoscope cleaning verification tests, with particular interest in the evaluation of the manual cleaning step. The results of the cleaning verification tests were compared with microbial culturing to see if a positive cleaning verification test would be predictive of microbial growth. This study was conducted at 2 high-volume endoscopy units within a multisite health care system. Each of the 90 endoscopes was tested for adenosine triphosphate, protein, microbial growth via agar plate, and rapid gram-negative culture via assay. The endoscopes were tested in 3 locations: the instrument channel, control knob, and elevator mechanism. This analysis showed a substantial level of agreement between protein detection post-manual cleaning and protein detection post-high-level disinfection at the control head for scopes sampled sequentially. This study suggests that if protein is detected post-manual cleaning, there is a significant likelihood that protein will also be detected post-high-level disinfection. It also suggests that a cleaning verification test is not predictive of microbial growth. Copyright © 2018 Association for Professionals in Infection Control and Epidemiology, Inc. Published by Elsevier Inc. All rights reserved.
Development of Sample Verification System for Sample Return Missions
NASA Technical Reports Server (NTRS)
Toda, Risaku; McKinney, Colin; Jackson, Shannon P.; Mojarradi, Mohammad; Trebi-Ollennu, Ashitey; Manohara, Harish
2011-01-01
This paper describes the development of a proof-of-concept sample verification system (SVS) for in-situ mass measurement of planetary rock and soil samples in future robotic sample return missions. Our proof-of-concept SVS device contains a 10 cm diameter pressure-sensitive elastic membrane placed at the bottom of a sample canister. The membrane deforms under the weight of accumulating planetary sample. The membrane is positioned in proximity to an opposing substrate with a narrow gap. The deformation of the membrane narrows the gap, resulting in increased capacitance between the two nearly parallel plates. Capacitance readout circuitry on a nearby printed circuit board (PCB) transmits data via a low-voltage differential signaling (LVDS) interface. The fabricated SVS proof-of-concept device has successfully demonstrated a capacitance change of approximately 1 pF/gram.
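The sensing principle lends itself to a back-of-the-envelope check with the parallel-plate capacitance formula. The sketch below uses the 10 cm membrane diameter from the abstract; the nominal gap and its change under load are assumptions chosen only to show the direction of the effect, not the device's actual geometry or sensitivity.

```python
# Back-of-the-envelope check of the sensing principle described above: the membrane
# and the opposing substrate form a parallel-plate capacitor, so a smaller gap gives
# a larger capacitance. The gap values below are assumptions.
import math

EPS0 = 8.854e-12  # vacuum permittivity, F/m

def plate_capacitance(diameter_m: float, gap_m: float) -> float:
    area = math.pi * (diameter_m / 2) ** 2
    return EPS0 * area / gap_m

c_empty = plate_capacitance(0.10, 100e-6)   # assumed 100 um nominal gap
c_loaded = plate_capacitance(0.10, 95e-6)   # gap narrowed by accumulated sample
print(f"delta C = {(c_loaded - c_empty) * 1e12:.1f} pF")
```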
Jabbari, Keyvan; Pashaei, Fakhereh; Ay, Mohammad R.; Amouheidari, Alireza; Tavakoli, Mohammad B.
2018-01-01
Background: MapCHECK2 is a two-dimensional diode-array planar dosimetry verification system. Dosimetric results are evaluated with the gamma index. This study aims to provide comprehensive information on the impact of various factors on the gamma index values of MapCHECK2, which is mostly used for IMRT dose verification. Methods: Seven fields were planned for 6 and 18 MV photons. The azimuthal angle is defined as any rotation of the collimators or the MapCHECK2 around the central axis, and was varied from 5 to −5°. The gantry angle was changed from −8 to 8°. Isodose sampling resolution was studied in the range of 0.5 to 4 mm. The effects of additional buildup on the gamma index were also assessed in three cases. Gamma test acceptance criteria were 3%/3 mm. Results: A change of azimuthal angle over a 5° interval reduced the gamma index value by about 9%. The results of putting buildups of various thicknesses on the MapCHECK2 surface showed that the gamma index generally improved with thicker buildup, especially for 18 MV. Changing the sampling resolution from 4 to 2 mm resulted in an increase in the gamma index of about 3.7%. Deviation of the gantry by 8° in either direction changed the gamma index by only about 1.6% for 6 MV and 2.1% for 18 MV. Conclusion: Among the studied parameters, the azimuthal angle is one of the most effective factors on the gamma index value. The gantry angle deviation and sampling resolution have less effect on gamma index value reduction. PMID:29535922
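For readers unfamiliar with the metric, the gamma evaluation combines a dose-difference tolerance and a distance-to-agreement tolerance into a single pass/fail criterion per point. The sketch below is a compact 1-D illustration under the 3%/3 mm criteria; the dose profiles are synthetic placeholders, not MapCHECK2 data.

```python
# A compact 1-D illustration of the gamma evaluation used to report results such as
# those above (3%/3 mm criteria): each measured point passes if some reference point
# lies within the combined dose-difference / distance-to-agreement ellipse.
import numpy as np

def gamma_pass_rate(ref_dose, eval_dose, positions_mm, dose_tol=0.03, dta_mm=3.0):
    ref_dose, eval_dose = np.asarray(ref_dose), np.asarray(eval_dose)
    dmax = ref_dose.max()
    passes = []
    for xe, de in zip(positions_mm, eval_dose):
        dist = (positions_mm - xe) / dta_mm
        diff = (ref_dose - de) / (dose_tol * dmax)
        gamma = np.sqrt(dist**2 + diff**2).min()
        passes.append(gamma <= 1.0)
    return float(np.mean(passes)) * 100.0

x = np.arange(0, 100, 1.0)                 # 1 mm sampling grid
ref = np.exp(-((x - 50) / 20) ** 2)        # synthetic reference profile
meas = ref * 1.02                          # 2% global dose offset
print(f"gamma passing rate: {gamma_pass_rate(ref, meas, x):.1f}%")
```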
Results of the performance verification of the CoaguChek XS system.
Plesch, W; Wolf, T; Breitenbeck, N; Dikkeschei, L D; Cervero, A; Perez, P L; van den Besselaar, A M H P
2008-01-01
This is the first paper reporting a performance verification study of a point-of-care (POC) monitor for prothrombin time (PT) testing according to the requirements given in chapter 8 of the International Organization for Standardization (ISO) 17593:2007 standard "Clinical laboratory testing and in vitro medical devices - Requirements for in vitro monitoring systems for self-testing of oral anticoagulant therapy". The monitor under investigation was the new CoaguChek XS system which is designed for use in patient self testing. Its detection principle is based on the amperometric measurement of the thrombin activity generated by starting the coagulation cascade using a recombinant human thromboplastin. The system performance verification study was performed at four study centers using venous and capillary blood samples on two test strip lots. Laboratory testing was performed from corresponding frozen plasma samples with six commercial thromboplastins. Samples from 73 normal donors and 297 patients on oral anticoagulation therapy were collected. Results were assessed using a refined data set of 260 subjects according to the ISO 17593:2007 standard. Each of the two test strip lots met the acceptance criteria of ISO 17593:2007 versus all thromboplastins (bias -0.19 to 0.18 INR; >97% of data within accuracy limits). The coefficient of variation for imprecision of the PT determinations in INR ranged from 2.0% to 3.2% in venous, and from 2.9% to 4.0% in capillary blood testing. Capillary versus venous INR data showed agreement of results with regression lines equal to the line of identity. The new system demonstrated a high level of trueness and accuracy, and low imprecision in INR testing. It can be concluded that the CoaguChek XS system complies with the requirements in chapter 8 of the ISO standard 17593:2007.
Gómez, Angel; Seyle, D Conor; Huici, Carmen; Swann, William B
2009-12-01
Recent research has demonstrated self-verification strivings in groups, such that people strive to verify collective identities, which are personal self-views (e.g., "sensitive") associated with group membership (e.g., "women"). Such demonstrations stop short of showing that the desire for self-verification can fully transcend the self-other barrier, as in people working to verify ingroup identities (e.g., "Americans are loud") even when such identities are not self-descriptive ("I am quiet and unassuming"). Five studies focus on such ingroup verification strivings. Results indicate that people prefer to interact with individuals who verify their ingroup identities over those who enhance these identities (Experiments 1-5). Strivings for ingroup identity verification were independent of the extent to which the identities were self-descriptive but were stronger among participants who were highly invested in their ingroup identities, as reflected in high certainty of these identities (Experiments 1-4) and high identification with the group (Experiments 1-5). In addition, whereas past demonstrations of self-verification strivings have been limited to efforts to verify the content of identities (Experiments 1-3), the findings also show that people strive to verify the valence of their identities (i.e., the extent to which the identities are valued; Experiments 4 and 5). Self-verification strivings, rather than self-enhancement strivings, appeared to motivate participants' strivings for ingroup identity verification. Links to collective self-verification strivings and social identity theory are discussed.
The politics of verification and the control of nuclear tests, 1945-1980
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gallagher, N.W.
1990-01-01
This dissertation addresses two questions: (1) why has agreement been reached on verification regimes to support some arms control accords but not others; and (2) what determines the extent to which verification arrangements promote stable cooperation. This study develops an alternative framework for analysis by examining the politics of verification at two levels. The logical politics of verification are shaped by the structure of the problem of evaluating cooperation under semi-anarchical conditions. The practical politics of verification are driven by players' attempts to use verification arguments to promote their desired security outcome. The historical material shows that agreements on verification regimes are reached when key domestic and international players desire an arms control accord and believe that workable verification will not have intolerable costs. Clearer understanding of how verification is itself a political problem, and how players manipulate it to promote other goals, is necessary if the politics of verification are to support rather than undermine the development of stable cooperation.
Yin, Hong-Rui; Zhang, Lei; Xie, Li-Qi; Huang, Li-Yong; Xu, Ye; Cai, San-Jun; Yang, Peng-Yuan; Lu, Hao-Jie
2013-09-06
Novel biomarker verification assays are urgently required to improve the efficiency of biomarker development. Benefitting from lower development costs, multiple reaction monitoring (MRM) has been used for biomarker verification as an alternative to immunoassay. However, in general MRM analysis, only one sample can be quantified in a single experiment, which restricts its application. Here, a Hyperplex-MRM quantification approach, which combined mTRAQ for absolute quantification and iTRAQ for relative quantification, was developed to increase the throughput of biomarker verification. In this strategy, equal amounts of internal standard peptides were labeled with mTRAQ reagents Δ0 and Δ8, respectively, as double references, while 4-plex iTRAQ reagents were used to label four different samples as an alternative to mTRAQ Δ4. From the MRM trace and MS/MS spectrum, total amounts and relative ratios of target proteins/peptides of four samples could be acquired simultaneously. Accordingly, absolute amounts of target proteins/peptides in four different samples could be achieved in a single run. In addition, double references were used to increase the reliability of the quantification results. Using this approach, three biomarker candidates, adenosylhomocysteinase (AHCY), cathepsin D (CTSD), and lysozyme C (LYZ), were successfully quantified in colorectal cancer (CRC) tissue specimens of different stages with high accuracy, sensitivity, and reproducibility. To summarize, we demonstrated a promising quantification method for high-throughput verification of biomarker candidates.
Code of Federal Regulations, 2010 CFR
2010-07-01
... which you sample and record gas-analyzer concentrations. (b) Measurement principles. This test verifies... appropriate frequency to prevent loss of information. This test also verifies that the measurement system... instructions. Adjust the measurement system as needed to optimize performance. Run this verification with the...
40 CFR 1065.390 - PM balance verifications and weighing process verification.
Code of Federal Regulations, 2011 CFR
2011-07-01
... days before weighing any filter. (2) Zero and span the balance within 12 h before weighing any filter. (3) Verify that the mass determination of reference filters before and after a filter weighing... reference PM sample media (e.g., filters) before and after a weighing session. A weighing session may be as...
40 CFR 1065.390 - PM balance verifications and weighing process verification.
Code of Federal Regulations, 2014 CFR
2014-07-01
... days before weighing any filter. (2) Zero and span the balance within 12 h before weighing any filter. (3) Verify that the mass determination of reference filters before and after a filter weighing... reference PM sample media (e.g., filters) before and after a weighing session. A weighing session may be as...
40 CFR 1065.390 - PM balance verifications and weighing process verification.
Code of Federal Regulations, 2013 CFR
2013-07-01
... days before weighing any filter. (2) Zero and span the balance within 12 h before weighing any filter. (3) Verify that the mass determination of reference filters before and after a filter weighing... reference PM sample media (e.g., filters) before and after a weighing session. A weighing session may be as...
40 CFR 1065.390 - PM balance verifications and weighing process verification.
Code of Federal Regulations, 2012 CFR
2012-07-01
... days before weighing any filter. (2) Zero and span the balance within 12 h before weighing any filter. (3) Verify that the mass determination of reference filters before and after a filter weighing... reference PM sample media (e.g., filters) before and after a weighing session. A weighing session may be as...
40 CFR 1065.390 - PM balance verifications and weighing process verification.
Code of Federal Regulations, 2010 CFR
2010-07-01
... days before weighing any filter. (2) Zero and span the balance within 12 h before weighing any filter. (3) Verify that the mass determination of reference filters before and after a filter weighing... weighing session by weighing reference PM sample media (e.g., filters) before and after a weighing session...
DOE Office of Scientific and Technical Information (OSTI.GOV)
L. M. Dittmer
The 100-B-14:2 subsite encompasses the former sanitary sewer feeder lines associated with the 1607-B2 and 1607-B7 septic systems. Feeder lines associated with the 185/190-B building have also been identified as the 100-B-14:8 subsite, and feeder lines associated with the 1607-B7 septic system have also been identified as the 100-B-14:9 subsite. These two subsites have been administratively cancelled to resolve the redundancy. The results of verification sampling show that residual contaminant concentrations do not preclude any future uses and allow for unrestricted use of shallow zone soils. The results also demonstrate that residual contaminant concentrations are protective of groundwater and the Columbia River.
DOE Office of Scientific and Technical Information (OSTI.GOV)
L. M. Dittmer
The 1607-B2 waste site is a former septic system associated with various 100-B facilities, including the 105-B, 108-B, 115-B/C, and 185/190-B buildings. The site was evaluated based on confirmatory results for feeder lines within the 100-B-14:2 subsite and determined to require remediation. The 1607-B2 waste site has been remediated to achieve the remedial action objectives specified in the Remaining Sites ROD. The results of verification sampling show that residual contaminant concentrations do not preclude any future uses and allow for unrestricted use of shallow zone soils. The results also demonstrate that residual contaminant concentrations are protective of groundwater and the Columbia River.
NASA Technical Reports Server (NTRS)
Hughes, David W.; Hedgeland, Randy J.
1994-01-01
A mechanical simulator of the Hubble Space Telescope (HST) Aft Shroud was built to perform verification testing of the Servicing Mission Scientific Instruments (SI's) and to provide a facility for astronaut training. All assembly, integration, and test activities occurred under the guidance of a contamination control plan, and all work was reviewed by a contamination engineer prior to implementation. An integrated approach was followed in which materials selection, manufacturing, assembly, subsystem integration, and end product use were considered and controlled to ensure that the use of the High Fidelity Mechanical Simulator (HFMS) as a verification tool would not contaminate mission critical hardware. Surfaces were cleaned throughout manufacturing, assembly, and integration, and reverification was performed following major activities. Direct surface sampling was the preferred method of verification, but access and material constraints led to the use of indirect methods as well. Although surface geometries and coatings often made contamination verification difficult, final contamination sampling and monitoring demonstrated the ability to maintain a class M5.5 environment with surface levels less than 400B inside the HFMS.
Verification of S&D Solutions for Network Communications and Devices
NASA Astrophysics Data System (ADS)
Rudolph, Carsten; Compagna, Luca; Carbone, Roberto; Muñoz, Antonio; Repp, Jürgen
This chapter describes the tool-supported verification of S&D Solutions at the level of network communications and devices. First, the general goals and challenges of verification in the context of AmI systems are highlighted, and the role of verification and validation within the SERENITY processes is explained. Then, SERENITY extensions to the SH Verification Tool are explained using small examples. Finally, the applicability of existing verification tools is discussed in the context of the AVISPA toolset. The two different tools show that, for the security analysis of network and device S&D Patterns, relevant complementary approaches exist and can be used.
Current status of verification practices in clinical biochemistry in Spain.
Gómez-Rioja, Rubén; Alvarez, Virtudes; Ventura, Montserrat; Alsina, M Jesús; Barba, Núria; Cortés, Mariano; Llopis, María Antonia; Martínez, Cecilia; Ibarz, Mercè
2013-09-01
Verification uses logical algorithms to detect potential errors before laboratory results are released to the clinician. Even though verification is one of the main processes in all laboratories, there is a lack of standardization, mainly in the algorithms used and the criteria and verification limits applied. A survey of clinical laboratories in Spain was conducted in order to assess the verification process, particularly the use of autoverification. Questionnaires were sent to the laboratories involved in the External Quality Assurance Program organized by the Spanish Society of Clinical Biochemistry and Molecular Pathology. Seven common biochemical parameters were included (glucose, cholesterol, triglycerides, creatinine, potassium, calcium, and alanine aminotransferase). Completed questionnaires were received from 85 laboratories. Nearly all the laboratories reported using the following seven verification criteria: internal quality control, instrument warnings, sample deterioration, reference limits, clinical data, concordance between parameters, and verification of results. The use of all verification criteria varied according to the type of verification (automatic, technical, or medical). Verification limits for these parameters are similar to biological reference ranges. Delta check was used in 24% of laboratories. Most laboratories (64%) reported using autoverification systems. Autoverification use was related to laboratory size, ownership, and type of laboratory information system, but the amount of use (percentage of tests autoverified) was not related to laboratory size. A total of 36% of Spanish laboratories do not use autoverification, despite the general implementation of laboratory information systems, most of which have autoverification capability. Criteria and rules for seven routine biochemical tests were obtained.
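To illustrate the kind of rule-based release logic the survey asks about, the sketch below combines two of the listed criteria (verification limits and a delta check) into a single autoverification decision. The limits, delta thresholds, and function names are illustrative assumptions, not values reported by the surveyed laboratories.

```python
# Hedged sketch of a rule-based autoverification pass of the kind surveyed above:
# a result is released automatically only if it falls inside the verification limits
# and, when a previous value exists, the delta check does not fire.
# Limits and delta thresholds are illustrative assumptions.

VERIFICATION_LIMITS = {"glucose": (2.0, 25.0), "potassium": (2.5, 7.0)}  # mmol/L, assumed
DELTA_LIMIT = {"glucose": 0.50, "potassium": 0.25}  # max relative change, assumed

def autoverify(test: str, value: float, previous: float | None = None) -> bool:
    low, high = VERIFICATION_LIMITS[test]
    if not (low <= value <= high):
        return False                    # route to technical/medical review
    if previous is not None and abs(value - previous) / previous > DELTA_LIMIT[test]:
        return False                    # delta check failure
    return True

print(autoverify("potassium", 4.1, previous=3.9))   # released automatically
print(autoverify("potassium", 6.8, previous=4.0))   # held for review
```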
Scheuermann, Taneisha S; Richter, Kimber P; Rigotti, Nancy A; Cummins, Sharon E; Harrington, Kathleen F; Sherman, Scott E; Zhu, Shu-Hong; Tindle, Hilary A; Preacher, Kristopher J
2017-12-01
To estimate the prevalence and predictors of failed biochemical verification of self-reported abstinence among participants enrolled in trials of hospital-initiated smoking cessation interventions. Comparison of characteristics between participants who verified and those who failed to verify self-reported abstinence. Multi-site randomized clinical trials conducted between 2010 and 2014 in hospitals throughout the United States. Recently hospitalized smokers who reported tobacco abstinence 6 months post-randomization and provided a saliva sample for verification purposes (n = 822). Outcomes were salivary cotinine-verified smoking abstinence at 10 and 15 ng/ml cut-points. Predictors and correlates included participant demographics and tobacco use; hospital diagnoses and treatment; and study characteristics collected via surveys and electronic medical records. Usable samples were returned by 69.8% of the 1178 eligible trial participants who reported 7-day point prevalence abstinence. The proportion of participants verified as quit was 57.8% [95% confidence interval (CI) = 54.4, 61.2; 10 ng/ml cut-off] or 60.6% (95% CI = 57.2, 63.9; 15 ng/ml). Factors associated independently with verification at 10 ng/ml were education beyond high school [odds ratio (OR) = 1.51; 95% CI = 1.07, 2.11], continuous abstinence since hospitalization (OR = 2.82; 95% CI = 2.02, 3.94), mailed versus in-person sample (OR = 3.20; 95% CI = 1.96, 5.21) and race. African American participants were less likely to verify abstinence than white participants (OR = 0.64; 95% CI = 0.44, 0.93). Findings were similar for verification at 15 ng/ml. Verification rates did not differ by treatment group. In the United States, a high proportion (about 40%) of recently hospitalized smokers enrolled in smoking cessation trials fail biochemical verification of their self-reported abstinence. © 2017 Society for the Study of Addiction.
DOE Office of Scientific and Technical Information (OSTI.GOV)
P.C. Weaver
2009-04-29
The primary objective of the independent verification was to determine if BJC performed the appropriate actions to meet the specified “hot spot” cleanup criteria of 500 picocuries per gram (pCi/g) uranium-238 (U-238) in surface soil. Specific tasks performed by the independent verification team (IVT) to satisfy this objective included: 1) performing radiological walkover surveys, and 2) collecting soil samples for independent analyses. The independent verification (IV) efforts were designed to evaluate radioactive contaminants (specifically U-238) in the exposed surfaces below one foot of the original site grade, given that the top one-foot layer of soil on the site was removed in its entirety.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Demuth, Scott F.; Trahan, Alexis Chanel
2017-06-26
DIV of facility layout, material flows, and other information provided in the DIQ. Material accountancy through an annual PIV and a number of interim inventory verifications, including UF6 cylinder identification and counting, NDA of cylinders, and DA on a sample collection of UF6. Application of C/S technologies utilizing seals and tamper-indicating devices (TIDs) on cylinders, containers, storage rooms, and IAEA instrumentation to provide continuity of knowledge between inspections. Verification of the absence of undeclared material and operations, especially HEU production, through SNRIs, LFUA of cascade halls, and environmental swipe sampling.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Fitzgerald, T.J.; Carlos, R.C.; Argo, P.E.
As part of the integrated verification experiment (IVE), we deployed a network of HF ionospheric sounders to detect the effects of acoustic waves generated by surface ground motion following underground nuclear tests at the Nevada Test Site. The network sampled up to four geographic locations in the ionosphere, from almost directly overhead of the surface ground zero out to a horizontal range of 60 km. We present sample results for four of the IVEs: Misty Echo, Texarkana, Mineral Quarry, and Bexar.
Certification of NIST Room Temperature Low-Energy and High-Energy Charpy Verification Specimens
Lucon, Enrico; McCowan, Chris N.; Santoyo, Ray L.
2015-01-01
The possibility for NIST to certify Charpy reference specimens for testing at room temperature (21 °C ± 1 °C) instead of −40 °C was investigated by performing 130 room-temperature tests from five low-energy and four high-energy lots of steel on the three master Charpy machines located in Boulder, CO. The statistical analyses performed show that in most cases the variability of results (i.e., the experimental scatter) is reduced when testing at room temperature. For eight out of the nine lots considered, the observed variability was lower at 21 °C than at −40 °C. The results of this study will allow NIST to satisfy requests for room-temperature Charpy verification specimens that have been received from customers for several years: testing at 21 °C removes from the verification process the operator’s skill in transferring the specimen in a timely fashion from the cooling bath to the impact position, and puts the focus back on the machine performance. For NIST, it also reduces the time and cost for certifying new verification lots. For one of the low-energy lots tested with a C-shaped hammer, we experienced two specimens jamming, which yielded unusually high values of absorbed energy. For both specimens, the signs of jamming were clearly visible. For all the low-energy lots investigated, jamming is slightly more likely to occur at 21 °C than at −40 °C, since at room temperature low-energy samples tend to remain in the test area after impact rather than exiting in the opposite direction of the pendulum swing. In the evaluation of a verification set, any jammed specimen should be removed from the analyses. PMID:26958453
20 CFR 212.5 - Verification of military service.
Code of Federal Regulations, 2010 CFR
2010-04-01
... 20 Employees' Benefits 1 2010-04-01 2010-04-01 false Verification of military service. 212.5... MILITARY SERVICE § 212.5 Verification of military service. Military service may be verified by the... armed forces that shows the beginning and ending dates of the individual's active military service; or a...
Code of Federal Regulations, 2010 CFR
2010-10-01
... that the ERMM meets the performance criteria contained in this appendix and that test verification data are available to a railroad or to FRA upon request. 2. The test verification data shall contain, at a minimum, all pertinent original data logs and documentation that the test sample preparation, test set up...
Code of Federal Regulations, 2011 CFR
2011-10-01
... that the ERMM meets the performance criteria contained in this appendix and that test verification data are available to a railroad or to FRA upon request. 2. The test verification data shall contain, at a minimum, all pertinent original data logs and documentation that the test sample preparation, test set up...
Code of Federal Regulations, 2014 CFR
2014-10-01
... that the ERMM meets the performance criteria contained in this appendix and that test verification data are available to a railroad or to FRA upon request. 2. The test verification data shall contain, at a minimum, all pertinent original data logs and documentation that the test sample preparation, test set up...
Code of Federal Regulations, 2012 CFR
2012-10-01
... that the ERMM meets the performance criteria contained in this appendix and that test verification data are available to a railroad or to FRA upon request. 2. The test verification data shall contain, at a minimum, all pertinent original data logs and documentation that the test sample preparation, test set up...
Code of Federal Regulations, 2013 CFR
2013-10-01
... that the ERMM meets the performance criteria contained in this appendix and that test verification data are available to a railroad or to FRA upon request. 2. The test verification data shall contain, at a minimum, all pertinent original data logs and documentation that the test sample preparation, test set up...
Martinez-Garcia, Elena; Lesur, Antoine; Devis, Laura; Campos, Alexandre; Cabrera, Silvia; van Oostrum, Jan; Matias-Guiu, Xavier; Gil-Moreno, Antonio; Reventos, Jaume; Colas, Eva; Domon, Bruno
2016-08-16
About 30% of endometrial cancer (EC) patients are diagnosed at an advanced stage of the disease, which is associated with a drastic decrease in the 5-year survival rate. The identification of biomarkers in uterine aspirate samples, which are collected by a minimally invasive procedure, would improve early diagnosis of EC. We present a sequential workflow to select, from a list of potential EC biomarkers, those that are the most promising to enter a validation study. After the elimination of confounding contributions by residual blood proteins, 52 potential biomarkers were analyzed in uterine aspirates from 20 EC patients and 18 non-EC controls by a high-resolution accurate-mass spectrometer operated in parallel reaction monitoring mode. Differential abundance was observed for 26 biomarkers, and among them ten proteins showed high sensitivity and specificity (AUC > 0.9). The study demonstrates that uterine aspirates are valuable samples for EC protein biomarker screening. It also illustrates the importance of a biomarker verification phase to fill the gap between discovery and validation studies and highlights the benefits of high-resolution mass spectrometry for this purpose. The proteins verified in this study have an increased likelihood of becoming a clinical assay after a subsequent validation phase.
NASA Astrophysics Data System (ADS)
Miller, Jacob; Sanders, Stephen; Miyake, Akimasa
2017-12-01
While quantum speed-up in solving certain decision problems by a fault-tolerant universal quantum computer has been promised, a timely research interest includes how far one can reduce the resource requirement to demonstrate a provable advantage in quantum devices without demanding quantum error correction, which is crucial for prolonging the coherence time of qubits. We propose a model device made of locally interacting multiple qubits, designed such that simultaneous single-qubit measurements on it can output probability distributions whose average-case sampling is classically intractable, under similar assumptions as the sampling of noninteracting bosons and instantaneous quantum circuits. Notably, in contrast to these previous unitary-based realizations, our measurement-based implementation has two distinctive features. (i) Our implementation involves no adaptation of measurement bases, leading output probability distributions to be generated in constant time, independent of the system size. Thus, it could be implemented in principle without quantum error correction. (ii) Verifying the classical intractability of our sampling is done by changing the Pauli measurement bases only at certain output qubits. Our usage of random commuting quantum circuits in place of computationally universal circuits allows a unique unification of sampling and verification, so they require the same physical resource requirements in contrast to the more demanding verification protocols seen elsewhere in the literature.
Property-driven functional verification technique for high-speed vision system-on-chip processor
NASA Astrophysics Data System (ADS)
Nshunguyimfura, Victor; Yang, Jie; Liu, Liyuan; Wu, Nanjian
2017-04-01
The implementation of functional verification in a fast, reliable, and effective manner is a challenging task in a vision chip verification process. The main reason for this challenge is the stepwise nature of existing functional verification techniques. This vision chip verification complexity is also related to the fact that in most vision chip design cycles, extensive efforts are focused on how to optimize chip metrics such as performance, power, and area. Design functional verification is not explicitly considered at an earlier stage, at which the most sound decisions are made. In this paper, we propose a semi-automatic property-driven verification technique. The implementation of all verification components is based on design properties. We introduce a low-dimension property space between the specification space and the implementation space. The aim of this technique is to speed up the verification process for high-performance parallel processing vision chips. Our experimental results show that the proposed technique can improve verification efficiency by up to 20% for a complex vision chip design, while reducing simulation and debugging overheads.
Kang, Leni; Zhang, Shaokai; Zhao, Fanghui; Qiao, Youlin
2014-03-01
To evaluate and adjust for verification bias in screening or diagnostic tests. The inverse-probability weighting method was used to adjust the sensitivity and specificity of diagnostic tests, with an example from cervical cancer screening used to introduce the Compare Tests package in R software, in which the method can be implemented. Sensitivity and specificity calculated by the traditional method and by maximum likelihood estimation were compared to the results from the inverse-probability weighting method in the randomly sampled example. The true sensitivity and specificity of the HPV self-sampling test were 83.53% (95%CI: 74.23-89.93) and 85.86% (95%CI: 84.23-87.36). In the analysis of data with randomly missing verification by the gold standard, the sensitivity and specificity calculated by the traditional method were 90.48% (95%CI: 80.74-95.56) and 71.96% (95%CI: 68.71-75.00), respectively. The adjusted sensitivity and specificity under the inverse-probability weighting method were 82.25% (95%CI: 63.11-92.62) and 85.80% (95%CI: 85.09-86.47), respectively, whereas they were 80.13% (95%CI: 66.81-93.46) and 85.80% (95%CI: 84.20-87.41) under the maximum likelihood estimation method. The inverse-probability weighting method can effectively adjust the sensitivity and specificity of a diagnostic test when verification bias exists, especially when complex sampling is involved.
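The weighting idea can be shown in a few lines: each verified subject is up-weighted by the inverse of the verification probability within its screening-test stratum, which restores the full screened population before sensitivity and specificity are computed. The sketch below is in Python rather than the R package mentioned in the abstract, and all counts are invented for illustration.

```python
# A minimal sketch of the inverse-probability-weighting correction discussed above.
# verified_counts[test][disease] are counts among verified subjects;
# screened_totals[test] is the number screened (verified or not) with that result.

def ipw_sensitivity_specificity(verified_counts, screened_totals):
    weighted = {}
    for test in ("pos", "neg"):
        n_verified = sum(verified_counts[test].values())
        w = screened_totals[test] / n_verified           # 1 / P(verified | test result)
        weighted[test] = {d: c * w for d, c in verified_counts[test].items()}
    sens = weighted["pos"]["dis"] / (weighted["pos"]["dis"] + weighted["neg"]["dis"])
    spec = weighted["neg"]["no"] / (weighted["neg"]["no"] + weighted["pos"]["no"])
    return sens, spec

verified = {"pos": {"dis": 80, "no": 40}, "neg": {"dis": 5, "no": 120}}
totals = {"pos": 150, "neg": 850}   # many test-negatives were never verified
print(ipw_sensitivity_specificity(verified, totals))
```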
Code Verification Results of an LLNL ASC Code on Some Tri-Lab Verification Test Suite Problems
DOE Office of Scientific and Technical Information (OSTI.GOV)
Anderson, S R; Bihari, B L; Salari, K
As scientific codes become more complex and involve larger numbers of developers and algorithms, chances for algorithmic implementation mistakes increase. In this environment, code verification becomes essential to building confidence in the code implementation. This paper will present first results of a new code verification effort within LLNL's B Division. In particular, we will show results of code verification of the LLNL ASC ARES code on the test problems: Su Olson non-equilibrium radiation diffusion, Sod shock tube, Sedov point blast modeled with shock hydrodynamics, and Noh implosion.
Atkinson, David A.
2002-01-01
Methods and apparatus for ion mobility spectrometry and an analyte detection and identification verification system are disclosed. The apparatus is configured to be used in an ion mobility spectrometer and includes a plurality of reactant reservoirs configured to contain a plurality of reactants which can be reacted with the sample to form adducts having varying ion mobilities. A carrier fluid, such as air or nitrogen, is used to carry the sample into the spectrometer. The plurality of reactants are configured to be selectively added to the carrier stream by use of inlet and outlet manifolds in communication with the reagent reservoirs, the reservoirs being selectively isolatable by valves. The invention further includes a spectrometer having the reagent system described. In the method, a first reactant is used with the sample. Following a positive result, a second reactant is used to determine whether a predicted response occurs. The occurrence of the second predicted response tends to verify the existence of a component of interest within the sample. A third reactant can also be used to provide further verification of the existence of a component of interest. A library can be established of known responses of compounds of interest with various reactants, and the results of a specific multi-reactant survey of a sample can be compared against the library to determine whether a component detected in the sample is likely to be a specific component of interest.
Nikolac Gabaj, Nora; Miler, Marijana; Vrtarić, Alen; Hemar, Marina; Filipi, Petra; Kocijančić, Marija; Šupak Smolčić, Vesna; Ćelap, Ivana; Šimundić, Ana-Maria
2018-04-25
The aim of our study was to perform verification of serum indices on three clinical chemistry platforms. This study was done on three analyzers: Abbott Architect c8000, Beckman Coulter AU5800 (BC) and Roche Cobas 6000 c501. The following analytical specifications were verified: precision (two patient samples), accuracy (the sample with the highest concentration of interferent was serially diluted and measured values compared to theoretical values), comparability (120 patient samples) and cross-reactivity (samples with increasing concentrations of interferent were divided into two aliquots and the remaining interferents were added to each aliquot; measurements were done before and after adding interferents). The best results for precision were obtained for the H index (0.72%-2.08%). Accuracy for the H index was acceptable for Cobas and BC, while on Architect, deviations in the high concentration range were observed (y=0.02 [0.01-0.07]+1.07 [1.06-1.08]x). All three analyzers showed acceptable results in evaluating accuracy of the L index and unacceptable results for the I index. The H index was comparable between BC and both Architect (Cohen's κ [95% CI]=0.795 [0.692-0.898]) and Roche (Cohen's κ [95% CI]=0.825 [0.729-0.922]), while Roche and Architect were not comparable. The I index was not comparable between any analyzer combination, while the L index was only comparable between Abbott and BC. Cross-reactivity analysis mostly showed that serum indices measurement is affected when a combination of interferences is present. There is heterogeneity between analyzers in hemolysis, icterus, lipemia (HIL) quality performance. Verification of serum indices in routine work is necessary to establish analytical specifications.
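The comparability figures quoted above are Cohen's kappa values for agreement of a categorical index grade between two analyzers. The sketch below shows the standard kappa computation from a contingency table; the 3x3 table and its grading labels are invented for illustration, not data from the study.

```python
# Hedged sketch of the comparability statistic quoted above: Cohen's kappa for the
# agreement of a categorical serum index grade between two analyzers on the same
# patient samples. The contingency table is illustrative.
import numpy as np

def cohens_kappa(table: np.ndarray) -> float:
    """table[i, j]: number of samples graded i by analyzer A and j by analyzer B."""
    table = table.astype(float)
    n = table.sum()
    p_observed = np.trace(table) / n
    p_expected = float(table.sum(axis=1) @ table.sum(axis=0)) / n**2
    return (p_observed - p_expected) / (1.0 - p_expected)

# 3 x 3 example: index graded as "none / moderate / gross" on 120 samples
demo = np.array([[70, 4, 0],
                 [5, 25, 2],
                 [0, 3, 11]])
print(round(cohens_kappa(demo), 3))
```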
Richardson, Michael L; Petscavage, Jonelle M
2011-11-01
The sensitivity and specificity of magnetic resonance imaging (MRI) for diagnosis of meniscal tears has been studied extensively, with tears usually verified by surgery. However, surgically unverified cases are often not considered in these studies, leading to verification bias, which can falsely increase the sensitivity and decrease the specificity estimates. Our study suggests that such bias may be very common in the meniscal MRI literature, and illustrates techniques to detect and correct for such bias. PubMed was searched for articles estimating sensitivity and specificity of MRI for meniscal tears. These were assessed for verification bias, deemed potentially present if a study included any patients whose MRI findings were not surgically verified. Retrospective global sensitivity analysis (GSA) was performed when possible. Thirty-nine of the 314 studies retrieved from PubMed specifically dealt with meniscal tears. All 39 included unverified patients, and hence, potential verification bias. Only seven articles included sufficient information to perform GSA. Of these, one showed definite verification bias, two showed no bias, and four others showed bias within certain ranges of disease prevalence. Only 9 of 39 acknowledged the possibility of verification bias. Verification bias is underrecognized and potentially common in published estimates of the sensitivity and specificity of MRI for the diagnosis of meniscal tears. When possible, it should be avoided by proper study design. If unavoidable, it should be acknowledged. Investigators should tabulate unverified as well as verified data. Finally, verification bias should be estimated; if present, corrected estimates of sensitivity and specificity should be used. Our online web-based calculator makes this process relatively easy. Copyright © 2011 AUR. Published by Elsevier Inc. All rights reserved.
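The mechanism described above is easy to demonstrate with a small simulation: if MRI-positive knees are referred to surgery (verification) far more often than MRI-negative knees, the naive estimates computed only from verified cases overstate sensitivity and understate specificity. All rates in the sketch below are invented for illustration, not values from the reviewed studies.

```python
# A small simulation of the verification-bias effect described above.
import numpy as np

rng = np.random.default_rng(42)
n, prevalence, true_sens, true_spec = 100_000, 0.4, 0.85, 0.90
tear = rng.random(n) < prevalence
mri_pos = np.where(tear, rng.random(n) < true_sens, rng.random(n) < (1 - true_spec))

# Differential verification: most MRI-positives get surgery, few MRI-negatives do.
verified = np.where(mri_pos, rng.random(n) < 0.90, rng.random(n) < 0.15)

naive_sens = (mri_pos & verified & tear).sum() / (verified & tear).sum()
naive_spec = (~mri_pos & verified & ~tear).sum() / (verified & ~tear).sum()
print(f"naive sensitivity {naive_sens:.3f} vs true {true_sens}")   # inflated
print(f"naive specificity {naive_spec:.3f} vs true {true_spec}")   # deflated
```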
Aoki, Kimiko; Tanaka, Hiroyuki; Kawahara, Takashi
2018-07-01
The standard method for personal identification and verification of urine samples in doping control is short tandem repeat (STR) analysis using nuclear DNA (nDNA). The DNA concentration of urine is very low and decreases under most conditions used for sample storage; therefore, the amount of DNA from cryopreserved urine samples may be insufficient for STR analysis. We aimed to establish a multiplexed assay for urine mitochondrial DNA typing containing only trace amounts of DNA, particularly for Japanese populations. A multiplexed suspension-array assay using oligo-tagged microspheres (Luminex MagPlex-TAG) was developed to measure C-stretch length in hypervariable region 1 (HV1) and 2 (HV2), five single nucleotide polymorphisms (SNPs), and one polymorphic indel. Based on these SNPs and the indel, the Japanese population can be classified into five major haplogroups (D4, B, M7a, A, D5). The assay was applied to DNA samples from urine cryopreserved for 1 - 1.5 years (n = 63) and fresh blood (n = 150). The assay with blood DNA enabled Japanese subjects to be categorized into 62 types, exhibiting a discriminatory power of 0.960. The detection limit for cryopreserved urine was 0.005 ng of nDNA. Profiling of blood and urine pairs revealed that 5 of 63 pairs showed different C-stretch patterns in HV1 or HV2. The assay described here yields valuable information in terms of the verification of urine sample sources employing only trace amounts of recovered DNA. However, blood cannot be used as a reference sample.
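A discriminatory power of the kind quoted above is typically computed as one minus the chance that two randomly chosen individuals share the same typing profile. The sketch below shows that calculation; the haplotype labels and frequencies are invented, not the Japanese population data from the study.

```python
# Hedged sketch of a typing system's discriminatory power: 1 minus the probability
# that two randomly chosen individuals share the same profile. Example data invented.
from collections import Counter

def discriminatory_power(observed_types: list[str]) -> float:
    n = len(observed_types)
    freqs = [count / n for count in Counter(observed_types).values()]
    return 1.0 - sum(p * p for p in freqs)

demo = ["D4-a"] * 30 + ["B-a"] * 20 + ["M7a-a"] * 15 + ["A-a"] * 10 + ["D5-a"] * 5 \
       + [f"rare-{i}" for i in range(20)]   # 20 singleton types
print(round(discriminatory_power(demo), 3))
```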
Research on key technology of the verification system of steel rule based on vision measurement
NASA Astrophysics Data System (ADS)
Jia, Siyuan; Wang, Zhong; Liu, Changjie; Fu, Luhua; Li, Yiming; Lu, Ruijun
2018-01-01
The steel rule plays an important role in quantity transmission. However, the traditional verification method of the steel rule, based on manual operation and reading, brings about low precision and low efficiency. A machine vision based verification system of the steel rule is designed referring to JJG 1-1999, Verification Regulation of Steel Rule [1]. What differentiates this system is that it uses a new pixel-equivalent calibration method and decontaminates the surface of the steel rule. Experiments show that these two methods fully meet the requirements of the verification system. Measuring results strongly prove that these methods not only meet the precision of the verification regulation, but also improve the reliability and efficiency of the verification system.
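The pixel-equivalent idea mentioned above can be illustrated with a short sketch; the gauge length and pixel counts below are invented, and the calculation is a generic one, not the specific calibration method of the cited system.

```python
# Minimal sketch of a pixel-equivalent calibration for a vision-based rule
# verification setup. Reference values are illustrative only.

def pixel_equivalent(reference_length_mm, reference_length_px):
    """Millimetres represented by one pixel, from an imaged reference gauge."""
    return reference_length_mm / reference_length_px

def indication_error_mm(measured_px, nominal_mm, px_equiv):
    """Deviation of a measured graduation interval from its nominal length."""
    return measured_px * px_equiv - nominal_mm

px_equiv = pixel_equivalent(reference_length_mm=10.0, reference_length_px=2083.0)
err = indication_error_mm(measured_px=2081.5, nominal_mm=10.0, px_equiv=px_equiv)
print(f"pixel equivalent = {px_equiv * 1000:.3f} um/px, error = {err * 1000:.1f} um")
```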
Code of Federal Regulations, 2013 CFR
2013-07-01
... discrete-mode testing. For this check we consider water vapor a gaseous constituent. This verification does... for water removed from the sample done in post-processing according to § 1065.659 and it does not... humidification vessel that contains water. You must humidify NO2 span gas with another moist gas stream. We...
Code of Federal Regulations, 2014 CFR
2014-07-01
... discrete-mode testing. For this check we consider water vapor a gaseous constituent. This verification does... for water removed from the sample done in post-processing according to § 1065.659 (40 CFR 1066.620 for... contains water. You must humidify NO2 span gas with another moist gas stream. We recommend humidifying your...
Cleanup Verification Package for the 118-F-5 PNL Sawdust Pit
DOE Office of Scientific and Technical Information (OSTI.GOV)
L. D. Habel
2008-05-20
This cleanup verification package documents completion of remedial action, sampling activities, and compliance with cleanup criteria for the 118-F-5 Burial Ground, the PNL (Pacific Northwest Laboratory) Sawdust Pit. The 118-F-5 Burial Ground was an unlined trench that received radioactive sawdust from the floors of animal pens in the 100-F Experimental Animal Farm.
High-speed autoverifying technology for printed wiring boards
NASA Astrophysics Data System (ADS)
Ando, Moritoshi; Oka, Hiroshi; Okada, Hideo; Sakashita, Yorihiro; Shibutani, Nobumi
1996-10-01
We have developed an automated pattern verification technique. The output of an automated optical inspection system contains many false alarms. Verification is needed to distinguish between minor irregularities and serious defects. In the past, this verification was usually done manually, which led to unsatisfactory product quality. The goal of our new automated verification system is to detect pattern features on surface mount technology boards. In our system, we employ a new illumination method, which uses multiple colors and multiple illumination directions. Images are captured with a CCD camera. We have developed a new algorithm that uses CAD data for both pattern matching and pattern structure determination. This helps to search for patterns around a defect and to examine defect definition rules. These are processed with a high-speed workstation and hard-wired circuits. The system can verify a defect within 1.5 seconds. The verification system was tested in a factory. It verified 1,500 defective samples and detected all significant defects with a false alarm rate of only 0.1 percent.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Fitzgerald, T.J.; Carlos, R.C.; Argo, P.E.
As part of the integrated verification experiment (IVE), we deployed a network of HF ionospheric sounders to detect the effects of acoustic waves generated by surface ground motion following underground nuclear tests at the Nevada Test Site. The network sampled up to four geographic locations in the ionosphere, from almost directly overhead of surface ground zero out to a horizontal range of 60 km. We present sample results for four of the IVEs: Misty Echo, Texarkana, Mineral Quarry, and Bexar.
Exomars Mission Verification Approach
NASA Astrophysics Data System (ADS)
Cassi, Carlo; Gilardi, Franco; Bethge, Boris
According to the long-term cooperation plan established by ESA and NASA in June 2009, the ExoMars project now consists of two missions. The first mission will be launched in 2016 under ESA lead, with the objectives to demonstrate the European capability to safely land a surface package on Mars, to perform Mars atmosphere investigation, and to provide communication capability for present and future ESA/NASA missions. For this mission ESA provides a spacecraft composite, made up of an "Entry Descent & Landing Demonstrator Module (EDM)" and a Mars Orbiter Module (OM); NASA provides the launch vehicle and the scientific instruments located on the Orbiter for Mars atmosphere characterisation. The second mission, with its launch foreseen in 2018, is led by NASA, which provides the spacecraft and launcher, the EDL system, and a rover. ESA contributes the ExoMars Rover Module (RM) to provide surface mobility. It includes a drill system allowing drilling down to 2 metres, collecting samples and investigating them for signs of past and present life with exobiological experiments, and investigating the Mars water/geochemical environment. In this scenario Thales Alenia Space Italia, as ESA prime industrial contractor, is in charge of the design, manufacturing, integration and verification of the ESA ExoMars modules, i.e. the spacecraft composite (OM + EDM) for the 2016 mission, the RM for the 2018 mission, and the Rover Operations Control Centre, which will be located at Altec-Turin (Italy). The verification process of the above products is quite complex and will include some peculiarities with limited or no heritage in Europe. Furthermore, the verification approach has to be optimised to allow full verification despite significant schedule and budget constraints. The paper presents the verification philosophy tailored for the ExoMars mission in line with the above considerations, starting from the model philosophy, showing the verification activities flow and the sharing of tests between the different levels (system, modules, subsystems, etc.), and giving an overview of the main tests defined at spacecraft level. The paper is mainly focused on the verification aspects of the EDL Demonstrator Module and the Rover Module, for which an intense testing activity without previous heritage in Europe is foreseen. In particular, the Descent Module has to survive Mars atmospheric entry and landing, and its surface platform has to stay operational for 8 sols on the Martian surface, transmitting scientific data to the Orbiter. The Rover Module has to perform a 180-sol mission in the Mars surface environment. These operative conditions cannot be verified only by analysis; consequently a test campaign is defined, including mechanical tests to simulate the entry loads, thermal tests in a Mars environment, and the simulation of Rover operations on a 'Mars-like' terrain. Finally, the paper presents an overview of the documentation flow defined to ensure the correct translation of the mission requirements into verification activities (test, analysis, review of design) up to the final verification close-out of the above requirements with the final verification reports.
40 CFR 1066.135 - Linearity verification.
Code of Federal Regulations, 2014 CFR
2014-07-01
... CVS, double-dilution, and partial-flow systems. (3) PM sample. (4) Chiller sample, for gaseous sampling systems that use thermal chillers to dry samples, and that use chiller temperature to calculate dewpoint at the chiller outlet. For testing, if you choose to use the high alarm temperature setpoint for...
Deductive Evaluation: Formal Code Analysis With Low User Burden
NASA Technical Reports Server (NTRS)
Di Vito, Ben L.
2016-01-01
We describe a framework for symbolically evaluating iterative C code using a deductive approach that automatically discovers and proves program properties. Although verification is not performed, the method can infer detailed program behavior. Software engineering work flows could be enhanced by this type of analysis. Floyd-Hoare verification principles are applied to synthesize loop invariants, using a library of iteration-specific deductive knowledge. When needed, theorem proving is interleaved with evaluation and performed on the fly. Evaluation results take the form of inferred expressions and type constraints for values of program variables. An implementation using PVS (Prototype Verification System) is presented along with results for sample C functions.
Verification of a Viscous Computational Aeroacoustics Code using External Verification Analysis
NASA Technical Reports Server (NTRS)
Ingraham, Daniel; Hixon, Ray
2015-01-01
The External Verification Analysis approach to code verification is extended to solve the three-dimensional Navier-Stokes equations with constant properties, and is used to verify a high-order computational aeroacoustics (CAA) code. After a brief review of the relevant literature, the details of the EVA approach are presented and compared to the similar Method of Manufactured Solutions (MMS). Pseudocode representations of EVA's algorithms are included, along with the recurrence relations needed to construct the EVA solution. The code verification results show that EVA was able to convincingly verify a high-order, viscous CAA code without the addition of MMS-style source terms, or any other modifications to the code.
Probabilistic Elastic Part Model: A Pose-Invariant Representation for Real-World Face Verification.
Li, Haoxiang; Hua, Gang
2018-04-01
Pose variation remains to be a major challenge for real-world face recognition. We approach this problem through a probabilistic elastic part model. We extract local descriptors (e.g., LBP or SIFT) from densely sampled multi-scale image patches. By augmenting each descriptor with its location, a Gaussian mixture model (GMM) is trained to capture the spatial-appearance distribution of the face parts of all face images in the training corpus, namely the probabilistic elastic part (PEP) model. Each mixture component of the GMM is confined to be a spherical Gaussian to balance the influence of the appearance and the location terms, which naturally defines a part. Given one or multiple face images of the same subject, the PEP-model builds its PEP representation by sequentially concatenating descriptors identified by each Gaussian component in a maximum likelihood sense. We further propose a joint Bayesian adaptation algorithm to adapt the universally trained GMM to better model the pose variations between the target pair of faces/face tracks, which consistently improves face verification accuracy. Our experiments show that we achieve state-of-the-art face verification accuracy with the proposed representations on the Labeled Face in the Wild (LFW) dataset, the YouTube video face database, and the CMU MultiPIE dataset.
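A rough sketch of a PEP-style representation is given below, assuming local descriptors have already been extracted and augmented with their (x, y) locations; the component count and the cosine-similarity scoring are illustrative choices, not the authors' implementation.

```python
import numpy as np
from sklearn.mixture import GaussianMixture

def train_pep_model(pooled_descriptors, n_parts=64, seed=0):
    """pooled_descriptors: (N, d) location-augmented descriptors from the whole
    training corpus; each fitted spherical component acts as one 'part'."""
    return GaussianMixture(n_components=n_parts, covariance_type="spherical",
                           random_state=seed).fit(pooled_descriptors)

def component_log_density(gmm, X):
    """Log N(x | mu_k, sigma_k^2 I) for every descriptor/part pair, shape (N, K)."""
    d = X.shape[1]
    sq = ((X[:, None, :] - gmm.means_[None, :, :]) ** 2).sum(axis=2)
    var = gmm.covariances_                 # shape (K,) for spherical components
    return -0.5 * (sq / var + d * np.log(2 * np.pi * var))

def pep_representation(gmm, image_descriptors):
    """Keep, for each part, the descriptor with the highest likelihood under that
    part, then concatenate the selected descriptors into one long vector."""
    log_dens = component_log_density(gmm, image_descriptors)
    best = np.argmax(log_dens, axis=0)     # one descriptor index per part
    return image_descriptors[best].ravel()

def similarity(rep_a, rep_b):
    """Cosine similarity between two PEP representations (one possible score)."""
    return float(rep_a @ rep_b /
                 (np.linalg.norm(rep_a) * np.linalg.norm(rep_b) + 1e-12))
```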
LLNL Genomic Assessment: Viral and Bacterial Sequencing Needs for TMTI, Task 1.4.2 Report
DOE Office of Scientific and Technical Information (OSTI.GOV)
Slezak, T; Borucki, M; Lam, M
Good progress has been made on both bacterial and viral sequencing by the TMTI centers. While access to appropriate samples is a limiting factor to throughput, excellent progress has been made with respect to getting agreements in place with key sources of relevant materials. Sharing of sequenced genomes funded by TMTI has been extremely limited to date. The April 2010 exercise should force a resolution to this, but additional managerial pressures may be needed to ensure that rapid sharing of TMTI-funded sequencing occurs, regardless of collaborator constraints concerning ultimate publication(s). Policies to permit TMTI-internal rapid sharing of sequenced genomes should be written into all TMTI agreements with collaborators now being negotiated. TMTI needs to establish a Web-based system for tracking samples destined for sequencing. This includes metadata on sample origins and contributor, information on sample shipment/receipt, prioritization by TMTI, assignment to one or more sequencing centers (including possible TMTI-sponsored sequencing at a contributor site), and status history of the sample sequencing effort. While this system could be a component of the AFRL system, it is not part of any current development effort. Policy and standardized procedures are needed to ensure appropriate verification of all TMTI samples prior to the investment in sequencing. PCR, arrays, and classical biochemical tests are examples of potential verification methods. Verification is needed to detect mislabeled, degraded, mixed or contaminated samples. Regular QC exercises are needed to ensure that the TMTI-funded centers are meeting all standards for producing quality genomic sequence data.
40 CFR 1065.342 - Sample dryer verification.
Code of Federal Regulations, 2013 CFR
2013-07-01
... condensation as required in § 1065.145(d)(1)(i). We recommend that the sample system components be maintained at least 5 °C above the local humidified gas dewpoint to prevent aqueous condensation. (5) Measure...
40 CFR 1065.342 - Sample dryer verification.
Code of Federal Regulations, 2014 CFR
2014-07-01
... condensation as required in § 1065.145(d)(1)(i). We recommend that the sample system components be maintained at least 5 °C above the local humidified gas dewpoint to prevent aqueous condensation. (5) Measure...
40 CFR 1065.342 - Sample dryer verification.
Code of Federal Regulations, 2012 CFR
2012-07-01
... condensation as required in § 1065.145(d)(1)(i). We recommend that the sample system components be maintained at least 5 °C above the local humidified gas dewpoint to prevent aqueous condensation. (5) Measure...
40 CFR 761.130 - Sampling requirements.
Code of Federal Regulations, 2014 CFR
2014-07-01
... developed by the Midwest Research Institute (MRI) for use in enforcement inspections: “Verification of PCB... the MRI report “Field Manual for Grid Sampling of PCB Spill Sites to Verify Cleanup.” Both the MRI...
40 CFR 761.130 - Sampling requirements.
Code of Federal Regulations, 2012 CFR
2012-07-01
... developed by the Midwest Research Institute (MRI) for use in enforcement inspections: “Verification of PCB... the MRI report “Field Manual for Grid Sampling of PCB Spill Sites to Verify Cleanup.” Both the MRI...
40 CFR 761.130 - Sampling requirements.
Code of Federal Regulations, 2013 CFR
2013-07-01
... developed by the Midwest Research Institute (MRI) for use in enforcement inspections: “Verification of PCB... the MRI report “Field Manual for Grid Sampling of PCB Spill Sites to Verify Cleanup.” Both the MRI...
Storms, S M; Feltus, A; Barker, A R; Joly, M-A; Girard, M
2009-03-01
Measurement of somatropin charged variants by isoelectric focusing was replaced with capillary zone electrophoresis in the January 2006 European Pharmacopoeia Supplement 5.3, based on results from an interlaboratory collaborative study. Due to incompatibilities and method-robustness issues encountered prior to verification, a number of method parameters required optimisation. As the use of a diode array detector at 195 nm or 200 nm led to a loss of resolution, a variable wavelength detector using a 200 nm filter was employed. Improved injection repeatability was obtained by increasing the injection time and pressure, and changing the sample diluent from water to running buffer. Finally, definition of capillary pre-treatment and rinse procedures resulted in more consistent separations over time. Method verification data are presented demonstrating linearity, specificity, repeatability, intermediate precision, limit of quantitation, sample stability, solution stability, and robustness. Based on these experiments, several modifications to the current method have been recommended and incorporated into the European Pharmacopoeia to help improve method performance across laboratories globally.
Applying Independent Verification and Validation to Automatic Test Equipment
NASA Technical Reports Server (NTRS)
Calhoun, Cynthia C.
1997-01-01
This paper describes a general overview of applying Independent Verification and Validation (IV&V) to Automatic Test Equipment (ATE). The overview is not inclusive of all IV&V activities that can occur or of all development and maintenance items that can be validated and verified during the IV&V process. A sampling of possible IV&V activities that can occur within each phase of the ATE life cycle is described.
NASA Astrophysics Data System (ADS)
Tang, Xiaoli; Lin, Tong; Jiang, Steve
2009-09-01
We propose a novel approach for potential online treatment verification using cine EPID (electronic portal imaging device) images for hypofractionated lung radiotherapy based on a machine learning algorithm. Hypofractionated radiotherapy requires high precision. It is essential to effectively monitor the target to ensure that the tumor is within the beam aperture. We modeled the treatment verification problem as a two-class classification problem and applied an artificial neural network (ANN) to classify the cine EPID images acquired during the treatment into corresponding classes—with the tumor inside or outside of the beam aperture. Training samples were generated for the ANN using digitally reconstructed radiographs (DRRs) with artificially added shifts in the tumor location—to simulate cine EPID images with different tumor locations. Principal component analysis (PCA) was used to reduce the dimensionality of the training samples and cine EPID images acquired during the treatment. The proposed treatment verification algorithm was tested on five hypofractionated lung patients in a retrospective fashion. On average, our proposed algorithm achieved a 98.0% classification accuracy, a 97.6% recall rate and a 99.7% precision rate. This work was first presented at the Seventh International Conference on Machine Learning and Applications, San Diego, CA, USA, 11-13 December 2008.
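The classification step described above can be sketched with off-the-shelf tools; the image size, PCA dimension and network size below are placeholders, and random arrays stand in for the DRR training set and the cine EPID frames.

```python
# Illustrative sketch: PCA for dimensionality reduction followed by a small
# neural-network classifier (1 = tumor inside the beam aperture, 0 = outside).
# Not the authors' code; data are synthetic placeholders.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.neural_network import MLPClassifier
from sklearn.pipeline import make_pipeline

rng = np.random.default_rng(0)
X_train = rng.normal(size=(400, 64 * 64))       # flattened "DRR" training images
y_train = rng.integers(0, 2, size=400)          # simulated inside/outside labels

model = make_pipeline(
    PCA(n_components=20),
    MLPClassifier(hidden_layer_sizes=(30,), max_iter=1000, random_state=0),
)
model.fit(X_train, y_train)

# At treatment time, each incoming cine EPID frame (flattened to the same
# size) would be classified on the fly.
X_frames = rng.normal(size=(5, 64 * 64))
print(model.predict(X_frames))
```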
Feedback Seeking in Early Adolescence: Self-Enhancement or Self-Verification?
Rosen, Lisa H; Principe, Connor P; Langlois, Judith H
2013-02-13
The authors examined whether early adolescents ( N = 90) solicit self-enhancing feedback (i.e., positive feedback) or self-verifying feedback (i.e., feedback congruent with self-views, even when these views are negative). Sixth, seventh, and eighth graders first completed a self-perception measure and then selected whether to receive positive or negative feedback from an unknown peer in different domains of self. Results were consistent with self-verification theory; adolescents who perceived themselves as having both strengths and weaknesses were more likely to seek negative feedback regarding a self-perceived weakness compared to a self-perceived strength. The authors found similar support for self-verification processes when they considered the entire sample regardless of perceived strengths and weaknesses; hierarchical linear modeling (HLM) examined the predictive power of ratings of self-perceived ability, certainty, and importance on feedback seeking for all participants and provided additional evidence of self-verification strivings in adolescence.
Feedback Seeking in Early Adolescence: Self-Enhancement or Self-Verification?
Rosen, Lisa H.; Principe, Connor P.; Langlois, Judith H.
2012-01-01
The authors examined whether early adolescents (N = 90) solicit self-enhancing feedback (i.e., positive feedback) or self-verifying feedback (i.e., feedback congruent with self-views, even when these views are negative). Sixth, seventh, and eighth graders first completed a self-perception measure and then selected whether to receive positive or negative feedback from an unknown peer in different domains of self. Results were consistent with self-verification theory; adolescents who perceived themselves as having both strengths and weaknesses were more likely to seek negative feedback regarding a self-perceived weakness compared to a self-perceived strength. The authors found similar support for self-verification processes when they considered the entire sample regardless of perceived strengths and weaknesses; hierarchical linear modeling (HLM) examined the predictive power of ratings of self-perceived ability, certainty, and importance on feedback seeking for all participants and provided additional evidence of self-verification strivings in adolescence. PMID:23543746
Design and Verification of Critical Pressurised Windows for Manned Spaceflight
NASA Astrophysics Data System (ADS)
Lamoure, Richard; Busto, Lara; Novo, Francisco; Sinnema, Gerben; Leal, Mendes M.
2014-06-01
The Window Design for Manned Spaceflight (WDMS) project was tasked with establishing the state of the art and exploring possible improvements to the current structural integrity verification and fracture control methodologies for manned spacecraft windows. A critical review of the state of the art in spacecraft window design, materials and verification practice was conducted. Shortcomings of the methodology in terms of analysis, inspection and testing were identified. Schemes for improving verification practices and reducing conservatism whilst maintaining the required safety levels were then proposed. An experimental materials characterisation programme was defined and carried out with the support of the Glass and Façade Technology Research Group at the University of Cambridge. Results of the sample testing campaign were analysed, post-processed and subsequently applied to the design of a breadboard window demonstrator. Two fused silica glass window panes were procured and subjected to dedicated analyses, inspection and testing comprising both qualification and acceptance programmes specifically tailored to the objectives of the activity. Finally, the main outcomes have been compiled into a Structural Verification Guide for Pressurised Windows in manned spacecraft, incorporating best practices and lessons learned throughout this project.
Study on verifying the angle measurement performance of the rotary-laser system
NASA Astrophysics Data System (ADS)
Zhao, Jin; Ren, Yongjie; Lin, Jiarui; Yin, Shibin; Zhu, Jigui
2018-04-01
A method to verify the angle measurement performance of the rotary-laser system was developed. Angle measurement performance has a great impact on measuring accuracy. Although there is some previous research on the verification of angle measuring uncertainty for the rotary-laser system, there are still some limitations. High-precision reference angles are used in the study of the method, and an integrated verification platform is set up to evaluate the performance of the system. This paper also probes the error that has the biggest influence on the verification system. Some errors of the verification system are avoided via the experimental method, and some are compensated for through a computational formula and curve fitting. Experimental results show that the angle measurement performance meets the requirement for coordinate measurement. The verification platform can evaluate the uncertainty of angle measurement for the rotary-laser system efficiently.
Formal methods for dependable real-time systems
NASA Technical Reports Server (NTRS)
Rushby, John
1993-01-01
The motivation for using formal methods to specify and reason about real-time properties is outlined, and approaches that have been proposed and used are sketched. The formal verifications of clock synchronization algorithms are taken as showing that mechanically supported reasoning about complex real-time behavior is feasible. However, there has been a significant increase in the effectiveness of verification systems since those verifications were performed, and it is to be expected that verifications of comparable difficulty will become fairly routine. The current challenge lies in developing perspicuous and economical approaches to the formalization and specification of real-time properties.
NASA Technical Reports Server (NTRS)
Barile, Ronald G.; Fogarty, Chris; Cantrell, Chris; Melton, Gregory S.
1994-01-01
NASA personnel at Kennedy Space Center's Material Science Laboratory have developed new environmentally sound precision cleaning and verification techniques for systems and components found at the center. This technology is required to replace existing methods traditionally employing CFC-113. The new patent-pending technique of precision cleaning verification is for large components of cryogenic fluid systems. These are stainless steel, sand cast valve bodies with internal surface areas ranging from 0.2 to 0.9 sq m. Extrapolation of this technique to components of even larger sizes (by orders of magnitude) is planned. Currently, the verification process is completely manual. In the new technique, a high velocity, low volume water stream impacts the part to be verified. This process is referred to as Breathing Air/Water Impingement and forms the basis for the Impingement Verification System (IVS). The system is unique in that a gas stream is used to accelerate the water droplets to high speeds. Water is injected into the gas stream in a small, continuous amount. The air/water mixture is then passed through a converging/diverging nozzle where the gas is accelerated to supersonic velocities. These droplets impart sufficient energy to the precision cleaned surface to place non-volatile residue (NVR) contaminants into suspension in the water. The sample water is collected and its NVR level is determined by total organic carbon (TOC) analysis at 880 C. The TOC, in ppm carbon, is used to establish the NVR level. A correlation between the present gravimetric CFC113 NVR and the IVS NVR is found from experimental sensitivity factors measured for various contaminants. The sensitivity has the units of ppm of carbon per mg/sq ft of contaminant. In this paper, the equipment is described and data are presented showing the development of the sensitivity factors from a test set including four NVRs impinged from witness plates of 0.05 to 0.75 sq m.
NASA Technical Reports Server (NTRS)
Barile, Ronald G.; Fogarty, Chris; Cantrell, Chris; Melton, Gregory S.
1995-01-01
NASA personnel at Kennedy Space Center's Material Science Laboratory have developed new environmentally sound precision cleaning and verification techniques for systems and components found at the center. This technology is required to replace existing methods traditionally employing CFC-113. The new patent-pending technique of precision cleaning verification is for large components of cryogenic fluid systems. These are stainless steel, sand cast valve bodies with internal surface areas ranging from 0.2 to 0.9 m(exp 2). Extrapolation of this technique to components of even larger sizes (by orders of magnitude) is planned. Currently, the verification process is completely manual. In the new technique, a high velocity, low volume water stream impacts the part to be verified. This process is referred to as Breathing Air/Water Impingement and forms the basis for the Impingement Verification System (IVS). The system is unique in that a gas stream is used to accelerate the water droplets to high speeds. Water is injected into the gas stream in a small, continuous amount. The air/water mixture is then passed through a converging-diverging nozzle where the gas is accelerated to supersonic velocities. These droplets impart sufficient energy to the precision cleaned surface to place non-volatile residue (NVR) contaminants into suspension in the water. The sample water is collected and its NVR level is determined by total organic carbon (TOC) analysis at 880 C. The TOC, in ppm carbon, is used to establish the NVR level. A correlation between the present gravimetric CFC-113 NVR and the IVS NVR is found from experimental sensitivity factors measured for various contaminants. The sensitivity has the units of ppm of carbon per mg-ft(exp 2) of contaminant. In this paper, the equipment is described and data are presented showing the development of the sensitivity factors from a test set including four NVR's impinged from witness plates of 0.05 to 0.75 m(exp 2).
Analysis of Nerve Agent Metabolites from Hair for Long-Term Verification of Nerve Agent Exposure
2016-05-09
In this study, hair was evaluated as a long-term repository of nerve agent hydrolysis products. Pinacolyl methylphosphonic acid (PMPA; hydrolysis product of soman) and isopropyl methylphosphonic acid (IMPA; hydrolysis product of sarin) were extracted from hair samples with N,N…
Urine sampling and collection system
NASA Technical Reports Server (NTRS)
Fogal, G. L.; Mangialardi, J. K.; Reinhardt, C. G.
1971-01-01
This specification defines the performance and design requirements for the urine sampling and collection system engineering model and establishes requirements for its design, development, and test. The model shall provide conceptual verification of a system applicable to manned space flight which will automatically provide for collection, volume sensing, and sampling of urine.
SU-F-T-494: A Multi-Institutional Study of Independent Dose Verification Using Golden Beam Data
DOE Office of Scientific and Technical Information (OSTI.GOV)
Itano, M; Yamazaki, T; Tachibana, R
Purpose: In general, the beam data of an individual linac is measured for an independent dose verification software program, and the verification is performed as a secondary check. In this study, independent dose verification using golden beam data was compared to that using the individual linac's beam data. Methods: Six institutions participated and three different beam data sets were prepared. One was the individually measured data (Original Beam Data, OBD). The others were generated from all measurements for the same linac model (Model-GBD) and for all linac models (All-GBD). The three different beam data sets were registered to the independent verification software program for each institute. Subsequently, patients' plans in eight sites (brain, head and neck, lung, esophagus, breast, abdomen, pelvis and bone) were analyzed using the verification program to compare doses calculated using the three different beam data sets. Results: 1116 plans were collected from the six institutes. Compared to using the OBD, the variation using the Model-GBD-based calculation and the All-GBD was 0.0 ± 0.3% and 0.0 ± 0.6%, respectively. The maximum variations were 1.2% and 2.3%, respectively. The plans with a variation over 1% had reference points located away from the central axis, with or without a physical wedge. Conclusion: The confidence limit (2SD) using the Model-GBD and the All-GBD was within 0.6% and 1.2%, respectively. Thus, the use of golden beam data may be feasible for independent verification. In addition, verification using golden beam data provides quality assurance of planning from the point of view of an audit. This research is partially supported by the Japan Agency for Medical Research and Development (AMED).
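A minimal sketch of the comparison statistic reported above, the per-plan percent deviation between doses calculated with golden and original beam data and a 2SD "confidence limit"; the dose values below are synthetic.

```python
import numpy as np

dose_obd = np.array([2.00, 1.98, 2.05, 1.50, 1.52])   # Gy, original beam data
dose_gbd = np.array([2.01, 1.97, 2.06, 1.51, 1.53])   # Gy, golden beam data

deviation_pct = 100.0 * (dose_gbd - dose_obd) / dose_obd
mean, sd = deviation_pct.mean(), deviation_pct.std(ddof=1)
print(f"deviation = {mean:.2f} +/- {sd:.2f} %, confidence limit (2SD) = {2 * sd:.2f} %")
```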
Implications of sampling design and sample size for national carbon accounting systems
Michael Köhl; Andrew Lister; Charles T. Scott; Thomas Baldauf; Daniel Plugge
2011-01-01
Countries willing to adopt a REDD regime need to establish a national Measurement, Reporting and Verification (MRV) system that provides information on forest carbon stocks and carbon stock changes. Due to the extensive areas covered by forests the information is generally obtained by sample based surveys. Most operational sampling approaches utilize a combination of...
NASA Astrophysics Data System (ADS)
Connick, Robert J.
Accurate measurement of normal incident transmission loss is essential for the acoustic characterization of building materials. In this research, a method of measuring normal incidence sound transmission loss proposed by Salissou et al. as a complement to standard E2611-09 of the American Society for Testing and Materials [Standard Test Method for Measurement of Normal Incidence Sound Transmission of Acoustical Materials Based on the Transfer Matrix Method (American Society for Testing and Materials, New York, 2009)] is verified. Two samples from the original literature are used to verify the method as well as a Filtros RTM sample. Following the verification, several nano-material Aerogel samples are measured.
van Hoof, Joris J; Gosselt, Jordy F; de Jong, Menno D T
2010-02-01
To compare traditional in-store age verification with a newly developed remote age verification system, 100 cigarette purchase attempts were made by 15-year-old "mystery shoppers." The remote system led to a strong increase in compliance (96% vs. 12%), reflecting more identification requests and more sale refusals when adolescents showed their identification cards. Copyright 2010 Society for Adolescent Medicine. Published by Elsevier Inc. All rights reserved.
Formulating face verification with semidefinite programming.
Yan, Shuicheng; Liu, Jianzhuang; Tang, Xiaoou; Huang, Thomas S
2007-11-01
This paper presents a unified solution to three unsolved problems existing in face verification with subspace learning techniques: selection of verification threshold, automatic determination of subspace dimension, and deducing feature fusing weights. In contrast to previous algorithms which search for the projection matrix directly, our new algorithm investigates a similarity metric matrix (SMM). With a certain verification threshold, this matrix is learned by a semidefinite programming approach, along with the constraints of the kindred pairs with similarity larger than the threshold, and inhomogeneous pairs with similarity smaller than the threshold. Then, the subspace dimension and the feature fusing weights are simultaneously inferred from the singular value decomposition of the derived SMM. In addition, the weighted and tensor extensions are proposed to further improve the algorithmic effectiveness and efficiency, respectively. Essentially, the verification is conducted within an affine subspace in this new algorithm and is, hence, called the affine subspace for verification (ASV). Extensive experiments show that the ASV can achieve encouraging face verification accuracy in comparison to other subspace algorithms, even without the need to explore any parameters.
NASA Astrophysics Data System (ADS)
Hilmy, N.; Febrida, A.; Basril, A.
2007-11-01
Problems in applying International Standard (ISO) 11137 to tissue allografts for validation of the radiation sterilization dose (RSD) are the limited and low numbers of uniform samples per production batch, i.e., products obtained from one donor. An allograft is a graft transplanted between two different individuals of the same species. The minimum number of uniform samples needed for the verification dose (VD) experiment at the selected sterility assurance level (SAL) per production batch according to the IAEA Code is 20, i.e., 10 for bio-burden determination and the remaining 10 for the sterilization test. Three methods of the IAEA Code have been used for validation of the RSD, i.e., method A1, which is a modification of method 1 of ISO 11137:1995, method B (ISO 13409:1996), and method C (AAMI TIR 27:2001). This paper describes VD experiments using uniform products obtained from one cadaver donor, i.e., cancellous bones and demineralized bone powders, and amnion grafts from one living donor. Results of the verification dose experiments show that the RSD is 15.4 kGy for cancellous and demineralized bone grafts and 19.2 kGy for amnion grafts according to method A1, and 25 kGy according to methods B and C.
NASA Astrophysics Data System (ADS)
Poinsot, Audrey; Yang, Fan; Brost, Vincent
2011-02-01
Including multiple sources of information in personal identity recognition and verification gives the opportunity to greatly improve performance. We propose a contactless biometric system that combines two modalities: palmprint and face. Hardware implementations are proposed on the Texas Instrument Digital Signal Processor and Xilinx Field-Programmable Gate Array (FPGA) platforms. The algorithmic chain consists of a preprocessing (which includes palm extraction from hand images), Gabor feature extraction, comparison by Hamming distance, and score fusion. Fusion possibilities are discussed and tested first using a bimodal database of 130 subjects that we designed (uB database), and then two common public biometric databases (AR for face and PolyU for palmprint). High performance has been obtained for recognition and verification purpose: a recognition rate of 97.49% with AR-PolyU database and an equal error rate of 1.10% on the uB database using only two training samples per subject have been obtained. Hardware results demonstrate that preprocessing can easily be performed during the acquisition phase, and multimodal biometric recognition can be treated almost instantly (0.4 ms on FPGA). We show the feasibility of a robust and efficient multimodal hardware biometric system that offers several advantages, such as user-friendliness and flexibility.
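A hedged sketch of the matching chain described above: binarised Gabor responses compared by normalised Hamming distance, followed by a weighted-sum fusion of the palmprint and face scores. The filter parameters and the fusion weight are assumptions, not the paper's values.

```python
import numpy as np
from scipy.signal import fftconvolve

def gabor_kernel(size=15, wavelength=8.0, theta=0.0, sigma=4.0):
    """Real (even) Gabor filter with illustrative default parameters."""
    half = size // 2
    y, x = np.mgrid[-half:half + 1, -half:half + 1]
    xr = x * np.cos(theta) + y * np.sin(theta)
    yr = -x * np.sin(theta) + y * np.cos(theta)
    return np.exp(-(xr**2 + yr**2) / (2 * sigma**2)) * np.cos(2 * np.pi * xr / wavelength)

def binary_code(image, theta=0.0):
    """Sign of the Gabor response, flattened to a bit vector."""
    resp = fftconvolve(image, gabor_kernel(theta=theta), mode="same")
    return (resp > 0).ravel()

def hamming_distance(code_a, code_b):
    """Fraction of differing bits: 0 = identical, around 0.5 = unrelated."""
    return float(np.mean(code_a != code_b))

def fused_score(palm_a, palm_b, face_a, face_b, w_palm=0.6):
    """Weighted-sum fusion of the two modality distances; accept if below a threshold."""
    d_palm = hamming_distance(binary_code(palm_a), binary_code(palm_b))
    d_face = hamming_distance(binary_code(face_a), binary_code(face_b))
    return w_palm * d_palm + (1 - w_palm) * d_face
```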
ETV TEST OF PCDD/F EMISSIONS MONITORING SYSTEMS
Four polychlorinated dibenzodioxin and furan (PCDD/F) emission monitors were tested under the EPA Environmental Technology and Verification (ETV) program. Two long-term sampling devices, the DioxinMonitoringSystem and Adsorption Method for Sampling Dioxins and Furans, and two sem...
[Development of a microenvironment test chamber for airborne microbe research].
Zhan, Ningbo; Chen, Feng; Du, Yaohua; Cheng, Zhi; Li, Chenyu; Wu, Jinlong; Wu, Taihu
2017-10-01
One of the most important environmental cleanliness indicators is airborne microbes. However, the particularity of clean operating environments and controlled experimental environments often limits airborne microbe research. This paper describes the design and implementation of a microenvironment test chamber for airborne microbe research under normal test conditions. Numerical simulation with Fluent showed that airborne microbes were evenly dispersed in the upper part of the test chamber and had a bottom-up concentration growth distribution. According to the simulation results, a verification experiment was carried out by selecting 5 sampling points at different spatial positions in the test chamber. Experimental results showed that the average particle concentrations at all sampling points reached 10^7 counts/m^3 after 5 minutes' dispersal of Staphylococcus aureus, and all sampling points showed a consistent concentration distribution. The concentration of airborne microbes in the upper chamber was slightly higher than that in the middle chamber, which in turn was slightly higher than that in the bottom chamber. This is consistent with the results of the numerical simulation and proves that the system can be well used for airborne microbe research.
Time-space modal logic for verification of bit-slice circuits
NASA Astrophysics Data System (ADS)
Hiraishi, Hiromi
1996-03-01
The major goal of this paper is to propose a new modal logic aiming at formal verification of bit-slice circuits. The new logic is called time-space modal logic, and its major feature is that it can handle two transition relations: one for time transition and the other for space transition. As for a verification algorithm, a symbolic model checking algorithm for the new logic is shown. This could be applicable to verification of bit-slice microprocessors of infinite bit width and 1D systolic arrays of infinite length. A simple benchmark result shows the effectiveness of the proposed approach.
2006-09-30
High-Pressure Waterjet • CO2 Pellet/Turbine Wheel • Ultrahigh-Pressure Waterjet • Process Water Reuse/Recycle • Cross-Flow Microfiltration … documented on a process or laboratory form. Corrective action will involve taking all necessary steps to restore a measuring system to proper working order. In all cases, a nonconformance will be rectified before sample processing and analysis continues. If corrective action does not restore the
Skates, Steven J.; Gillette, Michael A.; LaBaer, Joshua; Carr, Steven A.; Anderson, N. Leigh; Liebler, Daniel C.; Ransohoff, David; Rifai, Nader; Kondratovich, Marina; Težak, Živana; Mansfield, Elizabeth; Oberg, Ann L.; Wright, Ian; Barnes, Grady; Gail, Mitchell; Mesri, Mehdi; Kinsinger, Christopher R.; Rodriguez, Henry; Boja, Emily S.
2014-01-01
Protein biomarkers are needed to deepen our understanding of cancer biology and to improve our ability to diagnose, monitor and treat cancers. Important analytical and clinical hurdles must be overcome to allow the most promising protein biomarker candidates to advance into clinical validation studies. Although contemporary proteomics technologies support the measurement of large numbers of proteins in individual clinical specimens, sample throughput remains comparatively low. This problem is amplified in typical clinical proteomics research studies, which routinely suffer from a lack of proper experimental design, resulting in analysis of too few biospecimens to achieve adequate statistical power at each stage of a biomarker pipeline. To address this critical shortcoming, a joint workshop was held by the National Cancer Institute (NCI), National Heart, Lung and Blood Institute (NHLBI), and American Association for Clinical Chemistry (AACC), with participation from the U.S. Food and Drug Administration (FDA). An important output from the workshop was a statistical framework for the design of biomarker discovery and verification studies. Herein, we describe the use of quantitative clinical judgments to set statistical criteria for clinical relevance, and the development of an approach to calculate biospecimen sample size for proteomic studies in discovery and verification stages prior to clinical validation stage. This represents a first step towards building a consensus on quantitative criteria for statistical design of proteomics biomarker discovery and verification research. PMID:24063748
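In the spirit of the framework described above, a verification-stage biospecimen sample size can be computed from a quantitative judgment of clinical relevance; the sketch below uses a two-sample t-test power calculation, and the effect size, alpha and power are assumptions chosen only for illustration.

```python
# Sample size per group for comparing a candidate protein biomarker between
# cases and controls at the verification stage. All inputs are assumptions.
from statsmodels.stats.power import TTestIndPower

effect_size = 0.8   # standardized mean difference judged clinically relevant
alpha = 0.01        # stringent because many candidates are tested in parallel
power = 0.90

n_per_group = TTestIndPower().solve_power(effect_size=effect_size,
                                          alpha=alpha, power=power)
print(f"~{n_per_group:.0f} biospecimens per group")
```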
Skates, Steven J; Gillette, Michael A; LaBaer, Joshua; Carr, Steven A; Anderson, Leigh; Liebler, Daniel C; Ransohoff, David; Rifai, Nader; Kondratovich, Marina; Težak, Živana; Mansfield, Elizabeth; Oberg, Ann L; Wright, Ian; Barnes, Grady; Gail, Mitchell; Mesri, Mehdi; Kinsinger, Christopher R; Rodriguez, Henry; Boja, Emily S
2013-12-06
Protein biomarkers are needed to deepen our understanding of cancer biology and to improve our ability to diagnose, monitor, and treat cancers. Important analytical and clinical hurdles must be overcome to allow the most promising protein biomarker candidates to advance into clinical validation studies. Although contemporary proteomics technologies support the measurement of large numbers of proteins in individual clinical specimens, sample throughput remains comparatively low. This problem is amplified in typical clinical proteomics research studies, which routinely suffer from a lack of proper experimental design, resulting in analysis of too few biospecimens to achieve adequate statistical power at each stage of a biomarker pipeline. To address this critical shortcoming, a joint workshop was held by the National Cancer Institute (NCI), National Heart, Lung, and Blood Institute (NHLBI), and American Association for Clinical Chemistry (AACC) with participation from the U.S. Food and Drug Administration (FDA). An important output from the workshop was a statistical framework for the design of biomarker discovery and verification studies. Herein, we describe the use of quantitative clinical judgments to set statistical criteria for clinical relevance and the development of an approach to calculate biospecimen sample size for proteomic studies in discovery and verification stages prior to clinical validation stage. This represents a first step toward building a consensus on quantitative criteria for statistical design of proteomics biomarker discovery and verification research.
Orthorexia nervosa: validation of a diagnosis questionnaire.
Donini, L M; Marsili, D; Graziani, M P; Imbriale, M; Cannella, C
2005-06-01
To validate a questionnaire for the diagnosis of orthorexia nervosa, an eating disorder defined as "maniacal obsession for healthy food", 525 subjects were enrolled. They were then randomized into two samples (a sample of 404 subjects for the construction of the test for the diagnosis of orthorexia, ORTO-15, and a sample of 121 subjects for the validation of the test). The ORTO-15 questionnaire, validated for the diagnosis of orthorexia, is made up of 15 multiple-choice items. The test we proposed for the diagnosis of orthorexia (ORTO-15) showed good predictive capability at a threshold value of 40 (efficacy 73.8%, sensitivity 55.6% and specificity 75.8%), also on verification with a control sample. However, it has a limit in identifying the obsessive disorder. For this reason we maintain that further investigation is necessary and that new questions useful for the evaluation of obsessive-compulsive behavior should be added to the ORTO-15 questionnaire.
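A toy sketch of how a diagnostic cut-off such as ORTO-15 < 40 can be scored against a reference classification; the scores and diagnoses below are fabricated, and the direction of the threshold (lower score = positive) is an assumption for illustration.

```python
import numpy as np

scores = np.array([33, 38, 45, 41, 36, 50, 39, 44])       # ORTO-15 scores
has_disorder = np.array([1, 1, 0, 0, 1, 0, 1, 0], bool)   # reference diagnosis
test_positive = scores < 40                                # lower score = positive

tp = np.sum(test_positive & has_disorder)
tn = np.sum(~test_positive & ~has_disorder)
fp = np.sum(test_positive & ~has_disorder)
fn = np.sum(~test_positive & has_disorder)

sensitivity = tp / (tp + fn)
specificity = tn / (tn + fp)
efficacy = (tp + tn) / len(scores)    # overall accuracy
print(f"sensitivity={sensitivity:.2f}, specificity={specificity:.2f}, efficacy={efficacy:.2f}")
```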
DOE Office of Scientific and Technical Information (OSTI.GOV)
J. M. Capron
2008-08-08
The 100-F-46 French drain consisted of a 1.5 to 3 m long, vertically buried, gravel-filled pipe that was approximately 1 m in diameter. Also included in this waste site was a 5 cm cast-iron pipeline that drained condensate from the 119-F Stack Sampling Building into the 100-F-46 French drain. In accordance with this evaluation, the confirmatory sampling results support a reclassification of this site to No Action. The current site conditions achieve the remedial action objectives and the corresponding remedial action goals established in the Remaining Sites ROD. The results of confirmatory sampling show that residual contaminant concentrations do not preclude any future uses and allow for unrestricted use of shallow zone soils. The results also demonstrate that residual contaminant concentrations are protective of groundwater and the Columbia River.
Applying Monte Carlo Simulation to Launch Vehicle Design and Requirements Verification
NASA Technical Reports Server (NTRS)
Hanson, John M.; Beard, Bernard B.
2010-01-01
This paper is focused on applying Monte Carlo simulation to probabilistic launch vehicle design and requirements verification. The approaches developed in this paper can be applied to other complex design efforts as well. Typically the verification must show that requirement "x" is met for at least "y" % of cases, with, say, 10% consumer risk or 90% confidence. Two particular aspects of making these runs for requirements verification will be explored in this paper. First, there are several types of uncertainties that should be handled in different ways, depending on when they become known (or not). The paper describes how to handle different types of uncertainties and how to develop vehicle models that can be used to examine their characteristics. This includes items that are not known exactly during the design phase but that will be known for each assembled vehicle (can be used to determine the payload capability and overall behavior of that vehicle), other items that become known before or on flight day (can be used for flight day trajectory design and go/no go decision), and items that remain unknown on flight day. Second, this paper explains a method (order statistics) for determining whether certain probabilistic requirements are met or not and enables the user to determine how many Monte Carlo samples are required. Order statistics is not new, but may not be known in general to the GN&C community. The methods also apply to determining the design values of parameters of interest in driving the vehicle design. The paper briefly discusses when it is desirable to fit a distribution to the experimental Monte Carlo results rather than using order statistics.
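The order-statistics (binomial) argument for sizing such a Monte Carlo verification run can be sketched as follows; the compliance fraction, consumer risk and allowed failure counts are example values only, not figures from the paper.

```python
# Smallest number of Monte Carlo samples N such that, if at most k failures
# are observed, the claim "requirement met in at least y% of cases" is
# accepted with no more than the stated consumer risk.
from scipy.stats import binom

def required_samples(y=0.997, consumer_risk=0.10, allowed_failures=0):
    p_fail = 1.0 - y
    n = allowed_failures + 1
    # Increase N until the chance of seeing <= k failures, when the true
    # compliance is exactly y, drops to the consumer risk.
    while binom.cdf(allowed_failures, n, p_fail) > consumer_risk:
        n += 1
    return n

for k in (0, 1, 2):
    print(f"allowed failures = {k}: N = {required_samples(allowed_failures=k)}")
```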
Using Concept Space to Verify Hyponymy in Building a Hyponymy Lexicon
NASA Astrophysics Data System (ADS)
Liu, Lei; Zhang, Sen; Diao, Lu Hong; Yan, Shu Ying; Cao, Cun Gen
Verification of hyponymy relations is a basic problem in knowledge acquisition. We present a method of hyponymy verification based on concept space. Firstly, we define the concept space for a group of candidate hyponymy relations. Secondly, we analyze the concept space and define a set of hyponymy features based on the space structure. Then we use these features to verify the candidate hyponymy relations. Experimental results show that the method can provide adequate verification of hyponymy.
Exploration of Uncertainty in Glacier Modelling
NASA Technical Reports Server (NTRS)
Thompson, David E.
1999-01-01
There are procedures and methods for verification of coding algebra and for validations of models and calculations that are in use in the aerospace computational fluid dynamics (CFD) community. These methods would be efficacious if used by the glacier dynamics modelling community. This paper is a presentation of some of those methods, and how they might be applied to uncertainty management supporting code verification and model validation for glacier dynamics. The similarities and differences between their use in CFD analysis and the proposed application of these methods to glacier modelling are discussed. After establishing sources of uncertainty and methods for code verification, the paper looks at a representative sampling of verification and validation efforts that are underway in the glacier modelling community, and establishes a context for these within overall solution quality assessment. Finally, an information architecture and interactive interface is introduced and advocated. This Integrated Cryospheric Exploration (ICE) Environment is proposed for exploring and managing sources of uncertainty in glacier modelling codes and methods, and for supporting scientific numerical exploration and verification. The details and functionality of this Environment are described based on modifications of a system already developed for CFD modelling and analysis.
NASA Technical Reports Server (NTRS)
Venkatapathy, Ethiraj; Gage, Peter; Wright, Michael J.
2017-01-01
Mars Sample Return is our Grand Challenge for the coming decade. TPS (Thermal Protection System) nominal performance is not the key challenge. The main difficulty for designers is the need to verify unprecedented reliability for the entry system: current guidelines for prevention of backward contamination require that the probability of spores larger than 1 micron in diameter escaping into the Earth environment be lower than 1 in 1,000,000 for the entire system, and the allocation to the TPS would be more stringent than that. For reference, the reliability allocation for the Orion TPS is closer to 1 in 1,000, and the demonstrated reliability for previous human Earth return systems was closer to 1 in 100. Improving reliability by more than 3 orders of magnitude is a grand challenge indeed. The TPS community must embrace the possibility of new architectures that are focused on reliability above thermal performance and mass efficiency. The MSR (Mars Sample Return) EEV (Earth Entry Vehicle) will be hit by MMOD (Micrometeoroid and Orbital Debris) prior to reentry. A chute-less aero-shell design that allows for a self-righting shape was baselined in prior MSR studies, with the assumption that a passive system will maximize EEV robustness. Hence the aero-shell, along with the TPS, has to take the ground impact without breaking apart. System verification will require testing to establish ablative performance and thermal failure, but also testing of damage from MMOD and of structural performance at ground impact. Mission requirements will demand analysis, testing and verification that are focused on establishing the reliability of the design. In this proposed talk, we will focus on the grand challenge of the MSR EEV TPS and the need for innovative approaches to address challenges in modeling, testing, manufacturing and verification.
40 CFR 1065.925 - PEMS preparation for field testing.
Code of Federal Regulations, 2010 CFR
2010-07-01
.... 1065.925 Section 1065.925 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) AIR... purge any gaseous sampling PEMS instruments with ambient air until sampling begins to prevent system contamination from excessive cold-start emissions. (e) Conduct calibrations and verifications. (f) Operate any...
ENVIRONMENTAL TECHNOLOGY VERIFICATION (ETV) TEST OF DIOXIN EMISSION MONITORS
The performance of four dioxin emission monitors including two long-term sampling devices, the DMS (DioxinMonitoringSystem) and AMESA (Adsorption Method for Sampling Dioxins and Furans), and two semi-real-time continuous monitors, RIMMPA-TOFMS (Resonance Ionization with Multi-Mir...
Self-verification and depression among youth psychiatric inpatients.
Joiner, T E; Katz, J; Lew, A S
1997-11-01
According to self-verification theory (e.g., W.B. Swann, 1983), people are motivated to preserve stable self-concepts by seeking self-confirming interpersonal responses, even if the responses are negative. In the current study of 72 youth psychiatric inpatients (36 boys; 36 girls; ages 7-17, M = 13.18; SD = 2.59), the authors provide the 1st test of self-verification theory among a youth sample. Participants completed self-report questionnaires on depression, self-esteem, anxiety, negative and positive affect, and interest in negative feedback from others. The authors made chart diagnoses available, and they collected peer rejection ratings. Consistent with hypotheses, the authors found that interest in negative feedback was associated with depression, was predictive of peer rejection (but only within relatively longer peer relationships), was more highly related to cognitive than emotional aspects of depression, and was specifically associated with depression, rather than being generally associated with emotional distress. The authors discuss implications for self-verification theory and for the phenomenology of youth depression.
Improved Hip-Based Individual Recognition Using Wearable Motion Recording Sensor
NASA Astrophysics Data System (ADS)
Gafurov, Davrondzhon; Bours, Patrick
In today's society the demand for reliable verification of a user's identity is increasing. Although biometric technologies based on fingerprint or iris can provide accurate and reliable recognition performance, they are inconvenient for periodic or frequent re-verification. In this paper we propose a hip-based user recognition method which can be suitable for implicit and periodic re-verification of identity. In our approach we use a wearable accelerometer sensor attached to the hip of the person, and the measured hip motion signal is then analysed for identity verification purposes. The main analysis steps consist of detecting gait cycles in the signal and matching two sets of detected gait cycles. Evaluating the approach on a hip data set consisting of 400 gait sequences (samples) from 100 subjects, we obtained an equal error rate (EER) of 7.5%, and the identification rate at rank 1 was 81.4%. These numbers are improvements of 37.5% and 11.2%, respectively, over the previous study using the same data set.
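A simplified sketch of the two analysis steps named above, cycle detection and cycle-set matching; the peak-detection settings, fixed cycle length and average-cycle distance are illustrative choices, not the authors' exact method.

```python
import numpy as np
from scipy.signal import find_peaks

def detect_cycles(acc_magnitude, fs=100, min_cycle_s=0.8, cycle_len=100):
    """Split a resultant-acceleration signal into gait cycles between
    prominent peaks, each resampled to a fixed length."""
    peaks, _ = find_peaks(acc_magnitude, distance=int(min_cycle_s * fs))
    cycles = []
    for a, b in zip(peaks[:-1], peaks[1:]):
        seg = acc_magnitude[a:b]
        t_old = np.linspace(0.0, 1.0, len(seg))
        t_new = np.linspace(0.0, 1.0, cycle_len)
        cycles.append(np.interp(t_new, t_old, seg))
    return np.array(cycles)

def match_score(cycles_enrolled, cycles_probe):
    """Distance between the average cycles; lower means more likely the same person."""
    return float(np.linalg.norm(cycles_enrolled.mean(axis=0) -
                                cycles_probe.mean(axis=0)))
```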
A study of applications scribe frame data verifications using design rule check
NASA Astrophysics Data System (ADS)
Saito, Shoko; Miyazaki, Masaru; Sakurai, Mitsuo; Itoh, Takahisa; Doi, Kazumasa; Sakurai, Norioko; Okada, Tomoyuki
2013-06-01
In semiconductor manufacturing, scribe frame data is generally generated for each LSI product according to its specific process design. Scribe frame data is designed based on definition tables of scanner alignment, wafer inspection and customer-specified marks. At the end, we check that the scribe frame design conforms to the specifications of the alignment and inspection marks. Recently, in COT (customer owned tooling) business or new technology development, there has been no effective verification method for the scribe frame data, and verification takes a lot of time. Therefore, we tried to establish a new verification method for scribe frame data by applying pattern matching and DRC (Design Rule Check), which is used in device verification. We show the scheme of scribe frame data verification using DRC that we tried to apply. First, verification rules are created based on the specifications of the scanner, inspection and other requirements, and a mark library is also created for pattern matching. Next, DRC verification is performed on the scribe frame data; the DRC verification includes pattern matching using the mark library. As a result, our experiments demonstrated that by use of pattern matching and DRC verification our new method can yield speed improvements of more than 12 percent compared to the conventional mark checks by visual inspection, and the inspection time can be reduced to less than 5 percent if multi-CPU processing is used. Our method delivers both short processing time and excellent accuracy when checking many marks. It is easy to maintain and provides an easy way for COT customers to use original marks. We believe that our new DRC verification method for scribe frame data is indispensable and mutually beneficial.
Branck, Tobyn A.; Hurley, Matthew J.; Prata, Gianna N.; Crivello, Christina A.
2017-01-01
ABSTRACT Listeria monocytogenes is of great concern in food processing facilities because it persists in biofilms, facilitating biotransfer. Stainless steel is commonly used for food contact surfaces and transport containers. L. monocytogenes biofilms on stainless steel served as a model system for surface sampling, to test the performance of a sonicating swab in comparison with a standard cotton swab. Swab performance and consistency were determined using total viable counts. Stainless steel coupons sampled with both types of swabs were examined using scanning electron microscopy, to visualize biofilms and surface structures (i.e., polishing grooves and scratches). Laser scanning confocal microscopy was used to image and to quantitate the biofilms remaining after sampling with each swab type. The total viable counts were significantly higher (P ≤ 0.05) with the sonicating swab than with the standard swab in each trial. The sonicating swab was more consistent in cell recovery than was the standard swab, with coefficients of variation ranging from 8.9% to 12.3% and from 7.1% to 37.6%, respectively. Scanning electron microscopic imaging showed that biofilms remained in the polished grooves of the coupons sampled with the standard swab but were noticeably absent with the sonicating swab. Percent area measurements of biofilms remaining on the stainless steel coupons showed significantly (P ≤ 0.05) less biofilm remaining when the sonicating swab was used (median, 1.1%), compared with the standard swab (median, 70.4%). The sonicating swab provided greater recovery of cells, with more consistency, than did the standard swab, and it employs sonication, suction, and scrubbing. IMPORTANCE Inadequate surface sampling can result in foodborne illness outbreaks from biotransfer, since verification of sanitization protocols relies on surface sampling and recovery of microorganisms for detection and enumeration. Swabbing is a standard method for microbiological sampling of surfaces. Although swabbing offers portability and ease of use, there are limitations, such as high user variability and low recovery rates, which can be attributed to many different causes. This study demonstrates some benefits that a sonicating swab has over a standard swab for removal and collection of microbiological samples from a surface, to provide better verification of surface cleanliness and to help decrease the potential for biotransfer of pathogens into foods. PMID:28314729
Branck, Tobyn A; Hurley, Matthew J; Prata, Gianna N; Crivello, Christina A; Marek, Patrick J
2017-06-01
Listeria monocytogenes is of great concern in food processing facilities because it persists in biofilms, facilitating biotransfer. Stainless steel is commonly used for food contact surfaces and transport containers. L. monocytogenes biofilms on stainless steel served as a model system for surface sampling, to test the performance of a sonicating swab in comparison with a standard cotton swab. Swab performance and consistency were determined using total viable counts. Stainless steel coupons sampled with both types of swabs were examined using scanning electron microscopy, to visualize biofilms and surface structures (i.e., polishing grooves and scratches). Laser scanning confocal microscopy was used to image and to quantitate the biofilms remaining after sampling with each swab type. The total viable counts were significantly higher (P ≤ 0.05) with the sonicating swab than with the standard swab in each trial. The sonicating swab was more consistent in cell recovery than was the standard swab, with coefficients of variation ranging from 8.9% to 12.3% and from 7.1% to 37.6%, respectively. Scanning electron microscopic imaging showed that biofilms remained in the polished grooves of the coupons sampled with the standard swab but were noticeably absent with the sonicating swab. Percent area measurements of biofilms remaining on the stainless steel coupons showed significantly (P ≤ 0.05) less biofilm remaining when the sonicating swab was used (median, 1.1%), compared with the standard swab (median, 70.4%). The sonicating swab provided greater recovery of cells, with more consistency, than did the standard swab, and it employs sonication, suction, and scrubbing. IMPORTANCE Inadequate surface sampling can result in foodborne illness outbreaks from biotransfer, since verification of sanitization protocols relies on surface sampling and recovery of microorganisms for detection and enumeration. Swabbing is a standard method for microbiological sampling of surfaces. Although swabbing offers portability and ease of use, there are limitations, such as high user variability and low recovery rates, which can be attributed to many different causes. This study demonstrates some benefits that a sonicating swab has over a standard swab for removal and collection of microbiological samples from a surface, to provide better verification of surface cleanliness and to help decrease the potential for biotransfer of pathogens into foods. Copyright © 2017 American Society for Microbiology.
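As a rough illustration of the swab-comparison statistics described above (coefficient of variation of recovered counts and a significance test between swab types), a short sketch follows. The count values are made up, and the Mann-Whitney test is an assumed choice, since the abstract does not state which test was used.

```python
import numpy as np
from scipy import stats

def coefficient_of_variation(x):
    """CV in percent: sample standard deviation relative to the mean."""
    x = np.asarray(x, dtype=float)
    return 100.0 * x.std(ddof=1) / x.mean()

# Hypothetical log10 CFU recoveries from replicate coupons (not the study's data)
sonicating = np.array([6.9, 7.1, 7.0, 6.8, 7.2])
standard   = np.array([5.8, 6.4, 5.2, 6.1, 4.9])

print("CV sonicating: %.1f%%" % coefficient_of_variation(sonicating))
print("CV standard:   %.1f%%" % coefficient_of_variation(standard))

# Nonparametric comparison of recoveries (illustrative choice of test)
stat, p = stats.mannwhitneyu(sonicating, standard, alternative="greater")
print("Mann-Whitney U = %.1f, p = %.4f" % (stat, p))
```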
On marker-based parentage verification via non-linear optimization.
Boerner, Vinzent
2017-06-15
Parentage verification by molecular markers is mainly based on short tandem repeat markers. Single nucleotide polymorphisms (SNPs) as bi-allelic markers have become the markers of choice for genotyping projects. Thus, the subsequent step is to use SNP genotypes for parentage verification as well. Recent developments of algorithms such as evaluating opposing homozygous SNP genotypes have drawbacks, for example the inability to reject all animals of a sample of potential parents. This paper describes an algorithm for parentage verification by constrained regression which overcomes the latter limitation and proves to be very fast and accurate even when the number of SNPs is as low as 50. The algorithm was tested on a sample of 14,816 animals with 50, 100 and 500 SNP genotypes randomly selected from 40k genotypes. The samples of putative parents of these animals contained either five random animals, or four random animals and the true sire. Parentage assignment was performed by ranking of regression coefficients, or by setting a minimum threshold for regression coefficients. The assignment quality was evaluated by the power of assignment (P_A) and the power of exclusion (P_E). If the sample of putative parents contained the true sire and parentage was assigned by coefficient ranking, P_A and P_E were both higher than 0.99 for the 500 and 100 SNP genotypes, and higher than 0.98 for the 50 SNP genotypes. When parentage was assigned by a coefficient threshold, P_A was higher than 0.99 regardless of the number of SNPs, but P_E decreased from 0.99 (500 SNPs) to 0.97 (100 SNPs) and 0.92 (50 SNPs). If the sample of putative parents did not contain the true sire and parentage was rejected using a coefficient threshold, the algorithm achieved a P_E of 1 (500 SNPs), 0.99 (100 SNPs) and 0.97 (50 SNPs). The algorithm described here is easy to implement, fast and accurate, and is able to assign parentage using genomic marker data with as few as 50 SNPs.
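The core idea, regressing the offspring's SNP gene content on the gene-content vectors of all putative parents under a non-negativity constraint and then ranking or thresholding the coefficients, can be sketched in a few lines. The example below is a simplified illustration with synthetic genotypes (the offspring is modelled by its expected gene content plus noise), not the exact constraint set or data of the paper.

```python
import numpy as np
from scipy.optimize import nnls

def rank_candidate_parents(offspring, candidates):
    """Regress the offspring genotype vector (0/1/2 per SNP) on the genotype
    vectors of candidate parents with non-negativity constraints, and return
    candidates ranked by their regression coefficient."""
    X = np.column_stack([np.asarray(c, dtype=float) for c in candidates])
    y = np.asarray(offspring, dtype=float)
    coefs, _ = nnls(X, y)            # constrained (non-negative) least squares
    order = np.argsort(coefs)[::-1]
    return [(int(i), float(coefs[i])) for i in order]

# Tiny synthetic example (50 SNPs, 5 candidates; candidate 0 is the true sire)
rng = np.random.default_rng(1)
n_snp = 50
sire = rng.integers(0, 3, n_snp)
dam = rng.integers(0, 3, n_snp)
offspring = np.clip((sire + dam) / 2.0 + rng.normal(0, 0.3, n_snp), 0, 2)
candidates = [sire] + [rng.integers(0, 3, n_snp) for _ in range(4)]

for idx, coef in rank_candidate_parents(offspring, candidates):
    print(f"candidate {idx}: coefficient {coef:.2f}")
```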
Soil Gas Sample Handling: Evaluation of Water Removal and Sample Ganging
DOE Office of Scientific and Technical Information (OSTI.GOV)
Fritz, Brad G.; Abrecht, David G.; Hayes, James C.
2016-10-31
Soil gas sampling is currently conducted in support of Nuclear Test Ban Treaty verification. Soil gas samples are collected and analyzed for isotopes of interest. Some issues that can impact sampling and analysis of these samples are excess moisture and sample processing time. Here we discuss three potential improvements to the current sampling protocol: a desiccant for water removal, use of molecular sieve to remove CO2 from the sample during collection, and a ganging manifold to allow composite analysis of multiple samples.
Accurate Biomass Estimation via Bayesian Adaptive Sampling
NASA Technical Reports Server (NTRS)
Wheeler, Kevin R.; Knuth, Kevin H.; Castle, Joseph P.; Lvov, Nikolay
2005-01-01
The following concepts were introduced: a) Bayesian adaptive sampling for solving biomass estimation; b) characterization of MISR Rahman model parameters conditioned upon MODIS landcover; c) a rigorous non-parametric Bayesian approach to analytic mixture model determination; and d) a unique U.S. asset for science product validation and verification.
Rapid determination of alpha emitters using Actinide resin.
Navarro, N; Rodriguez, L; Alvarez, A; Sancho, C
2004-01-01
The European Commission has recently published the recommended radiological protection criteria for the clearance of building and building rubble from the dismantling of nuclear installations. Radionuclide specific clearance levels for actinides are very low (between 0.1 and 1 Bq g(-1)). The prevalence of natural radionuclides in rubble materials makes the verification of these levels by direct alpha counting impossible. The capability of Actinide resin (Eichrom Industries, Inc.) for extracting plutonium and americium from rubble samples has been tested in this work. Besides a strong affinity for actinides in the tri, tetra and hexavalent oxidation states, this extraction chromatographic resin presents an easy recovery of absorbed radionuclides. The retention capability was evaluated on rubble samples spiked with certified radionuclide standards (239Pu and 241Am). Samples were leached with nitric acid, passed through a chromatographic column containing the resin and the elution fraction was measured by LSC. Actinide retention varies from 60% to 80%. Based on these results, a rapid method for the verification of clearance levels for actinides in rubble samples is proposed.
Optimal Verification of Entangled States with Local Measurements
NASA Astrophysics Data System (ADS)
Pallister, Sam; Linden, Noah; Montanaro, Ashley
2018-04-01
Consider the task of verifying that a given quantum device, designed to produce a particular entangled state, does indeed produce that state. One natural approach would be to characterize the output state by quantum state tomography, or alternatively, to perform some kind of Bell test, tailored to the state of interest. We show here that neither approach is optimal among local verification strategies for 2-qubit states. We find the optimal strategy in this case and show that quadratically fewer total measurements are needed to verify to within a given fidelity than in published results for quantum state tomography, Bell test, or fidelity estimation protocols. We also give efficient verification protocols for any stabilizer state. Additionally, we show that requiring that the strategy be constructed from local, nonadaptive, and noncollective measurements only incurs a constant-factor penalty over a strategy without these restrictions.
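The paper also gives efficient verification protocols for stabilizer states, where each round measures a randomly chosen stabilizer with local Pauli measurements and accepts on a +1 outcome. Below is a minimal, deliberately non-optimal simulation of such a test for the two-qubit Bell state; the measurement model and acceptance counting are simplifying assumptions for illustration, not the optimal strategy derived in the paper.

```python
import numpy as np

# Pauli matrices
I2 = np.eye(2, dtype=complex)
X = np.array([[0, 1], [1, 0]], dtype=complex)
Z = np.array([[1, 0], [0, -1]], dtype=complex)

def measure_stabilizer(state, paulis, rng):
    """Sample a +1/-1 outcome for the product of local Paulis on a two-qubit
    pure state via the Born rule (idealised measurement, sketch only)."""
    P = np.kron(paulis[0], paulis[1])
    p_plus = np.real(state.conj() @ ((np.eye(4) + P) / 2) @ state)
    return 1 if rng.random() < p_plus else -1

def verify_bell_state(state, n_rounds=1000, seed=0):
    """Simple stabilizer test for |Phi+>: each round picks the generator XX or
    ZZ at random, measures it locally, and counts +1 outcomes."""
    rng = np.random.default_rng(seed)
    generators = [(X, X), (Z, Z)]
    passes = 0
    for _ in range(n_rounds):
        g = generators[rng.integers(len(generators))]
        if measure_stabilizer(state, g, rng) == 1:
            passes += 1
    return passes / n_rounds

phi_plus = np.array([1, 0, 0, 1], dtype=complex) / np.sqrt(2)
noisy = 0.97 * phi_plus + 0.03 * np.array([0, 1, 0, 0], dtype=complex)
noisy /= np.linalg.norm(noisy)

print("pass rate, ideal state:    ", verify_bell_state(phi_plus))
print("pass rate, perturbed state:", verify_bell_state(noisy))
```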
Nguyen, Huynh; Morgan, David A F; Sly, Lindsay I; Benkovich, Morris; Cull, Sharon; Forwood, Mark R
2008-06-01
ISO 11137-2006 (ISO 11137-2a 2006) provides a VDmax 15 method for substantiation of 15 kGy as the radiation sterilisation dose (RSD) for health care products with a relatively low sample requirement. Moreover, the method is also valid for products in which the bioburden level is less than or equal to 1.5. In the literature, the bioburden level of processed bone allografts is extremely low. Similarly, the Queensland Bone Bank (QBB) usually recovers no viable organisms from processed bone allografts. Because bone allografts are treated as a type of health care product, the aim of this research was to substantiate 15 kGy as the RSD for frozen bone allografts at the QBB using method VDmax 15-ISO 11137-2: 2006 (ISO 11137-2e, Procedure for method VDmax 15 for multiple production batches. Sterilisation of health care products - radiation - part 2: establishing the sterilisation dose, 2006; ISO 11137-2f, Procedure for method VDmax 15 for a single production batch. Sterilisation of health care products - radiation - part 2: establishing the sterilisation dose, 2006). Thirty femoral heads, 40 milled bone allografts and 40 structural bone allografts manufactured according to QBB standard operating procedures were used. Estimated bioburdens for each bone allograft group were used to calculate the verification doses. Next, 10 samples per group were irradiated at the verification dose, sterility was tested and the number of positive tests of sterility was recorded. If no more than one of the 10 tests carried out in a group was positive, the verification was accepted and 15 kGy was substantiated as the RSD for those bone allografts. The bioburdens in all three groups were 0, and therefore the verification doses were 0 kGy. Sterility tests of femoral heads and milled bones were all negative (no contamination), and there was one positive test of sterility in the structural bone allograft group. Accordingly, the verification was accepted. Using the ISO-validated VDmax 15 protocol, 15 kGy was substantiated as the RSD for frozen bone allografts manufactured at the QBB.
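The acceptance logic of the verification experiment described above is simple: irradiate 10 samples per group at the computed verification dose and accept substantiation of 15 kGy if no more than one sample yields a positive test of sterility. A minimal sketch of that decision rule follows; the function name and structure are illustrative rather than taken from ISO 11137-2.

```python
def vdmax_verification_accepted(positive_tests, samples_tested=10, max_positives=1):
    """VDmax-style acceptance rule: the verification passes if no more than
    `max_positives` of the irradiated samples test positive for sterility."""
    if samples_tested != 10:
        raise ValueError("VDmax 15 verification uses 10 samples per group")
    return positive_tests <= max_positives

# Results reported in the study (0, 0 and 1 positive tests per allograft group)
for group, positives in [("femoral heads", 0), ("milled bone", 0), ("structural bone", 1)]:
    status = "accepted" if vdmax_verification_accepted(positives) else "failed"
    print(f"{group}: {positives} positive test(s) -> verification {status}")
```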
Analysis of particulate contamination on tape lift samples from the VETA optical surfaces
NASA Technical Reports Server (NTRS)
Germani, Mark S.
1992-01-01
Particulate contamination analysis was carried out on samples taken from the Verification Engineering Test Article (VETA) x-ray detection system. A total of eighteen tape lift samples were taken from the VETA optical surfaces. Initially, the samples were tested using a scanning electron microscope. Additionally, particle composition was determined by energy dispersive x-ray spectrometry. Results are presented in terms of particle loading per sample.
NASA Technical Reports Server (NTRS)
Hornung, Steven D.; Biesinger, Paul; Kirsch, Mike; Beeson, Harold; Leuders, Kathy
1999-01-01
The NASA White Sands Test Facility (WSTF) has developed an entirely aqueous final cleaning and verification process to replace the current chlorofluorocarbon (CFC) 113 based process. This process has been accepted for final cleaning and cleanliness verification of WSTF ground support equipment. The aqueous process relies on ultrapure water at 50 C (323 K) and ultrasonic agitation for removal of organic compounds and particulate. The cleanliness is verified by determining the total organic carbon (TOC) content and by filtration with particulate counting. The effectiveness of the aqueous methods for detecting hydrocarbon contamination and particulate was compared to the accepted CFC 113 sampling procedures. Testing with known contaminants, such as hydraulic fluid and cutting and lubricating oils, was performed to establish a correlation between aqueous TOC and CFC 113 nonvolatile residue (NVR). Particulate sampling was performed on cleaned batches of hardware that were randomly separated and sampled by the two methods. This paper presents the approach and results, and discusses the issues in establishing the equivalence of aqueous sampling to CFC 113 sampling, while describing the approach for implementing aqueous techniques on Space Shuttle Propulsion hardware.
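Establishing equivalence between the two cleanliness measures amounts to regressing one against the other over hardware spiked with known contaminants. The sketch below shows such a correlation with made-up paired data; it is not WSTF's actual dataset or analysis.

```python
import numpy as np

# Hypothetical paired measurements on spiked coupons (mg per sample area)
toc = np.array([0.5, 1.1, 2.0, 3.2, 4.1, 5.5])   # aqueous total organic carbon
nvr = np.array([0.9, 2.1, 3.8, 6.0, 7.9, 10.4])  # CFC 113 nonvolatile residue

# Ordinary least-squares fit NVR = a * TOC + b and correlation coefficient
a, b = np.polyfit(toc, nvr, 1)
r = np.corrcoef(toc, nvr)[0, 1]
print(f"NVR ~ {a:.2f} * TOC + {b:.2f}, r = {r:.3f}")
```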
DOE Office of Scientific and Technical Information (OSTI.GOV)
Peters, T. B.; Bannochie, C. J.
Savannah River National Laboratory (SRNL) analyzed samples from Tank 21H in support of verification of Macrobatch (Salt Batch) 11 for the Interim Salt Disposition Program (ISDP) for processing. This document reports characterization data on the samples of Tank 21H and fulfills the requirements of Deliverable 3 of the Technical Task Request (TTR).
ENVIRONMENTAL TECHNOLOGY VERIFICATION REPORT: ANEST IWATA CORPORATION LPH400-LV HVLP SPRAY GUN
This Environmental Technology Verification report describes the characteristics of a paint spray gun. The research showed that the spray gun provided absolute and relative increases in transfer efficiency over the baseline and provided a reduction in the use of paint.
NASA Astrophysics Data System (ADS)
Wier, Timothy P.; Moser, Cameron S.; Grant, Jonathan F.; Riley, Scott C.; Robbins-Wamsley, Stephanie H.; First, Matthew R.; Drake, Lisa A.
2017-10-01
Both L-shaped ("L") and straight ("Straight") sample probes have been used to collect water samples from a main ballast line in land-based or shipboard verification testing of ballast water management systems (BWMS). A series of experiments was conducted to quantify and compare the sampling efficiencies of L and Straight sample probes. The findings from this research-that both L and Straight probes sample organisms with similar efficiencies-permit increased flexibility for positioning sample probes aboard ships.
Galaxy bias from galaxy–galaxy lensing in the DES science verification data
Prat, J.; Sánchez, C.; Miquel, R.; ...
2017-09-25
Here, we present a measurement of galaxy–galaxy lensing around a magnitude-limited (iAB < 22.5) sample of galaxies from the dark energy survey science verification (DES-SV) data. We split these lenses into three photometric-redshift bins from 0.2 to 0.8, and determine the product of the galaxy bias b and cross-correlation coefficient between the galaxy and dark matter overdensity fields r in each bin, using scales above 4 h-1 Mpc comoving, where we find the linear bias model to be valid given our current uncertainties. We compare our galaxy bias results from galaxy–galaxy lensing with those obtained from galaxy clustering and CMB lensing for the same sample of galaxies, and find our measurements to be in good agreement with those in Crocce et al., while, in the lowest redshift bin (z ~ 0.3), they show some tension with the findings in Giannantonio et al. We measure b · r to be 0.87 ± 0.11, 1.12 ± 0.16 and 1.24 ± 0.23, respectively, for the three redshift bins of width Δz = 0.2 in the range 0.2 < z < 0.8, defined with the photometric-redshift algorithm bpz. Using a different code to split the lens sample, tpz, leads to changes in the measured biases at the 10–20 per cent level, but it does not alter the main conclusion of this work: when comparing with Crocce et al. we do not find strong evidence for a cross-correlation parameter significantly below one in this galaxy sample, except possibly at the lowest redshift bin (z ~ 0.3), where we find r = 0.71 ± 0.11 when using tpz, and 0.83 ± 0.12 with bpz.
Galaxy bias from galaxy–galaxy lensing in the DES science verification data
DOE Office of Scientific and Technical Information (OSTI.GOV)
Prat, J.; Sánchez, C.; Miquel, R.
Here, we present a measurement of galaxy–galaxy lensing around a magnitude-limited (iAB < 22.5) sample of galaxies from the dark energy survey science verification (DES-SV) data. We split these lenses into three photometric-redshift bins from 0.2 to 0.8, and determine the product of the galaxy bias b and cross-correlation coefficient between the galaxy and dark matter overdensity fields r in each bin, using scales above 4 h-1 Mpc comoving, where we find the linear bias model to be valid given our current uncertainties. We compare our galaxy bias results from galaxy–galaxy lensing with those obtained from galaxy clustering and CMB lensing for the same sample of galaxies, and find our measurements to be in good agreement with those in Crocce et al., while, in the lowest redshift bin (z ~ 0.3), they show some tension with the findings in Giannantonio et al. We measure b · r to be 0.87 ± 0.11, 1.12 ± 0.16 and 1.24 ± 0.23, respectively, for the three redshift bins of width Δz = 0.2 in the range 0.2 < z < 0.8, defined with the photometric-redshift algorithm bpz. Using a different code to split the lens sample, tpz, leads to changes in the measured biases at the 10–20 per cent level, but it does not alter the main conclusion of this work: when comparing with Crocce et al. we do not find strong evidence for a cross-correlation parameter significantly below one in this galaxy sample, except possibly at the lowest redshift bin (z ~ 0.3), where we find r = 0.71 ± 0.11 when using tpz, and 0.83 ± 0.12 with bpz.
Galaxy bias from galaxy-galaxy lensing in the DES science verification data
NASA Astrophysics Data System (ADS)
Prat, J.; Sánchez, C.; Miquel, R.; Kwan, J.; Blazek, J.; Bonnett, C.; Amara, A.; Bridle, S. L.; Clampitt, J.; Crocce, M.; Fosalba, P.; Gaztanaga, E.; Giannantonio, T.; Hartley, W. G.; Jarvis, M.; MacCrann, N.; Percival, W. J.; Ross, A. J.; Sheldon, E.; Zuntz, J.; Abbott, T. M. C.; Abdalla, F. B.; Annis, J.; Benoit-Lévy, A.; Bertin, E.; Brooks, D.; Burke, D. L.; Carnero Rosell, A.; Carrasco Kind, M.; Carretero, J.; Castander, F. J.; da Costa, L. N.; DePoy, D. L.; Desai, S.; Diehl, H. T.; Doel, P.; Eifler, T. F.; Evrard, A. E.; Fausti Neto, A.; Flaugher, B.; Frieman, J.; Gerdes, D. W.; Goldstein, D. A.; Gruen, D.; Gruendl, R. A.; Gutierrez, G.; Honscheid, K.; James, D. J.; Kuehn, K.; Kuropatkin, N.; Lahav, O.; Lima, M.; Marshall, J. L.; Melchior, P.; Menanteau, F.; Nord, B.; Plazas, A. A.; Reil, K.; Romer, A. K.; Roodman, A.; Sanchez, E.; Scarpine, V.; Schubnell, M.; Sevilla-Noarbe, I.; Smith, R. C.; Soares-Santos, M.; Sobreira, F.; Suchyta, E.; Swanson, M. E. C.; Tarle, G.; Thomas, D.; Walker, A. R.
2018-01-01
We present a measurement of galaxy-galaxy lensing around a magnitude-limited (iAB < 22.5) sample of galaxies from the dark energy survey science verification (DES-SV) data. We split these lenses into three photometric-redshift bins from 0.2 to 0.8, and determine the product of the galaxy bias b and cross-correlation coefficient between the galaxy and dark matter overdensity fields r in each bin, using scales above 4 h-1 Mpc comoving, where we find the linear bias model to be valid given our current uncertainties. We compare our galaxy bias results from galaxy-galaxy lensing with those obtained from galaxy clustering and CMB lensing for the same sample of galaxies, and find our measurements to be in good agreement with those in Crocce et al., while, in the lowest redshift bin (z ∼ 0.3), they show some tension with the findings in Giannantonio et al. We measure b · r to be 0.87 ± 0.11, 1.12 ± 0.16 and 1.24 ± 0.23, respectively, for the three redshift bins of width Δz = 0.2 in the range 0.2 < z < 0.8, defined with the photometric-redshift algorithm BPZ. Using a different code to split the lens sample, TPZ, leads to changes in the measured biases at the 10-20 per cent level, but it does not alter the main conclusion of this work: when comparing with Crocce et al. we do not find strong evidence for a cross-correlation parameter significantly below one in this galaxy sample, except possibly at the lowest redshift bin (z ∼ 0.3), where we find r = 0.71 ± 0.11 when using TPZ, and 0.83 ± 0.12 with BPZ.
Upgrade Summer Severe Weather Tool
NASA Technical Reports Server (NTRS)
Watson, Leela
2011-01-01
The goal of this task was to upgrade the existing severe weather database by adding observations from the 2010 warm season, update the verification dataset with results from the 2010 warm season, use statistical logistic regression analysis on the database, and develop a new forecast tool. The AMU analyzed 7 stability parameters that showed the possibility of providing guidance in forecasting severe weather, calculated verification statistics for the Total Threat Score (TTS), and calculated warm season verification statistics for the 2010 season. The AMU also performed statistical logistic regression analysis on the 22-year severe weather database. The results indicated that the logistic regression equation did not show an increase in skill over the previously developed TTS. The equation showed less accuracy than TTS at predicting severe weather, little ability to distinguish between severe and non-severe weather days, and worse standard categorical accuracy measures and skill scores than TTS.
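A logistic-regression forecast tool of the kind evaluated here can be prototyped in a few lines: fit the model on the stability parameters, then compute simple categorical verification scores. The predictors, labels and scores below are placeholders, not the AMU's 22-year database or the TTS.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

# Hypothetical warm-season database: 7 stability parameters per day
# (e.g. CAPE-like indices) and a binary severe/non-severe label.
rng = np.random.default_rng(0)
X = rng.normal(size=(2200, 7))                         # placeholder predictors
y = (X[:, 0] + 0.5 * X[:, 1] + rng.normal(size=2200) > 1.5).astype(int)

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.3, random_state=0)
model = LogisticRegression(max_iter=1000).fit(X_train, y_train)
pred = model.predict(X_test)

# Simple categorical verification scores (hits/misses/false alarms)
hits = np.sum((pred == 1) & (y_test == 1))
misses = np.sum((pred == 0) & (y_test == 1))
false_alarms = np.sum((pred == 1) & (y_test == 0))
pod = hits / max(hits + misses, 1)                 # probability of detection
far = false_alarms / max(hits + false_alarms, 1)   # false alarm ratio
print(f"POD = {pod:.2f}, FAR = {far:.2f}")
```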
2013-09-01
[Table-of-contents fragments: 4.7 Sampling Results; 5.0 Performance Results; 5.2 Performance Results Discussion; 5.2.1 Energy: Verify Power Production]
Experimental evaluation of fingerprint verification system based on double random phase encoding
NASA Astrophysics Data System (ADS)
Suzuki, Hiroyuki; Yamaguchi, Masahiro; Yachida, Masuyoshi; Ohyama, Nagaaki; Tashima, Hideaki; Obi, Takashi
2006-03-01
We proposed a smart card holder authentication system that combines fingerprint verification with PIN verification by applying a double random phase encoding scheme. In this system, the probability of accurate verification of an authorized individual decreases when the fingerprint is shifted significantly. In this paper, a review of the proposed system is presented and preprocessing for improving the false rejection rate is proposed. In the proposed method, the position difference between two fingerprint images is estimated by using an optimized template for core detection. When the estimated difference exceeds the permissible level, the user inputs the fingerprint again. The effectiveness of the proposed method is confirmed by a computational experiment; its results show that the false rejection rate is improved.
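Double random phase encoding itself can be written as two phase-mask multiplications, one in the image plane and one in the Fourier plane. The numpy sketch below shows the encoding/decoding pipeline with random placeholder data; in the actual system the transform is performed optically and the PIN supplies the phase keys.

```python
import numpy as np

def drpe_encrypt(image, phase1, phase2):
    """Double random phase encoding: apply one random phase mask in the input
    plane and a second one in the Fourier plane."""
    field = image * np.exp(1j * 2 * np.pi * phase1)
    spectrum = np.fft.fft2(field) * np.exp(1j * 2 * np.pi * phase2)
    return np.fft.ifft2(spectrum)

def drpe_decrypt(cipher, phase1, phase2):
    """Inverse of drpe_encrypt using the conjugate phase masks."""
    spectrum = np.fft.fft2(cipher) * np.exp(-1j * 2 * np.pi * phase2)
    field = np.fft.ifft2(spectrum) * np.exp(-1j * 2 * np.pi * phase1)
    return np.real(field)

rng = np.random.default_rng(42)
fingerprint = rng.random((64, 64))                        # stand-in for a fingerprint image
key1, key2 = rng.random((64, 64)), rng.random((64, 64))   # PIN-derived phase masks

cipher = drpe_encrypt(fingerprint, key1, key2)
recovered = drpe_decrypt(cipher, key1, key2)
print("max reconstruction error:", np.abs(recovered - fingerprint).max())
```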
DOE Office of Scientific and Technical Information (OSTI.GOV)
Weaver, Phyllis C.
A team from ORAU's Independent Environmental Assessment and Verification Program performed verification survey activities on the South Test Tower and four Hazardous Waste Storage Lockers. Scan data collected by ORAU determined that both the alpha and alpha-plus-beta activity were representative of radiological background conditions. The count rate distribution showed no outliers that would be indicative of alpha or alpha-plus-beta count rates in excess of background. It is the opinion of ORAU that the independent verification data collected support the site's conclusions that the South Tower and Lockers sufficiently meet the site criteria for release to recycle and reuse.
In-orbit verification of MHS spectral channels co-registration using the moon
NASA Astrophysics Data System (ADS)
Bonsignori, Roberto
2017-09-01
In-orbit verification of the co-registration of channels in a scanning microwave or infrared radiometer can in principle be done during normal in-orbit operation, by using the regular events of lunar intrusion in the instrument cold space calibration view. A technique of data analysis based on best fit of data across lunar intrusions has been used to check the mutual alignment of the spectral channels of the MHS instrument. MHS (Microwave Humidity Sounder) is a cross-track scanning radiometer in the millimetre-wave range flying on EUMETSAT and NOAA polar satellites, used operationally for the retrieval of atmospheric parameters in numerical weather prediction and nowcasting. This technique does not require any special operation or manoeuvre and only relies on analysis of data from the nominal scanning operation. The co-alignment of sounding channels and window channels can be evaluated by this technique, which would not be possible by using earth landmarks, due to the absorption effect of the atmosphere. The analysis reported in this paper shows an achievable accuracy below 0.5 mrad against a beam width at 3dB and spatial sampling interval of about 20 mrad. In-orbit results for the MHS instrument on Metop-B are also compared with the pre-launch instrument characterisation, showing a good correlation.
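Conceptually, the co-registration check reduces to fitting each channel's response as the Moon crosses the cold-space view and comparing the fitted beam centres. The sketch below is a simplified one-dimensional version with a Gaussian fit and synthetic numbers; the real analysis is a best fit of data across full lunar intrusions as described above.

```python
import numpy as np
from scipy.optimize import curve_fit

def gaussian(x, amp, center, width, offset):
    return amp * np.exp(-0.5 * ((x - center) / width) ** 2) + offset

def channel_centroid(scan_angle, counts):
    """Fit the lunar-intrusion response of one channel with a Gaussian and
    return the fitted beam centre (same units as scan_angle)."""
    p0 = [counts.max() - counts.min(), scan_angle[np.argmax(counts)], 1.0, counts.min()]
    popt, _ = curve_fit(gaussian, scan_angle, counts, p0=p0)
    return popt[1]

# Synthetic intrusions for two channels with a small mutual misalignment
angle = np.linspace(-10, 10, 200)                    # mrad, illustrative
rng = np.random.default_rng(3)
ch1 = gaussian(angle, 100, 0.0, 3.3, 5) + rng.normal(0, 1, angle.size)
ch2 = gaussian(angle, 90, 0.4, 3.3, 5) + rng.normal(0, 1, angle.size)

offset = channel_centroid(angle, ch2) - channel_centroid(angle, ch1)
print(f"estimated co-registration offset: {offset:.2f} mrad")
```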
Investigations on the magnetization behavior of magnetic composite particles
NASA Astrophysics Data System (ADS)
Eichholz, Christian; Knoll, Johannes; Lerche, Dietmar; Nirschl, Hermann
2014-11-01
In the life sciences, the application of surface-functionalized magnetic composite particles is becoming established in diagnostics and in downstream processing of modern biotechnology. These magnetic composite particles consist of a non-magnetic material, e.g. polystyrene, which serves as a matrix for the second, magnetic component, usually colloidal magnetite. Because of the multitude of magnetic cores, these magnetic beads show a complex magnetization behavior which cannot be described with the available approaches for homogeneous magnetic material. Therefore, in this work a new model for the magnetization behavior of magnetic composite particles is developed. By introducing an effective magnetization and considering an overall demagnetization factor, the deviation from the demagnetization behavior of homogeneously magnetized particles is taken into account. Calculated and experimental results show good agreement, which allows for the verification of the adapted model of particle magnetization. Besides, a newly developed magnetic analyzing centrifuge is used for the characterization of magnetic composite particle systems. The experimental results, also used for the model verification, give information about both the magnetic properties and the interaction behavior of particle systems. By adding further components to the particle solution, such as salts or proteins, industrially relevant systems can be reconstructed. The analyzing tool can be used to adapt industrial processes without time-consuming preliminary tests with large samples in the process equipment.
The TraceDetect's SafeGuard is designed to automatically measure total arsenic concentrations in drinking water samples (including raw water and treated water) over a range from 1 ppb to over 100 ppb. Once the operator has introduced the sample vial and selected "measure"...
ENVIRONMENTAL TECHNOLOGY VERIFICATION REPORT, PCB DETECTION TECHNOLOGY, HYBRIZYME DELFIA TM ASSAY
The DELFIA PCB Assay is a solid-phase time-resolved fluoroimmunoassay based on the sequential addition of sample extract and europium-labeled PCB tracer to a monoclonal antibody reagent specific for PCBs. In this assay, the antibody reagent and sample extract are added to a strip...
Federal Register 2010, 2011, 2012, 2013, 2014
2013-08-28
... persons to submit comments on this document. Comments may be submitted by one of the following methods... very low (less than one percent), and this carcass sampling was expensive for the Agency. As stated in.... Following the implementation of PR/HACCP, FSIS analyzed only one pathogen per sample. Then, in 2008, FSIS...
A VST and VISTA study of globular clusters in NGC 253
NASA Astrophysics Data System (ADS)
Cantiello, Michele; Grado, Aniello; Rejkuba, Marina; Arnaboldi, Magda; Capaccioli, Massimo; Greggio, Laura; Iodice, Enrica; Limatola, Luca
2018-03-01
Context. Globular clusters (GCs) are key to our understanding of the Universe, as laboratories of stellar evolution, fossil tracers of the past formation epoch of the host galaxy, and effective distance indicators from local to cosmological scales. Aim. We analyze the properties of the sources in NGC 253 with the aim of defining an up-to-date catalog of GC candidates in the galaxy. Given the distance of the galaxy, GCs in NGC 253 are ideal targets for resolved color-magnitude diagram studies of extragalactic GCs with next-generation diffraction limited ground-based telescopes. Methods: Our analysis is based on the science verification data of two ESO survey telescopes, VST and VISTA. Using ugri photometry from VST and JKs from VISTA, GC candidates were selected using as reference the morpho-photometric and color properties of spectroscopically confirmed GCs available in the literature. The strength of the results was verified against available archival HST/ACS data from the GHOSTS survey: all but two of the selected GC candidates appear as star clusters in HST footprints. Results: The adopted GC selection leads to the definition of a sample of ˜350 GC candidates. At visual inspection, we find that 82 objects match all the requirements for selecting GC candidates and 155 are flagged as uncertain GC candidates; however, 110 are unlikely GCs, which are most likely background galaxies. Furthermore, our analysis shows that four of the previously spectroscopically confirmed GCs, i.e., ˜20% of the total spectroscopic sample, are more likely either background galaxies or high-velocity Milky Way stars. The radial density profile of the selected best candidates shows the typically observed r^1/4-law radial profile. The analysis of the color distributions reveals only marginal evidence of the presence of color bimodality, which is normally observed in galaxies of similar luminosity. The GC luminosity function does not show the typical symmetry, mainly because of the lack of bright GCs. Part of the bright GCs missing might be at very large galactocentric distances or along the line of sight of the galaxy dusty disk. As an alternative possibility, we speculate that a fraction of low luminosity GC candidates might instead be metal-rich, intermediate age clusters, but fall in a similar color interval of old, metal-poor GCs. Conclusions: Defining a contaminant-free sample of GCs in extragalactic systems is not a straightforward exercise. Using optical and near-IR photometry we purged the list of GCs with spectroscopic membership and photometric GC candidates in NGC 253. Our results show that the use of either spectroscopic or photometric data only does not generally ensure a contaminant-free sample and a combination of both spectroscopy and photometry is preferred. Table 3 is only available at the CDS via anonymous ftp to http://cdsarc.u-strasbg.fr (http://130.79.128.5) or via http://cdsarc.u-strasbg.fr/viz-bin/qcat?J/A+A/611/A21. This work is based on observations taken at the ESO La Silla Paranal Observatory within the VST Science Verification Programme ID 60.A-9286(A) and VISTA Science Verification Programme ID 60.A-9285(A).
Mazumder, Avik; Gupta, Hemendra K; Garg, Prabhat; Jain, Rajeev; Dubey, Devendra K
2009-07-03
This paper details an on-flow liquid chromatography-ultraviolet-nuclear magnetic resonance (LC-UV-NMR) method for the retrospective detection and identification of alkyl alkylphosphonic acids (AAPAs) and alkylphosphonic acids (APAs), the markers of toxic nerve agents, for verification of the Chemical Weapons Convention (CWC). Initially, the LC-UV-NMR parameters were optimized for benzyl derivatives of the APAs and AAPAs. The optimized parameters include a C18 stationary phase, a methanol:water 78:22 (v/v) mobile phase, UV detection at 268 nm and the (1)H NMR acquisition conditions. The protocol described herein allowed the detection of analytes through acquisition of high-quality NMR spectra from aqueous solutions of the APAs and AAPAs containing high concentrations of interfering background chemicals, which were removed by the preceding sample preparation. Quantification is related to the UV detector, which showed relative standard deviations (RSDs) for quantification within ±1.1%, while the NMR detector showed a lower limit of detection of up to 16 μg (absolute). Finally, the developed LC-UV-NMR method was applied to identify the APAs and AAPAs in real water samples, following solid-phase extraction and derivatization. The method is fast (total experiment time approximately 2 h), sensitive, rugged and efficient.
Galaxy bias from galaxy-galaxy lensing in the DES Science Verification Data
DOE Office of Scientific and Technical Information (OSTI.GOV)
Prat, J.; et al.
We present a measurement of galaxy-galaxy lensing around a magnitude-limited (i_AB < 22.5) sample of galaxies selected from the Dark Energy Survey Science Verification (DES-SV) data. We split these lenses into three photometric-redshift bins from 0.2 to 0.8, and determine the product of the galaxy bias b and cross-correlation coefficient between the galaxy and dark matter overdensity fields r in each bin, using scales above 4 Mpc/h comoving, where we find the linear bias model to be valid given our current uncertainties. We compare our galaxy bias results from galaxy-galaxy lensing with those obtained from galaxy clustering (Crocce et al. 2016) and CMB lensing (Giannantonio et al. 2016) for the same sample of galaxies, and find our measurements to be in good agreement with those in Crocce et al. (2016), while, in the lowest redshift bin (z ~ 0.3), they show some tension with the findings in Giannantonio et al. (2016). Our results are found to be rather insensitive to a large range of systematic effects. We measure b · r to be 0.87 ± 0.11, 1.12 ± 0.16 and 1.24 ± 0.23, respectively, for the three redshift bins of width Δz = 0.2 in the range 0.2 < z < 0.8, defined with the photometric-redshift algorithm BPZ.
The Environmental Response Laboratory Network supports the goal to increase national capacity for biological analysis of environmental samples. This includes methods development and verification, technology transfer, and collaboration with USDA, FERN, CDC.
Cho, Gun-Sang; Kim, Dae-Sung; Yi, Eun-Surk
2015-12-01
The purpose of this study is the verification of a relationship model between the Korean new elderly class's recovery resilience and productive aging. As of 2013, this study sampled preliminary elderly people in Gyeonggi-do and other provinces nationwide. Data from a total of 484 effective subjects were analyzed. The collected data were processed using IBM SPSS 20.0 and AMOS 20.0, and underwent descriptive statistical analysis, confirmatory factor analysis, and structural model verification. The path coefficient associated with model fitness was examined. The standardized path coefficient between recovery resilience and productive aging is β=0.975 (t=14.790), revealing a statistically significant positive effect. Thus, it was found that the proposed basic model of the direct path from recovery resilience to productive aging was a good fit.
Cho, Gun-Sang; Kim, Dae-Sung; Yi, Eun-Surk
2015-01-01
The purpose of this study is the verification of a relationship model between the Korean new elderly class's recovery resilience and productive aging. As of 2013, this study sampled preliminary elderly people in Gyeonggi-do and other provinces nationwide. Data from a total of 484 effective subjects were analyzed. The collected data were processed using IBM SPSS 20.0 and AMOS 20.0, and underwent descriptive statistical analysis, confirmatory factor analysis, and structural model verification. The path coefficient associated with model fitness was examined. The standardized path coefficient between recovery resilience and productive aging is β=0.975 (t=14.790), revealing a statistically significant positive effect. Thus, it was found that the proposed basic model of the direct path from recovery resilience to productive aging was a good fit. PMID:26730383
Testing Dialog-Verification of SIP Phones with Single-Message Denial-of-Service Attacks
NASA Astrophysics Data System (ADS)
Seedorf, Jan; Beckers, Kristian; Huici, Felipe
The Session Initiation Protocol (SIP) is widely used for signaling in multimedia communications. However, many SIP implementations are still in their infancy and vulnerable to malicious messages. We investigate flaws in the SIP implementations of eight phones, showing that the deficient verification of SIP dialogs further aggravates the problem by making it easier for attacks to succeed. Our results show that the majority of the phones we tested are susceptible to these attacks.
The neural substrates of procrastination: A voxel-based morphometry study.
Hu, Yue; Liu, Peiwei; Guo, Yiqun; Feng, Tingyong
2018-03-01
Procrastination is a pervasive phenomenon across different cultures and has serious consequences for performance, subjective well-being, and even public policy. However, little is known about the neural substrates of procrastination. In order to shed light on this question, we investigated the neuroanatomical substrates of procrastination across two independent samples using the voxel-based morphometry (VBM) method. The whole-brain analysis showed that procrastination was positively correlated with the gray matter (GM) volume of clusters in the parahippocampal gyrus (PHG) and the orbital frontal cortex (OFC), while negatively correlated with the GM volume of clusters in the inferior frontal gyrus (IFG) and the middle frontal gyrus (MFG) in sample one (151 participants). We further conducted a verification procedure on another sample (108 participants) using region-of-interest analysis to examine the reliability of these results. Results showed that procrastination can be predicted by the GM volume of the OFC and the MFG. The present findings suggest that the MFG and OFC, which are key regions for self-control and emotion regulation, may play an important role in procrastination. Copyright © 2018 Elsevier Inc. All rights reserved.
Development and Verification of the Charring Ablating Thermal Protection Implicit System Solver
NASA Technical Reports Server (NTRS)
Amar, Adam J.; Calvert, Nathan D.; Kirk, Benjamin S.
2010-01-01
The development and verification of the Charring Ablating Thermal Protection Implicit System Solver is presented. This work concentrates on the derivation and verification of the stationary grid terms in the equations that govern three-dimensional heat and mass transfer for charring thermal protection systems including pyrolysis gas flow through the porous char layer. The governing equations are discretized according to the Galerkin finite element method with first and second order implicit time integrators. The governing equations are fully coupled and are solved in parallel via Newton's method, while the fully implicit linear system is solved with the Generalized Minimal Residual method. Verification results from exact solutions and the Method of Manufactured Solutions are presented to show spatial and temporal orders of accuracy as well as nonlinear convergence rates.
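To make the time-integration and Newton-iteration structure concrete, the sketch below solves a deliberately simplified one-dimensional nonlinear heat equation with backward Euler and Newton's method on a finite-difference grid. It is a stand-in for the fully implicit nonlinear solve described above, not the solver's Galerkin FEM formulation; the conductivity law, boundary values and grid are arbitrary choices for illustration.

```python
import numpy as np

def residual(T, T_old, dt, dx, k0, beta):
    """Backward-Euler residual of dT/dt = d/dx( k(T) dT/dx ), k(T) = k0*(1+beta*T),
    with Dirichlet boundaries T=1 at x=0 and T=0 at x=1."""
    n = T.size
    k = k0 * (1.0 + beta * T)
    R = np.zeros(n)
    R[0], R[-1] = T[0] - 1.0, T[-1] - 0.0
    for i in range(1, n - 1):
        k_e = 0.5 * (k[i] + k[i + 1])        # face conductivities
        k_w = 0.5 * (k[i] + k[i - 1])
        flux = (k_e * (T[i + 1] - T[i]) - k_w * (T[i] - T[i - 1])) / dx**2
        R[i] = (T[i] - T_old[i]) / dt - flux
    return R

def solve_step(T_old, dt, dx, k0=1.0, beta=0.5, tol=1e-10, max_iter=20):
    """One fully implicit time step solved with Newton's method
    (numerical Jacobian, adequate for a small sketch)."""
    T = T_old.copy()
    for _ in range(max_iter):
        R = residual(T, T_old, dt, dx, k0, beta)
        n, eps = T.size, 1e-7
        J = np.zeros((n, n))
        for j in range(n):
            Tp = T.copy()
            Tp[j] += eps
            J[:, j] = (residual(Tp, T_old, dt, dx, k0, beta) - R) / eps
        dT = np.linalg.solve(J, -R)
        T += dT
        if np.linalg.norm(dT) < tol:
            break
    return T

n, dt = 21, 0.01
x = np.linspace(0.0, 1.0, n)
T = np.zeros(n)
for _ in range(10):
    T = solve_step(T, dt, x[1] - x[0])
print("temperature profile after 10 steps:", np.round(T, 3))
```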
Development and Verification of the Charring, Ablating Thermal Protection Implicit System Simulator
NASA Technical Reports Server (NTRS)
Amar, Adam J.; Calvert, Nathan; Kirk, Benjamin S.
2011-01-01
The development and verification of the Charring Ablating Thermal Protection Implicit System Solver (CATPISS) is presented. This work concentrates on the derivation and verification of the stationary grid terms in the equations that govern three-dimensional heat and mass transfer for charring thermal protection systems including pyrolysis gas flow through the porous char layer. The governing equations are discretized according to the Galerkin finite element method (FEM) with first and second order fully implicit time integrators. The governing equations are fully coupled and are solved in parallel via Newton's method, while the linear system is solved via the Generalized Minimum Residual method (GMRES). Verification results from exact solutions and the Method of Manufactured Solutions (MMS) are presented to show spatial and temporal orders of accuracy as well as nonlinear convergence rates.
Fernández-Galán, Esther; Bedini, Josep Lluís; Filella, Xavier
2017-12-01
This study is the first verification of the novel Siemens ADVIA Centaur® Intact Parathyroid Hormone (iPTHm) chemiluminescence immunoassay based on monoclonal antibodies. We also compared the iPTH results obtained using this assay with the previous ADVIA Centaur® Parathyroid Hormone assay (iPTHp) based on polyclonal antibodies. The analytical performance study of the iPTHm assay included LoD, LoQ, intra- and inter-assay reproducibility, and linearity. A comparison study was performed on 369 routine plasma samples. The results were analyzed independently for patients with normal and abnormal GFR, as well as patients on hemodialysis. In addition, clinical concordance between assays was assessed. Finally, we studied the stability of PTH in plasma samples at 4°C. For the iPTHm assay, the LoD and LoQ were 0.03 pmol/L and 0.10 pmol/L, respectively. Intra- and inter-assay CVs were between 2.3% and 6.2%. Linearity was verified in the range from 3.82 to 203.08 pmol/L. Correlation studies showed a good correlation (r=0.99) between iPTHm and iPTHp, with a bias of -2.55% (CI -3.48% to -1.62%) in the range from 0.32 to 117.07 pmol/L. Clinical concordance, assessed by the kappa index, was 0.874. The stability study showed that differences compared to the basal iPTH concentration did not exceed 20% in any of the samples analyzed. The iPTHm assay demonstrated acceptable performance and very good clinical concordance with the iPTHp assay, currently used in our laboratory. Thus, the novel iPTHm assay can replace the previous iPTHp assay, since results provided by both assays are very similar. In our study, the stability of iPTH is not affected by storage for up to 14 days. Copyright © 2017 The Canadian Society of Clinical Chemists. Published by Elsevier Inc. All rights reserved.
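A minimal sketch of the method-comparison arithmetic used in this kind of verification (mean percentage bias between paired results and a kappa index for clinical concordance) is shown below. The paired iPTH values and the 6.9 pmol/L decision limit are assumptions for illustration, not the study's data.

```python
import numpy as np

def percentage_bias(new, reference):
    """Mean relative difference of the new assay versus the reference, in %."""
    new, reference = np.asarray(new, float), np.asarray(reference, float)
    return 100.0 * np.mean((new - reference) / reference)

def cohen_kappa(a, b):
    """Cohen's kappa for two binary classifications (e.g. above/below a
    clinical decision limit)."""
    a, b = np.asarray(a), np.asarray(b)
    po = np.mean(a == b)
    pe = np.mean(a) * np.mean(b) + (1 - np.mean(a)) * (1 - np.mean(b))
    return (po - pe) / (1 - pe)

# Hypothetical paired iPTH results (pmol/L) and an assumed decision limit
rng = np.random.default_rng(7)
iPTHp = rng.lognormal(mean=1.5, sigma=0.6, size=100)
iPTHm = iPTHp * rng.normal(0.975, 0.03, size=100)    # roughly -2.5% bias, as an example

print("bias:  %.2f%%" % percentage_bias(iPTHm, iPTHp))
print("kappa: %.3f" % cohen_kappa(iPTHm > 6.9, iPTHp > 6.9))
```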
NASA Astrophysics Data System (ADS)
Tatli, Hamza; Yucel, Derya; Yilmaz, Sercan; Fayda, Merdan
2018-02-01
The aim of this study is to develop an algorithm for independent MU/treatment time (TT) verification for non-IMRT treatment plans, as part of a QA program to ensure treatment delivery accuracy. Two radiotherapy delivery units and their treatment planning systems (TPS) were commissioned in Liv Hospital Radiation Medicine Center, Tbilisi, Georgia. Beam data were collected according to the vendors' collection guidelines and AAPM report recommendations, and processed with Microsoft Excel during in-house algorithm development. The algorithm is designed and optimized for calculating SSD and SAD treatment plans, based on the AAPM TG-114 dose calculation recommendations, and is coded and embedded in an MS Excel spreadsheet as a preliminary verification algorithm (VA). Treatment verification plans were created by the TPSs based on IAEA TRS 430 recommendations and also calculated by the VA; point measurements were collected with a solid water phantom and compared. The study showed that the in-house VA can be used for MU/TT verification of non-IMRT plans.
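An independent MU check of this kind typically follows the TG-114 pattern of dividing the prescribed point dose by a product of output, scatter and depth-dose factors. The sketch below is a simplified SAD-type example; the factor values, the TPS MU figure and the function name are placeholders, not the in-house VA.

```python
def verify_mu_sad(dose_cgy, dose_rate_ref=1.0, s_c=1.0, s_p=1.0, tpr=1.0,
                  wedge_factor=1.0, tray_factor=1.0, off_axis_ratio=1.0):
    """Simplified point-dose MU check for an SAD (isocentric) field, in the
    spirit of AAPM TG-114:
        MU = D / (D'_ref * Sc * Sp * TPR * WF * TF * OAR)
    where D'_ref is the reference dose rate (cGy/MU) at the calibration point.
    Clinical implementations include additional geometry corrections."""
    return dose_cgy / (dose_rate_ref * s_c * s_p * tpr *
                       wedge_factor * tray_factor * off_axis_ratio)

# Illustrative check of a 200 cGy open field at depth (all factor values assumed)
mu_independent = verify_mu_sad(dose_cgy=200.0, dose_rate_ref=1.0,
                               s_c=0.995, s_p=1.003, tpr=0.738)
mu_tps = 272.5                                   # placeholder TPS value
deviation = 100.0 * (mu_independent - mu_tps) / mu_tps
print(f"independent MU: {mu_independent:.1f}, deviation vs TPS: {deviation:+.1f}%")
```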
Self-verification and contextualized self-views.
Chen, Serena; English, Tammy; Peng, Kaiping
2006-07-01
Whereas most self-verification research has focused on people's desire to verify their global self-conceptions, the present studies examined self-verification with regard to contextualized self-views: views of the self in particular situations and relationships. It was hypothesized that individuals whose core self-conceptions include contextualized self-views should seek to verify these self-views. In Study 1, the more individuals defined the self in dialectical terms, the more their judgments were biased in favor of verifying over nonverifying feedback about a negative, situation-specific self-view. In Study 2, consistent with research on gender differences in the importance of relationships to the self-concept, women but not men showed a similar bias toward feedback about a negative, relationship-specific self-view, a pattern not seen for global self-views. Together, the results support the notion that self-verification occurs for core self-conceptions, whatever form(s) they may take. Individual differences in self-verification and the nature of selfhood and authenticity are discussed.
Comparison of statistical models for writer verification
NASA Astrophysics Data System (ADS)
Srihari, Sargur; Ball, Gregory R.
2009-01-01
A novel statistical model for determining whether a pair of documents, a known and a questioned one, were written by the same individual is proposed. The goal of this formulation is to learn the specific uniqueness of style in a particular author's writing, given the known document. Since there are often insufficient samples to extrapolate a generalized model of a writer's handwriting based solely on that document, we instead generalize over the differences between the author and a large population of known different writers. This is in contrast to an earlier proposed model in which the probability distributions were set a priori, without learning. We show the performance of the model along with a comparison to the older, non-learning model, which demonstrates a significant improvement.
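One simple way to realize the idea of generalizing over differences between the author and a population of other writers is to score a questioned document by where its distance to the known document falls within the distribution of known-writer-to-other-writer distances. The sketch below uses that empirical-percentile stand-in with random feature vectors; it is not the statistical model proposed in the paper.

```python
import numpy as np

def verification_score(known_feats, questioned_feats, population_feats):
    """Score how well a questioned document matches a known writer by comparing
    their feature distance with the distances between the known writer and a
    population of other writers (a simplified stand-in for a learned
    same/different-writer model)."""
    d_q = np.linalg.norm(known_feats - questioned_feats)
    d_pop = np.linalg.norm(population_feats - known_feats, axis=1)
    # fraction of different-writer distances that exceed the questioned distance
    return float(np.mean(d_pop > d_q))

rng = np.random.default_rng(5)
known = rng.normal(0, 1, 32)                      # feature vector of known document
same_writer = known + rng.normal(0, 0.3, 32)      # questioned document, same writer
other_writer = rng.normal(0, 1, 32)               # questioned document, different writer
population = rng.normal(0, 1, (500, 32))          # known different writers

print("score, same writer:  %.2f" % verification_score(known, same_writer, population))
print("score, other writer: %.2f" % verification_score(known, other_writer, population))
```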
40 CFR 86.1845-01 - Manufacturer in-use verification testing requirements.
Code of Federal Regulations, 2010 CFR
2010-07-01
... of test vehicles in the sample comply with the sample size requirements of this section. Any post... vehicles, light-duty trucks, and complete heavy-duty vehicles shall test, or cause to have tested a...) Low mileage testing. [Reserved] (c) High-mileage testing—(1) Test groups. Testing must be conducted...
40 CFR 86.1845-01 - Manufacturer in-use verification testing requirements.
Code of Federal Regulations, 2012 CFR
2012-07-01
... of test vehicles in the sample comply with the sample size requirements of this section. Any post... vehicles, light-duty trucks, and complete heavy-duty vehicles shall test, or cause to have tested a...) Low mileage testing. [Reserved] (c) High-mileage testing—(1) Test groups. Testing must be conducted...
40 CFR 86.1845-01 - Manufacturer in-use verification testing requirements.
Code of Federal Regulations, 2011 CFR
2011-07-01
... of test vehicles in the sample comply with the sample size requirements of this section. Any post... vehicles, light-duty trucks, and complete heavy-duty vehicles shall test, or cause to have tested a...) Low mileage testing. [Reserved] (c) High-mileage testing—(1) Test groups. Testing must be conducted...
40 CFR 86.1845-01 - Manufacturer in-use verification testing requirements.
Code of Federal Regulations, 2013 CFR
2013-07-01
... of test vehicles in the sample comply with the sample size requirements of this section. Any post... vehicles, light-duty trucks, and complete heavy-duty vehicles shall test, or cause to have tested a...) Low mileage testing. [Reserved] (c) High-mileage testing—(1) Test groups. Testing must be conducted...
40 CFR 86.1845-04 - Manufacturer in-use verification testing requirements.
Code of Federal Regulations, 2014 CFR
2014-07-01
... of test vehicles in the sample comply with the sample size requirements of this section. Any post... HDV must test, or cause to have tested, a specified number of vehicles. Such testing must be conducted... first test will be considered the official results for the test vehicle, regardless of any test results...
Demonstration Report for Visual Sample Plan (VSP) Verification Sampling Methods at the Navy/DRI Site
2011-08-01
population of 537,197 with an overall population density of 608 people per square mile (people/mi2). However, the population density in the vicinity of the site is approximately 12 people/mi2. Population density is expected to greatly increase following development of the site.
Weak lensing magnification in the Dark Energy Survey Science Verification Data
Garcia-Fernandez, M.; et al.
2018-02-02
In this paper the effect of weak lensing magnification on galaxy number counts is studied by cross-correlating the positions of two galaxy samples, separated by redshift, using data from the Dark Energy Survey Science Verification dataset. The analysis is carried out for two photometrically-selected galaxy samples, with mean photometric redshifts in the 0.2 < z < 0.4 and 0.7 < z < 1.0 ranges, in the riz bands. A signal is detected with a 3.5σ significance level in each of the bands tested, and is compatible with the magnification predicted by the ΛCDM model. After an extensive analysis, it cannot be attributed to any known systematic effect. The detection of the magnification signal is robust to estimated uncertainties in the outlier rate of the photometric redshifts, but this will be an important issue for use of photometric redshifts in magnification measurements from larger samples. In addition to the detection of the magnification signal, a method to select the sample with the maximum signal-to-noise is proposed and validated with data.
Weak lensing magnification in the Dark Energy Survey Science Verification Data
DOE Office of Scientific and Technical Information (OSTI.GOV)
Garcia-Fernandez, M.; et al.
In this paper the effect of weak lensing magnification on galaxy number counts is studied by cross-correlating the positions of two galaxy samples, separated by redshift, using data from the Dark Energy Survey Science Verification dataset. The analysis is carried out for two photometrically-selected galaxy samples, with mean photometric redshifts in the 0.2 < z < 0.4 and 0.7 < z < 1.0 ranges, in the riz bands. A signal is detected with a 3.5σ significance level in each of the bands tested, and is compatible with the magnification predicted by the ΛCDM model. After an extensive analysis, it cannot be attributed to any known systematic effect. The detection of the magnification signal is robust to estimated uncertainties in the outlier rate of the photometric redshifts, but this will be an important issue for use of photometric redshifts in magnification measurements from larger samples. In addition to the detection of the magnification signal, a method to select the sample with the maximum signal-to-noise is proposed and validated with data.
Weak lensing magnification in the Dark Energy Survey Science Verification Data
DOE Office of Scientific and Technical Information (OSTI.GOV)
Garcia-Fernandez, M.; et al.
2016-11-30
In this paper the effect of weak lensing magnification on galaxy number counts is studied by cross-correlating the positions of two galaxy samples, separated by redshift, using data from the Dark Energy Survey Science Verification dataset. The analysis is carried out for two photometrically-selected galaxy samples, with mean photometric redshifts in the 0.2 < z < 0.4 and 0.7 < z < 1.0 ranges, in the riz bands. A signal is detected with a 3.5σ significance level in each of the bands tested, and is compatible with the magnification predicted by the ΛCDM model. After an extensive analysis, it cannot be attributed to any known systematic effect. The detection of the magnification signal is robust to estimated uncertainties in the outlier rate of the photometric redshifts, but this will be an important issue for use of photometric redshifts in magnification measurements from larger samples. In addition to the detection of the magnification signal, a method to select the sample with the maximum signal-to-noise is proposed and validated with data.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ruedig, Elizabeth
Public Law 105-119 directs the U.S. Department of Energy (DOE) to convey or transfer parcels of land to the Incorporated County of Los Alamos or their designees and to the Department of Interior, Bureau of Indian Affairs, in trust for the Pueblo de San Ildefonso. Los Alamos National Security is tasked to support DOE in conveyance and/or transfer of identified land parcels no later than September 2022. Under DOE Order 458.1, Radiation Protection of the Public and the Environment (O458.1, 2013) and Los Alamos National Laboratory (LANL or the Laboratory) implementing Policy 412 (P412, 2014), real property with the potential to contain residual radioactive material must meet the criteria for clearance and release to the public. This Sampling and Analysis Plan (SAP) is a second investigation of Tract A-18-2 for the purpose of verifying the previous sampling results (LANL 2017). This sample plan requires 18 project-specific soil samples for use in radiological clearance decisions consistent with LANL Procedure ENV-ES-TP-238 (2015a) and guidance in the Multi-Agency Radiation Survey and Site Investigation Manual (MARSSIM, 2000). The sampling work will be conducted by LANL, and samples will be evaluated by a LANL-contracted independent lab. However, there will be federal review (verification) of all steps of the sampling process.
Verification of hypergraph states
NASA Astrophysics Data System (ADS)
Morimae, Tomoyuki; Takeuchi, Yuki; Hayashi, Masahito
2017-12-01
Hypergraph states are generalizations of graph states where controlled-Z gates on edges are replaced with generalized controlled-Z gates on hyperedges. Hypergraph states have several advantages over graph states. For example, certain hypergraph states, such as the Union Jack states, are universal resource states for measurement-based quantum computing with only Pauli measurements, while graph state measurement-based quantum computing needs non-Clifford basis measurements. Furthermore, it is impossible to classically efficiently sample measurement results on hypergraph states unless the polynomial hierarchy collapses to the third level. Although several protocols have been proposed to verify graph states with only sequential single-qubit Pauli measurements, there was no verification method for hypergraph states. In this paper, we propose a method for verifying a certain class of hypergraph states with only sequential single-qubit Pauli measurements. Importantly, no i.i.d. property of samples is assumed in our protocol: any artificial entanglement among samples cannot fool the verifier. As applications of our protocol, we consider verified blind quantum computing with hypergraph states, and quantum computational supremacy demonstrations with hypergraph states.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Darcy, Eric; Keyser, Matthew
The Internal Short Circuit (ISC) device enables critical battery safety verification. With the aluminum interstitial heat sink between the cells, normal trigger cells cannot be driven into thermal runaway without excessive temperature bias of adjacent cells. With an implantable, on-demand ISC device, thermal runaway tests show that the conductive heat sinks protected adjacent cells from propagation. High heat dissipation and structural support of Al heat sinks show high promise for safer, higher performing batteries.
A Lightweight RFID Mutual Authentication Protocol Based on Physical Unclonable Function.
Xu, He; Ding, Jie; Li, Peng; Zhu, Feng; Wang, Ruchuan
2018-03-02
With the fast development of the Internet of Things, Radio Frequency Identification (RFID) has been widely applied in many areas. Nevertheless, while the technology makes everyday life more convenient, its security problems are also gradually being exposed. In particular, the appearance of a large number of fake and counterfeit goods has caused massive losses for both producers and customers, and the cloned tag is a serious security threat. If attackers acquire the complete information of a tag, they can obtain its unique identifier by technological means. In general, because a tag carries no extra identifier, it is difficult to distinguish an original tag from its clone. Once the legal tag data are obtained, attackers are able to clone the tag. Therefore, this paper presents an efficient RFID mutual verification protocol. The protocol is based on the Physical Unclonable Function (PUF) and lightweight cryptography to achieve efficient verification of a single tag. It includes three phases: tag recognition, mutual verification and update. In tag recognition, the reader recognizes the tag; in mutual verification, the reader and tag verify the authenticity of each other; the update phase maintains the latest secret key for the following verification. Analysis results show that this protocol achieves a good balance between performance and security.
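The concrete protocol is specified in the paper; the sketch below is only a generic, single-process toy of the three phases it names (tag recognition, mutual challenge-response verification, key update), with a keyed hash standing in for the PUF and all identifiers invented for illustration.

```python
import hashlib, hmac, os

def puf(device_secret: bytes, challenge: bytes) -> bytes:
    """Software stand-in for a PUF: a keyed hash of the challenge.
    A real PUF derives the response from uncloneable manufacturing variation."""
    return hmac.new(device_secret, challenge, hashlib.sha256).digest()

# Enrolment (done once, in a trusted environment). All names are hypothetical.
tag_secret = os.urandom(32)                      # lives only inside the tag
tag_id = b"TAG-0001"
challenge0 = os.urandom(16)
server_db = {tag_id: {"challenge": challenge0,
                      "response": puf(tag_secret, challenge0),
                      "key": os.urandom(16)}}    # shared lightweight key

def authenticate(requested_id: bytes) -> bool:
    rec = server_db.get(requested_id)
    if rec is None:                                     # phase 1: tag recognition
        return False
    # Phase 2a: reader verifies the tag through its stored PUF response.
    tag_response = puf(tag_secret, rec["challenge"])    # computed on the tag
    if not hmac.compare_digest(tag_response, rec["response"]):
        return False
    # Phase 2b: tag verifies the reader via a MAC under the shared key
    # (trivially equal in this single-process toy, shown for structure only).
    reader_mac = hmac.new(rec["key"], rec["challenge"], hashlib.sha256).digest()
    tag_mac = hmac.new(rec["key"], rec["challenge"], hashlib.sha256).digest()
    if not hmac.compare_digest(reader_mac, tag_mac):
        return False
    # Phase 3: update the challenge/response pair and the shared key.
    new_challenge = os.urandom(16)
    rec.update(challenge=new_challenge,
               response=puf(tag_secret, new_challenge),   # re-read from the tag
               key=hashlib.sha256(rec["key"] + new_challenge).digest()[:16])
    return True

print(authenticate(tag_id))   # True
print(authenticate(b"FAKE"))  # False
```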
Signature Verification Based on Handwritten Text Recognition
NASA Astrophysics Data System (ADS)
Viriri, Serestina; Tapamo, Jules-R.
Signatures continue to be an important biometric trait because they remain widely used for authenticating the identity of human beings. This paper presents an efficient text-based directional signature recognition algorithm that verifies signatures, even when they are composed of special unconstrained cursive characters that are superimposed and embellished. The algorithm extends the character-based signature verification technique. Experiments carried out on the GPDS signature database and an additional database created from signatures captured using the ePadInk tablet show that the approach is effective and efficient, with a positive verification rate of 94.95%.
Van Hoof, Joris J
2017-04-01
Currently, two different age verification systems (AVS) are implemented to enhance compliance with legal age limits for the sale of alcohol in the Netherlands. In this study, we tested the operational procedures and effectiveness of ID readers and remote age verification technology in supermarkets during the sale of alcohol. Following a trained alcohol purchase protocol, eight mystery shoppers (both underage and in the branch's reference age) conducted 132 alcohol purchase attempts in stores that were equipped with ID readers or remote age verification or were part of a control group. In stores equipped with an ID reader, 34% of the purchases were conducted without any mistakes (full compliance). In stores with remote age verification, full compliance was achieved in 87% of the cases. The control group reached 57% compliance, which is in line with the national average. Stores with ID readers perform worse than stores with remote age verification, and also worse than stores without any AVS. For both systems, in addition to effectiveness, public support and user friendliness need to be investigated. This study shows that remote age verification technology is a promising intervention that increases vendor compliance during the sales of age restricted products. © The Author 2016. Published by Oxford University Press on behalf of the European Public Health Association. All rights reserved.
Kuswandi, Bambang; Irmawati, Titi; Hidayat, Moch Amrun; Jayus; Ahmad, Musa
2014-01-27
A simple visual ethanol biosensor based on alcohol oxidase (AOX) immobilised onto polyaniline (PANI) film for halal verification of fermented beverage samples is described. This biosensor responds to ethanol via a colour change from green to blue, due to the enzymatic reaction of ethanol that produces acetaldehyde and hydrogen peroxide, when the latter oxidizes the PANI film. The procedure to obtain this biosensor consists of the immobilization of AOX onto PANI film by adsorption. For the immobilisation, an AOX solution is deposited on the PANI film and left at room temperature until dried (30 min). The biosensor was constructed as a dip stick for visual and simple use. The colour changes of the films have been scanned and analysed using image analysis software (i.e., ImageJ) to study the characteristics of the biosensor's response toward ethanol. The biosensor has a linear response in an ethanol concentration range of 0.01%-0.8%, with a correlation coefficient (r) of 0.996. The limit of detection of the biosensor was 0.001%, with reproducibility (RSD) of 1.6% and a lifetime of up to seven weeks when stored at 4 °C. The biosensor provides accurate results for ethanol determination in fermented drinks and was in good agreement with the standard method (gas chromatography) results. Thus, the biosensor could be used as a simple visual method for ethanol determination in fermented beverage samples, which can be useful to the Muslim community for halal verification.
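A small sketch of the kind of calibration implied by the reported linear range and correlation coefficient: fitting colour-channel intensities against ethanol concentration with numpy. The intensity values below are invented placeholders, not the paper's measurements.

```python
import numpy as np

# Hypothetical colour-change readings (e.g., mean blue-channel intensity from
# ImageJ) at known ethanol concentrations; these are NOT the paper's data.
conc = np.array([0.01, 0.05, 0.1, 0.2, 0.4, 0.6, 0.8])   # % ethanol
intensity = np.array([12.0, 19.5, 28.7, 47.9, 86.5, 124.8, 163.0])

slope, intercept = np.polyfit(conc, intensity, 1)          # linear calibration
r = np.corrcoef(conc, intensity)[0, 1]                     # correlation coefficient
print(f"intensity = {slope:.1f} * conc + {intercept:.1f}, r = {r:.4f}")

# Inverting the calibration estimates the ethanol content of an unknown sample.
unknown_intensity = 70.0
print(f"estimated ethanol: {(unknown_intensity - intercept) / slope:.3f} %")
```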
SU-E-T-762: Toward Volume-Based Independent Dose Verification as Secondary Check
DOE Office of Scientific and Technical Information (OSTI.GOV)
Tachibana, H; Tachibana, R
2015-06-15
Purpose: Lung SBRT planning has shifted to the volume prescription technique. However, point dose agreement is still verified using independent dose verification at the secondary check. Volume dose verification is more affected by the inhomogeneity correction than the point dose verification currently used as the check. A feasibility study of volume dose verification was conducted for lung SBRT plans. Methods: Six SBRT plans were collected in our institute. Two dose distributions, with and without inhomogeneity correction, were generated using Adaptive Convolve (AC) in Pinnacle3. Simple MU Analysis (SMU, Triangle Product, Ishikawa, JP) was used as the independent dose verification software program, in which a modified Clarkson-based algorithm was implemented and the radiological path length was computed from CT images independently of the treatment planning system. The agreement in point dose and mean dose between the AC with/without the correction and the SMU was assessed. Results: In the point dose evaluation for the center of the GTV, the difference shows a systematic shift (4.5% ± 1.9%) in comparison with the AC with the inhomogeneity correction; on the other hand, there was good agreement of 0.2 ± 0.9% between the SMU and the AC without the correction. In the volume evaluation, there were significant differences in mean dose not only for the PTV (14.2 ± 5.1%) but also for the GTV (8.0 ± 5.1%) compared to the AC with the correction. Without the correction, the SMU showed good agreement for the GTV (1.5 ± 0.9%) as well as the PTV (0.9% ± 1.0%). Conclusion: Volume evaluation for the secondary check may be possible in homogeneous regions. However, volumes including inhomogeneous media would show larger discrepancies. The dose calculation algorithm for independent verification needs to be modified to take the inhomogeneity correction into account.
Character Recognition Method by Time-Frequency Analyses Using Writing Pressure
NASA Astrophysics Data System (ADS)
Watanabe, Tatsuhito; Katsura, Seiichiro
With the development of information and communication technology, personal verification becomes more and more important. In the future ubiquitous society, the development of terminals handling personal information requires personal verification technology. The signature is one such verification method; however, because a signature contains only a limited number of characters, it is easily forged, and personal identification from handwriting alone is difficult. This paper proposes a “haptic pen” that extracts the writing pressure, and shows a character recognition method based on time-frequency analyses. Although the shapes of characters written by different writers are similar, the differences appear in the time-frequency domain. As a result, it is possible to use the proposed character recognition for more exact personal identification. The experimental results showed the viability of the proposed method.
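To make the time-frequency idea concrete, the sketch below computes a spectrogram of a synthetic writing-pressure trace with scipy (the signal is invented, not haptic-pen data); writer-specific differences would show up as different energy distributions across these frequency bands.

```python
import numpy as np
from scipy.signal import spectrogram

fs = 1000                                   # assumed pressure sampling rate (Hz)
t = np.arange(0, 2.0, 1 / fs)

# Synthetic writing-pressure trace: a slow stroke rhythm plus a faster tremor
# component whose strength is what a time-frequency analysis could pick up.
pressure = (1.5 + np.sin(2 * np.pi * 2.0 * t)             # stroke rhythm ~2 Hz
            + 0.3 * np.sin(2 * np.pi * 12.0 * t)          # writer-specific tremor
            + 0.05 * np.random.randn(t.size))             # sensor noise

f, seg_t, Sxx = spectrogram(pressure, fs=fs, nperseg=256, noverlap=128)

# A simple feature vector: average power in a few frequency bands.
bands = [(0, 5), (5, 20), (20, 50)]
features = [Sxx[(f >= lo) & (f < hi)].mean() for lo, hi in bands]
print("band powers:", np.round(features, 4))
```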
A multispectral photon-counting double random phase encoding scheme for image authentication.
Yi, Faliu; Moon, Inkyu; Lee, Yeon H
2014-05-20
In this paper, we propose a new method for color image-based authentication that combines multispectral photon-counting imaging (MPCI) and double random phase encoding (DRPE) schemes. The sparsely distributed information from MPCI and the stationary white noise signal from DRPE make intruder attacks difficult. In this authentication method, the original multispectral RGB color image is down-sampled into a Bayer image. The three types of color samples (red, green and blue color) in the Bayer image are encrypted with DRPE and the amplitude part of the resulting image is photon counted. The corresponding phase information that has nonzero amplitude after photon counting is then kept for decryption. Experimental results show that the retrieved images from the proposed method do not visually resemble their original counterparts. Nevertheless, the original color image can be efficiently verified with statistical nonlinear correlations. Our experimental results also show that different interpolation algorithms applied to Bayer images result in different verification effects for multispectral RGB color images.
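A compact numpy sketch of the two core operations named above, applied to a toy single-channel image rather than the paper's Bayer pipeline: double random phase encoding of the image and photon-count limiting of the encrypted amplitude, with the phase retained where photons are counted.

```python
import numpy as np

rng = np.random.default_rng(0)
img = rng.random((64, 64))                       # toy single-channel image

# Double random phase encoding: random phase in the input plane, FFT,
# random phase in the Fourier plane, inverse FFT.
phase1 = np.exp(2j * np.pi * rng.random(img.shape))
phase2 = np.exp(2j * np.pi * rng.random(img.shape))
encrypted = np.fft.ifft2(np.fft.fft2(img * phase1) * phase2)

# Photon counting: distribute a limited number of photons according to the
# normalized intensity of the encrypted field; keep the phase where counts > 0.
prob = np.abs(encrypted) ** 2
prob /= prob.sum()
counts = rng.poisson(2000 * prob)
sparse_field = np.where(counts > 0, np.exp(1j * np.angle(encrypted)), 0.0)

# Decryption from the sparse, phase-only data with the correct keys.
decrypted = np.fft.ifft2(np.fft.fft2(sparse_field) * np.conj(phase2)) * np.conj(phase1)

# The result does not visually resemble the original, but it correlates with it
# far better than a decryption attempted with wrong phase keys would.
score = np.abs(np.vdot(img, decrypted)) / (np.linalg.norm(img) * np.linalg.norm(decrypted))
print(f"normalized correlation with the original: {score:.3f}")
```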
Efficient logistic regression designs under an imperfect population identifier.
Albert, Paul S; Liu, Aiyi; Nansel, Tonja
2014-03-01
Motivated by actual study designs, this article considers efficient logistic regression designs where the population is identified with a binary test that is subject to diagnostic error. We consider the case where the imperfect test is obtained on all participants, while the gold standard test is measured on a small chosen subsample. Under maximum-likelihood estimation, we evaluate the optimal design in terms of sample selection as well as verification. We show that there may be substantial efficiency gains by choosing a small percentage of individuals who test negative on the imperfect test for inclusion in the sample (e.g., verifying 90% test-positive cases). We also show that a two-stage design may be a good practical alternative to a fixed design in some situations. Under optimal and nearly optimal designs, we compare maximum-likelihood and semi-parametric efficient estimators under correct and misspecified models with simulations. The methodology is illustrated with an analysis from a diabetes behavioral intervention trial. © 2013, The International Biometric Society.
Verification and validation of RADMODL Version 1.0
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kimball, K.D.
1993-03-01
RADMODL is a system of linked computer codes designed to calculate the radiation environment following an accident in which nuclear materials are released. The RADMODL code and the corresponding Verification and Validation (V&V) calculations (Appendix A) were developed for Westinghouse Savannah River Company (WSRC) by EGS Corporation (EGS). Each module of RADMODL is an independent code and was verified separately. The full system was validated by comparing the output of the various modules with the corresponding output of a previously verified version of the modules. The results of the verification and validation tests show that RADMODL correctly calculates the transport of radionuclides and radiation doses. As a result of this verification and validation effort, RADMODL Version 1.0 is certified for use in calculating the radiation environment following an accident.
Design of verification platform for wireless vision sensor networks
NASA Astrophysics Data System (ADS)
Ye, Juanjuan; Shang, Fei; Yu, Chuang
2017-08-01
At present, the majority of research on wireless vision sensor networks (WVSNs) still remains at the software simulation stage, and very few verification platforms for WVSNs are available for use. This situation seriously restricts the transformation of WVSNs from theoretical research to practical application. Therefore, it is necessary to study the construction of a verification platform for WVSNs. This paper combines a wireless transceiver module, a visual information acquisition module and a power acquisition module, designs a high-performance wireless vision sensor node whose core is an ARM11 microprocessor, and selects AODV as the routing protocol to set up a verification platform called AdvanWorks for WVSNs. Experiments show that AdvanWorks can successfully achieve the functions of image acquisition, coding and wireless transmission, and can obtain the effective distance parameters between nodes, which lays a good foundation for the follow-up application of WVSNs.
NASA Technical Reports Server (NTRS)
Melendez, Orlando; Trizzino, Mary; Fedderson, Bryan
1997-01-01
The National Aeronautics and Space Administration (NASA), Kennedy Space Center (KSC) Materials Science Division conducted a study to evaluate alternative solvents for CFC-113 in precision cleaning and verification on typical samples that are used in the KSC environment. The effects of AK-225(R), Vertrel(R), MCA, and HFE A 7100 on selected metal and polymer materials were studied over 1, 7 and 30 day test times. This report addresses a study on the compatibility aspects of replacement solvents for materials in aerospace applications.
INNOVATIVE TECHNOLOGY VERIFICATION REPORT XRF ...
The Rigaku ZSX Mini II (ZSX Mini II) XRF Services x-ray fluorescence (XRF) analyzer was demonstrated under the U.S. Environmental Protection Agency (EPA) Superfund Innovative Technology Evaluation (SITE) Program. The field portion of the demonstration was conducted in January 2005 at the Kennedy Athletic, Recreational and Social Park (KARS) at Kennedy Space Center on Merritt Island, Florida. The demonstration was designed to collect reliable performance and cost data for the ZSX Mini II analyzer and seven other commercially available XRF instruments for measuring trace elements in soil and sediment. The performance and cost data were evaluated to document the relative performance of each XRF instrument. This innovative technology verification report describes the objectives and the results of that evaluation and serves to verify the performance and cost of the ZSX Mini II analyzer. Separate reports have been prepared for the other XRF instruments that were evaluated as part of the demonstration. The objectives of the evaluation included determining each XRF instrument’s accuracy, precision, sample throughput, and tendency for matrix effects. To fulfill these objectives, the field demonstration incorporated the analysis of 326 prepared samples of soil and sediment that contained 13 target elements. The prepared samples included blends of environmental samples from nine different sample collection sites as well as spiked samples with certified element con
INNOVATIVE TECHNOLOGY VERIFICATION REPORT XRF ...
The Rontec PicoTAX x-ray fluorescence (XRF) analyzer was demonstrated under the U.S. Environmental Protection Agency (EPA) Superfund Innovative Technology Evaluation (SITE) Program. The field portion of the demonstration was conducted in January 2005 at the Kennedy Athletic, Recreational and Social Park (KARS) at Kennedy Space Center on Merritt Island, Florida. The demonstration was designed to collect reliable performance and cost data for the PicoTAX analyzer and seven other commercially available XRF instruments for measuring trace elements in soil and sediment. The performance and cost data were evaluated to document the relative performance of each XRF instrument. This innovative technology verification report describes the objectives and the results of that evaluation and serves to verify the performance and cost of the PicoTAX analyzer. Separate reports have been prepared for the other XRF instruments that were evaluated as part of the demonstration. The objectives of the evaluation included determining each XRF instrument’s accuracy, precision, sample throughput, and tendency for matrix effects. To fulfill these objectives, the field demonstration incorporated the analysis of 326 prepared samples of soil and sediment that contained 13 target elements. The prepared samples included blends of environmental samples from nine different sample collection sites as well as spiked samples with certified element concentrations. Accuracy was assessed by c
INNOVATIVE TECHNOLOGY VERIFICATION REPORT XRF ...
The Niton XLt 700 Series (XLt) XRF Services x-ray fluorescence (XRF) analyzer was demonstrated under the U.S. Environmental Protection Agency (EPA) Superfund Innovative Technology Evaluation (SITE) Program. The field portion of the demonstration was conducted in January 2005 at the Kennedy Athletic, Recreational and Social Park (KARS) at Kennedy Space Center on Merritt Island, Florida. The demonstration was designed to collect reliable performance and cost data for the XLt analyzer and seven other commercially available XRF instruments for measuring trace elements in soil and sediment. The performance and cost data were evaluated to document the relative performance of each XRF instrument. This innovative technology verification report describes the objectives and the results of that evaluation and serves to verify the performance and cost of the XLt analyzer. Separate reports have been prepared for the other XRF instruments that were evaluated as part of the demonstration. The objectives of the evaluation included determining each XRF instrument’s accuracy, precision, sample throughput, and tendency for matrix effects. To fulfill these objectives, the field demonstration incorporated the analysis of 326 prepared samples of soil and sediment that contained 13 target elements. The prepared samples included blends of environmental samples from nine different sample collection sites as well as spiked samples with certified element concentrations. Accuracy
INNOVATIVE TECHNOLOGY VERIFICATION REPORT XRF ...
The Oxford ED2000 x-ray fluorescence (XRF) analyzer was demonstrated under the U.S. Environmental Protection Agency (EPA) Superfund Innovative Technology Evaluation (SITE) Program. The field portion of the demonstration was conducted in January 2005 at the Kennedy Athletic, Recreational and Social Park (KARS) at Kennedy Space Center on Merritt Island, Florida. The demonstration was designed to collect reliable performance and cost data for the ED2000 analyzer and seven other commercially available XRF instruments for measuring trace elements in soil and sediment. The performance and cost data were evaluated to document the relative performance of each XRF instrument. This innovative technology verification report describes the objectives and the results of that evaluation and serves to verify the performance and cost of the ED2000 analyzer. Separate reports have been prepared for the other XRF instruments that were evaluated as part of the demonstration. The objectives of the evaluation included determining each XRF instrument’s accuracy, precision, sample throughput, and tendency for matrix effects. To fulfill these objectives, the field demonstration incorporated the analysis of 326 prepared samples of soil and sediment that contained 13 target elements. The prepared samples included blends of environmental samples from nine different sample collection sites as well as spiked samples with certified element concentrations. Accuracy was assessed by com
INNOVATIVE TECHNOLOGY VERIFICATION REPORT XRF ...
The Innov-X XT400 Series (XT400) x-ray fluorescence (XRF) analyzer was demonstrated under the U.S. Environmental Protection Agency (EPA) Superfund Innovative Technology Evaluation (SITE) Program. The field portion of the demonstration was conducted in January 2005 at the Kennedy Athletic, Recreational and Social Park (KARS) at Kennedy Space Center on Merritt Island, Florida. The demonstration was designed to collect reliable performance and cost data for the XT400 analyzer and seven other commercially available XRF instruments for measuring trace elements in soil and sediment. The performance and cost data were evaluated to document the relative performance of each XRF instrument. This innovative technology verification report describes the objectives and the results of that evaluation and serves to verify the performance and cost of the XT400 analyzer. Separate reports have been prepared for the other XRF instruments that were evaluated as part of the demonstration. The objectives of the evaluation included determining each XRF instrument’s accuracy, precision, sample throughput, and tendency for matrix effects. To fulfill these objectives, the field demonstration incorporated the analysis of 326 prepared samples of soil and sediment that contained 13 target elements. The prepared samples included blends of environmental samples from nine different sample collection sites as well as spiked samples with certified element concentrations. Accuracy was as
Regression Verification Using Impact Summaries
NASA Technical Reports Server (NTRS)
Backes, John; Person, Suzette J.; Rungta, Neha; Thachuk, Oksana
2013-01-01
Regression verification techniques are used to prove equivalence of syntactically similar programs. Checking equivalence of large programs, however, can be computationally expensive. Existing regression verification techniques rely on abstraction and decomposition techniques to reduce the computational effort of checking equivalence of the entire program. These techniques are sound but not complete. In this work, we propose a novel approach to improve scalability of regression verification by classifying the program behaviors generated during symbolic execution as either impacted or unimpacted. Our technique uses a combination of static analysis and symbolic execution to generate summaries of impacted program behaviors. The impact summaries are then checked for equivalence using an off-the-shelf decision procedure. We prove that our approach is both sound and complete for sequential programs, with respect to the depth bound of symbolic execution. Our evaluation on a set of sequential C artifacts shows that reducing the size of the summaries can help reduce the cost of software equivalence checking. Various reduction, abstraction, and compositional techniques have been developed to help scale software verification techniques to industrial-sized systems. Although such techniques have greatly increased the size and complexity of systems that can be checked, analysis of large software systems remains costly. Regression analysis techniques, e.g., regression testing [16], regression model checking [22], and regression verification [19], restrict the scope of the analysis by leveraging the differences between program versions. These techniques are based on the idea that if code is checked early in development, then subsequent versions can be checked against a prior (checked) version, leveraging the results of the previous analysis to reduce the analysis cost of the current version. Regression verification addresses the problem of proving equivalence of closely related program versions [19]. These techniques compare two programs with a large degree of syntactic similarity to prove that portions of one program version are equivalent to the other. Regression verification can be used for guaranteeing backward compatibility, and for showing behavioral equivalence in programs with syntactic differences, e.g., when a program is refactored to improve its performance, maintainability, or readability. Existing regression verification techniques leverage similarities between program versions by using abstraction and decomposition techniques to improve scalability of the analysis [10, 12, 19]. The abstractions and decomposition in these techniques, e.g., summaries of unchanged code [12] or semantically equivalent methods [19], compute an over-approximation of the program behaviors. The equivalence checking results of these techniques are sound but not complete: they may characterize programs as not functionally equivalent when, in fact, they are equivalent. In this work we describe a novel approach that leverages the impact of the differences between two programs for scaling regression verification. We partition program behaviors of each version into (a) behaviors impacted by the changes and (b) behaviors not impacted (unimpacted) by the changes. Only the impacted program behaviors are used during equivalence checking. We then prove that checking equivalence of the impacted program behaviors is equivalent to checking equivalence of all program behaviors for a given depth bound.
In this work we use symbolic execution to generate the program behaviors and leverage control- and data-dependence information to facilitate the partitioning of program behaviors. The impacted program behaviors are termed impact summaries. The dependence analyses that facilitate the generation of the impact summaries, we believe, could be used in conjunction with other abstraction- and decomposition-based approaches [10, 12] as a complementary reduction technique. An evaluation of our regression verification technique shows that our approach is capable of leveraging similarities between program versions to reduce the size of the queries and the time required to check for logical equivalence. The main contributions of this work are: (1) a regression verification technique to generate impact summaries that can be checked for functional equivalence using an off-the-shelf decision procedure; (2) a proof that our approach is sound and complete with respect to the depth bound of symbolic execution; (3) an implementation of our technique using the LLVM compiler infrastructure, the klee Symbolic Virtual Machine [4], and a variety of Satisfiability Modulo Theory (SMT) solvers, e.g., STP [7] and Z3 [6]; and (4) an empirical evaluation on a set of C artifacts which shows that the use of impact summaries can reduce the cost of regression verification.
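As a minimal, self-contained illustration of the kind of equivalence query that is ultimately handed to an SMT solver (the paper symbolically executes C programs; the toy below simply asks Z3, one of the solvers named above, whether any input distinguishes two arithmetic versions of the same function):

```python
from z3 import BitVec, Solver, sat   # pip install z3-solver

x = BitVec("x", 32)

# Two "program versions": the refactored one replaces a multiplication by 2
# with a left shift. They are behaviorally equivalent on 32-bit integers.
def version_a(v):
    return v * 2 + 1

def version_b(v):
    return (v << 1) + 1

s = Solver()
s.add(version_a(x) != version_b(x))   # ask for a distinguishing input

if s.check() == sat:
    print("not equivalent, counterexample:", s.model())
else:
    print("equivalent: no input distinguishes the two versions")
```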
The purpose of this SOP is to ensure suitable temperature maintenance of freezers used for storage of samples. This procedure was followed to ensure consistent data retrieval during the Arizona NHEXAS project and the "Border" study. Keywords: freezers; operation.
The National H...
The purpose of this SOP is to assure suitable temperature maintenance in refrigerators and freezers used for sample storage during the Arizona NHEXAS project and the "Border" study. Keywords: lab; equipment; refrigerators and freezers.
The National Human Exposure Assessment Su...
Calibration and verification of thermographic cameras for geometric measurements
NASA Astrophysics Data System (ADS)
Lagüela, S.; González-Jorge, H.; Armesto, J.; Arias, P.
2011-03-01
Infrared thermography is a technique with an increasing degree of development and applications. Quality assessment of the measurements performed with thermal cameras should be achieved through metrological calibration and verification. Infrared cameras acquire temperature and geometric information, although calibration and verification procedures are usual only for thermal data. Black bodies are used for these purposes. Moreover, the geometric information is important for many fields such as architecture, civil engineering and industry. This work presents a calibration procedure that allows the photogrammetric restitution and a portable artefact to verify the geometric accuracy, repeatability and drift of thermographic cameras. These results allow the incorporation of this information into the quality control processes of the companies. A grid based on burning lamps is used for the geometric calibration of thermographic cameras. The artefact designed for the geometric verification consists of five delrin spheres and seven cubes of different sizes. Metrological traceability for the artefact is obtained from a coordinate measuring machine. Two sets of targets with different reflectivity are fixed to the spheres and cubes to make data processing and photogrammetric restitution possible. Reflectivity was chosen as the material property because both the thermographic and visible cameras are able to detect it. Two thermographic cameras from Flir and Nec manufacturers, and one visible camera from Jai, are calibrated, verified and compared using calibration grids and the standard artefact. The calibration system based on burning lamps shows its capability to perform the internal orientation of the thermal cameras. Verification results show repeatability better than 1 mm in all cases, and better than 0.5 mm for the visible camera. As expected, accuracy is also higher for the visible camera, and the geometric comparison between thermographic cameras shows slightly better results for the Nec camera.
DOE Office of Scientific and Technical Information (OSTI.GOV)
ADAMS, WADE C
At the Pennsylvania Department of Environmental Protection's request, ORAU's IEAV program conducted verification surveys on the excavated surfaces of Section 3, SUs 1, 4, and 5 at the Whittaker site on March 13 and 14, 2013. The survey activities included visual inspections, gamma radiation surface scans, gamma activity measurements, and soil sampling activities. Verification activities also included the review and assessment of the licensee's project documentation and methodologies. Surface scans identified four areas of elevated direct gamma radiation distinguishable from background: one area within SUs 1 and 4 and two areas within SU5. One area within SU5 was remediated by removing a golf-ball-size piece of slag while ORAU staff was onsite. With the exception of the golf-ball-size piece of slag within SU5, a review of the ESL Section 3 EXS data packages for SUs 1, 4, and 5 indicated that these locations of elevated gamma radiation were also identified by the ESL gamma scans and that ESL personnel performed additional investigations and soil sampling within these areas. The investigative results indicated that the areas met the release criteria.
Li, Chao; Zhang, Yan-po; Guo, Wei-dong; Zhu, Yue; Xu, Jing; Deng, Xun
2010-09-01
Fluorescence excitation-emission matrix (EEM) and absorption spectroscopy were applied to study the optical properties of 29 CDOM samples collected from different ballast tanks of nine international-route vessels anchored in Xiamen Port between October 2007 and April 2008. The purpose was to examine the feasibility of using these spectral properties as a tracer to verify whether these vessels followed the mid-ocean ballast water exchange (BWE) regulation. Using parallel factor analysis, four fluorescent components were identified, including two humic-like components (C1: 245, 300/386 nm; C2: 250, 345/458 nm) and two protein-like components (C3: 220, 275/306 nm; C4: 235, 290/345 nm), of which the C2 component was the most suitable fluorescence verification indicator. The vertical distribution of all fluorescent components in the ballast tanks was nearly uniform, indicating that profile-mixing sampling was preferable. Combined use of the C2 component, the spectral slope ratio (SR) of the absorption spectra and salinity may provide reasonable verification of whether BWE was carried out by these nine ships. The results suggested that the combined use of multiple parameters (fluorescence, absorption and salinity) would be much more reliable for determining the origin of ballast water, and would provide a technical guarantee for fast examination of ballast water exchange in Chinese ports.
Hou, Guixue; Lou, Xiaomin; Sun, Yulin; Xu, Shaohang; Zi, Jin; Wang, Quanhui; Zhou, Baojin; Han, Bo; Wu, Lin; Zhao, Xiaohang; Lin, Liang; Liu, Siqi
2015-09-04
We propose an efficient integration of SWATH with MRM for biomarker discovery and verification when the corresponding ion library is well established. We strictly controlled the false positive rate associated with SWATH MS signals and carefully selected the target peptides coupled with SWATH and MRM. We collected 10 samples of esophageal squamous cell carcinoma (ESCC) tissues paired with tumors and adjacent regions and quantified 1758 unique proteins with FDR 1% at protein level using SWATH, in which 467 proteins were abundance-dependent with ESCC. After carefully evaluating the SWATH MS signals of the up-regulated proteins, we selected 120 proteins for MRM verification. MRM analysis of the pooled and individual esophageal tissues resulted in 116 proteins that exhibited similar abundance response modes to ESCC that were acquired with SWATH. Because the ESCC-related proteins consisted of a high percentile of secreted proteins, we conducted the MRM assay on patient sera that were collected from pre- and postoperation. Of the 116 target proteins, 42 were identified in the ESCC sera, including 11 with lowered abundances postoperation. Coupling SWATH and MRM is thus feasible and efficient for the discovery and verification of cancer-related protein biomarkers.
Enrichment Assay Methods Development for the Integrated Cylinder Verification System
DOE Office of Scientific and Technical Information (OSTI.GOV)
Smith, Leon E.; Misner, Alex C.; Hatchell, Brian K.
2009-10-22
International Atomic Energy Agency (IAEA) inspectors currently perform periodic inspections at uranium enrichment plants to verify UF6 cylinder enrichment declarations. Measurements are typically performed with handheld high-resolution sensors on a sampling of cylinders taken to be representative of the facility's entire product-cylinder inventory. Pacific Northwest National Laboratory (PNNL) is developing a concept to automate the verification of enrichment plant cylinders to enable 100 percent product-cylinder verification and potentially, mass-balance calculations on the facility as a whole (by also measuring feed and tails cylinders). The Integrated Cylinder Verification System (ICVS) could be located at key measurement points to positively identify each cylinder, measure its mass and enrichment, store the collected data in a secure database, and maintain continuity of knowledge on measured cylinders until IAEA inspector arrival. The three main objectives of this FY09 project are summarized here and described in more detail in the report: (1) Develop a preliminary design for a prototype NDA system, (2) Refine PNNL's MCNP models of the NDA system, and (3) Procure and test key pulse-processing components. Progress against these tasks to date, and next steps, are discussed.
Development of CFC-Free Cleaning Processes at the NASA White Sands Test Facility
NASA Technical Reports Server (NTRS)
Beeson, Harold; Kirsch, Mike; Hornung, Steven; Biesinger, Paul
1995-01-01
The NASA White Sands Test Facility (WSTF) is developing cleaning and verification processes to replace currently used chlorofluorocarbon-113- (CFC-113-) based processes. The processes being evaluated include both aqueous- and solvent-based techniques. The presentation will include the findings of investigations of aqueous cleaning and verification processes that are based on a draft of a proposed NASA Kennedy Space Center (KSC) cleaning procedure. Verification testing with known contaminants, such as hydraulic fluid and commonly used oils, established correlations between nonvolatile residue and CFC-113. Recoveries ranged from 35 to 60 percent of theoretical. WSTF is also investigating enhancements to aqueous sampling for organics and particulates. Although aqueous alternatives have been identified for several processes, a need still exists for nonaqueous solvent cleaning, such as the cleaning and cleanliness verification of gauges used for oxygen service. The cleaning effectiveness of tetrachloroethylene (PCE), trichloroethylene (TCE), ethanol, hydrochlorofluorocarbon-225 (HCFC-225), tert-butylmethylether, and n-Hexane was evaluated using aerospace gauges and precision instruments and then compared to the cleaning effectiveness of CFC-113. Solvents considered for use in oxygen systems were also tested for oxygen compatibility using high-pressure oxygen autoignition and liquid oxygen mechanical impact testing.
Experimental verification and simulation of negative index of refraction using Snell's law.
Parazzoli, C G; Greegor, R B; Li, K; Koltenbah, B E C; Tanielian, M
2003-03-14
We report the results of a Snell's law experiment on a negative index of refraction material in free space from 12.6 to 13.2 GHz. Numerical simulations using Maxwell's equations solvers show good agreement with the experimental results, confirming the existence of negative index of refraction materials. The index of refraction is a function of frequency. At 12.6 GHz we measure and compute the real part of the index of refraction to be -1.05. The measurements and simulations of the electromagnetic field profiles were performed at distances of 14λ and 28λ from the sample; the fields were also computed at 100λ.
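A one-line application of Snell's law with the measured index, as a sanity check of what a negative index means geometrically; the -1.05 value is taken from the abstract, while the 30 degree incidence angle is an arbitrary example.

```python
import numpy as np

n_air, n_sample = 1.0, -1.05          # real part of the index at 12.6 GHz
theta_i = np.radians(30.0)            # example incidence angle

# Snell's law: n_air * sin(theta_i) = n_sample * sin(theta_t)
theta_t = np.degrees(np.arcsin(n_air * np.sin(theta_i) / n_sample))
print(f"refraction angle: {theta_t:.1f} deg")  # negative: refracted to the same side of the normal
```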
In situ aircraft verification of the quality of satellite cloud winds over oceanic regions
NASA Technical Reports Server (NTRS)
Hasler, A. F.; Skillman, W. C.
1979-01-01
A five year aircraft experiment to verify the quality of satellite cloud winds over oceans using in situ aircraft inertial navigation system wind measurements is presented. The final results show that satellite measured cumulus cloud motions are very good estimators of the cloud base wind for trade wind and subtropical high regions. The average magnitude of the vector differences between the cloud motion and the cloud base wind is given. For cumulus clouds near frontal regions, the cloud motion agreed best with the mean cloud layer wind. For a very limited sample, cirrus cloud motions also most closely followed the mean wind in the cloud layer.
Evaluation of the 29-km Eta Model. Part 1; Objective Verification at Three Selected Stations
NASA Technical Reports Server (NTRS)
Nutter, Paul A.; Manobianco, John; Merceret, Francis J. (Technical Monitor)
1998-01-01
This paper describes an objective verification of the National Centers for Environmental Prediction (NCEP) 29-km eta model from May 1996 through January 1998. The evaluation was designed to assess the model's surface and upper-air point forecast accuracy at three selected locations during separate warm (May - August) and cool (October - January) season periods. In order to enhance sample sizes available for statistical calculations, the objective verification includes two consecutive warm and cool season periods. Systematic model deficiencies comprise the larger portion of the total error in most of the surface forecast variables that were evaluated. The error characteristics for both surface and upper-air forecasts vary widely by parameter, season, and station location. At upper levels, a few characteristic biases are identified. Overall however, the upper-level errors are more nonsystematic in nature and could be explained partly by observational measurement uncertainty. With a few exceptions, the upper-air results also indicate that 24-h model error growth is not statistically significant. In February and August 1997, NCEP implemented upgrades to the eta model's physical parameterizations that were designed to change some of the model's error characteristics near the surface. The results shown in this paper indicate that these upgrades led to identifiable and statistically significant changes in forecast accuracy for selected surface parameters. While some of the changes were expected, others were not consistent with the intent of the model updates and further emphasize the need for ongoing sensitivity studies and localized statistical verification efforts. Objective verification of point forecasts is a stringent measure of model performance, but when used alone, is not enough to quantify the overall value that model guidance may add to the forecast process. Therefore, results from a subjective verification of the meso-eta model over the Florida peninsula are discussed in the companion paper by Manobianco and Nutter. Overall verification results presented here and in part two should establish a reasonable benchmark from which model users and developers may pursue the ongoing eta model verification strategies in the future.
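The systematic (bias) and nonsystematic components of point-forecast error discussed above separate with the usual decomposition; a short sketch with invented forecast/observation pairs:

```python
import numpy as np

# Invented forecast/observation pairs for one surface variable at one station.
forecast = np.array([21.3, 19.8, 25.1, 23.0, 18.4, 22.7])
observed = np.array([20.1, 20.5, 23.8, 22.9, 19.9, 21.0])

error = forecast - observed
bias = error.mean()                                   # systematic component
rmse = np.sqrt((error ** 2).mean())                   # total error
nonsystematic = np.sqrt(max(rmse ** 2 - bias ** 2, 0.0))   # error about the bias

print(f"bias = {bias:+.2f}, RMSE = {rmse:.2f}, nonsystematic = {nonsystematic:.2f}")
```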
Fabrication of a printed capacitive air-gap touch sensor
NASA Astrophysics Data System (ADS)
Lee, Sang Hoon; Seo, Hwiwon; Lee, Sangyoon
2018-05-01
Unlike lithography-based processes, printed electronics does not require etching, which makes it difficult to fabricate electronic devices with an air gap. In this study, we propose a method to fabricate capacitive air-gap touch sensors via printing and coating. First, the bottom electrode was fabricated on a flexible poly(ethylene terephthalate) (PET) substrate using roll-to-roll gravure printing with silver ink. Then poly(dimethylsiloxane) (PDMS) was spin coated to form a sacrificial layer. The top electrode was fabricated on the sacrificial layer by spin coating with a stretchable silver ink. The sensor samples were then put in a tetrabutylammonium fluoride (TBAF) bath to generate the air gap by removing the sacrificial layer. The capacitance of the samples was measured for verification, and the results show that the capacitance increases in proportion to the applied force from 0 to 2.5 N.
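The reported behaviour, capacitance rising with applied force, follows from the parallel-plate relation C = ε0·εr·A/d as the air gap d shrinks under load; a tiny numeric illustration with assumed electrode area and gap values (not the paper's geometry):

```python
EPS0 = 8.854e-12                  # vacuum permittivity, F/m
area = (5e-3) ** 2                # assumed 5 mm x 5 mm electrode overlap
gap0 = 50e-6                      # assumed 50 um air gap at rest

for compression in [0.0, 0.2, 0.4, 0.6]:       # fraction of the gap closed by force
    d = gap0 * (1.0 - compression)
    c = EPS0 * area / d                        # air: relative permittivity ~1
    print(f"gap {d*1e6:5.1f} um -> C = {c*1e12:5.2f} pF")
```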
Verification testing of the compression performance of the HEVC screen content coding extensions
NASA Astrophysics Data System (ADS)
Sullivan, Gary J.; Baroncini, Vittorio A.; Yu, Haoping; Joshi, Rajan L.; Liu, Shan; Xiu, Xiaoyu; Xu, Jizheng
2017-09-01
This paper reports on verification testing of the coding performance of the screen content coding (SCC) extensions of the High Efficiency Video Coding (HEVC) standard (Rec. ITU-T H.265 | ISO/IEC 23008-2 MPEG-H Part 2). The coding performance of the HEVC screen content model (SCM) reference software is compared with that of the HEVC test model (HM) without the SCC extensions, as well as with the Advanced Video Coding (AVC) joint model (JM) reference software, for both lossy and mathematically lossless compression using All-Intra (AI), Random Access (RA), and Low-delay B (LB) encoding structures and using similar encoding techniques. Video test sequences in 1920×1080 RGB 4:4:4, YCbCr 4:4:4, and YCbCr 4:2:0 colour sampling formats with 8 bits per sample are tested in two categories: "text and graphics with motion" (TGM) and "mixed" content. For lossless coding, the encodings are evaluated in terms of relative bit-rate savings. For lossy compression, subjective testing was conducted at 4 quality levels for each coding case, and the test results are presented through mean opinion score (MOS) curves. The relative coding performance is also evaluated in terms of Bjøntegaard-delta (BD) bit-rate savings for equal PSNR quality. The perceptual tests and objective metric measurements show a very substantial benefit in coding efficiency for the SCC extensions, and provided consistent results with a high degree of confidence. For TGM video, the estimated bit-rate savings ranged from 60-90% relative to the JM and 40-80% relative to the HM, depending on the AI/RA/LB configuration category and colour sampling format.
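For readers unfamiliar with the Bjøntegaard-delta metric cited above, the sketch below implements the standard BD-rate calculation (cubic fit of log-rate versus PSNR, integrated over the overlapping quality range); the rate/PSNR points are invented, not the verification-test data.

```python
import numpy as np

def bd_rate(rates_ref, psnr_ref, rates_test, psnr_test):
    """Average bit-rate difference (%) of 'test' vs 'ref' at equal PSNR."""
    lr_ref, lr_test = np.log10(rates_ref), np.log10(rates_test)
    p_ref = np.polyfit(psnr_ref, lr_ref, 3)      # cubic fit: log-rate as f(PSNR)
    p_test = np.polyfit(psnr_test, lr_test, 3)
    lo = max(min(psnr_ref), min(psnr_test))      # overlapping PSNR interval
    hi = min(max(psnr_ref), max(psnr_test))
    int_ref = np.polyval(np.polyint(p_ref), hi) - np.polyval(np.polyint(p_ref), lo)
    int_test = np.polyval(np.polyint(p_test), hi) - np.polyval(np.polyint(p_test), lo)
    avg_diff = (int_test - int_ref) / (hi - lo)  # mean log10 rate difference
    return (10 ** avg_diff - 1.0) * 100.0        # percent bit-rate change

# Invented example: the "test" codec needs far less rate for the same PSNR.
ref = dict(rates=[1000, 2000, 4000, 8000], psnr=[34.0, 37.0, 40.0, 43.0])
test = dict(rates=[400, 800, 1600, 3200], psnr=[34.5, 37.5, 40.5, 43.5])
print(f"BD-rate: {bd_rate(ref['rates'], ref['psnr'], test['rates'], test['psnr']):.1f} %")
```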
The Role of Integrated Modeling in the Design and Verification of the James Webb Space Telescope
NASA Technical Reports Server (NTRS)
Mosier, Gary E.; Howard, Joseph M.; Johnston, John D.; Parrish, Keith A.; Hyde, T. Tupper; McGinnis, Mark A.; Bluth, Marcel; Kim, Kevin; Ha, Kong Q.
2004-01-01
The James Webb Space Telescope (JWST) is a large, infrared-optimized space telescope scheduled for launch in 2011. System-level verification of critical optical performance requirements will rely on integrated modeling to a considerable degree. In turn, requirements for accuracy of the models are significant. The size of the lightweight observatory structure, coupled with the need to test at cryogenic temperatures, effectively precludes validation of the models and verification of optical performance with a single test in 1-g. Rather, a complex series of steps is planned by which the components of the end-to-end models are validated at various levels of subassembly, and the ultimate verification of optical performance is by analysis using the assembled models. This paper describes the critical optical performance requirements driving the integrated modeling activity, shows how the error budget is used to allocate and track contributions to total performance, and presents examples of integrated modeling methods and results that support the preliminary observatory design. Finally, the concepts for model validation and the role of integrated modeling in the ultimate verification of the observatory are described.
QPF verification using different radar-based analyses: a case study
NASA Astrophysics Data System (ADS)
Moré, J.; Sairouni, A.; Rigo, T.; Bravo, M.; Mercader, J.
2009-09-01
Verification of QPF in NWP models has always been challenging, not only in knowing which scores best quantify a particular skill of a model but also in choosing the most appropriate methodology when comparing forecasts with observations. On the one hand, an objective verification technique can provide conclusions that are not in agreement with those obtained by the "eyeball" method; consequently, QPF can provide valuable information to forecasters in spite of having poor scores. On the other hand, there are difficulties in knowing the "truth", so different results can be achieved depending on the procedures used to obtain the precipitation analysis. The aim of this study is to show the importance of combining different precipitation analyses and verification methodologies to obtain a better knowledge of the skills of a forecasting system. In particular, a short-range precipitation forecasting system based on MM5 at 12 km coupled with LAPS is studied for a local convective precipitation event that took place in the NE Iberian Peninsula on 3 October 2008. For this purpose, a variety of verification methods (dichotomous, recalibration and object-oriented methods) are used to verify this case study. At the same time, different precipitation analyses, obtained by interpolating radar data using different techniques, are used in the verification process.
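The dichotomous part of such a verification reduces to a 2x2 contingency table of forecast versus observed precipitation exceedances; a short sketch of the standard scores with invented counts:

```python
# Contingency table for "precipitation >= threshold" (invented counts).
hits, false_alarms, misses, correct_negatives = 42, 18, 11, 229

pod = hits / (hits + misses)                        # probability of detection
far = false_alarms / (hits + false_alarms)          # false alarm ratio
csi = hits / (hits + misses + false_alarms)         # critical success index
freq_bias = (hits + false_alarms) / (hits + misses) # frequency bias

print(f"POD={pod:.2f}  FAR={far:.2f}  CSI={csi:.2f}  bias={freq_bias:.2f}")
```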
An ontology based trust verification of software license agreement
NASA Astrophysics Data System (ADS)
Lu, Wenhuan; Li, Xiaoqing; Gan, Zengqin; Wei, Jianguo
2017-08-01
When we install or download software, a lengthy document stating rights and obligations is displayed, and many users lack the patience to read or understand it. This may make users distrust the software. In this paper, we propose an ontology-based verification of software license agreements. First, this work proposes an ontology model for the domain of software license agreements. The domain ontology is constructed by the proposed methodology according to copyright laws and 30 software license agreements. The license ontology can act as part of a generalized copyright-law knowledge model and can also serve as a visualization of software licenses. Based on this ontology, a software-license-oriented text summarization approach is proposed, whose performance shows that it can improve the accuracy of software license summarization. Based on the summarization, the underlying purpose of a software license can be explicitly explored for trust verification.
Area of Concern (AOC) 314 Verification Survey at Former McClellan AFB, Sacramento, CA
2015-03-31
USAFSAM/OEC personnel also collected 22 soil samples from within AOC 314. Laboratory analysis revealed that the concentration of radium-226 (Ra-226) in 10 of the soil samples ... at least one sample that exceeded 2.0 pCi/g. The highest concentration of Ra-226 found in any of the soil samples was 25.8 pCi/g. Based on these ... and ensure the potential health risk to future inhabitants is minimized.
Toward Automatic Verification of Goal-Oriented Flow Simulations
NASA Technical Reports Server (NTRS)
Nemec, Marian; Aftosmis, Michael J.
2014-01-01
We demonstrate the power of adaptive mesh refinement with adjoint-based error estimates in verification of simulations governed by the steady Euler equations. The flow equations are discretized using a finite volume scheme on a Cartesian mesh with cut cells at the wall boundaries. The discretization error in selected simulation outputs is estimated using the method of adjoint-weighted residuals. Practical aspects of the implementation are emphasized, particularly in the formulation of the refinement criterion and the mesh adaptation strategy. Following a thorough code verification example, we demonstrate simulation verification of two- and three-dimensional problems. These involve an airfoil performance database, a pressure signature of a body in supersonic flow and a launch abort with strong jet interactions. The results show reliable estimates and automatic control of discretization error in all simulations at an affordable computational cost. Moreover, the approach remains effective even when theoretical assumptions, e.g., steady-state and solution smoothness, are relaxed.
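The adjoint-weighted residual estimate referred to above has the standard form below (notation assumed here, not taken from the paper): the error in an output functional J is approximated by weighting the fine-space residual of the coarse solution with the discrete adjoint.

```latex
\delta J \;\equiv\; J(u_h) - J(u_H) \;\approx\; -\,\psi_h^{\mathsf{T}}\, R_h\!\left(u_H^{h}\right),
\qquad\text{where}\qquad
\left(\frac{\partial R_h}{\partial u_h}\right)^{\!\mathsf{T}} \psi_h
= \left(\frac{\partial J}{\partial u_h}\right)^{\!\mathsf{T}}.
```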
Category V Compliant Container for Mars Sample Return Missions
NASA Technical Reports Server (NTRS)
Dolgin, Benjamin; Sanok, Joseph; Sevilla, Donald; Bement, Laurence J.
2000-01-01
A novel containerization technique that satisfies Planetary Protection (PP) Category V requirements has been developed and demonstrated on a mock-up of the Mars Sample Return Container. The proposed approach uses explosive welding with a sacrificial layer and cut-through-the-seam techniques. The technology produces a container that is free from Martian contaminants at the atomic level. The containerization technique can be used on any celestial body that may support life. A major advantage of the proposed technology is the possibility of very fast (less than an hour) verification of both containment and cleanliness with typical metallurgical laboratory equipment. No separate biological verification is required. In addition to meeting Category V requirements, the proposed container presents a surface that is clean of any organisms, even nonviable ones, and of any molecular fragments of biological origin that are unique to Mars or any other celestial body other than Earth.
Diode step stress program for JANTX1N5615
NASA Technical Reports Server (NTRS)
1979-01-01
The effect of power/temperature step stress when applied to the switching diode JANTX1N5615 manufactured by Semtech and Micro semiconductor was examined. A total of 48 samples from each manufacturer were submitted to the process. In addition, two control sample units were maintained for verification of the electrical parametric testing. All test samples were subjected to the electrical tests after completing the prior power/temperature step stress point. Results are presented.
The purpose of this SOP is to assure suitable temperature maintenance in refrigerators and freezers used for sample storage during the Arizona NHEXAS project and the Border study. Keywords: lab; equipment; refrigerators and freezers.
The U.S.-Mexico Border Program is sponsored...
Transistor step stress testing program for JANTX2N2484
NASA Technical Reports Server (NTRS)
1979-01-01
The effect of power/temperature step stress when applied to the transistor JANTX2N2484, manufactured by Raytheon and Teledyne was evaluated. Forty-eight samples from each manufacturer were divided equally (16 per group) into three groups and submitted to the processes outlined. In addition, two control sample units were maintained for verification of the electrical parametric testing.
ERIC Educational Resources Information Center
Mesmer-Magnus, Jessica R.; Viswesvaran, Chockalingam
2005-01-01
The overlap between measures of work-to-family conflict (WFC) and family-to-work conflict (FWC) was meta-analytically investigated. Researchers have assumed WFC and FWC to be distinct; however, this assumption requires empirical verification. Across 25 independent samples (total N=9079) the sample-size-weighted mean observed correlation was .38 and the…
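The sample-size-weighted mean observed correlation reported above is simply r-bar = Σ Ni·ri / Σ Ni; a two-line sketch with invented per-study values:

```python
import numpy as np

n = np.array([320, 510, 275, 640])        # invented per-sample sizes
r = np.array([0.35, 0.41, 0.29, 0.40])    # invented observed WFC-FWC correlations

r_bar = np.sum(n * r) / np.sum(n)         # sample-size-weighted mean correlation
print(f"weighted mean r = {r_bar:.2f}")
```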
Kim, Sang-Bog; Roche, Jennifer
2013-08-01
Organically bound tritium (OBT) is an important tritium species that can be measured in most environmental samples, but has only recently been recognized as a species of tritium in these samples. Currently, OBT is not routinely measured by environmental monitoring laboratories around the world. There are no certified reference materials (CRMs) for environmental samples. Thus, quality assurance (QA), or verification of the accuracy of the OBT measurement, is not possible. Alternatively, quality control (QC), or verification of the precision of the OBT measurement, can be achieved. In the past, there have been differences in OBT analysis results between environmental laboratories. A possible reason for the discrepancies may be differences in analytical methods. Therefore, inter-laboratory OBT comparisons among the environmental laboratories are important and would provide a good opportunity for adopting a reference OBT analytical procedure. Due to the analytical issues, only limited information is available on OBT measurement. Previously conducted OBT inter-laboratory practices are reviewed and the findings are described. Based on our experiences, a few considerations were suggested for the international OBT inter-laboratory comparison exercise to be completed in the near future. Crown Copyright © 2013. Published by Elsevier Ltd. All rights reserved.
Prakash, Varuna; Koczmara, Christine; Savage, Pamela; Trip, Katherine; Stewart, Janice; McCurdie, Tara; Cafazzo, Joseph A; Trbovich, Patricia
2014-11-01
Nurses are frequently interrupted during medication verification and administration; however, few interventions exist to mitigate resulting errors, and the impact of these interventions on medication safety is poorly understood. The study objectives were to (A) assess the effects of interruptions on medication verification and administration errors, and (B) design and test the effectiveness of targeted interventions at reducing these errors. The study focused on medication verification and administration in an ambulatory chemotherapy setting. A simulation laboratory experiment was conducted to determine interruption-related error rates during specific medication verification and administration tasks. Interventions to reduce these errors were developed through a participatory design process, and their error reduction effectiveness was assessed through a postintervention experiment. Significantly more nurses committed medication errors when interrupted than when uninterrupted. With use of interventions when interrupted, significantly fewer nurses made errors in verifying medication volumes contained in syringes (16/18; 89% preintervention error rate vs 11/19; 58% postintervention error rate; p=0.038; Fisher's exact test) and programmed in ambulatory pumps (17/18; 94% preintervention vs 11/19; 58% postintervention; p=0.012). The rate of error commission significantly decreased with use of interventions when interrupted during intravenous push (16/18; 89% preintervention vs 6/19; 32% postintervention; p=0.017) and pump programming (7/18; 39% preintervention vs 1/19; 5% postintervention; p=0.017). No statistically significant differences were observed for other medication verification tasks. Interruptions can lead to medication verification and administration errors. Interventions were highly effective at reducing unanticipated errors of commission in medication administration tasks, but showed mixed effectiveness at reducing predictable errors of detection in medication verification tasks. These findings can be generalised and adapted to mitigate interruption-related errors in other settings where medication verification and administration are required. Published by the BMJ Publishing Group Limited. For permission to use (where not already granted under a licence) please go to http://group.bmj.com/group/rights-licensing/permissions.
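The comparisons above are 2x2 tests of error counts; a sketch of one such test with scipy, using the syringe-volume counts quoted in the abstract (16 of 18 nurses pre-intervention versus 11 of 19 post-intervention). The reported p=0.038 may reflect a different test variant, so the value printed here is not guaranteed to match.

```python
from scipy.stats import fisher_exact

# Syringe-volume verification errors (rows: pre/post intervention;
# columns: nurses with an error / without an error).
table = [[16, 18 - 16],
         [11, 19 - 11]]

odds_ratio, p_value = fisher_exact(table, alternative="two-sided")
print(f"odds ratio = {odds_ratio:.2f}, p = {p_value:.3f}")
```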
Kuswandi, Bambang; Irmawati, Titi; Hidayat, Moch Amrun; Jayus; Ahmad, Musa
2014-01-01
A simple visual ethanol biosensor based on alcohol oxidase (AOX) immobilised onto polyaniline (PANI) film for halal verification of fermented beverage samples is described. This biosensor responds to ethanol via a colour change from green to blue, due to the enzymatic oxidation of ethanol, which produces acetaldehyde and hydrogen peroxide; the latter oxidizes the PANI film. The procedure to obtain this biosensor consists of the immobilization of AOX onto PANI film by adsorption. For the immobilisation, an AOX solution is deposited on the PANI film and left at room temperature until dried (30 min). The biosensor was constructed as a dip stick for visual and simple use. The colour changes of the films were scanned and analysed using image analysis software (i.e., ImageJ) to study the characteristics of the biosensor's response toward ethanol. The biosensor has a linear response in an ethanol concentration range of 0.01%–0.8%, with a correlation coefficient (r) of 0.996. The detection limit of the biosensor was 0.001%, with reproducibility (RSD) of 1.6% and a lifetime of up to seven weeks when stored at 4 °C. The biosensor provides accurate results for ethanol determination in fermented drinks, in good agreement with the standard method (gas chromatography) results. Thus, the biosensor could be used as a simple visual method for ethanol determination in fermented beverage samples, which can be useful to the Muslim community for halal verification. PMID:24473284
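As an illustration of the calibration step described above (fitting a straight line to image colour intensities, extracted for instance with ImageJ, against ethanol concentration), a minimal sketch is given below. All concentration and intensity values, and the blank standard deviation used for the detection-limit estimate, are made up for illustration; they are not the paper's data.

```python
# Hypothetical sketch of the calibration step described above: fit a straight
# line to mean colour-channel intensities against ethanol concentration and
# report the correlation coefficient. Numbers are fabricated for illustration.
import numpy as np

conc = np.array([0.01, 0.05, 0.1, 0.2, 0.4, 0.6, 0.8])          # % ethanol (v/v)
blue = np.array([12.0, 20.5, 31.0, 52.0, 95.0, 139.0, 180.0])   # mean blue intensity

slope, intercept = np.polyfit(conc, blue, 1)                    # linear calibration
r = np.corrcoef(conc, blue)[0, 1]                               # correlation coefficient
print(f"calibration: intensity = {slope:.1f} * conc + {intercept:.1f}, r = {r:.3f}")

# A common (assumed) way to estimate a detection limit from the calibration:
# 3 x standard deviation of blank replicates divided by the slope.
blank_sd = 0.4                                                  # hypothetical blank SD
print(f"estimated LOD ~ {3 * blank_sd / slope:.4f} % ethanol")
```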
INNOVATIVE TECHNOLOGY VERIFICATION REPORT XRF ...
The Elvatech, Ltd. ElvaX (ElvaX) x-ray fluorescence (XRF) analyzer distributed in the United States by Xcalibur XRF Services (Xcalibur), was demonstrated under the U.S. Environmental Protection Agency (EPA) Superfund Innovative Technology Evaluation (SITE) Program. The field portion of the demonstration was conducted in January 2005 at the Kennedy Athletic, Recreational and Social Park (KARS) at Kennedy Space Center on Merritt Island, Florida. The demonstration was designed to collect reliable performance and cost data for the ElvaX analyzer and seven other commercially available XRF instruments for measuring trace elements in soil and sediment. The performance and cost data were evaluated to document the relative performance of each XRF instrument. This innovative technology verification report describes the objectives and the results of that evaluation and serves to verify the performance and cost of the ElvaX analyzer. Separate reports have been prepared for the other XRF instruments that were evaluated as part of the demonstration. The objectives of the evaluation included determining each XRF instrument’s accuracy, precision, sample throughput, and tendency for matrix effects. To fulfill these objectives, the field demonstration incorporated the analysis of 326 prepared samples of soil and sediment that contained 13 target elements. The prepared samples included blends of environmental samples from nine different sample collection sites as well as s
Reassessment of the Access Testosterone chemiluminescence assay and comparison with LC-MS method.
Dittadi, Ruggero; Matteucci, Mara; Meneghetti, Elisa; Ndreu, Rudina
2018-03-01
To reassess the imprecision and Limit of Quantitation, to evaluate the cross-reaction with dehydroepiandrosterone-sulfate (DHEAS), the accuracy against liquid chromatography-mass spectrometry (LC-MS) and the reference interval of the Access Testosterone method, performed on the DxI immunoassay platform (Beckman Coulter). Imprecision was evaluated by testing six pool samples assayed in 20 different runs using two reagent lots. The cross-reaction with DHEAS was studied both by a displacement curve and by spiking DHEAS standard into two serum samples with known amounts of testosterone. The comparison with LC-MS was evaluated by Passing-Bablok analysis in 21 routine serum samples and 19 control samples from an External Quality Assurance (EQA) scheme. The reference interval was verified by an indirect estimation on 2445 male and 2838 female outpatients. The imprecision study showed a coefficient of variation (CV) between 2.7% and 34.7% for serum pools ranging from 16.3 to 0.27 nmol/L. The Limit of Quantitation at 20% CV was 0.53 nmol/L. DHEAS showed a cross-reaction of 0.0074%. The comparison with LC-MS showed a trend toward a slight underestimation of the immunoassay vs LC-MS (Passing-Bablok equations: DxI=-0.24+0.906 LCMS in serum samples and DxI=-0.299+0.981 LCMS in EQA samples). The verification of the reference interval showed a 2.5th-97.5th percentile distribution of 6.6-24.3 nmol/L for males over 14 years and <0.5-2.78 nmol/L for female subjects, in accord with the reference intervals reported by the manufacturer. The Access Testosterone method can be considered an adequately reliable tool for testosterone measurement. © 2017 Wiley Periodicals, Inc.
DOE Office of Scientific and Technical Information (OSTI.GOV)
J. M. Capron
2008-05-30
The 100-F-44:2 waste site is a steel pipeline that was discovered in a junction box during confirmatory sampling of the 100-F-26:4 pipeline from December 2004 through January 2005. The 100-F-44:2 pipeline feeds into the 100-F-26:4 subsite vitrified clay pipe (VCP) process sewer pipeline from the 108-F Biology Laboratory at the junction box. In accordance with this evaluation, the confirmatory sampling results support a reclassification of this site to No Action. The current site conditions achieve the remedial action objectives and the corresponding remedial action goals established in the Remaining Sites ROD. The results of confirmatory sampling show that residual contaminant concentrations do not preclude any future uses and allow for unrestricted use of shallow zone soils. The results also demonstrate that residual contaminant concentrations are protective of groundwater and the Columbia River.
NASA Astrophysics Data System (ADS)
Ujianto, O.; Noviyanti, R.; Wijaya, R.; Ramadhoni, B.
2017-07-01
Natural rubber (NR)/coconut coir (CF) composites were fabricated using a co-rotating twin screw extruder with maleated NR (MNR) used as compatibilizer. The MNR was produced at three levels of maleic anhydride (MA), and analyzed qualitatively and quantitatively using FTIR and titration techniques. Analysis of the MNR by FTIR and titration showed that MA was grafted onto the NR chain at different percentages (0.76, 2.23, 4.79%) depending on the MA concentration. Tensile testing showed that the best tensile strength, 16.4 MPa, was obtained at 7 phr of MNR with 1 phr of MA in the MNR. The improvement of the compatibilized samples was more than 300% compared to the uncompatibilized composite, attributed to better interfacial bonding. The improvement in tensile strength was significantly influenced by the MNR level and the amount of MA added to produce the MNR, as well as their interaction. The optimum conditions for producing the NR-CF composite were predicted at 6.5 phr of MNR with 1 phr of MA added in MNR production, regardless of screw rotation settings. Results from verification experiments confirmed that the developed model was capable of describing the phenomena observed during composite preparation. Morphology analysis using scanning electron microscopy showed smoother, better-covered fibres in compatibilized samples than in those without MNR. The morphology also showed fewer voids in compatibilized samples, attributed to better interfacial bonding, leading to the tensile strength improvement.
Diode step stress program, JANTX1N5614
NASA Technical Reports Server (NTRS)
1978-01-01
The reliability of switching diode JANTX1N5614 was tested. The effect of power/temperature step stress on the diode was determined. Control sample units were maintained for verification of the electrical parametric testing. Results are reported.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Damiani, Rick
This manual summarizes the theory and preliminary verifications of the JacketSE module, which is an offshore jacket sizing tool that is part of the Wind-Plant Integrated System Design & Engineering Model toolbox. JacketSE is based on a finite-element formulation and on user-prescribed inputs and design standards' criteria (constraints). The physics are highly simplified, with a primary focus on satisfying ultimate limit states and modal performance requirements. Preliminary validation work included comparing industry data and verification against ANSYS, a commercial finite-element analysis package. The results are encouraging, and future improvements to the code are recommended in this manual.
SU-F-T-268: A Feasibility Study of Independent Dose Verification for Vero4DRT
DOE Office of Scientific and Technical Information (OSTI.GOV)
Yamashita, M; Kokubo, M; Institute of Biomedical Research and Innovation, Kobe, Hyogo
2016-06-15
Purpose: Vero4DRT (Mitsubishi Heavy Industries Ltd.) has been released for a few years. The treatment planning system (TPS) of Vero4DRT is dedicated, so measurement has been the only method of dose verification. There have been no reports of independent dose verification using a Clarkson-based algorithm for Vero4DRT. An independent dose verification software program for general-purpose linacs using a modified Clarkson-based algorithm was adapted for Vero4DRT. In this study, we evaluated the accuracy of the independent dose verification program and the feasibility of the secondary check for Vero4DRT. Methods: iPlan (Brainlab AG) was used as the TPS. Pencil Beam Convolution was used as the dose calculation algorithm for IMRT and X-ray Voxel Monte Carlo was used for the others. Simple MU Analysis (SMU, Triangle Products, Japan) was used as the independent dose verification software program, in which CT-based dose calculation was performed using a modified Clarkson-based algorithm. In this study, 120 patients' treatment plans were collected in our institute. The treatments were performed using conventional irradiation for lung and prostate, SBRT for lung, and step-and-shoot IMRT for prostate. Doses from the TPS and the SMU were compared, and confidence limits (CLs, Mean ± 2SD %) were compared to those from the general-purpose linac. Results: The CLs for conventional irradiation (lung, prostate), SBRT (lung) and IMRT (prostate) were 2.2 ± 3.5% (CL of the general-purpose linac: 2.4 ± 5.3%), 1.1 ± 1.7% (−0.3 ± 2.0%), 4.8 ± 3.7% (5.4 ± 5.3%) and −0.5 ± 2.5% (−0.1 ± 3.6%), respectively. The CLs for Vero4DRT are similar to those for the general-purpose linac. Conclusion: The independent dose verification for the new linac is clinically available as a secondary check, and we performed the check with a tolerance level similar to that of the general-purpose linac. This research is partially supported by the Japan Agency for Medical Research and Development (AMED).
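The confidence-limit statistic quoted above is defined as the mean ± 2SD of the per-plan percentage dose differences between the TPS and the independent calculation. A minimal sketch of that computation is shown below; the dose values are fabricated, and the choice of the TPS dose as the denominator of the percentage difference is an assumption.

```python
# Minimal sketch of the confidence-limit statistic used above (Mean +/- 2SD of
# the per-plan percentage dose differences between the TPS and the independent
# Clarkson-based calculation). The dose values here are hypothetical.
import numpy as np

def confidence_limit(tps_dose, independent_dose):
    """Return (mean, 2*SD) of the percentage differences (TPS - independent)/TPS."""
    tps_dose = np.asarray(tps_dose, dtype=float)
    independent_dose = np.asarray(independent_dose, dtype=float)
    diff_pct = 100.0 * (tps_dose - independent_dose) / tps_dose
    return diff_pct.mean(), 2.0 * diff_pct.std(ddof=1)

# Example with fabricated doses (Gy) for a handful of plans:
mean, two_sd = confidence_limit([2.00, 2.00, 8.00, 2.00], [1.96, 2.03, 7.70, 1.99])
print(f"CL = {mean:.1f} +/- {two_sd:.1f} %")
```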
Ontology Matching with Semantic Verification.
Jean-Mary, Yves R; Shironoshita, E Patrick; Kabuka, Mansur R
2009-09-01
ASMOV (Automated Semantic Matching of Ontologies with Verification) is a novel algorithm that uses lexical and structural characteristics of two ontologies to iteratively calculate a similarity measure between them, derives an alignment, and then verifies it to ensure that it does not contain semantic inconsistencies. In this paper, we describe the ASMOV algorithm, and then present experimental results that measure its accuracy using the OAEI 2008 tests, and that evaluate its use with two different thesauri: WordNet, and the Unified Medical Language System (UMLS). These results show the increased accuracy obtained by combining lexical, structural and extensional matchers with semantic verification, and demonstrate the advantage of using a domain-specific thesaurus for the alignment of specialized ontologies.
1988-01-01
under field conditions. Sampling and analytical laboratory activities were performed by Ecology and Environment, Inc., and California Analytical ... the proposed AER3 test conditions. All test samples would be obtained onsite by Ecology and Environment, Inc., of Buffalo, New York, and sent to ... ensuring its safe operation. Ecology and Environment performed onsite verification sampling. This activity was coordinated with the Huber project team
Transistor step stress testing program for JANTX2N2905A
NASA Technical Reports Server (NTRS)
1979-01-01
The effect of power/temperature step stress when applied to the transistor JANTX2N2905A manufactured by Texas Instruments and Motorola is reported. A total of 48 samples from each manufacturer were submitted to the process outlined. In addition, two control sample units were maintained for verification of the electrical parametric testing. All test samples were subjected to the electrical tests outlined in Table 2 after completing the prior power/temperature step stress point.
Juillet, Y; Dubois, C; Bintein, F; Dissard, J; Bossée, A
2014-08-01
A new rapid, sensitive and reliable method was developed for the determination of phosgene in air samples using thermal desorption (TD) followed by gas chromatography-mass spectrometry (GC-MS). The method is based on fast (10 min) active sampling of only 1 L of air onto a Tenax® GR tube doped with 0.5 mL of a derivatizing mixture containing dimercaptotoluene and triethylamine in hexane solution. Validation of the TD-GC-MS method showed a low limit of detection (40 ppbv), acceptable repeatability, intermediate precision (relative standard deviation within 12%) and excellent accuracy (>95%). Linearity was demonstrated for two concentration ranges (0.04 to 2.5 ppmv and 2.5 to 10 ppmv) owing to variation of the derivatization recovery between low and high concentration levels. Due to its simple on-site implementation and its close similarity to the recommended operating procedure (ROP) for chemical warfare agent vapour sampling, the method is particularly useful in the process of verification of the Chemical Weapons Convention.
2008-02-28
An ER-2 high-altitude Earth science aircraft banks away during a flight over the southern Sierra Nevada. NASA’s Armstrong Flight Research Center operates two of the Lockheed-built aircraft on a wide variety of environmental science, atmospheric sampling, and satellite data verification missions.
Methods and Procedures in PIRLS 2016
ERIC Educational Resources Information Center
Martin, Michael O., Ed.; Mullis, Ina V. S., Ed.; Hooper, Martin, Ed.
2017-01-01
"Methods and Procedures in PIRLS 2016" documents the development of the Progress in International Reading Literacy Study (PIRLS) assessments and questionnaires and describes the methods used in sampling, translation verification, data collection, database construction, and the construction of the achievement and context questionnaire…
NASA Technical Reports Server (NTRS)
Jackson, T. J.; Shiue, J.; Oneill, P.; Wang, J.; Fuchs, J.; Owe, M.
1984-01-01
The verification of a multi-sensor aircraft system developed to study soil moisture applications is discussed. This system consisted of a three-beam push broom L-band microwave radiometer, a thermal infrared scanner, a multispectral scanner, video and photographic cameras, and an onboard navigation instrument. Ten flights were made over agricultural sites in Maryland and Delaware with little or no vegetation cover. Comparisons of aircraft and ground measurements showed that the system was reliable and consistent. Time series analysis of microwave and evaporation data showed a strong similarity that indicates a potential direction for future research.
"Edge-on" MOSkin detector for stereotactic beam measurement and verification.
Jong, Wei Loong; Ung, Ngie Min; Vannyat, Ath; Jamalludin, Zulaikha; Rosenfeld, Anatoly; Wong, Jeannie Hsiu Ding
2017-01-01
Dosimetry in small radiation fields is challenging and complicated because of dose volume averaging and beam perturbations in a detector. We evaluated the suitability of the "Edge-on" MOSkin (MOSFET) detector for small radiation field measurement. We also tested its feasibility for dosimetric verification in stereotactic radiosurgery (SRS) and stereotactic radiotherapy (SRT). The "Edge-on" MOSkin detector was calibrated and its reproducibility and linearity were determined. Lateral dose profiles and output factors were measured using the "Edge-on" MOSkin detector, an ionization chamber, an SRS diode and EBT2 film. Dosimetric verification was carried out on two SRS and five SRT plans. In dose profile measurements, the "Edge-on" MOSkin measurements concurred with EBT2 film measurements, reproducing the full width at half maximum of the dose profile with an average difference of 0.11 mm and the penumbral width within ±0.2 mm for all SRS cones compared to EBT2 film measurement. For output factor measurements, a 1.1% difference was observed between the "Edge-on" MOSkin detector and EBT2 film for the 4 mm SRS cone. The "Edge-on" MOSkin detector provided reproducible measurements for dose verification in real time. The measured doses concurred with the calculated doses for SRS (within 1%) and SRT (within 3%). A set of output correction factors for the "Edge-on" MOSkin detector in small radiation fields was derived from EBT2 film measurement and is presented. This study showed that the "Edge-on" MOSkin detector is a suitable tool for dose verification in small radiation fields. Copyright © 2017 Associazione Italiana di Fisica Medica. Published by Elsevier Ltd. All rights reserved.
High-resolution face verification using pore-scale facial features.
Li, Dong; Zhou, Huiling; Lam, Kin-Man
2015-08-01
Face recognition methods, which usually represent face images using holistic or local facial features, rely heavily on alignment. Their performance also suffers severe degradation under variations in expression or pose, especially when there is only one gallery image per subject. With the easy access to high-resolution (HR) face images nowadays, some HR face databases have recently been developed. However, few studies have tackled the use of HR information for face recognition or verification. In this paper, we propose a pose-invariant face-verification method, which is robust to alignment errors, using HR information based on pore-scale facial features. A new keypoint descriptor, namely pore-Principal Component Analysis (PCA)-Scale Invariant Feature Transform (PPCASIFT), adapted from PCA-SIFT, is devised for the extraction of a compact set of distinctive pore-scale facial features. Having matched the pore-scale features of two face regions, an effective robust-fitting scheme is proposed for the face-verification task. Experiments show that, with only one frontal-view gallery image per subject, our proposed method outperforms a number of standard verification methods, and can achieve excellent accuracy even when the faces are under large variations in expression and pose.
Schmidt, Robert L; Walker, Brandon S; Cohen, Michael B
2015-03-01
Reliable estimates of accuracy are important for any diagnostic test. Diagnostic accuracy studies are subject to unique sources of bias. Verification bias and classification bias are 2 sources of bias that commonly occur in diagnostic accuracy studies. Statistical methods are available to estimate the impact of these sources of bias when they occur alone. The impact of interactions when these types of bias occur together has not been investigated. We developed mathematical relationships to show the combined effect of verification bias and classification bias. A wide range of case scenarios were generated to assess the impact of bias components and interactions on total bias. Interactions between verification bias and classification bias caused overestimation of sensitivity and underestimation of specificity. Interactions had more effect on sensitivity than specificity. Sensitivity was overestimated by at least 7% in approximately 6% of the tested scenarios. Specificity was underestimated by at least 7% in less than 0.1% of the scenarios. Interactions between verification bias and classification bias create distortions in accuracy estimates that are greater than would be predicted from each source of bias acting independently. © 2014 American Cancer Society.
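A simple way to see the direction of the distortions described above is to simulate a diagnostic accuracy study in which verification by the reference standard depends on the index test result and the reference standard itself occasionally misclassifies. The sketch below does exactly that with assumed parameter values; it is a generic illustration, not the authors' analytical model.

```python
# Monte-Carlo sketch of how verification bias (only a fraction of test-negatives
# get the reference standard) combined with classification bias (an imperfect
# reference standard) distorts naive sensitivity/specificity estimates.
# Parameter values are assumptions for illustration, not the paper's scenarios.
import numpy as np

rng = np.random.default_rng(0)
n = 200_000
prevalence, true_se, true_sp = 0.10, 0.80, 0.90
verify_neg_fraction = 0.20          # test-negatives verified at this rate only
ref_se, ref_sp = 0.95, 0.95         # imperfect reference standard

disease = rng.random(n) < prevalence
test_pos = np.where(disease, rng.random(n) < true_se, rng.random(n) < (1 - true_sp))
verified = test_pos | (rng.random(n) < verify_neg_fraction)
ref_pos = np.where(disease, rng.random(n) < ref_se, rng.random(n) < (1 - ref_sp))

# Naive estimates computed only among verified subjects, against the imperfect reference:
v = verified
est_se = (test_pos & ref_pos & v).sum() / (ref_pos & v).sum()
est_sp = (~test_pos & ~ref_pos & v).sum() / (~ref_pos & v).sum()
print(f"apparent sensitivity {est_se:.3f} (true {true_se}), "
      f"apparent specificity {est_sp:.3f} (true {true_sp})")
```

Running the sketch shows the pattern reported above: apparent sensitivity is inflated and apparent specificity deflated relative to the true values.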
Measuring Data Quality Through a Source Data Verification Audit in a Clinical Research Setting.
Houston, Lauren; Probst, Yasmine; Humphries, Allison
2015-01-01
Health data have long been scrutinised in relation to data quality and integrity problems. Currently, no internationally accepted or "gold standard" method exists for measuring data quality and error rates within datasets. We conducted a source data verification (SDV) audit on a prospective clinical trial dataset. An audit plan was applied to conduct 100% manual verification checks on a 10% random sample of participant files. A quality assurance rule was developed whereby, if >5% of data variables were incorrect, a second 10% random sample would be extracted from the trial dataset. Errors were coded as: correct, incorrect (valid or invalid), not recorded, or not entered. Audit-1 had a total error rate of 33% and audit-2 of 36%. The physiological section was the only audit section to have <5% error. Data not recorded on case report forms had the greatest impact on error calculations. A significant association (p=0.00) was found between audit-1 and audit-2 and whether or not data were deemed correct or incorrect. Our study developed a straightforward method to perform an SDV audit. An audit rule was identified and error coding was implemented. The findings demonstrate that monitoring data quality by an SDV audit can identify data quality and integrity issues within clinical research settings, allowing quality improvements to be made. The authors suggest this approach be implemented for future research.
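The audit logic described above (100% manual checks on a 10% random sample of files, per-variable error codes, and a >5% error threshold triggering a second sample) can be illustrated with a short sketch. The record structure, code names, and toy data below are assumptions for illustration, not the study's actual tooling.

```python
# Illustrative sketch of the source-data-verification audit rule described
# above: code each checked variable, compute the error rate, and trigger a
# second 10% sample if more than 5% of variables are incorrect.
import random
from collections import Counter

def audit(files, sample_fraction=0.10, threshold=0.05, seed=1):
    """files: dict mapping file id -> dict of variable -> error code."""
    rng = random.Random(seed)
    sample_ids = rng.sample(sorted(files), max(1, round(sample_fraction * len(files))))
    codes = Counter(code for fid in sample_ids for code in files[fid].values())
    total = sum(codes.values())
    error_rate = (total - codes["correct"]) / total
    return {"sampled_files": sample_ids,
            "error_rate": error_rate,
            "second_sample_required": error_rate > threshold}

# Toy example: 30 files, 4 variables each, ~10% of entries coded as errors.
rng = random.Random(0)
files = {i: {v: ("correct" if rng.random() > 0.1 else "not_recorded")
             for v in ("weight", "bp", "dose", "visit_date")}
         for i in range(30)}
print(audit(files))
```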
OH/H2O Detection Capability Evaluation on Chang'e-5 Lunar Mineralogical Spectrometer (LMS)
NASA Astrophysics Data System (ADS)
Liu, Bin; Ren, Xin; Liu, Jianjun; Li, Chunlai; Mu, Lingli; Deng, Liyan
2016-10-01
The Chang'e-5 (CE-5) lunar sample return mission is scheduled to launch in 2017 to bring back lunar regolith and drill samples. The Chang'e-5 Lunar Mineralogical Spectrometer (LMS), one of the three sets of scientific payloads installed on the lander, is used to collect in-situ spectra and analyze the mineralogical composition of the sampling site. It can also help to select the sampling site and to compare the laboratory spectra of returned samples with in-situ data. LMS employs acousto-optic tunable filters (AOTFs) and is composed of a VIS/NIR module (0.48μm-1.45μm) and an IR module (1.4μm-3.2μm). It has a spectral resolution ranging from 3 to 25 nm, with a field of view (FOV) of 4.24°×4.24°. Unlike the Chang'e-3 VIS/NIR Imaging Spectrometer (VNIS), the spectral coverage of LMS is extended from 2.4μm to 3.2μm, which provides the capability to identify H2O/OH absorption features around 2.7μm. An aluminum plate and an Infragold plate are fixed in the dust cover and are used as calibration targets in the VIS/NIR and IR spectral ranges, respectively, when the dust cover is open. Before launch, a ground verification test of LMS needs to be conducted in order to: 1) test and verify the detection capability of LMS through evaluation of the quality of image and spectral data collected for simulated lunar samples; and 2) evaluate the accuracy of data processing methods by simulating the instrument working on the Moon. The ground verification test will be conducted both in the laboratory and in the field. The spectra of simulated lunar regolith/mineral samples will be collected simultaneously by the LMS and two calibrated spectrometers: an FTIR spectrometer (Model 102F) and an ASD FieldSpec 4 Hi-Res spectrometer. In this study, the results of the LMS ground verification test will be reported, and the OH/H2O detection capability will be evaluated in particular.
Barker, F. Keith; Oyler-McCance, Sara; Tomback, Diana F.
2015-01-01
Next generation sequencing methods allow rapid, economical accumulation of data that have many applications, even at relatively low levels of genome coverage. However, the utility of shotgun sequencing data sets for specific goals may vary depending on the biological nature of the samples sequenced. We show that the ability to assemble mitogenomes from three avian samples of two different tissue types varies widely. In particular, data with coverage typical of microsatellite development efforts (∼1×) from DNA extracted from avian blood failed to cover even 50% of the mitogenome, relative to at least 500-fold coverage from muscle-derived data. Researchers should consider possible applications of their data and select the tissue source for their work accordingly. Practitioners analyzing low-coverage shotgun sequencing data (including for microsatellite locus development) should consider the potential benefits of mitogenome assembly, including internal barcode verification of species identity, mitochondrial primer development, and phylogenetics.
[Microbiological verification of a self control plan for a hospital food service].
Torre, I; Pennino, F; Crispino, M
2006-01-01
During the past years, there has been an increase in food-related infectious diseases. In order to avoid microbiological food contamination, adherence to good manufacturing practices is required, through control measures of food safety practices. Updated national and European regulations underline the need to apply the HACCP system, overcoming the old concept of sample control on the end product only. This work shows the results of microbiological controls made along the whole production chain. Measurements were made using biomolecular techniques (PFGE) in order to assess the management of microbiological risk in the self-control plan applied to a hospital food service in Naples. The use of PFGE, applied to some potentially pathogenic gram-negative micro-organisms, revealed the continued circulation over time of these micro-organisms within the cooking area. In addition, cross contamination between several sample matrices was detected.
DOE Office of Scientific and Technical Information (OSTI.GOV)
J. M. Capron
2008-04-17
The 100-F-54 waste site, part of the 100-FR-2 Operable Unit, is the soil associated with the former pastures for holding domestic farm animals used in experimental toxicology studies. Evaluation of historical information resulted in identification of the experimental animal farm pastures as having potential residual soil contamination due to excrement from experimental animals. The 100-F-54 animal farm pastures confirmatory sampling results support a reclassification of this site to No Action. The current site conditions achieve the remedial action objectives and the corresponding remedial action goals established in the Remaining Sites ROD. The results of confirmatory sampling show that residual contaminant concentrations do not preclude any future uses and allow for unrestricted use of shallow zone soils. The results also demonstrate that residual contaminant concentrations are protective of groundwater and the Columbia River.
Advanced Curation Protocols for Mars Returned Sample Handling
NASA Astrophysics Data System (ADS)
Bell, M.; Mickelson, E.; Lindstrom, D.; Allton, J.
Introduction: Johnson Space Center has over 30 years of experience handling precious samples, including Lunar rocks and Antarctic meteorites. However, we recognize that future curation of samples from missions such as Genesis, Stardust, and Mars Sample Return will require a high degree of biosafety combined with extremely low levels of inorganic, organic, and biological contamination. To satisfy these requirements, research in the JSC Advanced Curation Lab is currently focused on two major areas: preliminary examination techniques and cleaning and verification techniques. Preliminary Examination Techniques: In order to minimize the number of paths for contamination we are exploring the synergy between human & robotic sample handling in a controlled environment to help determine the limits of clean curation. Within the Advanced Curation Laboratory is a prototype, next-generation glovebox, which contains a robotic micromanipulator. The remotely operated manipulator has six degrees-of-freedom and can be programmed to perform repetitive sample handling tasks. Protocols are being tested and developed to perform curation tasks such as rock splitting, weighing, imaging, and storing. Techniques for sample transfer enabling more detailed remote examination without compromising the integrity of sample science are also being developed. The glovebox is equipped with a rapid transfer port through which samples can be passed without exposure. The transfer is accomplished by using a unique seal and engagement system which allows passage between containers while maintaining a first seal to the outside environment and a second seal to prevent the outside of the container cover and port door from becoming contaminated by the material being transferred. Cleaning and Verification Techniques: As part of the contamination control effort, innovative cleaning techniques are being identified and evaluated in conjunction with sensitive cleanliness verification methods. Towards this end, cleaning techniques such as ultrasonication in ultra-pure water (UPW), oxygen (O2) plasma, and carbon dioxide (CO2) "snow" are being used to clean a variety of different contaminants on a variety of different surfaces. Additionally, once cleaned, techniques to directly verify the surface cleanliness are being developed. These include X-ray photoelectron spectroscopy (XPS) quantification and screening with contact angle measurements, which can be correlated with XPS standards. Methods developed in the Advanced Curation Laboratory will determine the extent to which inorganic and biological contamination can be controlled and minimized.
Genome-Scale Screen for DNA Methylation-Based Detection Markers for Ovarian Cancer
Houshdaran, Sahar; Shen, Hui; Widschwendter, Martin; Daxenbichler, Günter; Long, Tiffany; Marth, Christian; Laird-Offringa, Ite A.; Press, Michael F.; Dubeau, Louis; Siegmund, Kimberly D.; Wu, Anna H.; Groshen, Susan; Chandavarkar, Uma; Roman, Lynda D.; Berchuck, Andrew; Pearce, Celeste L.; Laird, Peter W.
2011-01-01
Background The identification of sensitive biomarkers for the detection of ovarian cancer is of high clinical relevance for early detection and/or monitoring of disease recurrence. We developed a systematic multi-step biomarker discovery and verification strategy to identify candidate DNA methylation markers for the blood-based detection of ovarian cancer. Methodology/Principal Findings We used the Illumina Infinium platform to analyze the DNA methylation status of 27,578 CpG sites in 41 ovarian tumors. We employed a marker selection strategy that emphasized sensitivity by requiring consistency of methylation across tumors, while achieving specificity by excluding markers with methylation in control leukocyte or serum DNA. Our verification strategy involved testing the ability of identified markers to monitor disease burden in serially collected serum samples from ovarian cancer patients who had undergone surgical tumor resection compared to CA-125 levels. We identified one marker, IFFO1 promoter methylation (IFFO1-M), that is frequently methylated in ovarian tumors and that is rarely detected in the blood of normal controls. When tested in 127 serially collected sera from ovarian cancer patients, IFFO1-M showed post-resection kinetics significantly correlated with serum CA-125 measurements in six out of 16 patients. Conclusions/Significance We implemented an effective marker screening and verification strategy, leading to the identification of IFFO1-M as a blood-based candidate marker for sensitive detection of ovarian cancer. Serum levels of IFFO1-M displayed post-resection kinetics consistent with a reflection of disease burden. We anticipate that IFFO1-M and other candidate markers emerging from this marker development pipeline may provide disease detection capabilities that complement existing biomarkers. PMID:22163280
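The marker-selection logic described above (requiring consistent methylation across tumors while excluding CpG sites methylated in control leukocyte or serum DNA) can be sketched as a simple filter over a matrix of Infinium beta-values. The thresholds, fraction cut-off, and toy data in the sketch below are placeholders, not the study's actual criteria.

```python
# Sketch of the marker-selection filter described above: keep CpG sites that are
# methylated consistently across tumors but essentially unmethylated in control
# leukocyte/serum DNA. Beta-value thresholds and the toy data are assumptions.
import numpy as np
import pandas as pd

def select_candidate_markers(tumor_beta, control_beta,
                             tumor_threshold=0.3, tumor_fraction=0.9,
                             control_threshold=0.1):
    """Both inputs: DataFrames (rows = CpG sites, columns = samples) of beta-values."""
    methylated_in_tumors = (tumor_beta >= tumor_threshold).mean(axis=1) >= tumor_fraction
    unmethylated_in_controls = (control_beta < control_threshold).all(axis=1)
    return tumor_beta.index[methylated_in_tumors & unmethylated_in_controls]

rng = np.random.default_rng(0)
sites = [f"cg{i:07d}" for i in range(5)]
tumors = pd.DataFrame(rng.uniform(0.0, 1.0, (5, 41)), index=sites)
controls = pd.DataFrame(rng.uniform(0.0, 0.2, (5, 20)), index=sites)
tumors.loc["cg0000002"] = 0.8            # make one site look like a real candidate
controls.loc["cg0000002"] = 0.02
print(select_candidate_markers(tumors, controls).tolist())
```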
Urine sampling and collection system optimization and testing
NASA Technical Reports Server (NTRS)
Fogal, G. L.; Geating, J. A.; Koesterer, M. G.
1975-01-01
A Urine Sampling and Collection System (USCS) engineering model was developed to provide for the automatic collection, volume sensing and sampling of urine from each micturition. The purpose of the engineering model was to demonstrate verification of the system concept. The objective of the optimization and testing program was to update the engineering model, to provide additional performance features and to conduct system testing to determine operational problems. Optimization tasks were defined as modifications to minimize system fluid residual and addition of thermoelectric cooling.
Pyroelectric effect in triglycine sulphate single crystals - Differential measurement method
NASA Astrophysics Data System (ADS)
Trybus, M.
2018-06-01
A simple mathematical model of the pyroelectric phenomenon was used to explain the electric response of TGS (triglycine sulphate) samples during linear heating in the ferroelectric and paraelectric phases. Experimental verification of the mathematical model was carried out. TGS single crystals were grown and four electrode samples were fabricated. Differential measurements of the pyroelectric response of two different regions of the samples were performed and the results were compared with data obtained from the model. Experimental results are in good agreement with model calculations.
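The abstract does not reproduce the model itself; as an assumption, the standard pyroelectric response relation that such a description is presumably built on links the short-circuit current to the heating rate, as stated below.

```latex
% Standard pyroelectric response relation, stated here as an assumption about the
% form of the "simple mathematical model" mentioned above (the abstract does not
% give it). i(t) is the short-circuit current, A the electrode area, p(T) the
% pyroelectric coefficient and T(t) the sample temperature.
\[
  i(t) = p\bigl(T(t)\bigr)\, A \,\frac{\mathrm{d}T}{\mathrm{d}t},
  \qquad
  T(t) = T_0 + \beta t \;\Rightarrow\; i(t) = p\bigl(T_0 + \beta t\bigr)\, A\, \beta .
\]
```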
Drumm, Daniel W; Greentree, Andrew D
2017-11-07
Finding a fluorescent target in a biological environment is a common and pressing microscopy problem. This task is formally analogous to the canonical search problem. In ideal (noise-free, truthful) search problems, the well-known binary search is optimal. The case of half-lies, where one of two responses to a search query may be deceptive, introduces a richer, Rényi-Ulam problem and is particularly relevant to practical microscopy. We analyse microscopy in the contexts of Rényi-Ulam games and half-lies, developing a new family of heuristics. We show the cost of insisting on verification by positive result in search algorithms; for the zero-half-lie case bisectioning with verification incurs a 50% penalty in the average number of queries required. The optimal partitioning of search spaces directly following verification in the presence of random half-lies is determined. Trisectioning with verification is shown to be the most efficient heuristic of the family in a majority of cases.
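The 50% penalty quoted for insisting on verification by positive result in the zero-half-lie case can be reproduced with a small counting simulation: plain bisection localizes a target among N positions with about log2 N queries, while bisection that only recurses into a region after it has answered positively averages about 1.5 log2 N. The sketch below illustrates that counting argument; it is not the authors' family of heuristics.

```python
# Simulation sketch of the cost of "verification by positive result" in a
# noise-free (zero-half-lie) search. Plain bisection needs ~log2(N) queries;
# bisection that recurses only into a region confirmed by a positive answer
# needs ~1.5*log2(N) on average, i.e. the ~50% penalty quoted above.
import math
import random

def query(region, target):
    """One fluorescence-style query: is the target inside this half-open region?"""
    lo, hi = region
    return lo <= target < hi

def bisect(n, target, verify=False):
    lo, hi, queries = 0, n, 0
    while hi - lo > 1:
        mid = (lo + hi) // 2
        queries += 1
        if query((lo, mid), target):
            hi = mid
        else:
            if verify:                      # insist on a positive answer before recursing
                queries += 1
                assert query((mid, hi), target)
            lo = mid
    return queries

rng = random.Random(0)
n, trials = 1024, 20_000
plain = sum(bisect(n, rng.randrange(n)) for _ in range(trials)) / trials
verified = sum(bisect(n, rng.randrange(n), verify=True) for _ in range(trials)) / trials
print(f"log2(N) = {math.log2(n):.1f}, plain = {plain:.2f}, "
      f"with verification = {verified:.2f} (ratio {verified/plain:.2f})")
```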
Formal verification of an avionics microprocessor
NASA Technical Reports Server (NTRS)
Srivas, Mandayam, K.; Miller, Steven P.
1995-01-01
Formal specification combined with mechanical verification is a promising approach for achieving the extremely high levels of assurance required of safety-critical digital systems. However, many questions remain regarding their use in practice: Can these techniques scale up to industrial systems, where are they likely to be useful, and how should industry go about incorporating them into practice? This report discusses a project undertaken to answer some of these questions, the formal verification of the AAMP5 microprocessor. This project consisted of formally specifying in the PVS language a Rockwell proprietary microprocessor at both the instruction-set and register-transfer levels and using the PVS theorem prover to show that the microcode correctly implemented the instruction-level specification for a representative subset of instructions. Notable aspects of this project include the use of a formal specification language by practicing hardware and software engineers, the integration of traditional inspections with formal specifications, and the use of a mechanical theorem prover to verify a portion of a commercial, pipelined microprocessor that was not explicitly designed for formal verification.
AN ENVIRONMENTAL TECHNOLOGY VERIFICATION (ETV) TESTING OF FOUR MERCURY EMISSION SAMPLING SYSTEMS
CEMs - Tekran Instrument Corp. Series 3300 and Thermo Electron's Mercury Freedom System Continuous Emission Monitors (CEMs) for mercury are designed to determine total and/or chemically speciated vapor-phase mercury in combustion emissions. Performance for mercury CEMs are cont...
FIELD VERIFICATION OF LINERS FROM SANITARY LANDFILLS
Liner specimens from three existing landfill sites were collected and examined to determine the changes in their physical properties over time and to validate data being developed through laboratory research. Samples examined included a 15-mil PVC liner from a sludge lagoon in Ne...
Some Methods for Evaluating Program Implementation.
ERIC Educational Resources Information Center
Hardy, Roy A.
An approach to evaluating program implementation is described. This approach includes the development of a project description which includes a structure matrix, sampling from the structure matrix, and preparing an implementation evaluation plan. The implementation evaluation plan should include: (1) verification of implementation of planned…
Influenza forecasting with Google Flu Trends.
Dugas, Andrea Freyer; Jalalpour, Mehdi; Gel, Yulia; Levin, Scott; Torcaso, Fred; Igusa, Takeru; Rothman, Richard E
2013-01-01
We developed a practical influenza forecast model based on real-time, geographically focused, and easy to access data, designed to provide individual medical centers with advanced warning of the expected number of influenza cases, thus allowing for sufficient time to implement interventions. Secondly, we evaluated the effects of incorporating a real-time influenza surveillance system, Google Flu Trends, and meteorological and temporal information on forecast accuracy. Forecast models designed to predict one week in advance were developed from weekly counts of confirmed influenza cases over seven seasons (2004-2011) divided into seven training and out-of-sample verification sets. Forecasting procedures using classical Box-Jenkins, generalized linear models (GLM), and generalized linear autoregressive moving average (GARMA) methods were employed to develop the final model and assess the relative contribution of external variables such as, Google Flu Trends, meteorological data, and temporal information. A GARMA(3,0) forecast model with Negative Binomial distribution integrating Google Flu Trends information provided the most accurate influenza case predictions. The model, on the average, predicts weekly influenza cases during 7 out-of-sample outbreaks within 7 cases for 83% of estimates. Google Flu Trend data was the only source of external information to provide statistically significant forecast improvements over the base model in four of the seven out-of-sample verification sets. Overall, the p-value of adding this external information to the model is 0.0005. The other exogenous variables did not yield a statistically significant improvement in any of the verification sets. Integer-valued autoregression of influenza cases provides a strong base forecast model, which is enhanced by the addition of Google Flu Trends confirming the predictive capabilities of search query based syndromic surveillance. This accessible and flexible forecast model can be used by individual medical centers to provide advanced warning of future influenza cases.
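As an illustration of the kind of count-regression model compared in the study, the sketch below fits a negative binomial GLM to weekly influenza counts using autoregressive lags and a Google Flu Trends covariate, then produces a one-week-ahead prediction. The data series, lag structure, and dispersion handling are simplified assumptions, and statsmodels' GLM stands in for the GARMA machinery described above.

```python
# Simplified sketch of a one-week-ahead count forecast in the spirit of the
# models compared above: a negative binomial GLM with autoregressive lags of
# confirmed cases plus a Google Flu Trends covariate. The series below are
# fabricated, and statsmodels' GLM stands in for the GARMA(3,0) formulation.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
weeks = 120
gft = 50 + 40 * np.sin(np.arange(weeks) / 52.0 * 2 * np.pi) + rng.normal(0, 5, weeks)
cases = rng.poisson(np.clip(0.2 * gft, 1, None))          # hypothetical weekly counts

def lagged_design(cases, gft, lags=3):
    rows, y = [], []
    for t in range(lags, len(cases)):
        rows.append(list(cases[t - lags:t]) + [gft[t - 1]])   # case lags + last week's GFT
        y.append(cases[t])
    return np.array(rows, dtype=float), np.array(y, dtype=float)

X, y = lagged_design(cases, gft)
model = sm.GLM(y, sm.add_constant(X), family=sm.families.NegativeBinomial()).fit()

# One-week-ahead forecast from the most recent observations:
x_next = sm.add_constant(np.r_[cases[-3:], gft[-1]].reshape(1, -1), has_constant='add')
print(f"predicted cases next week: {model.predict(x_next)[0]:.1f}")
```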
Qiu, Shi; Yang, Wen-Zhi; Yao, Chang-Liang; Qiu, Zhi-Dong; Shi, Xiao-Jian; Zhang, Jing-Xian; Hou, Jin-Jun; Wang, Qiu-Rong; Wu, Wan-Ying; Guo, De-An
2016-07-01
A key step in the authentication of herbal medicines is the establishment of robust biomarkers that embody the intrinsic metabolite differences independent of the growing environment or processing techniques. We present a strategy based on nontargeted metabolomics and "commercial-homophyletic" comparison-induced biomarker verification, with new bioinformatic vehicles, to improve the efficiency and reliability of the authentication of herbal medicines. The chemical differentiation of five different parts (root, leaf, flower bud, berry, and seed) of Panax ginseng is illustrated as a case study. First, an optimized ultra-performance liquid chromatography/quadrupole time-of-flight-MS(E) (UPLC/QTOF-MS(E)) approach was established for global metabolite profiling. Second, UNIFI™ combined with a search of an in-house library was employed to automatically characterize the metabolites. Third, pattern-recognition multivariate statistical analyses of the MS(E) data of different parts of commercial and homophyletic samples were performed separately to explore potential biomarkers. Fourth, potential biomarkers deduced from commercial and homophyletic root and leaf samples were cross-compared to infer robust biomarkers. Fifth, discriminating models based on an artificial neural network (ANN) were established to identify the different parts of P. ginseng. Consequently, 164 compounds were characterized, and 11 robust biomarkers enabling the differentiation among root, leaf, flower bud, and berry were discovered by removing those that were structurally unstable or possibly processing-related. The ANN models using the robust biomarkers managed to exactly discriminate the four different parts, as well as root adulterated with leaf. In conclusion, biomarker verification using homophyletic samples is conducive to the discovery of robust biomarkers. The integrated strategy facilitates authentication of herbal medicines in a more efficient and more intelligent manner. Copyright © 2016 Elsevier B.V. All rights reserved.
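The final discrimination step, training an artificial neural network on the robust biomarker intensities to classify plant parts, can be sketched with scikit-learn. The intensity matrix, class labels, and network size below are placeholders rather than the paper's data or architecture.

```python
# Sketch of the ANN discrimination step described above: train a small
# multi-layer perceptron on a (samples x robust-biomarker) intensity matrix to
# classify plant parts. Data, labels and network size are made up.
import numpy as np
from sklearn.neural_network import MLPClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
parts = ["root", "leaf", "flower_bud", "berry"]
n_per_part, n_markers = 30, 11                      # 11 robust biomarkers, as above

# Each part gets its own (hypothetical) mean intensity profile plus noise.
means = rng.uniform(1, 10, size=(len(parts), n_markers))
X = np.vstack([means[i] + rng.normal(0, 0.5, (n_per_part, n_markers))
               for i in range(len(parts))])
y = np.repeat(parts, n_per_part)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0,
                                          stratify=y)
clf = MLPClassifier(hidden_layer_sizes=(16,), max_iter=2000,
                    random_state=0).fit(X_tr, y_tr)
print(f"held-out accuracy: {clf.score(X_te, y_te):.2f}")
```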
Model Checking Satellite Operational Procedures
NASA Astrophysics Data System (ADS)
Cavaliere, Federico; Mari, Federico; Melatti, Igor; Minei, Giovanni; Salvo, Ivano; Tronci, Enrico; Verzino, Giovanni; Yushtein, Yuri
2011-08-01
We present a model checking approach for the automatic verification of satellite operational procedures (OPs). Building a model for a system as complex as a satellite is a hard task. We overcome this obstacle by using a suitable simulator (SIMSAT) for the satellite. Our approach aims at improving OP quality assurance by automatic exhaustive exploration of all possible simulation scenarios. Moreover, our solution decreases OP verification costs by using a model checker (CMurphi) to automatically drive the simulator. We model OPs as user-executed programs observing the simulator telemetries and sending telecommands to the simulator. In order to assess the feasibility of our approach we present experimental results on a simple meaningful scenario. Our results show that we can save up to 90% of verification time.
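The core idea, a model checker exhaustively driving a simulator through every reachable scenario of a procedure while checking telemetry invariants, can be sketched as an explicit-state search over simulator snapshots. The Simulator class, telecommands, and invariant below are hypothetical placeholders; they are not SIMSAT's or CMurphi's actual interfaces.

```python
# Sketch of the approach described above: a model checker exhaustively drives a
# simulator by enumerating telecommand choices at each step, snapshotting the
# simulator state, and checking a telemetry invariant in every reached state.
# The Simulator class and its commands are hypothetical stand-ins (the real
# work used SIMSAT driven by CMurphi), shown only to illustrate the search.
from collections import deque

class Simulator:
    """Toy stand-in for a satellite simulator exposing snapshot/restore."""
    def __init__(self):
        self.state = {"mode": "SAFE", "heater": "OFF", "temp": 20}

    def snapshot(self):
        return dict(self.state)

    def restore(self, snap):
        self.state = dict(snap)

    def send(self, telecommand):
        if telecommand == "HEATER_ON":
            self.state["heater"] = "ON"
            self.state["temp"] += 5
        elif telecommand == "HEATER_OFF":
            self.state["heater"] = "OFF"
            self.state["temp"] -= 5
        elif telecommand == "GO_NOMINAL":
            self.state["mode"] = "NOMINAL"

def invariant(telemetry):
    return 0 <= telemetry["temp"] <= 30          # example telemetry constraint

def explore(sim, telecommands, max_depth=4):
    """Breadth-first exploration of all telecommand sequences up to max_depth."""
    queue, visited, violations = deque([(sim.snapshot(), [])]), set(), []
    while queue:
        snap, trace = queue.popleft()
        key = tuple(sorted(snap.items()))
        if key in visited or len(trace) > max_depth:
            continue
        visited.add(key)
        if not invariant(snap):
            violations.append(trace)
            continue
        for tc in telecommands:
            sim.restore(snap)
            sim.send(tc)
            queue.append((sim.snapshot(), trace + [tc]))
    return violations

bad = explore(Simulator(), ["HEATER_ON", "HEATER_OFF", "GO_NOMINAL"])
print("violating telecommand sequences:", bad[:3])
```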
Chen, Jin-hong; Wu, Hai-yun; He, Kun-lun; He, Yao; Qin, Yin-he
2010-10-01
To establish and verify a prediction model for ischemic cardiovascular disease (ICVD) among the elderly male population covered by a health care program. Statistical analysis was carried out in May 2003 on data from physical examinations, hospitalizations in past years, questionnaires and telephone interviews; the data came from a hospital implementing the health care program. The baseline population was randomly split in a 4:1 proportion to generate a modelling group and a verification group. Baseline data from the verification group were entered into the regression model derived from the modelling group to generate predicted values. Discriminative ability was assessed by the area under the ROC curve, and predictive accuracy was verified by comparing the predicted and actual incidence rates in each decile group using the Hosmer-Lemeshow test. Predictive accuracy at the population level was verified by comparing the predicted 6-year incidence rates of ICVD with the actual 6-year cumulative incidence rates and calculating the error rate. The sample included 2271 males over the age of 65, with 1817 people in the modelling population and 454 in the verification population. The sample was stratified into two layers to establish stratified Cox proportional hazards regression models: an advanced age group (greater than or equal to 75 years old) and an elderly group (less than 75 years old). The statistical analysis showed that the risk factors in the elderly group were age, systolic blood pressure, serum creatinine level and fasting blood glucose level, while the protective factor was high density lipoprotein; in the advanced age group, the risk factors were body mass index, systolic blood pressure, serum total cholesterol level, serum creatinine level and fasting blood glucose level, while the protective factor was HDL-C. The area under the ROC curve (AUC) and its 95%CI were 0.723 and 0.687-0.759, respectively, indicating good discriminating power. The individual predicted ICVD cumulative incidences and actual incidences were analyzed using the Hosmer-Lemeshow test (χ(2) = 1.43, P = 0.786), showing that predictive accuracy was good. A stratified Cox proportional hazards regression model was used to establish the prediction model for the aged male population under a certain health care program. The common prediction factors of the two age groups were systolic blood pressure, serum creatinine level, fasting blood glucose level and HDL-C. The area under the ROC curve of the verification group was 0.723, showing that the discriminating ability was good, and the predictive ability at both the individual and group levels was also satisfactory. It is feasible to use the Cox proportional hazards regression model for prediction in these population groups.
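The modelling step, an age-group-specific Cox proportional hazards regression producing individual 6-year ICVD risk predictions, can be sketched with the lifelines package. The covariates mirror those named above, but the cohort, follow-up times, events, and coefficients below are entirely synthetic.

```python
# Sketch of the age-group-specific Cox proportional hazards modelling described
# above, using the lifelines package. Covariates follow the abstract (systolic
# blood pressure, serum creatinine, fasting glucose, HDL-C), but the cohort,
# follow-up times and events are synthetic.
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(0)
n = 500
df = pd.DataFrame({
    "sbp":        rng.normal(145, 18, n),       # systolic blood pressure, mmHg
    "creatinine": rng.normal(90, 15, n),        # umol/L
    "glucose":    rng.normal(5.8, 1.0, n),      # mmol/L
    "hdl":        rng.normal(1.2, 0.3, n),      # mmol/L (expected protective)
})
risk = 0.02 * (df["sbp"] - 145) + 0.5 * (df["glucose"] - 5.8) - 1.0 * (df["hdl"] - 1.2)
df["time"] = rng.exponential(np.exp(-risk) * 10.0)          # follow-up, years
df["event"] = (df["time"] < 6.0).astype(int)                # ICVD within 6 years
df["time"] = df["time"].clip(upper=6.0)                     # censor at 6 years

cph = CoxPHFitter().fit(df, duration_col="time", event_col="event")
cph.print_summary(decimals=2)

# Predicted 6-year ICVD risk for the first few individuals:
covariates = df[["sbp", "creatinine", "glucose", "hdl"]].head(3)
risk_6y = 1.0 - cph.predict_survival_function(covariates, times=[6.0]).T[6.0]
print(risk_6y.round(3))
```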
Melchior, P.; Gruen, D.; McClintock, T.; ...
2017-05-16
Here, we use weak-lensing shear measurements to determine the mean mass of optically selected galaxy clusters in Dark Energy Survey Science Verification data. In a blinded analysis, we split the sample of more than 8000 redMaPPer clusters into 15 subsets, spanning ranges in the richness parameter 5 ≤ λ ≤ 180 and redshift 0.2 ≤ z ≤ 0.8, and fit the averaged mass density contrast profiles with a model that accounts for seven distinct sources of systematic uncertainty: shear measurement and photometric redshift errors; cluster-member contamination; miscentring; deviations from the NFW halo profile; halo triaxiality and line-of-sight projections.
Pella, A; Riboldi, M; Tagaste, B; Bianculli, D; Desplanques, M; Fontana, G; Cerveri, P; Seregni, M; Fattori, G; Orecchia, R; Baroni, G
2014-08-01
In an increasing number of clinical indications, radiotherapy with accelerated particles shows relevant advantages when compared with high energy X-ray irradiation. However, due to the finite range of ions, particle therapy can be severely compromised by setup errors and geometric uncertainties. The purpose of this work is to describe the commissioning and the design of the quality assurance procedures for the patient positioning and setup verification systems at the Italian National Center for Oncological Hadrontherapy (CNAO). The accuracy of the systems installed at CNAO and devoted to patient positioning and setup verification was assessed using a laser tracking device. The accuracy of calibration and of image-based setup verification relying on the in-room X-ray imaging system was also quantified. Quality assurance tests to check the integration among all patient setup systems were designed, and records of daily QA tests since the start of clinical operation (2011) are presented. The overall accuracy of the patient positioning system and patient verification system motion proved to be below 0.5 mm under all the examined conditions, with median values below the 0.3 mm threshold. Image-based registration in phantom studies exhibited sub-millimetric accuracy in setup verification at both cranial and extra-cranial sites. The calibration residuals of the OTS were found to be consistent with expectations, with peak values below 0.3 mm. Quality assurance tests, performed daily before clinical operation, confirm adequate integration and sub-millimetric setup accuracy. Robotic patient positioning was successfully integrated with optical tracking and stereoscopic X-ray verification for patient setup in particle therapy. Sub-millimetric setup accuracy was achieved and consistently verified in daily clinical operation.
Hierarchical Representation Learning for Kinship Verification.
Kohli, Naman; Vatsa, Mayank; Singh, Richa; Noore, Afzel; Majumdar, Angshul
2017-01-01
Kinship verification has a number of applications such as organizing large collections of images and recognizing resemblances among humans. In this paper, first, a human study is conducted to understand the capabilities of human mind and to identify the discriminatory areas of a face that facilitate kinship-cues. The visual stimuli presented to the participants determine their ability to recognize kin relationship using the whole face as well as specific facial regions. The effect of participant gender and age and kin-relation pair of the stimulus is analyzed using quantitative measures such as accuracy, discriminability index d' , and perceptual information entropy. Utilizing the information obtained from the human study, a hierarchical kinship verification via representation learning (KVRL) framework is utilized to learn the representation of different face regions in an unsupervised manner. We propose a novel approach for feature representation termed as filtered contractive deep belief networks (fcDBN). The proposed feature representation encodes relational information present in images using filters and contractive regularization penalty. A compact representation of facial images of kin is extracted as an output from the learned model and a multi-layer neural network is utilized to verify the kin accurately. A new WVU kinship database is created, which consists of multiple images per subject to facilitate kinship verification. The results show that the proposed deep learning framework (KVRL-fcDBN) yields the state-of-the-art kinship verification accuracy on the WVU kinship database and on four existing benchmark data sets. Furthermore, kinship information is used as a soft biometric modality to boost the performance of face verification via product of likelihood ratio and support vector machine based approaches. Using the proposed KVRL-fcDBN framework, an improvement of over 20% is observed in the performance of face verification.
Cognitive Bias in the Verification and Validation of Space Flight Systems
NASA Technical Reports Server (NTRS)
Larson, Steve
2012-01-01
Cognitive bias is generally recognized as playing a significant role in virtually all domains of human decision making. Insight into this role is informally built into many of the system engineering practices employed in the aerospace industry. The review process, for example, typically has features that help to counteract the effect of bias. This paper presents a discussion of how commonly recognized biases may affect the verification and validation process. Verifying and validating a system is arguably more challenging than development, both technically and cognitively. Whereas there may be a relatively limited number of options available for the design of a particular aspect of a system, there is a virtually unlimited number of potential verification scenarios that may be explored. The probability of any particular scenario occurring in operations is typically very difficult to estimate, which increases reliance on judgment that may be affected by bias. Implementing a verification activity often presents technical challenges that, if they can be overcome at all, often result in a departure from actual flight conditions (e.g., 1-g testing, simulation, time compression, artificial fault injection) that may raise additional questions about the meaningfulness of the results, and create opportunities for the introduction of additional biases. In addition to mitigating the biases it can introduce directly, the verification and validation process must also overcome the cumulative effect of biases introduced during all previous stages of development. A variety of cognitive biases will be described, with research results for illustration. A handful of case studies will be presented that show how cognitive bias may have affected the verification and validation process on recent JPL flight projects, identify areas of strength and weakness, and identify potential changes or additions to commonly used techniques that could provide a more robust verification and validation of future systems.
Hayabusa: Navigation Challenges for Earth Return
NASA Technical Reports Server (NTRS)
Haw, Robert J.; Bhaskaran, S.; Strauss, W.; Sklyanskiy, E.; Graat, E. J.; Smith, J. J.; Menom, P.; Ardalan, S.; Ballard, C.; Williams, P.;
2011-01-01
Hayabusa was a JAXA sample-return mission to the asteroid Itokawa, navigated in part by JPL personnel. Hayabusa survived several near mission-ending failures at Itokawa, yet returned to Earth with an asteroid regolith sample on June 13, 2010. This paper describes NASA/JPL's participation during the last 100 days of the mission, wherein JPL provided tracking data and orbit determination, plus verification of maneuver design and entry, descent, and landing.
DOE Office of Scientific and Technical Information (OSTI.GOV)
V Yashchuk; R Conley; E Anderson
Verification of the reliability of metrology data from high quality X-ray optics requires that adequate methods for test and calibration of the instruments be developed. For such verification for optical surface profilometers in the spatial frequency domain, a modulation transfer function (MTF) calibration method based on binary pseudo-random (BPR) gratings and arrays has been suggested [1] and [2] and proven to be an effective calibration method for a number of interferometric microscopes, a phase shifting Fizeau interferometer, and a scatterometer [5]. Here we describe the details of development of binary pseudo-random multilayer (BPRML) test samples suitable for characterization of scanning (SEM) and transmission (TEM) electron microscopes. We discuss the results of TEM measurements with the BPRML test samples fabricated from a WiSi2/Si multilayer coating with pseudo-randomly distributed layers. In particular, we demonstrate that significant information about the metrological reliability of the TEM measurements can be extracted even when the fundamental frequency of the BPRML sample is smaller than the Nyquist frequency of the measurements. The measurements demonstrate a number of problems related to the interpretation of the SEM and TEM data. Note that similar BPRML test samples can be used to characterize X-ray microscopes. Corresponding work with X-ray microscopes is in progress.
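The reason binary pseudo-random patterns are useful MTF test targets is that a maximal-length binary sequence has an essentially flat (white) power spectrum, so any roll-off in the spectrum measured through an instrument can be read directly as the instrument's MTF. The short sketch below generates such a sequence with a linear-feedback shift register and checks the flatness of its spectrum; it is a generic illustration, not the fabrication recipe of the WiSi2/Si multilayer samples described above.

```python
# Illustration of why binary pseudo-random (BPR) patterns work as MTF test
# targets: a maximal-length LFSR sequence has a nearly flat power spectrum, so
# the spectrum measured through an instrument reveals the instrument's MTF.
import numpy as np

def m_sequence(taps=(10, 7), length=None):
    """Maximal-length sequence from a Fibonacci LFSR with the given feedback taps."""
    nbits = max(taps)
    state = [1] * nbits
    n = length or (2 ** nbits - 1)
    out = np.empty(n, dtype=int)
    for i in range(n):
        out[i] = state[-1]
        feedback = 0
        for t in taps:
            feedback ^= state[t - 1]
        state = [feedback] + state[:-1]
    return out

seq = 2 * m_sequence() - 1                      # map {0,1} -> {-1,+1}
spectrum = np.abs(np.fft.rfft(seq)) ** 2
flat_part = spectrum[1:]                        # ignore the DC bin
print(f"sequence length {seq.size}, spectral ripple "
      f"(std/mean of non-DC bins): {flat_part.std() / flat_part.mean():.3f}")
```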
Photometric redshift analysis in the Dark Energy Survey Science Verification data
NASA Astrophysics Data System (ADS)
Sánchez, C.; Carrasco Kind, M.; Lin, H.; Miquel, R.; Abdalla, F. B.; Amara, A.; Banerji, M.; Bonnett, C.; Brunner, R.; Capozzi, D.; Carnero, A.; Castander, F. J.; da Costa, L. A. N.; Cunha, C.; Fausti, A.; Gerdes, D.; Greisel, N.; Gschwend, J.; Hartley, W.; Jouvel, S.; Lahav, O.; Lima, M.; Maia, M. A. G.; Martí, P.; Ogando, R. L. C.; Ostrovski, F.; Pellegrini, P.; Rau, M. M.; Sadeh, I.; Seitz, S.; Sevilla-Noarbe, I.; Sypniewski, A.; de Vicente, J.; Abbot, T.; Allam, S. S.; Atlee, D.; Bernstein, G.; Bernstein, J. P.; Buckley-Geer, E.; Burke, D.; Childress, M. J.; Davis, T.; DePoy, D. L.; Dey, A.; Desai, S.; Diehl, H. T.; Doel, P.; Estrada, J.; Evrard, A.; Fernández, E.; Finley, D.; Flaugher, B.; Frieman, J.; Gaztanaga, E.; Glazebrook, K.; Honscheid, K.; Kim, A.; Kuehn, K.; Kuropatkin, N.; Lidman, C.; Makler, M.; Marshall, J. L.; Nichol, R. C.; Roodman, A.; Sánchez, E.; Santiago, B. X.; Sako, M.; Scalzo, R.; Smith, R. C.; Swanson, M. E. C.; Tarle, G.; Thomas, D.; Tucker, D. L.; Uddin, S. A.; Valdés, F.; Walker, A.; Yuan, F.; Zuntz, J.
2014-12-01
We present results from a study of the photometric redshift performance of the Dark Energy Survey (DES), using the early data from a Science Verification period of observations in late 2012 and early 2013 that provided science-quality images for almost 200 sq. deg. at the nominal depth of the survey. We assess the photometric redshift (photo-z) performance using about 15 000 galaxies with spectroscopic redshifts available from other surveys. These galaxies are used, in different configurations, as a calibration sample, and photo-z's are obtained and studied using most of the existing photo-z codes. A weighting method in a multidimensional colour-magnitude space is applied to the spectroscopic sample in order to evaluate the photo-z performance with sets that mimic the full DES photometric sample, which is on average significantly deeper than the calibration sample due to the limited depth of spectroscopic surveys. Empirical photo-z methods using, for instance, artificial neural networks or random forests, yield the best performance in the tests, achieving core photo-z resolutions σ68 ˜ 0.08. Moreover, the results from most of the codes, including template-fitting methods, comfortably meet the DES requirements on photo-z performance, therefore, providing an excellent precedent for future DES data sets.
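The σ68 statistic quoted above is the core width of the photo-z error distribution. A minimal sketch of one common convention, the half-width of the central 68% interval of (z_phot − z_spec)/(1 + z_spec), is given below; the DES papers may define and estimate the statistic slightly differently, and the redshift values here are synthetic.

```python
import numpy as np

def sigma_68(z_spec, z_phot):
    """Half-width of the central 68% interval of the scaled photo-z errors.

    One common convention for sigma_68; the published analysis may use a
    slightly different definition or estimator.
    """
    dz = (z_phot - z_spec) / (1.0 + z_spec)
    lo, hi = np.percentile(dz, [15.865, 84.135])
    return 0.5 * (hi - lo)

# toy usage with synthetic redshifts
rng = np.random.default_rng(0)
z_spec = rng.uniform(0.2, 1.2, size=15000)
z_phot = z_spec + 0.08 * (1 + z_spec) * rng.standard_normal(z_spec.size)
print("sigma_68 = %.3f" % sigma_68(z_spec, z_phot))
```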
Photometric redshift analysis in the Dark Energy Survey Science Verification data
Sanchez, C.; Carrasco Kind, M.; Lin, H.; ...
2014-10-09
In this study, we present results from a study of the photometric redshift performance of the Dark Energy Survey (DES), using the early data from a Science Verification period of observations in late 2012 and early 2013 that provided science-quality images for almost 200 sq. deg. at the nominal depth of the survey. We assess the photometric redshift (photo-z) performance using about 15 000 galaxies with spectroscopic redshifts available from other surveys. These galaxies are used, in different configurations, as a calibration sample, and photo-z's are obtained and studied using most of the existing photo-z codes. A weighting method in a multidimensional colour–magnitude space is applied to the spectroscopic sample in order to evaluate the photo-z performance with sets that mimic the full DES photometric sample, which is on average significantly deeper than the calibration sample due to the limited depth of spectroscopic surveys. In addition, empirical photo-z methods using, for instance, artificial neural networks or random forests, yield the best performance in the tests, achieving core photo-z resolutions σ68 ~ 0.08. Moreover, the results from most of the codes, including template-fitting methods, comfortably meet the DES requirements on photo-z performance, therefore, providing an excellent precedent for future DES data sets.
Online 3D EPID-based dose verification: Proof of concept.
Spreeuw, Hanno; Rozendaal, Roel; Olaciregui-Ruiz, Igor; González, Patrick; Mans, Anton; Mijnheer, Ben; van Herk, Marcel
2016-07-01
Delivery errors during radiotherapy may lead to medical harm and reduced life expectancy for patients. Such serious incidents can be avoided by performing dose verification online, i.e., while the patient is being irradiated, creating the possibility of halting the linac in case of a large overdosage or underdosage. The offline EPID-based 3D in vivo dosimetry system clinically employed at our institute is in principle suited for online treatment verification, provided the system is able to complete 3D dose reconstruction and verification within 420 ms, the present acquisition time of a single EPID frame. It is the aim of this study to show that our EPID-based dosimetry system can be made fast enough to achieve online 3D in vivo dose verification. The current dose verification system was sped up in two ways. First, a new software package was developed to perform all computations that are not dependent on portal image acquisition separately, thus removing the need for doing these calculations in real time. Second, the 3D dose reconstruction algorithm was sped up via a new, multithreaded implementation. Dose verification was implemented by comparing planned with reconstructed 3D dose distributions delivered to two regions in a patient: the target volume and the nontarget volume receiving at least 10 cGy. In both volumes, the mean dose is compared, while in the nontarget volume, the near-maximum dose (D2) is compared as well. The real-time dosimetry system was tested by irradiating an anthropomorphic phantom with three VMAT plans: a 6 MV head-and-neck treatment plan, a 10 MV rectum treatment plan, and a 10 MV prostate treatment plan. In all plans, two types of serious delivery errors were introduced. The functionality of automatically halting the linac was also implemented and tested. The precomputation time per treatment was ∼180 s/treatment arc, depending on gantry angle resolution. The complete processing of a single portal frame, including dose verification, took 266 ± 11 ms on a dual octocore Intel Xeon E5-2630 CPU running at 2.40 GHz. The introduced delivery errors were detected after 5-10 s irradiation time. A prototype online 3D dose verification tool using portal imaging has been developed and successfully tested for two different kinds of gross delivery errors. Thus, online 3D dose verification has been technologically achieved.
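To make the comparison step concrete, the sketch below computes the three quantities the abstract names, mean dose in the target volume, and mean dose and near-maximum dose D2 in the non-target volume receiving at least 10 cGy, from planned and reconstructed dose grids. The array layout, the 3% tolerance and the halting rule are illustrative assumptions, not the clinical system's actual parameters.

```python
import numpy as np

def verify_dose(planned, reconstructed, target_mask, dose_threshold_cgy=10.0):
    """Compare planned and EPID-reconstructed 3D dose in the two regions named in
    the abstract: the target volume (mean dose) and the non-target volume
    receiving >= 10 cGy (mean dose and near-maximum dose D2, i.e. the dose to
    the hottest 2% of voxels). The 3% tolerance is an assumed example value."""
    nontarget_mask = (~target_mask) & (planned >= dose_threshold_cgy)

    def rel_diff(a, b):
        return (a - b) / b * 100.0

    results = {
        "target_mean_%": rel_diff(reconstructed[target_mask].mean(),
                                  planned[target_mask].mean()),
        "nontarget_mean_%": rel_diff(reconstructed[nontarget_mask].mean(),
                                     planned[nontarget_mask].mean()),
        "nontarget_D2_%": rel_diff(np.percentile(reconstructed[nontarget_mask], 98),
                                   np.percentile(planned[nontarget_mask], 98)),
    }
    halt_beam = any(abs(v) > 3.0 for v in results.values())  # assumed tolerance
    return results, halt_beam

# toy usage on a random dose grid
rng = np.random.default_rng(1)
planned = rng.uniform(0, 200, size=(40, 40, 40))
target = planned > 150
recon = planned * 1.01
print(verify_dose(planned, recon, target))
```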
An experimental verification of laser-velocimeter sampling bias and its correction
NASA Technical Reports Server (NTRS)
Johnson, D. A.; Modarress, D.; Owen, F. K.
1982-01-01
The existence of 'sampling bias' in individual-realization laser velocimeter measurements is experimentally verified and shown to be independent of sample rate. The experiments were performed in a simple two-stream mixing shear flow with the standard for comparison being laser-velocimeter results obtained under continuous-wave conditions. It is also demonstrated that the errors resulting from sampling bias can be removed by a proper interpretation of the sampling statistics. In addition, data obtained in a shock-induced separated flow and in the near-wake of airfoils are presented, both bias-corrected and uncorrected, to illustrate the effects of sampling bias in the extreme.
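One common way to implement the "proper interpretation of the sampling statistics" for individual-realization data is to weight each velocity sample by the inverse of its instantaneous speed, since faster fluid sweeps more seed particles through the probe volume per unit time. The sketch below applies that McLaughlin-Tiederman-style weighting to a synthetic two-stream mixing layer; it is offered as an illustration and is not necessarily the exact estimator used in the paper.

```python
import numpy as np

def bias_corrected_stats(u, v):
    """Mean and rms of u from individual-realization LV data, weighting each
    sample by the inverse of its instantaneous speed (an assumed, commonly used
    correction for velocity sampling bias)."""
    w = 1.0 / np.hypot(u, v)              # inverse-speed weights
    w /= w.sum()
    u_mean = np.sum(w * u)
    u_rms = np.sqrt(np.sum(w * (u - u_mean) ** 2))
    return u_mean, u_rms

# naive (biased) vs corrected mean for a synthetic two-stream mixing layer
rng = np.random.default_rng(2)
u = np.concatenate([rng.normal(10, 1, 5000), rng.normal(20, 2, 5000)])
v = rng.normal(0, 1, u.size)
print("naive mean u = %.2f, bias-corrected mean u = %.2f"
      % (u.mean(), bias_corrected_stats(u, v)[0]))
```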
Version 2.0 Visual Sample Plan (VSP): UXO Module Code Description and Verification
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gilbert, Richard O.; Wilson, John E.; O'Brien, Robert F.
2003-05-06
The Pacific Northwest National Laboratory (PNNL) is developing statistical methods for determining the amount of geophysical surveys conducted along transects (swaths) that are needed to achieve specified levels of confidence of finding target areas (TAs) of anomalous readings and possibly unexploded ordnance (UXO) at closed, transferring and transferred (CTT) Department of Defense (DoD) ranges and other sites. The statistical methods developed by PNNL have been coded into the UXO module of the Visual Sample Plan (VSP) software code that is being developed by PNNL with support from the DoD, the U.S. Department of Energy (DOE), and the U.S. Environmental Protection Agency (EPA). (The VSP software and VSP Users Guide (Hassig et al, 2002) may be downloaded from http://dqo.pnl.gov/vsp.) This report describes and documents the statistical methods developed and the calculations and verification testing that have been conducted to verify that VSP's implementation of these methods is correct and accurate.
SU-E-T-49: A Multi-Institutional Study of Independent Dose Verification for IMRT
DOE Office of Scientific and Technical Information (OSTI.GOV)
Baba, H; Tachibana, H; Kamima, T
2015-06-15
Purpose: AAPM TG114 does not cover independent verification for IMRT. We conducted a study of independent dose verification for IMRT in seven institutes to show its feasibility. Methods: 384 IMRT plans for prostate and head and neck (HN) sites were collected from the institutes, where the planning was performed using Eclipse and Pinnacle3 with the two techniques of step and shoot (S&S) and sliding window (SW). All of the institutes used the same independent dose verification software program (Simple MU Analysis: SMU, Triangle Product, Ishikawa, JP), which is Clarkson-based, and CT images were used to compute radiological path length. An ion-chamber measurement in a water-equivalent slab phantom was performed to compare the doses computed using the TPS and the independent dose verification program. Additionally, the agreement in dose computed in patient CT images between the TPS and the SMU was assessed. The dose of the composite beams in the plan was evaluated. Results: The agreement between the measurement and the SMU was −2.3±1.9% and −5.6±3.6% for the prostate and HN sites, respectively. The agreement between the TPSs and the SMU was −2.1±1.9% and −3.0±3.7% for the prostate and HN sites, respectively. There was a negative systematic difference with similar standard deviation, and the difference was larger for the HN site. The S&S technique showed a statistically significant difference from the SW technique, because the Clarkson-based method in the independent program cannot consider, and therefore underestimates, the dose under the MLC. Conclusion: Accuracy would be improved if the Clarkson-based algorithm were modified for IMRT, and the tolerance level would then be within 5%.
Kuppusamy, Vijayalakshmi; Nagarajan, Vivekanandan; Jeevanandam, Prakash; Murugan, Lavanya
2016-02-01
The study aimed to compare two different monitor unit (MU) or dose verification software packages in volumetric modulated arc therapy (VMAT) using a modified Clarkson's integration technique for 6 MV photon beams. An in-house Excel spreadsheet-based monitor unit verification calculation (MUVC) program and PTW's DIAMOND secondary check software (SCS), version 6, were used as a secondary check to verify the monitor units (MU) or dose calculated by the treatment planning system (TPS). In this study 180 patients were grouped into 61 head and neck, 39 thorax and 80 pelvic sites. Verification plans were created using the PTW OCTAVIUS-4D phantom and also measured using the 729 ion-chamber detector array, with the isocentre as the point of measurement for each field. In the analysis of 154 clinically approved VMAT plans with isocentre at a region above -350 HU, using heterogeneity corrections, the in-house spreadsheet-based MUVC program and DIAMOND SCS showed good agreement with the TPS. The overall percentage average deviations for all sites were (-0.93% ± 1.59%) and (1.37% ± 2.72%) for the in-house spreadsheet-based MUVC program and DIAMOND SCS, respectively. The 26 clinically approved VMAT plans with isocentre at a region below -350 HU showed higher variations for both the in-house spreadsheet-based MUVC program and DIAMOND SCS. It can be concluded that, for patient-specific quality assurance (QA), the in-house Excel spreadsheet-based MUVC program and DIAMOND SCS can be used as a simple and fast accompaniment to measurement-based verification for plans with isocentre at a region above -350 HU. Copyright © 2016 Associazione Italiana di Fisica Medica. Published by Elsevier Ltd. All rights reserved.
Design of the software development and verification system (SWDVS) for shuttle NASA study task 35
NASA Technical Reports Server (NTRS)
Drane, L. W.; Mccoy, B. J.; Silver, L. W.
1973-01-01
An overview of the Software Development and Verification System (SWDVS) for the space shuttle is presented. The design considerations, goals, assumptions, and major features of the design are examined. A scenario that shows three persons involved in flight software development using the SWDVS in response to a program change request is developed. The SWDVS is described from the standpoint of different groups of people with different responsibilities in the shuttle program to show the functional requirements that influenced the SWDVS design. The software elements of the SWDVS that satisfy the requirements of the different groups are identified.
Final Report - Regulatory Considerations for Adaptive Systems
NASA Technical Reports Server (NTRS)
Wilkinson, Chris; Lynch, Jonathan; Bharadwaj, Raj
2013-01-01
This report documents the findings of a preliminary research study into new approaches to the software design assurance of adaptive systems. We suggest a methodology to overcome the software validation and verification difficulties posed by the underlying assumption of non-adaptive software in the requirements-based testing verification methods in RTCA/DO-178B and C. An analysis of the relevant RTCA/DO-178B and C objectives is presented showing the reasons for the difficulties that arise in showing satisfaction of the objectives and suggested additional means by which they could be satisfied. We suggest that the software design assurance problem for adaptive systems is principally one of developing correct and complete high level requirements and system level constraints that define the necessary system functional and safety properties to assure the safe use of adaptive systems. We show how analytical techniques such as model based design, mathematical modeling and formal or formal-like methods can be used to both validate the high level functional and safety requirements, establish necessary constraints and provide the verification evidence for the satisfaction of requirements and constraints that supplements conventional testing. Finally the report identifies the follow-on research topics needed to implement this methodology.
Verification of road databases using multiple road models
NASA Astrophysics Data System (ADS)
Ziems, Marcel; Rottensteiner, Franz; Heipke, Christian
2017-08-01
In this paper a new approach for automatic road database verification based on remote sensing images is presented. In contrast to existing methods, the applicability of the new approach is not restricted to specific road types, context areas or geographic regions. This is achieved by combining several state-of-the-art road detection and road verification approaches that work well under different circumstances. Each one serves as an independent module representing a unique road model and a specific processing strategy. All modules provide independent solutions for the verification problem of each road object stored in the database in the form of two probability distributions, the first one for the state of a database object (correct or incorrect), and a second one for the state of the underlying road model (applicable or not applicable). In accordance with the Dempster-Shafer Theory, both distributions are mapped to a new state space comprising the classes correct, incorrect and unknown. Statistical reasoning is applied to obtain the optimal state of a road object. A comparison with state-of-the-art road detection approaches using benchmark datasets shows that in general the proposed approach provides results with larger completeness. Additional experiments reveal that based on the proposed method a highly reliable semi-automatic approach for road database verification can be designed.
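A straightforward reading of the fusion step described above is: each module's pair of distributions becomes a belief mass over {correct, incorrect, unknown}, with all mass going to "unknown" when the module's road model does not apply, and the modules are then combined with Dempster's rule. The sketch below implements that reading; the exact mapping and decision rule in the paper may differ.

```python
from functools import reduce

def module_to_mass(p_correct, p_applicable):
    """Map one module's two probabilities to a belief mass over
    {correct, incorrect, unknown}; if the road model does not apply,
    all mass goes to 'unknown' (total ignorance)."""
    return {"correct": p_applicable * p_correct,
            "incorrect": p_applicable * (1.0 - p_correct),
            "unknown": 1.0 - p_applicable}

def combine(m1, m2):
    """Dempster's rule of combination on the frame {correct, incorrect},
    with 'unknown' playing the role of the full frame."""
    conflict = m1["correct"] * m2["incorrect"] + m1["incorrect"] * m2["correct"]
    norm = 1.0 - conflict
    return {
        "correct": (m1["correct"] * m2["correct"]
                    + m1["correct"] * m2["unknown"]
                    + m1["unknown"] * m2["correct"]) / norm,
        "incorrect": (m1["incorrect"] * m2["incorrect"]
                      + m1["incorrect"] * m2["unknown"]
                      + m1["unknown"] * m2["incorrect"]) / norm,
        "unknown": m1["unknown"] * m2["unknown"] / norm,
    }

# three road-model modules voting on one database object (probabilities assumed)
modules = [module_to_mass(0.9, 0.8), module_to_mass(0.6, 0.3), module_to_mass(0.95, 0.7)]
fused = reduce(combine, modules)
state = max(fused, key=fused.get)   # optimal state of the road object
print(fused, "->", state)
```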
NASA Astrophysics Data System (ADS)
Chen, Peng; Liu, Yuwei; Gao, Bingkun; Jiang, Chunlei
2018-03-01
A semiconductor laser employed with two-external-cavity feedback structure for laser self-mixing interference (SMI) phenomenon is investigated and analyzed. The SMI model with two directions based on F-P cavity is deduced, and numerical simulation and experimental verification were conducted. Experimental results show that the SMI with the structure of two-external-cavity feedback under weak light feedback is similar to the sum of two SMIs.
Abstract Model of the SATS Concept of Operations: Initial Results and Recommendations
NASA Technical Reports Server (NTRS)
Dowek, Gilles; Munoz, Cesar; Carreno, Victor A.
2004-01-01
An abstract mathematical model of the concept of operations for the Small Aircraft Transportation System (SATS) is presented. The concept of operations consists of several procedures that describe nominal operations for SATS. Several safety properties of the system are proven using formal techniques. The final goal of the verification effort is to show that under nominal operations, aircraft are safely separated. The abstract model was written and formally verified in the Prototype Verification System (PVS).
An evaluation of SEASAT-A candidate ocean industry economic verification experiments
NASA Technical Reports Server (NTRS)
1977-01-01
A description of the candidate economic verification experiments which could be performed with SEASAT is provided. Experiments have been identified in each of the areas of ocean-based activity that are expected to show an economic impact from the use of operational SEASAT data. Experiments have been identified in the areas of Arctic operations, the ocean fishing industry, the offshore oil and natural gas industry, as well as ice monitoring and coastal zone applications.
Bai, Zhiliang; Chen, Shili; Jia, Lecheng; Zeng, Zhoumo
2018-01-01
Embracing the fact that one can recover certain signals and images from far fewer measurements than traditional methods use, compressive sensing (CS) provides solutions to huge amounts of data collection in phased array-based material characterization. This article describes how a CS framework can be utilized to effectively compress ultrasonic phased array images in time and frequency domains. By projecting the image onto its Discrete Cosine transform domain, a novel scheme was implemented to verify the potentiality of CS for data reduction, as well as to explore its reconstruction accuracy. The results from CIVA simulations indicate that both time and frequency domain CS can accurately reconstruct array images using samples less than the minimum requirements of the Nyquist theorem. For experimental verification of three types of artificial flaws, although a considerable data reduction can be achieved with defects clearly preserved, it is currently impossible to break Nyquist limitation in the time domain. Fortunately, qualified recovery in the frequency domain makes it happen, meaning a real breakthrough for phased array image reconstruction. As a case study, the proposed CS procedure is applied to the inspection of an engine cylinder cavity containing different pit defects and the results show that orthogonal matching pursuit (OMP)-based CS guarantees the performance for real application. PMID:29738452
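The reconstruction machinery can be illustrated on a one-dimensional signal that is sparse in the Discrete Cosine domain, standing in for a single A-scan rather than a full phased-array image. The sketch below takes random time-domain samples and recovers the signal with a plain orthogonal matching pursuit; the signal model, sparsity level and number of measurements are arbitrary choices, not values from the CIVA simulations or the experiments.

```python
import numpy as np
from scipy.fft import idct

def omp(A, y, k):
    """Plain orthogonal matching pursuit: greedily pick k columns of A,
    re-fitting the coefficients by least squares at every step."""
    residual, support = y.copy(), []
    x = np.zeros(A.shape[1])
    for _ in range(k):
        support.append(int(np.argmax(np.abs(A.T @ residual))))
        coef, *_ = np.linalg.lstsq(A[:, support], y, rcond=None)
        residual = y - A[:, support] @ coef
    x[support] = coef
    return x

n, k, m = 512, 12, 128                       # signal length, sparsity, measurements
rng = np.random.default_rng(3)

# signal that is k-sparse in the DCT domain (a crude stand-in for an A-scan)
coeffs = np.zeros(n)
coeffs[rng.choice(n, k, replace=False)] = rng.normal(0, 1, k)
Psi = idct(np.eye(n), axis=0, norm="ortho")  # DCT synthesis basis
signal = Psi @ coeffs

# m << n random time-domain samples act as the compressive measurements
rows = rng.choice(n, m, replace=False)
A = Psi[rows, :]                             # sensing matrix restricted to sampled rows
y = signal[rows]

rec = Psi @ omp(A, y, k)
print("relative reconstruction error: %.2e"
      % (np.linalg.norm(rec - signal) / np.linalg.norm(signal)))
```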
Nejo, Takahide; Oya, Soichi; Tsukasa, Tsuchiya; Yamaguchi, Naomi; Matsui, Toru
2016-12-01
Several bedside approaches used in combination with thoracoabdominal X-ray are widely used to avoid severe complications that have been reported during nasogastric tube management. Although confirmation by X-ray is considered the gold standard, it is not yet perfect. We present 2 cases of rare complications in which the routine verification methods could not detect all the complications related to the nasogastric tube placement. Case 1 was a 17-year-old male who presented with a brain tumor and repeatedly required nasogastric tube placement. Despite normal auscultatory and X-ray findings, the patient's condition deteriorated rapidly after resuming the enteral nutrition (EN). Computed tomography images showed the presence of hepatic portal venous gas (HPVG). Urgent upper gastrointestinal endoscopy showed esophagogastric submucosal tunneling of the tube that required an emergency open total gastrectomy. Case 2 was a 76-year-old man with long-term EN after stroke. While the last auscultatory verification was normal, he suddenly developed extensive HPVG due to gastric mucosal injury following EN, which resulted in progressive intestinal necrosis, general peritonitis, and death. These 2 cases indicated that routine verification methods consisting of auscultation and X-ray may not be completely reliable, and the awareness of the limitations of these methods should be reaffirmed because expeditious examinations and necessary interventions are critical in preventing life-threatening complications.
30 CFR 250.913 - When must I resubmit Platform Verification Program plans?
Code of Federal Regulations, 2010 CFR
2010-07-01
... Structures Platform Verification Program § 250.913 When must I resubmit Platform Verification Program plans? (a) You must resubmit any design verification, fabrication verification, or installation verification... 30 Mineral Resources 2 2010-07-01 2010-07-01 false When must I resubmit Platform Verification...
DOE Office of Scientific and Technical Information (OSTI.GOV)
Yashchuk, V.V.; Conley, R.; Anderson, E.H.
Verification of the reliability of metrology data from high quality X-ray optics requires that adequate methods for test and calibration of the instruments be developed. For such verification for optical surface profilometers in the spatial frequency domain, a modulation transfer function (MTF) calibration method based on binarypseudo-random (BPR) gratings and arrays has been suggested and and proven to be an effective calibration method for a number of interferometric microscopes, a phase shifting Fizeau interferometer, and a scatterometer. Here we describe the details of development of binarypseudo-random multilayer (BPRML) test samples suitable for characterization of scanning (SEM) and transmission (TEM) electronmore » microscopes. We discuss the results of TEM measurements with the BPRML test samples fabricated from a WiSi{sub 2}/Si multilayer coating with pseudo-randomly distributed layers. In particular, we demonstrate that significant information about the metrological reliability of the TEM measurements can be extracted even when the fundamental frequency of the BPRML sample is smaller than the Nyquist frequency of the measurements. The measurements demonstrate a number of problems related to the interpretation of the SEM and TEM data. Note that similar BPRML testsamples can be used to characterize X-ray microscopes. Corresponding work with X-ray microscopes is in progress.« less
Biometric verification in dynamic writing
NASA Astrophysics Data System (ADS)
George, Susan E.
2002-03-01
Pen-tablet devices capable of capturing the dynamics of writing record temporal and pressure information as well as the spatial pattern. This paper explores biometric verification based upon the dynamics of writing where writers are distinguished not on the basis of what they write (i.e., the signature), but how they write. We have collected samples of dynamic writing from 38 Chinese writers. Each writer was asked to provide 10 copies of a paragraph of text and the same number of signature samples. From the data we have extracted stroke-based primitives from the sentence data utilizing pen-up/down information and heuristic rules about the shape of the character. The x, y and pressure values of each primitive were interpolated into an even temporal range based upon a 20 msec sampling rate. We applied the Daubechies 1 wavelet transform to the x signal, y signal and pressure signal using the coefficients as inputs to a multi-layer perceptron trained with back-propagation on the sentence data. We found a sensitivity of 0.977 and specificity of 0.990 recognizing writers based on test primitives extracted from sentence data and measures of 0.916 and 0.961 respectively, from test primitives extracted from signature data.
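A rough sketch of the feature pipeline described above, resampling each stroke primitive onto a fixed temporal grid, taking Daubechies-1 wavelet coefficients of the x, y and pressure channels, and feeding them to a multi-layer perceptron, is given below. The strokes are synthetic and the resampling length, network size and two-writer setup are assumptions for illustration; the study's 20 msec resampling and training details are not reproduced.

```python
import numpy as np
import pywt
from sklearn.neural_network import MLPClassifier

def primitive_features(x, y, p, n_samples=32):
    """Feature vector for one stroke primitive: resample x, y and pressure onto a
    fixed-length grid, then take Daubechies-1 (Haar) wavelet coefficients of each
    channel and concatenate them."""
    t_old = np.linspace(0.0, 1.0, len(x))
    t_new = np.linspace(0.0, 1.0, n_samples)
    feats = []
    for sig in (x, y, p):
        resampled = np.interp(t_new, t_old, sig)
        cA, cD = pywt.dwt(resampled, "db1")   # approximation + detail coefficients
        feats.extend(cA)
        feats.extend(cD)
    return np.asarray(feats)

# toy training set: two synthetic "writers" with different stroke dynamics
rng = np.random.default_rng(4)
X, labels = [], []
for writer, speed in enumerate((1.0, 1.6)):
    for _ in range(50):
        t = np.linspace(0, 1, 60)
        x = np.sin(2 * np.pi * speed * t) + 0.05 * rng.standard_normal(t.size)
        y = np.cos(2 * np.pi * speed * t) + 0.05 * rng.standard_normal(t.size)
        p = 0.5 + 0.3 * np.sin(4 * np.pi * speed * t)
        X.append(primitive_features(x, y, p))
        labels.append(writer)

clf = MLPClassifier(hidden_layer_sizes=(20,), max_iter=2000, random_state=0)
clf.fit(np.asarray(X), labels)
print("training accuracy:", clf.score(np.asarray(X), labels))
```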
Atmospheric transport modelling in support of CTBT verification—overview and basic concepts
NASA Astrophysics Data System (ADS)
Wotawa, Gerhard; De Geer, Lars-Erik; Denier, Philippe; Kalinowski, Martin; Toivonen, Harri; D'Amours, Real; Desiato, Franco; Issartel, Jean-Pierre; Langer, Matthias; Seibert, Petra; Frank, Andreas; Sloan, Craig; Yamazawa, Hiromi
Under the provisions of the Comprehensive Nuclear-Test-Ban Treaty (CTBT), a global monitoring system comprising different verification technologies is currently being set up. The network will include 80 radionuclide (RN) stations distributed all over the globe that measure treaty-relevant radioactive species. While the seismic subsystem cannot distinguish between chemical and nuclear explosions, RN monitoring would provide the "smoking gun" of a possible treaty violation. Atmospheric transport modelling (ATM) will be an integral part of CTBT verification, since it provides a geo-temporal location capability for the RN technology. In this paper, the basic concept for the future ATM software system to be installed at the International Data Centre is laid out. The system is based on the operational computation of multi-dimensional source-receptor sensitivity fields for all RN samples by means of adjoint tracer transport modelling. While the source-receptor matrix methodology has already been applied in the past, the system that we suggest will be unique and unprecedented, since it is global, real-time and aims at uncovering source scenarios that are compatible with measurements. Furthermore, it has to deal with source dilution ratios that are by orders of magnitude larger than in typical transport model applications. This new verification software will need continuous scientific attention, and may well provide a prototype system for future applications in areas of environmental monitoring, emergency response and verification of other international agreements and treaties.
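The source-receptor formalism can be pictured as a linear relation between candidate emissions and the concentrations seen at the sampling stations. The toy sketch below builds a random sensitivity matrix, standing in for fields computed by adjoint transport runs, and looks for non-negative source scenarios compatible with synthetic measurements via least squares; the real system works with multi-dimensional fields and far more careful compatibility analysis.

```python
import numpy as np
from scipy.optimize import nnls

# Toy source-receptor sensitivity matrix: srs[i, j] is the concentration that
# sample i would see per unit emission from candidate source location/time j.
# In the real system these sensitivities come from adjoint (backward) transport
# runs, one per radionuclide sample; here they are random placeholders.
rng = np.random.default_rng(5)
n_samples, n_sources = 30, 200
srs = rng.exponential(1e-3, size=(n_samples, n_sources))

# synthetic "truth": a single release at one grid cell, plus measurement noise
true_release = np.zeros(n_sources)
true_release[42] = 5.0e3
measured = srs @ true_release * (1 + 0.05 * rng.standard_normal(n_samples))

# one simple way to look for compatible source scenarios: non-negative least squares
estimate, residual = nnls(srs, measured)
print("top candidate source cells:", np.argsort(estimate)[-3:][::-1])
```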
7 CFR 245.6a - Verification requirements.
Code of Federal Regulations, 2014 CFR
2014-01-01
... AGRICULTURE CHILD NUTRITION PROGRAMS DETERMINING ELIGIBILITY FOR FREE AND REDUCED PRICE MEALS AND FREE MILK IN... agencies serving foster, homeless, migrant, or runaway children, as defined in § 245.2. Agency records may...) General. The local educational agency must verify eligibility of children in a sample of household...
36 CFR 1237.28 - What special concerns apply to digital photographs?
Code of Federal Regulations, 2012 CFR
2012-07-01
... defects, evaluate the accuracy of finding aids, and verify file header information and file name integrity... sampling methods or more comprehensive verification systems (e.g., checksum programs), to evaluate image.... For permanent or unscheduled images descriptive elements must include: (1) An identification number...
36 CFR § 1237.28 - What special concerns apply to digital photographs?
Code of Federal Regulations, 2013 CFR
2013-07-01
... defects, evaluate the accuracy of finding aids, and verify file header information and file name integrity... sampling methods or more comprehensive verification systems (e.g., checksum programs), to evaluate image.... For permanent or unscheduled images descriptive elements must include: (1) An identification number...
36 CFR 1237.28 - What special concerns apply to digital photographs?
Code of Federal Regulations, 2014 CFR
2014-07-01
... defects, evaluate the accuracy of finding aids, and verify file header information and file name integrity... sampling methods or more comprehensive verification systems (e.g., checksum programs), to evaluate image.... For permanent or unscheduled images descriptive elements must include: (1) An identification number...
36 CFR 1237.28 - What special concerns apply to digital photographs?
Code of Federal Regulations, 2010 CFR
2010-07-01
... defects, evaluate the accuracy of finding aids, and verify file header information and file name integrity... sampling methods or more comprehensive verification systems (e.g., checksum programs), to evaluate image.... For permanent or unscheduled images descriptive elements must include: (1) An identification number...
36 CFR 1237.28 - What special concerns apply to digital photographs?
Code of Federal Regulations, 2011 CFR
2011-07-01
... defects, evaluate the accuracy of finding aids, and verify file header information and file name integrity... sampling methods or more comprehensive verification systems (e.g., checksum programs), to evaluate image.... For permanent or unscheduled images descriptive elements must include: (1) An identification number...
40 CFR 61.345 - Standards: Containers.
Code of Federal Regulations, 2011 CFR
2011-07-01
...., bungs, hatches, and sampling ports) shall be designed to operate with no detectable emissions as...) The total enclosure must be designed and operated in accordance with the criteria for a permanent total enclosure as specified in section 5 of the “Procedure T—Criteria for and Verification of a...
Climate Verification Using Running Mann Whitney Z Statistics
USDA-ARS?s Scientific Manuscript database
A robust method previously used to detect observed intra- to multi-decadal (IMD) climate regimes was adapted to test whether climate models could reproduce IMD variations in U.S. surface temperatures during 1919-2008. This procedure, called the running Mann Whitney Z (MWZ) method, samples data ranki...
NASA Astrophysics Data System (ADS)
Martinez, J. C.; Guzmán-Sepúlveda, J. R.; Bolañoz Evia, G. R.; Córdova, T.; Guzmán-Cabrera, R.
2018-06-01
In this work, we applied machine learning techniques to Raman spectra for the characterization and classification of manufactured pharmaceutical products. Our measurements were taken with commercial equipment, for accurate assessment of variations with respect to one calibrated control sample. Unlike the typical use of Raman spectroscopy in pharmaceutical applications, in our approach the principal components of the Raman spectrum are used concurrently as attributes in machine learning algorithms. This permits an efficient comparison and classification of the spectra measured from the samples under study. This also allows for accurate quality control as all relevant spectral components are considered simultaneously. We demonstrate our approach with respect to the specific case of acetaminophen, which is one of the most widely used analgesics in the market. In the experiments, commercial samples from thirteen different laboratories were analyzed and compared against a control sample. The raw data were analyzed based on an arithmetic difference between the nominal active substance and the measured values in each commercial sample. The principal component analysis was applied to the data for quantitative verification (i.e., without considering the actual concentration of the active substance) of the difference in the calibrated sample. Our results show that by following this approach adulterations in pharmaceutical compositions can be clearly identified and accurately quantified.
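A minimal sketch of the analysis pattern described above, principal components of the spectra used concurrently as attributes for a classifier, is shown below on synthetic spectra. The peak positions, noise level and choice of a linear support vector machine are assumptions for illustration; they do not reproduce the acetaminophen data or the specific algorithms of the study.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.svm import SVC
from sklearn.pipeline import make_pipeline

# Synthetic stand-in for Raman spectra: each row is one measured spectrum.
# Class 0 mimics control-like samples, class 1 samples with a shifted, weaker band.
rng = np.random.default_rng(6)
shifts = np.linspace(400, 1800, 700)                  # Raman shift axis, cm^-1

def spectrum(center, strength):
    return strength * np.exp(-0.5 * ((shifts - center) / 15.0) ** 2) \
        + 0.02 * rng.standard_normal(shifts.size)

X = np.vstack([spectrum(1650, 1.0) for _ in range(60)] +
              [spectrum(1640, 0.8) for _ in range(60)])
y = np.array([0] * 60 + [1] * 60)

# principal components of the spectra used concurrently as machine-learning attributes
model = make_pipeline(PCA(n_components=5), SVC(kernel="linear"))
model.fit(X, y)
print("training accuracy:", model.score(X, y))
```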
Dukić, Lora; Simundić, Ana-Maria; Malogorski, Davorin
2014-01-01
Sample type recommended by the manufacturer for the digoxin Abbott assay is either serum collected in glass tubes or plasma (sodium heparin, lithium heparin, citrate, EDTA or oxalate as anticoagulant) collected in plastic tubes. In our hospital samples are collected in plastic tubes. Our hypothesis was that the serum sample collected in plastic serum tube can be used interchangeably with plasma sample for measurement of digoxin concentration. Our aim was verification of plastic serum tubes for determination of digoxin concentration. Concentration of digoxin was determined simultaneously in 26 venous blood plasma (plastic Vacuette, LH Lithium heparin) and serum (plastic Vacuette, Z Serum Clot activator; both Greiner Bio-One GmbH, Kremsmünster, Austria) samples, on Abbott AxSYM analyzer using the original Abbott Digoxin III assay (Abbott, Wiesbaden, Germany). Tube comparability was assessed using the Passing Bablok regression and Bland-Altman plot. Serum and plasma digoxin concentrations are comparable. Passing Bablok intercept (0.08 [95% CI = -0.10 to 0.20]) and slope (0.99 [95% CI = 0.92 to 1.11]) showed there is no constant or proportional error. Blood samples drawn in plastic serum tubes and plastic plasma tubes can be interchangeably used for determination of digoxin concentration.
Dukić, Lora; Šimundić, Ana-Maria; Malogorski, Davorin
2014-01-01
Introduction: Sample type recommended by the manufacturer for the digoxin Abbott assay is either serum collected in glass tubes or plasma (sodium heparin, lithium heparin, citrate, EDTA or oxalate as anticoagulant) collected in plastic tubes. In our hospital samples are collected in plastic tubes. Our hypothesis was that the serum sample collected in plastic serum tube can be used interchangeably with plasma sample for measurement of digoxin concentration. Our aim was verification of plastic serum tubes for determination of digoxin concentration. Materials and methods: Concentration of digoxin was determined simultaneously in 26 venous blood plasma (plastic Vacuette, LH Lithium heparin) and serum (plastic Vacuette, Z Serum Clot activator; both Greiner Bio-One GmbH, Kremsmünster, Austria) samples, on Abbott AxSYM analyzer using the original Abbott Digoxin III assay (Abbott, Wiesbaden, Germany). Tube comparability was assessed using the Passing Bablok regression and Bland-Altman plot. Results: Serum and plasma digoxin concentrations are comparable. Passing Bablok intercept (0.08 [95% CI = −0.10 to 0.20]) and slope (0.99 [95% CI = 0.92 to 1.11]) showed there is no constant or proportional error. Conclusion: Blood samples drawn in plastic serum tubes and plastic plasma tubes can be interchangeably used for determination of digoxin concentration. PMID:24627723
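For readers unfamiliar with the Passing-Bablok procedure used in both versions of this abstract, the sketch below computes its slope and intercept point estimates from paired measurements (confidence intervals and full tie handling omitted). The digoxin concentrations in the usage example are fabricated, not data from the study.

```python
import numpy as np

def passing_bablok(x, y):
    """Simplified Passing-Bablok regression (point estimates only).
    Pairwise slopes equal to -1 are discarded and the median is shifted by the
    number of slopes below -1, following the original procedure; tie handling
    is simplified relative to a full implementation."""
    x, y = np.asarray(x, float), np.asarray(y, float)
    slopes = []
    n = len(x)
    for i in range(n - 1):
        for j in range(i + 1, n):
            dx = x[j] - x[i]
            if dx != 0:
                s = (y[j] - y[i]) / dx
                if s != -1.0:
                    slopes.append(s)
    slopes = np.sort(slopes)
    k = int(np.sum(slopes < -1.0))          # offset for strongly negative slopes
    m = len(slopes)
    if m % 2:
        slope = slopes[(m - 1) // 2 + k]
    else:
        slope = 0.5 * (slopes[m // 2 - 1 + k] + slopes[m // 2 + k])
    intercept = np.median(y - slope * x)
    return slope, intercept

# toy usage: plasma vs serum digoxin concentrations (nmol/L, fabricated numbers)
plasma = np.array([0.8, 1.1, 1.4, 1.9, 2.3, 2.7, 3.1, 3.6])
serum = plasma * 0.99 + 0.08 + np.random.default_rng(7).normal(0, 0.05, 8)
print("slope %.2f, intercept %.2f" % passing_bablok(plasma, serum))
```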
Two-Level Verification of Data Integrity for Data Storage in Cloud Computing
NASA Astrophysics Data System (ADS)
Xu, Guangwei; Chen, Chunlin; Wang, Hongya; Zang, Zhuping; Pang, Mugen; Jiang, Ping
Data storage in cloud computing can save capital expenditure and relieve the burden of storage management for users. As loss or corruption of stored files may happen, many researchers focus on the verification of data integrity. However, massive users often bring large numbers of verifying tasks for the auditor. Moreover, users also need to pay extra fees for these verifying tasks beyond the storage fee. Therefore, we propose a two-level verification of data integrity to alleviate these problems. The key idea is to routinely verify the data integrity by users and arbitrate the challenge between the user and cloud provider by the auditor according to the MACs and ϕ values. The extensive performance simulations show that the proposed scheme obviously decreases the auditor's verifying tasks and the ratio of wrong arbitration.
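Only the first, user-side level of the scheme is easy to sketch from the abstract: the user keeps one MAC per outsourced block and can recompute it on any block returned by the cloud. The example below uses HMAC-SHA256 for that routine check; the ϕ values and the auditor's arbitration protocol are specific to the paper and are not reproduced here.

```python
import hmac, hashlib, os, secrets

def make_tags(blocks, key):
    """User-side preprocessing before upload: one MAC per data block, bound to the
    block index so blocks cannot be swapped undetected."""
    return [hmac.new(key, i.to_bytes(8, "big") + b, hashlib.sha256).digest()
            for i, b in enumerate(blocks)]

def user_check(index, block_from_cloud, tags, key):
    """Routine first-level verification by the user on a challenged block."""
    expected = hmac.new(key, index.to_bytes(8, "big") + block_from_cloud,
                        hashlib.sha256).digest()
    return hmac.compare_digest(expected, tags[index])

key = secrets.token_bytes(32)
blocks = [os.urandom(1024) for _ in range(16)]          # the outsourced file, in blocks
tags = make_tags(blocks, key)                           # retained by the user (or auditor)

print(user_check(3, blocks[3], tags, key))                    # True: block intact
print(user_check(3, b"tampered" + blocks[3][8:], tags, key))  # False: corruption detected
```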
DOE Office of Scientific and Technical Information (OSTI.GOV)
Nabavizadeh, Nima, E-mail: nabaviza@ohsu.edu; Elliott, David A.; Chen, Yiyi
Purpose: To survey image guided radiation therapy (IGRT) practice patterns, as well as IGRT's impact on clinical workflow and planning treatment volumes (PTVs). Methods and Materials: A sample of 5979 treatment site-specific surveys was e-mailed to the membership of the American Society for Radiation Oncology (ASTRO), with questions pertaining to IGRT modality/frequency, PTV expansions, method of image verification, and perceived utility/value of IGRT. On-line image verification was defined as images obtained and reviewed by the physician before treatment. Off-line image verification was defined as images obtained before treatment and then reviewed by the physician before the next treatment. Results: Of 601 evaluable responses, 95% reported IGRT capabilities other than portal imaging. The majority (92%) used volumetric imaging (cone-beam CT [CBCT] or megavoltage CT), with volumetric imaging being the most commonly used modality for all sites except breast. The majority of respondents obtained daily CBCTs for head and neck intensity modulated radiation therapy (IMRT), lung 3-dimensional conformal radiation therapy or IMRT, anus or pelvis IMRT, prostate IMRT, and prostatic fossa IMRT. For all sites, on-line image verification was most frequently performed during the first few fractions only. No association was seen between IGRT frequency or CBCT utilization and clinical treatment volume to PTV expansions. Of the 208 academic radiation oncologists who reported working with residents, only 41% reported trainee involvement in IGRT verification processes. Conclusion: Consensus guidelines, further evidence-based approaches for PTV margin selection, and greater resident involvement are needed for standardized use of IGRT practices.
Nabavizadeh, Nima; Elliott, David A; Chen, Yiyi; Kusano, Aaron S; Mitin, Timur; Thomas, Charles R; Holland, John M
2016-03-15
To survey image guided radiation therapy (IGRT) practice patterns, as well as IGRT's impact on clinical workflow and planning treatment volumes (PTVs). A sample of 5979 treatment site-specific surveys was e-mailed to the membership of the American Society for Radiation Oncology (ASTRO), with questions pertaining to IGRT modality/frequency, PTV expansions, method of image verification, and perceived utility/value of IGRT. On-line image verification was defined as images obtained and reviewed by the physician before treatment. Off-line image verification was defined as images obtained before treatment and then reviewed by the physician before the next treatment. Of 601 evaluable responses, 95% reported IGRT capabilities other than portal imaging. The majority (92%) used volumetric imaging (cone-beam CT [CBCT] or megavoltage CT), with volumetric imaging being the most commonly used modality for all sites except breast. The majority of respondents obtained daily CBCTs for head and neck intensity modulated radiation therapy (IMRT), lung 3-dimensional conformal radiation therapy or IMRT, anus or pelvis IMRT, prostate IMRT, and prostatic fossa IMRT. For all sites, on-line image verification was most frequently performed during the first few fractions only. No association was seen between IGRT frequency or CBCT utilization and clinical treatment volume to PTV expansions. Of the 208 academic radiation oncologists who reported working with residents, only 41% reported trainee involvement in IGRT verification processes. Consensus guidelines, further evidence-based approaches for PTV margin selection, and greater resident involvement are needed for standardized use of IGRT practices. Copyright © 2016 Elsevier Inc. All rights reserved.
Autoverification process improvement by Six Sigma approach: Clinical chemistry & immunoassay.
Randell, Edward W; Short, Garry; Lee, Natasha; Beresford, Allison; Spencer, Margaret; Kennell, Marina; Moores, Zoë; Parry, David
2018-05-01
This study examines the effectiveness of a project to enhance an autoverification (AV) system through application of Six Sigma (DMAIC) process improvement strategies. Similar AV systems set up at three sites underwent examination and modification to produce improved systems while monitoring proportions of samples autoverified, the time required for manual review and verification, sample processing time, and examining characteristics of tests not autoverified. This information was used to identify areas for improvement and monitor the impact of changes. Use of reference range based criteria had the greatest impact on the proportion of tests autoverified. To improve the AV process, reference range based criteria were replaced with extreme value limits based on a 99.5% test result interval, delta check criteria were broadened, and new specimen consistency rules were implemented. Decision guidance tools were also developed to assist staff using the AV system. The mean proportion of tests and samples autoverified improved from <62% for samples and <80% for tests, to >90% for samples and >95% for tests across all three sites. The new AV system significantly decreased turn-around time and total sample review time (to about a third); however, time spent for manual review of held samples almost tripled. There was no evidence of compromise to the quality of the testing process, and <1% of samples held for exceeding delta check or extreme limits required corrective action. The Six Sigma (DMAIC) process improvement methodology was successfully applied to AV systems, resulting in an increase in overall test and sample AV to >90%, improved turn-around time, reduced time for manual verification, and no obvious compromise to quality or error detection. Copyright © 2018 The Canadian Society of Clinical Chemists. Published by Elsevier Inc. All rights reserved.
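The rule changes described above, extreme-value limits replacing reference-range holds plus broadened delta checks, lend themselves to a very small rule engine. The sketch below shows the general shape of such logic; the analytes, limits and delta thresholds are placeholders, not the validated values from the study.

```python
from dataclasses import dataclass
from typing import Optional

# Illustrative autoverification rules in the spirit of the revised system:
# extreme-value limits plus a delta check against the patient's previous result.
# The numbers below are placeholders, not the study's validated limits.
EXTREME_LIMITS = {"K": (1.5, 10.0), "UREA": (0.5, 60.0)}   # mmol/L
DELTA_LIMITS = {"K": 2.0, "UREA": 15.0}                     # allowed absolute change

@dataclass
class Result:
    test: str
    value: float
    previous: Optional[float] = None   # most recent result for the same patient

def autoverify(r: Result):
    """Return (released, reason). A result is held for manual review only if it
    falls outside the extreme-value interval or fails the delta check."""
    lo, hi = EXTREME_LIMITS[r.test]
    if not (lo <= r.value <= hi):
        return False, "outside extreme-value limits"
    if r.previous is not None and abs(r.value - r.previous) > DELTA_LIMITS[r.test]:
        return False, "delta check failed"
    return True, "auto-verified"

print(autoverify(Result("K", 4.1, previous=4.3)))    # released
print(autoverify(Result("K", 7.2, previous=4.3)))    # held: delta check
print(autoverify(Result("UREA", 75.0)))              # held: extreme value
```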
Generic Verification Protocol for Verification of Online Turbidimeters
This protocol provides generic procedures for implementing a verification test for the performance of online turbidimeters. The verification tests described in this document will be conducted under the Environmental Technology Verification (ETV) Program. Verification tests will...
NASA Astrophysics Data System (ADS)
Lin, Y. Q.; Ren, W. X.; Fang, S. E.
2011-11-01
Although most vibration-based damage detection methods can be verified satisfactorily on analytical or numerical structures, most of them may encounter problems when applied to real-world structures under varying environments. Damage detection methods that directly extract damage features from periodically sampled dynamic time history response measurements are desirable, but relevant research and field application verification are still lacking. In this second part of a two-part paper, the robustness and performance of the statistics-based damage index using the forward innovation model by stochastic subspace identification of a vibrating structure proposed in the first part have been investigated against two prestressed reinforced concrete (RC) beams tested in the laboratory and a full-scale RC arch bridge tested in the field under varying environments. Experimental verification is focused on temperature effects. It is demonstrated that the proposed statistics-based damage index is insensitive to temperature variations but sensitive to structural deterioration or state alteration. This makes it possible to detect structural damage for real-scale structures experiencing ambient excitations and varying environmental conditions.
Particle shape accounts for instrumental discrepancy in ice core dust size distributions
NASA Astrophysics Data System (ADS)
Folden Simonsen, Marius; Cremonesi, Llorenç; Baccolo, Giovanni; Bosch, Samuel; Delmonte, Barbara; Erhardt, Tobias; Kjær, Helle Astrid; Potenza, Marco; Svensson, Anders; Vallelonga, Paul
2018-05-01
The Klotz Abakus laser sensor and the Coulter counter are both used for measuring the size distribution of insoluble mineral dust particles in ice cores. While the Coulter counter measures particle volume accurately, the equivalent Abakus instrument measurement deviates substantially from the Coulter counter. We show that the difference between the Abakus and the Coulter counter measurements is mainly caused by the irregular shape of dust particles in ice core samples. The irregular shape means that a new calibration routine based on standard spheres is necessary for obtaining fully comparable data. This new calibration routine gives an increased accuracy to Abakus measurements, which may improve future ice core record intercomparisons. We derived an analytical model for extracting the aspect ratio of dust particles from the difference between Abakus and Coulter counter data. For verification, we measured the aspect ratio of the same samples directly using a single-particle extinction and scattering instrument. The results demonstrate that the model is accurate enough to discern between samples of aspect ratio 0.3 and 0.4 using only the comparison of Abakus and Coulter counter data.
Determination of the depth dose distribution of proton beam using PRESAGE™ dosimeter
NASA Astrophysics Data System (ADS)
Zhao, L.; Das, I. J.; Zhao, Q.; Thomas, A.; Adamovics, J.; Oldman, M.
2010-11-01
The PRESAGE™ dosimeter has proved useful for 3D dosimetry in conventional photon therapy and IMRT [1-5]. Our objective is to examine the use of the PRESAGE™ dosimeter for verification of depth dose distribution in proton beam therapy. Three PRESAGE™ samples were irradiated with a 79 MeV un-modulated proton beam. The percent depth dose profile measured from the PRESAGE™ dosimeter is compared with data obtained in a water phantom using a parallel plate Advanced Markus chamber. The Bragg-peak position determined from the PRESAGE™ is within 2 mm of the measurements in water. PRESAGE™ shows a highly linear response to proton dose. However, PRESAGE™ also reveals an underdosage around the Bragg peak position due to LET effects. The depth scaling factor and quenching correction factor need further investigation. Our initial results show that PRESAGE™ has promising dosimetric characteristics that could be suitable for proton beam dosimetry.
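Comparing the Bragg-peak position between the dosimeter readout and the reference depth-dose curve amounts to locating the dose maximum along depth in each data set. The sketch below does this on synthetic curves; the curve shapes, the shift and the quenched peak height are illustrative only and do not come from the measurements reported above.

```python
import numpy as np

def bragg_peak_depth(depth_mm, dose):
    """Depth at which the measured dose is maximal (the Bragg-peak position)."""
    return depth_mm[np.argmax(dose)]

# synthetic depth-dose curves: a reference (e.g. Markus chamber in water) and a
# slightly shifted, quenched dosimeter readout
depth = np.arange(0.0, 55.0, 0.5)                        # mm
reference = np.exp(-((depth - 48.0) / 3.0) ** 2) + 0.30  # crude Bragg-like shape
presage = np.exp(-((depth - 46.5) / 3.0) ** 2) + 0.28    # shifted/quenched readout

shift = bragg_peak_depth(depth, presage) - bragg_peak_depth(depth, reference)
print("Bragg-peak position difference: %.1f mm" % shift)
```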
NASA/BLM APT, phase 2. Volume 2: Technology demonstration. [Arizona
NASA Technical Reports Server (NTRS)
1981-01-01
Techniques described include: (1) steps in the preprocessing of LANDSAT data; (2) the training of a classifier; (3) maximum likelihood classification and precision; (4) geometric correction; (5) class description; (6) digitizing; (7) digital terrain data; (8) an overview of sample design; (9) allocation and selection of primary sample units; (10) interpretation of secondary sample units; (11) data collection on ground plots; (12) data reductions; (13) analysis for productivity estimation and map verification; (14) cost analysis; and (15) LANDSAT digital products. The evaluation of the pre-inventory planning for P.J. is included.
Verification Games: Crowd-Sourced Formal Verification
2016-03-01
Final technical report, University of Washington, March 2016; dates covered June 2012 – September 2015; contract number FA8750... Abstract: Over the more than three years of the project Verification Games: Crowd-sourced
Simple method to verify OPC data based on exposure condition
NASA Astrophysics Data System (ADS)
Moon, James; Ahn, Young-Bae; Oh, Sey-Young; Nam, Byung-Ho; Yim, Dong Gyu
2006-03-01
In a world where sub-100nm lithography tools are everyday household items for device makers, devices are shrinking at a rate no one ever imagined. With the device shrinking at such a high rate, the demand placed on Optical Proximity Correction (OPC) is like never before. To meet this demand with respect to the shrinkage rate of the device, more aggressive OPC tactics are involved. Aggressive OPC tactics are a must for sub-100nm lithography, but they eventually result in greater room for OPC error and greater complexity of the OPC data. Until now, Optical Rule Check (ORC) or Design Rule Check (DRC) was used to verify this complex OPC error, but each of these methods has its pros and cons. ORC verification of OPC data is accurate process-wise, but inspection of a full-chip device requires a lot of money (computers, software, ...) and patience (run time). DRC has no such disadvantage, but the accuracy of the verification is a total downfall process-wise. In this study, we were able to create a new method for OPC data verification that combines the best of both the ORC and DRC verification methods. We created a method that inspects the biasing of the OPC data with respect to the illumination condition of the process involved. This new method of verification was applied to the 80nm-tech ISOLATION and GATE layers of a 512M DRAM device and showed accuracy equivalent to ORC inspection with the run time of DRC verification.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Dewberry, R.; Ayers, J.; Tietze, F.
The Analytical Development (AD) Section field nuclear measurement group performed six 'best available technique' verification measurements to satisfy a DOE requirement instituted for the March 2009 semi-annual inventory. The requirement of reference 1 yielded the need for the SRNL Research Operations Department Material Control & Accountability (MC&A) group to measure the Pu content of five items and the highly enriched uranium (HEU) content of two. No 14Q-qualified measurement equipment was available to satisfy the requirement. The AD field nuclear group has routinely performed the required confirmatory measurements for the semi-annual inventories for fifteen years using sodium iodide and high purity germanium (HpGe) γ-ray pulse height analysis nondestructive assay (NDA) instruments. With appropriate γ-ray acquisition modeling, the HpGe spectrometers can be used to perform verification-type quantitative assay for Pu isotopics and HEU content. The AD nuclear NDA group is widely experienced with this type of measurement and reports content for these species in requested process control, MC&A booking, and holdup measurement assays Site-wide. However, none of the AD HpGe γ-ray spectrometers have been 14Q-qualified, and the requirement of reference 1 specifically excluded a γ-ray PHA measurement from those it would accept for the required verification measurements. The requirement of reference 1 was a new requirement for which the Savannah River National Laboratory (SRNL) Research Operations Department (ROD) MC&A group was unprepared. The criteria for exemption from verification were: (1) isotope content below 50 grams; (2) intrinsically tamper-indicating or TID-sealed items which contain a Category IV quantity of material; (3) assembled components; and (4) laboratory samples. Therefore all SRNL Material Balance Area (MBA) items with greater than 50 grams total Pu or greater than 50 grams HEU were subject to a verification measurement. The pass/fail criteria of reference 7 stated 'The facility will report measured values, book values, and statistical control limits for the selected items to DOE SR...', and 'The site/facility operator must develop, document, and maintain measurement methods for all nuclear material on inventory'. These new requirements exceeded SRNL's experience with prior semi-annual inventory expectations, but allowed the AD nuclear field measurement group to demonstrate its adaptability and flexibility in responding to unpredicted expectations from the DOE customer. The requirements yielded five SRNL items subject to Pu verification and two SRNL items subject to HEU verification. These items are listed and described in Table 1.
Rosso, Osvaldo A; Ospina, Raydonal; Frery, Alejandro C
2016-01-01
We present a new approach for handwritten signature classification and verification based on descriptors stemming from time causal information theory. The proposal uses the Shannon entropy, the statistical complexity, and the Fisher information evaluated over the Bandt and Pompe symbolization of the horizontal and vertical coordinates of signatures. These six features are easy and fast to compute, and they are the input to a One-Class Support Vector Machine classifier. The results are better than state-of-the-art online techniques that employ higher-dimensional feature spaces which often require specialized software and hardware. We assess the consistency of our proposal with respect to the size of the training sample, and we also use it to classify the signatures into meaningful groups.
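Two of the descriptors named above, the normalized Shannon entropy and the statistical complexity over the Bandt-Pompe symbolization, can be sketched compactly; the Fisher information and the One-Class SVM stage are omitted. The embedding order, delay and normalization conventions below are common choices and may differ in detail from those of the paper.

```python
import numpy as np
from itertools import permutations
from math import log

def bandt_pompe_probs(x, order=4, delay=1):
    """Ordinal-pattern (Bandt-Pompe) probabilities of a 1D signal."""
    patterns = {p: 0 for p in permutations(range(order))}
    n = len(x) - (order - 1) * delay
    for i in range(n):
        window = x[i:i + order * delay:delay]
        patterns[tuple(int(v) for v in np.argsort(window))] += 1
    return np.array([c / n for c in patterns.values()])

def entropy_complexity(p):
    """Normalized Shannon entropy H and Jensen-Shannon statistical complexity C."""
    def S(q):
        q = q[q > 0]
        return -np.sum(q * np.log(q))
    N = len(p)
    H = S(p) / log(N)
    uniform = np.full(N, 1.0 / N)
    js = S(0.5 * (p + uniform)) - 0.5 * S(p) - 0.5 * S(uniform)
    # standard normalization constant for the Jensen-Shannon divergence
    js_max = -0.5 * (((N + 1) / N) * log(N + 1) - 2 * log(2 * N) + log(N))
    return H, (js / js_max) * H

# toy usage on a pen-trajectory-like signal (x coordinate of a signature)
t = np.linspace(0, 4 * np.pi, 2000)
x_coord = np.sin(t) + 0.1 * np.random.default_rng(8).standard_normal(t.size)
p = bandt_pompe_probs(x_coord, order=4)
print("H = %.3f, C = %.3f" % entropy_complexity(p))
```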
29 CFR 1903.19 - Abatement verification.
Code of Federal Regulations, 2014 CFR
2014-07-01
... identify the violation and the steps to be taken to achieve abatement, including a schedule for completing..., the progress report must identify, in a single sentence if possible, the action taken to achieve abatement and the date the action was taken. Note to paragraph (f): Appendix B contains a Sample Progress...
Code of Federal Regulations, 2014 CFR
2014-04-01
.... Each STD shall develop a quality assurance program which will assure that the materials and workmanship... criteria in § 637.207 and be approved by the FHWA. (b) STD capabilities. The STD shall maintain an adequate... qualified sampling and testing personnel employed by the STD or its designated agent. (d) Verification...
Code of Federal Regulations, 2010 CFR
2010-04-01
.... Each STD shall develop a quality assurance program which will assure that the materials and workmanship... criteria in § 637.207 and be approved by the FHWA. (b) STD capabilities. The STD shall maintain an adequate... qualified sampling and testing personnel employed by the STD or its designated agent. (d) Verification...
Code of Federal Regulations, 2012 CFR
2012-04-01
.... Each STD shall develop a quality assurance program which will assure that the materials and workmanship... criteria in § 637.207 and be approved by the FHWA. (b) STD capabilities. The STD shall maintain an adequate... qualified sampling and testing personnel employed by the STD or its designated agent. (d) Verification...
Code of Federal Regulations, 2013 CFR
2013-04-01
.... Each STD shall develop a quality assurance program which will assure that the materials and workmanship... criteria in § 637.207 and be approved by the FHWA. (b) STD capabilities. The STD shall maintain an adequate... qualified sampling and testing personnel employed by the STD or its designated agent. (d) Verification...
Code of Federal Regulations, 2011 CFR
2011-04-01
.... Each STD shall develop a quality assurance program which will assure that the materials and workmanship... criteria in § 637.207 and be approved by the FHWA. (b) STD capabilities. The STD shall maintain an adequate... qualified sampling and testing personnel employed by the STD or its designated agent. (d) Verification...
77 FR 9888 - Shiga Toxin-Producing Escherichia coli
Federal Register 2010, 2011, 2012, 2013, 2014
2012-02-21
... Toxin-Producing Escherichia coli in Certain Raw Beef Products AGENCY: Food Safety and Inspection Service... routine verification sampling and testing for raw beef manufacturing trimmings for six non-O157 Shiga... announced in September 2011 plans to test certain raw beef products for these six STEC serogroups in...
AERIAL PHOTOGRAPHY AND GROUND VERIFICATION AT POWER PLANT SITES: WISCONSIN POWER PLANT IMPACT STUDY
This study demonstrated and evaluated nine methods for monitoring the deterioration of a large wetland on the site of a newly-constructed coal-fired power plant in Columbia, County, Wisconsin. Four of the nine methods used data from ground sampling; two were remote sensing method...
7 CFR 245.6a - Verification requirements.
Code of Federal Regulations, 2013 CFR
2013-01-01
... AGRICULTURE CHILD NUTRITION PROGRAMS DETERMINING ELIGIBILITY FOR FREE AND REDUCED PRICE MEALS AND FREE MILK IN... established under the Runaway and Homeless Youth Act (42 U.S.C. 5701); or serving migratory children, as they...) General. The local educational agency must verify eligibility of children in a sample of household...
7 CFR 245.6a - Verification requirements.
Code of Federal Regulations, 2012 CFR
2012-01-01
... AGRICULTURE CHILD NUTRITION PROGRAMS DETERMINING ELIGIBILITY FOR FREE AND REDUCED PRICE MEALS AND FREE MILK IN... established under the Runaway and Homeless Youth Act (42 U.S.C. 5701); or serving migratory children, as they...) General. The local educational agency must verify eligibility of children in a sample of household...
7 CFR 245.6a - Verification requirements.
Code of Federal Regulations, 2011 CFR
2011-01-01
... AGRICULTURE CHILD NUTRITION PROGRAMS DETERMINING ELIGIBILITY FOR FREE AND REDUCED PRICE MEALS AND FREE MILK IN.... 5701); or serving migratory children, as they are defined in section 1309 of the Elementary and... children in a sample of household applications approved for free and reduced price meal benefits for that...
7 CFR 245.6a - Verification requirements.
Code of Federal Regulations, 2010 CFR
2010-01-01
... AGRICULTURE CHILD NUTRITION PROGRAMS DETERMINING ELIGIBILITY FOR FREE AND REDUCED PRICE MEALS AND FREE MILK IN.... 5701); or serving migratory children, as they are defined in section 1309 of the Elementary and... children in a sample of household applications approved for free and reduced price meal benefits for that...
Federal Register 2010, 2011, 2012, 2013, 2014
2010-11-12
...) also requires Eligible Telecommunications Carriers (ETCs) to submit to the Universal Service.... Prior to 2009, USAC provided sample certification and verification letters on its website to assist ETCs... check box to accommodate wireless ETCs serving non-federal default states that do not assert...
Test/QA Plan for Verification of Coliform Detection Technologies for Drinking Water
The coliform detection technologies to be tested use chromogenic and fluorogenic growth media to detect coliforms and E. coli based on the enzymatic activity of these organisms. The systems consist of single-use sample containers that contain pre-measured reagents and can be u...
A performance verification demonstration of technologies capable of detecting dioxin and dioxin-like compounds in soil and sediment samples was conducted in April 2004 under the U.S. Environmental Protection Agency's Superfund Innovative Technology Evaluation (SITE) Monitoring an...
Lou, Amy H; Elnenaei, Manal O; Sadek, Irene; Thompson, Shauna; Crocker, Bryan D; Nassar, Bassam A
2017-10-01
Core laboratory (CL), as a new business model, facilitates consolidation and integration of laboratory services to enhance efficiency and reduce costs. This study evaluates the impact of a total laboratory automation system (TLA), an electric track vehicle (ETV) system and auto-verification (AV) of results on overall turnaround time (TAT) (phlebotomy to reporting TAT: PR-TAT) within a CL setting. Mean, median and percentage of outliers (OP) for PR-TAT were compared for pre- and post-CL eras using five representative tests based on different request priorities. Comparison studies were also carried out on the intra-laboratory TAT (in-lab to reporting TAT: IR-TAT) and the delivery TAT (phlebotomy to in-lab TAT: PI-TAT) to reflect the efficiency of the TLA (both before and after introducing result AV) and ETV systems, respectively. Median PR-TATs for the urgent samples were reduced on average by 16% across all representative analytes. Median PR-TATs for the routine samples were curtailed by 51%, 50%, 49%, 34% and 22% for urea, potassium, thyroid stimulating hormone (TSH), complete blood count (CBC) and prothrombin time (PT), respectively. The shorter PR-TAT was attributed to a significant reduction of IR-TAT through the TLA. However, the median PI-TAT was delayed when the ETV was used. Application of various AV rules shortened the median IR-TATs for potassium and urea. However, the OPs of PR-TAT for STAT requests exceeding 60 min were all higher than those from the pre-CL era. TLA and auto-verification rules help to efficiently manage substantial volumes of urgent and routine samples. However, the ETV application as it stands shows a negative impact on the PR-TAT. Copyright © 2017 The Canadian Society of Clinical Chemists. Published by Elsevier Inc. All rights reserved.
A zero-knowledge protocol for nuclear warhead verification
NASA Astrophysics Data System (ADS)
Glaser, Alexander; Barak, Boaz; Goldston, Robert J.
2014-06-01
The verification of nuclear warheads for arms control involves a paradox: international inspectors will have to gain high confidence in the authenticity of submitted items while learning nothing about them. Proposed inspection systems featuring `information barriers', designed to hide measurements stored in electronic systems, are at risk of tampering and snooping. Here we show the viability of a fundamentally new approach to nuclear warhead verification that incorporates a zero-knowledge protocol, which is designed in such a way that sensitive information is never measured and so does not need to be hidden. We interrogate submitted items with energetic neutrons, making, in effect, differential measurements of both neutron transmission and emission. Calculations for scenarios in which material is diverted from a test object show that a high degree of discrimination can be achieved while revealing zero information. Our ideas for a physical zero-knowledge system could have applications beyond the context of nuclear disarmament. The proposed technique suggests a way to perform comparisons or computations on personal or confidential data without measuring the data in the first place.
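To illustrate the preload-to-a-constant idea described above, here is a toy numerical sketch (not the actual neutron physics, and ignoring counting statistics); the count pattern, the agreed total, and the diversion are all invented for illustration.

```python
import numpy as np

# Secret per-detector counts for an authentic item (the "template"); in the
# real protocol these are never revealed to the inspector.
true_pattern = np.array([120, 95, 60, 30, 15])
N = 200                               # agreed constant total per detector (assumption)
preload = N - true_pattern            # complement, prepared by the host

def interrogate(item_pattern):
    """Detector totals after adding the item's counts to the preloads."""
    return preload + np.asarray(item_pattern)

print("authentic item :", interrogate(true_pattern))                     # flat profile, reveals nothing
print("diverted item  :", interrogate(true_pattern - [0, 0, 40, 0, 0]))  # dip exposes the difference
```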
The Mars Science Laboratory Organic Check Material
NASA Technical Reports Server (NTRS)
Conrad, Pamela G.; Eigenbrode, J. E.; Mogensen, C. T.; VonderHeydt, M. O.; Glavin, D. P.; Mahaffy, P. M.; Johnson, J. A.
2011-01-01
The Organic Check Material (OCM) has been developed for use on the Mars Science Laboratory mission to serve as a sample standard for verification of organic cleanliness and characterization of potential sample alteration as a function of the sample acquisition and portioning process on the Curiosity rover. OCM samples will be acquired using the same procedures for drilling, portioning and delivery as are used to study martian samples with The Sample Analysis at Mars (SAM) instrument suite during MSL surface operations. Because the SAM suite is highly sensitive to organic molecules, the mission can better verify the cleanliness of Curiosity's sample acquisition hardware if a known material can be processed through SAM and compared with the results obtained from martian samples.
Data Report on Post-Irradiation Dimensional Change of AGC-1 Samples
DOE Office of Scientific and Technical Information (OSTI.GOV)
William Windes
This report summarizes the initial dimensional changes for loaded and unloaded AGC-1 samples. The dimensional change for all samples is presented as a function of dose. The data are further presented by graphite type and applied load level to illustrate the differences between graphite forming processes and stress levels within the graphite components. While the three different loads placed on the samples have been verified [ref: Larry Hull's report], verification of the AGC-1 sample temperatures and dose levels is expected in the summer of 2012. Only estimated dose and temperature values for the samples are presented in this report to allow a partial analysis of the results.
Online 3D EPID-based dose verification: Proof of concept
DOE Office of Scientific and Technical Information (OSTI.GOV)
Spreeuw, Hanno; Rozendaal, Roel, E-mail: r.rozenda
Purpose: Delivery errors during radiotherapy may lead to medical harm and reduced life expectancy for patients. Such serious incidents can be avoided by performing dose verification online, i.e., while the patient is being irradiated, creating the possibility of halting the linac in case of a large overdosage or underdosage. The offline EPID-based 3D in vivo dosimetry system clinically employed at our institute is in principle suited for online treatment verification, provided the system is able to complete 3D dose reconstruction and verification within 420 ms, the present acquisition time of a single EPID frame. It is the aim of this study to show that our EPID-based dosimetry system can be made fast enough to achieve online 3D in vivo dose verification. Methods: The current dose verification system was sped up in two ways. First, a new software package was developed to perform all computations that are not dependent on portal image acquisition separately, thus removing the need for doing these calculations in real time. Second, the 3D dose reconstruction algorithm was sped up via a new, multithreaded implementation. Dose verification was implemented by comparing planned with reconstructed 3D dose distributions delivered to two regions in a patient: the target volume and the nontarget volume receiving at least 10 cGy. In both volumes, the mean dose is compared, while in the nontarget volume, the near-maximum dose (D2) is compared as well. The real-time dosimetry system was tested by irradiating an anthropomorphic phantom with three VMAT plans: a 6 MV head-and-neck treatment plan, a 10 MV rectum treatment plan, and a 10 MV prostate treatment plan. In all plans, two types of serious delivery errors were introduced. The functionality of automatically halting the linac was also implemented and tested. Results: The precomputation time per treatment was ∼180 s/treatment arc, depending on gantry angle resolution. The complete processing of a single portal frame, including dose verification, took 266 ± 11 ms on a dual octocore Intel Xeon E5-2630 CPU running at 2.40 GHz. The introduced delivery errors were detected after 5–10 s irradiation time. Conclusions: A prototype online 3D dose verification tool using portal imaging has been developed and successfully tested for two different kinds of gross delivery errors. Thus, online 3D dose verification has been technologically achieved.
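As an illustration of the per-frame comparison logic described above (mean dose in the target and non-target volumes, plus the near-maximum dose D2), here is a minimal Python sketch; the tolerances, the masks, the synthetic dose grids, and the approximation of D2 by the 98th percentile are assumptions, not the authors' implementation.

```python
import numpy as np

def check_frame(planned, reconstructed, target_mask, nontarget_mask,
                mean_tol=0.10, d2_tol=0.10):
    """Return True if the delivery should be halted for this frame.

    planned/reconstructed: cumulative 3D dose grids (Gy); masks: boolean arrays.
    D2 (near-maximum dose) is approximated here by the 98th percentile.
    """
    mean_dev_t = abs(reconstructed[target_mask].mean() - planned[target_mask].mean()) \
                 / planned[target_mask].mean()
    nt_plan, nt_reco = planned[nontarget_mask], reconstructed[nontarget_mask]
    mean_dev_nt = abs(nt_reco.mean() - nt_plan.mean()) / nt_plan.mean()
    d2_dev = abs(np.percentile(nt_reco, 98) - np.percentile(nt_plan, 98)) \
             / np.percentile(nt_plan, 98)
    return mean_dev_t > mean_tol or mean_dev_nt > mean_tol or d2_dev > d2_tol

# Synthetic example: a 2% global deviation stays below the (assumed) 10% tolerances.
rng = np.random.default_rng(0)
planned = rng.uniform(0.5, 2.0, (32, 32, 32))
reconstructed = 1.02 * planned
target = np.zeros(planned.shape, bool)
target[10:20, 10:20, 10:20] = True
nontarget = ~target & (planned >= 0.10)          # non-target voxels receiving >= 10 cGy
print(check_frame(planned, reconstructed, target, nontarget))   # False -> keep irradiating
```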
Antony, Matthieu; Bertone, Maria Paola; Barthes, Olivier
2017-03-14
Results-based financing (RBF) has been introduced in many countries across Africa and a growing literature is building around the assessment of their impact. These studies are usually quantitative and often silent on the paths and processes through which results are achieved and on the wider health system effects of RBF. To address this gap, our study aims at exploring the implementation of an RBF pilot in Benin, focusing on the verification of results. The study is based on action research carried out by authors involved in the pilot as part of the agency supporting the RBF implementation in Benin. While our participant observation and operational collaboration with project's stakeholders informed the study, the analysis is mostly based on quantitative and qualitative secondary data, collected throughout the project's implementation and documentation processes. Data include project documents, reports and budgets, RBF data on service outputs and on the outcome of the verification, daily activity timesheets of the technical assistants in the districts, as well as focus groups with Community-based Organizations and informal interviews with technical assistants and district medical officers. Our analysis focuses on the actual practices of quantitative, qualitative and community verification. Results show that the verification processes are complex, costly and time-consuming, and in practice they end up differing from what designed originally. We explore the consequences of this on the operation of the scheme, on its potential to generate the envisaged change. We find, for example, that the time taken up by verification procedures limits the time available for data analysis and feedback to facility staff, thus limiting the potential to improve service delivery. Verification challenges also result in delays in bonus payment, which delink effort and reward. Additionally, the limited integration of the verification activities of district teams with their routine tasks causes a further verticalization of the health system. Our results highlight the potential disconnect between the theory of change behind RBF and the actual scheme's implementation. The implications are relevant at methodological level, stressing the importance of analyzing implementation processes to fully understand results, as well as at operational level, pointing to the need to carefully adapt the design of RBF schemes (including verification and other key functions) to the context and to allow room to iteratively modify it during implementation. They also question whether the rationale for thorough and costly verification is justified, or rather adaptations are possible.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Takahashi, R; Kamima, T; Tachibana, H
2015-06-15
Purpose: To show the results of a multi-institutional study of the independent dose verification for conventional, stereotactic radiosurgery and body radiotherapy (SRS and SBRT) plans based on the action level of AAPM TG-114. Methods: This study was performed at 12 institutions in Japan. To eliminate the bias of the independent dose verification program (Indp), all of the institutions used the same CT-based independent dose verification software (Simple MU Analysis, Triangle Products, JP) with the Clarkson-based algorithm. Eclipse (AAA, PBC), Pinnacle³ (Adaptive Convolve) and Xio (Superposition) were used as treatment planning systems (TPS). The confidence limits (CL, Mean±2SD) for 18 sites (head, breast, lung, pelvis, etc.) were evaluated by comparing the dose between the TPS and the Indp. Results: A retrospective analysis of 6352 treatment fields was conducted. The CLs for conventional, SRS and SBRT were 1.0±3.7%, 2.0±2.5% and 6.2±4.4%, respectively. In conventional plans, most of the sites were within the 5% TG-114 action level. However, there were systematic differences (4.0±4.0% and 2.5±5.8% for breast and lung, respectively). In SRS plans, our results showed good agreement compared to the action level. In SBRT plans, the discrepancy between the TPS and the Indp was variable depending on the dose calculation algorithms of the TPS. Conclusion: The impact of dose calculation algorithms for the TPS and the Indp affects the action level. It is effective to set site-specific tolerances, especially for sites where inhomogeneity correction can strongly affect the dose distribution.
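The confidence-limit comparison described above can be sketched in a few lines; the per-field doses, the 5% action level, and the sign convention of the difference are assumptions for illustration only.

```python
import numpy as np

def confidence_limit(tps_dose, indep_dose):
    """Return (mean, 2*SD) of the percentage dose difference Indp vs. TPS."""
    tps = np.asarray(tps_dose, float)
    indep = np.asarray(indep_dose, float)
    diff_pct = 100.0 * (indep - tps) / tps
    return diff_pct.mean(), 2.0 * diff_pct.std(ddof=1)

rng = np.random.default_rng(1)
tps = rng.uniform(1.8, 2.2, 200)                    # fabricated per-field doses (Gy)
indep = tps * (1.0 + rng.normal(0.01, 0.02, 200))   # ~1% systematic, 2% random difference
mean, two_sd = confidence_limit(tps, indep)
action_level = 5.0                                  # percent, assumed per-site tolerance
status = "within" if abs(mean) + two_sd <= action_level else "exceeds"
print(f"CL = {mean:.1f} % ± {two_sd:.1f} %  ({status} the action level)")
```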
Verification of different forecasts of Hungarian Meteorological Service
NASA Astrophysics Data System (ADS)
Feher, B.
2009-09-01
In this paper I show the results of the forecasts made by the Hungarian Meteorological Service. I focus on the general short- and medium-range forecasts, which contain cloudiness, precipitation, wind speed and temperature for six regions of Hungary. I also show the results of some special forecasts, such as precipitation predictions made for the catchment areas of the Danube and Tisza rivers, and daily mean temperature predictions used by Hungarian energy companies. The product received by the user is made by the general forecaster, but these predictions are based on the ALADIN and ECMWF outputs. For this reason, the products of the forecaster and of the models were both verified. A method like this can show which weather elements are more difficult to forecast and which regions have higher errors. During the verification procedure the basic errors (mean error, mean absolute error) are calculated. Precipitation amount is classified into five categories, and scores such as POD, TS and PC are defined from the contingency table determined by these categories. The procedure runs fully automatically; all the forecasters have to do is print the daily result each morning. Besides the daily results, verification is also performed for longer periods such as a week, a month or a year. Analyzing the results over longer periods, we can say that the best predictions are made for the first few days, that precipitation forecasts are less accurate for mountainous areas, and that the scores of the forecasters are sometimes even higher than those of the models. Since forecasters receive the results the next day, the verification helps them to reduce mistakes and learn the weaknesses of the models. This paper contains the verification scores, their trends, the method by which these scores are calculated, and some case studies of poor forecasts.
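For readers unfamiliar with the categorical scores mentioned above (POD, TS, PC) and the basic errors (mean error, mean absolute error), the following sketch computes them from a fabricated daily precipitation series; the 0.1 mm event threshold is an assumption, and the operational system uses five precipitation categories rather than the two used here.

```python
import numpy as np

def verification_scores(forecast, observed, threshold=0.1):
    """Basic errors plus 2x2 contingency-table scores for event occurrence."""
    fcst = np.asarray(forecast, float)
    obs = np.asarray(observed, float)
    f, o = fcst >= threshold, obs >= threshold
    hits = np.sum(f & o)
    misses = np.sum(~f & o)
    false_alarms = np.sum(f & ~o)
    correct_negatives = np.sum(~f & ~o)
    return {
        "ME": np.mean(fcst - obs),                    # mean error
        "MAE": np.mean(np.abs(fcst - obs)),           # mean absolute error
        "POD": hits / (hits + misses),                # probability of detection
        "TS": hits / (hits + misses + false_alarms),  # threat score
        "PC": (hits + correct_negatives) / f.size,    # proportion correct
    }

rng = np.random.default_rng(2)
obs = rng.gamma(0.5, 2.0, 365)                        # fabricated daily precipitation (mm)
fcst = np.clip(obs + rng.normal(0.0, 1.0, 365), 0.0, None)
print({k: round(v, 3) for k, v in verification_scores(fcst, obs).items()})
```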
NASA Astrophysics Data System (ADS)
Miedzinska, Danuta; Boczkowska, Anna; Zubko, Konrad
2010-07-01
In this article a method of numerical verification of experimental results for magnetorheological elastomer (MRE) samples is presented. The samples were shaped into cylinders with a diameter of 8 mm and a height of 20 mm with various carbonyl iron volume shares (1.5%, 11.5% and 33%). The diameter of the soft ferromagnetic substance particles ranged from 6 to 9 μm. During the experiment, initially bent samples were exposed to a magnetic field with intensity levels of 0.1 T, 0.3 T, 0.5 T, 0.7 T and 1 T. The reaction of the sample to the field was measured as a displacement of the specimen. Numerical calculation was carried out with the MSC Patran/Marc computer code. For the purpose of numerical analysis, an orthotropic material model was applied, with the material properties of the magnetorheological elastomer along the iron chains and of the pure elastomer along the other directions. The material properties were obtained from the experimental tests. During the numerical analysis, the initial mechanical load resulting from cylinder deflection was set. Then, the equivalent external force, set on the basis of analytical calculations of the intermolecular reaction within the iron chains in the specific magnetic field, was applied to the bent sample. The correspondence of this numerical model with the results of the experiment was verified. Similar results of the experiments and of both the theoretical and FEM analyses indicate that macroscopic modeling of the magnetorheological elastomer's mechanical properties as an orthotropic material delivers an accurate enough description of the material's behavior.
NASA Astrophysics Data System (ADS)
Kim, Youngmi; Choi, Jae-Young; Choi, Kwangseon; Choi, Jung-Hoe; Lee, Sooryong
2011-04-01
As IC design complexity keeps increasing, it is more and more difficult to ensure pattern transfer after optical proximity correction (OPC), due to the continuous reduction of layout dimensions and the lithographic limitation imposed by the k1 factor. To guarantee imaging fidelity, resolution enhancement technologies (RET) such as off-axis illumination (OAI), different types of phase shift masks, and OPC techniques have been developed. In the case of model-based OPC, post-OPC verification solutions continue to be developed to cross-check the contour image against the target layout: methods for contour generation and matching to the target structure, and methods for filtering and sorting patterns to eliminate false errors and duplicate patterns. Detecting only real errors, by excluding false errors, is the most important requirement for an accurate and fast verification process, saving not only review time and engineering resources but also overall wafer process time. In the general case of post-OPC verification for the metal-contact/via coverage (CC) check, the verification solution outputs a huge number of errors due to borderless design, so it is too difficult to review and correct all of them. This can cause the OPC engineer to miss real defects and, at the least, delay time to market. In this paper, we study a method for increasing the efficiency of post-OPC verification, especially for the CC check. For metal layers, the final CD after the etch process shows various CD biases depending on the distance to neighboring patterns, so it is more reasonable to consider the final metal shape when confirming contact/via coverage. Through the optimization of the biasing rule for different pitches and shapes of metal lines, we could obtain more accurate and efficient verification results and decrease the review time needed to find real errors. In summary, we present a suggestion for increasing the efficiency of the OPC verification process by applying a simple biasing rule to the metal layout instead of applying an etch model.
2010-06-07
Conference proceeding (dates covered 2008-2010): "Results from Mechanical Testing of Silicon Carbide for Space..." Describes a verification approach for silicon carbide optical systems that covers material verification through system development, with recent laboratory results for testing of materials, including 10" x 16" SiC plates; see also "the materials properties of silicon carbide plates", S. Kenderian et al., 2009 SPIE Proceedings, vol. 7425.
49 CFR 587.15 - Verification of aluminum honeycomb crush strength.
Code of Federal Regulations, 2012 CFR
2012-10-01
... material being tested. Four samples, each measuring 300 mm (11.8 in) × 300 mm (11.8 in) × 25 mm (1 in.... Samples of the following size are used for testing. The length is 150 mm (5.9 in) ±6 mm (0.24 in), the width is 150 mm (5.9 in) ±6 mm (0.24 in), and the thickness is 25 mm (1 in) ±2 mm (0.08 in). The walls...
49 CFR 587.15 - Verification of aluminum honeycomb crush strength.
Code of Federal Regulations, 2013 CFR
2013-10-01
... material being tested. Four samples, each measuring 300 mm (11.8 in) × 300 mm (11.8 in) × 25 mm (1 in.... Samples of the following size are used for testing. The length is 150 mm (5.9 in) ±6 mm (0.24 in), the width is 150 mm (5.9 in) ±6 mm (0.24 in), and the thickness is 25 mm (1 in) ±2 mm (0.08 in). The walls...
49 CFR 587.15 - Verification of aluminum honeycomb crush strength.
Code of Federal Regulations, 2014 CFR
2014-10-01
... material being tested. Four samples, each measuring 300 mm (11.8 in) × 300 mm (11.8 in) × 25 mm (1 in.... Samples of the following size are used for testing. The length is 150 mm (5.9 in) ±6 mm (0.24 in), the width is 150 mm (5.9 in) ±6 mm (0.24 in), and the thickness is 25 mm (1 in) ±2 mm (0.08 in). The walls...
49 CFR 587.15 - Verification of aluminum honeycomb crush strength.
Code of Federal Regulations, 2011 CFR
2011-10-01
... material being tested. Four samples, each measuring 300 mm (11.8 in) × 300 mm (11.8 in) × 25 mm (1 in.... Samples of the following size are used for testing. The length is 150 mm (5.9 in) ±6 mm (0.24 in), the width is 150 mm (5.9 in) ±6 mm (0.24 in), and the thickness is 25 mm (1 in) ±2 mm (0.08 in). The walls...
Verification of Abbott 25-OH-vitamin D assay on the architect system.
Hutchinson, Katrina; Healy, Martin; Crowley, Vivion; Louw, Michael; Rochev, Yury
2017-04-01
Analytical and clinical verification of both old and new generations of the Abbott total 25-hydroxyvitamin D (25OHD) assays, and an examination of reference intervals. Determination of between-run precision, and Deming comparison between patient sample results for 25OHD on the Abbott Architect, DiaSorin Liaison and AB SCIEX API 4000 (LC-MS/MS). Establishment of uncertainty of measurement for 25OHD Architect methods using old and new generations of the reagents, and estimation of the reference interval in a healthy Irish population. For between-run precision the manufacturer claims coefficients of variation (CVs) of 2.8% and 4.6% for their high and low controls, respectively. Our instrument showed CVs between 4% and 6.2% for all levels of the controls on both generations of the Abbott reagents. The between-run uncertainties were 0.28 and 0.36, with expanded uncertainties 0.87 and 0.98 for the old and the new generations of reagent, respectively. The difference between all methods used for patients' samples was within total allowable error, and the instruments produced clinically equivalent results. The results covered the medical decision points of 30, 40, 50 and 125 nmol/L. The reference interval for total 25OHD in our healthy Irish subjects was lower than recommended levels (24-111 nmol/L). In a clinical laboratory, Abbott 25OHD immunoassays are a useful, rapid and accurate method for measuring total 25OHD. The new generation of the assay was confirmed to be reliable, accurate, and a good indicator for 25OHD measurement. More study is needed to establish reference intervals that correctly represent the healthy population in Ireland.
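A minimal sketch of the between-run precision and uncertainty arithmetic discussed above; the QC values, the number of runs, and the coverage factor k = 2 are assumptions and do not reproduce the paper's figures.

```python
import numpy as np

def between_run_stats(qc_values, coverage_factor=2.0):
    """Between-run CV (%) and an expanded uncertainty of the QC mean."""
    qc = np.asarray(qc_values, float)
    cv = 100.0 * qc.std(ddof=1) / qc.mean()
    u = qc.std(ddof=1) / np.sqrt(qc.size)     # standard uncertainty of the mean
    return cv, coverage_factor * u

rng = np.random.default_rng(3)
low_qc = rng.normal(40.0, 2.0, 30)            # 30 fabricated runs of a low 25OHD control (nmol/L)
cv, U = between_run_stats(low_qc)
print(f"between-run CV = {cv:.1f} %, expanded uncertainty (k=2) = {U:.2f} nmol/L")
```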
EXhype: A tool for mineral classification using hyperspectral data
NASA Astrophysics Data System (ADS)
Adep, Ramesh Nityanand; Shetty, Amba; Ramesh, H.
2017-02-01
Various supervised classification algorithms have been developed to classify earth surface features using hyperspectral data. Each algorithm is modelled on different human expertise. However, the performance of conventional algorithms is not satisfactory for mapping minerals in particular, in view of their typical spectral responses. This study introduces a new expert system named 'EXhype (Expert system for hyperspectral data classification)' to map minerals. The system incorporates human expertise at several stages of its implementation: (i) to deal with intra-class variation; (ii) to identify absorption features; (iii) to discriminate spectra by considering absorption features, non-absorption features and by full spectra comparison; and (iv) finally to take a decision based on learning and by emphasizing the most important features. It is developed using a knowledge base consisting of an Optimal Spectral Library, the Segmented Upper Hull method, the Spectral Angle Mapper (SAM) and an Artificial Neural Network. The performance of EXhype is compared with the traditional, most commonly used SAM algorithm using Airborne Visible/Infrared Imaging Spectrometer (AVIRIS) data acquired over Cuprite, Nevada, USA. A virtual verification method is used to collect sample information for accuracy assessment. Further, a modified accuracy assessment method is used to obtain real user accuracies in cases where only limited or desired classes are considered for classification. With the modified accuracy assessment method, SAM and EXhype yield overall accuracies of 60.35% and 90.75% and kappa coefficients of 0.51 and 0.89, respectively. It was also found that the virtual verification method allows the use of the preferred stratified random sampling method and eliminates all the difficulties associated with it. The experimental results show that EXhype not only produces better accuracy than traditional SAM but can also correctly classify the minerals. It is proficient in avoiding misclassification between target classes when applied to minerals.
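The Spectral Angle Mapper step used both as the baseline and inside EXhype's knowledge base reduces to the angle between a pixel spectrum and each reference spectrum; a minimal sketch with fabricated four-band spectra follows (the class names and values are invented).

```python
import numpy as np

def spectral_angle(a, b):
    """Angle (radians) between two spectra treated as vectors."""
    a, b = np.asarray(a, float), np.asarray(b, float)
    cos = np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b))
    return np.arccos(np.clip(cos, -1.0, 1.0))

def sam_classify(pixel, library):
    """Assign the library class with the smallest spectral angle."""
    angles = {name: spectral_angle(pixel, ref) for name, ref in library.items()}
    return min(angles, key=angles.get), angles

# Fabricated four-band reference spectra (reflectance) and one pixel spectrum.
library = {"mineral_A": np.array([0.30, 0.45, 0.20, 0.55]),
           "mineral_B": np.array([0.35, 0.25, 0.50, 0.40])}
pixel = np.array([0.29, 0.47, 0.22, 0.52])
label, angles = sam_classify(pixel, library)
print(label, {k: round(float(v), 3) for k, v in angles.items()})
```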
DOE Office of Scientific and Technical Information (OSTI.GOV)
Chang, C.; Pujol, A.; Gaztañaga, E.
We measure the redshift evolution of galaxy bias from a magnitude-limited galaxy sample by combining the galaxy density maps and weak lensing shear maps for a ∼116 deg² area of the Dark Energy Survey (DES) Science Verification data. This method was first developed in Amara et al. (2012) and later re-examined in a companion paper (Pujol et al., in prep) with rigorous simulation tests and analytical treatment of tomographic measurements. In this work we apply this method to the DES SV data and measure the galaxy bias for a magnitude-limited galaxy sample. We find the galaxy bias and 1σ error bars in 4 photometric redshift bins to be 1.33±0.18 (z=0.2-0.4), 1.19±0.23 (z=0.4-0.6), 0.99±0.36 (z=0.6-0.8), and 1.66±0.56 (z=0.8-1.0). These measurements are consistent at the 1-2σ level with measurements on the same dataset using galaxy clustering and cross-correlation of galaxies with CMB lensing. In addition, our method provides the only σ8-independent constraint among the three. We forward-model the main observational effects using mock galaxy catalogs by including shape noise, photo-z errors and masking effects. We show that our bias measurement from the data is consistent with that expected from simulations. With the forthcoming full DES data set, we expect this method to provide additional constraints on the galaxy bias measurement from more traditional methods. Furthermore, in the process of our measurement, we build up a 3D mass map that allows further exploration of the dark matter distribution and its relation to galaxy evolution.
Wan, Xiaohua; Katchalski, Tsvi; Churas, Christopher; Ghosh, Sreya; Phan, Sebastien; Lawrence, Albert; Hao, Yu; Zhou, Ziying; Chen, Ruijuan; Chen, Yu; Zhang, Fa; Ellisman, Mark H
2017-05-01
Because of the significance of electron microscope tomography in the investigation of biological structure at nanometer scales, ongoing improvement efforts have been continuous over recent years. This is particularly true in the case of software developments. Nevertheless, verification of improvements delivered by new algorithms and software remains difficult. Current analysis tools do not provide adaptable and consistent methods for quality assessment. This is particularly true with images of biological samples, due to image complexity, variability, low contrast and noise. We report an electron tomography (ET) simulator with accurate ray optics modeling of image formation that includes curvilinear trajectories through the sample, warping of the sample and noise. As a demonstration of the utility of our approach, we have concentrated on providing verification of the class of reconstruction methods applicable to wide field images of stained plastic-embedded samples. Accordingly, we have also constructed digital phantoms derived from serial block face scanning electron microscope images. These phantoms are also easily modified to include alignment features to test alignment algorithms. The combination of more realistic phantoms with more faithful simulations facilitates objective comparison of acquisition parameters, alignment and reconstruction algorithms and their range of applicability. With proper phantoms, this approach can also be modified to include more complex optical models, including distance-dependent blurring and phase contrast functions, such as may occur in cryotomography. Copyright © 2017 Elsevier Inc. All rights reserved.
Lessons from UNSCOM and IAEA regarding remote monitoring and air sampling
DOE Office of Scientific and Technical Information (OSTI.GOV)
Dupree, S.A.
1996-01-01
In 1991, at the direction of the United Nations Security Council, UNSCOM and IAEA developed plans for On-going Monitoring and Verification (OMV) in Iraq. The plans were accepted by the Security Council and remote monitoring and atmospheric sampling equipment has been installed at selected sites in Iraq. The remote monitoring equipment consists of video cameras and sensors positioned to observe equipment or activities at sites that could be used to support the development or manufacture of weapons of mass destruction, or long-range missiles. The atmospheric sampling equipment provides unattended collection of chemical samples from sites that could be used to support the development or manufacture of chemical weapon agents. To support OMV in Iraq, UNSCOM has established the Baghdad Monitoring and Verification Centre. Imagery from the remote monitoring cameras can be accessed in near-real time from the Centre through RIF communication links with the monitored sites. The OMV program in Iraq has implications for international cooperative monitoring in both global and regional contexts. However, monitoring systems such as those used in Iraq are not sufficient, in and of themselves, to guarantee the absence of prohibited activities. Such systems cannot replace on-site inspections by competent, trained inspectors. However, monitoring similar to that used in Iraq can contribute to openness and confidence building, to the development of mutual trust, and to the improvement of regional stability.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Yashchuk, Valeriy V; Conley, Raymond; Anderson, Erik H
Verification of the reliability of metrology data from high quality x-ray optics requires that adequate methods for test and calibration of the instruments be developed. For such verification for optical surface profilometers in the spatial frequency domain, a modulation transfer function (MTF) calibration method based on binary pseudo-random (BPR) gratings and arrays has been suggested [Proc. SPIE 7077-7 (2007), Opt. Eng. 47(7), 073602-1-5 (2008)] and proven to be an effective calibration method for a number of interferometric microscopes, a phase shifting Fizeau interferometer, and a scatterometer [Nucl. Instr. and Meth. A 616, 172-82 (2010)]. Here we describe the details of development of binary pseudo-random multilayer (BPRML) test samples suitable for characterization of scanning (SEM) and transmission (TEM) electron microscopes. We discuss the results of TEM measurements with the BPRML test samples fabricated from a WiSi2/Si multilayer coating with pseudo randomly distributed layers. In particular, we demonstrate that significant information about the metrological reliability of the TEM measurements can be extracted even when the fundamental frequency of the BPRML sample is smaller than the Nyquist frequency of the measurements. The measurements demonstrate a number of problems related to the interpretation of the SEM and TEM data. Note that similar BPRML test samples can be used to characterize x-ray microscopes. Corresponding work with x-ray microscopes is in progress.
Wirojanagud, Wanpen; Srisatit, Thares
2014-01-01
A fuzzy overlay approach applied to three raster maps, including land slope, soil type, and distance to stream, can be used to identify the locations with the highest potential for arsenic contamination in soils. Verification of high arsenic contamination was made by collecting samples, analysing arsenic content, and interpolating the surface by a spatial anisotropic method. A total of 51 soil samples were collected at the potential contaminated locations identified by the fuzzy overlay approach. At each location, soil samples were taken at a depth of 0.00-1.00 m from the ground surface level. Interpolating the surface of the analysed arsenic content using the spatial anisotropic method verified the potential arsenic contamination locations obtained from the fuzzy overlay outputs. The outputs of the spatial anisotropic surface and the fuzzy overlay mapping were significantly spatially conformant. Three contaminated areas with arsenic concentrations of 7.19 ± 2.86, 6.60 ± 3.04, and 4.90 ± 2.67 mg/kg exceeded the arsenic content of 3.9 mg/kg, the maximum concentration level (MCL) for agricultural soils as designated by the Office of the National Environment Board of Thailand. It is concluded that fuzzy overlay mapping could be employed for identification of potential contamination areas, with verification by the spatial anisotropic approach including intensive sampling and analysis of the substances of interest. PMID:25110751
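A minimal sketch of a fuzzy overlay of the three raster criteria named above; the membership functions, thresholds, and synthetic rasters are assumptions, not the study's actual criteria.

```python
import numpy as np

def linear_membership(x, low, high, decreasing=False):
    """Piecewise-linear fuzzy membership rescaling of a raster to [0, 1]."""
    m = np.clip((x - low) / (high - low), 0.0, 1.0)
    return 1.0 - m if decreasing else m

rng = np.random.default_rng(4)
slope = rng.uniform(0, 30, (50, 50))           # percent slope (synthetic raster)
soil_score = rng.uniform(0, 1, (50, 50))       # 0..1 suitability derived from the soil map
dist_stream = rng.uniform(0, 2000, (50, 50))   # distance to stream, metres

m_slope = linear_membership(slope, 0, 15, decreasing=True)          # flatter -> higher membership
m_soil = soil_score
m_dist = linear_membership(dist_stream, 0, 1000, decreasing=True)   # closer -> higher membership

overlay = np.minimum.reduce([m_slope, m_soil, m_dist])   # fuzzy AND (minimum operator)
candidates = np.argwhere(overlay > 0.7)                  # cells worth sampling
print(f"{len(candidates)} candidate cells out of {overlay.size}")
```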
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ricci, P., E-mail: paolo.ricci@epfl.ch; Riva, F.; Theiler, C.
In the present work, a Verification and Validation procedure is presented and applied showing, through a practical example, how it can contribute to advancing our physics understanding of plasma turbulence. Bridging the gap between plasma physics and other scientific domains, in particular, the computational fluid dynamics community, a rigorous methodology for the verification of a plasma simulation code is presented, based on the method of manufactured solutions. This methodology assesses that the model equations are correctly solved, within the order of accuracy of the numerical scheme. The technique to carry out a solution verification is described to provide a rigorous estimate of the uncertainty affecting the numerical results. A methodology for plasma turbulence code validation is also discussed, focusing on quantitative assessment of the agreement between experiments and simulations. The Verification and Validation methodology is then applied to the study of plasma turbulence in the basic plasma physics experiment TORPEX [Fasoli et al., Phys. Plasmas 13, 055902 (2006)], considering both two-dimensional and three-dimensional simulations carried out with the GBS code [Ricci et al., Plasma Phys. Controlled Fusion 54, 124047 (2012)]. The validation procedure allows progress in the understanding of the turbulent dynamics in TORPEX, by pinpointing the presence of a turbulent regime transition, due to the competition between the resistive and ideal interchange instabilities.
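The method of manufactured solutions mentioned above can be illustrated on a much simpler problem than a turbulence code: pick an exact solution, derive the matching source term, and confirm that the discretization error shrinks at the scheme's formal order. The 1D Poisson problem below is a stand-in chosen for brevity, not the GBS verification case.

```python
import numpy as np

def solve_poisson(n):
    """Solve u'' = f on (0,1), u(0)=u(1)=0, with 2nd-order central differences."""
    x = np.linspace(0.0, 1.0, n + 1)
    h = 1.0 / n
    f = -(np.pi ** 2) * np.sin(np.pi * x)      # source manufactured from u(x) = sin(pi x)
    A = (np.diag(-2.0 * np.ones(n - 1)) +
         np.diag(np.ones(n - 2), 1) +
         np.diag(np.ones(n - 2), -1)) / h ** 2
    u = np.zeros(n + 1)
    u[1:-1] = np.linalg.solve(A, f[1:-1])
    return x, u

errors, hs = [], []
for n in (16, 32, 64, 128):
    x, u = solve_poisson(n)
    errors.append(np.max(np.abs(u - np.sin(np.pi * x))))   # error vs. the manufactured solution
    hs.append(1.0 / n)
orders = np.log(np.array(errors[:-1]) / errors[1:]) / np.log(np.array(hs[:-1]) / np.array(hs[1:]))
print("observed order of accuracy:", np.round(orders, 2))  # should approach 2 for this scheme
```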
An analysis of random projection for changeable and privacy-preserving biometric verification.
Wang, Yongjin; Plataniotis, Konstantinos N
2010-10-01
Changeability and privacy protection are important factors for widespread deployment of biometrics-based verification systems. This paper presents a systematic analysis of a random-projection (RP)-based method for addressing these problems. The employed method transforms biometric data using a random matrix with each entry an independent and identically distributed Gaussian random variable. The similarity- and privacy-preserving properties, as well as the changeability of the biometric information in the transformed domain, are analyzed in detail. Specifically, RP on both high-dimensional image vectors and dimensionality-reduced feature vectors is discussed and compared. A vector translation method is proposed to improve the changeability of the generated templates. The feasibility of the introduced solution is well supported by detailed theoretical analyses. Extensive experimentation on a face-based biometric verification problem shows the effectiveness of the proposed method.
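A minimal sketch of the random-projection transform described above: a Gaussian random matrix applied to feature vectors, with an informal check that pairwise distances are roughly preserved; the dimensions and data are synthetic, and the vector-translation refinement is not shown.

```python
import numpy as np

rng = np.random.default_rng(5)
d, k, n = 1024, 128, 20                    # original dim, projected dim, number of samples
X = rng.normal(size=(n, d))                # stand-in for biometric feature vectors
R = rng.normal(0.0, 1.0 / np.sqrt(k), size=(d, k))   # i.i.d. Gaussian projection matrix
Y = X @ R                                  # transformed templates; re-issue by drawing a new R

orig = np.linalg.norm(X[0] - X[1])
proj = np.linalg.norm(Y[0] - Y[1])
print(f"pairwise distance before {orig:.2f}, after {proj:.2f} "
      f"(ratio {proj / orig:.2f}, expected near 1)")
```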
Abstraction and Assume-Guarantee Reasoning for Automated Software Verification
NASA Technical Reports Server (NTRS)
Chaki, S.; Clarke, E.; Giannakopoulou, D.; Pasareanu, C. S.
2004-01-01
Compositional verification and abstraction are the key techniques to address the state explosion problem associated with model checking of concurrent software. A promising compositional approach is to prove properties of a system by checking properties of its components in an assume-guarantee style. This article proposes a framework for performing abstraction and assume-guarantee reasoning of concurrent C code in an incremental and fully automated fashion. The framework uses predicate abstraction to extract and refine finite state models of software and it uses an automata learning algorithm to incrementally construct assumptions for the compositional verification of the abstract models. The framework can be instantiated with different assume-guarantee rules. We have implemented our approach in the COMFORT reasoning framework and we show how COMFORT out-performs several previous software model checking approaches when checking safety properties of non-trivial concurrent programs.
NASA Astrophysics Data System (ADS)
Davis, C.; Rozo, E.; Roodman, A.; Alarcon, A.; Cawthon, R.; Gatti, M.; Lin, H.; Miquel, R.; Rykoff, E. S.; Troxel, M. A.; Vielzeuf, P.; Abbott, T. M. C.; Abdalla, F. B.; Allam, S.; Annis, J.; Bechtol, K.; Benoit-Lévy, A.; Bertin, E.; Brooks, D.; Buckley-Geer, E.; Burke, D. L.; Carnero Rosell, A.; Carrasco Kind, M.; Carretero, J.; Castander, F. J.; Crocce, M.; Cunha, C. E.; D'Andrea, C. B.; da Costa, L. N.; Desai, S.; Diehl, H. T.; Doel, P.; Drlica-Wagner, A.; Fausti Neto, A.; Flaugher, B.; Fosalba, P.; Frieman, J.; García-Bellido, J.; Gaztanaga, E.; Gerdes, D. W.; Giannantonio, T.; Gruen, D.; Gruendl, R. A.; Gutierrez, G.; Honscheid, K.; Jain, B.; James, D. J.; Jeltema, T.; Krause, E.; Kuehn, K.; Kuhlmann, S.; Kuropatkin, N.; Lahav, O.; Li, T. S.; Lima, M.; March, M.; Marshall, J. L.; Martini, P.; Melchior, P.; Ogando, R. L. C.; Plazas, A. A.; Romer, A. K.; Sanchez, E.; Scarpine, V.; Schindler, R.; Schubnell, M.; Sevilla-Noarbe, I.; Smith, M.; Soares-Santos, M.; Sobreira, F.; Suchyta, E.; Swanson, M. E. C.; Tarle, G.; Thomas, D.; Vikram, V.; Walker, A. R.; Wechsler, R. H.
2018-06-01
Galaxy cross-correlations with high-fidelity redshift samples hold the potential to precisely calibrate systematic photometric redshift uncertainties arising from the unavailability of complete and representative training and validation samples of galaxies. However, application of this technique in the Dark Energy Survey (DES) is hampered by the relatively low number density, small area, and modest redshift overlap between photometric and spectroscopic samples. We propose instead using photometric catalogues with reliable photometric redshifts for photo-z calibration via cross-correlations. We verify the viability of our proposal using redMaPPer clusters from the Sloan Digital Sky Survey (SDSS) to successfully recover the redshift distribution of SDSS spectroscopic galaxies. We demonstrate how to combine photo-z with cross-correlation data to calibrate photometric redshift biases while marginalizing over possible clustering bias evolution in either the calibration or unknown photometric samples. We apply our method to DES Science Verification (DES SV) data in order to constrain the photometric redshift distribution of a galaxy sample selected for weak lensing studies, constraining the mean of the tomographic redshift distributions to a statistical uncertainty of Δz ˜ ±0.01. We forecast that our proposal can, in principle, control photometric redshift uncertainties in DES weak lensing experiments at a level near the intrinsic statistical noise of the experiment over the range of redshifts where redMaPPer clusters are available. Our results provide strong motivation to launch a programme to fully characterize the systematic errors from bias evolution and photo-z shapes in our calibration procedure.
Davis, C.; Rozo, E.; Roodman, A.; ...
2018-03-26
Galaxy cross-correlations with high-fidelity redshift samples hold the potential to precisely calibrate systematic photometric redshift uncertainties arising from the unavailability of complete and representative training and validation samples of galaxies. However, application of this technique in the Dark Energy Survey (DES) is hampered by the relatively low number density, small area, and modest redshift overlap between photometric and spectroscopic samples. We propose instead using photometric catalogs with reliable photometric redshifts for photo-z calibration via cross-correlations. We verify the viability of our proposal using redMaPPer clusters from the Sloan Digital Sky Survey (SDSS) to successfully recover the redshift distribution of SDSS spectroscopic galaxies. We demonstrate how to combine photo-z with cross-correlation data to calibrate photometric redshift biases while marginalizing over possible clustering bias evolution in either the calibration or unknown photometric samples. We apply our method to DES Science Verification (DES SV) data in order to constrain the photometric redshift distribution of a galaxy sample selected for weak lensing studies, constraining the mean of the tomographic redshift distributions to a statistical uncertainty of Δz ∼ ±0.01. We forecast that our proposal can in principle control photometric redshift uncertainties in DES weak lensing experiments at a level near the intrinsic statistical noise of the experiment over the range of redshifts where redMaPPer clusters are available. Here, our results provide strong motivation to launch a program to fully characterize the systematic errors from bias evolution and photo-z shapes in our calibration procedure.
Low-Cost, High-Throughput Sequencing of DNA Assemblies Using a Highly Multiplexed Nextera Process.
Shapland, Elaine B; Holmes, Victor; Reeves, Christopher D; Sorokin, Elena; Durot, Maxime; Platt, Darren; Allen, Christopher; Dean, Jed; Serber, Zach; Newman, Jack; Chandran, Sunil
2015-07-17
In recent years, next-generation sequencing (NGS) technology has greatly reduced the cost of sequencing whole genomes, whereas the cost of sequence verification of plasmids via Sanger sequencing has remained high. Consequently, industrial-scale strain engineers either limit the number of designs or take short cuts in quality control. Here, we show that over 4000 plasmids can be completely sequenced in one Illumina MiSeq run for less than $3 each (15× coverage), which is a 20-fold reduction over using Sanger sequencing (2× coverage). We reduced the volume of the Nextera tagmentation reaction by 100-fold and developed an automated workflow to prepare thousands of samples for sequencing. We also developed software to track the samples and associated sequence data and to rapidly identify correctly assembled constructs having the fewest defects. As DNA synthesis and assembly become a centralized commodity, this NGS quality control (QC) process will be essential to groups operating high-throughput pipelines for DNA construction.
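A back-of-envelope check of why one MiSeq run can cover thousands of constructs at the stated depth; the run yield and average plasmid size below are assumptions, not figures from the paper.

```python
def mean_coverage(run_output_bp, n_constructs, construct_len_bp):
    """Expected mean per-base coverage when reads are spread evenly across constructs."""
    return run_output_bp / (n_constructs * construct_len_bp)

run_yield = 13.2e9      # bp per run, assumed for a 2x300 MiSeq v3 run
n_plasmids = 4000
plasmid_len = 8_000     # bp, assumed average construct size
print(f"~{mean_coverage(run_yield, n_plasmids, plasmid_len):.0f}x mean coverage available")
```

Even with these rough numbers the available depth far exceeds the 15x target, which suggests that library preparation and sample tracking, rather than read yield, are the limiting factors in such a pipeline.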
DOE Office of Scientific and Technical Information (OSTI.GOV)
L. M. Dittmer
2007-08-30
The 1607-B1 Septic System includes a septic tank, drain field, and associated connecting pipelines and influent sanitary sewer lines. This septic system serviced the former 1701-B Badgehouse, 1720-B Patrol Building/Change Room, and the 1709-B Fire Headquarters. The 1607-B1 waste site received unknown amounts of nonhazardous, nonradioactive sanitary sewage from these facilities during its operational history from 1944 to approximately 1970. In accordance with this evaluation, the confirmatory sampling results support a reclassification of this site to No Action. The current site conditions achieve the remedial action objectives and the corresponding remedial action goals established in the Remaining Sites ROD. The results of confirmatory sampling show that residual contaminant concentrations do not preclude any future uses and allow for unrestricted use of shallow zone soils. The results also demonstrate that residual contaminant concentrations are protective of groundwater and the Columbia River.
DECHADE: DEtecting slight Changes with HArd DEcisions in Wireless Sensor Networks
NASA Astrophysics Data System (ADS)
Ciuonzo, D.; Salvo Rossi, P.
2018-07-01
This paper focuses on the problem of change detection through a Wireless Sensor Network (WSN) whose nodes report only binary decisions (on the presence/absence of a certain event to be monitored), due to bandwidth/energy constraints. The resulting problem can be modelled as testing the equality of samples drawn from independent Bernoulli probability mass functions, when the bit probabilities under both hypotheses are not known. Both One-Sided (OS) and Two-Sided (TS) tests are considered, with reference to: (i) identical bit probability (a homogeneous scenario), (ii) different per-sensor bit probabilities (a non-homogeneous scenario) and (iii) regions with identical bit probability (a block-homogeneous scenario) for the observed samples. The goal is to provide a systematic framework collecting a plethora of viable detectors (designed via theoretically founded criteria) which can be used for each instance of the problem. Finally, verification of the derived detectors in two relevant WSN-related problems is provided to show the appeal of the proposed framework.
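For the homogeneous two-sided case described above, testing whether the bit probability changed between two windows of one-bit reports can be done with a generalized likelihood ratio statistic; the sketch below uses fabricated data and is only one of the many detectors such a framework collects.

```python
import numpy as np

def bernoulli_glr(before, after):
    """GLR statistic for H0: same bit probability in both windows (~ chi2 with 1 dof)."""
    b, a = np.asarray(before, float), np.asarray(after, float)

    def loglik(x, p):
        p = np.clip(p, 1e-12, 1.0 - 1e-12)
        return np.sum(x * np.log(p) + (1.0 - x) * np.log(1.0 - p))

    pooled = np.concatenate([b, a])
    return 2.0 * (loglik(b, b.mean()) + loglik(a, a.mean()) - loglik(pooled, pooled.mean()))

rng = np.random.default_rng(6)
before = rng.binomial(1, 0.10, 200)    # 200 one-bit sensor reports under the nominal condition
after = rng.binomial(1, 0.25, 200)     # slight change in the event probability
print(f"GLR = {bernoulli_glr(before, after):.2f} (compare with a chi2_1 threshold, e.g. 3.84 at 5%)")
```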
Analysis of lard in meatball broth using Fourier transform infrared spectroscopy and chemometrics.
Kurniawati, Endah; Rohman, Abdul; Triyana, Kuwat
2014-01-01
Meatball is one of the favorite foods in Indonesia. For economic reasons (the price difference), the substitution of beef with pork can occur. In this study, FTIR spectroscopy in combination with the chemometric methods of partial least squares (PLS) regression and principal component analysis (PCA) was used for analysis of pork fat (lard) in meatball broth. Lard in meatball broth was quantitatively determined in the wavenumber region of 1018-1284 cm⁻¹. The coefficient of determination (R²) and root mean square error of calibration (RMSEC) values obtained were 0.9975 and 1.34% (v/v), respectively. Furthermore, the classification of lard and beef fat in meatball broth, as well as in commercial samples, was performed in the wavenumber region of 1200-1000 cm⁻¹. The results showed that FTIR spectroscopy coupled with chemometrics can be used for quantitative analysis and classification of lard in meatball broth for Halal verification studies. The developed method is simple in operation, rapid, and does not involve extensive sample preparation. © 2013.
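A minimal sketch of the PLS calibration step described above, using simulated two-component spectra in place of measured FTIR absorbances; the component spectra, noise level, and number of latent variables are assumptions.

```python
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.metrics import mean_squared_error, r2_score

rng = np.random.default_rng(7)
n_samples, n_wavenumbers = 40, 200
conc = rng.uniform(0, 100, n_samples)            # % (v/v) lard, fabricated calibration levels
pure_lard = rng.normal(size=n_wavenumbers)       # stand-in "pure component" spectra
pure_beef = rng.normal(size=n_wavenumbers)
X = (np.outer(conc, pure_lard) + np.outer(100 - conc, pure_beef)) / 100.0
X += rng.normal(0, 0.05, X.shape)                # instrument noise

pls = PLSRegression(n_components=3).fit(X, conc)
pred = pls.predict(X).ravel()
rmsec = np.sqrt(mean_squared_error(conc, pred))
print(f"R^2 = {r2_score(conc, pred):.4f}, RMSEC = {rmsec:.2f} % (v/v)")
```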
The SeaHorn Verification Framework
NASA Technical Reports Server (NTRS)
Gurfinkel, Arie; Kahsai, Temesghen; Komuravelli, Anvesh; Navas, Jorge A.
2015-01-01
In this paper, we present SeaHorn, a software verification framework. The key distinguishing feature of SeaHorn is its modular design that separates the concerns of the syntax of the programming language, its operational semantics, and the verification semantics. SeaHorn encompasses several novelties: it (a) encodes verification conditions using an efficient yet precise inter-procedural technique, (b) provides flexibility in the verification semantics to allow different levels of precision, (c) leverages the state-of-the-art in software model checking and abstract interpretation for verification, and (d) uses Horn-clauses as an intermediate language to represent verification conditions which simplifies interfacing with multiple verification tools based on Horn-clauses. SeaHorn provides users with a powerful verification tool and researchers with an extensible and customizable framework for experimenting with new software verification techniques. The effectiveness and scalability of SeaHorn are demonstrated by an extensive experimental evaluation using benchmarks from SV-COMP 2015 and real avionics code.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Golshan, Maryam, E-mail: maryam.golshan@bccancer.bc.ca; Spadinger, Ingrid; Chng, Nick
2016-06-15
Purpose: Current methods of low dose rate brachytherapy source strength verification for sources preloaded into needles consist of either assaying a small number of seeds from a separate sample belonging to the same lot used to load the needles or performing batch assays of a subset of the preloaded seed trains. Both of these methods are cumbersome and have the limitations inherent to sampling. The purpose of this work was to investigate an alternative approach that uses an image-based, autoradiographic system capable of the rapid and complete assay of all sources without compromising sterility. Methods: The system consists of a flat panel image detector, an autoclavable needle holder, and software to analyze the detected signals. The needle holder was designed to maintain a fixed vertical spacing between the needles and the image detector, and to collimate the emissions from each seed. It also provides a sterile barrier between the needles and the imager. The image detector has a sufficiently large image capture area to allow several needles to be analyzed simultaneously. Several tests were performed to assess the accuracy and reproducibility of source strengths obtained using this system. Three different seed models (Oncura 6711 and 9011 125I seeds, and IsoAid Advantage 103Pd seeds) were used in the evaluations. Seeds were loaded into trains with at least 1 cm spacing. Results: Using our system, it was possible to obtain linear calibration curves with coverage factor k = 1 prediction intervals of less than ±2% near the centre of their range for the three source models. The uncertainty budget calculated from a combination of type A and type B estimates of potential sources of error was somewhat larger, yielding (k = 1) combined uncertainties for individual seed readings of 6.2% for 125I 6711 seeds, 4.7% for 125I 9011 seeds, and 11.0% for Advantage 103Pd seeds. Conclusions: This study showed that a flat panel detector dosimetry system is a viable option for source strength verification in preloaded needles, as it is capable of measuring all of the sources intended for implantation. Such a system has the potential to directly and efficiently estimate individual source strengths, the overall mean source strength, and the positions within the seed-spacer train.
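The calibration-curve idea described above (detector signal versus known source strength, with an approximate k = 1 interval for a new reading) can be sketched as follows; all numbers are synthetic, and the real system calibrates each seed model separately.

```python
import numpy as np

rng = np.random.default_rng(8)
strength = np.linspace(0.3, 0.7, 25)                      # known source strengths (arbitrary units)
signal = 1000.0 * strength * (1.0 + rng.normal(0, 0.015, strength.size))   # detector response

coef = np.polyfit(signal, strength, 1)                    # linear calibration curve
resid = strength - np.polyval(coef, signal)
s = resid.std(ddof=2)                                     # residual SD (two fitted parameters)

new_signal = 520.0                                        # reading from a seed under test
estimate = np.polyval(coef, new_signal)
print(f"estimated strength {estimate:.3f} ± {s:.3f} (approximate k = 1 interval)")
```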
Verification and characterization of chromosome duplication in haploid maize.
de Oliveira Couto, E G; Resende Von Pinho, E V; Von Pinho, R G; Veiga, A D; de Carvalho, M R; de Oliveira Bustamante, F; Nascimento, M S
2015-06-26
Doubled haploid technology has been used by various private companies. However, information regarding chromosome duplication methodologies, particularly those concerning techniques used to identify duplication in cells, is limited. Thus, we analyzed and characterized artificially doubled haploids using microsatellite molecular markers, pollen viability, and flow cytometry techniques. The evaluated material was obtained using two different chromosome duplication protocols in maize seeds considered haploids, resulting from the cross between the haploid inducer line KEMS and 4 hybrids (GNS 3225, GNS 3032, GNS 3264, and DKB 393). Fourteen days after duplication, plant samples were collected and assessed by flow cytometry. Further, the plants were transplanted to a field, and samples were collected for DNA analyses using microsatellite markers. The tassels were collected during anthesis for pollen viability analyses. Haploid, diploid, and mixoploid individuals were detected using flow cytometry, demonstrating that this technique was efficient for identifying doubled haploids. The microsatellite markers were also efficient for confirming the ploidies preselected by flow cytometry and for identifying homozygous individuals. Pollen viability showed a significant difference between the evaluated ploidies when the Alexander and propionic-carmine stains were used. The viability rates between the ploidies analyzed show potential for fertilization.
The two predominate sources of arsenic exposure are water and dietary ingestion. Dietary sources can easily exceed drinking water exposures based on "total" arsenic measurements. This can be deceiving because arsenic's toxicity is strongly dependent on its chemical form and the...
Federal Register 2010, 2011, 2012, 2013, 2014
2012-03-21
... children and provide low cost or free school lunch meals to qualified students through subsidies to schools... records to demonstrate compliance with the meal requirements. To the extent practicable, schools ensure... verification of a required sample size), the number of meals served, and data from required reviews conducted...
Verification of Cloud Forecasts over the Eastern Pacific Using Passive Satellite Retrievals
2009-10-01
with increasing sample area. Ebert (2008) reviews a number of these methods; some examples include upscaling (Zepeda-Arce et al. 2000), wavelet...evaluation of mesoscale simulations of the Algiers 2001 flash flood by the model-to-satellite approach. Adv. Geosci., 7, 247-250. Zepeda-Arce, J., E...
Evaluation of the IWS Model 6000 SBR began in April 2004 when one SBR was taken off line and cleaned. The verification testing started July 1, 2004 and proceeded without interruption through June 30, 2005. All sixteen four-day sampling events were completed as scheduled, yielding...
Michael C. Wiemann; Edgard O. Espinoza
2017-01-01
To evade endangered timber species laws, unscrupulous importers sometimes attempt to pass protected Dalbergia nigra as the look-alike but unprotected Dalbergia spruceana. Wood density and fluorescence properties are sometimes used to identify the species. Although these properties are useful and do not require special equipment,...
12 CFR 715.8 - Requirements for verification of accounts and passbooks.
Code of Federal Regulations, 2010 CFR
2010-01-01
... selection: (ii) A sample which is representative of the population from which it was selected; (iii) An equal chance of selecting each dollar in the population; (iv) Sufficient accounts in both number and... consistent with GAAS if such methods provide for: (i) Sufficient accounts in both number and scope on which...
Problems and Limitations in Studies on Screening for Language Delay
ERIC Educational Resources Information Center
Eriksson, Marten; Westerlund, Monica; Miniscalco, Carmela
2010-01-01
This study discusses six common methodological limitations in screening for language delay (LD) as illustrated in 11 recent studies. The limitations are (1) whether the studies define a target population, (2) whether the recruitment procedure is unbiased, (3) attrition, (4) verification bias, (5) small sample size and (6) inconsistencies in choice…
40 CFR 745.225 - Accreditation of training programs: target housing and child-occupied facilities.
Code of Federal Regulations, 2010 CFR
2010-07-01
... equipment to be used for lecture and hands-on training. (B) A copy of the course test blueprint for each..., the delivery of the lecture, course test, hands-on training, and assessment activities. This includes... containment and cleanup methods, and post-renovation cleaning verification. (vii) The dust sampling technician...
40 CFR 745.225 - Accreditation of training programs: target housing and child-occupied facilities.
Code of Federal Regulations, 2011 CFR
2011-07-01
... equipment to be used for lecture and hands-on training. (B) A copy of the course test blueprint for each..., the delivery of the lecture, course test, hands-on training, and assessment activities. This includes... containment and cleanup methods, and post-renovation cleaning verification. (vii) The dust sampling technician...
Code of Federal Regulations, 2010 CFR
2010-01-01
...) Analysis of samples. Laboratories may use any quantitative method for analysis of E. coli that is approved... FSIS in its nationwide microbiological baseline data collection programs and surveys. (Copies of... Surveys used in determining the prevalence of Salmonella on raw products are available in the FSIS Docket...
Code of Federal Regulations, 2011 CFR
2011-01-01
...) Analysis of samples. Laboratories may use any quantitative method for analysis of E. coli that is approved... FSIS in its nationwide microbiological baseline data collection programs and surveys. (Copies of... Surveys used in determining the prevalence of Salmonella on raw products are available in the FSIS Docket...
Code of Federal Regulations, 2014 CFR
2014-01-01
...) Analysis of samples. Laboratories may use any quantitative method for analysis of E. coli that is approved... FSIS in its nationwide microbiological baseline data collection programs and surveys. (Copies of... Surveys used in determining the prevalence of Salmonella on raw products are available in the FSIS Docket...
Code of Federal Regulations, 2012 CFR
2012-01-01
...) Analysis of samples. Laboratories may use any quantitative method for analysis of E. coli that is approved... FSIS in its nationwide microbiological baseline data collection programs and surveys. (Copies of... Surveys used in determining the prevalence of Salmonella on raw products are available in the FSIS Docket...
Code of Federal Regulations, 2013 CFR
2013-01-01
...) Analysis of samples. Laboratories may use any quantitative method for analysis of E. coli that is approved... FSIS in its nationwide microbiological baseline data collection programs and surveys. (Copies of... Surveys used in determining the prevalence of Salmonella on raw products are available in the FSIS Docket...
USDA-ARS?s Scientific Manuscript database
The USDA Food Safety and Inspection Service requires samples of raw broiler parts for performance standard verification for the detection of Campylobacter. Poultry processors must maintain process controls with Campylobacter prevalence levels below 7.7%. Establishments utilize antimicrobial processi...
Model Transformation for a System of Systems Dependability Safety Case
NASA Technical Reports Server (NTRS)
Murphy, Judy; Driskell, Steve
2011-01-01
The presentation reviews the dependability and safety effort of NASA's Independent Verification and Validation Facility. Topics include: safety engineering process, applications to non-space environment, Phase I overview, process creation, sample SRM artifact, Phase I end result, Phase II model transformation, fault management, and applying Phase II to individual projects.
40 CFR 1065.935 - Emission test sequence for field testing.
Code of Federal Regulations, 2014 CFR
2014-07-01
... verifications in emission calculations. (5) You may periodically condition and analyze batch samples in-situ... drift corrected results in emissions calculations. (6) Unless you weighed PM in-situ, such as by using... engine in-use until the engine coolant, block, or head absolute temperature is within ±10% of its mean...
40 CFR 1065.935 - Emission test sequence for field testing.
Code of Federal Regulations, 2013 CFR
2013-07-01
... verifications in emission calculations. (5) You may periodically condition and analyze batch samples in-situ... drift corrected results in emissions calculations. (6) Unless you weighed PM in-situ, such as by using... engine in-use until the engine coolant, block, or head absolute temperature is within ±10% of its mean...
40 CFR 1065.935 - Emission test sequence for field testing.
Code of Federal Regulations, 2012 CFR
2012-07-01
... verifications in emission calculations. (5) You may periodically condition and analyze batch samples in-situ... drift corrected results in emissions calculations. (6) Unless you weighed PM in-situ, such as by using... engine in-use until the engine coolant, block, or head absolute temperature is within ±10% of its mean...
40 CFR 1065.935 - Emission test sequence for field testing.
Code of Federal Regulations, 2010 CFR
2010-07-01
... verifications in emission calculations. (5) You may periodically condition and analyze batch samples in-situ... drift corrected results in emissions calculations. (6) Unless you weighed PM in-situ, such as by using... engine in-use until the engine coolant, block, or head absolute temperature is within ±10% of its mean...
40 CFR 1065.935 - Emission test sequence for field testing.
Code of Federal Regulations, 2011 CFR
2011-07-01
... verifications in emission calculations. (5) You may periodically condition and analyze batch samples in-situ... drift corrected results in emissions calculations. (6) Unless you weighed PM in-situ, such as by using... engine in-use until the engine coolant, block, or head absolute temperature is within ±10% of its mean...
Does the Community of Inquiry Framework Predict Outcomes in Online MBA Courses?
ERIC Educational Resources Information Center
Arbaugh, J. B.
2008-01-01
While Garrison and colleagues' (2000) Community of Inquiry (CoI) framework has generated substantial interest among online learning researchers, it has yet to be subjected to extensive quantitative verification or tested for external validity. Using a sample of students from 55 online MBA courses, the findings of this study suggest strong…
Enhancing pre-service physics teachers' creative thinking skills through HOT lab design
NASA Astrophysics Data System (ADS)
Malik, Adam; Setiawan, Agus; Suhandi, Andi; Permanasari, Anna
2017-08-01
A study on the implementation of the HOT (Higher Order Thinking) Laboratory has been carried out. This research aims to compare the gains in creative thinking skills of pre-service physics teachers who receive a physics lesson with the HOT Lab versus a verification lab for the topic of electric circuits. The research used a quasi-experimental method with a control group pretest-posttest design. The subjects were 40 pre-service physics teachers in the Physics Education program of UIN Sunan Gunung Djati Bandung. Research samples were selected by a class random sampling technique. Data on the pre-service physics teachers' creative thinking skills were collected using an essay-format test of creative thinking skills. The results reveal that the average N-gain of creative thinking skills was 0.69 for pre-service physics teachers who received the lesson with the HOT Lab design and 0.39 for those who received the lesson with the verification lab. Therefore, we conclude that the HOT Lab design is more effective at increasing creative thinking skills in the electric circuit lesson.
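The normalized gain (N-gain) figures quoted above are consistent with the commonly used Hake definition; as a sketch, assuming pretest and posttest scores expressed as percentages of the maximum score:

\[
\langle g \rangle = \frac{\%\,\text{posttest} - \%\,\text{pretest}}{100\% - \%\,\text{pretest}}
\]

Under Hake's customary bands (low < 0.3, medium 0.3-0.7, high > 0.7), both reported gains fall in the medium range, with the HOT Lab group just below the high-gain boundary.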
Tuerxunyiming, Muhadasi; Xian, Feng; Zi, Jin; Yimamu, Yilihamujiang; Abuduwayite, Reshalaiti; Ren, Yan; Li, Qidan; Abudula, Abulizi; Liu, SiQi; Mohemaiti, Patamu
2018-01-05
Maturity-onset diabetes of the young (MODY) is an inherited monogenic type of diabetes. Genetic mutations in MODY often cause nonsynonymous changes that directly lead to the functional distortion of proteins and pathological consequences. Herein, we proposed that the inherited mutations found in a MODY family could cause a disturbance of protein abundance, specifically in serum. Serum samples were collected from a Uyghur MODY family across three generations, and the serum proteins after depletion treatment were examined by quantitative proteomics to characterize MODY-related serum proteins, followed by verification using targeted proteomic quantification. A total of 32 serum proteins were preliminarily identified as MODY-related. A further verification test on the individual samples confirmed 12 candidates with significantly different abundance in the MODY patients. A comparison of the 12 proteins among the sera of type 1 diabetes, type 2 diabetes, MODY, and healthy subjects revealed a MODY-related protein signature composed of serum proteins such as SERPINA7, APOC4, LPA, C6, and F5.
Calibration of Ge gamma-ray spectrometers for complex sample geometries and matrices
NASA Astrophysics Data System (ADS)
Semkow, T. M.; Bradt, C. J.; Beach, S. E.; Haines, D. K.; Khan, A. J.; Bari, A.; Torres, M. A.; Marrantino, J. C.; Syed, U.-F.; Kitto, M. E.; Hoffman, T. J.; Curtis, P.
2015-11-01
A comprehensive study of the efficiency calibration and calibration verification of Ge gamma-ray spectrometers was performed using semi-empirical, computational Monte-Carlo (MC), and transfer methods. The aim of this study was to evaluate the accuracy of the quantification of gamma-emitting radionuclides in complex matrices normally encountered in environmental and food samples. A wide range of gamma energies from 59.5 to 1836.0 keV and geometries from a 10-mL jar to a 1.4-L Marinelli beaker were studied on four Ge spectrometers with relative efficiencies between 102% and 140%. Density and coincidence summing corrections were applied. Innovative techniques were developed for the preparation of artificial complex matrices from materials such as acidified water, polystyrene, ethanol, sugar, and sand, resulting in densities ranging from 0.3655 to 2.164 g cm⁻³. They were spiked with gamma activity traceable to international standards and used for calibration verifications. A quantitative method of tuning MC calculations to experiment was developed based on a multidimensional chi-square paraboloid.
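A hedged sketch of the paraboloid-tuning idea mentioned above: fit a quadratic surface to chi-square values evaluated at sampled MC tuning-parameter points, then solve for the minimum. The parameter names and values below are invented placeholders; only the chi-square-paraboloid fit follows the abstract.

# Tuning Monte-Carlo parameters by fitting a chi-square paraboloid (illustrative data).
import numpy as np

# Sampled tuning-parameter points (p1, p2) and the chi-square obtained for each
pts = np.array([[0.8, 1.9], [1.0, 2.0], [1.2, 2.1], [0.9, 2.2],
                [1.1, 1.8], [1.0, 2.2], [1.05, 2.05]])
chi2 = np.array([14.2, 9.1, 11.8, 12.5, 10.4, 13.0, 9.5])

# Quadratic model: chi2 ~ a + b1*p1 + b2*p2 + c11*p1^2 + c22*p2^2 + c12*p1*p2
p1, p2 = pts[:, 0], pts[:, 1]
A = np.column_stack([np.ones_like(p1), p1, p2, p1**2, p2**2, p1*p2])
coef, *_ = np.linalg.lstsq(A, chi2, rcond=None)
a, b1, b2, c11, c22, c12 = coef

# Minimum of the paraboloid: solve the 2x2 linear system from the zero gradient
H = np.array([[2*c11, c12], [c12, 2*c22]])
g = -np.array([b1, b2])
p_min = np.linalg.solve(H, g)
print("tuned parameters at chi-square minimum:", p_min)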
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bojechko, Casey; Phillps, Mark; Kalet, Alan
Purpose: Complex treatments in radiation therapy require robust verification in order to prevent errors that can adversely affect the patient. For this purpose, the authors estimate the effectiveness of detecting errors with a “defense in depth” system composed of electronic portal imaging device (EPID) based dosimetry and a software-based system composed of rules-based and Bayesian network verifications. Methods: The authors analyzed incidents with a high potential severity score, scored as a 3 or 4 on a 4 point scale, recorded in an in-house voluntary incident reporting system, collected from February 2012 to August 2014. The incidents were categorized into different failure modes. The detectability, defined as the number of incidents that are detectable divided by the total number of incidents, was calculated for each failure mode. Results: In total, 343 incidents were used in this study. Of the incidents, 67% were related to photon external beam therapy (EBRT). The majority of the EBRT incidents were related to patient positioning and only a small number of these could be detected by EPID dosimetry when performed prior to treatment (6%). A large fraction could be detected by in vivo dosimetry performed during the first fraction (74%). Rules-based and Bayesian network verifications were found to be complementary to EPID dosimetry, able to detect errors related to patient prescriptions and documentation, and errors unrelated to photon EBRT. Combining all of the verification steps together, 91% of all EBRT incidents could be detected. Conclusions: This study shows that the defense in depth system is potentially able to detect a large majority of incidents. The most effective EPID-based dosimetry verification is in vivo measurements during the first fraction and is complemented by rules-based and Bayesian network plan checking.
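As a hedged illustration of the detectability metric described above; the failure-mode names and counts below are invented placeholders, not the study's data.

# Detectability = detectable incidents / total incidents, per failure mode.
incidents = {
    "patient positioning": {"total": 120, "detectable": 89},
    "prescription/documentation": {"total": 60, "detectable": 55},
    "non-EBRT": {"total": 40, "detectable": 30},
}
for mode, c in incidents.items():
    detectability = c["detectable"] / c["total"]
    print(f"{mode}: {detectability:.0%}")

overall = (sum(c["detectable"] for c in incidents.values())
           / sum(c["total"] for c in incidents.values()))
print(f"overall: {overall:.0%}")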
Plasma protein absolute quantification by nano-LC Q-TOF UDMSE for clinical biomarker verification
ILIES, MARIA; IUGA, CRISTINA ADELA; LOGHIN, FELICIA; DHOPLE, VISHNU MUKUND; HAMMER, ELKE
2017-01-01
Background and aims: Proteome-based biomarker studies target proteins that could serve as diagnostic, prognostic, and predictive molecules. In the clinical routine, immunoassays are currently used for the absolute quantification of such biomarkers, with the major limitation that only one molecule can be targeted per assay. The aim of our study was to test a mass spectrometry based absolute quantification method for the verification of plasma protein sets which might serve as reliable biomarker panels for clinical practice. Methods: Six EDTA plasma samples were analyzed after tryptic digestion using a high throughput data independent acquisition nano-LC Q-TOF UDMSE proteomics approach. Synthetic Escherichia coli standard peptides were spiked in each sample for the absolute quantification. Data analysis was performed using ProgenesisQI v2.0 software (Waters Corporation). Results: Our method ensured absolute quantification of 242 non-redundant plasma proteins in a single-run analysis. The dynamic range covered was 10⁵. 86% of the proteins were classical plasma proteins. The overall median coefficient of variation was 0.36, while a set of 63 proteins was found to be highly stable. Absolute protein concentrations strongly correlated with values reported in the literature. Conclusions: Nano-LC Q-TOF UDMSE proteomic analysis can be used for a simple and rapid determination of absolute amounts of plasma proteins. A large number of plasma proteins could be analyzed, while a wide dynamic range was covered with a low coefficient of variation at the protein level. The method proved to be a reliable tool for the quantification of protein panels for biomarker verification in clinical practice. PMID:29151793
Landing System Development- Design and Test Prediction of a Lander Leg Using Nonlinear Analysis
NASA Astrophysics Data System (ADS)
Destefanis, Stefano; Buchwald, Robert; Pellegrino, Pasquale; Schroder, Silvio
2014-06-01
Several mission studies have been performed focusing on soft and precision landing using landing legs. Examples of such missions are Mars Sample Return scenarios (MSR), lunar landing scenarios (MoonNEXT, Lunar Lander) and small body sample return studies (Marco Polo, MMSR, Phootprint). Such missions foresee a soft landing on the planet surface to deliver payload in a controlled manner while limiting the landing loads. To ensure a successful final landing phase, a landing system is needed that is capable of absorbing the residual velocities (vertical, horizontal and angular) at touch-down and ensuring a controlled attitude after landing. Such requirements can be fulfilled by using landing legs with adequate damping. The Landing System Development (LSD) study, currently in its phase 2, foresees the design, analysis, verification, manufacturing and testing of a representative landing leg breadboard based on the Phase B design of the ESA Lunar Lander. Drop tests of a single leg will be performed on both rigid and soft ground, at several impact angles. The activity is covered under an ESA contract with TAS-I as prime contractor, responsible for analysis and verification, Astrium GmbH for design and test, and QinetiQ Space for manufacturing. Drop tests will be performed at the Institute of Space Systems of the German Aerospace Center (DLR-RY) in Bremen. This paper presents an overview of the analytical simulations (test predictions and design verification) performed, comparing the results produced by the Astrium-made multi-body model (rigid bodies, with nonlinearities accounted for in mechanical joints and force definitions, based on development tests) and the TAS-I-made nonlinear explicit model (fully deformable bodies).
Winslow, Stephen D; Pepich, Barry V; Martin, John J; Hallberg, George R; Munch, David J; Frebis, Christopher P; Hedrick, Elizabeth J; Krop, Richard A
2006-01-01
The United States Environmental Protection Agency's Office of Ground Water and Drinking Water has developed a single-laboratory quantitation procedure: the lowest concentration minimum reporting level (LCMRL). The LCMRL is the lowest true concentration for which future recovery is predicted to fall, with high confidence (99%), between 50% and 150%. The procedure takes into account precision and accuracy. Multiple concentration replicates are processed through the entire analytical method and the data are plotted as measured sample concentration (y-axis) versus true concentration (x-axis). If the data support an assumption of constant variance over the concentration range, an ordinary least-squares regression line is drawn; otherwise, a variance-weighted least-squares regression is used. Prediction interval lines of 99% confidence are drawn about the regression. At the points where the prediction interval lines intersect with data quality objective lines of 50% and 150% recovery, lines are dropped to the x-axis. The higher of the two values is the LCMRL. The LCMRL procedure is flexible because the data quality objectives (50-150%) and the prediction interval confidence (99%) can be varied to suit program needs. The LCMRL determination is performed during method development only. A simpler procedure for verification of data quality objectives at a given minimum reporting level (MRL) is also presented. The verification procedure requires a single set of seven samples taken through the entire method procedure. If the calculated prediction interval is contained within data quality recovery limits (50-150%), the laboratory performance at the MRL is verified.
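A minimal sketch of the LCMRL logic described above, assuming constant variance (ordinary least squares) and invented spike data; the EPA procedure's full statistical machinery, including the variance-weighted case, is more involved than this illustration.

# Fit measured vs. true concentration, build 99% prediction intervals, and find
# the lowest true concentration whose interval stays within 50%-150% recovery.
import numpy as np
from scipy import stats

true = np.repeat([0.5, 1.0, 2.0, 4.0, 8.0], 4)                        # spiked true concentrations
meas = true * np.random.default_rng(1).normal(1.0, 0.12, true.size)   # simulated recoveries

n = true.size
b, a = np.polyfit(true, meas, 1)                                      # slope, intercept
resid = meas - (a + b * true)
s = np.sqrt(np.sum(resid**2) / (n - 2))
t99 = stats.t.ppf(0.995, n - 2)                                       # two-sided 99%
xbar, sxx = true.mean(), np.sum((true - true.mean())**2)

def prediction_interval(x0):
    se = s * np.sqrt(1 + 1/n + (x0 - xbar)**2 / sxx)
    yhat = a + b * x0
    return yhat - t99 * se, yhat + t99 * se

# The approximate LCMRL is the lowest candidate concentration whose 99% prediction
# interval lies entirely within 50%-150% of the true value.
for x0 in np.linspace(0.1, 8.0, 800):
    lo, hi = prediction_interval(x0)
    if lo >= 0.5 * x0 and hi <= 1.5 * x0:
        print(f"approximate LCMRL: {x0:.2f}")
        break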
Galaxy-galaxy lensing in the Dark Energy Survey Science Verification data
NASA Astrophysics Data System (ADS)
Clampitt, J.; Sánchez, C.; Kwan, J.; Krause, E.; MacCrann, N.; Park, Y.; Troxel, M. A.; Jain, B.; Rozo, E.; Rykoff, E. S.; Wechsler, R. H.; Blazek, J.; Bonnett, C.; Crocce, M.; Fang, Y.; Gaztanaga, E.; Gruen, D.; Jarvis, M.; Miquel, R.; Prat, J.; Ross, A. J.; Sheldon, E.; Zuntz, J.; Abbott, T. M. C.; Abdalla, F. B.; Armstrong, R.; Becker, M. R.; Benoit-Lévy, A.; Bernstein, G. M.; Bertin, E.; Brooks, D.; Burke, D. L.; Carnero Rosell, A.; Carrasco Kind, M.; Cunha, C. E.; D'Andrea, C. B.; da Costa, L. N.; Desai, S.; Diehl, H. T.; Dietrich, J. P.; Doel, P.; Estrada, J.; Evrard, A. E.; Fausti Neto, A.; Flaugher, B.; Fosalba, P.; Frieman, J.; Gruendl, R. A.; Honscheid, K.; James, D. J.; Kuehn, K.; Kuropatkin, N.; Lahav, O.; Lima, M.; March, M.; Marshall, J. L.; Martini, P.; Melchior, P.; Mohr, J. J.; Nichol, R. C.; Nord, B.; Plazas, A. A.; Romer, A. K.; Sanchez, E.; Scarpine, V.; Schubnell, M.; Sevilla-Noarbe, I.; Smith, R. C.; Soares-Santos, M.; Sobreira, F.; Suchyta, E.; Swanson, M. E. C.; Tarle, G.; Thomas, D.; Vikram, V.; Walker, A. R.
2017-03-01
We present galaxy-galaxy lensing results from 139 deg2 of Dark Energy Survey (DES) Science Verification (SV) data. Our lens sample consists of red galaxies, known as redMaGiC, which are specifically selected to have a low photometric redshift error and outlier rate. The lensing measurement has a total signal-to-noise ratio of 29 over scales 0.09 < R < 15 Mpc h-1, including all lenses over a wide redshift range 0.2 < z < 0.8. Dividing the lenses into three redshift bins for this constant moving number density sample, we find no evidence for evolution in the halo mass with redshift. We obtain consistent results for the lensing measurement with two independent shear pipelines, NGMIX and IM3SHAPE. We perform a number of null tests on the shear and photometric redshift catalogues and quantify resulting systematic uncertainties. Covariances from jackknife subsamples of the data are validated with a suite of 50 mock surveys. The result and systematic checks in this work provide a critical input for future cosmological and galaxy evolution studies with the DES data and redMaGiC galaxy samples. We fit a halo occupation distribution (HOD) model, and demonstrate that our data constrain the mean halo mass of the lens galaxies, despite strong degeneracies between individual HOD parameters.
The clinical impact of recent advances in LC-MS for cancer biomarker discovery and verification
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wang, Hui; Shi, Tujin; Qian, Wei-Jun
2015-12-04
Mass spectrometry-based proteomics has become an indispensable tool in biomedical research, with broad applications spanning fundamental biology, systems biology, and biomarker discovery. Recent advances in LC-MS have made it a major technology in clinical applications, especially in cancer biomarker discovery and verification. To overcome the challenges associated with the analysis of clinical samples, such as the extremely wide dynamic range of protein concentrations in biofluids and the need to perform high-throughput and accurate quantification, significant efforts have been devoted to improving the overall performance of LC-MS-based clinical proteomics. In this review, we summarize the recent advances in LC-MS for cancer biomarker discovery and quantification, and discuss its potential, limitations, and future perspectives.
Ospina, Raydonal; Frery, Alejandro C.
2016-01-01
We present a new approach for handwritten signature classification and verification based on descriptors stemming from time causal information theory. The proposal uses the Shannon entropy, the statistical complexity, and the Fisher information evaluated over the Bandt and Pompe symbolization of the horizontal and vertical coordinates of signatures. These six features are easy and fast to compute, and they are the input to a One-Class Support Vector Machine classifier. The results are better than state-of-the-art online techniques that employ higher-dimensional feature spaces which often require specialized software and hardware. We assess the consistency of our proposal with respect to the size of the training sample, and we also use it to classify the signatures into meaningful groups. PMID:27907014
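A hedged sketch of the feature pipeline described above: Bandt-Pompe ordinal-pattern probabilities of a coordinate series, the Shannon entropy computed from them, and a One-Class SVM trained on genuine signatures. The statistical-complexity and Fisher-information features are omitted, and the "signatures" below are synthetic placeholders.

# Bandt-Pompe permutation probabilities and normalized Shannon entropy per coordinate,
# fed as features to a One-Class SVM. Simplified, illustrative only.
import itertools
import numpy as np
from sklearn.svm import OneClassSVM

def bandt_pompe_probs(series, d=3):
    """Probability of each ordinal pattern of embedding dimension d."""
    patterns = {p: 0 for p in itertools.permutations(range(d))}
    for i in range(len(series) - d + 1):
        window = series[i:i + d]
        key = tuple(int(k) for k in np.argsort(window))
        patterns[key] += 1
    counts = np.array(list(patterns.values()), dtype=float)
    return counts / counts.sum()

def normalized_shannon_entropy(probs):
    nz = probs[probs > 0]
    return -np.sum(nz * np.log(nz)) / np.log(len(probs))

def features(x_coords, y_coords):
    return [normalized_shannon_entropy(bandt_pompe_probs(np.asarray(c)))
            for c in (x_coords, y_coords)]

# Hypothetical genuine signatures: pairs of (x, y) coordinate series
rng = np.random.default_rng(0)
genuine = [(rng.normal(size=200).cumsum(), rng.normal(size=200).cumsum()) for _ in range(20)]
X_train = np.array([features(x, y) for x, y in genuine])

clf = OneClassSVM(nu=0.1, gamma="scale").fit(X_train)
query = features(rng.normal(size=200).cumsum(), rng.normal(size=200).cumsum())
print("genuine" if clf.predict([query])[0] == 1 else "rejected")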
The clinical impact of recent advances in LC-MS for cancer biomarker discovery and verification.
Wang, Hui; Shi, Tujin; Qian, Wei-Jun; Liu, Tao; Kagan, Jacob; Srivastava, Sudhir; Smith, Richard D; Rodland, Karin D; Camp, David G
2016-01-01
Mass spectrometry (MS) -based proteomics has become an indispensable tool with broad applications in systems biology and biomedical research. With recent advances in liquid chromatography (LC) and MS instrumentation, LC-MS is making increasingly significant contributions to clinical applications, especially in the area of cancer biomarker discovery and verification. To overcome challenges associated with analyses of clinical samples (for example, a wide dynamic range of protein concentrations in bodily fluids and the need to perform high throughput and accurate quantification of candidate biomarker proteins), significant efforts have been devoted to improve the overall performance of LC-MS-based clinical proteomics platforms. Reviewed here are the recent advances in LC-MS and its applications in cancer biomarker discovery and quantification, along with the potential, limitations and future perspectives.
Why do verification and validation?
Hu, Kenneth T.; Paez, Thomas L.
2016-02-19
In this discussion paper, we explore different ways to assess the value of verification and validation (V&V) of engineering models. We first present a literature review on the value of V&V and then use value chains and decision trees to show how value can be assessed from a decision maker's perspective. In this context, the value is what the decision maker is willing to pay for V&V analysis with the understanding that the V&V results are uncertain. Finally, the 2014 Sandia V&V Challenge Workshop is used to illustrate these ideas.
Model-based engineering for medical-device software.
Ray, Arnab; Jetley, Raoul; Jones, Paul L; Zhang, Yi
2010-01-01
This paper demonstrates the benefits of adopting model-based design techniques for engineering medical device software. By using a patient-controlled analgesic (PCA) infusion pump as a candidate medical device, the authors show how using models to capture design information allows for i) fast and efficient construction of executable device prototypes, ii) creation of a standard, reusable baseline software architecture for a particular device family, iii) formal verification of the design against safety requirements, and iv) creation of a safety framework that reduces verification costs for future versions of the device software.
Bounded Parametric Model Checking for Elementary Net Systems
NASA Astrophysics Data System (ADS)
Knapik, Michał; Szreter, Maciej; Penczek, Wojciech
Bounded Model Checking (BMC) is an efficient verification method for reactive systems. So far, BMC has been applied to the verification of properties expressed in (timed) modal logics, but never to their parametric extensions. In this paper we show, for the first time, that BMC can be extended to PRTECTL - a parametric extension of the existential version of CTL. To this end, we define a bounded semantics and a translation from PRTECTL to SAT. The implementation of the algorithm for Elementary Net Systems is presented, together with some experimental results.
Simulation-based MDP verification for leading-edge masks
NASA Astrophysics Data System (ADS)
Su, Bo; Syrel, Oleg; Pomerantsev, Michael; Hagiwara, Kazuyuki; Pearman, Ryan; Pang, Leo; Fujimara, Aki
2017-07-01
For IC design starts below the 20nm technology node, the assist features on photomasks shrink well below 60nm, and the printed patterns of those features on masks written by VSB eBeam writers start to show a large deviation from the mask designs. Traditional geometry-based fracturing starts to show large errors for those small features. As a result, other mask data preparation (MDP) methods have become available and been adopted, such as rule-based Mask Process Correction (MPC), model-based MPC and eventually model-based MDP. The new MDP methods may place shot edges slightly differently from the target to compensate for mask process effects, so that the final patterns on a mask are much closer to the design (which can be viewed as the ideal mask), especially for those assist features. Such an alteration generally produces better masks that are closer to the intended mask design. Traditional XOR-based MDP verification cannot detect problems caused by eBeam effects. Much like model-based OPC verification, which became a necessity for OPC a decade ago, we see the same trend in MDP today. A simulation-based MDP verification solution requires a GPU-accelerated computational geometry engine with simulation capabilities. To have a meaningful simulation-based mask check, a good mask process model is needed. The TrueModel® system is a field-tested physical mask model developed by D2S. The GPU-accelerated D2S Computational Design Platform (CDP) is used to run simulation-based mask checks, as well as model-based MDP. In addition to simulation-based checks such as mask EPE or dose margin, geometry-based rules are also available to detect quality issues such as slivers or CD splits. Dose-margin-related hotspots can also be detected by setting a correct detection threshold. In this paper, we will demonstrate GPU acceleration for geometry processing and give examples of mask check results and performance data. GPU acceleration is necessary to make simulation-based mask MDP verification acceptable.
1987-05-29
[Fig. 1 caption: Experimental setup. P.S.D.: phase-sensitive detector; V.C.X.O.: voltage-controlled crystal oscillator; A: post-detector amplifier.] ... Samples over the sampling period used in the experimental verification were obtained using a pair of frequency counters controlled by a desk-top computer ... to reduce the effect of group delay changes. The first method can be implemented by actively or passively controlling the environment around ...
GHG MITIGATION TECHNOLOGY PERFORMANCE EVALUATIONS UNDERWAY AT THE GHG TECHNOLOGY VERIFICATION CENTER
The paper outlines the verification approach and activities of the Greenhouse Gas (GHG) Technology Verification Center, one of 12 independent verification entities operating under the U.S. EPA-sponsored Environmental Technology Verification (ETV) program. (NOTE: The ETV program...
Reachability analysis of real-time systems using time Petri nets.
Wang, J; Deng, Y; Xu, G
2000-01-01
Time Petri nets (TPNs) are a popular Petri net model for specification and verification of real-time systems. A fundamental and most widely applied method for analyzing Petri nets is reachability analysis. The existing technique for reachability analysis of TPNs, however, is not suitable for timing property verification because one cannot derive end-to-end delay in task execution, an important issue for time-critical systems, from the reachability tree constructed using the technique. In this paper, we present a new reachability based analysis technique for TPNs for timing property analysis and verification that effectively addresses the problem. Our technique is based on a concept called clock-stamped state class (CS-class). With the reachability tree generated based on CS-classes, we can directly compute the end-to-end time delay in task execution. Moreover, a CS-class can be uniquely mapped to a traditional state class based on which the conventional reachability tree is constructed. Therefore, our CS-class-based analysis technique is more general than the existing technique. We show how to apply this technique to timing property verification of the TPN model of a command and control (C2) system.
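As a hedged illustration of the reachability-analysis idea underlying the paper, the sketch below explores the reachable markings of an ordinary, untimed place/transition net; the clock-stamped state classes (CS-classes) that handle TPN timing and end-to-end delays are not modeled here, and the net itself is a made-up example.

# Breadth-first construction of the reachability set of a simple place/transition net.
from collections import deque

# Each transition: (tokens consumed per place, tokens produced per place)
transitions = {
    "t1": ((1, 0, 0), (0, 1, 0)),
    "t2": ((0, 1, 0), (0, 0, 1)),
    "t3": ((0, 0, 1), (1, 0, 0)),
}

def enabled(marking, pre):
    return all(m >= p for m, p in zip(marking, pre))

def fire(marking, pre, post):
    return tuple(m - p + q for m, p, q in zip(marking, pre, post))

def reachability(initial):
    seen, queue, edges = {initial}, deque([initial]), []
    while queue:
        m = queue.popleft()
        for name, (pre, post) in transitions.items():
            if enabled(m, pre):
                m2 = fire(m, pre, post)
                edges.append((m, name, m2))
                if m2 not in seen:
                    seen.add(m2)
                    queue.append(m2)
    return seen, edges

markings, edges = reachability((1, 0, 0))
print(len(markings), "reachable markings")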
Power Performance Verification of a Wind Farm Using the Friedman's Test.
Hernandez, Wilmar; López-Presa, José Luis; Maldonado-Correa, Jorge L
2016-06-03
In this paper, a method of verification of the power performance of a wind farm is presented. This method is based on Friedman's test, which is a nonparametric statistical inference technique, and it uses the information that is collected by the SCADA system from the sensors embedded in the wind turbines in order to carry out the power performance verification of a wind farm. Here, the guaranteed power curve of the wind turbines is used as one more wind turbine of the wind farm under assessment, and a multiple comparison method is used to investigate differences between pairs of wind turbines with respect to their power performance. The proposed method says whether the power performance of the specific wind farm under assessment differs significantly from what would be expected, and it also allows wind farm owners to know whether their wind farm has either a perfect power performance or an acceptable power performance. Finally, the power performance verification of an actual wind farm is carried out. The results of the application of the proposed method showed that the power performance of the specific wind farm under assessment was acceptable.
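A hedged sketch of applying Friedman's test to wind-farm power data, assuming SCADA power measurements have already been averaged per wind-speed bin; the turbine names and values are invented, and the multiple-comparison step used in the paper is not shown.

# Friedman's test across turbines, treating wind-speed bins as blocks.
from scipy.stats import friedmanchisquare

# Each list is one turbine's mean power output (kW) per wind-speed bin; the
# guaranteed power curve is appended as one more "turbine", as the paper describes.
turbine_a = [110, 340, 720, 1150, 1490]
turbine_b = [105, 330, 700, 1120, 1460]
turbine_c = [ 90, 300, 650, 1050, 1380]
guaranteed = [112, 345, 730, 1160, 1500]

stat, p = friedmanchisquare(turbine_a, turbine_b, turbine_c, guaranteed)
print(f"Friedman chi-square = {stat:.2f}, p-value = {p:.4f}")
if p < 0.05:
    print("power performance differs significantly among turbines")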
Trajectory Based Behavior Analysis for User Verification
NASA Astrophysics Data System (ADS)
Pao, Hsing-Kuo; Lin, Hong-Yi; Chen, Kuan-Ta; Fadlil, Junaidillah
Many of our activities on computers need a verification step for authorized access. The goal of verification is to tell apart the true account owner from intruders. We propose a general approach for user verification based on user trajectory inputs. The approach is labor-free for users and is likely to avoid possible copying or simulation by other non-authorized users or even automatic programs like bots. Our study focuses on finding the hidden patterns embedded in the trajectories produced by account users. We employ a Markov chain model with Gaussian distributions in its transitions to describe the behavior in the trajectory. To distinguish between two trajectories, we propose a novel dissimilarity measure combined with a manifold-learnt tuning for capturing the pairwise relationship. Based on the pairwise relationship, we can plug in any effective classification or clustering method for the detection of unauthorized access. The method can also be applied to the task of recognition, predicting the trajectory type without a pre-defined identity. Given a trajectory input, the results show that the proposed method can accurately verify the user identity, or suggest who owns the trajectory if the input identity is not provided.
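A hedged, greatly simplified sketch of the trajectory-modeling idea: fit a single Gaussian over the successive displacement vectors of an account owner's trajectories and score a new trajectory by its average log-likelihood. The paper's full Markov chain with per-state Gaussian transitions, manifold-tuned dissimilarity, and classifier stage are not reproduced; all data are synthetic.

# Gaussian model of trajectory displacements for a crude verification decision.
import numpy as np
from scipy.stats import multivariate_normal

def displacements(traj):
    return np.diff(np.asarray(traj, dtype=float), axis=0)

rng = np.random.default_rng(0)
owner_trajs = [np.cumsum(rng.normal([1.0, 0.2], 0.3, (100, 2)), axis=0) for _ in range(10)]

steps = np.vstack([displacements(t) for t in owner_trajs])
mean, cov = steps.mean(axis=0), np.cov(steps, rowvar=False)
model = multivariate_normal(mean, cov)

def score(traj):
    return model.logpdf(displacements(traj)).mean()

threshold = min(score(t) for t in owner_trajs)   # crude acceptance threshold
query = np.cumsum(rng.normal([0.0, 1.0], 0.3, (100, 2)), axis=0)  # different behavior
print("accepted" if score(query) >= threshold else "rejected")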
Power Performance Verification of a Wind Farm Using the Friedman’s Test
Hernandez, Wilmar; López-Presa, José Luis; Maldonado-Correa, Jorge L.
2016-01-01
In this paper, a method of verification of the power performance of a wind farm is presented. This method is based on Friedman's test, which is a nonparametric statistical inference technique, and it uses the information that is collected by the SCADA system from the sensors embedded in the wind turbines in order to carry out the power performance verification of a wind farm. Here, the guaranteed power curve of the wind turbines is used as one more wind turbine of the wind farm under assessment, and a multiple comparison method is used to investigate differences between pairs of wind turbines with respect to their power performance. The proposed method says whether the power performance of the specific wind farm under assessment differs significantly from what would be expected, and it also allows wind farm owners to know whether their wind farm has either a perfect power performance or an acceptable power performance. Finally, the power performance verification of an actual wind farm is carried out. The results of the application of the proposed method showed that the power performance of the specific wind farm under assessment was acceptable. PMID:27271628
Secure voice-based authentication for mobile devices: vaulted voice verification
NASA Astrophysics Data System (ADS)
Johnson, R. C.; Scheirer, Walter J.; Boult, Terrance E.
2013-05-01
As the use of biometrics becomes more widespread, the privacy concerns that stem from the use of biometrics are becoming more apparent. As the usage of mobile devices grows, so does the desire to implement biometric identification into such devices. A large majority of mobile devices being used are mobile phones. While work is being done to implement different types of biometrics into mobile phones, such as photo-based biometrics, voice is a more natural choice. The idea of voice as a biometric identifier has been around a long time. One of the major concerns with using voice as an identifier is the instability of voice. We have developed a protocol that addresses those instabilities and preserves privacy. This paper describes a novel protocol that allows a user to authenticate using voice on a mobile/remote device without compromising their privacy. We first discuss the Vaulted Verification protocol, which has recently been introduced in research literature, and then describe its limitations. We then introduce a novel adaptation and extension of the Vaulted Verification protocol to voice, dubbed Vaulted Voice Verification (V3). Following that, we show a performance evaluation and then conclude with a discussion of security and future work.
Feasibility of conducting wetfall chemistry investigations around the Bowen Power Plant
DOE Office of Scientific and Technical Information (OSTI.GOV)
Chen, N.C.J.; Patrinos, A.A.N.
1979-10-01
The feasibility of expanding the Meteorological Effects of Thermal Energy Releases - Oak Ridge National Laboratory (METER-ORNL) research at Bowen Power Plant, a coal-fired power plant in northwest Georgia, to include wetfall chemistry is evaluated using results of similar studies around other power plants, several atmospheric washout models, analysis of spatial variability in precipitation, and field logistical considerations. An optimal wetfall chemistry network design is proposed, incorporating the inner portion of the existing rain-gauge network and augmented by additional sites to ensure adequate coverage of probable target areas. The predicted sulfate production rate differs by about four orders of magnitude among the models reviewed at a pH of 3. No model can claim superiority over any other model without substantive data verification. The spatial uniformity in rain amount is evaluated using four storms that occurred at the METER-ORNL network. Values of spatial variability ranged from 8 to 31% and decreased as the mean rainfall increased. The field study of wetfall chemistry will require a minimum of 5 persons to operate the approximately 50 collectors covering an area of 740 km². Preliminary wetfall-only samples collected on an event basis showed lower pH and higher electrical conductivity of precipitation collected about 5 km downwind of the power plant relative to samples collected upwind. Wetfall samples collected on a weekly basis using automatic samplers, however, showed variable results, with no consistent pattern. This suggests the need for event sampling to minimize variable rain volume and multiple-source effects often associated with weekly samples.
Raiszadeh, Michelle M.; Ross, Mark M.; Russo, Paul S.; Schaepper, Mary Ann H.; Zhou, Weidong; Deng, Jianghong; Ng, Daniel; Dickson, April; Dickson, Cindy; Strom, Monica; Osorio, Carolina; Soeprono, Thomas; Wulfkuhle, Julia D.; Kabbani, Nadine; Petricoin, Emanuel F.; Liotta, Lance A.; Kirsch, Wolff M.
2012-01-01
Liquid chromatography tandem mass spectrometry (LC-MS/MS) and multiple reaction monitoring mass spectrometry (MRM-MS) proteomics analyses were performed on eccrine sweat of healthy controls, and the results were compared with those from individuals diagnosed with schizophrenia (SZ). This is the first large scale study of the sweat proteome. First, we performed LC-MS/MS on pooled SZ samples and pooled control samples for global proteomics analysis. Results revealed a high abundance of diverse proteins and peptides in eccrine sweat. Most of the proteins identified from sweat samples were found to be different than the most abundant proteins from serum, which indicates that eccrine sweat is not simply a plasma transudate, and may thereby be a source of unique disease-associated biomolecules. A second independent set of patient and control sweat samples were analyzed by LC-MS/MS and spectral counting to determine qualitative protein differential abundances between the control and disease groups. Differential abundances of selected proteins, initially determined by spectral counting, were verified by MRM-MS analyses. Seventeen proteins showed a differential abundance of approximately two-fold or greater between the SZ pooled sample and the control pooled sample. This study demonstrates the utility of LC-MS/MS and MRM-MS as a viable strategy for the discovery and verification of potential sweat protein disease biomarkers. PMID:22256890
A DVE Time Management Simulation and Verification Platform Based on Causality Consistency Middleware
NASA Astrophysics Data System (ADS)
Zhou, Hangjun; Zhang, Wei; Peng, Yuxing; Li, Sikun
During the course of designing a time management algorithm for DVEs, researchers often become inefficient because they are distracted by having to realize the trivial but fundamental details of simulation and verification. A platform that already implements these details is therefore desirable. However, to our knowledge this has not been achieved in any published work. In this paper, we are the first to design and realize a DVE time management simulation and verification platform providing exactly the same interfaces as those defined by the HLA Interface Specification. Moreover, our platform is based on a newly designed causality consistency middleware and offers a comparison of three kinds of time management services: CO, RO and TSO. The experimental results show that the implementation of the platform incurs only a small overhead, and that its efficient performance allows researchers to focus solely on improving their algorithm designs.
Synesthesia affects verification of simple arithmetic equations.
Ghirardelli, Thomas G; Mills, Carol Bergfeld; Zilioli, Monica K C; Bailey, Leah P; Kretschmar, Paige K
2010-01-01
To investigate the effects of color-digit synesthesia on numerical representation, we presented a synesthete (referred to here as SE) and controls with mathematical equations for verification. In Experiment 1, SE verified addition equations made up of digits that either matched or mismatched her color-digit photisms or were in black. In Experiment 2A, the addends were presented in the different color conditions and the solution was presented in black, whereas in Experiment 2B the addends were presented in black and the solutions were presented in the different color conditions. In Experiment 3, multiplication and division equations were presented in the same color conditions as in Experiment 1. SE responded significantly faster to equations that matched her photisms than to those that did not; controls did not show this effect. These results suggest that photisms influence the processing of digits in arithmetic verification, replicating and extending previous findings.
Study of the penetration of a plate made of titanium alloy VT6 with a steel ball
NASA Astrophysics Data System (ADS)
Buzyurkin, A. E.
2018-03-01
The purpose of this work is the development and verification of mathematical relationships, adapted to the LS-DYNA finite element analysis package, that describe the deformation and destruction of a titanium plate in a high-speed collision. Using data from experiments on the interaction of a steel ball with a titanium plate made of VT6 alloy, the available constants needed to describe the behavior of the material with the Johnson-Cook relationships were verified, as were the parameters of the fracture model used in the numerical modeling of the collision process. An analysis of experimental data on the interaction of a spherical impactor with a plate showed that the deformation-hardening data accepted for VT6 alloy in a first approximation in the Johnson-Cook model overestimate the residual velocities of the impactor when piercing the plate.
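A hedged sketch of the Johnson-Cook flow-stress relationship referred to above; the constants shown are placeholders of roughly the right order of magnitude for a Ti-6Al-4V-class alloy, not the verified VT6 values from the paper.

# Johnson-Cook flow stress:
#   sigma = (A + B*eps^n) * (1 + C*ln(eps_dot/eps_dot_0)) * (1 - T_star^m)
# with T_star = (T - T_room) / (T_melt - T_room). Constants are illustrative only.
import numpy as np

A, B, n = 1000e6, 780e6, 0.47            # Pa, Pa, -  (placeholder values)
C, eps_dot_0 = 0.028, 1.0                # -, 1/s
m, T_room, T_melt = 1.0, 293.0, 1900.0   # -, K, K

def johnson_cook_stress(eps, eps_dot, T):
    T_star = (T - T_room) / (T_melt - T_room)
    return (A + B * eps**n) * (1.0 + C * np.log(eps_dot / eps_dot_0)) * (1.0 - T_star**m)

print(f"{johnson_cook_stress(0.05, 1e3, 400.0) / 1e6:.0f} MPa")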
An effective one-dimensional anisotropic fingerprint enhancement algorithm
NASA Astrophysics Data System (ADS)
Ye, Zhendong; Xie, Mei
2012-01-01
Fingerprint identification is one of the most important biometric technologies. The performance of minutiae extraction and the speed of a fingerprint verification system rely heavily on the quality of the input fingerprint images, so the enhancement of low-quality fingerprints is a critical and difficult step in a fingerprint verification system. In this paper we propose an effective algorithm for fingerprint enhancement. First, we use a normalization algorithm to reduce the variations in gray level values along ridges and valleys. Then we utilize the structure tensor approach to estimate the orientation at each pixel of the fingerprint. Finally, we propose a novel algorithm that combines the advantages of the one-dimensional Gabor filtering method and the anisotropic method to enhance the fingerprint in the recoverable region. The proposed algorithm has been evaluated on the database of the Fingerprint Verification Competition 2004, and the results show that our algorithm performs well in less time.
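A hedged sketch of the first two steps described above (mean/variance normalization and structure-tensor orientation estimation); the one-dimensional Gabor/anisotropic enhancement stage is not shown, and the image used is a random placeholder.

# Normalization to a target mean/variance, then ridge orientation from the
# structure tensor (smoothed products of image gradients).
import numpy as np
from scipy import ndimage

def normalize(img, m0=100.0, v0=100.0):
    m, v = img.mean(), img.var()
    return np.where(img > m, m0 + np.sqrt(v0 * (img - m)**2 / v),
                             m0 - np.sqrt(v0 * (img - m)**2 / v))

def ridge_orientation(img, sigma=7):
    gx = ndimage.sobel(img, axis=1)
    gy = ndimage.sobel(img, axis=0)
    gxx = ndimage.gaussian_filter(gx * gx, sigma)
    gyy = ndimage.gaussian_filter(gy * gy, sigma)
    gxy = ndimage.gaussian_filter(gx * gy, sigma)
    # Ridge orientation is perpendicular to the dominant gradient direction
    return 0.5 * np.arctan2(2 * gxy, gxx - gyy) + np.pi / 2

img = np.random.default_rng(0).random((128, 128))  # placeholder fingerprint image
theta = ridge_orientation(normalize(img.astype(float)))
print(theta.shape, float(theta.min()), float(theta.max()))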
An effective one-dimensional anisotropic fingerprint enhancement algorithm
NASA Astrophysics Data System (ADS)
Ye, Zhendong; Xie, Mei
2011-12-01
Fingerprint identification is one of the most important biometric technologies. The performance of minutiae extraction and the speed of a fingerprint verification system rely heavily on the quality of the input fingerprint images, so the enhancement of low-quality fingerprints is a critical and difficult step in a fingerprint verification system. In this paper we propose an effective algorithm for fingerprint enhancement. First, we use a normalization algorithm to reduce the variations in gray level values along ridges and valleys. Then we utilize the structure tensor approach to estimate the orientation at each pixel of the fingerprint. Finally, we propose a novel algorithm that combines the advantages of the one-dimensional Gabor filtering method and the anisotropic method to enhance the fingerprint in the recoverable region. The proposed algorithm has been evaluated on the database of the Fingerprint Verification Competition 2004, and the results show that our algorithm performs well in less time.
New generation of universal modeling for centrifugal compressors calculation
NASA Astrophysics Data System (ADS)
Galerkin, Y.; Drozdov, A.
2015-08-01
The Universal Modeling method has been in constant use since the mid-1990s. The newest, sixth version of the method is presented below. The flow path configuration of 3D impellers is presented in detail. It is possible to optimize the meridian configuration, including hub/shroud curvatures, axial length, leading edge position, etc. The new vaned diffuser model includes a flow non-uniformity coefficient based on CFD calculations. The loss model was built from the results of 37 experiments with compressor stages of different flow rates and loading factors. One common set of empirical coefficients in the loss model keeps the predicted efficiency within an accuracy of 0.86% at the design point and 1.22% along the performance curve. For model verification, the performances of four multistage compressors with vaned and vaneless diffusers were calculated, two of which have quite unusual flow paths. The modeling results were quite satisfactory in spite of these peculiarities. One sample of the verification calculations is presented in the text. This sixth version of the developed computer program is already being applied successfully in design practice.
Monitoring/Verification using DMS: TATP Example
DOE Office of Scientific and Technical Information (OSTI.GOV)
Stephan Weeks, Kevin Kyle, Manuel Manard
Field-rugged and field-programmable differential mobility spectrometry (DMS) networks provide highly selective, universal monitoring of vapors and aerosols at detectable levels from persons or areas involved with illicit chemical/biological/explosives (CBE) production. CBE sensor motes used in conjunction with automated fast gas chromatography with DMS detection (GC/DMS) verification instrumentation integrated into situational operations-management systems can be readily deployed and optimized for changing application scenarios. The feasibility of developing selective DMS motes for a “smart dust” sampling approach with guided, highly selective, fast GC/DMS verification analysis is a compelling approach to minimize or prevent the illegal use of explosives or chemical and biological materials. DMS is currently one of the foremost emerging technologies for field separation and detection of gas-phase chemical species. This is due to trace-level detection limits, high selectivity, and small size. Fast GC is the leading field analytical method for gas phase separation of chemical species in complex mixtures. Low-thermal-mass GC columns have led to compact, low-power field systems capable of complete analyses in 15–300 seconds. A collaborative effort optimized a handheld, fast GC/DMS, equipped with a non-rad ionization source, for peroxide-based explosive measurements.
Replacement Technologies for Precision Cleaning of Aerospace Hardware for Propellant Service
NASA Technical Reports Server (NTRS)
Beeson, Harold; Kirsch, Mike; Hornung, Steven; Biesinger, Paul
1997-01-01
The NASA White Sands Test Facility (WSTF) is developing cleaning and verification processes to replace currently used chlorofluorocarbon-113- (CFC-113-) based processes. The processes being evaluated include both aqueous- and solvent-based techniques. Replacement technologies are being investigated for aerospace hardware and for gauges and instrumentation. This paper includes the findings of investigations of aqueous cleaning and verification of aerospace hardware using known contaminants, such as hydraulic fluid and commonly used oils. The results correlate nonvolatile residue with CFC-113. The studies also include enhancements to aqueous sampling for organic and particulate contamination. Although aqueous alternatives have been identified for several processes, a need still exists for nonaqueous solvent cleaning, such as the cleaning and cleanliness verification of gauges used for oxygen service. The cleaning effectiveness of tetrachloroethylene (PCE), trichloroethylene (TCE), ethanol, hydrochlorofluorocarbon 225 (HCFC 225), HCFC 141b, HFE 7100(R), and Vertrel MCA(R) was evaluated using aerospace gauges and precision instruments and then compared to the cleaning effectiveness of CFC-113. Solvents considered for use in oxygen systems were also tested for oxygen compatibility using high-pressure oxygen autogenous ignition and liquid oxygen mechanical impact testing.
Monitoring/Verification Using DMS: TATP Example
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kevin Kyle; Stephan Weeks
Field-rugged and field-programmable differential mobility spectrometry (DMS) networks provide highly selective, universal monitoring of vapors and aerosols at detectable levels from persons or areas involved with illicit chemical/biological/explosives (CBE) production. CBE sensor motes used in conjunction with automated fast gas chromatography with DMS detection (GC/DMS) verification instrumentation integrated into situational operations-management systems can be readily deployed and optimized for changing application scenarios. The feasibility of developing selective DMS motes for a “smart dust” sampling approach with guided, highly selective, fast GC/DMS verification analysis is a compelling approach to minimize or prevent the illegal use of explosives or chemical and biological materials. DMS is currently one of the foremost emerging technologies for field separation and detection of gas-phase chemical species. This is due to trace-level detection limits, high selectivity, and small size. GC is the leading analytical method for the separation of chemical species in complex mixtures. Low-thermal-mass GC columns have led to compact, low-power field systems capable of complete analyses in 15–300 seconds. A collaborative effort optimized a handheld, fast GC/DMS, equipped with a non-rad ionization source, for peroxide-based explosive measurements.
18 CFR 281.213 - Data Verification Committee.
Code of Federal Regulations, 2011 CFR
2011-04-01
... preparation. (e) The Data Verification Committee shall prepare a report concerning the proposed index of... 18 Conservation of Power and Water Resources 1 2011-04-01 2011-04-01 false Data Verification....213 Data Verification Committee. (a) Each interstate pipeline shall establish a Data Verification...
18 CFR 281.213 - Data Verification Committee.
Code of Federal Regulations, 2010 CFR
2010-04-01
... preparation. (e) The Data Verification Committee shall prepare a report concerning the proposed index of... 18 Conservation of Power and Water Resources 1 2010-04-01 2010-04-01 false Data Verification....213 Data Verification Committee. (a) Each interstate pipeline shall establish a Data Verification...
Investigation, Development, and Evaluation of Performance Proving for Fault-tolerant Computers
NASA Technical Reports Server (NTRS)
Levitt, K. N.; Schwartz, R.; Hare, D.; Moore, J. S.; Melliar-Smith, P. M.; Shostak, R. E.; Boyer, R. S.; Green, M. W.; Elliott, W. D.
1983-01-01
A number of methodologies for verifying systems, and computer-based tools that assist users in verifying their systems, were developed. These tools were applied to verify in part the SIFT ultrareliable aircraft computer. Topics covered include: the STP theorem prover; design verification of SIFT; high-level language code verification; assembly language level verification; numerical algorithm verification; verification of flight control programs; and verification of hardware logic.
40 CFR 1065.920 - PEMS calibrations and verifications.
Code of Federal Regulations, 2014 CFR
2014-07-01
....920 PEMS calibrations and verifications. (a) Subsystem calibrations and verifications. Use all the... verifications and analysis. It may also be necessary to limit the range of conditions under which the PEMS can... additional information or analysis to support your conclusions. (b) Overall verification. This paragraph (b...
This ETV program generic verification protocol was prepared and reviewed for the Verification of Pesticide Drift Reduction Technologies project. The protocol provides a detailed methodology for conducting and reporting results from a verification test of pesticide drift reductio...
Comprehensive Evaluation and Implementation of Improvement Actions in Butcher Shops
Leotta, Gerardo A.; Brusa, Victoria; Galli, Lucía; Adriani, Cristian; Linares, Luciano; Etcheverría, Analía; Sanz, Marcelo; Sucari, Adriana; Peral García, Pilar; Signorini, Marcelo
2016-01-01
Foodborne pathogens can cause acute and chronic diseases and produce a wide range of symptoms. Since the consumption of ground beef is a risk factor for infections with some bacterial pathogens, we performed a comprehensive evaluation of butcher shops, implemented improvement actions for both butcher shops and consumers, and verified the impact of those actions implemented. A comprehensive evaluation was made and risk was quantified on a 1–100 scale as high-risk (1–40), moderate-risk (41–70) or low-risk (71–100). A total of 172 raw ground beef and 672 environmental samples were collected from 86 butcher shops during the evaluation (2010–2011) and verification (2013) stages of the study. Ground beef samples were analyzed for mesophilic aerobic organisms, Escherichia coli and coagulase-positive Staphylococcus aureus enumeration. Salmonella spp., E. coli O157:H7, non-O157 Shiga toxin-producing E. coli (STEC), and Listeria monocytogenes were detected and isolated from all samples. Risk quantification resulted in 43 (50.0%) high-risk, 34 (39.5%) moderate-risk, and nine (10.5%) low-risk butcher shops. Training sessions for 498 handlers and 4,506 consumers were held. Re-evaluation by risk quantification and microbiological analyses resulted in 19 (22.1%) high-risk, 42 (48.8%) moderate-risk and 25 (29.1%) low-risk butcher shops. The count of indicator microorganisms decreased with respect to the 2010–2011 period. After the implementation of improvement actions, the presence of L. monocytogenes, E. coli O157:H7 and stx genes in ground beef decreased. Salmonella spp. was isolated from 10 (11.6%) ground beef samples, without detecting statistically significant differences between both study periods (evaluation and verification). The percentage of pathogens in environmental samples was reduced in the verification period (Salmonella spp., 1.5%; L. monocytogenes, 10.7%; E. coli O157:H7, 0.6%; non-O157 STEC, 6.8%). Risk quantification was useful to identify those relevant facts in butcher shops. The reduction of contamination in ground beef and the environment was possible after training handlers based on the problems identified in their own butcher shops. Our results confirm the feasibility of implementing a comprehensive risk management program in butcher shops, and the importance of information campaigns targeting consumers. Further collaborative efforts would be necessary to improve foodstuffs safety at retail level and at home. PMID:27618439
Comprehensive Evaluation and Implementation of Improvement Actions in Butcher Shops.
Leotta, Gerardo A; Brusa, Victoria; Galli, Lucía; Adriani, Cristian; Linares, Luciano; Etcheverría, Analía; Sanz, Marcelo; Sucari, Adriana; Peral García, Pilar; Signorini, Marcelo
2016-01-01
Foodborne pathogens can cause acute and chronic diseases and produce a wide range of symptoms. Since the consumption of ground beef is a risk factor for infections with some bacterial pathogens, we performed a comprehensive evaluation of butcher shops, implemented improvement actions for both butcher shops and consumers, and verified the impact of those actions implemented. A comprehensive evaluation was made and risk was quantified on a 1-100 scale as high-risk (1-40), moderate-risk (41-70) or low-risk (71-100). A total of 172 raw ground beef and 672 environmental samples were collected from 86 butcher shops during the evaluation (2010-2011) and verification (2013) stages of the study. Ground beef samples were analyzed for mesophilic aerobic organisms, Escherichia coli and coagulase-positive Staphylococcus aureus enumeration. Salmonella spp., E. coli O157:H7, non-O157 Shiga toxin-producing E. coli (STEC), and Listeria monocytogenes were detected and isolated from all samples. Risk quantification resulted in 43 (50.0%) high-risk, 34 (39.5%) moderate-risk, and nine (10.5%) low-risk butcher shops. Training sessions for 498 handlers and 4,506 consumers were held. Re-evaluation by risk quantification and microbiological analyses resulted in 19 (22.1%) high-risk, 42 (48.8%) moderate-risk and 25 (29.1%) low-risk butcher shops. The count of indicator microorganisms decreased with respect to the 2010-2011 period. After the implementation of improvement actions, the presence of L. monocytogenes, E. coli O157:H7 and stx genes in ground beef decreased. Salmonella spp. was isolated from 10 (11.6%) ground beef samples, without detecting statistically significant differences between both study periods (evaluation and verification). The percentage of pathogens in environmental samples was reduced in the verification period (Salmonella spp., 1.5%; L. monocytogenes, 10.7%; E. coli O157:H7, 0.6%; non-O157 STEC, 6.8%). Risk quantification was useful to identify those relevant facts in butcher shops. The reduction of contamination in ground beef and the environment was possible after training handlers based on the problems identified in their own butcher shops. Our results confirm the feasibility of implementing a comprehensive risk management program in butcher shops, and the importance of information campaigns targeting consumers. Further collaborative efforts would be necessary to improve foodstuffs safety at retail level and at home.
Mineral mapping in the Maherabad area, eastern Iran, using the HyMap remote sensing data
NASA Astrophysics Data System (ADS)
Molan, Yusuf Eshqi; Refahi, Davood; Tarashti, Ali Hoseinmardi
2014-04-01
This study applies matched filtering to the HyMap airborne hyperspectral data to obtain the distribution map of alteration minerals in the Maherabad area and uses virtual verification to verify the results. This paper also introduces a "moving threshold", which seeks an appropriate threshold value for converting grayscale images, produced by mapping methods, into target and background pixels. The Maherabad area, located in the eastern part of the Lut block, is a Cu-Au porphyry system in which quartz-sericite-pyrite, argillic and propylitic alteration are most common. Minimum noise fraction transform coupled with a pixel purity index was applied to the HyMap images to extract the endmembers of the alteration minerals, including kaolinite, montmorillonite, sericite (muscovite/illite), calcite, chlorite, epidote, and goethite. Since there was no access to a portable spectrometer or laboratory spectral measurements for the verification of the remote sensing imagery results, virtual verification was achieved using the USGS spectral library and showed an agreement of 83.19%. The comparison between the results of the matched filtering and X-ray diffraction (XRD) analyses also showed an agreement of 56.13%.
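For readers unfamiliar with the matched-filtering step described above, the following minimal Python sketch (with randomly generated stand-in spectra rather than HyMap bands or USGS library endmembers) illustrates how a per-pixel matched-filter score is computed and how a fixed threshold separates target from background pixels; the "moving threshold" of the paper would tune this cutoff per mineral map rather than fixing it.

# Illustrative matched-filter scoring (not the authors' implementation).
import numpy as np

rng = np.random.default_rng(1)
n_pixels, n_bands = 500, 30
pixels = rng.normal(size=(n_pixels, n_bands))     # stand-in image spectra
target = rng.normal(size=n_bands)                 # stand-in endmember spectrum

mu = pixels.mean(axis=0)
cov_inv = np.linalg.inv(np.cov(pixels, rowvar=False))
d = target - mu

# MF score: projection of (x - mu) onto the whitened target direction,
# normalized so a pixel equal to the target scores 1 and background ~0.
scores = (pixels - mu) @ cov_inv @ d / (d @ cov_inv @ d)

threshold = 0.5        # an assumed cutoff; with pure background almost nothing passes
detections = scores > threshold
print(int(detections.sum()), "pixels flagged as target")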
DOE Office of Scientific and Technical Information (OSTI.GOV)
Verburg, J; Bortfeld, T
Purpose: We present a new system to perform prompt gamma-ray spectroscopy during proton pencil-beam scanning treatments, which enables in vivo verification of the proton range. This system will be used for the first clinical studies of this technology. Methods: After successful pre-clinical testing of prompt gamma-ray spectroscopy, a full-scale system for clinical studies is now being assembled. Prompt gamma-rays will be detected during patient treatment using an array of 8 detector modules arranged behind a tungsten collimator. Each detector module consists of a lanthanum(III) bromide scintillator, a photomultiplier tube, and custom electronics for stable high-voltage supply and signal amplification. A new real-time data acquisition and control system samples the signals from the detectors with analog-to-digital converters, analyses events of interest, and communicates with the beam delivery systems. The timing of the detected events was synchronized to the cyclotron radiofrequency and the pencil-beam delivery. Range verification is performed by matching measured energy- and time-resolved gamma-ray spectra to nuclear reaction models based on the clinical treatment plan. Experiments in phantoms were performed using clinical beams in order to assess the performance of the system. Results: The experiments showed reliable real-time analysis of more than 10 million detector events per second. The individual detector modules acquired accurate energy- and time-resolved gamma-ray measurements at a rate of 1 million events per second, which is typical for beams delivered at a clinical dose rate. The data acquisition system successfully tracked the delivery of the scanned pencil-beams to determine the location of range deviations within the treatment field. Conclusion: A clinical system for proton range verification using prompt gamma-ray spectroscopy has been designed and is being prepared for use during patient treatments. We anticipate starting a first clinical study in the near future. This work was supported by the Federal Share of program income earned by Massachusetts General Hospital on C06-CA059267, Proton Therapy Research and Treatment Center.
Electroacoustic verification of frequency modulation systems in cochlear implant users.
Fidêncio, Vanessa Luisa Destro; Jacob, Regina Tangerino de Souza; Tanamati, Liége Franzini; Bucuvic, Érika Cristina; Moret, Adriane Lima Mortari
2017-12-26
The frequency modulation system is a device that helps to improve speech perception in noise and is considered the most beneficial approach to improving speech recognition in noise in cochlear implant users. According to guidelines, a check must be performed before fitting the frequency modulation system. Although there are recommendations regarding the behavioral tests that should be performed when fitting the frequency modulation system to cochlear implant users, there are no published recommendations regarding the electroacoustic test that should be performed. The aim was to perform and determine the validity of an electroacoustic verification test for frequency modulation systems coupled to different cochlear implant speech processors. The sample included 40 participants between 5 and 18 years of age who used four different models of speech processors. For the electroacoustic evaluation, we used the Audioscan Verifit device with the HA-1 coupler and the listening check devices corresponding to each speech processor model. In cases where transparency was not achieved, the frequency modulation gain adjustment was modified and we used the Brazilian version of the "Phrases in Noise Test" to evaluate speech perception in competitive noise. Transparency between the frequency modulation system and the cochlear implant was observed in 85% of the participants evaluated. After adjusting the gain of the frequency modulation receiver in the remaining participants, the devices showed transparency when the electroacoustic verification test was repeated. These patients also demonstrated better speech perception in noise after the new adjustment; that is, in these cases the electroacoustic transparency produced behavioral transparency. The suggested electroacoustic evaluation protocol was effective in evaluating transparency between the frequency modulation system and the cochlear implant. Adjusting the speech processor and the frequency modulation system gain is essential when fitting this device. Copyright © 2017 Associação Brasileira de Otorrinolaringologia e Cirurgia Cérvico-Facial. Published by Elsevier Editora Ltda. All rights reserved.
30 CFR 250.913 - When must I resubmit Platform Verification Program plans?
Code of Federal Regulations, 2011 CFR
2011-07-01
... CONTINENTAL SHELF Platforms and Structures Platform Verification Program § 250.913 When must I resubmit Platform Verification Program plans? (a) You must resubmit any design verification, fabrication... 30 Mineral Resources 2 2011-07-01 2011-07-01 false When must I resubmit Platform Verification...
30 CFR 250.909 - What is the Platform Verification Program?
Code of Federal Regulations, 2011 CFR
2011-07-01
... 30 Mineral Resources 2 2011-07-01 2011-07-01 false What is the Platform Verification Program? 250... Platforms and Structures Platform Verification Program § 250.909 What is the Platform Verification Program? The Platform Verification Program is the MMS approval process for ensuring that floating platforms...
Study of techniques for redundancy verification without disrupting systems, phases 1-3
NASA Technical Reports Server (NTRS)
1970-01-01
The problem of verifying the operational integrity of redundant equipment and the impact of a requirement for verification on such equipment are considered. Redundant circuits are examined and the characteristics which determine adaptability to verification are identified. Mutually exclusive and exhaustive categories for verification approaches are established. The range of applicability of these techniques is defined in terms of signal characteristics and redundancy features. Verification approaches are discussed and a methodology for the design of redundancy verification is developed. A case study is presented which involves the design of a verification system for a hypothetical communications system. Design criteria for redundant equipment are presented. Recommendations for the development of technological areas pertinent to the goal of increased verification capabilities are given.
Requirement Assurance: A Verification Process
NASA Technical Reports Server (NTRS)
Alexander, Michael G.
2011-01-01
Requirement Assurance is an act of requirement verification which assures the stakeholder or customer that a product requirement has produced its "as realized product" and has been verified with conclusive evidence. Product requirement verification answers the question, "did the product meet the stated specification, performance, or design documentation?". In order to ensure the system was built correctly, the practicing system engineer must verify each product requirement using verification methods of inspection, analysis, demonstration, or test. The products of these methods are the "verification artifacts" or "closure artifacts" which are the objective evidence needed to prove the product requirements meet the verification success criteria. Institutional direction is given to the System Engineer in NPR 7123.1A NASA Systems Engineering Processes and Requirements with regards to the requirement verification process. In response, the verification methodology offered in this report meets both the institutional process and requirement verification best practices.
Sample and population exponents of generalized Taylor's law.
Giometto, Andrea; Formentin, Marco; Rinaldo, Andrea; Cohen, Joel E; Maritan, Amos
2015-06-23
Taylor's law (TL) states that the variance V of a nonnegative random variable is a power function of its mean M; i.e., V = aM^b. TL has been verified extensively in ecology, where it applies to population abundance, as well as in physics and other natural sciences. Its ubiquitous empirical verification suggests a context-independent mechanism. Sample exponents b measured empirically via the scaling of sample mean and variance typically cluster around the value b = 2. Some theoretical models of population growth, however, predict a broad range of values for the population exponent b pertaining to the mean and variance of population density, depending on details of the growth process. Is the widely reported sample exponent b ≃ 2 the result of ecological processes or could it be a statistical artifact? Here, we apply large deviations theory and finite-sample arguments to show exactly that in a broad class of growth models the sample exponent is b ≃ 2 regardless of the underlying population exponent. We derive a generalized TL in terms of sample and population exponents b_jk for the scaling of the kth vs. the jth cumulants. The sample exponent b_jk depends predictably on the number of samples, and for finite samples we obtain b_jk ≃ k/j asymptotically in time, a prediction that we verify in two empirical examples. Thus, the sample exponent b ≃ 2 may indeed be a statistical artifact and not dependent on population dynamics under conditions that we specify exactly. Given the broad class of models investigated, our results apply to many fields where TL is used although inadequately understood.
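The finite-sample effect described in this abstract can be reproduced with a few lines of simulation. The sketch below is an illustration, not the authors' code; the growth factors, sample size, and fitting window are arbitrary assumptions. It grows a finite sample of populations multiplicatively, then regresses log sample variance on log sample mean over time to estimate the sample exponent b, which typically comes out near 2 regardless of the population exponent.

# Minimal sketch: sample Taylor's law exponent from multiplicative growth.
import numpy as np

rng = np.random.default_rng(0)

n_replicates = 100    # finite sample of replicate populations
n_steps = 100         # multiplicative growth steps

# Lewontin-Cohen-type growth: N_t = N_{t-1} * A_t, with A_t i.i.d. in {0.5, 1.6}.
growth = rng.choice([0.5, 1.6], size=(n_replicates, n_steps))
density = np.cumprod(growth, axis=1)              # shape (replicates, time)

sample_mean = density.mean(axis=0)
sample_var = density.var(axis=0, ddof=1)

# Taylor's law V = a * M**b  =>  log V = log a + b * log M.
# Fit the late-time scaling, where finite-sample effects dominate.
late = slice(n_steps // 2, None)
b, _ = np.polyfit(np.log(sample_mean[late]), np.log(sample_var[late]), 1)
print(f"estimated sample exponent b ≈ {b:.2f}")   # typically clusters near 2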
Verification of aerial photo stand volume tables for southeast Alaska.
Theodore S. Setzer; Bert R. Mead
1988-01-01
Aerial photo volume tables are used in the multilevel sampling system of Alaska Forest Inventory and Analysis. These volume tables are presented with a description of the data base and methods used to construct the tables. Volume estimates compiled from the aerial photo stand volume tables and associated ground-measured values are compared and evaluated.
Resistivity Correction Factor for the Four-Probe Method: Experiment I
NASA Astrophysics Data System (ADS)
Yamashita, Masato; Yamaguchi, Shoji; Enjoji, Hideo
1988-05-01
Experimental verification of the theoretically derived resistivity correction factor (RCF) is presented. Resistivity and sheet resistance measurements by the four-probe method are made on three samples: isotropic graphite, ITO film and Au film. It is indicated that the RCF can correct the apparent variations of experimental data to yield reasonable resistivities and sheet resistances.
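As context for the correction factor being verified, the sketch below shows the usual four-probe arithmetic in Python: the ideal in-line-probe geometric factor π/ln 2 for an infinite thin sheet is rescaled by an RCF to account for finite sample geometry. The function names and the example numbers are illustrative assumptions, not values from the paper.

# Minimal sketch of four-probe sheet resistance and resistivity with an RCF.
import math

def sheet_resistance(voltage_v, current_a, rcf=1.0):
    """Sheet resistance in ohms/square from a four-probe V/I reading."""
    ideal_factor = math.pi / math.log(2)          # ≈ 4.532 for an infinite sheet
    return rcf * ideal_factor * voltage_v / current_a

def resistivity(voltage_v, current_a, thickness_m, rcf=1.0):
    """Bulk resistivity (ohm·m) for a film of known thickness."""
    return sheet_resistance(voltage_v, current_a, rcf) * thickness_m

# Example with assumed numbers: 1.2 mV at 1 mA on a 200-nm film, RCF = 0.95.
print(resistivity(1.2e-3, 1.0e-3, 200e-9, rcf=0.95))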
ERIC Educational Resources Information Center
Hill, Clara E.; Williams, Elizabeth Nutt; Thompson, Barbara J.
1997-01-01
Offers reactions to critiques of a proposed research model: consensual qualitative research (CQR). Clarifies the meaning of consensus, explicates the representativeness of samples, analyzes the limitations and advantages of self-report data, and explores the nature of truth. Explores theory and verification in CQR and compares CQR to other…
ERIC Educational Resources Information Center
Rieben, James C., Jr.
2010-01-01
This study focuses on the effects of relevance and lab design on student learning within the chemistry laboratory environment. A general chemistry conductivity of solutions experiment and an upper level organic chemistry cellulose regeneration experiment were employed. In the conductivity experiment, the two main variables studied were the effect…
40 CFR 1065.546 - Verification of minimum dilution ratio for PM batch sampling.
Code of Federal Regulations, 2014 CFR
2014-07-01
... chemical balance terms as given in § 1065.655(e). You may determine the raw exhaust flow rate based on the measured intake air and dilute exhaust molar flow rates and the dilute exhaust chemical balance terms as... air, fuel rate measurements, and fuel properties, consistent with good engineering judgment. (b...
Transistor step stress program for JANTX2N4150
NASA Technical Reports Server (NTRS)
1979-01-01
Reliability analysis of the transistor JANTX2N4150 manufactured by General Semiconductor and Transitron is reported. The discrete devices were subjected to power and temperature step stress tests and then to electrical tests after completing the power/temperature step stress point. Control sample units were maintained for verification of the electrical parametric testing. Results are presented.
Avoiding treatment bias of REDD+ monitoring by sampling with partial replacement
Michael Kohl; Charles T Scott; Andrew J Lister; Inez Demon; Daniel Plugge
2015-01-01
Implementing REDD+ renders the development of a measurement, reporting and verification (MRV) system necessary to monitor carbon stock changes. MRV systems generally apply a combination of remote sensing techniques and in-situ field assessments. In-situ assessments can be based on 1) permanent plots, which are assessed on all successive occasions, 2) temporary plots,...
SU-E-T-602: Patient-Specific Online Dose Verification Based On Transmission Detector Measurements
DOE Office of Scientific and Technical Information (OSTI.GOV)
Thoelking, J; Yuvaraj, S; Jens, F
Purpose: Intensity modulated radiotherapy requires a comprehensive quality assurance program in general and ideally independent verification of dose delivery. Since conventional 2D detector arrays allow only pre-treatment verification, there is a debate concerning the need for online dose verification. This study presents the clinical performance, including dosimetric plan verification in 2D as well as in 3D, and the error detection abilities of a new transmission detector (TD) for online dose verification of a 6 MV photon beam. Methods: To validate the dosimetric performance of the new device, dose reconstruction based on TD measurements was compared to a conventional pre-treatment verification method (reference) and the treatment planning system (TPS) for 18 IMRT and VMAT treatment plans. Furthermore, dose reconstruction inside the patient based on TD read-out was evaluated by comparing various dose volume indices and 3D gamma evaluations against independent dose computation and the TPS. To investigate the sensitivity of the new device, different types of systematic and random errors in leaf positions and linac output were introduced into IMRT treatment sequences. Results: The 2D gamma index evaluation of transmission detector based dose reconstruction showed excellent agreement for all IMRT and VMAT plans compared to reference measurements (99.3±1.2)% and the TPS (99.1±0.7)%. Good agreement was also obtained for 3D dose reconstruction based on TD read-out compared to dose computation (mean gamma value of PTV = 0.27±0.04). Only a minimal dose underestimation within the target volume was observed when analyzing DVH indices (<1%). Positional errors in leaf banks larger than 1 mm and errors in linac output larger than 2% could be clearly identified with the TD. Conclusion: Since 2D and 3D evaluations for all IMRT and VMAT treatment plans were in excellent agreement with reference measurements and dose computation, the new TD is suitable for routine treatment plan verification. Funding Support, Disclosures, and Conflict of Interest: COIs: Frank Lohr: Elekta: research grant, travel grants, teaching honoraria; IBA: research grant, travel grants, teaching honoraria, advisory board; C-Rad: board honoraria, travel grants. Frederik Wenz: Elekta: research grant, teaching honoraria, consultant, advisory board; Zeiss: research grant, teaching honoraria, patent. Hansjoerg Wertz: Elekta: research grant, teaching honoraria; IBA: research grant.
Development and validation of MCNPX-based Monte Carlo treatment plan verification system
Jabbari, Iraj; Monadi, Shahram
2015-01-01
A Monte Carlo treatment plan verification (MCTPV) system was developed for clinical treatment plan verification (TPV), especially for conformal and intensity-modulated radiotherapy (IMRT) plans. In MCTPV, the MCNPX code was used for particle transport through the accelerator head and the patient body. MCTPV has an interface with the TiGRT planning system and reads the information needed for the Monte Carlo calculation, transferred in Digital Imaging and Communications in Medicine-Radiation Therapy (DICOM-RT) format. In MCTPV, several methods were applied in order to reduce the simulation time. The relative dose distribution of a clinical prostate conformal plan calculated by MCTPV was compared with that of the TiGRT planning system. The results showed that the beam configurations and patient information were well implemented in this system. For quantitative evaluation of MCTPV, a two-dimensional (2D) diode array (MapCHECK2) and gamma index analysis were used. The gamma passing rate (3%/3 mm) of an IMRT plan was found to be 98.5% for all beams combined. Also, comparison of the measured and Monte Carlo calculated doses at several points inside an inhomogeneous phantom for 6- and 18-MV photon beams showed good agreement (within 1.5%). The accuracy and timing results of MCTPV showed that MCTPV could be used very efficiently for additional assessment of complicated plans such as IMRT plans. PMID:26170554
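The 3%/3 mm gamma passing rate quoted above is a standard plan-verification metric. The following self-contained sketch is illustrative only: it is one-dimensional and uses a toy dose profile rather than MapCHECK2 data, but it shows how such a passing rate is computed from a reference and an evaluated dose distribution.

# Illustrative global 3%/3 mm gamma passing rate on a 1-D dose profile.
import numpy as np

def gamma_passing_rate(dose_eval, dose_ref, positions_mm,
                       dose_tol=0.03, dta_mm=3.0):
    """Fraction of evaluated points with gamma <= 1 (global normalization)."""
    norm = dose_tol * dose_ref.max()              # global 3% criterion
    gammas = []
    for x_e, d_e in zip(positions_mm, dose_eval):
        dist = (positions_mm - x_e) / dta_mm      # distance term in DTA units
        diff = (dose_ref - d_e) / norm            # dose-difference term
        gammas.append(np.sqrt(dist**2 + diff**2).min())
    return np.mean(np.array(gammas) <= 1.0)

x = np.arange(0.0, 100.0, 1.0)                        # positions in mm
reference = np.exp(-((x - 50.0) / 20.0) ** 2)         # toy reference profile
evaluated = 1.01 * np.exp(-((x - 50.5) / 20.0) ** 2)  # slightly shifted/scaled
print(f"gamma passing rate: {gamma_passing_rate(evaluated, reference, x):.1%}")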
Argon Collection And Purification For Proliferation Detection
DOE Office of Scientific and Technical Information (OSTI.GOV)
Achey, R.; Hunter, D.
2015-10-09
In order to determine whether a seismic event was a declared or undeclared underground nuclear weapon test, environmental samples must be taken and analyzed for signatures that are unique to a nuclear explosion. These signatures are either particles or gases. Particle samples are routinely taken and analyzed under the Comprehensive Nuclear-Test-Ban Treaty Organization (CTBTO) verification regime as well as by individual countries. Gas samples are analyzed for signature gases, especially radioactive xenon. Underground nuclear tests also produce radioactive argon, but that signature is not well monitored. A radioactive argon signature, along with other signatures, can more conclusively determine whether an event was a nuclear test. This project has developed capabilities for collecting and purifying argon samples for ultra-low-background proportional counting. SRNL has developed a continuous gas enrichment system that produces an output stream containing 97% argon from whole air using adsorbent separation technology (the flow diagram for the system is shown in the figure). The vacuum swing adsorption (VSA) enrichment system is easily scalable to produce ten liters or more of 97% argon within twelve hours. A gas chromatographic separation using a column of modified hydrogen mordenite molecular sieve has been developed that can further purify the sample to better than 99% purity after separation from the helium carrier gas. The combination of these concentration and purification systems can serve as a field-deployable system for collecting argon samples suitable for ultra-low-background proportional counting to detect nuclear detonations under the On-Site Inspection program of the CTBTO verification regime. The technology also has applications in bulk argon separation from air for industrial purposes such as the semiconductor industry.
30 CFR 250.909 - What is the Platform Verification Program?
Code of Federal Regulations, 2010 CFR
2010-07-01
... 30 Mineral Resources 2 2010-07-01 2010-07-01 false What is the Platform Verification Program? 250... Verification Program § 250.909 What is the Platform Verification Program? The Platform Verification Program is the MMS approval process for ensuring that floating platforms; platforms of a new or unique design...
Verifying Sediment Fingerprinting Results with Known Mixtures
NASA Astrophysics Data System (ADS)
Gellis, A.; Gorman-Sanisaca, L.; Cashman, M. J.
2017-12-01
Sediment fingerprinting is a widely used approach to determine the specific sources of fluvial sediment within a watershed. It relies on the principle that potential sediment sources can be identified using a set of chemical tracers (or fingerprints), and comparison of these source fingerprints with fluvial (target) sediment allows for source apportionment of the fluvial sediment. There are numerous source classifications, fingerprints, and statistical approaches used in the literature to apportion sources of sediment. However, few of these studies have sought to test the method by creating controls on the ratio of sources in the target sediment. Without a controlled environment for inputs and outputs, such verification of results is ambiguous. Here, we generated artificial mixtures of source sediment from an agricultural/forested watershed in Virginia, USA (Smith Creek, 246 km²) to verify the apportionment results. Target samples were established from known mixtures of the four major sediment sources in the watershed (forest, pasture, cropland, and streambanks). The target samples were sieved to less than 63 microns and analyzed for elemental and isotopic chemistry. The target samples and source samples were run through the Sediment Source Assessment Tool (Sed_SAT) to verify whether the statistical operations provided the correct apportionment. Sed_SAT uses a multivariate parametric approach to identify the minimum suite of fingerprints that discriminates the source areas and applies these fingerprints through an unmixing model to apportion sediment. The results of this sediment fingerprinting verification experiment will be presented in this session.
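Although Sed_SAT itself is not reproduced here, the essence of the unmixing step, apportioning a target sample among sources so that the mixed tracer signature best matches the measured one, can be sketched as a small constrained least-squares problem. The tracer values below are invented for illustration and are not data from Smith Creek.

# Minimal constrained-unmixing sketch (not Sed_SAT): estimate source proportions.
import numpy as np
from scipy.optimize import minimize

# Rows: tracers; columns: sources (forest, pasture, cropland, streambank).
source_means = np.array([
    [2.1, 3.4, 5.0, 1.2],    # tracer A (illustrative units)
    [0.8, 1.9, 2.5, 0.4],    # tracer B
    [15.0, 22.0, 30.0, 9.0], # tracer C
])
target = np.array([3.0, 1.4, 18.0])    # fluvial (target) sample signature

def objective(p):
    # Squared mismatch between the mixed signature and the target signature.
    return np.sum((source_means @ p - target) ** 2)

n = source_means.shape[1]
result = minimize(objective, x0=np.full(n, 1.0 / n),
                  bounds=[(0.0, 1.0)] * n,
                  constraints=[{"type": "eq", "fun": lambda p: p.sum() - 1.0}])
print(dict(zip(["forest", "pasture", "cropland", "streambank"],
               result.x.round(3))))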
Hydrologic data-verification management program plan
Alexander, C.W.
1982-01-01
Data verification refers to the performance of quality control on hydrologic data that have been retrieved from the field and are being prepared for dissemination to water-data users. Water-data users now have access to computerized data files containing unpublished, unverified hydrologic data. Therefore, it is necessary to develop techniques and systems whereby the computer can perform some data-verification functions before the data are stored in user-accessible files. Computerized data-verification routines can be developed for this purpose. A single, unified concept describing a master data-verification program using multiple special-purpose subroutines, and a screen file containing verification criteria, can probably be adapted to any type and size of computer-processing system. Some traditional manual-verification procedures can be adapted for computerized verification, but new procedures can also be developed that would take advantage of the powerful statistical tools and data-handling procedures available to the computer. Prototype data-verification systems should be developed for all three data-processing environments as soon as possible. The WATSTORE system probably affords the greatest opportunity for long-range research and testing of new verification subroutines. (USGS)
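A minimal sketch of the screen-file idea described above follows. The station number, field names, and thresholds are hypothetical; a production system would read its criteria from a screen file rather than a hard-coded dictionary.

# Hypothetical sketch: check one hydrologic record against per-station criteria
# before it is released to user-accessible files.
screen_file = {
    "01646500": {"discharge_cfs": (0.0, 50000.0), "max_step_cfs": 5000.0},
}

def verify_record(station, current, previous=None):
    """Return a list of verification flags for one hydrologic record."""
    crit = screen_file[station]
    flags = []
    lo, hi = crit["discharge_cfs"]
    if not lo <= current["discharge_cfs"] <= hi:
        flags.append("outside expected range")
    if previous is not None:
        step = abs(current["discharge_cfs"] - previous["discharge_cfs"])
        if step > crit["max_step_cfs"]:
            flags.append("implausible rate of change")
    return flags

print(verify_record("01646500",
                    {"discharge_cfs": 61000.0},
                    {"discharge_cfs": 1200.0}))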
Distributed Capacitive Sensor for Sample Mass Measurement
NASA Technical Reports Server (NTRS)
Toda, Risaku; McKinney, Colin; Jackson, Shannon P.; Mojarradi, Mohammad; Manohara, Harish; Trebi-Ollennu, Ashitey
2011-01-01
Previous robotic sample return missions lacked in situ sample verification/quantity measurement instruments. Therefore, the outcome of the mission remained unclear until spacecraft return. In situ sample verification systems such as this Distributed Capacitive (DisC) sensor would enable an unmanned spacecraft system to re-attempt the sample acquisition procedures until the capture of the desired sample quantity is positively confirmed, thereby maximizing the prospect for scientific reward. The DisC device contains a 10-cm-diameter pressure-sensitive elastic membrane placed at the bottom of a sample canister. The membrane deforms under the weight of accumulating planetary sample. The membrane is positioned in close proximity to an opposing rigid substrate with a narrow gap. The deformation of the membrane makes the gap narrower, resulting in increased capacitance between the two parallel plates (elastic membrane and rigid substrate). C-V conversion circuits on a nearby PCB (printed circuit board) provide capacitance readout via an LVDS (low-voltage differential signaling) interface. The capacitance method was chosen over other potential approaches, such as the piezoelectric method, because of its inherent temperature stability advantage. A reference capacitor and temperature sensor are embedded in the system to compensate for temperature effects. The pressure-sensitive membranes are aluminum 6061, stainless steel (SUS) 403, and metal-coated polyimide plates. The thicknesses of these membranes range from 250 to 500 μm. The rigid substrate is made with a 1- to 2-mm-thick wafer of one of the following materials, depending on the application requirements: glass, silicon, polyimide, or PCB substrate. The glass substrate is fabricated by a microelectromechanical systems (MEMS) fabrication approach. Several concentric electrode patterns are printed on the substrate. The initial gap between the two plates, 100 μm, is defined by a silicon spacer ring that is anodically bonded to the glass substrate. The fabricated proof-of-concept devices have successfully demonstrated tens to hundreds of picofarads of capacitance change when a simulated sample (100 g to 500 g) is placed on the membrane.
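The sensing principle can be checked with back-of-the-envelope numbers. The sketch below uses the parallel-plate relation C = ε₀A/d with the 10-cm electrode diameter and 100-μm initial gap quoted above; the membrane deflection per gram is an assumed stiffness, not a value from the report, but it reproduces the tens-to-hundreds-of-picofarads changes described.

# Back-of-the-envelope parallel-plate capacitance vs. sample mass (illustrative).
EPS0 = 8.854e-12            # F/m, vacuum permittivity

def capacitance_pf(gap_m, plate_area_m2):
    return EPS0 * plate_area_m2 / gap_m * 1e12   # picofarads

area = 3.14159 * 0.05 ** 2   # ~10-cm-diameter electrode, area in m^2
gap0 = 100e-6                # 100-micron initial gap

# Assume the membrane deflects 1 micron per 10 g of sample (hypothetical stiffness).
for mass_g in (0, 100, 300, 500):
    gap = gap0 - (mass_g / 10.0) * 1e-6
    print(mass_g, "g ->", round(capacitance_pf(gap, area), 1), "pF")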
Peng, Rongxue; Zhang, Rui; Lin, Guigao; Yang, Xin; Li, Ziyang; Zhang, Kuo; Zhang, Jiawei; Li, Jinming
2017-09-01
The echinoderm microtubule-associated protein-like 4 and anaplastic lymphoma kinase (ALK) receptor tyrosine kinase (EML4-ALK) rearrangement is an important biomarker that plays a pivotal role in therapeutic decision making for non-small-cell lung cancer (NSCLC) patients. Ensuring accuracy and reproducibility of EML4-ALK testing by fluorescence in situ hybridization, immunohistochemistry, RT-PCR, and next-generation sequencing requires reliable reference materials for monitoring assay sensitivity and specificity. Herein, we developed novel reference materials for various kinds of EML4-ALK testing. CRISPR/Cas9 was used to edit various NSCLC cell lines containing EML4-ALK rearrangement variants 1, 2, and 3a/b. After s.c. inoculation, the formalin-fixed, paraffin-embedded (FFPE) samples from xenografts were prepared and tested for suitability as candidate reference materials by fluorescence in situ hybridization, immunohistochemistry, RT-PCR, and next-generation sequencing. Sample validation and commutability assessments showed that all types of FFPE samples derived from xenograft tumors have typical histological structures, and EML4-ALK testing results were similar to the clinical ALK-positive NSCLC specimens. Among the four methods for EML4-ALK detection, the validation test showed 100% concordance. Furthermore, these novel FFPE reference materials showed good stability and homogeneity. Without limitations on variant types and production, our novel FFPE samples based on CRISPR/Cas9 editing and xenografts are suitable as candidate reference materials for the validation, verification, internal quality control, and proficiency testing of EML4-ALK detection. Copyright © 2017 American Society for Investigative Pathology and the Association for Molecular Pathology. Published by Elsevier Inc. All rights reserved.
Ghosh, Sreya; Preza, Chrysanthe
2015-07-01
A three-dimensional (3-D) point spread function (PSF) model for wide-field fluorescence microscopy, suitable for imaging samples with variable refractive index (RI) in multilayered media, is presented. This PSF model is a key component for accurate 3-D image restoration of thick biological samples, such as lung tissue. Microscope- and specimen-derived parameters are combined with a rigorous vectorial formulation to obtain a new PSF model that accounts for additional aberrations due to specimen RI variability. Experimental evaluation and verification of the PSF model was accomplished using images from 175-nm fluorescent beads in a controlled test sample. Fundamental experimental validation of the advantage of using improved PSFs in depth-variant restoration was accomplished by restoring experimental data from beads (6 μm in diameter) mounted in a sample with RI variation. In the investigated study, improvement in restoration accuracy in the range of 18 to 35% was observed when PSFs from the proposed model were used over restoration using PSFs from an existing model. The new PSF model was further validated by showing that its prediction compares to an experimental PSF (determined from 175-nm beads located below a thick rat lung slice) with a 42% improved accuracy over the current PSF model prediction.