Sample records for the query "verification assay based"

  1. Verification of performance specifications for a US Food and Drug Administration-approved molecular microbiology test: Clostridium difficile cytotoxin B using the Becton, Dickinson and Company GeneOhm Cdiff assay.

    PubMed

    Schlaberg, Robert; Mitchell, Michael J; Taggart, Edward W; She, Rosemary C

    2012-01-01

    US Food and Drug Administration (FDA)-approved diagnostic tests based on molecular genetic technologies are becoming available for an increasing number of microbial pathogens. Advances in technology and lower costs have moved molecular diagnostic tests formerly performed for research purposes only into much wider use in clinical microbiology laboratories. To provide an example of laboratory studies performed to verify the performance of an FDA-approved assay for the detection of Clostridium difficile cytotoxin B compared with the manufacturer's performance standards. We describe the process and protocols used by a laboratory for verification of an FDA-approved assay, assess data from the verification studies, and implement the assay after verification. Performance data from the verification studies conducted by the laboratory were consistent with the manufacturer's performance standards and the assay was implemented into the laboratory's test menu. Verification studies are required for FDA-approved diagnostic assays prior to use in patient care. Laboratories should develop a standardized approach to verification studies that can be adapted and applied to different types of assays. We describe the verification of an FDA-approved real-time polymerase chain reaction assay for the detection of a toxin gene in a bacterial pathogen.

  2. Microsatellite Imputation for parental verification from SNP across multiple Bos taurus and indicus breeds

    USDA-ARS's Scientific Manuscript database

    Microsatellite markers (MS) have traditionally been used for parental verification and are still the international standard in spite of their higher cost, error rate, and turnaround time compared with Single Nucleotide Polymorphisms (SNP)-based assays. Despite domestic and international demands fro...

  3. ENVIRONMENTAL TECHNOLOGY VERIFICATION REPORT, PCB DETECTION TECHNOLOGY, HYBRIZYME DELFIA™ ASSAY

    EPA Science Inventory

    The DELFIA PCB Assay is a solid-phase time-resolved fluoroimmunoassay based on the sequential addition of sample extract and europium-labeled PCB tracer to a monoclonal antibody reagent specific for PCBs. In this assay, the antibody reagent and sample extract are added to a strip...

  4. Verification of performance specifications of a molecular test: cystic fibrosis carrier testing using the Luminex liquid bead array.

    PubMed

    Lacbawan, Felicitas L; Weck, Karen E; Kant, Jeffrey A; Feldman, Gerald L; Schrijver, Iris

    2012-01-01

    The number of clinical laboratories introducing various molecular tests to their existing test menu is continuously increasing. Prior to offering a US Food and Drug Administration-approved test, it is necessary that performance characteristics of the test, as claimed by the company, are verified before the assay is implemented in a clinical laboratory. To provide an example of the verification of a specific qualitative in vitro diagnostic test: cystic fibrosis carrier testing using the Luminex liquid bead array (Luminex Molecular Diagnostics, Inc, Toronto, Ontario). The approach used by an individual laboratory for verification of a US Food and Drug Administration-approved assay is described. Specific verification data are provided to highlight the stepwise verification approach undertaken by a clinical diagnostic laboratory. Protocols for verification of in vitro diagnostic assays may vary between laboratories. However, all laboratories must verify several specific performance specifications prior to implementation of such assays for clinical use. We provide an example of an approach used for verifying performance of an assay for cystic fibrosis carrier screening.

  5. Imputation of microsatellite alleles from dense SNP genotypes for parentage verification across multiple Bos taurus and Bos indicus breeds

    USDA-ARS's Scientific Manuscript database

    Microsatellite markers (MS) have traditionally been used for parental verification and are still the international standard in spite of their higher cost, error rate, and turnaround time compared with Single Nucleotide Polymorphisms (SNP)-based assays. Despite domestic and international demands fr...

  6. Multi-site assessment of the precision and reproducibility of multiple reaction monitoring–based measurements of proteins in plasma

    PubMed Central

    Addona, Terri A; Abbatiello, Susan E; Schilling, Birgit; Skates, Steven J; Mani, D R; Bunk, David M; Spiegelman, Clifford H; Zimmerman, Lisa J; Ham, Amy-Joan L; Keshishian, Hasmik; Hall, Steven C; Allen, Simon; Blackman, Ronald K; Borchers, Christoph H; Buck, Charles; Cardasis, Helene L; Cusack, Michael P; Dodder, Nathan G; Gibson, Bradford W; Held, Jason M; Hiltke, Tara; Jackson, Angela; Johansen, Eric B; Kinsinger, Christopher R; Li, Jing; Mesri, Mehdi; Neubert, Thomas A; Niles, Richard K; Pulsipher, Trenton C; Ransohoff, David; Rodriguez, Henry; Rudnick, Paul A; Smith, Derek; Tabb, David L; Tegeler, Tony J; Variyath, Asokan M; Vega-Montoto, Lorenzo J; Wahlander, Åsa; Waldemarson, Sofia; Wang, Mu; Whiteaker, Jeffrey R; Zhao, Lei; Anderson, N Leigh; Fisher, Susan J; Liebler, Daniel C; Paulovich, Amanda G; Regnier, Fred E; Tempst, Paul; Carr, Steven A

    2010-01-01

    Verification of candidate biomarkers relies upon specific, quantitative assays optimized for selective detection of target proteins, and is increasingly viewed as a critical step in the discovery pipeline that bridges unbiased biomarker discovery to preclinical validation. Although individual laboratories have demonstrated that multiple reaction monitoring (MRM) coupled with isotope dilution mass spectrometry can quantify candidate protein biomarkers in plasma, reproducibility and transferability of these assays between laboratories have not been demonstrated. We describe a multilaboratory study to assess reproducibility, recovery, linear dynamic range and limits of detection and quantification of multiplexed, MRM-based assays, conducted by NCI-CPTAC. Using common materials and standardized protocols, we demonstrate that these assays can be highly reproducible within and across laboratories and instrument platforms, and are sensitive to low µg/ml protein concentrations in unfractionated plasma. We provide data and benchmarks against which individual laboratories can compare their performance and evaluate new technologies for biomarker verification in plasma. PMID:19561596
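
    The intra- and inter-laboratory figures of merit above reduce to coefficient-of-variation calculations. A minimal Python sketch on hypothetical replicate data (not the study's code or data):

      import numpy as np

      measurements = {                     # hypothetical replicate results per lab
          "lab_A": [9.8, 10.1, 10.4, 9.9],
          "lab_B": [10.6, 10.2, 10.9, 10.5],
          "lab_C": [9.5, 9.7, 9.4, 9.9],
      }

      def cv(x):
          """Coefficient of variation, in percent."""
          x = np.asarray(x, dtype=float)
          return 100.0 * x.std(ddof=1) / x.mean()

      # Intra-laboratory CV: variability within each lab's replicates.
      for lab, vals in measurements.items():
          print(f"{lab}: intra-lab CV = {cv(vals):.1f}%")

      # Inter-laboratory CV: variability across the per-lab means.
      print(f"inter-lab CV = {cv([np.mean(v) for v in measurements.values()]):.1f}%")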

  7. Environmental Technology Verification Report for Abraxis Ecologenia® 17β-Estradiol (E2) Microplate Enzyme-Linked Immunosorbent Assay (ELISA) Test Kits

    EPA Science Inventory

    This verification test was conducted according to procedures specified in the Test/QA Plan for Verification of Enzyme-Linked Immunosorbent Assay (ELISA) Test Kits for the Quantitative Determination of Endocrine Disrupting Compounds (EDCs) in Aqueous Phase Samples. Deviations to the...

  8. CLSI-based transference and verification of CALIPER pediatric reference intervals for 29 Ortho VITROS 5600 chemistry assays.

    PubMed

    Higgins, Victoria; Truong, Dorothy; Woroch, Amy; Chan, Man Khun; Tahmasebi, Houman; Adeli, Khosrow

    2018-03-01

    Evidence-based reference intervals (RIs) are essential to accurately interpret pediatric laboratory test results. To fill gaps in pediatric RIs, the Canadian Laboratory Initiative on Pediatric Reference Intervals (CALIPER) project developed an age- and sex-specific pediatric RI database based on healthy pediatric subjects. Originally established for Abbott ARCHITECT assays, CALIPER RIs were transferred to assays on Beckman, Roche, Siemens, and Ortho analytical platforms. This study provides transferred reference intervals for 29 biochemical assays for the Ortho VITROS 5600 Chemistry System (Ortho). Based on Clinical Laboratory Standards Institute (CLSI) guidelines, a method comparison analysis was performed by measuring approximately 200 patient serum samples using Abbott and Ortho assays. The equation of the line of best fit was calculated and the appropriateness of the linear model was assessed. This equation was used to transfer RIs from Abbott to Ortho assays. Transferred RIs were verified using 84 healthy pediatric serum samples from the CALIPER cohort. RIs for most chemistry analytes successfully transferred from Abbott to Ortho assays. Calcium and CO2 did not meet statistical criteria for transference (r² < 0.70). Of the 32 transferred reference intervals, 29 were successfully verified, with approximately 90% of results from reference samples falling within transferred confidence limits. Transferred RIs for total bilirubin, magnesium, and LDH did not meet verification criteria and are not reported. This study broadens the utility of the CALIPER pediatric RI database to laboratories using Ortho VITROS 5600 biochemical assays. Clinical laboratories should verify CALIPER reference intervals for their specific analytical platform and local population as recommended by CLSI. Copyright © 2018 The Canadian Society of Clinical Chemists. Published by Elsevier Inc. All rights reserved.
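
    The transference and verification steps described above can be sketched in a few lines of Python. The ordinary-least-squares fit and the 90% verification cut-off are assumptions based on the abstract's description, and all data here are synthetic:

      import numpy as np

      rng = np.random.default_rng(0)
      abbott = rng.uniform(1.0, 3.0, 200)                  # paired patient results
      ortho = 0.95 * abbott + 0.10 + rng.normal(0, 0.05, 200)

      # 1. Method comparison: fit Ortho = a * Abbott + b; require r^2 >= 0.70.
      a, b = np.polyfit(abbott, ortho, 1)
      r2 = np.corrcoef(abbott, ortho)[0, 1] ** 2
      assert r2 >= 0.70, "transference criterion not met"

      # 2. Transfer the reference interval through the regression line.
      abbott_ri = (1.2, 2.6)                               # hypothetical Abbott RI
      ortho_ri = tuple(a * x + b for x in abbott_ri)

      # 3. Verify: ~90% of healthy-subject results should fall inside.
      healthy = 0.95 * rng.uniform(1.1, 2.7, 84) + 0.10 + rng.normal(0, 0.05, 84)
      inside = np.mean((healthy >= ortho_ri[0]) & (healthy <= ortho_ri[1]))
      print(f"transferred RI = ({ortho_ri[0]:.2f}, {ortho_ri[1]:.2f}); "
            f"{100 * inside:.0f}% of reference samples within limits")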

  9. A new method to address verification bias in studies of clinical screening tests: cervical cancer screening assays as an example.

    PubMed

    Xue, Xiaonan; Kim, Mimi Y; Castle, Philip E; Strickler, Howard D

    2014-03-01

    Studies to evaluate clinical screening tests often face the problem that the "gold standard" diagnostic approach is costly and/or invasive. It is therefore common to verify only a subset of negative screening tests using the gold standard method. However, undersampling the screen negatives can lead to substantial overestimation of the sensitivity and underestimation of the specificity of the diagnostic test. Our objective was to develop a simple and accurate statistical method to address this "verification bias." We developed a weighted generalized estimating equation approach to estimate, in a single model, the accuracy (eg, sensitivity/specificity) of multiple assays and simultaneously compare results between assays while addressing verification bias. This approach can be implemented using standard statistical software. Simulations were conducted to assess the proposed method. An example is provided using a cervical cancer screening trial that compared the accuracy of human papillomavirus and Pap tests, with histologic data as the gold standard. The proposed approach performed well in estimating and comparing the accuracy of multiple assays in the presence of verification bias. The proposed approach is an easy to apply and accurate method for addressing verification bias in studies of multiple screening methods. Copyright © 2014 Elsevier Inc. All rights reserved.
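
    The direction of the bias and a simplified version of the correction can be illustrated with inverse-probability weights; the paper's method embeds such weights in a generalized estimating equation to handle multiple assays jointly, which this single-assay sketch does not attempt:

      # Hypothetical verified counts when all screen-positives but only 10%
      # of screen-negatives receive the gold-standard test.
      verified = {  # (screen result, gold standard) -> count verified
          ("pos", "disease"): 90, ("pos", "healthy"): 60,
          ("neg", "disease"): 2,  ("neg", "healthy"): 88,
      }
      p_verify = {"pos": 1.0, "neg": 0.10}     # verification sampling fractions

      def weighted(cell):
          screen, _ = cell
          return verified[cell] / p_verify[screen]   # weight = 1 / P(verified)

      tp, fn = weighted(("pos", "disease")), weighted(("neg", "disease"))
      fp, tn = weighted(("pos", "healthy")), weighted(("neg", "healthy"))

      print(f"naive sensitivity     = {90 / (90 + 2):.3f}")   # biased upward
      print(f"corrected sensitivity = {tp / (tp + fn):.3f}")
      print(f"corrected specificity = {tn / (tn + fp):.3f}")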

  10. Targeted proteomic assays for quantitation of proteins identified by proteogenomic analysis of ovarian cancer

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Song, Ehwang; Gao, Yuqian; Wu, Chaochao

    Here, mass spectrometry (MS)-based targeted proteomic methods such as selected reaction monitoring (SRM) are becoming the method of choice for preclinical verification of candidate protein biomarkers. The Clinical Proteomic Tumor Analysis Consortium (CPTAC) of the National Cancer Institute has investigated the standardization and analytical validation of the SRM assays and demonstrated robust analytical performance on different instruments across different laboratories. An Assay Portal has also been established by CPTAC to provide the research community a resource consisting of a large set of targeted MS-based assays, and a depository to share assays publicly, provided that assays meet the guidelines proposed by CPTAC. Herein, we report 98 SRM assays covering 70 candidate protein biomarkers previously reported as associated with ovarian cancer that have been thoroughly characterized according to the CPTAC Assay Characterization Guidance Document. The experiments, methods and results for characterizing these SRM assays for their MS response, repeatability, selectivity, stability, and reproducible detection of endogenous analytes are described in detail.

  11. Test/QA Plan for Verification of Microcystin Test Kits

    EPA Science Inventory

    Microcystin test kits are used to quantitatively measure total microcystin in recreational waters. These test kits are based on enzyme-linked immunosorbent assays (ELISA) with antibodies that bind specifically to microcystins or phosphate activity inhibition where the phosphatas...

  12. Analytical verification and method comparison of the ADVIA Centaur® Intact Parathyroid Hormone assay.

    PubMed

    Fernández-Galán, Esther; Bedini, Josep Lluís; Filella, Xavier

    2017-12-01

    This study is the first verification of the novel Siemens ADVIA Centaur® Intact Parathyroid Hormone (iPTHm) chemiluminescence immunoassay based on monoclonal antibodies. We also compared the iPTH results obtained using this assay with the previous ADVIA Centaur® Parathyroid Hormone assay (iPTHp) based on polyclonal antibodies. The analytical performance study of the iPTHm assay included LoD, LoQ, intra- and inter-assay reproducibility, and linearity. A comparison study was performed on 369 routine plasma samples. The results were analyzed independently for patients with normal and abnormal GFR, as well as patients on hemodialysis. In addition, clinical concordance between assays was assessed. Finally, we studied PTH stability of plasma samples at 4°C. For the iPTHm assay, LoD and LoQ were 0.03 pmol/L and 0.10 pmol/L, respectively. Intra- and inter-assay CVs were between 2.3% and 6.2%. Linearity was confirmed in the range from 3.82 to 203.08 pmol/L. Correlation studies showed a good correlation (r=0.99) between iPTHm and iPTHp, with a bias of -2.55% (95% CI, -3.48% to -1.62%) in the range from 0.32 to 117.07 pmol/L. Clinical concordance, assessed by Kappa index, was 0.874. The iPTHm assay demonstrated acceptable performance and very good clinical concordance with the iPTHp assay currently used in our laboratory. Thus, the novel iPTHm assay can replace the previous iPTHp assay, since results provided by both assays are very similar. In our study, the stability of iPTH was not affected by storage for up to 14 days. Copyright © 2017 The Canadian Society of Clinical Chemists. Published by Elsevier Inc. All rights reserved.
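
    A minimal sketch of the two comparison statistics reported above (mean percent bias and a kappa index computed on a binary clinical classification), using synthetic data and a hypothetical decision limit:

      import numpy as np

      rng = np.random.default_rng(1)
      ipth_p = rng.uniform(2.0, 40.0, 369)                  # previous polyclonal assay
      ipth_m = 0.975 * ipth_p + rng.normal(0, 0.5, 369)     # new monoclonal assay

      bias_pct = 100.0 * np.mean((ipth_m - ipth_p) / ipth_p)
      print(f"mean bias = {bias_pct:+.2f}%")

      # Clinical concordance: classify against a decision limit, then Cohen's kappa.
      limit = 6.9                                           # hypothetical pmol/L cut-off
      cat_p, cat_m = ipth_p > limit, ipth_m > limit
      po = np.mean(cat_p == cat_m)                          # observed agreement
      pe = (np.mean(cat_p) * np.mean(cat_m)
            + np.mean(~cat_p) * np.mean(~cat_m))            # chance agreement
      print(f"kappa = {(po - pe) / (1 - pe):.3f}")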

  13. Targeted proteomic assays for quantitation of proteins identified by proteogenomic analysis of ovarian cancer

    DOE PAGES

    Song, Ehwang; Gao, Yuqian; Wu, Chaochao; ...

    2017-07-19

    Here, mass spectrometry (MS)-based targeted proteomic methods such as selected reaction monitoring (SRM) are becoming the method of choice for preclinical verification of candidate protein biomarkers. The Clinical Proteomic Tumor Analysis Consortium (CPTAC) of the National Cancer Institute has investigated the standardization and analytical validation of the SRM assays and demonstrated robust analytical performance on different instruments across different laboratories. An Assay Portal has also been established by CPTAC to provide the research community a resource consisting of a large set of targeted MS-based assays, and a depository to share assays publicly, provided that assays meet the guidelines proposed by CPTAC. Herein, we report 98 SRM assays covering 70 candidate protein biomarkers previously reported as associated with ovarian cancer that have been thoroughly characterized according to the CPTAC Assay Characterization Guidance Document. The experiments, methods and results for characterizing these SRM assays for their MS response, repeatability, selectivity, stability, and reproducible detection of endogenous analytes are described in detail.

  14. Mass Spectrometry-based Assay for High Throughput and High Sensitivity Biomarker Verification

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Guo, Xuejiang; Tang, Keqi

    Searching for disease-specific biomarkers has become a major undertaking in the biomedical research field, as the effective diagnosis, prognosis and treatment of many complex human diseases are largely determined by the availability and the quality of the biomarkers. A successful biomarker, as an indicator of a specific biological or pathological process, is usually selected from a large group of candidates by a strict verification and validation process. To be clinically useful, the validated biomarkers must be detectable and quantifiable by the selected testing techniques in their related tissues or body fluids. Because blood is easily accessible, protein biomarkers would ideally be identified in blood plasma or serum. However, most disease-related protein biomarkers in blood exist at very low concentrations (<1 ng/mL) and are “masked” by many irrelevant species present at concentrations orders of magnitude higher. The extreme requirements of measurement sensitivity, dynamic range and specificity make method development extremely challenging. Current clinical protein biomarker measurement relies primarily on antibody-based immunoassays, such as ELISA. Although the technique is sensitive and highly specific, the development of a high-quality protein antibody is both expensive and time-consuming. The limited capability for assay multiplexing also makes the measurement extremely low-throughput, rendering it impractical when hundreds to thousands of potential biomarkers need to be quantitatively measured across multiple samples. Mass spectrometry (MS)-based assays have recently been shown to be a viable alternative for high-throughput, quantitative candidate protein biomarker verification. Among them, the triple quadrupole MS-based assay is the most promising one. When coupled with liquid chromatography (LC) separation and an electrospray ionization (ESI) source, a triple quadrupole mass spectrometer operating in a special selected reaction monitoring (SRM) mode, also known as multiple reaction monitoring (MRM), is capable of quantitatively measuring hundreds of candidate protein biomarkers from a relevant clinical sample in a single analysis. The specificity, reproducibility and sensitivity can be as good as ELISA. Furthermore, SRM MS can also quantify protein isoforms and post-translational modifications, for which traditional antibody-based immunoassays often do not exist.

  15. Spent Fuel Assay with an Ultra-High Rate HPGe Spectrometer

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Fast, James; Fulsom, Bryan; Pitts, Karl

    2015-07-01

    Traditional verification of spent nuclear fuel (SNF) includes determination of initial enrichment, burnup and cool-down time (IE, BU, CT). Along with neutron measurements, passive gamma assay provides important information for determining BU and CT. Other gamma-ray-based assay methods such as passive tomography and active delayed gamma offer the potential to measure the spatial distribution of fission products and the fissile isotopic concentration of the fuel, respectively. All fuel verification methods involving gamma-ray spectroscopy require that the spectrometers manage very high count rates while extracting the signatures of interest. PNNL has developed new digital filtering and analysis techniques to produce an ultra-high-rate gamma-ray spectrometer from a standard coaxial high-purity germanium (HPGe) crystal. This 37% relative efficiency detector has been operated for SNF measurements at input count rates of 500-1300 kcps and throughput in excess of 150 kcps. Optimized filtering algorithms preserve the spectroscopic capability of the system even at these high rates. This paper will present the results of both passive and active SNF measurements performed with this system at PNNL.

  16. Multivariate statistical monitoring as applied to clean-in-place (CIP) and steam-in-place (SIP) operations in biopharmaceutical manufacturing.

    PubMed

    Roy, Kevin; Undey, Cenk; Mistretta, Thomas; Naugle, Gregory; Sodhi, Manbir

    2014-01-01

    Multivariate statistical process monitoring (MSPM) is increasingly used to enhance process monitoring in the biopharmaceutical industry. MSPM can play a critical role when there are many measurements and these measurements are highly correlated, as is typical for many biopharmaceutical operations. Specifically, for processes such as cleaning-in-place (CIP) and steaming-in-place (SIP, also known as sterilization-in-place), control systems typically oversee the execution of the cycles, and verification of the outcome is based on offline assays. These offline assays add to delays, and corrective actions may require additional setup times. Moreover, this conventional approach does not take interactive effects of process variables into account, and cycle optimization opportunities as well as salient trends in the process may be missed. Therefore, more proactive and holistic online continued verification approaches are desirable. This article demonstrates the application of real-time MSPM to processes such as CIP and SIP with industrial examples. The proposed approach has significant potential for facilitating enhanced continuous verification, improved process understanding, abnormal situation detection, and predictive monitoring, as applied to CIP and SIP operations. © 2014 American Institute of Chemical Engineers.
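
    The abstract does not specify the monitoring model; a common MSPM implementation uses principal component analysis with Hotelling's T2 and a squared-prediction-error (SPE) statistic, sketched here on synthetic sensor data:

      import numpy as np

      rng = np.random.default_rng(2)
      X = rng.normal(size=(200, 8))             # 200 historical cycles x 8 sensors
      X[:, 1] = 0.9 * X[:, 0] + 0.1 * X[:, 1]   # highly correlated measurements

      mu, sd = X.mean(0), X.std(0)
      Z = (X - mu) / sd                         # autoscale the training data
      U, S, Vt = np.linalg.svd(Z, full_matrices=False)
      k = 3                                     # retained principal components
      P, lam = Vt[:k].T, S[:k] ** 2 / (len(Z) - 1)

      def monitor(x):
          """Return (T2, SPE) for one new cycle's sensor vector."""
          z = (x - mu) / sd
          t = z @ P                             # scores in the PCA subspace
          t2 = np.sum(t ** 2 / lam)             # Hotelling's T2
          spe = np.sum((z - t @ P.T) ** 2)      # residual (Q) statistic
          return t2, spe

      fault = rng.normal(size=8) + np.array([0, 0, 4, 0, 0, 0, 0, 0.0])
      t2, spe = monitor(fault)
      print(f"T2 = {t2:.1f}, SPE = {spe:.1f}   (compare against control limits)")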

  17. Neutron spectrometry for UF6 enrichment verification in storage cylinders

    DOE PAGES

    Mengesha, Wondwosen; Kiff, Scott D.

    2015-01-29

    Verification of declared UF6 enrichment and mass in storage cylinders is of great interest in nuclear material nonproliferation. Nondestructive assay (NDA) techniques are commonly used for safeguards inspections to ensure accountancy of declared nuclear materials. Common NDA techniques used include gamma-ray spectrometry and both passive and active neutron measurements. In the present study, neutron spectrometry was investigated for verification of UF6 enrichment in 30B storage cylinders based on an unattended and passive measurement approach. MCNP5 and Geant4 simulated neutron spectra, for selected UF6 enrichments and filling profiles, were used in the investigation. The simulated neutron spectra were analyzed using principal component analysis (PCA). The PCA technique is well established and has a wide area of application including feature analysis, outlier detection, and gamma-ray spectral analysis. Results obtained demonstrate that neutron spectrometry supported by spectral feature analysis has potential for assaying UF6 enrichment in storage cylinders. The results also showed that difficulties associated with the UF6 filling profile, observed in other unattended passive neutron measurements, can possibly be overcome using the approach presented.

  18. Analysis of historical delta values for IAEA/LANL NDA training courses

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Geist, William; Santi, Peter; Swinhoe, Martyn

    2009-01-01

    The Los Alamos National Laboratory (LANL) supports the International Atomic Energy Agency (IAEA) by providing training for IAEA inspectors in neutron and gamma-ray Nondestructive Assay (NDA) of nuclear material. Since 1980, all new IAEA inspectors attend this two-week course at LANL, gaining hands-on experience in the application of NDA techniques, procedures and analysis to measure plutonium and uranium nuclear material standards with well-known pedigrees. As part of the course the inspectors conduct an inventory verification exercise. This exercise provides inspectors the opportunity to test their abilities in performing verification measurements using the various NDA techniques. For an inspector, the verification of an item is nominally based on whether the measured assay value agrees with the declared value to within three times the historical delta value. The historical delta value represents the average difference between measured and declared values from previous measurements taken on similar material with the same measurement technology. If the measurement falls outside a limit of three times the historical delta value, the declaration is not verified. This paper uses measurement data from five years of IAEA courses to calculate a historical delta for five non-destructive assay methods: Gamma-ray Enrichment, Gamma-ray Plutonium Isotopics, Passive Neutron Coincidence Counting, Active Neutron Coincidence Counting and the Neutron Coincidence Collar. These historical deltas provide information as to the precision and accuracy of these measurement techniques under realistic conditions.
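
    The pass/fail rule described above is simple to express in code. Here the historical delta is taken as the mean absolute difference between measured and declared values (the averaging convention is an assumption), and all numbers are hypothetical:

      import numpy as np

      # Past (measured, declared) masses in grams for one NDA technique.
      history = np.array([(201.3, 200.0), (148.9, 150.0), (302.1, 300.0),
                          (99.2, 100.0), (251.8, 250.0)])
      hist_delta = np.mean(np.abs(history[:, 0] - history[:, 1]))

      def verified(measured, declared):
          """Item verifies if it agrees with the declaration within 3 deltas."""
          return abs(measured - declared) <= 3 * hist_delta

      print(f"historical delta = {hist_delta:.2f} g")
      print("item verified:", verified(measured=203.1, declared=200.0))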

  19. MRM for the verification of cancer biomarker proteins: recent applications to human plasma and serum.

    PubMed

    Chambers, Andrew G; Percy, Andrew J; Simon, Romain; Borchers, Christoph H

    2014-04-01

    Accurate cancer biomarkers are needed for early detection, disease classification, prediction of therapeutic response and monitoring treatment. While there appears to be no shortage of candidate biomarker proteins, a major bottleneck in the biomarker pipeline continues to be their verification by enzyme-linked immunosorbent assays. Multiple reaction monitoring (MRM), also known as selected reaction monitoring, is a targeted mass spectrometry approach to protein quantitation and is emerging to bridge the gap between biomarker discovery and clinical validation. Highly multiplexed MRM assays are readily configured and enable simultaneous verification of large numbers of candidates, facilitating the development of biomarker panels which can increase specificity. This review focuses on recent applications of MRM to the analysis of plasma and serum from cancer patients for biomarker verification. The current status of this approach is discussed along with future directions for targeted mass spectrometry in clinical biomarker validation.

  20. Accurate inclusion mass screening: a bridge from unbiased discovery to targeted assay development for biomarker verification.

    PubMed

    Jaffe, Jacob D; Keshishian, Hasmik; Chang, Betty; Addona, Theresa A; Gillette, Michael A; Carr, Steven A

    2008-10-01

    Verification of candidate biomarker proteins in blood is typically done using multiple reaction monitoring (MRM) of peptides by LC-MS/MS on triple quadrupole MS systems. MRM assay development for each protein requires significant time and cost, much of which is likely to be of little value if the candidate biomarker is below the detection limit in blood or a false positive in the original discovery data. Here we present a new technology, accurate inclusion mass screening (AIMS), designed to provide a bridge from unbiased discovery to MS-based targeted assay development. Masses on the software inclusion list are monitored in each scan on the Orbitrap MS system, and MS/MS spectra for sequence confirmation are acquired only when a peptide from the list is detected with both the correct accurate mass and charge state. The AIMS experiment confirms that a given peptide (and thus the protein from which it is derived) is present in the plasma. Throughput of the method is sufficient to qualify up to a hundred proteins/week. The sensitivity of AIMS is similar to that of MRM on a triple quadrupole MS system using optimized sample preparation methods (low tens of ng/ml in plasma), and MS/MS data from the AIMS experiments on the Orbitrap can be directly used to configure MRM assays. The method was shown to be at least 4-fold more efficient at detecting peptides of interest than undirected LC-MS/MS experiments using the same instrumentation, and relative quantitation information can be obtained by AIMS in case versus control experiments. Detection by AIMS ensures that a quantitative MRM-based assay can be configured for that protein. The method has the potential to qualify a large number of biomarker candidates based on their detection in plasma prior to committing to the time- and resource-intensive steps of establishing a quantitative assay.
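
    The AIMS trigger logic reduces to matching each survey-scan peak against the inclusion list on accurate mass and charge state. A sketch with an illustrative 10 ppm tolerance (the published acquisition parameters may differ):

      inclusion_list = [        # (target m/z, charge); hypothetical entries
          (721.3642, 2),
          (534.7821, 2),
          (480.9214, 3),
      ]
      PPM_TOL = 10.0

      def should_fragment(observed_mz, observed_z):
          """Acquire MS/MS only for peaks matching the list in m/z and charge."""
          for target_mz, target_z in inclusion_list:
              ppm_error = 1e6 * abs(observed_mz - target_mz) / target_mz
              if observed_z == target_z and ppm_error <= PPM_TOL:
                  return True
          return False

      for mz, z in [(721.3648, 2), (721.3648, 3), (622.1101, 2)]:
          action = "acquire MS/MS" if should_fragment(mz, z) else "skip"
          print(f"m/z {mz:.4f}, z = {z}: {action}")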

  1. False HDAC Inhibition by Aurone Compound.

    PubMed

    Itoh, Yukihiro; Suzuki, Miki; Matsui, Taiji; Ota, Yosuke; Hui, Zi; Tsubaki, Kazunori; Suzuki, Takayoshi

    2016-01-01

    Fluorescence assays are useful tools for estimating enzymatic activity. Their simplicity and manageability make them suitable for screening enzyme inhibitors in drug discovery studies. However, researchers need to pay attention to compounds that show auto-fluorescence and quench fluorescence, because such compounds lower the accuracy of fluorescence assay systems by producing false-positive or false-negative results. In this study, we found that aurone compound 7, which has been reported as a histone deacetylase (HDAC) inhibitor, gave false-positive results. Although compound 7 was identified by an in vitro HDAC fluorescence assay, it did not show HDAC inhibitory activity in a cell-based assay, leading us to suspect its in vitro HDAC inhibitory activity. As a result of verification experiments, we found that compound 7 interferes with the HDAC fluorescence assay by quenching the HDAC fluorescence signal. Our findings underscore the pitfalls of fluorescence assays and caution against careless interpretation of their results.

  2. Analysis of an Indirect Neutron Signature for Enhanced UF6 Cylinder Verification

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kulisek, Jonathan A.; McDonald, Benjamin S.; Smith, Leon E.

    2017-02-21

    The International Atomic Energy Agency (IAEA) currently uses handheld gamma-ray spectrometers combined with ultrasonic wall-thickness gauges to verify the declared enrichment of uranium hexafluoride (UF6) cylinders. The current method provides relatively low accuracy for the assay of 235U enrichment, especially for natural and depleted UF6. Furthermore, the current method provides no capability to assay the absolute mass of 235U in the cylinder due to the localized instrument geometry and limited penetration of the 186-keV gamma-ray signature from 235U. Also, the current verification process is a time-consuming component of on-site inspections at uranium enrichment plants. Toward the goal of a more capable cylinder assay method, the Pacific Northwest National Laboratory has developed the hybrid enrichment verification array (HEVA). HEVA measures both the traditional 186-keV direct signature and a non-traditional, high-energy neutron-induced signature (HEVANT). HEVANT enables full-volume assay of UF6 cylinders by exploiting the relatively larger mean free paths of the neutrons emitted from the UF6. In this work, Monte Carlo modeling is used as the basis for characterizing HEVANT in terms of the individual contributions to HEVANT from nuclides and hardware components. Monte Carlo modeling is also used to quantify the intrinsic efficiency of HEVA for neutron detection in a cylinder-assay geometry. Modeling predictions are validated against neutron-induced gamma-ray spectra from laboratory measurements and a relatively large population of Type 30B cylinders spanning a range of enrichments. Implications of the analysis and findings on the viability of HEVA for cylinder verification are discussed, such as the resistance of the HEVANT signature to manipulation by the nearby placement of neutron-conversion materials.

  3. Multiplexed Microsphere Suspension-Array Assay for Urine Mitochondrial DNA Typing by C-Stretch Length in Hypervariable Regions.

    PubMed

    Aoki, Kimiko; Tanaka, Hiroyuki; Kawahara, Takashi

    2018-07-01

    The standard method for personal identification and verification of urine samples in doping control is short tandem repeat (STR) analysis using nuclear DNA (nDNA). The DNA concentration of urine is very low and decreases under most conditions used for sample storage; therefore, the amount of DNA from cryopreserved urine samples may be insufficient for STR analysis. We aimed to establish a multiplexed assay for mitochondrial DNA typing of urine samples containing only trace amounts of DNA, particularly for Japanese populations. A multiplexed suspension-array assay using oligo-tagged microspheres (Luminex MagPlex-TAG) was developed to measure C-stretch length in hypervariable regions 1 (HV1) and 2 (HV2), five single nucleotide polymorphisms (SNPs), and one polymorphic indel. Based on these SNPs and the indel, the Japanese population can be classified into five major haplogroups (D4, B, M7a, A, D5). The assay was applied to DNA samples from urine cryopreserved for 1-1.5 years (n = 63) and fresh blood (n = 150). The assay with blood DNA enabled Japanese subjects to be categorized into 62 types, exhibiting a discriminatory power of 0.960. The detection limit for cryopreserved urine was 0.005 ng of nDNA. Profiling of blood and urine pairs revealed that 5 of 63 pairs showed different C-stretch patterns in HV1 or HV2. The assay described here yields valuable information in terms of the verification of urine sample sources employing only trace amounts of recovered DNA. However, blood cannot be used as a reference sample.
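
    The abstract does not state which estimator underlies the discriminatory power of 0.960; a standard choice is the Simpson-style probability that two randomly drawn individuals differ in type, sketched here on hypothetical type counts:

      from collections import Counter

      # Hypothetical haplotype-type labels for a sample of individuals.
      types = ["D4-1"] * 30 + ["B-2"] * 25 + ["M7a-1"] * 20 + ["A-3"] * 15 + ["D5-1"] * 10

      counts, n = Counter(types), len(types)
      # Unbiased form: D = 1 - sum n_i (n_i - 1) / (N (N - 1))
      d = 1.0 - sum(c * (c - 1) for c in counts.values()) / (n * (n - 1))
      print(f"discriminatory power = {d:.3f}")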

  4. Statistical characterization of multiple-reaction monitoring mass spectrometry (MRM-MS) assays for quantitative proteomics

    PubMed Central

    Mani, D R; Abbatiello, Susan E; Carr, Steven A

    2012-01-01

    Multiple reaction monitoring mass spectrometry (MRM-MS) with stable isotope dilution (SID) is increasingly becoming a widely accepted assay for the quantification of proteins and peptides. These assays have shown great promise in relatively high throughput verification of candidate biomarkers. While the use of MRM-MS assays is well established in the small molecule realm, their introduction and use in proteomics is relatively recent. As such, statistical and computational methods for the analysis of MRM-MS data from proteins and peptides are still being developed. Based on our extensive experience with analyzing a wide range of SID-MRM-MS data, we set forth a methodology for analysis that encompasses significant aspects ranging from data quality assessment, assay characterization including calibration curves, limits of detection (LOD) and quantification (LOQ), and measurement of intra- and interlaboratory precision. We draw upon publicly available seminal datasets to illustrate our methods and algorithms. PMID:23176545

  5. Statistical characterization of multiple-reaction monitoring mass spectrometry (MRM-MS) assays for quantitative proteomics.

    PubMed

    Mani, D R; Abbatiello, Susan E; Carr, Steven A

    2012-01-01

    Multiple reaction monitoring mass spectrometry (MRM-MS) with stable isotope dilution (SID) is increasingly becoming a widely accepted assay for the quantification of proteins and peptides. These assays have shown great promise in relatively high throughput verification of candidate biomarkers. While the use of MRM-MS assays is well established in the small molecule realm, their introduction and use in proteomics is relatively recent. As such, statistical and computational methods for the analysis of MRM-MS data from proteins and peptides are still being developed. Based on our extensive experience with analyzing a wide range of SID-MRM-MS data, we set forth a methodology for analysis that encompasses significant aspects ranging from data quality assessment, assay characterization including calibration curves, limits of detection (LOD) and quantification (LOQ), and measurement of intra- and interlaboratory precision. We draw upon publicly available seminal datasets to illustrate our methods and algorithms.
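
    Two of the characterization steps named above, calibration-curve fitting and LOD/LOQ estimation, can be sketched as follows. The blank-based 3-sigma/10-sigma convention used here is an assumption; the paper develops more elaborate statistics:

      import numpy as np

      spiked = np.array([0.5, 1, 2, 5, 10, 20, 50])    # spiked amount, fmol
      response = np.array([0.052, 0.11, 0.19, 0.52, 1.02, 2.05, 4.95])  # area ratio

      slope, intercept = np.polyfit(spiked, response, 1)   # calibration curve
      print(f"response = {slope:.4f} * amount + {intercept:.4f}")

      blank_sd = np.std([0.004, 0.006, 0.005, 0.007, 0.005], ddof=1)
      lod = 3 * blank_sd / slope                           # limit of detection
      loq = 10 * blank_sd / slope                          # limit of quantification
      print(f"LOD = {lod:.3f} fmol, LOQ = {loq:.3f} fmol")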

  6. Environmental Technology Verification Report for Abraxis Ecologenia® Ethynylestradiol (EE2) Microplate Enzyme-Linked Immunosorbent Assay (ELISA) Test Kits

    EPA Science Inventory

    The EPA's National Risk Management Research Laboratory (NRMRL) and its verification organization partner, Battelle, operate the Advanced Monitoring Systems (AMS) Center under ETV. The AMS Center recently evaluated the performance of the Abraxis Ecologenia Ethynylestradiol (EE2) ...

  7. Rapid Verification of Candidate Serological Biomarkers Using Gel-based, Label-free Multiple Reaction Monitoring

    PubMed Central

    Tang, Hsin-Yao; Beer, Lynn A.; Barnhart, Kurt T.; Speicher, David W.

    2011-01-01

    Stable isotope dilution-multiple reaction monitoring-mass spectrometry (SID-MRM-MS) has emerged as a promising platform for verification of serological candidate biomarkers. However, cost and time needed to synthesize and evaluate stable isotope peptides, optimize spike-in assays, and generate standard curves quickly becomes unattractive when testing many candidate biomarkers. In this study, we demonstrate that label-free multiplexed MRM-MS coupled with major protein depletion and 1-D gel separation is a time-efficient, cost-effective initial biomarker verification strategy requiring less than 100 μl of serum. Furthermore, SDS gel fractionation can resolve different molecular weight forms of targeted proteins with potential diagnostic value. Because fractionation is at the protein level, consistency of peptide quantitation profiles across fractions permits rapid detection of quantitation problems for specific peptides from a given protein. Despite the lack of internal standards, the entire workflow can be highly reproducible, and long-term reproducibility of relative protein abundance can be obtained using different mass spectrometers and LC methods with external reference standards. Quantitation down to ~200 pg/mL could be achieved using this workflow. Hence, the label-free GeLC-MRM workflow enables rapid, sensitive, and economical initial screening of large numbers of candidate biomarkers prior to setting up SID-MRM assays or immunoassays for the most promising candidate biomarkers. PMID:21726088

  8. Rapid verification of candidate serological biomarkers using gel-based, label-free multiple reaction monitoring.

    PubMed

    Tang, Hsin-Yao; Beer, Lynn A; Barnhart, Kurt T; Speicher, David W

    2011-09-02

    Stable isotope dilution-multiple reaction monitoring-mass spectrometry (SID-MRM-MS) has emerged as a promising platform for verification of serological candidate biomarkers. However, cost and time needed to synthesize and evaluate stable isotope peptides, optimize spike-in assays, and generate standard curves quickly becomes unattractive when testing many candidate biomarkers. In this study, we demonstrate that label-free multiplexed MRM-MS coupled with major protein depletion and 1D gel separation is a time-efficient, cost-effective initial biomarker verification strategy requiring less than 100 μL of serum. Furthermore, SDS gel fractionation can resolve different molecular weight forms of targeted proteins with potential diagnostic value. Because fractionation is at the protein level, consistency of peptide quantitation profiles across fractions permits rapid detection of quantitation problems for specific peptides from a given protein. Despite the lack of internal standards, the entire workflow can be highly reproducible, and long-term reproducibility of relative protein abundance can be obtained using different mass spectrometers and LC methods with external reference standards. Quantitation down to ~200 pg/mL could be achieved using this workflow. Hence, the label-free GeLC-MRM workflow enables rapid, sensitive, and economical initial screening of large numbers of candidate biomarkers prior to setting up SID-MRM assays or immunoassays for the most promising candidate biomarkers.

  9. Environmental Technology Verification Report for Abraxis 17β-Estradiol (E2) Magnetic Particle Enzyme-Linked Immunosorbent Assay (ELISA) Test Kits

    EPA Science Inventory

    The EPA's National Risk Management Research Laboratory (NRMRL) and its verification organization partner, Battelle, operate the Advanced Monitoring Systems (AMS) Center under ETV. The AMS Center recently evaluated the performance of the Abraxis 17(beta)-estradiol (E2) magnetic p...

  10. Safeguardability of the vitrification option for disposal of plutonium

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Pillay, K.K.S.

    1996-05-01

    Safeguardability of the vitrification option for plutonium disposition is rather complex, and there is no experience base in either domestic or international safeguards for this approach. In the present treaty regime between the US and the states of the former Soviet Union, bilateral verifications are considered more likely, with potential for a third-party verification of safeguards. There are serious technological limitations to applying conventional bulk handling facility safeguards techniques to achieve independent verification of plutonium in borosilicate glass. If vitrification is the final disposition option chosen, maintaining continuity of knowledge of plutonium in glass matrices, especially those containing boron and those spiked with high-level wastes or 137Cs, is beyond the capability of present-day safeguards technologies and nondestructive assay techniques. The alternative to quantitative measurement of fissile content is to maintain continuity of knowledge through a combination of containment and surveillance, which is not the international norm for bulk handling facilities.

  11. Verification of chemistry reference ranges using a simple method in sub-Saharan Africa

    PubMed Central

    De Baetselier, Irith; Taylor, Douglas; Mandala, Justin; Nanda, Kavita; Van Campenhout, Christel; Agingu, Walter; Madurai, Lorna; Barsch, Eva-Maria; Deese, Jennifer; Van Damme, Lut; Crucitti, Tania

    2016-01-01

    Background: Chemistry safety assessments are interpreted using chemistry reference ranges (CRRs). Verification of CRRs is time-consuming and often requires a statistical background. Objectives: We report on an easy and cost-saving method to verify CRRs. Methods: Using a former method introduced by Sigma Diagnostics, three study sites in sub-Saharan Africa (Bondo, Kenya; Pretoria and Bloemfontein, South Africa) verified the CRRs for hepatic and renal biochemistry assays performed during a clinical trial of HIV antiretroviral pre-exposure prophylaxis. The aspartate aminotransferase/alanine aminotransferase, creatinine and phosphorus results from 10 clinically-healthy participants at the screening visit were used. In the event the CRRs did not pass the verification, new CRRs had to be calculated based on 40 clinically-healthy participants. Results: Within a few weeks, the study sites accomplished verification of the CRRs without additional costs. The aspartate aminotransferase reference ranges for the Bondo, Kenya site and the alanine aminotransferase reference ranges for the Pretoria, South Africa site required adjustment. The phosphorus CRR passed verification, and the creatinine CRR required adjustment at every site. The newly-established CRR intervals were narrower than the CRRs used previously at these study sites due to decreases in the upper limits of the reference ranges. As a result, more toxicities were detected. Conclusion: To ensure the safety of clinical trial participants, verification of CRRs should be standard practice in clinical trials conducted in settings where the CRR has not been validated for the local population. This verification method is simple, inexpensive, and can be performed by any medical laboratory. PMID:28879112

  12. Verification of chemistry reference ranges using a simple method in sub-Saharan Africa.

    PubMed

    De Baetselier, Irith; Taylor, Douglas; Mandala, Justin; Nanda, Kavita; Van Campenhout, Christel; Agingu, Walter; Madurai, Lorna; Barsch, Eva-Maria; Deese, Jennifer; Van Damme, Lut; Crucitti, Tania

    2016-01-01

    Chemistry safety assessments are interpreted using chemistry reference ranges (CRRs). Verification of CRRs is time-consuming and often requires a statistical background. We report on an easy and cost-saving method to verify CRRs. Using a former method introduced by Sigma Diagnostics, three study sites in sub-Saharan Africa (Bondo, Kenya; Pretoria and Bloemfontein, South Africa) verified the CRRs for hepatic and renal biochemistry assays performed during a clinical trial of HIV antiretroviral pre-exposure prophylaxis. The aspartate aminotransferase/alanine aminotransferase, creatinine and phosphorus results from 10 clinically-healthy participants at the screening visit were used. In the event the CRRs did not pass the verification, new CRRs had to be calculated based on 40 clinically-healthy participants. Within a few weeks, the study sites accomplished verification of the CRRs without additional costs. The aspartate aminotransferase reference ranges for the Bondo, Kenya site and the alanine aminotransferase reference ranges for the Pretoria, South Africa site required adjustment. The phosphorus CRR passed verification, and the creatinine CRR required adjustment at every site. The newly-established CRR intervals were narrower than the CRRs used previously at these study sites due to decreases in the upper limits of the reference ranges. As a result, more toxicities were detected. To ensure the safety of clinical trial participants, verification of CRRs should be standard practice in clinical trials conducted in settings where the CRR has not been validated for the local population. This verification method is simple, inexpensive, and can be performed by any medical laboratory.
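
    A sketch of the two-stage logic described above. The exact pass rule of the Sigma Diagnostics method is not given in the abstract, so the "at most one of ten results outside" criterion and the normal-theory recomputation are assumptions:

      import numpy as np

      def verify_crr(results10, low, high, max_outside=1):   # pass rule assumed
          outside = sum(1 for r in results10 if not (low <= r <= high))
          return outside <= max_outside

      def recompute_crr(results40):
          """Central 95% interval, assuming approximate normality."""
          m, s = np.mean(results40), np.std(results40, ddof=1)
          return m - 1.96 * s, m + 1.96 * s

      alt10 = [22, 31, 18, 27, 45, 52, 24, 29, 33, 26]       # hypothetical ALT, U/L
      if verify_crr(alt10, low=10, high=40):
          print("CRR verified")
      else:
          rng = np.random.default_rng(3)
          new_low, new_high = recompute_crr(rng.normal(28, 8, 40))
          print(f"adjusted CRR: {new_low:.0f}-{new_high:.0f} U/L")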

  13. Enhanced sensitivity and multiplexing with 2D LC/MRM-MS and labeled standards for deeper and more comprehensive protein quantitation.

    PubMed

    Percy, Andrew J; Simon, Romain; Chambers, Andrew G; Borchers, Christoph H

    2014-06-25

    Mass spectrometry (MS)-based protein quantitation is increasingly being employed to verify candidate protein biomarkers. Multiple or selected reaction monitoring-mass spectrometry (MRM-MS or SRM-MS) with isotopically labeled internal standards has proven to be a successful approach in that regard, but has yet to reach its full potential in terms of multiplexing and sensitivity. Here, we report the development of a new MRM method for the quantitation of 253 disease-associated proteins (represented by 625 interference-free peptides) in 13 LC fractions. This 2D RPLC/MRM-MS approach extends the depth and breadth of the assay by 2 orders of magnitude over pre-fractionation-free assays, with 31 proteins below 10 ng/mL and 41 proteins above 10 ng/mL now quantifiable. Standard flow rates are used in both chromatographic dimensions, and up-front depletion or antibody-based enrichment is not required. The LC separations utilize high and low pH conditions, with the former employing an ammonium hydroxide-based eluent, instead of the conventional ammonium formate, resulting in improved LC column lifetime and performance. The high sensitivity (determined concentration range: 15 mg/mL to 452 pg/mL) and robustness afforded by this method makes the full MRM panel, or subsets thereof, useful for the verification of disease-associated plasma protein biomarkers in patient samples. The described research extends the breadth and depth of protein quantitation in undepleted and non-enriched human plasma by employing standard-flow 2D RPLC/MRM-MS in conjunction with a complex mixture of isotopically labeled peptide standards. The proteins quantified are mainly putative biomarkers of non-communicable (i.e., non-infectious) disease (e.g., cardiovascular or cancer), which require pre-clinical verification and validation before clinical implementation. Based on the enhanced sensitivity and multiplexing, this quantitative plasma proteomic method should prove useful in future candidate biomarker verification studies. Copyright © 2014 Elsevier B.V. All rights reserved.

  14. Development of a biomimetic enzyme-linked immunosorbent assay based on molecularly imprinted polymers on paper for the detection of carbaryl.

    PubMed

    Zhang, Can; Cui, Hanyu; Han, Yufeng; Yu, Fangfang; Shi, Xiaoman

    2018-02-01

    A biomimetic enzyme-linked immunosorbent assay (BELISA) based on molecularly imprinted polymers on paper (MIPs-paper) with specific recognition was developed. As the detection element, the surface of the paper was modified with γ-MAPS by hydrolysis, and the MIP layer was anchored on the γ-MAPS-modified paper by copolymerization to construct the artificial antibody. Through a series of experiments and verification, we successfully obtained the MIPs-paper and established the BELISA for the detection of carbaryl. The developed MIPs-paper-based BELISA was applied to detect carbaryl in real samples and was validated by an enzyme-linked immunosorbent assay (ELISA) based on an anti-carbaryl biological antibody. The results of the two methods (BELISA and ELISA) were well correlated (R² = 0.944). The established MIPs-paper BELISA method exhibits the advantages of low cost, high stability and regenerability, and can be applied as a convenient tool for the fast and efficient detection of carbaryl. Copyright © 2017. Published by Elsevier Ltd.

  15. NEUTRON MULTIPLICITY AND ACTIVE WELL NEUTRON COINCIDENCE VERIFICATION MEASUREMENTS PERFORMED FOR MARCH 2009 SEMI-ANNUAL DOE INVENTORY

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dewberry, R.; Ayers, J.; Tietze, F.

    The Analytical Development (AD) Section field nuclear measurement group performed six 'best available technique' verification measurements to satisfy a DOE requirement instituted for the March 2009 semi-annual inventory. The requirement of reference 1 yielded the need for the SRNL Research Operations Department Material Control & Accountability (MC&A) group to measure the Pu content of five items and the highly enriched uranium (HEU) content of two. No 14Q-qualified measurement equipment was available to satisfy the requirement. The AD field nuclear group has routinely performed the required Confirmatory Measurements for the semi-annual inventories for fifteen years using sodium iodide and high-purity germanium (HpGe) gamma-ray pulse height analysis nondestructive assay (NDA) instruments. With appropriate gamma-ray acquisition modeling, the HpGe spectrometers can be used to perform verification-type quantitative assay for Pu isotopics and HEU content. The AD nuclear NDA group is widely experienced with this type of measurement and reports content for these species in requested process control, MC&A booking, and holdup measurement assays Site-wide. However, none of the AD HpGe gamma-ray spectrometers have been 14Q-qualified, and the requirement of reference 1 specifically excluded a gamma-ray PHA measurement from those it would accept for the required verification measurements. The requirement of reference 1 was a new requirement for which the Savannah River National Laboratory (SRNL) Research Operations Department (ROD) MC&A group was unprepared. The criteria for exemption from verification were: (1) isotope content below 50 grams; (2) intrinsically tamper-indicating or TID-sealed items which contain a Category IV quantity of material; (3) assembled components; and (4) laboratory samples. Therefore, all SRNL Material Balance Area (MBA) items with greater than 50 grams total Pu or greater than 50 grams HEU were subject to a verification measurement. The pass/fail criteria of reference 7 stated 'The facility will report measured values, book values, and statistical control limits for the selected items to DOE SR...', and 'The site/facility operator must develop, document, and maintain measurement methods for all nuclear material on inventory'. These new requirements exceeded SRNL's experience with prior semi-annual inventory expectations, but allowed the AD nuclear field measurement group to demonstrate its adaptability and flexibility in responding to unpredicted expectations from the DOE customer. The requirements yielded five SRNL items subject to Pu verification and two SRNL items subject to HEU verification. These items are listed and described in Table 1.

  16. Quantification of transuranic elements by time interval correlation spectroscopy of the detected neutrons

    PubMed

    Baeten; Bruggeman; Paepen; Carchon

    2000-03-01

    The non-destructive quantification of transuranic elements in nuclear waste management or in safeguards verifications is commonly performed by passive neutron assay techniques. To minimise the number of unknown sample-dependent parameters, Neutron Multiplicity Counting (NMC) is applied. We developed a new NMC-technique, called Time Interval Correlation Spectroscopy (TICS), which is based on the measurement of Rossi-alpha time interval distributions. Compared to other NMC-techniques, TICS offers several advantages.
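
    Building a Rossi-alpha time-interval distribution from detected pulse time stamps can be sketched as follows. This is purely illustrative: the simulated pulse train is uniform random, so it shows only the flat accidentals component, whereas a real fission-chain signal adds an excess at short intervals decaying with the detector die-away time:

      import numpy as np

      rng = np.random.default_rng(4)
      t = np.sort(rng.uniform(0.0, 1.0, 5000))      # pulse time stamps, seconds
      window, nbins = 512e-6, 64                    # 512 us window, 8 us bins

      hist = np.zeros(nbins)
      for i, t0 in enumerate(t):                    # each pulse triggers a window
          j = i + 1
          while j < len(t) and t[j] - t0 < window:  # histogram later arrivals
              hist[int((t[j] - t0) / window * nbins)] += 1
              j += 1

      print("counts in first 4 bins:", hist[:4].astype(int))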

  17. An unattended verification station for UF6 cylinders: Field trial findings

    NASA Astrophysics Data System (ADS)

    Smith, L. E.; Miller, K. A.; McDonald, B. S.; Webster, J. B.; Zalavadia, M. A.; Garner, J. R.; Stewart, S. L.; Branney, S. J.; Todd, L. C.; Deshmukh, N. S.; Nordquist, H. A.; Kulisek, J. A.; Swinhoe, M. T.

    2017-12-01

    In recent years, the International Atomic Energy Agency (IAEA) has pursued innovative techniques and an integrated suite of safeguards measures to address the verification challenges posed by the front end of the nuclear fuel cycle. Among the unattended instruments currently being explored by the IAEA is an Unattended Cylinder Verification Station (UCVS), which could provide automated, independent verification of the declared relative enrichment, 235U mass, total uranium mass, and identification for all declared uranium hexafluoride cylinders in a facility (e.g., uranium enrichment plants and fuel fabrication plants). Under the auspices of the United States and European Commission Support Programs to the IAEA, a project was undertaken to assess the technical and practical viability of the UCVS concept. The first phase of the UCVS viability study was centered on a long-term field trial of a prototype UCVS system at a fuel fabrication facility. A key outcome of the study was a quantitative performance evaluation of two nondestructive assay (NDA) methods being considered for inclusion in a UCVS: Hybrid Enrichment Verification Array (HEVA), and Passive Neutron Enrichment Meter (PNEM). This paper provides a description of the UCVS prototype design and an overview of the long-term field trial. Analysis results and interpretation are presented with a focus on the performance of PNEM and HEVA for the assay of over 200 "typical" Type 30B cylinders, and the viability of an "NDA Fingerprint" concept as a high-fidelity means to periodically verify that material diversion has not occurred.

  18. Verification of Plutonium Content in PuBe Sources Using MCNP® 6.2.0 Beta with TENDL 2012 Libraries

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lockhart, Madeline Louise; McMath, Garrett Earl

    Although the production of PuBe neutron sources has been discontinued, hundreds of sources with unknown or inaccurately declared plutonium content exist around the world. Institutions have undertaken the task of assaying these sources, measuring and calculating the isotopic composition, plutonium content, and neutron yield. The nominal plutonium content, based on the neutron yield per gram of pure 239Pu, has been shown to be highly inaccurate. New methods of measuring the plutonium content allow a more accurate estimate of the true Pu content, but these measurements need verification. Using the TENDL 2012 nuclear data libraries, MCNP6 has the capability to simulate the (α, n) interactions in a PuBe source. Theoretically, if the source is modeled according to the plutonium content, isotopic composition, and other source characteristics, the calculated neutron yield in MCNP can be compared to the experimental yield, offering an indication of the accuracy of the declared plutonium content. In this study, three sets of PuBe sources from various backgrounds were modeled in MCNP6 1.2 Beta, according to the source specifications dictated by the individuals who assayed the sources. Verification of the source parameters with MCNP6 also serves as a means to test the alpha transport capabilities of MCNP6 1.2 Beta with the TENDL 2012 alpha transport libraries. Finally, good agreement in the comparison would indicate the accuracy of the source parameters, in addition to demonstrating MCNP's capabilities in simulating (α, n) interactions.

  19. Verification of Plutonium Content in PuBe Sources Using MCNP® 6.2.0 Beta with TENDL 2012 Libraries

    DOE PAGES

    Lockhart, Madeline Louise; McMath, Garrett Earl

    2017-10-26

    Although the production of PuBe neutron sources has been discontinued, hundreds of sources with unknown or inaccurately declared plutonium content exist around the world. Institutions have undertaken the task of assaying these sources, measuring and calculating their isotopic composition, plutonium content, and neutron yield. The nominal plutonium content, based on the neutron yield per gram of pure 239Pu, has been shown to be highly inaccurate. New methods of measuring the plutonium content allow a more accurate estimate of the true Pu content, but these measurements need verification. Using the TENDL 2012 nuclear data libraries, MCNP6 has the capability to simulate the (α, n) interactions in a PuBe source. Theoretically, if the source is modeled according to the plutonium content, isotopic composition, and other source characteristics, the calculated neutron yield in MCNP can be compared to the experimental yield, offering an indication of the accuracy of the declared plutonium content. In this study, three sets of PuBe sources from various backgrounds were modeled in MCNP 6.2.0 Beta, according to the source specifications provided by the individuals who assayed the sources. Verification of the source parameters with MCNP6 also serves as a means to test the alpha transport capabilities of MCNP 6.2.0 Beta with the TENDL 2012 alpha transport libraries. Finally, good agreement in the comparison would indicate the accuracy of the source parameters, in addition to demonstrating MCNP's capabilities in simulating (α, n) interactions.

  20. AN ENVIRONMENTAL TECHNOLOGY VERIFICATION (ETV) OF FOUR TEST KITS FOR THE ANALYSIS OF ATRAZINE IN WATER: ABRAXIS LLC ATRAZINE ELISA KIT, BEACON ANALYTICAL SYSTEMS, INC. ATRAZINE TUBE KIT, SILVER LAKE RESEARCH CORP. WATERSAFE PESTICIDE TEST AND STRATEGIC DIAGNOSTICS, INC. RAPID ASSAY KIT

    EPA Science Inventory

    The Environmental Technology Verification (ETV) Program, beginning as an initiative of the U.S. Environmental Protection Agency (EPA) in 1995, verifies the performance of commercially available, innovative technologies that can be used to measure environmental quality. The ETV ...

  1. The visible touch: in planta visualization of protein-protein interactions by fluorophore-based methods

    PubMed Central

    Bhat, Riyaz A; Lahaye, Thomas; Panstruga, Ralph

    2006-01-01

    Non-invasive fluorophore-based protein interaction assays such as fluorescence resonance energy transfer (FRET) and bimolecular fluorescence complementation (BiFC, also referred to as "split YFP") have proven to be invaluable tools for studying protein-protein interactions in living cells. Both methods are now frequently used in the plant sciences and are likely to develop into standard techniques for the identification, verification and in-depth analysis of polypeptide interactions. In this review, we address the individual strengths and weaknesses of both approaches and provide an outlook on new directions and possible future developments for both techniques. PMID:16800872

  2. Automated measurement of 25-OH Vitamin D on the LUMIPULSE® G1200: analytical verification and method comparison.

    PubMed

    Parra, Marina; Foj, Laura; Filella, Xavier

    2016-07-01

    Because of its potential value in several pathologies, clinical interest in 25-hydroxy Vitamin D (25OH-D) is increasing. However, the standardisation of assays remains a significant problem. Our aim was to evaluate the performance of the novel Lumipulse G 25-OH Vitamin D assay (Fujirebio), comparing results with the Liaison (Diasorin) method. Analytical verification of the Lumipulse G 25-OH Vitamin D assay was performed. Both methods were compared using sera from 226 patients, including 111 patients with chronic renal failure (39 on haemodialysis) and 115 patients without renal failure. In addition, clinical concordance between assays was assessed. For the Lumipulse G 25-OH Vitamin D assay, the limit of detection was 0.3 ng/mL, and the limit of quantification was 2.5 ng/mL with a coefficient of variation of 9.7%. Intra- and inter-assay coefficients of variation were <2.3% and <1.8% (25.4-50.0 ng/mL), respectively. Dilution linearity was in the range of 4.5-144.5 ng/mL. Method comparison resulted in a mean difference of -6.5% (95% CI from -8.8 to -4.1) for all samples between Liaison and Lumipulse G. Clinical concordance assessed by Kappa index was 0.66. Lumipulse G 25-OH Vitamin D showed good clinical concordance with the Liaison assay, although overall results measured on the Lumipulse were higher by an average of 6.5%.
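
    The two comparison statistics quoted above, a mean percentage difference with a 95% CI and a kappa index for clinical concordance, take only a few lines to compute. A minimal sketch follows; the three-category vitamin D grading used for kappa is an assumption for illustration.

```python
import numpy as np
from scipy import stats

def mean_pct_difference(ref, test):
    """Mean % difference of `test` vs `ref` with a t-based 95% CI,
    as in the -6.5% (95% CI -8.8 to -4.1) comparison above."""
    ref, test = np.asarray(ref, float), np.asarray(test, float)
    pct = 100.0 * (test - ref) / ref
    half = stats.t.ppf(0.975, pct.size - 1) * stats.sem(pct)
    return pct.mean(), (pct.mean() - half, pct.mean() + half)

def cohen_kappa(a, b, n_cat=3):
    """Unweighted Cohen's kappa for two categorical gradings, e.g., an
    assumed deficient/insufficient/sufficient 25OH-D classification."""
    cm = np.zeros((n_cat, n_cat))
    for i, j in zip(a, b):
        cm[i, j] += 1
    po = np.trace(cm) / cm.sum()                       # observed agreement
    pe = (cm.sum(axis=0) * cm.sum(axis=1)).sum() / cm.sum() ** 2
    return (po - pe) / (1 - pe)
```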

  3. An unattended verification station for UF 6 cylinders: Field trial findings

    DOE PAGES

    Smith, L. E.; Miller, K. A.; McDonald, B. S.; ...

    2017-08-26

    In recent years, the International Atomic Energy Agency (IAEA) has pursued innovative techniques and an integrated suite of safeguards measures to address the verification challenges posed by the front end of the nuclear fuel cycle. Among the unattended instruments currently being explored by the IAEA is an Unattended Cylinder Verification Station (UCVS), which could provide automated, independent verification of the declared relative enrichment, 235U mass, total uranium mass, and identification for all declared uranium hexafluoride cylinders in a facility (e.g., uranium enrichment plants and fuel fabrication plants). Under the auspices of the United States and European Commission Support Programs to the IAEA, a project was undertaken to assess the technical and practical viability of the UCVS concept. The first phase of the UCVS viability study was centered on a long-term field trial of a prototype UCVS system at a fuel fabrication facility. A key outcome of the study was a quantitative performance evaluation of two nondestructive assay (NDA) methods being considered for inclusion in a UCVS: Hybrid Enrichment Verification Array (HEVA), and Passive Neutron Enrichment Meter (PNEM). This paper provides a description of the UCVS prototype design and an overview of the long-term field trial. In conclusion, analysis results and interpretation are presented with a focus on the performance of PNEM and HEVA for the assay of over 200 “typical” Type 30B cylinders, and the viability of an “NDA Fingerprint” concept as a high-fidelity means to periodically verify that material diversion has not occurred.

  4. An unattended verification station for UF 6 cylinders: Field trial findings

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Smith, L. E.; Miller, K. A.; McDonald, B. S.

    In recent years, the International Atomic Energy Agency (IAEA) has pursued innovative techniques and an integrated suite of safeguards measures to address the verification challenges posed by the front end of the nuclear fuel cycle. Among the unattended instruments currently being explored by the IAEA is an Unattended Cylinder Verification Station (UCVS), which could provide automated, independent verification of the declared relative enrichment, 235U mass, total uranium mass, and identification for all declared uranium hexafluoride cylinders in a facility (e.g., uranium enrichment plants and fuel fabrication plants). Under the auspices of the United States and European Commission Support Programs to the IAEA, a project was undertaken to assess the technical and practical viability of the UCVS concept. The first phase of the UCVS viability study was centered on a long-term field trial of a prototype UCVS system at a fuel fabrication facility. A key outcome of the study was a quantitative performance evaluation of two nondestructive assay (NDA) methods being considered for inclusion in a UCVS: Hybrid Enrichment Verification Array (HEVA), and Passive Neutron Enrichment Meter (PNEM). This paper provides a description of the UCVS prototype design and an overview of the long-term field trial. In conclusion, analysis results and interpretation are presented with a focus on the performance of PNEM and HEVA for the assay of over 200 “typical” Type 30B cylinders, and the viability of an “NDA Fingerprint” concept as a high-fidelity means to periodically verify that material diversion has not occurred.

  5. Viability Study for an Unattended UF 6 Cylinder Verification Station: Phase I Final Report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Smith, Leon E.; Miller, Karen A.; Garner, James R.

    In recent years, the International Atomic Energy Agency (IAEA) has pursued innovative techniques and an integrated suite of safeguards measures to address the verification challenges posed by the front end of the nuclear fuel cycle. Among the unattended instruments currently being explored by the IAEA is an Unattended Cylinder Verification Station (UCVS) that could provide automated, independent verification of the declared relative enrichment, 235U mass, total uranium mass and identification for all declared UF 6 cylinders in a facility (e.g., uranium enrichment plants and fuel fabrication plants). Under the auspices of the United States and European Commission Support Programs to the IAEA, a project was undertaken to assess the technical and practical viability of the UCVS concept. The US Support Program team consisted of Pacific Northwest National Laboratory (PNNL, lead), Los Alamos National Laboratory (LANL), Oak Ridge National Laboratory (ORNL) and Savannah River National Laboratory (SRNL). At the core of the viability study is a long-term field trial of a prototype UCVS system at a Westinghouse fuel fabrication facility. A key outcome of the study is a quantitative performance evaluation of two nondestructive assay (NDA) methods being considered for inclusion in a UCVS: Hybrid Enrichment Verification Array (HEVA), and Passive Neutron Enrichment Meter (PNEM). This report provides context for the UCVS concept and the field trial: potential UCVS implementation concepts at an enrichment facility; an overview of UCVS prototype design; field trial objectives and activities. Field trial results and interpretation are presented, with a focus on the performance of PNEM and HEVA for the assay of over 200 “typical” Type 30B cylinders, and the viability of an “NDA Fingerprint” concept as a high-fidelity means to periodically verify that the contents of a given cylinder are consistent with previous scans. A modeling study, combined with field-measured instrument uncertainties, provides an assessment of the partial-defect sensitivity of HEVA and PNEM for both one-time assay and (repeated) NDA Fingerprint verification scenarios. The findings presented in this report represent a significant step forward in the community’s understanding of the strengths and limitations of the PNEM and HEVA NDA methods, and the viability of the UCVS concept in front-end fuel cycle facilities. This experience will inform Phase II of the UCVS viability study, should the IAEA pursue it.

  6. The selected reaction monitoring/multiple reaction monitoring-based mass spectrometry approach for the accurate quantitation of proteins: clinical applications in the cardiovascular diseases.

    PubMed

    Gianazza, Erica; Tremoli, Elena; Banfi, Cristina

    2014-12-01

    Selected reaction monitoring, also known as multiple reaction monitoring, is a powerful targeted mass spectrometry approach for confident quantitation of proteins/peptides in complex biological samples. In recent years, its optimization and application have become pivotal and of great interest in clinical research to derive useful outcomes for patient care. Thus, selected reaction monitoring/multiple reaction monitoring is now used as a highly sensitive and selective method for the evaluation of protein abundances and biomarker verification, with potential applications in medical screening. This review describes technical aspects of the development of a robust multiplex assay and discusses its recent applications in cardiovascular proteomics: verification of promising disease candidates to select only the highest quality peptides/proteins for preclinical validation, as well as quantitation of protein isoforms and post-translational modifications.

  7. WebPrInSeS: automated full-length clone sequence identification and verification using high-throughput sequencing data.

    PubMed

    Massouras, Andreas; Decouttere, Frederik; Hens, Korneel; Deplancke, Bart

    2010-07-01

    High-throughput sequencing (HTS) is revolutionizing our ability to obtain cheap, fast and reliable sequence information. Many experimental approaches are expected to benefit from the incorporation of such sequencing features in their pipeline. Consequently, software tools that facilitate such an incorporation should be of great interest. In this context, we developed WebPrInSeS, a web server tool allowing automated full-length clone sequence identification and verification using HTS data. WebPrInSeS encompasses two separate software applications. The first is WebPrInSeS-C, which performs automated sequence verification of user-defined open-reading frame (ORF) clone libraries. The second is WebPrInSeS-E, which identifies positive hits in cDNA or ORF-based library screening experiments such as yeast one- or two-hybrid assays. Both tools perform de novo assembly using HTS data from any of the three major sequencing platforms. Thus, WebPrInSeS provides a highly integrated, cost-effective and efficient way to sequence-verify or identify clones of interest. WebPrInSeS is available at http://webprinses.epfl.ch/ and is open to all users.

  8. WebPrInSeS: automated full-length clone sequence identification and verification using high-throughput sequencing data

    PubMed Central

    Massouras, Andreas; Decouttere, Frederik; Hens, Korneel; Deplancke, Bart

    2010-01-01

    High-throughput sequencing (HTS) is revolutionizing our ability to obtain cheap, fast and reliable sequence information. Many experimental approaches are expected to benefit from the incorporation of such sequencing features in their pipeline. Consequently, software tools that facilitate such an incorporation should be of great interest. In this context, we developed WebPrInSeS, a web server tool allowing automated full-length clone sequence identification and verification using HTS data. WebPrInSeS encompasses two separate software applications. The first is WebPrInSeS-C, which performs automated sequence verification of user-defined open-reading frame (ORF) clone libraries. The second is WebPrInSeS-E, which identifies positive hits in cDNA or ORF-based library screening experiments such as yeast one- or two-hybrid assays. Both tools perform de novo assembly using HTS data from any of the three major sequencing platforms. Thus, WebPrInSeS provides a highly integrated, cost-effective and efficient way to sequence-verify or identify clones of interest. WebPrInSeS is available at http://webprinses.epfl.ch/ and is open to all users. PMID:20501601

  9. Active Interrogation for Spent Fuel

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Swinhoe, Martyn Thomas; Dougan, Arden

    2015-11-05

    The DDA instrument for nuclear safeguards is a fast, non-destructive, active neutron interrogation technique that uses an external 14 MeV DT neutron generator for the characterization and verification of spent nuclear fuel assemblies.

  10. EURATOM safeguards efforts in the development of spent fuel verification methods by non-destructive assay

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Matloch, L.; Vaccaro, S.; Couland, M.

    The back end of the nuclear fuel cycle continues to develop. The European Commission, particularly the Nuclear Safeguards Directorate of the Directorate General for Energy, implements Euratom safeguards and needs to adapt to this situation. The verification methods for spent nuclear fuel, which EURATOM inspectors can use, require continuous improvement. Whereas the Euratom on-site laboratories provide accurate verification results for fuel undergoing reprocessing, the situation is different for spent fuel which is destined for final storage. In particular, new needs arise from the increasing number of cask loadings for interim dry storage and the advanced plans for the construction of encapsulation plants and geological repositories. Various scenarios present verification challenges. In this context, EURATOM Safeguards, often in cooperation with other stakeholders, is committed to further improvement of NDA methods for spent fuel verification. In this effort EURATOM plays various roles, ranging from definition of inspection needs to direct participation in development of measurement systems, including support of research in the framework of international agreements and via the EC Support Program to the IAEA. This paper presents recent progress in selected NDA methods. These methods have been conceived to satisfy different spent fuel verification needs, ranging from attribute testing to pin-level partial defect verification.

  11. Verification of responses of Japanese medaka (Oryzias latipes) to anti-androgens, vinclozolin and flutamide, in short-term assays.

    PubMed

    Nakamura, Ataru; Takanobu, Hitomi; Tamura, Ikumi; Yamamuro, Masumi; Iguchi, Taisen; Tatarazako, Norihisa

    2014-05-01

    Various testing methods for detecting the endocrine-disruptive activities of chemicals have been developed in freshwater fish species. However, few relatively simple methods specific to detecting anti-androgenic activities are available for fish. The aim of this study was to verify the papillary process in Japanese medaka (Oryzias latipes) as an indicator of the anti-androgenic activity of chemicals. Japanese medaka were exposed to two anti-androgenic compounds, vinclozolin and flutamide, in two short-term assays; one conformed to the existing short-term reproduction assay using adult fish (adult test) and the other was based on the same methods but used juvenile fish at the beginning of exposure (juvenile test). Significant decreases in male papillary processes were observed in the juvenile test at the highest concentration of both anti-androgens (640 µg l⁻¹ vinclozolin and 1000 µg l⁻¹ flutamide); however, no significant effects were observed in the adult test. Consequently, our results indicate that papillary processes in Japanese medaka can be used as the end-point for screening the anti-androgenic activity of chemicals using juvenile fish for a specific period based on the existing short-term reproduction assay. Copyright © 2013 John Wiley & Sons, Ltd.

  12. The Impact of Gate Width Setting and Gate Utilization Factors on Plutonium Assay in Passive Correlated Neutron Counting

    DOE PAGES

    Henzlova, Daniela; Menlove, Howard Olsen; Croft, Stephen; ...

    2015-06-15

    In the field of nuclear safeguards, passive neutron multiplicity counting (PNMC) is a method typically employed in non-destructive assay (NDA) of special nuclear material (SNM) for nonproliferation, verification and accountability purposes. PNMC is generally performed using a well-type thermal neutron counter and relies on the detection of correlated pairs or higher order multiplets of neutrons emitted by an assayed item. To assay SNM, a set of parameters for a given well-counter is required to link the measured multiplicity rates to the assayed item properties. Detection efficiency, die-away time, gate utilization factors (tightly connected to die-away time) as well as optimum gate width setting are among the key parameters. These parameters along with the underlying model assumptions directly affect the accuracy of the SNM assay. In this paper we examine the role of gate utilization factors and the single exponential die-away time assumption and their impact on the measurements for a range of plutonium materials. In addition, we examine the importance of item-optimized coincidence gate width setting as opposed to using a universal gate width value. Finally, the traditional PNMC based on multiplicity shift register electronics is extended to Feynman-type analysis and application of this approach to Pu mass assay is demonstrated.
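
    As a worked illustration of the gate-width trade-off examined above, the sketch below evaluates the doubles gate utilization factor implied by the single-exponential die-away assumption, fd = exp(-P/tau) * (1 - exp(-G/tau)). The predelay, die-away time, and gate widths are illustrative values, not parameters from the paper.

```python
import numpy as np

def doubles_gate_fraction(gate_us, predelay_us, die_away_us):
    """Doubles gate utilization factor under a single-exponential
    die-away profile: fd = exp(-P/tau) * (1 - exp(-G/tau))."""
    return np.exp(-predelay_us / die_away_us) * \
        (1.0 - np.exp(-gate_us / die_away_us))

# Longer gates capture a larger fraction of correlated pairs, at the cost
# of admitting more accidentals (illustrative counter: tau = 50 us, P = 4.5 us).
for gate in (32.0, 64.0, 128.0):
    print(gate, round(doubles_gate_fraction(gate, 4.5, 50.0), 3))
```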

  13. An approach to rule-out an acute cardiovascular event or death in emergency department patients using outcome-based cutoffs for high-sensitivity cardiac troponin assays and glucose.

    PubMed

    Shortt, Colleen; Phan, Kim; Hill, Stephen A; Worster, Andrew; Kavsak, Peter A

    2015-03-01

    The application of "undetectable" high-sensitivity cardiac troponin (hs-cTn) concentrations to "rule out" myocardial infarction is appealing, but there are analytical concerns and a lack of consensus on what concentration should be used to define the lower reportable limit; i.e., limit of detection (LoD) or limit of blank. An alternative approach is to utilize a measurable hs-cTn concentration that identifies patients at low risk for a future cardiovascular event, combined with another prognostic test, such as glucose. We assessed both of these approaches in different emergency department (ED) cohorts to rule out an event. We used cohort 1 (all-comer ED population, n=4773; derivation cohort) to determine the most appropriate approach at presentation (i.e., Dual Panel test: hs-cTn/glucose vs. LoD vs. LoD/glucose) for an early rule-out of hospital death using the Abbott ARCHITECT hs-cTnI assay. We used cohort 2 (n=144) and cohort 3 (n=127), both early chest pain onset ED populations, as the verification datasets (outcome: composite cardiovascular event at 72 h) with three hs-cTn assays assessed (Abbott Laboratories, Beckman Coulter, Roche Diagnostics). In cohort 1, the sensitivity was >99% for all three approaches; however the specificity (11%; 95% CI: 10-12%) was significantly higher for the Dual Panel as compared to the LoD approach (specificity=5%; 95% CI: 4-6%). Verification of the Dual Panel in cohort 2 and cohort 3 revealed 100% sensitivity and negative predictive values for all three hs-cTn assays. The combination of a "healthy" hs-cTn concentration with glucose might effectively rule out patients for an acute cardiovascular event at ED presentation. Copyright © 2014 The Canadian Society of Clinical Chemists. Published by Elsevier Inc. All rights reserved.
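
    A minimal sketch of how the rule-out performance figures above follow from a 2x2 table is given below; the counts are hypothetical, chosen only to mimic a strategy with very high sensitivity and modest specificity.

```python
def rule_out_performance(tp, fn, tn, fp):
    """Performance of a rule-out strategy (e.g., an hs-cTn/glucose panel):
    tp = events flagged for work-up, fn = events wrongly ruled out,
    tn = non-events ruled out, fp = non-events flagged."""
    sensitivity = tp / (tp + fn)
    specificity = tn / (tn + fp)
    npv = tn / (tn + fn)          # negative predictive value
    return sensitivity, specificity, npv

# Hypothetical counts: 200 events all flagged; 11% of 4000 non-events ruled out
print(rule_out_performance(tp=200, fn=0, tn=440, fp=3560))
# -> (1.0, 0.11, 1.0)
```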

  14. Amino acids in a targeted versus a non-targeted metabolomics LC-MS/MS assay. Are the results consistent?

    PubMed

    Klepacki, Jacek; Klawitter, Jost; Klawitter, Jelena; Karimpour-Fard, Anis; Thurman, Joshua; Ingle, Gordon; Patel, Dharmesh; Christians, Uwe

    2016-09-01

    The results of plasma amino acid patterns in samples from kidney transplant patients with good and impaired renal function using a targeted LC-MS/MS amino acid assay and a non-targeted metabolomics assay were compared. EDTA plasma samples were prospectively collected at baseline and 1, 2, 4 and 6 months post-transplant (n=116 patients, n=398 samples). Each sample was analyzed using both a commercial amino acid LC-MS/MS assay and a non-targeted metabolomics assay also based on MS/MS ion transitions. The results of both assays were independently statistically analyzed to identify amino acids associated with estimated glomerular filtration rates using correlation and partial least squares-discriminant analysis. Although there was overlap between the results of the targeted and non-targeted metabolomics assays (tryptophan, 1-methyl histidine), there were also substantial inconsistencies, with the non-targeted assay resulting in more "hits" than the targeted assay. Without further verification of the hits detected by the non-targeted discovery assay, this would have led to a different interpretation of the results. There were also false negative results when the non-targeted assay was used (hydroxyproline). Several of these discrepancies could be explained by loss of sensitivity during analytical runs for selected amino acids (serine and threonine), retention time shifts, signals above the range of linear detector response, and integration of peaks not separated from background and interferences (aspartate) when the non-targeted metabolomics assay was used. Whenever assessment of a specific pathway such as amino acids is the focus of interest, a targeted assay seems preferable to a non-targeted metabolomics assay. Copyright © 2016. Published by Elsevier Inc.

  15. Quantitative non-destructive assay of PuBe neutron sources

    NASA Astrophysics Data System (ADS)

    Lakosi, László; Bagi, János; Nguyen, Cong Tam

    2006-02-01

    PuBe neutron sources were assayed using a combination of high resolution γ-spectrometry (HRGS) and a neutron correlation technique. In a previous publication [J. Bagi, C. Tam Nguyen, L. Lakosi, Nucl. Instr. and Meth. B 222 (2004) 242] a passive neutron well-counter was reported, with 3He tubes embedded in a polyamide (TERRAMID) moderator (lined inside with Cd) surrounding the sources to be measured. Gross and coincidence neutron counting was performed, and the Pu content of the sources was determined from isotope analysis and by adopting specific (α, n) reaction yields of the Pu isotopes and 241Am in Be, based on the supplier's information and literature data. The method was further developed and refined, and the evaluation algorithm was worked out more precisely. The contribution of secondary (correlated) neutrons to the total neutron output was derived from the coincidence (doubles) count rate and taken into account in assessing the Pu content. A new evaluation of the former results was performed. The assay was extended to other PuBe sources, and new results were added. In order to attain higher detection efficiency, a more efficient moderator was also applied, with and without Cd shielding around the assay chamber. Calibration seems possible using neutron measurements only (without γ-spectrometry), based on a correlation between the Pu amount and the coincidence-to-total ratio. It is expected that the method could be used for Pu accountancy and safeguards verification, as well as for identification and assay of seized, found, or undocumented PuBe neutron sources.

  16. Absorption of p,p'-dichlorodiphenyldichloroethylene and dieldrin in largemouth bass from a 60-D slow-release pellet and detection using a novel enzyme-linked immunosorbent assay method for blood plasma

    USGS Publications Warehouse

    Muller, Jennifer K.; Sepulveda, Maria S.; Borgert, Christopher J.; Gross, Timothy S.

    2005-01-01

    This work describes the uptake of two organochlorine pesticides from slow-release pellets by largemouth bass and the utility of a blood plasma enzyme-linked immunosorbent assay (ELISA) method for exposure verification. We measured blood and tissue levels by gas chromatography/mass spectrometry and by a novel ELISA method, and present a critical comparison of the results.

  17. Laboratory Evaluation of the Liat HIV Quant (IQuum) Whole-Blood and Plasma HIV-1 Viral Load Assays for Point-of-Care Testing in South Africa

    PubMed Central

    Gous, Natasha; Carmona, Sergio; Stevens, Wendy

    2015-01-01

    Point-of-care (POC) HIV viral load (VL) testing offers the potential to reduce turnaround times for antiretroviral therapy monitoring, offer near-patient acute HIV diagnosis in adults, extend existing centralized VL services, screen women in labor, and enable early treatment of pediatric patients. The Liat HIV Quant plasma and whole-blood assays, prerelease version, were evaluated in South Africa. The precision, accuracy, linearity, and agreement of the Liat HIV Quant whole-blood and plasma assays were compared to those of reference technologies (Roche CAP CTMv2.0 and Abbott RealTime HIV-1) on an HIV verification plasma panel (n = 42) and HIV clinical specimens (n = 163). The HIV Quant plasma assay showed good performance, with a 2.7% similarity coefficient of variation (CV) compared to the Abbott assay and a 1.8% similarity CV compared to the Roche test on the verification panel, and 100% specificity. HIV Quant plasma had substantial agreement (pc [concordance correlation] = 0.96) with Roche on clinical specimens and increased variability (pc = 0.73) in the <3.0 log copies/ml range with the HIV Quant whole-blood assay. The HIV Quant plasma assay had good linearity (2.0 to 5.0 log copies/ml; R² = 0.99). Clinical sensitivity at a viral load of 1,000 copies/ml of the HIV Quant plasma and whole-blood assays compared to that of the Roche assay (n = 94) was 100% (confidence interval [CI], 95.3% to 100%). The specificity of HIV Quant plasma was 88.2% (CI, 63.6% to 98.5%), and that for whole blood was 41.2% (CI, 18.4% to 67.1%). No virological failure (downward misclassification) was missed. The Liat HIV Quant plasma assay can be interchanged with existing VL technology in South Africa. The Liat HIV Quant whole-blood assay would be advantageous for POC early infant diagnosis at birth and adult adherence monitoring and needs to be evaluated further in this clinical context. LIAT cartridges currently require cold storage, but the technology is user-friendly and robust. Clinical cost and implementation modeling is required. PMID:25740777
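
    The agreement statistic reported above as pc is Lin's concordance correlation coefficient. A minimal sketch of its computation on paired log copies/ml values follows; the example data are invented.

```python
import numpy as np

def concordance_correlation(x, y):
    """Lin's concordance correlation coefficient for paired measurements,
    e.g., Liat vs reference viral loads in log copies/ml."""
    x, y = np.asarray(x, float), np.asarray(y, float)
    cov = ((x - x.mean()) * (y - y.mean())).mean()   # population covariance
    return 2 * cov / (x.var() + y.var() + (x.mean() - y.mean()) ** 2)

# Invented paired values for illustration only
print(concordance_correlation([2.1, 3.4, 4.8, 5.0], [2.0, 3.5, 4.9, 5.2]))
```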

  18. ENVIRONMENTAL TECHNOLOGY VERIFICATION REPORT - RAPID ASSAY SYSTEM FOR PCB ANALYSIS - STRATEGIC DIAGNOSTICS INC.

    EPA Science Inventory

    In July 1997, the U.S. Environmental Protection Agency (EPA) conducted a demonstration of polychlorinated biphenyl (PCB) field analytical techniques. The demonstration design was subjected to extensive review and comment by EPA's National Exposure Research Laboratory (NERL) Envi...

  19. Source strength verification and quality assurance of preloaded brachytherapy needles using a CMOS flat panel detector

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Golshan, Maryam, E-mail: maryam.golshan@bccancer.bc.ca; Spadinger, Ingrid; Chng, Nick

    2016-06-15

    Purpose: Current methods of low dose rate brachytherapy source strength verification for sources preloaded into needles consist of either assaying a small number of seeds from a separate sample belonging to the same lot used to load the needles or performing batch assays of a subset of the preloaded seed trains. Both of these methods are cumbersome and have the limitations inherent to sampling. The purpose of this work was to investigate an alternative approach that uses an image-based, autoradiographic system capable of the rapid and complete assay of all sources without compromising sterility. Methods: The system consists of a flat panel image detector, an autoclavable needle holder, and software to analyze the detected signals. The needle holder was designed to maintain a fixed vertical spacing between the needles and the image detector, and to collimate the emissions from each seed. It also provides a sterile barrier between the needles and the imager. The image detector has a sufficiently large image capture area to allow several needles to be analyzed simultaneously. Several tests were performed to assess the accuracy and reproducibility of source strengths obtained using this system. Three different seed models (Oncura 6711 and 9011 125I seeds, and IsoAid Advantage 103Pd seeds) were used in the evaluations. Seeds were loaded into trains with at least 1 cm spacing. Results: Using our system, it was possible to obtain linear calibration curves with coverage factor k = 1 prediction intervals of less than ±2% near the centre of their range for the three source models. The uncertainty budget calculated from a combination of type A and type B estimates of potential sources of error was somewhat larger, yielding (k = 1) combined uncertainties for individual seed readings of 6.2% for 125I 6711 seeds, 4.7% for 125I 9011 seeds, and 11.0% for Advantage 103Pd seeds. Conclusions: This study showed that a flat panel detector dosimetry system is a viable option for source strength verification in preloaded needles, as it is capable of measuring all of the sources intended for implantation. Such a system has the potential to directly and efficiently estimate individual source strengths, the overall mean source strength, and the positions within the seed-spacer train.
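
    The linear calibration curves and k = 1 prediction intervals described above follow from ordinary least squares. The sketch below computes both for an assumed linear imager response; the calibration data are left as inputs and nothing here is specific to the evaluated system.

```python
import numpy as np

def calibrate_with_prediction(x, y, x_new):
    """Fit source strength (y) vs imager signal (x) by least squares and
    return the predicted strength at x_new with its k = 1 prediction
    half-width (no coverage-factor multiplier beyond one std uncertainty)."""
    x, y = np.asarray(x, float), np.asarray(y, float)
    n = x.size
    slope, intercept = np.polyfit(x, y, 1)
    resid = y - (slope * x + intercept)
    s = np.sqrt((resid ** 2).sum() / (n - 2))          # residual std dev
    sxx = ((x - x.mean()) ** 2).sum()
    half = s * np.sqrt(1.0 + 1.0 / n + (x_new - x.mean()) ** 2 / sxx)
    return slope * x_new + intercept, half
```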

  20. Install active/passive neutron examination and assay (APNEA)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Not Available

    1996-04-01

    This document describes activities pertinent to the installation of the prototype Active/Passive Neutron Examination and Assay (APNEA) system built in Area 336 into its specially designed trailer. It also documents the basic theory of operation, design and protective features, basic personnel training, and the proposed characterization site location at Lockheed Martin Specialty Components, Inc., (Specialty Components) with the estimated 10 mrem/year boundary. Additionally, the document includes the Preventive Change Analysis (PCA) form, and a checklist of items for verification prior to unrestricted system use.

  1. ENVIRONMENTAL TECHNOLOGY VERIFICATION REPORT: IMMUNOASSAY KIT, ENVIROLOGIX, INC., PCB IN SOIL TUBE ASSAY

    EPA Science Inventory

    In July 1997, the U.S. Environmental Protection Agency (EPA) conducted a demonstration of polychlorinated biphenyl (PCB) field analytical techniques. The purpose of this demonstration was to evaluate field analytical technologies capable of detecting and quantifying PCB's in soi...

  2. Biological effects-based tools for monitoring impacted surface waters in the Great Lakes: a multiagency program in support of the Great Lakes Restoration Initiative

    USGS Publications Warehouse

    Ekman, Drew R.; Ankley, Gerald T.; Blazer, Vicki; Collette, Timothy W.; Garcia-Reyero, Natàlia; Iwanowicz, Luke R.; Jorgensen, Zachary G.; Lee, Kathy E.; Mazik, Pat M.; Miller, David H.; Perkins, Edward J.; Smith, Edwin T.; Tietge, Joseph E.; Villeneuve, Daniel L.

    2013-01-01

    There is increasing demand for the implementation of effects-based monitoring and surveillance (EBMS) approaches in the Great Lakes Basin to complement traditional chemical monitoring. Herein, we describe an ongoing multiagency effort to develop and implement EBMS tools, particularly with regard to monitoring potentially toxic chemicals and assessing Areas of Concern (AOCs), as envisioned by the Great Lakes Restoration Initiative (GLRI). Our strategy includes use of both targeted and open-ended/discovery techniques, as appropriate to the amount of information available, to guide a priori end point and/or assay selection. Specifically, a combination of in vivo and in vitro tools is employed by using both wild and caged fish (in vivo), and a variety of receptor- and cell-based assays (in vitro). We employ a work flow that progressively emphasizes in vitro tools for long-term or high-intensity monitoring because of their greater practicality (e.g., lower cost, labor) and relies on in vivo assays for initial surveillance and verification. Our strategy takes advantage of the strengths of a diversity of tools, balancing the depth, breadth, and specificity of information they provide against their costs, transferability, and practicality. Finally, a series of illustrative scenarios is examined that align EBMS options with management goals to illustrate the adaptability and scaling of EBMS approaches and how they can be used in management decisions.

  3. Plasma protein absolute quantification by nano-LC Q-TOF UDMSE for clinical biomarker verification

    PubMed Central

    ILIES, MARIA; IUGA, CRISTINA ADELA; LOGHIN, FELICIA; DHOPLE, VISHNU MUKUND; HAMMER, ELKE

    2017-01-01

    Background and aims Proteome-based biomarker studies are targeting proteins that could serve as diagnostic, prognosis, and prediction molecules. In the clinical routine, immunoassays are currently used for the absolute quantification of such biomarkers, with the major limitation that only one molecule can be targeted per assay. The aim of our study was to test a mass spectrometry based absolute quantification method for the verification of plasma protein sets which might serve as reliable biomarker panels for the clinical practice. Methods Six EDTA plasma samples were analyzed after tryptic digestion using a high throughput data independent acquisition nano-LC Q-TOF UDMSE proteomics approach. Synthetic Escherichia coli standard peptides were spiked in each sample for the absolute quantification. Data analysis was performed using ProgenesisQI v2.0 software (Waters Corporation). Results Our method ensured absolute quantification of 242 non redundant plasma proteins in a single run analysis. The dynamic range covered was 10^5. 86% were represented by classical plasma proteins. The overall median coefficient of variation was 0.36, while a set of 63 proteins was found to be highly stable. Absolute protein concentrations strongly correlated with values reviewed in the literature. Conclusions Nano-LC Q-TOF UDMSE proteomic analysis can be used for a simple and rapid determination of absolute amounts of plasma proteins. A large number of plasma proteins could be analyzed, while a wide dynamic range was covered with low coefficient of variation at protein level. The method proved to be a reliable tool for the quantification of protein panels for biomarker verification in the clinical practice. PMID:29151793
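
    The spike-in strategy above reduces, at its simplest, to a signal-ratio calculation against a known amount of standard. The sketch below shows that arithmetic only; the study's actual quantification was performed inside ProgenesisQI, and every name and value here is illustrative.

```python
def absolute_amount_fmol(analyte_signal, standard_signal, spiked_fmol):
    """Single-point spike-in quantification: if detector response is
    proportional to molar amount, the analyte amount is the signal
    ratio times the known spiked amount."""
    return (analyte_signal / standard_signal) * spiked_fmol

def plasma_concentration_ng_per_ml(amount_fmol, mw_da, volume_ul):
    """Convert a molar amount to a mass concentration for a protein of
    molecular weight mw_da digested from volume_ul of plasma."""
    nanograms = amount_fmol * 1e-15 * mw_da * 1e9   # fmol -> mol -> g -> ng
    return nanograms / (volume_ul * 1e-3)           # per ml

# Invented example: 250 fmol spike, 66.5 kDa protein, 1 uL plasma digested
amt = absolute_amount_fmol(3.2e6, 1.0e5, 250.0)
print(plasma_concentration_ng_per_ml(amt, 66_500, 1.0))
```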

  4. Model Based Verification of Cyber Range Event Environments

    DTIC Science & Technology

    2015-12-10

    Model Based Verification of Cyber Range Event Environments. Suresh K. Damodaran, MIT Lincoln Laboratory, 244 Wood St., Lexington, MA, USA. ... apply model based verification to cyber range event environment configurations, allowing for the early detection of errors in event environment ... Environment Representation (CCER) ontology. We also provide an overview of a methodology to specify verification rules and the corresponding error ...

  5. Multiplexed MRM-based assays for the quantitation of proteins in mouse plasma and heart tissue.

    PubMed

    Percy, Andrew J; Michaud, Sarah A; Jardim, Armando; Sinclair, Nicholas J; Zhang, Suping; Mohammed, Yassene; Palmer, Andrea L; Hardie, Darryl B; Yang, Juncong; LeBlanc, Andre M; Borchers, Christoph H

    2017-04-01

    The mouse is the most commonly used laboratory animal, with more than 14 million mice being used for research each year in North America alone. The number and diversity of mouse models is increasing rapidly through genetic engineering strategies, but detailed characterization of these models is still challenging because most phenotypic information is derived from time-consuming histological and biochemical analyses. To expand the biochemists' toolkit, we generated a set of targeted proteomic assays for mouse plasma and heart tissue, utilizing bottom-up LC/MRM-MS with isotope-labeled peptides as internal standards. Protein quantitation was performed using reverse standard curves, with LC-MS platform and curve performance evaluated by quality control standards. The assays comprising the final panel (101 peptides for 81 proteins in plasma; 227 peptides for 159 proteins in heart tissue) have been rigorously developed under a fit-for-purpose approach and utilize stable-isotope labeled peptides for every analyte to provide high-quality, precise relative quantitation. In addition, the peptides have been tested to be interference-free and the assay is highly multiplexed, with reproducibly determined protein concentrations spanning >4 orders of magnitude. The developed assays have been used in a small pilot study to demonstrate their application to molecular phenotyping or biomarker discovery/verification studies. © 2016 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  6. 40 CFR 1066.250 - Base inertia verification.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    Title 40, Protection of Environment; Vehicle-Testing Procedures, Dynamometer Specifications. § 1066.250 Base inertia verification. (a) Overview. This section describes how to verify the dynamometer's base inertia. (b) Scope and frequency...

  7. 40 CFR 1066.250 - Base inertia verification.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    Title 40, Protection of Environment; Vehicle-Testing Procedures, Dynamometer Specifications. § 1066.250 Base inertia verification. (a) Overview. This section describes how to verify the dynamometer's base inertia. (b) Scope and frequency...

  8. 40 CFR 1066.250 - Base inertia verification.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    Title 40, Protection of Environment; Vehicle-Testing Procedures, Dynamometer Specifications. § 1066.250 Base inertia verification. (a) Overview. This section describes how to verify the dynamometer's base inertia. (b) Scope and frequency...

  9. Audio-visual imposture

    NASA Astrophysics Data System (ADS)

    Karam, Walid; Mokbel, Chafic; Greige, Hanna; Chollet, Gerard

    2006-05-01

    A GMM-based audio-visual speaker verification system is described, and an Active Appearance Model with a linear speaker transformation system is used to evaluate the robustness of the verification. An Active Appearance Model (AAM) is used to automatically locate and track a speaker's face in a video recording. A Gaussian Mixture Model (GMM) based classifier (BECARS) is used for face verification. GMM training and testing are accomplished on DCT-based features extracted from the detected faces. On the audio side, speech features are extracted and used for speaker verification with the GMM-based classifier. Fusion of the audio and video modalities for audio-visual speaker verification is compared with face verification and speaker verification systems. To improve the robustness of the multimodal biometric identity verification system, an audio-visual imposture system is envisioned. It consists of an automatic voice transformation technique that an impostor may use to assume the identity of an authorized client. Features of the transformed voice are then combined with the corresponding appearance features and fed into the GMM-based system BECARS for training. An attempt is made to increase the acceptance rate of the impostor and to analyze the robustness of the verification system. Experiments are being conducted on the BANCA database, with the prospect of experimenting on the PDAtabase newly developed within the scope of the SecurePhone project.
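
    A minimal sketch of GMM-based verification in the spirit of the system described above follows, using scikit-learn in place of BECARS: a client model and a world (background) model each score the test features, and the average log-likelihood ratio is thresholded. Feature extraction (DCT coefficients of face patches, cepstral speech features) is assumed to happen upstream, and the synthetic data below only stand in for such features.

```python
import numpy as np
from sklearn.mixture import GaussianMixture

def train_gmm(features, n_components=16, seed=0):
    """Fit a diagonal-covariance GMM to (n_frames, n_dims) features."""
    return GaussianMixture(n_components=n_components,
                           covariance_type="diag",
                           random_state=seed).fit(features)

def verify(client_gmm, world_gmm, test_features, threshold=0.0):
    """Accept the claimed identity when the average log-likelihood ratio
    of the client model over the world model exceeds the threshold."""
    llr = client_gmm.score(test_features) - world_gmm.score(test_features)
    return llr > threshold, llr

# Toy demo with synthetic vectors standing in for DCT/cepstral features
rng = np.random.default_rng(0)
client = train_gmm(rng.normal(1.0, 1.0, (500, 12)))
world = train_gmm(rng.normal(0.0, 1.5, (2000, 12)))
print(verify(client, world, rng.normal(1.0, 1.0, (100, 12))))
```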

  10. Mass spectrometry based biomarker discovery, verification, and validation--quality assurance and control of protein biomarker assays.

    PubMed

    Parker, Carol E; Borchers, Christoph H

    2014-06-01

    In its early years, mass spectrometry (MS)-based proteomics focused on the cataloging of proteins found in different species or different tissues. By 2005, proteomics was being used for protein quantitation, typically based on "proteotypic" peptides which act as surrogates for the parent proteins. Biomarker discovery is usually done by non-targeted "shotgun" proteomics, using relative quantitation methods to determine protein expression changes that correlate with disease (output given as "up-or-down regulation" or "fold-increases"). MS-based techniques can also perform "absolute" quantitation which is required for clinical applications (output given as protein concentrations). Here we describe the differences between these methods, factors that affect the precision and accuracy of the results, and some examples of recent studies using MS-based proteomics to verify cancer-related biomarkers. Copyright © 2014 Federation of European Biochemical Societies. Published by Elsevier B.V. All rights reserved.

  11. Laboratory evaluation of the Liat HIV Quant (IQuum) whole-blood and plasma HIV-1 viral load assays for point-of-care testing in South Africa.

    PubMed

    Scott, Lesley; Gous, Natasha; Carmona, Sergio; Stevens, Wendy

    2015-05-01

    Point-of-care (POC) HIV viral load (VL) testing offers the potential to reduce turnaround times for antiretroviral therapy monitoring, offer near-patient acute HIV diagnosis in adults, extend existing centralized VL services, screen women in labor, and enable early treatment of pediatric patients. The Liat HIV Quant plasma and whole-blood assays, prerelease version, were evaluated in South Africa. The precision, accuracy, linearity, and agreement of the Liat HIV Quant whole-blood and plasma assays were compared to those of reference technologies (Roche CAP CTMv2.0 and Abbott RealTime HIV-1) on an HIV verification plasma panel (n = 42) and HIV clinical specimens (n = 163). The HIV Quant plasma assay showed good performance, with a 2.7% similarity coefficient of variation (CV) compared to the Abbott assay and a 1.8% similarity CV compared to the Roche test on the verification panel, and 100% specificity. HIV Quant plasma had substantial agreement (pc [concordance correlation] = 0.96) with Roche on clinical specimens and increased variability (pc = 0.73) in the <3.0 log copies/ml range with the HIV Quant whole-blood assay. The HIV Quant plasma assay had good linearity (2.0 to 5.0 log copies/ml; R² = 0.99). Clinical sensitivity at a viral load of 1,000 copies/ml of the HIV Quant plasma and whole-blood assays compared to that of the Roche assay (n = 94) was 100% (confidence interval [CI], 95.3% to 100%). The specificity of HIV Quant plasma was 88.2% (CI, 63.6% to 98.5%), and that for whole blood was 41.2% (CI, 18.4% to 67.1%). No virological failure (downward misclassification) was missed. The Liat HIV Quant plasma assay can be interchanged with existing VL technology in South Africa. The Liat HIV Quant whole-blood assay would be advantageous for POC early infant diagnosis at birth and adult adherence monitoring and needs to be evaluated further in this clinical context. LIAT cartridges currently require cold storage, but the technology is user-friendly and robust. Clinical cost and implementation modeling is required. Copyright © 2015, American Society for Microbiology. All Rights Reserved.

  12. Research on key technology of the verification system of steel rule based on vision measurement

    NASA Astrophysics Data System (ADS)

    Jia, Siyuan; Wang, Zhong; Liu, Changjie; Fu, Luhua; Li, Yiming; Lu, Ruijun

    2018-01-01

    The steel rule plays an important role in quantity transmission. However, the traditional verification method for steel rules, based on manual operation and reading, suffers from low precision and low efficiency. A machine-vision-based verification system for steel rules is designed with reference to JJG 1-1999, Verification Regulation of Steel Rule [1]. What differentiates this system is that it uses a new calibration method for the pixel equivalent and decontaminates the surface of the steel rule. Experiments show that these two methods fully meet the requirements of the verification system. Measurement results demonstrate that these methods not only meet the precision required by the verification regulation but also improve the reliability and efficiency of the verification system.
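
    The pixel-equivalent calibration at the heart of such a system is simple once a reference gauge of known length is imaged in the rule's plane. The sketch below shows the idea; the gauge length and edge positions are invented numbers, and sub-pixel edge detection is assumed to have been done beforehand.

```python
def pixel_equivalent(gauge_length_mm, gauge_length_px):
    """Calibrate the pixel equivalent (mm per pixel) from a reference
    gauge imaged in the same plane as the steel rule."""
    return gauge_length_mm / gauge_length_px

def graduation_error_mm(edge_px_a, edge_px_b, nominal_mm, mm_per_px):
    """Deviation of one measured graduation interval from its nominal
    length, in millimetres."""
    return abs(edge_px_b - edge_px_a) * mm_per_px - nominal_mm

# Invented values: a 10 mm gauge spans 2083.4 px; one 10 mm interval on the
# rule spans 120.0 px to 2203.6 px in the image.
mm_per_px = pixel_equivalent(10.0, 2083.4)
print(graduation_error_mm(120.0, 2203.6, 10.0, mm_per_px))
```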

  13. Systematic Model-in-the-Loop Test of Embedded Control Systems

    NASA Astrophysics Data System (ADS)

    Krupp, Alexander; Müller, Wolfgang

    Current model-based development processes offer new opportunities for verification automation, e.g., in automotive development. The duty of functional verification is the detection of design flaws. Current functional verification approaches exhibit a major gap between requirement definition and formal property definition, especially when analog signals are involved. Besides the lack of methodical support for natural-language formalization, there is no standardized and accepted means of formal property definition as a target for verification planning. This article addresses several shortcomings of embedded system verification. An Enhanced Classification Tree Method is developed based on the established Classification Tree Method for Embedded Systems (CTM/ES), which applies a hardware verification language to define a verification environment.

  14. Identification of multiple novel protein biomarkers shed by human serous ovarian tumors into the blood of immunocompromised mice and verified in patient sera.

    PubMed

    Beer, Lynn A; Wang, Huan; Tang, Hsin-Yao; Cao, Zhijun; Chang-Wong, Tony; Tanyi, Janos L; Zhang, Rugang; Liu, Qin; Speicher, David W

    2013-01-01

    The most cancer-specific biomarkers in blood are likely to be proteins shed directly by the tumor rather than less specific inflammatory or other host responses. The use of xenograft mouse models together with in-depth proteome analysis for identification of human proteins in the mouse blood is an under-utilized strategy that can clearly identify proteins shed by the tumor. In the current study, 268 human proteins shed into mouse blood from human OVCAR-3 serous tumors were identified based upon human vs. mouse species differences using a four-dimensional plasma proteome fractionation strategy. A multi-step prioritization and verification strategy was subsequently developed to efficiently select some of the most promising biomarkers from this large number of candidates. A key step was parallel analysis of human proteins detected in the tumor supernatant, because substantially greater sequence coverage for many of the human proteins initially detected in the xenograft mouse plasma confirmed assignments as tumor-derived human proteins. Verification of candidate biomarkers in patient sera was facilitated by in-depth, label-free quantitative comparisons of serum pools from patients with ovarian cancer and benign ovarian tumors. The only proteins that advanced to multiple reaction monitoring (MRM) assay development were those that exhibited increases in ovarian cancer patients compared with benign tumor controls. MRM assays were readily developed for all 11 novel biomarker candidates selected by this process, and analysis of larger pools of patient sera suggested that all 11 proteins are promising candidate biomarkers that should be further evaluated on individual patient blood samples.

  15. PCR tools for the verification of the specific identity of ascaridoid nematodes from dogs and cats.

    PubMed

    Li, M W; Lin, R Q; Chen, H H; Sani, R A; Song, H Q; Zhu, X Q

    2007-01-01

    Based on the sequences of the internal transcribed spacers (ITS-1 and ITS-2) of nuclear ribosomal DNA (rDNA) of Toxocara canis, Toxocara cati, Toxocara malaysiensis and Toxascaris leonina, specific forward primers were designed in the ITS-1 or ITS-2 for each of the four ascaridoid species of dogs and cats. These primers were used individually together with a conserved primer in the large subunit of rDNA to amplify partial ITS-1 and/or ITS-2 of rDNA from 107 DNA samples from ascaridoids from dogs and cats in China, Australia, Malaysia, England and the Netherlands. This approach allowed their specific identification, with no amplicons being amplified from heterogeneous DNA samples, and sequencing confirmed the identity of the sequences amplified. The minimum amounts of DNA detectable using the PCR assays were 0.13-0.54 ng. These PCR assays should provide useful tools for the diagnosis and molecular epidemiological investigations of toxocariasis in humans and animals.

  16. Biomarker Discovery and Verification of Esophageal Squamous Cell Carcinoma Using Integration of SWATH/MRM.

    PubMed

    Hou, Guixue; Lou, Xiaomin; Sun, Yulin; Xu, Shaohang; Zi, Jin; Wang, Quanhui; Zhou, Baojin; Han, Bo; Wu, Lin; Zhao, Xiaohang; Lin, Liang; Liu, Siqi

    2015-09-04

    We propose an efficient integration of SWATH with MRM for biomarker discovery and verification when the corresponding ion library is well established. We strictly controlled the false positive rate associated with SWATH MS signals and carefully selected the target peptides coupled with SWATH and MRM. We collected 10 paired samples of esophageal squamous cell carcinoma (ESCC) tumor tissue and adjacent regions and quantified 1758 unique proteins at a protein-level FDR of 1% using SWATH, of which 467 proteins showed ESCC-dependent abundance. After carefully evaluating the SWATH MS signals of the up-regulated proteins, we selected 120 proteins for MRM verification. MRM analysis of the pooled and individual esophageal tissues resulted in 116 proteins that exhibited abundance responses to ESCC similar to those acquired with SWATH. Because the ESCC-related proteins included a high percentage of secreted proteins, we conducted the MRM assay on patient sera collected pre- and postoperation. Of the 116 target proteins, 42 were identified in the ESCC sera, including 11 with lowered abundances postoperation. Coupling SWATH and MRM is thus feasible and efficient for the discovery and verification of cancer-related protein biomarkers.

  17. Enrichment Assay Methods Development for the Integrated Cylinder Verification System

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Smith, Leon E.; Misner, Alex C.; Hatchell, Brian K.

    2009-10-22

    International Atomic Energy Agency (IAEA) inspectors currently perform periodic inspections at uranium enrichment plants to verify UF6 cylinder enrichment declarations. Measurements are typically performed with handheld high-resolution sensors on a sampling of cylinders taken to be representative of the facility's entire product-cylinder inventory. Pacific Northwest National Laboratory (PNNL) is developing a concept to automate the verification of enrichment plant cylinders to enable 100 percent product-cylinder verification and potentially, mass-balance calculations on the facility as a whole (by also measuring feed and tails cylinders). The Integrated Cylinder Verification System (ICVS) could be located at key measurement points to positively identify each cylinder, measure its mass and enrichment, store the collected data in a secure database, and maintain continuity of knowledge on measured cylinders until IAEA inspector arrival. The three main objectives of this FY09 project are summarized here and described in more detail in the report: (1) Develop a preliminary design for a prototype NDA system, (2) Refine PNNL's MCNP models of the NDA system, and (3) Procure and test key pulse-processing components. Progress against these tasks to date, and next steps, are discussed.

  18. Investigation, Development, and Evaluation of Performance Proving for Fault-tolerant Computers

    NASA Technical Reports Server (NTRS)

    Levitt, K. N.; Schwartz, R.; Hare, D.; Moore, J. S.; Melliar-Smith, P. M.; Shostak, R. E.; Boyer, R. S.; Green, M. W.; Elliott, W. D.

    1983-01-01

    A number of methodologies for verifying systems, and computer-based tools that assist users in verifying their systems, were developed. These tools were applied to verify in part the SIFT ultrareliable aircraft computer. Topics covered included: the STP theorem prover; design verification of SIFT; high-level language code verification; assembly-language-level verification; numerical algorithm verification; verification of flight control programs; and verification of hardware logic.

  19. The capability of lithography simulation based on MVM-SEM® system

    NASA Astrophysics Data System (ADS)

    Yoshikawa, Shingo; Fujii, Nobuaki; Kanno, Koichi; Imai, Hidemichi; Hayano, Katsuya; Miyashita, Hiroyuki; Shida, Soichi; Murakawa, Tsutomu; Kuribara, Masayuki; Matsumoto, Jun; Nakamura, Takayuki; Matsushita, Shohei; Hara, Daisuke; Pang, Linyong

    2015-10-01

    Lithography at the 1X-nm technology node uses SMO-ILT, NTD, or even more complex patterns. In mask defect inspection, defect verification therefore becomes more difficult, because many nuisance defects are detected on aggressive mask features. One key technology in mask manufacturing is defect verification using an aerial image simulator or other printability simulation. AIMS™ technology correlates excellently with the wafer and is the standard tool for defect verification; however, it is impractical for verifying a hundred or more defects. We previously reported defect verification based on lithography simulation with a SEM system whose architecture and software correlate well with the wafer for simple line-and-space patterns [1]. In this paper, we combine a next-generation SEM system with a lithography simulation tool for SMO-ILT, NTD, and other complex-pattern lithography. Furthermore, we use three-dimensional (3D) lithography simulation based on the Multi Vision Metrology SEM system. Finally, we confirm the performance of 2D and 3D lithography simulation based on the SEM system for photomask verification.

  20. LEGO plot for simultaneous application of multiple quality requirements during trueness verification of quantitative laboratory tests.

    PubMed

    Park, Hae-il; Chae, Hyojin; Kim, Myungshin; Lee, Jehoon; Kim, Yonggoo

    2014-03-01

    We developed a two-dimensional plot for viewing trueness that takes into account potential shift and variable quality requirements when verifying trueness using certified reference material (CRM). Glucose, total cholesterol (TC), and creatinine levels were determined by two kinds of assay in two levels of a CRM. Available quality requirements were collected, codified, and sorted in ascending order in the plot's header row. Centering on the mean of measured values from the CRM, the "mean ± US CLIA '88 allowable total error" was located in the header of the leftmost and rightmost columns. Twenty points were created in the intervening columns as potential shifts. Uncertainties were calculated according to the regression between certified values and uncertainties of the CRM, and positioned in the corresponding columns. Cells were assigned different colors where column and row intersected, based on comparison of the 95% confidence interval of the percentage bias with each quality requirement. A glucose assay failed to meet the highest quality criteria, for which a shift of +0.13 to +0.14 mmol/l would have been required. A TC assay met the quality requirement, and a shift of ±0.03 mmol/l was tolerable. A creatinine assay also met the quality requirement, but no shift was tolerable. The plot provides a systematic view of the trueness of quantitative laboratory tests. © 2014 Wiley Periodicals, Inc.
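
    The core test behind the plot, comparing the 95% confidence interval of the percentage bias against a quality requirement, can be sketched in a few lines. The following Python fragment uses hypothetical replicate values and a hypothetical 6% bias limit purely for illustration; it is not the authors' implementation.

      # Sketch of the LEGO-plot acceptance test with hypothetical numbers:
      # does the 95% CI of the percentage bias stay inside the requirement?
      import math

      certified = 5.50          # certified CRM glucose value, mmol/L
      u_certified = 0.04        # standard uncertainty of the certified value
      measured = [5.61, 5.58, 5.64, 5.59, 5.62]   # replicate results, mmol/L

      n = len(measured)
      mean = sum(measured) / n
      sd = math.sqrt(sum((x - mean) ** 2 for x in measured) / (n - 1))
      u_comb = math.sqrt((sd / math.sqrt(n)) ** 2 + u_certified ** 2)

      bias_pct = 100.0 * (mean - certified) / certified
      ci_half = 100.0 * 1.96 * u_comb / certified   # ~95% expanded uncertainty

      requirement_pct = 6.0     # hypothetical quality requirement, percent
      meets = abs(bias_pct) + ci_half <= requirement_pct
      print(f"bias = {bias_pct:+.2f}% +/- {ci_half:.2f}%  meets: {meets}")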

  1. DFM flow by using combination between design based metrology system and model based verification at sub-50nm memory device

    NASA Astrophysics Data System (ADS)

    Kim, Cheol-kyun; Kim, Jungchan; Choi, Jaeseung; Yang, Hyunjo; Yim, Donggyu; Kim, Jinwoong

    2007-03-01

    As the minimum transistor length shrinks, the variation and uniformity of transistor length seriously affect device performance, so the importance of optical proximity correction (OPC) and resolution enhancement technology (RET) cannot be overemphasized. However, the OPC process is regarded by some as a necessary evil in device performance. In fact, every group, including process and design, is interested in the whole-chip CD variation trend and CD uniformity, which represent the real wafer. Recently, design-based metrology systems have become capable of detecting differences between the design database and wafer SEM images, and can extract whole-chip CD variation information. According to the results, OPC abnormalities were identified and design feedback items were also disclosed. Other approaches have been pursued by EDA companies, such as model-based OPC verification. Model-based verification is performed for the full chip area using a well-calibrated model; its object is the prediction of potential weak points on the wafer and fast feedback to OPC and design before reticle fabrication. In order to achieve a robust design and sufficient device margin, an appropriate combination of a design-based metrology system and model-based verification tools is very important. Therefore, we evaluated a design-based metrology system and matched it with a model-based verification system to find the optimum combination of the two. In our study, a huge amount of wafer-result data was classified and analyzed by statistical methods and sorted into OPC-feedback and design-feedback items. Additionally, a novel DFM flow is proposed using the combination of design-based metrology and model-based verification tools.

  2. A methodology for the rigorous verification of plasma simulation codes

    NASA Astrophysics Data System (ADS)

    Riva, Fabio

    2016-10-01

    The methodology used to assess the reliability of numerical simulation codes constitutes the Verification and Validation (V&V) procedure. V&V is composed of two separate tasks: verification, a mathematical exercise targeted at assessing that the physical model is correctly solved, and validation, which determines the consistency of the code results, and therefore of the physical model, with experimental data. In the present talk we focus our attention on verification, which in turn is composed of code verification, targeted at assessing that the physical model is correctly implemented in a simulation code, and solution verification, which quantifies the numerical error affecting a simulation. Bridging the gap between plasma physics and other scientific domains, we introduced for the first time in our domain a rigorous methodology for code verification, based on the method of manufactured solutions, as well as solution verification based on Richardson extrapolation. This methodology was applied to GBS, a three-dimensional fluid code based on a finite difference scheme, used to investigate plasma turbulence in basic plasma physics experiments and in the tokamak scrape-off layer. Overcoming the difficulty of dealing with a numerical method intrinsically affected by statistical noise, we have now generalized the rigorous verification methodology to simulation codes based on the particle-in-cell algorithm, which are employed to solve the Vlasov equation in the investigation of a number of plasma physics phenomena.
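
    The solution-verification step mentioned above can be made concrete with Richardson extrapolation. Given results from three systematically refined grids with a constant refinement ratio, one can estimate the observed order of convergence and the discretization error; the sketch below uses hypothetical values and illustrates only the general technique, not code from GBS.

      # Richardson extrapolation on three grid levels (hypothetical data).
      import math

      r = 2.0                                  # assumed grid refinement ratio
      f_coarse, f_medium, f_fine = 1.0480, 1.0120, 1.0030

      # Observed order of accuracy from the three solutions.
      p = math.log((f_coarse - f_medium) / (f_medium - f_fine)) / math.log(r)

      # Extrapolated ("grid-converged") value and fine-grid error estimate.
      f_exact = f_fine + (f_fine - f_medium) / (r ** p - 1.0)
      print(f"observed order p = {p:.2f}, extrapolated = {f_exact:.5f}, "
            f"fine-grid error ~ {f_fine - f_exact:+.2e}")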

  3. LLCEDATA and LLCECALC for Windows version 1.0, Volume 3: Software verification and validation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    McFadden, J.G.

    1998-09-04

    LLCEDATA and LLCECALC for Windows are user-friendly computer software programs that work together to determine the proper waste designation, handling, and disposition requirements for Long Length Contaminated Equipment (LLCE). LLCEDATA reads from a variety of databases to produce an equipment data file (EDF) that represents a snapshot of both the LLCE and the tank from which it originates. LLCECALC reads the EDF and the gamma assay file (AV2) that is produced by the Flexible Receiver Gamma Energy Analysis System. LLCECALC performs corrections to the AV2 file as it is being read and characterizes the LLCE. Both programs produce a variety of reports, including a characterization report and a status report. The status report documents each action taken by the user, LLCEDATA, and LLCECALC. Documentation for LLCEDATA and LLCECALC for Windows is available in three volumes. Volume 1 is a user's manual, which is intended as a quick reference for both LLCEDATA and LLCECALC. Volume 2 is a technical manual, which discusses system limitations and provides recommendations to the LLCE process. Volume 3 documents LLCEDATA and LLCECALC's verification and validation. Two of the three installation test cases from Volume 1 are independently confirmed. Databases used in LLCEDATA are verified and referenced. Both phases of LLCECALC processing, gamma analysis and characterization, are extensively tested to verify that the methodology and algorithms used are correct.

  4. CD volume design and verification

    NASA Technical Reports Server (NTRS)

    Li, Y. P.; Hughes, J. S.

    1993-01-01

    In this paper, we describe a prototype for CD-ROM volume design and verification. This prototype allows users to create their own models of CD volumes by modifying a prototypical model. Rule-based verification of the test volumes can then be performed against the volume definition. This working prototype has proven the concept of model-driven, rule-based design and verification for large quantities of data. The model defined for the CD-ROM volumes becomes a data model as well as an executable specification.

  5. Identification of Multiple Novel Protein Biomarkers Shed by Human Serous Ovarian Tumors into the Blood of Immunocompromised Mice and Verified in Patient Sera

    PubMed Central

    Beer, Lynn A.; Wang, Huan; Tang, Hsin-Yao; Cao, Zhijun; Chang-Wong, Tony; Tanyi, Janos L.; Zhang, Rugang; Liu, Qin; Speicher, David W.

    2013-01-01

    The most cancer-specific biomarkers in blood are likely to be proteins shed directly by the tumor rather than less specific inflammatory or other host responses. The use of xenograft mouse models together with in-depth proteome analysis for identification of human proteins in the mouse blood is an under-utilized strategy that can clearly identify proteins shed by the tumor. In the current study, 268 human proteins shed into mouse blood from human OVCAR-3 serous tumors were identified based upon human vs. mouse species differences using a four-dimensional plasma proteome fractionation strategy. A multi-step prioritization and verification strategy was subsequently developed to efficiently select some of the most promising biomarkers from this large number of candidates. A key step was parallel analysis of human proteins detected in the tumor supernatant, because substantially greater sequence coverage for many of the human proteins initially detected in the xenograft mouse plasma confirmed assignments as tumor-derived human proteins. Verification of candidate biomarkers in patient sera was facilitated by in-depth, label-free quantitative comparisons of serum pools from patients with ovarian cancer and benign ovarian tumors. The only proteins that advanced to multiple reaction monitoring (MRM) assay development were those that exhibited increases in ovarian cancer patients compared with benign tumor controls. MRM assays were readily developed for all 11 novel biomarker candidates selected by this process, and analysis of larger pools of patient sera suggested that all 11 proteins are promising candidate biomarkers that should be further evaluated on individual patient blood samples. PMID:23544127

  6. The laboratory diagnosis of testosterone deficiency.

    PubMed

    Paduch, Darius A; Brannigan, Robert E; Fuchs, Eugene F; Kim, Edward D; Marmar, Joel L; Sandlow, Jay I

    2014-05-01

    The evaluation and treatment of hypogonadal men has become an important part of urologic practice. Fatigue, loss of libido, and erectile dysfunction are commonly reported but nonspecific symptoms, so laboratory verification of low testosterone (T) is an important part of the evaluation, in addition to a detailed history and physical examination. Significant intraindividual fluctuations in serum T levels, biologic variation of T action on end organs, the wide range of T levels in human serum samples, and technical limitations of currently available assays have led to poor reliability of T measurements in the clinical laboratory setting. There is no universally accepted threshold of T concentration that distinguishes eugonadal from hypogonadal men; thus, laboratory results have to be interpreted in the appropriate clinical setting. This review focuses on the clinical, biological, and technological challenges that affect serum T measurements in order to educate clinicians regarding technological advances and limitations of the currently available laboratory methods to diagnose hypogonadism. A collaborative effort led by the American Urological Association between practicing clinicians, patient advocacy groups, government regulatory agencies, industry, and professional societies is underway to provide optimized assay platforms and evidence-based normal assay ranges to guide clinical decision making. Until such standardization is commonplace in clinical laboratories, the decision to treat should be based on the presence of signs and symptoms in addition to serum T measurements. Rigid interpretation of T ranges should not dictate clinical decision making or define coverage of treatment by third-party payers. Copyright © 2014 Elsevier Inc. All rights reserved.

  7. Real-Time Sequence-Validated Loop-Mediated Isothermal Amplification Assays for Detection of Middle East Respiratory Syndrome Coronavirus (MERS-CoV)

    PubMed Central

    Bhadra, Sanchita; Jiang, Yu Sherry; Kumar, Mia R.; Johnson, Reed F.; Hensley, Lisa E.; Ellington, Andrew D.

    2015-01-01

    The Middle East respiratory syndrome coronavirus (MERS-CoV), an emerging human coronavirus, causes severe acute respiratory illness with a 35% mortality rate. In light of the recent surge in reported infections, we have developed asymmetric five-primer reverse transcription loop-mediated isothermal amplification (RT-LAMP) assays for detection of MERS-CoV. Isothermal amplification assays will facilitate the development of portable point-of-care diagnostics that are crucial for management of emerging infections. The RT-LAMP assays are designed to amplify MERS-CoV genomic loci located within the open reading frame (ORF)1a and ORF1b genes and upstream of the E gene. Additionally, we applied one-step strand displacement (OSD) probes for real-time sequence-specific verification of LAMP amplicons. Asymmetric amplification, effected by incorporating a single loop primer in each assay, accelerated the time-to-result of the OSD-RT-LAMP assays. The resulting assays could detect 0.02 to 0.2 plaque-forming units (PFU) (5 to 50 PFU/ml) of MERS-CoV in infected cell culture supernatants within 30 to 50 min and did not cross-react with common human respiratory pathogens. PMID:25856093

  8. Reducing software security risk through an integrated approach research initiative model based verification of the Secure Socket Layer (SSL) Protocol

    NASA Technical Reports Server (NTRS)

    Powell, John D.

    2003-01-01

    This document discusses the verification of the Secure Socket Layer (SSL) communication protocol as a demonstration of the Model Based Verification (MBV) portion of the verification instrument set being developed under the Reducing Software Security Risk (RSSR) Through an Integrated Approach research initiative. Code Q of the National Aeronautics and Space Administration (NASA) funds this project. The NASA Goddard Independent Verification and Validation (IV&V) facility manages this research program at the NASA agency level, and the Assurance Technology Program Office (ATPO) manages the research locally at the Jet Propulsion Laboratory (California Institute of Technology), where the research is being carried out.

  9. Integrating Fingerprint Verification into the Smart Card-Based Healthcare Information System

    NASA Astrophysics Data System (ADS)

    Moon, Daesung; Chung, Yongwha; Pan, Sung Bum; Park, Jin-Won

    2009-12-01

    As VLSI technology has improved, smart cards employing 32-bit processors have been released, and more personal information, such as medical and financial data, can be stored in the card. Thus, it becomes important to protect the personal information stored in the card. Verification of the card holder's identity using a fingerprint has advantages over the present practices of Personal Identification Numbers (PINs) and passwords. However, the computational workload of fingerprint verification is much heavier than that of the typical PIN-based solution. In this paper, we consider three strategies for implementing fingerprint verification in a smart card environment and for distributing the modules of fingerprint verification between the smart card and the card reader. We first evaluate the number of instructions in each step of a typical fingerprint verification algorithm, and estimate the execution time of several cryptographic algorithms needed to guarantee the security/privacy of the fingerprint data transmitted between the smart card and the client-server environment. Based on the evaluation results, we analyze each scenario with respect to the security level and the real-time execution requirements in order to implement fingerprint verification in the smart card within the client-server environment.

  10. Simulation environment based on the Universal Verification Methodology

    NASA Astrophysics Data System (ADS)

    Fiergolski, A.

    2017-01-01

    Universal Verification Methodology (UVM) is a standardized approach to verifying integrated circuit designs, targeting Coverage-Driven Verification (CDV). It combines automatic test generation, self-checking testbenches, and coverage metrics to indicate progress in the design verification. The CDV flow differs from the traditional directed-testing approach. With CDV, a testbench developer starts with a structured plan that sets the verification goals. Those goals are then targeted by the developed testbench, which generates legal stimuli and sends them to a device under test (DUT). Progress is measured by coverage monitors added to the simulation environment, so that non-exercised functionality can be identified. Moreover, additional scoreboards indicate undesired DUT behaviour. Such verification environments were developed for three recent ASIC and FPGA projects which have successfully implemented the new work-flow: (1) the CLICpix2 65 nm CMOS hybrid pixel readout ASIC design; (2) the C3PD 180 nm HV-CMOS active sensor ASIC design; (3) the FPGA-based DAQ system of the CLICpix chip. This paper, based on the experience from the above projects, introduces UVM briefly and presents a set of tips and advice applicable at different stages of the verification process cycle.
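
    The CDV loop described above can be illustrated independently of any hardware description language. The Python sketch below (UVM itself is a SystemVerilog library; the stand-in DUT, coverage bins, and seeded bug here are hypothetical) shows the three ingredients working together: constrained-random stimulus, a self-checking scoreboard, and coverage bins that expose non-exercised functionality.

      # Conceptual CDV loop: random stimulus + scoreboard + coverage bins.
      import random

      def dut_add(a, b):
          # Stand-in "DUT": an adder with a seeded bug at one corner case.
          return 15 if (a, b) == (7, 7) else a + b

      coverage = {f"a_bin_{i}": 0 for i in range(4)}   # 4 bins over input 'a'
      mismatches = set()

      for _ in range(1000):
          a, b = random.randint(0, 7), random.randint(0, 7)  # legal stimuli
          coverage[f"a_bin_{a // 2}"] += 1                   # coverage monitor
          if dut_add(a, b) != a + b:                         # scoreboard check
              mismatches.add((a, b))

      print("coverage:", coverage)               # empty bins = untested behaviour
      print("mismatches:", sorted(mismatches))   # expect the (7, 7) bug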

  11. The SeaHorn Verification Framework

    NASA Technical Reports Server (NTRS)

    Gurfinkel, Arie; Kahsai, Temesghen; Komuravelli, Anvesh; Navas, Jorge A.

    2015-01-01

    In this paper, we present SeaHorn, a software verification framework. The key distinguishing feature of SeaHorn is its modular design that separates the concerns of the syntax of the programming language, its operational semantics, and the verification semantics. SeaHorn encompasses several novelties: it (a) encodes verification conditions using an efficient yet precise inter-procedural technique, (b) provides flexibility in the verification semantics to allow different levels of precision, (c) leverages the state-of-the-art in software model checking and abstract interpretation for verification, and (d) uses Horn-clauses as an intermediate language to represent verification conditions which simplifies interfacing with multiple verification tools based on Horn-clauses. SeaHorn provides users with a powerful verification tool and researchers with an extensible and customizable framework for experimenting with new software verification techniques. The effectiveness and scalability of SeaHorn are demonstrated by an extensive experimental evaluation using benchmarks from SV-COMP 2015 and real avionics code.
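
    To make the Horn-clause representation concrete, consider a hypothetical program x := 0; while (x < N) x := x + 1; assert(x == N), with precondition N >= 0. Its verification conditions can be phrased as constrained Horn clauses over an unknown loop invariant Inv, which a Horn solver discharges by finding, for instance, Inv(x, N) = (0 <= x <= N). This is a generic illustration of the encoding, not output from SeaHorn itself:

      x = 0 \land N \ge 0 \;\rightarrow\; \mathit{Inv}(x, N)                 \quad \text{(initiation)}
      \mathit{Inv}(x, N) \land x < N \;\rightarrow\; \mathit{Inv}(x + 1, N)  \quad \text{(consecution)}
      \mathit{Inv}(x, N) \land x \ge N \;\rightarrow\; x = N                 \quad \text{(safety)}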

  12. Field test of short-notice random inspections for inventory-change verification at a low-enriched-uranium fuel-fabrication plant: Preliminary summary

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Fishbone, L.G.; Moussalli, G.; Naegele, G.

    1994-04-01

    An approach of short-notice random inspections (SNRIs) for inventory-change verification can enhance the effectiveness and efficiency of international safeguards at natural or low-enriched uranium (LEU) fuel fabrication plants. Under this approach, the plant operator declares the contents of nuclear material items before knowing whether an inspection will occur to verify them. Additionally, items about which declarations are newly made should remain available for verification for an agreed time. This report details a six-month field test of the feasibility of such SNRIs, which took place at the Westinghouse Electric Corporation Commercial Nuclear Fuel Division. Westinghouse personnel made daily declarations about both feed and product items, uranium hexafluoride cylinders and finished fuel assemblies, using a custom-designed computer "mailbox". Safeguards inspectors from the IAEA conducted eight SNRIs to verify these declarations. Items from both strata were verified during the SNRIs by means of nondestructive assay equipment. The field test demonstrated the feasibility and practicality of key elements of the SNRI approach for a large LEU fuel fabrication plant.

  13. 40 CFR 1065.550 - Gas analyzer range verification and drift verification.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... with a CLD and the removed water is corrected based on measured CO2, CO, THC, and NOX concentrations... concentration subcomponents (e.g., THC and CH4 for NMHC) separately. For example, for NMHC measurements, perform drift verification on NMHC; do not verify THC and CH4 separately. (2) Drift verification requires two...

  14. Verification of Autonomous Systems for Space Applications

    NASA Technical Reports Server (NTRS)

    Brat, G.; Denney, E.; Giannakopoulou, D.; Frank, J.; Jonsson, A.

    2006-01-01

    Autonomous software, especially if it is model-based, can play an important role in future space applications. For example, it can help streamline ground operations, assist in autonomous rendezvous and docking operations, or even help recover from problems (e.g., planners can be used to explore the space of recovery actions for a power subsystem and implement a solution without, or with minimal, human intervention). In general, the exploration capabilities of model-based systems give them great flexibility. Unfortunately, this also makes them unpredictable to our human eyes, both in terms of their execution and their verification. Traditional verification techniques are inadequate for these systems since they are mostly based on testing, which implies a very limited exploration of their behavioral space. In our work, we explore how advanced V&V techniques, such as static analysis, model checking, and compositional verification, can be used to gain trust in model-based systems. We also describe how synthesis can be used in the context of system reconfiguration and in the context of verification.

  15. Using SysML for verification and validation planning on the Large Synoptic Survey Telescope (LSST)

    NASA Astrophysics Data System (ADS)

    Selvy, Brian M.; Claver, Charles; Angeli, George

    2014-08-01

    This paper provides an overview of the tool, language, and methodology used for Verification and Validation Planning on the Large Synoptic Survey Telescope (LSST) Project. LSST has implemented a Model Based Systems Engineering (MBSE) approach as a means of defining all systems engineering planning and definition activities that have historically been captured in paper documents. Specifically, LSST has adopted the Systems Modeling Language (SysML) standard and is utilizing a software tool called Enterprise Architect, developed by Sparx Systems. Much of the historical use of SysML has focused on the early phases of the project life cycle. Our approach is to extend the advantages of MBSE into later stages of the construction project. This paper details the methodology employed to use the tool to document the verification planning phases, including the extension of the language to accommodate the project's needs. The process includes defining the Verification Plan for each requirement, which in turn consists of a Verification Requirement, Success Criteria, Verification Method(s), Verification Level, and Verification Owner. Each Verification Method for each Requirement is defined as a Verification Activity and mapped into Verification Events, which are collections of activities that can be executed concurrently in an efficient and complementary way. Verification Event dependencies and sequences are modeled using Activity Diagrams. The methodology employed also ties in to the Project Management Control System (PMCS), which utilizes Primavera P6 software, mapping each Verification Activity as a step in a planned activity. This approach leads to full traceability from initial Requirement to scheduled, costed, and resource-loaded PMCS task-based activities, ensuring all requirements will be verified.

  16. Glove-based approach to online signature verification.

    PubMed

    Kamel, Nidal S; Sayeed, Shohel; Ellis, Grant A

    2008-06-01

    Utilizing the multiple degrees of freedom offered by the data glove for each finger and the hand, a novel online signature verification system using the Singular Value Decomposition (SVD) numerical tool for signature classification and verification is presented. The proposed technique is based on using the SVD to find the r singular vectors sensing the maximal energy of the glove data matrix A, called the principal subspace, so that the effective dimensionality of A can be reduced. Having modeled the data glove signature through its r-principal subspace, signature authentication is performed by finding the angles between the different subspaces. A demonstration of the data glove as an effective high-bandwidth data entry device for signature verification is presented. This SVD-based signature verification technique is tested, and its performance is shown to be able to recognize forged signatures with a false acceptance rate of less than 1.2%.
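
    The subspace comparison at the heart of this method is straightforward to sketch with NumPy. In the fragment below, the channel count, sample length, noise levels, and r = 3 are hypothetical stand-ins; it shows only the general SVD/principal-angle idea, not the authors' tuned system.

      # Model a signature by the r leading left singular vectors of its
      # glove-data matrix; compare signatures via principal angles.
      import numpy as np

      def principal_subspace(A, r):
          U, _, _ = np.linalg.svd(A, full_matrices=False)
          return U[:, :r]                  # r directions of maximal energy

      def largest_principal_angle(U1, U2):
          s = np.clip(np.linalg.svd(U1.T @ U2, compute_uv=False), -1.0, 1.0)
          return float(np.arccos(s).max())     # radians; small = similar

      rng = np.random.default_rng(0)
      ref = rng.standard_normal((22, 500))     # 22 glove channels x 500 samples
      genuine = ref + 0.05 * rng.standard_normal(ref.shape)
      forgery = rng.standard_normal(ref.shape)

      U_ref = principal_subspace(ref, 3)
      print("genuine:", largest_principal_angle(U_ref, principal_subspace(genuine, 3)))
      print("forgery:", largest_principal_angle(U_ref, principal_subspace(forgery, 3)))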

  17. Genetic Mimetics of Mycobacterium tuberculosis and Methicillin-Resistant Staphylococcus aureus as Verification Standards for Molecular Diagnostics.

    PubMed

    Machowski, Edith Erika; Kana, Bavesh Davandra

    2017-12-01

    Molecular diagnostics have revolutionized the management of health care through enhanced detection of disease or infection and effective enrollment into treatment. In recognition of this, the World Health Organization approved the rollout of nucleic acid amplification technologies for identification of Mycobacterium tuberculosis using platforms such as GeneXpert MTB/RIF, the GenoType MTBDRplus line probe assay, and, more recently, GeneXpert MTB/RIF Ultra. These assays can simultaneously detect tuberculosis infection and assess rifampin resistance. However, their widespread use in health systems requires verification and quality assurance programs. To enable development of these, we report the construction of genetically modified strains of Mycobacterium smegmatis that mimic the profile of Mycobacterium tuberculosis on both the GeneXpert MTB/RIF and the MTBDRplus diagnostic tests. Using site-specific gene editing, we also created derivatives that faithfully mimic the diagnostic result of rifampin-resistant M. tuberculosis, with mutations at positions 513, 516, 526, 531, and 533 in the rifampin resistance-determining region of the rpoB gene. Next, we extended this approach to other diseases and demonstrated that a Staphylococcus aureus gene sequence can be introduced into M. smegmatis to generate a positive response for the SCCmec probe in the GeneXpert SA Nasal Complete molecular diagnostic cartridge, designed for identification of methicillin-resistant S. aureus. These biomimetic strains are cost-effective, have low biohazard content, accurately mimic drug resistance, and can be produced with relative ease, thus illustrating their potential for widespread use as verification standards for diagnosis of a variety of diseases. Copyright © 2017 American Society for Microbiology.

  18. Blood collection tubes as medical devices: The potential to affect assays and proposed verification and validation processes for the clinical laboratory.

    PubMed

    Bowen, Raffick A R; Adcock, Dorothy M

    2016-12-01

    Blood collection tubes (BCTs) are an often under-recognized variable in the preanalytical phase of clinical laboratory testing. Unfortunately, even the best-designed and manufactured BCTs may not work well in all clinical settings. Clinical laboratories, in collaboration with healthcare providers, should carefully evaluate BCTs prior to putting them into clinical use to determine their limitations and ensure that patients are not placed at risk because of inaccuracies due to poor tube performance. Selection of the best BCTs can be achieved through comparing advertising materials, reviewing the literature, observing the device at a scientific meeting, receiving a demonstration, evaluating the device under simulated conditions, or testing the device with patient samples. Although many publications have discussed method validations, few detail how to perform experiments for tube verification and validation. This article highlights the most common and impactful variables related to BCTs and discusses the validation studies that a typical clinical laboratory should perform when selecting BCTs. We also present a brief review of how in vitro diagnostic devices, particularly BCTs, are regulated in the United States, the European Union, and Canada. The verification and validation of BCTs will help to avoid the economic and human costs associated with incorrect test results, including poor patient care, unnecessary testing, and delays in test results. We urge laboratorians, tube manufacturers, diagnostic companies, and other researchers to take all the necessary steps to protect against the adverse effects of BCT components and their additives on clinical assays. Copyright © 2016 The Canadian Society of Clinical Chemists. Published by Elsevier Inc. All rights reserved.

  19. Neutron coincidence counting based on time interval analysis with one- and two-dimensional Rossi-alpha distributions: an application for passive neutron waste assay

    NASA Astrophysics Data System (ADS)

    Bruggeman, M.; Baeten, P.; De Boeck, W.; Carchon, R.

    1996-02-01

    Neutron coincidence counting is commonly used for the non-destructive assay of plutonium-bearing waste and for safeguards verification measurements. A major drawback of conventional coincidence counting is that a valid calibration is needed to convert a neutron coincidence count rate to a 240Pu-equivalent mass (240Pu-eq). In waste assay, calibrations are made for representative waste matrices and source distributions; the actual waste, however, may have quite different matrices and source distributions than the calibration samples, which often biases the assay result. This paper presents a new neutron multiplicity-sensitive coincidence counting technique that includes an auto-calibration of the neutron detection efficiency. The coincidence counting principle is based on the recording of one- and two-dimensional Rossi-alpha distributions triggered, respectively, by pulse pairs and by pulse triplets. Rossi-alpha distributions allow an easy discrimination between real and accidental coincidences and are intended to be measured by a PC-based fast time interval analyser. The Rossi-alpha distributions can be easily expressed in terms of a limited number of factorial moments of the neutron multiplicity distributions. The presented technique allows an unbiased measurement of the 240Pu-eq mass. The presented theory, referred to as Time Interval Analysis (TIA), is complementary to the Time Correlation Analysis (TCA) theories developed in the past, but it is much simpler from the theoretical point of view and allows a straightforward calculation of deadtime corrections and error propagation. Analytical expressions are derived for the Rossi-alpha distributions as functions of the factorial moments of the efficiency-dependent multiplicity distributions. The validity of the proposed theory is demonstrated and verified via Monte Carlo simulations of pulse trains and the subsequent analysis of the simulated data.
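
    For reference, the one-dimensional Rossi-alpha distribution on which such pulse-pair analysis builds is conventionally written in its textbook form (the paper's exact parametrization, in terms of factorial moments, may differ):

      R(\tau) \;=\; C_{\mathrm{acc}} \;+\; C_{\mathrm{corr}}\, e^{-\tau/\tau_d}

    where R(τ)dτ is the expected rate of a stop pulse in the interval [τ, τ + dτ] after a trigger pulse, the constant term C_acc is the accidental-coincidence level, and the exponential term, governed by the detector die-away time τ_d, carries the real coincidences from correlated fission-chain neutrons. Fitting the two components is what separates reals from accidentals without conventional coincidence-gate electronics.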

  20. ENVIRONMENTAL TECHNOLOGY VERIFICATION: JOINT (NSDF-EPA) VERIFICATION STATEMENT AND REPORT FOR THE REDUCTION OF NITROGEN IN DOMESTIC WASTEWATER FROM INDIVIDUAL HOMES, SEPTITECH, INC. MODEL 400 SYSTEM - 02/04/WQPC-SWP

    EPA Science Inventory

    Verification testing of the SeptiTech Model 400 System was conducted over a twelve month period at the Massachusetts Alternative Septic System Test Center (MASSTC) located at the Otis Air National Guard Base in Bourne, MA. Sanitary Sewerage from the base residential housing was u...

  1. IMPROVING AIR QUALITY THROUGH ENVIRONMENTAL TECHNOLOGY VERIFICATIONS

    EPA Science Inventory

    The U.S. Environmental Protection Agency (EPA) began the Environmental Technology Verification (ETV) Program in 1995 as a means of working with the private sector to establish a market-based verification process available to all environmental technologies. Under EPA's Office of R...

  2. Gender Verification of Female Olympic Athletes.

    ERIC Educational Resources Information Center

    Dickinson, Barry D.; Genel, Myron; Robinowitz, Carolyn B.; Turner, Patricia L.; Woods, Gary L.

    2002-01-01

    Gender verification of female athletes has long been criticized by geneticists, endocrinologists, and others in the medical community. Recently, the International Olympic Committee's Athletic Commission called for discontinuation of mandatory laboratory-based gender verification of female athletes. This article discusses normal sexual…

  3. Assessment of test methods for evaluating effectiveness of cleaning flexible endoscopes.

    PubMed

    Washburn, Rebecca E; Pietsch, Jennifer J

    2018-06-01

    Strict adherence to each step of reprocessing is imperative to removing potentially infectious agents. Multiple methods for verifying proper reprocessing exist; however, each presents challenges and limitations, and best practice within the industry has not been established. Our goal was to evaluate endoscope cleaning verification tests, with particular interest in the evaluation of the manual cleaning step. The results of the cleaning verification tests were compared with microbial culturing to see whether a positive cleaning verification test would be predictive of microbial growth. This study was conducted at 2 high-volume endoscopy units within a multisite health care system. Each of the 90 endoscopes was tested for adenosine triphosphate, protein, microbial growth via agar plate, and rapid gram-negative culture via assay. The endoscopes were tested in 3 locations: the instrument channel, control knob, and elevator mechanism. This analysis showed a substantial level of agreement between protein detection post-manual cleaning and protein detection post-high-level disinfection at the control head for scopes sampled sequentially. This study suggests that if protein is detected post-manual cleaning, there is a significant likelihood that protein will also be detected post-high-level disinfection. It also suggests that a cleaning verification test is not predictive of microbial growth. Copyright © 2018 Association for Professionals in Infection Control and Epidemiology, Inc. Published by Elsevier Inc. All rights reserved.

  4. An Approach to Biometric Verification Based on Human Body Communication in Wearable Devices

    PubMed Central

    Li, Jingzhen; Liu, Yuhang; Nie, Zedong; Qin, Wenjian; Pang, Zengyao; Wang, Lei

    2017-01-01

    In this paper, an approach to biometric verification based on human body communication (HBC) is presented for wearable devices. For this purpose, the transmission gain S21 of each volunteer's forearm is measured with a vector network analyzer (VNA). Specifically, in order to determine the frequency chosen for biometric verification, 1800 groups of data are acquired from 10 volunteers in the frequency range 0.3 MHz to 1500 MHz, each group comprising 1601 sample data points. In addition, to achieve rapid verification, 30 groups of data for each volunteer are acquired at the chosen frequency, each group containing only 21 sample data points. Furthermore, a threshold-adaptive template matching (TATM) algorithm based on weighted Euclidean distance is proposed for rapid verification in this work. The results indicate that the frequency chosen for biometric verification is from 650 MHz to 750 MHz. The false acceptance rate (FAR) and false rejection rate (FRR) based on TATM are approximately 5.79% and 6.74%, respectively. In contrast, the FAR and FRR were 4.17% and 37.5%, 3.37% and 33.33%, and 3.80% and 34.17% using K-nearest neighbor (KNN) classification, support vector machines (SVM), and the naive Bayesian method (NBM), respectively. In addition, the running time of TATM is 0.019 s, whereas the running times of KNN, SVM and NBM are 0.310 s, 0.0385 s, and 0.168 s, respectively. Therefore, TATM is suggested to be appropriate for rapid verification use in wearable devices. PMID:28075375

  5. An Approach to Biometric Verification Based on Human Body Communication in Wearable Devices.

    PubMed

    Li, Jingzhen; Liu, Yuhang; Nie, Zedong; Qin, Wenjian; Pang, Zengyao; Wang, Lei

    2017-01-10

    In this paper, an approach to biometric verification based on human body communication (HBC) is presented for wearable devices. For this purpose, the transmission gain S21 of each volunteer's forearm is measured with a vector network analyzer (VNA). Specifically, in order to determine the frequency chosen for biometric verification, 1800 groups of data are acquired from 10 volunteers in the frequency range 0.3 MHz to 1500 MHz, each group comprising 1601 sample data points. In addition, to achieve rapid verification, 30 groups of data for each volunteer are acquired at the chosen frequency, each group containing only 21 sample data points. Furthermore, a threshold-adaptive template matching (TATM) algorithm based on weighted Euclidean distance is proposed for rapid verification in this work. The results indicate that the frequency chosen for biometric verification is from 650 MHz to 750 MHz. The false acceptance rate (FAR) and false rejection rate (FRR) based on TATM are approximately 5.79% and 6.74%, respectively. In contrast, the FAR and FRR were 4.17% and 37.5%, 3.37% and 33.33%, and 3.80% and 34.17% using K-nearest neighbor (KNN) classification, support vector machines (SVM), and the naive Bayesian method (NBM), respectively. In addition, the running time of TATM is 0.019 s, whereas the running times of KNN, SVM and NBM are 0.310 s, 0.0385 s, and 0.168 s, respectively. Therefore, TATM is suggested to be appropriate for rapid verification use in wearable devices.
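
    The TATM idea, a weighted Euclidean distance to an enrollment template with a threshold adapted to the user's own enrollment spread, can be sketched as follows. All data shapes, weights, and the 3-sigma threshold rule below are hypothetical illustrations, not the published algorithm's exact parameters.

      # Template matching with weighted Euclidean distance and an adaptive
      # per-user threshold (hypothetical data standing in for S21 traces).
      import numpy as np

      def weighted_euclidean(x, template, w):
          return float(np.sqrt(np.sum(w * (x - template) ** 2)))

      rng = np.random.default_rng(1)
      enroll = rng.normal(-40.0, 0.5, size=(30, 21))  # 30 traces x 21 samples
      template = enroll.mean(axis=0)
      w = 1.0 / (enroll.var(axis=0) + 1e-9)           # weight stable samples more

      # Adapt the acceptance threshold to this user's enrollment variability.
      dists = [weighted_euclidean(t, template, w) for t in enroll]
      threshold = np.mean(dists) + 3.0 * np.std(dists)

      probe = rng.normal(-40.0, 0.5, size=21)         # a genuine-like probe
      print("accept:", weighted_euclidean(probe, template, w) <= threshold)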

  6. Development of a sequential workflow based on LC-PRM for the verification of endometrial cancer protein biomarkers in uterine aspirate samples.

    PubMed

    Martinez-Garcia, Elena; Lesur, Antoine; Devis, Laura; Campos, Alexandre; Cabrera, Silvia; van Oostrum, Jan; Matias-Guiu, Xavier; Gil-Moreno, Antonio; Reventos, Jaume; Colas, Eva; Domon, Bruno

    2016-08-16

    About 30% of endometrial cancer (EC) patients are diagnosed at an advanced stage of the disease, which is associated with a drastic decrease in the 5-year survival rate. The identification of biomarkers in uterine aspirate samples, which are collected by a minimally invasive procedure, would improve early diagnosis of EC. We present a sequential workflow for selecting, from a list of potential EC biomarkers, those most promising to enter a validation study. After the elimination of confounding contributions by residual blood proteins, 52 potential biomarkers were analyzed in uterine aspirates from 20 EC patients and 18 non-EC controls by a high-resolution accurate-mass spectrometer operated in parallel reaction monitoring mode. Differential abundance was observed for 26 biomarkers, and among them ten proteins showed high sensitivity and specificity (AUC > 0.9). The study demonstrates that uterine aspirates are valuable samples for EC protein biomarker screening. It also illustrates the importance of a biomarker verification phase to fill the gap between discovery and validation studies, and highlights the benefits of high-resolution mass spectrometry for this purpose. The proteins verified in this study have an increased likelihood of becoming a clinical assay after a subsequent validation phase.

  7. ENVIRONMENTAL TECHNOLOGY VERIFICATION: JOINT (NSF-EPA) VERIFICATION STATEMENT AND REPORT FOR THE REDUCTION OF NITROGEN IN DOMESTIC WASTEWATER FROM INDIVIDUAL HOMES, F. R. MAHONEY & ASSOC., AMPHIDROME SYSTEM FOR SINGLE FAMILY HOMES - 02/05/WQPC-SWP

    EPA Science Inventory

    Verification testing of the F.R. Mahoney Amphidrome System was conducted over a twelve month period at the Massachusetts Alternative Septic System Test Center (MASSTC) located at the Otis Air National Guard Base in Bourne, MA. Sanitary Sewerage from the base residential housing w...

  8. Performance evaluation of a chemiluminescence microparticle immunoassay for CK-MB.

    PubMed

    Lin, Zhi-Yuan; Fang, Yi-Zhen; Jin, Hong-Wei; Lin, Hua-Yue; Dai, Zhang; Luo, Qing; Li, Hong-Wei; Lin, Yan-Ling; Huang, Shui-Zhen; Gao, Lei; Xu, Fei-Hai; Zhang, Zhong-Ying

    2018-03-31

    To verify and evaluate the performance characteristics of a creatine kinase isoenzyme MB (CK-MB) assay kit produced by Xiamen Innodx Biotech Co. Ltd. The evaluation was carried out according to the "Guidelines for principles of analytical performance evaluation of in vitro diagnostic reagents." The performance parameters included detection limit, linearity range, reportable range, recovery, precision, interference, cross-reactivity, matrix effect, and method comparison. The detection limit was 0.1 ng/mL. The assay was linear over the range 0.1 ng/mL-500 ng/mL, and the reportable range was 0.1 ng/mL to 1000 ng/mL. The average recovery was 99.66%. The within-run and between-run coefficients of variation (CV) for the low CK-MB sample were 5.55% and 6.16%, respectively; for the high-level sample, they were 7.88% and 7.80%. At the medical decision level, the relative deviation (bias) in all interference tests was lower than 15%: with mild hemolysis, hemoglobin ≤15 g/L, triglycerides ≤17 mmol/L, bilirubin ≤427.5 μmol/L, or rheumatoid factor ≤206 U/mL, no significant interference was found. Moreover, the assay kit showed no cross-reaction with CK-MM or CK-BB. Finally, the total diagnostic accuracy of the kit was 93.24% when compared with the reference kit. Overall, the results of the verification study indicate that the performance of the kit meets the requirements of the clinical test. © 2018 Wiley Periodicals, Inc.
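
    The two headline statistics, percent recovery and coefficient of variation, are simple to reproduce. The replicate values below are hypothetical and serve only to show the arithmetic behind the kind of figures reported above.

      # Percent recovery and CV from within-run replicates (hypothetical data).
      import statistics as st

      spiked_target = 100.0                        # ng/mL added to the sample
      measured = [99.2, 100.5, 98.8, 101.1, 98.7]  # within-run replicates

      recovery_pct = 100.0 * st.mean(measured) / spiked_target
      cv_pct = 100.0 * st.stdev(measured) / st.mean(measured)
      print(f"recovery = {recovery_pct:.2f}%, within-run CV = {cv_pct:.2f}%")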

  9. Simulation verification techniques study

    NASA Technical Reports Server (NTRS)

    Schoonmaker, P. B.; Wenglinski, T. H.

    1975-01-01

    Results are summarized of the simulation verification techniques study which consisted of two tasks: to develop techniques for simulator hardware checkout and to develop techniques for simulation performance verification (validation). The hardware verification task involved definition of simulation hardware (hardware units and integrated simulator configurations), survey of current hardware self-test techniques, and definition of hardware and software techniques for checkout of simulator subsystems. The performance verification task included definition of simulation performance parameters (and critical performance parameters), definition of methods for establishing standards of performance (sources of reference data or validation), and definition of methods for validating performance. Both major tasks included definition of verification software and assessment of verification data base impact. An annotated bibliography of all documents generated during this study is provided.

  10. Deductive Verification of Cryptographic Software

    NASA Technical Reports Server (NTRS)

    Almeida, Jose Barcelar; Barbosa, Manuel; Pinto, Jorge Sousa; Vieira, Barbara

    2009-01-01

    We report on the application of an off-the-shelf verification platform to the RC4 stream cipher cryptographic software implementation (as available in the openSSL library), and introduce a deductive verification technique based on self-composition for proving the absence of error propagation.
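
    Self-composition reduces a relational ("2-safety") property, one that talks about two runs of a program, to an ordinary assertion over a composed program that runs two copies side by side. The toy Python check below illustrates the shape of the idea on a hypothetical XOR stream cipher and a simple error-propagation property; it is a bounded test of the composed program, not the deductive proof reported above.

      # Toy self-composition: relate two runs of the same program.
      # Property: flipping one keystream byte corrupts only that position.
      import random

      def xor_stream(plain: bytes, keystream: bytes) -> bytes:
          return bytes(p ^ k for p, k in zip(plain, keystream))

      def composed_check(plain, ks1, ks2):
          out1, out2 = xor_stream(plain, ks1), xor_stream(plain, ks2)
          diff_in = [i for i in range(len(plain)) if ks1[i] != ks2[i]]
          diff_out = [i for i in range(len(plain)) if out1[i] != out2[i]]
          return diff_in == diff_out       # differences must not propagate

      random.seed(0)
      for _ in range(1000):                # bounded exploration of inputs
          plain = bytes(random.randrange(256) for _ in range(8))
          ks1 = bytes(random.randrange(256) for _ in range(8))
          ks2 = bytearray(ks1); ks2[3] ^= 0xFF   # perturb one keystream byte
          assert composed_check(plain, ks1, bytes(ks2))
      print("2-safety property held on all sampled runs")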

  11. Fabrication of 12% 240Pu calorimetry standards

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Long, S.M.; Hildner, S.; Gutierrez, D.

    1995-08-01

    Throughout the DOE complex, laboratories are performing calorimetric assays on items containing high-burnup plutonium. These materials contain a higher isotopic range and higher wattages than materials previously encountered in vault holdings. Currently, measurement control standards have been limited to 6% 240Pu standards. The lower isotopic and wattage value standards do not complement the measurement of the higher-burnup material. Participants of the Calorimetry Exchange (CALEX) Program have identified the need for new calorimetric assay standards with a higher wattage and isotopic range. This paper describes the fabrication and verification measurements of the new CALEX standard containing 12% 240Pu oxide with a wattage of about 6 to 8 watts.

  12. Inter-laboratory verification of European pharmacopoeia monograph on derivative spectrophotometry method and its application for chitosan hydrochloride.

    PubMed

    Marković, Bojan; Ignjatović, Janko; Vujadinović, Mirjana; Savić, Vedrana; Vladimirov, Sote; Karljiković-Rajić, Katarina

    2015-01-01

    Inter-laboratory verification of the European Pharmacopoeia (EP) monograph on the derivative spectrophotometry (DS) method and its application to chitosan hydrochloride was carried out on two generations of instruments (the earlier GBC Cintra 20 and the current-technology TS Evolution 300). The instruments operate with different versions of the Savitzky-Golay algorithm and different modes of generating digital derivative spectra. For the resolution power parameter, defined as the amplitude ratio A/B in the DS method EP monograph, comparable results were obtained only with the algorithm parameters of 7 smoothing points (SP) and the 2nd-degree polynomial; these also provided corresponding data with the other two modes on the TS Evolution 300 (Medium digital indirect and Medium digital direct). Using the quoted algorithm parameters, the differences in percentages between the amplitude ratio A/B averages were within the accepted criteria (±3%) for assay of a drug product for method transfer. The deviation of 1.76% for the degree-of-deacetylation assessment of chitosan hydrochloride determined on the two instruments (amplitude (1)D202; the 2nd-degree polynomial and SP 9 in the Savitzky-Golay algorithm) was acceptable, since it was within the allowed criteria (±2%) for assay deviation of a drug substance for method transfer in pharmaceutical analyses. Copyright © 2015 Elsevier B.V. All rights reserved.
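
    For readers who want to reproduce the derivative settings discussed above, the fragment below applies a Savitzky-Golay first derivative with a 2nd-degree polynomial and 7 smoothing points to a synthetic spectrum using SciPy. The wavelength grid and band shape are invented for illustration; the amplitude-ratio (A/B) evaluation itself would be built from the derivative extrema.

      # Savitzky-Golay first derivative: 2nd-degree polynomial, 7 points.
      import numpy as np
      from scipy.signal import savgol_filter

      wl = np.arange(190.0, 230.0, 0.5)              # wavelength grid, nm
      spectrum = np.exp(-((wl - 202.0) ** 2) / 18.0) # synthetic band ~202 nm

      d1 = savgol_filter(spectrum, window_length=7, polyorder=2,
                         deriv=1, delta=0.5)         # dA/d(lambda)

      # The EP resolution-power check compares derivative amplitudes; the
      # extrema below are the raw ingredients of such an A/B ratio.
      print("max amplitude:", d1.max(), "min amplitude:", d1.min())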

  13. Authentication of Botanical Origin in Herbal Teas by Plastid Noncoding DNA Length Polymorphisms.

    PubMed

    Uncu, Ali Tevfik; Uncu, Ayse Ozgur; Frary, Anne; Doganlar, Sami

    2015-07-01

    The aim of this study was to develop a DNA barcode assay to authenticate the botanical origin of herbal teas. To reach this aim, we tested the efficiency of a PCR-capillary electrophoresis (PCR-CE) approach on commercial herbal tea samples using two noncoding plastid barcodes, the trnL intron and the intergenic spacer between trnL and trnF. Barcode DNA length polymorphisms proved successful in authenticating the species origin of herbal teas. We verified the validity of our approach by sequencing species-specific barcode amplicons from herbal tea samples. Moreover, we displayed the utility of PCR-CE assays coupled with sequencing to identify the origin of undeclared plant material in herbal tea samples. The PCR-CE assays proposed in this work can be applied as routine tests for the verification of botanical origin in herbal teas and can be extended to authenticate all types of herbal foodstuffs.

  14. Authoring and verification of clinical guidelines: a model driven approach.

    PubMed

    Pérez, Beatriz; Porres, Ivan

    2010-08-01

    The goal of this research is to provide a framework enabling the authoring and verification of clinical guidelines. The framework is part of a larger research project aimed at improving the representation, quality, and application of clinical guidelines in daily clinical practice. The verification process of a guideline is based on (1) model-checking techniques to verify guidelines against semantic errors and inconsistencies in their definition, combined with (2) Model Driven Development (MDD) techniques, which enable us to automatically process manually created guideline specifications together with the temporal-logic statements to be checked against them, making the verification process faster and more cost-effective. In particular, we use UML statecharts to represent the dynamics of guidelines and, based on these manually defined guideline specifications, we use an MDD-based tool chain to automatically generate the input model of a model checker. The model checker takes the resulting model together with the specific guideline requirements and verifies whether the guideline fulfils those properties. The overall framework has been implemented as an Eclipse plug-in named GBDSSGenerator which, starting from the UML statechart representing a guideline, allows the guideline to be verified against specific requirements. Additionally, we have established a pattern-based approach for defining commonly occurring types of requirements in guidelines. We have successfully validated our overall approach by verifying properties in different clinical guidelines, resulting in the detection of some inconsistencies in their definition. The proposed framework thus allows (1) the authoring and (2) the verification of clinical guidelines against specific requirements defined from a set of property specification patterns, enabling non-experts to easily write formal specifications and thus easing the verification process. Copyright 2010 Elsevier Inc. All rights reserved.
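
    As an example of such a property specification pattern, a "response" requirement, e.g., every prescribed drug must eventually be administered, can be written in CTL as

      \mathbf{AG}\,(\mathit{prescribed} \;\rightarrow\; \mathbf{AF}\, \mathit{administered})

    read as: on all paths, globally, whenever prescribed holds, administered eventually holds on every continuation. The proposition names here are hypothetical, chosen only to illustrate the pattern; a model checker evaluates such formulas over the statechart-derived model and returns a counterexample trace when one fails.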

  15. A study of applications scribe frame data verifications using design rule check

    NASA Astrophysics Data System (ADS)

    Saito, Shoko; Miyazaki, Masaru; Sakurai, Mitsuo; Itoh, Takahisa; Doi, Kazumasa; Sakurai, Norioko; Okada, Tomoyuki

    2013-06-01

    In semiconductor manufacturing, scribe frame data is generally generated for each LSI product according to its specific process design. Scribe frame data is designed based on definition tables for scanner alignment, wafer inspection, and customer-specified marks, and at the end we check that the scribe frame design conforms to the specifications of the alignment and inspection marks. Recently, in the COT (customer owned tooling) business and in new technology development, there has been no effective verification method for scribe frame data, and verification takes a great deal of time. Therefore, we tried to establish a new verification method for scribe frame data by applying pattern matching and DRC (Design Rule Check), which is used in device verification. We show the scheme of the scribe frame data verification using DRC that we applied. First, verification rules are created based on the specifications of the scanner, inspection, and other requirements, and a mark library is also created for pattern matching. Next, DRC verification, which includes pattern matching against the mark library, is performed on the scribe frame data. As a result, our experiments demonstrated that, by use of pattern matching and DRC verification, our new method can yield speed improvements of more than 12 percent compared to conventional mark checks by visual inspection, and the inspection time can be reduced to less than 5 percent if multi-CPU processing is used. Our method delivers both short processing time and excellent accuracy when checking many marks. It is easy to maintain and provides an easy way for COT customers to use original marks. We believe that our new DRC verification method for scribe frame data is indispensable and mutually beneficial.

  16. Security Verification Techniques Applied to PatchLink COTS Software

    NASA Technical Reports Server (NTRS)

    Gilliam, David P.; Powell, John D.; Bishop, Matt; Andrew, Chris; Jog, Sameer

    2006-01-01

    Verification of the security of software artifacts is a challenging task. An integrated approach that combines verification techniques can increase the confidence in the security of software artifacts. Such an approach has been developed by the Jet Propulsion Laboratory (JPL) and the University of California at Davis (UC Davis). Two security verification instruments were developed and then piloted on PatchLink's UNIX Agent, a Commercial-Off-The-Shelf (COTS) software product, to assess the value of the instruments and the approach. The two instruments are the Flexible Modeling Framework (FMF), a model-based verification instrument (JPL), and a Property-Based Tester (UC Davis). Security properties were formally specified for the COTS artifact and then verified using these instruments. The results were then reviewed to determine the effectiveness of the approach and the security of the COTS product.

  17. Property-driven functional verification technique for high-speed vision system-on-chip processor

    NASA Astrophysics Data System (ADS)

    Nshunguyimfura, Victor; Yang, Jie; Liu, Liyuan; Wu, Nanjian

    2017-04-01

    Implementing functional verification in a fast, reliable, and effective manner is a challenging task in a vision chip verification process, mainly because existing functional verification techniques proceed stepwise. The complexity of vision chip verification is also related to the fact that, in most vision chip design cycles, extensive effort is focused on optimizing chip metrics such as performance, power, and area, while design functional verification is not explicitly considered at the earlier stages, at which the soundest decisions are made. In this paper, we propose a semi-automatic property-driven verification technique in which the implementation of all verification components is based on design properties. We introduce a low-dimensional property space between the specification space and the implementation space. The aim of this technique is to speed up the verification process for high-performance parallel-processing vision chips. Our experimental results show that the proposed technique can improve the verification effort by up to 20% for a complex vision chip design while reducing the simulation and debugging overheads.

  18. RTL validation methodology on high complexity wireless microcontroller using OVM technique for fast time to market

    NASA Astrophysics Data System (ADS)

    Zhafirah Muhammad, Nurul; Harun, A.; Hambali, N. A. M. A.; Murad, S. A. Z.; Mohyar, S. N.; Isa, M. N.; Jambek, AB

    2017-11-01

    Increased demand for internet of things (IoT) applications has forced a move towards integrated circuits of higher complexity supporting SoCs. This increase in complexity calls for correspondingly sophisticated validation strategies, and has led researchers to devise a range of methodologies, including dynamic verification, formal verification, and hybrid techniques. Moreover, it is very important to discover bugs in the infancy of the SoC verification process in order to reduce the time consumed and achieve fast time to market. In this paper, we therefore focus on a verification methodology that can be applied at the Register Transfer Level of an SoC based on the AMBA bus design. The Open Verification Methodology (OVM) offers an easier route to RTL validation, not as a replacement for the traditional method but as a means of achieving fast time to market. Thus, OVM is proposed in this paper as the verification method for larger designs, averting bottlenecks in the validation platform.

  19. Instant screening and verification of carbapenemase activity in Bacteroides fragilis in positive blood culture, using matrix-assisted laser desorption ionization-time of flight mass spectrometry.

    PubMed

    Johansson, Åsa; Nagy, Elisabeth; Sóki, József

    2014-08-01

    Rapid identification of isolates in positive blood cultures is of great importance for securing correct treatment of septicaemic patients. As antimicrobial resistance is increasing, rapid detection of resistance is crucial. Carbapenem resistance in Bacteroides fragilis associated with the cfiA-encoded class B metallo-beta-lactamase is emerging. In our study, we spiked blood culture bottles with 26 B. fragilis strains of various cfiA status and ertapenem MICs. By using main spectra specific for cfiA-positive and cfiA-negative B. fragilis strains, isolates could be screened for resistance. To verify strains that were positive in the screening, a carbapenemase assay was performed in which the specific peaks of intact and hydrolysed ertapenem were analysed by matrix-assisted laser desorption ionization-time of flight mass spectrometry (MALDI-TOF MS). We show here that it is possible to correctly identify B. fragilis and to screen for enzymic carbapenem resistance directly from the pellet of positive blood cultures. The carbapenemase assay to verify the presence of the enzyme was successfully performed on the pellet from the direct identification despite the presence of blood components. The result of the procedure was achieved in 3 h. The Bruker mass spectrometric β-lactamase assay (MSBL assay) prototype software was also shown not only to be based on an algorithm that correlated with manual inspection of the spectra, but also to improve the interpretation by showing the variation in the dataset. © 2014 The Authors.

  20. Using Concept Space to Verify Hyponymy in Building a Hyponymy Lexicon

    NASA Astrophysics Data System (ADS)

    Liu, Lei; Zhang, Sen; Diao, Lu Hong; Yan, Shu Ying; Cao, Cun Gen

    Verification of hyponymy relations is a basic problem in knowledge acquisition. We present a method of hyponymy verification based on concept space. Firstly, we give the definition of the concept space for a group of candidate hyponymy relations. Secondly, we analyze the concept space and define a set of hyponymy features based on the space structure. We then use these features to verify candidate hyponymy relations. Experimental results show that the method provides adequate verification of hyponymy.

  1. Knowledge Based Systems (KBS) Verification, Validation, Evaluation, and Testing (VVE&T) Bibliography: Topical Categorization

    DTIC Science & Technology

    2003-03-01

    Different?," Jour. of Experimental & Theoretical Artificial Intelligence, Special Issue on Al for Systems Validation and Verification, 12(4), 2000, pp...Hamilton, D., " Experiences in Improving the State of Practice in Verification and Validation of Knowledge-Based Systems," Workshop Notes of the AAAI...Unsuspected Power of the Standard Turing Test," Jour. of Experimental & Theoretical Artificial Intelligence., 12, 2000, pp3 3 1-3 4 0 . [30] Gaschnig

  2. FORMED: Bringing Formal Methods to the Engineering Desktop

    DTIC Science & Technology

    2016-02-01

    integrates formal verification into software design and development by precisely defining semantics for a restricted subset of the Unified Modeling...input-output contract satisfaction and absence of null pointer dereferences. 15. SUBJECT TERMS Formal Methods, Software Verification, Model-Based...Domain specific languages (DSLs) drive both implementation and formal verification

  3. ENVIRONMENTAL TECHNOLOGY VERIFICATION REPORT; ULTRASONIC AQUEOUS CLEANING SYSTEMS, SMART SONIC CORPORATION, SMART SONIC

    EPA Science Inventory

    This report is a product of the U.S. EPA's Environmental Technology Verification (ETV) Program and is focused on the Smart Sonic Ultrasonic Aqueous Cleaning Systems. The verification is based on three main objectives. (1) The Smart Sonic Aqueous Cleaning Systems, Model 2000 and...

  4. A quantification of the effectiveness of EPID dosimetry and software-based plan verification systems in detecting incidents in radiotherapy

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bojechko, Casey; Phillips, Mark; Kalet, Alan

    Purpose: Complex treatments in radiation therapy require robust verification in order to prevent errors that can adversely affect the patient. For this purpose, the authors estimate the effectiveness of detecting errors with a “defense in depth” system composed of electronic portal imaging device (EPID) based dosimetry and a software-based system composed of rules-based and Bayesian network verifications. Methods: The authors analyzed incidents with a high potential severity score, scored as a 3 or 4 on a 4 point scale, recorded in an in-house voluntary incident reporting system, collected from February 2012 to August 2014. The incidents were categorized into different failure modes. The detectability, defined as the number of incidents that are detectable divided by the total number of incidents, was calculated for each failure mode. Results: In total, 343 incidents were used in this study. Of the incidents, 67% were related to photon external beam therapy (EBRT). The majority of the EBRT incidents were related to patient positioning, and only a small number of these could be detected by EPID dosimetry when performed prior to treatment (6%). A large fraction could be detected by in vivo dosimetry performed during the first fraction (74%). Rules-based and Bayesian network verifications were found to be complementary to EPID dosimetry, able to detect errors related to patient prescriptions and documentation, and errors unrelated to photon EBRT. Combining all of the verification steps together, 91% of all EBRT incidents could be detected. Conclusions: This study shows that the defense in depth system is potentially able to detect a large majority of incidents. The most effective EPID-based dosimetry verification is in vivo measurement during the first fraction, complemented by rules-based and Bayesian network plan checking.
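
    The detectability metric above is a simple ratio; a minimal sketch with invented counts (not the study's data) shows how per-mode and combined figures are computed.

```python
# Detectability as defined above: detectable incidents divided by the total,
# per failure mode. The counts here are illustrative, not the study's data.
incidents = {
    "patient positioning":        (74, 100),  # (detectable, total)
    "prescription/documentation": (18, 20),
}

for mode, (detectable, total) in incidents.items():
    print(f"{mode}: detectability = {detectable / total:.0%}")

detected = sum(d for d, _ in incidents.values())
total = sum(t for _, t in incidents.values())
print(f"combined: {detected / total:.0%}")
```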

  5. S14 as a Therapeutic Target in Breast Cancer

    DTIC Science & Technology

    2005-08-01

    dimethylthiazol-2-yl)-5-(3-carboxymethoxyphenyl)-2-(4-sulfophenyl)-2H-tetrazolium (MTS) assay (Promega). Oxidation of MTS by viable mitochondria yields a...potential mediators of the observed superinduction. The amplified signal could not be attributed to progestin induction of PPAR Gamma Coactivator-1β (PGC-1β...quantitation of siRNA transfection efficiency, verification of localization of the siRNA to the interior of the cells, using more than one siRNA, and the

  6. Method and computer product to increase accuracy of time-based software verification for sensor networks

    DOEpatents

    Foo Kune, Denis [Saint Paul, MN]; Mahadevan, Karthikeyan [Mountain View, CA]

    2011-01-25

    A recursive verification protocol that reduces the time variance due to delays in the network, by putting the subject node at most one hop from the verifier node, provides an efficient way to test wireless sensor nodes. Since the software signatures are time based, recursive testing will give a much cleaner signal for positive verification of the software running on any one node in the sensor network. In this protocol, the main verifier checks its neighbor, who in turn checks its neighbor, and the process continues until all nodes have been verified. This ensures minimum time delays for the software verification. Should a node fail the test, the software verification downstream is halted until an alternative path (one not including the failed node) is found. Utilizing techniques well known in the art, having a node tested twice, or not at all, can be avoided.
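
    A minimal sketch of the hop-by-hop idea described above, with a stand-in signature check and an invented topology; halting downstream of a failed node and never testing a node twice both fall out of the traversal.

```python
def check_signature(node):
    """Stand-in for the time-based software signature test of one node."""
    return node.get("signature_ok", True)

def verify_chain(network, start, verified=None):
    """Verify hop by hop: each verified node checks its own neighbors."""
    verified = set() if verified is None else verified
    if start in verified:                    # never test a node twice
        return verified
    if not check_signature(network[start]):
        return verified                      # halt downstream of a failed node
    verified.add(start)
    for neighbor in network[start]["neighbors"]:
        verify_chain(network, neighbor, verified)
    return verified

network = {
    "verifier": {"neighbors": ["a"]},
    "a": {"neighbors": ["b", "c"]},
    "b": {"neighbors": [], "signature_ok": False},   # this node fails the test
    "c": {"neighbors": []},
}
print(sorted(verify_chain(network, "verifier")))     # ['a', 'c', 'verifier']
```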

  7. Post-OPC verification using a full-chip pattern-based simulation verification method

    NASA Astrophysics Data System (ADS)

    Hung, Chi-Yuan; Wang, Ching-Heng; Ma, Cliff; Zhang, Gary

    2005-11-01

    In this paper, we evaluated and investigated techniques for performing fast full-chip post-OPC verification using a commercial product platform. A number of databases from several technology nodes, i.e., 0.13um, 0.11um and 90nm, are used in the investigation. Although our OPC technology has proven robust in most cases, the variety of tape-outs with complicated design styles and technologies makes it difficult to develop a "complete or bullet-proof" OPC algorithm that covers every possible layout pattern. In the evaluation, among dozens of databases, errors were found in some OPC databases by model-based post-OPC checking; such errors could be costly in manufacturing - reticle, wafer process, and more importantly production delay. From such full-chip OPC database verification, we have learned that optimizing OPC models and recipes on a limited set of test chip designs may not provide sufficient coverage across the range of designs to be produced in the process, and fatal errors (such as pinch or bridge), poor CD distribution, and process-sensitive patterns may still occur. As a result, more than one reticle tape-out cycle is not uncommon to prove models and recipes that approach the center of process for a range of designs. We therefore describe a full-chip pattern-based simulation verification flow that serves both OPC model and recipe development as well as post-OPC verification after production release of the OPC. Lastly, we discuss the differentiation of the new pattern-based and conventional edge-based verification tools and summarize the advantages of our new tool and methodology: (1) Accuracy: superior inspection algorithms, down to 1nm accuracy with the new pattern-based approach; (2) High speed performance: pattern-centric algorithms to give the best full-chip inspection efficiency; (3) Powerful analysis capability: flexible error distribution, grouping, interactive viewing, and hierarchical pattern extraction to narrow down to unique patterns/cells.

  8. Signature Verification Based on Handwritten Text Recognition

    NASA Astrophysics Data System (ADS)

    Viriri, Serestina; Tapamo, Jules-R.

    Signatures continue to be an important biometric trait because they remain widely used for authenticating the identity of human beings. This paper presents an efficient text-based directional signature recognition algorithm that verifies signatures even when they are composed of special unconstrained cursive characters which are superimposed and embellished. The algorithm extends the character-based signature verification technique. Experiments carried out on the GPDS signature database and an additional database created from signatures captured using the ePadInk tablet show that the approach is effective and efficient, with a positive verification rate of 94.95%.

  9. A framework of multitemplate ensemble for fingerprint verification

    NASA Astrophysics Data System (ADS)

    Yin, Yilong; Ning, Yanbin; Ren, Chunxiao; Liu, Li

    2012-12-01

    How to improve the performance of an automatic fingerprint verification system (AFVS) is a long-standing challenge in the biometric verification field. Recently, it has become popular to improve AFVS performance by using ensemble learning to fuse related information from fingerprints. In this article, we propose a novel framework for fingerprint verification based on the multitemplate ensemble method. The framework consists of three stages. In the first, enrollment, stage, we adopt an effective template selection method to choose the fingerprints that best represent a finger; a polyhedron is then created from the matching results of the multiple template fingerprints, and a virtual centroid of the polyhedron is computed. In the second, verification, stage, we measure the distance between the centroid of the polyhedron and a query image. In the final stage, a fusion rule is used to choose a proper distance from a distance set. The experimental results on the FVC2004 database demonstrate the improved effectiveness of the new framework in fingerprint verification. With a minutiae-based matching method, the average EER over the four FVC2004 databases drops from 10.85 to 0.88, and with a ridge-based matching method it decreases from 14.58 to 2.51.
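
    A toy sketch of the centroid-distance idea, assuming the pairwise match scores between enrolled templates span the "polyhedron" and a query is scored by its distance to the virtual centroid; all scores are invented.

```python
import numpy as np

def centroid(template_scores):
    """Virtual centroid of the polyhedron spanned by template match scores."""
    return np.mean(np.asarray(template_scores, dtype=float), axis=0)

def distance_to_centroid(query_scores, template_scores):
    return float(np.linalg.norm(np.asarray(query_scores) - centroid(template_scores)))

# template_scores[i][j] = match score between enrolled templates i and j
templates = [[1.0, 0.8, 0.7],
             [0.8, 1.0, 0.9],
             [0.7, 0.9, 1.0]]
genuine  = [0.85, 0.90, 0.80]   # query scores against each template
impostor = [0.20, 0.10, 0.15]

print(distance_to_centroid(genuine, templates))   # small -> accept
print(distance_to_centroid(impostor, templates))  # large -> reject
```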

  10. Use of metaknowledge in the verification of knowledge-based systems

    NASA Technical Reports Server (NTRS)

    Morell, Larry J.

    1989-01-01

    Knowledge-based systems are modeled as deductive systems. The model indicates that the two primary areas of concern in verification are demonstrating consistency and completeness. A system is inconsistent if it asserts something that is not true of the modeled domain. A system is incomplete if it lacks deductive capability. Two forms of consistency are discussed along with appropriate verification methods. Three forms of incompleteness are discussed. The use of metaknowledge, knowledge about knowledge, is explored in connection to each form of incompleteness.

  11. Hyperplex-MRM: a hybrid multiple reaction monitoring method using mTRAQ/iTRAQ labeling for multiplex absolute quantification of human colorectal cancer biomarker.

    PubMed

    Yin, Hong-Rui; Zhang, Lei; Xie, Li-Qi; Huang, Li-Yong; Xu, Ye; Cai, San-Jun; Yang, Peng-Yuan; Lu, Hao-Jie

    2013-09-06

    Novel biomarker verification assays are urgently required to improve the efficiency of biomarker development. Benefiting from lower development costs, multiple reaction monitoring (MRM) has been used for biomarker verification as an alternative to immunoassay. However, in general MRM analysis, only one sample can be quantified in a single experiment, which restricts its application. Here, a Hyperplex-MRM quantification approach, which combines mTRAQ for absolute quantification and iTRAQ for relative quantification, was developed to increase the throughput of biomarker verification. In this strategy, equal amounts of internal standard peptides were labeled with mTRAQ reagents Δ0 and Δ8, respectively, as double references, while 4-plex iTRAQ reagents were used to label four different samples as an alternative to mTRAQ Δ4. From the MRM trace and MS/MS spectrum, the total amounts and relative ratios of target proteins/peptides in the four samples could be acquired simultaneously. Accordingly, absolute amounts of target proteins/peptides in four different samples could be obtained in a single run. In addition, the double references were used to increase the reliability of the quantification results. Using this approach, three biomarker candidates, adenosylhomocysteinase (AHCY), cathepsin D (CTSD), and lysozyme C (LYZ), were successfully quantified in colorectal cancer (CRC) tissue specimens of different stages with high accuracy, sensitivity, and reproducibility. In summary, we demonstrated a promising quantification method for high-throughput verification of biomarker candidates.
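
    The arithmetic behind the approach can be sketched as follows, with invented numbers: the mTRAQ double references anchor the pooled absolute amount from the MRM trace, and the 4-plex iTRAQ reporter-ion intensities from the MS/MS spectrum apportion it across the four samples.

```python
# Back-of-envelope version of the quantification logic; all values invented.
standard_amount_fmol = 100.0        # known amount of each mTRAQ reference
mrm_ratio_pool_vs_std = 1.6         # pooled target signal / reference signal
total_fmol = standard_amount_fmol * mrm_ratio_pool_vs_std

itraq_intensity = {114: 2.0, 115: 1.0, 116: 0.5, 117: 0.5}  # reporter ions
total_intensity = sum(itraq_intensity.values())

for channel, intensity in itraq_intensity.items():
    # Each sample's absolute amount is its share of the pooled total.
    print(f"sample {channel}: {total_fmol * intensity / total_intensity:.1f} fmol")
```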

  12. Ultra Performance Liquid Chromatography with Tandem Mass Spectrometry for the Quantitation of Seventeen Sedative Hypnotics in Six Common Toxicological Matrices.

    PubMed

    Mata, Dani C; Davis, John F; Figueroa, Ariana K; Stanford, Mary June

    2016-01-01

    An ultra performance liquid chromatography triple quadrupole mass spectrometry (LC-MS-MS) method for the quantification of 14 benzodiazepines and three sedative hypnotics is presented. The fast and inexpensive assay was developed for California's Orange County Crime Lab for use in antemortem (AM) and postmortem casework. The drugs were rapidly cleaned up from AM blood, postmortem blood, urine, liver, brain and stomach contents using DPX(®) Weak Anion Exchange (DPX WAX) tips fitted on a pneumatic extractor, which can process up to 48 samples at one time. Assay performance was determined for validation based on recommendations by the Scientific Working Group for Forensic Toxicology for linearity, limit of quantitation, limit of detection, bias, precision (within run and between run), dilution integrity, carry-over, selectivity, recovery, ion suppression and extracted sample stability. Linearity was verified using the therapeutic and toxic ranges of all 17 analytes. Final verification of the method was confirmed by four analysts using 20 blind matrix matched samples. All results were within 20% of each other and the expected value.

  13. Exploring implementation practices in results-based financing: the case of the verification in Benin.

    PubMed

    Antony, Matthieu; Bertone, Maria Paola; Barthes, Olivier

    2017-03-14

    Results-based financing (RBF) has been introduced in many countries across Africa and a growing literature is building around the assessment of its impact. These studies are usually quantitative and often silent on the paths and processes through which results are achieved and on the wider health system effects of RBF. To address this gap, our study aims at exploring the implementation of an RBF pilot in Benin, focusing on the verification of results. The study is based on action research carried out by authors involved in the pilot as part of the agency supporting the RBF implementation in Benin. While our participant observation and operational collaboration with the project's stakeholders informed the study, the analysis is mostly based on quantitative and qualitative secondary data, collected throughout the project's implementation and documentation processes. Data include project documents, reports and budgets, RBF data on service outputs and on the outcome of the verification, daily activity timesheets of the technical assistants in the districts, as well as focus groups with community-based organizations and informal interviews with technical assistants and district medical officers. Our analysis focuses on the actual practices of quantitative, qualitative and community verification. Results show that the verification processes are complex, costly and time-consuming, and in practice they end up differing from what was originally designed. We explore the consequences of this for the operation of the scheme and for its potential to generate the envisaged change. We find, for example, that the time taken up by verification procedures limits the time available for data analysis and feedback to facility staff, thus limiting the potential to improve service delivery. Verification challenges also result in delays in bonus payment, delinking effort and reward. Additionally, the limited integration of the verification activities of district teams with their routine tasks causes a further verticalization of the health system. Our results highlight the potential disconnect between the theory of change behind RBF and the actual scheme's implementation. The implications are relevant at the methodological level, stressing the importance of analyzing implementation processes to fully understand results, as well as at the operational level, pointing to the need to carefully adapt the design of RBF schemes (including verification and other key functions) to the context and to allow room to modify it iteratively during implementation. They also question whether the rationale for thorough and costly verification is justified, or whether adaptations are possible.

  14. Feasibility study on dosimetry verification of volumetric-modulated arc therapy-based total marrow irradiation.

    PubMed

    Liang, Yun; Kim, Gwe-Ya; Pawlicki, Todd; Mundt, Arno J; Mell, Loren K

    2013-03-04

    The purpose of this study was to develop dosimetry verification procedures for volumetric-modulated arc therapy (VMAT)-based total marrow irradiation (TMI). VMAT-based TMI plans were generated for three patients: one child and two adults. The planning target volume (PTV) was defined as the bony skeleton, from head to mid-femur, with a 3 mm margin. A plan strategy similar to published studies was adopted. The PTV was divided into head and neck, chest, and pelvic regions, each with a separate plan composed of 2-3 arcs/fields. Multiple isocenters were evenly distributed along the patient's axial direction. The focus of this study is to establish a dosimetry quality assurance procedure involving both two-dimensional (2D) and three-dimensional (3D) volumetric verification, which is desirable for a large PTV treated with multiple isocenters. The 2D dose verification was performed with film for gamma evaluation, and absolute point dose was measured with an ion chamber, with attention to hot/cold spots at the junctions between neighboring plans. The 3D volumetric dose verification used commercial dose reconstruction software to reconstruct dose from electronic portal imaging device (EPID) images. The gamma evaluation criteria in both 2D and 3D verification were 5% absolute point dose difference and 3 mm distance to agreement. With film dosimetry, the overall average gamma passing rate was 98.2% and the absolute dose difference was 3.9% in junction areas among the test patients; with volumetric portal dosimetry, the corresponding numbers were 90.7% and 2.4%. A dosimetry verification procedure involving both 2D and 3D was thus developed for VMAT-based TMI. The initial results are encouraging and warrant further investigation in clinical trials.
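
    For reference, a minimal 1D version of the gamma evaluation with the quoted 5%/3 mm criteria; real 2D/3D implementations are considerably more involved, and the profiles below are invented.

```python
import numpy as np

def gamma_1d(ref, meas, positions, dose_tol=0.05, dta_mm=3.0):
    """Gamma index of a measured 1D profile against a reference profile."""
    ref, meas, positions = map(np.asarray, (ref, meas, positions))
    gammas = []
    for x, d in zip(positions, meas):
        dd = (d - ref) / ref.max()           # global dose difference
        dx = positions - x                   # distance to each ref point (mm)
        gammas.append(np.sqrt((dd / dose_tol) ** 2 + (dx / dta_mm) ** 2).min())
    return np.array(gammas)

x = np.arange(0.0, 50.0, 1.0)                    # positions in mm
ref = np.exp(-((x - 25.0) / 10.0) ** 2)          # reference dose profile
meas = 1.03 * np.exp(-((x - 26.0) / 10.0) ** 2)  # 3% scaled, 1 mm shifted
g = gamma_1d(ref, meas, x)
print(f"gamma pass rate (5%/3 mm): {(g <= 1).mean():.1%}")
```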

  15. Verification of Abbott 25-OH-vitamin D assay on the architect system.

    PubMed

    Hutchinson, Katrina; Healy, Martin; Crowley, Vivion; Louw, Michael; Rochev, Yury

    2017-04-01

    Analytical and clinical verification of both old and new generations of the Abbott total 25-hydroxyvitamin D (25OHD) assays, and an examination of reference intervals. Determination of between-run precision, and Deming comparison between patient sample results for 25OHD on the Abbott Architect, DiaSorin Liaison and AB SCIEX API 4000 (LC-MS/MS). Establishment of uncertainty of measurement for the 25OHD Architect methods using old and new generations of the reagents, and estimation of the reference interval in a healthy Irish population. For between-run precision, the manufacturer claims coefficients of variation (CVs) of 2.8% and 4.6% for the high and low controls, respectively. Our instrument showed CVs between 4% and 6.2% for all levels of the controls on both generations of the Abbott reagents. The between-run uncertainties were 0.28 and 0.36, with expanded uncertainties of 0.87 and 0.98 for the old and the new generations of reagent, respectively. The difference between all methods used for patients' samples was within total allowable error, and the instruments produced clinically equivalent results. The results covered the medical decision points of 30, 40, 50 and 125 nmol/L. The reference interval for total 25OHD in our healthy Irish subjects (24-111 nmol/L) was lower than recommended levels. In a clinical laboratory, the Abbott 25OHD immunoassays are a useful, rapid and accurate method for measuring total 25OHD. The new generation of the assay was confirmed to be reliable, accurate, and a good indicator for 25OHD measurement. More study is needed to establish reference intervals that correctly represent the healthy population in Ireland.
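
    Between-run figures like those quoted here are typically derived as below; the QC values are invented, and the coverage factor k = 2 is a common convention rather than this paper's stated choice.

```python
import statistics

qc_runs = [52.1, 49.8, 53.4, 50.6, 51.9, 48.7]   # repeated QC results, nmol/L
mean = statistics.mean(qc_runs)
sd = statistics.stdev(qc_runs)

cv_percent = 100 * sd / mean     # between-run precision
u_rel = sd / mean                # relative standard uncertainty
expanded_u = 2 * u_rel           # coverage factor k = 2 (~95% coverage)

print(f"CV = {cv_percent:.1f}%")
print(f"expanded uncertainty (k=2) = {expanded_u:.2f}")
```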

  16. Microbial burden prediction model for unmanned planetary spacecraft

    NASA Technical Reports Server (NTRS)

    Hoffman, A. R.; Winterburn, D. A.

    1972-01-01

    The technical development of a computer program for predicting microbial burden on unmanned planetary spacecraft is outlined. The discussion includes the derivation of the basic analytical equations, the selection of a method for handling several random variables, the macrologic of the computer programs, and the validation and verification of the model. The prediction model was developed to (1) supplement the biological assays of a spacecraft by simulating the microbial accretion during periods when assays are not taken; (2) minimize the necessity for a large number of microbiological assays; and (3) predict the microbial loading on a lander immediately prior to sterilization and on other non-lander equipment prior to launch. It is shown not only that these purposes were achieved but also that the prediction results compare favorably with the estimates derived from the direct assays. The computer program can be applied not only as a prediction instrument but also as a management and control tool. The basic logic of the model is shown to have possible applicability to other sequential flow processes, such as food processing.

  17. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Marleau, Peter; Brubaker, Erik; Deland, Sharon M.

    This report summarizes the discussion and conclusions reached during a table top exercise held at Sandia National Laboratories, Albuquerque on September 3, 2014 regarding a recently described approach for nuclear warhead verification based on the cryptographic concept of a zero-knowledge protocol (ZKP), presented in a paper authored by Glaser, Barak, and Goldston. A panel of Sandia National Laboratories researchers, whose expertise includes radiation instrumentation design and development, cryptography, and arms control verification implementation, jointly reviewed the paper and identified specific challenges to implementing the approach as well as some opportunities. It was noted that ZKP as used in cryptography is a useful model for the arms control verification problem, but the direct analogy to arms control breaks down quickly. The ZKP methodology for warhead verification fits within the general class of template-based verification techniques, where a reference measurement is used to confirm that a given object is like another object that has already been accepted as a warhead by some other means. This can be a powerful verification approach, but it requires independent means to trust the authenticity of the reference warhead - a standard that may be difficult to achieve, and one the ZKP authors do not directly address. Despite some technical challenges, the concept of last-minute selection of the pre-loads and equipment could be a valuable component of a verification regime.

  18. Survey of Verification and Validation Techniques for Small Satellite Software Development

    NASA Technical Reports Server (NTRS)

    Jacklin, Stephen A.

    2015-01-01

    The purpose of this paper is to provide an overview of the current trends and practices in small-satellite software verification and validation. This document is not intended to promote a specific software assurance method. Rather, it seeks to present an unbiased survey of software assurance methods used to verify and validate small satellite software and to make mention of the benefits and value of each approach. These methods include simulation and testing, verification and validation with model-based design, formal methods, and fault-tolerant software design with run-time monitoring. Although the literature reveals that simulation and testing has by far the longest legacy, model-based design methods are proving to be useful for software verification and validation. Some work in formal methods, though not widely used for any satellites, may offer new ways to improve small satellite software verification and validation. These methods need to be further advanced to deal with the state explosion problem and to make them more usable by small-satellite software engineers to be regularly applied to software verification. Last, it is explained how run-time monitoring, combined with fault-tolerant software design methods, provides an important means to detect and correct software errors that escape the verification process or those errors that are produced after launch through the effects of ionizing radiation.

  19. Identification of Brucella genus and eight Brucella species by Luminex bead-based suspension array.

    PubMed

    Lusk Pfefer, Tina S; Timme, Ruth; Kase, Julie A

    2018-04-01

    Globally, unpasteurized milk products are vehicles for the transmission of brucellosis, a zoonosis responsible for cases of foodborne illness in the United States and elsewhere. Existing PCR assays to detect Brucella species are restricted by the resolution of band sizes on a gel or the number of fluorescent channels in a single real-time system. The Luminex bead-based suspension array is performed in a 96-well plate allowing for high throughput screening of up to 100 targets in one sample with easily discernible results. We have developed an array using the Bio-Plex 200 to differentiate the most common Brucella species: B. abortus, B. melitensis, B. suis, B. suis bv5, B. canis, B. ovis, B. pinnipedia, and B. neotomae, as well as Brucella genus. All probes showed high specificity, with no cross-reaction with non-Brucella strains. We could detect pure DNA from B. abortus, B. melitensis, and genus-level Brucella at concentrations of ≤5 fg/μL. Pure DNA from all other species tested positive at concentrations well below 500 fg/μL and we positively identified B. neotomae in six artificially contaminated cheese and milk products. An intra-laboratory verification further demonstrated the assay's accuracy and robustness in the rapid screening (3-4 h including PCR) of DNA.

  20. Simulation-Based Verification of Autonomous Controllers via Livingstone PathFinder

    NASA Technical Reports Server (NTRS)

    Lindsey, A. E.; Pecheur, Charles

    2004-01-01

    AI software is often used as a means of providing greater autonomy to automated systems, capable of coping with harsh and unpredictable environments. Due in part to the enormous space of possible situations that they aim to address, autonomous systems pose a serious challenge to traditional test-based verification approaches. Efficient verification approaches need to be perfected before these systems can reliably control critical applications. This publication describes Livingstone PathFinder (LPF), a verification tool for autonomous control software. LPF applies state space exploration algorithms to an instrumented testbed, consisting of the controller embedded in a simulated operating environment. Although LPF has focused on NASA's Livingstone model-based diagnosis system applications, the architecture is modular and adaptable to other systems. This article presents different facets of LPF and experimental results from applying the software to a Livingstone model of the main propulsion feed subsystem for a prototype space vehicle.

  1. A Framework for Evidence-Based Licensure of Adaptive Autonomous Systems

    DTIC Science & Technology

    2016-03-01

    insights gleaned to DoD. The autonomy community has identified significant challenges associated with test, evaluation, verification and validation of...licensure as a test, evaluation, verification, and validation (TEVV) framework that can address these challenges. IDA found that traditional...language requirements to testable (preferably machine testable) specifications • Design of architectures that treat development and verification of

  2. Verification of an on line in vivo semiconductor dosimetry system for TBI with two TLD procedures.

    PubMed

    Sánchez-Doblado, F; Terrón, J A; Sánchez-Nieto, B; Arráns, R; Errazquin, L; Biggs, D; Lee, C; Núñez, L; Delgado, A; Muñiz, J L

    1995-01-01

    This work presents the verification of an on-line in vivo dosimetry system based on semiconductors. Software and hardware have been designed to convert the diode signal into absorbed dose. Final verification was made in the form of an intercomparison with two independent thermoluminescent (TLD) dosimetry systems under TBI conditions.

  3. Pooled analysis of the accuracy of five cervical cancer screening tests assessed in eleven studies in Africa and India.

    PubMed

    Arbyn, Marc; Sankaranarayanan, Rengaswamy; Muwonge, Richard; Keita, Namory; Dolo, Amadou; Mbalawa, Charles Gombe; Nouhou, Hassan; Sakande, Boblewende; Wesley, Ramani; Somanathan, Thara; Sharma, Anjali; Shastri, Surendra; Basu, Parthasarathy

    2008-07-01

    Cervical cancer is the main cancer among women in sub-Saharan Africa, India and other parts of the developing world. Evaluation of the screening performance of effective, feasible and affordable early detection and management methods is a public health priority. Five screening methods, naked eye visual inspection of the cervix uteri after application of diluted acetic acid (VIA), or Lugol's iodine (VILI) or with a magnifying device (VIAM), the Pap smear and human papillomavirus testing with the high-risk probe of the Hybrid Capture-2 assay (HC2), were evaluated in 11 studies in India and Africa. More than 58,000 women, aged 25-64 years, were tested with 2-5 screening tests, and outcome verification was done on all women independent of the screening test results. The outcome was presence or absence of cervical intraepithelial neoplasia (CIN) of different degrees or invasive cervical cancer. Verification was based on colposcopy and histological interpretation of colposcopy-directed biopsies. Negative colposcopy was accepted as a truly negative outcome. VIA showed a sensitivity of 79% (95% CI 73-85%) and 83% (95% CI 77-89%), and a specificity of 85% (95% CI 81-89%) and 84% (95% CI 80-88%) for the outcomes CIN2+ and CIN3+, respectively. VILI was on average 10% more sensitive and equally specific. VIAM showed similar results to VIA. The Pap smear showed the lowest sensitivity, even at the lowest cutoff of atypical squamous cells of undetermined significance (57%; 95% CI 38-76%) for CIN2+, but its specificity was rather high (93%; 95% CI 89-97%). The HC2 assay showed a sensitivity for CIN2+ of 62% (95% CI 56-68%) and a specificity of 94% (95% CI 92-95%). Substantial interstudy variation was observed in the accuracy of the visual screening methods. Accuracy of visual methods and cytology increased over time, whereas the performance of HC2 was constant. Results of visual tests and colposcopy were highly correlated. This was the largest study ever done to evaluate the cross-sectional accuracy of screening tests for cervical cancer precursors in developing countries. Its merit was that all screened subjects were submitted to confirmatory investigations, avoiding verification bias. A major finding was the consistently higher sensitivity but equal specificity of VILI compared with VIA. Nevertheless, some caution is warranted in the interpretation of the observed accuracy measures, since a certain degree of gold standard misclassification cannot be excluded. Because of the correlation between visual screening tests and colposcopy, and a certain degree of over-diagnosis of apparent CIN2+ by study pathologists, it is possible that both the sensitivity and the specificity of VIA and VILI were overestimated. Gold standard verification error could also explain the surprisingly low sensitivity of HC2, which contrasts with findings from other studies.
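
    Figures such as "sensitivity of 79% (95% CI 73-85%)" follow from a standard 2x2 computation; the counts below are invented to reproduce the shape of the calculation, not the pooled study data.

```python
import math

def estimate_with_ci(successes, trials, z=1.96):
    """Proportion with a normal-approximation 95% confidence interval."""
    p = successes / trials
    se = math.sqrt(p * (1 - p) / trials)
    return p, p - z * se, p + z * se

# Invented 2x2 counts: sensitivity = tp/(tp+fn), specificity = tn/(tn+fp).
sens, lo, hi = estimate_with_ci(successes=158, trials=200)     # tp=158, fn=42
spec, slo, shi = estimate_with_ci(successes=850, trials=1000)  # tn=850, fp=150

print(f"sensitivity {sens:.0%} (95% CI {lo:.0%}-{hi:.0%})")
print(f"specificity {spec:.0%} (95% CI {slo:.0%}-{shi:.0%})")
```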

  4. Blocking Breast Cancer Metastasis by Targeting RNA-Binding Protein HuR

    DTIC Science & Technology

    2017-10-01

    used for proposed pre-clinical anti-HuR drug testing. Local IACUC was renewed and we are awaiting re-review by DOD. 15. SUBJECT TERMS HuR, Mouse model...metastasis-associated phenotypes. This project will be accomplished on two campuses, where our group will be primarily responsible for the in vivo...pre-clinical testing in experimental and spontaneous metastasis assays. To date, our objective has been to obtain, validate and begin verification of

  5. Assaying Ornithine and Arginine Decarboxylases in Some Plant Species 1

    PubMed Central

    Birecka, Helena; Bitonti, Alan J.; McCann, Peter P.

    1985-01-01

    A release of 14CO2 not related to ornithine decarboxylase activity was found in crude leaf extracts from Lycopersicon esculentum, Avena sativa, and especially from the pyrrolizidine alkaloid-bearing Heliotropium angiospermum when incubated with [1-14C]- or [U-14C]ornithine. The total 14CO2 produced was about 5- to 100-fold higher than that due to ornithine decarboxylase activities calculated from labeled putrescine (Put) found by thin-layer electrophoresis in the incubation mixtures. Partial purification with (NH4)2SO4 did not eliminate completely the interfering decarboxylation. When incubated with labeled arginine, a very significant 14CO2 release not related to arginine decarboxylase activity was observed only in extracts from H. angiospermum leaves, especially in Tris·HCl buffer. Under the assay conditions, these extracts exhibited oxidative degradation of added Put and agmatine (Agm) and also revealed a high arginase activity. Amino-guanidine at 0.1 to 0.2 millimolar prevented Put degradation and greatly decreased oxidative degradation of Agm; ornithine at 15 to 20 millimolar significantly inhibited arginase activity. A verification of the reliability of the standard 14CO2-based method by assessing labeled Put and/or Agm—formed in the presence of added aminoguanidine and/or ornithine when needed—is recommended especially when crude or semicrude plant extracts are assayed. When based on Put and/or Agm formed at 1.0 to 2.5 millimolar of substrate, the activities of ornithine decarboxylase and arginine decarboxylase in the youngest leaves of the tested species ranged between 1.1 and 3.6 and 1 and 1600 nanomoles per hour per gram fresh weight, respectively. The enzyme activities are discussed in relation to the biosynthesis of pyrrolizidine alkaloids. PMID:16664441

  6. Characterization of 137 Genomic DNA Reference Materials for 28 Pharmacogenetic Genes

    PubMed Central

    Pratt, Victoria M.; Everts, Robin E.; Aggarwal, Praful; Beyer, Brittany N.; Broeckel, Ulrich; Epstein-Baak, Ruth; Hujsak, Paul; Kornreich, Ruth; Liao, Jun; Lorier, Rachel; Scott, Stuart A.; Smith, Chingying Huang; Toji, Lorraine H.; Turner, Amy; Kalman, Lisa V.

    2017-01-01

    Pharmacogenetic testing is increasingly available from clinical laboratories. However, only a limited number of quality control and other reference materials are currently available to support clinical testing. To address this need, the Centers for Disease Control and Prevention–based Genetic Testing Reference Material Coordination Program, in collaboration with members of the pharmacogenetic testing community and the Coriell Cell Repositories, has characterized 137 genomic DNA samples for 28 genes commonly genotyped by pharmacogenetic testing assays (CYP1A1, CYP1A2, CYP2A6, CYP2B6, CYP2C8, CYP2C9, CYP2C19, CYP2D6, CYP2E1, CYP3A4, CYP3A5, CYP4F2, DPYD, GSTM1, GSTP1, GSTT1, NAT1, NAT2, SLC15A2, SLC22A2, SLCO1B1, SLCO2B1, TPMT, UGT1A1, UGT2B7, UGT2B15, UGT2B17, and VKORC1). One hundred thirty-seven Coriell cell lines were selected based on ethnic diversity and partial genotype characterization from earlier testing. DNA samples were coded and distributed to volunteer testing laboratories for targeted genotyping using a number of commercially available and laboratory developed tests. Through consensus verification, we confirmed the presence of at least 108 variant pharmacogenetic alleles. These samples are also being characterized by other pharmacogenetic assays, including next-generation sequencing, which will be reported separately. Genotyping results were consistent among laboratories, with most differences in allele assignments attributed to assay design and variability in reported allele nomenclature, particularly for CYP2D6, UGT1A1, and VKORC1. These publicly available samples will help ensure the accuracy of pharmacogenetic testing. PMID:26621101

  7. Targets Fishing and Identification of Calenduloside E as Hsp90AB1: Design, Synthesis, and Evaluation of Clickable Activity-Based Probe

    PubMed Central

    Wang, Shan; Tian, Yu; Zhang, Jing-Yi; Xu, Hui-Bo; Zhou, Ping; Wang, Min; Lu, Sen-Bao; Luo, Yun; Wang, Min; Sun, Gui-Bo; Xu, Xu-Dong; Sun, Xiao-Bo

    2018-01-01

    Calenduloside E (CE), a natural triterpenoid compound isolated from Aralia elata, can protect against ox-LDL-induced human umbilical vein endothelial cell (HUVEC) injury in our previous reports. However, the exact targets and mechanisms of CE remain elusive. For the sake of resolving this question, we designed and synthesized a clickable activity-based probe (CE-P), which could be utilized to fish the functional targets in HUVECs using a gel-based strategy. Based on the previous studies of the structure-activity relationship (SAR), we introduced an alkyne moiety at the C-28 carboxylic group of CE, which kept the protective and anti-apoptosis activity. Via proteomic approach, one of the potential proteins bound to CE-P was identified as Hsp90AB1, and further verification was performed by pure recombinant Hsp90AB1 and competitive assay. These results demonstrated that CE could bind to Hsp90AB1. We also found that CE could reverse the Hsp90AB1 decrease after ox-LDL treatment. To make our results more convincing, we performed SPR analysis and the affinity kinetic assay showed that CE/CE-P could bind to Hsp90AB1 in a dose-dependent manner. Taken together, our research showed CE could probably bind to Hsp90AB1 to protect the cell injury, which might provide the basis for the further exploration of its cardiovascular protective mechanisms. PMID:29875664

  8. Isocenter verification for linac‐based stereotactic radiation therapy: review of principles and techniques

    PubMed Central

    Sabet, Mahsheed; O'Connor, Daryl J.; Greer, Peter B.

    2011-01-01

    There have been several manual, semi‐automatic and fully‐automatic methods proposed for verification of the position of mechanical isocenter as part of comprehensive quality assurance programs required for linear accelerator‐based stereotactic radiosurgery/radiotherapy (SRS/SRT) treatments. In this paper, a systematic review has been carried out to discuss the present methods for isocenter verification and compare their characteristics, to help physicists in making a decision on selection of their quality assurance routine. PACS numbers: 87.53.Ly, 87.56.Fc, 87.56.‐v PMID:22089022

  9. Current status of 3D EPID-based in vivo dosimetry in The Netherlands Cancer Institute

    NASA Astrophysics Data System (ADS)

    Mijnheer, B.; Olaciregui-Ruiz, I.; Rozendaal, R.; Spreeuw, H.; van Herk, M.; Mans, A.

    2015-01-01

    3D in vivo dose verification using a-Si EPIDs is performed routinely in our institution for almost all RT treatments. The EPID-based 3D dose distribution is reconstructed using a back-projection algorithm and compared with the planned dose distribution using 3D gamma evaluation. Dose-reconstruction and gamma-evaluation software runs automatically, and deviations outside the alert criteria are immediately available and investigated, in combination with inspection of cone-beam CT scans. The implementation of our 3D EPID-based in vivo dosimetry approach was able to replace pre-treatment verification for more than 90% of the patient treatments. Clinically relevant deviations could be detected for approximately 1 out of 300 patient treatments (IMRT and VMAT). Most of these errors were patient-related anatomical changes or deviations from the routine clinical procedure, and would not have been detected by pre-treatment verification. Moreover, 3D EPID-based in vivo dose verification is a fast and accurate tool to assure the safe delivery of RT treatments. It provides clinically more useful information and is less time consuming than pre-treatment verification measurements. Automated 3D in vivo dosimetry is therefore a prerequisite for large-scale implementation of patient-specific quality assurance of RT treatments.

  10. Development of independent MU/treatment time verification algorithm for non-IMRT treatment planning: A clinical experience

    NASA Astrophysics Data System (ADS)

    Tatli, Hamza; Yucel, Derya; Yilmaz, Sercan; Fayda, Merdan

    2018-02-01

    The aim of this study is to develop an algorithm for independent MU/treatment time (TT) verification for non-IMRT treatment plans, as part of a QA program to ensure treatment delivery accuracy. Two radiotherapy delivery units and their treatment planning systems (TPSs) were commissioned in Liv Hospital Radiation Medicine Center, Tbilisi, Georgia. Beam data were collected according to the vendors' collection guidelines and AAPM report recommendations, and processed in Microsoft Excel during in-house algorithm development. The algorithm is designed and optimized for calculating SSD and SAD treatment plans, based on the AAPM TG-114 dose calculation recommendations, and is coded and embedded in an MS Excel spreadsheet as a preliminary verification algorithm (VA). Treatment verification plans were created by the TPSs based on IAEA TRS-430 recommendations and were also calculated by the VA; point measurements were collected with a solid water phantom and compared. The study showed that the in-house VA can be used for MU/TT verification of non-IMRT plans.
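
    A simplified sketch of what a TG-114-style independent MU check computes for a single SAD photon field; the factor values below are placeholders, since a clinical VA would read them from commissioned beam-data tables.

```python
def independent_mu(dose_cgy, dref_cgy_per_mu=1.0, sc=1.0, sp=1.0,
                   tpr=1.0, wedge=1.0, tray=1.0):
    """MU = D / (D_ref * Sc * Sp * TPR * beam-modifier factors)."""
    return dose_cgy / (dref_cgy_per_mu * sc * sp * tpr * wedge * tray)

# Placeholder factors for one field; a real check looks them up per field.
mu = independent_mu(dose_cgy=200.0, sc=1.012, sp=0.996, tpr=0.905)
tps_mu = 221.0   # MU reported by the TPS for the same field (invented)
print(f"independent MU = {mu:.1f}, TPS MU = {tps_mu:.1f}, "
      f"difference = {100 * (mu - tps_mu) / tps_mu:+.1f}%")
```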

  11. Resolution verification targets for airborne and spaceborne imaging systems at the Stennis Space Center

    NASA Astrophysics Data System (ADS)

    McKellip, Rodney; Yuan, Ding; Graham, William; Holland, Donald E.; Stone, David; Walser, William E.; Mao, Chengye

    1997-06-01

    The number of available spaceborne and airborne systems will dramatically increase over the next few years. A common systematic approach toward verification of these systems will become important for comparing the systems' operational performance. The Commercial Remote Sensing Program at the John C. Stennis Space Center (SSC) in Mississippi has developed design requirements for a remote sensing verification target range to provide a means to evaluate spatial, spectral, and radiometric performance of optical digital remote sensing systems. The verification target range consists of spatial, spectral, and radiometric targets painted on a 150- by 150-meter concrete pad located at SSC. The design criteria for this target range are based upon work over a smaller, prototypical target range at SSC during 1996. This paper outlines the purpose and design of the verification target range based upon an understanding of the systems to be evaluated as well as data analysis results from the prototypical target range.

  12. ENVIRONMENTAL TECHNOLOGY VERIFICATION: Reduction of Nitrogen in Domestic Wastewater from Individual Residential Homes. BioConcepts, Inc. ReCip® RTS ~ 500 System

    EPA Science Inventory

    Verification testing of the ReCip® RTS-500 System was conducted over a 12-month period at the Massachusetts Alternative Septic System Test Center (MASSTC) located on Otis Air National Guard Base in Bourne, Massachusetts. A nine-week startup period preceded the verification test t...

  13. A robust method using propensity score stratification for correcting verification bias for binary tests

    PubMed Central

    He, Hua; McDermott, Michael P.

    2012-01-01

    Sensitivity and specificity are common measures of the accuracy of a diagnostic test. The usual estimators of these quantities are unbiased if data on the diagnostic test result and the true disease status are obtained from all subjects in an appropriately selected sample. In some studies, verification of the true disease status is performed only for a subset of subjects, possibly depending on the result of the diagnostic test and other characteristics of the subjects. Estimators of sensitivity and specificity based on this subset of subjects are typically biased; this is known as verification bias. Methods have been proposed to correct verification bias under the assumption that the missing data on disease status are missing at random (MAR), that is, the probability of missingness depends on the true (missing) disease status only through the test result and observed covariate information. When some of the covariates are continuous, or the number of covariates is relatively large, the existing methods require parametric models for the probability of disease or the probability of verification (given the test result and covariates), and hence are subject to model misspecification. We propose a new method for correcting verification bias based on the propensity score, defined as the predicted probability of verification given the test result and observed covariates. This is estimated separately for those with positive and negative test results. The new method classifies the verified sample into several subsamples that have homogeneous propensity scores and allows correction for verification bias. Simulation studies demonstrate that the new estimators are more robust to model misspecification than existing methods, but still perform well when the models for the probability of disease and probability of verification are correctly specified. PMID:21856650
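
    A toy simulation of the correction, stratifying on a (here, known) propensity score; the data and the crude two-stratum design are illustrative only, whereas the paper estimates the propensity score from covariates.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 5000
test_pos = rng.random(n) < 0.3                           # diagnostic test result
disease = rng.random(n) < np.where(test_pos, 0.7, 0.1)   # true status (toy link)
p_verify = np.where(test_pos, 0.9, 0.2)                  # verification propensity
verified = rng.random(n) < p_verify                      # who gets gold standard

# Naive estimate from verified subjects only (biased upward here, because
# test-positive subjects are verified more often).
naive = (test_pos & disease & verified).sum() / (disease & verified).sum()

# Stratify on the propensity score, estimate joint probabilities per stratum
# from verified subjects, then recombine weighted by stratum size.
tp = fn = 0.0
for s in np.unique(p_verify):
    in_stratum = p_verify == s
    m = verified & in_stratum
    tp += in_stratum.sum() * (test_pos[m] & disease[m]).mean()
    fn += in_stratum.sum() * (~test_pos[m] & disease[m]).mean()

print(f"naive sensitivity:      {naive:.2f}")
print(f"stratified sensitivity: {tp / (tp + fn):.2f}  (true value ~0.75)")
```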

  14. Design and verification of a highly reliable Linear-After-The-Exponential PCR (LATE-PCR) assay for the detection of African swine fever virus.

    PubMed

    Ronish, B; Hakhverdyan, M; Ståhl, K; Gallardo, C; Fernandez-Pinero, J; Belák, S; Leblanc, N; Wangh, L

    2011-03-01

    African swine fever virus (ASFV) is a highly pathogenic DNA virus that is the causative agent of African swine fever (ASF), an infectious disease of domestic and wild pigs of all breeds and ages, causing a range of syndromes. Acute disease is characterized by high fever, haemorrhages in the reticuloendothelial system, and a high mortality rate. A powerful novel diagnostic assay based on the Linear-After-The-Exponential-PCR (LATE-PCR) principle was developed to detect ASFV. LATE-PCR is an advanced form of asymmetric PCR which results in direct amplification of a large amount of single-stranded DNA. Fluorescent readings are acquired using endpoint analysis after PCR amplification, and amplification of the correct product is verified by melting curve analysis. The assay was designed to amplify the VP72 gene of the ASFV genome. Nineteen ASFV DNA cell culture virus strains and three tissue samples (spleen, tonsil, and liver) from infected experimental pigs were tested. Virus was detected in all of the cell culture and tissue samples. None of five ASFV-related viruses tested produced a positive signal, demonstrating the high specificity of the assay. The sensitivity of the LATE-PCR assay was determined in two separate real-time monoplex reactions using samples of synthetic ASFV and synthetic control-DNA targets that were diluted serially from 10⁹ to 1 initial copies per reaction; the detection limits were 1 and 10 copies/reaction, respectively. The sensitivity of the assay was also tested in duplex end-point reactions comprising a constant level of 150 copies of synthetic control-DNA and a clinical sample of spleen tissue diluted serially from 10⁻¹ to 10⁻⁵. The detection limit was the 10⁻⁵ dilution, which corresponds to approximately 1 copy/reaction. Since the assay is designed to be used either in laboratory settings or in a portable PCR machine (Bio-Seeq Portable Veterinary Diagnostics Laboratory; Smiths Detection, Watford, UK), LATE-PCR provides a robust and novel tool for the diagnosis of ASF both in the laboratory and in the field.

  15. Gaia challenging performances verification: combination of spacecraft models and test results

    NASA Astrophysics Data System (ADS)

    Ecale, Eric; Faye, Frédéric; Chassat, François

    2016-08-01

    To achieve the ambitious scientific objectives of the Gaia mission, extremely stringent performance requirements have been given to the spacecraft contractor (Airbus Defence and Space). For a set of those key performance requirements (e.g. end-of-mission parallax, maximum detectable magnitude, maximum sky density or attitude control system stability), this paper describes how they are engineered during the whole spacecraft development process, with a focus on end-to-end performance verification. As far as possible, performances are usually verified by end-to-end tests on ground (i.e. before launch). However, the challenging Gaia requirements are not verifiable by such a strategy, principally because no test facility exists to reproduce the expected flight conditions. The Gaia performance verification strategy is therefore based on a mix of analyses (based on spacecraft models) and tests (used to directly feed the models or to correlate them). Emphasis is placed on how to maximize the test contribution to performance verification while keeping the tests feasible within an affordable effort. In particular, the paper highlights the contribution of the Gaia Payload Module Thermal Vacuum test to performance verification before launch. Finally, an overview of the in-flight payload calibration and in-flight performance verification is provided.

  16. Integrity Verification for Multiple Data Copies in Cloud Storage Based on Spatiotemporal Chaos

    NASA Astrophysics Data System (ADS)

    Long, Min; Li, You; Peng, Fei

    Aiming to strike a balance between the security, efficiency and availability of data verification in cloud storage, a novel integrity verification scheme based on spatiotemporal chaos is proposed for multiple data copies. Spatiotemporal chaos is used for node calculation in the binary tree, and the location of the data in the cloud is verified. Meanwhile, dynamic operations can be performed on the data. Furthermore, blind information is used to prevent a third-party auditor (TPA) from leaking the users' data privacy in a public auditing process. Performance analysis and discussion indicate that the scheme is secure and efficient, and that it supports dynamic operations and the integrity verification of multiple copies of data. It has great potential to be implemented in cloud storage services.

  17. The Verification-based Analysis of Reliable Multicast Protocol

    NASA Technical Reports Server (NTRS)

    Wu, Yunqing

    1996-01-01

    Reliable Multicast Protocol (RMP) is a communication protocol that provides an atomic, totally ordered, reliable multicast service on top of unreliable IP multicasting. In this paper, we develop formal models for RMP using existing automatic verification systems and perform verification-based analysis on the formal RMP specifications. We also use the formal models of the RMP specifications to generate a test suite for conformance testing of the RMP implementation. Throughout the process of RMP development, we follow an iterative, interactive approach that emphasizes concurrent and parallel progress between the implementation and verification processes. Through this approach, we incorporate formal techniques into our development process, promote a common understanding of the protocol, increase the reliability of our software, and maintain high fidelity between the specifications of RMP and its implementation.

  18. UIVerify: A Web-Based Tool for Verification and Automatic Generation of User Interfaces

    NASA Technical Reports Server (NTRS)

    Shiffman, Smadar; Degani, Asaf; Heymann, Michael

    2004-01-01

    In this poster, we describe a web-based tool for verification and automatic generation of user interfaces. The verification component of the tool accepts as input a model of a machine and a model of its interface, and checks that the interface is adequate (correct). The generation component of the tool accepts a model of a given machine and the user's task, and then generates a correct and succinct interface. This write-up will demonstrate the usefulness of the tool by verifying the correctness of a user interface to a flight-control system. The poster will include two more examples of using the tool: verification of the interface to an espresso machine, and automatic generation of a succinct interface to a large hypothetical machine.

  19. Information verification and encryption based on phase retrieval with sparsity constraints and optical inference

    NASA Astrophysics Data System (ADS)

    Zhong, Shenlu; Li, Mengjiao; Tang, Xiajie; He, Weiqing; Wang, Xiaogang

    2017-01-01

    A novel optical information verification and encryption method is proposed based on inference principle and phase retrieval with sparsity constraints. In this method, a target image is encrypted into two phase-only masks (POMs), which comprise sparse phase data used for verification. Both of the two POMs need to be authenticated before being applied for decrypting. The target image can be optically reconstructed when the two authenticated POMs are Fourier transformed and convolved by the correct decryption key, which is also generated in encryption process. No holographic scheme is involved in the proposed optical verification and encryption system and there is also no problem of information disclosure in the two authenticable POMs. Numerical simulation results demonstrate the validity and good performance of this new proposed method.

  20. 78 FR 56266 - Consent Based Social Security Number Verification (CBSV) Service

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-09-12

    ... developed CBSV as a user-friendly, internet-based application with safeguards that protect the public's information. In addition to the benefit of providing high volume, centralized SSN verification services to users in a secure manner, CBSV provides us with cost and workload management benefits. New Information...

  1. Model-Based Building Verification in Aerial Photographs.

    DTIC Science & Technology

    1987-09-01

    Powers ... Gordon E. Schacher, Chairman ... Dean of Science and Engineering ... Electrical and Computer Engineering ... In this paper, we have proposed an experimental knowledge-based verification system; the organization for change detection is outlined. Knowledge rules and

  2. Formal Verification for a Next-Generation Space Shuttle

    NASA Technical Reports Server (NTRS)

    Nelson, Stacy D.; Pecheur, Charles; Koga, Dennis (Technical Monitor)

    2002-01-01

    This paper discusses the verification and validation (V&V) of advanced software used for integrated vehicle health monitoring (IVHM), in the context of NASA's next-generation space shuttle. We survey the current V&V practice and standards used in selected NASA projects, review applicable formal verification techniques, and discuss their integration into existing development practice and standards. We also describe two verification tools, JMPL2SMV and Livingstone PathFinder, that can be used to thoroughly verify diagnosis applications that use model-based reasoning, such as the Livingstone system.

  3. Expression, purification of metallothionein genes from freshwater crab (Sinopotamon yangtsekiense) and development of an anti-metallothionein ELISA

    PubMed Central

    Zhang, Hao; Zhou, Hui

    2017-01-01

    Using phoA-fusion technology, recombinant metallothionein (MT) from the freshwater crab (Sinopotamon yangtsekiense) has been successfully produced in Escherichia coli. MT purified from the bacterial suspension showed one polypeptide with a molecular weight of 7 kDa by tricine-sodium dodecyl sulfate-polyacrylamide gel electrophoresis (Tricine-SDS-PAGE). Western blotting confirmed that the polypeptide had specific reactivity with mouse polyclonal MT antiserum. Based on the purified MT and MT antiserum, the reaction parameters for an enzyme-linked immunosorbent assay (ELISA) were developed. The direct coating ELISA showed better linearity than the antibody sandwich coating ELISA. The optimal dilution rate of purified MT antiserum and the coating period were 1:160,000 and 12 hours at 4°C, respectively. At 37°C, the appropriate reaction durations of the first antibody and the second antibody were 2 hours and 1 hour, respectively. Using these optimal parameters, the standard linear equation, y = 0.0032x + 0.1769 (R2 = 0.9779; x and y representing MT concentration and OD450 value), was established for the determination of MT concentration, with a valid range of 3.9–500 ng/ml. In verification experiments, the mean coefficients of variation of the intra-assay and inter-assay were 3.260% and 3.736%, respectively. According to the MT recovery results, the ELISA, with recovery approaching 100%, was more reliable and sensitive than the Cd saturation assay. In conclusion, the newly developed ELISA is precise, stable and repeatable, and could be used as a biomarker tool to monitor pollution by heavy metals. PMID:28350826
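
    The reported standard curve can be applied directly: inverting y = 0.0032x + 0.1769 converts an OD450 reading into an MT concentration. A minimal worked example, with the validity check mirroring the 3.9–500 ng/ml range quoted above:

        # Invert the reported ELISA calibration line to estimate MT concentration
        def mt_concentration(od450, slope=0.0032, intercept=0.1769):
            x = (od450 - intercept) / slope          # x = (y - b) / a, in ng/ml
            if not 3.9 <= x <= 500:
                raise ValueError("reading outside the assay's validated range")
            return x

        print(round(mt_concentration(0.50), 1))      # ~101.0 ng/ml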

  4. Mining for sensitive and reliable species-specific primers for PCR for detection of Cronobacter sakazakii by a bioinformatics approach.

    PubMed

    Qiming, Chen; Tingting, Tao; Xiaomei, Bie; Yingjian, Lu; Fengxia, Lu; Ligong, Zhai; Zhaoxin, Lu

    2015-08-01

    Although several studies have reported PCR assays for distinguishing Cronobacter sakazakii from other species in the genus, reports regarding assay sensitivity and specificity, as well as applications for food testing, are lacking. Hence, the objective of this study was to develop a sensitive and reliable PCR-based method for detection of C. sakazakii by screening for specific target genes. The genome sequence of C. sakazakii in the GenBank database was compared with that of other organisms using BLAST. Thirty-eight DNA fragments unique to C. sakazakii were identified, and primers targeting these sequences were designed. Finally, 3 primer sets (CS14, CS21, and CS38) were found to be specific for C. sakazakii by PCR verification. The detection limits of PCR assays using the 3 pairs of primers were 1.35 pg/μL, 135 fg/μL, and 135 fg/μL, respectively, for genomic DNA, and 5.5×10(5), 5.5×10(3), and 5.5×10(3) cfu/mL, respectively, using pure cultures of the bacteria, compared with 13.5 pg/μL and 5.5×10(5) cfu/mL for primer set SpeCronsaka, which has been previously described. C. sakazakii was detected in artificially contaminated powdered infant formula (PIF) by PCR using primer sets CS21 and CS38 after 8 h of enrichment. The detection limit was 5.5×10(-1) cfu/10 g of PIF. Thus, the PCR assay can be used for rapid and sensitive detection of C. sakazakii in PIF. Copyright © 2015 American Dairy Science Association. Published by Elsevier Inc. All rights reserved.

  5. Formal verification of an oral messages algorithm for interactive consistency

    NASA Technical Reports Server (NTRS)

    Rushby, John

    1992-01-01

    The formal specification and verification of an algorithm for Interactive Consistency based on the Oral Messages algorithm for Byzantine Agreement is described. We compare our treatment with that of Bevier and Young, who presented a formal specification and verification for a very similar algorithm. Unlike Bevier and Young, who observed that 'the invariant maintained in the recursive subcases of the algorithm is significantly more complicated than is suggested by the published proof' and who found its formal verification 'a fairly difficult exercise in mechanical theorem proving,' our treatment is very close to the previously published analysis of the algorithm, and our formal specification and verification are straightforward. This example illustrates how delicate choices in the formulation of the problem can have significant impact on the readability of its formal specification and on the tractability of its formal verification.
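
    For readers unfamiliar with the underlying protocol, the following compact simulation of the classical recursive Oral Messages algorithm OM(m) (a textbook sketch, not Rushby's formal specification) shows the loyal lieutenants reaching agreement despite a traitor; a faulty participant is modeled crudely as one that flips every bit it sends.

        # Sketch of OM(m): loyal lieutenants agree on the commander's value
        from collections import Counter

        def om(m, commander, value, lieutenants, faulty):
            def send(sender, v, receiver):
                return 1 - v if sender in faulty else v   # traitors flip bits
            received = {l: send(commander, value, l) for l in lieutenants}
            if m == 0:
                return received
            # Each lieutenant relays its value to the others via OM(m-1)
            relayed = {l: om(m - 1, l, received[l],
                             [x for x in lieutenants if x != l], faulty)
                       for l in lieutenants}
            majority = lambda votes: Counter(votes).most_common(1)[0][0]
            return {j: majority([received[j]] +
                                [relayed[l][j] for l in lieutenants if l != j])
                    for j in lieutenants}

        # n = 4, m = 1, lieutenant L3 is a traitor: L1 and L2 still decide 1
        print(om(1, 'General', 1, ['L1', 'L2', 'L3'], faulty={'L3'}))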

  6. Evaluation of an alternative in vitro test battery for detecting reproductive toxicants in a grouping context.

    PubMed

    Kroese, E Dinant; Bosgra, Sieto; Buist, Harrie E; Lewin, Geertje; van der Linden, Sander C; Man, Hai-yen; Piersma, Aldert H; Rorije, Emiel; Schulpen, Sjors H W; Schwarz, Michael; Uibel, Frederik; van Vugt-Lussenburg, Barbara M A; Wolterbeek, Andre P M; van der Burg, Bart

    2015-08-01

    Previously we showed that a battery consisting of CALUX transcriptional activation assays, the ReProGlo assay, the embryonic stem cell test, and the zebrafish embryotoxicity assay as 'apical' tests correctly predicted developmental toxicity for 11 out of 12 compounds, and explained the one false negative [7]. Here we report on applying this battery, within the context of grouping and read-across, put forward as a potential tool to fill data gaps and avoid animal testing, to distinguish in vivo non- or weak developmental toxicants from potent developmental toxicants within groups of structural analogs. The battery correctly distinguished 2-methylhexanoic acid, monomethyl phthalate, and monobutyltin trichloride as non- or weak developmental toxicants from the structurally related developmental toxicants valproic acid, mono-ethylhexyl phthalate, and tributyltin chloride, respectively, and therefore holds promise as a biological verification model in grouping and read-across approaches. The relevance of toxicokinetic information is indicated. Copyright © 2014 Elsevier Inc. All rights reserved.

  7. MR Imaging Based Treatment Planning for Radiotherapy of Prostate Cancer

    DTIC Science & Technology

    2007-02-01

    ... developed practical methods for heterogeneity correction for MRI-based dose calculations (Chen et al 2007). 6) We will use existing Monte Carlo ... Monte Carlo verification of IMRT dose distributions from a commercial treatment planning optimization system, Phys. Med. Biol., 45:2483-95 (2000) ... accuracy and consistency for MR-based IMRT treatment planning for prostate cancer. A short paper entitled "Monte Carlo dose verification of MR image based ...

  8. Mutation Testing for Effective Verification of Digital Components of Physical Systems

    NASA Astrophysics Data System (ADS)

    Kushik, N. G.; Evtushenko, N. V.; Torgaev, S. N.

    2015-12-01

    Digital components of modern physical systems are often designed using circuitry solutions based on field-programmable gate array (FPGA) technology. Such (embedded) digital components should be carefully tested. In this paper, an approach to the verification of digital physical-system components based on mutation testing is proposed. The reference description of the behavior of a digital component in a hardware description language (HDL) is mutated by introducing the most probable errors into it and, unlike for mutants in high-level programming languages, the corresponding test case is effectively derived based on a comparison of special scalable representations of the specification and the constructed mutant, using various logic synthesis and verification systems.
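
    The core idea of deriving a test from a specification/mutant pair can be shown in a few lines. In this hedged sketch the HDL block is reduced to a small combinational Boolean function, a plausible design error (an AND gate replaced by an OR gate) plays the role of the mutant, and exhaustive search over input vectors returns the patterns that expose the fault; any one of them is a test case that kills the mutant.

        # Mutation-based test derivation for a toy combinational block
        from itertools import product

        def spec(a, b, c):       # reference behavior: y = (a AND b) XOR c
            return (a & b) ^ c

        def mutant(a, b, c):     # injected error: AND replaced by OR
            return (a | b) ^ c

        def distinguishing_tests(f, g, n_inputs=3):
            return [v for v in product((0, 1), repeat=n_inputs) if f(*v) != g(*v)]

        print(distinguishing_tests(spec, mutant))
        # [(0, 1, 0), (0, 1, 1), (1, 0, 0), (1, 0, 1)] -- each kills the mutant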

  9. ENVIRONMENTAL TECHNOLOGY VERIFICATION: JOINT (NSF-EPA) VERIFICATION STATEMENT AND REPORT FOR THE REDUCTION OF NITROGEN IN DOMESTIC WASTEWATER FROM INDIVIDUAL RESIDENTIAL HOMES, WATERLOO BIOFILTER® MODEL 4-BEDROOM (NSF 02/03/WQPC-SWP)

    EPA Science Inventory

    Verification testing of the Waterloo Biofilter Systems (WBS), Inc. Waterloo Biofilter® Model 4-Bedroom system was conducted over a thirteen month period at the Massachusetts Alternative Septic System Test Center (MASSTC) located at Otis Air National Guard Base in Bourne, Mas...

  10. Study of space shuttle orbiter system management computer function. Volume 2: Automated performance verification concepts

    NASA Technical Reports Server (NTRS)

    1975-01-01

    The findings are presented of investigations on concepts and techniques in automated performance verification. The investigations were conducted to provide additional insight into the design methodology and to develop a consolidated technology base from which to analyze performance verification design approaches. Other topics discussed include data smoothing, function selection, flow diagrams, data storage, and shuttle hydraulic systems.

  11. ENVIRONMENTAL TECHNOLOGY VERIFICATION: JOINT (NSF-EPA) VERIFICATION STATEMENT AND REPORT FOR THE REDUCTION OF NITROGEN IN DOMESTIC WASTEWATER FROM INDIVIDUAL HOMES, AQUAPOINT, INC. BIOCLERE MODEL 16/12 - 02/02/WQPC-SWP

    EPA Science Inventory

    Verification testing of the Aquapoint, Inc. (AQP) BioclereTM Model 16/12 was conducted over a thirteen month period at the Massachusetts Alternative Septic System Test Center (MASSTC), located at Otis Air National Guard Base in Bourne, Massachusetts. Sanitary sewerage from the ba...

  12. Dynamic testing for shuttle design verification

    NASA Technical Reports Server (NTRS)

    Green, C. E.; Leadbetter, S. A.; Rheinfurth, M. H.

    1972-01-01

    Space shuttle design verification requires dynamic data from full scale structural component and assembly tests. Wind tunnel and other scaled model tests are also required early in the development program to support the analytical models used in design verification. Presented is a design philosophy based on mathematical modeling of the structural system strongly supported by a comprehensive test program; some of the types of required tests are outlined.

  13. SU-E-T-602: Patient-Specific Online Dose Verification Based On Transmission Detector Measurements

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Thoelking, J; Yuvaraj, S; Jens, F

    Purpose: Intensity modulated radiotherapy requires a comprehensive quality assurance program in general and, ideally, independent verification of dose delivery. Since conventional 2D detector arrays allow only pre-treatment verification, there is a debate concerning the need for online dose verification. This study presents the clinical performance, including dosimetric plan verification in 2D as well as in 3D, and the error detection abilities of a new transmission detector (TD) for online dose verification of a 6 MV photon beam. Methods: To validate the dosimetric performance of the new device, dose reconstructions based on TD measurements were compared to a conventional pre-treatment verification method (reference) and to the treatment planning system (TPS) for 18 IMRT and VMAT treatment plans. Furthermore, dose reconstruction inside the patient based on TD read-out was evaluated by comparing various dose-volume indices and 3D gamma evaluations against independent dose computation and the TPS. To investigate the sensitivity of the new device, different types of systematic and random errors in leaf positions and linac output were introduced into IMRT treatment sequences. Results: The 2D gamma index evaluation of transmission detector based dose reconstruction showed excellent agreement for all IMRT and VMAT plans compared to reference measurements (99.3±1.2)% and TPS (99.1±0.7)%. Good agreement was also obtained for 3D dose reconstruction based on TD read-out compared to dose computation (mean gamma value of PTV = 0.27±0.04). Only a minimal dose underestimation within the target volume was observed when analyzing DVH indices (<1%). Positional errors in leaf banks larger than 1 mm and errors in linac output larger than 2% could clearly be identified with the TD. Conclusion: Since 2D and 3D evaluations for all IMRT and VMAT treatment plans were in excellent agreement with reference measurements and dose computation, the new TD is suitable for routine treatment plan verification. Funding Support, Disclosures, and Conflict of Interest: COIs: Frank Lohr: Elekta: research grant, travel grants, teaching honoraria; IBA: research grant, travel grants, teaching honoraria, advisory board; C-Rad: board honoraria, travel grants. Frederik Wenz: Elekta: research grant, teaching honoraria, consultant, advisory board; Zeiss: research grant, teaching honoraria, patent. Hansjoerg Wertz: Elekta: research grant, teaching honoraria; IBA: research grant.
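
    For reference, the 2D gamma evaluation used above has a compact brute-force form. The sketch below is our illustration with assumed 3%/3 mm criteria and a 10% low-dose cutoff (parameters the abstract does not spell out); it compares a reference dose plane against a reconstructed one and reports the percentage of points with gamma <= 1.

        # Brute-force global 2D gamma evaluation (illustrative, not optimized)
        import numpy as np

        def gamma_pass_rate(ref, eva, spacing_mm, dose_tol=0.03, dta_mm=3.0, cutoff=0.1):
            norm = ref.max()
            ys, xs = np.mgrid[0:ref.shape[0], 0:ref.shape[1]].astype(float) * spacing_mm
            gammas = []
            for (i, j), d_ref in np.ndenumerate(ref):
                if d_ref < cutoff * norm:          # skip the low-dose region
                    continue
                dist2 = ((ys - i * spacing_mm) ** 2 +
                         (xs - j * spacing_mm) ** 2) / dta_mm ** 2
                dose2 = ((eva - d_ref) / (dose_tol * norm)) ** 2
                gammas.append(np.sqrt(dist2 + dose2).min())
            return 100.0 * np.mean(np.array(gammas) <= 1.0)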

  14. Online 3D EPID-based dose verification: Proof of concept.

    PubMed

    Spreeuw, Hanno; Rozendaal, Roel; Olaciregui-Ruiz, Igor; González, Patrick; Mans, Anton; Mijnheer, Ben; van Herk, Marcel

    2016-07-01

    Delivery errors during radiotherapy may lead to medical harm and reduced life expectancy for patients. Such serious incidents can be avoided by performing dose verification online, i.e., while the patient is being irradiated, creating the possibility of halting the linac in case of a large overdosage or underdosage. The offline EPID-based 3D in vivo dosimetry system clinically employed at our institute is in principle suited for online treatment verification, provided the system is able to complete 3D dose reconstruction and verification within 420 ms, the present acquisition time of a single EPID frame. It is the aim of this study to show that our EPID-based dosimetry system can be made fast enough to achieve online 3D in vivo dose verification. The current dose verification system was sped up in two ways. First, a new software package was developed to perform all computations that are not dependent on portal image acquisition separately, thus removing the need for doing these calculations in real time. Second, the 3D dose reconstruction algorithm was sped up via a new, multithreaded implementation. Dose verification was implemented by comparing planned with reconstructed 3D dose distributions delivered to two regions in a patient: the target volume and the nontarget volume receiving at least 10 cGy. In both volumes, the mean dose is compared, while in the nontarget volume, the near-maximum dose (D2) is compared as well. The real-time dosimetry system was tested by irradiating an anthropomorphic phantom with three VMAT plans: a 6 MV head-and-neck treatment plan, a 10 MV rectum treatment plan, and a 10 MV prostate treatment plan. In all plans, two types of serious delivery errors were introduced. The functionality of automatically halting the linac was also implemented and tested. The precomputation time per treatment was ∼180 s/treatment arc, depending on gantry angle resolution. The complete processing of a single portal frame, including dose verification, took 266 ± 11 ms on a dual octocore Intel Xeon E5-2630 CPU running at 2.40 GHz. The introduced delivery errors were detected after 5-10 s irradiation time. A prototype online 3D dose verification tool using portal imaging has been developed and successfully tested for two different kinds of gross delivery errors. Thus, online 3D dose verification has been technologically achieved.
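
    The volume-level comparisons described above reduce to simple operations on masked dose arrays. A sketch (our illustration; the 5% tolerance is an assumed, not reported, action level) of the mean dose and the near-maximum dose D2, taken as the dose exceeded by only the hottest 2% of the volume:

        # Mean dose and near-maximum dose D2 from a 3D dose grid (cGy)
        import numpy as np

        def mean_dose(dose, mask):
            return dose[mask].mean()

        def d2(dose, mask):
            return np.percentile(dose[mask], 98.0)   # D2 = 98th-percentile dose

        def within_tolerance(planned, reconstructed, mask, tol_pct=5.0):
            ref = mean_dose(planned, mask)
            dev = 100.0 * abs(mean_dose(reconstructed, mask) - ref) / ref
            return dev <= tol_pct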

  15. Improved Hip-Based Individual Recognition Using Wearable Motion Recording Sensor

    NASA Astrophysics Data System (ADS)

    Gafurov, Davrondzhon; Bours, Patrick

    In today's society the demand for reliable verification of a user's identity is increasing. Although biometric technologies based on fingerprint or iris can provide accurate and reliable recognition performance, they are inconvenient for periodic or frequent re-verification. In this paper we propose a hip-based user recognition method which can be suitable for implicit and periodic re-verification of identity. In our approach we use a wearable accelerometer sensor attached to the hip of the person, and the measured hip motion signal is then analysed for identity verification purposes. The main analysis steps consist of detecting gait cycles in the signal and matching two sets of detected gait cycles. Evaluating the approach on a hip data set consisting of 400 gait sequences (samples) from 100 subjects, we obtained an equal error rate (EER) of 7.5%, and the identification rate at rank 1 was 81.4%. These numbers are improvements of 37.5% and 11.2%, respectively, over a previous study using the same data set.
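
    The two analysis steps can be approximated compactly. The sketch below is our illustration, not the authors' detector: it assumes a 100 Hz accelerometer and gait-cycle durations between 0.6 s and 2 s, estimates the dominant cycle length from the autocorrelation of the acceleration signal, and then cuts the signal into cycles of that length for subsequent matching.

        # Gait-cycle detection from a hip acceleration signal (illustrative)
        import numpy as np

        def cycle_length(signal, fs=100, min_s=0.6, max_s=2.0):
            x = signal - signal.mean()
            ac = np.correlate(x, x, mode='full')[len(x) - 1:]   # lags 0..n-1
            lo, hi = int(min_s * fs), int(max_s * fs)
            return lo + int(np.argmax(ac[lo:hi]))               # strongest repeat

        def segment_cycles(signal, fs=100):
            n = cycle_length(signal, fs)
            return [signal[i:i + n] for i in range(0, len(signal) - n + 1, n)]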

  16. Improvement and automation of a real-time PCR assay for vaginal fluids.

    PubMed

    De Vittori, E; Giampaoli, S; Barni, F; Baldi, M; Berti, A; Ripani, L; Romano Spica, V

    2016-05-01

    The identification of vaginal fluids is crucial in forensic science. Several molecular protocols based on PCR amplification of mfDNA (microflora DNA) specific for vaginal bacteria are now available. Unfortunately, mfDNA extraction and PCR reactions require manual optimization of several steps. The aim of the present study was to verify a partial automation of vaginal fluid identification using two instruments widely available in forensic laboratories: the EZ1 Advanced robot and the Rotor Gene Q 5Plex HRM. Moreover, taking advantage of the 5-plex thermocycler technology, the ForFluid kit performance was improved by expanding the mfDNA characterization panel with a new bacterial target for vaginal fluids and with an internal positive control (IPC) to monitor PCR inhibition. The results underline the feasibility of a semi-automated extraction of mfDNA using a BioRobot and demonstrate the analytical improvements of the kit. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.

  17. An Efficient Location Verification Scheme for Static Wireless Sensor Networks.

    PubMed

    Kim, In-Hwan; Kim, Bo-Sung; Song, JooSeok

    2017-01-24

    In wireless sensor networks (WSNs), the accuracy of location information is vital to support many interesting applications. Unfortunately, sensors have difficulty in estimating their location when malicious sensors attack the location estimation process. Even though secure localization schemes have been proposed to protect the location estimation process from attacks, they are not enough to eliminate wrong location estimations in some situations. Location verification can be the solution to such situations, or serve as a second line of defense. A problem with most location verification schemes is the explicit involvement of many sensors in the verification process and requirements such as special hardware, a dedicated verifier and a trusted third party, which cause additional communication and computation overhead. In this paper, we propose an efficient location verification scheme for static WSNs called mutually-shared region-based location verification (MSRLV), which reduces those overheads by utilizing the implicit involvement of sensors and eliminating several requirements. In order to achieve this, we use the mutually-shared region between the location claimant and the verifier for location verification. The analysis shows that MSRLV reduces communication overhead by 77% and computation overhead by 92% on average, when compared with other location verification schemes, for a single sensor verification. In addition, simulation results for the verification of the whole network show that MSRLV can detect malicious sensors in over 90% of cases when sensors in the network have five or more neighbors.

  19. Simulation-based MDP verification for leading-edge masks

    NASA Astrophysics Data System (ADS)

    Su, Bo; Syrel, Oleg; Pomerantsev, Michael; Hagiwara, Kazuyuki; Pearman, Ryan; Pang, Leo; Fujimara, Aki

    2017-07-01

    For IC design starts below the 20 nm technology node, the assist features on photomasks shrink well below 60 nm, and the printed patterns of those features on masks written by VSB eBeam writers start to show a large deviation from the mask designs. Traditional geometry-based fracturing starts to show large errors for those small features. As a result, other mask data preparation (MDP) methods have become available and been adopted, such as rule-based Mask Process Correction (MPC), model-based MPC and eventually model-based MDP. The new MDP methods may place shot edges slightly differently from the target to compensate for mask process effects, so that the final patterns on a mask are much closer to the design (which can be viewed as the ideal mask), especially for those assist features. Such an alteration generally produces better masks that are closer to the intended mask design. Traditional XOR-based MDP verification cannot detect problems caused by eBeam effects. Much like model-based OPC verification, which became a necessity for OPC a decade ago, we see the same trend in MDP today. A simulation-based MDP verification solution requires a GPU-accelerated computational geometry engine with simulation capabilities. To have a meaningful simulation-based mask check, a good mask process model is needed. The TrueModel® system is a field-tested physical mask model developed by D2S. The GPU-accelerated D2S Computational Design Platform (CDP) is used to run simulation-based mask checks, as well as model-based MDP. In addition to simulation-based checks such as mask EPE or dose margin, geometry-based rules are also available to detect quality issues such as slivers or CD splits. Dose-margin-related hotspots can also be detected by setting a correct detection threshold. In this paper, we will demonstrate GPU acceleration for geometry processing, and give examples of mask check results and performance data. GPU acceleration is necessary to make simulation-based mask MDP verification acceptable.

  20. Method for secure electronic voting system: face recognition based approach

    NASA Astrophysics Data System (ADS)

    Alim, M. Affan; Baig, Misbah M.; Mehboob, Shahzain; Naseem, Imran

    2017-06-01

    In this paper, we propose a framework for a low-cost secure electronic voting system based on face recognition. Essentially, Local Binary Patterns (LBP) are used for face feature characterization in texture format, followed by chi-square-based image classification. Two parallel systems, based on a smart phone and on web applications, are developed for the face learning and verification modules. The proposed system has two-tier security, using a person ID followed by face verification. A class-specific threshold is associated with controlling the security level of face verification. Our system is evaluated on three standard databases and one real home-based database and achieves satisfactory recognition accuracies. Consequently, our proposed system provides a secure, hassle-free voting system that is less intrusive compared with other biometrics.
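
    The two building blocks named in the abstract, LBP texture features and chi-square comparison, are standard and easy to sketch. The following illustration (not the authors' code) computes a basic 8-neighbour LBP code image, reduces it to a normalized 256-bin histogram, and compares histograms with the chi-square distance that would drive the accept/reject decision against a class-specific threshold.

        # Basic LBP histogram and chi-square distance (illustrative sketch)
        import numpy as np

        def lbp_image(gray):
            c = gray[1:-1, 1:-1]                       # interior pixels
            shifts = [(0, 0), (0, 1), (0, 2), (1, 2),
                      (2, 2), (2, 1), (2, 0), (1, 0)]  # 8 neighbours, clockwise
            code = np.zeros(c.shape, dtype=np.uint8)
            for bit, (dy, dx) in enumerate(shifts):
                nb = gray[dy:dy + c.shape[0], dx:dx + c.shape[1]]
                code |= (nb >= c).astype(np.uint8) << bit
            return code

        def lbp_histogram(gray):
            h = np.bincount(lbp_image(gray).ravel(), minlength=256).astype(float)
            return h / h.sum()

        def chi_square(h1, h2, eps=1e-10):
            return 0.5 * np.sum((h1 - h2) ** 2 / (h1 + h2 + eps))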

  1. Reachability analysis of real-time systems using time Petri nets.

    PubMed

    Wang, J; Deng, Y; Xu, G

    2000-01-01

    Time Petri nets (TPNs) are a popular Petri net model for specification and verification of real-time systems. A fundamental and most widely applied method for analyzing Petri nets is reachability analysis. The existing technique for reachability analysis of TPNs, however, is not suitable for timing property verification because one cannot derive end-to-end delay in task execution, an important issue for time-critical systems, from the reachability tree constructed using the technique. In this paper, we present a new reachability based analysis technique for TPNs for timing property analysis and verification that effectively addresses the problem. Our technique is based on a concept called clock-stamped state class (CS-class). With the reachability tree generated based on CS-classes, we can directly compute the end-to-end time delay in task execution. Moreover, a CS-class can be uniquely mapped to a traditional state class based on which the conventional reachability tree is constructed. Therefore, our CS-class-based analysis technique is more general than the existing technique. We show how to apply this technique to timing property verification of the TPN model of a command and control (C2) system.

  2. Marker-based quantification of interfractional tumor position variation and the use of markers for setup verification in radiation therapy for esophageal cancer.

    PubMed

    Jin, Peng; van der Horst, Astrid; de Jong, Rianne; van Hooft, Jeanin E; Kamphuis, Martijn; van Wieringen, Niek; Machiels, Melanie; Bel, Arjan; Hulshof, Maarten C C M; Alderliesten, Tanja

    2015-12-01

    The aim of this study was to quantify interfractional esophageal tumor position variation using markers and investigate the use of markers for setup verification. Sixty-five markers placed in the tumor volumes of 24 esophageal cancer patients were identified in computed tomography (CT) and follow-up cone-beam CT. For each patient we calculated pairwise distances between markers over time to evaluate geometric tumor volume variation. We then quantified marker displacements relative to bony anatomy and estimated the variation of systematic (Σ) and random errors (σ). During bony anatomy-based setup verification, we visually inspected whether the markers were inside the planning target volume (PTV) and attempted marker-based registration. Minor time trends with substantial fluctuations in pairwise distances implied tissue deformation. Overall, Σ(σ) in the left-right/cranial-caudal/anterior-posterior direction was 2.9(2.4)/4.1(2.4)/2.2(1.8) mm; for the proximal stomach, it was 5.4(4.3)/4.9(3.2)/1.9(2.4) mm. After bony anatomy-based setup correction, all markers were inside the PTV. However, due to large tissue deformation, marker-based registration was not feasible. Generally, the interfractional position variation of esophageal tumors is more pronounced in the cranial-caudal direction and in the proximal stomach. Currently, marker-based setup verification is not feasible for clinical routine use, but markers can facilitate the setup verification by inspecting whether the PTV covers the tumor volume adequately. Copyright © 2015 Elsevier Ireland Ltd. All rights reserved.

  3. Offline signature verification using convolution Siamese network

    NASA Astrophysics Data System (ADS)

    Xing, Zi-Jian; Yin, Fei; Wu, Yi-Chao; Liu, Cheng-Lin

    2018-04-01

    This paper presents an offline signature verification approach using a convolutional Siamese neural network. Unlike existing methods, which treat feature extraction and metric learning as two independent stages, we adopt a deep-learning-based framework which combines the two stages and can be trained end-to-end. The experimental results on two public offline databases (GPDSsynthetic and CEDAR) demonstrate the superiority of our method on the offline signature verification problem.

  4. QPF verification using different radar-based analyses: a case study

    NASA Astrophysics Data System (ADS)

    Moré, J.; Sairouni, A.; Rigo, T.; Bravo, M.; Mercader, J.

    2009-09-01

    Verification of QPF in NWP models has always been challenging, not only in choosing which scores best quantify a particular skill of a model, but also in choosing the most appropriate methodology when comparing forecasts with observations. On the one hand, an objective verification technique can provide conclusions that are not in agreement with those obtained by the "eyeball" method; consequently, QPF can provide valuable information to forecasters in spite of having poor scores. On the other hand, there are difficulties in knowing the "truth", so different results can be achieved depending on the procedures used to obtain the precipitation analysis. The aim of this study is to show the importance of combining different precipitation analyses and verification methodologies to obtain a better knowledge of the skills of a forecasting system. In particular, a short-range precipitation forecasting system based on MM5 at 12 km coupled with LAPS is studied for a local convective precipitation event that took place in the NE Iberian Peninsula on October 3rd, 2008. For this purpose, a variety of verification methods (dichotomous, recalibration and object-oriented methods) are used to verify this case study. At the same time, different precipitation analyses, obtained by interpolating radar data using different techniques, are used in the verification process.
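
    The dichotomous methods mentioned above reduce verification to a 2x2 contingency table of hits, false alarms, misses and correct negatives for a given precipitation threshold. A minimal sketch of the usual scores derived from such a table:

        # Standard dichotomous verification scores from a contingency table
        def pod(hits, false_alarms, misses):        # probability of detection
            return hits / (hits + misses)

        def far(hits, false_alarms, misses):        # false alarm ratio
            return false_alarms / (hits + false_alarms)

        def csi(hits, false_alarms, misses):        # critical success index
            return hits / (hits + false_alarms + misses)

        def freq_bias(hits, false_alarms, misses):  # frequency bias
            return (hits + false_alarms) / (hits + misses)

        # Example: 30 hits, 10 false alarms, 15 misses
        print(pod(30, 10, 15), far(30, 10, 15), csi(30, 10, 15))
        # 0.666..., 0.25, 0.545...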

  5. Bayesian Estimation of Combined Accuracy for Tests with Verification Bias

    PubMed Central

    Broemeling, Lyle D.

    2011-01-01

    This presentation will emphasize the estimation of the combined accuracy of two or more tests when verification bias is present. Verification bias occurs when some of the subjects are not subjected to the gold standard. The approach is Bayesian, where the estimation of test accuracy is based on the posterior distribution of the relevant parameter. The accuracy of two combined binary tests is estimated employing either the “believe the positive” or the “believe the negative” rule; the true and false positive fractions for each rule are then computed for the two tests. In order to perform the analysis, the missing-at-random assumption is imposed, and an interesting example is provided by estimating the combined accuracy of CT and MRI to diagnose lung cancer. The Bayesian approach is extended to two ordinal tests when verification bias is present, and the accuracy of the combined tests is based on the ROC area of the risk function. An example involving mammography with two readers with extreme verification bias illustrates the estimation of the combined test accuracy for ordinal tests. PMID:26859487
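
    The two combination rules have a simple closed form if, purely for illustration, the two binary tests are assumed conditionally independent given disease status (an assumption made here for the sketch; the Bayesian treatment above does not require it). With s denoting each test's true positive fraction and f its false positive fraction:

        # Combined accuracy under assumed conditional independence (sketch)
        def believe_the_positive(s1, f1, s2, f2):
            # combined test is positive if either test is positive
            return s1 + s2 - s1 * s2, f1 + f2 - f1 * f2

        def believe_the_negative(s1, f1, s2, f2):
            # combined test is positive only if both tests are positive
            return s1 * s2, f1 * f2

        print(believe_the_positive(0.80, 0.10, 0.70, 0.05))  # (0.94, 0.145)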

  6. Verification of road databases using multiple road models

    NASA Astrophysics Data System (ADS)

    Ziems, Marcel; Rottensteiner, Franz; Heipke, Christian

    2017-08-01

    In this paper a new approach for automatic road database verification based on remote sensing images is presented. In contrast to existing methods, the applicability of the new approach is not restricted to specific road types, context areas or geographic regions. This is achieved by combining several state-of-the-art road detection and road verification approaches that work well under different circumstances. Each one serves as an independent module representing a unique road model and a specific processing strategy. All modules provide independent solutions for the verification problem of each road object stored in the database in the form of two probability distributions, the first one for the state of a database object (correct or incorrect), and the second one for the state of the underlying road model (applicable or not applicable). In accordance with the Dempster-Shafer theory, both distributions are mapped to a new state space comprising the classes correct, incorrect and unknown. Statistical reasoning is applied to obtain the optimal state of a road object. A comparison with state-of-the-art road detection approaches using benchmark datasets shows that in general the proposed approach provides results with greater completeness. Additional experiments reveal that, based on the proposed method, a highly reliable semi-automatic approach for road database verification can be designed.
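
    The evidence-combination step can be made concrete. In the hedged sketch below, each module's two outputs are recast as a single basic mass assignment over the frame {correct, incorrect}, with the mass on the whole frame playing the role of 'unknown' (our simplification of the paper's mapping); Dempster's rule then fuses the evidence from two modules.

        # Dempster's rule over the frame {correct (C), incorrect (I)}; mass on
        # 'CI' (the whole frame) represents "unknown"/inapplicable road models.
        def dempster(m1, m2):
            sets = {'C': {'C'}, 'I': {'I'}, 'CI': {'C', 'I'}}
            combined = {'C': 0.0, 'I': 0.0, 'CI': 0.0}
            conflict = 0.0
            for k1, v1 in m1.items():
                for k2, v2 in m2.items():
                    inter = sets[k1] & sets[k2]
                    if not inter:
                        conflict += v1 * v2          # contradictory evidence
                    else:
                        key = 'CI' if inter == {'C', 'I'} else inter.pop()
                        combined[key] += v1 * v2
            return {k: v / (1.0 - conflict) for k, v in combined.items()}

        # Module 1 fairly sure the road is correct; module 2 mostly inapplicable
        print(dempster({'C': 0.7, 'I': 0.1, 'CI': 0.2},
                       {'C': 0.2, 'I': 0.1, 'CI': 0.7}))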

  7. Environmental Technology Verification: Test Report of Mobile Source Selective Catalytic Reduction--Nett Technologies, Inc., BlueMAX 100 version A urea-based selective catalytic reduction technology

    EPA Science Inventory

    Nett Technologies’ BlueMAX 100 version A Urea-Based SCR System utilizes a zeolite catalyst coating on a cordierite honeycomb substrate for heavy-duty diesel nonroad engines for use with commercial ultra-low–sulfur diesel fuel. This environmental technology verification (ETV) repo...

  8. Face verification system for Android mobile devices using histogram based features

    NASA Astrophysics Data System (ADS)

    Sato, Sho; Kobayashi, Kazuhiro; Chen, Qiu

    2016-07-01

    This paper proposes a face verification system that runs on Android mobile devices. In this system, a facial image is first captured by a built-in camera on the Android device, and then face detection is implemented using Haar-like features and the AdaBoost learning algorithm. The proposed system verifies the detected face using histogram-based features, which are generated by a binary Vector Quantization (VQ) histogram using DCT coefficients in low-frequency domains, as well as an Improved Local Binary Pattern (Improved LBP) histogram in the spatial domain. Verification results with different types of histogram-based features are first obtained separately and then combined by weighted averaging. We evaluate the proposed algorithm using the publicly available ORL database and facial images captured by an Android tablet.

  9. A service-oriented architecture for integrating the modeling and formal verification of genetic regulatory networks

    PubMed Central

    2009-01-01

    Background The study of biological networks has led to the development of increasingly large and detailed models. Computer tools are essential for the simulation of the dynamical behavior of the networks from the model. However, as the size of the models grows, it becomes infeasible to manually verify the predictions against experimental data or identify interesting features in a large number of simulation traces. Formal verification based on temporal logic and model checking provides promising methods to automate and scale the analysis of the models. However, a framework that tightly integrates modeling and simulation tools with model checkers is currently missing, on both the conceptual and the implementational level. Results We have developed a generic and modular web service, based on a service-oriented architecture, for integrating the modeling and formal verification of genetic regulatory networks. The architecture has been implemented in the context of the qualitative modeling and simulation tool GNA and the model checkers NUSMV and CADP. GNA has been extended with a verification module for the specification and checking of biological properties. The verification module also allows the display and visual inspection of the verification results. Conclusions The practical use of the proposed web service is illustrated by means of a scenario involving the analysis of a qualitative model of the carbon starvation response in E. coli. The service-oriented architecture allows modelers to define the model and proceed with the specification and formal verification of the biological properties by means of a unified graphical user interface. This guarantees a transparent access to formal verification technology for modelers of genetic regulatory networks. PMID:20042075

  10. Electrochemical impedance spectroscopy based-on interferon-gamma detection

    NASA Astrophysics Data System (ADS)

    Li, Guan-Wei; Kuo, Yi-Ching; Tsai, Pei-I.; Lee, Chih-Kung

    2014-03-01

    Tuberculosis (TB) is an ancient disease that has constituted a long-term menace to public health. According to the World Health Organization (WHO), Mycobacterium tuberculosis (MTB) has infected nearly a third of the people of the world, and there is about one new TB occurrence every second. Interferon-gamma (IFN-γ) is associated with susceptibility to TB, and interferon-gamma release assays (IGRA) are considered the best alternative to the tuberculin skin test (TST) for diagnosis of latent tuberculosis infection (LTBI). Although significant progress has been made with regard to the design of enzyme immunoassays for IFN-γ, this assay is still labor-intensive and time-consuming. To alleviate these drawbacks, we used an IFN-γ antibody to facilitate the detection of IFN-γ. An experimental verification of the performance of IGRA was done in this research. We developed two biosensor configurations, both of which offer high sensitivity, specificity, and rapid IFN-γ diagnosis. The first is an electrochemical method. The second is a circular polarization interferometry configuration, which incorporates two light beams with p-polarization and s-polarization states individually along a common path and a four-photo-detector quadrature configuration to arrive at a phase-modulated ellipsometer. With these two methods, the interaction between IFN-γ antibody and IFN-γ was explored and is presented in detail.

  11. YIP Formal Synthesis of Software-Based Control Protocols for Fractionated,Composable Autonomous Systems

    DTIC Science & Technology

    2016-07-08

    Systems Using Automata Theory and Barrier Certificates: We developed a sound but incomplete method for the computational verification of specifications ... The method merges ideas from automata-based model checking with those from control theory, including so-called barrier certificates and optimization-based ... "Automata theory meets barrier certificates: Temporal logic verification of nonlinear systems," IEEE Transactions on Automatic Control, 2015. [J2] R...

  12. Genotoxicity assessment of nanomaterials: recommendations on best practices, assays and methods.

    PubMed

    Elespuru, Rosalie; Pfuhler, Stefan; Aardema, Marilyn; Chen, Tao; Doak, Shareen H; Doherty, Ann; Farabaugh, Christopher S; Kenny, Julia; Manjanatha, Mugimane; Mahadevan, Brinda; Moore, Martha M; Ouédraogo, Gladys; Stankowski, Leon F; Tanir, Jennifer Y

    2018-04-26

    Nanomaterials (NMs) present unique challenges in safety evaluation. An international working group, the Genetic Toxicology Technical Committee of the International Life Sciences Institute's Health and Environmental Sciences Institute, has addressed issues related to the genotoxicity assessment of NMs. A critical review of published data has been followed by recommendations on methods alterations and best practices for the standard genotoxicity assays: bacterial reverse mutation (Ames); in vitro mammalian assays for mutations, chromosomal aberrations, micronucleus induction, or DNA strand breaks (comet); and in vivo assays for genetic damage (micronucleus, comet and transgenic mutation assays). The analysis found a great diversity of tests and systems used for in vitro assays; many did not meet criteria for a valid test, and/or did not use validated cells and methods in the Organization for Economic Co-operation and Development Test Guidelines, and so these results could not be interpreted. In vivo assays were less common but better performed. It was not possible to develop conclusions on test system agreement, NM activity, or mechanism of action. However, the limited responses observed for most NMs were consistent with indirect genotoxic effects, rather than direct interaction of NMs with DNA. We propose a revised genotoxicity test battery for NMs that includes in vitro mammalian cell mutagenicity and clastogenicity assessments; in vivo assessments would be added only if warranted by information on specific organ exposure or sequestration of NMs. The bacterial assays are generally uninformative for NMs due to limited particle uptake and possible lack of mechanistic relevance, and are thus omitted in our recommended test battery for NM assessment. Recommendations include NM characterization in the test medium, verification of uptake into target cells, and limited assay-specific methods alterations to avoid interference with uptake or endpoint analysis. These recommendations are summarized in a Roadmap guideline for testing.

  13. Establishing and Monitoring an Aseptic Workspace for Building the MOMA Mass Spectrometer

    NASA Technical Reports Server (NTRS)

    Lalime, Erin

    2016-01-01

    Mars Organic Molecule Analyzer (MOMA) is an instrument suite on the ESA ExoMars 2018 Rover, and the Mass Spectrometer (MOMA-MS) is being built at Goddard Space Flight Center (GSFC). MOMA-MS is a life-detection instrument and thus falls in the most stringent category of Planetary Protection (PP) biological cleanliness requirements: fewer than 0.03 spores/m2 are allowed in the instrument sample path. In order to meet these PP requirements, MOMA-MS must be built and maintained in a low-bioburden environment. The MOMA-MS project at GSFC maintains three cleanrooms with varying levels of bioburden control. The Aseptic Assembly Cleanroom has the highest level of control, applying three different bioburden-reducing methods: 70% IPA, 7.5% hydrogen peroxide, and ultraviolet-C light. The three methods are used in rotation, and each kills microbes by a different mechanism, reducing the likelihood of microorganisms developing resistance to all three. The Integration and Mars Chamber Cleanrooms use less biocidal cleaning, with the option to deploy extra techniques as necessary. To support the monitoring of cleanrooms and verification that MOMA-MS hardware meets PP requirements, a new Planetary Protection lab was established that currently has the capabilities of standard growth assays for spores or vegetative bacteria, rapid bioburden analysis that detects adenosine triphosphate (ATP), plus autoclave and DHMR verification. The cleanrooms are monitored both for vegetative microorganisms and by rapid ATP assay, and a clear difference in bioburden is observed between the aseptic and the other cleanrooms.

  14. Verification and Validation of KBS with Neural Network Components

    NASA Technical Reports Server (NTRS)

    Wen, Wu; Callahan, John

    1996-01-01

    Artificial Neural Networks (ANNs) play an important role in developing robust Knowledge Based Systems (KBS). The ANN-based components used in these systems learn to give appropriate predictions through training with correct input-output data patterns. Unlike a traditional KBS that depends on a rule database and a production engine, the ANN-based system mimics the decisions of an expert without specifically formulating if-then type rules. In fact, ANNs demonstrate their superiority when such if-then type rules are hard for a human expert to generate. Verification of a traditional knowledge based system is based on proof of consistency and completeness of the rule knowledge base and correctness of the production engine. These techniques, however, cannot be directly applied to ANN-based components. In this position paper, we propose a verification and validation procedure for KBS with ANN-based components. The essence of the procedure is to obtain an accurate system specification through incremental modification of the specifications using an ANN rule extraction algorithm.

  15. RF model of the distribution system as a communication channel, phase 2. Volume 1: Summary Report

    NASA Technical Reports Server (NTRS)

    Rustay, R. C.; Gajjar, J. T.; Rankin, R. W.; Wentz, R. C.; Wooding, R.

    1982-01-01

    The design, implementation, and verification of a computerized model for predicting the steady-state sinusoidal response of radial (tree) configured distribution feeders was undertaken. That work demonstrated the feasibility and validity based on verification measurements made on a limited size portion of an actual live feeder. On that basis a follow-on effort concerned with (1) extending the verification based on a greater variety of situations and network size, (2) extending the model capabilities for reverse direction propagation, (3) investigating parameter sensitivities, (4) improving transformer models, and (5) investigating procedures/fixes for ameliorating propagation trouble spots was conducted. Results are summarized.

  16. Applications of a Fast Neutron Detector System to Verification of Special Nuclear Materials

    NASA Astrophysics Data System (ADS)

    Mayo, Douglas R.; Byrd, Roger C.; Ensslin, Norbert; Krick, Merlyn S.; Mercer, David J.; Miller, Michael C.; Prettyman, Thomas H.; Russo, Phyllis A.

    1998-04-01

    An array of boron-loaded plastic optically coupled to bismuth germanate scintillators has been developed to detect neutrons for measurement of special nuclear materials. The phoswich detection system has the advantage of high neutron detection efficiency and short die-away time. This is achieved by mixing the moderator (plastic) and the detector (^10B) at the molecular level. Simulations indicate that the neutron capture probabilities equal or exceed those of current thermal neutron multiplicity techniques, which keep the moderator (polyethylene) and detectors (^3He gas proportional tubes) macroscopically separate. Experiments have been performed to characterize the response of these detectors and validate computer simulations. The fast neutron detection system may be applied to the quantitative assay of plutonium in high (α,n) backgrounds, with emphasis on safeguards and environmental scenarios. An additional application of the instrument, in a non-quantitative mode, has been tested for possible verification activities involving dismantlement of nuclear weapons. A description of the detector system, simulations, and preliminary data will be presented.

  17. A high-throughput next-generation sequencing-based method for detecting the mutational fingerprint of carcinogens

    PubMed Central

    Besaratinia, Ahmad; Li, Haiqing; Yoon, Jae-In; Zheng, Albert; Gao, Hanlin; Tommasi, Stella

    2012-01-01

    Many carcinogens leave a unique mutational fingerprint in the human genome. These mutational fingerprints manifest as specific types of mutations often clustering at certain genomic loci in tumor genomes from carcinogen-exposed individuals. To develop a high-throughput method for detecting the mutational fingerprint of carcinogens, we have devised a cost-, time- and labor-effective strategy, in which the widely used transgenic Big Blue® mouse mutation detection assay is made compatible with the Roche/454 Genome Sequencer FLX Titanium next-generation sequencing technology. As proof of principle, we have used this novel method to establish the mutational fingerprints of three prominent carcinogens with varying mutagenic potencies, including sunlight ultraviolet radiation, 4-aminobiphenyl and secondhand smoke that are known to be strong, moderate and weak mutagens, respectively. For verification purposes, we have compared the mutational fingerprints of these carcinogens obtained by our newly developed method with those obtained by parallel analyses using the conventional low-throughput approach, that is, standard mutation detection assay followed by direct DNA sequencing using a capillary DNA sequencer. We demonstrate that this high-throughput next-generation sequencing-based method is highly specific and sensitive to detect the mutational fingerprints of the tested carcinogens. The method is reproducible, and its accuracy is comparable with that of the currently available low-throughput method. In conclusion, this novel method has the potential to move the field of carcinogenesis forward by allowing high-throughput analysis of mutations induced by endogenous and/or exogenous genotoxic agents. PMID:22735701

  19. A verification library for multibody simulation software

    NASA Technical Reports Server (NTRS)

    Kim, Sung-Soo; Haug, Edward J.; Frisch, Harold P.

    1989-01-01

    A multibody dynamics verification library that maintains and manages test and validation data is proposed, based on RRC robot arm and CASE backhoe validation and on a comparative study of DADS, DISCOS, and CONTOPS, which are existing public-domain and commercial multibody dynamics simulation programs. Using simple representative problems, simulation results from each program are cross-checked, and the validation results are presented. Functionalities of the verification library are defined in order to automate the validation procedure.

  20. WE-DE-201-11: Sensitivity and Specificity of Verification Methods Based On Total Reference Air Kerma (TRAK) Or On User Provided Dose Points for Graphically Planned Skin HDR Brachytherapy

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Damato, A; Devlin, P; Bhagwat, M

    Purpose: To investigate the sensitivity and specificity of a novel verification methodology for image-guided skin HDR brachytherapy plans using a TRAK-based reasonableness test, compared to a typical manual verification methodology. Methods: Two methodologies were used to flag treatment plans necessitating additional review due to a potential discrepancy of 3 mm between planned dose and clinical target in the skin. Manual verification was used to calculate the discrepancy between the average dose to points positioned at time of planning, representative of the prescribed depth, and the expected prescription dose. Automatic verification was used to calculate the discrepancy between the TRAK of the clinical plan and its expected value, which was calculated using standard plans with varying curvatures, ranging from flat to cylindrically circumferential. A plan was flagged if a discrepancy >10% was observed. Sensitivity and specificity were calculated using as a criterion for a true positive that >10% of plan dwells had a distance to prescription dose >1 mm different from prescription depth (3 mm + size of applicator). All HDR image-based skin brachytherapy plans treated at our institution in 2013 were analyzed. Results: 108 surface applicator plans to treat skin of the face, scalp, limbs, feet, hands or abdomen were analyzed. The median number of catheters was 19 (range, 4 to 71) and the median number of dwells was 257 (range, 20 to 1100). Sensitivity/specificity were 57%/78% for manual and 70%/89% for automatic verification. Conclusion: A check based on the expected TRAK value is feasible for irregularly shaped, image-guided skin HDR brachytherapy. This test yielded higher sensitivity and specificity than a test based on the identification of representative points, and can be implemented with a dedicated calculation code or with pre-calculated lookup tables of ideally shaped, uniform surface applicators.

  1. Verification of EPA's " Preliminary remediation goals for radionuclides" (PRG) electronic calculator

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Stagich, B. H.

    The U.S. Environmental Protection Agency (EPA) requested an external, independent verification study of their “Preliminary Remediation Goals for Radionuclides” (PRG) electronic calculator. The calculator provides information on establishing PRGs for radionuclides at Comprehensive Environmental Response, Compensation, and Liability Act (CERCLA) sites with radioactive contamination (Verification Study Charge, Background). These risk-based PRGs set concentration limits using carcinogenic toxicity values under specific exposure conditions (PRG User’s Guide, Section 1). The purpose of this verification study is to ascertain that the computer code has no inherent numerical problems with obtaining solutions, as well as to ensure that the equations are programmed correctly.

  2. VERIFYING THE VOC CONTROL PERFORMANCE OF BIOREACTORS

    EPA Science Inventory

    The paper describes the verification testing approach used to collect high-quality, peer-reviewed data on the performance of bioreaction-based technologies for the control of volatile organic compounds (VOCs). The verification protocol that describes the approach for these tests ...

  3. Feasibility of biochemical verification in a web-based smoking cessation study.

    PubMed

    Cha, Sarah; Ganz, Ollie; Cohn, Amy M; Ehlke, Sarah J; Graham, Amanda L

    2017-10-01

    Cogent arguments have been made against the need for biochemical verification in population-based studies with low-demand characteristics. Despite this fact, studies involving digital interventions (low-demand) are often required in peer review to report biochemically verified abstinence. To address this discrepancy, we examined the feasibility and costs of biochemical verification in a web-based study conducted with a national sample. Participants were 600 U.S. adult current smokers who registered on a web-based smoking cessation program and completed surveys at baseline and 3 months. Saliva sampling kits were sent to participants who reported 7-day abstinence at 3 months and were analyzed for cotinine. The response rate at 3 months was 41.2% (n=247): 93 participants reported 7-day abstinence (38%) and were mailed a saliva kit (71% returned). The discordance rate was 36.4%. Participants with discordant responses were more likely to report 3-month use of nicotine replacement therapy or e-cigarettes than those with concordant responses (79.2% vs. 45.2%, p=0.007). The total cost of saliva sampling was $8280 ($125/sample). Biochemical verification was both time- and cost-intensive, and yielded a relatively small number of samples due to low response rates and use of other nicotine products during the follow-up period. There was a high rate of discordance between self-reported abstinence and saliva testing. Costs for data collection may be prohibitive for studies with large sample sizes or limited budgets. Our findings echo previous statements that biochemical verification is not necessary in population-based studies, and add evidence specific to technology-based studies. Copyright © 2017 Elsevier Ltd. All rights reserved.

  4. Comparison between In-house developed and Diamond commercial software for patient specific independent monitor unit calculation and verification with heterogeneity corrections.

    PubMed

    Kuppusamy, Vijayalakshmi; Nagarajan, Vivekanandan; Jeevanandam, Prakash; Murugan, Lavanya

    2016-02-01

    The study aimed to compare two different monitor unit (MU) or dose verification software packages in volumetric modulated arc therapy (VMAT) using a modified Clarkson's integration technique for 6 MV photon beams. An in-house Excel spreadsheet-based monitor unit verification calculation (MUVC) program and PTW's DIAMOND secondary check software (SCS), version 6, were used as a secondary check to verify the monitor units (MU) or dose calculated by the treatment planning system (TPS). In this study 180 patients were grouped into 61 head and neck, 39 thorax and 80 pelvic sites. Verification plans were created using the PTW OCTAVIUS-4D phantom and measured using the 729 detector chamber array, with the isocentre as the point of measurement for each field. In the analysis of 154 clinically approved VMAT plans with isocentre at a region above -350 HU, using heterogeneity corrections, the in-house spreadsheet-based MUVC program and Diamond SCS showed good agreement with the TPS. The overall percentage average deviations for all sites were (-0.93% ± 1.59%) and (1.37% ± 2.72%) for the in-house spreadsheet-based MUVC program and Diamond SCS, respectively. For 26 clinically approved VMAT plans with isocentre at a region below -350 HU, both the in-house spreadsheet-based MUVC program and Diamond SCS showed higher variations. It can be concluded that for patient-specific quality assurance (QA), the in-house Excel spreadsheet-based MUVC program and Diamond SCS can be used as a simple and fast complement to measurement-based verification for plans with isocentre at a region above -350 HU. Copyright © 2016 Associazione Italiana di Fisica Medica. Published by Elsevier Ltd. All rights reserved.
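
    The deviation statistic reported above is straightforward to reproduce; a minimal sketch (Python) with hypothetical point doses, not the authors' spreadsheet:

        from statistics import mean, stdev

        def percent_deviation(d_tps, d_check):
            """Deviation of the independent check from the TPS dose, in percent."""
            return 100.0 * (d_check - d_tps) / d_tps

        # Hypothetical isocentre point doses (cGy) for four plans:
        tps_doses   = [200.0, 198.5, 201.2, 199.0]
        check_doses = [198.2, 197.9, 200.1, 201.5]

        deviations = [percent_deviation(t, c) for t, c in zip(tps_doses, check_doses)]
        print(f"mean {mean(deviations):+.2f}% ± {stdev(deviations):.2f}% (1 SD)")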

  5. Acoustic-based proton range verification in heterogeneous tissue: simulation studies

    NASA Astrophysics Data System (ADS)

    Jones, Kevin C.; Nie, Wei; Chu, James C. H.; Turian, Julius V.; Kassaee, Alireza; Sehgal, Chandra M.; Avery, Stephen

    2018-01-01

    Acoustic-based proton range verification (protoacoustics) is a potential in vivo technique for determining the Bragg peak position. Previous measurements and simulations have been restricted to homogeneous water tanks. Here, a CT-based simulation method is proposed and applied to a liver and prostate case to model the effects of tissue heterogeneity on the protoacoustic amplitude and time-of-flight range verification accuracy. For the liver case, posterior irradiation with a single proton pencil beam was simulated for detectors placed on the skin. In the prostate case, a transrectal probe measured the protoacoustic pressure generated by irradiation with five separate anterior proton beams. After calculating the proton beam dose deposition, each CT voxel’s material properties were mapped based on Hounsfield Unit values, and thermoacoustically-generated acoustic wave propagation was simulated with the k-Wave MATLAB toolbox. By comparing the simulation results for the original liver CT to homogenized variants, the effects of heterogeneity were assessed. For the liver case, 1.4 cGy of dose at the Bragg peak generated 50 mPa of pressure (13 cm distal), a 2×  lower amplitude than simulated in a homogeneous water tank. Protoacoustic triangulation of the Bragg peak based on multiple detector measurements resulted in 0.4 mm accuracy for a δ-function proton pulse irradiation of the liver. For the prostate case, higher amplitudes are simulated (92-1004 mPa) for closer detectors (<8 cm). For four of the prostate beams, the protoacoustic range triangulation was accurate to  ⩽1.6 mm (δ-function proton pulse). Based on the results, application of protoacoustic range verification to heterogeneous tissue will result in decreased signal amplitudes relative to homogeneous water tank measurements, but accurate range verification is still expected to be possible.
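
    The time-of-flight triangulation step can be illustrated with a least-squares sketch (Python/NumPy) that assumes a uniform speed of sound; the paper's simulations instead propagate waves through CT-derived heterogeneous tissue, so this is a simplification:

        import numpy as np

        def triangulate(detectors, tof, v=1540.0):
            """Least-squares source position from detector positions (m) and times of flight (s)."""
            d = np.asarray(detectors, dtype=float)
            r = v * np.asarray(tof, dtype=float)              # range to each detector
            # Linearize |x - d_i|^2 = r_i^2 by subtracting the first equation:
            A = 2.0 * (d[0] - d[1:])
            b = r[1:]**2 - r[0]**2 - np.sum(d[1:]**2, axis=1) + np.sum(d[0]**2)
            x, *_ = np.linalg.lstsq(A, b, rcond=None)
            return x

        # Hypothetical geometry: four skin detectors, Bragg peak at (0.02, 0.03, 0.13) m.
        source = np.array([0.02, 0.03, 0.13])
        detectors = np.array([[0.1, 0.0, 0.0], [0.0, 0.1, 0.0],
                              [-0.1, 0.0, 0.05], [0.0, -0.1, 0.1]])
        tof = np.linalg.norm(detectors - source, axis=1) / 1540.0
        print(triangulate(detectors, tof))                     # recovers ~ [0.02, 0.03, 0.13]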

  6. A Comparative Study of Two Azimuth Based Non Standard Location Methods

    DTIC Science & Technology

    2017-03-23

    Jih, Rongsong, U.S. Department of State / Arms Control, Verification, and Compliance Bureau, 2201 C Street, NW, Washington, DC. The so-called “Yin Zhong Xian” (“引中线” in Chinese) algorithm, hereafter the YZX method, is an Oriental version of the IPB-based procedure. ...

  7. Online 3D EPID-based dose verification: Proof of concept

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Spreeuw, Hanno; Rozendaal, Roel

    Purpose: Delivery errors during radiotherapy may lead to medical harm and reduced life expectancy for patients. Such serious incidents can be avoided by performing dose verification online, i.e., while the patient is being irradiated, creating the possibility of halting the linac in case of a large overdosage or underdosage. The offline EPID-based 3D in vivo dosimetry system clinically employed at our institute is in principle suited for online treatment verification, provided the system is able to complete 3D dose reconstruction and verification within 420 ms, the present acquisition time of a single EPID frame. It is the aim of this study to show that our EPID-based dosimetry system can be made fast enough to achieve online 3D in vivo dose verification. Methods: The current dose verification system was sped up in two ways. First, a new software package was developed to perform all computations that are not dependent on portal image acquisition separately, thus removing the need for doing these calculations in real time. Second, the 3D dose reconstruction algorithm was sped up via a new, multithreaded implementation. Dose verification was implemented by comparing planned with reconstructed 3D dose distributions delivered to two regions in a patient: the target volume and the nontarget volume receiving at least 10 cGy. In both volumes, the mean dose is compared, while in the nontarget volume, the near-maximum dose (D2) is compared as well. The real-time dosimetry system was tested by irradiating an anthropomorphic phantom with three VMAT plans: a 6 MV head-and-neck treatment plan, a 10 MV rectum treatment plan, and a 10 MV prostate treatment plan. In all plans, two types of serious delivery errors were introduced. The functionality of automatically halting the linac was also implemented and tested. Results: The precomputation time per treatment was ∼180 s/treatment arc, depending on gantry angle resolution. The complete processing of a single portal frame, including dose verification, took 266 ± 11 ms on a dual octocore Intel Xeon E5-2630 CPU running at 2.40 GHz. The introduced delivery errors were detected after 5–10 s irradiation time. Conclusions: A prototype online 3D dose verification tool using portal imaging has been developed and successfully tested for two different kinds of gross delivery errors. Thus, online 3D dose verification has been technologically achieved.
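
    A minimal sketch (Python/NumPy) of the per-frame comparison logic described above; the 10% action level and function names are illustrative assumptions, not the clinical settings:

        import numpy as np

        def near_max_d2(dose):
            """Near-maximum dose D2: dose to the hottest 2% of voxels."""
            return np.percentile(dose, 98)

        def frame_ok(planned, reconstructed, target, nontarget, tol=0.10):
            """Return False (halt the linac) if any compared quantity deviates by more than tol."""
            pairs = [
                (planned[target].mean(),          reconstructed[target].mean()),
                (planned[nontarget].mean(),       reconstructed[nontarget].mean()),
                (near_max_d2(planned[nontarget]), near_max_d2(reconstructed[nontarget])),
            ]
            return all(abs(rec - plan) <= tol * plan for plan, rec in pairs)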

  8. Virtual Distances Methodology as Verification Technique for AACMMs with a Capacitive Sensor Based Indexed Metrology Platform

    PubMed Central

    Acero, Raquel; Santolaria, Jorge; Brau, Agustin; Pueo, Marcos

    2016-01-01

    This paper presents a new verification procedure for articulated arm coordinate measuring machines (AACMMs) together with a capacitive sensor-based indexed metrology platform (IMP) based on the generation of virtual reference distances. The novelty of this procedure lies in the possibility of creating virtual points, virtual gauges and virtual distances through the indexed metrology platform’s mathematical model taking as a reference the measurements of a ball bar gauge located in a fixed position of the instrument’s working volume. The measurements are carried out with the AACMM assembled on the IMP from the six rotating positions of the platform. In this way, an unlimited number and types of reference distances could be created without the need of using a physical gauge, therefore optimizing the testing time, the number of gauge positions and the space needed in the calibration and verification procedures. Four evaluation methods are presented to assess the volumetric performance of the AACMM. The results obtained proved the suitability of the virtual distances methodology as an alternative procedure for verification of AACMMs using the indexed metrology platform. PMID:27869722

  10. Deductive Evaluation: Implicit Code Verification With Low User Burden

    NASA Technical Reports Server (NTRS)

    Di Vito, Ben L.

    2016-01-01

    We describe a framework for symbolically evaluating C code using a deductive approach that discovers and proves program properties. The framework applies Floyd-Hoare verification principles in its treatment of loops, with a library of iteration schemes serving to derive loop invariants. During evaluation, theorem proving is performed on-the-fly, obviating the generation of verification conditions normally needed to establish loop properties. A PVS-based prototype is presented along with results for sample C functions.

  11. The help of simulation codes in designing waste assay systems using neutron measurement methods: Application to the alpha low level waste assay system PROMETHEE 6

    NASA Astrophysics Data System (ADS)

    Mariani, A.; Passard, C.; Jallu, F.; Toubon, H.

    2003-11-01

    The design of a specific nuclear assay system for a dedicated application begins with a phase of development, which relies on information from the literature or on knowledge resulting from experience, and on specific experimental verifications. The latter may require experimental devices which can be restrictive in terms of deadline, cost and safety. One way generally chosen to bypass these difficulties is to use simulation codes to study particular aspects. This paper deals with the potentialities offered by simulation in the case of a passive-active neutron (PAN) assay system for alpha low level waste characterization; this system has been developed at the Nuclear Measurements Development Laboratory of the French Atomic Energy Commission. Due to the high number of parameters to be taken into account for its development, this is a particularly sophisticated example. Since the PAN assay system, called PROMETHEE (prompt epithermal and thermal interrogation experiment), must have a detection efficiency of more than 20% and preserve a high level of modularity for various applications, an improved version has been studied using the MCNP4 (Monte Carlo N-Particle) transport code. Parameters such as the dimensions of the assay system, of the cavity and of the detection blocks, and the thicknesses of the nuclear materials of neutronic interest have been optimised. Therefore, the number of necessary experiments was reduced.

  12. Verification of the FtCayuga fault-tolerant microprocessor system. Volume 1: A case study in theorem prover-based verification

    NASA Technical Reports Server (NTRS)

    Srivas, Mandayam; Bickford, Mark

    1991-01-01

    The design and formal verification of a hardware system for a task that is an important component of a fault tolerant computer architecture for flight control systems is presented. The hardware system implements an algorithm for obtaining interactive consistency (byzantine agreement) among four microprocessors as a special instruction on the processors. The property verified ensures that an execution of the special instruction by the processors correctly accomplishes interactive consistency, provided certain preconditions hold. An assumption is made that the processors execute synchronously. For verification, the authors used a computer-aided hardware design verification tool, Spectool, and the theorem prover, Clio. A major contribution of the work is the demonstration of a significant fault tolerant hardware design that is mechanically verified by a theorem prover.

  13. Method for fabrication and verification of conjugated nanoparticle-antibody tuning elements for multiplexed electrochemical biosensors.

    PubMed

    La Belle, Jeffrey T; Fairchild, Aaron; Demirok, Ugur K; Verma, Aman

    2013-05-15

    There is a critical need for more accurate, highly sensitive and specific assays for disease diagnosis and management. A novel, multiplexed, single sensor using a rapid and label-free electrochemical impedance spectroscopy tuning method has been developed. The key challenge when monitoring multiple targets is frequency overlap. Here we describe the methods to circumvent the overlap, tune by use of nanoparticles (NPs), and discuss the various fabrication and characterization methods used to develop this technique. First, sensors were fabricated using printed circuit board (PCB) technology, and nickel and gold layers were electrodeposited onto the PCB sensors. An off-chip conjugation of gold NPs to molecular recognition elements (with a verification technique) is described as well. A standard covalent immobilization of the molecular recognition elements is also discussed, with quality control techniques. Finally, the use and verification of sensitivity and specificity are also presented. By use of gold NPs of various sizes, we have demonstrated the possibility and shown little loss of sensitivity and specificity in the molecular recognition of inflammatory markers as "model" targets for our tuning system. By selection of other sized NPs or NPs of various materials, the tuning effect can be further exploited. The novel platform technology developed could be utilized in critical care, clinical management and at-home health and disease management. Copyright © 2013 Elsevier Inc. All rights reserved.

  14. Development and Verification of an RNA Sequencing (RNA-Seq) Assay for the Detection of Gene Fusions in Tumors.

    PubMed

    Winters, Jennifer L; Davila, Jaime I; McDonald, Amber M; Nair, Asha A; Fadra, Numrah; Wehrs, Rebecca N; Thomas, Brittany C; Balcom, Jessica R; Jin, Long; Wu, Xianglin; Voss, Jesse S; Klee, Eric W; Oliver, Gavin R; Graham, Rondell P; Neff, Jadee L; Rumilla, Kandelaria M; Aypar, Umut; Kipp, Benjamin R; Jenkins, Robert B; Jen, Jin; Halling, Kevin C

    2018-06-13

    We assessed the performance characteristics of an RNA sequencing (RNA-Seq) assay designed to detect gene fusions in 571 genes to help manage patients with cancer. Polyadenylated RNA was converted to cDNA, which was then used to prepare next-generation sequencing libraries that were sequenced on an Illumina HiSeq 2500 instrument and analyzed with an in-house developed bioinformatic pipeline. The assay identified 38 of 41 gene fusions detected by another method, such as fluorescence in situ hybridization or RT-PCR, for a sensitivity of 93%. No false-positive gene fusions were identified in 15 normal tissue specimens and 10 tumor specimens that were negative for fusions by RNA sequencing or Mate Pair NGS (100% specificity). The assay also identified 22 fusions in 17 tumor specimens that had not been detected by other methods. Eighteen of the 22 fusions had not previously been described. Good intra-assay and interassay reproducibility was observed with complete concordance for the presence or absence of gene fusions in replicates. The analytical sensitivity of the assay was tested by diluting RNA isolated from gene fusion-positive cases with fusion-negative RNA. Gene fusions were generally detectable down to 12.5% dilutions for most fusions and as little as 3% for some fusions. This assay can help identify fusions in patients with cancer; these patients may in turn benefit from both US Food and Drug Administration-approved and investigational targeted therapies. Copyright © 2018 American Society for Investigative Pathology and the Association for Molecular Pathology. Published by Elsevier Inc. All rights reserved.

  15. GENERIC VERIFICATION PROTOCOL FOR AQUEOUS CLEANER RECYCLING TECHNOLOGIES

    EPA Science Inventory

    This generic verification protocol has been structured based on a format developed for ETV-MF projects. This document describes the intended approach and explains plans for testing with respect to areas such as test methodology, procedures, parameters, and instrumentation. Also ...

  16. Patient-specific IMRT verification using independent fluence-based dose calculation software: experimental benchmarking and initial clinical experience.

    PubMed

    Georg, Dietmar; Stock, Markus; Kroupa, Bernhard; Olofsson, Jörgen; Nyholm, Tufve; Ahnesjö, Anders; Karlsson, Mikael

    2007-08-21

    Experimental methods are commonly used for patient-specific intensity-modulated radiotherapy (IMRT) verification. The purpose of this study was to investigate the accuracy and performance of independent dose calculation software (denoted as 'MUV' (monitor unit verification)) for patient-specific quality assurance (QA). 52 patients receiving step-and-shoot IMRT were considered. IMRT plans were recalculated by the treatment planning systems (TPS) in a dedicated QA phantom, in which an experimental 1D and 2D verification (0.3 cm(3) ionization chamber; films) was performed. Additionally, an independent dose calculation was performed. The fluence-based algorithm of MUV accounts for collimator transmission, rounded leaf ends, tongue-and-groove effect, backscatter to the monitor chamber and scatter from the flattening filter. The dose calculation utilizes a pencil beam model based on a beam quality index. DICOM RT files from patient plans, exported from the TPS, were directly used as patient-specific input data in MUV. For composite IMRT plans, average deviations in the high dose region between ionization chamber measurements and point dose calculations performed with the TPS and MUV were 1.6 +/- 1.2% and 0.5 +/- 1.1% (1 S.D.). The dose deviations between MUV and TPS slightly depended on the distance from the isocentre position. For individual intensity-modulated beams (total 367), an average deviation of 1.1 +/- 2.9% was determined between calculations performed with the TPS and with MUV, with maximum deviations up to 14%. However, absolute dose deviations were mostly less than 3 cGy. Based on the current results, we aim to apply a confidence limit of 3% (with respect to the prescribed dose) or 6 cGy for routine IMRT verification. For off-axis points at distances larger than 5 cm and for low dose regions, we consider 5% dose deviation or 10 cGy acceptable. The time needed for an independent calculation compares very favourably with the net time for an experimental approach. The physical effects modelled in the dose calculation software MUV allow accurate dose calculations in individual verification points. Independent calculations may be used to replace experimental dose verification once the IMRT programme is mature.
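
    The confidence limits quoted above translate directly into a point-wise acceptance test; a sketch (Python) under the stated tolerances (3% or 6 cGy, relaxed to 5% or 10 cGy for off-axis points beyond 5 cm and low-dose regions):

        def point_accepted(d_tps, d_muv, prescribed_cgy, off_axis_cm=0.0, low_dose=False):
            """Accept if the MUV-TPS difference is within the percent OR absolute limit."""
            pct, abs_cgy = (5.0, 10.0) if (off_axis_cm > 5.0 or low_dose) else (3.0, 6.0)
            diff = abs(d_muv - d_tps)
            return diff <= prescribed_cgy * pct / 100.0 or diff <= abs_cgy

        print(point_accepted(d_tps=200.0, d_muv=204.5, prescribed_cgy=200.0))  # True: 4.5 cGy < 6 cGy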

  17. Methodology for the specification of communication activities within the framework of a multi-layered architecture: Toward the definition of a knowledge base

    NASA Astrophysics Data System (ADS)

    Amyay, Omar

    A method defined in terms of synthesis and verification steps is presented. The specification of the services and protocols of communication within a multilayered architecture of the Open Systems Interconnection (OSI) type is an essential issue for the design of computer networks. The aim is to obtain an operational specification of the protocol service couple of a given layer. Planning synthesis and verification steps constitute a specification trajectory. The latter is based on the progressive integration of the 'initial data' constraints and verification of the specification originating from each synthesis step, through validity constraints that characterize an admissible solution. Two types of trajectories are proposed according to the style of the initial specification of the service protocol couple: operational type and service supplier viewpoint; knowledge property oriented type and service viewpoint. Synthesis and verification activities were developed and formalized in terms of labeled transition systems, temporal logic and epistemic logic. The originality of the second specification trajectory and the use of the epistemic logic are shown. An 'artificial intelligence' approach enables a conceptual model to be defined for a knowledge base system for implementing the method proposed. It is structured in three levels of representation of the knowledge relating to the domain, the reasoning characterizing synthesis and verification activities and the planning of the steps of a specification trajectory.

  18. Transference of CALIPER pediatric reference intervals to biochemical assays on the Roche cobas 6000 and the Roche Modular P.

    PubMed

    Higgins, Victoria; Chan, Man Khun; Nieuwesteeg, Michelle; Hoffman, Barry R; Bromberg, Irvin L; Gornall, Doug; Randell, Edward; Adeli, Khosrow

    2016-01-01

    The Canadian Laboratory Initiative on Pediatric Reference Intervals (CALIPER) has recently established pediatric age- and sex-specific reference intervals for over 85 biochemical markers on the Abbott Architect system. Previously, CALIPER reference intervals for several biochemical markers were successfully transferred from Abbott assays to Roche, Beckman, Ortho, and Siemens assays. This study further broadens the CALIPER database by performing transference and verification for 52 biochemical assays on the Roche cobas 6000 and the Roche Modular P. Using CLSI C28-A3 and EP9-A2 guidelines, transference of the CALIPER reference intervals was attempted for 16 assays on the Roche cobas 6000 and 36 on the Modular P. Calculated reference intervals were further verified using 100 healthy CALIPER samples. Most assays showed strong correlation between assay systems and were transferable from Abbott to the Roche cobas 6000 (81%) and the Modular P (86%). Bicarbonate and magnesium were not transferable on either system and calcium and prealbumin were not transferable to the Modular P. Of the transferable analytes, 62% and 61% were verified on the cobas 6000 and the Modular P, respectively. This study extends the utility of the CALIPER database to two additional analytical systems, which facilitates the broad application of CALIPER reference intervals at pediatric centers utilizing Roche biochemical assays. Transference studies across different analytical platforms can later be collectively analyzed in an attempt to develop common reference intervals across all clinical chemistry instruments to harmonize laboratory test interpretation in diagnosis and monitoring of pediatric disease. Copyright © 2015 The Canadian Society of Clinical Chemists. Published by Elsevier Inc. All rights reserved.
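
    The verification step can be sketched as a simple coverage check (Python); the abstract does not state the exact acceptance rule, so the >=90%-inside criterion below is an assumption in the spirit of CLSI C28-A3:

        def interval_verified(values, low, high, min_fraction=0.90):
            """Verify a transferred reference interval against healthy-donor results."""
            inside = sum(low <= v <= high for v in values)
            return inside / len(values) >= min_fraction

        # Hypothetical results from 100 healthy CALIPER samples:
        import random
        random.seed(1)
        samples = [random.gauss(5.0, 0.8) for _ in range(100)]
        print(interval_verified(samples, low=3.5, high=6.5))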

  19. An ontology based trust verification of software license agreement

    NASA Astrophysics Data System (ADS)

    Lu, Wenhuan; Li, Xiaoqing; Gan, Zengqin; Wei, Jianguo

    2017-08-01

    When installing or downloading software, users are presented with a lengthy document stating rights and obligations, which many people lack the patience to read or understand. This may make users distrust the software. In this paper, we propose an ontology-based trust verification for Software License Agreements. First, this work proposes an ontology model for the domain of Software License Agreements. The domain ontology is constructed with the proposed methodology according to copyright laws and 30 software license agreements. The License Ontology can act as part of a generalized copyright law knowledge model, and can also serve as a visualization of software licenses. Based on this proposed ontology, a software license oriented text summarization approach is proposed, whose performance shows that it can improve the accuracy of software license summarization. Based on the summarization, the underlying purpose of the software license can be explicitly explored for trust verification.

  20. Identification and verification of the anabolic steroid boldenone in equine blood and urine by HPLC/ELISA.

    PubMed

    Hagedorn, H W; Schulz, R; Jaeschke, G

    1994-01-01

    An enzyme linked immunosorbent assay (ELISA) was developed to detect the anabolic steroid boldenone in equine blood and urine. The polyclonal antiserum was raised in rabbits, employing boldenone-17-hemisuccinate-bovine serum albumin as antigen. Boldenone-17-hemisuccinate-horseradish peroxidase served as enzyme conjugate. Sensitivity of the assay was 26.0 +/- 3.0 pg/well. Among the endogenous steroids tested, only progesterone and testosterone exhibited moderate cross-reactivities, 3.4 and 2.5%, respectively. These cross-reactivities are of no importance for the boldenone assay. To reduce background levels, screening of equine serum for boldenone was performed after extraction. Urine samples were analyzed directly after dilution, omitting hydrolysis of boldenone conjugates. Positive screening results were confirmed by means of two independent HPLC systems combined with off-line detection, employing the boldenone ELISA. Methandienone served as internal standard to ascertain retention factors. In horses treated with boldenone-17-undecylenate, the presence of boldenone was confirmed in serum up to 28 days and in unhydrolyzed urine up to 56 days after administration.

  1. Multiplexed targeted mass spectrometry assays for prostate cancer-associated urinary proteins

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Shi, Tujin; Quek, Sue-Ing; Gao, Yuqian

    Biomarkers for effective early diagnosis and prognosis of prostate cancer are still lacking. Multiplexed assays for cancer-associated proteins could be useful for identifying biomarkers for cancer detection and stratification. Herein, we report the development of sensitive targeted mass spectrometry assays for simultaneous quantification of 10 prostate cancer-associated proteins in urine. The diagnostic utility of these markers was evaluated with an initial cohort of 20 clinical urine samples. Individual marker concentration was normalized against the measured urinary prostate-specific antigen level as a reference of prostate-specific secretion. The areas under the receiver-operating characteristic curves for the 10 proteins ranged from 0.75 for CXCL14 to 0.87 for CEACAM5. Furthermore, MMP9 level was found to be significantly higher in patients with high Gleason scores, suggesting a potential of MMP9 as a marker for risk level assessment. Taken together, our work illustrated the feasibility of accurate multiplexed measurements of low-abundance cancer-associated proteins in urine and provided a viable path forward for preclinical verification of candidate biomarkers for prostate cancer.

  2. Biometrics based authentication scheme for session initiation protocol.

    PubMed

    Xie, Qi; Tang, Zhixiong

    2016-01-01

    Many two-factor challenge-response based session initiation protocol (SIP) authentication schemes have been proposed, but most of them are vulnerable to stolen smart card attacks and password guessing attacks. In this paper, we propose a novel three-factor SIP authentication scheme using biometrics, password and smart card, and utilize the pi calculus-based formal verification tool ProVerif to prove that the proposed protocol achieves security and authentication. Furthermore, our protocol is highly efficient when compared to other related protocols.

  3. Verification of Java Programs using Symbolic Execution and Invariant Generation

    NASA Technical Reports Server (NTRS)

    Pasareanu, Corina; Visser, Willem

    2004-01-01

    Software verification is recognized as an important and difficult problem. We present a novel framework, based on symbolic execution, for the automated verification of software. The framework uses annotations in the form of method specifications and loop invariants. We present a novel iterative technique that uses invariant strengthening and approximation for discovering these loop invariants automatically. The technique handles different types of data (e.g., boolean and numeric constraints, dynamically allocated structures and arrays) and allows for checking universally quantified formulas. Our framework is built on top of the Java PathFinder model checking toolset and was used for the verification of several non-trivial Java programs.

  4. Implementation and verification of global optimization benchmark problems

    NASA Astrophysics Data System (ADS)

    Posypkin, Mikhail; Usov, Alexander

    2017-12-01

    The paper considers the implementation and verification of a test suite containing 150 benchmarks for global deterministic box-constrained optimization. A C++ library for describing standard mathematical expressions was developed for this purpose. The library automates the process of generating the value of a function and its gradient at a given point, and the interval estimates of a function and its gradient on a given box, using a single description. Based on this functionality, we have developed a collection of tests for automatic verification of the proposed benchmarks. The verification has shown that literature sources contain mistakes in the benchmark descriptions. The library and the test suite are available for download and can be used freely.
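
    The single-description idea (one expression yielding a point value, a derivative, and an interval enclosure) can be mimicked in a few lines; a Python sketch with dual numbers and naive interval arithmetic, whereas the authors' library is C++ and more general:

        class Dual:                          # value + first derivative
            def __init__(self, v, d=0.0): self.v, self.d = v, d
            def __add__(self, o): return Dual(self.v + o.v, self.d + o.d)
            def __mul__(self, o): return Dual(self.v * o.v, self.v * o.d + self.d * o.v)

        class Interval:                      # [lo, hi] enclosure
            def __init__(self, lo, hi): self.lo, self.hi = lo, hi
            def __add__(self, o): return Interval(self.lo + o.lo, self.hi + o.hi)
            def __mul__(self, o):
                ps = [self.lo * o.lo, self.lo * o.hi, self.hi * o.lo, self.hi * o.hi]
                return Interval(min(ps), max(ps))

        def f(x):                            # single description: f(x) = x*x + x
            return x * x + x

        print(f(3.0))                        # point value: 12.0
        print(f(Dual(3.0, 1.0)).d)           # derivative at 3: 7.0
        box = f(Interval(-1.0, 2.0))
        print(box.lo, box.hi)                # enclosure of f on [-1, 2]: [-3, 6]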

  5. HDM/PASCAL Verification System User's Manual

    NASA Technical Reports Server (NTRS)

    Hare, D.

    1983-01-01

    The HDM/Pascal verification system is a tool for proving the correctness of programs written in PASCAL and specified in the Hierarchical Development Methodology (HDM). This document assumes an understanding of PASCAL, HDM, program verification, and the STP system. The steps toward verification which this tool provides are parsing programs and specifications, checking the static semantics, and generating verification conditions. Some support functions are provided such as maintaining a data base, status management, and editing. The system runs under the TOPS-20 and TENEX operating systems and is written in INTERLISP. However, no knowledge is assumed of these operating systems or of INTERLISP. The system requires three executable files, HDMVCG, PARSE, and STP. Optionally, the editor EMACS should be on the system in order for the editor to work. The file HDMVCG is invoked to run the system. The files PARSE and STP are used as lower forks to perform the functions of parsing and proving.

  6. ENVIRONMENTAL TECHNOLOGY VERIFICATION PROGRAM: QUALITY AND MANAGEMENT PLAN FOR THE PILOT PERIOD (1995-2000)

    EPA Science Inventory

    Based upon the structure and specifications in ANSI/ASQC E4-1994, Specifications and Guidelines for Quality Systems for Environmental Data Collection and Environmental Technology Programs, the Environmental Technology Verification (ETV) program Quality and Management Plan (QMP) f...

  7. An analysis of random projection for changeable and privacy-preserving biometric verification.

    PubMed

    Wang, Yongjin; Plataniotis, Konstantinos N

    2010-10-01

    Changeability and privacy protection are important factors for widespread deployment of biometrics-based verification systems. This paper presents a systematic analysis of a random-projection (RP)-based method for addressing these problems. The employed method transforms biometric data using a random matrix with each entry an independent and identically distributed Gaussian random variable. The similarity- and privacy-preserving properties, as well as the changeability of the biometric information in the transformed domain, are analyzed in detail. Specifically, RP on both high-dimensional image vectors and dimensionality-reduced feature vectors is discussed and compared. A vector translation method is proposed to improve the changeability of the generated templates. The feasibility of the introduced solution is well supported by detailed theoretical analyses. Extensive experimentation on a face-based biometric verification problem shows the effectiveness of the proposed method.
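
    The core transform is easy to sketch (Python/NumPy): project the biometric feature vector with an i.i.d. Gaussian matrix, which approximately preserves pairwise distances while allowing a new template to be issued by drawing a new matrix. Dimensions and scaling below are illustrative:

        import numpy as np

        rng = np.random.default_rng(42)

        def random_projection(k, d):
            """k x d matrix of i.i.d. Gaussians, scaled to roughly preserve distances."""
            return rng.normal(0.0, 1.0 / np.sqrt(k), size=(k, d))

        d, k = 1024, 128                     # raw feature length, template length
        x1, x2 = rng.normal(size=d), rng.normal(size=d)
        R = random_projection(k, d)
        # Distances before and after projection are comparable:
        print(np.linalg.norm(x1 - x2), np.linalg.norm(R @ x1 - R @ x2))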

  8. A Scala DSL for RETE-Based Runtime Verification

    NASA Technical Reports Server (NTRS)

    Havelund, Klaus

    2013-01-01

    Runtime verification (RV) consists in part of checking execution traces against formalized specifications. Several systems have emerged, most of which support specification notations based on state machines, regular expressions, temporal logic, or grammars. The field of Artificial Intelligence (AI) has for an even longer period of time studied rule-based production systems, which at a closer look appear to be relevant for RV, although seemingly focused on slightly different application domains, such as for example business processes and expert systems. The core algorithm in many of these systems is the Rete algorithm. We have implemented a Rete-based runtime verification system, named LogFire (originally intended for offline log analysis but also applicable to online analysis), as an internal DSL in the Scala programming language, using Scala's support for defining DSLs. This combination appears attractive from a practical point of view. Our contribution is in part conceptual in arguing that such rule-based frameworks originating from AI may be suited for RV.

  9. Commissioning and quality assurance of an integrated system for patient positioning and setup verification in particle therapy.

    PubMed

    Pella, A; Riboldi, M; Tagaste, B; Bianculli, D; Desplanques, M; Fontana, G; Cerveri, P; Seregni, M; Fattori, G; Orecchia, R; Baroni, G

    2014-08-01

    In an increasing number of clinical indications, radiotherapy with accelerated particles shows relevant advantages when compared with high energy X-ray irradiation. However, due to the finite range of ions, particle therapy can be severely compromised by setup errors and geometric uncertainties. The purpose of this work is to describe the commissioning and the design of the quality assurance procedures for patient positioning and setup verification systems at the Italian National Center for Oncological Hadrontherapy (CNAO). The accuracy of the systems installed at CNAO and devoted to patient positioning and setup verification has been assessed using a laser tracking device. The accuracy of calibration and image-based setup verification relying on the in-room X-ray imaging system was also quantified. Quality assurance tests to check the integration among all patient setup systems were designed, and records of daily QA tests since the start of clinical operation (2011) are presented. The overall accuracy of the patient positioning system and the patient verification system was proved to be below 0.5 mm under all the examined conditions, with median values below the 0.3 mm threshold. Image-based registration in phantom studies exhibited sub-millimetric accuracy in setup verification at both cranial and extra-cranial sites. The calibration residuals of the OTS were found consistent with expectations, with peak values below 0.3 mm. Quality assurance tests, performed daily before clinical operation, confirm adequate integration and sub-millimetric setup accuracy. Robotic patient positioning was successfully integrated with optical tracking and stereoscopic X-ray verification for patient setup in particle therapy. Sub-millimetric setup accuracy was achieved and consistently verified in daily clinical operation.

  10. ECR-MAPK regulation in liver early development.

    PubMed

    Zhao, Xiu-Ju; Zhuo, Hexian

    2014-01-01

    Early growth is connected to a key link between embryonic development and aging. In this paper, liver gene expression profiles were assayed at postnatal day 22 and week 16 of age. Meanwhile another independent animal experiment and cell culture were carried out for validation. Significance analysis of microarrays, qPCR verification, drug induction/inhibition assays, and metabonomics indicated that alpha-2u globulin (extracellular region)-socs2 (-SH2-containing signals/receptor tyrosine kinases)-ppp2r2a/pik3c3 (MAPK signaling)-hsd3b5/cav2 (metabolism/organization) plays a vital role in early development. Taken together, early development of male rats is ECR and MAPK-mediated coordination of cancer-like growth and negative regulations. Our data represent the first comprehensive description of early individual development, which could be a valuable basis for understanding the functioning of the gene interaction network of infant development.

  11. Verification of natural infection of peridomestic rodents by PCV2 on commercial swine farms.

    PubMed

    Pinheiro, Albanno Leonard Braz Campos; Bulos, Luiz Henrique Silva; Onofre, Thiago Souza; de Paula Gabardo, Michelle; de Carvalho, Otávio Valério; Fausto, Mariana Costa; Guedes, Roberto Maurício Carvalho; de Almeida, Márcia Rogéria; Silva Júnior, Abelardo

    2013-06-01

    The porcine circovirus-2 (PCV2) is the main agent responsible for porcine circovirus associated diseases (PCVAD). Few studies have been done regarding PCV2 infection in other species. The purpose of this study was to investigate the occurrence of PCV2 infection in the peridomestic rodent species Mus musculus and Rattus rattus on commercial pig farms in Brazil. Immunohistochemistry assay demonstrated PCV2 in the spleen, lung and kidney. Viral DNA was detected in tissues by nested PCR assay. Partial sequences of PCV2 genomes detected in the rodents had strong identity with gene sequences of PCV2 isolates from pigs. These results show that the studied peridomestic rodent species can be naturally infected by PCV2. However, further studies are needed to confirm PCV2 transmission from rodents to pigs. Copyright © 2012 Elsevier Ltd. All rights reserved.

  12. Resorption behavior of a nanostructured bone substitute: in vitro investigation and clinical application.

    PubMed

    Reichert, Christoph; Götz, Werner; Reimann, Susanne; Keilig, Ludger; Hagner, Martin; Bourauel, Christoph; Jäger, Andreas

    2013-03-01

    To develop an in vitro assay for quantitative analysis of the degradation to which a bone substitute is exposed by osteoclasts. The aim of establishing this method was to improve the predictability of carrying out tooth movements via bone substitutes and to provide a basis for verification in exemplary clinical cases. After populating a bone substitute (NanoBone®; ArtOss, Germany) with osteoclastic cells, inductively-coupled mass spectrometry was used to evaluate changing calcium levels in the culture medium as a marker of resorption activity. It was observed that calcium levels increased substantially in the culture medium with the cells populating the bone substitute. This in vitro assay is a valid method that can assist clinicians in selecting the appropriate materials for certain patients. While tooth movements occurring through this material were successful, uncertainty about the approach will remain as long-term results are not available.

  13. Computer Modeling and Simulation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Pronskikh, V. S.

    2014-05-09

    Verification and validation of computer codes and models used in simulation are two aspects of the scientific practice of high importance that have recently been discussed by philosophers of science. While verification is predominantly associated with the correctness of the way a model is represented by a computer code or algorithm, validation more often refers to the model’s relation to the real world and its intended use. It has been argued that because complex simulations are generally not transparent to a practitioner, the Duhem problem can arise for verification and validation due to their entanglement; such an entanglement makes it impossible to distinguish whether a coding error or the model’s general inadequacy to its target should be blamed in the case of model failure. I argue that in order to disentangle verification and validation, a clear distinction between computer modeling (construction of mathematical computer models of elementary processes) and simulation (construction of models of composite objects and processes by means of numerical experimenting with them) needs to be made. Holding on to that distinction, I propose to relate verification (based on theoretical strategies such as inferences) to modeling, and validation, which shares a common epistemology with experimentation, to simulation. To explain the reasons for their intermittent entanglement, I propose a Weberian ideal-typical model of modeling and simulation as roles in practice. I suggest an approach to alleviate the Duhem problem for verification and validation that is generally applicable in practice and based on differences in epistemic strategies and scopes.

  14. Diversity in Secondary Metabolites Including Mycotoxins from Strains of Aspergillus Section Nigri Isolated from Raw Cashew Nuts from Benin, West Africa.

    PubMed

    Lamboni, Yendouban; Nielsen, Kristian F; Linnemann, Anita R; Gezgin, Yüksel; Hell, Kerstin; Nout, Martinus J R; Smid, Eddy J; Tamo, Manuele; van Boekel, Martinus A J S; Hoof, Jakob Blæsbjerg; Frisvad, Jens Christian

    2016-01-01

    In a previous study, raw cashew kernels were assayed for fungal contamination, focusing on strains belonging to the genus Aspergillus and on aflatoxin producers. These samples showed high contamination with Aspergillus section Nigri species and an absence of aflatoxins. To investigate the diversity of secondary metabolites, including mycotoxins, that the species of A. section Nigri may produce and thus threaten to contaminate the raw cashew kernels, 150 strains were isolated from cashew samples and assayed for their production of secondary metabolites using liquid chromatography high resolution mass spectrometry (LC-HRMS). Seven species of black Aspergilli were isolated based on morphological and chemical identification: A. tubingensis (44%), A. niger (32%), A. brasiliensis (10%), A. carbonarius (8.7%), A. luchuensis (2.7%), A. aculeatus (2%) and A. aculeatinus (0.7%). From these, 45 metabolites and their isomers were identified. Aurasperone and pyranonigrin A, produced by all species excluding A. aculeatus and A. aculeatinus, were most prevalent and were encountered in 146 (97.3%) and 145 (95.7%) isolates, respectively. Three mycotoxin groups were detected: fumonisins (B2 and B4) (2.7%), ochratoxin A (13.3%), and secalonic acids (2%), indicating that these mycotoxins could occur in raw cashew nuts. Thirty strains of black Aspergilli were randomly sampled for verification of species identity based on sequences of β-tubulin and calmodulin genes. Among them, 27 isolates were positive to the primers used and 11 were identified as A. niger, 7 as A. tubingensis, 6 as A. carbonarius, 2 as A. luchuensis and 1 as A. welwitschiae, confirming the species names based on morphology and chemical features. These strains clustered in 5 clades in A. section Nigri. Chemical profile clustering also showed 5 groups, confirming species-specific metabolite production.

  16. A Candidate H1N1 Pandemic Influenza Vaccine Elicits Protective Immunity in Mice

    PubMed Central

    Steitz, Julia; Barlow, Peter G.; Hossain, Jaber; Kim, Eun; Okada, Kaori; Kenniston, Tom; Rea, Sheri; Donis, Ruben O.; Gambotto, Andrea

    2010-01-01

    Background In 2009 a new pandemic disease appeared and spread globally. The recent emergence of the pandemic influenza virus H1N1, first isolated in Mexico and the USA, raised concerns about vaccine availability. We report here our development of an adenovirus-based influenza H1N1 vaccine tested for immunogenicity and efficacy in an animal model. Methods We generated two adenovirus(Ad5)-based influenza vaccine candidates encoding the wildtype or a codon-optimized hemagglutinin antigen (HA) from the recently emerged swine influenza isolate A/California/04/2009 (H1N1)pdm. After verification of antigen expression, the immunogenicity of the vaccine candidates was tested in a mouse model using dose escalations for subcutaneous immunization. Sera of immunized animals were tested in microneutralization and hemagglutination inhibition assays for the presence of HA-specific antibodies. HA-specific T-cells were measured in IFNγ Elispot assays. The efficacy of the influenza vaccine candidates was evaluated in a challenge model by measuring viral titer in lung and nasal turbinate 3 days after inoculation with a homologous H1N1 virus. Conclusions/Significance A single immunization resulted in robust cellular and humoral immune responses. Remarkably, the intensity of the immune response was substantially enhanced with the codon-optimized antigen, indicating the benefit of manipulating the genetic code of HA antigens in the context of recombinant influenza vaccine design. These results highlight the value of advanced technologies in vaccine development and deployment in response to infections with pandemic potential. Our study emphasizes the potential of an adenoviral-based influenza vaccine platform with the benefits of speed of manufacture and efficacy of a single-dose immunization. PMID:20463955

  17. ENVIRONMENTAL TECHNOLOGY VERIFICATION REPORT: ANEST IWATA CORPORATION LPH400-LV HVLP SPRAY GUN

    EPA Science Inventory

    This Environmental Technology Verification report describes the characteristics of a paint spray gun. The research showed that the spray gun provided absolute and relative increases in transfer efficiency over the baseline and reduced paint use.

  18. IN PURSUIT OF AN INTERNATIONAL APPROACH TO QUALITY ASSURANCE FOR ENVIRONMENTAL TECHNOLOGY VERIFICATION

    EPA Science Inventory

    In the mid-1990's, the USEPA began the Environmental Technology Verification (ETV) Program in order to provide purchasers of environmental technology with independently acquired, quality-assured, test data, upon which to base their purchasing decisions. From the beginning, a str...

  19. 78 FR 27390 - Agency Forms Undergoing Paperwork Reduction Act Review

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-05-10

    ... programs voluntarily self-nominate their practice or healthcare system by completing a web-based nomination... CDC with a ranked list of nominees. Finalists will be asked to participate in a data verification process that includes verification of how information was obtained from electronic records, remote...

  20. Formal hardware verification of digital circuits

    NASA Technical Reports Server (NTRS)

    Joyce, J.; Seger, C.-J.

    1991-01-01

    The use of formal methods to verify the correctness of digital circuits is less constrained by the growing complexity of digital circuits than conventional methods based on exhaustive simulation. This paper briefly outlines three main approaches to formal hardware verification: symbolic simulation, state machine analysis, and theorem-proving.

  1. Characterization of 137 Genomic DNA Reference Materials for 28 Pharmacogenetic Genes: A GeT-RM Collaborative Project.

    PubMed

    Pratt, Victoria M; Everts, Robin E; Aggarwal, Praful; Beyer, Brittany N; Broeckel, Ulrich; Epstein-Baak, Ruth; Hujsak, Paul; Kornreich, Ruth; Liao, Jun; Lorier, Rachel; Scott, Stuart A; Smith, Chingying Huang; Toji, Lorraine H; Turner, Amy; Kalman, Lisa V

    2016-01-01

    Pharmacogenetic testing is increasingly available from clinical laboratories. However, only a limited number of quality control and other reference materials are currently available to support clinical testing. To address this need, the Centers for Disease Control and Prevention-based Genetic Testing Reference Material Coordination Program, in collaboration with members of the pharmacogenetic testing community and the Coriell Cell Repositories, has characterized 137 genomic DNA samples for 28 genes commonly genotyped by pharmacogenetic testing assays (CYP1A1, CYP1A2, CYP2A6, CYP2B6, CYP2C8, CYP2C9, CYP2C19, CYP2D6, CYP2E1, CYP3A4, CYP3A5, CYP4F2, DPYD, GSTM1, GSTP1, GSTT1, NAT1, NAT2, SLC15A2, SLC22A2, SLCO1B1, SLCO2B1, TPMT, UGT1A1, UGT2B7, UGT2B15, UGT2B17, and VKORC1). One hundred thirty-seven Coriell cell lines were selected based on ethnic diversity and partial genotype characterization from earlier testing. DNA samples were coded and distributed to volunteer testing laboratories for targeted genotyping using a number of commercially available and laboratory developed tests. Through consensus verification, we confirmed the presence of at least 108 variant pharmacogenetic alleles. These samples are also being characterized by other pharmacogenetic assays, including next-generation sequencing, which will be reported separately. Genotyping results were consistent among laboratories, with most differences in allele assignments attributed to assay design and variability in reported allele nomenclature, particularly for CYP2D6, UGT1A1, and VKORC1. These publicly available samples will help ensure the accuracy of pharmacogenetic testing. Copyright © 2016 American Society for Investigative Pathology and the Association for Molecular Pathology. Published by Elsevier Inc. All rights reserved.

  2. Establishing and monitoring an aseptic workspace for building the MOMA mass spectrometer

    NASA Astrophysics Data System (ADS)

    Lalime, Erin N.; Berlin, David

    2016-09-01

    Mars Organic Molecule Analyzer (MOMA) is an instrument suite on the European Space Agency (ESA) ExoMars 2020 Rover, and the Mass Spectrometer (MOMA-MS) is being built at Goddard Space Flight Center (GSFC). MOMA-MS is a life-detection instrument and thus falls in the most stringent category of Planetary Protection (PP) biological cleanliness requirements. Less than 0.03 spore/m2 are allowed in the instrument sample path. In order to meet these PP requirements, MOMA-MS must be built and maintained in a low bioburden environment. The MOMA-MS project at GSFC maintains three clean rooms with varying levels of bioburden control. The Aseptic Assembly Clean room has the highest level of control, applying three different bioburden reducing methods: 70% Isopropyl Alcohol (IPA), 7.5% Hydrogen Peroxide, and Ultra-Violet C (UVC) light. The three methods are used in rotation and each kills microorganisms by a different mechanism, reducing the likelihood of microorganisms developing resistance to all three. The Integration and Mars Chamber Clean rooms use less biocidal cleaning, with the option to deploy extra techniques as necessary. To support the monitoring of clean rooms and verification that MOMA-MS hardware meets PP requirements, a new Planetary Protection lab was established that currently has the capabilities of standard growth assays for spore or vegetative bacteria, rapid bioburden analysis that detects Adenosine Triphosphate (ATP), plus autoclave and Dry Heat microbial Reduction (DHMR) verification. The clean rooms are monitored for vegetative microorganisms and by rapid ATP assay, and a clear difference in bioburden is observed between the aseptic and other clean room.

  4. Trajectory Based Behavior Analysis for User Verification

    NASA Astrophysics Data System (ADS)

    Pao, Hsing-Kuo; Lin, Hong-Yi; Chen, Kuan-Ta; Fadlil, Junaidillah

    Many of our activities on computers need a verification step for authorized access. The goal of verification is to tell apart the true account owner from intruders. We propose a general approach for user verification based on user trajectory inputs. The approach is labor-free for users and resistant to copying or simulation by unauthorized users or even automated programs such as bots. Our study focuses on finding the hidden patterns embedded in the trajectories produced by account users. We employ a Markov chain model with Gaussian distributions in its transitions to describe the behavior in the trajectory. To distinguish between two trajectories, we propose a novel dissimilarity measure combined with a manifold-learned tuning for capturing the pairwise relationship. Based on the pairwise relationship, we can plug in any effective classification or clustering method for the detection of unauthorized access. The method can also be applied to the task of recognition, predicting the trajectory type without a pre-defined identity. Given a trajectory input, the results show that the proposed method can accurately verify the user identity, or suggest who owns the trajectory if the input identity is not provided.
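
    To illustrate the kind of model described above, the sketch below fits a Gaussian to the step vectors of enrolled trajectories and scores a candidate by average log-likelihood; it is a minimal illustration of the general technique, not the authors' implementation, and the threshold and data are invented.

        # Sketch: Gaussian-transition trajectory model for user verification.
        import numpy as np

        def fit_transition_gaussian(trajectories):
            """Fit a Gaussian to the step vectors of the enrolled trajectories."""
            steps = np.vstack([np.diff(t, axis=0) for t in trajectories])
            mu = steps.mean(axis=0)
            cov = np.cov(steps, rowvar=False) + 1e-6 * np.eye(steps.shape[1])
            return mu, cov

        def mean_log_likelihood(trajectory, mu, cov):
            """Average log-likelihood of a candidate trajectory's steps."""
            steps = np.diff(trajectory, axis=0)
            d = steps.shape[1]
            inv = np.linalg.inv(cov)
            logdet = np.linalg.slogdet(cov)[1]
            diff = steps - mu
            quad = np.einsum('ij,jk,ik->i', diff, inv, diff)
            return float(np.mean(-0.5 * (quad + logdet + d * np.log(2 * np.pi))))

        # Accept if the candidate scores above a threshold tuned on held-out
        # genuine trajectories (the value here is arbitrary).
        rng = np.random.default_rng(0)
        enrolled = [np.cumsum(rng.standard_normal((100, 2)), axis=0) for _ in range(20)]
        mu, cov = fit_transition_gaussian(enrolled)
        candidate = np.cumsum(rng.standard_normal((100, 2)), axis=0)
        print("accept" if mean_log_likelihood(candidate, mu, cov) > -3.5 else "reject")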

  5. Development of CFC-Free Cleaning Processes at the NASA White Sands Test Facility

    NASA Technical Reports Server (NTRS)

    Beeson, Harold; Kirsch, Mike; Hornung, Steven; Biesinger, Paul

    1995-01-01

    The NASA White Sands Test Facility (WSTF) is developing cleaning and verification processes to replace currently used chlorofluorocarbon-113- (CFC-113-) based processes. The processes being evaluated include both aqueous- and solvent-based techniques. The presentation will include the findings of investigations of aqueous cleaning and verification processes that are based on a draft of a proposed NASA Kennedy Space Center (KSC) cleaning procedure. Verification testing with known contaminants, such as hydraulic fluid and commonly used oils, established correlations between nonvolatile residue and CFC-113. Recoveries ranged from 35 to 60 percent of theoretical. WSTF is also investigating enhancements to aqueous sampling for organics and particulates. Although aqueous alternatives have been identified for several processes, a need still exists for nonaqueous solvent cleaning, such as the cleaning and cleanliness verification of gauges used for oxygen service. The cleaning effectiveness of tetrachloroethylene (PCE), trichloroethylene (TCE), ethanol, hydrochlorofluorocarbon-225 (HCFC-225), tert-butylmethylether, and n-Hexane was evaluated using aerospace gauges and precision instruments and then compared to the cleaning effectiveness of CFC-113. Solvents considered for use in oxygen systems were also tested for oxygen compatibility using high-pressure oxygen autoignition and liquid oxygen mechanical impact testing.

  6. Experimental evaluation of fingerprint verification system based on double random phase encoding

    NASA Astrophysics Data System (ADS)

    Suzuki, Hiroyuki; Yamaguchi, Masahiro; Yachida, Masuyoshi; Ohyama, Nagaaki; Tashima, Hideaki; Obi, Takashi

    2006-03-01

    We proposed a smart card holder authentication system that combines fingerprint verification with PIN verification by applying a double random phase encoding scheme. In this system, the probability of correctly verifying an authorized individual decreases when the fingerprint image is significantly shifted. In this paper, a review of the proposed system is presented and a preprocessing step for improving the false rejection rate is proposed. In the proposed method, the position difference between two fingerprint images is estimated by using an optimized template for core detection. When the estimated difference exceeds the permissible level, the user inputs the fingerprint again. The effectiveness of the proposed method is confirmed by a computational experiment; its results show that the false rejection rate is improved.
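
    The double random phase encoding scheme underlying this system can be sketched numerically in a few lines: the image is multiplied by one random phase mask in the spatial domain and a second in the Fourier domain, and only the correct (conjugate) masks recover it. This is an illustrative toy, not the authors' optical implementation.

        # Sketch: double random phase encoding (DRPE) and decryption.
        import numpy as np

        rng = np.random.default_rng(0)
        img = rng.random((64, 64))                   # stand-in for a fingerprint image

        phase1 = np.exp(2j * np.pi * rng.random(img.shape))  # spatial-domain mask
        phase2 = np.exp(2j * np.pi * rng.random(img.shape))  # Fourier-domain mask

        encrypted = np.fft.ifft2(np.fft.fft2(img * phase1) * phase2)

        # Decryption with the conjugate masks recovers the image; a wrong key
        # yields noise, which is what the verification step detects.
        decrypted = np.fft.ifft2(np.fft.fft2(encrypted) * np.conj(phase2)) * np.conj(phase1)
        print(np.allclose(np.abs(decrypted), img))   # True when the keys match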

  7. Explaining Verification Conditions

    NASA Technical Reports Server (NTRS)

    Denney, Ewen; Fischer, Bernd

    2006-01-01

    The Hoare approach to program verification relies on the construction and discharge of verification conditions (VCs) but offers no support for tracing, analyzing, and understanding the VCs themselves. We describe a systematic extension of the Hoare rules by labels so that the calculus itself can be used to build up explanations of the VCs. The labels are maintained through the different processing steps and rendered as natural language explanations. The explanations can easily be customized and can capture different aspects of the VCs; here, we focus on their structure and purpose. The approach is fully declarative and the generated explanations are based only on an analysis of the labels rather than directly on the logical meaning of the underlying VCs or their proofs. Keywords: program verification, Hoare calculus, traceability.
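
    For readers unfamiliar with VCs, a textbook-style example (not drawn from the paper): for the Hoare triple {x > 0} x := x + 1 {x > 1}, the assignment rule reduces the triple to a purely logical proof obligation, which a label-based explanation would annotate with its origin (the assignment) and purpose (establishing the postcondition):

        \{x > 0\}\; x := x + 1\; \{x > 1\}
        \quad\rightsquigarrow\quad
        x > 0 \;\Rightarrow\; (x + 1) > 1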

  8. ENVIRONMENTAL TECHNOLOGY VERIFICATION REPORT, SWINE WASTE ELECTRIC POWER AND HEAT PRODUCTION--CAPSTONE 30KW MICROTURBINE SYSTEM

    EPA Science Inventory

    Under EPA’s Environmental Technology Verification program, which provides objective and scientific third party analysis of new technology that can benefit the environment, a combined heat and power system was evaluated based on the Capstone 30kW Microturbine developed by Cain Ind...

  9. Finding the Bio in Biobased Products: Electrophoretic Identification of Wheat Proteins in Processed Products

    USDA-ARS?s Scientific Manuscript database

    Verification of the bio-content in bio-based or green products identifies genuine products, exposes counterfeit copies, supports or refutes content claims and ensures consumer confidence. When the bio-content includes protein, elemental nitrogen analysis is insufficient for verification since non-pr...

  10. 76 FR 60112 - Consent Based Social Security Number Verification (CBSV) Service

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-09-28

    ... protect the public's information. In addition to the benefit of providing high volume, centralized SSN verification services to the business community in a secure manner, CBSV provides us with cost and workload management benefits. New Information: To use CBSV, interested parties must pay a one- time non-refundable...

  11. RF model of the distribution system as a communication channel, phase 2. Volume 2: Task reports

    NASA Technical Reports Server (NTRS)

    Rustay, R. C.; Gajjar, J. T.; Rankin, R. W.; Wentz, R. C.; Wooding, R.

    1982-01-01

    Based on the established feasibility of predicting, via a model, the propagation of Power Line Frequency on radial-type distribution feeders, verification studies comparing model predictions against measurements were undertaken using more complicated feeder circuits and situations. Detailed accounts of the major tasks are presented. These include: (1) verification of the model; (2) extension, implementation, and verification of perturbation theory; (3) parameter sensitivity; (4) transformer modeling; and (5) compensation of power distribution systems for enhancement of power line carrier communication reliability.

  12. Experimental preparation and verification of quantum money

    NASA Astrophysics Data System (ADS)

    Guan, Jian-Yu; Arrazola, Juan Miguel; Amiri, Ryan; Zhang, Weijun; Li, Hao; You, Lixing; Wang, Zhen; Zhang, Qiang; Pan, Jian-Wei

    2018-03-01

    A quantum money scheme enables a trusted bank to provide untrusted users with verifiable quantum banknotes that cannot be forged. In this work, we report a proof-of-principle experimental demonstration of the preparation and verification of unforgeable quantum banknotes. We employ a security analysis that takes experimental imperfections fully into account. We measure a total of 3.6 × 10⁶ states in one verification round, limiting the forging probability to 10⁻⁷ based on the security analysis. Our results demonstrate the feasibility of preparing and verifying quantum banknotes using currently available experimental techniques.

  13. Formal Verification of Complex Systems based on SysML Functional Requirements

    DTIC Science & Technology

    2014-12-23

    Formal Verification of Complex Systems based on SysML Functional Requirements. Hoda Mehrpouyan, Irem Y. Tumer, Chris Hoyle, Dimitra Giannakopoulou... requirements for design of complex engineered systems. The proposed approach combines a SysML modeling approach to document and structure safety requirements... methods and tools to support the integration of safety into the design solution.

  14. Delayed Gamma-Ray Spectroscopy for Non-Destructive Assay of Nuclear Materials

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ludewigt, Bernhard; Mozin, Vladimir; Campbell, Luke

    2015-06-01

    High-energy, beta-delayed gamma-ray spectroscopy is a potential non-destructive assay technique for the independent verification of declared quantities of special nuclear materials at key stages of the fuel cycle and for directly assaying nuclear material inventories for spent fuel handling, interim storage, reprocessing facilities, repository sites, and final disposal. Other potential applications include determination of MOX fuel composition, characterization of nuclear waste packages, and challenges in homeland security and arms control verification. Experimental measurements were performed to evaluate fission fragment yields, to test methods for determining isotopic fractions, and to benchmark the modeling code package. Experimental measurement campaigns were carried out at the IAC using a photo-neutron source and at OSU using a thermal neutron beam from the TRIGA reactor to characterize the emission of high-energy delayed gamma rays from 235U, 239Pu, and 241Pu targets following neutron-induced fission. Data were collected for pure and combined targets for several irradiation/spectroscopy cycle times ranging from 10/10 seconds to 15/30 minutes. The delayed gamma-ray signature of 241Pu, a significant fissile constituent in spent fuel, was measured and compared to 239Pu. The 241Pu/239Pu ratios varied between 0.5 and 1.2 for ten prominent lines in the 2700-3600 keV energy range. Such significant differences in relative peak intensities make it possible to determine relative fractions of these isotopes in a mixed sample. A method for determining fission product yields by fitting the energy and time dependence of the delayed gamma-ray emission was developed and demonstrated on a limited 235U data set. De-convolution methods for determining fissile fractions were developed and tested on the experimental data. The use of high count-rate LaBr3 detectors was investigated as a potential alternative to HPGe detectors. Modeling capabilities were added to an existing framework and codes were adapted as needed for analyzing experiments and assessing application-specific assay concepts. A de-convolution analysis of the delayed gamma-ray response spectra modeled for spent fuel assemblies was performed using the same method that was applied to the experimental spectra.
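
    The de-convolution step mentioned above can be sketched as a nonnegative mixture problem: a measured spectrum is modeled as a weighted sum of per-isotope reference spectra and the weights are solved for. The sketch below uses synthetic stand-in spectra, not nuclear data.

        # Sketch: recover fissile fractions from a mixed spectrum via
        # nonnegative least squares.
        import numpy as np
        from scipy.optimize import nnls

        rng = np.random.default_rng(1)
        n_channels = 512

        # Columns: reference (basis) spectra, e.g. for 235U, 239Pu, 241Pu.
        basis = rng.random((n_channels, 3))

        true_fractions = np.array([0.6, 0.3, 0.1])
        measured = basis @ true_fractions + 0.01 * rng.random(n_channels)

        coeffs, _ = nnls(basis, measured)      # nonnegative least squares
        fractions = coeffs / coeffs.sum()      # normalize to fractions
        print(np.round(fractions, 3))          # approximately [0.6, 0.3, 0.1]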

  15. Rule groupings: An approach towards verification of expert systems

    NASA Technical Reports Server (NTRS)

    Mehrotra, Mala

    1991-01-01

    Knowledge-based expert systems are playing an increasingly important role in NASA space and aircraft systems. However, many of NASA's software applications are life- or mission-critical and knowledge-based systems do not lend themselves to the traditional verification and validation techniques for highly reliable software. Rule-based systems lack the control abstractions found in procedural languages. Hence, it is difficult to verify or maintain such systems. Our goal is to automatically structure a rule-based system into a set of rule-groups having a well-defined interface to other rule-groups. Once a rule base is decomposed into such 'firewalled' units, studying the interactions between rules would become more tractable. Verification-aid tools can then be developed to test the behavior of each such rule-group. Furthermore, the interactions between rule-groups can be studied in a manner similar to integration testing. Such efforts will go a long way towards increasing our confidence in the expert-system software. Our research efforts address the feasibility of automating the identification of rule groups, in order to decompose the rule base into a number of meaningful units.

  16. Spot scanning proton therapy plan assessment: design and development of a dose verification application for use in routine clinical practice

    NASA Astrophysics Data System (ADS)

    Augustine, Kurt E.; Walsh, Timothy J.; Beltran, Chris J.; Stoker, Joshua B.; Mundy, Daniel W.; Parry, Mark D.; Bues, Martin; Fatyga, Mirek

    2016-04-01

    The use of radiation therapy for the treatment of cancer has been carried out clinically since the late 1800s. Early on, however, it was discovered that a radiation dose sufficient to destroy cancer cells can also cause severe injury to surrounding healthy tissue. Radiation oncologists continually strive to find the perfect balance between a dose high enough to destroy the cancer and one that avoids damage to healthy organs. Spot scanning or "pencil beam" proton radiotherapy offers another option to improve on this. Unlike traditional photon therapy, proton beams stop in the target tissue, thus better sparing all organs beyond the targeted tumor. In addition, the beams are far narrower and thus can be more precisely "painted" onto the tumor, avoiding exposure of surrounding healthy tissue. To safely treat patients with proton beam radiotherapy, dose verification should be carried out for each plan prior to treatment. Proton dose verification systems are not currently commercially available, so the Department of Radiation Oncology at the Mayo Clinic developed its own, called DOSeCHECK, which offers two distinct dose simulation methods: GPU-based Monte Carlo and CPU-based analytical. The three major components of the system are the web-based user interface, the Linux-based dose verification simulation engines, and the supporting services and components. The architecture integrates multiple applications, libraries, platforms, programming languages, and communication protocols and was successfully deployed in time for Mayo Clinic's first proton beam therapy patient. Having a simple, efficient application for dose verification greatly reduces staff workload and provides additional quality assurance, ultimately improving patient safety.

  17. Telomere length and procedural justice predict stress reactivity responses to unfair outcomes in African Americans.

    PubMed

    Lucas, Todd; Pierce, Jennifer; Lumley, Mark A; Granger, Douglas A; Lin, Jue; Epel, Elissa S

    2017-12-01

    This experiment demonstrates that chromosomal telomere length (TL) moderates response to injustice among African Americans. Based on worldview verification theory - an emerging psychosocial framework for understanding stress - we predicted that acute stress responses would be most pronounced when individual-level expectancies for justice were discordant with justice experiences. Healthy African Americans (N = 118; 30% male; mean age = 31.63 years) provided dried blood spot samples that were assayed for TL, and completed a social-evaluative stressor task during which high versus low levels of distributive (outcome) and procedural (decision process) justice were simultaneously manipulated. African Americans with longer telomeres appeared more resilient (in emotional and neuroendocrine response: higher DHEAs:cortisol) to receiving an unfair outcome when a fair decision process was used, whereas African Americans with shorter telomeres appeared more resilient when an unfair decision process was used. TL may indicate personal histories of adversity and associated stress-related expectancies that influence responses to injustice. Copyright © 2017 Elsevier Ltd. All rights reserved.

  18. Advancing the sensitivity of selected reaction monitoring-based targeted quantitative proteomics

    PubMed Central

    Shi, Tujin; Su, Dian; Liu, Tao; Tang, Keqi; Camp, David G.; Qian, Wei-Jun; Smith, Richard D.

    2012-01-01

    Selected reaction monitoring (SRM)—also known as multiple reaction monitoring (MRM)—has emerged as a promising high-throughput targeted protein quantification technology for candidate biomarker verification and systems biology applications. A major bottleneck for current SRM technology, however, is insufficient sensitivity for detecting, for example, low-abundance biomarkers likely present at the low ng/mL to pg/mL range in human blood plasma or serum, or extremely low-abundance signaling proteins in cells or tissues. Herein we review recent advances in methods and technologies, including front-end immunoaffinity depletion, fractionation, selective enrichment of target proteins/peptides including posttranslational modifications (PTMs), as well as advances in MS instrumentation, which have significantly enhanced the overall sensitivity of SRM assays and enabled the detection of low-abundance proteins at low to sub-ng/mL levels in human blood plasma or serum. General perspectives on the potential of achieving sufficient sensitivity for detection of pg/mL level proteins in plasma are also discussed. PMID:22577010

  19. Advancing the sensitivity of selected reaction monitoring-based targeted quantitative proteomics

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Shi, Tujin; Su, Dian; Liu, Tao

    2012-04-01

    Selected reaction monitoring (SRM)—also known as multiple reaction monitoring (MRM)—has emerged as a promising high-throughput targeted protein quantification technology for candidate biomarker verification and systems biology applications. A major bottleneck for current SRM technology, however, is insufficient sensitivity for detecting, for example, low-abundance biomarkers likely present at the pg/mL to low ng/mL range in human blood plasma or serum, or extremely low-abundance signaling proteins in cells or tissues. Herein we review recent advances in methods and technologies, including front-end immunoaffinity depletion, fractionation, selective enrichment of target proteins/peptides or their posttranslational modifications (PTMs), as well as advances in MS instrumentation, which have significantly enhanced the overall sensitivity of SRM assays and enabled the detection of low-abundance proteins at low to sub-ng/mL levels in human blood plasma or serum. General perspectives on the potential of achieving sufficient sensitivity for detection of pg/mL level proteins in plasma are also discussed.

  20. SU-F-T-268: A Feasibility Study of Independent Dose Verification for Vero4DRT

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Yamashita, M; Kokubo, M; Institute of Biomedical Research and Innovation, Kobe, Hyogo

    2016-06-15

    Purpose: Vero4DRT (Mitsubishi Heavy Industries Ltd.) has been released for a few years. Vero4DRT uses a dedicated treatment planning system (TPS), so measurement has been the only available method of dose verification. There have been no reports of independent dose verification using a Clarkson-based algorithm for Vero4DRT. An independent dose verification software program developed for general-purpose linacs, using a modified Clarkson-based algorithm, was adapted for Vero4DRT. In this study, we evaluated the accuracy of the independent dose verification program and the feasibility of the secondary check for Vero4DRT. Methods: iPlan (Brainlab AG) was used as the TPS. Pencil Beam Convolution was used as the dose calculation algorithm for IMRT, and X-ray Voxel Monte Carlo was used for the others. Simple MU Analysis (SMU, Triangle Products, Japan) was used as the independent dose verification software program, in which CT-based dose calculation was performed using a modified Clarkson-based algorithm. In this study, 120 patients' treatment plans were collected in our institute. The treatments were performed using conventional irradiation for lung and prostate, SBRT for lung, and step-and-shoot IMRT for prostate. Doses from the TPS and the SMU were compared, and confidence limits (CLs, mean ± 2SD %) were compared to those from a general-purpose linac. Results: As the results of the CLs, the conventional irradiation (lung, prostate), SBRT (lung) and IMRT (prostate) show 2.2 ± 3.5% (CL of the general-purpose linac: 2.4 ± 5.3%), 1.1 ± 1.7% (−0.3 ± 2.0%), 4.8 ± 3.7% (5.4 ± 5.3%) and −0.5 ± 2.5% (−0.1 ± 3.6%), respectively. The CLs for Vero4DRT show similar results to those for the general-purpose linac. Conclusion: The independent dose verification for the new linac is clinically available as a secondary check, and we performed the check with a tolerance level similar to that of the general-purpose linac. This research is partially supported by the Japan Agency for Medical Research and Development (AMED).
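
    The confidence limits quoted above follow the common mean ± 2SD convention over the per-plan percentage dose differences between the independent check and the TPS; a minimal sketch with invented numbers:

        # Sketch: confidence limits (mean ± 2SD) for secondary-check differences.
        import numpy as np

        # Percent dose differences (independent check vs. TPS) for one
        # technique; the values are illustrative, not the study's data.
        diffs = np.array([1.8, 2.5, 3.1, 0.9, 2.2, 2.7, 1.5, 2.9])

        mean, sd = diffs.mean(), diffs.std(ddof=1)
        print(f"CL = {mean:.1f} ± {2 * sd:.1f} %")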

  1. Knowledge based system verification and validation as related to automation of space station subsystems: Rationale for a knowledge based system lifecycle

    NASA Technical Reports Server (NTRS)

    Richardson, Keith; Wong, Carla

    1988-01-01

    The role of verification and validation (V and V) in software has been to support and strengthen the software lifecycle and to ensure that the resultant code meets the standards of the requirements documents. Knowledge Based System (KBS) V and V should serve the same role, but the KBS lifecycle is ill-defined. The rationale of a simple form of the KBS lifecycle is presented, including accommodation to certain critical KBS differences from software development.

  2. A formal approach to validation and verification for knowledge-based control systems

    NASA Technical Reports Server (NTRS)

    Castore, Glen

    1987-01-01

    As control systems become more complex in response to desires for greater system flexibility, performance and reliability, the promise is held out that artificial intelligence might provide the means for building such systems. An obstacle to the use of symbolic processing constructs in this domain is the need for verification and validation (V and V) of the systems. Techniques currently in use do not seem appropriate for knowledge-based software. An outline of a formal approach to V and V for knowledge-based control systems is presented.

  3. Options and Risk for Qualification of Electric Propulsion System

    NASA Technical Reports Server (NTRS)

    Bailey, Michelle; Daniel, Charles; Cook, Steve (Technical Monitor)

    2002-01-01

    Electric propulsion vehicle systems encompass a wide range of propulsion alternatives, including solar and nuclear, which present unique circumstances for qualification. This paper will address the alternatives for qualification of electric propulsion spacecraft systems. The approach taken will be to address the considerations for qualification at the various levels of systems definition. Additionally, for each level of qualification the system-level risk implications will be developed. The paper will also explore the implications of analysis versus test for various levels of systems definition, while retaining the objectives of a verification program. The limitations of terrestrial testing will be explored along with the risk and implications of orbital demonstration testing. The paper will seek to develop a template for structuring a verification program based on cost, risk, and value return. A successful verification program should establish controls and define objectives of the verification compliance program. Finally, the paper will seek to address the political and programmatic factors which may impact options for system verification.

  4. [Uniqueness seeking behavior as a self-verification: an alternative approach to the study of uniqueness].

    PubMed

    Yamaoka, S

    1995-06-01

    Uniqueness theory explains that extremely high perceived similarity between self and others evokes negative emotional reactions and causes uniqueness-seeking behavior. However, the theory conceptualizes similarity so ambiguously that it appears to suffer from low predictive validity. The purpose of the current article is to propose an alternative explanation of uniqueness-seeking behavior. It posits that perceived uniqueness deprivation is a threat to self-concepts, and therefore causes self-verification behavior. Two levels of self-verification are conceived: one based on personal categorization and the other on social categorization. The present approach regards uniqueness-seeking behavior as personal-level self-verification. To test these propositions, a 2 (very high or moderate similarity information) x 2 (with or without outgroup information) x 2 (high or low need for uniqueness) between-subjects factorial-design experiment was conducted with 95 university students. Results supported the self-verification approach, and were discussed in terms of effects of uniqueness deprivation, levels of self-categorization, and individual differences in need for uniqueness.

  5. WRAP-RIB antenna technology development

    NASA Technical Reports Server (NTRS)

    Freeland, R. E.; Garcia, N. F.; Iwamoto, H.

    1985-01-01

    The wrap-rib deployable antenna concept development is based on a combination of hardware development and testing along with extensive supporting analysis. The proof-of-concept hardware models are large enough to address the same basic problems of design, fabrication, assembly, and test as the full-scale systems, which were sized at 100 meters at the beginning of the program. The hardware evaluation program consists of functional performance tests, design verification tests, and analytical model verification tests. Functional testing consists of kinematic deployment, mesh management, and verification of mechanical packaging efficiencies. Design verification consists of rib contour precision measurement, rib cross-section variation evaluation, rib materials characterization, and manufacturing imperfections assessment. Analytical model verification and refinement include mesh stiffness measurement, rib static and dynamic testing, mass measurement, and rib cross-section characterization. This concept was considered for a number of potential applications, including mobile communications, VLBI, and aircraft surveillance. In fact, baseline system configurations were developed by JPL, using the appropriate wrap-rib antenna, for all three classes of applications.

  6. A Secure Framework for Location Verification in Pervasive Computing

    NASA Astrophysics Data System (ADS)

    Liu, Dawei; Lee, Moon-Chuen; Wu, Dan

    The pervasive computing paradigm has changed the way people use computing devices. For example, a person can use a mobile device to obtain its location information at any time and anywhere. Several security issues arise concerning whether this information is reliable in a pervasive environment. For example, a malicious user may disable the localization system by broadcasting a forged location, or may impersonate other users by eavesdropping on their locations. In this paper, we address the verification of location information in a secure manner. We first present the design challenges for location verification, and then propose a two-layer framework, VerPer, for secure location verification in a pervasive computing environment. Real-world GPS-based wireless sensor network experiments confirm the effectiveness of the proposed framework.

  7. Land Ice Verification and Validation Kit

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    2015-07-15

    To address a pressing need to better understand the behavior and complex interaction of ice sheets within the global Earth system, significant development of continental-scale, dynamical ice-sheet models is underway. The associated verification and validation process of these models is being coordinated through a new, robust, python-based extensible software package, the Land Ice Verification and Validation toolkit (LIVV). This release provides robust and automated verification and a performance evaluation on LCF platforms. The performance V&V involves a comprehensive comparison of model performance relative to expected behavior on a given computing platform. LIVV operates on a set of benchmark and test data, and provides comparisons for a suite of community prioritized tests, including configuration and parameter variations, bit-for-bit evaluation, and plots of tests where differences occur.

  8. Specification, Synthesis, and Verification of Software-based Control Protocols for Fault-Tolerant Space Systems

    DTIC Science & Technology

    2016-08-16

    AFRL-RV-PS-TR-2016-0112, Air Force Research Laboratory, Space Vehicles Directorate (AFRL/RVSV), Kirtland AFB, NM 87117-5776.

  9. Formal Verification at System Level

    NASA Astrophysics Data System (ADS)

    Mazzini, S.; Puri, S.; Mari, F.; Melatti, I.; Tronci, E.

    2009-05-01

    System Level Analysis calls for a language comprehensible to experts with different backgrounds and yet precise enough to support meaningful analyses. SysML is emerging as an effective balance between such conflicting goals. In this paper we outline some of the results on SysML-based system-level functional formal verification obtained by an ESA/ESTEC study, carried out as a collaboration between INTECS and La Sapienza University of Roma. The study focuses on SysML-based system-level functional requirements techniques.

  10. Pilot Guidelines for Improving Instructional Materials Through the Process of Learner Verification and Revision.

    ERIC Educational Resources Information Center

    Educational Products Information Exchange Inst., Stony Brook, NY.

    Learner Verification and Revision (LVR) Process of Instructional Materials is an ongoing effort for the improvement of instructional materials based on systematic feedback from learners who have used the materials. This evaluation gives publishers a method of identifying instructional strengths and weaknesses of a product and provides an…

  11. ANDalyze Lead 100 Test Kit and AND1000 Fluorimeter Environmental Technology Verification Report and Statement

    EPA Science Inventory

    This report provides results for the verification testing of the Lead100/AND1000. The following is a description of the technology based on information provided by the vendor. The information provided below was not verified in this test. The ANDalyze Lead100/AND1000 was des...

  12. Integrating Model-Based Verification into Software Design Education

    ERIC Educational Resources Information Center

    Yilmaz, Levent; Wang, Shuo

    2005-01-01

    Proper design analysis is indispensable to assure quality and reduce emergent costs due to faulty software. Teaching proper design verification skills early during pedagogical development is crucial, as such analysis is the only tractable way of resolving software problems early when they are easy to fix. The premise of the presented strategy is…

  13. Verification in Referral-Based Crowdsourcing

    PubMed Central

    Naroditskiy, Victor; Rahwan, Iyad; Cebrian, Manuel; Jennings, Nicholas R.

    2012-01-01

    Online social networks offer unprecedented potential for rallying a large number of people to accomplish a given task. Here we focus on information gathering tasks where rare information is sought through “referral-based crowdsourcing”: the information request is propagated recursively through invitations among members of a social network. Whereas previous work analyzed incentives for the referral process in a setting with only correct reports, misreporting is known to be both pervasive in crowdsourcing applications, and difficult/costly to filter out. A motivating example for our work is the DARPA Red Balloon Challenge where the level of misreporting was very high. In order to undertake a formal study of verification, we introduce a model where agents can exert costly effort to perform verification and false reports can be penalized. This is the first model of verification and it provides many directions for future research, which we point out. Our main theoretical result is the compensation scheme that minimizes the cost of retrieving the correct answer. Notably, this optimal compensation scheme coincides with the winning strategy of the Red Balloon Challenge. PMID:23071530

  14. A Verification-Driven Approach to Traceability and Documentation for Auto-Generated Mathematical Software

    NASA Technical Reports Server (NTRS)

    Denney, Ewen W.; Fischer, Bernd

    2009-01-01

    Model-based development and automated code generation are increasingly used for production code in safety-critical applications, but since code generators are typically not qualified, the generated code must still be fully tested, reviewed, and certified. This is particularly arduous for mathematical and control engineering software which requires reviewers to trace subtle details of textbook formulas and algorithms to the code, and to match requirements (e.g., physical units or coordinate frames) not represented explicitly in models or code. Both tasks are complicated by the often opaque nature of auto-generated code. We address these problems by developing a verification-driven approach to traceability and documentation. We apply the AUTOCERT verification system to identify and then verify mathematical concepts in the code, based on a mathematical domain theory, and then use these verified traceability links between concepts, code, and verification conditions to construct a natural language report that provides a high-level structured argument explaining why and how the code uses the assumptions and complies with the requirements. We have applied our approach to generate review documents for several sub-systems of NASA's Project Constellation.

  15. A comparative verification of high resolution precipitation forecasts using model output statistics

    NASA Astrophysics Data System (ADS)

    van der Plas, Emiel; Schmeits, Maurice; Hooijman, Nicolien; Kok, Kees

    2017-04-01

    Verification of localized events such as precipitation has become even more challenging with the advent of high-resolution meso-scale numerical weather prediction (NWP). The realism of a forecast suggests that it should compare well against precipitation radar imagery with similar resolution, both spatially and temporally. Spatial verification methods solve some of the representativity issues that point verification gives rise to. In this study a verification strategy based on model output statistics is applied that aims to address both the double penalty and resolution effects that are inherent to comparisons of NWP models with different resolutions. Using predictors based on spatial precipitation patterns around a set of stations, an extended logistic regression (ELR) equation is deduced, leading to a probability forecast distribution of precipitation for each NWP model, analysis and lead time. The ELR equations are derived for predictands based on areal calibrated radar precipitation and SYNOP observations. The aim is to extract maximum information from a series of precipitation forecasts, as a trained forecaster would. The method is applied to the non-hydrostatic model Harmonie (2.5 km resolution), Hirlam (11 km resolution) and the ECMWF model (16 km resolution), overall yielding similar Brier skill scores for the 3 post-processed models, but larger differences for individual lead times. In addition, the Fractions Skill Score is computed using the 3 deterministic forecasts, showing somewhat better skill for the Harmonie model. In other words, despite the realism of Harmonie precipitation forecasts, they perform only similarly to or somewhat better than precipitation forecasts from the 2 lower-resolution models, at least in the Netherlands.
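
    Schematically, extended logistic regression fits one equation across all precipitation thresholds q by including the threshold itself as a predictor, so a single fitted model yields a full forecast probability distribution (a generic form, not necessarily the study's exact specification):

        \ln\!\left(\frac{P(y \le q)}{1 - P(y \le q)}\right) \;=\; f(q) + \boldsymbol{\beta}^{\mathsf{T}}\mathbf{x}

    where x collects the predictors derived from the spatial precipitation pattern around a station and f(q) is a simple monotone function of the threshold.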

  16. Age verification cards fail to fully prevent minors from accessing tobacco products.

    PubMed

    Kanda, Hideyuki; Osaki, Yoneatsu; Ohida, Takashi; Kaneita, Yoshitaka; Munezawa, Takeshi

    2011-03-01

    Proper age verification can prevent minors from accessing tobacco products. For this reason, electronic locking devices based on a proof-of-age system utilising cards were installed in almost every tobacco vending machine across Japan and Germany to restrict sales to minors. We aimed to clarify the associations between the amount smoked by high school students and the usage of age verification cards by conducting a nationwide cross-sectional survey of students in Japan. This survey was conducted in 2008. We asked high school students, aged 13-18 years, in Japan about their smoking behaviour, where they purchase cigarettes, whether they had used age verification cards, and if so, how they obtained the card. As the amount smoked increased, the prevalence of purchasing cigarettes from vending machines also rose for both males and females. The percentage of those with experience of using an age verification card was also higher among those who smoked more. Someone outside the family was the most common source of cards. Surprisingly, around 5% of males and females belonging to the group with the highest smoking levels applied for cards themselves. Age verification cards cannot fully prevent minors from accessing tobacco products. These findings suggest that a total ban of tobacco vending machines, not an age verification system, is needed to prevent sales to minors.

  17. Formal verification of a set of memory management units

    NASA Technical Reports Server (NTRS)

    Schubert, E. Thomas; Levitt, K.; Cohen, Gerald C.

    1992-01-01

    This document describes the verification of a set of memory management units (MMU). The verification effort demonstrates the use of hierarchical decomposition and abstract theories. The MMUs can be organized into a complexity hierarchy. Each new level in the hierarchy adds a few significant features or modifications to the lower level MMU. The units described include: (1) a page check translation look-aside module (TLM); (2) a page check TLM with supervisor line; (3) a base bounds MMU; (4) a virtual address translation MMU; and (5) a virtual address translation MMU with memory resident segment table.

  18. Towards Verification and Validation for Increased Autonomy

    NASA Technical Reports Server (NTRS)

    Giannakopoulou, Dimitra

    2017-01-01

    This presentation covers the work we have performed over the last few years on verification and validation of the next-generation onboard collision avoidance system, ACAS X, for commercial aircraft. It describes our work on probabilistic verification and synthesis of the model that ACAS X is based on, and goes on to the validation of that model with respect to actual simulation and flight data. The presentation then identifies the characteristics of ACAS X that are related to autonomy and discusses the challenges that autonomy poses for V&V. All work presented has already been published.

  19. EOS-AM precision pointing verification

    NASA Technical Reports Server (NTRS)

    Throckmorton, A.; Braknis, E.; Bolek, J.

    1993-01-01

    The Earth Observing System (EOS) AM mission requires tight pointing knowledge to meet scientific objectives, in a spacecraft with low-frequency flexible appendage modes. As the spacecraft controller reacts to various disturbance sources and as the inherent appendage modes are excited by this control action, verification of precision pointing knowledge becomes particularly challenging for the EOS-AM mission. As presently conceived, this verification includes a complementary set of multi-disciplinary analyses, hardware tests, and real-time computer-in-the-loop simulations, followed by collection and analysis of hardware test and flight data, and is supported by a comprehensive database repository for validated program values.

  20. General Environmental Verification Specification

    NASA Technical Reports Server (NTRS)

    Milne, J. Scott, Jr.; Kaufman, Daniel S.

    2003-01-01

    The NASA Goddard Space Flight Center's General Environmental Verification Specification (GEVS) for STS and ELV Payloads, Subsystems, and Components is currently being revised based on lessons learned from GSFC engineering and flight assurance. The GEVS has been used by Goddard flight projects for the past 17 years as a baseline from which to tailor their environmental test programs. A summary of the requirements and updates is presented along with the rationale behind the changes. The major test areas covered by the GEVS include mechanical, thermal, and EMC, as well as more general requirements for planning and tracking of the verification programs.

  1. On verifying a high-level design. [cost and error analysis]

    NASA Technical Reports Server (NTRS)

    Mathew, Ben; Wehbeh, Jalal A.; Saab, Daniel G.

    1993-01-01

    An overview of design verification techniques is presented, and some of the current research in high-level design verification is described. Formal hardware description languages that are capable of adequately expressing the design specifications have been developed, but some time will be required before they can have the expressive power needed to be used in real applications. Simulation-based approaches are more useful in finding errors in designs than they are in proving the correctness of a certain design. Hybrid approaches that combine simulation with other formal design verification techniques are argued to be the most promising over the short term.

  2. Integrated Disposal Facility FY 2016: ILAW Verification and Validation of the eSTOMP Simulator

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Freedman, Vicky L.; Bacon, Diana H.; Fang, Yilin

    2016-05-13

    This document describes two sets of simulations carried out to further verify and validate the eSTOMP simulator. In this report, a distinction is made between verification and validation, and the focus is on verifying eSTOMP through a series of published benchmarks on cementitious wastes, and validating eSTOMP based on a lysimeter experiment for the glassified waste. These activities are carried out within the context of a scientific view of validation that asserts that models can only be invalidated, and that model validation (and verification) is a subjective assessment.

  3. Limitations in learning: How treatment verifications fail and what to do about it?

    PubMed

    Richardson, Susan; Thomadsen, Bruce

    The purposes of this study were: to provide dialog on why classic incident learning systems have been insufficient for patient safety improvements, to discuss failures in treatment verification, and to provide context for the reasons and lessons that can be learned from these failures. Historically, incident learning in brachytherapy is performed via database mining, which might include reading of event reports and incidents followed by incorporating verification procedures to prevent similar incidents. A description of both classic event reporting databases and current incident learning and reporting systems is given. Real examples of treatment failures based on firsthand knowledge are presented and analyzed, outlining potential pitfalls and problems, to evaluate the effectiveness of verification. Databases and incident learning systems can be limited in value and fail to provide enough detail for physicists seeking process improvement. Four examples of treatment verification failures experienced firsthand by experienced brachytherapy physicists are described. These include both underverification and oververification of various treatment processes. Database mining is an insufficient method to effect substantial improvements in the practice of brachytherapy. New incident learning systems are still immature and being tested. Instead, a new method of shared learning and implementation of changes must be created. Copyright © 2017 American Brachytherapy Society. Published by Elsevier Inc. All rights reserved.

  4. A Lightweight RFID Mutual Authentication Protocol Based on Physical Unclonable Function.

    PubMed

    Xu, He; Ding, Jie; Li, Peng; Zhu, Feng; Wang, Ruchuan

    2018-03-02

    With the fast development of the Internet of Things, Radio Frequency Identification (RFID) has been widely applied in many areas. Nevertheless, while the RFID technology provides convenience, its security problems are also gradually being exposed. In particular, the appearance of a large number of fake and counterfeit goods has caused massive losses for both producers and customers, for which the clone tag is a serious security threat. If attackers acquire the complete information of a tag, they can obtain the unique identifier of the tag by some technological means. In general, because there is no extra identifier on a tag, it is difficult to distinguish an original tag from its clone. Once the legal tag data is obtained, attackers are able to clone the tag. Therefore, this paper presents an efficient RFID mutual verification protocol. This protocol is based on the Physical Unclonable Function (PUF) and lightweight cryptography to achieve efficient verification of a single tag. The protocol includes three processes: tag recognition, mutual verification, and update. In tag recognition, the reader recognizes the tag; in mutual verification, the reader and tag verify each other's authenticity; the update step maintains the latest secret key for subsequent verification. Analysis results show that this protocol achieves a good balance between performance and security.
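
    The three-step flow described above can be sketched with a keyed hash standing in for the physical PUF; everything below is an illustrative toy, not the paper's protocol.

        # Sketch: PUF-style tag recognition, mutual verification, and update.
        import hmac, hashlib, os

        def puf(device_key: bytes, challenge: bytes) -> bytes:
            """Stand-in for a physical unclonable function (keyed hash)."""
            return hmac.new(device_key, challenge, hashlib.sha256).digest()

        # Enrollment: the server stores a challenge/response pair per tag.
        device_key = os.urandom(16)        # models the tag's physical structure
        challenge = os.urandom(16)
        server_db = {b"tag-001": (challenge, puf(device_key, challenge))}

        # 1) Tag recognition: the reader looks up the tag and sends its
        #    stored challenge together with a fresh nonce.
        tag_id, nonce = b"tag-001", os.urandom(16)
        chal, expected = server_db[tag_id]

        # 2) Mutual verification: the tag proves knowledge of its PUF response,
        #    and the reader proves knowledge of the stored response.
        tag_proof = hmac.new(puf(device_key, chal), b"tag" + nonce, hashlib.sha256).digest()
        reader_proof = hmac.new(expected, b"reader" + nonce, hashlib.sha256).digest()
        assert hmac.compare_digest(
            tag_proof, hmac.new(expected, b"tag" + nonce, hashlib.sha256).digest())
        assert hmac.compare_digest(
            reader_proof, hmac.new(puf(device_key, chal), b"reader" + nonce, hashlib.sha256).digest())

        # 3) Update: retire the used pair and store a fresh one for next time.
        new_chal = os.urandom(16)
        server_db[tag_id] = (new_chal, puf(device_key, new_chal))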

  5. A Lightweight RFID Mutual Authentication Protocol Based on Physical Unclonable Function

    PubMed Central

    Ding, Jie; Zhu, Feng; Wang, Ruchuan

    2018-01-01

    With the fast development of the Internet of Things, Radio Frequency Identification (RFID) has been widely applied in many areas. Nevertheless, while the RFID technology provides convenience, its security problems are also gradually being exposed. In particular, the appearance of a large number of fake and counterfeit goods has caused massive losses for both producers and customers, for which the clone tag is a serious security threat. If attackers acquire the complete information of a tag, they can obtain the unique identifier of the tag by some technological means. In general, because there is no extra identifier on a tag, it is difficult to distinguish an original tag from its clone. Once the legal tag data is obtained, attackers are able to clone the tag. Therefore, this paper presents an efficient RFID mutual verification protocol. This protocol is based on the Physical Unclonable Function (PUF) and lightweight cryptography to achieve efficient verification of a single tag. The protocol includes three processes: tag recognition, mutual verification, and update. In tag recognition, the reader recognizes the tag; in mutual verification, the reader and tag verify each other's authenticity; the update step maintains the latest secret key for subsequent verification. Analysis results show that this protocol achieves a good balance between performance and security. PMID:29498684

  6. LIVVkit: An extensible, python-based, land ice verification and validation toolkit for ice sheet models

    NASA Astrophysics Data System (ADS)

    Kennedy, Joseph H.; Bennett, Andrew R.; Evans, Katherine J.; Price, Stephen; Hoffman, Matthew; Lipscomb, William H.; Fyke, Jeremy; Vargo, Lauren; Boghozian, Adrianna; Norman, Matthew; Worley, Patrick H.

    2017-06-01

    To address the pressing need to better understand the behavior and complex interaction of ice sheets within the global Earth system, significant development of continental-scale, dynamical ice sheet models is underway. Concurrent to the development of the Community Ice Sheet Model (CISM), the corresponding verification and validation (V&V) process is being coordinated through a new, robust, Python-based extensible software package, the Land Ice Verification and Validation toolkit (LIVVkit). Incorporated into the typical ice sheet model development cycle, it provides robust and automated numerical verification, software verification, performance validation, and physical validation analyses on a variety of platforms, from personal laptops to the largest supercomputers. LIVVkit operates on sets of regression test and reference data sets, and provides comparisons for a suite of community prioritized tests, including configuration and parameter variations, bit-for-bit evaluation, and plots of model variables to indicate where differences occur. LIVVkit also provides an easily extensible framework to incorporate and analyze results of new intercomparison projects, new observation data, and new computing platforms. LIVVkit is designed for quick adaptation to additional ice sheet models via abstraction of model specific code, functions, and configurations into an ice sheet model description bundle outside the main LIVVkit structure. Ultimately, through shareable and accessible analysis output, LIVVkit is intended to help developers build confidence in their models and enhance the credibility of ice sheet models overall.

  7. A calibration method for patient specific IMRT QA using a single therapy verification film

    PubMed Central

    Shukla, Arvind Kumar; Oinam, Arun S.; Kumar, Sanjeev; Sandhu, I.S.; Sharma, S.C.

    2013-01-01

    Aim The aim of the present study is to develop and verify a single-film calibration procedure for use in intensity-modulated radiation therapy (IMRT) quality assurance. Background Radiographic films have been regularly used in routine commissioning of treatment modalities and verification of treatment planning systems (TPS). Film-based radiation dosimetry can provide absolute two-dimensional dose distributions and is well suited to IMRT quality assurance; a single therapy verification film offers a quick and reliable method for IMRT verification. Materials and methods A single extended dose range (EDR 2) film was used to generate the sensitometric curve of film optical density versus radiation dose. The EDR 2 film was exposed with nine 6 cm × 6 cm fields of a 6 MV photon beam from a medical linear accelerator at 5-cm depth in a solid water phantom. The nine regions of the single film were exposed with radiation doses ranging from 10 to 362 cGy. The actual dose measurements inside the field regions were performed using a 0.6 cm3 ionization chamber. The exposed film was processed after irradiation and digitized using a VIDAR film scanner, and the optical density was recorded for each region. Ten IMRT plans of head and neck carcinoma were delivered using a dynamic IMRT technique and evaluated against the TPS-calculated dose distributions using the gamma index method. Results A sensitometric curve was generated using a single film exposed in nine field regions to enable quantitative dose verification of IMRT treatments. The radiation scatter factor was observed to decrease exponentially with increasing distance from the centre of each field region. The IMRT plans verified against the calibration curve using the gamma index method were found to be within the acceptance criteria. Conclusion The single-film method proved superior to the traditional calibration method and produces fast daily film calibration for highly accurate IMRT verification. PMID:24416558
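
    A sensitometric curve of this kind is just a fitted mapping from optical density to dose that is then inverted when reading films; the sketch below uses invented (dose, OD) pairs and an assumed low-order polynomial model.

        # Sketch: build and invert a sensitometric (OD vs. dose) curve.
        import numpy as np

        dose = np.array([10, 35, 70, 110, 150, 200, 250, 300, 362])  # cGy (chamber)
        od = np.array([0.12, 0.30, 0.52, 0.74, 0.92, 1.14, 1.33, 1.50, 1.68])

        # Fit a low-order polynomial mapping OD -> dose; the model and its
        # order are assumptions to be validated before clinical use.
        coeffs = np.polyfit(od, dose, deg=3)

        def od_to_dose(optical_density):
            return np.polyval(coeffs, optical_density)

        print(round(float(od_to_dose(1.0)), 1))  # dose estimate (cGy) for OD = 1.0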

  8. SU-E-T-762: Toward Volume-Based Independent Dose Verification as Secondary Check

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Tachibana, H; Tachibana, R

    2015-06-15

    Purpose: Lung SBRT planning has shifted to volume-based prescription. However, point dose agreement is still verified at the secondary check using independent dose verification. Volume dose verification is more strongly affected by inhomogeneity correction than the point dose verification currently used as the check. A feasibility study of volume dose verification was conducted for lung SBRT plans. Methods: Six SBRT plans were collected in our institute. Two dose distributions, with and without inhomogeneity correction, were generated using Adaptive Convolve (AC) in Pinnacle3. Simple MU Analysis (SMU, Triangle Product, Ishikawa, JP) was used as the independent dose verification software program, in which a modified Clarkson-based algorithm was implemented and the radiological path length was computed from CT images independently of the treatment planning system. The agreement in point dose and mean dose between the AC with/without the correction and the SMU was assessed. Results: In the point dose evaluation at the center of the GTV, the difference showed a systematic shift (4.5 ± 1.9%) in comparison with the AC with inhomogeneity correction; on the other hand, there was good agreement of 0.2 ± 0.9% between the SMU and the AC without the correction. In the volume evaluation, there were significant differences in mean dose not only for the PTV (14.2 ± 5.1%) but also for the GTV (8.0 ± 5.1%) compared to the AC with the correction. Without the correction, the SMU showed good agreement for the GTV (1.5 ± 0.9%) as well as the PTV (0.9 ± 1.0%). Conclusion: Volume evaluation for the secondary check may be possible in homogeneous regions. However, volumes including inhomogeneous media would produce larger discrepancies. The dose calculation algorithm for independent verification needs to be modified to take the inhomogeneity correction into account.

  9. SU-E-T-398: Feasibility of Automated Tools for Robustness Evaluation of Advanced Photon and Proton Techniques in Oropharyngeal Cancer

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Liu, H; Liang, X; Kalbasi, A

    2014-06-01

    Purpose: Advanced radiotherapy (RT) techniques such as proton pencil beam scanning (PBS) and photon-based volumetric modulated arc therapy (VMAT) have dosimetric advantages in the treatment of head and neck malignancies. However, anatomic or alignment changes during treatment may limit the robustness of PBS and VMAT plans. We assess the feasibility of automated deformable registration tools for robustness evaluation in adaptive PBS and VMAT RT of oropharyngeal cancer (OPC). Methods: We treated 10 patients with bilateral OPC with advanced RT techniques and obtained verification CT scans with physician-reviewed target and OAR contours. We generated 3 advanced RT plans for each patient: a proton PBS plan using 2 posterior oblique fields (2F), a proton PBS plan using an additional third low-anterior field (3F), and a photon VMAT plan using 2 arcs (Arc). For each of the planning techniques, we forward-calculated the initial (Ini) plans on the verification scans to create verification (V) plans. We extracted DVH indicators based on physician-generated contours for 2 target and 14 OAR structures to investigate the feasibility of two automated tools (contour propagation (CP) and dose deformation (DD)) as surrogates for routine clinical plan robustness evaluation. For each verification scan, we compared DVH indicators of V, CP and DD plans in a head-to-head fashion using Student's t-test. Results: We performed 39 verification scans; each patient underwent 3 to 6 verification scans. We found no differences in doses to target or OAR structures between V and CP, V and DD, and CP and DD plans across all patients (p > 0.05). Conclusions: Automated robustness evaluation tools, CP and DD, accurately predicted dose distributions of verification (V) plans using physician-generated contours. These tools may be further developed as a potential robustness screening tool in the workflow for adaptive treatment of OPC using advanced RT techniques, reducing the need for physician-generated contours.

  10. Development and verification of an agent-based model of opinion leadership.

    PubMed

    Anderson, Christine A; Titler, Marita G

    2014-09-27

    The use of opinion leaders is a strategy used to speed the process of translating research into practice. Much is still unknown about opinion leader attributes and activities and the context in which they are most effective. Agent-based modeling is a methodological tool that enables demonstration of the interactive and dynamic effects of individuals and their behaviors on other individuals in the environment. The purpose of this study was to develop and test an agent-based model of opinion leadership. The details of the design and verification of the model are presented. The agent-based model was developed by using a software development platform to translate an underlying conceptual model of opinion leadership into a computer model. Individual agent attributes (for example, motives and credibility) and behaviors (seeking or providing an opinion) were specified as variables in the model in the context of a fictitious patient care unit. The verification process was designed to test whether or not the agent-based model was capable of reproducing the conditions of the preliminary conceptual model. The verification methods included iterative programmatic testing ('debugging') and exploratory analysis of simulated data obtained from execution of the model. The simulation tests included a parameter sweep, in which the model input variables were adjusted systematically, followed by an individual time series experiment. Statistical analysis of model output for the 288 possible simulation scenarios in the parameter sweep revealed that the agent-based model performed consistently with the relationships posited in the underlying model. Nurse opinion leaders act on the strength of their beliefs and, as a result, become an opinion resource for their uncertain colleagues, depending on their perceived credibility. Over time, some nurses consistently act as this type of resource and have the potential to emerge as opinion leaders in a context where uncertainty exists. The development and testing of agent-based models is an iterative process. The opinion leader model presented here provides a basic structure for continued model development, ongoing verification, and the establishment of validation procedures, including empirical data collection.
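
    For a concrete sense of how agent attributes, opinion-seeking behavior, and a parameter sweep fit together in such a simulation, here is a minimal sketch; the attribute names, update rule, and sweep values are illustrative assumptions, not the authors' specification:

        import itertools, random

        class Nurse:
            def __init__(self, credibility, uncertainty):
                self.credibility = credibility    # how much peers trust this agent
                self.uncertainty = uncertainty    # propensity to seek an opinion
                self.provided = 0                 # count of opinions provided

        def step(agents):
            """One tick: uncertain agents seek advice from the most credible peer."""
            for agent in agents:
                if random.random() < agent.uncertainty:
                    adviser = max((a for a in agents if a is not agent),
                                  key=lambda a: a.credibility)
                    adviser.provided += 1
                    agent.uncertainty *= 0.95     # advice slightly reduces uncertainty

        def run(n_agents, mean_uncertainty, ticks=100, seed=0):
            random.seed(seed)
            agents = [Nurse(random.random(), mean_uncertainty) for _ in range(n_agents)]
            for _ in range(ticks):
                step(agents)
            return max(a.provided for a in agents)  # activity of the emergent leader

        # Parameter sweep: systematically vary the inputs, as in the verification study.
        for n, u in itertools.product([10, 20, 40], [0.2, 0.5, 0.8]):
            print(n, u, run(n, u))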

  11. An investigation of the effects of relevant samples and a comparison of verification versus discovery based lab design

    NASA Astrophysics Data System (ADS)

    Rieben, James C., Jr.

    This study focuses on the effects of relevance and lab design on student learning within the chemistry laboratory environment. A general chemistry conductivity of solutions experiment and an upper-level organic chemistry cellulose regeneration experiment were employed. In the conductivity experiment, the two main variables studied were the effect of relevant (or "real world") samples on student learning and a verification-based lab design versus a discovery-based lab design. With the cellulose regeneration experiment, the effect of a discovery-based lab design versus a verification-based lab design was the sole focus. Evaluation surveys consisting of six questions were used at three different times to assess student knowledge of experimental concepts. In the general chemistry laboratory portion of this study, four experimental variants were employed to investigate the effect of relevance and lab design on student learning. These variants consisted of a traditional (or verification) lab design, a traditional lab design using "real world" samples, a new lab design employing real world samples/situations using unknown samples, and the new lab design using real world samples/situations that were known to the student. Data used in this analysis were collected during the Fall 08, Winter 09, and Fall 09 terms. For the second part of this study, a cellulose regeneration experiment was employed to investigate the effects of lab design. A demonstration creating regenerated cellulose "rayon" was modified and converted into an efficient and low-waste experiment. In the first variant, students tested their products and verified a list of physical properties. In the second variant, students filled in a blank physical property chart with their own experimental results for the physical properties. Results from the conductivity experiment show significant student learning of the effects of concentration on conductivity and of how to use conductivity to differentiate solution types with the use of real world samples. In the organic chemistry experiment, results suggest that the discovery-based design improved student retention of the chain length differentiation by physical properties relative to the verification-based design.

  12. From MetroII to Metronomy, Designing Contract-based Function-Architecture Co-simulation Framework for Timing Verification of Cyber-Physical Systems

    DTIC Science & Technology

    2015-03-13

    A. Lee. "A Programming Model for Time-Synchronized Distributed Real-Time Systems". In: Proceedings of the Real Time and Embedded Technology and Applications Symposium. 2007, pp. 259-268.

  13. A Mechanism of Modeling and Verification for SaaS Customization Based on TLA

    NASA Astrophysics Data System (ADS)

    Luan, Shuai; Shi, Yuliang; Wang, Haiyang

    With the gradual maturation of SOA and the rapid development of the Internet, SaaS has become a popular software service model. The customization actions of SaaS are usually subject to internal and external dependency relationships. This paper first introduces a method for modeling the customization process based on the Temporal Logic of Actions, and then proposes a verification algorithm to ensure that each step in customization will not cause unpredictable effects on the system and will follow the related rules defined by the SaaS provider.

  14. ENVIRONMENTAL TECHNOLOGY VERIFICATION REPORT: UTC FUEL CELLS' PC25C POWER PLANT - GAS PROCESSING UNIT PERFORMANCE FOR ANAEROBIC DIGESTER GAS

    EPA Science Inventory

    Under EPA’s Environmental Technology Verification program, which provides objective and scientific third party analysis of new technology that can benefit the environment, a combined heat and power system based on the UTC Fuel Cell's PC25C Fuel Cell Power Plant was evaluated. The...

  15. "Expert" Verification of Classroom-Based Indicators of Teaching and Learning Effectiveness for Professional Renewable Certification.

    ERIC Educational Resources Information Center

    Naik, Nitin S.; And Others

    The results of a statewide content verification survey of "expert" educators are presented. The survey was designed to verify that indicators in the 1989-90 System for Teaching and Learning Assessment and Review (STAR) are reasonable expectations for beginning and/or experienced teachers (BETs) in Louisiana and provide professional endorsement at the…

  16. RELAP-7 Software Verification and Validation Plan: Requirements Traceability Matrix (RTM) Part 1 – Physics and numerical methods

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Choi, Yong Joon; Yoo, Jun Soo; Smith, Curtis Lee

    2015-09-01

    This INL plan comprehensively describes the Requirements Traceability Matrix (RTM) for the main physics and numerical methods of RELAP-7. The plan also describes the testing-based software verification and validation (SV&V) process, a set of specially designed software models used to test RELAP-7.

  17. 78 FR 69602 - Foreign Supplier Verification Programs for Importers of Food for Humans and Animals; Extension of...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-11-20

    ... "Current Good Manufacturing Practice and Hazard Analysis and Risk-Based Preventive Controls for Food for... DEPARTMENT OF HEALTH AND HUMAN SERVICES, Food and Drug Administration, 21 CFR Part 1 [Docket No. FDA-2011-N-0143], RIN 0910-AG64, Foreign Supplier Verification Programs for Importers of Food for Humans and...

  18. A New "Moodle" Module Supporting Automatic Verification of VHDL-Based Assignments

    ERIC Educational Resources Information Center

    Gutierrez, Eladio; Trenas, Maria A.; Ramos, Julian; Corbera, Francisco; Romero, Sergio

    2010-01-01

    This work describes a new "Moodle" module developed to give support to the practical content of a basic computer organization course. This module goes beyond the mere hosting of resources and assignments. It makes use of an automatic checking and verification engine that works on the VHDL designs submitted by the students. The module automatically…

  19. Integrated Formal Analysis of Timed-Triggered Ethernet

    NASA Technical Reports Server (NTRS)

    Dutertre, Bruno; Shankar, Natarajan; Owre, Sam

    2012-01-01

    We present new results related to the verification of the Timed-Triggered Ethernet (TTE) clock synchronization protocol. This work extends previous verification of TTE based on model checking. We identify a suboptimal design choice in a compression function used in clock synchronization, and propose an improvement. We compare the original design and the improved definition using the SAL model checker.

  20. Registration verification of SEA/AR fields. [Oregon, Texas, Montana, Nebraska, Washington, Colorado, Kansas, Oklahoma, and North Dakota

    NASA Technical Reports Server (NTRS)

    Austin, W. W.; Lautenschlager, L. (Principal Investigator)

    1981-01-01

    A method of field registration verification for 20 SEA/AR sites for the 1979 crop year is evaluated. Field delineations for the sites were entered into the database, and their registration was verified using single-channel gray-scale computer printout maps of LANDSAT data taken over the site.

  1. Description of a Website Resource for Turbulence Modeling Verification and Validation

    NASA Technical Reports Server (NTRS)

    Rumsey, Christopher L.; Smith, Brian R.; Huang, George P.

    2010-01-01

    The activities of the Turbulence Model Benchmarking Working Group - which is a subcommittee of the American Institute of Aeronautics and Astronautics (AIAA) Fluid Dynamics Technical Committee - are described. The group's main purpose is to establish a web-based repository for Reynolds-averaged Navier-Stokes turbulence model documentation, including verification and validation cases. This turbulence modeling resource has been established based on feedback from a survey on what is needed to achieve consistency and repeatability in turbulence model implementation and usage, and to document and disseminate information on new turbulence models or improvements to existing models. The various components of the website are described in detail: description of turbulence models, turbulence model readiness rating system, verification cases, validation cases, validation databases, and turbulence manufactured solutions. An outline of future plans of the working group is also provided.

  2. Fingerprint Identification Using SIFT-Based Minutia Descriptors and Improved All Descriptor-Pair Matching

    PubMed Central

    Zhou, Ru; Zhong, Dexing; Han, Jiuqiang

    2013-01-01

    The performance of conventional minutiae-based fingerprint authentication algorithms degrades significantly when dealing with low-quality fingerprints with many cuts or scratches. A similar degradation of the minutiae-based algorithms is observed when small overlapping areas appear because of the quite narrow width of the sensors. Based on the detection of minutiae, Scale Invariant Feature Transformation (SIFT) descriptors are employed to fulfill verification tasks in the above difficult scenarios. However, the original SIFT algorithm is not suitable for fingerprints because of: (1) the similar patterns of parallel ridges; and (2) high computational resource consumption. To enhance the efficiency and effectiveness of the algorithm for fingerprint verification, we propose a SIFT-based Minutia Descriptor (SMD) that improves the SIFT algorithm through image processing, descriptor extraction and matching. A two-step fast matcher, named improved All Descriptor-Pair Matching (iADM), is also proposed to implement 1:N verifications in real time. Fingerprint Identification using SMD and iADM (FISiA) achieved a significant improvement in accuracy in representative databases compared with the conventional minutiae-based method. The speed of FISiA also can meet real-time requirements. PMID:23467056
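
    As background to the descriptor-pair matching step, the following is a generic brute-force matcher with a mutual-nearest-neighbour check; it is a baseline sketch, not the iADM algorithm, and the distance threshold is an assumption:

        import numpy as np

        def match_descriptors(desc_a, desc_b, max_dist=0.7):
            """Match two sets of L2-normalised descriptors (one per row) by
            keeping mutual nearest neighbours closer than `max_dist`."""
            # Pairwise Euclidean distance matrix, shape (len(a), len(b)).
            d = np.linalg.norm(desc_a[:, None, :] - desc_b[None, :, :], axis=2)
            nn_ab = d.argmin(axis=1)          # best b for each a
            nn_ba = d.argmin(axis=0)          # best a for each b
            return [(i, j) for i, j in enumerate(nn_ab)
                    if nn_ba[j] == i and d[i, j] < max_dist]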

  3. A new approach to hand-based authentication

    NASA Astrophysics Data System (ADS)

    Amayeh, G.; Bebis, G.; Erol, A.; Nicolescu, M.

    2007-04-01

    Hand-based authentication is a key biometric technology with a wide range of potential applications both in industry and government. Traditionally, hand-based authentication is performed by extracting information from the whole hand. To account for hand and finger motion, guidance pegs are employed to fix the position and orientation of the hand. In this paper, we consider a component-based approach to hand-based verification. Our objective is to investigate the discrimination power of different parts of the hand in order to develop a simpler, faster, and possibly more accurate and robust verification system. Specifically, we propose a new approach which decomposes the hand into different regions, corresponding to the fingers and the back of the palm, and performs verification using information from certain parts of the hand only. Our approach operates on 2D images acquired by placing the hand on a flat lighting table. Using a part-based representation of the hand allows the system to compensate for hand and finger motion without using any guidance pegs. To decompose the hand into different regions, we use a robust methodology based on morphological operators which does not require detecting any landmark points on the hand. To capture the geometry of the back of the palm and the fingers in sufficient detail, we employ high-order Zernike moments which are computed using an efficient methodology. The proposed approach has been evaluated on a database of 100 subjects with 10 images per subject, illustrating promising performance. Comparisons with related approaches using the whole hand for verification illustrate the superiority of the proposed approach. Moreover, qualitative comparisons with state-of-the-art approaches indicate that the proposed approach has comparable or better performance.

  4. Performing Verification and Validation in Reuse-Based Software Engineering

    NASA Technical Reports Server (NTRS)

    Addy, Edward A.

    1999-01-01

    The implementation of reuse-based software engineering not only introduces new activities to the software development process, such as domain analysis and domain modeling, it also impacts other aspects of software engineering. Other areas of software engineering that are affected include Configuration Management, Testing, Quality Control, and Verification and Validation (V&V). Activities in each of these areas must be adapted to address the entire domain or product line rather than a specific application system. This paper discusses changes and enhancements to the V&V process, in order to adapt V&V to reuse-based software engineering.

  5. The FoReVer Methodology: A MBSE Framework for Formal Verification

    NASA Astrophysics Data System (ADS)

    Baracchi, Laura; Mazzini, Silvia; Cimatti, Alessandro; Tonetta, Stefano; Garcia, Gerald

    2013-08-01

    The need for a high level of confidence and operational integrity in critical space (software) systems is well recognized in the space industry and has been addressed so far through rigorous system and software development processes and stringent verification and validation regimes. The Model Based Space System Engineering process (MBSSE), derived in the System and Software Functional Requirement Techniques study (SSFRT), focused on the application of model-based engineering technologies to support the space system and software development processes, from mission-level requirements to software implementation through model refinements and translations. In this paper we report on our work in the ESA-funded FoReVer project, in which we aim to develop methodological, theoretical and technological support for a systematic approach to space avionics system development, in phases 0/A/B/C. FoReVer enriches the MBSSE process with contract-based formal verification of properties, at different stages from system to software, through a step-wise refinement approach, with support for a Software Reference Architecture.

  6. Structural damage detection based on stochastic subspace identification and statistical pattern recognition: II. Experimental validation under varying temperature

    NASA Astrophysics Data System (ADS)

    Lin, Y. Q.; Ren, W. X.; Fang, S. E.

    2011-11-01

    Although most vibration-based damage detection methods achieve satisfactory verification on analytical or numerical structures, many of them encounter problems when applied to real-world structures under varying environments. Damage detection methods that extract damage features directly from periodically sampled dynamic time-history response measurements are desirable, but relevant research and field application verification are still lacking. In this second part of a two-part paper, the robustness and performance of the statistics-based damage index using the forward innovation model by stochastic subspace identification of a vibrating structure, proposed in the first part, are investigated against two prestressed reinforced concrete (RC) beams tested in the laboratory and a full-scale RC arch bridge tested in the field under varying environments. Experimental verification focuses on temperature effects. It is demonstrated that the proposed statistics-based damage index is insensitive to temperature variations but sensitive to structural deterioration or state alteration. This makes it possible to detect structural damage in real-scale structures experiencing ambient excitations and varying environmental conditions.

  7. Planetary protection - assaying new methods

    NASA Astrophysics Data System (ADS)

    Nellen, J.; Rettberg, P.; Horneck, G.

    The space age began in 1957 when the USSR launched the first satellite into Earth orbit. In response to this new challenge, the International Council for Science, formerly known as the International Council of Scientific Unions (ICSU), established the Committee on Space Research (COSPAR) in 1958. The role of COSPAR was to channel international scientific research in space and to establish an international forum. Through COSPAR the scientific community agreed on the need for screening interplanetary probes for forward (contamination of foreign planets) and backward (contamination of Earth by returned samples/probes) contamination. To prevent both forms of contamination, a set of rules was established as a guideline. Nowadays the standard implementation of the planetary protection rules is based on the experience gained during NASA's Viking project in 1975/76. Since then, the evaluation methods for microbial contamination of spacecraft have been changed or updated only slowly. In this study the standard method of sample taking will be evaluated. New methods for examining those samples, based on the identification of life at the molecular level, will be reviewed and checked for their feasibility as microbial detection systems. The methods will be examined for their qualitative (detection and verification of different organisms) and quantitative (detection limit and concentration verification) capabilities. Among the methods analyzed will be, for example, real-time PCR (polymerase chain reaction) using specific primer sets for the amplification of highly conserved rRNA or DNA regions; measurement of intrinsic fluorescence, e.g., of ATP using luciferin-luciferase reagents; and the use of FAME (fatty acid methyl esters) and microchips for microbial identification purposes. The methods will be chosen to give good overall coverage of different possible molecular markers and approaches. The most promising methods shall then be lab-tested and evaluated for their use under spacecraft assembly conditions. Since Mars has become one of the most sought-after planets in our solar system and will be visited by man-made probes quite often in the near future, planetary protection is more important than ever before.

  8. Inverse probability weighting estimation of the volume under the ROC surface in the presence of verification bias.

    PubMed

    Zhang, Ying; Alonzo, Todd A

    2016-11-01

    In diagnostic medicine, the volume under the receiver operating characteristic (ROC) surface (VUS) is a commonly used index to quantify the ability of a continuous diagnostic test to discriminate between three disease states. In practice, verification of the true disease status may be performed only for a subset of subjects under study since the verification procedure is invasive, risky, or expensive. The selection for disease examination might depend on the results of the diagnostic test and other clinical characteristics of the patients, which in turn can cause bias in estimates of the VUS. This bias is referred to as verification bias. Existing verification bias correction in three-way ROC analysis focuses on ordinal tests. We propose verification bias-correction methods to construct ROC surface and estimate the VUS for a continuous diagnostic test, based on inverse probability weighting. By applying U-statistics theory, we develop asymptotic properties for the estimator. A Jackknife estimator of variance is also derived. Extensive simulation studies are performed to evaluate the performance of the new estimators in terms of bias correction and variance. The proposed methods are used to assess the ability of a biomarker to accurately identify stages of Alzheimer's disease. © 2016 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
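
    For orientation, the general form of an inverse-probability-weighted VUS estimator in this three-class setting can be sketched as follows; the notation (verification indicators V_i, estimated verification probabilities \hat{\pi}_i) is chosen here for illustration and is not necessarily the authors' exact estimator:

        \widehat{\mathrm{VUS}}_{\mathrm{IPW}} =
          \frac{\sum_{i,j,k} \frac{V_i V_j V_k}{\hat{\pi}_i \hat{\pi}_j \hat{\pi}_k}\,
                I(D_i = 1,\, D_j = 2,\, D_k = 3)\, I(T_i < T_j < T_k)}
               {\sum_{i,j,k} \frac{V_i V_j V_k}{\hat{\pi}_i \hat{\pi}_j \hat{\pi}_k}\,
                I(D_i = 1,\, D_j = 2,\, D_k = 3)},

    where T is the continuous test result, D the verified disease state, V_i = 1 if subject i undergoes disease verification, and \hat{\pi}_i = \hat{P}(V_i = 1 \mid T_i, A_i) is the estimated verification probability given the test result and auxiliary covariates A_i. Weighting each fully verified triple by the inverse of its selection probability undoes the selective verification that would otherwise bias the empirical VUS.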

  9. Proceedings of the Second NASA Formal Methods Symposium

    NASA Technical Reports Server (NTRS)

    Munoz, Cesar (Editor)

    2010-01-01

    This publication contains the proceedings of the Second NASA Formal Methods Symposium sponsored by the National Aeronautics and Space Administration and held in Washington D.C. April 13-15, 2010. Topics covered include: Decision Engines for Software Analysis using Satisfiability Modulo Theories Solvers; Verification and Validation of Flight-Critical Systems; Formal Methods at Intel -- An Overview; Automatic Review of Abstract State Machines by Meta Property Verification; Hardware-independent Proofs of Numerical Programs; Slice-based Formal Specification Measures -- Mapping Coupling and Cohesion Measures to Formal Z; How Formal Methods Impels Discovery: A Short History of an Air Traffic Management Project; A Machine-Checked Proof of A State-Space Construction Algorithm; Automated Assume-Guarantee Reasoning for Omega-Regular Systems and Specifications; Modeling Regular Replacement for String Constraint Solving; Using Integer Clocks to Verify the Timing-Sync Sensor Network Protocol; Can Regulatory Bodies Expect Efficient Help from Formal Methods?; Synthesis of Greedy Algorithms Using Dominance Relations; A New Method for Incremental Testing of Finite State Machines; Verification of Faulty Message Passing Systems with Continuous State Space in PVS; Phase Two Feasibility Study for Software Safety Requirements Analysis Using Model Checking; A Prototype Embedding of Bluespec System Verilog in the PVS Theorem Prover; SimCheck: An Expressive Type System for Simulink; Coverage Metrics for Requirements-Based Testing: Evaluation of Effectiveness; Software Model Checking of ARINC-653 Flight Code with MCP; Evaluation of a Guideline by Formal Modelling of Cruise Control System in Event-B; Formal Verification of Large Software Systems; Symbolic Computation of Strongly Connected Components Using Saturation; Towards the Formal Verification of a Distributed Real-Time Automotive System; Slicing AADL Specifications for Model Checking; Model Checking with Edge-valued Decision Diagrams; and Data-flow based Model Analysis.

  10. SU-E-T-49: A Multi-Institutional Study of Independent Dose Verification for IMRT

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Baba, H; Tachibana, H; Kamima, T

    2015-06-15

    Purpose: AAPM TG114 does not cover independent verification for IMRT. We conducted a study of independent dose verification for IMRT in seven institutions to show its feasibility. Methods: 384 IMRT plans for prostate and head-and-neck (HN) sites were collected from the institutions, where planning was performed using Eclipse and Pinnacle3 with two techniques: step and shoot (S&S) and sliding window (SW). All of the institutions used the same independent dose verification software program (Simple MU Analysis: SMU, Triangle Product, Ishikawa, JP), which is Clarkson-based and uses CT images to compute the radiological path length. An ion-chamber measurement in a water-equivalent slab phantom was performed to compare the doses computed using the TPS and the independent dose verification program. Additionally, the agreement in dose computed on patient CT images between the TPS and the SMU was assessed. The dose of the composite beams in each plan was evaluated. Results: The agreement between the measurement and the SMU was −2.3 ± 1.9% and −5.6 ± 3.6% for the prostate and HN sites, respectively. The agreement between the TPSs and the SMU was −2.1 ± 1.9% and −3.0 ± 3.7% for the prostate and HN sites, respectively. There was a negative systematic difference with a similar standard deviation, and the difference was larger for the HN site. The S&S technique showed a statistically significant difference from the SW technique, because the Clarkson-based method in the independent program cannot consider, and therefore underestimates, the dose under the MLC. Conclusion: The accuracy would be improved if the Clarkson-based algorithm were modified for IMRT; the tolerance level would then be within 5%.

  11. Optical/digital identification/verification system based on digital watermarking technology

    NASA Astrophysics Data System (ADS)

    Herrigel, Alexander; Voloshynovskiy, Sviatoslav V.; Hrytskiv, Zenon D.

    2000-06-01

    This paper presents a new approach for the secure integrity verification of driver licenses, passports and other analogue identification documents. The system embeds (detects) the reference number of the identification document with DCT watermark technology in (from) the photo of the identification document holder. During verification the reference number is extracted and compared with the reference number printed on the identification document. The approach combines optical and digital image processing techniques. The detection system must be able to scan an analogue driver license or passport, convert the image of this document into a digital representation and then apply the watermark verification algorithm to check the payload of the embedded watermark. If the payload of the watermark is identical to the printed visual reference number of the issuer, the verification is successful and the passport or driver license has not been modified. This approach constitutes a new class of application for watermark technology, which was originally targeted at the copyright protection of digital multimedia data. The presented approach substantially increases the security of the analogue identification documents used in many European countries.
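
    To illustrate the general embed/extract mechanics of a DCT-domain watermark, here is a minimal quantization-index-modulation sketch on one 8x8 block; this is a generic illustration, not the scheme described in the paper, and the coefficient position and quantization step are arbitrary assumptions:

        import numpy as np
        from scipy.fftpack import dct, idct

        def dct2(a):
            return dct(dct(a, axis=0, norm='ortho'), axis=1, norm='ortho')

        def idct2(a):
            return idct(idct(a, axis=0, norm='ortho'), axis=1, norm='ortho')

        def embed_bit(block, bit, q=8.0):
            """Hide one bit by snapping a mid-frequency DCT coefficient to a
            multiple of q whose parity encodes the bit."""
            c = dct2(block)
            k = int(np.round(c[3, 2] / q))
            if k % 2 != bit:
                k += 1                      # move to the adjacent parity class
            c[3, 2] = k * q
            return idct2(c)

        def extract_bit(block, q=8.0):
            c = dct2(block)
            return int(np.round(c[3, 2] / q)) % 2

        # Round trip on a random block: the embedded bit survives the transform.
        block = np.random.default_rng(0).uniform(0, 255, (8, 8))
        assert extract_bit(embed_bit(block, 1)) == 1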

  12. Test and Verification Approach for the NASA Constellation Program

    NASA Technical Reports Server (NTRS)

    Strong, Edward

    2008-01-01

    This viewgraph presentation is a test and verification approach for the NASA Constellation Program. The contents include: 1) The Vision for Space Exploration: Foundations for Exploration; 2) Constellation Program Fleet of Vehicles; 3) Exploration Roadmap; 4) Constellation Vehicle Approximate Size Comparison; 5) Ares I Elements; 6) Orion Elements; 7) Ares V Elements; 8) Lunar Lander; 9) Map of Constellation content across NASA; 10) CxP T&V Implementation; 11) Challenges in CxP T&V Program; 12) T&V Strategic Emphasis and Key Tenets; 13) CxP T&V Mission & Vision; 14) Constellation Program Organization; 15) Test and Evaluation Organization; 16) CxP Requirements Flowdown; 17) CxP Model Based Systems Engineering Approach; 18) CxP Verification Planning Documents; 19) Environmental Testing; 20) Scope of CxP Verification; 21) CxP Verification - General Process Flow; 22) Avionics and Software Integrated Testing Approach; 23) A-3 Test Stand; 24) Space Power Facility; 25) MEIT and FEIT; 26) Flight Element Integrated Test (FEIT); 27) Multi-Element Integrated Testing (MEIT); 28) Flight Test Driving Principles; and 29) Constellation's Integrated Flight Test Strategy for Low Earth Orbit Servicing Capability.

  13. Scoping study to expedite development of a field deployable and portable instrument for UF6 enrichment assay

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chan, George; Valentine, John D.; Russo, Richard E.

    The primary objective of the present study is to identify the most promising, viable technologies that are likely to culminate in an expedited development of the next-generation, field-deployable instrument for providing rapid, accurate, and precise enrichment assay of uranium hexafluoride (UF6). UF6 is typically involved, and is arguably the most important uranium compound, in uranium enrichment processes. As the first line of defense against proliferation, accurate analytical techniques to determine the uranium isotopic distribution in UF6 are critical for materials verification, accounting, and safeguards at enrichment plants. As nuclear fuel cycle technology becomes more prevalent around the world, international nuclear safeguards and interest in UF6 enrichment assay have been growing. At present, laboratory-based mass spectrometry (MS), which offers the highest attainable analytical accuracy and precision, is the technique of choice for the analysis of stable and long-lived isotopes. Currently, the International Atomic Energy Agency (IAEA) monitors the production of enriched UF6 at declared facilities by collecting a small amount (between 1 and 10 g) of gaseous UF6 into a sample bottle, which is then shipped under chain of custody to a central laboratory (IAEA's Nuclear Materials Analysis Laboratory) for high-precision isotopic assay by MS. The logistics are cumbersome, and new shipping regulations are making it more difficult to transport UF6. Furthermore, the analysis is costly, and results are not available for some time after sample collection. Hence, the IAEA is challenged to develop effective safeguards approaches at enrichment plants. In-field isotopic analysis of UF6 has the potential to substantially reduce the time, logistics and expense of sample handling. However, current laboratory-based MS techniques require too much infrastructure and operator expertise for field deployment and operation. As outlined in the IAEA Department of Safeguards Long-Term R&D Plan, 2012–2023, one of the IAEA's long-term R&D needs is to "develop tools and techniques to enable timely, potentially real-time, detection of HEU (Highly Enriched Uranium) production in LEU (Low Enriched Uranium) enrichment facilities" (Milestone 5.2). Because the next generation of analytical instruments is commonly driven by technologies that are either currently available or just now emerging, one reasonable and practical approach to projecting the next generation of chemical instrumentation is to track recent trends and extrapolate from them. This study adopted a similar approach, and an extensive literature review on existing and emerging technologies for UF6 enrichment assay was performed. The competitive advantages and current limitations of different analytical techniques for in-field UF6 enrichment assay were then compared, and the main gaps between needs and capabilities for their field use were examined. Subsequently, based on these results, technologies for the next-generation field-deployable instrument for UF6 enrichment assay were recommended. The study was organized such that a suite of assessment metrics was first identified. Criteria used in this evaluation are presented in Section 1 of this report, and the most important ones are described briefly in the next few paragraphs.
    Because one driving force for in-field UF6 enrichment assay is the demanding transportation regulations for gaseous UF6, Section 2 contains a review of solid sorbents that convert and immobilize gaseous UF6 into a solid state, which is regarded as more transportation friendly and is less regulated. Furthermore, candidate solid sorbents that show promise in mating with existing and emerging assay technologies also factor into the technology recommendations. Extensive literature reviews on existing and emerging technologies for UF6 enrichment assay, covering their scientific principles, instrument options, and current limitations, are detailed in Sections 3 and 4, respectively. In Section 5, the technological gaps, as well as state-of-the-art and commercial off-the-shelf components that can be adopted to expedite the development of a fieldable or portable UF6 enrichment-assay instrument, are identified and discussed. Finally, based on the results of the review, requirements and recommendations for developing the next-generation field-deployable instrument for UF6 enrichment assay are presented in Section 6.

  14. Assessing Requirements Quality through Requirements Coverage

    NASA Technical Reports Server (NTRS)

    Rajan, Ajitha; Heimdahl, Mats; Woodham, Kurt

    2008-01-01

    In model-based development, the development effort is centered around a formal description of the proposed software system: the model. This model is derived from high-level requirements describing the expected behavior of the software. For validation and verification purposes, this model can then be subjected to various types of analysis, for example, completeness and consistency analysis [6], model checking [3], theorem proving [1], and test-case generation [4, 7]. This development paradigm is making rapid inroads in certain industries, e.g., automotive, avionics, space applications, and medical technology. This shift towards model-based development naturally leads to changes in the verification and validation (V&V) process. The model validation problem, determining that the model accurately captures the customer's high-level requirements, has received little attention, and the sufficiency of the validation activities has been largely determined through ad hoc methods. Since the model serves as the central artifact, its correctness with respect to the user's needs is absolutely crucial. In our investigation, we attempt to answer the following two questions with respect to validation: (1) Are the requirements sufficiently defined for the system? and (2) How well does the model implement the behaviors specified by the requirements? The second question can be addressed using formal verification. Nevertheless, the size and complexity of many industrial systems make formal verification infeasible even if we have a formal model and formalized requirements. Thus, presently, there is no objective way of answering these two questions. To this end, we propose an approach based on testing that, when given a set of formal requirements, explores the relationship between requirements-based structural test-adequacy coverage and model-based structural test-adequacy coverage. The proposed technique uses requirements coverage metrics defined in [9] on formal high-level software requirements and existing model coverage metrics such as the Modified Condition and Decision Coverage (MC/DC) used when testing highly critical software in the avionics industry [8]. Our work is related to Chockler et al. [2], but we base our work on traditional testing techniques as opposed to verification techniques.
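
    Since the approach leans on MC/DC, a small worked example may help: for each condition in a decision, MC/DC requires a pair of tests in which only that condition changes while the decision outcome changes. A brute-force finder of such independence pairs, for a made-up decision "a and (b or c)":

        from itertools import product

        def decision(a, b, c):
            return a and (b or c)

        # For each condition, find "independence pairs": two test vectors that
        # differ only in that condition and flip the decision outcome.
        tests = list(product([False, True], repeat=3))
        for pos, name in enumerate("abc"):
            pairs = [(t, u) for t in tests for u in tests
                     if not t[pos] and u[pos]                     # condition flips
                     and all(t[i] == u[i] for i in range(3) if i != pos)
                     and decision(*t) != decision(*u)]            # outcome flips
            print(name, "independence pairs:", pairs)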

  15. Metrological tests of a 200 L calibration source for HPGE detector systems for assay of radioactive waste drums.

    PubMed

    Boshkova, T; Mitev, K

    2016-03-01

    In this work we present test procedures, approval criteria and results from two metrological inspections of a certified large-volume (152)Eu source (a drum of about 200 L) intended for calibration of HPGe gamma assay systems used for activity measurement of radioactive waste drums. The aim of the inspections was to prove the stability of the calibration source during its working life. The large-volume source was designed and produced in 2007. It consists of 448 identical sealed radioactive sources (modules) apportioned among 32 transparent plastic tubes, which were placed in a wooden matrix that filled the drum. During the inspections the modules were subjected to tests for verification of their certified characteristics. The results show perfect compliance with the NIST basic guidelines for the properties of a radioactive certified reference material (CRM) and demonstrate the stability of the large-volume CRM drum after 7 years of operation. Copyright © 2015 Elsevier Ltd. All rights reserved.

  16. Software Tools for Formal Specification and Verification of Distributed Real-Time Systems

    DTIC Science & Technology

    1994-07-29

    The goals of Phase 1 are to design in detail a toolkit environment based on formal methods for the specification and verification of distributed real-time systems and to evaluate the design. The evaluation of the design includes investigation of both the capability and potential usefulness of the toolkit environment and the feasibility of its implementation.

  17. Automatic Methods and Tools for the Verification of Real Time Systems

    DTIC Science & Technology

    1997-11-30

    We developed formal methods and tools for the verification of real-time systems. This was accomplished by extending techniques based on automata... For embedded real-time systems, we introduced hybrid automata, which equip traditional discrete automata with real-numbered clock variables and continuous... real-time systems, and we identified the exact boundary between decidability and undecidability of real-time reasoning.

  18. 2007 Beyond SBIR Phase II: Bringing Technology Edge to the Warfighter

    DTIC Science & Technology

    2007-08-23

    Systems Trade-Off Analysis and Optimization; Verification and Validation; On-Board Diagnostics and Self-Healing; Security and Anti-Tampering; Rapid... verification; Safety and reliability analysis of flight and mission critical systems; On-Board Diagnostics and Self-Healing; Model-based monitoring and self-healing; On-board diagnostics and self-healing; Autonomic computing; Network intrusion detection and prevention; Anti-Tampering and Trust

  19. Review of waste package verification tests. Semiannual report, October 1982-March 1983

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Soo, P.

    1983-08-01

    The current study is part of an ongoing task to specify tests that may be used to verify that engineered waste package/repository systems comply with NRC radionuclide containment and controlled-release performance objectives. Work covered in this report analyzes verification tests for borosilicate glass waste forms and bentonite- and zeolite-based packing materials (discrete backfills). 76 references.

  20. Applying Formal Verification Techniques to Ambient Assisted Living Systems

    NASA Astrophysics Data System (ADS)

    Benghazi, Kawtar; Visitación Hurtado, María; Rodríguez, María Luisa; Noguera, Manuel

    This paper presents a verification approach based on timed traces semantics and MEDISTAM-RT [1] to check the fulfillment of non-functional requirements, such as timeliness and safety, and to assure the correct functioning of Ambient Assisted Living (AAL) systems. We validate this approach by applying it to an Emergency Assistance System for monitoring people suffering from cardiac alterations with syncope.

  1. Action-based verification of RTCP-nets with CADP

    NASA Astrophysics Data System (ADS)

    Biernacki, Jerzy; Biernacka, Agnieszka; Szpyrka, Marcin

    2015-12-01

    The paper presents an algorithm for translating coverability graphs of RTCP-nets (real-time coloured Petri nets) into the Aldebaran format. The approach makes automatic verification of RTCP-nets possible using the model checking techniques provided by the CADP toolbox. An actual fire alarm control panel system has been modelled and several of its crucial properties have been verified to demonstrate the usability of the approach.
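
    For reference, the Aldebaran (.aut) format targeted by such a translation is a plain-text labelled transition system: a header of the form des (initial_state, number_of_transitions, number_of_states), followed by one (from, "label", to) line per transition. A minimal writer sketch, with a made-up graph fragment:

        def write_aut(path, initial, transitions, n_states):
            """Write a labelled transition system in Aldebaran (.aut) format.
            `transitions` is an iterable of (from_state, label, to_state) triples."""
            transitions = list(transitions)
            with open(path, "w") as f:
                f.write(f"des ({initial}, {len(transitions)}, {n_states})\n")
                for src, label, dst in transitions:
                    f.write(f'({src}, "{label}", {dst})\n')

        # Toy coverability-graph fragment: three states, action-labelled edges.
        write_aut("model.aut", 0,
                  [(0, "alarm_raised", 1), (1, "ack", 2), (2, "reset", 0)], 3)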

  2. Generalization of information-based concepts in forecast verification

    NASA Astrophysics Data System (ADS)

    Tödter, J.; Ahrens, B.

    2012-04-01

    This work deals with information-theoretical methods in probabilistic forecast verification. Recent findings concerning the Ignorance Score are briefly reviewed, then the generalization to continuous forecasts is shown. For ensemble forecasts, the presented measures can be calculated exactly. The Brier Score (BS) and its generalizations to the multi-categorical Ranked Probability Score (RPS) and to the Continuous Ranked Probability Score (CRPS) are the prominent verification measures for probabilistic forecasts. Particularly attractive are their decompositions into measures quantifying the reliability, resolution and uncertainty of the forecasts. Information theory sets up the natural framework for forecast verification. Recently, it has been shown that the BS is a second-order approximation of the information-based Ignorance Score (IGN), which also contains easily interpretable components and can also be generalized to a ranked version (RIGN). Here, the IGN, its generalizations and decompositions are systematically discussed in analogy to the variants of the BS. Additionally, a Continuous Ranked IGN (CRIGN) is introduced in analogy to the CRPS. The applicability and usefulness of the conceptually appealing CRIGN is illustrated, together with an algorithm to evaluate its components (reliability, resolution, and uncertainty) for ensemble-generated forecasts. This is also directly applicable to the more traditional CRPS.
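
    To make the cited second-order relationship concrete: for a binary event with forecast probability p and outcome o in {0, 1}, write p(o) for the probability assigned to the outcome that occurred. The two scores and their connection can then be sketched as

        \mathrm{BS} = (p - o)^2, \qquad \mathrm{IGN} = -\log_2 p(o),

        \mathrm{IGN} = \frac{1}{\ln 2} \sum_{m=1}^{\infty} \frac{\bigl(1 - p(o)\bigr)^m}{m}
                     = \frac{1}{\ln 2} \Bigl( |o - p| + \frac{\mathrm{BS}}{2} + \dots \Bigr),

    using 1 - p(o) = |o - p|, so the Brier Score appears as the second-order term in the expansion of the Ignorance Score.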

  3. Secure voice-based authentication for mobile devices: vaulted voice verification

    NASA Astrophysics Data System (ADS)

    Johnson, R. C.; Scheirer, Walter J.; Boult, Terrance E.

    2013-05-01

    As the use of biometrics becomes more widespread, the privacy concerns that stem from the use of biometrics are becoming more apparent. As the usage of mobile devices grows, so does the desire to implement biometric identification into such devices. A large majority of mobile devices being used are mobile phones. While work is being done to implement different types of biometrics into mobile phones, such as photo-based biometrics, voice is a more natural choice. The idea of voice as a biometric identifier has been around a long time. One of the major concerns with using voice as an identifier is the instability of voice. We have developed a protocol that addresses those instabilities and preserves privacy. This paper describes a novel protocol that allows a user to authenticate using voice on a mobile/remote device without compromising their privacy. We first discuss the Vaulted Verification protocol, which has recently been introduced in the research literature, and then describe its limitations. We then introduce a novel adaptation and extension of the Vaulted Verification protocol to voice, dubbed Vaulted Voice Verification (V3). Following that, we show a performance evaluation and then conclude with a discussion of security and future work.

  4. Design and Mechanical Evaluation of a Capacitive Sensor-Based Indexed Platform for Verification of Portable Coordinate Measuring Instruments

    PubMed Central

    Avila, Agustín Brau; Mazo, Jorge Santolaria; Martín, Juan José Aguilar

    2014-01-01

    During the last years, the use of Portable Coordinate Measuring Machines (PCMMs) in industry has increased considerably, mostly due to their flexibility for accomplishing in-line measuring tasks as well as their reduced costs and operational advantages as compared to traditional coordinate measuring machines (CMMs). However, their operation has a significant drawback derived from the techniques applied in the verification and optimization procedures of their kinematic parameters. These techniques are based on the capture of data with the measuring instrument from a calibrated gauge object, fixed successively in various positions so that most of the instrument measuring volume is covered, which results in time-consuming, tedious and expensive verification procedures. In this work the mechanical design of an indexed metrology platform (IMP) is presented. The aim of the IMP is to increase the final accuracy and to radically simplify the calibration, identification and verification of geometrical parameter procedures of PCMMs. The IMP allows us to fix the calibrated gauge object and move the measuring instrument in such a way that it is possible to cover most of the instrument working volume, reducing the time and operator fatigue to carry out these types of procedures. PMID:24451458

  5. Design and mechanical evaluation of a capacitive sensor-based indexed platform for verification of portable coordinate measuring instruments.

    PubMed

    Avila, Agustín Brau; Mazo, Jorge Santolaria; Martín, Juan José Aguilar

    2014-01-02

    During the last years, the use of Portable Coordinate Measuring Machines (PCMMs) in industry has increased considerably, mostly due to their flexibility for accomplishing in-line measuring tasks as well as their reduced costs and operational advantages as compared to traditional coordinate measuring machines (CMMs). However, their operation has a significant drawback derived from the techniques applied in the verification and optimization procedures of their kinematic parameters. These techniques are based on the capture of data with the measuring instrument from a calibrated gauge object, fixed successively in various positions so that most of the instrument measuring volume is covered, which results in time-consuming, tedious and expensive verification procedures. In this work the mechanical design of an indexed metrology platform (IMP) is presented. The aim of the IMP is to increase the final accuracy and to radically simplify the calibration, identification and verification of geometrical parameter procedures of PCMMs. The IMP allows us to fix the calibrated gauge object and move the measuring instrument in such a way that it is possible to cover most of the instrument working volume, reducing the time and operator fatigue to carry out these types of procedures.

  6. Identification of Cyclin-dependent Kinase 1 Specific Phosphorylation Sites by an In Vitro Kinase Assay.

    PubMed

    Cui, Heying; Loftus, Kyle M; Noell, Crystal R; Solmaz, Sozanne R

    2018-05-03

    Cyclin-dependent kinase 1 (Cdk1) is a master controller of the cell cycle in all eukaryotes and phosphorylates an estimated 8-13% of the proteome; however, the number of identified targets for Cdk1, particularly in human cells, is still low. The identification of Cdk1-specific phosphorylation sites is important, as they provide mechanistic insights into how Cdk1 controls the cell cycle. Cell cycle regulation is critical for faithful chromosome segregation, and defects in this complicated process lead to chromosomal aberrations and cancer. Here, we describe an in vitro kinase assay that is used to identify Cdk1-specific phosphorylation sites. In this assay, a purified protein is phosphorylated in vitro by commercially available human Cdk1/cyclin B. Successful phosphorylation is confirmed by SDS-PAGE, and phosphorylation sites are subsequently identified by mass spectrometry. We also describe purification protocols that yield highly pure and homogeneous protein preparations suitable for the kinase assay, and a binding assay for the functional verification of the identified phosphorylation sites, which probes the interaction between a classical nuclear localization signal (cNLS) and its nuclear transport receptor karyopherin α. To aid with experimental design, we review approaches for the prediction of Cdk1-specific phosphorylation sites from protein sequences. Together these protocols present a very powerful approach that yields Cdk1-specific phosphorylation sites and enables mechanistic studies into how Cdk1 controls the cell cycle. Since this method relies on purified proteins, it can be applied to any model organism and yields reliable results, especially when combined with cell functional studies.
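
    On the sequence-based prediction mentioned above: the commonly cited full Cdk1 consensus motif is (S/T)Px(K/R), with (S/T)P as the minimal motif. A small scanning sketch (the example sequence is invented):

        import re

        # Full Cdk1 consensus (S/T)Px(K/R) and minimal motif (S/T)P.
        FULL = re.compile(r"[ST]P.[KR]")
        MINIMAL = re.compile(r"[ST]P")

        def cdk1_candidate_sites(seq):
            """Return 1-based positions of S/T residues matching each motif."""
            return {"full": [m.start() + 1 for m in FULL.finditer(seq)],
                    "minimal": [m.start() + 1 for m in MINIMAL.finditer(seq)]}

        print(cdk1_candidate_sites("MASPRKSTPAKRLSPK"))  # invented example sequence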

  7. Multiplex Amplification Refractory Mutation System Polymerase Chain Reaction (ARMS-PCR) for diagnosis of natural infection with canine distemper virus

    PubMed Central

    2010-01-01

    Background Canine distemper virus (CDV) is present worldwide and produces a lethal systemic infection of wild and domestic Canidae. Pre-existing antibodies acquired from vaccination or previous CDV infection might interfere with the interpretation of a serologic diagnosis method. In addition, due to the high similarity of nucleic acid sequences between wild-type CDV and the new vaccine strain, current PCR-derived methods cannot be applied for definite confirmation of CD infection. Hence, it is worthwhile to develop a simple and rapid nucleotide-based assay for differentiating wild-type CDV, which causes disease, from attenuated CDVs after vaccination. High-frequency variations have been found in the region spanning from the 3'-untranslated region (UTR) of the matrix (M) gene to the fusion (F) gene (designated M-F UTR) in a few CDV strains. To establish a differential diagnosis assay, an amplification refractory mutation analysis was established based on the highly variable M-F UTR and F regions. Results Frequent polymorphisms were found scattered throughout the M-F UTR region; the nucleic acid identity between local strains and vaccine strains ranged from 82.5% to 93.8%. A tract of AAA residues located 35 nucleotides downstream of the F gene start codon, highly conserved in three vaccine strains, was replaced with TGC in the local strains; this served as the target sequence for the design of discrimination primers. The method established in the present study successfully differentiated seven Taiwanese CDV field isolates, all belonging to the Asia-1 lineage, from vaccine strains. Conclusions The method described herein would be useful for several clinical applications, such as confirmation of natural CDV infection, evaluation of vaccination status and verification of the circulating viral genotypes. PMID:20534175
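
    The discrimination primers exploit the ARMS principle that a polymerase extends a primer poorly when its 3'-terminal base mismatches the template, so a primer ending on a strain-specific base amplifies only the matching genotype. A toy in-silico check of that 3'-end rule (all sequences invented for illustration):

        def arms_primer_amplifies(template, primer):
            """Crude ARMS logic: amplification is predicted only if the primer
            anneals and its 3'-terminal base matches the template."""
            pos = template.find(primer[:-1])          # anneal all but the 3' base
            if pos == -1:
                return False
            return template[pos + len(primer) - 1] == primer[-1]

        wild_type = "ACGTTGCACGT"   # strain-specific TGC tract (invented context)
        vaccine   = "ACGTAAAACGT"   # conserved AAA tract (invented context)
        primer    = "ACGTT"         # discrimination primer: 3' base on the variant

        print(arms_primer_amplifies(wild_type, primer))  # True: wild type detected
        print(arms_primer_amplifies(vaccine, primer))    # False: vaccine not amplified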

  8. Multiplex Amplification Refractory Mutation System Polymerase Chain Reaction (ARMS-PCR) for diagnosis of natural infection with canine distemper virus.

    PubMed

    Chulakasian, Songkhla; Lee, Min-Shiuh; Wang, Chi-Young; Chiou, Shyan-Song; Lin, Kuan-Hsun; Lin, Fong-Yuan; Hsu, Tien-Huan; Wong, Min-Liang; Chang, Tien-Jye; Hsu, Wei-Li

    2010-06-10

    Canine distemper virus (CDV) is present worldwide and produces a lethal systemic infection of wild and domestic Canidae. Pre-existing antibodies acquired from vaccination or previous CDV infection might interfere with the interpretation of a serologic diagnosis method. In addition, due to the high similarity of nucleic acid sequences between wild-type CDV and the new vaccine strain, current PCR-derived methods cannot be applied for definite confirmation of CD infection. Hence, it is worthwhile to develop a simple and rapid nucleotide-based assay for differentiating wild-type CDV, which causes disease, from attenuated CDVs after vaccination. High-frequency variations have been found in the region spanning from the 3'-untranslated region (UTR) of the matrix (M) gene to the fusion (F) gene (designated M-F UTR) in a few CDV strains. To establish a differential diagnosis assay, an amplification refractory mutation analysis was established based on the highly variable M-F UTR and F regions. Frequent polymorphisms were found scattered throughout the M-F UTR region; the nucleic acid identity between local strains and vaccine strains ranged from 82.5% to 93.8%. A tract of AAA residues located 35 nucleotides downstream of the F gene start codon, highly conserved in three vaccine strains, was replaced with TGC in the local strains; this served as the target sequence for the design of discrimination primers. The method established in the present study successfully differentiated seven Taiwanese CDV field isolates, all belonging to the Asia-1 lineage, from vaccine strains. The method described herein would be useful for several clinical applications, such as confirmation of natural CDV infection, evaluation of vaccination status and verification of the circulating viral genotypes.

  9. Recombinant barley-produced antibody for detection and immunoprecipitation of the major bovine milk allergen, β-lactoglobulin.

    PubMed

    Ritala, A; Leelavathi, S; Oksman-Caldentey, K-M; Reddy, V S; Laukkanen, M-L

    2014-06-01

    Recombinant allergens and antibodies are needed for diagnostic, therapeutic, food processing and quality verification purposes. The aim of this work was to develop a barley-based production system for a β-lactoglobulin (BLG)-specific immunoglobulin E antibody (D1 scFv). The expression level in the best barley cell clone was 0.8-1.2 mg/kg fresh weight, and was constant over an expression period of 21 days. In the case of barley grains, the highest stable productivity (followed up to T2 grains) was obtained when the D1 scFv cDNA was expressed under the seed-specific Glutelin promoter rather than under the constitutive Ubiquitin promoter. Translational fusion of an ER retention signal significantly improved the accumulation of the recombinant antibody; furthermore, lines without the ER retention signal lost D1 scFv accumulation in T2 grains. Pilot-scale purification was performed for a T2 grain pool (51 g) containing 55.0 mg D1 scFv/kg grains. The crude extract was purified by a two-step protocol comprising IMAC and size exclusion chromatography. The purification resulted in a yield of 0.47 mg of D1 scFv (31 kD) with high purity. Enzyme-linked immunosorbent assay revealed that 29% of the purified protein was fully functional. In an immunoprecipitation assay the purified D1 scFv recognized the native 18 kD BLG in the milk sample. No binding was observed with the heat-treated milk sample, as expected. The developed barley-based expression system clearly demonstrated its potential for application in the processing of dairy milk products as well as in detecting allergens in foods possibly contaminated by bovine milk.

  10. Implementation of a cloud-based electronic medical record for maternal and child health in rural Kenya.

    PubMed

    Haskew, John; Rø, Gunnar; Saito, Kaori; Turner, Kenrick; Odhiambo, George; Wamae, Annah; Sharif, Shahnaaz; Sugishita, Tomohiko

    2015-05-01

    Complete and timely health information is essential to inform public health decision-making for maternal and child health, but is often lacking in resource-constrained settings. Electronic medical record (EMR) systems are increasingly being adopted to support the delivery of health care, and are particularly amenable to maternal and child health services. An EMR system could enable the mother and child to be tracked and monitored throughout maternity shared care, improve the quality and completeness of data collected, and enhance the sharing of health information between the outpatient clinic and the hospital, and between clinical and public health services, to inform decision-making. This study implemented a novel cloud-based electronic medical record system in a maternal and child health outpatient setting in Western Kenya between April and June 2013 and evaluated its impact on improving the completeness of data collected by clinical and public health services. The impact of the system was assessed using a two-sample test of proportions pre- and post-implementation of EMR-based data verification. Significant improvements in the completeness of the antenatal record were recorded through implementation of EMR-based data verification. A difference of 42.9% in missing data (including screening for hypertension, tuberculosis, malaria, HIV status or ART status of HIV-positive women) was recorded pre- and post-implementation. Despite the significant impact of EMR-based data verification on data completeness, overall screening rates in antenatal care were low. This study has shown that EMR-based data verification can improve the completeness of data collected in the patient record for maternal and child health. A number of issues, including data management and patient confidentiality, must be considered, but significant improvements in data quality were recorded through implementation of this EMR model. Copyright © 2015 Elsevier Ireland Ltd. All rights reserved.
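
    The pre/post comparison described above is a standard two-sample test of proportions; a sketch using statsmodels follows, with placeholder counts rather than the study's data:

        from statsmodels.stats.proportion import proportions_ztest

        # Hypothetical counts: records with complete antenatal data, before and
        # after EMR-based data verification (NOT the study's actual numbers).
        complete = [220, 410]   # successes: pre, post
        records = [500, 520]    # records audited: pre, post

        stat, pvalue = proportions_ztest(count=complete, nobs=records)
        print(f"z = {stat:.2f}, p = {pvalue:.4f}")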

  11. A Roadmap for the Implementation of Continued Process Verification.

    PubMed

    Boyer, Marcus; Gampfer, Joerg; Zamamiri, Abdel; Payne, Robin

    2016-01-01

In 2014, the members of the BioPhorum Operations Group (BPOG) produced a 100-page continued process verification case study, entitled "Continued Process Verification: An Industry Position Paper with Example Protocol". This case study captures the thought processes involved in creating a continued process verification plan for a new product in response to the U.S. Food and Drug Administration's guidance on the subject introduced in 2011. In so doing, it provided the specific example of a plan developed for a new monoclonal antibody product based on the "A MAb Case Study" that preceded it in 2009. This document provides a roadmap that draws on the content of the continued process verification case study to provide a step-by-step guide in a more accessible form, with reference to a process map of the product life cycle. It could be used as a basis for continued process verification implementation in a number of different scenarios: for a single product and process; for a single site; to assist in the sharing of data monitoring responsibilities among sites; or to assist in establishing data monitoring agreements between a customer company and a contract manufacturing organization. The U.S. Food and Drug Administration issued guidance on the management of manufacturing processes designed to improve the quality and control of drug products. This involved increased focus on regular monitoring of manufacturing processes, reporting of the results, and the taking of opportunities to improve. The guidance and the practice associated with it are known as continued process verification. This paper summarizes good practice in responding to continued process verification guidance, gathered from subject matter experts in the biopharmaceutical industry. © PDA, Inc. 2016.

  12. SU-F-T-494: A Multi-Institutional Study of Independent Dose Verification Using Golden Beam Data

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Itano, M; Yamazaki, T; Tachibana, R

Purpose: In general, beam data for an individual linac are measured for an independent dose verification software program, and the verification is performed as a secondary check. In this study, independent dose verification using golden beam data was compared to that using an individual linac's beam data. Methods: Six institutions participated, and three different beam data sets were prepared. The first was individually measured data (Original Beam Data, OBD). The others were generated from all measurements of the same linac model (Model-GBD) and of all linac models (All-GBD). The three different beam data sets were registered in the independent verification software program at each institute. Subsequently, patient plans in eight sites (brain, head and neck, lung, esophagus, breast, abdomen, pelvis and bone) were analyzed using the verification program to compare doses calculated using the three different beam data sets. Results: 1116 plans were collected from the six institutes. Compared to the OBD, the variation of the Model-GBD-based and All-GBD-based calculations was 0.0 ± 0.3% and 0.0 ± 0.6%, respectively. The maximum variations were 1.2% and 2.3%, respectively. In the plans with variation over 1%, the reference points were located away from the central axis, with or without a physical wedge. Conclusion: The confidence limit (2SD) using the Model-GBD and the All-GBD was within 0.6% and 1.2%, respectively. Thus, the use of golden beam data may be feasible for independent verification. In addition, verification using golden beam data provides quality assurance of planning from the viewpoint of an audit. This research is partially supported by the Japan Agency for Medical Research and Development (AMED).
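
    The summary statistics above are straightforward to reproduce; the sketch below assumes the per-plan variation is defined as (GBD dose - OBD dose) / OBD dose, which the record does not state explicitly, and uses placeholder dose values.

```python
# Hedged sketch: per-plan variation between golden-beam-data (GBD) and
# original-beam-data (OBD) calculations, with the 2SD confidence limit.
import numpy as np

dose_obd = np.array([200.1, 198.7, 201.4, 199.9])   # cGy, placeholder values
dose_gbd = np.array([200.4, 198.2, 201.9, 199.6])

variation = 100.0 * (dose_gbd - dose_obd) / dose_obd          # percent
mean, sd = variation.mean(), variation.std(ddof=1)
print(f"variation = {mean:.1f} ± {sd:.1f}%, confidence limit (2SD) = {2*sd:.1f}%")
```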

  13. Verification bias: an under-recognized source of error in assessing the efficacy of MRI of the menisci.

    PubMed

    Richardson, Michael L; Petscavage, Jonelle M

    2011-11-01

    The sensitivity and specificity of magnetic resonance imaging (MRI) for diagnosis of meniscal tears has been studied extensively, with tears usually verified by surgery. However, surgically unverified cases are often not considered in these studies, leading to verification bias, which can falsely increase the sensitivity and decrease the specificity estimates. Our study suggests that such bias may be very common in the meniscal MRI literature, and illustrates techniques to detect and correct for such bias. PubMed was searched for articles estimating sensitivity and specificity of MRI for meniscal tears. These were assessed for verification bias, deemed potentially present if a study included any patients whose MRI findings were not surgically verified. Retrospective global sensitivity analysis (GSA) was performed when possible. Thirty-nine of the 314 studies retrieved from PubMed specifically dealt with meniscal tears. All 39 included unverified patients, and hence, potential verification bias. Only seven articles included sufficient information to perform GSA. Of these, one showed definite verification bias, two showed no bias, and four others showed bias within certain ranges of disease prevalence. Only 9 of 39 acknowledged the possibility of verification bias. Verification bias is underrecognized and potentially common in published estimates of the sensitivity and specificity of MRI for the diagnosis of meniscal tears. When possible, it should be avoided by proper study design. If unavoidable, it should be acknowledged. Investigators should tabulate unverified as well as verified data. Finally, verification bias should be estimated; if present, corrected estimates of sensitivity and specificity should be used. Our online web-based calculator makes this process relatively easy. Copyright © 2011 AUR. Published by Elsevier Inc. All rights reserved.
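
    The record does not give the correction formulas; one widely used approach is the Begg-Greenes correction, sketched below under its key assumption that the decision to verify depends only on the MRI result (this is an illustration, not necessarily the authors' GSA procedure).

```python
# Begg-Greenes correction for verification bias (sketch). Assumes
# verification depends only on the test result, so P(tear | MRI result)
# can be estimated from the surgically verified subset alone.
def begg_greenes(n_pos, n_neg, v_tp, v_fp, v_fn, v_tn):
    """n_pos, n_neg: all patients with positive/negative MRI.
    v_tp, v_fp: verified MRI-positive patients with/without a tear.
    v_fn, v_tn: verified MRI-negative patients with/without a tear."""
    p_d_pos = v_tp / (v_tp + v_fp)    # P(tear | MRI+) from verified subset
    p_d_neg = v_fn / (v_fn + v_tn)    # P(tear | MRI-) from verified subset
    sens = p_d_pos * n_pos / (p_d_pos * n_pos + p_d_neg * n_neg)
    spec = (1 - p_d_neg) * n_neg / ((1 - p_d_neg) * n_neg + (1 - p_d_pos) * n_pos)
    return sens, spec

# Hypothetical cohort: 300 MRI-positive and 200 MRI-negative knees,
# with only a fraction of each group verified surgically.
print(begg_greenes(n_pos=300, n_neg=200, v_tp=180, v_fp=20, v_fn=5, v_tn=45))
```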

  14. Software Verification of Orion Cockpit Displays

    NASA Technical Reports Server (NTRS)

    Biswas, M. A. Rafe; Garcia, Samuel; Prado, Matthew; Hossain, Sadad; Souris, Matthew; Morin, Lee

    2017-01-01

NASA's latest spacecraft, Orion, is in the development process of taking humans deeper into space. Orion is equipped with three main displays to monitor and control the spacecraft. To ensure that the software behind the glass displays operates without faults, rigorous testing is needed. To conduct such testing, the Rapid Prototyping Lab at NASA's Johnson Space Center, along with the University of Texas at Tyler, employed a software verification tool, EggPlant Functional by TestPlant. It is an image-based test automation tool that allows users to create scripts to verify the functionality within a program. An edge key framework and a set of Common EggPlant Functions were developed to enable the efficient creation of scripts. This framework standardized the way to code and to simulate user inputs in the verification process. Moreover, the Common EggPlant Functions can be reused in the verification of different displays.

  15. Toward Automatic Verification of Goal-Oriented Flow Simulations

    NASA Technical Reports Server (NTRS)

    Nemec, Marian; Aftosmis, Michael J.

    2014-01-01

    We demonstrate the power of adaptive mesh refinement with adjoint-based error estimates in verification of simulations governed by the steady Euler equations. The flow equations are discretized using a finite volume scheme on a Cartesian mesh with cut cells at the wall boundaries. The discretization error in selected simulation outputs is estimated using the method of adjoint-weighted residuals. Practical aspects of the implementation are emphasized, particularly in the formulation of the refinement criterion and the mesh adaptation strategy. Following a thorough code verification example, we demonstrate simulation verification of two- and three-dimensional problems. These involve an airfoil performance database, a pressure signature of a body in supersonic flow and a launch abort with strong jet interactions. The results show reliable estimates and automatic control of discretization error in all simulations at an affordable computational cost. Moreover, the approach remains effective even when theoretical assumptions, e.g., steady-state and solution smoothness, are relaxed.
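
    For reference, the generic (textbook) form of the adjoint-weighted residual estimate is sketched below; the authors' exact cut-cell discretization is not reproduced here.

```latex
% Adjoint-weighted residual estimate for a scalar output J (generic form).
% u_h^H is the coarse solution prolonged to an embedded finer space h,
% R_h is the fine-space residual, and psi_h solves the linearized dual problem.
\[
  J_h(u_h) - J_h\!\left(u_h^H\right) \;\approx\; -\,\psi_h^{\top} R_h\!\left(u_h^H\right)
\]
% The magnitudes of the local products |(\psi_h)_k (R_h(u_h^H))_k| supply
% the refinement criterion that drives the mesh adaptation.
```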

  16. Expert system verification and validation study. Delivery 3A and 3B: Trip summaries

    NASA Technical Reports Server (NTRS)

    French, Scott

    1991-01-01

Key results are documented from attending the 4th workshop on verification, validation, and testing. The most interesting part of the workshop was when representatives from the U.S., Japan, and Europe presented surveys of VV&T within their respective regions. Another interesting part focused on current efforts to define industry standards for artificial intelligence and how that might affect approaches to VV&T of expert systems. The next part of the workshop focused on VV&T methods: applying mathematical techniques to the verification of rule bases, and techniques for capturing information about the software development process. The final part focused on software tools. A summary is also presented of the EPRI conference on 'Methodologies, Tools, and Standards for Cost Effective Reliable Software Verification and Validation'. The conference was divided into discussion sessions on the following issues: development process, automated tools, software reliability, methods, standards, and cost/benefit considerations.

  17. Hailstorms over Switzerland: Verification of Crowd-sourced Data

    NASA Astrophysics Data System (ADS)

    Noti, Pascal-Andreas; Martynov, Andrey; Hering, Alessandro; Martius, Olivia

    2016-04-01

The reports of smartphone users witnessing hailstorms can be used as a source of independent, ground-based observation data on ground-reaching hailstorms with high temporal and spatial resolution. The presented work focuses on the verification of crowd-sourced data collected over Switzerland with the help of a smartphone application recently developed by MeteoSwiss. The precise location and time of hail precipitation and the hailstone size are included in the crowd-sourced data, which are assessed on the basis of the MeteoSwiss weather radar data. Two radar-based hail detection algorithms in use at MeteoSwiss, POH (Probability of Hail) and MESHS (Maximum Expected Severe Hail Size), are confronted with the crowd-sourced data. The available data and the investigation period span June to August 2015. Filter criteria have been applied to remove false reports from the crowd-sourced data. Neighborhood methods have been introduced to reduce the uncertainties that result from spatial and temporal biases. The crowd-sourced and radar data are converted into binary sequences according to previously set thresholds, allowing a categorical verification. Verification scores (e.g. hit rate) are then calculated from a 2x2 contingency table. The hail reporting activity and the patterns corresponding to "hail" and "no hail" reports sent from smartphones have been analyzed. The relationship between the reported hailstone sizes and both radar-based hail detection algorithms has been investigated.
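
    A minimal sketch of categorical verification from the 2x2 contingency table follows; which dataset serves as the reference and the binarization thresholds are modeling choices not fixed by the record.

```python
# Categorical verification scores from a 2x2 contingency table, computed
# over binary hail / no-hail sequences on a common space-time grid.
import numpy as np

def contingency_scores(observed, predicted):
    observed, predicted = np.asarray(observed), np.asarray(predicted)
    hits = np.sum((observed == 1) & (predicted == 1))
    misses = np.sum((observed == 1) & (predicted == 0))
    false_alarms = np.sum((observed == 0) & (predicted == 1))
    return {
        "hit rate (POD)": hits / (hits + misses),
        "false alarm ratio": false_alarms / (hits + false_alarms),
        "critical success index": hits / (hits + misses + false_alarms),
    }

# Toy example: crowd-sourced reports as observations, POH-style detections
# as predictions (placeholder sequences).
print(contingency_scores([1, 1, 0, 0, 1, 0], [1, 0, 0, 1, 1, 0]))
```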

  18. Formal verification of software-based medical devices considering medical guidelines.

    PubMed

    Daw, Zamira; Cleaveland, Rance; Vetter, Marcus

    2014-01-01

Software-based devices have increasingly become an important part of several clinical scenarios. Due to their critical impact on human life, medical devices have very strict safety requirements. It is therefore necessary to apply verification methods to ensure that the safety requirements are met. Verification of software-based devices is commonly limited to the verification of their internal elements, without considering the interaction that these elements have with other devices or with the application environment in which they are used. Medical guidelines define clinical procedures, which contain the information necessary to verify medical devices completely. The objective of this work was to incorporate medical guidelines into the verification process in order to increase the reliability of software-based medical devices. Medical devices are developed using the model-driven method Deterministic Models for Signal Processing of Embedded Systems (DMOSES). This method uses Unified Modeling Language (UML) models as a basis for the development of medical devices. The UML activity diagram is used to describe medical guidelines as workflows. The functionality of the medical devices is abstracted as a set of actions that is modeled within these workflows. In this paper, the UML models are verified using the UPPAAL model checker. For this purpose, a formalization approach for the UML models using timed automata (TA) is presented. A set of requirements for a navigation-guided biopsy is verified by the proposed approach. This shows the capability to identify errors or optimization points both in the workflow and in the system design of the navigation device. In addition, an open-source Eclipse plug-in was developed for the automated transformation of UML models into TA models that are automatically verified using UPPAAL. The proposed method enables developers to model medical devices and their clinical environment using clinical workflows as one UML diagram. Additionally, the system design can be formally verified automatically.

  19. LIVVkit: An extensible, python-based, land ice verification and validation toolkit for ice sheet models

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kennedy, Joseph H.; Bennett, Andrew R.; Evans, Katherine J.

To address the pressing need to better understand the behavior and complex interaction of ice sheets within the global Earth system, significant development of continental-scale, dynamical ice sheet models is underway. Concurrent to the development of the Community Ice Sheet Model (CISM), the corresponding verification and validation (V&V) process is being coordinated through a new, robust, Python-based extensible software package, the Land Ice Verification and Validation toolkit (LIVVkit). Incorporated into the typical ice sheet model development cycle, it provides robust and automated numerical verification, software verification, performance validation, and physical validation analyses on a variety of platforms, from personal laptops to the largest supercomputers. LIVVkit operates on sets of regression test and reference data sets, and provides comparisons for a suite of community prioritized tests, including configuration and parameter variations, bit-for-bit evaluation, and plots of model variables to indicate where differences occur. LIVVkit also provides an easily extensible framework to incorporate and analyze results of new intercomparison projects, new observation data, and new computing platforms. LIVVkit is designed for quick adaptation to additional ice sheet models via abstraction of model specific code, functions, and configurations into an ice sheet model description bundle outside the main LIVVkit structure. Furthermore, through shareable and accessible analysis output, LIVVkit is intended to help developers build confidence in their models and enhance the credibility of ice sheet models overall.

  1. Applications of a hologram watermarking protocol: aging-aware biometric signature verification and time validity check with personal documents

    NASA Astrophysics Data System (ADS)

    Vielhauer, Claus; Croce Ferri, Lucilla

    2003-06-01

Our paper addresses two issues of a previously presented biometric authentication algorithm for ID cardholders, namely the security of the embedded reference data and the aging process of the biometric data. We describe a protocol that allows two levels of verification, combining a biometric hash technique based on handwritten signatures and hologram watermarks with cryptographic signatures in a verification infrastructure. This infrastructure consists of a Trusted Central Public Authority (TCPA), which serves numerous Enrollment Stations (ES) in a secure environment. Each individual performs an enrollment at an ES, which provides the TCPA with the full biometric reference data and a document hash. The TCPA then calculates the authentication record (AR) from the biometric hash, a validity timestamp, and the document hash provided by the ES. The AR is then signed with a cryptographic signature function, initialized with the TCPA's private key, and embedded in the ID card as a watermark. Authentication is performed at Verification Stations (VS), where the ID card is scanned and the signed AR is retrieved from the watermark. Due to the timestamp mechanism and a two-level biometric verification technique based on offline and online features, the AR can deal with the aging process of the biometric feature by forcing a re-enrollment of the user after expiry, making use of the ES infrastructure. We describe some attack scenarios and illustrate the watermark embedding, retrieval and dispute protocols, analyzing their requisites, advantages and disadvantages in relation to security requirements.
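
    The sketch below illustrates the AR signing and checking steps; Ed25519 stands in for the unspecified cryptographic signature function, and the hashes, encoding, and timestamp are placeholders.

```python
# Sketch of signing/verifying an authentication record (AR), with Ed25519
# standing in for the TCPA's signature function (scheme assumed, not stated).
import hashlib
from cryptography.hazmat.primitives.asymmetric import ed25519

tcpa_key = ed25519.Ed25519PrivateKey.generate()        # held only by the TCPA

def build_ar(biometric_hash: bytes, validity_ts: str, document_hash: bytes) -> bytes:
    # Concatenation is a placeholder for the record's unspecified AR encoding.
    return biometric_hash + validity_ts.encode() + document_hash

biometric_hash = hashlib.sha256(b"offline+online signature features").digest()
document_hash = hashlib.sha256(b"ID card document data").digest()
ar = build_ar(biometric_hash, "2005-06-30", document_hash)

signature = tcpa_key.sign(ar)      # AR + signature are embedded as a watermark

# At a Verification Station: retrieve (ar, signature) from the watermark and
# check it against the TCPA's public key before biometric matching; verify()
# raises InvalidSignature if either component was tampered with.
tcpa_key.public_key().verify(signature, ar)
print("AR signature valid; next, check the validity timestamp")
```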

  2. A DVE Time Management Simulation and Verification Platform Based on Causality Consistency Middleware

    NASA Astrophysics Data System (ADS)

    Zhou, Hangjun; Zhang, Wei; Peng, Yuxing; Li, Sikun

When designing a time management algorithm for distributed virtual environments (DVEs), researchers are often slowed by the distraction of implementing the trivial but fundamental details of simulation and verification. A platform that already implements these details is therefore desirable, but to our knowledge none has been described in published work. In this paper, we are the first to design and realize a DVE time management simulation and verification platform providing exactly the same interfaces as those defined by the HLA Interface Specification. Moreover, our platform is based on a newly designed causality consistency middleware and offers a comparison of three kinds of time management services: causal order (CO), receive order (RO) and timestamp order (TSO). The experimental results show that the platform imposes only a small overhead and performs efficiently, letting researchers focus solely on improving their algorithms.
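
    As a rough illustration of one of the three service classes, the sketch below implements timestamp-order (TSO) delivery with a priority queue; the platform's HLA-conformant interfaces and causal-order middleware are not reproduced.

```python
# Minimal timestamp-order (TSO) delivery sketch: events are released
# strictly by timestamp once the time lower bound (LBTS) has passed them.
import heapq

class TSOQueue:
    def __init__(self):
        self._heap = []

    def schedule(self, timestamp, sender, payload):
        # Sender id acts as a deterministic tie-break for equal timestamps.
        heapq.heappush(self._heap, (timestamp, sender, payload))

    def advance_to(self, lbts):
        """Deliver every event with timestamp <= LBTS, mirroring an
        HLA-style time-advance grant."""
        while self._heap and self._heap[0][0] <= lbts:
            yield heapq.heappop(self._heap)

q = TSOQueue()
q.schedule(12.0, "fed-B", "update x")
q.schedule(5.0, "fed-A", "update y")
for event in q.advance_to(10.0):
    print(event)    # only the t = 5.0 event is delivered
```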

  3. How Formal Dynamic Verification Tools Facilitate Novel Concurrency Visualizations

    NASA Astrophysics Data System (ADS)

    Aananthakrishnan, Sriram; Delisi, Michael; Vakkalanka, Sarvani; Vo, Anh; Gopalakrishnan, Ganesh; Kirby, Robert M.; Thakur, Rajeev

With the exploding scale of concurrency, presenting valuable pieces of information collected by formal verification tools intuitively and graphically can greatly enhance concurrent system debugging. Traditional MPI program debuggers present trace views of MPI program executions. Such views are redundant, often containing equivalent traces that permute independent MPI calls. In our ISP formal dynamic verifier for MPI programs, we present a collection of alternate views made possible by the use of formal dynamic verification. Some of ISP's views help pinpoint errors, some facilitate discerning errors by eliminating redundancy, while others help understand the program better by displaying concurrent event orderings that must be respected by all MPI implementations, in the form of completes-before graphs. In this paper, we describe ISP's graphical user interface (GUI) capabilities in all these areas, which are currently supported by a portable Java-based GUI, a Microsoft Visual Studio GUI, and an Eclipse-based GUI whose development is in progress.

  4. Verification and Validation Methodology of Real-Time Adaptive Neural Networks for Aerospace Applications

    NASA Technical Reports Server (NTRS)

    Gupta, Pramod; Loparo, Kenneth; Mackall, Dale; Schumann, Johann; Soares, Fola

    2004-01-01

Recent research has shown that adaptive neural-network-based control systems are very effective in restoring stability and control of an aircraft in the presence of damage or failures. The application of an adaptive neural network within a flight-critical control system requires a thorough and proven process to ensure safe and proper flight operation. Unique testing tools have been developed as part of a process to perform verification and validation (V&V) of the real-time adaptive neural networks used in recent adaptive flight control systems and to evaluate the performance of the online-trained neural networks. The tools will help in certification by the FAA and in the successful deployment of neural-network-based adaptive controllers in safety-critical applications. The process to perform verification and validation is evaluated against a typical neural adaptive controller and the results are discussed.

  5. The IAEA neutron coincidence counting (INCC) and the DEMING least-squares fitting programs

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Krick, M.S.; Harker, W.C.; Rinard, P.M.

    1998-12-01

Two computer programs are described: (1) the INCC (IAEA or International Neutron Coincidence Counting) program and (2) the DEMING curve-fitting program. The INCC program is an IAEA version of the Los Alamos NCC (Neutron Coincidence Counting) code. The DEMING program is an upgrade of earlier Windows® and DOS codes with the same name. The versions described are INCC 3.00 and DEMING 1.11. The INCC and DEMING codes provide inspectors with the software support needed to perform calibration and verification measurements with all of the neutron coincidence counting systems used in IAEA inspections for the nondestructive assay of plutonium and uranium.

  6. Gene expression-based chemical genomics identifies potential therapeutic drugs in hepatocellular carcinoma.

    PubMed

    Chen, Ming-Huang; Yang, Wu-Lung R; Lin, Kuan-Ting; Liu, Chia-Hung; Liu, Yu-Wen; Huang, Kai-Wen; Chang, Peter Mu-Hsin; Lai, Jin-Mei; Hsu, Chun-Nan; Chao, Kun-Mao; Kao, Cheng-Yan; Huang, Chi-Ying F

    2011-01-01

    Hepatocellular carcinoma (HCC) is an aggressive tumor with a poor prognosis. Currently, only sorafenib is approved by the FDA for advanced HCC treatment; therefore, there is an urgent need to discover candidate therapeutic drugs for HCC. We hypothesized that if a drug signature could reverse, at least in part, the gene expression signature of HCC, it might have the potential to inhibit HCC-related pathways and thereby treat HCC. To test this hypothesis, we first built an integrative platform, the "Encyclopedia of Hepatocellular Carcinoma genes Online 2", dubbed EHCO2, to systematically collect, organize and compare the publicly available data from HCC studies. The resulting collection includes a total of 4,020 genes. To systematically query the Connectivity Map (CMap), which includes 6,100 drug-mediated expression profiles, we further designed various gene signature selection and enrichment methods, including a randomization technique, majority vote, and clique analysis. Subsequently, 28 out of 50 prioritized drugs, including tanespimycin, trichostatin A, thioguanosine, and several anti-psychotic drugs with anti-tumor activities, were validated via MTT cell viability assays and clonogenic assays in HCC cell lines. To accelerate their future clinical use, possibly through drug-repurposing, we selected two well-established drugs to test in mice, chlorpromazine and trifluoperazine. Both drugs inhibited orthotopic liver tumor growth. In conclusion, we successfully discovered and validated existing drugs for potential HCC therapeutic use with the pipeline of Connectivity Map analysis and lab verification, thereby suggesting the usefulness of this procedure to accelerate drug repurposing for HCC treatment.
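
    A toy illustration of the reversal idea behind the CMap query follows; the simple rank-difference score below is for intuition only and is not the KS-statistic-based connectivity score CMap actually computes.

```python
# Toy "signature reversal" score: a drug that pushes the disease-up genes
# toward the bottom of its expression ranking (and disease-down genes toward
# the top) receives a negative score, the behavior sought in HCC candidates.
import numpy as np

def reversal_score(drug_ranks, up_sig, down_sig):
    """drug_ranks: dict gene -> rank (1 = most up-regulated by the drug).
    Returns a score in (-1, 1); negative suggests signature reversal."""
    n = len(drug_ranks)
    up = np.mean([drug_ranks[g] for g in up_sig])
    down = np.mean([drug_ranks[g] for g in down_sig])
    return (down - up) / n

# Hypothetical 6-gene profile: disease-up genes A, B end up down-regulated.
ranks = {"A": 6, "B": 5, "C": 4, "D": 1, "E": 2, "F": 3}
print(reversal_score(ranks, up_sig=["A", "B"], down_sig=["D", "E"]))   # -0.67
```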

  7. Trueness verification of actual creatinine assays in the European market demonstrates a disappointing variability that needs substantial improvement. An international study in the framework of the EC4 creatinine standardization working group.

    PubMed

    Delanghe, Joris R; Cobbaert, Christa; Galteau, Marie-Madeleine; Harmoinen, Aimo; Jansen, Rob; Kruse, Rolf; Laitinen, Päivi; Thienpont, Linda M; Wuyts, Birgitte; Weykamp, Cas; Panteghini, Mauro

    2008-01-01

The European In Vitro Diagnostics (IVD) directive requires traceability of analytes to reference methods and materials. It is a task of the profession to verify the trueness of results and IVD compatibility. The results of a trueness verification study by the European Communities Confederation of Clinical Chemistry (EC4) working group on creatinine standardization are described, in which 189 European laboratories analyzed serum creatinine in a commutable serum-based material, using analytical systems from seven companies. Values were targeted using isotope dilution gas chromatography/mass spectrometry. Results were tested for compliance with a set of three criteria: trueness, i.e., no significant bias relative to the target value; between-laboratory variation; and within-laboratory variation relative to the maximum allowable error. At the low and intermediate levels, values differed significantly from the target value for the Jaffe and dry chemistry methods. At the high level, dry chemistry yielded higher results. Between-laboratory coefficients of variation ranged from 4.37% to 8.74%. The total error budget was mainly consumed by the bias. Non-compensated Jaffe methods largely exceeded the total error budget. The best results were obtained for the enzymatic method. The dry chemistry method consumed a large part of its error budget due to calibration bias. Despite the European IVD directive and the growing need for creatinine standardization, an unacceptable inter-laboratory variation was observed, which was mainly due to calibration differences. This calibration variation has major clinical consequences, in particular in pediatrics, where reference ranges for serum and plasma creatinine are low, and in the estimation of glomerular filtration rate.
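
    The sketch below illustrates how such compliance criteria can be checked against an error budget; the allowable-error value and its split between bias and imprecision are illustrative assumptions, not the study's figures.

```python
# Hedged sketch of trueness-verification criteria: bias vs. target and
# between-laboratory CV, each judged against a share of the maximum
# allowable total error (budget split below is illustrative only).
import numpy as np

def check_compliance(lab_means, target, max_total_error_pct=12.0):
    lab_means = np.asarray(lab_means)               # one mean per laboratory
    bias_pct = 100.0 * (lab_means.mean() - target) / target
    between_cv = 100.0 * lab_means.std(ddof=1) / lab_means.mean()
    return {
        "bias %": round(bias_pct, 2),
        "between-lab CV %": round(between_cv, 2),
        "bias within budget": abs(bias_pct) <= max_total_error_pct / 2,
        "CV within budget": between_cv <= max_total_error_pct / 4,
    }

# Hypothetical results at the low level, target 55.0 umol/L creatinine.
print(check_compliance([54.1, 56.2, 55.3, 53.8, 56.9], target=55.0))
```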

  8. Model-based engineering for medical-device software.

    PubMed

    Ray, Arnab; Jetley, Raoul; Jones, Paul L; Zhang, Yi

    2010-01-01

This paper demonstrates the benefits of adopting model-based design techniques for engineering medical device software. Using a patient-controlled analgesic (PCA) infusion pump as a candidate medical device, the authors show how using models to capture design information allows for (i) fast and efficient construction of executable device prototypes, (ii) creation of a standard, reusable baseline software architecture for a particular device family, (iii) formal verification of the design against safety requirements, and (iv) creation of a safety framework that reduces verification costs for future versions of the device software.

  9. A High-Level Language for Modeling Algorithms and Their Properties

    NASA Astrophysics Data System (ADS)

    Akhtar, Sabina; Merz, Stephan; Quinson, Martin

Designers of concurrent and distributed algorithms usually express them using pseudo-code. In contrast, most verification techniques are based on more mathematically oriented formalisms such as state transition systems. This conceptual gap contributes to hindering the use of formal verification techniques. Leslie Lamport introduced PlusCal, a high-level algorithmic language that has the "look and feel" of pseudo-code but is equipped with a precise semantics and includes a high-level expression language based on set theory. PlusCal models can be compiled to TLA+ and verified using the TLC model checker.

  10. Verification technology of remote sensing camera satellite imaging simulation based on ray tracing

    NASA Astrophysics Data System (ADS)

    Gu, Qiongqiong; Chen, Xiaomei; Yang, Deyun

    2017-08-01

Remote sensing satellite camera imaging simulation technology is widely used to evaluate satellite imaging quality and to test data application systems, but the accuracy of the simulation is hard to verify. In this paper, we propose an experimental simulation verification method based on comparing test parameter variations. Starting from a ray-tracing-based simulation model, the experiment verifies the model's accuracy by changing the types of devices that correspond to the model's parameters. The experimental results show that the similarity between the ray-tracing-based imaging model and the experimental image is 91.4%, indicating that the model simulates the remote sensing satellite imaging system very well.

  11. Computational Prediction and Experimental Verification of New MAP Kinase Docking Sites and Substrates Including Gli Transcription Factors

    PubMed Central

    Whisenant, Thomas C.; Ho, David T.; Benz, Ryan W.; Rogers, Jeffrey S.; Kaake, Robyn M.; Gordon, Elizabeth A.; Huang, Lan; Baldi, Pierre; Bardwell, Lee

    2010-01-01

    In order to fully understand protein kinase networks, new methods are needed to identify regulators and substrates of kinases, especially for weakly expressed proteins. Here we have developed a hybrid computational search algorithm that combines machine learning and expert knowledge to identify kinase docking sites, and used this algorithm to search the human genome for novel MAP kinase substrates and regulators focused on the JNK family of MAP kinases. Predictions were tested by peptide array followed by rigorous biochemical verification with in vitro binding and kinase assays on wild-type and mutant proteins. Using this procedure, we found new ‘D-site’ class docking sites in previously known JNK substrates (hnRNP-K, PPM1J/PP2Czeta), as well as new JNK-interacting proteins (MLL4, NEIL1). Finally, we identified new D-site-dependent MAPK substrates, including the hedgehog-regulated transcription factors Gli1 and Gli3, suggesting that a direct connection between MAP kinase and hedgehog signaling may occur at the level of these key regulators. These results demonstrate that a genome-wide search for MAP kinase docking sites can be used to find new docking sites and substrates. PMID:20865152

  12. Arithmetic Circuit Verification Based on Symbolic Computer Algebra

    NASA Astrophysics Data System (ADS)

    Watanabe, Yuki; Homma, Naofumi; Aoki, Takafumi; Higuchi, Tatsuo

This paper presents a formal approach to verifying arithmetic circuits using symbolic computer algebra. Our method describes arithmetic circuits directly with high-level mathematical objects based on weighted number systems and arithmetic formulae. Such circuit descriptions can be effectively verified by polynomial reduction techniques using Gröbner bases. In this paper, we describe how symbolic computer algebra can be used to describe and verify arithmetic circuits. The advantages of the proposed approach are demonstrated through experimental verification of arithmetic circuits such as a multiply-accumulator and an FIR filter. The results show that the proposed approach is capable of verifying practical arithmetic circuits.
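
    A deliberately tiny instance of the polynomial-reduction idea is sketched below for a half adder, using sympy; the paper's weighted-number-system formulation for larger circuits such as multiply-accumulators is not reproduced.

```python
# Groebner-basis equivalence check of a half adder (illustrative sketch).
# Each gate becomes a polynomial whose roots encode correct behavior; the
# word-level specification reduces to 0 modulo the circuit ideal iff the
# circuit implements it.
from sympy import symbols, groebner

a, b, s, c = symbols('a b s c')

circuit = [
    s - (a + b - 2*a*b),    # s = a XOR b
    c - a*b,                # c = a AND b
    a**2 - a, b**2 - b,     # inputs are Boolean (0 or 1)
]

spec = (a + b) - (s + 2*c)  # arithmetic spec: a + b = s + 2c

G = groebner(circuit, a, b, s, c, order='lex')
_, remainder = G.reduce(spec)
print(remainder)            # 0  =>  circuit satisfies the specification
```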

  13. Control of operating parameters of laser ceilometers with the application of fiber optic delay line imitation

    NASA Astrophysics Data System (ADS)

    Kim, A. A.; Klochkov, D. V.; Konyaev, M. A.; Mihaylenko, A. S.

    2017-11-01

The article considers the problem of control and verification of a laser ceilometer's basic performance parameters and describes an alternative method based on the use of a multi-length fiber optic delay line that simulates an atmospheric path. The results of the described experiment demonstrate the great potential of this method for inspection and verification procedures for laser ceilometers.
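
    The mapping from fiber length to the apparent altitude it simulates follows directly from the round-trip timing; the group index below is a typical value for silica fiber and is an assumption, since the record specifies neither the fiber type nor the wavelength.

```python
# A fiber of length L and group index n delays the pulse by t = L*n/c0,
# which the ceilometer interprets as a target at h = c0*t/2 = L*n/2.
C0 = 299_792_458.0       # vacuum speed of light, m/s
N_GROUP = 1.468          # assumed group index of silica fiber

def apparent_altitude(fiber_length_m: float) -> float:
    delay = fiber_length_m * N_GROUP / C0     # one-way fiber transit time, s
    return C0 * delay / 2.0                   # equals fiber_length_m * N_GROUP / 2

for length in (500.0, 1000.0, 2000.0):
    print(f"{length:6.0f} m of fiber -> simulated cloud base at "
          f"{apparent_altitude(length):6.0f} m")
```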

  14. Improving semi-text-independent method of writer verification using difference vector

    NASA Astrophysics Data System (ADS)

    Li, Xin; Ding, Xiaoqing

    2009-01-01

The semi-text-independent method of writer verification based on the linear framework is a method that can use all characters of two handwriting samples to discriminate between writers when the text contents are known. The samples may share only a small number of characters, or even entirely different ones. This fills the gap between the classical text-dependent and text-independent methods of writer verification. Moreover, the method in this paper makes use of the identity of each character. Two types of standard templates, generated from many writer-unknown handwritten samples and printed samples of each character, are introduced to represent the content information of each character. The difference vectors of the character samples are obtained by subtracting the standard templates from the original feature vectors, and they replace the original vectors in the writer verification process. By removing a large amount of content information while retaining the style information, the verification accuracy of the semi-text-independent method is improved. On a handwriting database involving 30 writers, when the query handwriting and the reference handwriting are each composed of 30 distinct characters, the average equal error rate (EER) of writer verification reaches 9.96%. When the handwritings contain 50 characters, the average EER falls to 6.34%, which is 23.9% lower than the EER without the difference vectors.

  15. Industrial methodology for process verification in research (IMPROVER): toward systems biology verification

    PubMed Central

    Meyer, Pablo; Hoeng, Julia; Rice, J. Jeremy; Norel, Raquel; Sprengel, Jörg; Stolle, Katrin; Bonk, Thomas; Corthesy, Stephanie; Royyuru, Ajay; Peitsch, Manuel C.; Stolovitzky, Gustavo

    2012-01-01

Motivation: Analyses and algorithmic predictions based on high-throughput data are essential for the success of systems biology in academic and industrial settings. Organizations, such as companies and academic consortia, conduct large multi-year scientific studies that entail the collection and analysis of thousands of individual experiments, often over many physical sites and with internal and outsourced components. To extract maximum value, the interested parties need to verify the accuracy and reproducibility of data and methods before the initiation of such large multi-year studies. However, systematic and well-established verification procedures do not exist for automated collection and analysis workflows in systems biology, which could lead to inaccurate conclusions. Results: We present here a review of the current state of systems biology verification and a detailed methodology to address its shortcomings. This methodology, named 'Industrial Methodology for Process Verification in Research' or IMPROVER, consists of evaluating a research program by dividing a workflow into smaller building blocks that are individually verified. The verification of each building block can be done internally by members of the research program or externally by 'crowd-sourcing' to an interested community. www.sbvimprover.com Implementation: This methodology could become the preferred choice for verifying systems biology research workflows that are becoming increasingly complex and sophisticated in industrial and academic settings. Contact: gustavo@us.ibm.com PMID:22423044

  16. Improved Detection Technique for Solvent Rinse Cleanliness Verification

    NASA Technical Reports Server (NTRS)

    Hornung, S. D.; Beeson, H. D.

    2001-01-01

    The NASA White Sands Test Facility (WSTF) has an ongoing effort to reduce or eliminate usage of cleaning solvents such as CFC-113 and its replacements. These solvents are used in the final clean and cleanliness verification processes for flight and ground support hardware, especially for oxygen systems where organic contaminants can pose an ignition hazard. For the final cleanliness verification in the standard process, the equivalent of one square foot of surface area of parts is rinsed with the solvent, and the final 100 mL of the rinse is captured. The amount of nonvolatile residue (NVR) in the solvent is determined by weight after the evaporation of the solvent. An improved process of sampling this rinse, developed at WSTF, requires evaporation of less than 2 mL of the solvent to make the cleanliness verification. Small amounts of the solvent are evaporated in a clean stainless steel cup, and the cleanliness of the stainless steel cup is measured using a commercially available surface quality monitor. The effectiveness of this new cleanliness verification technique was compared to the accepted NVR sampling procedures. Testing with known contaminants in solution, such as hydraulic fluid, fluorinated lubricants, and cutting and lubricating oils, was performed to establish a correlation between amount in solution and the process response. This report presents the approach and results and discusses the issues in establishing the surface quality monitor-based cleanliness verification.

  17. Application of virtual distances methodology to laser tracker verification with an indexed metrology platform

    NASA Astrophysics Data System (ADS)

    Acero, R.; Santolaria, J.; Pueo, M.; Aguilar, J. J.; Brau, A.

    2015-11-01

High-range measuring equipment such as laser trackers needs large-dimension calibrated reference artifacts in its calibration and verification procedures. In this paper, a new verification procedure for portable coordinate measuring instruments, based on the generation and evaluation of virtual distances with an indexed metrology platform, is developed. This methodology enables the definition of an unlimited number of reference distances without materializing them in a physical gauge. The generation of the virtual points, and of the reference lengths derived from them, relies on the indexed metrology platform and on high-accuracy knowledge of the relative position and orientation of its upper and lower platforms. The measuring instrument, together with the indexed metrology platform, remains still while the virtual mesh rotates around them. As a first step, the virtual distances technique is applied to a laser tracker in this work. The experimental verification procedure of the laser tracker with virtual distances is simulated and further compared with the conventional verification procedure of the laser tracker with the indexed metrology platform. The results obtained in terms of volumetric performance of the laser tracker prove the suitability of the virtual distances methodology in calibration and verification procedures for portable coordinate measuring instruments, broadening the possibilities for defining reference distances in these procedures.

  18. Design exploration and verification platform, based on high-level modeling and FPGA prototyping, for fast and flexible digital communication in physics experiments

    NASA Astrophysics Data System (ADS)

    Magazzù, G.; Borgese, G.; Costantino, N.; Fanucci, L.; Incandela, J.; Saponara, S.

    2013-02-01

In many research fields, such as high energy physics (HEP), astrophysics, nuclear medicine, and space engineering with harsh operating conditions, the use of fast and flexible digital communication protocols is becoming more and more important. A smart, well-tested top-down design flow for a new protocol for the control and readout of front-end electronics is therefore very useful. To this aim, and to reduce development time, costs and risks, this paper describes an innovative design/verification flow applied, as an example case study, to a new communication protocol called FF-LYNX. After the description of the main FF-LYNX features, the paper presents: the definition of a parametric SystemC-based Integrated Simulation Environment (ISE) for high-level protocol definition and validation; the setup of figures of merit to drive the design space exploration; the use of the ISE for early analysis of the achievable performance when adopting the new communication protocol and its interfaces for a new (or upgraded) physics experiment; the design of VHDL IP cores for the TX and RX protocol interfaces; their implementation on an FPGA-based emulator for functional verification; and finally the modification of the FPGA-based emulator for testing the ASIC chipset that implements the rad-tolerant protocol interfaces. For every step, significant results are shown to underline the usefulness of this design and verification approach, which can be applied to any new digital protocol development for smart detectors in physics experiments.

  19. A system for EPID-based real-time treatment delivery verification during dynamic IMRT treatment.

    PubMed

    Fuangrod, Todsaporn; Woodruff, Henry C; van Uytven, Eric; McCurdy, Boyd M C; Kuncic, Zdenka; O'Connor, Daryl J; Greer, Peter B

    2013-09-01

To design and develop a real-time electronic portal imaging device (EPID)-based delivery verification system for dynamic intensity modulated radiation therapy (IMRT) that enables detection of gross treatment delivery errors before delivery of substantial radiation to the patient. The system utilizes a comprehensive physics-based model to generate a series of predicted transit EPID image frames as a reference dataset and compares these to measured EPID frames acquired during treatment. The two datasets are compared using MLC aperture comparison and cumulative signal checking techniques. Real-time operation of the system was simulated offline using previously acquired images for 19 IMRT patient deliveries, with both frame-by-frame comparison and cumulative frame comparison. Simulated error case studies were used to demonstrate the system's sensitivity and performance. The accuracy of the synchronization method was shown to agree within two control points, which corresponds to approximately 1% of the total MU to be delivered for dynamic IMRT. The system achieved mean real-time gamma results for frame-by-frame analysis of 86.6% and 89.0% for 3%, 3 mm and 4%, 4 mm criteria, respectively, and 97.9% and 98.6% for cumulative gamma analysis. The system can detect a 10% MU error using 3%, 3 mm criteria within approximately 10 s. The EPID-based real-time delivery verification system successfully detected simulated gross errors introduced into patient plan deliveries in near real time (within 0.1 s). A real-time radiation delivery verification system for dynamic IMRT has been demonstrated that is designed to prevent major mistreatments in modern radiation therapy.
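
    A simplified one-dimensional global gamma computation of the kind summarized above is sketched below; real EPID frames are two-dimensional and the system's physics-based prediction model is not reproduced.

```python
# Simplified 1-D global gamma analysis (e.g., 3%/3 mm criteria).
import numpy as np

def gamma_pass_rate(measured, predicted, spacing_mm, dose_pct=3.0, dta_mm=3.0):
    """measured, predicted: 1-D dose profiles sampled on the same grid."""
    x = np.arange(len(predicted)) * spacing_mm
    dose_tol = dose_pct / 100.0 * predicted.max()      # global normalization
    gammas = np.empty(len(measured))
    for i, (xm, dm) in enumerate(zip(x, measured)):
        dist2 = ((x - xm) / dta_mm) ** 2               # distance-to-agreement term
        dose2 = ((predicted - dm) / dose_tol) ** 2     # dose-difference term
        gammas[i] = np.sqrt((dist2 + dose2).min())     # search over all points
    return 100.0 * np.mean(gammas <= 1.0)

rng = np.random.default_rng(0)
pred = np.exp(-((np.arange(100) - 50) / 15.0) ** 2)    # synthetic profile
meas = pred * (1 + 0.01 * rng.standard_normal(100))    # ~1% measurement noise
print(f"gamma pass rate (3%/3 mm): {gamma_pass_rate(meas, pred, spacing_mm=1.0):.1f}%")
```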

  20. CSTI Earth-to-orbit propulsion research and technology program overview

    NASA Technical Reports Server (NTRS)

    Gentz, Steven J.

    1993-01-01

NASA supports a vigorous Earth-to-orbit (ETO) research and technology program as part of its Civil Space Technology Initiative. The purpose of this program is to provide an up-to-date technology base to support future space transportation needs for a new generation of lower cost, operationally efficient, long-lived and highly reliable ETO propulsion systems by enhancing the knowledge, understanding and design methodology applicable to advanced oxygen/hydrogen and oxygen/hydrocarbon ETO propulsion systems. Program areas of interest include analytical models, advanced component technology, instrumentation, and validation/verification testing. Organizationally, the program is divided between (1) technology acquisition and (2) technology verification.

  1. Development of a Calibration Strip for Immunochromatographic Assay Detection Systems.

    PubMed

    Gao, Yue-Ming; Wei, Jian-Chong; Mak, Peng-Un; Vai, Mang-I; Du, Min; Pun, Sio-Hang

    2016-06-29

With many benefits and applications, immunochromatographic (ICG) assay detection systems have been widely reported. However, existing research mainly focuses on increasing the dynamic detection range or the application fields. Calibration of the detection system, which has a great influence on detection accuracy, has not been addressed properly. In this context, this work develops a calibration strip for ICG assay photoelectric detection systems. An image of the test strip is captured by an image acquisition device, followed by a fuzzy c-means (FCM) clustering algorithm and a maximin-distance algorithm for image segmentation. Additionally, experiments are conducted to find the best characteristic quantity. By analyzing the linear correlation coefficient, the average hue (H) at 14 min is chosen as the characteristic quantity, and an empirical formula between H and the optical density (OD) value is established. H, saturation (S), and value (V) are then calculated for a number of selected OD values, converted to the RGB color space, and a high-resolution printer is used to print the strip images on cellulose nitrate membranes. Finally, verification of the printed calibration strips is conducted by analyzing the linear correlation between OD and spectral reflectance, which shows a good linear correlation (R² = 98.78%).
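
    The final conversion step can be sketched as below; the linear H(OD) relation is a placeholder, since the paper fits its own empirical formula from strip images at 14 min.

```python
# Map OD -> hue via a placeholder empirical formula, then HSV -> RGB for
# printing the calibration strip (S and V fixed for illustration).
import colorsys

def od_to_rgb(od, s=0.85, v=0.90):
    h = 0.95 - 0.60 * od                       # hypothetical H(OD), H in [0, 1]
    h = max(0.0, min(1.0, h))
    r, g, b = colorsys.hsv_to_rgb(h, s, v)
    return tuple(round(255 * x) for x in (r, g, b))

for od in (0.1, 0.4, 0.8):
    print(f"OD {od}: RGB {od_to_rgb(od)}")
```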

  2. Identification of a small-molecule ligand of the epigenetic reader protein Spindlin1 via a versatile screening platform

    PubMed Central

    Wagner, Tobias; Greschik, Holger; Burgahn, Teresa; Schmidtkunz, Karin; Schott, Anne-Kathrin; McMillan, Joel; Baranauskienė, Lina; Xiong, Yan; Fedorov, Oleg; Jin, Jian; Oppermann, Udo; Matulis, Daumantas; Schüle, Roland; Jung, Manfred

    2016-01-01

    Epigenetic modifications of histone tails play an essential role in the regulation of eukaryotic transcription. Writer and eraser enzymes establish and maintain the epigenetic code by creating or removing posttranslational marks. Specific binding proteins, called readers, recognize the modifications and mediate epigenetic signalling. Here, we present a versatile assay platform for the investigation of the interaction between methyl lysine readers and their ligands. This can be utilized for the screening of small-molecule inhibitors of such protein–protein interactions and the detailed characterization of the inhibition. Our platform is constructed in a modular way consisting of orthogonal in vitro binding assays for ligand screening and verification of initial hits and biophysical, label-free techniques for further kinetic characterization of confirmed ligands. A stability assay for the investigation of target engagement in a cellular context complements the platform. We applied the complete evaluation chain to the Tudor domain containing protein Spindlin1 and established the in vitro test systems for the double Tudor domain of the histone demethylase JMJD2C. We finally conducted an exploratory screen for inhibitors of the interaction between Spindlin1 and H3K4me3 and identified A366 as the first nanomolar small-molecule ligand of a Tudor domain containing methyl lysine reader. PMID:26893353

  3. Protein isoform-specific validation defines multiple chloride intracellular channel and tropomyosin isoforms as serological biomarkers of ovarian cancer

    PubMed Central

    Tang, Hsin-Yao; Beer, Lynn A.; Tanyi, Janos L.; Zhang, Rugang; Liu, Qin; Speicher, David W.

    2013-01-01

    New serological biomarkers for early detection and clinical management of ovarian cancer are urgently needed, and many candidates have been reported. A major challenge frequently encountered when validating candidates in patients is establishing quantitative assays that distinguish between highly homologous proteins. The current study tested whether multiple members of two recently discovered ovarian cancer biomarker protein families, chloride intracellular channel (CLIC) proteins and tropomyosins (TPM), were detectable in ovarian cancer patient sera. A multiplexed, label-free multiple reaction monitoring (MRM) assay was established to target peptides specific to all detected CLIC and TPM family members, and their serum levels were quantitated for ovarian cancer patients and non-cancer controls. In addition to CLIC1 and TPM1, which were the proteins initially discovered in a xenograft mouse model, CLIC4, TPM2, TPM3, and TPM4 were present in ovarian cancer patient sera at significantly elevated levels compared with controls. Some of the additional biomarkers identified in this homolog-centric verification and validation approach may be superior to the previously identified biomarkers at discriminating between ovarian cancer and non-cancer patients. This demonstrates the importance of considering all potential protein homologs and using quantitative assays for cancer biomarker validation with well-defined isoform specificity. PMID:23792823

  4. Clinical validation of the 50 gene AmpliSeq Cancer Panel V2 for use on a next generation sequencing platform using formalin fixed, paraffin embedded and fine needle aspiration tumour specimens.

    PubMed

    Rathi, Vivek; Wright, Gavin; Constantin, Diana; Chang, Siok; Pham, Huong; Jones, Kerryn; Palios, Atha; Mclachlan, Sue-Anne; Conron, Matthew; McKelvie, Penny; Williams, Richard

    2017-01-01

The advent of massively parallel sequencing has caused a paradigm shift in the way cancer is treated, as personalised therapy becomes a reality. More and more laboratories are looking to introduce next generation sequencing (NGS) as a tool for mutational analysis, as this technology has many advantages over conventional platforms like Sanger sequencing. In Australia, all massively parallel sequencing platforms are still considered in-house in vitro diagnostic tools by the National Association of Testing Authorities (NATA), and a comprehensive analytical validation of all assays, not mere verification, is a strict requirement before accreditation can be granted for clinical testing on these platforms. Analytical validation of assays on NGS platforms can prove extremely challenging for pathology laboratories. Although many affordable and easily accessible NGS instruments are available, there are as yet no standardised guidelines for clinical validation of NGS assays. We present an accreditation development procedure that was both comprehensive and applicable in a hospital laboratory setting for NGS services. This approach may also be applied to other NGS applications in service laboratories. Copyright © 2016 Royal College of Pathologists of Australasia. Published by Elsevier B.V. All rights reserved.

  5. Protein isoform-specific validation defines multiple chloride intracellular channel and tropomyosin isoforms as serological biomarkers of ovarian cancer.

    PubMed

    Tang, Hsin-Yao; Beer, Lynn A; Tanyi, Janos L; Zhang, Rugang; Liu, Qin; Speicher, David W

    2013-08-26

    New serological biomarkers for early detection and clinical management of ovarian cancer are urgently needed, and many candidates have been reported. A major challenge frequently encountered when validating candidates in patients is establishing quantitative assays that distinguish between highly homologous proteins. The current study tested whether multiple members of two recently discovered ovarian cancer biomarker protein families, chloride intracellular channel (CLIC) proteins and tropomyosins (TPM), were detectable in ovarian cancer patient sera. A multiplexed, label-free multiple reaction monitoring (MRM) assay was established to target peptides specific to all detected CLIC and TPM family members, and their serum levels were quantitated for ovarian cancer patients and non-cancer controls. In addition to CLIC1 and TPM1, which were the proteins initially discovered in a xenograft mouse model, CLIC4, TPM2, TPM3, and TPM4 were present in ovarian cancer patient sera at significantly elevated levels compared with controls. Some of the additional biomarkers identified in this homolog-centric verification and validation approach may be superior to the previously identified biomarkers at discriminating between ovarian cancer and non-cancer patients. This demonstrates the importance of considering all potential protein homologs and using quantitative assays for cancer biomarker validation with well-defined isoform specificity. This manuscript addresses the importance of distinguishing between protein homologs and isoforms when identifying and validating cancer biomarkers in plasma or serum. Specifically, it describes the use of targeted in-depth LC-MS/MS analysis to determine the members of two protein families, chloride intracellular channel (CLIC) and tropomyosin (TPM) proteins that are detectable in sera of ovarian cancer patients. It then establishes a multiplexed isoform- and homology-specific MRM assay to quantify all observed gene products in these two protein families as well as many of the closely related tropomyosin isoforms. Using this assay, levels of all detected CLICs and TPMs were quantified in ovarian cancer patient and control subject sera. These results demonstrate that in addition to the previously known CLIC1, multiple tropomyosins and CLIC4 are promising new ovarian cancer biomarkers. Based on these initial validation studies, these new ovarian cancer biomarkers appear to be superior to most previously known ovarian cancer biomarkers. Copyright © 2013 Elsevier B.V. All rights reserved.

  6. Baseline Assessment and Prioritization Framework for IVHM Integrity Assurance Enabling Capabilities

    NASA Technical Reports Server (NTRS)

    Cooper, Eric G.; DiVito, Benedetto L.; Jacklin, Stephen A.; Miner, Paul S.

    2009-01-01

    Fundamental to vehicle health management is the deployment of systems incorporating advanced technologies for predicting and detecting anomalous conditions in highly complex and integrated environments. Integrated structural integrity health monitoring, statistical algorithms for detection, estimation, prediction, and fusion, and diagnosis supporting adaptive control are examples of advanced technologies that present considerable verification and validation challenges. These systems necessitate interactions between physical and software-based systems that are highly networked with sensing and actuation subsystems, and incorporate technologies that are, in many respects, different from those employed in civil aviation today. A formidable barrier to deploying these advanced technologies in civil aviation is the lack of enabling verification and validation tools, methods, and technologies. The development of new verification and validation capabilities will not only enable the fielding of advanced vehicle health management systems, but will also provide new assurance capabilities for verification and validation of current generation aviation software which has been implicated in anomalous in-flight behavior. This paper describes the research focused on enabling capabilities for verification and validation underway within NASA s Integrated Vehicle Health Management project, discusses the state of the art of these capabilities, and includes a framework for prioritizing activities.

  7. Exploration of Uncertainty in Glacier Modelling

    NASA Technical Reports Server (NTRS)

    Thompson, David E.

    1999-01-01

    There are procedures and methods for verification of coding algebra and for validations of models and calculations that are in use in the aerospace computational fluid dynamics (CFD) community. These methods would be efficacious if used by the glacier dynamics modelling community. This paper presents some of those methods and how they might be applied to uncertainty management supporting code verification and model validation for glacier dynamics. The similarities and differences between their use in CFD analysis and the proposed application of these methods to glacier modelling are discussed. After establishing sources of uncertainty and methods for code verification, the paper looks at a representative sampling of verification and validation efforts that are underway in the glacier modelling community, and establishes a context for these within overall solution quality assessment. Finally, an information architecture and interactive interface are introduced and advocated. This Integrated Cryospheric Exploration (ICE) Environment is proposed for exploring and managing sources of uncertainty in glacier modelling codes and methods, and for supporting scientific numerical exploration and verification. The details and functionality of this Environment are described based on modifications of a system already developed for CFD modelling and analysis.

  8. High-resolution face verification using pore-scale facial features.

    PubMed

    Li, Dong; Zhou, Huiling; Lam, Kin-Man

    2015-08-01

    Face recognition methods, which usually represent face images using holistic or local facial features, rely heavily on alignment. Their performance also degrades severely under variations in expression or pose, especially when there is only one gallery image per subject. With the easy access to high-resolution (HR) face images nowadays, some HR face databases have recently been developed. However, few studies have tackled the use of HR information for face recognition or verification. In this paper, we propose a pose-invariant face-verification method, which is robust to alignment errors, using the HR information based on pore-scale facial features. A new keypoint descriptor, namely the pore-Principal Component Analysis (PCA)-Scale Invariant Feature Transform (PPCASIFT), adapted from PCA-SIFT, is devised for the extraction of a compact set of distinctive pore-scale facial features. Having matched the pore-scale features of two face regions, an effective robust-fitting scheme is proposed for the face-verification task. Experiments show that, with only one frontal-view gallery image per subject, our proposed method outperforms a number of standard verification methods, and can achieve excellent accuracy even when the faces are under large variations in expression and pose.

  9. Analytical Verifications in Cryogenic Testing of NGST Advanced Mirror System Demonstrators

    NASA Technical Reports Server (NTRS)

    Cummings, Ramona; Levine, Marie; VanBuren, Dave; Kegley, Jeff; Green, Joseph; Hadaway, James; Presson, Joan; Cline, Todd; Stahl, H. Philip (Technical Monitor)

    2002-01-01

    Ground-based testing is a critical and costly part of component, assembly, and system verifications of large space telescopes. At such tests, however, with integral teamwork by planners, analysts, and test personnel, segments can be included to validate specific analytical parameters and algorithms at relatively low additional cost. This paper opens with the strategy of analytical verification segments added to vacuum cryogenic testing of Advanced Mirror System Demonstrator (AMSD) assemblies. These AMSD assemblies incorporate material and architecture concepts being considered in the Next Generation Space Telescope (NGST) design. The test segments for workmanship testing, cold survivability, and cold operation optical throughput are supplemented by segments for analytical verifications of specific structural, thermal, and optical parameters. Utilizing integrated modeling and separate materials testing, the paper continues with the support plan for analyses, data, and observation requirements during the AMSD testing, currently slated for late calendar year 2002 to mid calendar year 2003. The paper includes anomaly resolution as gleaned by the authors from similar analytical verification support of a previous large space telescope, then closes with a draft of plans for parameter extrapolations, to form a well-verified portion of the integrated modeling being done for NGST performance predictions.

  10. Authentication Based on Pole-zero Models of Signature Velocity

    PubMed Central

    Rashidi, Saeid; Fallah, Ali; Towhidkhah, Farzad

    2013-01-01

    With the increase of communication and financial transactions through the internet, online signature verification is an accepted biometric technology for access control and plays a significant role in authentication and authorization in modern society. Fast and precise algorithms for signature verification are therefore very attractive. The goal of this paper is to model the velocity signal, whose pattern and properties are stable for each person. Using pole-zero models based on the discrete cosine transform, a precise modeling method is proposed, and features are then extracted from strokes. Using linear, Parzen-window, and support vector machine classifiers, the signature verification technique was tested with a large number of authentic and forged signatures and demonstrated good potential. The signatures were collected from three different databases: a proprietary database and the SVC2004 and Sabanci University (SUSIG) benchmark databases. Experimental results based on the Persian, SVC2004 and SUSIG databases show that our method achieves equal error rates of 5.91%, 5.62% and 3.91% on skilled forgeries, respectively. PMID:24696797
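
    As a rough illustration of the feature-extraction step described above (plain DCT coefficients stand in here for the paper's pole-zero models, and all names, lengths and thresholds are invented), a velocity signal can be reduced to a compact template and a test signature compared against the enrolled one:

        import numpy as np
        from scipy.fft import dct

        def velocity_features(x, y, n_samples=256, n_coeffs=16):
            """Pen speed from sampled coordinates, resampled to a fixed length
            and summarized by its first DCT coefficients."""
            speed = np.hypot(np.diff(x), np.diff(y))
            grid = np.linspace(0.0, 1.0, n_samples)
            speed = np.interp(grid, np.linspace(0.0, 1.0, speed.size), speed)
            speed /= np.linalg.norm(speed) + 1e-12   # crude scale invariance
            return dct(speed, norm="ortho")[:n_coeffs]

        def verify(template, x, y, threshold=0.5):
            """Accept the test signature if its features lie near the template."""
            feats = velocity_features(x, y, n_coeffs=template.size)
            return np.linalg.norm(feats - template) < threshold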

  11. Knowledge-based verification of clinical guidelines by detection of anomalies.

    PubMed

    Duftschmid, G; Miksch, S

    2001-04-01

    As shown in numerous studies, a significant part of published clinical guidelines is tainted with different types of semantic errors that interfere with their practical application. The adaptation of generic guidelines, necessitated by circumstances such as resource limitations within the applying organization or unexpected events arising in the course of patient care, further promotes the introduction of defects. Still, most current approaches for the automation of clinical guidelines lack mechanisms that check the overall correctness of their output. In the domain of software engineering in general, and in the domain of knowledge-based systems (KBS) in particular, a common strategy for examining a system for potential defects is verification. The focus of this work is to present an approach that helps ensure the semantic correctness of clinical guidelines in a three-step process. We use a particular guideline specification language called Asbru to demonstrate our verification mechanism. A scenario-based evaluation of our method is provided, based on a guideline for the artificial ventilation of newborn infants. The described approach is kept sufficiently general to allow its application to several other guideline representation formats.

  12. Video-Based Fingerprint Verification

    PubMed Central

    Qin, Wei; Yin, Yilong; Liu, Lili

    2013-01-01

    Conventional fingerprint verification systems use only static information. In this paper, fingerprint videos, which contain dynamic information, are utilized for verification. Fingerprint videos are acquired by the same capture device that acquires conventional fingerprint images, and the user experience of providing a fingerprint video is the same as that of providing a single impression. After preprocessing and aligning processes, “inside similarity” and “outside similarity” are defined and calculated to take advantage of both dynamic and static information contained in fingerprint videos. Match scores between two matching fingerprint videos are then calculated by combining the two kinds of similarity. Experimental results show that the proposed video-based method leads to a relative reduction of 60 percent in the equal error rate (EER) in comparison to the conventional single impression-based method. We also analyze the time complexity of our method when different combinations of strategies are used. Our method still outperforms the conventional method, even if both methods have the same time complexity. Finally, experimental results demonstrate that the proposed video-based method can lead to better accuracy than the multiple impressions fusion method, and the proposed method has a much lower false acceptance rate (FAR) when the false rejection rate (FRR) is quite low. PMID:24008283

  13. Biomarker Discovery for Early Detection of Hepatocellular Carcinoma in Hepatitis C–infected Patients*

    PubMed Central

    Mustafa, Mehnaz G.; Petersen, John R.; Ju, Hyunsu; Cicalese, Luca; Snyder, Ned; Haidacher, Sigmund J.; Denner, Larry; Elferink, Cornelis

    2013-01-01

    Chronic hepatic disease damages the liver, and the resulting wound-healing process leads to liver fibrosis and the subsequent development of cirrhosis. The leading cause of hepatic fibrosis and cirrhosis is infection with hepatitis C virus (HCV), and of the patients with HCV-induced cirrhosis, 2% to 5% develop hepatocellular carcinoma (HCC), with a survival rate of 7%. HCC is one of the leading causes of cancer-related death worldwide, and the poor survival rate is largely due to late-stage diagnosis, which makes successful intervention difficult, if not impossible. The lack of sensitive and specific diagnostic tools and the urgent need for early-stage diagnosis prompted us to discover new candidate biomarkers for HCV and HCC. We used aptamer-based fractionation technology to reduce serum complexity, differentially labeled samples (six HCV and six HCC) with fluorescent dyes, and resolved proteins in pairwise two-dimensional difference gel electrophoresis. DeCyder software was used to identify differentially expressed proteins and to pick spots, and MALDI-MS/MS was used to determine that ApoA1 was down-regulated by 22% (p < 0.004) in HCC relative to HCV. Differential expression quantified via two-dimensional difference gel electrophoresis was confirmed by means of 18O/16O stable isotope differential labeling with LC-MS/MS zoom scans. Technically independent confirmation was demonstrated by triple quadrupole LC-MS/MS selected reaction monitoring (SRM) assays with three peptides specific to human ApoA1 (DLATVYVDVLK, WQEEMELYR, and VSFLSALEEYTK) using 18O/16O-labeled samples and further verified with AQUA peptides as internal standards for quantification. In 50 patient samples (24 HCV and 26 HCC), all three SRM assays yielded highly similar differential expression of ApoA1 in HCC and HCV patients. These results validated the SRM assays, which were independently confirmed by Western blotting. Thus, ApoA1 is a candidate member of an SRM biomarker panel for early diagnosis, prognosis, and monitoring of HCC. Future multiplexing of SRM assays for other candidate biomarkers is envisioned to develop a biomarker panel for subsequent verification and validation studies. PMID:24008390

  14. A simplified high-throughput method for pyrethroid knock-down resistance (kdr) detection in Anopheles gambiae

    PubMed Central

    Lynd, Amy; Ranson, Hilary; McCall, P J; Randle, Nadine P; Black, William C; Walker, Edward D; Donnelly, Martin J

    2005-01-01

    Background A single base pair mutation in the sodium channel confers knock-down resistance to pyrethroids in many insect species. Its occurrence in Anopheles mosquitoes may have important implications for malaria vector control especially considering the current trend for large scale pyrethroid-treated bednet programmes. Screening Anopheles gambiae populations for the kdr mutation has become one of the mainstays of programmes that monitor the development of insecticide resistance. The screening is commonly performed using a multiplex Polymerase Chain Reaction (PCR) which, since it is reliant on a single nucleotide polymorphism, can be unreliable. Here we present a reliable and potentially high throughput method for screening An. gambiae for the kdr mutation. Methods A Hot Ligation Oligonucleotide Assay (HOLA) was developed to detect both the East and West African kdr alleles in the homozygous and heterozygous states, and was optimized for use in low-tech developing world laboratories. Results from the HOLA were compared to results from the multiplex PCR for field and laboratory mosquito specimens to provide verification of the robustness and sensitivity of the technique. Results and Discussion The HOLA assay, developed for detection of the kdr mutation, gives a bright blue colouration for a positive result whilst negative reactions remain colourless. The results are apparent within a few minutes of adding the final substrate and can be scored by eye. Heterozygotes are scored when a sample gives a positive reaction to the susceptible probe and the kdr probe. The technique uses only basic laboratory equipment and skills and can be carried out by anyone familiar with the Enzyme-linked immunosorbent assay (ELISA) technique. A comparison to the multiplex PCR method showed that the HOLA assay was more reliable, and scoring of the plates was less ambiguous. Conclusion The method is capable of detecting both the East and West African kdr alleles in the homozygous and heterozygous states from fresh or dried material using several DNA extraction methods. It is more reliable than the traditional PCR method and may be more sensitive for the detection of heterozygotes. It is inexpensive, simple and relatively safe making it suitable for use in resource-poor countries. PMID:15766386

  15. LLCEDATA and LLCECALC for Windows version 1.0, Volume 1: User's manual

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    McFadden, J.G.

    LLCEDATA and LLCECALC for Windows are user-friendly computer software programs that work together to determine the proper waste designation, handling, and disposition requirements for Long Length Contaminated Equipment (LLCE). LLCEDATA reads from a variety of data bases to produce an equipment data file (EDF) that represents a snapshot of both the LLCE and the tank it originates from. LLCECALC reads the EDF and a gamma assay (AV2) file that is produced by the Flexible Receiver Gamma Energy Analysis System. LLCECALC performs corrections to the AV2 file as it is being read and characterizes the LLCE. Both programs produce a variety of reports, including a characterization report and a status report. The status report documents each action taken by the user, LLCEDATA, and LLCECALC. Documentation for LLCEDATA and LLCECALC for Windows is available in three volumes. Volume 1 is a user's manual, which is intended as a quick reference for both LLCEDATA and LLCECALC. Volume 2 is a technical manual, and Volume 3 is a software verification and validation document.

  16. eBiometrics: an enhanced multi-biometrics authentication technique for real-time remote applications on mobile devices

    NASA Astrophysics Data System (ADS)

    Kuseler, Torben; Lami, Ihsan; Jassim, Sabah; Sellahewa, Harin

    2010-04-01

    The use of mobile communication devices with advanced sensors is growing rapidly. These sensors enable functions such as image capture, location applications, and biometric authentication such as fingerprint verification and face and handwritten signature recognition. Such ubiquitous devices are essential tools in today's global economic activities, enabling anywhere-anytime financial and business transactions. Cryptographic functions and biometric-based authentication can enhance the security and confidentiality of mobile transactions. Biometric template security techniques are key factors for successful real-time identity verification solutions, but they are vulnerable to determined attacks by both fraudulent software and hardware. The EU-funded SecurePhone project has designed and implemented a multimodal biometric user authentication system on a prototype mobile communication device. However, various implementations of this project have resulted in long verification times or reduced accuracy and/or security. This paper proposes to use built-in self-test techniques to ensure no tampering has taken place in the verification process prior to performing the actual biometric authentication. These techniques utilize the user's personal identification number as a seed to generate a unique signature, which is then used to test the integrity of the verification process. This study also proposes the use of a combination of biometric modalities to provide application-specific authentication in a secure environment, thus achieving an optimum security level with effective processing time; that is, it ensures that the necessary authentication steps and algorithms running on the mobile device's application processor cannot be undermined or modified by an impostor to gain unauthorized access to the secure system.

  17. Performance of a new HPV and biomarker assay in the management of hrHPV positive women: Subanalysis of the ongoing multicenter TRACE clinical trial (n > 6,000) to evaluate POU4F3 methylation as a potential biomarker of cervical precancer and cancer.

    PubMed

    Kocsis, Adrienn; Takács, Tibor; Jeney, Csaba; Schaff, Zsuzsa; Koiss, Róbert; Járay, Balázs; Sobel, Gábor; Pap, Károly; Székely, István; Ferenci, Tamás; Lai, Hung-Cheng; Nyíri, Miklós; Benczik, Márta

    2017-03-01

    The ongoing Triage and Risk Assessment of Cervical Precancer by Epigenetic Biomarker (TRACE) prospective, multicenter study aimed to provide a clinical evaluation of the CONFIDENCE™ assay, which comprises a human papillomavirus (HPV) DNA and a human epigenetic biomarker test. Between 2013 and 2015 over 6,000 women aged 18 or older were recruited in Hungary. Liquid-based cytology (LBC), high-risk HPV (hrHPV) DNA detection and single target host gene methylation test of the promoter sequence of the POU4F3 gene by quantitative methylation-specific polymerase chain reaction (PCR) were performed from the same liquid-based cytology sample. The current analysis is focused on the baseline cross-sectional clinical results of 5,384 LBC samples collected from subjects aged 25 years or older. The performance of the CONFIDENCE HPV™ test was found to be comparable to the cobas® HPV test with good agreement. Applied alone to hrHPV-positive samples, the CONFIDENCE Marker™ test showed significantly higher sensitivity, with matching specificity, compared to LBC-based triage. For the CIN3+ histological endpoint in the age groups of 25-65 and 30-65, the methylation test of POU4F3 achieved relative sensitivities of 1.74 (95% CI: 1.25-2.33) and 1.64 (95% CI: 1.08-2.27), respectively, after verification bias adjustment. On the basis of our findings, POU4F3 methylation as a triage test of hrHPV positives appears to be a noteworthy method. We can reasonably assume that its quantitative nature offers the potential for a more objective and discriminative risk assessment tool in the prevention and diagnostics of high-grade cervical intraepithelial neoplasia (CIN) lesions and cervical cancer. © 2016 UICC.

  18. Establishing a reliable multiple reaction monitoring-based method for the quantification of obesity-associated comorbidities in serum and adipose tissue requires intensive clinical validation.

    PubMed

    Oberbach, Andreas; Schlichting, Nadine; Neuhaus, Jochen; Kullnick, Yvonne; Lehmann, Stefanie; Heinrich, Marco; Dietrich, Arne; Mohr, Friedrich Wilhelm; von Bergen, Martin; Baumann, Sven

    2014-12-05

    Multiple reaction monitoring (MRM)-based mass spectrometric quantification of peptides and their corresponding proteins has been successfully applied for biomarker validation in serum. The option of multiplexing offers the chance to analyze various proteins in parallel, which is especially important in obesity research. Here, biomarkers that reflect multiple comorbidities and allow monitoring of therapy outcomes are required. Besides serum protein quantification, established MRM assays are also suitable for the analysis of the tissues that secrete the markers of interest. Surprisingly, studies comparing MRM data sets with established methods are rare, and therefore the biological and clinical value of most analytes remains questionable. An MRM method using nano-UPLC-MS/MS for the quantification of obesity-related surrogate markers for several comorbidities in serum, plasma, and visceral and subcutaneous adipose tissue was established. Proteotypic peptides for complement C3, adiponectin, angiotensinogen, and plasma retinol binding protein (RBP4) were quantified using isotopic dilution analysis and compared to the standard ELISA method. MRM method variabilities were mainly below 10%. The comparison with other MS-based approaches showed a good correlation. However, large differences in absolute quantification for complement C3 and adiponectin were obtained compared to ELISA, while less marked differences were observed for angiotensinogen and RBP4. The MRM assay was verified in obesity first by discriminating the lean from the obese phenotype and second by monitoring excessive weight loss after gastric bypass surgery in a seven-month follow-up. The presented MRM assay was able to discriminate the obese phenotype from the lean one and to monitor weight-loss-related changes in surrogate markers. However, inclusion of additional biomarkers was necessary to interpret the MRM data on obesity phenotype properly. In summary, the development of disease-related MRMs should include a step of matching the MRM data with clinically approved standard methods and defining reference values in well-sized, representative age-, gender-, and disease-matched cohorts.
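
    For readers unfamiliar with isotopic dilution, the underlying arithmetic is simple: the endogenous peptide is quantified from the ratio of its peak area to that of a spiked, heavy-labeled internal standard of known amount. A minimal sketch with invented numbers:

        def analyte_concentration(area_light, area_heavy, spike_fmol, volume_ul):
            """Concentration (fmol/uL) from a light/heavy MRM transition pair."""
            return (area_light / area_heavy) * spike_fmol / volume_ul

        # Hypothetical example: endogenous area 5.2e5 against a heavy-standard
        # area of 2.6e5 for a 50 fmol spike in 10 uL of digest -> 10 fmol/uL.
        print(analyte_concentration(5.2e5, 2.6e5, spike_fmol=50.0, volume_ul=10.0))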

  19. ADVANCED SURVEILLANCE OF ENVIRONMENTAL RADIATION IN AUTOMATIC NETWORKS.

    PubMed

    Benito, G; Sáez, J C; Blázquez, J B; Quiñones, J

    2018-06-01

    The objective of this study is to verify the operation of a radiation monitoring network composed of several sensors. A malfunctioning surveillance network has security and economic consequences, which derive from its maintenance and could be avoided by early detection. The proposed method is based on a form of multivariate distance, and the methodology has been tested at CIEMAT's local radiological early warning network.
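
    The abstract does not name the distance used, so the sketch below assumes the Mahalanobis distance, a common choice of multivariate distance for cross-sensor consistency checks: baseline statistics are fitted on a period of known-good operation, and a joint reading far from that baseline (against a chi-squared cutoff) flags a possible malfunction.

        import numpy as np
        from scipy.stats import chi2

        def fit_baseline(readings):
            """readings: (n_samples, n_sensors) array from healthy operation."""
            mu = readings.mean(axis=0)
            cov_inv = np.linalg.inv(np.cov(readings, rowvar=False))
            return mu, cov_inv

        def is_anomalous(sample, mu, cov_inv, alpha=0.001):
            """Flag a joint sensor reading whose squared Mahalanobis distance
            exceeds the chi-squared quantile for n_sensors degrees of freedom."""
            d2 = (sample - mu) @ cov_inv @ (sample - mu)
            return d2 > chi2.ppf(1.0 - alpha, df=mu.size)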

  20. Verification of Weather Running Estimate-Nowcast (WRE-N) Forecasts Using a Spatial-Categorical Method

    DTIC Science & Technology

    2017-07-01

    forecasts and observations on a common grid, which enables the application of a number of different spatial verification methods that reveal various... forecasts of continuous meteorological variables using categorical and object-based methods. White Sands Missile Range (NM): Army Research Laboratory (US... Research version of the Weather Research and Forecasting Model adapted for generating short-range nowcasts and gridded observations produced by the

  1. Recent literature on structural modeling, identification, and analysis

    NASA Technical Reports Server (NTRS)

    Craig, Roy R., Jr.

    1990-01-01

    The literature on the mathematical modeling of large space structures is first reviewed, with attention given to continuum models, model order reduction, substructuring, and computational techniques. System identification and mode verification are then discussed with reference to the verification of mathematical models of large space structures. In connection with analysis, the paper surveys recent research on eigensolvers and dynamic response solvers for large-order finite-element-based models.

  2. Distance Metric Learning Using Privileged Information for Face Verification and Person Re-Identification.

    PubMed

    Xu, Xinxing; Li, Wen; Xu, Dong

    2015-12-01

    In this paper, we propose a new approach to improve face verification and person re-identification in the RGB images by leveraging a set of RGB-D data, in which we have additional depth images in the training data captured using depth cameras such as Kinect. In particular, we extract visual features and depth features from the RGB images and depth images, respectively. As the depth features are available only in the training data, we treat the depth features as privileged information, and we formulate this task as a distance metric learning with privileged information problem. Unlike the traditional face verification and person re-identification tasks that only use visual features, we further employ the extra depth features in the training data to improve the learning of distance metric in the training process. Based on the information-theoretic metric learning (ITML) method, we propose a new formulation called ITML with privileged information (ITML+) for this task. We also present an efficient algorithm based on the cyclic projection method for solving the proposed ITML+ formulation. Extensive experiments on the challenging faces data sets EUROCOM and CurtinFaces for face verification as well as the BIWI RGBD-ID data set for person re-identification demonstrate the effectiveness of our proposed approach.

  3. Safety Verification of the Small Aircraft Transportation System Concept of Operations

    NASA Technical Reports Server (NTRS)

    Carreno, Victor; Munoz, Cesar

    2005-01-01

    A critical factor in the adoption of any new aeronautical technology or concept of operation is safety. Traditionally, safety is accomplished through a rigorous process that involves human factors, low and high fidelity simulations, and flight experiments. As this process is usually performed on final products or functional prototypes, concept modifications resulting from this process are very expensive to implement. This paper describes an approach to system safety that can take place at early stages of a concept design. It is based on a set of mathematical techniques and tools known as formal methods. In contrast to testing and simulation, formal methods provide the capability of exhaustive state exploration analysis. We present the safety analysis and verification performed for the Small Aircraft Transportation System (SATS) Concept of Operations (ConOps). The concept of operations is modeled using discrete and hybrid mathematical models. These models are then analyzed using formal methods. The objective of the analysis is to show, in a mathematical framework, that the concept of operation complies with a set of safety requirements. It is also shown that the ConOps has some desirable characteristics, such as liveness and absence of deadlock. The analysis and verification are performed in the Prototype Verification System (PVS), which is a computer-based specification language and theorem-proving assistant.

  4. Towards Trustable Digital Evidence with PKIDEV: PKI Based Digital Evidence Verification Model

    NASA Astrophysics Data System (ADS)

    Uzunay, Yusuf; Incebacak, Davut; Bicakci, Kemal

    How to Capture and Preserve Digital Evidence Securely? For the investigation and prosecution of criminal activities that involve computers, digital evidence collected at the crime scene is of vital importance. On one side, it is very challenging for forensics professionals to collect it without any loss or damage. On the other, there is the second problem of ensuring its integrity and authenticity in order to achieve legal acceptance in a court of law. By conceiving digital evidence simply as one instance of digital data, it is evident that modern cryptography offers elegant solutions for this second problem. However, to our knowledge, there is no previous work proposing a systematic model that takes a holistic view of all the related security problems in this particular case of digital evidence verification. In this paper, we present PKIDEV (Public Key Infrastructure based Digital Evidence Verification model) as an integrated solution to provide security for the process of capturing and preserving digital evidence. PKIDEV employs, inter alia, cryptographic techniques like digital signatures and secure time-stamping, as well as recent technologies such as GPS and EDGE. In our study, we also identify the problems public-key cryptography brings when it is applied to the verification of digital evidence.
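
    At its core, the cryptographic portion of such a model is hash-then-sign: digest the captured evidence, sign the digest with the collector's private key, and let any later party re-hash and verify. A minimal sketch of that idea only (PKIDEV's secure time-stamping, GPS metadata and certificate handling are omitted; the file name is hypothetical, and the third-party cryptography package is assumed):

        import hashlib
        from cryptography.exceptions import InvalidSignature
        from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey

        def digest(path):
            """SHA-256 digest of an evidence file, read in chunks."""
            h = hashlib.sha256()
            with open(path, "rb") as f:
                for chunk in iter(lambda: f.read(8192), b""):
                    h.update(chunk)
            return h.digest()

        key = Ed25519PrivateKey.generate()              # in practice: the investigator's PKI key
        signature = key.sign(digest("disk_image.dd"))   # hypothetical evidence file

        # Later, a verifier re-hashes the evidence and checks the signature:
        try:
            key.public_key().verify(signature, digest("disk_image.dd"))
            print("evidence intact and authentic")
        except InvalidSignature:
            print("evidence has been modified")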

  5. Supporting Technology for Chain of Custody of Nuclear Weapons and Materials throughout the Dismantlement and Disposition Processes

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bunch, Kyle J.; Jones, Anthony M.; Ramuhalli, Pradeep

    The ratification and ongoing implementation of the New START Treaty have been widely regarded as noteworthy global security achievements for both the Obama Administration and the Putin (formerly Medvedev) regime. But deeper cuts that move beyond the United States and Russia to engage the P-5 and other nuclear weapons possessor states are envisioned under future arms control regimes, and are indeed required for the P-5 in accordance with their Article VI disarmament obligations in the Nuclear Non-Proliferation Treaty. Future verification needs will include monitoring the cessation of production of new fissile material for weapons, monitoring storage of warhead components and fissile materials and verifying dismantlement of warheads, pits, secondary stages, and other materials. A fundamental challenge to implementing a nuclear disarmament regime is the ability to thwart unauthorized material diversion throughout the dismantlement and disposition process through strong chain of custody implementation. Verifying the declared presence, or absence, of nuclear materials and weapons components throughout the dismantlement and disposition lifecycle is a critical aspect of the disarmament process. From both the diplomatic and technical perspectives, verification under these future arms control regimes will require new solutions. Since any acceptable verification technology must protect sensitive design information and attributes to prevent the release of classified or other proliferation-sensitive information, non-nuclear non-sensitive modalities may provide significant new verification tools which do not require the use of additional information barriers. Alternative verification technologies based upon electromagnetic and acoustics could potentially play an important role in fulfilling the challenging requirements of future verification regimes. For example, researchers at the Pacific Northwest National Laboratory (PNNL) have demonstrated that low frequency electromagnetic signatures of sealed metallic containers can be used to rapidly confirm the presence of specific components on a yes/no basis without revealing classified information. PNNL researchers have also used ultrasonic measurements to obtain images of material microstructures which may be used as templates or unique identifiers of treaty-limited items. Such alternative technologies are suitable for application in various stages of weapons dismantlement and often include the advantage of an inherent information barrier due to the inability to extract classified weapon design information from the collected data. As a result, these types of technologies complement radiation-based verification methods for arms control. This article presents an overview of several alternative verification technologies that are suitable for supporting a future, broader and more intrusive arms control regime that spans the nuclear weapons disarmament lifecycle. The general capabilities and limitations of each verification modality are discussed and example technologies are presented. Potential applications are defined in the context of the nuclear material and weapons lifecycle. Example applications range from authentication (e.g., tracking and signatures within the chain of custody from downloading through weapons storage, unclassified templates and unique identification) to verification of absence and final material disposition.

  6. On flattening filter‐free portal dosimetry

    PubMed Central

    Novais, Juan Castro; Molina López, María Yolanda; Maqueda, Sheila Ruiz

    2016-01-01

    Varian introduced (in 2010) the option of removing the flattening filter (FF) in their C‐Arm linacs for intensity‐modulated treatments. This mode, called flattening filter‐free (FFF), offers the advantage of a greater dose rate. Varian's “Portal Dosimetry” is an electronic portal imager device (EPID)‐based tool for IMRT verification. This tool lacks the capability of verifying flattening filter‐free (FFF) modes due to saturation and lack of an image prediction algorithm. (Note: the latest versions of this software and EPID correct these issues.) The objective of the present study is to research the feasibility of said verifications (with the older versions of the software and EPID). By placing the EPID at a greater distance, the images can be acquired without saturation, yielding a linearity similar to the flattened mode. For the image prediction, a method was optimized based on the clinically used algorithm (analytical anisotropic algorithm (AAA)) over a homogeneous phantom. The depth inside the phantom and its electronic density were tailored. An application was developed to allow the conversion of a dose plane (in DICOM format) to Varian's custom format for Portal Dosimetry. The proposed method was used for the verification of test and clinical fields for the three qualities used in our institution for IMRT: 6X, 6FFF and 10FFF. The method developed yielded a positive verification (more than 95% of the points pass a 2%/2 mm gamma) for both the clinical and test fields. This method was also capable of “predicting” static and wedged fields. A workflow for the verification of FFF fields was developed. This method relies on the clinical algorithm used for dose calculation and is able to verify the FFF modes, as well as being useful for machine quality assurance. The procedure described does not require new hardware. This method could be used as a verification of Varian's Portal Dose Image Prediction. PACS number(s): 87.53.Kn, 87.55.T‐, 87.56.bd, 87.59.‐e PMID:27455487
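
    The acceptance criterion quoted above (more than 95% of points passing a 2%/2 mm gamma) combines a dose difference and a distance-to-agreement into a single index. A brute-force 1-D sketch of the global gamma computation (clinical tools operate on 2-D or 3-D dose grids; the names here are illustrative):

        import numpy as np

        def gamma_pass_rate(dose_ref, dose_eval, spacing_mm,
                            dose_crit=0.02, dist_crit_mm=2.0):
            """Fraction of reference points with gamma <= 1 (global 2%/2 mm)."""
            positions = np.arange(dose_ref.size) * spacing_mm
            dd = dose_crit * dose_ref.max()        # global dose criterion (2% of max)
            passed = 0
            for xi, di in zip(positions, dose_ref):
                dist_term = ((positions - xi) / dist_crit_mm) ** 2
                dose_term = ((dose_eval - di) / dd) ** 2
                passed += np.sqrt(np.min(dist_term + dose_term)) <= 1.0
            return passed / dose_ref.size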

  7. Verification of Emergent Behaviors in Swarm-based Systems

    NASA Technical Reports Server (NTRS)

    Rouff, Christopher; Vanderbilt, Amy; Hinchey, Mike; Truszkowski, Walt; Rash, James

    2004-01-01

    The emergent properties of swarms make swarm-based missions powerful, but at the same time more difficult to design and to assure that the proper behaviors will emerge. We are currently investigating formal methods and techniques for verification and validation of swarm-based missions. The Autonomous Nano-Technology Swarm (ANTS) mission is being used as an example and case study for swarm-based missions to experiment and test current formal methods with intelligent swarms. Using the ANTS mission, we have evaluated multiple formal methods to determine their effectiveness in modeling and assuring swarm behavior. This paper introduces how intelligent swarm technology is being proposed for NASA missions, and gives the results of a comparison of several formal methods and approaches for specifying intelligent swarm-based systems and their effectiveness for predicting emergent behavior.

  8. Analysis, Simulation, and Verification of Knowledge-Based, Rule-Based, and Expert Systems

    NASA Technical Reports Server (NTRS)

    Hinchey, Mike; Rash, James; Erickson, John; Gracanin, Denis; Rouff, Chris

    2010-01-01

    Mathematically sound techniques are used to view a knowledge-based system (KBS) as a set of processes executing in parallel and being enabled in response to specific rules being fired. The set of processes can be manipulated, examined, analyzed, and used in a simulation. The tool that embodies this technology may warn developers of errors in their rules, but may also highlight rules (or sets of rules) in the system that are underspecified (or overspecified) and need to be corrected for the KBS to operate as intended. The rules embodied in a KBS specify the allowed situations, events, and/or results of the system they describe. In that sense, they provide a very abstract specification of a system. The system is implemented through the combination of the system specification together with an appropriate inference engine, independent of the algorithm used in that inference engine. Viewing the rule base as a major component of the specification, and choosing an appropriate specification notation to represent it, reveals how additional power can be derived from an approach to the knowledge-base system that involves analysis, simulation, and verification. This innovative approach requires no special knowledge of the rules, and allows a general approach where standardized analysis, verification, simulation, and model checking techniques can be applied to the KBS.
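
    To make underspecified and overspecified rule sets concrete, here is a toy checker (not the tool described above; the rule encoding is invented) that flags pairs of rules which can fire on the same facts with contradictory conclusions, and fact combinations covered by no rule at all:

        from itertools import product

        # Toy rule base: (antecedent facts, conclusion)
        rules = [
            ({"fever", "rash"}, "suspect_measles"),
            ({"fever", "rash"}, "not_suspect_measles"),  # contradicts the rule above
            ({"fever"}, "run_blood_test"),
        ]

        def contradicts(c1, c2):
            return c1 == "not_" + c2 or c2 == "not_" + c1

        def find_conflicts(rules):
            """Overspecified: rule pairs that can fire together yet contradict."""
            return [(r1, r2) for i, r1 in enumerate(rules) for r2 in rules[i + 1:]
                    if (r1[0] <= r2[0] or r2[0] <= r1[0]) and contradicts(r1[1], r2[1])]

        def find_gaps(rules, facts):
            """Underspecified: truth assignments over `facts` matched by no rule."""
            return [present for combo in product([False, True], repeat=len(facts))
                    for present in [{f for f, on in zip(facts, combo) if on}]
                    if not any(ante <= present for ante, _ in rules)]

        print(find_conflicts(rules))                 # the two contradictory measles rules
        print(find_gaps(rules, ["fever", "rash"]))   # e.g. rash without fever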

  9. Property-Based Monitoring of Analog and Mixed-Signal Systems

    NASA Astrophysics Data System (ADS)

    Havlicek, John; Little, Scott; Maler, Oded; Nickovic, Dejan

    In the recent past, there has been a steady growth of the market for consumer embedded devices such as cell phones, GPS and portable multimedia systems. In embedded systems, digital, analog and software components are combined on a single chip, resulting in increasingly complex designs that introduce richer functionality on smaller devices. As a consequence, the potential insertion of errors into a design becomes higher, yielding an increasing need for automated analog and mixed-signal validation tools. In the purely digital setting, formal verification based on properties expressed in industrial specification languages such as PSL and SVA is nowadays successfully integrated in the design flow. On the other hand, the validation of analog and mixed-signal systems still largely depends on simulation-based, ad-hoc methods. In this tutorial, we consider some ingredients of the standard verification methodology that can be successfully exported from digital to analog and mixed-signal setting, in particular property-based monitoring techniques. Property-based monitoring is a lighter approach to the formal verification, where the system is seen as a "black-box" that generates sets of traces, whose correctness is checked against a property, that is its high-level specification. Although incomplete, monitoring is effectively used to catch faults in systems, without guaranteeing their full correctness.
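
    A property-based monitor of this kind treats a simulated or measured analog trace as data and checks it against a declarative requirement. As a hedged toy example (an invented property, far simpler than PSL/SVA-style assertions): verify offline that every excursion of a sampled signal above a limit settles back within a deadline.

        def satisfies_bounded_settling(trace, limit, deadline):
            """True iff no excursion above `limit` lasts more than `deadline` samples."""
            run = 0                              # length of the current excursion
            for v in trace:
                run = run + 1 if v > limit else 0
                if run > deadline:
                    return False
            return True

        # A step response that overshoots for two samples passes with deadline=2 ...
        print(satisfies_bounded_settling([0.1, 1.3, 1.2, 0.9, 1.0], limit=1.1, deadline=2))
        # ... but fails if the requirement is to settle within a single sample.
        print(satisfies_bounded_settling([0.1, 1.3, 1.2, 0.9, 1.0], limit=1.1, deadline=1))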

  10. Quantum supremacy in constant-time measurement-based computation: A unified architecture for sampling and verification

    NASA Astrophysics Data System (ADS)

    Miller, Jacob; Sanders, Stephen; Miyake, Akimasa

    2017-12-01

    While quantum speed-up in solving certain decision problems by a fault-tolerant universal quantum computer has been promised, a timely research interest includes how far one can reduce the resource requirement to demonstrate a provable advantage in quantum devices without demanding quantum error correction, which is crucial for prolonging the coherence time of qubits. We propose a model device made of locally interacting multiple qubits, designed such that simultaneous single-qubit measurements on it can output probability distributions whose average-case sampling is classically intractable, under similar assumptions as the sampling of noninteracting bosons and instantaneous quantum circuits. Notably, in contrast to these previous unitary-based realizations, our measurement-based implementation has two distinctive features. (i) Our implementation involves no adaptation of measurement bases, leading output probability distributions to be generated in constant time, independent of the system size. Thus, it could be implemented in principle without quantum error correction. (ii) Verifying the classical intractability of our sampling is done by changing the Pauli measurement bases only at certain output qubits. Our usage of random commuting quantum circuits in place of computationally universal circuits allows a unique unification of sampling and verification, so they require the same physical resource requirements in contrast to the more demanding verification protocols seen elsewhere in the literature.

  11. Evidence flow graph methods for validation and verification of expert systems

    NASA Technical Reports Server (NTRS)

    Becker, Lee A.; Green, Peter G.; Bhatnagar, Jayant

    1989-01-01

    The results of an investigation into the use of evidence flow graph techniques for performing validation and verification of expert systems are given. A translator to convert horn-clause rule bases into evidence flow graphs, a simulation program, and methods of analysis were developed. These tools were then applied to a simple rule base which contained errors. It was found that the method was capable of identifying a variety of problems, for example that the order of presentation of input data or small changes in critical parameters could affect the output from a set of rules.

  12. Safety Verification of a Fault Tolerant Reconfigurable Autonomous Goal-Based Robotic Control System

    NASA Technical Reports Server (NTRS)

    Braman, Julia M. B.; Murray, Richard M; Wagner, David A.

    2007-01-01

    Fault tolerance and safety verification of control systems are essential for the success of autonomous robotic systems. A control architecture called Mission Data System (MDS), developed at the Jet Propulsion Laboratory, takes a goal-based control approach. In this paper, a method for converting goal network control programs into linear hybrid systems is developed. The linear hybrid system can then be verified for safety in the presence of failures using existing symbolic model checkers. An example task is simulated in MDS and successfully verified using HyTech, a symbolic model checking software for linear hybrid systems.

  13. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Barrett, Christopher A.; Martinez, Alonzo; McNamara, Bruce K.

    International Atomic Energy Agency (IAEA) safeguard verification measures in gaseous centrifuge enrichment plants (GCEPs) rely on environmental sampling, non-destructive assay (NDA), and destructive assay (DA) sampling and analysis to determine uranium enrichment. UF6 bias defect measurements are made by DA sampling and analysis to assure that enrichment is consistent with declarations. DA samples are collected from a limited number of cylinders for high precision, offsite mass spectrometer analysis. Samples are typically drawn from a sampling tap into a UF6 sample bottle, then packaged, sealed, and shipped under IAEA chain of custody to an offsite analytical laboratory. Future DA safeguard measures may require improvements in efficiency and effectiveness as GCEP capacities increase and UF6 shipping regulations become increasingly more restrictive. The Pacific Northwest National Laboratory (PNNL) DA sampler concept and Laser Ablation Absorption Ratio Spectrometry (LAARS) assay method are under development to potentially provide DA safeguard tools that increase inspection effectiveness and reduce sample shipping constraints. The PNNL DA sampler concept uses a handheld sampler to collect DA samples for either onsite LAARS assay or offsite laboratory analysis. The DA sampler design will use a small sampling planchet that is coated with an adsorptive film to collect controlled quantities of UF6 gas directly from a cylinder or process sampling tap. Development efforts are currently underway at PNNL to enhance LAARS assay performance to allow high-precision onsite bias defect measurements. In this paper, we report on the experimental investigation to develop adsorptive films for the PNNL DA sampler concept. These films are intended to efficiently capture UF6 and then stabilize the collected DA sample prior to onsite LAARS or offsite laboratory analysis. Several porous material composite films were investigated, including a film designed to maximize the chemical adsorption and binding of gaseous UF6 onto the sampling planchet.

  14. Development of Internal Controls for the Luminex Instrument as Part of a Multiplex Seven-Analyte Viral Respiratory Antibody Profile

    PubMed Central

    Martins, Thomas B.

    2002-01-01

    The ability of the Luminex system to simultaneously quantitate multiple analytes from a single sample source has proven to be a feasible and cost-effective technology for assay development. In previous studies, my colleagues and I introduced two multiplex profiles consisting of 20 individual assays into the clinical laboratory. With the Luminex instrument’s ability to classify up to 100 distinct microspheres, however, we have only begun to realize the enormous potential of this technology. By utilizing additional microspheres, it is now possible to add true internal controls to each individual sample. During the development of a seven-analyte serologic viral respiratory antibody profile, internal controls for detecting sample addition and interfering rheumatoid factor (RF) were investigated. To determine if the correct sample was added, distinct microspheres were developed for measuring the presence of sufficient quantities of immunoglobulin G (IgG) or IgM in the diluted patient sample. In a multiplex assay of 82 samples, the IgM verification control correctly identified 23 out of 23 samples with low levels (<20 mg/dl) of this antibody isotype. An internal control microsphere for RF detected 30 out of 30 samples with significant levels (>10 IU/ml) of IgM RF. Additionally, RF-positive samples causing false-positive adenovirus and influenza A virus IgM results were correctly identified. By exploiting the Luminex instrument’s multiplexing capabilities, I have developed true internal controls to ensure correct sample addition and identify interfering RF as part of a respiratory viral serologic profile that includes influenza A and B viruses, adenovirus, parainfluenza viruses 1, 2, and 3, and respiratory syncytial virus. Since these controls are not assay specific, they can be incorporated into any serologic multiplex assay. PMID:11777827

  15. The Evolution of the NASA Commercial Crew Program Mission Assurance Process

    NASA Technical Reports Server (NTRS)

    Canfield, Amy C.

    2016-01-01

    In 2010, the National Aeronautics and Space Administration (NASA) established the Commercial Crew Program (CCP) in order to provide human access to the International Space Station and low Earth orbit via the commercial (non-governmental) sector. A particular challenge to NASA has been how to determine that the Commercial Provider's transportation system complies with programmatic safety requirements. The process used in this determination is the Safety Technical Review Board which reviews and approves provider submitted hazard reports. One significant product of the review is a set of hazard control verifications. In past NASA programs, 100% of these safety critical verifications were typically confirmed by NASA. The traditional Safety and Mission Assurance (S&MA) model does not support the nature of the CCP. To that end, NASA S&MA is implementing a Risk Based Assurance process to determine which hazard control verifications require NASA authentication. Additionally, a Shared Assurance Model is also being developed to efficiently use the available resources to execute the verifications.

  16. Verification of space weather forecasts at the UK Met Office

    NASA Astrophysics Data System (ADS)

    Bingham, S.; Sharpe, M.; Jackson, D.; Murray, S.

    2017-12-01

    The UK Met Office Space Weather Operations Centre (MOSWOC) has produced space weather guidance twice a day since its official opening in 2014. Guidance includes 4-day probabilistic forecasts of X-ray flares, geomagnetic storms, high-energy electron events and high-energy proton events. Evaluation of such forecasts is important to forecasters, stakeholders, model developers and users to understand the performance of these forecasts and also strengths and weaknesses to enable further development. Met Office terrestrial near real-time verification systems have been adapted to provide verification of X-ray flare and geomagnetic storm forecasts. Verification is updated daily to produce Relative Operating Characteristic (ROC) curves and Reliability diagrams, and rolling Ranked Probability Skill Scores (RPSSs) thus providing understanding of forecast performance and skill. Results suggest that the MOSWOC issued X-ray flare forecasts are usually not statistically significantly better than a benchmark climatological forecast (where the climatology is based on observations from the previous few months). By contrast, the issued geomagnetic storm activity forecast typically performs better against this climatological benchmark.
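
    As an illustration of the scores mentioned above, the Ranked Probability Skill Score compares a forecast's ranked probability score with that of a reference climatology (a minimal sketch; array shapes and the climatology vector are invented for the example):

        import numpy as np

        def rps(forecasts, outcomes):
            """Mean ranked probability score.
            forecasts: (n, k) probabilities over k ordered categories;
            outcomes:  (n,) index of the observed category."""
            n, k = forecasts.shape
            obs = np.zeros((n, k))
            obs[np.arange(n), outcomes] = 1.0
            cum_f, cum_o = np.cumsum(forecasts, axis=1), np.cumsum(obs, axis=1)
            return np.mean(np.sum((cum_f - cum_o) ** 2, axis=1))

        def rpss(forecasts, climatology, outcomes):
            """Skill versus a fixed climatological distribution: 1 is perfect,
            0 is no better than climatology, negative is worse."""
            clim = np.tile(climatology, (len(outcomes), 1))
            return 1.0 - rps(forecasts, outcomes) / rps(clim, outcomes)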

  17. Content analysis of age verification, purchase and delivery methods of internet e-cigarette vendors, 2013 and 2014.

    PubMed

    Williams, Rebecca S; Derrick, Jason; Liebman, Aliza Kate; LaFleur, Kevin; Ribisl, Kurt M

    2018-05-01

    Identify the population of internet e-cigarette vendors (IEVs) and conduct content analyses of their age verification, purchase and delivery methods in 2013 and 2014. We used multiple sources to identify IEV websites, primarily complex search algorithms scanning more than 180 million websites. In 2013, we manually screened 32 446 websites, identifying 980 IEVs, selecting the 281 most popular for content analysis. This methodology yielded 31 239 websites for screening in 2014, identifying 3096 IEVs, with 283 selected for content analysis. The proportion of vendors that sold online-only, with no retail store, dropped significantly from 2013 (74.7%) to 2014 (64.3%) (p<0.01), with a corresponding significant decrease in US-based vendors (71.9% in 2013 and 65% in 2014). Most vendors did little to prevent youth access in either year, with 67.6% in 2013 and 63.2% in 2014 employing no age verification or relying exclusively on strategies that cannot effectively verify age. Effective age verification strategies such as online age verification services (7.1% in 2013 and 8.5% in 2014), driving licences (1.8% in 2013 and 7.4% in 2014, p<0.01) or age verification at delivery (6.4% in 2013 and 8.1% in 2014) were rarely advertised on IEV websites. Nearly all vendors advertised accepting credit cards, and about three-quarters advertised shipping via the United States Postal Service, similar to the internet cigarette industry prior to federal bans. The number of IEVs grew sharply from 2013 to 2014, with poor age verification practices. New and expanded regulations for online e-cigarette sales are needed, including strict age and identity verification requirements. © Article author(s) (or their employer(s) unless otherwise stated in the text of the article) 2018. All rights reserved. No commercial use is permitted unless otherwise expressly granted.

  18. 30 CFR 250.913 - When must I resubmit Platform Verification Program plans?

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    Structures, Platform Verification Program, § 250.913: When must I resubmit Platform Verification Program plans? (a) You must resubmit any design verification, fabrication verification, or installation verification...

  19. Verification of Ceramic Structures

    NASA Astrophysics Data System (ADS)

    Behar-Lafenetre, Stephanie; Cornillon, Laurence; Rancurel, Michael; De Graaf, Dennis; Hartmann, Peter; Coe, Graham; Laine, Benoit

    2012-07-01

    In the framework of the “Mechanical Design and Verification Methodologies for Ceramic Structures” contract [1] awarded by ESA, Thales Alenia Space has investigated literature and practices in affiliated industries to propose a methodological guideline for verification of ceramic spacecraft and instrument structures. It has been written in order to be applicable to most types of ceramic or glass-ceramic materials - typically Cesic®, HBCesic®, Silicon Nitride, Silicon Carbide and ZERODUR®. The proposed guideline describes the activities to be performed at material level in order to cover all the specific aspects of ceramics (Weibull distribution, brittle behaviour, sub-critical crack growth). Elementary tests and their post-processing methods are described, and recommendations for optimization of the test plan are given in order to have a consistent database. The application of this method is shown on an example in a dedicated article [7]. Then the verification activities to be performed at system level are described. This includes classical verification activities based on the relevant standard (ECSS Verification [4]), plus specific analytical, testing and inspection features. The analysis methodology takes into account the specific behaviour of ceramic materials, especially the statistical distribution of failures (Weibull) and the method to transfer it from elementary data to a full-scale structure. The demonstration of the efficiency of this method is described in a dedicated article [8]. The verification is completed by classical full-scale testing activities. Indications about proof testing, its use cases and its implementation are given, and specific inspection and protection measures are described. These additional activities are necessary to ensure the required reliability. The aim of the guideline is to describe how to reach the same reliability level as for structures made of more classical materials (metals, composites).
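
    The statistical transfer mentioned above, from elementary coupon data to a full-scale part, follows from weakest-link theory. A hedged sketch of two-parameter Weibull size scaling for a uniformly stressed volume (all material numbers are invented):

        import math

        def failure_probability(stress, volume, sigma0, m, v0=1.0):
            """Weakest-link failure probability for a uniformly stressed volume.
            sigma0: characteristic strength of reference volume v0; m: Weibull modulus."""
            return 1.0 - math.exp(-(volume / v0) * (stress / sigma0) ** m)

        # Hypothetical coupon fit: sigma0 = 300 MPa at v0 = 1 cm^3, m = 10.
        # The same ceramic in a 50 cm^3 component loaded at 120 MPa:
        print(failure_probability(120.0, 50.0, sigma0=300.0, m=10))  # ~0.5%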

  1. Hierarchical Representation Learning for Kinship Verification.

    PubMed

    Kohli, Naman; Vatsa, Mayank; Singh, Richa; Noore, Afzel; Majumdar, Angshul

    2017-01-01

    Kinship verification has a number of applications such as organizing large collections of images and recognizing resemblances among humans. In this paper, first, a human study is conducted to understand the capabilities of the human mind and to identify the discriminatory areas of a face that facilitate kinship cues. The visual stimuli presented to the participants determine their ability to recognize kin relationships using the whole face as well as specific facial regions. The effect of participant gender and age and the kin-relation pair of the stimulus is analyzed using quantitative measures such as accuracy, discriminability index d', and perceptual information entropy. Utilizing the information obtained from the human study, a hierarchical kinship verification via representation learning (KVRL) framework is utilized to learn the representation of different face regions in an unsupervised manner. We propose a novel approach for feature representation termed filtered contractive deep belief networks (fcDBN). The proposed feature representation encodes relational information present in images using filters and a contractive regularization penalty. A compact representation of facial images of kin is extracted as an output from the learned model, and a multi-layer neural network is utilized to verify the kin accurately. A new WVU kinship database is created, which consists of multiple images per subject to facilitate kinship verification. The results show that the proposed deep learning framework (KVRL-fcDBN) yields state-of-the-art kinship verification accuracy on the WVU kinship database and on four existing benchmark data sets. Furthermore, kinship information is used as a soft biometric modality to boost the performance of face verification via product of likelihood ratio and support vector machine based approaches. Using the proposed KVRL-fcDBN framework, an improvement of over 20% is observed in the performance of face verification.

  2. SU-F-T-267: A Clarkson-Based Independent Dose Verification for the Helical Tomotherapy

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Nagata, H; Juntendo University, Hongo, Tokyo; Hongo, H

    2016-06-15

    Purpose: There have been few reports on independent dose verification for Tomotherapy. We evaluated the accuracy and the effectiveness of an independent dose verification system for Tomotherapy. Methods: Simple MU Analysis (SMU, Triangle Product, Ishikawa, Japan) was used as the independent verification system; it implements a Clarkson-based dose calculation algorithm using the CT image data set. For dose calculation in the SMU, the Tomotherapy machine-specific dosimetric parameters (TMR, Scp, OAR and MLC transmission factor) were registered as the machine beam data. Dose calculation was performed after the Tomotherapy sinogram from the DICOM-RT plan information was converted to MU and MLC position information at more finely segmented control points. The performance of the SMU was assessed by point dose measurements in non-IMRT and IMRT plans (a simple target and a mock prostate plan). Subsequently, 30 patients’ treatment plans for prostate were compared. Results: From the comparison, dose differences between the SMU and the measurement were within 3% for all cases in the non-IMRT plans. In the IMRT plan for the simple target, the differences (average ± 1 SD) were −0.70±1.10% (SMU vs. TPS), −0.40±0.10% (measurement vs. TPS) and −1.20±1.00% (measurement vs. SMU), respectively. For the mock prostate, the differences were −0.40±0.60% (SMU vs. TPS), −0.50±0.90% (measurement vs. TPS) and −0.90±0.60% (measurement vs. SMU), respectively. For the patients’ plans, the difference was −0.50±2.10% (SMU vs. TPS). Conclusion: A Clarkson-based independent dose verification for Tomotherapy can be clinically available as a secondary check with a tolerance level similar to that of AAPM Task Group 114. This research is partially supported by the Japan Agency for Medical Research and Development (AMED).
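
    The abstract does not spell out the Clarkson algorithm itself. As a reminder of the general idea, here is a minimal sketch under simplified assumptions: the irregular field around the calculation point is divided into equal angular sectors, the scatter contribution of each sector is looked up in circular-field beam data at that sector's radius, and the contributions are averaged. The `sar_table` lookup is a hypothetical stand-in for registered machine beam data:

    ```python
    import numpy as np

    def clarkson_scatter(sector_radii, sar_table):
        """Average the scatter contribution over equal angular sectors:
        sector_radii[i] is the distance from the calculation point to the
        field edge in sector i; sar_table maps a circular-field radius to
        its scatter(-air) ratio."""
        return sum(sar_table(r) for r in sector_radii) / len(sector_radii)

    sar = lambda r: 0.02 * r             # invented lookup, illustration only
    radii = np.full(36, 5.0)             # 36 sectors, all edges at 5 cm
    print(clarkson_scatter(radii, sar))  # equals the SAR of a 5 cm circle here
    ```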

  3. Monte Carlo verification of radiotherapy treatments with CloudMC.

    PubMed

    Miras, Hector; Jiménez, Rubén; Perales, Álvaro; Terrón, José Antonio; Bertolet, Alejandro; Ortiz, Antonio; Macías, José

    2018-06-27

    A new implementation has been made on CloudMC, a cloud-based platform presented in a previous work, in order to provide services for radiotherapy treatment verification by means of Monte Carlo in a fast, easy and economical way. A description of the architecture of the application and of the new developments implemented is presented, together with the results of the tests carried out to validate its performance. CloudMC has been developed on the Microsoft Azure cloud. It is based on a map/reduce implementation that distributes Monte Carlo calculations over a dynamic cluster of virtual machines in order to reduce calculation time. CloudMC has been updated with new methods to read and process the information related to radiotherapy treatment verification: CT image set, treatment plan, structures and dose distribution files in DICOM format. Tests were designed to determine, for the different tasks, the most suitable type of virtual machine among those available in Azure. Finally, the performance of Monte Carlo verification in CloudMC is studied through three real cases that involve different treatment techniques, linac models and Monte Carlo codes. Considering computational and economic factors, D1_v2 and G1 virtual machines were selected as the default types for the Worker Roles and the Reducer Role, respectively. Calculation times up to 33 min and costs of 16 € were achieved for the verification cases presented when a statistical uncertainty below 2% (2σ) was required. The costs were reduced to 3-6 € when the uncertainty requirement was relaxed to 4%. Advantages such as high computational power, scalability, easy access and a pay-per-usage model make Monte Carlo cloud-based solutions like the one presented in this work an important step forward in solving the long-standing problem of truly introducing Monte Carlo algorithms into the daily routine of the radiotherapy planning process.
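
    The map/reduce pattern described above is straightforward to sketch. This is not CloudMC code; it is a minimal stand-in in which each "Worker Role" returns a scored dose batch and the "Reducer Role" combines the batches into a mean dose with a batch-statistics uncertainty:

    ```python
    import numpy as np

    def worker(seed, shape=(4, 4, 4)):
        """Stand-in for one Worker Role: run a batch of histories with an
        independent RNG stream; here the 'dose' is just random numbers."""
        rng = np.random.default_rng(seed)
        return rng.random(shape)

    def reducer(batch_doses):
        """Reducer Role: mean dose over batches plus the standard error of
        the mean, a common Monte Carlo uncertainty estimate."""
        stack = np.stack(batch_doses)
        return stack.mean(axis=0), stack.std(axis=0, ddof=1) / np.sqrt(len(batch_doses))

    mean_dose, sigma = reducer([worker(seed) for seed in range(10)])
    ```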

  4. Runtime Verification of Pacemaker Functionality Using Hierarchical Fuzzy Colored Petri-nets.

    PubMed

    Majma, Negar; Babamir, Seyed Morteza; Monadjemi, Amirhassan

    2017-02-01

    Today, implanted medical devices are increasingly used for many patients and in case of diverse health problems. However, several runtime problems and errors are reported by the relevant organizations, even resulting in patient death. One of those devices is the pacemaker. The pacemaker is a device helping the patient to regulate the heartbeat by connecting to the cardiac vessels. This device is directed by its software, so any failure in this software causes a serious malfunction. Therefore, this study aims to a better way to monitor the device's software behavior to decrease the failure risk. Accordingly, we supervise the runtime function and status of the software. The software verification means examining limitations and needs of the system users by the system running software. In this paper, a method to verify the pacemaker software, based on the fuzzy function of the device, is presented. So, the function limitations of the device are identified and presented as fuzzy rules and then the device is verified based on the hierarchical Fuzzy Colored Petri-net (FCPN), which is formed considering the software limits. Regarding the experiences of using: 1) Fuzzy Petri-nets (FPN) to verify insulin pumps, 2) Colored Petri-nets (CPN) to verify the pacemaker and 3) To verify the pacemaker by a software agent with Petri-network based knowledge, which we gained during the previous studies, the runtime behavior of the pacemaker software is examined by HFCPN, in this paper. This is considered a developing step compared to the earlier work. HFCPN in this paper, compared to the FPN and CPN used in our previous studies reduces the complexity. By presenting the Petri-net (PN) in a hierarchical form, the verification runtime, decreased as 90.61% compared to the verification runtime in the earlier work. Since we need an inference engine in the runtime verification, we used the HFCPN to enhance the performance of the inference engine.

  5. PET/CT imaging for treatment verification after proton therapy: A study with plastic phantoms and metallic implants

    PubMed Central

    Parodi, Katia; Paganetti, Harald; Cascio, Ethan; Flanz, Jacob B.; Bonab, Ali A.; Alpert, Nathaniel M.; Lohmann, Kevin; Bortfeld, Thomas

    2008-01-01

    The feasibility of off-line positron emission tomography/computed tomography (PET/CT) for routine three-dimensional in-vivo treatment verification of proton radiation therapy is currently under investigation at Massachusetts General Hospital in Boston. In preparation for clinical trials, phantom experiments were carried out to investigate the sensitivity and accuracy of the method depending on irradiation and imaging parameters. Furthermore, they addressed the feasibility of PET/CT as a robust verification tool in the presence of metallic implants. These produce x-ray CT artifacts and fluence perturbations which may compromise the accuracy of treatment planning algorithms. Spread-out Bragg peak proton fields were delivered to different phantoms consisting of polymethylmethacrylate (PMMA), PMMA stacked with lung and bone equivalent materials, and PMMA with titanium rods to mimic implants in patients. PET data were acquired in list mode starting within 20 min after irradiation at a commercial lutetium oxyorthosilicate (LSO)-based PET/CT scanner. The amount and spatial distribution of the measured activity could be well reproduced by calculations based on the GEANT4 and FLUKA Monte Carlo codes. This phantom study supports the potential of millimeter accuracy for range monitoring and lateral field position verification even after low therapeutic dose exposures of 2 Gy, despite the delay between irradiation and imaging. It also indicates the value of PET for treatment verification in the presence of metallic implants, demonstrating a higher sensitivity to fluence perturbations in comparison to a commercial analytical treatment planning system. Finally, it addresses the suitability of LSO-based PET detectors for hadron therapy monitoring. This unconventional application of PET involves count rates which are orders of magnitude lower than in diagnostic tracer imaging, i.e., the signal of interest is comparable to the noise originating from the intrinsic radioactivity of the detector itself. In addition to PET alone, PET/CT imaging provides accurate information on the position of the imaged object and may assess possible anatomical changes during fractionated radiotherapy in clinical applications. PMID:17388158
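
    Range monitoring of this kind typically reduces to comparing the distal falloff positions of the measured and predicted activity profiles. A minimal sketch (not the authors' analysis code) that locates the distal 50% falloff of a 1D depth profile by linear interpolation:

    ```python
    import numpy as np

    def distal_falloff(depth_cm, activity, level=0.5):
        """Depth at which the profile drops to `level` of its maximum on
        the distal side, found by linear interpolation."""
        a = np.asarray(activity, float) / np.max(activity)
        i_max = int(np.argmax(a))
        below = np.nonzero(a[i_max:] < level)[0]
        j = i_max + below[0]                      # first sample below the level
        x0, x1, y0, y1 = depth_cm[j - 1], depth_cm[j], a[j - 1], a[j]
        return x0 + (level - y0) * (x1 - x0) / (y1 - y0)

    depth = np.linspace(0, 20, 201)
    profile = 1 / (1 + np.exp((depth - 12.0) / 0.4))  # synthetic distal edge
    print(distal_falloff(depth, profile))             # ~12.0 cm
    ```

    Subtracting the measured falloff position from the Monte-Carlo-predicted one gives the range deviation that the study quotes at the millimeter level.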

  6. Determination of somatropin charged variants by capillary zone electrophoresis - optimisation, verification and implementation of the European pharmacopoeia method.

    PubMed

    Storms, S M; Feltus, A; Barker, A R; Joly, M-A; Girard, M

    2009-03-01

    Measurement of somatropin charged variants by isoelectric focusing was replaced with capillary zone electrophoresis in the January 2006 European Pharmacopoeia Supplement 5.3, based on results from an interlaboratory collaborative study. Due to incompatibilities and method-robustness issues encountered prior to verification, a number of method parameters required optimisation. As the use of a diode array detector at 195 nm or 200 nm led to a loss of resolution, a variable wavelength detector using a 200 nm filter was employed. Improved injection repeatability was obtained by increasing the injection time and pressure, and changing the sample diluent from water to running buffer. Finally, definition of capillary pre-treatment and rinse procedures resulted in more consistent separations over time. Method verification data are presented demonstrating linearity, specificity, repeatability, intermediate precision, limit of quantitation, sample stability, solution stability, and robustness. Based on these experiments, several modifications to the current method have been recommended and incorporated into the European Pharmacopoeia to help improve method performance across laboratories globally.

  7. Direct and full-scale experimental verifications towards ground-satellite quantum key distribution

    NASA Astrophysics Data System (ADS)

    Wang, Jian-Yu; Yang, Bin; Liao, Sheng-Kai; Zhang, Liang; Shen, Qi; Hu, Xiao-Fang; Wu, Jin-Cai; Yang, Shi-Ji; Jiang, Hao; Tang, Yan-Lin; Zhong, Bo; Liang, Hao; Liu, Wei-Yue; Hu, Yi-Hua; Huang, Yong-Mei; Qi, Bo; Ren, Ji-Gang; Pan, Ge-Sheng; Yin, Juan; Jia, Jian-Jun; Chen, Yu-Ao; Chen, Kai; Peng, Cheng-Zhi; Pan, Jian-Wei

    2013-05-01

    Quantum key distribution (QKD) provides the only intrinsically unconditionally secure method for communication based on the principles of quantum mechanics. Compared with fibre-based demonstrations, free-space links could provide the most appealing solution for communication over much larger distances. Despite significant efforts, all realizations to date rely on stationary sites. Experimental verifications are therefore extremely crucial for applications to a typical low Earth orbit satellite. To achieve direct and full-scale verifications of our set-up, we have carried out three independent experiments with a decoy-state QKD system covering all relevant conditions. The system is operated on a moving platform (using a turntable), on a floating platform (using a hot-air balloon), and over a high-loss channel to demonstrate performance under conditions of rapid motion, attitude change, vibration, random movement of satellites, and a high-loss regime. The experiments address wide ranges of all leading parameters relevant to low Earth orbit satellites. Our results pave the way towards ground-satellite QKD and a global quantum communication network.

  8. Verification of Eulerian-Eulerian and Eulerian-Lagrangian simulations for turbulent fluid-particle flows

    DOE PAGES

    Patel, Ravi G.; Desjardins, Olivier; Kong, Bo; ...

    2017-09-01

    Here, we present a verification study of three simulation techniques for fluid–particle flows, including an Euler–Lagrange approach (EL) inspired by Jackson's seminal work on fluidized particles, a quadrature-based moment method based on the anisotropic Gaussian closure (AG), and the traditional two-fluid model (TFM). We perform simulations of two problems: particles in frozen homogeneous isotropic turbulence (HIT) and cluster-induced turbulence (CIT). For verification, we evaluate various techniques for extracting statistics from EL and study the convergence properties of the three methods under grid refinement. The convergence is found to depend on the simulation method and on the problem, with CIT simulations posing fewer difficulties than HIT. Specifically, EL converges under refinement for both HIT and CIT, but statistics exhibit dependence on the postprocessing parameters. For CIT, AG produces similar results to EL. For HIT, converging both TFM and AG poses challenges. Overall, extracting converged, parameter-independent Eulerian statistics remains a challenge for all methods.
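
    A standard ingredient of such grid-refinement studies is the observed order of accuracy. A minimal sketch, assuming a scalar statistic computed on three grids with a constant refinement ratio r and monotone convergence:

    ```python
    import math

    def observed_order(f_coarse, f_medium, f_fine, r=2.0):
        """Observed convergence order from solutions on three systematically
        refined grids; assumes the successive differences share a sign."""
        return math.log((f_coarse - f_medium) / (f_medium - f_fine)) / math.log(r)

    print(observed_order(1.20, 1.05, 1.0125))  # 2.0, i.e. a second-order scheme
    ```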

  9. A melting-point-of-gallium apparatus for thermometer calibration.

    PubMed

    Sostman, H E; Manley, K A

    1978-08-01

    We have investigated the equilibrium melting point of gallium as a temperature fixed point at which to calibrate small thermistor thermometers, such as those used to measure temperature in enzyme reaction analysis and other temperature-dependent biological assays. We have determined that the melting temperature of "6N" (99.9999% pure) gallium is 29.770 +/- 0.002 degrees C, and that the constant-temperature plateau can be prolonged for several hours. We have designed a simple automated apparatus that exploits this phenomenon and that permits routine calibration verification of thermistor temperature probes throughout the laboratory day. We describe the physics of the gallium melt, and the design and use of the apparatus.

  10. Safeguards: The past, present, and future

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Seaton, M.B.

    1987-07-01

    The non-destructive assay techniques developed at Los Alamos have become a primary means of verification for the IAEA and are most important for domestic safeguards. We must challenge our assumptions, e.g., that inventory differences are a valid measure of safeguards performance, that more money is the solution, and that the threats are much exaggerated. A human reliability program will be initiated. Material control, accounting, and physical protection need further integration. A serious effort involving scholarships, internships, etc. is needed to attract and motivate young people. Increased emphasis will be placed on designing safeguards into new systems such as laser isotope separation. Finally, continuing generous support for the IAEA is most important.

  11. Formal Techniques for Synchronized Fault-Tolerant Systems

    NASA Technical Reports Server (NTRS)

    DiVito, Ben L.; Butler, Ricky W.

    1992-01-01

    We present the formal verification of synchronizing aspects of the Reliable Computing Platform (RCP), a fault-tolerant computing system for digital flight control applications. The RCP uses NMR-style redundancy to mask faults and internal majority voting to purge the effects of transient faults. The system design has been formally specified and verified using the EHDM verification system. Our formalization is based on an extended state machine model incorporating snapshots of local processors' clocks.
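
    The voting element is simple to illustrate. A minimal sketch of NMR-style exact-match majority voting (the formally verified design also reasons about clock synchronization and frame alignment, none of which is captured here):

    ```python
    from collections import Counter

    def majority_vote(replica_outputs):
        """Mask faults by exact-match majority voting over N redundant
        processors; a transient on a minority of replicas is purged."""
        value, count = Counter(replica_outputs).most_common(1)[0]
        if count > len(replica_outputs) // 2:
            return value
        raise RuntimeError("no majority: too many simultaneous faults")

    print(majority_vote([42, 42, 7, 42]))  # 42; the faulty replica is outvoted
    ```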

  12. A dedicated software application for treatment verification with off-line PET/CT imaging at the Heidelberg Ion Beam Therapy Center

    NASA Astrophysics Data System (ADS)

    Chen, W.; Bauer, J.; Kurz, C.; Tessonnier, T.; Handrack, J.; Haberer, T.; Debus, J.; Parodi, K.

    2017-01-01

    We present the workflow of the offline PET-based range verification method used at the Heidelberg Ion Beam Therapy Center, detailing the functionalities of an in-house developed software application, SimInterface14, with which the range analysis is performed. Moreover, we introduce the design of a decision support system that assesses uncertainties and facilitates physicians' decision making for plan adaptation.

  13. Automatic Methods and Tools for the Verification of Real Time Systems

    DTIC Science & Technology

    1997-07-31

    real-time systems. This was accomplished by extending techniques, based on automata theory and temporal logic, that have been successful for the verification of time-independent reactive systems. As a system specification language for embedded real-time systems, we introduced hybrid automata, which equip traditional discrete automata with real-numbered clock variables and continuous environment variables. As requirements specification languages, we introduced temporal logics with clock variables for expressing timing constraints.

  14. Expert system verification and validation study. Phase 2: Requirements Identification. Delivery 2: Current requirements applicability

    NASA Technical Reports Server (NTRS)

    1991-01-01

    The second phase of a task is described which has the ultimate purpose of ensuring that adequate Expert Systems (ESs) Verification and Validation (V and V) tools and techniques are available for Space Station Freedom Program Knowledge Based Systems development. The purpose of this phase is to recommend modifications to current software V and V requirements which will extend the applicability of the requirements to NASA ESs.

  15. Secure Image Hash Comparison for Warhead Verification

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bruillard, Paul J.; Jarman, Kenneth D.; Robinson, Sean M.

    2014-06-06

    The effort to inspect and verify warheads in the context of possible future arms control treaties is rife with security and implementation issues. In this paper we review prior work on perceptual image hashing for template-based warhead verification. Furthermore, we formalize the notion of perceptual hashes and demonstrate that large classes of such functions are likely not cryptographically secure. We close with a brief discussion of fully homomorphic encryption as an alternative technique.
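
    As a concrete reference point, a toy perceptual hash of the kind analysed in the paper is sketched below: an "average hash" that thresholds a block-downsampled image at its mean, with hashes compared by Hamming distance. Its robustness to small changes is exactly why, as the paper argues, such functions should not be presumed cryptographically secure:

    ```python
    import numpy as np

    def average_hash(image, size=8):
        """Block-average the image to size x size, threshold at the mean,
        and flatten to a bit vector."""
        img = np.asarray(image, float)
        h, w = img.shape
        img = img[: h - h % size, : w - w % size]
        blocks = img.reshape(size, img.shape[0] // size,
                             size, img.shape[1] // size).mean(axis=(1, 3))
        return (blocks > blocks.mean()).flatten()

    def hamming(h1, h2):
        return int(np.count_nonzero(h1 != h2))

    rng = np.random.default_rng(1)
    img = rng.random((64, 64))
    noisy = img + 0.01 * rng.random((64, 64))
    print(hamming(average_hash(img), average_hash(noisy)))  # small distance
    ```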

  16. A Methodology for Formal Hardware Verification, with Application to Microprocessors.

    DTIC Science & Technology

    1993-08-29

    concurrent programming languages. Proceedings of the NATO Advanced Study Institute on Logics and Models of Concurrent Systems (Colle-sur-Loup, France, 8-19...restricted class of formulas. Bose and Fisher [26] developed a symbolic model checker based on a Cosmos switch-level model. Their modeling approach...verification using SDVS - the method and a case study. 17th Annual Microprogramming Workshop (New Orleans, LA, 30 October-2 November 1984). Published as

  17. Towards the development of a rapid, portable, surface enhanced Raman spectroscopy based cleaning verification system for the drug nelarabine.

    PubMed

    Corrigan, Damion K; Salton, Neale A; Preston, Chris; Piletsky, Sergey

    2010-09-01

    Cleaning verification is a scientific and economic problem for the pharmaceutical industry. A large amount of potential manufacturing time is lost to the process of cleaning verification. This involves the analysis of residues on soiled manufacturing equipment, with high-performance liquid chromatography (HPLC) being the predominantly employed analytical technique. The aim of this study was to develop a portable cleaning verification system for nelarabine using surface enhanced Raman spectroscopy (SERS). SERS was conducted using a portable Raman spectrometer and a commercially available SERS substrate to develop a rapid and portable cleaning verification system for nelarabine. Samples of standard solutions and swab extracts were deposited onto the SERS active surfaces, allowed to dry and then subjected to spectroscopic analysis. Nelarabine was amenable to analysis by SERS and the necessary levels of sensitivity were achievable. It is possible to use this technology for a semi-quantitative limits test. Replicate precision, however, was poor due to the heterogeneous drying pattern of nelarabine on the SERS active surface. Understanding and improving the drying process in order to produce a consistent SERS signal for quantitative analysis is desirable. This work shows the potential application of SERS for cleaning verification analysis. SERS may not replace HPLC as the definitive analytical technique, but it could be used in conjunction with HPLC so that swabbing is only carried out once the portable SERS equipment has demonstrated that the manufacturing equipment is below the threshold contamination level.

  18. Selectivity verification of cardiac troponin monoclonal antibodies for cardiac troponin detection by using conventional ELISA

    NASA Astrophysics Data System (ADS)

    Fathil, M. F. M.; Arshad, M. K. Md; Gopinath, Subash C. B.; Adzhri, R.; Ruslinda, A. R.; Hashim, U.

    2017-03-01

    This paper presents the preparation and characterization of a conventional enzyme-linked immunosorbent assay (ELISA) for cardiac troponin detection to determine the selectivity of the cardiac troponin monoclonal antibodies. The monoclonal antibodies used to capture and bind the targets in this experiment are the cTnI monoclonal antibody (MAb-cTnI) and the cTnT monoclonal antibody (MAb-cTnT), while both cardiac troponin I (cTnI) and T (cTnT) are used as targets. ELISA is performed in two microtiter plates, one for MAb-cTnI and one for MAb-cTnT. On each plate, the monoclonal antibodies are tested against various concentrations of cTnI and cTnT ranging from 0 to 6400 µg/l. The binding selectivity and level of detection between monoclonal antibodies and antigen are determined through visual observation based on the color change inside each well on the plate. An ELISA reader is further used to quantitatively measure the optical density of the color changes, producing more accurate readings. The results from this experiment are used to justify the use of these monoclonal antibodies as bio-receptors for cardiac troponin detection with field-effect transistor (FET)-based biosensors coupled with a substrate gate in the future.

  19. Generic Verification Protocol for Verification of Online Turbidimeters

    EPA Science Inventory

    This protocol provides generic procedures for implementing a verification test for the performance of online turbidimeters. The verification tests described in this document will be conducted under the Environmental Technology Verification (ETV) Program. Verification tests will...

  1. Speaker verification using committee neural networks.

    PubMed

    Reddy, Narender P; Buch, Ojas A

    2003-10-01

    Security is a major problem in web-based or remote access to databases. In the present study, the technique of committee neural networks was developed for speech-based speaker verification. Speech data from the designated speaker and several imposters were obtained. Several parameters were extracted in the time and frequency domains and fed to neural networks. Several neural networks were trained, and the five best-performing networks were recruited into the committee. The committee decision was based on majority voting of the member networks. The committee opinion was evaluated with further testing data. The committee correctly identified the designated speaker in 100% (50 out of 50) of the cases and rejected imposters in 100% (150 out of 150) of the cases. The committee decision was not unanimous in the majority of the cases tested.
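
    The committee logic is easy to make concrete. A minimal sketch, assuming each member network emits an acceptance score in [0, 1] for the claimed speaker (threshold and scores invented):

    ```python
    def committee_accepts(member_scores, threshold=0.5):
        """Majority voting: accept the claimed identity only if more than
        half of the committee members individually accept."""
        votes = sum(score > threshold for score in member_scores)
        return votes > len(member_scores) // 2

    print(committee_accepts([0.9, 0.8, 0.4, 0.7, 0.6]))  # True: 4 of 5 accept
    ```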

  2. Detection of bovine central nervous system tissues in rendered animal by-products by one-step real-time reverse transcription PCR assay.

    PubMed

    Andrievskaia, Olga; Tangorra, Erin

    2014-12-01

    Contamination of rendered animal by-products with central nervous system tissues (CNST) from animals with bovine spongiform encephalopathy is considered one of the vehicles of disease transmission. Removal from the animal feed chain of CNST originating from cattle of a specified age category, species-labeling of rendered meat products, and testing of rendered products for bovine CNST are tasks associated with the epidemiological control of bovine spongiform encephalopathy. A single-step TaqMan real-time reverse transcriptase (RRT) PCR assay was developed and evaluated for specific detection of bovine glial fibrillary acidic protein (GFAP) mRNA, a biomarker of bovine CNST, in rendered animal by-products. An internal amplification control, mammalian β-actin mRNA, was coamplified in the duplex RRT-PCR assay to monitor amplification efficiency, normalize amplification signals, and avoid false-negative results. The functionality of the GFAP mRNA RRT-PCR was assessed through analysis of laboratory-generated binary mixtures of bovine central nervous system (CNS) and muscle tissues treated under various thermal settings imitating industrial conditions. The assay was able to detect as little as 0.05% (wt/wt) bovine brain tissue in binary mixtures heat treated at 110 to 130°C for 20 to 60 min. Further evaluation of the GFAP mRNA RRT-PCR assay involved samples of industrial rendered products of various species origin and composition obtained from commercial sources and rendering plants. Low amounts of bovine GFAP mRNA were detected in several bovine-rendered products, which was in agreement with the declared species composition. An accurate estimation of CNS tissue content in industrial-rendered products was complicated by the wide range of temperature and time settings in rendering protocols. Nevertheless, the GFAP mRNA RRT-PCR assay may be considered for bovine CNS tissue detection in rendered products in combination with other available tools (for example, animal age verification) in inspection programs.
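
    The duplex design implies a simple decision rule: the β-actin internal control must amplify for the run to be valid at all (this is what guards against false negatives), and GFAP mRNA must cross threshold for a positive call. The cutoff cycles below are invented for illustration; the assay's actual thresholds are not given in the abstract:

    ```python
    def call_sample(ct_gfap, ct_actin, gfap_cutoff=38.0, actin_cutoff=32.0):
        """Toy decision rule for a duplex RRT-PCR with an internal
        amplification control (Ct values; None means no amplification)."""
        if ct_actin is None or ct_actin > actin_cutoff:
            return "invalid: internal amplification control failed"
        if ct_gfap is not None and ct_gfap <= gfap_cutoff:
            return "bovine CNS tissue detected"
        return "not detected"

    print(call_sample(ct_gfap=31.2, ct_actin=24.5))  # detected
    print(call_sample(ct_gfap=31.2, ct_actin=None))  # invalid, not negative
    ```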

  3. Computer simulation of storm runoff for three watersheds in Albuquerque, New Mexico

    USGS Publications Warehouse

    Knutilla, R.L.; Veenhuis, J.E.

    1994-01-01

    Rainfall-runoff data from three watersheds were selected for calibration and verification of the U.S. Geological Survey's Distributed Routing Rainfall-Runoff Model. The watersheds chosen are residentially developed. The conceptually based model uses an optimization process that adjusts selected parameters to achieve the best fit between measured and simulated runoff volumes and peak discharges. Three of these optimization parameters represent soil-moisture conditions, three represent infiltration, and one accounts for effective impervious area. Each watershed modeled was divided into overland-flow segments and channel segments. The overland-flow segments were further subdivided to reflect pervious and impervious areas. Each overland-flow and channel segment was assigned representative values of area, slope, percentage of imperviousness, and roughness coefficients. Rainfall-runoff data for each watershed were separated into two sets, one for calibration and one for verification. For model calibration, seven input parameters were optimized to attain the best fit of the data. For model verification, parameter values were fixed at those obtained from calibration. The standard error of estimate for calibration ranged from 19 to 34 percent for runoff volumes and from 27 to 44 percent for peak discharges; for verification, it ranged from 26 to 31 percent for runoff volumes and from 31 to 43 percent for peak discharges.
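
    The reported statistic can be reproduced in a few lines. A sketch assuming one conventional definition of the standard error of estimate, expressed as a percentage of the mean measured value (the report itself may use a slightly different, e.g. log-transformed, form):

    ```python
    import numpy as np

    def standard_error_pct(measured, simulated):
        """Standard error of estimate between simulated and measured values
        (runoff volumes or peak discharges), in percent of the measured
        mean; needs more than two events."""
        m = np.asarray(measured, float)
        s = np.asarray(simulated, float)
        see = np.sqrt(np.sum((s - m) ** 2) / (len(m) - 2))
        return 100.0 * see / m.mean()

    print(standard_error_pct([10, 20, 30, 40], [12, 18, 33, 38]))
    ```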

  4. Calibration and verification of thermographic cameras for geometric measurements

    NASA Astrophysics Data System (ADS)

    Lagüela, S.; González-Jorge, H.; Armesto, J.; Arias, P.

    2011-03-01

    Infrared thermography is a technique with an increasing degree of development and a growing range of applications. Quality assessment of the measurements performed with thermal cameras should be achieved through metrological calibration and verification. Infrared cameras acquire temperature and geometric information, although calibration and verification procedures are usual only for the thermal data, with black bodies used for these purposes. Yet the geometric information is important for many fields, such as architecture, civil engineering and industry. This work presents a calibration procedure that allows photogrammetric restitution, and a portable artefact to verify the geometric accuracy, repeatability and drift of thermographic cameras. These results allow the incorporation of this information into the quality control processes of companies. A grid based on burning lamps is used for the geometric calibration of the thermographic cameras. The artefact designed for the geometric verification consists of five Delrin spheres and seven cubes of different sizes. Metrological traceability for the artefact is obtained from a coordinate measuring machine. Two sets of targets with different reflectivity are fixed to the spheres and cubes to make data processing and photogrammetric restitution possible. Reflectivity was the chosen material property because both thermographic and visible cameras are able to detect it. Two thermographic cameras, from the manufacturers Flir and Nec, and one visible camera from Jai are calibrated, verified and compared using the calibration grids and the standard artefact. The calibration system based on burning lamps shows its capability to perform the internal orientation of the thermal cameras. Verification results show repeatability better than 1 mm for all cameras, and better than 0.5 mm for the visible one. As expected, accuracy is also higher for the visible camera, and the geometric comparison between the thermographic cameras shows slightly better results for the Nec camera.

  5. A methodology for model-based development and automated verification of software for aerospace systems

    NASA Astrophysics Data System (ADS)

    Martin, L.; Schatalov, M.; Hagner, M.; Goltz, U.; Maibaum, O.

    Today's software for aerospace systems is typically very complex, owing to the increasing number of features as well as the high demands on safety, reliability, and quality. This complexity also leads to significantly higher software development costs. To handle the software complexity, a structured development process is necessary. Additionally, compliance with relevant standards for quality assurance is a mandatory concern. To assure high software quality, techniques for verification are necessary. Besides traditional techniques like testing, automated verification techniques like model checking are becoming more popular. The latter examine the whole state space and, consequently, provide full test coverage. Nevertheless, despite the obvious advantages, this technique is as yet rarely used for the development of aerospace systems. In this paper, we propose a tool-supported methodology for the development and formal verification of safety-critical software in the aerospace domain. The methodology relies on the V-Model and defines a comprehensive workflow for model-based software development as well as automated verification in compliance with the European standard series ECSS-E-ST-40C. Furthermore, our methodology supports the generation and deployment of code. For tool support we use SCADE Suite (Esterel Technologies), an integrated design environment that covers all the requirements of our methodology. The SCADE Suite is well established in the avionics and defense, rail transportation, energy and heavy equipment industries. For evaluation purposes, we apply our approach to an up-to-date case study of the TET-1 satellite bus; in particular, the attitude and orbit control software is considered. The behavioral models for the subsystem are developed, formally verified, and optimized.

  6. MO-FG-202-01: A Fast Yet Sensitive EPID-Based Real-Time Treatment Verification System

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ahmad, M; Nourzadeh, H; Neal, B

    2016-06-15

    Purpose: To create a real-time EPID-based treatment verification system which robustly detects treatment delivery and patient attenuation variations. Methods: Treatment plan DICOM files sent to the record-and-verify system are captured and utilized to predict EPID images for each planned control point using a modified GPU-based digitally reconstructed radiograph algorithm which accounts for the patient attenuation, source energy fluence, source size effects, and MLC attenuation. The DICOM and predicted images are utilized by our C++ treatment verification software, which compares EPID-acquired 1024×768-resolution frames acquired at ∼8.5 Hz from a Varian TrueBeam™ system. To maximize detection sensitivity, image comparisons determine (1) if radiation exists outside of the desired treatment field; (2) if radiation is lacking inside the treatment field; (3) if translations, rotations, and magnifications of the image are within tolerance. Acquisition was tested with known test fields and prior patient fields. Error detection was tested in real time and using images acquired during treatment with another system. Results: The computational time of the prediction algorithms, for a patient plan with 350 control points and a 60×60×42 cm³ CT volume, is 2-3 minutes on CPU and <27 seconds on GPU for 1024×768 images. The verification software requires a maximum of ∼9 ms and ∼19 ms for 512×384- and 1024×768-resolution images, respectively, to perform image analysis and dosimetric validations. Typical variations in geometric parameters between the reference and measured images are 0.32° for gantry rotation, 1.006 for the scaling factor, and 0.67 mm for translation. For excess out-of-field/missing in-field fluence, with masks extending 1 mm (at isocenter) from the detected aperture edge, the average total in-field area missing EPID fluence was 1.5 mm² and the out-of-field excess EPID fluence was 8 mm², both below error tolerances. Conclusion: A real-time verification software package, with an EPID image prediction algorithm, was developed. The system is capable of performing verifications between frame acquisitions and identifying the source(s) of any out-of-tolerance variations. This work was supported in part by Varian Medical Systems.
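
    The in-field/out-of-field checks can be sketched with plain array operations. This is not the authors' GPU pipeline: masks are grown and shrunk by one pixel with simple shifts (np.roll wraps at the array edges, so a real implementation would use proper morphological operations, e.g. from scipy.ndimage):

    ```python
    import numpy as np

    def fluence_discrepancies(predicted, measured, thresh, margin_px=1):
        """Pixel counts of out-of-field excess and in-field missing fluence,
        with the field mask dilated/eroded by a small margin."""
        in_field = predicted > thresh
        grow, shrink = in_field.copy(), in_field.copy()
        for ax in (0, 1):
            for sh in (-margin_px, margin_px):
                grow |= np.roll(in_field, sh, axis=ax)
                shrink &= np.roll(in_field, sh, axis=ax)
        excess_out = int(np.count_nonzero((measured > thresh) & ~grow))
        missing_in = int(np.count_nonzero((measured < thresh) & shrink))
        return excess_out, missing_in

    pred = np.zeros((10, 10)); pred[3:7, 3:7] = 1.0
    meas = pred.copy(); meas[0, 0] = 1.0           # stray fluence outside field
    print(fluence_discrepancies(pred, meas, 0.5))  # (1, 0)
    ```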

  7. Verification Games: Crowd-Sourced Formal Verification

    DTIC Science & Technology

    2016-03-01

    Final technical report for the project Verification Games: Crowd-Sourced Formal Verification, University of Washington, March 2016; performance period June 2012 - September 2015 (contract FA8750...). Over the more than three years of the project Verification Games: Crowd-Sourced...

  8. Destructive analysis capabilities for plutonium and uranium characterization at Los Alamos National Laboratory

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Tandon, Lav; Kuhn, Kevin J; Drake, Lawrence R

    Los Alamos National Laboratory's (LANL) Actinide Analytical Chemistry (AAC) group has been in existence since the Manhattan Project. It maintains a complete set of analytical capabilities for performing full characterization (elemental assay, isotopics, and metallic and non-metallic trace impurities) of uranium and plutonium samples in different forms. For a majority of the customers there are strong quality assurance (QA) and quality control (QC) objectives, including the highest accuracy and precision with well-defined uncertainties associated with the analytical results. Los Alamos participates in various international and national programs such as the Plutonium Metal Exchange Program, New Brunswick Laboratory's (NBL's) Safeguards Measurement Evaluation (SME) Program, and several other inter-laboratory round-robin exercises to monitor and evaluate the data quality generated by AAC. These programs also provide independent verification of analytical measurement capabilities and allow any technical problems with analytical measurements to be identified and corrected. This presentation will focus on key analytical capabilities for destructive analysis in AAC and also on comparative data between LANL and peer groups for Pu assay and isotopic analysis.

  9. 4D offline PET-based treatment verification in scanned ion beam therapy: a phantom study

    NASA Astrophysics Data System (ADS)

    Kurz, Christopher; Bauer, Julia; Unholtz, Daniel; Richter, Daniel; Stützer, Kristin; Bert, Christoph; Parodi, Katia

    2015-08-01

    At the Heidelberg Ion-Beam Therapy Center, patient irradiation with scanned proton and carbon ion beams is verified by offline positron emission tomography (PET) imaging: the β+-activity measured within the patient is compared to a prediction calculated on the basis of the treatment planning data in order to identify potential delivery errors. Currently, this monitoring technique is limited to the treatment of static target structures. However, intra-fractional organ motion imposes considerable additional challenges to scanned ion beam radiotherapy. In this work, the feasibility and potential of time-resolved (4D) offline PET-based treatment verification with a commercial full-ring PET/CT (x-ray computed tomography) device are investigated for the first time, based on an experimental campaign with moving phantoms. Motion was monitored during the gated beam delivery as well as the subsequent PET acquisition and was taken into account in the corresponding 4D Monte-Carlo simulations and data evaluation. Under the given experimental conditions, millimeter agreement between the prediction and measurement was found. Dosimetric consequences due to the phantom motion could be reliably identified. The agreement between PET measurement and prediction in the presence of motion was found to be similar to that in static reference measurements, thus demonstrating the potential of 4D PET-based treatment verification for future clinical applications.

  10. Development of an inpatient operational pharmacy productivity model.

    PubMed

    Naseman, Ryan W; Lopez, Ben R; Forrey, Ryan A; Weber, Robert J; Kipp, Kris M

    2015-02-01

    An innovative model for measuring the operational productivity of medication order management in inpatient settings is described. Order verification within a computerized prescriber order-entry system was chosen as the pharmacy workload driver. To account for inherent variability in the tasks involved in processing different types of orders, pharmaceutical products were grouped by class, and each class was assigned a time standard, or "medication complexity weight," reflecting the intensity of pharmacist and technician activities (verification of drug indication, verification of appropriate dosing, adverse-event prevention and monitoring, medication preparation, product checking, product delivery, returns processing, nurse/provider education, and problem-order resolution). The resulting "weighted verifications" (WV) model allows productivity monitoring by job function (pharmacist versus technician) to guide hiring and staffing decisions. A 9-month historical sample of verified medication orders was analyzed using the WV model, and the calculations were compared with values derived from two established models, one based on the Case Mix Index (CMI) and the other based on the proprietary Pharmacy Intensity Score (PIS). Evaluation of Pearson correlation coefficients indicated that values calculated using the WV model were highly correlated with those derived from the CMI- and PIS-based models (r = 0.845 and 0.886, respectively). Relative to the comparator models, the WV model offered the advantage of less period-to-period variability. The WV model yielded productivity data that correlated closely with values calculated using two validated workload management models. The model may be used as an alternative measure of pharmacy operational productivity. Copyright © 2015 by the American Society of Health-System Pharmacists, Inc. All rights reserved.
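
    The weighted-verifications arithmetic is simple. A minimal sketch with invented medication classes and time standards, plus the Pearson comparison used to validate the model against a comparator series:

    ```python
    import numpy as np

    # Hypothetical complexity weights (time standards) per medication class.
    WEIGHTS = {"oral": 1.0, "iv_simple": 2.5, "compounded_chemo": 6.0}

    def weighted_verifications(order_counts):
        """Total weighted workload: verified-order counts times their
        class time standards."""
        return sum(WEIGHTS[cls] * n for cls, n in order_counts.items())

    wv = weighted_verifications({"oral": 900, "iv_simple": 300, "compounded_chemo": 40})
    productivity = wv / 160.0  # workload per worked staff-hour, say

    # Validation against a comparator model over several periods.
    wv_series  = [5200.0, 5350.0, 5100.0, 5500.0]
    cmi_series = [1.61, 1.66, 1.58, 1.70]
    print(wv, productivity, np.corrcoef(wv_series, cmi_series)[0, 1])
    ```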

  11. Evidence flow graph methods for validation and verification of expert systems

    NASA Technical Reports Server (NTRS)

    Becker, Lee A.; Green, Peter G.; Bhatnagar, Jayant

    1988-01-01

    This final report describes the results of an investigation into the use of evidence flow graph techniques for performing validation and verification of expert systems. This was approached by developing a translator to convert horn-clause rule bases into evidence flow graphs, a simulation program, and methods of analysis. These tools were then applied to a simple rule base which contained errors. It was found that the method was capable of identifying a variety of problems, for example that the order of presentation of input data or small changes in critical parameters could affect the output from a set of rules.

  12. Satellite Power System (SPS) concept definition study (exhibit C)

    NASA Technical Reports Server (NTRS)

    Haley, G. M.

    1979-01-01

    The major outputs of the study are the constructability studies which resulted in the definition of the concepts for satellite, rectenna, and satellite construction base construction. Transportation analyses resulted in definition of heavy-lift launch vehicle, electric orbit transfer vehicle, personnel orbit transfer vehicle, and intra-orbit transfer vehicle as well as overall operations related to transportation systems. The experiment/verification program definition resulted in the definition of elements for the Ground-Based Experimental Research and Key Technology plans. These studies also resulted in conceptual approaches for early space technology verification. The cost analysis defined the overall program and cost data for all program elements and phases.

  13. Test and training simulator for ground-based teleoperated in-orbit servicing

    NASA Technical Reports Server (NTRS)

    Schaefer, Bernd E.

    1989-01-01

    For the post-IOC (In-Orbit Construction) phase of COLUMBUS it is intended to use robotic devices for the routine operations of ground-based teleoperated in-orbit servicing. A hardware simulator for verification of the relevant in-orbit operations technologies, the Servicing Test Facility, is necessary; it will mainly support the Flight Control Center for the Manned Space Laboratories with operation-specific tasks such as system simulation, training of teleoperators, parallel simulation running simultaneously with actual in-orbit activities, and verification of the ground operations segment for telerobotics. The present status of the definition of the facility's functional and operational concept is described.

  14. Determining the mechanical properties of a radiochromic silicone-based 3D dosimeter

    NASA Astrophysics Data System (ADS)

    Kaplan, L. P.; Høye, E. M.; Balling, P.; Muren, L. P.; Petersen, J. B. B.; Poulsen, P. R.; Yates, E. S.; Skyt, P. S.

    2017-07-01

    New treatment modalities in radiotherapy (RT) enable delivery of highly conformal dose distributions in patients. This creates a need for precise dose verification in three dimensions (3D). A radiochromic silicone-based 3D dosimetry system has recently been developed. Such a dosimeter can be used for dose verification in deformed geometries, which requires knowledge of the dosimeter’s mechanical properties. In this study we have characterized the dosimeter’s elastic behaviour under tensile and compressive stress. In addition, the dose response under strain was determined. It was found that the dosimeter behaved as an incompressible hyperelastic material with a non-linear stress/strain curve and with no observable hysteresis or plastic deformation even at high strains. The volume was found to be constant within a 2% margin at deformations up to 60%. Furthermore, it was observed that the dosimeter returned to its original geometry within a 2% margin when irradiated under stress, and that the change in optical density per centimeter was constant regardless of the strain during irradiation. In conclusion, we have shown that this radiochromic silicone-based dosimeter’s mechanical properties make it a viable candidate for dose verification in deformable 3D geometries.

  15. Provable Transient Recovery for Frame-Based, Fault-Tolerant Computing Systems

    NASA Technical Reports Server (NTRS)

    DiVito, Ben L.; Butler, Ricky W.

    1992-01-01

    We present a formal verification of the transient fault recovery aspects of the Reliable Computing Platform (RCP), a fault-tolerant computing system architecture for digital flight control applications. The RCP uses NMR-style redundancy to mask faults and internal majority voting to purge the effects of transient faults. The system design has been formally specified and verified using the EHDM verification system. Our formalization accommodates a wide variety of voting schemes for purging the effects of transients.

  16. Using a Modular Open Systems Approach in Defense Acquisitions: Implications for the Contracting Process

    DTIC Science & Technology

    2006-01-30

    He has taught contract management courses for the UCLA Government Contracts Certificate program and is also a senior faculty member for the Keller...standards for its key interfaces, and has been subjected to successful validation and verification tests to ensure the openness of its key interfaces...widely supported and consensus based standards for its key interfaces, and is subject to validation and verification tests to ensure the openness of its

  17. Sensor Based Framework for Secure Multimedia Communication in VANET

    PubMed Central

    Rahim, Aneel; Khan, Zeeshan Shafi; Bin Muhaya, Fahad T.; Sher, Muhammad; Kim, Tai-Hoon

    2010-01-01

    Secure multimedia communication enhances the safety of passengers by providing visual pictures of accidents and dangerous situations. In this paper we propose a framework for secure multimedia communication in Vehicular Ad-Hoc Networks (VANETs). Our proposed framework is divided into four main components: redundant information, priority assignment, malicious data verification and malicious node verification. The proposed scheme has been validated with the help of the NS-2 network simulator and the Evalvid tool. PMID:22163462

  18. Modeling and experimental verification of laser self-mixing interference phenomenon with the structure of two-external-cavity feedback

    NASA Astrophysics Data System (ADS)

    Chen, Peng; Liu, Yuwei; Gao, Bingkun; Jiang, Chunlei

    2018-03-01

    A semiconductor laser with a two-external-cavity feedback structure is investigated and analyzed for the laser self-mixing interference (SMI) phenomenon. An SMI model with two feedback directions, based on the Fabry-Perot (F-P) cavity, is derived, and numerical simulations and experimental verification are conducted. Experimental results show that SMI with the two-external-cavity feedback structure under weak optical feedback is similar to the sum of two single-cavity SMI signals.
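
    The abstract gives no equations, but the usual weak-feedback SMI model extends naturally to two cavities: each external cavity contributes its own excess-phase equation and modulation term, and the output power is approximately the superposition of two single-cavity SMI signals, consistent with the experimental finding above. The sketch below makes that assumption explicit; all parameter values are illustrative:

    ```python
    import numpy as np

    def excess_phase(phi0, C, alpha=4.0, iters=200):
        """Solve phi = phi0 - C*sin(phi + arctan(alpha)) by fixed-point
        iteration; for weak feedback (C < 1) the map is a contraction."""
        phi = np.asarray(phi0, float)
        for _ in range(iters):
            phi = phi0 - C * np.sin(phi + np.arctan(alpha))
        return phi

    def smi_power(phi0_1, phi0_2, C1=0.5, C2=0.3, m1=0.05, m2=0.03):
        """Normalized power with two weak external cavities, modelled as a
        sum of two single-cavity self-mixing signals."""
        return (1.0 + m1 * np.cos(excess_phase(phi0_1, C1))
                    + m2 * np.cos(excess_phase(phi0_2, C2)))

    t = np.linspace(0.0, 1.0, 500)
    p = smi_power(2 * np.pi * 5 * t, 2 * np.pi * 3 * t)  # two moving targets
    ```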

  19. Accuracy Evaluation of a CE-Marked Glucometer System for Self-Monitoring of Blood Glucose With Three Reagent Lots Following ISO 15197:2013.

    PubMed

    Hehmke, Bernd; Berg, Sabine; Salzsieder, Eckhard

    2017-05-01

    Continuous, standardized verification of the accuracy of blood glucose meter systems for self-monitoring after their introduction onto the market is an important clinical tool to assure reliable performance of subsequently released lots of strips. Moreover, such published verification studies permit comparison of different blood glucose monitoring systems and thus are increasingly used in evidence-based purchase decision making.
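
    For reference, the ISO 15197:2013 system-accuracy criterion can be encoded in a few lines: at least 95% of meter results must lie within ±15 mg/dL of the comparison method below 100 mg/dL, and within ±15% at or above 100 mg/dL (the standard adds a consensus-error-grid requirement not checked here):

    ```python
    def iso15197_pass_rate(pairs):
        """Fraction of (reference, meter) results in mg/dL that meet the
        ISO 15197:2013 system-accuracy limits."""
        ok = sum(abs(meter - ref) <= (15.0 if ref < 100.0 else 0.15 * ref)
                 for ref, meter in pairs)
        return ok / len(pairs)

    data = [(85, 95), (120, 130), (250, 260)]   # toy lot-verification data
    print(iso15197_pass_rate(data) >= 0.95)     # True for this toy set
    ```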

  20. Verification using Satisfiability Checking, Predicate Abstraction, and Craig Interpolation

    DTIC Science & Technology

    2008-09-01

    297, 2007. [48] Roberto Bruttomesso, Alessandro Cimatti, Anders Franzen, Alberto Griggio, Ziyad Hanna, Alexander Nadel, Amit Palti, and...using SAT-based conflict analysis. In Formal Methods in Computer-Aided Design, pages 33-51, 2002. [54] Alessandro Cimatti, Alberto Griggio, and...and D. Vroon. Automatic memory reductions for RTL-level verification. In ICCAD, 2006. [108] Joao P. Marques-Silva and Karem A. Sakallah

  1. Optical detection of random features for high security applications

    NASA Astrophysics Data System (ADS)

    Haist, T.; Tiziani, H. J.

    1998-02-01

    Optical detection of random features, in combination with digital signatures based on public-key codes, for recognizing counterfeit objects is discussed. Objects are protected against counterfeiting without the need for expensive production techniques. Verification is done off-line by optical means, without a central authority. The method is applied to protecting banknotes, and experimental results for this application are presented. The method is also applicable to identity verification of a credit- or chip-card holder.
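
    The protocol pairs a physically random, hard-to-clone feature with an ordinary digital signature, so verification needs only the issuer's public key and no central authority. A minimal sketch using an Ed25519 signature from the Python cryptography package; the optical feature extraction is faked with a placeholder byte string:

    ```python
    from cryptography.exceptions import InvalidSignature
    from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey

    # Issuance: measure the object's random features, serialize, and sign.
    features = b"serialized random optical feature vector"  # placeholder
    issuer_key = Ed25519PrivateKey.generate()
    signature = issuer_key.sign(features)  # stored on the object with the features

    # Off-line verification: re-measure the features, check the signature.
    public_key = issuer_key.public_key()
    try:
        public_key.verify(signature, features)
        print("genuine")
    except InvalidSignature:
        print("counterfeit")
    ```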

  2. Toward a formal verification of a floating-point coprocessor and its composition with a central processing unit

    NASA Technical Reports Server (NTRS)

    Pan, Jing; Levitt, Karl N.; Cohen, Gerald C.

    1991-01-01

    Discussed here is work to formally specify and verify a floating-point coprocessor based on the MC68881. The HOL verification system developed at Cambridge University was used. The coprocessor consists of two independent units: the bus interface unit, used to communicate with the CPU, and the arithmetic processing unit, used to perform the actual calculations. Reasoning about the interaction and synchronization among processes using higher-order logic is demonstrated.

  3. An evaluation of SEASAT-A candidate ocean industry economic verification experiments

    NASA Technical Reports Server (NTRS)

    1977-01-01

    A description of the candidate economic verification experiments which could be performed with SEASAT is provided. Experiments have been identified in each of the areas of ocean-based activity that are expected to show an economic impact from the use of operational SEASAT data. Experiments have been identified in the areas of Arctic operations, the ocean fishing industry, the offshore oil and natural gas industry, as well as ice monitoring and coastal zone applications.

  4. Predicate Abstraction of ANSI-C Programs using SAT

    DTIC Science & Technology

    2003-09-23

    compositionally and automatically. In Alan J. Hu and Moshe Y. Vardi, editors, Computer-Aided Verification, CAV ’98, volume 1427, pages 319-331, Vancouver...Languages, POPL ’77, pages 238-252, 1977. [14] David W. Currie, Alan J. Hu, Sreeranga Rajan, and Masahiro Fujita. Automatic formal verification of DSP...Languages and Systems (TOPLAS), 2(4):564-79, 1980. [19] A. Gupta, Z. Yang, P. Ashar, and A. Gupta. SAT-based image computation with application in

  5. New Aspects of Probabilistic Forecast Verification Using Information Theory

    NASA Astrophysics Data System (ADS)

    Tödter, Julian; Ahrens, Bodo

    2013-04-01

    This work deals with information-theoretical methods in probabilistic forecast verification, particularly concerning ensemble forecasts. Recent findings concerning the "Ignorance Score" are briefly reviewed, and a consistent generalization to continuous forecasts is then motivated. For ensemble-generated forecasts, the presented measures can be calculated exactly. The Brier Score (BS) and its generalizations to the multi-categorical Ranked Probability Score (RPS) and to the Continuous Ranked Probability Score (CRPS) are prominent verification measures for probabilistic forecasts. Particularly attractive are their decompositions into measures quantifying the reliability, resolution and uncertainty of the forecasts. Information theory sets up a natural framework for forecast verification. Recently, it has been shown that the BS is a second-order approximation of the information-based Ignorance Score (IGN), which also contains easily interpretable components and can likewise be generalized to a ranked version (RIGN). Here, the IGN, its generalizations and its decompositions are systematically discussed in analogy to the variants of the BS. Additionally, a Continuous Ranked IGN (CRIGN) is introduced in analogy to the CRPS. The useful properties of the conceptually appealing CRIGN are illustrated, together with an algorithm to evaluate its reliability, resolution, and uncertainty components for ensemble-generated forecasts. This algorithm can also be used to calculate the decomposition of the more traditional CRPS exactly. The applicability of the "new" measures is demonstrated in a small evaluation study of ensemble-based precipitation forecasts.
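
    For a binary event the two central scores are one-liners: the Brier score is the mean squared difference between forecast probability and outcome, and the Ignorance score is the mean negative log2 probability assigned to the outcome that occurred. A minimal sketch:

    ```python
    import numpy as np

    def brier_score(p, occurred):
        """Mean squared difference between forecast probability and the
        0/1 outcome; lower is better."""
        p, o = np.asarray(p, float), np.asarray(occurred, float)
        return float(np.mean((p - o) ** 2))

    def ignorance_score(p, occurred, eps=1e-12):
        """Mean -log2 of the probability given to what occurred; diverges
        if an occurring event was forecast with probability zero."""
        p = np.clip(np.asarray(p, float), eps, 1 - eps)
        o = np.asarray(occurred, bool)
        return float(np.mean(-np.log2(np.where(o, p, 1 - p))))

    p, o = [0.9, 0.2, 0.7], [1, 0, 0]
    print(brier_score(p, o), ignorance_score(p, o))  # 0.18, ~0.74
    ```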

  6. Personal Verification/Identification via Analysis of the Peripheral ECG Leads: Influence of the Personal Health Status on the Accuracy

    PubMed Central

    Bortolan, Giovanni

    2015-01-01

    Traditional means of identity validation (PIN codes, passwords) and physiological and behavioral biometric characteristics (fingerprint, iris, and speech) are susceptible to hacker attacks and/or falsification. This paper presents a method for person verification/identification based on the correlation of present-to-previous limb ECG leads: I (r I), II (r II), and the first principal ECG component calculated from them (r PCA), as well as linear and nonlinear combinations of r I, r II, and r PCA. For the verification task, the one-to-one scenario is applied and threshold values for r I, r II, r PCA and their combinations are derived. The identification task supposes a one-to-many scenario, and the tested subject is identified according to the maximal correlation with a previously recorded ECG in a database. The population-based ECG-ILSA database of 540 patients (147 healthy subjects, 175 patients with cardiac diseases, and 218 with hypertension) has been considered. In addition, a common reference PTB dataset (14 healthy individuals) with a short time interval between the two acquisitions has been taken into account. The results on the ECG-ILSA database were satisfactory for healthy people, and there was not a significant decrease for non-healthy patients, demonstrating the robustness of the proposed method. With the PTB database, the method provides an identification accuracy of 92.9% and a verification sensitivity and specificity of 100% and 89.9%. PMID:26568954
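
    The verification step reduces to a correlation threshold. A minimal sketch for the one-to-one scenario; the threshold value is illustrative, whereas the paper derives its thresholds from the databases:

    ```python
    import numpy as np

    def verify_identity(present_lead, enrolled_lead, threshold=0.95):
        """Accept the claimed identity if the correlation between the
        present and the previously recorded lead exceeds the threshold."""
        r = float(np.corrcoef(present_lead, enrolled_lead)[0, 1])
        return r >= threshold, r

    t = np.linspace(0, 1, 500)
    enrolled = np.sin(2 * np.pi * 1.2 * t) ** 3  # toy 'ECG-like' waveform
    present = enrolled + 0.05 * np.random.default_rng(0).standard_normal(500)
    print(verify_identity(present, enrolled))
    ```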

  7. MESA: Message-Based System Analysis Using Runtime Verification

    NASA Technical Reports Server (NTRS)

    Shafiei, Nastaran; Tkachuk, Oksana; Mehlitz, Peter

    2017-01-01

    In this paper, we present a novel approach and framework for runtime verification of large, safety-critical messaging systems. This work was motivated by verifying the System Wide Information Management (SWIM) project of the Federal Aviation Administration (FAA). SWIM provides live air traffic, site and weather data streams for the whole National Airspace System (NAS), which can easily amount to several hundred messages per second. Such safety-critical systems cannot be instrumented; therefore, verification and monitoring have to happen using a non-intrusive approach, by connecting to a variety of network interfaces. Due to the large number of potential properties to check, the verification framework needs to support efficient formulation of properties with a suitable Domain Specific Language (DSL). Our approach is to utilize a distributed system that is geared towards connectivity and scalability and to interface it at the message queue level to a powerful verification engine. We implemented our approach in the tool called MESA: Message-Based System Analysis, which leverages the open source projects RACE (Runtime for Airspace Concept Evaluation) and TraceContract. RACE is a platform for instantiating and running highly concurrent and distributed systems and enables connectivity to SWIM and scalability. TraceContract is a runtime verification tool that allows for checking traces against properties specified in a powerful DSL. We applied our approach to verify a SWIM service against several requirements. We found errors such as duplicate and out-of-order messages.
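
    Two of the properties mentioned (no duplicates, in-order delivery) make a compact runtime monitor. The sketch below is plain Python rather than the TraceContract DSL, and the (stream id, sequence number) message shape is an assumption:

    ```python
    def check_stream(messages):
        """Non-intrusive monitor over (stream_id, seq) tuples: flags
        duplicate and out-of-order messages per stream."""
        seen, last, errors = set(), {}, []
        for sid, seq in messages:
            if (sid, seq) in seen:
                errors.append(f"duplicate: {sid}/{seq}")
            seen.add((sid, seq))
            if seq < last.get(sid, -1):
                errors.append(f"out-of-order: {sid}/{seq}")
            last[sid] = max(seq, last.get(sid, -1))
        return errors

    print(check_stream([("a", 1), ("a", 2), ("a", 2), ("a", 0)]))
    # ['duplicate: a/2', 'out-of-order: a/0']
    ```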

  8. Warhead verification as inverse problem: Applications of neutron spectrum unfolding from organic-scintillator measurements

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lawrence, Chris C.; Flaska, Marek; Pozzi, Sara A.

    2016-08-14

    Verification of future warhead-dismantlement treaties will require detection of certain warhead attributes without the disclosure of sensitive design information, and this presents an unusual measurement challenge. Neutron spectroscopy—commonly eschewed as an ill-posed inverse problem—may hold special advantages for warhead verification by virtue of its insensitivity to certain neutron-source parameters like plutonium isotopics. In this article, we investigate the usefulness of unfolded neutron spectra obtained from organic-scintillator data for verifying a particular treaty-relevant warhead attribute: the presence of high-explosive and neutron-reflecting materials. Toward this end, several improvements on current unfolding capabilities are demonstrated: deuterated detectors are shown to have superior response-matrix conditioning to that of standard hydrogen-based scintillators; a novel data-discretization scheme is proposed which removes important detector nonlinearities; and a technique is described for re-parameterizing the unfolding problem in order to constrain the parameter space of solutions sought, sidestepping the inverse problem altogether. These improvements are demonstrated with trial measurements and verified using accelerator-based time-of-flight calculation of reference spectra. Then, a demonstration is presented in which the elemental compositions of low-Z neutron-attenuating materials are estimated to within 10%. These techniques could have direct application in verifying the presence of high-explosive materials in a neutron-emitting test item, as well as for other treaty-verification challenges.
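
    Spectrum unfolding inverts a detector response matrix against a measured pulse-height spectrum. A minimal sketch of one standard regularization of this ill-posed step, non-negative least squares on synthetic data, is given below; it is not the authors' re-parameterization technique, and the response matrix here is purely illustrative.

        import numpy as np
        from scipy.optimize import nnls

        rng = np.random.default_rng(1)
        n_ch, n_e = 64, 32
        R = np.abs(rng.normal(size=(n_ch, n_e)))          # toy response matrix
        phi_true = np.exp(-np.linspace(0, 4, n_e))        # "true" neutron spectrum
        m = R @ phi_true + rng.normal(scale=0.01, size=n_ch)   # measured spectrum

        # Solve R @ phi ~= m subject to phi >= 0 (non-negativity regularizes
        # the inversion) and report the relative reconstruction error.
        phi_hat, residual = nnls(R, m)
        print(np.linalg.norm(phi_hat - phi_true) / np.linalg.norm(phi_true))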

  9. Warhead verification as inverse problem: Applications of neutron spectrum unfolding from organic-scintillator measurements

    NASA Astrophysics Data System (ADS)

    Lawrence, Chris C.; Febbraro, Michael; Flaska, Marek; Pozzi, Sara A.; Becchetti, F. D.

    2016-08-01

    Verification of future warhead-dismantlement treaties will require detection of certain warhead attributes without the disclosure of sensitive design information, and this presents an unusual measurement challenge. Neutron spectroscopy—commonly eschewed as an ill-posed inverse problem—may hold special advantages for warhead verification by virtue of its insensitivity to certain neutron-source parameters like plutonium isotopics. In this article, we investigate the usefulness of unfolded neutron spectra obtained from organic-scintillator data for verifying a particular treaty-relevant warhead attribute: the presence of high-explosive and neutron-reflecting materials. Toward this end, several improvements on current unfolding capabilities are demonstrated: deuterated detectors are shown to have superior response-matrix conditioning to that of standard hydrogen-based scintillators; a novel data-discretization scheme is proposed which removes important detector nonlinearities; and a technique is described for re-parameterizing the unfolding problem in order to constrain the parameter space of solutions sought, sidestepping the inverse problem altogether. These improvements are demonstrated with trial measurements and verified using accelerator-based time-of-flight calculation of reference spectra. Then, a demonstration is presented in which the elemental compositions of low-Z neutron-attenuating materials are estimated to within 10%. These techniques could have direct application in verifying the presence of high-explosive materials in a neutron-emitting test item, as well as for other treaty-verification challenges.

  10. Full-chip level MEEF analysis using model based lithography verification

    NASA Astrophysics Data System (ADS)

    Kim, Juhwan; Wang, Lantian; Zhang, Daniel; Tang, Zongwu

    2005-11-01

    MEEF (Mask Error Enhancement Factor) has become a critical factor in CD uniformity control since the optical lithography process moved into the sub-resolution era. Many studies have quantified the impact of mask CD (Critical Dimension) errors on wafer CD errors [1,2]. However, the benefits from those studies were restricted to small pattern areas of the full-chip data due to long simulation times. As fast turnaround time can be achieved for complicated verifications on very large data by linearly scalable distributed processing technology, model-based lithography verification becomes feasible for various types of applications, such as post-mask-synthesis data sign-off for mask tape-out in production and lithography process development with full-chip data [3-5]. In this study, we introduce two useful methodologies for full-chip level verification of mask error impact on the wafer lithography patterning process. One methodology is to check the MEEF distribution in addition to the CD distribution through the process window, which can be used for RET/OPC optimization at the R&D stage. The other is to check mask error sensitivity on potential pinch and bridge hotspots through lithography process variation, where the outputs can be passed on to mask CD metrology to add CD measurements at those hotspot locations. Two different OPC data sets were compared using the two methodologies in this study.
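
    MEEF is the derivative of wafer CD with respect to mask CD quoted at wafer scale, i.e., divided by the reticle magnification. A finite-difference estimate over simulated or measured CD pairs, using hypothetical numbers, looks like this:

        import numpy as np

        def meef(mask_cds, wafer_cds, magnification=4.0):
            # MEEF = d(wafer CD) / d(mask CD at wafer scale), estimated here
            # as the least-squares slope over a set of CD pairs.
            mask_at_wafer = np.asarray(mask_cds) / magnification
            return np.polyfit(mask_at_wafer, wafer_cds, 1)[0]

        mask = np.array([356.0, 360.0, 364.0])    # mask CDs (nm) at 4x reticle scale
        wafer = np.array([87.6, 90.0, 92.4])      # resulting wafer CDs (nm)
        print(meef(mask, wafer))                  # ~2.4: each 1 nm mask error -> 2.4 nm on wafer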

  11. Personal Verification/Identification via Analysis of the Peripheral ECG Leads: Influence of the Personal Health Status on the Accuracy.

    PubMed

    Jekova, Irena; Bortolan, Giovanni

    2015-01-01

    Traditional means for identity validation (PIN codes, passwords), and physiological and behavioral biometric characteristics (fingerprint, iris, and speech) are susceptible to hacker attacks and/or falsification. This paper presents a method for person verification/identification based on the correlation of present-to-previous limb ECG leads I (r I) and II (r II), the first principal ECG component calculated from them (r PCA), and linear and nonlinear combinations of r I, r II, and r PCA. For the verification task, the one-to-one scenario is applied and threshold values for r I, r II, r PCA, and their combinations are derived. The identification task supposes a one-to-many scenario, and the tested subject is identified according to the maximal correlation with a previously recorded ECG in a database. The population-based ECG-ILSA database of 540 patients (147 healthy subjects, 175 patients with cardiac diseases, and 218 with hypertension) has been considered. In addition, a common reference PTB dataset (14 healthy individuals) with a short time interval between the two acquisitions has been taken into account. The results on the ECG-ILSA database were satisfactory for healthy people, and there was no significant decrease for non-healthy patients, demonstrating the robustness of the proposed method. With the PTB database, the method provides an identification accuracy of 92.9% and a verification sensitivity and specificity of 100% and 89.9%.

  12. Development of an integrated, unattended assay system for LWR-MOX fuel pellet trays

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Stewart, J.E.; Hatcher, C.R.; Pollat, L.L.

    1994-08-01

    Four identical unattended plutonium assay systems have been developed for use at the new light-water-reactor mixed oxide (LWR-MOX) fuel fabrication facility at Hanau, Germany. The systems provide quantitative plutonium verification for all MOX pellet trays entering or leaving a large, intermediate store. Pellet-tray transport and storage systems are highly automated. Data from the "I-Point" (information point) assay systems will be shared by the Euratom and International Atomic Energy Agency (IAEA) Inspectorates. The I-Point system integrates, for the first time, passive neutron coincidence counting (NCC) with electro-mechanical sensing (EMS) in unattended mode. Also, provisions have been made for adding high-resolution gamma spectroscopy. The system accumulates data for every tray entering or leaving the store between inspector visits. During an inspection, data are analyzed and compared with operator declarations for the previous inspection period, nominally one month. Specification of the I-Point system resulted from a collaboration between the IAEA, Euratom, Siemens, and Los Alamos. Hardware was developed by Siemens and Los Alamos through a bilateral agreement between the German Federal Ministry of Research and Technology (BMFT) and the US DOE. Siemens also provided the EMS subsystem, including software. Through the US Support Program to the IAEA, Los Alamos developed the NCC software (NCC COLLECT) and also the software for merging and reviewing the EMS and NCC data (MERGE/REVIEW). This paper describes the overall I-Point system, but emphasizes the NCC subsystem, along with the NCC COLLECT and MERGE/REVIEW codes. We also summarize comprehensive testing results that define the quality of assay performance.

  13. Verification of an immunoturbidimetric assay for heart-type fatty acid-binding protein (H-FABP) on a clinical chemistry platform and establishment of the upper reference limit.

    PubMed

    Da Molin, Simona; Cappellini, Fabrizio; Falbo, Rosanna; Signorini, Stefano; Brambilla, Paolo

    2014-11-01

    Heart-type fatty acid-binding protein (H-FABP) is an early biomarker of cardiac injury. Randox Laboratories developed an immunoturbidimetric H-FABP assay for non-proprietary automated clinical chemistry analysers that could be useful in the emergency department. We verified the analytical performance claimed by Randox Laboratories on the Roche Cobas 6000 clinical chemistry platform in use in our laboratory, and we defined our own 99th percentile upper reference limit for H-FABP. For the verification of method performance, we used pools of leftover patient samples from routine testing and two levels of quality control material, while samples for the reference value study were collected from 545 blood donors. Following CLSI guidelines, we verified the limit of blank (LOB), limit of detection (LOD), limit of quantitation (LOQ), repeatability and within-laboratory precision, trueness, linearity, and the stability of H-FABP in EDTA over 24 h. The LOQ (3.19 μg/L) was verified with a CV of 10.4%. The precision was verified for the low (mean 5.88 μg/L, CV=6.7%), the medium (mean 45.28 μg/L, CV=3.0%), and the high concentration (mean 88.81 μg/L, CV=4.0%). The trueness was verified, as was the linearity over the indicated measurement interval of 0.747-120 μg/L. H-FABP in EDTA samples is stable throughout 24 h both at room temperature and at 4 °C. The H-FABP 99th percentile upper reference limit for all subjects (3.60 μg/L, 95% CI 3.51-3.77) is more appropriate than gender-specific ones, which are not statistically different. Copyright © 2014 The Canadian Society of Clinical Chemists. Published by Elsevier Inc. All rights reserved.
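
    Two of the quantities verified here, the imprecision (CV) and the nonparametric 99th percentile upper reference limit, are simple to reproduce. The sketch below uses hypothetical replicate and donor data and omits the confidence-interval estimation around the percentile.

        import numpy as np

        def cv_percent(x):
            # Coefficient of variation in percent for replicate measurements.
            x = np.asarray(x, dtype=float)
            return 100.0 * x.std(ddof=1) / x.mean()

        def upper_reference_limit(values, q=99.0):
            # Nonparametric 99th-percentile upper reference limit.
            return np.percentile(values, q)

        replicates = [5.6, 5.9, 6.1, 5.7, 6.0]                       # hypothetical ug/L
        donors = np.random.default_rng(2).lognormal(0.6, 0.5, 545)   # hypothetical donor values
        print(cv_percent(replicates), upper_reference_limit(donors))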

  14. A process improvement model for software verification and validation

    NASA Technical Reports Server (NTRS)

    Callahan, John; Sabolish, George

    1994-01-01

    We describe ongoing work at the NASA Independent Verification and Validation (IV&V) Facility to establish a process improvement model for software verification and validation (V&V) organizations. This model, similar to those used by some software development organizations, uses measurement-based techniques to identify problem areas and introduce incremental improvements. We seek to replicate this model for organizations involved in V&V on large-scale software development projects such as EOS and Space Station. At the IV&V Facility, a university research group and V&V contractors are working together to collect metrics across projects in order to determine the effectiveness of V&V and improve its application. Since V&V processes are intimately tied to development processes, this paper also examines the repercussions for development organizations in large-scale efforts.

  15. Collaborative Localization and Location Verification in WSNs

    PubMed Central

    Miao, Chunyu; Dai, Guoyong; Ying, Kezhen; Chen, Qingzhang

    2015-01-01

    Localization is one of the most important technologies in wireless sensor networks. A lightweight distributed node localization scheme is proposed by considering the limited computational capacity of WSNs. The proposed scheme introduces the virtual force model to determine the location by incremental refinement. Aiming at solving the drifting problem and the malicious anchor problem, a location verification algorithm based on the virtual force model is presented. In addition, an anchor promotion algorithm using the localization reliability model is proposed to re-locate the drifted nodes. Extended simulation experiments indicate that the localization algorithm has relatively high precision and the location verification algorithm has relatively high accuracy. The communication overhead of these algorithms is relatively low, and the whole set of reliable localization methods is practical as well as comprehensive. PMID:25954948
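
    The virtual force idea can be sketched compactly: each anchor pushes or pulls the position estimate until the estimated distances match the measured ones. This is a generic illustration of incremental refinement, not the paper's exact force model or its verification thresholds; the step size and iteration count are hypothetical.

        import numpy as np

        def virtual_force_locate(anchors, dists, x0, steps=200, gamma=0.05):
            # Each anchor exerts a "virtual force" proportional to the mismatch
            # between its measured distance and the current estimated distance.
            x = np.asarray(x0, dtype=float)
            for _ in range(steps):
                force = np.zeros(2)
                for a, d in zip(anchors, dists):
                    diff = x - a
                    r = np.linalg.norm(diff) + 1e-9
                    force += (d - r) * diff / r   # push out if too close, pull in if too far
                x += gamma * force
            return x

        anchors = np.array([[0.0, 0.0], [10.0, 0.0], [0.0, 10.0]])
        true_pos = np.array([4.0, 3.0])
        dists = np.linalg.norm(anchors - true_pos, axis=1)   # noise-free ranges
        print(virtual_force_locate(anchors, dists, x0=[5.0, 5.0]))   # converges near [4, 3]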

  16. A process improvement model for software verification and validation

    NASA Technical Reports Server (NTRS)

    Callahan, John; Sabolish, George

    1994-01-01

    We describe ongoing work at the NASA Independent Verification and Validation (IV&V) Facility to establish a process improvement model for software verification and validation (V&V) organizations. This model, similar to those used by some software development organizations, uses measurement-based techniques to identify problem areas and introduce incremental improvements. We seek to replicate this model for organizations involved in V&V on large-scale software development projects such as EOS and Space Station. At the IV&V Facility, a university research group and V&V contractors are working together to collect metrics across projects in order to determine the effectiveness of V&V and improve its application. Since V&V processes are intimately tied to development processes, this paper also examines the repercussions for development organizations in large-scale efforts.

  17. Optimized Temporal Monitors for SystemC

    NASA Technical Reports Server (NTRS)

    Tabakov, Deian; Rozier, Kristin Y.; Vardi, Moshe Y.

    2012-01-01

    SystemC is a modeling language built as an extension of C++. Its growing popularity and the increasing complexity of designs have motivated research efforts aimed at the verification of SystemC models using assertion-based verification (ABV), where the designer asserts properties that capture the design intent in a formal language such as PSL or SVA. The model then can be verified against the properties using runtime or formal verification techniques. In this paper we focus on automated generation of runtime monitors from temporal properties. Our focus is on minimizing runtime overhead, rather than monitor size or monitor-generation time. We identify four issues in monitor generation: state minimization, alphabet representation, alphabet minimization, and monitor encoding. We conduct extensive experimentation and identify a combination of settings that offers the best performance in terms of runtime overhead.

  18. Multibody modeling and verification

    NASA Technical Reports Server (NTRS)

    Wiens, Gloria J.

    1989-01-01

    A summary of a ten-week project on flexible multibody modeling, verification, and control is presented. Emphasis was on the need for experimental verification. A literature survey was conducted to gather information on the existence of experimental work related to flexible multibody systems. The first portion of the assigned task encompassed the modeling aspects of flexible multibodies that can undergo large angular displacements. Research in the area of modeling aspects was also surveyed, with special attention given to the component mode approach. Resulting from this is a research plan on various modeling aspects to be investigated over the next year. The relationship between the large angular displacements, boundary conditions, mode selection, and system modes is of particular interest. The other portion of the assigned task was the generation of a test plan for experimental verification of analytical and/or computer analysis techniques used for flexible multibody systems. Based on current and expected frequency ranges of flexible multibody systems to be used in space applications, an initial test article was selected and designed. A preliminary TREETOPS computer analysis was run to ensure frequency content in the low-frequency range, 0.1 to 50 Hz. The initial specifications of experimental measurement and instrumentation components were also generated. Resulting from this effort is the initial multi-phase plan for a Ground Test Facility of Flexible Multibody Systems for Modeling Verification and Control. The plan focuses on the Multibody Modeling and Verification (MMV) Laboratory. General requirements of the Unobtrusive Sensor and Effector (USE) and the Robot Enhancement (RE) laboratories were considered during the laboratory development.

  19. Simple method to verify OPC data based on exposure condition

    NASA Astrophysics Data System (ADS)

    Moon, James; Ahn, Young-Bae; Oh, Sey-Young; Nam, Byung-Ho; Yim, Dong Gyu

    2006-03-01

    In a world where sub-100 nm lithography tools are everyday equipment for device makers, devices are shrinking at a rate no one ever imagined. With devices shrinking at such a rate, the demands placed on Optical Proximity Correction (OPC) are greater than ever. To meet these demands, more aggressive OPC tactics are employed. Aggressive OPC tactics are a must for sub-100 nm lithography technology, but they eventually result in greater room for OPC error and greater complexity of the OPC data. Until now, Optical Rule Check (ORC) or Design Rule Check (DRC) was used to verify this complex OPC error. Each of these methods has its pros and cons. ORC verification of OPC data is rather accurate process-wise, but inspection of a full-chip device requires a lot of money (computers, software, ...) and patience (run time). DRC, however, has no such disadvantage, but the accuracy of its verification is a total downfall process-wise. In this study, we created a new method for OPC data verification that combines the best of both the ORC and DRC verification methods: it inspects the biasing of the OPC data with respect to the illumination condition of the process involved. This new verification method was applied to the ISOLATION and GATE layers of a 512M DRAM device at the 80 nm technology node and showed accuracy equivalent to ORC inspection with a run time comparable to that of DRC verification.

  20. Onsite Gaseous Centrifuge Enrichment Plant UF6 Cylinder Destructive Analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Anheier, Norman C.; Cannon, Bret D.; Qiao, Hong

    2012-07-17

    The IAEA safeguards approach for gaseous centrifuge enrichment plants (GCEPs) includes measurements of gross, partial, and bias defects in a statistical sampling plan. These safeguards methods consist principally of mass and enrichment nondestructive assay (NDA) verification. Destructive assay (DA) samples are collected from a limited number of cylinders for high-precision offsite mass spectrometer analysis. DA is typically used to quantify bias defects in the GCEP material balance. Under current safeguards measures, the operator collects a DA sample from a sample tap following homogenization. The sample is collected in a small UF6 sample bottle, then sealed and shipped under IAEA chain of custody to an offsite analytical laboratory. Current practice is expensive and resource intensive. We propose a new and novel approach for performing onsite gaseous UF6 DA analysis that provides rapid and accurate assessment of enrichment bias defects. DA samples are collected using a custom sampling device attached to a conventional sample tap. A few micrograms of gaseous UF6 are chemically adsorbed onto a sampling coupon in a matter of minutes. The collected DA sample is then analyzed onsite using Laser Ablation Absorption Ratio Spectrometry-Destructive Assay (LAARS-DA). DA results are determined in a matter of minutes at sufficient accuracy to support reliable bias defect conclusions, while greatly reducing DA sample volume, analysis time, and cost.

  1. ICSH guidelines for the verification and performance of automated cell counters for body fluids.

    PubMed

    Bourner, G; De la Salle, B; George, T; Tabe, Y; Baum, H; Culp, N; Keng, T B

    2014-12-01

    One of the many challenges facing laboratories is the verification of their automated Complete Blood Count cell counters for the enumeration of body fluids. These analyzers offer improved accuracy, precision, and efficiency in performing the enumeration of cells compared with manual methods. A patterns of practice survey was distributed to laboratories that participate in proficiency testing in Ontario, Canada, the United States, the United Kingdom, and Japan to determine the number of laboratories that are testing body fluids on automated analyzers and the performance specifications that were performed. Based on the results of this questionnaire, an International Working Group for the Verification and Performance of Automated Cell Counters for Body Fluids was formed by the International Council for Standardization in Hematology (ICSH) to prepare a set of guidelines to help laboratories plan and execute the verification of their automated cell counters to provide accurate and reliable results for automated body fluid counts. These guidelines were discussed at the ICSH General Assemblies and reviewed by an international panel of experts to achieve further consensus. © 2014 John Wiley & Sons Ltd.

  2. The New Payload Handling System for the G erman On- Orbit Verification Satellite TET with the Sensor Bus as Example for Payloads

    NASA Astrophysics Data System (ADS)

    Heyer, H.-V.; Föckersperger, S.; Lattner, K.; Moldenhauer, W.; Schmolke, J.; Turk, M.; Willemsen, P.; Schlicker, M.; Westerdorff, K.

    2008-08-01

    The technology verification satellite TET (Technologie ErprobungsTräger) is the core element of the German On-Orbit Verification (OOV) program for new technologies and techniques. The goal of this program is to support the German space industry and research facilities in the on-orbit verification of satellite technologies. TET is a small satellite developed and built in Germany under the leadership of Kayser-Threde. The satellite bus is based on the successfully operated BIRD satellite; the newly developed payload platform carries the new payload handling system, called NVS (Nutzlastversorgungssystem). The NVS comprises three major parts: the power supply, the processor boards, and the I/O interfaces. The NVS is realized via several Europe-format PCBs, which are connected to each other via an integrated backplane. The payloads are connected to the NVS by front connectors. This paper describes the concept, architecture, and hardware/software of the NVS. Phase B of this project was successfully finished last year.

  3. Experimental verification of a Monte Carlo-based MLC simulation model for IMRT dose calculations in heterogeneous media

    NASA Astrophysics Data System (ADS)

    Tyagi, N.; Curran, B. H.; Roberson, P. L.; Moran, J. M.; Acosta, E.; Fraass, B. A.

    2008-02-01

    IMRT often requires delivering small fields, which may suffer from electronic disequilibrium effects. The presence of heterogeneities, particularly low-density tissues in patients, complicates such situations. In this study, we report on verification of the DPM MC code for IMRT treatment planning in heterogeneous media, using a previously developed model of the Varian 120-leaf MLC. The purpose of this study is twofold: (a) design a comprehensive list of experiments in heterogeneous media for verification of any dose calculation algorithm and (b) verify our MLC model in heterogeneous geometries that mimic an actual patient geometry for IMRT treatment. The measurements have been done using an IMRT head and neck phantom (CIRS phantom) and slab phantom geometries. Verification of the MLC model has been carried out using point doses measured with an A14 slim line (SL) ion chamber inside tissue-equivalent and bone-equivalent materials using the CIRS phantom. Planar doses using lung- and bone-equivalent slabs have been measured and compared using EDR films (Kodak, Rochester, NY).

  4. Weak lensing magnification in the Dark Energy Survey Science Verification data

    NASA Astrophysics Data System (ADS)

    Garcia-Fernandez, M.; Sanchez, E.; Sevilla-Noarbe, I.; Suchyta, E.; Huff, E. M.; Gaztanaga, E.; Aleksić, J.; Ponce, R.; Castander, F. J.; Hoyle, B.; Abbott, T. M. C.; Abdalla, F. B.; Allam, S.; Annis, J.; Benoit-Lévy, A.; Bernstein, G. M.; Bertin, E.; Brooks, D.; Buckley-Geer, E.; Burke, D. L.; Carnero Rosell, A.; Carrasco Kind, M.; Carretero, J.; Crocce, M.; Cunha, C. E.; D'Andrea, C. B.; da Costa, L. N.; DePoy, D. L.; Desai, S.; Diehl, H. T.; Eifler, T. F.; Evrard, A. E.; Fernandez, E.; Flaugher, B.; Fosalba, P.; Frieman, J.; García-Bellido, J.; Gerdes, D. W.; Giannantonio, T.; Gruen, D.; Gruendl, R. A.; Gschwend, J.; Gutierrez, G.; James, D. J.; Jarvis, M.; Kirk, D.; Krause, E.; Kuehn, K.; Kuropatkin, N.; Lahav, O.; Lima, M.; MacCrann, N.; Maia, M. A. G.; March, M.; Marshall, J. L.; Melchior, P.; Miquel, R.; Mohr, J. J.; Plazas, A. A.; Romer, A. K.; Roodman, A.; Rykoff, E. S.; Scarpine, V.; Schubnell, M.; Smith, R. C.; Soares-Santos, M.; Sobreira, F.; Tarle, G.; Thomas, D.; Walker, A. R.; Wester, W.; DES Collaboration

    2018-05-01

    In this paper, the effect of weak lensing magnification on galaxy number counts is studied by cross-correlating the positions of two galaxy samples, separated by redshift, using the Dark Energy Survey Science Verification data set. This analysis is carried out for galaxies that are selected only by their photometric redshift. An extensive analysis of the systematic effects, using new methods based on simulations, is performed, including a Monte Carlo sampling of the selection function of the survey.

  5. Model based verification of the Secure Socket Layer (SSL) Protocol for NASA systems

    NASA Technical Reports Server (NTRS)

    Powell, John D.; Gilliam, David

    2004-01-01

    The National Aeronautics and Space Administration (NASA) has tens of thousands of networked computer systems and applications. Software security vulnerabilities present risks such as lost or corrupted data, information theft, and unavailability of critical systems. These risks represent potentially enormous costs to NASA. The NASA Code Q research initiative 'Reducing Software Security Risk (RSSR) Through an Integrated Approach' offers formal verification of information technology (IT), through the creation of a Software Security Assessment Instrument (SSAI), to address software security risks.

  6. Bayesian truthing as experimental verification of C4ISR sensors

    NASA Astrophysics Data System (ADS)

    Jannson, Tomasz; Forrester, Thomas; Romanov, Volodymyr; Wang, Wenjian; Nielsen, Thomas; Kostrzewski, Andrew

    2015-05-01

    In this paper, a general methodology for experimental verification/validation of the performance of C4ISR and other sensors is presented, based on Bayesian inference in general and binary sensors in particular. This methodology, called Bayesian Truthing, defines performance metrics for binary sensors in physics, optics, electronics, medicine, law enforcement, C3ISR, QC, ATR (Automatic Target Recognition), terrorism-related events, and many other areas. For Bayesian Truthing, the sensing medium itself is not what is truly important; it is how the decision process is affected.
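
    At the heart of such metrics is Bayes' rule for a binary sensor: the probability that a positive detection is real depends on sensitivity, specificity, and prevalence. A minimal sketch with hypothetical sensor numbers:

        def posterior_positive(sensitivity, specificity, prevalence):
            # P(event | positive detection), i.e., the positive predictive value.
            p_pos = sensitivity * prevalence + (1 - specificity) * (1 - prevalence)
            return sensitivity * prevalence / p_pos

        # Hypothetical ATR sensor: 95% sensitivity, 98% specificity, 1% prevalence.
        # Even a good sensor yields only ~32% confidence per positive at low prevalence.
        print(posterior_positive(0.95, 0.98, 0.01))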

  7. Software for Statistical Analysis of Weibull Distributions with Application to Gear Fatigue Data: User Manual with Verification

    NASA Technical Reports Server (NTRS)

    Krantz, Timothy L.

    2002-01-01

    The Weibull distribution has been widely adopted for the statistical description and inference of fatigue data. This document provides user instructions, examples, and verification for software to analyze gear fatigue test data. The software was developed presuming the data are adequately modeled using a two-parameter Weibull distribution. The calculations are based on likelihood methods, and the approach taken is valid for data that include type I censoring. The software was verified by reproducing results published by others.
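
    A two-parameter Weibull fit by maximum likelihood with type I censoring can be sketched as follows: failures contribute the log-density, and runouts censored at the test cutoff contribute the log-survival term. The data here are synthetic; this does not reproduce the NASA software's exact routines.

        import numpy as np
        from scipy.optimize import minimize

        def weibull_negloglik(params, t, censored):
            # Negative log-likelihood: density for failures, survival for runouts.
            beta, eta = params
            if beta <= 0 or eta <= 0:
                return np.inf
            z = (t / eta) ** beta
            log_f = np.log(beta / eta) + (beta - 1) * np.log(t / eta) - z
            return -(np.sum(log_f[~censored]) + np.sum(-z[censored]))

        rng = np.random.default_rng(3)
        life = rng.weibull(2.0, size=50) * 100.0     # true shape 2, scale 100
        censor_time = 120.0                          # type I censoring at fixed time
        censored = life > censor_time
        t = np.minimum(life, censor_time)
        fit = minimize(weibull_negloglik, x0=[1.0, np.median(t)],
                       args=(t, censored), method="Nelder-Mead")
        print(fit.x)   # roughly [2, 100]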

  8. Non-Lethal Weapons Effectiveness Assessment Development and Verification Study (Etude d’evaluation, de developpement et de verification de l’efficacite des armes non letales)

    DTIC Science & Technology

    2009-10-01

    will guarantee a solid base for the future. The content of this publication has been reproduced directly from material supplied by RTO or the...intensity threat involving a local population wanting to break into the camp to steal material and food supplies; and • A higher intensity threat...combatant evacuation operations, distribute emergency supplies, and evacuate/relocate refugees and displaced persons. Specified NLW-relevant tasks are

  9. The potential for the indirect crystal structure verification of methyl glycosides based on acetates' parent structures: GIPAW and solid-state NMR approaches

    NASA Astrophysics Data System (ADS)

    Szeleszczuk, Łukasz; Gubica, Tomasz; Zimniak, Andrzej; Pisklak, Dariusz M.; Dąbrowska, Kinga; Cyrański, Michał K.; Kańska, Marianna

    2017-10-01

    A convenient method for the indirect crystal structure verification of methyl glycosides was demonstrated. Single-crystal X-ray diffraction structures of methyl glycoside acetates were deacetylated and subsequently subjected to DFT calculations under periodic boundary conditions. Solid-state NMR spectroscopy served as a guide for the calculations. A high level of accuracy of the modelled crystal structures of methyl glycosides was confirmed by comparison with the published results of a neutron diffraction study, using the RMSD method.
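
    The final comparison step reduces to an RMSD between calculated and experimental quantities. A minimal sketch with hypothetical 13C chemical shifts:

        import numpy as np

        def rmsd(calculated, experimental):
            # Root-mean-square deviation between GIPAW-calculated and
            # experimental solid-state NMR chemical shifts (ppm).
            d = np.asarray(calculated) - np.asarray(experimental)
            return np.sqrt(np.mean(d ** 2))

        calc = [101.2, 73.8, 71.4, 70.1, 61.9, 55.3]   # hypothetical calculated shifts
        expt = [100.6, 74.1, 71.9, 69.8, 62.3, 55.0]   # hypothetical experimental shifts
        print(round(rmsd(calc, expt), 2))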

  10. Verification of Sulfate Attack Penetration Rates for Saltstone Disposal Unit Modeling

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Flach, G. P.

    Recent Special Analysis modeling of Saltstone Disposal Units considers sulfate attack on concrete and utilizes degradation rates estimated from Cementitious Barriers Partnership software simulations. This study provides an independent verification of those simulation results using an alternative analysis method and an independent characterization data source. The sulfate penetration depths estimated herein are similar to the best-estimate values in SRNL-STI-2013-00118 Rev. 2 and well below the nominal values subsequently used to define the Saltstone Special Analysis base cases.

  11. Software for Statistical Analysis of Weibull Distributions with Application to Gear Fatigue Data: User Manual with Verification

    NASA Technical Reports Server (NTRS)

    Krantz, Timothy L.

    2002-01-01

    The Weibull distribution has been widely adopted for the statistical description and inference of fatigue data. This document provides user instructions, examples, and verification for software to analyze gear fatigue test data. The software was developed presuming the data are adequately modeled using a two-parameter Weibull distribution. The calculations are based on likelihood methods, and the approach taken is valid for data that include type I censoring. The software was verified by reproducing results published by others.

  12. Risk-Based Tailoring of the Verification, Validation, and Accreditation/Acceptance Processes (Adaptation fondee sur le risque, des processus de verification, de validation, et d’accreditation/d’acceptation)

    DTIC Science & Technology

    2012-04-01

    [Acronym-list fragment] ...Systems Concepts and Integration; SET, Sensors and Electronics Technology; SISO, Simulation Interoperability Standards Organization; SIW, Simulation... [Timeline fragment] ...conjunction with the Fall 2006 SIW; September 2006, SISO Standards Activity Committee approved beginning IEEE balloting; October 2006, IEEE Project ...019 published; June 2008, Edinburgh, UK, held in conjunction with the 2008 Euro-SIW; September 2008, Laurel, MD, US; work on Composite Model, December 2008

  13. Description of a Computer Program Written for Approach and Landing Test Post Flight Data Extraction of Proximity Separation Aerodynamic Coefficients and Aerodynamic Data Base Verification

    NASA Technical Reports Server (NTRS)

    Homan, D. J.

    1977-01-01

    A computer program written to calculate the proximity aerodynamic force and moment coefficients of the Orbiter/Shuttle Carrier Aircraft (SCA) vehicles based on flight instrumentation is described. The Ground Reduced Aerodynamic Coefficients and Instrumentation Errors (GRACIE) program was developed as a tool to aid in flight test verification of the Orbiter/SCA separation aerodynamic data base. The program calculates the force and moment coefficients of each vehicle in proximity to the other, using the load measurement system data, flight instrumentation data, and the vehicle mass properties. The uncertainty in each coefficient is determined based on the quoted instrumentation accuracies. A subroutine manipulates the Orbiter/747 Carrier Separation Aerodynamic Data Book to calculate a comparable set of predicted coefficients for comparison to the calculated flight test data.
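
    The coefficient calculation and its uncertainty follow the standard definitions. A minimal sketch, with hypothetical load and dynamic-pressure values and first-order (quadrature) error propagation from the quoted instrument accuracies:

        import numpy as np

        def force_coefficient(F, q, S, dF, dq):
            # C = F / (q S); first-order uncertainty from instrument accuracies.
            C = F / (q * S)
            dC = abs(C) * np.sqrt((dF / F) ** 2 + (dq / q) ** 2)
            return C, dC

        F, dF = 52000.0, 400.0     # hypothetical measured load (N) and its accuracy
        q, dq = 9600.0, 50.0       # hypothetical dynamic pressure (Pa) and its accuracy
        S = 249.9                  # reference area (m^2)
        print(force_coefficient(F, q, S, dF, dq))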

  14. Using Model Replication to Improve the Reliability of Agent-Based Models

    NASA Astrophysics Data System (ADS)

    Zhong, Wei; Kim, Yushim

    The basic presupposition of model replication activities for a computational model such as an agent-based model (ABM) is that, as a robust and reliable tool, it must be replicable in other computing settings. This assumption has recently gained attention in the community of artificial society and simulation due to the challenges of model verification and validation. Illustrating the replication, in NetLogo and by a different author, of an ABM representing fraudulent behavior in a public service delivery system originally developed in the Java-based MASON toolkit, this paper exemplifies how model replication exercises provide unique opportunities for the model verification and validation process. At the same time, it helps accumulate best practices and patterns of model replication and contributes to the agenda of developing a standard methodological protocol for agent-based social simulation.

  15. SU-D-BRC-03: Development and Validation of an Online 2D Dose Verification System for Daily Patient Plan Delivery Accuracy Check

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zhao, J; Hu, W; Xing, Y

    Purpose: All plan verification systems for particle therapy are designed to do plan verification before treatment. However, the actual dose distributions during patient treatment are not known. This study develops an online 2D dose verification tool to check the daily dose delivery accuracy. Methods: A Siemens particle treatment system with a modulated scanning spot beam is used in our center. In order to do online dose verification, we developed a program to reconstruct the delivered 2D dose distributions based on the daily treatment log files and depth dose distributions. From the log files we can get the focus size, position, and particle number for each spot. A gamma analysis is used to compare the reconstructed dose distributions with the dose distributions from the TPS to assess the daily dose delivery accuracy. To verify the dose reconstruction algorithm, we compared the reconstructed dose distributions to dose distributions measured using a PTW 729XDR ion chamber matrix for 13 real patient plans. Then we analyzed 100 treatment beams (58 carbon and 42 proton) for prostate, lung, ACC, NPC, and chordoma patients. Results: For algorithm verification, the gamma passing rate was 97.95% for the 3%/3mm and 92.36% for the 2%/2mm criteria. For the patient treatment analysis, the results were 97.7%±1.1% and 91.7%±2.5% for carbon and 89.9%±4.8% and 79.7%±7.7% for proton using the 3%/3mm and 2%/2mm criteria, respectively. The reason for the lower passing rate for the proton beam is that the focus size deviations were larger than for the carbon beam. The average focus size deviations were −14.27% and −6.73% for proton and −5.26% and −0.93% for carbon in the x and y directions, respectively. Conclusion: The verification software meets our requirements to check for daily dose delivery discrepancies. Such tools can enhance the current treatment plan and delivery verification processes and improve the safety of clinical treatments.
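
    Gamma analysis is the comparison metric throughout. A simplified 1D global gamma computation (e.g., the 3%/3 mm criterion) over a reconstructed versus planned profile looks like the sketch below; clinical tools operate on 2D/3D grids with interpolation, and the profiles here are synthetic.

        import numpy as np

        def gamma_pass_rate(dose_ref, dose_eval, coords, dd=0.03, dta=3.0):
            # Global gamma: for each reference point, minimize the combined
            # dose-difference / distance-to-agreement metric over all points.
            d_norm = dd * dose_ref.max()
            passed = 0
            for xr, dr in zip(coords, dose_ref):
                dose_term = (dose_eval - dr) / d_norm
                dist_term = (coords - xr) / dta
                gamma = np.sqrt(dose_term ** 2 + dist_term ** 2).min()
                passed += gamma <= 1.0
            return 100.0 * passed / len(dose_ref)

        x = np.linspace(0, 100, 201)                   # positions (mm)
        ref = np.exp(-((x - 50) / 20) ** 2)            # planned profile
        ev = np.exp(-((x - 50.5) / 20) ** 2) * 1.01    # slightly shifted, scaled delivery
        print(gamma_pass_rate(ref, ev, x))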

  16. Improvement on post-OPC verification efficiency for contact/via coverage check by final CD biasing of metal lines and considering their location on the metal layout

    NASA Astrophysics Data System (ADS)

    Kim, Youngmi; Choi, Jae-Young; Choi, Kwangseon; Choi, Jung-Hoe; Lee, Sooryong

    2011-04-01

    As IC design complexity keeps increasing, it is more and more difficult to ensure pattern transfer after optical proximity correction (OPC) due to the continuous reduction of layout dimensions and the lithographic limitation of the k1 factor. To guarantee imaging fidelity, resolution enhancement technologies (RET) such as off-axis illumination (OAI), different types of phase shift masks, and OPC techniques have been developed. For model-based OPC, post-OPC verification solutions that cross-confirm the contour image against the target layout continue to be developed: contour generation and matching to the target structure, and methods for filtering and sorting patterns to eliminate false errors and duplicates. Detecting only real errors by excluding false errors is the most important requirement for an accurate and fast verification process, saving not only review time and engineering resources but also overall wafer process time. In the general case of post-OPC verification for metal-contact/via coverage (CC) checks, the verification solution outputs a huge number of errors due to borderless design, so it is too difficult to review and correct all of them. This may cause the OPC engineer to miss real defects, and at the least it delays time to market. In this paper, we studied methods for increasing the efficiency of post-OPC verification, especially for CC checks. For metal layers, the final CD after the etch process shows various biases depending on the distance to neighboring patterns, so it is more reasonable to consider the final metal shape when confirming contact/via coverage. Through optimization of the biasing rule for different pitches and shapes of metal lines, we obtained more accurate and efficient verification results and decreased the review time needed to find real errors. We present a suggestion for increasing the efficiency of the OPC verification process by applying a simple biasing rule to the metal layout instead of an etch model.
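
    The proposed check can be pictured as a pitch-dependent bias rule applied to the drawn metal before the coverage test. The bias and margin values below are hypothetical; real rules come from measured etch bias versus space.

        def biased_metal_cd(drawn_cd, space_to_neighbor):
            # Hypothetical pitch-dependent final-CD bias (nm): dense metal
            # etches smaller than isolated metal.
            if space_to_neighbor < 100:
                return drawn_cd - 8      # dense
            if space_to_neighbor < 300:
                return drawn_cd - 4      # semi-dense
            return drawn_cd              # isolated

        def via_covered(via_edge, metal_cd, overlay_margin=2):
            # Coverage check against the biased (post-etch) metal width.
            return metal_cd >= via_edge + 2 * overlay_margin

        # A 70 nm drawn line at 80 nm space biases to 62 nm, which fails to
        # cover a 60 nm via with 2 nm overlay margin per side.
        print(via_covered(60, biased_metal_cd(70, 80)))   # False -> real CC error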

  17. Control/structure interaction design methodology

    NASA Technical Reports Server (NTRS)

    Briggs, Hugh C.; Layman, William E.

    1989-01-01

    The Control Structure Interaction Program is a technology development program for spacecraft that exhibit interactions between the control system and structural dynamics. The program objectives include development and verification of new design concepts (such as active structure) and new tools (such as a combined structure and control optimization algorithm) and their verification in ground and possibly flight tests. The new CSI design methodology is centered around interdisciplinary engineers using new tools that closely integrate structures and controls. Verification is an important CSI theme, and analysts will be closely integrated with the CSI Test Bed laboratory. Components, concepts, tools, and algorithms will be developed and tested in the lab and in future Shuttle-based flight experiments. The design methodology is summarized in block diagrams depicting the evolution of a spacecraft design, together with descriptions of the analytical capabilities used in the process. The multiyear JPL CSI implementation plan is described, along with the essentials of several new tools. A distributed network of computation servers and workstations was designed to provide a state-of-the-art development base for the CSI technologies.

  18. Reconstruction based finger-knuckle-print verification with score level adaptive binary fusion.

    PubMed

    Gao, Guangwei; Zhang, Lei; Yang, Jian; Zhang, Lin; Zhang, David

    2013-12-01

    Recently, a new biometric identifier, the finger knuckle print (FKP), has been proposed for personal authentication with very interesting results. One of the advantages of FKP verification lies in its user friendliness in data collection. However, the user's flexibility in positioning fingers also leads to a certain degree of pose variation in the collected query FKP images. The widely used Gabor-filtering-based competitive coding scheme is sensitive to such variations, resulting in many false rejections. We propose to alleviate this problem by reconstructing the query sample with a dictionary learned from the template samples in the gallery set. The reconstructed FKP image can greatly reduce the enlarged matching distance caused by finger pose variations; however, both the intra-class and inter-class distances are reduced. We then propose a score-level adaptive binary fusion rule to adaptively fuse the matching distances before and after reconstruction, aiming to reduce false rejections without increasing false acceptances much. Experimental results on the benchmark PolyU FKP database show that the proposed method significantly improves FKP verification accuracy.
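
    The fusion step is score-level logic. The sketch below is a generic illustration of adaptive binary fusion of pre- and post-reconstruction matching distances, with hypothetical thresholds; the paper's actual rule is derived from its training data.

        def fused_decision(d_orig, d_recon, t_orig, t_recon, t_confident):
            # Matching distances: smaller means closer to the gallery template.
            if d_orig <= t_confident:
                return True              # decisively genuine before reconstruction
            if d_orig > t_orig and d_recon > t_recon:
                return False             # both channels reject
            return d_recon <= t_recon    # reconstruction arbitrates borderline cases

        # A borderline original distance is rescued by the reconstructed one.
        print(fused_decision(0.42, 0.28, t_orig=0.40, t_recon=0.30, t_confident=0.20))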

  19. Hand Grasping Synergies As Biometrics.

    PubMed

    Patel, Vrajeshri; Thukral, Poojita; Burns, Martin K; Florescu, Ionut; Chandramouli, Rajarathnam; Vinjamuri, Ramana

    2017-01-01

    Recently, the need for more secure identity verification systems has driven researchers to explore other sources of biometrics. These include iris patterns, palm print, hand geometry, facial recognition, and movement patterns (hand motion, gait, and eye movements). Identity verification systems may benefit from the complexity of human movement, which integrates multiple levels of control (neural, muscular, and kinematic). Using principal component analysis, we extracted spatiotemporal hand synergies (movement synergies) from an object grasping dataset to explore their use as a potential biometric. These movement synergies are in the form of joint angular velocity profiles of 10 joints. We explored the effect of joint type, digit, number of objects, and grasp type. In its best configuration, movement synergies achieved an equal error rate of 8.19%. While movement synergies can be integrated into an identity verification system with motion capture ability, we also explored a camera-ready version of hand synergies: postural synergies. In this proof-of-concept system, postural synergies performed well, but only when specific postures were chosen. Based on these results, hand synergies show promise as a potential biometric that can be combined with other hand-based biometrics for improved security.
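
    The equal error rate quoted above is the operating point where false rejections and false acceptances balance. A generic sketch of its computation from genuine and impostor score samples (synthetic here) follows:

        import numpy as np

        def equal_error_rate(genuine, impostor):
            # Sweep a threshold over all observed scores and find where the
            # false rejection rate equals the false acceptance rate.
            thresholds = np.sort(np.concatenate([genuine, impostor]))
            best_gap, eer = 1.0, None
            for t in thresholds:
                frr = np.mean(genuine < t)     # genuine scores rejected
                far = np.mean(impostor >= t)   # impostor scores accepted
                if abs(frr - far) < best_gap:
                    best_gap, eer = abs(frr - far), (frr + far) / 2
            return eer

        rng = np.random.default_rng(4)
        genuine = rng.normal(0.8, 0.10, 500)    # hypothetical synergy-match scores
        impostor = rng.normal(0.5, 0.15, 500)
        print(equal_error_rate(genuine, impostor))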

  20. Approaching the investigation of plasma turbulence through a rigorous verification and validation procedure: A practical example

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ricci, P., E-mail: paolo.ricci@epfl.ch; Riva, F.; Theiler, C.

    In the present work, a Verification and Validation procedure is presented and applied, showing through a practical example how it can contribute to advancing our physics understanding of plasma turbulence. Bridging the gap between plasma physics and other scientific domains, in particular the computational fluid dynamics community, a rigorous methodology for the verification of a plasma simulation code is presented, based on the method of manufactured solutions. This methodology assesses that the model equations are correctly solved, within the order of accuracy of the numerical scheme. The technique to carry out a solution verification is described to provide a rigorous estimate of the uncertainty affecting the numerical results. A methodology for plasma turbulence code validation is also discussed, focusing on quantitative assessment of the agreement between experiments and simulations. The Verification and Validation methodology is then applied to the study of plasma turbulence in the basic plasma physics experiment TORPEX [Fasoli et al., Phys. Plasmas 13, 055902 (2006)], considering both two-dimensional and three-dimensional simulations carried out with the GBS code [Ricci et al., Plasma Phys. Controlled Fusion 54, 124047 (2012)]. The validation procedure allows progress in the understanding of the turbulent dynamics in TORPEX, by pinpointing the presence of a turbulent regime transition, due to the competition between the resistive and ideal interchange instabilities.
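
    The method of manufactured solutions can be demonstrated in a few lines: pick an analytic solution, derive the source term it implies, and confirm that the discrete operator converges at the scheme's theoretical order. The sketch below does this for a 1D central-difference Laplacian; it is unrelated to the GBS equations but shows the same verification logic.

        import numpy as np

        # Manufactured solution u(x) = sin(pi x) implies the source
        # f = u'' = -pi^2 sin(pi x); the central-difference error should
        # shrink at second order as the grid is refined.
        def discrete_error(n):
            x = np.linspace(0.0, 1.0, n + 1)
            h = x[1] - x[0]
            u = np.sin(np.pi * x)
            f = -np.pi ** 2 * np.sin(np.pi * x)
            lap = (u[:-2] - 2 * u[1:-1] + u[2:]) / h ** 2   # central 2nd difference
            return np.max(np.abs(lap - f[1:-1]))

        e1, e2 = discrete_error(32), discrete_error(64)
        print(np.log2(e1 / e2))   # observed order of accuracy, ~2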

  1. Online fingerprint verification.

    PubMed

    Upendra, K; Singh, S; Kumar, V; Verma, H K

    2007-01-01

    As organizations search for more secure authentication methods for user access, e-commerce, and other security applications, biometrics is gaining increasing attention. With an increasing emphasis on emerging automatic personal identification applications, fingerprint-based identification is becoming more popular. The most widely used fingerprint representation is the minutiae-based representation. The main drawback of this representation is that it does not utilize a significant component of the rich discriminatory information available in fingerprints. Local ridge structures cannot be completely characterized by minutiae. Also, it is difficult to quickly match two fingerprint images containing different numbers of unregistered minutiae points. In this study, a filter-bank-based representation, which eliminates these weaknesses, is implemented and the overall performance of the developed system is tested. The results have shown that this system can be used effectively for secure online verification applications.

  2. Analytical Formulation for Sizing and Estimating the Dimensions and Weight of Wind Turbine Hub and Drivetrain Components

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Guo, Y.; Parsons, T.; King, R.

    This report summarizes the theory, verification, and validation of a new sizing tool for wind turbine drivetrain components, the Drivetrain Systems Engineering (DriveSE) tool. DriveSE calculates the dimensions and mass properties of the hub, main shaft, main bearing(s), gearbox, bedplate, transformer if up-tower, and yaw system. The level of fidelity for each component varies depending on whether semiempirical parametric or physics-based models are used. The physics-based models have internal iteration schemes based on system constraints and design criteria. Every model is validated against available industry data or finite-element analysis. The verification and validation results show that the models reasonably capture the primary drivers for the sizing and design of major drivetrain components.
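
    As a flavor of the physics-based closures such a sizing tool uses (illustrative only; these are not DriveSE's actual equations or parameter values), the torsional-stress criterion gives a minimum main-shaft diameter directly from rated torque:

        import numpy as np

        def shaft_diameter(torque, tau_allow, safety=1.5):
            # Minimum solid-shaft diameter from tau = 16 T / (pi d^3),
            # solved for d with a safety factor on the applied torque.
            return (16.0 * safety * torque / (np.pi * tau_allow)) ** (1.0 / 3.0)

        # Hypothetical turbine: 4.1 MN*m rated torque, 60 MPa allowable shear.
        print(shaft_diameter(4.1e6, 60e6))   # ~0.8 m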

  3. A Design Verification of the Parallel Pipelined Image Processings

    NASA Astrophysics Data System (ADS)

    Wasaki, Katsumi; Harai, Toshiaki

    2008-11-01

    This paper presents a case study of the design and verification of a parallel and pipelined image processing unit based on an extended Petri net, called a Logical Colored Petri net (LCPN). This is suitable for Flexible Manufacturing System (FMS) modeling and discussion of structural properties. LCPN is another family of colored place/transition nets (CPN) with the addition of the following features: integer value assignment to marks, representation of firing conditions as formulae based on mark values, and coupling of output procedures with transition firing. Therefore, to study the behavior of a system modeled with this net, we provide a means of searching the reachability tree for markings.

  4. Validation of a SysML based design for wireless sensor networks

    NASA Astrophysics Data System (ADS)

    Berrachedi, Amel; Rahim, Messaoud; Ioualalen, Malika; Hammad, Ahmed

    2017-07-01

    When developing complex systems, verification of the systems' design is one of the main challenges. Wireless Sensor Networks (WSNs) are examples of such systems. We address the problem of how WSNs must be designed to fulfil the system requirements. Using the SysML language, we propose a Model-Based System Engineering (MBSE) specification and verification methodology for designing WSNs. This methodology uses SysML to describe the WSN requirements, structure, and behaviour. It then translates the SysML elements into an analytic model, specifically a Deterministic and Stochastic Petri Net. The proposed approach allows designers to model WSNs and study their behaviour and energy performance.

  5. Verification and Validation of Autonomy Software at NASA

    NASA Technical Reports Server (NTRS)

    Pecheur, Charles

    2000-01-01

    Autonomous software holds the promise of new operation possibilities, easier design and development, and lower operating costs. However, as those systems close control loops and arbitrate resources on board with specialized reasoning, the range of possible situations becomes very large and uncontrollable from the outside, making conventional scenario-based testing very inefficient. Analytic verification and validation (V&V) techniques, and model checking in particular, can provide significant help for designing autonomous systems in a more efficient and reliable manner, by providing better coverage and allowing early error detection. This article discusses the general issue of V&V of autonomy software, with an emphasis towards model-based autonomy, model-checking techniques, and concrete experiments at NASA.

  6. Verification and Validation of Autonomy Software at NASA

    NASA Technical Reports Server (NTRS)

    Pecheur, Charles

    2000-01-01

    Autonomous software holds the promise of new operation possibilities, easier design and development, and lower operating costs. However, as those systems close control loops and arbitrate resources on-board with specialized reasoning, the range of possible situations becomes very large and uncontrollable from the outside, making conventional scenario-based testing very inefficient. Analytic verification and validation (V&V) techniques, and model checking in particular, can provide significant help for designing autonomous systems in a more efficient and reliable manner, by providing better coverage and allowing early error detection. This article discusses the general issue of V&V of autonomy software, with an emphasis towards model-based autonomy, model-checking techniques, and concrete experiments at NASA.

  7. Verification and Validation of Adaptive and Intelligent Systems with Flight Test Results

    NASA Technical Reports Server (NTRS)

    Burken, John J.; Larson, Richard R.

    2009-01-01

    F-15 IFCS project goals are: a) Demonstrate Control Approaches that can Efficiently Optimize Aircraft Performance in both Normal and Failure Conditions ([A] and [B] failures). b) Advance Neural Network-Based Flight Control Technology for New Aerospace Systems Designs with a Pilot in the Loop. Gen II objectives include: a) Implement and Fly a Direct Adaptive Neural Network Based Flight Controller; b) Demonstrate the Ability of the System to Adapt to Simulated System Failures: 1) Suppress Transients Associated with Failure; 2) Re-Establish Sufficient Control and Handling of Vehicle for Safe Recovery; c) Provide Flight Experience for Development of Verification and Validation Processes for Flight Critical Neural Network Software.

  8. SU-E-T-48: A Multi-Institutional Study of Independent Dose Verification for Conventional, SRS and SBRT

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Takahashi, R; Kamima, T; Tachibana, H

    2015-06-15

    Purpose: To show the results of a multi-institutional study of independent dose verification for conventional, stereotactic radiosurgery, and stereotactic body radiotherapy (SRS and SBRT) plans based on the action level of AAPM TG-114. Methods: This study was performed at 12 institutions in Japan. To eliminate bias in the independent dose verification program (Indp), all of the institutions used the same CT-based independent dose verification software (Simple MU Analysis, Triangle Products, JP) with the Clarkson-based algorithm. Eclipse (AAA, PBC), Pinnacle³ (Adaptive Convolve), and Xio (Superposition) were used as treatment planning systems (TPS). The confidence limits (CL, mean±2SD) for 18 sites (head, breast, lung, pelvis, etc.) were evaluated by comparing the dose between the TPS and the Indp. Results: A retrospective analysis of 6352 treatment fields was conducted. The CLs for conventional, SRS, and SBRT were 1.0±3.7%, 2.0±2.5%, and 6.2±4.4%, respectively. In conventional plans, most of the sites were within the 5% TG-114 action level. However, there were systematic differences (4.0±4.0% and 2.5±5.8% for breast and lung, respectively). In SRS plans, our results showed good agreement compared to the action level. In SBRT plans, the discrepancy from the Indp varied depending on the dose calculation algorithm of the TPS. Conclusion: The impact of the dose calculation algorithms of the TPS and the Indp affects the action level. It is effective to set site-specific tolerances, especially for sites where inhomogeneity correction can strongly affect the dose distribution.
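
    The confidence-limit statistic used above is straightforward to compute from the per-field percent dose differences. A minimal sketch with hypothetical deviations:

        import numpy as np

        def confidence_limit(deviations):
            # TG-114-style confidence limit: mean +/- 2 SD of the percent
            # dose differences between the TPS and the independent check.
            d = np.asarray(deviations, dtype=float)
            return d.mean(), 2.0 * d.std(ddof=1)

        mean, two_sd = confidence_limit([0.4, -1.2, 2.1, 0.8, -0.3, 1.5])
        print(f"CL = {mean:.1f}% +/- {two_sd:.1f}%")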

  9. Technical experiences of implementing a wireless tracking and facial biometric verification system for a clinical environment

    NASA Astrophysics Data System (ADS)

    Liu, Brent; Lee, Jasper; Documet, Jorge; Guo, Bing; King, Nelson; Huang, H. K.

    2006-03-01

    By implementing a tracking and verification system, clinical facilities can effectively monitor workflow and heighten information security amid today's growing demand for digital imaging informatics. This paper presents the technical design and implementation experiences encountered during the development of a Location Tracking and Verification System (LTVS) for a clinical environment. LTVS integrates facial biometrics with wireless tracking so that administrators can manage and monitor patients and staff through a web-based application. Implementation challenges fall into three main areas: 1) development and integration, 2) calibration and optimization of the Wi-Fi tracking system, and 3) clinical implementation. An initial prototype LTVS has been implemented within USC's Healthcare Consultation Center II Outpatient Facility, which currently has a fully digital imaging department environment with integrated HIS/RIS/PACS/VR (Voice Recognition).

  10. Validation and verification of expert systems

    NASA Technical Reports Server (NTRS)

    Gilstrap, Lewey

    1991-01-01

    Validation and verification (V&V) are procedures used to evaluate system structure or behavior with respect to a set of requirements. Although expert systems are often developed as a series of prototypes without requirements, it is not possible to perform V&V on any system for which requirements have not been prepared. In addition, there are special problems associated with the evaluation of expert systems that do not arise in the evaluation of conventional systems, such as verification of the completeness and accuracy of the knowledge base. The criticality of most NASA missions makes it important to be able to certify the performance of the expert systems used to support these missions. Recommendations for the most appropriate method for integrating V&V into the Expert System Development Methodology (ESDM) and suggestions for the most suitable approaches for each stage of ESDM development are presented.

  11. A software engineering approach to expert system design and verification

    NASA Technical Reports Server (NTRS)

    Bochsler, Daniel C.; Goodwin, Mary Ann

    1988-01-01

    Software engineering design and verification methods for developing expert systems are not yet well defined. Integration of expert system technology into software production environments will require effective software engineering methodologies to support the entire life cycle of expert systems. The software engineering methods used to design and verify an expert system, RENEX, are discussed. RENEX demonstrates autonomous rendezvous and proximity operations, including replanning trajectory events and subsystem fault detection, onboard a space vehicle during flight. The RENEX designers utilized a number of software engineering methodologies to deal with the complex problems inherent in this system. An overview is presented of the methods utilized. Details of the verification process receive special emphasis. The benefits and weaknesses of the methods for supporting the development life cycle of expert systems are evaluated, and recommendations are made based on the overall experiences with the methods.

  12. Verification of the FtCayuga fault-tolerant microprocessor system. Volume 2: Formal specification and correctness theorems

    NASA Technical Reports Server (NTRS)

    Bickford, Mark; Srivas, Mandayam

    1991-01-01

    Presented here is a formal specification and verification of a property of a quadruplicately redundant fault-tolerant microprocessor system design. A complete listing of the formal specification of the system and the correctness theorems that were proved is given. The system performs the task of obtaining interactive consistency among the processors using a special instruction on the processors. The design is based on an algorithm proposed by Pease, Shostak, and Lamport. The verified property ensures that an execution of the special instruction by the processors correctly accomplishes interactive consistency, provided certain preconditions hold. The verification was performed using a computer-aided design verification tool, Spectool, and the theorem prover Clio. A major contribution of the work is the demonstration of a significant fault-tolerant hardware design that is mechanically verified by a theorem prover.
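
    The interactive-consistency idea behind the verified design can be sketched in a few lines: with four processors and at most one fault, majority voting over relayed claims gives all non-faulty processors the same value vector. The toy below assumes an equivocating faulty source and honest relays; it illustrates the Pease-Shostak-Lamport algorithm, not the FtCayuga hardware.

    ```python
    from collections import Counter

    N, FAULTY = 4, 3
    values = ["A", "B", "C", "D"]  # each processor's private value

    # heard[r][s] = the value processor s claimed to processor r;
    # the faulty processor tells each receiver a different story.
    heard = [[values[s] if s != FAULTY else f"X{r}" for s in range(N)]
             for r in range(N)]

    def agreed_vector():
        """Majority vote over relayed claims, per source processor."""
        vector = []
        for source in range(N):
            votes = [heard[relay][source]
                     for relay in range(N) if relay != FAULTY]
            vector.append(Counter(votes).most_common(1)[0][0])
        return vector

    # All non-faulty processors compute the same vector, even though the
    # faulty processor equivocated about its own value.
    print(agreed_vector())  # e.g. ['A', 'B', 'C', 'X0']
    ```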

  13. GHG MITIGATION TECHNOLOGY PERFORMANCE EVALUATIONS UNDERWAY AT THE GHG TECHNOLOGY VERIFICATION CENTER

    EPA Science Inventory

    The paper outlines the verification approach and activities of the Greenhouse Gas (GHG) Technology Verification Center, one of 12 independent verification entities operating under the U.S. EPA-sponsored Environmental Technology Verification (ETV) program. (NOTE: The ETV program...

  14. SU-F-T-288: Impact of Trajectory Log Files for Clarkson-Based Independent Dose Verification of IMRT and VMAT

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Takahashi, R; Kamima, T; Tachibana, H

    2016-06-15

    Purpose: To investigate the effect of trajectory log files from the linear accelerator on Clarkson-based independent dose verification of IMRT and VMAT plans. Methods: A CT-based independent dose verification software (Simple MU Analysis: SMU, Triangle Products, Japan) with a Clarkson-based algorithm was modified to calculate dose using the trajectory log files. Eclipse, with the three techniques of step-and-shoot (SS), sliding window (SW) and RapidArc (RA), was used as the treatment planning system (TPS). In this study, clinically approved IMRT and VMAT plans for prostate and head and neck (HN) at two institutions were retrospectively analyzed to assess the dose deviation between the DICOM-RT plan (PL) and the trajectory log file (TJ). An additional analysis was performed to evaluate the MLC error detection capability of SMU when the trajectory log files were modified by adding systematic errors (0.2, 0.5, 1.0 mm) and random errors (5, 10, 30 mm) to the actual MLC positions. Results: The dose deviations for prostate and HN at the two institutions were 0.0% and 0.0% in SS, 0.1±0.0% and 0.1±0.1% in SW, and 0.6±0.5% and 0.7±0.9% in RA, respectively. The MLC error detection capability showed that the HN IMRT plans were the most sensitive: a 0.2 mm systematic error produced a 0.7% dose deviation on average. Random MLC errors did not affect the dose error. Conclusion: The use of trajectory log files, which include the actual MLC positions, gantry angles, etc., should be more effective for an independent verification. The tolerance level for the secondary check using the trajectory file may be similar to that of the verification using the DICOM-RT plan file. In terms of the resolution of MLC positional error detection, the secondary check could detect MLC position errors corresponding to the treatment sites and techniques. This research is partially supported by the Japan Agency for Medical Research and Development (AMED).
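
    The error-injection test described here can be mimicked with a toy model: perturb MLC leaf positions by a systematic offset and observe the change in a crude dose surrogate (open aperture area). The leaf data and the surrogate are invented; the study recomputed dose with its Clarkson-based algorithm.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    leaf_pairs = 30
    bank_a = rng.uniform(-3.0, 0.0, leaf_pairs)  # cm, bank A leaf edges
    bank_b = rng.uniform(0.0, 3.0, leaf_pairs)   # cm, bank B leaf edges
    LEAF_WIDTH = 0.5                             # cm

    def aperture_area(a, b):
        return np.sum((b - a) * LEAF_WIDTH)      # cm^2

    baseline = aperture_area(bank_a, bank_b)
    for sys_err_mm in (0.2, 0.5, 1.0):           # systematic errors tested
        err_cm = sys_err_mm / 10.0
        shifted = aperture_area(bank_a - err_cm, bank_b + err_cm)
        print(f"systematic {sys_err_mm} mm: aperture change "
              f"{100 * (shifted - baseline) / baseline:+.2f} %")
    ```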

  15. TU-H-CAMPUS-JeP1-02: Fully Automatic Verification of Automatically Contoured Normal Tissues in the Head and Neck

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    McCarroll, R; UT Health Science Center, Graduate School of Biomedical Sciences, Houston, TX; Beadle, B

    Purpose: To investigate and validate the use of an independent deformable-based contouring algorithm for automatic verification of auto-contoured structures in the head and neck, towards fully automated treatment planning. Methods: Two independent automatic contouring algorithms [(1) Eclipse's Smart Segmentation followed by pixel-wise majority voting, (2) an in-house multi-atlas based method] were used to create contours of 6 normal structures of 10 head-and-neck patients. After rating by a radiation oncologist, the higher-performing algorithm was selected as the primary contouring method, and the other was used for automatic verification of the primary. To determine the ability of the verification algorithm to detect incorrect contours, contours from the primary method were shifted by 0.5 to 2 cm. Using a logit model, the structure-specific minimum detectable shift was identified. The models were then applied to a set of twenty different patients, and the sensitivity and specificity of the models were verified. Results: Per physician rating, the multi-atlas method (4.8 on a 5-point scale, with 3 rated as generally acceptable for planning purposes) was selected as primary and the Eclipse-based method (3.5/5) for verification. Mean distance to agreement and true positive rate were selected as covariates in an optimized logit model. These models, when applied to a group of twenty different patients, indicated that shifts could be detected at 0.5 cm (brain), 0.75 cm (mandible, cord), 1 cm (brainstem, cochlea), or 1.25 cm (parotid), with sensitivity and specificity greater than 0.95. If the sensitivity and specificity constraints are relaxed to 0.9, the detectable shifts for the mandible and brainstem decrease by 0.25 cm. These shifts represent additional safety margins which might be considered if auto-contours are used for automatic treatment planning without physician review. Conclusion: Automatically contoured structures can be automatically verified. This fully automated process could be used to flag auto-contours for special review or used with safety margins in a fully automatic treatment planning system.
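
    A minimal sketch of the verification model described above: a logistic regression that predicts whether a contour is wrong from agreement metrics between the primary and verification contours. The covariates match those named in the abstract (mean distance to agreement, true positive rate), but the data are synthetic.

    ```python
    import numpy as np
    from sklearn.linear_model import LogisticRegression

    rng = np.random.default_rng(1)
    n = 200
    shift_cm = rng.choice([0.0, 0.5, 1.0, 1.5, 2.0], size=n)
    # Synthetic covariates that degrade with the injected shift:
    mda = 0.2 + shift_cm + rng.normal(0, 0.15, n)  # mean dist. to agreement
    tpr = np.clip(0.95 - 0.3 * shift_cm + rng.normal(0, 0.05, n), 0, 1)

    X = np.column_stack([mda, tpr])
    y = (shift_cm > 0).astype(int)  # 1 = contour was shifted (incorrect)

    model = LogisticRegression().fit(X, y)
    print("flag probability for MDA=1.2 cm, TPR=0.65:",
          model.predict_proba([[1.2, 0.65]])[0, 1].round(3))
    ```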

  16. Annual Report FY2013-- A Kinematically Complete, Interdisciplinary, and Co-Institutional Measurement of the 19F(α,n) Cross-section for Nuclear Safeguards Science

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Peters, William A; Smith, Michael Scott; Clement, Ryan

    2013-10-01

    The goal of this proposal is to enable neutron detection for precision Non-Destructive Assays (NDAs) of actinide-fluoride samples. Neutrons are continuously generated in a UFx matrix in a container or sample as a result of the interaction of α particles from uranium decay with fluorine nuclei in the matrix. Neutrons from 19F(α,n)22Na were once considered a poorly characterized background for assays of UFx samples via 238U spontaneous-fission neutron detection [SMI2010B]. However, the yield of decay-α-driven neutrons is critical for 234,235U LEU and HEU assays, as it can be used to determine both the total amount of uranium and the enrichment [BER2010]. This approach can be extremely valuable in a variety of safeguards applications, such as cylinder monitoring in underground uranium storage facilities, nuclear criticality safety studies, nuclear materials accounting, and other nonproliferation applications. The success of neutron-based assays critically depends on accurate knowledge of the cross section of the (α,n) reaction that generates the neutrons. The 40% uncertainty in the 19F(α,n)22Na cross section currently limits the precision of such assays and has been identified as a key factor preventing accurate enrichment determinations [CRO2003]. The need for higher-quality cross section data for (α,n) reactions has been a recurring conclusion of reviews of the nuclear data needs supporting safeguards. The overarching goal of this project is to enable neutron detection to be used for precision Non-Destructive Assays (NDAs) of actinide-fluoride samples. This will significantly advance safeguards verification at existing declared facilities, nuclear materials accounting, process control, nuclear criticality safety monitoring, and a variety of other nonproliferation applications. To reach this goal, Idaho National Laboratory (INL), in partnership with Oak Ridge National Laboratory (ORNL), Rutgers University (RU), and the University of Notre Dame (UND), will focus on three specific items: (1) making a precision (better than 10%) determination of the absolute cross section of the 19F(α,n)22Na reaction as a function of energy; (2) determining the spectrum of neutrons and γ-rays emitted from 19F(α,n)22Na over an energy range pertinent to NDA; and (3) performing simulations with this new cross section to extract the neutron yield (neutrons/gram/second) and the resulting neutron and γ-ray spectra when α particles interact with fluorine nuclei in actinide samples, to aid the design and reduce the uncertainty of future NDA measurements and simulations.
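
    Item (3) amounts to folding the measured cross section into a thick-target yield integral: the number of neutrons per α particle is the integral of n_F·σ(E)/S(E) over the α's slowing-down path, where n_F is the fluorine number density and S(E) the stopping power. The sketch below uses crude placeholder functions for σ and S; real NDA simulations would use evaluated data.

    ```python
    import numpy as np

    n_F = 5.0e22  # fluorine number density, atoms/cm^3 (assumed)

    def sigma_cm2(E):  # placeholder cross section above a threshold, cm^2
        return np.where(E > 2.4, 0.2e-24 * (E - 2.4), 0.0)

    def stopping(E):   # placeholder stopping power, MeV/cm
        return 800.0 / np.sqrt(E)

    E = np.linspace(0.01, 5.49, 2000)  # MeV; 5.49 MeV ~ a typical alpha
    dE = E[1] - E[0]
    yield_per_alpha = np.sum(n_F * sigma_cm2(E) / stopping(E)) * dE
    print(f"neutrons per alpha (toy numbers): {yield_per_alpha:.2e}")
    ```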

  17. Rule Systems for Runtime Verification: A Short Tutorial

    NASA Astrophysics Data System (ADS)

    Barringer, Howard; Havelund, Klaus; Rydeheard, David; Groce, Alex

    In this tutorial, we introduce two rule-based systems for on-line and off-line trace analysis, RuleR and LogScope. RuleR is a conditional rule-based system with a simple, easily implemented algorithm for effective runtime verification, into which one can compile a wide range of temporal logics and other specification formalisms used for runtime verification. Specifications can be parameterized with data, or even with specifications, allowing temporal logic combinators to be defined. We outline a number of simple syntactic extensions of core RuleR that can lead to further conciseness of specification while still enabling easy and efficient implementation. RuleR is implemented in Java, and we will demonstrate its ease of use in monitoring Java programs. LogScope is a derivative of RuleR that adds a simple, user-friendly temporal logic. It was developed in Python, specifically to support testing of spacecraft flight software for NASA's 2011 Mars mission, MSL (Mars Science Laboratory). The system has been applied by test engineers to the analysis of log files generated by running the flight software. Detailed logging is already part of the system design approach, so this approach adds no instrumentation overhead. While post-mortem log analysis prevents the autonomous reaction to problems possible with traditional runtime verification, it provides a powerful tool for test automation. A new system is being developed that integrates features from both RuleR and LogScope.
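
    A few lines of Python convey the flavor of such conditional rule systems: rules fire on trace events and arm obligations, and an obligation left unfulfilled at the end of the trace is reported as an error. The event and rule formats below are invented for illustration and are not RuleR or LogScope syntax.

    ```python
    # Toy trace monitor: the rule "open(x) must eventually be followed
    # by close(x)" is checked over a recorded event log.
    trace = [("open", "f1"), ("write", "f1"), ("open", "f2"), ("close", "f1")]

    obligations = set()  # resources that must eventually be closed
    errors = []

    for action, resource in trace:
        if action == "open":
            obligations.add(resource)          # rule arms an obligation
        elif action == "close":
            if resource in obligations:
                obligations.discard(resource)  # obligation fulfilled
            else:
                errors.append(f"close without open: {resource}")

    errors += [f"never closed: {r}" for r in sorted(obligations)]
    print(errors)  # ['never closed: f2']
    ```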

  18. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cherpak, Amanda

    Purpose: The Octavius 1000 SRS detector was commissioned in December 2014 and is used routinely for verification of all SRS and SBRT plans. Results of the verifications were analyzed to assess trends and limitations of the device and planning methods. Methods: Plans were delivered using a TrueBeam STx, and results were evaluated using gamma analysis (95%, 3%/3mm) and absolute dose difference (5%). Verification results were analyzed with respect to several plan parameters, including tumour volume, degree of modulation and prescribed dose. Results: During a 12-month period, a total of 124 patient plans were verified using the Octavius detector. Thirteen plans failed the gamma criteria, while seven plans failed based on the absolute dose difference. When binned according to degree of modulation, a significant correlation was found between MU/cGy and both the mean dose difference (r=0.78, p<0.05) and gamma (r=−0.60, p<0.05). When the data were binned according to tumour volume, the standard deviation of the average gamma dropped from 2.2%–3.7% for volumes less than 30 cm³ to below 1% for volumes greater than 30 cm³. Conclusions: The majority of plans and verification failures involved tumour volumes smaller than 30 cm³. This was expected, given the nature of disease treated with SBRT and SRS techniques, and did not increase the rate of failure. The correlations found with MU/cGy indicate that results deteriorated as modulation increased, but not beyond the previously set thresholds.
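
    The trend analysis reported here boils down to correlating a modulation index (MU/cGy) against per-plan QA outcomes. The sketch below reproduces that calculation on synthetic stand-in data for the 124-plan set; the coefficients are invented, not the study's results.

    ```python
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(2)
    mu_per_cgy = rng.uniform(2.0, 8.0, 124)
    mean_dose_diff = 0.6 * mu_per_cgy + rng.normal(0.0, 1.0, 124)      # %
    mean_gamma = 0.9 - 0.04 * mu_per_cgy + rng.normal(0.0, 0.08, 124)

    for name, y in (("mean dose difference", mean_dose_diff),
                    ("mean gamma", mean_gamma)):
        r, p = stats.pearsonr(mu_per_cgy, y)
        print(f"MU/cGy vs {name}: r = {r:+.2f}, p = {p:.1e}")
    ```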

  19. Probabilistic Sizing and Verification of Space Ceramic Structures

    NASA Astrophysics Data System (ADS)

    Denaux, David; Ballhause, Dirk; Logut, Daniel; Lucarelli, Stefano; Coe, Graham; Laine, Benoit

    2012-07-01

    Sizing of ceramic parts is best optimised using a probabilistic approach that takes into account the pre-existing flaw distribution in the ceramic part: instead of a maximum allowable load, as for a metallic part, it computes a probability of failure as a function of the applied load. This requires extensive knowledge of the material itself as well as accurate control of the manufacturing process. Finally, risk-reduction approaches such as proof testing may be used to lower the final probability of failure of the part. Sizing and verification of ceramic space structures have been performed by Astrium for more than 15 years, with both Zerodur and SiC: the Silex telescope structure, the Seviri primary mirror, the Herschel telescope, the Formosat-2 instrument, and other ceramic structures flying today. Throughout this period, Astrium has investigated and developed experimental ceramic analysis tools based on the Weibull probabilistic approach. In the scope of the ESA/ESTEC study "Mechanical Design and Verification Methodologies for Ceramic Structures", to be concluded at the beginning of 2012, existing theories, the technical state of the art from international experts, and Astrium's experience with probabilistic analysis tools have been synthesized into a comprehensive sizing and verification method for ceramics. Both classical deterministic and more optimised probabilistic methods are available, depending on the criticality of the item and on optimisation needs. The methodology, based on proven theory, has been successfully applied to demonstration cases and has shown its practical feasibility.
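
    At the core of the Weibull probabilistic approach is the two-parameter failure model, in which the survival probability of a part under stress σ scales as exp(−(V/V₀)(σ/σ₀)^m). A minimal sketch with illustrative, non-material-specific parameters:

    ```python
    import math

    m = 10.0        # Weibull modulus (assumed)
    SIGMA0 = 300.0  # characteristic strength, MPa (assumed)
    V_RATIO = 2.0   # stressed volume relative to the reference volume

    def failure_probability(sigma_mpa: float) -> float:
        """P_f = 1 - exp(-(V/V0) * (sigma/sigma0)^m)."""
        return 1.0 - math.exp(-V_RATIO * (sigma_mpa / SIGMA0) ** m)

    for s in (150, 200, 250, 300):
        print(f"sigma = {s} MPa -> P_f = {failure_probability(s):.2e}")
    ```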

  20. 18 CFR 281.213 - Data Verification Committee.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    § 281.213 Data Verification Committee. (a) Each interstate pipeline shall establish a Data Verification... (e) The Data Verification Committee shall prepare a report concerning the proposed index of... [18 CFR, Conservation of Power and Water Resources, revised as of 2011-04-01]
